Science and Technology in the News

Profile Byron Leigh Hatch @ team Carl ...
Joined: 30 Aug 05
Posts: 505
Canada
Message 35275 - Posted: 18 Oct 2010, 12:39:43 UTC

Adding Human Intelligence to Software

TurKit lets programmers combine code with input from an army of online human workers.

By John Pavlus

Amazon's Mechanical Turk service has long provided a cheap source of labor for jobs that are simple for humans but difficult for computers. Tasks such as describing a picture, for example, can be completed online by remote, human workers. Programmers already use groups of these workers, called turkers, to do many such tasks at the same time. But Mechanical Turk offers no easy way for programmers developing new software applications to combine and coordinate the turkers' efforts. Now computer scientists at MIT have developed a toolkit that does just that. Called TurKit, the tool lets software engineers write algorithms to coordinate online workers using the JavaScript programming language, and create powerful applications that have human intelligence built in. The software can also be debugged like normal code.

"Usually in JavaScript, you wouldn't be able to access Mechanical Turk without a lot of work," explains Greg Little, a PhD candidate at MIT's Computer Science and Artificial Intelligence Laboratory, who created TurKit. "This is a bridge for writing code that interacts with the workers on Mechanical Turk, so we can easily explore new methods of human computation."

With TurKit, human input is stored in a database. That way, anytime the software under development crashes, the turkers don't have to start over from scratch. Instead, once the program has been fixed, it can pick right up where it left off. "If you wait an hour for the humans to finish their task, and then the program throws an error, you don't want to wait another hour just to see if your bug fix works," says Little. TurKit also prevents the human input from changing unpredictably during the debugging process. "If I got different behavior every time I ran (a program), I could never debug that moving target," says Michael Bernstein, a PhD candidate at MIT, who used TurKit to create a word-processing application called Soylent...
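The "record human input, then replay it on re-runs" idea described above can be sketched in a few lines. TurKit itself is JavaScript and this is not its actual API; the class and file names below are hypothetical, just to illustrate the crash-and-rerun model:

```python
import json
import os

class CrashAndRerun:
    """Memoize expensive (human) computations so that re-running the
    script after a crash replays recorded answers instead of posting
    the same task to the workers again."""

    def __init__(self, path="trace.json"):
        self.path = path
        self.trace = []    # results recorded by earlier runs
        self.cursor = 0    # how far this run has replayed
        if os.path.exists(path):
            with open(path) as f:
                self.trace = json.load(f)

    def once(self, expensive_fn):
        # Replay a recorded result for this call site if we have one.
        if self.cursor < len(self.trace):
            result = self.trace[self.cursor]
        else:
            result = expensive_fn()   # e.g. post a HIT and wait for a turker
            self.trace.append(result)
            with open(self.path, "w") as f:
                json.dump(self.trace, f)   # persist before moving on
        self.cursor += 1
        return result
```

On a re-run after a bug fix, every call to `once` returns the same value as before without re-engaging the workers, which is exactly the "pick right up where it left off" behavior Little describes.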

read more here ...

http://www.technologyreview.com/computing/26535/?nlid=3646
ID: 35275
Profile Byron Leigh Hatch @ team Carl ...
Joined: 30 Aug 05
Posts: 505
Canada
Message 35277 - Posted: 18 Oct 2010, 15:19:23 UTC

NASA Ames’ Worden reveals DARPA-funded ‘Hundred Year Starship’ program

by Amara D. Angelica

NASA Ames Director Simon “Pete” Worden revealed Saturday that NASA Ames has “just started a project with DARPA called the Hundred Year Starship,” with $1 million funding from DARPA and $100K from NASA.

“You heard it here,” said Worden at “Long Conversation,” a Long Now Foundation event in San Francisco. “We also hope to inveigle some billionaires to form a Hundred Year Starship fund,” he added.

“The human space program is now really aimed at settling other worlds,” he explained. “Twenty years ago you had to whisper that in dark bars and get fired.” (Worden was in fact fired by President George W. Bush, he also revealed.)

But these ambitious projects will need whole new concepts for propulsion, Worden advised. “NASA needs to build a true starship, probably using electric propulsion, probably also using solar energy and nuclear energy.

One new propulsion concept is electric propulsion, said Worden. “Anybody that watches the [Star Trek] Enterprise, you know you don’t see huge plumes of fire. Within a few years we will see the first true prototype of a spaceship that will take us between worlds.

“We are [also] funding a young scientist to develop microwave thermal propulsion. The idea is if you can beam power to the spaceship, so you don’t have to carry all the fuel; and then you use that energy from a laser or microwave power to heat a propellant; it gets you a pretty big factor of improvement. I think that’s one way of getting off the world.”
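The "pretty big factor of improvement" Worden mentions can be illustrated with the Tsiolkovsky rocket equation: beamed heating allows higher exhaust velocities than chemical combustion, which shrinks the propellant fraction needed for a given delta-v. The numbers below are illustrative round figures, not values from the article:

```python
import math

def mass_ratio(delta_v, exhaust_velocity):
    """Tsiolkovsky rocket equation: initial/final mass for a given delta-v."""
    return math.exp(delta_v / exhaust_velocity)

DELTA_V = 9_400.0  # m/s, rough delta-v to reach low Earth orbit

# Illustrative exhaust velocities (assumed, not vehicle-specific figures):
chemical = 4_400.0  # m/s, a good chemical engine (~450 s specific impulse)
beamed   = 7_800.0  # m/s, microwave-heated hydrogen (~800 s specific impulse)

for name, ve in [("chemical", chemical), ("beamed", beamed)]:
    r = mass_ratio(DELTA_V, ve)
    prop_fraction = 1 - 1 / r
    print(f"{name:8s} mass ratio {r:4.1f}, propellant {prop_fraction:.0%} of liftoff mass")
```

Under these assumptions the beamed vehicle needs a mass ratio of roughly 3 instead of roughly 8, which is the kind of improvement that makes "getting off the world" cheaper.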

KurzweilAI has been speaking to this scientist, Dmitriy Tseliakhovich, who has formed a company called Escape Dynamics LLC. The concept is based on a PhD thesis by Kevin L.G. Parkin and his work at Caltech, where NASA is teaming with Caltech scientists and engineers in building a prototype. Tseliakhovich said his team is also working with Autodesk on this project, which is a spinoff of a team project at Singularity University this past summer.

“The space launch system currently developed by Escape Dynamics has a unique potential to drop the cost of space access by more than an order of magnitude and finally open space for medium- and small-size businesses,” he believes.

“The microwave thermal thruster using beamed propulsion is an excellent idea,” said Dr. Narayanan M. Komerath, a professor at Georgia Tech College of Engineering and a NASA Institute of Advanced Concepts Fellow. “[Kevin Parkin] picks the 140 GHz window, which apparently offers strong advantages in absorption by the materials that he uses in the propulsion system.”

But Worden warned that in settling on other worlds, we need to be cautious. “How do you live in another world? I don’t have the slightest idea,” he said. “If you’re a conservative, you worry about it killing us; if you’re a liberal, you worry about us killing it. I think things like synthetic biology have a lot of potential for that. I think rather than make an environment on Mars like Earth, why don’t we modify life … including the human genome … so it’s better suited to [Mars]?

Worden also thinks we should go to the moons of Mars first, where we can do extensive telerobotics exploration of the planet. “I think we’ll be on the moons of Mars by 2030 or so. Larry [Page] asked me a couple weeks ago how much it would cost to send people one way to Mars and I told him $10 billion, and his response was, ‘Can you get it down to 1 or 2 billion?’ So now we’re starting to get a little argument over the price.” ...

read more here ...

http://www.kurzweilai.net/nasa-ames-worden-reveals-darpa-funded-hundred-year-starship-program
ID: 35277
Profile Byron Leigh Hatch @ team Carl ...
Joined: 30 Aug 05
Posts: 505
Canada
Message 35317 - Posted: 20 Oct 2010, 16:25:35 UTC

Space Weather News for Oct. 20, 2010

SUNDIVING COMET:

A newly discovered comet is plunging toward the sun for a close encounter it probably will not survive. The comet is too deep in the sun's glare for human eyes to pick out, but it is showing up nicely in coronagraph images from the Solar and Heliospheric Observatory. Visit http://spaceweather.com for the latest movies.

ORIONID METEOR SHOWER:

Earth is passing through a stream of debris from Halley's Comet, and this is causing the annual Orionid meteor shower. Bright moonlight is reducing the number of visible meteors; nevertheless, sky watchers are reporting some bright Orionids. The best time to look is during the hours before local dawn on Thursday, Oct. 21st, and again on Friday, Oct. 22nd.

Check: http://spaceweather.com

for a sky map and more information.

ID: 35317
Profile Jord
Volunteer tester
Help desk expert
Joined: 29 Aug 05
Posts: 15477
Netherlands
Message 35336 - Posted: 21 Oct 2010, 20:21:11 UTC

Most distant galaxy ever found sheds light on infant cosmos

Observations of the most distant object yet discovered go a long way in supporting astronomers' models of the early Universe. But the far-flung galaxy, details of which are published in Nature today, also raises questions about the source of the first light in the cosmos.

Light from the galaxy, named UDFy-38135539, left the object just 600 million years after the Big Bang, giving a snapshot of the cosmos in its infancy. That smashes the previous record, held by another galaxy, by 150 million years. The image shows the galaxy as it was when it was around 100 million years old, with just 1-10% of the mass of the Milky Way.

The galaxy is particularly fascinating because, 600 million years after the Big Bang, the Universe was thought to be going through a phase called reionization. However, there has been little direct observational evidence for this, says astronomer Matt Lehnert at the Paris Observatory in France, who led the team involved in the study. According to astronomers' best models, the early Universe burst out of the Big Bang around 13 billion years ago as an ionized fireball. This ball of gas gradually cooled, becoming neutral as protons and electrons combined to form hydrogen. "Then stars and galaxies began to form, lighting up the Universe, heating up the gas and reionizing it," says Lehnert. "This galaxy allows us to peek at the reionization era."
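The "600 million years after the Big Bang" figure can be turned into a rough redshift with a crude matter-dominated approximation, t/t0 = (1+z)^-3/2. This is only a sketch: the approximation and the 13.7-billion-year age are assumptions, and it underestimates the value from a full cosmological calculation (the reported redshift of UDFy-38135539 is about 8.6):

```python
def redshift_at_age(age_gyr, t0_gyr=13.7):
    """Matter-dominated approximation: cosmic time scales as (1+z)**-1.5."""
    return (t0_gyr / age_gyr) ** (2.0 / 3.0) - 1.0

# Light left the galaxy when the Universe was ~0.6 Gyr old.
z = redshift_at_age(0.6)
print(f"approximate redshift: z ~ {z:.1f}")
```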

http://www.nature.com/news/2010/101020/full/news.2010.552.html
ID: 35336
Profile Byron Leigh Hatch @ team Carl ...
Joined: 30 Aug 05
Posts: 505
Canada
Message 35381 - Posted: 23 Oct 2010, 3:52:47 UTC

The Director of NASA’s Ames Center, Pete Worden, has announced an initiative to move space flight to the next level. This plan, dubbed the “Hundred Year Starship,” has received $100,000 from NASA and $1 million from the Defense Advanced Research Projects Agency (DARPA). He made his announcement on Oct. 16. Worden is also hoping to include wealthy investors in the project. NASA has yet to provide any official details on the project.

Worden has also expressed his belief that the space agency is now directed toward settling other planets. However, given that the agency has been redirected toward supporting commercial space firms, how this will be achieved has yet to be explained. The details that have been given are vague and in some cases contradictory.

The Ames Director went on to expound on how these efforts will seek to emulate the fictional starships seen on the television show Star Trek. He stated that the public could expect to see the first prototype of a new propulsion system within the next few years. Given that NASA’s FY 2011 budget has had to be revised and has yet to go through Appropriations, this time estimate may be overly optimistic.

One of the ideas being proposed is a microwave thermal propulsion system. This form of propulsion would eliminate the massive amount of fuel required to send craft into orbit. The power would be “beamed” to the spacecraft. Either a laser or microwave emitter would heat the propellant, thus sending the vehicle aloft. This technology has been around for some time, but has yet to be applied in a real-world vehicle.

The project is run by Dr. Kevin L.G. Parkin who described it in his PhD thesis and invented the equipment used. Along with him are David Murakami and Creon Levit. One of the previous workers on the program went on to found his own company in the hopes of commercializing the technology used ...

read more here ...

http://www.universetoday.com/76195/nasas-ames-director-announces-100-year-starship/
ID: 35381
Profile Byron Leigh Hatch @ team Carl ...
Joined: 30 Aug 05
Posts: 505
Canada
Message 35420 - Posted: 25 Oct 2010, 18:51:45 UTC

October 20, 2010

Tomorrow’s Internet: 1000 times faster

UCSB-led team developing next-generation Ethernet to handle surging traffic, support cloud computing, emerging applications

Imagine if all the data traversing the world right now—on long distance networks and between and within computers and other hardware—could be sent through a single fiber the width of a human hair.

A new research center has been launched at the University of California, Santa Barbara (UCSB) to make that a reality. Researchers with the Terabit Optical Ethernet Center (TOEC) will develop the technology necessary for a new generation of Ethernet a thousand times faster, and much more energy efficient, than today’s most advanced networks. They are aiming for 1 Terabit Ethernet over optical fiber—1 trillion bits per second—by 2015, with the ultimate goal of enabling 100 Terabit Ethernet by 2020.
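To get a feel for those targets, here is a back-of-envelope calculation of how long it takes to move a large dataset at each generation of link speed (the 1 PB example size is an arbitrary illustration, not from the article):

```python
def transfer_seconds(size_bytes, link_bits_per_sec):
    """Time to move size_bytes over a link, ignoring protocol overhead."""
    return size_bytes * 8 / link_bits_per_sec

PETABYTE = 10**15  # bytes

for name, rate in [("100 Gb/s", 100e9), ("1 Tb/s", 1e12), ("100 Tb/s", 100e12)]:
    t = transfer_seconds(PETABYTE, rate)
    print(f"1 PB over {name:9s}: {t:10.1f} s  ({t / 3600:6.2f} h)")
```

At 1 Tb/s a petabyte moves in a little over two hours; at the 2020 goal of 100 Tb/s it takes under a minute and a half.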

Partnering with TOEC as founding industry affiliates are Google Inc., Verizon, Intel, Agilent Technologies and Rockwell Collins Inc.

Internet traffic is booming, as businesses and institutions handle massive quantities of data and consumers stream video, share high-resolution photos and battle it out in online games. Millions of people will soon be consuming billions of bits per second in their living rooms, all at the same time.

“We’re going to need much faster networking to handle the explosion in Internet traffic and support new large-scale applications like cloud computing,” says Daniel Blumenthal, Professor of Electrical and Computer Engineering at UCSB and Director of TOEC, which is part of UCSB’s Institute for Energy Efficiency (IEE).

“The work that will be conducted at TOEC will enable the future of the Internet,” says Stuart Elby, Vice President of Network Architecture for Verizon.

Ethernet, the way computers talk to each other over a network, has become the de facto standard for data transmission both on a small scale and across global networks. “It’s an accepted, flexible interface,” says Internet pioneer David Farber, a professor at Carnegie Mellon University and former Chief Technologist for the Federal Communications Commission.

Ethernet is constantly evolving, but soon—in as little as five years, according to some estimates—it won’t be able to keep up with the speed and bandwidth required for applications like video and cloud computing, and distributed data storage.

“Based on current traffic growth, it’s clear that 1 Terabit per second trunks will be needed in the near future,” Elby says.

Not only will Terabit Ethernet soon be needed to satisfy the demands created by the way we use networks now, but Farber says high-performance, high-speed Ethernet will open up opportunities we couldn’t dream of today: “You build it, they will come.”

http://engineering.ucsb.edu/news/468/
ID: 35420
Profile Byron Leigh Hatch @ team Carl ...
Joined: 30 Aug 05
Posts: 505
Canada
Message 35430 - Posted: 26 Oct 2010, 15:18:15 UTC

CUDA SPOTLIGHT

Dell published a new paper titled "Expanding the Boundaries of GPU Computing," which includes a case study about the National Center for Supercomputing Applications (NCSA) at the University of Illinois. The case study describes Lincoln - a 47 TFLOPS cluster based on Dell hardware with NVIDIA Tesla GPUs for parallel processing. NCSA’s John Towns says that NCSA is seeing "applications that on a per-GPU basis have an equivalent performance of anywhere from 30 to 40 CPU cores all the way up to over 200 CPU cores…."

read more here ...

http://www.dell.com/content/topics/global.aspx/power/en/gpu_computing?c=us&l=en&cs=555
ID: 35430
Profile Byron Leigh Hatch @ team Carl ...
Joined: 30 Aug 05
Posts: 505
Canada
Message 35434 - Posted: 27 Oct 2010, 0:57:47 UTC

US approves world’s biggest solar energy project in California

The U.S. Department of the Interior approved on Monday a permit for Solar Millennium, LLC to build the largest solar energy project in the world — four plants at the cost of one billion dollars each — in southern California.

The project is expected to generate up to 1,000 megawatts of power, enough electricity to annually power more than 300,000 single-family homes, more than doubling the solar electricity production capacity of the U.S.

Once constructed, the Blythe facility will reduce CO2 emissions by nearly one million short tons per year, or the equivalent of removing more than 145,000 cars from the road. Additionally, because the facility is “dry-cooled,” it will use 90 percent less water than a traditional “wet-cooled” solar facility of this size. The Blythe facility will also help California take a major step toward achieving its goal of having one third of the state’s power come from renewable sources by the year 2020.

The entire Blythe Solar Power Project will generate a total of more than 7,500 jobs, including 1,000 direct jobs during the construction period, and thousands of additional indirect jobs in the community and throughout the supply chain. When the 1,000 MW facility is fully operational it will create more than 220 permanent jobs ...

read more here ...

http://www.businesswire.com/news/home/20101025007018/en/Solar-Trust-America-Clears-Final-Regulatory-Hurdle
ID: 35434
Profile Byron Leigh Hatch @ team Carl ...
Joined: 30 Aug 05
Posts: 505
Canada
Message 35439 - Posted: 27 Oct 2010, 5:32:42 UTC

Chinese Chip Closes In on Intel, AMD

China may finally have a processor to power a homegrown supercomputer.

By Christopher Mims

At this year's Hot Chips conference at Stanford University, Weiwu Hu, the lead architect of the "national processor" of China, revealed three new chip designs. One of them could enable China to build a homegrown supercomputer to rank in a prestigious list of the world's fastest machines.

The Loongson processor family (known in China as Godson) is now in its sixth generation. The latest designs consist of the one-gigahertz, eight-core Godson 3B; the more powerful 16-core Godson 3C (whose clock speed is currently unknown); and the smaller, lower-power one-gigahertz Godson 2H, intended for netbooks and other mobile devices. The Godson 3B will be commercially available in 2011, as will the Godson 2H, but the Godson 3C won't debut until 2012.

According to Tom Halfhill, industry analyst and editor of Microprocessor Report, the eight-core Godson 3B will still be significantly less powerful than Intel's best chip, the six-core Xeon processor. It will be able to perform roughly 30 percent fewer mathematical calculations per second. Intel's forthcoming Sandy Bridge processor and AMD's Bulldozer processor will widen the gap between chips designed by American companies and the Godson 3B.

However, China's chip-making capabilities are improving quickly. Intel's Xeon processor uses a 32-nanometer process (meaning the smallest components can be formed on this scale), while the Godson 3B uses 65 nanometers, leading to significantly slower processing speeds. But the Godson 3C processor will leapfrog current technology by using a 28-nanometer process, although this will only increase its clock speed by about a factor of two, estimates Halfhill. With its eight additional cores, this should make the 3C about four times as fast as the Godson 3B.
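Halfhill's "about four times as fast" estimate follows from a first-order throughput model, cores times clock, which ignores memory bandwidth and architectural differences but captures the arithmetic in the paragraph above:

```python
def relative_throughput(cores, clock_ghz, base_cores, base_clock_ghz):
    """First-order model: peak throughput scales with cores x clock."""
    return (cores * clock_ghz) / (base_cores * base_clock_ghz)

# Godson 3B: 8 cores at 1 GHz. Godson 3C: 16 cores, with the clock
# assumed to roughly double (Halfhill's estimate for the 28 nm process).
speedup = relative_throughput(16, 2.0, 8, 1.0)
print(f"estimated Godson 3C vs 3B speedup: {speedup:.0f}x")
```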

read more here ...

http://www.technologyreview.com/computing/26596/?nlid=3693
ID: 35439
Profile Byron Leigh Hatch @ team Carl ...
Joined: 30 Aug 05
Posts: 505
Canada
Message 35441 - Posted: 27 Oct 2010, 12:25:57 UTC

NASA Science News for Oct. 26, 2010

Modern power grids are increasingly vulnerable to strong solar storms.
A new NASA project named "Solar Shield" could help keep the lights on.

read more here ...

http://science.nasa.gov/science-news/science-at-nasa/2010/26oct_solarshield/
ID: 35441
Profile Byron Leigh Hatch @ team Carl ...
Joined: 30 Aug 05
Posts: 505
Canada
Message 35481 - Posted: 28 Oct 2010, 16:11:14 UTC

Chinese supercomputer is world’s fastest at 2.5 petaflops

China set to claim supercomputing crown - October 28, 2010

In a potential blow to US national pride, the world’s fastest supercomputer is now Chinese, beating the Americans into second place for the first time since 2004 with a machine that is smaller and more energy efficient than its closest US rival.

In the run-up to the release of the official list of the top 500 supercomputers next week, the Chinese supercomputer, Tianhe-1A, looks certain to occupy the top spot.

Tianhe-1A, which means 'Milky Way', has clocked up 2.5 petaflops – equivalent to roughly 2.5 quadrillion (2.5 × 10^15) calculations every second – making it significantly faster than the Cray Jaguar at Oak Ridge National Lab in Tennessee – the US’s fastest supercomputer – which can only muster a comparatively feeble 2.3 petaflops.

Jack Dongarra, a University of Tennessee computer scientist who maintains the official supercomputer rankings, told the New York Times that Tianhe-1A “blows away” the competition. “We don’t close the books until Nov. 1, but I would say it is unlikely we will see a system that is faster,” he said.

Tianhe-1A, a new supercomputer revealed today at HPC 2010 China, has set a new performance record of 2.507 petaflops (quadrillion floating point operations per second), as measured by the LINPACK benchmark, making it the fastest system in China and in the world today, according to an NVIDIA statement.

The supercomputer operates 50% faster than the world’s current top supercomputer, the Cray XT5-HE Jaguar at Oak Ridge National Laboratory, which can deliver 1.76 petaflops of sustained performance. The Tianhe-1A operates at one-third the power and at one half the size of the Jaguar, according to NVIDIA.

The system uses 7,168 NVIDIA Tesla M2050 massively parallel graphics processing units (GPUs) and 14,336 multi-core central processing units (CPUs). It would require more than 50,000 CPUs and twice as much floor space to deliver the same performance using CPUs alone, the company says.
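A quick back-of-envelope division of the Linpack score by the GPU count gives a sense of scale. Attributing the whole score to the GPUs is an upper bound, since the CPUs also contribute:

```python
# Tianhe-1A figures from the article.
RMAX_PFLOPS = 2.507   # measured Linpack performance
GPUS = 7_168          # NVIDIA Tesla M2050 GPUs

# Upper bound on each GPU's share (1 PFLOPS = 1e6 GFLOPS).
per_gpu_gflops = RMAX_PFLOPS * 1e6 / GPUS
print(f"<= {per_gpu_gflops:.0f} GFLOPS per GPU on Linpack")
```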

Tianhe-1A was designed by the National University of Defense Technology (NUDT) in China. The system is housed at the National Supercomputer Center in Tianjin and is already fully operational. It will be operated as an open-access system for large-scale scientific computations ...

read more here ...

http://blogs.nature.com/news/thegreatbeyond/2010/10/china_will_claim_supercomputin.html
ID: 35481
Profile Byron Leigh Hatch @ team Carl ...
Joined: 30 Aug 05
Posts: 505
Canada
Message 35666 - Posted: 9 Nov 2010, 17:00:53 UTC

Quantum Computing Reaches for True Power

QUBIT CHIP: Four qubits are symmetrically coupled via a capacitive island, the cross in the center.
By JOHN MARKOFF
Published: November 8, 2010
New York Times

  • IBM has begun a five-year research project based on advances made in the past year at Yale University and the University of California, Santa Barbara that suggest the possibility of quantum computing based on standard microelectronics manufacturing technologies.

  • Researchers at Toshiba Research Europe and Cambridge University reported in Nature that they had fabricated light-emitting diodes coupled with a custom-formed quantum dot, which functioned as a light source for entangled photons.

  • Google has received a proposal from D-Wave and NASA’s Jet Propulsion Laboratory to develop a quantum computing facility for Google next year based on D-Wave technology.


In 1981 the physicist Richard Feynman speculated about the possibility of “tiny computers obeying quantum mechanical laws.” He suggested that such a quantum computer might be the best way to simulate real-world quantum systems, a challenge that today is largely beyond the calculating power of even the fastest supercomputers.

Since then there has been sporadic progress in building this kind of computer. The experiments to date, however, have largely yielded only systems that seek to demonstrate that the principle is sound. They offer a tantalizing peek at the possibility of future supercomputing power, but only the slimmest results.

Recent progress, however, has renewed enthusiasm for finding avenues to build significantly more powerful quantum computers. Laboratory efforts in the United States and in Europe are under way using a number of technologies.

Significantly, I.B.M. has reconstituted what had recently been a relatively low-level research effort in quantum computing. I.B.M. is responding to advances made in the past year at Yale University and the University of California, Santa Barbara, that suggest the possibility of quantum computing based on standard microelectronics manufacturing technologies. Both groups layer a superconducting material, either rhenium or niobium, on a semiconductor surface, which when cooled to near absolute zero exhibits quantum behavior.

The company has assembled a large research group at its Thomas J. Watson Research Center in Yorktown Heights, N.Y., that includes alumni from the Santa Barbara and Yale laboratories and has now begun a five-year research project.

“I.B.M. is quite interested in taking up the physics which these other groups have been pioneering,” said David DiVincenzo, an I.B.M. physicist and research manager.

Researchers at Santa Barbara and Yale also said that they expect to make further incremental progress in 2011 and in the next several years.

At the most basic level, quantum computers are composed of quantum bits, or qubits, rather than the traditional bits that are the basic unit of digital computers. Classical computers are built with transistors that can be in either an “on” or an “off” state, representing either a 1 or a 0. A qubit, which can be constructed in different ways, can represent 1 and 0 states simultaneously. This quality is called superposition.

The potential power of quantum computing comes from the possibility of performing a mathematical operation on both states simultaneously. In a two-qubit system it would be possible to compute on four values at once, in a three-qubit system on eight at once, in a four-qubit system on 16, and so on. As the number of qubits increases, potential processing power increases exponentially.
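The exponential growth described above is easy to see numerically: a classical description of an n-qubit register must track one amplitude per basis state, 2^n in all. A small illustrative sketch:

```python
import itertools

# A classical n-bit register holds one of 2**n values at a time; an
# n-qubit register carries an amplitude for every one of the 2**n
# basis states, which is why classical simulation cost explodes with n.
def n_amplitudes(n_qubits):
    return 2 ** n_qubits

for n in (2, 3, 4, 10, 50):
    print(f"{n:2d} qubits -> {n_amplitudes(n):,} amplitudes to track")

# A uniform two-qubit superposition: four basis states with amplitude
# 1/2 each; the squared amplitudes (probabilities) sum to 1.
amps = [0.5] * 4
assert abs(sum(a * a for a in amps) - 1.0) < 1e-12
for bits, a in zip(itertools.product("01", repeat=2), amps):
    print("|%s>  amplitude %.2f  probability %.2f" % ("".join(bits), a, a * a))
```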

There is, of course, a catch. The mere act of measuring or observing a qubit can strip it of its computing potential. So researchers have used quantum entanglement — in which particles are linked so that measuring a property of one instantly reveals information about the other, no matter how far apart the two particles are — to extract information. But creating and maintaining qubits in entangled states has been tremendously challenging.

“We’re at the stage of trying to develop these qubits in a way that would be like the integrated circuit that would allow you to make many of them at once,” said Rob Schoelkopf, a physicist who is leader of the Yale group. “In the next few years you’ll see operations on more qubits, but only a handful.”

The good news, he said, is that while the number of qubits is increasing only slowly, the precision with which the researchers are able to control quantum interactions has increased a thousandfold.

The Santa Barbara researchers said they believe they will essentially double the computational power of their quantum computers next year.

John Martinis, a physicist who is a member of the team, said, “We are currently designing a device with four qubits, and five resonators,” the standard microelectronic components that are used to force quantum entanglement. “If all goes well, we hope to increase this to eight qubits and nine resonators in a year or so.”

Two competing technological approaches are also being pursued. One approach involves building qubits from ions, or charged atomic particles, trapped in electromagnetic fields. Lasers are used to entangle the ions. To date, systems as large as eight qubits have been created using this method, and researchers believe that they have design ideas that will make much larger systems possible. Currently more than 20 university and corporate research laboratories are pursuing this design.

In June, researchers at Toshiba Research Europe and Cambridge University reported in Nature that they had fabricated light-emitting diodes coupled with a custom-formed quantum dot, which functioned as a light source for entangled photons. The researchers are now building more complex systems and say they can see a path to useful quantum computers.

A fourth technology has been developed by D-Wave Systems, a Canadian computer maker. D-Wave has built a system with more than 50 quantum bits, but it has been greeted skeptically by many researchers who believe that it has not proved true entanglement. Nevertheless, Hartmut Neven, an artificial-intelligence researcher at Google, said the company had received a proposal from D-Wave and NASA’s Jet Propulsion Laboratory to develop a quantum computing facility for Google next year based on the D-Wave technology.

read more here ...

Source: New York Times, Nov 8, 2010

ID: 35666
Profile Byron Leigh Hatch @ team Carl ...
Joined: 30 Aug 05
Posts: 505
Canada
Message 35680 - Posted: 10 Nov 2010, 16:12:03 UTC


EurekAlert!

Quantum computers a step closer to reality thanks to new finding

Quantum computers may be much easier to build than previously thought, suggests a new study in Physical Review Letters

by Laura Gallagher, Research Media Relations Manager, Imperial College London

Quantum computers should be much easier to build than previously thought, because they can still work with a large number of faulty or even missing components, according to a study published today in Physical Review Letters. This surprising discovery brings scientists one step closer to designing and building real-life quantum computing systems – devices that could have enormous potential across a wide range of fields, from drug design and electronics to code-breaking.

Scientists have long been fascinated with building computers that work at a quantum level – so small that the parts are made of just single atoms or electrons. Instead of 'bits', the building blocks normally used to store electronic information, quantum systems use quantum bits or 'qubits', made up of an arrangement of entangled atoms.

Materials behave very differently at this tiny scale compared to what we are used to in our everyday lives – quantum particles, for example, can exist in two places at the same time. "Quantum computers can exploit this weirdness to perform powerful calculations, and in theory, they could be designed to break public key encryption or simulate complex systems much faster than conventional computers," said Dr Sean Barrett, the lead author of the study, who is a Royal Society University Research Fellow in the Department of Physics at Imperial College London.

The machines have been notoriously hard to build, however, and were thought to be very fragile to errors. In spite of considerable buzz in the field in the last 20 years, useful quantum computers remain elusive.

Barrett and his colleague Dr. Thomas Stace, from the University of Queensland in Brisbane, Australia, have now found a way to correct for a particular sort of error, in which the qubits are lost from the computer altogether. They used a system of 'error-correcting' code, which involved looking at the context provided by the remaining qubits to decipher the missing information correctly.

"Just as you can often tell what a word says when there are a few missing letters, or you can get the gist of a conversation on a badly-connected phone line, we used this idea in our design for a quantum computer," said Dr Barrett. They discovered that the computers have a much higher threshold for error than previously thought – up to a quarter of the qubits can be lost – but the computer can still be made to work. "It's surprising, because you wouldn't expect that if you lost a quarter of the beads from an abacus that it would still be useful," he added.
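The "missing letters" analogy has a simple classical counterpart. The sketch below is an ordinary parity-based erasure code, not the quantum code Barrett and Stace analyze, but it shows the same principle: the surviving symbols provide enough context to reconstruct an erased one:

```python
from functools import reduce

def add_parity(data):
    """Append an XOR parity byte so any ONE erased byte is recoverable."""
    return data + [reduce(lambda a, b: a ^ b, data)]

def recover(received):
    """received: the coded list with exactly one erased position set to None."""
    missing = received.index(None)
    survivors = [b for b in received if b is not None]
    # XOR of all survivors equals the erased byte, because the XOR of the
    # full codeword (data plus parity) is zero by construction.
    value = reduce(lambda a, b: a ^ b, survivors)
    out = list(received)
    out[missing] = value
    return out[:-1]   # drop the parity byte, return the data

word = [ord(c) for c in "qubit"]
coded = add_parity(word)
coded[2] = None   # erase one symbol in transit
print("recovered:", bytes(recover(coded)).decode())
```

Quantum error correction for lost qubits is far subtler (amplitudes cannot simply be copied), but the intuition that redundancy plus context pins down missing information carries over.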

The findings indicate that quantum computers may be much easier to build than previously thought, but as the results are still based on theoretical calculations, the next step is to demonstrate these ideas in the lab. Scientists will need to devise a way to scale the computers to a sufficiently large number of qubits for them to be viable, says Barrett. At the moment, the biggest quantum computers scientists have built are limited to just two or three qubits.

"We are still some way off from knowing what the true potential of a quantum computer might be, says Barrett. "At the moment quantum computers are good at particular tasks, but we have no idea what these systems could be used for in the future," he said. "They may not necessarily be better for everything, but we just don't know. They may be better for very specific things that we find impossible now."

read more here ...

http://www.eurekalert.org/pub_releases/2010-11/icl-qca110910.php

ID: 35680 · Report as offensive     Reply Quote
Profile Byron Leigh Hatch @ team Carl ...
Avatar

Send message
Joined: 30 Aug 05
Posts: 505
Canada
Message 35745 - Posted: 16 Nov 2010, 15:21:38 UTC

Supercomputers ‘will fit in a sugar cube,’ IBM says

A pioneering research effort could shrink the world's most powerful supercomputer processors to the size of a sugar cube, IBM scientists say. The approach will see many computer processors stacked on top of one another, with water flowing between them to carry away heat.

The aim is to reduce computers' energy use, rather than just to shrink them. Some 2% of the world's total energy is consumed by building and running computer equipment. Speaking at IBM's Zurich labs, Dr Bruno Michel said future computer costs would hinge on green credentials rather than speed. Dr Michel and his team have already built a prototype to demonstrate the water-cooling principle. Called Aquasar, it occupies a rack larger than a refrigerator.

IBM estimates that Aquasar is almost 50% more energy-efficient than the world's leading supercomputers. "In the past, computers were dominated by hardware costs - 50 years ago you could hold one transistor and it cost a dollar, or a franc," Dr Michel told BBC News. Now when the sums are done, he said, the cost of a transistor works out to 1/100th of the price of printing a single letter on a page.

Now the cost of building the next generation of supercomputers is not the problem, IBM says. The cost of running the machines is what concerns engineers. "In the future, computers will be dominated by energy costs - to run a data centre will cost more than to build it," said Dr Michel. The overwhelming cause of those energy costs is cooling, because computing generates heat as a by-product.

Cube route

"In the past, the Top 500 list (of fastest supercomputers worldwide) was the important one; computers were listed according to their performance. "In the future, the 'Green 500' will be the important list, where computers are listed according to their efficiency." Until recently, the supercomputer at the top of that list could do about 770 million computational operations per second at a cost of one watt of power. The Aquasar prototype clocked up nearly half again as much, at 1.1 billion operations per second. Now the task is to shrink it. "We currently have built this Aquasar system that's one rack full of processors. We plan that 10 to 15 years from now, we can collapse such a system in to one sugar cube - we're going to have a supercomputer in a sugar cube." Mark Stromberg, principal research analyst at Gartner, said that the approach was a promising one. But he said that tackling the finer details of cooling - to remove heat from just the right parts of the chip stacks - would take significant effort.

Third dimension

It takes about 1,000 times more energy to move a data byte around than it does to do a computation with it once it arrives. What is more, the time taken to complete a computation is currently limited by how long the moving takes.

Air cooling can go some way to removing this heat, which is why many desktop computers have fans inside. A given volume of water can hold 4,000 times more waste heat than air, but water cooling adds a great deal of bulk: with current technology, a standard chip - comprising a milligram of transistors - needs 1kg of cooling equipment, according to Dr Michel. Part of the solution he and his colleagues propose - and that the large Aquasar rack demonstrates - is a slimmed-down, more efficient circulation of water that borrows ideas from the branched circulatory system of the human body.
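
The water-versus-air figure follows from standard textbook values for volumetric heat capacity (my own back-of-envelope check, not from the article):

```python
# Volumetric heat capacity = density x specific heat (room-temperature values).
water = 1000.0 * 4186.0   # kg/m^3 x J/(kg*K)  ->  ~4.2 MJ per m^3 per kelvin
air = 1.2 * 1005.0        #                    ->  ~1.2 kJ per m^3 per kelvin
print(round(water / air))  # ~3471 -- the same order as the quoted "4,000 times"
```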

However, the engineers are exploring the third dimension first. They want to stack processors one on top of another, envisioning vast stacks, each separated by water cooling channels not much more than a hair's breadth in thickness. Because distance between processors both slows down and heats up the computing process, moving chips closer together in this way tackles issues of speed, size, and running costs, all at once.

In an effort to prove the principle, the team has built stacks four processors high. But Dr Michel concedes that much work is still to be done. The major technical challenge will be to engineer the connections between the different chips, which must conduct electricity yet remain waterproof. "Clearly the use of 3D processes will be a major advancement in semiconductor technology and will allow the industry to maintain its course," Gartner's Mark Stromberg told the BBC.

"But several challenges remain before this technology can be implemented - issues concerning thermal dissipation are among the most critical engineering challenges facing 3D semiconductor technology."

http://www.bbc.co.uk/news/technology-11734909
ID: 35745 · Report as offensive     Reply Quote
Profile Byron Leigh Hatch @ team Carl ...
Avatar

Send message
Joined: 30 Aug 05
Posts: 505
Canada
Message 35792 - Posted: 18 Nov 2010, 18:15:18 UTC

PASADENA, Calif.-- A new image from NASA's Wide-field Infrared Survey Explorer shows what looks like a glowing jellyfish floating at the bottom of a dark, speckled sea. In reality, this critter belongs to the cosmos -- it's a dying star surrounded by fluorescing gas and two very unusual rings ...

http://www.nasa.gov/mission_pages/WISE/news/wise20101117.html
ID: 35792 · Report as offensive     Reply Quote
Profile Byron Leigh Hatch @ team Carl ...
Avatar

Send message
Joined: 30 Aug 05
Posts: 505
Canada
Message 35802 - Posted: 19 Nov 2010, 15:52:12 UTC

How Can Warming Cause Colder Winters ?

It may sound pretty crazy at first, but that's because we give the wrong name to what really is going on around us. Our changing climate isn't all about temperatures.

Several new studies -- most recently that of physicist Vladimir Petoukhov and colleagues in Germany and Russia, reporting in the Journal of Geophysical Research -- have been pointing to a warming Arctic to explain recent severely cold winters in the Northern Hemisphere.



Warming temperatures may be at the root of it all -- on a planetary scale -- but where the changes hit the pavement on a regional scale, where you and I live, it is not necessarily the warming that we are going to remember.

Like a furniture mover running amok in a comfortable room, temperature changes are rearranging important features of our climate system -- altering patterns of cloudiness, for instance, as well as ocean currents, glaciers and ice caps. Most important for the Northern Hemisphere's winters is the loss of ice floating on the surface of the Arctic Ocean.


What happens in Las Vegas may stay in Las Vegas, as they say, but this definitely is not true of the Arctic. Instead of the sunlight bouncing off the bright sea ice and reflecting back into space, the exposed ocean now absorbs its warmth -- changing not just the temperature of the water but the circulation of the atmosphere above it.

Read More here ...

http://news.discovery.com/earth/inside-the-cold-winter-paradox.html
ID: 35802 · Report as offensive     Reply Quote
Profile Byron Leigh Hatch @ team Carl ...
Avatar

Send message
Joined: 30 Aug 05
Posts: 505
Canada
Message 35836 - Posted: 22 Nov 2010, 16:39:09 UTC



United Launch Alliance (ULA) Delta IV Heavy Rocket Roars Off Launch Pad on Secret NRO Mission

Posted in: Breaking News, Military, Missions, Satellites, Space Flight by Jason

A Delta IV Heavy lifts off from Cape Canaveral Air Force Station at 5:58 p.m. EDT, Nov. 21, 2010, carrying a secret NRO payload.
Photo Credit: Universe Today/Alan Walters - awaltersphoto.com

CAPE CANAVERAL
United Launch Alliance (ULA) successfully launched a Delta IV Heavy rocket from Cape Canaveral Air Force Station in Florida, sending a classified surveillance satellite to space. Liftoff occurred on Nov. 21 at 5:58 p.m. EDT. The enormous rocket thundered to life and, almost as if to underscore the secretive nature of the mission, its fiery exhaust was visible for only a short while before disappearing into thick clouds. Long after the rocket was out of view, however, it made its journey known through its roar: the vibration was so visceral that vehicles and the windows of nearby buildings began to rattle with the raw power that was unleashed ...

http://www.universetoday.com/79635/delta-iv-heavy-roars-off-launch-pad-on-nro-mission/

ID: 35836 · Report as offensive     Reply Quote
Profile Byron Leigh Hatch @ team Carl ...
Avatar

Send message
Joined: 30 Aug 05
Posts: 505
Canada
Message 35856 - Posted: 23 Nov 2010, 15:31:13 UTC



Singapore's Agency for Science, Technology and Research (A*STAR) partners with 10 EU research organisations to work on a groundbreaking project that lays the foundation for creating and testing a molecular-sized processor chip.

Prof Christian Joachim
Visiting Investigator
Institute of Materials Research and Engineering (IMRE)
Agency for Science, Technology and Research (A*STAR), Singapore

Singapore and European research organizations are working together to build what is essentially a single-molecule processor chip. As a comparison, a thousand such molecular chips could fit into one of today's microchips.

The ambitious project, termed Atomic Scale and Single Molecule Logic Gate Technologies (ATMOL), will establish a new process for making a complete molecular chip. This means that computing power could be increased significantly while taking up only a small fraction of the space required by today's chips.

The fabrication process involves the use of three unique ultra high vacuum (UHV) atomic scale interconnection machines which build the chip atom-by-atom. These machines physically move atoms into place one at a time at cryogenic temperatures ...

http://www.a-star.edu.sg/?TabId=828&articleType=ArticleView&articleId=1393

ID: 35856 · Report as offensive     Reply Quote
Profile Byron Leigh Hatch @ team Carl ...
Avatar

Send message
Joined: 30 Aug 05
Posts: 505
Canada
Message 35897 - Posted: 27 Nov 2010, 14:49:18 UTC - in response to Message 34143.  

P ≠ NP: is it bad news for the power of computing?

Has the biggest question in computer science been solved? On 6 August 2010, Vinay Deolalikar, a mathematician at Hewlett-Packard Labs in Palo Alto, California, sent out draft copies of a paper titled simply "P ≠ NP".

This terse assertion could have profound implications for the ability of computers to solve many kinds of problem. It also answers one of the Clay Mathematics Institute's seven Millennium Prize problems, so if it turns out to be correct, Deolalikar will have earned himself a $1 million prize.

The P versus NP question concerns the speed at which a computer can accomplish a task such as factorising a number. Some tasks can be completed reasonably quickly – in technical terms, the running time is proportional to a polynomial function of the input size – and these tasks are in class P.
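
The distinction can be made concrete in code (an illustrative sketch of my own, not from the article). Checking whether a list is sorted takes one linear pass, squarely in class P, while the obvious brute-force attack on a classic NP problem such as subset sum examines all 2^n subsets:

```python
from itertools import combinations

def has_subset_sum(nums, target):
    # Brute force: try every one of the 2^n subsets (exponential time).
    # Subset sum is NP: a claimed solution is quick to *verify*, but no
    # polynomial-time algorithm for *finding* one is known.
    return any(sum(c) == target
               for r in range(len(nums) + 1)
               for c in combinations(nums, r))

def is_sorted_ascending(nums):
    # One linear pass over adjacent pairs: a class-P task.
    return all(a <= b for a, b in zip(nums, nums[1:]))

print(has_subset_sum([3, 9, 8, 4], 12))   # True  (3 + 9, or 8 + 4)
print(has_subset_sum([3, 9, 8], 5))       # False
print(is_sorted_ascending([1, 2, 2, 7]))  # True
```

If P ≠ NP, as Deolalikar's draft claimed, no clever trick can ever collapse that exponential search into polynomial time.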

read more ...

New Scientist Physics & Math


Neo, a project scientist at AQUA@home, posted the following:

Ah yes, that P!=NP "proof" was debunked back in August. :) There are a bunch of posts on a blog by Richard Lipton about it, along with hundreds of comments discussing the issues with the paper: http://rjlipton.wordpress.com/2010/08/15/the-p%e2%89%a0np-proof-is-one-week-old/


http://aqua.dwavesys.com/forum_thread.php?id=670#9283
ID: 35897 · Report as offensive     Reply Quote
Profile Byron Leigh Hatch @ team Carl ...
Avatar

Send message
Joined: 30 Aug 05
Posts: 505
Canada
Message 35983 - Posted: 6 Dec 2010, 18:49:10 UTC

The 70 Online Databases that Define Our Planet

If you want to simulate the Earth, you'll need data on the
climate, health, finance, economics, traffic and lots more.
Here's where to find it.


The Physics arXiv Blog, published by MIT, produces daily coverage of the best new ideas from an online forum called the Physics arXiv, on which scientists post early versions of their latest ideas. Contact: KentuckyFC @ arxivblog.com

Back in April, we looked at an ambitious European plan to simulate the entire planet. The idea is to exploit the huge amounts of data generated by financial markets, health records, social media and climate monitoring to model the planet's climate, societies and economy. The vision is that a system like this can help to understand and predict crises before they occur so that governments can take appropriate measures in advance.

There are numerous challenges here. Nobody yet has the computing power necessary for such a task, nor are there models that can accurately simulate even much smaller systems. But before any of that is possible, researchers must gather the economic, social and technological data needed to feed this machine.

Today, we get a grand tour of this challenge from Dirk Helbing and Stefano Balietti at the Swiss Federal Institute of Technology in Zurich. Helbing is the driving force behind this project and the man who will lead it if he gets the EUR 1 billion he needs from the European Commission ...

read more here ...

http://www.technologyreview.com/blog/arxiv/26097/?p1=A2
ID: 35983 · Report as offensive     Reply Quote
Previous · 1 · 2 · 3 · 4 · 5 · 6 · 7 · 8 . . . 10 · Next

Message boards : The Lounge : Science and Technology in the News

Copyright © 2024 University of California.
Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version 1.2 or any later version published by the Free Software Foundation.