Aug 23

US government developing ultimate cyber weapon; Prime-factoring quantum computing makes encryption obsolete (Natural News, Aug 20, 2012)

The U.S. government is making steady progress on a game-changing technology that would give it the most powerful weapon ever devised in the realm of cyber warfare and information dominance. The weapon is called a “prime-factoring quantum computer,” and a small-scale version has already been demonstrated by researchers at UC Santa Barbara, where qubits — quantum bits of computational potential — factored the number 15 into its prime factors, three and five.
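As a classical point of comparison, the same factorization is trivial by trial division (a sketch of my own, not what the UC Santa Barbara hardware does); the point of the quantum demonstration is that this brute-force approach scales hopelessly once the numbers reach hundreds of digits:

```python
def prime_factors(n):
    """Factor n by trial division -- instant for 15, infeasible for RSA-sized numbers."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

print(prime_factors(15))  # [3, 5]
```

Trial division takes on the order of the square root of n steps, which doubles with every two additional bits of key length.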

So what, you say? Can’t any fifth grader do the same thing?

But hold on: The public-key encryption that protects most digital communication today depends on the extreme mathematical difficulty of factoring very large numbers into their prime factors. When you buy something on the internet, for example, your credit card number is sent to the merchant using something called “SSL encryption,” which typically uses a 40-bit, 128-bit or sometimes even a 256-bit cipher. Anyone who intercepted your web form data would not be able to extract your credit card number without first decrypting it, a task that requires extraordinary computing power.

For example, brute-forcing strong “military grade” encryption with a sufficiently long key would take a supercomputer longer than the age of the known universe. This is why the U.S. military uses such encryption. It’s virtually unbreakable given today’s computers.
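A rough back-of-the-envelope sketch shows why brute force is hopeless against long keys (the trillion-keys-per-second rate is an assumed figure for illustration only):

```python
AGE_OF_UNIVERSE_YEARS = 1.38e10  # roughly 13.8 billion years
KEYS_PER_SECOND = 1e12           # assume a machine testing a trillion keys per second
SECONDS_PER_YEAR = 3.156e7

def years_to_brute_force(key_bits):
    # On average, half the keyspace is searched before the key is found.
    keyspace = 2 ** key_bits
    return (keyspace / 2) / KEYS_PER_SECOND / SECONDS_PER_YEAR

for bits in (40, 128, 256):
    print(bits, years_to_brute_force(bits))
```

A 40-bit key falls in under a second, while a 128-bit key already takes billions of times the age of the universe, which is why quantum shortcuts, rather than faster brute force, are what would actually change the picture.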

But quantum computers have the spooky ability to process complex decryption algorithms using what some scientists believe are computational bits which coexist in an infinite number of parallel universes. You feed the quantum computer a decryption task, and it “calculates” the answer in all possible parallel universes. The correct answer then emerges in this universe, seemingly magically.
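The “magic” the article is describing is Shor’s algorithm: the quantum hardware finds the period of a^x mod N, and ordinary number theory turns that period into factors. A classical stand-in (brute-forcing the period, which is exactly the step a quantum computer accelerates) shows the post-processing for N = 15:

```python
from math import gcd

def shor_classical(N, a):
    """Classical stand-in for Shor's algorithm: find the period r of a^x mod N,
    then derive factors from gcd(a^(r//2) - 1, N) and gcd(a^(r//2) + 1, N)."""
    # Find the smallest r > 0 with a^r == 1 (mod N) -- the quantum speedup lives here.
    r = 1
    while pow(a, r, N) != 1:
        r += 1
    if r % 2 != 0:
        return None  # odd period: would retry with a different a
    x = pow(a, r // 2, N)
    p = gcd(x - 1, N)
    if 1 < p < N:
        return sorted((p, N // p))
    return None

print(shor_classical(15, 7))  # period of 7 mod 15 is 4, yielding factors [3, 5]
```

For N = 15 and a = 7 the period is 4, so 7² = 49 ≡ 4 (mod 15), and gcd(3, 15) and gcd(5, 15) recover the factors 3 and 5.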

Quantum computing appears to break the laws of physics… yeah, it’s spooky.


Jul 13

“The project took JP Morgan around three years, and the bank is now looking to push it into other areas of the business, such as high frequency trading.”

JP Morgan supercomputer offers risk analysis in near real-time (Computerworld UK, July 11, 2011):

JP Morgan is now able to run risk analysis and price its global credit portfolio in near real-time after implementing application-led, High Performance Computing (HPC) capabilities developed by Maxeler Technologies.

The investment bank worked with HPC solutions provider Maxeler Technologies to develop an application-led, HPC system based on Field-Programmable Gate Array (FPGA) technology that would allow it to run complex banking algorithms on its credit book faster.

JP Morgan uses mainly C++ for its pure analytical models and Python for the supporting tooling. For the new Maxeler system, it flattened the C++ code down to Java code. The company also supports Excel and all the different versions of Linux.

Prior to the implementation, JP Morgan would take eight hours to do a complete risk run, and an hour to run a present value, on its entire book. If anything went wrong with the analysis, there was no time to re-run it.
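For a sense of what “running a present value” on a book involves, here is a deliberately toy sketch (the positions and rate below are hypothetical, not JP Morgan’s models): each position is a stream of future cashflows discounted back to today, and a full risk run repeats this kind of calculation across millions of positions and scenarios, which is why it took hours on CPUs.

```python
def present_value(cashflows, rate):
    """Discount a stream of (year, amount) cashflows at a flat annual rate."""
    return sum(amount / (1 + rate) ** year for year, amount in cashflows)

# A toy two-position "book"; a real credit book holds millions of cashflow streams.
book = [
    [(1, 50.0), (2, 50.0), (3, 1050.0)],  # a 3-year bond paying 5% coupons
    [(1, 30.0), (2, 1030.0)],             # a 2-year bond paying 3% coupons
]
total = sum(present_value(pos, 0.04) for pos in book)
print(round(total, 2))
```

The FPGA approach wins by hard-wiring exactly this kind of arithmetic pipeline into silicon instead of executing it instruction by instruction.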

It has now reduced that to about 238 seconds, with an FPGA time of 12 seconds.

“Being able to run the book in 12 seconds end-to-end and get a value on our multi-million dollar book within 12 seconds is a huge commercial advantage for us,” Stephen Weston, global head of the Applied Analytics group in the investment banking division of JP Morgan, said at a recent lecture to Stanford University students.

“If we can compress space, time and energy required to do these calculations then it has hard business values for us. It gives us ultimately a competitive edge, the ability to run our risk more frequently, and extracting more value from our books by understanding more fully is a real commercial advantage for us.”

The faster processing time means that JP Morgan can now respond to changes in its risk position more rapidly, rather than just looking back at the previous day’s risk profile, which was produced by overnight analyses.

The speed also allows the bank to identify potential problems and deal with them in advance. For example, JP Morgan can now run scenarios to assess its exposure to events such as the Irish or Greek debt problems, analyses that Weston said “wouldn’t have even been thinkable” before.



Jul 01

Los Alamos Takes Supercomputers Offline (Data Center Knowledge, June 30, 2011):

The wildfire threatening Los Alamos, New Mexico has gained national attention, largely due to concerns about the safety of nuclear waste at Los Alamos National Laboratory, which played a key role in the Manhattan Project and in nuclear weapons development and testing. As we noted Monday, the Department of Energy facility also houses two of the world’s leading supercomputers, the Cielo and Roadrunner systems. Those systems have been taken offline, Computerworld reports.

“A Los Alamos spokeswoman said the laboratory conducted an ‘orderly shutdown’ of two of its largest supercomputers,” writes Patrick Thibodeau at Computerworld: “IBM’s Roadrunner, the first to break the petaflop barrier in 2008 and now the 10th most powerful supercomputer in the world, and Cielo, a Cray system that is ranked No. 6 on the Top500 list. The supercomputer shutdowns were conducted ‘early on,’ but an exact day or reason for the action wasn’t clear.”



Oct 28

Now China will be No.1 in HIGH-FREQUENCY TRADING!

See also: China Unveils World Speed Record Train Line

A Chinese scientific research center has built the fastest supercomputer ever made, replacing the United States as maker of the swiftest machine, and giving China bragging rights as a technology superpower.

The computer, known as Tianhe-1A, has 1.4 times the horsepower of the current top computer, which is at a national laboratory in Tennessee, as measured by the standard test used to gauge how well the systems handle mathematical calculations, said Jack Dongarra, a University of Tennessee computer scientist who maintains the official supercomputer rankings.
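Taking the Rmax figures later published on the November 2010 Top500 list as given (2.566 petaflops for Tianhe-1A, 1.759 petaflops for the Jaguar system in Tennessee — reported values, not measurements made here), the ratio works out to roughly the 1.4 times quoted:

```python
# Rmax figures (petaflops) as reported on the November 2010 Top500 list.
tianhe_1a_pflops = 2.566  # National Supercomputing Center, Tianjin
jaguar_pflops = 1.759     # Oak Ridge National Laboratory, Tennessee

print(round(tianhe_1a_pflops / jaguar_pflops, 2))  # ~1.46x
```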

Although the official list of the top 500 fastest machines, which comes out every six months, is not due to be completed by Mr. Dongarra until next week, he said the Chinese computer “blows away the existing No. 1 machine.” He added, “We don’t close the books until Nov. 1, but I would say it is unlikely we will see a system that is faster.”

Officials from the Chinese research center, the National University of Defense Technology, are expected to reveal the computer’s performance on Thursday at a conference in Beijing. The center says it is “under the dual supervision of the Ministry of National Defense and the Ministry of Education.”

The race to build the fastest supercomputer has become a source of national pride as these machines are valued for their ability to solve problems critical to national interests in areas like defense, energy, finance and science. Supercomputing technology also finds its way into mainstream business; oil and gas companies use it to find reservoirs and Wall Street traders use it for superquick automated trades. Procter & Gamble even uses supercomputers to make sure that Pringles go into cans without breaking.

And typically, research centers with large supercomputers are magnets for top scientific talent, adding significance to the presence of the machines well beyond just cranking through calculations.

Over the last decade, the Chinese have steadily inched up in the rankings of supercomputers. Tianhe-1A stands as the culmination of billions of dollars in investment and scientific development, as China has gone from a computing afterthought to a world technology superpower.

“What is scary about this is that the U.S. dominance in high-performance computing is at risk,” said Wu-chun Feng, a supercomputing expert and professor at Virginia Polytechnic Institute and State University. “One could argue that this hits the foundation of our economic future.”

Modern supercomputers are built by combining thousands of small computer servers and using software to turn them into a single entity. In that sense, any organization with enough money and expertise can buy what amount to off-the-shelf components and create a fast machine.
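That “many servers acting as one” idea can be sketched, very loosely, with a single machine’s processes standing in for cluster nodes (real systems coordinate thousands of nodes over a fast interconnect with MPI, not Python processes):

```python
# Loose analogy for clustering: split one job across workers, combine the results.
from multiprocessing import Pool

def partial_sum(bounds):
    """Each 'node' sums its own slice of the range."""
    lo, hi = bounds
    return sum(range(lo, hi))

if __name__ == "__main__":
    n, workers = 10_000_000, 4
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    with Pool(workers) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total == sum(range(n)))  # the distributed result matches the serial one
```

The engineering difficulty in a real supercomputer is not the splitting but the communication: keeping tens of thousands of nodes fed with data and synchronized.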


Tags: , , ,

Jun 09

WASHINGTON – Scientists unveiled the world’s fastest supercomputer on Monday, a $100 million machine that for the first time has performed 1,000 trillion calculations per second in a sustained exercise.

The technology breakthrough was accomplished by engineers from the Los Alamos National Laboratory and the IBM Corp. on a computer to be used primarily on nuclear weapons work, including simulating nuclear explosions.

The computer, named Roadrunner, is twice as fast as IBM’s Blue Gene system at Lawrence Livermore National Laboratory, which itself is three times faster than any of the world’s other supercomputers, according to IBM.

“The computer is a speed demon. It will allow us to solve tremendous problems,” said Thomas D’Agostino, head of the National Nuclear Security Administration, which oversees nuclear weapons research and maintains the warhead stockpile.

Tags: , , ,

Mar 06

Out of the Blue

Can a thinking, remembering, decision-making, biologically accurate brain be built from a supercomputer?

In the basement of a university in Lausanne, Switzerland, sit four black boxes, each about the size of a refrigerator and filled with 2,000 IBM microchips stacked in repeating rows. Together they form the processing core of a machine that can handle 22.8 trillion operations per second. It contains no moving parts and is eerily silent. When the computer is turned on, the only thing you can hear is the continuous sigh of the massive air conditioner. This is Blue Brain.

Tags: , , , ,