
From the editor: Computing

David Harris


My first computer, as a child, was a Commodore VIC-20. I learned to program on it but within weeks had exhausted its 5 kilobytes of memory. My initial frustration led to a realization that I needed to use the machine's resources more efficiently, and so I managed to squeeze much more performance out of it. Soon, my ambitions exceeded the resources I had on hand. The next generation of home computers arrived before long, and the 64 kilobytes of the Commodore 64 kept me busy for a little longer. As IBM PC clones became readily available, personal-computer advances seemed to speed up. I was witnessing Moore's Law at work. The memory and processor speed of a modern home computer are almost exactly what Moore's Law would have predicted when I first started computing: a doubling in power every two years.
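That doubling rule is easy to make concrete. A minimal sketch in Python, assuming only a clean two-year doubling period (the year counts below are illustrative):

def growth_factor(years, doubling_period=2.0):
    # Multiplicative growth after `years` when capability doubles
    # every `doubling_period` years: 2 ** (years / doubling_period).
    return 2.0 ** (years / doubling_period)

for years in (2, 10, 20, 24):
    print(f"after {years:2d} years: x{growth_factor(years):,.0f}")
# after  2 years: x2
# after 10 years: x32
# after 20 years: x1,024
# after 24 years: x4,096

Twelve doublings over 24 years multiply capability by 2**12, roughly four thousand.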

Just as home computing is advancing rapidly, so is high-end computing, tied closely to advances in science. This issue of symmetry examines computing advances in relation to particle physics, in time for SC|05, the premier international conference for high-performance computing.

Particle physicists have always been among those at the leading edge of computer development. They use the latest software and hardware in almost all aspects of their work, from designing experiments to collecting and analyzing data. Physicists have consistently shown that they can put additional computing resources to effective and efficient use, often pushing the limits of discovery.

To collect data from CERN's Large Hadron Collider, computing resources will need to far exceed what is currently available. To handle all the data, physicists and computer scientists are developing new infrastructures and techniques such as grid computing, a method for sharing computing resources across global networks.

Meanwhile, making sense of the floods of data to come from the LHC will require improvements in theoretical understanding. Making precise predictions with quantum chromodynamics, the theory of the strong force that governs quark interactions in LHC collisions, is notoriously difficult. One approach to the problem is a computer-intensive technique called lattice QCD. Physicists designed custom chips to perform the lattice calculations, and those chips have since found further application in IBM's Blue Gene/L supercomputers, used to study biological processes.

Biologists, climatologists, geologists, and many other scientists are increasingly using high-end computing. But even as more sciences take advantage of computing advances, the symbiosis between physics and computing will continue, driving Moore's Law until physics has a different kind of influence on computing.

When the structures on computer chips shrink to the point where quantum effects dominate, a new type of computing will take over. Perhaps the next generation of computer-savvy children will be hacking on a new era of quantum computers, learning to drive computation and physics to further advances.

David Harris
Editor-in-Chief
