Professor Seth Lloyd talks about the world's smallest universe, quantum mechanics, quantum computers, and pushing Moore's Law beyond the capacity of the human brain.
A quantum computer is any computational device that makes direct use of distinctively quantum mechanical phenomena, such as superposition and entanglement, to perform operations on data. In a classical (or conventional) computer, information is stored as bits; in a quantum computer, it is stored as qubits (quantum bits). The basic principle of quantum computation is that quantum properties can be used to represent and structure data, and that quantum mechanisms can be devised and built to perform operations on this data.

Although quantum computing is still in its infancy, experiments have been carried out in which quantum computational operations were executed on a very small number of qubits. Research in both theoretical and practical areas continues at a frantic pace, and many national government and military funding agencies support quantum computing research to develop quantum computers for both civilian and national security purposes, such as cryptanalysis. (See Timeline of quantum computing for details on current and past progress.)

If large-scale quantum computers can be built, they will be able to solve certain problems exponentially faster than the best known classical algorithms (integer factorization via Shor's algorithm, for example). Quantum computers are different from other computers such as DNA computers and traditional computers based on transistors. Some computing architectures, such as optical computers, may use classical superposition of electromagnetic waves; but without specifically quantum mechanical resources such as entanglement, they have less potential for computational speed-up than quantum computers.
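The two quantum resources named above, superposition and entanglement, can be illustrated with a minimal state-vector simulation. This is a sketch for illustration only (NumPy, the gate matrices, and the variable names are my choices, not anything from the source): a Hadamard gate puts one qubit into an equal superposition, and a CNOT gate then entangles it with a second qubit, producing a Bell state.

```python
import numpy as np

# Computational basis states for a single qubit.
ZERO = np.array([1, 0], dtype=complex)

# Hadamard gate: maps |0> to the equal superposition (|0> + |1>)/sqrt(2).
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

# CNOT gate on two qubits (first qubit is the control).
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Superposition: one qubit in state H|0>.
plus = H @ ZERO
print(np.round(plus, 3))        # amplitudes [0.707, 0.707]

# Entanglement: apply CNOT to (H|0>) ⊗ |0>, giving the Bell state
# (|00> + |11>)/sqrt(2) -- neither qubit has a definite value alone.
bell = CNOT @ np.kron(plus, ZERO)

# Measurement probabilities are squared amplitude magnitudes:
# 50% chance of reading 00, 50% chance of 11, never 01 or 10.
probs = np.abs(bell) ** 2
print(np.round(probs, 2))       # [0.5, 0., 0., 0.5]
```

The state vector for n qubits has 2^n complex amplitudes, which is exactly why classical simulation of this kind scales so badly and why native quantum hardware is interesting.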
Seth Lloyd is a professor of mechanical engineering at MIT. He refers to himself as a "quantum mechanic". Lloyd was born on August 2, 1960. He received his AB from Harvard College in 1982, his Math.Cert. and M.Phil. from Cambridge University in 1983 and 1984, and his Ph.D. from Rockefeller University in 1988 (advisor Heinz Pagels) for a thesis entitled "Black Holes, Demons, and the Loss of Coherence: How complex systems get information, and what they do with it." After postdoctoral fellowships at the California Institute of Technology and Los Alamos National Laboratory, he joined MIT in 1994.

His research area is the interplay of information with complex systems, especially quantum systems. He has made contributions to the field of quantum computation and proposed a design for a quantum computer.

In his book Programming the Universe, Lloyd contends that the universe itself is one big quantum computer producing what we see around us, and ourselves, as it runs a cosmic program. According to Lloyd, once we understand the laws of physics completely, we will be able to use small-scale quantum computing to understand the universe completely as well. Lloyd states that we could have the whole universe simulated in a computer in 600 years, provided that computational power increases according to Moore's Law. However, Lloyd shows that there are limits to rapid exponential growth in a finite universe, and that it is very unlikely that Moore's Law will be maintained indefinitely.

Lloyd is principal investigator at the MIT Research Laboratory of Electronics and directs the Center for Extreme Quantum Information Theory (xQIT) at MIT.
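The flavor of the 600-year claim can be recovered with a back-of-envelope calculation. Every number below is an assumption for illustration, not a figure from the source: Lloyd has elsewhere estimated the universe has performed on the order of 10^120 elementary operations; suppose a present-day machine sustains ~10^18 operations per second and that capability doubles every two years under Moore's Law.

```python
import math

# Assumed figures (illustrative, not from the source text):
ops_universe = 1e120                  # ~total ops the universe has performed
ops_per_sec_today = 1e18              # ~an exascale machine today
seconds_per_year = 3.15e7
doubling_period_years = 2.0           # Moore's Law doubling time

ops_per_year_today = ops_per_sec_today * seconds_per_year

# Solve ops_per_year_today * 2^(t / doubling_period) >= ops_universe for t:
t = doubling_period_years * math.log2(ops_universe / ops_per_year_today)
print(round(t))   # a few hundred years, the same order as Lloyd's 600-year figure
```

The point of the exercise, and of Lloyd's caveat, is the second half of the claim: sustaining that exponential for centuries runs into the physical limits of a finite universe, so the extrapolation is a thought experiment rather than a forecast.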