Computers and Computational Complexity
Moore's law, generally speaking, estimates that the number of transistors that can be placed on a microprocessor doubles about every 18 months. This sometimes leads people to believe that eventually, computers will be able to solve any mathematical problem, no matter how complicated (see Turing test). This is not the case: computers are fundamentally limited by the constraints of physics, and certain upper bounds on what to expect can reasonably be formulated. There are also theoretical results showing that some problems are inherently beyond the reach of complete computational solution, no matter how powerful or fast the computation; see n-body problem.
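As a rough illustration of that growth rate, the sketch below projects a transistor count forward under the 18-month doubling assumption; the starting figure of 100 million transistors is a hypothetical example, not a figure from the text:

```python
def projected_transistors(initial: float, years: float, doubling_period: float = 1.5) -> float:
    """Exponential growth: the count doubles every `doubling_period` years."""
    return initial * 2 ** (years / doubling_period)

# A hypothetical chip with 100 million transistors grows roughly
# 100-fold over a decade of 18-month doublings.
print(f"{projected_transistors(1e8, 10):.2e}")   # ~1.02e+10
```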
Between 1980 and 2000, hard disk sizes increased from about 10 megabytes (10^7 bytes) to over 100 gigabytes (10^11 bytes). A 100-gigabyte disk could store the given names of all of Earth's seven billion inhabitants without using data compression. But what about a dictionary-on-disk storing all possible passwords containing up to 40 characters? Assuming each character equals one byte, there are about 2^320 such passwords, which is about 2 × 10^96. In his paper Computational capacity of the universe, Seth Lloyd points out that if every particle in the universe could be used as part of a huge computer, it could store only about 10^90 bits, less than one millionth of the size such a dictionary would require. However, storing information on a hard disk and computing with it are very different functions. Storage is subject to the limits just stated, but computational speed is a different matter, and it is quite conceivable that the limits on storage have no bearing on the limits of actual computational capacity, especially if current research into quantum computers results in a "quantum leap".
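A quick sanity check of these figures, as a minimal Python sketch (the byte-per-character assumption and Lloyd's 10^90-bit estimate are taken directly from the text above):

```python
# Number of 40-character passwords at one byte (8 bits) per character:
# 256^40 = 2^320 possibilities.
passwords = 2 ** 320

# Seth Lloyd's estimate of the storage capacity of the entire universe.
universe_bits = 10 ** 90

print(f"passwords:     {passwords:.2e}")                  # ~2.14e+96
print(f"universe bits: {universe_bits:.2e}")              # 1.00e+90
print(f"ratio:         {passwords / universe_bits:.2e}")  # ~2.14e+06: millions of times too small
```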
Still, computers can easily be programmed to start creating and displaying all possible 40-character passwords one at a time. Such a program could be left to run indefinitely. Assuming a modern PC could output one billion strings per second, it would take one billionth of 2 × 10^96 seconds, or 2 × 10^87 seconds, to complete its task, which is about 6 × 10^79 years. By contrast, the universe is estimated to be 13.7 billion (1.37 × 10^10) years old. Computers will presumably continue to get faster, but the same paper mentioned before estimates that the entire universe functioning as a giant computer could have performed no more than 10^120 operations since the Big Bang. This is about 5 × 10^23 times the computation required to display all 40-character passwords, but computing all 50-character passwords (about 2^400, or 2.6 × 10^120, of them) would outstrip the estimated computational potential of the entire universe.
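The timing arithmetic can be checked the same way; the rate of one billion strings per second is the assumption stated above:

```python
SECONDS_PER_YEAR = 3.156e7        # about 365.25 days of 86,400 seconds

passwords = 2 ** 320              # all 40-character, one-byte-per-character strings
rate = 1e9                        # assumed strings displayed per second
seconds = passwords / rate
years = seconds / SECONDS_PER_YEAR

print(f"{seconds:.1e} seconds ~= {years:.1e} years")   # ~2.1e+87 s ~= ~6.8e+79 years
```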
Problems like this grow exponentially in the number of computations they require, and they illustrate why exponentially difficult problems are called "intractable" in computer science: even for small parameters like the 40 or 50 characters described above, the number of computations required exceeds even theoretical limits on humanity's computing power. The traditional division between "easy" and "hard" problems is thus drawn between problems whose resource requirements grow polynomially with the size of the input and those whose requirements grow exponentially.
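To make the distinction concrete, here is a toy Python comparison of a polynomial cost (n^3) against an exponential one (2^n); the specific functions are illustrative choices, not drawn from the text:

```python
for n in (10, 20, 40, 50):
    print(f"n={n:2d}: n^3 = {n ** 3:>7,d}, 2^n = {2 ** n:.2e}")
# n^3 grows from 1,000 to 125,000 as n goes from 10 to 50;
# 2^n explodes from ~1.0e+03 to ~1.1e+15 over the same range.
```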
Such limits are an advantage in cryptography, since any cipher-breaking technique that requires more than, say, the 10^120 operations mentioned before will never be feasible. Such ciphers must instead be broken by finding efficient techniques unknown to the cipher's designer. Likewise, much of the research throughout all branches of computer science focuses on finding efficient solutions that use far fewer resources than a naïve approach would require. For example, one way of finding the greatest common divisor of two 1000-digit numbers is to compute all their factors by trial division; this would take up to about 2 × 10^500 division operations, far too many to contemplate. But the Euclidean algorithm, using a much more efficient technique, takes only a fraction of a second to compute the GCD of even huge numbers such as these.
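The Euclidean algorithm itself is only a few lines; this Python sketch applies it to two random 1000-digit numbers (Python's built-in math.gcd implements the same idea):

```python
import random

def gcd(a: int, b: int) -> int:
    """Euclidean algorithm: repeatedly replace (a, b) with (b, a mod b)."""
    while b:
        a, b = b, a % b
    return a

# Two random 1000-digit numbers.
a = random.randrange(10 ** 999, 10 ** 1000)
b = random.randrange(10 ** 999, 10 ** 1000)

print(gcd(a, b))   # completes in a fraction of a second, as the text notes
```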
As a general rule, then, PCs in 2005 could perform 2^40 calculations in a few minutes. A few thousand PCs working for a few years could solve a problem requiring 2^64 calculations, but no amount of traditional computing power will solve a problem requiring 2^128 operations (which is about what would be required to brute-force the encryption keys in the 128-bit SSL commonly used in web browsers, assuming the underlying ciphers remain secure). Limits on computer storage are comparable. Quantum computers may make certain problems feasible, but they face practical and theoretical challenges that may never be overcome.
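A back-of-the-envelope version of these thresholds, assuming (as above) a single PC performing roughly 10^9 operations per second:

```python
SECONDS_PER_YEAR = 3.156e7
OPS_PER_SECOND = 1e9              # assumed rate for one PC

for exponent in (40, 64, 128):
    pc_years = 2 ** exponent / OPS_PER_SECOND / SECONDS_PER_YEAR
    print(f"2^{exponent:<3d}: {pc_years:.1e} PC-years")
# 2^40 is minutes on one PC; 2^64 is centuries for one PC, so a few years
# for thousands of them; 2^128 is ~1e+22 PC-years, beyond any classical effort.
```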
See also: computation, computational complexity theory, algorithmic information theory, computability theory, and big O notation