Beginning in the early 2000s, computer-processor design stalled out. The semiconductor industry could no longer increase processor clock speeds to improve software performance because there was simply no way to cool the chips. By 2004, semiconductor designers, led by Intel, were adding multiple processors (cores) to a single chip so that they could continue to deliver on Moore’s Law: in essence, the doubling of microprocessor performance every two years. The common thinking was that advances in performance would come from adding processing cores rather than from increasing clock speed.
Unfortunately, with few exceptions, the world’s 25-million-plus programmers do not have a good way to program multiple cores and fully utilize them. Nearly all of the world’s applications use only a single core at any given time, while the unused cores remain dark (idle).
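To make the point concrete, here is a minimal sketch of what spreading work across cores looks like in ordinary code, using Python's standard-library `concurrent.futures` module; the prime-counting workload and the function names are hypothetical examples, not anything from a specific application.

```python
import os
from concurrent.futures import ProcessPoolExecutor


def count_primes(bounds):
    """Hypothetical CPU-bound workload: count primes in [lo, hi)."""
    lo, hi = bounds
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count


def count_primes_parallel(limit, workers=None):
    """Split [0, limit) into chunks and count primes on several cores."""
    workers = workers or os.cpu_count()
    chunk = limit // workers + 1
    ranges = [(i * chunk, min((i + 1) * chunk, limit))
              for i in range(workers)]
    # Each chunk runs in a separate worker process, so the work can
    # occupy multiple cores instead of leaving all but one idle.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(count_primes, ranges))


if __name__ == "__main__":
    print(count_primes_parallel(100_000))
```

Even in this simple case the programmer must manually partition the work, choose a worker count, and combine the partial results; doing the same for an irregular, stateful application is far harder, which is part of why so many cores sit idle.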