The Evolution of Computers
How computers have changed since calculators
You stare at your computer with bated breath. The clock strikes 23:00 and keeps ticking toward midnight. 23:30, the seconds spilling over into minutes, each one bringing it closer. 23:59: you are in the final minute now, and, holding your breath, you watch the time strike midnight. But nothing happens. The date simply rolls over to Jan. 1st, 2000, 00:00:01. Perhaps some of you reading will recognise the year 2000 problem, where many people feared that, because of the way computers stored years, 2000 would be indistinguishable from 1900 and cause computer systems around the world to fail. That never happened, but why was this even a problem in the first place?
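As a rough illustration (a minimal Python sketch, not code from any real system, with the function name stored_year made up for this example), many older programs kept only the last two digits of the year to save memory:

    def stored_year(full_year):
        # Keep only the last two digits of the year, as many old systems did to save space.
        return full_year % 100

    print(stored_year(1900))  # prints 0
    print(stored_year(2000))  # also prints 0

To such a program, 1900 and 2000 look identical, so a date comparison or age calculation run in the year 2000 could behave as if it were 1900.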
For as long as people have had computers, they have tried to make them faster. In the late 1900s, data storage was expensive, at $10 a kilobyte if you were lucky; now you can get 100 GB for $10. Simply because there is money to be made when a computer is even slightly faster, people pushed the limits of what was possible.
Moore’s law described this trend. “Computing processing essentially doubled every year,” says Brandon Allmon, an engineer at Dell. “[We’re] kind of hitting the end of that,” Allmon states. “[And] so you have to keep coming up with new ways to do it: density on chip, different materials. [It] used to be silicon, now it’s a lot of copper. When you get down to [those] sizes, you’re actually bumping down into the size of the actual atoms being a problem.”

We know our ability to make