
Need for speed

THE EVOLUTION OF COMPUTERS FROM MEGABYTE TO TERABYTE

Erik Cheng and Sophia Sunbury


Computers, regarded as a constant in modern society, have evolved significantly over the past 40 years. According to the University of Minnesota, Charles Babbage, working with Ada Lovelace, designed the first computer, the Analytical Engine, in 1833 to run calculations.

Since then, computers have slowly evolved to take on additional functions and inputs, and government agencies like the Census Bureau began adopting them for faster information processing.

However, one of the most significant breakthroughs was IBM's first personal computer, the IBM 5150.

Introduced on Aug. 12, 1981, the IBM 5150 shocked the world by demonstrating that anyone could own a personal computer (PC) and operate it relatively easily. The 5150 offered the information processing functions of large government information centers for $1,600 (or $5,224 in today's money).

The central processing unit (CPU), considered a computer's brain, takes in information and sends instructions to the rest of the computer. The IBM 5150 utilized a then-powerful processing chip known as the Intel 8088. Running at a speed of 4.7 million cycles per second (or 4.7 megahertz), the 8088 handled information using 29,000 transistors: small, electricity-regulating components that allow current to pass through only at specific intervals. Modern processors like the Intel i9-10900X or AMD Ryzen 9 7950X, considered top-of-the-line chips, run at jaw-dropping speeds of up to 5.7 billion cycles per second (or 5.7 gigahertz). Both outclass the 8088, whose 29,000 transistors pale next to the more than 13 billion found on today's most advanced chips.

Screen text design by Oliver Fichte

Art Element by Oliver Fichte

A human heart, by comparison, beats only one to three times per second (one to three hertz), so the 8088 runs more than a million times faster than a heartbeat, while the modern Intel and AMD processors run billions of times faster.
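As a rough sketch of that arithmetic, a few lines of Python can reproduce the comparisons; the clock speeds come from the figures above, while the heartbeat rate of roughly 150 beats per minute (2.5 beats per second) is an assumed round number.

```python
# Back-of-the-envelope comparison of the clock rates discussed above.
IBM_8088_HZ = 4.7e6      # IBM 5150's Intel 8088: 4.7 million cycles per second
MODERN_CPU_HZ = 5.7e9    # top modern processors: up to 5.7 billion cycles per second
HEARTBEAT_HZ = 2.5       # assumed figure: roughly 150 beats per minute

print(f"8088 vs. heartbeat:       {IBM_8088_HZ / HEARTBEAT_HZ:,.0f}x faster")
print(f"Modern CPU vs. heartbeat: {MODERN_CPU_HZ / HEARTBEAT_HZ:,.0f}x faster")
print(f"Modern CPU vs. 8088:      {MODERN_CPU_HZ / IBM_8088_HZ:,.0f}x faster")
```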

One of the most considerable leaps in computer technology has been storage. A byte of data holds roughly one typed character that the computer can process. As technology advanced, storage came to be measured in exponentially larger units: kilobytes, megabytes, gigabytes, and terabytes.

The IBM 5150 wrote bytes onto removable magnetic media known as floppy disks, each holding up to 360,000 bytes, or 360 kilobytes. In contrast, modern computers store trillions of bytes, or terabytes, of data, and much of it lives in solid-state flash memory with no moving parts, packing far more data into far less space.

The popular social media platform BeReal needs 9.8 megabytes of free space to be installed and run on a smartphone. For the 5150 to properly run BeReal, it would need over 27 floppy disks.
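The floppy-disk math works out as a quick sketch in Python, assuming decimal units (1 megabyte = 1,000 kilobytes); the 9.8-megabyte install size is the figure cited above.

```python
import math

FLOPPY_KB = 360     # capacity of an IBM 5150 floppy disk, in kilobytes
APP_MB = 9.8        # BeReal install size cited above, in megabytes

app_kb = APP_MB * 1_000        # assuming decimal units: 1 megabyte = 1,000 kilobytes
disks = app_kb / FLOPPY_KB     # about 27.2 disks' worth of data
print(f"{disks:.1f} disks of data, so {math.ceil(disks)} floppies in practice")
```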

Random access memory (RAM) is the fast, short-term storage where a computer keeps the instructions and data of running applications. Installed apps like Google Chrome, Calendar, or social media platforms use RAM to function properly. The more memory a computer has, the more information it can hold ready for the processor and pass quickly to other parts of the system. The IBM 5150 had a memory capacity of 64 kilobytes; today, customers can purchase RAM kits with capacities of up to 256 gigabytes, or 256 million kilobytes, roughly four million times the 5150's memory.
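A similar sketch, again assuming decimal units, puts that memory gap in perspective.

```python
IBM_5150_RAM_KB = 64     # IBM 5150 memory capacity, in kilobytes
MODERN_RAM_GB = 256      # high-end modern RAM kit, in gigabytes

modern_kb = MODERN_RAM_GB * 1_000_000   # 1 gigabyte = 1,000,000 kilobytes (decimal units)
ratio = modern_kb // IBM_5150_RAM_KB
print(f"{modern_kb:,} kilobytes, about {ratio:,} times the 5150's memory")
```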

A key innovation of modern computer technology is the separate graphics processor. Known as the graphics processing unit, or GPU, it handles visual tasks like displaying text on a screen or playing videos, leaving more general work for the CPU. The NVIDIA RTX 4090, a behemoth of GPU technology released in 2022, handles tasks at rates of up to 2.52 gigahertz. NVIDIA, the company that invented the GPU, pairs its models with dedicated memory chips, known as VRAM, that hold the texture, depth, and shading data used to build visuals. The RTX 4090 includes 24 gigabytes of VRAM to render realistic graphics for customers.
