AI COMPUTING POWER'S FUTURE


The size and intelligence of the neural networks that underpin modern AI are increasing. Recent advances in machine understanding of language, for instance, have relied on building some of the largest AI models ever and feeding them massive amounts of text.

These networks have recently grown to nearly unfathomable sizes thanks to a new cluster of computer processors, which also demonstrates that making them even larger can lead to further advances in language comprehension.

Cerebras Systems has created the largest computer chip in the world, along with technology that enables a cluster of those chips to run AI models more than a hundred times larger than the biggest in use today.

According to the company, it can currently execute a neural network with 120 trillion connections, mathematical simulations of how biological synapses and neurons interact. Computations at that scale would run at roughly half that speed on existing hardware.
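
As a rough illustration of what a "connection" count means, here is a minimal sketch in Python (the layer widths are arbitrary, chosen only for this example): each weight linking a unit in one dense layer to a unit in the next is one connection, so the total is the sum of products of adjacent layer widths.

```python
# Minimal sketch: counting connections in a fully connected network.
# Layer widths here are arbitrary, chosen for illustration only.
layer_widths = [1_000, 5_000, 5_000, 1_000]

# Each weight linking a unit in one layer to a unit in the next is one
# "connection", so a dense layer contributes width_in * width_out of them.
connections = sum(a * b for a, b in zip(layer_widths, layer_widths[1:]))
print(f"{connections:,} connections")  # 35,000,000
```

Reaching 120 trillion connections means scaling the widths and the number of layers until that total is millions of times larger than in this toy example.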

Recent advances and industry interest have driven an unprecedented proliferation of AI-focused chip designs.

Whereas conventional semiconductor makers cut a wafer into pieces to build separate chips, Cerebras uses the entire wafer, packing in significantly more computing power and allowing its numerous computational units, or cores, to communicate more efficiently.

Unlike a typical GPU, which has a few hundred cores, the Wafer Scale Engine Two (WSE-2) features 850,000.

The neural network that powers GPT-3 has 175 billion parameters; GPT-4 is reported to have been trained with 100 trillion or more.

A parameter is a number, learned by an algorithm, that links words together mathematically. That is a significant advance in understanding how words relate to one another and in knowing how to put words together to form a response.
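
To make that concrete, here is a minimal, hypothetical sketch (the vocabulary, layer sizes, and random weights are illustrative, not taken from any real model): every entry of the weight matrices below is one parameter, a learned number that relates words to one another.

```python
import numpy as np

# Toy vocabulary and layer sizes -- illustrative only, not from any real model.
vocab = ["the", "cat", "sat", "on", "mat"]
vocab_size, embed_dim, hidden_dim = len(vocab), 4, 8

rng = np.random.default_rng(0)

# Every entry in these weight matrices is one "parameter": a learned
# number that helps relate words to one another. Here they are random;
# training would adjust them to reduce prediction error.
embedding = rng.normal(size=(vocab_size, embed_dim))  # word -> feature vector
hidden = rng.normal(size=(embed_dim, hidden_dim))     # mixes features
output = rng.normal(size=(hidden_dim, vocab_size))    # features -> word scores

n_params = embedding.size + hidden.size + output.size
print(f"parameters: {n_params}")  # 20 + 32 + 40 = 92

# Forward pass: scores for which word might follow "cat".
x = embedding[vocab.index("cat")]
scores = np.tanh(x @ hidden) @ output
print(dict(zip(vocab, scores.round(2).tolist())))
```

GPT-3's 175 billion parameters are numbers of exactly this kind, spread across far wider and deeper layers.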

To advance its AI research, OpenAI has stated an interest in investigating the potential of quantum computing, and it has worked with firms such as IBM and Rigetti to do so.

However, there is no information in the public domain indicating that OpenAI has specific plans to use quantum computing for its language models or other AI applications.

It is still unknown when quantum computing will be feasible for widespread use and how broadly applicable it will be to AI. Still, many experts think quantum computing has the potential to significantly advance the field, and OpenAI and other leading AI research institutions will keep investigating it as the technology matures.

But if they succeed in combining AI and quantum computing, it will be a truly remarkable quantum leap and a game-changer, pushing the capabilities of AI to their utmost potential.

But check what's next on the sideline… Hyperdimensional Computing. What?