Photonics turns out to be a tough nut to crack

The growing computing power required to train advanced AI models like OpenAI’s ChatGPT may eventually hit a wall with mainstream chip technologies.

In a 2019 analysis, OpenAI found that from 1959 to 2012, the amount of power used to train AI models doubled every two years, and that after 2012, power consumption began to increase seven times faster.

The strain is already showing. Microsoft reportedly faces an internal shortage of the server hardware needed to run its AI, and the scarcity is driving up prices. CNBC, in conversation with analysts and technologists, estimates the current cost of training a ChatGPT-like model from inception at over $4 million.

One proposed solution to the AI training dilemma is photonic chips, which use light to transmit signals instead of the electricity that conventional processors use. In theory, photonic chips could deliver higher training performance because light produces less heat than electricity, travels faster, and is much less sensitive to changes in temperature and electromagnetic fields.

Lightmatter, Lightelligence, Luminous Computing, Intel and NTT are among the companies developing photonic technologies. But while the technology generated a lot of excitement a few years ago, and attracted a lot of investment, the industry has cooled noticeably since then.

There are several reasons why, but the general message from investors and analysts who study photonics is that photonic chips for AI, while promising, are not the panacea they were once thought to be.
