Photonic Computers: What Happens When You Build AI Hardware Out of Light

Science News ran a profile this month on Bhavin Shastri, a physicist and engineer at Queen's University in Canada, and I have been thinking about it ever since. Shastri builds photonic computers: machines that use light instead of electricity to process information. He has been obsessed with lasers since he was a kid, and he is now working on chips that contain 100,000 neuron-like components and can perform 120 billion operations per second, about 40 times faster than the average electronic computer. I am going to need a moment.

The problem Shastri is solving is not just a speed problem, though the speed numbers are genuinely staggering. It is an architecture problem. Standard computers face what is called the memory bottleneck: they cannot access the bulk of their memory and perform calculations at the same time. When they are calculating, they are not retrieving, and when they are retrieving, they are not calculating. That is fine for most tasks but an increasingly serious limitation for AI workloads, which need enormous amounts of data processed continuously and in parallel. Training modern AI models consumes energy at a scale that some projections put on par with Japan's total electricity consumption by 2026. That is not a sustainable trajectory.
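To make the bottleneck concrete, here is a toy model, with entirely made-up timing numbers, of why alternating between fetching and computing on a shared path costs so much compared with an architecture where data flows through the compute fabric continuously:

```python
# Toy model of the memory bottleneck. FETCH_NS and COMPUTE_NS are
# hypothetical illustrative numbers, not measurements of any real chip.
FETCH_NS = 10.0    # time to fetch one batch of operands from memory
COMPUTE_NS = 1.0   # time to compute on that batch

def serialized_time(batches: int) -> float:
    """Shared-bus style: fetch and compute must take turns."""
    return batches * (FETCH_NS + COMPUTE_NS)

def overlapped_time(batches: int) -> float:
    """Flow-through style: fetch and compute overlap, so throughput
    is bounded by the slower stage alone (plus one startup step)."""
    return batches * max(FETCH_NS, COMPUTE_NS) + min(FETCH_NS, COMPUTE_NS)

batches = 1_000
print(serialized_time(batches))   # 11000.0
print(overlapped_time(batches))   # 10001.0
```

Even in this crude sketch, the serialized design pays the compute cost and the memory cost in full on every step; an architecture that removes the alternation only pays for the slower of the two.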

Photonic neuromorphic computing changes the architecture fundamentally. Light does not interfere with other light the way electric currents interfere with each other through electromagnetic effects. You can run countless beams of different wavelengths through the same path simultaneously without overheating or crosstalk. The chips Shastri is building pack photonic components together in patterns that behave like biological neurons, creating a physical neural network on a chip. The physics emulates the biology, as he puts it. The result is hardware that promises to be faster and dramatically more energy efficient than what we currently use to run AI systems.

For those of us who work in IT infrastructure, this is the kind of research that deserves attention even though it is years away from showing up in a data center. The energy cost of AI infrastructure is already a planning concern for large organizations. The compute requirements for the models being developed now are outpacing what current electronic hardware can efficiently deliver. Photonic computing is not the only proposed solution, but it is one of the most compelling because it addresses both the speed and the energy problems at the architecture level rather than just scaling more of the same hardware.

Shastri says these computers are not coming to your home anytime soon, and he is right. Right now they are suited to specific research and industry applications. But AI, radio signal optimization, and image processing are already on his team's application list, and the trajectory of the research is pointed directly at problems that matter at scale. The engineer who, as a kid, was first amazed by a laser pointer shone through a water bottle is now pioneering computing architecture that could change how AI hardware is built. That origin story never gets old.

https://www.sciencenews.org/article/bhavin-shastri-photonic-computer-brains
