New technology has to ensure Earth can handle a fully AI-powered communications workflow

December 16, 2024

I quite like my new ‘AI-powered’ working life. It’s certainly saving me a lot of time – from helping me to analyse data and craft messaging to targeting the right audiences on behalf of my clients.

And you can bet I used it to help with some of the more technical subjects of this blog post. So far, so good. The benefits of AI are well documented, but what about the big, power-hungry elephant in the data centre?

The carbon emissions from the computation associated with AI are huge. One query on ChatGPT is the equivalent of driving a car 17 metres. That doesn’t sound like much in isolation, but multiply it by the roughly one billion queries a day the service is currently handling and it’s like adding thousands of cars to our roads each year.
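For the curious, the back-of-envelope maths is simple enough to sketch in a few lines of Python. Both inputs are the figures quoted above (the 17-metre-per-query comparison and the rough one-billion-queries-a-day volume), not measurements of mine:

```python
# Rough scale of ChatGPT's daily "driving distance" equivalent,
# using the two figures quoted in the text above.
queries_per_day = 1_000_000_000   # ~1 billion queries a day
metres_per_query = 17             # "driving a car 17 metres" per query

km_per_day = queries_per_day * metres_per_query / 1_000
print(f"{km_per_day:,.0f} km of driving-equivalent per day")
# → 17,000,000 km of driving-equivalent per day
```

Seventeen million kilometres a day of driving-equivalent, every day, is the scale of the problem.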

Our insatiable appetite for an easier (and increasingly digital) life means that traditional silicon-based chips, the workhorses of modern computing, struggle to keep up with the energy-guzzling needs of the latest and greatest AI supercomputers.

Moore’s law is, finally, dying

For decades, we’ve relied on Intel co-founder Gordon Moore’s idea that the number of transistors on a microchip doubles every two years, leading to exponential growth in computing power. This principle has fuelled the digital revolution, enabling everything from smartphones to the internet.

But with the frenetic development pace of AI, we’re reaching the limits of silicon-based electronics. Transistors are now approaching the size of individual atoms, and we can’t go any smaller without running into the limits of what physics will allow (at least for how our current computer systems are built and programmed – I’ll leave the quantum conversation for a future post).

A lightbulb moment

So, we need more and more compute power to facilitate our AI revolution. Still, physics’ pesky limitations threaten to stall progress while driving a concerning rise in (even more) carbon emissions.

Photonic chips might help. By using photons, or particles of light, to transmit and process data rather than the slower electrons of our current silicon-based chips, a whole host of benefits can be realised. Because light is fast, photons enable higher bandwidth and lower latency – very helpful for increasingly complex AI workloads. They also generate far less heat, significantly reducing energy consumption in data centres, which spend a large proportion of their energy on cooling servers. Better for the environment and our future AI overlords.

So, what does it all mean for us?

Photonic chips are not just a theoretical concept. They’re already finding their way into the real world, from optical neural networks that mimic the human brain to AI systems that can analyse vast datasets in near real-time.

So, linking back to our world for a second, what does this potential future look like for marketing and comms pros? Assuming photonic chips can facilitate the continued pace of AI development – the sky(net)’s the limit. Potent tools that can analyse customer sentiment across huge swathes of the internet in the blink of an eye, personalise campaigns for a plethora of different audiences at the click of a button, and even predict the future – all while minimising the amount of carbon put into our atmosphere.

Ever the optimist – that sounds like a pretty bright, or should I say light, future to me.

Tom Hunt is a director in London and Cognito’s Head of Insights & Analytics
