In recent years, artificial intelligence (AI) has become an integral part of our daily lives, powering everything from smart assistants to complex data analysis. However, as AI technologies continue to advance and proliferate, a concerning trend has emerged: the rapidly increasing energy and water consumption of the data centres that support these systems.

The Power Hunger of AI

According to the International Energy Agency (IEA), global data centre electricity demand is projected to more than double between 2022 and 2026, largely due to the growth of AI. In 2022, data centres consumed approximately 460 terawatt-hours (TWh) globally, and this figure is expected to exceed 1,000 TWh by 2026. To put this into perspective, that is roughly equivalent to the entire annual electricity consumption of Japan.

The energy intensity of AI-related queries is particularly striking. While a typical Google search uses about 0.3 watt-hours (Wh), a ChatGPT query requires around 2.9 Wh - nearly ten times more energy.
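The per-query comparison above can be sanity-checked with a little arithmetic. A minimal sketch follows; note that the daily query volume used for scaling is an illustrative round number of my own, not a figure from the IEA:

```python
# Energy per query, in watt-hours, as reported above.
GOOGLE_SEARCH_WH = 0.3
CHATGPT_QUERY_WH = 2.9

# Ratio of ChatGPT query energy to a standard Google search.
ratio = CHATGPT_QUERY_WH / GOOGLE_SEARCH_WH
print(f"A ChatGPT query uses ~{ratio:.1f}x the energy of a Google search")

# Illustrative scaling: an ASSUMED volume of 1 billion queries per day
# (a hypothetical round number, not a reported statistic).
queries_per_day = 1_000_000_000
wh_per_day = queries_per_day * CHATGPT_QUERY_WH
gwh_per_day = wh_per_day / 1e9  # 1 GWh = 1e9 Wh
print(f"1 billion ChatGPT queries/day is about {gwh_per_day:.1f} GWh/day")
```

At those per-query figures the ratio comes out at roughly 9.7, consistent with the "nearly ten times" claim above.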
There are those in the AI observer community who have been suggesting of late that 'AI has plateaued', which reveals, to me, a lack of understanding of how models develop. It is not iterative design but step changes we are witnessing. The differences between Claude 3 Sonnet and Claude 3.5 Sonnet are stark.

One test I often carry out to assess the current capabilities of LLMs is a simple prompt asking the model to analyse an unpublished poem, commenting on the style and literary devices employed. The outputs have improved significantly over the 18 months I have used this approach.

This is my recent attempt. The poem is self-written and self-published (so not widely available), to ensure that it is unlikely to have found its way into the training data. For added context, implied in the text: this was written during a residency at a Museum and Art Gallery and came from a conversation with a member of staff about his late father. Calm is museum archival software.

PROMPT. "analyse