Tech: Echoes of Winters Past?
For almost a year now, I’ve been trying to predict the end of the AI hype[^1]. The onset of the fourth[^2] “winter” has seemed inevitable, as there have been no significant improvements to the models since late 2024. Not a hard winter like the one we experienced during the initial waves of AI technology, but rather a milder one, similar to the period following the introduction of Siri, Google Maps, and other third-generation pioneers.
As signs of AI fatigue and disillusionment become more apparent, more people are beginning to align with my initial thoughts. However, as others have noted, the situation is not so straightforward[^3]. When we consider all the evidence, it appears paradoxical: we are simultaneously facing significant challenges while investing in AI at a faster rate than ever. This contradiction arises because we tend to think of AI as a single technology or a singular tech trend.
Yes, we hit a wall on Large Language Models almost a year ago[^4]. Yes, even the general audience has noticed that most improvements are small and incremental compared to those of the initial ChatGPT craze. And yet, this is happening even as investment in the AI sector is skyrocketing. The race is on for big tech – and even geopolitical players like the U.S. and China – who view AI as a winner-takes-all opportunity.
If we examine the situation more closely, there are actually three distinct trends here: the AGI question, the generative AI race, and applied AI[^5].
The AGI Question: The goal of achieving human-level intelligence is likely to face a “hard winter”[^6], at least until we have commercially available high-power QPUs—or some other world-changing breakthrough[^7].
Most people will ignore this, as AGI was never the primary driver of the hype; that was the promise of commercial value from Generative AI.
The Generative AI Race: Reflecting on the hype curve at the start of this wave, I anticipate a significant slowdown in generative AI research and development. However, I predict this slowdown will be shorter and less severe than the one experienced between deep learning and large language models (LLMs) due to the ongoing geopolitical race.
This potential slowdown is something I, along with others, have been trying to predict. But increasingly, I’m starting to think it’s the wrong question.
The Rise of Applied AI: Even with only the AI technology available as of 2024, we have the potential to accomplish remarkable things like never before. The improvements in usability, efficiency, and cost—whether incremental or substantial—are making these technologies increasingly accessible. Frankly, this trend may outweigh the other two factors, regardless of how mundane it may seem.
So, while from the perspective of a tech analyst and visionary we may be heading towards an “AI winter” – a period of decreased investment in speculative projects – that viewpoint overlooks the broader context.
In a broader context, the true and lasting value lies not in the next theoretical breakthrough, but in the practical applications that follow. It’s time to stop harping[^8] on a potential ‘winter’ and start building the future with Applied AI.
Footnotes
[^1]: While the Gartner hype cycle model is a tad naive for my tastes, it’s a good generalization of how these things go, especially if you think about the public eye.
[^2]: There have been two significant winters in the development of artificial intelligence, and we are currently experiencing the third summer. In addition to these seasons, there was a slowdown between the first and second waves of the current, third AI boom, which some, including myself, refer to as a “little winter.” When it comes to analogies, the debate over which one is more appropriate is often unproductive.
[^3]: In this, I’m grateful to Riku Tapper for his honest, insightful, and unwavering criticism of my thinking in the discussions we’ve had at the office. It’s not always easy to stop my train of thought, even if the train left the tracks a long time ago.
[^4]: The scaling wall is, in essence, also a “Transformer wall.” The foundational self-attention mechanism, while enabling groundbreaking progress, has a quadratic complexity with respect to context length, making it computationally expensive and difficult to scale to very long contexts.
[^5]: To be fair, there are two more branches of the technology that are interesting from tech and business angles: deep learning and machine learning – often dubbed traditional AI – and big data, which in my opinion started this revolution.
[^6]: Leading experts like Meta’s Chief AI Scientist Yann LeCun have been clear about this for some time, stating that we are never going to get to human-level intelligence by just training on text, and that current LLMs “lack a proper understanding of the physical world.”
[^7]: I’m still rooting for John Carmack to pull a rabbit out of his proverbial hat. If anyone can do this, it’s him.
[^8]: Ah, “harping” – I’ve been waiting for an opportunity to use this term in a sentence. I hope I got it right :D.