[#18] Riding the Wave of AI: A Shift in Enablement or a Technological Revolution?
How different is this wave of Generative AI? Can it be as big as mobile or cloud?
A couple of engaging conversations recently prompted me to contemplate the role of Large Language Models (LLMs), particularly whether their advent signifies a major platform shift in technology akin to the shifts to mobile or cloud, or whether they represent more of an enablement shift. As an avid follower of technological evolution, I felt compelled to explore this intriguing question.
The mobile revolution freed us from the confines of fixed-location work, while the advent of cloud computing removed the limits of local storage and compute. Now we stand on the brink of another potentially transformative surge - the rise of AI, with a particular emphasis on LLMs.
One viewpoint I encountered posited that LLMs do not represent a significant platform shift. The premise here is that platform shifts, like the transition to mobile or the cloud, necessitate substantial changes in technology, organization, and processes. They demand acquiring new knowledge, reorganizing teams, developing on different platforms, and even altering how products are built and tested. It's a formidable mountain to climb, and historically it's why enterprises took considerable time to navigate such shifts.
If you've spent a fair amount of time in the tech field, you might notice that this AI wave doesn't feel quite the same as previous ones. Unlike the mobile revolution, we're not clutching a novel piece of hardware. Nor are we migrating our data to a remote cloud, as we did at the onset of cloud computing. Rather, AI, and LLMs in particular, are helping us extract maximum value from the assets we already have. They act like a secret sauce, giving our existing tech stack a potent boost - which is why I categorize this as an "enablement shift."
In the business realm, enterprises are rapidly harnessing this AI wave to generate content, make informed decisions, and deliver products at unprecedented speed - a feat that would have been hard to imagine in a pre-AI era! Interestingly, this shift doesn't require a change of platform. It's about calling APIs and putting our existing data repositories to work. This is a significant advantage for established players, particularly those with substantial resources, access to vast amounts of data, and the means to invest in self-hosted models, which lets them train on proprietary data while keeping it secure.
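To make the "APIs plus existing data" point concrete, here is a minimal sketch of the pattern. The endpoint URL, model name, and response field below are placeholders rather than any specific vendor's API; the point is simply that the LLM call sits on top of the data and infrastructure you already run.

```python
import os
import requests

# Minimal sketch: enrich an existing record with an LLM over a plain HTTPS API.
# The endpoint, model name, and response shape are placeholders; swap in
# whichever hosted or self-hosted LLM service your stack already uses.
LLM_ENDPOINT = os.environ.get("LLM_ENDPOINT", "https://llm.example.com/v1/generate")
API_KEY = os.environ.get("LLM_API_KEY", "")

def summarize_record(record_text: str) -> str:
    """Send an existing data record to the LLM and return a short summary."""
    payload = {
        "model": "your-hosted-model",  # placeholder model identifier
        "prompt": f"Summarize this customer record in two sentences:\n{record_text}",
        "max_tokens": 120,
    }
    response = requests.post(
        LLM_ENDPOINT,
        json=payload,
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["text"]  # assumed response field

if __name__ == "__main__":
    # The record comes from whatever database or warehouse you already operate;
    # nothing about the storage layer has to change.
    record = "Acme Corp, enterprise tier, renewal due in Q3, two open support tickets."
    print(summarize_record(record))
```

No new platform, no data migration: one HTTP call layered on top of systems that already exist.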
We've progressed from the era of Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs), the workhorses of the previous AI wave. The new protagonists are diffusion-based models for image generation and transformer-based architectures like LLMs for language tasks. These models, particularly LLMs, show a remarkable capacity to synthesize and process information in a Chain-of-Thought style of reasoning, a capability that would have seemed far-fetched just a few years ago.
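For readers who haven't seen it in practice, here is an illustrative example of the kind of prompt that elicits that step-by-step reasoning. The toy problem and the exact phrasing are made up for illustration, not a prescribed template.

```python
# Illustrative Chain-of-Thought style prompt: the model is nudged to lay out
# its intermediate reasoning before committing to a final answer.
prompt = (
    "A warehouse ships 340 orders per day, and each order uses 3 parcels.\n"
    "How many parcels does it ship over a 5-day week?\n"
    "Let's think step by step before giving the final answer."
)

# A typical completion walks through the intermediate arithmetic
# (340 * 3 = 1,020 parcels per day; 1,020 * 5 = 5,100 per week)
# rather than jumping straight to the result.
print(prompt)
```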
However, these advances don't amount to a radical shift in the same vein as mobile or cloud. Unlike mobile technology, which required entirely new hardware and software ecosystems, or cloud computing, which demanded a profound shift in how data is stored and accessed, AI is largely agnostic to both. AI models can run on existing hardware (though specialized hardware speeds them up), and they can learn from data wherever it lives.
The real allure of AI lies in its ability to analyze massive volumes of data, extract valuable insights, augment human decision-making, and automate mundane tasks. It enhances existing systems rather than replacing them, making it more of an evolution than a revolution.
So, where does this lead us? Is the rise of AI a platform shift or an enablement shift?
If you found this piece useful or interesting, don't hesitate to share it with your network.
If this was shared with you and you liked the content, do consider subscribing below to receive the next piece directly.