Microsoft and 5 other AI pick-and-shovel plays
With the release of ChatGPT late last year, AI has headed back up the hype cycle. And while we’ve been here many times before, this time feels different. That said, we are very early in a rapidly evolving landscape: it’s difficult to know where the industry will be next year, let alone five to ten years out.
So, what is the best way to play the AI trend and benefit from companies succeeding in the sector? We think a prudent strategy is to invest in companies building the tools to sustain the industry’s growth – the ‘pick-and-shovel’ plays of AI.
From the companies we cover at Swell, we’ve come up with six worthy of consideration as the structural backbone of AI: three cloud providers – Amazon (NASDAQ: AMZN), Microsoft (NASDAQ: MSFT) and Alphabet (NASDAQ: GOOG); and three semiconductor companies – Nvidia (NASDAQ: NVDA), TSMC (TPE: 2330) and ASML (NASDAQ: ASML).
Why cloud providers?
Massive AI models such as large language models (LLMs) need to run in vast data centres, or as Nvidia CEO Jensen Huang likes to call them, ‘AI factories’. To date we have seen no slow-down in the benefits of scaling these models. In fact, researchers use the term “emergence” to describe how, as the size of an LLM increases, it exhibits sudden, non-linear breakthroughs in capability (i.e. a sharp jump in performance in a specific and often unpredictable area). This is why LLMs like OpenAI’s GPT-3, which has 175 billion parameters, excel at a range of tasks, from text comprehension to language translation and even writing code.
This non-linear jump in capabilities is driving the race for ever-larger models. The size of the largest deep learning models has increased from 98 million parameters in 2018 to 540 billion today, and the computational requirements to train these models are growing at a rate of 275x every two years. Unfortunately, hardware limitations have caused the cost of training and running these models to soar, leaving LLMs affordable only for the largest tech companies or well-funded start-ups.
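To put those growth rates in perspective, here is a rough back-of-the-envelope calculation in Python. It uses only the figures quoted above (98 million parameters in 2018, 540 billion today, and training compute growing 275x every two years) and simply compounds them; it is an illustration of the trajectory, not a forecast, and the variable names are our own.

```python
# Back-of-the-envelope compounding of the figures quoted in the article.
# Illustrative only; not a forecast.

PARAMS_2018 = 98e6             # largest deep learning model in 2018 (parameters)
PARAMS_TODAY = 540e9           # largest deep learning model today (parameters)
COMPUTE_GROWTH_PER_2YRS = 275  # quoted growth in training compute every 2 years
YEARS = 5                      # roughly 2018 to today

# Implied growth in model size
param_growth = PARAMS_TODAY / PARAMS_2018
annual_param_growth = param_growth ** (1 / YEARS)

# Implied growth in training compute at 275x every 2 years
annual_compute_growth = COMPUTE_GROWTH_PER_2YRS ** 0.5
compute_growth_over_period = annual_compute_growth ** YEARS

print(f"Model size: ~{param_growth:,.0f}x larger over {YEARS} years "
      f"(~{annual_param_growth:.1f}x per year)")
print(f"Training compute: ~{annual_compute_growth:.0f}x per year, "
      f"or ~{compute_growth_over_period:,.0f}x over {YEARS} years")
```

On those figures, model size has grown roughly 5,500x in five years, while training compute compounds by more than a million-fold over the same period, which goes a long way to explaining why only the deepest-pocketed players can afford to train these models.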
The current paradigm therefore favours a centralised, cloud-based approach, where AI is trained and run in the cloud rather than on-device. That leaves the large public cloud providers (Amazon, Microsoft and Google) well placed to provide the infrastructure that powers the world’s AI applications.
Why semiconductor companies?
Developing AI models requires high-end semiconductors, primarily GPUs (graphics processing units) to train and run the models. Demand from AI is expected to help drive industry revenues past $1 trillion by 2030.
Nvidia is the outright leader in GPU design and has built a powerful moat in the form of its CUDA software platform. GPUs are estimated to be used in roughly 30% of existing data centre servers, but Nvidia expects this to approach 100% as AI adoption increases. The long-term risk for Nvidia is backward integration by the big tech companies, which have the capacity and resources to design their own AI chips. However, the sheer demand for computing power means the industry will continue to rely heavily on Nvidia for the foreseeable future.
Two companies with almost zero backward-integration risk are TSMC (Taiwan Semiconductor Manufacturing Company) and ASML (Advanced Semiconductor Materials Lithography). Whether it’s Nvidia GPUs or big tech’s custom chips, chances are TSMC is making them, and using ASML equipment to do so. TSMC is the frontrunner in leading-edge semiconductor manufacturing, with only two real competitors, while ASML is literally the only company in the world that can build the EUV (extreme ultraviolet) lithography machines those leading-edge chips require.
Neither company is without risk, though, particularly geopolitical risk; in the case of TSMC, the risk of conflict is potentially existential. But if you can stomach that, the stock trades on a price-to-earnings multiple of around 14x with a 2% dividend yield, which we think is very attractive given its position.
With all that said, one company stands out as the pick of the bunch.
Microsoft: little to lose
Periods of rapid technological change are often when the seeds of disruption are sown. But Microsoft is uniquely positioned among the incumbents: it has little to lose from AI and much to gain. It owns both the infrastructure (Azure) and the software distribution (Microsoft 365) needed to drive AI growth. Moreover, it is helmed by Satya Nadella, who has a formidable track record of capital allocation and is committed to pursuing AI aggressively.
Having invested in OpenAI in 2019, and again in 2021, Microsoft has recently increased its stake with a ‘multi-year, multi-billion dollar’ investment widely rumoured to be in the vicinity of US$10 billion.
By going on the offensive, Microsoft can create new value for existing users while potentially disrupting incumbents in other fields. Products like the recently released Teams Premium, which uses OpenAI models to annotate and summarise Teams meetings in real time, will add value to an already successful suite of productivity solutions. Yet-to-be-released products such as Microsoft Designer, which infuses generative AI into the creative process, will compete in areas historically outside Microsoft’s reach: markets dominated by other software giants like Adobe. Here Microsoft can afford to be more innovative, as it has relatively little to lose and much to gain.
Search is another area where the risk/reward trade-off seems skewed to the upside. While it’s not yet clear how it will be monetised, AI-powered search could be a terrific complement to traditional, monetised search. This thinking is based on the principle of complementary goods, which suggests that if you lower the cost of a complement, demand for your product increases. Google has been exploiting this for years, providing direct answers to certain queries, such as “What’s the weather forecast?”, even though they are not directly monetised.
That's because the 80% of Google queries that are not monetised drive growth in the 20% that are, creating a flywheel that has made Google’s search empire almost impenetrable. This is something Microsoft can start to exploit with AI search. To be clear, we don’t believe it will replace Google anytime soon. In fact Google is expected to release its own AI powered search tool at an event this week. But with 58% of revenues coming from search vs 6% for Microsoft, Google has far more at stake and hence must approach this with caution.
Perhaps the most underappreciated aspect of Microsoft’s AI strategy is GitHub Copilot. The tool was released publicly in June last year and is already writing between 40% and 80% of the code for the developers who use it. This could lead to unprecedented efficiency gains per developer, at a time when many businesses are looking to cut costs, or at least do more with less. In addition, we see Copilot further democratising software development, becoming a tool not just for professional developers but for a much larger base of knowledge workers.
Known Unknowns
The difficulty with investing in AI right now is the known unknowns. What will the industry look like in five years’ time? Who will be the winners and losers? With change happening at such a rapid rate, even the experts are struggling to keep up.
While we recommend treading with caution, we believe investing in the companies that provide the picks and shovels could be the best way to gain AI exposure without taking on unnecessary risk.