“In this letter we highlight a ‘bet’: a follow-up to our December letter, in which we wrote extensively on our general thesis on the desirability of artificial intelligence. We present a case study of NVIDIA which, in our view, is uniquely positioned to capture this opportunity.
Unfortunately for some readers, this letter again overflows with technical computer jargon. Part of our mission is to educate co-investors on our long-term thinking. We do our best to moderate the complexity; however, sometimes technical analysis is the only way to strengthen the thesis.
It is encouraging that we continue to find global opportunities to deploy capital. We remain cautious on South Africa and believe that, overall, the earnings distribution is skewed to the downside despite the recent broad recovery in stock prices. We believe we own the best of the South African opportunities. Therefore, the majority of the fund’s additional capital is deployed in global opportunities.
Nevertheless, we have strengthened our position in a South African company that will directly benefit from the vaccination of the population. We have also been involved in two special situations that are still generating satisfactory returns on capital.
A quick read of our fact sheet will show high cash balances. This is slightly misleading, as a substantial portion of that cash supports derivative exposures which are not reflected in the disclosure.
NVIDIA is one of our portfolio companies that we believe has tremendous potential. Their recent GTC conference confirmed some elements of our long-term thesis:
• As large corporate clients begin to find competitive advantages by adopting AI solutions, their competitors will need to invest heavily to keep up.
• NVIDIA is creating layers of “tools” to address the entry and exit bottlenecks we mentioned in our December letter.
• On top of that, and luckily for us, geopolitical and COVID-related factors added further tailwinds, and a well-publicized semiconductor shortage ensued across the industry.
Talk to your brother-in-law who works in IT, and he’ll likely describe NVIDIA as a gaming hardware company. However, over the past few years, NVIDIA has built an AI platform business consisting of integrated hardware, development middleware and AI applications. Gaming funded this platform, but AI is likely to define NVIDIA over the next decade.
By the way, we think co-founder and CEO Jensen Huang is one of the most impressive CEOs we’ve come across. It is customary among tech CEOs to have a signature outfit; Jensen is no different and frequently wears a “shiny” black leather jacket. Since the start of the COVID era, Jensen has showcased all NVIDIA product launches from his kitchen. For those unfamiliar with the industry, it’s worth contextualizing how NVIDIA got here.
One of the key inputs to AI, and especially to deep learning models, is computing power. Computing power is determined by the semiconductor architecture, the packaging, and the software layer that extracts maximum performance from the hardware.
Gaming funded the future AI platform
While Intel will forever be associated with the dominance of the central processing unit (CPU), NVIDIA all but owns the graphics processing unit (GPU) space. The main reason: gamers.
Towards the end of last year, the company launched its 3000 series to great enthusiasm from gamers. At the time of writing, the cards are extremely difficult to get hold of; the “street price” is as high as three times the recommended retail price.
NVIDIA achieves software-like gross margins of 63% while technically deriving most of its revenue from hardware. If that premium “street price” is any indicator, NVIDIA could charge considerably more and earn even higher margins. The consumer surplus is obvious.
Why are customers willing to pay three times the retail price for an NVIDIA GPU?
Hardcore gamers want to immerse themselves in the worlds in which they play. They don’t want to merely watch a world; they want to feel like they’re there. The GPU brings them closer to “reality” through high-fidelity, immersive experiences.
We can share an anecdote of how far things have come: Epic Games (40% owned by Tencent) recently announced “MetaHuman Creator”, a software kit that simulates virtual humans for games and movies. At this point, the only GPU capable of running the software in real time is the new NVIDIA GPU. We would personally be afraid to be an actor after seeing what software can create.
In most computer-generated imagery, it is often quite clear that the image is not a photograph. The human eye can easily tell a rendered moving image from real footage; there is just something off about it.
One of the missing links is that in the real world light bounces off all objects in an environment; it doesn’t just emanate from the sun or direct light sources. Games got better thanks to a technique called rasterization, in which shadows and nuances of color simulate a deeper sense of light, depth and context. However, to solve the problem of “light bouncing off all objects”, the game engine would have to keep track of each object: where it was, its position relative to others, and what light was bouncing off its surroundings. The computational requirements were simply too great at the time.
However, with developments in computing power, ray tracing was finally introduced commercially by NVIDIA in 2018. Ray tracing is a rendering technique that simulates the many light paths bouncing between objects in a scene while obeying the laws of physics.
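To make “obeying the laws of physics” concrete, the toy sketch below (our own simplification, not NVIDIA’s implementation) performs the single most common operation in any ray tracer: testing where a ray hits a sphere by solving a quadratic. A real renderer runs billions of these tests, plus shading, for every frame.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along the ray to the nearest sphere hit, or None on a miss.

    Solves |origin + t*direction - center|^2 = radius^2 for t, the core
    geometric test a ray tracer repeats billions of times per frame.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None

# A ray fired down the z-axis at a unit sphere 5 units away
# hits its near surface at distance 4.
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # → 4.0
```

The expense comes from repeating this, and the subsequent bounce calculations, for every pixel and every light path, which is why the technique had to wait for modern GPUs.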
Hardware + AI models
You can imagine that this is a computationally intensive process. Even with the latest technology, GPUs still cannot fully replicate the endless light interactions that occur in the real world. NVIDIA therefore built a deep learning model, called DLSS, that uses AI software to do what the hardware still can’t: the model essentially fills in the gaps. Together, the GPU and DLSS create an incredibly realistic picture and drive demand for upgrades to the new 3000 series.
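To illustrate “filling in the gaps”, here is a deliberately naive sketch (ours, not NVIDIA’s): render fewer pixels, then upscale. Where this toy version merely repeats pixels, DLSS replaces the upscaling step with a trained neural network that infers plausible detail, so the output resembles a natively high-resolution render at a fraction of the rendering cost.

```python
import numpy as np

def upscale_2x(img):
    # Naive 2x upscale: duplicate every pixel in both dimensions.
    # DLSS instead *infers* the missing pixels with a neural network.
    return np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)

# "Render" at quarter resolution (2x2), then upscale to 4x4.
low_res = np.array([[0.0, 1.0],
                    [1.0, 0.0]])
print(upscale_2x(low_res).shape)  # → (4, 4)
```

The economics are the point: rendering a quarter of the pixels and reconstructing the rest is far cheaper than ray tracing the full frame.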
You might be wondering what this has to do with NVIDIA’s AI platform. It turns out that the mathematics for tracing complex rays of light is very similar to the types of calculations AI requires.
GPUs are structurally superior for AI problems
At its core, an AI algorithm is a brute-force linear algebra optimization problem: billions of calculations that need to be done very quickly.
CPUs are good at doing many different tasks; GPUs are very good at doing one specialized task. Simply put, the CPU is a Swiss Army knife and the GPU is a surgical scalpel.
GPUs perform repetitive linear algebra calculations across thousands of parallel threads. CPUs, by contrast, are fast processors in their own right, but they handle math operations sequentially rather than in parallel. For this workload, CPU processing time is orders of magnitude slower than a GPU’s.
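The sequential-versus-parallel contrast can be felt even on a laptop. In the sketch below (a rough analogy only: NumPy’s batched kernels stand in for a GPU’s thousands of parallel threads), the same dot product is computed one element at a time and then as a single batched operation:

```python
import time
import numpy as np

n = 1_000_000
a = np.random.rand(n)
b = np.random.rand(n)

# Sequential, CPU-style: one multiply-add per step.
t0 = time.perf_counter()
total_seq = 0.0
for i in range(n):
    total_seq += a[i] * b[i]
seq_time = time.perf_counter() - t0

# Batched: the whole dot product dispatched as one operation,
# the shape of work a GPU spreads across thousands of threads.
t0 = time.perf_counter()
total_vec = float(np.dot(a, b))
vec_time = time.perf_counter() - t0

print(f"sequential: {seq_time:.3f}s, batched: {vec_time:.5f}s")
```

On a typical machine the batched version is one to two orders of magnitude faster; on an actual GPU, with far more parallel lanes, the gap widens further.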
The building blocks of an ecosystem
It is becoming increasingly clear that NVIDIA’s strategy is to use its leadership position in GPUs (what should really be called “AI hardware”) to create an AI ecosystem by linking hardware users to its software (middleware and applications).
We believe this integrated stack could build a considerable moat over the next decade.
NVIDIA’s strong position
Right now, this substantial future opportunity is NVIDIA’s to lose.
• Making AI simpler: Today, a PhD is needed to develop a well-formed deep learning model that generates substantial value. We “average” users would produce a below-average model with many false positives, and therefore a model that does not accurately predict the desired results. The blue-sky opportunity is for a business to make AI much easier to use for enterprises and SMB clients who can’t hire PhDs: software as simple as Excel and Word, but for AI problems. If this opportunity is captured, the total addressable market (TAM) for AI will grow exponentially.
• Building AI infrastructure: The challenge is that the fundamental building blocks of AI do not yet exist at scale. The “highways, pipelines and conduits” still need to be built. NVIDIA is slowly putting these building blocks in place by launching network accelerators, pre-trained deep learning models, and large-scale supercomputers that tackle computationally heavy problems such as weather forecasting and genome sequencing.
Providing tools to developers, who end up building the moat
NVIDIA’s CUDA SDK is the main software that interfaces with its GPUs. Because NVIDIA had the most advanced GPUs, pioneering scientists repurposed gaming GPUs for their particular use cases. Seeing a new user market, NVIDIA provided free software to maximize GPU performance. As the AI opportunity took hold, CUDA became the de facto parallel processing SDK used by developers.
It’s a critical place to build a moat, and 2.3 million developers are already using NVIDIA SDKs. But we’ve seen this movie before.
Developers are the facilitators
Decades ago, Bill Gates got very rich because MS-DOS was in a similar position. Operating systems are the structural foundation on which developers build their applications. Many books have been written about how IBM handed the opportunity to Gates on a platter; the reality, however, is that the developers were the real enablers. Operating systems have high switching costs: a developer would have to completely rewrite their application to work on a different operating system. Hence, they stick with it even if a better operating system comes along.
And so, over time, the software ecosystem grows: more apps and users mean more developers, which means more apps and users. Within a few years, Gates was the richest man on Earth. This is a great example of software network effects.
NVIDIA believes gaming was just the first “killer app” to excel at using its technology. This is the start of what could be a very big opportunity. NVIDIA has traditionally sold “shovels” to hopeful prospectors. Now it is beginning to become a prospector itself…”