Google's Technical Win In AI Battle. Find Out!
I seldom dive into Motley Fool news articles.
This is because (to me personally) they are fluffy most of the time.
However, every now and then, miracles happen, and they have insightful posts.
Below is one good example. It opened my eyes, helping me to know $Alphabet(GOOG)$ more “intimately”.
After all, we are supposed to keep up with our portfolio's stock/s in detail, in order (a) to be confident in a stock's future and (b) not to be swayed by erratic market behaviour.
Today’s post on Google brings my coverage of Google into full cycle; hence “tetralogy”.
My three other Google posts:
Why Google is a winning stock. Click here! to read. Give a "LIKE" ok. Thanks.
Google & Nvidia partnership. Click here! to read.
Google’s answer to Microsoft’s Co-pilot. Click here! to read.
Today’s post will be a little “technical”.
I will try to write it as palatably as possible, even for non-technically inclined readers.
One of the reasons why $NVIDIA Corp(NVDA)$ is hogging the limelight when it comes to all things Artificial Intelligence (AI) is because of its AI chip.
Its coveted H100 chip and its much-touted next-generation successor are both built around Nvidia's GPU (graphics processing unit) architecture.
What Is a GPU?
A GPU is a specialized processor with many uses:
Helps speed up the rendering of graphics and images on a computer or some other device.
Responsible for processing graphics information such as an image’s geometry, color, shading, and textures.
More importantly, GPUs are used in artificial intelligence (AI) and machine learning (ML) applications because they can perform, concurrently, the complex mathematical computations essential for training deep neural networks.
GPUs are also used for inference, that is, the process of using a trained model to make predictions on new data.
The concept of inference applies to many applications, e.g. (1) fraud detection, (2) recommendation systems, and (3) even autonomous driving.
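To make this concrete, here is a minimal toy sketch (plain NumPy standing in for a GPU): the heavy math at the heart of both training and inference is large matrix multiplication, which is exactly what a GPU parallelizes across thousands of cores. The model, data, and class labels below are all made up for illustration.

```python
import numpy as np

# Toy one-layer "neural network" forward pass (inference).
# On a GPU, the matrix multiplication below is the work that gets
# spread across thousands of cores; NumPy on a CPU stands in here.
rng = np.random.default_rng(0)

weights = rng.normal(size=(4, 3))   # a (hypothetical) trained model's parameters
new_data = rng.normal(size=(2, 4))  # two new samples, 4 features each

logits = new_data @ weights          # the heavy, parallelizable math
predictions = logits.argmax(axis=1)  # pick a class per sample, e.g. "fraud" vs "not fraud"

print(predictions.shape)  # one predicted class per sample
```

Training repeats this same kind of multiplication millions of times while adjusting `weights`, which is why AI workloads crave this hardware.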
What is the relationship between Nvidia’s GPU and Google’s TPU (Tensor Processing Unit)?
None, directly.
The GPU intro above simply sets the stage for Google's TPU.
What Is a TPU?
A Tensor Processing Unit (TPU) is an application-specific integrated circuit (ASIC), a custom chip developed by Google for neural network machine learning, originally built around Google's own TensorFlow software.
Tensor Processing Units (TPUs):
Are designed to accelerate machine learning workloads.
Unlike GPUs, are not aimed at creating and rendering computer graphics; they specialize in the tensor (matrix) math underlying neural networks.
Are offered both as small edge-device chips (Edge TPU) and as cloud instances (Cloud TPU).
More interestingly:
Google created TPUs from scratch.
In 2015 (8 years ago), Google began using them internally.
In 2018 (5 years on), Google made them available to the public.
In 2022, its 4th-generation TPU, the TPU v4, became available for use by Google Cloud customers.
TPUs have been deployed internally across Google's applications:
Google Search.
Street View.
Google Photos.
Google Translate.
Along the way, TPUs have helped many of Google’s services run state-of-the-art neural networks (a) at scale and (b) at an affordable cost.
Advantage of TPU / ASIC
The benefit of an application-specific integrated circuit (ASIC) is that it can be designed at the hardware level to perform specific tasks.
Google has even claimed that its TPU v4 beats Nvidia's last-gen A100 data-center GPU on a variety of AI workloads.
This could well be true: GPUs are general-purpose by design, which makes them useful for a wide variety of workloads beyond AI, but potentially less efficient at any single one.
The above introduction of GPUs and TPUs leads us back to the topic of artificial intelligence.
It is widely accepted that OpenAI's ChatGPT is currently the de facto leader when it comes to artificial intelligence implementation.
In a nutshell, ChatGPT is an AI chatbot built on top of OpenAI's foundational large language models (LLMs), e.g. GPT-4 and its predecessors.
A “language model” is a mathematical representation of natural language that assigns probabilities to sequences of words or symbols.
Language models are widely used in AI applications, for example:
Speech recognition.
Machine translation.
Natural language understanding.
Text generation.
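As a hedged illustration of "assigning probabilities to sequences of words" (a toy sketch, nothing remotely like GPT's scale): the simplest language model just counts which word follows which and turns the counts into probabilities. The tiny corpus below is invented purely for the example.

```python
from collections import Counter, defaultdict

# Toy bigram language model: estimates P(next word | previous word)
# from counts in a tiny corpus. Real LLMs like GPT-4 do conceptually
# the same job with neural networks over billions of parameters.
corpus = "the cat sat on the mat the cat ran".split()

pair_counts = defaultdict(Counter)
for prev_word, next_word in zip(corpus, corpus[1:]):
    pair_counts[prev_word][next_word] += 1

def next_word_prob(prev_word, next_word):
    """Probability that next_word follows prev_word, from bigram counts."""
    counts = pair_counts[prev_word]
    total = sum(counts.values())
    return counts[next_word] / total if total else 0.0

print(next_word_prob("the", "cat"))  # "the" is followed by "cat" twice, "mat" once
```

Text generation then falls out naturally: repeatedly sample the most probable next word. Scale that idea up enormously and you have the intuition behind ChatGPT.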
OpenAI's ChatGPT is one of the most advanced, if not the most advanced, language models to date.
It is based on the GPT-3.5 and GPT-4 architectures, deep neural networks that can generate coherent and diverse text on various topics and in various styles when given an input or prompt.
The relationship between language models and artificial intelligence is that language models are essential tools for building intelligent systems that can communicate with humans and understand natural language.
Making the above relationship work both (a) seamlessly and (b) quickly requires a lot of processing power.
This is achieved by linking thousands of powerful AI chips together (the demand).
The setup cost of AI chips and the other mandatory equipment to build what is essentially an AI supercomputer is enormous, not to mention the operational costs.
Now we know why Nvidia's stock price is like a raging bull, don't we?
While Google does not dispute the need for raw AI processing power, it argues that efficiency is another important factor that should not be disregarded.
On Tuesday, 29 August 2023, Google presented a brand-new iteration of its TPU, the TPU v5e.
The TPU v5e:
Strikes a balance between performance and efficiency.
Is designed for both (a) AI training and (b) inference.
Is already available in preview for Google Cloud customers using the Google Kubernetes Engine platform.
Provides twice the training performance per dollar and up to 2.5 times the inference performance per dollar compared with the TPU v4.
Scales further than the TPU v4, which is limited to 3,000 chips for a single workload on Google Cloud; the v5e lets customers harness tens of thousands of chips at once for a single workload.
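A quick back-of-envelope sketch of what those per-dollar claims imply. The $100 baseline is entirely hypothetical; only the 2x and 2.5x multipliers come from Google's claims.

```python
# Illustrative arithmetic only: the $100 baseline is hypothetical;
# the 2x / 2.5x multipliers are Google's stated v5e-vs-v4 claims.
v4_training_cost = 100.0   # hypothetical cost of a training job on TPU v4
v4_inference_cost = 100.0  # hypothetical cost of an inference job on TPU v4

# "Twice the performance per dollar" means the same job should cost
# roughly half as much; 2.5x per dollar means roughly 40% of the cost.
v5e_training_cost = v4_training_cost / 2.0
v5e_inference_cost = v4_inference_cost / 2.5

print(v5e_training_cost, v5e_inference_cost)  # 50.0 40.0
```

If those multipliers hold in practice, the cost advantage compounds quickly at the scale of tens of thousands of chips.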
While Google's TPUs give its cloud customers a cost-effective way to run AI workloads, demand for Nvidia's H100 GPUs remains.
Again, to strike a “cost and processing speed” balance, Google Cloud also unveiled its new A3 virtual machines powered by H100 GPUs.
Each virtual machine features Intel's latest Xeon CPU paired with eight H100 GPUs.
Tens of thousands of H100 chips can be used for a single workload, providing enough power for the most demanding AI tasks.
Just to be clear, Google Cloud is not the first major cloud provider to launch virtual machines powered by Nvidia's H100.
$Amazon.com(AMZN)$ Web Services (AWS) announced a similar product in July 2023.
$Microsoft(MSFT)$ Azure announced the product in August 2023 as well.
The "technical advantage" Google has over AWS and Azure is its efficient, cost-effective TPU-powered services, giving it an edge as cloud providers scramble to win AI workloads.
Google Cloud has become an important business for Alphabet.
It generated $8 billion of revenue in Q2 2023 and turned an operating profit.
Artificial intelligence is one way Google Cloud can differentiate itself from the competition as the AI industry matures and starts to care more about return on investment (ROI) and the cost of training advanced AI models.
My Viewpoint:
Referring to the "History of Large Language Models (LLM)" diagram (see above), Google was at the forefront for the longest time.
It reigned from 2011 to 2017.
In 2020, it lost the lead when OpenAI introduced its Generative Pre-trained Transformer 3 (GPT-3) large language model.
Will Google's TPU become the technical edge and cost advantage that lets Google leapfrog into 1st or 2nd position? Only time will tell.
Despite all the technical advantages that come with "leaving it to the experts" (e.g. Nvidia for GPUs), it also pays to be technically strong and independent.
Do you think the future for Google (as a stock for investors) is bright?
Do you think Google Cloud will be able to capture more business (& clients) as AI implementation continues its penetration into applications?
Please give a "LIKE", "Share" and "Re-post" ok. Thanks. Rating is very important (to me).
Do consider “Follow me” and get firsthand read of my daily new post/s ok. Thanks.
Disclaimer: Investing carries risk. This is not financial advice. The above content should not be regarded as an offer, recommendation, or solicitation on acquiring or disposing of any financial products, any associated discussions, comments, or posts by author or other users should not be considered as such either. It is solely for general information purpose only, which does not consider your own investment objectives, financial situations or needs. TTM assumes no responsibility or warranty for the accuracy and completeness of the information, investors should do their own research and may seek professional advice before investing.