iVolve

Nvidia's Q4 2020 Earnings Call

Happy Valentine's Day

Colette's Comments

Esports continues to amaze

"The global phenomenon of esports keeps gaming momentum with an audience now exceeding 440 million, up over 30% in just two years, according to Newzoo. The League of Legends World Championship brought more than 100 million viewers on par with this month's Super Bowl."

AI & Cloud

"The industry continues to do groundbreaking AI work for NVIDIA. For example, Microsoft's biggest quality improvements made over the past year, in its Bing, search engine, stemmed from its use of NVIDIA GPUs and software for training and inference, of its natural language understanding models. These DNN transformer models, popularized by BERT, have computational requirements for training that are in the order of magnitude higher than, earlier image-based models. Conversational AI is a major new workload, requiring GPUs for inference to achieve, high throughput within the desired low latency."

Jensen's Comments

Inference vs Training

"What most people don't understand about Inference is, it's an incredibly complex computational problem, but it's an enormously complex software problem. And so, the second dynamic is moving from training or growing from training and models going into production called Inference."

Growth in AI

"The deep recommendation systems, the natural language understanding breakthroughs the conversational AI breakthroughs all happened in this last year. And the velocity by which the industry captured the benefits here and continue to evolve and advance from these what so-called transformer models was really quite incredible. And so the all of a sudden the number of breakthroughs in AI has just grown tremendously and these models have grown tremendously."

Nvidia GPU programmability vs ASICs

"Now, our approach for acceleration is fundamentally different than an accelerator. Notice we never say accelerator, we say accelerated computing. And the reason for that is because we believe that a software-defined data center will have all kinds of different AIs. The AIs will continue to evolve the models will continue to evolve and get larger and a software-defined data center needs to be programmable. It is one of the reasons why we've been so successful. And if you go back and think about all the questions that have been asked of me over the last three or four years around this area the consistency of the answer has to do with the programmability of architecture, the richness of the software, the difficulties of the compilers, the ever-growing size of the models, the diversity of the models, and the advances that these models are creating. And so we're seeing the beginning of a new computing era."

More on programmable GPUs vs fixed-function ASICs

"As you move out to the edge, it really depends on whether your platform is software-defined whether it has to be programmable or whether it's fix functioned. There are many, many devices where the inference work is very specific. It could be something as simple as detecting changes in temperature or changes in sound or detecting motion. Those type of inference models are – could still be based on deep learning. It's function-specific. You don't have to change it very often, and you're running one or two models at any given point in time. And so those devices are going to be incredibly cost-effective. I believe those AI chips, you're going to have AI chips that are $0.50, $1 and you're just going to put it into something and it's going to be doing magical detections. The type of platforms that we're in, such as self-driving cars and robotics, the software is so complicated and there's so much evolution to come yet and it's going to constantly get better. Those software-defined platforms are really the ideal targets for us. And so we call it AI at the edge, edge computing devices. One of the edge computing devices, I'm very excited about is, what people call mobile edge or basically 5G telco edge. That data center will be programmable. We recently announced that we partnered with Ericsson and we're going to be accelerating the 5G stack. And so that needs to be a software-defined data center."

On Innovation

"[Innovation] really basically comes down to two dimensions. One dimension is, are we continuing to expand? Are we continuing to expand the number of applications that we can accelerate? Whether it's AI or computer graphics or genomics or 5G for example. And then the number -- and then the second is those applications, are they getting more impactful and adopted by the ecosystem, the industry? And are they continuing to be more complex? Those dimensions, the number of applications and the impact of those applications and the evolution the growth of complexity of those applications, if those dynamics continue to grow, then I think we're going to do a good job. We're going to sustain. And so -- and I think when I spelled it out that way, it's basically the equation of growth of our company. I think it's fairly clear that the opportunities are fairly exciting ahead."