In a wide-ranging interview with Wired, Nvidia CEO Jensen Huang revealed that he personally uses Perplexity AI as his go-to AI chatbot. When asked about tools such as ChatGPT and Google Bard, Huang said he favors this lesser-known chatbot, though he noted that he uses both Perplexity AI and ChatGPT “almost every day.” Rivals like Bard/Gemini and Grok, it seems, haven't earned Team Green's chief's approval at this point.
So what is Huang using AI chatbots for? Nvidia's CEO explained that he uses them for research. Currently, Huang has a particular interest in computer-aided drug discovery. We hope this is a scientific and business interest, and not because a loved one has a worrying health issue.
Perplexity's appeal may be clear from the fact that it bills itself as “the world's first conversational answer engine.” Indeed, when we first poked and prodded the app and used Perplexity via the website, we found queries quick and easy, letting us naturally dig deep into topics. Perplexity also provides a handy “Library” of past query threads and a “Discover” feed of news and current events to explore.
Above are some screenshots of Perplexity, showing the app's simple UI, its good taste in PC tech sources, and a glimpse of why users might consider it a solid choice.
Nvidia has fingers in many slices of the AI pie, and we note that as recently as January 2024 it participated in Perplexity's $73.6 million Series B funding round led by IVP. With this in mind, it's not all that surprising that Jensen Huang is invested in Perplexity AI, or that some executives are “dogfooding” the product.
Perplexity AI's website and apps are free to use, but a single paid tier, the Pro subscription, unlocks more features and capabilities. For $20 per month, or $200 per year, Pro subscribers get unlimited Copilot queries, a choice of AI model (GPT-4, Claude 2.1, or Perplexity's own), unlimited file uploads, $5 per month of generative AI credits, and more.
Other interview highlights
The topic of personal AI chatbot preferences is just one part of a lengthy interview conducted by Wired's Lauren Goode. Unsurprisingly, in 2024 the conversation touched on many AI-related topics. One of the more interesting bits was Huang's description of a new type of data center: an “AI factory” that has reportedly been in development for several years. He likened it to a power generator, and Nvidia is said to be close to commercializing the concept.
Another topic that piqued our interest was the discussion of Moore's Law. Huang explained that Nvidia's acquisition of Mellanox was a move to sidestep Moore's Law by scaling at the data-center level.
We also heard about Huang's frequent meetings with top TSMC executives such as Morris Chang. Hot topics reportedly include advanced packaging, capacity planning, and related emerging technologies such as CoWoS. But when faced with more specific questions about AI GPUs and the next-generation Blackwell architecture, Nvidia's chief was uncharacteristically, but understandably, evasive.