Nvidia's ChatGPT monopoly is going to be worse for gamers than crypto ever was

Opinion: Nvidia's ChatGPT-powered profits will make gaming GPUs scarce

It seems like only yesterday that gamers everywhere were cursing the existence of Ethereum when they went shopping for the best graphics card they could afford, only to find empty shelves and gloating crypto bros on Twitter flexing with their five RTX GPUs bought for twice their MSRP. Let's not even talk about the scalpers who sold them the cards in the first place.

Since the crypto bubble popped and Ethereum moved to proof of stake rather than proof of work, getting a graphics card has actually been a relatively easy affair, much to gamers' relief. Many of the best cheap graphics card deals now offer prices well below MSRP on high-performance GPUs like the Nvidia RTX 3080, and I even saw an Nvidia RTX 3090 selling for less than $900 at one point.

But with the rise of generative AI models like those behind ChatGPT and Midjourney, which leverage the power of graphics hardware to produce output in the form of images, text, sound, and even full video, there's a looming GPU crunch on the horizon. Not only does it threaten to make finding a GPU even harder than it was at the height of the crypto craze, but it might also have a far greater impact than crypto ever could, and it could even force Nvidia to exit the consumer GPU market.

Why is generative AI different from crypto?

An array of graphics cards mining Bitcoin (Image credit: Shutterstock / GreenBelka)

The TL;DR version is pretty simple: Crypto is largely a Ponzi scheme with very little real-world application besides crime and price speculation. Generative AI can actually produce a useful product.

In the 15 years since the publication of the Bitcoin whitepaper that laid the foundations of cryptocurrencies and blockchain technology, crypto has struggled to find any real practical purpose that has proven valuable to the market.

NFTs were probably the closest crypto ever came to finding a use case, and as it turns out, most NFT sales were likely so-called wash trades, meant to artificially inflate prices on an NFT marketplace so that some mark would buy in at an exorbitant premium they will never recover.

To those who claim that it's still too soon to tell what blockchain and crypto can do in the future, generative AI reveals just how much the value of crypto relies on smoke and mirrors and blind devotion to the idea rather than on any actual utility.

DALL-E, Midjourney, ChatGPT, and other forms of generative AI are far younger than crypto. They are, in fact, in their infancy, but they are already revolutionizing creative industries, like de-aging actors in films, assisting in the creation of music – including a 'new' Beatles song featuring the voice of John Lennon reproduced from a decades-old demo – and drafting documents for just about every industry under the sun.

Now you can argue, as I have, that none of this is actually good. I personally think it's utterly dehumanizing to dilute our collective culture by saturating it in "good enough" AI-generated media. But you can't argue that these AI models, powered by neural networks, aren't producing something of value. 

Midjourney AI-generated image of an animator at their desk (Image credit: Midjourney)

Much like the quality of the mass-produced textiles of the Industrial Revolution was a downright mockery of the handcrafted fabrics of human artisans put out of work by the steam loom and the spinning jenny, the quality of AI-generated content pales next to the work of a highly skilled human creator.

No AI will even come close to producing a novel like Blood Meridian or One Hundred Years of Solitude. But it doesn't need to.

In the US, the Writers Guild of America is currently on strike, and one of the screenwriters' union's main concerns is the potential for film and TV studios to take the creative work of actual writers – the characters and story arcs that make up your favorite TV shows like The Mandalorian – and use an AI to generate new scripts based on that previous human work (full disclosure: I am a member of the Writers Guild of America, East, though digital media members operate under different contracts, so we are not on strike).

Is there any doubt that if studios or media companies could make money off a new generation of AI-generated penny dreadfuls, they would do so? And, sadly, the potential for them to do so is very real. They don't have to make the most revenue, just the most profit, and cutting out the cost of labor is the easiest way to do that, even if the products they sell are absolutely awful. If you could sell something for $1,000 but have to pay an artist $600 for it, or sell something for $450 and pay nobody anything for it, you'll go for the latter each and every time, even if it sucks.
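
To make the arithmetic explicit, here's a quick sketch using the hypothetical numbers from the example above (the prices are illustrative, not real market data):

```python
# Hypothetical numbers from the example above; illustrative only.
human_price, human_labor_cost = 1_000, 600   # sell high, pay an artist
ai_price, ai_labor_cost = 450, 0             # sell low, pay nobody

human_profit = human_price - human_labor_cost  # $400 per unit
ai_profit = ai_price - ai_labor_cost           # $450 per unit

# Less than half the revenue, but more profit per unit sold.
print(f"Human-made: ${human_profit} profit; AI-generated: ${ai_profit} profit")
```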

What makes Nvidia so special, anyway?

 

It all comes down to something Nvidia developed to help speed up rendering workflows in creative industries and, to a lesser extent, to power the DLSS technology in its graphics cards.

The process of rendering even a basic 3D scene with a graphics card is fairly complicated because it requires a lot of math, and easily the most computationally taxing part is matrix multiplication. To speed up these computations, Nvidia developed the tensor core: specialized circuitry in its consumer GPUs, going back to the RTX 20-series, that performs many multiply-accumulate operations in a single clock cycle rather than sequentially over multiple cycles. This is the hardware behind deep-learning techniques like DLSS that dramatically speed up 3D rendering, but the tech isn't restricted to 3D graphics.
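
For a concrete sense of what that hardware is doing, here's a minimal PyTorch sketch (assuming a CUDA-capable RTX card is available; the matrix sizes are arbitrary). Run in half precision, a matrix multiplication like this gets dispatched to the tensor cores automatically:

```python
import torch

# A minimal sketch: on RTX-class GPUs, half-precision matrix multiplication
# is routed to tensor cores by the CUDA libraries with no special API calls.
device = "cuda" if torch.cuda.is_available() else "cpu"
dtype = torch.float16 if device == "cuda" else torch.float32  # CPU fallback

a = torch.randn(4096, 4096, device=device, dtype=dtype)
b = torch.randn(4096, 4096, device=device, dtype=dtype)

c = a @ b  # on tensor-core hardware, this runs as fused multiply-accumulates

print(c.shape)  # torch.Size([4096, 4096])
```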

Very few people will ever need to know what matrix multiplication is or how it works, but it is essential for machine learning. Without it, the generative artificial neural networks that sit behind the curtain of every LLM and every other AI model simply can't work. And, at the moment, Nvidia's GPUs are really the only place outside of highly specialized data center hardware where these computations can be carried out efficiently.
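
To see why, consider that a single neural network layer is, at its heart, one big matrix multiplication. A toy sketch (the dimensions are made up for illustration):

```python
import numpy as np

# A toy fully connected layer: every forward pass is a matrix multiplication.
batch_size, input_dim, output_dim = 32, 768, 3072  # made-up dimensions

x = np.random.randn(batch_size, input_dim)        # input activations
weights = np.random.randn(input_dim, output_dim)  # learned parameters
bias = np.zeros(output_dim)

# The core operation: (32 x 768) @ (768 x 3072) -> (32 x 3072)
hidden = np.maximum(x @ weights + bias, 0)  # matmul plus a ReLU activation

print(hidden.shape)  # (32, 3072)
```

Stack thousands of layers like this and run them billions of times during training, and it becomes clear why hardware that accelerates matrix multiplication is suddenly so valuable.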

What's more, Nvidia's tensor cores are considerably more mature (Nvidia Lovelace features fourth-generation tensor cores) than Intel and AMD's competing AI hardware, meaning that training a neural network on a pod of Nvidia GPUs will go substantially faster than if you tried to do the same on AMD or Intel hardware. 

In short, Nvidia is the only real game in town right now for the hardware you need to train all the AI models that are seeing an explosion of interest and investment, and that, unlike crypto, can produce a useful product that makes the investment in the hardware profitable. That demand drove Nvidia's record profits reported last quarter, and this AI hardware advantage is what has turned Nvidia into the newest trillion-dollar company pretty much out of nowhere.

As more companies pivot away from Web3 to generative AI, the demand for AI hardware will only grow. There's a reason that Jensen Huang's Nvidia Computex 2023 keynote was essentially an infomercial for Nvidia's AI hardware with only a passing reference to its gaming division, and even that was a demonstration of how AI hardware was making game rendering faster.

Finding an Nvidia GPU might become even harder than it was during the crypto boom

Gamers standing outside a Best Buy to buy an RTX 3080 Ti graphics card on launch day in NYC (Image credit: Future)

Neural network training, much like cryptomining, is only really possible with a lot of hardware and a lot of time. You can maybe train a basic neural network that can identify a picture of a cat for a college computer science course on the Nvidia GPU you have at home, but that's about it. 
 
In order to power the kind of massive models behind ChatGPT and others, you need lots of compute, as they say, and so you need large-scale GPU operations to make this whole thing work. If this sounds like crypto, that's because it is. 
 
Only now, instead of independent operators getting a reverse mortgage on their house so they can buy up the best graphics cards to build a mining operation in a warehouse, you have the likes of Google, Microsoft, and many other massive industrial players who are going to need a lot of AI hardware to continuously train their models.
 
There is also the potential for independent operators to have a place in this new paradigm, since distributed computing and distributed processing of training data are absolutely on the table. Rather than buy the hardware to get the compute you need, you might be able to rent it from a pool of operators when you need it, saving on overhead.
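
To give a rough idea of how that splitting of work looks in practice, here's a minimal sketch of distributed data-parallel training in PyTorch (the model, data, and pool setup are all placeholders; a real rented pool would set the usual torchrun-style environment variables for you):

```python
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

# A minimal sketch of data-parallel training across a pool of GPU machines.
# Assumes RANK, WORLD_SIZE, and MASTER_ADDR are set (e.g. by torchrun).
def main():
    dist.init_process_group("nccl")  # one process per GPU in the pool
    torch.cuda.set_device(dist.get_rank() % torch.cuda.device_count())

    model = DDP(torch.nn.Linear(768, 768).cuda())  # stand-in for a real model
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

    for step in range(100):
        batch = torch.randn(32, 768).cuda()  # each worker gets its own shard
        loss = model(batch).pow(2).mean()    # placeholder loss
        loss.backward()                      # DDP averages gradients pool-wide
        optimizer.step()
        optimizer.zero_grad()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

Every worker holds a copy of the model and trains on its own slice of the data, which is why a pool of rented GPUs can stand in for one giant machine.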
 
Either way, there's going to be a lot of demand for Nvidia hardware, and there is only so much supply out there. Sales of Nvidia's midrange GPUs have been fairly flat among gamers this year, so there's still plenty of stock available right now. But generative AI has hit the scene like a meteor, and the market is still adapting to the new reality.
 
It won't be long before it does, though, and distributed computing platforms like Folding@home show how easy it is to roll out distributed data processing tools. Unlike the cryptomining pools that went bust when the bubble burst, generative AI isn't going anywhere. The additional demand for graphics card stock will only grow, and it will remain durable where crypto's could not.