tl;dw: some of the testing shows 300-500% improvements with the 16GB model. Some games are completely unplayable on 8GB while delivering an excellent experience on 16GB.
It really does seem like Nvidia is intentionally trying to confuse their own customers for some reason.
for money/extreme greed
Okay, well, that’s the low-hanging fruit, but explain the correlation to me: how does confusing their customers fuel their greed?
Uninformed buyers will buy the 8GB card, get a poor experience, and be forced to buy a new card sooner rather than later.
So their strategy is making and selling shitty cards at high prices? Don’t you think that would just make consumers consider a competing brand in the future?
Yea I don’t know why buying a shitty product should convince me to throw more money at the company. They don’t have a monopoly, so I would just go to their competitor instead.
For most consumers it might not; the amount of Nvidia propaganda/advertisement in games is huge.
They had trouble increasing memory even before this AI nonsense. Now they have a perverse incentive to keep it low on affordable cards, to avoid undercutting their own industrial-grade products.
Which only matters thanks to anticompetitive practices leveraging CUDA’s monopoly. Refusing to give up the fat margins on professional equipment is what killed DEC. They successfully miniaturized their PDP minicomputers while personal computers became serious business, but they refused to let those run existing software. They crippled their own product and the market destroyed them. That can’t happen here, because ATI is not allowed to participate in the inflated market of… linear algebra.
The flipside is: why the hell doesn’t any game work on eight gigabytes of VRAM? Devs. What are you doing? Does Epic not know how a texture atlas works?
It’s not that they don’t work.
Basically what you’ll see is kinda like a cache miss, except the stall to go ‘oops, don’t have that’ and fetch the required bits is very long, so you can see 8GB cards getting 20fps and 16GB ones getting 40 or 60, simply because the path to get the missing textures is fucking slow.
And worse, you’ll get big framerate dips and the game will feel like absolute shit because you keep running into hitches loading textures.
It’s made worse in games where you can’t reasonably predict which texture you’ll need next (e.g. Fortnite and other online games that are, you know, played by a lot of people). But even in games where you can reasonably guess, you still run into the simple fact that textures in a modern game are higher quality, and thus bigger, than the ones from 5 years ago; 8GB in 2019 and 8GB in 2025 are not equivalent.
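A toy back-of-envelope model of that stall cost (every latency number here is invented for illustration, not measured) shows how even a modest miss rate can cut the framerate to a third:

```python
# Toy frame-time model for VRAM texture misses.
# All latency numbers are invented for illustration, not measured.

def frame_time_ms(textures_needed, miss_rate, render_ms=12.0, fetch_ms=2.5):
    """Base render time plus a stall for every texture not resident in VRAM."""
    misses = textures_needed * miss_rate
    return render_ms + misses * fetch_ms

# 16GB card: everything resident, no misses -> ~83fps.
fps_16gb = 1000 / frame_time_ms(100, miss_rate=0.0)
# 8GB card: 15% of textures evicted, each costing a slow round trip -> ~20fps.
fps_8gb = 1000 / frame_time_ms(100, miss_rate=0.15)
```

The point isn’t the specific numbers; it’s that the penalty scales with the miss count, which is exactly why the gap widens as textures grow.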
It’s crippling the performance of a GPU that might otherwise perform substantially better, for a relatively small decrease in BOM cost. They’re trash, and should all end up in the trash.
That’s what I’m on about. We have the technology to avoid going ‘hold up, I gotta get something.’ There’s supposed to be a shitty version that’s always there, in case you have to render it by surprise, and say ‘better luck next frame.’ The most important part is to put roughly the right colors onscreen and move on.
id Software did this on Xbox 360… loading from a DVD drive. Framerate impact: nil.
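The ‘shitty version that’s always there’ idea is essentially how virtual texturing / mip streaming works (id’s megatexture on the 360 was a version of this): the coarsest mip levels stay resident in VRAM forever, so a surprise render samples a blurry-but-correct texture instead of stalling. A minimal sketch, with all names and the residency threshold made up:

```python
# Sketch of fallback-mip selection in a texture streamer.
# Mip 0 is full resolution; higher numbers are smaller. The coarsest
# mips are always resident, so there is always *something* to draw.

ALWAYS_RESIDENT_FROM = 6  # tiny mips kept in VRAM permanently (made-up threshold)

class StreamedTexture:
    def __init__(self, mip_count):
        self.resident = set(range(ALWAYS_RESIDENT_FROM, mip_count))
        self.pending = set()

    def sample_mip(self, wanted):
        """Return the best resident mip >= wanted; never stall the frame."""
        if wanted not in self.resident:
            self.pending.add(wanted)  # queue the fetch for a later frame
        return min(m for m in self.resident if m >= wanted)

    def upload_finished(self, mip):
        self.pending.discard(mip)
        self.resident.add(mip)

tex = StreamedTexture(mip_count=10)
tex.sample_mip(0)       # returns 6: blurry, but onscreen this frame
tex.upload_finished(0)
tex.sample_mip(0)       # now returns 0: full quality, better luck next frame paid off
```

The design choice is exactly the one described above: accept roughly the right colors now, and trade a frame or two of blur for never blocking the render path on a slow fetch.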
It’s more so for OEM system integrators, who can buy up thousands of these 8GB 5060 Tis and sell them in systems as just ‘5060 Ti’, and the average Joe who buys prebuilts won’t know to look at the bottom half of the spec sheet to see whether it’s an 8GB or 16GB model.
As well as, yes, directly scamming consumers, because Jensen needs more leather jackets off the AI craze and couldn’t give a rat’s ass about gamers.
Ngl gamers don’t deserve respect.
I agree that they don’t give half a shit about their actual product, but their biggest competitor has never been more competitive, and Nvidia knows it. Pissing off your customer base when you don’t have a monopoly is fucking stupid, and Nvidia and the prebuilt manufacturers know this. It’s business 101.
There’s gotta be something else. I know businesses aren’t known for making long term plans, because all that will ever matter to them is short term profits. But this is just way too stupid to be because of that.
That something else is that they don’t need the gamer market. Providing consumer cards is literally an inconvenience for them at this point: they make 2 billion a quarter from gaming cards but 18 billion on datacenter compute, with insane 76% gross margins on those products (which keep funding R&D).
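Taking those rough per-quarter figures at face value, the split is stark:

```python
# Back-of-envelope split using the rough figures quoted above.
gaming_q = 2e9        # ~$2B/quarter, gaming cards
datacenter_q = 18e9   # ~$18B/quarter, datacenter compute
share = gaming_q / (gaming_q + datacenter_q)
print(f"gaming is {share:.0%} of the combined total")  # prints "gaming is 10% of the combined total"
```

Roughly a tenth of revenue, at lower margin, for a market that generates most of the complaints.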
To me it sounds like they’re preying on gamers who aren’t tech savvy or are desperate. Just a continuation of being anti-consumer and anti-gamer.
Yup. This is basically aimed at people who only know that integrated GPUs are bad and that they need a dedicated card, so system manufacturers can create a prebuilt that technically checks that box for as little money as possible.