I’m gonna get downvoted for this but… gaming consoles.
Gaming consoles made sense back in the day before home computing took off, and for a while they actually had hardware superior to computers when it came specifically to running games. But nowadays gaming consoles are just locked-down, user-hostile computers with a subscription service attached. The gaming equivalent of inkjet printers. It’s an industry made irrelevant by advancements in technology, propped up by misleading marketing and artificial hype that sadly many people fall for.
I have a PS5 because it will play the damned games. There’s nothing in the PC realm for $400 I could buy that could come close to guaranteeing the same thing. Consoles don’t exist because people are stupid, they exist because gaming and GPU companies are cartels just like almost every other sector of the economy.
You are underestimating the importance of standards here. On a PC you will always only get a fraction of the hardware’s power, because there’s way more stuff running at the same time, not just the game, and because the developers can’t know exactly what hardware configuration every single gamer has.
On a console you can know exactly how much RAM you will have available, so you can design your content to use that amount of data and then stream it into memory that you reserve at the start.
If you do that on a PC you may ask for more RAM than the PC has or you may leave RAM unused. Or you can try to optimize the game for different specs, which costs time and money, so you won’t get the same results with the same budget.
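The fixed-RAM-budget idea could be sketched like this (hypothetical names and numbers for illustration, not any real engine’s API): on a console the budget is a known constant, so the engine can commit to it up front and stream assets in and out within it.

```python
# Illustrative sketch: a console-style fixed memory budget.
# The budget and asset sizes are made-up numbers.
MEMORY_BUDGET_KB = 8 * 1024  # known at design time on a console

class AssetStreamer:
    def __init__(self, budget_kb):
        self.budget = budget_kb
        self.resident = {}  # asset name -> size in KB, insertion-ordered
        self.used = 0

    def load(self, name, size_kb):
        if size_kb > self.budget:
            raise ValueError("asset exceeds the whole budget")
        # Evict the oldest resident assets until the new one fits.
        while self.used + size_kb > self.budget and self.resident:
            oldest, freed = next(iter(self.resident.items()))
            del self.resident[oldest]
            self.used -= freed
        self.resident[name] = size_kb
        self.used += size_kb

streamer = AssetStreamer(MEMORY_BUDGET_KB)
streamer.load("level_geometry", 6000)
streamer.load("textures", 3000)  # over budget together, so the oldest asset is evicted
```

On a PC the `MEMORY_BUDGET_KB` constant would have to be guessed or probed per machine, which is exactly the extra work being described.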
Back in the olden days, when games were written in assembly and there was barely enough memory for a framebuffer, it made sense to tediously optimize games to squeeze every bit of performance out of the limited hardware. Modern consoles are not like that. They have their own operating systems with schedulers, multitasking, and memory allocators, much like a desktop computer. Your claim that “way more stuff is running at the same time” is only true if the PC user deliberately decides to keep other programs running alongside their game (which can be a feature in and of itself – think recording/streaming, Discord, etc.). It is true that while developing for PC you have to take into account that different people will have different hardware, but that problem is solved by having a graphics settings menu. Some games can even automatically select the graphics options that will get the most out of your hardware. What you’re describing is a non-problem.
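The “automatically select the best graphics options” approach boils down to probing the machine and mapping what you find onto a preset. A minimal sketch, with invented thresholds and preset names:

```python
# Illustrative only: real games probe VRAM/CPU/GPU via platform APIs;
# the thresholds and preset names here are made up.
def pick_preset(vram_mb, cpu_cores):
    if vram_mb >= 8192 and cpu_cores >= 8:
        return "ultra"
    if vram_mb >= 4096 and cpu_cores >= 4:
        return "high"
    if vram_mb >= 2048:
        return "medium"
    return "low"
```

One decision tree at startup, instead of hand-tuning for every possible hardware combination.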
You’re not wrong. There definitely used to be a difference back when consoles would get way better support and PC ports were terrible.
Sound On / Off
– The entire options menu of a PC port in like 2006.
But nowadays I struggle to understand the point of getting one of those big chonky tower consoles like whatever the latest Xbox or PlayStation is. (PlayStation even sells entirely new consoles for a simple graphics/RAM upgrade, smh.)
I downvoted you just because you’re one of those people who is literally asking for it.
New PC graphics cards alone cost as much as entire games consoles. The top end ones cost the same as multiple PS5s. That’s why consoles exist.
Consoles don’t make money; the expensive games and subscriptions do.
There is value in static hardware: you can perform specific optimizations and hit a target framerate. The subscriptions are 100% bullshit though.
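The framerate point is concrete: with one known spec you can commit to a fixed frame budget (16.67 ms for 60 fps) and tune until nothing exceeds it, rather than scaling across unknown PCs. A tiny sketch of checking frame times against such a budget (the timings are made-up examples):

```python
# On fixed hardware the frame budget is a hard constant you tune against.
TARGET_FPS = 60
FRAME_BUDGET_MS = 1000 / TARGET_FPS  # ~16.67 ms per frame

def frames_over_budget(frame_times_ms):
    # Count frames that miss the budget (each one is a visible hitch).
    return sum(1 for t in frame_times_ms if t > FRAME_BUDGET_MS)
```

On a console, any profile run that returns a nonzero count here is a bug you can fix once, for every player.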
The Steam Deck makes sense.
To be fair, the Steam Deck is advertised as a computer that you can play games on.
At least the Switch’s portability made sense.
The old consoles were also just plug the game in and boot up.
No hassle.
Now they sound like Windows boxes.
I LOVED how the original Xbox had a “desktop” in it. Unfortunately that idea has gone way too far nowadays.
Nowadays I find these interfaces so overly complicated and fiddly that the UX of an N64 feels far superior.
I pretty much went PC-only after the Xbox 360 though, when ports finally started getting good. :)