• warlaan@feddit.org · 14 hours ago

    You are underestimating the importance of standards here. On a PC you will only ever get a fraction of the hardware’s power, because there is far more running at the same time than just the game, and because the developers can’t know exactly what hardware configuration every single gamer has. On a console you know exactly how much RAM you will have available, so you can design your content around that budget and stream it into memory that you reserve at startup. If you do that on a PC, you may ask for more RAM than the machine has, or you may leave RAM unused. Or you can try to optimize the game for different specs, which costs time and money, so you won’t get the same results with the same budget.
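    A minimal sketch of that “reserve a fixed budget up front and stream into it” idea, in C++ for illustration. The 512 MiB budget, the StreamingPool class, and the allocation sizes are all hypothetical placeholders, not any real console spec or engine API:

    ```cpp
    #include <cstddef>
    #include <cstdint>
    #include <vector>

    // Fixed-size pool reserved once at startup; assets are streamed into it
    // instead of asking the OS for more memory mid-game.
    class StreamingPool {
    public:
        explicit StreamingPool(std::size_t budgetBytes)
            : buffer_(budgetBytes), offset_(0) {}

        // Hand out a chunk of the pre-reserved buffer; returns nullptr once
        // the fixed budget is exhausted instead of growing.
        void* allocate(std::size_t bytes) {
            if (offset_ + bytes > buffer_.size()) return nullptr;
            void* p = buffer_.data() + offset_;
            offset_ += bytes;
            return p;
        }

        void reset() { offset_ = 0; }  // e.g. when a level is unloaded

    private:
        std::vector<std::uint8_t> buffer_;  // reserved up front, never resized
        std::size_t offset_;
    };

    int main() {
        constexpr std::size_t kBudget = 512ull * 1024 * 1024;  // known target RAM (illustrative)
        StreamingPool pool(kBudget);
        void* textureBlock = pool.allocate(64 * 1024 * 1024);  // stream a texture set into the pool
        (void)textureBlock;
    }
    ```

    On a console the budget is a known constant; on a PC the same code either over-asks or under-uses, which is the trade-off being described.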

    • renzev@lemmy.world · 8 hours ago (edited)

      Back in the olden days, when games were written in assembly and there was barely enough memory for a framebuffer, it made sense to tediously optimize games to squeeze every bit of performance out of the limited hardware. Modern consoles are not like that. They have their own operating systems with schedulers, multitasking, and memory allocators, much like a desktop computer. Your claim that “way more stuff is running at the same time” is only true if the PC user deliberately decides to keep other programs running alongside their game (which can be a feature in and of itself: think recording, streaming, Discord, etc.). It is true that while developing for PC you have to take into account that different people will have different hardware, but that problem is solved by having a graphics settings menu. Some games can even automatically select the graphics options that will get the most out of your hardware. What you’re describing is a non-problem.
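      A hedged sketch of the kind of auto-detection a settings menu can do: map detected memory to a preset. The thresholds, the ChoosePreset helper, and the detected value below are made up for illustration; a real engine would query the driver (e.g. via DXGI or Vulkan) rather than hard-code a number:

      ```cpp
      #include <cstdint>
      #include <iostream>

      enum class Preset { Low, Medium, High, Ultra };

      // Pick a graphics preset from available video memory.
      // Thresholds are illustrative, not tuned values.
      Preset ChoosePreset(std::uint64_t vramBytes) {
          const std::uint64_t GiB = 1024ull * 1024 * 1024;
          if (vramBytes >= 12 * GiB) return Preset::Ultra;
          if (vramBytes >= 8 * GiB)  return Preset::High;
          if (vramBytes >= 4 * GiB)  return Preset::Medium;
          return Preset::Low;
      }

      int main() {
          // Hypothetical detected value standing in for a real driver query.
          std::uint64_t detectedVram = 8ull * 1024 * 1024 * 1024;
          Preset p = ChoosePreset(detectedVram);
          std::cout << "Selected preset: " << static_cast<int>(p) << "\n";
      }
      ```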