Especially if they’re made in an engine that has been previously optimised, for example.

Like, right now, my Nvidia Geforce Experience app is telling me to update to the “Game ready driver” for Diablo 4.

What is Diablo 4 doing so uniquely that it can’t make full use of the general-purpose driver that is already there? Surely it wouldn’t have been designed in a way that runs poorly unless Nvidia ships a new version to set straight their performance blunders, right? What are they doing here?

  • Izzy@lemmy.world · 1 year ago

    I suspect it’s more that a new game leans heavily on some feature that wasn’t commonly used before, and Nvidia then goes back and optimizes for it.

    • robotscostrent@sh.itjust.works · 1 year ago

      That’s my guess as well. I doubt Nvidia is adding special logic or optimizations for every new game. The whole “drivers made ready for Diablo 4” thing is likely more for marketing than anything.

      • Izzy@lemmy.world · 1 year ago

        Only major gaming studios have enough reputation to get Nvidia to do anything, but it is also in Nvidia's best interest to focus its updates on the biggest-selling games, both for advertising purposes and to direct where the work goes. They can put in the least effort for the maximum outcome.

    • UsualMap@fedia.io · 1 year ago

      A lot of this. But it’s also cases where developers do something in an inefficient way, which the driver can then optimise around.

      It’s not that game developers are intentionally not using GPUs properly, but the companies behind these GPUs are in a really good position to easily see if there’s something that they can do on their end to improve performance with specific games.

  • xchgeaxeax@sh.itjust.works · 1 year ago

    I’ve heard that for major AAA releases, the hardware vendors will sometimes include hand-optimized shaders in the driver. When the game goes to load its shaders, the driver will look to see if they match one of the ones in its database, and instead load the hand optimized shader in place of the one provided by the game.
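A minimal sketch of what that shader substitution could look like, assuming the driver keys its database on a hash of the shader the game submits. All names here (`ShaderDatabase`, `register`, `resolve`) are hypothetical and only illustrate the lookup-and-swap idea, not Nvidia's actual implementation:

```python
import hashlib

class ShaderDatabase:
    """Hypothetical driver-side shader substitution: hand-optimized
    shaders are keyed by a hash of the shader source the game submits,
    and swapped in on a match."""

    def __init__(self):
        self._optimized = {}

    @staticmethod
    def _fingerprint(shader_source: str) -> str:
        return hashlib.sha256(shader_source.encode()).hexdigest()

    def register(self, game_shader: str, replacement: str) -> None:
        # Shipped as part of a "game ready" driver update.
        self._optimized[self._fingerprint(game_shader)] = replacement

    def resolve(self, submitted: str) -> str:
        # On a hit, the driver compiles its own version instead of
        # the one the game provided; otherwise the game's is used as-is.
        return self._optimized.get(self._fingerprint(submitted), submitted)
```

The game never notices the swap, since from its point of view it handed the driver a shader and got a compiled program back.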

    They may also go in and tweak some low-level settings to work around bottlenecks introduced by the game. For instance, if the game tries to synchronize its state with the GPU every frame and the vendor can prove that doing so isn’t necessary, they may have the driver skip the synchronization when it is running that specific game.
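That per-game workaround can be sketched like this. The class, the app-detection list, and the game name check are all illustrative assumptions, standing in for whatever app-profile mechanism the real driver uses:

```python
class DriverQueue:
    """Hypothetical sketch of a per-game driver workaround: the game
    calls wait_for_gpu() every frame, but for titles the vendor has
    verified don't need it, the driver turns the call into a no-op."""

    # Illustrative app-detection list shipped with a driver update.
    SKIP_SYNC_FOR = {"Diablo 4"}

    def __init__(self, app_name: str):
        self.app_name = app_name
        self.syncs_performed = 0  # counts actual CPU-GPU stalls

    def wait_for_gpu(self) -> None:
        if self.app_name in self.SKIP_SYNC_FOR:
            # Vendor proved the game's state is already consistent,
            # so the expensive stall is skipped for this title.
            return
        self.syncs_performed += 1  # stand-in for a real synchronization
```

The point is that the fix lives entirely in the driver: the game still makes the same call every frame, it just no longer costs anything on patched drivers.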

    Why can’t the game developers do this? Well, in some cases they can, but sometimes the performance issues are too difficult to diagnose for anyone not intimately familiar with the workings of a specific GPU. The game studio could either try to find an Nvidia GPU expert, or they could let an Nvidia employee, with access to low-level simulations of the card and knowledge of the system, do it for them.