• LinkOpensChest.wav@lemmy.blahaj.zone

            They can both reduce aliasing, I guess? But they’re completely different things.

            And moreover, I’m struggling to understand what either has to do with the post.

          • daellat@lemmy.world

            A lot, actually. The T in TAA stands for temporal, and DLSS uses temporal information too. Not sure if they sit in the same spot in the render pipeline, though.
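
            Roughly, the shared idea is accumulating each new frame into a history of previous frames. A minimal sketch of that (the reproject() stub, the alpha value, and everything else here are made up for illustration, not taken from any actual engine or from DLSS):

            ```python
            import numpy as np

            def reproject(history, motion_vectors):
                # Placeholder: a real engine warps the previous frame along
                # per-pixel motion vectors; this stub just returns it unchanged.
                return history

            def temporal_accumulate(current, history, motion_vectors, alpha=0.1):
                """Blend the jittered current frame with the reprojected history.

                A small alpha averages edges across frames, which suppresses
                aliasing but also smears fine detail (the usual "TAA blur").
                """
                warped = reproject(history, motion_vectors)
                return alpha * current + (1.0 - alpha) * warped

            # Toy usage: accumulate a few random 4x4 "frames".
            history = np.zeros((4, 4))
            for _ in range(8):
                history = temporal_accumulate(np.random.rand(4, 4), history, None)
            ```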

            • null@slrpnk.net

              Sure, and they are both things you’d find under video settings.

              I meant more as an answer to the question OP asked.

              • daellat@lemmy.world

                Oh, that’s not what you asked. You asked how DLSS relates to TAA. To answer your question: TAA generally causes a blurry image.

  • Otherwise_Direction7@monyet.cc (OP)

    The one I can think of right now is Omori.

    I’d seen a teeny tiny bit of the hand-drawn opening cutscenes and it looked gorgeous, only to discover later that the entire game is played in 16-bit 2D pixel art.

    Welp

  • Otherwise_Direction7@monyet.cc (OP)

    Also, here’s the original source of the image if anybody ever needs it for whatever reason. It’s hard to track down the source link since the original is really old at this point.

  • HipsterTenZero@dormi.zone

    Dramatic zoom-in on Kazuma Kiryu’s painstakingly rendered pores. “I guess we’re all the same deep down after all…” He looks up, the sidequest jingle plays. Two seconds later, a PS2 man appears shouting “I’ll kill you!” for a random encounter.

  • ObsidianZed@lemmy.world

    This is how I feel watching the new CG trailers for each Elder Scrolls Online expansion. I already wanted to like the game because I like the lore, and I still enjoy the trailers because they look so cool, but I almost always say to myself, “if only the game didn’t suck so absolutely.”

  • nifty@lemmy.world

    Resident Evil, Yakuza, Sleeping Dogs, Far Cry, etc. Even in games that already have better graphics, the cutscenes are proportionally higher quality.

  • funnystuff97@lemmy.world

    In a similar vein, Arkham Knight (and in some cases Arkham City) looked worse in cutscenes if you maxed out the graphics settings. Obviously not if you ran it on a potato, but the games are fairly well optimized these days*.

    *At launch, Arkham Knight was an unoptimized, buggy mess. It has since gotten much better.

    • Otherwise_Direction7@monyet.cc (OP)

      Wait, you mean the gameplay looks better than the actual cutscenes in the game?

      But how? Does the game use FMV for the cutscenes or something?

      • funnystuff97@lemmy.world

        The cutscenes were rendered using fixed graphics settings that you could exceed if you maxed out your own. Plus, because they were pre-rendered video, there must have been some compression involved; you could just tell when you were in a cutscene, since it was grainier and there was a smidge of artifacting. Don’t quote me on this, but I believe the cutscenes were rendered at, like, 1080p, and if you were playing at 4K it would be a very noticeable downgrade. (Note that I did not and still do not have a 4K monitor.)
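
        Rough pixel math for that, assuming my 1080p guess above is right:

        ```python
        # Back-of-the-envelope comparison of a 1080p pre-rendered video
        # shown on a 4K display (the 1080p figure is a guess, not confirmed).
        video_pixels = 1920 * 1080     # ~2.07 million pixels in the cutscene video
        screen_pixels = 3840 * 2160    # ~8.29 million pixels rendered in-engine at 4K
        print(screen_pixels / video_pixels)  # 4.0: each video pixel covers ~4 screen pixels
        ```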

        Although thinking about it again, I do vividly remember some in-game-engine cutscenes in Arkham Knight. I’ll have to replay that game again sometime to jog my memory.

    • nevetsg@aussie.zone

      I am playing through Rise of the Tomb Raider in 4K and having a similar experience. I think the cutscenes are in 1080p.