Title says it all (I have turned on 165Hz in settings). It’s a cheap monitor; do some 165Hz monitors not truly give you that experience? Or are my eyes fucked?

  • Schneemensch@programming.dev

    I am totally with you. I have had a 144Hz monitor for two years now. I am 100% sure that everything was configured correctly, and I could spot some small differences in the UFO test, but other than that I do not feel any difference in day-to-day activities or games. Windows occasionally reset my frequency settings, and I never noticed when it happened.

    • tony@lemmy.hoyle.me.uk

      I’ve never seen any difference between the top two rows in that test. My monitor is 144Hz, and TBH I might as well have saved my money and bought a 60Hz one.

      We’re not all hardcore gamers trained to see minuscule differences.

      • Dudewitbow@lemmy.zip

        You don’t have to be a hardcore gamer to see the difference. A lot of people who use phones notice the difference 90/120Hz makes over 60Hz.

      • Vlyn@lemmy.zip

        Humans can see a single solid-color frame changing at 1000 fps. So if you don’t notice a difference between 60 and 165 fps, something isn’t working. It’s not your eyes.

        • GiveMemes@jlai.lu

          Seeing a solid-color frame change is completely different from noticing the minor changes that generally occur from frame to frame, especially in media such as movies and games, which are continuous.

          • Vlyn@lemmy.zip

            The Hobbit movies at 48 instead of 24 fps still looked much smoother and better.

            • foggenbooty@lemmy.world

              Yup. While I do see the point some people make about it breaking the immersion of film by being too fluid (everybody has their preferences), it definitely WAS more fluid.

              I will say, though, that when I first moved from 60Hz to 144Hz I wasn’t blown away by the change either. Things seemed a bit smoother, maybe, but not that big a deal. It wasn’t until I accidentally went back to 60Hz that something felt horribly wrong. I can ABSOLUTELY see the difference now; for some reason I just had to get acclimated.

              • Vlyn@lemmy.zip

                The problem with the movie was that a lot of TV-watching people see it as the “soap opera effect”, because soaps are shot at 60 fps. So they don’t like it and want a “cinematic” feel.

                For me, as someone who doesn’t usually watch TV, it was glorious. Yes, you notice every tiny mistake on the screen at 48 fps, but it actually feels real. Like there’s a real dwarf there talking with an elf, for example. More lifelike, if you get what I mean? It’s a damn shame you can’t buy the movies in HFR :-/

                Well, 144Hz has more than one benefit. You get a smoother image, of course, but also less input lag (you see the actions you take on screen sooner). But switching between the two is usually very obvious, even when just moving a window around the desktop.
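
                The input-lag part is just frame-time arithmetic; a quick sketch (Python, and the latency framing is simplified, since real input lag has more stages than the display alone):

                    # How long you wait for a fresh frame at each refresh rate.
                    # A slower refresh also means your input shows up on screen
                    # later, on top of whatever the game and OS add.
                    for hz in (60, 144, 165):
                        frame_ms = 1000 / hz
                        print(f"{hz:3d} Hz -> a new frame every {frame_ms:4.1f} ms")

                Going from 60Hz to 165Hz cuts that wait from roughly 16.7 ms to about 6.1 ms per frame.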

        • Turun@feddit.de

          Your use case may be different, but my day-to-day computer use usually doesn’t require me to catch solid-color frames.

      • MustrumR@kbin.social

        Do you have it enabled in Windows under display settings, though? It sounds like you don’t actually have it enabled. The other possibility is that your monitor has a very slow response time and everything blurs.

        I’m not sure it’s possible to miss a refresh-rate jump this big; diminishing returns don’t really start until around 160Hz.
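
        If you want to sanity-check what Windows is actually running, here’s a minimal sketch that asks the Win32 API for the active refresh rate (Python on Windows; GetDeviceCaps with the VREFRESH index reports the primary display’s vertical refresh):

            # Query the primary display's active vertical refresh rate on Windows.
            import ctypes

            VREFRESH = 116  # GetDeviceCaps index for vertical refresh in Hz

            user32 = ctypes.windll.user32
            gdi32 = ctypes.windll.gdi32

            hdc = user32.GetDC(None)  # device context for the whole screen
            hz = gdi32.GetDeviceCaps(hdc, VREFRESH)
            user32.ReleaseDC(None, hdc)

            print(f"Windows reports the primary display at {hz} Hz")

        If that prints 60 while the monitor’s OSD says 165, the OS setting is the problem, not your eyes.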

        • GenderNeutralBro@lemmy.sdf.org

          Or it just doesn’t work right in their browser. It says in big bold letters “VSYNC is not available on the Linux platform”, and at 960 pixels per second I actually can’t tell the difference between the 100Hz and 50Hz lines. If I slow it to 480 pixels per second it becomes apparent, but I still feel like that’s browser funkiness rather than a true frame-rate difference. I don’t think it’s actually running at 100 fps.

          It’s not my eyes, btw. I can usually tell the difference very easily. I had a problem with my Nvidia drivers for a while that would often reset to 60Hz on reboot instead of my display’s max of 100Hz. It was always immediately obvious to me just from the mouse cursor, even without consciously looking for it.

          LOL as I was writing this, I reloaded the page and now it’s very very obvious at 960. Something’s definitely inconsistent on my device. Go figure.

      • andrew@lemmy.stuart.fun

        The difference shouldn’t be minuscule, though. If you’ve never been able to see a difference, my money’s on the refresh rate not being set in Windows. It’s not automatic.

        • tony@lemmy.hoyle.me.uk

          It’s mostly marketing. Films are perfect at 24 fps, and gamer bros think they can see frame rates ten times that.

          • midnight@kbin.social

            Really? Movies at 24 fps are tolerable because we’re used to them and there’s a lot of motion blur, but any motion or panning shot still looks incredibly jerky. You have to get way up into the hundreds of fps before you hit diminishing returns on smoothness, and even then the difference is still noticeable.

  • r00ty@kbin.life

    My experience of 144Hz is that in terms of seeing a difference, there isn’t much. I mostly notice it when looking around a scene, where the movement is more fluid. However, what you can consciously notice isn’t the whole story of what makes a difference in games.

    I tried the dust2 AWP test map at 60Hz and 144Hz. The difference in how many targets I could hit at 144Hz was not down to chance and was quite repeatable. I think (and this is just a layman theorizing) that our muscle memory and hand-eye coordination unconsciously pick up on cues beyond what we consciously see, and that’s why it helps with split-second game decisions like this.

    My opinion is: if you cannot consciously see the difference and you don’t play FPS games, then maybe you should de-prioritize refresh rate in favor of other monitor features. There’s nothing wrong with that.

    • Blisterexe@lemmy.zipOP

      Yeah, back when I played Valorant I thought I played better. My theory is that I just got a really shitty panel, and that’s why I can barely tell.

  • Satelllliiiiiiiteeee@kbin.social

    Is it possible that there are ghosting issues with the panel? I had a 120Hz monitor at work at one point that had ghosting so bad it looked barely any better than a 60Hz panel. Going from 60Hz to 120+Hz should definitely be noticeable to most people.

  • Swarfega@lemm.ee

    Have you configured your OS to use a higher refresh rate in monitor settings? The difference is night and day…

  • NoFortunateSon@kbin.social

    You might not notice anything at first, but after some days of gaming and then going back, you’ll probably notice the difference then.

  • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 ℹ️@yiffit.net

    You’re only going to notice if the thing being displayed actually runs at 165 fps. If you’re, say, watching a movie or video, you won’t notice anything, because there’s nothing to notice.

    Play a game you can get a really high frame rate in (maybe Half-Life 1, which a modern machine should have no trouble running at 300+ fps). Limit it to 60. Check it out. Then go up to 144. Then 165.

    Also, if you have an Nvidia GPU, it may not be setting the refresh rate properly. I constantly have this issue with driver updates resetting it back to 30Hz on my machine. You have to go into the Nvidia Control Panel, find the display settings, and scroll down; somewhere toward the bottom is a refresh-rate setting. Change that to the highest your display can use.

    • SharkAttak@kbin.social

      Let’s not forget that the industry always likes to exaggerate in order to sell… IMO refresh rate is the latest victim of “bigger number is BETTER!” marketing.

  • Turun@feddit.de

    I have used https://github.com/Nixola/VRRTest before to check the refresh frequency. I use X11 and wanted to check whether my 144Hz monitors work alongside my older 60Hz one. Set the test mode to squares and the frame rate to twice your monitor’s refresh rate; you should see every second square light up. If that’s not the case, play around with the frame rate in the program until every second square lights up. (There’s a rough sketch of the idea at the end of this comment.)

    I can’t see the difference either, though. Yes, the mouse moves a bit more smoothly if I pay attention to it. But I honestly don’t care or notice.
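
    The core idea of the squares test is tiny; here’s a rough pygame sketch of it (not the actual VRRTest code, and the square count and refresh guess are arbitrary):

        # Light one square per rendered frame. If you render at twice the
        # monitor's true refresh rate, the display only ever shows every
        # second frame, so every second square appears lit.
        import pygame

        REFRESH_GUESS = 144      # what you believe the monitor runs at
        FPS = 2 * REFRESH_GUESS  # render at double that
        NUM_SQUARES = 16

        pygame.init()
        screen = pygame.display.set_mode((800, 200))
        clock = pygame.time.Clock()

        frame = 0
        running = True
        while running:
            for event in pygame.event.get():
                if event.type == pygame.QUIT:
                    running = False
            screen.fill((0, 0, 0))
            for i in range(NUM_SQUARES):
                lit = (i == frame % NUM_SQUARES)
                color = (255, 255, 255) if lit else (40, 40, 40)
                pygame.draw.rect(screen, color, (10 + i * 48, 80, 40, 40))
            pygame.display.flip()
            frame += 1
            clock.tick(FPS)
        pygame.quit()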

  • rasensprenger@feddit.de

    I’m also unable to see the difference directly, but everything just feels snappier. If you can’t feel it, maybe you have extra latency coming from somewhere else.

  • onlinepersona@programming.dev

    I’m 100% sure that if the majority of people in here claiming they can see the difference were actually tested, they’d fail. Something like:

    • 60Hz, 120Hz, 144Hz, 165Hz and 200Hz displays
    • multiple game scenes and clips:
      • frame rates varying from 29 to 320 fps
      • quiet and busy (not much happening vs. a lot happening)
      • slow and fast camera or background movement

    Take the Cartesian product of all those variables and play each tester a random subset of the combinations, maybe 20 or so; something like the sketch below.
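
    A rough sketch of generating that test set (Python; the specific values are just the ones listed above):

        # Build every combination of test conditions, then sample ~20 trials.
        import itertools
        import random

        refresh_rates = [60, 120, 144, 165, 200]    # Hz
        frame_rates = [29, 60, 120, 165, 240, 320]  # picked from the 29-320 range
        scene_load = ["quiet", "busy"]
        camera_motion = ["slow", "fast"]

        conditions = list(itertools.product(refresh_rates, frame_rates,
                                            scene_load, camera_motion))
        trials = random.sample(conditions, 20)

        for hz, fps, load, motion in trials:
            print(f"clip: {load} scene, {motion} camera, {fps} fps on a {hz} Hz panel")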

    It’s just like screen resolution. If you sit at arm’s length or further from your screen (which you should) and increase its resolution, everything becomes smaller (icons, text, images), which means you’ll have to scale them up to the same size they had at the lower resolution.

    Also, at a certain distance you become unable to spot details below a certain size, so you physically will not be able to see the difference between 1080p, 2K, and 4K from that distance. It’s called visual acuity. I bet you that if you ran a similar test with video resolution, screen resolution, screen size, and distance from the screen, the majority would do much worse than they think they would.
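
    To put numbers on the acuity point, here’s a back-of-the-envelope check (Python; the 27-inch panel and 70 cm viewing distance are just example figures, and 1 arcminute is the usual 20/20 benchmark):

        # Smallest detail a 20/20 eye resolves at a distance, vs. pixel pitch.
        import math

        distance_mm = 700                  # roughly arm's length
        acuity_rad = math.radians(1 / 60)  # 20/20 vision: ~1 arcminute
        smallest_mm = distance_mm * math.tan(acuity_rad)
        print(f"smallest resolvable detail: {smallest_mm:.3f} mm")

        # Width of a 27-inch 16:9 panel, then pixel pitch per resolution.
        width_mm = 27 * 25.4 * 16 / math.hypot(16, 9)
        for name, px in [("1080p", 1920), ("1440p", 2560), ("4K", 3840)]:
            pitch = width_mm / px
            verdict = "resolvable" if pitch > smallest_mm else "below acuity"
            print(f"{name}: pixel pitch {pitch:.3f} mm -> {verdict}")

    On those assumptions, 4K pixels (about 0.16 mm) fall below what the eye can separate at 70 cm, while 1440p (about 0.23 mm) sits right at the edge.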

    It’s mostly marketing and “bigger number = better” thinking.

    • Crit@links.hackliberty.org

      And I’m 100% sure you’re either testing incorrectly or have some issue that keeps you from seeing high FPS. I could definitely tell the difference between 20, 60, and 165 fps. Maybe not small increments like going from 140 to 160, but it’s definitely noticeable when things suddenly get smoother. Sure, you can fake some of it with motion blur and good frame pacing, but high FPS is definitely noticeable, at least in my case up to 160; I haven’t got a monitor that goes higher to compare.