• BmeBenji@lemm.ee
    10 months ago

    4K is overkill enough. 8K is a waste of energy. Let’s see optimization be the trend in the next generation of graphics hardware, not further waste.

    • flintheart_glomgold@lemmy.world
      10 months ago

      For TV manufacturers the 1K/4K/8K nonsense is a marketing trap of their own making - but it also serves their interests.

      TV makers DON’T WANT consumers to easily compare models or understand what makes a good TV. Manufacturers profit mightily by selling crap to misinformed consumers.

    • bruhduh@lemmy.world
      10 months ago

      Divide the resolution by 3, though; current-gen upscaling tech can cover that much: 4K = upscaled 720p and 8K = upscaled 1440p.

      • AngryMob@lemmy.one
        10 months ago

        can doesn’t mean should.

        720p to 4K using DLSS is okay, but you start to see visual tradeoffs accepted strictly for the extra performance.

        To me it really shines at 1080p to 4K, where it is basically indistinguishable from native while still giving a large performance increase.

        Or even 1440p to 4K, where it can actually look better than native with just a moderate performance increase.

        For 8K the same logic holds: go for better-than-native or match native visuals. There is no real need to go below that just to get more performance; at that point the hardware is mismatched.
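
        As a rough sanity check on those pairings, here’s a minimal Python sketch; the per-axis scale factors are the commonly cited DLSS-style presets, so treat the exact numbers and labels as assumptions rather than anything official:

        ```python
        # Internal render resolution implied by a per-axis upscale factor.
        # Pixel cost scales with the square of the factor.
        PRESETS = {
            "1440p -> 4K (quality-style)":     1.5,
            "1080p -> 4K (performance-style)": 2.0,
            "720p  -> 4K (ultra-perf-style)":  3.0,
        }

        OUTPUT_4K = (3840, 2160)

        for name, factor in PRESETS.items():
            w, h = round(OUTPUT_4K[0] / factor), round(OUTPUT_4K[1] / factor)
            shaded = 100 / factor ** 2  # % of output pixels actually rendered
            print(f"{name}: renders {w}x{h}, shades ~{shaded:.0f}% of the output pixels")
        ```

        So 720p to 4K asks the upscaler to invent roughly nine out of every ten output pixels, which is why the tradeoffs show up there first.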

        • bruhduh@lemmy.world
          10 months ago

          Devs already use it in place of optimisation. What makes you think the bosses won’t push it even further, given deadlines and quarterly profits? Immortals of Aveum is an example, and we’re not even at the end of the generation, only halfway. (I agree with you from a user standpoint, though.)

    • Final Remix@lemmy.world
      10 months ago

      *monkey’s paw curls*

      Granted! Everything’s just a 25%-scale internal render and massive amounts of TAA.

    • Zink@programming.dev
      10 months ago

      Yeah. Once games are rendering 120fps at a native 6K downscaled to an amazing looking 4K picture, then maybe you could convince me it was time to get an 8K TV.

      Honestly most people sit far enough from the TV that 1080p is already good enough.

      • minibyte@sh.itjust.works
        10 months ago

        I’m set up to THX spec, 10 feet from an 85-inch. That puts me right in the middle of where 1440p and 4K are optimal, but with my eyes I see little difference between the two.

        I’d settle for 4k @ 120 FPS locked.
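
        For what it’s worth, the geometry is easy to sanity-check (a minimal sketch; the ~36-40 degree viewing angle often quoted for THX is an assumption here, not a spec I’m certifying):

        ```python
        import math

        def viewing_angle_deg(diagonal_in, distance_in, aspect=(16, 9)):
            """Horizontal angle subtended by a flat 16:9 screen at a given distance (inches)."""
            aw, ah = aspect
            width = diagonal_in * aw / math.hypot(aw, ah)
            return math.degrees(2 * math.atan(width / (2 * distance_in)))

        # 85-inch panel viewed from 10 feet (120 inches)
        print(f"{viewing_angle_deg(85, 120):.1f} degrees")  # ~34 degrees, near the often-quoted 36-40 degree range
        ```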

        • Zink@programming.dev
          10 months ago

          I’m 6-8 feet from a 65, depending on seating position and posture. It seems to be a pretty sweet spot for 4K (I have used the viewing distance calculators in the past, but not recent enough to remember the numbers). I do wear my glasses while watching TV too, so I see things pretty clearly.

          With games that render at a native 4K at 60fps over an uncompressed signal, it is absolutely stunning. If I try to sit like 4 feet from the screen to get more immersion, then it starts to look more like a computer monitor than a razor-sharp HDR picture painted on the OLED.

          There is a lot of quality yet to be packed into 4K. As long as “TV in the living room” is a similar format to now, I don’t think 8K will benefit people. It will be interesting to see if all nice TVs just become 8K one day like with 4K now though.

      • frezik@midwest.social
        10 months ago

        I find 4K is nice on computer monitors because you can shut off anti-aliasing entirely and still be rid of jagged edges. 1440p isn’t quite enough to get there.

        Also, there’s some interesting ideas among emulator writers about using those extra pixels to create more accurate CRT-like effects.
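
        To make that concrete, here’s a toy version of the idea (a minimal NumPy sketch, not how any real emulator shader does it): each low-res pixel becomes a 3x3 block of output pixels, columns are tinted like an RGB phosphor triad, and every third row is darkened like a scanline.

        ```python
        import numpy as np

        def crt_effect(frame: np.ndarray, scale: int = 3) -> np.ndarray:
            """frame: (H, W, 3) floats in [0, 1]. Returns an (H*scale, W*scale, 3)
            image with a crude phosphor-triad tint and scanlines."""
            h, w, _ = frame.shape
            big = frame.repeat(scale, axis=0).repeat(scale, axis=1)  # blocky upscale

            # Column tint: within each block, column i mostly passes channel i % 3.
            tint = np.full((scale, 3), 0.25)
            tint[np.arange(scale), np.arange(scale) % 3] = 1.0
            col_tint = np.tile(tint, (w, 1))  # shape (W*scale, 3)

            # Scanlines: darken the last row of every block.
            row_gain = np.ones(h * scale)
            row_gain[scale - 1 :: scale] = 0.4

            return big * col_tint[None, :, :] * row_gain[:, None, None]
        ```

        The more output pixels you have per source pixel, the more room there is for that kind of sub-pixel structure, which is the appeal of 4K for CRT shaders.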

        • Holzkohlen@feddit.de
          10 months ago

          But anti-aliasing costs far less performance than rendering at 4K. And you need to mess about with scaling on a 4K monitor, which is always a pain. 1440p for life, IMHO.

        • Zink@programming.dev
          10 months ago

          Oh yeah, I have read some very cool things about emulators being able to simulate the individual phosphors at 4K resolution. I have always been a sucker for clean, crisp pixels (that’s what I was trying to achieve on the shitty old CRT I had for my SNES), so I haven’t jumped into the latest CRT shaders myself.

  • clearleaf@lemmy.world
    10 months ago

    The performance difference between 1080p and 720p on my computer makes me really question if 4k is worth it. My computer isn’t very good because it has an APU and it’s actually shocking what will run on it at low res. If I had a GPU that could run 4k I’d just use 1080p and have 120fps all the time.

    • Chestnut@lemmy.world
      10 months ago

      TL;DR: Higher resolutions afford bigger screens and closer viewing distances.

      There’s a treadmill effect when it comes to higher resolutions.

      You don’t mind the resolution you’re used to. When you upgrade, the higher resolution will be nicer, but then you’ll get used to it again and it doesn’t really improve the experience.

      The reason to upgrade to a higher resolution is that you want a bigger screen.

      If you want to use a TV as a monitor, for instance, you’ll want 4K because you’re close enough that you’d be able to SEE the pixels otherwise.

      • Flying Squid@lemmy.world
        10 months ago

        You don’t mind the resolution you’re used to. When you upgrade, the higher resolution will be nicer, but then you’ll get used to it again and it doesn’t really improve the experience.

        This is sort of how I feel about 3D movies and why I never go to them. After about 20 minutes, I mostly stop noticing the 3D.

      • Johanno@feddit.de
        10 months ago

        As long as you don’t know that there is anything better, you will love 1080p. Once you have seen 2K you won’t want to switch back, especially on bigger screens.

        On the TV I still like 1080p. I remember the old CRT TVs with their awful resolution; in comparison, 1080p is a dream.

        However, if the video really is that high quality, you will like 4K on a big TV even more. But if the movie is only 720p (like most DVDs or streaming services), then 4K is worse than 1080p: now you need upscaling to get a clear image.

    • pishadoot@sh.itjust.works
      10 months ago

      1440p is the sweet spot. It’s very affordable these days to hit high FPS at 1440p, monitors included.

      1080p@120 is definitely low-budget tier at this point.

      Check out the PC Builder YouTube channel. Guy is great at talking gaming PC builds, prices, performance.

  • LaunchesKayaks@lemmy.world
    10 months ago

    Has anyone else here never actually bought a TV? I’ve been given 3 perfectly good TVs that relatives were gonna throw out when they upgraded to smart TVs. I love my dumb, free TVs. They do exactly what I need them to and nothing more. I’m going to be really sad when they kick the bucket.

    • ikidd@lemmy.world
      10 months ago

      Any TV is a dumb TV if you plug a Kodi box in the HDMI and never use the smart trash.

    • lengau@midwest.social
      10 months ago

      I’ve bought my TVs because all my relatives are the same as us. My mom finally tossed an old CRT TV a couple of years ago because it started having issues displaying colours correctly.

    • Leg@lemmy.world
      10 months ago

      Yes, people like me buy TVs. I’m the guy who keeps giving away perfectly good TVs to other people because I’ve bought a new one and don’t want to store the old one. I’ve given away 2 smart TVs so far, though I’m not sure what I’ll do with my current one when I inevitably upgrade.

    • woodenskewer@lemmy.world
      10 months ago

      I was given a free, very decent, dumb TV and upgraded it to a smart TV with a $5 Steam Link, plus a Cat 6 cable run to it from my router. Best $5 ever. I have no intention of buying a new one. If I ever do, I will try my hardest to make sure it’s a dumb one. I know they sell “commercial displays” that are basically a TV with no third-party apps or a way to install them.

    • starman2112@sh.itjust.works
      10 months ago

      I used my family’s first HDTV from 2008 up until last year, when my family got me a 55" 4k TV for like $250. Not gonna lie, it’s pretty nice having so much screen, but I’m never getting rid of the ol’ Sanyo.

    • Flying Squid@lemmy.world
      10 months ago

      One of my TVs was given to us by my mother-in-law, but we did buy the other one. Before the ‘smart’ TV era though.

  • Pratai@lemmy.ca
    10 months ago

    Sony Bravia Z series. Bought in 2010 I think? Still works like a charm!

  • Flying Squid@lemmy.world
    10 months ago

    One of my TVs is 720p. The other is 1080p. The quality is just fine for me. Neither is a ‘smart’ TV and neither connects to the internet.

    I will use them until they can no longer be used.

    • AngryCommieKender@lemmy.world
      10 months ago

      The last TV I owned was an old CRT that was built in the 70s. I repaired it, and connected the NES and eventually the SNES to it. Haven’t had a need for a TV ever since I went to university, joined IT, and gained a steady supply of second hand monitors.

      • scoobford@lemmy.zip
        10 months ago

        Context matters a lot. On a 27" monitor, it makes a pretty decent difference. On a 50" TV at 10+ ft…meh?

      • qaz@lemmy.world
        10 months ago

        1080 vs 2k is pretty clear to me, but I have a hard time telling the difference between 2k and 4k.

        • Rodeo@lemmy.ca
          10 months ago

          Small print text rendering is where you’ll see the difference.

          Game graphics, whatever, but if you have to do a lot of reading or coding, you can make the text smaller and it still stays crystal clear.

        • Kiosade@lemmy.ca
          10 months ago

          Yeah and once you’re deep into playing… you stop caring about that stuff and focus on the game.

    • starman2112@sh.itjust.works
      10 months ago

      I have a 4k TV, it legitimately is no better than 1080 lmao

      There’s a very noticeable difference, but it’s nothing like the difference between SD and HD. It’s pretty, but not that pretty. I prefer the performance (and proper scaling for my computer) of 1080, even on a 55" screen

  • Cowbee [he/him]@lemmy.ml
    10 months ago

    4K is the reasonable limit, combined with 120 FPS or so. Beyond that, the returns are so diminished they aren’t really worth considering.

      • Cowbee [he/him]@lemmy.ml
        10 months ago

        There are legitimately diminishing returns; realistically I would say 1080p would be fine as a maximum, but 4K really is the sweet spot. Eventually, there is a physical limit.

        • starman2112@sh.itjust.works
          10 months ago

          I fully agree, but I also try to keep aware of when I’m repeating patterns. I thought the same thing about 1080p that I do about 4k, and I want to be aware that I could be wrong again

          • Cowbee [he/him]@lemmy.ml
            10 months ago

              Yep, I’m aware of it too. The biggest thing for me is that we know we are much closer to physical limitations now than we ever were before. I believe efficiency is going to be the focus, and perhaps energy consumption will get more attention than raw performance gains, beyond just sound computing practices.

            Once we hit that theoretical ceiling on the hardware level, performance will likely be gained at the software level, with more efficient and clean code.

    • kandoh@reddthat.com
      10 months ago

      8k is twice as big as 4k so it would be twice as good. Thanks for coming to my ted talk

  • TrickDacy@lemmy.world
    10 months ago

    I enjoy 4K on the monitors I sit only a few inches from all day, but so far I find it hard to justify a whole chain of upgrades for the living room when the picture quality already looks great from 10+ feet away or whatever. To be clear, I mean I don’t see the need to upgrade the living room from 1080p to 4K, let alone beyond that.

    • echo64@lemmy.world
      10 months ago

      It really depends on the size of the TV. It’s like a cinema screen: you want a very high resolution even though it’s far away, because it’s so large.

      • TrickDacy@lemmy.world
        10 months ago

        I think mine is 56 or 58 inches. A lot of people have commented that it’s large. It feels like the right size to me /shrug

        • echo64@lemmy.world
          10 months ago

          It depends on how far away it is, and on personal preference.

          4K makes a massive difference for me on my TV, and I’ll opt for it whenever I can. But it’s all personal preference and circumstance, which is good to remember.

          Your situation isn’t the same as everyone’s; there’s a valid use case for 1080p but also for 4K.

      • Zink@programming.dev
        10 months ago

        It depends on how much of your FOV the screen covers, since it’s the angular resolution of our eyes that matters.
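
        The usual back-of-the-envelope number for that is pixels per degree of viewing angle; around 60 px/deg is the commonly quoted point where extra pixels stop being resolvable, so treat that threshold as an assumption. A minimal sketch:

        ```python
        import math

        def pixels_per_degree(h_pixels, diagonal_in, distance_in, aspect=(16, 9)):
            aw, ah = aspect
            width = diagonal_in * aw / math.hypot(aw, ah)  # screen width in inches
            angle = math.degrees(2 * math.atan(width / (2 * distance_in)))
            return h_pixels / angle

        # 65-inch screen viewed from 8 feet (96 inches)
        for name, px in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
            print(name, round(pixels_per_degree(px, 65, 96)), "px/deg")
        ```

        That works out to roughly 58, 117 and 234 px/deg: 1080p is already near the threshold at that distance, 4K is about double it, and 8K is far beyond it.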

    • cm0002@lemmy.world
      10 months ago

      I mean, you can get 4K TVs cheap and fix them (as long as the panel is NOT damaged; once that’s gone the TV is nothing but scrap).

      Got a 60-inch 4K HDR TV for free off Facebook; the LED backlights had just gone out. $20 for a replacement set, 2 hours of my time, and a couple of cuts on my hand, and it’s been a fantastic TV ever since lmao

  • HEXN3T@lemmy.blahaj.zone
    10 months ago

    I have a 4K 120Hz OLED TV. The difference is quite drastic compared to my old 1080p LED. It’s certainly sharper, and probably the practical limit. I’ve also seen 8K, and, meh. I don’t even care if it’s noticeable; it’s just too expensive to be worthwhile. We should just push more frames and lower latency for now, or, the Gods forbid, optimise games properly.

    • Blackmist@feddit.uk
      10 months ago

      I feel like resolution wasn’t much of an issue even at 1080p. It was plenty. Especially at normal viewing distances.

      The real advantages are things like HDR and higher framerates including VRR. I can actually see those.

      I feel like we’re going to have brighter HDR introduced at some point, and we’ll be forced to upgrade to 8K in order to see it.

      • HandBreadedTools@lemmy.world
        10 months ago

        Ehhhh, I think 1080p is definitely serviceable; it’s even good enough for most things. However, I think 1440p and 4K are both a pretty noticeable improvement for stuff like gaming. I can’t go back to 1080p after using my 3440x1440 monitor.

        • Blackmist@feddit.uk
          10 months ago

          I notice more on my PC. Being up close I can see individual pixels. And for productivity software, the higher resolution wins every time.

          On a 55" TV, sitting 3 metres away, no real difference for me. I’d rather have extra frames than extra pixels.

          And that’s for gaming. With good quality video, I can’t see any difference at all.

      • Honytawk@lemmy.zip
        9 months ago

        Depends entirely on the size of the screen.

        A normal monitor is fine at 1080p.

        But once you go over 40", 4K is really nice.

    • johannesvanderwhales@lemmy.world
      10 months ago

      Too expensive both in terms of price and the massive amount of storage needed for 8K video. I don’t really think 8K is ever going to be the dominant format. There’s not much point in increasing resolution for minuscule gains that are almost certainly not noticeable on anything but a massive display. Streaming services are going to balk at 8K content.
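
      The raw numbers back that up: quadrupling the pixels quadruples the data the codec and the CDN have to deal with (a minimal sketch, assuming 60 fps and 10-bit 4:2:0, which works out to 15 bits per pixel before compression):

      ```python
      def raw_rate_gbps(width, height, fps=60, bits_per_pixel=15):
          """Uncompressed video data rate in gigabits per second.
          15 bits/pixel = 10-bit luma plus two quarter-resolution 10-bit chroma planes (4:2:0)."""
          return width * height * fps * bits_per_pixel / 1e9

      print(f"4K60: {raw_rate_gbps(3840, 2160):.1f} Gbit/s")  # ~7.5 Gbit/s uncompressed
      print(f"8K60: {raw_rate_gbps(7680, 4320):.1f} Gbit/s")  # ~29.9 Gbit/s uncompressed
      ```

      Compression brings both down by a couple of orders of magnitude, but the 4x ratio between them never goes away.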

    • Neil@lemmy.ml
      10 months ago

      I’ve heard recently that there’s “cheap OLED” and “expensive OLED.” Which one did you go for? I’ve got a 75" 4k OLED for $400 and it’s definitely super dark. I can’t even watch some movies during the day if they’re too dark. The expensive ones are supposed to be a lot better.

      • Venat0r@lemmy.world
        10 months ago

        I’ve got an older Sony bravia A9G and I’ve seen reviews complaining that it’s too dim but I’ve had no issues. I think some people just have really poorly thought out tv placement, or overly bright rooms. Also just close the curtains if the movie is dark…

        If you want to watch tv outside in direct sunlight you’ll need to follow this guide to build a custom super bright tv: https://youtu.be/WlFVPnGEb8o

  • mostNONheinous@lemmy.world
    10 months ago

    I legit just had an Olevia-branded 37-inch TV I’d had since 2007 finally bite the dust. 16 years was a hell of a run. It cost me $600 at the time, which works out to roughly $37.50 per year of use. The RCA ports partially went out ages ago, but the HDMI just kept ticking. It was an LCD and I never had a single pixel die on me. It played everything from GameCube, Wii, and Switch to PS2 through PS4 and the OG Xbox, 360, and Xbox One, and it ran a Chromecast constantly for the last 3-4 years. Felt like I was putting a dog out to pasture. Loved that bad boy.

  • LoudWaterHombre@lemmy.dbzer0.com
    10 months ago

    My takeaway from this comment section is that smart TVs are straight from hell and should be treated as such. It is very important that you get a TV from BEFORE smart TVs were a thing.

    • johannesvanderwhales@lemmy.world
      10 months ago

      Display technology has advanced quite a bit since smart TVs became ubiquitous, though, so you are sacrificing quality to avoid those headaches.

      Personally I just don’t give my smart TV an internet connection.

      • LoudWaterHombre@lemmy.dbzer0.com
        10 months ago

        That’s what I did too. It has no connection and I don’t use any of the smart TV features; instead I use my own box. I’ve never felt this stupid.

    • Honytawk@lemmy.zip
      9 months ago

      Nah, you can buy new TVs.

      Just make sure they can be used without a network connection, and then never connect them to the internet.

      I bought a new TCL recently; none of the smart features work, but I got excellent screen quality with all the new specs.