If the game is demanding enough they also consume the same amount of electricity, maybe even more.
And are full of untold mathematical horrors, just like physical lamps.
More interestingly, lamps in video games use the same amount of real electricity whether they are on or off.
Highly depends on the rendering engine and whether you’re looking at it; it could unrender when you look away, meaning less energy used.
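Something like this toy check, if you want the gist (made-up C, not any real engine’s code; real engines do proper frustum and occlusion culling):

```c
#include <stdio.h>

/* Toy visibility check: if the lamp is behind the camera, skip drawing it.
   Real engines do full frustum/occlusion culling; this is just the idea. */
typedef struct { float x, y, z; } Vec3;

static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

static int lamp_visible(Vec3 cam_pos, Vec3 cam_dir, Vec3 lamp_pos) {
    Vec3 to_lamp = { lamp_pos.x - cam_pos.x,
                     lamp_pos.y - cam_pos.y,
                     lamp_pos.z - cam_pos.z };
    return dot(cam_dir, to_lamp) > 0.0f;  /* behind you -> no GPU work spent */
}

int main(void) {
    Vec3 cam = {0, 0, 0}, facing = {0, 0, -1};
    Vec3 lamp_ahead = {0, 1, -5}, lamp_behind = {0, 1, 5};
    printf("ahead: %d, behind: %d\n",
           lamp_visible(cam, facing, lamp_ahead),
           lamp_visible(cam, facing, lamp_behind));  /* ahead: 1, behind: 0 */
    return 0;
}
```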
Not necessarily: on OLED displays (which are definitely a thing for desktop computers and TVs), a lamp that’s turned off uses less power, because the pixels it’s drawn on (and the ones around it) are dimmer.
YELLS IN GPU VERTEX PIPELINE
that consumes electricity. ever think about the poor gpu? about how your words hurt its feelings?
jokes aside, the power to process a few hundred vertices every frame is insignificant
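back-of-envelope, with made-up but plausible numbers (the vertex count, GPU throughput, and board power below are all guesses):

```c
#include <stdio.h>

int main(void) {
    /* All numbers are illustrative assumptions, not measurements. */
    double verts_per_frame = 500;   /* a simple lamp mesh */
    double fps             = 60;
    double gpu_verts_per_s = 5e9;   /* rough mid-range GPU vertex throughput */
    double gpu_power_w     = 200;   /* board power under full load */

    /* Naive linear scaling of power with vertex throughput: */
    double share = (verts_per_frame * fps) / gpu_verts_per_s;
    printf("lamp vertex cost: ~%.2f mW\n", share * gpu_power_w * 1000.0);
    return 0;  /* ~1.2 mW: thousands of times less than a real bulb */
}
```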
Actually, the pixels go completely black and do not consume any electricity at all in that state.
You might be thinking of early OLEDs, which had to stay on at all times to prevent blur/smearing. But panel manufacturers solved that problem a few years ago. Don’t remember exactly when the change happened, but I remember first seeing true black OLEDs sometime around 2017/2018.
The light doesn’t become true black; it’s dark, but not complete nothingness. So yes, it’ll still consume power.
When a lamp turns off it doesn’t become a black hole. Previous commenter was correct, though I appreciate your info about OLED
And traditional LCDs with a backlight can use slightly more power for darkness: the liquid crystal is transparent by default and turns opaque/black when a voltage is applied, while the backlight burns at full power either way.
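A toy model of the difference, with made-up numbers, just to show the shape of it:

```c
#include <stdio.h>

/* Caricature of per-pixel power draw (all numbers made up):
   - OLED: each pixel emits its own light, so power tracks brightness
     and a fully black pixel draws ~nothing.
   - Backlit LCD: the backlight burns at full power regardless; the
     pixel only blocks light, so black isn't any cheaper. */
static double oled_pixel_mw(double brightness /* 0..1 */) {
    const double max_mw = 0.2;         /* made-up per-pixel ceiling */
    return brightness * max_mw;        /* 0.0 when fully black */
}

static double lcd_pixel_mw(double brightness) {
    const double backlight_mw = 0.2;   /* constant backlight share */
    (void)brightness;                  /* pixel content doesn't matter */
    return backlight_mw;
}

int main(void) {
    printf("black pixel: OLED %.2f mW vs LCD %.2f mW\n",
           oled_pixel_mw(0.0), lcd_pixel_mw(0.0));   /* 0.00 vs 0.20 */
    return 0;
}
```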
> OLED displays (which are definitely a thing for desktop computers and TVs)
Probably not for most people, due to cost. More realistic for portable devices, where battery saving matters; there doesn’t seem to be much mainstream push for OLED (or equivalent) monitors that aren’t top-end (on Newegg, I could only find 240Hz options).
That, and search results are often for other panel technologies (IPS/TN/VA). Lower-spec stuff seems to exist, but you really gotta scrape the bottom of the barrel (portable monitors) to find some niche product.
Monitors no, TVs very much so.
> TVs very much so
Very much so… what? At a quick glance they’re expensive AF (riddled with “smart” features and now AI, gigantic on top of being 4K, etc.) too.
Sure, I guess there’s a chance a few people impulsively bought one at a big-box store (or “on sale” for the full price of a non-OLED TV), but it’s more likely they bought an “LED” TV, which is marketing speak for an LED-backlit LCD with local dimming at best (not even close to OLED turning pixels off).
I’m not sure sub-£550 ($700) at reasonable sizes (42") really counts as expensive AF anymore (not cheap, but not expensive AF). But to each their own.
Alright sure, maybe. But LCD screens are ubiquitous, and most people probably aren’t looking to buy more displays. In a similar vein, early 4K adopters probably don’t have much reason… if they can just be happy with what they already have.
It is good enough to be the last thing to upgrade, especially looking at the chunk of cost it’d add when lumped in with PC/console cost. (Also, selling is probably not for everyone, even if less-modern HDTVs had any resale value, and at ~42" you might not even get quick takers even if it’s free.)
A quick look at the Steam survey: ~56% of users are still on 1080p and ~20% are on 1440p. If OLED stays almost exclusive to 4K and/or 240Hz, many will likely continue to ignore it.
Also, if you don’t have the hardware and content for it, it doesn’t really make sense. That’s additional cost, and you may even need to look specifically for content that works well with OLED (if not created with it in mind). Higher-speed broadband availability/cost and streaming enshittification (plus encoding quality) may be factors here too.
And burn-in seems to still be a thing, at least with some types/models.
So I see this as a long way off for mass adoption, similar to VR. And more to my point that it’s more of an exception than a norm.
EDIT: Also just saw QDEL; it seems a year away still, but may fix burn-in and cost (especially if it’s pushed to the lower end; print manufacturing may allow it). Though who knows, I’m also seeing tandem OLED (except it seems to make cost worse).
A few things:
- I disagree that LCD is good enough, especially for living-room gaming. Going OLED is the best and most significant upgrade I’ve ever done, by a long way.
- In terms of the Steam survey, again no arguments from me: OLED monitors are rare. I was arguing that TVs are not.
- There’s no such thing as content that works well with OLED: everything looks significantly better, especially with HDR, which almost everything has supported for a significant period of time.
- As someone who has been using an OLED TV for 5+ years, burn-in really isn’t an issue: there’s not a trace of it on either of my TVs, or on any of my portable devices with OLEDs. The only time I’ve ever experienced burn-in on an OLED was a Nexus 5, which is long enough ago to be almost irrelevant, and even then it only happened because I enabled the developer option to keep the screen on at all times, so the status bar burned into the screen. All modern OLED displays take burn-in into account and occasionally run screen cleaning, which isn’t noticeable because the screen just appears black. So unless someone is running a news channel with a static logo 24/7, they’re not going to have issues with burn-in. It’s worth noting I have an OLED TV on my desk too (that one was indeed on sale, for ~400 IIRC), and it shows static content such as an Apple logo (work laptop 😞) for hours each day, with no burn-in.
I’d argue that’s not true if the lighting is baked into the map.
Does it matter? The screen still has to display it, and the GPU still has to render it, even if no RT is involved.
The GPU renders the map whether there is lighting baked in or not; it’s exactly the same operation. And depending on your display tech, brighter pixels might actually use slightly less energy.
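Sketch of what “baked” means here, with plain C standing in for a shader (hypothetical code, not any real engine’s):

```c
#include <stdio.h>

/* With baked lighting, the lamp's glow was computed offline and stored
   in a lightmap texture. At runtime it's the same single lookup and
   multiply whether the artist baked the lamp "on" or "off". */
typedef struct { float r, g, b; } Color;

static Color sample_lightmap(const Color *lightmap, int width, int u, int v) {
    return lightmap[v * width + u];    /* one fetch, regardless of content */
}

static Color shade(Color albedo, Color baked) {
    /* identical arithmetic no matter what the lightmap holds */
    return (Color){ albedo.r * baked.r, albedo.g * baked.g, albedo.b * baked.b };
}

int main(void) {
    Color lightmap[4] = { {1, 1, 1}, {0, 0, 0}, {1, 1, 1}, {0, 0, 0} };
    Color wall = { 0.8f, 0.6f, 0.4f };
    Color lit  = shade(wall, sample_lightmap(lightmap, 2, 0, 0)); /* lamp "on"  */
    Color dark = shade(wall, sample_lightmap(lightmap, 2, 1, 0)); /* lamp "off" */
    printf("lit: %.1f, dark: %.1f (same work either way)\n", lit.r, dark.r);
    return 0;
}
```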
But they still use energy.
You can hardly argue that the lamp itself is using energy when “not a lamp” uses exactly as much energy.
Did you know that characters in video games have an electrical current to keep them alive just like real people?
somebody said this at work yesterday, and now it’s here
Shades in video games use even more electricity
Not on OLED screens + prebaked lighting
Those are some pretty specific conditions, but okay :)
You made a blanket statement. There are exceptions.
not that specific. most modern displays are oled, and most efficient games use prebaked lighting. the average gamer probably plays on an oled display, and has a game with prebaked lighting.
Looks like we’re from different galaxies, as I’ve never seen an OLED display for a PC with my own eyes (I know they exist, but they’re extremely rare where I am)
So do stones in video games. And water.
You aren’t supposed to think about it
But what about candles?
Lamps in video games aren’t real. It’s the video game that’s using the electricity.
Video games aren’t real. It’s the computer components that use electricity
computer components aren’t real. It’s all just tiny gremlins doing maths really fast and turning pixels on and off
Tiny gremlins aren’t real. It’s all just a dream. Wake up, you have to make me breakfast. I would like pancakes, please.
Babe wake up I want pancakes
Pancakes aren’t……
No…. pancakes are real. And they’re perfect.
The lamp is rendered by small electric lights, be it LEDs or an LCD. CRTs are in a bit of a grey area. But you can absolutely use a monitor as a light source by itself.
That’s like saying “lamps don’t create light, it’s the flame/filament in the lamp that creates the light”
Solar powered
Schmidt?
Every electronic device in the game uses real electricity. Even if it’s not on.
Which is really unexpected if you’re looking at an oil lamp.
Change electricity to energy and we’re good again