And video quality. Watching some historical videos from my childhood, like tv shows on youtube… the quality is pure potato. Either the archiving is terrible, or we just accepted much worse quality back then.
I think a lot of it comes down to whether it was recorded on actual film or recorded with the idea that it would only ever be shown on TV. If film was used (which was more expensive), the masters can still be used to produce high-quality versions.
Here is a whole playlist that falls into that category.
https://www.youtube.com/watch?v=wCDIYvFmgW8&list=PL83uD9UA8b_l2SzDUjVwskh3ugYgmO_8r&index=97
It was filmed at low quality, and film can degrade over time. It was archived that way because the source was 💩
People always said that Betamax was better quality than VHS. What never gets mentioned is that regular consumer TVs at the time weren’t capable of displaying the difference in quality. To the average person they were the same.
VHS was capable of decent quality; people just had a lot of bad equipment.
Some TV shows (if the producers were crazy enough to spring for it) were shot on film, so you could re-digitize them now in 4K or 8K and they’d look amazing. But there was also a lot of junk out there.
And as others have mentioned, if you do an awful job of digitizing, you can take something that looked good and throw all of that quality away. And if the tape wasn’t stored in good condition, it can be a struggle to digitize properly in the first place.
You kinda can tell, though. CRTs didn’t really use pixels, so it’s not like watching on today’s video equipment.
CRT screens definitely used pixels, but they updated per horizontal line rather than per pixel. This is one reason earlier flatscreen LCDs were worse than CRTs in a lot of ways: they had much more motion blur, because “sample and hold” meant each pixel stayed lit at the same value for the whole frame, so anything your eye tracked got smeared. A CRT gave your eye a fresh flash of light each frame regardless.
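A rough way to picture the sample-and-hold problem: when your eye tracks a moving object, the perceived smear is about the object’s speed times how long each frame stays lit. A toy calculation in Python (the persistence figures are illustrative guesses, not measurements):

```python
# Toy model: perceived smear when the eye tracks a moving object.
# Assumption: smear ~= speed * frame_time * persistence, where
# persistence is the fraction of the frame the image stays lit.

def perceived_blur_px(speed_px_per_s: float, fps: float, persistence: float) -> float:
    return speed_px_per_s * (1.0 / fps) * persistence

# Example: a 960 px/s pan at 60 Hz.
print(perceived_blur_px(960, 60, 1.0))  # sample-and-hold LCD: ~16 px smear
print(perceived_blur_px(960, 60, 0.1))  # CRT-like brief flash: ~1.6 px smear
```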
I have heard that pixels in CRTs are round while LCD/LED pixels are square, and that’s the reason aliasing isn’t too noticeable on CRTs. Is this true, or just more internet BS?
They’re not round per se, but they aren’t as sharp, so they bleed light into one another, giving a natural anti-aliasing effect. This is why some old games, whose art was designed to account for this blurring, look wrong when played on pixel-perfect modern TVs.
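If you want to play with the effect yourself, a crude stand-in for phosphor bleed is to upscale the art with no interpolation and then blur it a touch. A sketch with Pillow (the filenames, scale factor, and blur radius are arbitrary placeholders):

```python
# Sketch: crude CRT-style softening. Upscale pixel art with no
# interpolation, then blur slightly so neighbouring "pixels" bleed
# into each other.
from PIL import Image, ImageFilter

sprite = Image.open("sprite.png")                       # low-res pixel art
sharp = sprite.resize((sprite.width * 4, sprite.height * 4),
                      resample=Image.NEAREST)           # pixel-perfect look
soft = sharp.filter(ImageFilter.GaussianBlur(radius=1.5))  # phosphor-ish bleed
soft.save("sprite_crt_ish.png")
```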
deleted by creator
Noted.
Ummm… what? How do you think CRTs showed the picture?
What they’re referring to is that analogue CRTs don’t really have a fixed horizontal resolution. The screen has a finite number of horizontal lines (i.e. rows) which it moves down through on a regular-timed basis, but as the beam scans across horizontally it can basically be continuous (limited by the signal and the radius of the beam). This is why screen resolutions are referred to by their vertical resolutions alone (e.g. 360p = 360 lines, progressive scan [as opposed to interlaced]).
I’m probably wrong on the specifics, but that gives the gist and enough keywords to find a better expansion.
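The standard NTSC figures bear this out: the line structure is rigidly fixed in time, while horizontal detail is bounded only by signal bandwidth and beam spot size. A quick back-of-the-envelope check in Python (the constants are the published NTSC numbers):

```python
# Back-of-the-envelope NTSC timing, using the published constants.
total_lines = 525          # scan lines per frame (only ~480 carry picture)
frame_rate = 30000 / 1001  # ~29.97 frames/s, sent as two interlaced fields

line_rate = total_lines * frame_rate               # ~15734 lines per second
print(f"line rate: {line_rate:.0f} Hz")
print(f"time per line: {1e6 / line_rate:.1f} us")  # ~63.6 microseconds

# The line count and timing are fixed; what happens *along* each 63.6 us
# sweep is an analogue waveform, so there is no hard horizontal pixel
# grid -- which is why resolutions are named by line count alone.
```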
VHS won against Betamax partly because VHS machines were cheaper to make; they also weighed less, so material costs were lower.
VHS won because you could record a whole movie on one tape; early Betamax cassettes topped out at an hour.
Obligatory Technology Connections video: https://www.youtube.com/watch?v=hGVVAQVdEOs
There’s a lot of archival video that is just terrible. Digital video compression has damaged a lot of old footage as it’s been re-shared over the years, especially through YouTube’s encoders, which will just straight up murder a video to save bandwidth. There’s also a lot of stuff that simply doesn’t look great when it’s upscaled from magnetic media that’s roughly 320×240-equivalent at best.
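The re-sharing damage is easy to reproduce. A small sketch of generation loss, with a JPEG round trip standing in for one pass through a lossy video codec (the quality setting and filenames are arbitrary; this is not YouTube’s actual pipeline):

```python
# Sketch: generation loss from repeated lossy re-encoding. Each JPEG
# save-and-reload stands in for one "re-upload" through a lossy codec.
import io
from PIL import Image

img = Image.open("frame.png").convert("RGB")   # placeholder input frame
for generation in range(10):
    buf = io.BytesIO()
    img.save(buf, format="JPEG", quality=50)   # aggressive bandwidth-saving setting
    buf.seek(0)
    img = Image.open(buf).convert("RGB")       # decode, losing detail
# Most of the damage lands in the first few passes; real pipelines also
# rescale and switch codecs between uploads, which compounds the loss.
img.save("frame_gen10.png")
```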
However, there’s also a lot of stuff that was bad to begin with and just took advantage of things like scanlines and dithering to make up for poor video quality. Take old games, for example: a lot of developers took advantage of CRT TVs to create shading, smoothing, and the illusion of a higher resolution than the console was actually capable of. There’s a lot of contention in the retro gaming community over whether games looked better with scanlines or look better now without them.
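To make the debate concrete, here’s a minimal sketch of the effect being argued over: upscale a frame without interpolation, then darken a subset of rows. The spacing, dimming strength, and filenames are arbitrary choices, not anyone’s canonical filter:

```python
# Sketch: fake CRT scanlines by dimming every third row of an
# upscaled frame.
import numpy as np
from PIL import Image

frame = Image.open("frame.png").convert("RGB")  # placeholder input
big = frame.resize((frame.width * 3, frame.height * 3),
                   resample=Image.NEAREST)      # blocky 3x upscale

px = np.asarray(big).astype(np.float32)
px[1::3] *= 0.5  # dim one row in three to suggest the gaps between scanlines

Image.fromarray(px.clip(0, 255).astype(np.uint8)).save("frame_scanlines.png")
```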
For example: https://www.youtube.com/watch?v=vscKaVByjRU
Personally, I prefer them without. I like the crisp pixelly edges, but I was also lucky enough to play most of my games on a high quality monitor instead of a TV back then. Then emulators, upscaling, and pixel smoothing became a thing…
Lol
I watch a lot of hockey, and games from the 2000s are full-on potato. I don’t remember them looking that bad back then.
Hockey is definitely the sport helped the most by HD video.
All sports have been, helped also by the rise of faster-refresh LCDs; those early flat screens blurred a lot.