Here’s why 90s camera quality can look so poor compared to the 80s and 70s:
Camera quality from the 90s does not always look poor when compared to the 80s and 70s.
That said, there are many instances where 90s images do look noticeably worse, and it mostly stems from the emergence of budget imaging technology.
90s images were all over the place, both better and worse than the 80s and 70s.
So if you want to learn all about why 90s camera quality often seems worse than the 80s and 70s, then you’re in the right place.
Keep reading!
How Do 90s Pictures Look Different From 80s and 70s Pictures? (3 Things)
There’s a lot we need to talk about in this context.
The question itself suggests that 90s camera quality is worse than the 70s and 80s.
Is that even true?
I’ll take the time to explain this with all the nuance the conversation deserves, but allow me a simple answer first.
No. It’s not true.
90s quality is not universally worse than 70s or 80s quality.
That holds true for both video and still pictures.
That said, you can absolutely find instances where images from the 90s look worse than things you can find in the 70s and 80s.
So, what do we do about this?
How can we really talk about it?
Well, there are a few key things that can bring a lot of clarity to the conversation.
As we go through the differences, technologies, and trends that all impacted image quality across these decades, I’ll point out clear instances where 90s quality was better.
At the same time, I’ll explain why there are so many cases when 90s quality wasn’t better.
Also, it’s worth noting that we’re discussing both pictures and video today.
Specifically, we’re talking about image quality, whether the images are moving or still.
That means we have to cover a lot of different aspects of film, recording, and imaging technology, but the universal concept behind it all is the idea of image quality.
That can be measured in terms of image sharpness and color quality, and because of how cameras work, we’ll have to keep these ideas largely separated.
You can have a crisp, clear picture where the colors are faded or washed.
Likewise, you can have vibrant colors with blurry lines and poor sharpness.
So, I’ll remark on both aspects of image quality throughout.
To summarize before we go through all of the finer details, 90s images are not always worse, but there are cases where they are, and that is counterintuitive.
With all of that said, let’s start by comparing images.
We’ll talk about technology and trends later.
#1 70s
It’s a lot harder to find images and videos from the 70s, and that’s because camera equipment was relatively expensive.
Sure, you’d have home still cameras, but home video recorders were quite rare and not really commercially available.
So, most 70s images that you can find today were created professionally.
Such images are surprisingly sharp for anyone who grew up in later eras and hasn’t studied the technology.
You can look at something like the original Star Wars.
It still looks quite sharp, the colors are good, and the images are clear.
All of that holds true even when you watch the original versions as opposed to the digitally remastered versions.
All of that said, color capture in the 70s was not as effective as later technologies.
Even though the images are surprisingly clear, you can see that all of the colors look washed out and faded when compared to later decades.
Some of that has to do with the fact that film images fade over time.
Most of it has to do with the fact that key color-capturing technologies were invented in the 70s but not widely available until the 80s.
#2 80s
Everything I just said about 70s images holds true for 80s images, plus the advancements that were made between the two decades.
As an example, you can look at footage from Reagan’s farewell address.
It still looks quite good.
It’s not on the modern, digital level of image clarity and color capture, but there’s no doubt that you can tell what Reagan looked like, the colors in his office, and the authenticity of the video.
80s technology had everything that was available in the 70s plus some key advancements.
That’s why the pictures are even sharper, and the colors are less faded and closer to matching “real” colors.
#3 90s
So why do 90s images look so much worse?
Well, in a lot of cases, they don’t.
The truth is that the 90s saw a wider range of technologies, and not all of those technologies were advancements.
Instead, the 90s was a time when affordability really made waves in imaging technology.
You could get pretty cheap cameras and video recorders, and the quality reflected the prices.
So, a ton more photos and videos have survived from the 90s, largely because everyone was contributing by taking photos and making videos.
If you look at a bunch of random images from the 90s, you’ll see a wide range of quality.
A lot of it looks much worse than in the 70s and 80s.
In most cases, those images were created by cheaper technologies.
But, at the professional end, the images weren’t worse.
Compare Jurassic Park to the original Star Wars, or to your favorite movies from the 80s, and the 90s technology is clearly superior.
The 90s movie looks incredibly sharp, and the color is much better than anything you’ll find in the 70s or 80s.
In other words, the 90s was full of technology that was both better and worse than what was available in the 70s and 80s.
That’s the primary reason for the differences.
How Are 90s Cameras Different From 80s and 70s Cameras? (3 Differences)
To answer that, we can directly compare camera technology across the decades.
Cameras have seen regular improvements pretty much since they were invented, and that was definitely true in these three decades.
There were a few milestone advancements that are worth pointing out.
As I take you through them, you’ll see why the highest levels of image capture only improved through these decades, and at the high end, 90s images definitely are not poor by comparison.
#1 70s
One of the biggest things to happen to cameras in the 70s happened in 73.
That’s when the first CCD chip was released.
This is a device that allows for digital imaging, and the first chip was able to pick up 100 pixels in each direction.
That means it was a 10,000 pixel photo detector.
That chip wasn’t sharp by modern standards, but it opened the door to digital imaging, which would eventually surpass anything that had come before.
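To put that 10,000-pixel figure in perspective, here’s a quick back-of-envelope sketch in Python. The 100 × 100 resolution comes from the chip described above, the 1-megapixel comparison jumps ahead to the sensors we’ll cover in the 80s section, and the rest is simple arithmetic:

```python
# Pixel math for the first CCD chip (1973): 100 pixels in each direction.
width_px = 100
height_px = 100
total_px = width_px * height_px
print(f"1973 CCD: {width_px} x {height_px} = {total_px:,} pixels")
# -> 1973 CCD: 100 x 100 = 10,000 pixels

# For scale, a late-80s "megapixel" sensor records 1,000,000 pixels per image,
# which is 100 times as many as this first chip.
megapixel = 1_000_000
print(f"A megapixel sensor captures {megapixel // total_px}x as many pixels")
# -> A megapixel sensor captures 100x as many pixels
```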
Even though it was invented and released in 1973, CCD usage took a while to really become widespread.
So, the vast majority of 70s footage was still shot on traditional film formats like Super 8, and it’s why overall image quality in the 70s lagged behind later decades.
Another big technology of the decade was CCD color imaging.
This allowed for much more precision in color capture, but again, it was a new technology in the 70s and took a while to catch on.
#2 80s
Meanwhile, in the 80s, digital imaging saw some serious advancements.
By the end of the decade, there were megapixel sensors (sensors that could record 1 million pixels per image).
Also, the adoption of digital imaging really started to spread.
It still wasn’t an everyday technology, but it was applied at the professional end of things, and 80s video quality proved sharper than anything from the 70s.
Keep in mind that traditional film still dominated the market, but lessons learned from digital imaging helped produce cameras that were ever sharper.
On top of that, the 80s also saw electronic autofocus go mainstream.
This improved the average quality of video, especially for sports broadcasts.
Lastly, the color capture technology of the 70s was improved and widely adopted in the 80s.
It’s why 80s color often looks a lot better than what came before.
#3 90s
As for the 90s, that’s when things really started to go digital.
Professional recording studios started adopting digital recording devices at high rates.
Experiments with high-definition broadcasts were underway.
Even consumers could get their hands on digital cameras by 95.
Additionally, the first version of Photoshop launched in the early 90s, and optical image stabilization hit the market.
Arguably, imaging technology saw more advancements in the 90s than in any other decade.
This is really where the transition from film to digital recordings took place.
And, this helps explain why quality was so all over the place in the 90s.
A lot of people were experimenting with new technologies, and early attempts didn’t always go well.
What About 90s Film Quality Compared to 80s and 70s Film Quality? (3 Comparisons)
That covers camera technology, but there’s more to image quality than just the device capturing the image.
The film matters a lot too.
In the digital age, film is no longer a concern.
As long as the device has enough space for the 1s and 0s that make up digital film, it all works out.
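For a rough sense of what “enough space” means, here’s a minimal sketch of the arithmetic, assuming an uncompressed 24-bit image (3 bytes per pixel), which is my simplifying assumption rather than anything specific to 90s hardware:

```python
# Rough, uncompressed storage estimate for a single digital image.
# Assumes 24-bit color (3 bytes per pixel) and no compression -- a simplification.
width_px = 1_000        # a 1-megapixel image (1,000 x 1,000)
height_px = 1_000
bytes_per_pixel = 3     # 8 bits each for red, green, and blue

raw_bytes = width_px * height_px * bytes_per_pixel
print(f"Uncompressed size: {raw_bytes / 1_000_000:.1f} MB")
# -> Uncompressed size: 3.0 MB
```

Real cameras compress images far below that, but it shows why storage space, rather than film stock, became the thing to budget for once imaging went digital.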
But in the 70s, 80s, and 90s, traditional film was still a thing, and it made a huge difference in image quality.
We can compare some of the ways film changed over these decades to help explain why things looked so different in the 90s.
#1 70s
The 70s was the first decade to have digital imaging, and that pushed engineers to really think about film and recording.
This might surprise you, but that digital push led to the development of 3D and CGI imaging.
Both of these technologies were invented in the 70s.
Once again, the 70s suffered from poor adoption rates.
It takes a long time to go from inventing the very first CGI image to making it a reliable, mainstream option.
That’s true for all of the incredible advancements in technology through the 70s.
So, even though some pretty amazing stuff existed in the decade, the vast majority of video was still recorded on Super 8, and still photos were on the same basic film that had been around for decades.
Because of this, 70s images look extremely similar to anything you might see from the 50s or 60s.
#2 80s
Meanwhile, the 80s benefited from film advancements as much as they did from image capture technologies.
While 70s inventions fueled sharper imaging and better color capture for the 80s, the CGI, 3D, and other film advancements also led to better quality across the 80s.
Again, this is why 80s film is often distinguishable from anything in the 70s.
The technology was better, and it showed.
As for advancements, the 80s saw a lot of innovations that helped pave the way for the widespread adoption of digital imaging, but they were all incremental.
Rather than redefining the landscape, 80s technology pushed pixel counts, color ranges, and general image quality metrics forward.
But, there is one stand-out advancement from the 80s that is worth mentioning.
The decade saw the very first high-definition recording.
Following the trend, this advancement came too early to really take root in the 80s, but high definition was a thing.
That demonstrates how technology was improving image quality and how that impacted pictures and video from the decade.
#3 90s
As for the 90s, I’ve already mentioned that consumer-grade digital cameras were first launched in 95.
Well, you can’t have a digital camera without digital film, and the 90s saw the widespread adoption of digital film.
At the start of the decade, the 90s looked a lot like the 80s, and traditional film was still the norm.
By the end of the decade, digital was making serious inroads into professional pictures and recordings.
Incremental improvements happened at rapid rates throughout the decade, and by 99, you could find high-quality digital images just about everywhere.
That doesn’t mean that film was a dead medium in the 90s.
In fact, film was still more popular than digital replacements by the end of the decade.
But, the idea of digital imaging proliferated, and at the end of the decade, Hollywood was starting to shoot movies entirely with digital devices.
Again, it wasn’t the majority option, but it was a viable option for the first time.
To help put this in context, The Phantom Menace was the first blockbuster movie to be screened on digital projectors.
It debuted in 1999.
So, the 90s was a decade with rapidly advancing, hybridized technologies that used both digital and analog resources.
This yet again helps to explain the disparity of image quality in the 90s.
Early digital devices weren’t necessarily superior to their analog counterparts.
But by the end of the 90s, digital images were becoming more reliable than most film options.
Only the highest-end film could compete with digital at that point.
On top of that, mixing and matching digital and analog technologies was still an immature practice, and that led to inconsistent image quality.