720p vs 1080p: When Is It Time To Upgrade Your TV?

I still have a 720p television.

There. I said it.

And since the average lifespan of a TV is between five and ten years, I’ll bet I’m not the only one.

It’s not that my TV is dying. It’s a Samsung and still looks great, actually. But it’s 720p, and that automatically makes it obsolete, right?

Or…does it?

This got me wondering what the real difference is between 720p and 1080p TVs, and at what point you can actually see it.

So in this article, I’m going to explain the difference between 720p and 1080p (and 1080i). Then I’m going to explain why (and when) it makes a difference. Then, you can decide if you need to upgrade now, or whether that old 720p TV of yours still has some life left in it.

I’ve also done this same analysis for the difference between 1080p TVs and 4K TVs. You can check out that article here.

Let’s dive in!


How Is TV Resolution Measured?

When we talk about a TV’s or computer monitor’s picture quality, we’re talking about its resolution. That’s the number of pixels on the screen. A pixel is a tiny picture element (in an LCD panel, a liquid crystal cell illuminated by a backlight) that can show different colors. If the size of the screen stays the same, a higher resolution number means a sharper display.

But instead of talking about the total number of pixels, we shorten that to the number of pixels on one axis (either vertically or horizontally).

Here’s an example:

The full resolution of a 720p TV is 1280 pixels wide by 720 pixels high on the screen. To get the total number of pixels on the screen, we multiply the two numbers together.

720p = 1280 pixels wide x 720 pixels high = 921,600 total pixels.

Since it doesn’t really roll off the tongue to say we have a 921K TV, the TV company marketing departments decided to shorten that to the number of pixels on the vertical axis: 720.


Progressive vs. Interlaced: What Do The ‘p’ & ‘i’ Mean?

Tacked on the end of that resolution number is either a ‘p’ or ‘i’, which stands for either Progressive or Interlaced.

This refers to how the image is drawn on the screen. In general, progressive scan (p) will give you a smoother picture than an interlaced picture at the same resolution.

But why?

How Interlaced Video (i) Works

Interlaced video was the first attempt at increasing the resolution of your TV screen.

It worked by drawing every other line of an image on the screen in a single frame. Then, on the next frame, it would draw the opposite lines. If you looked at each frame individually, it would only ever show half of the image. But, played quickly enough, the alternating frames trick our eyes into seeing one complete image.

For the most part, this worked well. The problem was that the two frames weren’t the same image. So for fast-moving images, like sports, you’d see strange artifacts where the two frames didn’t line up correctly.

Still, interlaced video was a great way to increase resolution without increasing the bandwidth that the video used.
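
To make the idea concrete, here’s a tiny Python sketch (purely an illustration, not how real hardware works) that splits a frame into the two alternating “fields” an interlaced signal would transmit, then weaves them back together:

```python
# Illustrative sketch: split a "frame" (a list of numbered lines) into
# the two alternating fields an interlaced signal would transmit.
frame = [f"line {n}" for n in range(6)]  # a tiny 6-line frame

field_1 = frame[0::2]  # lines 0, 2, 4
field_2 = frame[1::2]  # lines 1, 3, 5

# Each field carries only half the lines...
print(len(field_1), len(field_2))  # 3 3

# ...but interleaving the two fields rebuilds the full frame.
rebuilt = [None] * len(frame)
rebuilt[0::2] = field_1
rebuilt[1::2] = field_2
print(rebuilt == frame)  # True
```

The catch, as described above, is that in real interlaced video the two fields are captured at slightly different moments in time, so they don’t always line up this neatly.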

How Progressive Scan (p) Works

By contrast, progressive scan draws the entire image, line by line, on every frame. That means motion is more fluid, and there are no weird artifacts when the subject of the image is moving quickly. Action movies and sports look much smoother.

Unfortunately, that comes at a cost.

Progressive scanning takes up a lot more bandwidth than interlaced video. This is what drove the adoption of better and better cables over the years.


What Is HD TV Resolution?

Now that you understand how to read a resolution spec, let’s look at an easy question that has a rather complicated answer:

What is HD resolution?

Back in the old days, TV signals were either 480i in North America (NTSC) or 576i in Europe (PAL). So when HD came out, it was simply defined as ‘anything higher than Standard Definition.’

Predictably, manufacturers had different ideas of what that really meant and three different HDTV resolutions emerged.

720p was the clear winner early on, so manufacturers called it High-Definition, or HDTV. But when 1080p became more cost-effective, suddenly there were two competing HD formats.

1080i (interlaced) was short lived, at least as far as consumer televisions went. However, 1080p, which was known as “Full HD” (FHD) at the time, quickly became the dominant HD television resolution.

One final note: based on the “official” definition of High-Definition, even 4K and 8K TVs would qualify as HDTVs. While technically true, TV manufacturers found a winner by calling them “4K TVs”, so that’s how they’re marketed.

Putting aside the marketing labels, probably the best way to compare 720p and 1080p is to look at their pixel counts.

We’ll do that in the next section.


A Quick Word About 1080i 

One of the early, competing television formats was 1080i, and it seemed to be the best of both worlds.

The resolution was higher than the 720p standard that was becoming popular. And it used about the same amount of bandwidth as that lower-resolution, progressive-scan video.
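
That bandwidth claim is easy to sanity-check with some back-of-the-envelope math. An interlaced signal only transmits half its lines per field, so 1080i at 60 fields per second moves roughly the same number of raw pixels per second as 720p at 60 full frames per second. (This is a rough sketch that ignores compression; real broadcast bitrates depend heavily on the codec.)

```python
# Raw pixels transmitted per second, ignoring compression.
fps = 60  # fields/sec for interlaced, frames/sec for progressive

p720  = 1280 * 720 * fps          # a full frame every time
i1080 = 1920 * (1080 // 2) * fps  # only half the lines per field

print(f"720p : {p720:,} pixels/sec")              # 55,296,000
print(f"1080i: {i1080:,} pixels/sec")             # 62,208,000
print(f"1080p: {1920 * 1080 * fps:,} pixels/sec") # 124,416,000
```

The 720p and 1080i numbers are in the same ballpark, while full 1080p needs double the pixel rate of 1080i. That’s the trade-off broadcasters were making.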

However, because it was interlaced, it also inherited some of the same problems, like mismatched fields and jaggies where there should be smooth curves.

For a brief time, the industry considered 1080i to be better than 720p, but further testing showed the two resolutions to be essentially equivalent in perceived quality.

Progressive video offers smoother images by default, so a 1080p image will look better than a comparable 1080i image. Additionally, many TVs and streaming devices will automatically deinterlace interlaced signals into progressive ones.

So, if a 1080i signal is fed into a 720p TV, it will be deinterlaced and down-converted to 720p. Fed into a 1080p TV, the same deinterlacing happens, only this time with no change in resolution: the final display will be 1080p.
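
That conversion logic can be sketched as a simple decision function (my own illustration of the behavior described above, not any particular TV’s firmware):

```python
def processing_steps(signal, panel_height):
    """Return the steps a TV applies to an incoming signal.

    `signal` is a string like "1080i" or "720p"; `panel_height` is
    the panel's vertical resolution (e.g. 720 or 1080).
    """
    height, scan = int(signal[:-1]), signal[-1]
    steps = []
    if scan == "i":
        steps.append("deinterlace")  # i -> p at the same resolution
    if height != panel_height:
        steps.append(f"scale {height}p -> {panel_height}p")
    return steps

print(processing_steps("1080i", 720))   # ['deinterlace', 'scale 1080p -> 720p']
print(processing_steps("1080i", 1080))  # ['deinterlace']
```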


720p vs 1080p: Pixel Count

In a previous section, I gave you the simple calculation to figure out how many pixels are on your TV screen.

That calculation is:

Number of Pixels Wide x Number of Pixels High = Total Number of Pixels

Pretty simple, eh?

The more pixels on your screen, the sharper the image will appear to the human eye (up to a point).

So, let’s see how the pixel counts compare between a 720p television and a 1080p television. Let’s start with 720p.

As I said earlier, a 720p TV is 1280 pixels wide and 720 pixels high. Multiplying the two together gets a total of 921,600 pixels.

720p = 1280 pixels wide x 720 pixels high = 921,600 total pixels.

That sounds like a lot, and, compared to the 307,200 pixels in a 480p television, it was! However, 1080p made another huge leap in the number of pixels manufacturers could squeeze onto a TV panel.

1080p TVs are 1920 pixels wide and 1080 pixels high. So that leads us to this:

1080p = 1920 pixels wide x 1080 pixels high = 2,073,600 total pixels.

That’s more than double the number of pixels in a 720p television and nearly seven times the number of pixels in a 480p, standard-definition television.
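
Those totals are easy to verify yourself. Here’s the same arithmetic in a few lines of Python:

```python
# Width x height for each common TV resolution.
resolutions = {
    "480p":  (640, 480),    # standard definition
    "720p":  (1280, 720),   # HD
    "1080p": (1920, 1080),  # Full HD
}

totals = {name: w * h for name, (w, h) in resolutions.items()}
for name, total in totals.items():
    print(f"{name}: {total:,} pixels")
# 480p: 307,200 pixels
# 720p: 921,600 pixels
# 1080p: 2,073,600 pixels

print(round(totals["1080p"] / totals["720p"], 2))  # 2.25 (vs 720p)
print(round(totals["1080p"] / totals["480p"], 2))  # 6.75 (vs 480p)
```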


Can You Tell the Difference Between 720p & 1080p?

Here’s another easy question with a complicated answer: Can you tell the difference between 720p and 1080p?

That depends on how big your television is, and how far away you’re sitting.

To illustrate, let’s use the example of my 43″ Samsung TV. I have it in my upstairs den, and our couch is about six feet away from the TV.

Manufacturers recommend placing your television about three times the vertical screen height away from where you’ll be sitting. For my TV, that’s about 5.3 feet away. So our couch is a little too far away, given the size, but not too bad.
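
You can work out that “three times the screen height” figure for any TV. Screen sizes are diagonal measurements, so for a 16:9 panel the height comes from the Pythagorean theorem. Here’s a quick sketch of that arithmetic:

```python
import math

def recommended_distance_ft(diagonal_in, ratio=(16, 9), multiple=3):
    """Seating distance (feet) at `multiple` x the screen height,
    for a TV measured by its diagonal in inches."""
    w, h = ratio
    height_in = diagonal_in * h / math.hypot(w, h)  # vertical screen height
    return multiple * height_in / 12                # inches -> feet

print(round(recommended_distance_ft(43), 1))  # 5.3
```

For my 43″ TV, that works out to a hair over five feet, close to the figure quoted above.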

At that distance, could I see a difference if I upgraded to a 1080p TV?

To find out, let’s take a look at this chart from Rtings.com.

source: rtings.com

This chart has TV sizes along the bottom, and the distance from your TV going up the left-hand side. To read the chart, find the size of your TV, and go up until you get to how far away it is.

I’ll use the 45″ line, since it’s the closest to my TV size.

This chart tells me that I could put my TV up to about 9′ away from my couch and still be able to see the difference between 1080p and 720p.

It also tells me that I should NOT upgrade to a 4K TV, because my six-foot distance is right at the border where 1080p and Ultra HD (4K) look about the same.


720p vs 1080p: Content

But even if your TV is the right size and viewing distance away, you might not need to upgrade.

That’s because most TV stations in the United States broadcast almost exclusively in either 720p or 1080i. So if all you use your old TV for is broadcast TV, then it’s probably not worth it to upgrade.

On the flip side, 1080p is the current standard for most streaming services and Blu-ray movies. You can also get some 4K content from most streaming services, plus a handful of 4K UHD Blu-ray discs. But comparing 4K to 1080p is a whole other conversation.


720p vs 1080p: When Should You Upgrade?

There is little difference in picture quality between 720p and 1080p when the screen is smaller than 50 inches. The smaller screen size compensates for the fewer pixels in 720p, and the differences get even harder to see from more than about 6.5 feet (2 meters) away from a 50-inch screen.

Sometimes you may even notice a 720p picture looking “better” than a 1080p picture, and the reason is usually bandwidth. The simplest explanation is that a 720p stream needs less compression at the same bitrate, since it has fewer pixels to encode than a 1080p stream.

How Much Do 720p And 1080p TVs Cost?

1080p TVs are generally more expensive than 720p TVs.

The price varies depending on factors such as brand, display technology, features, and specialized capabilities.

If this is your first HD TV, there’s no problem starting with an entry-level 720p model. It will still give you a high-definition experience, especially on a screen smaller than 50 inches. That will give you a feel for HD TVs, and you can see for yourself whether you need to upgrade or not.

Most times, if it’s just for general use, then the 720p option will be perfect. 

Getting value for money is a priority if you’re on a budget, and for most people, saving the difference is the wiser choice.

However, if you want to upgrade your current TV, then 1080p would be the best option, and you can enjoy a bigger screen with it. It’s also the best TV to get if you plan to use it as a big computer monitor; if icons and text look too small at the native 1920×1080 resolution, just increase your operating system’s display scaling.


The Verdict

There’s no doubt that 1080p is better than 720p, but the real question is whether it’s worth it for you to upgrade your old 720p TV.

Depending on the size of the TV, how far away you sit when watching it, and what content you watch on it, you may not see that much of a benefit to upgrading. The most important thing is understanding what all the specifications you encounter mean and knowing what your needs are.

I’m not going to tell you that upgrading your TV is a bad decision, but it might not be the best bang for your buck.

Tim Wells