Does Size Matter?
1080p, 1080i, 720p, 480i. When it comes to video, size can matter. A large screen and a high pixel count usually equate to a better image. With the advent of high-definition (HD) TVs and HD programming, people have come to expect a better-quality image on both the big and small screens. Gaming consoles output Blu-ray-quality graphics, and more and more websites like YouTube and Vimeo offer HD viewing options. More to the point, inexpensive cameras and phones can now film in HD. So a better question would be: when does size matter?
First, let’s break down how this all works.
We view video and films through digital signals. HD signals are much larger and offer much more clarity. Plasma TVs, LCD and LED-LCD TVs, computers and smartphones are driving viewer expectations. The blue box below represents a standard-definition (SD) digital signal.
Films are shot anywhere from 2K to 4K. Look at how much larger the 4K signal is compared to an SD signal (the blue VCD box below).
Videos downscale well, but they don’t upscale with clarity. So what does this mean? Simply put, a larger video signal with more information will look good on larger screens and anything smaller. Conversely, a smaller image will only look good on a smaller screen. It doesn’t have enough information to upscale properly.
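The downscale/upscale asymmetry can be sketched with a toy example. This is a minimal illustration, not how real video scalers work: a tiny grayscale grid stands in for a frame, downscaling averages real samples together, and upscaling can only duplicate the pixels it already has.

```python
# Sketch: why downscaling keeps detail but upscaling can't add it.
# A tiny grayscale "image" (lists of brightness values) stands in for a video frame.

def downscale_2x(img):
    """Average each 2x2 block into one pixel: four real samples inform each output pixel."""
    h, w = len(img), len(img[0])
    return [
        [(img[r][c] + img[r][c + 1] + img[r + 1][c] + img[r + 1][c + 1]) // 4
         for c in range(0, w, 2)]
        for r in range(0, h, 2)
    ]

def upscale_2x(img):
    """Duplicate each pixel into a 2x2 block: no new information, just bigger blocks."""
    out = []
    for row in img:
        wide = [p for p in row for _ in (0, 1)]
        out.append(wide)
        out.append(list(wide))
    return out

frame = [
    [10, 20, 30, 40],
    [50, 60, 70, 80],
    [90, 100, 110, 120],
    [130, 140, 150, 160],
]

small = downscale_2x(frame)   # 2x2: each pixel summarizes real detail
big = upscale_2x(small)       # back to 4x4, but the fine detail is gone
print(small)    # [[35, 55], [115, 135]]
print(big[0])   # [35, 35, 55, 55] -- duplicated pixels, not recovered detail
```

Every pixel in the downscaled image is built from real data, while the upscaled image is just the small image with each pixel blown up into a block. That blockiness is exactly what you see when an SD signal is stretched across an HD screen.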
Why is that?
A video signal is made up of thousands of pixels (dots), and each pixel's color is described by a number of bits. A standard-def video signal might have a screen size of 640×480 (about 307,000 pixels) with 8-bit color, which does not upscale well to a TV designed to host a 24-bit signal and more than six times as many pixels. This is the reason many older films have been remastered at a higher quality.
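The gap between those two signals is easy to put in numbers. The resolutions and bit depths below are the common published values (640×480 SD, 1920×1080 HD, 8-bit vs. 24-bit color), used here purely for back-of-the-envelope arithmetic:

```python
# Sketch: comparing the raw information in an SD signal vs. a 1080p HD signal.

def signal_info(width, height, bits_per_pixel):
    """Return the pixel count and the number of representable colors."""
    pixels = width * height
    colors = 2 ** bits_per_pixel
    return pixels, colors

sd_pixels, sd_colors = signal_info(640, 480, 8)      # classic SD, 8-bit color
hd_pixels, hd_colors = signal_info(1920, 1080, 24)   # 1080p, 24-bit "true color"

print(sd_pixels)                 # 307200
print(hd_pixels)                 # 2073600
print(hd_pixels / sd_pixels)     # 6.75 -- the HD frame has ~6.75x the pixels
print(hd_colors // sd_colors)    # 65536 -- 24-bit color has 65,536x more shades
```

So an upscaler has to invent roughly 1.7 million pixels that were never in the SD signal, which is why the result looks soft.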
Note the difference in image quality below, based on the number of pixels represented on screens with different native resolutions. This is an SD signal, and it does not upscale well to HD.
If you are watching an older TV program on your 1080p HD television, it will look something like the image above on the right. It's not the TV, it's the signal being sent to it. So, going back to the original question, when does size matter?
When you’re buying a new Television
You need to consider what you watch. Do you watch primarily network television and use a Wii? A 720p TV might be good enough for you. If you watch HD cable, satellite, or Blu-ray movies, and/or game with a newer gaming system, you might want a 1080p television. Companies, however, are starting to make 4K TVs, while many movie theaters still don't have 4K projectors. Sharp has demonstrated an 8K television in Europe (goodbye theaters).
The only problem with 4K and 8K TVs is that there aren't any broadcast signals available for these TVs (yet). 1080i (the i stands for interlaced: each frame is sent as two alternating fields, one carrying the odd-numbered lines and one carrying the even-numbered lines, so only half the lines arrive at any one moment) is the largest signal output through today's TV bandwidth, but you can stream a 1080p signal (the p stands for progressive: every line of the frame is delivered at once) from various devices and sources (Apple TV, Google TV, etc.). I would advise making sure your new TV has a refresh rate of at least 120 Hz and a high contrast ratio. I like plasmas for this reason. They tend to have a refresh rate of about 600 Hz (great for watching sports) and the highest contrast ratios (great for watching films). LED LCDs with a 240 Hz refresh rate would be my next choice. They output more light and do well in bright rooms.
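The interlaced-versus-progressive distinction can be sketched in a few lines. This is a toy model, not a real deinterlacer: rows of a list stand in for the 1,080 scan lines, and "weaving" the two fields back together is just one of several deinterlacing strategies real TVs use.

```python
# Sketch: how an interlaced (1080i) signal splits a frame into two fields.

def split_fields(frame):
    """An interlaced signal sends the even-indexed lines in one field, odd in the next."""
    field_a = frame[0::2]   # lines 0, 2, 4, ...
    field_b = frame[1::2]   # lines 1, 3, 5, ...
    return field_a, field_b

def weave(field_a, field_b):
    """A simple deinterlacer can weave the two fields back into a progressive frame."""
    frame = []
    for a, b in zip(field_a, field_b):
        frame.extend([a, b])
    return frame

frame = [f"line {n}" for n in range(6)]   # stand-in for 1,080 scan lines
a, b = split_fields(frame)
print(a)                      # ['line 0', 'line 2', 'line 4']
print(len(a), len(b))         # 3 3 -- each field carries only half the lines
print(weave(a, b) == frame)   # True -- together the fields rebuild the frame
```

The catch in real video is that the two fields are captured at slightly different moments, so weaving them together produces combing artifacts on motion; that temporal gap is why a 1080i broadcast never quite matches true 1080p.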
When you’re shooting a video or film production
Let’s face it: you want to record, edit and output at the highest resolution possible, for all of the reasons listed above. It’s easier to downscale than upscale. Filmmakers are shooting in 4K and 8K even for productions meant solely for the web. And if you can’t shoot at a high resolution, just focus on good composition, audio and storytelling. That is the root of any good film or program.