Videography FAQ: What are 8K, 4K, and Full HD? How Do I Use Them?
With the rise of video-based streaming and social media platforms, video creation has become more of a part of our lives than ever. While interchangeable lens cameras were first developed to take still images, it is now the norm for them to have video recording capabilities too—and record at increasingly high resolutions. What are resolutions like 4K and 8K really, and why and when do they matter? This is what we will explore in this article.
The basics of video resolution
The origin of resolution standards in video
In still photography, there are no real fixed standards on the pixel resolution of the images you shoot. It varies from camera to camera depending on the image sensor. Even if you print the images, there may be standard paper sizes such as 3R or A4, but there are no benchmarks that say that an A4 print must be printed at 300 dots per inch to conform to A4.
It is different for video production. Video originated in cinema and television broadcasting, where display resolution standards developed early on to ensure technical consistency throughout the industry. For example, regardless of the actual screen size, all television sets that cater to a particular standard will display images using the same number of pixels. Therefore, a term like “Full HD” doesn’t just mean a very high-definition video; it specifically refers to a video resolution of 1920×1080 pixels.
Full HD is currently the most widely supported resolution. So what about 4K and above?
Currently, digital television sets are becoming mainstream, and these generally have a 1920×1080-pixel display resolution (aspect ratio: 16:9). This resolution conforms to the Full HD (Full-spec High Definition) format and may also be referred to as “1080p”. You may have noticed that it is also supported by many computer and laptop monitors, as well as by the Blu-ray Disc storage format. With so many Full HD playback platforms all around us, you could even argue that we don’t really need any higher resolution.
However, television sets and computer monitors increasingly support 4K display, which offers four times the pixel count of Full HD. Video is also a medium that people often revisit years after it was recorded. For that reason, recording in a higher resolution future-proofs footage, ensuring that the memories recorded in it can be played back clearly for a longer period.
What are the current main video resolution formats?
Surprising fact: 4K DCI is only 8.84 megapixels!
| Format | Horizontal pixels | Vertical pixels | Approx. total pixels | Typical uses |
| --- | --- | --- | --- | --- |
| SD | 720 | 480 | 0.34MP | Analogue TV, DVD videos |
| HD | 1280 | 720 | 0.92MP | Online streaming |
| FHD | 1920 | 1080 | 2.07MP | Digital TV, Blu-ray, online streaming |
| 2K | 2048 | 1080 | 2.21MP | Digital cinema |
| 4K UHD | 3840 | 2160 | 8.29MP | Television, online streaming, 4K Ultra HD Blu-ray |
| 4K DCI | 4096 | 2160 | 8.84MP | Digital cinema |
| 8K UHD | 7680 | 4320 | 33.17MP | Raw footage, archival footage, medical and research purposes |
| 8K DCI | 8192 | 4320 | 35.38MP | Raw footage, archival footage, medical and research purposes |
You may have noticed how small these video resolutions are compared to photography resolutions. Even 4K video, which is considered extremely high quality, has a resolution of only around 8 megapixels! In other words, after taking the different aspect ratios into account, a camera with around 10 megapixels is all you need to shoot 4K. Nowadays, even entry-level cameras have a resolution of around 20 megapixels, so 4K video capability has become quite common.
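As a quick sanity check of the megapixel figures in the table above: each value is simply the horizontal pixel count multiplied by the vertical pixel count, divided by one million. Here is a minimal Python sketch of that arithmetic (any small differences from the table are just rounding):

```python
# Approximate megapixel count for the video resolution standards in the table above.
RESOLUTIONS = {
    "FHD":    (1920, 1080),
    "4K UHD": (3840, 2160),
    "4K DCI": (4096, 2160),
    "8K UHD": (7680, 4320),
    "8K DCI": (8192, 4320),
}

for name, (width, height) in RESOLUTIONS.items():
    megapixels = width * height / 1_000_000  # one megapixel = one million pixels
    print(f"{name}: {width} x {height} = {megapixels:.2f} MP")
```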
Remember: A higher resolution means more data to deal with
If your camera can do it, shooting in a higher resolution than required for your purposes increases your post-production flexibility: you have more leeway to crop frames, and also more pixels to pull information from, which can give you better results in editing processes like colour grading.
However, handling all that extra data requires more computing power and storage capacity. Don't forget to consider those when you decide which resolution to shoot in!
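To get a feel for how quickly the data grows with resolution, here is a rough, illustrative sketch of uncompressed data rates. The assumptions (8-bit 4:2:0 sampling, 25 frames per second, no compression) are ours purely for the sake of the arithmetic; real recordings are compressed and their bitrates depend on the codec and settings, but the relative jump between resolutions is the point:

```python
# Rough uncompressed data-rate estimate (illustrative assumptions, not real recording bitrates).
BYTES_PER_PIXEL = 1.5   # assumed 8-bit 4:2:0 chroma subsampling (1.5 bytes per pixel on average)
FPS = 25                # assumed frame rate

for name, (w, h) in {"FHD": (1920, 1080),
                     "4K UHD": (3840, 2160),
                     "8K UHD": (7680, 4320)}.items():
    mb_per_second = w * h * BYTES_PER_PIXEL * FPS / 1_000_000
    print(f"{name}: ~{mb_per_second:,.0f} MB/s uncompressed")
```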
Also see:
From Filming to Editing: Should You Still Be Shooting in 1080?
Related but not the same: Super 35mm versus full-frame sensors. Find out more in:
6 Things About Cinema Cameras that Serious Video Creators Should Know
What’s the difference between DCI and UHD?
The “K” in 4K stands for “kilo”, which denotes multiplication by 1000. Therefore, “4K” generically refers to a resolution that is around 4000 pixels wide. There are different 4K resolution standards, but the most widely used are currently 4K UHD and 4K DCI.
4K DCI: Origins in cinema
The 4K DCI standard (4096×2160) was established in 2005 by Digital Cinema Initiatives, which is an organization formed by major motion picture studios to set standards in the cinema industry. It has an aspect ratio of 1.9:1, which is consistent with the 2K (2048×1080) standard that was commonly used in cinema projectors before 4K existed.
4K UHD: Origins in television broadcasting
The 4K UHD (Ultra High Definition) standard came about much later, when the television broadcast industry was considering broadcasting in 4K. It was set by the International Telecommunication Union, with dimensions of 3840×2160. Like Full HD, it has a 16:9 aspect ratio, which made it compatible with existing television sets.
What the differences mean for video recording
If you want your video to be viewable on a 4K television set, a camera that records in 4K UHD should be sufficient. However, 4K DCI has a higher horizontal resolution than 4K UHD, so a camera that can record 4K DCI can capture not just 4K suitable for cinema, but also high-quality 4K UHD fit for 4K television: because both standards share the same vertical resolution, the UHD frame is obtained simply by trimming the extra horizontal pixels, with no loss of quality.
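The arithmetic behind that crop is simple: 4K DCI and 4K UHD share the same height, so only 256 horizontal pixels (128 on each side, for a centred crop) are discarded. A small sketch, using a hypothetical helper function rather than any particular camera or editing tool:

```python
def centre_crop_offsets(src_w, src_h, dst_w, dst_h):
    """Return the (x, y) offset of a centred crop window inside the source frame."""
    if dst_w > src_w or dst_h > src_h:
        raise ValueError("Target frame is larger than the source; cropping cannot upscale.")
    return (src_w - dst_w) // 2, (src_h - dst_h) // 2

# Cropping a 4K DCI frame (4096x2160) down to 4K UHD (3840x2160):
x, y = centre_crop_offsets(4096, 2160, 3840, 2160)
print(x, y)  # 128 0 -> trim 128 pixels from each side; no scaling, so no loss of quality
```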
8K video: A resolution for future proofing
Like 4K, there are two main standards for 8K: 8K UHD and 8K DCI. However, unlike 4K, 8K monitors are not yet available to the mainstream consumer. It will be a while before the average user can view their own 8K videos in full glory.
What’s so amazing about 8K?
The appeal of 8K lies in its greater realism, made possible by the detail captured at its higher resolution. Humans can see things in three dimensions due to a few factors, including:
- Binocular disparity: Differences between the images viewed by the right and left eyes
- Dynamic range: The range of tones from highlights to shadows that can be perceived
- Definition: The ability to perceive fine details, which is tied to the pixel resolution of an image
An 8K image is around 35 megapixels, which is said to provide close to the maximum definition perceivable by human vision. In that sense, viewing an 8K video in its full resolution is supposed to provide a very realistic sense of dimensionality even though the footage is not 3D.
With 8K broadcast still in the trial stage, 8K video is currently used mainly for archival purposes (e.g., to document cultural and natural heritage) and medical research. 8K footage can be used as raw material for 4K (or lower) video productions as it can be cropped to those resolutions in post-production to create digital zooming, panning, and sliding effects with minimal image quality deterioration.
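One way to picture the digital pan effect is to slide a 4K-sized window across the larger 8K frame over time. The sketch below is purely illustrative: the linear motion and the frame count are assumptions, not a description of any particular editing software:

```python
# Illustrative digital pan: slide a 4K UHD window horizontally across an 8K UHD frame.
SRC_W, SRC_H = 7680, 4320    # 8K UHD source frame
WIN_W, WIN_H = 3840, 2160    # 4K UHD output window
NUM_FRAMES = 5               # assumed, for illustration; a real pan spans many more frames

max_x = SRC_W - WIN_W        # furthest the window can slide without leaving the frame
y = (SRC_H - WIN_H) // 2     # keep the window vertically centred
for i in range(NUM_FRAMES):
    x = round(i * max_x / (NUM_FRAMES - 1))   # linear pan from the left edge to the right edge
    print(f"frame {i}: crop window top-left corner at ({x}, {y})")
```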
You may also be interested in:
EOS C70, R5 C, R5 or R3: Which to Get for Video?
In summary
- Full HD, 4K DCI/UHD, and 8K DCI/UHD are all display resolution (definition) standards. Each refers to an image format consisting of a specific number of horizontal and vertical pixels.
- 4K DCI and 8K DCI are cinematography standards, and have a 1.9:1 aspect ratio.
- Full HD, 4K UHD, and 8K UHD are television broadcast standards and have a 16:9 aspect ratio.
- Currently, Full HD is the most widely supported standard by consumer display devices, and support for 4K is growing.
- 8K promises great realism due to its ultra-high definition, said to be close to the limits of human perception.
We demystify more video jargon in:
What is IPB/Long GOP and ALL-I/Intra-frame?
What do 4:2:2 and 4:2:0 mean?