Every TV we review is put through the same set of tests to gauge its picture and audio performance, usability and smart features.
Tests are carried out over a number of days, with updates added to the review following any firmware updates that bring new features. The exact process is broken down below.
Our testing environment
We use a combination of bespoke testing and real-world environments to test TVs, feeding content to each set via an array of 4K Blu-ray players, streaming players, game consoles and set-top boxes, as well as connecting other external devices (soundbars, surround systems) to gain perspective on how they work together.
Our screen tests are done by eye but supported with technical measurements.
Testing by eye involves an expert watching a wide range of material. We also – where possible and relevant – perform side-by-side comparisons to similarly priced rivals. This allows us to spot obvious differences in performance.
Brightness is tested in varying conditions – during the day and at night; with the lights on and off. This tests the native brightness output of a screen to see if it’s sufficient in real-life conditions, and also checks whether anti-reflection filters are installed and how effective they are.
Resolution: pictures should be crisp and well defined, but not overly sharpened. We pay attention to the picture’s sharpness at various resolutions, to test the TV’s native and upscaled detail handling.
Colours should be vibrant but true to life, and for this we pay particular attention to nature scenes – grass and sea in nature documentaries – and skin tones. We also pay attention to the subtlety of shading, especially in tricky contours.
Contrast levels are the differences between the brightest highlights and the deepest shadows. We pay attention to how deep the blacks go, and whether they are grey or crush detail. We look at the brightest areas to see if details are bleached out.
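The contrast judgments above come down to the gap between a screen's brightest highlight and its deepest black. As a minimal sketch (not Trusted Reviews' actual tooling, and with hypothetical readings), this is how a contrast ratio falls out of two luminance measurements in nits:

```python
# Sketch of deriving a contrast ratio from two luminance readings in nits.
# The function name and example values are illustrative assumptions.

def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Ratio between the brightest highlight and the deepest black.

    A black level of 0 nits (e.g. an OLED pixel switched fully off)
    gives an effectively infinite ratio, reported as float('inf').
    """
    if black_nits <= 0:
        return float("inf")
    return peak_nits / black_nits

# Hypothetical readings: a 1000-nit highlight against a 0.05-nit black
# works out to a 20,000:1 contrast ratio.
print(contrast_ratio(1000.0, 0.05))  # 20000.0
```

A deep black that measures grey (a higher nit floor) drags this ratio down, which is why black level gets so much attention by eye before any meter comes out.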
Uniformity is another issue, as TVs are lit in different ways. We pay attention to light pollution, such as light bleed and halo effects. We view from different angles to find the display’s weak areas.
Motion handling is judged by looking at the amount of judder and motion blur with all motion processing turned off, and then with it turned on at varying intensities. This shows how a display natively handles moving images, fast and slow, and how effective the corrective processing is.
After tests are done by eye, we take some technical measurements to check for peak brightness rating (measured in nits) and input lag.
Video material: SD and HD broadcast signals, assorted DVDs, Blu-rays and 4K Blu-rays, plus video streams at various resolutions in standard dynamic range and HDR on Netflix, Amazon Video, Now, Disney+ and Apple TV+.
Technical test equipment
We use the following video players in our tests: Panasonic DP-UB820EB 4K Blu-ray player, Amazon Fire TV Cube, Amazon Fire TV, Apple TV 4K.
Brightness and colour are measured with an X-Rite i1 Display Pro Plus, a professional colour calibrator.
Input lag for HD gaming at 60Hz is measured with a Leo Bodnar input lag tester, while input lag at 4K resolutions is tested with Leo Bodnar’s 4K HDMI signal lag tester.
These are for reference only – we first decide by eye whether a TV is bright enough in various lighting conditions, and we play games to manually feel if latency is an issue.
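To make sense of a raw lag reading, it helps to translate milliseconds into frames of delay at the display's refresh rate. This is our own rough framing of the arithmetic, not a method published by Leo Bodnar:

```python
# Convert an input-lag reading (ms) into frames of delay at a given
# refresh rate. The 33.4 ms example reading is hypothetical.

def lag_in_frames(lag_ms: float, refresh_hz: float = 60.0) -> float:
    """Number of frames a measured lag corresponds to."""
    frame_time_ms = 1000.0 / refresh_hz  # one frame lasts ~16.7 ms at 60Hz
    return lag_ms / frame_time_ms

# A hypothetical 33.4 ms reading is roughly two frames behind at 60Hz.
print(round(lag_in_frames(33.4), 2))
```

A reading near one frame of delay is generally hard to feel with a controller in hand, which is why the by-eye (and by-thumb) check still comes first.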
What’s on the inside is as important as how a TV performs, since the internal hardware decides what a TV is capable of.
We study specs sheets and test a TV’s sockets to see if it is up to spec with similarly priced rivals. For instance, does it have the same number of HDMI ports, and are those ports capable of transmitting the same amount of data? We play video through different sockets to make sure they play correctly.
There is also the issue of format compatibility. If a manufacturer claims to be compatible with a particular video format, we’ll check that. If they promise an update is coming, we’ll check that too. We also take note of whether a product’s compatible formats are in line with trends. If the industry is adopting a particular standard and a product doesn’t support that, we’ll make note of it.
Most TVs these days come with a smart operating interface. The first way we judge these is to see which apps and streaming services are included, and to check whether there are plans to release missing apps in the near future. We also check whether these apps are 4K- and HDR-compatible.
Beyond that, we check a TV’s software performance by manually hopping between apps, settings and channels. This is usually enough to make slower TVs stutter, while faster TVs tend to be more responsive.
We put a lot of emphasis on a TV’s accessories, such as the remote control. This is important as it’s what you interact with, every time you use the TV. Some TVs ship with multiple remote controls, some have a single unit with extra features, while others are simple things with premium build quality. These are not scored individually but they do contribute to the overall usability of a TV.
Scoring and verdict
After all the tests are complete, we score the TV using the criteria outlined here. We first check whether the TV’s performance matches the manufacturer’s claims, and that all the features work as expected and advertised.
We look at how it compares to other similar products, if it’s missing any vital features and whether it impresses as a whole.
Value is a consideration during scoring, too. If a competing product offers equivalent features or performance for less money, this will affect the score. Equally, if a device is only slightly more expensive but performs significantly better, we’ll score accordingly.
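That value judgment can be pictured as a simple score-per-pound comparison. This is purely an illustrative sketch with made-up numbers, not a formula used in the actual scoring:

```python
# Hypothetical illustration of comparing two sets on score per pound.
# Prices, scores and the function itself are assumptions for the example.

def better_value(price_a: float, score_a: float,
                 price_b: float, score_b: float) -> str:
    """Return 'A' or 'B' for whichever set delivers more score per pound."""
    return "A" if score_a / price_a >= score_b / price_b else "B"

# A £900 set scoring 8 beats a £1000 set scoring 8:
# same performance for less money.
print(better_value(900, 8, 1000, 8))  # A
```

In practice the call is never this mechanical, since a slightly pricier set that performs significantly better can still come out ahead.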
If we’ve missed anything that you think we should cover, get in touch with us on Twitter @TrustedReviews or you can email the editor at firstname.lastname@example.org.
The post How we test TVs appeared first on Trusted Reviews.