Uncalibrated Results
Brightness and Contrast Ratio
For the brightness, contrast, and color accuracy tests, we depend on a hardware colorimeter and software to calibrate the displays. As previously stated, we use a Monaco Optix XR (DTP-94) colorimeter with Monaco Optix XR Pro software; together they help users get more accurate color from their displays. Before we get to the calibrated results, we took a quick look at the range of brightness and contrast at stock settings while changing just the brightness level.
Without adjusting any of the default settings other than brightness, the maximum brightness level of the Samsung 245T reaches almost 320 nits -- as usual, more than most people need. At minimum brightness, we measure just over 100 nits. Further tuning of the color levels allows us to reach slightly lower/higher brightness levels, but the default range is sufficient. At maximum brightness, we measure a contrast ratio of nearly 1400:1 -- even without the dynamic contrast setting enabled. Reducing the brightness also reduces the contrast ratio, and it's interesting to note that black levels are reasonably stable at around 0.22 nits.
What happens when we enable Dynamic CR? Maximum brightness jumps to 350 nits while the black level rises slightly to 0.24 nits, resulting in a final contrast ratio of 1460:1. That's pretty close to the advertised 1500:1 contrast ratio, but the impact on Delta E is quite severe. Even after calibration -- a process that Dynamic CR appears to confuse -- we measured an average Delta E of 7.029. That's worse than the uncalibrated Delta E we'll see below, so we recommend turning off the dynamic contrast.
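For reference, the contrast ratio figures here are simply the ratio of white luminance to black luminance. Below is a minimal Python sketch; the exact meter readings aren't listed in the text, so the inputs are approximations consistent with the numbers above.

```python
# Static contrast ratio: peak white luminance divided by black luminance.
# Input values are approximations based on the review's figures, not the
# actual meter readings.

def contrast_ratio(white_nits: float, black_nits: float) -> float:
    """White-to-black luminance ratio of a display."""
    return white_nits / black_nits

# Default settings at maximum brightness:
print(f"{contrast_ratio(320, 0.23):.0f}:1")  # ~1391:1, i.e. "nearly 1400:1"

# Dynamic CR enabled:
print(f"{contrast_ratio(350, 0.24):.0f}:1")  # ~1458:1, close to the 1460:1 figure
```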
Color Accuracy
The problem with calibrating a display is that it doesn't help all applications. Specifically, the video overlay used when watching DVDs or other movies completely bypasses any color profiles, so you are stuck with the uncalibrated colors. Games likewise use the default color settings. It is possible to tweak things using the OSD, but the amount of color correction possible via the OSD pales in comparison to color correction tables. Ideally, we would like to see video drivers begin to apply color profiles to all output -- office applications, movies, games, and everything else.
For uncalibrated color accuracy, we adjust the brightness as well as the contrast and colors (where applicable) using a "calibrate by eye" chart and the OSD controls. Remember also that color accuracy can vary from panel to panel even within the same model; the results we report come from testing a single LCD. During testing, Monaco Optix XR Pro sends 24 color patches to the display while the colorimeter measures the resulting values. The difference between the requested colors and the actual colors shown on the LCD is Delta E, with lower values being better. Any score less than 1.0 is "perfect" -- the naked eye is not going to be able to tell the difference -- and scores less than 2.0 are nearly perfect.
Ideally, you would want all of the tested colors to have a Delta E of less than 1.0, but almost no one is likely to have problems with anything scoring below 2.0. From 2.0 to 4.0, most people still won't notice the slight inaccuracies in the color palette, but when comparing displays side-by-side, differences may be apparent -- multimedia professionals in particular would prefer better colors. Anything above 4.0 begins to represent a more significant deviance, and numerous scores above 6.0 will almost certainly be noticeable to anyone using the display. Consistency is also important: a display with very good scores overall but high spikes on some colors may actually be less desirable than a display with a slightly higher but more consistent average Delta E. Note also that fluctuations of as much as one point in Delta E are possible over a short period of time. It generally takes 30 minutes for a display to warm up, and we perform all of our calibration and testing after the displays have been running for at least one hour with the screensaver disabled.
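For those curious about the math, the classic CIE76 definition of Delta E is just the Euclidean distance between the requested and measured colors in Lab space. Monaco Optix XR Pro doesn't document exactly which Delta E variant it reports, so the sketch below is the textbook formula with made-up patch values, not the tool's internal code.

```python
import math

def delta_e_cie76(lab_ref, lab_measured):
    """CIE76 Delta E: Euclidean distance between two (L*, a*, b*) triples."""
    return math.sqrt(sum((r - m) ** 2 for r, m in zip(lab_ref, lab_measured)))

# Hypothetical patch: the software requests a mid gray, the display renders
# something slightly off.
requested = (50.0, 0.0, 0.0)
measured = (50.8, -0.6, 1.2)
print(f"Delta E: {delta_e_cie76(requested, measured):.2f}")  # ~1.56, below the 2.0 "nearly perfect" threshold
```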
Without any form of color correction, the Samsung 245T rates a very good -- especially relative to the competition -- 3.80, ranking second overall with a relatively large gap between it and the nearest competitors. The Acer still scored better, but that display had other concerns. The individual color scores are also very good, with the worst still only 6.0 and many others below 3.0. So far, this is the best overall display when it comes to color accuracy -- provided, of course, that you stay away from the Dynamic CR setting. It's not perfect, but after seeing so many LCDs with uncalibrated Delta E over 6.0, this is definitely a noteworthy achievement.
Comments
Owls - Thursday, February 7, 2008
I agree. The ads are highly intrusive. Any other sites people recommend?

GNStudios - Thursday, February 7, 2008
I read the review and got very interested in the monitor (I have a Samsung 215TW now). When browsing the internet I found many people complaining that it's very noisy. Is this true?
mattsaccount - Thursday, February 7, 2008
My parents bought one of these over Christmas. The monitor they received definitely emits a certain amount of noise, but none of us found it that distracting. You can barely hear it in normal use, and it's not an irritating high-pitched ring or anything.

JarredWalton - Thursday, February 7, 2008
I haven't noticed any noise from this particular unit, but that's pretty variable. Usually the noise comes from capacitors inside the chassis, so as best as I can tell it's luck of the draw.

kmmatney - Thursday, February 7, 2008
I'd be interested to see how my $299 Soyo 24" LCD compares. It uses a non-TN panel (MVA), and can be had from OfficeMax.

jimmy43 - Thursday, February 7, 2008
Well, I'm glad you guys talk about the different panel technologies to educate people; there is more to a monitor than just the size and refresh time. However, I'm wondering what is with the input lag taboo at these large sites? It's not too hard to measure, and it would complete your article so we don't have to go to independent reviewers to get a good idea of how laggy a monitor really is.

nevbie - Thursday, February 7, 2008
Agreed. Also, here is a reference to a review that tests input lag (as an example): http://www.tftcentral.co.uk/reviews/content/hazro_... Note that in many cases the input lag exceeds the response time that so many reviewers pay attention to.

Monitor reviews are very interesting, but so subjective...

Xbitlabs (www.xbitlabs.com) monitor reviews seem to have most of the measurements that I have seen in reviews, with the exception of input lag.
PS. If you review HP LP2065 (I hear S-IPS or MVA), I'll give you a virtual hug. =P
tayhimself - Thursday, February 7, 2008
Can the input lag be removed by disabling scaling, etc.? What causes input lag, and how is it measured? Thanks!

JarredWalton - Thursday, February 7, 2008
There are a few things to consider. First, how do you measure input lag? If you use two outputs on one GPU, they don't necessarily get identical content - you can get +/-1 frame difference due to refresh rates, internal buffering, etc. Using a splitter for a signal can do the same thing. So you have a margin of error of at least one frame. I've tested with varying techniques in the past and decided input lag wasn't a real issue... or at least not an issue you can easily fix just by changing LCDs. CRTs may be better in this area, but I'm even less willing to go back to using a cumbersome CRT.

The real issues with image lag are more complex. You have things like double (or even triple) buffering that add one or two frames of lag. Then technologies like SLI and CrossFire add at least one frame of lag when doing AFR (the most common mode), and triple and quad solutions using AFR could add up to three frames of lag internally... and no one seems to worry about that. (I asked NVIDIA and ATI about this in the past, and their response was something along the lines of "you don't actually think anyone can notice the 0.02s delay, do you!?")
I tend to agree, at least for *most* people. Despite what many would like to think, our eyes really don't react quickly enough to notice differences of a couple hundredths of a second. If I ever encounter an LCD where I notice a problem with input lag, I'll make a note of it, but I haven't yet - even with the much-maligned 2407WFP.
I suppose professional gamers might have more issue with input lag, but then there are multiple sources of lag they need to try to reduce. There are lots of things that most people just live with and don't notice - image tearing because VSYNC is off, lag because you can't afford a $2000 CPU+GPU setup, lag at your input device (mouse/keyboard), running on a 19" LCD instead of 30".... Internal image lag in an LCD is one of these things in my book.
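For perspective, here's a quick sketch of the frame-to-milliseconds arithmetic (assuming a 60Hz refresh rate; the frame counts are the ones mentioned above):

```python
# One frame of lag at a given refresh rate, expressed in milliseconds.
REFRESH_HZ = 60
frame_ms = 1000 / REFRESH_HZ  # ~16.7 ms per frame at 60 Hz

# Lag sources and their frame counts, per the discussion above.
sources = {
    "measurement margin (+/- 1 frame)": 1,
    "double buffering": 1,
    "triple buffering": 2,
    "AFR (two-GPU SLI/CrossFire)": 1,
    "AFR (quad-GPU)": 3,
}

for name, frames in sources.items():
    print(f"{name}: {frames} frame(s) = {frames * frame_ms:.1f} ms")

# A single frame at 60 Hz is ~16.7 ms -- roughly the "0.02s delay" quoted above.
```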
lyeoh - Tuesday, February 12, 2008
Please do more useful reviews of monitors. Input lag is an issue with non-CRT monitors. In fact, significant input lag is a _showstopper_ for many people (even if they didn't know of such a thing till they experienced it :) ).
I personally don't care about lags of 10-15ms but some LCD panels have been _tested_ and _documented_ by many to have lags of >50ms, and that is VERY SIGNIFICANT.
Go search youtube for input lag if you don't believe there are monitors with significant lag.
I have walked into a shop that was selling panel TVs, and even the shopkeeper noticed the lag when I pointed it out; that screen had terrible lag (my guess is at least 100-200ms). Imagine playing Tekken on that and not seeing your opponent's move till 100ms after it has occurred...
Even a nonpro gamer playing Counterstrike or other FPS will find it annoying that he keeps getting shot by someone peeking round a wall/corner before he even gets to see that person. Games like Guild Wars allow some players to interrupt skills if you do things in time. Every millisecond counts. If your round trip ping is 100ms and your reflexes are 250ms, you can easily interrupt (with a 0.25 sec interrupt skill) opponent skills that take 0.75 seconds to cast (assuming the game adds 100ms max). If the panel is too slow, what used to be easy with a faster LCD/CRT becomes difficult if not impossible to do reliably.
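As a rough sketch of that timing budget (the display lag figures below are assumptions; the rest are the numbers above):

```python
# Can an interrupt land before the opponent's cast completes? The chain is:
# see the cast on screen (display lag), react (reflexes), send the command
# (round-trip ping), then your own interrupt skill must finish casting.

def can_interrupt(cast_ms, display_lag_ms, reflex_ms, ping_ms, interrupt_ms):
    total = display_lag_ms + reflex_ms + ping_ms + interrupt_ms
    return total <= cast_ms

# Fast panel (~15 ms assumed lag): 15 + 250 + 100 + 250 = 615 ms vs. 750 ms cast.
print(can_interrupt(750, 15, 250, 100, 250))   # True, with margin to spare

# Slow panel (~150 ms assumed lag): 150 + 250 + 100 + 250 = 750 ms.
print(can_interrupt(750, 150, 250, 100, 250))  # True, but with zero margin left
```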
Gamers might be able to tolerate colours not being so good, and even a few dead pixels (actually a dead pixel in the exact center makes it good for some games as a built-in crosshair ;) ), but high input lag badly affects the gaming experience far more.
As for the two outputs having a difference, just use a card which doesn't (you can check with CRTs). To be rigorous, you can always swap the outputs to confirm the results.
I'm sure you can think of ways of measuring input lag. Some people use a chronometer/stopwatch displaying on both the screen being tested and a CRT, and then take a few pictures of it with a decent camera.
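A minimal sketch of that method: read the timer off both screens in each photo and average the differences (the readings below are made up for illustration):

```python
from statistics import mean

# (crt_ms, lcd_ms) millisecond-timer readings taken from each photo of the
# two displays showing the same stopwatch. Values are illustrative only.
photos = [
    (10234, 10201),
    (11587, 11542),
    (12901, 12868),
]

lags = [crt - lcd for crt, lcd in photos]
print(f"Estimated input lag: {mean(lags):.0f} ms over {len(photos)} photos")  # ~37 ms
```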
The rest of your post about double/triple buffering, etc., is not relevant - it has little to do with a monitor review.
You can go measure system latency in a different review - PC, video card, game, or even CPU review. It might be quite interesting, given that a cache miss in modern CPUs can waste a lot of cycles. A CPU might perform well in throughput, but when there is an unexpected change it might take a while to reach top speed again. My guess is that the time scales of a CPU make it unlikely that the latencies would reach the order of many milliseconds, but who knows...