Your Universal Remote Control Center
Displays & Projectors Forum - View Post
Original thread:
Post 1 made on Tuesday January 12, 2010 at 00:17
Daniel Tonks
Wrangler of Remotes
October 1998
So which is better - 1280x720 or 1920x1080 when shown on a 1366x768 screen? I've heard stories that some cheap deinterlacers will throw away one entire field of a 1080i signal and scale the result onto a 720p screen - giving the equivalent of 1280x540 resolution - but what if the deinterlacer is better? Is the potential for an extra 48 pixels of vertical and 86 pixels of horizontal resolution worth it?
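For anyone curious what that "cheap deinterlacer" path looks like, here's a minimal sketch in Python. It's purely illustrative - the function names are made up and real scalers filter the pixels rather than nearest-neighbour duplicating them - but it shows why discarding a field leaves only 540 lines of real detail no matter what height you stretch it to:

```python
# Sketch of a field-drop deinterlacer (hypothetical helper names;
# real hardware filters pixel data instead of copying rows).

def drop_field_deinterlace(frame_1080i):
    """Keep only the even-numbered scan lines (one 540-line field)."""
    return frame_1080i[0::2]  # 1080 lines -> 540 lines

def scale_vertically(lines, target_height):
    """Nearest-neighbour vertical scale (illustration only)."""
    src_height = len(lines)
    return [lines[i * src_height // target_height] for i in range(target_height)]

# A 1080i frame represented as 1080 rows of 1920 pixels (contents irrelevant)
frame = [[0] * 1920 for _ in range(1080)]

field = drop_field_deinterlace(frame)   # 540 rows of real detail
output = scale_vertically(field, 720)   # stretched to fill a 720p panel

print(len(field), len(output))  # 540 720
```

The output has 720 rows, but every row is just a copy of one of the 540 surviving lines - hence "1280x540 equivalent."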

Last night I swapped cable boxes in my bedroom system, which uses a 42" Philips LCD, 1366x768 resolution, about 4 years old now. For the first time I decided to see if there was any difference between 1080i output and 720p output (using HDMI). I tuned to one of the high def news channels with loads of tiny on-screen text and kept switching back and forth between the two, trying to observe single pixel differences.

I didn't see any actual changes in resolution. Detail levels were about the same. What I did notice - and this was with my nose pressed against the screen - is that in 1080i the set actually has soft flickering on edge detail pixels, even on a paused screen, almost like you might see on an actual interlaced screen. Obviously the deinterlacer is doing something odd even on static elements... like alternating between the two different fields or something.
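If the set really is alternating fields ("bob" deinterlacing), the flicker on static edges follows directly: each field gets line-doubled in turn, and the two reconstructions disagree wherever there's a sharp horizontal edge. A toy sketch (my guess at the mechanism, not the Philips set's actual algorithm):

```python
# Sketch of "bob" deinterlacing: each field is line-doubled to full
# height in turn, so a static image yields two alternating outputs.
# (Hypothetical illustration, not the actual scaler in the set.)

def bob(frame, field):
    """Take field 0 (even lines) or field 1 (odd lines), line-double it."""
    doubled = []
    for row in frame[field::2]:
        doubled.extend([row, row])  # naive line doubling
    return doubled

# A static 8-line frame with a sharp horizontal edge after row 2
frame = [[0]] * 3 + [[255]] * 5

even = bob(frame, 0)
odd = bob(frame, 1)

# The two reconstructions differ near the edge, which shows up as
# per-refresh shimmer on edge pixels even though the source is static.
print(even == odd)  # False
```

On a paused screen that alternation would look exactly like soft flicker confined to edge detail, which matches what I saw.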

However in 720p mode the image is rock solid with no movement, as you would expect from an LCD.

So after 4 years of running that set in 1080i mode, I'm switching it to 720p.