Displays & Projectors Forum - View Post
720p vs 1080i on a 768p screen
This thread has 2 replies. Displaying all posts.
Post 1 made on Tuesday January 12, 2010 at 00:17
Daniel Tonks
Wrangler of Remotes
October 1998
So which is better - 1280x720 or 1920x1080 - when shown on a 1366x768 screen? I've heard stories about how some cheap deinterlacers will throw away one entire field of a 1080i signal and scale the result on a 720p screen, giving the equivalent of 1280x540 resolution. But what if the deinterlacer is better? Is the potential for an extra 48 pixels of vertical and 86 pixels of horizontal resolution worth it?
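To put numbers on that cheap-deinterlacer scenario, here's a rough sketch (the function and parameter names are mine for illustration, not any real chip's behavior):

```python
# Sketch of the vertical resolution a scaler actually has to work with.
# "drops_field" models the cheap deinterlacer described above; all
# names here are illustrative, not any real chip's API.

def effective_source_lines(signal_lines: int, interlaced: bool,
                           drops_field: bool) -> int:
    """Lines of real picture detail that survive deinterlacing."""
    if interlaced and drops_field:
        return signal_lines // 2  # one field discarded outright
    return signal_lines

print(effective_source_lines(1080, True, True))   # 540 - worse than 720p
print(effective_source_lines(1080, True, False))  # 1080 - better deinterlacer
print(effective_source_lines(720, False, False))  # 720 - no deinterlacing needed
```

So with a field-dropping deinterlacer, the 1080i input actually starts from fewer lines than a straight 720p feed before it even reaches the scaler.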

Last night I swapped cable boxes in my bedroom system, which uses a 42" Philips LCD, 1366x768 resolution, about 4 years old now. For the first time I decided to see if there was any difference between 1080i output and 720p output (using HDMI). I tuned to one of the high def news channels with loads of tiny on-screen text and kept switching back and forth between the two, trying to observe single pixel differences.

I didn't see any actual changes in resolution. Detail levels were about the same. What I did notice - and this was with my nose pressed against the screen - is that in 1080i the set actually has soft flickering on edge detail pixels, even on a paused screen, almost like you might see on an actual interlaced screen. Obviously the deinterlacer is doing something odd even on static elements... like alternating between the two different fields or something.
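That edge shimmer on a static image is consistent with simple "bob" deinterlacing, where each field is line-doubled on its own, so any detail that exists in only one field blinks at field rate. A toy illustration (not a claim about what the Philips actually does):

```python
# Toy "bob" deinterlacing: rebuild a full frame from a single field by
# repeating each field line. A static one-line detail that lives only
# in the even field then appears and disappears at field rate.

def bob(field):
    """Line-double one field into a full-height frame."""
    frame = []
    for line in field:
        frame.extend([line, line])
    return frame

even_field = ['black', 'WHITE', 'black']  # source lines 0, 2, 4
odd_field  = ['black', 'black', 'black']  # source lines 1, 3, 5

print(bob(even_field))  # WHITE fills two rows...
print(bob(odd_field))   # ...then vanishes: flicker on static edges
```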

However in 720p mode the image is rock solid with no movement, as you would expect from an LCD.

So after 4 years running that set in 1080i mode, I'm switching it to 720p.
Post 2 made on Tuesday January 12, 2010 at 12:37
Ernie Bornn-Gilman
Yes, That Ernie!
December 2001
It has always made sense to me that a 720 signal sent to a 720 screen should look better than a 1080 signal, i or p, sent to a 720 screen, especially if the original video is not 1080i or 1080p. No matter what else, 1080 to a 720 screen requires an extra conversion step.

One detail, too, please: when you have a screen with 768 lines, are all of those lines actually visible? Old NTSC signals carried 525 scan lines per frame (at roughly 30 frames per second), which we called 525 lines, but a small amount of time in each field was blanking rather than picture...we call those signals 480i today, implying that 45 of those lines weren't actually visible.

Are all 720 lines viewed? Are all 768 lines viewed?

I'd think converting 720 lines of video signal to 768 lines of display would be a difficult conversion.
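For what it's worth, the 720-to-768 step works out to an awkward 16:15 ratio, so almost no source line lands exactly on a panel line and the scaler has to interpolate. A quick sketch of the mapping (my own illustration, not any set's actual algorithm):

```python
# The 720 -> 768 vertical scale is a 16:15 ratio: every 15 input lines
# must become 16 output lines, so most panel lines fall between two
# source lines and need interpolation.
from fractions import Fraction

SRC, DST = 720, 768
print(Fraction(DST, SRC))  # 16/15

def source_line_for(panel_line: int) -> Fraction:
    """Fractional source line that feeds a given panel line."""
    return Fraction(panel_line * SRC, DST)

print(source_line_for(384))  # 360: dead center maps exactly
print(source_line_for(1))    # 15/16: already between two source lines
```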
A good answer is easier with a clear question giving the make and model of everything.
"The biggest problem in communication is the illusion that it has taken place." -- G. “Bernie” Shaw
OP | Post 3 made on Tuesday January 12, 2010 at 19:10
Daniel Tonks
Wrangler of Remotes
October 1998
Well, it's a 1366x768 fixed pixel display, so they'd all be visible. The question is whether the set's circuitry is designed to make use of that slight extra resolution at any step before final display.

HD, being digital, doesn't need to hide its extra information in blanking lines the way NTSC did, so there's no hidden resolution to account for.

Resolution conversion really shouldn't be difficult - think of playing a movie back in Media Player on a PC... you can make the window any size from a thumbnail up to full screen and the CPU usage barely changes, since the scaling is handled by the video hardware.
