Black border on HTPC MCE Win7

heylarson

Let me explain my HTPC setup to get the ball rolling.

# I have COX Digital cable coming in from the wall.
# The wall feed connects to my digital set-top box (Cable In).
# The set-top box (Cable Out) connects to my Hauppauge WinTV-HVR-1600 ATSC/ClearQAM/NTSC TV tuner (MC-Kit 1183) via its TV IN coax input.
# The HDMI out on my ASUS AH3450/DI/512MD2(LP) Radeon HD 3450 connects to the HDMI input on my Sony KDF-50E2000 50" rear-projection LCD.
# The TV's optical audio out connects to my stereo receiver.

Everything is up and running, except that the desktop and Windows 7 Media Center both have a black border around them. My TV's specs say it is 720p (1280x720), but when I set Windows to that resolution, the picture still doesn't fill the screen. I have the Radeon Catalyst Control Center (CCC) installed and have tried setting the scaling to 0%, but the image still doesn't fill the screen.
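
As a sanity check, here's a minimal Win32 sketch (a generic example, nothing Catalyst-specific) that prints the resolution Windows thinks the desktop is running at. If it reports 1280x720 while the picture is still bordered, the shrinking has to be happening downstream, in the GPU's underscan scaler or in the TV itself:

```c
/* Print the desktop resolution Windows is actually using. If this says
 * 1280x720 but the picture is bordered, the scaling is happening in the
 * GPU or the TV, not in Windows.
 * Build with a Windows compiler, e.g.: cl check_res.c user32.lib */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    printf("Desktop resolution: %dx%d\n",
           GetSystemMetrics(SM_CXSCREEN),    /* primary display width  */
           GetSystemMetrics(SM_CYSCREEN));   /* primary display height */
    return 0;
}
```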

My biggest issue is that the TV signal doesn't even come close to filling the screen; it's a small rectangle of picture surrounded by black. I thought high-def signals were 720p, so why isn't it taking up my entire screen when I have the resolution set to 1280x720?

If anyone has any suggestions, please help.

rwpritchett

If you've truly set the hard-to-find scaling in CCC to 0% and you are double-sure that your TV is 1280x720, then I suggest you explore your TV menu and look for something like 1:1 mapping, zoom, stretch or similar to get the full screen at the proper pixel resolution. You may have to play around with it since each TV manufacturer is different.

Another idea... some "720p" televisions are actually 1366x768 pixels rather than 1280x720 (a LOT of 720p TVs have that resolution). Try 1366x768 and see if it fills the screen better.
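
If you want to see exactly which modes the driver exposes (and whether 1366x768 is even among them), a quick sketch against the standard Win32 EnumDisplaySettings call will dump the list. This is a generic example, not anything specific to the Catalyst driver:

```c
/* Dump every display mode the driver exposes for the primary display,
 * plus the mode currently in use.
 * Build with a Windows compiler, e.g.: cl list_modes.c user32.lib */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    DWORD i;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* Current mode first, for reference */
    if (EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm))
        printf("Current:  %lux%lu @ %lu Hz\n",
               dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency);

    /* Then every mode the driver reports */
    for (i = 0; EnumDisplaySettings(NULL, i, &dm); ++i)
        printf("Mode %3lu: %lux%lu @ %lu Hz, %lu bpp\n",
               i, dm.dmPelsWidth, dm.dmPelsHeight,
               dm.dmDisplayFrequency, dm.dmBitsPerPixel);

    return 0;
}
```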

heylarson

I appreciate the info. The PC resolution wants to default to 1152x648; that's where it has the best clarity. Unfortunately, I don't see a 1366x768 resolution available, and I couldn't find anywhere in CCC to define a custom resolution. Also interesting: when the resolution is set to 1152x648, the scaling slider is disabled, but if I set the resolution to anything else, it's enabled. I believe I used to use a program called PowerStrip to create custom resolutions, but it's only a trial download.
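
For what it's worth, you can also ask the driver whether it would accept 1366x768 without actually switching to it, using the Win32 ChangeDisplaySettings call with its CDS_TEST dry-run flag. A rough sketch (it can only test modes the driver already knows about; a truly custom resolution would still need something like PowerStrip):

```c
/* Ask the driver whether it would accept 1366x768 without switching
 * (CDS_TEST performs a dry run).
 * Build with a Windows compiler, e.g.: cl test_mode.c user32.lib */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    LONG rc;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize       = sizeof(dm);
    dm.dmPelsWidth  = 1366;
    dm.dmPelsHeight = 768;
    dm.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT;

    rc = ChangeDisplaySettings(&dm, CDS_TEST);
    if (rc == DISP_CHANGE_SUCCESSFUL)
        puts("Driver says 1366x768 is acceptable.");
    else
        printf("Driver rejected 1366x768 (return code %ld).\n", rc);

    return 0;
}
```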

Within Media Center, I see that you can go to 'Zoom', and if I choose 'Zoom 2', it stretches the video to remove the black bars on the sides, but the image isn't as sharp as it was when the TV was connected directly to the set-top box.

heylarson

I ripped the HTPC out of the setup; I couldn't tolerate the image quality. I recently did some work on my pop's home theater system where I had to go from the coax out on the cable box into the TV, and I noticed the picture quality was degraded the same way mine was. There's something about using the coax out on the cable box that kills the picture quality.

Why would the coax out on the cable box kill the picture quality?

There has to be a way for an HTPC to get the same picture quality that the cable box puts out.

heylarson

So after a little more research, I think I answered my own question. Using the coax out on a cable box does in fact degrade the image quality: the box decodes the digital signal and then uses a low-end RF modulator to resend the picture over coax as an analog standard-definition signal, so HD channels get downconverted. You get noticeably better quality by avoiding the coax out, unless your cable service doesn't require a set-top box at all; in that case, going straight into the tuner without the set-top box is better.

Looks like I'll need a CableCARD from my cable company. It should handle the decryption, and that should give the highest-quality signal.

I'm going to look into this route and see what Cox Communications has to offer.