xpcustom

First, sorry if this isn't in the right area for posting; this is kind of a cross between PCs and TVs. With that said, I plan on getting a new 32" Samsung LCD TV this Christmas, and I want to use it as a secondary monitor for when I watch TV from Media Center or watch movies. I know using it as a primary monitor will dumb the resolution down to a mere 1366 x 768, a big difference from the 1680 x 1050 I already have on my monitor.

So I came up with the idea of using a DVI-D splitter cable: when I send the signal out of the PC, it would split between the monitor and the TV, like using the clone setting on a video card. My question is, if I did this kind of setup at a resolution of 1680 x 1050 (the monitor's resolution), would the TV be able to see the signal, or would I get a blank screen? If I do get an image, would it fill the entire screen? If there's no image, is there a way to do this while maintaining the monitor's 1680 x 1050 resolution and still sending the same signal to the monitor as well? Or could I just use the second port on my Nvidia card and have the same resolution on both screens? Any answers would be of use, as I am completely stuck on this one.

Forgot to mention this, but I do have a networked Xbox 360, and I have Windows Media Center on my PC. So if everything else fails, I just wanted to ask if, at the very least, I could watch TV on the TV through the Xbox 360 as a Media Center Extender.
 
You should be able to set a different resolution for each monitor/HDTV connected to a different graphics card port in the NVidia Control Panel. My HTPC is connected to my HDTV at a resolution of 1366x768 and to my projector at a resolution of 1280x724, though I'm using an ATI card.

-Wolf sends
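
If the control panel won't cooperate, the same per-display mode setting can also be done directly through the Win32 API with ChangeDisplaySettingsEx. This is only a rough sketch under assumptions: the device names (\\.\DISPLAY1, \\.\DISPLAY2) and the two target modes are placeholders for illustration, not verified against any particular card.

```c
/*
 * Sketch: give each attached display its own resolution, independent of
 * any vendor control panel. Device names and modes are assumptions.
 */
#include <windows.h>
#include <stdio.h>

static void set_mode(const char *device, DWORD w, DWORD h)
{
    DEVMODEA dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize       = sizeof(dm);
    dm.dmPelsWidth  = w;
    dm.dmPelsHeight = h;
    dm.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT;

    /* CDS_UPDATEREGISTRY makes the change persist across reboots */
    LONG rc = ChangeDisplaySettingsExA(device, &dm, NULL,
                                       CDS_UPDATEREGISTRY, NULL);
    printf("%s -> %lux%lu: %s\n", device, w, h,
           rc == DISP_CHANGE_SUCCESSFUL ? "ok" : "failed");
}

int main(void)
{
    set_mode("\\\\.\\DISPLAY1", 1680, 1050); /* desktop monitor (assumed) */
    set_mode("\\\\.\\DISPLAY2", 1366, 768);  /* HDTV (assumed) */
    return 0;
}
```

Note this only works with the displays in extended (dualview) mode; in clone mode both outputs are driven from one framebuffer, which is exactly why the resolutions get locked together.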
 

xpcustom

I thought about doing that. However, in the new version of the Nvidia control panel, the only one available for Vista, they removed the ability to set different resolutions for each display. Now when you try to clone your desktop to another screen, Nvidia automatically scales the resolution down so both displays match, forcing your better display to run at whatever resolution your lower-quality one supports. I tried the registry hack to work around this, but it only saw one display and crashed when you forced it to work. I originally bought the 8600GT to do this, not knowing about the issue until shortly after trying it. I've heard people say the registry hack works on XP, but I use Vista for the higher DirectX version and the better Media Center.
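
For anyone wanting to check up front whether the TV would even accept a 1680 x 1050 signal (the blank-screen question from the original post), a minimal Win32 sketch can list every mode Windows reports for a display. The \\.\DISPLAY2 device name is just an assumed placeholder; yours may differ.

```c
/*
 * Sketch: enumerate every mode a given display reports, to see whether
 * a target resolution appears at all. Device name is an assumption.
 */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODEA dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* Walk the mode list; EnumDisplaySettingsA returns FALSE at the end */
    for (DWORD i = 0; EnumDisplaySettingsA("\\\\.\\DISPLAY2", i, &dm); i++) {
        printf("%lux%lu @ %lu Hz\n",
               dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency);
    }
    return 0;
}
```

If 1680 x 1050 never shows up in that list for the TV, a passive DVI splitter in clone mode would most likely give a blank or out-of-range screen rather than a scaled image.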