PC to HDTV? Best gfx card without bottleneck for my system?

#1 Jul 11 2010 at 8:38 PM Rating: Decent
Scholar
****
9,997 posts
Actually, I have a couple of questions-- firstly, I'm considering hooking my tower up to the HDMI port on my big screen television. Is there anything I need to know about this? Is this going to tax my system any more than a regular monitor would?

Secondly, currently the weakest link in my system is my Radeon 5750, and though I just got it, I may upgrade it sooner than I planned to. My other specs are as follows:
Windows 7 64-bit
Phenom X4 2.5 GHz
8GB DDR2 RAM
PSU is 750 watts, I think, maybe less.

Given that, what's the best card (or cards) I could put in my system, without any other hardware upgrades, before I'd hit significant diminishing returns in performance?

Thanks.
____________________________
Hyrist wrote:
Ok, now we're going to get slash fiction of Wint x Kachi somewhere... rule 34 and all...

Never confuse your inference as the listener for an implication of the speaker.

Good games are subjective like good food is subjective. You're not going to seriously tell me that there's not a psychological basis for why pizza is great and lutefisk is revolting. The thing about subjectivity is that, as subjects go, humans actually have a great deal in common.
#2 Jul 11 2010 at 8:41 PM Rating: Decent
*
209 posts
@720p your current system should be fine. @1080p I am unsure.

P.S. All you need is a $10 DVI-to-HDMI cable and everything will work, including the sound.

Edited, Jul 11th 2010 10:41pm by TheBSTGuy
#3 Jul 11 2010 at 10:44 PM Rating: Decent
Scholar
**
350 posts
Big screen HDTVs only support about the same resolution as a good HD computer monitor (1920x1080), so you don't need better hardware just because you are using an HDTV. Even 720p looks good on a large HDTV.

If you're gonna be playing on a big screen TV, you probably want to think about getting either extension cables for your keyboard and mouse, or a good wireless keyboard and mouse. I've heard good things about the Logitech DiNovo Edge wireless keyboard, but it is expensive.
#4 Jul 11 2010 at 11:05 PM Rating: Excellent
***
2,614 posts
Video cards with HDMI ports have made this relatively straightforward. You'll be pushing the same resolution either way, so it won't be any more taxing on your video card. Today's cards even have onboard 7.1 audio hardware.

I'd definitely stick with the 5750 for now. The smallest upgrade that would make it worth your while would be an HD 5850 for about $300. The catch is that the graphics card market sucks right now. In a previous generation with viable competition from Nvidia, the 5850 would be a sub-$200 card by now. Instead its price has gone up since launch and held steady for nearly a year.

There are rumblings of Nvidia finally making some steps into the $200-range market this month, but based on their past showing with the 400 series I wouldn't expect it to make much of a splash. If I were you, I would wait: first to see how your card performs in the final version of the game, and second for the new generations of hardware that should be coming out from both companies this fall and next spring.
#5 Jul 12 2010 at 12:01 AM Rating: Decent
Scholar
****
9,997 posts
Ah good, I had heard rumblings that using a bigger screen was much more taxing on the tower even though I wasn't sure how that could be. My card has an HDMI output so it's incredibly simple. Now I just need to figure out how I want to go about using my tower for gaming on the TV and regular work at my desk monitor. I imagine I'll eventually employ some sort of splitter.

I know that my system can run at 1080p with max settings, just not very smoothly. I was thinking that in that case I may as well put it on a bigger screen at 720p, where I'll be sitting far enough away that I can't tell the difference, and keep the beautiful settings like ambient occlusion turned on. I prefer to play on a TV anyway; that way I can turn on my couch/recliner settings. Unfortunately someone had me worried that my tower might actually run worse on a TV, but apparently I was misled.

Control-wise, I'm just getting an adapter for the old Logitech Netplay controller that I used on my PS2 with FFXI and hoping it works well. My brother and I are really hoping they'll put out something similar, at least when the PS3 version hits. It's too convenient. But if that doesn't pan out, I may look into getting one of those PS3 gamepad keyboard clip-ons in addition to a wireless keyboard, assuming both can function on my PC simultaneously (and at all).

Guess I'll stick with the 5750 for now. I wish I had known what a step up the 5770 was when I picked it up-- it wouldn't have been much more expensive.

Oh right, one more question. Since I upgraded my card, I'm using the HDMI output to my monitor, and of course the sound comes out of my monitor speakers. Unfortunately they kind of suck and I already have a pretty nice speaker system for my rig. Is there some way to get sound to my speakers without losing HD video?

Thanks for the input.
#6 Jul 12 2010 at 12:04 AM Rating: Good
Thief's Knife
*****
15,053 posts
Be aware that HDMI supports two ways of encoding color: RGB (which is what PC monitors use) and YCbCr (the color space used by compressed video formats).

It is important that you understand this because video cards will sometimes choose a less than optimal color space format when hooked up to an HDTV. You can usually force the pixel format in your video card's driver settings.

Explanation of YCbCr here

If you have to use YCbCr, then use YCbCr 4:4:4, not YCbCr 4:2:2. 4:2:2 means the color channels (Cb/Cr) are stored at half the horizontal resolution of the brightness (Y).
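If it helps, here's a toy numpy sketch of what 4:2:2 does to the color channels (my own illustration of the idea, not what the card or TV actually runs):

import numpy as np

def subsample_422(ycbcr):
    # ycbcr: H x W x 3 float array holding the Y, Cb, Cr planes.
    # Y keeps full resolution; Cb/Cr keep only every other column,
    # then get stretched back out, halving horizontal color detail.
    y = ycbcr[:, :, 0]
    cb = np.repeat(ycbcr[:, ::2, 1], 2, axis=1)[:, : y.shape[1]]
    cr = np.repeat(ycbcr[:, ::2, 2], 2, axis=1)[:, : y.shape[1]]
    return np.stack([y, cb, cr], axis=2)

4:4:4 just skips the every-other-column step, so nothing is thrown away.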


There will also (at least on ATI cards) be an option for "full range" and "studio range" when outputting RGB pixels.

Full range means that brightness level RGB 0,0,0 = absolute black and RGB 255,255,255 = maximum white


Studio range means that absolute black = RGB 16,16,16 and maximum white = RGB 235,235,235 (the standard 8-bit video levels).
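The two ranges are just a linear rescale of the same 8-bit values; roughly (my own sketch, not actual driver code):

def full_to_studio(v):
    # 0-255 full range -> 16-235 studio/limited range
    return round(16 + v * (235 - 16) / 255)

def studio_to_full(v):
    # clip to the legal studio range, then expand back to 0-255
    v = min(max(v, 16), 235)
    return round((v - 16) * 255 / (235 - 16))

This is also why a mismatch looks wrong: if the card sends studio range but the TV expects full range, black displays as dark gray and the picture looks washed out; the opposite mismatch crushes shadow and highlight detail.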

Ideally you want RGB full range. But you need to have your HDTV and your video card both set to use the same brightness range (i.e. full range or studio range). Most HDTVs will have an option for full range/studio range somewhere in the setup menus (on Samsung TVs it is called "HDMI Black Level").

If you are using a DVI to HDMI cable then it will always be RGB full range.


Also, turn off *ALL* "image enhancement" features (image sharpening, dynamic contrast, etc.) when using an HDTV as a PC monitor. It will just degrade the image quality. There will usually be an option in your HDTV's setup called something like "Movie" or "Natural" that disables most of this. Then it's just a matter of setting image sharpening to 0% and disabling "dynamic contrast," "edge enhancement," etc.

What you want is to display exactly the image that your PC is sending to the TV with no changes.




Tools you can use to adjust your brightness/gamma. Please note that if you have any edge "enhancement" or image sharpening enabled, these will not work right (but you should have all of those turned off anyway).

Lobivopsis wrote:

Looks like I'm going to have to pull out my gamma correction/adjustment links again.

First of all, if you are using an LCD monitor set your desktop to the native resolution of your display.

Now we adjust your black point and white saturation level.

Black point test:

http://www.drycreekphoto.com/Learn/Calibration/monitor_black.htm

White saturation test:

http://www.lagom.nl/lcd-test/white.php

Another black point test:

http://www.lagom.nl/lcd-test/black.php

Luminance sensitivity test:

http://www.drycreekphoto.com/Learn/Calibration/monitor_sensitivity.html
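If those links ever die, a pattern like the black point tests is easy to generate yourself. A toy sketch using numpy and PIL (the filename is made up):

import numpy as np
from PIL import Image

# 16 patches at 8-bit levels 0-15 on a level-0 background. With the
# black point set correctly you should just barely be able to pick
# out the level 1-2 patches from the background.
img = np.zeros((64, 64 * 16), dtype=np.uint8)
for i in range(16):
    img[16:48, i * 64 + 16 : i * 64 + 48] = i
Image.fromarray(img, mode="L").save("black_point_test.png")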



Get your black point and white saturation correctly adjusted, and then you need to adjust the gamma curve to 2.2.


http://www.normankoren.com/makingfineprints1A.html#gammachart
http://www.photoscientia.co.uk/2point2.htm
http://www.graphics.cornell.edu/~westin/gamma/gamma.html
http://www.lagom.nl/lcd-test/gamma_calibration.php
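For reference, "gamma 2.2" just means pixel value and light output are related by a power curve. In code form (a sketch of the math those test pages are checking, not anything from the pages themselves):

def display_output(pixel, gamma=2.2):
    # what a gamma-2.2 display does: pixel value (0.0-1.0) -> light output
    return pixel ** gamma

def encode(linear, gamma=2.2):
    # the inverse: linear light -> pixel value
    return linear ** (1.0 / gamma)

print(display_output(0.5))  # ~0.218: a mid-gray pixel emits about 22% of max luminance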




Edited, Jul 12th 2010 3:09am by Lobivopis
____________________________
Final Fantasy XI 12-14-11 Update wrote:
Adjust the resolution of menus.
The main screen resolution for "FINAL FANTASY XI" is dependent on the "Overlay Graphics Resolution" setting.
If the Overlay Graphics Resolution is set higher than the Menu Resolution, menus will be automatically resized.


I thought of it first:

http://ffxi.allakhazam.com/forum.html?forum=10&mid=130073657654872218#20
#7 Jul 12 2010 at 12:15 AM Rating: Decent
Scholar
****
9,997 posts
Wow, that's pretty technical. Thanks a ton for the overview. Guess I'll be referring back here when I actually rig this up. I suppose this leads me to my next series of questions, too, even though I haven't finished with the last ones. Sorry...

I'm about to shop for a new TV, probably something 50+ inches. Is there anything specific I should keep in mind as I shop? I've done a little consumer research in the past, but pretty much all I recall is looking out for contrast ratio/resolution/black levels (don't even know how to check black levels) and the practical stuff like screen gloss, inputs and features. Does it matter if I go with LED/LCD, or even plasma (I almost assuredly will not do plasma)?
____________________________
Hyrist wrote:
Ok, now we're going to get slash fiction of Wint x Kachi somehere... rule 34 and all...

Never confuse your inference as the listener for an implication of the speaker.

Good games are subjective like good food is subjective. You're not going to seriously tell me that there's not a psychological basis for why pizza is great and lutefisk is revolting. The thing about subjectivity is that, as subjects go, humans actually have a great deal in common.
#8 Jul 12 2010 at 12:36 AM Rating: Decent
*
126 posts
If you're gonna be playing FFXIV on it a lot, I'm unsure about LED, but do not get a plasma. Plasma is prone to burn-in, and that seems pretty likely with the on-screen gauges. I had FFXI burn in on my old CRT HDTV and it kinda sucked. I've got a DLP and would suggest one, as there is no possibility of burn-in. Plus, I've seen deals the last month for a 60" DLP for $800 through Dell. I've got a 60" I'm gonna be running the PC through, and after watching beta vids on YouTube I can't wait to get this!

Edited, Jul 12th 2010 2:36am by Kadin

Edited, Jul 12th 2010 2:37am by Kadin
____________________________
Carbuncle
75 Pld/Bst
#9 Jul 12 2010 at 12:40 AM Rating: Good
Thief's Knife
*****
15,053 posts
Kachi wrote:
Wow, that's pretty technical. Thanks a ton for the overview. Guess I'll be referring back here when I actually rig this up. I suppose this leads me to my next series of questions, too, even though I haven't finished with the last ones. Sorry...

I'm about to shop for a new TV, probably something 50+ inches. Is there anything specific I should keep in mind as I shop? I've done a little consumer research in the past, but pretty much all I recall is looking out for contrast ratio/resolution/black levels (don't even know how to check black levels) and the practical stuff like screen gloss, inputs and features. Does it matter if I go with LED/LCD, or even plasma (I almost assuredly will not do plasma)?


First of all, go here.

http://www.avsforum.com/

This is the best source of information on home theater products on the Internet.

Get Samsung or Sony (my personal preference is Samsung). You really can't go wrong with them.

Years ago I got a Philips and had problems using it with PC input, so I took it back and got a Samsung instead. I don't know if they have fixed this by now.

Vizio is decent but low grade.

Akai is absolute garbage that will stop working in about a year or so (and then they will expect you to ship it to them to fix it). Do not ever buy one.


Whatever you decide on, read up on it on AVS first.

I'd recommend LCD over Plasma if you want to use it as a PC display.

On most LED-backlit LCDs the LEDs simply replace a fluorescent bulb. It allows the HDTV to be thinner, and it doesn't wear out over time like a fluorescent bulb does (it will eventually, but your TV will stop working long before you start seeing reduced brightness from an LED). There are LCDs that have an active matrix LED backlight, but they are very expensive and they're not good as PC displays anyway.

Contrast ratio numbers are usually padded, i.e. they are quoting "perceived" contrast ratio using dynamic contrast. Dynamic contrast is something I never use (I don't use any "image enhancement" features). So while the box may say 20,000:1, the LCD panel is really more like 2000:1.


Edited, Jul 12th 2010 3:50am by Lobivopis
#10 Jul 12 2010 at 1:19 AM Rating: Decent
Scholar
****
9,997 posts
Quote:
I've got a DLP and would suggest one, as there is no possibility of burn-in. Plus, I've seen deals the last month for a 60" DLP for $800 through Dell. I've got a 60" I'm gonna be running the PC through, and after watching beta vids on YouTube I can't wait to get this!


Really? Is that 720 or 1080p? Do you ever get that rainbow effect I've heard of on DLP?

@Lobi: My last purchase I bought a Samsung LCD after doing my homework, because it seemed pretty clear that they had the best track record, at least a few years ago. I'm actually going to be using it both as a PC display for FFXIV and as a regular TV (including console games), so there's more to consider than just how it will do with the PC.

Do you know anything about the Insignias? I understand those are basically the Vizio of Best Buy. I'll be weighing the value rather than automatically grabbing the cheapest or the best I can afford, and I can get Best Buy stuff almost at cost.

I didn't realize that LED just referred to the bulb type. That's good to know.

Popped in on the AVS forums. @_@ Lots of gobbledygook; not sure if it's just over my head or if I'm not sure where to look.
#11 Jul 12 2010 at 11:15 AM Rating: Decent
Scholar
***
1,151 posts
The DLP rainbow effect: either you see it or you don't. It depends on the viewer's eyes more than the TV. I would suggest going to a store and looking at a DLP TV before buying one. I would hate to get one home only to find out I can see it.
#12 Jul 12 2010 at 12:09 PM Rating: Decent
Scholar
****
9,997 posts
Ah, strange. Maybe I should shy away from it for the benefit of other rainbow-eyed people who may watch it. :P

Any ideas about this?
Quote:
Since I upgraded my card, I'm using the HDMI output to my monitor, and of course the sound comes out of my monitor speakers. Unfortunately they kind of suck and I already have a pretty nice speaker system for my rig. Is there some way to get sound to my speakers without losing HD video?
#13 Jul 12 2010 at 12:38 PM Rating: Decent
Scholar
**
350 posts
Kachi wrote:
Since I upgraded my card, I'm using the HDMI output to my monitor, and of course the sound comes out of my monitor speakers. Unfortunately they kind of suck and I already have a pretty nice speaker system for my rig. Is there some way to get sound to my speakers without losing HD video?


Do you have a separate audio output from your sound card? You could just plug that output into your sound system, and then disable or mute the audio on your TV. There is probably also a way within Windows to disable audio being sent over HDMI.

If you had a good audio receiver, it would support HDMI switching, and would carry the video signal too. Using that, you would be able to plug your computer into the audio receiver as an input, and then plug your TV in as an output. Then use the audio receiver to play the audio, and mute the audio on your TV. Most of the cheaper audio receivers won't support audio pass-thru, which for your situation is OK, but it is annoying in situations where sometimes you want to use your TV speakers instead of the surround sound system.
#14 Jul 12 2010 at 12:42 PM Rating: Good
***
2,614 posts
Quote:
Now I just need to figure out how I want to go about using my tower for gaming on the TV and regular work at my desk monitor. I imagine I'll eventually employ some sort of splitter.

Oh right, one more question. Since I upgraded my card, I'm using the HDMI output to my monitor, and of course the sound comes out of my monitor speakers. Unfortunately they kind of suck and I already have a pretty nice speaker system for my rig. Is there some way to get sound to my speakers without losing HD video?

Solution to both problems: your video card supports up to 3 displays simultaneously. Hook up your monitor via DVI (the cable probably came with it) and your TV via HDMI. Plug in your speakers to your motherboard's audio outputs.

The details from there are a matter of getting your settings right, so you'll have to fiddle with them. The easiest method is probably to hit Windows key + P and select either Duplicate or Projector Only whenever you want to play on the TV. This way your motherboard's audio should handle most things and your HDMI audio device should automatically turn on when needed. If it doesn't, you'll probably have to change something in your Catalyst Control Center (ATI's configuration software).

By the way, Nvidia just launched their GTX 460, and it turns out it's actually pretty awesome. It's still not worth upgrading unless you can get a good resale price for your 5750, but it bodes well for the market. This will probably drive ATI's prices down finally.
#15 Jul 12 2010 at 1:05 PM Rating: Decent
Scholar
****
9,997 posts
Yeah, I kind of forgot that with my new graphics card came new outputs, like the very one I plugged the HDMI cable into ^_^;

I actually don't mind having the audio from my HDMI, particularly on a TV, but it tends to be a little treble-heavy and weak all by itself on the monitor. So...

Quote:
Solution to both problems: your video card supports up to 3 displays simultaneously. Hook up your monitor via DVI (the cable probably came with it) and your TV via HDMI. Plug in your speakers to your motherboard's audio outputs.


This should work out just fine, I think.

I might pawn my 5750 if I can get ~$75 for it (think that's all I paid for it), depending on what kind of deal I can get on a 460 and if my PSU is good enough. I'm also a little hesitant to upgrade to a 460 until I see how it actually handles the game. There have been a few issues with certain Nvidia cards.
#16 Jul 17 2010 at 8:16 PM Rating: Decent
Scholar
****
9,997 posts
Update on this: I've been looking at TVs, and I'm actually wondering if I should consider getting a plasma. It would be greatly appreciated if one of you more knowledgeable folks would help me out along my line of reasoning here, in case I'm wrong about something or am overlooking a crucial detail.

First of all, the refresh rates are quite a bit better on plasmas-- most are 600 Hz as opposed to the 60-240 Hz in the LCDs. I'm not sure how much faster this is in practice or if you can even notice it, but my brother has a 240 Hz LED and the lag just from the TV makes him unable to play certain games online with it (twitchy PvP games). So this is probably my biggest consideration.

Secondly, the price. I can get a plasma for about half the price of an LCD, and about a third the price of an LED. Granted plasmas don't last as long, but I figure by the time it's time to replace it, an LCD will probably cost half as much as it does now anyway.

Other typical bad plasma stuff... it's heavier, more prone to burn-in (though I understand that this problem is nearly moot now). Anything else?

Thanks.
#17 Jul 17 2010 at 10:13 PM Rating: Good
***
2,614 posts
Quote:
First of all, the refresh rates are quite a bit better on plasmas-- most are 600 Hz as opposed to the 60-240 Hz in the LCDs.

Plasmas aren't my specialty, but it's my understanding that these numbers aren't directly comparable between plasmas and LCDs. The 600 Hz "sub-field drive" is essentially a marketing term invented as an answer to rapid-refresh LCDs. I don't even know if refresh rate is a term that makes any sense when applied to a plasma... they work on fairly different principles.

But there's one important thing to know:

Quote:
I'm not sure how much faster this is in practice or if you can even notice it, but my brother has a 240 Hz LED and the lag just from the TV makes him unable to play certain games online with it (twitchy PvP games). So this is probably my biggest consideration.

Refresh rate has nothing to do with input lag - neither does response time, which is another thing often confused for it. Despite it being one of the most important considerations in buying a TV for gamers, input lag is a spec that isn't advertised by any manufacturer that I know of. To find out how laggy a TV is, you have to either read reports online from other consumers or bring equipment to the store and test it yourself.

There are ways to reduce input lag. Some TVs have a special "Game Mode" for this. You should also try to operate games at the native resolution of the TV so that it doesn't have to upscale or downscale them.

All that refresh rate affects is the perceived smoothness of the picture's motion. It works like this: a TV with a refresh rate of 120 or 240 Hz analyzes low-framerate video and interpolates new frames in between the existing frames. This makes video shot at 24 or 30 fps look like it was shot at 60 fps or more, which is smooth enough that most people stop noticing a difference.
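A toy version of the idea, just to show where the extra frames come from (real TVs do motion-compensated interpolation rather than a plain blend; the artifacts I mention below come from that motion estimation going wrong):

import numpy as np

def double_framerate(frames):
    # frames: list of H x W x 3 arrays. Insert a synthesized frame
    # between each consecutive pair. A real TV estimates motion and
    # warps pixels along it; this naive 50/50 blend just shows the idea.
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append((a + b) / 2.0)
    out.append(frames[-1])
    return out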

Many people don't like this effect on TV shows and movies because they feel it makes them look "cheap" (soap operas and similar shows are shot at a higher framerate, and this duplicates that effect). But it's pretty great for games, especially for something capped at 30 fps like FFXIV (unless they fix it by release, which I hope to god they do). The effect can be turned down or off, in any case.

There may be a catch. When I was shopping a couple years ago, motion interpolation tended to introduce ugly artifacts into the video. Look up something called the triple ball effect. I've found that it's very noticeable in some games and almost invisible in others. The software has probably also been improved since then.

Quote:
Secondly, the price. I can get a plasma for about half the price of an LCD, and about a third the price of an LED. Granted plasmas don't last as long, but I figure by the time it's time to replace it, an LCD will probably cost half as much as it does now anyway.

Other typical bad plasma stuff... it's heavier, more prone to burn-in (though I understand that this problem is nearly moot now). Anything else?

Lifespan and burn-in are pretty much non-issues at this point. Most videophiles prefer the picture of a plasma because it's richer and has better black levels. I'm surprised that they're cheaper than LCDs now, but make sure you're getting something with comparable specs. You should get one that displays 1080p, and you definitely don't want one that only does 1024 x 768.

Something else to consider is power use. LCDs are more energy efficient.
#18 Jul 18 2010 at 5:11 PM Rating: Decent
Scholar
****
9,997 posts
Ah I see. Thanks, that's exactly the kind of analysis I needed.

I had thought perhaps the refresh rates weren't directly comparable, but someone at the store told me that they were (and I should know better than to trust that). The input lag is definitely the issue I'll want to consider, so it's good to know that it isn't necessarily different between plasma and LCD. Seems like it'll be a pain to figure out, though.

Definitely getting 1080p. No question there.

Honestly, I probably won't be able to tell the difference on most features. 1080p and input lag though, I'll definitely notice.

How much more efficient are we talking on the power usage?