This Forum is Read Only

hey guys, framerate?

#1 Aug 21 2010 at 3:08 PM Rating: Good
Scholar
***
1,536 posts
Hi, I know this doesn't warrant a topic of its own, but I was wondering if there are any programs or any other way to check the framerate the game is running at while you're playing? Just out of curiosity really - usually your eye can tell, but I wanna see if it ever reaches a good 60fps for me at any time.
____________________________
MUTED
#2 Aug 21 2010 at 3:41 PM Rating: Good
Avatar
***
1,416 posts
I use Fraps.
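
(For the curious: an overlay like this basically just counts how many frames get drawn each second and paints the number on screen. A rough sketch of the idea in Python - `render_frame` here is a hypothetical stand-in for whatever draws one frame, not Fraps' actual code:)

```python
import time

def run_with_fps_counter(render_frame, duration=10.0):
    """Count frames over ~1 second windows and report frames per second."""
    frames = 0
    window_start = time.perf_counter()
    end_time = window_start + duration
    while time.perf_counter() < end_time:
        render_frame()                    # draw one frame of the game
        frames += 1
        now = time.perf_counter()
        if now - window_start >= 1.0:     # a second has passed: report and reset
            print(f"{frames / (now - window_start):.1f} fps")
            frames = 0
            window_start = now

# Example with a fake "renderer" that takes 1/60 s per frame:
run_with_fps_counter(lambda: time.sleep(1 / 60), duration=3.0)
```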
____________________________

#3 Aug 21 2010 at 3:45 PM Rating: Decent
Avatar
***
2,045 posts
I was under the impression the human eye couldn't tell the difference between 30 and 60. Anyway, Fraps.com can do what you want.
____________________________
BANNED
#4 Aug 21 2010 at 3:50 PM Rating: Good
Scholar
***
1,536 posts
I meant I can tell when the frame rate is dipping and when it's good and steady, not the exact number, which is what I wanted :P Thanks for the replies, I'll check it out.
____________________________
MUTED
#5 Aug 21 2010 at 5:22 PM Rating: Good
Scholar
*
233 posts
My eye can't really tell the difference between 30 and 20... but with Fraps it bugs me to see the numbers drop!
#6 Aug 21 2010 at 8:55 PM Rating: Excellent
***
2,535 posts
preludes wrote:
I was under the impression the human eye couldn't tell the difference between 30 and 60. Anyway, Fraps.com can do what you want.


It can tell the difference, it most definitely can.

The eye doesn't actually have a "frame rate" - due to the way the eye works, it can't; the eye is an analog device, so digital concepts like frame rate are utterly meaningless. It's like talking about "the ear's sampling rate".

Technical details:
It's because the retina encodes brightness as frequency - the brighter something is, the faster the neurons in the retina fire. Because of this, the more light there is, the more sensitive your eyes are to movement, which leads to a higher perceptible frame rate.

This is, incidentally, why it's somewhat of a misnomer to say film runs at 24 fps. True, movies are filmed at 24 fps, but when projected, each frame is actually shown 2 or 3 times in a row before advancing to the next frame - otherwise, people would notice the flickering.
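
To put numbers on it - a quick check of the standard 2- and 3-blade shutter designs:

```python
FILM_FPS = 24
for blades in (2, 3):                      # common projector shutter designs
    flashes_per_second = FILM_FPS * blades
    print(f"{blades}-blade shutter: light flashes at {flashes_per_second} Hz")
# 2 blades -> 48 Hz, 3 blades -> 72 Hz: the light flickers fast enough to fuse,
# while the picture itself still only changes 24 times per second.
```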
#7 Aug 21 2010 at 8:58 PM Rating: Decent
Edited by bsphil
******
21,739 posts
Teneleven wrote:
I use Fraps.
Seconded, Fraps is really nice. Well worth the purchase, and you get a lifetime license so you can always update to the latest version.
____________________________
His Excellency Aethien wrote:
Almalieque wrote:
If no one debated with me, then I wouldn't post here anymore.
Take the hint guys, please take the hint.
gbaji wrote:
I'm not getting my news from anywhere Joph.
#8 Aug 21 2010 at 9:50 PM Rating: Excellent
Avatar
***
2,000 posts
I would definitely recommend MSI Afterburner. Less overhead than Fraps, measures FPS and GPU temp (among other things), takes screenshots, etc.
____________________________
#9 Aug 21 2010 at 10:48 PM Rating: Good
Scholar
*
113 posts
A cool website that I stumbled upon a while back has some basic visuals to show the difference between 15, 30, and 60 fps:

FPS comparison

Edited, Aug 22nd 2010 12:50am by Cowgomoo
____________________________
FFXI: Fenrir
#10 Aug 21 2010 at 11:18 PM Rating: Default
Thief's Knife
*****
15,053 posts
BastokFL wrote:

This is, incidentally, why it's somewhat of a misnomer to say film runs at 24 fps. True, movies are filmed at 24 fps, but when projected, each frame is actually shown 2 or 3 times in a row before advancing to the next frame - otherwise, people would notice the flickering.

Depends on how it's displayed. If it's a film projector, it's 24 FPS *period*. And the problem with LCDs and a low refresh rate is the exact opposite of flickering: LCDs have to use at least a 60 Hz refresh rate to reduce ghosting, not because they would flicker.

You are thinking of CRT displays, which would flicker if the refresh rate was too low due to phosphor decay.

Edited, Aug 22nd 2010 2:31am by Lobivopis
____________________________
Final Fantasy XI 12-14-11 Update wrote:
Adjust the resolution of menus.
The main screen resolution for "FINAL FANTASY XI" is dependent on the "Overlay Graphics Resolution" setting.
If the Overlay Graphics Resolution is set higher than the Menu Resolution, menus will be automatically resized.


I thought of it first:

http://ffxi.allakhazam.com/forum.html?forum=10&mid=130073657654872218#20
#11 Aug 22 2010 at 12:05 AM Rating: Good
Scholar
***
1,536 posts
Well, while we're on the topic of FPS: I bought myself Crysis to test out my FFXIV rig and see how high I can run it. I auto-tuned it, meaning the game checked what settings are good for my computer, and it decided high settings + high resolution were good. Now the thing is, when I'm turning the camera in Crysis (looking around) I see shearing - screen tearing and stuff like that. I thought it was the settings, so I knocked them down to medium, but it continued. I also see this kind of thing in FFXIV a little bit, but I don't see it at all in ME2 (also played on high settings, high res). Any ideas? (Assuming this is an FPS thing, although I'm not so sure.)

Edited, Aug 22nd 2010 2:05am by SolidMack
____________________________
MUTED
#12 Aug 22 2010 at 12:10 AM Rating: Decent
*
127 posts
SolidMack wrote:
Well, while we're on the topic of FPS: I bought myself Crysis to test out my FFXIV rig and see how high I can run it. I auto-tuned it, meaning the game checked what settings are good for my computer, and it decided high settings + high resolution were good. Now the thing is, when I'm turning the camera in Crysis (looking around) I see shearing - screen tearing and stuff like that. I thought it was the settings, so I knocked them down to medium, but it continued. I also see this kind of thing in FFXIV a little bit, but I don't see it at all in ME2 (also played on high settings, high res). Any ideas? (Assuming this is an FPS thing, although I'm not so sure.)


Look for a graphics option called "Vertical Sync" and turn it on.
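
If you're curious what that option actually does: it makes the buffer swap wait for the monitor's vertical blanking interval. A minimal sketch of the idea using GLFW's Python bindings (an assumption: the glfw package is installed via pip; window size and title are arbitrary) - games do the equivalent internally:

```python
import glfw  # assumption: pip install glfw

glfw.init()
window = glfw.create_window(1280, 720, "vsync demo", None, None)
glfw.make_context_current(window)
glfw.swap_interval(1)   # 1 = wait for the vertical blank on each swap (vsync on);
                        # 0 = swap immediately (uncapped fps, but frames can tear)

while not glfw.window_should_close(window):
    # ... render the frame here ...
    glfw.swap_buffers(window)   # with vsync on, this blocks until the next vblank
    glfw.poll_events()

glfw.terminate()
```

Tearing happens when a swap lands mid-refresh, so the top of the screen shows one frame and the bottom shows another; syncing the swap to the blank interval removes it, at the cost of capping the framerate to the monitor's refresh rate.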
#13 Aug 22 2010 at 2:32 AM Rating: Decent
Scholar
Avatar
***
2,536 posts
Cowgomoo wrote:
A cool website that I stumbled upon a while back has some basic visuals to show the difference between 15, 30, and 60 fps:

FPS comparison


Thanks for the link!
____________________________
FF11 Server: Caitsith
Kalyna (retired, 2008)
100 Goldsmith
75 Rng, Brd
Main/Acc
Exp/Hybrid
Str/Attk
Spam/Others
#14 Aug 22 2010 at 2:32 AM Rating: Decent
Scholar
Avatar
***
2,536 posts
akirussan wrote:
I would definitely recommend MSI Afterburner. Less overhead than Fraps, measures FPS and GPU temp (among other things), takes screenshots, etc.


Seconded. I use Afterburner when playing the game. A big reason is to also monitor my GPU temps. I have it overclocked so I have to be careful with that.
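
If you'd rather watch the temperature from a script, nvidia-smi (which ships with NVIDIA's drivers) can report the same reading. A rough sketch - the 90°C warning threshold here is an arbitrary example, not a real limit for any particular card:

```python
import subprocess
import time

def gpu_temp_c():
    """Ask nvidia-smi for the GPU core temperature in Celsius."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader"])
    return int(out.decode().splitlines()[0])   # first GPU only

while True:
    temp = gpu_temp_c()
    warning = "  <-- back off the overclock!" if temp >= 90 else ""
    print(f"GPU: {temp} C{warning}")
    time.sleep(5)
```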
____________________________
FF11 Server: Caitsith
Kalyna (retired, 2008)
100 Goldsmith
75 Rng, Brd
Main/Acc
Exp/Hybrid
Str/Attk
Spam/Others
#15 Aug 22 2010 at 4:42 AM Rating: Decent
***
3,825 posts
Threx wrote:
Seconded. I use Afterburner when playing the game. A big reason is to also monitor my GPU temps. I have it overclocked so I have to be careful with that.


Just set the fan scaling utility to pump the fan to 100% when you're within 5 degrees of your danger zone - you'll hear it. I accidentally had mine set to hit 100% at 75°C instead of 95°C, so playing earlier I thought XIV was attempting to destroy my GPU. But the fan lets you know, so you don't need to parse and sample your card if you're not testing it.
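
Something like this toy curve, in other words - all the temperatures here are invented examples, not recommendations:

```python
def fan_duty(temp_c, danger_c=95, floor_temp=50, floor_duty=40):
    """Toy fan curve: quiet floor duty, linear ramp with temperature,
    and 100% within 5 degrees of the danger zone (all values invented)."""
    if temp_c >= danger_c - 5:
        return 100                       # loud on purpose - you'll hear the warning
    if temp_c <= floor_temp:
        return floor_duty
    span = (danger_c - 5) - floor_temp   # width of the linear ramp region
    return int(floor_duty + (temp_c - floor_temp) * (100 - floor_duty) / span)

for t in (45, 70, 89, 91):
    print(t, "C ->", fan_duty(t), "% fan")
```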
____________________________
FFXI:Sylph - Perrin 75 Hume THF; Retired (At least from my use any way)
EVE Online:ScraperX; Retired
WAR:IronClaw- Peryn SW;SkullThrone- Grymloc BO; Retired


#16 Aug 22 2010 at 7:59 AM Rating: Excellent
***
2,535 posts
Lobivopis wrote:
BastokFL wrote:

This is, incidentally, why it's somewhat of a misnomer to say film runs at 24 fps. True, movies are filmed at 24 fps, but when projected, each frame is actually shown 2 or 3 times in a row before advancing to the next frame - otherwise, people would notice the flickering.




Depends on how it's displayed. If it's a film projector, it's 24 FPS *period*.


Film projector shutters run at 48 or 72 Hz, because with a 24 Hz shutter the audience would notice flickering. Since the film "samples" at 24 FPS, each frame is shown 2 or 3 times in sequence, just as I said. The image changes at 24 Hz, but it flickers on and off at 48 or 72 Hz.

This is also why the optical audio track on film is not printed in sync with the picture (i.e. the audio for a frame is not adjacent to the frame itself). The shutter mechanism requires the film to advance and hold, while the audio system requires the film to move continuously; thus the optical audio is offset from the picture to allow for the necessary slack between the two systems.

Quote:
And the problem with LCDs and a low refresh rate is the exact opposite of flickering: LCDs have to use at least a 60 Hz refresh rate to reduce ghosting, not because they would flicker.


I never mentioned LCDs.

That said, ghosting is an artifact caused by the LCD having too slow a response time relative to its refresh rate, not merely by a low refresh rate. In fact, for a given response time, increasing the frame rate actually makes ghosting MORE apparent - an LCD with a 10 ms response would look pretty good at 60 fps, but if you were able to run it at 120 or, god forbid, 240 fps, it would ghost like crazy.

This is the main reason why LCD monitors didn't become widespread until response times had dropped below 20 ms (since with a 60 Hz frame rate, the image changes every 16.7 ms), and also why the biggest complaint about the original GameBoy was its blurry screen - the GameBoy refreshes the screen at around 60 Hz (the exact value depends on the model, but it's generally in the range of 59 to 61 Hz), but the screen's response time was significantly longer than 17 ms.
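
To put rough numbers on that relationship:

```python
RESPONSE_MS = 10                  # the hypothetical panel from the example above
for hz in (60, 120, 240):
    frame_ms = 1000 / hz
    verdict = ("settles in time" if RESPONSE_MS <= frame_ms
               else "still transitioning -> ghosts")
    print(f"{hz} Hz: new frame every {frame_ms:.1f} ms -> {verdict}")
# 60 Hz: 16.7 ms (fine), 120 Hz: 8.3 ms (ghosts), 240 Hz: 4.2 ms (ghosts badly)
```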

Active-matrix LCD monitors do need to be refreshed periodically, but this has nothing to do with ghosting - TFT cells gradually lose charge over time, much like DRAM. The required refresh rate for active-matrix LCDs is not particularly high, however, because TFT cells can retain charge for several minutes or longer - and this long charge retention time is why LCDs don't flicker, even at ludicrously low frame rates. (Passive-matrix LCDs are a slightly different story - they don't flicker because the LCD controller is directly supplying current to each cell.)

Also, the backlight typically runs in excess of 100 Hz, far above the flicker fusion threshold, so it doesn't cause visible flicker - and backlights that are always on by definition don't flicker at all.

Quote:
You are thinking of CRT displays, which would flicker if the refresh rate was too low due to phosphor decay.


As an aside, this is actually the reason interlacing was invented - when broadcast standards were being finalized in the US, they had decided on a frame rate of 30 Hz, which is high enough that motion appears continuous, at a vertical resolution of 525 lines per frame. (30 frames/sec was also chosen because it was compatible with North America's 60 Hz AC electrical standard.) Unfortunately, the persistence time of CRT phosphors was so short and the eye so sensitive (TV CRT phosphors persist for only a few ms, and their images persist on the retina for about 10-20 ms after that) that the upper part of each frame had already faded from vision well before the frame was done, and thus there was noticeable flickering.

So they split each frame into two fields of 262.5 scanlines each, containing odd and even numbered scanlines respectively. This meant that the image of one field appeared before the previous field had faded from the retina, and thus there was no more apparent flicker.
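
The split itself is easy to picture - the scanlines of each frame are simply dealt out alternately (a sketch; actual scanline numbering conventions vary):

```python
frame = list(range(525))        # scanline numbers of one full frame
field_1 = frame[0::2]           # 263 lines: 0, 2, 4, ...
field_2 = frame[1::2]           # 262 lines: 1, 3, 5, ...
# Each field is drawn in 1/60 s, so the screen refreshes 60 times a second
# even though a complete frame only arrives 30 times a second.
```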

Of course, even though motion appears smooth at 30 Hz, it appears even smoother at 60 Hz; and because a non-interlaced signal is slightly easier to generate than an interlaced one, video game consoles produced a non-interlaced 60 Hz image rather than an interlaced 30 Hz one. Because of this, the even-numbered scanlines on the CRT were never actually drawn (even though consoles from the SNES era on were capable of generating an interlaced signal, it was rarely used), leaving thin black lines between each row of pixels and causing the stereotypical console "scanline" look.

Also, since going progressive scan meant dropping the extra half scanline per frame (262 lines/frame, as opposed to 262.5 lines/field), video game consoles actually had a screen refresh slightly higher than 60 Hz (roughly 60.05 Hz, compared to 59.94 Hz for regular color TV, which is itself slightly lower than the 60 Hz refresh of B&W TV, for other reasons I don't feel like going into).

Of course, the choice of a 30 Hz frame rate for TV greatly complicated the conversion of film for broadcast, since there are 1.25 TV frames for every frame of film. In Europe, the TV frame rate was instead 25 Hz (chosen because it was compatible with their 50 Hz electrical systems), so film conversion was, somewhat counterintuitively, simpler - in Europe they simply show 24 FPS film at 25 FPS on TV, on the grounds that the 4% speed difference is hardly noticeable.
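
For the curious, the standard NTSC workaround became "3:2 pulldown": film frames alternately occupy 2 and 3 interlaced fields, so 4 film frames (1/6 of a second at 24 FPS) fill exactly 10 fields (1/6 of a second at 60 fields/s). A sketch:

```python
def three_two_pulldown(film_frames):
    """Map 24 fps film frames onto 60 Hz fields: alternate 2 and 3 fields per frame."""
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (2 if i % 2 == 0 else 3))
    return fields

print(three_two_pulldown(["A", "B", "C", "D"]))
# ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D'] - 4 frames become 10 fields
```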
#17 Aug 22 2010 at 9:08 AM Rating: Decent
Edited by bsphil
******
21,739 posts
Cowgomoo wrote:
A cool website that I stumbled upon a while back has some basic visuals to show the difference between 15, 30, and 60 fps:

FPS comparison
Nice link. The reason TV/film is shot at a low framerate is bandwidth, though, not "we need to draw people in!" It's the same reason TV stayed interlaced until just recently - it drastically cuts down on the amount of bandwidth needed to properly display a picture.
____________________________
His Excellency Aethien wrote:
Almalieque wrote:
If no one debated with me, then I wouldn't post here anymore.
Take the hint guys, please take the hint.
gbaji wrote:
I'm not getting my news from anywhere Joph.
#18 Aug 23 2010 at 9:36 AM Rating: Good
***
2,535 posts
bsphil wrote:
Cowgomoo wrote:
A cool website that I stumbled upon a while back has some basic visuals to show the difference between 15, 30, and 60 fps:

FPS comparison
Nice link. The reason TV/film is shot at a low framerate is bandwidth, though, not "we need to draw people in!" It's the same reason TV stayed interlaced until just recently - it drastically cuts down on the amount of bandwidth needed to properly display a picture.


True, but remember that statement came from a cinematographer, and they, like most artists, tend to view the way they do things as justified for artistic (and often elitist) reasons, regardless of the truth.

Which in this case is, of course, inertia - movies are filmed at 24 FPS because they've been filmed that way for nearly a century, there is now an enormous infrastructure supporting the making and showing of 24 FPS movies, and decades of seeing 24 FPS movies at the theater have conditioned us to expect 24 FPS motion when we go to the movies.