Pretty much anything over 30 fps is interpreted as fluid motion... so really anything over that is icing. Any frame rate over the refresh rate of your monitor is wasted (because the extra frames never get displayed), e.g. 70 fps on a monitor refreshing at 60 Hz.
Keep in mind also that graphics cards are shooting for higher and higher framerates, so that they can sustain 30 fps under the most demanding operating conditions.
No one is expected to actually 'see' the result, except through improved stability of program performance.
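To put a number on that "wasted" overhead, here is a quick back-of-the-envelope sketch in Python (the 70 fps / 60 Hz figures are just the example from above; the function names are made up for illustration):

```python
# A monitor can only show as many unique frames per second as its
# refresh rate; anything the card renders beyond that is never seen.
def displayed_fps(render_fps, refresh_hz):
    return min(render_fps, refresh_hz)

def wasted_frames_per_second(render_fps, refresh_hz):
    return max(0, render_fps - refresh_hz)

print(displayed_fps(70, 60))             # 60 frames actually shown
print(wasted_frames_per_second(70, 60))  # 10 frames never displayed
```

The extra rendering capacity isn't useless, though, which is the point of the post above: it's headroom for when the scene gets demanding.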
Although 30 fps is "fluid", 50 fps seems to be the limit of the eye's perception. Anything over 50 fps is virtually impossible to notice. Some young children might be able to see 55 or in some cases 60, but by the time you're 5 or 10 years old, your eyes have grown accustomed to seeing life happen in "real time", not in short bursts of intense images. Therefore it is nearly impossible to distinguish "real life" from anything over 50 fps.
A friend of mine's father is an optician and he was glad to school me on this. It is kind of complicated, so I've tried to simplify it here.
Like Rifter said, movies are 24 FPS; additionally, NTSC television (don't remember PAL) is 30 FPS. Though not a scientific method, this should give you an idea of what is perceptibly smooth, as neither movies nor television appear to jerk at all.
24 fps is smooth enough for general use. One of my friends insists that he can tell the difference between 30 fps and 50 fps; maybe it's just his imagination, but it is plausible - you can see your monitor flickering if it's at 60 Hz, so why not? That said, the whole screen flickering is much more noticeable than an infinitesimal stutter of a game running at "only" 35 fps.
Thanks folks! That puts the whole thing a bit more in perspective
(only a little pun intended)
I guess I will follow up with another question though... How do you find out what the FPS IS for a given game? For instance, I have Tribes (Great game!) and I was wondering how I could find out how many FPS I was getting. Any help?
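For reference, whatever tool or in-game counter you end up using, measuring fps boils down to counting completed frames over a time window. A minimal sketch, assuming a hypothetical `render_frame` callable standing in for the game's draw loop:

```python
import time

def measure_fps(render_frame, sample_seconds=0.5):
    """Count how many frames render_frame() completes in a sampling window."""
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < sample_seconds:
        render_frame()  # in a real game this would draw one frame
        frames += 1
    elapsed = time.perf_counter() - start
    return frames / elapsed

# Simulate a game whose frames take about 20 ms each (around 50 fps):
fps = measure_fps(lambda: time.sleep(0.020))
print(round(fps, 1))
```

Real games and overlay utilities do the same thing, usually updating the displayed number once per second so it's readable.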
There is a good reason why 60 fps and above is the present ideal, and it has to do with motion blur. In real life and on film, motion blur is captured naturally, so a lower fps is OK. At present, games don't have this, and the edges of moving objects are still solid.
The increased frame rate helps smooth things out and makes up for the lack of motion blur.
The new Voodoo cards will have an early implementation of this. Hopefully, with further improvements, this will be the way to photo-real graphics, which the industry still holds as its main goal.
I don't think motion blur has anything to do with it, except make the image look less sharp.
AFAIK anything above 30 fps is basically wasted. You _may_ (or may not) tell the difference between 30 fps and 50 fps if you watch two animations side by side, one at 30 and one at 50. But if you see one today and one tomorrow, even 20 fps will probably look just as smooth.
Moraelin -- the proud member of the Idiots' Guild
Just keep in mind that the 30 fps is only valid when the rate is constant - like in a movie theater. The problem with gaming is those annoying peaks and valleys in the frame rates. You can be looking smooth at 30 fps and then suddenly it drops to 10 fps and - stutter, stutter. So, for computer gaming you need some overhead. If I had to guess (but I don't really), I'd say that 60 fps would be a comfortable number to shoot for.
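The stutter in that scenario is easier to see in frame times than in fps. A quick worked example, using just the 30 fps / 10 fps numbers from the post above:

```python
# At a steady rate, each frame sits on screen for 1000/fps milliseconds.
def frame_time_ms(fps):
    return 1000.0 / fps

print(frame_time_ms(30))  # about 33.3 ms per frame when smooth
print(frame_time_ms(10))  # 100.0 ms per frame during the dip
# The dip triples how long a single image stays on screen -- that is
# the visible "stutter, stutter", even if the average fps looks fine.
```

This is why overhead matters: a game averaging 60 fps that occasionally dips to 30 still feels smooth, while one averaging 30 that dips to 10 does not.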
Keep in mind that the demos from, e.g., Q3A are just standardized runs, and the final result is an average across the whole demo. The actual speed may differ depending on what is happening when you play. This means that when there are more things on the screen, like more players or moving objects, the fps will surely slow down to an extent.
To keep the game smooth most of the time, I recommend 50 fps on both demos, since that leaves room for the more intensive scenes.
I will say anything that doesn't drop below 30 fps will do a good job.
By the way, on refresh rate: the higher the better, as a low refresh rate will tire your eyes really fast. And you can tell the difference between 60 Hz and 72 Hz; if you think you can't see it, don't look at the monitor directly, look at it from one side... (and see). Anyway, a refresh rate above 72 Hz will do a better job, but you will not be able to tell the difference.
Based on this, I will say the top fps the eye can see is more than 60 but less than 75.
OK, this is a 3dfx explanation because I can't find a better one at present. Forget the 3dfx sales pitch and read the bold part, as that contains the important facts.
Motion blur can both remove the jerkiness from a computer-generated animation and create the illusion of enhanced speed and motion. Have you ever noticed that there is no jerkiness in a movie, but there is plenty of jerkiness in a computer animation? Computer animation and movies are created in completely different ways. When shooting a movie, the camera's shutter opens, stays open for 1/24th of a second, then closes instantaneously before opening again to capture the next frame. All of the motion that happens in that 24th of a second is captured on the film. In fact, if you were to look at fast-moving objects in a single frame of a movie film you'd see that they are actually blurred because they are moving while the shutter is open. The result when you string together a series of such frames is the appearance of very smooth, continuous motion from frame to frame.
Now with regard to creating computer animation, suppose you're running at a frame rate of 60 frames per second (fps). A single "frame" of computer animation is displayed on the computer screen, held there for 1/60th of a second, then instantaneously the next frame of animation, which occurs 1/60th of a second in time later, is displayed. In virtually all of today's computer-generated animation, the objects in each frame are sharply rendered, motionless in the frame. When the next frame is displayed and an object, such as a vehicle or ball, has moved in the 60th of a second that has elapsed between the two frames, the object suddenly appears in a new location. Our eyes are so sensitive to motion that no matter how high the frame rate, even 100fps (frames per second), most people will notice the jerky motion.
3dfx's motion blur feature simulates an object's motion during the period of time that each frame is displayed on the screen. Moving objects are blurred, just as they are in real film, to enable very smooth and continuous motion. But we can do even more! By exaggerating an object's motion blur, T-Buffer can create the illusion of tremendous speed and make a scene much more visually appealing. To learn more about the T-Buffer and motion blur, take a look at our
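The quoted description is the accumulation idea in a nutshell. This is not 3dfx's actual T-Buffer code, just a minimal sketch of the technique it describes: simulating a camera shutter by averaging several sub-frame samples of a moving object's position (the 1-D "scanline" scene and sample count are invented for illustration):

```python
def render(position, width=20):
    """Render a 1-pixel object on a 1-D scanline as a list of intensities."""
    line = [0.0] * width
    i = int(position)
    if 0 <= i < width:
        line[i] = 1.0
    return line

def render_with_motion_blur(pos_start, pos_end, samples=8, width=20):
    """Average several renders taken across the frame's 'shutter' interval."""
    accum = [0.0] * width
    for s in range(samples):
        t = s / (samples - 1)                     # 0..1 across the shutter
        pos = pos_start + t * (pos_end - pos_start)
        for i, v in enumerate(render(pos, width)):
            accum[i] += v / samples               # accumulate and average
    return accum

sharp = render(5)                        # object frozen sharply at one spot
blurred = render_with_motion_blur(5, 9)  # object smeared from x=5 to x=9
print([round(v, 2) for v in blurred])
```

The sharp render puts all the intensity in one pixel; the blurred one spreads the same total intensity along the object's path, which is exactly the streak a film camera records while its shutter is open.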