How many FPS can the Human Eye Notice?

Jacen0505

New Member
#21
Hehe, I love Tribes! Anyway, open up the console with ` (next to the 1 key) and type
showfps();
to see your fps in Tribes.
Hehe, I'm online as [xx]Jacen5, maybe I'll frag you sometime.
 
PeODB

New Member
#22
Well, if the game is constantly at 30 fps you won't see a difference, but if it runs at 60 fps and then drops to 30 fps you will notice it for the first few seconds, and then you won't.
 
Freon

New Member
#23
If I see one more person say you can't see beyond 30 fps I'm going to go psycho. It is simply not true. People, get your heads out of your asses and stop spouting such absurd things. If you believe that 30 fps is the limit, God help you, because you're dumber than a doorknob. <this is where I slam my head against a hard concrete wall in sheer amazement at the general stupidity in this forum>

3dfx's motion blur only blurs objects, not the entire screen. When you turn your character's head in a game, the world will stutter like it always does. Moving objects will appear to blur. A far cry from simulating a real shutter.

Watching a movie with real motion blur, the limit MAY be around 60 fps. Things like this vary not only from person to person, but also with how hard a subject concentrates, and with whether it's an A/B/X or a side-by-side comparison.
But in a game where you do 180-degree turns in a fraction of a second with no blur (besides ghosting of the phosphors in the monitor), I'm positive the average human can see the difference between 60 and 100 fps (assuming, of course, the monitor's refresh rate is set to 100+ Hz).
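To put rough numbers on the turn example above, here's a back-of-the-envelope sketch (plain Python; the quarter-second turn duration is an assumption for illustration) of how far the view jumps between consecutive frames at different frame rates:

```python
# Back-of-the-envelope: angular jump per displayed frame during a fast turn.
# A bigger jump per frame means more visible stutter when there is no blur.
turn_degrees = 180.0   # a fast 180-degree flick
turn_seconds = 0.25    # assumed duration of the turn (illustrative value)

for fps in (30, 60, 100):
    frames_in_turn = fps * turn_seconds
    degrees_per_frame = turn_degrees / frames_in_turn
    print(f"{fps:3d} fps: {frames_in_turn:5.1f} frames in the turn, "
          f"{degrees_per_frame:5.1f} degrees per frame")
```

At these toy numbers the per-frame jump shrinks from 24 degrees at 30 fps to about 7 degrees at 100 fps, which is the gap Freon is arguing people can see.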

coolguy867: Real life is analog. This is not the Matrix.

Some light is given off discretely, though. Wave your hand right in front of a TV or your computer monitor: it will look like your hand is blinking at whatever rate your monitor is set to (60 Hz for a TV). Now go outside on a sunny day and wave your hand around. It will just look blurry.

I'm sure there is some blur associated with the time it takes for a cell in your retina to register changes in brightness and color. If you've ever looked at one of those inverse American flag optical illusions, you can see that there is a delay of sorts. Just remember that anything to do with the eye is analog. There is no hard-set limit, no hard on/off. Your eye doesn't sample at a fixed rate; it is a continuous signal. It is possible there is a rise and fall time for changes in that signal.
It's the same argument as vinyl records vs. CDs. A vinyl record is NOT samples, but a continuous signal, the same way your ear receives it. A CD, even though it samples very fast, is NOT a continuous signal, but a digital, blocky version of the sound. Vinyl is like an infinite-kHz CD. (IMHO, vinyl sucks because it scratches and is not very portable, and a CD samples fast enough that it is almost impossible to tell the difference.) In the same way, real light in the real world is infinite frames per second. The soft limit is how accurate and how well trained your ear or eye is.
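The "samples fast enough" point can be checked with the Nyquist rule: a sampled signal can represent frequencies up to half its sample rate. A quick sketch, in plain Python, using the standard published figures for CD audio and human hearing:

```python
# Nyquist: a signal sampled at rate fs can represent frequencies up to fs / 2.
cd_sample_rate_hz = 44_100        # Red Book CD audio sample rate
human_hearing_limit_hz = 20_000   # rough upper limit of human hearing

nyquist_limit_hz = cd_sample_rate_hz / 2
print(f"CD Nyquist limit: {nyquist_limit_hz:.0f} Hz")
print("Covers human hearing:", nyquist_limit_hz > human_hearing_limit_hz)
```

The sampled signal has headroom above the ear's limit, which is one way to read the post's claim that CD sampling is "almost impossible to tell the difference" from continuous.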

My personal opinion is that anything past 100 fps is very hard to distinguish. 30 fps isn't even close, and it gives me headaches.


------------------
BP6, C-366/550 (x2 some day), 320meg (again)
CL TNT2U 175/220
 
Necessary Evil

New Member
#24
I run QIII on a crappy K6-2/500 and a Voodoo3 2000 PCI, watching real-time fps. It ranges from 18 to 40 or so. At 18 I can sort of tell... but it doesn't stay at 18 for more than a split second, then it's back to 25 or so. After a few hours I do notice some fatigue... maybe that's the difference.

With lower fps you don't notice it directly... but you find yourself fatigued: tired, irritable, maybe even with a headache?
 
Salmoneus

New Member
#25
I'm with Freon!

I can't believe you guys think that 30 fps is good for a game like a first-person shooter! Don't believe that shit. Above 50 is acceptable, and you would need a much higher average to never drop below 50.

Take a good look at a movie. A wide camera pan looks choppy enough, even with its natural motion blur.

Wake up! It takes a 50 average, 30 minimum, to even SLIGHTLY enjoy a game.


/Salmoneus
 
Izomorph

New Member
#26
Refresh rate has nothing to do with FPS. You really can't match a monitor's Hz with a video card's FPS; they're totally different measures. Our eyes can only notice up to 30 FPS, and that's on the keen side.
 
Moraelin

New Member
#27
As Izomorph said, refresh rate is totally different from frame rate in games. A monitor at 60 Hz flickers, and that has nothing to do with motion; it's because each pixel is lit for a very short time and then quickly decays back to black. Briefly, each pixel is sometimes much brighter and sometimes much darker than the average, and that tends to annoy your eyes, which can only adjust to the overall average. That's why you can see flickering on a monitor at 60 Hz non-interlaced, but not on a TV at 50 Hz interlaced (effectively 25 Hz refresh): the TV's phosphor has much higher latency.

On the other hand, you can set your monitor to 85 Hz or even higher, so the image is stable, even if your game does 20 fps.

For the record, the first movies were made at 16 fps, which shows that the brain already starts interpolating movement at that rate. And the later transition to 24 fps (not 60 or 75) also had a lot to do with flickering, not just with smoothness.

As for motion blur, excuse me, when was the last time you saw motion blur IN REAL LIFE? So how is a movie's blur supposed to help? The motion blur in movies is more of a side effect than something that makes them cool. Also note that unless we're talking car races, there is not that much blur in movies, either. When you see someone walk or even run in a movie, they are not blurred at all. When you film something, you generally set the shutter to minimize blur so the image looks crisp, not to throw in more. (Except in some very special effects, like time warps, flashbacks, etc., which are artificially and heavily blurred by combining several frames into one.)

Ditto for the background being out of focus. It's more a limitation of physical cameras than something neat. It doesn't make things more realistic. It's something that's _acceptable_ as long as I focus on the same thing as the camera and pay no attention to the background. But if I, say, try focusing on some other detail, all of a sudden it feels very different from what I'd experience In Real Life. In Real Life, my eyes would adjust to that detail, and I'd see it crisp and clear, not fuzzy and out of focus.

So, 3DFX marketing BS notwithstanding, why would I want those in a game? Movie producers would probably sell their souls to get rid of those limitations of real cameras. Why would you want those problems added to a medium that luckily doesn't have them?

------------------
Moraelin -- the proud member of the Idiots' Guild

[This message has been edited by Moraelin (edited 05-15-2000).]
 
BladeRunner

Silent & Cool.....
#28
I don't think you're understanding the facts of that explanation. Forget motion blur. What it's trying to say is that a moving object is caught every 24th of a second, so if it were a ball, it could have moved while the camera's shutter was open, and it will appear oval, or "blurred". This creates a smoothness that you will not get from a computer rendering sharp images, unless you have a high enough fps (on the order of the speed of the ball's movement) so as not to require the blurring effect.

3dfx is just putting in artificial effects to simulate this, together with depth of field, to make the whole experience more "real".
 
FS

New Member
#29
If I were less politically correct I'd say exactly what Freon said. 30fps is not even close to the limit of what the eye/brain can notice.

Also, no movie director would want to lose the motion blur. 24 fps is really very little, and extensive post-production must be done before any movie appears smooth; even the motion blur alone isn't enough to compensate for the low framerate. And high-speed-shutter cameras are nothing new: any film director could get one if he wanted to for some reason.

BTW, NTSC TV is actually 60 fields per second; it's just interlaced to save bandwidth and make the first TVs possible. Try watching a TV show at 60 fields per second, then convert it to 30 fields per second, and tell me you don't see the difference. I'll get you a good doctor if you can't.
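For what it's worth, the field/frame relationship described above can be sketched like this (plain Python; the tiny 8-line "frame" is made up for illustration): interlacing splits each full frame into two half-resolution fields, so 60 fields per second cost roughly the bandwidth of 30 full frames.

```python
# Toy interlacing: split one progressive frame into even and odd fields.
# NTSC transmits ~60 fields per second; each field carries half the lines,
# so the line rate matches a ~30 fps progressive signal.
frame = [f"line{i}" for i in range(8)]  # one full progressive frame

even_field = frame[0::2]  # lines 0, 2, 4, 6 (first field sent)
odd_field = frame[1::2]   # lines 1, 3, 5, 7 (second field sent)

print(even_field)
print(odd_field)
```

Each field updates the screen, which is why 60 fields per second looks smoother than the same material collapsed to 30, even though no lines are lost overall.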
 
JM

New Member
#30
Well, as I've heard it, the eye can see 25-30 fps in real life, but I think that rule bends on computers. Television, for example, is a steady 30 fps and we see it in fluid motion, yet I can see the difference at 30 fps in games. Average fps is no good, because you won't be certain that you'll be at that mark all the time: when a firefight is going on, it drops waaay down. For me, anything above 40 is good enough. I really don't see why I should get 100+ fps, since I hardly ever notice it. You'd have to be a bug (a fly, perhaps) to see every frame at 100+ fps.
 
Moraelin

New Member
#32
Don't get me wrong, I'm not against 3DFX making these effects available. As I've said, for some well defined situations, hell, you may well need motion blur as a special effect. But I seriously doubt I'd want it applied wholesale to a whole game.

Let's face it, people: if I wanted my graphics blurred and out of focus, I'd still be playing on an ancient composite colour monitor. Or on an old LCD screen, from the DSTN era or earlier, whose latency blurred anything that moved anyway, including the mouse cursor. But I'm playing on a relatively expensive 17" monitor with a small dot pitch, in 1024x768, 32-bit colour.

All these years, the whole computer industry has struggled for sharper graphics. Now we're told that's all wrong, and I'd need some expensive card to make my image blurred and out of focus again. Why? Is it just me who sees something wrong with this picture?

Again, some 3DFX features I agree with. That FSAA looks great. And I'm happy even with the idea of motion blur if it's used very sparingly, in situations where it's actually needed, as a special effect. But that I'd need all my games blurred... sorry, folks, I just don't buy it. To me it's just marketing BS.

------------------
Moraelin -- the proud member of the Idiots' Guild
 
FS

New Member
#33
It's not that the picture itself should be blurred. On the contrary, you should try to get it as sharp as possible. It's only the motion that needs to be blurred, if the fps isn't high enough for your eyes themselves to do the blurring. And not too much motion blur, either: just enough to fill in the gaps between individual frames (i.e., about 1/24 of a second for 24 fps).
 
mobydisk

New Member
#34
I can hardly believe I am participating in this flame bait. Freon, FS, and BladeRunner are correct, not by opinion but by fact.

The human brain does not even see in FPS, so the whole discussion is asking the wrong question. The human brain EXPECTS motion blur; it requires it. Motion blur is not a "special effect" but a consequence of how we perceive motion. "Frames" are a discrete sample, and they are the best current technology can do to simulate images.

As a game player, 30-60 FPS is way too low. My brain tells me this by giving me a headache after 15-60 minutes. Just sit still in Q3A and turn left/right. At 60 fps, if the turn takes 1/10 of a second, that is 6 frames in the turn: not enough to tell when to stop turning and fire accurately. We must interpolate, and that is hard to do without the blur.

Perhaps we can do what 3DFX wants, but not with today's technology. We would have to be rendering at 60*5 FPS and then blurring 5 frames at a time to get real motion blur. Only then will things look "real" and touch on what our vision is capable of.
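That render-fast-then-average idea can be sketched in a few lines (plain Python; the 1-D "screen", object size, and sub-frame count are all made-up toy values): render several sharp sub-frames per displayed frame and average them, so a fast-moving object leaves a smear instead of a hard-edged jump.

```python
# Toy accumulation-buffer motion blur on a 1-D "screen".
WIDTH = 12

def render(position):
    """One sharp sub-frame: a 1-pixel object at an integer position."""
    frame = [0.0] * WIDTH
    frame[position % WIDTH] = 1.0
    return frame

def blurred_frame(positions):
    """Average several sub-frames into one displayed frame."""
    acc = [0.0] * WIDTH
    for p in positions:
        for i, v in enumerate(render(p)):
            acc[i] += v / len(positions)
    return acc

# Object moves one cell per sub-frame across the displayed frame:
# e.g. 5 sub-frames per display frame, as in the 60*5 suggestion above.
print(blurred_frame([3, 4, 5, 6, 7]))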

------------------
The Moby Disk
http://mobydisk.com
 
Moraelin

New Member
#35
Mobydisk, do you at least realize the monstrosity of what you're saying? To assume that 60 fps is what's limiting your turning and aiming is to assume you could react in less than 1/60 of a second.

THINK about it. Your brain has not only to assess the situation, but also to transmit the information all the way to your arm, hand and fingers, via relatively slow nerves. (No, you're not running on copper wire, nor on optic fiber.) Said muscles have to actually contract, which is a relatively slow process, and you have the mass of the whole arm and hand to slow you down.

Sorry, THAT I really don't buy. Saying that your eyes can even tell the difference at that speed is one thing; who knows, maybe your eyes really are a miracle. But claiming that the whole eye-to-brain-to-hand process can happen at THAT kind of speed is just ludicrous.

To give you an example, the common fly may well be the fastest-reacting creature, needing only approx. 1/10 of a second to change direction in flight. For that it has eyes with approx. 1/100 sec time resolution, and a sort of bypass mechanism that connects straight to the flight control sensors, which in turn have a direct (and very short) link to the wing muscles. It also has only 1-2 grams to move around, unlike your arm's weight. Now you're telling me you're at least 6 times faster than that? Gee, now THAT would be a miracle.


------------------
Moraelin -- the proud member of the Idiots' Guild
 
RoadWarrior

New Member
#36
Moraelin,

I don't know where you find "common" flies that weigh 1-2 grams, but please keep them to yourself.

Thank you.
 
RoadWarrior

New Member
#37
Ok, everybody do this test:

Run down to your favorite scientific supply company and get a really expensive strobe light.

Got it? Good, now set it up as the only light source in a room that is completely dark otherwise. I mean totally dark.


Now go in the room, and set the strobe for 30 flashes per second...pretty annoying isn't it?

Now slowly increase to 60 flashes per second. That's a little better now, isn't it?


All right then, stay in the bloody room and play with your strobe, and let the rest of us get back to our gaming!!!


Thank you.
 
Moraelin

New Member
#38
Umm... RoadWarrior, if you'll read this thread, you will see that the issue of screen refresh rate versus game frames per second has already been addressed. Your strobe light example is an example of flickering, not of motion smoothness, and flickering has to do with screen refresh rate, not with the game's FPS. It's a good example, but of something completely different from what was being discussed. 'Nuff said.

------------------
Moraelin -- the proud member of the Idiots' Guild
 
RoadWarrior

New Member
#39
You're right, I should have read more of the thread, but I've been out benchmarking flies.

Without getting into a lot of technical mumbo-jumbo about methodologies, or that nasty "bathroom scale" debacle, I'd have to report that my local varieties seem to average under a tenth of a gram.

The only thing I've found anywhere near a gram is a rather large bumblebee. I've named him Eric, and he is currently sipping sugar water, here on the bench.

Does anyone know where I can get a Bee License?
 
BladeRunner

Silent & Cool.....
#40
RoadWarrior

LMSAO
Thanks for lightening up a discussion that was heading into the usual tedium.


It's not that the picture itself should be blurred. On the contrary, you should try to get it as sharp as possible. It's only the motion that needs to be blurred
originally posted by FS

.....sums it up very well, I think. This feature will hopefully remove the jerkiness of fast-moving objects without blurring everything. You also must realise that this is a "new" graphics feature and will probably need time to develop to its best, in the same way the basic forms of anti-aliasing have led to FSAA.



[This message has been edited by BladeRunner (edited 05-15-2000).]
 
