How many FPS can the Human Eye Notice?

rh71

LuRkeR ExtRaoRdinaiRe
#41
Movies at 24fps, eh? Some movies... like when I was watching 'My Best Friend's Wedding' the other day, when they panned across and followed an actor, it seemed more 'floaty' than usual. It seemed like every frame was there as it moved across the room... (not that I can tell, truthfully)... Being something out of the ordinary, I just assumed it was bad filming. I guess it's actually good-quality filming, after reading this thread?
 
Darkterritory3

New Member
#42
In real life I hear the eye sees the equivalent of about 128 fps in a game!! 128 is the total for both eyes!! Maybe I'm wrong, but that's what I heard. You will never be able to see a game faster than 128 fps!!
 
Freon

New Member
#43
Izomorph: Actually, refresh rate and frame rate used to be locked together back in the early Voodoo 2 days, before drivers let you disable vsync.
Why was the "vsync disable" option added? To increase frame rate! Voodoo 2 cards with vsync on were basically stuck at about 50 fps.
Vsync causes performance to trail off near the refresh rate. I.e., if your refresh rate is set to 60hz, it is very hard for your video card to display 60 fps. The closer it gets to 60 fps internally, the more performance it loses purely due to vsync. If it doesn't get 100% done writing a frame in that 1/60th of a second, it has to wait until the next refresh to finish up and display it, leaving a lot of dead processor time. Syncing the timing of devices is one of the greatest challenges in computer architecture from a broad perspective, but I digress...
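Here's a rough sketch of what I mean, in Python (just an illustration; it assumes a simple double-buffered setup where a frame that misses a refresh has to wait for the next one, and the 14-20 ms render times are made up):

import random

def average_fps(render_times_ms, refresh_hz=60.0, vsync=True):
    # With vsync on, a frame that misses a refresh deadline sits and waits
    # for the next refresh, so the card burns "dead time" doing nothing.
    period = 1000.0 / refresh_hz
    total = 0.0
    for t in render_times_ms:
        if vsync:
            # round the frame time up to the next refresh boundary
            total += (int(t // period) + 1) * period
        else:
            total += t
    return 1000.0 * len(render_times_ms) / total

# frames that each take 14-20 ms to render (right around the 16.7 ms budget)
frames = [random.uniform(14.0, 20.0) for _ in range(10000)]
print(average_fps(frames, vsync=False))  # roughly 59 fps without vsync
print(average_fps(frames, vsync=True))   # noticeably lower with vsync on

The exact numbers don't matter; the point is that the closer your render times hover around the refresh period, the more frames get bumped a whole refresh later.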
Try running a GeForce vsync-locked at 60hz (and consequently 60 fps). It will STILL look choppy. If you set your refresh rate to a gazillion hz, 60 fps will still be identifiable. Nice try, Moraelin.
Discrete does not equal (and never will equal) realistic or transparent.

The problem you don't see is that even when refresh rate = frame rate, a computer still displays DISCRETE images. The only savior it has is that if you get the refresh rate and frame rate high enough, the ghosting and fall time of the phosphors on the monitor themselves could create enough blurring to fool you.
30 fps is NOT enough to fool the eye, and at 60hz on a computer CRT the fall time of the phosphors is (thank god) not that long either. If you've ever played using TV-out, you can actually see how the fall time of the phosphors blurs the image over the time domain.

Ok, next topic.
I guess the whole point of the T-Buffer is "Hollywood on the desktop." In real life your eye can only focus on one depth plane at a time (good luck trying to tell what the user is TRYING to focus on *snicker*). A sniper rifle's scope has trouble focusing at under 25m, etc. It's for immersion. I'm not totally sold on it either. I guess it would be kinda cool if, in Counter-Strike, people looked blurred when they were close and you were zoomed in two clicks with a sniper rifle. Eh?

Ok, back to refresh rate, fps, blah blah blah.
Moraelin: Wave your hand in front of your face in natural light. It's called motion blur. Your retina and the brain attached to it are analog devices and have their own latency blur. Try it, it works. Now try waving your hand in front of a bright computer monitor. It doesn't look natural. Turn your monitor's refresh rate up to 200hz and it STILL won't look natural. That is a damned close simulation of 200 fps vs. real life. This example bypasses your inane refresh rate argument, because here refresh rate = frame rate.
Also, do you have any idea what the fall time of the phosphors on a CRT is? They stay lit well PAST the next screen refresh, Captain Ignorant. At 60hz, the phosphors stay lit LONGER than 1/60th of a second. I suggest you go do some more reading on the subject. The heart of your argument says that a monitor's phosphors pop on and off BEFORE the next electron-gun pass (like a strobe light). It's not true! Haven't you ever waved your mouse cursor around on a black screen? Do I need to go quote technical specifications for you? Post some fall-off graphs?
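If you want to play with the numbers yourself, here's a toy sketch (the exponential-decay model and the persistence constants are my own assumptions for illustration; real phosphors have published decay curves):

import math

def remaining_brightness(elapsed_ms, persistence_ms):
    # Toy model: brightness decays exponentially after the electron gun passes.
    # Real phosphor decay curves are messier; this is just for intuition.
    return math.exp(-elapsed_ms / persistence_ms)

refresh_ms = 1000.0 / 60.0   # one refresh period at 60hz
for persistence_ms in (1.0, 5.0, 20.0):   # made-up persistence values
    left = remaining_brightness(refresh_ms, persistence_ms)
    print(f"persistence {persistence_ms:4.1f} ms -> {left:.1%} of the light left at the next refresh")

How strobe-like the display really is depends entirely on that persistence figure, which is exactly why the spec sheets matter.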

On a computer with discrete frames, whether vsync is enabled or not, whether refresh rate = frame rate or not, the human eye and brain can see the difference WELL past 30 fps, and even WELL past 60 fps. Given the right test and a fast enough moving object, probably past 200.

FS: Yes, I was going to try to be PC about it, but then I figured there was no way, because some people just have their heads crammed SO far up their asses that they'll never see the light. And I've done the single-field vs. interlaced test on my ATI AIW card (where I can change it back and forth on the fly). Big difference. Single field clearly looks choppy. Excellent example. And it has NOTHING to do with refresh rate.

A little more on TV, for those interested. Yes, TV is 60hz. Every other line is drawn 60 times per second. That equates to having the ENTIRE screen redrawn at only 30hz, true. But it is like running 60hz at half the resolution, NOT 30hz at full resolution.
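If the interlacing is hard to picture, here's a little sketch of which scanlines each field covers (line count shrunk way down; real NTSC has 525 lines and runs at roughly 59.94 fields per second):

lines = 10            # pretend the screen only has 10 scanlines
even_field = list(range(0, lines, 2))   # drawn on one 1/60th-second pass
odd_field  = list(range(1, lines, 2))   # drawn on the next 1/60th-second pass

print("field 1 (1/60 s):", even_field)   # [0, 2, 4, 6, 8]
print("field 2 (1/60 s):", odd_field)    # [1, 3, 5, 7, 9]
# Each field is only half the lines, but you get 60 of them every second;
# the FULL set of lines is only completed every 1/30th of a second.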
Motion pictures clearly do use motion blur. Frame-step a DVD or pause a VHS tape (best on a good 4- or 6-head VCR). Toy Story, A Bug's Life, etc. even use motion blur! But I'll bet you never noticed... You would have noticed if they didn't, though.

JM: "see 25-30 fps in real life " Real life is not descrete. It is 'inifite frames per second.' The only limit is the soft limit of the human eye and brain.

modydick: Ahh yes. Full-screen motion blur. Multiple per-pixel samples over the time domain rather than the space domain. Same performance problem as FSAA, but the geometry, physics, AI, etc. must also be calculated for all the in-between frames. Big ouch for performance. T&L anyone? T&L&AI&TR&P (transformation, lighting, artificial intelligence, translation, and physics)? LOL
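Roughly like this, as a Python toy (the 'renderer' and the sub-frame count are completely made up; it just shows the idea of sampling over the time domain):

def render_scene(t):
    # Stand-in for a real renderer: a bright 'object' moving across a strip
    # of 20 pixels at one pixel per unit of time.
    strip = [0.0] * 20
    strip[int(t) % 20] = 1.0
    return strip

def motion_blurred_frame(frame_start, frame_len, subframes=8):
    # Sample the scene several times inside the frame's time window and average.
    # Every sub-frame needs its own geometry/physics/AI update -- that's the cost.
    accum = [0.0] * 20
    for i in range(subframes):
        t = frame_start + frame_len * i / subframes
        for p, v in enumerate(render_scene(t)):
            accum[p] += v / subframes
    return accum

print(motion_blurred_frame(frame_start=0.0, frame_len=4.0))  # the object smears across 4 pixels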

------------------
BP6, C-366/550 (x2 some day), 320meg (again)
CL TNT2U 175/220
 
FS

New Member
#44
There is no way you can give the eye an fps figure, especially not one accurate to three figures. And 128 is a frickin' binary number!! (2^7) To the best of my knowledge, most humans don't operate on binary code.


That number might be a good estimate, though. Maybe somewhere in the 100-150 fps range things start to become completely indistinguishable from Real Life(tm).
 
FS

New Member
#45
Hey, I just figured out a good example:

Look at a spinning propeller on a helicopter or a prop plane. Do you see the prop as sharp as if it were standing still, just jumping to a different angle X times per second? Or do you just see a blur?

QED
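If you want to put numbers on it, here's a quick Python toy (the rpm and frame rate are just made up to show the size of the jumps):

# Toy numbers: a prop spinning at 2,400 rpm "filmed" at 60 discrete frames per second.
rpm = 2400
fps = 60
degrees_per_frame = (rpm / 60.0) * 360.0 / fps   # how far a blade turns between frames

for frame in range(5):
    angle = (frame * degrees_per_frame) % 360
    print(f"frame {frame}: blade at {angle:5.1f} degrees")
# A camera or a game shows those discrete positions (or strobe-like artifacts);
# your eye looking at the real thing just integrates it all into a blur.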
 
Freon

New Member
#46
FS: *Ding ding ding!* You win one chocolate chip cookie.





------------------
BP6, C-366/550 (x2 some day), 320meg (again)
CL TNT2U 175/220
 
RoadWarrior

New Member
#48
I'm still perceiving an area of gross stupidity here, so let me lay out a scenario, and you can have at it.

Assume that you are a human.

You are sitting in a chair, looking at a 21" monitor running at 1280X1024 resolution, with a refresh rate of 72Hz.

The monitor is displaying a game being played by someone else, in another room.

If the game is producing 72 FPS, what do you see?

If the game is producing 144 FPS, does it look any different?
 
Moraelin

New Member
#49
Freon, first of all, I'll skip over the "Captain Ignorant" and "head up the ass" wisecracks for now. Frankly, if THAT is your best technical argument... no comment. Here's a hint, though: actually knowing what you're talking about may help a helluva lot more than filling the space with cheap insults.

Second, the v-sync option can lower your frame rate to A LOT less than your monitor's frequency. As long as your video card can render ALL frames faster than 1/60th of a second (that's 16.66 ms), yes, you'll get 60 fps too. But then comes a frame which needs slightly longer... say, 1.7 ms more. V-sync will force it to wait for the next refresh, bringing the total to 33.3 ms.

Third, at 60 Hz vertical refresh what you will notice is flicker. Yes, it will be annoying to the eyes, but for reasons completely unrelated to the frame rate.

As for the phosphors staying lit, have you actually seen a graph of how the light level decays for those phosphors? Yes, they won't decay to pure black in 1/60th of a second, but they will decay more than enough to act like a strobe light.

------------------
Moraelin -- the proud member of the Idiots' Guild
 
OverclockingAddict

New Member
#50
I don't know; I can tell the difference between 40 fps and 70 fps. It's all relative. Like the people playing Q3 on a P233 and a Voodoo 1 4MB: they think their frame rates are good (15-25 fps), but God help them if they ever see the same game run on a P3 or Athlon with a GeForce... They will spend all the money in the world to get those extra fps.
 
triton2

New Member
#51
Ok, I'll try to explain it a little better (maybe)... The reason you can tell the difference between 30 and 50 frames is the intervals at which your brain sees the images versus the intervals at which the computer displays them. The human eye takes pictures at about 40-50 fps, then the brain puts them together, and you have video... get it? Ok, now let's say the computer is showing 30 fps. The FIRST frame starts at one point in time. NOW your brain takes a snapshot at a different time than the first frame started, so you actually see the second or third frame, OR maybe no frame at all... that's at 30 fps. Now at 50 fps there's a far greater chance of your eye catching an image, and the more images you see, the smoother it looks...

kinda like this...

COMPUTER SCREEN FLASHES AT THESE INTERVALS
|aba|aba|aba|aba|aba|aba|aba|aba|aba|


YOUR EYE SEES AT THESE INTERVALS
|ab|ab|ab|ab|ab|ab|ab|ab|ab|ab|ab|ab|

Now you put them together
|aba|aba|aba|aba|aba|aba|aba|aba|aba|
|ab|ab|ab|ab|ab|ab|ab|ab|ab|ab|ab|ab|

(the letters are just the latency time between the images)
You see where some lines meet (well, close enough) and some don't? So if this were a game running at 30 fps and your eye sees at 40 fps, you're not going to see EVERY image. THAT'S why you can tell a difference between 30 and 50.
There would be a better chance of the lines meeting up if there were more of them.
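Here's roughly the same idea as a quick Python sketch (the eye obviously isn't really a fixed-rate camera; the 40 "eye fps" is just the number I used above):

def distinct_images_seen(display_fps, eye_fps, duration_s=1):
    # Model the eye as taking snapshots eye_fps times per second; each snapshot
    # grabs whatever frame the display is showing at that instant. Count how
    # many DIFFERENT frames the snapshots catch. (Crude model, obviously.)
    seen = set()
    for j in range(duration_s * eye_fps):
        seen.add(j * display_fps // eye_fps)  # index of the frame on screen at snapshot j
    return len(seen)

print(distinct_images_seen(30, 40))   # 30 -- every displayed frame gets caught, but only 30 exist
print(distinct_images_seen(50, 40))   # 40 -- now every single snapshot catches a fresh image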


Kinda confusing, I know, but if you think about it, it does make sense...

Your eyes are constantly taking pictures...


[This message has been edited by triton2 (edited 05-17-2000).]
 
FS

New Member
#52
Actually, the rods in the human eye take about 0.05 seconds to refresh, so in effect it seems like the brain receives new info at about 20-25 fps. This does NOT mean that the brain will not see more than 20-25 fps.

What I said earlier holds, but the info gets transmitted from the eyes to the brain every 0.05 s (or it takes 0.05 s for an image to fade off the receptors). The eyes still receive light constantly, so you will register everything that flashes in front of your eyes.
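Something like this, if you want a toy model of it (the numbers and the model are completely made up; real receptors are far more complicated):

def receptor_signal(flash_start_ms, flash_len_ms, window_ms=50.0, step_ms=1.0):
    # Toy model: the receptor just sums up whatever light arrived during the
    # last `window_ms` milliseconds, sampled in small steps.
    signal = 0.0
    t = 0.0
    while t < window_ms:
        if flash_start_ms <= t < flash_start_ms + flash_len_ms:
            signal += step_ms   # light was on during this step
        t += step_ms
    return signal

print(receptor_signal(flash_start_ms=20.0, flash_len_ms=2.0))   # a 2 ms flash still registers
print(receptor_signal(flash_start_ms=20.0, flash_len_ms=0.0))   # no flash, no signal

The integration window being 50 ms doesn't mean a 2 ms event is invisible; it just gets smeared into that window.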
 
Freon

New Member
#53
Adding on to that, cones (color receptors, separate from the rods that detect brightness only) are slower to react.

The reaction time of the rods and cones has more to do with the speed of change you see IN ONE SPOT.

IOW, if you stare straight at a strobe light blinking at 20hz, it will give you a massive headache. The faster it gets (once again, staring at it, NOT MOVING), the more it starts to look like a constant light. But MOVE your head or the strobe and you will see it flicker.

IMHE, 60hz on a CRT flickers badly. Once again, this is with a STATIC image, not moving my head. 75hz looks like a constant image. But if I move the mouse around, I still see it jump. In fact, even at 150hz I still see it jump around, even though the brightness is flickering so fast (and so little) that I couldn't possibly notice.

The same holds true for frames per second. The only caveat is that the refresh rate must be higher than the frame rate. Also, without v-sync you get tearing. The collision of refresh rate and fps causing tearing is a separate issue.
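For the tearing part, here's a little sketch of where the tear ends up (idealized: no blanking interval, and the 75hz refresh and 1024 visible lines are just example numbers):

def tear_line(flip_time_ms, refresh_hz=75.0, visible_lines=1024):
    # Without vsync a buffer flip can land in the middle of a scanout.
    # Everything the beam has already drawn is the OLD frame, everything
    # below that point comes from the NEW frame -- that's the tear.
    scanout_ms = 1000.0 / refresh_hz
    ms_into_scanout = flip_time_ms % scanout_ms
    return int(visible_lines * ms_into_scanout / scanout_ms)

# Flips arriving at arbitrary times land on arbitrary scanlines:
for t in (3.0, 9.5, 12.0):
    print(f"flip at {t:4.1f} ms -> tear around scanline {tear_line(t)}")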



------------------
BP6, C-366/550 (x2 some day), 320meg (again)
CL TNT2U 175/220
 
drzaius

New Member
#54
I agree that in order to make a game look fluid you need a minimum of 30 fps, but the reason TV, movies, etc. get away with less is that we are just watching them, not interacting with them. Would you believe that most Saturday morning cartoons run at about 8-12 fps? Also, the fact that we sit about 6 feet (or more) from the TV and about 1 foot (or less) from the monitor could have something to do with it all.
As a test, play a game with a long hallway, one with solid walls on all four sides, and try to get about 24-28 fps, either by increasing the color depth or resolution or by underclocking. Then just 'walk' straight down it. Then try it again, move the mouse around a lot, and maybe throw in a bot or two.
Now, if I'm right, you could watch the walk down the hallway but could not interact with it. So it's not the eye that has the problem, it's everything else. And for god's sake, remember to blink every once in a while.

As for my theory on whether we see real life in fps, I'd say it probably comes in at a constant rate, which would be the speed of light.

As for me, I'm going off to another galaxy to fend off evil.


[This message has been edited by drzaius (edited 05-21-2000).]
 
Luis G

Natural Disaster
#59
An electron travels at about 30,000 km/h; to measure the fps of real life you just need to divide the circumference of the atom by the electron's speed.
 
