Ten-Sidhe
Osmon Surveillance Caldari State
414
|
Posted - 2012.08.16 18:43:00 -
[1] - Quote
I thought 60fps was the limit of the eye, but 30 looked almost as good (SDTV is about 30fps, IMAX about 60). Never understood the marketing on 200+fps TV sets; you can't see it anyway, so what difference does it make?
I would overlook low fps if the game is good: big maps, big player count, EVE tie-in. I played overloaded RTS games that slowed until I could count the fps with a watch, 2-3 fps.
Looked it up: Halo 3 is locked at 30fps, Avatar was filmed at 24fps, and animated films are drawn at 12 or even 6 fps. If the game is good and at least in the 20s I'll be happy. |
Ten-Sidhe
Osmon Surveillance Caldari State
414
|
Posted - 2012.08.16 18:57:00 -
[2] - Quote
Most monitors only show 60fps, or in some cases 24fps. I'd rather have good draw distance and large limits on map size/player count at 24fps than smaller counts at 60fps. Movies are 24fps, and that looks good enough there. |
Ten-Sidhe
Osmon Surveillance Caldari State
414
|
Posted - 2012.08.16 19:04:00 -
[3] - Quote
Add some motion blur at a lower frame count if it's easier on the hardware; creating motion blur is really all the extra frames do.
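Rough sketch of what that blur-instead-of-frames idea looks like in code (a toy example; the numpy frames and the blend weight are just made up for illustration, a real game would do this on the GPU):

import numpy as np

def blend_frame(accum, new_frame, weight=0.6):
    # Mix the freshly rendered frame into a running accumulation buffer.
    # weight near 1.0 -> crisp image, little blur
    # weight near 0.0 -> heavy smear carried over from previous frames
    return weight * new_frame + (1.0 - weight) * accum

# Toy case: a bright 'object' moving one pixel per frame across a row.
accum = np.zeros((1, 8))
for x in range(8):
    frame = np.zeros((1, 8))
    frame[0, x] = 1.0              # object position this frame
    accum = blend_frame(accum, frame)

print(np.round(accum, 2))          # the trailing values are the motion smear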
"The human eye and its brain interface, the human visual system, can process 10 to 12 separate images per second, perceiving them individually.[1] The visual cortex holds onto one image for about one-fifteenth of a second, so if another image is received during that period an illusion of continuity is created, allowing a sequence of still images to give the impression of motion." -wikipedia
The brain speeds up at times. FPS games may trigger this, so the eye "overclocks" to more frames per second; 24-30 should still be plenty. Twice the frames won't add that much, so more players/draw distance would be better. High fps is also a buffer for when it slows down, so a game designed to run slow and steady would have the same effect. |
Ten-Sidhe
Osmon Surveillance Caldari State
414
|
Posted - 2012.08.16 19:51:00 -
[4] - Quote
Movies also benefit from natural motion blurring. PC/console games don't; that is why they need more frames to emulate the blurring. The eye detects analog light input, then the brain reads it digitally. It expects the image to be smeared, so clean digital input doesn't register right. The more the images overlap, the more the brain is tricked into thinking it's motion blur, which so far seems to work more realistically than coding in motion blur. (One day they may find the perfect way to code motion blur; they just haven't yet.) The eye picks up 10-12 fps, but if out of sync, input may be needed at double this. The brain also speeds up when it thinks it has to, and FPS games trick it into this mode.
So the brain reads 20-24 images (double the normal rate is a pretty good increase), expecting analog motion blur on each image. Digital images need to arrive at double that to make sure the brain doesn't miss a refresh when it reads. So at most the brain will need 40-48 images per second, but it expects them blurred, so any additional frames only aid smoothness by simulating blurring.
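Quick back-of-the-envelope with those numbers (the 10-12 base rate and both doublings are just my assumptions above, nothing scientific):

base_low, base_high = 10, 12                          # images/sec the eye picks up
brain_low, brain_high = 2 * base_low, 2 * base_high   # brain "speed-up" doubling -> 20-24
need_low, need_high = 2 * brain_low, 2 * brain_high   # out-of-sync doubling -> 40-48
print("at most {}-{} images per second".format(need_low, need_high))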
Any gains above 30fps are small and could be replaced with a little blurring. A frame rate that doesn't jump up and down, good draw distance, and gameplay (map size and player count fit here) are more important than a little eye candy.
It may have started as a troll thread, but it turned into a debate about what is more important in graphics. |