Difference between 30fps and 60fps gaming

It's like a CGI movie, and yet it's still a game. Oh, and I'm a PC and PS4 player. I've had enough of constantly upgrading my PC. To play next-gen games at 60 fps you need a PC that costs double the price of a PS4.

Getting a powerhouse just for a bigger frame rate? No thanks. Story and gameplay come above graphics. I had fun with my PS3 for 7 years, and during those years I upgraded my PC twice and still had to buy a new one to keep up with new tech. Now I have a PS4 that will serve me for another 7 years without an upgrade. It's worth it, even just for the exclusives that look better than other games because the developers know how to optimise for the PS4.

Sorry for my English, but I'm from Hungary.

A lot of last-gen games still ran on the GTX, just at settings comparable to the console. Optimization is one of the biggest lies being told.

A lie? How odd. Even John Carmack seems to think that optimization is better on consoles and that it theoretically puts them ahead of a PC with equal specs. What a console fanboy! Optimisation should be better on a given console because, for example, all Xbox Ones have the same hardware and drivers, which makes things much simpler. He specifically apologized for his English and cited the fact that he is not a native English speaker, as he is from Hungary.

GDDR5 is nice, but seriously, the developers know all they need to know about optimizing their games most of the time. It's simple why the games don't run at 60 fps: the consoles just don't have enough power unless they render at the very lowest settings.

And then just upgrade when necessary? LOL, you used Bloodborne as an example? And heat failures didn't plague the consoles. I never had trouble with my PS3.

Some people keep their console in the wrong place. I buy the top-of-the-line graphics card every year to use all those hidden settings in Nvidia Inspector or RadeonPro. So, happy times for PC gamers. For some games I actually prefer 30fps over 60fps. Yeah, I call bullshit. It's not hard at all to tell the difference between 30 and 60fps, especially in a game.

There is no need. Are you using a 10-year-old profile pic? Can you please explain to me how 30 fps looks better than 60? Same goes for resolution. Just throwing around buzzwords like you know what you're talking about, huh? You use a supercharger attached to your turbo too, huh? Gotta get the flux capacitor tuned right? I call bullshit on your ten-year-old ass. Even when running under 30 fps, when I click or move my mouse, everything still happens instantaneously. This goes back to gaming at a time when the very idea of 60 fps was mere fantasy for the entire industry, even on PC.

On the motion blur front, making a blanket statement that it always sucks is wrong. Assetto Corsa is a prime example, as is pretty much any recent racing game from Codemasters. Also, when done correctly, it has zero or very near zero effect on frame rate or any sort of latency.
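
To make that "done correctly" claim concrete, here's a minimal sketch of the velocity-buffer style of blur that modern racers use, written in Python/numpy purely for illustration (no engine's actual shader code is being quoted). Each pixel is smeared along its own screen-space motion vector in a single extra pass, instead of accumulating many whole frames:

```python
import numpy as np

def motion_blur(frame, velocity, samples=8):
    """frame: (H, W, 3) float image; velocity: (H, W, 2) per-pixel motion in pixels."""
    h, w, _ = frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    out = np.zeros_like(frame)
    for i in range(samples):
        t = i / (samples - 1) - 0.5          # sample along -v/2 .. +v/2
        sx = np.clip((xs + velocity[..., 0] * t).astype(int), 0, w - 1)
        sy = np.clip((ys + velocity[..., 1] * t).astype(int), 0, h - 1)
        out += frame[sy, sx]
    return out / samples

# A fixed number of taps per pixel means the cost is constant no matter how
# fast anything moves -- hence "near zero effect on frame rate".
```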

Alien Isolation, which I play at a solid 60fps with all the eye candy at maximum, including the motion blur, looks 10x better with the blur ON. You also made a blanket statement. Any mod that bolts on post-effects that aren't part of the original graphics engine's code is going to be inefficient. You really have no idea what the fuck you are talking about with any of this. Not much more I can ask out of it. It has an in-game GUI that lets you edit and adjust almost every effect.

Input lag is definitely detectable. You can see this if you turn V-sync on or have an extremely low framerate. The ENB mod may not be shit like he said, but it definitely has some less-than-satisfactory aspects. Also, many of the features you mentioned are restricted to Skyrim. Yes, they are, and how is that a problem? It is made for games like Skyrim, Fallout, etc., that have less-than-satisfactory renderers and DX support. For what it is, and what it can do, it is amazing. For what it is, it is awesome; but what it is, is a horribly hacky way of making a game look better.

When it comes to ENB, it can break a lot of things during injection through an assortment of misguided coding. For one, the code has no safety measures, so an incorrect configuration can cause anything from a BSOD to a simple game crash. Also, being a DLL injector, if someone wanted to be malicious they could ship a virus with it. Well, say that all you want, but nothing like that has happened to me over the last 4 years.

Nothing major. Not to mention, your opinion on this matter is rather void, given your previous statements… And ignoring my point purely based on my previous statements, be they good or bad, is naive. There is one man working on the ENB mod.

And that's taking all of that into account, along with how well it DOES work. As far as professional quality goes, it all really depends on what you would qualify as professional. You have AAA game developers who ship games like Skyrim with outdated engines, heaps of bugs, updates that completely break certain aspects of the memory allocation, shoddy graphics that require work like this to be done, etc.

But to some, they could be called professional. Now, tell me how the developer of the ENB mod is any less professional than those developers? He promises nothing, yet still delivers a better product most of the time; one that, in all my time with it, HAS NOT done those things, has updated the graphical effects of the games it is tailored for, and has fixed multitudes of bugs across those games. TL;DR: Professional quality? When I saw and downloaded it, I expected it to be just as shit as it was described. Like, really??

Motion blur, however, when done correctly, can very much make lower framerates appear better. Meaning that, yes, there ARE some games with motion blur code that is inefficient and also not the most attractive, and in those cases it can hurt framerate. There are always going to be exceptions. Racing games can be just as graphically and computationally intense as any shooter once you throw in a complex physics engine and a full field of other cars with AI. If you played fighting games, you would know what input lag is.

I won't try to convince you it exists, though; see, a player can be used to the lag and still play very well, so you may never realize it is there in the game you play. Watch the YouTube channel Linus Tech Tips. They use a slow-motion camera and actually test input lag by showing when they click the mouse and when the action goes off.

Counting the frames in between gives you the input lag. Input lag is a thing. Your brain must just be too slow to notice it. I feel bad for you. I would be scared to be on the road near you.
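
For anyone who wants the arithmetic behind that slow-motion-camera method, here's a tiny sketch. The numbers and the function name are illustrative, not taken from the video: you count how many camera frames pass between the click and the on-screen reaction, then convert that count to milliseconds.

```python
def input_lag_ms(frames_between: int, camera_fps: int = 960) -> float:
    """Each camera frame covers 1000/camera_fps milliseconds."""
    return frames_between * 1000.0 / camera_fps

# e.g. 48 frames of a 960fps recording between click and muzzle flash:
print(input_lag_ms(48))  # 50.0 ms of end-to-end input lag
```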

Because you will have slow-ass reflexes. I have no problem playing very fast-paced games, whether I see the input lag or not. Your comparison of a few frames of video that all happen in mere milliseconds to the things that happen while driving a car is a clear indication of your inexperience at living on this planet. I think input lag is more of a problem for competitive players, and the same goes for motion blur, hence high-refresh-rate monitors. I used to play FPS games competitively, and when you are concentrating on your performance in game, slight lags are much more perceptible.

First-person shooters, Oblivion, New Vegas, Skyrim, etc. They should test video cards with the games set up to give 60 FPS minimums. Look, 30 FPS. I did Skyrim tech support for years, and everyone with stuttering, lagging, and crashing problems had way too high settings enabled, including heavy AA, post-processing, high-res texture packs, etc.

Achieving 60 FPS with their wimpy rigs by changing settings was like a revelation to them; they had never before played the game running smooth, fast, and fluid.

It keeps piling on the effects and AA until your frames are in the 30s… quite unfortunate. Also, 40 FPS is generally what most people consider perfectly playable. They give you a slider to adjust for performance or quality, and it changes the settings accordingly. This is idiot-proof. Is the game not running at the frame rate you like? Slide it toward performance and verify. Rinse and repeat. They all played and looked beautiful.
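
That "slide it toward performance and verify" loop is simple enough to write down. Here's a minimal sketch of it as an automated ladder; the preset names and the measure_average_fps() helper are hypothetical, not from any real game's API:

```python
PRESETS = ["ultra", "high", "medium", "low", "performance"]

def pick_preset(measure_average_fps, target_fps: float = 60.0) -> str:
    """Step down the quality ladder until the measured frame rate holds the target."""
    for preset in PRESETS:
        fps = measure_average_fps(preset)   # render a benchmark scene at this preset
        if fps >= target_fps:
            return preset                   # first preset that holds the target wins
    return PRESETS[-1]                      # nothing holds it: take the fastest
```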

At a realistic frame rate, the unreal quality of it all is starkly visible once our brains try to interpret the scene as real. I found 48 frames per second to be a distraction that constantly removed me from the movie experience. Cute article. One of the most common parts of post-production for CGI elements is adding in motion blur to create that effect. You cannot have a subjective opinion on what is an objective fact. While I can say and think gravity does not exist, gravity will still objectively exist.

I certainly do NOT agree, and any PS4 owner should be concerned by this statement, as there are several cases of frame rate drops even in 30fps titles. On another note, Microsoft said the opposite: that 60fps is better. Console gamers see nice graphics and are bewitched.

A huge problem we have is people trusting representatives of these companies. Personally, I notice the difference mainly in particles and moving liquids, which is the reason I try to attain 60fps.

What a poor comparison. I honestly clicked this expecting a good, objective analysis. Never do you make it clear how significant the difference is. Your point 2 is also ridiculous. The mouse, keyboard, latency, and your brain determine how fast you react.

In the article we never stated otherwise. It says that input latency was much lower (roughly half) at 60 FPS compared to 30 FPS. The whole point was that there is in fact a difference between 30 and 60 FPS when playing competitive games, as latency can and does become an issue.
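
The back-of-the-envelope numbers behind that "roughly half" claim are just frame times. This only covers the frame-time component of input lag; real pipelines add engine, driver, and display latency on top:

```python
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

print(f"{frame_time_ms(30):.1f} ms per frame at 30 FPS")  # 33.3 ms
print(f"{frame_time_ms(60):.1f} ms per frame at 60 FPS")  # 16.7 ms

# An input landing just after a frame starts waits up to one full frame
# before it can influence what is drawn, so that window alone halves at 60.
```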

This article was never intended to fuel the console vs. PC debate, but rather to try to debunk and/or clarify some of the misconceptions that are tied to it. The real fight is not between consoles and PC; the fight is against big companies such as Ubisoft. For example, I enjoyed the PS3 version of Tomb Raider just as much as I enjoyed the PS4 version. This whole push for 60fps is a fairly recent development that most people do not care about.

Who is "most people"? This is the point the article is trying to get across: people spread misinformation and made-up stats without any foundation or evidence to back them up. The truth of the matter is, if your claim were correct, then companies would not be making TVs and monitors that display images faster than 60Hz. The point of high-refresh-rate panels is to help smooth out the judder that can occur on 60Hz panels.
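
The judder point is plain arithmetic, for anyone who wants to check it: 24fps film does not divide evenly into 60Hz, so frames get shown for alternating 3 and 2 refreshes (3:2 pulldown) and motion stutters, while at 120Hz every film frame gets exactly 5 refreshes. A quick sketch:

```python
for hz in (60, 120):
    refreshes_per_frame = hz / 24
    print(hz, "Hz:", refreshes_per_frame, "refreshes per film frame",
          "(even, no judder)" if hz % 24 == 0 else "(uneven, judder)")
```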

Until relatively recently, videos on YouTube were 30fps, and YouTube has only just started letting you select 60fps on desktop. The YouTube app has only just started rolling out updates allowing 60fps playback. The only time 60 fps is strictly needed is during multiplayer, as the reduction in latency is vital.

For single player, it's a nice bonus, but in no way vital or essential. Perhaps my inclusion of TV panels was a mistake; my TV is a true high-refresh-rate panel, so when connected to my PC it is capable of displaying high-frame-rate content (verified with testufo).

But obviously that is not true of all high-refresh-rate TV panels, most of which, as you stated, will simply use motion interpolation to simulate higher-rate motion. The whole point of high-refresh-rate TV panels is motion interpolation, black frame insertion, and 24fps playback. If they actually did consider gamers, more of them would make low-lag TVs; as it stands, the only one that does this is Sony, and starting this year, Samsung. If people were clamoring for 60fps as much as you say, then movies would have been made at higher than 24fps a long time ago. It was a frame rate chosen because of a limitation that no longer exists, yet it is still the standard frame rate for movies.

The reason is that people associate 24fps with the cinematic feel of movies, and it has existed for so long that it has basically become a cinema standard.

Motion interpolation is a feature with a mixed reception. Some watch with it on, but most people hate it because it makes them feel like everything was shot as a cheap home video or soap opera (hence the name "soap opera effect"), similar to the mixed reception to 3D, which I am a fan of.
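
For what it's worth, here's the simplest possible sketch of what interpolation does, assuming the crudest approach (a straight blend of neighbouring frames). Real TVs estimate motion vectors instead of blending, but the idea of manufacturing frames that were never shot is the same, and it's exactly why footage stops looking like 24fps film:

```python
import numpy as np

def interpolate(frame_a: np.ndarray, frame_b: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Fabricate a frame at fraction t between two real frames."""
    return (1.0 - t) * frame_a + t * frame_b

# Doubling 24fps to 48fps: insert one synthetic frame between each real pair.
```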

Now, I agree that this has no bearing on video games, but it serves my point that 60fps doesn't really extend past video games. For video games, I generally leave it up to preference, as there is no standard frame rate for video games. Some say they will take 60fps every time; some say they will take a smooth 30fps over a slightly stuttery 60fps.

Some will even take a stuttery 30fps with frame drops if they think the game is good, as proven by the scores of people who love games like Fallout 3 and Skyrim, which performed horribly on consoles.

It might even have a subjective quality to it, depending on whether a person is more high-octane or laid-back, similar to how some people prefer to call rather than text, or prefer the morning to the night.

I won't concede that either side is simply right or wrong. In all honesty, I would be in support of all games giving you a choice between a locked 30fps and an unlocked frame rate. My point is that making a big issue of not having a perfectly locked 60fps is really acting sort of entitled and spoiled. Games have not become unplayable at 30fps.

Tell that to all the "entitled and spoiled elitist" people who refunded Arkham Knight to the point that WB pulled it from sale and has yet to reinstate it. If you prefer 30fps, then good for you. Warner Brothers and Rocksteady learned the hard way that they can no longer get away with such minimum-effort ports. A lot of older console games played at 60FPS; it is not vital or essential, but saying it is unnecessary or a waste is a bald-faced lie.

I prefer higher frame rates because I get irritated at lower frame rates. Most people like to play FarmVille; what does that actually add? Input delay IS a thing. For the most part, this results in a smooth image for things like film and TV. What it results in for games is a far smoother, more flicker-free experience.

It does more than impact just the visuals, though; it also impacts your game input. A faster frame rate means your controller inputs are translated into on-screen actions quicker.

Improve the rate at which any of those happen, and your inputs will feel smoother. It goes beyond just what you see to how a game feels. A 60fps game, depending on the type of game, of course, is just smoother; it feels more natural and responsive. Here are two sites that should help you at least see the difference.
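
The reason frame rate changes how a game feels comes down to the structure of a typical game loop: input is usually sampled once per frame, so at 30fps a button press can sit unread for up to ~33ms before the loop even looks at it. A bare-bones sketch, with all names illustrative (no real engine's API is being quoted):

```python
import time

def game_loop(target_fps: float, poll_input, update, render):
    frame_time = 1.0 / target_fps
    while True:
        start = time.perf_counter()
        actions = poll_input()        # inputs are only seen here, once per frame
        update(actions, frame_time)   # simulate with a fixed timestep
        render()
        # sleep off the remainder of the frame budget
        elapsed = time.perf_counter() - start
        if elapsed < frame_time:
            time.sleep(frame_time - elapsed)
```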

So for months my brother was playing GTA V on PS4 on a projector in the living room. Even though it's an 80-inch image, the projector's resolution is low, so he never saw the really clear detail of the game; it looked good. But then one day he moved it upstairs to the office and put it on a 50-inch screen. Then he was amazed, because he could see all the detail and how sharp the game looked.

It was like watching one of those 4K comparison videos: once he was on the higher-resolution HDTV, he could see all the detail nice and clear that he was not seeing before. So hopefully that becomes a growing trend and we don't have to choose one or the other. Keep in mind that there aren't a lot of developers as talented and familiar with their hardware as ND.

I can rarely see the difference between 30 and 60; there has to be really rapid movement for me to notice. But I can feel the difference between 30 and 60, especially in shooters. I'll take 60fps any day of the week. Both are important, but a game running smooth is the most important thing about graphics, bar none. You can have the prettiest game in the world, but if the frame rate is up and down, it's useless. Honestly, I would prefer developers lock their games at 30 or 60 fps and hit their target resolution.

A locked frame rate, even at 30fps, is better than a developer aiming for 60fps that ends up jumping around. I dunno, a variable frame rate isn't always such a problem for me, in some games at least. Maybe it should be optional. Bloodborne was pretty acceptable for me, for example, despite the wacky frame rate, except a couple of parts of the game (the lecture building) which ran like ass until they patched them. It doesn't work in some games, but in many I've liked the effect of a TV's built-in motion interpolation too; depending on your TV, you can bump a low fps up to a much higher one. Yeah, it's making the middle frames out of the previous ones, and it often looks atrocious, but in some games it's nice.
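
The "locked 30 beats unlocked" intuition has a simple explanation: what the eye tracks is frame-to-frame consistency, not the average. A quick sketch with purely illustrative frame times:

```python
locked_30 = [33.3] * 8                                          # identical frame times
variable  = [16.7, 20.0, 33.3, 16.7, 25.0, 20.0, 30.0, 16.7]    # ~45fps average

for name, times in (("locked 30fps", locked_30), ("variable ~45fps", variable)):
    avg = sum(times) / len(times)
    spread = max(times) - min(times)
    print(f"{name}: avg {avg:.1f} ms, frame-to-frame spread {spread:.1f} ms")
# The variable run averages faster but swings 16.6 ms frame to frame,
# which is exactly the stutter people complain about.
```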

I played through most of AC4 with it turned on, thinking the game was just extremely smooth. XD I know a lot of people hate motion interpolation, and I can see why, but I do think it's worth experimenting with in certain cases.

I used it on my Samsung plasma and the game was not responsive at all; I was playing Battlefield 3. While it did look nice, the input lag was huge; I had to turn all effects off just to play the game. Oh yeah, completely. AC was fine, but Killzone is hilariously bad. It already has some input lag, so adding more made it very difficult to play, like you were recovering from a hangover and everything was taking your brain slightly too long to figure out. You'd die before you could react. But it looked like you were there.

I recently invested in a high-refresh-rate monitor, and just moving the cursor across the desktop, the difference is crazy. G-Sync is a technological marvel, by the way. I'm a stickler when it comes to frame rate, and before G-Sync, if the frame rate dipped even a couple of frames I found it intolerable; once or twice in a game I can handle, but it still irks me. I'd say when the fps goes below around 58 it becomes a much less enjoyable experience, with some exceptions depending on the genre: in an RTS, while 60 would be preferable, I'm actually fine with less. I always want it locked as well; while I find 30 fps unplayable in most cases, a variable frame rate between the two is worse, imo.

That is, until G-Sync, where now I can have the fps drop down to around the 50 mark and I really can't perceive much of a difference. No more V-sync with mouse lag and stutter, and no more screen tearing, which is also unbearable. Thanks for the comment, player. OK, in my statement I was talking about the average gamer, but this is a good point. You said the human eye can see a perceptible difference beyond 60fps; I don't doubt that.

You are clearly a hardcore gamer if you bought a high-refresh-rate monitor, and therefore you get it: you know when it's dipping or when it's not as smooth. Most of my friends couldn't tell at all until I explained it to them. Don't take my word for it, check this out. I'd hardly call one "average gamer" test subject a conclusive answer on this subject. Even if you were talking about it in that context, I still have a problem with the bits of misinformation like "since at 30 you are really close to how you see things in real life."

I know I'm tooting my own horn here, but I brought this up in one of my recent blogs: trained fighter pilots were able to spot images flashed on a screen for just a tiny fraction of a second. I still appreciate you trying to bring up this topic, but some stuff at the beginning simply isn't correct. I had to explain to them what they were looking at. For example, one of my friends on PSN swears up and down that The Last of Us plays no different than it did on PS3. I said no, it's at 60fps now. He could not tell.

That's really what I meant: to the average gamer it just seems fluid. To your point, a fighter pilot is far from the average gamer, but I see your point. I wish we could do a test right here to see how many people can tell 30 vs 60 fps. I clicked on your name but I could not tell which blog was the one you were referencing. And be sure to check out the links at the bottom of the page. I also didn't know The Hobbit tried to do 48fps; I want to see this movie now.

Very nice blog post. Thanks, Crazyglues. Yeah, I knew how tricky the subject is, so I dove headfirst into that information to try to express my thoughts as clearly as possible. I tried making it direct, but it kept bugging out on me. Here is a perfect example of why high resolution and 60fps are preferred. I had a heated argument about how Forza 5 is the only next-gen racer that ran at a full 60fps. If he can't tell the difference between the two resolutions, then the case is closed.
