Why Refresh Rates Matter: From 30Hz to 540Hz

You missed the main point about the toll it takes on the graphics card: you need a mid-to-high-end card for anything above 60 fps in gaming, and the toll on your wallet for higher hertz is another story altogether.
 
Above 144 Hz it is difficult to notice the difference, and current games will not reach 540 fps without a substantial sacrifice in quality settings.

There's literally a whole article above you explaining how and why the higher Hz is in fact noticeable, even above 144.

Also the article is just trying to settle the question of whether or not the higher Hz is important, not necessarily what it takes to get there. Great write up!
 
Does the clarity still hold up with the motion tricks that Nvidia/AMD/TVs use?

With all this frame gen being pushed, it would be neat to know.
 
There's literally a whole article above you explaining how and why the higher Hz is in fact noticeable, even above 144.

Also the article is just trying to settle the question of whether or not the higher Hz is important, not necessarily what it takes to get there. Great write up!

And I'm bringing real-world experience to the table. If you've followed the latest releases, you'll know how difficult it is to maintain 120 fps or even 60 fps in recent AAA games; sacrificing quality and resolution to reach such high targets would be contradictory.
 
And I'm bringing real-world experience to the table. If you've followed the latest releases, you'll know how difficult it is to maintain 120 fps or even 60 fps in recent AAA games; sacrificing quality and resolution to reach such high targets would be contradictory.

This is how the real world works (and is why I've never been able to tell the difference between an 85 Hz CRT, an 85 Hz 1 ms overdrive TN panel, or my B7 OLED TV running at native 120 Hz @ 1080p).

If you actually pay attention, everyone has their own visual threshold (and the vast majority can't tell the difference between 120 and 240). Just because yours is higher than most doesn't make the rest of us wrong!
 

Are higher refresh rates better for gaming? Yes, they are. In this article we'll explain why refresh rates often labeled as "overkill" might actually offer more improvement than you think.



BUT... These comparisons are made on an object moving at 960 pixels per second (and then twice that).

So, is that a reasonable benchmark?

This seems a bit fast for the typical other player running and jumping around as I try to shoot them. At 720 pixels/sec the difference between 120 Hz and 240+ Hz is reduced and less noticeable. Thus, the many "real world" comments disputing the results.

120/144 Hz is the sweet spot for most people who are constrained to more affordable GPU and monitor/TV setups. Even faster is better, but these comparisons that keep moving to faster blur tests as the monitor tech improves are somewhat disingenuous.
 
I have a 170 Hz monitor and I tried using 120 Hz to see if there would be a difference, but I could barely notice it. 60 Hz looks like a slideshow after 120 Hz, but above that you get diminishing returns. Only in extreme scenarios would I want 240 Hz and above.
 
My thing is, like most have said, it's not practical to go above, let's say, 165 Hz. The type of computer you'd need won't be cheap. The GPU alone is gonna hurt; that will be at least $500.
Sure, it would be nice to have a 240 Hz/360 Hz monitor, but even if you did get one, plus the PC to run it, you are gonna be at like $2,500 at least. Not all that realistic in today's world. Maybe a decade or even 5 years ago, but not now.
PC gaming needs to be more accessible/affordable, otherwise the price to game on PC will be out of reach for the majority. Which is a shame, considering it's the best gaming platform.
 
There's literally a whole article above you explaining how and why the higher Hz is in fact noticeable, even above 144.

Also the article is just trying to settle the question of whether or not the higher Hz is important, not necessarily what it takes to get there. Great write up!
Yes, but they show the difference through still shots of a demo with a very fast-moving object using motion blur.

I feel like to some extent this is a monitor equivalent of the audiophiles who will buy $100+ audio cables, special amps and speakers, and insist they sound more "natural" and that the sound "fills the room", even when audio measuring equipment shows no difference, and in fact in A-B testing they can't reliably tell if they are listening on their special rig or on a good-sounding but more pedestrian setup. (Of course in the monitor case, unlike the audiophile situation, there is a measurable difference, at least to a machine recording the screen.)

In other words, I'd LOVE to see some A-B testing: find some gamers, including some who are sure they need high refresh rates, and have them play a few games at different framerates (don't tell them what framerate they are at -- if you tell that type of audiophile which equipment they are listening to, they'll make sure to find flaws in the regular equipment and praise the specialized kit even if they sound identical). I would expect many could tell 30 versus 60 fps, a few 60 versus 120, and I doubt in reality anyone will notice above that. But A-B testing would reveal all.
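Purely to illustrate what I mean by blinded testing (the 120/240 pairing, the 20-trial count, and every name below are my own made-up example, not anything from the article), the protocol and the sanity check could be as simple as this sketch:

import random
from math import comb

# Hypothetical blinded refresh-rate trial: the tester picks a rate the player
# can't see, the player guesses, and we check whether the guesses beat chance.
RATES = (120, 240)
TRIALS = 20

def run_session(player_guess):
    """player_guess(trial) -> 120 or 240; returns how many guesses were correct."""
    correct = 0
    for trial in range(TRIALS):
        actual = random.choice(RATES)      # hidden from the player
        if player_guess(trial) == actual:
            correct += 1
    return correct

def p_value(correct, n=TRIALS):
    """One-sided chance of scoring at least this well by pure coin-flipping."""
    return sum(comb(n, k) for k in range(correct, n + 1)) / 2 ** n

# A player guessing blindly should land near 10/20 with p around 0.5;
# someone who can genuinely see the difference should push p well below 0.05.
score = run_session(lambda trial: random.choice(RATES))
print(score, p_value(score))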

That's cool though! This'll keep some funds flowing into AMD and Nvidia for high-end gaming cards so they (especially Nvidia) don't just decide "screw it, we'll focus entirely on AI and cut out developing graphics cards". And (as people decide their nice card isn't good enough because it can't hit 480 Hz in whatever games) it can make for a healthy supply of used cards for everyone else to buy at a good price and enjoy.
 
BUT... These comparisons are made on an object moving at 960 pixels per second (and then twice that).

So, is that a reasonable benchmark?

This seems a bit fast for the typical other player running and jumping around as I try to shoot them. At 720 pixels/sec the difference between 120 Hz and 240+ Hz is reduced and less noticeable. Thus, the many "real world" comments disputing the results.

120/144 Hz is the sweet spot for most people who are constrained to more affordable GPU and monitor/TV setups. Even faster is better, but these comparisons that keep moving to faster blur tests as the monitor tech improves are somewhat disingenuous.

960px/second sounds fast until you compare that to how long it takes to traverse your monitor. At 1920x1080 resolution, it still takes longer than a second to go from the top to the bottom of the screen (and exactly 2 seconds to go left to right). That's a long time, especially for first person or third person games where the camera is changing what you are looking at quite quickly. For a game like Civ it wouldn't matter too much.
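Just to put the arithmetic in one place (for both the 960 px/s test and the 720 px/s case mentioned above), here's a rough back-of-the-envelope sketch; the per-frame step is essentially the smear width the pursuit photos are showing. The speeds and refresh rates are just the ones from this thread, nothing measured:

# Back-of-the-envelope numbers only: screen traversal time, and how far the
# object moves between refreshes at each rate (the sample-and-hold smear width).
WIDTH, HEIGHT = 1920, 1080

for speed in (720, 960):  # object speed in pixels per second
    print(f"{speed} px/s: {WIDTH / speed:.2f} s left-to-right, {HEIGHT / speed:.2f} s top-to-bottom")
    for hz in (60, 120, 240, 540):
        print(f"  {hz:>3} Hz -> {speed / hz:.1f} px of movement per frame")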
 
Yes, but they show the difference through still shots of a demo with a very fast-moving object using motion blur.

I feel like to some extent this is a monitor equivalent of the audiophiles who will buy $100+ audio cables, special amps and speakers, and insist they sound more "natural" and that the sound "fills the room", even when audio measuring equipment shows no difference, and in fact in A-B testing they can't reliably tell if they are listening on their special rig or on a good-sounding but more pedestrian setup. (Of course in the monitor case, unlike the audiophile situation, there is a measurable difference, at least to a machine recording the screen.)

In other words, I'd LOVE to see some A-B testing: find some gamers, including some who are sure they need high refresh rates, and have them play a few games at different framerates (don't tell them what framerate they are at -- if you tell that type of audiophile which equipment they are listening to, they'll make sure to find flaws in the regular equipment and praise the specialized kit even if they sound identical). I would expect many could tell 30 versus 60 fps, a few 60 versus 120, and I doubt in reality anyone will notice above that. But A-B testing would reveal all.

That's cool though! This'll keep some funds flowing into AMD and Nvidia for high-end gaming cards so they (especially Nvidia) don't just decide "screw it, we'll focus entirely on AI and cut out developing graphics cards". And (as people decide their nice card isn't good enough because it can't hit 480 Hz in whatever games) it can make for a healthy supply of used cards for everyone else to buy at a good price and enjoy.

It's obvious you do NOT understand the need for more frames, or even play fast-paced games. None of this is true for single-player games, where FPS movement and reaction time don't matter (you can literally pause those games).

Yes, one benefit of faster monitors is pixel clarity, but the MAJOR reason is that character movement and FEEL are quite different and easily noticeable to seasoned gamers. Understand that more frames and Hz aren't solely about clarity, but about the unfettered gameplay you experience once you can spin your character in-game without fps hitches/spikes and/or visual blur...

Ironically, if you are playing something like Cyberpunk, it doesn't matter what framerate you get... it doesn't hinder your canned gameplay.

Lastly, audiophiles are purists who want to make sure hardware doesn't get in the way of the listening experience.
 
I'm going to chime in here and say this article is partly true, and partly backwards. As someone who can actually see the difference between even 120 and 240 fairly easily, I can confirm that there is certainly a notable difference, and I like the higher framerates a lot. But the title of the article is "why it matters," and the answer in gaming is "that's entirely up to the player."

Yes, it looks nicer when things move smoothly. But that nicer look not only comes at an exponentially increasing cost in hardware, it also does not give the competitive advantage people think it does. Everyone always forgets the most important part of this: the majority of video games NEVER UPDATE THEIR LOGIC AT THAT RATE. Games will, in almost all cases, update at a 20, 30, or 60 Hz tick rate, which means that no matter how nice and smooth everything looks to you, the idea of gaining a competitive advantage in a modern shooter from a higher framerate is completely imagined. Even if your reflexes WERE tight enough to take advantage of the difference between 16.7 and 8.4 ms (they aren't, but that's a whole other discussion), the game itself would still only translate your in-game actions at the logic-update "framerate", which would still render your visual advantage null.
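If it helps, the arithmetic behind that looks roughly like this; the 60 Hz tick rate is just an assumed example for illustration (real games and servers vary), not a measurement of any particular title:

# Rough sketch: render frame time shrinks with refresh rate, but if game logic
# only advances at a fixed tick rate, inputs are still consumed once per tick.
TICK_HZ = 60                      # assumed logic/tick rate for the example
tick_ms = 1000 / TICK_HZ

for hz in (60, 120, 240, 540):
    frame_ms = 1000 / hz
    print(f"{hz:>3} Hz render = {frame_ms:5.2f} ms/frame; logic still updates every {tick_ms:.2f} ms")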
 
I haven't used a display with a higher refresh rate than 75Hz, so this is speaking from a bit of a place of ignorance, but in my mind, high refresh rates are very similar to HDR and lossless audio quality. If you need a visual confirmation to be able to tell that the high-end feature in question is enabled (HDR pop-up on TV screen, frame rate counter for high refresh rate, HD or lossless label on music streaming services), then the enhancement probably isn't that important or impactful in the first place.

Lastly, audiophiles are purists who want to make sure hardware doesn't get in the way of the listening experience.
It's been said that audiophiles listen to equipment, not music, and I'm inclined to agree.

Double-blind A/B testing also reveals most audiophile equipment as pure snake oil. The placebo effect is strong, and we humans are utterly terrible at objective measurement of anything to do with our senses.
 
It's been said that audiophiles listen to equipment, not music, and I'm inclined to agree.

Double-blind A/B testing also reveals most audiophile equipment as pure snake oil. The placebo effect is strong, and we humans are utterly terrible at objective measurement of anything to do with our senses.

No, you are talking about posers.
It's OK to mock people who spend frivolously on high-end equipment (because they can) but with no real goal or pursuit/achievement in mind; those people become audio blowhards. Not people who can get emotional or get goosebumps from certain passages of music.

I have never seen anything labeled "Audiophile Equipment" in my life; perhaps if you are fooled by such labels, then it's you who aren't moved by music...?

One of the best listening enhancements you can make is sound absorption/barriers. It's not about money... it's about the passion. And a good listening room isn't bought; it's organic and evolves over time.


Most kids vape an album a week; if their passion were listening, imagine their room in 30 years...
 
I have a 170 Hz monitor and I tried using 120 Hz to see if there would be a difference, but I could barely notice it. 60 Hz looks like a slideshow after 120 Hz, but above that you get diminishing returns. Only in extreme scenarios would I want 240 Hz and above.
I have the same speed of monitor, 27" 1440p. It's a high-grade monitor, an AOC Agon 273, the best 273 variant; I forgot the letters.
Anyhow, the post by Nobina is exactly what I feel and experience. In fact, if he hadn't written it at all, I might have posted the same thing. Exactly.

But there are other factors, because some people will not agree at all. No reason not to believe them, but I'm not sure how they have such great eyesight!
 
I have never seen anything labeled "Audiophile Equipment" in my life; perhaps if you are fooled by such labels, then it's you who aren't moved by music...?

One of the best listening enhancements you can make is sound absorption/barriers. It's not about money... it's about the passion. And a good listening room isn't bought; it's organic and evolves over time.
There's nothing wrong with enjoying music and wanting a great quality listening experience, but you can get a great quality listening experience for much cheaper than a lot of self-proclaimed audiophiles spend.

When I say "audiophile equipment," I mean expensive audio gear. Some of it really is pure snake oil, like audio cables that are hundreds of dollars. A lot of it is simply massively diminishing returns, where the difference between $100 headphones and $1000 headphones isn't a lot, not enough to justify the $900 price difference, at least. Or expensive DACs versus your motherboard's built-in audio.

High-resolution digital audio is another component of this. From my understanding, most people can't reliably tell the difference between 320 kbps MP3 and lossless CD audio, much less higher bit rates or sample rates than CD.

Basically, what I'm getting at is that the difference in experience between using an audio setup you spent $10,000 on and one you spent $500 or less on largely boils down to the placebo effect. You're expecting to have a much higher quality listening experience, so as a result your brain's subjective experience matches what you are expecting.

Check out this thread on Head-fi if you're interested in learning more:


A striking example of this placebo effect is certain people who spent hundreds of dollars on a "high-quality" HDMI cable and swear up and down that their expensive cable gives better picture quality, even after they've been told that that's literally impossible because HDMI provides a digital signal.
 
Basically, what I'm getting at is that the difference in experience between using an audio setup you spent $10,000 on and one you spent $500 or less on largely boils down to the placebo effect. You're expecting to have a much higher quality listening experience, so as a result your brain's subjective experience matches what you are expecting.
Yeah, if you are listening to digital through headphones, nothing really matters. Getting away from that is the pursuit of analog music.
 
This article would not exist if LCDs were not such an inferior display technology.
In order to work around the slowness of LCD tech, we are forced to push expensive amounts of compute power into our computers...
There's a real difference in how 60 Hz on an OLED looks versus 165 Hz on an LCD, or 100 Hz on a CRT for that matter.
 