
100Hz vs. 144Hz: Which Refresh Rate is Better?

As the market for PC gaming peripherals becomes more and more saturated, the quality and capabilities of peripherals like gaming monitors keep improving. That leaves inexperienced builders and gamers overwhelmed when shopping for a new monitor, faced with the many features and benefits different displays advertise.

A few commonly asked questions amongst newbies to the PC clan are what Hz even means, how many Hz you need for smooth gaming, and how a monitor’s Hz and the computer’s GPU affect each other. I’ll answer all of these questions in this article, and then some.

The difference is noticeable, if subtle: a 144Hz monitor delivers smoother motion and faster response times than a 100Hz monitor. The comparison between 100Hz and 144Hz only matters, though, if your system can actually reach 144 FPS in-game. In short, 144Hz mostly makes a difference at the most competitive level of gaming.

With that being said, let’s discuss these points in further detail so you can make an informed decision on which Hz is best for you.

What is Hz?

Hz is a form of measurement used in a variety of industries and applications, but for gaming monitors, it represents the refresh rate of the display. The more times your screen refreshes per second, the smoother your gameplay will appear.

While Hz technically measures how many times your screen refreshes per second, it does not dictate your FPS, or frames per second. Your FPS is determined by your graphics card. However, the relationship between the two goes deeper than that.

Hz vs. FPS

As I mentioned above, your monitor’s Hz is its refresh rate. This means that, at 100Hz, your monitor is refreshing a maximum of 100 times per second. So how does your graphics card play into that?

Put simply, each graphics card has different capabilities for handling games. Newer, more expensive graphics cards are designed to handle heavy graphics loads. As such, these are better equipped to handle graphics-heavy games and don’t generally cause screen issues like tearing or stuttering.

With older or budget graphics cards, you might notice these issues more, depending on your monitor. If your graphics card can only generate 60 frames per second on your 100Hz monitor, the mismatch shows. Your game might lag, or frames may appear unevenly, making for an unsatisfying gaming experience.

There are ways to counterbalance this delay, such as variable refresh rate technology, which I’ll get into further below.

What’s the Difference Between 100Hz and 144Hz?

The higher the refresh rate, the harder it is for the human eye to tell the difference between two monitors with different rates.

For example, you’d notice the difference between 60Hz and 144Hz for certain. 60Hz is the minimum refresh rate recommended for gaming and typically does not produce smooth gameplay on high settings.

The difference between 100Hz and 144Hz is noticeable to some, but far less striking to the untrained eye. It's a 44% increase in refresh rate, but think of it like this: if someone lobs a baseball to you, you can track its movement and catch it with relative ease, right?


If someone throws the ball directly at you, it's harder to track because it's moving faster. If they throw a fastball, it's nearly impossible to track, but not that much harder than the regular direct throw.

Both throws are far faster than the lob, yet the gap between their speeds is much less obvious than the gap between the lob and the direct throw. Who knew sports could apply to gamers?
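The numbers behind that analogy are simple: each jump in refresh rate shaves less time off every refresh than the last one. Here's a quick sketch of the math (plain arithmetic, not a benchmark):

```python
def refresh_interval_ms(hz: float) -> float:
    """Time between refreshes, in milliseconds, for a given refresh rate."""
    return 1000.0 / hz

for hz in (60, 100, 144, 165):
    print(f"{hz:>3} Hz -> {refresh_interval_ms(hz):.2f} ms per refresh")

# Prints:
#  60 Hz -> 16.67 ms per refresh
# 100 Hz -> 10.00 ms per refresh
# 144 Hz -> 6.94 ms per refresh
# 165 Hz -> 6.06 ms per refresh
```

Going from 60Hz to 100Hz saves about 6.7ms on every refresh; going from 100Hz to 144Hz saves only about 3ms more. That's the lob versus the fastball, in milliseconds.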

Do Graphics Cards Affect Hz?

Graphics cards do not affect Hz; the refresh rate is a fixed property of the display itself. Graphics cards produce varying frame rates depending on the immediate demands of the game you're playing.

The more frames per second your card generates, the more of your display's refresh rate you actually put to use.

If your graphics card produces more frames per second than your monitor can handle, you see issues like screen tearing and stuttering. For example, if your GPU is putting out 130 FPS, but your monitor is only 100Hz, your GPU is running faster than your monitor can handle.
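Put another way, the frame rate you actually see is capped by whichever number is lower, your GPU's output or your monitor's refresh rate. Here's a rough sketch of that relationship; it's a simplification that ignores frame pacing and VRR:

```python
def displayed_fps(gpu_fps: float, monitor_hz: float) -> float:
    """Rough upper bound on frames actually shown per second.
    Simplification: assumes even frame pacing and no VRR."""
    return min(gpu_fps, monitor_hz)

print(displayed_fps(130, 100))  # 100 -> roughly 30 rendered frames per second never reach the screen
print(displayed_fps(60, 144))   # 60  -> the 144Hz panel repeats frames while it waits on the GPU
```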

So, what is screen stuttering and tearing?

Screen Stuttering and Screen Tearing

When your graphics card produces more frames than your monitor can handle, or too few, it shows up on screen. If there's a notable mismatch between the two, your game may appear to lag. This is called stuttering.

Stuttering most often happens because your monitor is refreshing faster than your GPU can deliver new frames, though it can also happen when the GPU runs well ahead of the monitor.

Tearing, on the other hand, happens when your GPU's frame output is out of step with your monitor's refresh cycle, so new frames arrive partway through a refresh.

It shows up as a visible line across the display where one part of the screen is showing one frame and the rest is showing the next.
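To make that concrete, here's a toy sketch, not how a real driver works, of a GPU pushing a steady 130 FPS to a 100Hz display with no syncing. Each new frame becomes ready at a different point inside the refresh cycle, and that drifting hand-off point is where the tear line shows up:

```python
refresh_ms = 1000 / 100  # 100Hz display: 10.0 ms per refresh
frame_ms = 1000 / 130    # 130 FPS GPU: roughly 7.7 ms per frame

for n in range(5):
    frame_ready = n * frame_ms
    offset = frame_ready % refresh_ms  # how far into a refresh the new frame lands
    print(f"frame {n}: ready at {frame_ready:5.1f} ms, {offset:4.1f} ms into a refresh")

# The offset drifts every frame, so the buffer swap keeps landing mid-scan.
# That mid-scan swap is the visible tear line, and it moves around the screen.
```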

Which is Better for Gaming?

Now that you’re familiar with Hz, FPS, and common screen issues, let’s discuss which Hz is better for gaming.

Gaming at 100Hz

A 100Hz monitor can refresh a maximum of 100 times per second, making it a great tier of monitor for most casual gamers. Paired with a reasonable graphics card, a 100Hz setup will run any popular esports title at 60+ FPS on medium-to-high settings.

You really only start losing performance with some graphics-intensive games like GTA V and The Witcher 3. These games have dense, rich environments and, as such, are harder for your GPU to process.

Even if your GPU is top-tier, a 100Hz monitor will miss out on some of the extra frames your GPU produces. For many, that's a sacrifice worth making to save money.

Gaming at 144Hz

With a 144Hz monitor, you get a much wider margin for capturing those fine details. A good graphics card will be able to match 144Hz in most cases, again barring graphics-intensive games.

With a high-quality GPU, however, you can play even the most rigorous games on your desired settings. 144Hz is more than enough to capture all of the frames your GPU is producing. Additionally, the gameplay will be much smoother, and the risk of screen tearing or stuttering is near zero.


To further reduce the risk of screen tearing or stuttering, you can purchase a monitor with variable refresh rate (VRR) technology.

Gaming with Variable Refresh Rate

Having a high Hz on your monitor is a great place to start; in fact, it’s enough for most gamers on its own. But, to fully utilize the power of your GPU and monitor combination, you can use variable refresh rate technology.

VRR support is built directly into the monitor's hardware and works hand in hand with your GPU. It functions as a balancer between your monitor's refresh rate and your GPU's FPS output.

When you play a demanding game, VRR adjusts your monitor's refresh rate on the fly to match your GPU's frame output as closely as possible. This removes the delay between the two, minimizing the risk of screen tearing or stuttering.
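Conceptually, you can think of VRR as clamping the panel's refresh rate to the GPU's frame rate, as long as that frame rate stays inside the range the panel supports. The sketch below is just that idea in code; the 48-144Hz range is an example, not the spec of any particular monitor:

```python
def effective_refresh_hz(gpu_fps: float, vrr_min: float = 48, vrr_max: float = 144) -> float:
    """Refresh rate a VRR panel would track for a given GPU frame rate.
    vrr_min and vrr_max are example values; real panels publish their own range."""
    return max(vrr_min, min(gpu_fps, vrr_max))

print(effective_refresh_hz(90))   # 90  -> the panel refreshes in step with the GPU
print(effective_refresh_hz(170))  # 144 -> capped at the panel's maximum
print(effective_refresh_hz(30))   # 48  -> below the VRR range; panels handle this case differently
```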

Common VRR technology comes in the form of Nvidia's G-Sync or AMD's FreeSync, and the two differ in how they're implemented. G-Sync is more expensive and relies on dedicated hardware built into the monitor, while FreeSync is a royalty-free open standard.

In Conclusion

To summarize, 144Hz is better than 100Hz for gaming thanks to its higher refresh rate. The higher refresh rate lets the monitor display more of the frames your GPU produces, resulting in smoother gameplay. 100Hz is enough for many, but 144Hz provides a superior experience.

FAQ’s

Is 100Hz Enough for Gaming?

The minimum refresh rate recommended for gaming is 60Hz, which caps you at a 60 FPS experience even in undemanding games. At 100Hz, you nearly double that, making it a viable option for budget gamers who want to play games with rich environments on medium settings.

Keep in mind, however, that to utilize all of that refresh rate speed, you need a GPU capable of outputting, at a minimum, 100 frames per second. Always check the specifications of your graphics card and your monitor before making a decision.

Can You Tell the Difference Between 144Hz and 165Hz?

The difference between 60Hz and 100Hz is easy to notice, and the jump from 100Hz to 144Hz is less so. Going from 144Hz to 165Hz is only about a 15% increase (21Hz), which is very hard to perceive at refresh rates that are already so high.

You may be able to notice the difference when playing extremely graphics-intensive games, but for most purposes, the speed difference is negligible.

Is Variable Refresh Rate Technology Expensive?

If you’re looking at G-Sync monitors, with VRR built-in, you’ll notice a gap of about $200 between those and regular monitors of similar capabilities. AMD’s Freesync is a cheaper alternative, though it doesn’t offer the same performance improvements as G-Sync.

