6 PC gaming superstitions busted

The average PC gamer may not be superstitious about walking underneath ladders or being cursed by black cats, but we have plenty of questionable rituals and myths that have built up across decades of message boards and Reddit threads. Established beliefs about mouse DPI, framerates, and thermal paste are confidently passed down from gamer to gamer, even when they’re flat-out wrong or haven’t been relevant since the days of Counter-Strike 1.6.

Here are six PC gaming superstitions I’ve seen crop up time and again over the years, and I’m ready to set the record straight on every one of them.

The human eye can’t distinguish framerates beyond 60 fps

Superstition: Busted

Thankfully the debate over whether we can see the difference between 30 frames per second and 60 has pretty well been squashed in recent years by console games that support visual/performance modes and feel way better to play using the latter. But the occasional misguided internet commenter will still insist that our eyes’ and brains’ ability to process motion tops out at about 60 fps, which is just all sorts of wrong. Part of what makes this topic confusing is that our eyes don’t work in “frames per second,” but the key takeaway is that our eyes are pretty dang good at detecting motion. Years ago we spoke to a visual psychologist and a professor of brain science to explore this topic in-depth.

I suspect as 360Hz monitors become more common, we’ll start seeing more research digging into how we perceive motion at higher framerates. Blur Busters thinks we’ll see benefits up to 1,000Hz and beyond.
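If you want intuition for why those higher refresh rates can still matter, look at frame times instead of framerates. Here’s a quick back-of-the-envelope sketch in Python (simple arithmetic, not measurement data):

```python
# Frame time is where the diminishing-but-real returns show up: each jump
# in refresh rate shaves fewer absolute milliseconds off the gap between
# frames, but the gains don't suddenly stop at 60 Hz.
for hz in (30, 60, 144, 240, 360, 1000):
    print(f"{hz:>5} Hz -> {1000 / hz:.2f} ms per frame")
```

Going from 30 to 60 Hz cuts about 16.7 ms per frame, while going from 360 to 1,000 Hz cuts under 2 ms, which is exactly why measuring what we can actually perceive up there takes careful research.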

Low mouse DPI settings like 400–800 are “better” for first-person shooters

Superstition: Double busted

There are two bits of advice here that often get conflated. One is that you’re going to perform better in a game at a lower mouse sensitivity level or DPI setting, requiring you to make broader physical movements with your mouse to move your crosshair further on the screen. The second is that your mouse will actually be more accurate at a low DPI like 400 or 800. I’m going to say these are both wrong in 2024.

First, anyone who tells you there’s a “best” DPI setting is full of it. The correct choice of mouse sensitivity is completely personal and affected by the size and resolution of your monitor. If you set your mouse to 1600 DPI, it’s going to cross the length of a 20-inch 1080p monitor much, much faster than a 30-inch 4K monitor, for example; no message board declaration of the best DPI for a game will take into account the details of your PC. This is also completely dependent on how you like to play games. It’s okay to prefer small mouse movements at higher sensitivity over big ones at lower sensitivity if that works for you. If you’re struggling with your accuracy and find your crosshair movements too erratic, try lowering your mouse sensitivity in-game. If you can never turn around fast enough to catch someone coming up behind you, maybe bump it up a little. Find your sweet spot.
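If you want to compare setups across different DPI values, the number that actually matters is how many centimeters of mousepad it takes to do a full 360° turn in-game. Here’s a rough Python sketch of that calculation; the 0.022°-per-count yaw is Counter-Strike’s default and an assumption here, since other games use different values:

```python
# cm/360: how far the mouse physically travels for one full in-game turn.
# Assumes a Source-engine-style yaw of 0.022 degrees per mouse count
# (Counter-Strike's default); other games and engines will differ.
def cm_per_360(dpi: int, in_game_sens: float, yaw: float = 0.022) -> float:
    counts_for_full_turn = 360 / (in_game_sens * yaw)
    return counts_for_full_turn / dpi * 2.54  # counts -> inches -> cm

# The same effective sensitivity from two very different DPI settings:
print(f"{cm_per_360(400, 2.0):.1f} cm per 360 at 400 DPI, 2.0 sens")
print(f"{cm_per_360(1600, 0.5):.1f} cm per 360 at 1600 DPI, 0.5 sens")
```

Both print about 52 cm, which is the point: DPI and in-game sensitivity trade off against each other, so the DPI number on its own tells you very little.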

But what actual DPI setting should you be using with your mouse? There’s no universal answer. As I explained years ago while busting gaming mouse myths, a mouse sensor can perform slightly differently at different DPI settings, and back in the 2010s when gaming gear companies were racing to stick higher and higher DPI claims on their mice, they were usually doing it by subdividing pixels on the same old sensors. A sensor that was originally only rated for, say, 4000 DPI suddenly being advertised as supporting 16,000 DPI is going to be working with much “noisier” data, because the resolution of the sensor hasn’t changed—all that’s changed is how the images the sensor takes to discern movement are being divided up and processed.
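Here’s a deliberately crude toy model of that idea in Python. The native resolution, noise level, and scaling behavior are illustrative assumptions, not specs from any real sensor:

```python
import random

# Toy model: the sensor measures motion at its native resolution with a
# little noise, and an inflated DPI figure scales that same measurement
# up. The relative error is unchanged, but the reported counts jitter
# more in absolute terms because the noise is multiplied with the signal.
NATIVE_DPI = 4000  # assumed native sensor resolution
random.seed(42)

def reported_counts(true_inches: float, advertised_dpi: int) -> float:
    native = true_inches * NATIVE_DPI + random.gauss(0, 1)  # ~±1 count of noise
    return native * (advertised_dpi / NATIVE_DPI)  # scaled to the box number

move = 0.01  # a tiny 0.01-inch micro-flick
print(reported_counts(move, 4000))   # jitters by about a count
print(reported_counts(move, 16000))  # same flick, ~4x the absolute jitter
```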

In the last decade, though, gaming mice from high-quality brands have moved on to a new generation of sensors supporting much higher resolutions. If you’re using an old mouse or a real cheapo, your mouse may be infinitesimally more accurate at a lower DPI setting. But unless it was made pre-2010, I wouldn’t worry about turning it down lower than 800 DPI. And if your mouse was made anytime in the last 10 years, turning your DPI up into the thousands will be fine. Don’t sweat it. Just maybe don’t set it all the way to the max, either.

You have to reset your modem and then your router, in that order

Superstition: Mostly busted

A thousand websites battling for the SEO title of “how to reset internet” will give you the same basic advice: if something’s screwy with your home network, unplug your modem and router, wait 30 seconds, and then plug them both back in—starting with the modem first. Much rarer is an explanation of why you need to plug the modem back in first, or what will happen if you do carry out this process in the “wrong” order. I’ve reset my network with reckless router-first abandon more than once without issue, but I also see the basic logic in plugging in the modem first, so that it can establish a connection with your ISP before the router tries to pick up on that connection. But does it really matter? 

For an answer I turned to the networking channel of the Brad & Will Made a Tech Pod Discord server, where I knew some true network pros like to hang out. Member Cakebatyr, a software developer with CCNA certification from Cisco, rose to the challenge. After ruminating on the question, Cakebatyr came up with an extremely common-sense answer: “It depends™️.”

What does it depend on? Well, here’s a pretty obvious factor: how long it takes your devices to boot up. “My router takes about 3 minutes to boot, my cable modem takes less than that,” Cakebatyr said.

“There are two things happening here:

1. Power cycling the modem refreshes connectivity with the ISP and your modem
2. Power cycling the router kills and restarts the DHCP lease cycle from the ISP

“To me it comes down to which part takes longer to establish, and my gut feeling is that the modem would take longer to establish a connection to the ISP than a DHCP request from the router to the ISP.”

If your router ended up ready before your modem, most likely it would just re-check for a restored internet connection after a minute. So the wrong order might cost you a whole minute or two of extra downtime.
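That self-healing behavior is easy to picture as a retry loop. Here’s an illustrative Python sketch of the idea, not any router’s actual firmware; note the ping flag shown is the Linux/macOS form, and Windows uses -n instead:

```python
import subprocess
import time

# Roughly what a router does while the modem is still booting: keep
# probing for upstream connectivity and try again after a pause.
def wait_for_upstream(host: str = "1.1.1.1", retry_seconds: int = 60) -> None:
    while True:
        result = subprocess.run(
            ["ping", "-c", "1", host],  # one probe; use "-n" on Windows
            capture_output=True,
        )
        if result.returncode == 0:
            print("Upstream is back, carrying on.")
            return
        print(f"Still down; retrying in {retry_seconds}s.")
        time.sleep(retry_seconds)
```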

To take things a step further, Cakebatyr asked three friends who work at internet service providers and got one “no comment,” one confirmation of our answer here (“Order doesn’t really matter / down to device boot times”), and this XKCD comic.

I’m convinced—for peak efficiency, power cycle whichever device takes longer to restart first.

Motion blur is always bad

Superstition: Reluctantly busted

Look, I know you just read the words “motion blur” and had to suppress the urge to throw up. I hate it when my entire screen turns into a continuous smear every time I move the camera, and I’d wager motion blur is the most frequently disabled setting in all of PC gaming. But I’ll begrudgingly admit not all motion blur is equally bad. Well-implemented per-object motion blur can help smooth out animations, and in certain games can even help paper over a lower framerate.

We wrote at length about why gamers dislike motion blur and also about why it has good reason to exist, and the cases where it does make sense to turn it on. Meanwhile, I’ll let Digital Foundry make the case for how effective motion blur can be for 30 fps games in particular.

Unplugging USB devices without safely ejecting them first is fine 

Superstition: Mostly busted

Well, shit. I’ll admit I’ve spent most of my life gleefully ripping USB thumb drives out of my PC without ejecting them first in the Windows system tray. And it usually seems fine! Once or twice I screwed up some files by being a bit trigger happy and removing the drive while the data transfer wasn’t quite finished, but usually no issues… that I was aware of. Jacob recently connected the dots between this carefree behavior and Windows finding errors on USB drives, though. We are the problem, it turns out.

Being cavalier about removing your external drive isn’t guaranteed to cause any damage, but it’s a very real possibility even with modern Windows and speedy flash storage. Then again, who wants to commit that extra five seconds of effort? 😒
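The underlying issue is write caching: data can sit in the operating system’s buffers after a copy “finishes,” and safe ejection exists largely to flush them before the drive disappears. If you’re scripting your own file transfers, here’s a small Python sketch of forcing that flush; the drive path is made up for the example:

```python
import os

# Push data all the way to the device before a drive gets yanked.
# flush() empties Python's own buffer; os.fsync() asks the OS to commit
# its cached writes to the physical drive. Safe ejection performs this
# kind of flushing for everything on the volume.
def write_durably(path: str, data: bytes) -> None:
    with open(path, "wb") as f:
        f.write(data)
        f.flush()
        os.fsync(f.fileno())

write_durably("E:/backups/save.dat", b"precious game saves")  # hypothetical path
```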

You shouldn’t use too much thermal paste!!!

Superstition: Busted*

When I was first building PCs, the “pea-sized dot” of thermal paste was gospel. Using too much paste was a big no-no, as was spreading it before you put the cooler on. In reality? It pretty much doesn’t matter, and if anything you’re better off using too much than too little.

There are a million YouTube videos on quantities and techniques of thermal paste application, but Gamers Nexus’s thorough testing reveals that, in general, there’s a tiny performance difference between a small dot of paste and a larger one. Even going way overboard with thermal paste, causing it to sploosh out the sides of your CPU cooler, is fine—with the caveat that if you’re using a conductive thermal paste (like liquid metal), you could potentially short out other components around the CPU. But again, that’s only if you’re going way overboard.

One other exception is that if you’re using a really large CPU like AMD’s Threadripper, the normal pea-sized dot isn’t going to properly cover the entire surface of the chip. But in the course of building a normal PC? Just slop some paste on there and move on. 

Half-Life 3 confirmed 

Superstition: My cousin who works at Valve told me it’s true 
