As mentioned on our first TechItAll podcast, we’re open to answering any questions you may have about hardware, software, or tech in general. We’ll collect the good ones and try to answer them either on the podcast or in written form like this. Feel free to send your questions using the hashtag #GameItAllTech, visit our Facebook page, or comment below! (We’re working on a centralized contact point for the near future.)
Does it matter if I use an SSD or HDD for games?
For the most part, no, surprisingly. While I always recommend having at least a small SSD (Solid State Drive) for your OS (Operating System), most games are actually optimized to run from HDDs (Hard Disk Drives). Most of a game’s loading happens at initial start-up and between levels, so there isn’t much difference once you’re in. One of the few exceptions is an open-world game that continuously streams map data as you move around in real time.
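If you’re curious how your own drives compare, here’s a minimal Python sketch that times a sequential read, roughly the way a level load streams assets. The file paths are placeholders; point each at a large file on the drive you want to test, and keep in mind a second run may be skewed by the OS caching the file in RAM.

```python
import time

def read_speed(path, chunk_size=1024 * 1024):
    """Read a file in 1 MB chunks and return throughput in MB/s."""
    start = time.perf_counter()
    total = 0
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            total += len(chunk)
    elapsed = time.perf_counter() - start
    return (total / (1024 * 1024)) / elapsed

# Hypothetical paths -- swap in a big game file on each drive.
print(f"SSD: {read_speed('D:/games_ssd/big_asset.pak'):.0f} MB/s")
print(f"HDD: {read_speed('E:/games_hdd/big_asset.pak'):.0f} MB/s")
```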
Is it ok to just use the stock CPU cooler?
Most modern CPUs (Central Processing Units) can’t be overclocked by design; if you cough up a little more cash for the K- or X-suffixed chips (in Intel’s case, for example), you can overclock, but there still won’t be much thermal headroom. Most stock coolers that come with CPUs are the bare minimum to keep the chip running at its factory settings. Recently, though, AMD released its Wraith coolers as a way to curb that trend. Still, if you’re serious about overclocking, it’s worth upgrading to an aftermarket cooler and heatsink, as you’ll get much better performance out of it; ideally, go with a liquid-cooled solution.
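If you want to keep an eye on your temperatures before and after a cooler upgrade, here’s a minimal sketch using the psutil package. Note that sensors_temperatures() only works on Linux/FreeBSD; on Windows you’d reach for your board vendor’s utility or a tool like HWMonitor instead.

```python
import psutil  # pip install psutil

# Print every temperature sensor psutil can see (Linux/FreeBSD only).
temps = psutil.sensors_temperatures()
for chip, readings in temps.items():
    for r in readings:
        label = r.label or "sensor"
        print(f"{chip} {label}: {r.current:.1f} °C (high threshold: {r.high})")
```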
My PC has a video port on the motherboard. Do I still need a graphics card?
If you plan to play any modern 3D-intensive game at reasonable settings, then absolutely. Integrated graphics like Intel’s Iris or AMD’s APU (Accelerated Processing Unit) solutions have indeed come a long way, but they can’t hold a candle to what dedicated cards can do. That gap only widens once you get into 4K gaming or multiple monitors.
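If you’re not sure what your system is actually rendering with, here’s a quick sketch that lists the graphics adapters the OS reports. The commands are assumptions for typical setups: wmic on Windows, lspci on Linux.

```python
import platform
import subprocess

# List the graphics adapters the OS knows about, so you can tell
# integrated graphics apart from a dedicated card.
if platform.system() == "Windows":
    out = subprocess.run(
        ["wmic", "path", "win32_VideoController", "get", "name"],
        capture_output=True, text=True,
    ).stdout
else:
    lspci = subprocess.run(["lspci"], capture_output=True, text=True).stdout
    out = "\n".join(line for line in lspci.splitlines() if "VGA" in line or "3D" in line)

print(out.strip())
```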
Do I absolutely need a paid antivirus?
Not necessarily. There are a lot of free antivirus options available now, and many of them are on par with each other, so a little research into their reviews can help you decide. Just be careful not to click ‘yes’ to everything during setup, as the installers sometimes try to add extra programs you don’t want. (Sponsored software is how they make their money.)
Personally, I just use Windows 10’s built-in Windows Defender and keep Malwarebytes Anti-Malware on the side for deeper scans. I haven’t had an issue to date with that combination.
Does a “Killer” Network card really give me an advantage with online games?
Not realistically, no. While they do shave off a few milliseconds, your overall network performance really depends on a load of other factors: your ISP’s (Internet Service Provider’s) servers, connection type, network load, the game’s servers, wired vs. wireless, and so on. A strong connection like cable or fiber does far more for your speed. Beyond that, most game servers apply a network-prediction algorithm to level the playing field, so unless you’re on a very slow connection or very far from the physical game servers, you won’t see a difference with or without one. Best to put that part of your budget toward something else.
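If you want to see where your real latency comes from, here’s a minimal sketch that times a few TCP handshakes to a server of your choosing, as a rough stand-in for in-game ping. The host and port are placeholders; a card that shaves 1-2 ms is hard to notice next to round trips of 30-80 ms to a distant server.

```python
import socket
import time

def tcp_ping(host, port=443, attempts=5):
    """Time a handful of TCP handshakes and return round trips in milliseconds."""
    samples = []
    for _ in range(attempts):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=2):
            samples.append((time.perf_counter() - start) * 1000)
    return samples

# Hypothetical host -- swap in a server near your game's region.
rtts = tcp_ping("example.com")
print(f"avg {sum(rtts) / len(rtts):.1f} ms, best {min(rtts):.1f} ms")
```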
Do I need a 144-160 Hz monitor?
No. Anything from 60-90 Hz should be just fine, unless you plan to use polarized 3D, in which case 120 Hz or more is necessary since it needs to refresh quite fast. Instead, you may want to look for something that supports Nvidia’s G-Sync or AMD’s FreeSync, depending on which graphics card you’re using, to smooth out framerates.
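To put the refresh numbers in perspective, here’s the frame-time math; the gains shrink quickly as the numbers climb.

```python
# Time between refreshes at common rates: 60 -> 144 Hz saves about 9.7 ms per frame,
# while 144 -> 160 Hz saves well under 1 ms.
for hz in (60, 90, 120, 144, 160):
    print(f"{hz:>3} Hz -> {1000 / hz:.1f} ms per frame")
```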
Can I use my flat-screen TV as a gaming monitor?
A flat-screen TV can work as a monitor; just make sure it has a response time under 5 ms so it doesn’t feel laggy. You may also need to manually set a custom resolution, as TVs tend to overscan. (This means they crop the picture around the perimeter, limiting your usable screen real estate by default; it’s a relic of how the TV industry works.)
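As a rough back-of-the-envelope example (the crop percentages here are just illustrative, not a spec), here’s how much picture a typical overscan setting can eat:

```python
# Assumed per-edge crop percentages; actual overscan varies by TV model and picture mode.
width, height = 1920, 1080
for per_edge in (0.025, 0.05):
    visible_w = int(width * (1 - 2 * per_edge))
    visible_h = int(height * (1 - 2 * per_edge))
    print(f"{per_edge:.1%} cropped per edge -> roughly {visible_w}x{visible_h} visible")
```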