Building a 4K Ultra HD Gaming PC 2 of 2
« on: May 09, 2014, 03:53:01 AM »
Putting It All Together

Now comes the fun part — assembly. For many it's the most rewarding part of a new PC build, the same way a baker loves combining ingredients to create a cake capable of running Crysis 3. For others it's a parade of opportunities to destroy the expensive components they have gathered together. I fall somewhere in the middle, largely because the first motherboard I laid my hands on professionally went up in a puff of smoke, resulting in me never being asked to cover for Digital Equipment Corporation's repair department ever again.

Once the computer is powering up, the rest is quick and easy. Installing your operating system, finding the correct drivers, finding the really correct drivers and completely configuring the software side of things to your liking shouldn't take much more than a week and a half. After that, it's a gaming PC!

The Ultra HD Experience

Upon loading the graphics card drivers and setting the monitor resolution to 3840 x 2160, I was immediately struck by how desperately I need a new pair of glasses. Having spent the better part of a decade at 1920 x 1080, suddenly having four times that in an area only slightly bigger than my regular office monitor was surprising and squinty.

It looks like this:

[Image: the Windows desktop at 3840 x 2160]

For the full effect, open that image (3840 x 2160) in a new tab. Massive, isn't it? Only it isn't. Everything is tiny. Everything is small when you're Ultra HD. Sing along if you know the words. It's a ridiculous amount of screen real estate to play with. I also had to set my mouse's dots-per-inch to its highest setting so the pointer didn't take half a day to traverse the monitor.
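
If you want the back-of-the-envelope version of the mouse problem, here's a quick sketch in Python. It assumes a 1:1 mapping with no pointer acceleration, and the DPI values are just common sensor settings, not the exact mouse I used:

```python
# Physical mouse travel needed to cross the screen at a 1:1 mapping with no
# pointer acceleration: a sensor reports one count per 1/DPI inch, so crossing
# 3840 pixels takes (3840 / DPI) inches of desk. DPI values are assumptions.
SCREEN_WIDTH_PX = 3840

for mouse_dpi in (800, 1600, 3200, 6400):
    inches_of_travel = SCREEN_WIDTH_PX / mouse_dpi
    print(f"{mouse_dpi:>5} DPI -> {inches_of_travel:.1f} inches to cross the screen")
```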

For us older folks and those with shaky eyesight, there are two ways to deal with an Ultra HD Windows desktop. Either set the default resolution lower and save Ultra HD for games, or get really, really close to your monitor.

I'm not joking. Lean in close to your standard 1920 x 1080 monitor. At six inches away, the space between pixels might as well be waving a giant "I'm the Space Between Pixels!" sign. Six inches from Sharp's Ultra HD monitor, all I see are graphics and letters. Incredibly sharp graphics and letters.

It's like the difference between an iPad 2 and an iPad 4 with Retina display. If you're used to a 2, and someone hands you a 4 to look at, your instinct is to stab them in the eyes with your stupid old tablet and run away with theirs. This loaner monitor makes me want to stab AMD in the face. It's a good thing.
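
To put rough numbers on that sharpness, here's a quick pixel-density sketch. The 32-inch diagonal matches the Sharp panel used for this build; the 24-inch 1080p monitor is an assumed comparison point:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch along the diagonal of a display."""
    return math.hypot(width_px, height_px) / diagonal_inches

# 32 inches matches the Sharp Ultra HD panel used for this build;
# the 24-inch 1080p monitor is an assumed comparison point.
print(f"32-inch 3840 x 2160: {ppi(3840, 2160, 32):.0f} PPI")  # ~138 PPI
print(f"24-inch 1920 x 1080: {ppi(1920, 1080, 24):.0f} PPI")  # ~92 PPI
```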

Gaming in Ultra HD

There's not as much squinting involved, but you'll still want to get up close to your monitor when playing a game — that's where the magic happens. Where normally you'd start to see the image begin to break down, instead you're experiencing a level of detail not possible with a lower resolution screen.



[Image: Tomb Raider running at 3840 x 2160]

Again, open this screenshot of Tomb Raider running at 3840 x 2160 in a new tab, and then zoom in. Pan around, soak it all in. I've never appreciated texture work as much as I do playing a game running at Ultra HD resolution.

But there's a downside as well. Check out this shot of LEGO Marvel Super Heroes.

[Image: LEGO Marvel Super Heroes at 3840 x 2160]

Again, amazing details, but I'm also picking up on some flaws I wouldn't have noticed before. Why is the road paint texture crawling up the back of Hulk's foot? Is that character picture in the top left hand corner really that grainy?

The same way the first round of HDTVs alerted us to the fact that actors have pores, Ultra HD might uncover tiny blemishes in our games we might have missed otherwise. As adoption of the new standard increases, I'm sure we'll see less and less of our games' greasy pores.

For the most part, the games I tested on the completed system looked just lovely in Ultra HD, but how did they play?

Performance

As soon as I verified the PC I built was not in immediate danger of melting, I installed Steam and began downloading BioShock Infinite, my current go-to benchmarking tool. I'm always happy to see Elizabeth, no matter how many frames per second she appears at. Within the hour, there she was. You should probably enlarge this one. It's kind of breathtaking.



[Image: BioShock Infinite at 3840 x 2160]

Breathtaking, but only running just north of 30 frames per second on Ultra settings. For many people, that's not a problem — 30 frames per second is respectable. I'm sure someone who had just paid $3,000 for a monitor capable of running at 60Hz would be fine with Vsync locking everything at 30 because it can't even get close to 60.
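
For the curious, here's a simplified sketch of why a 60Hz panel with traditional double-buffered Vsync snaps everything between 30 and 60 frames per second down to 30. The render rates below are made-up examples, not benchmark results:

```python
import math

# Simplified double-buffered Vsync on a 60Hz panel: a finished frame waits for
# the next refresh, so any render rate between 30 and 60 fps is displayed at 30.
REFRESH_HZ = 60
refresh_interval = 1.0 / REFRESH_HZ  # ~16.7 ms

for render_fps in (75, 59, 45, 31):  # made-up render rates for illustration
    render_time = 1.0 / render_fps
    refreshes_waited = math.ceil(render_time / refresh_interval)
    displayed_fps = REFRESH_HZ / refreshes_waited
    print(f"GPU renders at {render_fps} fps -> Vsync displays {displayed_fps:.0f} fps")
```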

I tried a couple more titles — Batman: Arkham Origins, Tomb Raider — both benchmarking in the high 30s. It was at this point I turned completely stupid. I got in touch with the AMD tech I'd been communicating with and voiced my concerns over the low frame rates. I was worried I had done something wrong, maybe messed up a setting or something.

With incredible patience and without laughing once, he explained that the graphics card was now responsible for rendering four times as many pixels as it would be at 1920 x 1080, so the frame rate would logically be a fraction of what it would be at the lower resolution. I was a man who had suddenly grown four times his size, wondering why the same meal I'd eaten the night before wasn't as filling.
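
The arithmetic behind his explanation is simple enough to sketch out. The 1080p frame rate below is an assumed figure for illustration, not something I measured:

```python
# Ultra HD pushes four times the pixels of 1080p, so per-frame GPU work scales
# roughly in proportion when pixel shading is the bottleneck.
pixels_1080p = 1920 * 1080           # 2,073,600 pixels
pixels_uhd = 3840 * 2160             # 8,294,400 pixels
scale = pixels_uhd / pixels_1080p    # 4.0

fps_at_1080p = 120                   # assumed single-card 1080p figure, for illustration
print(f"Pixel ratio: {scale:.0f}x")
print(f"~{fps_at_1080p} fps at 1080p -> roughly {fps_at_1080p / scale:.0f} fps at Ultra HD")
```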

He offered me two possible solutions. I could run the games at less than optimal settings, or he could loan me a pair of these:

[Image: the MSI Lightning Radeon R9 290X]

At nearly twice the height of a standard high-end graphics card and weighing in at three and a half pounds, the $799 MSI Lightning Radeon R9 290X beats the living hell out of any heat issues AMD's reference card may have had. In their review, Tom's Hardware called it "the Radeon R9 290X done right." If you were facing off against the R9 290X in a boss fight, this would be its final form.

After careful consideration, I opted for the incredibly impressive graphics cards. Even the packaging was impressive (image via Tom's Hardware).

[Image: MSI Lightning R9 290X retail packaging, via Tom's Hardware]

I installed both of the MSI cards into my system, and it went completely insane. I'd start a game — any game — and as soon as the polygons started flying, one of the cards' fans started going bonkers. The screen would tear, artifacts would appear, and if I let this go on long enough the system would crash.

We're still not sure what the problem was. I thought it might be an issue with stacking the cards on top of each other in the system, but I've had video cards on top of video cards in the past with no problems. The tech suggested I separate them, but the only other PCIe configuration that would work for the super-sized cards would put one of them a half-inch inside of my power supply.

In the end, we wound up compromising — one MSI Lightning R9 290X and AMD's reference card, living together in perfect harmony. It wasn't optimal, but it got the job done.

So, how'd we do?

[Image: benchmark results with the second card installed]

Much better. Adding another video card essentially doubled the frame rate of every game I tested, bringing them past the 60 mark and allowing them to sync nicely with the pricey monitor. All except Crysis 3, which is a dick to Ultra HD and computers in general.
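
As a very rough sketch of what the second card buys you: alternate frame rendering can approach double the throughput when it scales well. Both numbers below are illustrative assumptions, not measurements from this build:

```python
# Rough dual-GPU sketch: alternate frame rendering can approach 2x throughput.
# Both figures are illustrative assumptions, not measurements from this build.
single_card_fps = 35          # ballpark of the single-card "high 30s" results above
scaling_efficiency = 0.9      # assumed; real CrossFire scaling varies per title

dual_card_fps = single_card_fps * (1 + scaling_efficiency)
print(f"{single_card_fps} fps with one card -> ~{dual_card_fps:.0f} fps with two")
```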

If you're a huge fan of triple digit frame rates, they can still be achieved, but you're going to have to turn down the bells and whistles. Taking BioShock Infinite a couple steps back from Ultra to High brings the average frames per second to just over 100. Pick up a monitor where that doesn't result in jaggies and tears, and you're in business! I'm fine with around 60, for now.

Now that we've spent several months gathering components, assembling a PC from scratch, playing games and learning to respect AMD as a PC hardware manufacturer, it's time to box up the pricey monitor and video cards and send them home. It's for the best. In a few months the people in my neighborhood are going to figure out computers, and I'll be more at risk for a break-in than ever before.

While I try to find the styrofoam inserts for the monitor box, let's wrap this up.

Are We There Yet, Ultra HD Gaming?

Not without a whole hell of a lot of money, we're not. It still takes at least a pair of higher-end graphics cards to get frame rates into the 60s in recent graphics-intensive titles, and a halfway-decent monitor still costs multiple thousands of dollars. Outside of enthusiasts with money to burn and members of the gaming press borrowing a whole mess of hardware from the likes of AMD, Ultra HD is beyond the means of most consumers.

It's going to take a steep drop in display prices. Displays even bigger than the Sharp 32-inch I used here — the clarity of Ultra HD resolution means we can put larger monitors on our desktops — need to be readily available and relatively affordable. Once you start seeing Ultra HD displays being sold at Wal-Mart, you'll know we're there.

The graphics hardware is getting better. AMD recently launched the $1,500 R9 295X2, a dual-GPU card specifically designed with Ultra HD and multiple monitor gaming in mind. Nvidia has its $3,000 Titan Z, created with the same intent. The hardware is out there. Most of us just have to wait for it to come down to our level.

When it does — when every PC gamer (not wearing a virtual reality headset) can sit a couple of feet away from a 50-inch display without worrying about picture degradation or eye strain — then we'll be there, and it will be gorgeous.



Source: TechSpot