Upgrading GPU

Please use this forum to ask our resident IT geeks advice.
Miztaziggy
Posts: 2451
Joined: Fri Apr 15, 2011 9:15 pm
Location: Tadcaster

Upgrading GPU

Post by Miztaziggy »

Not sure if anyone here will be able to help with this, but I currently have 2 x AMD 5870 GPUs in CrossFire. They get pretty good frame rates (e.g. over 90 FPS in BF3 at 1920x1200 on ultra), but there is definite stutter ("micro stutter", as they call it), and I think it comes from the CrossFire configuration.

Anyway, BF4 is out next month, and I have been playing the beta, and it's the same issue. It's nice and smooth with good frame rates, but there is occasional micro stutter, which annoys the crap outta me.

I was thinking of going for one of the new R9 280x cards:

http://www.scan.co.uk/products/3gb-sapp ... eams-dp-2x

It's basically a 7970 rebadged with a few extra bits.

I was going to just go ahead and buy two of them and go for CrossFire again, but I'm wondering whether they still have the CrossFire stuttering.

Has anyone got 7-series cards in CrossFire, or a single 7970, and played any of the newer PC games to test frame rates?
Virt
Posts: 6793
Joined: Wed Dec 12, 2012 12:35 pm
Location: Leicestershire

Re: Upgrading GPU

Post by Virt »

Micro-stutter will exist in all AMD cards until they sort out their drivers.

You'll probably be fine with just one of the cards; 30 fps is more than the human eye can individually identify, so it gives smooth animation. Although 60 fps supposedly reduces eye fatigue over longer periods of time, I really don't see it as necessary.

I think Nvidia have better drivers in terms of their SLI configuration for micro-stutter, but I'm an AMD fanboy, so I would never suggest going to the dark side :lol:
Slowly approaching the more bikes than birthdays achievement

Re: Upgrading GPU

Post by Miztaziggy »

Virt wrote:Micro-stutter will exist in all AMD cards until they sort out their drivers.

You'll probably be fine with just one of the cards; 30 fps is more than the human eye can individually identify, so it gives smooth animation. Although 60 fps supposedly reduces eye fatigue over longer periods of time, I really don't see it as necessary.

I think Nvidia have better drivers in terms of their SLI configuration for micro-stutter, but I'm an AMD fanboy, so I would never suggest going to the dark side :lol:
Yeah, anything over 30 fps is fine, but the frame rate your computer is producing is totally different from what your eye perceives, because frame rates vary quite a lot.

Say you're getting 60 fps average: when the scene isn't changing it'll be much higher, whereas if there's a lot of action it goes right down. Playing a game at a 30 fps average feels dreadful because of all the dips in performance. I play quite a lot of games on the PC, and I always find that 40 fps is the absolute minimum, but 60 fps is usually just about right. That's why I bought a second 5870 to CrossFire mine when BF3 came out.
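The averages-versus-spikes point is easy to show with a quick sketch (the frame times below are made up for illustration, not real benchmark data): two runs with identical average FPS but very different worst frames.

```python
# Two runs of 8 frames each; frame times in milliseconds.
# Both runs average ~16.67 ms per frame (~60 FPS), but the
# second one has a single long spike -- the "micro stutter".
smooth = [16.67] * 8
stutter = [10, 10, 10, 10, 10, 10, 10, 63.36]

def avg_fps(frame_times_ms):
    """Average FPS = total frames / total seconds."""
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000)

def worst_frame_fps(frame_times_ms):
    """Instantaneous FPS of the slowest frame -- what a stutter feels like."""
    return 1000 / max(frame_times_ms)

print(round(avg_fps(smooth)))           # -> 60
print(round(avg_fps(stutter)))          # -> 60, same average
print(round(worst_frame_fps(smooth)))   # -> 60
print(round(worst_frame_fps(stutter)))  # -> 16, the spike you notice
```

An FPS counter reporting the average would call both runs "60 fps", which is why frame-time plots, rather than average FPS, are what reviewers use to show CrossFire/SLI micro-stutter.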

Yeah, I do think Nvidia maybe have better drivers, but I too am an AMD fan. Mainly because you can get the same performance for half the price, but also because, more recently, they have announced Mantle. It's a low-level API they developed with DICE, and it's apparently going to take over from DirectX. I've read that it increases performance by huge amounts, and it will be implemented in the new Frostbite engine for BF4 and introduced as a patch in December.

Mantle will basically be a common API between the new consoles and the PC, which will totally destroy Nvidia if they stick to DX. You'll have the choice of an Nvidia card with DirectX and half the frame rate, or ATI cards that are compatible with Mantle and run much better. So that's the main reason I'm sticking with the red team.

Re: Upgrading GPU

Post by Virt »

True, but you can always adjust quality and render settings either from the menu or the .ini configuration file within game files.
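For example, an engine .ini often exposes things like this (the section and key names here are made up for illustration; every game uses its own):

```ini
; Hypothetical render settings -- real names differ per game/engine
[RenderSettings]
ResolutionScale=0.8   ; render at 80% of display resolution, then upscale
ShadowQuality=Medium  ; shadows are usually the biggest easy win
MaxFPS=60             ; a frame cap can also even out frame pacing
```

Dropping a couple of settings like these can often buy back enough frame-time headroom to stop the dips without touching the rest of the visuals.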

Mantle is honestly an unknown quantity at the moment; it depends on the work required to integrate it into a game. It's no secret that developers are lazy buggers (as a dev myself I can confirm this, though I obviously don't work on games) and like to avoid work. We're wonderful people like that, and if they have to put in extra work to support a new API, it might stick for a while, but they will eventually get fed up.

On consoles, where it's probably a necessity, I can see it being done; in PC gaming, where the attitude is always "yeah well, if they're gonna have to upgrade their rig to play this then oh well", I don't see it being adopted so easily.

Re: Upgrading GPU

Post by Miztaziggy »

I hate having to run games on lower graphics quality settings because my PC can't handle it. When I have to do that, I know it's time to upgrade.

As for Mantle, I know what you're saying, and I thought the same at first: it'll all depend on adoption. But the beauty of it is that the new consoles are based on PC architecture with the same AMD GPUs. The way a developer writes a game for a console is by using a low-level API to get the most out of the hardware, seeing as consoles have much less processing power than PCs.

With the PS3 and Xbox 360 the architecture is different from the PC, so PC versions are just ports dumbed down to console level rather than rewritten properly for the PC.

With Mantle, the API developers use for the PC and for the consoles will be the same. Basically it will mean less work for the developers and much better quality games on the PC. Hopefully they will be able to make a game that actually does scale up graphically on the PC now.

It's already incorporated into Frostbite 3, which will (apparently) be used in the majority of the next-generation EA games. Everything from BF4 to Command & Conquer, Mirror's Edge, Dragon Age, Star Wars Battlefront, Need for Speed, Mass Effect 4... etc., etc.

Personally, I think something like this has been a long time coming. DX was a necessity when it first came out; I remember the days when you had to have a specific 3D graphics card, such as a 3dfx Voodoo. I had one :). But they didn't work with all games. DirectX came along and levelled the playing field: it worked with all available 3D graphics cards and any game that came after.

Nowadays, DX has apparently become bloated and a hindrance to developers. According to John Carmack of id Software, a console can get twice the performance out of its inferior hardware that a PC can. The culprits are DX and Windows. Gaming graphics seemed to peak when Crysis came out, but then took a massive step backward with Crysis 2 because it was made for the consoles. There is still nothing as visually impressive as Crysis, and that came out what, six years ago?

I think Mantle and maybe Valve's SteamOS will go a long way to improving things.

Re: Upgrading GPU

Post by Virt »

Hmm, agree and disagree. Yes, it all depends on adoption. It takes a lot to convince people to change their methods, especially developers, as being comfortable with one thing does not mean you're comfortable in general. I can do most tasks I'm assigned in JavaScript with relative ease; ask me to use a new API for a JavaScript plugin and I'm suddenly a script kiddie again. It's not something developers like; a lot of them are set in their ways and rather stubborn about it. Hence my low expectations.

I understand that the new consoles are using an x86 architecture though, which is brilliant in terms of the ability to share games across platforms, but it's not the best architecture to build on. It's power hungry, bloated and not specialised enough for any specific thing. The original Xbox was x86 because that way they didn't need to rewrite anything or spend money on research and design/development before getting into the console market. By moving the 360 away from x86 they improved it; I'm curious to see how x86 handles, but I don't think it's ultimately worth the performance hit.

Will any non-EA-endorsed developers use Mantle, though? It's one thing to use it in one game engine, but there are hundreds, if not more, of different game engines out there being used across a wide variety of genres. It sounds like a publicity stunt to help lower-end rigs play EA games and nothing else; it will be incredibly interesting to see other publishers' and developers' reactions to it.

The Voodoo is probably from before I even had a computer, or internet, though. I've never heard of it.

I agree with it being a hindrance, though. I know someone who's doing a Game Development uni course (I really don't see the point of such a specific course, but hey ho); they're using DX and they absolutely hate it. I don't know if Mantle is the answer, though; it may just be a stepping stone towards it.

But yes, SteamOS will definitely go a long way to improving things. Seeing as it's an OS specifically for gaming, it might just be that people will have to use two OSes to get the most out of their games instead? Who knows?

Re: Upgrading GPU

Post by Miztaziggy »

I don't think adoption will be that much of a problem. As I say, the Mantle API will be the same as the API used to make a console game, so porting games to PC will become simple. When you consider that every game coming out on console will effectively be using this API already, there will be tonnes of games available.

Not sure about the original Xbox, but the Xbox 360 wasn't x86; like the PS3, it was PowerPC, like the old Macs.

Yes, x86 might not be as efficient as the PowerPC RISC architecture, but it's easier to program for.

Not sure about non-EA developers, but a source at AMD did say that they have an exclusivity deal with EA/DICE lasting until the end of October; after that they will announce more partners, and apparently there are lots. I really hope there are, and that this takes off.

Lol, the Voodoo was a beast back in the day. I remember getting one to play the original Tomb Raider when it first came out... 16 MB graphics memory... top of the line :D 3dfx went bust and got bought out by Nvidia in the end.

Image

I remember before that, with 2D graphics cards using system memory under DOS. Your memory could be configured in different ways in the config.sys file, between XMS and EMS. Some games needed lots of EMS, some needed XMS, so you would have to reboot the PC with a new config.sys and autoexec.bat to run different games... lol, ask some of the older developers you work with about it, it was a real pain in the butt.
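For the younger readers: a boot configuration of that era looked roughly like this (paths assume a standard MS-DOS install; this is from memory, so treat it as a sketch):

```
REM HIMEM.SYS provides XMS (extended memory)
DEVICE=C:\DOS\HIMEM.SYS
REM EMM386 with the RAM switch provides EMS;
REM swap RAM for NOEMS for games that wanted XMS only
DEVICE=C:\DOS\EMM386.EXE RAM
DOS=HIGH,UMB
FILES=40
BUFFERS=20
```

A game that wanted EMS got the RAM switch, one that wanted every last byte of conventional memory got NOEMS, and either way you rebooted in between. MS-DOS 6 later added boot menus in config.sys, which eased the pain a bit.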

Re: Upgrading GPU

Post by Virt »

True. I don't know about easier to program, though; if code wants to be a pain in the butt it will find a way, regardless of language, framework or architecture. I know this all too well :( :lol:

I guess the size of the unnamed developers is a contributing factor then; whether they're big like Rockstar or small like Carbon Games will have a large effect.

I remember the original Tomb Raider. That game gave me nightmares for some reason, bloody Lara Croft and those incredibly triangular breasts. Of course I wasn't paying any attention to them at that age! :lol:

I can ask, but I don't think they'll know either; the company I work for specialises in enterprise architecture. I was rather amazed that my head dev actually knew what Grand Theft Auto was the other week :p
Tweety
Posts: 466
Joined: Mon Feb 18, 2013 10:47 am
Location: Skurup, Sweden

Re: Upgrading GPU

Post by Tweety »

Virt wrote:The Voodoo is probably from before I even had a computer, or internet, though. I've never heard of it.
Yes, young one... it was indeed before your time... In fact, it's possible you were born, but I doubt you were using computers...
Image <--- The result of OCMD... I gave up listing the mods in a sig line...

Re: Upgrading GPU

Post by Virt »

Tweety wrote:
Yes, young one... it was indeed before your time... In fact, it's possible you were born, but I doubt you were using computers...
Think we got our first computer when I was 7 or 8. Didn't get internet until about 10, and even then it was shitty dial-up :( I'm so inexperienced :lol:

Re: Upgrading GPU

Post by Tweety »

Virt wrote:
Tweety wrote:
Yes, young one... it was indeed before your time... In fact, it's possible you were born, but I doubt you were using computers...
Think we got our first computer when I was 7 or 8. Didn't get internet until about 10, and even then it was shitty dial-up :( I'm so inexperienced :lol:

Well... 1994 to 2002 was the timespan, actually, so that would mean you were using computers... But I doubt you were buying extra graphics cards...