
View Full Version : AMD: Kicking Ass and Taking Names (HD5 Series)



legionaire45
September 10th, 2009, 11:11 PM
You'll probably need to take a mortgage out to pay for the entire set up (http://www.techpowerup.com/103552/AMD_Demonstrates_the_PC_s_Next_Act_at_Experience_Events_Worldwide.html), but hot damn...
In short:

ATI's Evergreen GPUs (The HD 5 Series) will support up to 6 "High Resolution" monitors at a given time. (http://venturebeat.com/2009/09/10/amd-introduces-a-graphics-chip-that-can-power-six-computer-displays/)
Support for up to 24 High Resolution monitors when in quad-crossfire. (http://hardocp.com/article/2009/09/09/amd_next_generation_ati_radeon_eyefinity_technology)
Support for resolutions of up to 268 Megapixels (I believe on a single card), which is close to the resolution of a 90-degree arc of human vision.
The new cards are capable of ~2.7 TeraFLOPS single precision (quick math after this list). Compare that to the last-gen HD 4890, which could pull off about 1.36 TeraFLOPS.
~2.1 Billion Transistors on a 40nm process (http://hothardware.com/News/AMD-Eyefinity-MultiDisplay-Technology-In-Action/)
Mobile versions of these GPUs will also support this technology
Other things we know about like DirectX 11 support, etc.
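Quick sanity check on those FLOPS numbers (my arithmetic, using the widely leaked 1600 ALUs at 850 MHz and counting each ALU as one multiply-add, i.e. 2 FLOPs per clock): 1600 x 2 x 0.85 GHz ≈ 2.72 TFLOPS. The HD 4890's 800 ALUs at the same 850 MHz work out to 800 x 2 x 0.85 GHz ≈ 1.36 TFLOPS, so the new chip is roughly double.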


Announced/Leaked SKUs ( 1 (http://www.techpowerup.com/103599/AMD_Cypress__Radeon_HD_5870__Stripped.html) | 2 (http://www.techpowerup.com/103612/Radeon_HD_5870_Eyefinity_Edition_Spotted.html) | 3 (http://www.modacity.net/forums/showpost.php?p=460052&postcount=39) ):

Radeon HD 5870 X2 - $???
Radeon HD 5870 Eyefinity Edition - $???
Radeon HD 5870 2GB - $449
Radeon HD 5870 1GB - $399
Radeon HD 5850 - $299
Radeon HD 5770 - $???
Radeon HD 5750 - $???


Leaked/Notable shots:
Radeon HD5870 X2:
http://www.techpowerup.com/img/09-09-25/50b.jpg
Others:
http://www.techpowerup.com/img/09-09-11/132a.jpg
http://www.techpowerup.com/img/09-09-11/132b.jpg
http://img.techpowerup.org/090909/Capture511.jpg
http://img.techpowerup.org/090909/bta019.jpg
http://pcper.com/images/news/demo05.jpg
http://global.hkepc.com/database/images/20090910131112721609262822.jpg
http://www.pcper.com/images/news/demo01.jpg
http://www.chiphell.com/uploadfile/2009/0728/20090728094608914.png


As more information is released I'll try to keep this thread up to date.

Personally, I wouldn't mind going for 6 projectors and lighting up a gigantic wall with Halo. Something the size of a building. Something you could see from a plane. Grossly overkill for Halo. But I think that would be amusing.


Nvidia's response (http://venturebeat.com/2009/09/10/amd-introduces-a-graphics-chip-that-can-power-six-computer-displays/)?

“The gaming world has moved to dynamic realism, which depicts actual physical movement more realistically than ever before. For example, the No. 1 PC game coming out next week is ‘Batman: Arkham Asylum,’ which takes advantage of graphics plus physics to give it extraordinary realism. Because we support GPU-accelerated physics, our $129 card that’s shipping today is faster than their new RV870 (code name for new AMD chips) that sells for $399.”
Translation: "BAWW."
Besides being highly misleading at best and probably entirely wrong, that statement says nothing about actual processing power. Not only is Nvidia going to be late to the game with their next-gen GPU, but they'll also be doing it WRONG (http://www.theinquirer.net/inquirer/news/1137331/a-look-nvidia-gt300-architecture). With the amount of horsepower on RV870, PhysX isn't going to matter. Max the resolution, AA, AF and everything else on the RV870 and you'll be winning in the long run - try that on the "$129 card", which is either some derivative of the G92 from 2007 or the crappy GT200-based budget card that Nvidia had extreme difficulty in making. And that's before you take into account the fact that the RV870 can handle multiple monitors seamlessly, while Nvidia barely has SLI working with 2 monitors.

Nvidia is really fucked. They aren't going to have anything to respond to this with for a long time. The design they do have is flawed. They've really fucked themselves. For the sake of the market, I hope that Jen-Hsun Huang gets kicked out of the company. He's totally lost it at this point. Nvidia needs to get with the times or they are going to become the next 3dfx, and ATI will have nobody to compete with in the graphics market until Intel finally releases Larrabee.

...Now if only AMD would start treating CPU development the same way they are currently treating their GPU development...

Amit
September 10th, 2009, 11:16 PM
:allears:

With all those monitors...Freelancer may now commence creaming his pants. :realsmug:

Holy shit, omg WTF?

mzGtxlaPQqY

:iamafag:

http://pcper.com/images/news/demo04.jpg

http://pcper.com/images/news/demo05.jpg

Phopojijo
September 10th, 2009, 11:32 PM
1) ATI paper release
2) Look at GPGPU numbers -- ~2x the performance of last generation... which is typical for ANY GPU generation.
3) Charlie at the Inquirer hates nVidia.

flibitijibibo
September 10th, 2009, 11:49 PM
First: What Phopo said.

Second: Uh... I may be a supporter of multiple monitors, but that's taking it way too far, even if it is a tech demo. It's annoying enough to have those spaces between the monitors, but to have a fucking grid in all of your games is obnoxious. That old Alienware supermonitor needs to come out of prototype hell.

Third: So NVidia doesn't have anything to compete right off the bat. Wah wah. I'd rather wait to have a piece of hardware that's not only good, but isn't a rushed out piece of shit that was released just to get a couple extra bucks (http://www.xbox.com/).

Fourth: The last time a card was released for the latest version of DirectX, it worked like shit. Note how all of these demos are running DX9/10 games. I'll even wager that it's not even running DX11 software on the OS. Again, I'd rather wait, because by the time DX11 runs the way it's supposed to (or even gets anything that supports it), NVidia will at least have something to put on the table.

Xetsuei
September 10th, 2009, 11:54 PM
LOL

Look at all the whiners.

And all the incorrectness.

legionaire45
September 11th, 2009, 12:00 AM
1) ATI paper release
2) Look at GPGPU numbers -- ~2x the performance of last generation... which is typical for ANY GPU generation.
3) Charlie at the Inquirer hates nVidia.
1) Cards based off of this tech are being released with Windows 7 - the release date is supposed to be October 23rd. This isn't a release announcement - in fact, I don't think they have even announced the name of it yet. Most people are assuming it will keep the same naming scheme as before.
2) Regardless of whether it was predicted or not, it's still notable. That isn't really the point anyway, since numbers like these don't magically equal performance. From the looks of it, ATI has a damned good part on its hands and drivers to match.
3) Charlie at the Inquirer has good reason to hate Nvidia. Besides the fact that they are deceptive bastards who make an art of screwing customers over to save their own ass, their chip development has been atrocious for the past several generations. Even as far back as the GeForce 5 series they have been deceptive about their products. Nvidia deserves all the criticism it is getting.


First: What Phopo said.

Second: Uh... I may be a supporter of multiple monitors, but that's taking it way too far, even if it is a tech demo. It's annoying enough to have those spaces between the monitors, but to have a fucking grid in all of your games is obnoxious. That old Alienware supermonitor needs to come out of prototype hell.

Third: So NVidia doesn't have anything to compete right off the bat. Wah wah. I'd rather wait to have a piece of hardware that's not only good, but isn't a rushed out piece of shit that was released just to get a couple extra bucks (http://www.xbox.com/).

Fourth: The last time a card was released for the latest version of DirectX, it worked like shit. Note how all of these demos are running DX9/10 games. I'll even wager that it's not even running DX11 software on the OS. Again, I'd rather wait, because by the time DX11 runs the way it's supposed to (or even gets anything that supports it), NVidia will at least have something to put on the table.

2) Note that they mention that they are working with Samsung to produce monitors with thin bezels. Yes, there is still a bezel but seriously, who cares? Unless you set this up in a way that you have a bezel in the middle of your screen, it's not much of an issue, even in first person shooters.

3) ATI has had most of the DirectX 11 feature set implemented since way back when the HD2900 was released. In fact, tessellation, mandatory AA (IIRC, a feature of DX10.1) and most of that other crap were originally part of DX10 before a certain competitor bitched to Microsoft about it because they couldn't get it to work with their design or something. And enjoy your oddball 2880 x 900 pixel resolution on your one-off Alienware curvy monitor.

4) I may be stoned or something, but IIRC, Nvidia released the first DX10 cards, the 8800 GTX and the 8800 GTS. Based on the G80 core, they were not only popular, they forced ATI to completely rethink the way they designed their GPUs. This is why we have had amazing $200-$300 GPUs for the past 2 or 3 generations. Just because it's a first-gen part doesn't mean it will suck. And actually, Nvidia probably won't be able to put much on the table because they are apparently focusing on GPGPU over DirectX 11. Sure, they have a design that theoretically is more general purpose, but general purpose is always going to be at least a little bit slower than dedicated hardware. ATI's architecture also has many general-purpose areas, but these are supported by dedicated hardware for things like tessellation.

What really matters about RV870 is the fact that it is flexible enough to do all of this. How many people are really going to use something that excessive with their computers? I don't see many people spending ~$24,000 on an array of twenty-four 30" LCDs to go along with their $1200 graphics cards.

This thing can pull off 80 FPS in WoW at a resolution of 76xx x 3200. It can play HAWX nicely on three screens using a single card without any messy DirectX hacks or a special box on the back. Imagine what you could do with a single card and a single screen.
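To put that in perspective (assuming the demo resolution was the 7680 x 3200 you'd get from a 3 x 2 wall of 2560 x 1600 panels - the exact figure above is partly cut off, so that part is my guess): that's about 24.6 million pixels, which is six 2560 x 1600 panels' worth, or a bit over ten times a single 1920 x 1200 monitor.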

AAA
September 11th, 2009, 12:32 AM
It's not unnecessary, AT ALL, to say "I love you legionaire".


..........and I do.

Kornman00
September 11th, 2009, 12:47 AM
:woop:

Glad this is coming out this year. That way, when I build my new computer next year, there will either be a wider selection, more stable cards and drivers, or just slightly lower prices.

I really don't understand why PC game developers don't start developing split-screen for the PC. It's a dying habit for the console, but with this, shit, you could do six-player co-op... potentially 24-player! Wow, imagine that, all on a single machine. No latency! At least no latency due to networking anyways.
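If anyone wants a picture of how that would work at the rendering level, it's basically one huge surface carved into viewports, one per player. A rough sketch in plain OpenGL - renderScene() and Player here are hypothetical stand-ins for whatever an engine would actually provide, not code from any real game:

#include <GL/gl.h>

struct Player { /* camera, input, etc. - placeholder */ };
void renderScene(const Player& p) { /* stand-in for the engine's per-camera draw call */ }

// Carve one big Eyefinity surface into a 3x2 grid of per-player viewports.
void drawSplitScreen(Player players[6], int surface_w, int surface_h)
{
    const int cols = 3, rows = 2;
    const int cell_w = surface_w / cols, cell_h = surface_h / rows;
    for (int i = 0; i < cols * rows; ++i) {
        glViewport((i % cols) * cell_w, (i / cols) * cell_h, cell_w, cell_h);
        renderScene(players[i]);
    }
}

Rendering-wise it's about that simple; the real work would be the input handling and game logic for six local players.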

If the next generation of consoles doesn't support some kind of multi-output system, I'd be surprised.

Cojafoji
September 11th, 2009, 12:57 AM
stockpiling of cash commences: now.

flibitijibibo
September 11th, 2009, 01:07 AM
Yeah, the 8800GTX was out first, and people only used it on XP because it was basically broken in Vista for the first year.

I didn't know about the other DX11 implementations though. Maybe this won't be filled with AIDS off the bat.

Hopefully if Alienware brings the monitor back to life, it'll be 3 1920x1200s, or at least 1280x1024.

Cortexian
September 11th, 2009, 01:22 AM
6 monitors isn't as obnoxious as you guys would think. When I was at Fragapalooza there were two people there with six-monitor setups, and one was right beside me. You don't even notice the aiming issue in first person shooters because there are just so many damn monitors that your senses are overwhelmed by awesome. If I were to go ATI multi-monitor I'd need nine monitors though; I'm a fan of having your aimer/scope/sights not split onto multiple screens.

AAA
September 11th, 2009, 02:25 AM
...Yeah,....... I only like one monitor... So Eyefinity isn't exactly special to me....

Xetsuei
September 11th, 2009, 08:43 AM
Yeah, the 8800GTX was out first, and people only used it on XP because it was basically broken in Vista for the first year.

No, we used XP because Vista was a pile of shit.

Kornman00
September 11th, 2009, 08:57 AM
Speak for yourself :realsmug:

Oh wait, you are :nsmug:

Llama Juice
September 11th, 2009, 08:59 AM
reminds me of LrLU-4nP3H4

Phopojijo
September 11th, 2009, 10:11 AM
4) I may be stoned or something, but IIRC, Nvidia released the first DX10 cards, the 8800 GTX and the 8800 GTS. Based on the G80 core, they were not only popular, they forced ATI to completely rethink the way they designed their GPUs. This is why we have had amazing $200-$300 GPUs for the past 2 or 3 generations. Just because it's a first-gen part doesn't mean it will suck. And actually, Nvidia probably won't be able to put much on the table because they are apparently focusing on GPGPU over DirectX 11. Sure, they have a design that theoretically is more general purpose, but general purpose is always going to be at least a little bit slower than dedicated hardware. ATI's architecture also has many general-purpose areas, but these are supported by dedicated hardware for things like tessellation.

Actually, if you listen to Tim Sweeney talk, it seems very much like UnrealEngine4 and the next gen of consoles will be GPGPU-based... much less emphasis on dedicated APIs (because they're tripping over things that Microsoft or Khronos didn't program in, like AA on deferred rendering targets; if they're building a huge general-purpose rendering engine they'd much rather just have direct control over what they're doing).

I mean, APIs will still be there, but... their use will be much more optional than "Well, you can always render on the CPU if you don't like DirectX"

NullZero
September 11th, 2009, 10:28 AM
Oh. my god. Give it to me :O

You guys not happy with 1 screen? :|

InnerGoat
September 11th, 2009, 10:43 AM
Better sell your current video cards while they are still worth something ;D

klange
September 11th, 2009, 10:50 AM
N6Vf8R_gOec
Eyefinity with 24 displays playing X-Plane under Linux.

So many displays... I'm okay with my two.

InnerGoat
September 11th, 2009, 11:02 AM
I like how the screens are not all in sync :mech:

klange
September 11th, 2009, 11:15 AM
I like how the screens are not all in sync :mech:
I'm not seeing that at all.. But if there's any sync issues, it's because the ATI drivers for Linux have terribly broken vertical refresh syncing. Also, I think the video itself is messed up.

legionaire45
September 11th, 2009, 11:21 AM
I'm not seeing that at all.. But if there's any sync issues, it's because the ATI drivers for Linux have terribly broken vertical refresh syncing. Also, I think the video itself is messed up.

Some of the monitors up in the top left would lag a little bit behind where the camera was moving. It's hard to tell whether it's actual lag or whether the screens were out of sync, but they seemed to fix themselves perfectly fine after a second or two.

Wow, ATI has this working on Linux too... :o

InnerGoat
September 11th, 2009, 11:46 AM
I'm not seeing that at all.. But if there's any sync issues, it's because the ATI drivers for Linux have terribly broken vertical refresh syncing. Also, I think the video itself is messed up.
they're running multiple instances of the game which explains one of the quadrants being sluggish
linux supremacy

:)

http://www.widescreengamingforum.com/forum/viewtopic.php?t=16780&start=0

Bhamid
September 11th, 2009, 03:40 PM
Why do people prefer having loads of smaller monitors over having one giant one with an insane resolution?

sdavis117
September 11th, 2009, 03:48 PM
Why do people prefer having loads of smaller monitors over having one giant one with an insane resolution?
Most likely that will be the outcome of this. Companies will see the opportunity to make one big monitor out of tons of smaller ones with no space between them, making it look like one big seamless display. A monitor like that would probably need all of those connections.

InnerGoat
September 11th, 2009, 03:49 PM
30" 2560x1600 is the biggest you can easily get and they cost a lot, thats why. You can get 3-5 24" screens for the price of one 30"

Cojafoji
September 11th, 2009, 03:54 PM
30" 2560x1600 is the biggest you can easily get and they cost a lot, thats why. You can get 3-5 24" screens for the price of one 30"
:eng101:

the mother fucking doctor is in, and he just dropped some premo grade a knowledge on yo black asses.

Xetsuei
September 11th, 2009, 05:13 PM
I like how the screens are not all in sync :mech:

Yeah, that could be a problem.

klange
September 11th, 2009, 05:27 PM
they're running multiple instances of the game which explains one of the quadrants being sluggish
Ah, that makes sense, too. I got the vid link off of Phoronix, and didn't really read the article (there wasn't much of one).

Speaking of seamlessly combining multiple monitors, I was considering cracking open my 22"s to see what the internal bezel spacing is and whether I can reduce the gap between them.

I'm not too interested in new graphics cards - I only have one PCI-e x16 on my motherboard, and I've only owned this rig for a month. I also don't have much in terms of cash, so I don't have much to look forward to.

Dwood
September 11th, 2009, 05:42 PM
As someone stated earlier: saving commences now. I see more potential in a year than I do in the next six months (monitor- and mobo-wise). What I reeeaally want to know is: how well does it play Crysis?

Amit
September 11th, 2009, 06:36 PM
What I reeeaally want to know is: how well does it play Crysis?

Probably good enough. If you really think about it, Crysis is more of a graphics showcase than premium gameplay. I like the game, but it still feels like the gameplay doesn't match the graphics depth.

I'm better off with one monitor; or rather, I'll wait to see what these thin-bezel Samsung monitors look like and cost. Essentially, a 24" screen would be large enough for me to play on.

TheGhost
September 11th, 2009, 06:45 PM
Why don't they use an odd number of monitors so that the center is actually a full monitor? None of the pictures/videos I've seen do that. In the video with the jet, the plane is completely obscured by the boundaries, and I'm sure FPS would be lame too. I would say 7x3 monitors would be ideal.

Amit
September 11th, 2009, 06:48 PM
Why don't they use an odd number of monitors so that the center is actually a full monitor? I would say 7x3 monitors would be ideal.

This. The thing may have the capability for 24 monitors, but that just doesn't work. Maybe it's because the setup would be ultra widescreen, assuming you're talking about 7 columns of 3 monitors. That seems like it takes up a lot of space, but still less than the whole 6x4 thing.
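Rough numbers for the two layouts (assuming 1920x1200 panels purely for the arithmetic, bezels ignored): 7x3 works out to 13440 x 3600, about 48 megapixels at a 3.7:1 aspect ratio, while 6x4 is 11520 x 4800, about 55 megapixels at 2.4:1. So 7x3 really is ultra-widescreen, and 6x4 stays closer to a normal shape.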

Shock120
September 11th, 2009, 06:58 PM
Wow, all these monitors are useless. If it's really the fastest GPU, I would rather get 2x 2560x1600 (with more than 60 fps) and use Extended view.

*This going on my new comp* :neckbeard:

Varmint260
September 11th, 2009, 08:50 PM
Ah... I've never been a big fan of a whole bunch of monitors for one single application. In that case, I'd have to be a rich bastard with multiple projectors focused perfectly onto a wall so that the borders between each projection were barely noticeable.

Amit
September 11th, 2009, 09:32 PM
Ah... I've never been a big fan of a whole bunch of monitors for one single application. In that case, I'd have to be a rich bastard with multiple projectors focused perfectly onto a wall so that the borders between each projection were barely noticeable.

Which wouldn't be logical. If you're projecting a large image onto the wall, you might as well get one good projector that projects high resolutions across a large area.

Phopojijo
September 11th, 2009, 11:35 PM
Depends on how high resolution you could go I guess...

Btw -- I was at the theatre watching District 9 a little while ago -- digital projector -- dear *god* its contrast ratio was *ass*

legionaire45
September 12th, 2009, 08:48 AM
Wow, all these monitors are useless. If it's really the fastest GPU, I would rather get 2x 2560x1600 (with more than 60 fps) and use Extended view.

*This going on my new comp* :neckbeard:
...You can use 2 x 2560 x 1600 displays on this... In fact, you can use up to 6 of them on a single card. If my understanding of the technology is correct, you can use whatever resolution on these that you want, including ones that aren't out yet. So theoretically you could use even higher resolution screens on it.


Which wouldn't be logical. If you're projecting a large image onto the wall, you might as well get one good projector that projects high resolutions across a large area.
One good projector usually costs way more than several "lesser" ones, at least from what I have observed.

bleach
September 12th, 2009, 10:05 AM
Look. There's a new addition (http://vr-zone.com/articles/juniper-xt-is-radeon-hd-5770-and-juniper-le-is-hd-5750/7626.html?doc=7626) to the ATI HD5 series coming roughly around October. The good thing about it is that it is going to be sold at more affordable prices while still packing a decent amount of performance (for sure more powerful than the RV740).

Amit
September 12th, 2009, 11:48 AM
One good projector usually costs way more than several "lesser" ones, at least from what I have observed.

Dude, it's a projector! You wouldn't need "several" of them for a good quality image. Maybe 2 or 4 might do the job though.

Depending on the price of the HD 5850 (assuming that's what it's called), I might go for the HD 5770 as my new video card for COD: MW2. I might wait until after MW2 is out before I buy one, though, just to give benchmarks enough time to be posted.

bleach
September 12th, 2009, 12:48 PM
I believe AMD is aiming for the HD 5750, at about 1.12 TFLOPS, to be slightly behind the HD 4870, and for the HD 5770, at 1.536 TFLOPS, to be more powerful than the HD 4890. That's the goal anyways. I would just wait for benchmarks out early next year. Plus, Intel is coming out with new 32 nm processors too...

343guiltymc
September 12th, 2009, 12:59 PM
Well, it looks like I'm going to have to wait for price drops, because the 5850 is going to cost around $299.

bleach
September 12th, 2009, 01:06 PM
AMD will most likely drop prices when Nvidia comes out with their GeForce 360 and 380. I would wait for benchmarks and a verdict on which is the better bang for the buck between the HD 5800 series and the GF360/380.

Pyong Kawaguchi
September 12th, 2009, 01:47 PM
I think I am just going to buy another HD 4870 and CrossFire them, and just use WARP to compensate for the lack of DX11.
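For context, WARP is Microsoft's software rasterizer for Direct3D, and (if I remember right) on Windows 7 it only goes up to feature level 10_1, so it's more of a compatibility fallback than a way to get DX11 effects - and since it runs on the CPU it's far too slow for gaming anyway. A minimal sketch of how an app would fall back to it, using the standard D3D11 device creation call (nothing here is game-specific):

#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

// Ask for the best feature level available; try real hardware first, then WARP.
HRESULT CreateDeviceWithWarpFallback(ID3D11Device** device,
                                     ID3D11DeviceContext** context)
{
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_10_1, D3D_FEATURE_LEVEL_10_0
    };
    D3D_FEATURE_LEVEL got;

    HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                   wanted, 3, D3D11_SDK_VERSION,
                                   device, &got, context);
    if (FAILED(hr)) {
        // No usable GPU/driver: fall back to WARP (software rasterization).
        hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_WARP, nullptr, 0,
                               wanted, 3, D3D11_SDK_VERSION,
                               device, &got, context);
    }
    return hr;
}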

legionaire45
September 12th, 2009, 04:27 PM
Dude, it's a projector! You wouldn't need "several" of them for a good quality image.
I wasn't referring to image quality, I was referring to image size and resolution.

Most projectors that are "affordable" are 720p. Some are 1080p, but those are usually a bit more.

4K projectors cost quite a bit. (http://gizmodo.com/5051490/sony-brings-their-114000-4k-projector-out-from-hiding) An array of 3 x 2 720p projectors (http://www.newegg.com/Product/Product.aspx?Item=N82E16824248056) would cost quite a bit less - only about $4079.94 + tax and shipping according to Newegg. And I have a feeling the bulbs for those are a lot cheaper than the ones for the $100,000 projector.

That's the point of this tech. There haven't been revolutionary changes in monitor resolution in the past few years because it's becoming difficult to shrink the pixels any smaller on something that is economical to manufacture. The yields of bigger screens are probably horrendous, which is why they aren't selling 30" monitors for $300 or so at this point even though they have been around since 2006 at least, when they were usually in the $2500-$3000 range.

It is much easier to instead have an array of panels. There's less risk in the manufacturing process, you can sell more of them, and you end up with more pixels in the end anyway. In fact, one of the highest-resolution single monitors ever made - some IBM/ViewSonic (http://en.wikipedia.org/wiki/IBM_T220/T221_LCD_monitors) 3840 x 2400 pixel, $18,000 monstrosity - had to be driven as four tiles over multiple inputs, and this was back in 2000/2001. Even the Alienware monitor that everyone seems to be orgasming over uses multiple curved panels to get that huge resolution.

Varmint260
September 13th, 2009, 02:43 AM
Sweet Jesus, I wasn't talking logic, here! I was talking excess! I mean, why would a person stop at one "really nice" projector when we're showing setups with 24 LCD screens?

Also, 3x2 720p projectors? WANT. Not on my current setup, but WANT.

Donut
September 13th, 2009, 04:07 AM
wow look at your electricity bill
brb getting ladder

legionaire45
September 13th, 2009, 12:02 PM
wow look at your electricity bill
brb getting ladder
If you can afford these cards and screens/projectors you can afford the electricity xD.

Donut
September 13th, 2009, 03:01 PM
lol good point

Kornman00
September 13th, 2009, 05:07 PM
Just join the military and have the gov't pay for your bills :hist101:

that was a joke. save yourself, don't join.

Amit
September 24th, 2009, 06:37 PM
PowerColor HD 5800 Series (http://www.powercolor.com/eng/products_layer_3.asp?ModelID=656)
Sapphire HD 5800 Series (http://www.sapphiretech.com/presentation/product/?cid=1&psn=000101&gid=3&sgid=591)

343guiltymc
September 24th, 2009, 07:03 PM
http://www.techpowerup.com/104544/AMD_Juniper_Early_Specs_Surface.html

Pyong Kawaguchi
September 24th, 2009, 07:07 PM
Why does the 5870 have to be so fucking long?
I hope some company makes a shortened version of it :/

legionaire45
September 24th, 2009, 07:09 PM
I'll probably update my first post on Saturday when I'm not up to my chest in homework.

Amit
September 24th, 2009, 08:30 PM
Sounds good, l45.

legionaire45
September 24th, 2009, 11:09 PM
Sounds good, l45.

k brah.

I'd get one of these but I have neither a desktop computer nor any games that I care about that I can't play maxed out already :<. I also don't have 6 monitors, or the money for it. My 8800 GTS back home is getting a little bit long in the tooth...then again, so is the CPU...

I'm just looking for an excuse to give this post some kind of validity.

343guiltymc
September 25th, 2009, 06:57 AM
There is also the 5870X2 now: http://www.techpowerup.com/104586/AMD_Radeon_HD_5870_X2_Pictured.html