some nvidia news



Rook
October 2nd, 2009, 03:03 AM
No one posted this??

http://www.bit-tech.net/news/hardware/2009/09/30/huang-reveals-fermi-architecture/1

http://www.nvidia.com/object/fermi_architecture.html

343guiltymc
October 2nd, 2009, 06:49 AM
Nvidia has become the underdogs.

RedBaron
October 2nd, 2009, 09:28 AM
ATI has become the underdogs.
ftfy - and AMD too of course.

Con
October 2nd, 2009, 09:35 AM
but we're quite confident here as Huang said that "we will make real-time ray tracing a reality this year."
:haw:

NullZero
October 2nd, 2009, 10:38 AM
Nvidia has become the underdogs.
ftfy - and AMD too of course.
FTFY

Cojafoji
October 2nd, 2009, 11:22 AM
That tech really doesn't translate into anything that we'd ever see, I mean personally anyway.

legionaire45
October 3rd, 2009, 02:05 AM
I was going to post about this but never got around to it.

In short: Lol (http://www.semiaccurate.com/2009/10/01/nvidia-fakes-fermi-boards-gtc/), good luck guys.

3 billion transistors for 3 teraflops single precision. Yields on this thing are probably going to be horrific, and in DX11 gaming it's at a disadvantage compared to ATI: they're using general-purpose hardware for things that ATI has dedicated hardware on the die for, so Nvidia is going to have difficulty keeping up when DX11 games actually become mainstream at some point in the future.

Nvidia can go on and on about how PhysX and CUDA are changing the world, but they're largely irrelevant right now. There isn't a single killer app for PhysX: nothing that uses all that processing potential for something that affects gameplay or is remotely useful. Besides a couple of one-off scientific apps, the biggest things that have used CUDA are video encoding apps and a few other similarly disappointing things.
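
For context on what CUDA-style GPGPU code actually looks like: the canonical intro example is SAXPY (y = a*x + y), where one tiny kernel function runs once per array element. Here's a rough pure-Python analogue of that data-parallel pattern -- an illustrative sketch only, not actual CUDA:

```python
# SAXPY (a*x + y), the "hello world" of GPGPU programming.
# On a GPU, saxpy_kernel() would run once per element across thousands
# of cores; here we just loop over the indices to show the shape of it.

def saxpy_kernel(i, a, x, y):
    # One "thread": each invocation touches exactly one element.
    y[i] = a * x[i] + y[i]

def saxpy(a, x, y):
    for i in range(len(x)):  # a GPU does this loop in hardware, in parallel
        saxpy_kernel(i, a, x, y)
    return y

x = [1.0, 2.0, 3.0]
y = [10.0, 20.0, 30.0]
saxpy(2.0, x, y)  # y becomes [12.0, 24.0, 36.0]
```

The point of the pattern is that every element is independent, which is exactly the kind of work that scales across hundreds of cores.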

If Nvidia focused on something like OpenCL, then maybe all that special computation hardware on the die would give it an advantage over ATI that mattered in the long term. As of right now, Nvidia is trying to force the creation of a market that is, and will probably always remain, a niche. They're betting this round on their expensive, low-yield card somehow changing everyone's views on GPGPUs. I'm doubtful.

Bhamid
October 3rd, 2009, 06:05 AM
Fermi is designed for GPGPU work, not graphics.

=sw=warlord
October 3rd, 2009, 10:32 AM
Fermi is designed for GPGPU work, not graphics.
Wat.

Bhamid
October 3rd, 2009, 10:36 AM
It means it's designed to do stuff other than just the normal graphics work.

=sw=warlord
October 3rd, 2009, 10:38 AM
It means it's designed to do stuff other than just the normal graphics work.
No, you're just saying normal graphics means games. Video cards aren't just used for games, and haven't been dedicated to games for a long time.

Cortexian
October 3rd, 2009, 11:27 AM
No, he's saying that Fermi is designed to do things that aren't specifically related to graphics (I think).

Phopojijo
October 3rd, 2009, 11:46 AM
Fermi is designed for GPGPU work, not graphics.
Like Larrabee...

Developers like Epic Games have been bitching and moaning about OpenGL and DirectX for a while now...

It's expected in the quite-near future (maybe as early as next console generation, which Epic seems to be hinting at with Unreal Engine 4) that gaming will be pushed back into software-rendering-style engines -- only instead of executing everything on the CPU, your per-pixel calculations would be blasted out to hundreds and eventually (or now, if you SLI/Crossfire) thousands of GPGPU cores.

Think of it this way -- MentalRay on a GPU...

Also if people use the GPU for more generalized applications -- it'll be included into systems by default. No longer will it be the niche of people who just want to run videogames... there'll be purpose for the masses -- oh and you can also play videogames too... ((see what I mean? No longer a "3d card" -- rather a "Massively Parallel Processor" (GPU) and a "General Purpose Unit" (CPU) -- maybe even share memory someday...))
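
The "software renderer on a massively parallel processor" idea boils down to this: every pixel is an independent evaluation of the same function of its coordinates, so the work maps perfectly onto thousands of cores. A toy sketch in plain Python (the shade function is hypothetical, purely for illustration):

```python
# Toy "software shader": every pixel is computed by the same pure function
# of its coordinates, independently of every other pixel -- which is exactly
# why this style of rendering maps onto thousands of GPGPU cores.

def shade(px, py, width, height):
    # Hypothetical per-pixel function: a simple horizontal gradient, 0-255.
    return (px * 255) // max(width - 1, 1)

def render(width, height):
    # On a GPU, every (px, py) would be shaded concurrently;
    # on a CPU we fall back to looping over the pixel grid.
    return [[shade(px, py, width, height) for px in range(width)]
            for py in range(height)]

image = render(4, 2)
# Each row ramps 0..255 left to right: [[0, 85, 170, 255], [0, 85, 170, 255]]
```

Swap `shade` for a ray-traced lighting calculation and you have the engine style Epic is hinting at; the structure of the loop doesn't change, only where it executes.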