Graphics cards have about 2 years left

Don't rule out why a company would be doing this. Having an oar in both sectors (CPU and GPU) puts them in a position to do better than the competition in this area, whether it's the best thing technically or not. If they can beat the competition and make money, they'll do it. Chip manufacturers are constantly ranked, tested, rated and scrutinised; if going down this route gives them an edge over the others, they will take it. The market will decide whether to buy into it or not. In essence, is it gonna be a new format war?
It's possible but unlikely. The move to on-chip GPUs for the low end is definitely happening: it's cheaper than a chipset/motherboard IGP option, takes up less room and will be more efficient for mobile usage too. The problem is how to scale it up; low-end GPUs and high-end ones are a light year apart.

A mobile GPU has somewhere north of 30 million transistors, which is bugger all. Sticking that on a mobile CPU would be child's play. Of course, that's a crap mobile GPU; decent ones are about 75 million (i.e. light gaming), but for heavy gaming you're hitting a billion. The former you can include; the latter are bigger than the CPU itself (about 300 million transistors) by a large margin.

Let's face it, most desktops run Office and Internet Explorer; a mobile GPU is more than enough for them, and in those markets it makes perfect sense, just like IGP did. For gaming it makes none: any possible advantage is shat all over by the engineering problems. You might get some intermediate-level chips, or an SLI-style approach where some of the GPU load is taken by sections on the CPU, but not wholesale replacement.
 
Interesting discussion, I can see the graphics card developing for a long time to come. Given a beefy power supply and enough slots, it seems they can transform a mediocre box into a pretty high-end piece of kit.

Not my forte by any means, but I've had to look at this stuff lately for my kid who wants a decent gaming, uhhh, rig, I believe it's called. Gah!

Anyway, found this ...
NVIDIA CUDA™ technology is the world’s only C language environment that enables programmers and developers to write software to solve complex computational problems in a fraction of the time by tapping into the many-core parallel processing power of GPUs. With millions of CUDA-capable GPUs already deployed, thousands of software programmers are already using the free CUDA software tools to accelerate applications—from video and audio encoding to oil and gas exploration, product design, medical imaging, and scientific research.
 
Yup, GPUs have been used as cheap processors for a while, and it's becoming easier and more common now. The huge number of FPUs they have makes them ideal for massively parallel tasks.
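
For anyone wondering what that CUDA C stuff actually looks like in practice, here's a minimal sketch: the classic vector-add toy, with a million additions spread across thousands of GPU threads. (The kernel name vecAdd is just made up for illustration; the cudaMalloc/cudaMemcpy calls and the <<<blocks, threads>>> launch syntax are the genuine CUDA runtime API.)

#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Kernel: each GPU thread adds exactly one pair of elements.
__global__ void vecAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                  // a million elements
    const size_t bytes = n * sizeof(float);

    // Host-side arrays, filled with test data
    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device-side copies: everything has to cross the PCIe bus first
    float *da, *db, *dc;
    cudaMalloc(&da, bytes);
    cudaMalloc(&db, bytes);
    cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(da, db, dc, n);

    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f\n", hc[0]);           // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}

The catch is those cudaMemcpy calls: the data has to trundle over the bus before the GPU can touch it, which is exactly the shunting-about the CPU-integrated designs discussed further down aim to avoid.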
 
Yes, this stuff about using the GPU for other purposes is what I was on about when I mentioned OpenCL, a standard that Apple pushed for so they could use it in Snow Leopard without limiting themselves to proprietary standards that vary by GPU manufacturer.

Providing developers start using it, it's going to do some awesome things. Even as a non-coder, I should be able to harness it via Quartz Composer in Snow Leopard for some very interesting realtime things.
 
The age of the discrete graphics card is coming to a close, it appears.

[Attached image: AMD.png, an AMD roadmap slide]


If you look at that AMD slide, they are going to include the graphics chip directly in the CPU by 2011. With the advent of multiple gigabytes of DDR3 on the motherboard and the CPU controlling the memory, it can use system RAM and not have to suffer loads of stuff being shunted between the GPU's and the CPU's memory.
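
Funnily enough, discrete CUDA cards already offer a small taste of this with what NVIDIA calls zero-copy (mapped pinned) memory: the GPU reads and writes host RAM directly instead of keeping its own copy, it just does so slowly over the bus. A fused CPU/GPU sharing the memory controller would make that the normal, fast case. A minimal sketch, assuming nothing beyond the standard CUDA runtime (the kernel name doubleAll is made up; cudaHostAlloc, cudaHostAllocMapped and cudaHostGetDevicePointer are the real API):

#include <cstdio>
#include <cuda_runtime.h>

// Doubles each element in place; the "device" pointer is really an alias of host RAM.
__global__ void doubleAll(float *x, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) x[i] *= 2.0f;
}

int main() {
    const int n = 1 << 20;

    // Must be set before the CUDA context is created, so do it first
    cudaSetDeviceFlags(cudaDeviceMapHost);

    // Pinned host memory the GPU can address directly -- no cudaMemcpy anywhere
    float *host;
    cudaHostAlloc(&host, n * sizeof(float), cudaHostAllocMapped);
    for (int i = 0; i < n; ++i) host[i] = 1.0f;

    // GPU-side alias of the very same system RAM
    float *dev;
    cudaHostGetDevicePointer(&dev, host, 0);

    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    doubleAll<<<blocks, threads>>>(dev, n);
    cudaDeviceSynchronize();                 // wait for the GPU before reading

    printf("host[0] = %f\n", host[0]);       // expect 2.0, written by the GPU
    cudaFreeHost(host);
    return 0;
}

On a discrete card every one of those reads and writes still crawls across PCIe, so it's only a win for data you touch once. Put the GPU on the CPU die with the DDR3 hanging off the same memory controller and that penalty largely disappears.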

Apart from AMD fucking it up, this is the only reason I can see for their acquisition of ATi: so they can blitz Intel, who have never produced a graphics chip anyone actually wanted. But you can't write them off until they fail.

I look forward to the new level of realism.

GPUs will always exist, for one simple reason: power applications, i.e. gaming and CAD.


Take my 4870 graphics card: it has three times the processing power of Deep Blue. Within 12 months the next generation of GPUs will be out, with too much heat to contain and too many transistors to squeeze onto a CPU.


For economy graphics systems, sure; the death of the motherboard-based GPU and economy GPUs can only be a good thing. But the mid-range and high-end standalone cards will remain for some decades to come.
 
The slide has a lower title of 'Notebook Client Solutions' and you've been barking up the wrong tree.
 