Is the CPU dead or alive? Nvidia says a little of both

medicnick83

Paramedic
Joined
Aug 23, 2006
Messages
21,005
Nvidia's hostility toward Intel is on a high boil these days. In its latest dig against the central processing unit (CPU) and the company that makes the lion's share of CPUs, an Nvidia VP said in a private missive that the CPU is dead and it has "run out of steam."

But wait. That's not what Nvidia really thinks. The message cited by The Inquirer is "not a public statement," said Brian Burke, an Nvidia spokesperson. "The views in (Roy Taylor's) e-mail do not mirror the views of Nvidia." (The author of the message, Roy Taylor, is VP of content relations at Nvidia.)

But is the statement that far apart from Nvidia's public sentiment? "You need nothing beyond the most basic CPU," Burke said. Sounds like Nvidia thinks the CPU is, if not terminal, certainly fading.

(The CPU, or central processing unit, is the main processor in a PC. The GPU, or graphics processing unit, handles much of the visual content on a PC.)

This of course is news to Intel, the largest chip company in the world whose main business is making CPUs. "We believe that both a great CPU and great graphics are important in a PC. Any PC purchase--including the capability level of components inside it--is a decision that each user must make based on what they will be doing with that PC," said Intel spokesperson Dan Snyder.

To be sure, Nvidia and Intel have never gotten along famously. But the acrimony (mostly Nvidia's) started to build at Nvidia's fourth-quarter conference call and carried over to the company's financial analyst day earlier this month, when CEO and co-founder Jen-Hsun Huang, alluding to comments from game developer Tim Sweeney, said "Intel is incapable of running modern games. Intel's integrated graphics just don't work."

But the crux of Nvidia's marketing message, vis-a-vis Intel, is focused on the graphics chip maker's perceived limitations of the CPU. In short, buy a high-end GPU, not a high-end CPU, and save money. During the earnings conference call, Huang cited the Gateway P series notebook as an example. One model has an Intel 1.6 GHz processor and a GeForce 8800 GPU. He said systems like this with a "higher-end GPU" and "lower-end CPU" are better optimized for today's users. "Relative to a notebook with a higher-end CPU and lower-end GPU, the Gateway FX is twice the performance and yet $200 lower cost."
 

Gnome

Executive Member
Joined
Sep 19, 2005
Messages
7,210
Personally I've found, in line with Intel's comments, that a balance is best. nVidia is getting a bit arrogant these days.
 

dude#73

Well-Known Member
Joined
Apr 21, 2008
Messages
109
Well, if you need just a basic CPU (as quoted by the Nvidia dude), then why on earth do games like Crysis not work on certain "basic" CPUs?
I'm no Intel fanboy, but on the whole I'd have to agree, a balance of both is definitely required.
AMD + ATI = CPUs, GPUs & mobos
Nvidia = mobos + GPUs (and work on their very own CPUs has been confirmed, since they are pissed that their proposal to merge with Intel was turned down)
Intel = CPUs + mobos (GPUs are rumoured to be in the pipeline in order to thwart the upcoming onslaught of AMD/ATI)
1) Intel is king and remains unaffected by the ATI/AMD merger.
2) AMD/ATI - both companies were losers against their respective competition, so the merger was an excellent move.
3) Nvidia is rumoured to have proposed some sort of collaboration to Intel (however, since Intel has close to unlimited funds, they decided to begin work on their own range of GPUs in order to be on par with AMD/ATI and Nvidia, and turned down the proposal). So Nvidia is pissed, since the odds are against them from both parties. To counter-attack, they have announced that they have begun development on 45nm CPUs. Now that the playing field is even, this looks set to become the most competitive era in PC hardware history...

just my 2c
 

ShockG

Expert Member
Joined
Mar 4, 2006
Messages
1,422
1. Don't ever trust anything that has its sources from The Inq. Ever. They have had an axe to grind against MS for the longest time, NVIDIA as well.... The whole thing is reported out of context as well.

NVIDIA's CEO is right in that a stronger CPU isn't nearly as useful as a strong VGA card. He's right that the GPU has become increasingly important, if not more so than the CPU. However, he did not go on to quote Tim Sweeney's comments about SLI and Triple SLI, which were just as important.

It's true now more than ever: for consumers, a strong CPU is next to useless really... I speak from having owned every single high-end desktop CPU released in the last 3 years. It's all just unnecessary. 3 machines, two with 9650 quad-core CPUs. One is off 90% of the time; the other runs at 2.33GHz at 0.9V, each with a 3870X2 at the least.... Played through Crysis, GOW, UT3 etc... and have yet to say I miss the 3GHz clock. I did, though, feel the pinch when I used a single 3870 graphics card...

In two or three years NVIDIA could possibly have a desktop x86 chip that may match the speed of a current Intel quad core (yes, Intel will be far ahead by then), and that paired with whatever graphics card they have then will likely be enough...

You can't sell CPU performance anymore, only functionality, power savings, value etc... The days of the speed king are dead, and only remain for the enthusiasts worried about 3DMark scores.
 

Glordit

Expert Member
Joined
May 3, 2007
Messages
2,332
ShockG said: "You can't sell CPU performance anymore, only functionality, power savings, value etc... The days of the speed king are dead, and only remain for the enthusiasts worried about 3DMark scores."

That gets me thinking about the days of the Pentium 4, when pushing for speed [3GHz] was the way to go.
Then came dual core, and now even a 1.6GHz Pentium E has more processing power!

As for GPUs vs CPUs, it's a case of fancy graphics vs everything else. Sup Com was pretty-looking, but without a very strong CPU it would slow down to a 4fps slideshow even on the lowest settings. Then again, Crysis can run on a Pentium E2140 when you pop in an 8800GTS, and the game runs pretty well.

It's down to the developers, I guess: do they want to make graphically intense games or smart ones?
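The CPU-bound vs GPU-bound split described here can be sketched with a toy bottleneck model: a frame can't finish faster than its slowest stage, so frame time is roughly max(cpu_ms, gpu_ms). All the millisecond numbers below are made up purely for illustration, not measured from the games mentioned.

```python
# Toy bottleneck model: per-frame time is dominated by the slower of the
# CPU stage (simulation, AI, draw calls) and the GPU stage (rendering).
# All timings are hypothetical, chosen only to illustrate the reasoning.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Approximate frames per second when CPU and GPU work overlap."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# An RTS with thousands of units: CPU-heavy, so a faster GPU barely helps.
rts_slow_gpu = fps(cpu_ms=50.0, gpu_ms=10.0)   # 20 fps, CPU-bound
rts_fast_gpu = fps(cpu_ms=50.0, gpu_ms=5.0)    # still 20 fps

# A shooter with heavy shaders: GPU-heavy, so a faster GPU helps a lot.
shooter_slow_gpu = fps(cpu_ms=8.0, gpu_ms=25.0)   # 40 fps, GPU-bound
shooter_fast_gpu = fps(cpu_ms=8.0, gpu_ms=12.5)   # 80 fps
```

Which component is "worth" upgrading depends entirely on which stage is the bottleneck for the games you play.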
 

ShockG

Expert Member
Joined
Mar 4, 2006
Messages
1,422
Thing is, there aren't enough games on the PC to drive development of new CPUs, and for the ones that are there, current CPUs are mostly overkill. I would dare anyone to say an upgrade from an E6600 to an E8500 made their game experience far more enjoyable.
 

Gnome

Executive Member
Joined
Sep 19, 2005
Messages
7,210
Strange how when a CPU becomes more powerful, FPS scores seem to increase, perhaps not by much, but the same can be said of top-end GPUs.

Not to mention that many computers don't even need anything more than the most basic GPU. Games and 3D applications are a minority in comparison to the rest of the computing world, so perhaps in the eyes of a gamer the GPU is more important, but in a work environment it means nothing (unless they do 3D work, which most places don't).

@Glordit: CPU clock speeds in GHz didn't stop climbing in favour of multi-core CPUs because anyone decided that raw single-CPU speed no longer matters; it's because they hit a brick wall in terms of attainable CPU frequency.
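One rough way to see why extra cores aren't a free replacement for clock speed is Amdahl's law: if only a fraction p of a workload can run in parallel, n cores give a speedup of 1/((1-p) + p/n). This sketch is illustrative only; the fraction chosen for a game is an assumption, not a measurement.

```python
# Amdahl's law: the speedup from n cores when only a fraction p of the
# workload is parallelizable. A clock-speed bump speeds up everything
# uniformly; extra cores only speed up the parallel fraction.

def amdahl_speedup(p: float, n: int) -> float:
    """Theoretical speedup on n cores with parallel fraction p (0..1)."""
    return 1.0 / ((1.0 - p) + p / n)

# Hypothetical game engine where 60% of the frame work parallelizes:
# a second core gives ~1.43x, nowhere near the 2x a doubled clock would.
dual_core_gain = amdahl_speedup(p=0.6, n=2)
```

That gap is part of why the move to multicore felt like a step sideways for games of this era, which were still largely single-threaded.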
 

ShockG

Expert Member
Joined
Mar 4, 2006
Messages
1,422
I think for people who are not concerned with games and encoding, CPUs are even less important. No doubt you get very nice frame rates when using a 9775 and a high-end GPU, but the thing is you're comparing 105fps to 72fps on maybe a dual-core E6600. That is huge on paper, but it's hardly going to make any difference to gameplay. It's more than fast enough....
For office users etc... I'd be hard pressed to tell people to move from an AthlonXP/Pentium4B-based machine to anything we have today :/
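The diminishing-returns point is easier to see in frame times than in frame rates, using the same 105fps vs 72fps numbers:

```python
# Convert the fps comparison into per-frame milliseconds.
# 105 fps vs 72 fps sounds like a big gap, but per frame it is only a
# few milliseconds, and both are comfortably smooth.

def frame_time_ms(frames_per_second: float) -> float:
    """Milliseconds spent on each frame at a given frame rate."""
    return 1000.0 / frames_per_second

gap_ms = frame_time_ms(72) - frame_time_ms(105)  # roughly 4.4 ms per frame
```

A ~4 ms difference per frame is well below what most players can perceive once both rates are past a display's refresh rate.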
 

medicnick83

Paramedic
Joined
Aug 23, 2006
Messages
21,005
Oh dude, trust me, you can tell when you work with applications like Corel Draw and Photoshop etc.

There you need lots of fast memory and fast CPUs. :D

I work with a POS at work; it crashes so often, and I've requested a new PC, but they just keep refusing to give it to me. So what is the next best option?
BUY MY OWN PC! :D
 

Gnome

Executive Member
Joined
Sep 19, 2005
Messages
7,210
ShockG said: "but the thing is you're comparing 105fps to 72fps on maybe a dual-core E6600. That is huge but it's hardly going to make any difference to gameplay. It's more than fast enough...."

True, but then why buy a GeForce 8800GTX vs. a GeForce 8800GT/GTS? Same argument really (or would you buy a GF8800GTX and an Intel dual-core E2180?)

Which is why I say balance is important; the CPU isn't going to be dead for a long time. Computers just aren't ever fast enough, and too many things depend on CPUs for them to be considered unimportant or not "worthy" of an upgrade.

And the fact that Intel can still turn a huge profit on CPUs is proof of that. nVidia's statements might have been "taken out of context", but all companies make stupid statements sometimes, and I think this is one of those times.
 