
NVIDIA And AMD Vs. Intel: A New Graphics War Emerges That Should Have Been Avoided

By Rob Enderle - Source: Tom's Guide US | 23 comments

 

Analyst Opinion - I'm not a big fan of public wars between companies. They tend to focus a firm on its competitor rather than its customers, and these fights also tend to make people do stupid things and miss big opportunities. For instance, if Apple had been focused on going to war with Microsoft at the end of the '90s, it likely would have missed the iPod, which actually saved the company. Taking the war with Microsoft off the table potentially opened the firm to an incredibly lucrative additional opportunity.

Last week NVIDIA's CEO made clear what many of us had known for some time: there is a war brewing between Intel and NVIDIA, and it is one that Intel actually started by creating Larrabee. Many had thought the more likely war would be between AMD and NVIDIA once AMD bought ATI, but that sure isn't what we are seeing now. Apparently, AMD and NVIDIA are getting along relatively well right now, and both look at Intel as their major problem. In effect, Larrabee is driving NVIDIA and AMD back together as partners.

I believe these folks should all be focused on the big problem of relatively boring products and customer disinterest, which is pulling down the entire segment. With market and economic outlooks trending negative, I'd be trying to get everyone together to address these problems rather than focusing them on attacking each other.

Having said that, let's look at how these companies stack up against each other.

The warriors

Intel is viewed as a platform owner while NVIDIA is not. While AMD clearly can come close to matching Intel in terms of total product coverage, they don’t have the market share that would let them own any of the related platforms. We call the PC platform Wintel for a reason, but that doesn’t mean Intel is invulnerable.

Like any entrenched and dominant player, Intel is hard to beat on its home turf. Intel actually is a major player in graphics even though, for the most part, Intel graphics haven't been that great (and that is an understatement). In fact, the class action litigation that Microsoft is experiencing right now probably wouldn't exist if Intel graphics had been stronger.

NVIDIA knows graphics: it is the dominant player in this segment of the PC market. The graphics market cycles faster than the processor market, and NVIDIA has proven to be a capable and powerful competitor in its segment. Currently you get a better performance bump for your PC by buying a more powerful graphics card than you do by buying a faster processor, and this has worked to NVIDIA's advantage.

With initiatives like CUDA, focused on nontraditional PC computing areas, NVIDIA has also showcased an aptitude for spotting related opportunities, which may become vastly more valuable in the future. NVIDIA maintains massive support in the PC gaming community and is currently the leader, particularly at the high end, in both the PC gaming and professional workstation graphics segments.
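To make the CUDA point concrete, here is a minimal, hypothetical sketch of the kind of general-purpose (non-graphics) computation CUDA is designed for: a simple SAXPY operation spread across thousands of GPU threads. The kernel, array sizes, and launch parameters are illustrative assumptions only, not anything NVIDIA has published for a specific product.

// Hypothetical CUDA C sketch: SAXPY (y = a*x + y) computed on the GPU,
// illustrating GPGPU-style work rather than any specific NVIDIA product plan.
#include <cuda_runtime.h>
#include <cstdio>
#include <cstdlib>

__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per array element
    if (i < n) {
        y[i] = a * x[i] + y[i];
    }
}

int main() {
    const int n = 1 << 20;                  // about a million elements
    const size_t bytes = n * sizeof(float);

    // Host-side input data
    float *hx = (float *)malloc(bytes);
    float *hy = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Device-side buffers
    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

    printf("y[0] = %f\n", hy[0]);           // expect 4.0

    cudaFree(dx); cudaFree(dy);
    free(hx); free(hy);
    return 0;
}

The point of the sketch is simply that the same silicon that renders games can, through CUDA, be aimed at more general data-parallel workloads, which is the "nontraditional" opportunity referenced above.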

AMD is a blend of Intel and NVIDIA. While not as strong in the processor or graphics segment as its respective competitors, it has a processor, which NVIDIA lacks, and vastly stronger graphics technology than Intel's. But it is running second in both markets and is currently trying to pull these technologies together to make something new and different. Strangely enough, AMD seems to be going out of its way to avoid a war with NVIDIA but is locked into an ongoing bloody war with Intel.
AMD is in transition and, if it can complete that transition, could emerge as the most aggressive in terms of total performance at any given price. And while this transition isn't easy, it shouldn't force the kind of major change in how games are written that Intel's path is likely to require. In short, it doesn't necessarily lock out NVIDIA, and it is the lockout implied by Intel's Larrabee that appears to have started this latest war.


Handicapping the fight

NVIDIA is well within its comfort zone, but the overall drag on computer gaming and Vista adoption is hitting the company hard these days. Still, it is taking the least risk and has the most to gain in the short term if the conditions weighing on gaming improve, or when there is a general increase in the need for graphics performance. Intel poses little short-term risk to NVIDIA. In the long term, Intel has a significant positional advantage, if it can change the game.

The CPU builder is attempting to change the graphics game significantly and turn the market in a dramatically different direction. If it can pull this off, the result would be devastating to both NVIDIA and AMD, each of which is on a different path. But Intel has a bad track record with graphics, Microsoft support hasn't yet emerged (and may not emerge), and the result isn't due until 2010. In addition, Intel has yet to demonstrate the level of competence in this area needed to take out firms that have dominated the performance end of this market for years.

AMD has accepted a major risk in trying to find a way to create a more cost-effective, high-performance platform. But current performance is hampering AMD's ability to generate the resources needed to make the moves the company must make in a timely fashion. It may be able to co-exist with NVIDIA, and the two of them together could possibly block Intel from taking the market away from the current graphics architecture.

It will all come down to whether Intel can do what it has never been able to do in the past: create a truly competitive, fast-moving graphics architecture, attract critical game developers to it, and win Microsoft's support. It will also come down to whether NVIDIA and AMD can cooperate enough to block Intel's efforts.

Wrapping Up: There is no purpose in a war

In a market largely suffering from malaise and paper-thin margins, I personally think that a war, regardless of who started it, is foolish. The real enemies of better margins and revenues are unexciting products that people aren't buying. Fixing those problems would make all of these companies more successful and more profitable, and their customers would be much happier.

Having said that, in a graphics war the combination of NVIDIA and AMD/ATI will be very difficult to overcome even with Microsoft's support. Microsoft has not given that support yet (and has historically not been fond of Intel's efforts to dramatically change existing markets), so Intel's success is a long shot.

Larrabee, to me, suddenly appears to be a bridge too far.

Rob Enderle is one of the last Inquiry Analysts. Inquiry Analysts are paid to stay up to date on current events and identify trends and either explain the trends or make suggestions, tactical and strategic, on how to best take advantage of them. Currently he provides his services to most of the major technology and media companies.

Discuss
This thread is closed for comments
    Anonymous , April 18, 2008 8:45 PM
    I have to disagree. The current gaming market is getting stale and falls into the same category of "relatively boring products and customer disinterest" because the gaming community is very narrow, and if you look at the latest releases of graphics cards you will notice the trend: Yay... Faster GPU... Faster memory... 128 Stream Processors...
    Intel is offering a truly new way to bring gaming, physics, GPGPUs, multicore gaming, etc. all together in a whole new way. I am personally bored of the last, current, and new generations of gaming. Intel's idea will advance the gaming world in GFX details, speed, and effects beyond what the worthless DX10(.1) has to offer. NVIDIA and ATI need this shake-up to reinvigorate the aging technology.
    As with computing, gaming should be moving to threaded applications and games instead of just faster GPUs. If you took a current multi-threaded application and ran it on a single-core CPU, then moved it to a quad core or more, you would begin to multiply the speed of the application.
    I think if NVIDIA or ATI had a true multicore GPU (Not the generic x2 or SLI/Crossfire platforms) as Intel is planning and threading on that level instead of the CPU level, the performance will go up 2x and more. Throw Physics processing on one GPU core, sound on another (Processes before the sound card of course), rendering on another, vectors, and so on.
    In the future there will be no distinction between a CPU and GPU and a "processor" will perform all these functions on a singular level. Parallel processing is the way of the future.
    greenmachineiijh , April 18, 2008 8:47 PM
    I have to disagree. The current gaming market is getting stale and falls into the same category of "relatively boring products and customer disinterest" because the gaming community is very narrow, and if you look at the latest releases of graphics cards you will notice the trend: Yay... Faster GPU... Faster memory... 128 Stream Processors...
    Intel is offering a truly new way to bring gaming, physics, GPGPUs, multicore gaming, etc. all together in a whole new way. I am personally bored of the last, current, and new generations of gaming. Intel's idea will advance the gaming world in GFX details, speed, and effects beyond what the worthless DX10(.1) has to offer. NVIDIA and ATI need this shake-up to reinvigorate the aging technology.
    As with computing, gaming should be moving to threaded applications and games instead of just faster GPUs. If you took a current multi-threaded application and ran it on a single-core CPU, then moved it to a quad core or more, you would begin to multiply the speed of the application.
    I think if NVIDIA or ATI had a true multicore GPU (Not the generic x2 or SLI/Crossfire platforms) as Intel is planning and threading on that level instead of the CPU level, the performance will go up 2x and more. Throw Physics processing on one GPU core, sound on another (Processes before the sound card of course), rendering on another, vectors, and so on.
    In the future there will be no distinction between a CPU and GPU and a "processor" will perform all these functions on a singular level. Parallel processing is the way of the future.
    Anonymous , April 18, 2008 9:13 PM
    Also take an economics course if you are writing about economics and speculating on a market. This is good for everyone.
    Anonymous , April 18, 2008 9:14 PM
    The competition will make prices cheaper and products better at the same time. This is how we win.
    Slobogob , April 18, 2008 9:29 PM
    @JayDog/greenmachineiijh
    GPUs are quite parallel - more so than any current CPU now or coming up for the next few years.
    Anonymous , April 18, 2008 11:22 PM
    Wow...

    First off, with Nvidia purchasing Ageia, that's what's going on, JayDogg. Take a look at current happenings in the market first before you make a comment. These X2 and Crossfire/SLI solutions are just the first step in creating a true multi-core GPU. Although I will be honest, the progression is slow, especially for AMD.

    Secondly, these "wars" are a good thing, but only when they're not in a stalemate. Having a proper "war" between these graphics giants means that they are constantly trying to outdo each other. Take a look at AMD vs. Intel a few years ago (yeah, I know it's not GPU, but the point is the same). AMD tried to do something completely different and succeeded (Athlon), then Intel did something different again, and again had a big success (Core 2 Duo). The problem is that AMD is suffering too much to push the envelope. With this calm, Intel and Nvidia can just keep doing the same old thing (come on, what's the real difference between the 8800 and 9600?). This comfort that Intel and Nvidia have right now is what's halting progress.

    Although, having Intel step in there could be a good thing and a bad thing. The good is that Intel has the funds to do just about anything. With another push to a new technology (hopefully better), AMD and Nvidia should respond in kind. The bad news is that AMD may not be able to keep up. Even if they go through the merger (or acquisition or whatever it is), if Intel's solution is a big hit, AMD could just fold. I don't see there being that big of a financial pressure on Nvidia; they have a much larger bankroll than AMD. As an AMD (ATI) fanboy (I like the CCC interface and hardware concepts much more on the AMD side of the fence) I don't want to see AMD go bankrupt.

    There is the chance, though, that Intel will not target AMD at all. If AMD falls, Intel will be the only real desktop processor manufacturer. This would cause big problems for the company, something that I'm sure they are aware of. (I don't include Nvidia as a processor manufacturer, because my understanding is that Nvidia's step into the processor market is just another way to off-load the processor from a few calculations the graphics card can do. I could be wrong on that though). Having AMD out of the picture could be a cause for a monopoly investigation in the processor market. No business wants to go through an ordeal like that.
    mbulut , April 19, 2008 4:25 AM
    Using general purpose architectures it is difficult to compete with dedicated structures. Integrating the dedicated chips (sound, graphics, physics, ...) into a single chip sounds like the best way to go. The human brain is a good example of such a structure.
    mbulut , April 19, 2008 4:36 AM
    Using general purpose processors it is difficult to compete with dedicated ones if performance is the number one parameter. Integrating dedicated chips (sound, graphics, physics ...) into a single chip sounds like the best way to go. Human brain is a good example of such an integrated structure.
    Anonymous , April 19, 2008 8:47 AM
    In my opinion the war will benefit customers, because the war isn't only a market competition but also a technical competition. Microsoft beat Apple mainly through the market, not through technology, but a video card is not the same as an OS: it depends entirely on its technology, the look of its gaming, and the 3DMark results it gets. In that sense, if Intel does better than AMD and NVIDIA, I think AMD and NVIDIA will do better in the next season, or week, or day, hour, minute, second...
    both in their prices and their technology...
    Thanks in advance. I am not good at written English.
    Anonymous , April 19, 2008 11:11 AM
    Competition is indeed good for consumers. However, the markets we are discussing are oligopolies with a tendency towards monopolies. For economists this is far from perfect competition. The highly specialized nature of these markets creates these oligopolies, which aren't known to exhibit great competitiveness; usually collusion is more likely. In that sense the current competition isn't really a big "force". Despite this, it could, and probably would, be pretty bad if Intel managed to capture a large market share in both markets (read: CPU and GPU).

    The extent to which such a (semi) monopoly is bad for consumers depends on the driving forces in these markets. If software developers push hardware requirements, then hardware manufacturers have to keep up with demand. If they don't, a competitor will enter the market and supply the good (CPUs/GPUs). At least the presence of such a threat will reduce the exploitability of the market. If software developers instead follow hardware developments, i.e. use whatever hardware is available, then a hardware monopoly is much more exploitable. Hardware manufacturers could "milk" certain technologies endlessly.

    In any event, I think the demand for better hardware is very limited, which hampers progress. The number of gamers who really demand better graphics and act like it (e.g. buy high-end graphics cards/CPUs) is limited. Only some of the best-selling games have high-end graphics, and it is questionable to what extent those graphics influence sales. I think it boils down to gameplay, i.e. software development is more important. I mean, consoles have OK but not great graphics, and they capture a large part of the gaming market. With respect to the PC, two major titles come to mind: WoW and COD4, both big sellers and both with "ancient" graphics (please don't start on COD4 graphics; they are worse than HL2, period). In any case, it will be interesting to see how matters unfold.
    Heyyou27 , April 19, 2008 3:46 PM
    @ Slobogob
    I'm still confused how people posting on a tech site know so little about a GPU vs. a CPU.
    korsen , April 19, 2008 10:11 PM
    Hellooooooooooooooooooooo. Programming anyone? Games need to be coded for a very general hardware spectrum. Drivers need to be polished for every game, and every update to every game. There's a lot of work going on there.

    Even though, of course, it would be the easiest thing on the planet to throw on 256 ROPs, 1024 unified shaders, etc etc and just crank everything up 10x. But who wants to do that? That doesn't make money, and it forces people to innovate. Nvidia and AMD will stick to incremental increases to milk the sector for what it's worth.

    Screw that. If I owned one of those companies I'd throw my nuts at the wall and just create this massive card that obliterates my competition and just wait and watch while it takes them 5-10 years to design, code, and push out something that comes close because they're all running around like headless chickens.

    I think AMD is never getting the CPU crown back, ever, and unless ATI finds out where it placed its head when it merged with AMD, they're never getting the graphics crown back either. Statistically, ATI should be crushing Nvidia, but they're lagging 30% behind despite having 30% more hardware. No idea why they didn't take advantage when Nvidia screwed up their drivers for such a long time.
    campdude , April 20, 2008 1:31 AM
    I'm not buying a video card till that Larrabee video card comes out with benchmarks and hits the shelves. I don't think I am the only consumer holding their money waiting to see how this Intel thing plays out.
    If anything, AMD and Nvidia may see fewer sales up until the release of Larrabee (depending on the success Larrabee has).
    Consumers will wait to have the luxury of three video card company choices.
    Rob Enderle might change his opinion if Larrabee turns out to be the best card on the market, and might even buy one himself.
    animehair , April 20, 2008 9:58 AM
    I agree with some of the posts regarding how this will only be a very good thing for the consumers in the end. I think we will see some much-needed technological advances due to this new competition. Personally, the whole SLI/Crossfire technology went over my head a bit. I mean, expecting people to buy two graphics cards always seemed silly to me. It really turns me off to computer gaming... I would much rather buy a PS3 or Xbox 360 console because I feel as though I am being taken advantage of in the wallet with computer gaming.

    However, I do like that Nvidia is coming out with dual-GPU cards... and I think that dual/quad-core GPUs make more sense than SLI/Crossfire setups.
    Anonymous , April 20, 2008 2:17 PM
    @campdude:

    I doubt Larrabee will be much more than a better-performing version of current IGPs like AMD's 780G. At this time it doesn't sound feasible to actually make it that powerful. Think about it: is it easy to jam an 8800 GTX (just an example) into a CPU? I think not.
    Anonymous , April 20, 2008 4:35 PM
    Hmmmm..... I can't help but think that greenmachine and jaydog are paid Intel shills; it's not possible to double post by accident with different screen names, LOL. They probably both copied and pasted the same prepared post 2 minutes apart.

    Besides, that comment about "multicore GPUs" is just retarded; that whole 320 stream processors (you know, parallel processing: "the future") bit is the same idea as multicore, but done much better.

    holmebeef , April 20, 2008 5:01 PM
    Boo Hoo ... we don't want a war ... sniffle, sniffle ...

    This isn't the US in Iraq, Rob, and people aren't going to die. War (or competition, to be less melodramatic) drives innovation, better prices and better customer service. This is the best thing that will happen to PC graphics in a long time. Perhaps we can finally solve the heat dissipation and huge energy consumption problems that graphics cards are currently saddled with.
    Anonymous , April 20, 2008 10:18 PM
    A few points:
    The driving forces behind faster/better graphics computing are always going to be a mix of software and hardware engineers. Software developers are always going to need better ways to get the results they are looking for in their games: faster shading, texturing, anti-aliasing, etc. But these problems can be solved in many different ways that are completely invisible to the programmer, and that is much more of a hardware problem solved by nVidia/ATI. The change that Intel is bringing to the table is a reaction to the software's needs, but it brings new ideas and new ways to solve these same problems, because Intel has different ideas about the hardware solutions. When the limitations of hardware start to show, it's always good to explore other hardware solutions, even if that means you rock the programmer's world. (64-bit computing, SSE, multi-core, even GPUs weren't standard back in the day.)
    Plus, Intel's Larrabee is "supposed" to introduce more than just graphics power. Any computation that would benefit from stream processing could take advantage of this new hardware. Even if they don't break into the graphics market directly, they will have plenty of other avenues to explore and gain momentum. I like hardware that does more rather than less, and if I can get my graphics card to help out in my other applications, I'd be all for that.
    Anonymous , April 20, 2008 11:49 PM
    In my opinion there is a lot of negotiating and hard pushing going on behind the scenes. At the moment AMD is the only company working in both sectors - GPUs and CPUs. They need some cash, but other than that they are on the right track. The problem with Intel entering the market is that they have to come up with something really good and DirectX compatible.
    I am thinking that within a 5-year period, CPUs and GPUs will be sold pretty much the same way CPUs are today. Probably the technology would share system memory and be versatile (upgrade the memory timings/MHz and you upgrade overall speed). So the main component to work on is the motherboard layout and technology. It must host both a CPU and a GPU, or one chip with multiple cores (but IMO the first is more likely). If Intel goes its own way and closes its platform to others, will it have enough GFX speed? Would the new Larrabee work? Or would they open the platform to an nVidia GPU in the end? And how would AMD react? Keep their business priorities and close their technology, or act to popularize their platform first?
    I think it is pretty much about what GPU, or something else, nVidia has to offer. What functions could it process, and how fast? And at what cost can they do it, of course. If they really progress in physics and graphics, keep a healthy advantage and minimize the cost of their GPUs (or whatever you might call them), both ATI and Intel would eventually open up to nVidia, because customers would simply demand it; and if they don't, and actually slow down their platforms by doing so, there is no saying who else could enter the market and join nVidia. Both VIA and IBM might take the option and build up a platform with nVidia. And I think that while Intel might keep the advantage in terms of CPUs (very hard to predict 5 years out, though), nVidia's graphics and physics would be top of the class.