
Insiders Say Intel to Build PlayStation 4 GPU

Source: Tom's Guide US | 59 comments

The PlayStation 4 is giving "Intel Inside" a whole new meaning.

When the term "discrete graphics" comes up, many immediately think of Nvidia and ATI. The two rivals represent the lion's share of the discrete graphics market. However, the pond may need to make room for another big fish.

Intel, a company known more for its CPUs and motherboards, is now making a push into the discrete graphics market. According to The Inquirer, Sony has commissioned the California-based chip maker to build the graphics muscle in its next gaming console. While the PlayStation 4 may be several years off, and is (probably) powered by a Cell processor refresh, Sony is certainly shaking things up.

Anyone who follows hardware already knows the bitter rivalry that exists between Intel and Nvidia. The former believes that a harmony of CPU and GPU, with an emphasis on the CPU, is appropriate for personal computers. Nvidia, whose bread and butter is computer graphics, is pushing a "GPGPU" philosophy. A General-Purpose Graphics Processing Unit, in the simplest terms, would be something like a current video card, but used for all computing functions inside one's PC.

Nvidia has now been blacklisted in the console realm. It broke into the market when Microsoft commissioned it for the original Xbox's GPU, and it is the company behind the graphics power in the PlayStation 3. With the Xbox 360 and Nintendo Wii powered by ATI, and both Microsoft's and Nintendo's next offerings rumored to be ATI-powered as well, this newest development with Intel leads one to believe that Nvidia is no longer welcome among the three console makers.

Intel also offers stability by means of its very deep coffers, something Nvidia cannot match. "The nice Sony engineering lady at CES told us that Intel essentially bought the win," says Charlie Demerjian of the Inquirer. "...theoretically good architecture, no imminent threats of going bust, and not being hated by Sony all contributed, too. With a couple of [variables] satisfied, the PS4 GPU belongs to Intel. No word if this is going to be the entire architecture, CPU as well, or not. That, from what we are told, is not final yet." Perhaps the most important aspect of this deal is Intel showing its GPU know-how. With Larrabee in the back of many enthusiasts' minds, a PS4 with Intel graphics that isn't a technological flop could open a window into the PC discrete graphics market for Intel.

Demerjian also said to expect the Wii2 and Xbox3 (or whatever they end up being named) in or around 2012. If that's true, expect the PS4 at the same time. It may not be that coveted ten-year life cycle Sony is shooting for, but in the interest of sales and competition, shaving a few years off wouldn't hurt.

Has Nvidia been given the boot out of the console market? It sure looks like it. However, where one door closes, another opens. If the Microsoft phone rumors are true, Nvidia's Tegra platform may have some serious muscle behind it.

Discuss
  • Efrayim, February 7, 2009 1:02 AM
    Poor Nvidia :( 
  • Hatecrime69, February 7, 2009 1:04 AM
    Intel doing the GPU? I don't know... considering how little experience they have with it (none really, their worthless IGPs don't count) and Larrabee still has to prove itself (not to mention release), it seems far too premature to say Intel might be doing it. I don't think companies hate Nvidia that much
  • TheFace, February 7, 2009 1:04 AM
    Why wouldn't you break the cycle? Maybe release your console mid-cycle so yours actually gets 2-3 years of extra development time and is more powerful than the competition's. Why live in the confines of this "product cycle"? They can produce a console that developers WANT to program for, make it affordable yet more powerful than the competition's due to the extra development time, and maybe throw in something the competition doesn't have (something like the Wii did this time around). This doesn't just have to apply to Sony, but they want their product cycle to last longer than what the others seem to be doing. Thoughts?
  • Tekkamanraiden, February 7, 2009 1:16 AM
    So correct me if I'm wrong but Sony is going with Intel mainly because they have lots of money? Apple should get in on this.
  • enewmen, February 7, 2009 1:19 AM
    Can someone elaborate why Nvidia isn't welcome in the consoles? I wasn't aware of such big problems. Thanks.
  • LATTEH, February 7, 2009 1:23 AM
    I honestly don't think Intel will put a GPU in a console, but I could be wrong; it may be more powerful than we thought it would be...
  • Anonymous, February 7, 2009 1:26 AM
    Nvidia is in the lead of the competition. Intel, on the other hand, is way over-expecting its Larrabee to compete with ATI and Nvidia. Throw in a new 16-SPE Cell on the table and you've got yourself a standoff. Although from the demos I've seen and heard of, Larrabee is mainly physics oriented (although it's still too early to tell; it is highly doubtful they can reach a teraFLOP) .... who knows how the PS4 will work with it in combination with the Cell ....
  • Tindytim, February 7, 2009 1:42 AM
    Quote:
    It may not be that coveted ten year life cycle Sony is shooting for, but in the interest of sales and competition, shaving a few years off wouldn't hurt.

    Are you retarded?

    The PS1's life was 11 years, with games being published until '05. The PS2 is still kicking 8 years from its launch.
  • IzzyCraft, February 7, 2009 1:46 AM
    Quote (Efrayim):
    Poor Nvidia

    Poor people who buy a PS4. Intel has been able to deliver GPUs that look nice on paper, but none that have worked out very well in tests.
  • MrBradley, February 7, 2009 2:47 AM
    Is this a joke? Where's the logic in this? You've got two brands that focus their entire business on graphics. So they pick a company with almost no experience in the graphics market?
  • Anonymous, February 7, 2009 2:53 AM
    I haven't noticed Nvidia fail in any console it took part in, but you can't blame Nvidia if the console doesn't sell. It's the games that push the console.
  • yoda8232, February 7, 2009 2:58 AM
    Overclocking on a PS4 lmao, would be nice. xD
  • 1raflo, February 7, 2009 3:01 AM
    Quote:
    although from the demos I've seen and heard of, Larrabee is mainly physics oriented (although it's still too early to tell; it is highly doubtful they can reach a teraFLOP)


    http://en.wikipedia.org/wiki/Larrabee_(GPU)

    According to this article, Larrabee is expected to have nearly 2 teraflops of computing power (a little more raw computing power than a single 4870X2; let's wait until the first benchmarks ...)
  • timaahhh, February 7, 2009 5:08 AM
    Quote (enewmen):
    Can someone elaborate why Nvidia isn't welcome in the consoles? I wasn't aware of such big problems. Thanks.

    I remember there being bad blood between Microsoft and nVidia over the original Xbox. Since Microsoft used basically off-the-shelf graphics and didn't own the silicon, Microsoft couldn't control the pricing of the Xbox. I imagine that Sony is having more issues with the performance end of what they got from nVidia.
  • falchard, February 7, 2009 5:23 AM
    Intel graphics? That's like yelling "we want our graphics to suck."
  • Blessedman, February 7, 2009 5:23 AM
    Almost no experience in the graphics market? Are you insane? Take a look at the numbers on who ships more graphics chipsets. Intel owns around 46% of the market share of shipped graphics chips. Intel has the resources to make something like Larrabee work, especially now that they have a little more motivation to make an instant impact on a market. With Sony now basically paying for some of the initial cost of Larrabee, Intel can bring it to mass production scale without much risk.
  • Anonymous, February 7, 2009 6:24 AM
    "Demerjiana also said to expect the Wii2 and Xbox3 (or whatever they end up being named), in or around 2012. If that's true, expect the PS3 at the same time. It may not be that coveted ten year life cycle Sony is shooting for, but in the interest of sales and competition, shaving a few years off wouldn't hurt."

    umm... you meant that as PS4, right? we already have PS3 ;) 
  • apache_lives, February 7, 2009 6:45 AM
    Ummm, we forget here that GPUs are just giant processors (well, lots of smaller cores, etc.) - something Intel is a monster at! And when packing all those cores into a more advanced manufacturing process (45nm or better vs. ATI's 55nm, etc.), there will be an advantage.

    Intel may not be able (yet) to make a "video card," but they can make chips that will have more (perhaps) raw processing power than the competition, which Sony can use.

    We all forget the turnarounds Intel has had - Pentium D to Core 2 Duo, jumps with 65nm vs. 45nm Core 2s, SSDs (first attempt dominates), etc. And do we all forget who owns most of the "graphics" market? INTEL - billions of Dells etc. use Intel integrated video, making up more than 50% of the market.
  • afrobacon, February 7, 2009 9:45 AM
    My thought was that Intel had such a large GPU market share because they're cheap to put in pre-builts. With this in mind, how does that say anything about their performance?

    Out of curiosity, what's the fastest card Intel has out right now? How does it compare to Nvidia/ATI?
  • Tindytim, February 7, 2009 10:07 AM
    Quote (afrobacon):
    what's the fastest card Intel has out right now? How does it compare to Nvidia/ATI?


    Intel doesn't have a single card out. Just a bunch of integrated solutions. None of which compete with any serious solutions, but none of them were meant to.

    Although I found something rather interesting when I did a bit of research into the matter:
    Quote:
    Comparison with the Cell Broadband Engine

    Larrabee's philosophy of using many small, simple cores is similar to the ideas behind the Cell processor. There are some further commonalities, such as the use of a high-bandwidth ring bus to communicate between cores. However, there are many significant differences in implementation which should make programming Larrabee simpler.

    * The Cell processor includes one main processor which controls many smaller processors. Additionally, the main processor can run an operating system. In contrast, all of Larrabee's cores are the same, and the Larrabee is not expected to run an OS.

    * Each compute core in the Cell (SPE) has a local store, for which explicit (DMA) operations are used for all accesses to DRAM. Ordinary reads/writes to DRAM are not allowed. In Larrabee, all on-chip and off-chip memories are under automatically-managed coherent cache hierarchy, so that its cores virtually share a uniform memory space through standard load/store instructions.

    * Because of the cache coherency noted above, each program running in Larrabee has virtually a large linear memory just as in traditional general-purpose CPU; whereas an application for Cell should be programmed taking into consideration limited memory footprint of the local store associated with each SPE (for details see this article) but with theoretically higher bandwidth.

    * Cell uses DMA for data transfer to/from on-chip local memories, which has a merit in flexibility and throughput; whereas Larrabee uses special instructions for cache manipulation (notably cache eviction hints and pre-fetch instructions), which has a merit in that it can maintain cache coherence (hence the standard memory hierarchy) while boosting performance for e.g. rendering pipelines and other stream-like computation.

    * Each compute core in the Cell runs only one thread at a time, in-order. A core in Larrabee runs up to four threads. Larrabee's hyperthreading helps hide latencies and compensates for lack of out-of-order execution.

    http://en.wikipedia.org/wiki/Larrabee_(GPU)

    I'd like to have faith in Intel, considering their great success with their SSDs on the first try. But that was a relatively new technology that didn't yet have a large market. So I can only hope it does well.