
PS3 Gets Physical with Nvidia

Source: Tom's Guide US | 22 comments

This afternoon Nvidia said that it has signed a "tools and middleware" license agreement with SCEI, bringing PhysX back to the PlayStation 3 console.

If there was one thing Sony did right with the PlayStation 3, it was to develop the console with a PC in mind. While many console owners and PC enthusiasts will undoubtedly flame that very comment, the machine is certainly more PC-like than its counterparts, the Nintendo Wii and Microsoft's Xbox 360. With Sony's Cell multiprocessor and Nvidia's G70-based GPU (the RSX) under the hood, plus a removable hard drive and support for mouse and keyboard input, the console has an overall PC quality. Heck, gamers can even install Linux on it.

With that in mind, Nvidia announced today that it has signed a deal with Sony Computer Entertainment Inc that gives PlayStation 3 developers access to Nvidia's PhysX software development kit (SDK). According to the company, the kit is now available as a free download on the SCEI Developer Network and consists of a full-featured API and "robust" physics engine. Now developers, level designers, and artists have complete creative control over character and object physical interactions, as the SDK allows them to author and preview the physics effects in real time.
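In rough terms, a physics SDK like this wraps scene setup and a per-frame step call that developers drive from their game loop. Here is a minimal sketch of that author-and-preview loop in Python; the `RigidBody` and `Scene` names are hypothetical stand-ins for illustration, not the PhysX API.

```python
class RigidBody:
    """Toy rigid body: vertical position and velocity only."""
    def __init__(self, y, vy=0.0):
        self.y = y
        self.vy = vy

class Scene:
    """Minimal stand-in for a physics scene: step() advances all bodies."""
    GRAVITY = -9.81  # m/s^2

    def __init__(self):
        self.bodies = []

    def step(self, dt):
        for b in self.bodies:
            b.vy += self.GRAVITY * dt          # integrate acceleration
            b.y += b.vy * dt                   # integrate velocity (semi-implicit Euler)
            if b.y < 0.0:                      # crude ground collision
                b.y, b.vy = 0.0, -b.vy * 0.5   # bounce with damping

scene = Scene()
ball = RigidBody(y=10.0)
scene.bodies.append(ball)

# "Preview" the effect: run one simulated second at 60 Hz.
for _ in range(60):
    scene.step(1.0 / 60.0)

print(ball.y, ball.vy)
```

A real engine adds broad-phase collision detection, constraint solving, and so on, but the authoring workflow the article describes is essentially tuning parameters and re-running a loop like this in real time.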

“NVIDIA is proud to support PLAYSTATION 3 as an approved middleware provider,” said Tony Tamasi, senior vice president of content and technology at NVIDIA. “Games developed for the PLAYSTATION 3 using PhysX technology offer a more realistic and lifelike interaction between the games characters and other objects within the game. We look forward to the new games that will redefine reality for a new generation of gamers.”

Originally developed by Ageia as a Physics Processing Unit and as the NovodeX SDK, the physics middleware became part of Nvidia's overall product offering when the company acquired Ageia back in February 2008. Games supporting hardware-accelerated PhysX use either a PhysX PPU or a CUDA-enabled GeForce GPU. This shifts physics processing away from the CPU, allowing for faster framerates and more realistic interaction with environments.
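The framerate benefit can be made concrete with a little frame-budget arithmetic. The per-frame costs below are purely illustrative numbers, not measurements: if physics runs in-line on the CPU, its cost adds to the frame time; if it is offloaded and overlapped with rendering, only the longer of the two tasks gates the frame.

```python
def fps(frame_ms):
    """Frames per second for a given frame time in milliseconds."""
    return 1000.0 / frame_ms

# Hypothetical per-frame costs (illustrative, not measured).
render_ms, physics_ms = 10.0, 6.0

serial_fps = fps(render_ms + physics_ms)          # physics on the CPU, in-line
overlapped_fps = fps(max(render_ms, physics_ms))  # physics offloaded, overlapped

print(serial_fps, overlapped_fps)
```

With these made-up numbers, offloading raises the framerate from 62.5 to 100 fps even though no individual task got faster.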

With that said, there was one thing about today's announcement that left us a little confused. According to Nvidia, the company released drivers that allowed the GeForce 8 series and higher to implement PhysX processing back in August 2008. However, because the PlayStation 3's RSX GPU is based on the G70 architecture (GeForce 7800), GPU support for PhysX isn't even possible on the console. So, Nvidia, what gives? How will PhysX work on the PlayStation 3?

The answer dates back to 2006, when Ageia originally released PhysX SDK 2.4 for the PlayStation 3, specifically optimized for the Cell processor. The company said it offloaded several components of the PhysX pipeline from the PlayStation's PPU (Power Processor Unit) to the SPUs (Synergistic Processing Units), yielding a 50 percent reduction in maximum PPU load. That is probably what's going on with the new Nvidia PhysX SDK release: the middleware utilizes the Cell processor, not the RSX GPU.
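The offloading claim can be sketched with a toy load model. The stage names and per-frame costs below are made up for illustration; only the idea, that moving pipeline stages off the PPU and onto the SPUs cuts the PPU's peak load, comes from the article.

```python
# Hypothetical per-frame costs of PhysX pipeline stages, in ms
# (illustrative numbers, not from Ageia or Nvidia).
pipeline = {
    "broad_phase": 2.0,
    "narrow_phase": 3.0,
    "solver": 4.0,
    "integration": 1.0,
}

# Stages moved from the PPU to the SPUs in this sketch.
offloaded = {"broad_phase", "narrow_phase"}

ppu_before = sum(pipeline.values())
ppu_after = sum(cost for stage, cost in pipeline.items() if stage not in offloaded)
reduction = 1.0 - ppu_after / ppu_before

print(f"PPU load: {ppu_before} ms -> {ppu_after} ms ({reduction:.0%} reduction)")
```

With these numbers the PPU load drops from 10 ms to 5 ms per frame, a 50 percent reduction matching the figure Ageia quoted.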

"PhysX on PS3 uses the CPU in PS3 and SPU which are the cores of  the cell. We do not use the NVIDIA GPU in the PS3 for PhysX acceleration," said a spokesman from Nvidia in an email to Tom's. "PhysX is also supported on many platforms which do not use GeForce GPUs for acceleration. For example, PhysX is available on the iphone--running on the arm processor core. This versatility is what is driving PhysX adoption across multiple platforms, including consoles and PCs."

So with all this techno-babble, what does this mean for PlayStation 3 gamers? It means virtual game worlds come to life in a very realistic way: trees bend in the wind, water flows with body and force, spent shells roll across the floor as players move over them in a frantic run. For developers taking advantage of the PhysX SDK, it's all created in the name of realism, to pull gamers into a suspended reality where anything is possible, only limited by the imagination of the developers.

  • gamerk316 (+0), March 18, 2009 4:16 PM
    Hmmm... first the NaturalMotion team (makers of Backbreaker) hints at news coming late in the month, and now official PS3 PhysX support...

    Backbreaker might finally be getting released. Real-time physics, FTW!
  • thedipper (+0), March 18, 2009 4:20 PM
    I think Havok has been the primary choice for video games for a reason. I'm not sure why, I'm no expert... but it just seems to be a better choice.
  • gamerk316 (+0), March 18, 2009 4:32 PM
    It's not better, it's just simpler and platform-independent (you need an Nvidia GPU for hardware-accelerated PhysX). Just look at the old "Tackle Alley" video for Backbreaker; Havok can't do that, while Euphoria (Backbreaker's engine) and PhysX can.
  • captaincharisma (-3), March 18, 2009 4:44 PM
    this is interesting news. does Xbox have PhysX as well? and anyone know if it is being used?
  • Tekkamanraiden (+4), March 18, 2009 4:48 PM
    Considering PhysX is an Nvidia technology and the 360 has an ATI video chipset, I somehow doubt it.
  • Dave K (+2), March 18, 2009 4:56 PM
    The PS3 COULD have been a viable (PC-Lite) platform... but sadly, it was crippled with insufficient main memory. I installed linux on mine and it was PAINFULLY slow. In the PS3 native environment, they further crippled PC-like functionality by making the Keyboard and Mouse operate in about as unintuitive and useless a way as could possibly be contrived. That at least could be fixed (has not been yet to my knowledge), but the memory problem can't.

    Physics, on the other hand, is right up the Cell's alley... the PS3 SHOULD finally be able to leverage some differential advantage from the Cell investment if this takes off (and if they can keep the Cell's pipeline full... which is tough since THERE'S NOT ENOUGH MAIN MEMORY!).
  • gamerk316 (+0), March 18, 2009 5:11 PM
    And your point with Main Memory? They use such a light OS, they don't need 2GB just to run the thing (I'm looking at you Windows).
  • thedipper (+0), March 18, 2009 5:27 PM
    You don't need 2gb to run Windows, you need 4gb to run Windows with a game.

    That's by far not a Windows specific issue.
  • gamerk316 (+1), March 18, 2009 5:52 PM
    Remember, in Windows, games can only use a maximum of 2GB (assuming 32-bit coded games).

    Given the fact that most console games are written in very low-level languages to get the most out of the hardware, the memory you need to run a game on the PS3 is far less than the amount you would need in Windows.
  • gamerk316 (+5), March 18, 2009 5:57 PM
    My point above is that most OSes are unoptimized, coded in a way that guarantees compatibility over performance. Consoles, on the other hand, are made the other way around, built to make efficient use of system resources.

    For example, in Windows (or most any computer OS), if you perform an arithmetic function, that naturally takes up memory. And I know for a fact that even when that variable is no longer needed, few bother to actually deallocate it from RAM (part of the reason being how few languages in use support manual memory deallocation). For consoles (or any lightweight embedded software), you make sure to clean up after yourself. If it's not being used, it's cleaned up to free resources.

    In short, what you need 2GB for in Windows could easily be accomplished with half that amount, if the OS were optimized and programmers made sure to clean up after themselves when done. I argue that Far Cry 2 could be modified to run on 512MB of RAM without any major performance loss, if anyone ever wanted to put that much effort into the work.
  • thedipper (+0), March 18, 2009 6:03 PM
    I was making the point that games load best, and so generally perform best, when the PC has 4GB of RAM.
  • pharge (+3), March 18, 2009 6:22 PM
    I agree that for the PS3, its main memory is enough for console games with limited or no programs running in the background. But for using the PS3 as a PC-lite (as stated by Dave K) with Linux or another OS... having more RAM would definitely make our lives much easier.
  • gamerk316 (+3), March 18, 2009 6:46 PM
    ^^ I'll agree to that.

    Back on topic, I'm fully expecting an announcement by the Backbreaker team by month's end. That will be the game that determines what goes on with PhysX, hence the partnership with Nvidia.
  • Dave K (+3), March 18, 2009 7:16 PM
    Quoting gamerk316: "And your point with Main Memory? They use such a light OS, they don't need 2GB just to run the thing (I'm looking at you Windows)."


    The point is that with 256MB of CPU memory, the PS3 is ill-suited for general processing tasks, which is a shame because in every other respect it is reasonably capable. With even 1GB of main memory the PS3 would have made an excellent Linux box (well... it would have if Sony hadn't chosen to block access to most of the GPU). Furthermore, the Cell processor's ability to pipeline huge quantities of data is going to be hampered by that lack of memory... the system is likely to have problems keeping it fed.

    The result is that I will be surprised if any game producer is ever able to fully leverage the Cell's... which is a shame.
  • demonhorde665 (-1), March 18, 2009 7:24 PM
    this just sounds more like Nvidia rubbing Sony's butt. Obviously they are happier with Sony than they were with MS on the original Xbox, all because MS wouldn't let Nvidia raise prices on their GPUs even though the price was set up in the contract... the more I read about Nvidia, the less I like them as a company. They aren't an 800 lb gorilla as they were once described, but more like an 800 lb baby that cries when deals don't go their way, and when they do, they suck up as much as they can.
  • Grims (+0), March 18, 2009 7:28 PM
    Curious though, considering how cheap memory was back then, and still is... why didn't they put more in it? Why ride the bare minimum on a system that needs to last 5 years?
  • Dave K (+3), March 18, 2009 7:47 PM
    Quoting gamerk316's comment above.


    Umm... no.

    Modern development tools have very detailed memory management approaches. Develop in .NET and you get variable scoping at a very granular level (within virtually ANY element you can have a memory scope), when a variable goes out of scope it is collected and its memory is freed. If your program is properly structured... it won't have any dead variables lying around taking up space. Decisions can be made to scope variables globally... but those decisions tend to be made for performance reasons, thus those variables should be increasing code efficiency not hurting it.

    Furthermore... the problem the PS3 has with lack of memory is less to do with optimization and more to do with the nature of the beast. Games tend to deal with processing large blocks of data, large blocks of data take up a LOT of memory.

    The throughput of a Cell processor (or worse yet... a group of them) working on a large array operation is pretty phenomenal... and all that data has to come FROM somewhere and go TO somewhere. That somewhere is memory... and the PS3 ain't got enough to allow game makers to allocate large buffers to keep the Cells fed. What that means is a limit on the volume of data, or lots of paging to disk. Limit the volume of data and you limit your options: simpler textures, less complex physics. Page to disk and the paging operation becomes your bottleneck, leaving the processing system waiting for work.

    Additionally... there have been plenty of studies around the idea that people can 'hand optimize' to improve the efficiency of code produced by modern compilers. The result: in VERY SELECT cases, hand-optimized code can perform better than compiler-optimized code... but in general it does not. Modern processors are simply too complex for most programmers to effectively optimize. Programmers COULD have optimized Far Cry 2 to run in 512MB of memory, but not without making tradeoffs in performance, and why would they want to? When memory costs a few bucks a gig, programmers are (as they should be) working to most effectively leverage the amount that the target system has (which for a new gaming rig these days will be 4-8 gig)... there is no free lunch in programming, and forcing the entire Far Cry 2 package to run in 512MB would be nonsensical. Is Far Cry 2 the most efficient code out there? Doubtful... but unless the technical direction on the project was totally incompetent, there were design decisions behind every choice to increase the memory footprint.

    It's popular for the uninformed to complain about the memory requirements of modern operating systems (they tend to pick on Microsoft because it's easy I think). The fact of the matter is that modern operating systems are designed to run on modern hardware... Vista 64 runs like gangbusters on my new quad box with 8 gig of ram, I don't have it installed on my old Athlon 64 machine.

    Why would Microsoft want to design their OS to run on 512meg? That would just mean that their system would be sub-optimized on my 8 gig machine. Design the OS for a window of system capabilities... optimize it there. ROUGHLY, Vista is happiest on 4-8gig, XP was happy with 1-4gig, 2000 was happy with 256meg to 1gig, NT ran quite nicely on 128meg to 256meg. Every time a new OS is released, a few people complain about how it takes up 'SO MUCH' memory.

    You want an OS that runs well in 256meg of memory, install NT or 2000... why would you EVER expect Microsoft to work to make their new flagship OS run in that little memory when it would clearly compromise the efficiency and power of the OS when running on new machines?
  • megamanx00 (-1), March 18, 2009 10:55 PM
    I guess I would care if I owned a PS3, but I don't.
  • Blessedman (+0), March 18, 2009 11:58 PM
    Honestly Dave, you missed one important fact: the PS3's primary function was as a game console.

    Shouldn't you also heed your own advice, and not try to run an OS in a memory footprint that it won't fit in?
  • russofris (+0), March 19, 2009 3:56 AM
    PhysX was added to the PS3 in 2006 (there was an Ageia press statement). In the same press statement, they also announced PhysX for the 360 and Wii. The press release today should read: "Nvidia forgets that the Wii and 360 are using PhysX".