
Pulled Anandtech Articles

Old 06-30-05, 07:59 AM
  #1  
Banned
Thread Starter
 
Join Date: Jan 2002
Location: Blu-Ray: We Don't Need No Stinkin' Petition
Posts: 6,677
Likes: 0
Received 0 Likes on 0 Posts

Did Anandtech strike too close to a nerve? Or were they just plain wrong? You decide.

Mods: If this violates any type of rule, let me know and I'll post these on my blog and link to them. Thanks.

PS3 and Xbox360: A Hardware Discussion - Part 1
The point of a gaming console is to play games. The PC user in all of us wants to benchmark, overclock and upgrade even the unreleased game consoles that were announced at E3, but we can't. And these sorts of limits are healthy, because they let us have a system that we don't tinker with, one that simply performs its function: playing games.

The game developers are the ones that have to worry about which system is faster, whose hardware is better and what that means for the games they develop, but to us, the end users, whether the Xbox 360 has a faster GPU or the PlayStation 3's CPU is the best thing since sliced bread doesn't really matter. At the end of the day, it is the games and the overall experience that will sell both of these consoles. You can have the best hardware in the world, but if the games and the experience aren't there, it doesn't really matter.

Despite what we've just said, there is a desire to pick these new next-generation consoles apart. Of course if the games are all that matter, why even bother comparing specs, claims or anything about these next-generation consoles other than games? Unfortunately, the majority of that analysis seems to be done by the manufacturers of the consoles, and fed to the users in an attempt to win early support, and quite a bit of it is obviously tainted.

While we would've liked this to be an article on all three next-generation consoles, the Xbox 360, PlayStation 3 and Revolution, the fact of the matter is that Nintendo has not released any hardware details about their next-gen console, meaning that there's nothing to talk about at this point in time. That leaves us with two contenders: Microsoft's Xbox 360, due out by the end of this year, and Sony's PlayStation 3, due out in Spring 2006.

This article isn't here to crown a winner or even to begin to claim which platform will have better games; it is simply here to answer questions we have all had, as well as to discuss these new platforms in greater detail than we have before.

Before proceeding with this article, there's a bit of required reading to really get the most out of it. We strongly suggest reading through our Cell processor article [anandtech.com], as well as our launch coverage of the PlayStation 3 [anandtech.com]. We would also suggest reading through our Xbox [anandtech.com] 360 [anandtech.com] articles [anandtech.com] for background on Microsoft's console, as well as an earlier piece published on multi-threaded game development [anandtech.com]. Finally, be sure that you're fully up to date on the latest GPUs [anandtech.com], especially the recently announced NVIDIA GeForce 7800 GTX [anandtech.com] as it is very closely related to the graphics processor in the PS3.

This article isn't a successor to any of the aforementioned pieces; it just really helps to have an understanding of everything we've covered before - and since we don't want this article to be longer than it already is, we'll just point you back there to fill in the blanks if you find that there are any.

Now, on to the show...

A Prelude on Balance
The most important goal of any platform is balance on all levels. We've seen numerous examples of what architectural imbalances can do to performance: having too little cache or too narrow a FSB can starve high speed CPUs of the data they need to perform. GPUs without enough memory bandwidth can't get anywhere near their peak fill rates, regardless of what those may be. Achieving a balanced overall platform is a very difficult thing on the PC, unless you have an unlimited budget and are able to purchase the fastest components. Skimping on your CPU while buying the most expensive graphics card may leave you with performance that's marginally better, or even worse, than that of someone with a more balanced system built around a faster CPU and a somewhat slower GPU.

With consoles however, the entire platform is designed to be balanced out of the box, as best as the manufacturer can get it to be, while still remaining within the realm of affordability. The manufacturer is responsible for choosing bus widths, CPU architectures, memory bandwidths, GPUs, even down to the type of media that will be used by the system - and most importantly, they make sure that all elements of the system are as balanced as can be.

The reason this article starts with a prelude on balance is because you should not expect either console maker to have put together a horribly imbalanced machine. A company that is already losing money on every console sold will never put faster hardware in that console if it isn't going to be utilized thanks to an imbalance in the platform. So you won't see an overly powerful CPU paired with a fill-rate limited GPU, and you definitely won't see a lack of bandwidth inhibiting performance. What you will see is a collection of tools that Microsoft and Sony have each, independently, put together for the game developer. Each console has its strengths and its weaknesses, but as a whole, each console is individually very well balanced. So it would be wrong to say that the PlayStation 3's GPU is more powerful than the Xbox 360's GPU, because you can't isolate the two and compare them in a vacuum; how they interact with the CPU, with memory, etc. all influences the overall performance of the platform.

The Consoles and their CPUs
The CPUs at the heart of these two consoles are very different in architectural approach, despite sharing some common parts. The Xbox 360's CPU, codenamed Xenon, takes a general purpose approach to microprocessor design and implements three general purpose PowerPC cores, meaning they can execute any type of code and will do it relatively well.

The PlayStation 3's CPU, the Cell processor, pairs a general purpose PowerPC Processing Element (PPE, very similar to one core from Xenon) with 7 working Synergistic Processing Elements (SPEs) that are more specialized hardware designed to execute certain types of code.

So the comparison between Xenon and Cell really boils down to a comparison between a general purpose microprocessor, and a hybrid of general purpose and specialized hardware.

Despite what many have said, there is support for Sony's approach with Cell. We have discussed the architecture of the Cell processor in great detail already, but there is industry support for a general purpose + specialized hardware CPU design. Take note of the following slide from Intel's Platform 2015 vision for their CPUs by the year 2015:

[Slide: Intel's Platform 2015 roadmap, pairing a few large general purpose cores with many smaller, specialized cores - image not reproduced]
The use of one or two large general purpose cores combined with specialized hardware and multiple other smaller cores is in Intel's roadmap for the future, despite their harsh criticism of the Cell processor. The difference is that Cell appears to be far too early for its time. By 2015 CPUs may be manufactured on as small as a 32nm process, significantly smaller than today's 90nm process, meaning that a lot more hardware can be packed into the same amount of space. In going with a very forward-looking design, the Cell processor architects inevitably had to make sacrifices to deal with the fact that the chip they wanted to design is years ahead of its time for use in general computation.

Introducing the Xbox 360's Xenon CPU
The Xenon processor was designed from the ground up to be a 3-core CPU, so unlike Cell, there are no disabled cores on the Xenon chip itself in order to improve yield. The reason for choosing 3 cores is because it provides a good balance between thread execution power and die size. According to Microsoft's partners, the sweet spot for this generation of consoles will be between 4 and 6 execution threads, which is where the 3-core CPU came from.

The chip is built on a 90nm process, much like Cell, and will run at 3.2GHz - also like Cell. All of the cores are identical to one another, and they are very similar to the PPE used in the Cell microprocessor, with a few modifications.

The focus of Microsoft's additions to the core has been in the expansion of the VMX instruction set. In particular, Microsoft now includes a single cycle dot-product instruction as a part of the VMX-128 ISA that is implemented on each core. Microsoft has stated that there is nothing stopping IBM from incorporating this support into other chips, but as of yet we have not seen anyone from the Cell camp claim support for single cycle dot-products on the PPE.
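For reference, the operation that VMX-128's new instruction collapses into a single cycle is the familiar 4-component dot product; in scalar C++ (the function name is ours, purely for illustration) it costs four multiplies and three adds:

```cpp
// 4-component dot product: the operation Xenon's VMX-128 ISA can
// perform in a single cycle. Written out in scalar code, it takes
// four multiplies and three dependent adds.
float dot4(const float a[4], const float b[4]) {
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2] + a[3] * b[3];
}
```

Dot products sit on the critical path of lighting, skinning and physics inner loops, which is why collapsing one into a single instruction is worth dedicated silicon.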

The three cores share a meager 1MB L2 cache, which should be fine for single threaded games but as developers migrate more to multi-threaded engines, this small cache will definitely become a performance limiter. With each core being able to execute two threads simultaneously, you effectively have a worst case scenario of 6 threads splitting a 1MB L2 cache. As a comparison, the current dual core Pentium 4s have a 1MB L2 cache per core and that number is only expected to rise in the future.

The most important selling point of the Xbox 360's Xenon core is the fact that all three cores are identical, and they are all general purpose microprocessors. The developer does not have to worry about multi-threading beyond the point of getting their code to be thread safe; once it is multi-threaded, it can easily be run on any of the cores. The other important thing to keep in mind here is that porting between multi-core PC platforms and the Xbox 360 will be fairly trivial. Anywhere any inline assembly is used there will obviously have to be changes, but with relatively minor code changes and some time optimizing, code portability between the PC and the Xbox 360 shouldn't be very difficult at all. For what it is worth, porting game code between the PC and the Xbox 360 will be a lot like Mac developers porting code between Mac OS X for Intel platforms and PowerPC platforms: there's an architecture switch, but the programming model doesn't change much.
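To illustrate, once code is thread safe it really can land on any identical core. This hedged sketch uses modern std::thread rather than any actual Xbox 360 API, and splits a simple job across two interchangeable workers:

```cpp
#include <numeric>
#include <thread>
#include <vector>

// With identical general purpose cores, any worker can run any
// task: here two threads each sum half of an array, and neither
// cares which physical core the scheduler gives it.
long parallel_sum(const std::vector<int>& data) {
    long lo = 0, hi = 0;
    size_t mid = data.size() / 2;
    std::thread a([&] { lo = std::accumulate(data.begin(), data.begin() + mid, 0L); });
    std::thread b([&] { hi = std::accumulate(data.begin() + mid, data.end(), 0L); });
    a.join();
    b.join();
    return lo + hi;
}
```

On a heterogeneous design like Cell, by contrast, the same split would force the developer to decide which halves are SPE-friendly before dispatching them.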

The same cannot, however, be said for Cell and the PlayStation 3. The easiest way to port code from the Xbox 360 to the PS3 would be to run the code exclusively on the Cell's single PPE, which obviously wouldn't offer very good performance for heavily multi-threaded titles. But with some effort, the PlayStation 3 does have a lot of potential.

Xenon vs. Cell
The first public game demo on the PlayStation 3 was Epic Games' Unreal Engine 3 at Sony's PS3 press conference. Tim Sweeney, the founder of Epic and father of UE3, performed the demo and helped shed some light on how multi-threading can work on the PlayStation 3.

According to Tim, a lot of things aren't appropriate for SPE acceleration in UE3, mainly high-level game logic, artificial intelligence and scripting. But he adds that "Fortunately these comprise a small percentage of total CPU time on a traditional single-threaded architecture, so dedicating the CPU to those tasks is appropriate, while the SPE's and GPU do their thing."

So what does Tim Sweeney see the SPEs being used for in UE3? "With UE3, our focus on SPE acceleration is on physics, animation updates, particle systems, sound; a few other areas are possible but require more experimentation."

Tim's view on the PPE/SPE split in Cell is far more balanced than most we've encountered. There are many who see the SPEs as utterly useless for executing anything (we'll get to why in a moment), while there are others who have been talking about doing far too much on SPEs where the general purpose PPE would do much better.

For the most part, the areas that UE3 uses the Cell's SPEs for are fairly believable. For example, sound processing makes a lot of sense for the SPEs given their rather specialized architecture aimed at streaming tasks. But the one curious item is the focus on using SPEs to accelerate physics calculations, especially given how branch heavy physics calculations generally are.

Collision detection is a big part of what is commonly referred to as "game physics." As the name implies, collision detection simply refers to the game engine determining when two objects collide. Without collision detection, bullets would never hit your opponents and your character would be able to walk through walls, cars, etc... among other things.

One method of implementing collision detection in a game is through the use of a Binary Space Partitioning (BSP) tree. BSP trees are created by organizing lists of polygons into a binary tree. The structure of the tree itself doesn't matter to this discussion, but the important thing to keep in mind is that to traverse a BSP tree in order to test for a collision between some object and a polygon in the tree, you have to perform a lot of comparisons. You first traverse the tree to find the polygon you want to test for a collision against. Then you have to perform a number of checks to see whether a collision has occurred between the object you're comparing and the polygon itself. This process involves a lot of conditional branching, code that runs best on a high performance OoO core with a very good branch predictor.

Unfortunately, the SPEs have no branch prediction, so BSP tree traversal will tie up an SPE for quite a bit of time while not performing very well, as each branch condition has to be evaluated before execution can continue. However, it is possible to structure collision detection for execution on the SPEs, but it would require a different approach to the collision detection algorithms than what would normally be implemented on a PC or Xbox 360.
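To make that branchiness concrete, here is a minimal point-versus-BSP query in C++. The structures and names are ours, not any shipping engine's, but the key property holds: every level of the traversal is a data-dependent, essentially unpredictable branch.

```cpp
// Minimal BSP node, names purely illustrative: a splitting plane
// partitions space into front and back children; leaves record
// whether the region is solid geometry.
struct Plane { float nx, ny, nz, d; };
struct BspNode {
    Plane split;
    BspNode* front;
    BspNode* back;
    bool solid;
};

// Point-vs-world query. Each iteration picks a child based on which
// side of the plane the point falls, a branch whose direction depends
// entirely on the data. An OoO core with a good predictor hides much
// of this; an SPE with no predictor stalls on every level.
bool collides(const BspNode* node, float x, float y, float z) {
    while (node->front && node->back) {  // interior node
        float side = node->split.nx * x + node->split.ny * y
                   + node->split.nz * z - node->split.d;
        node = (side >= 0.0f) ? node->front : node->back;
    }
    return node->solid;  // leaf
}
```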

We're still working on providing examples of how it is actually done, but it's tough getting access to detailed information at this stage given that a number of NDAs are still in place involving Cell development for the PS3. Regardless of how it is done, obviously the Epic team found the SPEs to be a good match for their physics code, if structured properly, meaning that the Cell processor isn't just one general purpose core with 7 others that go unused.

In fact, if properly structured and coded for SPE acceleration, physics code could very well run faster on the PlayStation 3 than on the Xbox 360 thanks to the more specialized nature of the SPE hardware. Not to mention that physics acceleration is particularly parallelizable, making it a perfect match for an array of 7 SPEs.
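As a toy illustration of that parallelism, consider a particle integration step: each particle updates independently, so the work splits cleanly into slices. std::thread stands in here for SPE task dispatch; none of this is actual Cell code.

```cpp
#include <algorithm>
#include <thread>
#include <vector>

// Integrate particle positions in parallel. Each worker owns a
// contiguous slice and touches no other worker's data, mirroring how
// a physics step could be farmed out across an array of SPEs.
void integrate(std::vector<float>& pos, const std::vector<float>& vel,
               float dt, unsigned workers) {
    std::vector<std::thread> pool;
    size_t chunk = (pos.size() + workers - 1) / workers;
    for (unsigned w = 0; w < workers; ++w) {
        pool.emplace_back([&, w] {
            size_t begin = w * chunk;
            size_t end = std::min(pos.size(), begin + chunk);
            for (size_t i = begin; i < end; ++i)
                pos[i] += vel[i] * dt;  // independent per particle
        });
    }
    for (auto& t : pool) t.join();
}
```

Because no slice depends on another, adding workers scales the step almost linearly, which is exactly the shape of problem an array of 7 SPEs wants.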

Microsoft has referred to the Cell's array of SPEs as a bunch of DSPs useless to game developers. The fact that the next installment of the Unreal engine will be using the Cell's SPEs for physics, animation updates, particle systems as well as audio processing means that Microsoft's definition is a bit off. While not all developers will follow in Epic's footsteps, those that wish to remain competitive and get good performance out of the PS3 will have to.

The bottom line is that Sony would not foolishly spend over 75% of their CPU die budget on SPEs to use them for nothing more than fancy DSPs. Architecting a game engine around Cell and optimizing for SPE acceleration will take more effort than developing for the Xbox 360 or PC, but it can be done. The question then becomes, will developers do it?

In Johan's Quest for More Processing Power series, he looked at the developmental limitations of multi-threading, especially as they apply to games. The end result is that multi-threaded game development takes between 2 and 3 times longer than conventional single-threaded game development; adding yet more time on top of that to restructure elements of your engine to get better performance on the PS3 isn't going to make the transition any easier on developers.

Why In-Order?
Ever since the Pentium Pro, desktop PC microprocessors have implemented Out of Order (OoO) execution architectures in order to improve performance. We've explained the idea in great detail before, but the idea is that an Out-of-Order microprocessor can reorganize its instruction stream in order to best utilize its execution resources. Despite the simplicity of its explanation, implementing support for OoO dramatically increases the complexity of a microprocessor, as well as drives up power consumption.

In a perfect world, you could group a bunch of OoO cores on a single die and offer both excellent single threaded performance, as well as great multi-threaded performance. However, the world isn't so perfect, and there are limitations to how big a processor's die can be. Intel and AMD can only fit two of their OoO cores on a 90nm die, yet the Xbox 360 and PlayStation 3 targeted 3 and 9 cores, respectively, on a 90nm die; clearly something has to give, and that something happened to be the complexity of each individual core.

Given a game console's 5 year expected lifespan, the decision was made (by both MS and Sony) to favor a multi-core platform over a faster single-core CPU in order to remain competitive towards the latter half of the consoles' lifetime.

So with the Xbox 360, Microsoft used three fairly simple IBM PowerPC cores, while Sony has the much publicized Cell processor in their PlayStation 3. Both will be considerably slower than even mainstream desktop processors in single threaded game code, but the majority of games these days are far more GPU bound than CPU bound, so the performance decrease isn't a huge deal. In the long run, with a bit of optimization and running multi-threaded game engines, these collections of simple in-order cores should be able to put out some fairly good performance.

Does In-Order Matter?
As we discussed in our Cell article [anandtech.com], in-order execution makes a lot of sense for the SPEs. With in-order execution as well as a small amount of high speed local memory, memory access becomes quite predictable and code is very easily scheduled by the compiler for the SPEs. However, for the PPE in Cell, and the PowerPC cores in Xenon, the in-order approach doesn't necessarily make a whole lot of sense. You don't have the advantage of a cacheless architecture, even though you do have the ability to force certain items to remain untouched by the cache. More than anything having an in-order general purpose core just works to simplify the core, at the expense of depending quite a bit on the compiler, and the programmer, to optimize performance.

Very little of a modern game is written in assembly; most of it is written in a high level language like C or C++, and the compiler does the dirty work of optimizing the code and translating it into low level assembly. Compilers are horrendously difficult to write; getting a compiler to work at all is a pretty difficult job in itself, but getting one to work well, regardless of what the input code is, is nearly impossible.

However, with a properly designed ISA and a good compiler, having an in-order core to work on is not the end of the world. The performance you lose by not being able to extract the last bit of instruction level parallelism is made up by the fact that you can execute far more threads per clock thanks to the simplicity of the in-order cores allowing more to be packed on a die. Unfortunately, as we've already discussed, on day one that's not going to be much of an advantage.

The Cell processor's SPEs are even more of a challenge, as they are more specialized hardware only suitable to executing certain types of code. Keeping in mind that the SPEs are not well suited to running branch heavy code, loop unrolling will do a lot to improve performance as it can significantly reduce the number of branches that must be executed. In order to squeeze the absolute maximum amount of performance out of the SPEs, developers may be forced to hand code some routines as initial performance numbers for optimized, compiled SPE code appear to be far less than their peak throughput.
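A quick sketch of the unrolling idea (function names are ours): the unrolled version does the same work while evaluating its loop branch a quarter as often, and the four independent accumulators also give an in-order core more instruction-level parallelism to chew on.

```cpp
// Rolled loop: the branch at the top is evaluated n times.
float sum_rolled(const float* v, int n) {
    float s = 0.0f;
    for (int i = 0; i < n; ++i) s += v[i];
    return s;
}

// Unrolled by 4 (assumes n % 4 == 0): the branch is evaluated n/4
// times, and four independent accumulators avoid a serial dependency
// chain, both wins on a predictor-less, in-order SPE.
float sum_unrolled4(const float* v, int n) {
    float s0 = 0.0f, s1 = 0.0f, s2 = 0.0f, s3 = 0.0f;
    for (int i = 0; i < n; i += 4) {
        s0 += v[i];
        s1 += v[i + 1];
        s2 += v[i + 2];
        s3 += v[i + 3];
    }
    return s0 + s1 + s2 + s3;
}
```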

While the move to in-order architectures won't cause game developers too much pain with good compilers at their disposal, the move to multi-threaded game development and optimizing for the Cell in general will be much more challenging.

How Many Threads?
Earlier this year we saw the beginning of a transition from very fast, single core microprocessors to slower, multi-core designs on the PC desktop. The full transition won't be complete for another couple of years, but just as it has begun on the desktop PC side, it also has begun in the next-generation of consoles.

Remember that consoles must have a lifespan of around 5 years, so even if the multithreaded transition isn't going to happen with games for another 2 years, it is necessary for these consoles to be built around multi-core processors to support the ecosystem when that transition occurs.

The problem is that today, all games are single threaded, meaning that in the case of the Xbox 360, only one out of its three cores would be utilized when running present day game engines. The PlayStation 3 would fare no better, as the Cell CPU has a very similar general purpose execution core to one of the Xbox 360 cores. The reason this is a problem is because these general purpose cores that make up the Xbox 360's Xenon CPU or the single general purpose PPE in Cell are extremely weak cores, far slower than a Pentium 4 or Athlon 64, even running at much lower clock speeds.

Looking at the Xbox 360 and PlayStation 3, we wondered if game developers would begin their transition to multithreaded engines with consoles and eventually port them to PCs. While the majority of the PC installed base today still runs on single-core processors, the installed base for both the Xbox 360 and PS3 will be guaranteed to be multi-core, so what better platform to introduce a multithreaded game engine than the new consoles, where you can guarantee that all of your users will be able to take advantage of the multithreading?

On the other hand, looking at all of the early demos we've seen of Xbox 360 and PS3 games, not a single one appears to offer better physics or AI than the best single threaded games on the PC today. At best, we've seen examples of ragdoll physics similar to that of Half Life 2, but nothing that is particularly amazing, earth shattering or shocking. Definitely nothing that appears to be leveraging the power of a multicore processor.

In fact, all of the demos we've seen look like nothing more than examples of what you can do on the latest generation of GPUs - not showcases of multi-core CPU power. So we asked Microsoft, expecting to get a fluffy answer about how all developers would be exploiting the 6 hardware threads supported by Xenon; instead, we got a much more down to earth answer.

The majority of developers are doing things no differently than they have been on the PC. A single thread is used for all game code, physics and AI and in some cases, developers have split out physics into a separate thread, but for the most part you can expect all first generation and even some second generation titles to debut as basically single threaded games. The move to two hardware execution threads may in fact only be an attempt to bring performance up to par with what can be done on mid-range or high-end PCs today, since a single thread running on Xenon isn't going to be very competitive performance wise, especially executing code that is particularly well suited to OoO desktop processors.

With Microsoft themselves telling us not to expect more than one or two threads of execution to be dedicated to game code, will the remaining two cores of the Xenon go unused for the first year or two of the Xbox 360's existence? While the remaining cores won't directly be used for game performance acceleration, they won't remain idle - enter the Xbox 360's helper threads.

The first time we discussed helper threads on AnandTech was in reference to additional threads, generated at runtime, that could use idle execution resources to go out and prefetch data that the CPU would eventually need.

The Xbox 360 will use a few different types of helper threads to not only make the most out of the CPU's performance, but to also help balance the overall platform. Keep in mind that with the 360, Microsoft has not increased the size of the media that games will be stored on. The dual layer DVD-9 spec is still in effect, meaning that game developers shipping titles for the Xbox 360 in 2006 will have the same amount of storage space as they did back in 2001. Given that current Xbox titles generally use around 4.5GB of space, it's not a big deal, but by 2010 9GB may feel a bit tight.

Thanks to idle execution power in the 3-core Xenon, developers can now perform real-time decompression of game data in order to maximize storage space. Given that a big hunk of disc space is used by audio and video, being able to use more sophisticated compression algorithms for both types of data will also help maximize that 9GB of storage. Or, if space isn't as much of a concern, developers are now able to use more sophisticated encoding algorithms to encode audio/video to use the same amount of space as they are today, but achieve much higher quality audio and video. Microsoft has already stated that in game video will essentially use the WMV HD codec. The real time decompression of audio/video will be another use for the extra power of the system.
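As a toy model of such a helper thread, the sketch below inflates run-length encoded data on a spare thread while the main thread would keep running game code. RLE stands in for whatever real codec a title would use; all names are illustrative.

```cpp
#include <cstdint>
#include <thread>
#include <utility>
#include <vector>

// Toy run-length decoder: each (value, count) pair expands into
// `count` copies of `value`. A real title would use a far more
// sophisticated codec; the structure of the work is the point here.
std::vector<uint8_t> rle_decode(const std::vector<std::pair<uint8_t, int>>& runs) {
    std::vector<uint8_t> out;
    for (const auto& r : runs)
        out.insert(out.end(), r.second, r.first);
    return out;
}

// Decompress on a helper thread: an otherwise idle core inflates
// game data while the main thread could keep simulating.
std::vector<uint8_t> load_asset_async(const std::vector<std::pair<uint8_t, int>>& runs) {
    std::vector<uint8_t> asset;
    std::thread helper([&] { asset = rle_decode(runs); });
    // ... main thread would run game code here ...
    helper.join();
    return asset;
}
```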

Another interesting use will be digital audio encoding; in the original Xbox Microsoft used a relatively expensive DSP featured in the nForce south bridge to perform real-time Dolby Digital Encoding. The feature allowed Microsoft to offer a single optical out on the Xbox's HD AV pack, definitely reducing cable clutter and bringing 5.1 channel surround sound to the game console. This time around, DD encoding can be done as a separate thread on the Xenon CPU - in real time. It reduces the need for Microsoft to purchase a specialized DSP from another company, and greatly simplifies the South Bridge in the Xbox 360.

But for the most part, on day 1, you shouldn't expect Xbox 360 games to be much more than the same type of single threaded titles we've had on the PC. In fact, the biggest draw to the new consoles will be the fact that for the first time, we will have the ability to run games rendered internally at 1280 x 720 on a game console. In other words, round one of the next generation of game consoles is going to be a GPU battle.

The importance of this fact is that Microsoft has been talking about the general purpose execution power of the Xbox 360 and how it is 3 times that of the PS3's Cell processor. With only 1 - 2 threads of execution being dedicated for game code, the advantage is pretty much lost at the start of the console battle.

Sony doesn't have the same constraints that Microsoft does, and thus there is less of a need to perform real time decompression of game content. Keep in mind that the PS3 will ship with a Blu-ray drive, with Sony's minimum disc spec being a hefty 23.3GB of storage for a single layer Blu-ray disc. The PS3 will also make use of H.264 encoding for all video content, the decoding of which is perfectly suited for the Cell's SPEs. Audio encoding will also be done on the SPEs, once again as there is little need to use any extra hardware to perform a task that is perfectly suited for the SPEs.
Old 06-30-05, 08:04 AM
  #2  
PS3 and Xbox360: Examples of Poor CPU Performance
Learning from Generation X

The original Xbox console marked a very important step in the evolution of gaming consoles - it was the first console that was little more than a Windows PC.

It featured a 733MHz Pentium III processor with a 128KB L2 cache, paired up with a modified version of NVIDIA's nForce chipset (modified to support Intel's Pentium III bus instead of the Athlon XP it was designed for). The nForce chipset featured an integrated GPU, codenamed the NV2A, offering performance very similar to that of a GeForce3. The system had a 5X PC DVD drive and an 8GB IDE hard drive, and all of the controllers interfaced to the console using USB cables with a proprietary connector.

For the most part, game developers were quite pleased with the original Xbox. It offered them a much more powerful CPU, GPU and overall platform than anything had before. But as time went on, there were definitely limitations that developers ran into with the first Xbox.

One of the biggest limitations ended up being the meager 64MB of memory that the system shipped with. Developers had asked for 128MB and the motherboard even had positions silk screened for an additional 64MB, but in an attempt to control costs the final console only shipped with 64MB of memory.

The next problem is that the NV2A GPU ended up not having the fill rate and memory bandwidth necessary to drive high resolutions, which kept the Xbox from being used as a HD console.

Although Intel outfitted the original Xbox with a Pentium III/Celeron hybrid in order to improve performance yet maintain its low cost, at 733MHz that quickly became a performance bottleneck for more complex games after the console's introduction.

The combination of GPU and CPU limitations made 30 fps a frame rate target for many games, while simpler titles were able to run at 60 fps. Split screen play on Halo would even stutter below 30 fps depending on what was happening on screen, and that was just a first-generation title. More experience with the Xbox brought creative solutions to the limitations of the console, but clearly most game developers had a wish list of things they would have liked to have seen in the Xbox successor. Similar complaints were levied against the PlayStation 2, but in some cases they were more extreme (e.g. its 4MB frame buffer).

Given that consoles are generally evolutionary, taking lessons learned in previous generations and delivering what the game developers want in order to create the next-generation of titles, it isn't a surprise to see that a number of these problems are fixed in the Xbox 360 and PlayStation 3.

One of the most important changes with the new consoles is that system memory has been bumped from 64MB on the original Xbox to a whopping 512MB on both the Xbox 360 and the PlayStation 3. For the Xbox, that's a factor of 8 increase, and over 12x the total memory present on the PlayStation 2.

The other important improvement with the next-generation of consoles is that the GPUs have been improved tremendously. With 6 - 12 month product cycles, it's no surprise that in the past 4 years GPUs have become much more powerful. By far the biggest upgrade these new consoles will offer, from a graphics standpoint, is the ability to support HD resolutions.

There are obviously other, less-performance oriented improvements such as wireless controllers and more ubiquitous multi-channel sound support. And with Sony's PlayStation 3, disc capacity goes up thanks to their embracing the Blu-ray standard.

But then we come to the issue of the CPUs in these next-generation consoles, and the level of improvement they offer. Both the Xbox 360 and the PlayStation 3 offer multi-core CPUs to supposedly usher in a new era of improved game physics and reality. Unfortunately, as we have found out, the desire to bring multi-core CPUs to these consoles was made a reality at the expense of performance in a very big way.

Problems with the Architecture

At the heart of both the Xenon and Cell processors is IBM's custom PowerPC based core. We've discussed this core in our previous articles, but it is best characterized as being quite simple. The core itself is a very narrow 2-issue in-order execution core, featuring a 64KB L1 cache (32K instruction/32K data) and either a 1MB or 512KB L2 cache (for Xenon or Cell, respectively). Supporting SMT, the core can execute two threads simultaneously similar to a Hyper Threading enabled Pentium 4. The Xenon CPU is made up of three of these cores, while Cell features just one.

Each individual core is extremely small, making the 3-core Xenon CPU in the Xbox 360 smaller than a single core 90nm Pentium 4. While we don't have exact die sizes, we've heard that the number is around 1/2 the size of the 90nm Prescott die.

IBM's pitch to Microsoft was based on the peak theoretical floating point performance-per-dollar that the Xenon CPU would offer, and given Microsoft's focus on cost savings with the Xbox 360, they took the bait.

While Microsoft and Sony have been childishly playing this flops-war, comparing the 1 TFLOPs processing power of the Xenon CPU to the 2 TFLOPs processing power of the Cell, the real-world performance war has already been lost.

Right now, from what we've heard, the real-world performance of the Xenon CPU is about twice that of the 733MHz processor in the first Xbox. Considering that this CPU is supposed to power the Xbox 360 for the next 4 - 5 years, it's nothing short of disappointing. To put it in perspective, floating point multiplies are apparently 1/3 as fast on Xenon as on a Pentium 4.

The reason for the poor performance? The very narrow 2-issue in-order core also happens to be very deeply pipelined, apparently with a branch predictor that's not the best in the business. In the end, you get what you pay for, and with such a small core, it's no surprise that performance isn't anywhere near the Athlon 64 or Pentium 4 class.

The Cell processor doesn't get off the hook just because it only uses a single one of these horribly slow cores; the SPE array ends up being fairly useless in the majority of situations, making it little more than a waste of die space.

We mentioned before that collision detection can be accelerated on the SPEs of Cell, despite being fairly branch heavy. The lack of a branch predictor in the SPEs apparently isn't that big of a deal, since most collision detection branches are essentially random and can't be predicted even by the best branch predictor. So not having a branch predictor doesn't hurt; what does hurt, however, is the very small amount of local memory available to each SPE. In order to access main memory, the SPE places a DMA request on the bus (or the PPE can initiate the DMA request on its behalf) and waits for it to be fulfilled. According to those with experience on the PS3 development kits, this access takes far too long to be useful in many real-world scenarios. It is the small amount of local memory that each SPE has access to that limits the SPEs to working on more than a handful of tasks. While physics acceleration is an important one, there are many more tasks that can't be accelerated by the SPEs because of the memory limitation.

The other point that has been made is that even if you can offload some of the physics calculations to the SPE array, the Cell's PPE ends up being a pretty big bottleneck thanks to its overall lackluster performance. It's akin to having an extremely fast GPU but without a fast CPU to pair it up with.

What About Multithreading?

We of course asked the obvious question: would game developers rather have 3 slow general purpose cores, or one of those cores paired with an array of specialized SPEs? The response was unanimous: everyone we have spoken to would rather take the general purpose core approach.

Citing everything from ease of programming to the limitations of the SPEs we mentioned previously, the Xbox 360 appears to be the more developer-friendly of the two platforms according to the cross-platform developers we've spoken to. Despite being more developer-friendly, the Xenon CPU is still not what developers wanted.

The most ironic bit of it all is that according to developers, if either manufacturer had decided to use an Athlon 64 or a Pentium D in their next-gen console, they would be significantly ahead of the competition in terms of CPU performance.

While the developers we've spoken to agree that heavily multithreaded game engines are the future, that future won't really take form for another 3 - 5 years. Even Microsoft admitted to us that all developers are focusing on having, at most, one or two threads of execution for the game engine itself - not the four or six threads that the Xbox 360 was designed for.

Even when games become more aggressive with their multithreading, targeting 2 - 4 threads, most of the work will still be done in a single thread. It won't be until the next step in multithreaded architectures where that single thread gets broken down even further, and by that time we'll be talking about Xbox 720 and PlayStation 4. In the end, the more multithreaded nature of these new console CPUs doesn't help paint much of a brighter performance picture - multithreaded or not, game developers are not pleased with the performance of these CPUs.

What about all those Flops?

The one statement that we heard over and over again was that Microsoft was sold on the peak theoretical performance of the Xenon CPU. Ever since the announcement of the Xbox 360 and PS3 hardware, people have been set on comparing Microsoft's figure of 1 trillion floating point operations per second to Sony's figure of 2 trillion floating point operations per second (TFLOPs). Any AnandTech reader should know for a fact that these numbers are meaningless, but just in case you need some reasoning for why, let's look at the facts.

First and foremost, a floating point operation can be anything: adding two floating point numbers together, performing a dot product on two vectors of floating point numbers, or even just calculating the complement of a floating point number. Anything that is executed on an FPU is fair game to be called a floating point operation.

Secondly, both floating point power numbers refer to the whole system, CPU and GPU. Obviously a GPU's floating point processing power doesn't mean anything if you're trying to run general purpose code on it and vice versa. As we've seen from the graphics market, characterizing GPU performance in terms of generic floating point operations per second is far from the full performance story.

Third, when a manufacturer is talking about peak floating point performance there are a few things that they aren't taking into account. Being able to process billions of operations per second depends on actually being able to have that many floating point operations to work on. That means that you have to have enough bandwidth to keep the FPUs fed, no mispredicted branches, no cache misses and the right structure of code to make sure that all of the FPUs can be fed at all times so they can execute at their peak rates. We already know that's not the case as game developers have already told us that the Xenon CPU isn't even in the same realm of performance as the Pentium 4 or Athlon 64. Not to mention that the requirements for hitting peak theoretical performance are always ridiculous; caches are only so big and thus there will come a time where a request to main memory is needed, and you can expect that request to be fulfilled in a few hundred clock cycles, where no floating point operations will be happening at all.

So while there may be some extreme cases where the Xenon CPU can hit its peak performance, it sure isn't happening in any real world code.

The Cell processor is no different; given that its PPE is identical to one of the PowerPC cores in Xenon, it must derive its floating point performance superiority from its array of SPEs. So what's the issue with the 218 GFLOPs number (2 TFLOPs for the whole system)? Well, from what we've heard, game developers are finding that they can't use the SPEs for a lot of tasks. So in the end, it doesn't matter what the peak theoretical performance of Cell's SPE array is if those SPEs aren't being used all the time.

Another way to look at this comparison of flops is to look at integer add latencies on the Pentium 4 vs. the Athlon 64. The Pentium 4 has two double pumped ALUs, each capable of performing two add operations per clock, that's a total of 4 add operations per clock; so we could say that a 3.8GHz Pentium 4 can perform 15.2 billion operations per second. The Athlon 64 has three ALUs each capable of executing an add every clock; so a 2.8GHz Athlon 64 can perform 8.4 billion operations per second. By this silly console marketing logic, the Pentium 4 would be almost twice as fast as the Athlon 64, and a multi-core Pentium 4 would be faster than a multi-core Athlon 64. Any AnandTech reader should know that's hardly the case. No code is composed entirely of add instructions, and even if it were, eventually the Pentium 4 and Athlon 64 will have to go out to main memory for data, and when they do, the Athlon 64 has a much lower latency access to memory than the P4. In the end, despite what these horribly concocted numbers may lead you to believe, they say absolutely nothing about performance. The exact same situation exists with the CPUs of the next-generation consoles; don't fall for it.

Why did Sony/MS do it?

For Sony, it doesn't take much to see that the Cell processor is eerily similar to the Emotion Engine in the PlayStation 2, at least conceptually. Sony clearly has an idea of what direction they would like to go in, and it doesn't happen to be one that's aligned with much of the rest of the industry. Sony's past successes have really come, not because of the hardware, but because of the developers and their PSX/PS2 exclusive titles. A single hot title can ship millions of consoles, and by our count, Sony has had many more of those than Microsoft had with the first Xbox.

Sony shipped around 4 times as many PlayStation 2 consoles as Microsoft did Xboxes; regardless of the hardware platform, a game developer won't turn down working with the PS2 - the install base is just that attractive. So for Sony, the Cell processor may be strange and even undesirable to game developers, but the developers will come regardless.

The real surprise was Microsoft; with the first Xbox, Microsoft listened very closely to the wants and desires of game developers. This time around, despite what has been said publicly, the Xbox 360's CPU architecture wasn't what game developers had asked for.

They wanted a multi-core CPU, but not such a significant step back in single threaded performance. When AMD and Intel moved to multi-core designs, they did so at the expense of a few hundred MHz in clock speed, not by taking a step back in architecture.

We suspect that a big part of Microsoft's decision to go with the Xenon core was its extremely small size. A smaller die means lower system costs, and if Microsoft indeed launches the Xbox 360 at $299, the Xenon CPU will be a big reason why that price was possible.

Another contributing factor may be the fact that Microsoft wanted to own the IP of the silicon that went into the Xbox 360. We seriously doubt that either AMD or Intel would be willing to grant them the right to make Pentium 4 or Athlon 64 CPUs, so it may have been that IBM was the only partner willing to work with Microsoft's terms and only with this one specific core.

Regardless of the reasoning, not a single developer we've spoken to thinks that it was the right decision.

The Saving Grace: The GPUs

Although both manufacturers royally screwed up their CPUs, all developers have agreed that they are quite pleased with the GPU power of the next-generation consoles.

First, let's talk about NVIDIA's RSX in the PlayStation 3. We discussed the possibility of RSX offloading vertex processing onto the Cell processor, but more and more it seems that isn't the case. It looks like the RSX will basically be a 90nm G70 with Turbo Cache running at 550MHz, and the performance will be quite good.

One option we didn't discuss in the last article was that the G70 GPU may already feature a number of disabled shader pipes to improve yields. The move to 90nm may allow those pipes to be enabled, allowing for another scenario in which the RSX offers higher performance at the same transistor count as the present-day G70. Sony may be hesitant to reveal the actual number of pixel and vertex pipes in the RSX because, honestly, they won't know what their final yields will be until a few months before mass production.

Despite strong performance and support for 1080p, a large number of developers are targeting 720p for their PS3 titles and won't support 1080p. Those that are simply porting current-generation games over will have no problems running at 1080p, but anyone working on a truly next-generation title won't have the fill rate necessary to render at 1080p.

Another interesting point is that despite its lack of "free 4X AA" like the Xbox 360, in some cases it won't matter. Titles that use longer pixel shader programs end up being bound by pixel shader performance rather than memory bandwidth, so the performance difference between no AA and 2X/4X AA may end up being quite small. Not all titles will push the RSX to the limits however, and those titles will definitely see a performance drop with AA enabled. In the end, whether the RSX's lack of embedded DRAM matters will be entirely dependent on the game engine being developed for the platform. Games that make more extensive use of long pixel shaders will see less of an impact with AA enabled than those that are more texture bound. Game developers are all over the map on this one, so it wouldn't be fair to characterize all of the games as falling into one category or another.

ATI's Xenos GPU is also looking pretty good and most are expecting performance to be very similar to the RSX, but real world support for this won't be ready for another couple of months. Developers have just recently received more final Xbox 360 hardware, and gauging performance of the actual Xenos GPU compared to the R420 based solutions in the G5 development kits will take some time. Since the original dev kits offered significantly lower performance, developers will need a bit of time to figure out what realistic limits the Xenos GPU will have.

Final Words

Just because these CPUs and GPUs are in a console doesn't mean that we should throw away years of knowledge from the PC industry - performance doesn't come out of thin air, and peak performance is almost never achieved. Clever marketing, however, will always try to fool the consumer.

And that's what we have here today, with the Xbox 360 and PlayStation 3. Both consoles are marketed to be much more powerful than they actually are, and from talking to numerous game developers it seems that the real world performance of these platforms isn't anywhere near what it was supposed to be.

It looks like significant advancements in game physics won't happen on consoles for another 4 or 5 years, although it may happen with PC games much before that.

It's not all bad news however; the good news is that both GPUs are quite possibly the most promising part of the new consoles. With the performance that we have seen from NVIDIA's G70, we have very high expectations for the 360 and PS3. The ability to finally run at HD resolutions in all games will bring a much needed element to console gaming.

And let's not forget all of the other improvements to these next-generation game consoles. The CPUs, despite being relatively lackluster, will still be faster than their predecessors and increased system memory will give developers more breathing room. Then there are other improvements such as wireless controllers, better online play and updated game engines that will contribute to an overall better gaming experience.

In the end, performance could be better; the consoles aren't what they could have been had the powers that be made some different decisions. While they will bring better quality games to market and will be better than their predecessors, it doesn't look like they will be the end of PC gaming any more than the Xbox and PS2 were when they were launched. The two markets will continue to coexist, with consoles being much easier to deal with, and PCs offering some performance-derived advantages.

With much more powerful CPUs and, in the near future, more powerful GPUs, the PC paired with the right developers should be able to bring about that revolution in game physics and graphics we've been hoping for. Consoles will help accelerate the transition to multithreaded gaming, but it looks like it will take PC developers to bring about real change in things like game physics, AI and other non-visual elements of gaming.
Old 06-30-05, 08:59 AM
  #3  
DVD Talk Platinum Edition
 
Join Date: Feb 2001
Posts: 3,393
Likes: 0
Received 1 Like on 1 Post
Any link to the cliff's notes version?
Old 06-30-05, 09:03 AM
  #4  
2017 TOTY Winner
 
Save Ferris's Avatar
 
Join Date: Nov 2001
Posts: 13,579
Likes: 0
Received 2 Likes on 2 Posts
Hmm. For what it's worth, I can share my experience with PR and marketing.

My sister was the marketing director for a popular company that partners with eBay. She pulls strings all the time to get her company written up in all kinds of websites/magazines/newspapers etc. when they have a new product or feature.

I came across some criticism of one of her latest projects in an article at some popular website (I can't remember which site) and I called her to tell her and kind of push her buttons. Well, in a few hours the article was pulled. lol

She knew lots of people and had enough influence to take care of it. I was pretty amazed lol.
Old 06-30-05, 09:09 AM
  #5  
DVD Talk Legend
 
Join Date: Oct 1999
Location: Plano, TX
Posts: 23,225
Likes: 0
Received 1 Like on 1 Post
Originally Posted by TheMadMonk
Any link to the cliff's notes version?
"A bunch of technical speculation that has little to do with video games."
Old 06-30-05, 09:13 AM
  #6  
DVD Talk Hall of Fame
 
Join Date: Aug 2000
Location: Wichita, KS
Posts: 9,127
Likes: 0
Received 1 Like on 1 Post
Reading hurts my head. Damn that was long. Why was the article removed? Just because Sony and Microsoft are hyping these systems as being more powerful than they really are? There are no laws against marketing strategies being broken here that I know of.
Old 06-30-05, 09:19 AM
  #7  
2017 TOTY Winner
 
Save Ferris's Avatar
 
Join Date: Nov 2001
Posts: 13,579
Likes: 0
Received 2 Likes on 2 Posts
PR and ad campaign people don't like criticism when launching a new product. They can raise hell and say it's not 'fair' because there's no hard data out yet. A powerful enough PR person has enough weight to get the articles pulled.
Old 06-30-05, 09:21 AM
  #8  
DVD Talk Legend
 
Join Date: Oct 1999
Location: Plano, TX
Posts: 23,225
Likes: 0
Received 1 Like on 1 Post
Originally Posted by edstein
Reading hurts my head. Damn that was long. Why was the article removed? Just because Sony and Microsoft are hyping these systems as being more powerful than they really are? There are no laws against marketing strategies being broken here that I know of.
I don't know; when you claim a little computer can recreate Jesus, build its own sub-galaxies and render Spider-Man 2 in real-time, you're putting yourself behind the eight ball. Better to clean them up now so that in five years when people are wondering where their Spider-Man 2 graphics are somebody can Google it, come up empty and claim they never said it.
Old 06-30-05, 09:28 AM
  #9  
2017 TOTY Winner
 
Save Ferris's Avatar
 
Join Date: Nov 2001
Posts: 13,579
Likes: 0
Received 2 Likes on 2 Posts
Bait and switch is the oldest trick in the book. Thankfully, with today's active community, it's harder to bury these secrets on the web!
Old 06-30-05, 09:42 AM
  #10  
DVD Talk Special Edition
 
Join Date: Jun 2001
Location: Virginia
Posts: 1,114
Likes: 0
Received 0 Likes on 0 Posts
After reading those articles, I think I will stay with my new PC for gaming. It sounds like what I have is better than the Xbox 360, and that isn't even released for another 6+ months.

I can understand why these were pulled. Anand basically states that the new consoles are not as "next-generation" as Sony & MS would lead you to believe. Both companies might be a tad bit pissed about that.
Old 06-30-05, 09:45 AM
  #11  
Video Game Talk Editor
 
Flay's Avatar
 
Join Date: Jul 1999
Location: Westchester, Los Angeles
Posts: 4,097
Likes: 0
Received 0 Likes on 0 Posts
Originally Posted by TheMadMonk
Any link to the cliff's notes version?
Xbox 360 and PS3 CPU's = Bad
Old 06-30-05, 09:57 AM
  #12  
2017 TOTY Winner
 
Save Ferris's Avatar
 
Join Date: Nov 2001
Posts: 13,579
Likes: 0
Received 2 Likes on 2 Posts
And of course the articles could be PR from PC companies. Everyone wants us to buy one thing and not buy the other. I, for one, am going to buy every little techno gadget and gizmo that gets my pulse racing.
Old 06-30-05, 10:44 AM
  #13  
Banned
Thread Starter
 
Join Date: Jan 2002
Location: Blu-Ray: We Don't Need No Stinkin' Petition
Posts: 6,677
Likes: 0
Received 0 Likes on 0 Posts
Cliff Notes:

Xbox 360 = PS3
Xbox 360 < PC
PS3 < PC

Is there something else I missed?
Old 06-30-05, 11:11 AM
  #14  
2017 TOTY Winner
 
Save Ferris's Avatar
 
Join Date: Nov 2001
Posts: 13,579
Likes: 0
Received 2 Likes on 2 Posts
Isn't it a given that PCs will always blow away consoles? Especially since there aren't 'generations' of PCs; they are always evolving.
Old 06-30-05, 11:46 AM
  #15  
DVD Talk Platinum Edition
 
Join Date: Feb 2001
Posts: 3,189
Likes: 0
Received 0 Likes on 0 Posts
Those articles are definitely fair in their early analysis, no doubt there. Fact is, the specs as we have them aren't that great. There are still things we don't know, however. No doubt people will still be impressed, especially if they are console-only gamers. Don't forget, a console is not just about graphics quality; it's about ease of use, something that all the consoles will always hold over PCs for the mass market.

About the article, I think it's bullshit that it can get pulled and have legal action brought against them for "defamatory remarks without probable evidence", yet companies can say whatever they want to hype a product and not deliver it. BULLSHIT.


Regarding "burying" news, even in this internet-connected world it is very easy for a company to make information "disappear". Any web site that reports news based on another news outlet will fold under any sign of pressure from a company, because they know they are just repeating "what they heard". Usually there is only one "true source" that reported the news in the first place. If at a later time that source is unable to produce physical evidence (i.e. a tape, recording, handwritten memo, etc.) then they will fold their story as well.

Example: If Nintendo's Reggie tells IGN's Matt that the Revolution will render Shrek in real time, then unless Matt records that conversation, Nintendo could generate false hype early and later suppress that info by simply sending out a few cease and desist letters to the major sites. Then you get the wonderful fanboys 4 years after the fact saying "Show me a real source" even though everyone "knows" what Reggie said. And don't get me started about Big Brother... (check out earth.google.com for 20-year-old satellite technology pictures.)
Old 06-30-05, 12:22 PM
  #16  
DVD Talk Gold Edition
 
Join Date: Aug 2001
Posts: 2,394
Likes: 0
Received 1 Like on 1 Post
I just checked Anandtech, it's still on their site, so what is this mention of it being pulled?
Old 06-30-05, 12:27 PM
  #17  
Banned
Thread Starter
 
Join Date: Jan 2002
Location: Blu-Ray: We Don't Need No Stinkin' Petition
Posts: 6,677
Likes: 0
Received 0 Likes on 0 Posts
Originally Posted by Fandango
I just checked Anandtech, it's still on their site, so what is this mention of it being pulled?
Looks like they put the first one back. I wonder if they had to edit it any. But the second one still isn't there - the one on CPUs.
Old 06-30-05, 12:35 PM
  #18  
DVD Talk Special Edition
 
Join Date: Jul 2002
Location: The Great Basin
Posts: 1,565
Likes: 0
Received 0 Likes on 0 Posts
Originally Posted by TheMadMonk
Any link to the cliff's notes version?
How about a Cliffs notes electrical engineering degree so we can understand it all?
Old 06-30-05, 01:54 PM
  #19  
DVD Talk Hall of Fame
 
tanman's Avatar
 
Join Date: Jan 2001
Location: Gator Nation
Posts: 9,822
Received 913 Likes on 637 Posts
I still think that the end result of the next gen will be unimpressive graphics-wise. They will certainly be better, but it definitely will not be as big a jump as the last gen. Even GamePro (yeah, I know) had an article comparing the Xbox and the Xbox 360 graphics-wise, and they stated how it will be much better but had to zoom in on the pictures in order to show the jaggies. It seems like all the graphical prowess will go toward smoothing out the graphics and not really revolutionizing them. Look at Halo 2; it is an incredible-looking game that could have easily been thrown in with the other games being shown and still fit in.

I think the next gen will be more about the peripheral bells and whistles as each generation's individual consoles become more and more different from each other in terms of capabilities and features.

I could be wrong though; it seems as though the general public cares about horsepower just for horsepower's sake.
Even the games themselves seem to be taking a backseat.
Old 06-30-05, 05:43 PM
  #20  
DVD Talk Gold Edition
 
Join Date: Nov 1999
Location: Osaka, Japan
Posts: 2,493
Likes: 0
Received 0 Likes on 0 Posts
I would still lean towards the possibility that the AnandTech articles are being a little overly harsh on the 360 and PS3 configurations. The current speculation is that much of their developer feedback is very PC-oriented. These are developers that are used to programming against a single-core CPU and a single very powerful GPU, so it's not surprising they are negative on the new console hardware (of course, they will have to learn to deal with at least 2 cores even for PC games in the next 2 years or so). Meanwhile, developers who are experienced with leveraging hardware more like the PS2 are very positive on the new consoles.

I guess when it comes down to it, multiple cores are the future (either the 360/PC path or the PS3/Cell path); maybe they have just come a little sooner than most would have expected.
Old 06-30-05, 08:11 PM
  #21  
DVD Talk Platinum Edition
 
Join Date: Mar 2001
Location: MD
Posts: 3,137
Likes: 0
Received 0 Likes on 0 Posts
Originally Posted by Save Ferris
Bait and switch is the oldest trick in the book. thankfully with todays active community, its harder to bury these secrets on the web!
No, it's a good tactic. I think you'd be surprised how many Joe Six-Packs believe tech specs and features that will eventually be downgraded or missing in the final product.
Old 07-01-05, 07:56 PM
  #22  
DVD Talk Gold Edition
 
Join Date: Feb 1999
Location: HB, CA
Posts: 2,600
Likes: 0
Received 0 Likes on 0 Posts
Originally Posted by Chris_D
...The current speculation is that much of their developer feedback is very pc oriented...
That was my thought as well. It'll take a lot of these guys a while to come around to the idea that they'll need to re-learn a lot about how to design and optimize their software.

A lot of developers bitched when the PS2 was launched. The EE had a MIPS core with a pair of vector units, and programmers at the time whined about how difficult it would be to fully exploit the hardware. Evidently, at least some developers have figured it out.

Keep in mind that the PS2 was launched in Japan almost 20 months before the Xbox launch and that the Xbox probably had the higher initial subsidy of the two at the time of their respective launches. I think that the degree to which the PS2 has been able to "keep up" with the Xbox in terms of game performance is a testament to the potential that a multi-core design can have.
Old 07-01-05, 08:15 PM
  #23  
DVD Talk Platinum Edition
 
Join Date: Feb 2001
Posts: 3,189
Likes: 0
Received 0 Likes on 0 Posts
Originally Posted by belboz
Keep in mind that the PS2 was launched in Japan almost 20 months before the XBox launch and that the XBox probably had the higher initial subsidy of the two at the time of their respective launches. I think that the degree to which the PS2 has been able to "keep up" with the XBox in terms of game performance is a testament to the potential that a multi-core design can have.
Doesn't the PS2 have a single core? The last system I recall having multiple cores was the Saturn. The PS2 was a bitch to program because of its architecture, not its core count.

Last edited by jeffdsmith; 07-01-05 at 08:19 PM.
Old 07-02-05, 07:56 AM
  #24  
DVD Talk Gold Edition
 
Join Date: Nov 1999
Location: Osaka, Japan
Posts: 2,493
Likes: 0
Received 0 Likes on 0 Posts
Technically the PS2 is single-core, but it does have 2 additional vector units, which developers have to use to get the best out of the machine. The PS2 core (the Emotion Engine) is probably relatively more capable than the Cell CPU, factoring in that it is a generation older, but the Cell does have 5 more additional units, and similar principles would apply. I suspect many PS2 developers would be quite comfortable with Cell (and even the 360), although it will probably be a little harder to fully leverage all 7 of the Cell vector units.

The Saturn is comparatively closer to the 360 because both of its cores (or CPUs?) are identical, like the 360, which has 3 identical cores.
Old 07-02-05, 09:06 AM
  #25  
Banned
Thread Starter
 
Join Date: Jan 2002
Location: Blu-Ray: We Don't Need No Stinkin' Petition
Posts: 6,677
Likes: 0
Received 0 Likes on 0 Posts
Originally Posted by Chris_D
The saturn is comparitively closer to the 360 because both cores (or cpus?) are identical like the 360 which has 3 identical cores.
I believe the Saturn's two CPUs could not operate at the same time; something about how they couldn't both access memory at the same time.
