Well... I've found what may be an interview with someone who has the new SDK 2.1 for the Wii:
http://wiinside.blogspot.com/2007/04/inside-wii.html
It's a fairly interesting technical article where the numbers seem very plausible, and they could reveal more than is currently known about the Wii's potential...
What doesn't seem as credible (accepting the numbers, of course) are the comparisons it makes, because it compares whatever it wants, however it wants (to favor the Wii). But setting aside the comparisons and the "X times more powerful" claims, the numbers themselves are quite interesting...
For example, it compares the speed of the Wii running Linux against the Xbox 360, when the 360's CPU runs in-order and below its peak performance. That makes the Wii appear equal to one of the Xbox 360's three cores, when it would really be somewhat slower (I don't know by how much, but somewhat). We already know that by using "software stepping" the Wii can squeeze extra performance out of its IBM-enhanced G3... but remember that the 360 uses a G5-class core, which is superior to that G3 no matter how much IBM has improved it.
As for the GPU, trusting the figures it gives, the article places it at twice as powerful as the Xbox 1's and four times as powerful as the GameCube's. But even at a glance you can see it wouldn't reach four times the GC. It has roughly double most of the specs (as listed), which, well exploited, would double or at most triple the GC in benchmarks, but no more... And it would be more powerful than the Xbox 1, but wouldn't reach double.
Even so, if the data really was taken from SDK 2.1, the data itself could be more interesting than the biased opinion around it.
I think it has to be taken with a grain of salt, above all not believing the comparisons with the 360, the GC, or the Xbox 1... but in the absence of other information, it's at least worth keeping in mind...
It says interesting things, like that when running GC software the console is underclocked...
Here is the interview itself:
What benchmark tests have you seen exactly?
PPC benchmarks for the Wii CPU under Linux compared to the X360 and PS3 CPUs... In fact I find it kinda funny that the Wii CPU is almost equal to a single core of the X360... Also the Wii GPU has about 1/3 the cache of the X360... See a trend?!? 1/3 seems to be a magic number when comparing Wii and X360... Don't take this to mean the X360 is 3 times the power of the Wii, as it's not... The X360 is roughly 2 - 2.5 times the speed of the Wii overall... The PS3 is roughly the same as the X360 benchmark-wise, however it gets insane scores at floating-point math due to the SPE units...
Overall, to compare the X360 CPU and PS3 CPU to a computer, I'd say roughly a 3.2GHz Intel P4 (non-dual-core)... The Wii is somewhere in the 2.4 - 2.6GHz Intel P4 (non-dual-core) range... GPU benchmarks are being worked on, but as so little is really known about the 3 GPUs in question, I doubt the numbers will be accurate enough to be useful...
Basically the PS3 is the most powerful, but only if you program for the SPEs, and that is rather hard... The X360 is second, but once again you need to program for a 3-core CPU (granted, it's easier than Sony's)... Finally the Wii: it's a simple single-core design with high memory bandwidth and just enough RAM to keep things running smoothly... Overall a great machine, but just not what some people wanted...
I feel the Wii is what is known as a "lead-in product"... That means it breaks new ground in a radical fashion but can still fall back to normal if needed (ie: the WiiMote is not required for all games)... This is why there was no HUGE leap in the other hardware... If the Wii had failed, the loss due to a hardware-heavy machine could have been very bad for Nintendo... However, now that we can see the market is accepting the WiiMote, the next Nintendo console will include a revised version of it and also the beefed-up CPU and GPU the others have wanted... The DS is a lead-in product too; we have yet to see what Nintendo has learned from it though...
Of course not all products of this type do well (e.g.: Virtual Boy)...
Tell me about the dev kit…
Basically it had plenty of hardware info, like the memory layouts for the GPU... We already knew that the 3 megs was split into 2 sections (frame buffer 2 megs and texture cache 1 meg)... The GPU and CPU can copy its contents to the 24 megs of RAM and vice versa, allowing for extended graphics RAM or texture caching (and people wondered why 24 megs of 1T-SRAM was on-chip)... Wii dev kits have 128 megs instead of 64 megs of GDDR3 as main RAM...
Interesting note: in order to make certain games run smoothly, final-revision GC dev kits ran at a slower MHz than the retail GC... 350~ compared to 485MHz...
Even more interesting: the Mario Galaxy "live demo" that was shown at E3 was run on a modified GameCube, NOT Revolution hardware at all... Expect something BIG from this title... (Editor's note: this interview was done before the newer Galaxy trailer was shown.)
The "Re-Birth" demo is given here as example code, and it is ONLY 32 megs in size for the REAL-TIME rendered version...
So why do Wii dev kits have 128MB instead of 64?
It's breathing room... It gives them room to write some pretty sloppy code and test it before trimming it down to fit the 64 megs of the retail unit... Uncompressed textures and sound could also be used in testing... The GC kits were very restrictive, as they had less power than the retail unit; Nintendo fixed that this time around...
I read somewhere that the 750CL could go as high as 1.1GHz... so what is Broadway clocked at?
Software-clocked at 729 MHz via the BIOS.
So what can you tell me about the TEV unit in the Wii? Does it have any additional pipelines? Vertex shader support?
TEV is basically the same... The GPU has twice the pipelines now, at 8... TEV makes up for this by allowing 16 texturing stages per pipeline... Vertex shader routines are handled by TEV just like pixel shader routines are...
Do you know for a fact that it (Hollywood) now has 8 pipelines, or are you going by the #@#%nintendo interview?
FACT, it's listed in the SDK.
Any other Hollywood GPU tidbits you can tell me?
GC GPU to Wii GPU
162 MHz clock to.... 243 MHz clock (50% increase in clock speed)
3 megs embedded RAM to.... same (Wii able to use A-RAM as additional GPU/CPU RAM)
18-bit color used to avoid frame drops to.... 32-bit color used at all times
4 pixel pipelines to.... 8 pixel pipelines
4 texture pipes (16 stages each) to.... 8 texture pipelines (16 stages each)
Resolution is restricted in the SDK, not in the hardware...
Those are the basics...
You can see where this would be 3-5 times more powerful than the normal GC GPU...
Cool. So how fast is the 64MB of GDDR3 in the Wii, and has the 24MB of 1T-SRAM seen an increase in speed as well?
That's interesting in itself... Some GameCube programs rely on the clocks being at a certain ratio... The GameCube CPU is 3 times the clock of the GPU... The GPU clock is equal to the front-side bus... Memory is clocked at 2 times the GPU speed...
GameCube: 486 CPU / 324 RAM / 162 GPU
The Wii would be as follows... 729 CPU / 486 RAM / 243 GPU (A-RAM would be at 243 also)
As you can tell, it's a formula that they use, and it seems to work well.
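The ratio described above can be written out as a tiny sketch (the MHz values are the interview's; the helper name is mine, purely for illustration):

```python
# Derive the full clock layout from the GPU clock alone, using the ratio
# the interview describes: CPU = 3x GPU clock, RAM = 2x GPU clock.
def clocks_from_gpu(gpu_mhz):
    return {"cpu": gpu_mhz * 3, "ram": gpu_mhz * 2, "gpu": gpu_mhz}

print(clocks_from_gpu(162))  # GameCube: {'cpu': 486, 'ram': 324, 'gpu': 162}
print(clocks_from_gpu(243))  # Wii:      {'cpu': 729, 'ram': 486, 'gpu': 243}
```

Plugging in either console's GPU clock reproduces the other two figures, which is why software that assumes the GameCube ratios keeps working on the Wii.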
So does the file disclose any of the GPU's performance numbers? GFLOPS? Polygons per second? Fill rate? Stuff like that?
Only in a roundabout way... It gives you test code to run and shows some optimizations that can be added... The code is run via a debugger cable or a wireless network (wireless requires a special disc in the Wii)...
But you can do the math. 8 pipelines at 243MHz... that's 1944 megapixels per second. That's a lot higher than the GC could handle and around twice what the Xbox was dishing out.
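That arithmetic checks out; here is the same calculation spelled out (pipeline counts and clocks for the Wii and GC are the interview's; the Xbox figure of 4 pipelines at 233 MHz is my own assumption, added only for the comparison):

```python
# Theoretical pixel fill rate = pixel pipelines x GPU clock (MHz),
# giving megapixels per second.
def fillrate_mpixels(pipelines, clock_mhz):
    return pipelines * clock_mhz

wii = fillrate_mpixels(8, 243)    # 1944 Mpixels/s (interview's figures)
gc = fillrate_mpixels(4, 162)     # 648 Mpixels/s  (interview's figures)
xbox = fillrate_mpixels(4, 233)   # 932 Mpixels/s  (assumed Xbox specs)

print(wii, wii / gc, wii / xbox)  # Wii is 3.0x the GC, ~2.1x the Xbox
```

So "around twice the Xbox" holds under those assumptions, and the Wii-vs-GC gap here is exactly the 3x you'd expect from double the pipelines at 1.5x the clock.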
Hmmm... this must have been what Julian from Factor 5 meant when he said the Wii had an insane fillrate... well, compared to the GameCube.
Exactly.