NEurope
Ashley

Wii Hardware Discussion


The 729 MHz one seems more realistic and likely to me than the 1.1 GHz one; I really hope the 650 isn't true, though.

It's a blog. :shakehead

 

Of course it's a blog; he wants to remain anonymous. The fact that something is a blog doesn't automatically mean that it is fake. It's fine to be reserved and doubt it, but can you actually tell that everything he's saying is BS? I don't know...


I'm gonna go ahead and call that blog fake. The downloading of extra content into TP just strikes me as dead wrong; I just get the feeling that Shiggy would not want to ruin a work of art by tacking a drawing to the side of it. Also, the guy claims that's a major breakthrough... PC games have had patches like that for YEARS!!!! And Nintendo only has 512 MB of storage space that they can count on. His one bit of "proof" that he works for Nintendo is the file header thing, but... as he says, you can just open a hex editor and see it yourself >_> (mind you, I have no idea what he's actually viewing, but my guess is a GC disc stuck into a computer).


Zelda has online 'functions', but no one knows what they are. Downloadable content won't really work with Zelda unless it's different clothing or some useless weapon.

The fact that something is a blog doesn't automatically mean that it is fake.
Yes it is. It's not as though Nintendo employees are frustrated about not being able to tell the fans anything. Isn't it strange how blogs always tell a sunny, blooming story about how Nintendo is going to rule the world with Wii, written by Japanese people who type fluent English?

 

Blogs can never be trusted on classified information.

 

Someone on the GameSpot Wii forum claims these are the specs of the final devkit:

 

CPU: PowerPC 750FX 729 MHz

GPU: ATI 243 MHz 3 MB DRAM

RAM: 88 MB 1T-SRAM

Storage: 512 MB built-in flash memory + 1 SD memory card slot

Media: DVD-ROM

Network: IEEE 802.11b/g

Those are the IGN specs. Though they might be true, there are some small illogical things in them. Everything is exactly the same as on the GameCube, just with higher clock frequencies. The memory is split into chunks of 24 and 64 MB, which doesn't make sense for production at all, but does make sense for an easy Cube devkit update.

 

I doubt Mario Galaxy runs on that setup too, especially as IGN mentioned there were 'no additional shaders'.


I'd like to believe this blog. I don't, but I would like to. I'm gonna bookmark it so I can check up on it daily, though, just to see what he says.


Does it really matter to you what's inside? Does it matter to you whether you're playing on a 3 GHz machine rather than a 3 MHz one? Why do you care, if you're not developing for it or reverse-engineering it?

The only hardware question gamers should care about is whether the controller is ergonomic or not! That's your job as gamers! Just wait for it to come out.


Those are the IGN specs. Though they might be true, there are some small illogical things in them. Everything is exactly the same as on the GameCube, just with higher clock frequencies. The memory is split into chunks of 24 and 64 MB, which doesn't make sense for production at all, but does make sense for an easy Cube devkit update.

 

What you find "illogical" is what makes it seem most likely to me. Nintendo have said time and time again that the hardware is going to be very, very similar to the GameCube's; there are tons of quotes on that, and when people compare Wii to a "GameCube 1.5", it's obviously close to the truth.

 

Nobody else seems to be picking up on this, but I think the main point of keeping the architecture so similar is that they need to for backwards compatibility to work the way they're planning. The Wii doesn't have anywhere close to enough power to directly emulate the GC, and I don't think they want any extra space taken up by additional hardware for GC backwards compatibility. So they made sure their hardware partners basically took the GC hardware and spent years pushing it as far as they physically could: extra power, everything else unchanged, and GameCube backwards compatibility extremely easy.


What he's saying is that splitting the RAM is a very uncommon thing that has no benefits (it slows access time a bit). So with it split, it's safe to assume these are not production models but rather chips on breadboards attached to a Cube.

What you find "illogical" is what makes it seem most likely to me. Nintendo have said time and time again that the hardware is going to be very, very similar to the GameCube's; there are tons of quotes on that, and when people compare Wii to a "GameCube 1.5", it's obviously close to the truth.

 

Nobody else seems to be picking up on this, but I think the main point of keeping the architecture so similar is that they need to for backwards compatibility to work the way they're planning. The Wii doesn't have anywhere close to enough power to directly emulate the GC, and I don't think they want any extra space taken up by additional hardware for GC backwards compatibility. So they made sure their hardware partners basically took the GC hardware and spent years pushing it as far as they physically could: extra power, everything else unchanged, and GameCube backwards compatibility extremely easy.

 

That is not quite correct: the hardware can change drastically and the code still remains functional. Look at the x86 architecture; most software from back in the 286 era still runs on 586 (686) machines.

 

The RAM in the GameCube was already split: main memory and auxiliary memory (for the sound chip, if I remember correctly).


Oh, how I wish it could be true, Dante!!

 

[enters dream mode]

 

aaaaahhh

 

[/dream]

 

 

 

EDIT:

 

Check out the rest of this guy's posts and read the threads!!

 

He is taking the piss!!

 

Pre-empting an IGN article and everything!!

 

Clicky


@ James McGeachie: It's pretty clear IGN was talking about a modified GameCube devkit. Nintendo overclocked the Cube devkit and removed the memory bottleneck by dumping in an extra chunk so developers could make bigger scale games before the final Wii hardware came out. There's simply no way that IBM and ATI have spent three years and millions of dollars on die shrinking/overclocking the same hardware they made five years ago. There's also no way Super Mario Galaxy and Smash Brawl are running on that hardware. The ATI comments in the other thread make me think that it's actually quite a bit more powerful than what IGN is saying.


From CUBED3D

 

Speaking to GameDaily BIZ, ATI's John Swinimer, Senior Public Relations Manager of Consumer Products, has gone on record to state that Wii will be able to produce far better visuals than what we have seen so far. "I think what you saw [on Wii, at E3] was just the tip of the iceberg of what the Hollywood chip can bring to the Nintendo Wii," he said.

 

When pushed for more specifications, the only comment offered was: "I'm really not here to talk about the design specs... other than the fact that ATI worked closely with Nintendo. The team that worked on this chip also worked on the Flipper chip that was in GameCube, and they've been working with Nintendo for a very long time so there's a great chemistry with the two teams working together."

 

"I really don't think that it's about the [specifications]; I think it's about the innovation that it brings to the table—the motion-sensing, the always-on capability, which is really cool too—the fact that the chip is powerful enough and responsive enough to be there at a moment's notice, and I think that's pretty cool for the average gamer."

 

A comparison to Xbox 360 was also inquired about, but Swinimer deflected the question. "They're different chips for different platforms and different uses. I don't think it's a fair comparison to put them on a chart [to analyze]. That's not what it's all about... I think if you focus on the capabilities that the chip will have for the average consumer, with the amazement and wow factor, I think that's the value that we bring."

 

Of course, all of this correlates with Nintendo admitting that the E3 demonstrations were not running on final Wii hardware, so based on these statements and that piece of information from Nintendo, look for the visuals of the system to improve before launch.

 

Stick with C3 for more as it comes...

@ James McGeachie: It's pretty clear IGN was talking about a modified GameCube devkit. Nintendo overclocked the Cube devkit and removed the memory bottleneck by dumping in an extra chunk so developers could make bigger scale games before the final Wii hardware came out. There's simply no way that IBM and ATI have spent three years and millions of dollars on die shrinking/overclocking the same hardware they made five years ago. There's also no way Super Mario Galaxy and Smash Brawl are running on that hardware. The ATI comments in the other thread make me think that it's actually quite a bit more powerful than what IGN is saying.

 

No, IGN said the 729 MHz ones were the more advanced devkits, the ones that came after the modified GameCubes (which were basically nothing but GameCubes with controllers attached).

 

http://www.ebgames.com/gs/wii/wii_signup.asp?cookie%5Ftest=1&

 

EB Games has put the 729 MHz figure up on their final system specs page. Obviously that's not final confirmation, as it's an online shopping site, but I can't help feeling even more assured that these are the real final specs now.

 

 

Anyway, what the heck are you talking about with Brawl? The models in it have almost exactly the same polycount as the GameCube ones (other than Link) and slightly better textures. Other than that, there's a bit more detail in the backgrounds, and it looks like some of the backgrounds were possibly displaying prerecorded video too. In its current state, I wouldn't be surprised if Brawl could be done on the GameCube with enough refining, never mind Wii.

 

Metroid Prime 3 and Galaxy on the other hand, no, I don't think either could be done on Gamecube, but I do on the other hand think they could be done on a system with the specs we've seen. Look at the kind of visuals the Xbox pumped out

 

http://image.com.com/gamespot/images/2005/209/928401_20050729_screen004.jpg

http://www.xboxgazette.com/i/pic_halo2_zh_01.jpg

http://image.com.com/gamespot/images/2003/e3/0512/pc_doom305120900_screen003.jpg

 

Galaxy definitely looks really good, but again, we're not talking major advancements; we're talking a lot of smaller ones that add up. Mario's model only looks slightly higher-poly than Sunshine's, and the areas don't seem to be much higher either. The main increases have been in the textures, which are a good bit higher-res, and in the lighting, which has some really neat effects. There also might be a tiny, tiny bit of bump mapping or something similar... that's pretty much it, though. Keep in mind that Galaxy mainly only has to render collections of small areas and objects, as opposed to massive areas filled with stuff, so it's not unrealistic at all for it to be done on this hardware.

The big final secret is set to be announced, and no doubt details about the console itself. So discuss it all in here.

 

I'd bet it's a built-in microphone in the controller. All signs are pointing to this: Wii karaoke, etc.

This fanboy's base of lies sure makes me laugh. :laughing: :laughing: :laughing:

 

I read that too and laughed at its bullshitness. Then I discovered this. It would be nice if that were true... Ah well, I'll go out and ride my motorcycle now, and hopefully stay out for the summer and come back indoors when Wii launches.

I'd bet it's a built-in microphone in the controller. All signs are pointing to this: Wii karaoke, etc.

There already is a mic built in.

Anyway, what the heck are you talking about with Brawl? The models in it have almost exactly the same polycount as the GameCube ones (other than Link) and slightly better textures. Other than that, there's a bit more detail in the backgrounds, and it looks like some of the backgrounds were possibly displaying prerecorded video too. In its current state, I wouldn't be surprised if Brawl could be done on the GameCube with enough refining, never mind Wii.

 

Metroid Prime 3 and Galaxy on the other hand, no, I don't think either could be done on Gamecube, but I do on the other hand think they could be done on a system with the specs we've seen. Look at the kind of visuals the Xbox pumped out

 

http://image.com.com/gamespot/images/2005/209/928401_20050729_screen004.jpg

http://www.xboxgazette.com/i/pic_halo2_zh_01.jpg

http://image.com.com/gamespot/images/2003/e3/0512/pc_doom305120900_screen003.jpg

 

Galaxy definitely looks really good, but again, we're not talking major advancements; we're talking a lot of smaller ones that add up. Mario's model only looks slightly higher-poly than Sunshine's, and the areas don't seem to be much higher either. The main increases have been in the textures, which are a good bit higher-res, and in the lighting, which has some really neat effects. There also might be a tiny, tiny bit of bump mapping or something similar... that's pretty much it, though. Keep in mind that Galaxy mainly only has to render collections of small areas and objects, as opposed to massive areas filled with stuff, so it's not unrealistic at all for it to be done on this hardware.

SSB Melee already has bump mapping in places:
Super Smash Bros. Melee is vastly improved over the N64 version. The models, now smooth and crisply textured, are recognizable from farther distances away and don't blend into the background. Everyone comes with a reflective lighting map and sometimes even bump-mapping or other texture effects.
Source: http://cube.ign.com/articles/166/166387p3.html

 

The Wii version even lacks floor shadows right now; it's an early build, all right.

 

Also, you're comparing the absolute best the Xbox did with a first-generation GC game. And none of those games ran at those resolutions... Doom 3, Halo 2 and Ninja Gaiden all ran at a fixed 480p, so images at 1024x768 are bullshit, honestly... We got your point about the GC being a crappy system compared to the Xbox already.

 

But bear in mind that those games ran at 30 frames per second... SSB ran at 60 and wasn't really trying to push anything.

 

Hell... the GC did this at launch:

 

21.jpg

 

And this is technically superior to any Xbox game ever made. It runs at 60 frames, too...

 

Stop downplaying the Cube. If you really want to complain, complain about how on-par they are (although I consider the Cube better)... We've talked about this countless times, with facts and quotes from other articles proving just this... If I recall correctly, you've even shown me these same Ninja Gaiden images in the past to prove your point, the very same point.

 

Allow me to say just this:

 

Do you think that Halo 2 could be on the Gamecube, since it uses so many vertex shaders and bump-mapping? >>
The vertex shaders are used for bumpmapping in Halo/Halo2.

 

Halo on GCN? Possible. Halo 2 is very questionable.

 

Why? It's obvious that Halo 2 has downgraded polygonal models, but bumpmapping is a serious resource killer. It would take some serious recaching of GCN's 3MB buffer to make this happen (along with help from texture layers) in order to keep the game bumpmapped on every surface.

 

Every poly can be recreated on GCN, though textures would obviously be smaller (40MB on Xbox for textures, plus HD compared to 16-40MB on GCN). If rewritten with smaller texture files, the GCN should be able to run Halo 2 at double the framerate looking at the specs.

 

The question is whether the Xbox could run Metroid Prime without serious load times, "checkpoints" and other resting points to catch up on all the geometry/texture work streamed from the disc.

 

Those "shitty" GCs with an extra 64 MB of 1T-SRAM could beat the Xbox to a pulp already... let alone with the souped-up CPU and GPU.

 

This whole processor thing is quite twisted, considering the Xbox and GameCube are two TOTALLY DIFFERENT architectures (a 32/64-bit hybrid, native PowerPC, compared to 32-bit Wintel). The GameCube, with this architecture, has a significantly shorter data pipeline than the Xbox's PIII setup (4-7 stages versus up to 14), meaning it can process information more than twice as fast per clock cycle. In fact, the GCN CPU (a PowerPC 750e IBM chip) is often said to be as fast as a 700 MHz machine while running at 400 MHz. So the GCN could be an 849 MHz machine compared to the Xbox's 733 MHz, performance-wise.
A 729 MHz clock is a big improvement over that... leagues above the Xbox's 733 MHz CPU...
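As a sanity check, the per-clock argument above can be sketched in a few lines. Assumptions to be clear: the 1.75x per-clock factor is the post's own claim (a 400 MHz PPC 750 performing like a 700 MHz P3-class chip), not a verified benchmark, and 485 MHz is the GameCube Gekko's stock clock.

```python
# Sketch of the per-clock scaling claim above (the post's figures, not
# verified benchmarks): a 400 MHz PPC 750 is said to perform like a
# 700 MHz P3-class CPU, i.e. a 1.75x per-clock factor.
PER_CLOCK_FACTOR = 700 / 400  # 1.75, per the post's claim

def p3_equivalent(ppc_mhz: float) -> float:
    """Scale a PPC 750 clock to its claimed P3-equivalent clock."""
    return ppc_mhz * PER_CLOCK_FACTOR

print(round(p3_equivalent(485)))  # GameCube's 485 MHz Gekko -> 849
print(round(p3_equivalent(729)))  # rumoured Wii clock -> 1276
```

Which is where the "849 MHz versus 733 MHz" figure comes from, and why 729 MHz would indeed land leagues above the Xbox by the same (claimed) metric.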
At a ratio of 6:1, the memory bus can pass 6 times more textures. That means the GPU's texture cache bus of 10.4 GB/s can pass 62.4 GB/s of 24-bit compressed textures, and the external bus of 2.6 GB/s can pass 15.6 GB/s. Now, does the Xbox cache not have texture compression? If so, that can make a huge difference in the comparison, since with a 6:1 compression ratio the cache can hold 6 times more data: 6 MB of data for the GC compared to 256 KB for the Xbox is a huge difference.

(...)

Memory Efficiency

There is no question that GC's main memory of 1T-SRAM is much more efficient than the Xbox's DDR SDRAM, as the latency of GC's 1T-SRAM is 5 ns, and the average latency of 200 MHz DDR SDRAM is estimated to be around 30 ns.

Memory efficiency is largely driven by data streaming. What that means is that developers can optimize their data accesses so that they are more linear and thus suffer less latency. Latency is highest on the first page fetch, and lower on subsequent linear accesses. It's random accesses that drive memory efficiency down, as more latency is introduced with every new page fetch.

 

It has been brought up that DDR SDRAM is only 65 percent effective, but that figure comes from comparing an SDRAM-based GeForce2 graphics card with a DDR-based one. The Xbox's main memory efficiency should be around 75 percent if one considers that the GeForce3 has a much better memory controller than the GeForce2 chipsets. The GC's 1T-SRAM main memory is speculated to be 90 percent effective. A significant difference between the two memories!
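To see what those efficiency percentages actually do to usable bandwidth, here is a minimal sketch. The raw figures are assumptions for illustration only: 6.4 GB/s is the commonly quoted Xbox DDR number (not from this post), and 2.6 GB/s is the GC external bus figure quoted earlier in the thread.

```python
# Usable bandwidth = raw bus bandwidth x efficiency factor.
# Raw figures are illustrative assumptions: 6.4 GB/s (Xbox DDR, the
# commonly quoted number) and 2.6 GB/s (GC 1T-SRAM bus, per the thread).
def effective_gb_s(raw_gb_s: float, efficiency: float) -> float:
    """Estimate usable bandwidth given a bus efficiency factor."""
    return raw_gb_s * efficiency

xbox = effective_gb_s(6.4, 0.75)  # ~4.8 GB/s usable
gc = effective_gb_s(2.6, 0.90)    # ~2.34 GB/s usable
print(xbox, gc)
```

The point being that efficiency narrows the raw-bandwidth gap considerably, even before latency differences are considered.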

(...)

What is known:

GC cache is 4 times larger than the Xbox's 256 KB.

Xbox can feed its cache with 2.5 times more data per second than the GC.

 

Now it all comes down to this: if the Xbox doesn't support data compression in its texture cache, then the GC is simply better...

 

GC: 1.44*6= 8.64 GB/s

Xbox: 4.05*1= 4.05 GB/s

 

and.....

 

texture bandwidth

 

GC: 10.4 GB/s*6= 62.4 GB/s

Xbox: 30 GB/s*1= 30 GB/s
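The arithmetic above, spelled out as a sketch. The 6:1 compression ratio and all raw bus figures are the post's own numbers; whether the Xbox really lacks texture-cache compression is the post's open question, encoded here as the ratio of 1.

```python
# The post's cache-feed and texture-bandwidth arithmetic, using its own
# figures. The 6:1 ratio applies only to the GC; the Xbox is assumed
# uncompressed (ratio 1), which is the post's unresolved "if".
COMPRESSION = 6  # 6:1 for 24-bit compressed textures

def effective(raw_gb_s: float, ratio: int) -> float:
    """Effective throughput = raw bandwidth x compression ratio."""
    return raw_gb_s * ratio

gc_cache_feed = effective(1.44, COMPRESSION)  # 8.64 GB/s
xbox_cache_feed = effective(4.05, 1)          # 4.05 GB/s
gc_textures = effective(10.4, COMPRESSION)    # 62.4 GB/s
xbox_textures = effective(30.0, 1)            # 30 GB/s
```

Note the whole comparison hinges on that single assumption: with the ratio set to 1 on both sides, the Xbox numbers win.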

I believe I've proven my point. Besides, Mario Galaxy's and Metroid Prime's shader effects were already discussed in these forums, and you obviously weren't reading. Believe me, they won't run on an Xbox; a leaps-and-bounds difference, really... and they aren't even running on the final hardware.

No, IGN said the 729 MHz ones were the more advanced devkits, the ones that came after the modified GameCubes (which were basically nothing but GameCubes with controllers attached).

 

http://www.ebgames.com/gs/wii/wii_signup.asp?cookie%5Ftest=1&

 

EB Games has put the 729 MHz figure up on their final system specs page. Obviously that's not final confirmation, as it's an online shopping site, but I can't help feeling even more assured that these are the real final specs now.

 

 

Anyway, what the heck are you talking about with Brawl? The models in it have almost exactly the same polycount as the GameCube ones (other than Link) and slightly better textures. Other than that, there's a bit more detail in the backgrounds, and it looks like some of the backgrounds were possibly displaying prerecorded video too. In its current state, I wouldn't be surprised if Brawl could be done on the GameCube with enough refining, never mind Wii.

 

Metroid Prime 3 and Galaxy on the other hand, no, I don't think either could be done on Gamecube, but I do on the other hand think they could be done on a system with the specs we've seen. Look at the kind of visuals the Xbox pumped out

 

http://image.com.com/gamespot/images/2005/209/928401_20050729_screen004.jpg

http://www.xboxgazette.com/i/pic_halo2_zh_01.jpg

http://image.com.com/gamespot/images/2003/e3/0512/pc_doom305120900_screen003.jpg

 

Galaxy definitely looks really good, but again, we're not talking major advancements; we're talking a lot of smaller ones that add up. Mario's model only looks slightly higher-poly than Sunshine's, and the areas don't seem to be much higher either. The main increases have been in the textures, which are a good bit higher-res, and in the lighting, which has some really neat effects. There also might be a tiny, tiny bit of bump mapping or something similar... that's pretty much it, though. Keep in mind that Galaxy mainly only has to render collections of small areas and objects, as opposed to massive areas filled with stuff, so it's not unrealistic at all for it to be done on this hardware.

Polycounts haven't been increased tremendously (though you're missing that Brawl runs the Twilight Princess Link model at 60 fps in the crazy action with no slowdown), but polygons are not the main focus of games anymore. The texture increases and lighting effects are not possible on a 50% overclock of the Flipper, which theoretically only lets it do 50% more than before; Mario Galaxy's textures and lighting show (far) more fillrate and shader capability than is possible on a Flipper.

 

If the specs are that, the only thing IBM and ATI have done is change the manufacturing process from 180 nm to 90 nm. That decreases the chips' die size so much that they can easily overclock them by 50% and keep them passively cooled. That would mean there was no real investment in the hardware, and you don't believe that yourself, right? The split in the 1T-SRAM is a design flaw for a home console, too. The development kit discussed by IGN can't have had the final specifications, but it would suffice for making bigger-scale games in preparation for the Wii specs. It's very possible Metroid Prime 3 was developed on that devkit, but I don't believe for a second that Brawl and Galaxy are done on that system.

 

Just to bring on another point, how do you explain the comments of physics acceleration on the Wii? I don't suppose they've got that running on the programmable ISA of the Flipper...

 

About Brawl: I don't think you've noticed how much more fluid the animation has become, how high-res the textures are, and how Link's, Mario's and Kirby's special attacks really blow off the screen. This game really shows some of the potential the Wii has, and it still has 18 more months of development to come.


I wish I knew what you guys were talking about because it sounds awesome.

I wish I knew what you guys were talking about because it sounds awesome.

 

In general, the bigger the number, the better it is :heh:

 

(1 GHz = 1000 MHz)

