DiemetriX

Wii may use Direct X 10!


Then why is it a new chip? 250 MHz was already possible on the original Flipper... it was even at 202.5 MHz earlier... with a die shrink and all, they could get a lot more speed out of it... I think they worked hard there for 5 years... and the Rayman screens were moved by GameSpot into the PS2/Xbox section, not considered Wii.

Who says it's a new chip anyway? Look - Nintendo doesn't seem to have invested much at all in new hardware, so it could just be an overclocked though passively cooled die shrink of the Flipper for all we know, and that would fit IGN's description perfectly. If they really had invested in brand new hardware then we wouldn't be discussing this - there would be an X1600XT-like chip in it that would be perfectly able to mimic the Flipper and perform far better shader operations. It's very likely ATI has included some new features in the design that allow things like displacement mapping, but I doubt they have redesigned the shader system, let alone made it DirectX 10 compliant. It's a damn shame really, because shading is what makes the difference between the 360/PS3 and last gen.

To be honest, not that I know very much about the technical stuff, I doubt Nintendo would use something like DirectX. They always design from the ground up and they KNOW what a console needs (bar more RAM on the N64!).

 

Proprietary libraries, I think, are more likely.

That's the thing, you know... Nintendo will not use the DirectX libraries, but that doesn't mean the chip can't support some of DirectX's features in hardware. Like... DirectX 10 will support, for example, sub-pixel scattering... if it's supported in hardware, of course you can exploit that through another library...

 

A chip just has to meet some standards to be DirectX 10 compliant... I believe for DirectX 10 that means Pixel Shader 4.0 capabilities... if they say the chip is DirectX 10, that's probably what they mean.
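To make the "compliance = shader model" point concrete, here's a rough sketch; the mapping is from memory and purely illustrative, not an official Microsoft table:

# Illustrative mapping of Direct3D versions to the shader model a GPU
# roughly needs in hardware to carry the "compliant" label (from memory).
DIRECTX_SHADER_MODEL = {
    "DirectX 7":    None,       # fixed-function only, no programmable shaders
    "DirectX 8":    "SM 1.x",   # first programmable vertex/pixel shaders
    "DirectX 9":    "SM 2.0",
    "DirectX 9.0c": "SM 3.0",
    "DirectX 10":   "SM 4.0",   # unified shaders, geometry shaders
}

def required_shader_model(dx_version):
    """Return the shader model a chip needs for the given DirectX label."""
    return DIRECTX_SHADER_MODEL[dx_version]

print(required_shader_model("DirectX 10"))   # -> SM 4.0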

Who says it's a new chip anyway? Look - Nintendo doesn't seem to have invested much at all in new hardware, so it could just be an overclocked though passively cooled die shrink of the Flipper for all we know, and that would fit IGN's description perfectly. If they really had invested in brand new hardware then we wouldn't be discussing this - there would be an X1600XT-like chip in it that would be perfectly able to mimic the Flipper and perform far better shader operations. It's very likely ATI has included some new features in the design that allow things like displacement mapping, but I doubt they have redesigned the shader system, let alone made it DirectX 10 compliant. It's a damn shame really, because shading is what makes the difference between the 360/PS3 and last gen.
ATi said otherwise, and this is a new core... it might be based on Flipper... but all ATi home cards since the R300 are based on earlier cores too... it's still a new core, and that means at least a real pixel shader was implemented, since the original only went without one because of time limitations... but mind you, the GC had its ISA completely programmable, so in fact it had shaders, and good ones might I add.

 

Do you really think an X1600 could mimic Flipper? Only if they enhanced its core from the ground up... or made per-game emulation fixes for certain games... all graphics cards have only one alpha register... Flipper has 2... in fact it has a lot of duplicated bits... trust me, it's not that easy... Flipper wasn't made by ATi, it was made by ArtX, so it's really a different core altogether. Mario Sunshine (the pause menus) and Zelda Wind Waker altogether wouldn't work easily, as they use all those specific functions... most third party games would work though, especially if they came out on Xbox/PS2 too.

 

I have no doubt the shader setup is not the same... the reason the GC has a fixed T&L pipeline is that Nintendo didn't want to delay the GC launch just for that, and because it didn't really need it to match Pixel Shader 1.3... but trust me, it was a matter of time; the chip had to be concluded... now you have a chip that had 5 years to be made... no better shaders? That's very doubtful, I think.


So when you say DirectX 10 features, you mean features which are also in DirectX 10 rather than ones created entirely for DX10?

 

Who says it's a new chip anyway? Look - Nintendo doesn't seem to have invested much at all in new hardware, so it could just be an overclocked though passively cooled die shrink of the Flipper for all we know, and that would fit IGN's description perfectly. If they really had invested in brand new hardware then we wouldn't be discussing this - there would be an X1600XT-like chip in it that would be perfectly able to mimic the Flipper and perform far better shader operations. It's very likely ATI has included some new features in the design that allow things like displacement mapping, but I doubt they have redesigned the shader system, let alone made it DirectX 10 compliant. It's a damn shame really, because shading is what makes the difference between the 360/PS3 and last gen.

 

Of course it's a new chip! Just because its architecture is based on Flipper does not make it the same chip. Nintendo don't invest that much in hardware because, unlike the other 2 companies, they spend their time helping optimise it properly, so they get more out of what they have. That's how they keep their hardware costs down. They're designing a console rather than a glorified PC. They know where it's at.

So when you say DirectX 10 features, you mean features which are also in DirectX 10 rather than ones created entirely for DX10?
I think so; you could say that Flipper is in the DirectX 8 range... comparing it with what DirectX 8 can do.

 

DirectX 9.0c, for example, is basically just the implementation of Pixel Shader 3.0... the PS3 will have real pixel shaders since it's an Nvidia chip; it must be Pixel Shader 4 compliant or something.

 

Microsoft gives out "DirectX 9 compliant" labels if a product meets certain requirements to support their Direct3D API, just that. The Matrox Parhelia was the first graphics card to support DirectX 9, and it came out before DirectX 9 was even finalized or released for that matter; it's just that it supported everything DX9 was supposed to do (although they never released drivers enabling that support afterwards).


Pedrocasilva, I want to believe you, but this puts me off:

Revolution's ATI-provided "Hollywood" GPU clocks in at 243MHz. By comparison, GameCube's GPU ran at 162MHz, while the GPU on the original Xbox was clocked at 233MHz. Sources we spoke with suggest that it is unlikely the GPU will feature any added shaders, as has been speculated.

 

"The 'Hollywood' is a large-scale integrated chip that includes the GPU, DSP, I/O bridge and 3MBs of texture memory," a studio source told us.

This sounds an awful lot like a description of the Flipper. Especially the 'no extra shaders' part, with the developers not expecting them; it really seems like Nintendo and ATi haven't strayed far from the Flipper design.


Wasn't that Ninty's intention though? Microsoft has lost billions, and Sony has seen what profit it does make these days all but wiped out just on R&D for their big new machines. Faster, rejigged versions of their previous chips would do the job very adequately for the Nintendo Wii.

 

I think any machine specs you find on the net (the above were posted last year, I believe) should be taken with a pinch of salt until you read them on a piece of official Nintendo-signed paper.


Microsoft and Sony are the other extreme. They spend loads on R&D, are willing to pay $150 per console for just the GPU and focus everything on the HD generation. Nintendo on the other hand could just use brand new hardware and technology, but they don't. I don't believe that after four years of technological improvement this is the best Nintendo can do, even when considering the small size and low price.

 

I don't think IGN is far from the truth.

Pedrocasilva, I want to believe you, but this puts me off:

 

This sounds an awful lot like a description of the Flipper. Especially the 'no extra shaders' part, with the developers not expecting them; it really seems like Nintendo and ATi haven't strayed far from the Flipper design.

Well... IGN has committed big mistakes lots of times, especially with the Wii... the souped-up Xbox comment was just plain wrong, you know? Later Matt said something like "GC's Gekko can beat the Xbox CPU in some situations"; it's his way of not apologising... Wii is a souped-up Xbox? All they knew was that the CPU had twice the power of the original GC one... and the GC CPU already beat the Xbox CPU in every possible situation; it was, simply put, more powerful.
This whole processor thing is quite twisted considering the Xbox and GameCube are two TOTALLY DIFFERENT architectures (a 32/64-bit hybrid, native PowerPC, compared to 32-bit Wintel). The GameCube, having this architecture, has a significantly shorter data pipeline than the Xbox's PIII setup (4-7 stages versus up to 14), meaning it can process information more than twice as fast per clock cycle. In fact, this GCN CPU (a PowerPC 750-based IBM chip) is often said to be as fast as a 700 MHz machine would be at 400 MHz. So the GCN could be like an 849 MHz machine compared to the Xbox's 733 MHz, performance-wise.
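A quick back-of-the-envelope check of that 849 MHz figure, taking the "700 MHz at 400 MHz" rule of thumb at face value and assuming the Gekko's 485 MHz clock (the clock isn't stated in the post):

# Rough sanity check of the "849 MHz equivalent" claim above.
# Assumptions: Gekko runs at 485 MHz; the quoted rule of thumb gives a
# ~1.75x per-clock advantage (700 MHz-equivalent at 400 MHz).
gekko_clock_mhz  = 485
per_clock_factor = 700 / 400          # ~1.75x claimed advantage per cycle
xbox_piii_mhz    = 733

gekko_effective = gekko_clock_mhz * per_clock_factor
print(f"Gekko effective: ~{gekko_effective:.0f} MHz vs Xbox {xbox_piii_mhz} MHz")
# -> Gekko effective: ~849 MHz vs Xbox 733 MHz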

How can they conclude it's just a souped-up Xbox from the CPU alone? Probably because the Xbox had 733 MHz and the Rev appears to have 740, and devs said the chip is fast for what it is... but in fact the GC was already a souped-up Xbox, at least in the CPU department, wasn't it?

 

Thus... IGN knows nothing; they know how to read the specs on paper and draw conclusions without seeing in-game graphics. Of course they are far from the truth; you can't judge the graphics by looking at the motherboard or reading the spec sheet.

Wasn't that Ninty's intention though? Microsoft has lost billions, and Sony has seen what profit it does make these days all but wiped out just on R&D for their big new machines. Faster versions of their previous chips would do the job very adequately for the Nintendo Wii.

 

I think any machine specs you find on the net (the above were posted last year, I believe) should be taken with a pinch of salt until you read them on a piece of official Nintendo-signed paper.

Well, Nintendo said that the GC would be "just as powerful as a PS2" back then. Sure, the Revolution is not going to beat the others; that's why it doesn't even do HD, to be more efficient... still, Nintendo isn't stupid, I'm sure they have a pretty good trade-off there... DirectX 10 compliant doesn't seem impossible to me... the GC did a lot of things faster because of specially enhanced paths to do them... volumetric fog, for example, was built into the console from the ground up to speed up the process. Support for those features in hardware means it can do them without much of a hit; it's perfect for an affordable chip to output better graphics.


Har-har. Jordan calls someone an idiot and then falls flat on his face. LOLOLOL.


I agree with Pedro. My main problem with this debate is that we've fallen into the trap of looking at raw numbers, which, without intimate knowledge of the specialised internal processes that Nintendo applied to Flipper, and subsequently to its successor, are very likely to give a wildly inaccurate picture of the actual capabilities of the machine.


@ Pedro

 

Yes, I know about that - but that's just IGN's way of putting it to the crowd. This way the Wii does not seem much more than a souped-up GameCube, so the souped-up Xbox statement wasn't completely wrong. My point is, the way IGN describes it, the Wii hardware is very similar to the Cube's hardware, and that means little improvement if it really is no more than an overclock, and that's exactly what I'm afraid of. The Hollywood just seems too similar to the Flipper to be a brand new, latest-technology chip, especially considering the developers' expectations about additional shaders.

 

@ Gaggle

 

It isn't the numbers I'm worrying about, really - the clock frequency means little if the hardware isn't the same. My main point of worry is the lack of improvement. The Hollywood description sounds exactly the same as the Flipper description. And then the developers seem to say 'yeah, they're probably not updating the shaders' as though Nintendo updates very little at all. If Nintendo updates so little, I'm not confident about graphical improvement, and that's just a shame.


It's a well-known fact that the first dev kits were GCs and souped-up GCs; the numbers IGN got were from some of those kits. It's funny that you say the difference in numbers between the GC and Wii doesn't translate into much, yet you think it's like an Xbox. ATI didn't spend all this time scratching each other's balls, you know. It's probably based off the Flipper, but that's a good thing since devs can harness its power much faster, and without HD you need around 4 times less memory and raw power. This souped-up Xbox business is just a bunch of nonsense that shows most people have no idea what they're talking about. GC hardware surpasses the Xbox in a lot of cases, so how the hell can a more powerful GC be the same as an Xbox? I don't know much about DirectX though, so I won't talk about that. Sorry about drifting off :P
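The "around 4 times less memory and raw power" point is easy to sanity-check with raw pixel counts; the resolutions below are my own assumptions (480p for the Wii, 720p/1080p for the HD consoles), not figures from the post:

# Rough pixel-count comparison between SD and HD render targets.
sd_480p  = 640 * 480      # 307,200 pixels
hd_720p  = 1280 * 720     # 921,600 pixels
hd_1080p = 1920 * 1080    # 2,073,600 pixels

print(hd_720p / sd_480p)    # 3.0   -> ~3x the pixels of 480p
print(hd_1080p / sd_480p)   # 6.75  -> ~6.75x the pixels of 480p

So the "about 4 times" figure is in the right ballpark, depending on whether you count 720p or 1080p.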


As far as I remember, Nintendo spent about the same amount of money on R&D at ATI as MS did.

 

The difference is hidden in the details. MS needs a GPU for HD graphics, and case size is not a limiting factor, nor is cooling an issue. Nintendo on the other hand only needs power for SD graphics in a very small case - and the manufacturing process should be as efficient and cheap as possible.

 

It is like a comparison (I'll leave out console-specific details) between a regular PC and a notebook. My notebook has a GeForce 6800 with 256MB RAM and has run every game so far perfectly, even though it is "old" by today's standards. It is not as fast as a regular 6800, but it is quite cool and silent - it also does not need as much energy as its PC counterpart.

 

A good GPU is made out of a good design, and I do trust ATI with that. Clock speed is just for the masses, otherwise nobody would buy new hardware - you can have a low clock speed and an efficient design and be much faster than something average with double the clock speed. Lousy developers always cry for more MHz...
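In other words, rough throughput is clock speed times the work done per clock; the figures below are invented purely to illustrate that point, not real numbers for any chip:

# Toy illustration: an efficient low-clocked design can out-run a chip
# running at double the clock. The ops-per-clock values are made up.
def throughput(clock_mhz, ops_per_clock):
    return clock_mhz * ops_per_clock

efficient_design = throughput(243, 8)   # low clock, wide and well-fed pipelines
average_design   = throughput(486, 3)   # double the clock, narrower design

print(efficient_design, average_design)    # 1944 vs 1458
print(efficient_design > average_design)   # True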

 

Nintendo sacrifices certain things to get what they really want, and I doubt the result will be merely sub-par.


@ Hellfire: I considered the Cube and Xbox to be at about the same level. There really is not much difference between them in power. If you bend theory enough, then the Cube seems more powerful, but the Xbox also had its strong points. As for the devkit issue - let's hope you're right. IGN could be completely wrong.


Early 360 devkits had about ONE THIRD of the final power, and as far as I know Nintendo also has a few revisions of their devkits and the specifications - so IGN can't really gauge the performance from the devkits. Nintendo surely told a few developers (for launch titles) what to expect from the final hardware.

Early 360 devkits had about ONE THIRD of the final power, and as far as I know Nintendo also has a few revisions of their devkits and the specifications - so IGN can't really gauge the performance from the devkits. Nintendo surely told a few developers (for launch titles) what to expect from the final hardware.
Well... that was Microsoft's bullshit, to be honest... they were just referring to the CPU. You know they have 3 cores on Xenon... well, all the launch games only used one of those, and I think no game has used more than that as of yet... that's not 33% (one third) though... it's 33% of the CPU in the best-case scenario, since the GPU is single-core.

 

And by the way... each single core in the Xbox 360 performs in general-purpose code like a 1.7 GHz Celeron (close to a 1.4 GHz Pentium III), so it's pretty close to double the power of the original 733 MHz PIII CPU in the Xbox, get my drift? :shrug: Read my posts above and you'll conclude that the Wii's CPU is probably faster than that in a single thread... considering everything we've seen on the 360 so far was only using that, how can IGN call it a souped-up Xbox knowing just that?
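Taking the post's own figures at face value (one Xenon core roughly equal to a 1.4 GHz Pentium III-class chip, versus the Xbox's 733 MHz Pentium III), the "pretty close to double" claim checks out arithmetically:

# Checking the "close to double" claim using the numbers quoted above.
xenon_core_equiv_ghz = 1.4    # post's estimate for one Xenon core
xbox_cpu_ghz         = 0.733  # original Xbox Pentium III

print(xenon_core_equiv_ghz / xbox_cpu_ghz)   # ~1.91, i.e. close to 2x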

 

Depending on the GPU, it might be on par with everything the X360 has done so far... including high definition, if the GPU were powerful enough (or meant to be used for that). Anyway, we know it won't do HD, so why couldn't it match that kind of standard at 480p? In my opinion, only if it doesn't support those features in hardware, and that would be plain dumb on Nintendo's part.

 

Most third parties will not go to the trouble of using the two extra CPU cores on the X360, or the PS3's SPEs for that matter (just like loads of games didn't bother to use the vector units in the PS2), and that will in fact set the standard for graphics this generation.

If Nintendo updates so little, I'm not confident about graphical improvement, and that's just a shame.

Graphical improvement isn't what Nintendo are into anymore though. That's why the Wii has a bizarre remote-style controller, and not a GPU with lots of Xs and some kind of huge impressive number in the title. In fact, I sincerely doubt anyone would be as excited as they are about Nintendo's new console if that were the case. Its graphical performance is of course going to be interesting to find out, but it's pretty much at the bottom of everybody's list of priorities. Even if it offered very little, or even no improvement over the GC, it could still easily compete because that's not what it is trying to offer. Leave it to MS and Sony to follow that well-trodden path in their fancy suits and well-shined Doc Martens. Nintendo is stripping naked and going Rambo in the jungle.

Leave it to MS and Sony to follow that well-trodden path in their fancy suits and well-shined Doc Martens. Nintendo is stripping naked and going Rambo in the jungle.

 

Quote of the week!

Reggie is stripping naked and going Rambo in the jungle.

:laughing:

damned character limit


Nintendo bets on innovation, but even its fanboys keep moaning about graphics. Have they ever failed you in the graphics department? Have a bit of faith in Nintendo...


Can someone please tell me what DirectX is?

I once used to put stuff in a DirectX folder on my old computer... never understood why (someone told me to do it)...

What the hell is DirectX?


As far as I know, it's the big translator program in your computer. It was introduced so that games and programs would be more easily compatible with a whole range of different hardware. The more up to date your DirectX is, the better your computer can talk to the graphics card and so on. For the newest graphics cards it may mean you need a new version of DirectX before you can use some of their features.

 

I'm sure someone can explain it better.

Can someone please tell me what DirectX is?

I once used to put stuff in a DirectX folder on my old computer... never understood why (someone told me to do it)...

What the hell is DirectX?

It's a software API library used in PCs... it's mostly for graphics, although it can also cover things like sound and such. It's a direct link between the software you have and the hardware... If you wrote something from the ground up for Nvidia, it could not run on an ATi part, but if it is DirectX compliant it should run anyway. It's that bridge. Now, there are differences between versions: DirectX 7 didn't support shaders, DirectX 8 did, and so on. So when we talk about DirectX 10, it's best to assume the chip is compliant with it, meaning it can do the features needed for that standard... in hardware... Jamba's explanation is better than mine really.
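A toy sketch of that "bridge" idea (everything here is invented for illustration - real Direct3D is a C/C++ COM API, not Python): the game only talks to one interface, and each vendor plugs its own driver in behind it.

# Toy model of a graphics API as a bridge between game code and hardware.
# GraphicsAPI is the shared interface; the two driver classes are stand-ins
# for vendor-specific code paths. All names here are invented.
class GraphicsAPI:
    def draw_triangle(self, verts):
        raise NotImplementedError

class NvidiaDriver(GraphicsAPI):
    def draw_triangle(self, verts):
        print("NVIDIA hardware path:", verts)

class AtiDriver(GraphicsAPI):
    def draw_triangle(self, verts):
        print("ATI hardware path:", verts)

def render_frame(api: GraphicsAPI):
    # Game code written once against the API runs on either vendor's card.
    api.draw_triangle([(0, 0), (1, 0), (0, 1)])

render_frame(NvidiaDriver())
render_frame(AtiDriver())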

 

But DirectX itself is from Microsoft... the Xbox was even supposed to be called the "DirectX-box" at one point.

... Jamba's explanation is better than mine really.

I have generally agreed with much of what you contribute here, and as always this one is no exception :awesome:

 

I think the simplest way to think about it is that DirectX is a translator. It stands in between devs (games) and hardware. When the hardware "learns new words" (i.e. is upgraded), you have to update the translator (DirectX) so that it knows those new words :yay: Each new generation of hardware and DirectX expands the dictionary to include more words, so that you can have more colourful language :idea:

