N-Europe

Wii may use DirectX 10!


DiemetriX


I have generally agreed with much of what you contribute here, and this one is no exception :awesome:

 

I think the simplest way to think of it is that DirectX is a translator. It sits between devs (games) and the hardware. When the hardware "learns new words" (i.e. is upgraded), you have to update the translator (DirectX) so that it knows those new words :yay: Each new generation of hardware and DirectX expands the dictionary to include more words, so you can use more colourful language :idea:
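To stretch that translator metaphor into code - a purely illustrative sketch (every class name here is made up; this is not real DirectX): the game only ever speaks to the API, and the API reports which "words" the hardware underneath actually knows.

```cpp
// Purely illustrative -- none of these class names exist in real
// DirectX. The "translator" defines one vocabulary; each hardware
// generation implements as much of it as it can.
#include <iostream>

// The translator: the only interface the game ever speaks to.
struct GraphicsAPI {
    virtual ~GraphicsAPI() = default;
    virtual bool supportsPixelShaders() const = 0;  // a "newer word"
    virtual void drawTriangle() = 0;
};

// Older hardware: the translator knows the word, the chip doesn't.
struct Dx7EraCard : GraphicsAPI {
    bool supportsPixelShaders() const override { return false; }
    void drawTriangle() override { std::cout << "fixed-function triangle\n"; }
};

// Newer hardware: same vocabulary, bigger dictionary.
struct Dx9EraCard : GraphicsAPI {
    bool supportsPixelShaders() const override { return true; }
    void drawTriangle() override { std::cout << "shader-lit triangle\n"; }
};

// The game never touches the hardware directly.
void renderFrame(GraphicsAPI& gpu) {
    if (gpu.supportsPixelShaders()) {
        // take the fancier rendering path
    }
    gpu.drawTriangle();
}

int main() {
    Dx9EraCard card;
    renderFrame(card);  // swap in Dx7EraCard and the game still runs
}
```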

Thanks :) You seem new here (I hadn't seen you around), so I'd like to welcome you to these forums...

 

Also, that's a pretty good explanation you've got there :heh: Keep posting.



Crumbs... there's been a lot of confusion in this thread.

 

I'd expect that, going on what Nintendo themselves have said publicly about being able to use GC dev kits to prototype Revolution (do I have to call it Wii? :/ ) demos on, it'll be using an extension of the GameCube graphics library, which - as has already been mentioned - is similar in style to OpenGL.
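For anyone who hasn't seen what "similar in style to OpenGL" means in practice, here's a minimal immediate-mode GL triangle (GLUT provides the window; illustration only - GX isn't literally OpenGL, but it follows the same open-a-batch, push-vertices, close-the-batch pattern):

```cpp
// A classic immediate-mode OpenGL program (GLUT provides the window),
// showing the style the GameCube's GX library is said to resemble.
#include <GL/glut.h>

void display() {
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_TRIANGLES);              // open a primitive batch
    glColor3f(1.0f, 0.0f, 0.0f);        // per-vertex state, set inline
    glVertex2f(-0.5f, -0.5f);
    glColor3f(0.0f, 1.0f, 0.0f);
    glVertex2f(0.5f, -0.5f);
    glColor3f(0.0f, 0.0f, 1.0f);
    glVertex2f(0.0f, 0.5f);
    glEnd();                            // close the batch
    glFlush();
}

int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutCreateWindow("immediate-mode triangle");
    glutDisplayFunc(display);
    glutMainLoop();
}
```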

 

I think the confusion has perhaps arisen because people sometimes refer to graphics cards as being "DirectX 9 / 10 compliant", meaning which of the latest features that DirectX supports (in software) are also supported in hardware (i.e. hardware accelerated) by the card itself. Perhaps this refers to a shader implementation in this new version of the Flipper chip, which would be a radical addition to the hardware, considering the previous Flipper chip merely did hardware Transform and Lighting (think DirectX 7 / GeForce 2 era).
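Incidentally, that kind of "compliance" is something you can query directly: Direct3D 9 exposes caps bits reporting which shader model the card actually accelerates. A minimal sketch against the real D3D9 caps API (Windows-only; link with d3d9.lib):

```cpp
// Query which shader model the installed card hardware-accelerates.
// This is roughly what "DirectX 9 compliant" means in practice: the
// runtime supports the features in software, the caps say what the
// hardware itself can do.
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // Shader versions are encoded DWORDs; the D3DPS_VERSION macro
    // builds the same encoding for comparison.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        std::printf("DX9-class hardware (pixel shaders 2.0+)\n");
    else
        std::printf("Pre-DX9 class: fixed-function or SM1.x only\n");

    d3d->Release();
    return 0;
}
```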

It had a fixed T&L pipeline... but still, it wasn't a DirectX 7 GPU by the book: the ISA around it was completely programmable, making up for real shaders. Features like skinning and Wind Waker's cel shading wouldn't be possible otherwise. So it's really a DirectX 8+ GPU... sure, it doesn't do these things in the same way, but it does them, and it's not harder to do so according to Factor 5.
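For what it's worth, the cel-shading trick itself is simple maths: quantise the diffuse lighting term into a few flat bands instead of letting it vary smoothly. A plain-C++ sketch of the idea (illustrative only - on real hardware this is evaluated per pixel, whether by a shader program or by fixed-but-configurable combiner stages):

```cpp
// Core of cel ("toon") shading: snap the Lambert diffuse term to a
// small number of discrete bands. Values and vectors below are made up.
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

float dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Quantise a 0..1 lighting intensity into `bands` flat steps.
float celShade(const Vec3& normal, const Vec3& lightDir, int bands) {
    float diffuse = std::max(0.0f, dot(normal, lightDir)); // Lambert term
    return std::floor(diffuse * bands) / bands;            // banded, still 0..1
}

int main() {
    Vec3 n{0.0f, 1.0f, 0.0f};        // surface normal, facing up
    Vec3 l{0.0f, 0.7071f, 0.7071f};  // unit light direction, 45 degrees
    std::printf("banded intensity: %.2f\n", celShade(n, l, 3)); // -> 0.67
}
```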

 

Also, we don't know yet if it's really a new version of "Flipper". I agree with you on the rest though.


For true backwards compatibility, neither the CPU nor the GPU can be totally different. So I guess the CPU will just be a better-clocked architecture jump from the PowerPC 750CXe, probably to a modified version of the 970FX (small, efficient). The GPU can be modified as well, as long as the features of the Flipper stay within the chip. I admit it would be quite difficult to use the basic Flipper and just build new features around it, as that would seriously hurt performance and architecture. It is easier to build a GPU similar to the old one from scratch, but heavily enhanced.


Indeed, I think it is safe to say that the CPU will be of PowerPC architecture. As for the GPU, I think ATI will be pouring in the shader power. I say this since ATI has shifted focus to shaders with the X1900 line as well as with the Xbox 360 GPU. There is no reason to think they won't do the same for the Wii GPU.


[A bit off topic] From IGN:

 

[Image: Wii console shown to scale]

That's right: Wii is small because it can be; it doesn't need to be big and bulky because it isn't hiding a flux capacitor and 10 water-cooled fans under its hood... And yet, I browse the IGN message boards and continue to see posts like: "OMGOMG Nintendo Wii Has More Power than 360 and PS3 Combined, According to My Ass!"... If you haven't figured it out yet, Wii is not a console whose horsepower is on the same level as its competitors.


And what has horsepower to do with functionality?

 

Nothing. He is not badmouthing the Wii.

 

It is a console whose controller takes gaming into a completely different -- possibly better -- direction. Let's focus on the controller, please. I think that once you experience it, you're not going to care quite as much if your friend's system can render more polygons than yours.


If the Wii gets released with a G5 and a DirectX 9-level GPU, there's no reason at all why games should look worse than first-gen 360 games.

 

In that case I hate IGN for publishing the specs of an overclocked Cube devkit.


This whole shader stuff is overhyped - developers can make shadows look more realistic and save power. Shaders are very efficient for that, but are shadows that important?

 

It started with lens flares and the overuse of them, and now we have to live through shadows, HDR and motion blur! I am sick of that crap. Every 6 months ATI and NVIDIA decide what the new super trend is and how it will affect gaming. So far I am not convinced at all.

 

The Unreal Engine 3 is not Jesus, and so far the pictures look so superficial that all the advantages with lighting and shadowing are gone. Anti-aliasing is a nice feature and voxel technology is quite interesting, but do I see those features? Nope, because that would take time and effort to invest in, and neither ATI nor NVIDIA can afford to spend that and fall a generation behind.

 

Pixel/vertex shader models are useful, but not really the alpha and omega of gaming. Compare the leap between Doom 2 and Quake 2 to the changes games have nowadays: higher-resolution textures (add more RAM to the GPU), more shadows and more artificial lighting, and BINGO, you have a current-generation "masterpiece".

 

Has anyone actually seen the skybox in Quake 4? Are the system requirements really justified for the product you get?


As much as I believe that it's not what you have but what you do with it, I sincerely believe that many of the features above ARE the way forward for graphical development. I'm not sure what voxel tech is, but HDR and anti-aliasing are great steps forward and are already fairly standard features in most games, such as Far Cry and the Source games. Even on my old but faithful vanilla 6600 (I'm switching to ATI - NVIDIA can't do AA and HDR at the same time very well for some reason), these features, though not essential, have hugely improved my play experience. To simply deny their purpose would be madness.
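For the record, the HDR part is a simple idea at heart: compute lighting in an unclamped range, then compress ("tone-map") it back into the displayable 0..1 range at the end, instead of letting bright areas clip. A tiny sketch using the well-known Reinhard operator c / (1 + c) (the sample luminance values are made up):

```cpp
// High dynamic range in miniature: light is computed unclamped, then
// tone-mapped down for the display so bright detail survives.
#include <cstdio>

float reinhard(float c) { return c / (1.0f + c); }  // maps [0, inf) -> [0, 1)

int main() {
    // Made-up scene luminances: shadowed wall, lit wall, sun glare.
    const float pixels[] = {0.05f, 0.8f, 14.0f};
    for (float c : pixels) {
        // Without tone mapping, 14.0 would simply clamp to 1.0 and the
        // glare would flatten into pure white with no gradation.
        std::printf("%5.2f -> %.3f\n", c, reinhard(c));
    }
}
```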


Exactly my point. Though shaders (and HDR) might be overrated, they do add much to the feel of the graphics - more than high-res textures and much more than a higher polygon count. Because of those graphics I begin to see more and more of the age of the GameCube hardware in my games. Games often have a dry and bland look compared to the next generation, and that isn't to blame on raw power and fillrate per se, but rather on the limited shader features. These new technologies add so much to the vividness of a game - it would be very disappointing if the Wii didn't make use of them.


I believe the Wii will be more than capable of these effects if developers wish to use them, although I personally feel that we are approaching a graphical ceiling. While the GC games have aged compared to the MGS4 real-time demo, I've yet to see even the most idealistic Sony "tech demo" totally outshine the likes of Resi 4 or Prime - at least, nowhere near the same way as the jump from PS1 to PS2, or the great jump from 2D to 3D. Maybe this will change once we start seeing DX10 in full effect on PC and possibly certain consoles, but it just seems like very soon the graphical potential of a console will be utterly inconsequential. Personally, I think it already is. As I said before, even if the Wii offered no graphical improvement over the GC (though I believe there will be a fairly significant improvement), that would still be more than enough for any professional developer to fully realise their vision. Sheer console horsepower just isn't a limitation anymore.


