Guest Jordan Posted October 26, 2005 Jesus fucking Christ. Some crazy Finnish guys have managed to overclock an X1800XT's core to a massive 1GHz. That's a world record. Yeah... that's fast. Anyway, details here: http://ca.us.biz.yahoo.com/ccn/051026/200510260293452001.html?.v=2
CompSci Posted October 26, 2005 Gah! My 9800 Pro is still at 340-something... MHz
Guest Offerman Posted October 27, 2005 Very good article. I knew ATi wouldn't let us down.
Cheapshot Posted October 27, 2005 Can they play games properly at that speed, or does the card blow up? Read the article: stable!? WTF!
Guest Offerman Posted October 27, 2005 Can they play games properly at that speed, or does the card blow up? Read the article: stable!? WTF! It's nearly always heat and voltage that cause instabilities. If you can remove the heat, which they have done, and provide enough voltage, then you can have a perfectly stable and usable overclock.
Cheapshot Posted October 27, 2005 I can't even overclock my 9600 30MHz over stock without getting artifacts. Finland is freezing, so they have an advantage there.
wing Posted October 27, 2005 It's nearly always heat and voltage that cause instabilities. If you can remove the heat, which they have done, and provide enough voltage, then you can have a perfectly stable and usable overclock. Right, so I'll need access to a fridge (maybe a freezer) and maybe a direct link to the National Grid. Cool, thanks for the heads up.
Dieter Posted October 27, 2005 So old, I've known this for a full ten minutes :P
wing Posted October 27, 2005 Anyway, I may wait for ATI to release a GDDR4 version, which will hopefully be out sometime next year... hopefully. Next year is what Stan Ossias, Sr. Product Manager, Desktop Products, ATI Technologies Inc. says.
Guest Offerman Posted October 27, 2005 Right, so I'll need access to a fridge (maybe a freezer) and maybe a direct link to the National Grid. Cool, thanks for the heads up. Well, usually they just use a second, totally dedicated PSU, and use N2O, which is about -30°C.
wing Posted October 27, 2005 You know, I've read reviews and things comparing the nVidia 7800GTX to the X1800XT that said the nVidia was better. Wonder what those reviewers would say now. Wonder what all those nVidia fanboys would say now. I mean, they were all preaching about how good the old 7800GTX was compared to the new X1800XT.
Dieter Posted October 27, 2005 You know, I've read reviews and things comparing the nVidia 7800GTX to the X1800XT that said the nVidia was better. Wonder what those reviewers would say now. Wonder what all those nVidia fanboys would say now. I mean, they were all preaching about how good the old 7800GTX was compared to the new X1800XT. It still is, I think. You can't possibly use these benchmark results as a standard, seeing as you need a liquid nitrogen cooling system, a separate PSU, and a LOT of knowledge to get it running at those speeds. Not to mention your card would wear down pretty fast. Those reviews use the standard, pre-defined clock speeds, pipelines, and so on, making them more accurate, seeing as not a lot of people can OC their cards. All that aside, pure factual MHz don't mean very much anymore (think AMD numbers compared to Intel, GP2X CPU clocks compared to PSP...). Of course, these massive numbers indicate great speed, but just don't go looking purely at the bigger numbers, that's all I'm trying to say. -edit- Oh, and I have an ATI, but my next card will pretty much certainly be an nVidia
wing Posted October 28, 2005 It still is, I think. You can't possibly use these benchmark results as a standard, seeing as you need a liquid nitrogen cooling system, a separate PSU, and a LOT of knowledge to get it running at those speeds. Not to mention your card would wear down pretty fast. Those reviews use the standard, pre-defined clock speeds, pipelines, and so on, making them more accurate, seeing as not a lot of people can OC their cards. All that aside, pure factual MHz don't mean very much anymore (think AMD numbers compared to Intel, GP2X CPU clocks compared to PSP...). Of course, these massive numbers indicate great speed, but just don't go looking purely at the bigger numbers, that's all I'm trying to say. -edit- Oh, and I have an ATI, but my next card will pretty much certainly be an nVidia Yeah, I know, but you have to remember that all these reviews were done by comparing the nVidia 7800GTX to a reference-design ATI X1800XT running on beta drivers, and even then the ATI almost matched the nVidia. Just imagine what the final revision will be able to do. And besides, the ATI is the only card right now that can do HDR and anti-aliasing with little performance hit. nVidia can't even do that. And besides, for Lost Coast, which should be out now, the ATI card will storm all over the nVidia, because you can have both HDR and anti-aliasing, unlike the nVidia where you can only have one or the other. I know I sound like an ATI fanboy, but take my word for it, I'm not. I don't even own an ATI card right now. The only reason I say all this is because I have used and seen nVidia cards in action, and IMO I reckon ATI is better, not always, but most of the time.
Guest Jordan Posted October 28, 2005 And besides, for Lost Coast, which should be out now, the ATI card will storm all over the nVidia, because you can have both HDR and anti-aliasing, unlike the nVidia where you can only have one or the other. I know I sound like an ATI fanboy, but take my word for it, I'm not. I don't even own an ATI card right now. The only reason I say all this is because I have used and seen nVidia cards in action, and IMO I reckon ATI is better, not always, but most of the time. Dude... I mean, I'm an ATi fanboy, but that's the biggest load of bullshit I've ever heard. I mean EVER. Why the hell did you say that?
wing Posted October 28, 2005 Dude... I mean, I'm an ATi fanboy, but that's the biggest load of bullshit I've ever heard. I mean EVER. Why the hell did you say that? It's all true. I mean, have you tried switching anti-aliasing on on nVidia cards while HDR is on? The frame rate drops badly.
Guest Jordan Posted October 28, 2005 What nVidia card have you been using? An FX5200? Jeez, the X1800 and the 7800 are pretty damn close in terms of all specs. All graphics cards take a huge frame-rate hit from turning on HDR, and even FSAA.
wing Posted October 28, 2005 What nVidia card have you been using? An FX5200? Jeez, the X1800 and the 7800 are pretty damn close in terms of all specs. All graphics cards take a huge frame-rate hit from turning on HDR, and even FSAA. Yeah, I know, but the X1800XT was designed so that this frame-rate hit isn't such a huge issue. It won't be as much of a performance drain as it is on the 7800. That's all I'm saying.
RoadKill Posted October 28, 2005 Yeah, I know, but the X1800XT was designed so that this frame-rate hit isn't such a huge issue. It won't be as much of a performance drain as it is on the 7800. That's all I'm saying. Your grammar is terrible, but worse is your assumption about graphics card power. I have a Radeon 9800 Pro, and I can promise you that when you enable AA with HDR in Lost Coast, the frame rate drops like a stone. ATi has classically had an advantage over nVIDIA running at higher resolutions with AA and AF enabled, not necessarily HDR. Please tell me how much experience you have in this field, and whether you've worked with different platforms and hardware and had the chance to benchmark them, because I'd welcome it. However, I'm pretty sure you're just talking bollocks because you like ATi. Don't get me wrong, I like ATi; I've had a Radeon 9800 Pro, Radeon 9800, Radeon 9600 Pro, and Radeon 8500 AIW DV, but try to spread fact, not fiction. At least back your arguments up with some links.
wing Posted October 28, 2005 I don't claim to know much about cards. I am just saying what I think, from what I have read and heard. I just want to know which one you would go for and why. Reason being, and I need a good one, that I am thinking of getting a new card, and I need to know which is good and why, so that I can justify the cost of a £300+ card. EDIT: I need the card to be future-proof, that's all.
Guest Jordan Posted October 28, 2005 Unforch, there's no such thing as "future proof" these days. My X800 was the dog's bollocks back this time last year. Now it's getting slower and worse over time.
wing Posted October 28, 2005 Unforch, there's no such thing as "future proof" these days. My X800 was the dog's bollocks back this time last year. Now it's getting slower and worse over time. Yeah, I know what you mean, but what card would you say is good to get this year or next, one that would last me a few good years, to say the least? Let's say I have a budget of around £300 to £360.