N-Europe

Radeon X1800 XT Overclocked to 1GHz


Guest Jordan


Guest Offerman
Can they play games properly at that speed or does the card blow up?

 

Read the article: Stable!? WTF!

 

It's nearly always heat and voltage that cause instability. If you can remove the heat, which they have done, and provide enough voltage, then you can have a perfectly stable and usable overclock.


It's nearly always heat and voltage that cause instability. If you can remove the heat, which they have done, and provide enough voltage, then you can have a perfectly stable and usable overclock.

 

 

Right, so I'll need access to a fridge (maybe a freezer) and maybe a direct link to the National Grid. Cool, thanks for the heads-up. :smile:


Guest Offerman
Right, so I'll need access to a fridge (maybe a freezer) and maybe a direct link to the National Grid. Cool, thanks for the heads-up. :smile:

 

Well, usually they just use another totally dedicated PSU, and use liquid nitrogen, which is well below −30°C.


You know, I've read reviews and things comparing the nVidia 7800 GTX to the X1800 XT and saying the nVidia was better. I wonder what those reviewers would say now. I wonder what all those nVidia fanboys would say now. I mean, they were all preaching about how good the old 7800 GTX was compared to the new X1800 XT.


You know, I've read reviews and things comparing the nVidia 7800 GTX to the X1800 XT and saying the nVidia was better. I wonder what those reviewers would say now. I wonder what all those nVidia fanboys would say now. I mean, they were all preaching about how good the old 7800 GTX was compared to the new X1800 XT.

 

It still is, I think. You can't possibly use these benchmark results as a standard, seeing as you need a liquid nitrogen cooling system, a separate PSU and a LOT of knowledge to get the card running at those speeds. Not to mention your card would wear out pretty fast.

 

Those reviews use the standard, pre-defined clock speeds and pipeline counts and so on, making them more representative, seeing as not a lot of people can overclock their cards.

 

All that aside, raw MHz doesn't mean very much any more (think AMD model numbers compared to Intel clocks, or GP2X CPU clocks compared to the PSP...). Of course these massive numbers indicate great speed, but don't go looking purely at the bigger numbers, that's all I'm trying to say.

 

 

-edit-

 

Oh, and I have an ATI card, but my next card will pretty much certainly be an nVidia :)


It still is, I think. You can't possibly use these benchmark results as a standard, seeing as you need a liquid nitrogen cooling system, a separate PSU and a LOT of knowledge to get the card running at those speeds. Not to mention your card would wear out pretty fast.

 

Those reviews use the standard, pre-defined clock speeds and pipeline counts and so on, making them more representative, seeing as not a lot of people can overclock their cards.

 

All that aside, raw MHz doesn't mean very much any more (think AMD model numbers compared to Intel clocks, or GP2X CPU clocks compared to the PSP...). Of course these massive numbers indicate great speed, but don't go looking purely at the bigger numbers, that's all I'm trying to say.

 

 

-edit-

 

Oh, and I have an ATI card, but my next card will pretty much certainly be an nVidia :)

 

 

Yeah, I know, but you have to remember that all these reviews were done by comparing the nVidia 7800 GTX to a reference-design ATI X1800 XT running on beta drivers, and even then the ATI almost matched the nVidia. Just imagine what the final revision will be able to do.

 

And besides, the ATI is the only card right now that can do HDR and anti-aliasing together with little performance hit; nVidia can't even do that. And as for Lost Coast, which should be out now, the ATI card will storm all over the nVidia, because you can have both HDR and anti-aliasing, unlike the nVidia where you can only have one or the other.

 

I know I sound like an ATI fanboy, but take my word for it, I'm not. I don't even own an ATI card right now. The only reason I say all this is because I have used and seen nVidia cards in action, and IMO I reckon ATI is better, not always, but most of the time.


And as for Lost Coast, which should be out now, the ATI card will storm all over the nVidia, because you can have both HDR and anti-aliasing, unlike the nVidia where you can only have one or the other.

 

I know I sound like an ATI fanboy, but take my word for it, I'm not. I don't even own an ATI card right now. The only reason I say all this is because I have used and seen nVidia cards in action, and IMO I reckon ATI is better, not always, but most of the time.

 

Dude... I mean, I'm an ATi fanboy, but that's the biggest load of bullshit I've ever heard. I mean EVER. Why the hell did you say that?


Dude... I mean, I'm an ATi fanboy, but that's the biggest load of bullshit I've ever heard. I mean EVER. Why the hell did you say that?

 

It's all true. I mean, have you tried switching anti-aliasing on on nVidia cards while HDR is on? The frame rate drops badly.


What nVidia card have you been using? An FX5200? Jeez, the X1800 and the 7800 are pretty damn close in terms of all specs.

 

All graphics cards take a huge frame rate hit from turning on HDR, and even FSAA.


What nVidia card have you been using? An FX5200? Jeez, the X1800 and the 7800 are pretty damn close in terms of all specs.

 

All graphics cards take a huge frame rate hit from turning on HDR, and even FSAA.

 

 

Yeah, I know, but the X1800 XT was designed so that this frame rate hit isn't such a huge issue. It won't be as much of a performance drain as it is on the 7800. That's all I'm saying.


Yeah, I know, but the X1800 XT was designed so that this frame rate hit isn't such a huge issue. It won't be as much of a performance drain as it is on the 7800. That's all I'm saying.

 

Your grammar is terrible, but worse are your assumptions about graphics card power.

 

I have a Radeon 9800 Pro and I can promise you that when you enable AA with HDR in Lost Coast, the frame rate drops like a stone.

 

ATi has classically had an advantage over nVidia when running at higher resolutions with AA and AF enabled, not necessarily with HDR.

 

Please tell me how much experience you have in this field, and whether you've worked with different platforms and hardware and had the chance to benchmark them, because I'd welcome it. However, I'm pretty sure you're just talking bollocks because you like ATi.

 

Don't get me wrong, I like ATi; I've had a Radeon 9800 Pro, a Radeon 9800, a Radeon 9600 Pro and a Radeon 8500 AIW DV. But try to spread fact, not fiction. At least back your arguments up with some links.


I don't claim to know much about cards. I am just saying what I think, from what I have read and heard. I just want to know which one you would go for, and why. The reason, and I need a good one, is that I am thinking of getting a new card and I need to know which is good and why, so that I can justify the cost of a £300+ card.

 

 

EDIT: I need the card to be future-proof, that's all.


Unfortunately there's no such thing as "future-proof" these days. My X800 was the dog's bollocks this time last year. Now it's getting slower and worse over time.


Unfortunately there's no such thing as "future-proof" these days. My X800 was the dog's bollocks this time last year. Now it's getting slower and worse over time.

 

 

Yeah, I know what you mean, but which card would you say is good to get this year or next, one that would last me a good few years at the least? Let's say I have a budget of around £300 to £360.

