Last night I went to a computer expo in my city, and all that new computer tech had me drooling :P~ I'm amazed: I built my computer two years ago, and now the tech is completely different. It's more powerful, faster, and more expensive, hehehe. I use my computer for my work as an interior designer, so I use AutoCAD and 3ds Max a lot. It works fine for me, but last night when I saw all the new graphics cards, I thought I might just throw my old graphics card in the trash and buy a new one... hehehe
So in this post I want to share an article about graphics cards: AMD vs Nvidia. All this time I've been ATI-minded, but after reading this article, will that change???
Anyway, I read this article on http://www.techradar.com
So here is the article...
Mac vs PC. Microsoft vs Google. Intel vs AMD. There's no shortage of monumental rivalries in the tech industry. But the royal rumble between ATI and Nvidia for dominance of 3D graphics is one of the roughest of the lot.
The contest is really AMD vs Nvidia. After all, AMD snapped up Canadian graphics outfit ATI back in 2006.
More importantly, however you slice it the histories of ATI and Nvidia have been very closely intertwined. Both started out as specialists in PC graphics and have since branched out into non-PC platforms such as games consoles, mobile devices and set-top boxes.
More recently, ATI's acquisition by AMD seems to have set it on a very different path for the future from Nvidia. But let's start by reminiscing on a few of our favourite ATI vs Nvidia fisticuffs from yesteryear before taking a look at their current offerings.
AMD vs Nvidia: RIVA and Rage
The early days of graphics on the PC saw 3D cards like Nvidia's RIVA 128 and TNT2 take on ATI's Rage and Rage 128. But it was Nvidia who presaged the modern GPU or Graphics Processing Unit with the mighty GeForce 256 in 1999. It was the first graphics chip with hardware transform and lighting capabilities. And it was fast. Damned fast.
ATI responded in 2000 with the Radeon graphics card. Ever since, successive generations of GeForce and Radeon GPUs have been leapfrogging each other in the race for graphics supremacy. Nvidia had the early advantage with the GeForce, GeForce 2, GeForce 3 and GeForce 4 series arguably having the edge over ATI's Radeon, Radeon 7500 and Radeon 8500.
But in 2002 ATI turned the tables with the awesome Radeon 9700 Pro. The first GPU with fully programmable shaders, the 9700 Pro was massively more powerful than any graphics chip before. It took until early 2003 for Nvidia to respond with the ill-fated GeForce 5800 Ultra, a GPU that never lived up to expectations.
Nvidia was back on form a year later with the GeForce 6800 series. A tit for tat ensued with neither ATI nor Nvidia achieving a clear advantage. It was during this period that Nvidia introduced its revolutionary multi-GPU SLI technology and ATI responded with the copycat Crossfire platform. There really was nothing to separate them.
AMD vs Nvidia: Radeon rethink
At least, there wasn't until ATI released the underperforming Radeon HD 2900 XT. Like Nvidia's calamitous GeForce FX series, the 2900 arrived late, ran hot, underperformed and couldn't match its opposition, the GeForce 8800 Ultra.
But unlike the GeForce FX, it led to a fundamental strategic rethink. AMD decided that in future ATI would no longer chase ultimate performance with its top GPU. Instead it would aim for maximum bang for buck and introduce dual-GPU boards to cater for enthusiasts demanding ultra-high performance.
The culmination of this rethink was the Radeon HD 4870. Launched in mid 2008, it was half the price of Nvidia's competing GeForce GTX 280 but delivered at least 80 per cent of the performance. It was a winning combination.
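To put that value claim in numbers, here is a minimal sketch of the bang-for-buck arithmetic. It uses only the article's relative figures (half the price, at least 80 per cent of the performance); no absolute prices or benchmark scores are assumed.

```python
# Bang-for-buck comparison based on the article's claim: the Radeon HD 4870
# cost half as much as the GeForce GTX 280 while delivering at least 80%
# of its performance. Both inputs are relative to the GTX 280 baseline.

def perf_per_pound(relative_perf: float, relative_price: float) -> float:
    """Performance delivered per unit of money, relative to a baseline card."""
    return relative_perf / relative_price

gtx_280 = perf_per_pound(1.0, 1.0)  # baseline card
hd_4870 = perf_per_pound(0.8, 0.5)  # 80% of the speed at 50% of the price

print(f"GTX 280 value: {gtx_280:.2f}x")  # 1.00x
print(f"HD 4870 value: {hd_4870:.2f}x")  # 1.60x
```

In other words, even taking the conservative end of the article's claim, the 4870 delivered at least 60 per cent more performance per pound than its rival.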
AMD vs Nvidia: The DX11 era
Of course, graphics technology waits for no man and much has changed since the Radeon HD 4000 series and GeForce GTX 200 hit the market in 2008. Late last year AMD unleashed the Radeon HD 5000 series, the world's first family of graphics chips with support for the latest DirectX 11 multimedia API from Microsoft, as seen in Windows 7 but also available as an update for Windows Vista.
It took a little longer for Nvidia to respond in kind with the GeForce GTX 400 family. It eventually turned up earlier this year, and since then it's been these two pixel-pumping graphics architectures fighting it out for top DX11 honours.
Topping the current single-GPU tables, therefore, are the ATI Radeon HD 5870 and Nvidia GeForce GTX 480. Thanks to AMD's greater emphasis on value, the GTX 480 weighs in around £100 more expensive at £430 or so.
For the money Nvidia gives you an extra billion transistors for a faintly ridiculous total of three billion. You also get a little more memory as standard, 1.5GB to the 5870's 1GB. However, it's worth noting that 2GB variants of the 5870 are now available for less than the 1.5GB GTX 480.
Anyway, what you don't get from the 480 is a huge performance advantage. Yes, it's a little quicker than the 5870. But not nearly as much as it needs to be given the extra cost and complexity.
AMD vs Nvidia: Cut-down cards
It's a similar story at the next rung down the graphics ladder. Both AMD and Nvidia offer slightly cut-down versions of their top GPUs. The Radeon HD 5850 is yours for £225 and retains 1,440 of the 5870's 1,600 stream shaders. Meanwhile, Nvidia's GeForce GTX 470 weighs in around £295 and packs 448 of Nvidia's mighty CUDA cores. The GTX 480, for the record, has 480 cores.
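A quick sketch of what those cut-downs amount to, using only the shader and core counts quoted above. Note that stream shaders and CUDA cores are not comparable across vendors; the percentages only tell you how much of each vendor's own full chip survives the cut.

```python
# How much of the full GPU each cut-down card retains, per the counts
# quoted in the article. Cross-vendor comparison of the raw unit counts
# is meaningless; only the within-vendor fraction is informative.

cards = {
    "Radeon HD 5850":  (1440, 1600),  # stream shaders vs the full HD 5870
    "GeForce GTX 470": (448, 480),    # CUDA cores vs the full GTX 480
}

for name, (kept, full) in cards.items():
    print(f"{name}: {kept}/{full} = {kept / full:.0%} of the full chip")
# Radeon HD 5850: 1440/1600 = 90% of the full chip
# GeForce GTX 470: 448/480 = 93% of the full chip
```

So both cut-down cards keep around nine-tenths of their flagship's shader hardware, which is why the performance gap to the top cards is modest in each case.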
Once again, the 470 is a little quicker than its AMD equivalent, but it's also much more expensive. From there, things get a little more complicated. AMD does a silly-money dual-GPU Radeon HD 5000 board, the 5970. In most tests of pixel-pumping prowess, it's the quickest thing out there (Nvidia has yet to wheel out a dual-GPU take on the GTX 400 series). But just occasionally its dual-GPU architecture and split-memory setup gets the better of it.
Move into mid-range territory and direct comparisons between ATI and Nvidia are currently a bit tricky. That's because Nvidia has yet to release more affordable chips based on Fermi, the new DX11 architecture that underpins the GTX 480 and 470 GPUs.
Consequently, the Radeon HD 5770 (£125), Radeon HD 5670 (£85) and Radeon HD 5570 (£72) are lording it without any DX11 competition.
Instead, Nvidia makes do with older chipsets based on DX10 tech, such as the GeForce GTS 250 (£125) and GeForce GT 240 (£72).
Advantage Nvidia?
That said, Nvidia has recently released an even more cut-down version of the Fermi chip in the new GeForce GTX 465, on sale now from around £230. But what it really needs is some pukka mid-range DX11 chips to take the fight to AMD. And it needs them soon. AMD has already released its family of second-generation DirectX 11 GPUs.
In the meantime, it's not all bad news for Nvidia. Arguably, it has an edge in at least one important DX11 feature, the hardware tessellator. Designed to spew out huge numbers of polygons and therefore give games more geometric detail and realism than ever before, the tessellator could prove to be the killer feature in DX11. Early tests suggest Nvidia's chips have more tessellation power than ATI's.
Nvidia is also way ahead of ATI when it comes to stereoscopic 3D. Nvidia's 3D Vision technology is the best way to get 3D on your PC today. It works with a large number of games and is also compatible with certain formats of 3D movies including Blu-ray 3D. But like most other 3D display technologies, wearing a pair of geeky goggles is the price of participation.
Read more: http://www.techradar.com/news/computing-components/graphics-cards/amd-vs-nvidia-who-makes-the-best-graphics-cards--699480
AMD vs Nvidia: who makes the best graphics cards?
2:50 AM
mari masak
Labels:
technology