Even though I (along with probably everyone else on this planet) am sick and tired of hearing the word “recession”, we can’t escape the fact that the economy and our savings are taking one hell of a beating. Because of this, more and more people are being forced to take a step back, reevaluate what they are doing with their money and shift cash from unnecessary purchases to the essentials. That means the high tech industry in particular has been hit hard, considering many people are holding off on purchasing big-ticket items such as computers or upgrades. You can only guess where that leaves $400 graphics cards. What the market needed was a budget-conscious graphics card that paired some serious muscle with a great price. Enter the ATI HD 4770.
The HD 4770 is the poster boy for the changing face of ATI. They have forgone the push for a power-hungry monolithic architecture and have instead focused their energies on scalable, more compact GPU cores which can be used for multiple cards in various price ranges. This was first accomplished with the 55nm HD 4800-series and later perfected with the lower-end HD 4600 series and, more recently, the HD 4890 1GB. As technology has progressed, so too has ATI’s shift towards more price- and efficiency-conscious manufacturing processes. While Nvidia has had some serious issues moving from 65nm to 55nm, ATI has jumped directly to 40nm chips with the HD 4770. What this means for power consumption has yet to be seen, but what we can tell you is that a smaller manufacturing process could conceivably allow ATI more pricing flexibility in the long run due to lower production costs. That should translate into a better pricing structure.
We don’t usually post non-technical PR slides but we feel like the one above is quite telling. Nvidia has been spinning their wheels for some time now with the same G92 core and until recently, their 9800 GT had very little competition. The HD 4830 changed this somewhat but it was not able to best Nvidia’s renamed card in the majority of benchmarks. As such, Nvidia has been getting away with a ~$115 USD price on their mid-range card while the HD 4830 goes for under $100. The HD 4770 with its GDDR5 memory should change that pricing scheme quite a bit since it should outperform the 9800 GT by a relatively wide margin. What this means for the HD 4830 is anyone’s guess but like we always say: competition is a great thing.
If we step back and look at the last few months, ATI seems to have really hit their stride, and from the 40nm manufacturing process to the GDDR5 memory, the HD 4770 definitely looks promising. But can it deliver?
ATI HD 4770 Specs / Market Positioning
Even though ATI’s lineup is a bit further-reaching than what you see here (the HD 4870 X2, HD 4650 and lower-end cards were omitted), it is quite apparent that some very interesting things are happening with the release of the HD 4770. Recently, the HD 4830 received a pretty steep price cut, to the point where mail-in rebates are beginning to bring its price closer to the $80 USD mark. We can assume this was done in order to clear out excess stock before the HD 4770 512MB made its presence felt.
By taking a close look at its specs, it is quite obvious that the HD 4770 512MB will be taking the place of the 55nm HD 4830. The number of ROPs, Texture Units and Stream Processors is a carbon copy of what we saw with the HD 4830, but where the new RV740 core differs is in its smaller 40nm manufacturing process and internal clock speed. The 40nm manufacturing process seems to have helped in two areas: bringing power consumption to a more than manageable level of around 80W under load and allowing ATI to run the core at reasonably high clocks.
Out with the old, in with the new?
However, the one crowning achievement of the HD 4770 is the inclusion of GDDR5 memory. Before today, GDDR5 had been reserved for higher-end cards which carried with them performance and prices that mid-range products could not hope to match. Well, that has now changed with a $99 card sporting some of the fastest memory available on the market. This in turn has allowed ATI to forgo the usual 256-bit wide bus and replace it with a less expensive 128-bit affair. I am sure that some of you will roll your eyes at seeing the specifications for the memory bus, but I highly suggest you hold off on any rash conclusions until you see how this card actually performs.
ATI used to be the underdog and many of us naturally rooted for them to succeed for that one fact alone. Now they have finally come into their own once again and have an extremely strong lineup to show for it. The HD 4770 seems to be the perfect fit for the current economic conditions and if it is available in large enough numbers, it will take the market by storm. Or at least that’s what ATI hopes…
The R700-series Features
In this brave new world of parallel processing, both ATI and Nvidia are racing to take advantage of the potential the modern GPU core has locked away within its confines. What we will soon see is a massive increase in the performance of certain applications like video transcoding, Folding and physics calculations. ATI has been on this bandwagon for some time now with their Folding@home client, which first came out for X19xx-series graphics cards, made the jump to the R600 / RV670 cores a while ago and has now been moved over to the new HD4800 / HD4700 / HD4600-series as well. With their massive number of stream processors, the RV700-series cards should be able to handle any application thrown at them. Let’s take a look at what else ATI has to offer in terms of additional features.
Even though DX10.1 is a minor update to the Vista-exclusive DX10, ATI feels that its implementation will benefit gamers quite a bit in today’s market. Let’s cut right to the chase: DX10.1 doesn’t offer anything outlandish in terms of new features, but it does offer new paths for developers to simplify their code, which in turn has the potential to increase performance in certain areas. At present, among the “big two” graphics processor manufacturers, ATI is the only one which supports DX10.1.
Even though we run the risk of editorializing here, we have to say that ATI’s acceptance of the DX10.1 API seems to be the right thing to do in today’s graphics card industry. After seeing first-hand the performance benefits it brings when applying AA in a DX10 environment in games like Assassin’s Creed, we can only express disappointment and outright shock that other GPU manufacturers haven’t followed ATI’s lead. Consumers have been left high and dry without any reason to purchase an OS with DX10, for the simple fact that the performance impact of DX10 does not justify its minor graphical benefits. DX10.1 works to alleviate those performance hurdles by offering developers more options when producing their games. We can only hope that ATI’s present generation of cards becomes widespread enough that more game developers will implement DX10.1 into their titles.
Up until the HD2900-series was introduced, running more than one ATI card was a clumsy affair which involved external cables and more headaches than should have been necessary. Then they introduced their very own Crossfire bridge connector, and it was all sunshine and roses since daisy chaining two, three or even four cards together became possible. This technology continues today with the HD4000-series cards, and AMD has promised that users will get better drivers, quick driver revisions and better acceptance among game developers.
In ATI’s never-ending quest to offer us the most power savings possible they have introduced something called PowerPlay. This technology allows the Catalyst software to dynamically adjust voltages and core speeds depending on the application it is being used for. This results in less idle power consumption and power being distributed when and where you need it.
When AMD and Havok announced their partnership to optimize the Havok physics engine to run on ATI hardware, many enthusiasts perked up and listened. Havok Physics has been implemented in a vast variety of games from every genre, and the vast majority of the industry’s upcoming blockbuster titles (including StarCraft II) support it. This not only gives ATI’s physics push a massive installed user base but it also guarantees that there will be games with Havok released for years to come. With both ATI and Nvidia firmly entrenched in the war to bring physics processing to wider market acceptance, we may look back at this point in time as the moment when the renaissance of in-game physics really began.