So we've got CPUs moving towards more efficient designs, a (somewhat) general consensus that energy-saving methods should be applied wherever possible, and yet somehow we see a steady increase in power usage from graphics processing units with each new generation. It doesn't take a circuit engineer (although that qualification could be helpful) to understand that higher clock speeds generally lead to higher frame rates, and companies like nVidia and ATI have apparently been taking the high road in order to boost those FPS figures. Current top-end graphics cards can easily suck down 100 watts (or more) under full load, but it appears that requirement could nearly double -- the two graphical big shots are reportedly looking to offer powerhouse cards that almost need to be wired straight into the power station to function.

Japan's PC Watch has reported that the forthcoming G80 from nVidia may consume up to 175 watts, while ATI's R600 could demand an unprecedented 200 watts to produce its eye-candy. While these figures may be a bit staggering, they aren't exactly shocking -- we've already heard rumors that ATI's next major chipset may be released in an external form, at least partially for power-related reasons. While we can't say for sure what's happening behind the tightly-sealed doors at nVidia or ATI, we hope someone in there is paying attention to these concerns, because we're not exactly cool with needing a separate power strip (and an additional occupation) to feed our gaming habits.