Watts

Latest

  • Calxeda benchmarks claim that its server chips are 15 times more power efficient than Intel's

    by Daniel Cooper
    06.21.2012

    Calxeda may have been given the bum's rush by HP's Project Moonshot, but the company isn't taking it lying down. It has released benchmarks for its ARM-based server technology claiming it's 15 times more power-efficient than a comparable Intel Xeon. Rigging up a 1.1GHz EnergyCore ECX-1000 with 4GB of RAM against a 3.3GHz Xeon E3-1240, the former consumed only 5.26 watts compared to the 102 watts of Intel's high-spec chip. While it certainly wasn't faster, power efficiency is a key concern for data centers looking to keep costs down, and if the trend continues, Santa Clara will come to regret AMD's recently announced love-in.
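
    For a back-of-the-envelope sense of how a roughly 19x gap in raw wattage becomes a 15x efficiency claim once performance is factored in, here's a minimal sketch in Python. Only the two power figures come from the article; the benchmark scores are hypothetical placeholders, chosen purely to illustrate how performance-per-watt is compared.

        # Back-of-the-envelope performance-per-watt comparison.
        # Power figures are from the article; the throughput scores are
        # hypothetical stand-ins for a benchmark result, not Calxeda's data.
        calxeda_power_w = 5.26    # EnergyCore ECX-1000 under load
        xeon_power_w = 102.0      # Xeon E3-1240 under load

        calxeda_score = 1.0       # hypothetical benchmark score
        xeon_score = 1.3          # hypothetical: assumes the Xeon finishes the work faster

        calxeda_perf_per_watt = calxeda_score / calxeda_power_w
        xeon_perf_per_watt = xeon_score / xeon_power_w

        print(f"Raw power ratio: {xeon_power_w / calxeda_power_w:.1f}x")                 # ~19.4x
        print(f"Efficiency ratio: {calxeda_perf_per_watt / xeon_perf_per_watt:.1f}x")    # ~14.9x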

  • Ultra Products unveils 2000-watt X3 ATX power supply

    by Darren Murph
    01.05.2007

    We're all about watching new "world's (insert adjective here)" gizmos become a reality, and we certainly don't mind the occasional dash of overkill, but Ultra Products' forthcoming power supply takes "insane" to another level. In what's presumably the world's largest, most powerful PSU to call an ATX case home, the 2000-watt Modular X3 comes in at 10.25 inches in length and will reportedly fit "wherever a PC Power & Cooling 1000-watt version will". The +12V rail alone is rated at 1,800 watts, which means it can purportedly handle a 150A load, and just might cause some sort of small disaster if that were actually achieved. Nevertheless, the smorgasbord of connectors allows for more power connections than most could even fathom needing, and Ultra believes this PSU should remove all worry over whether or not your rig "has enough juice." While it's easy to brush this off as completely absurd, the latest AMD scorchers combined with a few NVIDIA GeForce 8800 GTX or ATI R600 cards could easily eat up a good bit of the supplied power, so if a ginormous power supply was the only thing keeping you from finishing up your energy-sucking rig, Ultra's X3 should be available sometime this quarter for "less than $499." [Via Digg]
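
    That 150A figure is just the rated wattage divided by the rail voltage (I = P / V); a quick sketch using only the numbers quoted above:

        # Current a +12V rail can deliver at its rated wattage: I = P / V
        rail_voltage = 12.0      # volts, the +12V rail
        rail_rating_w = 1800.0   # watts, the rating quoted by Ultra

        max_current_a = rail_rating_w / rail_voltage
        print(f"Maximum +12V rail current: {max_current_a:.0f} A")  # 150 A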

  • Next-gen nVidia and ATI GPUs to require 200 watts?

    by Darren Murph
    08.20.2006

    So we've got CPUs moving toward more efficient designs, a (somewhat) general consensus that energy-saving methods should be applied wherever possible, and yet somehow we see a steady increase in power usage from graphics processing units each time they're updated. It doesn't take a circuitry engineer (although that qualification could be helpful) to understand that higher clock speeds generally lead to higher frame rates, and companies like nVidia and ATI have apparently been taking the high road in order to boost those FPS figures. Current top-end graphics cards can easily suck down 100 watts (or more) under full load, but it appears that requirement could nearly double -- the two graphical big shots are reportedly looking to offer powerhouse cards that almost need to be wired straight into the power station to function. Japan's PC Watch has reported that the forthcoming G80 from nVidia may consume up to 175 watts, while ATI's R600 could demand an unprecedented 200 watts to produce its eye candy. While these figures may be a bit staggering, they aren't exactly shocking -- we've already heard rumors that ATI's next major chipset may be released in an external form, at least partially for power-related reasons. While we can't say for sure what's happening behind the tightly sealed doors at nVidia or ATI, we hope someone in there is paying attention to these concerns, because we're not exactly cool with needing a separate power strip (and an additional occupation) to feed our gaming habits.

  • NVIDIA denies enthusiasts the Quad-SLI goodness

    by Conrad Quilty-Harper
    06.05.2006

    Techreport has posted a review of NVIDIA's latest dual-GPU graphics card, the GeForce 7950 GX2, which also happens to be capable ("capable" being the key word) of Quad-SLI. You won't be surprised to read that this card is fast compared to its predecessors. It positively destroyed all the other single-GPU cards the Techreport guys tested it against; in Battlefield 2 the GX2 managed "twice the average frame rate of the GeForce 7900 GT." As you probably already know, this kind of performance doesn't come cheap. NVIDIA expects the 7950 GX2 to cost around $599 to $649, and that's before you check your power bill: in tests the card drew 133 watts at idle and a whopping 237 watts under load. Compared to the card's main single-GPU rival, ATI's X1900, the 7950 featured similar levels of power consumption, size and heat output but performed significantly faster in all the benchmarks. The 7950's dual-GPU solution also surpasses the performance of traditional SLI configurations like dual 7900 GTs, with the added advantage of being compatible with any PCI-e motherboard chipset. Strangely, the biggest problem the review found had nothing to do with the card itself. Although the 7950 GX2 is perfectly capable of being partnered up with another card to make a Quad-SLI system, NVIDIA refuses to support this type of configuration, citing the "complexity" involved. The only way you'll be able to get a Quad-SLI setup is by either hacking two cards together or by purchasing a (some say overpriced) system from Alienware, Falcon Northwest or Dell. The company went as far as refusing to supply the website with a second review card. As the reviewer points out, "when explaining to your best customers why they can't purchase two of your $649 video cards for themselves without also buying a $5K PC built by someone else, it's probably not a good idea to use a shaky excuse with an embedded insult."

  • Apple and Intel weren't kidding about "low power"

    by David Chartier
    02.21.2006

    Tom Yager over at InfoWorld has performed some power tests on a 20" Dual Core iMac, only to discover that these machines don't just meet Apple's bold low-power specs - they surpass them. Apple lists the maximum power consumption of a 20" Dual Core iMac at 120 watts, while Tom's tests - even with both 2.0 GHz cores maxed at 100% CPU usage, 1 GB of RAM, WiFi, BT, a 128 MB graphics card and (oh yeah) a 20" LCD - found the iMac drawing a steady 95 watts of power. Assuming that a typical LCD draws around 32 watts on its own, that means the iMac - even at full throttle - is running as a 63-watt personal computer. By comparison, Intel's old Pentium 4 architecture that still ships in many computers needs anywhere from a 300- to 400-watt power supply - and that's just for the computer itself, sans display. I should know; I used to build them for a living. Ultimately, this should boil down to great news for the computing industry. Tom Yager is so excited about the results that he's issued a friendly challenge to the PC market to find a machine that can claim the same stats. One question about these new chips still bothers me, however: why hasn't Apple placed at least an estimated battery life rating on the MacBook Pro? [via MacSlash]
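
    The 63-watt figure is simply the measured total minus the assumed display draw; a quick sketch with the article's numbers (the 32-watt LCD estimate is the assumption the whole calculation rests on):

        # Estimate the iMac's compute-side power draw by subtracting the display.
        measured_total_w = 95.0   # whole iMac under full CPU load, per Yager's test
        assumed_lcd_w = 32.0      # typical 20" LCD draw, as assumed in the article

        computer_only_w = measured_total_w - assumed_lcd_w
        print(f"Estimated draw sans display: {computer_only_w:.0f} W")  # 63 W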