GPU

Latest

  • The amazing 360 GPU shrinks to 65nm this Fall

    by Dustin Burg
    04.30.2007

    According to sources close to equipment manufacturers, later this year we'll be ushering in a new and improved Xbox 360 GPU. A report in the Chinese-language Commercial Times states that the new 65nm 360 GPU will enter production in May and make its way into Xbox 360 consoles this fall. And we all know the benefits of 65nm technology: smaller dies run cooler, draw less power, and cost less to produce. So, is anyone in the market for a new 360, or jumping in for the first time, going to hold out until the 65nm parts make it into the console? Or couldn't you care less unless it meant an Xbox 360 price cut?
    [Via Joystiq]

  • Xbox 360 Xenos GPU shrinks to 65nm this fall

    by Christopher Grant
    04.30.2007

    We already knew the 65nm version of Xenon, the Xbox 360's triple-core CPU, had slipped from Q1 2007 to "mid-2007," just missing the Xbox 360 Elite and making irony the system's most prominent feature. But the CPU isn't the only thing getting shrunk: DigiTimes reports that the Xbox 360 will be equipped with a 65nm Xenos GPU this fall as well, with production of the new graphics chips scheduled for May. For those of you on the fence about the Elite, a newer, cooler, and cheaper (to manufacture) Xbox 360 should be here in the fall. The question: wait or jump in?
    [Thanks, Fandel]

  • AMD names names: R600 now the ATI Radeon HD 2900 XT

    by Paul Miller
    04.13.2007

    AMD is dropping the "X" prefix from its ATI graphics lineup and slapping on "HD" to denote the changes and advancements in its R600-based DirectX 10 cards. At the top sits the Radeon HD 2900 XT (the rumors were close), with 320 stream processors, more than double that of NVIDIA's GeForce 8800 GTX. The lower-end RV630- and RV610-based cards will be sold as the HD 2600 Pro / XT and the HD 2400 Pro / XT. The "HD" on all these cards denotes the onboard Avivo HD technology for decoding H.264 and VC-1 video off of Blu-ray and HD DVD discs. The 2900 series also gets full HDMI output with integrated 5.1 surround sound. No exact launch date or pricing yet, but we shouldn't have to wait much longer.

  • NVIDIA GeForce 8600 and 8500 launch deets outed

    by Paul Miller
    04.09.2007

    Still haven't scraped together enough change to get yourself a fancy new 8800? Well, just wait 10 more days and you can get (a little bit of) that hip-cool technology for peanuts. The GeForce 8600 GTS, 8600 GT, and 8500 GT are all due on April 17th, hitting the $199-$229, $149-$159, and $89-$129 price points, respectively. The specs scale nicely, from 256MB of GDDR3, a 675MHz core clock, and a 1,000MHz memory clock at the top end down to 128MB or 256MB of DDR2 or GDDR3, a 450MHz core clock, and a 700MHz memory clock at the bottom. Before too long, the even more basic 8400 GS and 8300 GS will round out the set, but hopefully you won't have to stoop that low to get your DirectX 10 on.

  • NVIDIA shows off new mid-range 8300, 8500GT, and 8600GT DX10 graphics cards

    by Conrad Quilty-Harper
    03.20.2007

    They may not match the performance of the super-high-end graphics cards we've seen from NVIDIA and ATI recently, but mid-range graphics cards are inevitably the most popular with gaming enthusiasts, as they invariably offer the most bang for your buck. At CeBIT last week, NVIDIA apparently accidentally put several graphics cards on display labeled simply as "New DX10 Graphics Card with HDMI" -- in fact its new mid-range 8000-series GeForce cards. On show were the passively cooled 8300 and the higher-specced 8500GT and 8600GT (pictured). All feature HDCP-capable HDMI ports (with sound routed to the cards via S/PDIF), dual-link DVI, and 256MB of video RAM each. No word on a release date, but as with every other range of mid-range GPUs, you can probably expect the prices to be significantly lower than their bigger brothers' (along with the performance, too).
    [Via Trusted Reviews]

  • AMD integrates ATI Radeon X1250 into Vista-certified 690 chipset

    by Darren Murph
    02.28.2007

    Although ATI's R600 graphics chip may have hit a recent snag, it seems the company's Radeon X1250 GPU is coming along just fine, as it claims the proud title of "world's first" integrated graphics unit to receive Vista certification. The chip, of course, is a critical piece of AMD's latest 690-series chipset, which brings together Aero-capable integrated graphics, 1GHz HyperTransport interface speeds, support for Sempron and Athlon 64 / 64 FX / 64 X2 processors, PCI Express, Microsoft's DirectDraw, hardware acceleration for MPEG-2 / MPEG-4 and WMV9, TV output, HDCP-compatible DVI / HDMI outputs, and the general smorgasbord of connectors we're all used to seeing these days. Putting the resources of the ATI / AMD merger to full use, the 690 family claims to be the first chipset from the pair to support ATI's Avivo technology, which purportedly makes your multimedia experience within Vista a smooth affair. AMD's latest should be available right now for an undisclosed price, but we'd wait for a few hands-on opinions before skipping a dedicated GPU in your next rig.
    [Via 64-Bit-Computers]

  • Samsung's GDDR4 graphics memory goes to 2000MHz

    by Darren Murph
    02.23.2007

    While Samsung has been dabbling in the RAM world quite a bit of late, kicking out the micro-sized OneDRAM and cellphone-bound gigabit-density DRAM, the firm is now officially loosing its 2,000MHz GDDR4 RAM on the world. While the zippy memory is already found on ATI's Radeon X1950 card, it'll now be available en masse to graphics card producers in order to speed up current and future offerings "by up to 66-percent." Built on 80-nanometer production technology, the memory boasts a 4Gbps per-pin throughput, nearly two-thirds higher than the widely used 2.4Gbps GDDR4 variety out now; additionally, it'll be offered to vendors in a 512Mb density, rock a 32-bit data bus configuration, and utilize "JEDEC-approved standards for signal noise reduction to help attain the highest possible speed." No word on how much Sammy plans on charging speed-freaked manufacturers, nor how long it'll take for a kilowatt-burnin' card other than the X1950 to include such a luxury, but we wouldn't count on it being too much longer, regardless.
    [Via TGDaily]
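
    For the curious, here's the back-of-the-envelope math behind those figures -- a quick illustrative sketch using only the numbers quoted above (the per-chip throughput line assumes a peak rate with zero overhead):

        # Rough bandwidth math for the figures above (illustration only).
        per_pin_new = 4.0   # Gbps per pin, Samsung's new 2,000MHz GDDR4
        per_pin_old = 2.4   # Gbps per pin, the widely used GDDR4 of today
        bus_width = 32      # bits per chip, as stated above

        # Gain of the new parts over the common 2.4Gbps variety
        print(f"{(per_pin_new / per_pin_old - 1) * 100:.0f}% faster per pin")  # ~67%, in line with the "up to 66-percent" claim

        # Peak throughput of a single chip: per-pin rate times bus width, in GB/s
        print(f"{per_pin_new * bus_width / 8:.0f} GB/s per chip")  # 16 GB/s, assuming zero overhead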

  • Thermaltake kicks out Quad GPU-ready power supplies

    by Darren Murph
    02.15.2007

    There's just nothing quite like the smell of four graphics cards burning through energy in the morning to get you amped for the day, and Thermaltake is making sure you've got the juice and the connectors to make it happen. While not quite as mighty as Ultra Products' 2,000-watt behemoth, the 1,200- and 1,000-watt Toughpower PSUs support both AMD and Intel rigs and pack a trio of six-pin PCI-E connectors plus eight SATA and Molex ports; the pair also claims a "world's first" by collectively being numero uno to include eight-pin PCI-E connectors. The backwards-compatible ports let gamers hit up a bit of Quad SLI action on their own, and considering the massive power draw demanded by such cards, it's good to know you've got that much energy on tap for those graphically straining firefights. There's no pricing information available just yet, but the W0133 / W0132 should be available soon, and if you're already considering throwing down for four high-end GPUs, we highly doubt cash flow is your primary concern.
    [Via FarEastGizmos]
    Read - Thermaltake Toughpower W0133
    Read - Thermaltake Toughpower W0132

  • Shots surface of ATI's R600 -- and boy is she a big one

    by Paul Miller
    02.11.2007

    We already know that this little monster hums along at quite a clip, but how does it look? Monstrous, of course. Those of you hoping to get off easy on case size and power supply requirements are going to have to think again: ATI's R600 tops out at a record-busting 12.4 inches in length. ATI will have two SKUs of the R600 at launch: the XTX, which features 1GB of GDDR4 RAM, and the R600 XT, with a mere 512MB of GDDR3 -- weak sauce, we know. The XTX comes in retail and OEM versions, and it's the OEM one (pictured) that really gets outlandish, with the 12.4-inch length and 270W of power consumption. The retail XTX cuts that down to 9.5 inches and 240W, while the weaker XT matches those specs. A little further down the road, ATI will follow these up with the R600 XL, which should be cheaper and hopefully less demanding. Just for a frame of reference: NVIDIA's 8800 GTX was deemed outlandish with its longest-ever 10.4-inch length and beefy 165W power requirement. How much bigger are these things going to get? Oh, that's right -- once they get outside our PCs there just won't be any stopping 'em.
    [Via fx57; thanks, Noah D]

  • Asus XG Station external GPU, it works

    by James Ransom-Wiley
    01.09.2007

    Engadget scored a CES hands-on look at the Asus XG Station, an external GPU capable of converting your aging laptop into a respectable gaming rig. The demo featured two identical laptops running integrated Intel GMA 945 graphics. Asus reps attached the XG Station to one and left the other bare -- and struggling. The test seemed to prove that the XG Station, which also simulates 5.1 surround sound, is an adequate option for underpowered-laptop owners who enjoy the occasional peek at what's good in PC gaming.

  • AMD Stream Processor launched, uses GPU power for general tasks

    by Cyrus Farivar
    11.14.2006

    Well, AMD's achieved that stream computing thing (with the help of its latest acquisition, ATI) that we've all been hearing about for the last month and change. At the Supercomputing 2006 show down in Tampa, Florida, the company announced what it claims is the "world's first dedicated stream processor." The new GPU, creatively named the "AMD Stream Processor," is a PCI Express card loaded up with 1GB of GDDR3 memory. TG Daily reports that the new processor is based on the R580 graphics chip used in ATI's Radeon X1900 cards and that it can potentially reach 375 gigaflops. As always, we take these numbers with a healthy dose of skepticism -- while AMD's performance claims may hold up in the lab, it's pretty unlikely that your everyday computer applications will see that kind of benefit from all those floating-point operations.
    [Via TG Daily]
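
    To make the "stream processing" idea a bit more concrete, here's a minimal, hypothetical sketch of the kind of workload such a card is built for: one independent floating-point operation per element of a large array, which is exactly where hundreds of GPU ALUs outrun a couple of CPU cores. Plain NumPy stands in here purely for illustration; this is not AMD's actual stream SDK.

        import numpy as np

        # A "stream" kernel: the same independent floating-point op applied to
        # every element. On a GPU each element can be processed in parallel;
        # on a CPU the loop runs across just a handful of cores.
        def saxpy(a: float, x: np.ndarray, y: np.ndarray) -> np.ndarray:
            return a * x + y

        x = np.random.rand(1_000_000).astype(np.float32)
        y = np.random.rand(1_000_000).astype(np.float32)
        print(saxpy(2.0, x, y)[:5])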

  • Stanford University tailors Folding@home to GPUs

    by Darren Murph
    09.29.2006

    Apparently the insane number of gigaflops that your modern-day graphics card can churn out is nothing short of a phenomenon, as Folding@home forefather Vijay Pande has tailored a new piece of software to harness the raw processing power of GPUs. Pande claimed that even the latest dual-core CPUs can't hold a candle to the floating-point performance of ATi's X1900 / X1950 graphics cards: he estimated a Core 2 Duo chip could push about 25 gigaflops of folding power, while a high-end off-the-shelf ATi card could unleash a whopping 375 GFLOPS, which is about "20 to 40 times more speed" than the project has seen thus far. The team has also optimized the algorithms in the GPU-centric software, which is expected to add another "10 to 15 times" more speed on top of the GPU's already impressive performance figures. Currently, the beta version is limited to the X1900 lineup, but there are plans to include the X1800 series in the near future, and Pande even mentioned that a PlayStation 3-friendly version is in the works. So if you aren't too busy tweaking your GPU-based supercomputer (or stressing over your energy bill), why not put those excess GFLOPS to good use through Engadget's own Folding@home team, yeah?
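
    For anyone trying to square those numbers, here's the simple arithmetic behind them -- our own illustration, using only the figures quoted above:

        # Raw FLOPS figures quoted above, and the ratio they imply.
        cpu_gflops = 25    # Pande's estimate for a Core 2 Duo
        gpu_gflops = 375   # estimate for a high-end ATi X1900 / X1950 card

        print(f"GPU vs. Core 2 Duo: {gpu_gflops / cpu_gflops:.0f}x")  # 15x in raw FLOPS
        # The "20 to 40 times" quote is relative to what the project has seen
        # from typical contributed machines, and the extra "10 to 15 times"
        # comes from the GPU-optimized algorithms rather than raw FLOPS.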

  • Peakstream software taps into GPU for supercomputing power

    by Darren Murph
    09.19.2006

    While building a supercomputer has been whittled down to a science, Peakstream has developed a suite of applications that looks to those speedy PCI Express slots -- not the CPU socket -- for an extra boost of power. The company boldly states that a supercomputer can be created by harnessing the power of "common CPUs combined with the resources of modern graphics cards" to increase performance by "20x." This extreme form of load balancing exploits the tremendous potential housed in today's GPUs to schedule workloads, offload tasks onto the optimal processor(s), and manage calculations so as to minimize the queue of tasks to be completed. Granted, the biggest boon of a graphics processor is its extraordinary floating-point performance; for instance, ATi's X1950 XTX pumps out 750 GFLOPS in dual-graphics mode, while it'd take 31 Intel Xeon 5100 CPUs to crank out the same figure -- thus Peakstream feels that mathematical and computational applications (sorry, Doom fans) are best suited to its software. While having your own personal supercomputer churning through those Engadget Folding@home cycles would be mighty impressive, the average joe isn't apt to drop $2,000 (per node) for Peakstream's suite, but maybe this explains the real intentions behind those 200-watt, energy-sucking, externally-housed graphics cards after all.
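
    The Xeon comparison is easy to sanity-check; here's a quick sketch of the implied per-chip figure (our arithmetic, not Peakstream's):

        # Implied per-chip number behind the comparison above.
        gpu_gflops = 750   # ATi X1950 XTX in dual-graphics mode
        xeon_count = 31    # number of Xeon 5100s quoted as equivalent

        print(f"~{gpu_gflops / xeon_count:.0f} GFLOPS per Xeon 5100")  # roughly 24 GFLOPS
        # A few tens of GFLOPS per dual-core server CPU is the ballpark that
        # makes the "20x" speedup claim at least arithmetically plausible for
        # purely floating-point-bound workloads.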

  • Next-gen nVidia and ATI GPUs to require 200 watts?

    by Darren Murph
    08.20.2006

    So we've got CPUs moving toward more efficient designs and a (somewhat) general consensus that energy-saving measures should be applied wherever possible, and yet somehow we see a steady increase in power usage from graphics processing units every time they're updated. It doesn't take a circuitry engineer (although that qualification could be helpful) to understand that higher clock speeds generally lead to higher frame rates, and companies like nVidia and ATI have apparently been taking that route to boost those FPS figures. Current top-end graphics cards can easily suck down 100 watts (or more) under full load, but that requirement could nearly double -- the two graphical big shots are reportedly looking to offer powerhouse cards that almost need to be wired straight into the power station to function. Japan's PC Watch reports that the forthcoming G80 from nVidia may consume up to 175 watts, while ATI's R600 could demand an unprecedented 200 watts to produce its eye candy. While these figures may be a bit staggering, they aren't exactly shocking -- we've already heard rumors that ATI's next major chipset may be released in an external form, at least partially for power-related reasons. We can't say for sure what's happening behind the tightly sealed doors at nVidia or ATI / AMD, but we hope someone in there is paying attention to these concerns, because we're not exactly cool with needing a separate power strip (and an additional occupation) to feed our gaming habits.

  • AMD to keep ATI brand, may create more integrated chips

    by Ludwig Kietzmann
    08.08.2006

    After some initial rumblings that indicated otherwise, AMD has reaffirmed its desire to attach the ATI brand to several of its forthcoming product lines. "The ATi name will live on at AMD as our leading consumer brand, and so will the Radeon brand and other ATi product brands," says spokesperson Eric DeRitis. "AMD's executive management knows very well the power and value of branding, and ATi's branding is some of the most valued in the global technology industry. As such, we plan to keep it. Period."

    The nature of the products to be branded as such has yet to be fully disclosed, but AMD is already hinting at providing more integrated graphics solutions. Indeed, the branding may become especially vital in the face of the widespread (and arguably correct) perception that "integrated graphics" is merely a shorter way of referring to that worthless piece of tech that came with your computer and can barely push two frames per second in the latest Tiger Woods game. According to Richard Baker, the company's marketing manager for Europe, AMD sees integrating graphics acceleration directly into the CPU as the next logical step: "So, in much the same way as a floating point unit is now integrated into the processor, I would expect to see joint single pieces of silicon for certain specialist markets too." Baker restrains the idea a bit, though, saying that AMD won't "integrate some steaming great big quad-core CrossFire engine into a CPU; that would be crazy. But if you're looking at entry level parts for emerging markets, where a very simple GPU could be integrated, then that could be possible."

    The true fallout of the AMD / ATI deal will likely become most evident once the new product lines show up, which, if Baker is to be believed, could happen as early as next year.

    Read - AMD stays hand over ATI brand axe
    Read - AMD hints at integrated graphics and physics acceleration in CPUs

    Previously:
    AMD to buy ATI for $5.4 billion
    Nvidia on ATI: "basically throwing in the towel"
    ATI responds to Nvidia, clears up post-takeover rumors

  • Nvidia Quadro Plex 1000 goes nuts with 80 billion pixels-per-second

    by Paul Miller
    08.02.2006

    Maybe Nvidia's recent tough talk following AMD's acquisition of ATI wasn't just talk. They're backing up those words with some serious graphics muscle: a unit called the Quadro Plex 1000 that can pump out 80 billion pixels per second for pro graphics needs. Like, really pro graphics needs. Before we hurt ourselves trying to figure out just exactly how many Marios that is, we'll run the rest of the specs by you. The Quadro Plex comes in desktop (pictured) or 3U rackmount configurations, and is designed for working with 12-megapixel HD video, 3D graphics, scientific visualization, simulations, and whatever else needs that kind of GPU power. One node packs eight Quadro FX cards, jammed into two Plex 1000 systems and paired up via SLI, all hosted by a 32-bit Intel or 64-bit AMD machine running Windows or Linux. All that juicy Quake II power (we keed! we keed!) can be yours starting at a mere $17,500, and should be available beginning next month.

  • ATI responds to Nvidia, clears up post-takeover rumors

    by Ludwig Kietzmann
    07.27.2006

    Not too long after Nvidia CEO Jen-Hsun Huang described AMD's purchase of ATI as a "gift" and a sign of its main competitor "throwing in the towel," comes this retort from the newly formed CPU-GPU monster: "The PC market is a tough place to be without any friends. ATI now has all the resources of AMD behind it, and will be producing faster, more compact GPUs and reaching the channel more effectively than ever before. Nvidia's words are bravado, designed to confuse the market while the company tries to find a way to compete now that it's standing alone."

    Oh my, such comments are in desperate need of thermal paste and an officially endorsed heatsink-and-fan combination! Or, um, burn. While Nvidia is hardly "alone" in the market, the point ATI makes is not entirely without merit. In a considerably competitive hardware market, combining resources and technology can't easily be labeled a terrible strategy, and certainly not a white flag flapping in the wind. Both companies have valid points -- we'll see which is more convincing once they start releasing their next generation of wares.

    The rest of AMD's response is aimed at several rumors that have cropped up since the original takeover announcement. The company pledges to remain committed to GPUs and to supporting graphics solutions on Intel platforms, rubbishing the suggestion that Intel is revoking ATI's license. Perhaps that's where Nvidia and ATI can agree: a strange PC hardware format war would be unpleasant for all parties involved.

    [Via Voodoo Extreme; thanks, devian!]

  • NVIDIA is happy about AMD + ATI merger

    by David Chartier
    07.26.2006

    FiringSquad, a site covering all things gaming, interviewed Derek Perez, the Director of Public Relations for NVIDIA (ATI's largest competitor), about yesterday's news of the AMD and ATI merger. Mr. Perez looks at the merger as a boost to NVIDIA's own business, excited that it would be the only GPU company supporting both AMD and Intel. If that turns out to be true, however, the merger could wind up being a bad thing for Mac users, as Apple's machines are 100% Intel Inside (yes, I know that slogan is dead now). FiringSquad didn't get much out of Intel; its only comment was basically "we'll get back to you after we're finished reading all this legal mumbo jumbo."

    It's still way too early to tell, but I hope this merger doesn't mean that Apple's customers will lose one custom build option in the online store.

    [Via IMG]

  • Nvidia on ATI: "basically throwing in the towel"

    by Ludwig Kietzmann
    07.25.2006

    It seems that graphics powerhouse Nvidia is reacting rather well to news of AMD scooping ATI off the corporate shelf and filling in coupons to the value of $5.4 billion at the checkout counter. Speaking to BusinessWeek Online, Nvidia CEO Jen-Hsun Huang described the purchase as a "gift," presumably whilst reclining in a henhouse, sipping cocktails and counting objects of some kind. He went on to say that ATI was "throwing in the towel, leaving us as the only stand-alone (graphics chip) company in the world." Of course, not being a stand-alone graphics chip company hasn't stopped Intel from competing in that market, so perhaps winning the "who can be the last stand-alone company" competition isn't all that important.

    More important are the repercussions of such a large purchase. There are concerns that AMD's debt-to-capital ratio might take a turn for the worse after the company took out a $2.5 billion term loan to cover part of the purchase. Intel and Nvidia's chummy relationship may also prove to be a stumbling block should ATI's graphics chips ever be locked out of Intel machines. Still, AMD cautiously considers the potential benefits, such as major cost reductions and an entry point into the Intel-dominated laptop arena, to be worth the price and effort. Mr. Huang's expectations may turn out to be accurate in the long run, but in an industry that was once ruled by 3dfx Interactive (remember Glide?), anything can happen.

  • Virtua Fighter 5 in playable state

    by Nick Doerr
    06.30.2006

    According to an interview at IGN, Virtua Fighter 5 is already running in an advanced state on PS3 hardware; all that's missing are the home-specific features and tweaks. This is an update from the last post on this exciting next-gen fighter, which is now close to completion, but what was most interesting were the developers' comments about the PS3.

    Hiroshi Kataoka, president of SEGA-AM2, was quoted as saying, "considering its abilities, it's not expensive ... if that hardware was released not as PlayStation, but under the Vaio brand, and you got a Blu-ray drive, a Cell chip and the latest NVIDIA GPU for under 100,000 yen, you'd definitely call it cheap." He makes sure to add that if consumers consider it just a game machine, the price can seem pretty daring.

    The rest of the interview notes that while a multi-platform release would have been possible, it would have meant scaling the game back, so the team decided to stick with a PS3 exclusive for that reason.

    So there you have it! For fighting game fans out there, especially of the Virtua variety, there's a fantastic game coming to the PS3 alone ... and at the rate it's going, we may even see it earlier than expected. We hope.