GraphicsCard

Latest

  • External Thunderbolt graphics card for Macs to be developed soon, thanks to Facebook poll

    by Billy Steele
    08.04.2011

    Have you ever let your Facebook friends make a product development decision for your company? Well, Village Instruments has, running an online poll to gauge interest in an external Thunderbolt PCI Express graphics card enclosure. Dubbed the ViDock Thunderbolt, the device promises to dramatically improve the graphics performance of today's Apple machines. Running at speeds of up to 10Gbps, the new T-Bolt model can move data much faster than the company's current ExpressCard-connected external GPU. So if you're rocking the new MacBook Pro but have a hankering for more power out of your graphics card, you'd better start saving your Benjamins.

  • AMD announces the Radeon HD 6990M, has some pointed words for NVIDIA

    by Dana Wollman
    07.12.2011

    Here are five words you've heard before: "the world's fastest notebook GPU." Why, NVIDIA made just that claim two weeks ago, when it touted the GeForce GTX 580M as the nimblest card this side of Pluto. Not so fast, says AMD. The outfit just unveiled the Radeon HD 6990M with DirectX 11 and HD3D support, and it insists this is the speediest GPU on the block -- specifically, up to 25 percent faster than any other GPU that's been announced to the public. And yes, AMD's well aware of that 580M. Just like NVIDIA came out swinging, making pointed comparisons to the Radeon HD 6970M, AMD's got some fighting words of its own: the company says the 6990M can whip the 580M in the AvP benchmark and in games such as Batman: Arkham Asylum, Dragon Age 2, Shogun 2, BattleForge, Left 4 Dead, Metro 2033, Wolfenstein MP, The Chronicles of Riddick, and ET: Quake Wars. We don't need to remind you that these numbers merely represent the story each company wants to tell. Still, you get the idea: these are the top-of-the-line cards each has to offer at the moment, and they'll likely be competing for space in your next gaming rig. As you can imagine, the 6990M joins other Radeon HD cards in supporting the company's Eyefinity technology, as well as GPU app acceleration. Let it be known, too, that while the 6990M supplants the popular 6970M as far as performance claims go, AMD tells us the 6970M will still be available for the foreseeable future. Speaking of availability, the 6990M will be offered in the Alienware M18x -- right alongside NVIDIA's 580M. Additionally, you'll find it packed inside Clevo's P170HM and P150HM. And you didn't think we forgot about specs, did you? Head on past the break to find the full PR, along with a handful of technical details straight from the horse's mouth.

  • NVIDIA announces GeForce GTX 580M and 570M, availability in the Alienware M18x and MSI GT780R (updated: MSI says no)

    by Dana Wollman
    06.28.2011

    We know you're going to be shocked -- shocked! -- to hear this, but NVIDIA's gone and refreshed its high-end line of GeForce GTX cards. The GTX 580M takes the place of the GTX 485M, and NVIDIA's bragging that it's the "fastest notebook GPU ever," capable, we're told, of besting the Radeon HD 6970M's tessellation performance by a factor of six. The new GTX 570M, meanwhile, promises a 20 percent speed boost over the last-generation 470M. Both 40-nanometer cards support DirectX 11, OpenCL, PhysX, CUDA, 3D Vision, Verde drivers, Optimus, SLI, and 3DTV Play. As for battery life, NVIDIA's saying that when coupled with its Optimus graphics switching technology, the 580M can last through five hours of Facebook, but last we checked, that's not why y'all are shelling out thousands for beastly gaming rigs. You can find the 580M in the Alienware M17x and M18x (pictured) starting today, though you might have to wait a week or so for them to ship. Meanwhile, the 570M is shipping in the MSI GT780R as you read this, and you'll also find the 580M in a pair of 3D-capable Clevo laptops: the P170HM3 and the SLI-equipped P270WN. Handy chart full o' technical details after the break.

    Update: An MSI rep has let us know that, contrary to earlier reports, the GT780R is not currently available with the 570M graphics card. The company added that it will offer some unspecified laptop with the 570M sometime in the "near" future. It's unclear if that laptop will, in fact, be the GT780R.

  • Microsoft decides to pass on WebGL over security concerns (Update: iOS 5 supports WebGL, sort of)

    by Terrence O'Brien
    06.17.2011

    Well, it looks like Microsoft is taking those warnings about WebGL pretty seriously. The company has decided not to support the web-based 3D standard because it wouldn't be able to pass security muster. Highest on the list of concerns is that WebGL opens up a direct line from the internet to a system's GPU. To make matters worse, holes and bugs may crop up that are platform- or video card-specific, turning attempts to plug gaps in its defenses into a game of whack-a-mole -- with many players of varying reliability. Lastly, Microsoft, like security firm Context, has found current solutions for protecting against DoS attacks rather unsatisfying. Lack of support in Internet Explorer won't necessarily kill WebGL and, as it matures, Microsoft may change its tune -- but it's still a pretty big blow for all of us hoping the next edition of Crysis would be browser-based.

    Update: As is usually the case, Apple and the Windows folks are on opposite sides of this one. In fact, the Cupertino crew plans to bring WebGL to iOS 5 with one very strange restriction -- it will only be available to iAd developers. Now, chances are it will eventually be opened up in mobile Safari for everyone, but for the moment it seems browser-based 3D graphics will be limited to advertisements on the iPhone. Still, that's another big name throwing its support behind the burgeoning standard. [Thanks, Greg]

  • MSI's Afterburner Android app makes GPU overclocking as easy as Facebooking

    by Darren Murph
    06.06.2011

    Back in our day, overclocking one's PC was akin to a fine art. It took skill. Precision. Effort. Cojones. These days, it's just about as simple as blinking. Or winking. Or winking while blinking. MSI's made the simplification of PC overclocking quite the priority over the past few years, with OC Genie and an updated Wind BIOS from last decade putting all sorts of power into the hands of mere mortals. At Computex this week, the outfit took things one step further with the Afterburner Android app. Purportedly, the GPU tool enables users to monitor the temperature, voltage and fan speed of their graphics card via a WiFi connection, and if you're feeling froggy, you can overclock and overvolt to your heart's content. Details beyond that are few and far betwixt, but we're hearing that it'll soon work with GPUs from other vendors, and that an iOS variant is en route.

  • ASUS Mars II and Matrix GTX580 Platinum eyes-on

    by Vlad Savov
    06.01.2011

    If you thought the original Mars graphics card from ASUS was a little bit ridiculous, get ready to see what a lot of ridiculous looks like. The company's Mars II, recently teased alongside a fresh new Matrix GTX580 Platinum card, squeezes two GeForce GTX 580 chips onto the same board and overclocks them for good measure. In order to achieve such great feats, the card requires no fewer than three 8-pin auxiliary power connectors and takes up the space of three (2.6, to be precise) PCI slots with its ginormous dual-fan cooler. Heatpipes are also employed to keep the raging fires within in check, and -- for situations where all else fails -- ASUS has installed a special red button that sends the fan to full speed when pressed. ASUS hasn't yet finalized how far above the default engine clock speeds the Mars II will reach, but it has a bit of time to figure that out, as this extremely limited edition card is coming sometime in July. Buyers in the US, Europe and Asia-Pacific region will have to be quick on their credit card trigger, as only 1,000 Mars IIs will ever be produced. Oh, and if you're wondering how much power a dual-GTX 580 graphics card might consume, the answer is 600W. All by itself.

    Also making its debut at Computex this week is ASUS' latest offering for the truly overclock-mad PC gamer: the Matrix GTX580 Platinum. Frankly, it feels barren by comparison to its Martian sibling, coming with just one GTX 580 graphics processor, albeit an overclocked one, and requiring only two 8-pin connectors for added power. ASUS has thrown in a pair of physical "plus" and "minus" buttons, which permit voltage alterations on the fly, added the same fan override key as on the Mars II, and included a Safe Mode switch at the back in the event that you get carried away with your tweaking. Mashing that last button will reset all clock speeds, voltages and other settings to their default values, which should hopefully let you boot back up and try again. A final note of merit goes to the LED-infused Matrix logo atop the GTX580 Platinum. It's not there just for decorative purposes; its color changes in response to the load the GPU is under, so that blue and green will tell you there are no worries, while orange and red will indicate you're cranking it close to its limits. The GTX580 Platinum should start selling worldwide next week, though pricing has yet to be announced. Check it out in closer detail in the gallery below.

  • NVIDIA's GTX 560 desktop GPU fills an exceedingly narrow pricing niche

    by Terrence O'Brien
    05.17.2011

    With Tegra 2 hogging the spotlight, sometimes it's easy to forget that NVIDIA is still primarily in the business of making GPUs for computers. Yet, here it is with the GeForce GTX 560, another graphics chip ready to be inserted into mid-range gaming rigs. This smaller sibling of the GTX 560 Ti is designed to plug right into a small price gap in the company's lineup -- right around the $200 mark. The 336 CUDA cores inside this second-gen Fermi card, predictably, perform slightly better than the GTX 460 and fall just short of the 560 Ti, but it does eke out a victory over similarly priced competition from AMD. The only thing that kept reviewers from wholeheartedly endorsing the various (and often overclocked) flavors of the 560 was the tiny difference in price between it and its relatives -- tacking on the letters Ti and its 48 additional stream processors costs as little as $15 after a mail-in rebate. Check out the reviews below for all the benchmarks your little nerd heart can handle. And don't miss the video of a GTX 560 plowing through Duke Nukem Forever, Alice: Madness Returns, and Dungeon Siege III at the more coverage link.

    Read - Tech Report
    Read - AnandTech
    Read - Tom's Hardware
    Read - Guru 3D

  • WebGL flaw leaves GPU exposed to hackers

    by Terrence O'Brien
    05.12.2011

    Google spent a lot of time yesterday talking up WebGL, but UK security firm Context seems to think users should disable the feature because it poses a serious security threat, and the US Computer Emergency Readiness Team (CERT) is encouraging people to heed that advice. According to Context, a malicious site could pass code directly to a computer's GPU and trigger a denial of service attack or simply crash the machine. Ne'er-do-wells could also use WebGL and the Canvas element to pull image data from another domain, which could then be used as part of a more elaborate attack. Khronos, the group that organizes the standard, responded by pointing out that there is an extension available to graphics card manufacturers that can detect and protect against DoS attacks, but it did little to satisfy Context -- the firm argues that inherent flaws in the design of WebGL make it very difficult to secure. Now, we're far from experts on the intricacies of low-level hardware security but, for the moment at least, there seems to be little reason for the average user to panic. There's even a good chance that you're not vulnerable at all since WebGL won't run on many Intel and ATI graphics chips (you can check by clicking here). If you're inclined to err on the side of caution you can find instructions for disabling WebGL at the more coverage link -- but come on, living on the cutting edge wouldn't be anywhere near as fun if it didn't involve a bit of danger. [Thanks, Tony]
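    If you'd rather test your own setup than take anyone's word for it, the gist of the check is simply whether your browser will hand a page a WebGL rendering context at all. Below is a minimal TypeScript sketch of that test (our own illustration, not Context's or Khronos' tooling); it tries to create a context on a throwaway canvas, and if that fails, WebGL is either unsupported or disabled and the attack surface described above isn't exposed:

        // Rough WebGL availability check: try to obtain a WebGL context on a
        // throwaway canvas. 'experimental-webgl' covers 2011-era browsers that
        // shipped the API under its pre-standard name.
        function hasWebGL(): boolean {
          try {
            const canvas = document.createElement('canvas');
            const gl = canvas.getContext('webgl') || canvas.getContext('experimental-webgl');
            return !!gl;
          } catch (e) {
            return false;
          }
        }

        console.log(hasWebGL() ? 'WebGL is available (and enabled)' : 'No WebGL exposed in this browser');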

  • PowerColor expected to unveil double-barreled Radeon at Computex

    by Sean Buckley
    05.10.2011

    An unnamed, undressed dual-GPU prototype of AMD's latest in southern-island graphics cards surfaced over the weekend. Flaunting twin Barts chips with 1,120 stream processors a pop, the card totals 2,240, with each GPU packing its own memory for a combined 2GB of GDDR5. Although PowerColor is staying tight-lipped on specs and an official name until Computex in June, two DVI ports, double mini DisplayPorts, and one HDMI-out paint obvious similarities to the existing Radeon HD 6870. One last notable difference? The unknown soldier is powered by two eight-pin PCIe connectors, as opposed to the HD 6870's six-pin variant. We're probably looking at the latest in the Radeon HD 6800 series; we'll know for sure in about a month.

  • NVIDIA losing ground to AMD and Intel in GPU market share

    by Vlad Savov
    05.04.2011

    NVIDIA may be kicking all kinds of tail on the mobile front with its ubiquitous Tegra 2 chipset, but back on its home turf of laptop and desktop graphics, things aren't looking so hot. The latest figures from Jon Peddie Research show that the GPU giant has lost 2.5 percentage points of its market share and now accounts for exactly a fifth of graphics chips sold on x86 devices. That's a hefty drop from last year's 28.4 percent slice, and looks to have been driven primarily by sales of cheaper integrated GPUs, such as those found inside Intel's Clarkdale, Arrandale, and most recently, Sandy Bridge processors. AMD's introduction of Fusion APUs that combine general and graphics processing into one has also boosted its fortunes, resulting in 13.3 percent growth in sales relative to the previous quarter and a 15.4 percent increase year-on-year. Of course, the real profits are to be made in the discrete graphics card market, where NVIDIA remains highly competitive, but looking at figures like these shows quite clearly why NVIDIA is working on an ARM CPU for the desktop -- its long-term survival depends on it.

  • AMD elevates the low end with trio of sub-$100 cards: Radeon HD 6670, 6570, and 6450

    by Terrence O'Brien
    04.19.2011

    Graphics card companies don't live and die by the enthusiast market alone. That may be where the glory is, but it's the budget cards that really bring home the bacon. For the entry level, AMD just unleashed a trio of sub-$100 cards, the Radeon HD 6670, 6570, and 6450. How do they perform? Well, let's just say you get what you pay for. Reaction from reviewers has been one of mild indifference. Depending on the manufacturer, fan noise does appear to be an issue, possibly precluding the cards from being a viable HTPC choice. Otherwise, even the lowly, $55 6450 is a worthy upgrade over an integrated graphics chip or a two-year-old discrete card, but it can't match the performance of NVIDIA's GT 430, which can be had for only a few dollars more. Consensus was that, with prices of the older 5000 series being slashed, purchasers can get more bang for their GPU buck by sticking with last-generation cards (like the Radeon HD 5750) if they're looking for pure gaming prowess. That said, the GDDR5 flavors of the 6670 provide perfectly playable performance in most modern games (it averaged 45 FPS in Call of Duty: Black Ops) for just $99 (the 6570 runs about $79). Just beware those models shipping with GDDR3. Benchmarks galore below.

    Read - Hexus
    Read - techPowerUp 6450
    Read - techPowerUp 6670
    Read - Guru3D
    Read - Tech Report
    Read - Tom's Hardware 6670 and 6570
    Read - Tom's Hardware 6450
    Read - TweakTown
    Read - AnandTech
    Read - HotHardware

  • Radeon HD 6790 sneaks in at under $150, leaves reviewers wanting more for the money

    by Vlad Savov
    04.05.2011

    As sure as snow in winter or sun in summer, AMD has yet another refresh to its graphics card portfolio this spring. The Radeon HD 6790 is only a couple of misplaced digits away from the far more illustrious HD 6970, but you should be able to tell the two apart by another, altogether more significant spec: the new mid-tier card retails at $149. Predictably, its performance offers no threat to AMD's single-GPU flagship, but the 6790's 840MHz graphics and shader clock speeds plus 1GB of GDDR5 running at an effective 4.2GHz data rate don't seem like anything to sniff at either. Reviewers agreed that it's AMD's slightly delayed answer to NVIDIA's GTX 460, and with the latter card exiting retail availability to make room for the (oddly enough) less powerful GTX 550 Ti, AMD's new solution looks set to be the better choice at the shared $149 price point. Alas, being limited to 800 stream processors and 16 ROPs does expose the HD 6790 to being cannibalized by AMD's own Radeon HD 6850 (which can be had for sub-$150 if you're tolerant of rebates), and that turns out to be exactly what happens. A solid card, then, but one that would need an even lower price to make economic sense. Benchmarks await below.

    Read - Tech Report
    Read - AnandTech
    Read - Tom's Hardware
    Read - PC Perspective

  • Mac OS X 10.6.7 suggests support for generic video cards

    by Kelly Hodgkins
    03.24.2011

    Hacker Tony of TonyMacx86 discovered that the latest Mac OS X 10.6.7 update for the 2011 MacBook Pro includes native graphics acceleration for select AMD and ATI video cards. These cards include seven Radeon 5xxx and three 6xxx models that are not present in any shipping Mac products. Several of these AMD/ATI cards may debut in the upcoming refresh of the Apple iMac line, but Tony believes Apple's support for these cards extends beyond built-in hardware. The hacker proposes the idea that Apple may support off-the-shelf video solutions in its upcoming desktop models. In this scenario, a Mac user would be able to swap out their video card, similar to how PC users replace video hardware on their Windows machines. Most Macs do not currently offer user-replaceable video cards, with the exception of the Mac Pro line; even there, only a very small subset of the cards on the market have Mac-compatible drivers. [Via MacNN]

  • NVIDIA's dual-GPU GeForce GTX 590 emerges, can't slay the Radeon HD 6990 titan

    by Vlad Savov
    03.24.2011

    1,024 total CUDA cores, 96 ROPs, and 3GB of GDDR5 RAM on board. Yup, the NVIDIA GeForce GTX 590 is indeed a pair of GTX 580 chips spliced together; however, power constraints mean that each of those chips runs at a tamer pace than its single-card counterpart. The core clock speed is down to 607MHz, shaders are only doing 1.2GHz, and the memory clocks in at 3.4GHz. Still, there's a ton of grunt under that oversized shroud, and reviewers have put it to the test against AMD's incumbent single-card performance leader, the Radeon HD 6990. Just like the GTX 590, it sports a pair of AMD's finest GPUs and costs a wallet-eviscerating $699. Alas, after much benchmarking, testing, and staring at extremely beautiful graphics, the conclusion was that AMD retains its title. But only just. And, as Tech Report points out, the GTX 590 has a remarkably quiet cooler for a heavy-duty pixel pusher of its kind. Dive into the reviews below to learn more, or check the new card out on video after the break.

    Read - AnandTech
    Read - HardOCP
    Read - Tech Report
    Read - PC Perspective
    Read - Guru 3D
    Read - X-bit labs
    Read - Hot Hardware
    Read - techPowerUp!
    Read - TechSpot

  • NVIDIA's next flagship graphics card to be unveiled at 9AM on Thursday, bring your own popcorn

    by Vlad Savov
    03.23.2011

    In NVIDIA's own words, this Thursday will bring us the company's "next generation, highest performance graphics card." If that has you thinking GeForce GTX 590, you're not alone. The dual-GPU solution was expected to arrive at the PAX East get-together this month but seemed to shyly dodge the limelight, though now there's no escaping its date with destiny. Just make sure to be up nice and early tomorrow, say around 9AM US Eastern Time, for the inevitable barrage of reviews. An unsatisfyingly brief teaser video, featuring Crysis 2 slyly running in the background, can be found after the break. Update: Whoa, Nelly! Looks like it may end up being the GeForce GTX 590, as evidenced by these leaked images here. [Thanks, Abdulmalik]

  • LucidLogix Virtu in action, discrete graphics and Sandy Bridge together at last

    by Michael Gorman
    03.18.2011

    At CES, LucidLogix's Virtu software solution promised to get discrete and Sandy Bridge GPUs together in graphical harmony -- giving you both Sandy Bridge's greased-lightning video transcoding and the horsepower of an NVIDIA or ATI rig. The code also lets you watch content from Intel's forthcoming Insider movie service while running a discrete GPU. Now that Chipzilla's 2nd-gen Core i5 and i7 CPUs are getting to market en masse, the gang at Hot Hardware put an RC of Virtu through its paces to see what it can do. As expected, the software waxes chumps and smokes fools when encoding HD video, but gaming performance suffered slightly (in FPS and 3DMark 11 tests) with the technology enabled. The other nit to pick was that Virtu renders the control panel of your discrete card unavailable, so any graphics adjustments must be made in-game whenever the software is running. Time will tell if the final release has similar shortcomings. Hit up the source link for the full rundown.

  • NVIDIA SLI faces AMD CrossFire in a triple-GPU shootout

    by Tim Stevens
    03.16.2011

    Place your bets, folks, because this one's gonna get ugly. On your left: a thunderous triad of AMD Radeon HD 6950 cards running in CrossFire. On your right: the terrorizing threat of triple NVIDIA GeForce GTX 570 in SLI. In the middle: a Tom's Hardware tester just trying to stay alive. The winner? Well, as usual in these benchmark articles that sort of depends on what you're doing, but in general it's the AMD solution and its CrossFire barrage that comes out on top in terms of performance, cost, and even efficiency. But, that's certainly far from the whole story. You'll want to click on through to read about every agonizing blow.

  • NVIDIA sends GeForce GTX 550 Ti into the $150 graphics card wars

    by Vlad Savov
    03.15.2011

    It wasn't that long ago that we were commending ATI on the stellar regularity of its product launches while NVIDIA was floundering, yet now the roles are reversed and we're seeing NVIDIA flesh out its second generation of Fermi products with the midrange GeForce GTX 550 Ti, presented today. Its biggest attraction is a $150 price tag, but it makes a major concession in order to reach that pricing plateau -- there are only 192 CUDA cores inside it, equal to the previous-gen GTS 450 but fewer than in the celebrated GTX 460. NVIDIA tries to ameliorate that shortage of parallel processing units by running the ones it has at an aggressive 1800MHz, allied to a 900MHz graphics clock speed, and it also throws in a gigabyte of RAM running at an effective rate of 4GHz. That too is constrained somewhat, however, by a 192-bit interface, rather than the wider 256-bit affair on its bigger brother, the GTX 560 Ti. What all these specs boil down to is some decent performance, but few recommendations from reviewers -- mostly due to the abundance of compelling alternatives at nearby price points. Hit up the links below for more.

    Read - AnandTech
    Read - Tech Report
    Read - Guru3D
    Read - PC Perspective
    Read - techPowerUp!
    Read - Hexus

  • EVGA GeForce GTX 460 2Win has 'double the win,' becomes NVIDIA's first dual-Fermi graphics card

    by Vlad Savov
    03.11.2011

    Why, it was only yesterday that we were eyeballing a dual-GF104 board from Galaxy, presuming it an artifact of a 2010 project that went nowhere, but there's at least one NVIDIA partner that's going to deliver exactly such a creation, and soon at that! EVGA has just set loose the details of a new GTX 460 2Win graphics card, which ticks along at 700MHz, has 672 cumulative CUDA cores served by 2GB of GDDR5, and reportedly collects more 3D Marks than NVIDIA's finest card out at the moment, the GTX 580. The company also gleefully reports that pricing of the 2Win model will be lower than the 580's. It's interesting that NVIDIA is opting for a pair of the older-gen GF104 Fermi chips here, but then again, those have been big winners with critics and price-sensitive gamers alike, with many touting the use of two GTX 460s in SLI as a more sensible solution than the elite single-card options. Well, now you have both, in a manner of speaking. Skip past the break to see EVGA's latest in the flesh. [Thanks, Ben]

  • Visualized: NVIDIA's dual-Fermi card that never was

    by Vlad Savov
    03.10.2011

    PAX East 2011, which kicks off tomorrow, is widely expected to finally deliver a dual-GPU solution from NVIDIA's Fermi family of graphics chips, a PCI Express-saturating single-card workhorse to be known as the GTX 590. While we wait for its arrival, however, here's a sentimental look back upon 2010 and another little prototype that NVIDIA had kicking around its labs back then. Emerging over in a Chinese forum, this dual-GPU board features two GF104 chips (the same chip that made the GTX 460 such a winner) and a snow-white PCB paint job that makes it look utterly irresistible. We're loving the four DVI outputs and, just like you, have no idea why this card never came out, but that shouldn't obstruct the enjoyment of looking at the darn thing. More pics after the break.