CudaCore

Latest

  • NVIDIA ticks budget boxes with the $229 GeForce 660 and $109 GeForce 650

    by Daniel Cooper
    09.13.2012

    NVIDIA's had some trouble shaving its Kepler GPUs down to an entry-level price point, but it looks to have put the problem behind it with the new GeForce 660 and 650 graphics cards. The company's ambition is to coax impoverished gamers clinging to DirectX 9 (and, to a lesser extent, 10) into switching up to this wallet-friendly pair of low-end units. The 660 has been designed to be the "weapon of choice" for budget gamers. It'll play most games at reasonably high settings, thanks to its 2GB of RAM, 960 CUDA cores and GPU Boost, which automatically overclocks the silicon according to the demands of your software. While we'll wait for real-world benchmarks, the company expects four times the performance of the GeForce 9800GT, claiming that games like Borderlands 2 and Guild Wars 2, at a resolution of 1,920 x 1,080, will play at frame rates of 51fps and 41fps with full 3D, respectively.

    The 650 is the company's self-proclaimed "gateway" into gaming, being the lowest-priced Kepler it's planning to produce. Unlike the other cards in the range, it lacks GPU Boost, but the company left six-pin power on the card, giving card makers 64W of headroom to push the "good overclocker" 1GHz units all the way to 1.2GHz. It's got 1GB of GDDR5 RAM and 384 CUDA cores, which will apparently handle even the newest games at mid-range levels of detail. The pair are available from today, with companies like Maingear and Origin already announcing discounted desktops for them to nestle inside.

  • Engadget Primed: The crazy science of GPU compute

    by Sharif Sakr
    08.20.2012

    Primed goes in-depth on the technobabble you hear on Engadget every day -- we dig deep into each topic's history and how it benefits our lives. You can follow the series here. Looking to suggest a piece of technology for us to break down? Drop us a line at primed *at* engadget *dawt* com.

    As you're hopefully aware, this is a gadget blog. As a result, we're innately biased towards stuff that's new and preferably newfangled. More cores, more pixels, more lenses; just give it here and make us happy. The risk of this type of technological greed is that we don't make full use of what we already have, and nothing illustrates that better than the Graphics Processing Unit. Whether it sits in our desktops, laptops, tablets or phones, the GPU is cruelly limited by its history -- its long-established reputation as a dumb, muscular component that takes instructions from the main processor and translates them into pixels for us to gawp at. But what if the GPUs in our devices had some buried genius -- abilities that, if only we could tap into them, would yield hyper-realistic experiences and better all-round performance from affordable hardware? Well, the thing is, this hidden potential actually exists. We've been covering it since at least 2008 and, even though it still hasn't generated enough fuss to become truly famous, the semiconductor industry is making more noise about it now than ever before. So please, join us after the break as we endeavor to explain why the trend known as "GPU compute," aka "general-purpose GPU (GPGPU)," or simply "not patronizing your graphics processor," is still exciting despite having let us down in the past. We'll try to show why it's worth learning a few related concepts and terms to help provide a glossary for future coverage; and why, on the whole, your graphics chip is less Hasselhoff and more Hoffman than you may have imagined.
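
    To make "GPU compute" a little more concrete before the full primer, here's a minimal sketch of the idea in CUDA, NVIDIA's GPGPU toolkit: a kernel that performs plain vector addition -- no pixels anywhere -- spread across thousands of GPU threads. The kernel name, array size and launch configuration below are our own illustrative choices, not anything from the article.

        #include <cstdio>
        #include <cstdlib>
        #include <cuda_runtime.h>

        // General-purpose work on the GPU: each thread adds one pair of
        // array elements. No graphics involved at all.
        __global__ void vecAdd(const float* a, const float* b, float* c, int n) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n) c[i] = a[i] + b[i];
        }

        int main() {
            const int n = 1 << 20;                    // one million elements
            const size_t bytes = n * sizeof(float);

            // Prepare input data on the CPU ("host") side.
            float* ha = (float*)malloc(bytes);
            float* hb = (float*)malloc(bytes);
            float* hc = (float*)malloc(bytes);
            for (int i = 0; i < n; i++) { ha[i] = 1.0f; hb[i] = 2.0f; }

            // Allocate GPU ("device") memory and copy the inputs over.
            float *da, *db, *dc;
            cudaMalloc((void**)&da, bytes);
            cudaMalloc((void**)&db, bytes);
            cudaMalloc((void**)&dc, bytes);
            cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
            cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

            // Launch enough 256-thread blocks to cover all n elements,
            // then copy the result back to the host.
            int threads = 256;
            int blocks = (n + threads - 1) / threads;
            vecAdd<<<blocks, threads>>>(da, db, dc, n);
            cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);

            printf("c[0] = %f\n", hc[0]);             // expect 3.000000
            cudaFree(da); cudaFree(db); cudaFree(dc);
            free(ha); free(hb); free(hc);
            return 0;
        }

    Those host-to-device copies are exactly the kind of overhead that has historically made GPGPU a mixed bag: the arithmetic itself is embarrassingly parallel, but shuttling data across the PCI Express bus can eat the savings on small workloads.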

  • NVIDIA outs GeForce GTX 670 GPU: it's Kepler without the mortgage

    by Daniel Cooper
    05.10.2012

    This'll be old news for some lucky folks, but NVIDIA has just unveiled the GeForce GTX 670 graphics card. It aims to bring Kepler to gamers who don't have offshore bank accounts, with a price tag of $399 (or £329 in the UK, and €329 in Europe). What sacrifices will be made to reach that bracket, compared to the flagship GTX 680? A loss of 192 CUDA cores, for starters, plus a slightly slower 915MHz base clock speed, which will no doubt have an impact on benchmarks -- we'll do a review round-up shortly to figure out just how much. Nevertheless, you'll still get the same 28nm chip architecture and 2GB of GDDR5 RAM, along with NVIDIA's GPU Boost technology that autonomously overclocks the processor to make use of available headroom. In terms of official performance claims, NVIDIA has chosen to compare its benchmarks to AMD's high-end Radeon HD 7950 and boasts that the GTX 670 comes out on top every time, by a margin of 18 to 49 percent. Of course, the war of words is little more than performance art at this point, so stay tuned for independent tests. Meanwhile, gaming-friendly manufacturers like Origin and Maingear have declared that they'll carry the card alongside the 690 in their desktop offerings -- you can learn more about that after the jump.

  • NVIDIA unleashes GeForce GTX 690 graphics card, loads it with dual Kepler GPUs, charges $1k

    by Joe Pollicino
    04.29.2012

    Would you look at that? NVIDIA hinted it would be coming today, and it looks like the tease is living up to the hype. The company stormed into the weekend at its Shanghai Game Festival by unleashing its latest offering, the GeForce GTX 690 -- and oh yeah, it's packing two 28nm Kepler GPUs! Trumping the recently released GTX 680 as the "world's fastest graphics card," it's loaded with a whopping 3,072 CUDA cores. The outer frame is made from trivalent chromium-plated aluminum, while you'll find thixomolded magnesium alloy around the fan for vibration reduction and added cooling. Aiding in cooling even further, the unit also sports a dual vapor chamber and center-mounted fan. It'll cost you a spendy $1,000 to pick up one of these puppies come May 3rd, and you'll likely be tempted to double up -- two can run together in SLI as an effective quad-GPU setup. With that said, NVIDIA claims that a single 690 runs 4dB quieter than a duo of GTX 680s in SLI and handles about twice the frame rate of a single GTX 680 -- impressive, but we'll reserve judgment until we see it for ourselves. Check out the press release after the break if you'd like more information in the meantime (...and yes, it runs Crysis -- 2 Ultra to be exact -- at 57.8fps, according to NVIDIA). [Thanks to everyone who sent this in]