DiscreteGraphics

Latest

  • MacBook Pros with NVIDIA GeForce GT 330M silicon making questionable graphics switching decisions

    by 
    Chris Ziegler
    04.21.2010

    "With every choice you make, ask yourself: is this a good choice, or is this a bad choice?" That's the sage advice we were constantly given as tykes -- and it's advice that replays in our feeble brains every day as we write news. Turns out it's also a piece of wisdom Apple's latest round of MacBook Pros would be wise to heed, because currently, they're making some awful decisions about when to turn on that power-sapping NVIDIA GeForce GT 330M discrete chipset. Read on to see what we mean. [Thanks, Tom]

  • NVIDIA's Optimus technology shows its graphics switching adroitness on video

    by 
    Vlad Savov
    03.03.2010

    Explaining automatic graphics switching and the benefits thereof can be a somewhat dry affair. You have to tell people about usability improvements and battery life savings and whatnot... it's much more fun if you just take a nice big engineering board, strap the discrete GPU on its own card and insert an LED light for the viewer to follow. NVIDIA has done just that with its Optimus technology -- coming to a laptop or Ion 2-equipped netbook near you -- and topped it off by actually pulling out the GPU card when it wasn't active, then reinserting it and carrying on with its use as if nothing had happened. This was done to illustrate the fact that Optimus shuts down the GPU electrically, which is that little bit more energy efficient than dropping it into an idle state. Shimmy past the break to see the video.

  • NVIDIA Optimus automates graphics switching, promises the best of both worlds

    by 
    Joanna Stern
    02.09.2010

    We've always thought switchable graphics made a lot of sense on laptops, and NVIDIA's new Optimus tech looks like it's going to bring it mainstream in a serious way -- there's no more manually toggling between the powerful discrete GPU and the power-saving integrated chip. More than just automatically switching off the discrete GPU when the laptop is unplugged, the idea is that you don't have to think about when you want to use the different graphics options: the software and hardware combo will take care of deciding which graphics processor is best for the application or content. For instance, launch Call of Duty 4 and the discrete GPU will power on; close out and start writing an e-mail and it will switch to the IGP. Sounds pretty simple, but under the hood it's much more complicated, as NVIDIA has moved to running the drivers for both graphics subsystems concurrently and removed the multiplexers entirely. For more details on all the technical fixes, hit the more coverage link. Unsurprisingly, Intel hasn't been involved in these innovations, but NVIDIA says Optimus will work with Intel's new 2010 Core processors and the Pineview Atom platform, along with NVIDIA's GeForce 200M series, GeForce 300M series, next-gen GeForce M, and next-gen Ion GPUs. Speaking of Ion, NVIDIA wouldn't officially say what the next version will look like, but it confirmed it will be announced in March and use Optimus technology (we're pretty much assuming it will combine the Pineview platform with a lower-end discrete GPU, like the previously hinted G310). The first Optimus-enabled laptops will hit at the end of this month courtesy of ASUS, and will include the UL50Vf, N61Jv, N71Jv, N82Jv, and U30Jc. We've been playing around with the $849 UL50Vf, so hit the break for some early impressions and video of the new graphics technology.

  • ASUS's UL30Vt announced, somehow finds room for discrete graphics

    by 
    Tim Stevens
    11.17.2009

    ASUS impressed the world with its lightweight, inexpensive 13.3-inch UL30 over the summer, and just last week impressed us with its switchable-graphics-packing bigger cousin, the UL80Vt. Now the 30 is getting the discrete treatment, enabling users to choose between molasses rendering with "all-day computing" battery life and slightly more robust graphics with slightly (about an hour) shorter longevity. The system also packs DDR3 memory, a 1.4GHz Intel Core 2 Duo SU7300 processor (able to be overclocked), and that "stylishly robust" aluminum lid. No word on release date or price, but the earlier Vt models didn't come with a massive leap in MSRP, so we'd expect this one not to fall too far from the UL30's $749 street price. [Thanks, Neti_Neti]

  • Intel rep says people "probably won't" need discrete graphics in the future

    by 
    Donald Melanson
    04.03.2008

    Intel's already made some fairly bold promises at its Intel Developer Forum in Shanghai this week, and it now looks like it's getting into the prediction game as well, with one representative from the company telling TG Daily that people "probably won't" need discrete graphics cards in the future. That word comes from Intel Graphics and Gaming Technologist Ron Fosner, who was showing off a graphics demo running on a multi-core Nehalem system that, as you can see in the video at the link below, likely won't have NVIDIA or AMD rethinking their strategies just yet. Fosner also curiously looked to the past to back up his argument, saying that "if you look back into the mid 80's, there were no discrete graphics cards." Of course, all of this is made all the more puzzling by the fact that Intel is itself dabbling in discrete graphics with its Larrabee project, albeit under the guise of a CPU / GPU hybrid.

  • Intel teraflopping into high-end graphics with "Larrabee"

    by 
    Paul Miller
    09.19.2007

    Intel CEO Paul Otellini's IDF keynote shed some new light on the company's Larrabee processor, which is now set for a 2010 release and will compete against AMD and NVIDIA in the realm of high-end graphics. Otellini says the chips will scale up to teraflops in speed and be targeted at science and analytics in addition to graphics -- though he dodged questions about Larrabee potentially being a discrete graphics competitor for AMD and NVIDIA, reiterating only that "Graphics will also be an area for the chip." Intel has so far stayed squarely in the realm of integrated graphics, but a move to discrete graphics would be quite a welcome shakeup to the current market, and teraflops would certainly make it all the more interesting.

  • Intel set to challenge NVIDIA and AMD/ATI in discrete graphics biz

    by 
    Donald Melanson
    01.23.2007

    There have been rumors, but Intel looks to have finally made its challenge to the NVIDIA and AMD/ATI establishment official, revealing some details of its so-called Larrabee project in a new round of job postings on its website. Now less mysteriously named the Visual Computing Group, the division looks to be taking square aim at the two big players in the graphics business, promising to deliver "discrete graphics products based on a many-core architecture targeting high-end client platforms." In other words, a big step up from Intel's current underpowered integrated graphics offerings. Unfortunately, that's about all Intel is saying about the project for the time being, and given that the first official word we've heard about it comes in the form of a job posting, it's probably safe to assume that we're still a ways off from actually seeing any products come out of the newly formed group. [Via Slashdot]