3D rendering

Latest

  • Disney reduces the chances of CG hair disasters

    by Jon Fingas, 07.14.2018

    Movie studios often want computer-generated hair to achieve a specific effect, whether it's a seductive toss or a careless flick that knocks something over. But there's a problem: most rough-cut simulations don't model hair realistically, leading to a lot of guesswork and time-consuming edits. Disney (no stranger to hair-centric movies) has a solution, though. It developed a new system that can produce more authentic-looking simulations without an impractical boost in computing power. The trick, it turns out, was to use just a few cleverly controlled sample hairs.
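The core idea, as described, is to simulate only a handful of "guide" strands and cheaply fill in everything else from them. Here's a toy Python sketch of that concept; the relaxation "physics" and all names are illustrative assumptions, not Disney's actual method:

```python
def simulate_guides(guide_positions, wind, steps=20, stiffness=0.5):
    """Toy per-guide dynamics: each guide's tip offset relaxes toward the wind."""
    offsets = {x: 0.0 for x in guide_positions}
    for _ in range(steps):
        for x in offsets:
            offsets[x] += stiffness * (wind - offsets[x])
    return offsets

def interpolate_strands(strand_positions, guide_offsets):
    """Each ordinary strand blends the two nearest guides' offsets linearly,
    so the expensive simulation only ever runs on the guides."""
    guides = sorted(guide_offsets.items())
    result = {}
    for x in strand_positions:
        left = max((g for g in guides if g[0] <= x), default=guides[0])
        right = min((g for g in guides if g[0] >= x), default=guides[-1])
        if left[0] == right[0]:
            result[x] = left[1]
        else:
            t = (x - left[0]) / (right[0] - left[0])
            result[x] = (1 - t) * left[1] + t * right[1]
    return result
```

Simulating two guides and interpolating a strand between them costs two physics updates instead of three, and the savings grow with every non-guide hair added.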

  • Disney Research crafts a more realistic way to capture the human eye

    by Billy Steele, 12.05.2014

    We're keen on checking in with the folks at Disney Research from time to time to see what crazy projects it's been working on. At SIGGRAPH Asia this week, the outfit is presenting recent work on crafting more detailed 3D-rendered eyes. To properly capture all the details needed to make characters appear realistic, the studio has crafted a method for nabbing those intricacies based not only on appearance, but also on how the eye responds to light. "Generically modeled eyes may be sufficient for background characters, but it now takes significant effort to manually create realistic eyes for heroes and other leading characters," says Disney Research Zurich's Pascal Bérard. The project is nothing new for Disney's experimental arm, as the folks there have been looking into more detailed ocular representation for quite some time. The method not only cuts down on the work required to manually produce believable results; the tech could also drastically improve modeling in ophthalmology.

  • The Big Picture: Not-quite San Francisco's 'Davis Street'

    by Timothy J. Seppala, 07.21.2014

    Take a second look at the above image -- it isn't actually a photo. Nope, the entire scene (dubbed Davis Street) was painstakingly created with the graphics rendering suite 3DS Max. To achieve the impressive end result, artist Gilvan Isbiro says he had to separate the street into two rendering "plans," given how big the street's surface is. As he tells it, the splitting didn't stop there: due to the sheer number of textures and materials present in the scene, not to mention lighting and post-processing effects like depth of field, Isbiro had to divide the rendering tasks in two -- foreground and background. It is, however, a bit embellished compared to the actual intersection of Davis and California in San Francisco. For starters, The City by the Bay doesn't have trains like the one up above, and there's no such Sweet Street. Spot any other liberties Isbiro took? Let us know in the comments.

  • Computer trickery makes these shadows 'dance'

    by Mariella Moon, 06.05.2014

    You know how to turn crooked vases into an interesting art installation that reminds us of Beauty and the Beast's singing pots and candlesticks? We'd like to say magic, but since we don't live in a Disney movie, the right answer is motion tracking and real-time 3D rendering. The installation's creators, artist Laurent Craste and digital agency Dpt., used a hidden projector to make the vases' shadows dance whenever a viewer swings the lamp above them. The shadows' movements even match the lamp's motion: a side-to-side swing triggers a side-to-side animation, while a more circular swing sends the shadows going in circles. Sadly, you can't see this in person anymore (it was displayed at a festival in Montreal in May), but you can watch the video after the break.

  • Live2D drawing technology from Cybernoids adds a little 3D spice to your hand-drawn images

    by James Trew, 08.23.2012

    While 3D graphics had been filling our eyes in cinemas and video games way before Nemo ever got lost, we've typically had to settle for computer-generated artwork. Live2D from Cybernoids is a drawing technology that hopes to change that. The software lets animators and game creators give hand-drawn 2D images rudimentary 3D qualities. In the video above you can see the character turn her head, and the image -- based solely on the 2D version -- twists and adapts in real-time. There are two versions of the software, one based on polygons and the other on vectors, and there is support for consoles and smartphones -- but no specifics at this time. The developers admit it's only suitable for limited movement, such as in dialog-based games, for now, but they hope to have the tools to handle full 360-degree motion within the next two years. At least, for now, it's way, way further down on the creep-o-meter scale.

  • Dell Precision R5500 lets four graphics pros work on one PC, we wish it did gaming

    by Jon Fingas, 05.17.2012

    Workstations aren't normally our focus, but when Dell shows off a new Precision system that lets four media pros share its graphics hardware at once, you can be sure the company has our attention. If your IT chief springs for a Precision R5500 with four Quadro 2000 cards, each of those cards can take advantage of a graphics pass-through in Citrix's virtualization software to render 3D models at speeds much closer to what you'd get if the Quadro were sitting in your own PC. Before you have visions of four-player Modern Warfare parties after hours at work, the inherent barriers of distance and the virtual machine itself will likely rule out any game sessions.

  • Maingear brings Intel i7-3960X Extreme Edition chip, Epic Audio Engine to desktops, extreme gamers

    by Amar Toor, 11.14.2011

    Looking to add a dash of extremity to your gaming existence? Maingear's got you covered, now that it's added a second-generation Intel Core i7 CPU to a handful of its desktop offerings. Today, the company announced yet another upgrade to its SHIFT, Quantum SHIFT and F131 desktops, with the addition of the Intel Core i7-3960X Extreme Edition processor. According to Maingear, the extra horsepower will give gamers a 34 percent improvement in performance at normal speeds, with similar enhancements in video editing and 3D rendering. That's all thanks to the fact that the i7-3960X can be overclocked to a handsome 5.2GHz, with a quad-channel memory architecture that brings even more bandwidth to the table. On top of that, the company has also added its own EPIC Audio Engine to this troika of rigs, using Aphex's processing technology to offer audio that, according to Maingear, is "more balanced, more articulated, and simply put, better sounding." The revamped desktops are on sale now, so hit up the source link for more information, or check out the full PR after the break.

  • TerraRay for Mac gets a speedy update

    by Mel Martin, 09.19.2011

    I took a look at TerraRay recently and found it a good, low-priced 3D scenery generator. It doesn't have the features of the big boys like Vue or Bryce, but it costs a fraction of the price. One of my biggest complaints in my review was that rendering was slow. Now the app has a free update to version 5 for current users, and a 50%-off price of US$9.99 until September 26. You can find TerraRay in the Mac App Store. In my tests, rendering speed has increased about 3x over version 4.5. The app has also added a configurable preview window, support for Lion's full-screen mode, some built-in sample scenes that you can modify to get started, as well as improvements in texturing. One thing I'd like to see is more control over the position of the sun. You can't easily change the sun's elevation; instead, you have to drag the sun's position around a bit to fiddle it into place. I'd like the ability to simply enter numeric values for azimuth and elevation. On the whole, this app is a solid, entry-level renderer. If you're not ready to spend hundreds of dollars on high-end landscape creation, TerraRay is competent and now a lot faster. You can check out a video of the app here, and take a look at some screenshots from my sessions in the gallery below.
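Numeric azimuth and elevation are just standard spherical angles; under the hood, a renderer would convert the two numbers into a sun direction vector roughly like this (an illustrative sketch, not TerraRay's code):

```python
import math

def sun_direction(azimuth_deg, elevation_deg):
    """Convert azimuth (degrees clockwise from north) and elevation
    (degrees above the horizon) into a unit direction vector (east, north, up)."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    return (math.cos(el) * math.sin(az),   # east component
            math.cos(el) * math.cos(az),   # north component
            math.sin(el))                  # up component
```

An azimuth of 90 with elevation 0 yields a sun sitting due east on the horizon, while elevation 90 puts it straight overhead, regardless of azimuth.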

  • Kinect hack turns tourists into 3D souvenirs (video)

    by Christopher Trout, 04.05.2011

    As souvenirs go, a miniature replica of yourself -- striking a pose of your choosing on Barcelona's La Rambla street -- is a far sight more original than a bullfights-and-senoritas snow globe. This past January, the hilariously titled BlablabLAB enlisted three Kinects and a RapMan 3.1 to snap passersby and render them into personalized tchotchkes, in a project called Be Your Own Souvenir. Subjects stood atop a small platform, mimicking the human statues on La Rambla, as the Kinects captured their likeness in full 360-degree glory. The resulting images were processed into a mesh reconstruction, saved as a G-code file and fed through a 3D printer -- and voila, out popped the tiny statuettes. If you're a fan of flashy editing and Kinect-based street experiments, check out the video after the break.

  • Fabricate Yourself Kinect hack turns you into a 3D puzzle piece

    by Christopher Trout, 03.02.2011

    We've been whittling our likeness into bars of soap for decades, but lucky for us, someone's come up with a far easier way to render our flawless good looks in miniature. Following in a long line of inventive Kinect hacks, the folks at Interactive Fabrication have produced a program called Fabricate Yourself that enlists the machine to capture images of users and convert them into 3D-printable files. The hack, which was presented at the Tangible, Embedded and Embodied Interaction conference in January, results in tiny 3D models that resemble Han Solo trapped in carbonite and sport jigsaw edges that can be used to assemble a grid of small but accurate renderings. Fabricate Yourself is still in its infancy, and the resulting models are relatively short on detail, but we're no less excited by the possibilities -- just think of all the things we could monogram in the time it takes to produce one soapy statuette. Video after the jump.

  • Google details low-level Renderscript API for Honeycomb

    by Donald Melanson, 02.11.2011

    There's no question that Honeycomb tablets like the Xoom are powerful pieces of hardware, and it looks like Google will be doing its best to ensure that developers are able to exploit as much of that power as possible. A big piece of that puzzle is the company's Renderscript API for the OS, which it's just now starting to detail in full. The big advantage is that it's a low-level API designed especially for developers who are "comfortable working closer to the metal," which will let applications built with it (including games) take full advantage of the high-end GPUs and dual-core processors found in Honeycomb tablets. What's more, while the API is just now being made public, Google has already put it to use in Honeycomb itself -- the YouTube and Books apps, as well as the live wallpapers shipping with the first Honeycomb tablets, were all built with it. Head on past the break for another quick example -- a brute-force physics simulation that involves 900 particles tilting with the tablet -- and look for Google to provide some additional technical information and sample code sometime soon.
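Google's actual sample is written in Renderscript's C99-like language, but to illustrate what a brute-force step for that 900-particle demo computes, here's the same idea as a Python sketch (all names, constants and the bounce rule are assumptions for illustration):

```python
def step_particles(particles, tilt, dt=0.1, bound=1.0):
    """One brute-force Euler step over every particle.
    particles: list of [x, y, vx, vy]; tilt: (gx, gy) accelerometer reading."""
    for p in particles:
        p[2] += tilt[0] * dt              # accelerate along the tilt vector
        p[3] += tilt[1] * dt
        p[0] += p[2] * dt                 # then move
        p[1] += p[3] * dt
        for i in (0, 1):                  # bounce off the screen edges
            if abs(p[i]) > bound:
                p[i] = max(-bound, min(bound, p[i]))
                p[i + 2] *= -0.5          # lose energy on the bounce
    return particles
```

At 900 particles this loop runs every frame, which is exactly the sort of per-element number crunching a low-level API can hand off to faster hardware.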

  • Maya 8.5 3D now Universal

    by Scott McNulty, 01.16.2007

    Autodesk announced Maya 8.5 yesterday, and for a .5 release it has a slew of new features that I'm sure will delight 3D artists. The one feature that'll make Intel Mac users' hearts beat a little faster is that Maya 8.5 is now a Universal Binary. It'll run on either PowerPC or Intel Macs, but I have to think it'll run much faster on Intel Macs (what doesn't, at native speeds?). Thanks to everyone who sent this in!