siggraph

  • Visualized: 3D3 Solutions scans our face in two seconds flat

    by Darren Murph
    08.10.2011

    See that bloke? That's Darren Murph. Well, a digital representation of the human version, anyway. That image was captured in two painless seconds at the hands of 3D3 Solutions, which was on hand here at SIGGRAPH to demonstrate its newest FlexScan setups. The rig that snapped our face rings up at around $10,000, and relies on a Canon DSLR (strictly for capturing textures), a projector and a secondary camera. As you've likely picked up on, this is hardly designed for average DIYers, but these setups are also far more detailed and flexible than using Microsoft's Kinect. We're told that the company recently started to support Nikon cameras as well, and for those who'd prefer to use their existing cameras / PJs, a hobbyist-centric software package will allow you to do just that. The only problem? Figuring out where the $2,700 (for the software) is going to come from. Head on past the break for a demonstration vid, or peruse the gallery below if you're feeling extra creepy.

  • Organic Motion's OpenStage motion capture system grabs 200FPS, no backdrop required (video)

    by Darren Murph
    08.10.2011

    At just under $40,000 for an eight-camera setup, we're hardly in hobbyist territory here, but Organic Motion's new OpenStage 2.0 motion capture system could certainly make do in the average basement. Unlike a few competing solutions shown here at SIGGRAPH, this one has no backdrop mandate, and better still, doesn't require you to latch a single sensor onto your subject. The magic lies within the cameras hung above -- kits are sold with between eight and 24 cameras, and even the latter can be handled by a single workstation. Multi-person tracking ain't no thang, and while you aren't capturing HD footage here, the high-speed VGA capability enables up to 200 frames per second to be logged. Not surprisingly, the company's aiming this squarely at the animation and medical realms, and should start shipping bundles as early as next month. Looking to take down Pixar? You'll need a lot more than 40 large, but perhaps the video after the break will give you a bit of inspiration.

  • Robot skin captures super detailed 3D surface images

    by Lydia Leavitt
    08.10.2011

    Remember those awesome pin art toys where you could press your hand (or face) into the pins to leave a lasting impression? Researchers at MIT have taken the idea one (or two) steps further with "GelSight," a hunk of synthetic rubber that creates a detailed computer-visualized image of whatever surface you press it against. It works like this: push the reflective side of the gummy against an object (they chose a chicken feather and a $20 bill) and the camera on the other end will capture a 3D image of the microscopic surface structure. Originally designed as robot "skin," the researchers realized the tool could be used in applications from criminal forensics (think bullets and fingerprints) to dermatology. The Coke can-sized machine is so sensitive it can capture surface subtleties as small as one by two micrometers -- finally solving the mystery of who stole the cookies from the cookie jar. (Hint: we know it was you, Velvet Sledgehammer.)

  • Perceptive Pixel shows world's largest projected capacitive display at SIGGRAPH, we go hands-on (video)

    by Darren Murph
    08.09.2011

    Perceptive Pixel wasn't kidding around when it launched the planet's biggest projected capacitive display here at SIGGRAPH -- all 82 inches of it were on display, and naturally, we stopped by to give it a look. While 82-inch panels aren't anything new, this one's particularly special. You see, the company actually procures the panels from Samsung, then rips the guts out and bonds its own network of sensors directly to the panel; most large-screen touch devices simply pop a touch layer on top of whatever TV shows up in the labs, but this integrated approach takes sensitivity to a whole 'nother level. For those unfamiliar with the term 'projected capacitive,' we're surmising that it's actually far less foreign than you think -- it's a technology used in a handful of smartphones, from Samsung's Moment to Apple's iPhone. 3M was also showing off a PC tech preview back at CES, and after using it here on the show floor, there's no question that it's the future for larger-screen devices. To quote CEO Jeff Han: "once consumers get a taste of this on the mobile front, they start demanding it elsewhere."

  • NVIDIA, Fusion-io and HP drive a dozen 1080p streams on four displays at SIGGRAPH (video)

    by Darren Murph
    08.09.2011

    A dozen uncompressed 1080p video feeds, simultaneously running off a single workstation. Yep, you're looking at it. NVIDIA's showcase piece here at SIGGRAPH was undoubtedly this wall -- a monster that trumps even Intel's CES wall in terms of underlying horsepower. A relatively stock HP Z800 workstation was loaded with the NVIDIA QuadroPlex 7000 Visual Computing System (that's four GPUs, for those counting) in order to push four HD panels. A pair of Fusion-io's ioDrive Duos were pushing a total of three gigabytes per second, enabling all 12 of the feeds to cycle through with nary a hint of lag. We're still a few years out from this being affordable enough for the common Earthling, but who says you need to wait that long to get a taste? Vid's after the break, hombre.
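That three-gigabyte-per-second figure checks out against simple arithmetic. A quick sanity check, assuming 24-bit color and a 30fps refresh (the article doesn't state the frame rate):

```python
# Back-of-the-envelope bandwidth for twelve uncompressed 1080p streams.
# The 30fps frame rate is an assumption; the article doesn't specify it.
WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 3          # 24-bit RGB, no alpha
FPS = 30                     # assumed
STREAMS = 12

bytes_per_stream = WIDTH * HEIGHT * BYTES_PER_PIXEL * FPS
total_gb_per_s = STREAMS * bytes_per_stream / 1e9

print(f"per stream: {bytes_per_stream / 1e6:.0f} MB/s")
print(f"all twelve: {total_gb_per_s:.2f} GB/s")
```

At 30fps that's roughly 2.24 GB/s -- comfortably inside the 3 GB/s the ioDrive Duo pair was delivering, with headroom for higher refresh rates.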

  • Microsoft's KinectFusion research project offers real-time 3D reconstruction, wild AR possibilities

    by Darren Murph
    08.09.2011

    It's a little shocking to think about the impact that Microsoft's Kinect camera has had on the gaming industry at large, let alone the 3D modeling industry. Here at SIGGRAPH 2011, we attended a KinectFusion research talk hosted by Microsoft, where a fascinating new look at real-time 3D reconstruction was detailed. To better appreciate what's happening here, we'd actually encourage you to hop back and have a gander at our hands-on with PrimeSense's raw motion sensing hardware from GDC 2010 -- for those who've forgotten, that very hardware was finally outed as the guts behind what consumers simply know as "Kinect." The breakthrough wasn't in how it allowed gamers to control common software titles sans a joystick -- the breakthrough was the price. The Kinect took 3D sensing to the mainstream, and moreover, allowed researchers to pick up a commodity product and go absolutely nuts. Turns out, that's precisely what a smattering of highly intelligent blokes in the UK have done, and they've built a new method for reconstructing 3D scenes (read: real life) in real-time by using a simple Xbox 360 peripheral. The actual technobabble ran deep -- not shocking given the academic nature of the conference -- but the demos shown were nothing short of jaw-dropping. There's no question that this methodology could be used to spark the next generation of gaming interaction and augmented reality, taking a user's surroundings and making them a live part of the experience. Moreover, game design could be significantly impacted, with live scenes able to be acted out and stored in real-time rather than having to build something frame by frame within an application. According to the presenter, the tech that's been created here can "extract surface geometry in real-time," right down to the millimeter level.
Of course, the Kinect's camera and abilities are relatively limited when it comes to resolution; you won't be building 1080p scenes with a $150 camera, but as CPUs and GPUs become more powerful, there's nothing stopping this from scaling with the future. Have a peek at the links below if you're interested in diving deeper -- don't be shocked if you can't find the exit, though.
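For the curious, the published KinectFusion approach fuses every incoming depth frame into a voxel grid of truncated signed distances. A minimal sketch of just the per-voxel fusion rule -- the real system also tracks the camera pose, projects voxels through it, and runs on the GPU; our variable names and truncation distance are illustrative:

```python
import numpy as np

TRUNC = 0.03  # truncation distance in meters (illustrative choice)

def integrate(tsdf, weight, sdf_obs):
    """Fuse one frame's signed-distance observations into the volume
    via a weighted running average, as in TSDF-style reconstruction."""
    d = np.clip(sdf_obs / TRUNC, -1.0, 1.0)   # truncate to [-1, 1]
    valid = sdf_obs > -TRUNC                  # skip voxels far behind the surface
    new_w = weight + valid                    # running observation count
    fused = (tsdf * weight + d) / np.maximum(new_w, 1)
    tsdf = np.where(valid, fused, tsdf)
    return tsdf, new_w

tsdf = np.zeros(5)      # tiny 1-D "volume" for illustration
weight = np.zeros(5)
frame = np.array([0.05, 0.02, 0.0, -0.01, -0.05])  # + in front of surface, - behind
tsdf, weight = integrate(tsdf, weight, frame)
print(tsdf)             # the zero crossing marks the reconstructed surface
```

Averaging many noisy frames this way is what lets a cheap, low-resolution sensor converge on millimeter-level geometry over time.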

  • Perceptive Pixel unveils an 82-inch multi-touch LCD, TV news anchors overcome by giddy hands

    by Joe Pollicino
    08.09.2011

    Perceptive Pixel has been no stranger to massive multi-touch screens ever since it got over being so Frustrated. At this year's SIGGRAPH the company is showing off a whopping 82-inch projected capacitive LCD -- and you thought MondoPad was huge. Apparently, the "slim" 6-inch-deep, optically bonded display is "the world's largest" of its type, although Perceptive does make an 88-inch DLP model if you need a bit more real estate. On-screen content is displayed in 1080p HD at 120Hz, and with unlimited multi-touch and a response time of less than 1ms, it's ready for all the situations Wolf Blitzer's digits can handle. We'll hopefully be checking it out on the show floor, but for now you'll find more details past the break.

  • NVIDIA's Project Maximus takes multi-GPU mainstream, 'Virtual Graphics' takes it to the cloud

    by Darren Murph
    08.08.2011

    NVIDIA just wrapped up an intimate press briefing here at SIGGRAPH 2011, where -- amongst other things -- it officially took the wraps off of two major initiatives. Project Maximus and Virtual Graphics are the two main topics of conversation here, and while both are obviously targeting working professionals at the moment, there's no question that a trickle-down effect is already on the company's mind. With Maximus, the outfit plans to stop recommending bigger GPUs to pros, and start recommending "a light Quadro GPU and as large a Tesla as you can get in the system." The overriding goal here is to make multi-GPU technology entirely more accessible; to date, it hasn't exactly been easy to get a finely tuned multi-GPU setup to the masses, but it sounds like a good deal of future flexibility (it'll be "nearly infinitely scalable") aims to change that. Just imagine: dynamic coupling and decoupling of GPUs depending on user load, at a far more detailed level within the application... Update: Regarding that Tesla bit, NVIDIA clarified with this: "What we're saying is for applications that are light on graphics / don't place a heavy demand on graphics, but more so a heavy demand on computational tasks, users will have an option to choose an entry- or mid-level Quadro card for graphics functions, such as the Quadro 600 or Quadro 2000. For certain applications, better performance is achieved by adding a Tesla companion processor, as opposed to scaling up the primary Quadro graphics. Users still require as much graphics as possible."

  • Glowing Pathfinder Bugs installation puts the 'Minority Report' interface to good use - in a sandbox (video)

    by Joseph L. Flatley
    07.30.2010

    Nestled among the various booths at SIGGRAPH 2010 was a unique installation called Glowing Pathfinder Bugs. Created by Squidsoup and Anthony Rowe, this interactive art piece uses projectors to place "bugs" made out of light in a sandbox, coupled with a 3D gesture-based interface that allows people to pick up, move, and even breed the creatures. The system even takes the topography of the sand itself into consideration: altering the sand will alter the bugs' paths. It's nice to see someone put an interface technology to good use for a change! Video after the break.
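The topography-following behavior is easy to model in miniature. One plausible sketch -- a greedy downhill walk over a height map captured from the sand; Squidsoup's actual implementation isn't public, so treat this purely as an illustration:

```python
# Greedy downhill walk over a "sand" height map: the bug always steps
# to its lowest neighboring cell and settles at a local minimum.
# Reshaping the height map reroutes the path, as in the installation.
def downhill_path(height, start, steps=10):
    rows, cols = len(height), len(height[0])
    r, c = start
    path = [(r, c)]
    for _ in range(steps):
        neighbors = [(r + dr, c + dc) for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                     if 0 <= r + dr < rows and 0 <= c + dc < cols]
        nr, nc = min(neighbors, key=lambda p: height[p[0]][p[1]])
        if height[nr][nc] >= height[r][c]:
            break                      # local minimum: the bug settles here
        r, c = nr, nc
        path.append((r, c))
    return path

sand = [[3, 3, 3],
        [3, 2, 1],
        [3, 2, 0]]
print(downhill_path(sand, (0, 1)))
```

Carve a trench into `sand` and the returned path changes accordingly -- the same feedback loop visitors created by hand.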

  • Acroban: the childlike robot you want to punch (video)

    by Thomas Ricker
    07.30.2010

    We see a lot of robots around here. But few can evoke emotion without resorting to a doe-eyed visage or plush, Dough-Boy bodice. That's what makes Acroban so interesting. Dispensing with the cheap parlor tricks, Acroban still comes across as child-like, playful, seemingly dependent upon your care and guidance. Cute, even though it's a quivering mass of aluminum, wire, and servos with a questionable taste in headwear. Doesn't mean you won't punch it in the pie hole now and again just to show it who's boss -- it is a murderous robot after all. See what we mean after the break.

  • Sony's 360-degree RayModeler 3D display brings its glasses-free act to LA, plays Breakout (video)

    by Richard Lawler
    07.28.2010

    Sony talked up its cylindrical no-glasses 3D 360-degree prototype display last fall, and now it's showing off the tech, dubbed RayModeler 3D, on US soil at SIGGRAPH 2010 through tomorrow. A major bonus of that showcase is an English-language video -- embedded after the break, plus a hands-on including a game of Breakout from Core77 and our videos from the Japanese exhibition -- showing how it all works, including the eight-camera rig and turntable that capture objects at 45-degree separations before they are interpolated to create a continuous 360-degree motion image. Sony claims this is the first display of its type capable of high-quality images, full color and interactive live motion -- check it out and imagine keeping a tiny 3D pet or floating, disembodied head on your bedside table, where it can respond and react to your every gesture. We wouldn't want our blip-verts any other way.
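The interpolation step is easy to picture in code. A sketch (our own, not Sony's algorithm) of finding which two of the eight captured views bracket an arbitrary viewing angle, plus a blend weight between them:

```python
# Given eight views shot 45 degrees apart on a turntable, find the two
# captured views surrounding a viewer's angle and a 0..1 blend weight.
# Purely illustrative; Sony's actual rendering pipeline is not public.
def bracketing_views(angle_deg, num_cams=8):
    step = 360 / num_cams                 # 45 degrees for eight cameras
    a = angle_deg % 360
    lower = int(a // step)                # index of the view just below
    upper = (lower + 1) % num_cams        # next view, wrapping past 360
    weight = (a - lower * step) / step    # 0 -> all lower, 1 -> all upper
    return lower, upper, weight

print(bracketing_views(100))              # viewer standing at 100 degrees
```

The modulo on `upper` is what closes the loop, so a viewer at 350 degrees blends camera 7 back into camera 0.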

  • Tactile display allows you to 'feel' both light and shadow

    by Laura June Dziuban
    07.27.2010

    The concept of touching things such as light or smells isn't anything new, but there's so much room for interpretation that it's always interesting to see new applications. At Siggraph 2010, a new tactile display is being shown off which allows the user to feel light and shadow. Called Touch Light Through the Leaves, the device consists of a camera which detects light, and 85 motorized vibration units which translate the light and shadow information into sensations. Check out the video below to see it in action, and hit up the source link for a bit more info.
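The basic mapping can be sketched in a few lines: downsample the camera's brightness field onto the motor grid and drive each motor proportionally. The 5 x 17 layout and the linear mapping are our assumptions; the actual device may arrange and scale its units differently:

```python
# Map a camera brightness frame onto an 85-motor grid (assumed 5 x 17),
# sampling one pixel per motor patch and scaling 0..255 to a 0..1 drive
# level. Layout and linear scaling are illustrative assumptions.
def light_to_vibration(brightness, rows=5, cols=17):
    """brightness: 2-D list of 0..255 values; returns 0..1 motor levels."""
    h, w = len(brightness), len(brightness[0])
    levels = []
    for r in range(rows):
        row = []
        for c in range(cols):
            # sample the pixel at the center of this motor's patch
            y = (r * h) // rows + h // (2 * rows)
            x = (c * w) // cols + w // (2 * cols)
            row.append(brightness[y][x] / 255)
        levels.append(row)
    return levels

# left half of the frame lit, right half in shadow
frame = [[255 if x < 17 else 0 for x in range(34)] for _ in range(10)]
levels = light_to_vibration(frame)
print(levels[0][0], levels[0][16])
```

Motors under the lit half buzz at full strength while those in shadow stay still, which is exactly the light/shadow sensation the exhibit demonstrates.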

  • Puyocon mouse reacts to being squeezed, thrown, gyrated (video)

    by Vlad Savov
    01.15.2010

    The Puyocon isn't about to swoop in and replace your trusty old two-dimensional laser pointing mousie just yet, but we're always suckers for bizarre input peripherals. Demonstrated by Tsukuba University at Siggraph Asia 2009 last month, it is a soft and squeezable ball that offers a quirky new spin on the old airborne controller idea. Differing from the Wii Remote in the fact that it won't break your HDTV (or itself) if it slips out of your hand, the spongy ball operates on the basis of a three-axis accelerometer and 14 pressure sensors in order to give detailed multidimensional information to the system it's controlling. That's probably overkill for the humble computer desktop, but there might be hope for the Puyocon becoming a commercial reality through games that make use of all its input points -- after all, if there's room for the Wiiwaa, why not the Puyocon too? See it in action after the break.

  • Touchable Holography uses Wiimotes to add touch to holograms

    by Laura June Dziuban
    08.06.2009

    Researchers from The University of Tokyo have demoed a touchable hologram at Siggraph 2009. The project, called Touchable Holography, involves the use of Wiimotes placed above the display to track hand motion, and an airborne ultrasound tactile display created in the university's lab to create the sensation of touch. The result is a holographic image that produces tactile feedback without any actual touching, and without degrading the image itself. Check out the video after the break for a fuller, more stunning explanation. [Thanks, Adam]

  • Dell adds high-powered ATI FirePro M7740 graphics to the Precision M6400

    by Nilay Patel
    08.03.2009

    We've always lusted after Dell's high-zoot Precision M6400 mobile workstation, and now we've got yet another reason to save all these nickels and dimes in the sock drawer: the company's adding AMD's new ATI FirePro M7740 graphics processor to the mix. The new chip is due to be announced tomorrow at SIGGRAPH 2009, and like the rest of the FirePro line, it'll offer 1GB of GDDR5 frame buffer memory, 30-bit DisplayPort and dual-link DVI output, and tons of CAD application certifications. We're looking for hard specs and prices now; we'll let you know as soon as we get 'em.

  • Invisible flash produces photos without glares

    by Laura June Dziuban
    07.17.2009

    Dilip Krishnan and Rob Fergus at New York University have developed a dark, or invisible, flash which uses infrared and UV light to take photos in dark places without the nasty glare of a standard flash. Their dark flash camera is made by modifying a flashbulb so that it emits light over a wider range of frequencies while filtering out the visible portion, and by removing the filters that normally prevent the silicon image sensor from detecting IR and UV rays. The flash results in a crisp image that lacks correct color balance and looks like night-vision photography. To correct the colors, the camera also takes a quick color image sans flash right after the dark flash image. This second image is predictably grainy and unclear, but its colors are correct. Software is then used to combine the information from the two photos to produce the final image (an example of which you see above). There are some minor problems with the method -- objects that absorb UV light (such as freckles!) do not show up. The pair will present their work at the Siggraph conference in New Orleans in August.
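The combination step can be illustrated with a toy version: take the fine spatial detail from the sharp dark-flash exposure and the color offsets from the grainy ambient one. Krishnan and Fergus actually solve a far more sophisticated edge-aware reconstruction; this simple channel swap only conveys the idea:

```python
import numpy as np

# Toy fusion of a sharp-but-colorless dark-flash image with a grainy
# color image: keep the flash image's detail as luminance and graft on
# the ambient image's color offsets. Illustration only, not the paper's
# actual optimization.
def combine(dark_flash_gray, ambient_rgb):
    ambient = ambient_rgb.astype(float)
    luma = ambient.mean(axis=2, keepdims=True)       # ambient brightness
    chroma = ambient - luma                          # per-channel color offsets
    detail = dark_flash_gray.astype(float)[..., None]
    return np.clip(detail + chroma, 0, 255).astype(np.uint8)

sharp = np.full((2, 2), 128, dtype=np.uint8)         # crisp IR/UV capture
noisy = np.zeros((2, 2, 3), dtype=np.uint8)
noisy[..., 0] = 90                                   # reddish but grainy
out = combine(sharp, noisy)
print(out[0, 0])
```

The output inherits the flash image's crispness while the red cast from the ambient shot survives, which is the broad strokes of what their software does far more carefully.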

  • SIGGRAPH 2009 panel to focus on Fight Night Round 4, Gears of War 2 graphics

    by Randy Nelson
    06.26.2009

    The organizer of SIGGRAPH, an annual gathering of digital visual artists, has already implied that gaming will have a larger presence at 2009's conference than in previous years; there's even going to be a keynote address by SimCity creator Will Wright. It's now been announced that a session focusing on real-time computer graphics will explore the creation of visuals for EA Sports' newly released Fight Night Round 4 and Epic Games' not-so-new (but still mightily pretty) Gears of War 2. Titled "Big, Fast, and Cool: Making the Art for Fight Night Round 4 & Gears of War 2" (that title is neither small, quick, nor cool, by the way), the session is being offered for CG artists because, as SIGGRAPH 2009 real-time rendering chair Evan Hirsch says, "So much of what makes today's videogames so great are the responsive graphics and the stories that unfold during game play." It's a shame about that "during game play" bit; otherwise the entire thing could be called "The cinematics of Metal Gear Solid 4: Watch and Learn." Next year, Kojima. Next year.

  • Will Wright to keynote SIGGRAPH 2009

    by Jason Dobson
    03.05.2009

    There's just something about Will Wright that makes us melt. His choice in games. His geeky good looks. His ability to play god. Whatever it is, maybe the esteemed game designer will shed some light on our infatuation when he takes the stage as the keynote speaker at SIGGRAPH 2009. His presence marks an "expanded" gaming focus for the CGI tech fest, which normally centers on what's new and bleeding edge with all things graphics. The topic of Wright's discussion has not yet been released, though we'll be listening with bated breath when the show opens its doors in New Orleans in early August.

  • 'Photo real' robotics to keep toddlers and the elderly from freaking out

    by Joseph L. Flatley
    09.11.2008

    We know what you're thinking: your Roomba 532 really does a number on the carpet, but where is the love? At this year's SIGGRAPH in Los Angeles, Taisuck Kwon (from the Kyushu Institute of Design) demonstrated his latest work in the realm of "photo real" robots: robots designed to reproduce the facial expressions that human beings take for granted. Unlike the robots that assemble consumer electronics or detonate IEDs, the photo real robots convey emotions, using articulated humanoid facial features designed to put people at ease, "especially seniors and toddlers." The robots have an underlying mechanical configuration that mimics the muscle structure of the human face, involving 26 moving units in total, with servomotors and actuators used to manipulate "muscles" beneath the "skin." Our only regret is that this technology wasn't available when Disney World last updated its Hall Of Presidents.

  • Gettin' Siggy with it: Joystiq goes to SIGGRAPH

    by Kevin Kelly
    08.14.2008

    We headed into the wonderific CGI fray known as SIGGRAPH this year, and ultimately decided that we need to start checking this out more often. The technical conference just entered its 35th year, its acronym standing for Special Interest Group on GRAPHics and Interactive Techniques. While it's evolved into a pretty glorified job fair, they still show off new and impressive technology, have a large section focusing on papers relating to innovation in the field of computer graphics (like this year's "Simulating Knitted Cloth at the Yarn Level") and feature a fun Computer Animation Festival component filled with dozens of short CGI films in competition. The only gaming companies we noticed in attendance were Activision, LucasArts, and THQ, which mostly offered "we want to hire you!" booths, but a lot of the tech behind games was being shown as well. NVIDIA was demoing "the world's first fully interactive GPU-based ray tracer," and Mova was showing off its futuristic-looking Contour capture rig. Plus, it now seems like everyone and their uncle is creating 3D printers that pump out plastic models, but that doesn't mean we don't want one. Read on after the break to find out more, explore the gallery below, and be sure to watch the video that got the biggest laughs, just ahead.