ucsd

Latest

  • Computers can tell how much pain you're in by looking at your face

    by Jon Fingas
    06.04.2015

    Remember Baymax's pain scale in Big Hero 6? In the real world, machines might not even need to ask whether or not you're hurting -- they'll already know. UC San Diego researchers have developed a computer vision algorithm that can gauge your pain levels by looking at your facial expressions. If you're wincing, for example, you're probably in more agony than you are if you're just furrowing your brow. The code isn't as good at detecting your pain as your parents (who've had years of experience), but it's up to the level of an astute nurse.
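The study's approach — reading pain off facial movements like winces versus furrowed brows — can be caricatured as a weighted score over facial action-unit intensities. This is a toy sketch only: the action units and weights below are invented for illustration and are not the features or model from the UCSD work.

```python
# Toy pain score from facial action-unit intensities (all values in [0, 1]).
# The chosen action units and weights are invented, not from the UCSD study.
PAIN_WEIGHTS = {"brow_lower": 0.2, "cheek_raise": 0.3, "eye_squeeze": 0.5}

def pain_score(action_units):
    """Weighted sum of recognized action-unit intensities, clamped to [0, 1] each.

    Unrecognized action units are ignored."""
    return sum(PAIN_WEIGHTS[au] * min(max(v, 0.0), 1.0)
               for au, v in action_units.items() if au in PAIN_WEIGHTS)
```

Under this scheme a wince (eyes squeezed, cheeks raised) scores higher than a furrowed brow alone, matching the intuition in the article.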

  • UCSD engineers develop mini wide-angle lens that's ten times smaller than a regular one

    by Edgar Alvarez
    09.25.2013

    What you see here, dear readers, is the image of a fiber-coupled monocentric lens camera that was recently developed by engineers from the University of California, San Diego. The researchers involved in the project say this particular miniature wide-angle lens is one-tenth of the size of more traditional options, such as the Canon EF 8-15mm f/4L pictured above. Don't let the sheer magnitude (or lack thereof) of this glass fool you, however: UCSD gurus note that the newly developed optics can easily mimic the performance of regular-sized lenses when capturing high-resolution photos. "It can image anything between half a meter and 500 meters away (a 100x range of focus) and boasts the equivalent of 20/10 human vision (0.2-milliradian resolution)," according to engineers. As for us, well, we can't wait to see this technology become widely adopted -- don't you agree?

  • Telescopic contact lenses magnify sight 2.8 times, turn wearer into cyborg

    by Melissa Grey
    07.02.2013

    Interested in upgrading your eyeballs? Well, a team of DARPA-funded researchers led by Joseph Ford of UC San Diego recently published a proposal for a new type of telescopic contact lens in Optics Express. Designed for people with age-related macular degeneration, the lenses are only 1.17mm thick and can magnify images up to 2.8 times. Their layered construction admits light near the outer edge of the lens, bouncing it across a series of tiny aluminum mirrors before transmitting it to the back of the retina, kind of like the origami-optics lens. Telescopic sight can be toggled on and off by using a pair of 3D glasses to switch the polarization of the central part of the lens. It sounds promising, but the lenses -- pictured after the break -- currently have some obstacles, like gas-impermeable materials unsuitable for long-term wear and sub-par image quality. Want to read more? Pop on your glasses and check out the full paper at the source link below.

  • UCSD's robot baby appears, is happy, sad, a little creepy (video)

    by Mat Smith
    01.09.2013

    Development on UCSD's Diego-san has been underway for several years, and now the robot child is ready for his first home movie. The bot is being constructed to better understand the cognitive development of children, with a camera behind each eye recording (and learning from) human interactions around it. There are 27 moving parts in the face alone, and Diego-san is able to replicate a whole gamut of emotions -- and give us shivers as he does. We've got some unnervingly realistic footage right after the break.

  • Researchers create algorithms that could help lithium-ion batteries charge two times faster

    by Alexis Santos
    10.04.2012

    Researchers at the University of California San Diego have devised new algorithms that could cut lithium-ion battery charge times in half, help cells run more efficiently and potentially cut production costs by 25 percent. Rather than tracking battery behavior and health with the traditional technique of monitoring current and voltage, the team's mathematical models estimate where lithium ions are within cells for more precise data. With the added insight, the team can more accurately gauge battery longevity and control charging efficiency. The group was awarded $415,000 from the Department of Energy's ARPA-E research arm to further develop the algorithm and accompanying tech with automotive firm Bosch and battery manufacturer Cobasys, which both received the remainder of a $4 million grant. Wondering if the solution will ever find its way out of the lab? According to co-lead researcher Scott Moura, it'll see practical use: "This technology is going into products that people will actually use." Update: UC San Diego reached out to let us know that they were awarded $415,000 (not $460,000 as previously noted) out of a grant totaling $4 million (not $9.6 million), split between Bosch and Cobasys. We've updated the post and the press release below to reflect the correct figures.
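The core idea — tracking internal battery state rather than just terminal current and voltage — can be illustrated with a minimal observer: propagate state of charge by Coulomb counting, then correct it using the mismatch between measured and modeled voltage. This is a toy stand-in for the electrochemical estimators the article describes; the gain and the voltage model are our assumptions, not the team's algorithm.

```python
def estimate_soc(soc, current_a, dt_s, capacity_ah, v_measured, v_model, gain=0.01):
    """One step of a simple state-of-charge observer (illustrative only).

    Predict: integrate current over the time step (Coulomb counting).
    Correct: nudge the estimate by the measured-vs-modeled voltage error.
    The result is clamped to the physical range [0, 1]."""
    soc += current_a * dt_s / (capacity_ah * 3600.0)   # predict
    soc += gain * (v_measured - v_model)               # correct
    return min(max(soc, 0.0), 1.0)
```

Estimating internal state this way is what lets a charger push current harder when the model says the cell can safely take it, which is where the claimed 2x charge-time improvement would come from.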

  • San Diego Supercomputer Center launches world's largest academic cloud storage system

    by Donald Melanson
    09.26.2011

    A new cloud storage system may not exactly be big news these days, but it is certainly a bit more noteworthy when it's the San Diego Supercomputer Center (or SDSC) behind it. That new service, simply dubbed the SDSC Cloud, also isn't your ordinary cloud storage system -- it's designed specifically for academic and research use, and it's said to be the largest of its kind in the world. That title comes from an initial raw capacity of 5.5 petabytes, which the SDSC notes is scalable by "orders of magnitude to hundreds of petabytes," and which is accessible at sustained read speeds from 8 to 10 gigabytes per second (also promised to be improved over time). Those interested in signing up can do so right now through an application process, with rates starting at $3.25 a month for 100GB of storage. Additional details are in the press release after the break.

  • Archaeologists eschewing traditional whip / leather jacket combo for Kinect controllers

    by Brian Heater
    08.03.2011

    The latest must-have piece of technology in the archaeological world? The Kinect. Students from the University of California, San Diego are taking Microsoft's much-hacked peripheral on an archaeological dig in Jordan, using the device to help create 3D scans of the site on the cheap. The hack pulls data captured by the Kinect's camera and infrared scanner, turning the information into avatars that can be plugged into virtual worlds. The hack's creator hopes that it might someday be able to capture information as complex as buildings or neighborhoods. The first, decidedly less ambitious application is being referred to as "ArKinect," because what's a cool hack without a cool name?

  • Scent generator threatens to waft Odorama into the 21st century

    by Christopher Trout
    06.17.2011

    Finally, an invention John Waters can get behind. When the harbinger of filth brought the odiferous experience to screenings of Polyester, he took the scratch-and-sniff route -- including scents like glue and feces. Now a team of researchers at the University of California, San Diego is expanding on the smell-what-you-see concept, albeit in a much more high-tech fashion. In collaboration with the Samsung Advanced Institute of Technology, the team has developed a method for generating odors that could pack the appropriate hardware into a device "small enough to fit on the back of your TV." Basically, scents are produced by an aqueous solution, like ammonia, which is heated by a thin metal wire, and eventually expelled, as an odorous gas, from a small hole in its silicone elastomer housing -- and, bam! You've got Smell-O-Vision. The team has tested its method using perfumes by Jennifer Lopez and Elizabeth Taylor, but has yet to create a working prototype. For the sake of innocent noses everywhere, let's hope Mr. Waters doesn't get a whiff of this.

  • Moneta Onyx phase-change memory prototype can write some data 7x faster than traditional SSDs

    by Zach Honig
    06.13.2011

    As file sizes for many data types continue to grow, smaller chunks are also becoming more ubiquitous, particularly on social media sites like Twitter and Facebook, and search tools like Google. These high-volume, small-size blocks of data may soon be served up from a specific type of SSD, like the Moneta Onyx prototype developed by a team at the University of California, San Diego. Onyx uses phase-change memory (PCM), which can rewrite single bits of data (1s and 0s) on demand, rather than rewriting data in larger chunks, yielding sustained 327 megabyte per second (MB/s) reads and 91MB/s writes with smaller file types -- two to seven times faster than the most efficient commercial SSDs. PCM specifically benefits granular data, rather than large files that must be transferred completely (like photos and documents), so the tech is more likely to appear on devices serving up short text-based messages. Traditional SSDs can write larger files faster than the Onyx prototype, though the new drive offers speedier read speeds across the board. It'll be at least a couple years before PCM becomes commercially available, but once (and if) it does, you'll be reading about your coworker's breakfast or college buddy's traffic jam milliseconds faster than before.

  • Audi commissions four US universities to research urban mobility issues

    by Michael Gorman
    01.20.2011

    We've seen what other companies have in store for our automotive future, and now Audi's given us a glimpse of what we can expect from its car of tomorrow. The company's Silicon Valley research lab has teamed up with four universities here in the US to develop technologies that will give city drivers the full KITT treatment -- vehicles that recognize the driver (and his or her preferences) and can detect and avoid dangers and traffic delays. Under the Audi Urban Intelligence Assist initiative, each participating university has a specific area of urban mobility research, ranging from urban crash analysis to aggregating historical and real-time traffic, parking, and pedestrian data in cities. The schools will also study how best to deliver relevant information to drivers and get them from point A to point B as easily and efficiently as possible. Looks like the groundwork is being laid for a German counterpart to GM's EN-V we test drove in Vegas, and we look forward to the fruits of their labor. Ich bin ein Ingolstädter!

  • Graduate student erases pedestrians from Google Street View

    by Sean Hollister
    08.07.2010

    We love Google, oh yes we do, but there's no question the company could use some brownie points when it comes to privacy. That's not to say Mountain View doesn't try -- the firm does blur license plates and faces in Street View when it's not unintentionally snooping our WiFi. However, a UCSD graduate student has a more thorough idea: simply make the pedestrians disappear entirely. Arturo Flores' algorithm does just that, determining what to erase and what to keep using two adjacent frames. Because Google's roaming cameras end up taking images of the same subject from multiple angles, the program can grab bits of the background (in this case, the sidewalk, lawn and building) from either side, then layer them over the pedestrian in the foreground to hide him from view. It doesn't yet work on persons who are walking the same direction as the camera, or when there are many in the frame, but these obstacles can be surmounted at a later date. Here's hoping Google's PR team gives Arturo a buzz, so we can finally enjoy architecture without all those pesky humans in the way.
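The compositing step described above — filling the masked pedestrian region with background pixels recovered from neighboring frames — can be sketched in a few lines. This is a simplified illustration, not Flores' algorithm: it assumes the adjacent views have already been warped into the current frame's geometry, which in the real system is the hard part.

```python
import numpy as np

def erase_foreground(frame, prev_frame, next_frame, mask):
    """Replace masked (pedestrian) pixels with background pixels taken
    from two adjacent, already-aligned frames.

    mask is a boolean array, True where the pedestrian was detected.
    Where both neighboring views see the background, we simply average
    them to fill the hole."""
    result = frame.copy()
    background = (prev_frame.astype(np.float64) + next_frame.astype(np.float64)) / 2
    result[mask] = background[mask].astype(frame.dtype)
    return result
```

The averaging here is the crudest possible blend; a production system would pick whichever neighboring view has an unoccluded look at each pixel and feather the seam.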

  • Scientists create sweat-monitoring underwear, break them in (video)

    by Sean Hollister
    06.21.2010

    Biochip bracelets be damned -- nanoengineers at UC San Diego want to put sensors in your skivvies. Researchers have begun prototyping a pair of tighty-whiteys coated with the requisite carbon electrodes to make electrochemical computing a reality, as it turns out the nether regions are a prime place to measure chemicals excreted in one's sweat. Until recently, there was some question whether the enzyme sensor solution would handle the stresses of daily life, so to speak, but these briefs were up to the task -- subjected to a torturous gauntlet of bending and stretching, a treated elastic waistband was still able to adequately measure chemicals as required. Funded by the U.S. Office of Naval Research, project leaders envision "smart underwear" that measures a soldier's sweat for warning signs and automatically triggers an appropriate medical dosage. We think they might be getting a wee bit ahead of themselves, though -- we don't yet know how they handle detergent, let alone a quality color-safe bleach. Video after the break, but don't expect any footage of the underpants actually being worn.

  • UCSD researchers hope to track airborne toxins with sensor-equipped cellphones

    by Donald Melanson
    05.15.2010

    If researchers the world over have their way, cellphones will one day be used to detect and track everything from nuclear radiation to pollution to cancer, and it looks like you can now add one more to the group -- some researchers from the University of California, San Diego have developed a tiny sensor that could eventually let cellphones track airborne toxins in real time. To do that, the researchers have proposed a rather novel system that would consist of a tiny silicon sensor that changes color when it interacts with various chemicals, and an equally tiny camera with a macro lens that would actually capture an image of the sensor and display it on the phone's screen. As you might have guessed, however, while the researchers are now showing off the sensor itself, they still have a ways to go on the cellphone part of the equation -- although they have apparently started work on a prototype.

  • Study finds Americans consume 34 gigabytes of information per day

    by Donald Melanson
    12.09.2009

    Well, it's a good thing life comes with an unlimited data plan, as a recent study conducted by the University of California, San Diego has found that Americans consume plenty of gigabytes in the average day. Thirty-four gigabytes, to be specific, which translates to a grand total of 3.6 zettabytes of information consumed by American households in 2008 (or 3.6 billion trillion bytes). Of course, that doesn't just consist of gigabytes "consumed" the traditional way, but instead encompasses everything from TV (still the leader by a wide margin) to phone calls to newspapers. In terms of time, the study found that Americans spent about 11.8 hours a day consuming information in one way or another, the majority of which was spent staring at a screen of some sort -- and, yes, they did take HD content into account, but its growth apparently hasn't yet resulted in a huge jump in data consumption.
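As a sanity check, the yearly total and the per-person daily figure line up. The population number below is our own assumption (roughly 300 million Americans in 2008), not from the article.

```python
# Back-of-the-envelope check of the study's totals.
ZETTA = 1e21
total_bytes_2008 = 3.6 * ZETTA            # 3.6 zettabytes for the year
bytes_per_day = total_bytes_2008 / 365    # nationwide daily consumption
population = 3.0e8                        # assumed ~300 million Americans
per_person_per_day_gb = bytes_per_day / population / 1e9
print(f"~{per_person_per_day_gb:.0f} GB per person per day")
```

That works out to roughly 33 GB per person per day, comfortably close to the study's 34 GB headline figure.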

  • Nine HDTVs form 3D visualization rig, but only in the name of science

    by Ross Miller
    08.19.2009

    If you're the kind of person who happens to have a number of LCD HDTVs lying around, we suggest you give University of California, San Diego's Calit2 Visualization Team a ring. Researchers from the group have constructed a three-column, nine-panel 3D display using flat screens from JVC, stereoscopic glasses, and "game PCs with high end NVIDIA game engines." Dubbed NexCAVE, it's a much more inexpensive version of its projector-powered StarCAVE used for data analysis, although its range is more limited -- on the plus side, however, since this is LCD, it can be used in bright rooms. At 6,000 x 1,500 pixels, the resolution isn't as mind-blowing as we'd hope, but the team is currently building a version for Saudi Arabia's King Abdullah University of Science and Technology (KAUST) that's 7 columns (totaling 21 panels) and 15,000 x 1,500 resolution. If nothing else, any chance we can play Mirror's Edge on this? Video demonstration of the nine-panel rig after the break. [Via PhysOrg]

  • Einstein robot learns to smile, teaches us how to feel

    by Donald Melanson
    07.10.2009

    By now, you're no doubt well acquainted with the Albert Hubo Einstein robot developed by the mad scientists at KAIST, but some researchers at the University of California, San Diego have also been working on their own Einstein bot for the past little while, and they've now managed to teach it some new tricks. While the bot has previously been able to display a full range of expressions through some pre-programmed facial movements, it's now able to teach itself how to smile or display other emotions thanks to a new trial-and-error technique dubbed "body babble." That apparently works by comparing Einstein's attempts at an expression with some facial recognition software, which provides Al with some positive feedback each time he manages an actual expression. Did we mention there's a video? Check it out after the break. [Via Switched] Update: The folks at UC San Diego have kindly pointed us towards a bit more background on their Einstein robot, including a video of its pre-self-teaching days, and a couple of behind-the-scenes pics. Head on past the break for one we like to call "Einstein: Behind the Face."

  • Scientists copy keys with computer imaging to make sure we feel insecure

    by Samuel Axon
    11.06.2008

    Not satisfied with the time-honored traditions of lockpicking or bump keying, computer science professor Stefan Savage and a handful of grad students at the University of California at San Diego have developed a computer system that makes a functional copy of a key based solely on a photograph, regardless of angle or distance -- the image resolution just has to be high enough to make out the details. They claim they did this "to show people that their keys are not inherently secret" so they'll be more careful about flaunting them around in their Flickr photos, but we're worldly enough to know that they really did it to steal beer from rival frat houses. Shame! [Via Switched]

  • Coaster-sized origami-optics lens boosts focal length, shrinks photog egos

    by Tim Stevens
    09.04.2008

    Sports photogs aren't compensating for something by swinging gigantic, monopod-mounted lenses; they need the focal length. Focusing and zooming on outfielders usually means glass far from the camera body, but not so when using so-called "origami optics," flat lenses being researched at UCSD Jacobs School of Engineering that use internal reflection to achieve long focal lengths. Only the outer ring actually captures the image, while the others bounce it around before depositing light onto the film or sensor. The military is sponsoring this research, wanting better eyes on its UAVs, and we're hoping for improved optics in our gadgets -- though we were equally jazzed about liquid lenses, and those haven't exactly revolutionized mobile photography yet. A snooze-inducing Engineering TV clip after the break explains it all in more detail, so don't click on until you've had your morning cuppa -- or two.

  • Expression recognition turns humans into remote controls... for robots

    by Thomas Ricker
    06.25.2008

    Jacob Whitehill at UC San Diego's Jacobs School of Engineering has demonstrated a proof of concept that allows his facial expressions to speed up and slow down video playback. Pretty sweet. But we're more interested to hear that his project is part of a larger effort at the UCSD Machine Perception Lab (gulp) to use automated face recognition to "make robots more effective teachers." We can see the future now... Human: (frowning) Robot: Aw, my meat bag is sad, I will now give it a hamburger and turn on Golden Girls. Fortunately, human teachers who've somehow missed out on the billions of years of biological evolution required to recognize the "oh face" can take advantage of this research as well. See a video demonstration of that after the break, face-controlled video here.
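Mapping an expression detector's output to a playback rate is the simplest piece of a demo like this. The mapping below is entirely our invention (the article doesn't say how Whitehill's system translates smiles into speed); it just shows the shape of the idea: clamp the detector score, then scale the rate linearly.

```python
def playback_rate(smile_intensity, base=1.0, gain=1.0):
    """Map a smile-detector score in [0, 1] to a video playback rate.

    Hypothetical mapping: half speed at a neutral face, up to 1.5x at a
    full smile (with the default base and gain). Scores outside [0, 1]
    are clamped first."""
    s = min(max(smile_intensity, 0.0), 1.0)
    return base * (0.5 + gain * s)
```

A real system would smooth the detector score over time before feeding it to the player, so a single misdetected frame doesn't make the video lurch.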

  • HIPerSpace visualization system takes the crown with 220 million pixels

    by Darren Murph
    08.23.2007

    For AV freaks enamored with their own HDTV and hardcore gamers who doubt anyone's ability to unleash more graphical firepower than that found in their rig, prepare to be humbled. As part of the HIPerSpace visualization system, engineers at the University of California, San Diego "have constructed the highest-resolution computer display in the world, with a screen resolution of up to 220 million pixels." The system, which links UCSD and UC-Irvine (responsible for the mighty HiPerWall) via dedicated optical networking, contains a "graphics super cluster" that relies on 80 NVIDIA Quadro FX 5600 GPUs. Reportedly, scientists dealing with large-scale applications involving "Earth sciences, climate prediction, biomedical engineering, genomics, and brain imaging" will be able to make use of the newfangled setup in order to better digest the information they're dealing with. Sheesh, all we want is a solid day with this thing, infinite Doritos, and Halo 3. [Via MedGadget]