silicon

Latest

  • Silicene might be the new graphene, now that it's been physically constructed

    by Sean Hollister
    03.28.2011

    Surely you've heard of graphene, the one-atom-thick layer of pencil lead that has the potential to change the world of computers, batteries and screens? You might want to familiarize yourself with the term "silicene," too. It's basically a version of graphene constructed out of silicon, which doesn't naturally align itself into the same eminently useful honeycomb shape -- but, given a little prod here and a layer of silver or ceramic compound there, can do much the same thing, and with better computing compatibility. First proposed around 2007, it's reportedly been produced twice now by two different teams, which gives physicists hope that it could actually be useful some day. For now, researchers need to figure out a way to easily produce it so detailed experiments can be performed -- from what we understand, the good ol' Scotch tape method just won't do the job.

  • Samsung starts baking 30nm 4Gb LPDDR2 chips, packaging 2GB mobile RAM in April

    by Richard Lai
    03.25.2011

    When it comes to mobile RAM, capacity is often what pops to mind first while we overlook speed and power consumption, but Samsung's latest delivery is worth the extra attention. Earlier this month, said Korean giant started producing 30nm 4Gb 1066Mbps LPDDR2 (or simply Mobile DDR2) chips, in order to phase out its 40nm ones that topped 2Gb at an 800Mbps transmission rate. To put it in perspective, a 40nm 1GB package consists of four 2Gb chips, whereas the new 30nm one will only need two 4Gb chips, thus reducing the package thickness by 20 percent (down to 0.8mm) and power consumption by 25 percent. It's hard to tell when we'll start seeing these bits of silicon entering the consumer market, but Samsung's already stamping out 1GB modules this month, with a 2GB version to follow next month. Oh yes, we're definitely liking the sound of 2GB RAM for mobile phones.
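For the arithmetic-minded, the package math above checks out -- here's a minimal sketch (our own, using only the bit/byte conversion: 8 Gb = 1 GB):

```python
# Back-of-the-envelope check of Samsung's package figures.
# Gb = gigabits (per DRAM die), GB = gigabytes (per package).
def chips_needed(package_gb: int, chip_gbit: int) -> int:
    """Number of DRAM dies needed to reach a package capacity in gigabytes."""
    return (package_gb * 8) // chip_gbit

old = chips_needed(1, 2)   # 40nm generation: 2Gb dies
new = chips_needed(1, 4)   # 30nm generation: 4Gb dies
print(old, new)            # 4 chips vs. 2 chips for a 1GB package
```

Halving the die count is where the 20 percent thickness reduction comes from.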

  • iPad 2 specs discerned, 900MHz dual-core ARM CPU and PowerVR SGX543MP2 GPU blow away graphical benchmarks

    by Sean Hollister
    03.12.2011

    iFixit may have physically uncovered Apple's latest silicon, but it's the processor gurus that have discovered what's truly inside -- using software benchmarks, they've unearthed the speeds and feeds of the Apple A5. As you'll no doubt be aware having read our headline above, there actually isn't a 1GHz CPU at the helm, as AnandTech and IOSnoops report the dual-core ARM chip is dynamically clocked around 900MHz, likely in search of reduced power consumption. Perhaps more interestingly for all you gamers in the audience, the iPad 2 reports that it has a dual-core PowerVR SGX543MP2 GPU on the die as originally foretold -- and, spoiler alert -- it mops the floor with both the original iPad and the Motorola Xoom. Though the new chip didn't quite demonstrate 9X the graphical prowess of its predecessor, it rendered 57.6 frames per second in a GLBenchmark test where the (admittedly higher-res) Tegra 2 tablet managed only 26.7fps, and last year's iPad pulled only 17.6fps. That's some serious Tai Chi. Hit up our source links to see the difference it can make in games like Infinity Blade. Update: Though it sure sounds like there's a dual-core ARM Cortex A9 in there, that's not yet a proven fact -- we only know that it's a dual-core ARM v7 chip which performs relatively similarly in non-graphical tests. [Thanks, Jim] [Thanks to everyone who sent this in]
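If you'd rather see the speedups as ratios, a quick sketch using the frame rates quoted above (as reported by the benchmarks, not re-measured by us):

```python
# GLBenchmark frame rates quoted in the post, and the speedups they imply.
fps = {"iPad 2": 57.6, "Motorola Xoom (Tegra 2)": 26.7, "original iPad": 17.6}

for name, rate in fps.items():
    speedup = fps["iPad 2"] / rate
    print(f"{name}: {rate:.1f} fps ({speedup:.2f}x slower than iPad 2)")
# The iPad 2 comes out roughly 2.2x the Xoom and 3.3x the original iPad --
# well short of a literal 9x, as noted above.
```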

  • Bridgelux silicon LED could mean bright future for solid state lighting

    by Christopher Trout
    03.11.2011

    While Democrats and Republicans squabble over the future of the incandescent light bulb, a Livermore-based company has produced an LED that they claim could lead to brighter, more affordable solid state lighting. By growing gallium nitride on low-cost silicon wafers, as opposed to the typical sapphire and silicon carbide substrates, the company has achieved an output of 135lm/W (lumens per watt) with a color correlated temperature of 4730K -- brighter than any affordable LED lighting solution we've ever seen. Of course, this isn't the first time efficacy of this level has been achieved, and we've yet to see a practical application, but if Bridgelux's numbers are right, this could mean a 75 percent cut in LED production costs. The company expects the technology to make its way to real world lights in the next two to three years -- perhaps by then the furor over pigtail light bulbs will have settled a bit. Enlightening PR after the break.

  • Scientists grow nanolasers on silicon chips, prove microscopic blinkenlights are the future

    by Tim Stevens
    02.07.2011

    What you see above may look like a nanoscale Obelisk of Light, ready to protect the tiny forces of Nod, but that's not it at all. It's a nanolaser, grown directly on a field of silicon by scientists at Berkeley. The idea is to rely on light to transmit data inside of computers, rather than physical connections, but until now finding a way to generate that light on a small enough scale to work inside circuitry without damaging it has been impossible. These indium gallium arsenide nanopillars could solve that, grown on and integrated within silicon without doing harm. Once embedded they emit light at a wavelength of 950nm, as shown in the video below. [Thanks, Paul]

  • IBM says graphene won't fully replace silicon in CPUs

    by Donald Melanson
    01.25.2011

    As you may have been able to tell from the flurry of research that's occurred over the past few years (which has even resulted in a Nobel Prize), there's plenty of folks betting on graphene as the next big thing for computing. One of the big players in that respect has been IBM, which first opened up the so-called graphene bandgap and has created some of the fastest graphene transistors around, but is now sounding a slightly more cautious tone when it comes to the would-be demise of silicon-based CPUs. Speaking with Custom PC, IBM researcher Yu-Ming Lin said that "graphene as it is will not replace the role of silicon in the digital computing regime," and further explained that "there is an important distinction between the graphene transistors that we demonstrated, and the transistors used in a CPU." To that end, he notes that unlike silicon, "graphene does not have an energy gap," and that it therefore cannot be completely "switched off," which puts it at quite a disadvantage compared to silicon. Intel's director of components research, Mike Mayberry, also chimed in on the matter, and noted that "the industry has so much experience with it that there are no plans to move away from silicon as the substrate for chips." That doesn't mean that there still isn't a bright future for graphene, though -- Lin gives the example of a hybrid circuit, which could use graphene as a complement to silicon in order to "enrich the functionality of computer chips."

  • IBM forms new partnership with ARM in hopes of developing ludicrously small chip processing technology

    by Ben Bowers
    01.19.2011

    We've seen IBM and ARM team up before, but this week both companies announced a new joint initiative to develop 14nm chip processing technology. That's significantly smaller than the 20nm SoC technology ARM hopes to create in partnership with TSMC, and makes the company's previous work with IBM on 32nm semiconductors look like a cakewalk. The potential benefits, though, are faster processors that require less power and carry lower per-unit manufacturing costs. Who knows if or when we'll see tangible results from the tag team, but if IBM's Watson can beat Jeopardy champions, further reducing the average size of a feature that can be created on a chip should be elementary, right? To read over the full announcement check out the press release after the break.

  • Sculpted Eers fills ears with silicone, makes custom-molded headphones in four minutes flat (ears-on)

    by Sam Sheffer
    01.06.2011

    We've actually seen custom fit headphones before, but we figured something cheaper would rise up in the future. Here at CES, we stumbled upon what Sonomax is calling Sculpted Eers. Starting at $199, this one-time-use, do-it-yourself molding kit will create custom fit in-ear headphones -- and the best part: it only takes four minutes. We actually got the chance to have a pair molded for us. Head past the break for some hands, er, ears-on video action!

  • Reebok sets sights on flexible computing sportswear, partners with startup team

    by Sean Hollister
    12.10.2010

    Science has prototyped flexible versions of just about everything an ever-loving geek needs: displays, memory, batteries, LEDs, speakers and an input device or three. Now, Reebok's looking to put some of that computing power up our sleeves. The apparel manufacturer's teamed up with MC10 -- a startup founded by our old friend John Rogers, who helped pioneer the field -- with the intent to build "conformable electronics" into high-performance clothing for athletes over the next couple of years. Though the company told MIT Technology Review the devices typically consist of thin silicon strips printed onto flexible materials, and that they might measure metabolism and performance using embedded sensors, hard details are few -- the only thing we know for sure is that a flexible tech scientist just scored a partnership with a major company, and we're hopeful they'll make something neat. PR after the break.

  • IBM breakthrough brings us one step closer to exascale computing, even more intense chess opponents

    by Darren Murph
    12.01.2010

    The path to exascale computing is a long and winding one, and it's dangerously close to slipping into our shunned bucket of "awesome things that'll never happen." But we'll hand it to IBM -- those guys and gals are working to create a smarter planet, and against our better judgment, we actually think they're onto something here. Scientists at the outfit recently revealed "a new chip technology that integrates electrical and optical devices on the same piece of silicon, enabling computer chips to communicate using pulses of light (instead of electrical signals), resulting in smaller, faster and more power-efficient chips than is possible with conventional technologies." The new tech is labeled CMOS Integrated Silicon Nanophotonics, and if executed properly, it could lead to exaflop-level computing, or computers that could handle one million trillion calculations per second. In other words, your average exascale computer would operate around one thousand times faster than the fastest machine today, and would almost certainly give Garry Kasparov all he could stand. When asked to comment on the advancement, Dr. Yurii A. Vlasov, Manager of the Silicon Nanophotonics Department at IBM Research, nodded and uttered the following quip: "I'm an IBMer, and exascale tomfoolery is what I'm working on."* *Not really, but you believed it, didn't you?
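A quick scale check on those numbers, for the unit-conversion fans (our own arithmetic, not IBM's):

```python
# "One million trillion" calculations per second really is an exaflop:
# a million (10**6) times a trillion (10**12) is 10**18.
EXAFLOP = 10**18
assert EXAFLOP == 10**6 * 10**12

# A machine a thousand times slower sits at 10**15 operations per second --
# a petaflop, which is the class of 2010's fastest supercomputers.
PETAFLOP = EXAFLOP // 1000
print(PETAFLOP)  # 1000000000000000
```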

  • Qualcomm teases 28nm dual-core Snapdragons, pixel-punching Adreno 300 GPU

    by Sean Hollister
    11.18.2010

    By the time Qualcomm's 1.5GHz QSD8672 Snapdragon finally makes it to market, it might be obsolete -- the company just announced that the new 28nm dual-core MSM8960 system-on-a-chip will have five times the performance and consume 75 percent less power than the original Snapdragon when it arrives in 2011. It's got the usual WiFi, GPS, Bluetooth and FM radio modules, plus a multi-mode LTE / 3G modem, and reportedly four times the graphical muscle on board. Speaking of graphics, Qualcomm separately took the time to detail a new GPU: the Qualcomm Adreno 300 series, which will allegedly offer the gaming performance of an Xbox 360 or PS3. We'd say "We'll believe it when we see it," but that would imply doubt -- the reality is that we just want to feast our eyes on mobile gaming bliss as soon as humanly possible. [Thanks, PhineasJW]
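Worth noting what those two claims combine to in performance per watt -- a small sketch using only the relative figures Qualcomm quoted:

```python
# "Five times the performance at 75 percent less power," expressed as
# performance per watt (relative units; both inputs are Qualcomm's claims).
perf_ratio = 5.0          # MSM8960 vs. original Snapdragon
power_ratio = 1.0 - 0.75  # 75 percent less power means 25% of the original draw

perf_per_watt = perf_ratio / power_ratio
print(perf_per_watt)  # 20.0 -- a twentyfold efficiency claim, if both numbers hold
```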

  • AMD sees a tablet chip in its future, and an end to the core-count wars

    by Sean Hollister
    10.14.2010

    AMD told us that it wasn't terribly interested in the iPad market, and would wait and see if touchscreen slates took off, but CEO Dirk Meyer changed the company's tone on tablets slightly after reporting a $118 million net loss (on $1.62 billion in revenue) in a Q3 2010 earnings call this afternoon. First revealing his belief that tablets will indeed cannibalize the notebook and netbook markets, he later told investors that he actually expects AMD's netbook parts to start appearing in OEM slates in the next couple of years, and that AMD itself would "show up with a differentiated offering with great graphics and video technology" when the market becomes large enough to justify an R&D investment. Elsewhere, AMD CTO of servers Donald Newell prognosticated that the number of individual CPUs on a chip won't go up forever: "There will come an end to the core-count wars," he told IDG News. Just as the megahertz race was eventually defeated by thermal restrictions, so too will the number of cores on a chip cease to increase. "I won't put an exact date on it, but I don't myself expect to see 128 cores on a full-sized server die by the end of this decade," he said. So much for our Crysis-squashing terascale superchip dreams, we suppose.

  • NC State patents multifunctional smart sensors, looks to 'revolutionize energy and communications infrastructure'

    by Darren Murph
    10.04.2010

    Bold words coming from a program that choked in epic fashion this past Saturday in front of 58,000+, don't you think? Thankfully for those who are actually involved in the global energy and communications infrastructure (not to mention depressed alumni), NC State's athletics department is far removed from its research labs, and the university's latest development was born and bred in the latter. A team of researchers has managed to patent a new technology that is expected to enable the development of "high-power, high-voltage and high-current devices that are critical for the development of energy distribution devices, such as smart grid technology and high-frequency military communications." The secret? Integrating gallium nitride (GaN) sensors and devices directly into silicon-based computer chips, a feat that hasn't been accomplished by any team prior. According to Dr. Jay Narayan, this newfangled integration has "enabled the creation of multifunctional smart sensors, high-electron mobility transistors, high-power devices, and high-voltage switches for smart grids," and it also makes a broader range of radio frequencies available -- something that'll obviously be beneficial in the advancement of communications. Best of all, a US-based corporation is already in the process of licensing the technology, so it's likely that we'll see this in use in the not-too-distant future. An ACC championship, however, remains far more elusive.

  • Researchers develop means to reliably read an electron's spin, take us one step closer to the quantum zone

    by Tim Stevens
    09.30.2010

    Another day, another step bringing us closer to the next big revolution in the world of computing: replacing your transistory bits with qubits. Researchers at Australia's Universities of New South Wales and of Melbourne, along with Finland's Aalto University, have achieved the impossibly tiny goal of reliably reading the spin of a single electron. That may not sound like much, but let's just see you do it quickly without affecting said spin. This particular implementation relies on single atoms of phosphorus embedded in silicon. Yes, silicon, meaning this type of qubit is rather more conventional than others we've read about. Of course, proper quantum computers depend on reading and writing the spin of individual electrons, so as of now we effectively have quantum ROM. When will that be quantum RAM? They're still working on that bit.

  • Silicon carbide sensors developed for transmitting inside volcanoes

    by Joseph L. Flatley
    09.21.2010

    There's one serious obstacle to volcano research: volcanoes, like, shoot lava. Sure, you could aim a thermal camera at one from a safe distance, but where's the fun in that? On the other hand, researchers at Newcastle University are developing silicon carbide-based components for a device that they say will be able to withstand 900° Celsius temperatures -- just the thing to sense what's going on inside a volcano and transmit the info in real time. Not only will this allow researchers to better understand conditions leading up to an eruption, it might also someday signal an eruption before it occurs. "At the moment we have no way of accurately monitoring the situation inside a volcano," says NU's Dr. Alton Horsfall. "With an estimated 500 million people living in the shadow of a volcano this is clearly not ideal." Since silicon carbide is more resistant to radiation than plain ol' silicon, the tech can also be used inside nuclear power plants or even as radiation sniffers in places that might face a terror attack.

  • AMD throws down gauntlet, pits Zacate netbook chip against Intel's Core i5 in City of Heroes duel (video)

    by Sean Hollister
    09.14.2010

    We knew AMD planned to upstage Intel in San Francisco this week, but we didn't realize just how far Chipzilla's rival would go -- the company's demonstrating the power of its new Zacate APU by having it trounce an Intel Core i5-520M in a graphical superhero showdown. Though we've never really thought much of Intel's integrated graphics anyhow (though we're giving Sandy Bridge's technique the benefit of the doubt), watching a netbook part beat a 2.4GHz Core i5 at anything is truly something else. While AMD won't speak to the clockspeed or price of its new dual-core chips, it says the 18W Zacate and 9W Ontario should appear in devices with over 8 and 10 hours of battery life respectively when they likely ship to consumers early next year. Video after the break.

  • Intel's Sandy Bridge, eyes-on

    by Sean Hollister
    09.13.2010

    This is Intel's Sandy Bridge -- the actual silicon itself. And if you think about what the previous generation of Core processors looked like under their heatspreader hoods, that internal codename actually makes a good bit of sense now. But we're sure you'd rather know what's inside. To that end, you'll find a handy diagram right after the break.

  • Silicon oxide forms solid state memory pathways just five nanometers wide

    by Sean Hollister
    09.03.2010

    Silicon oxide has long played the sidekick, insulating electronics from damage, but scientists at Rice University have just discovered the dielectric material itself could become a fantastic form of storage. Replacing the 10-nanometer-thick strips of graphite used in previous experiments with a layer of SiOx, graduate student Jun Yao discovered the latter material worked just as well, creating 5nm silicon nanowires that can be easily joined or broken (to form the bits and bytes of computer storage) when a voltage is temporarily applied. Considering that conventional computer memory pathways are still struggling to get to 20nm wide, this could make for quite the advance in storage, though we'll admit we've heard tell of one prototype 8nm NAND flash chip that uses nanowires already. Perhaps it's time for silicon oxide to have a turn in the limelight.

  • HP Labs teams up with Hynix to manufacture memristors, plans assault on flash memory in 2013

    by Ross Miller
    08.31.2010

    The memristor's come a long way since being hypothesized back in 1971. If you ask HP Labs, the history of this particular memory technology didn't hit its next milestone for almost four decades, when the company produced the very first memory resistor chip. Just last month, the Labs group proved its little memristor could handle logic and data storage, and as of today, the company's announcing a joint development agreement with Hynix Semiconductor, with a goal of bringing these chips to the market -- and rendering flash memory obsolete. That challenge against flash (not a very popular naming convention these days, it seems) was thrown down by HP Labs Senior Fellow Stan Williams, who posits that the memristor is "a universal memory that over a sufficient amount of time will replace flash, DRAM, magnetic hard disks, and possibly even SRAM." But onto the immediate, albeit aspirational goal (i.e. not a commitment, which he stressed on multiple occasions): Williams hopes to see the memristors in consumer products by this time in 2013, for approximately the price of what flash memory will be selling for at the time but with "at least twice the bit capacity." He also claims a much smaller power requirement of "at least a factor of 10" and an even faster operation speed, in addition to previously discussed advantages like read / write endurance. With Hynix on board, the goal is to make these "drop-in replacements" for flash memory, whereby the same protocols and even the same connectors will work just fine. For HP, however, Williams says there'll be an initial competitive advantage for the company due to its comfort level with memristors' unique properties, but that other companies will be encouraged to license the technology and experiment with new possibilities in hardware design. Williams wouldn't give any specific product examples where we might initially see the memristor, except to repeat that it'll be anywhere and everywhere flash memory is.
    Fighting words, indeed. We normally don't get excited about minute hardware components -- not often, at least -- but we gotta say, the seeds of the future look mighty interesting. Can't wait to see what germinates. Highlights from our talk with Williams after the break.

  • Melting silicon 'in reverse' can help purify it, result in cheaper electronics

    by Vlad Savov
    08.02.2010

    Just our favorite combination of news: a mind-bending innovation that can have a very practical impact on our daily tech consumption. MIT scientists have found that silicon -- when combined in the right dosage with other metals -- can actually be made to melt by reducing its temperature. Typically, you'd require 1,414 degrees Celsius to liquefy solid silicon, but the intermixed variant discussed here need only reach 900 degrees before its slow cooling process starts turning it gooey. The great advantage to this discovery is that because the impurities tend to separate off into the liquid part, there's now a practicable way to filter them out, meaning that things like solar cells won't require the same high grade of silicon purity for their construction -- which in turn might lead to us being able to afford them one day. Of course, that's getting way too far ahead of ourselves, as the research is still ongoing, but good news is good news no matter the timescale.