silicon

Latest

  • Scientists bend gamma rays, could neuter radioactive waste (update: more credit)

    by Jon Fingas
    05.10.2012

    Bending most light is easy; bending it in gamma ray form, however, has often been deemed impossible given how hard it is for electrons to react to the extreme frequencies. LMU Munich scientist Dietrich Habs and his Institut Laue-Langevin teammate Michael Jentschel have proven that assumption wrong: an experiment in blasting a silicon prism has shown that gamma rays will refract just slightly through the right material. If a lens is made out of a large-atom substance like gold to bend the rays further, the researchers envision focused beams of energy that could either detect radioactive material or even make it inert by wiping off neutrons and protons. In theory, it could turn a nuclear power plant's waste harmless. A practical use of the technology is still some distance off -- but that it's even within sight at all just feels like a breakthrough. Update: The research also involved the Max Planck Institute of Quantum Optics' Marc Günther. Thank you, Dr. Günther.

  • Researchers build optical transistor out of silicon, provide path to all-optical computing

    by Michael Gorman
    05.01.2012

    The speed of light is the universal speed limit, so naturally, optical technologies appeal when trying to construct speedy computational devices. Fiber optics let us shoot data to and fro at top speed, but for the time being our CPUs still make their calculations using electronic transistors. Good news is, researchers from Purdue University have built an optical transistor out of silicon that can propagate logic signals -- meaning it can serve as an optical switch and push enough photons to drive two other transistors. It's constructed of a microring resonator situated next to one optical line that transmits the signal, and a second that heats the microring to change its resonant frequency. The microring then resonates at a specific frequency to interact with the light in the signal line in such a way that its output is drastically reduced and essentially shut off. Presto, an optical transistor is born. Before dreams of superfast photonic computers start dancing in your head, however, just know they won't be showing up anytime soon -- the power consumption of such transistors is far beyond their electronic counterparts due to the energy inefficient lasers that power them.
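    For the curious, the switching mechanism can be sketched numerically. This is a toy model with illustrative numbers of our own (the carrier frequency, linewidth and detuning are assumptions, not Purdue's measurements): the ring carves a Lorentzian notch out of the signal line's transmission, and heating the ring to shift that notch onto or off the signal wavelength gates the light.

```python
# Toy model of a microring switch: a ring resonator suppresses transmission
# on the signal line at its resonant frequency. Heating the ring shifts its
# resonance; tuned onto the signal, the light is blocked (OFF), detuned away,
# the light passes (ON). All figures below are illustrative assumptions.

def transmission(signal_freq_thz, ring_resonance_thz, linewidth_thz=0.01):
    """Fraction of signal power that passes the ring (1 = on, ~0 = off)."""
    detuning = signal_freq_thz - ring_resonance_thz
    dip = 1.0 / (1.0 + (2.0 * detuning / linewidth_thz) ** 2)  # Lorentzian notch
    return 1.0 - dip

SIGNAL = 193.40  # THz, roughly a 1550nm telecom carrier

on_state = transmission(SIGNAL, 193.45)   # ring detuned: light passes
off_state = transmission(SIGNAL, 193.40)  # heated ring tuned to signal: light blocked
print(f"on-state transmission: {on_state:.3f}, off-state: {off_state:.3f}")
```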

  • Intel teaches Haswell the core values of teamwork, optimism

    by Sharif Sakr
    02.09.2012

    Sure you can make wild, individualistic boasts about having a 22nm fabrication process and three different GPUs, but that stuff counts for nothing without the magic of cooperation. The Amish know that and so does Intel, which is why its forthcoming Haswell cores will support Transactional Synchronization Extensions (TSX) -- a new instruction set designed to allow cores to work together more closely without hammering each other's fingers. TSX takes greater responsibility for the division of labor between cores at the hardware level, relieving the software programmer of some of this burdensome duty and hopefully allowing for finer-grained threading as a result. The system also relies on inherent optimism, with each core assuming that the others have handled their part of the work successfully. Inevitably, there'll be occasions when this happy belief gets splintered and a bad job has to be started again from scratch, but on average things should get done quicker and leave more energy for the barn dance.
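    That optimistic scheme can be sketched in a few lines of Python. To be clear, this is a software analogy of the general idea, not Intel's actual hardware instruction set: each transaction does its work assuming no interference, validates before committing, and restarts from scratch on conflict.

```python
# Illustrative sketch of optimistic transactions: do the work on a copy,
# then validate that nobody else committed in the meantime; on conflict,
# throw the work away and retry (the "bad job started again from scratch").

class Conflict(Exception):
    pass

class VersionedCell:
    def __init__(self, value):
        self.value = value
        self.version = 0  # bumped on every successful commit

    def transact(self, fn, retries=10):
        for _ in range(retries):
            start_version = self.version      # snapshot at transaction start
            new_value = fn(self.value)        # optimistic work, no locks held
            if self.version == start_version: # validate: no interference
                self.value = new_value        # commit
                self.version += 1
                return new_value
            # conflict detected: discard the work and retry
        raise Conflict("too many conflicts")

cell = VersionedCell(0)
cell.transact(lambda v: v + 5)
cell.transact(lambda v: v * 2)
print(cell.value)  # 10
```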

  • IBM builds 9 nanometer carbon nanotube transistor, puts silicon on notice

    by Michael Gorman
    01.28.2012

    It's not the smallest transistor out there, but the boffins at IBM have constructed the tiniest carbon nanotube transistor to date. It's nine nanometers in size, making it one nanometer smaller than the presumed physical limit of silicon transistors. Plus, it consumes less power and is able to carry more current than present-day technology. The researchers accomplished the trick by laying a nanotube on a thin layer of insulation, and using a two-step process -- involving some sort of black magic, no doubt -- to add the electrical gates inside. The catch? (There's always a catch) Manufacturing pure batches of semiconducting nanotubes is difficult, as is aligning them in such a way that the transistors can function. So, it'll be some time before the technology can compete with Intel's 3D silicon, but at least we're one step closer to carbon-based computing.

  • Samsung looks to borrow $1 billion to expand production capacity in Austin, Texas

    by Darren Murph
    01.16.2012

    When you're producing chips for the iPad and iPhone, you need a serious facility to meet those demands. And evidently, Samsung doesn't expect its legal battles with Apple to cause any wrinkles in said plans. In fact, Bloomberg is reporting that Sammy has "sent requests for proposals to banks to borrow as much as $1 billion to expand production capacity at its factory in Austin, Texas," with the bonds to be issued by Samsung's US unit. It's bruited that the company -- which has around $19.2 billion in cash -- may sell its first overseas bonds since 1997 due to the impossibly low cost of borrowing money these days, and at a time when positive economic news is tough to come by, it's quite the relief to see a bit of forward progress come from historically low interest rates. Reuters is reporting that the investment will mostly be used to "boost production of mobile chips and next-generation OLED (organic light-emitting diode) display panels," but specific details beyond that remain murky.

  • Quanta sues AMD, claims it sold defective products

    by Darren Murph
    01.04.2012

    Yikes. Quanta -- also known as the planet's largest contract maker of laptops -- has just slapped a nasty lawsuit on the world's second-largest chipmaker. According to Bloomberg, Quanta is alleging that AMD and ATI sold chips that "didn't meet heat tolerances and were unfit for particular purposes." Those chips were then used in NEC-labeled machines, and caused them to "malfunction" in some regard. No big deal? Hardly. In the complaint, Quanta states that it has "suffered significant injury to prospective revenue and profits," and it's seeking a jury trial and damages for good measure. As if that weren't harsh enough, the suit also claims "breach of warranty, negligent misrepresentation, civil fraud and interference with a contract." When pinged for comment, AMD spokesman Michael Silverman stated: "AMD disputes the allegations in Quanta's complaint and believes they are without merit. AMD is aware of no other customer reports of the alleged issues with the AMD chip that Quanta used, which AMD no longer sells. In fact, Quanta has itself acknowledged to AMD that it used the identical chip in large volumes in a different computer platform that it manufactured for NEC without such issues." Somewhere, Intel has to be smirking.

  • NTT Docomo, Panasonic, Samsung and more team up to take on Qualcomm over cellphone chips

    by Richard Lawler
    12.27.2011

    Japanese mobile operator NTT Docomo just announced (as had been rumored) it's forming a joint venture with five partners -- Samsung, Panasonic, Fujitsu Limited, Fujitsu Semiconductor and NEC -- to develop and sell chips for mobile devices. According to the press release, the fabless JV will get started once all involved finish hammering out the details, and will focus on creating LTE-connected products for the global market. NTT Docomo is investing $5.8 million to create a preparatory subsidiary, Communication Platform Planning Co., with one of its executives as CEO. Currently Qualcomm makes the majority of chips found in smartphones, but it appears to have some high-powered competition on the way soon.

  • MIT slinks into a cafe, orders a side of photonic chips on silicon

    by Darren Murph
    11.25.2011

    Whiz-kids the world over have been making significant progress on the development of photonic chips -- devices that "use light beams instead of electrons to carry out their computational tasks." But now, MIT has taken the next major leap, filling in "a crucial piece of the puzzle" that just might allow for the creation of photonic chips on the standard silicon material that underlies most of today's electronics. Today, data can travel via light beams shot through optical fibers, and once it arrives, it's "converted into electronic form, processed through electronic circuits and then converted back to light using a laser." What a waste. If MIT's research bears fruit, the resulting product could nix those extra steps, allowing the light signal to be processed directly. Caroline Ross, the Toyota Professor of Materials Science and Engineering at MIT, calls it a diode for light; to construct it, researchers had to locate a material that was both transparent and magnetic. In other words, a material that only exists in the Chamber of Secrets. Hit the source link for the rest of the tale.

  • Researchers increase charging capacity, speed of lithium ion batteries by a factor of ten

    by Amar Toor
    11.16.2011

    It's not every day that we get to write about advancements in battery technology -- much less one as potentially groundbreaking as what a group of engineers at Northwestern University claim to have pulled off. In fact, Professor Harold Kung and his team say they've successfully managed to increase both the charging capacity and speed of lithium ion batteries by a factor of ten. The key, according to Kung, is the movement of the lithium ions nestled between layers of graphene. The speed at which these ions move across a battery's graphene sheets is directly related to how fast a device can recharge. To speed up this process, Kung decided to poke millions of tiny, 10-20nm-sized holes into a mobile battery's graphene layers, thereby providing the ions with a "shortcut" to the next level. As a result, Kung's perforated batteries were able to charge ten times faster than traditional cells, going from zero to hero in 15 minutes. Not satisfied with that achievement alone, Kung and his squad then set about increasing their battery's charging capacity, as well. Here, they increased the density of lithium ions by inserting small clusters of silicon between each graphene slice. This approach allows more ions to gather at the electrode and, by taking advantage of graphene's malleable properties, avoids some of the silicon expansion problems that have plagued previous attempts at capacity enhancement. The result? A battery that can run on a single charge for more than a week. "Now we almost have the best of both worlds," Kung said. "We have much higher energy density because of the silicon, and the sandwiching reduces the capacity loss caused by the silicon expanding and contracting. Even if the silicon clusters break up, the silicon won't be lost." There is, however, a downside, as both charging capacity and speed sharply fell off after 150 charges. But as Kung points out, the increase in charge retention would more than make up for this shortcoming. 
"Even after 150 charges, which would be one year or more of operation, the battery is still five times more effective than lithium-ion batteries on the market today," he told the BBC. For more technical details, hit up the links below.
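    A quick back-of-the-envelope check on those claims. The baseline figures below are our own assumptions for illustration; only the 10x and 5x factors come from the researchers.

```python
# Sanity-checking the battery claims with assumed baseline numbers:
# a conventional cell that takes 150 minutes to charge and runs a day
# per charge (both assumptions), scaled by the article's stated factors.
baseline_charge_minutes = 150.0   # assumed conventional full-charge time
baseline_runtime_days = 1.0       # assumed conventional runtime per charge

fast_charge = baseline_charge_minutes / 10  # "ten times faster"
runtime = baseline_runtime_days * 10        # 10x capacity -> "more than a week"
after_150_cycles = 5                        # still 5x today's cells, per Kung

print(f"charge time: {fast_charge:.0f} min")        # matches the quoted 15 minutes
print(f"runtime per charge: {runtime:.0f} days")
print(f"advantage after 150 cycles: {after_150_cycles}x")
```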

  • MIT unveils computer chip that thinks like the human brain, Skynet just around the corner

    by Chris Barylick
    11.15.2011

    It may be a bit on the Uncanny Valley side of things to have a computer chip that can mimic the human brain's activity, but it's still undeniably cool. Over at MIT, researchers have unveiled a chip that mimics how the brain's neurons adapt to new information (a process known as plasticity), which could help in understanding assorted brain functions, including learning and memory. The silicon chip contains about 400 transistors and can simulate the activity of a single brain synapse -- the space between two neurons that allows information to flow from one to the other. Researchers anticipate this chip will help neuroscientists learn much more about how the brain works, and could also be used in neural prosthetic devices such as artificial retinas. Moving into the realm of "super cool things we could do with the chip," MIT's researchers have outlined plans to model specific neural functions, such as the visual processing system. Such systems could be much faster than digital computers: where a digital simulation of even a simple brain circuit might take hours or days, the chip -- which operates on an analog method -- could run even faster than the biological system itself. In other news, the chip will gladly handle next week's grocery run, since it knows which foods are better for you than you ever could.

  • Intel 4004, world's first commercial microprocessor, celebrates 40th birthday, ages gracefully

    by Amar Toor
    11.15.2011

    Pull out the candles and champagne, because the Intel 4004 is celebrating a major birthday today -- the big four-oh. That's right, it's been exactly four decades since Intel unveiled the world's first commercially available CPU, with an Electronic News ad that ran on November 15th, 1971. It all began in 1969, when Japan's Nippon Calculating Machine Corporation asked Intel to create 12 chips for its Busicom 141-PF calculator. With that assignment, engineers Federico Faggin, Ted Hoff and Stanley Mazor set about designing what would prove to be a groundbreaking innovation -- a 4-bit, 16-pin microprocessor with a full 2,300 MOS transistors, and about 740kHz of horsepower. The 4004's ten micron feature size may seem gargantuan by contemporary standards, but at the time, it was rather remarkable -- especially considering that the processor was constructed from a single piece of silicon. In fact, Faggin was so proud of his creation that he decided to initial its design with "FF," in appropriate recognition of a true work of art. Hit up the coverage links below for more background on the Intel 4004, including a graphic history of the microprocessor, from the Inquirer.

  • Ferroelectric transistor memory could run on 99 percent less power than flash

    by Sharif Sakr
    09.28.2011

    We've been keeping an optimistic eye on the progress of Ferroelectric Random Access Memory (FeRAM) for a few years now, not least because it offers the tantalizing promise of 1.6GB/s read and write speeds and crazy data densities. But researchers at Purdue University reckon we've been looking in the wrong place this whole time: the real action is with their development of FeTRAM, which adds an all-important 'T' for 'Transistor'. Made by combining silicon nanowires with a ferroelectric polymer, Purdue's material holds onto its 0 or 1 polarity even after being read, whereas readouts from capacitor-based FeRAM are destructive. Although still at the experimental stage, this new type of memory could boost speeds while also reducing power consumption by 99 percent. Quick, somebody file a patent. Oh, they already did.

  • Julius Blank, chip-making pioneer and Fairchild co-founder, dies at 86

    by Amar Toor
    09.26.2011

    Somber news coming out of Palo Alto today, where Julius Blank, the man who helped found the groundbreaking chipmaker Fairchild Semiconductor Corporation, has passed away at the age of 86. The Manhattan-born Blank (pictured third from left, above) began his engineering career in 1952, when he joined AT&T's Western Electric plant in New Jersey. As a member of the engineering group at the plant, Blank helped create phone technology that allowed users to dial long-distance numbers without going through an operator. It was also at Western Electric where he met fellow engineer Eugene Kleiner. In 1956, Blank and Kleiner left AT&T to work at the lab of Nobel Prize-winning physicist William B. Shockley, but departed just one year later to start Fairchild, alongside a group of six other scientists and engineers that included future Intel Corporation founders Robert Noyce and Gordon Moore. At their new labs, Blank and his peers developed an inexpensive method for manufacturing silicon chips, earning them $1.5 million in capital from a single investor. As the only two with any manufacturing experience, Blank and Kleiner were charged with bringing the dream to fruition -- a task that required them to build the chips from scratch, beginning with the machinery for growing silicon crystals. They succeeded, of course, and in 1969, Blank left Fairchild to start Xicor, a tech firm that Intersil would later buy for $529 million, in 2004. But his legacy will forever be linked to those early days at Fairchild, where, as Blank described in a 2008 interview, he and his colleagues were able to experience the unique thrill of "building something from nothing." Julius Blank is survived by his two sons, Jeffrey and David, and two grandsons. [Photo courtesy of Joan Seidel / AP 1999]

  • Hanako 2 robot acts like a human dental patient, makes us say 'aah' (video)

    by Amar Toor
    06.30.2011

    No, she's not in a state of shock, nor is she hunting for plankton -- she's simply waiting for the dentist to polish her pearly whites, just like any other conscientious robot. Known as the Showa Hanako 2, this humanoid was originally developed last year as a tool for dentists looking to practice new procedures. Now, engineers at Japan's Showa University have updated their dental denizen, adding a motorized head and replacing her PVC skin with a more realistic silicone coating. She also boasts speech recognition capabilities and can execute freakishly natural movements, including blinking, sneezing, coughing and, under more unsavory circumstances, even choking. See her in action for yourself, after the break.

  • Apple's A6 processor may come courtesy of TSMC, Samsung left to wonder why

    by Joseph Volpe
    06.27.2011

    Apple's fondness for anorexic handhelds knows no bounds, and if this alleged deal with the Asian foundry holds water, expect to see its waistband tighten further. Rumored back before the iPad 2 launch, the house-that-Steve-built's reportedly been eyeing Taiwan Semiconductor Manufacturing Co. (TSMC) to produce an 'A6' for its upcoming iPhone refresh. While it's easy to dismiss this purported move as a direct diss to Samsung, what's more likely is that Cupertino's engaging in a competitive bit of 'size does matter' -- specifically, moving beyond the A5's 45nm process. A transition to newer, lower-power 28nm ARM chips would give Jonathan Ive's employer a distinct market advantage, dwarfing even TSMC's current 40nm process. While it's all still just speculation for now, only time and an iPhone 5 tear-down will tell for sure.

  • Tilera's new 100-core CPU elbows its way to the cloud, face-melt still included

    by Joseph Volpe
    06.21.2011

    Hundred-core chips might not be breaking news -- especially when the company announcing them is Tilera -- but what if that new multi-core CPU drew an insanely lower wattage and set its sights on powering a few cloud server farms? Well, that's exactly what chipmaker Tilera has up its silicon sleeve. "Co-developed with the world's leading cloud computing companies" -- take a guess who that might include -- the new 64-bit TileGx-3100 clocks in at up to 1.5GHz while sucking down a lighter 48W. Line that up next to the current cloud favorite, Intel's Xeon, and your power consumption is slashed nearly in half. Of course, the barrier to entry is high for the nascent chip developer, since most existing code is written for x86 -- requiring a whole new set of instructions for data centers to play nice. Expect to see this face-melting monster sometime in early 2012, by which time you'll probably have your 50,000-strong music library synced to the cloud.

  • AMD ships five million Fusion chips, says it's sold out

    by Sean Hollister
    05.28.2011

    Sounds like Notbooks are making a dent: AMD says it's shipped five million Fusion processors since the architecture's debut, according to a report at CNET. In January, the company said the hybrid CPU / GPU chips had momentum, and as of last month it was quoting 3.9 million APUs out in the wild, but this week AMD says that demand has overtaken supply and it's completely sold out of the Atom alternative. Sounds like Intel's more than justified in seeking out hybrid solutions of its own, no matter where it might have to look to get a leg up in the integrated graphics market. Here's hoping AMD's other Fusion chips show just as much pep per penny (and milliampere-hour) as the original processor.

  • Intel will mass produce 3D transistors for all future CPUs, starting with 22nm Ivy Bridge (video)

    by Sean Hollister
    05.04.2011

    Looks like 3D isn't just a fad, folks, so long as we're talking about silicon -- Intel just announced that it has invented a 3D "Tri-Gate" transistor that will allow the company to keep shrinking chips, Moore's Law naysayers be darned. Intel says the transistors will use 50 percent less power, conduct more current and provide 37 percent more speed than their 2D counterparts thanks to vertical fins of silicon substrate that stick up through the other layers, and that those fancy fins could make for cheaper chips too -- currently, though, the tri-gate tech adds an estimated 2 to 3 percent cost to existing silicon wafers. Intel says we'll see the new technology first in its 22nm Ivy Bridge CPUs, going into mass production in the second half of the year, and it's planning 14nm chips in 2013 and 10nm chips in 2015. Also, 3D transistors won't be limited to the cutting edge -- Intel reps told journalists that they "will extend across the entire range of our product line," including mobile devices. Three videos and a press release await you after the break. Chris Trout contributed to this report.
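    Incidentally, the node roadmap Intel quotes follows the classic Moore's Law cadence of roughly a 0.7x linear shrink every two years, which is easy to verify from the numbers above:

```python
# Node sizes and years as stated in the article: 22nm (2011 mass production),
# 14nm (2013), 10nm (2015). Each step is roughly a 0.7x linear shrink,
# i.e. about half the area per transistor every two years.
nodes = [(2011, 22), (2013, 14), (2015, 10)]

ratios = [n1 / n0 for (_, n0), (_, n1) in zip(nodes, nodes[1:])]
for (y0, _), (y1, _), r in zip(nodes, nodes[1:], ratios):
    print(f"{y0} -> {y1}: {r:.2f}x linear shrink, ~{r * r:.2f}x area")
```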

  • Intel touts 50Gbps interconnect by 2015, will make it work with tablets and smartphones too

    by Sharif Sakr
    04.29.2011

    Whoa there, Mr. Speedy. We've barely caught up with the 10Gbps Thunderbolt interconnect that debuted in the new MacBook Pro, and now Intel's hyperactive researchers are already chattering away about something five times faster. They're promising a new interconnect, ready in four years, that will combine silicon and optical components (a technology called silicon photonics) to pump 50Gbps over distances of up to 100m. That's the sort of speed Intel predicts will be necessary to handle, say, ultra-HD 4k video being streamed between smartphones, tablets, set-top boxes and TVs. Intel insists that poor old Mr. Thunderbolt won't be forced into early retirement, but if we were him we'd be speaking to an employment lawyer right about now.
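    To see why that figure makes sense for Intel's own 4K streaming example, a little bandwidth math helps. The frame size, bit depth and frame rate below are our own illustrative assumptions, not Intel's numbers:

```python
# Rough bandwidth estimate for uncompressed 4K video: at 3840x2160,
# 24 bits per pixel and 60 frames per second, the stream needs roughly
# 12 Gbps -- too much for 10Gbps Thunderbolt, comfortable on a 50Gbps link.

def video_gbps(width, height, bits_per_pixel, fps):
    """Raw (uncompressed) video bandwidth in gigabits per second."""
    return width * height * bits_per_pixel * fps / 1e9

uhd = video_gbps(3840, 2160, 24, 60)
print(f"uncompressed 4K/60: {uhd:.1f} Gbps")
print(f"fits in Thunderbolt (10 Gbps)? {uhd <= 10}")
print(f"fits in the proposed link (50 Gbps)? {uhd <= 50}")
```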

  • Today marks 50th anniversary of first silicon integrated circuit patent (and the entire computing industry)

    by Zach Honig
    04.25.2011

    There's little question that the last 50 years have represented the most innovative half-century in human history, and today marks the anniversary of the invention that started it all: the silicon-based integrated circuit. Robert Noyce received the landmark US patent on April 25, 1961, going on to found Intel Corporation with Gordon E. Moore (of Moore's Law fame) in 1968. He wasn't the first to invent the integrated circuit -- the inventor of the pocket calculator, Jack Kilby, patented a similar technology on a germanium wafer for Texas Instruments a few months prior. Noyce's silicon version stuck, however, and is responsible for Moore's estimated $3.7 billion net worth, not to mention the success of the entire computing industry. Holding 16 other patents and credited as a mentor of Steve Jobs, Noyce was awarded the National Medal of Technology in 1987, and continued to shape the computing industry until his death in 1990. If Moore's Law continues to hold true, as we anticipate it will, we expect the next 50 years to be even more exciting than the last. Let's meet back here in 2061.