silicon

Latest

  • Intel announces Quark system on a chip, the company's smallest to date

    by Michael Gorman, 09.10.2013

    The hits keep coming from IDF. After showing off svelte new 14nm silicon built for laptops, CEO Brian Krzanich announced a brand new SoC series named Quark. It's the smallest SoC the company has ever built, with processor cores one-fifth the size of Atom's, and is built upon an open architecture meant to spur its use. Early on in his keynote, Krzanich said that Intel plans to "lead in every segment of computing," and Quark is positioned to put Intel in wearables -- in fact, he even showed off a prototype smartwatch platform Intel constructed to help drive wearable development. And Intel President Renee James pointed out that Quark's designed for use in integrated systems, so we'll be seeing Quark in healthcare and municipal use cases, too. Unfortunately, no details about the new SoC's capabilities or specs are yet available, but we can give you some shots of Intel's wearable wristband prototype in our gallery below.

  • Researchers claim 'almost instantaneous' quantum computing breakthrough

    by Timothy J. Seppala, 09.05.2013

    Silicon is great, but we're tickling the edges of its speed limit. As a result, researchers at Oregon State University have been plugging away at a low-cost, faster alternative for the past three years: tiny quantum devices called metal-insulator-metal diodes, or MIM diodes for short. Silicon chips involve electrons traveling through a transistor, but MIM diodes send electrons "tunneling" through the insulator in a quantum manner, such that they appear "almost instantaneously" on the other side. The tech's latest development doubles the insulator fun -- transforming the MIM into a MIIM (pictured above) -- giving the scientists another method for engineering quantum mechanical tunneling. With MIIMs, super fast transistor-less computers could be around the corner. The Oregon researchers aren't bold enough to put a date on making any of this happen outside of the lab, but they promise entire new industries may "ultimately emerge" from their work, and we're far too under-qualified to doubt them.
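
    For a rough sense of why tunneling can look "almost instantaneous" and why the insulator matters so much, the textbook approximation for a thin rectangular barrier (generic quantum mechanics, not anything specific to the Oregon State devices) puts the transmission probability at roughly

      T \approx \exp\left(-\frac{2d\sqrt{2m(V_0 - E)}}{\hbar}\right)

    where d is the insulator thickness and V_0 - E is how far the electron's energy sits below the barrier. The exponential dependence on d is why a nanometers-thin insulator passes a useful fraction of electrons with essentially no transit delay, and why adding a second insulator layer -- the extra "I" that turns MIM into MIIM -- gives the researchers another knob for shaping that barrier.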

  • NXP's silicon fingerprinting promises to annoy the heck out of ID hackers

    by Sharif Sakr, 02.21.2013

    It's 2013 and white hat hackers like Adam Laurie are still breaking into ID chips that are supposed to be secure. How come? Partly it's the way of the world, because no man-made NFC or RFID security barrier can ever be truly impervious. But in practical terms, a chip's vulnerability often stems from the fact that it can be taken apart and probed at a hacker's leisure. The secure element doesn't necessarily need to have power running through it or to be in the midst of near-field communication in order to yield up its cryptographic key to a clever intruder who has sufficient time and sufficient desire to breach the security of a smartphone, bank card or national border. Which brings us to the latest device in NXP's SmartMX2 range -- a piece of technology that is claimed to work very differently and that is expected to hit the market next year. Instead of a traditional key stored in the secure element's memory, every single copy of this chip carries a unique fingerprint within the physical structure of its transistors. This fingerprint (aka Physically Unclonable Function, or PUF) is a byproduct of tiny errors in the fabrication process -- something chip makers usually try to minimize. But NXP has found a way to amplify these flaws in a controlled way and use them for identification, and it'd take a mightily well-equipped criminal (or fare dodger, or Scrabble cheater) to reverse engineer that.
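
    To make the fingerprint idea concrete, here's a minimal, purely hypothetical Python sketch of the general PUF recipe: fabrication noise gives each chip a stable-but-slightly-noisy bit pattern, and public "helper data" lets the chip reproduce the same key on every read without the key ever being stored. NXP hasn't published how SmartMX2 actually does this, so treat the names and the simple repetition code below as illustration only.

      import random

      KEY_BITS = 32      # length of the derived key
      REPEAT = 5         # repetition-code factor that absorbs read noise
      FP_LEN = KEY_BITS * REPEAT

      def read_fingerprint(device_bias, noise=0.02):
          # Simulate reading the chip's fingerprint: each bit is fixed at
          # fabrication time but has a small chance of flipping on any read.
          return [b ^ (random.random() < noise) for b in device_bias]

      def enroll(fingerprint, key):
          # One-time step: publish helper data that ties the noisy
          # fingerprint to the key without revealing the key itself.
          code = [bit for bit in key for _ in range(REPEAT)]
          return [c ^ f for c, f in zip(code, fingerprint)]

      def reconstruct(fingerprint, helper):
          # Every power-up: re-read the fingerprint, undo the helper data,
          # then majority-vote each group of REPEAT bits back into a key bit.
          code = [h ^ f for h, f in zip(helper, fingerprint)]
          return [int(sum(code[i:i + REPEAT]) > REPEAT // 2)
                  for i in range(0, FP_LEN, REPEAT)]

      device_bias = [random.randint(0, 1) for _ in range(FP_LEN)]  # this chip's unique "flaws"
      key = [random.randint(0, 1) for _ in range(KEY_BITS)]
      helper = enroll(read_fingerprint(device_bias), key)
      print(reconstruct(read_fingerprint(device_bias), helper) == key)  # True, with overwhelming probability

    The point of the real thing is the same: an attacker who decaps the chip finds only helper data, and cloning the key means reproducing the physical manufacturing noise rather than just dumping a memory cell.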

  • USC battery wields silicon nanowires to hold triple the energy, charge in 10 minutes

    by Jon Fingas, 02.13.2013

    There's no shortage of attempts to build a better battery, usually with a few caveats. USC may have ticked all the right checkboxes with its latest discovery, however. Its use of porous, flexible silicon nanowires for the anodes in a lithium-ion battery delivers the high capacity, fast recharging and low costs that come with silicon, but without the fragility of earlier attempts relying on simpler silicon plates. In practice, the battery could deliver the best of all worlds. Triple the capacity of today's batteries? Full recharges in 10 minutes? More than 2,000 charging cycles? Check. It all sounds a bit fantastical, but USC does see real-world use on the horizon. Researchers estimate that there should be products with silicon-equipped lithium-ion packs inside of two to three years, which isn't long to wait if the invention saves us from constantly hunting for the nearest wall outlet.

  • University of Michigan makes silicon from liquid metal, aims for low-cost chips

    by Jon Fingas, 01.25.2013

    Forming silicon normally requires extreme temperatures of more than 2,000F, with the expensive energy to match. The University of Michigan has developed a technique involving liquid metal that could shed most of the heat -- and cost. By coating a liquid gallium electrode with silicon tetrachloride, researchers can generate pure silicon crystals using the gallium's electrons at a comparatively cool 180F. While the crystals are currently small, bigger examples are at least theoretically possible with new metals or other refinements. Any eventual commercial success could lead to much easier, and likely cheaper, manufacturing for processors and solar cells; given that silicon still forms the backbone of most technology, real-world use can't come quickly enough.
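
    In outline (our simplification, not the paper's full mechanism), the electrode chemistry is the familiar electrochemical reduction of silicon tetrachloride,

      \mathrm{SiCl_4} + 4\,e^- \longrightarrow \mathrm{Si} + 4\,\mathrm{Cl^-}

    with the liquid gallium both delivering the electrons and giving the freed silicon a medium in which it can crystallize at 180F instead of 2,000F.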

  • Fraunhofer black silicon could catch more energy from infrared light, go green with sulfur

    by Jon Fingas, 10.04.2012

    Generating solar power from the infrared spectrum, or even nearby frequencies, has proven difficult in spite of a quarter of the Sun's energy passing through those wavelengths. The Fraunhofer Institute for Telecommunications may have jumped that hurdle to efficiency through sulfur -- one of the very materials that solar energy often helps eliminate. By irradiating ordinary silicon with femtosecond laser pulses in a sulfur-rich atmosphere, the technique melds sulfur with silicon and makes it easier for electrons excited by infrared light to build into the frenzy needed for conducting electricity. The black-tinted silicon that results from the process is still in the early stages and needs improvements to automation and refinement to become a real product, but there's every intention of making that happen: Fraunhofer plans a spinoff to market finished laser systems for solar cell builders who want their own black silicon. If all goes well, the darker shade of solar panels could lead to a brighter future for clean energy.

  • New process for nanotube semiconductors could be graphene's ticket to primetime (video)

    by James Trew, 09.30.2012

    In many ways, graphene is one of technology's sickest jokes. The tantalizing promise of cheap-to-produce, efficient-to-run materials that could turn the next page in gadget history has always remained frustratingly out of reach. Now, a new process for creating semiconductors grown on graphene could see the super material commercialized in the next five years. Developed at the Norwegian University of Science and Technology, the patented process "bombs" graphene with gallium, which forms droplets that naturally arrange themselves to match graphene's famous hexagonal pattern. Then, arsenic is added to the mix, which enters the droplets and crystallizes at the bottom, creating a stalk. After a few minutes of this process the droplets have been raised to the desired height. The new process also does away with the need for a (relatively) thick substrate to grow the nanowire on, making it cheaper, more flexible and transparent. The inventors state that this could be used in flexible and efficient solar cells and light emitting diodes. We say forward the revolution.

  • Researchers create working quantum bit in silicon, pave way for PCs of the future

    by Sarah Silbert, 09.21.2012

    If you've been paying attention, you know the quantum computing revolution is coming -- and so far the world has a mini quantum network, not to mention the $10 million D-Wave One, to show for it. Researchers from the University of Melbourne and University College London have now developed the "first working quantum bit based on a single atom of silicon." By measuring and manipulating the magnetic orientation, or spin, of an electron bound to a phosphorus atom embedded in a silicon chip, the scientists were able to both read and write information, forming a qubit, the basic unit of data for quantum computing. The team used a silicon transistor, which detects the electron's spin and captures its energy when the spin's direction is "up." Once the electron is in the transistor, scientists can change its spin state any way they choose, effectively "writing" information and giving them control of the quantum bit. The next step will be combining two qubits to form a logic gate, with the ultimate goal being a full-fledged quantum computer capable of crunching numbers, cracking encryption codes and modeling molecules that would put even supercomputers to shame. But, you know, baby steps.
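
    What makes the electron's spin a qubit rather than an ordinary bit is that, between measurements, it can hold a superposition of its two states -- in the standard notation (general quantum computing, not anything unique to this experiment):

      |\psi\rangle = \alpha\,|\!\uparrow\rangle + \beta\,|\!\downarrow\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1

    Reading the transistor collapses that state to a definite up or down, which is the "read" half of what the team demonstrated; setting the spin at will is the "write" half.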

  • Seed-sized A*STAR antenna could open the door to 20Gbps wireless

    by Jon Fingas, 08.29.2012

    Antennas have often capped the potential speed of a wireless link -- the 450Mbps in modern 802.11n WiFi routers is directly linked to the use of a MIMO antenna array to catch signals more effectively, for example. That ceiling is about to get much higher, if A*STAR has anything to say about it. The use of a polymer filling for the gaps instead of air lets the Singapore agency create a 3D, cavity-backed silicon antenna that measures just 0.06 by 0.04 inches, roughly the size of a seed on your hamburger bun, even as it pushes wireless speeds to a breakneck pace. The new antenna generates a signal 30 times stronger than on-chip rivals at an ultrawideband-grade 135GHz, and musters a theoretical peak speed of 20Gbps -- enough that 802.11ac WiFi's 1.3Gbps drags its heels by comparison. Before we get ahead of ourselves on expecting instant file transfers at short distances, there's the small matter of getting a chip that can use all that bandwidth. Even the 7Gbps of WiGig wouldn't saturate the antenna, after all. Still, knowing that A*STAR sees "immense commercial potential" in its tiny device hints that wireless data might eventually blow past faster wired standards like Thunderbolt.
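
    As a back-of-the-envelope comparison (idealized line rates, ignoring every real-world overhead), here's how long a 4GB file would take at the speeds mentioned above:

      # Hypothetical transfer-time comparison at raw line rate
      file_bits = 4 * 8 * 10**9  # a 4GB file, in bits (decimal gigabytes)
      for name, gbps in [("802.11ac", 1.3), ("WiGig", 7.0), ("A*STAR antenna", 20.0)]:
          print(f"{name}: {file_bits / (gbps * 10**9):.1f} s")
      # 802.11ac: 24.6 s   WiGig: 4.6 s   A*STAR antenna: 1.6 s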

  • Acronym-loving Samsung joins Intel and TSMC, buys stake in ASML

    by Daniel Cooper, 08.27.2012

    Samsung's round of cash-flashing continues with a $629 million purchase of a three-percent stake in ASML. It's joining Intel and TSMC in pumping money into the Dutch business, developing tooling for chip-making machines with Extreme Ultraviolet Lithography (EUV) designed to "extend Moore's Law." It'll also help reduce the cost of future silicon, since it'll enable the companies to use wider silicon wafers along the manufacturing line. Given that Samsung's investment caps off a project to raise nearly $5 billion in cash and that ASML's home is just five miles west of PSV Eindhoven's stadium, we just hope they threw in a few home tickets for their trouble.

  • Harvard makes distortion-free lens from gold and silicon, aims for the perfect image (or signal)

    by Jon Fingas, 08.25.2012

    Imaging has been defined by glass lenses for centuries, and even fiber optics haven't entirely escaped the material's clutch. Harvard's School of Engineering and Applied Sciences might have just found a way to buck those old (and not-so-old) traditions. A new 60-nanometer thick silicon lens, layered with legions of gold nanoantennas, can catch and refocus light without the distortion or other artifacts that come with having to use the thick, curved pieces of glass we're used to -- it's so accurate that it nearly challenges the laws of diffraction. The lens isn't limited to bending one slice of the light spectrum, either: it can work anywhere from near-infrared to terahertz frequencies, suiting it both to photography and to shuttling data. We don't know what obstacles might be in the way of production, which leads us to think that we won't be finding a gold-and-silicon lens attached to a camera or inside a network connection anytime soon. If the technology holds up under scrutiny, though, it could ultimately spare us from the big, complicated optics we often need to get just the right shot.

  • ARM and Globalfoundries hammer out deal to promote 20nm mobile chips

    by Sharif Sakr, 08.13.2012

    Sure it's British, but ARM's mobile empire is being built through careful alliances rather than conquest. The chip designer's latest deal with Globalfoundries, which mirrors a very similar agreement signed with rival foundry TSMC last month, is a case in point. It's designed to promote the adoption of fast, energy-efficient 20nm processors by making it easy for chip makers (like Samsung, perhaps) to knock on Globalfoundries' door for the grunt work of actually fabricating the silicon -- since the foundry will now be prepped to produce precisely that type of chip. As far as the regular gadget buyer is concerned, all this politicking amounts to one thing: further reassurance that mobile processor shrinkage isn't going to peter out after the new 32nm Exynos chips or the 28nm Snapdragon S4 -- it's going to push on past the 22nm benchmark that Ivy Bridge already established in the desktop sphere and hopefully deliver phones and tablets that do more with less juice.

  • Intel to buy 15 percent of silicon fab equipment maker ASML, wants manufacturing machines made faster

    by Michael Gorman, 07.10.2012

    Chipzilla didn't get its position as the king of semiconductors by twiddling its thumbs, folks. It became a Valley behemoth by delivering us faster and better silicon, and its latest $4.1 billion purchase -- a 15 percent stake in silicon manufacturing equipment maker ASML Holding NV -- should help keep Intel atop the CPU heap. You see, Intel's in the process of retooling its chip manufacturing process to use bigger diameter silicon wafers, which'll make those Ivy Bridge, ValleyView and other future chips cheaper for all of us. Such retooling can take years to implement, which is likely why Intel was willing to plunk down so much cash to ensure nothing futzes with its manufacturing timetable. The company's investment will presumably give it the clout to get ASML's crucial lithography equipment on the fast track to completion. Hop to it, fellas, we want our CPUs at bargain-basement prices, and we want them now.

  • Researchers partially automate CPU core design, aim to fast track new PC processor production

    by Michael Gorman, 06.27.2012

    Tired of the year-long wait (or more) between new silicon architecture offerings from Chipzilla and AMD? Well, if some Wolfpack researchers have anything to say about it, we'll measure that wait in months thanks to a new CPU core design tool that automates part of the process. Creating a new CPU core is, on a high level, a two-step procedure. First, the architectural specification is created, which sets the core's dimensions and arranges its components. That requires some heavy intellectual lifting and teams of engineers to complete. Previously, similar manpower was needed for the second step, where the architecture spec is translated into an implementation design that can be fabricated in a factory. No longer. The aforementioned NC State boffins have come up with a tool that allows engineers to input their architecture specification, and it generates an implementation design that's used to draw up manufacturing blueprints. The result? Considerable time and manpower savings in creating newly designed CPU cores, which means that all those leaked roadmaps we're so fond of could be in serious need of revision sometime soon.

  • Engadget Primed: why nanometers matter (and why they often don't)

    by Sharif Sakr, 06.15.2012

    Primed goes in-depth on the technobabble you hear on Engadget every day -- we dig deep into each topic's history and how it benefits our lives. You can follow the series here. Looking to suggest a piece of technology for us to break down? Drop us a line at primed *at* engadget *dawt* com. Welcome to one of the most unnecessarily complicated questions in the world of silicon-controlled gadgets: should a savvy customer care about the underlying nature of the processor in their next purchase? Theoretically at least, the answer is obvious. Whether it's a CPU, graphics card, smartphone or tricorder, it'll always receive the Holy Grail combo of greater performance and reduced power consumption if it's built around a chip with a smaller fabrication process. That's because, as transistors get tinier and more tightly packed, electrons don't have to travel so far when moving between them -- saving both time and energy. In other words, a phone with a 28-nanometer (nm) processor ought to be fundamentally superior to one with a 45nm chip, and a PC running on silicon with features etched at 22nm should deliver more performance-per-watt than a 32nm rival. But if that's true, isn't it equally sensible to focus on the end results? Instead of getting bogged down in semiconductor theory, we may as well let Moore's Law churn away in the background while we judge products based on their overall user experience. Wouldn't that make for an easier life? Well, maybe, but whichever way you look at it, it's hard to stop this subject descending into pure philosophy, on a par with other yawnsome puzzles like whether meat-eaters should visit an abattoir at least once, or whether it's better to medicate the ailment or the person. Bearing that in mind, we're going to look at how some key players in the silicon industry treat this topic, and we'll try to deliver some practical, offal-free information in the process.
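
    The "saving both time and energy" argument has a standard back-of-the-envelope form. A CMOS chip's switching power is roughly (the classic first-order approximation, not a figure any foundry publishes per node)

      P_{dyn} \approx \alpha\, C\, V^2\, f

    where alpha is how often transistors actually switch, C is the capacitance being charged and discharged, V is the supply voltage and f is the clock frequency. A smaller process shrinks C and usually allows V to drop as well, and because voltage enters squared, even a modest step to a new node can buy a disproportionate improvement in performance-per-watt -- which is exactly why the nanometer number gets marketed so hard, and why the rest of this piece asks whether you should care anyway.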

  • UCSB engineers proteins that make silicon, leads hipsters to insist on organically-grown computers

    by Jon Fingas, 06.08.2012

    Organic circuits have been in development for a while, but it's still rare that the organics are producing the circuitry themselves. Researchers at the University of California, Santa Barbara plan to change that with genetically engineered proteins that can make silicon dioxide or titanium dioxide structures like those used in the computer chips and solar cells that we hold dear. The trick, the university's Daniel Morse found, is to attach silica-forming DNA to plastic beads that are in turn soaked in the silicon or titanium molecules they're looking for: after some not-so-natural selection for the best genes, the thriving proteins can produce not only substantial minerals, but whole fiber sheets. Much work is left to get the proteins producing the kind of silicon or titanium dioxides that could run a computer or power your house, but the dream is to have synthetic creations that organically produce what would normally need a mining expedition -- imagine something akin to the glass-like Venus' Flower Basket sponge (pictured above) sitting in an Intel factory. We're half-expecting organically-grown smartphones at Whole Foods, right next to the kale chips and fair trade coffee. [Image credit: Ryan Somma, Flickr]

  • Engineer Guy shows how a phone accelerometer works, knows what's up and sideways (video)

    by Jon Fingas, 05.22.2012

    We love finding out how things work, and arguably one of the most important parts of the smartphones and tablets we thrive on is the accelerometer gauging our device's orientation. Imagine our delight, then, when we see the University of Illinois' Bill Hammack (i.e. The Engineer Guy) giving a visual rundown of how accelerometers work. Although it's certainly the Cliff's Notes version of what's going on in your Android phone or iPhone, the video does a great job of explaining the basic concepts behind three-axis motion sensing and goes on to illustrate how MEMS chips boil the idea down to the silicon form that's needed for our mobile hardware. Hammack contends that it's one of the coolest (and unsung) parts of a smartphone, and we'd definitely agree; you can see why in the clip after the break.
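
    For a flavor of what the phone does with those three axes, here's a minimal sketch (our own illustration, not Hammack's code) that turns raw accelerometer readings into pitch and roll for a device held still, when the only acceleration it feels is gravity:

      import math

      def tilt_from_accel(ax, ay, az):
          # ax, ay, az: accelerations in g along the device's x, y, z axes.
          # With the phone at rest, gravity alone reveals the orientation.
          pitch = math.degrees(math.atan2(-ax, math.sqrt(ay**2 + az**2)))
          roll = math.degrees(math.atan2(ay, az))
          return pitch, roll

      print(tilt_from_accel(0.0, 0.0, 1.0))  # flat on a table: roughly (0, 0)
      print(tilt_from_accel(0.0, 1.0, 0.0))  # standing on an edge: roughly (0, 90)

    The MEMS chip itself only reports the three acceleration values; everything else -- screen rotation, step counting, shake gestures -- is math like this running on top.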

  • Scientists use metal and silicon to create invisibility cloak (no, you can't wear it)

    by Sarah Silbert, 05.22.2012

    In the quest to achieve that much-desired invisibility cloak, scientists have redirected light, used heat monitoring and even gone underwater -- with varying degrees of success. The latest attempt at this optical illusion is from engineers at Stanford and the University of Pennsylvania, who have developed a device that can detect light without being seen itself. When the ratio of metal to silicon is just right, the light reflected from the two materials is completely canceled out. The process, called plasmonic cloaking, controls the flow of light to create optical and electronic functions while leaving nothing for the eye to see. Scientists envision this tech being used in cameras -- plasmonic cloaking could reduce blur by minimizing the cross-talk between pixels. Other applications include solar cells, sensors and solid-state lighting -- human usage is conspicuously absent from that list.

  • Samsung pushes graphene one step closer to silicon supremacy

    by James Trew, 05.18.2012

    Graphene has long held notions of grandeur over its current silicon overlord, but a few practical issues have always kept its takeover bid grounded. Samsung, however, thinks it's cracked at least one of those -- graphene's inability to switch off current. Previous attempts to use graphene as a transistor have involved converting it to a semiconductor, but this also reduces its electron mobility, negating much of the benefit. Samsung's Advanced Institute of Technology has created a graphene-silicon "Schottky barrier" that brings graphene this much-needed current-killing ability, without losing its electron-shuffling potential. The research also explored potential logic device applications based on the same technology. So, does this mean we'll finally get our flea-sized super computer implant? Maybe not just yet, but the wheels have certainly been oiled.
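
    The "current-killing" part is ordinary Schottky-diode behavior: current across a metal-semiconductor barrier depends exponentially on the barrier height, so a barrier you can raise or lower effectively switches the channel off and on. In the textbook thermionic-emission form (generic device physics, not Samsung's published numbers),

      I \approx A^{*} T^{2}\, e^{-q\Phi_B / kT}\left(e^{qV / nkT} - 1\right)

    where Phi_B is the barrier height -- and, as we read it, tuning that barrier with a gate is how the graphene-silicon device gets its off state without the mobility-killing step of turning graphene itself into a semiconductor.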

  • AMD reveals Trinity specs, claims to beat Intel on price, multimedia, gaming

    by Sharif Sakr, 05.15.2012

    Itching for the details of AMD's latest Accelerated Processing Units (APUs)? Then get ready to scratch: Trinity has arrived and, as of today, it's ready to start powering the next generation of low-power ultra-portables, laptops and desktops that, erm, don't run Intel. The new architecture boasts up to double the performance-per-watt of last year's immensely popular Llano APUs, with improved "discrete-class" integrated graphics and without adding to the burden on battery life. How is that possible? By how much will Trinity-equipped devices beat Intel on price? And will it play Crysis: Warhead? Read on to find out.