Moore's Law

Latest

  • Moore's Law in action: making our machines ever more micro

    by Jon Turi
    06.13.2015

    Over the past few decades, engineers have leveraged Moore's Law to the fullest, resulting in powerful ultrathin laptops and feature-rich miniature wearables. Back in 1981, a 23-pound Osborne 1 computer was considered portable, with 64KB of onboard memory. Today, smartphones weigh just a few ounces and easily pack 128GB of storage. There's also a vastly more complex architecture of circuits and sensors inside these devices, all at a scale nearly invisible to the eye. It's taken us decades to develop and shrink down these microelectronics to where they are today, which is no small feat. Join us as we ride the ever-shrinking gadget wave from its early days to some of the nanoscopic creations at work today.

  • Acronym-loving Samsung joins Intel and TSMC, buys stake in ASML

    by Daniel Cooper
    08.27.2012

    Samsung's round of cash-flashing continues with a $629 million purchase of a three-percent stake in ASML. It's joining Intel and TSMC in pumping money into the Dutch business, which is developing Extreme Ultraviolet Lithography (EUV) tooling for its chip-making machines, designed to "extend Moore's Law." It'll also help reduce the cost of future silicon, since it'll enable the companies to use wider silicon wafers along the manufacturing line. Given that Samsung's investment caps off a project to raise nearly $5 billion in cash and that ASML's home is just five miles west of PSV Eindhoven's stadium, we just hope they threw in a few home tickets for their trouble.

  • Researchers take nanowire transistors vertical, double up on density

    by Steve Dent
    06.21.2012

    3D silicon is all the rage, and now nanowire transistors have further potential to keep Moore's Law on life support. Researchers at A*STAR have found a way to double the number of transistors on a chip by placing the atomic-scale wires vertically, rather than in the run-of-the-mill planar mode, creating two "wrap-around gates" that put a pair of transistors on a single nanowire. In the future, the tech could be merged with tunnel field effect transistors -- which use dissimilar semiconductor materials -- to create a markedly denser design. That combo would also burn a minuscule percentage of the power required conventionally, according to the scientists, making it useful for low-powered processors, logic boards and non-volatile memory, for starters. So, a certain Intel founder might keep being right after all, at least for a few years more.

  • Engadget Primed: why nanometers matter (and why they often don't)

    by Sharif Sakr
    06.15.2012

    Primed goes in-depth on the technobabble you hear on Engadget every day -- we dig deep into each topic's history and how it benefits our lives. You can follow the series here. Looking to suggest a piece of technology for us to break down? Drop us a line at primed *at* engadget *dawt* com. Welcome to one of the most unnecessarily complicated questions in the world of silicon-controlled gadgets: should a savvy customer care about the underlying nature of the processor in their next purchase? Theoretically at least, the answer is obvious. Whether it's a CPU, graphics card, smartphone or tricorder, it'll always receive the Holy Grail combo of greater performance and reduced power consumption if it's built around a chip with a smaller fabrication process. That's because, as transistors get tinier and more tightly packed, electrons don't have to travel so far when moving between them -- saving both time and energy. In other words, a phone with a 28-nanometer (nm) processor ought to be fundamentally superior to one with a 45nm chip, and a PC running on silicon with features etched at 22nm should deliver more performance-per-watt than a 32nm rival. But if that's true, isn't it equally sensible to focus on the end results? Instead of getting bogged down in semiconductor theory, we may as well let Moore's Law churn away in the background while we judge products based on their overall user experience. Wouldn't that make for an easier life? Well, maybe, but whichever way you look at it, it's hard to stop this subject descending into pure philosophy, on a par with other yawnsome puzzles like whether meat-eaters should visit an abattoir at least once, or whether it's better to medicate the ailment or the person. Bearing that in mind, we're going to look at how some key players in the silicon industry treat this topic, and we'll try to deliver some practical, offal-free information in the process.
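    To put rough numbers on that intuition, here's a quick back-of-envelope sketch (our illustration, assuming ideal scaling -- real-world gains are smaller once design rules, yield and power limits kick in): transistor density rises roughly with the inverse square of the feature size, so a 28nm process should pack about 2.6 times as many transistors into a given area as a 45nm one.

        # Idealized density gain from a process shrink: density scales with
        # the inverse square of the linear feature size.
        def density_gain(old_nm: float, new_nm: float) -> float:
            """How many times more transistors fit in the same area."""
            return (old_nm / new_nm) ** 2

        print(f"45nm -> 28nm: ~{density_gain(45, 28):.1f}x the density")  # ~2.6x
        print(f"32nm -> 22nm: ~{density_gain(32, 22):.1f}x the density")  # ~2.1x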

  • Raspberry Pi hands-on and Eben Upton interview at Maker Faire (video)

    by Myriam Joire
    05.21.2012

    Unless you've been hiding under a rock lately, we're pretty sure you've heard about the Raspberry Pi by now -- a $25 credit card-sized PC that brings ARM/Linux to the Arduino form factor. As a refresher, the system features a 700MHz Broadcom BCM2835 SoC with an ARM11 CPU, a Videocore 4 GPU (which handles HD H.264 video and OpenGL ES 2.0) and 256MB RAM. The board includes an SD card slot, HDMI output, composite video jack, 3.5mm audio socket, micro-USB power connector and GPIO header. Model A ($25) comes with one USB port, while Model B ($35) provides two USB ports and a 100BaseT Ethernet socket. Debian is recommended, but the Raspberry Pi can run most ARM-compatible 32-bit OSes. This past weekend at Maker Faire Bay Area 2012 we ran into Eben Upton, Executive Director of the Raspberry Pi Foundation, and took the opportunity to spend some quality time with a production board and to discuss this incredible PC. We touched upon the origins of the system (inspired by the BBC Micro, one of the ARM founders' projects), Moore's law, the wonders of simple computers and upcoming products / ideas -- including Adafruit's Pi Plate and Raspberry Pi's prototype camera add-on. On the subject of availability, the company expects that "there will be approximately 200,000 units in the field by the end of June". Take a look at our hands-on gallery below and our video interview after the break.
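    To give a flavor of what that GPIO header is good for, here's a minimal sketch of our own (not something from the interview): it blinks an LED wired, with a suitable resistor, to one of the board's GPIO pins, using the RPi.GPIO Python library available on the recommended Debian image. The pin number is an arbitrary assumption.

        import time
        import RPi.GPIO as GPIO

        GPIO.setmode(GPIO.BCM)    # address pins by Broadcom SoC numbering
        GPIO.setup(18, GPIO.OUT)  # assumes an LED is wired to GPIO 18

        try:
            for _ in range(10):   # blink ten times, half a second per state
                GPIO.output(18, GPIO.HIGH)
                time.sleep(0.5)
                GPIO.output(18, GPIO.LOW)
                time.sleep(0.5)
        finally:
            GPIO.cleanup()        # release the pins on exit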

  • Single atom transistors point to the future of quantum computers, death of Moore's law

    by Terrence O'Brien
    02.21.2012

    Transistors -- the basic building block of the complex electronic devices around you. Literally billions of them make up that Core i7 in your gaming rig, and Moore's law says that number will double every 18 months as they get smaller and smaller. Researchers at the University of New South Wales, however, may have found the limit of this basic computational rule by creating the world's first single-atom transistor. A single phosphorus atom was placed into a silicon lattice and read with a pair of extremely tiny silicon leads, allowing the team to observe both its transistor behavior and its quantum state. Presumably this spells the end of the road for Moore's Law, as it would seem all but impossible to shrink transistors any further. But it could also point to a future of miniaturized, solid-state quantum computers.
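    A little back-of-envelope arithmetic shows why a single-atom device reads like an endpoint. Assuming (our numbers, not the researchers') 22nm features today, a silicon atom roughly 0.2nm across, and transistor counts doubling every 18 months -- which shrinks the linear feature size by a factor of the square root of two per doubling -- the atomic floor is only about two decades out.

        import math

        start_nm = 22.0           # assumed current feature size
        atom_nm = 0.2             # rough diameter of a silicon atom
        years_per_doubling = 1.5  # Moore's law as stated above

        # Each doubling of transistor count halves the area per transistor,
        # shrinking the linear feature size by sqrt(2).
        doublings = math.log2((start_nm / atom_nm) ** 2)
        years = doublings * years_per_doubling
        print(f"~{doublings:.1f} doublings (~{years:.0f} years) until "
              "features reach single-atom scale")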

  • IBM builds 9 nanometer carbon nanotube transistor, puts silicon on notice

    by Michael Gorman
    01.28.2012

    It's not the smallest transistor out there, but the boffins at IBM have constructed the tiniest carbon nanotube transistor to date. It's nine nanometers in size, making it one nanometer smaller than the presumed physical limit of silicon transistors. Plus, it consumes less power and is able to carry more current than present-day technology. The researchers accomplished the trick by laying a nanotube on a thin layer of insulation, and using a two-step process -- involving some sort of black magic, no doubt -- to add the electrical gates inside. The catch? (There's always a catch) Manufacturing pure batches of semiconducting nanotubes is difficult, as is aligning them in such a way that the transistors can function. So, it'll be some time before the technology can compete with Intel's 3D silicon, but at least we're one step closer to carbon-based computing.

  • IBM sees stacked silicon sitting in fluid as the way to power future PCs

    by Chris Barylick
    11.17.2011

    Generally, the combination of microchips, electricity and fluids is considered an incredibly bad thing. IBM, however, thinks it can combine those three to make super small and super powerful computers in the future. The idea is to stack hundreds of silicon wafers and utilize dual fluidic networks between them to create 3D processors. In such a setup, one network carries in charged fluid to power the chip, while the second carries away the same fluid after it has picked up heat from the active transistors. Of course, 3D chips are already on the way, and liquid-cooled components are nothing new, but powering a PC with fluids instead of wires has never been done before. Bruno Michel, who's leading Big Blue's research team, has high hopes for the technology, because future processors will need the extra cooling and reduced power consumption it can provide. Michel says he and his colleagues have demonstrated that it's possible to use a liquid to transfer power via a network of fluidic channels, and they plan to build a working prototype chip by 2014. If successful, your smartphone could eventually contain the power of the Watson supercomputer. Chop, chop, fellas, those futuristic fluidic networks aren't going to build themselves.

  • Koomey's law heckles Moore's in the post-PC world

    by Daniel Cooper
    09.15.2011

    Around the same time most years (2007, 2009, 2010), someone heralds the death of Moore's law. This time it's Stanford University's Dr. Jonathan Koomey, who has found that the energy efficiency of computing roughly doubles every year and a half. With the rise of mobile devices, we care less about whether our phones and tablets can outpace a desktop and more about whether a full charge will last the duration of our commute -- reducing the importance of Moore's law. Historically, efficiency has been a secondary concern as manufacturers built ever faster CPUs, but Koomey believes there is enormous room for improvement. Back in 1985, Dr. Richard Feynman calculated that energy efficiency could theoretically improve by a factor of 100 billion over the technology of his day -- since then, we've only managed a factor of around 40,000. Let's just hope quantum computing goes mainstream before next autumn so we can get on with more important things.
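    That "enormous room for improvement" is easy to sanity-check with some rough arithmetic using the figures above (a sketch under the assumption that the historical doubling rate simply continues).

        import math

        ceiling = 100e9           # Feynman's theoretical limit (factor)
        achieved = 40e3           # improvement realized since 1985 (factor)
        years_per_doubling = 1.5  # Koomey's historical rate, roughly

        remaining = math.log2(ceiling / achieved)
        print(f"~{remaining:.1f} doublings of headroom left "
              f"(~{remaining * years_per_doubling:.0f} years at the historical rate)")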

  • Intel plans exascale computing by 2018, wants to make petaflops passé

    by Michael Gorman
    06.20.2011

    Sure, Fujitsu has a right to be proud of its K supercomputer -- performing over 8 petaflops with just under 70,000 Venus CPUs is nothing to sneeze at. Intel isn't giving up its status as the supercomputing CPU king, however, as it plans to bring exascale computing to the world by the end of this decade. Such a machine could do one million trillion calculations per second, and Intel plans to make it happen with its Many Integrated Core Architecture (MIC). The first CPUs designed with MIC, codenamed Knights Corner, are built on a 22nm process that utilizes the company's 3D Tri-Gate transistors, and they pack over 50 cores per chip. These CPUs are designed for parallel processing applications, similar to the NVIDIA GPUs that will be used in a DARPA-funded supercomputer we learned about last year. Here we thought the war between these two was over -- looks like a new one's just getting started. PR's after the break.

  • Intel goes ULV for laptops to combat the oncoming tablet horde

    by Terrence O'Brien
    05.20.2011

    Intel has been talking up its x86-powered smartphones and battery-sipping Atoms for tablets quite a bit recently, but the company hasn't forgotten its roots in traditional PC form factors. At an investor event in San Francisco, CEO Paul Otellini announced a significant change to its line of notebook CPUs -- ultra-low voltage will be the new norm, not just a niche chip for high-end ultraportables. The baseline TDP for future CPUs will be in the 10 to 15 watt range, a huge drop from the 35W design of the mainstream Core line and lower than even current-gen ULV chips (which bottom out at 17W). The company also plans to make NVIDIA eat its words by putting the pedal to the metal on die shrinks -- releasing a 22nm Atom next year followed by a 14nm version in 2013. That could mean our fantasy of true all-day battery life in a sleek and sexy laptop will finally come true. Don't crush our dreams, Intel!

  • Intel will mass produce 3D transistors for all future CPUs, starting with 22nm Ivy Bridge (video)

    by Sean Hollister
    05.04.2011

    Looks like 3D isn't just a fad, folks, so long as we're talking about silicon -- Intel just announced that it has invented a 3D "Tri-Gate" transistor that will allow the company to keep shrinking chips, Moore's Law naysayers be darned. Intel says the transistors will use 50 percent less power, conduct more current and provide 37 percent more speed than their 2D counterparts thanks to vertical fins of silicon substrate that stick up through the other layers, and that those fancy fins could make for cheaper chips too -- currently, though, the tri-gate tech adds an estimated 2 to 3 percent cost to existing silicon wafers. Intel says we'll see the new technology first in its 22nm Ivy Bridge CPUs, going into mass production in the second half of the year, and it's planning 14nm chips in 2013 and 10nm chips in 2015. Also, 3D transistors won't be limited to the cutting edge -- Intel reps told journalists that they "will extend across the entire range of our product line," including mobile devices. Three videos and a press release await you after the break. Chris Trout contributed to this report.

  • Today marks 50th anniversary of first silicon integrated circuit patent (and the entire computing industry)

    by Zach Honig
    04.25.2011

    There's little question that the last 50 years have represented the most innovative half-century in human history, and today marks the anniversary of the invention that started it all: the silicon-based integrated circuit. Robert Noyce received the landmark US patent on April 25, 1961, going on to found Intel Corporation with Gordon E. Moore (of Moore's Law fame) in 1968. He wasn't the first to invent the integrated circuit -- Jack Kilby, who later invented the pocket calculator, had filed for a patent on a similar technology built on a germanium wafer at Texas Instruments a few months before Noyce filed his. Noyce's silicon version stuck, however, and is responsible for Moore's estimated $3.7 billion net worth, not to mention the success of the entire computing industry. Holding 16 other patents and credited as a mentor of Steve Jobs, Noyce was awarded the National Medal of Technology in 1987, and continued to shape the computing industry until his death in 1990. If Moore's Law continues to hold true, as we anticipate it will, we expect the next 50 years to be even more exciting than the last. Let's meet back here in 2061.

  • What 10 years of Apple did to its main product

    by Matt Tinsley
    09.24.2010

    How time flies! In the year 2000, I was just finishing high school, listening to Bush, and becoming acquainted with Windows 2000. Back then, I knew very little about Apple, and I'd certainly not heard of the Bondi Blue iMac (the first iMac was released in 1998). In 2010, well... how things have changed for me! And, as illustrated by Brett Jordan in the graphic above, things have also changed at Apple. It's incredible to think that the iPhone has taken center stage at Apple over the last three years. As noted by some of our commenters, there has been a real lack of Mac-centric news recently. Sure, there was the update to the iMac a few months ago, but it's glaringly obvious that the Mac has taken a back seat to the iPhone -- certainly in the limelight department. In fact, I'm reveling in the fact that I'm writing about the iPhone and the iMac at the same time! Today, the Mac is the center of our digital hub, but it's no longer the center of our digital world. When we leave the house / office / room where the Mac lives, it's the iPhone (or iPad / iPod touch) that is constantly in our hands, and Apple knows it! Of course, we have to come back to our Macs eventually (in my case, repeatedly, every day) because the iPhone can't do everything that we want it to, or even do some of the things that we want done well, yet. But just looking at this picture shows how far things have come, and how the direction taken by personal computing is becoming even more personal. The only feature of the iPhone 4 that doesn't beat the iMac of yesteryear is screen real estate. The processor and RAM have double the capacity of the iMac's, the iPhone's storage is 2 gigabytes larger, and it's flash-based to boot. And of course, it's tiny in comparison. As noted by Obama Pacman, it's Moore's law in effect. But when will it end? In 10 years' time, will we have an iPhone that's five times smaller than the current one, but more powerful than the personal computers of today? Who knows? That might be a weird phone, but anything could happen. For now, I'm still stuck with my iPhone 3G, and I think it might still have some Bush on it. In the meantime, I'm just looking forward to getting the iPhone 4!

  • Entelligence: when less beats Moore

    by Michael Gartenberg
    08.27.2010

    Entelligence is a column by technology strategist and author Michael Gartenberg, a man whose desire for a delicious cup of coffee and a quality New York bagel is dwarfed only by his passion for tech. In these articles, he'll explore where our industry is and where it's going -- on both micro and macro levels -- with the unique wit and insight only he can provide. We are all familiar with Moore's law: the observation made by Intel co-founder Gordon Moore that the density of transistors on a chip doubles roughly every eighteen months. The net result? It's always going to be better, faster and cheaper. Certainly that's been true of the phone space, with large screens, fast processors and lots of storage. In the last few weeks alone I've looked at new phones with 1GHz processors, the latest and greatest software platforms from Google and RIM... but it's been one little gadget that's caught my attention, and it totally bucks the trend. What device? It's the Sony Ericsson Xperia X10 Mini Pro -- which is a lot of name for a small phone -- and it shows some very different thinking about what a smartphone is. In theory, this isn't a phone that I should like. Instead of a large 4.3-inch screen, it's running a 2.55-inch screen at 240 x 320 resolution. Don't look for a 1GHz processor here. It's got an ARMv6 revision 5 processor at 600MHz. Finally, forget Froyo or even Eclair. This thing's got Android 1.6 on it and may never get updated to the latest and greatest. Despite all that, I think Sony Ericsson has a potential hit on their hands if they decide to bring this to the US later this year as they said they plan to. Why am I so enamored?

  • NVIDIA VP says 'Moore's law is dead'

    by Sean Hollister
    05.03.2010

    NVIDIA and Intel haven't been shy about their differing visions of the future of computing in the past year or so, but it looks like Team GPU just upped the rhetoric a little -- a Forbes column by NVIDIA VP Bill Dally argues that "Moore's law is dead." Given that Moore's law is arguably the foundation of Intel's entire business, such a statement is a huge shot across the bow; though other companies like AMD are guided by the doctrine, Intel's relentless pursuit of Gordon Moore's vision has become a focal point and rallying cry for the world's largest chipmaker. So what's Dally's solution to the death of Moore's law? For everyone to buy into parallel computing, where -- surprise, surprise -- NVIDIA's GPUs thrive. Dally says that dual-, quad- and hex-core solutions are inefficient -- he likens multi-core chips to "trying to build an airplane by putting wings on a train" -- and says that only ground-up parallel solutions designed for energy efficiency will bring back the golden age of doubling performance every two years. That sounds fantastic, but as far as power consumption is concerned, well, perhaps NVIDIA had best lead by example.

  • Defective graphene sheets look poised to succeed silicon

    by Tim Stevens
    04.02.2010

    As circuitry gets smaller and approaches the effective limit of silicon's computing power, and Moore's Law begins to look like it has an expiration date, we get closer and closer to needing an alternative. Graphene is widely held to be the answer: sheets of carbon a single atom thick that could be stacked and composited to create processors. Two professors at the University of South Florida, Matthias Batzill and Ivan Oleynik, have found a new way to turn those sheets into circuits by creating nanoscale defects. These strips of broken atomic rings wind up having metallic properties, making them act like microscopic wires. IBM is already teasing us with the possibilities of graphene and now, with a more practical way to make graphene-based electronics, we'd say Moore's Law still has at least another couple of decades left. [Photo credit: Y. Lin]

  • Physicists calculate the end of Moore's Law, clearly don't believe in Moore's Law

    by Darren Murph
    10.20.2009

    If you're looking for pundits with an end date for Moore's Law, you don't have to look far. You also don't have to look far to find a gaggle of loonies who just knew the world was ending in Y2K, so make of that what you will. The latest duo looking to call the demise of the processor mantra that has held true for two score years comes from Boston University, with physicists Lev Levitin and Tommaso Toffoli asserting that a quantum limit will be reached in around 75 to 80 years. Scott Aaronson, an attention-getter at MIT, expects that very same limit to be hit in just 20 years. Of course, there's plenty of technobabble to explain the whats and hows behind all this, but considering that the brainiacs of the world can't even agree on Gordon Moore's own doomsday date, we're choosing to plug our ears and keep on believin' for now. Bonus video after the break. [Via Slashdot]

  • A Decade of Divination...

    by Tim Dale
    08.01.2009

    My first writings here at Massively were a look back at the last ten years of MMO gaming, much of which I'd taken some small part in, and a comparison of how early MMOs had been then against how they seem to have shaped up today. I expect if I were going to grow out of these things it would have already happened by now, so I am fully expecting to be playing an MMO of some description in 2019. Much of the year 2019 is already known to us, and detailed extensively in the documentaries 'Blade Runner', 'The Running Man' and 'Akira', but what will MMOs be like a decade from now? Join me as I charge up the flux capacitors, spin the big brass and crystal whirley thing with no obvious purpose and hop in my little blue box in a bid to divine... the future!

  • IBM claims title of world's fastest graphene transistor

    by Donald Melanson
    12.19.2008

    As we've seen, plenty of researchers and companies are betting on graphene as the big thing that will revolutionize transistors and, hence, all manner of electronics, and it looks like IBM is now claiming one of the biggest breakthroughs to date, not to mention the desirable title of "world's fastest graphene transistor." More specifically, IBM researchers have apparently been the first to demonstrate the operation of graphene field-effect transistors at gigahertz frequencies and, perhaps even more importantly, they've also established the scaling behavior of the graphene transistors, which they say could eventually lead to the development of terahertz graphene transistors -- or, in other words, keep Moore's Law around for quite a bit longer than many expected.