Moore's Law

Latest

  • Koomey's law heckles Moore's in the post-PC world

    by Daniel Cooper
    09.15.2011

    Around the same time most years (2007, 2009, 2010), someone heralds the death of Moore's law. This time it's Stanford University's Dr. Jonathan Koomey, who has found that energy efficiency in computing roughly doubles every two years. With the rise of mobile devices, we care less about whether our phones and tablets can outpace a desktop and more about whether a full charge will last the duration of our commute -- reducing the importance of Moore's law. Historically, efficiency has been a secondary concern as manufacturers built ever faster CPUs, but Koomey believes there is enormous room for improvement. In 1985, Dr. Richard Feynman calculated an efficiency upper limit of a factor of 100 billion -- since then we've only managed a factor of 40,000. Let's just hope quantum computing goes mainstream before next autumn so we can get on with more important things.
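
    If the roughly two-year doubling and the figures above hold, a quick back-of-the-envelope Python sketch (our own arithmetic, not anything from Koomey's paper) shows how much runway Feynman's ceiling leaves:

      import math

      # Assumed figures, taken from the post above.
      feynman_limit = 100e9    # theoretical efficiency ceiling: a factor of 100 billion
      achieved = 40e3          # what has been achieved so far: a factor of 40,000
      doubling_years = 2.0     # Koomey's observed doubling period

      headroom = feynman_limit / achieved            # improvement still available (~2.5 million x)
      doublings_left = math.log2(headroom)           # ~21 more doublings
      years_left = doublings_left * doubling_years   # ~43 years at the current pace

      print(f"Headroom: {headroom:,.0f}x, doublings left: {doublings_left:.1f}, years: {years_left:.0f}")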

  • Intel plans exascale computing by 2018, wants to make petaflops passé

    by Michael Gorman
    06.20.2011

    Sure, Fujitsu has a right to be proud of its K supercomputer -- performing over 8 petaflops with just under 70,000 Venus CPUs is nothing to sneeze at. Intel isn't giving up its status as the supercomputing CPU king, however, as it plans to bring exascale computing to the world by the end of this decade. Such a machine could do one million trillion calculations per second, and Intel plans to make it happen with its Many Integrated Core Architecture (MIC). The first CPUs designed with MIC, codenamed Knights Corner, are built on a 22nm process that utilizes the company's 3D Tri-Gate transistors and pack over 50 cores per chip. These CPUs are designed for parallel processing applications, similar to the NVIDIA GPUs that will be used in a DARPA-funded supercomputer we learned about last year. Here we thought the war between these two was over -- looks like a new one's just getting started. PR's after the break.
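
    For a sense of scale, here's a quick Python sketch (simple arithmetic on the figures quoted above, nothing more):

      k_computer = 8e15   # K supercomputer: just over 8 petaflops
      exascale = 1e18     # exascale: one million trillion calculations per second

      print(f"Exascale is roughly {exascale / k_computer:.0f}x the K computer's output.")  # ~125x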

  • Intel goes ULV for laptops to combat the oncoming tablet horde

    by Terrence O'Brien
    05.20.2011

    Intel has been talking up its x86-powered smartphones and battery-sipping Atoms for tablets quite a bit recently, but the company hasn't forgotten its roots in traditional PC form-factors. At an investor event in San Francisco, CEO Paul Otellini announced a significant change to its line of notebook CPUs -- ultra-low voltage will be the new norm, not just a niche part for high-end ultra-portables. The baseline TDP for future CPUs will be in the 10 to 15 watt range, a huge drop from the 35W design of the mainstream Core line and lower than even current-gen ULV chips (which bottom out at 17W). The company also plans to make NVIDIA eat its words by putting the pedal to the metal on die shrinks -- releasing a 22nm Atom next year followed by a 14nm version in 2013. That could mean our fantasy of true all-day battery life in a sleek and sexy laptop will finally come true. Don't crush our dreams, Intel!

  • Intel will mass produce 3D transistors for all future CPUs, starting with 22nm Ivy Bridge (video)

    by Sean Hollister
    05.04.2011

    Looks like 3D isn't just a fad, folks, so long as we're talking about silicon -- Intel just announced that it has invented a 3D "Tri-Gate" transistor that will allow the company to keep shrinking chips, Moore's Law naysayers be darned. Intel says the transistors will use 50 percent less power, conduct more current and provide 37 percent more speed than their 2D counterparts thanks to vertical fins of silicon substrate that stick up through the other layers, and that those fancy fins could make for cheaper chips too -- currently, though, the tri-gate tech adds an estimated 2 to 3 percent cost to existing silicon wafers. Intel says we'll see the new technology first in its 22nm Ivy Bridge CPUs, going into mass production in the second half of the year, and it's planning 14nm chips in 2013 and 10nm chips in 2015. Also, 3D transistors won't be limited to the cutting edge -- Intel reps told journalists that they "will extend across the entire range of our product line," including mobile devices. Three videos and a press release await you after the break. Chris Trout contributed to this report.

  • Today marks 50th anniversary of first silicon integrated circuit patent (and the entire computing industry)

    by Zach Honig
    04.25.2011

    There's little question that the last 50 years have represented the most innovative half-century in human history, and today marks the anniversary of the invention that started it all: the silicon-based integrated circuit. Robert Noyce received the landmark US patent on April 25, 1961, going on to found Intel Corporation with Gordon E. Moore (of Moore's Law fame) in 1968. He wasn't the first to invent the integrated circuit -- Jack Kilby, who would later co-invent the pocket calculator, had filed for a patent on a similar technology built on a germanium wafer for Texas Instruments a few months earlier. Noyce's silicon version stuck, however, and is responsible for Moore's estimated $3.7 billion net worth, not to mention the success of the entire computing industry. Holding 16 other patents and credited as a mentor of Steve Jobs, Noyce was awarded the National Medal of Technology in 1987, and continued to shape the computing industry until his death in 1990. If Moore's Law continues to hold true, as we anticipate it will, we expect the next 50 years to be even more exciting than the last. Let's meet back here in 2061.

  • What 10 years of Apple did to its main product

    by Matt Tinsley
    09.24.2010

    How time flies! In the year 2000, I was just finishing high school, listening to Bush, and becoming acquainted with Windows 2000. Back then, I knew very little about Apple, and I'd certainly not heard of the Bondi Bubble iMac (the first iMac was released in 1998). In 2010, well... how things have changed for me! And, as illustrated by Brett Jordan in the graphic above, things have also changed at Apple. It's incredible to think that the iPhone has taken center stage at Apple over the last three years. As noted by some of our commenters, there has been a real lack of Mac-centric news recently. Sure, there was the update to the iMac a few months ago, but it's glaringly obvious that the Mac has taken a back seat to the iPhone -- certainly in the limelight department. In fact, I'm reveling in the fact that I'm writing about the iPhone and the iMac at the same time! Today, the Mac is the center of our digital hub, but it's no longer the center of our digital world. When we leave the house / office / room where the Mac lives, it's the iPhone (or iPad / iPod touch) that is constantly in our hands, and Apple knows it! Of course, we have to come back to our Macs eventually (in my case, repeatedly, every day) because the iPhone can't yet do everything we want it to, or do some of the things we want done well. But just looking at this picture shows how far things have come, and how personal computing is becoming even more personal. The only feature of the iPhone 4 that doesn't beat the iMac of yesteryear is screen real estate. The iPhone 4's processor and RAM are double the iMac's, its storage is 2 gigabytes larger, and that storage is flash-based. And of course, it's tiny in comparison. As noted by Obama Pacman, it's Moore's law in effect. But when will it end? In 10 years' time, will we have an iPhone a fifth the size of the current one, but more powerful than the personal computers of today? Who knows? That might be a weird phone, but anything could happen. For now, I'm still stuck with my iPhone 3G, and I think it might still have some Bush on it. In the meantime, I'm just looking forward to getting the iPhone 4!

  • Entelligence: when less beats Moore

    by Michael Gartenberg
    08.27.2010

    Entelligence is a column by technology strategist and author Michael Gartenberg, a man whose desire for a delicious cup of coffee and a quality New York bagel is dwarfed only by his passion for tech. In these articles, he'll explore where our industry is and where it's going -- on both micro and macro levels -- with the unique wit and insight only he can provide. We are all familiar with Moore's law: the observation made by Intel co-founder Gordon Moore that the density of transistors on a chip doubles roughly every eighteen months. The net result? It's always going to be better, faster and cheaper. Certainly that's been true of the phone space, with large screens, fast processors and lots of storage. In the last few weeks alone I've looked at new phones with 1GHz processors and the latest and greatest software platforms from Google and RIM... but it's been one little gadget that's caught my attention, and it totally bucks the trend. What device? It's the Sony Ericsson Xperia X10 Mini Pro -- which is a lot of name for a small phone -- and it shows some very different thinking about what a smartphone is. In theory, this isn't a phone that I should like. Instead of a large 4.3-inch screen, it's running a 2.55-inch screen at 240 x 320 resolution. Don't look for a 1GHz processor here. It's got an ARMv6 revision 5 processor at 600MHz. Finally, forget Froyo or even Eclair. This thing's got Android 1.6 on it and may never get updated to the latest and greatest. Despite all that, I think Sony Ericsson has a potential hit on its hands if it decides to bring this to the US later this year, as it says it plans to. Why am I so enamored?
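
    To put that doubling cadence in concrete terms, here's a tiny Python sketch (purely illustrative compounding, not figures from the column):

      doubling_period_years = 1.5   # the roughly-eighteen-month cadence cited above
      years = 10
      growth = 2 ** (years / doubling_period_years)
      print(f"After {years} years: ~{growth:.0f}x the transistor density")   # roughly 100x in a decade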

  • NVIDIA VP says 'Moore's law is dead'

    by Sean Hollister
    05.03.2010

    NVIDIA and Intel haven't been shy about their differing visions of the future of computing over the past year or so, but it looks like Team GPU just upped the rhetoric a little -- a Forbes column by NVIDIA VP Bill Dally argues that "Moore's law is dead." Given that Moore's law is arguably the foundation of Intel's entire business, such a statement is a huge shot across the bow; though other companies like AMD are guided by the doctrine, Intel's relentless pursuit of Gordon Moore's vision has become a focal point and rallying cry for the world's largest chipmaker. So what's Dally's solution to the death of Moore's law? For everyone to buy into parallel computing, where -- surprise, surprise -- NVIDIA's GPUs thrive. Dally says that dual-, quad- and hex-core solutions are inefficient -- he likens multi-core chips to "trying to build an airplane by putting wings on a train," and says that only ground-up parallel solutions designed for energy efficiency will bring back the golden age of doubling performance every two years. That sounds fantastic, but as far as power consumption is concerned, well, perhaps NVIDIA had best lead by example.

  • Defective graphene sheets look poised to succeed silicon

    by Tim Stevens
    04.02.2010

    As circuitry gets smaller and approaches the effective limits of silicon, and Moore's Law begins to look like it has an expiration date, we get closer and closer to needing an alternative. Graphene is held to be the answer: sheets of carbon a single atom thick that could be stacked and composited to create processors. Two professors at the University of South Florida, Matthias Batzill and Ivan Oleynik, have found a new way to turn those sheets into circuits by creating nanoscale defects. These strips of broken atomic rings wind up having metallic properties, making them act like microscopic wires. IBM is already teasing us with the possibilities of graphene, and now, with a more practical way to make graphene-based electronics, we'd say Moore's Law still has at least another couple of decades left. [Photo credit: Y. Lin]

  • Physicists calculate the end of Moore's Law, clearly don't believe in Moore's Law

    by Darren Murph
    10.20.2009

    If you're looking for pundits with an end date for Moore's Law, you don't have to look far. You also don't have to look far to find a gaggle of loonies who just knew the world was ending in Y2K, so make of that what you will. The latest duo looking to call the demise of the processor mantra that has held true for two score comes from Boston University, with physicists Lev Levitin and Tommaso Toffoli asserting that a quantum limit would be reached in around 75 to 80 years. Scott Aaronson, an attention-getter at MIT, expects that very same limit to be hit in just 20 years. Of course, there's plenty of technobabble to explain the whats and hows behind all this, but considering that the brainiacs of the world can't even agree with Gordon Moore's own doomsday date, we're choosing to plug our ears and keep on believin' for now. Bonus video after the break. [Via Slashdot]

  • A Decade of Divination...

    by Tim Dale
    08.01.2009

    My first writings here at Massively were a look back at the last ten years of MMO gaming, much of which I'd taken some small part in, and a comparison of how early MMOs were then against how they seem to have shaped up today. I expect that if I were going to grow out of these things it would have happened by now, so I'm fully expecting to be playing an MMO of some description in 2019. Much of the year 2019 is already known to us, and detailed extensively in the documentaries 'Blade Runner', 'The Running Man' and 'Akira', but what will MMOs be like a decade from now? Join me as I charge up the flux capacitors, spin the big brass and crystal whirley thing with no obvious purpose and hop in my little blue box in a bid to divine... the future!

  • IBM claims title of world's fastest graphene transistor

    by Donald Melanson
    12.19.2008

    As we've seen, plenty of researchers and companies are betting on graphene as the big thing that will revolutionize transistors and, hence, all manner of electronics, and it looks like IBM is now claiming one of the biggest breakthroughs to date, not to mention the desirable title of "world's fastest graphene transistor." More specifically, IBM researchers have apparently been the first to demonstrate the operation of graphene field-effect transistors at gigahertz frequencies and, perhaps even more importantly, they've also established the scaling behavior of the graphene transistors, which they say could eventually lead to the development of terahertz graphene transistors -- or, in other words, keep Moore's Law around for quite a bit longer than many expected.

  • Researchers say new state of matter could extend Moore's Law

    by Donald Melanson
    10.22.2008

    There's certainly been no shortage of folks trying to pin down an end date for Moore's Law, but there's also thankfully plenty of researchers doing their best to keep it going, and a team of physicists from McGill University in Montreal now say they've made a discovery that could keep the law alive even further into the future. Their big breakthrough is a new state of matter known as a quasi-three-dimensional electron crystal, which they discovered in a semiconductor material by using a device cooled to temperatures "roughly 100 times colder than intergalactic space" and then exposing the material to the "most powerful continuous magnetic fields generated on Earth." Unlike two-dimensional electron crystals, which lead researcher Dr. Guillaume Gervais equates to a ham sandwich, the quasi-three-dimensional electron crystals exist in an "in-between state" between 2D and 3D, which could potentially allow transistors to keep improving even as they run up against fundamental physical limits. [Via InformationWeek, image courtesy University of Cambridge]

  • Microchip breakthrough could keep Moore's law intact (again)

    by Darren Murph
    07.11.2008

    We're pretty certain we'll be hearing this same story each year, every year for the rest of eternity, but hey, it's not like we're kvetching over that or anything. Once again, we're hearing that mad scientists have developed a breakthrough that makes Mr. Moore look remarkably bright, as a new approach to chip making could carve features in silicon chips "that are many times smaller than the wavelength of the light used to make them." Reportedly, the new method "produces grids of parallel lines just 25-nanometers wide using light with a wavelength of 351-nanometers," although the grids aren't functional circuits just yet. If you're interested in more technobabble on the matter, head on down to the read link, but we'd recommend against it if you're easily frightened by terms like "photolithographic" and "nanotechnology."