computer science

Latest

  • Amazon funds STEM programs in Seattle schools

    by Kris Holt
    06.25.2019

    Perhaps with an eye on the next generation of engineers who might be interested in working on its delivery robots or in coding, Amazon is funding computer science and robotics programs at up to 30 public schools in its Seattle home base. Starting this fall, the Future Engineer Robotics grants will provide schools with expanded access to computer science learning and a private tour of an Amazon robotics fulfillment center. The schools will also get support to set up FIRST robotics teams, including robotics professional development for teachers.

  • UK government to launch a revamped Computer Science GCSE in 2016

    by Nick Summers
    12.09.2014

    After realising that lessons on Microsoft Office aren't particularly useful for schoolchildren, the UK government has started doubling down on coding. Having launched a new computing curriculum in September, Prime Minister David Cameron has promised to introduce a new Computer Science GCSE by 2016, focused on writing code, designing applications and exploring some of the ethical and legal issues that surround new technology.

  • NPR: '80s ads are responsible for the lack of women coders

    by Daniel Cooper
    10.21.2014

    Back in the day, computer science was as legitimate a career path for women as medicine, law or science. But in 1984, the percentage of women majoring in computing-related subjects began to fall, and is now as low as 20 percent compared to those other three fields. It's a surprising trend that NPR's Planet Money has uncovered, and the show's latest episode seeks to answer a simple question: why? According to the show's experts, computers were advertised as a "boy's toy," and combined with early '80s geek culture staples like the book Hackers, as well as movies like WarGames and Weird Science, the knock-on effect was to exclude women. It wasn't long before those female computer science majors decided to switch to programs where they weren't made to feel inferior, and while there are now signs of recovery, you have to wonder if those same decisions aren't the cause of the current toxic environment for women in technology. If you'd like to hear the show, we've got it embedded after the break. [Image Credit: Quoctrung Bui/NPR]

  • What happened to all of the women coders in 1984

    by Jessica Conditt
    10.20.2014

    In 1984, women stopped pursuing Computer Science majors at American universities. From 1970 onward, women had made up an increasing percentage of Computer Science majors, but something happened in 1984 and that number began to fall drastically, an occurrence at odds with other tech fields. The trend has continued into the 2000s, and today women make up roughly 20 percent of Computer Science majors, as opposed to the 1984 high of about 37 percent. NPR's Planet Money team of Caitlin Kenney and Steve Henn dove into the data to uncover what went down in the mid-'80s to drive women out of the field. "There was no grand conspiracy in computer science that we uncovered," Henn said. "No big decision by computer science programs to put a quota on women. There was no sign on a door that said, 'Girls, keep out.' But something strange was going on in this field."

  • Valve's Newell promotes 'Hour of Code' learning campaign, EA gives games to participants

    by Danny Cowan
    12.11.2013

    Non-profit computer science advocacy group CODE.org promoted its ongoing "Hour of Code" campaign by giving students an audience with Gabe Newell, managing director of Half-Life creator Valve Corporation. The full half-hour talk is archived for public viewing in the video above. "Hour of Code" challenges students to spend an hour learning the basics of software coding, with CODE.org offering several online tutorials to help jumpstart a career in computer science. Over three million students have completed the organization's "Write your own computer program" tutorial, which is available in 20 languages and features characters from Rovio's Angry Birds and PopCap's Plants vs. Zombies throughout its lessons on the fundamentals of coding. Electronic Arts announced this week that elementary and middle school students who complete CODE.org's 20-hour training course are eligible to receive a free PC game via Origin. Available games include Bejeweled 3, FIFA Soccer 13, SimCity 4 Deluxe Edition and Plants vs. Zombies.

  • Programming is FUNdamental: A closer look at Code.org's star-studded computer science campaign

    by Brian Heater
    07.04.2013

    "All these people who've made it big have their own variation of the same story, where they felt lucky to be exposed to computer programming at the right age, and it bloomed into something that changed their life," explains the organization's co-founder, Ali Partovi, seated in the conference room of one of the many successful startups he's helped along the way. The Iranian-born serial entrepreneur has played a role in an impressive list of companies, including the likes of Indiegogo, Zappos and Dropbox. Along with his twin brother, Hadi, he also co-founded music-sharing service iLike. Unlike past offerings from the brothers, Code.org is a decidedly non-commercial entity, one aimed at making computer science and programming every bit as essential to early education as science or math. For the moment, the organization is assessing just how to go about changing the world. The site currently offers a number of resources for bootstrappers looking to get started in the world of coding. There are simple modules from Scratch, Codecademy, Khan Academy and others, which can help users tap into the buzz of coding their first rectangle, along with links to apps and online tutorials. The organization is also working to build a comprehensive database of schools offering computer science courses and soliciting coders interested in teaching.

  • Blast from the GUI past: 50 years after Ivan Sutherland's Sketchpad debuted

    by Erica Sadun
    04.10.2013

    Fifty years ago, in 1963, Ivan Sutherland first demonstrated Sketchpad, one of the most important contributions to the field of Computer Science. Long before Apple, the Lisa and Xerox's Alto, a constraint-based, object-oriented graphical system was developed and demonstrated. Today's video was pointed out by Charles Choi over on his Notes from /dev/null blog. He writes, "Sometimes you're told something that happened some time ago. You stash that date in the back of your mind only to recall it much later in life, surprised and chagrined at the time that's passed since you last thought of it... Today I recalled Ivan Sutherland's Sketchpad, arguably the most significant Computer Science Ph.D. thesis ever. I had the fortune in the mid-'90s to watch a rare videocassette recording of Alan Kay describing Sketchpad for a computer graphics course taught by Randy Pausch. Fast-forward to today and the video is only a YouTube search away." Looking at that video, it's just amazing to think of that kind of tech in 1963.

  • Remembering Alan Turing at 100

    by Brian Heater
    06.22.2012

    Alan Turing would have turned 100 this week, an event that would have, no doubt, been greeted with all manner of pomp -- the centennial of a man whose mid-century concepts would set the stage for modern computing. Turing, of course, never made it that far, found dead at age 41 from cyanide poisoning, possibly self-inflicted. His story is that of a brilliant mind cut down in its prime for sad and ultimately baffling reasons, a man who accomplished so much in a short time and almost certainly would have had far more to give, if not for a society that couldn't accept him for who he was. The London-born computing pioneer's name is probably most immediately recognized in the form of the Turing Machine, the "automatic machine" he discussed in a 1936 paper and formally extrapolated over the years. The concept would help lay the foundation for future computer science, arguing that a simple machine, given enough tape (or, perhaps more appropriately in the modern sense, storage) could be used to solve complex equations. All that was needed, as Turing laid it out, was a writing method, a way of manipulating what's written and a really long ream to write on. To take on greater complexity, only the storage, not the machine, needs upgrading.
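
    The tape-plus-rules idea is compact enough to sketch in a few lines of Python. The toy machine below is a hypothetical unary incrementer -- its states and rule table are invented for illustration, not drawn from Turing's 1936 paper -- but it shows all three ingredients: a tape, a head that reads and writes, and a finite set of rules.

    ```python
    # A minimal Turing machine sketch: a finite rule table, a read/write head
    # and an unbounded tape. This toy machine appends a '1' to a unary number;
    # the states and rules here are illustrative only.
    from collections import defaultdict

    # RULES[(state, symbol)] = (symbol_to_write, head_movement, next_state)
    RULES = {
        ("scan", "1"): ("1", +1, "scan"),   # skip over the existing 1s
        ("scan", "_"): ("1", +1, "halt"),   # write one more 1, then stop
    }

    def run(tape_str):
        tape = defaultdict(lambda: "_", enumerate(tape_str))
        head, state = 0, "scan"
        while state != "halt":
            write, move, state = RULES[(state, tape[head])]
            tape[head] = write
            head += move
        return "".join(tape[i] for i in range(min(tape), max(tape) + 1))

    print(run("111"))  # prints '1111': unary 3 incremented to 4
    ```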

  • Researchers get CPUs and GPUs talking, boost PC performance by 20 percent

    by James Trew
    02.08.2012

    How do you fancy a 20 percent boost to your processor's performance? Research from North Carolina State University claims to offer just that. Despite the emergence of fused-architecture SoCs, CPU and GPU cores typically still work independently. The researchers hypothesized that assigning tasks based on each processor's strengths would improve efficiency. Since the CPU and GPU can fetch data at comparable speeds, they set the GPUs to execute the computational functions while the CPUs did the prefetching. With data ready in advance, the graphics processor has more resources free, yielding an average performance boost of 21.4 percent, though it's unclear what metrics the researchers were using. Incidentally, the research was funded by AMD, so no prizes for guessing which chips we might see using the technique first.
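
    As a software analogy (the actual work happens in hardware on fused CPU/GPU chips), the division of labor resembles a producer-consumer pipeline: one worker stages the next chunk of data while another crunches the current one. The Python sketch below only mimics that overlap; none of it comes from the NC State paper.

    ```python
    # Toy pipeline: a "prefetcher" stages the next chunk while "compute"
    # works on the current one, so fetching and crunching overlap in time.
    from queue import Queue
    from threading import Thread

    def prefetcher(chunks, staged: Queue):
        for chunk in chunks:
            staged.put([x * 2 for x in chunk])  # stand-in for a memory fetch
        staged.put(None)                        # signal end of stream

    def compute(staged: Queue):
        total = 0
        while (chunk := staged.get()) is not None:
            total += sum(x * x for x in chunk)  # stand-in for GPU kernel work
        return total

    staged = Queue(maxsize=2)                   # small buffer of prefetched work
    chunks = [range(i, i + 4) for i in range(0, 16, 4)]
    Thread(target=prefetcher, args=(chunks, staged), daemon=True).start()
    print(compute(staged))                      # computation overlaps with staging
    ```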

  • MESM Soviet computer project marks 60 years

    by Brian Heater
    12.26.2011

    Before you go complaining about your job, take a moment to remember the MESM project, which just marked the 60th anniversary of its formal recognition by the Soviet Academy of Sciences. The project, headed by Institute of Electrical Engineering director Sergey Lebedev, was born in a laboratory built from scratch amongst the post-World War II ruins of the Ukrainian capital, Kyiv, by a team of 20 people, many of whom took up residence above the lab. Work on MESM -- from the Russian for Small Electronic Calculating Machine -- began toward the end of 1948. By November 1950, the computer was running its first program, and the following year it was up and running full-time. The machine has since come to be considered the first fully operational electronic computer in continental Europe, according to a Google retrospective. Check out a video interview with a MESM team member, after the break -- and make sure you click on that handy caption button for some English subtitles.

  • UK gov't recognizes computer science education is important

    by Alexander Sliwinski
    11.29.2011

    Eidos "life president" Ian Livingstone's Livingstone-Hope Skills Review has been positively received by the UK government. The paper recommends that the UK's Information and Communications Technology national curriculum be replaced by computer science. "The Government looks forward to working with [the games industry], educators and others to develop an attractive computer science offering for schools, so that students are able to develop the rigorous skills needed -- not only to support these industries but also to ensure a digitally literate citizenry," read the government's lengthy response. "I hope common sense and the national interest will prevail," Livingstone told GI.biz, recognizing that it would "take a number of years" before any actual reforms occurred. And once the UK education system teaches students helpful game industry skills, they'll be off to sunny Canada for jobs, where the games industry isn't slowly crumbling around them. On the bright side, things could be worse ... like in Australia.

  • New computer system can read your emotions, will probably be annoying about it (video)

    by Amar Toor
    11.22.2011

    It's bad enough listening to your therapist drone on about the hatred you harbor toward your father. Pretty soon, you may have to put up with a hyper-insightful computer, as well. That's the prospect raised by researchers from the Universidad Carlos III de Madrid, who have begun developing a new system capable of reading human emotions. As explained in their study, published in the Journal on Advances in Signal Processing, the computer has been designed to intelligently engage with people and to adjust its dialogue according to a user's emotional state. To gauge this, the researchers looked at a total of 60 acoustic parameters, including the tenor of a user's voice, the speed at which one speaks, and the length of any pauses. They also implemented controls to account for any endogenous reactions (e.g., if a user gets frustrated with the computer's speech), and enabled the adaptable device to modify its speech accordingly, based on predictions of where the conversation may lead. In the end, they found that users responded more positively whenever the computer spoke in "objective terms" (i.e., with more succinct dialogue). The same could probably be said for most bloggers, as well. Teleport past the break for the full PR, along with a demo video (in Spanish).
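
    To give a flavor of what such parameters look like in practice, here's a minimal sketch of two of them -- speaking rate and average pause length -- computed from a timestamped transcript. The (word, start, end) input format and the sample numbers are assumptions for illustration; the study's actual feature extraction over 60 acoustic parameters is far more involved.

    ```python
    # Two hypothetical acoustic features from timestamped words:
    # speaking rate (words per second) and average pause length (seconds).

    def speech_features(words):
        """words: list of (word, start_sec, end_sec) tuples."""
        duration = words[-1][2] - words[0][1]
        rate = len(words) / duration                       # words per second
        pauses = [b[1] - a[2] for a, b in zip(words, words[1:])]
        avg_pause = sum(pauses) / len(pauses) if pauses else 0.0
        return {"rate_wps": rate, "avg_pause_s": avg_pause}

    sample = [("hello", 0.0, 0.4), ("there", 0.9, 1.3), ("computer", 1.5, 2.2)]
    print(speech_features(sample))  # rate ~1.36 wps, avg pause ~0.35 s
    ```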

  • Intel 4004, world's first commercial microprocessor, celebrates 40th birthday, ages gracefully

    by Amar Toor
    11.15.2011

    Pull out the candles and champagne, because the Intel 4004 is celebrating a major birthday today -- the big four-oh. That's right, it's been exactly four decades since Intel unveiled the world's first commercially available CPU, with an Electronic News ad that ran on November 15th, 1971. It all began in 1969, when Japan's Nippon Calculating Machine Corporation asked Intel to create 12 chips for its Busicom 141-PF calculator. With that assignment, engineers Federico Faggin, Ted Hoff and Stanley Mazor set about designing what would prove to be a groundbreaking innovation -- a 4-bit, 16-pin microprocessor with a full 2,300 MOS transistors and about 740kHz of horsepower. The 4004's ten micron feature size may seem gargantuan by contemporary standards, but at the time, it was rather remarkable -- especially considering that the processor was constructed from a single piece of silicon. In fact, Faggin was so proud of his creation that he decided to initial its design with "FF," in appropriate recognition of a true work of art. Hit up the coverage links below for more background on the Intel 4004, including a graphic history of the microprocessor, from the Inquirer.

  • Wireless bike brake system has the highest GPA ever

    by Amar Toor
    10.17.2011

    Color us a yellow shade of mendacious, but if we designed something that works 99.999999999997 percent of the time, we'd probably round off and give ourselves a big ol' 100 percent A+. We'd probably throw in a smiley faced sticker, too. Computer scientist Holger Hermanns, however, is a much more honest man, which is why he's willing to admit that his new wireless bike brake system is susceptible to outright failure on about three out of every trillion occasions. Hermanns' concept bike, pictured above, may look pretty standard at first glance, but take a closer look at the right handlebar. There, you'll find a rubber grip with a pressure sensor nestled inside. Whenever a rider squeezes this grip, the blue plastic box sitting next to it sends out a signal to a receiver attached to the bike's fork. From there, the message is sent on to an actuator that converts the signal into mechanical energy and activates the brake. Best of all, this entire process takes just 250 milliseconds of your life. No wires, no brake cables, no mind control. Hermanns and his colleagues at Saarland University are now working on improving their system's traction and are still looking for engineers to turn their concept into a commercial reality, but you can wheel past the break for more information, in the full PR.
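
    For a sense of what that 250 millisecond budget means, the sketch below models the signal chain -- grip sensor, radio, receiver, actuator -- with per-stage latencies that are entirely made up for illustration; the real system's timing breakdown isn't given in the article.

    ```python
    # Toy latency budget for the sensor-to-actuator chain. The per-stage
    # numbers are invented; only the 250 ms total comes from the article.
    STAGES_MS = {
        "pressure sensor read": 10,
        "radio transmit":       90,
        "receiver processing":  50,
        "actuator engage":     100,
    }

    def brake_latency_ms() -> int:
        return sum(STAGES_MS.values())

    assert brake_latency_ms() <= 250   # the reported end-to-end budget
    print(f"brake engages in ~{brake_latency_ms()} ms")
    ```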

  • Dennis Ritchie, pioneer of C programming language and Unix, reported dead at age 70

    by Amar Toor
    10.13.2011

    We're getting reports today that Dennis Ritchie, the man who created the C programming language and spearheaded the development of Unix, has died at the age of 70. The sad news was first reported by Rob Pike, a Google engineer and former colleague of Ritchie's, who confirmed via Google+ that the computer scientist passed away over the weekend, after a long battle with an unspecified illness. Ritchie's illustrious career began in 1967, when he joined Bell Labs just one year before receiving a PhD in physics from Harvard University. It didn't take long, however, for the Bronxville, NY native to have a major impact upon computer science. In 1969, he helped develop the Unix operating system alongside Ken Thompson, Brian Kernighan and other Bell colleagues. At around the same time, he began laying the groundwork for what would become the C programming language -- a framework he and co-author Kernighan would later explain in their seminal 1978 book, The C Programming Language. Ritchie went on to earn several awards on the strength of these accomplishments, including the Turing Award in 1983, election to the National Academy of Engineering in 1988, and the National Medal of Technology in 1999. The precise circumstances surrounding his death are unclear at the moment, though news of his passing has already elicited an outpouring of tributes and remembrance for the man known to many as dmr (his e-mail address at Bell Labs). "He was a quiet and mostly private man," Pike wrote in his brief post, "but he was also my friend, colleague, and collaborator, and the world has lost a truly great mind."

  • NC State researchers team with IBM to keep cloud-stored data away from prying eyes

    by Amar Toor
    10.07.2011

    The man on your left is Dr. Peng Ning -- a computer science professor at NC State whose team, along with researchers from IBM, has developed an experimental new method for safely securing cloud-stored data. Their approach, known as a "Strongly Isolated Computing Environment" (SICE), would essentially allow engineers to isolate, store and process sensitive information away from a computing system's hypervisors -- programs that allow networked operating systems to operate independently of one another, but are also vulnerable to hackers. With the Trusted Computing Base (TCB) as its software foundation, Ning's technique also allows programmers to devote specific CPU cores to handling sensitive data, thereby freeing up the other cores to execute normal functions. And, because the TCB consists of just 300 lines of code, it leaves a smaller "surface" for cybercriminals to attack. When put to the test, the SICE architecture incurred a performance overhead of only about three percent for workloads that didn't require direct network access -- an amount that Ning describes as a "fairly modest price to pay for the enhanced security." He acknowledges, however, that he and his team still need to find a way to speed up processes for workloads that do depend on network access, and it remains to be seen whether or not their technique will make it to the mainstream anytime soon. For now, though, you can float past the break for more details in the full PR.
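
    SICE enforces its isolation below the operating system, but the core-dedication idea has a rough user-space analogy in CPU affinity. The Linux-only Python sketch below pins a "sensitive" process to core 0 and keeps a "normal" process off it; it illustrates the concept, not the SICE implementation.

    ```python
    # Core pinning as a user-space stand-in for dedicating CPU cores:
    # sensitive work gets core 0 to itself, normal work uses the rest.
    # Linux-only: os.sched_setaffinity is not available everywhere.
    import os
    from multiprocessing import Process

    def sensitive_worker():
        os.sched_setaffinity(0, {0})        # restrict this process to core 0
        print("sensitive work on cores:", os.sched_getaffinity(0))

    def normal_worker():
        cores = set(range(os.cpu_count())) - {0}
        os.sched_setaffinity(0, cores)      # keep normal work off core 0
        print("normal work on cores:", os.sched_getaffinity(0))

    if __name__ == "__main__":
        for target in (sensitive_worker, normal_worker):
            p = Process(target=target)
            p.start()
            p.join()
    ```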

  • New program makes it easier to turn your computer into a conversational chatterbox

    by Amar Toor
    09.05.2011

    We've already seen how awkward computers can be when they try to speak like humans, but researchers from North Carolina State and Georgia Tech have now developed a program that could make it easier to show them how it's done. Their approach, outlined in a recently published paper, would allow developers to create natural language generation (NLG) systems twice as fast as currently possible. NLG technology is used in a wide array of applications (including video games and customer service centers), but producing these systems has traditionally required developers to enter massive amounts of data, vocabulary and templates -- rules that computers use to develop coherent sentences. Lead author Karthik Narayan and his team, however, have created a program capable of learning how to use these templates on its own, thereby requiring developers to input only basic information about any given topic of conversation. As it learns how to speak, the software can also make automatic suggestions about which information should be added to its database, based on the conversation at hand. Narayan and his colleagues will present their study at this year's Artificial Intelligence and Interactive Digital Entertainment conference in October, but you can dig through it for yourself, at the link below.
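
    The "templates" in question are rules that map structured data to sentences. A hand-rolled version looks something like the sketch below -- the weather domain, template strings and field names are all invented for illustration; the paper's contribution is learning such templates automatically rather than having developers write them.

    ```python
    # Template-based natural language generation in miniature: pick a
    # template, fill its slots from structured facts, emit a sentence.
    TEMPLATES = {
        "weather": "It is {temp} degrees and {sky} in {city}.",
        "forecast": "Expect {sky} skies in {city} tomorrow.",
    }

    def realize(template_name: str, facts: dict) -> str:
        return TEMPLATES[template_name].format(**facts)

    facts = {"city": "Raleigh", "temp": 72, "sky": "sunny"}
    print(realize("weather", facts))
    # -> "It is 72 degrees and sunny in Raleigh."
    ```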

  • IBM developing largest data drive ever, with 120 petabytes of bliss

    by Amar Toor
    08.26.2011

    So, this is pretty... big. At this very moment, researchers at IBM are building the largest data drive ever -- a 120 petabyte beast comprising some 200,000 conventional HDDs working in concert. To put that into perspective, 120 petabytes is the equivalent of 120 million gigabytes (or enough space to hold about 24 billion average-sized MP3s), and significantly more spacious than the 15 petabyte capacity found in the biggest arrays currently in use. To achieve this, IBM aligned individual drives in horizontal drawers, as in most data centers, but made these spaces even wider, in order to accommodate more disks within smaller confines. Engineers also implemented a new data backup mechanism, whereby information from dying disks is slowly reproduced on a replacement drive, allowing the system to continue running without any slowdown. A file system called GPFS, meanwhile, spreads stored files over multiple disks, allowing the machine to read or write different parts of a given file at once, while indexing its entire collection at breakneck speeds. The company developed this particular system for an unnamed client looking to conduct complex simulations, but Bruce Hillsberg, IBM's director of storage research, says it may be only a matter of time before all cloud computing systems sport similar architectures. For the moment, however, he admits that his creation is still "on the lunatic fringe."
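
    The parallel-read trick GPFS relies on is striping: a file is chopped into blocks that are spread round-robin across many disks, so different parts of the same file can be read simultaneously. Here's a toy sketch of just the placement logic (the block size and disk count are arbitrary; real GPFS adds metadata, replication and recovery):

    ```python
    # Round-robin striping of a file's blocks across disks, plus reassembly.
    BLOCK_SIZE = 4  # bytes per block, absurdly small for demonstration

    def stripe(data: bytes, n_disks: int):
        """Return a per-disk list of (block_index, block) placements."""
        disks = [[] for _ in range(n_disks)]
        blocks = [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]
        for idx, block in enumerate(blocks):
            disks[idx % n_disks].append((idx, block))   # round-robin placement
        return disks

    def reassemble(disks):
        blocks = sorted((idx, blk) for disk in disks for idx, blk in disk)
        return b"".join(blk for _, blk in blocks)

    layout = stripe(b"all work and no play", n_disks=3)
    assert reassemble(layout) == b"all work and no play"
    ```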

  • AP, Google offer $20,000 scholarships to aspiring tech journalists, we go back to school

    by Amar Toor
    08.17.2011

    Love technology? Love journalism? Well, the AP-Google Journalism and Technology Scholarship program might be right up your alley. The initiative, announced earlier this week, will offer $20,000 scholarships to six graduate or undergraduate students working toward a degree in any field that combines journalism, new media and computer science. Geared toward aspiring journalists pursuing projects that "further the ideals of digital journalism," the program also aims to encompass a broad swath of students from diverse ethnic, gender, and geographic backgrounds. Applications for the 2012-2013 school year are now open for students who are currently enrolled as college sophomores or higher, with at least one year of full-time coursework remaining. Hit up the source link below to apply, or head past the break for more information, in the full presser.

  • Telex anti-censorship system promises to leap over firewalls without getting burned

    by Amar Toor
    08.14.2011

    Human rights activists and free speech advocates have every reason to worry about the future of an open and uncensored internet, but researchers from the University of Michigan and the University of Waterloo have come up with a new tool that may help put their fears to rest. Their system, called Telex, proposes to circumvent government censors by using some clever cryptographic techniques. Unlike similar schemes, which typically require users to deploy secret IP addresses and encryption keys, Telex would only ask that they download a piece of software. With the program onboard, users in firewalled countries would then be able to visit blacklisted sites by establishing a decoy connection to any unblocked address. The software would automatically recognize this connection as a Telex request and tag it with a secret code visible only to participating ISPs, which could then divert these requests to banned sites. By essentially creating a proxy server without an IP address, the concept could make verboten connections more difficult to trace, but it would still rely upon the cooperation of many ISPs stationed outside the country in question -- which could pose a significant obstacle to its realization. At this point, Telex is still in a proof-of-concept phase, but you can find out more in the full press release, after the break.
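
    The clever bit is the tag: a connection that looks random to the censor but is recognizable to a participating ISP. Real Telex hides a public-key tag in the TLS handshake nonce; the toy sketch below substitutes a shared-secret HMAC purely to show the recognize-and-divert logic, and every name and value in it is invented for illustration.

    ```python
    # Toy version of Telex-style tagging: a tagged nonce is indistinguishable
    # from random bytes unless you hold the secret needed to verify the tag.
    import hashlib
    import hmac
    import os

    ISP_SECRET = b"known only to participating ISPs"   # stand-in for Telex's keys

    def make_request_nonce(tagged: bool) -> bytes:
        salt = os.urandom(16)
        if not tagged:
            return salt + os.urandom(16)               # ordinary random nonce
        return salt + hmac.new(ISP_SECRET, salt, hashlib.sha256).digest()[:16]

    def isp_should_divert(nonce: bytes) -> bool:
        salt, tag = nonce[:16], nonce[16:]
        expected = hmac.new(ISP_SECRET, salt, hashlib.sha256).digest()[:16]
        return hmac.compare_digest(tag, expected)      # censor lacks the secret

    print(isp_should_divert(make_request_nonce(tagged=True)))   # True -> divert
    print(isp_should_divert(make_request_nonce(tagged=False)))  # False -> pass on
    ```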