Edward Snowden used automated web search tools to collect NSA data

Jon Fingas, @jonfingas
February 8, 2014
It's tempting to imagine that Edward Snowden obtained NSA data through a daring Mission: Impossible-style raid, but it now appears that he didn't have to put in much effort. Intelligence officials speaking to the New York Times say that Snowden used a standard web crawler, a tool that typically indexes websites for search engines, to automatically collect the info he wanted. He only needed the right logins to bypass what internal defenses were in place. Since the NSA wasn't walling off content to prevent theft by insiders, the crawler could collect seemingly anything -- and Snowden's Hawaii bureau didn't have activity monitors that would have caught his bot in the act. Whether or not you believe the NSA's intelligence gathering policies justified a leak, it's clear that the agency was partly to blame for its own misfortune.
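The crawler technique described above -- fetch a page, harvest its links, queue them, repeat -- can be sketched as a short breadth-first walk. This is a generic illustration, not the actual tool reported in the story; the `fetch` callable and the intranet URLs are invented stand-ins (a real crawler would make authenticated HTTP requests here).

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags as the parser walks the HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, fetch, max_pages=100):
    """Breadth-first crawl: download a page, enqueue every new link on it.

    `fetch` is a callable url -> HTML string, injected so the sketch stays
    self-contained and testable without a network.
    """
    seen = {start_url}
    queue = deque([start_url])
    collected = {}
    while queue and len(collected) < max_pages:
        url = queue.popleft()
        html = fetch(url)
        collected[url] = html
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)  # resolve relative hrefs
            if absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return collected

# Usage with a tiny in-memory "site" standing in for live pages:
site = {
    "http://intranet/index": '<a href="/docs">docs</a>',
    "http://intranet/docs": '<a href="/index">home</a>',
}
pages = crawl("http://intranet/index", lambda u: site.get(u, ""))
```

The point the article makes maps directly onto this loop: without per-document access walls, every link the crawler can resolve with a valid login becomes another page in `collected`.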

