
Edward Snowden used automated web search tools to collect NSA data

It's tempting to imagine that Edward Snowden obtained NSA data through a daring Mission: Impossible-style raid, but it now appears that he didn't have to put in much effort. Intelligence officials speaking to the New York Times say that Snowden used a standard web crawler, a tool that typically indexes websites for search engines, to automatically collect the info he wanted. He only needed the right logins to bypass what internal defenses were in place. Since the NSA wasn't walling off content to prevent theft by insiders, the crawler could collect seemingly anything, and Snowden's Hawaii bureau didn't have activity monitors that would have caught his bot in the act. Whether or not you believe the NSA's intelligence gathering policies justified a leak, it's clear that the agency was partly to blame for its own misfortune.
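For context on how little effort that approach requires, here's a minimal sketch of what a basic web crawler does: start from a seed page, download it, pull out the links, and keep following them. This is a generic illustration of the concept mentioned above, not the tool Snowden used; the seed URL and page limit are hypothetical placeholders.

```python
# Minimal breadth-first web crawler sketch (illustrative only).
# It fetches pages, extracts links, and queues unseen same-host links,
# which is the basic loop any crawler or search-engine indexer automates.

from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects the href attribute of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, max_pages=50):
    """Fetch up to max_pages pages reachable from seed_url, staying on one host."""
    host = urlparse(seed_url).netloc
    queue = deque([seed_url])
    seen = {seed_url}
    pages = {}

    while queue and len(pages) < max_pages:
        url = queue.popleft()
        try:
            with urlopen(url, timeout=10) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip pages that fail to download

        pages[url] = html  # "collect" the page content

        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            # Follow only same-host links we haven't queued before.
            if urlparse(absolute).netloc == host and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)

    return pages


if __name__ == "__main__":
    # Hypothetical seed; any reachable site would do.
    collected = crawl("https://example.com", max_pages=10)
    print(f"Collected {len(collected)} pages")
```

The point of the sketch is simply that, with valid credentials and no internal barriers, automating bulk collection takes a few dozen lines of code rather than a heist.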
