Originally published Saturday, February 8, 2014 at 6:32 PM

Snowden used low-cost tool to best NSA

Using “Web crawler” software designed to search, index and back up a website, Edward Snowden “scraped data” out of National Security Agency networks.


The New York Times

WASHINGTON — Intelligence officials investigating how Edward Snowden gained access to about 1.7 million of the country’s most highly classified documents say they have determined that he used inexpensive and widely available software to “scrape” the National Security Agency’s (NSA) networks, and he kept at it even after he was briefly challenged by agency officials.

Using “Web crawler” software designed to search, index and back up a website, Snowden “scraped data out of our systems” while he went about his day job, according to a senior intelligence official. “We do not believe this was an individual sitting at a machine and downloading this much material in sequence,” the official said. The process, he added, was “quite automated.”

The findings are striking because the NSA’s mission includes protecting the nation’s most sensitive military and intelligence computer systems from cyberattacks, especially the sophisticated attacks that emanate from Russia and China. Snowden’s “insider attack,” by contrast, was not sophisticated and should have been easily detected, investigators found.

Moreover, Snowden succeeded nearly three years after the WikiLeaks disclosures, in which military and State Department files, of far less sensitivity, were taken using similar techniques.

Snowden had broad access to the NSA’s complete files because he was working as a technology contractor for the agency in Hawaii, helping to manage the agency’s computer systems in an outpost that focuses on China and North Korea.

A Web crawler, also called a spider, automatically moves from website to website, following links embedded in each document, and can be programmed to copy everything in its path.

Snowden appears to have set the parameters for the searches, including which subjects to look for and how deeply to follow links to documents and other data on the NSA’s internal networks.
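Officials have not identified the tool, so the sketch below is only a generic illustration of the mechanics described above: a crawler that starts from one page, follows the links embedded in it to a set depth, and keeps whatever matches its search terms. The starting URL, keywords and depth limit are all invented for the example.

```python
# A minimal sketch of a depth-limited, keyword-filtered Web crawler.
# Purely illustrative: the start URL, keywords and depth are invented,
# and nothing here reflects the specific tool described in this article.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
import urllib.request

class LinkExtractor(HTMLParser):
    """Collects the href targets of <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, keywords, max_depth=2):
    """Follow links breadth-first, recording pages that match a keyword."""
    seen = {start_url}
    frontier = deque([(start_url, 0)])
    matches = []
    while frontier:
        url, depth = frontier.popleft()
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                page = resp.read().decode("utf-8", errors="replace")
        except (OSError, ValueError):
            continue  # unreachable page or non-HTTP link: skip it
        if any(k.lower() in page.lower() for k in keywords):
            matches.append(url)  # a real tool would save the page to disk
        if depth < max_depth:
            extractor = LinkExtractor()
            extractor.feed(page)
            for href in extractor.links:
                absolute = urljoin(url, href)
                if absolute not in seen:
                    seen.add(absolute)
                    frontier.append((absolute, depth + 1))
    return matches

if __name__ == "__main__":
    print(crawl("http://intranet.example/wiki", ["budget", "collection"]))
```

Pointed at an internal site and left running, even a loop this simple will methodically copy everything reachable from its starting page, which is why investigators described the process as “quite automated” rather than one person downloading files in sequence.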

Among the materials prominent in the Snowden files are the agency’s shared “wikis,” databases to which intelligence analysts, operatives and others contributed their knowledge. Some of that material indicates Snowden “accessed” the documents. But experts say they may well have been downloaded not by him but by the program acting on his behalf.

Agency officials insist that if Snowden had been working from NSA headquarters at Fort Meade, Md., which was equipped with monitors designed to detect when a huge volume of data was being accessed and downloaded, he almost certainly would have been caught. But because he worked at an agency outpost that had not been upgraded with modern security measures, his copying of what the agency’s newly appointed No. 2 officer, Rick Ledgett, recently called “the keys to the kingdom” raised few alarms.
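The article does not describe how those monitors work. One plausible mechanism, offered here only as an assumption, is threshold alerting: track how much data each user pulls within a rolling time window and raise an alarm when the total becomes anomalous. A minimal sketch, with an invented log format, window and threshold:

```python
# A crude sketch of volume-based monitoring: flag any user whose
# downloads within a rolling window exceed a threshold. The window,
# threshold and log format are invented for illustration.
from collections import defaultdict, deque

WINDOW_SECONDS = 3600          # consider the last hour of activity
THRESHOLD_BYTES = 10 * 2**30   # alert above 10 GiB per user per window

class VolumeMonitor:
    def __init__(self):
        # per-user queue of (timestamp, bytes) events plus a running total
        self.events = defaultdict(deque)
        self.totals = defaultdict(int)

    def record(self, user, timestamp, nbytes):
        """Record one download event; return True if the user trips the alarm.

        Assumes events arrive in timestamp order, as from a live log feed.
        """
        queue = self.events[user]
        queue.append((timestamp, nbytes))
        self.totals[user] += nbytes
        # evict events that have aged out of the window
        while queue and queue[0][0] < timestamp - WINDOW_SECONDS:
            _, old_bytes = queue.popleft()
            self.totals[user] -= old_bytes
        return self.totals[user] > THRESHOLD_BYTES

monitor = VolumeMonitor()
# e.g. fed from an access log: user, Unix time, bytes transferred
if monitor.record("analyst42", 1_391_850_000, 12 * 2**30):
    print("ALERT: analyst42 exceeded the hourly download threshold")
```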

“Someplace had to be last” in getting the security upgrade, said one official familiar with Snowden’s activities. But he added that Snowden’s actions had been “challenged a few times.”

In at least one instance when he was questioned, Snowden provided what were later described to investigators as legitimate-sounding explanations for his activities: As a systems administrator he was responsible for conducting routine network maintenance. That could include backing up the computer systems and moving information to local servers, investigators were told.

But from his first days working as a contractor inside the NSA’s underground Oahu, Hawaii, facility for Dell, a computer maker, and then at a modern office building on the island for Booz Allen Hamilton, a technology-consulting firm that sells and operates computer-security services used by the government, Snowden learned something critical about the NSA’s culture: While the organization built enormously high electronic barriers to keep out foreign invaders, it had rudimentary protections against insiders.

“Once you are inside, the assumption is that you are supposed to be there, like in most organizations,” said Richard Bejtlich, chief security strategist for FireEye, a Silicon Valley computer-security firm, and a senior fellow at the Brookings Institution. “But that doesn’t explain why they weren’t more vigilant about excessive activity in the system.”

Investigators have yet to determine whether Snowden happened into an ill-defended outpost of the NSA or sought a job there because he knew it had yet to install the security upgrades that might have stopped him.

“He was either very lucky or very strategic,” one intelligence official said. A new book, “The Snowden Files,” by Luke Harding, a correspondent for The Guardian in London, reports that Snowden sought his job at Booz Allen because “to get access to a final tranche of documents” he needed “greater security privileges than he enjoyed in his position at Dell.”

Through his lawyer at the American Civil Liberties Union, Snowden did not address the government’s theory of how he obtained the files, saying in a statement: “It’s ironic that officials are giving classified information to journalists in an effort to discredit me for giving classified information to journalists. The difference is that I did so to inform the public about the government’s actions, and they’re doing so to misinform the public about mine.”

The NSA declined to comment on its investigation or the security changes it has made since the Snowden disclosures. Other intelligence officials familiar with the findings of ongoing investigations — there are at least four — were granted anonymity to discuss the investigations.

Officials declined to say which Web crawler Snowden had used, or whether he had written some of the software himself. Officials said it functioned like Googlebot, a widely used Web crawler that Google developed to find and index new pages on the Web.

What officials cannot explain is why the presence of such software in a highly classified system was not an obvious tip-off to unauthorized activity.

When supplied with Snowden’s passwords, the Web crawler became especially powerful. Investigators determined he probably had also made use of the passwords of some colleagues or supervisors.
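The report gives no detail on how the passwords were employed. One common pattern, sketched here purely as an assumption, is to register credentials with the crawler’s HTTP client so that every request authenticates automatically, making protected pages look no different from open ones. The server address and account below are hypothetical:

```python
# Sketch: registering stored credentials so a crawler's requests
# authenticate transparently. The URL and account are hypothetical.
import urllib.request

password_mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
# entries for more than one server (or borrowed account) can be added;
# the handler uses whichever entry matches the request URL
password_mgr.add_password(None, "http://intranet.example/", "analyst42", "hunter2")

opener = urllib.request.build_opener(
    urllib.request.HTTPBasicAuthHandler(password_mgr)
)
urllib.request.install_opener(opener)  # urlopen() now authenticates for us

with urllib.request.urlopen("http://intranet.example/wiki/restricted") as resp:
    print(resp.status, len(resp.read()))
```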

He was also aided by a culture within the NSA, officials say, that “compartmented” relatively little information. As a result, a 29-year-old computer engineer, working from a World War II-era tunnel in Oahu and then from downtown Honolulu, had access to unencrypted files that dealt with information as varied as the bulk collection of domestic phone numbers and the intercepted communications of Chancellor Angela Merkel of Germany.

Investigators have found no evidence that Snowden’s searches were directed by a foreign power, despite suggestions to that effect by the chairman of the House Intelligence Committee, Rep. Mike Rogers, R-Mich.

In his statement, Snowden denied any deliberate effort to gain access to any military information. “They rely on a baseless premise, which is that I was after military information,” he said.


