The NSA broke agency rules thousands of times
August 16, 2013
According to an internal agency audit obtained by The Washington Post, the National Security Agency (NSA) has exceeded its legal authority and broken agency rules thousands of times since it was granted broader powers five years ago.
And the situation appears to be getting worse. Most violations involved unauthorized surveillance of Americans or of foreign intelligence targets in the United States, according to the documents, which were supplied to the newspaper by NSA whistleblower Edward Snowden.
The documents show infractions ranging from serious legal violations to typographical errors that resulted in unintended data collection, The Post reported.
That's on top of all the internet monitoring and sniffing that the NSA does on a daily basis. The agency was not always forthcoming with the details of its transgressions, the Post found. A quality assurance report that was not shared with an oversight committee found that a "large number" of calls were intercepted in 2008 when the U.S. area code 202 was mistakenly entered as 20, the international dialing code for Egypt.
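The Post's account doesn't say exactly how that error happened, but the kind of confusion it describes -- Egypt's country code 20 versus Washington's area code 202 -- is easy to reproduce whenever phone numbers are matched as bare digit prefixes. A hypothetical sketch (the numbers and selector below are made up for illustration):

```python
# Hypothetical illustration: a naive digit-prefix match conflates
# Egypt's country code (20) with Washington D.C.'s area code (202).
numbers = [
    "2025551234",   # a Washington, D.C. number (area code 202)
    "2021234567",   # could equally be an Egyptian number (+20 2 ...)
]

egypt_selector = "20"  # naive "country code" prefix, no normalization
matches = [n for n in numbers if n.startswith(egypt_selector)]

# Both numbers match the Egypt selector, so U.S. calls get swept up too.
print(matches)
```

Without normalizing numbers into an unambiguous international format first, a one-character selector error is enough to pull in an entire unrelated calling area.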
In another case, the Foreign Intelligence Surveillance Court, which reviews NSA warrant requests, wasn't made aware of a new collection method until it had been in place for several months. The court ultimately ruled it unconstitutional, the Post reported.
The audit, dated May 2012, uncovered no fewer than 2,776 separate incidents of unauthorized collection, storage, access to, or distribution of legally protected communications in the preceding twelve months, the Post reported.
One of those cases involved the unauthorized use of data on 3,000 Americans and green-card holders. "We're a human-run agency operating in a complex environment with a number of different regulatory regimes, so at times we find ourselves on the wrong side of the line," a senior NSA official said, speaking with White House permission.
"You can look at it as a percentage of our total activity that occurs each day. You look at a number in absolute terms that looks big, and when you look at it in relative terms, it looks a little different," he added.
The Obama administration, which has defended the NSA's activities, has never publicly addressed the agency's compliance record, the Post noted. But John DeLong, the NSA's director of compliance, defended the agency's procedures, saying it had quadrupled the number of personnel working in its privacy compliance program in recent years.
"We want people to report if they have made a mistake or even if they believe that an NSA activity isn't consistent with the rules. NSA, like other regulated organizations, also has a hotline for people to report -- and no adverse action or reprisal can be taken for the simple act of reporting. We take each report seriously, investigate the matter, address the issue, constantly look for trends, and address them as well, all as a part of NSA's internal oversight and compliance efforts," he added.
"What's more, we keep our overseers informed through both immediate reporting and periodic reporting. Our internal privacy compliance program has more than 300 personnel assigned to it-- a fourfold increase since 2009. They manage NSA's rules, train personnel, develop and implement technical safeguards, and set up systems to continually monitor and guide NSA's activities. We take this work very seriously," DeLong said.
The NSA later offered this as a substitute statement-- "NSA's foreign intelligence collection activities are continually audited and overseen internally and externally. When NSA makes a mistake in carrying out its foreign intelligence mission, the agency reports the issue internally and to federal overseers, and aggressively gets to the bottom of it," the agency said.
In other NSA news
The NSA has issued a document titled 'The National Security Agency: Missions, Authorities, Oversight and Partnerships' that briefly explains some of its operations, including a claim that the agency touches about 1.6 percent of all daily Internet traffic.
The report also adds that only about 0.025 percent of that 1.6 percent is actually selected for review in the first place. If you're skeptical when reading this, you're not alone...
Released quietly over the weekend - albeit amid fresh claims that the NSA is scrutinizing every email in and out of the US - the document's prologue explains that the NSA lacked tools to track one of the 9/11 hijackers.
As a result “several programs were developed to address the U.S. Government's needs to connect the dots of information available to the intelligence community and to strengthen the combined coordination between foreign intelligence agents and domestic law enforcement agencies”.
The report then goes on to detail the many legal underpinnings of the agency's work and identify the following methodology for its operations.
The NSA identifies foreign entities, persons and organizations that have information responsive to an identified foreign intelligence requirement.
For instance, the agency works closely to identify individuals who may belong to a terrorist network. The NSA develops "the network" with which that person or organization's information is shared or the command and control structure through which it flows.
In other words, if the agency is tracking a specific terrorist, it will endeavor to determine who that person is in contact with, and who he is taking his orders from.
The NSA identifies how the foreign entities communicate (radio, e-mail, telephony, etc.). The agency then identifies the telecommunications infrastructure used to transmit those communications.
The agency then identifies security vulnerabilities in the methods of communication used to transmit them. The NSA then matches its collection of data to those vulnerabilities, or it develops new capabilities to acquire communications of interest if needed.
The traffic figures come in a section titled “Scope and Scale of NSA Collection” that reads as follows-- "According to various numbers published by a major technology provider, the Internet carries about 1,826 Petabytes of information per day."
"In its foreign intelligence mission, the NSA touches about 1.6 percent of that data. But of that 1.6 percent, only 0.025 percent is actually selected for review," the report states.
"In the end, the net effect is that NSA analysts look at about 0.00004 percent of the world's traffic in conducting its mission. That's less than one part in a million,” according to the report.
It also means that the NSA is still reviewing somewhere in the region of seven terabytes of data a day. And let's also ponder just what “selected for review” means. Is it reading by humans? Processing by servers?
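Taking the report's figures at face value, the percentages are easy to sanity-check with a back-of-the-envelope calculation (1 PB = 1,000 TB here):

```python
# Sanity check of the NSA report's traffic figures.
total_pb_per_day = 1826                  # daily Internet traffic, per the report
touched = total_pb_per_day * 0.016       # 1.6 percent "touched"
selected = touched * 0.00025             # 0.025 percent of that selected for review

print(f"touched:  {touched:.1f} PB/day")           # ~29.2 PB/day
print(f"selected: {selected * 1000:.1f} TB/day")   # ~7.3 TB/day

# Note: 1.6% of 0.025% works out to 0.0004% of total traffic --
# a factor of ten higher than the 0.00004% the report quotes.
print(f"fraction of total: {0.016 * 0.00025:.6%}")
```

So by the report's own arithmetic, "selected for review" amounts to roughly seven terabytes a day, and the headline 0.00004 percent figure doesn't quite follow from the two percentages the document gives.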
Perhaps the security probe launched by President Barack Obama into his spooks' activities will reveal all. We shall see in time, hopefully.
The NSA would have us believe that whatever is going on right now, “NSA personnel are obliged to report when they believe the NSA is not, or may not be, acting consistently with law, policy, or procedure”.
“This self-reporting is part of the culture and fabric of the NSA,” the document continues. “If the NSA is not acting in accordance with law, policy, or procedure, the agency will report through its internal and external intelligence oversight channels, conduct specific reviews to understand the root cause, and make appropriate adjustments to constantly improve itself.”
But for now, we can only imagine leakers à la Assange working for government contractors who were not on the NSA's list of “external intelligence oversight channels”.
Whistleblower Edward Snowden thrusting himself into that role is most likely the real reason this document was published in the first place. We will keep you posted, as always.
In other internet security news
Network anonymisation project TOR has posted a strange piece of commentary on reports that some of the hidden services it routes to have completely disappeared from its network in the last two days.
“Around midnight on August 4th, we were notified by a few people that a large number of hidden service addresses have completely disappeared from the Tor Network,” the post read.
As it explores the rumors, the post goes on to name an entity called Freedom Hosting, and to vigorously dissociate TOR from the organization.
Distancing TOR from Freedom seems a fine idea given numerous reports, such as one from The Irish Examiner, suggesting that Freedom Hosting founder Eric Marques has been arrested because the FBI believes he facilitated the distribution of child porn using TOR. The FBI now wants to extradite Marques to the U.S.
Sites hosted by Freedom have reportedly been serving a malicious payload that results in malware reaching users' PCs, possibly thanks to “potential bugs in Firefox 17 ESR, on which our TOR Browser is based,” the post warned.
TOR is “investigating these bugs and will fix them if we can,” it said. Various online forums, however, report that the malware has spread beyond sites hosted by Freedom. Some suggest TORmail, an email service reachable through the TOR network, may also have been compromised, or that the attack means TOR is no longer able to mask users' IP addresses.
TOR's post says it's not sure what's really happening and that it will update users once it learns more.
In other internet security news
On many levels, Edward Snowden's revelations about the activities of the various U.S. security organizations have not come as a real surprise, yet they were still a wake-up call about how our personal data security has changed in the past year, driven largely by the mobile segment.
Multiple devices and increased mobility have meant that we have looked for new ways to ensure access to our data wherever and whenever we need it.
It's also increasingly uncommon to find a homogeneous household in terms of manufacturer or operating system. It's now fairly common to find Windows, OS X, Android, iOS and even Linux devices all within a single household. Throw in digital cameras and a couple of smart TVs, and it's no wonder that we have a situation that makes data sharing in a secure fashion more and more problematic for the average person.
So file-syncing and sharing products such as Dropbox, SkyDrive and GoogleDrive are pretty much inevitable consequences of this. The average user now has a broad selection of these services-- some free and some paid for, but pretty much all of them insecure. In fact, some are even a whole lot worse than you'd expect.
Of course, it would be nice if the operating system manufacturers could agree on a standard which included encryption of data in-flight and at rest with a simple and easy-to-use key-sharing mechanism.
But even with that, we would probably still not trust it entirely, though it might at least provide an initial level of defence, for what it's worth.
Some of us have already started to look at ways of adding encryption to the various cloud services we use. In the past, some used TrueCrypt, but it's not seamless and can be needlessly complex. This approach is becoming more feasible, though, as apps such as Cryptonite and DiskDecipher appear for mobile devices.
Recently, we started to play with BoxCryptor and EncFS. BoxCryptor seems nice and easy to use, certainly on the desktop. It supports multiple cloud providers, although the free version only supports a single cloud provider — if you want to encrypt your multiple cloud stores, you will have to pay.
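The encrypt-before-sync idea behind tools like BoxCryptor and EncFS can be sketched in a few lines. This is a minimal illustration rather than a substitute for those tools; it assumes the third-party Python `cryptography` package is installed, and the file contents below are hypothetical:

```python
# Minimal encrypt-before-sync sketch using symmetric (Fernet) encryption.
# Only ciphertext ever lands in the synced cloud folder; the key stays local.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # store OUTSIDE the synced folder, e.g. a local keyring
cipher = Fernet(key)

plaintext = b"contents of some private document"
ciphertext = cipher.encrypt(plaintext)   # this is what you'd write into the synced folder

# On another device holding the same key, the file decrypts back intact:
assert cipher.decrypt(ciphertext) == plaintext
```

Tools such as EncFS do essentially this transparently at the filesystem level, encrypting each file before it ever reaches the cloud provider's folder, which is why key management -- not the cipher itself -- is the hard part of making such services usable.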
There are also alternatives such as Cloudfogger, though development of BoxCryptor seems more active. Another option is building your own "sync and share" service.
A new service called 'Transporter' recently launched successfully and looks good, while another called 'Plug' is in the process of launching as well.
Similarly, Synology has Cloud Station, and QNAP has myQNAPcloud. Or you can build your own entirely with ownCloud.
And of course, in the enterprise segment you have a multitude of options as well. The main thing is, you need not store your data in the cloud in an insecure manner. You have lots of options now, from keeping it local to using a cloud service provider.
Encryption is still not as user-friendly as it could be, but it has gotten easier. We'll update you soon on this and other topics.
In other internet security news
The same security research team that discovered significant internet security vulnerabilities in more than a dozen home Wi-Fi routers has added even more devices to that list at Defcon 21 in 2013.
More and more major brand-name Wi-Fi router security vulnerabilities continue to be discovered by the team, and continue to go unpatched, a security researcher has revealed.
Jake Holcomb, a security researcher at the Baltimore-based security firm Independent Security Evaluators and the lead researcher into Wi-Fi router vulnerabilities, said that the issues are far worse than when ISE released its original findings in April of this year.
The latest study continues to reveal that the small office and home office Wi-Fi routers are "very vulnerable to attack," Holcomb added.
"They're not a means to protect your network and your digital assets," he cautioned. Holcomb is a relatively young researcher, in his mid-20s, who turned his lifelong interest in computer security into a professional career only in the past year.
Previously, he was doing network security for a school district in Ohio. The new report details no fewer than fifty-six new Common Vulnerabilities and Exposures, or CVEs, that Holcomb and the other ISE researchers have discovered in popular routers.
Those include the Asus RT-AC-66U router, the D-Link DIR-865L, and the TrendNet TEW-812-DRU router, for which Holcomb plans on demonstrating security vulnerabilities at Defcon today.
Additional requests for comment from the affected vendors were not immediately returned. We will update this story when we hear from them.
You might not think that the router security flaws could affect you, or would be easy to exploit. But Holcomb explained that because the vulnerabilities appear to affect most routers and are difficult to repair, they could put nearly every user who connects to a vulnerable router at great risk.
And the scenario he described is a very common one. Small-business and home Wi-Fi router administration often relies on weak passwords, or on static passwords that are the same across multiple stores of the same chain, like a Starbucks.
All an attacker has to do is go to his favorite coffee spot, buy a coffee and get the establishment's Wi-Fi password. Then, with access to the Wi-Fi network, the attacker could simply use one of the exploits that ISE has uncovered.
The wireless router would then be compromised, along with all the Web traffic flowing through it. Holcomb compared the problem of fixing routers to that of patching traditional PCs. "In most cases, automatic updates are enabled for Windows and Mac," he said.
But, he added, "even if a router manufacturer were to implement a similar feature, most people don't log into their routers, and that's the core of the whole problem."
Basically, because people have been trained to think of the router as a set-it-and-forget-it device, and one without security holes, it's nearly impossible to get them to update router firmware.
And the fix won't be an easy one either, at least not logistically. "I think the solution is for wireless routers to automatically update themselves, and offer users the ability to opt out of it," Holcomb said.
However, given the great reluctance of some major router manufacturers to address these security issues, those exploits could exist unpatched in the wild for several years to come.
Source: The NSA.