
Security and Backdoor Encryption to Data

Aparna Awasthi
MIS 6800I
Fairleigh Dickinson University


In the past, the use of encryption was not as pervasive as it is today, because the computing power to encrypt and decrypt information was costly. Therefore, according to the Senate committee hearing titled “Going Dark: Encryption, Technology, and the Balance between Public Safety and Privacy,” the Federal Bureau of Investigation (FBI) is asking organizations to hold a separate set of encryption keys so that user information can be accessed with a court order. Similarly, the FBI has called on organizations to help the government break into their devices.

These recommendations have met solid opposition from security specialists, technology organizations, and even some professionals from law enforcement and intelligence agencies, as stated in the article “Why the fear over ubiquitous data encryption is overblown?” by McConnell & Chertoff (as cited in Castro & McQuinn, 2016). This paper will look at the nuances of each of the U.S. government's recommendations and arguments to constrain the use of encryption and examine their advantages and drawbacks. It begins with the fundamentals of encryption, its evolution, and how the government has gained access to encrypted information over time. The paper also discusses surveillance, the different strategies the government could use to break encryption, and the recommendations the U.S. government has openly supported.

The U.S. government should not confine the commercialization of encryption. Doing so would have no significant impact on militants' and criminals' ability to use encryption, but it would leave ordinary consumers and businesses insecure, block progress in data security, and harm the competitiveness of U.S. companies, affecting U.S. jobs.

Keywords: backdoor encryption, surveillance, security, privacy


Security and Backdoor Encryption to Data

Advancements in the field of data security, especially in how encryption is used to ensure the secrecy of data, have immensely improved security for consumers and organizations. But, according to the Senate committee hearing, as products and services have become safer and more secure, it has become difficult for law enforcement and national security organizations to access specific data that could help them prevent and investigate crimes and terrorism. This has created one of the most difficult policy dilemmas of the digital age, as encryption both enhances security for consumers and organizations and makes it harder for governments to shield them from other threats. However, the U.S. government should not constrain or weaken encryption, because any attempt to do so would harm the general security of law-abiding residents and businesses, make it more difficult for U.S. companies to compete in global markets, and limit advancements in data security. In addition, attempts to limit or weaken encryption would be ineffective at keeping this technology out of the hands of criminals and terrorists. “We live in a golden age of government surveillance even without mandated encryption backdoors. Mandating that U.S. tech organizations compromise their security frameworks to give law enforcement access to all content is not the way to prevent terrorism and criminal activities that many desire it to be. Government needs to choose a broader strategy for stronger encryption, as anything less will just end up harming the security of law-abiding citizens.”

Evolution of Encryption

It is useful to evaluate how encryption has advanced over time as information technology has improved and as new commercial market needs have developed. Furthermore, a few types of encryption are no longer secure, just as a few strategies for breaking encryption have become more successful over time. Throughout this process, intelligence agencies have constantly pushed to constrain encryption in order to retain access to data. The report by the Information Technology and Innovation Foundation (ITIF) entitled “Unlocking Encryption: Information Security and the Rule of Law” by Castro and McQuinn gives a brief insight into the evolution of encryption.

Late 1960s – present: Symmetric Encryption

In the late 1960s, as organizations began to use computers to share and store data in digital form, they required an approach to effortlessly secure that information (Castro & McQuinn, 2016, p.4). These first types of commercial encryption used symmetric key algorithms, which means the sender and receiver use the same cryptographic key for both encryption and decryption.

Castro also states that private-sector and academic efforts to develop better encryption drew pushback from intelligence agencies that feared the growing prevalence of encrypted data would limit their capacity to monitor digital communication. Until this time, it was governments that used encryption, essentially for military purposes. When information is stored locally with symmetric encryption, only those holding the key can decrypt it. Consequently, government offices and other third parties cannot access the information, nor compel third-party key holders to turn over the key. Rather, governments would need to get the key from the source of the information, much as they must do today with other types of client-side encryption. In light of non-government improvements in encryption, and in an effort to keep advances in cryptography secret, the National Security Agency (NSA) attempted to control early advances in encryption (2016, p.4).


In the late 1960s, more organizations started to research encryption. For instance, IBM set up a research group to create an encryption technique for a cash-dispensing system the company was producing for a bank in London. The resulting encryption algorithm, called “Lucifer,” encoded 128-bit blocks of information with a 128-bit key (Sorkin's study, as cited in Castro & McQuinn, 2016). IBM seized this opportunity because in 1968 the U.S. National Bureau of Standards (NBS, now NIST) had started investigating potential requirements for security in both the public and private sectors. In 1973, Pincock's study (as cited in Castro & McQuinn, 2016) found that NBS declared plans to build a national standard for the encryption of business and sensitive data. To this end, NBS sought proposals from the private sector for a public data encryption standard that was flexible and cost-effective, as described in the document named “Data Encryption Standard.” NBS accepted IBM's proposal, and it became known as the Data Encryption Standard (DES). Unfortunately, the NSA interfered, weakening the strength of the original algorithm and reducing the key length to 56 bits in order to make it easier to decrypt (Jackson's study, as cited in Castro & McQuinn, 2016). The final version of DES was considerably weaker than the earlier version of Lucifer that IBM had proposed. During the 1980s, personal computers became more popular. This gave rise to communities of researchers, users, and developers that created cryptography to secure information, which in turn led to market demand for embedding encryption into products; for instance, in 1983 encryption was added to the Lotus Notes electronic messaging service.

As computers got faster and less expensive, so too did strategies for breaking into encrypted information using brute-force attacks—a technique in which an adversary tries every possible key until the right one is found. For instance, in 1999, the Electronic Frontier Foundation and Distributed.Net broke DES in under twenty-four hours with a machine that cost $250,000 (CNET news article entitled “Record set in cracking 56-bit crypto,” as cited in Castro & McQuinn, 2016). The requirement for a stronger type of encryption led the National Institute of Standards and Technology (NIST), which changed its name from NBS in 1988, to begin a process in 1997 to replace DES with another encryption standard known as the Advanced Encryption Standard (AES) (Nechvatal et al.'s study, as cited in Castro & McQuinn, 2016). AES is another symmetric algorithm, with key sizes up to 256 bits, which the NSA considers sufficiently strong to protect secret communications. Security specialists additionally started to design security mechanisms to make brute-force attacks more difficult. For instance, a system can restrict the rate or number of login attempts to thwart automated brute-force attacks (Castro & McQuinn, 2016, p.5).
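The mechanics of a brute-force attack can be sketched in a few lines of Python. The XOR cipher and three-letter key space below are illustrative stand-ins of my own, not DES itself; they show why key length matters—each added key bit doubles the search space:

```python
import itertools
import string

# Toy cipher: XOR every byte of the message with a repeating 3-letter key.
# Real DES used 56-bit keys; this key space is tiny on purpose.
def xor_cipher(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

secret_key = b"key"
plaintext = b"attack at dawn"
ciphertext = xor_cipher(plaintext, secret_key)

# Brute force: try every lowercase 3-letter key until the plaintext reappears.
def brute_force(ciphertext, known_plaintext):
    for candidate in itertools.product(string.ascii_lowercase.encode(), repeat=3):
        key = bytes(candidate)
        if xor_cipher(ciphertext, key) == known_plaintext:
            return key
    return None

found = brute_force(ciphertext, plaintext)
print(found)  # b'key' -- only 26**3 = 17,576 candidates, found almost instantly
```

A 56-bit key space (roughly 7.2 × 10^16 candidates) is what the 1999 machine exhausted in under a day; AES's 128- to 256-bit keys put the same exhaustive search far beyond any foreseeable hardware.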

The limitation of symmetric key algorithms like DES and AES is that they are poorly suited for sharing data safely among multiple parties. This is because it can be hard to distribute keys securely, as the same key is used to both encrypt and decrypt the data. Previously, for instance, it was not an issue to share keys through a trusted courier or during a personal meeting, but vulnerability emerges when the key is sent electronically. If the parties could already share keys safely, they would not require encryption at all, for they could simply exchange the data directly over the same secure channel. While symmetric key encryption is still broadly used today, later advancements helped address some of its shortcomings (Castro & McQuinn, 2016, p.5).

1994 – present: Public Key Encryption

Castro explains that the disadvantage of symmetric encryption became more pronounced in the 1980s and 1990s, as academics and businesses needed to communicate safely with others on public networks but had no easy way to distribute keys. The rise of the commercial Internet worsened this problem and prompted the introduction of public key encryption. Public key encryption uses two keys, one public and one private. The public key can be shared and used to encrypt messages that only the private key can decrypt. While the mathematical foundation for this kind of encryption goes back to the late 1970s, it was not until the mid-1990s that computer and networking hardware were fast enough to make implementations cost-effective.

In 1994, as the Internet was commercialized, Netscape built the Secure Sockets Layer (SSL), a cryptographic protocol intended to let servers and clients communicate securely over the Internet. SSL, and its successor Transport Layer Security (TLS), used public key encryption. The TCP/IP protocol did not have encryption integrated into the standard from the beginning, as the NSA had restricted making strong encryption accessible on public or business networks. Encryption has become essential to security on networks like the Internet, and it is used in every industry to securely store and transmit private information. Without encryption, customers would not be able to use online services such as web mail and patient portals. As the cost of encrypting information has fallen, more and more sites are encrypting all traffic so that third parties cannot intercept any data (Castro & McQuinn, 2016, p.6).
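How deeply TLS is now baked into everyday software can be seen in Python's standard library, where a client-side TLS context ships with secure defaults. This is a small sketch, not part of the cited sources; it simply inspects the defaults rather than opening a network connection:

```python
import ssl

# A client context with library defaults: server certificates must verify
# against trusted roots, and the hostname must match the certificate.
ctx = ssl.create_default_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True
print(ctx.check_hostname)                    # True
```

These defaults reflect the public-key trust model described above: the client encrypts to the server's public key only after a certificate authority has vouched for it.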

1991 – 1999: Cryptography wars

Until the 1990s, intelligence agencies had opposed the general dissemination of encryption out of fear that its widespread use would make surveillance of electronic data harder. Castro states that in 1991, under the proposed Comprehensive Counter-Terrorism Act, the NSA pressed hardware producers to provide a mechanism by which law enforcement could access encrypted information when authorized. Congress declined to pass this legislation. In 1993, the NSA created and began promoting the Clipper Chip, a computer chip that implemented an NSA-backed algorithm to encrypt phone and data communications using a key escrow system (Levy's study, as cited in Castro & McQuinn, 2016). Under this plan, the U.S. Treasury Department's Automated Systems Division and NIST would hold the escrowed keys (Electronic Privacy Information Center article named “Questions and Answers About the Clinton Administration's Encryption Policy,” as cited in Castro & McQuinn, 2016). After security researchers uncovered vulnerabilities in the Clipper Chip, the U.S. government backed away from this particular implementation but kept pushing for the private sector to implement its key escrow systems.

In December 1996, President Bill Clinton relaxed export prohibitions on products using encryption, as long as they used key sizes of less than 56 bits, which meant that companies still could not export products with advanced encryption. This lasted until 1999, and accordingly, foreign companies gained market advantage throughout the world, including in the United States (Jay Stowsky's study, as cited in Castro & McQuinn, 2016). In the early 1990s, a group of hackers, hardware engineers, and libertarians known as the “CypherPunks” formed to push back on government interference with cryptography. In 1991, an engineer named Philip Zimmermann created Pretty Good Privacy (PGP), software for secure communication. The product used at least 128-bit keys and relied upon a web of trust—which does not require an authority to verify users, instead depending on trust between the users themselves. After Zimmermann distributed his product and the Cypherpunks disseminated it, the U.S. government investigated him for violating export controls, but dropped the case in 1996, before introducing new regulations, as mentioned in Dubois's work (as cited in Castro & McQuinn, 2016).

The U.S. government restricted encryption by constraining its spread and controlling its strength, which proved to be a bad strategy. It was not just the CypherPunks who fought for the advantages of encryption; computing, financial, and telecommunications institutions that depended on encryption did not want their computing infrastructure compromised by a government-mandated key escrow system. These groups saw the advantages of making encryption open and broadly accessible to everybody. Also, Jay Stowsky in his work stated that by the end of the century, basic infrastructure—for example, water plants, telecommunications facilities, and nuclear energy facilities—also depended on the strongest encryption techniques to guarantee its protection (Castro & McQuinn, 2016, p.8).

1999 – Present: Era of cloud computing

Mohamed's study (as cited in Castro & McQuinn, 2016) found that in 1999, Salesforce.com brought the idea of cloud computing—delivering scalable computing resources as a service—to the masses. Amazon and Google followed with cloud-based applications. However, a greater portion of these cloud computing applications introduced vulnerability, since users and companies no longer maintained control of their information—rather, the cloud provider in most cases has access to the user's data. Consumers have less control over their security, and their information in the cloud is at risk. Rossi's study (as cited in Castro & McQuinn, 2016) states that cloud providers are not immune to data breaches. For instance, Code Spaces—a software collaboration and code hosting platform—suffered a data breach due to password mismanagement through its third-party cloud supplier.

Blattberg's work (as cited in Castro & McQuinn, 2016) mentions that cloud computing providers have introduced end-to-end encryption—where the endpoint computers hold the keys—so that information is protected from its source to its destination and no third party has access to a private key. For instance, Apple has persistently upgraded the cloud encryption standards for its devices and services, for example by encrypting iCloud email. Similarly, Beer & Holland's study (as cited in Castro & McQuinn, 2016) found that Amazon Web Services (AWS) offers its users various options for securing information stored in the cloud, whether they want full or partial control of their encryption keys or prefer to surrender full control to AWS. Other security measures target human error rather than the technology itself. For instance, according to Wakabayashi's research (as cited in Castro & McQuinn, 2016), after hackers accessed celebrities' photographs through their iCloud accounts, Apple revealed plans to keep intruders out of its users' accounts through warnings and two-factor authentication.

2000 – present: Securing data on mobile devices

According to the article entitled “A Brief History of Malware” published by TrendMicro (as cited in Castro & McQuinn, 2016), companies have been working relentlessly to improve the security of cell phones through encryption. Throughout the 2000s, as more individuals purchased cell phones—devices that combine phones with personal digital assistants (PDAs)—criminals began targeting the new technology for fraud and identity theft. In 2004, criminals developed mobile malware that would send text messages at premium rates, placing false charges on the victim's bill. These problems have escalated in recent years. Boxall's study (as cited in Castro & McQuinn, 2016) says that in 2014, Apple upgraded the messaging application in its device software to permit clients to control their encryption keys and communicate privately without the company having default access. Google likewise offers encryption alternatives in its Android messaging software for smartphones.

Many manufacturers enable full disk encryption (FDE) by default on their mobile devices. FDE encrypts all information on a device so that only an authorized person can decrypt it, with the keys stored locally on the device. Some devices permit users to use biometrics, for example facial or fingerprint recognition. This means that if the device is lost or stolen, the information is still secured unless somebody has the password or key. Moreover, companies have added anti-theft features to secure user information, for example erasing the device's data after numerous failed login attempts, as mentioned in the article entitled “Use a passcode for iPhone, iPad or iPod Touch” by Apple support (cited in Castro & McQuinn, 2016).
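The wipe-after-failed-attempts policy mentioned above can be sketched in a few lines. The class, names, and attempt limit below are illustrative inventions of mine, not Apple's implementation:

```python
class LockableVault:
    """Toy sketch of an anti-theft policy: erase protected data after
    too many failed unlock attempts (names and limits are illustrative)."""

    MAX_ATTEMPTS = 5

    def __init__(self, passcode, data):
        self._passcode = passcode
        self._data = data
        self._failures = 0
        self.wiped = False

    def unlock(self, attempt):
        if self.wiped:
            return None
        if attempt == self._passcode:
            self._failures = 0
            return self._data
        self._failures += 1
        if self._failures >= self.MAX_ATTEMPTS:
            self._data = ""   # erase the protected data
            self.wiped = True
        return None

vault = LockableVault("1234", "contacts, photos, messages")
for guess in ["0000", "1111", "2222", "3333", "4444"]:
    vault.unlock(guess)
print(vault.wiped)           # True: five wrong guesses triggered the wipe
print(vault.unlock("1234"))  # None: even the right passcode finds nothing
```

On a real device the wipe is typically achieved more cheaply by destroying the locally stored encryption key rather than overwriting the data itself.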

Present – future: Internet of Things

Castro states in his report that many devices are progressively being connected to the Internet. The Internet of Things refers to the idea that the Internet is no longer just a global network for individuals to speak with each other; devices are also embedded with Internet-connected sensors that enable them to communicate globally. Nowadays connected devices are incorporated into homes with products such as smart lighting and thermostats; into cars through Global Positioning System and driving-safety technology; and even onto individuals, for example through gadgets that help people manage their health and fitness. Gartner's research (as cited in Castro & McQuinn, 2016) estimated that 6.4 billion objects will be connected to the Internet worldwide this year. Furthermore, Manyika's work (as cited in Castro & McQuinn, 2016) estimated that the Internet of Things will produce up to $11 trillion every year in economic value by 2025, affecting every part of the economy and society.

The prospect of billions of connected devices will lead to a complex interconnected environment. Analysts are researching solutions to the security issues of this environment, and those solutions will have to change to meet the unique needs of connected devices. Encryption will be important to secure all of the data these devices store and transmit. Like the devices themselves, encryption—if left unrestricted—will continue to improve to address the needs of the present and future (Castro & McQuinn, 2016, p.10).

Use of encryption and its approaches


Encryption can work effectively at the application level, the host level, and the network level of data storage and transmission. The goal of encryption is to make information unintelligible to unauthorized users and extremely hard for an attacker to decipher. Thus, encryption depends on the use of random encryption keys (this randomness makes the encrypted information difficult for attackers to decrypt). These keys are used to encrypt information as well as to decrypt it. Not all encryption is alike: the security of encrypted information depends not just on the key size but also on the nature of the encryption technology and on the algorithm implemented in the encryption procedure (Singleton et al., n.d.). There are three approaches to the use of encryption:

Symmetric Key Encryption

Symmetric encryption takes meaningful information, encrypts it to make it illegible, and decrypts it again when it is required. The most important thing to remember about symmetric encryption is that both sides—the encrypter and the decrypter—require access to the same key. A key, for symmetric encryption purposes, is a string of data that is fed to the encryption algorithm in order to encrypt the information. Ideally this key is completely random, but there are also ways to derive keys from passwords. The trickiest part of using symmetric encryption is storing the key and making it accessible only to the software that requires it. Symmetric encryption is used to store encrypted information on behalf of the user, to encrypt device or computer storage, and to create a secure channel between two endpoints of a network. This encryption becomes useless if the key is transferred over an unprotected network or if the software that accesses the unencrypted data is compromised (Behrens, 2014).
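The defining property—one shared key that both encrypts and decrypts—can be shown with a one-time pad, the simplest symmetric scheme. This is a sketch of my own for illustration, not AES or DES; note that the key must be as long as the message and shared securely, which is exactly the key-distribution problem discussed later:

```python
import secrets

# Toy symmetric scheme: a one-time pad. Both sides must share `key`,
# and the same operation both encrypts and decrypts (XOR is its own inverse).
def xor_bytes(data: bytes, key: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet me at noon"
key = secrets.token_bytes(len(message))  # random key, same length as message

ciphertext = xor_bytes(message, key)     # sender encrypts with the key
recovered = xor_bytes(ciphertext, key)   # receiver decrypts with the SAME key
print(recovered == message)  # True
```

Practical ciphers like AES achieve the same shared-key property with a short, fixed-size key that can be reused safely across many messages.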

Asymmetric Key Encryption


Behrens in his post mentions that asymmetric encryption likewise takes the information, encrypts it, and decrypts it again, but a different key is used at each end. At one end, a public key is used to scramble the information, and at the other end, the matching private key is used to unscramble it. It is also known as public key cryptography because the public key is meant to be published, while the private key is kept private and secured much like the key for symmetric encryption. The private key is also used to create digital signatures, which the public key verifies. Asymmetric encryption is used to secure the connections between browser and website and to protect login sessions, and it is also used to sign software updates so that the user can know that the code being installed is genuine. This encryption is compromised if we cannot trust the public key we possess: an attacker can interpose their own key, intercepting the information in between and mounting a man-in-the-middle attack.
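The public/private split can be demonstrated with textbook RSA. The tiny primes below are deliberately insecure and purely illustrative (real keys use moduli of 2048 bits or more, with padding); the point is that anyone can encrypt with the public pair (e, n), but only the private exponent d decrypts:

```python
# Textbook RSA with tiny primes -- insecure, for illustration only.
p, q = 61, 53
n = p * q                   # public modulus (3233)
phi = (p - 1) * (q - 1)     # 3120
e = 17                      # public exponent, coprime with phi
d = pow(e, -1, phi)         # private exponent: modular inverse of e mod phi

message = 42                        # message encoded as an integer < n
ciphertext = pow(message, e, n)     # anyone can encrypt with the PUBLIC key
plaintext = pow(ciphertext, d, n)   # only the PRIVATE key holder can decrypt
print(plaintext)  # 42
```

Signing is the same arithmetic with the keys' roles reversed: the holder exponentiates with d, and anyone can verify with e.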

One-Way (Hash) Encryption

Hashing is not a type of encryption; however, it uses cryptography. Hashing takes information and makes a hash out of it: a string of data with three important properties. First, the same information will always produce the same hash; second, given only the hash, it is infeasible to turn it back into the original information; and third, it is infeasible to create another string of information that will produce the same hash. Hashing is used for protecting passwords and for making it infeasible to modify data without detection. Because a hash cannot be reversed, attackers must instead guess inputs; this is why it is critically important to choose a good password-hashing algorithm with a tunable cost—increasing the cost makes your hashing safer and deters the attacker. It is always good to use a well-audited, proven hashing algorithm (Behrens, 2014).

Emergence of a crypto war in the future


In the decades since the Crypto Wars were resolved, there were expectations that stronger encryption would benefit the economy, reinforce Internet security, and protect civil rights. These expectations have been borne out. The disclosures about the U.S. National Security Agency's pervasive surveillance programs have drastically shifted the national conversation, highlighting the vulnerabilities in a significant number of the tools and networks on which we now depend for both ordinary and sensitive communications. While ordinary people, civil rights advocates, and technology organizations have embraced wider use of encryption as an important step to address a broad range of threats from both government and non-government actors, intelligence officials and law enforcement authorities have become increasingly outspoken against measures to strengthen these systems through encryption. To make their case, they have revived large portions of the arguments they made about encryption in the 1990s, seeming to have forgotten the lessons of the past. Therefore, it appears we may be on the verge of a new war: a Crypto War 2.0 (Kehl et al., 2015).


The economic transformation of surveillance

A report on intelligence activities published in 1976 illustrates how the adoption of digital innovations over the past decades has shifted the government's ability to engage in large-scale surveillance. Fifty years ago, if the government wished to keep an eye on a suspect, it needed to devote a number of agents to day-and-night physical observation and ask the post office to intercept and redirect her mail, which would be steamed open—itself a labor-intensive task. If phone surveillance was required, somebody needed to climb a telephone pole or open an access panel attached to an apartment building in order to physically attach wires to the suspect's line. With the tap in place, agents would need to monitor the calls day and night. Finally, the report also mentions, if investigating officers wished to listen to conversations spoken inside the home, an immensely difficult and hazardous “black bag job” would be necessary, in which highly skilled specialists would break into the suspect's home or workplace to clandestinely install remote transmitters and microphones.

Now wiretapping techniques have changed. As described in an article named “High-Tech Crime: A New Frontier for Transnational Criminal Organizations” by the Electronic Frontier Foundation in 2009, companies and Internet Service Providers have legal compliance departments, some open 24 hours a day, through which law enforcement agents can obtain wiretaps, messages, or real-time telephone data. The service providers can then start a wiretap with a couple of keystrokes—all without the need to enter the suspect's home or physically connect wires in a switching center. Once the wiretap has started, the user's information is transmitted to government servers. While this transmission of a suspect's communication is ordinarily performed on a case-by-case basis in response to a particular request, it appears that at least one telecommunications organization has given the FBI standing access to its network, enabling the tapping of customers without the company's case-by-case involvement. Also, numerous Internet providers have been accused of giving the National Security Agency raw access to their “backbone” networks, allowing it to target individuals' communications for surveillance without the companies' consent.

Five years ago, if the government wanted to access incriminating evidence from the home computers of ten different suspects, agents needed to persuade a judge that they had probable cause in order to get a court order for each individual. The investigating agencies would then send agents to raid the houses of the suspects, remove the computers, and later perform forensic analysis to recover the records. If the suspects knew each other, the government might carry out simultaneous raids (requiring more manpower) so that the suspects could not contact each other and delete their files (“High Tech Crime,” 2009).

The article also states that many users have now moved to cloud-based services. Law enforcement agencies have effectively deputized the technology companies, making them a key part of the surveillance system. Thus, the private archives of ten people can now be obtained through a subpoena to Google or Microsoft—whose engineers will then find the records and give them to the government. This shift to cloud computing has many advantages for investigators: less manpower; no need to go to a judge or establish probable cause to acquire a warrant; and the elimination of the physical risk to agents who might be shot or assaulted during a raid.

Does free surveillance work?

In a few past instances, companies have declined to comply with surveillance orders that they believed were illegal. Federal wiretapping laws set out specific civil liabilities for companies that hand over customer data without meeting the proper legal requirements. This liability gives telecommunications organizations a strong incentive to ensure the law is being followed. Consequently, when wiretaps can be performed without any contribution from the telecommunications providers, customers are robbed of this oversight and must depend upon intelligence agencies not to abuse their access (Nakashima & Eggen, 2007).

Another advantage of the pay-for-surveillance model is that it creates a paper trail. That is, if the government is charged for every wiretap it requests, a billing record will be created describing the dates the tap started and ended, the number of users tapped, and the cost of the service. At least two copies of this record will be generated: one for the ISP and another sent to the investigating office. This paper trail gives a wealth of information to oversight bodies, and fear of creating such a paper trail may prevent investigators from starting surveillance without the appropriate evidence. Per-transaction billing for surveillance also brings the advantage of scarcity. That is, given a fixed budget and an endless number of suspects, government agents are compelled to prioritize their surveillance efforts. This gives them a strong incentive to focus their investigations and to refrain from “fishing expeditions” (Soghoian, 2012).

Indeed, even if a provider charges for surveillance assistance, this situation is still preferable for government agents to the pre-digital days. Sending agents out to trail a suspect consumes considerably more resources than paying an ISP $1,000 to turn on a wiretap or locate a cell phone. It is also safer. However, now that cloud computing companies are able to provide law enforcement with the documents that would once have required an armed raid, this risk of physical harm is gone, and with it, whatever disincentive against over-reaching that risk provided.

Acquiring a warrant for a suspect, raiding his or her home, and seizing his or her computers consumes considerable agent hours (Stuntz, 1999), and it places agents in a dangerous situation: a suspect could be armed, or could have rigged his home with traps. This danger of physical harm gives officers a personal incentive to limit such searches. Nowadays, since many companies share their documents with law enforcement, the danger of physical harm is gone.

Consumer demand for encryption

There is a lack of consumer demand for encryption. Data encryption with a key that is private to the user protects against certain threats—such as insider attacks, where an employee accesses customer data. But this is a risk that companies cannot publicize; service providers prefer that their clients not know these risks exist.

While Google widely publicized its initial refusal to turn over search records in response to a demand by the U.S. Department of Justice in 2006, it has been less open about the number of subpoenas it receives every year in response to which it did deliver its users' information to law enforcement (McCullagh, 2008) (see Appendix B). Furthermore, the company's CEO has said that one of the reasons it retains user information is to help the government with legal investigations. However, Google is not alone in not wishing to discuss government requests—there is an industry-wide policy of silence (“CEO: Google knows a lot about you,” 2009). Only Facebook and AOL have been open about numbers—10 to 20 requests every day, and 1,000 requests for customers' data from officers investigating piracy. These numbers reveal only part of the government's secret collection of private data, as responses to FBI National Security Letters and FISA court orders are gagged and never disclosed, even in aggregate form. It would be unfair to assume that consumers do not care about how their private information is handled. For example, in early 2009 a Swedish ISP set new policies under which it would not retain any information linking customers to IP addresses, a change driven by strong demand from its customers (Lewan, 2009). This shows that profit margins rest on a company's ability to convince customers to trust it with more private data.

Effect of encryption on the current status quo

Encryption is an integral part of cybersecurity today, and there is little prospect of a perfect backdoor that only the FBI can open with proper legal process. On technological grounds, perfect backdoors are impossible, and the flip side is equally stark: properly implemented encryption is practically impossible to crack. The introduction of public key encryption changed the dynamics of public policy because it was impenetrable. The issue is not just technology but human behavior: what matters is the default rule, the difference between encryption that is fully "off" by default and encryption that is always "on." In my view, with a default-on rule implemented, 80 to 90 percent of data could be stored at rest in encrypted form. To this should be added the growing body of law that protects encryption passwords against compelled disclosure under the Fifth Amendment; that is, the courts say a person cannot be forced to tell the government his or her password, so a device can be seized but not decrypted. Lastly, with data stored in the cloud, privacy is maintained by third-party providers, and the government can contact the cloud provider to gain evidence. Thus, even with encryption technology deployed on devices, the government still has a powerful trump card: the ability to compel service providers to insert backdoors into their products (Rosenzweig, 2015).

Government’s backdoor to encryption for surveillance

Previous government battles for encryption backdoors

The recent case of the FBI trying to get into a shooter's iPhone, and Apple declining to help, is not the FBI's first such attempt; in fact, the roots go back to the 1990s, when the FBI and other law enforcement agencies were attempting to control the then-new encryption technologies and create access for themselves. One major difference between the two decades is that the torchbearers in the fight against surveillance and encryption backdoors are no longer individual academics but Apple, one of the world's most influential tech companies, says Electronic Frontier Foundation staff attorney Andrew Crocker (Lee, 2016).

Lee also mentions that the 1990s were brimming with debate and lawsuits about encryption, online security, and government surveillance that closely resemble today's, so much so that one tech official called it a "digital déjà vu." Encryption has grown stronger since the 1990s, when the issue was more of an academic debate; those debates turned into dotcom-era lawsuits with huge legal consequences for Silicon Valley. Apple might have relied heavily on the 1999 case Bernstein v. U.S. Department of Justice, which established the idea that computer code is a form of First Amendment-protected speech. In 1991, Daniel Bernstein, a graduate student at the University of California, Berkeley, was stopped by the government from publishing his encryption algorithm, which was treated as an illegal arms export. Four years after the initial filing, in 1995, a court ruled that the government's interference violated the First Amendment. Other cases that put computer code under First Amendment protection might also be invoked in the Apple case: the 2000 case Junger v. Daley tested the federal prohibition on exporting encryption source code, and the 6th U.S. Circuit Court of Appeals held that computer code is a form of speech. The government's recent strategies and demands that Silicon Valley create a backdoor to work around encryption are not new. In the 1990s, the National Security Agency built a chipset called the Clipper that, when attached to a PC, gave the government an extra key to unlock encrypted communications. The NSA's objective was to attach the chip to every phone, fax machine, and PC in the United States. The Clipper Chip was introduced as a way to provide better encryption to businesses and individuals while keeping in mind the needs of law enforcement and national security.

The Clipper chip pitted, for the first time, encryption advocates against the prospect of government backdoor access, or, as The New York Times called it, "the first holy war of the information highway." The Clipper chip failed after Vice President Al Gore backed away from his pro-Clipper stance. In February 2015, after two decades of dormancy, the Clipper chip resurfaced in the debate just as "Crypto War 2.0" brewed between Silicon Valley and Washington, D.C. "The FBI paints this as a trade-off between security and privacy. It's most certainly not. It's actually a trade-off between more security and less security," says security technologist Bruce Schneier.

Backdoor encryption: a security threat and risk to consumers

Backdoors severely weaken cybersecurity, leaving users exposed to hacking and criminal activity. A government mandate for security vulnerabilities in tech products would be an enormous burden on organizations and an obstacle to innovation.

A vital issue with backdoors is that there is no reliable way to control who passes through them. If the U.S. government can exploit a backdoor security vulnerability to access a user's device, so can hackers, foreign governments, and identity thieves. This would devastate the security of individual customers around the globe, as well as the many organizations that use American commercial tech products. Ultimately, such a government mandate would have the effect of enabling cybercrime and harming national security.

Users outside the U.S. may be unwilling to buy American tech products that facilitate government surveillance. It is difficult and costly both to develop a backdoor security vulnerability and then to defend it against unauthorized users. This burden would fall heaviest on small businesses and pioneers of new communication services, creating a disincentive to encrypt their products and thereby diminishing the security of users.

Interference of encryption in investigations

Castro states in his report that U.S. law enforcement strives to secure the safety of U.S. citizens and residents. Recent advances in encryption, along with increased adoption, have affected how government organizations ensure national security by limiting the amount of data they can access. However, law enforcement and security agencies conduct different kinds of investigations: while national security agencies surveil large volumes of data, law enforcement investigations typically target data relevant to a particular case. Indeed, law enforcement officials do not have the authority to serve mass warrants.

For investigative purposes, data is categorized as "data at rest" or "data in motion." To access data at rest, law enforcement must obtain a court order and show why access is required. Government officials cannot read the data if they do not have the key; similarly, if Internet companies do not hold the keys to their users' encrypted information, they cannot comply with requests by intelligence agencies to search that data (Castro & McQuinn, 2016).

Olsen's study (as cited in Castro & McQuinn, 2016) explains that data in motion refers to data moving between two or more endpoints. If communications are encrypted end-to-end, so that only the endpoints hold the keys, law enforcement cannot decipher the data. Likewise, when security agencies capture large volumes of online data in transit to scan for trigger terms, they cannot read the content if it is encrypted; intelligence agencies will see only metadata, that is, information that describes a communication, such as the source and destination of packets and how much data is transferred. The overall effect of the adoption of encryption on intelligence agencies is unknown; future devices and methods may affect their ability to conduct investigations, and that effect will change over time.
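To make the distinction concrete, the toy sketch below shows what an interceptor sees on the wire: the metadata fields of a captured packet remain readable, while the encrypted payload does not. The keystream cipher (built from SHA-256) and the addresses are purely illustrative assumptions, not a real protocol.

```python
import hashlib
import secrets

def keystream_encrypt(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR with a SHA-256-derived keystream.
    Illustrative only; not a real cipher."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(d ^ s for d, s in zip(data, stream))

key = secrets.token_bytes(32)          # known only to the two endpoints
message = b"meet at the usual place at noon"

# What an interceptor on the wire can record:
captured_packet = {
    "src": "203.0.113.7",              # metadata: still visible
    "dst": "198.51.100.42",            # metadata: still visible
    "length": len(message),            # metadata: still visible
    "payload": keystream_encrypt(key, message),  # content: opaque bytes
}

assert captured_packet["payload"] != message                          # content is hidden
assert keystream_encrypt(key, captured_packet["payload"]) == message  # endpoints can decrypt
```

Running the sketch shows why agencies scanning traffic in transit fall back on metadata: the addresses and packet length survive encryption, but the content does not.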

Points of view: the FBI's request that Apple grant backdoor access

The Wall Street Journal article "The FBI vs. Apple," published in 2016, states that the encryption debate began in 2014, after Apple released a security feature that creates random security keys unknown even to Apple itself. These keys are combined with the user's passcode to decrypt the mobile device's information; without that mathematical combination, the information is unreadable. This encryption makes iPhones more secure, but it also means that Apple cannot unlock its own devices. Neither can Google, which has since followed the same approach.
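A minimal sketch of this kind of design, using Python's standard-library PBKDF2: the variable names, iteration count, and four-digit passcode are illustrative assumptions, not Apple's actual implementation.

```python
import hashlib
import secrets

# Hypothetical sketch: a per-device random value ("device_uid") is generated
# at manufacture and never leaves the device; the decryption key is derived
# from it together with the user's passcode, so neither alone suffices.
device_uid = secrets.token_bytes(32)   # random, unknown even to the vendor
passcode = b"4921"                     # illustrative user passcode

def derive_device_key(passcode: bytes, device_uid: bytes) -> bytes:
    # PBKDF2 makes each passcode guess deliberately slow
    return hashlib.pbkdf2_hmac("sha256", passcode, device_uid, 200_000)

key = derive_device_key(passcode, device_uid)
assert key == derive_device_key(b"4921", device_uid)   # right passcode recovers the key
assert key != derive_device_key(b"0000", device_uid)   # wrong passcode does not
```

Because the random device value never leaves the hardware, passcode guessing must happen on the device itself, and the deliberately slow derivation makes each guess costly.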


FBI director James Comey and Manhattan District Attorney Cyrus Vance have stated that they are losing the ability to execute search warrants under the Fourth Amendment, so they support a mandate that the U.S. tech industry introduce a master security key (the "backdoor" Mr. Cook warns against) to unlock any device. The CEO has a solid case when he says that backdoors create more problems than they solve: security vulnerabilities introduced so that third parties like police and intelligence services can use them as needed can likewise be exploited by hackers and spies. Countries can mandate backdoors, but there will always be encrypted channels outside their jurisdiction where ISIS can plot. The outcome would be weaker products for law-abiding consumers that leave U.S. companies less competitive, with little security benefit ("The FBI vs. Apple," 2016).

According to a survey conducted by the Pew Research Center (see Appendix A), Americans sided with the FBI over Apple: 51 percent backed the FBI versus 38 percent for Apple, while another 11 percent said they did not know. By contrast, in another Pew study conducted several years earlier, when respondents were asked whether it was right for the government to violate their privacy in the interest of national security, the results were reversed. Finally, one question arises: will the government move forward with legislation requiring technology companies to maintain backdoors? Access to data is a basic component of providing security to citizens. If technology companies and privacy advocates do not agree with creating backdoors, they should propose an alternative acceptable to law enforcement that satisfies its mandate of providing security services to residents. The case remains open.


Proposed methods to break encryption

Prohibiting encryption above a certain strength – its impact and benefits

Governments can gain access to encrypted data simply by banning encryption above a certain strength, permitting only the weaker forms they have the means to break. For instance, the United Kingdom could pass a law banning encryption stronger than 64-bit keys, knowing its intelligence community has the resources to crack any type of legal encryption in the country.
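The arithmetic behind such a threshold can be sketched in a few lines; the guessing rate below is an assumed figure for a well-resourced agency, purely for illustration.

```python
# Rough brute-force arithmetic for a 64-bit keyspace. The rate is an
# assumed figure, not a measurement of any real agency's capability.
keyspace = 2 ** 64                      # about 1.8e19 candidate keys
guesses_per_second = 10 ** 12           # assumed: one trillion guesses/sec

seconds = keyspace / guesses_per_second
days = seconds / 86_400
print(f"{days:.0f} days to exhaust a 64-bit keyspace")   # roughly seven months

# Each added bit doubles the work: 80-bit keys at the same rate take
# tens of thousands of years, which is why the cap matters.
years_80bit = (2 ** 80 / guesses_per_second) / (86_400 * 365.25)
print(f"{years_80bit:.0f} years to exhaust an 80-bit keyspace")
```

The point of a strength cap is exactly this gap: a 64-bit keyspace is within reach of a determined state actor, while each additional key bit doubles the cost of exhaustive search.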

The security consequences of such a move would be severe, weakening the security of all products and services made in or imported into the country, particularly against attacks from hackers or nation-states. To date, no nation has moved to ban encryption above a certain strength outright. Rather, nations have applied this kind of ban narrowly, directing it at products or services whose encryption their intelligence communities cannot bypass. For instance, Saudi Arabia and the United Arab Emirates banned the secure messaging service on BlackBerry phones in 2010, as mentioned in the article "Two Gulf states to ban some Blackberry functions" (as cited in Castro & McQuinn, 2016). Similarly, in 2011, Pakistan banned all encryption software and required Pakistani Internet service providers to notify the government if they discovered customers using virtual private networks (VPNs), which use encryption to let Internet users browse the web privately, as reported in the article "Pakistan to ban encryption software" (as cited in Castro & McQuinn, 2016).

Weakening standards of encryption - its impact and benefits

Jay Stowsky, in his work "Secrets or Shields to Share," notes that intelligence agencies can privately influence and weaken national encryption standards, with the goal of constraining the encryption products and services that use those standards. While standards processes are open, intelligence agencies can manipulate them surreptitiously; doing so requires cross-government support. In some cases, intelligence agencies weaken the strength of a national standard's encryption outright, as with the NSA's alterations to the U.S. National Bureau of Standards' final DES algorithm. In other instances, they manipulate national standards to leave security flaws in the final algorithm. For instance, Greenemeier's work (as cited in Castro & McQuinn, 2016) mentions that the Snowden disclosures revealed the NSA had surreptitiously bypassed encryption used to secure digital communication by manipulating a cryptography standard that the National Institute of Standards and Technology (NIST) had issued in 2006. This allowed the NSA to access data on any product or service using the standard, but it also exposed those services to abuse by criminals; as shown in Shumow and Ferguson's work (as cited in Castro & McQuinn, 2016), NIST subsequently published guidance discouraging organizations from using the standard and promised to give the public a chance to comment on revised standards. By compromising an encryption standard that the government uses, and by extension that much of the private sector will probably use, intelligence agencies introduce flaws that malicious actors could find and exploit. Schneier's research (as cited in Castro & McQuinn, 2016) concludes that such security flaws can persist for years before the government fixes the issue in a revised standard: security analysts found the flaw in the NIST cryptography standard in 2007, but NIST did not revise it until 2014.

Creating backdoors by intelligence agencies - its impact and benefits

The government can conduct attacks to break encryption, with or without assistance from companies. While this is typically done by intelligence agencies to introduce backdoors into devices, it can also be used for law enforcement purposes. When law enforcement or intelligence agencies try to break into a product, they have no guarantee of success; accordingly, different types of attacks vary in effectiveness depending on what kind of encryption a target is using and whether a company is cooperating with the authorities.

Law enforcement has used many forms of hacking to pursue criminals, particularly where offenders use software to encrypt their online activities. In fact, Knibbs and Poulsen's research (as cited in Castro & McQuinn, 2016) found that the FBI not only has its own surveillance software; it also uses hacking tools. One example is phishing, where law enforcement poses as a trustworthy source in an electronic communication to get someone to download surveillance software. Poulsen (as cited in Castro & McQuinn, 2016) states that this happened in 2007, when the FBI sent a fake news article to an anonymous email address connected with repeated bomb threats; clicking the article installed software that revealed the suspect's identity.

Another common strategy is installing a keylogger, software designed to secretly record keystrokes. Keyloggers can be used to capture passwords or other sensitive data typed into a PC. For instance, McCullagh's work (as cited in Castro & McQuinn, 2016) found that in 2007 the U.S. Drug Enforcement Administration used a keylogger to bypass PGP and an encrypted email service by observing a suspect's keystrokes for passwords. To use a keylogger, however, law enforcement must first gain covert access to the device, something that can be difficult during an ongoing investigation.

The government can also ask or compel a company to conduct attacks on its behalf. For instance, for end-to-end encrypted communications that rely on a centralized provider's key servers (e.g., WhatsApp), the government can attack key distribution, compelling companies to distribute illegitimate keys that store and retrieve data associated with a user's account, as mentioned in Green's work (as cited in Castro & McQuinn, 2016). In this model, when a user tries to send a message over an encrypted network, the user must contact a key server that supplies the recipient's public key. In addition, the government can work with a company to remove security features from the services it provides to specific users.

As with introducing secret backdoors into products, there are tradeoffs with different types of government hacking. Hacking a device that is still in use, for example by installing malware, can create new vulnerabilities that other attackers could exploit. Furthermore, some strategies, such as introducing vulnerabilities through software updates, could discourage users from fixing security issues on their devices. And when the government does not disclose the vulnerabilities it finds in private-sector products, it permits those weaknesses to persist, so that they may later be exploited by criminals (Castro & McQuinn, 2016, p.17).

Giving access to third parties through a key recovery process – its impact and benefits

The government can require key-recovery mechanisms, commonly referred to as "key escrow." In such mechanisms, alongside the primary encryption key there is a second key held by a third party. There is no secret backdoor entry point; rather, the software is designed to create an extra key for the third party. When law enforcement needs to intercept encrypted information or access a device, the escrow key can decrypt it.
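A hypothetical escrow arrangement can be sketched as follows. The XOR-based key wrapping is a toy stand-in for a real wrapping scheme, used only to show how a single data key can be made recoverable by two different key holders.

```python
import hashlib
import secrets

def wrap(kek: bytes, key: bytes) -> bytes:
    """Toy key-wrapping: XOR a 32-byte key with a pad derived from the
    key-encryption key (kek). Illustrative only; not a real wrapping scheme."""
    pad = hashlib.sha256(kek).digest()
    return bytes(a ^ b for a, b in zip(key, pad))

unwrap = wrap  # XOR wrapping is its own inverse

# The key that actually encrypts the user's data, plus two key-encryption
# keys: one held by the user, one held by an escrow agent.
data_key = secrets.token_bytes(32)
user_key = secrets.token_bytes(32)
escrow_key = secrets.token_bytes(32)

# The provider stores the data key twice: once for the user, once in escrow.
stored = {
    "for_user": wrap(user_key, data_key),
    "in_escrow": wrap(escrow_key, data_key),
}

# With a court order, the escrow agent's key recovers the same data key
# without the user's involvement; the user's own access is unchanged.
assert unwrap(escrow_key, stored["in_escrow"]) == data_key
assert unwrap(user_key, stored["for_user"]) == data_key
```

The design choice this illustrates is the core of the policy debate: the escrowed copy is a second path to the plaintext, and its security now depends on whoever holds the escrow key.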

Key escrow mechanisms, if adopted, would be an effective means of accessing encrypted data. By obtaining a warrant or court order, law enforcement could compel a company to decrypt information without the knowledge of the person using it. The Obama administration discussed various approaches to implementing such a key escrow system. For instance, device manufacturers could be required to maintain a separate set of keys that would permit access to encrypted information over a physical port on the device; with physical access to the device, law enforcement could then extract the encrypted information after obtaining a court order for the key. The administration also proposed a "forced backup plan," whereby capable companies would be required to automatically upload user information, before it is encrypted, to a location the government could later access. Finally, NSA chief Mike Rogers proposed split-key encryption, in which multiple parties each control a partial key and all parties must come together to decrypt information on a court's order, as mentioned in Nakashima and Gellman's work (as cited in Castro & McQuinn, 2016). While this approach would limit the danger of exploitation by requiring an attacker to acquire multiple partial keys to recover a full one, it still introduces vulnerability: all the keys must be brought together at once to decrypt the data, and the keys must remain the same across transactions. In none of these scenarios would the government be able to access all encrypted information, since users could use third-party software to encrypt their data before storing it on a device.
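Split-key escrow of this kind can be illustrated with simple XOR secret sharing, in which a key is split into random shares and every share is required to rebuild it. This is a generic sketch of the technique, not the NSA's actual proposal.

```python
import secrets

def split_key(key: bytes, parties: int) -> list[bytes]:
    """Split a key into XOR shares; every share is needed to rebuild it."""
    shares = [secrets.token_bytes(len(key)) for _ in range(parties - 1)]
    last = key
    for s in shares:                       # last share = key XOR all others
        last = bytes(a ^ b for a, b in zip(last, s))
    return shares + [last]

def combine(shares: list[bytes]) -> bytes:
    """XOR all shares together to recover the original key."""
    out = bytes(len(shares[0]))
    for s in shares:
        out = bytes(a ^ b for a, b in zip(out, s))
    return out

master = secrets.token_bytes(32)
shares = split_key(master, 3)              # e.g., vendor, court, agency

assert combine(shares) == master           # all three together recover the key
assert combine(shares[:2]) != master       # a partial coalition learns nothing
```

This also makes the tradeoff in the text visible: any individual share is statistically random, so no single party can decrypt alone, but the full key still has to be reassembled in one place, and the shares must stay fixed across transactions.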

But these systems can have a negative result: key escrow takes away from people the ability to secure their own information in favor of a much less secure encryption structure. There are several problems with key escrow systems. Abelson and his co-authors (as cited in Castro & McQuinn, 2016) note that these systems add complexity. In a hypothetical key escrow system, law enforcement could require a company to go through various stages, including verifying the court's warrant, finding the right information, locating the right key in the company's escrow, and recovering only the information from the time period specified in the warrant. The company would then need to convert the information into the correct format, transfer it only to validated, authorized parties, and maintain the capacity to audit these requests. Each of these steps adds another layer of complexity, and each is open to attack from adversaries, whether forged court documents, insider threats from employees, or hackers targeting the system that maintains the keys in escrow.

Banning client side encryption - its impact and benefits

The government could prohibit companies from implementing client-side encryption, whereby customers control their own encryption keys. If users cannot encrypt information with their own keys, third-party providers will be able to access customer information to satisfy court orders and warrants. This kind of restriction on encryption effectively forces companies to implement their own key escrow systems; therefore, restricting client-side encryption brings the same benefits to law enforcement and the same risks to consumers as mandating key escrow. This strategy would not keep users from using non-U.S. third-party software, or writing their own, which lets them encrypt information and retain sole custody of the keys (Castro & McQuinn, 2016, p.20).

Ultimately, the security of users' data depends on the actions of the service provider, which can create a problem: if a user wants more security, which may increase costs, while the service provider wants to generate profit by decreasing costs, the user may ultimately lose out. However, if users control their own keys, they can secure their data according to the level of risk tolerance appropriate to that data, such as by using stronger keys, changing keys more frequently, and implementing more security measures to limit who can access the keys (Castro & McQuinn, 2016, p.20).


Traditional forms of investigation - its impact and benefits

There are traditional forms of investigation for obtaining the keys used to encrypt information, including surveillance, witnesses, physical searches, and confessions. For instance, law enforcement may secretly watch the password a user enters on a PC, or search an office to find paper records containing passwords. These strategies for acquiring keys are less useful for intercepting encrypted communications. Many of them have been available to law enforcement for a long time and have adapted to evolving technologies: before the telephone there were no wiretaps, and before the transistor law enforcement could not plant listening devices to monitor conversations remotely (Castro & McQuinn, 2016, p.20).

Each of these techniques has varying levels of effectiveness. For instance, physical searches can only be conducted where law enforcement has jurisdiction. Likewise, the law can compel someone with the encryption key (the sender or recipient of the communication, or the service provider) to provide access to decrypted information in certain situations, for example in the case of encrypted phones with biometric locks, such as new iPhones. For this to be possible, however, both the person and the person's device must be accessible, so authorities may not use this technique if they do not wish to be recognized. Also, some of these strategies will become more difficult to apply as encryption is built into more technologies (Castro & McQuinn, 2016, p.21).


The capacity of consumers and organizations to protect sensitive data has allowed the digital economy to thrive. Rather than placing restrictions on encryption, the U.S. government should advocate for better cybersecurity practices at home and abroad, in part by encouraging advances in encryption. Congress and the administration can do so by re-establishing trust and strengthening data security at home, providing law enforcement with new tools to uphold the law, and demonstrating to the world the United States' firm commitment to information security (Castro & McQuinn, 2016, p.33).

Building trust in government organizations and private sectors

The U.S. should re-establish trust in government and the private sector by revoking the statutory mandate for NIST to work with other agencies in developing encryption standards. The government should build trust in NIST-backed encryption standards by being transparent about the intelligence community's involvement in their development. Technical standards are critically important in a world dependent on trade and electronic communication, as well as individual privacy. The NSA has been involved in efforts that disrupted encryption standards, and it should be open about its work to disrupt security products. The NSA operates the "Commercial Solutions Center," which invites makers of commercial hardware and software encryption products to submit their products to the agency with the goal of enhancing overall security through public-private partnerships, as stated by the "National Security Agency" (as cited in Castro & McQuinn, 2016).

The article entitled "Discovering IT problems" (as cited in Castro & McQuinn, 2016) argues that the U.S. government should disclose the security flaws it discovers and notify companies immediately so the flaws can be fixed. The NSA's website states that the agency discloses around 91 percent of the flaws it detects; according to the agency, the other 9 percent are not revealed either because software developers patch them before disclosure or for national security reasons. The better way to protect citizens and organizations is to promote strong cybersecurity practices across the private sector.

Providing gateways for law enforcement to solve challenging cases

It is vital to understand that law enforcement's capacity to solve cases is diminished by the frequent use of encryption; therefore, legal systems should adopt new legal tools to meet this challenge. First, the U.S. government should examine whether legislation can help U.S. courts balance individual and state interests by permitting law enforcement to compel the production of decrypted data. The Fifth Amendment attempts to strike a fair government-individual balance by weighing the rights of the individual against the needs of society. Encryption, however, creates an extraordinary and substantial state interest in compelling the production of decrypted data, since encrypted data is often impenetrable without the key. Achieving a fair balance of interests between citizens and the state requires allowing law enforcement, under a lawful court order, to compel someone to turn over a password or encryption key when law enforcement can demonstrate a compelling interest in obtaining that data. To accomplish this, the U.S. Congress could pass a law codifying these powers, and U.S. courts could uphold this approach to keeping the system balanced (Castro & McQuinn, 2016, p.34).

Second, the U.S. Congress should offer additional resources to state and local law enforcement for cyberforensics and encourage resource sharing. Despite the rapid evolution of encryption and the increased prevalence of cybercrime, state and local law enforcement has been ill prepared to investigate and prosecute the rash of new criminal activity. To help, there have been various federal efforts to strengthen state and local cyberforensics expertise. For instance, the Secret Service's National Computer Forensics Institute trains state and local law enforcement officers, prosecutors, and judges across the country in cyber-investigative techniques ("About-NCFI," 2012). Similarly, the FBI offers limited cybercrime assistance through its cyber task forces. However, these efforts are insufficient. To respond to state and local needs, the U.S. should offer states funding to pursue better cyberforensics and enable greater knowledge sharing among the various levels of law enforcement. For instance, Congress could provide for technical assistance during investigations (Castro & McQuinn, 2016).


Enforcing strict rules for government hacking

Government should engage in hacking only under well-defined rules and regulations established by Congress. Certain current rules make it hard for the FBI to obtain a warrant to hack into a system because of jurisdiction ("Advisory Committee on Rules of Criminal Procedure," 2015). Clear and consistent rules should be set for how and when law enforcement can hack into systems, including any assistance from the private sector. In this way, Fourth Amendment rights will be protected while law enforcement has proper authority for its investigations, and the impact on companies will be reduced (Castro & McQuinn, 2016, p.35).

Improving strategy to tackle cybersecurity

To improve cybersecurity across the globe, the United States should create a broader strategy. First, the U.S. should push back against countries that try to weaken encryption standards in order to suppress competition. Second, the government should press its allies to reject policies and trade agreements that damage cybersecurity within their borders, since these end up endangering everyone's security. Serious consequences should be imposed on countries that undermine the rules for information technology products and services. The Internet's global nature often makes it difficult for legal systems to enforce jurisdiction, making it easier for criminals to purchase products from countries without such laws and rules. Therefore, the U.S., along with its trading partners, should develop a "Geneva Convention on the Status of Data," as proposed in Castro's research (as cited in Castro & McQuinn, 2016). This would help establish clear international rules, answer questions of jurisdiction, better synchronize requests from international law enforcement, and restrict access by the governments of other countries. Only by attempting to establish a worldwide agreement on these issues can countries that have already engaged in mass cyber surveillance assure the international community that countries can hold one another accountable in the future, and that law enforcement can cooperate to resolve jurisdictional violations (Castro & McQuinn, 2016, p.36).


Conclusion

U.S. endeavors to mandate exceptional access to encryption products and services would slow progress in data security and serve only to open markets for foreign competitors, as they did in the first crypto wars. Instead of making the digital world less secure by halting innovation in data security and mandating specific technologies, the United States should stand against any attempts to weaken cybersecurity and should adapt to strong encryption by promoting a better strategy for enhancing cybersecurity around the globe. The widespread use of strong encryption will benefit consumers, drive the economy, and protect the nation’s infrastructure and security.



References

Behrens, M. (2014, November 20). Understanding the 3 Main Types of Encryption. Retrieved from https://spin.atomicobject.com/2014/11/20/encryption-symmetric-asymmetric-


Castro, D., & McQuinn, A. (2016, March 14). Unlocking Encryption: Information Security and

the Rule of Law (Rep.). Retrieved October 10, 2016, from The Information Technology

and Innovation Foundation website: http://www2.itif.org/2016-unlocking-


CEO: Google knows a lot about you, then Forgets. (2009). Retrieved from


FBI vs. Apple. (2016, February 19). The Wall Street Journal. Retrieved from


High-Tech Crime: A New Frontier for Transnational Criminal Organizations. (2009). Retrieved

from https://oag.ca.gov/transnational-organized-crime/ch5

Electronic Frontier Foundation. (n.d.). NSA Spying FAQ. Retrieved from

http://www.eff.org/nsa/faq

Kehl, D., Wilson, A., & Bankston, K. (2015, June). DOOMED TO REPEAT HISTORY? - New

America. Retrieved from https://static.newamerica.org/attachments/3407-doomed-to-



Lee, S. (2016). The government's decades-long battle for backdoors in encryption. Retrieved

from http://www.newsweek.com/governments-decades-long-battle-backdoors-


Lewan, M. (2009). Swedish ISPs vow to erase users' traffic data. Retrieved from


McCullagh, D. (2008). How safe is Instant Messaging? A Security and Privacy Survey.

Retrieved from http://news.cnet.com/8301-13578_3-9962106-38.html

Nakashima, E., & Eggen, D. (2007, October). Former CEO Says U.S. Punished Phone Firm -

The Washington Post. Retrieved from http://www.washingtonpost.com/wp-


Report on intelligence activities and the rights of Americans. (1976).

Rosenzweig, P. (2015, July). Encryption, Biometrics, and the Status Quo Ante - Lawfare.

Retrieved from https://lawfareblog.com/encryption-biometrics-and-status-quo-ante

Singleton, T. W., Rivera, J. C., Hailey, W. A., & Curry, E. M. (n.d.). SE.DSI--The Use of

Encryption in IT Control and in the ... Retrieved from




AND LAW. Retrieved from http://cis-india.org/internet-governance/spies-we-trust

Stuntz, W. J. (1999). The distribution of Fourth Amendment privacy. Washington, D.C.: George

Washington University.

Appendix A

Apple vs. FBI survey conducted by the Pew Research Center

Appendix B

A security and privacy survey conducted by News.com

Service            Secure      Secure         Logs kept of       Logs kept of          For how long       Government
                   logging-in  conversations  logins             message content                          wiretapping
AOL AIM            Yes         Yes            Yes                No                    Won't say          Won't say
AOL ICQ            Yes         No             Yes                No                    Won't say          Won't say
Facebook Chat[1]   No          No             Refused to answer  Refused to answer[2]  Refused to answer  Refused to answer
Google Talk        Yes         Yes[3]         Yes                No[4]                 Four weeks         Won't say
IBM Lotus          Yes         Yes            Yes                Configurable          Configurable       N/A
Microsoft's        Yes         No[5]          No                 No                    N/A                Won't say
Windows Live
Skype              Yes         Yes            Yes                No                    "A short time"     Cannot comply
                                                                                                         with wiretaps[6]
Yahoo Messenger    Yes         No             Yes                No                    As long as         Won't say
[1] Over the course of a week, Facebook refused to reply to questions.

[2] Facebook has said both that chat history "is not logged permanently" and that it is archived for 90 days.

[3] Encryption is on by default for the downloadable client, off by default for the Web, and not supported with the Google Talk Gadget.

[4] Configurable: users can choose to log conversations in their Gmail chat archives if they wish.

[5] Conversations are unencrypted, but files exchanged via Windows Live Messenger are encrypted.

[6] Skype was the only IM company that said it could not perform a live interception if presented with a wiretap request: "Because of Skype's

peer-to-peer architecture and encryption techniques, Skype would not be able to comply with such a request."