Published by Enhelion, 2020-05-20 08:31:52

MODULE 3.1
THE RIGHT TO INFORMATION, PRIVACY AND DATA PROTECTION

3.1.1. Data protection in the Indian context

“You have zero privacy. Get over it.”
Scott McNealy, CEO, Sun Microsystems1

3.1.1.1. Whither privacy: a prefatory note

Despite Scott McNealy’s blunt admonition that Internet users should get comfortable
with the notion of vanishing privacy, 86 per cent of users are either very concerned or
somewhat concerned about electronic privacy2. The issue has generated intense media
scrutiny world-wide. Consumers in the US and Europe are beginning to sue e-commerce
companies over their electronic data collection practices, and, not surprisingly, many
major companies are newly focused on privacy policies and protections.

For decades, businesses have collected consumer data without much complaint. Every
time a consumer completes a warranty card, the information on that card heads for a large
corporate database, which subsequently drives dinner-time telephone solicitations. But
the Internet makes it exponentially simpler to amass large quantities of extremely detailed
personal information. To many, this ease, efficiency, volume and vigour of electronic
data collection is an uncomfortable marriage of George Orwell and Adam Smith.

In physical space, companies can only track the positive actions a consumer takes, such
as a purchase, payment or application. That data can be matched to other databases of
consumer data which helps companies to target specific demographic groups. However,
retailers only know about the items that consumers both consider and purchase, not the

1‘Real Privacy’, The Standard, 2 November 1999, available at
http://www.thestandard.com/article/display/0,1151,7365,00.html.
2 Please refer to ‘Internet Users Seek Assurances over Online Use of Personal Data’, The Washington Post,
21 August 2000, available at http://www.washingtonpost.com/wp-dyn/articles/A60984-2000Aug20.html.

items that they consider and do not purchase. In contrast, in virtual space, companies can
track every step of our unerasable wanderings through cyberspace, even as we surf in the
seclusion of our homes and offices. Every single action on the Internet can become one
more brick in the wall of a detailed consumer profile. Concerns about the breakdown of
the distinction between public actions and private actions must be balanced against the
legitimate needs of Internet ventures to generate a revenue stream from content they are
providing to the public free of charge3.

Once consumer information is gathered, it can be shared with marketers, who can better
target and serve the consumer with highly tailored advertisements and direct e-mail
campaigns. Unlike print media or broadcasting, which target large demographic groups,
banner ads and direct e-mail campaigns on the Internet can target the individual. This is
sometimes known as ‘narrowcasting’. Internet advertising companies such as
DoubleClick offer companies the chance to market only to pre-selected, highly-qualified
consumers, which is why a search for information about an automobile will often return a
page with an ad for a car. Advertisers purchase a specific number of ‘impressions’ and
are promised that their ads will be presented only to the right consumers.

3.1.1.2. Self Regulation: Industry Standards and Practices

Driven both by media and consumer concerns and also, perhaps, by a need to convince
policy-makers that Internet privacy legislation is unnecessary, industry in the US and
Europe has turned to self-regulation of data collection and use. Many companies have
taken the rather extraordinary step of voluntarily publishing a privacy statement, a series
of promises and disclosures about what the company does and does not do to collect or
share user information.

Companies persist in generating these policies primarily for two reasons. First, many
companies feel that a comprehensive privacy statement is necessary to quell users’ fears

3 In effect, the consumer is swapping otherwise private information for the ease and convenience of
Internet services; the public anxiety about privacy springs from our growing realisation of this fundamental,
if subtle, bargain of the Internet.

that their sensitive information will be disclosed. Secondly, companies may wish to post
privacy statements if they are receiving certain types of data from parties in the European
Union. Many companies take a further step and contract with an independent entity, such
as TRUSTe, to audit compliance with their privacy policies. In return for a fee, these
companies monitor adherence and certify the site with a prominently placed seal of
approval. Usually, the privacy statement must meet certain minimum content standards4.
Businesses hope that when a consumer sees an independent certification, the consumer
will have greater confidence in the site’s privacy policy. It is unclear whether companies
like TRUSTe will be liable for failing to discover or report breaches.

Despite these efforts, privacy statements are not always useful to consumers. Because a
privacy statement is by nature a static description of an evolving business practice, most,
if not all, privacy statements permit the company to change its statement at any time and
without any notice, which obligates consumers to view the privacy policy each time they
visit a web site, a practical impossibility. In reality, consumers are likely never to read
privacy policies even once. While some web sites allow consumers to ‘opt out’ of the
information collection process, most consumers have never read the policy and therefore
never exercise their opt-out rights. Thus, even where a company does continually update
its policy, it may still be risking liability, or at least a lawsuit, for failing to notify users of
a policy change. Until the industry moves to an ‘opt-in’ approach or takes steps to
affirmatively alert users to policy changes, companies are likely to face lawsuits from,
and liability to, disgruntled users.

Taking another approach, the industry has also developed privacy protection software,
the Platform for Privacy Preferences Project, or P3P, which has been incorporated
into the most recent release of Microsoft’s popular Internet Explorer browser. A user can
tell the browser what personal data he is willing to release. Then, when viewing a web
site, the P3P software compares the user’s preferences with the kinds of personal data the
web site collects. If there is a mismatch between them, the user is alerted.
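The comparison that P3P automates can be sketched roughly as follows: the browser holds the user’s stated limits, the site declares its data practices, and any declared practice outside those limits produces an alert. The category and purpose names below are illustrative assumptions for this sketch, not actual P3P vocabulary.

```python
# A rough sketch of the preference-matching step a P3P-style agent performs.
# Category and purpose labels are invented for illustration.

def check_site(user_allows: dict, site_declares: dict) -> list:
    """Return the site practices that fall outside the user's preferences."""
    mismatches = []
    for category, purposes in site_declares.items():
        allowed = user_allows.get(category, set())
        for purpose in purposes:
            if purpose not in allowed:
                mismatches.append((category, purpose))
    return mismatches

user_allows = {
    "email": {"account-service"},   # e-mail only for servicing the account
    "browsing-history": set(),      # never share browsing history
}
site_declares = {
    "email": {"account-service", "marketing"},
    "browsing-history": {"ad-targeting"},
}

alerts = check_site(user_allows, site_declares)
# Two declared practices fall outside the user's preferences, so a
# P3P-style agent would warn the user before proceeding.
```

In the real protocol the site’s practices are published as a machine-readable policy file, but the underlying idea is this simple comparison.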

4 Please refer to www.truste.com.

3.1.2. Electronic Privacy Legislation in the United States

3.1.2.1. Electronic Communications Privacy Act

Like most laws being used as the basis for internet privacy suits, the US Electronic
Communications Privacy Act (ECPA) was not enacted with the internet in mind. A 1986
revision of the federal wiretap statute, it protects a person’s electronic communications
from both governmental and non-governmental entities. The law prohibits: (1)
unauthorised intentional access to any computer facility or network, and (2) the
interception of data. It is also an offence to exceed an authorisation to access a computer
facility. This statute is an issue in most pending internet privacy lawsuits. Plaintiffs'
lawyers have focused on the prohibition on intentionally exceeding an authorization to
access a computer or network, arguing that programs that inspect and modify computer
systems exceed the user’s authorisation, particularly where there is no prior disclosure to
the user. The ECPA provides for both criminal and civil penalties for violations. Civil
penalties include statutory damages of no less than $1,000 per plaintiff and reasonable
lawyer fees, a powerful combination that facilitates class action lawsuits.

3.1.2.2. Computer Fraud and Abuse Act, 1986

The Computer Fraud and Abuse Act, sometimes called the anti-hacking statute, prohibits
unauthorised access to computer systems. Here also, the statute not only penalises
unauthorised access to a protected computer, but also prohibits exceeding any
authorisation. Under the Computer Fraud and Abuse Act, one may not ‘access a
computer with authorization and use such access to obtain or alter information in the
computer that the accessor is not entitled so to obtain or alter’. ‘Protected computer’
includes any computer used in interstate commerce or communications, which, given
today’s use of the internet for electronic commerce, includes just about any computer
connected to the internet. Paragraph (5) (A) of the statute also prohibits the transmission
of a virus with the intention of causing damage to a protected computer. As with the
ECPA, violations of this statute carry both criminal and civil penalties. Damages are

limited to economic losses and the action must be brought within two years of the
violation or within two years of discovery of the damage. At least $5,000 of damage
within a one year period is required to bring suit, but plaintiffs may aggregate their
damages. Accordingly, this statute is often featured prominently in internet privacy class
actions.
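The civil threshold just described, at least $5,000 of economic loss within a one-year period, with plaintiffs permitted to aggregate their damages, can be sketched as simple arithmetic; the figures below are invented for illustration.

```python
# Sketch of the Computer Fraud and Abuse Act civil-suit threshold described
# above. Dollar amounts are hypothetical.

THRESHOLD = 5_000  # minimum aggregate economic loss, in dollars

def meets_threshold(losses) -> bool:
    """losses: economic losses (in dollars) suffered by the plaintiffs
    within the same one-year period. Aggregation across plaintiffs is
    what makes class actions under this statute viable."""
    return sum(losses) >= THRESHOLD

# No single plaintiff reaches $5,000, but aggregation carries the class over.
assert not meets_threshold([1_200])
assert meets_threshold([1_200, 2_500, 1_800])  # totals $5,500
```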

3.1.2.3. Children’s Online Privacy Protection Act

One of the few US privacy laws enacted specifically for the internet, the Children’s
Online Privacy Protection Act of 1998 (COPPA) is designed to protect the privacy of
children under the age of 13 while they surf the web. The Act restricts the collection of a
child’s:

● First and last name;
● Home or other physical address;
● E-mail address;
● Telephone numbers; and
● Social Security number

Moreover, the Federal Trade Commission (FTC) is authorised to add other categories of
information that it determines would allow the individual to be contacted. Website
operators are similarly restricted in their online collection of information concerning a
child (or a child’s parents) from a child, where that information is then combined with an
identifier, as described in the Act.

The COPPA requires each website operator to obtain verifiable parental consent before
collecting, using, or disseminating any of the above data. It also provides that sites aimed
at children may not condition the participation in a game or the receipt of a prize on the
child disclosing personal information.
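The Act’s collection rule can be illustrated as a pre-collection gate: refuse any of the enumerated identifiers from a user under 13 unless verifiable parental consent is on record. The field names and the consent flag below are hypothetical assumptions of this sketch, which is illustrative only, not legal advice.

```python
# Illustrative COPPA-style gate. Field names are assumptions, not terms
# defined by the statute itself.

COPPA_RESTRICTED = {
    "first_name", "last_name", "home_address",
    "email", "telephone", "social_security_number",
}

def may_collect(fields, age: int, parental_consent: bool) -> bool:
    """Allow collection only if no restricted identifier is taken from an
    under-13 user without verifiable parental consent."""
    restricted = COPPA_RESTRICTED & set(fields)
    if age < 13 and restricted and not parental_consent:
        return False
    return True

# An under-13 user without consent: collection must be refused.
assert not may_collect(["email", "favourite_colour"], age=10, parental_consent=False)
# The same request with verifiable parental consent is permitted.
assert may_collect(["email"], age=10, parental_consent=True)
```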

3.1.2.4. Video Privacy Protection Act, 1988

The Video Privacy Protection Act was enacted to protect the privacy of consumer rental and
purchase of videos. Although the law was enacted in the 1980s and did not contemplate
the internet, its language is sufficiently broad to include internet video transactions. The
Act applies to ‘any person, engaged in the business … of rental, sale, or delivery of pre-recorded
video cassette tape or similar audio visual materials’. While this definition
clearly captures e-tailers that ship physical products to consumers via traditional delivery
services, it may also cover companies that stream video over the internet. The statute
prohibits the disclosure of purchase or viewing history records of individual consumers
without their informed written consent in advance of disclosure, with certain exceptions.

This statute may create legal risk for companies streaming video for a fee over the
Internet. Disclosure of consumer data could leave these companies open to individual or
class action lawsuits. For example, after a 2012 case alleging violation of this Act,
Netflix changed its privacy practices so that it no longer retained the records of people
who had left its service. Netflix also lobbied for a change in the statute; as a result, the
law was amended to allow video rental companies to share rental information on social
media after obtaining permission from the customer.

The statute provides for damages, including statutory damages of not less than $2,500,
lawyers’ fees and expenses, and punitive damages.

3.1.2.5. Cable TV Privacy Act of 1984

When the US Congress passed the Cable TV Privacy Act of 1984 to protect the viewing
history of individual consumers, no one contemplated that people would receive internet
access through their cable TV.

Nevertheless, the Act provides that a ‘cable operator cannot collect personally
identifiable information without the prior written or electronic consent of the subscriber

concerned’. This prohibition is particularly strict because, while other statutes limit
disclosure of personal data, this statute prohibits the collection of any data whatsoever
without prior written permission. The consumer must ‘opt in’ prior to any data collection.
Further, even if the consumer consents to data collection, the cable company may not
disclose the data to a third party without the consumer’s express written consent. It is
unclear how these restrictions will affect cable operators that supply internet access.

3.1.2.6. State Statutes

Aside from the protections afforded by federal statute, various state statutes protect an
individual’s informational privacy, either broadly or under particular circumstances.
Numerous states have consumer protection and fraud laws, which in many cases would
apply to a breach of a privacy statement, and which might arguably apply to certain
particularly stealthy forms of data collection that occur unbeknownst to the consumer.
Some states, like New York, have laws that track some of the federal statutes discussed
above, while other states, like Connecticut, have laws that are more limited in scope.
Virginia has amended its Privacy Protection Act to include data collected over the
internet. Some state constitutions provide a right to privacy. Any company that collects
data by way of the internet may face liability in any jurisdiction where the internet is
available under any or all of these rules.

3.1.2.7. Common Law Torts

State common law also regulates data collection and disclosure. Note that public
disclosure of the information is not always required for an act to be tortious. The
Restatement (Second) of Torts provides that one may not unreasonably intrude
‘physically or otherwise, upon the solitude or seclusion of another or his private affairs or
concerns… if the intrusion would be highly offensive to a reasonable person’.

3.1.3. European Union

The Council of the European Union has adopted stringent consumer privacy and data
protection rules for all of its member countries. These regulations require all EU
countries to enact tough laws protecting personal data and, most notably for US purposes,
forbidding the transfer of personal data to non-member countries whose laws do not
provide ‘adequate protection’ as determined by the Council. These rules were earlier
enacted under the EU Data Protection Directive. From 2018, they were superseded
by the EU General Data Protection Regulation (GDPR).

Under the GDPR, companies cannot profile an individual’s preferences based on browser
or purchase history unless the user has given express consent to do so. The explicit opt-in
consent must be informed and unambiguous. A single blanket consent no longer covers
different uses; separate processing purposes require separate consent. Data may
only be used for those particular purposes originally consented to by the consumer, and
may only be kept as long as necessary for the purpose collected. The consumer must be
given the right to object to his data being used for marketing purposes. As we have seen,
US law does not contain similar comprehensive requirements. In response to the EU
guidelines, US agencies had negotiated with the Council to develop means by which the
United States can provide ‘adequate protection’ for transferred information: a public /
private safe harbour scheme. To come under the safe harbour, a website operator had to
comply with certain minimum privacy standards regarding data collection, disclosure to
third parties, and consumer access to correct errors, among other detailed requirements.
The website operator also had to post its compliance policy on the website as a privacy
statement, and certify to the Department of Commerce, in one of several prescribed
manners, that it was in compliance with the safe harbour. If the website operator then
violated its policy, it faced either private suit or an action by the FTC for consumer fraud.
With the new GDPR in place, this safe harbour has been replaced by a new EU-US
privacy shield. It increases oversight and sanctions and gives more authority to European
data protection authorities.
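The purpose-bound consent rules described above can be sketched as a small ledger: each processing purpose needs its own explicit opt-in, data may be used only for purposes consented to, and an objection (for example, to marketing use) withdraws that purpose alone. The class name, purpose labels and record layout are illustrative assumptions of this sketch, not GDPR terminology.

```python
from datetime import date

# Minimal sketch of purpose-bound consent in the GDPR's spirit.
# Purpose names and the record layout are invented for illustration.

class ConsentLedger:
    def __init__(self):
        self._granted = {}  # purpose -> date of explicit opt-in

    def opt_in(self, purpose: str, when: date):
        self._granted[purpose] = when

    def object(self, purpose: str):
        # The data subject may object, e.g. to marketing use;
        # only that purpose is withdrawn.
        self._granted.pop(purpose, None)

    def may_use(self, purpose: str) -> bool:
        # No blanket consent: only the specific purpose counts.
        return purpose in self._granted

ledger = ConsentLedger()
ledger.opt_in("order-fulfilment", date(2018, 5, 25))
ledger.opt_in("marketing", date(2018, 5, 25))
ledger.object("marketing")               # right to object to marketing use

assert ledger.may_use("order-fulfilment")
assert not ledger.may_use("marketing")
assert not ledger.may_use("profiling")   # never consented, never usable
```

A real compliance system would also record retention limits and the purpose for which each item of data was originally collected; the point here is only that consent is tracked per purpose, not once for everything.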

With the foregoing in mind, lawyers must work closely with their clients to evaluate the
strengths and weaknesses of any data collection strategy. In the current legal climate, any
misstep could make a company the target of internet privacy litigation precisely at the
point the company’s business is beginning to reach critical mass. Corporate lawyers must
discuss principles of risk, cost and benefit with their clients to help them develop
thoughtful and coherent privacy policies that balance business needs with legal
compliance. Each act of collection and each act of disclosure should be considered.

The importance of cooperation and communication between legal, business and technical
personnel cannot be overstated. Each group must work together to foster a culture of
regular communication and to understand each other’s goals and concerns. A successful
privacy policy rests on a thorough discussion of technology, business, ethics, marketing,
and law.

Moreover, the discussion must be periodically re-engaged. All parties must keep their
fingers on the pulse of ever-changing technological possibilities and evolving legal
principles. The pace of change in the new economy and the uncertain state of privacy law
suggest a process of continual re-evaluation, and, in this process, a fluid working
relationship between lawyers, business people and technologists becomes of paramount
importance. Corporate counsel must work to build these relationships and advise their
clients accordingly.

3.1.4. Security Concerns, Trade Secrets and Privacy: Developing Trends

“One of the most facile and legalistic approaches to safeguarding privacy that has been
offered to date is the notion that personal information is a species of property. If this
premise is accepted, the natural corollary is that a data subject has the right to control
information about him and is eligible for the full range of legal protection that attaches to
property ownership.”5

5 Arthur Miller: The Assault on Privacy: Computers, Data Banks and Dossiers 211 (1971)

As laws, policies, and technological designs increasingly structure people's relationships
with social institutions, individual privacy faces new threats and new opportunities. Over
the Internet as a medium, there has to be a harmonisation of the specific rules for the
treatment of personal information. India has no comprehensive data protection law. Having said this, the
ambit of "personal liberty" as covered by the Constitution of India has been successfully
interpreted in cases relating to privacy, such as Govind v. State of M.P.6 and protection of
confidential information. Over the last several years, the realm of technology and privacy
has been transformed, creating a landscape that is both dangerous and encouraging.
Significant changes include large increases in communication bandwidths; the
widespread adoption of computer networking and public-key cryptography; mathematical
innovations that promise a vast family of protocols for protecting identity in complex
transactions; new digital media that support a wide range of social relationships; a new
generation of technologically sophisticated privacy activists; a massive body of practical
experience in the development and application of data-protection laws; and the rapid
globalisation of manufacturing, culture, and policy making.

Potentially the most significant technical innovation, though, is a class of privacy-
enhancing technologies (PETs). Beginning with the publication of the first public-key
cryptographic methods in the 1970s, mathematicians have constructed a formidable array
of protocols for communicating and conducting transactions while controlling access to
sensitive information. These techniques have become practical enough to be used in
mass-market products, and sharp conflicts have been provoked by attempts to propagate
them. PETs also mark a significant philosophical shift. By applying advanced
mathematics to the protection of privacy, they disrupt the conventional pessimistic
association between technology and social control. No longer are privacy advocates in
the position of resisting technology as such, and no longer can objectives of social control
(if there are any) be hidden beneath the mask of technical necessity. As a result, policy
debates have opened where many had assumed that none would exist, and the simple
choice between privacy and functionality has given way to a more complex trade-off
among potentially numerous combinations of architecture and policy choices.
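The public-key methods the passage refers to can be illustrated with textbook RSA using deliberately tiny parameters (the standard small worked example); these numbers are far too small for any real use, and production systems rely on vetted cryptographic libraries with much larger keys.

```python
# Toy RSA with textbook-sized numbers, purely to illustrate the kind of
# public-key technique the privacy-enhancing technologies build on.
# These parameters are insecure; real systems use keys thousands of bits long.

p, q = 61, 53
n = p * q                  # 3233: the modulus, part of both keys
phi = (p - 1) * (q - 1)    # 3120: Euler's totient of n
e = 17                     # public exponent, coprime with phi
d = pow(e, -1, phi)        # 2753: private exponent, modular inverse of e

def encrypt(m: int) -> int:
    # Anyone holding the public key (n, e) can encrypt.
    return pow(m, e, n)

def decrypt(c: int) -> int:
    # Only the holder of the private exponent d can decrypt.
    return pow(c, d, n)

message = 65
ciphertext = encrypt(message)
assert decrypt(ciphertext) == message   # round-trip recovers the plaintext
```

The asymmetry, in which anyone can encrypt but only the key holder can decrypt, is what lets the protocols described above control access to sensitive information without a shared secret.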

6 (1975) 2 SCC 148.

This contrast reflects another, deeper divide. Powerful socio-economic forces are
working toward a global convergence of the conceptual content and the legal instruments
of privacy policy. These forces include commonalities of technology, a well-networked
global policy community and the strictures on cross-border flows of personal data in the
European Union’s GDPR. While the United States has moved slowly to
establish formal privacy mechanisms and standardise privacy practices over the last two
decades, it now appears that the globalisation of markets, the growing pervasiveness of
the Internet, and the implementation of the GDPR will bring new pressures to bear on the
American privacy regime.

One constant across this history is the notorious difficulty of defining the concept of
privacy. The lack of satisfactory definitions has obstructed public debate by making it
hard to support detailed policy prescriptions with logical arguments from accepted moral
premises. Attempts to ground privacy rights in first principles have floundered,
suggesting their inherent complexity as social goods. Privacy is more difficult to measure
than other objects of public concern, such as environmental pollution. The extreme lack
of transparency in societal transfers of personal data, moreover, gives the issue a
nebulous character. Citizens may be aware that they suffer harm from the circulation of
computerised information about them, but they usually cannot reconstruct the
connections between the cause and effect7.

The new technologies also have implications for conceptions of relationship, trust, and
public space. Technology and codes of practice determine whether database
“relationships” between organisations and individuals are fair, or whether they provoke
anxiety. These concerns are a traditional motivation for data protection regulation, but
they are amplified by technologies that permit organisations to maintain highly
customised “relationships” by projecting different organisational personae to different
individuals. Such “relationships” easily become asymmetric; with the organisation,

7 This may account in part for the striking mismatch between public expression of concern in opinion polls
and the almost complete absence of popular mobilisation in support of privacy rights.

having the greater power to control what information about it is released while
simultaneously obscuring the nature and scope of the information it has obtained about
individuals. Examine, for instance, the conditions under which individuals can establish
private zones that restrict access by outsiders. A secure telephone line is arguably a
precondition for the establishment of an intimate relationship, an interest that has long
been regarded as a defining feature of human dignity. This concern with the boundaries
that are established around a relationship complements concern with the boundaries that
are negotiated within a relationship. It also draws attention to the contested nature of
those boundaries.

Beneficial relationships are generally held to require trust. As the information
infrastructure supports relationships in more complex ways, it also creates the conditions
for the construction of trust. Trust has an obvious moral significance, and it is
economically significant when sustained business relationships cannot be reduced to
periodic zero-sum exchange or specified in advance by contract. Trust and uncertainty
are complementary; cryptography establishes the boundaries of trust by keeping secrets.
This approach, however, reduces trustworthiness to simple reliability, thereby introducing
tacit norms against trusting behaviour. Just as technology provides the conditions for
negotiating relationships, it also provides the conditions for creating trust. Legal systems
supply the institutional conditions under which a technical architecture either comes to
support trust or evolves toward a regime of coercive surveillance.

No matter how well crafted a privacy code might be, privacy will only be protected if the
necessary information practices are actually followed. Policy-makers need to understand
how privacy issues actually arise in the daily activities of information workers, and
organisational cultures need to incorporate practicable norms of privacy protection. Once
established, these norms will only be sustained if the public understands the issues well
enough to make informed choices and to assert their rights when necessary.

3.1.5. Right to Privacy Not an Absolute Right

Right to life includes the right to privacy, but this is not an absolute right. This can be
illustrated with the help of a case. The right to life of a lady whom a patient was to marry
would positively include the right to be told that the person she was proposed to marry
was the victim of a deadly disease which was sexually communicable. Since the right to
life includes the right to lead a healthy life, so as to enjoy all faculties of the human body
in their prime condition, the doctors, by their disclosure that the patient was HIV (+),
cannot be said to have in any way violated either the rule of confidentiality or the right of
privacy. Moreover, where there is a clash of two fundamental rights, as in the instant
case, namely, the patient’s right to privacy as part of the right to life and his proposed
wife’s right to lead a healthy life, which is her fundamental right under Article 21, the
right which would advance public morality or public interest would alone be enforced
through the process of court, for the reason that moral considerations cannot be kept at
bay.8

3.1.6. Confidential Information

Confidential information constitutes the essence of software development. From the
instructions/specifications received from the client/trade partners, to the algorithms
developed by the co-workers, every part of the development of an item of software code
involves the use of confidential information. All of this information is invaluable to the
software company developing the code and even more so to its competitors. There is no
copyright in ideas or information as such and accordingly there is no remedy under the
copyright law for unauthorised use of confidential ideas or information obtained directly
or indirectly by one person from another. A remedy will have to be sought by
proceedings for breach of confidence or breach of trust. The relief that can be obtained is
by a suit for an injunction or damages.

8 Mr. “X” v. Hospital “Z”, AIR 1999 SC 495 : 1998 AIR SCW 3662 : 1998 (7) JT 626 : 1998 (6) Scale
230 : 1998 (8) SCC 296 : 1998 (9) Supreme 220.

3.1.6.1. Protection of Confidential Information

If ideas and information are acquired by a person in such circumstances that it would be a
breach of good faith to disclose them to a third party or utilise them and he has no just
cause or excuse for doing so, the court will grant an injunction against him. It is well
settled that information imparted in confidence (especially information imparted in
confidence to servants and agents) will be protected. The courts will restrain the use of it
if the use is in breach of good faith. The law on this subject does not depend on any implied
contract. It depends on the broad principle of equity that he who has received information
in confidence shall not take unfair advantage of it. He must not make use of it to the
prejudice of him who gave it without obtaining his consent.

3.1.6.2. Nature of Confidential Information

It is a matter of common knowledge that, under a system of free private enterprise and
therefore of competition, it is to the advantage of a trader/commercial entity to obtain as
much information as possible concerning the business of his rivals and to let them know as
little as possible of his own.

The information may be a trade secret, for example, a method of production not protected
by a patent, or a business secret, such as the financial structuring of an undertaking or a
piece of domestic ‘in-house’ information like the salary scale of clerks, or the efficiency
of the firm’s filing system. Some of this information would be of a highly confidential
nature, as being potentially damaging if a competitor should obtain it, some would be less
so and much would be worthless to a rival organisation.

3.1.6.3. Confidence Implied in a Contract

If two parties make a contract under which one of them obtains, for the purposes of the
contract or in connection with it, some confidential matter, even though the contract is
silent on the issue of confidence, the law will imply an obligation to treat that confidential

matter in a confidential way, as one of the implied terms of contract, but the obligation to
respect confidence is not limited to cases where the parties are in a contractual
relationship.

3.1.6.4. Confidence Implied by Circumstances

An action for breach of confidence does not depend upon any right of property, contract
or other legal right. It rests on an equitable obligation of confidence, which may be
implied from the circumstances of the case. Even if there exists no contractual
relationship between the plaintiff and the defendant, if a defendant is proved to have used
confidential information obtained directly or indirectly from the plaintiff and without his
consent, express or implied, he will be guilty of infringing the plaintiff’s rights.

3.1.6.5. Identification of Confidential Information

In identifying confidential information, four elements must be discerned: First, the
information must be information the release of which the owner believes would be
injurious to him or advantageous to his rivals or others. Second, the owner must believe
that the information is confidential or secret, i.e. that it is not already in the public
domain. It may be that some or all of his rivals already have the information, but as long
as the owner believes it to be confidential, he is entitled to try to protect it. Third, the
owner’s belief under the two previous headings must be reasonable. Fourth, the
information must be judged in the light of the usage and practice of the particular
industry or trade concerned. It may be that information which does not satisfy all these
requirements may be entitled to protection as confidential information or trade secrets,
but any information which does satisfy them must be of a type which is entitled to
protection.

3.1.6.6. Essential requirements of breach of confidence

Three elements are normally required if, apart from contract, a case of breach of
confidence is to succeed. First, the information itself must have the necessary quality of
confidence about it. Secondly, that information must have been imparted in
circumstances importing an obligation of confidence. Thirdly, there must be unauthorised
use of that information to the detriment of the party communicating it.

3.1.6.7. Exceptions to breach of confidence

Where the information is such that it ought to be divulged in the public interest to one
who has an interest in receiving it, the Court will not restrain such a disclosure.
Information relating to anti-national activities, which are against national security,
breaches of the law or statutory duty or fraud, may come under this category. In fact,
whenever there is strong public interest in the disclosure of the matter, Courts may not
consider such disclosure as breach of confidence.

3.1.6.8. Remedies for breach of confidence

The remedies for breach of confidence consist of an injunction, damages and delivery-up
where applicable. The injunction may be interlocutory or permanent. The information
may remain confidential only for a limited period, in which case the injunction will not
extend beyond that period. Since the information alleged to be confidential might be of
value to the plaintiff only for a certain period, an interim injunction will ordinarily be
granted only for a specified period, depending upon the circumstances and the nature of
the confidential information.

In the balance of convenience, the following factors have to be considered:
● whether the effect of an injunction would be harmful to the defendants;
● whether the terms of the injunction are such that it is extremely difficult for the
defendants to know what they may do and what they may not do;
● whether it is certain, on the material before the Court, that even if successful at
trial, the plaintiff would obtain an injunction rather than damages.

Damages or compensation is determined by reference to the market value of the
confidential information, assessed on a notional sale between a willing seller and a
willing purchaser.
This method may be more appropriate for confidential information relating to industrial
designs or processes or business secrets.

Where a plaintiff elects in favour of an account of profits, he will in the normal course
receive the difference between the sale price of the goods and the sum expended in
manufacturing them. The sum would be abated by the amounts, if any, expended by the
defendants as commission in relation to the contract.
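The account-of-profits computation described above can be sketched with a short worked example. All the figures below are hypothetical assumptions chosen purely for illustration, not drawn from any case:

```python
# Account of profits for breach of confidence (illustrative sketch only).
# All figures are hypothetical assumptions.
sale_price = 100_000         # total sale price of the goods
manufacturing_cost = 60_000  # sum expended in manufacturing them
commission = 5_000           # commission expended in relation to the contract

# Profit = sale price minus manufacturing cost, abated by commission paid.
account_of_profits = sale_price - manufacturing_cost - commission
print(account_of_profits)    # 35000
```

The point of the abatement is that the defendant accounts only for the net gain actually attributable to the misuse, not for sums it genuinely expended in earning that gain.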

3.1.7. Employee Privacy Rights

Employee privacy is considered one of the most important issues facing companies
today9. No longer is employee privacy limited to employers “monitoring their workers’
performance by observing production lines, counting sales orders, and simply looking
over the employee’s shoulder.” Instead, employers now have the capability to monitor
their employees through electronic means, including computers and e-mail. This
“development of sophisticated technology is greatly expanding the advanced and highly
effective methods by which employers monitor the workplace.”10

Although e-mail obviously gives companies many technological advantages and is an
important tool in today’s business world, it also creates a problem for employers and
employees in the area of employee privacy. The question becomes: do employers have
the right to look at employees’ e-mails, and do employees have a right of

9 Laurie Thomas Lee, Watch Your E-Mail! Employee E-Mail Monitoring and Privacy Laws in the Age of
the “Electronic Sweatshop”, 28 J. Marshall L. Rev. 139, 139 (1994)
10 Larry O. Natt Gantt, II, An Affront to Human Dignity: Electronic Mail Monitoring in the Private Sector
Workplace, 8 Harv. J.L. & Tech. 345, 345 (1995)

privacy that should prevent such an intrusion? Employers argue that they need the right to
electronically monitor employees in order to enhance job performance, prevent theft,
fraud, and other illegal conduct. They also argue that productivity, efficiency, and quality
controls are all enhanced by electronic surveillance. The employee on the other hand,
maintains that he has an expectation of privacy, and that electronic surveillance is an
invasion of that right. A number of e-mail attributes lead employees to believe that these
messages are their own private communications11. The need for passwords, the ability
to personally address e-mail, the use of the word “mail” (traditionally the most
confidential form of communication used by the public), and even the ability to “delete”
messages after reading them all contribute to employee e-mail users believing that their
e-mail communications are private.

Functionally, a proper e-mail privacy standard lies at the confluence of two critical
questions: how much access do employers have to an employee’s workspace, and is that
access limited by a right of the employee to control their workspace; and how much of a
right do employees have to use the employer’s property as resources to pursue their own,
private purposes. The laws concerning employee privacy are unclear at best, non-existent
in many situations, and still under discussion in India.

3.1.7.1. Employer Protection

The question, then, is how an employer can protect itself against liability. First, it is
important to
do so in a manner that evidences the employee’s understanding of the policy.

Whether the current employer/employee relationship exhibits it or not, there is a
judicially created right to privacy. Privacy law has attempted to balance two basic
interests: first, the employer has an interest in minimising losses and injuries, preventing
fraud and crime in his workplace, and maximising production, productivity, and success.

11 Benkler, Yochai, Rules of the Road for the Information Superhighway: Electronic Communications and
the Law, West Publishing, 1996 at 402

Second, the employee has an interest in being free from intrusion into his/her private
affairs. Neither of these basic interests is more important than the other. In fact, privacy
law has taken on a “circumstances” based inquiry. How then, does this “circumstances”
based inquiry apply to the relatively new concept of privacy in the employer/employee
context of e-mail transmission?

The answer is, it really has not gone far enough. The Constitution does not explicitly
protect the privacy of employed individuals, and there is some doubt
whether it applies to e-mail at all. At present, legislation is under review, but without an
element of finality. Case law is sketchy at best, and is not on point in e-mail and internet-
related activities.

Therefore, to prevent unnecessary disputes in the future, there are things that employers
and employees can do. First, employers should notify employees about any company
policies which may allow the employer to search and conduct surveillance of
the employee. Thus, the expectation of privacy needs to be managed. Second, the
employer should limit the inquiry to matters associated with the workplace and the ability
of an individual to do their job. It probably does not benefit the employer to delve into an
employee’s personal e-mail. Third, employers should limit the amount of sensitive
information employees see. This would essentially negate the need to monitor. Fourth,
employers should not release any private information about the employee. Lastly,
employees should keep their personal correspondence where it belongs - at home and out
of the workplace. If both employers and employees practice these techniques, a more
workable environment for e-mail monitoring will emerge.

Nevertheless, one thing is for sure. Today, any company that uses e-mail must consider
the growing restrictions arising from both judicially created and statutory law. In
addition, any employer who is thinking about monitoring and “snooping” over e-mail
had better make sure that the employee has an awareness of this intent, because although
the laws are ambiguous today, the trend is toward a more protective environment for the
employee.

3.1.8. Breach of Confidentiality and Privacy: The Indian Perspective - an ‘offence’
under the Indian Information Technology Act, 2000 (IT Act)

India has, as such, no specific privacy laws in place as yet. However, drawing analogy from the
rulings of the Indian Supreme Court on Article 21 of the Constitution of India, one can
safely presume that the existing standards and case precedents of the developed world
will have a significant impact on the laws of India and the rulings of the Indian courts.
Privacy has been established as a fundamental right in the Puttaswamy case. The
implementation of the Information Technology Act, 2000, is bound only to strengthen
this position.

Section 72 of the IT Act prohibits unauthorised disclosure of the contents of an electronic
record. Privacy, in fact, involves at least two kinds of interests: the informational privacy
interest and the autonomy privacy interest. The informational privacy interest is the
interest in precluding the dissemination or misuse of sensitive and confidential
information. The autonomy privacy interest is the interest in making intimate personal
decisions or conducting personal activities without observation, intrusion or
interference12. Both the interests
deserve protection. In regard to autonomy privacy interests, there are, however, certain
limitations and exceptions as set out in sections 67, 68, 69 of the IT Act, while Section 72
protects the informational privacy interests. It prohibits disclosure of information
received by a person in pursuance of the powers conferred under the Act. Such disclosure
is punishable with imprisonment for a term, which may extend to two years and/or fine,
which may extend to one lakh rupees. Disclosure could, however, be made without any
penal liability to the law enforcing agencies or pursuant to proper authorisation by the
Controller or with the consent of the concerned person.

12 Refer to Hill v. National Collegiate Athletic Association, 865 P 2d 633 (1994)

3.1.9. Privacy and Internet Law [Internet References]

Privacy protection is a critical element of consumer and user trust in the online
environment and a necessary condition for the development of electronic commerce.
The following international guidelines, rules and resources set forth basic consumer
privacy protections:
1. Organisation for Economic Co-operation and Development -- Guidelines on the
Protection of Privacy and Transborder Flows of Personal Data (Privacy Guidelines)
(1980)13
2. Council of Europe -- Convention for the Protection of Individuals with Regard to
Automatic Processing of Personal Data (1981)14 -- Articles 4-10 set out the basic
principles for data protection.
3. Internet Privacy Guidelines (23 February 1999) -- practical, non-binding advice for
Internet users and service providers15
4. A good overview of the privacy rules and recommendations issued by the Council of
Europe16
5. European Union -- Data Protection Directive (1995),17 replaced by the GDPR.18
6. Guide to the data protection directive -- focuses on who is entitled to handle personal
information and how such information can be processed.19

3.1.10. Consumer Privacy

There are two aspects to the concept of privacy:
1. Consumer privacy - the right of individuals to control information about them
generated or collected in the course of a commercial interaction. Referred to in
Europe as "data protection."

13 See http://www.oecd.org/dsti/sti/it/secur/index.htm
14 See http://conventions.coe.int/treaty/EN/cadreprincipal.htm
15 See http://www.coe.fr/dataprotection/rec/elignes.htm
16 See http://www.coe.fr/dataprotection/eintro.htm
17 See http://europa.eu.int/eur-lex/en/lif/dat/1995/en_395L0046.html
18 See https://gdpr-info.eu
19 See http://europa.eu.int/comm/internal_market/en/media/dataprot/news/guide_en.pdf

2. Privacy rights of the individual against the government - the individual's protection
against unreasonable government intrusions on privacy, such as searches of the home
or interceptions of communications.

Internet law needs to address both sets of issues.

Consumer privacy protection in the US and Europe, as well as under the guidelines of the
OECD, is based on the following principles:
1. Notice and Consent - before the collection of data, the data subject should be
provided notice of what information is being collected and for what purpose, and an
opportunity to choose whether to accept the data collection and use.
In Europe, data collection cannot proceed unless the data subject has unambiguously
given his consent (with exceptions).
2. Collection Limitation - data should be collected for specified, explicit and legitimate
purposes. The data collected should be adequate, relevant and not excessive in
relation to the purposes for which they are collected.
3. Use/Disclosure Limitation - data should be used only for the purpose for which it was
collected and should not be used or disclosed in any way incompatible with those
purposes.
4. Retention Limitation - data should be kept in a form that permits identification of the
data subject no longer than is necessary for the purposes for which the data were
collected.
5. Accuracy - the party collecting and storing data is obligated to ensure its accuracy
and, where necessary, keep it up to date; every reasonable step must be taken to
ensure that data which are inaccurate or incomplete are corrected or deleted.
6. Access - a data subject should have access to data about himself, in order to verify its
accuracy and to determine how it is being used.
7. Security - those holding data about others must take steps to protect its
confidentiality.

3.1.11. Privacy Protection against the Government

The right to privacy is internationally recognized as a human right. However, most
governments claim the authority to invade privacy through the following means:
1. Interception of communications in real-time
2. Interception of traffic data (routing information) in real-time
3. Access to data stored by service providers, including traffic data stored for billing
purposes
4. Access to data stored by users

These means of access to communications and stored data must be narrowly defined and
subject to independent controls under strict standards. Real-time interception of
communications should take place only with prior approval by a judge, issued under
standards at least as strict as those for police searches of private homes.

With the advent of the Internet, it has become easy for anyone to compile and exploit
private information of individuals. What were once scattered, unimportant, small bits of
data have now become a potent, large data set that can be misused by companies or by
antisocial elements. This has prompted many countries to come up with legislation on
privacy.

3.1.12. International Privacy Initiatives

On July 25, 1995, the EU announced the adoption of a directive on the protection of
individuals’ personal data and on the free movement of such data. The directive sought to
prevent abuse of personal data and laid down comprehensive rules, including an
obligation to collect data only for specified, explicit and legitimate purposes, as well as to
only hold data if it is relevant, accurate and up-to-date.

The directive required all companies concluding business in the EU to meet certain
minimum standards of data protection. Any company that did not meet these stringent

standards faced sanctions. The guidelines laid down in this directive were further
strengthened in 2018 when the GDPR was enforced. It replaces the 1995 directive and
gives EU residents more control and rights over their data. It also increases the extent of
compliance required by companies doing business in the EU and prescribes heavy
penalties for violations of the GDPR.

The Electronic Communications Privacy Act in the US governs the privacy of e-mail in
public e-mail systems. It bars interception, use, or disclosure of e-mail by third parties
and sets the standards which law enforcement authorities must meet to gain access to e-
mail.

3.1.13. Indian Law Relating to Privacy: An Epilogue

Significantly, India does not have any specific law governing privacy. The courts in India
have not yet had the opportunity to look at privacy issues relating to the Internet.
Analogies to the Internet will, therefore, have to be drawn from cases that the court has
actually dealt with.

The Constitution of India does not expressly grant the fundamental right to privacy.
However, the Supreme Court has, in the Puttaswamy case, read the right to privacy into
the other existing fundamental rights: the Freedom of Speech and Expression under
Article 19(1)(a) and the Right to Life and Personal Liberty under Article 21. Barring a few
exceptions, the fundamental rights secured to an individual are limitations on State
action. They are not meant to protect persons against the conduct of private persons. It is
to be noted that the constitutional guarantee of the right to privacy is valid only against
the State and no constitutional remedy for violation of privacy lies against any individual.
The Puttaswamy judgment emphasises digital privacy and urges the legislature to
enact a law addressing privacy with regard to internet companies.

Further, the common law also does not provide a direct remedy for invasion of privacy. It
seeks to provide protection through civil wrongs such as defamation and breach of

confidence. However, with the advent of e-commerce, such common law seems
manifestly unsuited to this environment. As seen above, it may be difficult in India to
prevent individuals/corporations from violating privacy. There is, at present, no initiative
on the part of the government to regulate privacy of individuals against its encroachment
by private parties.

Companies that request this information potentially have a legal compliance issue in any
country where the information is collected and those to which it is transferred.
Regulatory restrictions may come into play at such stages as the point of data collection
(what is collected, how it is collected and how its use is disclosed), during the use of the
data (can it be used for purposes other than specified?), for transfers of the data to third
parties, relating to the security and protection of the data, and may also limit the transfer
of information to third countries.

Let us observe, for example, the perspective of a global EU-based company with a
sophisticated approach to the EU’s privacy and data protection regime and an
understanding of global compliance challenges. Despite certain similarities in existing
approaches and efforts toward harmonization of legal rules for e-commerce and the
Internet, DaimlerChrysler has encountered a divergence of worldwide approaches to
privacy and data protection. For example, in Germany, the location of the company’s
headquarters, the core data protection principles of the German Federal Act on Data
Protection, which require public and private entities to obtain individual consent,
maintain confidentiality and provide notice and rights of access to individual information,
have been supplemented by the EU GDPR, which includes additional requirements on
obtaining and using as little data as possible and special restrictions regarding certain
“sensitive” data. It also includes well-known restrictions on transfers of data to any third
countries which do not provide protection equivalent to the EU rules.

Equally important is the perspective of a global Japan-based company facing
similar challenges but operating from a different legal tradition. Also worth noting is
the growing concern over the treatment of private data in Japan, leading to

the change from ministry-approved and industry group guidelines to the introduction of a
new law on personal data protection, which is generally welcomed by the business
community in Japan. Though not required by Japanese law to do so, Fujitsu has taken
actions to protect personal data due to a belief in its importance to consumers.

3.1.14. Four Basic National Privacy/Data Protection Approaches

Overall, one can identify four basic national approaches to the issue: (1) statutory general
data protection laws, as in Japan (under its new law), Hong Kong, Taiwan and New
Zealand; (2) sector-specific laws (such as for medical or financial information), in
countries such as the PRC and Malaysia; (3) state-approved guidelines, an approach
which had been followed by Japan, though that country is moving to a statutory general
data protection approach; and (4) a self-regulatory approach, as represented by Singapore
(though that country includes some sector-specific restrictions). There is currently a
tension between more of a regulated approach, such as is shared by the EU, and the
tendency toward the kind of laissez faire approach found in the U.S.

“Business entities” covered under Japan’s new law include persons who use personal data
databases, which would include law firms that maintain databases of clients and/or
employees. Tracking the OECD privacy guidelines, the basic law provides that business
entities must specify the purposes of data collection to the subject of the collection, must
keep the personal data accurate and current, adopt adequate security safeguards against
loss or leakage of information, use collected information only for the purposes described,
and prevent a transfer to third parties without consent of the data provider. Business
entities also must disclose the personal data they hold when requested to do so by a
data subject, and correct any inaccuracies in such data.

3.1.14.1. Hong Kong – Another Example of Statutory General Data Protection

“Personal data” covered under Hong Kong’s Personal Data (Privacy) Ordinance must
have the elements of attribution (in relation to a living individual), identification (the

individual may be identified from the data) and retrievability (access to or processing of
the data must be possible). The Ordinance requires that collected data must be relevant
(related to a function or activity of the “data user”), necessary for or related to the stated
purpose and not excessive. The data user must specify the purposes of data collection and
access/correction rights, inform data subjects of the classes of persons to whom the data
may be transferred, keep the personal data accurate and not retain it longer than necessary
and, in the absence of consent, must use collected information only for the purposes
described or directly related purposes. Data users must adopt adequate security
safeguards against loss or leakage of information, must disclose the personal data they
hold when requested to do so by a data subject, and correct any inaccuracies in such
data. The Ordinance also addresses aspects of employee data, such as generally providing
access to employee evaluations but denying access to pre-Ordinance evaluations for a
period of 7 years.

3.1.14.2. The PRC – Limited Sector-Specific Rules in the Absence of a Privacy
Tradition

Unlike Hong Kong, the PRC does not regulate privacy and data protection through
general statutory provisions. Only a few express references to privacy protection appear
to exist under PRC statute. For example, while Internet content providers that operate
electronic bulletin board services are not to disclose web user personal data to third
parties without consent, this restriction applies only to the extent not modified by other laws. One
such law requires Internet service providers to record certain customer information,
including a customer’s account, domain name, telephone number and even Internet log-
on times, and make this information available to authorities. In fact, state access to
personal information is highly valued, as evidenced by the continued existence of the
lifetime “personal file” system on each individual to record whereabouts, employment,
awards, punishments and other information. While minimum data protection-like
limitations are established, such as certain collection, safekeeping and transfer
restrictions, the system is designed for state control, not individual protection, and access
by individuals is strictly prohibited. The launch of a computerized staff management

system known as “China black files” is meant as a repository of personal data for
employees, which employers can access. While employee permission is required for
access, detractors argue that employees will be under pressure to grant permission if they
desire future employment.

3.1.14.3. Singapore – A Largely Self-Regulatory Approach

In contrast to Japan and Hong Kong, Singapore has no plans to institute statutory general
privacy or data protection laws. Apart from a limited common law and contract
protection of confidential information, the protection of account information under the
Banking Act is among the few limited statutory privacy or data protection provisions.
The Singapore government perceives only modest consumer interest in privacy and data
protection as balanced against the government’s interest in not restricting online business
or impeding the growth of e-commerce in Singapore. Singapore has instead introduced a
non-binding code of practice for Internet service and content providers. Providers who
wish to comply with the code may apply to a compliance authority for the use of a
compliance symbol, and complaints may be made to the authority in the event of a failure
to comply with the code. Though elements of the OECD privacy guidelines are reflected
in the code, the code contains many open-ended obligations even for those who
voluntarily subject themselves to compliance, such as to take “reasonable steps” to ensure
confidentiality, use personal information for “legitimate purposes” and “endeavour to
give the user an option” as to whether it wishes to receive third-party marketing material.

3.1.14.4. TRUSTe – A Private Sector Privacy Seal Alternative

The objective behind an online privacy “seal” program, such as that introduced by
Singapore and privately-administered programs such as that run by TRUSTe, is to
indicate to consumers that they can expect companies which display the seal to follow
certain requirements about the way the displaying web site handles data, and that an
independent third party would handle complaints and resolve disputes.

Such a self-governance model allows industry to self-regulate, yet with the
outside scrutiny of a third party such as TRUSTe. Sanctions for a failure to comply with
the TRUSTe standards include a potential whistle-blowing to government authorities or
revocation of the privacy seal. We are advised that the over-2,000 web sites that display
the TRUSTe seal now generate about 200 complaints per month, most of which are
resolved to the user’s satisfaction.

One cannot but emphasise the importance of building online trust for the healthy growth
of online commercial transactions, whether or not companies are compelled by law to
conform to privacy and data protection requirements. Yet, since compliance with privacy
laws in multiple jurisdictions is of growing importance, TRUSTe has also developed
privacy seal programs which enable companies to comply with the EU GDPR.

The best course forward requires careful deliberation, not in isolation, but in the context
of the overall development of Internet Law and Policy.

