Published by Enhelion, 2021-11-09 02:02:22

Module 4


4.1 How are robots related to privacy:

What are robots? A robot is defined as a machine or mechanical device devised and designed to perform or execute one or more tasks “automatically”, with promptness and exactitude. An autonomous robot, supplemented with its own on-board computer, acts as a stand-alone unit. The most sophisticated smart robots of today possess a built-in artificial intelligence [AI] framework that can adapt and acclimatize itself to the setting around it and the situation it is placed in. Next-generation robots are in the research-and-development phase and include features such as artificial intelligence, self-replication, self-assembly, and nanoscale size. The ultimate in robotic intelligence and sophistication may take forms yet to be imagined, and that is exactly what we should be concerned about. The degree of sophistication and real presence being built into robots raises the question of where the line should be drawn, and of what is to be dealt with and in what manner. AI is becoming involved in our lives and our daily activities, from the mobile phones we use and the cars we drive to the defense system of a Nation. Are we sure enough of robotics and AI to hand over the security of a Nation into their figurative hands? Certainly these AIs have become sophisticated enough to understand our commands, but are they good enough to analyze a situation and create a response?

The point of our vulnerability, and our central question, is this: what information do these robots process, and how is it gathered? Robots are furnished with the capacity to sense, manage, and document the world around them. They can go to places inaccessible to an ordinary human being and visually observe what a human could never physically see; in essence, robots are human instruments. Yet the principal use to which we have put these instruments is surveillance and reconnaissance. Amplifying the power to keep a close watch over another person is merely one of the ways in which robots may violate accepted norms of privacy. A further concern, beyond privacy, is security: with access to our personal information, we give robots the ability to follow us around and monitor us; in effect, we give them full surveillance autonomy. And that is only the private life of an individual. Robots form part of a much bigger picture: they are not concerned only with what one does in day-to-day life. They have vast applications in the automobile industry, where remote access lets them turn our vehicles on and off, track our movements, and more. These automotive applications lead to a still larger picture, in which the same autonomous technology is used in our defense systems, in the unmanned aerial vehicles deployed for surveillance and combat. There is no denying that these AI robots help us in combat, but they make us equally vulnerable to cyber-attacks by an enemy state.

Robots are becoming quicker and smarter at reading sensitivities, emotional responses, and thought processes, often by merely “looking” at a person, an advancement that might one day enable more perceptive machines to discern changes in a person's physical or emotional wellbeing or frame of mind. And this is happening in real time: machines developed by Kiju Lee, Nord Distinguished Assistant Professor of Mechanical and Aerospace Engineering at the Case School of Engineering, and graduate student Xiao Liu accurately recognize human sentiments and sensations merely by looking at people's facial expressions, 98 percent of the time, almost immediately. Nor are the Lee-Liu experiments the only ones of their kind; related scientific experiments and tests have achieved similar results. These days robotics is not about huge mechanical figures walking around following our commands; it is anything that listens to us and does as we command [and not even necessarily that, now that AI gives machines a brain of their own], ranging from the smartphone in our hand to a machine in industry working on its own. Recently, Google launched a new application for its assistant, Google Duplex. What is it? The service works with an AI-driven voice designed to help people make appointments with businesses over the phone, without any interaction from the user. The AI voice can not only understand the voice of the human on the other end of the call, but can also respond with correct answers to that real person's inquiries and questions. Google Duplex's voice even puts in words like “um” and pause breaks to make it sound even more like a real human.
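As the description above suggests, the “human” feel of a Duplex-style call is a deliberate, programmable layer: a scripted reply has fillers and pauses inserted before it is synthesized. The following is a purely illustrative sketch of that one idea; the function names, filler list, and booking phrase are invented here and have nothing to do with Google's actual implementation.

```python
import random

# Illustrative only: inject disfluencies ("um", pauses) into a scripted reply
# so a text-to-speech engine sounds more human. Not Google's Duplex code.
FILLERS = ["um,", "mm-hmm,", "uh,"]

def humanize(reply, rng=random.Random(0)):
    """Prepend a filler word and insert a pause marker near the middle of the reply."""
    words = reply.split()
    words.insert(len(words) // 2, "<pause>")  # the TTS layer would render this as silence
    return f"{rng.choice(FILLERS)} " + " ".join(words)

def book_table(name, time):
    """A fixed appointment-booking utterance, passed through the humanizing layer."""
    return humanize(f"I'd like to book a table for {name} at {time} please")

print(book_table("Anna", "7 pm"))
```

The dialogue logic itself stays fully scripted; only the surface form is made to sound spontaneous, which is precisely why listeners may not realize they are talking to a machine.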

We have R&D going on all over the world to make robots better equipped to understand, analyze, and respond to situations, but at what cost? Are we compromising the privacy of individuals who, in their day-to-day lives, give remote access to robots and AIs that understand their emotions and their responses to situations? Are we making robots human, or humans robots?

The Supreme Court's upholding of privacy as a fundamental right is the first step in making India ready to adopt artificial intelligence, and according to officials the government has already started working on a framework for it.1 The country is home to the world's largest biometric identity project, Aadhaar, which, depending on how it is used, can form a focal point for AI applications in the country. India is also at a critical point in the development of data protection regulation, which will have a profound effect on how AI technologies can and will function within it.2 With all the progress AI is making, regulation is considered necessary both to encourage it and to manage the associated risks.

National Initiatives to regulate AI:
A. Report of the Artificial Intelligence Task Force:

The task force’s report looked at the “use of AI along with its major challenges,
and possible solutions for each sector.” It examined ten sectors, referred to as
“domains of relevance to India.” These sectors were: Manufacturing, FinTech,
Agriculture, Healthcare, Technology for the Differently-abled, National Security,
Environment, Public Utility Services, Retail and Customer Relationship, and
Education.3 The task force made several recommendations to the government:

2 Vidushi Marda, Artificial Intelligence Policy in India: A Framework for Engaging the Limits of Data-driven Decision-making.
3 Elonnai Hickok et al., The AI Task Force Report – The First Steps Towards India's AI Framework, CIS [June 27, 2018].

Noting that ‘AI should be seen as a scalable problem solver in India rather than
only as a booster of economic growth’, the Task Force recommends:
[a] the creation of an inter-ministerial National AI mission to coordinate AI-related activities in India;
[b] enabling the setting up of digital data banks, marketplaces and exchanges to
ensure availability of cross-industry data and information;
[c] participating in the elaboration of operation standards for AI-based systems;
[d] putting in place enabling policies to encourage and facilitate the development
and deployment of AI-based products [such as data policies regarding ownership,
sharing rights and usage, as well as tax incentives to support innovation];
[e] elaborating an AI education strategy to develop human resources with
necessary skills;
[f] supporting reskilling of the current workforce;
[g] participating in the international policy discussion on the governance of AI
technologies; and
[h] leveraging bilateral partnership on the development of AI solutions for social
and economic problems and for sharing best practices in regulation.4

B. NITI Aayog Discussion Paper on a National AI Strategy:

On February 1, 2018, Finance Minister Arun Jaitley stated that the government
think-tank NITI Aayog “would lead the national programme on AI” and that “[the

4 India: Government-appointed Task Force Issues Recommendations on AI, GIP DIGITAL WATCH [Mar. 30, 2018].

government is set to support start-ups and centres of excellence with respect to AI
training and research activities.”5 Its flagship initiative is a two-tiered integrated
strategy to boost research in AI. First, new Centres of Research Excellence in AI
[COREs] will focus on fundamental research. Second, the COREs will act as
technology feeders for the International Centres for Transformational AI [ICTAIs],
which will focus on creating AI-based applications in domains of societal
importance. In the report, NITI Aayog identifies healthcare, agriculture,
education, smart cities, and smart mobility as the priority sectors that will benefit
the most socially from applying AI. The report also recommends setting up a
consortium of Ethics Councils at each CORE and ICTAI, developing sector
specific guidelines on privacy, security, and ethics, creating a National AI
Marketplace to increase market discovery and reduce time and cost of collecting
data, and a number of initiatives to help the overall workforce acquire skills.
Strategically, the government wants to establish India as an “AI Garage,” meaning
that if a company can deploy an AI in India, it will then be applicable to the rest of
the developing world.6

Ministry of Electronics and Information Technology Committees
The Ministry of Electronics and Information Technology has established four
committees to help encourage research in AI. They are headed by “directors of

5 Why We Need to Have Regulation and Legislation on AI and Quick, THE INDIAN EXPRESS [July 31, 2018].
6 Tim Dutton, Artificial Intelligence Strategies, MEDIUM [June 28, 2018].

Indian Institutes of Technology [IITs], Nasscom and eminent researchers”7 and
include the following:
i. Committee on platforms and data for AI,
ii. Committee on leveraging AI for identifying National Missions in key sectors,
iii. Committee on mapping technological capabilities, key policy enablers, skilling, re-skilling and R&D
iv. Committee on cybersecurity, safety, legal and ethical issues.8
The four committees are “presently studying AI in context of citizen centric
services; data platforms; skilling, reskilling and R&D; and legal, regulatory and
cybersecurity perspectives.”9

Privacy Challenges:

India currently does not have a comprehensive legal framework for data protection.
On July 27, 2018, the government of India’s Committee of Experts [also known as
the Justice B.N. Srikrishna Committee] released a Draft Protection of Personal
Data Bill10 along with an accompanying report titled A Free and Fair Digital

7 Digital India: IT Ministry Set Up Four Committees to Encourage AI Research, LIVEMINT [Feb. 10, 2018].

8 Press Release, Ministry of Commerce and Industry, supra note 61.

9 Vidushi Marda, Artificial Intelligence Policy in India: A Framework for Engaging the Limits of Data-driven Decision-making, 376[2133] PHIL. TRANS. A [Royal Society Pub., Nov. 28, 2018].

10 Personal Data Protection Bill, 2018.

Economy: Protecting Privacy, Empowering Indians.11 The Bill, like the EU's General Data Protection Regulation, establishes a set of rights, but does not appear to include rights protecting against automated decision-making.12 According to an analysis by the Centre for Internet and Society, “the Bill creates a framework to address harms arising out of AI, but does not empower the individual to decide how their data is processed and remains silent on the issue of ‘black box’ algorithms”, and is “focused on placing the responsibility on companies to prevent” such harms.13

Privacy judgment:

The extent to which privacy can be respected or eroded by AI technologies also
depends on the legal framework within which they function. In August 2017, the
Supreme Court of India unanimously upheld the right to privacy as a fundamental
right under the Constitution of India.14 This historic judgment recognized
informational privacy as part of this fundamental right, and underscored the
dangers that emanate from the ability of machines to infer information and analyse
data in new, sophisticated ways.

11 A Free and Fair Digital Economy: Protecting Privacy, Empowering Indians [2018].

12 Amber Sinha & Elonnai Hickok, The Srikrishna Committee Data Protection Bill and Artificial Intelligence in India, CIS [Sept. 3, 2018].

13 Ibid
14 K.S. Puttaswamy v. Union of India, [2017] 10 SCC 1.

The judgment also highlighted the urgent need for a ‘robust regime for data
protection’ in the country, emphasizing the close relationship between data
protection, autonomy and identity. In particular, the Court observed,
“Informational privacy is a facet of the right to privacy. The dangers to privacy in
an age of information can originate not only from the state but from non-state
actors as well. We commend to the Union Government the need to examine and
put into place a robust regime for data protection. The creation of such a regime
requires a careful and sensitive balance between individual interests and legitimate
concerns of the state.”15

India’s long-awaited privacy legislation – the Personal Data Protection Bill, 2019 –
is currently being deliberated by members of a Joint Committee of the Houses of
Parliament. The Committee has its work cut out for it – the PDP Bill, while
progressive on many fronts, suffers from several lacunae and needs to be future-proofed. One aspect that the PDP Bill must account for is whether it is sufficient
for an era of ‘Artificial Intelligence’ and ‘Big Data’, where personal data is used to
predict and control the behaviour of individuals.16
4.2 Applications of Robots which might pose a risk to people's privacy:

We have given these autonomous robotic devices access to our day-to-day lives: we rely on them to clean our houses, to book our appointments [as with the latest Google Home edition], and to manage our housework and home security. Intrinsically, robots are very human-like, intuitively social and collaborative, which makes them more interactive with and pivotal to their users, and engaging to the community at large.

15 Ibid

With a substantial lowering of prices and the entrance of a plethora of players into the robotics industry, the market for domestic-use and home robots, also known as service robots, is growing swiftly. A domestic or home robot might be a cleaning gadget or a simple AI-driven thermostat, fitted with a cluster of sensors: infrared cameras, sonar or laser rangefinders, smoke and odor detectors, accelerometers, and a global positioning system [“GPS”] framework. These devices now come installed with monitoring and recording software. That the software recognizes our usage and demand patterns is not the concern; the concern is that it also stores information about what time we are home, what time we sleep, and so on. Most varieties of home robot connect wirelessly to the home network, some to transmit pictures and audio clips over the Web in real time, others to update their existing programming. The well-known WowWee Rovio, to take an instance, is a commercially available robot that provides security and entertainment. It can be remotely controlled and accessed by means of the Internet and broadcasts both audio and video to an online control panel.

The increasing number of smartphones and networked sensors in the home raises real concerns for citizen privacy. At the very least, the government can procure a warrant to confiscate recorded data, physically sequester the robot, or gain live access to its recordings and other information. Just as law enforcement can already compel in-car navigation service providers to switch on a microphone in one's car, or compel phone companies to reveal the location of a mobile phone, so might the government tap into the data stream from a domestic robot, or even move the robot toward the room or object it wishes to watch.

Nowadays we rely on these robots to clean our houses, schedule our days, and monitor our infants; in effect, they act as personal staff doing our chores, and in exchange for these services we give them unrestricted access to our houses and lives. The self-directed cleaning devices we use work on mapping technology: over time they learn and map out the entire building, every inch of it, to increase their working efficiency. The risk is that these robots are open to anyone who can hack into them. Once in, the hacker has access not only to the floor map of the building or house but also to information about when no one is home, since these devices generally monitor the presence of human beings as well; and the hacker can observe the house through the robot's various sensors. The same applies to the monitoring devices for our infants: we install cameras and monitors to watch for any risk to a child during the day or when we are not physically present. These wireless home-monitoring devices are more prone to hacking attacks than most other systems, and a breach puts the entire family at risk by giving the hacker unlimited access to the house. Home security cameras were originally meant to let the owner keep an eye on the house, ensuring enhanced safety for its dwellers and security for the property; but if they are accessed without authorization by a third party, that threshold sense of ease and relief can plummet considerably.
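The mapping behaviour described above can be pictured as an occupancy grid that the robot fills in as it drives: every visited cell becomes known floor space, and the finished grid is exactly the sensitive floor plan a hacker would obtain. A minimal sketch of the idea follows; the grid size and path are made up, and no vendor's actual mapping algorithm is implied.

```python
# Toy occupancy-grid map: 0 = unknown, 1 = visited free space.
GRID_W, GRID_H = 5, 4

def build_map(path):
    """Mark every (x, y) cell the robot visits; unvisited cells stay unknown."""
    grid = [[0] * GRID_W for _ in range(GRID_H)]
    for x, y in path:
        grid[y][x] = 1
    return grid

def coverage(grid):
    """Fraction of the floor the robot has mapped so far."""
    return sum(cell for row in grid for cell in row) / (GRID_W * GRID_H)

# A short cleaning run sweeping the two bottom rows.
path = [(x, y) for y in (0, 1) for x in range(GRID_W)]
floor_map = build_map(path)
print(f"coverage: {coverage(floor_map):.0%}")  # half of the 5x4 grid is mapped
```

Over repeated runs the grid converges to a complete floor plan, and real devices additionally record when rooms are occupied, which is why exfiltration of this data is so sensitive.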

It can get out of hand, and for the very worse. Today's age and time are filled with stories of security cameras being hacked. Recently, a California family experienced a troubling incident when a hacker, through their Nest camera, warned them of a false North Korean missile attack on the United States.17 In another

17 Christopher Carbone, “Hacked Nest camera warned of North Korean ‘Missile Attack’, family says”, Fox News, January 23, 2019, as accessed on 21.07.2020.

incident, a hacker took control of a Nest camera and told the owner's Amazon Alexa device to begin playing Justin Bieber's version of “Despacito” while the owner, bewildered, tried to figure out what was going on.18 The hacks prompted Nest to issue a statement telling customers to change their passwords.19 While the most recent incidents involved Nest cameras, any device can be infiltrated by hackers. A Calgary, Alberta-based group claimed they had hacked into as many as ten security cameras and communicated with the people on the other end of the cameras.20


18 Kinsey Schofield, “Hacker takes over Nest camera, asks Alexa to play ‘Despacito’”, WGNO-ABC News, February 07, 2019, as accessed on 21.07.2020.
19 Patrick Hearn, “Amid security breaches, Nest urges customers to use stronger passwords”, Digital Trends, February 06, 2019, as accessed on 21.07.2020.
20 Ibid.
21 “IP Camera Hacking Attempts are Rising”, CST Group Inc., July 20, 2019, as accessed on 21.07.2020.

Well, this is not where it stops. Recently, Trend Micro published statistics that just about everyone should find disturbing. According to its latest figures, the security company has blocked more than five million cyber-attacks against IP cameras in just the past five months. Worse, IP cameras don't tend to have great security in place to begin with, making it relatively easy for hackers to control them remotely.22

Moving on from home security devices, we now have robots that live with you, understand emotions, and respond like humans. Pepper, as the humanoid robot is affectionately known, might actually scare you once we explain how it works. The robot's emotions function much the same way as a human's. Pepper is able to “feel” things independently by processing information from its cameras, touch sensors, accelerometer, and other sensors, just as a human body processes emotion through interactions with its five senses.23 “Pepper's emotions are influenced by people's facial expressions and words, as well as his surroundings,” a statement from the company says. “For example, Pepper is at ease when he is around people he knows, happy when he is praised and gets scared when the lights go down.”24 Most unsettling of all, however, is the fact that Pepper can adapt and learn from human behavior, and hence can have an autonomous intelligence of its own which may conflict with the decisions its human master takes. We are not sure, at this point in time, how to react or respond to this rather chilling aspect of innovation, and only time can tell whether our high dependence on AI and humanoids is a boon or a bane.
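The mechanism quoted above, sensor events nudging an internal emotional state, can be caricatured in a few lines. This is a toy sketch only: the event names, weights, and thresholds are invented for illustration and are not SoftBank's actual Pepper model.

```python
# Toy emotion model: each sensor event shifts a single bounded mood score.
EVENT_WEIGHTS = {
    "familiar_face": +2,  # "at ease when he is around people he knows"
    "praise": +3,         # "happy when he is praised"
    "lights_down": -4,    # "gets scared when the lights go down"
}

def update_mood(mood, events):
    """Fold a stream of sensor events into a mood score clamped to [-10, 10]."""
    for event in events:
        mood += EVENT_WEIGHTS.get(event, 0)  # unrecognized events leave mood unchanged
    return max(-10, min(10, mood))

def mood_label(mood):
    """Map the numeric mood onto a coarse emotional state."""
    if mood >= 3:
        return "happy"
    if mood <= -3:
        return "scared"
    return "neutral"

print(mood_label(update_mood(0, ["familiar_face", "praise"])))  # friendly events
print(mood_label(update_mood(0, ["lights_down"])))              # the lights go down
```

The point for privacy is that even this crude model requires a continuous feed of face, voice, and environment data, and the learned weights themselves become a behavioural profile of the household.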

22 “Increase in IP Camera Hacking Attacks”, DWP Information Architects, July 20, 2019, as accessed on 21.07.2020.
23 Michael Hiscock, “The robot has real feelings and we’re terrified”, The Loop, June 23, 2020, as accessed on 21.07.2020.
24 Ibid.

According to a recent report by Credence Research, the global medical
robotics market was valued at $7.24 billion in 2015 and is expected to grow to
$20 billion by 2023.25 A wide range of robots is being developed to serve in a
variety of roles within the medical environment. Robots specializing in human
treatment include surgical robots and rehabilitation robots. The field of assistive
and therapeutic robotic devices is also expanding rapidly. These include robots that
help patients rehabilitate from serious conditions like strokes, empathic robots that
assist in the care of older or physically/mentally challenged individuals, and
industrial robots that take on a variety of routine tasks, such as sterilizing rooms
and delivering medical supplies and equipment, including medications.26 Telepresence refers to the technologies that permit an individual to feel as if they were at another location without being there. Robots are also utilized in medicine to execute operations that are normally performed manually by human beings.27

These operations can be highly specialized, facilitating the diagnosis and treatment of patients. Though medical robotics may still be in its infancy, the use of medical robots for numerous operations can increase the quality of medical treatment.28 The use of telepresence in medical operations has eliminated the barriers of distance, making professional expertise readily available where it is needed. Robotics and telepresence in the medical field minimize

25 “Medical Robotics Market To Reach Over US$ 20 Bn By 2023”, Credence Research, as accessed on 21.07.2020.
26 Mark Crawford, “Top 6 Robotic Applications in Medicine”, The American Society of Mechanical Engineers, September 14, 2016, as accessed on 21.07.2020.
27 “Robotics in the Medical Field”, Bright Hub Engineering, November 11, 2010, as accessed on 21.07.2020.
28 Ibid.

individual oversight and bring specialized knowledge to inaccessible regions without the need for physical travel.29

The use of robotic assistance in surgery has expanded exponentially since it was
first approved in 2000. It is estimated that, worldwide, more than 570,000
procedures were performed with the da Vinci robotic surgical system in 2014, with
this figure growing almost 10% each year. Robotic-assisted surgery [RAS] has
found its way into almost every surgical subspecialty and now has approved uses
in urology, gynecology, cardiothoracic surgery, general surgery, and
otolaryngology. RAS is most commonly used in urology and gynecology; more
than 75% of robotic procedures performed are within these two specialties.
Robotic surgical systems have the potential to improve surgical techniques and
outcomes, but they also create a unique set of risks and patient safety concerns30.

Automation in any industry poses risks, and the same is true here: for all the facility and ease that medical robotics provide, they remain prone to the cyber-risks of hijacking and malfunctioning. “During the procedure, there were mechanical
problems as the robotic arms were not responding as expected. The urologist
persisted in using robotic technology and ultimately was able to complete the
procedure. The operation took twice as long as expected, but the urologist felt it
had been successful. Postoperatively, the patient developed serious bleeding
requiring multiple blood transfusions. He was taken back to the operating room
where it was noted the inferior epigastric artery [a key artery in the pelvis] had
been damaged during the original procedure. The injury was repaired but this
second operation was prolonged and complicated due to the degree of bleeding.

29 A. Raju, “Robotics to play vital role in healthcare in coming days: Experts”, Pharmabiz, December 30, 2017, as accessed on 21.07.2020.
30 Tara Kirkpatrick & Chad LeGrange, “Robotic Surgery: Risks vs. Rewards”, February 2016, as accessed on 21.07.2020.

The patient ultimately required several additional surgeries and a prolonged hospital stay.”31 There is no doubt that automation and robotics provide accuracy, but are we sure that we want to place someone's life at risk for our own ease?

The cinematic concept of combat robots has been around since the turn of the last century. But they aren't just science fiction anymore. The military has deployed thousands of them for use on battlefields. The U.S. is already using unmanned
aerial vehicles to conduct surveillance and drop missiles on suspected terrorists
overseas in places like Pakistan and Yemen. The efficacy and morality of these and
other operations are controversial, but supporters say drones are less costly,
minimize collateral damage, and don't require putting troops at risk. That's partly
because humans can operate these machines – often in far-flung, dangerous places
– from the safety and comfort of a domestic operations center. While drones do
their work from high above, other robots are operating on the ground in battlefields
worldwide. Forces relied on bomb-squad robots to inspect and defuse possible
explosive devices during military operations around the world. The remote-
controlled machines moved via tank tread and featured infrared vision, multiple
cameras, floodlights, and mechanical arms to spot bombs and dispose of them, all
while human operators stayed a safe distance away.

Growing military investment in robotic technology by some 40 other nations indicates that robots are rapidly becoming an important piece of tomorrow's
military arsenal. The U.S. fields more than 5,300 unmanned aerial vehicles and
more than 12,000 ground robots to reduce risk to soldiers, gather intelligence and
strike stealthily at remote enemies. Israel and South Korea use armed robots to
patrol their borders. Operators in cubicles in the U.S. routinely fly drone aircraft
via remote control, monitoring, and attacking potential targets.

31 Ibid.

Every state wants to save the lives of its troops, the soldiers on its borders who stand ready to lay down their lives for their Nation; and why not save a life when we can? That is exactly why automated armed robots are being deployed into service by nations around the world. Yet the increased use of AI and robots in the defense sector poses its own risks to national security and to the lives of troops. More than that, it gives the controlling, technologically developed nations, with their access to more advanced resources, a much more influential as well as terrorizing power over others.

“Robotic warfare is open-source warfare. The technology is highly commercialized and affordable. With $1,000, an enemy could build a drone that has roughly the same capabilities as the Raven drones used by soldiers in Iraq.” The Raven is a small drone used as a scout by ground troops. During the Israeli invasion of Lebanon in 2006, Hezbollah fighters fielded at least four drones. The combination of robotics and terrorism will empower individuals at the expense of states and make it easier to launch terrorist attacks. An Air Force study found that drones were the optimal platform for deploying weapons of mass destruction, and one against which we have no effective defenses.

“In a May 2009 opinion piece in The New York Times, David Kilcullen [a former adviser to General David Petraeus] and Andrew Exum [an Army officer in Iraq and Afghanistan and now a fellow at the Center for a New American Security]
observed that many targets are not positively identified and airstrikes not always
precise. Thus, some attacks kill more civilians than terrorists. Coverage via the
Internet and traditional media brings the horror of civilian deaths home to everyone

in the region, radicalizing the population and driving civilians into the arms of
local militants.”32

Social media plays a vital role in our lives today; as much as we try to deny it, we rely on smartphones more than we rely on other humans. If we are sad, we take it to social media; if bored, we take it to social media; if angry, we take it out via social media. It is like a living organism that we address and consult when the need arises, and social media responds. It can create an outrage around the world in a matter of hours; we get responses, be it support or criticism, from around the world, and all of this for free. As the famous saying goes, “if you are not paying for the product, you are the product”, and it applies very rightly to our situation: if we are not paying for any of these services, then how are they accessible to us? It is our personal information and our privacy that we are selling; or rather, the companies and social media websites store, sell, and use our information as they please, disregarding all the privacy terms that we sign up to. Social media holds the power to alter minds and alter our thinking. They know what we like, they know what we look up, and they know who our loved ones are, and by “they” I mean everyone. The information is out there on the web servers of social media: where you had your last meal, what you had for it, and with whom. And we are to blame for this as well, for we are the ones who uploaded our information and let our minds be swayed by the deceptive information broadcast over the internet. With the increasing number of users on social media and the internet, and an ever-increasing number of smartphones on the market, more than half of the world is open to cyber-attacks and phishing threats; their information is only a few sets of hacking

32 Alan S. Brown, “Risks of Robotic Warfare”, The American Society of Mechanical Engineers, August 12, 2011, as accessed on 21.07.2020.

codes away from being out in the open, and every second person is adding themselves to social media's cart.

With most of the devices we purchase or give access to these days, we also enter into a confidentiality agreement with the service provider covering any collection of our personal information, now or in the future: information the device has collected, stored, or sent to the provider's servers to be analyzed so as to make the service experience better. Our concern here is not the collection of personal information at this intricate level but the usage of that information: how it will be used, and who will have access to it. The information gathered can range from the location of our house to our eating habits to our sleeping patterns, depending on the device or software we use. The agreements we enter into contain vague conditions and clauses which give the service provider the latitude to exploit the information gathered as it likes. With this, our concern also extends to how secure the networks used to save this information are.

How concrete is the Personal Data Processing Policy of the service provider? How
much access can the government gain?

4.3 Privacy laws around the world:
Over 2.5 quintillion bytes of data are created each day. Much of this data consists
of information that would allow people to be personally and individually identified
[or, personal information]. Complex problems arise when personal
information is collected and processed by powerful new technologies and then
stored and disclosed online. It is common knowledge that information is power,
and when a third party collects it without authorization, they can very realistically
hold power over us, and that leaves us at risk. An important aspect of this is that
people often consent to their information being stored and shared. The user
agreements we don't read when downloading new apps or creating online accounts
are the most common way of handing out our personal information on a platter.
But Woodrow Hartzog remarks that social media apps, surveillance technologies,
and the internet of things are built in ways that make it difficult to guard personal
information, and the law says this is OK because it is up to users to protect
themselves33. As right and apt as it sounds, it is equally scary, because out in the
open there is very little knowledge about how someone's personal information can
be used to create his duplicate. One only needs access to a single government ID
account or some office data to map a profile and create an entirely identical
human being, just with a different face.

Privacy is a fundamental right, essential to autonomy and the protection of
human dignity, and it serves as the foundation upon which many other human
rights are built. Privacy enables us to create barriers and manage boundaries to protect

33 Woodrow Hartzog, “Privacy’s Blueprint – The Battle to Control the Design of New Technologies”, Harvard
University Press, 2018

ourselves from unwarranted interference in our lives, which allows us to negotiate
who we are and how we want to interact with the world around us. Privacy helps
us establish boundaries to limit who has access to our bodies, places, and things, as
well as our communications and our information34.

The rules that protect privacy give us the ability to assert our rights in the face of
significant power imbalances. As a result, privacy is an essential way to protect
ourselves and society against arbitrary and unjustified use of power, by reducing
what can be known about us and done to us, while protecting us from others who
may wish to exert control.

Innovation has continuously been interwoven with this right. For instance, our
capabilities to accord protection to privacy are more advanced today than
ever before; however, the capabilities that presently exist for surveillance and
reconnaissance are unprecedented as well. In absolute terms, privacy is a
qualified, fundamental human right. The right to privacy is enunciated in all of the
major international and regional human rights instruments, including:

United Nations Declaration of Human Rights [UDHR] 1948, Article 12: “No one
shall be subjected to arbitrary interference with his privacy, family, home or
correspondence, nor to attacks upon his honor and reputation. Everyone has the
right to the protection of the law against such interference or attacks.35”

International Covenant on Civil and Political Rights [ICCPR] 1966, Article 17: “1.
No one shall be subjected to arbitrary or unlawful interference with his privacy,
family, home or correspondence, nor to unlawful attacks on his honor or

34 “What is Privacy?”, Privacy International, October 23, 2017,
[], as accessed on 21.07.2020
35 Article 12, The Universal Declaration of Human Rights, 1948, [
human-rights/#:~:text=Article%2012.,against%20such%20interference%20or%20attacks.], as accessed on

reputation. 2. Everyone has the right to the protection of the law against such
interference or attacks.36”

The right to privacy is also included in:

• Article 14 of the United Nations Convention on Migrant Workers37;
• Article 16 of the UN Convention on the Rights of the Child38;
• Article 10 of the African Charter on the Rights and Welfare of the Child39;
• Article 4 of the African Union Principles on Freedom of Expression
[the right of access to information]40;
• Article 11 of the American Convention on Human Rights41;
• Article 5 of the American Declaration of the Rights and Duties of Man42;
• Articles 16 and 21 of the Arab Charter on Human Rights43;
• Article 21 of the ASEAN Human Rights Declaration44; and
• Article 8 of the European Convention on Human Rights45.

36 Article 17, The International Covenant on Civil and Political Rights,
[], as accessed on 21.07.2020
37 Article 14, International Convention on the Protection of the Rights of All Migrant Workers and Members of their
Families, [], as accessed on 21.07.2020
38 Article 16, Convention on the Rights of the Child,
[], as accessed on 21.07.2020
39 Article 10, The African Charter on the Rights and Welfare of the Child,
[], as accessed on 21.07.2020
40 Article 4, Declaration of Principles on Freedom of Expression and Access to Information in Africa,
n_in_africa_eng.pdf], as accessed on 22.07.2020
41 Article 11, American Convention on Human Rights – The Pact of San Jose, Costa Rica,
42 Article 5, American Declaration of the Rights and Duties of Man,
43 Article 16 & Article 21, Arab Charter of Human Rights, []
44 Article 21, ASEAN Human Rights Declaration,

Over 130 countries have constitutional provisions safeguarding or protecting
privacy, and over 100 nations have some form of data protection and privacy
legislation in place46.

It is worth noting that an important element of the right to privacy is the right to the
protection of personal data. While the right to data protection can be derived from
the generally guaranteed right to privacy, some international and regional
instruments also stipulate a more specific right to the protection of personal data, including:

1. the OECD Guidelines on the Protection of Privacy and Transborder
Flows of Personal Data, 198047,

2. the Council of Europe Convention for the Protection of Individuals with
regard to Automatic Processing of Personal Data [Convention 108]48,
several European Union Directives and corresponding Regulations49, and
the European Union Charter of Fundamental Rights50,

45 Article 8, European Convention on Human Rights, []
46 “Data Protection and Privacy Legislation Worldwide”, United Nations Conference on Trade and Development,
48 “Convention 108”, []
49 To name a few of such Directives and Regulations:

a. Regulation 2016/679: on the protection of Natural Persons with regard to the processing of Personal Data
and on the free movement of such data, [

b. Regulation 2018/1725: on the protection of natural persons with regard to the processing of personal
data by the Union institutions, bodies, offices and agencies and on the free movement of such data,

c. Directive 2016/680: on the protection of natural persons with regard to the processing of personal data
by competent authorities for the purposes of the prevention, investigation, detection or prosecution of
criminal offences or the execution of criminal penalties, and on the free movement of such data,


3. the Asia-Pacific Economic Cooperation [APEC] Privacy Framework,
201551, and

4. the Supplementary Act on Personal Data Protection of the Economic
Community of West African States, 201052.

The European Union's new data protection law, the General Data
Protection Regulation ["GDPR"], promises to change the way the
Internet is used and to limit how personal information is regulated, collected,
and controlled. Under the GDPR, neither pages of fine print nor unilateral
conditions coerced upon or imposed on users will suffice.
Instead, companies must be clear and concise about their collection and use of
personal data like full name, home address, location data, IP address, or the
identifier that tracks web and app use on smartphones. Companies have to spell out
why the data is being collected and whether it will be used to create profiles of
people’s actions and habits. Moreover, consumers will gain the right to access data
companies store about them, the right to correct inaccurate information, and the
right to limit the use of decisions made by algorithms, among others53.
The law protects individuals in the 28 member countries of the European Union,
even if the data is processed elsewhere. That means GDPR will apply to publishers
like WIRED; banks; universities; much of the Fortune 500; the alphabet soup of

52 Thirty-Seventh Session of the Authority of Heads of State and Government, Supplementary Act A/S.A . 1/01/10
on Personal Data Protection on ECOWAS,
53 Nitasha Tiku, “Europe’s New Privacy Law Will Change the Web, and More”, [2018],
[], as accessed on

ad-tech companies that track you across the web, devices, and apps; and Silicon
Valley tech giants54.
As an example of the law's reach, the European Commission, the EU's executive
arm, says on its website that a social network will have to comply with a user
request to delete photos the user posted as a minor, and inform search engines
and other websites that used the photos that the images should be removed. The
commission also says a car-sharing service may request a user's name, address,
credit card number, and potentially whether the person has a disability, but can't
require a user to share their race. [Under the GDPR, stricter conditions apply to the
collection of "sensitive data," such as race, religion, political affiliation, and sexual
orientation55.]

54 Ibid.
55 Ibid.

The GDPR has already driven, and contributed to, changes in data collection,
regulation, and control. Adhering to the principles incorporated in the GDPR,
Google declared that it would halt mining emails in Gmail to personalize
advertisements56. [The company, however, said that this step was taken independently
of the GDPR, so as to standardize the various versions of Gmail.]
Google has also overhauled its privacy dashboard, first launched in 2009, to make
it more user-friendly. Following suit, Facebook has declared its privacy dashboard
as well57. Although the law applies exclusively to Europe, the companies are
making changes to their servers all over the world, since this involves fewer
complications than maintaining separate and diverse frameworks and ensures
simpler compliance. Privacy is a right for all, and companies should guard
people's privacy as a doctor cares for a dying patient.

4.4 Privacy laws in India:
Often confused with trade secrets and confidentiality, privacy refers to the use and
disclosure of personal information and is only applicable to information specific to
individuals. Since personal information is a manifestation of the personality of an
individual, the Indian courts, including the Supreme Court of India, have thus
recognized the right to privacy as an integral part of the right to life and
personal liberty. In the recent past, in the landmark case of Justice K S Puttaswamy
[Retd.] & Anr. vs. Union of India and Ors.58, the Constitution Bench of the Hon'ble
Supreme Court had held that the Right to Privacy is very much a fundamental
right, subject to certain reasonable restrictions.

56 Ginny Marvin, “Google will stop mining Gmail content for personalizing ads”, Martech Today,
57 Nitasha Tiku, “Facebook Will Make It Easier For You to Control Your Personal Data”, 2018,
58 Writ Petition [Civil] No. 494 0f 2012, [2017] 10 SCC 1

India presently does not have any express legislation governing data protection or
privacy. However, the relevant laws in India dealing with data protection are the
Information Technology Act, 2000 and the [Indian] Contract Act, 1872. A codified
law on the subject of data protection is likely to be introduced in India in the near future.

The [Indian] Information Technology Act, 2000 [hereinafter, the “I.T. Act”] deals
with the issues relating to the payment of compensation [Civil] and punishment
[Criminal] in case of wrongful disclosure and misuse of personal data and violation
of contractual terms in respect of personal data.

Under Section 43A of the [Indian] Information Technology Act, 200059, where a
body corporate possessing, dealing in or handling any sensitive personal data or
information is negligent in implementing and maintaining reasonable security
practices, resulting in wrongful loss or wrongful gain to any person, such body
corporate may be held liable to pay damages to the person so affected. It is
important to note that there is no upper limit specified for the compensation that
can be claimed by the affected party in such circumstances.

The Government has notified the Information Technology [Reasonable Security
Practices and Procedures and Sensitive Personal Data or Information] Rules, 2011
[hereinafter referred to as the "2011 Rules"]60. The Rules deal only with the
protection of "sensitive personal data or information of a person", which includes
personal information relating to:

59 Section 43A, I.T. Act, 2000: Compensation for failure to protect data
Where a body corporate, possessing, dealing or handling any sensitive personal data or information in a
computer resource which it owns, controls or operates, is negligent in implementing and maintaining
reasonable security practices and procedures and thereby causes wrongful loss or wrongful gain to any
person, such body corporate shall be liable to pay damages by way of compensation to the person so affected.


• Passwords;
• Financial information, such as bank account or credit card or debit card or other payment instrument details;
• Physical, physiological and mental health condition;
• Sexual orientation;
• Medical records and history;
• Biometric information.

However, it is to be noted that Section 69 of the Information Technology Act,
200061 stands as an exception to the general rule of maintenance of the privacy
and secrecy of personal data and information.

With regard to offences related to computer systems, Section 66 of the
Information Technology Act, 200062 provides that if any person, dishonestly or

61 Section 69, I.T. Act 2000: Power to issue directions for interception or monitoring or decryption of any
information through any computer resource

[1] Where the Central Government or a State Government or any of its officer specially authorised by the
Central Government or the State Government, as the case may be, in this behalf may, if satisfied that it is
necessary or expedient so to do in the interest of the sovereignty or integrity of India, defence of India,
security of the State, friendly relations with foreign States or public order or for preventing incitement to
the commission of any cognizable offence relating to above or for investigation of any offence, it may
subject to the provisions of sub-section [2], for reasons to be recorded in writing, by order, direct any
agency of the appropriate Government to intercept, monitor or decrypt or cause to be intercepted or
monitored or decrypted any information generated, transmitted, received or stored in any computer
[2] The procedure and safeguards subject to which such interception or monitoring or decryption may be
carried out, shall be such as may be prescribed.
[3] The subscriber or intermediary or any person in-charge of the computer resource shall, when called
upon by any agency referred to in sub-section [1], extend all facilities and technical assistance to-

[a] provide access to or secure access to the computer resource generating, transmitting,
receiving, or storing such information; or
[b] intercept, monitor, or decrypt the information, as the case may be; or
[c] provide information stored in computer resource.
[4] The subscriber or intermediary or any person who fails to assist the agency referred to in sub-section
[3] shall be punished with imprisonment for a term which may extend to seven years and shall also be
liable to fine.
62 Section 66, I.T. Act, 2000: Computer related offences

fraudulently does any act referred to in section 43, he shall be punishable with
imprisonment for a term which may extend to three years, or with fine which may
extend to Rs 5,00,000 [approx. US$ 8,000], or with both.

Penalty for Breach of Confidentiality and Privacy:

Section 72 of the Information Technology Act, 200063 provides the penalty for
breach of confidentiality and privacy. The Section provides that any person who, in
pursuance of any of the powers conferred under the IT Act, Rules or Regulations
made thereunder, has secured access to any electronic record, book, register,
correspondence, information, document or other material without the consent of
the person concerned, and discloses such material to any other person, shall be
punishable with imprisonment for a term which may extend to two years, or with
fine which may extend to Rs 1,00,000 [approx. US$ 3,000], or with both.

Current Data Privacy laws in India:

When the Information Technology Act, 2000 first came into force on October 17,
2000, it lacked provisions for the protection and the procedure to be followed to
ensure the safety and security of sensitive personal information of an individual.
This led to several amendments and bills being passed, and finally the
Information Technology [Amendment] Act, 2008 inserted Section 43A into the I.T.

If any person, dishonestly or fraudulently, does any act referred to in section 43, he shall be punishable
with imprisonment for a term which may extend to three years or with fine which may extend to five lakh
rupees or with both.
63 Section 72, I.T. Act 2000: Penalty for Breach of confidentiality and privacy
Save as otherwise provided in this Act or any other law for the time being in force, if any person who, in
pursuance of any of the powers conferred under this Act, rules or regulations made thereunder, has
secured access to any electronic record, book, register, correspondence, information, document or other
material without the consent of the person concerned discloses such electronic record, book, register,
correspondence, information, document or other material to any other person shall be punished with
imprisonment for a term which may extend to two years, or with fine which may extend to one lakh
rupees, or with both.

Act64, under which the Information Technology [Reasonable security practices
and procedures and sensitive personal data or information] Rules, 2011 were
notified. The key features of the 2011 Rules are:

These 2011 Rules apply only to body corporates and persons located in India.
Section 43A of the I.T. Act explicitly provides that whenever a corporate body
possesses or deals with any sensitive personal data or information, and is negligent
in maintaining reasonable security to protect such data or information, which
thereby causes wrongful loss or wrongful gain to any person, then such body
corporate shall be liable to pay damages to the person[s] so affected.

Under the said Section 43A of the I.T. Act, a list of items has been provided which
are to be treated as “sensitive personal data” which include passwords, biometric
information, sexual orientation, medical records and history, credit/ debit card
information, etc. but any information which is freely available or accessible in the
public domain is not considered to be sensitive personal data.

• Any body corporate seeking such sensitive personal data must draft a
privacy policy that has to be published on the website of the body
corporate, containing details of the information being collected and
the purpose for its use.

• The body corporate must establish reasonable security practices for
maintenance of confidentiality of such data, obtain consent from
persons for collecting such sensitive personal data for a lawful and
necessary purpose.

• The purpose must be clear, the information used only as per the consent
given, and the data retained only as long as needed.

64 Supra note 43

• The 2011 Rules also provide that a Grievance Officer shall be
responsible for addressing grievances of information providers within
one month. Body corporates must have the reasonable security
practices and procedures they implement audited by an auditor at
least once a year, or as and when the body corporate or a person on
its behalf undertakes significant upgradation of its processes and
computer resources.

• The punishment under the IT Act for disclosure of information in
breach of a lawful contract may extend to imprisonment for a term
not exceeding three years, or a fine which may extend to Indian
Rupees 5 million, or both65.

Thus, as can be seen, Section 43A of the IT Act and the 2011 Rules provide for
many provisions similar to those under the GDPR, though applicable only to
residents of India. This does mean that most companies already have a privacy
policy in place, which can now be further developed and extended to encompass
the stricter regulations of the GDPR, so that they do not face penalties for
breaches under the GDPR.

Aadhaar cards and right to privacy:

The Aadhaar system [a nationwide biometric identification system] is currently
being challenged in India, with the key dispute being whether the norms for the
compilation of demographic and biometric data by the Government violate the
right to privacy.

65 Dr. Gubbi Subba Rao & Aliza Abdin, “Data Privacy Laws in India”, Lawyered, 2019,
[], as
accessed on 22.07.2020

It is mandatory for a citizen of India to apply for the Aadhaar card, issued by the
Government of India, and an individual must furnish his/her data and information
at the time of making the application. The Aadhaar scheme, first introduced as a
means of targeted distribution of subsidies, is today being implemented for a
variety of purposes, including the fight against black money, transaction
authentication, and 'know your customer' requirements for banks and telecom
companies. Aspects of the Aadhaar Act, such as [i] the security of the Aadhaar
system, [ii] the inability of the individual to file complaints [for a violation under
the Aadhaar Act] relating to theft or misuse of their data, and [iii] the inability to
withdraw/delete one's data once registered with the UIDAI [the government
authority dealing with Aadhaar laws], are under scrutiny.

While the judgment declaring privacy a fundamental right of individuals, subject
to reasonable restrictions, was not directly intended to impact the use of the
Aadhaar card, it will now have a significant impact on the pending litigation. The
outcome of this pending litigation will significantly shape data protection
policies in India.

4.4.1 The Personal Data Protection Bill, 2019:

The Personal Data Protection Bill, 2019 [Hereinafter, “The Bill”] was introduced
in Lok Sabha by Mr. Ravi Shankar Prasad, Minister of Electronics and Information
Technology, on 11th December 2019. The Bill draws upon the report of the
Committee of Experts under the Chairmanship of Justice B.N. Srikrishna, 2018.66


1) Right to be Forgotten and Right to Erasure


Right to be Forgotten- Section 2067 of the Bill gives the data principal the
right to be forgotten, i.e., the right to restrict or prevent the continued
disclosure of his personal data where [i] the purpose of data collection has
been served; [ii] the user has withdrawn consent; or [iii] the data was
disclosed illegally.

To exercise this right, an application must be made by a data principal to an
Adjudicating Officer showing that his right in restricting the continued
disclosure of his personal data overrides the right to freedom of speech and
expression and the right to information of any other citizen.

Further, the section also provides for the aggrieved party to prefer an appeal
before the Appellate Tribunal.

Right to Correction and Erasure- §.1868 of the Bill gives the data principal
the right to correction and erasure with regard to the personal data being
processed. The data principal has the right to [i] the correction of
inaccurate or misleading personal data; [ii] the completion of incomplete
personal data; [iii] the updating of personal data that is out-of-date; and [iv]
the erasure of personal data which is no longer necessary for the purpose for
which it was processed.

Where the data fiduciary corrects, completes, updates or erases any personal
data, such data fiduciary shall also take necessary steps to notify all relevant
entities to whom such personal data may have been disclosed regarding the
relevant correction, completion, updation or erasure.

67 § 20, The Personal Data Protection Bill, No. 373 of 2019, INDIA CODE [2019]. [Hereinafter The Bill, 2019].
68 §. 18[1], The Bill, 2019.

Where the data fiduciary does not agree with the request of the data
principal, such data fiduciary shall provide the data principal with
adequate justification in writing for rejecting the application, and may take
reasonable steps to indicate, alongside the relevant personal data, that the
same is disputed by the data principal.

International Approach
Article 17 of the General Data Protection Regulation69 [Hereinafter,
"GDPR"] provides for the right to erasure [the 'right to be forgotten']. It
states that the data subject shall have the right to obtain from the controller
the erasure of personal data concerning him or her without undue delay, and
the controller shall have the obligation to erase personal data without undue
delay, if one of a number of conditions applies. The Article lays down
certain conditions under which the right applies; these conditions are:70

• The personal data is no longer necessary for the purpose an
organization originally collected or processed it.

• An organization is relying on an individual's consent as the lawful
basis for processing the data and that individual withdraws their consent.

• An organization is relying on legitimate interests as its justification for
processing an individual's data, the individual objects to this
processing, and there is no overriding legitimate interest for the
organization to continue with the processing.

69 Article 17, 2018 O.J. [L 127].
70 Everything you need to know about the “Right to be forgotten”, GDPR.EU,

• An organization is processing personal data for direct marketing
purposes and the individual objects to this processing.

• An organization processed an individual's personal data unlawfully.

• An organization must erase personal data in order to comply with a
legal ruling or obligation.

• An organization has processed a child's personal data to offer
their information society services.

However, there are certain reasons based on which an organization’s right to
process someone’s data can override their right to be forgotten. These
conditions are:71

• The data is being used to exercise the right of freedom of expression
and information.

• The data is being used to comply with a legal ruling or obligation.

• The data is being used to perform a task that is being carried out in the
public interest or when exercising an organization's official authority.

• The data being processed is necessary for public health purposes and
serves in the public interest.

• The data being processed is necessary to perform preventative or
occupational medicine. This only applies when the data is being
processed by a health professional who is subject to a legal obligation
of professional secrecy.

• The data represents important information that serves the public
interest, scientific research, historical research, or statistical purposes,
and where erasure of the data would be likely to impair or halt
progress towards the achievement that was the goal of the processing.

• The data is being used for the establishment of a legal defence or in
the exercise of other legal claims.

71 Everything you need to know about the "Right to be forgotten", GDPR.EU,

Google Spain S.L. v. Agencia Espanola de Proteccion de Datos72
The European Court of Justice dealt with the complaint brought by a
Spanish national Costeja González before the country’s Data Protection
Agency against La Vanguardia newspaper, Google Spain and Google Inc. in
March 2010. González wanted the newspaper to remove or alter the record
of his 1998 attachment and garnishment proceedings so that the information
would no longer be available through Internet search engines. He also
requested Google Inc. or its subsidiary, Google Spain, to remove or conceal
the data. González argued that the proceedings had been fully resolved for
several years and therefore they should no longer appear online. The Agency
dismissed the complaint against the newspaper on the ground that the
publication was legally justified pursuant to a government order. It,
however, upheld the complaint against Google, finding that Internet search
engines are also subject to data protection laws and must take necessary
steps to protect personal information.

On appeal, the National High Court of Spain stayed the proceedings and
presented a number of questions to the European Court of Justice concerning
the applicability of the EU Directive 95/46 [protection of personal data] to
the Internet search engines. The Court ruled that a search engine is regarded
as a “controller” with respect to “processing” of personal data through its act
of locating, indexing, storing, and disseminating such information.

72 Google Spain S.L. v. Agencia Espanola de Proteccion de Datos, Case C-131/12, European Court of Justice.

Additionally, it held that in order to guarantee the rights of privacy and the
protection of personal data, operators of search engines can be required to
remove personal information published by third party websites. But the data
subject’s right to make that request must be balanced against the interest of
the general public to access his or her personal information.73

Difference between Right to Be Forgotten and Right to Erasure

Art. 17 of the GDPR addresses the right of erasure and includes the right to
be forgotten within the ambit of the right to erasure.74 But the PDP Bill
discusses the right to be forgotten and the right to erasure as two different
rights. §.18 of the Bill addresses the right to erasure, a right which was not
provided for in the preceding draft of the 2019 Bill, namely the Personal
Data Protection Bill 2018 ["2018 Bill"]. The right to erasure is the more
protective remedy, as it demands that the data be erased, removed, updated
or corrected. The right to be forgotten, by contrast, merely restricts the data
fiduciary from disclosing such data; it does not demand that the data be
removed from the access of the data fiduciary.

§.18 of the 2019 Bill, unlike §.20, does not provide the data principal with
the right to appeal a data fiduciary's decision to reject such data principal's
request to correct, complete, update or erase personal data relating to the
data principal.75 However, unlike the right to erasure, a data principal's right

73 Google Spain SL v. Agencia Española de Protección de Datos, Global Freedom of Expression Columbia University,
74 Recital 66, The GDPR.
75 VINOD JOSEPH, Right Of Erasure - Under The Personal Data Protection Bill 2019, December, 2019,

to be forgotten can only be enforced by an order of the Adjudicating Officer
after an application has been made to him. Also, while the GDPR stipulates
that the controller must erase personal data without undue delay, the PDP
Bill gives no time frame for the data fiduciary to comply with the request of
the data principal.

Judicial Precedents in India

• The Karnataka High Court in 2017 in the matter of Sri Vasunathan v. The
Registrar General & Ors.76, recognized the ‘Right to be Forgotten’ and
safeguarded the same in sensitive cases involving women in general and
highly sensitive cases involving rape or affecting the modesty and
reputation of the person concerned, in particular.

• The importance of a right to be forgotten was also emphasised by the
Supreme Court in Puttaswamy77. The SC observed that it is privacy which
nurtures the ability to be forgotten, freeing the individual from the
shackles of unadvisable things which may have been done in the past.

• In Zulfiqar Ahman Khan v. Quintillion Business Media Pvt. Ltd.78, the
Delhi HC held that the ‘Right to be Forgotten’ and the ‘Right to be Left
Alone’ are the inherent facets of ‘Right to Privacy’.

• Further, the Report of the Justice B.N. Srikrishna Committee opined that
the recognition of the right to privacy envisages within its contours the
right to protect personal information on the Internet. Consequently, from this

76 Sri Vasunathan v. The Registrar General & Ors., W.P. No. 62038 of 2016 [GM-RES], Decided on January 23, 2017.
77 Justice KS Puttaswamy [Retd.] v. U.O.I. & Ors., [2017] 10 SCC 1 [India].
78 Zulfiqar Ahman Khan v. Quintillion Business Media Pvt. Ltd., 2019 [175] DRJ 660.

right, it follows that any individual may have the derivative right to
remove the 'shackles of unadvisable past things' on the Internet and
correct past actions.79

2) Kinds of Personal Data – The Bill regulates three types of data:

Personal Data [§ 3[28]]

Definition: Data about or relating to a natural person who is directly or
indirectly identifiable, having regard to any characteristic, trait,
attribute or any other feature of the identity of such natural person,
whether online or offline, or any combination of such features with any
other information, and shall include any inference drawn from such data for
the purpose of profiling.80

Processing: No personal data shall be processed by any person, except for
any specific, clear and lawful purpose.83 Personal data can be processed
without consent [i] if required by the State for providing benefits to the
individual; [ii] for legal proceedings; and [iii] to respond to a medical
emergency.84

Sensitive Personal Data [§ 3[36]]

Definition: Such personal data which may reveal, be related to, or
constitute [i] financial data; [ii] health data; [iii] official identifier;
[iv] sex life; [v] sexual orientation; [vi] biometric data; [vii] genetic
data; [viii] transgender status; [ix] intersex status; [x] caste or tribe;
[xi] religious or political belief or affiliation; or [xii] any other data
categorised as sensitive personal data under section 15.81

Processing: May be transferred outside India for the purpose of processing,
when explicit consent is given by the data principal for such transfer.85

Critical Personal Data [§ 33[2]]

Definition: Such personal data as may be notified by the Central Government
to be the critical personal data.82

Processing: Shall only be processed in India.86

79 Srikrishna Committee Report — [Data Protection], 2018.
80 § 3[28], The Bill, 2019.
81 § 3[36], The Bill, 2019.
82 § 33[2], The Bill, 2019.

Financial Data

There is a special categorisation of Financial Data in the Bill. It is not a
distinct category but rather a subset of Sensitive Personal Data. § 3[19]87
defines financial data as any number or other personal data used to identify
[i] an account opened by a data fiduciary, or [ii] a card or payment
instrument issued by a financial institution. The definition also includes
personal data regarding the relationship between a financial institution and
a data principal, including financial status and credit status. Other types
of data, such as account statements, data relating to other financial
products and investment information, are not included within the definition
of financial data.

83 § 4, The Bill, 2019.
84 § 12, The Bill, 2019.
85 § 34[1] The Bill, 2019.
86 § 33[2], The Bill, 2019.
87 § 3[19], The Bill, 2019.

The Bill classifies Financial Data as Sensitive Personal Data [SPD] and
proposes to exempt M&As from the consent requirement as a "reasonable
purpose" for processing under §.14[2]88. Notably, the original 2018 Bill did
not allow SPD to be processed under the reasonable-purpose exemption, but
the 2019 Bill makes no such distinction. Thus, unless further clarity is
given in the final draft, SPD, including financial data, will also fall
within the scope of the consent exemption for M&As.89

International Approach –
Article 4[1] of the GDPR defines personal data as any information relating to an
identified or identifiable natural person [‘data subject’]; an identifiable natural
person is one who can be identified, directly or indirectly, in particular by
reference to an identifier such as a name, an identification number, location
data, an online identifier or to one or more factors specific to the physical,
physiological, genetic, mental, economic, cultural or social identity of that
natural person.90
It also specifies that the GDPR only applies to personal data processed in one of
two ways:
• Personal data processed wholly or partly by automated means [or,

information in electronic form]; and

88 § 14[2], The bill, 2019.
89 ASHEETA REGIDI, Impact of the Data Protection Bill on fintech sector and aligning financial laws with it, See also: VIJAYANT SINGH,
Classification Of Data Under The PDP Bill: Implications For Startups, Ikigai Law,
90 Article 4[1], 2018 O.J. [L 127].

• Personal data processed in a non-automated manner which forms part of, or
is intended to form part of, a ‘filing system’ [or, written records in a manual
filing system].91

Article 9 of the GDPR governs the processing of special categories of
personal data.92 It prohibits such processing unless the data subject has
given explicit consent or another listed exception applies.

3) Data Fiduciary and Data Principal
Data Fiduciary – The Bill defines a data fiduciary as any person, including
the State, a company, any juristic entity or any individual who, alone or in
conjunction with others, determines the purpose and means of processing of
personal data.93

Chapter II of the bill discusses Obligations of Data Fiduciary which is covered
from §.4 - §.11. Data Fiduciary shall be responsible for complying with the
provisions of this Act in respect of any processing undertaken by it or on its
behalf.94 [Accountability of Data Fiduciary].

Obligations of Data Fiduciary

The Bill specifies certain obligations that are to be adhered to by Data Fiduciary
such as:

• Personal data may be processed only for a specific, clear and lawful
purpose.95

91 What is considered personal data under the EU GDPR?, GDPR.EU,
92 Article 9, 2018 O.J. [L 127].
93 § 3[13], The Bill, 2019.
94 § 10, The Bill, 2019.
95 § 4, The Bill, 2019.

• Collection of Personal Data shall be limited to such data that is necessary
for the purposes of processing.96

• Notice is required to be given to the individual/data principal for
collection and processing of personal data.97

• Personal data shall be retained only for the purpose for which it is
processed and shall be deleted at the end of the processing.98

• Consent is required to be taken from the data principal at the
commencement of the data processing.99

• Data Fiduciaries and Processing of Data of Children – A Data Fiduciary
must verify the age and obtain parental consent when processing
sensitive personal data of children.100 Chapter IV of the PDP Bill is
dedicated to the processing of personal data and sensitive personal data
of children.101 Under the PDP Bill, a 'child' is defined as a person less
than 18 years of age.102 All data fiduciaries are required to verify the
age of a child and obtain parental consent to process the personal data
of a child.103
The Bill places additional obligations on certain data fiduciaries who
operate commercial websites or online services directed at children or
process large volumes of children's personal data, which are classified
under regulations as 'guardian data fiduciaries'.104 Guardian data
fiduciaries are prohibited from profiling, tracking, behavioural

96 § 6, The Bill, 2019.
97 § 7, The Bill, 2019.
98 § 9[1], The Bill, 2019.
99 § 11[1], The Bill, 2019.
100 § 16, The Bill, 2019.
101 Id.
102 § 3[8], The Bill, 2019.
103 §.16[2], The Bill, 2019.
104 § 16[4], The Bill, 2019.

monitoring, or targeted advertising directed at children, or undertaking
other processing that may cause significant harm to children.105

Transparency and Accountability Measures for Data Fiduciary

Apart from these obligations, it is also an essential duty of the data fiduciaries to
undertake certain transparency and accountability measures. These measures are
detailed at length in Chapter VI of the bill [§.22-§.32]. They are as follows:

• Prepare a Privacy Policy [S.22]
• Take necessary steps to maintain transparency in processing personal data [S.23]
• Implement necessary security safeguards [S.24]
• Inform the Authority, by notice, of any breach of personal data [S.25]
• Undertake data protection impact assessment [S.27]*
• Audit its policies and the conduct of processing every year [S.29]
• Appoint a Data Protection Officer [S.30[1]]
• Institute grievance redressal mechanisms [S.32]

* This topic will be discussed at length later on.

Data Principal – The Bill defines Data Principal as the natural person to whom
the personal data relates.106 Under Chapter V,107 the Bill grants several rights to
the Data Principal, such as:

105 § 16[5], The Bill, 2019.
106 § 3[14], The Bill, 2019.
107 §§ 17, 18, 19, 20, The Bill, 2019.

Rights of Data Principal:

• Right to confirmation and access [S.17] – Right to ask the Data Fiduciary
for a copy of their stored data within a specified time limit.
• Right to correction and erasure [S.18] – Right to request the Data
Fiduciary to correct any piece of incorrect data.
• Right to data portability [S.19] – Right to ask the Data Fiduciary to port
their data to another Data Fiduciary.
• Right to be forgotten [S.20] – Right to stop their data from being
disclosed.
• Exercise of Rights [S.21] – These rights are exercised upon a request made
to the Data Fiduciary.


International Approach –
GDPR’s approach towards the Data Fiduciary operates through Controllers and
Processors.
Controller means the natural or legal person, public authority, agency or
other body which, alone or jointly with others, determines the purposes and
means of the processing of personal data.108 Controllers shoulder the highest
level of compliance responsibility – they must comply with, and
demonstrate compliance with, all the data protection principles as well as
the other GDPR requirements. They are also responsible for the compliance
of their processor[s].109

108 Article 4[7], 2018 O.J. [L 127].
109 Article 24, 2018 O.J. [L 127].

Processor is a natural or legal person, public authority, agency or other body
which processes personal data on behalf of the controller.110 Article 28 of
GDPR specifies the obligations that need to be fulfilled by the Processor.111

According to the EU’s GDPR, the person whose personal data is being
collected, held or processed is called the ‘Data Subject’.112 The GDPR also
grants certain rights to the data subject, such as:

• Right to be informed [Arts. 13, 14]113 – Right to know about the privacy
information that will be obtained from the data collected from them.

• Right of access [Art. 15]114 – Right to obtain a copy of their personal
data as well as other supplementary information.

• Right to rectification [Art. 16]115 – Allows individuals to rectify their
inaccurate personal data.

• Right to erasure [Right to be Forgotten] [Art. 17]116 – Right to have
their personal data erased.

• Right to restrict processing [Art. 18]117 – Right to restrict the
processing of their personal data in certain circumstances.

• Right to data portability [Art. 20]118 – Right to receive the personal
data they have provided to a controller in a structured and
machine-readable format.

• Right to object [Art. 21]119 – Right to object to the processing of their
personal data.

• Rights in relation to automated decision-making, including profiling
[Art. 22]120 – Automated individual decision-making is a decision made by
automated means without any human involvement.121 It does not have to
involve profiling.

110 Article 4[8], 2018 O.J. [L 127].
111 Article 28, 2018 O.J. [L 127].
112 What is Data Subject?, EU GDPR COMPLIANT,
113 Articles 13, 14, 2018 O.J. [L 127].
114 Article 15, 2018 O.J. [L 127].
115 Article 16, 2018 O.J. [L 127].
116 Article 17, 2018 O.J. [L 127].
117 Article 18, 2018 O.J. [L 127].
118 Article 20, 2018 O.J. [L 127].


4) Social Media Intermediary – The Bill defines social media intermediary as
an intermediary who primarily or solely enables online interaction between
two or more users and allows them to create, upload, share, disseminate,
modify or access information using its services, but shall not include
intermediaries which primarily [a] enable commercial or business-
oriented transactions; [b] provide access to the Internet; and [c] are in
the nature of search engines, online encyclopaedias, e-mail services or online storage
services.122 All such intermediaries which have users above a notified
threshold, and whose actions can impact electoral democracy or public
order, have certain obligations, which include providing a voluntary user
verification mechanism for users in India.123

International Approach –

119 Article 21, 2018 O.J. [L 127].
120 Article 22, 2018 O.J. [L 127].
121 Guide to General Data Protection Regulation,
122 § 26, The Bill, 2019.
123 § 26, The Bill, 2019

The GDPR does not explicitly specify obligations for social media
intermediaries; intermediary liability is instead governed by the rules for
intermediary service providers in Directive 2000/31/EC.124 Similarly, no
specific obligations are imposed on social media intermediaries under
Singapore's Personal Data Protection Act, 2012,125 or under Australian law.126

5) Data Protection Authority

The Bill establishes a governing body for regulating the provisions of the
Bill, called Data Protection Authority [Hereinafter, “The Authority”]127.
The Authority is established under Chapter IX128.
It shall be deemed a body corporate and shall possess all the rights and
duties of a body corporate. The Authority shall comprise a Chairperson and
six members appointed by the Central Government, each with at least 10
years' expertise in the field of data protection and information technology.129
The Central Government is empowered to remove the Chairperson or a member
upon their being adjudged insolvent, upon negligent conduct, or upon their
acquiring a financial interest likely to prejudicially affect their functions.
The Authority has the duty to:
(i) take steps to protect the interests of individuals;
(ii) prevent misuse of personal data;
(iii) ensure compliance with the Bill;

124 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of
information society services, in particular electronic commerce, in the Internal Market ['Directive on electronic
commerce'], O.J. [L 178] 1-16,
125 Personal Data Protection Act, available at
126 IKIGAI LAW, Data protection in Australia, 25th April, 2020,
127 § 41, The Bill, 2019.
128 §. 41-56, The Bill, 2019.
129 § 42, The Bill, 2019.

(iv) promote awareness about data protection;
(v) take prompt and appropriate action in response to a personal data
breach in accordance with the provisions of this Act;
(vi) maintain a database on its website containing the names of significant
data fiduciaries along with a rating, in the form of a data trust score,
indicating their compliance with the obligations of this Act;
(vii) monitor cross-border transfer of personal data; and
(viii) specify codes of practice.

Codes of Practice
The Bill provides for specific guidelines and provisions for the processing
of data, to be issued by the Authority under the head ‘Codes of Practice’.
These cover compliance matters including the form of notices, retention
periods, grounds for processing data, the method for exercise of rights by
data principals, specific measures or standards for security safeguards of
personal data, cross-border data transfers, personal data breaches, data
protection impact assessments, and the processing of de-identified data for
research, archiving or statistical purposes.130

Inquiry and Investigation
The Authority may take cognizance of and inquire into a complaint received,
or act at its own discretion, when there are reasonable grounds to believe
that a data processor is acting negligently or in contravention of the
provisions of the Bill. An inquiry officer is appointed

130 §. 50, The Bill, 2019.

by the Authority, possessing broad powers to inquire into and investigate
the data processor, and is required to submit a detailed report to the
Authority. The inquiry officer may also apply to the designated court for a
search and seizure procedure when there are reasonable grounds to believe
that records and documents are being tampered with by the data processor.
After thorough examination, the Authority may issue directions to the data
processor to desist from or cease certain activities. A data processor
aggrieved by the decision of the Data Protection Authority may appeal to
the Appellate Tribunal.131

Orders of the Authority can be appealed to an Appellate Tribunal.132 Appeals
from the Tribunal will go to the Supreme Court.133

International Approach –

The GDPR provides for the establishment of an independent public authority,
called a supervisory authority,134 to monitor the application of the GDPR in
order to protect the fundamental rights and freedoms of natural persons in
relation to processing and to facilitate the free flow of personal data
within the EU.135 The GDPR grants national supervisory authorities
significant powers to enforce its provisions, including:

• A number of investigative, corrective, authorisation and advisory powers.136

131 §. 53, The Bill, 2019.
132 § 67, The Bill, 2019.
133 § 75, The Bill, 2019.
134 Article 4[22], 2018 O.J. [L 127].
135 Article 51, 2018 O.J. [L 127].
136 Article 58, 2018 O.J. [L 127].
