By:  Atty. Jay-r C. Ipac, Senior Associate

On 18 May 2018, a week before Europe's General Data Protection Regulation (GDPR) took effect, I, along with a colleague at the Firm, attended "GDPR and E-Privacy: European Law – Global Reach," a Masterclass held at Amara Singapore. The GDPR is set to replace the Data Protection Directive (DPD) of 1995 on 25 May 2018. Based on the lecture's content and the questions from the audience, the Philippines' data privacy regime appears to be in step with what is considered the gold standard in data privacy. Apart from the GDPR, the EU's new E-Privacy Regulation is also expected next year to replace the EU's 2002 E-Privacy Directive.

Technology-related challenges to defining privacy

While "privacy" has been a buzzword lately, the concept of privacy is nothing new. Its modern legal form was articulated as the "right to be let alone" as early as 1890, when the then "groundbreaking" technology of "instantaneous photographs" was seen as a threat to individual personality and dignity.[1] Privacy is thus framed in the negative, as an "opacity tool" protecting individuals from unjustified interference with their private lives.[2] In actual physical space, the interference is "unjustified" when a person exhibits a legitimate expectation of privacy and that expectation is one society is prepared to accept. Thus, when one whispers something into another person's ear, he clearly exhibits an expectation of privacy, and to eavesdrop on the conversation is not socially acceptable.

As technology continues to intersect with the law, however, its advances have posed serious challenges in determining which expectations of privacy may be considered legitimate. This is because technology inescapably distorts the physical space that traditionally maps out the contours of privacy. Thus, in the old case of Olmstead v. United States,[3] it was then thought that because the telephone wires through which conversations between individuals passed were within public space, an individual could not assert any legitimate expectation of privacy in his public telephone conversations.

Decades later, the US Supreme Court abandoned this reasoning in the famous case of Katz v. United States,[4] saying that a person who enters a public telephone booth, "shuts the door behind him, and pays the toll that permits him to place a call is surely entitled to assume that the words he utters into the mouthpiece will not be broadcast to the world." As technology continued to push forward, so did the privacy questions it created. Thus, in the late 1970s, the US Supreme Court[5] had to distinguish between the "content" and "non-content"[6] portions of a private telephone communication (because of a device called the "pen register") and ruled that an individual has a legitimate expectation of privacy only in the "content" portion of his communication.

Alongside these developments was the growing recognition of the threats to individual rights and freedoms posed by the continued accumulation of personal data in government and corporate data banks. With advancements in computers and information technology, capable of processing vast quantities of personal data more efficiently, a subset of privacy also emerged—i.e., "informational privacy."[7] What initially began as a concept of "fair information practices" in the US eventually evolved into "data protection" in the major regional blocs of the world (the OECD, EU, and APEC).[8] Data protection aims to empower individuals with a certain level of control over their personal data and the ways it is used. Unlike privacy, therefore, data protection (or data privacy, as it later came to be known), although anchored on the fundamental right to privacy, is not prohibitive in nature but is rather a transparency-enhancing tool.[9]

As information and communication technologies converge and with the advent of the ubiquitous internet, the traditional concept of privacy itself became blurred. For one, the idea that privacy is an “opacity tool” stands on its head when products and services on the internet are “freely” accessed in exchange for an individual’s personal information. A person “voluntarily” does this by clicking on a button that usually contains a pre-ticked box that roughly says that the user “agrees” to the website’s processing of the user’s personal information. Does this sufficiently signify that the user “waives” his right to “privacy”? Without a data privacy law, it would appear that the user waives his right to privacy as an “opacity tool” by agreeing to such terms.

For another, the business practices that prevailed online took advantage of online users' general lack of understanding of the internet's inner workings. Thus, online tracking activities (like the installation of third-party cookies), mostly for advertising purposes, proliferated unnoticed by unwary users.[10] Again, without a data privacy law, it would appear that an individual's "privacy" in the information age is simply the "currency" that he trades every time he surfs the web, with all its concomitant "risks." Data privacy laws seek to change that arrangement by granting the individual control over his data instead of allowing others (the data processor, controller, or third parties) to paternalistically determine what is "good" for him.

At the same time, however, this blurring effect of technology on privacy has heightened the awareness and interest of legal scholars, particularly those in the special field of information technology law, in the privacy and other legal implications of technological advancements. Thus, as technology grows even more sophisticated, encompassing predictive algorithms among other things, our own Supreme Court,[11] taking note of these concerns, agreed that even the "non-content" portion of online communications (or, more accurately and currently, "traffic data" in ICT parlance) is deserving of judicial protection even if there is no legitimate expectation of privacy in such data per se. As recognized by legal scholars, the aggregation of these supposedly anonymous "traffic data" and their analysis, using other sophisticated forms of technology, can in fact reveal information that an individual reasonably expects to be private, thus violating one's right to privacy.

The Philippine right to privacy in the internet age and big data era  

Since 2011, the Philippines has consistently topped the list of countries with the highest social network penetration, and it was eventually dubbed the social networking capital of the world.[12] This means that Filipinos in general love to share more and more information about themselves with others, including pictures, geo-location data, relationships, health conditions, and preferences and/or interests on a wide array of subjects. Unsurprisingly, when the Facebook scandal involving Cambridge Analytica came out early this year, the Philippines placed second on the list of countries whose user data were compromised.[13]

Under Philippine law, for the right to privacy to be successfully invoked, two requirements must concur: first, that a person has exhibited an actual (subjective) expectation of privacy; and second, that the expectation is one that society is prepared to recognize as reasonable (objective). If privacy is an "opacity tool," does this mean that Filipinos in general enjoy lesser degrees of privacy online through their own propensity to disclose personal information? This question suggests that at the core of the legal concept of privacy are underlying norms and values that are themselves being changed as individuals interact with the technology they created.

Thus, in one case,[14] the Supreme Court refused to recognize a violation of the right to informational privacy because the petitioners (parents of student minors) failed to prove "that the access to the pictures posted [on Facebook] were limited to the original uploader, through the 'Me Only' privacy setting, or that the user's contact list has been screened to limit access to a select few, through the 'Custom' setting." Although the ruling is sound, it nonetheless raises one important side question: is it correct to assume that the same "legitimate expectation of privacy" standard applied to adults is equally applicable to minors, given the unique attributes of the internet?

The Court’s reasoning also highlighted the difference between constitutional privacy and statutory privacy (data privacy). Whereas the former requires an individual to comply with both objective and subjective tests, data privacy, as a means of granting control to individuals over their personal data in the hands of personal information controllers/processors, requires the existence of any of the grounds under the law before personal data may be processed. Consent of the data subject is one of those grounds. Interestingly, and for the first time, the GDPR requires parental consent before the personal data of children under 16 years of age may be processed by information society service providers.[15] Should this age factor, which is applied under statutory (GDPR) data privacy, not be considered, too, in situations when privacy is applied as an “opacity tool”? As newer generations would grow up completely tied to the internet, or perhaps the Web 3.0, from early childhood, these questions, among many others, may likely confront our courts in the future.

Last year, the National Privacy Commission reported that, based on the results of a survey it commissioned, Filipinos value data privacy.[16] However, the Philippines still has a long way to go before business establishments that handle personal data see privacy measures[17] as a matter of individual right, rather than simply as a way of avoiding the stiff penalties under applicable law. The reason for this observation can be explained by analogous reference to the metaphor of the boiling frog: "place a frog in boiling water, and it will jump out to save itself; but place it in cold water and slowly apply heat, and the frog will boil to death."[18]

Stated plainly, violations, incursions, or breaches of the right to privacy in the digital space do not normally have direct, immediate adverse effects on the affected individuals (unless perhaps the violation amounts to a cybercrime), unlike in the physical world. Hence, the incentive to create or embed a social or organizational culture of privacy, especially in a country that arguably has a less restrictive cultural conception of privacy, is quite minimal. Nonetheless, allowing these kinds of privacy incursions is our surefire ticket to the much murkier and more complicated issues associated with big data and the sophisticated analytics that go with it. These issues include discrimination against certain classes of persons, political interference (as we discovered in the Cambridge Analytica scandal), and government surveillance, among others. From where we currently stand to the point of confronting those critical issues, the message we are receiving is pretty clear: (i) those who create these wonderful technologies (whatever they may be at the dawn of the Internet of Things) should also begin considering the privacy (and, broadly, social) implications of their creations; and (ii) the public and private sectors should begin cultivating a culture of privacy that advances individuals' legitimate expectation of privacy, taking into account the potential for abuse or misuse of existing technologies.

From a business perspective, the importance of taking privacy matters seriously cannot be overemphasized. In 2012, following the EU's rights-based approach to privacy, the Philippines created its own national privacy watchdog, which has been very active in pursuing its mandate. Also, in July 2016, the United Nations issued a resolution declaring internet access a human right.[19] While the actual meaning of the declaration remains to be seen, it is consistent with the continued clamor for wider, better, and more cost-effective internet access in the country.[20]

With improved internet access and more public education activities on data privacy in the pipeline, breaches, violations, or incursions of privacy rights in the digital space would mean huge risks of liability for businesses that treat privacy issues lightly. The recent developments and events that have brought the issue of privacy under the global spotlight should be taken as a clear signal that as technology continues to evolve, so do the privacy concerns and the attendant risks of reputational damage and legal liability that businesses must address if they are to remain competitive and maintain the trust of their clients.

[1] Samuel D. Warren and Louis D. Brandeis, The Right to Privacy, Harvard Law Review, Vol. 4, No. 5 (Dec. 15, 1890), pp. 193–220.
[2] Monika Zalnieriute, An International Constitutional Moment for Data Privacy in the Times of Mass-Surveillance, International Journal of Law and Information Technology, 2015, 23, 99–133, Oxford.
[3] 277 U.S. 438 (1928)
[4] 389 U.S. 347 (1967)
[5] See Smith v. Maryland, 442 U.S. 735 (1979).
[6] In this case, the non-content portion is the telephone number dialed by an individual.
[7] First appeared in Philippine jurisprudence as Footnote 62 in Ople v. Torres, G.R. No. 127685 July 23, 1998.
[8] Instead of following the sector-specific legislation on data privacy in the US, the Philippines followed the omnibus model of the EU.
[9] Monika Zalnieriute, An International Constitutional Moment for Data Privacy in the Times of Mass-Surveillance, International Journal of Law and Information Technology, 2015, 23, 99–133, Oxford. Under our Data Privacy law, several rights have been granted to data subjects and several obligations have been imposed on data controllers and processors.
[10] See Chris Jay Hoofnagle, Behavioral Advertising: The Offer You Can’t Refuse, 6 Harv. L. & Pol’y Rev. 273 (2012)
[11] See Disini v. Secretary of Justice, G.R. No. 203335, February 11, 2014.
[12] However, it is debated whether users spend more time on the internet because of slow connection speeds.
[14] Vivares v. Saint Theresa’s College, G.R. No. 202666, September 29, 2014.
[15] Article 8, GDPR. In the US, the Children’s Online Privacy Protection Act (COPPA) imposes certain requirements on operators of websites or online services directed to children under 13 years of age, and on operators of other websites or online services that have actual knowledge that they are collecting personal information online from a child under 13 years of age.
[17] For instance, while mobile applications collecting personal data from a user is certainly a personal information controller, the author observes that some mobile applications still do not comply with the notice and consent requirement under the law.
[18] See Dissenting Opinion of Justice Arturo D. Brion in Arroyo v. Department of Justice, G.R. No. 199082, September 18, 2012, citing Eugene Volokh, The Mechanisms of the Slippery Slope, Harvard Law Review, Vol. 116, February 2003.
[20] In this regard, Congress passed RA 10929, or the Free Internet Access in Public Places Act, which was signed into law in August 2017.