
Privacy First – How Data Protection Laws Impact User Experience

Until the approval of the GDPR in the European Union on April 14, 2016, the protection of personal data was not a priority in companies' daily activities. With the approval of the LGPD (the Brazilian Data Protection Law), expected to come into force between August 2020 and January 2021, with fines applicable from August 2021, Brazil will also have the means to legitimize the processing of personal data by companies and to ensure respect for the rights of data subjects.

One of the challenges faced today is the lack of education and of a privacy culture. Companies that rely on systems and interfaces, whether digital or analog, to collect personal data must modify business rules and workflows and rethink their interfaces so that they can meet the rights of data subjects, not only in Brazil but in practically the whole world.

The employee is the main source of leaks of a company's confidential data. According to the Global Information Security Survey 2016, published by PwC (PricewaterhouseCoopers), 41% of the 600 companies surveyed by the consultancy reported that current employees are the biggest cause of information security incidents in Brazil. Such incidents range from theft of intellectual property to the compromise of customer data, and 39% of the companies reported financial losses after the attacks.

Still, according to the PwC survey, not all internal leakage is generated by criminal attitudes, but rather by unpreparedness, lack of culture, lack of training, and problems with usability and communication regarding processes and systems.

To enhance the experience, customize interaction through digital channels, improve communication and processes, optimize time of use, and increase conversion rates, mobile applications, systems, and digital channels invite users to provide their personal data all the time, and the people handling this data are not always prepared to deal with it, generating leaks in many different ways.

User experience

User trust is directly related to the success of a business. If users lose confidence in the company, the brand, or the effectiveness of its services, the company will face many challenges to regain the lost trust. And in the age of privacy, losing confidence is getting easier.

Therefore, taking care of the user experience in the context of data processing is not only about the care the company must take with its website or system, but about all the layers that permeate the relationship between the individual and the company. From this perspective, users are more concerned with how companies are using their personal information than with the impact the Internet has on personal privacy and security in general.

The user experience goes much further than a graphical interface: it considers satisfaction and the user's emotional relationship with the product or service. According to Norman (2017), it is important to differentiate UI (user interface) from UX (user experience). The first concerns the graphical interface with which the user interacts, where usability is an important attribute; the latter is broader, and several disciplines are taken into account in the effort to create a quality user experience.

Still in this line of reasoning, Nielsen (2003) defines usability as an attribute of quality, which concerns how easy to use the interface is for a user. It is defined by the ease of learning, efficiency, ability to memorize, error prevention and satisfaction. According to Jordan (1998), the question of satisfaction in relation to usability is more about avoiding negative feelings, such as frustration in use, than producing positive emotions such as pleasure or pride.

Therefore, the more personalized the user experience, the greater the volume of data generated by the user, which raises questions about the use of personal data for companies' economic purposes and about the purpose of everything that is stored and shared. This line of thought is followed by Doneda:

Without losing sight of the fact that control over information has always been an essential element in the definition of powers within a society, technology has specifically operated to intensify information flows and, consequently, its sources and recipients. Such a change, at first quantitative, ends up influencing qualitatively, changing the balance axes in the equation between power – information – person – control […] It is necessary to verify how technological development acts on society and, consequently, on the legal system. (Doneda, 2006, p. 9)

In her article “What does GDPR mean for UX?”, Claire Barrett, UX and UI designer, shared an objective set of UX guidelines that the design agency she works at has followed to protect personal data, translated literally as follows:

  1. Users must expressly and manually opt in to data collection and use.
  2. Users must consent to all types of data processing activities. (Author’s note: If the legal basis for consent is chosen, of course)
  3. Users should have the right to withdraw their consent easily at any time.
  4. Users must be able to verify all companies and all suppliers and partners of the company that will handle the data.
  5. Consent is not the same as agreeing to the terms and conditions; therefore, they should not be grouped; they are separate and must have separate forms.
  6. While it’s good to ask for consent at the right times, it’s even better to explain clearly why consent will benefit your experience.

Following the same line of reasoning as Barrett, according to Nogueira (2013), there are six criteria to assess usability: Ease of use, Ease of learning, User satisfaction, Productivity, Flexibility and Memorability.

These six criteria can also be applied to assess usability in relation to the protection of personal data. It is also important to note that the analysis does not apply only to portals and digital systems but, broadly, to any relationship between the individual who owns the data and the organization that will temporarily manage that personal data, reducing the distance between them and leading the company to see the user as a “partner” for the business to prosper, not as a mere product. As Norman points out, “the more restricted the possibilities of error for the user, the more efficient their choices will be.”

Personal data processing

One of the changes brought about by the need to protect personal data concerns the processing activity of collection (LGPD, Art. 3, III, § 1). In the context of digital interfaces, collection is understood as the action the website or system performs to capture someone's personal data, whether through a form, an application registration screen, or even the processing of electronic identifiers such as cookies and IP addresses.

The first and most obvious change concerns the purpose of processing (LGPD, Art. 6). All personal data collected by a website must have a clear purpose, so that users can identify, clearly and without difficult or hidden wording, what the company will do with the data they provide. That is, collection must have a known and informed purpose, not an implied or merely generic one for storage and future use.

Figure 1 below represents a form, apparently quite simple and common, that would not meet the purpose requirements as reproduced in Article 6 of the LGPD:

Art. 6 The personal data processing activities shall observe the good faith and the following principles:
I – purpose: processing for legitimate, specific and explicit purposes informed to the data subject, without any possibility of subsequent processing inconsistently with these purposes. (LGPD, 2018, art. 6) (emphasis added).

Figure 1 – Form example

Note that the form collects personal data but does not inform the purpose of each field. A sentence indicating that, by sending the form, the user automatically agrees with a policy normally located elsewhere is not sufficient or specific, and can be considered unlawful processing of personal data. There is no requirement to inform the purpose of every field, but at least of those that may raise doubts in the data subject as to their real need in the business context.

For example, in a user registration form for access to a system, it is usual for the system to request an email and password for identification purposes, so there is no need to add yet another explanation to the interface, generating confusion or visual pollution in the application. However, if the company needs to collect a national document number, which has no clear function for the purpose of logging in, it is worth adding an explanation of why that number is important in that context.
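As a minimal sketch of this idea (the field names and hint text are hypothetical, not from the article), a form schema can attach an explicit purpose hint only to fields whose purpose is not self-evident, so the interface knows which ones to annotate:

```typescript
// Illustrative sketch: fields whose purpose is not obvious in context
// carry an explicit purposeHint to be shown to the data subject.
interface FormField {
  name: string;
  required: boolean;
  purposeHint?: string; // rendered next to the field only when present
}

const signupForm: FormField[] = [
  { name: "email", required: true },    // self-evident: used for login
  { name: "password", required: true }, // self-evident: used for login
  {
    name: "nationalDocumentNumber",
    required: true,
    // Hypothetical justification text for a field with no obvious login role.
    purposeHint: "Required by our anti-fraud checks; never shared with partners.",
  },
];

// Returns the names of fields the interface should annotate for the user.
function fieldsNeedingExplanation(form: FormField[]): string[] {
  return form.filter((f) => f.purposeHint !== undefined).map((f) => f.name);
}
```

This keeps the interface uncluttered for self-evident fields while making the purpose of unusual ones explicit at the point of collection.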

Figure 2 shows a clearer view of an example of a necessary change in a form. Disregard the graphic aspect of the wireframe and even the usefulness of the fields; they are just examples.

Figure 2 – Form example 2
  • An explanation was added for the user about the reason for collecting certain personal data that might not seem necessary for the service in question.
  • The Gender field became optional, and it was made clear that users who leave it blank simply will not receive the customized service, a benefit offered to those who provide the data.
  • A clear explanation was added that the service is intended for adults, which is why age needs to be collected. In this case there is no need to know the user's date of birth, unless the service intends to send congratulations or gifts to the user.

Another important principle taken into account in this example is the principle of necessity, which, according to item III of Article 6 of the LGPD, says: “III – need: limitation of the processing to the minimum processing required for achievement of its purposes, encompassing pertinent, proportional and non-excessive data in relation to the purposes of the data processing”. (emphasis added)

Therefore, data subjects expect the interface to collect the minimum of personal data, just enough to provide the service without any harm to the individual. Collecting excess data in anticipation of future advertising, targeting, and third-party sharing campaigns is no longer allowed (not that it was truly “allowed” before), which is a massive change for sites and systems that depend on a large volume of personal data (LGPD, Art. 7, X, § 5) and that have always been used to “collecting as much as they can” for “whenever they need it”.

Privacy policy

A privacy policy is one of the elements of the effective implementation of a privacy program following the concepts of “privacy by design”. The policy aims to give greater visibility and transparency to the processing of personal data on a given website, service, or company.

It cannot be forgotten that, although written in simpler language, policies are nonetheless legal and governance instruments, and every care is necessary to prevent a privacy policy or a term of use from generating more harm than benefit to the company. To create policies and present them in a more user-friendly way, the concept of “Legal Design” can be used, which we could summarize as design thinking applied to law. According to Stanford University Professor Hagan, legal design “is a way of assessing and creating legal services, with a focus on how usable, useful, and engaging these services are.”


According to the LGPD (Art. 5, XII), consent is a free, informed, and unequivocal manifestation by which the data subject agrees to the processing of his or her personal data for a specific purpose. This transparency and demand for clear communication with the user is defended by Doneda when criticizing requests for consent made without clarity: “In addition, the effects of consent are not always clear to the user, so that requiring it for the processing of personal data turns out to be an innocuous procedure.” (DONEDA, 2006, p. 373)

Recognizing that consent must be clear and unambiguous to be valid, it is worth thinking about how to request this consent from users in order to offer a quality experience. In this context, there are two models by which a user can consent: “opt-in” and “opt-out”.

“The opt-in model requires an active attitude from the user, declaring his willingness to submit his personal data to some processing” (MENDES, 2014, p. 205), unlike opt-out, which requires an active posture from the user to cancel consent previously given, or simply to stop the processing of personal data, as in the case of the CCPA (California Consumer Privacy Act, USA), which expressly requires opt-out mechanisms.

After consent is granted, customers must have complete control over their data, that is, the ability to browse, change, and delete any of the data collected under the legal basis of “consent”. This means that privacy settings need to provide granular options to revoke consent without major difficulty; mere acceptance or signature by the interested party (or even a click, in the case of internet users) is not enough for consent to be configured: it must be sufficiently free and informed (BEYLEVELD and BROWNSWORD, 2007, p. 7).

Removing data must be at least as easy as providing it was. If users have a bad experience when leaving the company or canceling its services, it is likely that they will not return and may still leave negative comments for other customers.
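The granularity described above can be sketched in code. In this hypothetical model (the purpose names and `ConsentLedger` class are illustrative, not a real API), each purpose is consented to individually, and withdrawal is a single call, exactly as easy as granting:

```typescript
// Illustrative sketch of granular consent storage: one entry per purpose,
// with withdrawal as easy as granting (a single, symmetric operation).
type Purpose = "newsletter" | "profiling" | "thirdPartySharing";

class ConsentLedger {
  private granted = new Map<Purpose, Date>();

  grant(purpose: Purpose): void {
    this.granted.set(purpose, new Date()); // record when consent was given
  }

  revoke(purpose: Purpose): void {
    this.granted.delete(purpose); // withdrawal takes exactly one action
  }

  isGranted(purpose: Purpose): boolean {
    return this.granted.has(purpose); // purposes never granted stay off
  }
}
```

A privacy-settings screen built on such a structure can show one toggle per purpose, instead of a single all-or-nothing acceptance.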

Cookie management

A cookie is nothing more than a file saved on the user's computer so that the website can recognize that computer again. Cookies, by themselves, cannot identify a person, since, in most cases, they record information about the access made by the individual: the page accessed, the menu clicked, monitoring tools, or preferences identified on the website. This information, even when linked to the IP (Internet Protocol) address, does not identify the individual, only the equipment that accessed a certain website. Thus, the use of cookies does not initially characterize a violation of the LGPD, as long as the consumer is informed (Art. 7 of Law n. 12.965/2014, the Brazilian Civil Rights Framework for the Internet) and the cookies are not used for malicious purposes.

Therefore, user authorization is not required for cookies aimed at technical storage or access, such as user login cookies, authentication cookies during a session, security cookies (especially to detect authentication abuse), cookies linked to functionality explicitly requested by the user for a limited duration, and cookies from multimedia content players.

However, following the ePrivacy guidelines, user authorization is required when cookies serve marketing or statistical purposes: delivering advertising and targeted content, behavior analysis, and user profile mapping. In such cases, in addition to the authorization request, users must have a platform at their disposal to manage their cookies, authorize, revoke authorizations, and be clear about the purpose of each cookie.

This type of solution is called a “cookie banner”, as an example in the Figure 3 below, and must be offered by all companies that process cookies for the purpose of identifying the individual and for purposes other than security and access issues.

Figure 3 – PrivacyTools Privacy Management Platform

In the example in the figure above, two new elements become part of the user experience: an indicative bar explaining that cookie processing takes place, with options for the user to accept or reject it, and a window where the user can choose which cookies to agree or disagree with.
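The logic behind such a preferences window can be sketched as follows. This is a hypothetical model (cookie names and categories are invented for illustration): strictly necessary cookies run without consent, while statistics and marketing cookies wait for an explicit opt-in per category:

```typescript
// Illustrative sketch: cookies grouped by purpose; only "necessary" ones
// are set without consent, the rest require the category to be opted in.
type CookieCategory = "necessary" | "statistics" | "marketing";

interface CookieDef {
  name: string;
  category: CookieCategory;
  purpose: string; // shown to the user in the preferences window
}

// Hypothetical cookie inventory for a site.
const cookies: CookieDef[] = [
  { name: "session_id", category: "necessary", purpose: "Keeps you logged in" },
  { name: "_analytics", category: "statistics", purpose: "Aggregated usage statistics" },
  { name: "_ads", category: "marketing", purpose: "Targeted advertising" },
];

// Given the categories the user consented to, returns the cookies that may be set.
function allowedCookies(consented: Set<CookieCategory>): string[] {
  return cookies
    .filter((c) => c.category === "necessary" || consented.has(c.category))
    .map((c) => c.name);
}
```

With no consent at all, only the strictly necessary cookie survives; each additional category the user enables unlocks exactly the cookies declared under it.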


A very common practice in creating web forms is pre-checking checkboxes, which clashes with the concept of privacy by default, as explained earlier, whenever the checkbox is intended to “facilitate” the individual's acceptance of conditions.

In the example in figure 4 below, a set of check boxes has been added to obtain user authorization for the use of their personal data for different purposes.

Figure 4 – Form example 3

In this case, the options are not mandatory, but they come checked before the user takes any acceptance action. This practice, although quite common, cannot be adopted by a company that values respect for users' rights to data protection and privacy, following the concept of privacy by default.

The checkboxes must all start unchecked, and there must be a clear and unambiguous action by the user to check them. This change affects several systems, mainly those that use checkboxes to bundle a series of other purposes “tied in” with the main reason for the processing.
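A privacy-by-default form state can be expressed directly in code. In this minimal sketch (the helper names are hypothetical), every optional consent checkbox is constructed unchecked, so a checked state can only come from a deliberate user action:

```typescript
// Sketch of privacy by default: optional consent checkboxes always start
// unchecked; any checked state must result from an explicit user action.
interface ConsentCheckbox {
  purpose: string;
  checked: boolean;
}

// Hypothetical helper: builds the initial checkbox state for a list of purposes.
function buildConsentCheckboxes(purposes: string[]): ConsentCheckbox[] {
  return purposes.map((purpose) => ({ purpose, checked: false }));
}

// Guard that a form about to be rendered has no pre-checked consent box.
function hasPreChecked(boxes: ConsentCheckbox[]): boolean {
  return boxes.some((b) => b.checked);
}
```

A check like `hasPreChecked` could even run in automated tests, failing the build whenever a developer ships a pre-checked consent box.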


Computational power has made it possible for more and more companies to generate business, explore new markets, make decisions, and find intelligent ways to reduce costs, increase profitability, and deliver better and faster products and services. The positive points of such technological evolution could not fail to be accompanied by some negative ones, and the feeling of loss of privacy is one of them. Constantly watched, people get used to the conveniences of devices and systems and forget that each free use of a system carries with it an infinity of personal data that makes people the product of that relationship.

User interfaces are the gateway to the collection of personal data, and this is where the whole relationship begins. Previously, such collection was carried out in an unrestricted, irresponsible manner, and the companies that held the most personal data were those with the greatest bargaining power in the market, because they could offer increasingly qualified segmentation.

Now everything has changed: the data belongs to the people, and the recurring cases of data leaks have raised warning signs in individuals that there is too much risk in the set of data they hand over in order to consume an apparently harmless, free product.


  1. UN GENERAL ASSEMBLY. Resolution 217 A (III). Paris, December 10th 1948. Universal Declaration of Human Rights.
  2. https://blogs.iadb.org/conocimiento-abierto/en/data-privacy-reform-gains-momentum-in-latin-america/
  3. https://www.pwc.com.br/pt/sala-de-imprensa/artigos/experiencia-usuario-vs-privacidade.html
  4. https://www.globalwebindex.com/reports/data-confidence-index
  5. https://techcrunch.com/2011/11/30/examination-of-privacy-policies-shows-a-few-troubling-trends/
  6. https://uxdesign.cc/what-does-gdpr-mean-for-ux-9b5ecbc51a43
  7. https://super.abril.com.br/tecnologia/naoli-e-concordo
  8. http://www.lawbydesign.co/en/legal-design/
  9. https://gdpr.eu/cookies/
  10. BEYLEVELD, Deryck; BROWNSWORD, Roger. Consent in the Law: Legal Theory Today. Portland: Hart Publishing, 2007.
  11. BRASIL. Decree n. 8.771, May 11th 2016. Brasília: Brazilian Federal Register, 2016. Available at: http://www.planalto.gov.br/ccivil_03/_ato2015-2018/2016/decreto/D8771.htm Access on: Nov. 07 2019.
  12. Law n. 12.527, November 18th 2011. Brasília: Brazilian Federal Register, 2011. Available at: http://www.planalto.gov.br/ccivil_03/_ato2011-2014/2011/lei/l12527.htm. Access on: Nov. 07 2019.
  13. CORREIA, Pedro Miguel Alves Ribeiro; DE JESUS, Inês Oliveira Andrade. O lugar do conceito de privacidade numa sociedade cada vez mais orwelliana. Rio de janeiro: Revista Direito, Estado e Sociedade, 2013.
  14. DONEDA, Danilo. Da Privacidade à Proteção de Dados Pessoais. Rio de Janeiro: Renovar, 2006. Um código para proteção de dados pessoais na Itália. Rio de Janeiro: Revista Trimestral de Direito Civil, 2003.
  15. MENDES, Laura Schertel. Privacidade, proteção de dados e defesa do consumidor: linhas gerais de um novo direito fundamental. São Paulo: Saraiva, 2014
  16. NIELSEN, J. Designing WEB Usability: The Practice of Simplicity. San Francisco: New Riders Publishing, 2000. Technology Transfer of Heuristic Evaluation and Usability Inspection. 1995. Available at: http://www.useit.com. Access on: Nov. 07 2019.
  17. NOGUEIRA, J.L.T. Reflexões sobre métodos de avaliação de interface. Master Thesis on Computer Science. Niterói: Universidade Federal Fluminense, 2003
  18. NORMAN, Donald. The Design of Everyday Things. Rio de Janeiro: Rocco, 2006
  19. RODOTÀ, Stefano. Life in the surveillance society: Privacy today. Organization, selection and presentation by Maria Celina Bodin de Moraes. Translated by Danilo Doneda e Luciana Doneda. Rio de Janeiro: Renovar, 2008.