
No to the paternalism of big techs in protecting children and adolescents
*Originally published in JOTA.
**This is an AI-powered machine translation of the original text in Portuguese.
The so-called bill on children and adolescents on the internet (Bill 2628/2022) is about to be voted on in the Chamber of Deputies. It is the most important regulation concerning the digital environment currently under consideration in the Legislature. A key point lies in its Article 5, which addresses the responsibility of digital technology companies regarding harmful effects on children and adolescents.
In its current wording, that article stipulates that big techs must observe a “duty of care and safety” with respect to products made available to children and adolescents, and must also “actively prevent the use,” by children and adolescents, of products not intended for them.
In the recent judgment by the Federal Supreme Court (STF) on Article 19 of the Civil Rights Framework for the Internet, the responsibility of content providers was expanded in the case of crimes (except for those involving honor), removing the requirement of a prior court order for the obligation to remove content — a facet of the duty of care, in the sense of acting to reduce risks of harm to third parties. But how far should this active role of big techs go when it comes to children and adolescents?
The bill already sets out obligations to mitigate the risks that digital products and services pose to children and adolescents, which is precisely what a duty of care entails. Making the term explicit in the text could invite the interpretation that something beyond those obligations is expected, creating legal uncertainty, since the concept is not defined in the Brazilian Civil Code. The situation differs from the mention of the “product safety duty,” whose regime is defined in the Consumer Protection Code.
The duty to “actively prevent the use,” in turn, recalls the constitutional debate on the role of the State — in an economy based on free enterprise — regarding the risks of mass communication and advertising of harmful products. At the time, there was concern that direct intervention to protect the recipients of communication might end up inserting into the Citizen Constitution an undesirable paternalistic stance by the State.
The constitutional solution, then, was to provide, in Article 220, paragraph 3, item II, that the State should only provide “the means to ensure that individuals and families can protect themselves from programs or programming potentially harmful to them.”
If the Constitution rejects a paternalistic State, it equally rejects paternalistic big techs that take it upon themselves to prevent children from accessing or misusing inappropriate content. Tech companies should be responsible only for providing the technical means and the information necessary for guardians and families to protect children and adolescents in the digital environment, not for usurping citizens’ autonomy in the care, upbringing, and education of their children. Such protection is a duty shared, in collaboration, by the State, the big techs, and the family.
And the most effective way to achieve this is to embed protective measures in the very design of the products. Article 7 of the bill imposes on big techs, albeit in confusing terms, the obligation to ensure privacy and data protection by design and by default. It is important to distinguish the two concepts.
By design, from the earliest stage of developing products and services, providers must follow the state of the art and best practices, offering, where applicable, privacy settings and functionalities suited to the developmental stages of children and adolescents, always guided by their best interests.
By default, products must be made available with the highest level of protection, along with clear and accessible information so that children and their guardians can, if they so choose, consent to less protective settings. In this way, without reviving an archaic paternalism, the citizen’s autonomy is respected: the family is enabled to protect itself within the digital environment, not from the digital environment.