Child Safety Online: Overview
Child safety online is a growing concern as the internet continues to expand, providing vast resources and capabilities while also exposing children to potential dangers. Key issues in this discussion include the accessibility of online pornographic materials to minors, child exploitation through global trafficking of explicit content, and the risks posed by online predators. Advocates for child safety are divided on approaches, with some favoring censorship of inappropriate content while others argue for education as the most effective strategy for protection. Historical attempts to legislate online safety, including the Communications Decency Act and the Child Online Protection Act, have faced challenges in courts, often on the grounds of infringing on free speech rights.
Recent legislative efforts like COPPA and its updates seek to protect children's privacy online, while proposals like the Kids Online Safety Act aim to address mental health concerns linked to social media. The balance between safeguarding children and ensuring freedom of expression remains central to the ongoing debate, highlighting the complexities of regulating digital spaces. As technology evolves, so too do the challenges in creating a safe online environment for children, making this a critical area for continued attention and action.
Introduction
Debates over best practices for ensuring children's safety online have increased as the internet has continued to grow in scope and its technologies have advanced. The internet has allowed billions of users all over the globe unprecedented access to vast resources of information, images, and video. At the same time, new technologies have made it possible for even the minimally tech-savvy to manipulate online tools and for the technologically sophisticated to evade mechanisms designed to keep unwanted materials at bay.
Child safety advocates on different sides of the debate agree that the anonymous, largely unregulated nature of cyberspace has made the Internet an ideal vehicle for the dissemination of materials of a controversial, sometimes criminal nature. Hate groups, the pornography industry, and terrorist organizations have all exploited the power of the Internet; on the other hand, so have human rights advocates, political dissenters, and people determined to expose military, corporate, or government incompetence or corruption.
The controversy over balancing free expression and the need to protect children has dominated the discussion of child safety. The debate has focused on three key problems: the easy access by underage users to online pornographic materials; the global trafficking in online pornographic images and videos that specifically exploit children; and the dangers posed by child predators stalking online chat rooms and similar forums.
Some supporters of child safety online have attempted to ban sexually explicit materials by appealing to legal precedents for the regulation of the television and radio industries. They have also looked to the models provided by state- and community-level obscenity laws against pornographic books and magazines.
Many Internet content bans, however, have not survived court challenges because they targeted a range of materials judged to be so broad in nature that they infringed on free speech. By blocking keywords of a sexual nature, for example, some bans (and the filtering software used to enforce them) eliminated access not only to pornographic materials but also to legitimate health and medical information.
Opponents of Internet content bans argue that education, not censorship, represents the best means of protecting children from online abuse and exploitation. They contend that it is impossible to eliminate the dangers to children, especially when the favorite strategy of online predators is to pose as a peer, and that teaching children to be aware of these dangers is the most practical defense.
They also insist that eliminating access to non-child pornography would violate the rights of those adults who choose to make or view such materials. But critics have highlighted the ease with which anyone possessing a credit card can access online pornographic materials without having to furnish verifiable proof of age.
Understanding the Discussion
Anonymity: The condition of deliberately concealed identity.
Blog: Short for “weblog,” a regularly updated online personal journal.
Censorship: The act of suppressing or deleting material considered objectionable.
Chat Room: An online venue for communities of users with a common interest to communicate in real time.
Content Filtering Software: A program designed to block a web browser from displaying designated content.
Firewall: A system designed to prevent unauthorized access to or from a computer network. By blocking content that does not meet specified security criteria, firewall technology can work as a censoring tool.
Internet: A system of networks connecting computers around the world.
World Wide Web: A system, consisting of many websites linked together, for browsing Internet sites.
History
The first federal attempt to censor the Internet occurred in 1996, when Congress passed the Communications Decency Act (CDA), which criminalized the transmission of “indecent” materials to minors. The following year, the Supreme Court struck down the CDA as unconstitutional by a unanimous vote. In deciding the case Reno v. American Civil Liberties Union, the Court held that the provisions of the CDA were so sweeping (it banned all sexually explicit online content including materials that many might not consider pornographic in nature) that the law violated First Amendment free speech protections.
In response to this ruling, President Bill Clinton signed into law the Child Online Protection Act (COPA) in 1998. COPA established federal criminal penalties for communicating, sending, or displaying for commercial purposes online material "harmful to minors" to underage users.
COPA was quickly challenged on constitutional grounds, with the challengers claiming that they could not prevent the distribution of their materials to minors without also denying adults access to such communications. In 1999, a federal court agreed that COPA likely violated the First Amendment and issued a preliminary injunction against enforcement of the law. This injunction was upheld by the Third Circuit Court of Appeals in 2000 and, ultimately, by the Supreme Court as well. By 2009 a permanent injunction against the legislation was in place.
Although COPA failed, another 1998 law, the Children's Online Privacy Protection Act (COPPA), took effect in 2000. COPPA set out federal rules for the online collection of personal data about children under the age of thirteen. The rules applied to operators of websites aimed at children under thirteen as well as to operators of general-audience websites who know that children under thirteen are submitting data to the site.
With the failure of CDA and COPA to withstand challenges to their constitutionality, in 2000 President Clinton signed into law the Children’s Internet Protection Act (CIPA). As a condition for receiving federal funding of Internet connections, CIPA requires all public schools and libraries to install filtering software to block access to materials harmful to minors. In 2003 the Supreme Court upheld the constitutionality of this legislation.
In the spring of 2006, Congress began debate on a piece of legislation aimed at taking the provisions of CIPA one step further. The Deleting Online Predators Act (DOPA) would require schools and libraries receiving federal funding “to protect minors from commercial social networking websites and chat rooms.” Legislators acknowledged that DOPA primarily targeted MySpace, at the time a popular social network site where young people, primarily high school students, created profiles and links to other users within the site.
Because most teenage users of MySpace and similar forums elected to make their profiles available to the general Internet audience, legislators expressed concern that this made young people vulnerable to online predators. Some lawmakers also believed that the kind of censorship called for by DOPA was appropriate in light of the nature of the content posted by many MySpace users. DOPA supporters argued that graphic descriptions of sex, drugs, and drinking, as well as a convenient forum for online bullying, should not be made readily available to children in public schools and libraries.
Opponents of DOPA did not dispute the unsavory nature of some of the content to be found on MySpace and similar sites, or the dangers of adult strangers attempting to contact young people. They contended, however, that the restrictions called for by DOPA were so broad that they could deprive young people of access to legitimate educational tools such as blogs, mailing lists, and photo-sharing sites.
After several rounds of revision, DOPA was passed by the House in July 2006, and then handed off to the Senate. The Senate read the bill and referred it to the Committee on Commerce, Science, and Transportation, but never brought it to a vote.
In December 2012, after a two-year review period, the FTC issued amended COPPA regulations. Taking effect in July 2013, the amended regulations expanded the definition of "children's personal information" to include cookies that tracked online activity, geolocation data, photos, videos, and audio recordings. The revised regulations continued the safe harbor programs through which organizations could create their own FTC-approved self-compliance regulations for their members as an alternative to formal FTC COPPA enforcement.
During the late 2010s and early 2020s, the FTC issued fines to more than a dozen companies for violating the COPPA regulations. In September 2019, the FTC settled with Google, the parent company of YouTube, for $170 million for allegedly profiting from illegal personal data collection from underage children. Some critics thought the penalty was too low to serve as a deterrent to Google or similar companies. That same year, Musical.ly (later rebranded as TikTok) was fined $5.7 million for collecting personal data from children under thirteen without explicit parental consent. In December 2022, the FTC fined Epic Games, the maker of Fortnite, $520 million for collecting personal data from underage children without prior parental notification or consent and for making it difficult for parents to ask the company to delete a child's account. The following year, the FTC fined Microsoft $20 million for not only collecting underage users' personal information without parental notification or consent but also illegally retaining the data.
Child Safety Online Today
The late 2010s and early 2020s also saw the introduction of several federal legislative updates and improvements to COPPA, including COPPA 2.0. Introduced in March 2019 by Senators Ed Markey (Democrat, Massachusetts) and Josh Hawley (Republican, Missouri), COPPA 2.0 aimed to extend COPPA privacy protections to children fifteen years old and younger while also allowing platforms to collect personal information from children ages thirteen to fifteen with their own personal consent rather than parental consent. COPPA 2.0 also included provisions to establish a so-called Eraser Button that would delete a child's data from a platform while preventing the platform from deplatforming them or their parents. The legislation also aimed to ban sites from sending targeted ads to children under thirteen and to ban internet-connected toys and devices from stores unless their packaging disclosed to parents how a child's data would be collected, retained, shared, and protected, and unless that protection met strong security standards.
In 2022, Senator Richard Blumenthal (Democrat of Connecticut) and Senator Marsha Blackburn (Republican of Tennessee) introduced the Kids Online Safety Act (KOSA) in response to information leaked by Facebook whistle-blower Frances Haugen that showed Instagram could harm the mental health of teenagers. The bill included provisions that would require online platforms to refrain from promoting suicide, self-harm, eating disorders, and bullying; allow parents of children under sixteen to modify algorithmically generated recommendations, ban certain content, prevent third parties from accessing their children's data, and limit children's online screen time; and require social media platforms to publish yearly reports detailing their potential risks to minors. Tech companies responded to the legislation by saying they already followed numerous federal rules and regulations for keeping children safe online. Other critics of the legislation argued that it was too broad because it would apply to video games, streaming services, and any other online service that children under sixteen were reasonably likely to use. Another part of the bill that sparked controversy was Blackburn's statement that KOSA would categorize content about critical race theory, the civil rights movement and racism, and transgender health care as harmful. Critics on the political left felt that KOSA would censor vital information for LGBTQ+ youth and those who were Black, Indigenous, or other people of color, while critics on the right felt KOSA was not explicit enough in banning such content as harmful to mental health.
COPPA 2.0 and KOSA both initially failed to pass in the Senate, but were re-introduced in 2023. Blumenthal and Blackburn had revised KOSA to address critics' concerns about censorship of LGBTQ+ content, and by February 2024 the revised bill had gained enough support to pass the Senate.
In December 2023, the FTC proposed changes to COPPA that would enhance data protection by requiring separate opt-in parental consent before disclosing children's personal data to third parties, including advertisers—unless the disclosure is "integral to the nature of the website or online service." The revision, if passed, would further limit companies' ability to make service access contingent on monetizing children's data. The FTC also proposed limiting the "support for internal operations" exception; limiting companies' nudging of kids via push notifications to stay online; limiting data retention; codifying ed tech guidance; increasing accountability for Safe Harbor programs; strengthening data security requirements; and expanding personal information to include biometric identifiers.
These essays and any opinions, information or representations contained therein are the creation of the particular author and do not necessarily reflect the opinion of EBSCO Information Services.
Bibliography
Avni, Benny. “Who Do You Trust to Run the Internet?” Newsweek Global, vol. 162, no. 13, 2014, pp. 1–5. EBSCOhost, search.ebscohost.com/login.aspx?direct=true&db=a9h&AN=95129496. Accessed 15 Dec. 2014.
Bambauer, Derek E. “Orwell’s Armchair.” University of Chicago Law Review, vol. 79, no. 3, 2012, pp. 863–944. EBSCOhost, research.ebsco.com/linkprocessor/plink?id=fa8d269f-2815-36ae-aec1-844075fdadf4. Accessed 19 Feb. 2014.
Boas, Taylor C., and Shanthi Kalathil. Open Networks, Closed Regimes: The Impact of the Internet on Authoritarian Rule. Carnegie Endowment for International Peace, 2010.
Broude, Tomer, and Holger P. Hestermeyer. “The First Condition of Progress? Freedom of Speech and the Limits of International Trade Law.” Virginia Journal of International Law, vol. 54, no. 2, 2014, pp. 295-321. EBSCOhost, search.ebscohost.com/login.aspx?direct=true&db=a9h&AN=98390880. Accessed 15 Dec. 2014.
Corrales, Javier, Leslie David Simon, and Donald R. Wolfensberger. Democracy and the Internet: Allies or Adversaries? Woodrow Wilson Center Press, 2002.
Davis, Lauren Cassani. “How Do Americans Weigh Privacy Versus National Security?” Atlantic, Atlantic Monthly Group, 3 Feb. 2016, www.theatlantic.com/technology/archive/2016/02/heartland-monitor-privacy-security/459657/. Accessed 19 Feb. 2016.
Dibbell, Julian. “The Shadow.” Scientific American, vol. 306, no. 3, 2012, pp. 60–65.
Feiner, Lauren. "Kids Online Safety Act Gains Enough Supporters to Pass the Senate." The Verge, 15 Feb. 2024, www.theverge.com/2024/2/15/24073878/kids-online-safety-act-new-senate-support. Accessed 22 Mar. 2024.
Ginsberg, Jodie. “Global View.” Index on Censorship, vol. 43, no. 2, 2014, pp. 56–58.
Guo, Steve, and Guangchao Feng. “Understanding Support for Internet Censorship in China: An Elaboration of the Theory of Reasoned Action.” Journal of Chinese Political Science, vol. 17, no. 1, 2012, pp. 33-52. EBSCOhost, search.ebscohost.com/login.aspx?direct=true&db=a9h&AN=71507643. Accessed 19 Feb. 2014.
Krakovsky, Marina. “Garbage In, Info Out.” Communications of the ACM, vol. 55, no. 9, 2012, pp. 17-19. EBSCOhost, search.ebscohost.com/login.aspx?direct=true&db=a9h&AN=79648492. Accessed 19 Feb. 2014.
Naím, Moisés, and Philip Bennett. “The Anti-Information Age.” Atlantic, Atlantic Monthly Group, 16 Feb. 2015, www.theatlantic.com/international/archive/2015/02/government-censorship-21st-century-internet/385528/. Accessed 19 Feb. 2016.
Oluwole, Joseph, and Preston C. Green III. Censorship and Student Communication in Online and Offline Settings. Information Science Reference, 2016.
Petrosyan, Ani. "COPPA Fines Imposed against Companies in the United States 2018–2023." Statista, 4 Aug. 2023, www.statista.com/statistics/1401650/coppa-fines-us-companies/. Accessed 22 Mar. 2024.
Pontin, Jason. “Free Speech in the Era of Its Technological Amplification.” Technology Review, vol. 116, no. 2, 2013, pp. 60–65. EBSCOhost, search.ebscohost.com/login.aspx?direct=true&db=a9h&AN=85748123. Accessed 19 Feb. 2014.
Rauhala, Emily. “China’s Great Firewall Won’t Be Touched By Beijing’s New Reforms.” Time, 2013, p. 1. EBSCOhost, search.ebscohost.com/login.aspx?direct=true&db=a9h&AN=92115827. Accessed 19 Feb. 2014.
"Revised Children's Online Privacy Protection Rule Goes into Effect Today." Federal Trade Commission, 1 July 2013, www.ftc.gov/news-events/news/press-releases/2013/07/revised-childrens-online-privacy-protection-rule-goes-effect-today. Accessed 22 Mar. 2024.
Reynolds, Glenn Harlan. “When Digital Platforms Become Censors.” The Wall Street Journal - Online Edition, 18 Aug. 2018, www.wsj.com/articles/when-digital-platforms-become-censors-1534514122. Accessed 24 Sept. 2019.
"Senators Markey and Cassidy Reintroduce COPPA 2.0, Bipartisan Legislation to Protect Online Privacy of Children and Teens." Ed Markey, United States Senator for Massachusetts, 3 May 2023, www.markey.senate.gov/news/press-releases/senators-markey-and-cassidy-reintroduce-coppa-20-bipartisan-legislation-to-protect-online-privacy-of-children-and-teens. Accessed 22 Mar. 2024.
Silberling, Amanda. "Lawmakers Revise Kids Online Safety Act to Address LGBTQ Advocates' Concerns." TechCrunch, 15 Feb. 2024, techcrunch.com/2024/02/15/lawmakers-revise-kids-online-safety-act-to-address-lgbtq-advocates-concerns/. Accessed 22 Mar. 2024.
Sorkin, Andrew Ross, et al. "Child Safety Is the New Tech Battleground." DealBook, The New York Times, 17 Feb. 2022, www.nytimes.com/2022/02/17/business/dealbook/children-online-safety-bill.html. Accessed 22 Mar. 2024.