House Energy and Commerce Committee: Legislative Solutions to Protect Kids Online and Ensure Americans’ Data Privacy Rights
House Committee on Energy & Commerce
Subcommittee on Innovation, Data, and Commerce
Legislative Solutions to Protect Kids Online and Ensure Americans’ Data Privacy Rights
Wednesday, April 17, 2024
Topline
- Members from both parties discussed the American Privacy Rights Act, Kids Online Safety Act, and the Children and Teens’ Online Privacy Protection Act 2.0.
- Members from both parties debated data minimization, a private right of action, and algorithmic addiction.
Witnesses
- Ava Smithing, Director of Advocacy, Young People’s Alliance
- Maureen K. Ohlhausen, Co-chair, 21st Century Privacy Coalition
- Katherine Kuehn, Member, Board of Directors and CISO-in-Residence, National Technology Security Coalition
- Kara Frederick, Director, Tech Policy Center, The Heritage Foundation
- Samir C. Jain, Vice President of Policy, Center for Democracy & Technology
- David Brody, Managing Attorney, Digital Justice Initiative, Lawyers’ Committee for Civil Rights Under Law
Opening Statements
Subcommittee Chair Gus Bilirakis (R-Fla.)
In his opening statement, Bilirakis explained how the American Privacy Rights Act (APRA) provides Americans with the right to control their personal information, including how and where it is being collected and stored. He noted the legislation preempts the patchwork of state laws to provide consistent rights, protections, and obligations across state lines. Bilirakis discussed how APRA mandates strong data security standards that minimize the data collected and protect it from being used by bad actors. He said Members are also discussing proposals that would require age verification for certain websites and streamline terms of service labeling. Bilirakis criticized Big Tech for failing to prioritize the health and safety of our children online and concluded that Big Tech must be held accountable for manipulating kids to keep them addicted to their screens for longer than ever before.
Subcommittee Ranking Member Jan Schakowsky (D-Ill.)
In her opening statement, Schakowsky noted that consumers are concerned that companies are tracking their data, where they go, and who they talk to. She said that while almost eighty percent of people around the world are protected by national privacy laws, Americans are not. Schakowsky explained that twelve states have moved on data privacy, which reinforced the need for national laws like the APRA. She discussed how APRA would bar data brokers from providing data to scammers and allow consumers to opt out of algorithms that infringe on their personal freedoms. Schakowsky emphasized the need for strong protections for sensitive data like fingerprints and DNA. She concluded that while APRA is still a work in progress, she was confident that Congress would move forward.
Full Committee Chair Cathy McMorris Rodgers (R-Wash.)
In her opening statement, Rodgers explained how social media companies are collecting every data point imaginable to control what we see and when we see it. She warned about the handful of companies and bad actors who are exploiting our information, monetizing it, and using it to manipulate how we think and act. Rodgers said the APRA would give people the right to control their personal information online and not have it used against them. She discussed how algorithms built by Big Tech companies are specifically designed to get children addicted to their platforms and have been used to target kids with content that leads to dangerous, life-threatening behaviors. Rodgers concluded that the APRA would work with the Kids Online Safety Act (KOSA) and other bills to ensure the best protections to date for our children.
Full Committee Ranking Member Frank Pallone (D-N.J.)
In his opening statement, Pallone noted that for far too long, Americans have been powerless against Big Tech’s drive to collect, use, and profit from the sale of vast amounts of Americans’ personal information. He said he was pleased that the APRA has data minimization, rather than notice and consent, as its foundation. Pallone explained how data minimization limits the amount of information that can be collected, processed, retained, and transferred to only what is necessary to provide the products and services requested by the consumer. He noted the discussion draft combines data minimization with provisions that empower consumers to access, correct, and delete their personal data, and opt out of targeted advertisements. Pallone warned that the draft does not provide enough specific protections for children, such as prohibiting targeted advertising for children. He noted the Children and Teens’ Online Privacy Protection Act, which updates the Children’s Online Privacy Protection Act (COPPA), includes provisions for data minimization but leaves websites and apps largely free to collect and disclose minors’ information after receiving consent from a teen or the parent of a child. Pallone concluded that the combination of AI and personal information can be weaponized to deprive people of equal opportunities to find housing, look for jobs, or receive information about goods and services, and urged Congress to consider whether the legislation adequately reflects what they’ve learned about AI, particularly generative AI.
Testimony
Ava Smithing, Director of Advocacy, Young People’s Alliance
In her testimony, Smithing discussed how the social media landscape changed completely after Facebook bought Instagram, which ushered in algorithmically recommended content and targeted advertising. She noted that companies’ ability to track engagement allowed them to store users’ insecurities as data and link that data to all their other accounts across the internet. Smithing explained that if the U.S. had a national privacy standard that ensured data minimization, none of the harmful practices that social media companies engage in would have happened. She urged Congress to pass KOSA and COPPA 2.0 to update old privacy laws and protect against downstream harms caused by specific design features.
Smithing concluded that data privacy would protect users from the harm and polarization caused by social media companies by limiting the information platforms can collect on them.
Maureen K. Ohlhausen, Co-chair, 21st Century Privacy Coalition
In her testimony, Ohlhausen explained that personal information should not be subject to varying protections because of the state someone is in. She discussed how APRA addresses issues including transparency, consent, and the relationship between companies, vendors, and third parties. Ohlhausen noted the draft provides the FTC with several useful enforcement tools to protect consumers, including civil penalty authority for a first violation and limited Administrative Procedure Act (APA) rulemaking. She said that American consumers and businesses deserve the clarity of a single federal privacy standard.
Ohlhausen said the FCC’s data breach notification authority should be eliminated by the APRA and warned that the draft language could unintentionally cause significant disruptions to common and beneficial practices in the TV marketplace. She also noted her concerns that sensitive information was not included in the exception to data minimization requirements, given how broadly the discussion draft defines sensitive data. Ohlhausen warned that by adopting an overly broad definition of substantial privacy harm, the draft would invalidate arbitration agreements while inviting class action lawsuits that would undermine compliance with the bill.
Katherine Kuehn, Member, Board of Directors and CISO-in-Residence, National Technology Security Coalition
In her testimony, Kuehn emphasized that Americans’ data privacy must be protected. She discussed the complex system of state-specific privacy laws, which she described as confusing for consumers and burdensome for businesses. Kuehn warned of the risk that states could compete by offering looser regulations to attract business investments, creating a race to the bottom. She concluded that the APRA represented a significant improvement in the landscape of consumer privacy protections in the U.S.
Kara Frederick, Director, Tech Policy Center, The Heritage Foundation
The hearing coverage lapsed during Frederick’s testimony; none of her testimony was captured.
Samir C. Jain, Vice President of Policy, Center for Democracy & Technology
In his testimony, Jain discussed the DELETE Act, which would establish a centralized mechanism through which individuals could seek the deletion of their information. He outlined how bans on targeted advertising for those under seventeen and on transfers of children’s data without consent would provide meaningful and important protections for children. Jain said that while well intentioned, legislation that restricts access to content can harm youth and present significant constitutional issues. He concluded that incentives to adopt age verification systems to identify children often require further data collection from children and adults alike, which undermines privacy.
David Brody, Managing Attorney, Digital Justice Initiative, Lawyers’ Committee for Civil Rights Under Law
In his testimony, Brody discussed how the lack of a federal privacy law enables discrimination and other harms. He explained that tech companies that collect data on Black communities and other historically marginalized groups feed it into algorithms that make life-altering decisions and warned that these practices lead to discriminatory harms and unequal access to goods and services. Brody said the foundation of the APRA includes data minimization, civil rights, consumer protections, individual controls, and multi-layered enforcement. He noted that APRA improves on previous legislative efforts by prohibiting forced arbitration of claims involving discrimination and providing the right to have major decisions made by a human instead of AI. He described APRA as an imperfect but needed bargain that requires companies to test their algorithms for biases and said that individual protections build consumer trust and reduce the risk of fraud, theft, and deceptive practices. Brody concluded by highlighting his concerns that APRA narrowed the private right of action for violations involving sensitive data.
Question & Answer
Big Tech
Rodgers asked why Big Tech should be subject to algorithm assessments and design evaluations. She asked if Congress should require a company to give thought to the impact on Americans and the decisions they make. Frederick said private companies are just as capable of infringing on Americans’ rights as the government. She said that while companies say they are transparent, the public only sees what companies want them to see.
Rep. Kathy Castor (D-Fla.) criticized Big Tech for using every method possible to keep children online and addicted so they can pocket huge profits. She cited testimony from Facebook whistleblower Frances Haugen and others who said that companies know their platforms are causing harm but that kids are simply too lucrative for them to change how they do business.
Castor asked why it’s important for Congress to address both privacy protection and the design code.
Smithing said KOSA will address design features that are not addressed by data privacy. She explained that data privacy provisions alone cannot address harmful design features such as likes, endless scrolling, beauty filters, and others that keep us on platforms for longer. Castor said she wouldn’t put it past the Big Tech platforms that have influence on Capitol Hill to throw up barriers along the way.
Rep. Tim Walberg (R-Mich.) asked how Big Tech’s efforts to recruit younger and younger users should impact Congress’ efforts, and whether Congress should include specific privacy protections and enforcement mechanisms for younger users. Frederick said absolutely, and noted tech companies are trying to outdo each other to get younger users addicted to their platforms.
Rep. Jeff Duncan (R-S.C.) asked the witnesses to identify who represents the biggest threat to personal data. Frederick said TikTok and Big Tech companies. Smithing also said Big Tech companies, while Brody said Big Tech and data brokers. Jain also cited data brokers.
Rep. Diana Harshbarger (R-Tenn.) noted Facebook and Google can comply with any law that Congress throws at them. She asked how small businesses’ interactions with Facebook and Google would change under the APRA. Ohlhausen explained how a federal privacy standard would allow small businesses to design and create systems around a single standard without having to adapt to a changing landscape. She said APRA strikes a good balance between allowing pro-competitive uses of data and deterring harmful advertising.
Rep. John James (R-Mich.) blasted social media platforms for making young people more depressed and wreaking havoc on mental health, particularly after the Covid pandemic. James described Facebook/Meta as the Philip Morris of our time and said now is the time to act. James discussed his legislation, the Protecting Kids on Social Media Act, which he said reins in Big Tech’s abusive use of algorithms to target minor children.
James asked about the benefits and drawbacks of federal involvement with social media companies. Kuehn said there needs to be a public-private partnership, and described the situation as an opportunity for the public and private sectors to work together to do what is best for children.
Rep. Kim Schrier (D-Wash.) said she was concerned about the impacts that screen time and social media have on kids. She noted the KOSA would ensure that social media companies are held responsible for ensuring a safe online environment for kids whenever possible, and would require Big Tech companies to be transparent about the design features that make these apps so addictive.
Algorithmic Products
Rep. Yvette Clarke (D-N.Y.) asked why it’s important to include provisions that prohibit algorithmic discrimination and require algorithmic accountability in a comprehensive privacy bill. Brody explained that algorithmic products are being rolled out with inadequate testing, and often exclude and discriminate against people of color and other marginalized groups. He said APRA requires entities to test their algorithmic systems both before and after deployment to ensure that disparate impacts are not happening. Jain agreed with Brody and called for transparency into how algorithms work to identify potential harms and how to fix them.
Rep. Russ Fulcher (R-Idaho) asked whether it’s possible to balance algorithms that target advertisements without having them misuse that data. Frederick cited the potential for corruption, noting how the CCP makes extensive use of dual-use concepts to our detriment.
Schrier asked Smithing to discuss the addictive nature of design features. Smithing said when looking at addiction specifically, it is important to limit the amount of data that goes into recommendation algorithms. She said it’s not enough to opt out of targeted advertisements, because the cadence at which these posts are delivered is what makes them so addictive. She reiterated the need to allow people to opt out of these algorithms in the first place.
Data Minimization
Schakowsky asked how the APRA draft addresses the issue of scammers. Jain explained that the draft’s data minimization requirements, which restrict the initial collection and transfer of data, would lessen the amount of data flowing through the broker ecosystem. He noted these protections would give consumers more power to rein in harmful activities by data brokers.
Pallone asked if the data minimization process in the COPPA 2.0 adequately protects kids’ privacy. Jain said no, and explained COPPA 2.0’s provisions primarily apply to just the collection of information, while APRA’s standards apply to the processing, transfer, and other aspects of data use. He emphasized that the APRA has a stronger standard.
Rep. Debbie Dingell (D-Mich.) said she was pleased the draft APRA language included data minimization protections. She asked if the APRA draft sufficiently addressed data minimization. Jain said the basic standard is strong but suggested there might be a need to improve the permissible purposes. He explained that the bill allows data to be collected to prevent fraud and warned that data brokers sometimes claim they are collecting large amounts of data in an attempt to combat fraud. Jain said Congress should prevent this from becoming a loophole that lets data brokers justify their collection of data.
Private Right of Action
Bilirakis asked how KOSA and APRA could curb the power of Big Tech and give control back to the American people. Frederick explained how both bills provide transparency on the harms that Big Tech companies can cause for consumers and their children. She noted private rights of action are critical to enabling enforcement of the legislation.
Rep. Darren Soto (D-Fla.) asked Brody to explain his remarks about KOSA backtracking on causes of action for privacy violations. Brody explained that previous proposals provided a private right of action for the collection, processing, retention, and transfer of sensitive covered data, whereas the current proposal provides a private right of action only for the transfer of that data.
Rep. Robin Kelly (D-Ill.) asked about the consequences of failing to include a private right of action against entities that are collecting, retaining, processing, and storing irrelevant sensitive data. Jain said data minimization is the foundational feature for the bill, and warned that without strong enforcement, we risk undermining that foundation. He explained that the private right of action is an enforcement measure that allows individuals to collect damages and encourages companies to take proactive privacy measures in the first place.
National Privacy Standard
Rodgers asked how important a uniform federal standard is in terms of American leadership on the global stage. Kuehn said a national standard is critical, and explained that with so many state laws, it is difficult for corporations and international organizations to have effective privacy standards.
Duncan asked which provisions would be the most important to include in a federal privacy law. Smithing and Kuehn both said data minimization and opt out capabilities. Brody said data minimization and civil rights protections, while Ohlhausen warned that data minimization must be balanced with permissible uses of the data. Frederick emphasized the importance of having an enforcement mechanism, which Jain echoed. Jain also listed data minimization.
Concerns/Changes to the APRA
Rep. Debbie Lesko (R-Ariz.) asked the witnesses to respond to the concerns that the preemptive language of the APRA was not strong enough. Jain said there needs to be a compromise with preemption, and explained that the APRA tries to come up with a compromise by recognizing places where states have particular expertise like in healthcare. Ohlhausen said that while the general language is good, some of the exceptions that still allow state law to prevail on issues like tort or common law could be a way to sidestep preemption. Brody said it’s difficult to strike a delicate balance because data touches everything. He warned about harming the ability of states to regulate fraudulent practices or other types of harm that are not being anticipated by APRA.
Lesko asked the witnesses to comment on concerns that companies that share data with their subsidiaries could be affected by the APRA. Frederick said there are third-party tools that can send user data to other companies like TikTok. She noted there is a whole ecosystem that needs to be restricted so companies cannot use those loopholes. Kuehn emphasized the need to have the right provision across each line in the data ecosystem so that everyone is held responsible.
Rep. Lori Trahan (D-Mass.) said she was pleased about the consideration of the DELETE Act but was concerned that the changes to some of its provisions as included in the APRA draft would not fully meet the needs of American users. Trahan asked how the provisions in the APRA differ from what is in the DELETE Act, and whether that part of the discussion draft should be strengthened. Jain said yes, and that it should be strengthened by adding a centralized mechanism that allows consumers to ask all data brokers to delete their data in one go.
Rep. Kat Cammack (R-Fla.) asked the witnesses about altering the APRA language to include an opt-in provision for data collection, instead of an opt-out provision. Frederick described it as an interesting idea. Jain said default settings have a lot of power, and noted there might be certain settings where it is okay for people to be opted in. Kuehn urged Congress to look at the lessons learned from our international counterparts. She noted there are technical factors that need to be considered before deciding between opt-in and opt-out.
For more information on this meeting, please click here.
For an archive of past SIFMA hearing coverage, please click here.