Summary
Last year was a pivotal year for data privacy, with privacy receiving significant attention from many regulators, including the Federal Trade Commission (“FTC”). A review of the FTC's 2023 enforcement actions, statements, and policies provides attorneys and clients with a helpful roadmap for future compliance.
Enforcement Actions
2023 saw a series of FTC enforcement actions focused primarily on online tracking technologies, children's privacy, the reasonableness of privacy and cybersecurity practices, and location data, resulting in substantial fines for companies across a variety of industries.
Online Tracking Technology – Comparing Promises and Actions
One clear focus area was digital health, and in particular the use of online tracking technologies. The FTC brought multiple actions, including under the Health Breach Notification Rule[1] ("HBNR"), alleging that digital health platforms GoodRx, BetterHelp, and Premom collected sensitive consumer health information, improperly shared it with third parties, and monetized it through targeted advertising in violation of the FTC Act,[2] the HBNR, and the entities' own privacy policies. Specifically, the FTC advanced the theory that sharing data with third parties constituted an unauthorized disclosure of individually identifiable health information and therefore a breach of security under the HBNR.[3] These actions resulted in significant monetary penalties of $1.5 million (GoodRx), $7.8 million (BetterHelp), and $100,000 (Premom), as well as non-monetary remedies including prohibitions on data sharing, data retention limits, and requirements to implement new privacy policies and programs. These actions are clear warnings for HIPAA-exempt companies in the healthcare space.
This trend is likely to continue, as the FTC recently finalized changes to the HBNR that clarify its applicability to health apps and similar technologies. Companies should take a close look at the tracking technologies used on their websites and apps and ensure their practices are consistent with their privacy disclosures, focusing in particular on what data is disclosed, to whom, and what restrictions are placed on third-party recipients' use of that information.
Children's Privacy – Dark Patterns, Collection, and Communication
Another area of focus was technology and software companies whose platforms target, or are accessed by, children, for violations of the Children's Online Privacy Protection Act Rule[4] ("COPPA") and the FTC Act. Amazon was fined $25 million for indefinitely retaining transcripts of children's voice recordings made through Amazon Alexa products and for misleading parents about their ability to delete their children's voice recordings. Additionally, the FTC moved to amend its 2020 order against Meta, which had imposed a $5 billion penalty, alleging that the Messenger Kids product allowed children to communicate with contacts without their parents' approval and allowed app developers to access children's personal data. Microsoft was also fined $20 million for failing to obtain parental consent before its Xbox products collected children's personal information and for retaining that information longer than reasonably necessary. In that case, the FTC extended COPPA protections to third-party game publishers with whom data is shared and clarified that COPPA applies to avatars generated from images of children when they are collected together with other personal information (e.g., information related to children's accounts). Finally, Edmodo, a company that develops educational technology tools, was fined $6 million for failing to obtain verifiable parental consent before collecting children's data, using children's personal information for advertising, and unlawfully outsourcing its COPPA compliance obligations to schools.
These actions demonstrate the FTC's commitment to protecting children's online privacy and regulating how children's data is collected and used. Organizations that collect children's data must ensure that they obtain parental consent in accordance with COPPA, collect information only when reasonably necessary, and retain data only for a reasonable period of time. They should also evaluate their privacy policy statements to ensure that internal practices are consistent with the promises made in them.
Reasonable Privacy and Cybersecurity Practices
Another focus was on violations of Section 5 of the FTC Act[5] arising from privacy and cybersecurity practices. Although no financial penalties were imposed, the FTC brought actions against Drizly and Chegg for improper practices, including misrepresentations and deceptive statements. The FTC characterized their security practices as unreasonable and alleged that privacy policy statements claiming reasonable security measures were deceptive.
In its action against Ring, the FTC alleged that Ring's lax security measures and overly permissive access grants exposed consumers' sensitive data to misuse by hackers. Ring was also penalized for failing to properly notify consumers, or obtain their consent, before conducting extensive human review of their private video recordings to train its algorithms. Specifically, the FTC determined that general statements in privacy policies and terms of service permitting a company to use consumer data for product improvement and development are not sufficient to justify human review of that data for algorithm training or other artificial intelligence purposes. As many companies investigate the feasibility of AI applications, this places renewed emphasis on privacy disclosures.
In its action against 1Health.io, the FTC alleged that 1Health.io retroactively changed its privacy policies without properly notifying consumers or obtaining their consent, and misled consumers about its data-sharing practices with third parties. This action clarifies that when companies make material retroactive changes to their privacy policies, the FTC may require them to notify consumers of those changes.
Geolocation Data
Although the FTC filed a sealed complaint against data broker Kochava in 2022, the amended complaint unsealed in 2023 provides insight into the FTC's expectations for the collection of geolocation data. The complaint reflects the FTC's concerns about the collection and sharing of geolocation data, alleging that Kochava sells geolocation tracking data collected from hundreds of millions of mobile devices, allowing third parties to identify and track individuals as they enter and exit sensitive locations, and thereby exposing those individuals to threats of stigma, stalking, discrimination, job loss, and potential physical violence.
Impact in 2024: The pending case is expected to move forward in earnest in 2024, and the Idaho District Court denied Kochava's motion to dismiss on February 5, 2024. Companies that collect geolocation data should follow this case carefully, as its outcome could have important implications for the collection and sharing of geolocation data.
Statements and Policy Positions
In addition to enforcement actions, the FTC clarified its position on a number of privacy issues through statements and policy positions in 2023.
In May, the FTC issued a policy statement on biometrics expressing concern about the increasing use of consumers' biometric information and related technologies that utilize machine learning. Specifically, the statement noted that the use of these technologies raises serious concerns for consumer privacy and data security and may lead to bias and discrimination, and the FTC committed to combating unfair or deceptive acts and practices related to the collection and use of consumers' biometric information and the use of biometric information technologies. In this statement, the FTC listed the factors it uses to determine whether a company's use of these technologies constitutes an unfair or deceptive practice in violation of the FTC Act. Organizations collecting and using biometric data can treat these factors, such as addressing known or foreseeable risks and providing appropriate training, as a roadmap for compliance policies and procedures.
Beyond that statement, the FTC issued additional statements citing concerns about discrimination and bias in automated systems. In a joint statement, the FTC and other regulators pledged to vigorously use their collective authority to promote responsible innovation, combat discrimination and bias, and oversee the development and use of automated systems. The FTC also adopted a resolution in November streamlining its ability to issue civil investigative demands in nonpublic investigations of AI-related products and services.
The FTC followed through on some of these concerns with its July 2023 investigation of OpenAI's ChatGPT. In that investigation, the FTC examined whether ChatGPT engaged in unfair or deceptive privacy or data security practices, or in practices posing a risk of harm to consumers, and requested records on how OpenAI addresses risks related to its AI models. Additionally, in late 2023 the FTC announced a settlement with Rite Aid, alleging that Rite Aid deployed AI facial recognition technology without reasonable safeguards and exposed consumers, especially women and people of color, to being falsely tagged as shoplifters.
Conclusion
The FTC is focused on preventing unfairness and bias from reaching consumers through technological innovation. We expect this focus to continue to expand through enforcement actions, which should prompt organizations to consider how to leverage technological innovation while protecting consumer data. In particular, companies that use artificial intelligence and machine learning algorithms, and those that process health and geolocation data, should prepare for increased oversight and further guidance from the FTC.
Footnotes
[1] 16 CFR Part 318.
[2] 15 USC §§ 41-58.
[3] The FTC's 2021 Policy Statement provided guidance on the scope of the HBNR, explaining that it covers most health apps not covered by HIPAA and that a breach of security can result from an organization's own actions and includes, but is not limited to, cybersecurity intrusions or other malicious activity. The 2023 actions were the first the FTC has taken under the HBNR and the first to apply the 2021 Policy Statement.
[4] 16 CFR Part 312.
[5] 15 USC § 45(a).