On September 6, 2024, the U.S. Department of Labor (DOL) issued Compliance Assistance Release No. 2024-01, titled “Cybersecurity Guidance Update.” The updated guidance clarifies that the DOL’s cybersecurity guidance applies to all ERISA-covered plans, not just retirement plans but also health and welfare plans. In a direct response to service providers’ concerns, the DOL also expanded its 2021 guidance to emphasize that plan sponsors, fiduciaries, recordkeepers, and participants should adopt cybersecurity practices across all employee benefit plans. With cyber risks continually evolving, the update highlights the importance of implementing robust security practices to protect participant information and plan assets.

Background

When the DOL initially issued its cybersecurity guidance in April 2021, it was intended to help ERISA plan sponsors, fiduciaries, service providers, and participants safeguard sensitive data and assets. Some service providers and recordkeepers interpreted the guidelines as applicable only to retirement plans, which led to industry calls for clarity. The 2024 Compliance Assistance Release addresses these concerns by confirming that the DOL’s cybersecurity expectations are indeed intended to extend to all ERISA-covered employee benefit plans, including health and welfare plans.

Expanded Guidance Highlights

The updated guidance maintains the original three-part format, emphasizing Tips for Hiring a Service Provider, Cybersecurity Program Best Practices, and Online Security Tips. Here’s a breakdown of these components and key updates from the recent guidance:

1. Tips for Hiring a Service Provider

Plan sponsors and fiduciaries have a critical responsibility when selecting and monitoring service providers to ensure strong cybersecurity practices are in place. The updated DOL guidance advises fiduciaries to thoroughly vet potential providers by asking specific, detailed questions. One key area to examine is insurance coverage. Fiduciaries should verify that the prospective provider’s insurance includes coverage for losses resulting from cybersecurity incidents.

In addition, fiduciaries should review the provider’s security history and validation processes. This involves requesting records of past security incidents, recent information security audits, and any evidence of the provider’s compliance with cybersecurity standards. Finally, it is essential to establish clear contractual obligations with service providers. Contracts should contain provisions addressing data confidentiality, timely breach notification, ongoing compliance monitoring, and well-defined incident response protocols.

By specifying these points, the DOL aims to provide plan fiduciaries with concrete criteria for evaluating potential third-party providers, especially those managing sensitive health and welfare data.

2. Cybersecurity Program Best Practices

The updated guidance underscores the need for a comprehensive cybersecurity framework to protect ERISA plans. A cornerstone of this approach is conducting regular cybersecurity risk assessments. By identifying potential vulnerabilities, plan sponsors and fiduciaries can better understand the specific risks to their data and implement targeted access controls to ensure that only authorized individuals can access sensitive information. Data encryption is also a vital part of the DOL’s recommendations. Encrypting data both in transit and at rest adds a critical layer of defense, protecting information from unauthorized access even if the data is intercepted or compromised.

The guidance further highlights the DOL’s focus on enhanced MFA. Service providers, in particular, are encouraged to implement phishing-resistant MFA, especially for systems exposed to the internet or areas containing highly sensitive data. By deploying these robust authentication methods, ERISA plan administrators can significantly reduce the risk of unauthorized access and bolster overall security. Additionally, the DOL pointed health and welfare plan sponsors to resources from the Department of Health and Human Services (HHS), including the Health Industry Cybersecurity Practices and guidelines tailored for small, medium, and large healthcare organizations.

3. Online Security Tips for Participants

Educating participants plays a crucial role in reducing cyber risks, and the DOL encourages plan sponsors to empower participants with resources that strengthen their account security. One fundamental aspect of this education involves password management and the use of multi-factor authentication (MFA). The DOL recommends that participants use longer, unique passwords and change them annually. This approach offers a balance, maintaining security without overwhelming users with frequent updates.

Sponsors should also encourage participants to enable MFA wherever possible, as this extra layer of protection makes it significantly harder for unauthorized users to gain access. Additionally, the DOL highlights the importance of cyber threat awareness. Educating employees on recognizing phishing attempts, avoiding free public Wi-Fi when accessing sensitive accounts, and keeping contact information up to date is essential to safeguard against fraud. By understanding and implementing these practices, plan participants can actively contribute to the security of their accounts.
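The app-based MFA codes referenced above are typically time-based one-time passwords (TOTP). As a purely illustrative, standard-library-only Python sketch (not part of the DOL guidance), this is the RFC 6238 algorithm that authenticator apps commonly implement; the 30-second step and 6-digit code length shown here are the usual defaults, not DOL-specified values:

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password."""
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation offset (low nibble of last byte)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

def totp(secret: bytes, for_time=None, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based one-time password: HOTP over a moving time counter."""
    t = time.time() if for_time is None else for_time
    return hotp(secret, int(t // step), digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890" at T=59
print(totp(b"12345678901234567890", for_time=59, digits=8))  # → 94287082
```

Because each code is derived from a shared secret and expires within the time step, an intercepted code is of little lasting value; phishing-resistant MFA such as FIDO2 hardware keys goes further by binding authentication to the legitimate site.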

Takeaways and Action Items for Plan Sponsors and Fiduciaries

The updated guidance reinforces the importance of cybersecurity across all ERISA-covered plans. To adhere to the DOL’s expectations and mitigate cyber risks effectively, plan sponsors and fiduciaries should consider these actions:

  • Evaluate Service Provider Cybersecurity: Conduct due diligence by asking for information on service providers’ cybersecurity policies, audits, and breach history. Include clear cybersecurity terms in contracts and ensure vendors have applicable insurance coverage.
  • Implement Robust Cybersecurity Policies: Ensure your organization’s cybersecurity policies align with DOL guidelines, including regular risk assessments, strong encryption practices, and incident response planning.
  • Educate Participants: Provide ongoing resources to educate plan participants on online security, focusing on best practices like strong passwords, MFA, and phishing awareness.
  • Leverage HHS Resources for Health Plans: For health and welfare plans, use the HHS cybersecurity guidance to align your practices with industry-specific standards.
  • Conduct a Cybersecurity Self-Audit: Consider conducting a self-audit or hiring a cybersecurity expert to assess and improve your cybersecurity practices. Health plans, in particular, should coordinate these audits with HIPAA privacy and security requirements.
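On the “strong passwords” point in the participant-education item above, a hedged illustration: the short Python sketch below generates a long, random password using the standard library’s `secrets` module (a cryptographically secure random source). The 20-character default is an arbitrary example, not a length the DOL prescribes:

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    # Each character is drawn independently from a cryptographically
    # secure source; 20 characters is an illustrative default only.
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # random each run, e.g. 'k#9Qv...'
```

A password manager accomplishes the same thing for participants in practice, generating and storing a unique credential per account.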

Seyfarth Synopsis: In a significant decision for website operators, the Massachusetts Supreme Judicial Court clarified that tracking users’ web activity does not constitute illegal wiretapping under the state’s Wiretap Act. The court found that person-to-website interactions fall outside the Act’s scope, which focuses on person-to-person communications. However, the court emphasized that other privacy laws could still apply to such tracking practices. This ruling may influence how similar cases proceed nationwide and signals to the Massachusetts legislature that any broader restrictions on web tracking require explicit statutory action.

On Thursday, October 24, 2024, the Massachusetts Supreme Judicial Court ruled that the Massachusetts state wiretap act (“Wiretap Act”) does not prevent a website owner from tracking visitors’ web browsing activity, even without user consent. Plaintiffs have filed numerous similar lawsuits under various state wiretapping laws around the United States, and courts have largely permitted those claims to proceed past the motion to dismiss stage. This decision from the Massachusetts high court could alter that course.

Plaintiff Kathleen Vita alleged that she had accessed and reviewed information on the websites of the defendants – New England Baptist Hospital and Beth Israel Deaconess Medical Center, Inc. – including doctors’ information, medical symptoms, conditions, and procedures. She alleged the defendants collected and shared her browsing history with third parties for advertising purposes without her consent. These third parties include Facebook and Google, which obtained the information through tracking software – Meta Pixel and Google Analytics – installed on the defendants’ websites. Plaintiff did not allege that any private patient records or messages to nurses or doctors communicated through the website were intercepted or shared.

The Massachusetts Supreme Judicial Court reversed the lower court’s denial of the defendant hospitals’ motion to dismiss. In doing so, the Court looked to the statutory text of the Wiretap Act and legislative intent when the Act passed. The Court focused on the statutory term “communication” and determined that the legislature only intended to prevent the wiretapping of or eavesdropping on person-to-person communications when passing the Act. The conduct Plaintiff alleged did not involve person-to-person communications, but rather an interaction between a person and website, and thus fell outside the purview of the Wiretap Act.

The Court did recognize the legislature’s intent for the law to apply to new and emerging technologies that may not have been contemplated when the law was originally passed in 1968. Thus, the Court noted the wiretapping law could apply to person-to-person communications across a broad technological spectrum, including cell phones, text messaging, internet chats or email, so long as the communication actually involves people communicating with each other. But if the legislature intends for the wiretapping law to prohibit the tracking of a person’s browsing activity or interaction with a website, the Court urged the legislature to pass a law stating so expressly.

Although the Supreme Judicial Court in Massachusetts sided with defendants in determining that website tracking does not violate the Massachusetts state Wiretap Act, it also noted that the activity may violate other privacy laws outside the wiretapping context. Accordingly, businesses in Massachusetts and elsewhere should consider the host of privacy laws when implementing website tracking software. Additionally, it remains to be seen whether the Massachusetts legislature will heed the Court’s directive and pass a law expressly prohibiting website tracking under the Wiretap Act or other statute. Lastly, while this particular case resulted in a positive outcome for businesses utilizing website tracking software, courts in different states around the United States have reached different conclusions under their respective laws.

Corporations face unprecedented challenges in safeguarding sensitive data and mitigating privacy risks in an era marked by the rapid proliferation of Internet of Things, or IoT, devices.

Recent developments, including federal and state regulators’ heightened focus on privacy enforcement, highlight the importance of proactive risk management, compliance and data governance. As IoT and smart devices continue to hit the marketplace, heightened scrutiny for businesses’ data governance practices follows.

The Federal Trade Commission’s recent technology blog post, “Cars & Consumer Data: On Unlawful Collection & Use”[1] underscores the agency’s commitment to enforcing consumer protection laws. Despite the post’s focus on the car industry, the FTC’s message extends to all businesses, emphasizing its vigilance against illegal — or “unfair and deceptive” — collection, use, and disclosure of personal data.

Recent enforcement actions are a stark reminder of the FTC’s proactive stance in safeguarding consumer privacy.

Geolocation data is a prime example of sensitive information subject to enhanced protections under the Federal Trade Commission Act. Much like mobile phones, cars can reveal consumers’ persistent, precise locations, making them susceptible to privacy infringements.


On August 2, 2024, Illinois Governor J. B. Pritzker signed legislation reforming Illinois’ Biometric Information Privacy Act (BIPA). Senate Bill 2979 immediately amends BIPA to limit a private entity’s potential liability for collecting or sharing biometric data without consent.

The BIPA amendment followed a call for action directed at the legislature from the Illinois courts. Previously, the question of damages liability had wound its way through appellate review in Illinois courts. The amendment changes course from the Illinois Supreme Court’s interpretation of BIPA claim accrual, which had held that each unlawful collection or disclosure constituted a new BIPA claim, while leaving damages discretionary.

Now, with the passage of SB 2979, a private entity that collects or otherwise acquires biometric data in more than one instance for the same person commits only one violation of the Act. Additionally, a private entity that discloses biometric data from the same person to the same recipient commits only one violation of the Act, regardless of the number of times that data is disclosed. As a result, individuals are only entitled to a single recovery of statutory damages.

This reform has the potential to reduce the top-end liability private entities may face when it comes to BIPA claims. However, many BIPA litigators are of the opinion that a single instance of harm was already “built in” to settlement valuations in prior cases, and that this new legislation will not do much to alter the approximate average valuation of $1,500 per person that most plaintiff lawyers are putting on class settlement demands in BIPA lawsuits. Additionally, even a single instance of alleged harm involving tens of thousands of employees or customers can still amount to significant damage claims. Businesses are still well-advised to be wary before deploying any biometric collection device or mechanism in Illinois without legal advice about appropriate consent and legal compliance obligations.

The European Union (EU)’s government organizations are just like any other entity trying to function in a world where global companies and even government entities rely on digital platforms for messaging and collaboration. For years, there has been debate about how platforms like Microsoft 365, formerly Office 365, could be deployed in a way that complies with the GDPR processing and transfer restrictions. It turns out that even the European Commission (EC) itself can apparently get it wrong. In a surprising turn of events earlier this month, the European Data Protection Supervisor (EDPS) concluded its nearly three-year investigation into the Commission’s own deployment and use of Microsoft 365, signaling a pivotal moment in the conversation about the GDPR privacy and security requirements for cloud-based messaging and document collaboration platforms.

The Catalyst for Change

The EDPS’s investigation, spurred by the well-known Irish DPC case initiated by Maximilian Schrems (C-311/18), widely referred to as the “Schrems II” case, and undertaken as part of the 2022 Coordinated Enforcement Action of the European Data Protection Board (EDPB), unearthed several critical issues with the Commission’s deployment of Microsoft 365. These findings reportedly involve the EC’s failure to ensure that personal data transferred outside the EU/EEA is afforded protection equivalent to that within the EU/EEA, and a lack of specificity in contracts between the Commission and Microsoft regarding the types of personal data collected and the purposes for its collection. Those contractual terms, and the accompanying GDPR safeguards and commitments, are of course the same terms that every other global company is using with Microsoft, posted on the Microsoft website and generally not open for negotiation or discussion.

The EDPS’s Verdict

The resolution to the findings is as unprecedented as the investigation itself. The EDPS issued the EC a reprimand and imposed corrective measures demanding the suspension of all data flows from the use of Microsoft 365 to Microsoft and its affiliates and sub-processors outside the EU/EEA not covered by an adequacy decision, effective December 9, 2024. Additionally, the Commission must demonstrate its compliance with Regulation (EU) 2018/1725, specifically regarding purpose limitation, data transfers outside the EU/EEA, and unauthorized data disclosures by the same date.

This decision, the first of its kind, raises important questions about the future of data protection enforcement within the EU – and the use and deployment of any cloud-based platform like Microsoft 365 by any company established in the EU. What mechanisms will be or can be employed to ensure compliance? How will this affect the technical and logistical operations of the European Commission and potentially other EU institutions and bodies as they transition data flows to new servers?

A Roadmap for Compliance

Despite the challenges and short-term confusion this decision presents, it also offers a silver lining. The decision serves as a vital roadmap for compliance, setting a precedent for the level of transparency and security required in data processing and transfer activities. This move by the EDPS reinforces the EU’s stance on the importance of data protection, signaling to institutions, companies, and individuals alike that safeguarding personal data is paramount and non-negotiable.

The Path Forward

As we reflect on this decision, it is clear that the implications extend far beyond the confines of the European Commission and Microsoft 365. This decision serves as another wake-up call to all entities operating within the EU’s jurisdiction, emphasizing the need for stringent data protection measures and the importance of reevaluating current data handling practices. The fact that the EC itself is on the receiving end is a surprise, but plenty of other companies with operations in the EU know they are doing the exact same things that the EC just got reprimanded for doing.

Looking ahead, the decision by the EDPS is not just about compliance; it’s about setting a global standard for data protection that companies can understand and predictably follow. As we move forward, institutions and companies should take heed of the path to compliance outlined by the decision. Businesses can no longer assume things are safe by relying on the size and popularity of the Microsoft 365 ecosphere, and the contentment that everyone else is doing the same thing. This decision rocked the boat for everyone. If the EC themselves can get this wrong, what chance is there for the rest of us?

On Wednesday, March 20, Seyfarth attorneys Rebecca Woods, Owen Wolfe, Lauren Leipold, and Puya Partow-Navid will present, and Ken Wilton will moderate, the first session of the 2024 Commercial Litigation Outlook webinar series: Charting the Course: AI’s Influence on Legal Practice and IP Protection.

Time of the event:
1:00 p.m. to 2:00 p.m. Eastern

About the Program

Our esteemed panel of experts will explore the intricate intersections of AI technology, legal practice, and intellectual property rights. As businesses worldwide adapt to transformative advancements and evolving regulatory frameworks, this session promises invaluable insights into the future of law and IP protection amidst the AI revolution.

  • Explore the transformative impact of AI on legal practice in 2024 and beyond
  • Obtain insights into forthcoming AI regulations and their implications for businesses operating in the US and EU
  • Evaluate risk mitigation strategies to avoid potential liability when using AI platforms
  • Learn how to implement strategies for safeguarding intellectual property rights amidst advancing AI technology
  • Discuss the evolving role of legal education in preparing lawyers for an AI-enabled future and the shift towards human-centered AI

Register here.

This is the first webinar as part of the 2024 Commercial Litigation Outlook series. For more information on the other upcoming webinars in the series, please see below.



Part 2 – Navigating Legal Minefields: Insights on Restrictive Covenants, eDiscovery, and Privacy Compliance

Thursday, April 11, 2024
1:00 p.m. to 2:00 p.m. Eastern
12:00 p.m. to 1:00 p.m. Central
11:00 a.m. to 12:00 p.m. Mountain
10:00 a.m. to 11:00 a.m. Pacific

About the Program

The second webinar in the series will examine the regulatory landscape surrounding non-compete agreements and will also address critical aspects in the realm of eDiscovery and Privacy litigation. Specifically covering the following:

  • Federal Attempts to Curb Non-Competes: Delve into the proposed FTC rule and the NLRB’s stance, analyzing their potential impacts and the legal challenges they may face.
  • State Initiatives: Uncover the latest legislative developments from states like California, Minnesota, and New York, examining how these changes could impact employers nationwide.
  • Judicial Scrutiny and Trends: Gain insights into recent court decisions regarding non-competes and confidentiality provisions, and understand their implications for businesses.
  • Regulatory Enforcement Surrounding Privacy Laws: Learn about the rising regulatory enforcement and litigation surrounding data privacy laws, including the impact of consumer awareness and state legislation on businesses.
  • Navigating the Risks of Privacy Litigation: Discover the latest developments in privacy litigation, including the surge in lawsuits related to website beacons, biometric data, and AI processing. Gain insights on compliance frameworks and preemptive risk assessments to mitigate litigation threats.
  • Advancements and Risks in eDiscovery Tools: Learn about the latest advancements in GenAI eDiscovery tools, including document summarization, subjective coding determinations, and GenAI syntax and querying. Understand the challenges and considerations of adopting GenAI in litigation, including defensible use of technology and negotiating discovery protocols.
  • Generative AI in eDiscovery Workflows: Hear about the potential of Generative AI in eDiscovery workflows to streamline your business, increase productivity, and reduce inefficiencies amidst rising regulatory enforcement and litigation surrounding data privacy laws.

Moderator:

Rebecca Woods, Partner, Seyfarth Shaw

Speakers: 

Dawn Mertineit, Partner, Seyfarth Shaw
James Yu, Senior Counsel, Seyfarth Shaw
Jason Priebe, Partner, Seyfarth Shaw
Matthew Christoff, Partner, Seyfarth Shaw


Part 3 – Commercial Litigation Outlook: Insights and Predictions for Litigation Trends in 2024

Thursday, May 2, 2024
1:00 p.m. to 2:30 p.m. Eastern

In the third session of the series, we will dive into the dynamic landscape of further litigation trends set to shape the coming year. Our panelists will provide invaluable insights and practical advice to navigate these complex legal arenas effectively. Don’t miss this opportunity to stay ahead of the curve and arm yourself with the knowledge needed to thrive in the ever-evolving world of litigation. Join us to explore trends, predictions and recommendations in the following areas:

  • Antitrust
  • Consumer Class Actions
  • ESG
  • Franchise
  • Health Care
  • Securities & Fiduciary Duty

Moderator:

Shawn Wood, Partner, Seyfarth Shaw

Speakers:

Brandon Bigelow, Partner, Seyfarth Shaw
Kristine Argentine, Partner, Seyfarth Shaw
Gina Ferrari, Partner, Seyfarth Shaw
John Skelton, Partner, Seyfarth Shaw
Jesse Coleman, Partner, Seyfarth Shaw
Greg Markel, Partner, Seyfarth Shaw


We are committed to providing you with actionable insights and strategic guidance to stay ahead in an ever-changing legal environment. Don’t miss this opportunity to gain invaluable knowledge and network with industry experts.

To register for the entire series, please click here. For more information and to access our full Commercial Litigation Outlook 2024 publication, please click here.

This blog post is co-authored by Seyfarth Shaw and The Chertoff Group and has been cross-posted with permission.

What Happened

On July 26, the U.S. Securities & Exchange Commission (SEC) adopted its Cybersecurity Risk Management, Strategy, Governance, and Incident Disclosure final rule on a 3-2 vote. The final rule is a modified version of the SEC’s earlier Notice of Proposed Rulemaking (NPRM) released in March 2022. The final rule formalizes and expands on existing interpretive guidance requiring disclosure of “material” cybersecurity incidents.


This post was originally published to Seyfarth’s Global Privacy Watch blog.

On July 10th, the European Commission issued its Implementing Decision regarding the adequacy of the EU-US Data Privacy Framework (“DPF”). The Decision has been eagerly awaited by US- and Europe-based commerce, hoping it will help businesses streamline cross-Atlantic data transfers, and by activists who have vowed to scrutinize the next framework arrangement (thereby maintaining their relevance). Regardless of the legal resiliency of the decision, it poses an interesting set of considerations for US businesses, not the least of which is whether or not to participate in the Framework.

For those who followed the development and demise of the Privacy Shield program and the Schrems II case, it has been apparent for some time that the fundamental objection of the activists and the Court of Justice of the EU (“CJEU”) to the original Privacy Shield was the perception that the US intelligence community had an ability to engage in disproportional data collection without any possibility of recourse by EU residents whose personal information may be swept into an investigation. The actual functioning of the program for certifying businesses was much less controversial.

Since the structure of the program wasn’t the primary reason for Privacy Shield’s revocation, from a business perspective, the current DPF looks a lot like the old Privacy Shield. For businesses who made the decision to participate in the Privacy Shield program in the past, the operational burden shouldn’t be much different under the new DPF, if they have already taken steps to operationalize the requirements.

What is interesting about the new DPF is how it may impact a company’s decision to choose between the Standard Contractual Clauses (“SCCs”) and the alternative adequacy mechanism for transfers. There is also some interest vis-à-vis the DPF and its interactions with state privacy laws.

DPF v. SCCs

One of the components of the new SCCs adopted in 2021 (which did not exist in the prior version of the SCCs) is the requirement that all SCCs be accompanied by a transfer impact assessment (“TIA”)[1]. A TIA is designed to assess whether there are legal barriers to the enforcement of the SCCs in the relevant importing jurisdiction – in this case, the US. Many commentators, and some courts, have applied the Schrems II reasoning to argue that use of the SCCs as a transfer mechanism to the US is not effective in all circumstances, because the Foreign Intelligence Surveillance Act (“FISA”) authorizes US intelligence to engage in bulk collection under section 702, and such programs are not proportional and do not have the reasonable safeguards required under EU law.

Although the SCCs are still used to transfer European data to the US (mostly because, after Privacy Shield was invalidated, they had practically speaking been the only remaining transfer mechanism for many businesses), several commentators have taken the position that, if Schrems II is taken to its logical conclusion, then any use of SCCs in the US is effectively impossible, because US companies cannot live up to their promises in the SCCs. This was noted in an expert report commissioned by the German Conference of Independent Data Protection Supervisors to assess the broad reach of FISA section 702 programs. Needless to say, companies that undertake a TIA as part of their deployment of SCCs also face some uncertainty as to its effectiveness, since a TIA is not the opinion of a supervisory authority but rather their own interpretation, and that of their legal counsel – which said expert report may cast doubt on.

The DPF is not plagued by such uncertainty. Specifically, recital 200 of the Decision expressly states that the legal protections surrounding FISA programs are adequate and “essentially equivalent” to EU protections related to intelligence and national security. This is a momentous declaration, in our estimation, because as a consequence, participation by a company in the DPF seems to remove the need for a TIA as part of the transfer mechanism. Put another way, recital 200 provides binding authority for the assertion that the primary motivation for a TIA (i.e., FISA section 702 programs) is now moot, in that DPF participants have sufficient safeguards (even in light of FISA 702) regardless of undertaking a TIA. Note that the removal of the TIA requirement only works for participants in the DPF; TIAs are still required when relying on the SCCs as a transfer mechanism.

DPF v. State Law

Because the DPF establishes “essentially equivalent” controls for participants, the differences between the scope and requirements of EU privacy law and US state privacy law are brought into more apparent contrast. Looking across the two general frameworks, the differences in concepts, protective requirements, and other controls may actually motivate businesses who are already subject to the various state omnibus privacy laws, to skip participation in the DPF. This is mostly because the DPF is a bit more reasonable to businesses with respect to the exercise of individual rights than some state laws.

For example, the GDPR does not require the controller to comply in full with an access request if the response would “adversely affect the rights” of others, including a business’ trade secrets or intellectual property[2]. The California Consumer Privacy Act has no such express limitation related to business’ data. That being said, there are a number of possible arguments available under other laws (trade secret, confidentiality obligations, etc.) that could justify limiting a response to an access request. However, those limitations are not express in the California law – and they are in the GDPR and the DPF.

Similarly, the principles in the GDPR and DPF allow for denial of an access request where responding to such a request would impose an undue burden on the business. The California law’s limitation is a bit narrower than the GDPR/DPF limitation in this instance. California requires responsive disclosures to access requests unless the request is “manifestly unfounded or excessive” [3]. This standard is narrower than the DPF standard of “…where the burden or expense of providing access would be disproportionate to the risks to the individual’s privacy…”[4].

Conclusion

This lack of alignment between DPF requirements and state law may lead to operational confusion and uncertainty for US businesses interested or actively involved in the transfer of personal information from the EU. Regardless of the confusion related to the overlapping US and EU privacy laws, businesses who have previously participated in and are familiar with the Privacy Shield program may find it useful to also participate in the DPF. Additionally, for some business models, participation in the DPF can mean reduced administrative and legal costs as compared to putting in place and maintaining SCCs. However, it must be remembered that the DPF is not the same as compliance with US state privacy laws – even though some omnibus state privacy laws echo GDPR concepts. There are significant distinctions which have to be managed between the tactical implementation of a privacy program for US state law and a DPF compliance program.

Finally, even though there has been a commitment by some to challenge the DPF at the CJEU, the Commission’s approval of the DPF is not a reason for companies to take a “wait and see” approach. It is instead a time for companies to carefully evaluate and review their transfer activities, their regulatory obligations, and the most appropriate path forward. All these years after Schrems II, it is at least nice to have a potential alternative to SCCs, in the right business conditions.


[1] Commission Implementing Decision (EU) 2021/914, Recitals 19 through 21.

[2] GDPR, Article 15(4) and Recital 63.

[3] Cal. Civ. Code § 1798.145(h)(3).

[4] Commission Implementing Decision for DPF Annex I, Section III.4; 8.b, c, e; 14.e, f and 15.d.

2023 has brought several states into the privacy limelight. As of Sunday, June 18, and with the signature of Texas governor Greg Abbott, the Texas Data Privacy and Security Act (“TDPSA”) was enacted, making the Lone Star State the tenth in the U.S. to pass a comprehensive data privacy and security law. The Act provides Texas consumers the ability to submit requests to exercise privacy rights, and extends to parents the ability to exercise rights on behalf of their minor children.

The Texas Act provides the usual complement of data subject rights relating to access, correction, data portability, and opting out of data being processed for purposes of targeted advertising, the sale of personal information, and profiling where a consumer may be significantly or legally affected. It also requires that covered businesses provide a privacy notice and other disclosures relevant to how they use consumer data.

Application Standard

Among the ten state-level comprehensive privacy bills, the TDPSA is the first to remove processing and profit thresholds from the applicability standard. Instead of using these thresholds to determine whether an entity is covered, the TDPSA applies to persons that: (1) Conduct business in Texas or produce products or services consumed by Texas residents; (2) Process or engage in the sale of personal data; and (3) Are not a small business as defined by the United States Small Business Administration.[i]

Definitions and Obligations of Controllers and Processors

The TDPSA’s definition of “sale of personal data” aligns closely with that of the California Consumer Privacy Act (“CCPA”). It refers to “the sharing, disclosing, or transferring of personal data for monetary or other valuable consideration by the controller to a third party.” The Act defines “process” as “an operation or set of operations performed,” whether manually or automatically, on Texas consumers’ personal data or sets of data. This includes the “collection, use, storage, disclosure, analysis, deletion, or modification of personal data.” Unlike the CCPA, the law exempts data processed for business-to-business and employment purposes.

Covered businesses who are data controllers must provide consumers with “a reasonably accessible and clear” privacy notice. These notices include the categories of personal data processed by the controller, the purpose of processing personal data, the categories of data shared with third parties, and methods by which consumers can exercise their rights under the law. If a controller’s use of sensitive personal data or biometric data constitutes a sale, one or both of the following must be included:

  • “NOTICE: This website may sell your sensitive personal data.”
  • “NOTICE: This website may sell your biometric personal data.”

Processors, which are akin to “service providers” under the CCPA, are those people or businesses that process personal data on behalf of the controller. Processors have a number of obligations under the TDPSA, including assisting controllers in responding to consumer rights requests and data security compliance. All processors will need to have a written data protection agreement (“DPA”) in place with a controller, which will include:

  1. “clear instructions for processing data;
  2. the nature and purpose of processing;
  3. the type of data subject to processing;
  4. the duration of processing;
  5. the rights and obligations of both parties; and
  6. a number of requirements that the processor shall follow under the agreement.”

Processors will be required to ensure confidentiality of data, and at the controller’s direction, a processor must delete or return all of the information at the conclusion of the service agreement. However, this deletion requirement excludes data that must be retained pursuant to the processor’s records retention obligations.

Processors must also certify that they will make available to the controller all information necessary to demonstrate the processor’s compliance with the requirements of this chapter, and that they will comply with a controller’s assessment of their security practices. Lastly, should a processor engage any subcontractor, it will need another written contract that meets all of the written requirements set forth by the controller’s DPA.

Yet another state law DPA requirement brings into question whether businesses, particularly those on the national and multi-national level, are going to need separate addenda to service agreements that recite a la carte provisions that include separate definitions and commitments to comply with each state-level privacy law, as well as international data privacy laws such as the EU’s GDPR. Based on the new and upcoming privacy laws in the U.S., businesses can probably still operate using some form of uniform DPA that accounts for each of the different requirements. However, we may soon reach a point where the pages of all the appendices and addenda to comply with separate state requirements are greater than the typical service agreement contract.

Data Protection Assessment Requirement

The TDPSA mandates that controllers conduct and document data protection assessments. These assessments, which mirror those required by the Connecticut Data Privacy Act, require businesses to identify the purposes for which they are collecting and processing data, as well as the associated risks and benefits of that processing. Businesses will need to assess these benefits and risks for the following categories of processing activities involving personal data:

  1. Processing of personal data for targeted advertising;
  2. The sale of personal data;
  3. Processing for purposes of profiling if there is a reasonably foreseeable risk of unfair or deceptive treatment, injury to consumers (including financial and reputational risk), or physical intrusion to an individual’s solitude or seclusion;
  4. Processing of sensitive data; and
  5. Any processing activities that generally may pose a heightened risk of harm to consumers.

Enforcement

The TDPSA expressly states that it does not provide a private right of action. The Texas Attorney General holds exclusive power to field consumer complaints, investigate, issue written warnings, and ultimately enforce violations of the law. The AG may seek both injunctive relief and civil penalties of up to $7,500 for each statutory violation.

The Texas AG also has the authority to investigate controllers when it has reasonable cause to believe that a business has engaged in, is engaging in, or, interestingly – is about to engage in – a violation of the TDPSA. While the Senate version suggested revising this authority to remove pre-violation (thought crime?) investigations, the language withstood that scrutiny and remained in the signed act. This was one of many changes suggested by the Senate prior to the bill’s passing.

Back and Forth Between Texas Legislative Houses

As the TDPSA was drafted, a few notable revisions were made between the House and the Senate versions of the bill. However, most of the Senate’s proposed additions were not ultimately accepted. To start, the Senate added language that expressly excluded from the definition of biometric data “a physical or digital photograph or data generated from” photographs. Further, under the definition of “sensitive data,” the Senate removed the House’s language that included sexual orientation data. Notably, the sexual orientation language has since been re-added to the finalized version of the law.

The House and Senate also went back and forth on some of the other definitions in the Act, including which entities are exempt. Under the “state agency” definition, the Senate broadened the language to include any branch of state government, rather than only branches falling within the executive branch – the approach taken in one of the House versions. The Senate language made it into the finalized version.

For the most part, the two legislative chambers were in agreement as to which entities were exempt from the TDPSA. These include exemptions for institutions and data subject to Title V of the Gramm-Leach-Bliley Act,[ii] HIPAA,[iii] and HITECH,[iv] as well as non-profits and higher education institutions. The Senate sought to add electric utilities and power companies to this list, but the finalized version kept them off the exempt list.

The Senate’s revisions also proposed language allowing consumers to authorize a designated agent to submit rights requests (similar to what we see in the CCPA), but that language was not signed into law. The TDPSA does not let anyone act on another person’s behalf to exercise their privacy rights, other than parents acting on behalf of their minor children.

Conclusion

Section 541.055(e), which provides consumers the ability to have a third-party authorized agent submit opt-out requests on their behalf, goes into effect January 1, 2025. Other than that rather narrow delayed measure, the rest of the TDPSA is set to go into effect on July 1, 2024. It will be one of the broadest-reaching privacy laws in the U.S., particularly because of its unique applicability standard. Its mandatory data protection assessments and requirement for written contracts with all processors make the law slightly less business-friendly than the rest of the state privacy laws out there. But California is still in a league of its own on that score.


[i] The SBA generally defines a small business as a privately owned enterprise with 500 or fewer employees. Depending on the industry, however, the maximum number of employees may fluctuate and in some cases may not be factored in. Some businesses are defined as “small” according to their average annual revenue.

[ii] 15 U.S.C. Section 6801 et seq.

[iii] 42 U.S.C. Section 1320d et seq.

[iv] Division A, Title XIII, and Division B, Title IV, Pub. L. No. 111-5

Seyfarth Synopsis: The U.S. District Court for the Northern District of Illinois recently denied Plaintiff’s motion to reconsider a prior dismissal of his privacy action due to untimeliness.  In a case titled Bonilla, et al. v. Ancestry.com Operations Inc., et al., No. 20-cv-7390 (N.D. Ill.), Plaintiff alleged that consumer DNA network Ancestry DNA violated the Illinois Right of Publicity Act (“IRPA”) when it uploaded his high school yearbook photo to its website.  The Court initially granted Ancestry’s motion for summary judgment, finding Plaintiff’s claims to be time-barred under the applicable one-year limitations period.  Upon reconsideration, Plaintiff – unsuccessfully – made a first-of-its-kind argument that the Court should apply the Illinois Biometric Privacy Act’s five-year statute of limitations to the IRPA.

Background on the Bonilla Lawsuit

Ancestry DNA, most commonly known for its at-home DNA testing kits, also maintains a robust database of various historical information and images.  One subset of this online database is the company’s “Yearbook Database.”  This portion of the website collects yearbook records from throughout the country and uploads the yearbook contents – including students’ photos – to Ancestry.com.  On June 27, 2019, Ancestry DNA uploaded the 1995 yearbook from Central High School in Omaha, Nebraska to its Yearbook Database.

More than a year later, on December 14, 2020, Plaintiff Sergio Bonilla filed a lawsuit against Ancestry DNA over its publication of the Central High School yearbook.  Specifically, Plaintiff Bonilla – a current Illinois resident and former student of Central High School whose picture appeared in Ancestry’s database – alleged that Ancestry DNA improperly publicized his private information without obtaining his consent.  Plaintiff’s lawsuit asserted violations of the IRPA, as well as a cause of action for unjust enrichment.  Ancestry DNA filed a motion for summary judgment on the basis that Plaintiff’s action was not brought within the requisite one-year limitations period.  The Court agreed, thereby dismissing Plaintiff’s claims.

Court Denies Plaintiff’s Motion for Reconsideration

After the Illinois Supreme Court’s decision in Tims v. Black Horse Carriers (which held that BIPA is subject to a five-year statute of limitations – read our full summary HERE), Plaintiff filed a motion for reconsideration, contending that the Court should actually apply a five-year limitations period to IRPA actions, like it applies to BIPA.  To that end, Plaintiff emphasized that the IRPA (similar to BIPA) does not itself contain a statute of limitations.  Plaintiff also noted that both the IRPA and BIPA derived from legislative concerns centered on Illinois residents’ right to privacy.  Therefore, according to Plaintiff, the IRPA’s legislative purpose would be best served by applying the catch-all five-year limitations period of 735 ILCS 5/13-205. 

On reconsideration, the Court again rejected Plaintiff’s argument.  The Court first outlined relevant case law precedent, under which the only courts to address this issue previously held that the IRPA’s applicable statute of limitations is one year.  See Toth-Gray v. Lamp Liter, Inc., No. 19-cv-1327, 2019 WL 3555179, at *4 (N.D. Ill. July 31, 2019); see also Blair v. Nevada Landing P’ship, 859 N.E.2d 1188, 1192 (Ill. App. Ct. 2006). 

The Court then analyzed the Tims decision, which held that, “when the law does not specify a statute of limitations, ‘the five-year limitations period applies’ unless the suit is one for ‘slander, libel or for publication of a matter violating the right of privacy.’”  Here, the Court reasoned that an IRPA action squarely falls within the last category identified by the Court in Tims, as IRPA cases necessarily involve alleged violations of a party’s right to privacy.  Finally, the Court rejected Plaintiff’s contention that Tims controls this situation, instead holding that “[u]nlike the BIPA, the IRPA protects the publication of matters related to the right of privacy and, thus, falls under the one-year statute of limitations.”

Implications for Businesses

This decision establishes a welcome pro-business standard in the Illinois privacy law context.  Notably, the Illinois Supreme Court in Tims rejected the defense bar’s argument that BIPA violations were akin to privacy rights violations and subject to the one-year statute of limitations applicable to IRPA claims.  This Ancestry.com decision holds that the converse also is not true.  It is also the first court to reject expansion of the plaintiff-friendly five-year BIPA statute of limitations to claims beyond BIPA.

Though this decision was issued by an Illinois federal court – rather than the Illinois Supreme Court, which decided the recent Tims and Cothron v. White Castle System BIPA cases – it nonetheless offers some privacy protection for Illinois businesses that post or otherwise aggregate third parties’ content or information.  We will monitor whether defendants are able to expand the Bonilla decision into other related privacy law actions, or if Illinois courts will restrict its holding to actions brought under the IRPA.

For more information about the Illinois Right of Publicity Act, the Illinois Biometric Information Privacy Act, or how this decision may affect your business, contact the authors Danielle Kays and James Nasiri, your Seyfarth attorney, or Seyfarth’s Workplace Privacy & Biometrics Practice Group.