Now in its sixth year, Seyfarth’s Commercial Litigation Outlook provides a clear view into the forces reshaping business disputes in 2026. This year’s analysis highlights a risk landscape defined by accelerating technological change, an increasingly fragmented regulatory environment, and growing economic pressures across multiple industries.

According to the Outlook, artificial intelligence is creating new categories of legal risk, from the challenges of authenticating AI‑generated content to navigating the use of algorithmic tools while courts and regulators rapidly reset expectations around emerging technology. At the same time, state‑level regulation continues to expand, particularly around non‑competes, privacy, and biometrics, creating a compliance patchwork that requires businesses to adapt strategies by jurisdiction. Coupled with elevated interest rates, rising debt, and post‑pandemic strain, especially in real estate, health care, and franchise sectors, the commercial litigation environment remains fluid, fast‑moving, and resistant to neat predictions. Against this backdrop, eDiscovery, information governance, and cybersecurity response functions play increasingly central roles in managing litigation risk and staying ahead of shifting expectations.

Authored by Jay Carle, Matthew Christoff, and Danny Riley, this year’s eDiscovery & Innovation article spotlights one of the most significant and fast‑moving risks in the discovery landscape: the rise of AI‑enabled notetaking and meeting‑summarization tools. As generative AI capabilities become embedded directly into videoconferencing platforms, these tools now routinely record meetings, create transcripts with speaker attribution, and auto‑generate summaries—often by default. The result is a sudden proliferation of new, unvetted records that can capture sensitive, strategic, or privileged conversations. The article warns that these tools exponentially increase the risk of inadvertent disclosure, while also creating evidentiary challenges when transcripts or summaries are later used to establish what was said, by whom, and with what intent.

The article also highlights that litigation risk is expanding beyond the developers of these tools to the organizations deploying them. AI notetakers raise overlapping consent, privacy, wiretap, and biometric concerns, and courts will increasingly scrutinize whether companies can demonstrate how meeting data was captured, stored, and controlled. As with prior waves of privacy litigation, the differentiator will be operational discipline: organizations that implement clear governance around meeting recording, restrict distribution of AI‑generated outputs, and define authoritative versions of records will be far better positioned to defend against disclosure missteps, authenticity disputes, and statutory claims.

Click here to download the 2026 Commercial Litigation Outlook.

Continue Reading The Changing Discovery Landscape: Takeaways from Seyfarth’s 2026 Commercial Litigation Outlook

Introduction

Robotics and artificial intelligence are converging at an unprecedented pace. As robotics systems increasingly integrate AI-driven decision-making, businesses are unlocking new efficiencies and capabilities across industries from manufacturing and logistics to healthcare and real estate.

Yet this convergence introduces complex legal and regulatory challenges. Companies deploying AI-enabled robotics must navigate issues related to data privacy, intellectual property, workplace safety, liability, and compliance with emerging AI governance frameworks.

The Shift: Robotics as an AI Subset

Traditionally, robotics was viewed as a standalone discipline focused on mechanical automation. Today, robotics is increasingly powered by machine learning algorithms, natural language processing, and predictive analytics—hallmarks of AI technology.

This evolution raises critical questions for legal teams:

  • Who owns the data generated by AI-enabled robots?
  • How do we allocate liability when autonomous systems make decisions without human intervention?
  • What contractual safeguards should be in place when outsourcing robotics solutions to third-party vendors?

As robotics increasingly incorporates AI functionality, traditional contract structures for hardware procurement and service agreements require significant updates. This evolution introduces new risk categories that must be addressed through precise drafting and negotiation.

Continue Reading The AI-Driven Evolution of Robotics

On July 24, 2025, the California Privacy Protection Agency (“CPPA”) unanimously voted to adopt a package of Proposed Regulations for the California Consumer Privacy Act (“CCPA”), marking a significant development in California privacy law. The regulations cover Automated Decision-making Technology (“ADMT”), mandatory Cybersecurity Audits, Risk Assessments, and clarifications on the CCPA’s applicability to Insurance Companies. Once filed with the California Office of Administrative Law, the package will move into its final review stage before formal enactment.

CCPA Steering Toward Operational Compliance

This is a clear signal that privacy compliance expectations in California are trending toward a more operational phase. The new rules are designed to give Californians greater control over how their personal information is used while pushing businesses toward higher levels of transparency and accountability, especially when automated decision-making and high-risk data processing are involved. For companies, this is more than just a theoretical update – it’s a clarion call to ensure these requirements are built into day-to-day governance, technology and process design, and vendor management practices.

Continue Reading California Privacy Protection Agency (CPPA) Finally Voted to Adopt Much Debated Update to CCPA Regulations: What Your Business Should Know

On June 3, 2025, the California Senate unanimously passed Senate Bill 690 (SB 690), a bill that seeks to add a “commercial business purposes” exception to the California Invasion of Privacy Act (CIPA).

After multiple readings on the Senate floor, SB 690 passed as amended, and will now proceed to the California State Assembly.

On September 6, 2024, the U.S. Department of Labor (DOL) issued Compliance Assistance Release No. 2024-01, titled “Cybersecurity Guidance Update.” The updated guidance clarifies that the DOL’s cybersecurity guidance applies not just to retirement plans but to all ERISA-covered plans, including health and welfare plans.

Corporations face unprecedented challenges in safeguarding sensitive data and mitigating privacy risks in an era marked by the rapid proliferation of Internet of Things, or IoT, devices.

Recent developments, including federal and state regulators’ heightened focus on privacy enforcement, highlight the importance of proactive risk management, compliance and data governance. As IoT and smart devices continue to hit the marketplace, heightened scrutiny of businesses’ data governance practices follows.

The Federal Trade Commission’s recent technology blog, “Cars & Consumer Data: On Unlawful Collection & Use”[1] underscores the agency’s commitment to enforcing consumer protection laws. Despite the blog’s focus on the car industry, the FTC’s message extends to all businesses, emphasizing its vigilance against illegal — or “unfair and deceptive” — collection, use and disclosure of personal data.

Recent enforcement actions are a stark reminder of the FTC’s proactive stance in safeguarding consumer privacy.

Geolocation data is a prime example of sensitive information subject to enhanced protections under the Federal Trade Commission Act. Much like mobile phones, cars can reveal consumers’ persistent, precise locations, making them susceptible to privacy infringements.

Continue Reading Careful Data Governance Is a Must Amid Enforcement Focus

This post was originally published to Seyfarth’s Global Privacy Watch blog.

On July 10th, the European Commission issued its Implementing Decision regarding the adequacy of the EU-US Data Privacy Framework (“DPF”). The Decision has been eagerly awaited by US- and Europe-based commerce, in the hope that it will help businesses streamline cross-Atlantic data transfers.

Introduction

On June 10, 2021, China officially passed its first Data Security Law, which will take effect on September 1, 2021. The Data Security Law joins the Cybersecurity Law, which has been in effect since June 1, 2017, and the Personal Information Protection Law, which is undergoing public comment.

In this unprecedented time, businesses are, more than ever, implementing and rapidly rolling out programs for remote or at-home work by employees. The quick changes in local and state governmental “shelter in place” instructions and Public Health directives have placed significant strains on remote networks and caused local shortages of laptop computers at office supply and electronic stores across the country.

With this unexpected increase in remote workers, many companies are pushing the limits of their existing remote access technology, or deploying ad hoc technology and access solutions as quickly as possible. Some of those companies are not taking the time to consider potential information security, privacy, and other compliance ramifications for those same remote workers.

It is entirely appropriate and necessary for companies to adapt so that their technology and work networks can be utilized to the greatest degree possible to remain in operation and serve business and customer needs. But data security and privacy should always be part of the equation.

Below are some essential things to know about the security risks posed by remote or at-home workers, along with a Technical Checklist for Remote Employees to help ensure your corporate data stays safe and you do not risk compliance challenges under data privacy laws and requirements.
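The checklist-style review described above can be sketched as a simple script. The baseline items and their names here are illustrative assumptions for demonstration purposes only, not items drawn from Seyfarth’s actual checklist.

```python
# Hypothetical sketch: compare a remote employee's reported security
# posture against a minimal baseline. The items below are assumptions,
# not Seyfarth's checklist.

BASELINE = {
    "vpn_in_use": True,       # corporate traffic routed over VPN
    "disk_encrypted": True,   # full-disk encryption enabled
    "mfa_enabled": True,      # multi-factor auth on company accounts
    "os_patched": True,       # operating system up to date
}

def failed_items(posture):
    """Return the baseline items the reported posture misses,
    including items that were never reported at all."""
    return sorted(k for k, required in BASELINE.items()
                  if posture.get(k) != required)

if __name__ == "__main__":
    reported = {"vpn_in_use": True, "disk_encrypted": False,
                "mfa_enabled": True}
    print(failed_items(reported))  # ['disk_encrypted', 'os_patched']
```

Treating an unreported item as a failure (via `posture.get(k)`) reflects the compliance posture described above: an unverified control is not a satisfied control.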
Continue Reading Cybersecurity, Data Privacy, and Compliance Issues Related to Remote Workers