2026 Chapman Law Review Symposium: Data Flow Frontiers: Privacy, Policy, Practice Library Research Guide
February 2, 2026
Data privacy has become a critical concern in the digital age, as personal information is continuously collected, stored, and shared across various platforms, including social media, businesses, and generative artificial intelligence (AI). As everyday activities move online, companies and data brokers routinely process identifiers, location and browsing data, financial and health details, and biometrics, often claiming that this collection enhances their services. Strong privacy safeguards are designed to prevent harms such as identity theft, discrimination, unwanted profiling, and breaches of confidentiality, while fostering trust in digital services.
The protection of sensitive data is essential to safeguard individuals against misuse and unauthorized access. Data breaches involving sensitive personal data can cause significant harm to individuals and large groups of people. This issue becomes more complex as data flows across international borders. Data privacy has become an increasingly significant national security concern. Legally, both state and federal frameworks have been established to regulate the handling of personal data and enhance privacy protections. The question is, how well do they protect our data?
To help answer that question, the Hugh and Hazel Darling Law Library created a research guide to showcase the importance of data privacy as it pertains to the law. This guide is a supplement to the physical display curated by the Hugh and Hazel Darling Law Library and Chapman Law Review for their Spring 2026 symposium, “Data Flow Frontiers: Privacy, Policy, Practice.”
The research guide provides background information about data privacy and pressing issues that have arisen regarding the protection of personal data. The guide also offers information about the library display and access to digital versions of the display items. It includes a list of recommended treatises and other resources for further reading and research.
If you would like to know more about the Chapman Law Review Symposium 2026 Research Guide or the physical display, contact the Hugh and Hazel Darling Law Library Research Team at lawresearch@chapman.edu.
Access the Research Guide on the Hugh & Hazel Darling Law Library Website (login required)
Research Articles, an Annotated Bibliography
The President’s Authority Over Cross-Border Data Flows
This Article reveals a surprising expansion of presidential authority to control goods and services available in the United States because of the information flows that they entail. Such authority is grounded in laws focused on protecting national security, here with respect to foreign surveillance and propaganda. But broad executive powers over our information infrastructure raise significant concerns with respect to core American values of free expression and due process. Worries about unfettered foreign access to data should be coupled with worries about unfettered executive control over our information services and technologies.
Protecting Consumer Data Privacy with Arbitration
Erin O’Hara O’Connor, Protecting Consumer Data Privacy with Arbitration, 96 N.C. L. REV. 711 (2018)
The article explores the potential of using arbitration as a mechanism to protect consumer data privacy, addressing the limitations of current regulatory approaches such as FTC enforcement and class actions. It argues that these existing methods can lead to either overdeterrence or underdeterrence of privacy violations due to their inability to accurately assess the subjective and heterogeneous nature of privacy harms. The proposed arbitration mechanism, monitored by the FTC, aims to incorporate a pricing mechanism that would allow high-privacy-value consumers to bring claims while discouraging low-privacy-value claims, thus providing a more efficient regulatory framework. The article also suggests the introduction of third-party claims insurance to better sort consumers based on their privacy valuations, potentially improving the effectiveness of arbitration in addressing privacy harms. However, the implementation of such a scheme would require congressional action and careful consideration of potential complications, such as ensuring consumer protection and addressing the risk of underdeterrence.
Catalyzing Privacy Law
The article examines the influence of the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA) on privacy law in the United States. It challenges the conventional view that the GDPR is the primary catalyst for global privacy standards, arguing instead that California, through the CCPA, has become a significant driver of privacy legislation in the U.S. The authors highlight the differences between the GDPR and CCPA, noting that while both laws share some similarities, the CCPA is not merely a “GDPR-lite” but a distinct regime with its own framework. The article discusses the role of California as a “superregulator” and the impact of local networks and norm entrepreneurship in shaping privacy law. It also explores the potential constraints on the spread of CCPA-like laws, such as the dormant commerce clause, federal preemption, and First Amendment challenges. The authors conclude that California, rather than Europe, is catalyzing the development of U.S. data privacy law, with the CCPA influencing both state and federal legislative activities.
Information Privacy and the Inference Economy
Alicia Solow-Niederman, Information Privacy and the Inference Economy, 117 NW. U. L. REV. 357 (2022)
Information privacy is in trouble. Contemporary information privacy protections emphasize individuals’ control over their own personal information. But machine learning, the leading form of artificial intelligence, facilitates an inference economy that pushes this protective approach past its breaking point. Machine learning provides pathways to use data and make probabilistic predictions and inferences that are inadequately addressed by the current regime. For one, seemingly innocuous or irrelevant data can generate machine learning insights, making it impossible for an individual to anticipate what kinds of data warrant protection. Moreover, it is possible to aggregate myriad individuals’ data within machine learning models, identify patterns, and then apply the patterns to make inferences about other people who may or may not be part of the original dataset. The inferential pathways created by such models shift away from “your” data and towards a new category of “information that might be about you.” And because our law assumes that privacy is about personal, identifiable information, we miss the privacy interests implicated when aggregated data that is neither personal nor identifiable can be used to make inferences about you, me, and others. This Article contends that accounting for the power and peril of inferences requires reframing information privacy governance as a network of organizational relationships to manage, not merely a set of dataflows to constrain. The status quo magnifies the power of organizations that collect and process data, while disempowering the people who provide data and who are affected by data-driven decisions. It ignores the triangular relationship among collectors, processors, and people and, in particular, disregards the co-dependencies between organizations that collect data and organizations that process data to draw inferences. It is past time to rework the structure of our regulatory protections.
This Article provides a framework to move forward. Accounting for organizational relationships reveals new sites for regulatory intervention and offers a more auspicious strategy to contend with the impact of data on human lives in our inference economy.
Privacy Harms
Danielle Keats Citron & Daniel J. Solove, Privacy Harms, 102 B.U. L. REV. 793 (2022)
The requirement of harm has significantly impeded the enforcement of privacy law. In most tort and contract cases, plaintiffs must establish that they have suffered harm. Even when legislation does not require it, courts have taken it upon themselves to add a harm element. Harm is also a requirement to establish standing in federal court. In Spokeo, Inc. v. Robins and TransUnion LLC v. Ramirez, the Supreme Court ruled that courts can override congressional judgment about cognizable harm and dismiss privacy claims. Case law is an inconsistent, incoherent jumble with no guiding principles. Countless privacy violations are not remedied or addressed on the grounds that there has been no cognizable harm. Courts struggle with privacy harms because they often involve future uses of personal data that vary widely. When privacy violations result in negative consequences, the effects are often small (frustration, aggravation, anxiety, inconvenience) and dispersed among a large number of people. When these minor harms are suffered at a vast scale, they produce significant harm to individuals, groups, and society. But these harms do not fit well with existing cramped judicial understandings of harm. This Article makes two central contributions. The first is the construction of a typology for courts to understand harm so that privacy violations can be tackled and remedied in a meaningful way. Privacy harms consist of various different types that have been recognized by courts in inconsistent ways. Our typology of privacy harms elucidates why certain types of privacy harms should be recognized as cognizable. This Article’s second contribution is providing an approach to when privacy harm should be required. In many cases, harm should not be required because it is irrelevant to the purpose of the lawsuit. Currently, much privacy litigation suffers from a misalignment of enforcement goals and remedies.
We contend that the law should be guided by the essential question: When and how should privacy regulation be enforced? We offer an approach that aligns enforcement goals with appropriate remedies.
TikTok and Instagram Know What You Did Last Summer
The article argues that state-level regulation is better equipped to address invasive data collection practices by social media platforms like TikTok and Instagram compared to federal efforts. While the Federal Trade Commission (FTC) has broad authority under Section 5(a) of the FTC Act, its enforcement capabilities are limited, making it ineffective in protecting user data. State legislatures, particularly California, through the CCPA and CPRA, have demonstrated greater success in implementing stricter privacy protections. The article advocates for state-level enforcement as the most effective approach to safeguard user data, rather than relying on a weak federal statutory scheme or the overburdened FTC. (AI Summary)
National Security Creep in Corporate Transactions
The article examines the expanding scope of national security reviews in corporate transactions, a phenomenon termed “national security creep.” This expansion, driven by the conflation of economic and national security interests, has led to increased regulatory scrutiny, particularly targeting foreign investments, with significant implications for dealmaking, judicial deference to the executive branch, and international economic relations. The analysis highlights the opaque nature of these reviews, the challenges they pose to legal certainty, and the need for balanced approaches to mitigate risks while maintaining economic stability. (AI Summary)
Murky Consent: An Approach to the Frictions of Consent in Privacy Law
The article critiques the effectiveness of consent in privacy laws, arguing that both the U.S. notice-and-choice approach and the EU’s express consent framework fail to provide meaningful consent. It introduces the concept of “murky consent,” which acknowledges the limitations and fictions of privacy consent, proposing a middle ground that restricts data use under regulatory oversight. The analysis emphasizes the need for a more nuanced approach to consent, recognizing its ambiguity and advocating for a system that prioritizes fairness and individual privacy through specific duties imposed on organizations. (AI Summary)