Compelled to Share: Antitrust Remedies and the Privacy Costs of United States v. Google
Google’s dominance is indisputable: it controls around 90% of the global search engine market. [1] In United States et al. v. Google, the Justice Department’s Antitrust Division sought to end Google’s unlawful monopolization and restore competition in search and advertising. [2] On September 2, 2025, U.S. District Judge Amit Mehta issued his remedies order, which, among other things, requires Google to share portions of its extremely valuable search data with certain competitors. [3] The ruling makes it easier for competitors to build their own search engines and levels the playing field in the search space, chipping away at Google’s monopoly power. However, it raises a new concern: how will users’ data remain private once it is handed over to third parties? To balance competition and privacy, any data-sharing remedy should include robust privacy safeguards, transparent enforcement mechanisms with meaningful remedies, and international harmonization to protect both users and the industry.
In October 2020, the DOJ and a coalition of states brought suit against Google under Section 2 of the Sherman Act, alleging that Google formed “exclusive agreements to secure default distribution of its search and advertising services to maintain monopolies in three online markets.” [4] In August 2024, Judge Mehta ruled that Google had violated the Sherman Act, 15 U.S.C. § 2, by unlawfully maintaining monopoly power in the markets for general search services and general search text advertising. (In a separate advertising technology suit, a federal court in Virginia later found that Google violated the same statute “by unlawfully acquiring and maintaining monopoly power in the Open-Web Display Publisher Ad Server market and the Open-Web Display Ad Exchange market.” [5]) On September 2, 2025, Judge Mehta issued his remedies decision, intended to restore competition in the affected markets. [6] Under the remedies order, “Google will be barred from entering or maintaining exclusive contracts relating to the distribution of Google Search, Chrome, Google Assistant, and the Gemini app.” [7] In addition, Google must make certain search index and user-interaction data available to qualified competitors, and it must offer those competitors search and search text ads syndication services. [8] The court also established a five-person technical oversight committee, which is responsible for setting criteria for “qualified competitors,” defining security and anonymization standards, and monitoring compliance. [9]
While the data-sharing remedy has strong potential to promote competition, it also carries significant risks. Data like click-and-query logs, which capture what a user searches for and clicks on, are critical for improving a search engine’s accuracy and relevance. If Google provides index and interaction data, competitors could accelerate improvements to their algorithms and better compete with Google’s high-quality results. The privacy risk, however, is substantial. Search queries and click data often contain highly personal and sensitive information relating to health, legal issues, finances, location, or other unique identifiers. Merely removing personally identifying fields is not always enough, as Judge Mehta himself acknowledged. [10] There is also a risk of reidentification when large datasets are aggregated. The technical committee is tasked with enforcing risk-mitigation programs and privacy-enhancing techniques, but the DOJ’s suggested proposals, like anonymization and privacy audits, are insufficient in an AI-driven world.
The concerns go beyond immediate privacy loopholes; there is also the question of what happens to the data after it leaves Google. Controlling downstream uses and policing misuse becomes increasingly difficult. Recipients of the user data may have inconsistent or inadequate security standards and may be incentivized to use the data beyond its permitted purposes. Under the remedy, to receive the data, competitors must show a sufficient “plan to invest and compete in” these search markets, as determined by the DOJ in consultation with the technical committee. [11] Vigilant oversight, however, is critical to ensure compliance with those plans and prevent data misuse. There is also a risk that the remedy will discourage Google from innovating on its own privacy policies and measures, since competitors benefit from its data for free. This tension between competition and privacy reflects a broader struggle in digital antitrust: antitrust requires openness and access, while privacy necessitates restriction and data minimization, the practice of collecting and storing only the minimum data necessary. The U.S.’s patchwork of privacy laws and regulations leaves a policy gap in protecting user data at a critical inflection point of rapid technological innovation.
To reconcile this tension, robust privacy safeguards must be integrated into any data-sharing remedy. Courts should require advanced anonymization or privacy-enhancing techniques rather than crude redaction. Techniques such as adding statistical noise to data and generalizing specific values to make them less identifiable can help protect individuals’ identities in shared datasets. Recipients must accept legally binding restrictions on reidentification, secondary profiling, and non-search-related use. Before release, independent audits should certify that anonymization is robust, and the oversight committee must impose strict revocation or sanctions upon any misuse. Transparency to users is also vital: users should receive clear, accessible notice that deidentified interaction data may be shared under court order, with limited opt-out rights for particularly sensitive categories, like health and legal queries. Monitoring and enforcement mechanisms must be stronger than standard reporting: cybersecurity experts should monitor recipient systems, audit logs, and investigate suspicious behavior. Because the data is highly valuable and sensitive, sanctions for misuse should be meaningful, such as monetary penalties, injunctive relief, or suspension of data-access rights. To ensure long-term effectiveness, the technical committee should also conduct periodic reviews assessing competitive effects and privacy harms, with adjustments as needed.
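To make the two techniques above concrete, here is a minimal Python sketch of what noise addition (in the style of differential privacy) and generalization look like in practice. This is purely illustrative, not the actual standards the technical committee would adopt; the function names and the epsilon parameter are hypothetical.

```python
import math
import random

def add_laplace_noise(count, epsilon=1.0):
    """Return a noisy version of a count, in the style of differential privacy.

    Smaller epsilon means more noise and stronger privacy. The noise is
    sampled from a Laplace(0, 1/epsilon) distribution via inverse-CDF sampling.
    """
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    scale = 1.0 / epsilon
    noise = -scale * math.copysign(1, u) * math.log(1 - 2 * abs(u))
    return count + noise

def generalize_age(age, bucket=10):
    """Replace an exact age with a coarser range, e.g. 37 -> "30-39".

    Generalization trades precision for privacy: many individuals now
    share the same value, making each one harder to single out.
    """
    low = (age // bucket) * bucket
    return f"{low}-{low + bucket - 1}"
```

For example, a shared dataset might report a noisy count of queries per region rather than the exact number, and bucketed age ranges rather than birth years, so that no single user's presence can be confirmed from the released figures.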
Limiting the scope and purpose of data sharing is key to preventing unnecessary disclosure of user data. Access should be strictly limited to competition-related functions, like building alternative search models, and should not extend to marketing, cross-product profiling, or external monetization. Recipients should have to demonstrate ongoing, legitimate investment in search competition to retain access to user data. Beyond the remedy itself, U.S. policymakers should consider reinforcing existing data-sharing laws or enacting a comprehensive federal privacy law. For instance, Congress could codify safe-harbor standards under which courts may compel data sharing in digital markets while embedding baseline privacy protections. A federal privacy law enshrining data minimization, purpose limitation, and individual rights would help harmonize the antitrust-privacy interface and reduce uncertainty in remedy orders. Furthermore, because data flows across borders, U.S. regulators should coordinate with European, UK, and Asia-Pacific regulators to align standards and avoid conflicting obligations; agreements recognizing cross-border audits, shared oversight, and joint sanctions would reduce friction and regulatory arbitrage.
United States et al. v. Google represents a pivotal moment in digital antitrust. A data-sharing mandate is a bold step toward rebalancing power in the search economy, especially in today’s AI age, where user data functions as a competitive asset. However, it raises privacy concerns in a country where privacy has yet to be recognized as a fundamental right and where the absence of comprehensive privacy laws leaves significant policy gaps. Without strong guardrails, attempting to fix one problem could exacerbate another, exposing sensitive user data to exploitation under the banner of competition. If courts, regulators, and legislators align on privacy, a digital marketplace can emerge that is both competitive and respectful of individual privacy.
Edited by Vincent Hovsepian
Endnotes
[1] StatCounter, Search Engine Market Share Worldwide, StatCounter Global Stats, 2025, online at https://gs.statcounter.com/search-engine-market-share.
[2] “Department of Justice Wins Significant Remedies Against Google.” U.S. Department of Justice. September 2, 2025. https://www.justice.gov/opa/pr/department-justice-wins-significant-remedies-against-google.
[3] Ibid.
[4] “United States v. Google, LLC.” 138 Harvard Law Review 891, 892 (2025). https://harvardlawreview.org/print/vol-138/united-states-v-google-llc/
[5] United States v. Google, LLC, No. 1:23-cv-00108-LMB-JFA, 1 (E.D. Va. 2025). https://www.justice.gov/atr/media/1414781/dl?inline.
[6] “Department of Justice Wins Significant Remedies Against Google.” U.S. Department of Justice. September 2, 2025. https://www.justice.gov/opa/pr/department-justice-wins-significant-remedies-against-google.
[7] Ibid.
[8] Ibid.
[9] United States v. Google, LLC, No. 1:20-cv-03010-APM (D.D.C. 2024). https://www.justice.gov/atr/media/1378036/dl.
[10] United States v. Google, LLC, 1:20-cv-03010-APM, 163 (D.D.C. 2025). https://storage.courtlistener.com/recap/gov.uscourts.dcd.223205/gov.uscourts.dcd.223205.1436.0_1.pdf
[11] Mark MacCarthy. “Privacy protections in the Google search case.” Brookings Institution. June 12, 2025. https://www.brookings.edu/articles/privacy-protections-in-the-google-search-case/.