
United States Privacy Digest | Heard around DC — Reflections on a potential US data privacy law


Editor's Note:

Update: Friday, June 3, 2022 – 1:54 pm ET: After the IAPP went to press, the bipartisan discussion draft bill, called the American Data Privacy and Protection Act, was released to the public.

Another week and we find ourselves a step closer to that fabled comprehensive U.S. data privacy law. Congressional leaders are apparently very close to hammering out a delicate agreement on language for a bipartisan bill, with a markup of one or more Senate bills expected this month. Politico Pro reported yesterday that a draft bill has been circulating that was pieced together by House Energy and Commerce Chair Frank Pallone, D-N.J., ranking member Cathy McMorris Rodgers, R-Wash., and Sen. Roger Wicker, R-Miss. For those who can't get around the Politico paywall, the IAPP's Joe Duball also reported on Wednesday's draft bill. 

As predicted, Republicans and Democrats have found common ground by including both a private right of action and preemptive language in the draft, but with limitations on both. According to Politico sources, Senate Commerce Chair Maria Cantwell, D-Wash., has not yet agreed to the compromise draft due to concerns that its enforcement mechanisms are not strong enough. 

An updated version of Cantwell’s Consumer Online Privacy Rights Act has also been circulating, which provides for a private right of action when a “substantial privacy harm” has occurred, a term that reportedly includes physical, mental, and reputational harms or financial harms above $1,000. As Politico reported, Cantwell’s draft goes on to include language, not reflected in the other draft, that “would allow individuals to sue companies for substantial privacy harms in the courts without first going through arbitration, even if businesses had tried to mandate arbitration through agreements with their customers.”

Meanwhile, Sen. Brian Schatz, D-Hawaii, wrote a letter to the Commerce committees supporting a duty of care provision and calling on leaders to “refuse to settle for a privacy framework that will only result in more policies to read, more cookies to consent to, and no real change for consumers.” Cantwell replied in a statement, “Senator Schatz is right — any robust and comprehensive privacy law must protect consumers’ personal data with a clear requirement that companies are accountable for the use of that data and must act in consumers’ best interests.”

Bipartisanship is essential for passage of any bill this term. For more on why, and for analysis of all the existing bipartisan privacy and data protection proposals circulating in Congress, see the IAPP’s new whitepaper, Negotiating Privacy.

Here's what else happened since the last roundup:

  • The White House is poised to issue its executive order implementing the EU-U.S. agreement on data transfers this month.
    • Among other agreed adjustments, the EO is expected to direct the Department of Justice to promulgate rules that establish the mechanisms agreed to in the Trans-Atlantic Data Privacy Framework. As Politico’s Digital Bridge reported, the EO will also include specific details on the “necessary and proportionate” limits for how U.S. national security agencies can access both European and U.S. data.
  • CFPB said algorithmic credit decisions must be explainable.
    • The Consumer Financial Protection Bureau clarified that federal anti-discrimination law requires companies to explain reasons for denying an application for credit or taking other adverse actions, even when those decisions are informed or made by “black-box models.” As CFPB Director Rohit Chopra stated, “The law gives every applicant the right to a specific explanation if their application for credit was denied, and that right is not diminished simply because a company uses a complex algorithm that it doesn’t understand.”
  • Senators wrote letters about reproductive health data.
    • Apps collecting location or health data, particularly if it could be used for inferences about reproductive health choices, are the subject of much discussion. Five senators led by Ed Markey, D-Mass., urged Apple and Google to ensure that third-party services on their app stores “do not employ data practices that threaten the wellbeing of individuals seeking abortion services.” Another letter from Sen. Ron Wyden, D-Ore., and more than 40 others called on Google to “stop unnecessarily collecting and retaining customer location data, to prevent that information from being used by right-wing prosecutors to identify people who have obtained abortions.” Meanwhile, the FTC received a letter from Sen. Amy Klobuchar, D-Minn., and 15 others asking the agency to explain the steps it is taking to “ensure data brokers are not collecting, buying, or selling sensitive location data that put people, particularly those seeking medical attention, at risk.”
  • The creator of Second Life reminded us just how hard virtual governance is.
    • As Meta (and others) continue to clarify aspirations for building and governing the metaverse, it may be worth listening to voices who have been down this road before. Philip Rosedale, who founded the company that built the virtual world Second Life, has been making the rounds with some insightful commentary about how we should be mindful in our approach to building a metaverse: “We've got enough problems with humanity right now, without adding a physically identified version of reality on the internet. We have to tactically look at the internet as it currently exists and ask, who owns the spaces where people are hanging out, what are the rules of engagement, what's the moderation strategy, and learn how to do this right.”
  • The Future of Privacy Forum helped us understand the contours and limits of 'biometrics.'
  • Under scrutiny:
    • Algorithmic decision-making tools and “new surveillance technologies” are the subject of an exhaustive report from the Center for Democracy and Technology exploring the disproportionate impacts on disabled people when these technologies are used in education, policing, health care and the workplace.
    • Chatbots used for criminal justice purposes are not popular with privacy and civil rights advocates.
    • Zoom’s plan to implement an emotion recognition system in its platform is also not going over well among advocates. A recent coalition letter argues emotion recognition is harmful because it is based in pseudoscience, subject to racial biases, and presents serious data security risks.
    • Automated test proctoring software for detecting cheating is the subject of an investigative report by Kashmir Hill at The New York Times.
    • Privacy policies are the subject of an op-ed by Washington Post columnist Geoffrey Fowler, in which he calls for their abolition — or at least supplementation with virtual privacy “butlers.”
  • Upcoming happenings:

Please send feedback, updates and black box explanations to cobun@iapp.org.  

