Alysa Hutnik’s Post

#DataMinimization is an apple pie 🥧 theme in #privacy policy talks, but I don’t hear as much discussion about monitoring what happens as a result of that policy decision, or about some of the less consumer-friendly after-effects. Solving privacy is not the end of the story. This quote from a recent review of #privacyresearch caught my eye, for example: “Policymakers encourage companies to minimize the collection of personal data about sensitive characteristics such as age, gender, race, and sexual orientation. But if companies don’t know who is male or female, white or Hispanic, how can they test for and prevent algorithmic bias against those groups? Further, how can they avoid unfair treatment of marginalized groups in ‘data deserts’?” What is our answer to #datadeserts and #algorithmicbias that is not just happy talk? #privacylaws #privacypolicy #algorithmicdecisionmaking #moretothediscussion Kelley Drye Advertising Law Kelley Drye & Warren LLP https://lnkd.in/eNU3w7un

Data Privacy Is for the Privileged

chicagobooth.edu
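
The tension in that quote is easy to make concrete in code. Below is a minimal sketch, in Python, of a standard four-fifths (80%) rule disparate-impact check; the data and function names are hypothetical, not from any real system. The point is that the check cannot run at all if the group attribute was never collected.

```python
# Minimal sketch of a four-fifths (80%) rule check for disparate impact.
# Hypothetical data and names: without the group label, this audit
# cannot run -- which is the tension the quoted research highlights.
from collections import defaultdict

def selection_rates(decisions):
    """decisions: iterable of (group, approved) pairs -> rate per group."""
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += int(ok)
    return {g: approved[g] / totals[g] for g in totals}

def four_fifths_check(decisions):
    rates = selection_rates(decisions)
    best = max(rates.values())
    # Flag any group whose selection rate falls below 80% of the best rate.
    return {g: (r / best >= 0.8) for g, r in rates.items()}, rates

if __name__ == "__main__":
    sample = [("A", True)] * 80 + [("A", False)] * 20 \
           + [("B", True)] * 50 + [("B", False)] * 50
    passes, rates = four_fifths_check(sample)
    print(rates)   # {'A': 0.8, 'B': 0.5}
    print(passes)  # {'A': True, 'B': False} -> B fails the 80% threshold
```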

Gerard Stegmaier

Practical Problem Solver + Trusted Consigliere

2w

Most of these things can be engineered around. The reality is that much of the policy talk is divorced from engineering and technical realities. The people who make law and regulation are often completely removed from the people who must deal with it, and the advocates/intelligentsia are often equally if not further removed. One step further: there is little to no mention, ever, of the First Amendment and the expressive consequences of government-driven efforts to discourage speech-related activities that are not deceptive. Really great prompt, Alysa, and sorry for the long, strident answer. We can do better, but from regulators we need more carrot, or at least some, rather than all stick all the time.

I think this is driven at least in part by an overly narrow definition of data minimization that focuses exclusively on *collecting* less data. But data minimization is not just about collecting less; it's about storing, processing, sharing, and using less personal information wherever possible across the data and model development lifecycles. Collect + anonymize is an important data minimization pattern that balances the need for transparency and auditability with the need to use only the minimum amount of personal data -- in the case of anonymized data, no personal data -- required to achieve the intended outcome.
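
For the engineers in the thread, here's a minimal sketch of that collect-then-anonymize pattern, assuming a toy record schema (the field and function names are illustrative, not from any particular framework): the sensitive attribute is used once, in memory, for the fairness audit, then stripped or generalized before the record is stored.

```python
# Toy collect + anonymize pipeline (illustrative schema and names).
# The sensitive attribute is processed transiently for the bias audit,
# then dropped -- it never reaches long-term storage.

def audit_then_minimize(record: dict, audit_log: list) -> dict:
    # 1. Use the sensitive attribute once, in memory, for the fairness audit.
    audit_log.append({"group": record.get("race"),
                      "approved": record["approved"]})
    # 2. Strip direct identifiers and the sensitive attribute before storage.
    minimized = {k: v for k, v in record.items()
                 if k not in {"name", "email", "race"}}
    # 3. Generalize quasi-identifiers rather than keeping raw values.
    if "age" in minimized:
        minimized["age_band"] = f"{(minimized.pop('age') // 10) * 10}s"
    if "zip" in minimized:
        minimized["zip3"] = minimized.pop("zip")[:3]
    return minimized

audit_log = []
raw = {"name": "Ada", "email": "ada@example.com", "age": 37,
       "zip": "60637", "race": "Hispanic", "approved": True}
stored = audit_then_minimize(raw, audit_log)
print(stored)     # {'approved': True, 'age_band': '30s', 'zip3': '606'}
print(audit_log)  # group label lives only in the transient audit path
```

Whether the retained audit log itself counts as minimized is exactly the kind of after-effect the original post is asking about.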

Heidi Saas

Data Privacy and Technology Attorney | Licensed in CT, MD, & NY | Ethical AI Consultant | Change Agent | ⚡️ Disruptor ⚖️

2w

I agree on the necessary collection of data to achieve fairness, but we cannot trust companies to do that just yet. I can also see shady lawyers using the overcollection of data as a defense to claims that they failed to minimize data collection. 💀 Frat boys and free beer is what I think of when I hear "data minimization" promises from businesses. Only take what you need, ok!?!? https://smallworldadventures.com/wp-content/uploads/2010/04/keg-stand.jpg

Jesse Tayler

Team Builder, Startup Cofounder and App Store Inventor

2w

One thing we can do right now is respectfully separate the identity portion of the accounts we use via a ZKP (zero-knowledge proof) connection, so the service carries no liability and we suffer no loss of privacy. https://www.linkedin.com/pulse/unlocking-privacy-double-blind-zero-knowledge-proofs-identity-8brie
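
For anyone who wants the flavor of the underlying crypto, here is a toy Schnorr identification protocol in Python -- a classic zero-knowledge proof of knowledge, not the double-blind scheme the linked article describes, and with demo-sized parameters far too small for real use. The prover convinces the service it knows the secret key behind a public identity without ever revealing it.

```python
# Toy Schnorr identification protocol: prove knowledge of a secret x
# with y = g^x mod p, revealing nothing about x beyond that fact.
# Demo-sized parameters -- insecure, for illustration only.
import secrets

p = 2039            # safe prime: p = 2q + 1
q = 1019            # prime order of the subgroup
g = 4               # generator of the order-q subgroup

# Prover's long-term identity: secret x, public y.
x = secrets.randbelow(q - 1) + 1
y = pow(g, x, p)

# 1. Commitment: prover picks random r, sends t = g^r mod p.
r = secrets.randbelow(q - 1) + 1
t = pow(g, r, p)

# 2. Challenge: verifier sends random c.
c = secrets.randbelow(q)

# 3. Response: prover sends s = r + c*x mod q.
s = (r + c * x) % q

# 4. Verification: g^s == t * y^c (mod p) holds iff the prover knows x.
assert pow(g, s, p) == (t * pow(y, c, p)) % p
print("proof accepted; verifier learned nothing about x")
```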

My favorite assessment question for infosec is the one about data minimization. The response is (mostly) crickets. Lots of work to do!

Jonathan Barry-Blocker

Attorney | Community Advocate | Public Speaker

2w

If testing/auditing data is critical, then compensate people for the use of their data. Companies don’t have to guess about biased outcomes or feign ethical conundrums about data exploitation. Get consent and compensate.
