Ketch

Software Development

San Francisco, California · 5,414 followers

About us

The Ketch Data Permissioning Platform helps brands collect and mobilize permissioned data for growth. With a connected set of apps, infrastructure, and APIs, Ketch simplifies privacy ops and ensures a responsible foundation for AI-driven initiatives. Most customers use Ketch for:

- Collecting, managing, and enforcing consumer consent signals
- Automating intake and fulfillment of consumer data privacy requests
- Understanding personal data footprint throughout the business ecosystem
- Simplifying and streamlining privacy risk assessments (DPIAs, PIAs, and TIAs)
- Collecting marketing communications preferences to support first-party data strategies

To learn more, visit ketch.com.

Website
https://www.ketch.com/
Industry
Software Development
Company size
51-200 employees
Headquarters
San Francisco, California
Type
Privately Held
Founded
2020
Specialties
compliance, data privacy, privacy, privacy tech, consent management, data subject requests, regulatory requirements, data governance, GDPR, and CCPA

Locations

  • Primary

    140 New Montgomery St

    4th floor

    San Francisco, California 94105, US

Updates

  • Ketch

    It's harder than ever to capture customers' attention in the chaotic world of e-commerce and retail. With third-party cookies becoming obsolete (eventually), AI-powered personalization and loyalty programs are all the rage. And here's the real kicker: data privacy is frequently disregarded and left to the legal team. Ignore it, and both your customer experience and your profitability will suffer.

    In this edition of Ketch Up!, Colleen Barry breaks down the 3️⃣ main privacy concerns:

    🟣 Ensuring seamless cross-channel buying
    🟣 Gathering and utilizing first-party data
    🟣 Optimizing websites for maximum conversions

    Dive in to learn how to become an expert at balancing customization and privacy! #DataPrivacy #Retail

    3 major privacy challenges for retail & ecommerce brands

    Ketch on LinkedIn

  • Ketch

    For TIME, respecting data privacy rights for an audience of 100 million people is part of the modernization story of an iconic brand. When the publishing giant's legacy privacy software tool wasn’t doing the job, Adam Keephart, Senior Manager of Information Security, chose Ketch to step in and help. Adopting Ketch DSR automation and consent management solutions has been transformative for the cross-functional privacy team at TIME. Now, this team is ready to continue their data-driven journey towards automation and scale. Read more: https://bit.ly/3Vde62K #publishing #DataPrivacy #compliance

  • Ketch

    Why use a kitchen sink's worth of data when a clean dish will do? Katharina Koerner breaks down this new study showing how using less data in risk prediction models can reduce "label bias," resulting in fairer and more accurate outcomes. Learn more and read the full study 👇 (a toy illustration follows below) #DataPrivacy

    Katharina Koerner

    AI Governance Lead at WGU: Bridging Policy and IT: I’m passionate about policy operationalization and fostering cross-disciplinary dialogue.

    A new study, titled "Risk Scores, Label Bias, and Everything But the Kitchen Sink," by Julian Nyarko, Professor at Stanford Law School and an Associate Director and Center Fellow at the Stanford Institute for Human-Centered Artificial Intelligence (HAI), together with colleagues from Stanford University and Harvard University, explores a challenge in risk prediction models: using too much data can lead to inaccuracies due to "label bias," where models rely on inapt proxies instead of measuring the real outcomes of interest.

    Risk prediction models are used in sectors like lending, college admissions, and healthcare to forecast potential outcomes based on data. These models often follow a "kitchen sink" approach, on the assumption that more data will yield better predictions. The study shows that this approach may not be ideal because it can lead to label bias. Label bias happens when models use proxies, substitute measurements used when direct measurements are unavailable. The study suggests that a simpler, less-is-more approach, with models using fewer data points, could improve both the accuracy and fairness of these predictions. It prioritizes direct and relevant data over a broad array of less pertinent information, leading to models that are not only easier to understand but potentially more just in their outcomes.

    For example, arrest rates might be used as a proxy for criminal behavior, yet such proxies might not accurately represent the true outcome they are supposed to measure. If arrest rates are higher in a particular area, that may reflect the intensity of police activity there rather than individuals' likelihood of committing crimes. In the medical field, a similar issue arises when predicted future medical costs are used to decide which patients should enter high-need care programs; this can disadvantage certain groups, like Black patients, who might have equal medical needs but lower predicted costs because they seek less medical treatment.

    Applying this insight to college admissions, the findings suggest that admissions models should avoid overly complex algorithms that incorporate a wide range of data points, many of which may not be directly relevant to predicting student success. Instead of considering a vast array of factors like family background, high school attended, and various extracurricular activities, colleges might focus on a smaller set of predictors more directly linked to academic success, such as GPA, specific aptitude test scores, and relevant academic achievements.

    Article: https://lnkd.in/gpykKPA7
    Study: https://lnkd.in/gPsXwaJu

    How Bias Hides in ‘Kitchen Sink’ Approaches to Data

    hai.stanford.edu
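    To make the "label bias" point concrete, here is a minimal, hypothetical sketch (not from the study; the data is synthetic, and Python with numpy and scikit-learn is assumed). It shows how a model trained on a proxy label, such as recorded arrests, can assign higher risk to a group that is simply observed more often, even though the true outcome rate is identical across groups.

    # Toy illustration of label bias on synthetic data (hypothetical example, not from the study).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 20_000

    # Two groups with the SAME true rate of the outcome we actually care about.
    group = rng.integers(0, 2, size=n)
    true_outcome = rng.random(n) < 0.10  # 10% base rate for everyone

    # Proxy label: the outcome is only *recorded* (e.g., an arrest is made) when it is
    # observed, and group 1 is observed twice as often as group 0.
    observe_rate = np.where(group == 1, 0.8, 0.4)
    proxy_label = true_outcome & (rng.random(n) < observe_rate)

    # "Kitchen sink" style model that includes group membership as a feature;
    # the second column is pure noise standing in for extra, irrelevant data.
    X = np.column_stack([group, rng.normal(size=n)])
    model = LogisticRegression().fit(X, proxy_label)

    # Average predicted risk per group: roughly 2x higher for group 1, even though
    # true risk is equal, purely because group 1's outcomes are labeled more often.
    for g in (0, 1):
        print(f"group {g}: mean predicted risk = "
              f"{model.predict_proba(X[group == g])[:, 1].mean():.3f}")

    In this toy setup, dropping the group column or labeling the true outcome directly brings the two groups' predicted risks back in line, which mirrors the less-is-more argument summarized above.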

  • Ketch

    Just like we predicted, 2024 is bringing more child-focused privacy laws, and New York is next with its Child Data Protection Act, banning sneaky data grabs and making sure info gets deleted fast. So, what does this mean for your business? Odia Kagan breaks it down 🔽 Want to dive deeper into our thoughts on the 2024 Data Privacy Trend Landscape? 👀 https://bit.ly/3VEuHOk

    Odia Kagan

    CDPO, CIPP/E/US, CIPM, FIP, GDPRP, PLS, Partner, Chair of Data Privacy Compliance and International Privacy at Fox Rothschild LLP

    If you have a website, online service, online application, mobile application, or connected device (or a portion thereof) that is "primarily directed to minors," or that you actually know is used by minors, wholly or partly in the State of New York, here is what you will need to do after Gov. Hochul signs SB 7695, the NY Child Data Protection Act, into law.

    Your service is also deemed "primarily directed to minors" where you have actual knowledge that it is collecting personal data directly from users of another website, online service, online application, mobile application, or connected device primarily directed to minors.

    You can't collect, or allow a third party to collect, information of a minor unless:

    1) for 12 and under - the processing is permitted by COPPA
    2) for 13 and over - the processing is strictly necessary for providing the requested service and things like protection against fraud, detection of errors, compliance with law or law enforcement, vital interests, and internal business operations, BUT excluding marketing, advertising, research and development, providing products or services to third parties, or prompting users to use the website, online service, online application, mobile application, or connected device when it is not in use
    3) for 13 and over - with informed consent obtained either through a device communication or signal or through a request. For this you need:

    🔹 separate consent
    🔹 to say that the processing is not strictly necessary
    🔹 to avoid dark patterns
    🔹 to clearly present an option to refuse as the most prominent option
    🔹 once given, consent is revocable
    🔹 you can't degrade the quality of the service due to refused consent
    🔹 you can't sell data of minors

    If you find out a user is a minor you must, within 30 days: (1) delete any personal information you have, (2) direct your processors to do the same, and (3) inform your third-party operators that this is a minor.

    Before sharing any personal information of minors with third-party operators:

    🔹 You must tell them that your website is directed at minors / that the personal data is of a minor (covered user)
    🔹 You need a data sharing agreement in place that requires: processing pursuant to instructions; return/deletion of the data; assistance with legal obligations; participation in assessments; accountability (audit); and advance notice of sharing with processors

    Signals:

    🔹 Treat a user as a covered user if the user's device communicates or signals that the user is or shall be treated as a minor, including through a browser plug-in or privacy setting, device setting, or other mechanism that complies with regulations promulgated by the AG
    🔹 Adhere to any clear and unambiguous communications or signals concerning processing that the covered user consents to or declines to consent to
    🔹 If a signal is unclear, ask for consent

    Enforcement is by the AG, with regulations pending. Effective date: 1 year after the Act becomes law.

    #dataprivacy #dataprotection #privacyFOMO

  • Ketch

    Our study with The Ethical Tech Project shows that consumers OVERWHELMINGLY agree that businesses 𝙢𝙪𝙨𝙩 adopt ethical data practices in today's AI era. The ones that don't risk alienating their customers... or never attracting those customers in the first place!

    Ethical data principles include:

    ‣ Privacy, or the idea that sensitive data should only be collected with purpose, and then deleted once no longer needed.
    ‣ Transparency, or the idea that your business should communicate plainly about how data is being used and how long it's being stored.
    ‣ Agency, or the idea that consumers should have choice and control over how their data is being used.
    ‣ Fairness, or the idea that companies should prevent the AI they're using from practicing racial, gender, or other types of bias.
    ‣ Accountability, or the idea that clean, uniform standards and practices should be adhered to when it comes to AI.

    #DataPrivacy #ArtificialIntelligence #innovation


Funding

Ketch: 3 total rounds

Last round: Series A, US$ 20.0M

See more info on Crunchbase