FTC: Why You Need To Prepare Now For Privacy Legislation That May Not Pass
Forbes Councils Member
Do you know what your company’s algorithms are up to?
If the American Data Privacy Protection Act (ADPPA) makes it through Congress this year, the Federal Trade Commission (FTC) will want you to know. The legislation is a long-simmering effort to enact a national data security and digital privacy standard.
Unlike the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA)—which focus on how organizations protect collected data—the ADPPA goes a layer deeper. In addition to data governance, the legislation introduces oversight of the artificial intelligence (AI) algorithms that mine data, to ensure they are safe and unbiased. If passed, the bill will require businesses to audit their algorithms and explain to the FTC how they work, what data they collect and how the data will be used.
With today's deeply divided Congress, it's tempting to hedge your bets and assume the ADPPA will not survive. That's not a good idea. Here's why.
Evolving Advocacy For Data Privacy
The ADPPA already has bipartisan support. Lawmakers from the left and right are interested in enacting legislation that protects citizens from big tech. So, even if it doesn't pass in this session of Congress, similar legislation will likely pass in the near future.
President Biden gave a shoutout to data privacy in his State of the Union address in early February, saying, “[I]t’s time to pass bipartisan legislation to stop big tech from collecting personal data on kids and teenagers online, ban targeted advertising to children, and impose stricter limits on the personal data that companies collect on all of us.” The comment received applause from both sides of the aisle.
While big tech is an easy target because of its prevalent use of AI algorithms to collect and monetize personal data, the ADPPA casts a wider net over the types of organizations that will need to comply with the legislation, including:
- Data controllers, which decide the purpose and means of collecting, processing and/or transferring personal information of U.S. residents
- Service providers, such as data processors that collect, process and transfer personal information
- Large data holders, with annual gross revenue of $250 million or more, that collect or process the data of five million or more persons (or devices), including the sensitive personal information of more than 200,000 persons or devices
According to these guidelines, it's likely that most companies that offer software-as-a-service (SaaS) or e-commerce services will be considered covered entities. The legislation will affect multiple departments, including product/service development, marketing, finance, operations and legal. So even if you don't work on the technical aspects of AI algorithms, you need to understand its implications.
Organizations of all sizes should form cross-functional teams now to evaluate the data that’s collected and used as well as the efficacy of the AI algorithms. This is partly to be prepared to demonstrate compliance when new data privacy legislation passes, but also because it’s just the right thing to do.
Play Like You Train
Like professional athletes, AI algorithms are only as good as their development and training. They will produce the intended results only if they are fed accurate data sets, with clear oversight of how the data is analyzed and parsed. We know from recent examples that this isn't always the case. For instance, during congressional hearings, we learned that a Facebook algorithm fed ads and content for weight-loss products and services to teen girls who consumed content about eating disorders.
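To make the training-data point concrete, here is a minimal sketch of a pre-training screen that flags underrepresented groups in a data set. The field names, sample records and minimum-share threshold are illustrative assumptions, not requirements drawn from the ADPPA or from any particular toolkit:

```python
from collections import Counter

def representation_report(records, group_field, min_share=0.2):
    """Flag groups that fall below a minimum share of the training data."""
    counts = Counter(r[group_field] for r in records)
    total = sum(counts.values())
    return {
        group: {"share": round(n / total, 2),
                "underrepresented": n / total < min_share}
        for group, n in counts.items()
    }

# Illustrative rows; in practice these would come from your data pipeline.
training_rows = [
    {"age_band": "13-17", "label": 1},
    {"age_band": "18-24", "label": 0},
    {"age_band": "18-24", "label": 1},
    {"age_band": "25-34", "label": 0},
    {"age_band": "25-34", "label": 1},
    {"age_band": "25-34", "label": 0},
]
print(representation_report(training_rows, "age_band"))
# {'13-17': {'share': 0.17, 'underrepresented': True}, ...}
```

A check this simple won't prove an algorithm is unbiased, but running it before training creates the kind of documented, repeatable evidence an auditor or regulator would expect to see.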
The ADPPA mandates that AI algorithmic evaluations occur during the design phase of an algorithm, including training data. The legislation also requires large data holders to assess their algorithms and submit impact assessments to the FTC annually. The assessments must describe steps the organization has taken or will take to mitigate potential harm from algorithms, including any issues specifically related to individuals under 17.
Designing and deploying an auditing process is a complex task. Getting out in front of it now puts your organization in a stronger position when privacy legislation passes. It is also a smart business decision to demonstrate to your customers that you are proactively taking steps to protect their data and that your algorithms are safe, unbiased and effective.
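As one illustration of what an outcome audit might check, the sketch below computes a demographic parity gap: the difference in favorable-outcome rates between demographic groups in a model's decisions. The metric, group labels and review threshold are assumptions chosen for illustration; the ADPPA does not prescribe a specific fairness test:

```python
from collections import defaultdict

def demographic_parity_gap(predictions, groups):
    """Return the largest difference in positive-outcome rates
    between any two demographic groups, plus the per-group rates."""
    totals, positives = defaultdict(int), defaultdict(int)
    for pred, group in zip(predictions, groups):
        totals[group] += 1
        positives[group] += int(pred)
    rates = {g: positives[g] / totals[g] for g in totals}
    return max(rates.values()) - min(rates.values()), rates

# Illustrative model decisions (1 = favorable outcome) and group labels.
preds  = [1, 1, 1, 1, 0, 1, 0, 0, 1, 0]
groups = ["a", "a", "a", "b", "b", "b", "b", "a", "a", "b"]
gap, rates = demographic_parity_gap(preds, groups)
print(f"positive rates by group: {rates}")  # {'a': 0.8, 'b': 0.4}
print(f"parity gap: {gap:.2f}")             # 0.40; flag for review above a chosen threshold
```

In practice, an audit would combine several such metrics and track them over time, since a single snapshot can miss drift in either the data or the model.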
Balancing Data Privacy With Useful Data Collection
Organizations face a difficult challenge when developing and auditing AI algorithms to minimize harmful outcomes. Reducing the amount or type of data collected may seem like a logical way to address privacy concerns. But excluding too much data can lead to skewed results that are biased or ineffective.
Orly Lobel is a professor and director of the Center for Employment and Labor Policy (CELP) at the University of San Diego who studies digital privacy issues. In a recent Time magazine article, she argues that “Privileging privacy, instead of openly acknowledging the need to balance privacy with fuller and representative data collection, obscures the many ways in which data is a public good. Too much privacy—just like too little privacy—can undermine the ways we can use information for progressive change.”
In and of itself, data collection is not harmful. However, how the data is used determines whether it harms individuals or groups of people. For organizations that collect data for commercial purposes, it is best to carefully screen the data sets used to train AI algorithms and audit their outcomes along the way in order to strike the right balance.
The ADPPA or similar legislation may force organizations to demonstrate how they train and audit AI algorithms. By preparing now for data privacy, companies will be in a better position to demonstrate compliance with streamlined audit processes. Simultaneously, they will benefit from effective AI algorithms that can produce strategic insights for business growth.
About SenecaGlobal
Founded in 2007, SenecaGlobal is a global leader in software development and management. Services include software product development, application software development, enterprise cloud and managed services, quality assurance and testing, security, operations, help desk, technology advisory services and more. The company's agile team consists of world-class information technologists and business executives across industries, giving clients a strong competitive advantage.
SenecaGlobal is headquartered in Chicago, Illinois, and has a state-of-the-art software development and management center in Hyderabad, India. The company is certified as a Great Place to Work® and is ISO 9001 certified for quality and ISO 27001 certified for security.