22 April 2022

The FTC is Growing Fangs

Case in point: the Federal Trade Commission has instituted a new policy under which companies found to have collected data deceptively must, in addition to paying fines, delete all of the ill-gotten data and destroy any algorithms built from it.

I would call this an algorithmic application of the "Fruit of the Poisonous Tree" doctrine, which is generally applied to law enforcement misconduct.

The idea is that if someone acquires information unlawfully, nothing derived from that information can be used for the perpetrator's benefit:

The U.S. Federal Trade Commission has set its sights on tech companies, finding ways to thwart deceitful data practices.

On 4 March, the FTC issued a settlement order against WW International (the company previously known as Weight Watchers) and its subsidiary, Kurbo, with FTC chair Lina Khan stating that “Weight Watchers and Kurbo marketed weight management services for use by children as young as eight, and then illegally harvested their personal and sensitive health information.” The FTC required the companies to “delete their ill-gotten data, destroy any algorithms derived from it, and pay a penalty for their lawbreaking.”

Algorithms are a finite sequence of commands and a set of rules in a computer program used to process data. In the case of AI, machine learning algorithms are trained on data to build models that could predict certain actions or make specific decisions.

“When an algorithm is trained on private data, it would be simple to figure out some or all of that data from the algorithm. This means that just deleting the private data wouldn’t be an effective remedy and wouldn’t prevent future privacy harms,” says Kit Walsh, senior staff attorney at the Electronic Frontier Foundation. “So when you have an important interest like privacy and it’s necessary to delete the algorithm to address the harm, that’s when algorithmic destruction orders are on the firmest footing.”

Aside from curbing privacy harms, algorithmic destruction could hold organizations liable not only for how they gather data but also the methods for processing that data. “It’s adding this twofold approach to holding companies accountable when they go about harvesting data through deceptive means and using that data to generate algorithms,” says Divya Ramjee, a senior fellow at American University’s Center for Security, Innovation, and New Technology and a fellow at Washington College of Law’s Tech, Law & Security Program.

Destroying algorithms could render software useless and negatively affect a company’s bottom line. “At the end of the day, companies are doing this work for money,” Ramjee says. “They’re collecting data and creating algorithms that are essentially a product being sold and generating more money for them. So when you have to destroy that algorithm, there’s a financial consequence for the company because that’s their work product generating revenue that they have to give up.”
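Walsh's point is easy to demonstrate: many trained models retain their training data, either verbatim or in a recoverable form. Below is a minimal Python sketch of my own, with made-up records (nothing here is from the FTC order or the article), showing a trivial nearest-neighbor "model" whose training step is literally just memorizing the private records, so deleting the original dataset deletes nothing that matters.

# A minimal sketch (illustrative only, with hypothetical records; not from
# the FTC order or the quoted article): a trivial nearest-neighbor "model"
# whose training step is just memorizing every record. Deleting the original
# dataset does not delete the private data, because it lives on inside the
# trained model.

# Hypothetical sensitive records: (weight_kg, age) -> label
private_data = [
    ((52.0, 8), "enrolled"),
    ((61.5, 10), "enrolled"),
    ((70.2, 12), "declined"),
]

class NearestNeighborModel:
    def __init__(self, records):
        # "Training" here is storing every record verbatim.
        self.records = list(records)

    def predict(self, point):
        # Return the label of the closest stored record.
        def squared_distance(record):
            (weight, age), _label = record
            return (weight - point[0]) ** 2 + (age - point[1]) ** 2
        return min(self.records, key=squared_distance)[1]

model = NearestNeighborModel(private_data)

# Suppose the company now "deletes" the raw dataset...
del private_data

# ...yet every sensitive record is still recoverable from the model itself,
# which is why a remedy that reaches only the data falls short.
print(model.records)
print(model.predict((60.0, 9)))

Real machine-learning models are rarely this blatant, but the same leakage shows up through memorization and model-inversion attacks, which is why deleting only the raw data is an incomplete remedy and the FTC's order reaches the algorithm as well.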

While I am not particularly impressed by the actions and policies of the Biden Administration, the people that they have selected to run the FTC and the NLRB are actually doing their jobs, something that hasn't really happened since (at least) the Carter administration.

It's a good start, but the government needs to frog march company executives out of their offices in handcuffs as well.
