Photobucket opted inactive users into privacy nightmare, lawsuit says

Class action could foil Photobucket’s plan to turn old photos into AI goldmine.

Photobucket was sued Wednesday after a recent privacy policy update revealed plans to sell users' photos—including biometric identifiers like face and iris scans—to companies training generative AI models.

The proposed class action seeks to stop Photobucket from selling users' data without first obtaining written consent, alleging that Photobucket either intentionally or negligently failed to comply with strict privacy laws in states like Illinois, New York, and California by claiming it can't reliably determine users' geolocation.

Two separate classes could be protected by the litigation. The first includes anyone who ever uploaded a photo between 2003—when Photobucket was founded—and May 1, 2024. Another potentially even larger class includes any non-users depicted in photographs uploaded to Photobucket, whose biometric data has also allegedly been sold without consent.

Photobucket risks huge fines if a jury agrees with users that the photo-storing site unjustly enriched itself by breaching its user contracts and illegally seizing biometric data without consent. As many as 100 million users could be awarded untold punitive damages, as well as up to $5,000 per "willful or reckless violation" of various statutes.

If a substantial portion of Photobucket's entire 13 billion-plus photo collection is found infringing, the fines could add up quickly. In October, Photobucket estimated that "about half of its 13 billion images are public and eligible for AI licensing," Business Insider reported.

Users suing include a mother of a minor whose biometric data was collected and a professional photographer in Illinois who should have been protected by one of the country's strongest biometric privacy laws.

So far, Photobucket has confirmed that at least one "alarmed" Illinois user's data may have already been sold to train AI. The lawsuit alleged that most users eligible to join the class action likewise only learned of the "conduct long after the date that Photobucket began selling, licensing, and/or otherwise disclosing Class Members’ biometric data to third parties."

On top of their concerns about biometric data, users fear that AI training on their Photobucket images could make it easier for AI models to create convincing "deepfakes" of them or to regurgitate their photos outright.

Photobucket accused of “campaign of fraud and coercion”

Like most users, those suing let their accounts go dormant after Photobucket's popularity waned following MySpace's peak. They've accused Photobucket of launching "a campaign of fraud and coercion" hidden behind "innocuous" emails promising to "safeguard" user data, but allegedly really functioning to spook as many inactive users as possible into opting in to new terms.

"Contrary to their plain language, the emails were not intended to allow users to 'reactivate,' 'unlock,' or even 'delete' their accounts," the lawsuit said. "Instead, no matter which link the user clicked on, they were taken to a page where the user was forced to accept Photobucket’s updated Terms of Use to proceed" and "agree to Photobucket’s brand-new Biometric Information Privacy Policy," even if they wanted to delete their account. Photobucket also apparently misled users to think they had to agree to the Biometric Policy if they wanted to download their data, when they could have retrieved images without doing so.

And "even more troublingly," a press release that Ars received from users' legal team said, "Photobucket claimed that any registered user who ignored the emails would automatically be 'opted in' to the biometric consent after 45 days."

"Photobucket is planning to sell these photos to the AI companies, even though its users never consented to give their images and biometric data to AI, and such uses of their photos will put them at risk of privacy violations like facial recognition in public," the press release alleged.

And Photobucket isn't the only one frustrating users. In addition to seeking an injunction forcing Photobucket to stop misusing data and compensate users whose data was allegedly sold, the lawsuit also seeks damages from unknown AI companies who allegedly bought the data to train AI models. Various state privacy laws require not only that those companies obtain consent for biometric data, but also that those companies clearly explain to each user how their data will be used and how long it will be stored.

At this point, it's unclear who Photobucket's customers might be, but users are hoping to out them through legal discovery.

In October, Photobucket CEO Ted Leonard remained vague, telling Business Insider that "Photobucket was in talks with several companies to license the images." And rather than sharing "how much revenue AI-training deals might bring in," Leonard only disclosed that Photobucket expects the deals will give the company "capital at what we think will be fairly significant in material margins to continue investing in the product itself."

Leonard did not immediately respond to Ars' request to comment.

Mike Kanovitz, a lawyer representing the users suing, said in a press statement that Photobucket knew that once it sold users' data, that data could never be clawed back. Because users have allegedly been irreparably harmed by this permanent violation of their most sensitive data, Kanovitz is urging the court to award significant damages that, at a minimum, return Photobucket's ill-gotten gains.

"Photobucket’s customers deserve control over how their data gets used, and by whom," Kanovitz said. "And, if there is money to be made from people’s data, the people absolutely should share in the profits."

Photobucket likely has 30 days to respond to the complaint, a spokesperson for users' legal team told Ars.
