Thursday, June 6, 2024

Updated Adobe Terms of Use

Ben Lovejoy (via John C. Randolph):

A change to Adobe terms & conditions for apps like Photoshop has outraged many professional users, concerned that the company is claiming the right to access their content, use it freely, and even sub-licence it to others.

The company is requiring users to agree to the new terms in order to continue using their Adobe apps, locking them out until they do so …

Adobe says that its new terms “clarify that we may access your content through both automated and manual methods, such as for content review.”

[…]

Concept artist Sam Santala pointed out that you can’t raise a support request to discuss the terms without first agreeing to them. You can’t even uninstall the apps!

Brandon Lyttle (via Hacker News):

This has caused concern among professionals, as it means Adobe would have access to projects under NDA such as logos for unannounced games or other media projects. Sam Santala, the founder of Songhorn Studios, noted the language of the terms on Twitter, calling out the company’s overreach.

As with Slack, I doubt there’s nefarious intent here, but why can’t these documents be written in a more narrow way to allay people’s fears? Right now it says that “Our automated systems may analyze your Content[…] using techniques such as machine learning.” And they define “Content” as including anything that you create using their software. The machine learning FAQ says that they “don't analyze content processed or stored locally on your device” and that you can opt out. I’m not sure whether there’s any legal force to a FAQ linked from a ToS.

See also: Theodore McKenzie, Penny Horwood, Reddit.


Update (2024-06-07): See also: Hacker News.

Mike Wuerthele:

We saw that furor, and reached out to Adobe about it. Then, they issued an unclear statement on the matter, saying that the terms had always been this way.

“Adobe accesses user content for a number of reasons, including the ability to deliver some of our most innovative cloud-based features, such as Photoshop Neural Filters and Remove Background in Adobe Express, as well as to take action against prohibited content,” the company said at the time. “Adobe does not access, view or listen to content that is stored locally on any user’s device.”

[…]

They finally said something concrete on Thursday night.

[…]

The company says that it will be clarifying the Terms of Use acceptance to reflect the details of Thursday’s post. It’s not clear when this is going to happen.

He doesn’t think the post addresses all the issues, either.

Glenn Fleishman:

Adobe did the thing companies that host and sync data keep doing: they updated their terms in what is a reasonable way without a) giving advance warning and a thorough explanation and b) realizing that the legal niceties sound horrifying to an average person. Adobe can’t legally safely host your content without a license. This update mostly adds compliance issues that are govt focused—and should be examined.

mcc:

I don’t think this “explanation” helps at all. They don’t justify why this data needs to be on their server rather than at rest on the user computer, and I don’t see where they make it clear what you’d need to do to prevent exfiltration to “the cloud” or applicability of the bad terms. Some of the justifications they give as to when and why they apply tos terms are either so elastic they could mean anything (“to improve the service”) or are the exact features people are afraid of (“AI”).

Update (2024-06-12): Scott Nover (via Hacker News):

According to a post on its blog, the company is not training its A.I. model on user projects: “Adobe does not train Firefly Gen AI models on customer content. Firefly generative AI models are trained on a dataset of licensed content, such as Adobe Stock, and public domain content where copyright has expired.” The post claims that the company often uses machine learning to review user projects for signs of illegal content, such as child pornography, spam, and phishing material.

Although an outside spokesperson for Adobe simply pointed me to the blog post, Belsky offered a view into the consternation inside the company, admitting on X that the wording of the terms of use was confusing. “Trust and transparency couldn’t be more crucial these days, and we need to be clear when it comes to summarizing terms of service in these pop-ups,” he wrote.

Despite the cleanup efforts, this episode demonstrates how gun-shy everyone is about generative A.I. And perhaps there’s no population that has been more wronged here than creative professionals, many of whom feel that generative A.I. companies have illicitly trained their image-, video-, and sound-generation models on copyright works. Big Tech is splitting its loyalties between serving its existing audiences and taking advantage of self-propagating hype for generative A.I. But by doing this, it risks alienating loyal customers. No one wants to be treated like training data—even if that’s what we all are.

Adam Engst:

It feels like we’re descending into a morass of miscommunication, with examples from companies large and small, including Slack, Bartender, and Adobe.

[…]

Slack’s error lay in failing to update its privacy principles as generative AI became a thing. In contrast, Adobe got in trouble for updating its terms of use—and requiring users to agree before they could use Photoshop or other Adobe apps. (Apparently, you couldn’t even uninstall Photoshop without agreeing.)

[…]

That’s not to criticize the people who did freak out. Yes, many of them were playing to a social media audience and exaggerating the potential downside, but the resulting media attention may have been necessary to get these companies to update their documents, clarify what they meant, and back down from potentially problematic changes.

On the other hand, it’s painfully obvious that companies need to do a better job with corporate communications.

Right now, we just have a bunch of tweets and blog posts clarifying Adobe’s intent. I would like to see the Adobe General Terms of Use updated to say, directly in the document, what people actually want to know.

Currently, the document is written very broadly, I guess to protect Adobe, but from the customer’s point of view it seems to be full of loopholes.


Update (2024-06-18): Theodore McKenzie:

Apparently, the community’s dissatisfaction with the company grew so intense that even Adobe’s own staff started expressing unhappiness about this whole ordeal, a relatively rare occurrence in an era when many employment contracts often pressure employees to unconditionally support the company’s decisions and strategies.

As reported by Business Insider, which obtained Adobe’s internal Slack discussions, the company’s workers appear to be siding with regular users, voicing complaints about the TOS updates and the resulting backlash, as well as Adobe’s poor communication and apparent mishandling of the situation.

Update (2024-06-20): Ina Fried (via Nick Heer):

Adobe on Tuesday updated its terms of service to make explicit that it won’t train AI systems using customer data.

Adobe:

We don’t scan or review content that is stored locally on your device. We also don’t train generative AI models on your or your customers’ content unless you’ve submitted the content to the Adobe Stock marketplace.

Bravo.


"Adobe can’t legally safely host your content without a license"

Then restrict the tos to this specific case. I don't understand why people keep defending companies when they put overly broad terms into their tos.
