What do you think the trained model is other than a derived work?
Promotional images are still under copyright.
AI is creating an image based on someone else’s property. The difference is it’s owned by a corporation.
This isn’t the issue. The copyright infringement is the creation of the model using the copyrighted works as training data.
All NYT is doing is demonstrating that the model must have been created using copyrighted works, and hence infringement has taken place. They are not stating that the model is committing an infringement itself.
They are showing that the author of the tool has committed massive copyright infringement in the process of constructing the tool.
…unless they licensed all the copyrighted works they trained the model on. (Hint: they didn’t, and we know they didn’t because the copyright holders haven’t licensed their work for that purpose.)
It doesn’t matter if a company charges or not for anything. It’s not a factor in copyright law.
Copyleft is not public domain, and requires copyright law to function.
The article uses Midjourney. Nobody is tuning it.
…and that’s why the person you originally replied to asked their question. General popularity is generally a bad proxy metric for personal preference.
Is it impossible to like things outside the mainstream?
Only works if you can start a cycle on power on. My machine will just sit there waiting for someone to press the go button.
I do use the timer delay to run the wash cycle when the power is cheap. I’d really like it if I could set it as “ready to go” and have something else give it the “go” when the power is cheap.
Once I have that, it’s also useful to have something to tell me there’s wet washing that needs to be unloaded.
If my washing machine was older I could do all of this with a remote power switch and sensor, but because my washing machine has touch buttons instead of click/clacks, I can’t. Turning the power on just makes it wait for a button press.
It would be far better if somebody sold a single VPN device that let the general public access their home devices. Something WireGuard based could be so simple for people to use. Even better if your ISP offered this as a standard feature which they made easy to set up. Then none of these devices would have an excuse to phone home to the company’s servers. Any that did would be obviously spying, and they could be shamed.
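To show how little there is to it, here is a rough sketch of the kind of WireGuard config such a box could generate automatically; every key, address and hostname below is made up for illustration:

```
# Hypothetical "home VPN box" side (wg0.conf)
[Interface]
Address = 10.8.0.1/24
ListenPort = 51820
PrivateKey = <box-private-key>

[Peer]
# The owner's phone
PublicKey = <phone-public-key>
AllowedIPs = 10.8.0.2/32
```

```
# Phone side
[Interface]
Address = 10.8.0.2/32
PrivateKey = <phone-private-key>

[Peer]
PublicKey = <box-public-key>
Endpoint = home.example.net:51820   # made-up dynamic DNS name
AllowedIPs = 10.8.0.0/24, 192.168.1.0/24
PersistentKeepalive = 25
```

The only genuinely fiddly parts are the key exchange and the port forwarding on the router, which is exactly the bit a vendor or an ISP could automate for you.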
If fair use is cut down…
It’s not a case of cutting down fair use. It’s a case of enforcing current fair use limits.
The choices here are to respect copyright or destroy it. Having an AI exception is nonsense.
"I’m not illegally downloading the latest blockbuster/ best seller / chart topping album. I’m scraping the internet for training data for my AI. It just so happens I need to filter the data by hand before it can injest it. I keep looking for suitable data, but haven’t identified any yet. "
There’s plenty of non-copyrighted material out there to do research on. It won’t make for useful AI products, but for that they can start licensing.
…and then return it to his grieving wife?
Yes. A diamond is just a rock somebody found. Same for gold. They have value because they are scarce and people think they are pretty (up until the last couple of centuries when we developed industrial uses for both). Nobody has ever needed a diamond or a hunk of gold to survive, yet they have value because we say that they are valuable.
It’s not hard. He’s saying that this study makes no claims about effectiveness, but people are so programmed with the catchphrase “safe and effective” that they conflate the two.
Containers are the ultimate “works for me” in software development. In my experience they make for more fragile software that depends on its environment being perfect, and nothing else will do.
The trained model is a work derived from masses of copyrighted material. Distribution of that model is infringement, same as distributing copies of movies. Public access to that model is infringement, just as a public screening of a movie is.
People keep thinking it’s “the picture the AI drew” that’s the issue. They’re wrong. It’s the “AI” itself.