Tagging and filtering work for communities that need them. Those that host NSFW content enforce rules that let users customize their feeds, just as is done here with the [AI] tag.
Even_Adder
Please explain how honoring artists' will can make the situation 10x worse?
That's what I was talking about when I said:
Using things “without permission” forms the bedrock on which artistic expression and free speech are built. They want you to believe that analyzing things without permission somehow goes against copyright, when in reality fair use is a part of copyright law, and the reason our discourse isn’t wholly controlled by mega-corporations and the rich.
And when I said:
The people who train these systems still have rights like you and me, and the public interest transcends individual consent. Rights holders, even when they are living, breathing individuals, would always prefer to restrict our access to materials, but from an ethical standpoint, the benefits we see from fair use and library lending outweigh author permissions. We need to uphold a higher ethical standard here for the benefit of society so that we don’t end up building a utopia for corporations, bullies, and every wannabe autocrat, destroying open dialogue in the process.
What do you think someone who thinks you’re going to write an unfavorable review would say when you ask them permission to analyze their work? They’ll say no. One point for the scammers. When you ask someone for permission to scrutinize their interactions online, what will they say? They’ll say no. One point for the misinformation spreaders. When you ask someone to analyze their product for reverse engineering, what will they say? They’ll say no. One point for the monopolists. When you ask someone to analyze their data for indexing, what will they say? They’ll say no. One point for the obstructors.
And when I said:
...If we allow that type of overreach, we would be giving anyone a blank check to threaten the general populace with legal trouble just from the way you draw the eyes on a character. This is bad, and I shouldn’t have to explain or spell it out to you.
What these people want unfairly restricts self-expression and speech. Art isn’t a product, it is speech, and people are allowed to participate in conversations even when there are parties that would rather they didn’t. Wanting to bar others from iterating on your ideas or expressing the same ideas differently is both selfish and harmful. That’s why the restrictions on art are so flexible and allow for so much to be pulled from to make art.
You have spent so many hours dishonestly dodging the actual points I've made that it's not surprising you're lost this far in.
And we're discussing your assertion that AI art is unethical because of how it's trained. I've given examples and explanations of how your views on honoring artists' wills are not only unfair but also shortsighted and harmful to all of us. I do this not only in hopes of changing your mind, but also the minds of anyone who might be reading this thread.
Your actions don't match your words, friend. It seems to me that if you were here doing as you say, there wouldn't be any doubt that tags are the best solution in this situation. People who want to can view the content, and those who don't can avoid it, as has always been done.
Look at it this way. You want to bar me from posting this stuff here, even though you aren't bereft of other communities that function the way you want. I push back against that because you have the option to customize your feed and countless other communities to choose from. Why are you trying to take away my few choices?
I asked you to think about what copyright protects. It gives artists protection over specific expressions, not broad concepts like styles, and this fosters ethical self-expression and discourse. If we allow that type of overreach, we would be giving anyone a blank check to threaten the general populace with legal trouble just from the way you draw the eyes on a character. This is bad, and I shouldn't have to explain or spell it out to you.
What these people want unfairly restricts self-expression and speech. Art isn't a product, it is speech, and people are allowed to participate in conversations even when there are parties that would rather they didn't. Wanting to bar others from iterating on your ideas or expressing the same ideas differently is both selfish and harmful. That's why the restrictions on art are so flexible and allow for so much to be pulled from to make art.
It is spelled out in the links I've replied with how these shortsighted power grabs will consolidate power at the top and damage life for us all. While Cory Doctorow doesn't endorse AI art, he agrees that it should exist. He goes on to say that you can't fix a labor problem with copyright, the way some artists are trying to do. That just changes how, and how much, you end up paying the people at the top.
And I want to reiterate, I'm not talking about the law here, I'm talking about the effects the laws have. I feel for the artists here, but honoring a special monopoly on abstract ideas and general forms of expression is a recipe for disaster that will only make our situation 10x worse.
Maybe it's because the source was a CGI show? A 3D art style LoRA over anime models always yields a soft blend of the two styles. To me, it looks like what the CGI could look like with infinite money.
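For context, here's roughly what that kind of workflow looks like with the diffusers library. This is just an illustrative sketch, not the actual setup behind this image; the checkpoint and LoRA names are placeholders I made up:

```python
# Sketch: layering a 3D/CGI-style LoRA on top of an anime base model.
# Model and LoRA repo names below are placeholders, not real repositories.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "example/anime-base-model",          # placeholder anime-style base checkpoint
    torch_dtype=torch.float16,
).to("cuda")

# Loading a LoRA trained on 3D/CGI renders layers that style onto the base model.
pipe.load_lora_weights("example/3d-style-lora")  # placeholder LoRA weights

# Running the LoRA at partial strength is what gives the soft blend of both styles.
image = pipe(
    "portrait of the main character",
    cross_attention_kwargs={"scale": 0.7},  # 0 = pure anime base, 1 = full LoRA style
).images[0]
image.save("blend.png")
```

Dial the scale up or down and the output drifts between the base model's anime look and the LoRA's CGI look, which is why results from these combinations always land somewhere in between.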
The content is allowed here; you're the one saying it shouldn't be, when there are other communities like the ones you describe. You're not pushing back, you're pushing into an already established community rather than curating your own feed.
We can continue this conversation if you're willing to proceed in good faith, but putting words in my mouth and trying to misrepresent the situation isn't cool. If you can't own up to your side of the argument and have to try to turn it on me, you've already lost the plot. This kind of manipulation leads to miscommunication, kills the actual dialogue, and makes you look even weaker than your argument.
I'm not telling you to ponder this from a legal perspective; I'm telling you to look at what those laws protect from an ethical perspective. And I urge you again to actually read the material. It goes in depth and explains how all this works and the ways in which it's all related. A quick excerpt:
Break down the steps of training a model and it quickly becomes apparent why it's technically wrong to call this a copyright infringement. First, the act of making transient copies of works – even billions of works – is unequivocally fair use. Unless you think search engines and the Internet Archive shouldn't exist, then you should support scraping at scale:
https://pluralistic.net/2023/09/17/how-to-think-about-scraping/
And unless you think that Facebook should be allowed to use the law to block projects like Ad Observer, which gathers samples of paid political disinformation, then you should support scraping at scale, even when the site being scraped objects (at least sometimes):
https://pluralistic.net/2021/08/06/get-you-coming-and-going/#potemkin-research-program
After making transient copies of lots of works, the next step in AI training is to subject them to mathematical analysis. Again, this isn't a copyright violation.
Making quantitative observations about works is a longstanding, respected and important tool for criticism, analysis, archiving and new acts of creation. Measuring the steady contraction of the vocabulary in successive Agatha Christie novels turns out to offer a fascinating window into her dementia:
https://www.theguardian.com/books/2009/apr/03/agatha-christie-alzheimers-research
Programmatic analysis of scraped online speech is also critical to the burgeoning formal analyses of the language spoken by minorities, producing a vibrant account of the rigorous grammar of dialects that have long been dismissed as "slang":
Since 1988, UCL Survey of English Language has maintained its "International Corpus of English," and scholars have plumbed its depth to draw important conclusions about the wide variety of Englishes spoken around the world, especially in postcolonial English-speaking countries:
https://www.ucl.ac.uk/english-usage/projects/ice.htm
The final step in training a model is publishing the conclusions of the quantitative analysis of the temporarily copied documents as software code. Code itself is a form of expressive speech – and that expressivity is key to the fight for privacy, because the fact that code is speech limits how governments can censor software:
https://www.eff.org/deeplinks/2015/04/remembering-case-established-code-speech/
If you're not willing to do that, there isn't much I can do, since all of your questions are answered there.
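For what it's worth, the "quantitative observations" step the excerpt describes isn't exotic. Here's a minimal sketch of the kind of vocabulary measurement the Agatha Christie study relied on, assuming a hypothetical folder of plain-text novels named so they sort chronologically; this is my own illustration, not code from the linked articles:

```python
# Minimal sketch: measure vocabulary richness across a series of texts.
import re
from pathlib import Path

def vocabulary_richness(text: str) -> float:
    """Type-token ratio: distinct words divided by total words."""
    words = re.findall(r"[a-z']+", text.lower())
    return len(set(words)) / len(words) if words else 0.0

# Hypothetical corpus folder; files assumed to sort in publication order.
for path in sorted(Path("novels").glob("*.txt")):
    ratio = vocabulary_richness(path.read_text(encoding="utf-8"))
    print(f"{path.name}: {ratio:.3f}")
```

A declining ratio over successive books is exactly the kind of aggregate, non-reproducing observation the excerpt is talking about: no part of any novel ends up in the output, only a measurement of it.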
I'm not telling anyone they're wrong for not wanting to see certain content; I only want them to use the tools available to them before making knee-jerk decisions that can have adverse effects on the community. As easy as it is to create communities, it's even easier to use the blocking tools for yourself. This conversation has taken hundreds of times longer than it would have for someone to block and move on.
Just because the majority thinks one way doesn't mean it can't be wrong or ignorant. History is full of examples where the crowd went the wrong way on issues. Hell, you don't even need history, just look at the US today. A community without dissent is dooming itself to ignorance and leaving itself vulnerable to the machinations of bad actors. The reality is that justice and truth aren't the same as popularity, and we have to push against the crowd sometimes to get to them. Lemmy arms us with the tools to do just that, and it's up to us to use them whenever possible.
The tags exist here because we already agreed that was the way we were handling content. In the meantime, you can just block me until tags arrive. That would be the simplest way to filter this content from your view.
Yeah, it only barely resembles the image on the wiki.