So they're admitting that their entire business model requires them to break the law. Sounds like they shouldn't exist.
It likely doesn't break the law. You should check out this article by Kit Walsh, a senior staff attorney at the EFF, and this one by Katherine Klosek, the director of information policy and federal relations at the Association of Research Libraries.
Headlines like these let people assume that it's illegal, rather than educate people on their rights.
Reproduction of copyrighted material would be breaking the law. Studying it and using it as reference when creating original content is not.
Humans studying it is fair use.
So if a tool is involved, it's no longer ok? So, people with glasses cannot consume copyrighted material?
Copyright can only be granted to works created by a human, but I don’t know of any such restriction for fair use. Care to share a source explaining why you think only humans are able to use fair use as a defense for copyright infringement?
What's the difference? Humans are just the intent suppliers, the rest of the art is mostly made possible by software, whether photoshop or stable diffusion.
You might want to read this post from one of the EFF's senior lawyers on the topic who has previously litigated IP cases:
https://www.eff.org/deeplinks/2023/04/how-we-think-about-copyright-and-ai-art-0
It doesn't break the law at all. The courts have already ruled that copyrighted material can be fed into AI/ML models for training:
This ruling only applies to the 2nd Circuit, and SCOTUS has yet to take up a case. As soon as there's a good fact pattern for the Supreme Court or a circuit split, you'll get a nationwide answer. You'll also note that the decision is deliberately written to provide an extremely narrow precedent and is likely restricted to Google Books and near-identical sources of information.
Have there been any US rulings stating something along the lines of “The training of general purpose LLMs and/or image generation AIs does not qualify as fair use,” even in a lower court?
I guess I can’t read anything and learn from it.
You know what? I like this argument. Software/streaming services are "too complex and costly to work in practice," therefore my viewership/participation "could not exist" if I were forced to pay for them.
Hey if they want to set that precedent, so be it.
Oh, no no no.
Rules for thee, not for me.
That's the other edge of this sword.
I do love how AI has gotten Corporate Giants to start attacking the Copyright System they've used to beat down the little man for generations
Maybe because it's not the same corporations? We might be seeing a giant power shift from IP hoarders to makers.
Makers use the copyright system to their advantage as well though. If I write code and place it on GitHub, the only thing stopping a mega corp from stealing it is the copyright I hold.
Abolishing copyright is not a win.
Let's not kid ourselves that copyright is stopping mega corporations from stealing your GitHub code.
What's stopping them from hiring an engineer that basically rewrites your code? No one would ever know.
Copyleft enforcement is laughable at best, and that's with legitimate non-profits (like the FSF) working on it, and even then only for direct library use without modifications. There's basically no history of prosecution or penalties for partial code copying (not that there should be, imo), even when 1:1 code has been found!
I feel like copyright has been doing very little in the modern age, and I have yet to see any science that contradicts my opinion here. Most copyright holders (like high 90%) are mega corporations like Getty Images that hardly contribute back to society.
Weakening copyright is a win
Another reason why copyright should be shortened... Society has changed massively in the last 100 years, but every expression of our modern society is locked behind copyright.
I keep thinking how great it would be if the federal government made a central server system to access digital content for free via taxes.
All public domain and publicly funded research and content, all in one place. Could also host owned content for people/entities and pay out royalties automatically based on consumption.
There are ways to make this fairly affordable to everyone via taxes, but maybe the big opportunity is it could also allow companies to train AI on all the data for a fat, but fair subscription. The value of that could easily pay for enough to shrink any tax costs for the public.
In general, if the US government were smart (and not currently tearing itself apart) it would be creating a generative AI public service like the postal service, potentially even relying on public government documents and the library system for training.
Offer it effectively at cost for the public to use. It would drive innovation and development, nothing produced by it would be copyrightable, and it would put pressure on private options to compete against it.
We can still have the FedEx or DHL of gen AI out there, but they would need to contend with the public option being cheaper and more widely available for use.
In addition to the US government actually needing to do work, the senators would need to understand how to turn a computer on and off.
So... This may be an unpopular question. Almost every time AI is discussed, a staggering number of posts support very right-wing positions, e.g. on topics like this one: unearned money for capital owners. It's all Ayn Rand and not Karl Marx. Posters seem to be unaware of that, though.
Is that the "neoliberal Zeitgeist" or what you may call it?
I'm worried about what this may mean for the future.
ETA: 7 downvotes after 1 hour with 0 explanation. About what I expected.
I think it's a conflation of the ideas of what copyright should be and actually is. I don't tend to see many people who believe copyright should be abolished in its entirety, and if people write a book or a song they should have some kind of control over that work. But there's a lot of contention over the fact that copyright as it exists now is a bit of a farce, constantly traded and sold and lasting an aeon after the person who created the original work dies.
It seems fairly morally consistent to think that something old and part of the zeitgeist should not be under copyright, but that the system needs an overhaul when companies are using your LiveJournal to make a robot call center.
Lemmy seems left-wing on economics in other threads. But on AI, it's private property all the way, without regard for the consequences on society. The view on intellectual property is that of Ayn Rand. Economically, it does not get further to the right than that.
My interpretation is that people go by gut feeling and never think of the consequences. The question is, why does their gut give them a far-right answer? One answer is that somehow our culture, at present, fosters such reactions; that it is the zeitgeist. If that's the truth (and this reflects a wider trend), then inequality will continue to increase as a result of voters' demands.
> My interpretation is that people go by gut feeling and never think of the consequences.
Often, yes.
> The question is, why does their gut give them a far-right answer?
The political right exploits fear, and the fear of AI hits close to home. Many people either have been impacted, could be impacted, or know someone who could be impacted, either by AI itself or by something that has been enabled by or that has been blamed on AI.
When you’re afraid and/or operating from a vulnerable position, it’s a lot easier to jump on the anti-AI bandwagon. This is especially true when the counter-arguments address their flawed reasoning rather than the actual problems. They need something to fix the problem, not a sound argument about why a particular attempt to do so is flawed. And when this problem is staring you in the face, the implications of what it would otherwise mean just aren’t that important to you.
People are losing income because of AI and our society does not have enough safety nets in place to make that less terrifying. If you swap “AI” for “off-shore outsourcing” it’s the same thing.
The people arguing in favor of AI don’t have good answers for them about what needs to happen to “fix the problem.” The people arguing against AI don’t need to have sound arguments to appeal to these folks since their arguments sound like they could “fix the problem.” “If they win this lawsuit against OpenAI, ChatGPT and all the other LLMs will be shut down and companies will have to hire real people again. Anthropic even said so, see!”
UBI would solve a lot of the problems, but it doesn’t have the political support of our elected officials in either party and the amount of effort to completely upend the makeup of Congress is so high that it’s obviously not a solution in the short term.
Unions are a better short-term option, but that’s still not enough.
One feasible solution would be legislation restricting or taxing the use of AI by corporations, particularly when that use results in the displacement of human laborers. If those taxes were then used to support those same displaced laborers, then that would both encourage corporations to hire real people and lessen the sting of getting laid off.
I think another big part of this is that there’s a certain amount of feeling helpless to do anything about the situation. If you can root for the folks with the lawsuit, then that’s at least something. And it’s empowering to see that people like you - other writers, artists, etc. - are the ones spearheading this, as opposed to legislators.
But yes, the more that people’s fear is exploited and the more that they’re misdirected when it comes to having an actual solution, the worse things will get.
Yeah, I think this is showing that a lot of people only really care about espousing anti-privatization ideas as long as it suits their personal interests and as long as they feel they have more to gain than to lose. People are selfish, and a lot of progressive rhetoric, or really any kind of passionate rhetoric, is often conveniently self-serving and emotionally driven, rather than truly principled.
I see way too many people advocating for copyright. I understand in this case it benefits big companies rather than consumers, but if you disagree with copyright, as I do, you should be consistent.
Copyright law should benefit humans, not machines, not corporations. And no, corporations are not people. Anthony Kennedy can get bent.
I hate the MAGMA companies as much as anyone, but AI such as LLMs, especially open source stuff like Facebook's models and Stable Diffusion, is beneficial to us all.
You don't have to be against copyright, as such. Fair Use is part of copyright law. It exists to prevent copyrights from being abused against the interests of the general public.
It's interesting as it's many of the MPAA/RIAA attitudes towards Napster/BitTorrent but now towards gen AI.
I think it reflects the generational shift in who considers themselves content creators. Tech allowed for the long tail to become profitable content producers, so now there's a large public audience that sees this from what's historically been a corporate perspective.
Of course, they are making the same mistakes because they don't know their own history and thus are doomed to repeat it.
They are largely unaware that the MPAA/RIAA fighting against online sharing of media meant they ceded the inevitable tech to other companies like Apple and Netflix that developed platforms that navigated the legality alongside the tech.
So, for example, right now voice actors are largely opposing gen AI rather than realizing they should probably have their union develop or partner on an offering they own, one which maximizes member revenue from usage and can dictate fair terms.
In fact, the only way many of today's mass content creators have platforms to create content is because the corporate fights to hold onto IP status quo failed with platforms like YouTube, etc.
Gen AI should exist in a social construct such that it is limited in being able to produce copyrighted content. But policing training/education of anything (human or otherwise) doesn't serve us and will hold back developments that are going to have much more public good than most people seem to realize.
Also, it's unfortunate that we've effectively self-propagandized for nearly a century around 'AI' being the bad guy: at odds with humanity, misaligned with our interests, an existential threat, etc. There's such an incredible priming bias right now that it's effectively become the boogeyman, rather than correctly being identified as a tool that, like every other tool in human history, can be used for good or bad depending on the wielder (though unlike past tools this one may actually have a slight inherent and unavoidable bias towards good, as Musk and Gab recently found out when their AI efforts denounced their own personally held beliefs on release).
Then they shouldn't exist.
Too late
I’m just trying to think about how refined AI would be if it could only use public domain data.
ChatGPT channels Jane Austen and Shakespeare.
That's not really how it would work.
If you want that outcome, it's better to train on as massive a data set as possible initially (which does regress towards the mean but also manages to pick up remarkable capabilities and relationships around abstract concepts), and then use fine tuning to bias it back towards an exceptional result.
If you only trained it on those works, it would suck at pretty much everything except specifically completing those specific works with those specific characters. It wouldn't model what the concerns of a prince in general were, but instead model that a prince either wants to murder his mother (Hamlet) or fuck her (Oedipus).
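For anyone curious what "pretrain broadly, then fine-tune" looks like in practice, here's a minimal sketch using Python and Hugging Face transformers: take a model already pretrained on a huge general corpus and continue training it on a small public-domain style corpus. The base model, the corpus file name, and the hyperparameters below are placeholder assumptions for illustration, not anything from the thread.

```python
# Minimal sketch: continue training a broadly pretrained causal LM on a small
# public-domain corpus (e.g. Austen + Shakespeare plain text) to bias its style.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

BASE_MODEL = "gpt2"  # stand-in for any model pretrained on a massive dataset

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

# Hypothetical plain-text file containing the public-domain works.
dataset = load_dataset("text", data_files={"train": "public_domain.txt"})

def tokenize(batch):
    # Tokenize each line, truncating long passages to a fixed context length.
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# mlm=False => standard causal language modeling (next-token prediction) labels.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="style-finetune",
    num_train_epochs=1,
    per_device_train_batch_size=2,
    learning_rate=5e-5,
)

Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=collator,
).train()
```

The point of the sketch is the ordering: the broad pretraining step is what gives the model general world and language modeling ability, and the small fine-tuning pass only nudges the style, which is why starting from the tiny corpus alone wouldn't get you anywhere near the same result.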
Of course they will exist. China will own them all.
Not that I am a fan of the current implementation of copyright in the US, but I know if I were planning on building my business around something that couldn't exist without violating copyright, I would surely have thought of that fairly early on.
"My profits from fencing your wallet could not exist if stealing your wallet were punished."
"Ah, you're right, how silly of me, carry on."
Sounds like a win to me
Sounds good.