this post was submitted on 19 Jul 2023
125 points (98.4% liked)
Technology
Does no one remember the days of Napster, when people caught pirating were charged multiples of the retail cost?
And technically piracy is a federal crime, so there could even be criminal charges.
A "nothing burger"?
Let's see...oh my, what's this? 17 U.S.C. § 504(c)(2): statutory damages of up to $150,000 for willful infringement.
That's per work infringed.
Nothing burger indeed.
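To make the "per work infringed" point concrete, here is a minimal back-of-the-envelope sketch. The per-work caps come from § 504(c)(1) and § 504(c)(2); the number of works is a made-up assumption for illustration, not a figure from any actual case.

```python
# Statutory damages under 17 U.S.C. § 504(c) are assessed PER WORK infringed.
ORDINARY_MAX = 30_000    # § 504(c)(1) upper bound per work
WILLFUL_MAX = 150_000    # § 504(c)(2) upper bound per work (willful infringement)

works_infringed = 100_000  # hypothetical number of pirated books (assumption)

ordinary_exposure = works_infringed * ORDINARY_MAX
willful_exposure = works_infringed * WILLFUL_MAX

print(f"Ordinary max exposure: ${ordinary_exposure:,}")   # $3,000,000,000
print(f"Willful max exposure:  ${willful_exposure:,}")    # $15,000,000,000
```

Even with far smaller work counts or awards well below the cap, the per-work multiplier is what makes this anything but a nothing burger.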
OpenAI is on the other end of over two decades of fearmongering and lobbying to enact laws with ridiculous penalties for piracy in the digital age.
As for how we know where they got the information: that's what subpoenas are for in a legal proceeding. Even if the training data is never publicly disclosed, whether or not they pirated content will come out in discovery.
The AI doesn't need to reproduce the book for OpenAI to have infringed in illegally sourcing the copyrightable material they used in training.
You failed to read my post. You jumped straight into an assumption that piracy can be proved rather than actually reading what I've posted.
If you're going to continue with strawman arguments then please return to reddit.
Piracy can be proved if it occurred by talking to employees under oath and subpoenaing relevant email records.
The idea the court would need to reverse engineer ChatGPT to find out is absurd.