Thousands of authors demand payment from AI companies for use of copyrighted works

Thousands of published authors are requesting payment from tech companies for the use of their copyrighted works in training artificial intelligence tools, marking the latest intellectual property critique to target AI development.
So what’s the difference between a person reading their books and using the information within to write something, and an AI doing it?
Because AIs aren’t inspired by anything and they don’t learn anything
So uninspired writing is illegal?
No but a lazy copy of someone else’s work might be copyright infringement.
So when does Kevin Costner get to sue James Cameron for his lazy copy of Dances With Wolves?
Avatar is not Dances with Wolves. It’s Ferngully.
Idk, maybe. There are thousands of copyright infringement lawsuits, and sometimes they win.
I don’t necessarily agree with how copyright law works, but that’s a different question. Doesn’t change the fact that sometimes you can successfully sue for copyright infringement if someone copies your stuff to make something new.
Why not? Hollywood is full to the brim with people suing for copyright infringement. And sometimes they win. Why should it be different for AI companies?
Language models actually do learn things, in the sense that the information encoded in the trained model usually isn’t taken directly from the training data; instead, it’s information that describes the training data but is new. That’s why they can generate text that’s never appeared in the data.
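If it helps, here’s a toy sketch in Python (a made-up bigram counter, nowhere near a real LLM) of what I mean: the “model” only stores statistics about the text, not the text itself, and it can still spit out a sentence that never appears in the corpus.

```python
# Toy illustration only: a bigram "model" built from word-pair counts.
# This is not how a real LLM works, but the idea is the same:
# what gets stored describes the corpus rather than copying it.
import random
from collections import defaultdict

corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# "Training": record which word follows which.
follows = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    follows[a].append(b)

# "Generation": repeatedly sample a next word from those statistics.
word, out = "the", ["the"]
for _ in range(6):
    word = random.choice(follows[word])
    out.append(word)

print(" ".join(out))  # can yield e.g. "the cat sat on the rug ." which is not in the corpus
```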
What does inspiration have to do with anything? And to be honest, humans being inspired has led to far more blatant copyright infringement.
As for learning, they do learn. No different than us, except we learn silly abstractions to make sense of things while AI learns from trial and error. Ask any artist if they’ve ever looked at someone else’s work to figure out how to draw something. Even if they’re not explicitly looking up a picture, if they’ve ever seen a depiction of it, they recall and use it. Why is it wrong if an AI does the same?
The person bought the book before reading it.
Not if I checked it out from a library. A WORLD of knowledge at your fingertips, and it’s all free to me, the consumer. So who’s to say the people training the AI didn’t check the books out from a library, or even buy the copies they’re using for training? Would you feel better about it if they had purchased their copy?
A person is human and capable of artistry and creativity; computers aren’t. Even questioning this just means dehumanizing artists and art in general.
Not being allowed to question things is a really shitty precedent, don’t you think?
Do you think a hammer and a nail could do anything on their own, without a hand picking them up and guiding them? Because that’s what a computer is. Nothing wrong with using a computer to paint or write or record songs or create something, but it has to be YOU creating it, using the machine as a tool. It’s also in the actual definition of the word: art is made by humans, which explicitly excludes machines. Period.

Like, I’m fine with AI when it SUPPORTS an artist (although sometimes it’s an obstacle, because sometimes I don’t want to be autocorrected; I want the thing I write to be written exactly as I wrote it, for whatever reason). But REPLACING an artist? Fuck no. There is no excuse for making a machine do the work and then taking the credit, just to make a quick, easy buck on the backs of actual artists who were used WITHOUT THEIR CONSENT to train a THING to replace them. Nah, fuck off, my guy. I can clearly see you’ve never done anything creative in your whole life, otherwise you’d get it.
Oh, right. So I guess my 20+ year graphic design career doesn’t fit YOUR idea of creative. You sure have a narrow life view. I don’t like AI art at all; I think it’s a bad idea. You’re a bit too worked up about this to try to discuss anything. Not too excited about getting told to fuck off over an opinion. This place is no better than Reddit ever was.
Of course I’m worked up. I love art, I love doing art, I have multiple friends and family members who work in art, and art is the last genuine thing left in this economy. So yeah, obviously I’m angry at people who don’t get it and celebrate this bullshit just because they’re too lazy to pick up a pencil, get good, and draw their own shit, or alternatively commission what they want to see from a real artist. Art was already PERFECT as it was. I have a right to be angry that tech bros are trying to completely ruin it after turning their noses up at art all their lives. They don’t care about why art is good? Ok, cool, they can keep doing their graphs and shit and just leave art alone.
Large language models can only calculate the probability that words should go together based on existing texts.
Isn’t this correct? What’s missing?
Let’s ask ChatGPT 3.5:
> Mostly accurate. Large language models like me can generate text based on patterns learned from existing texts, but we don’t “calculate probabilities” in the traditional sense. Instead, we use statistical methods to predict the likelihood of certain word sequences based on the training data.
“Mostly accurate” is pretty good for an anonymous internet post.
I thought so too, so I’m still confused about the votes. Oh well.
I don’t see how “calculate the probability” and “predict the likelihood” are different. Seems perfectly accurate to me.
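For what it’s worth, here’s a minimal Python sketch (with made-up scores, not taken from any real model) of why the two phrasings collapse into the same step: you turn scores into a probability distribution over the next word, and the “prediction” is just reading off that distribution.

```python
# Minimal sketch with hypothetical scores ("logits") for the next word
# after "the cat sat on the ___". Softmax turns them into probabilities;
# predicting the likely next word is the same calculation.
import math

next_word_scores = {"mat": 2.1, "rug": 1.7, "dog": 0.3}

total = sum(math.exp(s) for s in next_word_scores.values())
probs = {w: math.exp(s) / total for w, s in next_word_scores.items()}

print(probs)                      # roughly {'mat': 0.55, 'rug': 0.37, 'dog': 0.09}
print(max(probs, key=probs.get))  # most likely next word: 'mat'
```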