General Discussion
At least one publisher asked a famous author to allow trained AI to write their future books.
Last edited Fri Jul 14, 2023, 12:50 AM - Edit history (1)
Dammit.
This news today via tweets from bestselling YA author Maureen Johnson - https://en.wikipedia.org/wiki/Maureen_Johnson - who was given permission by the famous author to tweet about this but not name names yet.
Tweets below, then text.
Link to tweet
Link to tweet
Link to tweet
Link to tweet
@maureenjohnson
Authors: we need to stand up with the actors. AI is ALREADY HERE in our work. I just spoke to a Very Famous Author who has to remain nameless for legal reasons. They are held up in a contract negotiation because a Major Publisher wants to train AI on their work.
This person will talk publicly when they can. They can't right now. But you see what this means, and what's already going on. If we don't support other artists, we're toast.
I have their permission to talk about this as a blind item. Seriously. The alarms are ringing right now.
There are already services training AI on our books. This is an attempt by one of the Big Publishers to codify this into a contract. And I repeat, this person is a tentpole of publishing. So this can, and almost certainly will, happen to us all.
We need to know from agents if they are seeing this on other contracts. And we have to get serious about unionizing as authors.
I'm really not fond of ringing alarms or freaking out. It's just that the moment is here. It's already here. And we need to--every single one of us in publishing--every author, agent, editor--need to get together on this right now or our whole art and industry goes up in smoke.
We have to deal with this immediately in our contracts, in our work being fed to AI online. And then we need an Authors Union NOW.
EDITING because I just found a later tweet she posted with a TikTok video with a bit more detail. She explains that the famous author was upset because the publisher wanted the AI trained to "spit out more" of the author's work, without the author being involved:
Link to tweet
https://www.tiktok.com/@maureenjohnsonbooks/video/7255426312058178858
highplainsdem (52,351 posts)
moniss (5,724 posts)
created works to be labeled as such. All areas of creativity. I have no desire for a machine to tell me about laughter, love, passion, pride, loss, hope and tragedy. I have no desire for the machine to draw my eye to the visual or my ear to the aural of things that are of human good and bad and wonder.
highplainsdem (52,351 posts)
fields by AI and robots, for the financial benefit of very few and to the detriment of almost everyone else.
Generative AI companies that did not legally acquire every bit of art and writing in their data sets should be sued out of existence. That would be almost all of them.
Generative AI companies whose AI cause harm should be held responsible, especially if they do not make AI results clearly detectable as such.
Using AI for deepfakes should be punished as a serious crime.
moniss (5,724 posts)
and a great point about the deepfake issue.
Bernardo de La Paz (50,899 posts)
Programmers have to be a bit ego-less because they expect their code will be reworked over time, improved by others, debugged, tweaked, partially rewritten as requirements change. So using AI assists for programming is the way it is going. You still need a programmer to ride herd over everything, but AI does increase productivity.
Fiction is different. In a sense, a work of fiction is cast in stone when it is finished as deemed by the author. Authors sometimes will write a second edition of a piece of fiction, but that is extremely rare.
Emrys (7,941 posts)
is for AI to write the author's future work. It's to formalize the use of the author's work as training material for AI.
As the author quoted in the OP said, this already happens, and on a grand scale, but copyright enforcement has yet to catch up. So it looks more like the publisher is trying to move with the times. Presumably, if they have contractual rights for use of the material for AI training, they can monetize it.
AI undoubtedly has already been used to produce "literature" - on Amazon recently, there have been a number of books put out by disreputable publishers that have been identified as AI-produced, but they've been so crappy that they've soon been withdrawn after a few suckers bought them. I don't believe this contract is to do with that.
highplainsdem (52,351 posts)
that style and make the writer "more productive." Ditto for image-generating AI.
With this being a very famous author, the publisher is trying to get the right to publish new, AI-written books under the author's name, after the author retires, or dies, or when they just want to take some time off.
Emrys (7,941 posts)
Last edited Thu Jul 13, 2023, 11:13 PM - Edit history (1)
"At least one publisher asked a famous author to allow trained AI to write their future books."
The contract seems to be governing use of the material for training, not to produce the author's (or indeed necessarily any) future works by AI.
This is borne out by other authors who've replied to the tweets in your OP:
Link to tweet
@JenniferBrody
Yup! Just signed new book deal & my agent got AI protections in! But Im hearing from agents big pubs stonewalling on AI (owned by same corporations as studios). Also got in Twitter argument with BIG author with Apple+ show who thinks we should just let AI happen. Initials=HH
Link to tweet
@FearlessProse
I think you'll find that is already happening and not only public online content is being trawled for AI. The training platform I use just updated their T&C to include everything I put on the platform, which is part of my intellectual property.
highplainsdem (52,351 posts)
author's work is worth much more to the publisher to train an AI to write future books "by" that author than it would be if sold to bundle with the work of other authors.
Emrys (7,941 posts)
The problems of AI in the creative arts are real and just beginning, but step 1 is for creators to establish whether they can have and exercise control over use of their work as input for AI. Step 2 would be to enforce that, which would be a whole other ball game, and one where it's probably going to be impossible to achieve 100% compliance because some of the firms developing AI are going to be shady and indiscriminate in what they trawl as input.
That's what these current contracts are doing. There's no suggestion in the OP tweets that the publisher intends to do what your OP title claims. It may, but it's not there in the tweets you quoted or the ones I've just quoted.
What you say may be true, but I think it's a misreading of the situation described in the tweets in the OP.
People are worried about their rights over their current work, not immediately about what AI may be used to produce in future. That's going to be a different battle.
highplainsdem (52,351 posts)
is most valuable when used to replicate that author's work by a publisher with rights to publish that AI-replicated work. Tossing that data set into a vast data set for any other reason, even in the hope of marginally improving the overall quality of writing from a chatbot, is almost worthless by comparison. And no publisher in their right mind would give that data set to competitors, for someone to legally ask for something written just like Famous Author.
We've seen series started by authors continued after the author's death. A publisher who likes AI would think an AI replica of Famous Author would be ideal. Famous Author AI could also churn out more books while the author is still here. No worry about publishing schedules being messed up because a book isn't finished in time.
Btw, some of the people responding in that thread don't know there are already some unions for writers. Some of those Twitter users seem to think writers aren't allowed to unionize. Sigh.
highplainsdem (52,351 posts)
of what she was told by the famous author. Who told her the publisher wanted to train the AI to "spit out more" of the author's work without the author being involved.
I posted that at the end of the OP. Here's the link for the TikTok video:
https://www.tiktok.com/@maureenjohnsonbooks/video/7255426312058178858
Bernardo de La Paz (50,899 posts)
No good for me for fiction. Not one penny for new fiction unless it is CERTIFIED to be entirely the work of a human writer. Unless the social consensus becomes that putting a person's name on AI work is fraud; the opposite certification would be to label all fiction with any AI involvement, partial or whole, as AI-generated fiction.
For non-fiction, I'm more concerned about goofy or blindly malicious AI making errors of fact or conclusion. So labelling would be important there too.
Artcatt (344 posts)
what the AI boosters say. It's not about our being afraid of technology. It's how it's always used to eliminate workers.
Bernardo de La Paz (50,899 posts)
Anytime someone takes a simplistic approach to a complex issue and boils it down to a "sole purpose" they are almost always wrong, as is the case here.
Not finding new medicines DOES eliminate workers when they die from illnesses, injuries, and diseases.
highplainsdem (52,351 posts)
Bernardo de La Paz (50,899 posts)
I will only buy fiction certified to be written entirely by a real person.
As to non-fiction, I tend to double check it anyway. So, less of a concern there, and it is more the facts that are key, rather than writing style which assists greatly but is not the key feature.
highplainsdem (52,351 posts)
True stories. Personal experience.
Please don't devalue the people who write non-fiction.
Bernardo de La Paz (50,899 posts)
Also, there are many varieties of non-fiction: history, autobiography, engineering textbooks, philosophical analysis of gender rights, cookbooks, self-help, ....
Plus, extending what you say, the lived experience of a human being can be mocked up, simulated, assembled from fragments by an AI, but it would not have the true feeling or perspective or real experience of a human.
However, with fact-checking and data in hand, an author can have perhaps deeper or broader insights and feel more confident in the data they are using.
But a careless author would just accept AI "fact-checks" without checking themselves. If it were me, I'd want an AI that says something like "The proportion of college graduates on page 129 is not accurate: see the World Bank site." Then I'd go check myself.
An uninspired author would paste in large sections of AI writing whether or not the "insights" are garbage, bogus, or a misinterpretation. But it was ever thus.
Ultimately the true test will be whether the author can talk knowledgeably about the contents. I imagine an interview going "In your third chapter you make the startling claim that supporters of Finnish indigenous people have antlers growing from their shoulders and elbows. Where did you observe that?" ... pause ... "umm, er".
Especially if editors are AIs or are useless without AIs.
Emrys (7,941 posts)
Even the best and most authoritative authors need assistance in ensuring their ideas are expressed clearly and that they haven't had inadvertent mental glitches, even before you get anywhere near fact-checking.
Those who aren't the best sometimes need a great deal of help with English issues, especially if it's not their first language, which isn't uncommon as English is the lingua franca for most subjects.
If anything's unclear in what a human author's written, we can raise queries with them and clear issues up, or if they're incompetent, we can apply common sense or our own real-world knowledge and research as best we can to get the job done. I'm not sure how that would work with an AI author. I'm not sure at all how it would work with AI copy-editors working on AI-generated text!
But the technology's been creeping into copy-editing work for years in the form of automated "assistants" using AI (which are not infrequently more of a hindrance than a help), though so far they still need humans as the final arbiters, and maybe always will unless standards slip even more than they have already over the last few decades.
The pressures are likely to be higher in non-fiction because academics have to publish to survive nowadays, even if they sometimes have nothing new to say ("sometimes" may be a bit generous there). Plagiarism is already a major issue, and efforts to detect it have been automated to a certain extent - often using AI. Detecting AI-generated copy will pose its own challenges, and so the arms race will continue - and no doubt involve pitting AI against AI!
Bernardo de La Paz (50,899 posts)
Analogous to how stuff like Twinkies are manufactured food-like substances.
discntnt_irny_srcsm (18,577 posts)
Lab-created diamonds, as opposed to blood diamonds, I'm good with.
MineralMan (147,576 posts)
I made my living for almost my entire adult life through writing. Not fiction, but non-fiction articles for magazines. Major ones. Here's why AI will work to replace writers to some degree:
Early on in my writing career, I realized what it would take for me to be successful in the magazine world. Over about a year, I learned to create specific algorithms for the different magazines I wanted to write for. It wasn't as difficult as you might think. Here is the process I used to create an algorithm for any magazine I intended to pitch:
First, I bought the most recent two issues of that magazine and quickly read through them.
Next, I read them in a more detailed way, focusing on the articles that were the most similar in subject matter to what I hoped to sell to that publication. I took notes, to help me create the algorithm for that type of article in that particular publication. Some of the things I included in the algorithm:
Article word count.
Average paragraph length.
Average sentence length.
Level of diction - which words were chosen, relative to the reading level of the target audience.
Style details: use of the serial (Oxford) comma, punctuation style (semicolons?), pronoun usage, point of view and references to readers.
Use of metaphors.
Logical structure of the article - style of lede, length of introduction, number of paragraphs used for structure, assumptions about readers, method of summing up and concluding.
There was more, of course, in my eventual algorithm for each magazine. That algorithm fit on a single page, which I filed for reference when needed.
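Purely as an illustration of the kind of measurements listed above - not the poster's actual method, and the file name and metric choices are my own assumptions - here is a minimal Python sketch that pulls a few of those numbers (word count, average paragraph and sentence length, a crude diction proxy) out of a plain-text article:

```python
# Illustrative sketch only - not the process described in the post above.
# Assumes a plain-text article saved as "sample_article.txt" (hypothetical file)
# and computes a few simple style metrics from it.
import re
from statistics import mean


def style_profile(text: str) -> dict:
    # Paragraphs separated by blank lines; naive sentence split on ., ! or ?.
    paragraphs = [p for p in re.split(r"\n\s*\n", text.strip()) if p.strip()]
    sentences = [s for s in re.split(r"(?<=[.!?])\s+", text.strip()) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    return {
        "word_count": len(words),
        "avg_paragraph_words": mean(len(re.findall(r"[A-Za-z']+", p)) for p in paragraphs),
        "avg_sentence_words": mean(len(re.findall(r"[A-Za-z']+", s)) for s in sentences),
        # Crude stand-ins for "level of diction".
        "avg_word_length": mean(len(w) for w in words),
        "unique_word_ratio": len({w.lower() for w in words}) / len(words),
        "semicolon_count": text.count(";"),
    }


if __name__ == "__main__":
    with open("sample_article.txt") as f:  # hypothetical input file
        profile = style_profile(f.read())
    for name, value in profile.items():
        print(f"{name}: {value:.2f}" if isinstance(value, float) else f"{name}: {value}")
```

A real profile would obviously need many more dimensions (lede style, point of view, metaphor use), most of which don't reduce to simple counts.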
Once I had the algorithm, based on current issues, I looked at the most recent year's issues to create an editorial calendar for that magazine. Typically, at the time, queries to editors needed to be made about 6 months in advance of the issue date. Then, I'd come up with possible articles for an issue six months out and send that list of proposals to the editor I was most likely to be working with. That also took some research effort. Of course, I also included information about myself if I was unknown to the editor.
This did not always work, of course. But it did work well enough to get me a foot in the door in enough cases. Then, I followed my own rules for myself. What I sent to the magazine for that first assignment was an article that was error-free, on time, at the length that type of article usually had, and in the style of that magazine. The style was always based on my assessment of the algorithm used in recent issues.
It worked for me. Editors liked seeing my stuff come in, because it was already very close to what it would turn out to be, after their editing. That meant less work for them and ensured that my proposals would have a better chance of becoming assignments.
Algorithms are what AI is about. That's why an AI text generation program can closely mimic the output from a writer who writes similar things over and over again. Some novelists make their careers from writing formulaic novels. They have an internal algorithm for creating those books. Same thing. AI can do that, and will do that, with or without the permission of the authors. Guaranteed.
highplainsdem (52,351 posts)
because something can be done, it does not follow that people should view it as okay or inevitable.
Ethical people should always reject things that are possible when they harm others.
Pressuring writers or actors to sign contracts so they can be replaced by AI is unethical as hell.
Will individuals and companies still act unethically at times, if it's to their advantage? Sure.
But they should be punished for doing so, through social rejection or civil lawsuits or criminal penalties.
Writing to fit a formula, often called hackwork, can be done by humans, if they want to do that type of writing. But it was still your writing, your decision about the words.
AI follows algorithms mindlessly, thanks to the theft of vast amounts of intellectual property.
Using that to replace human writers is not okay.
MineralMan (147,576 posts)
I said that it will happen. It will.
Every publisher has a formula. That's especially true with magazines. You either write to fit that, or you don't write for that publication. As you said, the words you write are yours. It is the structure that you are copying.
Making a living as a freelance writer is not easy. Once you learn how to do it successfully, you can get paid to write. You choose the words. You choose what you will say.
AI software works cheaper than human writers do. So, it's going to get used. I'm retired now, so it will not affect me. I still maintain that it is inevitable, though. Just watch.
Johonny (22,047 posts)
As it is a way to extend intellectual rights and produce "new" product.
We're seeing SAG and the WGA dealing with AI. The question right now is who owns this work. Obviously the studios and the publishers would rather own it.
It's going to be a fight because everyone sees the future.
MineralMan (147,576 posts)
How will it turn out? I don't know. I suspect that the current state of AI is overhyped.
So, let's see what happens. Let's have the first AI-scripted, AI-performed animated movie. How will it do? I suspect it will do poorly.
Let's see the first AI-written James Patterson novel. Right now, humans are writing them. Those seem to be doing OK, marketwise. How will the first AI-written one do? I suspect it will do poorly.
Will AI-generated content and artistic production improve? No doubt. Will it replace actual writers, actors, composers, and so on? I doubt it. What it will do is replace low-quality crap output by poorly-paid offshore hacks.
highplainsdem (52,351 posts)
as acceptable.
And yes, I know you can get paid to write as a freelancer, sometimes very well. And you are selling to markets. Though sometimes publishers have to make competitive bids in auctions. And sometimes they realize they want something that doesn't look much like what they published earlier and they take a chance on something new, though if it's successful enough they might want to keep following that new example. Smart publishers give good editors some leeway.
It isn't true that every publisher has a formula. They do often have general guidelines on subjects and length, and they want to see submissions written for what they view as their particular audience, in terms of interests and age and educational level. But editors of real magazines, as opposed to websites filled with SEO clickbait to generate revenue from ads, are still more interested in quality and originality than just following a formula.
And book publishers often publish a dizzying array of fiction and non-fiction on different subjects. So have some of the better magazines.
If AI-generated content comes to be considered acceptable because it's "inevitable," it will destroy the quality of professionally published writing.
MineralMan (147,576 posts)
What do you think will happen?
Inevitability does not equal acceptability.
Neither of us is going to get to decide what is acceptable, quite frankly.
I used to be a $1-per-word writer. That ended when paper publications went away. And they went away when they couldn't find advertisers. For computer magazines, that happened when there were only a couple of brands of PCs being made and only a few major software companies.
I stopped writing for the failing computer magazines and started writing website content from start to finish for brand new business websites. I worked with a successful small website designer. I'm fast, good, and flexible. So, that worked out well. Now I'm retired. Almost everyone I knew in the magazine writing business is now retired. That market no longer exists in a way that can offer writers a career. Time changes things. That, too, is inevitable, but not acceptable.
Emrys (7,941 posts)
AI isn't cheap to run and involves intensive computing time.
If sheer cost is the issue, I'm pretty sure a publisher can find someone in some part of the world (India has long been exploited by publishers as a cheap place to outsource work) who can do it cheaper.
Whether they can do it better than AI is another matter altogether. I try really hard not to feel superior because I have so many advantages, having had a great education in English from a very young age, but some of the people I work with from India can barely write a coherent email or Comment in a Word file - they're usually very good at tasks like cross-checking references, though, and other technical matters, and those are the aspects that are most frequently outsourced to them.
Bernardo de La Paz (50,899 posts)
Prohibiting use of AI would be as effective as trying to stop magats from sharing Biden memes.
However, emulating an author and publishing under their name would also be a deception, and either copyright law, truth-in-advertising law, or new legislation will be needed to block it.
What if an author uses AI to write in their style and gives it a light once-over? Gonna happen if it hasn't already.