AI in writing software

GlitterRain

New Member
It’s kind of crazy that I even have to ask this, but are you guys worrying about and/or implementing any protective measures for your work against AI within writing software?

This worries me, and I know there are others out there who are worried. I spend so much time on my writing and it means so much to me, so I don’t want AI to steal it (as I’m sure we all feel). I’ve done some basic research on the issue, but without technical knowledge of how AI works or how it is implemented into the software at issue, it’s kind of hard to understand the info out there, on top of the fact that there is not a whole lot on the subject in the first place.

Honestly, I’d like to just print out all my books and work old school from now on with notebooks and pens, but that would definitely be slower and I’d have an inky hand all the time because I’m left-handed. Plus, if my belongings were ever caught in a natural disaster, my work wouldn’t be safe unless I was there and able to grab it.

Anyone have any insights, either in general or about specific software, especially the big ones (Word, Pages, Scrivener, Google, LibreOffice, etc.)? Additional related worries? Solutions?
 
but are you guys worrying about and/or implementing any protective measures for your work against AI within writing software?

No, and no. If an AI learns from my work, then OK. It's not going to plagiarise it directly, but it might learn to write "in my style", whatever that is. Aw hell, if someone produces a work inspired by me, I should be flattered that they think I'm worth copying.

I mean, every plot I've ever written has been done, in broad terms, by someone, somewhere.

Hell, I had someone on the OG lift a phrase I had written directly, without attribution or even a thanks (I wouldn't have objected if they'd just shown some common courtesy and asked), and that pissed me off more than an AI doing it.

I look at it like this - what will they *actually* be stealing off me?
 
I'm not sure why you'd jump right from using dedicated or cloud writing software to longhand. That's a bit extreme. There are lots of in-between options along the way. If Zuck Musk buys the Internet you can always smash out novels on an offline word processor and sell the manuscripts in a back alley.

Besides, it doesn't really matter where you write it if you're publishing it online anyways. Amazon probably has the largest contemporary library of literature, handily already digitized. I don't think they're going to ask us peasants for our permission before selling it to train an LLM geared for fiction writing.

I've heard that Google Docs only scrapes public docs for AI training, but I also found this in their TOS, which seems to indicate they don't, for now. It also says "to train our Document AI models" but doesn't explicitly say they're not selling the data or using it for something else either.

Does Google use customer data to improve models?

No. Google does not use any of your content (such as documents and predictions) for any purpose except to provide you with the Document AI service. See section 17 of the Google Cloud Terms of Service.

At Google Cloud, we never use customer data to train our Document AI models.

For more information, see the Transparency & data protection page.

In the future, will Google share the document I send to Document AI?

We won't make the document that you send available to the public or share it with anyone else, except as necessary to provide the Document AI service. For example, sometimes we may need to use a third-party vendor to help us provide some aspect of our services, such as storage or transmission of data. Our vendors are under appropriate security and confidentiality contractual obligations. We don't share documents you send with other parties or make them public for any other purpose.

Regardless, I think the scraping ship has sailed. I've already accepted it. Is it stealing? Well it's publishing/reproducing your work in a way you didn't authorize, so in my opinion yes. Is it personally, specifically going to affect you in any noticeable way beyond principle? I can't see how.
 
Well it's publishing/reproducing your work in a way you didn't authorize

It's using your work in a particular way, but is that any different to what a person might do with a copy of your work they'd borrowed from a library?

Fair enough if your work is unpublished, but once it's published, you don't have control over how people consume your work.
 
It's using your work in a particular way, but is that any different to what a person might do with a copy of your work they'd borrowed from a library?
It's categorically different. LLMs don't consume work the way a human does; they literally tokenize it and weight it. An LLM is not a human reader (which is the implicit audience one publishes for).
Fair enough if your work is unpublished, but once it's published, you don't have control over how people consume your work.
Reading a book and feeding a book to a model are not the same thing. When your work is scraped by an LLM, it has been republished to an AI without your consent (barring a sneaky ToS update).
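
To make the tokenizing point a bit more concrete, here's a toy sketch in Python (not any real LLM tokenizer, which would use a learned subword scheme like byte-pair encoding, but the principle holds): before a model ever "sees" your prose, it has been converted into a list of integers, and those integers only matter as indices into weight matrices.

```python
# Toy illustration only: real tokenizers use learned subword vocabularies,
# but the upshot is the same - text in, integer IDs out, no "reading" involved.

def build_vocab(corpus: str) -> dict:
    """Assign an arbitrary integer ID to each unique word in the corpus."""
    vocab = {}
    for word in corpus.lower().split():
        if word not in vocab:
            vocab[word] = len(vocab)
    return vocab

def tokenize(text: str, vocab: dict) -> list:
    """Replace each word with its ID; words the vocab has never seen map to -1."""
    return [vocab.get(word, -1) for word in text.lower().split()]

if __name__ == "__main__":
    sample = "the rain fell softly on the quiet town"
    vocab = build_vocab(sample)
    print(tokenize("the quiet rain", vocab))  # -> [0, 5, 1]
```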
 
It's categorically different. LLMs don't consume work the way a human does; they literally tokenize it and weight it. An LLM is not a human reader (which is the implicit audience one publishes for).

I don't see the difference, sorry. You don't publish a book and specify "it must be consumed in such and such a manner". The fact that it is tokenised means nothing, as far as I'm concerned. Ultimately, its end purpose is the same - human consumption, in one form or another.

My opinion, which no doubt people will disagree with.

However, I don't support or condone producing creative work with AI.
 
I think it remains very difficult, in this current era of cloud everything, to keep your data out of corporate reach. What they do with it today may be different from what they do with it next year, but either way they already have it, to do with as they please.
Word, Pages, Scrivener, Google, LibreOffice
From this list I'd say your safest bet is LibreOffice, but even there, how long before they introduce AI assistance...
You don't publish a book and specify "it must be consumed in such and such a manner"
I agree with this sentiment. Anything you put out to the public, even on forums like this one, anybody can see and do whatever they like with it. If they bought a hard copy and typed it into their own computer and tried to pass it off as theirs, it would be poor conduct, but conceivably they could try.

If AI learns to write like me in terms of style, that's not an issue. From my experience it can sort of, kind of, almost get close. But what it can't do is generate the same ideas and concepts that I do. Yet, at least.
 
If they bought a hard copy and typed it into their own computer and tried to pass it off as theirs, it would be poor conduct, but conceivably they could try.
I'm pretty sure that's copyright infringement.

I find that arguments in favour of AI tend to overlap with notions against copyright protection. See also: "Every story has already been written" which seems surprisingly common in AI discussions.
 
From this list I'd say your safest bet is LibreOffice, but even there, how long before they introduce AI assistance...
I'm using Scrivener, and if there's some form of AI in it other than spelling I'm unaware of it. It hasn't made a single suggestion otherwise. It's less invasive than even Word. Granted, I've only been using it for a couple of weeks, but there's nothing in it I find concerning.

I agree with this sentiment. Anything you put out to the public, even on forums like this one, anybody can see and do whatever they like with it. If they bought a hard copy and typed it into their own computer and tried to pass it off as theirs, it would be poor conduct, but conceivably they could try.
I'm sorry, what? Could they try? Of course. I can also try to rob a bank but me having an account there doesn't make it any less illegal. And no, they do not get to reproduce it, whether they want to or not. Could they get away with it? Sure. If they're caught is "But I bought it" going to be a legitimate defense? No. Not unless they bought a license to reproduce it.
 
See also: "Every story has already been written" which seems surprisingly common in AI discussions.

In broad terms, it's also true.

What about narrow terms?

It depends on how you define a story or, more precisely, how you define telling a story. If a story is a bunch of shit that happens, then yeah, AI can certainly do that. If telling a story is recounting that bunch of shit that happened, it may as well be done by AI. Do it up in bullet points and get to the end as quickly as possible. I do know that many of us on the forum approach story-telling in different ways, but I don't think anyone aspires to that.

The story, whether for pure entertainment or other form of stimulation, is created by the author according to a range of details personal only to them. That's what makes it worth reading and that's what makes it different to every other story ever told.

My attitude to AI is something I've expressed before on the old site. It's not very complicated, and not likely to be changed by contrary argument. AI is the culmination of the technological impetus that separates each of us from other people and now separates us from ourselves. It's happening, probably no escaping it, but in my little world of creative pursuits it has no place. I plan to keep it that way.

I know other people feel differently and good luck to them. I no more want to change their minds than expect them to change mine.
 
What about narrow terms?

It can be done in narrow terms as well. For example, there are any number of knockoffs of The Colour Out of Space (e.g. The Tommyknockers, by Stephen King), and Something Wicked This Way Comes (e.g. Needful Things, by Stephen King). Then you have parodies and reimaginings, which may have identical plots (but see below).

But as we say in the workshop, it's the execution that counts. It's not so much what is told but how it's told. That's why you shouldn't use AI to generate your writing. Its execution is generally poor, at the moment.

One note, though: you don't have to reproduce someone else's text word for word to commit plagiarism. If it's close enough, it can still (AFAIK) legally count as copyright violation.
 
I'm pretty sure that's copyright infringement.

Could they try? Of course. I can also try to rob a bank but me having an account there doesn't make it any less illegal.

I'm sorry, I clearly did not articulate my point well enough and have been misunderstood. Yes it is copyright infringement and illegal. What I was suggesting is that, if anyone is determined enough, you cannot prevent them from copying (stealing) any work available to them.

This relates to AI similarly: once you send your data to their servers, you cannot control what they do with it. Now, if GPT were to spit out a complete word-for-word replica, that would be a different situation, likely covered by copyright law. But that is not typically how these things work. The fact that it can replicate an author's style or voice is, I think, still a grey area, more so because it's still not highly accurate at doing it.

I'm using Scrivener, and if there's some form of AI in it other than spelling I'm unaware of it
Good to know; I can't speak on this first-hand. I know that a lot of writing apps, even Word, now have AI assistants, which necessarily must send your work to their cloud for processing. Call me cynical, but what we have seen is companies storing this data until they come up with a way to benefit from it. Whether they inform users depends entirely on how ethical the company is, and usually amounts to a mere license agreement change, as someone mentioned.
 
In broad terms, it's also true.
But we don't value stories that way, otherwise we'd get just as much pleasure reading a summary. Rigor Mortis said it better than me in his post above, though.

It can be done in narrow terms as well.
But as we say in the workshop, it's the execution that counts. It's not so much what is told but how it's told.
How it's told is also being harvested.
 
How it's told is also being harvested.

Not in a way that can replicate you. The how is about more than the words you use.

An AI may be able to write a sentence like you. It can't write a story like you. At least, not yet.

Anyway, I'm out of this thread. I'm not going to change your mind or vice versa, and arguing it further will just cause conflict, so best for me just to leave it there.
 