Is AI writing assistance ethical?

It's not just spelling. For example, in the UK, we say "in hospital". Apparently, Americans prefer "in the hospital". I didn't know that, and I have no desire to learn fluent Yankee just to do it.
And Brits don't know that a large portion of the US would be highly offended to be called a "yankee" lmao.

I will stop now, before someone starts getting annoyed.
Honestly. And it's so silly.
 
Thanks for all your replies so far. Our red lines seem to vary. I do wonder how many editors etc. use AI for quickness and efficiency. Spellcheckers might have been seen as AI back in the day; so might the software (computers) we use to write our stories. Where do you all draw the line?
 
Where do you all draw the line?
I think the field needs to be established first. What kind of AI? The generative AI that writes for you is a zillion miles from the editing and beta-reading AI many of the above are discussing. And those are a million miles away from the killer AI robots. The term "AI" is a marketing buzzword, the likes of which we haven't seen since "digital" entered the consumer's lexicon. It needs several steps of qualification to be discussed intelligently.

I do wonder how many editors etc. use AI for quickness and efficiency.
See my above remark. What kind of AI, and how is it being used? I'm sure there are plenty of editors who will take your money and plug your manuscript into an AI model that you probably could have used yourself for free. Then there are plenty who use it as an assist tool, because, as others have pointed out, more clients, more output. Then there are plenty who won't touch it at all, but I imagine that population is shrinking. I'm sure you'll have your big-time, best-selling, legendary authors who will be fine, but EOTD there hasn't been a massive technological advance in human history that the "keep it real" crowd has survived, as far as I know. It's just not economically or temporally possible.

And I'd ask what you mean by "ethical" in the overall sense. It isn't a right/wrong sort of thing.
 
And I'd ask what you mean by "ethical" in the overall sense. It isn't a right/wrong sort of thing.
It's just that I hear 'ethical' being used for this kind of thing. Hearing that word triggered my thought process in the first place.
 
It's just that I hear 'ethical' being used for this kind of thing. Hearing that word triggered my thought process in the first place.
Exactly. And that's another warning sign, as the term (another trigger word, this time from the psychological realm) presupposes a set of legal/licensing guidelines that professionals are obligated to follow, like doctors, lawyers, financial advisors, and the like. No such thing exists in the writing/publishing world that I'm aware of, beyond the standard business practices of any industry. And a lot of the vitriol and righteous indignation you'll find on both sides of the debate is a result of the more ideological crowd believing such a thing existed in the first place. There's a lot of One True God pervading the whole thing.

Misrepresenting yourself as a person or company that does not use AI but in fact does would certainly be unethical, but there's no moral imperative filtering the debate one way or another.

You can say AI is unethical in that many LLMs used copyrighted works without permission to train themselves, but that's irrelevant to this part of the debate. That's like saying the medical benefits or appeal of a certain drug are negated because it was unethically tested on animals or humans.

You can sort of say it's unethical in that it will take a bunch of human jobs, but the lever and inclined plane have blood on their hands too in that department.
 
Doesn't society at large decide what is ethical and what is not? A majority of us have agreed that murder is wrong. In the same vein, if a majority of people decide that heavy AI (editing assistance) usage is wrong, then I guess that is what the majority thinks and wants to enforce against. Though majorities of people can sometimes be wrong...

If you publish your book, having used a lot of Gen AI for its creation, and it turns out a majority of the population has been laid off by various AIs and an AI boycott is in effect, you'd be doing yourself a disservice.

However, if AI cures cancer and is the holy grail to human advancement and everyone changes their opinion on it to favour it for everything, you'd be doing yourself a disservice not using it.

The issue is that no one can predict how it will all turn out. We can make guesses, maybe even educated guesses, but then something completely out of the blue could happen that turns the tech 180 degrees.
 
The ethics of it, in the end, is: is it your work?

If yes, you have the right to put your name on it. Is it co-written by ChatGPT? Then ethically, the author is 'Joe Schmo and ChatGPT'. Maybe the next James Patterson best seller will be 'James Patterson in collaboration with ChatGPT!'. If you give it a prompt and the AI writes a story, then you change a few lines and put your name on it... not ethical. If the work were edited by a human, the editor would be given credit, so maybe 'Edited by ChatGPT'?

Last week I had ChatGPT critique a story I've been working on. It was the first time I've done that, and it was surprisingly useful. It hit things I knew needed work and brought up other points that I both agree and disagree with. I specifically told it not to rewrite anything or give any examples, because I don't want to be influenced that way. I'll use it going forward as a teacher, but I want the work to be my own.

I don't have any ethical concerns using grammar, punctuation, and spell checkers or using the thesaurus add-in; will future writers feel the same way about generated prose? In the past those basic skills were a necessary part of writing, and a purist today might not agree with my ethics in using tools so I don't have to learn them (again).
 
I couldn’t imagine letting an AI choose words for me, or judge the emotional tone I am aiming for. It’s my story, and I am the best judge of every part of it, not some machine. It’s not a question of ethics. It’s a question of artistry.

So – I prompted ChatGPT to complete this metaphor - Giving up word choice to an AI is like

And it responded –

…handing your paintbrush to a very skilled assistant who has never seen the landscape you’re trying to remember.
…letting someone else choose the spices for a meal that only you know the taste of.
…outsourcing your fingerprints.
…sending a ghost to sign your name.
…asking an echo to decide what you meant to shout.
…playing a violin while someone else chooses which notes your fingers land on.
…borrowing a voice that fits perfectly but remembers none of your wounds.
…letting a mirror write your diary.

So, AI can do it. But then it wouldn’t be mine.
 
The ethics of it, in the end, is: is it your work?
This seems to be the question that everyone is circling. Ultimately, I don't feel this is an accurate or complete representation of the ethical dilemma.

Forget, for a moment of illustration, how AI currently functions.

Imagine, as a computer programmer, I create a program that can write, or draw, or whatever. In order for it to achieve a result, good, bad or otherwise, I have had to program in the instructions on how to do that: the word choices, or a methodology for choosing from a known dictionary; the placement of a line or colour. Everything the program knows how to do follows directly from an instruction that I gave it. So, would you say that the result is a product of my own work?

Consider human assistants. Michelangelo did not paint the Sistine Chapel by himself; he had a dozen assistants who performed perfunctory tasks but also contributed to the painting. So, which parts or what percentage of the resulting artwork would you consider to be Michelangelo's?

Imagine a human assistant who had studied under Tolstoy, Tolkien, Dickens, Austen, Woolf, anyone who you consider to be a literary giant, everyone. Imagine this assistant came to your home and offered to help proofread, edit, and make suggestions on your text. You would, I expect, evaluate the suggestions with the gravitas due that assistant's experience. Assuming you adopted some of their suggestions into your work, does that make it less yours? At what point or percentage does the work become not yours?

Now, let's return to our AI assistant. At what point or percentage of using an automated assistant to edit, make suggestions, or even generate portions of text - which presumably will be reviewed and accepted by the author - does the work become no longer the author's?

Where I believe ethics may creep in - or at least some morally grey area - is in how the AI obtains its knowledge. It is not the case that the author here programmed in specific language instructions, so the results are not entirely of his own creative efforts. And, unlike our human assistant, the AI was not voluntarily given its expert knowledge by the massive number of creatives it has consumed. Was the scraping and training of AI unethical? Is its redistribution of this knowledge unethical? (These questions may well be for a different topic.)

Indulge me in another hypothetical. Imagine a human assistant, but this time one who didn't study under the artists. Maybe they went to the library for years on end, reading and consuming all the books and art they could get their hands on. Maybe they sat in the Sistine Chapel and watched Michelangelo apply plaster to the walls, and pigments to the plaster. Maybe they trained themselves without anyone ever noticing. Granted, the lifetime a human would need to match the volume of data consumed by an AI is probably unrealistic and unobtainable. But, for the hypothesis, if this assistant came to your door and offered their services, would it be unethical to accept?
 
Ahh the advance of technology dragging civilization along behind it like an old, worn-out comfort blanket.

Who can forget the social unrest caused by the invention of gas lighting, and then electricity and the lightbulb? Our very human sleeping patterns have never been the same.

I think about the horse sometimes. The world standard in land transportation for thousands of years, until the invention of the automobile. A large swath of civilized society was up in arms about these noisy, dangerous new contraptions. Now the horse is relegated to race tracks and 'horse people'.

Man has yearned for flight since he witnessed the first bird take to wing and explore the skies more freely than any human could ever wish to... until the airplane came along. Now we cry about crowding and sickness and TSA agents, metal detectors and strip searches... but we still board the plane.

I have misgivings about bearing witness to the birth of the internet. Now it seems I am one of those who throw their arms to the skies and cry out for justice against this intrusion of technology into our ignorant society.

"Smart" phones...heads down, brainless and getting dumber with each generation.

Should I wail ineffectively against the rise of AI as well?

To what end...
 
I don't think automobiles and TikTok are the same thing.

Though, let it be known, I don't think fiction writers and dock workers are the same thing either; I'm not concerned about AI stealing the jobs of creatives, at least fiction authors. I suppose visual creators and multimedia creators have already lost work given the AI ads we're seeing.

I also don't think spell check is the same thing as an LLM. Nor are paid assistants. Nor are human brains.
Exactly. And that's another warning sign, as the term (another trigger word, this time from the psychological realm) presupposes a set of legal/licensing guidelines that professionals are obligated to follow, like doctors, lawyers, financial advisors, and the like. No such thing exists in the writing/publishing world that I'm aware of, beyond the standard business practices of any industry.
You can say AI is unethical in that many LLMs used copyrighted works without permission to train themselves, but that's irrelevant to this part of the debate. That's like saying the medical benefits or appeal of a certain drug are negated because it was unethically tested on animals or humans.
Well, not to put too fine a point on it, but I can appreciate the link being drawn. The starting point, even legal guideline, to use your terms, is: "It is unethical to take someone's work and publish it as your own." That's setting aside the environment, etc.

Machine gun nests and trenches are in place around the question of where data harvesting to feed an algorithm fits into that guideline. It's obviously not the same as copying and pasting a passage from a Stephen King book—well, usually; there are exceptions. However, it's not the same as work emerging from an inspired, well-read mind either. It's somewhere in the middle. We do not agree on where in the middle.

For your example of medication, it would fit if producing the medication continuously required the unethical use of animals or humans. There might be some hesitation to use semaglutide if it required puppy spinal fluid... oh, who am I kidding, it wouldn't matter.
 