Things AI can do (but there aren't many)

Using AI to research is no different (and may be better) than using Google

Using AI to generate prose is anathema to the artistic pursuit
 
There are a number of ways to use AI to support writing which don't involve generating prose or plot.

But no one has to use it if they don't want to.
 
You know how ChatGPT often gives you two different responses to choose from? It has, on more than one occasion, given me two different responses which were diametrically opposed to each other. So AI summaries need to be taken with doses of sodium chloride. More often than not, I use it to confirm or clarify details of subjects I am already familiar with, but if I'm not, I'll always confirm what it says separately.
 
Largely by AI recently

You do have to use your common sense while researching anything

I do find the AI summariser on Google useful, but I have found a number of occasions where it quotes stuff from fiction as though it were fact.

The internet has been 90% pap almost forever!
AI is just increasing the volume of pap!

I have found Copilot good for cutting down search times recently, sometimes to check a small fact fast, but mostly to point at resources.
The advantage it has over Google, for me, is that it will usually make sense of vague, half-baked questions.

Getting the right material out of Google often means getting a search term exactly right, and even when you do, it will still give you Reddit and ads before anything else!
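
Copilot itself is a chat product rather than something you script, but for the curious, the same "interpret my half-baked question and point me at resources" pattern can be reproduced against a chat-model API. This is a minimal sketch only, assuming the OpenAI Python SDK with an API key in the environment; the model name and prompts are placeholders of my own, not anything Copilot actually exposes:

from openai import OpenAI

# Sketch of the "make sense of a vague question" pattern.
# Assumes OPENAI_API_KEY is set; the model name is a placeholder.
client = OpenAI()

vague_question = "that medieval ship steering thing before rudders, what's it called?"

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumption: any chat-capable model will do
    messages=[
        {
            "role": "system",
            "content": (
                "Interpret vague, half-baked research questions. Restate "
                "the question properly, answer briefly, then list two or "
                "three reputable sources the asker can verify against."
            ),
        },
        {"role": "user", "content": vague_question},
    ],
)

print(response.choices[0].message.content)

The system prompt just encodes the discipline everyone upthread recommends: treat the answer as a pointer to sources, not as the source itself.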
 
I moved this into the Writing Tech forum, but I had to change the title. I would have done it sooner, but I didn't see it until Nao requested it be moved, so apologies for the delay. I'm not going to do this often, but I couldn't support anything that said "How to use AI to help your writing." The post itself and the arguments are fine, but the former title is a lighthouse for spammers and others who wholeheartedly support generative AI, and I don't want the wrong people to find it.

I'll entertain an alternate title @Naomasa298, but I couldn't let the last one stand. Again, apologies for that.
 
Hmm. When doing research, I have to ask: how reliable is Wikipedia? What about Quora? Or Reddit?

And before you jump on me for citing Reddit as a source, of all things: I'm thinking of Reddit groups that specialise in your chosen subject (like r/historians for history - it's staffed by actual, knowledgeable, and friendly historians). :)

Wikipedia has a reputation for not being 100% accurate, but I never found it to be too bad.

As for Quora ... again, I know I shouldn't use it verbatim (and I don't). But I use it for researching historical weapons (especially their effects), and historical medicine (i.e. what people used for medicine way back then, whatever "way back then" means). ;)

I haven't used AI for anything serious. I always thought the only things AI can do are these:

1. Sell shoes (badly)
2. Drink beer
3. Try to fix the TV (and fall off the roof)
4. Work on his Dodge
5. Sit on the sofa with his hand down his trousers.

(Note I didn't say "pants" -- I'm aware that in the US, that means ... something else ... than it does in the British Commonwealth). ;)
 
I saw my first driverless taxis this week. Chilling. Sometimes when you stare into the abyss, the abyss stares back with unblinking electronic eyes.

The Plottr folks have an AI offering that I think is fairly new. You upload a book, or a series of books, and it spits out a Plottr outline/timeline file. Your work is protected by the terms of service from being used to train the AI engine.

I can see how that provides a service without hijacking creativity and ultimately imprisoning mankind in Fahrenheit 451's Families. Other applications of AI already do that, anyway, so there's no real need.

The magic comes at a price: $99 per book, with a discount for multiple books.
 
Hmm. This struck me as interesting: ChatGPT vs. Gemini: Which AI is better?

Lastly:

Using AI to research is no different (and may be better) than using Google

Using AI to generate prose is anathema to the artistic pursuit

I am not so sure. I tried, for instance, using ChatGPT to ask various rather abstruse questions about historical research (for instance, "How was a Mongol ger-tereg heated?" and "What did the Icelandic Norse use for heating?"), and it gave me answers that were more complete than anything I could find on Google. So I was impressed.

As for your second point ... yes, I am forced to agree. I am currently writing an intensively researched story about life in medieval Iceland, and I asked ChatGPT: "Write me a story about a medieval Icelandic Norse girl who wants to become a Viking". The result was, um ... simplistic, full of clichés, and far too easy; something that a child might appreciate, but not an adult.

So, OK. ChatGPT looks like it's good for some things, but not for others. I would not use it to write my own stories (and besides, what would be the point?), but I see no reason not to use it for research, provided that I can back up its answers with results from other, more reputable sites.

What's your view?
 
We use it at work every day (marketing agency). While we never take anything it spits out verbatim, whether a blog, email, or even a subhead, it does cut down significantly on brainstorming. Not the actual thinking, but just the notes, if you will. GPT or Gemini can create thought starters and ideas in a fraction of the time it would take us to whiteboard it or write it down. It's a good starting point and research-assistance tool.
 
Same with my marketing people at work. Every press release or social media post. It's not even noteworthy anymore. I stopped complaining about it a year ago. It is what it is.
 
One thing it's better at doing (certainly better than Google Translate) is translating vernacular. It's not so good at doing long passages, but if you need a natural-sounding way of translating a single sentence, ChatGPT can do it, while Google Translate will give you a literal translation.

So when I needed a typically German equivalent of an English saying, ChatGPT could give it to me easily; it would have taken much longer to find with a search. It's also useful for checking period-appropriate language. It knows that two Elizabethan bishops talking to each other will use "you", not "thou", or that a private in the 1900 Japanese army is called a "nitousotsu" rather than a "nitouhei".
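
For anyone who would rather do this in bulk than in the chat window, the same request works over an API. Another minimal sketch, again assuming the OpenAI Python SDK and a placeholder model name; the saying is just an example:

from openai import OpenAI

# Sketch: asking for the idiomatic equivalent rather than the literal
# translation Google Translate would give. SDK and model name are
# assumptions on my part.
client = OpenAI()

saying = "The early bird catches the worm."

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {
            "role": "user",
            "content": (
                f"What is the natural German equivalent of the English "
                f"saying '{saying}'? Give the idiom a German speaker "
                f"would actually use, not a word-for-word translation, "
                f"and note any difference in nuance."
            ),
        }
    ],
)

print(response.choices[0].message.content)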
 
Same with my marketing people at work. Every press release or social media post. It's not even noteworthy anymore. I stopped complaining about it a year ago. It is what it is.
Yeah, every content writer in my company (myself included) was hesitant at first. We all shunned it, in fact. But our design team was really the first to dive in, and they started making some great conceptuals. Not any final deliverables for clients, but some fleshed-out ideas. Slowly, we adapted. Now it's just another tool in the box.
 
I consult Perplexity more often than my dictionary. Sometimes I miss Word's coolest feature of displaying a slew of kissing cousins to the word where the caret sits, but I just can't feed those LLMs a reasonable prompt for that.

Another thing: I look out for the times when an LLM hallucinates and makes things up. I try to think it through and pretend there is a justification behind it. Then I take that justification to its logical end and use it either as a story concept or as a prompt whenever I get stuck.
 