Is AI writing assistance ethical?

Glad I'm not a coder then, I guess.

I was mentioning this in another thread earlier, but the AI summary function of search engines appears to be obviating the websites it references. I noticed the other day that I haven't had to click on a website in what feels like months, because the summary function has answered everything I've needed. Granted, it was all stupid stuff that didn't need vetting or independent confirmation. But to use our jam as an example, let's say we're trying to attract new members who might, say, type "how do I get a literary agent?" How many read the summary and stop there, without clicking through to a website that might be providing the info in the first place? This might be a bad example because there's a lot more nuance and follow-up to that particular question, but you get the idea. When you get an answer like this, how many people keep going:

[attachment: screenshot of an AI search summary]
 
The other side of that is that most websites in the top results are garbage anyway. Instead of an involved forum thread you're slapped with Quora, or a site chock-full of affiliate links barely even pretending to be an enthusiast resource.

So one has to add "reddit" to the search terms to hopefully get honest opinions, but advertisers pretend to be users there, too.

Want to read an article about up-to-date self-publishing advice from a human? Nah, pal, best I can do is a bloated YouTube video, and no guarantee of a human there either.
 
In a way, you could say that AI is doing the same thing to the Internet that the Internet did to paper over the last 30 years. And it's sort of becoming a postmodern Internet world, in the sense that all of its elements are blending together. When you see a thing on your screen, does it matter if it's a forum or a website or an app or a YouTube channel or a streaming service? Now that all electronic devices can run the same applications, it doesn't feel like dividing things into buckets matters much anymore.

What is YouTube? A website (yes), an app (yes), a streaming service (yes)? Can I use it on my phone (yes), my tablet (yes), my computer (yes), my television (yes)?

Hell, is there even an Internet anymore? I used to have three sets of wires coming into my house for my phone, my television, and my Internet. Now it's all the same thing.

As a new owner of a website/forum, I find these things to be utterly fascinating. And I think I'm onto something with this postmodern way of looking at it, but I'm not sure what. Postmodern is probably the wrong thing to call it, but I can't think of anything better.
 
I was mentioning this in another thread earlier, but the AI summary function of search engines appears to be obviating the websites it references. I noticed the other day that I haven't had to click on a website in what feels like months, because the summary function has answered everything I've needed.

Yeah. The good news is, when I type in "writing forums", it doesn't yet come up with an AI summary, just a list of results. But that day can't be too far away.

I am making absolutely no suggestions about what, if anything, you need to do about that though.
 
My brother-in-law apparently uses AI to write all of his computer code. His take is that if he wants to still have a job in six months, he'd better get better at using AI than everybody else around him. His company found it was a million times easier to manage AI coders than human coders.

I think I might do an experiment: can Homer learn to code through AI? There has to be something on this forum I can apply coding to.
I don't actually use AI on a day-to-day basis for my job (I'm also a software engineer), but that's largely because our codebase is a 20-year-old behemoth too convoluted for the AI to interpret. Colleagues working on newer systems certainly are using it to generate generic code, like API endpoints that fetch data and return it in a standardised format.
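For anyone curious what "generic endpoint code in a standardised format" looks like in practice, here is a minimal sketch. It assumes nothing about a real stack: plain functions stand in for a web framework, and the names (`fetch_user`, `make_response`, `get_user_endpoint`) are illustrative, not from any actual codebase.

```python
# Hypothetical sketch of the kind of boilerplate an AI assistant tends to
# generate: fetch a record, then wrap it in a standardised response envelope.

def fetch_user(user_id, db):
    """Look up a user in a simple dict 'database' (stand-in for a real query)."""
    return db.get(user_id)

def make_response(data, error=None):
    """Wrap any result in a standardised {status, data, error} envelope."""
    if error is not None:
        return {"status": "error", "data": None, "error": error}
    return {"status": "ok", "data": data, "error": None}

def get_user_endpoint(user_id, db):
    """The 'endpoint': fetch the record and return it in the standard format."""
    user = fetch_user(user_id, db)
    if user is None:
        return make_response(None, error=f"user {user_id} not found")
    return make_response(user)
```

The point of the envelope is that every endpoint, success or failure, returns the same shape, which is exactly the sort of repetitive pattern people hand off to an AI.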

I don't feel like we are in danger of losing our jobs in the short term; long term, who can say. I could see junior/entry-level positions potentially disappearing across a number of industries, wherever something that requires minimal competence could easily be covered by a 5-minute prompt.

I've done quick one-off tasks, like generating a script to run server maintenance, much quicker using AI than having to work through it myself. But the risk of errors creeping in is still high, and if you don't know how to read and interpret the product you get back, you could create all sorts of issues down the line. The guys who do use it regularly have to tell it something is wrong, and it tries to self-correct, with varying success rates.
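As a sketch of that kind of one-off maintenance task (all names here are hypothetical, and the selection logic is kept as a pure function precisely so a human can check it before anything touches real files — which is the "read and interpret the product" step described above):

```python
# Illustrative one-off maintenance helper: find log files older than a
# cutoff so they can be archived or deleted. The function takes plain
# (name, mtime) pairs rather than touching the filesystem, so the logic
# can be verified in isolation.

import time

def stale_files(files, max_age_days, now=None):
    """Return the names of files whose mtime is older than max_age_days.

    files: iterable of (name, mtime_in_seconds) pairs.
    now:   override the current time (useful for testing).
    """
    now = time.time() if now is None else now
    cutoff = now - max_age_days * 86400  # 86400 seconds per day
    return [name for name, mtime in files if mtime < cutoff]
```

Keeping the decision logic separate from the `os.remove` call is exactly the kind of safeguard that matters when the first draft came from an AI.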

Interestingly, there was a report recently (unfortunately I cannot find the reference to cite) where they had experienced developers work with and without AI assistance and tested their productivity. When asked, the developers reported feeling more productive, but the statistics showed that tasks actually took something like 25% longer to complete.

Here is a scary article. It's a bit off-topic but in terms of AI's near future...

https://finance.yahoo.com/news/morgan-stanley-warns-ai-breakthrough-072000084.html
Before you get too scared, I would consider what Morgan Stanley's agenda is in publishing an article like this.
 
Yeah. The good news is, when I type in "writing forums", it doesn't yet come up with an AI summary, just a list of results. But that day can't be too far away.

I am making absolutely no suggestions about what, if anything, you need to do about that though.
I wouldn't say it's a problem. Just a thing that needs to be understood.

It takes a few days, but I just added Google Search Console to the Analytics, which will tell us exactly what people are searching (allegedly) to discover the forum. I'm curious how many people are searching for a variation of "writing forum(s)" as opposed to a specific writing question that a writing forum can answer. Could be many, or could be few. I have no idea.

But the risk of errors creeping in is still high, and if you don't know how to read and interpret the product you get back, you could create all sorts of issues down the line. The guys who do use it regularly have to tell it something is wrong, and it tries to self-correct, with varying success rates.
Yeah, my brother-in-law says the same thing. He still needs to check it, but it's much faster and doesn't come into work hungover.
 
I think I might do an experiment: can Homer learn to code through AI? There has to be something on this forum I can apply coding to.
I’ve also been “vibe coding” recently. The parallels with the “author” Trish mentioned, who publishes an AI-produced book each day, are inescapable. That individual is an extreme example of a “vibe author”. It emphasises ends over means – getting the job done as quickly as possible because “time is money”. I suppose one outcome, given our trajectory of ever-increasing “productivity”, is that such extremes become normalised over time, giving rise to even more extreme extremes.

On a related note, I saw this earlier and thought of y'all: Race on to establish globally recognised 'AI-free' logo (BBC article).
 
So I prompted ChatGPT to complete this metaphor: "Giving up word choice to an AI is like"

And it responded –
It's interesting to see the options it generated. They are all heavily flattering towards the AI (skilled assistant, mirror, echo, etc.). Normally this would be considered an interesting authorial choice; however, robots not having a true choice just makes it kind of sad.
 