ChatGPT vs. your voice

In my job I write a lot of technical text, whether it is for reports or scientific papers. As many others, I have been experimenting with how ChatGPT or other AI tools can be of use for so-called "knowledge workers".

I have noticed two things in particular about the text produced by ChatGPT that annoy me:

  1. The generated text often contains a lot of unnecessary, flowery adjectives and adverbs. In my opinion, the language quickly starts to sound like you're marketing something, or you're just being pretentious. Examples: "meticulous(ly)", "comprehensive", "diverse", "intelligent(ly)", "precise", "crucial", "seamless(ly)".
  2. The generated text often contains unnecessarily complex synonyms for verbs. I have mainly experimented with ChatGPT for rewriting a piece of text to improve its flow, and very often the verbs I originally used are swapped with synonyms that in the worst case change the meaning significantly, or make no sense at all. Examples: "encompass", "elucidate", "encapsulate", "delve", "comprising", "boast", "fostering".

There's absolutely nothing wrong with these words in themselves, but the way they are overused by ChatGPT makes the text sound very unnatural.

I have a greater concern, however, about how ChatGPT modifies the text we use to communicate information. Using an LLM as a tool for expressing yourself means that it's no longer your voice, or, at best, a modified version of it.

LLMs are making their way everywhere, with ChatGPT being only the beginning of AI access for the masses. Apple recently revealed Apple Intelligence[1], their new AI platform that is "built into your iPhone, iPad, and Mac to help you write, express yourself, and get things done effortlessly." Microsoft, Google, and Meta have already integrated AI tools into their platforms, at least in some regions, but I see Apple Intelligence as the next big step in bringing AI into the daily lives of larger groups of people.

ChatGPT has attracted a lot of users with an explicit interest in AI tools, but platform integrations will also bring AI to those who do not necessarily want or need it. What consequences will this have? Apple claims that their new AI platform will "help you write" and "express yourself"[1], but how much are you really expressing yourself with an AI tool as the middleman?

LLMs have a myriad of use cases, some good and some bad. After a year and a half of observing and experimenting with the chatbot since its release, I think we should be careful about using it as an intermediate layer of personal communication between people. It is not a matter of whether AI will be "good enough" at mimicking our personal style. There is irreplaceable value in the fact that the sender expresses their own thoughts. Receiving AI-generated communication is completely different from receiving something the sender wrote on their own, as others have noted[2].

The landscape of textual communication is changing, and I'm not sure how well my thoughts on this will hold up five years from now. Maybe the expectations for digital communication will change significantly along with the development of AI tools, rendering these concerns outdated. But with so much communication already happening online instead of in the real world, I'm very concerned about how this will affect our interpersonal relationships. Of course, there's a huge difference between, say, an SMS and a technical report, and the demands for productivity in the workplace may make it impossible to avoid using AI tools to speed up communication.

However, I know that when I'm in conversation with another human, I prefer reading or hearing the actual words chosen by the other person, because there is a lot of information in those choices, too. I prefer hearing your voice, not the one suggested by ChatGPT.


  [1] Apple Intelligence

  [2] Neven Mrgan: How it feels to get an AI email from a friend