I once observed:
I still have stuff I wrote fifty years ago.
It’s not any good, mind you, but I have it.
Now, the web faces an apparently ceaseless deluge of content that’s generated by various forms of what passes for artificial intelligence. Thus, it seems a good time to note that although the stuff on this site also may not be any good, it always has been and will be written by a human, namely me.
It seems I can’t spend much time online these days without seeing numerous stories about how AI-generated content:
- Is becoming part of major search engines.
- Is putting human writers out of work.
- Has long been involved with basic journalistic writing at surprisingly high levels of distribution.
- Constitutes a huge majority of recent submissions to many content-seeking websites and other entities, some of which are trying to ban such submissions altogether.1
We are yet in the very early days of this cyber-torrent, and it will only grow. Soon, it may be impossible to know how much of what we’re reading — much less hearing, as with some of the already artificial-sounding content on many tech-oriented YouTube videos I’ve encountered lately2 — still comes from humans rather than AI.
And don’t be fooled: a human’s name in the byline is already pretty meaningless and, in fact, probably has been ever since the invention of pseudonyms.
“Pen names” have long had many purposes, among them: protecting authors’ private lives; allowing famous authors to try different genres without confusing or riling existing fans; and, back when female authors often weren’t taken as seriously as their male counterparts, enabling the former to go under the radar with male-sounding names (e.g., “George Eliot” for Mary Anne Evans).
Now, we can likely add one more purpose for pseudonyms: making us think that AI-generated content was, in fact, human-created.
Perhaps governing groups will try to enforce more honesty about such matters, but it’s likely that a large portion of the web won’t care and will keep promoting AI-generated text under human-like bylines. And I’m not even talking about outright fraudsters: I’d guess many seemingly trustworthy businesses will do this, especially in their marketing efforts, if for no other reason than to avoid paying human writers. (I further suspect I am being rather naive in positing this as a future activity, rather than something that’s already been going on for a while.)
Anyway . . .
I swear to you that this isn’t a case of Old Man Yells at Cloud, or a “Skynet is coming” screed, or anything of that sort. Indeed: over time, society will reap numerous benefits from ever-smarter forms of AI, and we’ll just have to hope that those outweigh the risks, as we must have such hopes about so many other aspects of human civilization’s inevitable changes.
All I really want to make clear is that — while it’s probably unavoidable that my research for these posts will include (and perhaps already has included) sources that aren’t human-generated3 — the content you read here that’s presented as being mine is always going to be really from me. While my words are of no great value in the grand scheme of things, they are and will continue to be 100% human-generated. On that, you have the word of this living, breathing, non-cybernetic being.
I say “trying to” because, as AI-generated content gets better and thus less easily identified as artificial, those entities will have to depend on the honesty of the humans who submit it as their own. Good luck with that. ↩︎
Of course, many videos’ audio tracks have artificial voices. In some cases, that’s because content creators who aren’t native English speakers can write their scripts in English but don’t want to read them aloud with their own heavy accents, perhaps for fear of repelling certain individuals who unfortunately reject those with such accents. However, this post is about artificial generation, not voicing, of textual content — the former of which also seems to be increasing on YouTube and other video platforms. ↩︎
That will also have to cover others’ code I mention, since I wouldn’t necessarily know whether it’s human- or AI-generated, especially in view of the widespread use of tools like GitHub Copilot. ↩︎