Crime — When AI impersonates you and steals your livelihood

A news report from a few days ago caught my eye. It was about a US writer named Jane Friedman. She has written or co-written 10 books that are available on Amazon, mainly about the business of publishing and writing.
A reader emailed her with a comment about her new book. Only she didn’t have a new book. So she went to Amazon and found two books by a writer also called Jane Friedman. About the business of publishing. With her photo. Written in a style very similar to hers. But they weren’t hers.
They had been written by an AI. She was being impersonated, her identity stolen along with her unique style and chosen subject, in order to publish on Amazon and elsewhere. When she looked at Goodreads (the Amazon-owned book review and cataloguing site) she found not two but five books under her name, all written by an AI. The AI was raking in money for someone based on her writing reputation. Friedman immediately alerted Amazon and the other outlets that listed the books but was met with resistance. Amazon at first refused her request to remove the titles from its website, insisting that she provide a trademark registration number associated with her name. It was only when she went public on Twitter that the books were taken down.
But there is more to say about the incident, because this modus operandi, I suspect, is going to become very popular very soon. I will also tell you how easy it would be to prevent this sort of fraud. Permanently. Some smart reader is free to make it happen and become a billionaire. I promise not to sue.
But first…
Among the many early impressive feats of AI and Large Language Model architectures is their ability to mimic. Write a speech in the style of Kennedy or Mandela or Churchill and you can now have it delivered in their voices, perfectly replicated, with their unique timbre, inflections, lilts and pauses and all. What you are doing is asking the AI to weight its predictions towards your chosen person’s grammar, narrative style, vocabulary, sentence structure, semantic idiosyncrasies and the like.
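To make that concrete, here is a minimal sketch of what such a style-mimicry prompt looks like in code, assuming the openai Python package (version 1 or later); the model name and prompt wording are illustrative placeholders of mine, not anything from the Friedman case.

```python
# Minimal sketch of a style-mimicry prompt, assuming the openai Python
# package (v1+). The model name and prompt wording are illustrative only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {
            "role": "system",
            "content": (
                "Write in the style of John F. Kennedy: mirror his grammar, "
                "narrative style, vocabulary, sentence structure and "
                "rhetorical idiosyncrasies."
            ),
        },
        {
            "role": "user",
            "content": "Write a short speech about exploring space.",
        },
    ],
)
print(response.choices[0].message.content)
```

The instruction in the system message is doing exactly what the paragraph above describes: biasing the model’s word-by-word predictions towards one person’s characteristic patterns.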
The results are startlingly good. More than that, they are sometimes a little creepy in the precision of their mimicry, even though this technology has barely begun to mature. And consider that some AIs can write a book-length manuscript in minutes.
So, imagine you are what the literature likes to call a ‘bad actor’, an amoral grifter looking to steal. A natural route might now be to look at people who make money from their writing but who are not yet celebrities (a scam against a famous name would be uncovered too fast). Then ask an AI to build a reasonable facsimile of a book somewhat related to the author’s usual subject area (I suspect non-fiction would be easier), add some frills like a cover and some fake reviews, and list it on Amazon. Of course, the faux titles will likely be unmasked eventually, but by that time you will have closed your illegal bank account and moved on.
Worse still, this can be done at scale and with no coding skills at all. It would be simple to run the same play against ten thousand journeyman authors simultaneously, and you could bag a big windfall before being shut down. Then you could repeat the whole thing the following week with another ten thousand authors.
How much could you make? Let’s assume you sell only 20 copies per author before being caught. Times 10,000. Do the math. There are one million authors on Amazon — a big pool in which to fish. And I am not even considering photos and music and scripts and the rest of the creative content available online where similar heists are waiting their turn.
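For the sceptical, here is the back-of-the-envelope arithmetic as a short sketch; the per-copy price and royalty rate are my own illustrative assumptions, not figures from the article.

```python
# Back-of-the-envelope payout estimate. Price and royalty are assumptions.
copies_per_author = 20       # sales before the fake title is caught
authors_targeted = 10_000    # one batch of journeyman authors
price_per_copy = 9.99        # assumed ebook price, in dollars
royalty_rate = 0.70          # assumed self-publishing royalty share

total_copies = copies_per_author * authors_targeted    # 200,000 copies
payout = total_copies * price_per_copy * royalty_rate  # about $1.4 million
print(f"{total_copies:,} copies, roughly ${payout:,.0f} per batch")
```

Even at modest self-publishing prices, one batch yields a seven-figure haul, and the batch can be repeated week after week.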
It seems obvious to me that a large-scale grift like this is going to happen, probably repeatedly. After spending a good six years deep in the world of crypto, I am still constantly surprised by how many people will steal without blinking and without shame, as long as they are reasonably confident of getting away with it. This kind of theft also often carries a badge of honour, with anonymous online bragging about successful scams proudly posted.
AI is going to open this door even wider than crypto did. It is not only theft of ‘creative’ IP through deceptive mimicry that will flourish; given that we are already in a world where differentiating between human output and AI output is increasingly difficult, we are certain to see the rise of the industrial-strength deep-fake scamster across other areas of human endeavour too.
But here is the simple solution, and it comes from the other big technology of the last decade — crypto.
If every piece of human-created content were assigned an NFT by its creator at its birth, there would be no way for an AI to impersonate the work of that artist. Only the holder of the private/public key pair could claim authenticity, using the magic of unhackable digital signatures. These technologies are here now, courtesy of cryptography. Details of how the NFT is minted, who mints it, and how the keys are protected and remembered are all the stuff of multiple innovations pouring out of the world of crypto over the last few years. Their application to the protection of human IP is both obvious and achievable.
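To show the cryptographic core of that idea, here is a minimal sketch of signing a manuscript with a private key and verifying it with the matching public key, using Python’s cryptography library and Ed25519 signatures; the NFT minting and key-custody layers described above would sit on top of this and are not shown.

```python
# Minimal sketch: sign a manuscript with a private key, verify with the
# public key. Assumes the third-party 'cryptography' package is installed.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# The author generates a key pair once and publishes the public key,
# for example inside an NFT tied to their identity.
author_key = Ed25519PrivateKey.generate()
public_key = author_key.public_key()

manuscript = b"The full text of the author's genuine new book."
signature = author_key.sign(manuscript)  # only the private-key holder can do this

# Anyone (a reader, Amazon, Goodreads) can check authenticity against
# the published public key.
try:
    public_key.verify(signature, manuscript)
    print("Genuine: signed by the holder of the author's private key.")
except InvalidSignature:
    print("Fake: not signed by the author, or the text was altered.")
```

An impostor cannot produce a signature that verifies against the author’s published key, which is what lets the genuine creator, and nobody else, claim a work as theirs.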
And to that reader who makes it happen, you can thank me later.
Steven Boykey Sidley is a Professor of Practice at JBS, University of Johannesburg. His new book “It’s Mine: How the Crypto Industry is Redefining Ownership” will be published in August 2023.
Article first printed in Daily Maverick