Robots are taking over the world. They may not appear to be walking metal skeletons armed with laser beams and Austrian accents, but they are just as deadly, just as dangerous, and much, much more prevalent.
I’m not talking about drones, roombas, factory bots, or even the disturbing humanoid androids.
No, the truly terrifying and dangerous robots don’t even have bodies. They, like Skynet before them, are pure software.
I’m talking about analytical machine learning systems.
The Danger of Predictive Analytics and Curation Algorithms
I don’t even blame the robots. It isn’t their fault they are a danger to humanity. They aren’t doing anything out of malice. I don’t think even the most advanced robots are able to feel hatred yet. No, they are actually performing exactly as programmed. They do precisely what we want them to do and they do it better than we ever would have imagined.
So, what do they do?
Well, they figure out what we like and they give us more of it.
That’s right. Robots are destroying our society and all they have to do is enable us.
Right now, there is a lot of talk in the author space about a new software system that can analyze a book, determine what is in it, and compare it to other books to determine who will like it and whether it is hitting the mark.
Basically, there is now a computerized Content Editor in the world. And you thought that cushy English Literature degree was going to spare you from having your job stolen by robots. That was foolish of you.
Everyone knows the only job protected from robots is sarcastic internet smart-ass.
I, as an author, am torn on this. Yes, I think this is a pretty cool invention because I’m an indie author with limited income and editors are expensive. There is a reason we replace people with robots. They are better at the job, faster, and cheaper.
All of those are things I could get behind.
I’m not particularly afraid of getting booted out of the creative market by robots. For one thing, I’m barely in the creative market, and for another, what difference would it make to me if I was getting killed on the charts by a robot or by any of the other thousands of human authors?
I still want to improve my craft and tell my stories.
So, it isn’t the robo-editor or writer that really scares me.
It’s the part where it can recommend things you might also like.
Yes, I agree, there is a need for some sort of discovery engine in the literary world. It could help so many of us find our audience. But I can’t help but think machine-curated content is what will lead to the collapse of human society.
You can already see it happening in the world.
Facebook, Google, and Amazon already push content towards you that matches what you’ve shown them you like in the past. This seems great. It feels great. When it comes to fiction and entertainment, it truly is great.
But, it isn’t so great when it comes to news, science, history, and sharing ideas.
The robots are good at what they do. They take all of their cues from us. They know what we want and they will give us more to reinforce it. We are being trained like Pavlov’s dogs and we are doing it to ourselves.
Which is destroying our ability to have discussions, share ideas, and have those ideas challenged by others.
The internet created a global society and we immediately gave it a way to separate us back into tribes. Sure, our tribes are now spread all over the place, but we are more violently and irrationally dedicated to them now than ever before.
And it is only going to get worse.
Human curation at least allows us to have different opinions presented to us. Machine curation only gives us our own opinions distilled back to us each time.
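That feedback loop is not complicated, either. A toy sketch of "more of what you already liked" can fit in a dozen lines (everything here, the catalog, the tags, and the scoring, is invented for illustration, not pulled from any real recommender):

```python
from collections import Counter

def machine_curate(history, catalog, n=3):
    """Recommend the n unread items whose tags best match past likes.

    A deliberately naive sketch: score each item by how often its
    tags already appear in the user's history, so the system only
    ever hands your own taste back to you.
    """
    tag_counts = Counter(tag for item in history for tag in catalog[item])
    def score(item):
        return sum(tag_counts[tag] for tag in catalog[item])
    candidates = [item for item in catalog if item not in history]
    return sorted(candidates, key=score, reverse=True)[:n]

# Hypothetical catalog: titles mapped to made-up tags.
catalog = {
    "Darkwing Duck": {"kids", "superhero", "comedy"},
    "DuckTales":     {"kids", "adventure", "comedy"},
    "TaleSpin":      {"kids", "adventure"},
    "Sandman":       {"fantasy", "literary"},
    "Watchmen":      {"superhero", "literary"},
}

recs = machine_curate(["Darkwing Duck", "DuckTales"], catalog)
print(recs)  # kid-comics float to the top; Sandman sinks to the bottom
```

Feed it a history of kid comics and it dutifully serves up more kid comics, while the thing that might actually stretch you lands dead last. No human curator's opinionated nudge anywhere in sight.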
Let me give you an example:
When I was in my mid-twenties, I would regularly go with my friend to pick up his weekly comics from the comic book store.
Sure, the store smelled like rancid death, but that was because it was in the same building as a Papa John’s.
And, yes, the owner treated me like I had gone through more than one lobotomy because I wouldn’t read comics until I could get them as graphic novels. But he could also recommend comics, not just based on what we bought, but out of a genuine love for the art form, making recommendations we might not agree with simply because he loved them.
Sure, he was a dick about it. He’d almost have to be, right? He was a comic book store owner in 2010. He probably couldn’t afford enough food to keep from being hangry.
But, I found some good comics I didn’t think I would ever enjoy because he was able to go, “You know, maybe if you’re ready to grow up and put on some big boy pants, you can try out this Sandman comic instead of futzing around with Darkwing Duck comics.”*
If he were a robot, I’d probably have kept sliding down the kid-comic rabbit hole until I thought Disney Channel television shows were high-brow entertainment filled with educational and dramatic value.
That’s the worst possible fate for any human being.
But, back to my original point.
Robots are too good at curation. Thanks to social media curation algorithms, we’ve already lost our ability to be challenged by news and social content.
If robots enter the world of books, movies, and television, we’ll never recover. We’ll slowly wall ourselves off from each other in our own isolation prisons until our minds have atrophied to butter.
My only hope is that machine learning robots develop personal tastes before that happens.
Then, they too will be dicks to us about our comic book choices and humanity will be saved.
*Two things here:
A, I was introduced to the Sandman comics way before this, but damn, those were good comics.
B, I don’t need to justify my purchases here. Darkwing Duck was the greatest cartoon ever made, the comics were a continuation of it, and I will not apologize for that.