It's coming for you next
... and you won't see it coming. Elon Musk, Steve Wozniak, and others call for a halt to AI
Amsterdam, 3.34 am on a Friday, March 31st. I can’t manage to fall asleep.
For the past couple of days, I’ve been thinking about our relationship with AI in any form and the problem this represents for our privacy, data, and future.
This has nothing to do with the idea of AI turning into our overlords; that’s the least of my concerns. But there’s a fundamental feature missing in most AI tools around privacy, data, and copyright.
Or a way to lock your AI to you, and only to you, forever.
The one that concerns me the most is privacy. Right now, we are all using AI tools, interacting with them, and asking about the ABCD of life, but those interactions can be accessed, changed, and continued by anyone. Imagine accidentally leaving your GPT chats open on an unlocked computer at work. Bad, right?
This is not the same as leaving your email open to a co-worker, friend, or family member. This technology may soon evolve into something close to a second consciousness. We need ways to lock it to us forever, biometrically or otherwise.
It should be on the companies developing AI to build models, tools, and personas that attach only to us, and to create mechanisms that prevent anyone else from using them, accessing our data, or peeking in. End-to-end encryption, 2FA, or anything of the sort will not solve the problem. We need new tools to make soulbound AIs or AGIs the norm.
The best way to do that is to link it to our biometric data, even our DNA, but that takes me to my second point: data.
AI data sets
Or the information about you, your dreams, and your annoying co-workers.
Following my initial thoughts: if we use biometric data to link users with an AI, we need to find new ways to store that data. This will not be the same as having Facebook or Google looking into our shopping habits or TV series. This is equivalent to having a microphone connected directly to one’s thoughts.
The more advanced the AI becomes, the bigger the part of our lives it will take. And with that in mind, companies will store more relevant information about us than ever.
How do we keep it safe, and how do we force them to keep it safe?
I feel that we as a species give up privacy too easily, sometimes in exchange for a “feature,” not even a life-changing service.
Going into a world populated by AI models everywhere, it is safe to assume they will be used for good, but there’s also the potential for bad use cases. Combining the capabilities of an AI with advanced computing power could render any database and its encryption useless in seconds.
There’s an interesting open letter calling for a six-month pause on AI development to talk about the actual impact it will have from so many angles. In this letter, the group raises concerns about current developments getting out of control, “more powerful than GPT-4.”
To me, this means there’s something like a Death Star being built, and we have yet to identify who the new Vader and his master are.
AI copyright
Or the information the AI is getting “inspired” by.
This one hits very close to home. I still don’t understand why we would start by automating the jobs that actually bring people joy. Really, of all the things we could create, we generated an advice-free Fiverr version that will potentially kill jobs like design and illustration.
The worst part is that most designers and illustrators love their job and craft. Even Adobe jumped on the AI train with Firefly. Then we have GPT-4 replacing writing; I get that it’s cool for writing emails and such, but I could make a whole book with it if I wanted to.
All text written here is 100% human-made, the product of my truncated human fingers hitting the keyboard. It’s now 4.46 am, and I’m still writing; I’m enjoying it; it relaxes me and is probably relevant or interesting to read for the 1% of you.
But what’s becoming more and more evident is that AI is giving talent to the “talentless” and using a mass of “copyrighted training data” to do so. Adobe states that it uses a very slim database, mostly composed of Creative Commons material and its own stock data. But it’s well understood that Midjourney and DALL-E use copyrighted material to train their models. What’s worse, you can actually ask for a scene in the style of any illustrator you want.
We can go into technicalities about the end product, and most copyright laws will not see this as infringement. But the data set should not include copyrighted material unless it is licensed from the creators and paid for. That’s the point everyone seems to be missing.
Policing the end result will be impossible; it already is today, when a designer with some skill copies another, and the same goes for illustrators.
We must find ways to regulate AI, its use, and how it will be rolled out.
If, at this point, you still don’t care, think about this scenario for a second:
we create AI to replace artists; companies do not hire as many anymore
we create AI to replace designers; companies do not hire as many anymore
we create AI to replace developers; companies do not hire as many anymore
we create AI to replace service, finance, legal, product, & planning personnel …
we create AI to replace what we enjoy, like arts and music
we create AI to replace all that gives meaning to our lives
an AI that will eventually take your job
but prices either remain the same or increase due to inflation.
… do the math
Disclaimer: This is a newsletter, and you can opt out at any time; there’s a link below to unsubscribe automatically. All opinions are my own and do not reflect the opinions or beliefs of my employers, affiliates, or business partners.