
Sora 2 Is Wonderful and Terrifying at the Same Time

The AI future we’ve been promised is finally here — and it’s both wonderful and terrifying.

OpenAI’s new Sora 2 video app is fun. It’s delightful and silly and goofy and creative. I have genuinely loved making funny videos that use images of me and my friends.

But read that last sentence again. The flip side of all this wonder is also the terrifying part. This is the first time I’ve felt AI get close to mimicking real life. In other words, you might have a hard time telling what’s real and what’s fake when you watch these Sora-made videos.

Fakes — especially if they’re made with the intent to mislead or harm — could be a big issue. Sora feels like a step into a new world where we haven’t quite figured out the rules.

(If you haven’t used it yet, a primer: Sora 2 lets you upload a picture of yourself and then prompt it to make videos using your image. You can also choose to let your friends, or anyone in the world, use your likeness. That’s where a lot of the Sam Altman-robs-a-store videos came from, for instance.)

AI that finally blows my mind

In the last few years, I’ve used generative AI apps, like ChatGPT, in lots of ways — most of which aren’t particularly exciting or interesting. Things like basic searches or work productivity stuff. AI feels like a useful tool, but nothing has truly blown my mind. Until Sora 2.

After playing around with Sora 2 for the last few days, I see how it’s a breakthrough: It’s something I actually enjoyed using.

Sora might not create the greatest AI video ever, and it isn’t the first social AI video feed (Meta has one called Vibes), but it unlocked something.

Vibes was a dud. Without the ability to tell the app to “Make a video of me getting arrested,” it’s boring to just watch random screensaver slop. The key to Sora — and what it’s unlocked more broadly — is the ability to see ourselves.

Sora 2 also brings worries

It’s also, very obviously, concerning. You don’t have to be an AI pessimist to see the very clear potential harms in a tool that allows you to make super-realistic videos of other people’s likenesses with just a few taps.

I’ve seen countless deepfake or AI videos of celebrities and politicians over the years, but as a non-famous person, I’ve never seen a super high-quality AI video of myself — until now.

Just off the top of my head, there are all kinds of things to worry about with this new ability to create realistic-looking (ish) videos of real people: scams, personal humiliation, extortion, and misinformation. And I’m sure there are more worries we haven’t even discovered because this is so new.

Jake Paul endorses this … or does he?

I’ll leave you with one last thought after a few days using Sora 2.

Celebrities, who make their living in large part by owning the exclusive rights to their faces and voices, are so far mostly absent from Sora, as far as I can tell. (An exception is Altman, who let it rip as soon as Sora went live.)

There’s one other notable exception, though: Jake Paul.

The social media star and boxer has a long history of early success on new platforms and certainly has an, um, playful approach to monetizing his personal brand.

I’m glad he’s on Sora 2, especially since he’s such a big fan of Business Insider!

Note: Jake Paul definitely did not really say that. See my point?

Welcome to the new world!
