Tried using a free AI voice clone tool on my own videos and the results were way too good
I wanted to test how easy it was to make a fake, so I used a free online tool to clone my voice from a 30-second clip of me reading a recipe. I fed it new text about a fake product, something I'd never actually say. The audio it made was scary good, no weird robotic artifacts at all. I played it for my roommate and he totally believed it was me, which was the opposite of what I expected. I thought these free tools would be obvious junk. It made me realize that the audio side of fakes is getting really hard to spot without special software. Now I'm torn between pushing for better detection tech and thinking we need laws to limit these tools. Has anyone else had a test go sideways and make you more worried instead of less?
3 comments
tessap7 · 38d ago
I did the same thing with a clip of me complaining about grading papers. The fake audio had me agreeing to extra recess forever. My students would revolt if they found out it wasn't real.
6
betty_perry2 · 48d ago
My cousin teaches third grade in Phoenix. She says lying to kids about stuff like this seriously breaks trust. Those students are going to feel tricked when the truth comes out, and it sets a bad example about using tech to fake promises. The short-term laugh isn't worth the long-term problem.
3
fiona_kim9 · 78d ago
Consider the legal side for a second. If a free tool can do this, what's stopping bad actors? We might need rules before the damage is done.
5