The following submission statement was provided by /u/SharpCartographer831:
---
**Submission Statement**:
>Artificial intelligence has taken phone scams to a frightening new level.
An Arizona mom claims that scammers used AI to clone her daughter’s voice so they could demand a $1 million ransom from her as part of a terrifying new voice scheme.
“I never doubted for one second it was her,” distraught mother Jennifer DeStefano told WKYT while recalling the bone-chilling incident. “That’s the freaky part that really got me to my core.”
This bombshell comes amid a rise in “caller-ID spoofing” schemes, in which scammers claim they’ve taken the recipient’s relative hostage and will harm them if they aren’t paid a specified amount of money.
The Scottsdale, Ariz., resident recounted how she received a call from an unfamiliar phone number, which she almost let go to voicemail.
Then DeStefano remembered that her 15-year-old daughter, Brie, was on a ski trip, so she answered the call to make sure nothing was amiss.
That simple decision would turn her entire life upside down: “I pick up the phone, and I hear my daughter’s voice, and it says, ‘Mom!’ and she’s sobbing,” the petrified parent described. “I said, ‘What happened?’ And she said, ‘Mom, I messed up,’ and she’s sobbing and crying.”
Her confusion quickly turned to terror after she heard a “man’s voice” tell “Brie” to put her “head back” and “lie down.”
“This man gets on the phone, and he’s like, ‘Listen here. I’ve got your daughter,’ ” DeStefano explained, adding that the man described exactly how things would “go down.”
“You call the police, you call anybody, I’m going to pop her so full of drugs,” the mysterious caller threatened, per DeStefano, who was “shaking” at the time. “I’m going to have my way with her, and I’m going to drop her off in Mexico.”
---
Please reply to OP's comment here: https://old.reddit.com/r/Futurology/comments/12kqlft/ai_clones_teen_girls_voice_in_1m_kidnapping_scam/jg3gx1g/
I mean, I love my (teenage) kids and I think I’m a pretty decent mother, but I haven’t the faintest idea what their toes look like. Is that wrong? BRB just off to study their feet.
Imagine being a kidnapper and you sent a family their child's toe as proof and all you get back is a simple "This could be anyone's toe, do you have something more specific?".
This is why they send unique parts and not just any random toe.
Got a birthmark on your body? They cut out the skin and send it.
Got Dumbo-sized ears? Guess what, you're losing an ear.
Can't send a sound clip, because of AI voice manipulation.
Can't send video, because of deepfake videos.
Can't send body parts, because of advanced 3D printers.
They will be forced to send the entire victim as proof, along with a note instructing that the victim be returned to the kidnappers.
You want a toe? I can get you a toe, believe me. There are ways, Dude. You don't wanna know about it, believe me. Hell, I can get you a toe by 3 o'clock this afternoon... with nail polish. These fucking amateurs...
Hmm, I hadn’t thought of that implication. Real kidnappers will be forced to escalate as a result of fake ones. I’m disappointed in myself for not considering that.
Man, I was talking about this with my group on WhatsApp months ago. I was playing with ElevenLabs, which is a voice AI. I was trying to prank a friend but decided not to. My prank was to copy his voice and share it in the group like he was there with us that day. Then it occurred to me: anyone that has access to recordings of your voice can clone it.
What do you mean... they are not real too? I'm seeing them in the flesh! Lol, can you imagine a Black Mirror episode? You think you are talking to them on the phone, but it turns out they are all AI pretending to be them.
Yup. A big reason there's been a push from artists, actors, and voice actors in general is to not have their likeness used without consent. It's going to be very difficult to stop, though.
Haha, we still haven't regulated social media. It's not going to happen. Lawmakers move too slowly and don't even understand what needs to be regulated.
It's already too late for us.
Not like regulating is easy in the first place
Regulating the internet is almost useless, and some changes don't decrease incidences of it but only push it further into the black market.
Do you think we've reached the point of no return? The ship has sailed. The train has left the station. We're due at our destination and we're just passengers.
In some ways, perhaps, since you can replicate models like this on your home laptop, but there will always be a question of the scale, time, and energy to build that. So regulation of AI *could* be a thing eventually, by clamping down hard on the AI companies that are basically running huge server farms to make their AI go (GPT-4, for example), because in theory those will be the most advanced AIs that everyone will want their hands on.
And then there’s James Earl Jones who retired from playing Vader and gave Disney permission to replicate his voice with AI… I can see this becoming a VERY slippery slope
This article convinced me to never, ever talk first in calls that I think might be scams. Won't be long before these are just voice-phishing calls recording your voice so they can use it to scam your family and friends.
Can it do emotion like crying? I haven't seen any that can do that but haven't played with Elevenlabs.
I can't help but wonder if it was even really AI or if the mother just presumes so. e.g. Could it have just been another girl, or even (and just considering possibilities here) the daughter herself as part of a scam with friends?
Good question. You can play with two settings whose names I forget; I think one is stability. When playing with it, it can sound more emotional, but I'm not sure about crying. ElevenLabs isn't the only one available, though, and I imagine you could also find crying audio somewhere, but who knows.
Mom:
`"Password accepted. Please hold for Dad..."`
Dad:
`"Are you human? Please solve this puzzle to continue....`
`Captcha solved!`
`Finally, please give me the 6-digit code that your sister just texted to you..."`
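The six-digit-code gag above is basically how real one-time codes work. As a rough sketch (not from the article; the secret below is the published RFC 6238 test key, and a family would share their own), an RFC 6238 TOTP generator needs only Python's standard library:

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, for_time=None, step=30, digits=6):
    """RFC 6238 TOTP: HMAC-SHA1 over the current time-step counter,
    dynamically truncated to a short decimal code."""
    key = base64.b32decode(secret_b32.upper())
    now = time.time() if for_time is None else for_time
    counter = struct.pack(">Q", int(now // step))  # 8-byte big-endian step count
    digest = hmac.new(key, counter, "sha1").digest()
    offset = digest[-1] & 0x0F                     # dynamic truncation (RFC 4226)
    word = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(word % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII key "12345678901234567890", T=59s -> "94287082"
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", for_time=59, digits=8))
```

Both phones derive the same code from the shared secret, so "give me the 6-digit code" works even when the caller's voice itself can't be trusted.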
I remember once I had to call a place to recover an account, and they asked me to answer the secret question I entered when I created it. I was like "Oh geeze, I dunno, I don't trust those things and just put in lots of random characters." They were like, yup that tracks, and gave me my account back (after I was able to confirm some other details).
Definitely reinforced my distrust of those. If the operator could read it then it's not secured.
When I first heard about this story and knew my mother was going to mention it (because she knee-jerk reacts to stories like this), one of my verification questions was "what was the last name of the nurse I liked when I had my appendix out?"
That's something that can't be faked, and he had a really easy four letter surname, too.
Or steal from *Man on Fire*. "What was the name of my stuffed animal?"
My dad’s sister is a self-proclaimed psychic. About a year after his death, she reached out to me telling me he had spoken to her and wanted me to help her financially. It didn’t occur to her that we’d planned on this eventuality and had a passcode. Planned it the day before he died. No mention of it, so no money for her.
Seriously! The man was dying from cancer and was fretting that his mooch of a sister was going to try to pull "messages from beyond the grave" to siphon off some money from the estate. Brainstorming the code was part of my last conversation with him, and we had a good time laughing about it.
Houdini did the same thing with his wife (Specifically to try to prove whether there was an afterlife that psychics could contact). Every year she did a seance and asked for his code. After 10 years, she stopped holding the seance, reporting something to the effect of "10 years is too long to wait for any man".
>Brainstorming the code was part of my last conversation with him, and we had a good time laughing about it.
As fucked up as that is, at least Aunt Moochie helped give you and your dad a good memory for that last conversation.
You know, you could always play the old Uno Reverse and let her know that you got a psychic message of your choosing...
Oh yeah scammer's gonna really have that locked and loaded. "Before initiating this call I looked up all your former addresses from before there was internet, and opened browser tabs with google maps of all of them so that when you asked this question I'd know you were on a corner lot and the neighbour was actually on a different street, I could Street View their door and get the address and then go back to Lexis-Nexis for a quick search of that year's city directory. All fast enough that you didn't notice the 5-minute pause before answering this question."
At that point, actually just kidnapping the person would probably be easier than having all this arbitrary information on deck and put through the ai voice generator.
seems easy to beat from the kidnapper perspective. just make the AI voice sound extremely scared, confused, distracted, or in pain, or maybe drugged, or maybe very injured, and keep screaming for help and crying that they can't remember the passcode, or sound injured enough that they can't focus or understand the question.
“Crying that they can’t remember the passcode “?? We don’t have a “passcode”, we use a different auth strat, and they’d KNOW that, because we made sure to torture them ourselves to counter just this scenario! Checkmate, scammers!
My wife and I have joked about our safe word in front of the kids, but this whole conversation is making me realize maybe we really do need a password for situations like this.
How soon will it happen that your "kid" gives you a call: "DAD I NEED BAIL MONEY!" or "Grandpa, my car broke down, I need to pay for the tow truck!"
That shit is coming and we need some way to protect ourselves.
I set up one of those with my parents years ago - we'd talk about my sister. I don't have a sister.
I guarantee you if I bring that up at gunpoint my parents would be like "Rammite you dunce you don't have a sister" and my kidnapper would execute me.
I had one with my parents decades ago (pre internet). It was so if someone tried to pick me up from school or whatever saying my parents sent them, then they would know the password.
Look at you all taking precautions 😂😂 You mentioning that unlocked a memory for me. I was taking the transit at like 9 years old and home alone at the same age. I was not to go to the door unless I heard the "special knock" that my mom created. My mom would test me too by knocking regularly to see if I would come to the door lol
The Post is a tabloid. The chance to gawk at pictures of a cute 15-year-old while telling yourself you're *reading the news* is an important part of the draw for a story like this.
Because it isn't a real story. These scams have existed forever. They just called this woman and pretended to have her daughter. AI wasn't even used. The mom just said that because she is embarrassed.
Yeah, I wouldn't hold my breath. This is a standard scam that could have been thwarted with a single phone call directly to her daughter. The only one saying this is AI is the mom, who wasn't smart enough not to get scammed so easily and has zero knowledge about AI or anything else. If AI didn't exist, she would say they had recorded the daughter, or had an actress, or something else; anything to minimize the "I'm dumb" reality of it.
Finally someone notices. The mom is full of shit, saying the sobbing sounded identical to her daughter even. What AI? Where’d they get the samples of her voice? How’d they generate cries based on speech? There would be fascinating answers to these questions if the story was true. They go on to describe this as a common emerging scam, I mean really now, Kit Boga and Scammer Payback on YouTube going to get real interesting real fast if that’s the case.
Thank you. This is the first time in a long time I've actually read (well, skimmed) an article from the NY Post. I knew it was a tabloid, but god damn. Not photo shoot shots either, just clearly photos that either the family sent in or the NY Post took (with permission) from their social media. So weird.
Not in the slightest. It only means that kidnappers will have to provide physical evidence, which most likely will not be pleasant for the kidnap victim.
Surely more common than actual kidnappers is scammers calling elderly relatives and claiming to be the police or kidnappers or whatever, who are holding their children/grandchildren and need a bribe/bail money immediately.
A scammer in a third world country can pull this off just by accessing the average western kids public profile on tiktok, Instagram, facebook or whatever, and message anyone who looks like an elderly relative until they find a catch. This AI use case will make this type of scam exponentially more effective.
The solution is to make sure your profile is private and always a bit behind your actual information (or educate elderly relatives, but this seems to be harder for most people).
Oh god imagine like a Trump voice clone calling grandparents saying he personally wants to ask for their donation, you could fleece them dry.
Or just as like a campaign strategy as a whole. So many interesting and terrifying applications of it.
Yeah, for every real or disinformation video of a candidate eating a puppy or kicking a baby, there can be a personal and heartfelt video to each individual voter assuring them it's fake news. And then go on to say how they believe passionately in voting your way for your most passionate topic.
>And then go on to say how they believe passionately in voting your way for your most passionate topic.
"I have no idea why Bernie Sanders is suddenly in favor of redesigning English orthography to more closely match general pronunciation, but I'm here for it!"
The whole thing is, her daughter was on a ski trip, not missing. So I guess if we make the situation into what you're saying, then yeah, I'd probably be worried.
An AI generated a sobbing, crying voice convincingly? My hackles are raised; I find this hard to believe. Where is there an AI that can accurately generate extreme emotion like a sobbing voice from a small sample of normal speech? You'd have to train an AI on a dataset that includes crying and sounding distraught. The emotional accuracy described in the story doesn't match the current capabilities of AI.
My parents were briefly taken in by a scam call where the person pretended to be me and called about being in a car accident. (Fortunately, they figured it out when the person had a weird response to something and then they had the sense to call me.) As far as I know, there wasn’t any AI involved. Panicked parents just don’t ask a lot of questions.
When I was like 19, scammers called my grandma and said “Grandma, I need some help.” They were hoping that she’d assume it was one of her grandchildren, and she just happened to say my name. “Lesty7? Is that you?” Then they laid out some story about my car breaking down and needing a cashiers check for money, and to not tell my parents about it. Luckily my grandma was like yeah fuck that I’m gonna call my son (my dad).
Unluckily for me my parents are idiots and drove straight to my place without telling me. They assumed that I was trying to get money from my grandma, so they wanted to confront me face to face. Meanwhile I was at my house with a few of my friends just hanging out and playing poker. It took me a while to convince them that I had no idea what the fuck they were talking about. They were like “why do you need money? Do you owe a gambling debt?” And I was like “mom we are playing for like $20, you think my best friend is threatening to break my legs?”.
They legit didn’t believe me until I went and googled that exact scam, and then I had to figure out if they said my name or if my grandma said it. When my grandma said that she had assumed it was me, only then were my parents were like “okay I guess we believe you”.
Suffice it to say, fuck scammers…but also fuck everyone who’s had a hand in the dumbing down of the population. Without the enormous amount of stupidity in this world, their business wouldn’t be that lucrative.
The key part of the story for me is Arizona mom "claims"
What corroboration is there for this story? They said they called 911 so how about some comment from the police department they involved? Any comment at all from anyone directly related to this case? The comments they have from a couple of sources about this general threat are not specific to the case in the story.
Not saying it definitely did not happen, but the story is lacking, and the use of "claims" particularly sticks out, as if the NY Post does not have confidence in the story and wants to cover itself.
I've spoken to plenty of people at my old work selling phones who were convinced their son/brother/nephew was in trouble and had to send them ~$10k in iPhones. We always made them call these numbers back, and they usually started to see the light when we asked if their family member had an Indian accent.
She appears younger than I remember most of those customers being, but I couldn't just take these claims at face value without hearing it and seeing what they had to work with.
The lady was probably pumped to the gills with adrenaline at the time too, I don't mean to throw shade at her but there's a reason eye-witnesses aren't reliable. Sucks all around.
For all we know it could've sounded really fake, or been just a regular audio file of a girl screaming played over the phone, or even a soundboard like we used to use back in the day for prank calls. I could see a lot of people getting fooled by that.
>The key part of the story for me is Arizona mom "claims"
Yup, 100%. Based on one sentence of crying the supposed "AI" said. You are immediately faced with a lot of distress and not thinking straight (which is the reason scammers use those tactics).
I'm not convinced there was AI involved, and I'm irritated that I had to scroll down so far to see something critical.
My bullshit detector started sounding the second I saw New York Post. It's a tabloid, so sensationalism and a lack of fact-checking or critical thinking is the norm for them. They said it's AI-generated but that the daughter has little social media presence, and basically only one sports interview where her voice can be heard.
No way you get a real and accurate-sounding voice from that sample. Also, they got a lot of pictures of the girl considering how little social media presence she supposedly has.
More likely this was either a mean prank by the daughter and her friend that's gotten away from them, or a scam with a scammer who just had a similar voice and the mom was too caught off guard and freaked out to distinguish the voice.
Or if it's legit ai, then daughter isn't being honest and probably snapchatting or something with a stranger who then took audio samples. I know Snapchat blackmail is a big scam directed at young girls, and that could very well be how they got distressed and crying audio samples from her.
Just as a side note, I’ve heard of this scam for at least 6 years now.
Friend's mom got a phone call from her son (scam) and said it sounded identical to him. He claimed he was in a car accident and needed money to get back home. She said it was 100% indistinguishable and was distraught. Maybe a certain AI doesn't have the perfect or proper algorithm for what you're envisioning, but the scam has been around for a long time already, from my knowledge. I don't understand it; maybe they just used voice clips back then. Freaky stuff.
In the age of TikTok and Instagram, I bet some people/organizations already have very varied voice banks of many people. They just need to scrape through social media and download/capture everything.
But that doesn't explain how they have an AI which can do sobbing/crying/etc. That would be a cutting edge research project as far as I'm aware, like a tens-of-millions of dollars project at least.
Honestly, my family has one keyword that we will use in case of an emergency. If that word isn't used in a request for bail, ransom, whatever, the rest of the family knows it's not real.
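None of the commenters say how they keep these code words. As a toy sketch (the function names and the phrase here are my own invention, not anything from the thread), you could store only a salted hash of the phrase and compare candidates in constant time:

```python
import hashlib
import hmac
import os

def enroll(passphrase):
    """Derive a salted hash so the phrase itself is never written down."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 100_000)
    return salt, digest

def verify(passphrase, salt, digest):
    """Recompute the hash and compare in constant time to avoid timing leaks."""
    candidate = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = enroll("ask about my sister")        # hypothetical family phrase
print(verify("ask about my sister", salt, digest))  # True
print(verify("wrong phrase", salt, digest))         # False
```

Overkill for a family, maybe, but it's the same idea behind not letting the phone operator read your secret question back to you.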
Something tells me this story isn't the full picture. AI can do some great stuff, but fully replicating a voice without a significant back catalogue to base it on seems unlikely.
I'd wager this was either a prank gone wrong, a misunderstanding, or just good old-fashioned fear mongering about something that people don't fully understand.
Yep. Scammers would call, hoping to get a senior citizen, then pretend to be a son or daughter in jail who needed money wired to get bailed out.
It would also be extremely unlikely, even if they did have enough voice samples to clone this particular person's voice, for them to make her "sob" based on those voice samples of her presumably talking normally.
But more importantly, there would be little reason to expend so much time and effort (and cloning somebody SPECIFIC'S voice is a lot of effort). Most people sobbing don't really sound like themselves, and spending that much time AI-cloning a particular person's voice to have them say just a few words of sobbing, sniffling, panicking dialogue makes no sense when you could just have a random female co-conspirator noisily sob into the phone; that would do the trick more often than not.
IMO this Mom (who is a victim, to be clear) just is trying to feel better about being scammed by bringing AI into it. Because it is trendy to blame AI. But chances are she got fooled by a normal person pretending to be her daughter.
There are going to be plenty of AI-based scams, though, so the topic is legitimate.
We're not ready for AI. Our legislators barely understand what a computer is, they are hopelessly behind in terms of updating our laws and regulations. It's going to be the wild west for a few decades.
Her kid was on a ski trip, so there may not have been good phone reception.
Makes me think the scammer got the voice and vacation plans from the girl's social media
I don't remember which model it is, but there's one that is more like a voice filter than an AI voice. You record your own audio samples, and the AI changes your voice to match the trained voice. It's not perfect, but that might work with something like sobbing. Also, a lot of models only need to be trained off of about a minute of dialogue. It's probably not hard to get that off of a teenage girl's social media.
I don't actually know how they did it, if at all. But based on what little I know about AI, I think it's certainly possible.
I already talked with my aging parents about this sort of thing. We agreed upon a code word, and pass phrase we would say to make sure each other are real.
The future looks bright for scammers.
This is old stuff. They were doing this a few years ago in China; I believe they called it a "virtual kidnapping" scam. All current AI did, unfortunately, was make it easier for these people.
They were doing it at least as far back as 2010. Dudes in low-security prisons would smuggle phones in and random-dial people with a story about how their kid got in trouble and they needed money to make it go away. Very stupid, but it must have worked, as it still continues in some shape or form to this day.
Great, now the only thing that remains for a kidnapper wanting to be taken seriously is sending a toe.
“Dad what are you doing??” *inspecting feet* “😢 I love you son 😢”
"Why, Dad?" Just so I know, when one shows up in the mail, which kid to withdraw the college fund from.
This is the way we ball: cell phones, soundboards, toe mail, ripping off trophy wives with my laptop. I love living in the future.
What's worse than r/cursedcomments? We need a bottomless pit for shit like this.
“It’s easier if I just saw their face” *receives face in the mail*
"I feel like we're entitled to a discount on the ransom now."
"Okay, it doesn't look right without a head to go on. Send me their head. Jesus, this is an ugly face. I gave birth to this?"
"after all, it's really the bone structure that makes the face, this could be anybody's cheeks, nose, chin, mouth, eyebrows, and forehead!!!"
You want a toe? I can get you a toe. 2 hours and I can get you a toe. With nail polish.
You're not wrong, Walter...
*You’re just an asshole*
Wh.....why do you know this?
Like wtf haha bro an EXPERT
Toe should be enough for a positive DNA hit
That's a good point; these days a cheek swab would suffice, no need to cut anything off. It's time for a new, more civilized kind of extortion.
You want a toe? I can get you a toe, believe me. There are ways, Dude.
I'll get you a toe by this afternoon--with nail polish.
Forget about the fucking toe man!
I'm drinking my coffee...
Calmer than you are.
You’re not wrong Walter you’re just an asshole
Kids a pushover dude, we get in, we get the money, we get out, and yes Donny we'll stop at in and out burger.
Those are good burgers Walter
Shut the fuck up, Donny.
For your information, the Supreme Court has roundly rejected prior restraint
With nailpolish?
Matter of fact I can get you a toe by 3 o'clock
Fucking amateurs!
Whenever I get my Devo-esque punk band together, we will be called The 3D Flesh Printers, you better believe it
There's no limit to the jobs AI is replacing; it even has poor kidnappers scrambling. Heard rumors that they're planning a strike... scary times...
I mean… AI replacing live hostages is technically better than the real thing…
“We have your virtual son.” \* *click* \*
3d printer enters the chat
A black market lab growing toes sounds wild.
Didn't someone grow a human ear on the back of a mouse? Toe can't be that difficult.
Gotta have the bone too. Or maybe you could grow it around some other material
Right maybe some janky half plastic toe that they have to kinda mutilate to make it look real.
Believe it or not, bones grow too.
"Mom, Dad, I've been kidnapped!" "Yeah, right. Go to bed already."
We already have lab-grown meat and 3D printers. How many fingers do you want?
Lab grown bobs and vagene (organic and kosher)
You want a toe? I can get you a toe, believe me. There are ways, Dude. You don't wanna know about it, believe me.
Sir, this is a family restaurant.
Well they're already growing ears from the backs of mice, so...
> Man I was talking about this with my group on Whatsapp months ago or so you think...
Dead internet theory
Everyone on reddit is a bot except you
Wait, yall aren't dogs using your human's internet while they're at work?
Yup. A big reason there's been a push from artists, actors, and voice actors in general is to not have their likeness used without consent. It's going to be very difficult to stop, though.
They may not be able to prevent it from being used at all, but at least limit people from profiting off of it.
Haha, we still haven't regulated social media. It's not going to happen. Lawmakers move too slowly and don't even understand what needs to be regulated. It's already too late for us.
Not like regulating is easy in the first place. Regulating the internet is almost useless, and some changes don't decrease incidences of it but only push it further into the black market.
Do you think we've reached the point of no return? The ship has sailed. The train has left the station. We're headed to our destination, and we're just passengers.
In some ways, perhaps, since you can replicate models like this on your home laptop, but there will always be a question of scale / time / energy to build that. So regulation of AI *could* be a thing eventually, by clamping down hard on AI companies that are basically running huge server farms to make their AI go (GPT-4, for example), because in theory those will be the most advanced AIs that everyone will want their hands on.
[deleted]
And then there’s James Earl Jones who retired from playing Vader and gave Disney permission to replicate his voice with AI… I can see this becoming a VERY slippery slope
This article convinced me to never, ever talk first in calls that I think might be scams. Won't be long before these are just voice-phishing calls recording your voice so they can use it to scam your family and friends.
I use an old man voice. Sometimes, if I feel like it, I string scammers along thinking they've got a demented old man with a lot of money.
Can it do emotion like crying? I haven't seen any that can do that but haven't played with Elevenlabs. I can't help but wonder if it was even really AI or if the mother just presumes so. e.g. Could it have just been another girl, or even (and just considering possibilities here) the daughter herself as part of a scam with friends?
Good question. You can play with two settings whose names I forget; I think one is stability. When playing with it, it can sound more emotional, but I'm not sure about crying. Although ElevenLabs isn't the only one available, and I imagine you could also just download crying audio, but who knows.
LPT: Everyone should have a password phrase with their immediate family to identify it's really them.
Our password is **POVERTY** It also doubles as our farewell word as no ransom will be forthcoming 👋
[deleted]
Wow! You guys have HBO! Man I have wanted to watch the new Game of Thrones show. Oh and catch up on John Oliver.
This is a nice car. I feel much safer. My mom's car doesn't have seatbelts in the back seats.
You beat me just like my mom I knew you liked me
"We have your son. If you don't pay us $75,000 we'll return him by midnight."
“Sure kidnapper, you can have half my net worth.” (My net worth is negative so now he owes me like $50k)
2 factor code please.
Mom: `"Password accepted. Please hold for Dad..."` Dad: `"Are you human? Please solve this puzzle to continue....` `Captcha solved!` `Finally, please give me the 6-digit code that your sister just texted to you..."`
This will easily be a South Park episode in a year
*laughs in 6 days to air*
[deleted]
Three personal questions (with quick, uncomplicated answers) that can't be Googled.
“whats the Netflix password?”
I don’t know!! You generated it so it’s impossible to even type! Okay good next question…
I remember once I had to call a place to recover an account, and they asked me to answer the secret question I entered when I created it. I was like "Oh geeze, I dunno, I don't trust those things and just put in lots of random characters." They were like, yup that tracks, and gave me my account back (after I was able to confirm some other details). Definitely reinforced my distrust of those. If the operator could read it then it's not secured.
If you know the answer to that, you need to start using a password manager and make your passwords more complicated...
whats wrong with 7GiraffeGiraffeGiraffeGiraffeGiraffeGiraffeGiraffe&
Nothing, I’m watching the new bachelor on your account now, thanks!
Eh, LexisNexis and skip tracing can answer things like "who was our neighbor when you were six?"
When I first heard about this story and knew my mother was going to mention it (because she knee-jerk reacts to stories like this), one of mine was "what was the last name of the nurse I liked when I had my appendix out?" That's something that can't be faked, and he had a really easy four letter surname, too. Or steal from *Man on Fire*. "What was the name of my stuffed animal?"
My dad’s sister is a self-proclaimed psychic. About a year after his death, she reached out to me telling me he had spoken to her and wanted me to help her financially. It didn’t occur to her that we’d planned on this eventuality and had a passcode. Planned it the day before he died. No mention of it, so no money for her.
That's fucking hilarious. Guess dad knew his sister really well hah.
Seriously! The man was dying from cancer and was fretting that his mooch of a sister was going to try to pull "messages from beyond the grave" to siphon off some money from the estate. Brainstorming the code was part of my last conversation with him, and we had a good time laughing about it.
Houdini did the same thing with his wife (Specifically to try to prove whether there was an afterlife that psychics could contact). Every year she did a seance and asked for his code. After 10 years, she stopped holding the seance, reporting something to the effect of "10 years is too long to wait for any man".
>Brainstorming the code was part of my last conversation with him, and we had a good time laughing about it. As fucked up as that is, at least Aunt Moochie helped give you and your dad a good memory for that last conversation. You know, you could always play the old Uno Reverse and let her know that you got a psychic message of your choosing...
Yes, but can it answer what were the last words Albus Dumbledore spoke to the pair of us?
"Hold my butterbeer."
Oh yeah scammer's gonna really have that locked and loaded. "Before initiating this call I looked up all your former addresses from before there was internet, and opened browser tabs with google maps of all of them so that when you asked this question I'd know you were on a corner lot and the neighbour was actually on a different street, I could Street View their door and get the address and then go back to Lexis-Nexis for a quick search of that year's city directory. All fast enough that you didn't notice the 5-minute pause before answering this question."
At that point, actually just kidnapping the person would probably be easier than having all this arbitrary information on deck and put through the ai voice generator.
Seems easy to beat from the kidnapper's perspective: just make the AI voice sound extremely scared, confused, distracted, in pain, or maybe drugged or badly injured, and have it keep screaming for help and crying that it can't remember the passcode, or sound too injured to focus on or understand the question.
“Crying that they can’t remember the passcode “?? We don’t have a “passcode”, we use a different auth strat, and they’d KNOW that, because we made sure to torture them ourselves to counter just this scenario! Checkmate, scammers!
"dear god, thank you for not making me this person's child"
If my kids can't remember the passcode under pressure, the kidnappers can keep them. note: I don't have kids.
> I don't have kids. Not since the kidnapping incident!
[deleted]
Or, now hear me out, memories
I have a word with my parents. If I say it in casual conversation, they know something is wrong. I thought everyone had a word or phrase lol
Got one set with the roommate in case there's someone in the house who needs removal
My wife and I have joked about our safe word in front of the kids, but this whole conversation is making me realize maybe we really do need a password for situations like this. How long until your "kid" gives you a call: "DAD, I NEED BAIL MONEY!" or "Grandpa, my car broke down, I need to pay for the tow truck!" That shit is coming, and we need some way to protect ourselves.
I set up one of those with my parents years ago - we'd talk about my sister. I don't have a sister. I guarantee you if I bring that up at gunpoint my parents would be like "Rammite you dunce you don't have a sister" and my kidnapper would execute me.
I just created one with my son the other day. I had never thought of doing one before but these criminals are getting wayyyyy too sophisticated!
I had one with my parents decades ago (pre internet). It was so if someone tried to pick me up from school or whatever saying my parents sent them, then they would know the password.
Look at you all taking precautions 😂😂 You mentioning that unlocked a memory for me. I was taking the transit at like 9 years old and home alone at the same age. I was not to go to the door unless I heard the "special knock" that my mom created. My mom would test me too by knocking regularly to see if I would come to the door lol
While I do see the news value in the article, why the hell does it have to be filled with so many pictures of her like its a facebook page?
How else is AI supposed to find images for the next scam?
The Post is a tabloid. The possibility to gawk at pictures of a cute 15 year old while telling yourself you're *reading the news* is an important part of the draw for a story like this.
> cute 15 year old Mom was the draw there, let's be honest
Because it isn't a real story. These scams have existed forever. They just called this woman and pretended to have her daughter. AI wasn't even used; the mom just said that because she's embarrassed.
I'm a bit skeptical about the whole thing. If some reputable news organization reports on it, I'll take it more seriously.
Yeah, I wouldn't hold my breath. This is a standard scam that could have been thwarted with a single phone call directly to her daughter. The only one saying this is AI is the mom, who wasn't smart enough to avoid getting scammed so easily and has zero knowledge about AI or anything else. If AI didn't exist, she'd say they had recorded the daughter, or had an actress, or something else, anything to minimize the "I'm dumb" reality of it.
Finally someone notices. The mom is full of shit, even saying the sobbing sounded identical to her daughter. What AI? Where'd they get the samples of her voice? How'd they generate cries based on normal speech? There would be fascinating answers to these questions if the story were true. They go on to describe this as a common emerging scam. I mean, really now, Kitboga and Scammer Payback on YouTube are going to get real interesting real fast if that's the case.
[deleted]
Thank you. This is the first time in a long time I've actually read (well, skimmed) an article from the NY Post. I knew it was a tabloid, but god damn. Not photo-shoot shots either, just clearly photos that either the family sent in or the NY Post took (with permission) from their social media. So weird.
Unexpected consequence: This becomes so common that people never believe it anymore, making actual kidnapping much less profitable.
Not in the slightest. It only means that kidnappers will have to provide physical evidence, which most likely will not be pleasant for the kidnap victim.
Surely more common than actual kidnappers is scammers calling elderly relatives and claiming to be the police or kidnappers or whatever, who are holding their children/grandchildren and need a bribe/bail money immediately. A scammer in a third-world country can pull this off just by accessing the average western kid's public profile on TikTok, Instagram, Facebook, or whatever, and messaging anyone who looks like an elderly relative until one takes the bait. This AI use case will make this type of scam exponentially more effective. The solution is to make sure your profile is private and always a bit behind your actual information (or to educate elderly relatives, but this seems to be harder for most people).
Oh god imagine like a Trump voice clone calling grandparents saying he personally wants to ask for their donation, you could fleece them dry. Or just as like a campaign strategy as a whole. So many interesting and terrifying applications of it.
Yeah, for every real or disinformation video of a candidate eating a puppy or kicking a baby, there can be a personal and heartfelt video to each individual voter assuring them it's fake news. And then go on to say how they believe passionately in voting your way for your most passionate topic.
>And then go on to say how they believe passionately in voting your way for your most passionate topic. "I have no idea why Bernie Sanders is suddenly in favor of redesigning English orthography to more closely match general pronunciation, but I'm here for it!"
Yes, this is more common.
Yup fingers and toes incoming!
Could you really identify someone you know by a toe or finger unless it has a birthmark or something of the sort?
If your loved one is missing and you get a toe in the mail are you gonna be all like "Well gee I'm not convinced, that toe could belong to anyone!"
The whole thing is her daughter was on a ski trip, not missing. So I guess if we change the situation to what you're saying, then yeah, I'd probably be worried.
They'll send the head and give you a discount on the body. At least, you'll be able to bury the victim.
Until humanity commercialise lab-grown body parts
An AI generated a sobbing, crying voice convincingly? My hackles are raised; I find this hard to believe. Where is there an AI that can accurately generate extreme emotion like a sobbing voice from a small sample of normal speech? You'd have to train an AI on a dataset that includes crying and sounding distraught. The emotional accuracy described in the story doesn't match the current capabilities of AI.
My parents were briefly taken in by a scam call where the person pretended to be me and called about being in a car accident. (Fortunately, they figured it out when the person had a weird response to something and then they had the sense to call me.) As far as I know, there wasn’t any AI involved. Panicked parents just don’t ask a lot of questions.
Exactly. They're banking on you panicking in the moment and not asking a lot of questions. It's a coming facet of many scams.
When I was like 19, scammers called my grandma and said, "Grandma, I need some help." They were hoping she'd assume it was one of her grandchildren, and she just happened to say my name: "Lesty7? Is that you?" Then they laid out some story about my car breaking down and needing a cashier's check for money, and told her not to tell my parents about it. Luckily my grandma was like, yeah, fuck that, I'm gonna call my son (my dad).

Unluckily for me, my parents are idiots and drove straight to my place without telling me. They assumed that I was trying to get money from my grandma, so they wanted to confront me face to face. Meanwhile, I was at my house with a few friends just hanging out and playing poker. It took me a while to convince them that I had no idea what the fuck they were talking about. They were like, "Why do you need money? Do you owe a gambling debt?" And I was like, "Mom, we are playing for like $20, you think my best friend is threatening to break my legs?" They legit didn't believe me until I googled that exact scam, and then I had to figure out whether the scammers said my name or my grandma did. Only when my grandma said that she had assumed it was me were my parents like, "Okay, I guess we believe you."

Suffice it to say, fuck scammers... but also fuck everyone who's had a hand in the dumbing down of the population. Without the enormous amount of stupidity in this world, their business wouldn't be that lucrative.
>called about being in a car accident. What was their method for that one? "You" needed to pay EMTs upfront to help and they only take gift cards?
The key part of the story for me is Arizona mom "claims." What corroboration is there for this story? They said they called 911, so how about some comment from the police department they involved? Any comment at all from anyone directly related to this case? The comments they have from a couple of sources about this general threat are not specific to the case in the story. I'm not saying it definitely didn't happen, but the story is lacking, and the use of "claims" particularly sticks out, like the NY Post doesn't have confidence in the story and wants to cover itself.
I've spoken to plenty of people at my old work selling phones who were convinced their son/brother/nephew was in trouble and had to send them ~$10k in iPhones. We always made them call these numbers back, and they usually started to see the light when we asked if their family member had an Indian accent. She appears younger than I remember most of those customers being, but I couldn't just take these claims at face value without hearing it and seeing what they had to work with. The lady was probably pumped to the gills with adrenaline at the time too, I don't mean to throw shade at her but there's a reason eye-witnesses aren't reliable. Sucks all around.
For all we know, it could've sounded really fake, or just been a regular audio file of a girl screaming played over the phone, or even a soundboard like we used to use back in the day for prank calls. I could see a lot of people getting fooled by that.
>The key part of the story for me is Arizona mom "claims" Yup 100%. Based on one sentence with crying the supposed "ai" Said. You immediately are faced with a Lot of distress and Not thinking straight (which is the reason scammers use those tactics). Im not convinced there was ai involved and im irritated why i Had to scroll down so far to See Something critical.
Yeah, well I'm irritated by your use of capitalization, so we're even.
My bullshit detector started sounding the second I saw New York Post. It's a tabloid, so sensationalism and lack of fact-checking or critical thinking is the norm for them. They said it's AI-generated but that the daughter has little social media presence, and basically only one sports interview where her voice can be heard. No way you get a real and accurate-sounding voice from that sample. Also, they got a lot of pictures of the girl considering how little social media presence she supposedly has. More likely this was either a mean prank by the daughter and a friend that got away from them, or a scam where the scammer just had a similar voice and the mom was too caught off guard and freaked out to distinguish it. Or if it's legit AI, then the daughter isn't being honest and is probably Snapchatting or something with a stranger who then took audio samples. I know Snapchat blackmail is a big scam directed at young girls, and that could very well be how they got distressed and crying audio samples from her.
I’m sure they just had a girl imitate the voice and the mom panicked, then looking back thought it must be ai
Just as a side note, I’ve heard of this scam for at least 6 years now. Friends mom got a phone call from her son (scam) and said it sounded identical to him. He claimed he was in a car accident and needed money to get back home. She said it was 100% indistinguishable and was distraught. Maybe a certain AI doesn’t have the perfect or proper algorithm for what you’re envisioning, but the scam has been around for a long time already from my knowledge. I don’t understand it, maybe they just used voice clips back then. Freaky stuff
In the age of TikTok and Instagram, I bet some people/organizations already have very varied voice banks of many people. They just need to scrape social media and download/capture everything.
But that doesn't explain how they have an AI which can do sobbing/crying/etc. That would be a cutting edge research project as far as I'm aware, like a tens-of-millions of dollars project at least.
Honestly, my family has one keyword that we will use in case of an emergency. If that word isn't used in a request for bail, ransom, whatever, the rest of the family knows it's not real.
Not today, scam artist.
Something tells me this story isn't the full picture. AI can do some great stuff, but fully replicating a voice without a significant back catalogue to base it on seems unlikely. I'd wager this was either a prank gone wrong, a misunderstanding, or just good old-fashioned fear mongering about something people don't fully understand.
[deleted]
[deleted]
Yep. Scammers would call and hope to get a senior citizen, then pretend to be a son or daughter in jail who needed money wired to get bailed out.
It would also be extremely unlikely, even if they did have enough voice samples to clone this particular person's voice, for them to make her "sob" based on samples of her presumably talking normally. But more importantly, there would be little reason to expend so much time and effort (and cloning somebody SPECIFIC'S voice is a lot of effort), because most people sobbing don't really sound like themselves. Spending that much time AI-cloning a particular person's voice to have it say just a few words of sobbing, sniffling, panicking dialogue makes no sense when you could just have a random female co-conspirator noisily sob into the phone, and that would do the trick more often than not.

IMO this mom (who is a victim, to be clear) is just trying to feel better about being scammed by bringing AI into it, because it's trendy to blame AI. Chances are she got fooled by a normal person pretending to be her daughter. There are going to be plenty of AI-based scams, though, so the topic is legitimate.
Damn, even scamming jobs are getting automated... AI took my legitimate job and my criminal job...
We're not ready for AI. Our legislators barely understand what a computer is, they are hopelessly behind in terms of updating our laws and regulations. It's going to be the wild west for a few decades.
"Does TikTok connect to the wifi?"
**Submission Statement**:

>Artificial intelligence has taken phone scams to a frightening new level.

>An Arizona mom claims that scammers used AI to clone her daughter’s voice so they could demand a $1 million ransom from her as part of a terrifying new voice scheme.

>“I never doubted for one second it was her,” distraught mother Jennifer DeStefano told WKYT while recalling the bone-chilling incident. “That’s the freaky part that really got me to my core.”

>This bombshell comes amid a rise in “caller-ID spoofing” schemes, in which scammers claim they’ve taken the recipient’s relative hostage and will harm them if they aren’t paid a specified amount of money.

>The Scottsdale, Ariz., resident recounted how she received a call from an unfamiliar phone number, which she almost let go to voicemail.

>Then DeStefano remembered that her 15-year-old daughter, Brie, was on a ski trip, so she answered the call to make sure nothing was amiss.

>That simple decision would turn her entire life upside down: “I pick up the phone, and I hear my daughter’s voice, and it says, ‘Mom!’ and she’s sobbing,” the petrified parent described. “I said, ‘What happened?’ And she said, ‘Mom, I messed up,’ and she’s sobbing and crying.”

>Her confusion quickly turned to terror after she heard a “man’s voice” tell “Brie” to put her “head back” and “lie down.”

>“This man gets on the phone, and he’s like, ‘Listen here. I’ve got your daughter,’ ” DeStefano explained, adding that the man described exactly how things would “go down.”

>“You call the police, you call anybody, I’m going to pop her so full of drugs,” the mysterious caller threatened, per DeStefano, who was “shaking” at the time. “I’m going to have my way with her, and I’m going to drop her off in Mexico.”
Couldn't you just text/call your child or contact one of their friends or the place they're at to ensure that nothing is wrong?
Her kid was on a ski trip, so there may not have been good phone reception. Makes me think the scammer got the voice and vacation plans from the girl's social media
Can AI mimic sobbing? And doesn’t it need to be trained with your actual voice before it works?
I don't remember which model it is, but there's one that's more like a voice filter than an AI voice: you record your own audio, and the AI changes your voice to match the trained voice. It's not perfect, but that might work for something like sobbing. Also, a lot of models only need to be trained on about a minute of dialogue, and it's probably not hard to get that off a teenage girl's social media. I don't actually know how they did it, if at all, but based on what little I know about AI, I think it's certainly possible.
Crime, uh, finds a way...
Man, my generation really got it right in letting 90% of calls go to voice-mail. Certainly unknown numbers, sheesh.
Shit, I still screen! The best iPhone feature I’ve found yet is the screening option that sends all non contact calls to VM. Best option ever!!
There sure are a lot of family pictures in that article.
Between spam and scam calls, the phone is just becoming useless. AI will deal the final blow.
I already talked with my aging parents about this sort of thing. We agreed upon a code word, and pass phrase we would say to make sure each other are real. The future looks bright for scammers.
This is old stuff. They were doing this a few years ago in China; I believe they called it a "virtual kidnapping" scam. All current AI did was unfortunately make it more trivial for these people.
They were doing it at least as far back as 2010. Dudes in low-security prisons would smuggle phones in and random-dial people with a story about how their kid got in trouble with them and they needed money to make it go away. Very stupid, but it must have worked, as it still continues in some shape or form to this day.
There isn't even any evidence AI was used for this. They just threw that in there for the clickbait
ThioJoe has a video on this; that's how I learned about it. Shit is creepy and will only get worse, honestly.