@nopslip
Created March 21, 2024 18:02
Example output from the np4k fabric helper with --output kvp
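For context, np4k here presumably refers to the newspaper4k article-extraction library, and the fields in the records below (title, keywords, tags, authors, summary, text, publish_date, url) line up with what its Article API exposes. Below is a minimal sketch of how one such record could be produced; the helper's actual internals aren't shown in this gist, so the article_to_kvp() wrapper and the literal \n escaping of multi-line values are assumptions, not the confirmed implementation.

```python
# Minimal sketch, assuming the helper wraps the newspaper4k Article API.
# The article_to_kvp() wrapper and the "\n" escaping are illustrative
# assumptions; only the field names are taken from the output below.
from newspaper import Article

def article_to_kvp(url: str) -> str:
    article = Article(url)
    article.download()
    article.parse()
    article.nlp()  # populates keywords and summary (needs the nltk punkt data)

    fields = {
        "title": article.title,
        "keywords": ", ".join(article.keywords),
        "tags": ", ".join(sorted(article.tags)),
        "authors": ", ".join(article.authors),
        "summary": article.summary,
        "text": article.text,
        "publish_date": article.publish_date.isoformat() if article.publish_date else "",
        "url": url,
    }
    lines = []
    for key, value in fields.items():
        escaped = value.replace("\n", "\\n")  # keep each field on a single line
        lines.append(f"{key}: {escaped}")
    return "\n".join(lines) + "\n---"

print(article_to_kvp("https://www.aura.com/learn/ai-voice-scams"))
```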
title: GUEST ESSAY: A DIY guide to recognizing – and derailing – Generative AI voice scams
keywords: guest, essay, diy, guide, recognizing, derailing, generative, voice, ai, scams, real, information, alexander, cloning, person, scammers, konovalov, google, impersonate, laugh, sounds, there’s, end, love, listen, unusual, you’re, verify, don’t, caller, urgency, co-founder, technologies, e-commerce, americans
tags:
authors: bacohido, Jeffrey Burt, Richi Jennings, Michael Vizard
summary: By Alexander Konovalov Americans lost a record $10 billion to scams last year — and scams are getting more sophisticated.\nRelated: Google battles AI fakers Recently used to impersonate Joe Biden and Taylor Swift, AI voice cloning scams are gaining momentum — and one in three adults confess they aren’t confident they’d identify the cloned voice from the real thing.\nGoogle searches for ‘AI voice scams’ soared by more than 200 percent in the course of a few months.\nHere are a few tips on how to avoid falling prey to voice cloning scams.\nWhile voice cloning technology can be convincing, it isn’t yet perfect.
text: By Alexander Konovalov\n\nAmericans lost a record $10 billion to scams last year — and scams are getting more sophisticated.\n\nRelated: Google battles AI fakers\n\nRecently used to impersonate Joe Biden and Taylor Swift, AI voice cloning scams are gaining momentum — and one in three adults confess they aren’t confident they’d identify the cloned voice from the real thing.\n\nGoogle searches for ‘AI voice scams’ soared by more than 200 percent in the course of a few months. Here are a few tips on how to avoid falling prey to voice cloning scams.\n\n•Laugh. AI has a hard time recognizing laughter, so crack a joke and gauge the person’s reaction. If their laugh sounds authentic, chances are there’s a human on the other end of the line, at least.\n\n•Test their reactions. Say something that a real person wouldn’t expect to hear. For instance, if scammers are using artificial intelligence to imitate an emergency call from your relative, say something inappropriate, such as “Honey, I love you.” Whereas a real person would react with panic or confusion, AI would simply reply “I love you too.”\n\n•Listen for anomalies. While voice cloning technology can be convincing, it isn’t yet perfect. Listen out for unusual background noises and unexpected changes in tone, which may be a result of the variety of data used to train the AI model. Unusual pauses and speech that sounds like it was generated by ChatGPT are also a clear giveaway that you’re chatting to a machine.\n\n•Verify their identity. Don’t take a familiar voice as proof that a caller is who they say they are, especially when discussing sensitive subjects or financial transactions. Ask them to provide as many details as possible: the name of their organization, the city they’re calling from, and any information that only you and the real caller would know.\n\n•Don’t overshare. Avoid sharing unnecessary personal information online or over the phone. According to Alexander, scammers often phish for private information they can use to impersonate you by pretending to be from a bank or government agency. If the person on the other end seems to be prying, hang up, find a number on the organization’s official website, and call back to confirm their legitimacy.\n\n•Treat urgency with skepticism. Scammers often use urgency to their advantage, pressuring victims into acting before they have time to spot the red flags. If you’re urged to download a file, send money, or hand over information without carrying out due diligence, proceed with caution. Take your time to verify any claims (even if they insist there’s no time).\n\nAbout the essayist: Alexander Konovalov is the Co-Founder & Co-CEO of vidby AG, a Swiss SaaS company focused on Technologies of Understanding and AI-powered voice translation solutions. A Ukrainian-born serial tech entrepreneur and inventor, he holds patents in voice technologies, e-commerce, and security. He is also a co-founder of YouGiver.me, a service that offers easy and secure communication through real gifts, catering to individual users and e-commerce businesses.\n\nMarch 11th, 2024
publish_date: 2024-03-11T22:40:26+00:00
url: https://securityboulevard.com/2024/03/guest-essay-a-diy-guide-to-recognizing-and-derailing-generative-ai-voice-scams/
---
title: How To Spot (and Avoid) AI Voice Scams
keywords: spot, voice, ai, scams, avoid, scam, phone, scammers, family, money, information, online, fraudsters, create, accounts, identity, calls, cloning, credit, videos, caller, bank, security, fake, it’s, access, questions, loved, victims, friends, friend, aura, person, account, technology
tags:
authors: Jason Fragoso
summary: AI voice scams occur when scammers use generative AI software to record someone’s voice and create a cloned version to use in a scam.\nHow To Quickly Identify an AI Voice Scam Although it is difficult to discern if you're talking to an imposter, there are many tell-tale signs that can tip you off to the fact that you’re dealing with an AI voice scam.\nHere are five warning signs of an AI voice scam: You only briefly “hear” your loved one’s voice.\nHere’s what to do to avoid these AI voice scams: Be skeptical of urgent requests.\nHere’s what to do to avoid these AI voice scams: Verify the caller's identity.
text: Are Scammers Targeting You With AI Voice Scams?\n\nDespite its promises of precision and efficiency, artificial intelligence (AI) has also become a massive threat to your finances, family, and identity. In particular, AI voice scams are on the rise, as fraudsters use advanced technology to clone your loved ones’ voices, automate phone scams, and trick you into giving up money and sensitive information.\n\nAccording to the latest research [*]:\n\nScammers only need three seconds of audio to “clone” a person’s voice to use in scam calls. Even worse, 77% of AI voice scam victims lose money.\n\nNews clips, videos on social media, and even your voicemail greeting can give scammers all they need to target you with an AI voice scam.\n\nIn this guide, we’ll explain how AI voice scams work, the warning signs to look out for, and how you can stay one step ahead of the latest scams.\n\n{{show-toc}}\n\nWhat Are AI Voice Cloning Scams? How Do They Work?\n\nAI voice scams occur when scammers use generative AI software to record someone’s voice and create a cloned version to use in a scam.\n\nFraudsters can find an audio clip online (or via your voicemail) and then use this clone to activate voice-controlled devices, produce deep fake videos, run voice phishing scams to ask your loved ones for money or personal information, and even commit virtual kidnapping.\n\nIn recent years, we’ve seen a rapid evolution in voice cloning technology. With new deep-learning algorithms, it’s possible to create realistic and convincing voice clones that imitate accents, pauses, and other particularized nuances.\n\nBetween caller ID spoofing and voice cloning, answering unknown phone calls has become extremely dangerous. If you fall for an AI voice scam, someone could get access to your devices, or exploit your credit cards, bank accounts, and personal identification.\n\nHere’s how AI voice scams typically play out:\n\nFraudsters research victims online or on social media. Most AI voice scams are highly targeted. Cybercriminals look for information they can use about a victim’s family or friends via TikTok, Instagram, and other social media accounts or online sources. For example, scammers cloned the voice of an Arizona family’s daughter while she was away on a skiing trip and then claimed she had been kidnapped [*].\n\nNext, they choose a voice to “clone.” The con artists find audio or video clips to use in their scam — often pulled from social media videos. If your family members are active online, they could be easy targets.\n\nThe caller claims to be your friend or family member. Using AI voice cloning technology, they call and trick you into believing you’re talking with a loved one, like a grandchild.\n\nThey claim to be in trouble. The scammer often says there’s an emergency (such as a car accident). Sometimes, an accomplice may join the call to impersonate a third party, like a lawyer or police officer. The goal is to make you panic.\n\nThey ask for money. As the scammers gain your trust — and create a sense of urgency — your “family member” asks for your help. Typically, they want you to send money, often through a gift card or wire transfer.\n\n🏆 Get advanced protection against the latest scams and threats. Aura’s award-winning digital security solution uses advanced technology to block scam calls, warn you of phishing attacks, and protect your identity and finances. 
Try Aura free for 14 days.\n\nHow To Quickly Identify an AI Voice Scam\n\nAlthough it is difficult to discern if you're talking to an imposter, there are many tell-tale signs that can tip you off to the fact that you’re dealing with an AI voice scam.\n\nHere are five warning signs of an AI voice scam:\n\nYou only briefly “hear” your loved one’s voice. Fraudsters know that the longer they use a cloned voice, the more likely you will catch on. In most cases, you'll only briefly hear your family member or friend — usually in a distraught tone or crying — as they explain the situation or say, "I'm in trouble" or "I messed up."\n\nThey can’t answer simple questions. Fraudsters can clone a voice, but not a person’s memories or personality. If the caller can’t give you straight answers, or hesitates when you ask basic questions, these are red flags.\n\nYou’re called from an unknown number. Most AI voice scams start with an unsolicited call from a number you don’t recognize. Quite often, it will be from another country, like one of the hotspots for phone scams such as Nigeria, Mexico, or India.\n\nSomeone else quickly takes over the call. Scammers often start the call by using the cloned voice before passing the phone over to another person who pretends to be a kidnapper, attorney, or law enforcement officer.\n\nYou’re told to pay a ransom via cryptocurrency or gift cards. Scammers prefer to use difficult-to-trace payment methods so that police and victims won’t be able to recover funds or track the criminals.\n\n💡 Resource: How To Identify a Scammer On The Phone [With Examples] →\n\nThe 5 Most Common AI Voice Cloning and Deepfake Scams\n\nFake kidnapping phone scams\n\nGrandparent scam calls\n\nFake celebrity endorsement videos\n\nScammers cloning your voice to access accounts\n\nCalls from friends who desperately need money\n\nAI voice scams pose a growing concern, with “scam likely” calls a common sight on today’s smartphones and caller IDs.\n\nLet’s take a closer look at five of the most common AI voice scams you need to watch out for in 2024:\n\n1. Fake kidnapping phone scams\n\nScammers target families, especially those that have a large online presence, by cloning a child’s voice and then calling one of the child’s parents to claim that they’ve been kidnapped. To create a sense of urgency, fraudsters make it seem as if the child is crying and begging for help — before quickly taking over the call to act as the “kidnapper” and demand a ransom.\n\nAn FBI special agent in Chicago reported that families in the USA lose an average of $11,000 in every fake kidnapping scam [*].\n\nHere’s what to do to avoid these AI voice scams:\n\nDiscuss a plan with friends and family. A proactive approach helps; for instance, you, your family, and friends can decide on a code word to use on the phone to verify that you are speaking with each other.\n\nAlways contact your loved one before acting. Scammers pressure you to act quickly, and try to keep you on the phone to ensure that you don’t call the police. Before taking action, find a way to contact the “kidnapping victim” on another phone to verify or disprove the story.\n\nAsk questions. Even though it’s an emergency, you must remain calm and figure out if you’re really talking to your family member or friend. Ask questions about specific memories, events, or things that only the real person will be able to answer. If you detect any hesitation or get wrong answers, it’s probably a scam.\n\n🛡 Fight back against online scammers. 
Fraudsters are constantly using new methods to target their victims. Keep up with their schemes by signing up for Aura's AI-powered, all-in-one digital security solution. Try Aura for free today.\n\n2. Grandparent scam calls\n\nGrandparent scams occur when fraudsters pose as family members in danger and persuade elderly victims to share sensitive information or pay bogus fees, fines, or ransoms. AI voice cloning technology makes these scams more convincing than ever before — which is a serious risk for vulnerable elderly people.\n\nIn one example, several grandmothers in Canada lost thousands of dollars to these AI voice scams last year. Fraudsters pretended to be a grandchild who was arrested, and convinced victims to send money urgently — and not tell anybody because there was a supposed gag order in place [*].\n\nHere’s what to do to avoid these AI voice scams:\n\nBe skeptical of urgent requests. If you receive unsolicited calls from someone in an emergency situation, proceed with caution. Before acting, try to verify the person’s identity by asking questions or contacting other family members.\n\nAvoid sharing personal information. If an unknown caller asks for your personally identifiable information (PII), such as your PIN or credit card numbers, hang up immediately.\n\nBe wary of payment methods. Consider it a red flag if someone asks you to send money via wire transfers or gift cards. Unlike credit card payments, you don’t have any protections if it’s a scam and will find it almost impossible to get your money back.\n\n💡 Related: How To Avoid the 12 Worst Scams Targeting Seniors in 2024 →\n\n3. Fake celebrity endorsement videos\n\nScammers use AI to create convincing videos that appear to feature real celebrities endorsing products or services — like those from Apple, McAfee, and Amazon — which can then trick consumers into buying illegitimate products.\n\nIn January 2024, The New York Times revealed that some Taylor Swift fans fell for an AI video scam in which the singer appeared to endorse Le Creuset cookware [*].\n\nHere’s what to do to avoid these AI voice scams:\n\nDon’t believe everything you see and hear online. Scammers use sophisticated AI image generators and voice cloning to create convincing ads and websites. But if it seems too good to be true, it probably is — so tread carefully!\n\nLook for signs that a video is fake. You can spot deepfakes if you look closely at videos featuring celebrities. These videos often have blurry spots, changes in video quality, and sudden transitions in the person's movement, background, or lighting.\n\nResearch companies before making purchases. Scammers impersonate celebrities to win your trust and persuade you to buy the products they’re trying to sell. But you should do your due diligence on any company by checking out third-party reviews on reputable customer review sites like Trustpilot.\n\n{{show-cta}}\n\n4. Scammers cloning your voice to access accounts\n\nScammers can use AI voice scams to contact financial institutions and fool bank employees into divulging information or making transfers. With enough samples of you speaking, scammers can create an AI-generated voice to access your bank account details and steal your savings.\n\nFlorida investor Clive Kabatznik had a lucky escape when fraudsters used AI voice cloning to impersonate him in order to transfer money to another account. Thankfully, the Bank of America representatives spotted the scam and hung up [*]. 
But many other victims are not so lucky.\n\nHere’s what to do to avoid these AI voice scams:\n\nCreate complex security questions. Your bank will ask any caller security questions before discussing your account on the phone, so make sure you create answers that only you will know (and never share them with anyone else).\n\nUse two-factor authentication (2FA). This additional security layer makes it harder for fraudsters to access your account. You can use various methods of 2FA, including SMS codes, biometrics, or a hardware security key.\n\nSet up bank alerts. In your online bank app settings, you can set up alerts to get notifications any time someone logs in (or tries to make changes to your account). This simple step can make all the difference.\n\n💡 Related: How To Get Your Money Back If Your Bank Account Was Hacked →\n\n5. Calls from friends who urgently need money\n\nFake emergency scams are similar to grandparent scams. However, fraudsters target anyone — not just elderly people. These scams start with a spoofed phone call claiming to be from your family member or friend. As imposters gain your trust, they explain that they are in urgent need of money (due to a supposed emergency).\n\nThe scammer often insists on secrecy, plays on your emotions, and pressures you into sending money immediately. An Ontario man thought he was sending $8,000 to his friend, who claimed he had been in a serious accident — but it was a scam [*].\n\nHere’s what to do to avoid these AI voice scams:\n\nVerify the caller's identity. If you receive a call from a friend or family member claiming to be in trouble (especially if the person asks for money), resist the pressure to act immediately and ask lots of specific, personal questions.\n\nOffer to meet in person or do a video call. An actual friend or relative shouldn’t take issue with these requests, but a fraudster will be evasive and come up with excuses. If the caller doesn’t want to do either, regard this as a red flag.\n\nRefuse to use wire transfers, cryptocurrency, or gift cards. These methods don't offer the same payment protections as credit cards. If you want to help but the caller insists on these methods, it’s a scam — hang up.\n\n💡 Related: How To Identify a Scammer On the Phone →\n\nWhat To Do If You Receive a Suspicious Phone Call From Someone You Know\n\nEven if you end up on the phone with a scammer, it’s not too late to prevent a bad situation from getting worse.\n\nHere are the essential steps you need to take to protect your finances and identity:\n\nNever give out personal information. Sharing something as simple as your address or email may give crooks enough to steal your identity.\n\nHang up and verify. If the caller asks for sensitive information or pressures you on the phone, end the call and use a known number to directly contact the person who claimed to be calling you. Find out if they’re really in trouble. If you can’t get in touch, call other friends and family members to verify the story.\n\nCall all financial institutions immediately. If you share sensitive information or believe the scammers can access your financial accounts, you must notify your bank and credit card issuers of potential identity fraud as soon as possible. Cancel your cards, and attempt to reverse any fraudulent transfers. It’s also wise to place fraud alerts on your accounts to get notifications about further attempts to steal your money.\n\nBlock the scammer. 
It’s essential to protect yourself from future scams by blocking the scammer’s phone number or email address. Aura uses AI to help you block unwanted calls and avoid spam text messages and vishing.\n\nSecure your accounts. To protect your online accounts against financial fraud, change your passwords and enable two-factor authentication (2FA).\n\nPlace a credit freeze. You can prevent scammers from opening new accounts in your name by freezing your credit. Contact each of the three credit bureaus individually — Experian, Equifax, and TransUnion.\n\nReport the call. If you suspect a scam, report the call to the Federal Trade Commission (FTC) either online or by calling 1-877-382-4357. If you gave up sensitive information, you should also file an official identity theft report at IdentityTheft.gov. You can bring this report with you to your local law enforcement to file a police report.\n\nWarn others. If you’ve been impersonated online, warn your friends and contacts about the scam, as they could be next.\n\n🏆 Don’t settle for second-best online and scam protection. Aura’s all-in-one digital security solution has been rated #1 by Money.com, Forbes, TechRadar, and more. Try Aura for free today.\n\nHow To Protect Yourself and Your Family From Phone Scams\n\nPhone scams are becoming more of an ongoing risk for Americans — and AI is only going to make them harder to avoid as these scams become more sophisticated.\n\nHere are seven ways to keep yourself and your family safe from phone scams:\n\nCreate a family “safe word” to use on the phone. Choose a password or phrase that only your family members know. If the person on the other end of the line doesn’t respond to the question correctly, you’ll know it’s an AI voice scam.\n\nAsk someone else to call 911 while you’re on the call. If the call is indeed a scam (or an emergency), having a second person call 911 helps ensure your safety and that of those around you.\n\nMake sure you contact your loved ones. By directly contacting the loved ones involved, you can confirm if the information you received over the phone is accurate and not a result of a potential phone scam.\n\nLimit what you share on social media and online. Scammers can only use the information that’s available to them. The fewer videos, voice recordings, and other pieces of information cybercriminals can find about you online, the harder it will be to target you with AI voice scams.\n\nScreen phone calls with an AI scam blocker. Aura’s AI-powered tools can analyze call patterns, voice characteristics, and other indicators to determine the likelihood of a call being a scam.\n\nUse strong passwords and 2FA on all accounts. Create strong, unique passwords for each account by combining letters, numbers, and special characters in order to make your passwords difficult to crack. Adding 2FA with facial recognition or an authenticator app makes hacking your accounts almost impossible.\n\nSign up for an AI-powered digital security tool. These tools offer proactive defenses against emerging cyber threats. In addition to AI voice scams, you can combat malware, ransomware, phishing attacks, DDoS attacks, and zero-day vulnerabilities.\n\n💡 Related: How Much Does LifeLock Cost For Seniors? (2024 Price Breakdown) →\n\nThe Bottom Line: AI Is Empowering Scammers — Aura Can Help\n\nAs AI technology advances, more criminals use it to create convincing scams, which makes it harder to spot the warning signs.
If you fall for these elaborate cons, the perpetrators can quickly access your personal information, trick you into sending them money, or even steal your identity.\n\nSigning up for an all-in-one digital security service is the best way to protect yourself against AI voice scams and almost every type of cyber fraud in 2024.\n\nAura safeguards you and your family with award-winning identity theft and credit protection, including 24/7 fraud monitoring across all of your financial accounts, devices, and credit files.\n\nAura’s easy-to-use cybersecurity suite includes AI spam and scam call blocking to stop scammers from getting on the phone with you. And in the worst-case scenario, you’ll get peace of mind knowing you have 24/7 access to Aura’s dedicated team of U.S.-based White Glove Fraud Resolution Specialists — along with up to $5 million in insurance coverage for eligible losses due to identity theft.\n\nUse AI to fight fire with fire against scammers. Try Aura free for 14 days.\n\nNeed an action plan?\n\nNo items found.
publish_date: 2024-02-07T00:00:00
url: https://www.aura.com/learn/ai-voice-scams
---
title: Scammers can use AI tools to clone the voices of you and your family—how to protect yourself
keywords: tools, voices, protect, clone, ai, scammers, family, voice, voters, election, warmenhoven, president, biden, hampshire, primary, money, robocall, presidential, sounds, save, technology, check, online, end, number, tuesday's, general, attorney, general's, office, types, member, order, ai-generated, steps
tags:
authors: Cheyenne DeVon, www.facebook.com
summary: A robocall impersonating President Joe Biden urged New Hampshire voters not to participate in Tuesday's presidential primary — and it probably won't be the last AI voice scam this election season.\n"These messages appear to be an unlawful attempt to disrupt the New Hampshire Presidential Primary Election and to suppress New Hampshire voters."\nIn March, the Federal Trade Commission issued a consumer alert warning people that scammers could target them by using AI technology to clone the voice of a family member in order to convince them to send the scammers money.\nFact-check and verify When it comes to AI voice scams like the one involving Biden, you should double check that what you're hearing is actually correct, Warmenhoven says.\nor just mumble something, you'll give up lots of words and inflections in your voice and then [a scammer] can clone your voice," Warmenhoven says.
text: A robocall impersonating President Joe Biden urged New Hampshire voters not to participate in Tuesday's presidential primary — and it probably won't be the last AI voice scam this election season.\n\n"Of course, this will be used by foreign nation states just like the trolling farms they already have. This is just another weapon in the arsenal," Adrianus Warmenhoven, a cybersecurity expert at NordVPN, tells CNBC Make It.\n\nThe fraudulent robocall begins by saying, "What a bunch of malarkey," in a voice that sounds like President Biden's, according to NBC News. The call then instructs recipients to "save your vote for the November election" and refrain from participating in the nation's first presidential primary, which is when voters choose which candidate they would like to be their political party's nominee in the general election.\n\nNew Hampshire's Attorney General's office says it has launched an investigation following a series of complaints about the robocalls.\n\n"Although the voice in the robocall sounds like the voice of President Biden, this message appears to be artificially generated based on initial indications," the attorney general's office said in a Jan. 22 statement. "These messages appear to be an unlawful attempt to disrupt the New Hampshire Presidential Primary Election and to suppress New Hampshire voters."\n\nThanks to the rapid development of the type of AI technology used to clone and mimic people's voices, these types of AI-powered schemes are becoming more common — and scammers aren't just spoofing well-known public figures.\n\nIn March, the Federal Trade Commission issued a consumer alert warning people that scammers could target them by using AI technology to clone the voice of a family member in order to convince them to send the scammers money.\n\nAlthough it's getting harder to differentiate between what's real and what may be an AI-generated deepfake, there are a couple of steps you can take to protect yourself.\n\n1. Fact-check and verify\n\nWhen it comes to AI voice scams like the one involving Biden, you should double check that what you're hearing is actually correct, Warmenhoven says.\n\nAlthough the spoofed version of President Biden's voice told New Hampshire voters to "save" their votes for the November general election, a quick Google search reveals that registered Republicans and undeclared voters could cast their ballot in Tuesday's primary.\n\nIf you suspect you're being targeted by a cybercriminal, you can also check online to see if there have been any recent reports of AI-generated voice scams, Warmenhoven says.\n\n2. End the call\n\nIf you decide to answer a call from an unknown number and it sounds like a panicked family member is asking you for money, try not to panic yourself.\n\nInstead, end the call and try calling or texting the person using the number they've given you, rather than the one that called, to check that they're OK, Warmenhoven says.\n\nThe same goes if you receive a call from a scammer pretending to be your bank. Don't immediately believe their claims. Instead, end the call and call the number listed on your bank's website or on the back of your debit or credit card, he says.\n\nIt's especially important to get off the phone with these types of scammers, because they don't need an especially long clip of your voice in order to use AI to clone it.\n\n"Even if you just say 'Hello? Is anybody there?'
or just mumble something, you'll give up lots of words and inflections in your voice and then [a scammer] can clone your voice," Warmenhoven says.\n\nIn May, McAfee researchers found that free online tools could be used to clone someone's voice using just three seconds of audio, per the computer software company's "The Artificial Imposter" report.\n\n"Advanced artificial intelligence tools are changing the game for cybercriminals. Now, with very little effort, they can clone a person's voice and deceive a close contact into sending money," Steve Grobman, McAfee's chief technology officer, said in the report. "It's important to remain vigilant and to take proactive steps to keep you and your loved ones safe."\n\nDON'T MISS: Want to be smarter and more successful with your money, work & life? Sign up for our new newsletter!\n\nWant to land your dream job in 2024? Take CNBC's new online course How to Ace Your Job Interview to learn what hiring managers are really looking for, body language techniques, what to say and not to say, and the best way to talk about pay. Get started today and save 50% with discount code EARLYBIRD.
publish_date: 2024-01-24T00:00:00
url: https://www.cnbc.com/2024/01/24/how-to-protect-yourself-against-ai-voice-cloning-scams.html
---
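Reading the records back in is straightforward. The sketch below is a minimal parser for this kvp output; it assumes records are separated by lines containing only ---, that fields begin with one of the keys shown above, and that long values may wrap onto continuation lines. The parse_kvp() helper and the kvp_output.txt file name are illustrative, not part of the np4k helper itself.

```python
# Minimal parsing sketch for the kvp output shown above.
# Assumptions: "---" lines separate records, fields begin with a known key,
# and wrapped values continue on the following lines.
KEYS = {"title", "keywords", "tags", "authors", "summary", "text", "publish_date", "url"}

def parse_kvp(raw: str) -> list[dict]:
    records: list[dict] = []
    current: dict = {}
    last_key = None
    for line in raw.splitlines():
        if line.strip() == "---":               # record separator
            if current:
                records.append(current)
            current, last_key = {}, None
            continue
        key, sep, value = line.partition(":")
        if sep and key.strip() in KEYS:
            last_key = key.strip()
            current[last_key] = value.strip()
        elif last_key is not None:              # continuation of a wrapped value
            current[last_key] += " " + line.strip()
    if current:
        records.append(current)
    return records

# Example usage; kvp_output.txt is a hypothetical file holding the records above.
with open("kvp_output.txt", encoding="utf-8") as fh:
    for record in parse_kvp(fh.read()):
        print(record["publish_date"], record["title"])
```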