• FBI warns of kidnapping scams

    From Mike Powell@1:2320/105 to All on Tuesday, December 09, 2025 09:05:56
    FBI warns of kidnapping scams as hackers turn to AI to provide 'proof of life'

    Date:
    Mon, 08 Dec 2025 14:30:00 +0000

    Description:
    AI-generated deepfake videos are being used as "proof" someone was kidnapped.

    FULL STORY

    Hackers are using Generative Artificial Intelligence (GenAI) to create convincing deepfake videos which are then used as proof of life in kidnapping and extortion scams.

    This is according to the US Federal Bureau of Investigation (FBI) which recently released a new Public Service Announcement (PSA), warning citizens
    not to fall for the trick.

    Here is how the scam works: the criminals pick a target and scour social media and other sources for images and videos. If they find enough material, they feed it into an AI tool to create videos and images depicting the target's loved ones as kidnapped. Then they reach out to the victim and demand an immediate ransom payment in exchange for releasing the supposed hostage.

    The scam may not be that widespread, but it has been around for a while: The Guardian reported on it two years ago. Still, with AI improving by the minute, it's safe to assume these scams are becoming more common, prompting a reaction from the FBI.

    The FBI also said that these photos and videos are not perfect: with a little pixel hunting, they can be identified as fakes. However, the crooks know this too, so the messages they send are usually timed to expire before any meaningful analysis can be done.

    "Examples of these inaccuracies include missing tattoos or scars and inaccurate body proportions," the PSA reads. "Criminal actors will sometimes purposefully send these photos using timed message features to limit the amount of time victims have to analyze the images."

    To defend against these attacks, the FBI first suggests citizens be more mindful about their privacy, both when posting photos online and when providing personal information to strangers while traveling. It then suggests families establish a code word that only they know and, most importantly, try to contact the supposedly kidnapped loved one directly before making any payment.

    ======================================================================
    Link to news story: https://www.techradar.com/pro/security/fbi-warns-of-kidnapping-scams-as-hackers-turn-to-ai-to-provide-proof-of-life

    $$
    --- SBBSecho 3.28-Linux
    * Origin: capitolcityonline.net * Telnet/SSH:2022/HTTP (1:2320/105)