Truth or trick: liars and AI deepfakes
Can you tell when someone is being dishonest with you?
We live in a world with many, many dishonest people, unfortunately, whether it's a scammer trying to get at your bank account, a politician fabricating stories about their past to seem more relatable, a salesman hiding the flaws of a used car to make the sale, or a job seeker stretching the truth about their accomplishments to land the position or secure a higher wage. Although dishonesty can have devastating consequences and lead to a person being taken advantage of, it can be very difficult to spot. That said, there are some telltale signs that you are likely being lied to, according to human psychology experts.
Zoe Williams, writing for The Guardian, interviewed three "honesty experts" – an ex-FBI agent, a psychologist and an insurance fraud investigator – and compiled a list of typical "tells" that someone may be lying to you. These are not guarantees, but taken in context, and especially if a person is exhibiting multiple signs, they could indicate that person is lying to you. Here are a few of the tips.
– Watch for self-soothing gestures. The human body will display psychological discomfort in real time.
“King Charles – he’s always playing with his cufflinks. This is how he deals with social anxiety. Prince Harry – he’s always buttoning the button that’s already buttoned – another comforting behavior,” says ex-FBI agent Joe Navarro.
Touching one’s face is another self-soothing gesture. Covering one’s suprasternal notch at the base of the neck can indicate concern, nervousness, worries, distress, insecurities or fear.
– Look for incongruities between verbal and non-verbal cues. For example, a person might say they did nothing wrong but refuse to make eye contact, fidget or mumble. Changes in pitch or tone of voice are another clue. When people are telling a story and get to a lie, "they speed up and speak at a higher pitch", says Gabrielle Stewart, a retired fraud investigator. "The voice is saying: 'I'm in cognitive overload.'"
– Listen to receive information, not to transmit it. Really listen to what the other person is saying rather than listening in order to reply. Also recognize that stress or anxiety in yourself may cloud your ability to accurately read others' emotions. How you act will influence how truthful others are with you.
“If you come across as accusatory, that affects how people react,” Navarro says. “I never did that, as it puts people on the defense and it begins to mask behaviors that I need to observe.”
– Listen to their side of the story, and to how they tell it. "The structure of the account is key… Any story will have a beginning, a middle and an end. It's normally 30% buildup, 40% content, 30% afterthoughts and reflections. An untruthful account won't stick to that structure, because they don't really want to tell you that 40%. The most common structure of a lie will be 80% buildup, then they'll tell you what happened really, really quickly, then they'll want to get it over with," says Stewart.
A person blaming a poor memory for being unable to recall a major event can be another red flag.
– Pay attention to out-of-place noises or words. For example, a liar may start laughing even though it doesn't fit with the story. The liar will be on high alert and want to avoid silence at all costs, so you may hear them coughing, or they may use strings of words that only loosely fit the topic. They may add words like "possibly" or "probably" as a means of "linguistic hedging," says Stewart.
I hope these tips were of interest to you and can help with discernment. Again, these are just clues that something is off, not automatic guarantees that a person is lying.
– In other news related to dishonesty, a recent article for Wired entitled "Millions of People Are Using Abusive AI 'Nudify' Bots on Telegram" revealed an extremely disturbing new trend. A Wired review of the popular instant messaging app Telegram found at least 50 bots that create explicit, nonconsensual photos or videos of people. Users submit photos to the bots, which then perform various functions, ranging from "removing clothes" to creating images of people performing various sexual acts. Combined, the bots list more than 4 million monthly users.
As Matt Burgess reports for Wired, explicit nonconsensual deepfake content, also known as nonconsensual intimate image abuse (NCII), first appeared around 2017 and has "exploded" since then. The term "deepfake" refers to images, videos or audio generated by artificial intelligence (AI) tools, depicting real or non-existent people. Demand for deepfake content has skyrocketed over the past few years, perhaps illustrated by the fact that when I searched "deepfake" to provide you with the definition, the second Google result was a website advertising, "Make Your Own Deepfake!… All you need to do is upload videos and click a button, our app does the rest." The number of people searching for explicit deepfakes has risen high enough that Russian hackers have taken advantage of the trend by creating websites that claim to "nudify" uploaded images but actually infect a user's device with malware. Deepfake creation is also a lucrative business, as users are typically required to pay "tokens" for the AI bots to create these explicit images. The bots use their database of previously submitted images to make each new image more and more realistic.
These new technologies are being used to objectify and humiliate countless women – and men and children too – across the globe. There has been some government action, with 23 states having passed laws to address nonconsensual deepfakes. However, apps to create explicit deepfakes can still be found in Apple's and Google's app stores, and it's easy for people to create fake accounts on deepfake websites. More protections are needed to prevent people's likenesses from being unknowingly used and abused. In many ways, it is a scary virtual world we live in, and it will no doubt become more difficult to distinguish fact from fiction.
Striking a Chord...