Money Matters Episode 330- AI, Fraud & the Future W/ Whitney Joy Smith
The audio sounded perfect. Too perfect.
That’s what Whitney Joy Smith—one of Canada’s top private investigators—told me when we sat down to talk about AI, trust, and the future of voice.
In this special edition of the Money Matters podcast, we go deep into one of the most uncomfortable truths in today’s tech landscape:
You can no longer assume a voice recording is real.
Whitney shares real-world cases where AI-generated voices nearly destroyed lives, and how investigators are adjusting—not just their tools, but their intuition.
And here’s where it connects to financial advising:
In a profession built on trust, how do we navigate a world where even authenticity can be faked?
We talked about:
- The rise of deepfake scams and how to spot what’s missing
- The emotional cost of over-automation in human relationships
- Why the future isn’t about out-teching fraud—but out-humaning it
This episode is part of the research series for my upcoming book, Digital Kaizen for Financial Advisors—where we explore small, sustainable shifts that keep relationships at the heart of everything, even in an AI-driven world.
🎧 Give it a listen. It’s only 13 minutes, and it might change the way you think about your own voice, your own presence, and how you show up.
👇 Join the Digital Kaizen email list in the comments for early access to interviews, tools, and insights from the book.
#DigitalKaizen #MoneyMattersPodcast #FinancialAdvisors #VoiceDeepfakes #AIandTrust #PrivateInvestigation #BehavioralFinance #HumanCenteredAI #WhitneyJoySmith #Cybersecurity #AdvisorGrowth #EmpathyInTech #Authenticity
Money Matters Episode 330 Special Episode
[00:00:00] What if the next voice message that you receive from your spouse wasn't actually them? Today's guest, Whitney Joy Smith, is a private investigator who's been chasing down truth for over two decades, and she's been sounding the alarm. We've officially entered a world where audio recordings, once the gold standard of evidence, can no longer be trusted.
Whitney leads one of Canada's top investigative agencies specializing in fraud, digital forensics, and deception detection. But her real superpower is spotting what's missing: microexpressions, gut instincts, the imperfect human truths that AI still can't fake. In this episode, we'll talk about deepfakes, the erosion of trust, and what it takes to protect your identity and your judgment in an age of digital manipulation.
This is more than an episode about technology. It's a conversation about staying human when the lines between real and artificial keep blurring. Let's jump right in. [00:01:00] Whitney, thank you so much for being on the show today. Let me just ask you: you co-authored an article with us, and in that article you mentioned that the moment you realized audio recordings could be faked was a turning point. Can you take us back to that moment? What happened, and how did it change the way that you think about evidence?
Whitney Joy Smith: Absolutely. That moment came when a client presented me with an audio clip that on the surface seemed completely legitimate: emotionally charged, high quality, and very convincing. But something just felt off. We began digging deeper using OSINT techniques and comparing the voice to other publicly available recordings of the individual in question.
That's when I realized this wasn't just suspicious. It could have been fabricated using AI. From that point forward, I understood that an audio clip could no longer be treated as irrefutable evidence. It required a full verification process: context, chain of custody, and cross-referencing with other data. Everything changed.
Christopher NonAI: All right. Now, Whitney, you've also built your career around [00:02:00] uncovering the truth. How do you maintain trust in a world where even a voice can lie?
Whitney Joy Smith: I've always believed that trust has to be earned, and that means being transparent about what we know and what we don't. These days, we explain to clients that we have the tools to detect manipulation, but certainty isn't possible, so I rely heavily on corroborating evidence: witness statements, surveillance footage, behavioral patterns.
Trust is maintained by providing a complete picture, not just a single piece of data.
Christopher NonAI: That makes a lot of sense. In a world where audio can be generated, what new skills or instincts have investigators like you had to develop to stay ahead of deception?
Whitney Joy Smith: We've had to hone both our tech and human skills. On the technical side, we now use forensic tools to analyze pitch, background noise, and inconsistencies. But it's our human instincts that really give us the edge: listening for those tonal changes, spotting mismatches between behavior and voice, and assessing authenticity based on subtle cues.
AI-generated voices [00:03:00] often lack emotional variability or spontaneity. That too-perfect quality becomes a tell in its own right.
Christopher NonAI: You shared that perfection's actually a red flag in deepfake content. Can you explain that idea a little bit more and how your team spots when something's too polished?
Whitney Joy Smith: Sure. Think about the way people naturally behave. Emotions cause fluctuations in speech, camera movements, expressions. But deepfake content often lacks those imperfections. For example, in a genuine moment, a camera might shake because someone is surprised or afraid. In AI-generated content, everything's smooth, eerily perfect. That's a red flag.
Perfection can actually signal manipulation.
Christopher NonAI: We talk a lot in financial advising about digital clutter. Too many platforms, too many touch points. You've described a similar challenge in investigations where a single voice recording isn't enough anymore. How do you create a clear signal from all the noise?
Whitney Joy Smith: It is not easy. Sometimes [00:04:00] we're working with a 20-second audio clip and nothing else. In those cases, we can't just go on instinct. We need context. We turn to OSINT, background checks, or additional surveillance to confirm authenticity. If we can't say with certainty, we provide a professional assessment based on experience and probability.
It's about creating clarity without overstating certainty.
Christopher NonAI: I love that. Uh, what you said about microexpressions and inconsistencies in human behavior being difficult for AI to replicate: do you think those human tells will always be our edge in the age of artificial intelligence?
Whitney Joy Smith: Yes, I do. AI might be getting better at mimicking speech or facial expressions, but it still struggles with the unpredictable nuances of human behavior. Those microexpressions, pauses, shifts in emotion, even the natural rhythm of speech: those subtle details are where truth often lies. As long as we know how to look for them, we'll maintain an edge.
Christopher NonAI: All right, I got one for you. What scares you more: AI getting better at mimicking people, or people starting to [00:05:00] behave more like AI?
Whitney Joy Smith: Honestly, the second one. People are adjusting their behavior because of AI: self-censoring, performing for imagined audiences, and relying too heavily on automated tools. That loss of authenticity worries me. When we begin to act like the very technology we fear, we risk losing the very traits that make us human: intuition, creativity, and emotional depth.
Christopher NonAI: Okay. How do you personally stay grounded, discerning, and human in a field that's becoming increasingly digital and distorted?
Whitney Joy Smith: Yeah, I like to disconnect whenever I can: spend time in nature, hiking, hunting. I'm a big hunter. I like to fly into remote areas and just enjoy nature, being grounded in what's genuine and authentic for us as humans. We're created for that; we're created to be grounded in nature.
As much as I can, that's what I prefer to do: get out in nature, disconnect from all the noise of life and the busyness of all the [00:06:00] devices that we have with us on a daily basis. It's definitely the necessary thing to do.
Christopher NonAI: Alright. I'm an outdoors person too. I love that. How do you think the rise of AI generated voice impacts professions that rely on trust and personal connection, like financial advising?
Whitney Joy Smith: The human element will always matter, whether it's a child custody case or an infidelity investigation. We have empathy, discretion, and emotional intelligence that tech cannot replicate. In my agency, for example, especially with our team of female investigators, that personal touch is what clients value most.
They want someone who understands the emotional weight of what they're going through. That's where we're always gonna have the upper hand over technology. We're human.
Christopher NonAI: What cues do you look for in voice or behavior that signal something might be off, even when the data looks perfect?
Whitney Joy Smith: I pay close attention to rhythm, tone, and emotion. A real human voice like mine naturally fluctuates. It speeds up, it slows down. It shows excitement, frustration, and hesitation. AI [00:07:00] tends to be too smooth and evenly paced. When I hear something that lacks variability or emotional nuance, that's a red flag.
I also consider the context. Is this how the person usually speaks? Are there missing cues, background noise, pauses, interruptions that would typically be there? Often what's missing tells you more than what's present.
Christopher NonAI: I bet you have a ton of good stories. Can you share a case where a seemingly authentic recording turned out to be misleading, and how you knew?
Whitney Joy Smith: We're seeing a troubling trend across the investigative industry. There are increasing reports of individuals being targeted by people with vendettas, whether through gang stalking or other forms of harassment. In some situations, these bad actors fabricate audio clips, often suggesting something damaging like an affair, and then send them to a spouse or loved one to stir up conflict.
It's incredibly manipulative and emotionally harmful. The challenge then falls on the individual, and often on professionals like investigators or forensic analysts, to determine whether the clip is real. In many cases, the audio [00:08:00] sounds convincing, but it lacks ambient noise, contains no speech overlap, or has unnatural pacing.
These are all cues that the content may have been generated or altered. It's a reminder that in today's world, we have to be cautious about taking recordings at face value, especially when they're delivered anonymously or with malicious intent.
Christopher NonAI: How can advisors stay human in an increasingly AI mediated world without sounding robotic or overly polished?
Whitney Joy Smith: Yes, absolutely. I've noticed people polishing their image to the point where it becomes hard to tell what's genuine. They're hyper-aware of compliance and recorded digital trails, so they start sounding more like scripted bots than normal people, and that can erode trust. Clients wanna feel they're speaking to a real person who understands their situation, not someone who's just reciting bullet points.
When authenticity is lost, relationships suffer.
Be real. Don't over-script your communications. Clients connect with authenticity: your tone, your empathy, your ability to listen. It's okay to pause, to show emotion, to say, I don't know, but I'll find out. Those [00:09:00] are the things that build long-term trust. Technology can help with organization and access, but it can't replicate human relationships. Show up as yourself. That's what sets you apart.
Christopher NonAI: What advice would you give advisors who want to protect themselves and their clients from voice-based scams or identity fraud?
Whitney Joy Smith: First, set verification protocols. Don't rely on voice alone. Use callbacks, confirm requests via secure email, or use two-factor authentication for financial decisions. Educate clients on the possibility of voice fraud, and encourage them to pause if anything feels off. Also, keep an eye on your own digital footprint.
What videos or voice clips are you posting online? The less publicly available audio there is, the harder it is for someone to clone your voice. And finally, trust your gut. If something doesn't feel right, investigate it further. Don't rush.
Christopher NonAI: Now you've talked before about how people self-censor when they think they're recorded, right? Do you see this happening in other industries like finance, and how does it affect [00:10:00] trust?
Whitney Joy Smith: Yeah, it has a massive impact. Professions like financial advising are built on personal rapport and trust. If a client gets a call that sounds just like their advisor, the same tone, same phrases, they're likely to believe it's real. That opens the door to dangerous fraud. Voice cloning makes it harder to rely on what used to be a very personal connection.
So it's no longer just about what's being said, but how it's said when and through what channels. And scammers are also gonna try and convince you that the person is real. They use manipulation and scare tactics to get you to feel scared or pressured into making quick decisions. My grandma, a few months ago, had a call from a male who started the phone call off by saying, grandma, I'm in jail and I need your help.
Now, this wasn't a real family member, but my grandma got scared. She immediately lost all sense of perception and reality and was asking him who it was. He just kept saying, grandma, you know who I am. And she was like, no, I don't know who you are. So she started rhyming off names until the scammer felt one was the right one to go with.
He said, yes, [00:11:00] it's me. She said my cousin's name, and he said, yeah, that's me. He then proceeded to ask her for a credit card to get him out of jail. He kept saying, don't tell my dad, he's gonna kill me, but grandma, please don't tell him. Now, thankfully, we've put protections in place for my grandma, and she wasn't able to give any financial information to this individual.
But that example just shows you how people will use fear and manipulation and other tactics to get information.
Christopher NonAI: All right. And then how might advisors use your thinking around behavioral patterns and anomalies to better serve clients during emotional life events like retirement, divorce, inheritance, and that sort of thing?
Whitney Joy Smith: Advisors can watch for those same inconsistencies we look for in investigations. Is the client acting differently, rushing decisions, avoiding certain topics? These behavioral shifts can signal stress, confusion, or external pressure during emotional life events. It's especially important to slow things down and really observe not just what clients are saying, but how they're saying it.
That's [00:12:00] where empathy and intuition become powerful tools. Just like in an investigation, the anomalies often point to a deeper truth.
Christopher NonAI: Okay, thank you so much for being on the show today. For listeners who'd like to find out more, we will have her links in the podcast notes. Just to let you guys know, Joy co-authored a blog post in our new Digital Kaizen blog series that we're doing now. This blog series is a lead-up to two books that I have coming out.
One just called Digital Kaizen, and the next one called Digital Kaizen for Financial Advisors. Joy is gonna be in the second one, talking about what we just discussed today and how this affects our clients and how it affects advisors. We encourage you to go on LinkedIn, find both of us, and follow us so that you can get more information on that.
With that, everybody, have a good rest of the day there. Thank you for listening.

Whitney Smith
Founder & CEO
Whitney Smith is the founder and CEO of The Smith Investigation Agency, Canada’s largest female-owned private investigation company. She also leads Smith Security Inc., a premier security services provider in Ontario, and is the driving force behind Training Centre Canada, an online platform offering private investigator and security guard training. Whitney is dedicated to excellence, ethics, and community service.
Her agencies have consistently earned top accolades, including the Consumers Choice Award for Best Investigative Agency and the AI award for Best Woman-owned Private Investigative Agency. Whitney’s leadership is rooted in her strong faith, guiding her decisions with integrity.
As a fourth-generation entrepreneur and Orillia City Councillor, Whitney is passionate about empowering youth, women, and young entrepreneurs. Her unwavering commitment to professional excellence and community impact continues to inspire individuals across the investigative, security, and training sectors.