Unreal Deals: AI, Deepfakes and Real Estate Fraud
The Relationship Between AI and Real Estate
AI is reshaping the real estate industry by accelerating processes, enhancing accuracy, and improving client experiences. From smart chatbots that streamline customer service to data analysis tools that uncover market trends and predictive property valuation models, AI is being used in powerful, positive ways across real estate. However, bad actors are also leveraging it for fraud, including deepfakes, synthetic identity schemes, voice cloning, and falsified documentation.
What is Generative AI (GenAI)?
GenAI is a type of artificial intelligence that uses algorithms and large datasets to recognize patterns, learn from them, and then create new content, such as text, images, audio, or even video, that didn't exist before. It mimics how humans create by learning what works and applying that knowledge in new ways. The content it produces is sometimes called synthetic media. For example, imagine you give AI several pictures of cats. It studies those pictures, learns what makes a cat look like a cat, and then creates new images of cats that look real, even though those cats don't exist in real life.
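To make the learn-then-generate idea concrete, here is a deliberately simplified sketch in Python. It is a toy character-level model, not how production GenAI systems actually work, and the `examples` list and `generate_name` function are invented purely for illustration. The sketch "learns" which letter tends to follow which in a handful of sample pet names, then invents new names that were never in the original list.

```python
import random
from collections import defaultdict

# Toy illustration only: a tiny character-level model.
# Real GenAI systems use large neural networks trained on massive
# datasets, but the core idea is the same: learn patterns from
# examples, then generate new content that never existed before.

# Example "training data" (made up for this sketch): a few pet names.
examples = ["whiskers", "mittens", "shadow", "smokey", "tigger", "missy"]

# Step 1: learn which character tends to follow which.
transitions = defaultdict(list)
for word in examples:
    padded = "^" + word + "$"            # ^ marks the start, $ marks the end
    for current, nxt in zip(padded, padded[1:]):
        transitions[current].append(nxt)

# Step 2: generate a brand-new name by sampling from the learned patterns.
def generate_name(max_length=12):
    name, current = "", "^"
    while len(name) < max_length:
        nxt = random.choice(transitions[current])
        if nxt == "$":                   # the model "decided" the name is done
            break
        name += nxt
        current = nxt
    return name

if __name__ == "__main__":
    # Each run can produce names that were never in the training list.
    print([generate_name() for _ in range(5)])
```

Modern GenAI tools apply this same learn-then-generate principle at a vastly larger scale, using neural networks trained on enormous collections of text, images, audio, and video, which is why their output can be convincing enough to fuel the fraud schemes described below.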
What are some types of real estate fraud that scammers are committing using AI?
- Impersonation: Scammers use deepfake audio and video to impersonate individuals, targeting investors and real estate professionals with fake emails, video messages, and AI-driven cold calls. For example, AI can generate highly convincing phishing emails and messages by mimicking a person's writing style, making it harder to distinguish real communications from fake ones.
- Voice Cloning: This technique replicates a person's voice from audio samples, allowing scammers to fake phone calls or voicemail messages that sound like a trusted source. For example, scammers can impersonate real estate attorneys, agents, lenders, banks, or escrow officers to trick others into sharing sensitive information or authorizing fraudulent transactions.
- Identity Theft and Financial Fraud: Deepfake technology can be used to create realistic but fraudulent identity documents, which scammers then use to intercept payments or redirect funds to their own accounts.
- Business Email Compromise (BEC): Deepfakes are increasingly used in BEC scams to enhance the realism and effectiveness of these attacks, making fraudulent emails more convincing and personalized.
Real Estate and AI Fraud: Safety Tips for Transactions
- Software Updates: Keep your software up to date to close security vulnerabilities, and make sure you have anti-malware and antivirus protection.
- Property Alerts: Set up a Google Alert for the property's address.
- County Recorder: If the county recorder offers a free notification service, sign up!
- Identification: Always verify the identity of the parties in the transaction, for example by setting up a virtual meeting.
- Secure Communication: Use secure communication channels, such as encrypted email or transaction portals, when sending documents. Don't rely on email alone, especially when money is involved.
- Multi-Factor Authentication (MFA): Use MFA at a minimum; it adds an extra layer of security beyond a password.
- Title Insurance: If you are buying a residential property, consider selecting a title insurance policy with fraud coverage. If you desire enhanced coverage against fraud that might occur after you buy your home, ask the title company if an ALTA Homeowner’s policy is available for purchase.
- Stay Vigilant: AI technology continues to evolve and become more sophisticated. Keep up with the latest real estate fraud trends and how to detect them.
Tips to Tell if Something Is AI Generated
Language Fingerprints
- Language formality
- Verbosity
- Redundancy
- Overuse of transition words or phrases
- Generational lexicon
- Emotion-based words
Voice
- Listen for clicks or pauses
- Interrupt the speaker
- Ask detailed, specific questions
- Ask open-ended questions
- Use slang or colloquialisms
- Listen for irregularities, generic responses, and a lack of natural filler words
- Listen for perfection
- Check caller ID or source
Picture Quality
- Picture inconsistencies
- Look at the eyes, hands, hair, eyebrows
- Blurry or distorted backgrounds
- Uniform lighting, missing shadows
- Gaps in background and foreground
Video
- Ask detailed, specific questions
- Ask open-ended questions
- Request body movement
- Ask for a 360-degree view with the camera
- Watch for glitches or loops
- Look at mouth movement
Stay Vigilant. Stay Informed.
First on Fraud is the go-to podcast for real estate professionals. Subscribe today for more information on top crime stories, expert interviews, and actionable tips to stay ahead of emerging real estate fraud trends. If you have any questions about 1031 exchange fraud, contact First American Exchange Company today.