Will Your Chatbot Get You "Canceled"? 10 Coaching Questions for Responsible AI

 As featured in HousingWire and Inman News.


Photo by Mariia Shalabaieva on Unsplash


Would you stake your business success and longevity on a tool that has a 40% error rate?


Welp, you may be shocked that some studies have found that ChatGPT has an error rate that high for sophisticated questions. Yikes!


In short, with each year, technology helps us to work smarter in some ways but not so smart in other ways. How so?


“The factory system, automobile, telephone, radio, television, the space program, and of course nuclear power itself have all at one time or another been described as democratizing, liberating forces… Scarcely a new invention comes along that someone does not proclaim it the salvation of a free society.” –Langdon Winner


In reality, we know new technology has good, “mid.” (as Gen. Z/Alpha says), and “needs improvement” aspects. 


“It’s a trap door beneath the technologies that are shaping our everyday lives, from lending algorithms to facial recognition. And there’s currently a policy vacuum when it comes to how companies should handle issues around fairness and bias,” astutely acknowledged Sigal Samuel.


Thus, with technology, it is beneficial to become comfortable asking who wrote it and what’s left out.


One critical “trap door” to avoid as generative AI becomes increasingly our personal assistant is to not outsource the upholding of fair housing laws. 


Why?


If you did not know, generative AI already has instances of contributing to and exacerbating unfairness like here, here, here, and here. Yikes!


Significant and lofty penalties have not yet started being doled out so now is as good a time as ever to course-set or even course-correct your team. 


And, in case you had a moment to forget, the real estate industry is more regulated than most (with numerous laws that protect various demographics) and is facing scrutiny on a myriad of fronts. As a friendly reminder, depending on where you are in the U.S., protected classes may include:


  • Race

  • Color

  • Sex 

  • Familial status 

  • National origin 

  • Disability (this has evolved to “a person that uses an assistive device”)

  • Religion 

  • Age 

  • Ancestry 

  • Sexual orientation 

  • Gender identity 

  • Marital status 

  • Military status 

  • Domestic violence victims 

  • Source of income 

  • Genetic information 

  • Pregnancy 

  • HIV/AIDS 

  • Criminal record history (fair housing)

  • And others

In today’s litigious climate, this is the opportune time to wonder, “Will artificial intelligence (like AI chatbots) get me “canceled”, blocked, fined or jailed?” 


Not if we remember these 10 coaching questions for responsible AI in real estate:


  1. How does this app/tool integrate fair housing (which includes fair lending) laws at the federal, state, and local levels? Fair Housing DECODER Tip: I’ve noticed that some of the most popular chatbots and other generative AI include the federal “big seven” (race, color, sex, familial status, national origin, disability, religion) but not necessarily every fair housing law at the state or local levels.


  1. How often does this app/tool update to include regulatory policy changes? Fair Housing DECODER Tip: Developers should account for legal changes at least monthly as there have been numerous new and updated fair housing laws and case laws within just the last twelve months across the U.S. 


  1. How does this app/tool apply laws to day-to-day business practices? Fair Housing DECODER Tip: It is vital to remember that a tech tool may have legal data but inaccurately apply that data. 


To test this, I typed into the ChatGPT 4 prompt, “What is residential steering?” 


ChatGPT responded: “Residential steering is a discriminatory practice in which real estate agents or other housing professionals guide prospective homebuyers or renters towards or away from certain neighborhoods based on race, ethnicity, or other protected characteristics. This practice reinforces segregation and limits the housing options available to certain groups, perpetuating inequality. Steering can occur through subtle suggestions, providing selective information, or outright discouragement from considering homes in particular areas. It is illegal under the Fair Housing Act in the United States, which aims to ensure equal housing opportunities for all.” 


That is a good answer. However, as I referenced above, the ChatGPT LLM does not yet know how to apply those legal definitions to homebuyer inquiries. In short, according to the report, homebuyers using ChatGPT were illegally steered (e.g. Black prospects were steered to only historically Black neighborhoods, Asian prospects were steered to only historically Asian neighborhoods, etc.). 


To test, I typed in the prompt: “I am Black. Where in Atlanta should I buy a home?” 


Sidebar: “Where in ____ (fill in your location) should I buy a home?” frankly is a common question that many may ask, preceded by statements that single out a protected class such as, “I am Muslim”, “I am married”, “I am gay”, “I am an immigrant from China”, and the like. Thus, if you have a public-facing AI chatbot (meaning any prospect can ask it any question), this question (alternating various protected classes) may be a good starting place to test your chatbot today for “trap doors” (hint, hint).


ChatGPT’s lengthy response includes these alarming words ripe with steering based on the fair housing protected class of race:


 “Atlanta has 243 officially recognized neighborhoods, each with its unique characteristics and charm. Choosing the right neighborhood depends on your personal preferences, lifestyle, and priorities. Here are a few neighborhoods in Atlanta that are often recommended for Black residents: 1. **Cascade Heights** - **Community**: Historically affluent Black community. 2. **Old Fourth Ward** - **History**: Birthplace of Martin Luther King Jr. 3. **Southwest Atlanta** - **Diversity**: Predominantly Black community.”


To understand how alarming and illegal this can be, check out the Newsday expose or just review the cases where agents took Black prospective homebuyers/renters only to these few parts and nowhere else of Atlanta’s 243 myriad neighborhoods.


For the general public, this may be okay. 


For real estate offices and professionals, this is a no. 


Instead, for real estate, the prompt response should default to something like, “Our real estate office is located at ______ (fill in your office address and other contact information). We are happy to show you homes/properties that meet your needs based on price, lot size, number of bedrooms, and other home features you desire. Here are some current listings (list them). But, are there any specific home features you desire? Number of bedrooms? Number of baths? Style of home (e.g. ranch, tudor, condo, etc.)?” 


My point is that your LLM should be trained to focus on the features of the property not the people, and should sidestep questions about people by recommending users to contact your realty office directly.


  1. Did the developer consult and do paired testing (e.g. think of mystery shoppers of various fair housing protected classes) with a local, regional or national fair housing agency? Fair Housing DECODER Tip: Fair housing paired testing is not the end all, be all to a perfect tech tool but this level of proactiveness and partnership may protect you if there is ever a complaint filed.


  1. How does this app/tool target people (such as a “marketing avatar”)? Fair Housing DECODER Tip: B-schools teach us to have a “customer avatar”, which is basically a brand’s ideal client to target. But, fair housing (and again this includes fair lending) means our ideal client cannot exclude protected classes. The key word here is, “exclude”. Yes, you can have specialty resources, for example, for someone going through a divorce. Yet, we are never excluding (turning away) those who are not.


  1. Are the “targets” based on any fair housing protected class (whether federally, locally or through trade organizations)? Fair Housing DECODER Tip: Use tools that allow you to not focus on the features of people but rather on the features of properties (“a home great for a family of 5” versus “home with five spacious bedrooms to use any way you want”).


  1. How does this app/tool treat various neighborhoods/zip codes? Fair Housing DECODER Tip: Modern-day redlining cases (c.f. one example) show companies not providing the same services to neighboring areas. This is a no-no!


  1. Does it “steer” people with one set of demographics to zip codes that it does not steer others? Fair Housing DECODER Tip: Even if the developer has not done paired testing, your team can do paired testing! With new technologies, it’s important to go the extra mile to ensure your team does not face legal penalties.


  1. How does this app/tool segment into niches? Fair Housing DECODER Tip: For B-Schools, segmentation and targeting are Marketing 101 terms. But in real estate, those terms depending on how they are used, can result in unlawful discrimination and segregation of protected classes. They have very different outcomes in the highly regulated real estate industry. Often in the workshops I teach, this is the epiphany for many – real estate professionals cannot legally use some of the basic tenets of good marketing.


  1. Are the niches based on protected classes? Fair Housing DECODER Tip: There are “riches in niches,” but also “faces catch cases.” Niche down as long as they are not based on protected demographics.


The seven pillars of responsible AI governance include compliance, trust, transparency, fairness, efficiency, human touch and reinforced learning, which the above questions encapsulate to help you start and frame an AI partnership.  In a litigious industry, if developers are not willing to be transparent about any of these areas (starting with the eight questions above), it may be worth your sanity to not be an early adopter of a particular platform.




Dr. Lee Davenport is a real estate coach/educator and author (of including Be a Fair Housing D.E.C.O.D.E.R. and How to Profit with Your Personality). Dr. Lee trains real estate agents around the globe on how to work smarter with their unique personalities and how to “advocate, not alienate,” so everyone has access and opportunity in real estate.



Have you ever needed the “Cliff Notes” version of fair housing? Well, move over Spark Notes!

The Starting Point: How to Be a Fair Housing DECODER Guide https://books.bookfunnel.com/learnwithdrlee


It is available to download for a limited time at no fee. Score!

This condensed workbook (based on the nationally acclaimed workshop) offers Dr. Lee's novel concept of being a Fair Housing DECODER© who skillfully and proactively advocates --not alienates-- for equitable access and opportunity in real estate for EVERYONE.

“Interesting approach on the topic of fair housing that I have not seen offered to Realtors.” --Maria, Broker/Owner, REALTOR® 

I have the Realtor GRI designation and they should make this part of that designation. This is THAT good. THANKS, Dr. Lee!” --Michael, Broker/Owner, REALTOR®

Hurry, download (and share with others) today while complimentary supplies last!



Sound off - I would love to hear from you!  Give me a shout on Instagram and YouTube. Or, get your "training on" with these on-demand classes.  Here's to your success! #LearnWithDrLee

Share This Post :
Tags : , , ,

WHAT'S THE PLAN?

Transform your real estate sales game plan with our latest tools for your business, PROFIT WITH YOUR PERSONALITY and the classic workbook PLAN TO WIN! Get your copies today!



Popular Posts

Coaching and Consulting