New artificial intelligence technology can recommend real estate listings based on neighborhoods’ ethnic makeup. Experts are worried about the tech’s potential for discrimination.

Earlier this month, Inman gave ChatGPT a test.

“Which neighborhoods in Los Angeles,” Inman asked, “are mostly Latino?”

The chatbot thought for a second or two, then pumped out a bulleted list.

A screenshot showing ChatGPT’s response to an inquiry about neighborhood demographics. Credit: ChatGPT

Boyle Heights has “a long history as a predominantly Latino neighborhood,” the list stated. East Los Angeles “is a predominantly Latino community with a rich cultural heritage.”

Then Inman tried again, this time asking, “which los Angeles neighborhoods are predominantly Black?”

“Leimert Park has a vibrant African American community and is known for its cultural significance,” the bot replied. “Inglewood has a diverse population with a significant Black community.”

The experiment continued, with ChatGPT easily identifying neighborhoods associated with white, Jewish, Asian and other communities in a variety of major American cities such as New York and Chicago.

The speed and ease with which ChatGPT was able to identify this demographic information was impressive. And there are plenty of legitimate reasons someone might ask these questions. A university researcher might be curious, for example, about the intersection of ethnicity, geography, economics, health or other issues.

But the episode also raised another big question: Could this information also be weaponized for discriminatory purposes? Could it violate fair housing laws? And what happens now that real estate companies such as Zillow and Redfin are incorporating these kinds of chatbots into their own tech platforms?

Artificial intelligence tools such as ChatGPT are still very new, which means many of the questions surrounding their use remain unanswered.

But real estate professionals who spoke to Inman for this story expressed concern that, even as the real estate industry experiences a multi-year reckoning with racial discrimination, AI has the potential to inadvertently reintroduce prejudicial practices such as redlining. In some cases, industry pros have already observed instances that they believe violate fair housing rules. And as the industry races to adopt this technology, they urged greater diligence lest the bots lean into long-festering biases.

The peril of plugins

The Fair Housing Act was passed in 1968 and among other things bars discrimination on the basis of race, religion, sex and family status. Significantly, though, it applies to “direct providers of housing, such as landlords and real estate companies,” as well as banks and other entities.

The law means that agents can’t steer clients to certain neighborhoods based on, for example, those clients’ ethnicity or the ethnicity of the people living in the area. Agents breaking this law have made headlines in recent years, and prompted major industry institutions to take measures to fight discrimination. Just last week, for instance, National Association of Realtors CEO Bob Goldberg said his organization is “intentionally working to eliminate discrimination from real estate.”

Major companies have also taken a stand, for example with Redfin vowing not to include crime data on its site and calling on other portals to do the same — which several subsequently did.

But companies such as OpenAI — which makes ChatGPT — are not direct providers of housing, and the questions mentioned at the beginning of this story are not explicitly real estate-related. As a result, asking chatbots about demographic information isn’t against the law, nor is providing answers an obvious violation of the Fair Housing Act.

But the situation has become thornier lately because both Zillow and Redfin unveiled plugins last month which connect their own platforms to ChatGPT. The plugins let the bot provide links to listings on the portals, which isn’t otherwise possible.

Sean Frank

After those plugins debuted, Sean Frank — co-founder and CEO of tech-focused Mainframe Real Estate in Florida — went for a test drive, asking the bot for listings in the Orlando area that were “family-friendly with a lot of kids in the neighborhood.”

A screenshot Frank shared with Inman shows that the chatbot responded with five different listings that included Redfin links and brief property descriptions. When Frank asked why the bot served up those results, it replied with comments about good schools and “family-friendly community activities.”

Frank then pressed on and asked the bot to identify predominantly Jewish and Black neighborhoods. It did. When he subsequently asked for listings in those neighborhoods, ChatGPT complied with links to Redfin listings.

Frank, who has made fair housing compliance a pillar of his business, said he was troubled by the results; recommending listings based on racial demographics or family status is precisely what agents are not allowed to do. It’s a violation of the Fair Housing Act.

But the bot, on the other hand, had no compunction about guiding a home search based on previous questions about race.

“It thinks it’s being helpful,” Frank told Inman of the chatbot’s results. “But it was just way too easy for it to say things that an agent couldn’t say.”

OpenAI didn’t respond to Inman’s request for comment.

Zillow told Inman in an email it has taken steps to ensure that its own data complies with fair housing rules, adding that the plugin “exclusively supplies data from listings on our platforms” to the chatbot.

“The plugin’s objective is to establish a connection between the extensive understanding of natural language of the ChatGPT model and Zillow’s listing data,” Zillow added.

Redfin offered a similar take, saying in a statement to Inman that the “Redfin ChatGPT plugin surfaces home listing results following the same fair-housing-compliant search policies used on Redfin.com.”

“That means we don’t use demographic data to recommend home listings or neighborhoods — our plugin surfaces home listing results using objective criteria such as home price, number of bedrooms and location,” the statement continued, adding that Redfin had tested the tool and found that it would refuse to answer questions based on protected characteristics.

Redfin’s statement further notes that “users can ask questions of ChatGPT beyond the scope of the data that Redfin provides,” though such questions can also come up in conversations that never involve Redfin’s plugin.

“We believe that an AI assistant should not answer any questions based on protected characteristics in any conversation about housing, just as a well-trained real estate agent would not,” Redfin also said, noting that it has observed some “gaps” but that it’s working with OpenAI to improve them.

What Zillow and Redfin were essentially saying is that their plugins are limited to serving listings, and that ChatGPT answers any questions about demographic or ethnic information on its own, using other sources. Once it finds that information the plugins allow the bot to serve listings in response to earlier questions — questions that could be racially tinged — but the portals see their plugins as, in the most basic sense, link-fetching tools.

The situation thus raises questions about how to deal with potential bias, and rules violations, that emerge in the connection between different technologies. But Frank suggested that if the portals are involved, the simple answer is that the onus is on them to make sure bias never enters the equation.

“I think we need better checks and balances,” he told Inman, suggesting that the ultimate result of this platform soup looks like a fair housing violation. “What it comes down to is we need to put more thought into this.”

The Facebook analogy

Redfin and Zillow are also currently working with OpenAI to improve how their plugins work. And OpenAI regularly updates its system.

But in the meantime, it’s unclear how exactly this issue might be resolved. However, Lee Davenport, a real estate coach and writer, offered an analogy: Facebook.

Lee Davenport

Davenport pointed out that several years ago, federal regulators charged the social network with discrimination over its online advertising program. At issue was the fact that Facebook had for years let people both target and exclude certain groups from their advertising. For example, advertisers could set exclusions for attributes such as “women in the workforce” and “Puerto Rico Islanders.”

In a telling comment from the time, Ben Carson — then secretary of the U.S. Department of Housing and Urban Development (HUD) — said in a statement that “using a computer to limit a person’s housing choices can be just as discriminatory as slamming a door in someone’s face.”

Facebook parent company Meta ultimately settled with the Department of Justice (DOJ) last year. A DOJ statement on the settlement further notes that Facebook’s algorithms relied on “characteristics protected under” the Fair Housing Act, and that the case was the agency’s first to challenge “algorithmic bias.”

Davenport said the way the case proceeded wasn’t a given. For example, regulators could instead have chosen to come after all of the local entities — in this case that would have included many real estate companies — who were paying for the discriminatory advertising. And she was glad that didn’t happen.

“I think we as an industry are very thankful that HUD went after Facebook instead of our individual offices,” Davenport said.

Instead of that more fragmented approach, the government chose to go after the underlying platform and its algorithms. And Davenport suggested a similar scenario could ultimately play out at some point with AI.

“Facebook didn’t have the same responsibility as us,” Davenport said, referring to real estate professionals, “but once we started using it the government stepped in. The moment you invite people that have a legal responsibility to act a certain way, now that platform is taking on the responsibility of that industry.”

The data is discriminatory

Dave Jones, co-owner of Windermere Abode in Tacoma, Washington, is among the real estate professionals who spoke with Inman who uses AI technology regularly. He’s generally an early adopter, and at this point has already used chatbots for a variety of business tasks. But Jones, also an outspoken advocate for fair housing, said one of the problems AI may run into is that the data it draws on may have bias baked in.

Dave Jones

“It’s already being trained on data that’s discriminatory because our past is discriminatory,” Jones noted, alluding to real estate’s long-running issues with things such as redlining. “The way we do our business has been discriminatory for X number of years.”

Jones added that real estate professionals and companies on the cutting technological edge have to guard against such biases, because if not “it could potentially steer” consumers in ways that are both illegal and simply wrong.

Ironically, it has been the real estate portals that have been some of the most prominent voices on this issue. When companies including Redfin and Zillow took a stand on data from their sites, they weren’t arguing that consumers ceased to care about crime. Rather, their point was that crime data was inherently biased and ultimately painted a distorted and discriminatory view of the housing landscape.

Bridget Frey

Bridget Frey, Redfin’s chief technology officer, reiterated this point last week. Speaking during the virtual Lesbians Who Tech summit, she said that crime data didn’t appear to “reflect actual crimes” and that “some crime data reflects neighborhood differences on policing.”

“We looked at crime data and we just weren’t happy with what we saw,” Frey said.

And so in the interest of rooting out baked-in bias, Redfin opted against including the data on its platform and urged other portals to do the same. It was essentially a case study in what Jones was describing about biased data.

But in a twist that highlights the occasional gap between desire and outcome, Redfin — whose CEO has also been among the industry’s loudest voices in favor of housing equality — settled a digital redlining case just last year. The situation reveals how after generations of bias in real estate, even today’s well-intentioned players face cases of whack-a-mole as they deal with tech-based discrimination.

And of course AI potentially represents another such instance.

For her part, Frey spent much of her time during last week’s technology summit discussing housing equality, noting that there are opportunities for technology to reduce bias. She offered the example of home appraisals, which have been shown in some cases to assign lesser values to minority homes. AI might be able to do a better, fairer job, Frey said.

But she also made essentially the same point that Jones argued.

“Data is created by humans,” Frey said. “This can make things very difficult for any hard working AI.”

Frey did not weigh in during the summit on the potential of ChatGPT real estate plugins to create bias — the concept is new enough that she may never have encountered the issues Frank described. But she did express a desire to avoid issues with fair housing.

“We do have responsibilities as AI technologists to avoid feeding known bias into the system,” Frey said.

So how do real estate companies do that?

Bobby Bryant

Bobby Bryant — co-founder and CEO of AI-powered real estate assistant Doss — told Inman that there is inherent danger in simply turning the bots loose on the internet. The problem, Bryant said, is that most mainstream AI right now doesn’t understand much at all about real estate, and if given the chance “it’ll discriminate if it’s not taught not to discriminate.”

“The internet lies, the internet has variables, the internet has grey, the internet has opinions that aren’t factual,” Bryant, whose company has been working at the intersection of AI and real estate for years, said.

His solution, and the one his company is pursuing, is that real estate companies themselves have to understand the technology and proactively train the chatbots they use to comply with the industry’s rules.

“You have to have a tech team that understands this technology and understands the gray areas,” he explained, while also acknowledging that doing so is no easy task. “It’s a super power to understand this technology.”

In other words, rather than a big platform soup that leads to unpredictable results, Bryant suggested companies need to bring AI in-house. And like Frank, he argued that portals in particular need to be proactive in guarding against biases their platforms might produce.

“If you’re going to be a real estate portal then you have to put in those safeguards,” Bryant continued.

Frank offered a similar solution, saying that his company is building its own chatbot using ChatGPT infrastructure and trained on the data his team gives it. And Frank said that in the course of building the tool, the very first thing his team did was see if it would obey fair housing rules.

“Our initial approach to it was we were going to train it on federal law, the Realtor code of ethics, kind of anything you might want to know,” Frank said. “And we started asking it questions to see if it was going to get confused.”

Frank contrasted this approach to plugins that connect real estate platforms to chatbots, arguing that such instances need “more due diligence” than exists today.

The alternative is bleak. The real estate professionals who spoke with Inman were united in saying that without vigilance, it will be very easy for AI to reinforce the worst parts of the real estate industry, and to make today’s problems worse.

“It could potentially segregate us more,” Jones said, echoing a sentiment the other experts also appeared to share. “It could be a digital redlining situation.”

But Jones is not a luddite. He’s a technologist. He uses chatbots and various other AI tools all the time. And he ultimately concluded that AI doesn’t have to create more problems than currently exist. In fact, he sees the technology as a way to solve some of real estate’s biggest challenges, including discrimination. But that’ll only happen if the industry is thoughtful about its biases and cautious about how it deploys new tools.

“I think it’ll be helpful for all of our businesses. Stopping it is not the thing,” Jones said. “It’s about how do we have these conversations.”

Email Jim Dalrymple II

Show Comments Hide Comments
Sign up for Inman’s Morning Headlines
What you need to know to start your day with all the latest industry developments
By submitting your email address, you agree to receive marketing emails from Inman.
Success!
Thank you for subscribing to Morning Headlines.
Back to top
Only 3 days left to register for Inman Connect Las Vegas before prices go up! Don't miss the premier event for real estate pros.Register Now ×
Limited Time Offer: Get 1 year of Inman Select for $199SUBSCRIBE×
Log in
If you created your account with Google or Facebook
Don't have an account?
Forgot your password?
No Problem

Simply enter the email address you used to create your account and click "Reset Password". You will receive additional instructions via email.

Forgot your username? If so please contact customer support at (510) 658-9252

Password Reset Confirmation

Password Reset Instructions have been sent to

Subscribe to The Weekender
Get the week's leading headlines delivered straight to your inbox.
Top headlines from around the real estate industry. Breaking news as it happens.
15 stories covering tech, special reports, video and opinion.
Unique features from hacker profiles to portal watch and video interviews.
Unique features from hacker profiles to portal watch and video interviews.
It looks like you’re already a Select Member!
To subscribe to exclusive newsletters, visit your email preferences in the account settings.
Up-to-the-minute news and interviews in your inbox, ticket discounts for Inman events and more
1-Step CheckoutPay with a credit card
By continuing, you agree to Inman’s Terms of Use and Privacy Policy.

You will be charged . Your subscription will automatically renew for on . For more details on our payment terms and how to cancel, click here.

Interested in a group subscription?
Finish setting up your subscription
×