• Tech innovation is transforming from visible apps to invisible artificial intelligence and voice-driven programs that are more powerful yet harder to understand.
  • Real estate is adopting this technology, and in order to make the most of it, we need to understand the implications of this next wave of innovation.

I vividly remember the launch of the Apple App Store. After a year with my shiny new iPhone, Apple showed the world what the real power of the phone was — a platform of millions of apps designed to solve myriad problems, delight users and snap sepia-toned pictures.

It set off nearly a decade of sharing home-screen setups and touting the latest app you loved, and it caused frequent bouts of FOMO (fear of missing out) whenever someone shared a new app you didn’t yet have on your phone.

At Inman Connect San Francisco this year, countless presentations centered on understanding the user interfaces of these apps and how to use them to get the most out of them. And as we near another Connect, something has become increasingly clear to me: The innovation is no longer something we can see.

The biggest changes are focused not on adding new apps to our lives, but rather on slowly taking them away from us: removing the user interface, removing complexity, removing the mental overhead of decision-making.

As things are taken away, it’s important we understand why and where it’s happening, and what it means for how we live and work.

Annette Shaff / Shutterstock.com

The voice interface future

In other words, the future of the user interface is no interface. The future of the app is invisible. The user interface of the future isn’t “Minority Report,” it’s natural language.

Today, it’s not about showing off the apps we have; it’s showing off the services that are so sophisticated that we no longer need to manipulate them with our fingers to get them to do what we want. Apple set this trend in motion when it ditched the physical keyboards and buttons of cellphones for a single button, abstracting controls from the physical world to the digital. Now, apps and services are rendering buttons of all kinds obsolete.

Here are just a few examples.

I use artificial intelligence software as my email assistant. Called X.ai, the service works over email, and its friendly persona “Amy” (or “Andrew,” your preference) graciously schedules calls and meetings. The experience is so real that many people can’t believe it’s not a person. I’ve emailed Amy thank-yous before — even though I know she’s just a software program.

Google Now has long been surfacing flight information and travel times to your phone’s home screen, and Siri answers any question you may have. My son uses her to solve his hard math homework. She’s great at long division and finding takeout, among other things.

Facebook’s M is a personal assistant that just happens to know everything about what you like and don’t like. Amazon’s Echo is Siri for the living room.

And beyond voice operation is computer behavior tied to our own behavior and thought — the boundaries melting away.

Interfaces disappearing in real estate

We’re seeing a similar proliferation in real estate, where new services look less like apps and more like, well, nothing new. OpenDoor’s big innovation isn’t an app; it’s machine learning applied to home values that runs circles around today’s automated valuation models (AVMs). You can’t really see the difference, but it’s powerful, and it will change how homes are bought and sold.

Riley isn’t another app; it’s a service built on top of SMS notifications, letting you text to see listings. Zillow’s next big thing is TV — Apple TV — where you can browse listings from your living room.

One of Apple TV’s main requirements? Apps like Zillow’s must support Siri’s voice-command interface in order to be approved.

Powered by machine learning

Voice-powered personal assistants are nothing new. Scheduling apps are nothing new. AVMs are also not new.

But what is new is the technology these services use as a foundation. Previous versions of these applications were built on top of large indexes of information, and their usefulness depended solely on the size and organization of the index.

For example, when you use your voice to navigate a phone menu at a large corporation, the computer understands you only when you say something it expects to hear.
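A toy sketch of that index-based approach (the menu options and phrases here are hypothetical) shows why such a system recognizes only utterances it has indexed in advance:

```python
# Hypothetical phone-menu index: a fixed lookup table of expected phrases.
MENU_INDEX = {
    "billing": "Transferring you to billing.",
    "support": "Transferring you to support.",
}

def handle_utterance(utterance: str) -> str:
    # Exact lookup against the index; anything the system
    # doesn't expect to hear simply fails.
    return MENU_INDEX.get(utterance.strip().lower(),
                          "Sorry, I didn't understand that.")
```

Say “billing” and it works; say “I want to pay my bill” and the system is lost, because there is no model of meaning, only the index.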

Today’s versions are built on top of neural networks, a type of machine learning that lets computers “train” on large sets of data and then use that data to create rules about how to act in subsequent settings. These networks are recursive, meaning the computer uses its success or failure to improve not only its next response but also its next process (or thought, if you will).
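To make the “train on data, then derive rules” idea concrete, here is a minimal sketch — made-up numbers and a single artificial neuron rather than a real network: instead of looking answers up in an index, the program fits a rule from examples and applies it to inputs it has never seen.

```python
import random

def train(examples, epochs=2000, lr=0.01):
    """Fit y = w * x + b from (x, y) examples by gradient descent."""
    random.seed(0)  # deterministic starting weights for this sketch
    w, b = random.random(), random.random()
    for _ in range(epochs):
        for x, y in examples:
            err = (w * x + b) - y   # how wrong the current guess is
            w -= lr * err * x       # nudge the weight to shrink the error
            b -= lr * err           # nudge the bias the same way
    return w, b

# Toy data following the hidden rule y = 2x + 1; the program is never told it.
data = [(1, 3), (2, 5), (3, 7), (4, 9)]
w, b = train(data)
prediction = w * 5 + b  # an input not in the training data; comes out near 11
```

The “rule” lives in the learned numbers w and b, not in any index — which is also why it is hard to inspect. The same idea, scaled up to millions of weights, is what makes today’s networks both powerful and opaque.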

It all happens “under the hood,” but there is no mistaking that these networks create a completely different kind of machine understanding than we’ve seen before.

OpenDoor is probably the best example of this in real estate today. The company is rewriting what it means to be an AVM using these advanced techniques to mine value and insights from data beyond the public records used by everyone else.

The catch of no interface

The good and bad news is, with most of these things, there is no shiny new app.

This is good news in the sense that there’s little new to learn. Just speak and you access the information you want. There is no flashy new interface, nothing to get confused by, or — conversely — show off to your friends.

It can’t be demoed on stage because there’s nothing to “see.” “Click a button, get a response” is fast becoming an anachronism. It’s not as visually satisfying — there is no eye candy — but the innovation isn’t any less profound.

The catch, of course, is our lack of control. Decisions are made in an instant by a server farm in the cloud based on millions of data points.

How or why those decisions are made isn’t revealed to us; we only get the end result. Not only are these applications smarter than their predecessors, they act without your knowledge. When you can’t see how decisions are being made, it’s hard to understand exactly how your behavior and world are being shaped. That lack of control should be worrisome, or at least something to seriously consider, no matter how big a technophile you are (and I count myself one).

Take one innocuous example from Apple: if you ask Siri about popular music and aren’t a subscriber to Apple Music, she won’t answer the question for you. Or consider Amazon: when it changes a price hundreds of times a day based on your past behavior, the optimization is about how much you’re willing to give up while still feeling like you got a good deal. You’re playing against a very smart house dealer. The implications are profound.

To be sure, these innovations are far more interesting and potentially impactful than any app ever has been. And the issues they raise will become even more important to understand and address as the interface to them dissolves into the air across our lives and work. The fact that we can’t see these new programs doesn’t make their potential consequences any less real.

What’s next?

What happens when artificial intelligence tells you which lead to contact and when, which listings are the best match for you, when the ideal time to sell your home is — and the likelihood of another, higher offer coming in, or the likelihood of getting a negotiated commission split?

What happens when Apple decides which homes to show in the Zillow app for TV, or changes how Siri works with DocuSign? Or how Google’s new RankBrain machine learning search algorithm accounts for home listings and your website?

These aren’t questions for five years from now; these are questions for today. While our apps and websites aren’t going anywhere, we increasingly need to think about the implications of machine learning — of applications with no understandable interface or “thought process,” software that gives us only what it thinks we want and not what we explicitly command it to do.

At each new phase of innovation, we’ve overcome the risks posed by new technology. It’s easy to remember the days when logging into your bank account on the Web was considered an unnecessary risk.

Now we pay for groceries with our phones tied to credit cards. In much the same way, we will have to work through a more nuanced, harder-to-grasp set of risks and rewards with the coming technology innovations.

The loss of the interface — the move to innovation we can’t see or easily understand and control — will impact how we live, how we discover, how we buy, how we communicate and how we earn a living.

Understanding this shift — what it means, how we can manage it and how we can take advantage of it — is imperative in the years to come. Without that effort, we will lose the control we hope to hold on to.

It’s the next great wave of technology, and it’s one of the topics we’ll be exploring at Inman Connect in New York, where the industry comes to learn what’s next.

Email Morgan Brown.