- The current version of the RESO Data Dictionary is only about 90 percent adopted across MLSs, and MLSs must keep implementing updates to remain compliant.
- The RESO web API is a revolutionary next step that reduces redundancy and replication of MLS data.
- The future is two-way API support, allowing MLSs to share data with each other and allowing brokers to load data into the MLS from any product that supports the RESO API for updates, without manual input.
WAV Group founding partner Victor Lund is a consultant member of RESO, and WAV Group Communications is RESO’s PR agency of record.
Our industry is in the dawn of a new day.
With some measure of struggle, the nation's MLSs and their vendors have endeavored to adopt a set of standardized fields for data transportation from the MLS system to applications that support the real estate industry.
I consider this the dawn of the effort because, for the very first time, MLS adoption of the real estate standards is more strictly mandated by the National Association of Realtors' (NAR's) MLS Policy.
For years, NAR supported and funded the Real Estate Standards Organization, which is referred to by its acronym, RESO. Despite being a free-standing not-for-profit organization with an independent board of directors, the bulk of the funding for this standards organization came from NAR and was supplemented by MLS vendors and a few others.
Today, RESO has blossomed into one of the most collaborative industrywide efforts we have ever seen, with funding from vendors, brokers, associations and MLSs. The effort ties MLS vendors, broker and agent technology vendors, MLS operators, Realtor associations and many brokerages together.
This group is funding and directing a massive overhaul of how information (data) is used today and laying a strong foundation for the future. It has inspired transformation.
Like anything new, different and technical, there is also a massive level of misunderstanding that is frustrating the efforts.
In some small way, my hope is to clarify a few things and set the record straight. We try hard to understand before disagreeing, and to disagree without being disagreeable.
The ‘native’ database and RESO Data Dictionary standards
The MLS system has a “native” database.
When agents enter listing information into that database, they most often enter data that is not RESO-certified. Despite the 1,078 fields and 1,475 values within the most current version of the RESO Data Dictionary, the MLS has additional fields, business rules, database logic and numerated values that are beyond RESO standards today.
With few exceptions, native MLS databases and listing input forms have not been converted to the RESO Data Dictionary standards.
If MLSs were to adopt the RESO Data Dictionary standards for MLS fields, it would require an MLS conversion.
I imagine that most of you reading this have suffered through an MLS conversion, so you can appreciate the expense and pain that it would cause to convert all 728 MLSs. Moreover, the Data Dictionary has not evolved to cover every MLS field yet (although one day in the near future that may happen).
The only arguments for such a conversion would be to align the data values that consumers see with the data values that are entered into the MLS, and to remove the need to translate the native MLS data to the RESO Data Dictionary standard.
When data mapping means countless custom connections
RESO made a wise choice in focusing standardization on data transport rather than on the native MLS system. The chief purpose of standardizing data transport is to eliminate the need for exponential data mapping.
Today, many of the nation’s 728 MLSs are data mapping to the RESO standards — that is 728 data maps. Not such a bad number.
It allows MLSs to maintain their local, custom, native data input and storage while translating that data to the industry standard when it is used for applications that reside outside of the MLS system.
Here is the exponential part. There are roughly 750 technology vendors that RE Technology records as having applications that connect to MLS data.
In the absence of RESO, that would cause 750 vendors to map to 728 MLS data schema for a possible value of 546,000 custom connections between databases.
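The arithmetic behind that number can be sketched in a few lines. With a shared standard, each party maps once; without one, every vendor needs a custom mapping to every MLS:

```python
# Arithmetic behind the "exponential data mapping" problem described above.
MLS_COUNT = 728      # MLSs in the U.S. (per the article)
VENDOR_COUNT = 750   # technology vendors connecting to MLS data

# Without a standard: one custom mapping per vendor/MLS pair.
custom_connections = MLS_COUNT * VENDOR_COUNT   # 546,000

# With the RESO standard: each MLS maps to the standard once,
# and each vendor maps from the standard once.
standard_mappings = MLS_COUNT + VENDOR_COUNT    # 1,478

print(custom_connections, standard_mappings)
```

Fewer than 1,500 mappings instead of more than half a million — that is the whole case for a transport standard in two numbers.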
The number becomes even more absurd if you consider a brokerage that wants 10 or 20 applications in their business to talk to each other.
In the old world, a small change each month in 10 percent of MLSs to the data set (that’s 72 MLSs making database changes like adding a field, removing a field or changing the value of a field) would cause thousands of brokerage applications to break because the data mapping broke.
By the way — this does happen today, and it happens every day.
RESO data standards fix most of the daily breaks in data mapping. MLSs are free to modify fields in the native MLS system without disturbing the transport of data to the applications because the data transport is the standard!
It’s a beautiful thing.
With that freedom, MLSs should refer to the RESO Data Dictionary prior to making any field modifications — after all, if you are already changing a field, do it in the manner of the nationwide standard!
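The translation step described above can be pictured as a simple lookup at the moment data leaves the MLS. This is a minimal sketch, not an official map: the native short codes are invented for illustration, while the target names (ListPrice, BedroomsTotal, LivingArea) are real Data Dictionary fields:

```python
# A minimal sketch of transport-time translation: the MLS keeps its native
# field names and converts them to RESO Data Dictionary names only when
# data leaves the system. Native codes here are hypothetical.
NATIVE_TO_RESO = {
    "LP": "ListPrice",          # native short code -> Data Dictionary field
    "BR_TOTAL": "BedroomsTotal",
    "SQFT": "LivingArea",
}

def to_reso(native_record: dict) -> dict:
    """Translate a native MLS record to Data Dictionary field names."""
    return {NATIVE_TO_RESO.get(k, k): v for k, v in native_record.items()}

listing = {"LP": 425000, "BR_TOTAL": 3, "SQFT": 1850}
print(to_reso(listing))
# {'ListPrice': 425000, 'BedroomsTotal': 3, 'LivingArea': 1850}
```

The point of the sketch: the MLS can rename or restructure its native side freely, and only this one translation table has to change — the transport side stays standard.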
The big data problem
The first phase of RESO standards mandates by the NAR was to adopt the RESO Data Dictionary.
Want to know if your MLS is certified? Here is a list of the 74 MLSs that are not RESO-certified.
Every MLS was supposed to adopt the RESO Data Dictionary standard policy by January 1, 2016, and RESO provides certification to allow MLSs to demonstrate compliance with NAR.
There is no excuse. They have had almost two years (since Nov. 2014) to adopt the RESO Data Dictionary and become certified.
The RESO Data Dictionary is the set of field names and values that are sitting on the server that a technology vendor connects to.
The RESO standard considers two distinct forms of transporting that data to application vendors.
The first, and the most popular today, is called RETS — the "Real Estate Transaction Standard."
The RETS standard is a protocol for taking all of the data in the MLS and primarily replicating it in the vendor’s database.
Think of RETS like loading all of the data onto a train, where each car has a piece of data that is in the RESO data format.
The train travels from the MLS station to the vendor station, where it is unloaded and put into the vendor’s product for use by the broker and agent customers.
It’s a one-way train. Sometimes the train takes all of the data (usually once a month or once a week), but most of the time, the train only takes the changes.
The full load can be hundreds of thousands of listings that span 15 years of MLS data with millions of photos, equating to terabytes of data.
Technology companies such as homes.com, Wolfnet, Real Estate Digital, Listingbook and many others that connect to hundreds of MLSs have “trainloads” that add up to a thousand terabytes.
The goal is to keep both the MLS and the vendor database synchronized, so vendors will often tell the train to transport the changes in the database every 15 minutes. A company managing this volume of data is probably incurring a cost of over $10 million in staff and servers to manage it. If you are like me, you have trouble synchronizing your contacts between your phone, tablet and computer.
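The "train that only takes the changes" is an incremental pull keyed on a modification timestamp. Here is a sketch under stated assumptions — an in-memory list stands in for the MLS server, and ListingKey and ModificationTimestamp are real Data Dictionary field names used illustratively:

```python
from datetime import datetime

# A sketch of RETS-style delta replication: the vendor remembers the
# timestamp of its last pull and requests only records modified since then.
# The in-memory list below stands in for the MLS server.
MLS_LISTINGS = [
    {"ListingKey": "A1", "ModificationTimestamp": datetime(2016, 9, 1, 8, 0)},
    {"ListingKey": "B2", "ModificationTimestamp": datetime(2016, 9, 1, 8, 10)},
    {"ListingKey": "C3", "ModificationTimestamp": datetime(2016, 9, 1, 8, 20)},
]

def pull_changes(since: datetime) -> list:
    """Return only the listings modified after the last sync."""
    return [r for r in MLS_LISTINGS if r["ModificationTimestamp"] > since]

last_sync = datetime(2016, 9, 1, 8, 5)
changes = pull_changes(last_sync)
print([r["ListingKey"] for r in changes])  # only B2 and C3 ride this train
```

A real RETS feed adds authentication, paging and photo retrieval on top of this loop, but the synchronization idea is exactly this: compare timestamps, move only the deltas.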
There is a better way: The RESO web API
An API, or application programming interface, is a newer, more efficient way for applications to talk to databases.
By now, you may have heard murmurs about the RESO API, or web API. The technical description is pretty nicely explained on this Wikipedia page, but you really do not need to understand the technical components to appreciate the value of the web API for the future of real estate technology programming.
Today, just about every application that connects to MLS data uses the RETS standard.
That presumes that the train is delivering the data every day and the application developer or vendor is hosting the replicated database of the MLS or numerous MLSs on their server.
The beauty of using an API is that the data only needs to be accessed when it is used.
For example, when a consumer does a property search, the application is calling the MLS database directly rather than the vendor’s database.
Ergo — no need to replicate the terabytes or petabytes of data, or to encumber the tremendous effort and expense. It’s fast, secure, less expensive and so on.
You experience APIs every day on your mobile phone. Have you ever authorized an app like LinkedIn or Facebook to connect to your contacts and find friends? That is an API!
Ever see a Google map on a web page displaying a location? That is an API!
Clearly, Google does not want everyone to have its maps database — and frankly, the size of that database and the number of daily changes make it absurd to duplicate.
That is why everyone uses the Google Maps API, and that is the same rationale that RESO is deploying for the web API.
RESO's modern API construction uses something called OData, a global transport protocol with off-the-shelf tools from Microsoft, Google, Salesforce, Apple and many others.
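An on-demand request of the kind described above is just a URL carrying OData query options. This sketch builds one; the endpoint is a placeholder, while `$filter` and `$top` are standard OData options and ListPrice and StandardStatus are Data Dictionary field names:

```python
from urllib.parse import urlencode

# A sketch of an on-demand query using OData query options.
# The base URL is hypothetical, not a real RESO endpoint.
BASE_URL = "https://api.example-mls.test/Property"

def build_query(max_price: int, top: int) -> str:
    """Build an OData query for active listings at or under a price."""
    params = {
        "$filter": f"ListPrice le {max_price} and StandardStatus eq 'Active'",
        "$top": str(top),  # ask the server for at most this many records
    }
    return BASE_URL + "?" + urlencode(params)

print(build_query(500000, 10))
```

Contrast this with RETS replication: nothing is copied in advance — the MLS server evaluates the filter and returns only the ten matching records at the moment they are needed.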
Where are we now?
Aside from about 10 percent of them, MLSs are now publishing their data to servers in the RESO-certified format. However, they are keeping the legacy servers live for a while to allow technology vendors to migrate. That will eliminate the exponential problem of custom data mapping described above.
Only a handful of MLSs are RESO web-API certified, but this does represent over 200,000 real estate professionals nationwide.
As it turns out, the industry was caught a little flat-footed by the amount of time and effort it would take to achieve web API certification.
The good news is that the floodgates have opened. Two of the three largest vendors have gotten at least one of their MLS clients certified, so the process of getting their hundreds of MLS customers certified is a matter of implementing the API code on each MLS server and submitting their application to RESO.
Only a small handful of real estate technology vendors will use the API out of the gate.
First of all, it is not really available today, so applications in real estate have not been built to leverage the API.
Most legacy applications were built to operate off of the RETS standard. Migrating to an API is a complete rebuild, costing hundreds of thousands or even millions of investment dollars for reprogramming.
WAV Group expects to see the API emerge in new applications for real estate, so it will take some time.
The good news is that it may be revolutionary as technology applications will be able to be developed specifically for the RESO web API, with rapid prototyping and at a lower cost. Data mapping will only happen once regardless of how many MLSs the vendor is connecting to.
The vendor will not need to replicate and store the data. It’s a bright future!
Next big thing
The next big thing for RESO will be a two-way standard for data transport over the RESO web API, a standard already in active development among the experts of the RESO community.
Today, the RESO API standard is for moving data out of the MLS. It’s one-way.
With the rapid adoption of MLS data-sharing and projects such as Upstream, the ability to move listings and other data into the MLS will emerge.
Inserting data into the MLS is tricky. The MLS has very specific rules and regulations for adding information into the native database.
A one-way translation from a custom local native database is not terribly hard. However, reversing that stream to allow rules-compliant data to be added to the MLS is an entirely different translation.
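Why is the reverse direction harder? Incoming data must pass the MLS's own business rules before it can be written. This is a hypothetical sketch — the rule set and field requirements are invented for illustration, not any MLS's actual rules:

```python
# A hypothetical sketch of rule-checking an inbound listing before insert.
# Required fields and rules here are invented for illustration.
REQUIRED_FIELDS = {"ListPrice", "BedroomsTotal", "StandardStatus"}

def validate_for_insert(listing: dict) -> list:
    """Return a list of rule violations; an empty list means safe to insert."""
    errors = [f"missing field: {f}"
              for f in sorted(REQUIRED_FIELDS - listing.keys())]
    if listing.get("ListPrice", 0) <= 0:
        errors.append("ListPrice must be positive")
    return errors

good = {"ListPrice": 425000, "BedroomsTotal": 3, "StandardStatus": "Active"}
bad = {"BedroomsTotal": 3}
print(validate_for_insert(good))  # []
print(validate_for_insert(bad))
```

Every MLS has its own version of these rules, which is exactly why a shared standard for inbound data is the hard part of two-way transport.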
Because of this complexity, we expect RESO to pick up this discussion. RESO does have an optional standard called RETS Update, but it was poorly adopted. It is being phased out, and new RETS and web API standards for loading and managing data in the MLS will be developed.
This evolution will allow MLSs to talk to each other, eliminating duplicate listing input. It will allow broker systems to talk to MLS systems and have two-way data sharing, too!