Solutions, not problems

Editor’s note: Inman News Publisher Bradley Inman wrote a blog entry offering a solution for the online property listings fiasco that has put the National Association of Realtors in a heated battle with the U.S. Justice Department. We’d like to hear your comments on his idea. Please see the comments section of the blog to add your two cents.

Sitting on a National Association of Realtors panel at the Association Executive Institute on Saturday in Reno, Nev., I was reminded of the many good things the powerful trade group does: lobbying the U.S. Congress to preserve the mortgage interest deduction, creating the Center for Realtor Technology, putting on its trade shows, producing its publications, spreading home ownership here and abroad, and running powerful education and networking efforts.

On a few issues, however, NAR has its head in the sand or is gripped by irrational fear, sometimes taking action that hurts more than helps the industry and the cause of home ownership.

The online listings debacle is one of those, and NAR has gotten itself in hot water with the Department of Justice as a result. The DOJ last fall filed an antitrust lawsuit against the trade group, accusing it of adopting overly restrictive policies for online property listings display.

Here is a solution: Create and distribute a simple technology application, show it to the DOJ, and maybe the U.S. government will back off.

This is how it would work: NAR creates an API (application programming interface) for MLS listing data. This would not be much different from the current IDX technology solution, except that NAR would create a standardized application, much as Google did with its mapping solution.

That API would come with a set of rules or protocols that everyone must follow to use the application, like any open source software off the Web (e.g., maps, blogging tools, productivity software).
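To make the idea concrete, here is a rough sketch of what such a standardized listings interface might look like, written in TypeScript purely for illustration. The names and fields (MlsListing, ListingsApi, searchListings and so on) are hypothetical, not any existing NAR or IDX specification.

```typescript
// Hypothetical sketch of a standardized NAR listings API.
// All names and fields are illustrative, not an actual specification.

interface MlsListing {
  mlsId: string;          // identifier of the contributing MLS
  listingId: string;      // listing identifier within that MLS
  address: string;
  price: number;
  bedrooms: number;
  bathrooms: number;
  listedDate: string;     // ISO 8601 date
}

interface SearchQuery {
  city?: string;
  minPrice?: number;
  maxPrice?: number;
  minBedrooms?: number;
}

// Every consumer would call the same interface, the way every site calls
// the same Google Maps API, rather than negotiating custom feeds.
interface ListingsApi {
  searchListings(query: SearchQuery, licenseKey: string): Promise<MlsListing[]>;
}
```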

Companies that use this API and follow the protocols would have access to a one-stop listing data warehouse to which local MLSs would contribute their data. The listings would be watermarked, copyright protected and bundled with the API, limiting changes. The API would offer business logic and a payments platform. Software developers would contribute by writing applications and features on top of the NAR offering.
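Continuing the sketch above, each record in the warehouse might carry watermark and copyright metadata and be served only to callers holding a valid license, so the protocol itself enforces the rules of use. Again, every name here is hypothetical.

```typescript
// Hypothetical listing record as stored in the shared data warehouse.
interface WarehousedListing extends MlsListing {
  watermark: string;       // traceable mark identifying the licensed recipient
  copyrightNotice: string; // e.g. "(c) 2006 Example MLS. All rights reserved."
}

// The API would refuse to return data unless the caller presents a valid
// license key, letting the protocol rather than litigation police access.
function assertLicensed(licenseKey: string, validKeys: Set<string>): void {
  if (!validKeys.has(licenseKey)) {
    throw new Error("No valid data license on file for this caller.");
  }
}
```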

Users, who could be anyone, would pay a small fee for a data license (distributed to the MLSs in fractional shares based on each MLS's number of listings) and for the API, funds that NAR could use to police data integrity and copyright infringement, which is easy to do today with new technologies.
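As a worked example of that fractional-share idea (the figures and MLS names are invented): if three MLSs contributed 50,000, 30,000 and 20,000 listings, they would receive 50 percent, 30 percent and 20 percent of the license fees, respectively. A sketch of that allocation:

```typescript
// Hypothetical allocation of license-fee revenue to MLSs,
// pro rata by the number of listings each contributes.
function distributeFees(
  totalFees: number,
  listingCounts: Map<string, number>, // mlsId -> number of contributed listings
): Map<string, number> {
  let totalListings = 0;
  for (const count of listingCounts.values()) totalListings += count;

  const payouts = new Map<string, number>();
  for (const [mlsId, count] of listingCounts) {
    payouts.set(mlsId, totalFees * (count / totalListings));
  }
  return payouts;
}

// Example: $100,000 in fees split among three hypothetical MLSs
// works out to $50,000, $30,000 and $20,000.
const payouts = distributeFees(100_000, new Map([
  ["mls-a", 50_000],
  ["mls-b", 30_000],
  ["mls-c", 20_000],
]));
```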

This would not be too different from the rules around stock exchange data, a system that has worked very well in the securities marketplace, resulting in equities data being distributed far and wide, helping to promote stock sales.

Of course, NAR must endorse this strategy of open source and open data, which fits perfectly with today’s world of data access and transparency, evidenced by everything from Sarbanes-Oxley to recent Web 2.0 innovations. But it would also give NAR control over the use of the data in a new and exciting way: not control by limiting it to a few favored companies, but by promoting and distributing the home listings information to everyone with NAR’s help.

By letting go, NAR could actually be in the driver’s seat.

NAR would again play the role of agnostic advocate for home ownership, as it does with legislation. It would give up a little bit of control but would assume an even bigger, more productive, more positive, more opportunistic and less fearful role in the future of real estate and the Web.

Thoughts?

Leave your comments at the Inman News blog by clicking here.