A reader emailed recently wondering why search engines find property listings on some websites and not on others.
Even more vexing, sometimes search engines treat different listings on the same site inconsistently — one listing on Craigslist will show up in search results, and another listing won’t. Why is that?
Search is one of those arcane arts of the Web. Agents and brokers — who once provided a fairly hefty subsidy of local newspapers through their advertising — now spend a great deal of time and money getting properties and sites to show up in search engine results.
And that’s probably a good thing. If someone is paying to have their house listed, and one of the promises made by the agent involves marketing the property, then doing a bit of online marketing via search engines is in order. But hopefully you’ll get the property sold before you have to boost its ranking on a head term.
How search works
To understand how and why online listings sometimes rank and sometimes don’t, let’s look into how a search engine works.
There are many strategies for boosting rankings through "search engine optimization," or "SEO," some of which can do more harm than good. None of the things I’m going to discuss in this column are "bad" or against Google’s terms of service.
I tend to believe that, clever as I am, it’s unlikely that I’m going to outwit a building full of Stanford math Ph.D.s who are employed full time to prevent Google from being outwitted. People get lucky now and then, but luck doesn’t last.
Now that that’s out of the way, let’s look at how a search engine works and see if we can figure out where and why a listing might show up on some sites but not on others.
First off, there’s the stuff you enter into the multiple listing service. If you are trying to get an MLS listing to rank for something, then the words in that listing need to include words that you want to rank for. Search geeks call these "keywords," even though it might be more than one word or a phrase. For example, if you want people to find your listing when they search for "horse farm," then make sure you use the phrase "horse farm" in your MLS listing.
Most MLS systems have a clunky and limited interface for entering this sort of stuff. Typically the database itself is fairly strict (which, given the junk that gets entered, is probably a good thing).
The place where you’ll have the most opportunity to describe a property is likely a "description" or "remarks" field. Since there are something like 800 different MLSs out there, it would be beyond the scope of this column to identify all the possibilities. But you can find the part where you get to write a paragraph or two describing the property. Make sure it has keywords in it.
If you upload photos (which of course you should) there is another small opportunity for helping search engines figure out what your listing is about. Google and other engines aren’t very good at figuring out what is in a picture, but they use other cues.
One of those cues is the file name itself. Your camera names a picture file something like "DSC109238.jpg," but you might consider renaming it "horse-farm-in-mytown-state.jpg" or something like that, sticking with the "horse farm" theme.
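If you have a whole folder of photos to rename, a few lines of script can do it in bulk. This is a minimal sketch in Python; the "horse-farm-in-mytown-state" slug is a hypothetical example following the theme above, not anything your MLS or camera requires.

```python
import os

def keyword_rename(folder, slug):
    """Rename every .jpg in `folder` from camera names like "DSC109238.jpg"
    to keyword-rich names like "horse-farm-in-mytown-state-1.jpg"."""
    photos = sorted(f for f in os.listdir(folder) if f.lower().endswith(".jpg"))
    renamed = []
    for i, name in enumerate(photos, start=1):
        new_name = f"{slug}-{i}.jpg"
        os.rename(os.path.join(folder, name), os.path.join(folder, new_name))
        renamed.append(new_name)
    return renamed
```

Run it once on a copy of your photo folder before uploading, so the keyword-rich names are what the listing site (and the search engine) sees.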
So that’s the content part, such as it is. MLS systems are rarely very good at this sort of SEO thing because they weren’t designed for it. They were designed to keep as much information as possible private and hidden, not to share it out to the world.
Indexing by spiders and bots
The very first thing a search engine has to do in order to get your listing on page one is simply find the listing somewhere online. This is called "indexing," and it’s one of the more technical areas of SEO. I’ll provide an overview of the main concepts.
Search engines have software that is constantly reading the code of Web pages and then following links on those pages. Those bits of software are called "spiders" because they crawl around the Web. They’re also referred to as "bots."
In order for your listing to show up in search engines, their spiders have to find your content. The only way they arrive at your content is by following a link from somewhere, or because you have given the search engine an updated sitemap (which is just a bunch of links as well).
This is one of several reasons why search geeks get excited about "backlinks." The more links there are to your listing, the more chances of a search spider finding its way to the listing.
A couple things to know about spiders:
- They read only the code of a website. Use your browser’s "view source" capability to see what they see. Can you find your listing in that gobbledygook?
- They can’t register or log in. This means that they don’t follow links that are behind login screens on sites like Facebook, private groups and forums, or in your email inbox.
- They follow specific instructions in a website’s robots.txt file, which may prevent them from visiting certain parts of a site. Some WordPress installations are configured to be hidden from search engines in this manner.
- They generally don’t credit content embedded in iframes to the framing page (yourkwagent sites, and many MLS sites, for example).
- They are primarily focused on finding links and moving on.
Once a spider finds a page, the search engine needs to make a copy of that page somewhere. This is called "the cache."
Keeping an archive of the entire Internet is no mean feat. It requires a lot of computing resources and drive space. Keeping that data readily accessible to serve up a search result on a moment’s notice is challenging as well.
Pages with fewer lines of code are cheaper for a search engine to fetch and cache, so its spider will likely spend more of its limited time on the site grabbing up as many pages as possible. It’s a simple ROI equation for the search engine: more potential search results for fewer resources.
The reader who wrote in noticed, for example, that Craigslist was one of the sites that frequently "got the listing" online. Craigslist has efficient code and is therefore easy for Google to cache. Efficiently coded sites don’t have to look as ugly as Craigslist though.
The ranking algorithm
This is the part that gets all the attention (cue angelic choir).
Once all of those Web pages are stored in the cache, search engines have to apply some logic and math and try to figure out which one is the best result for a given search phrase.
The words you put into your description will play a part here. But so will a wide variety of other factors including whether a site has many other trusted sites linking to it (backlinks), whether the page loads quickly (user experience) and whether the search engine feels the content is any good (content quality).
For simply getting a listing to rank anywhere, much of this will be outside your control. You can’t control whether someone else’s site loads quickly. You can’t control whether other people link to the listing or not (though you can ask and so on). You can write a good description (quick tip: no more all caps and no more acronym soup), but you might not be able to control what other garbage content appears on the same page.
After all that work has been done, eventually someone will type "horse farm" into the search engine and get a list of 10 "organic" links sorted according to the ranking step. MLS listings have a special challenge in this area that most other content doesn’t have.
MLS listings, by their nature, are exactly the same everywhere they appear. That’s usually part of the agreement.
If you wrote the best optimized listing ever for "horse farm" and then put your listing on Craigslist, your site, your IDX (i.e., everyone else’s website) and all the aggregators, then the ranking step might determine that your listing was the best result for all 10 slots.
The problem with that, for search engines, is that no one wants to be handed the exact same thing in slightly different packages. Search users would get really annoyed and stop using the site, and then the search engine would go out of business.
So when it comes to display, search engines know to show only one version of the content. How do they decide? Typically, they try to show whichever version they spidered first: the one they consider the original.
Sometimes they will disregard the original if another site with an extremely high trust factor also carries the same content. If your website and the New York Times carry the exact same content, the search engine might choose the Times as the site with the most authority.
But typically the choice is very focused on which bit of content appeared first, and all of the other copies are relegated to the back of the line.
It is likely this behavior that causes the wide variation the reader noticed in where a listing shows up in search results. Sometimes the spiders find Craigslist first, sometimes they find a competitor’s website’s Internet Data Exchange (IDX) feed, and so on.
The primary business of a search engine is to make money by sprinkling ads around on the display page. The more individual pages available to search engines, the more potential search results they can show and therefore the more potential advertising opportunities they have.
There are two ways to use this to your advantage. One is the simple easy way: Buy some advertising and get on page one in less than 24 hours. In many cases, this will be easier and less expensive than mounting an SEO effort.
The other way is to make sure you have lots of pages of content for the search engine to index. If you have the choice between showing one page with all of the information about a listing or breaking that information up into 10 pages, go with 10 pages.
For listings that are your own, in most cases you can add material beyond what is in the MLS, providing an opportunity to expand upon the information and make more pages.
This page focus will likely change over time as the Web continues to shift from a static, page-based paradigm to a dynamic, event-based paradigm. But for now, the process outlined above should help you understand what is involved in getting your listing to rank.
Gahlord Dewald is the president and janitor of Thoughtfaucet, a strategic creative services company in Burlington, Vt.