PhotoSketch is a mind-blowing project from a group of computer science students in China, who have created software that will take your hand-drawn sketch, search the web for corresponding images, and then stitch them together into a brand-new composite image.
A video demonstration from their paper shows the software in action.
While only tangentially related to real estate marketing (though imagine the fun you'd have putting together a listing presentation…), what I think this technology points to is the rise of alternative input methods that will push us beyond the keyboard. The iPhone has already demonstrated the appeal of multi-touch, and I suspect we'll see even more creative uses of this kind of gesture-based interaction over the next few years.
Text-based search has been held back somewhat by limited vocabulary and clumsy filters. So imagine searching for a home by simply drawing what you're looking for on a Surface-like table or a Perceptive Pixel-style wall, and having the software return the most relevant results to you.
It won't be for everyone, but for the more adventurous, the future of real estate search may just be about letting your fingers do the talking.