
Contextual Search Using Screen Readers

This idea is part of the A Dollar Worth of Ideas series, which collects potential open source, research, or data science projects and contributions for people to pursue. I would be interested in mentoring some of them; just contact me for details.


In this fast-paced world, people do not have time to write search queries longer than a few words. What about a query-less search approach? It might be possible to run a desktop search program that continuously monitors the user's activities, anticipates the user's information needs, and presents that information in a non-obtrusive manner.
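As a rough sketch of that loop, here is a minimal Python skeleton; all the helper names are hypothetical placeholders, fleshed out in the sketches further below:

```python
import time

def capture_screen_text() -> str:
    """Placeholder: grab the current screen text (see the OCR sketch below)."""
    return ""

def generate_query(previous: str, current: str) -> str:
    """Placeholder: turn screen-text changes into a query (sketched below)."""
    return ""

def present_results(query: str) -> None:
    """Placeholder: run the query and refresh the navigation bar."""
    print("would search for:", query)

def monitor_loop(poll_seconds: float = 5.0) -> None:
    """Continuously sample the screen and refresh the suggestions,
    without the user ever typing a query."""
    previous = ""
    while True:
        current = capture_screen_text()
        query = generate_query(previous, current)
        if query:
            present_results(query)
        previous = current
        time.sleep(poll_seconds)
```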

A potential implementation can piggyback on accessibility APIs such as the ones used by screen readers (specialized software that reads the contents of the screen aloud to blind computer users). Alternatively, a screen capture followed by OCR can be performed.
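For the screen-capture route, here is a minimal sketch of `capture_screen_text`, assuming Pillow and pytesseract are installed (plus a Tesseract binary); the accessibility-API route would replace this function with platform-specific calls such as AT-SPI on Linux or UI Automation on Windows:

```python
from PIL import ImageGrab   # Pillow; full-screen grabs on Windows/macOS
import pytesseract          # thin wrapper around the Tesseract OCR engine

def capture_screen_text() -> str:
    """Screenshot the whole screen and OCR it into plain text.
    Noisier than an accessibility API, which returns the text that
    applications actually render rather than a pixel-level guess."""
    screenshot = ImageGrab.grab()
    return pytesseract.image_to_string(screenshot)
```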

From the text on the screen (and, most importantly, its changes over time), queries against the desktop search index can be performed, updating a navigation bar somewhere on the screen with the files most relevant to the task at hand. The query-generation component is the most complex part of this system and the piece most open to innovation.
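One naive strategy, sketched below: treat terms that newly appeared since the previous screen snapshot as the user's current focus, keep the most frequent few, and run them as a disjunctive query. The index here is assumed to be a prebuilt Whoosh index over the user's files, with an indexed `content` field and a stored `path` field; any desktop search engine with a query API would work equally well.

```python
import re
from collections import Counter

from whoosh.index import open_dir
from whoosh.qparser import OrGroup, QueryParser

STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "for"}

def _tokens(text: str) -> Counter:
    """Lowercase words of three or more letters, with counts."""
    return Counter(re.findall(r"[a-z]{3,}", text.lower()))

def generate_query(previous: str, current: str, max_terms: int = 5) -> str:
    """Pick the most frequent terms that appeared since the last
    snapshot, on the theory that new text reflects the current task."""
    new_terms = _tokens(current) - _tokens(previous)
    picked = [t for t, _ in new_terms.most_common() if t not in STOPWORDS]
    return " ".join(picked[:max_terms])

def search_desktop_index(query_text: str, limit: int = 10):
    """Run the generated terms as an OR query against a prebuilt
    Whoosh index (assumed to live in 'indexdir')."""
    ix = open_dir("indexdir")
    parser = QueryParser("content", ix.schema, group=OrGroup)
    with ix.searcher() as searcher:
        hits = searcher.search(parser.parse(query_text), limit=limit)
        return [hit["path"] for hit in hits]  # assumes a stored 'path' field
```

Smarter variants could weight terms by their rarity in the index, decay old screen text gradually instead of diffing two snapshots, or throttle queries when the screen is static.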

The architecture looks like this:

[Architecture diagram]

This system might be particularly handy for the new generation of students. Of course, recommendation systems have to live with their heads hanging in shame at the prospect of becoming the next Clippy. This one will be no different.