SAN FRANCISCO (10/31/2003) - Fifteen years ago, Apple Computer Inc. produced a concept video called "Knowledge Navigator." To this day, no more compelling vision of human/computer interaction has been demonstrated. If you've never seen the video, or haven't watched it in a long time, do a Google on "knowledge navigator quicktime" to find a copy. Some aspects of Apple's pre-Web fantasy are now routine: global document search, wireless networking. Others, most notably natural language understanding, remain far out of reach. What interests me is the middle ground of immediate or near-term possibility. Consider this fragment of dialogue from the video:
Professor Bradford: I need to review more recent literature. Show me the articles I haven't read yet.
Software assistant: Journal articles only?
Professor Bradford: Fine.
Software assistant: Your friend Jill Gilbert has published an article about deforestation in the Amazon and its effects on rainfall in the sub-Sahara.
A conversation like this isn't yet possible, but the underlying transactions are. It should be easy to find unread search results. My Web searches are observable, as are the sets of documents they return and the subsets I read. The browser sees all of this activity and remembers it (for a while) in its history and in its cache. A correlation engine could mine these data stores or, thanks to the transparency of HTTP, could monitor Web transactions and capture events directly.
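The unread-results transaction is simple enough to sketch. Here's a minimal illustration, with invented URLs and data structures; a real correlation engine would populate these sets from the browser's history database or from a local HTTP monitor, but the core operation is just a set difference:

```python
# Hypothetical sketch of the unread-results correlation. The URLs and
# collections below are invented for illustration; in practice they
# would come from observed search responses and the browser's history.

# URLs returned by a recent search (captured from the HTTP stream)
search_results = {
    "http://example.org/articles/deforestation",
    "http://example.org/articles/rainfall-models",
    "http://example.org/articles/sub-sahara-climate",
}

# URLs the browser has actually fetched and rendered (from its history)
visited = {
    "http://example.org/articles/rainfall-models",
}

def unread(results, history):
    """Return the search results the user has not yet read."""
    return sorted(results - history)

for url in unread(search_results, visited):
    print(url)
```

Everything the function needs is already observable: the search response names the result set, and the history names the subset that was read.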
Correlating unread search results with friends is harder, but should still be doable. Buddy lists provide important clues. So do patterns of e-mail interaction. For example, I respond to my most valued correspondents more promptly, and at greater length, than I respond to other people. And I tend to create folders and filters for my friends' messages. The timing and quantity of responses would be easy to observe. Like the HTTP pipeline, the SMTP pipeline is transparent. No matter which e-mail application you use, a local monitor can see and correlate your message traffic. Foldering and filtering events, on the other hand, are seen only by e-mail programs, and they're proprietary to those programs.
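Those two signals -- how promptly and at what length I reply -- are enough to rank correspondents. A toy sketch, with invented message data standing in for what a local SMTP monitor would actually observe:

```python
# Illustrative only: the reply log below is fabricated. A real monitor
# would derive these tuples by watching SMTP traffic and pairing
# incoming messages with their replies.

from statistics import median

# (correspondent, hours until your reply, length of your reply in words)
replies = [
    ("jill@example.com", 0.5, 420),
    ("jill@example.com", 1.0, 380),
    ("vendor@example.com", 30.0, 15),
    ("vendor@example.com", 48.0, 8),
]

def rank_correspondents(log):
    """Score correspondents: fast, lengthy replies rank highest."""
    by_person = {}
    for who, hours, words in log:
        by_person.setdefault(who, []).append((hours, words))
    scores = {}
    for who, obs in by_person.items():
        med_hours = median(h for h, _ in obs)
        med_words = median(w for _, w in obs)
        # Long replies raise the score; slow ones dampen it.
        scores[who] = med_words / (1.0 + med_hours)
    return sorted(scores, key=scores.get, reverse=True)

print(rank_correspondents(replies))
```

The scoring formula is arbitrary; the point is that the inputs are all visible in the transparent SMTP pipeline, with no cooperation needed from any particular e-mail client.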
There's an important lesson here I hope desktop applications will learn, courtesy of the emerging paradigm of SOA (service-oriented architecture). In the realm of SOA, events are represented in an open XML format and flow through a transparent pipeline that's open to inspection and subject to intermediation. As a result, Web services management vendors are able to deliver proxy-based meta-services that watch message flows and relate them to SLAs, HIPAA (Health Insurance Portability and Accountability Act) requirements, or business-activity thresholds. Given the right instrumentation, it's not hard to do these things. And when the event stream is transparent, it's not hard to create the instrumentation.
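Because the events are plain XML on the wire, an intermediary can parse each message and apply a policy without any cooperation from the endpoints. A toy illustration of that kind of inspection -- the message format and threshold here are invented, not any vendor's actual schema:

```python
# Hypothetical proxy-side inspection of a transparent XML event stream.
# The <order> format and the threshold are invented for this example.

import xml.etree.ElementTree as ET

POLICY_MAX_AMOUNT = 10000  # an invented business-activity threshold

def inspect(message_xml):
    """Flag any order event whose amount exceeds the policy threshold."""
    root = ET.fromstring(message_xml)
    amount = float(root.findtext("amount", default="0"))
    return "flagged" if amount > POLICY_MAX_AMOUNT else "passed"

event = "<order><id>42</id><amount>25000</amount></order>"
print(inspect(event))  # a real proxy would log or divert flagged traffic
```

With a binary or proprietary message format, this ten-line intermediary would instead require reverse-engineering the protocol -- which is precisely the column's point about transparency.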
Ironically, the graphical desktop popularized the event-driven model that's now being writ large in the Web services network. We need to come full circle: local event streams need to be as open as network event streams, and for the same reasons. At the interface between so-called rich Internet apps and the services cloud, this will happen as a matter of course. Whether it originates in a browser, in Windows, in Java, or in Flash, a SOAP call is a SOAP call. But the apps and services that live on our own machines use a hodgepodge of protocols and message formats. Moore's Law says we can soon abandon these binary protocols and proprietary formats. The inexorable logic of service-oriented architecture spells out why we must.