Earlier this month, Apple introduced iOS 9 with new search and Siri features. In iOS 9, users will be able to search for both specific activities and content contained in iOS applications. In practice, that means a user should be able to search for “sunburn,” find information on treating sunburns provided by an iOS medical application, and tap the result to open it directly in that application. These features will also allow users to reference whatever they are looking at in a reminder created through Siri. That is, when looking at an Amazon product page, users will be able to tell Siri “Remind me to buy this tonight,” and Siri will add a reminder with a link to the page.
Prior to iOS 8, an application’s functionality and content were indivisible from the application itself. If a user was looking at a photo in their photo library and wanted to edit it with a more powerful editing application they had installed, they had to leave Photos, open the editing application, and find and open the photo again there. If the user needed a text document they had stored in Pages, they had to launch Pages. In iOS 8, Apple eliminated the redundancy in the former example through extensions, which let developers atomize their application’s functionality so users can invoke it outside the scope of the application itself.1
The latter example is still true in iOS 8: content remains indivisible from the application itself. iOS 9, however, begins to break content and tasks out of the application by making them searchable through what used to be called Spotlight on iOS and is now simply Search.
The Search and Siri reminder features are genuinely useful. It is convenient to be able to swipe over to the resurrected Search page on the home screen and type in, say, “Denver” to find my flight or Airbnb reservation. What I find more interesting than the user-facing features here, though, are the tools provided to developers to make this possible, and the direction that task and content search suggest iOS may be heading in.
To allow iOS’s new Search feature to surface tasks and content contained within applications, developers must indicate to the system what within their application is content that should be surfaced, and what type of content it is (image, audio, event, etc.). Developers do much the same thing for tasks. Somewhat similarly, extensions indicate to the system what kind of content they can consume.
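To make this concrete, here is a minimal sketch of what indexing a piece of content with iOS 9’s Core Spotlight API looks like; the identifiers, titles, and keywords are hypothetical stand-ins for real application content.

```swift
import CoreSpotlight
import MobileCoreServices

// Describe the content and its type (here, text) so Search can surface it.
let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeText as String)
attributes.title = "Treating a Sunburn"
attributes.contentDescription = "First-aid steps for treating a mild sunburn."
attributes.keywords = ["sunburn", "first aid", "skin care"]

// Wrap it in a searchable item; the unique identifier is what the system
// hands back later to deep-link into the application.
let item = CSSearchableItem(uniqueIdentifier: "article-sunburn",
                            domainIdentifier: "articles",
                            attributeSet: attributes)

CSSearchableIndex.default().indexSearchableItems([item]) { error in
    if let error = error {
        print("Indexing failed: \(error.localizedDescription)")
    }
}
```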
This is referred to as “deep linking,” because it allows users to follow a “link” to some task or content deep within an application, much like clicking a link in Google that takes you to a news article within a website, as opposed to going to the site’s home page and navigating its hierarchy to reach the article. “Deep linking,” while apt, is also somewhat misleading, because this enables much more than search. When developers update their applications to take advantage of Apple’s new APIs for identifying content and tasks to the system, they will be helping the system structure what data is on the user’s device, and what kind of data it is. The system will know what content is on a user’s device (photos, contacts, events such as hotel reservations, music), what kind of content it is, and what kinds of content applications provide.
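The task side works much the same way. Here is a rough sketch, using NSUserActivity to advertise what the user is doing, plus the app delegate callback that handles the deep link when a search result is tapped; the activity type and userInfo keys are hypothetical.

```swift
import UIKit
import CoreSpotlight

// Advertise the user's current task so the system can index and restore it.
func startViewingArticle(id: String, title: String, in viewController: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.app.view-article")
    activity.title = title
    activity.userInfo = ["articleID": id]     // enough to restore this screen later
    activity.isEligibleForSearch = true       // new in iOS 9
    viewController.userActivity = activity
    activity.becomeCurrent()
}

// In the app delegate: when the user taps a search result, the system hands
// back the item's identifier so the app can jump straight to the content.
func application(_ application: UIApplication,
                 continue userActivity: NSUserActivity,
                 restorationHandler: @escaping ([UIUserActivityRestoring]?) -> Void) -> Bool {
    if userActivity.activityType == CSSearchableItemActionType,
       let id = userActivity.userInfo?[CSSearchableItemActivityIdentifier] as? String {
        print("Deep link to item \(id)")      // navigate to the matching screen here
        return true
    }
    return false
}
```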
Using these tools, we could begin to construct an understanding of what the user is doing. Applications are indicating to the system what tasks the user is performing (editing a text document, browsing a web page, reading a book), as well as what kind of content they are interacting with. From this, we can make inferences about the user’s intent. If the user is reading a movie review in the New York Times application, they may want to see show times for that movie at a local theater. If the user is a student writing an essay about the Ming dynasty in China, they may want access to books they have read on the topic, or other directly relevant sources (and you can imagine such a tool being even more granular than “the Ming dynasty”). Apple is clearly moving in this direction in iOS 9 through what it is calling “Proactive,” which notifies you when it is time to leave for an appointment, but there is the possibility of doing much more, and doing it across all applications on iOS.
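No such inference exists as an API today, but to show how little is needed once activities carry types and keywords, here is a toy sketch of the idea; the content types, keywords, and intents are all invented for illustration.

```swift
// A declared activity, reduced to what the system would already know
// from the search APIs: a content type and some keywords.
struct ObservedActivity {
    let contentType: String   // e.g. "movie-review", "text-document"
    let keywords: [String]    // e.g. ["Inside Out"], ["Ming dynasty"]
}

// Hypothetical intents the system might infer from an activity.
enum InferredIntent {
    case findShowTimes(movie: String)
    case gatherSources(topic: String)
    case none
}

// A toy inference step: map the kind of content being viewed to a likely intent.
func inferIntent(from activity: ObservedActivity) -> InferredIntent {
    switch activity.contentType {
    case "movie-review":
        guard let title = activity.keywords.first else { return .none }
        return .findShowTimes(movie: title)
    case "text-document":
        guard let topic = activity.keywords.first else { return .none }
        return .gatherSources(topic: topic)
    default:
        return .none
    }
}
```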
Additionally, extensions could be the embryonic stage of application functions broken out of the application and its user interface shell: single-purpose utilities that take in some kind of content, transform it, and provide something else back. A Yelp “extension” (herein I will call them “utilities,” to distinguish what an extension currently is from what I believe it could evolve into) could, for example, take in a location and food keywords, and provide back highly rated restaurants associated with those keywords. A Fandango utility could similarly provide movie show times, or even allow the purchase of movie tickets. A Wikipedia utility could provide background information on any subject. And on and on.
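To be clearer about what I mean by a utility, here is a speculative sketch of the interface one might expose. Nothing here is a real iOS API; the protocol, the content types, and the Yelp-like example are all invented.

```swift
// A face-less utility: declares what content types it consumes and produces,
// and transforms one into the other. Entirely hypothetical.
protocol Utility {
    static var inputTypes: [String] { get }   // e.g. ["location", "food-keywords"]
    static var outputType: String { get }     // e.g. "restaurant-listings"
    func transform(_ input: [String: Any],
                   completion: @escaping ([[String: Any]]) -> Void)
}

// A Yelp-like utility: location and food keywords in, rated restaurants out.
struct RestaurantUtility: Utility {
    static let inputTypes = ["location", "food-keywords"]
    static let outputType = "restaurant-listings"

    func transform(_ input: [String: Any],
                   completion: @escaping ([[String: Any]]) -> Void) {
        // A real implementation would query a service; this returns canned data.
        completion([["name": "Tasty Thai", "rating": 4.5],
                    ["name": "Noodle House", "rating": 4.0]])
    }
}
```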
In a remarkable piece titled *Magic Ink*, Bret Victor describes what he calls the “Information Ecosystem”: a platform where applications (what he calls “views”) indicate to the system some topic of interest from the user, and utilities (what he calls “translators”) take in some kind of content and transform it into something else. The platform then provides these as inputs to all views and translators. It might provide some topic of interest inferred from the user; as I described above, this may be a text document the user is writing about the Ming dynasty, or a movie review they are reading in a web browser. Applications and translators can then consume these topics of interest, as well as the information other utilities provide. The Fandango utility I describe above could consume the movie review’s keywords, for example, and provide back to the platform movie show times in the area. The Wikipedia utility could consume the text document, and provide back information on the Ming dynasty.
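As a toy model of how views, translators, and the platform fit together (again, pure speculation: the class, the content types, and the Fandango-like translator are invented):

```swift
// A sketch of Victor's platform: views post topics of interest, translators
// turn topics into new typed content, and interested views receive the result.
final class InformationPlatform {
    // Translators, keyed by the topic type they consume; each returns typed content.
    private var translators: [String: [(String) -> (type: String, payload: [String: Any])]] = [:]
    // Views, keyed by the content type they want to observe.
    private var observers: [String: [([String: Any]) -> Void]] = [:]

    func addTranslator(for topicType: String,
                       _ translate: @escaping (String) -> (type: String, payload: [String: Any])) {
        translators[topicType, default: []].append(translate)
    }

    func observe(contentType: String, _ handler: @escaping ([String: Any]) -> Void) {
        observers[contentType, default: []].append(handler)
    }

    // A view posts a topic of interest; matching translators run, and their
    // output fans out to every view observing that content type.
    func post(topic: String, ofType topicType: String) {
        for translate in translators[topicType, default: []] {
            let (type, payload) = translate(topic)
            for handler in observers[type, default: []] {
                handler(payload)
            }
        }
    }
}

// Usage: a Fandango-like translator turns a movie title into show times, and
// any interested view receives them without knowing where they came from.
let platform = InformationPlatform()
platform.addTranslator(for: "movie-title") { title in
    (type: "show-times", payload: ["movie": title, "times": ["7:00 PM", "9:30 PM"]])
}
platform.observe(contentType: "show-times") { payload in
    print("Show times:", payload)
}
platform.post(topic: "Inside Out", ofType: "movie-title")
```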
What is important here is that the user intent inferred from what the user is doing and what specific content they are working with, together with the utilities described above, could be chained together and used by separate applications in ways that were never explicitly designed beforehand. Continuing the movie review case, while the user is reading a review of Inside Out in the New York Times application, they could invoke Fandango to find local show times and purchase tickets. This could occur either by opening the Fandango application, which would immediately display the relevant show times, or through Siri (“When is this playing?”). More interesting, one could imagine a new kind of topical research application that, upon noticing that the user is writing an essay related to the Ming dynasty, pulls up any number of relevant sources, from Wikipedia entries (via the Wikipedia utility) to papers and websites. Perhaps the user has read several books about the Ming dynasty in iBooks, and has highlighted passages and added notes. If iBooks identifies that information to the system, such a research application could bring up not just the books, but the specific sections relevant to what the user is writing, along with the passages they highlighted or annotated. Through the platform Victor describes, the research application could do so without being explicitly designed to interface with iBooks. As a result, the work the user has done in one application can flow into another application in a new form and for a new purpose.
To further illustrate what this may allow, I am going to stretch the research application example above. Imagine that a student is writing an essay on global warming in Pages on the iPad in the left split-view, and has the research application open on the right. As the student writes, the text will be fed into a topic processor, and iOS will identify “global warming” as a topic of interest. Because earlier that week they added a number of useful articles and papers to Instapaper from Safari, Instapaper will see “global warming” as a topic of interest, and serve up to the system all articles and papers related to the topic. Then a science data utility the student installed at the beginning of the semester will also take in “global warming” as a topic, and will offer data on the change in global temperature since the Industrial Revolution. The research application, open on the right side of the screen, will see the articles and papers brought forward by Instapaper and the temperature data provided by the science data utility, and make them immediately available. The application could group the papers and articles together as appropriate, and show some kind of preview of the temperature data, which could then be opened in a charting application (say, Numbers) to create a chart of the rise in temperatures to put in the essay. And the research application could adjust what it provides as the student writes, without them doing anything at all.
What we would have is the ability to do research in disparate applications, and have a third application organize that research for the user in a relevant manner. Incredibly, that application could also provide access to relevant historical data. All of this would be done without the research application needing to build in the ability to search the web and academic papers for certain topics (although it could, of course). Rather, the application is free to focus on organizing research in a meaningful and useful way in response to what the user is doing, and its developers need only design for content types, not for very specific data formats coming from very specific sources.
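The research application’s side is worth sketching too, because it shows what “designing for content types” means in practice: the pane below groups items by their declared type without knowing which application produced them. All names and types here are hypothetical.

```swift
// Speculative sketch: a research pane that consumes typed content from the
// platform, whatever its source, and groups it for display.
struct ContentItem {
    let type: String            // e.g. "saved-article", "dataset"
    let title: String
    let payload: [String: Any]
}

final class ResearchPane {
    private(set) var sections: [String: [ContentItem]] = [:]

    // Called whenever a utility produces content for the current topic of
    // interest (here, "global warming").
    func receive(_ item: ContentItem) {
        sections[item.type, default: []].append(item)
    }
}

// Instapaper-style and science-data-style items arrive independently; the
// pane organizes them by content type, not by the application they came from.
let pane = ResearchPane()
pane.receive(ContentItem(type: "saved-article",
                         title: "The Economics of Climate Change",
                         payload: ["url": "https://example.com/article"]))
pane.receive(ContentItem(type: "dataset",
                         title: "Global temperature anomaly since 1880",
                         payload: ["unit": "°C"]))
```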
Utilities, too, would not necessarily need to be installed with a traditional application, or “installed” at all. Because they are faceless functions, they could be listed and installed separately from applications themselves, and Apple could integrate certain utilities into the operating system to provide system-wide functionality without any work on the user’s part, much as it currently integrates Fandango for movie times and Yelp for restaurant data and reviews. Siri would obviously be a beneficiary of this, but all applications could be smarter and more powerful as a result.
Apple hasn’t built the Information Ecosystem in iOS 9. While iOS 9’s new search APIs allow developers to identify what type of content something is, we do not yet have more sophisticated types (like book notes and highlights), nor a system for declaring new types in a general way that all applications can see (like a “movie show times” type).2 Such a system will be integral to realizing what Victor describes, and it is by no means a trivial problem. But the component parts are increasingly coming into existence. I don’t know whether that is the direction Apple is heading, but it certainly *could be*, based on the last few years of iOS releases. What is clear, though, is that Apple is intent on inferring more about what the user is doing and what they intend, and on providing useful features with that knowledge. iOS 7 began passively remembering frequently visited locations and indicating how long it would take to get to, say, the user’s office in the morning. iOS 9 builds on that concept by notifying the user when they need to leave for an appointment to arrive on time, and by automatically starting a playlist the user likes when they get in the car. Small steps, but the direction of those steps is obvious.
Building the Information Ecosystem would go a long way toward expanding the power of computing: breaking applications, those discrete and indivisible units of data and function, into their component parts; freeing that data to flow into other parts of the system and to capture user intent; and freeing that functionality to augment other areas in unexpected ways.
I believe that the Information Ecosystem ought to be the future of computing. I hope Apple is putting the blocks in place to build something like it.