Wednesday, December 7, 2016

The End of Sandboxes: Changing the Unit of Design

Anyone working in design and technology—and probably anyone living in the modern world—will have noticed the swiftness with which we went from using terms like the “World Wide Web” to “The Internet of Things.”
In a short time, we’ve gone from designing and developing web pages – points on the map of the information superhighway – accessed through browsers, to entire packages of interconnected applications, delivering content and experiences across an array of form factors, resolutions, and interaction patterns.
The question then is, how does this change the role of the designer?

The role of the designer tomorrow

The object of our practice was once placing an intricate series of boxes, type, and buttons onto a relatively reliable browser screen (and eventually a mobile app screen).
This was the breadth of the experience that we delivered to our users. We must now consider that our products exist as components within complex systems of components that interact with, swap information with, and sometimes even disappear behind other parts.
Recognizing the end of apps is just the beginning.
As Paul Adams of Intercom writes in his piece The End of Apps as We Know Them, “In a world of many different screens and devices, content needs to be broken down into atomic units so that it can work agnostic of the screen size or technology platform.”
And this goes beyond just content. We have to consider how every feature or service we offer to our end users exists within and cooperates with other components of multiple systems and platforms.
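To make this concrete, here’s a rough sketch in TypeScript of what such an atomic unit might look like. The shape and every field name are my own illustration, not any platform’s standard:

```typescript
// A hypothetical "atomic unit" of content: the smallest self-contained
// piece we can hand to any surface -- a watch, a notification tray, a
// voice assistant -- without assuming a particular screen.
interface AtomAction {
  id: string;
  label: string;                          // "Reply", "Cancel ride", ...
  kind: "open" | "confirm" | "text-input";
}

interface ContentAtom {
  id: string;
  headline: string;       // one-line summary, usable on tiny displays
  body?: string;          // optional longer form for larger screens
  spokenText?: string;    // optional phrasing tuned for voice delivery
  actions: AtomAction[];  // things a user can do wherever the atom lands
}

// Example: one message, ready to be rendered by whatever platform receives it.
const newMessage: ContentAtom = {
  id: "msg-1042",
  headline: "Dana: Running 10 minutes late",
  body: "Traffic on the bridge. Start without me if you need to.",
  spokenText: "Dana says she is running ten minutes late.",
  actions: [
    { id: "quick-reply", label: "Reply", kind: "text-input" },
    { id: "open-thread", label: "Open conversation", kind: "open" },
  ],
};
```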

How will users receive data tomorrow?

At the beginning of the mobile revolution, designers focused on creating apps that delivered a similar experience: well-designed, controlled, sandboxed environments that users entered to consume content or accomplish a task, and then exited.
Most of this happened within a screen, and designers had a good deal of control over the delivery.
However, the range of mobile devices has exploded from screen-based mini-computers to wrist-wrapping message centers and devices that remotely control your home thermostat. The paradigm has shifted dramatically.
Today, users get snippets from multiple sources packaged and delivered staccato through notifications and alerts. This shift has major implications for a range of industries, including news and media outlets that came to rely on being somewhat siloed destinations for information.
The context in which users receive data is changing.
For example, a recent study by the Pew Research Center found that nearly two-thirds of U.S. adults get news from social media, and nearly a fifth use it as their preferred source. And according to The Reuters Institute, in places like Korea, Japan, and the Czech Republic, the majority of users rely on news aggregators as their primary gateway to the news.

How will users interact with data tomorrow?

The proliferation of diverse, interconnected technologies has also meant that a single product will be experienced and interacted with differently in different contexts.
An alert to a message might appear on a user’s smart watch while they’re en route to a meeting.
Rather than having to find and open the messaging app, the user can respond within the notification itself, perhaps sending a pre-composed quick reply with one tap, and engage with the message more deeply on another device later.
This means that our atomic unit has to be deliverable through varied experiences tailored to the context of interaction.
Chances are, the user is not composing a long-form email reply on a jog. They probably won’t be opening up a laptop for the sole purpose of using the one-button reply that was so handy on their wearable. They might never even read the notification on a screen, and instead listen to the content through a voice-activated virtual assistant.
Paul Adams writes, “In a world where notifications are full experiences in and of themselves, the screen of app icons makes less and less sense. Apps as destinations makes less and less sense. Why open the Facebook app when you can get the content as a notification and take action — like something, comment on something — right there at the notification or OS level. Why open up separate apps for your multi-destination flights and hotels in a series of vendor apps when a travel application like TripIt can present all of that information at once—and even allow you to complete your pre-boarding check-in? In some cases, the need for screens of apps won’t exist in a few years, other than buried deep in the device UI as a secondary navigation.”
Rather than having to use the alert or notification simply as a redirect to an app where the user is jerked from one context and placed into another with a different set of interaction patterns, the user can take action immediately.
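As a hedged sketch of that idea, with every name and shape invented for illustration: if the notification carries its own actions, most of them can complete in place, and only one path falls back to the app as a destination.

```typescript
// Hypothetical actionable notification: the actions travel with the
// payload, so the OS can offer them without ever opening our app.
interface NotificationAction {
  id: string;
  label: string;
  kind: "quick-reply" | "confirm" | "open-app";
}

interface ActionableNotification {
  title: string;
  body: string;
  actions: NotificationAction[];
}

// The OS reports which action the user chose; only "open-app" treats
// the app as a destination.
function handleAction(
  note: ActionableNotification,
  chosenId: string,
  typedReply?: string
): string {
  const action = note.actions.find((a) => a.id === chosenId);
  if (!action) return "dismissed";
  switch (action.kind) {
    case "quick-reply":
      return `sent from the notification: "${typedReply ?? "On my way!"}"`;
    case "confirm":
      return "done, no app launch needed";
    case "open-app":
      return "deep-link into the app for the full thread";
  }
}

const ping: ActionableNotification = {
  title: "New message from Dana",
  body: "Are we still on for 3pm?",
  actions: [
    { id: "reply", label: "Reply", kind: "quick-reply" },
    { id: "open", label: "Open conversation", kind: "open-app" },
  ],
};
console.log(handleAction(ping, "reply", "Yes, see you then"));
```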

How does how we interact with data influence the interface?

The rise of messaging apps that operate with an almost command-line-style interface takes this even further.
Apps like the explosively popular WeChat allow users to send and receive money, call a cab and receive updates on its arrival time, search for a restaurant and book a table, or book a hotel – all through a text-based interface that neatly removes the need to navigate to individual apps to use their functionality.
In The Future of UI? Old School Text Messages, Kyle Vanhemert looks to the proliferation of text-based platforms that increasingly replace the need for the GUI. The services that apps provide will be increasingly accessed outside of the apps themselves.
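As a toy sketch of the pattern (the commands and services below are invented for illustration, not WeChat’s actual API), the entire interface can reduce to a text router:

```typescript
// A toy command router in the WeChat spirit: one text thread fronts many
// services, so the user never navigates to a separate app for each task.
type CommandHandler = (args: string) => string;

const commands = new Map<string, CommandHandler>([
  ["cab",   (dest) => `Cab booked to ${dest}. ETA updates will follow here.`],
  ["table", (q)    => `Found 3 restaurants for "${q}". Reply 1-3 to book.`],
  ["pay",   (args) => `Payment of ${args} sent.`],
]);

function route(message: string): string {
  const [verb, ...rest] = message.trim().split(/\s+/);
  const handler = commands.get(verb.toLowerCase());
  return handler
    ? handler(rest.join(" "))
    : "Try: cab <destination> | table <cuisine> | pay <amount> to <contact>";
}

// The whole "UI" is a message exchange:
console.log(route("cab 221B Baker Street"));
console.log(route("pay $20 to Sam"));
```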
When we start to consider designing for the world of IoT, we’re abdicating even more control over the interface—creating products and services that might not be accessed through a screen at all.
When the touch-based screen revolutionized mobile devices, designers suddenly had a treasure trove of new modes of interaction to leverage, like swipes, pinches, taps, double-taps, long-taps, and tap-and-drag, all of which went well beyond the point-and-click.
Now users are swiping and pinching in thin air or waving tagged accessories in front of sensor-equipped devices, where we can’t explicitly guide our users with the same visual cues—now what?
In a sense, this means the burden shifts to designing predictable, reusable interaction patterns that users can learn once and trust everywhere, since we can no longer rely on visual cues to guide them.

How does this change how we implement products?

In many ways, the app as the unit of design—at least in the way we’ve been looking at it so far—is vanishing.
The standalone application is becoming the secondary, or even tertiary, destination for users to receive the services we offer. Even the boxes and buttons are slipping into the background, so how do we realign to best meet our users’ needs?
For one thing, we have to design less for pixels and more for packets.
In a world where the amount of available information and number of competing services is expanding exponentially, the best way to be heard is to be shared and distributed across an array of platforms and contexts.
What this means is that, ideally, users will receive our content in ways we cannot explicitly control, often within multiple wrappers.
We are going to be asking ourselves, as designers, a whole new set of questions.
  • Will this key piece of information be compelling in a Tweet, or featured in an article on a news aggregator like Feedly?
  • How can I format my content to adjust to an array of viewports and even screen-minimal devices? (One approach is sketched after this list.)
  • Is there a way to package a key functionality of my services that can exist somewhere else?
  • What are the most appropriate and useful interaction patterns we can offer as actions to take within a notification?
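One way to approach the second question, sketched here under the assumption that the unit itself stays surface-agnostic while thin renderers adapt it to each context:

```typescript
// One surface-agnostic unit, many thin renderers. The unit carries the
// substance; each renderer decides how much of it a surface can handle.
interface Unit {
  headline: string;
  detail: string;
  spoken: string;
}

const renderers = {
  desktop: (u: Unit) => `${u.headline}\n${u.detail}`, // full viewport
  watch:   (u: Unit) => u.headline.slice(0, 40),      // tiny display
  voice:   (u: Unit) => u.spoken,                     // no screen at all
};

const delay: Unit = {
  headline: "Flight UA 212 delayed 45 min",
  detail: "New departure 6:45 PM from gate B12. Your connection is safe.",
  spoken:
    "Your flight is delayed forty-five minutes and now leaves at six forty-five.",
};

console.log(renderers.watch(delay)); // "Flight UA 212 delayed 45 min"
console.log(renderers.voice(delay));
```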
As users are spending less time inside of our destinations, how do we create experiences that can be completed elsewhere?
Can the push notification for an upcoming calendar event give the user a set of common controls to book a flight, make travel suggestions, or order flowers without requiring the user to ever open the app itself?
How do we design app functionality that works with, and inside of, other apps to deliver our specific services in ways compelling enough to beat the competition?

What does the design process look like tomorrow?

If the component is the new unit of design, the new object of our practice, how do we define it? The component in question will be defined differently in each case and has different implications for every product, but I would like to propose some examples to help designers and developers think about their products as parts of systems rather than as neatly boxed destinations.

In what context will users receive a notification?

Let’s start with notifications as a simple pattern.
Imagine we’re designing for a car service app that echoes across multiple devices and notifies users of their hired car’s position, estimated arrival time, and delays.
Within the destination housed in our app, we can control how a notification appears, what information is contained, and what the user’s available actions are.
But we can hardly expect our users to open the app and monitor the screen continually. So we have to package our notifications to be pushed to the operating system and be aware of how a multitude of mobile devices handle that content.
At this point, we are starting to think of this app as less of the destination and more of a component within the operating system.

What actions will be available within a notification?

Within the app, we alert the user of a delay and offer a “see my car” button that opens the hired car’s current position in the user’s preferred map application.
We also give users the option, in our app’s delay notification, to cancel their reservation.
Many devices’ system-level notifications allow the user to take actions directly, so we can package our notification component to include a few of the actions that exist inside our app. We might consider including either or both of these actions within the notification itself: “see my car” then opens the map, while cancelling adds one level of confirmation. If, instead, we want to provide the option to message the driver, we can design for a line of communication within the notification, meaning the user doesn’t have to open our app at all.
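Pulled together as a sketch, with names and shapes assumed rather than drawn from any real platform’s notification API, the delay notification might carry its candidate actions like this:

```typescript
// The delay notification as a self-describing component: the actions it
// carries determine what the OS can offer without launching our app.
interface CarAction {
  id: "see-car" | "cancel-ride" | "message-driver";
  label: string;
  needsConfirmation: boolean; // cancelling is destructive, so confirm it
}

interface DelayNotification {
  title: string;
  carPosition: { lat: number; lng: number };
  actions: CarAction[];
}

const delayNote: DelayNotification = {
  title: "Your car is running 8 minutes late",
  carPosition: { lat: 40.7411, lng: -73.9897 },
  actions: [
    { id: "see-car",        label: "See my car",     needsConfirmation: false },
    { id: "cancel-ride",    label: "Cancel ride",    needsConfirmation: true },
    { id: "message-driver", label: "Message driver", needsConfirmation: false },
  ],
};

// "See my car" hands off to whatever map app the user prefers: our
// component's job ends at producing a location, not drawing a map.
function actionTarget(note: DelayNotification, id: CarAction["id"]): string {
  if (id === "see-car") {
    return `geo:${note.carPosition.lat},${note.carPosition.lng}`;
  }
  return id === "cancel-ride" ? "confirm-then-cancel" : "inline-reply-to-driver";
}

console.log(actionTarget(delayNote, "see-car")); // geo:40.7411,-73.9897
```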

If users don’t open our app, what are we designing?

The components we design don’t only interact with the operating system; they may even be pulled into other apps.
For example, a travel app like TripIt may also surface this notification when the user has a travel itinerary that could be affected by the delay.
We can package our notification actions to work with the travel app to make chains of actions available to the user.
Perhaps the travel app has other people attached to this itinerary. We could package our notification through the travel app to present the user with the option to send an updated ETA to friends waiting at the airport.
In this way, our own app’s notification component merges with the travel app’s notification and then is delivered through the device’s notification packaging to the user. Then, the user’s actions cascade back down the chain, and a change is made to our app’s status without the user ever having to open it.
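Here’s a rough sketch of that chain, with every service name invented: the travel app wraps our notification, adds its own action, and the user’s choice cascades back to whichever component owns it:

```typescript
// Hypothetical composition: a host app (a travel planner, say) wraps our
// notification, adds its own action, and relays results back down the chain.
interface ChainAction { id: string; label: string }
interface ChainNote { source: string; title: string; actions: ChainAction[] }

const ourNote: ChainNote = {
  source: "car-service",
  title: "Your car is running 8 minutes late",
  actions: [{ id: "cancel-ride", label: "Cancel ride" }],
};

// The travel app merges our note into its itinerary context.
function wrapForItinerary(inner: ChainNote): ChainNote {
  return {
    source: "travel-app",
    title: `${inner.title} (itinerary updated)`,
    actions: [
      ...inner.actions, // our actions survive the wrapping
      { id: "share-eta", label: "Send new ETA to friends at the airport" },
    ],
  };
}

// Each layer handles what it owns and passes the rest back down the chain,
// so our app's state changes without the user ever opening it.
function dispatch(actionId: string): string {
  return actionId === "share-eta"
    ? "travel-app: ETA shared"
    : `car-service: handled "${actionId}" upstream`;
}

console.log(dispatch("cancel-ride")); // car-service: handled "cancel-ride" upstream
```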
If we take this a step further to include something like a voice-activated virtual assistant, how do we package and deliver a notification and actions where we can’t anticipate a screen-based interaction?
We can translate the steps in the user flow from our app’s notification to mirror the screen-based interaction pattern, or tailor them more to the device’s particular interface paradigms. The ideal is probably somewhere in the middle. We need to account for the interaction patterns that become habitual on each physical device, while maintaining a consistent flow that delivers predictability and confidence to our users.
Many of our offerings can be broken down into components in a similar fashion, and we have to begin to see them this way if they are to meet our users’ needs.
Apps are becoming organisms in large ecosystems that interface with other organisms, and must adapt accordingly.
I would suggest looking at how IFTTT (If This Then That) links apps’ components into systems of actions that don’t require the user to ever open all of the apps involved—if any.
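Stripped to its essentials, that model is a very small idea. A sketch with invented service names: a trigger from one component fires actions in others, and no app is ever in the foreground:

```typescript
// An IFTTT-style rule: "if this (trigger) then that (actions)". Each side
// is just a component's exposed surface; no destination app is involved.
interface Trigger { service: string; event: string }
interface RuleAction { service: string; run: (payload: string) => string }
interface Rule { if: Trigger; then: RuleAction[] }

const rules: Rule[] = [
  {
    if: { service: "car-service", event: "car-arrived" },
    then: [
      { service: "smart-lights", run: (p) => `porch light on at ${p}` },
      { service: "messenger",    run: (p) => `texted "heading down" at ${p}` },
    ],
  },
];

// A minimal dispatcher: when a matching event fires, run every linked action.
function fire(all: Rule[], event: Trigger, payload: string): string[] {
  return all
    .filter((r) => r.if.service === event.service && r.if.event === event.event)
    .flatMap((r) => r.then.map((a) => a.run(payload)));
}

console.log(fire(rules, { service: "car-service", event: "car-arrived" }, "7:42 PM"));
```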

The unit of design has been redefined.

We are re-defining the unit of design from target destinations—such as a web page or a sandboxed application—to systems of component parts that interact with each other and with the end user.
In many cases, these interactions are happening almost invisibly. While apps as a concept are not being eliminated, they are no longer the main event or destination.
Designers have to pivot and redirect their foci in order to create products and services that adapt to thrive in multiple ecosystems and across an array of unpredictable contexts.
The revolution will be atomised.
This article was originally posted on Toptal
