Depending on whom you ask, iOS 7 is either a hero or a villain. It’s a spring-cleaning of the cobwebs from an outdated visual style, or it’s an over-correction based on an inflexible system of dubious rules. It’s Apple once again breaking free from the constraints of the past, or it’s Apple showing up unfashionably late to the newest fad.
These opinions can’t all be right, but can any opinion be right? How will we know whether to judge iOS 7 a success or a failure? What does “success” even mean for an iOS update? What role does interface design play in that success?
A good place to start is with Apple’s own attempts to draw an outline around iOS’ impact on the mobile OS landscape. In this year’s WWDC Keynote (beginning around 69:40), Tim Cook approached iOS from several angles: install base, usage, and customer satisfaction.
If success is defined by the percentage of in-use devices that are updated to the latest OS, then iOS 7 is probably going to be the most successful iOS version to date. According to Cook, ninety-three percent of iOS devices are using iOS 6 (with six percent running iOS 5). This is most likely a testament to the ease of installing over-the-air software updates, as well as Apple’s efforts to support older devices still in widespread use. These factors are not changing in iOS 7, so I would expect to see similar numbers at next year’s Keynote.[1]
iOS device owners use their devices more often than other mobile device owners. According to Experian, iOS devices are used 75 minutes a day, on average, compared to 45 minutes a day for Android.[2] If these usage times tip in favor of Android, it could mean that the changes in iOS 7’s design are making customers less willing to spend time with their devices. This seems an unlikely scenario to me. The “offensive” changes in iOS 7 are mostly superficial. The key differences between iOS and Android devices[3] — tight integration of hardware and software, the richness of third-party apps, etc. — won’t change from iOS 6 to iOS 7.[4]
The primary metric that Apple uses to objectively measure iOS’ success is customer satisfaction scores — or “customer sat” as Cook referred to it during the Keynote. He proudly noted that the iPhone has been ranked number one in customer satisfaction by J.D. Power nine consecutive times — a first for any product. ChangeWave found that overall satisfaction was 97 percent for iOS device owners, and that 73 percent of them characterized themselves as “very satisfied.”
Of all the metrics used to measure iOS’ success, I think that customer satisfaction is the one to watch most closely. If customers find something distasteful about iOS 7, I would expect to see a drop in both overall satisfaction and in the number of people who claim to be very satisfied. Or perhaps the opposite might happen. Either way, customer satisfaction is the best measure we have. The other metrics compare iOS to Android and other mobile operating systems. The data are influenced by too many external factors to be a good measure of the iOS 7 user experience itself.
Another measure of success will take longer to notice. Over the next two or three years, as contracts begin to expire and customers head out in search of device upgrades, what kind of devices will they choose? Will iOS sales growth taper off as customers choose alternatives?
Standing in an AT&T store recently, I noticed that the only smartphone in the store without a “flat” design aesthetic was the iPhone. Windows Phone’s app tiles swooped around in a delightful way, with no drop shadows or gradients to distract me. The Android lock screens were hard to distinguish from the iOS 7 test device I’d brought in my pocket. The build quality of some of the Samsung, HTC, and Nokia phones was comparable to the iPhone’s. Their screens were bigger and more pixel-dense. The HTC One felt more like a miniature iMac than a smartphone. It felt great to heft it around. Standing there, absorbing all of the options available to me, with AT&T staff just as happy to sell me an Android device as an iPhone, I wondered why anyone chooses an iPhone at all.
If the appeal of the iPhone, all the years it’s been on sale, is partly due to the little visual details — a notepad icon that looks like what it does, an unlock slider so easy to grasp that even babies figure it out on their own — then iOS 7 could be spoiling the key ingredient in the success of its predecessors. Time will tell.
1. There are still a lot of iPhone 3GS devices in use, which will not be supported by iOS 7. However, with the steady growth in the smartphone market overall, and the likely introduction of lower-cost iPhone models in the fall, I expect that by June 2014 any impact on the iOS 7 install base from iPhone 3GS devices running iOS 6 will be more than compensated for by sales of newer iOS devices. ↩
2. That’s a time difference of only thirty minutes. Since iOS devices are generally more power-efficient than their Android counterparts, it is not implausible that the Android deficit is a side effect of battery life. If more power-efficient Android devices are released, then Android usage numbers could improve without implying anything negative about iOS 7. ↩
3. Some people argue that there are also differences between iOS and Android device *owners*. I won’t make such an assumption here. If there is any truth to this notion, it could plausibly account for usage differences, too. But if personality drives usage, then iOS 7 isn’t going to change anything. A software update can’t make you a thinner, overzealous bore. ↩
4. Cook also presented data on mobile web usage on tablets, but these data also aren’t likely to change with iOS 7, since the underlying causes will remain the same. ↩
My internet pal Shawn Blanc released a new book today: Delight is in the Details, a book for craftspeople about the practice of sweating the small stuff. The exhortation from the announcement email was great:
Now, go forth and make something delightful.
Spoken like a true Protestant. I look forward to reading his book this week.
My wife and I started watching Orange Is The New Black last night. I immediately understood why this show is such an instant cult hit: any given two-minute slice passes the Bechdel test with flying colors.
The Bechdel test asks whether a work of fiction features at least two women who talk to each other about something other than a man.
There’s a funny scene in the movie The World According to Garp in which Garp (Robin Williams) and his wife are looking at a house they’d like to buy. They’re standing on the sidewalk with the realtor when, without warning, a single-prop plane comes sputtering into view over the tree-line. The plane crashes into the side of the house, making a terrific mess. Garp’s wife and the realtor are horrified, watching the unscathed pilot waving hello from what’s left of the master bedroom.
This scene wasn’t in the novel, but it could have been.
“We’ll take the house!” Garp exclaims, wrapping an arm around his wife. “This house has been pre-disastered. We’ll be safe here.”
Fear of failure — and of disaster, which amounts to the same thing — disappears when the failure is no longer hypothetical. When the failure actually happens, and it becomes real, we discover that the repercussions are easier to manage than the fear itself.
So here’s the most embarrassing moment from my childhood: I was taking a math test in fourth grade. It was near the end of a long, hot day at school. I had visited the water fountain too many times and now I needed to pee, badly. It was the don’t-stop-tapping-your-foot-or-you’ll-bust kind of need to pee. I flagged down my teacher, who refused to let me go to the bathroom. I’ve never resented someone so much in my whole life.
I lasted about ten minutes before what you can imagine happened. Shorts darkened. Socks yellowed. Textbooks under the desk swelled and curled up at the edges. There was no stopping it. Trying to hold it back only made it worse. So I gave in.
“Well,” I thought to myself, “at least I don’t have to pee anymore.”
I felt relieved. That was the last thing I had expected to feel. I had traded one set of problems for another. Failure relieves fear. Actual problems are solvable. Hypothetical problems are not.
In creative work, it’s easy to waste a lot of energy on the fear of failure. We measure our work against external ideals of perfection. We care about what we do, so we try to get it right. But if we’re not careful, we can let the fear of failure get in the way of doing our best work.
I’m not one of those people who by nature doesn’t give a shit about what other folks think of my work. If such a person exists, I’ve never met one. A man without doubt is a monster. I am always going to labor under the gaze of the Big Other. So any strategy that boils down to ignoring other people’s expectations of me is not going to work.
There’s another way of dealing with the fear of failure that is more plausible: give yourself permission to fail. Short-circuit the tangle of expectations you’ve inherited by adding failure to the list of what excellent craftspeople do.
Real artists fail.
I finally found time to revamp all of jaredsinclair.com. Re-written from scratch (well, from HTML5 Boilerplate scratch). Completely unbeknownst to me, I finished just in time to get linked to by Shawn Blanc. Whew. If I had been a few minutes later finishing the transition, it would’ve been an awful sight. There were a handful of bugs that I couldn’t fix without first deploying the new markup and CSS to the live site.
I came this close to switching from Tumblr to WordPress, but in the end I realized how much I enjoy the constraints of Tumblr’s dummy-proof feature set. I would have spent hours on WordPress disabling features and installing plug-ins to get back the simplicity of Tumblr. I’d rather spend that time writing.
I stumbled across it two more times this week: the advice to use a high-quality prime lens instead of the zoom lens that comes with most DSLR kits. The idea is that a high-quality prime lens is surprisingly affordable and takes sharper, more vibrant images than a zoom lens at the same price.
The loss of versatility is mostly an imagined problem. A prime “portrait” lens will work great in most scenarios. You don’t need a zoom lens. You certainly don’t need a zoom lens as much as you need your DSLR sensor to be bathed in as much light as possible, at a price you can justify to your spouse.
Taken at f/1.4 under cloudy skies at ISO 800.
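To put rough numbers behind “as much light as possible”: the light a lens gathers scales with the inverse square of its f-number. Here’s a quick back-of-the-envelope sketch, assuming a typical kit zoom that closes down to about f/5.6 at its long end:

```python
# Light gathered scales with the inverse square of the f-number.
# Assumption: a typical kit zoom stops down to about f/5.6 when zoomed in.
kit_zoom_f_number = 5.6
prime_f_number = 1.4

ratio = (kit_zoom_f_number / prime_f_number) ** 2  # (5.6 / 1.4)^2 = 16
print(f"The f/1.4 prime gathers {ratio:.0f}x the light of the kit zoom (four full stops).")
```

Sixteen times the light is the difference between a usable shot and a blurry one under cloudy skies.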
I purchased a Nikon AF-S 50mm f/1.4G based on Ken Rockwell’s review. He’s not a great photographer, but he’s a fantastic product reviewer. I trust his opinions on gear more than anyone else’s. He’s also usually found shooting with Nikon gear (I have a D5100), so his opinions are backed by a lot of real-world use.
For an everyday prime lens he still prefers the older AF-D model, but that lens won’t auto-focus when used with a D5100, so I went with the closest alternative. I was expecting good results, but I was not prepared for just how much better it was than our old kit zoom lens. Believe the hype.
Two big things happened to me recently: I quit my day job to become an indie iOS developer, and I became a father.
Needless to say, I now have a lot to think about.
In-between diaper changes and bottle feedings, I’ve been wrestling with some hard design challenges for my next app. Every day this week, I worked late into the evening, pushing the design to a point that seemed like the right solution, only to wake up the next day and see that yesterday’s solution wasn’t right yet.
It’s taxing to work like this, but rewarding. Vonnegut’s advice to young writers was to work passionately on a sonnet for a week, polishing it more every day, then to tear it up and toss the pieces into seven different trashcans. Your best work today will not be as good as your best work tomorrow.
I have my dad to thank for my capacity for this kind of work, such as it is. If I had had a different upbringing, I would likely have a bad habit of settling for my first attempts. My dad taught me the importance of getting it right.
The stories about my dad’s tireless attention to detail could fill a book. He pushes himself to do his best in everything, even in menial tasks around the house. He mows the lawn as carefully as he composes music. He’s a pianist and a music/drama teacher. Somewhere he learned how to draw well, too. I think he taught himself. When he directed Little Shop of Horrors at a local high school, he ordered an enormous collection of realistic plant monster puppets. They arrived in an 18-wheeler. The biggest one filled the stage and took five people to operate it. It was the coolest thing many of those kids had ever done.
My dad pushed me as hard as he pushed himself. When I was in high school, my chemistry teacher had us create trading cards for the periodic table of the elements. Everybody had to pick an element and make three trading cards for it. I picked phosphorus because it glows in the dark. I stayed up very late the night it was due. I designed the cards with our Power Mac. They looked like dossiers from the X-Files. They had glow-in-the-dark stickers and distressed paper textures. It was 1996 and they were awesome.
My dad saw what I was making and took me down to Kinko’s to get them laminated. It was after midnight when we got there. He showed me how to get them laminated thickly, like driver’s licenses. We even rounded the corners. The cards were impenetrable. I kept running my thumb over the edges of the laminated cards, admiring their thickness, grateful to my dad for showing me how to make them.
Then the unthinkable happened. On the way home, one of us noticed that I had misspelled phosphorus. I spelled it “phosphorous,” which is the adjectival form.
“No one would notice,” I said to my dad.
“No. We’re going to get it right,” he replied.
He sent me back to the Power Mac and I re-edited the design and reprinted the cards. I applied a new set of stickers in just the right places. Then my dad took me back to Kinko’s, where we laminated the replacement set of cards, doing all the work over again, even rounding the corners. It was after 4:30 AM when we finally got home.
The second set was even better than the first. I got a good grade on them. But the satisfaction of an A+ paled in comparison to the reward of getting it right.
If nothing else, I want to share this lesson with my own son. It’s the greatest lesson my dad ever taught me: always do your best.
Announcing Unread: an RSS reader and my first indie project. The teaser site went live this morning. I won’t say much about the app until it’s out, but I will say this: if you use RSS and like Riposte, then you’re going to love Unread. It will ship after iOS 7 is released this fall. Head to the teaser site and join the mailing list to be notified when it’s available.
I was reorganizing my Dropbox folders when I stumbled upon my first short story, Is That It?. I wrote it back in 2008. It suffers from numerous flaws, but I think it’s a pretty good effort from a guy who should have been going to bed on time for nursing school, instead of staying up late writing about pointless miracles. Read it all here (PDF). Here’s the first paragraph:
Five years ago today, Saul Zuero, now an obscure philosopher and former zoo janitor, then an obscure zoo janitor and former philosopher, discovered what was and remains the World’s Only Officially Documented Miracle. All doubts have been dispelled, all possible scientific explanations exhausted. All things being equal (and in spite of the fact that there is every indication that they are not), only one avenue of theoretical causation remains open: the Divine. It happened in Chicago, on a Tuesday.
There is a word that Apple’s presenters use frequently when explaining new products: “content.” It makes many people cringe to hear it used so glibly, like enduring a marketing team parroting “Verticals! Verticals!” ad nauseam. Content is Apple’s elevated term for “stuff.” Content can be photos of a vacation, an episode of a TV show, or a page of text. Content is what the user wants to see.
Apps are content, too. Apple used to think so. In a promotional video in 2011, Jony Ive praised the design of the then-new iMac, the first to eliminate the aluminum bezel from the sides of the display. He praised the iMac for having nothing to distract the user from their content. The implication was that “content” meant everything on the display, from the menu bar to the dock, and all points between. The hardware was the potential distraction.
Local color: Prince’s Hot Chicken in Nashville.
On Monday, Apple revealed a polarizing new design for iOS 7. Those of us in the iOS design community are still reeling from the changes. Apple has pulled the rug out from under us, or as Scott Jackson put it over bourbon yesterday, Apple “moved the goalposts overnight.”
With iOS 7, Apple has issued a decree: apps are no longer content. This is the epicenter of the seismic activity that has shaken those of us who make apps for a living. Some of us gripe about surface-level changes like the rigid adherence to a flat design aesthetic or the borderless buttons, but those are superficial. The deeper shock is that iOS 7 no longer embraces the variety and richness of the third-party apps that helped popularize the platform.
Here is a quote from one of the iOS 7 marketing pages:
Conspicuous ornamentation has been stripped away. Unnecessary bars and buttons have been removed. And in taking away design elements that don’t add value, suddenly there’s greater focus on what matters most: your content.
The implication here is that third-party apps that pride themselves on tasteful toolbars and easy-to-recognize buttons don’t add value for the user, that the app itself isn’t content.
User interfaces are just as much a part of the experience of an app as the text, photos, and videos it displays. You may remember John Gruber’s quip about Twitter apps being a UI design playground. All Twitter apps display the same content, but not all Twitter apps are the same. Tweetbot’s shiny buttons and clicky sound effects have as much to do with its popularity as its functional design. Tweetie, Twitterrific, Hootsuite, and TweetDeck were all different, and all were popular, even if no single app appealed to the tastes of all users (which would be impossible). Design is more than just how something works. It’s also how it looks. Would you rather buy a black or a brown leather jacket? Why?
The best metaphor for understanding Apple’s misguided strategy with iOS 7 is a restaurant. Food alone is not the content of a restaurant. The decor on the walls, the uniforms worn by the waitstaff, the ambient music, the quality of the flatware — all of these elements add value to the experience of a meal. A dark pub, a smoky Indian barbecue joint, a campy vintage food truck — every detail of a restaurant matters. If iOS 7 were a restaurant, it would be, in Neven Mrgan’s words, a hospital cafeteria:
Waitstaff wear white. The whole restaurant is white. No music. Enjoy.
Here’s a quote from the abstract, published in the NEJM:
Medicaid expansions were associated with a significant reduction in adjusted all-cause mortality (by 19.6 deaths per 100,000 adults, for a relative reduction of 6.1%; P=0.001).
Garrett Murray posts images of the original tweets, showing that it was users — and not Twitter itself — who invented the @reply. Innovative software experiences are the fruit of the marriage between a powerful open API and a software ecosystem that adapts to fit what users need. Healthcare software devs should take heed!
Part 1: What the EMR Market Can Learn From Twitter
As Meaningful Use marches on and providers are deploying their newly-minted EMRs (electronic medical records), the ONC has been receiving complaints that the user interfaces of many EMRs are frustrating and poorly-designed. Dr. Farzad Mostashari, the head of the ONC, put it well last October in an interview on HIStalk:
The only problem is that providers consistently say, “I didn’t know what I bought until three months after I bought it. I didn’t know what the usability of the system was really going to be, because all I saw was these demos I had from people who knew their way around the system and knew spots to avoid.”
To help prevent these horror stories from happening to other providers, the ONC and NIST are strongly encouraging EMR vendors to include formal usability testing results when submitting their products for certification. The thinking is that if usability scores are publicly available, providers will be able to comparatively shop around for the best experiences, thus encouraging all vendors to improve their products.
While I applaud the ONC and Dr. Mostashari for their efforts, I believe that this initiative misunderstands the systemic nature of the usability problem. Contrary to popular belief, the usability problem is not directly the fault of the EMR vendors. There are fundamental technological and market forces that disincentivize EMR vendors from creating truly great user experiences. These forces must be disrupted before the EMR industry will begin to produce satisfying end-user experiences. A change in certification criteria is woefully inadequate to the task.
Before going further, I should clarify my terms. What goes by the name “usability” in most EMR discussions is actually a blend of related concepts: safety, beauty, ease-of-use, etc. What these concepts all share is the notion that EMRs should be safe, fast, easy-to-use, and pleasant. Rather than get lost in the weeds of specialized terminology, I will follow the common practice of referring to all these domains under the umbrella term “usability.” I humbly beg usability experts to forgive me this injustice.
Because many “usability” complaints exceed the scope of the formal definition of usability, it is clear on this basis alone that the ONC’s initiative is not enough. Formal usability testing is a tool used to prove the safety and efficiency of a given task, but providers are frustrated with the entire EMR experience, with the way that all tasks are bound together into clumsy, dissatisfying products. Formal usability testing can refine an innovation, but it cannot inspire it. Innovation must be inspired by a profound shift in priorities.
The reason that EMR vendors have yet to produce innovative user experiences has nothing to do with a lack of money or talent. They have healthy revenues flowing in from one of the largest and fastest-growing markets; they have many bright employees; there is no shortage of technical or design expertise. If a vendor wanted to make an EMR with a show-stopping user interface, there would be few internal obstacles in its path. So why don’t they?
The answer is that the current EMR market doesn’t really prioritize usability. EMR purchasers are not end users; their purchasing decisions are driven by other priorities. They are responsible for ensuring that meaningful use criteria are met in a timely and cost-effective manner. Every purchase must be a part of a long-term strategy of going paperless even as future changes in reimbursement will potentially threaten their bottom line. The basis of competition for these customers is essentially an arms race of features and functions. Usability is one concern among many. The vendors that offer the most features and functions are valued the highest.
Before Meaningful Use, only a small fraction of institutions were using EMRs. Most of the ones that had EMRs were using them in a limited fashion: labs, eMARs, etc. Less than one percent were fully electronic. To go from this state to a fully-electronic system presents an enormous task for managers. It also presents an enormous task for vendors.
There are few technological standards for EMR integration. Even HL7 is just a messaging protocol, with dozens of specs, all of which are haphazardly implemented in practice. HL7 sets aside a class of custom segments called Z segments, wildcards that can carry anything not covered by an official segment in the spec. They were intended to be used only rarely. An integration engineer I know estimates that 90% of HL7 messages now rely on Z segments.
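To make that concrete, here is a minimal sketch of a pipe-delimited HL7 v2 message. The ZPV segment name and its fields are invented for illustration, but the convention is real: any segment whose name begins with “Z” is custom, and its contents are whatever the sender and receiver privately agree on.

```python
# A minimal, illustrative HL7 v2 message. Segments are separated by carriage
# returns; fields within a segment are separated by pipes. The ZPV segment
# name and its fields are invented for this example.
message = "\r".join([
    "MSH|^~\\&|SEND_APP|SEND_FAC|RECV_APP|RECV_FAC|201307150830||ADT^A01|MSG00001|P|2.3",
    "PID|1||123456^^^HOSP^MR||DOE^JOHN",
    "ZPV|1|VENDOR_FLAG|ANYTHING_NOT_COVERED_BY_THE_SPEC",
])

# Flag the custom segments: by convention, their names start with "Z".
for segment in message.split("\r"):
    segment_id = segment.split("|", 1)[0]
    if segment_id.startswith("Z"):
        print("Custom Z segment:", segment)
```

When 90% of traffic flows through segments like ZPV, the “standard” is a standard in name only.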
When a major vendor released a disappointing iPhone app recently, I had the pleasure of speaking with the lead developer on the project about my disappointment. He acknowledged its flaws, but explained the difficulty they are having in integrating iOS applications with their core products:
This app is a small step as we continue to decouple complex logic embedded in the Windows platform and expose as services.
Translation: “We’re having a difficult time extending an API, even to ourselves. Our software was not architected for agnostic interoperability with other platforms.”
There are perfectly valid reasons for this difficulty. Since the industry has never produced true interoperability standards, EMR vendors have been left to themselves to create and maintain their own internal standards. Because of the scale and complexity of an EMR, these internal standards are full of proprietary fine-tuning and optimizations. Front-end applications were not designed to be hot-swappable.
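To make “not hot-swappable” concrete, here is a sketch of the difference, with invented table and endpoint names. A tightly coupled front end reaches directly into a proprietary schema; a decoupled one talks only to a service boundary:

```python
# Tightly coupled: the front end queries a proprietary schema directly.
# (Table and column names are invented for illustration.)
def load_allergies_coupled(cursor, patient_id):
    cursor.execute(
        "SELECT alrg_cd, alrg_desc FROM PAT_ALRG_MSTR WHERE pat_id = ?",
        (patient_id,),
    )
    return cursor.fetchall()

# Decoupled: the front end only knows the service API. Any client that
# speaks this API could replace this one. (Endpoint path is invented.)
def load_allergies_decoupled(api_client, patient_id):
    return api_client.get(f"/patients/{patient_id}/allergies")
```

The first function breaks the moment the schema changes, and no outside vendor could ever write it. The second is the kind of boundary the developer quoted above is struggling to retrofit.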
Thus, in a market in which customers value feature lists and functionality, the vendors who offer the most features must also be the vendors who are the most technologically integrated. This is why Epic has been taking the lion’s share of the inpatient market. Its all-or-nothing approach with customers is backed by the technological capability to deliver on it. Epic can guarantee feature lists because it controls its technology stack, soup to nuts. Other vendors succeed to the extent that they are similarly integrated and feature-complete — and struggle to the extent that they are not.
What does all this have to do with usability? Since innovation is the fruit of competition, innovation in usability will not occur until usability becomes the basis of competition. This cannot happen while front-end applications and back-end systems are still tightly integrated. To understand why, let’s consider the example of Twitter and third-party Twitter apps.
When Twitter first launched in 2006, it was mainly a text-messaging service. Tweets were limited to 140 characters (well, 140 characters plus a 20-character username) because SMS messages are limited to 160 characters. You would send and receive tweets straight from the text-messaging screen on your mobile phone. Twitter also hosted a website which you could use to browse your timeline and send messages.
Behind the scenes, both the website and the SMS service ran on the same underlying back-end system. Twitter engineers worked hard to maintain a clear boundary between the front-end SMS/web experiences and the back end, linked together by a reliable and robust API.
What Twitter did next with this API was the key to its success and cultural impact: it opened the API to third-party developers. Developers loved that they could build and sell an app to suit their customers’ tastes and needs. As long as they followed the API, the sky was the limit to their creativity. The apps that were produced from this effort — Tweetie, Twitterrific, TweetDeck, and many others — have come to define the Twitter experience for their users. Customers could try new apps with ease. All that was needed was a username and a password. They could switch from one app to another without worrying about losing their data or being unable to send tweets. The Twitter API guaranteed interoperability.
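For the curious, here is roughly what that looked like in practice. This sketch targets the now-retired v1 REST API from memory, using the simple username-and-password authentication of Twitter’s early years (later replaced by OAuth), so it is illustrative only:

```python
import requests

# The now-retired Twitter v1 REST API: in the early days, a username and
# password were the only credentials a third-party client needed, and the
# same pair worked in every app. (This call no longer works today.)
resp = requests.get(
    "https://api.twitter.com/1/statuses/home_timeline.json",
    auth=("username", "password"),  # placeholder credentials
)
resp.raise_for_status()
for tweet in resp.json():
    print(tweet["user"]["screen_name"], ":", tweet["text"])
```

Every client, from Tweetie to TweetDeck, was ultimately built on a handful of simple calls like this one.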
This competition produced almost every innovation that we now consider an essential component of the Twitter experience: hashtags, retweets, photos, shortened links, etc. Even the word “tweet” was coined not by Twitter employees, but by developers at a third-party company, The Iconfactory. Every Twitter client had its own unique user interface, and apps competed for the best experience. Some, like TweetDeck, competed on pro-level features, while others, like Twitterrific, competed on simplicity and ease of use. Client apps that were poorly designed or hard to use did not succeed in gaining users.
Compare this to Facebook: until relatively recently, if you wanted to use Facebook, you had to visit the official website. The Facebook experience was designed for the lowest common denominator, and it showed. Every major attempt to redesign the Facebook site resulted in large numbers of upset users. There is simply no way to satisfy all people with a single, monolithic user interface. Variety is a necessity if the goal is to create satisfying user experiences.
What if EMRs worked more like Twitter? What if there were a clear separation of concerns between reliable back-end systems and user-facing client applications, linked together by a robust, universal API? If providers could pick and choose the component pieces of their front-end software the same way that Twitter users can swap out Twitter clients, the basis of competition would radically shift towards usability. Smaller vendors would be able to compete on equal footing with the large vendors, at least on an app-by-app basis. This one makes the best CPOE module, this one the best stress-test documentation, this one the best eMAR. Providers could hire developers to write custom in-house apps at relatively little expense and difficulty compared to present circumstances.
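No such universal API exists, but it is easy to sketch what a client of one might look like. Everything here (the base URL, the endpoint path, the token scheme) is hypothetical:

```python
import requests

# Hypothetical universal EMR API. The base URL, endpoint path, and bearer-token
# scheme are all invented; nothing like this exists in today's EMR market.
BASE_URL = "https://emr.example-hospital.org/api/v1"

def fetch_medication_orders(patient_id, token):
    """Fetch a patient's active medication orders from any compliant back end."""
    resp = requests.get(
        f"{BASE_URL}/patients/{patient_id}/medication-orders",
        headers={"Authorization": f"Bearer {token}"},
        params={"status": "active"},
    )
    resp.raise_for_status()
    return resp.json()

# A hospital could swap this eMAR front end for a competitor's without touching
# the back end, just as Twitter users once swapped clients freely.
```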
Is it even possible for the industry to change in this way? In Part 2 of this series, I will discuss the changes that would need to take place for such a scenario to occur. I will also identify the institutions that are most likely to be able to effect the necessary changes, and whether and how they would be motivated to do so.
I had the pleasure of meeting Dr. Rick Weinhaus at this year’s Creating Usable EHR conference in Bethesda, Maryland. I was also fortunate to assist with this, his latest article on HIStalk, in which he reflects on the lessons learned there.
Phil Freo asks:
I’ve got the latest iPhone with its 8MP camera and HD video camera, complete with iOS 5 and I pay for extra storage on iCloud. Apple’s supposed to be the best at designing simple user experiences across hardware and software – and I believe they are.
So when I want to take a bunch of photos and videos that I took from my iPhone and share those with some family members, it should be simple right?
Yet it’s not.
I couldn’t agree more.