The Influence Of UX Design Is Growing. These Authors Say It Should Be More Positive

(Pixabay)

A call for a new design philosophy for the digital age, one that would make smartphones and surfing the web a lot more elegant and user-friendly.

Guests

Cliff Kuang, user-experience designer and award-winning journalist. Former design editor at Wired and founder at Fast Company Design. Co-author of “User Friendly: How the Hidden Rules of Design Are Changing the Way We Live, Work, and Play.” (@cliffkuang)

Robert Fabricant, co-founder and partner of Dalberg Design. Former vice president of creative at the firm Frog Design. Co-author of “User Friendly: How the Hidden Rules of Design Are Changing the Way We Live, Work, and Play.” (@fabtweet)

From The Reading List

Excerpt from “User Friendly” by Cliff Kuang and Robert Fabricant

Introduction: The Empire of User-Friendliness

User Friendly

  1. Computing. Of hardware or software: easy to use or understand, esp. by an inexperienced user; designed with the needs of a user in mind.
  2. In extended use: easy to use; accessible, manageable.

At only four stories tall, the world’s largest office building sits low to the ground but commands a footprint worthy of a UFO that could blot out the sun: a perfect doughnut shape, a mile around its edge. In the middle lies a grove meant to recall the time, just fifty years ago, when Silicon Valley wasn’t Silicon Valley, but rather the Valley of Heart’s Delight, covered in 10 million fruit trees—apricot, cherry, peach, pear, and apple. It took Apple, the computing giant, years to buy up all that land in secret, assembling some sixteen different plots across fifty acres in a $5 billion jigsaw. If the building looks like a spaceship, then it’s one that lifted off directly from Steve Jobs’s imagination to land in the heart of an otherwise sleepy suburb. It was one of the last undertakings that the great man signed off on before he died.

Every morning during the construction of Jobs’s last dream, Harlan Crowder woke up to the dull roar of heavy trucks on their way to the site, their alarms bleating as they nudged their loads into place. When we met, Crowder was seventy-three years old with three grown children. He wore a white goatee and a retiree’s wardrobe of rumpled pants and floral shirts. In Crowder’s neighborhood, the hubbub attending Apple Campus 2’s arrival had unleashed a swarm of real estate agents trawling door-to-door and offering to make people rich.

These agents were mostly women buffed to a high gloss, and they came adorned in big brass jewelry that served as armor against holdouts like Crowder. “There was one, she said it just like that: ‘I have ten people waiting to start a bidding war for your house,’” Crowder said in his Texas drawl as we sat talking on his back patio. Houses like his had originally been built in the 1960s for the population boom incited by the nascent transistor industry. When I visited, modest three-bedroom ranch houses like Crowder’s were easily fetching $2.5 million. In a year, he assumed it could be 10 percent higher, maybe more. It was all some strange dream.

Crowder isn’t a famous person—there are untold thousands like him, spread out in the Valley: people of technical ability who built the place but whose names are lost to history. But Crowder is one of the first people in the annals of history to use the term “user friendly” to refer to a computer. Every week or so, Crowder fends off the real estate agents and their offers of a multimillion-dollar payday. Apple fuels it all—Apple, the company that made “user friendly” into an idea that we live with every day.

Crowder looks on at the new Apple Park during his daily walks past the campus, the crown jewel in an empire built upon trillions of dollars in iPods and iMacs and iPads and Apple Watches and iPhones—devices that, despite being some of the most advanced computers ever made, can still be operated by toddlers. Their power dwarfs that of the “supercomputers” Crowder once worked with at IBM. That he’d come to IBM at all seemed like another kind of dream. He’d failed eighth-grade algebra. After high school, he bummed around until enlisting in the Army, where he trained as a medic: a year and a half of doctor’s training with all the theory stripped out, so that you simply knew the essentials required to save a life. The practicality and the pressure lit into him. The onetime layabout graduated first in his class. After service came college at East Texas State University, in Commerce. “It wasn’t the edge of nowhere, but you could see the edge,” he told me.

Crowder had seen an IBM recruitment flyer on a bulletin board at college, calling for science majors to enter a newfangled training program. He replied, and they called him back. He flew up to Yorktown, New York, without much idea of what to expect. IBM’s research center was a crescent-shaped architectural landmark designed by the great Finnish American designer Eero Saarinen—the original corporate-campus-as-spaceship. Its facade was a gleaming curved-glass curtain wall, and the walls inside were hewn from New York’s native granite. The building’s design was a statement about the modern workplace from the world’s smartest company, with as many Ph.D.s on staff as a university.

Crowder walked to his job interview in awe. The building resembled nothing so much as the spaceship from 2001, a bright techno future that was big news in 1968. Here was a place where the water-cooler conversations were about scientific breakthroughs. “I’d have done anything to work there. I didn’t care if I had to clean the toilets,” Crowder said, his voice still shimmering with glee. IBM had run out of computer programmers, and the company wanted to make more of them. Crowder got the job, and he took to the pragmatic nature of the work, using computers to solve real-world problems that you could measure by the ton, such as mapping shipping routes and calculating trucking loads. He found that he had a mind for visualizing the complex equations his job required.

The field where Crowder worked, operations research, began with World War II and the Marshall Plan. Rebuilding Europe meant shipping a mind-boggling amount of matériel across the Atlantic, but also shipping back a mind-boggling amount of matériel that had accumulated in dozens of countries during a war effort of unprecedented scale. Loading ships efficiently on the way over, then loading them up efficiently for the return home, was a math problem whose unruly complexity demanded computing power.

Crowder was working on these sorts of operational problems for IBM’s clients in the 1960s. To create a computer program, he had to use a machine to punch intricate holes in cards the size of an airplane boarding pass. When he was done, he couldn’t just walk up to the computer himself. The computer was a $5 million machine—about $35 million in today’s money—patrolled by two security guards and a perky-eared German shepherd. Crowder would spend all day programming and then take his stack of cards to the computer attendant behind a window, who fed the cards into the machine. The computer would spend all night calculating, and the results would usually be ready for Crowder by the morning—if he hadn’t made a mistake. This was a big “if.” A mistake would be as simple as a misplaced character that gummed up the processing, or a poorly defined equation that divided by zero and sent the computer into looping infinities. (Shades of Apple again: The address of its former campus is One Infinite Loop.)

Faced with fitfully waiting overnight to see if all the days they’d spent programming had been wasted by a typo, a cultish group of programmers at IBM found a way out. Working at a minicomputer tied to a mainframe down the hall and using a simplified programming language called APL—literally, A Programming Language—you could simply write a program and run it. You just typed up your program, saw whether the computer spat out meaningful results, and immediately knew whether your program was heading in the right direction. This was magic. Simply seeing the fruits of your work right away let you shape new ideas in your head just as soon as you had them. Years later, Steve Jobs would describe a computer as a bicycle for the mind—a fabulous machine that could turn the muscle power of a single person into the ability to traverse a mountain in a day. Crowder and his colleagues were among the very first to experience that ideal firsthand. The machine, once it was capable of immediate feedback, was actually augmenting what your mind could do. An insight might flash before you, and you could test that idea out right away. Seeing how well it worked would spur new ideas, and on and on. Thanks to these feedback loops, computer programming, which had once had the stately pace of chamber music, entered its own improvisational Jazz Age.

These “jazz musicians” traded their ideas in academic journals. The only thing was, it was awful trying to re-create the music someone had made on their machines. You couldn’t tell how easy it would be to test another person’s work or replicate it. The programs simply didn’t consider what someone else might do with them. To Crowder, they weren’t user friendly. And so Crowder proposed that a computer program be gauged not just on how well it solved a problem but on how easy it made the lives of the people trying to solve it. To be clear, he didn’t actually invent the term. As far as he knows, it had been floating around in the air, and it was there when he needed it. Which tends to prove how powerful it was—how it encapsulated something that people had started to feel.

And yet IBM didn’t go on to invent the user-friendly world—even though it hired some of the best designers in the world, such as Paul Rand, who created its logo; Eero Saarinen, who created its campus; and even Eliot Noyes, who designed its Selectric typewriter. Instead, that accomplishment is typically credited to Apple, which adapted ideas seeded at Xerox PARC to create the Macintosh. Just a decade after Crowder first wrote his paper describing user-friendly algorithms, Apple was making ads for user-friendly machines:

In the olden days, before 1984, not very many people used computers—for a very good reason.

Not very many people knew how.

And not very many people wanted to learn …

Then, on a particularly bright day in California, some particularly bright engineers had a brilliant idea: since computers are so smart, wouldn’t it make sense to teach computers about people, instead of teaching people about computers?

So it was that those very engineers worked long days and late nights—and a few legal holidays—teaching tiny silicon chips all about people. How they make mistakes and change their minds. How they label their file folders and save old phone numbers. How they labor for their livelihoods. And doodle in their spare time …

And when the engineers were finally finished, they introduced us to a personal computer so personable it can practically shake hands.

There’s a certain magic in how a few words can elide so many stories and so many ideas. This book is an attempt to paint a picture that’s gone missing in plain sight.

* * *

It began as an idea first broached with me by Robert Fabricant, who at the time was vice president of creative at the firm Frog Design. We’d known each other for a couple of years, and Robert’s pitch, which mirrored a decade of my own work as a writer and an editor, was simply that user-experience design, which had been the province of computer geeks and futurists, wasn’t a niche anymore. In an era in which 2.5 billion people own smartphones, user experience now occupies the center of modern life, remaking not just our digital lives but also business, society, even philanthropy. It was Robert’s idea to call this book User Friendly—a term whose very familiarity proves the thesis. And yet the history of the term, the meaning it conveys, the mechanics of how it works—these all remain, outside of a few professional circles, either untold or known only in pieces. What we first conceived over a period of a few months—and assumed would take six months to complete—eventually took me six years to write and report. That is the book you’re reading now, one that tries to show how “user friendly” was invented, how it shapes our everyday rhythms, and where it will go next.

There is a certain generation of designers who quarrel with the term “user friendly.” They disagree with the premise that gadgets should always presume a chipper familiarity to their users; they quibble about whether “friendliness” is the right relationship between a device and its user; they consider the idea condescending in how it implies that users must be treated like children. These critiques are sometimes reasonable. They also fail to undermine the term itself while missing a larger point.

Today, alluringly designed gadgetry has remade the texture of everyday life, from how we make money to how we make friends to how we make babies. We now expect that the tools we use to diagnose cancer or to identify a problem with an airplane engine will be as simple to use as Angry Birds. We aren’t wrong to do so. Technology should become simpler over time. Then it should become simpler still, so that it disappears from notice. This has already happened with stunning speed, and that transformation is one of the greatest cultural achievements of the last fifty years. But even as the designed world shapes us, its inner logic remains almost totally absent from daily conversation. Instead, whether we’re speaking to our kids or speaking to our grandparents, the only words we have are “user friendly.” We hardly examine them at all—and yet they’re the standard by which we judge the designed world.

“User friendly” rolls off the tongue unconsciously because we know what it means, or we think we know what it means. It means something like “Did this thing do what I want?” But even that simplified formulation raises a litany of questions: Why should some product defer to our desires? How does the person who created that object come to understand what I want to begin with? What can go wrong in translating our desires into artifacts? It took more than a century of progress and peril to answer those questions. This book is about how the idea of user-friendliness was born and how it works. We travel backward and forward in time, from the paradigm shifts that made user-friendliness into something people cared about at all to the present day, when user-friendliness has redefined nearly every minute of our waking lives.

Many of the ideas in this book will be familiar to user-experience designers—the people who observe our lives so that they might invent new products. Still, this story should be new. User-experience design, which has come to encompass everything from theme parks to chatbots, simply hasn’t had a narrative thread comprehensible to both laymen and experts. The great chain of ideas that spawned it typically hasn’t been appreciated as a tapestry of personalities, happenstance, and ideological struggle. If user experience is foreign to you, I hope you’ll come away from this book understanding how the world is being remade every day—the ideals, principles, and assumptions that lie behind the taps and swipes you take for granted. If you’re a designer, I hope you’ll better understand the origin of the ideas you swim among, so that you might better examine—and even challenge—the values you put into the things you make. At the very least, I hope you’ll be able to offer this book to the people you know and say, “This is why user experience matters.”

Excerpted from USER FRIENDLY: How the Hidden Rules of Design Are Changing the Way We Live, Work, and Play by Cliff Kuang with Robert Fabricant. Published by MCD, an imprint of Farrar, Straus and Giroux, on November 19, 2019. Copyright © 2019 by Cliff Kuang and Robert Fabricant. All rights reserved.


New York Times: “Design That’s Got Users in Mind” — “‘Once a niche profession more commonly associated with chairs,’ product design ‘is now talked of as a solution to the world’s ills,’ Cliff Kuang and Robert Fabricant declare in ‘User Friendly,’ their new book on what’s known as user-experience design. To advocates, it’s a transformative paradigm for everything from bus scheduling to city management. To naysayers, it’s a ‘boondoggle’ (The Chronicle of Higher Education), ‘b.s.’ (Fast Company), or, worse, ‘fundamentally conservative’ (Harvard Business Review).

“Lay people can now reach their own conclusions more easily. Few previous books have surveyed the history of user-centered design from the origins of American product design as a profession in the 1920s to the latest wearables and beyond.

“Some applications of design thinking are literally matters of life or death. The many admirers of Donald Norman’s general-interest design books, including his 1988 best seller, recently reissued as ‘The Design of Everyday Things,’ may not realize that he became an activist for usability only after he helped write a report on the 1979 Three Mile Island nuclear accident. As Kuang, a journalist and designer, and Fabricant, a former vice president of creative at the legendary firm Frog Design, explain, that near tragedy resulted not from faulty reactor design or from failures of the plant’s managers, but from the arrangement of the facility’s host of dials and lights, which bore little obvious relationship to the sequence of reactors and boilers they monitored and controlled.”

Fast Company: “Apple built a $1 trillion empire on two metaphors. One is breaking” — “On August 2, 2018, Apple became the world’s first public company worth more than $1 trillion. If anything, that abstract figure understates the company’s reach. Apple makes the first thing that hundreds of millions of people look at when they wake up. The company’s supply chain can extract trace amounts of rare-earth minerals from a mine in the Democratic Republic of the Congo, embed them in one of the planet’s most advanced computers, and deliver the whole thing to the steppes of Mongolia. And yet Apple’s rise is nothing more or less than the story of three interfaces: the Macintosh OS, the iPod click wheel, and the iPhone touchscreen. Everything else has been fighting about how the pie would be divided up among competitors and copycats.

“In the user-friendly world, interfaces make empires: IBM, with its punch-card mainframes, was an empire until the 1970s. Then came the graphical user interface, which transformed both Apple and Microsoft from niche companies into mainstream Goliaths. (In April 2019, Microsoft became the third company in the world to reach a $1 trillion valuation, right behind Amazon.) Apple, of course, nearly died in the late 1990s; a major part of what saved the company in the years after Steve Jobs returned was the iPod’s click wheel, which cracked the problem of making it fun to browse incredibly long lists. Blackberry, with its telephone lashed to a keyboard, was another empire until the iPhone. Even Amazon grew from an interface idea: 1-click shopping.

“The value of the patent alone is staggering: Amazon made billions licensing it to Apple for the launch of the iTunes store. But its value to Amazon has been far greater. By eliminating all the check-out steps required to buy something online, 1-click gave Amazon a decisive edge against cart abandonment, which, according to some studies, averages 70 percent and remains one of the two or three biggest challenges to online retailers. 1-click made impulse buys on the web actually impulsive. The boost from 1-click shopping has been estimated to be as high as 5 percent of Amazon’s total sales—an astonishing figure, given that Amazon’s operations margin hovers below 2 percent. Moreover, it also incentivized Amazon’s customers to stay logged in to Amazon at all times—which then allowed Amazon to silently build up a user profile in its databases, which then allowed Amazon to become a platform for selling and recommending anything, not just books. Amazon’s 1-click would easily be the single most consequential button ever invented, but for the Facebook Like button.”

Wired: “How the Dumb Design of a WWII Plane Led to the Macintosh” — “The B-17 Flying Fortress rolled off the drawing board and onto the runway in a mere 12 months, just in time to become the fearsome workhorse of the US Air Force during World War II. Its astounding toughness made pilots adore it: The B-17 could roar through angry squalls of shrapnel and bullets, emerging pockmarked but still airworthy. It was a symbol of American ingenuity, held aloft by four engines, bristling with a dozen machine guns.

“Imagine being a pilot of that mighty plane. You know your primary enemy—the Germans and Japanese in your gunsights. But you have another enemy that you can’t see, and it strikes at the most baffling times. Say you’re easing in for another routine landing. You reach down to deploy your landing gear. Suddenly, you hear the scream of metal tearing into the tarmac. You’re rag-dolling around the cockpit while your plane skitters across the runway. A thought flickers across your mind about the gunners below and the other crew: ‘Whatever has happened to them now, it’s my fault.’ When your plane finally lurches to a halt, you wonder to yourself: ‘How on earth did my plane just crash when everything was going fine? What have I done?’

“For all the triumph of America’s new planes and tanks during World War II, a silent reaper stalked the battlefield: accidental deaths and mysterious crashes that no amount of training ever seemed to fix. And it wasn’t until the end of the war that the Air Force finally resolved to figure out what had happened.”

This article was originally published on WBUR.org.

Copyright 2020 NPR. To see more, visit https://www.npr.org.