UX 17

At the UX Poland Conference, presenter Jeff Parks said: “Without research, businesses cannot make informed decisions.” By analyzing data, entrepreneurs and UX professionals get information that helps them develop more efficient and profitable products and services. Contextual analytics is a great way to get the most out of big chunks of data. Facebook and Twitter, for example, are excellent sources of data, but detecting patterns in order to make sense of all that available information is incredibly time-consuming and complicated. Going a step further, once we make sense of the data, we need to analyze it in order to create valuable insights, and that relies on having a specific context in mind – otherwise we end up with nothing but generalized conclusions. In short, applying contextual analytics can help emphasize the individuality of the consumer and his or her behavior.

WHAT IS BIG DATA?

To start with, big data is called “big” for a reason – we’re talking terabytes and terabytes of information flowing into companies every day. Regular data becomes “big data” when it is large enough that it cannot be easily processed using conventional methods. At this volume, spreadsheets are useless; they lack flexibility and scalability. Once processed, however, that big data is very valuable. With the context provided by analytics, big data can highlight key metrics, allowing UX professionals to create tailored solutions for their users. Big data analytics, in turn, furthers the business strategy. Big Data in Retail: Examples in Action. How is all this relevant to today’s businesses? There are three specific ways in which big data can lead to better products and a better user experience—when viewed with context: Big data allows us to expand our knowledge of the customer and develop products and services that are best suited to their needs.
Big data gives us a deeper understanding of how our customers behave, allowing us to connect with customers on a more meaningful level. Big data can help boost marketing activities, since it provides us with a chance to analyze customer behavior on multiple channels and understand when the customer is most likely to buy products or services. In this article we’ll take a look at how to make the most of analytics, and why context is so important for big data.

CONTEXTUAL DATA

A few years ago, one of my customers at UsabilityTools used our Form Tester tool to learn how users behave on his website. The Form Tester tool allows us to analyze the way website visitors interact with online forms. It provides data about each field of the form, identifying steps that cause dropouts. The first thing that we noticed (thanks to the tool) was a high bounce rate, which indicated that something might be off with the form. In order to analyze the problem, we started to look at different elements of the form. We noticed that the usual response time per field was about five seconds, but one particular field took users over three minutes to fill in. Having found a problem, we put it into context: the field required an ID card number, which meant the user had to leave his desk, look for his wallet, and come back to copy the data from the card. That explained the three-minute wait, and showed us that we didn’t need to fix a problem with the form so much as prepare users for needing their ID cards, so that they wouldn’t leave when they encountered this more “strenuous” task. Contextual analytics is what allows businesses to trace patterns and detect trends like we did. It helps designers build predictive models and prepare a suitable business strategy. Context is what makes the difference between “big data” and “dumb data.”

WHAT’S WRONG WITH NO CONTEXT?

Most companies in the digital industry already have some web analytics software implemented.
But that doesn’t allow them to fully understand the psychological and cultural factors that influence customer lifestyles. From a business perspective, the typical elements of web analytics, such as page views or bounce rate, provide data that can actually lead to conclusions and mission-critical insights that are simply wrong. It’s easy to see how plain numbers can lie, especially when taken out of their context. Let’s take the “average time on site” metric. Five minutes looks pretty solid as an average, but when we look at individual visits, we suddenly see that the majority of visitors spend only ten seconds on the site, and the average is distorted by a few prolonged visits! The realization that we can’t blindly trust data has been circulating for a while now – in an article entitled “What Data Can’t Do,” David Brooks of The New York Times points out that the main problem of big data is that it’s “pretty bad at narrative and emergent thinking, and it cannot match the explanatory suppleness of even a mediocre novel.” The best way to deal with this problem is to follow the words of Scott Gnau of Teradata Labs: “big data is a new piece, but it is not the only piece of the data puzzle.” Context and context-derived analytics can unlock the potential stored within big data; by contextualizing the data at hand, we can do things like improve our customer insights and identify the reasons behind common consumer behaviors. From here, businesses can create experiences that actually surprise and delight their users. House of Cards is a popular political drama featured on Netflix, and perhaps one of the best examples of big data’s influence. Netflix actually uses big data to customize the House of Cards plotline and character twists. As Salon reported, if a user is watching the first episode and pauses it to get a snack, Netflix records the pause and the play.
It’s impossible for Netflix to determine the reason why viewers paused the episode, but they can ask and hypothesize: why do people pause at that moment? Is it because it’s shocking, repulsive, captivating or simply boring? Why do so many people rewind to exactly fourteen minutes into the episode? Is it because something is difficult to understand, or is it because the scene was amazing? Finally, why do viewers stop watching the episode half-way through? The reason could be simple: the show was just bad. By looking at the scenes during which these events (pause, rewind, stop) happen, the analytics team puts the events in context, and the results of their analysis are used later to improve the future viewing experience. Currently, according to rottentomatoes.com, both seasons of House of Cards have received ratings well above 80%—proof that the series is successful. By putting big data in context, Netflix has created, and will continue to improve, a show that keeps people glued to their screens. House of Cards is an extreme example, but the same principle applies to any experience. Online delivery app Foodler recommends “best bets” to users based on previous items they have ordered from similar restaurants. They could go a step further and analyze their data within the context of time of day, and begin recommending breakfast-, lunch-, or dinner-specific foods at the appropriate times. Similarly, Target uses contextual big data to identify changes in customer behavior—this is how Target famously learned that a customer was pregnant before she had even told her family. Foodler is able to predict what a user is likely to eat at any restaurant. Knowing the why behind the data is what is really valuable. This context explains the psychology behind consumer behavior and consequently influences our ability to develop marketing strategies that successfully reach users at key touch points.
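To make the Foodler time-of-day idea concrete, here is a minimal sketch of a context-aware recommender. The meal windows, dish names, and data model are all hypothetical; Foodler’s real system is not public.

```python
from datetime import datetime

# Hypothetical meal windows (start hour, end hour, label) and a single
# user's past orders grouped by meal. Purely illustrative data.
MEAL_WINDOWS = [(5, 11, "breakfast"), (11, 16, "lunch"), (16, 23, "dinner")]

past_orders = {
    "breakfast": ["veggie omelette", "granola bowl"],
    "lunch": ["chicken burrito"],
    "dinner": ["margherita pizza", "pad thai"],
}

def recommend(now, orders=past_orders):
    """Recommend dishes the user has ordered before, filtered by time of day."""
    for start, end, meal in MEAL_WINDOWS:
        if start <= now.hour < end:
            return orders.get(meal, [])
    return []  # late-night hours: no contextual match in this sketch

picks = recommend(datetime(2015, 6, 1, 12, 30))  # a lunchtime visit
```

The point of the sketch is that the same order history yields different suggestions depending on a single contextual signal (the clock), which is exactly the “step further” described above.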
THE ROLE OF CONTEXT IN PREDICTION

There is every reason to think that getting the hang of accurate models and patterns is key to boosting decision-making processes in the big data environment. Contextual analytics feeds predictive analytics, producing a perspective on how people actually behave that is crucial to building great predictive models. We can use contextual analytics to emphasize data usability and business relevance. This allows us to create models that predict the future behavior of consumers, such as when Amazon recommends additional products: when I purchase a tent, Amazon uses predictive analytics to determine that I might want a sleeping bag as well. Contextual analytics can determine, for instance, whether different data observations can be ascribed to a single individual, providing the context for the accurate merging of data to form real associations. For example, an e-commerce owner using predictive analytics will note that many customers purchase shoes on Friday afternoons, but contextual analytics will allow them to see that most of these customers are in office buildings, and are more likely to purchase when waiting for a client or meeting (in the last five minutes and first five minutes of the hour).

Data-driven context and possible actions:
Repetitive consumer habits (e.g. buying a certain type of product or purchasing at certain times) – suggest those types of products or display offers during the purchase.
Other consumer habits (e.g. products purchased by people with similar habits/demographics) – suggest products that other consumers picked.
Context derived from outside data (e.g. determining a user’s hobbies based on their mobile apps) – suggest products concerning trending topics/people (e.g. recently awarded movies, deceased musicians).

In a study of contextual analytics, Lisa Sokol and Steve Chen of IBM created another example, involving the traditional scoring system banks use to determine whether a client is eligible for a loan. If the bank uses analytics, they stated, it will see every account from every bank, but it won’t be able to associate all the different accounts from several banks with one person and, in consequence, will have to base its decision on imprecise information. With contextual analytics, on the other hand, the bank is able to see that those several accounts belong to one person, and so will have all the necessary information to accurately evaluate the client’s ability to pay back a loan. By taking advantage of context-driven analytics, we can increase the efficiency of prediction models and make better business decisions as a result.

NEXT STEPS

Recognizing the benefits of using context in big data analytics is only the first step. Once we start gathering contextual data, we’re ready to look for contextual insights and, as a result, create better customer experiences. Here are a few ways to get started:
Study the data and KPIs that others are using, to better understand which are most relevant to the field.
Don’t trust average values! Consider the context, whatever the metric is.
Read more about the contextual analysis of big data: Age of Context: Mobile, Sensors, Data and the Future of Privacy by Robert Scoble; The Human Face of Big Data by Rick Smolan; Data Crush: How the Information Tidal Wave is Driving New Business Opportunities by Christopher Surdak; and the Big Data Geeks blog.
Follow influencers on Twitter who often post useful articles on big data, such as Bob Gourley, Tony Baer or DJ Patil.

Sophia is discussing usability testing with her client and can’t wait to get started. The only problem is that they’ve got different ideas on what to test and which areas of the website to focus on.
Sophia’s client has knowledge of his customers, while Sophia has years of UX experience to base her testing tasks on. With no sign of an agreement between Sophia and her client, she turns her attention to analytics to gain some insight into how people are really using the website. Usability testing and analytics make for a dynamite team, enabling us to learn about our users, track our goals, and troubleshoot unexpected problems. When it comes to troubleshooting, analytics tell us what pages or journeys are causing problems for users, and help identify what areas we should focus on in usability testing. The usability tests will then tell us why users are behaving in those specific ways. Between the two, we can provide focused, user-specific recommendations for site owners. In the case of Sophia (and many other UX practitioners like her), analytics show exactly how users are accessing the website. Though her background in UX and her client’s customer know-how might have resulted in good assumptions about what to test, analytics showed them how people were using the website in a clear, unbiased way. For anyone willing to learn a few simple tools, analytics help:
identify problem areas on a website
show how users are engaging with a site
measure the results of any design improvements
In this two-part series, I’ll explain how to use analytics to identify where users are having issues, and what areas of a site will most benefit from usability testing. Today’s article focuses on three metrics to identify problems on a website: bounce and exit rate, average time on page, and page value. In part two, we’ll move on to using these metrics to identify drop-off points, and then we’ll dig into segmenting the data to pick up on additional details.

IDENTIFYING PROBLEM PAGES AND SECTIONS

As a freelance UX consultant, I’ve worked on a wide range of websites across multiple industries, and the process consistently begins with analytics.
I start by identifying how many users visit the site daily, and which pages are the most popular. This gives me an overview of how people are using the site. I then move on to identifying potential problem areas, which will later become the focus of my UX recommendations. In general, I look at three types of metrics to identify problem areas:
Bounce and exit rate
Average time on page
Page value

BOUNCE AND EXIT RATE

“Bounce rate” and “exit rate” are two metrics that can cause confusion. Bounce rate is the percentage of users who visited just that one page of a site: arriving on a page but then leaving without viewing any other pages on the site. Exit rate is the percentage of people who leave the site from a page; this includes people who have visited other pages on the site previously. A diagram showing bounce rates and exit rates. If I notice a section of the website that shows a high bounce or exit rate, I make note of it, in case something on a particular page is driving away visitors. A page with a high bounce rate may indicate that the content on the page wasn’t what the user was expecting when they arrived there. A high exit rate may show that this page is causing the user to drop out part-way through their intended journey—on the other hand, if the page with the high exit rate is the final page in the journey, then the exit rate is not a problem at all. Using the “weighted sort” option in Google Analytics makes the bounce rate metric even more valuable. According to Google Analytics, “Weighted sort sorts percentage data in order of importance instead of numerical order.” To give an example, a page may have a 100% bounce rate, but if it only had one visit in the last month then only one person left the page (and a bigger issue may be that no one is visiting the page!). If the page has an 80% bounce rate, but is a key starting page in the user journey, then the site could be losing a lot of business.
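The distinction between the two rates falls out naturally when you compute them from raw sessions. This is a minimal sketch with made-up page paths, not Google Analytics’ internal implementation, but the denominators match the definitions above: bounces are divided by sessions that started on the page, exits by total views of the page.

```python
from collections import Counter

# Hypothetical sessions: each is the ordered list of pages a visitor viewed.
sessions = [
    ["/home"],                       # a bounce on /home (also an exit)
    ["/home", "/product", "/cart"],  # an exit on /cart
    ["/product"],                    # a bounce on /product
    ["/home", "/product"],           # an exit on /product
]

def bounce_and_exit_rates(sessions):
    """Bounce rate per landing page and exit rate per viewed page."""
    entries, views, bounces, exits = Counter(), Counter(), Counter(), Counter()
    for pages in sessions:
        entries[pages[0]] += 1   # session started here
        exits[pages[-1]] += 1    # session ended here
        if len(pages) == 1:
            bounces[pages[0]] += 1  # single-page session = bounce
        for page in pages:
            views[page] += 1
    bounce_rate = {p: bounces[p] / entries[p] for p in entries}
    exit_rate = {p: exits[p] / views[p] for p in views}
    return bounce_rate, exit_rate

bounce_rate, exit_rate = bounce_and_exit_rates(sessions)
```

With this data, /product has a 100% bounce rate (its only landing session bounced) but only a 2/3 exit rate, which is exactly the kind of gap the two definitions are meant to capture.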
Recognizing whether the problem is that no one is visiting the page, or that everyone visiting immediately leaves the site, is crucial to preparing an appropriate usability test around the page in question. A Google Analytics table where rates are weighted.

AVERAGE TIME ON PAGE

“Average time on page” is the average amount of time that users spend viewing a webpage. If I see a page with a low “average time on page,” it may mean the page is under-performing by not holding the user’s attention. On the other hand, if users are spending a lot of time on a checkout page, it might be because the page is overly complicated. Of course, all metrics should be viewed in context; if a blog article has a high “average time on page,” it’s generally a good sign, since it may imply that users are actually reading the whole post. Another good way to quickly gauge how pages are performing is to use the “compare to site average” display option. This is a graph that shows whether pages are significantly above or below average for the selected metric. Pages will still need to be analyzed on a page-by-page basis, as different pages have different objectives, but pages with a lower than average time on page are likely to be an issue, assuming the purpose is to keep users reading. The example below clearly shows that the “Contact” page has a lower time on page than the site average, while the “Blog” page has a time on page over 80% higher than the average. A Google Analytics table of average time spent on pages. Again, context is key here. Users may be arriving on the contact page to check the address of a company, or to find their phone number. If they successfully achieve this then they will leave the site, so a low time on page here can be a good sign that the page is performing efficiently. A “blog” page is expected to hold a user’s attention, so the higher than average time on this page could be seen as a good thing.
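Time-based averages deserve the same skepticism as the “average time on site” example earlier in this issue: a handful of long visits can drag the mean far away from what a typical visitor does. A toy calculation with hypothetical visit durations makes the distortion visible.

```python
from statistics import mean, median

# Hypothetical visit durations in seconds: nine quick ten-second visits
# and one prolonged visit that drags the average up.
visits = [10] * 9 + [2910]

avg = mean(visits)        # the headline "average time" figure
typical = median(visits)  # what the majority of visitors actually did
```

Here the mean reports five minutes (300 seconds) while the median shows that most visitors stayed only ten seconds, so checking the median, or the full distribution, before trusting an “average time” metric is cheap insurance.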
PAGE VALUE

An important, but underused, metric for spotting poorly performing pages is “page value.” Page value, as the name suggests, is a way to give a single web page a direct monetary value. It pulls the values in from a combination of transaction revenue, for ecommerce sites, and goal value for all other types of websites, both of which need to be set up manually in Google Analytics for a page value to be calculated. A high page value will often be a sign of an important page, which indicates that it is a good page to focus on during usability tests. High-value pages that show a high exit rate are a good area to focus on for improvement. These are pages where users are dropping out at a key part of their journey to conversion. In the example below, taken from an ecommerce site, I’ve highlighted three categories with a similar page value. It is clear that there is a far higher exit rate for the Personalised Toys product page. This shows that this is a high-value page that’s “leaking” users, and should be the focus of future UX work. A Google Analytics table of page values. Looking at individual pages will only show part of the picture though. It’s important to use the “content grouping” feature to look at how whole sections of a website are performing. Content grouping is essentially a way to segment data by the types of pages that users visit on a website. Pages can be grouped in a variety of ways. For a site that sells clothing, for example, groups could be set up for each type of clothing, showing whether pants have a higher page value than shirts! Once a page or section has been identified as having a low page value, the next step is to find out why this might be the case. In the example above it’s been shown that shirts have a comparatively low value. My first step here would be to look for any clear UX or technical issues on the shirts pages, using my experience and judgement.
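The screening described above, high-value pages with high exit rates, is easy to automate once the two metrics have been exported. The page names, values, and thresholds below are invented for illustration, not taken from the ecommerce site in the example.

```python
# Hypothetical per-page metrics exported from analytics:
# page path -> (page value in pounds, exit rate as a fraction).
pages = {
    "/toys/personalised": (12.50, 0.62),  # high value, leaking users
    "/toys/wooden": (12.10, 0.24),
    "/toys/soft": (11.90, 0.21),
    "/blog/gift-guide": (0.40, 0.70),     # high exit, but low value
}

def leaking_pages(pages, min_value=5.0, max_exit=0.40):
    """High-value pages with unusually high exit rates: usability-test candidates."""
    return sorted(
        path for path, (value, exit_rate) in pages.items()
        if value >= min_value and exit_rate > max_exit
    )

candidates = leaking_pages(pages)
```

Note that the blog page is excluded despite its high exit rate: with a negligible page value, exits there cost little, which is the whole reason for filtering on both metrics at once.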
After doing this I would test the page, or pages, with real users to see why they are experiencing these issues—and look for clues indicating how we would fix them. A Google Analytics table with content grouping. Content grouping can be a really powerful way to see how different parts of a website are performing.

USING METRICS IN PRACTICE

This is the first step in using analytics to identify website problems. In the second part of this series, we’ll look at how to identify drop-off points in the user journey and how to segment users to see more details. In the meantime, try identifying potential problems using the methods from this article:
Pull up the bounce rate to find pages users land on and immediately leave.
Read through the exit rate of pages to see where in the customer journey users leave the site.
Factor in the importance of a user’s average time on a page—a high bounce rate on a blog page, combined with a long average user time, is actually a good thing!
Look at the pages ranked by page value. The higher the page value, the more important it is to usability test, and ultimately fix, problems users are having on the page.

ABOUT THE AUTHOR

Luke Hay
Luke has been working with web analytics for over 15 years. He currently splits his time between working as the Senior Conversion Strategist at integrated digital agency Fresh Egg and as a freelance UX and analytics consultant and trainer. Luke has helped organize and curate events for UX Brighton and is one of the organizers of UX Camp Brighton. You can follow Luke on Twitter: @hayluke or on his website www.lukehay.co.uk

IDENTIFY DROP-OFF POINTS

Knowing how users move through a website can add context to single page stats.
For example, analyzing previous pages on a user journey may help to indicate why the exit rate of a particular page is so high. In addition, finding out the common user journeys through a website can be very beneficial when it comes to creating usability tests. Usability test tasks can be created to mirror those common user journeys, ensuring that the behavior of users during tests is in line with that of existing site users. Google Analytics attempts to show user journeys with the user flow and behavior flow reports. These can be hard to read, and often suffer from grouping multiple pages together, meaning that Google Analytics will often only show the top few most popular pages individually but will then combine several pages and label them as “>100 pages”, which is of no help at all. The screenshot below shows how only a few individual pages are displayed for each step of the journey before pages are grouped, making analysis difficult due to this limited information. Despite the issues caused by page grouping, spending time analyzing these reports can identify problem areas based on drop-off rates or unexpected user journeys (i.e. did a user go in a very different direction than we expected?). Once we identify the problem areas, we can create usability tasks to see how users are thinking as they go through the journey, and learn why they’re having trouble. In Google Analytics’ user flow and behavior flow reports, pages are visualized as green boxes with grey lines showing the user journeys between them. Each box also shows the percentage of ‘drop-offs’, in red, where users are leaving the site. These reports can help demonstrate popular user journeys and where users are exiting the site, which is another indication of problem areas. The example below comes from a travel site that I worked on, which featured a prominent search box on the homepage. In this annotated and simplified picture, we can see a potential issue.
Visitors were using the search box to find a holiday destination, but then returning to the home page from the search results page (aka pogo sticking), demonstrating that the search results shown were unsatisfactory in some way. This could be due to a number of reasons: perhaps the search was regularly returning no results, too many results, or too few results. It could also be that the problem was not with the search results themselves but with another factor, such as the prices for the holidays shown in the search results being too high. The fact that the data suggested that the initial search was unsatisfactory led me to run some usability testing on the search box. The usability testing uncovered that the problem was actually that the search results were too broad, and users were overwhelmed by the number of results. Based on the outcomes of the user testing, I suggested introducing a faceted search system on the results page, allowing users to filter results on a range of criteria without having to start their search again from the homepage. The new search system allowed users to filter their results on facilities offered, such as whether the hotels in the results had swimming pools, gyms and other facilities, which in turn meant that they were able to find results that were useful to them. The design solution led to a large reduction in the number of users returning to the homepage after their initial search, and saw more users reaching the next step of their journey. The screenshot above shows the analytics for the month after faceted search was introduced, showing a reduction in ‘pogo-sticking’ between the homepage and search results. While there is clearly still room for improvement here, it was encouraging to see positive results from the change.

SEGMENT DATA FOR GREATER DETAIL

Segments offer a great way to look at the differences in behavior across different types of users. One simple example would be comparing new and returning users.
The graph below shows that, in this example taken from an online jobs board, the number of visits from new users remained fairly consistent across the month. The visits from returning users followed a different pattern though, with clear dips in traffic around the weekends. This led me to look in more detail at the differences between new and returning visitors. Looking at other metrics for these two groups showed that returning users tended to spend longer on the site, look at more pages per session and were more likely to apply for jobs. From this data I was able to hypothesise that returning visitors were more likely to be serious job seekers, while new users were more casual in their approach. This led to me recommending some personalisation for the site. My recommendation was that new users should be shown reassurance that the jobs board was a legitimate and trustworthy place to search for jobs, and be directed towards quick/simple calls to action, such as signing up for job alerts. I also recommended that returning visitors be shown more sophisticated and detailed job search options, and messaging to encourage them to apply. The ability to see how new and returning visitors behave differently can demonstrate many things, depending on the type of site. It can show, for example, that people returning to an ecommerce site are more likely to convert. If this is the case, it may be worth focusing on helping convert additional users on their first visit. The analysis of this type of segmentation could also play a part in recruitment for usability testing. If there are clear differences in the behavior of new and returning visitors then it may be advisable to test with both existing users and those who have not used the site before. Testing with these different user types may help explain why they are behaving differently on a site.
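The new-versus-returning comparison above boils down to grouping sessions by a segment label and comparing metrics across the groups. Here is a minimal sketch with invented jobs-board data; the session tuples and metric names are assumptions for illustration, not the site’s real figures.

```python
from statistics import mean

# Hypothetical sessions: (visitor segment, pages viewed, applied for a job?).
sessions = [
    ("new", 2, False), ("new", 3, False), ("new", 1, False), ("new", 4, True),
    ("returning", 6, True), ("returning", 8, True), ("returning", 5, False),
]

def segment_summary(sessions):
    """Compare engagement metrics between visitor segments."""
    summary = {}
    for segment in {s for s, _, _ in sessions}:
        subset = [(pages, applied) for s, pages, applied in sessions if s == segment]
        summary[segment] = {
            "avg_pages": mean(pages for pages, _ in subset),
            "apply_rate": sum(applied for _, applied in subset) / len(subset),
        }
    return summary

summary = segment_summary(sessions)
```

In this toy data, returning visitors view more pages per session and apply more often, which is the pattern that motivated the personalisation recommendations above. The same shape of summary works for any segment label: device type, traffic source, or a custom persona-based segment.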
There are several pre-prepared segments in Google Analytics to help slice up the data, beyond the new and returning users example shown above. These include:
Different traffic sources—useful for identifying how visitors who found the site via search differ from those who came via links on other websites.
Visitors using different device types—useful for comparing the metrics of mobile, tablet and desktop users.
It is also often a good idea to create custom segments, so that the segments closely match the key audiences and persona types across a website. In this way, we can analyze the different user journeys that these groups take through the site, such as comparing existing customers’ journeys to first-time purchasers. Segments can be used to view the user journeys of people using different device types. Segmenting by mobile, desktop and tablet will give three different behavior flows to investigate. This can be particularly useful when it comes to identifying potential problem areas for these different device types. The behavior flow diagram for mobile users may show a big drop-off point during the user journey that is not an issue for desktop or tablet. This should lead on to mobile usability testing, focusing on those problem areas to find out why mobile users are dropping out of the journey at this point.

NOW WHAT?

The next step after using analytics to identify the problems is to find out why users are having those problems. Looking at analytics provides some key areas to focus usability testing or split testing on. As UX professionals we naturally want to spend time with our users, learning from them through usability testing. Analytics simply help guide our way to better tests. Try it – take some of the techniques outlined here and try applying them to a specific project. It’s surprising how much we can discover through analytics. For readers still feeling uncertain, there’s plenty of help available.
To keep up-to-date with the latest developments in Google Analytics, I recommend the official Google Analytics blog and Occam’s Razor, the blog of analytics guru Avinash Kaushik. A practical way to improve analytics know-how is to check out the Google Analytics training centre. These tutorials are also excellent preparation for gaining a Google Analytics Individual Qualification.

“The start of any project is where the greatest risk lives. Essentially, you’re starting from the darkest depths of the ocean…and it is a long, long way to the surface. If this sounds like a big task, that’s because it most certainly is. But if you look at all the individual parts of any design process, and if you understand how they affect each other, it becomes a lot easier to tackle. And if you devote significant time and attention to the very first order of business — your strategy — the foundation you build will be strong enough to withstand any weather as you move into design and coding.” –Think First, by Joe Natoli

Author Joe Natoli

Thus begins Think First, Joe Natoli’s new book about strategy, thoughtful UX, and successful projects. It’s the perfect read for anyone who has ever been mid-project wondering, “how did we end up here?!” In Think First, Joe outlines not only what a strategy is, but how to build one, and how to avoid many of the pitfalls we face when we follow them. We’re delighted to be interviewing Joe this week.
Read on to find out what inspires him, how he thinks we can design for the Internet of Things, and his magic formula for doing the impossible: staying within a project’s scope! Want a copy of Think First? Enter to win a signed copy here.

Strategy provides the big-picture perspective for UX professionals who might otherwise get caught up in the weeds. What inspired you to write about strategic thinking?

Experience. After doing this for so long — headed towards three decades now — what’s become really clear is the fact that when a product fails, the underlying cause is rarely poor UI design, and it’s not technology platform limitations. It’s not even bad UX. Those things all contribute to and exacerbate the problem, but they’re really just symptoms. The underlying disease, so to speak, is almost always strategic in nature. The team delivered what the business thought customers expected or wanted or needed, without taking the time to qualify those features or functions or interactions. For example, maybe the UI design was done according to an existing theme or template, instead of being customized in a way that guided users appropriately, in a visual language they specifically would understand, that was appropriate for the context of use. In other words, the team jumped to tactical solutions before ever asking enough strategic questions. Designers were told to mimic a competitor’s UI or interaction pattern, or that of a completely unrelated business. Development teams were pressured to deliver under time and budget constraints that left no room to implement anything other than lowest-common-denominator functionality. While there was a whole lot of effort and activity, no one was certain whether any of this was the right thing to do.
As a client of mine likes to say, "the urgent trumps the important." My experience has been that typically the three big questions I talk about in Think First are never asked, or at least never explored to the degree they should be: what's worth doing, what exactly should we be creating, and what value does it provide? Knowing why you're doing something is the first step in making something of value. And when you don't take the requisite time to ask and then qualify the answers, you build something that people either don't need or can't use. You're solving the wrong problems — even though in some cases you're doing so really well!

It's so important to know why you're embarking on a project. Of course, it's important for us to remember that realistically, some organizations' primary reason for creating things might be to "be successful" or "make money," or even to "make a difference." How do you recommend designers convert these vague goals into more specific strategies?

You press the people in the room for specifics. And as I say in the book, until you get specifics, you keep asking questions. And you explain why specifics matter. For example, with enterprise clients, I've had the following exchange many times:

Client: "We have to make this responsive."
Me: "Why?"
Client: "So people can use the site on their mobile phone."
Me: "What part of this data-intensive system will anyone really be willing to look at on a six-inch screen? What specific interactions do we think they'll be willing to undertake in this format?"

I often get silence after that question. But the reason I ask it is that "responsive" isn't a measurable goal. So I usually give them an example, something I've experienced before or that a colleague has shared with me. For example, I tell them that I might learn that the company could save $600 for every one of its 3,000 employees who complete an onscreen process they're ignoring now.
And maybe we would then find out that while they hate the internal system, simplifying it on a mobile device may get somewhere around 63% of all employees to do it. Now we're talking about saving $1 million and change semi-annually. That's a clear target, one we can focus on. And more importantly, one they're willing to invest time and money into researching and solving. The light bulb goes on when you tell that story. They understand why you're being a pain in the ass, and they see how taking time to be specific could benefit them. You do that in a polite, respectful way, mind you. I take great pains to remind clients that I'm not pressing because I want to make their lives difficult: I'm doing it so that when they pull the trigger on a significant investment of time and money, they can do so with some peace of mind, with certainty that a real, measurable business problem will be solved as a result.

That's a great place to get to with a client. You also suggest comparing the top business goals of your organization to those of your competitors. Of course, the challenge there is that we can't see the competitors' process, only their output. How do you recommend a designer determine the business model of a competitor?

When you don't have access to any intel on competitors, you use the Internet. If people either radically love or hate something, you will find a wealth of evidence confirming that online. Facebook and Twitter alone are littered with praise and criticism for both B2B and B2C products. Any Google search on any business, product or service will return a mountain of forums and blogs and support websites filled with evidence. And while you will certainly have to take some of what you read with a grain (or pound) of salt, when you do enough digging you will absolutely start to see patterns: repeated instances of the same issues, over and over, people praising the same good and complaining about the same bad.
So you can get a pretty reliable read on what competitors are doing right, and where they're screwing up. But it takes time — you really have to dig, spend a significant amount of time sifting through hundreds of instances.

In your introduction you point out many important questions to consider for the user, including "how does using our product fit with other products they may be using?" This is even more important in today's Internet of Things (IoT). Can you speak to some of the challenges in designing amidst the IoT?

I think the biggest challenge for everyone — from product owners and BAs to designers, developers and the businesses they serve — is having the requisite time and resources to widen your scope enough to consider these things. In Think First I talk about the fact that there are a number of naturally competing forces at play in any project: preconceptions, personal opinion, politics, time, money, resources, and more. And all of those force all of us to be really judicious about how we spend our time and what we spend it doing. That's real pressure, and it often results in what I described at the outset of our conversation: not enough time spent asking and answering strategic questions, asking why, exploring the larger issues. The IoT essentially amplifies the fact that nothing is used in a vacuum. Nearly every tool we use, digital or otherwise, is used in conjunction with a multitude of software and hardware additions and variations. Sometimes those are things your app or system should be doing, so people can use one tool instead of three. At the same time, sometimes those are necessary complements, because you'll never be able to provide what those other tools can. And rolling out something half-baked is often a recipe for disaster. You can't unseat the king of the hill unless you are superior in every way — and if you're not, you'll just get yourself hurt. I'll give you an example.
I worked with a client a while back who wanted to add an unrelated feature set to their core product, in order to capitalize on the popularity of a specific app their customers were using in conjunction with theirs. I advised them to qualify the idea before committing to such a large bet. Their plan didn't involve nearly the amount of upfront customer research and prototype testing that was necessary to qualify the idea and figure out whether it was worth doing. What I told them was this: "Unless you roll this out in a way that is absolutely, unquestionably superior to that app — in form, function, ease of use — you're going to waste a great deal of time and money and capture none of that market. If you don't want to commit to validating the idea, spend the money improving some other area of your business." They did it anyway, their way. Several million dollars down the drain on features no one used because — you already know the answer — it was faster and easier to use the other app.

So frustrating! Of course, doing all of the necessary research would probably have convinced them to add quite a bit beyond the original scope of the project. How do you recommend a designer help keep clients within scope while also maintaining the strategic goal?

This is one of my favorite topics, and it could be my favorite part of the book! I think there are two parts to managing scope with clients. The first is that you have to do a good job of clearly establishing scope, down to the most minute detail. A lot of folks tend to rush into the design and development process without really, fully understanding everything that the finished product needs to do. In most cases, what happens is that no one raises the red flag when they first start feeling uncomfortable: that little voice that suggests something is amiss, that scope isn't clearly defined, that something isn't possible or just doesn't make sense.
As I say in the book, silence is almost always interpreted as agreement — and that can get you in trouble. So if you say nothing when that voice in your head pipes up, you're not only agreeing that all of this is the right thing to do — you're agreeing to do it. The second part of managing scope is that you have to be willing to act as the gatekeeper, the person who politely but firmly says "no" to additional requests that are outside the established scope. You can't take those requests personally, and you can't internalize the pressure to do them to make someone happy. You have to hold your ground and say no. Otherwise, every little addition is another cut, another injury, and the project dies from the fabled "death by a thousand cuts." It never ends, it never launches, and everyone involved is very, very frustrated.

Thank you so much for talking with us, Joe! Before we say goodbye, how do you recommend our readers begin the journey to thinking first?

First, let go of the idea that you have to be right. Don't get yourself locked into the idea that you have to know, at every point in a project, exactly what your requirements are, exactly what will result in the best UX or exactly how something should be designed. You don't, and you shouldn't be expected to. You don't miraculously have that intrinsic knowledge; you get it by investigating, by being willing to try some things, be wrong and learn. You're not the smartest person in the room, and you don't have to be. Learn to collaborate instead, and put all the brainpower in the room to work. Be patient, be flexible and remember that there is always more than one right way to do something. Second, something I say often, both to myself and to young designers and developers: let go of the idea that success comes from being fearless. Forget the idea that successful people are somehow fearless in their endeavors. That's not true.
People who succeed are almost always feeling more fear than they think they can handle, and they dive in and do it anyway. And even if you do get your nose broken, you'll learn that it doesn't kill you and that, in fact, the next time is a lot better from having had that experience. So in whatever you're doing, allow yourself to feel the fear — and then do it anyway.

Joe Natoli is the author of Think First: My no-nonsense approach to creating successful products, memorable user experiences and happy customers. His online UX courses serve over 30,000 students, and he has consulted with and trained Fortune 500 and 100 organizations for nearly three decades. His articles, tips and advice can be found at givegoodux.com. Readers: want to get a free copy of Think First? Enter our giveaway!

ABOUT THE AUTHOR

UX Booth

UX Booth's editors are constantly pushing the digital envelope. UX Booth shares stories by and for those practicing user-centered design.