This study examines how users behave and navigate on mobile e-commerce sites. It specifically looks at how sites can improve the user’s ability to find, evaluate, select, and purchase products in the unique mobile context of a touch interface with very limited screen real estate. This includes areas such as mobile homepages, mobile category navigation, mobile search, mobile product lists, filtering and sorting, mobile product detail pages, mobile shopping carts, and mobile checkout flows.
This Mobile E-commerce usability study is based on two main research components: a large-scale usability study and a comprehensive usability benchmark.
The methodology for each of these research methods is described in detail below.
To purchase access to the Mobile E-commerce Usability Report & Benchmark go to: baymard.com/mcommerce-usability
One part of this research is based on a large-scale usability study of 18 major e-commerce sites. The usability study tasked real users with finding, evaluating, selecting, and purchasing everyday products at mobile e-commerce sites.
The 1:1 “think aloud” test protocol was used to test the 18 mobile sites: 1-800-Flowers, Amazon, Avis, Best Buy, Buy.com, Coastal.com, Enterprise.com, Fandango, Foot Locker, FTD, GAP, H&M, Macy’s, REI, Southwest Airlines, Toys’R’Us, United Airlines, and Walmart. Each test subject tested 3-6 mobile sites, depending on their pace. Each subject’s test session lasted between 1 and 1.5 hours, and subjects were allowed breaks between each site tested.
Despite these being major, well-resourced sites, the test subjects encountered 1,000+ usability-related issues during the test sessions, ranging from minor interface interruptions to severe misconceptions about the basic premises of the m-commerce site that resulted in abandonments. Our studies show that 43.2% of all smartphone or tablet users have abandoned an online order during the checkout flow in the past 2 months, and 61% sometimes or always switch to their desktop devices when having to complete the mobile checkout process. Moreover, despite testing the mobile e-commerce sites of major businesses such as Walmart, Amazon, Avis, United, Best Buy, FTD, and Fandango, numerous test subjects were unable to complete a purchase at several of these sites. Note that the word is unable, not unwilling. The usability issues were that severe.
All of these findings have been analyzed and distilled into 146 mobile usability guidelines on how best to design a mobile site that aligns with users’ unique behavior under the mobile platform constraints.
Since there will always be contextual site differences, the aim of this study is not to arrive at statistical conclusions about whether 21.1% or 24.2% of your mobile users will encounter a specific issue. The goal is to uncover severe user experience problems that are likely to occur on your m-commerce site, to document the underlying issues and misalignments, and, just as importantly, to present solutions and mobile design patterns that were verified during testing to resolve these usability issues.
For a study following the think aloud protocol, the binomial probability formula, 1 − (1 − p)^n, shows that with 20 test subjects, 95% of all usability problems with an occurrence rate p of 14% or higher will on average be discovered.
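This discovery rate follows directly from the binomial formula: a problem affecting a fraction p of users goes undetected across n independent subjects with probability (1 − p)^n. A quick check of the figures quoted above:

```python
# Probability that a usability problem is observed at least once across
# n independent test subjects, given a per-subject occurrence rate p:
#   P(discovered) = 1 - (1 - p)**n

def discovery_probability(p: float, n: int) -> float:
    """Chance that a problem occurring for a fraction p of users
    surfaces at least once in a study with n test subjects."""
    return 1 - (1 - p) ** n

# With 20 subjects, a problem affecting 14% of users is discovered
# roughly 95% of the time, matching the figure cited in the text.
print(round(discovery_probability(0.14, 20), 3))  # → 0.951
```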
The other part of this research study is a comprehensive usability benchmark. Using the 146 mobile usability guidelines from the large-scale usability tests as the review heuristics and scoring parameters, we’ve subsequently benchmarked the entire mobile shopping experience at 50 top-grossing US e-commerce sites. This has resulted in a benchmark database with more than 5,200 mobile usability parameters reviewed, 3,600 additional examples for the 146 guidelines, and 650+ mobile page examples from top retailers, each annotated with review notes.
The total UX performance score assigned to each benchmarked site is essentially an expression of how good (or bad) a mobile shopping experience a first-time user will have at the mobile e-commerce site, based on the 146 guidelines documented in the Mobile E-commerce Usability report.
The specific score is calculated using a weighted multi-parameter algorithm:
Below is a brief description of the main elements in the algorithm:
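The report’s exact weights and parameters are not reproduced here. As an illustration only, a weighted multi-parameter score of this general shape can be sketched as follows (the compliance values and guideline weights are hypothetical, not Baymard’s actual figures):

```python
# Hypothetical sketch of a weighted multi-parameter UX score: each
# guideline review contributes a compliance value scaled by a weight
# reflecting that guideline's relative importance/severity.

def ux_score(reviews: list[tuple[float, float]]) -> float:
    """reviews: list of (compliance, weight) pairs, where compliance
    is in [0, 1] for one guideline. Returns a 0-100 weighted score."""
    total_weight = sum(w for _, w in reviews)
    weighted_sum = sum(c * w for c, w in reviews)
    return 100 * weighted_sum / total_weight

# Example: three guideline reviews with differing importance.
print(round(ux_score([(1.0, 3), (0.5, 2), (0.0, 1)]), 1))  # → 66.7
```

In this sketch, full compliance with heavily weighted guidelines dominates the score, so severe violations pull the total down more than minor ones.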
All site reviews were conducted by Christian Holst, Thomas Grønne, Amir Molavi, and Christian Vind, mainly during the summer of 2015, using a US-based IP address. In cases where multiple local or language versions of a site existed, the US version was used for the benchmark.
All reviews were conducted as a new customer would experience the site; hence, no existing accounts or browsing history were used. The documented and benchmarked designs at each site were the mobile: Homepage, Search Field, Search Autocomplete, Navigation Menus, Category Pages, Search Results, Product Lists, Filtering Options, Product Pages, Cart Pages, Account Selection (during checkout), Shipping Address, Shipping Method, Billing Details, Review Order, Payment, Receipt/Order Complete. While one specific page from each site is shown in the benchmark, the reviewer investigated 15-30 additional pages, which were also used for the benchmark scoring.
Baymard Institute provides this information “as is”. It is based on the reviewers’ subjective judgment of each site at the time of testing and in relation to the documented guidelines. Baymard Institute cannot be held responsible for the use or correctness of the provided information.
The screenshots used may contain images and artwork that are copyright- and trademark-protected by their respective owners. Baymard Institute does not claim ownership of any artwork featured within these screenshots, and solely captures and stores the website screenshots in order to provide constructive review and feedback on web design and web usability.
Citations, images, and paraphrasing may only be published elsewhere to a limited extent, and only when crediting “Mobile E-commerce Usability study by Baymard Institute, baymard.com/research/mcommerce-usability”.