How to Use Qualitative UX Research to Identify Conversion Roadblocks

As the world of e-commerce continues to expand, businesses must prioritize their user experience (UX) in order to remain competitive. A key component of creating a successful UX is identifying and removing conversion roadblocks – issues that prevent users from completing a desired action, such as making a purchase. One way to identify these roadblocks is through qualitative UX research. In this post, we’ll explore how to use qualitative UX research to identify conversion roadblocks, and why this approach is essential for creating a successful e-commerce website.

What is Qualitative UX Research and Why is it Important?

Before diving into how to use qualitative UX research to identify conversion roadblocks, let’s define what it is and why it’s important. Qualitative UX research involves gathering feedback from users in the form of interviews, surveys, and observation, with the goal of understanding their experiences and behaviors on a website or app. By analyzing this data, businesses can gain insights into how users interact with their website, what frustrates them, and what motivates them to take action.

So why is qualitative UX research important? First, it allows businesses to better understand their users and create a more user-friendly experience. By identifying roadblocks and pain points, businesses can make necessary changes to improve the user experience and increase conversions. Second, qualitative UX research provides valuable insights that can inform design decisions and help prioritize future improvements. By understanding what users want and need, businesses can make informed decisions that lead to a better overall experience.

How to Conduct Qualitative UX Research to Identify Conversion Roadblocks

Now that we understand the importance of qualitative UX research, let’s dive into how to conduct it to identify conversion roadblocks. There are several methods businesses can use to gather qualitative data, including:

  • User Interviews: Conducting interviews with users can provide valuable insights into their experiences and behaviors on a website. Businesses can ask questions about what users like and dislike about the website, what they find confusing or frustrating, and what motivates them to take action.
  • Surveys: Surveys are a quick and efficient way to gather feedback from a large number of users. Businesses can use surveys to ask specific questions about the user experience, such as how easy it was to find a product or complete a checkout.
  • Observations: Observing users as they navigate a website can provide valuable insights into their behavior and frustrations. By watching how users interact with a website, businesses can identify areas where users get stuck or confused. (We use FullStory to help us with this!)

Once businesses have gathered qualitative data, they can use it to identify conversion roadblocks. Some common roadblocks include:

  • Confusing Navigation: If users have a hard time finding what they’re looking for on a website, they’re more likely to abandon their purchase. Businesses should ensure that their website is easy to navigate, with clear labels and a logical hierarchy.
  • Complicated Checkout Process: A complicated checkout process is a major conversion roadblock. Businesses should strive to make the checkout process as simple and streamlined as possible, with clear calls to action and minimal steps.
  • Lack of Trust: If users don’t trust a website, they’re unlikely to make a purchase. Businesses should ensure that their website is secure and that they have clear policies in place for things like returns and refunds.
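One lightweight way to turn raw qualitative feedback into a prioritized list of roadblocks like these is to tag each comment with a theme and count how often each theme appears. A minimal sketch, assuming hypothetical theme keywords and sample comments (any real analysis would use your own taxonomy and a much larger sample):

```javascript
// Minimal sketch: tag user feedback with roadblock themes and count
// frequencies to help prioritize fixes. Keywords and comments are
// illustrative, not a real taxonomy.
const THEMES = {
  navigation: ["find", "menu", "search", "lost"],
  checkout: ["checkout", "payment", "steps", "form"],
  trust: ["secure", "scam", "refund", "return"],
};

function tagFeedback(comments) {
  const counts = { navigation: 0, checkout: 0, trust: 0 };
  for (const comment of comments) {
    const text = comment.toLowerCase();
    for (const [theme, keywords] of Object.entries(THEMES)) {
      // Count a theme at most once per comment.
      if (keywords.some((kw) => text.includes(kw))) counts[theme] += 1;
    }
  }
  return counts;
}

const counts = tagFeedback([
  "I couldn't find the size chart from the menu",
  "Checkout had too many steps",
  "Not sure your refund policy is real",
]);
console.log(counts); // { navigation: 1, checkout: 1, trust: 1 }
```

Even a rough count like this makes it easier to see whether navigation, checkout, or trust issues dominate the feedback before investing in fixes.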

The Importance of Using Qualitative UX Research to Improve Conversion Rates

In conclusion, qualitative UX research is an essential tool for identifying conversion roadblocks and creating a successful e-commerce website. By gathering feedback from users and analyzing their experiences and behaviors, businesses can identify pain points and make necessary changes to improve the user experience. This, in turn, can lead to increased conversions and a more successful online business.

When conducting qualitative UX research, it’s important to use a variety of methods, including user interviews, surveys, and observations. By leveraging qualitative data pre- and post-optimization, organizations can track their impact and make data-driven decisions about future investments. Roboboogie is the one-stop shop for uncovering user pain points, gathering feedback, and measuring the success of digital transformation efforts. Our expertise in Analytics, User Research, Design, and Technology provides actionable and measurable insights, with flexible services to meet you wherever you are in your optimization efforts.

10 Years of A/B Testing: Embracing Failure for Better Tests

Next month marks the 10-year anniversary of my first A/B test. How things have changed since those early days!

The A/B testing hustle was real. We partnered with Optimizely and the incredible founding team there, under the innovative leadership of Dan Siroker and Pete Koomen.

We were all just figuring things out.

We knew that A/B testing had the power to do something incredible in the market (in fact, it helped support Obama’s re-election success in 2012).

At Roboboogie, we were excited that we could add a new layer to our UX strategy and design, with measurable results. Real-time feedback from actual users based on how they performed? Incredible!

Focus groups and user testing had been helpful tools for us, but they only provided indirect, academic feedback. A/B testing, on the other hand, unleashed a new approach: an interactive, scientific methodology with measurable, real-world outcomes.

Growing up with a biology professor and a chemistry teacher running our household, I viewed the world through a scientific lens. Developing hypotheses was how I figured out the world, and my place in it.

Combining science with marketing in my career was a no-brainer. Bring on A/B testing!

Fast forward to June 2013.

Roboboogie engaged our first A/B testing client. A highly innovative, fast-paced e-commerce client willing to take (smart) risks, with the ‘go-fast-and-break-things’ mentality. “Let’s embrace the experimentation. Test fast, and iterate.”

We were scrappy back then. I remember singlehandedly doing test ideation, sketching UX test variation concepts, hopping on a call with clients for alignment, then jumping into the WYSIWYG editor to build it, set up analytics, QA it, and launch it – sometimes all in the same day.

At our peak, we were launching 12 tests per month (all possible due to their high site traffic and quick purchase cycles).

And we saw big success. Experiments were increasing revenue, unlocking new customer segments, and helping inform product development and positioning.

But the go-go-go approach wasn’t without missteps.

I still cringe to this day about one test in particular.

We tested dynamic pricing for a certain set of luggage products. Each variation presented the pricing differently: the retail price, a promotional price, or an additional discount layer. Regardless of the variation, the product was priced at $74.99.

The test strategy and architecture were sound. The test was built to spec and seamlessly passed QA.

But when we launched the test, the results were staggering. There were massive lifts in product engagement, product add-to-carts, initiated checkouts, user time on site, total page views per session, and… site-wide add-to-carts?

That’s when I got the phone call from our client partner.

“Umm, we have a shopper on the line talking to customer service who is pretty upset that they can’t buy our featured trip to Costa Rica for $74.99. What is going on?! We need to halt all testing immediately.”

Oops.

Instead of my testing parameters applying only to the luggage products we were piloting our experiment on, they had been applied site-wide. To clothing, snowboards, bikes, kayaks, and even trips. The test was running perfectly for our intended products – but also everywhere else.
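The lesson generalizes: an experiment should actively check that it is on an in-scope product before activating, rather than trusting site-wide targeting rules alone. A minimal sketch of such a guard, with hypothetical SKU prefixes and page object (not the actual tooling or data model from this test):

```javascript
// Minimal sketch of the lesson learned: activate an experiment only
// when the current product is explicitly in scope, instead of relying
// on broad targeting. SKU prefixes and the page object are hypothetical.
const TEST_SCOPE = ["LUG-"]; // only luggage SKUs should see the variation

function shouldActivate(page) {
  // Guard 1: only run on product pages at all.
  if (page.type !== "product") return false;
  // Guard 2: only run on SKUs explicitly enrolled in this experiment.
  return TEST_SCOPE.some((prefix) => page.sku.startsWith(prefix));
}

console.log(shouldActivate({ type: "product", sku: "LUG-7499" })); // true
console.log(shouldActivate({ type: "product", sku: "TRIP-CR" })); // false – the trip stays untouched
```

A guard like this fails closed: if a page is not provably in scope, the variation never fires, so a pricing change can’t leak onto trips, snowboards, or anything else.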

The go-fast-and-break-things approach had… broken things. While we had massive wins, we now also had a sobering misstep. The experiment was only live for about 25 minutes, but it required damage control with several customers. Luckily, they were understanding, and with some store credit exchanged, everything was smoothed over fairly quickly.

That mistake, however, significantly shaped my professional approach and our testing methodology at Roboboogie. The experience has proved invaluable time and time again. For it is out of failure that our best growth and maturity come.

My attention to detail has never been the same since, and that rigor is now baked into our testing process. We take a fully immersive, multi-disciplinary approach to each step of testing: thoughtful strategy, smart UX, tight UI, pixel-perfect development, methodical data engineering, and double-and-triple-check QA before launch. Each team member keeps an eye on the bigger-picture goal: launching smart tests, free from errors, with the right balance of speed, strategy, and attention to detail.

For us, our only “failed” tests are the ones that are launched broken.

We embrace the mantra of “go-fast-and-BUILD-things” now. We believe our clients and end-users deserve better than broken tests.

Not every first test we launch results in an immediate net-positive CRO impact (it’s experimentation, after all). But we work to ensure that every test we launch is a winner – either driving revenue/leads, elevating the brand experience, unearthing user insights, or unlocking new user segments.

And we’re incredibly proud of our ability to do so.

Would we be where we are today without that mistake? I’m not sure. But I do believe it catapulted us forward. Looking back at that mistake 10 years ago – almost accidentally selling a $3,500 trip to Costa Rica for $75 – I dare say the mistake may have been worth it.

That failure resulted in a decade of better tests – for dozens of clients and millions of tested visitors since.

  • Jedidiah Fugle, COO @ Roboboogie