The Future of AI in Customer Experience Optimization

Roboboogie’s Chief Experience Officer John Gentle has spent most of his adult life in web consultancy and iterative, data-backed design. With that kind of experience, one tends to get really excited about topics that others don’t think about as much, and with that excitement comes passion and knowledge. We decided to kick off a new series, John Talks, to capture his years of experience and insights on the various subjects we handle on a daily basis.

Hanging out in the serene vibes of Martha’s, our local coffee shop located 23 feet from the Roboboogie headquarters, John shared his thoughts on artificial intelligence, how we use it, and the roles it will play in the future of CX, UX, and UI.

What do you find so exciting about AI?

Beyond the fact that we’re literally building robots who can learn, AI helps us deliver a more complete online experience. We build machines that can instantly identify anything from customer pain-points, to bounce locations, down to the color of a button on a screen to make selection and purchase easier.

Interesting. So where do you see AI fitting into the future of CX?

Well, it helps streamline so much of the development process, especially iterative design. The more sophisticated these systems get at understanding the relational components, the more you can automate and take the pressure off of people.

What sort of benefits do businesses gain by leveraging AI?

Oh, tons. With AI, we have more success with CX journey mapping, which is an important step to understanding someone’s experience from end-to-end. After that, we can develop multiple touch points, create a more meaningful relationship between the customer and the brand, and open up more opportunities for revenue.

And that’s part of the whole thing, right? Creating that customer-brand relationship?

For sure. In terms of AI, we can automate how we manage relationships, especially through things like targeted messages and eliminating barriers to engagement. In the world of eCommerce, we can integrate it into services. That helps better predict the customer’s needs, create better post-sale relationships, and develop a more meaningful connection to the brand. And that’s just eCommerce. We could potentially apply these same sorts of principles to something like healthcare, and maybe start seeing more people getting the care they need before something happens.

What potential pitfalls could you see regarding this form of AI?

Any kind of software or digital experience has to make life better or easier or people won’t adopt it. A lot of AI and retargeting is happening in the background, so people are right to be suspicious. When it’s done right and brands are being transparent, it makes life easier. When brands get too “creepy” with it, that’s when people drop off.

For example, think of going into a store or restaurant where they’re familiar with you. They know your order or what you like to wear. You’ll like that place more. However, if they suddenly popped up offering you things outside of that store, you might not like them as much.

If you get a similar experience on a brand’s webpage and they reach out in a more meaningful way, it makes decision making easier. It’s more effective and creates a better connection between the brand and the customer. However, if you’re not respecting the customer, you can alienate them and they’ll reject you. People don’t have a relationship with the AI; they do with the brand. That’s where we can step in and help.

Anything else you’d like to share?

Timing and context are everything. Be empathetic to customer needs and to where customers are at that place and time, and the rest will fall into place.

Check out more interviews with the Roboboogie team. We’ll see you again next month for the next installment of John Talks!

Safari ITP 2.1: How it Impacts your Testing and Personalization Program

(NOTE: This post will be updated as new information and solutions are available from Optimizely, Google, VWO, Convert and others).

Update: June 4, 2019

On May 13th, Apple released ITP 2.2 with its newest Safari browser updates. This update further inhibits cross-site tracking by capping client-side cookies set with document.cookie at 1 day when those cookies are determined to be set via Link Decoration.

What is Link Decoration?

Link Decoration happens when a domain classified as having cross-site tracking capabilities (think Google, Facebook, ad networks) navigates the user to the current page and appends data to the destination URL as query strings (data after the ? in a URL) and/or fragment identifiers (data after the # in a URL). Upon landing, a script from that same tracking domain reads the data in the decorated link and sets a cookie using document.cookie, which allows the tracker to follow users from site to site.
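To make the mechanics concrete, here is a minimal TypeScript sketch of the pattern being described, as it might run in a tracker’s on-page script. The click_id parameter and tracker_id cookie names are illustrative placeholders, not any real vendor’s:

```typescript
// Minimal sketch of the link-decoration pattern ITP 2.2 targets.
// A tracker's script on the landing page reads an ID appended to the
// URL by the referring site and persists it with document.cookie.
// "click_id" and "tracker_id" are illustrative names, not a real tracker's.

function persistDecoratedId(): void {
  // e.g. https://shop.example/?click_id=abc123
  //  or  https://shop.example/#click_id=abc123
  const query = new URLSearchParams(window.location.search);
  const fragment = new URLSearchParams(window.location.hash.slice(1));
  const clickId = query.get("click_id") ?? fragment.get("click_id");
  if (clickId === null) return;

  // A client-side (document.cookie) cookie like this one is what ITP 2.2
  // caps at 1 day when the referrer is a classified tracking domain.
  const oneYear = 60 * 60 * 24 * 365;
  document.cookie = `tracker_id=${encodeURIComponent(clickId)}; max-age=${oneYear}; path=/`;
}

persistDecoratedId();
```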

ITP 2.2 will reduce third parties’ ability to track users across sites when those users take more than 24 hours to complete an action. An example: a user coming from a social network clicks through to an online shop. If the shop uses services from the social network that sent them to the page, then that network can most likely set a cookie with information from the URL to pair the user with an ID in its database. If the user saves an item to the cart but comes back to buy it after 24 hours, the social network’s cookie will have expired and the user’s action cannot be paired with their social ID.

Update from Convert:

Sorry for the delay on this one. Convert was one of the first companies we reached out to when we first heard the rumors of ITP 2.1’s release. Back on April 17th, Convert updated how they handle ITP 2.1 and future-proofed against 2.2 by setting cookies on the server side. “Since the new cookie duration restrictions apply only to browser-created cookies, we’re moving the cookie issuance part to [the client’s] web server, which means [the client’s] server will create the cookies and not the users’ browsers.” They have a great article detailing their approach here and code snippets for many backend development languages here.
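As a rough illustration of the general technique Convert describes (not their actual implementation), here is a minimal Node.js sketch in which the first-party web server, rather than the browser, issues the visitor cookie. The visitor_id name and port are our own placeholders:

```typescript
// Sketch of the general server-side approach: the first-party server
// issues the cookie via a Set-Cookie header, so it is an HTTP cookie
// rather than a browser-created (document.cookie) one, and is therefore
// not subject to ITP 2.1's 7-day cap. Names here are illustrative.
import { createServer } from "node:http";
import { randomBytes } from "node:crypto";

createServer((req, res) => {
  // Reuse the visitor ID if the browser already sent one back.
  const existing = /(?:^|;\s*)visitor_id=([^;]+)/.exec(req.headers.cookie ?? "");
  const visitorId = existing?.[1] ?? randomBytes(16).toString("hex");

  // Six-month lifetime; HttpOnly is deliberately omitted so an on-page
  // testing snippet can still read the identifier if it needs to.
  const sixMonths = 60 * 60 * 24 * 180;
  res.setHeader(
    "Set-Cookie",
    `visitor_id=${visitorId}; Max-Age=${sixMonths}; Path=/; Secure; SameSite=Lax`
  );
  res.end("ok");
}).listen(3000);
```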

Update from Optimizely:

Optimizely suggests using either in-product mitigation or external mitigation. In-product mitigation would consist of ignoring users who visit an experience in a browser that does not support Optimizely’s cookie-setting technique, or excluding them entirely from all experiments. If an experiment is targeted at mobile, this is a non-starter for us. External mitigation requires the user to “manage the cookie creation process at another point in [their] stack.” They do not give clear detail on how to accomplish this but ask that any client wanting to do so contact their support team. You can find this article here.

Note: The California Consumer Privacy Act (CCPA) will take effect in January of 2020, less than a year away. We have been calling it California’s version of GDPR, and it will limit how user data is consumed and stored. CCPA will require companies to allow users to delete and download any personal data being collected, as well as to opt out of data collection entirely and upfront.

Update: April 1, 2019

As of March 25th, iOS 12.2 and Safari 12.1 (on macOS High Sierra and Mojave) are now live.

Considering the impact ITP 2.1 will have on analytics and the continuity of experiences in testing, we are anticipating announcements from each testing platform soon. Adobe has released the first statement, in which it outlines a solution using HTTP cookies set by a shared web service. If we assume this to be the trend, and it seems to be in discussion within analytics circles, we can expect the solution to go like this:

  1. A company looking to test will need to set up a CNAME record for a subdomain. As a subdomain, it technically counts as same-site, referenced under the same eTLD+1 (e.g. tracking.yourdomain.com).
  2. When this CNAME record is called by yourdomain.com, DNS resolves it to a vendor host such as “tracking-fix.analytics.google.com”, which responds with an HTTP cookie (e.g. _ga). Because the HTTP cookie is handed back to the subdomain, which is considered same-site since it comes from a URL on the same domain, it will not be deleted in 7 days (see the sketch below).
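To make step 2 concrete, here is a hypothetical sketch of the page-side call, assuming the tracking.yourdomain.com CNAME from step 1 and an illustrative /collect endpoint; the vendor’s actual endpoints and cookie names will differ:

```typescript
// Sketch of the client side of the CNAME flow above. The page calls its
// own subdomain (the CNAME from step 1); DNS resolves it to the vendor,
// whose response sets an HTTP cookie. The "/collect" path is illustrative.
//
// DNS (configured once by the site owner, step 1):
//   tracking.yourdomain.com.  CNAME  tracking-fix.analytics.google.com.

async function syncTrackingCookie(): Promise<void> {
  // Because tracking.yourdomain.com shares its eTLD+1 with yourdomain.com,
  // the Set-Cookie header in this response is a first-party HTTP cookie
  // and is not subject to ITP 2.1's 7-day cap on document.cookie.
  await fetch("https://tracking.yourdomain.com/collect", {
    credentials: "include", // send and accept cookies for the subdomain
  });
  // The vendor's response headers look roughly like:
  //   Set-Cookie: _ga=GA1.2.12345; Domain=.yourdomain.com; Max-Age=63072000; Secure
}

syncTrackingCookie();
```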

This puts the majority of the work on the web service, and only a little bit more on each client looking to perform testing.

In a discussion on Twitter between Simo Ahava and John Wilander, Simo, speaking about using CNAMEs to set first-party HTTP cookies for analytics, asked, “Do you think this is future-proofed (as much as anything can be)?”

John Wilander answered: “The only ‘future proofing’ to be had is to stop cross-site tracking. If trackers are allowed to repurpose scripts in the first-party context to facilitate cross-site tracking, we have no choice but to prevent it.”

Another notable viewpoint worth reading through is a recent proposal from Mike West, who works on the Chrome security team. In this tweet, West says we should deprecate JavaScript cookies altogether. He points out that over 84% of cookies are not using the built-in security features released over two decades ago, and that we cannot trust that any future security enhancements will be taken seriously. Notably, it was retweeted by John Wilander, the Apple WebKit engineer behind Safari’s Intelligent Tracking Prevention.


First: What is happening?

In the upcoming release of Safari, client-side cookies will only last 7 days. This includes iOS. Firefox is expected to follow.

Second: Why should you care?

This will break test integrity. A user who comes back after a week will not necessarily see the same experience. Unless the test is concluded within 7 days, it will be difficult to maintain a consistent experience or any reliable user-level metrics. Personalization? BAH.

But let’s not freak out: it is currently a small audience that we are able to exclude from testing, and there are already some potential solutions. There has yet to be an official response from Optimizely, Convert, VWO, or even Google about how they plan to handle this. Hopefully they will have a solution on their end. In the meantime, let’s talk about what’s happening and what we can do.

(Update #1, Tuesday March 26th @ 4:55pm): There has been an update from Adobe (read here). Although they only mention their testing platform (Adobe Target) briefly, they do identify their proposed solution for user-level identification… CNAME! We have also been in direct contact with one of our testing technology partners, who mentioned they were headed in this direction as well. We will be going further into the benefits of this solution soon!

Background:

With the introduction of iOS 12.2 and Safari 12.1, expected to roll out at the end of March, both browsers will cap all client-side cookies at 7 days. This particular feature is called Intelligent Tracking Prevention (ITP) 2.1 (source). Firefox has begun testing similar behavior for a near-future release (source).

The motivation behind this is user privacy, and Safari has been moving this way for some time. Last year, Safari limited third-party cookies from persisting in an effort to limit the tracking of users across sites. This in turn caused analytics tools, advertisers, and anyone looking to leverage a user’s journey to start using first-party cookies to track them. Currently, third-party services are able to set first-party cookies to track users, and ITP 2.1 is Safari’s response.

Client-side cookies are where these tools store user-level identifiers. They are what make a user a User, allowing you to say, “Most users visit 3–5 times in six months before converting.” Most have expiration dates well beyond 7 days; for example, Google Analytics uses a 2-year cookie (source) and Optimizely uses (by default) a 6-month cookie (source). This ensures that a user’s information is always continuous.
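As a rough sketch of what such an identifier cookie looks like (with an illustrative uid name rather than any vendor’s real cookie):

```typescript
// Sketch of the kind of client-side identifier cookie these tools rely on.
// "uid" is an illustrative name; GA's real cookie, for instance, is "_ga".

function getOrCreateUserId(): string {
  const match = /(?:^|;\s*)uid=([^;]+)/.exec(document.cookie);
  if (match) return match[1];

  // Illustrative, not collision-safe, ID generation for the sketch.
  const id = Math.random().toString(36).slice(2);

  // Requested lifetime: 2 years, mirroring Google Analytics' default.
  // Under ITP 2.1, Safari caps this document.cookie write at 7 days,
  // so the ID silently resets for returning Safari users.
  const twoYears = 60 * 60 * 24 * 730;
  document.cookie = `uid=${id}; max-age=${twoYears}; path=/`;
  return id;
}

const uid = getOrCreateUserId();
```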

Continuity is integral for testing, advertising, and analytics. For A/B testing specifically, this is a massive dilemma. Most testing platforms (Optimizely, VWO, Convert) mainly use JavaScript injection to load experiences, which means their only way to create cookies is through the client side. Without continuous user IDs, users will be given new IDs each time they come back to a site after 7 days. This ultimately means users will be bucketed into different variations if a test needs to run longer than 7 days to reach statistical significance (which is common even if a site gets hundreds of thousands of visitors a month). Not only will this be disruptive to users, who might be bumped back and forth between variations, but KPI attribution to specific variations will be impossible in these situations.
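To see why unstable IDs break bucketing, consider this simplified sketch of deterministic, hash-based assignment; real platforms use their own hashing schemes, but the principle is the same:

```typescript
// Sketch of why an unstable ID breaks bucketing: platforms typically hash
// the user ID to pick a variation deterministically, so a regenerated ID
// after 7 days can land the same person in a different variation.

function assignVariation(userId: string, variations: string[]): string {
  // Simple illustrative string hash; real platforms use their own schemes.
  let hash = 0;
  for (const ch of userId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  }
  return variations[hash % variations.length];
}

// Same ID hashes to the same bucket on every visit...
console.log(assignVariation("uid-abc123", ["control", "variant"]));
// ...but a regenerated ID may hash to the other bucket, disrupting the
// user's experience and polluting KPI attribution alike.
console.log(assignVariation("uid-def456", ["control", "variant"]));
```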

Who will this affect?

Safari is the second most-used browser in the world, largely because of mobile, which is increasing in market share. Firefox is third (source). But how many users will update, and how fast? This is something to be monitored, and it will certainly be a growing audience.

What are the potential solutions?

For now, it may be simple enough to just ignore this audience, which can be done in analysis. It is expected to be a small audience to start, and we will be monitoring it closely. Ultimately, we still don’t want those users to see a bad experience. Once a test is launched, we will confirm the User Agent and release details on the specific audience. We will also update this post with trends.

Other long-term, more holistic potential solutions are persisting user-level identifiers via localStorage or setting the cookie from the server side as an HTTP cookie.
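Here is a minimal sketch of the localStorage option, with illustrative key and cookie names; a real implementation would need to match whatever identifier the testing platform actually reads:

```typescript
// Sketch of the localStorage fallback: persist the user-level ID in
// localStorage (which ITP 2.1 does not cap at 7 days) and re-seed the
// cookie from it on each visit. "uid" is an illustrative name.

function restoreUserId(): string {
  let id = localStorage.getItem("uid");
  if (id === null) {
    // Illustrative, not collision-safe, ID generation for the sketch.
    id = Math.random().toString(36).slice(2);
    localStorage.setItem("uid", id);
  }
  // Rewrite the cookie so existing tools that read it keep working;
  // even if Safari expires it after 7 days, localStorage restores it.
  const sixMonths = 60 * 60 * 24 * 180;
  document.cookie = `uid=${id}; max-age=${sixMonths}; path=/`;
  return id;
}

restoreUserId();
```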

With no news yet from the major A/B testing platforms, rather than investing the upfront costs of working around this change, simply excluding the audience may be best for the time being. (However, we are currently investigating both options mentioned above and will post details as we identify them.)