A/B Testing FAQs

Can I run multiple tests at once?

Technically, there is no limit to the number of parallel tests you can run on Intelligems. Strategically, however, you risk creating interference between your tests if they target similar parts of the experience at the same time.

  • For example, maybe you are testing pricing on a particular product while also testing the layout of the PDP. You may see these as two separate tests, but in reality customers are being exposed to both. So you end up with a combination of experiences that you don't have full visibility into, since visitors are being randomly assigned within each test.

To control for interference, you can:

  1. Set this up as one large test - make a group for each combination of variables so that you can properly measure and compare (a multivariate test). You can have multiple products in the same test, which would keep users in the same group across all products being tested.

  2. Run tests sequentially, one after another

EXCEPTIONS: You can run tests in parallel if there is no meaningful overlap in their impact on the customer experience.

  • For example, if you wanted to test a PDP layout for one product, and separately have a test running on a landing page for a completely different product and funnel - that should have minimal interference, as the customers wonโ€™t likely experience both.

  • Or if you ran a pricing test on womenโ€™s underwear at the same time you were testing menโ€™s overcoats - those are likely to be pretty distinct sets of customers who wonโ€™t be exposed to both tests.

Note: For technical reasons, we only allow one shipping test at a time.

How does Intelligems split the traffic?

All of our tests are run as true split tests, where we segment your traffic in real time between test variants. Intelligems randomly assigns users to a test group when they visit your site for the first time during a test. We then use a cookie to ensure they are placed in the same test group every time they come back to the site on the same browser and device, so they have a consistent experience.
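
To illustrate the mechanics, here is a simplified sketch of cookie-based traffic splitting. This is not Intelligems' actual code - the cookie name, group names, and 50/50 split below are all made up.

// Simplified sketch of cookie-based traffic splitting - not Intelligems' actual code.
// The cookie name "ab_group", the group names, and the 50/50 split are hypothetical.
function getTestGroup() {
  var match = document.cookie.match(/(?:^|;\s*)ab_group=([^;]+)/);
  if (match) {
    return match[1]; // repeat visit on this browser and device: keep the same group
  }
  // First visit during the test: assign a group at random and remember it
  var group = Math.random() < 0.5 ? 'control' : 'variant';
  document.cookie = 'ab_group=' + group + '; path=/; max-age=' + 60 * 60 * 24 * 90;
  return group;
}

var assignedGroup = getTestGroup(); // used to decide which experience to render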

How does Intelligems handle customers who switch devices?

Intelligems randomly assigns users to a test group when they visit your site for the first time during a test. We then use a cookie to ensure they remain in the same test group every time they come back to the site on the same browser and device.

When a user visits your site on a new device, we cannot rely on that cookie to place them in the same group, so there is a chance they will end up in a different group. That said, in our testing, we have found that less than 1% of typical site traffic may be impacted by switching devices during a test.

We recommend making your customer support team aware of any tests you may be running, and in the rare case that a customer does switch devices and notices a differing price, having a plan of action (such as a discount code) to create a positive customer experience.

How long does it take to run a test? / When should I end my test?

How long to run a test depends on three major factors:

  1. How big is the effect of this change? The smaller the effect, the harder it is to detect and be certain that the result is not just noise.

  2. How many people visit your website every day (and, more specifically, how many orders are placed each day)?

  3. How confident do you want to be? The more confident you want to be, the more data you need.

Our rule of thumb is that if you want to detect a 10% change in conversion with 90% confidence, you need about 300 orders per group.

  • Smaller changes will likely require more data to pick out the signal from the noise

  • Larger changes will likely require less data to see a significant pattern
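
As a rough way to see how that scales, the sketch below extrapolates from the rule of thumb above using the standard statistical relationship that required sample size grows with the inverse square of the effect size. Treat it as a back-of-the-envelope estimate, not the exact math behind the Intelligems dashboard.

// Back-of-the-envelope sketch, not the exact math behind the Intelligems dashboard.
// It starts from the rule of thumb above (~300 orders per group to detect a 10%
// change at ~90% confidence) and scales by the inverse square of the effect size.
function roughOrdersPerGroup(relativeChange) {
  var baselineOrders = 300;  // from the rule of thumb above
  var baselineEffect = 0.10; // a 10% change in conversion
  return Math.round(baselineOrders * Math.pow(baselineEffect / relativeChange, 2));
}

roughOrdersPerGroup(0.10); // ~300 orders per group
roughOrdersPerGroup(0.05); // ~1,200 - smaller changes need much more data
roughOrdersPerGroup(0.20); // ~75 - larger changes need less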

Here are the steps we recommend you take when your test goes live to monitor results and determine when to end the test:

  1. Always check in on your test after ~4 hours to make sure your data is flowing - this can be a helpful time to catch any errors in the test configuration (don't launch a test end of day on a Friday, when you won't be able to monitor it)

  2. Try not to end any tests before a full week - it helps to get an even set of weekdays in the mix for your analysis, and early results often change

  3. Take a look at the "Trend" tab in the Intelligems analytics dashboard. Have those charts shown consistent results, or are they still varying from day to day? If there is still a lot of variability, it may make sense to run the test for a few more days.

  4. Once a week has passed, you can check the "statistical significance" tab to get a read on the "probability to be best" for each group. Check out our article on statistical significance to understand what you are looking for here! (A sketch of how such a probability can be estimated follows this list.)

  5. Have a risk tolerance in mind before you start interpreting results.

    • You can wait for everything to hit 95% or 99% confidence, but for smaller brands that can mean a long wait for results. That wait carries a big opportunity cost, since you are not running other tests - which could be delivering more value! - during that time

    • We wouldn't recommend making decisions with less than 75% confidence

    • Many of our customers look for confidence somewhere around 90%

  6. Some tests will never reach confidence - they'll hover in the 40-60% range - which means there is no real difference between the versions. At that point, pick one to move forward with based on intuition or other strategic considerations, and move on
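
For intuition on what "probability to be best" represents, here is one common way such a number can be estimated: a Monte Carlo comparison of the groups' conversion rates. This is a conceptual sketch only, not Intelligems' actual implementation, and the visitor and order counts are made up.

// Conceptual sketch of "probability to be best" - not Intelligems' actual math.
// The visitor and order counts below are made up. Each group's true conversion
// rate is treated as uncertain (a normal approximation around the observed rate);
// we simulate many plausible outcomes and count how often each group wins.
function sampleConversionRate(orders, visitors) {
  var p = orders / visitors;
  var sd = Math.sqrt(p * (1 - p) / visitors);
  // Box-Muller transform for a standard normal draw
  var z = Math.sqrt(-2 * Math.log(1 - Math.random())) * Math.cos(2 * Math.PI * Math.random());
  return p + sd * z;
}

var groups = [
  { name: 'Control', orders: 310, visitors: 10000 },
  { name: 'Variant A', orders: 345, visitors: 10000 },
];

var wins = { 'Control': 0, 'Variant A': 0 };
var simulations = 100000;
for (var i = 0; i < simulations; i++) {
  var draws = groups.map(function (g) { return sampleConversionRate(g.orders, g.visitors); });
  var bestIndex = draws.indexOf(Math.max.apply(null, draws));
  wins[groups[bestIndex].name]++;
}

groups.forEach(function (g) {
  console.log(g.name + ': probability to be best ~ ' + (100 * wins[g.name] / simulations).toFixed(1) + '%');
});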

Pro tip: you're able to change traffic allocation in the middle of a test. So if you have one group that is a clear loser, you can "shut it off" in the middle of your test and send more traffic to the more viable options.

How does the test status work in the test dashboard?

There are several different statuses that a test can have:

  1. Pending: You've set this test up, but have not yet started it.

  2. Gathering Data: You've started this test within the last 21 days.

  3. Ready For Analysis: This test has been running for at least 21 days. While this is a good baseline, there is ultimately no general rule on time. That said, we often start seeing significant results after about 300 orders per group.

  4. Paused: You've paused this test; it is not active on the site, but you can resume it.

  5. Ended: You've ended this test.

I need to edit something in my live test - how do I do that?

There are a few steps to safely edit your live test:

Step 1: Pause your test.

The pause button can be found next to the status column within your A/B Tests overview tab.

If you are pausing a shipping test, we will automatically restore your shipping profiles while the test is paused.

If you are pausing a price test, we will ask you to select which prices you would like to roll out while the test is paused.

Step 2: Make your edits.

The edit button can be found under the three dots to the right of your test. Select this and make any changes you need, such as adding a new product, removing a test group (you can do this by allocating 0% of traffic to that group), changing a price, or changing a shipping rate. Make sure you click through the entire edit flow and select the save button at the end to save your changes!

Step 3: Resume your test.

Now that you've made your edits, you can resume your test and keep getting results! The button to do so is right next to the test status.

Keep your edits in mind when reviewing the analytics dashboard, since the change may have affected your results. Once a test has started, test group names cannot be changed and you can no longer add or remove test groups, but you can set traffic to 0% for a group to effectively remove it from the test. We do this so that analytics are preserved for the group even after traffic to it has stopped. You can filter your results by date to see the results before and/or after your edits were made.

How do I target or exclude new vs. returning customers?

Targeting new vs. returning customers is a bit tricky because Shopify does not share whether a customer is new or returning with Intelligems. There are, however, a few ways you can set up audience targeting to help target or exclude new vs. returning visitors.

The first option is to use UTM parameters. In order to target only new visitors, we typically recommend applying the test to users visiting the site through specific UTM or media campaigns that you use to target new customers. There is a small chance that returning customers could come in through these campaigns.

The second option is to use our New / Returning Visitor targeting. It is important to note that this looks at whether someone is a new or returning visitor, not necessarily whether they are a new or returning customer. It is also important to note that we determine whether a visitor is new or returning based on whether they have an Intelligems ID assigned to them. We assign every site visitor an Intelligems ID when they come to your site, as long as our JavaScript is installed on your live site. Because of this, if you have just started working with Intelligems, most visitors will be seen as new since they don't have an Intelligems ID assigned to them yet.
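
To make the visitor-level distinction concrete, here is a simplified illustration. It is not Intelligems' actual code, and the storage key is hypothetical.

// Simplified illustration only - not Intelligems' actual implementation.
// The storage key "ig_visitor_id" is hypothetical. The point is that "returning"
// means "this browser already has an ID assigned", which is why most visitors
// look new right after the Intelligems JavaScript is first installed.
function classifyVisitor() {
  var existingId = window.localStorage.getItem('ig_visitor_id');
  if (existingId) {
    return 'returning'; // this browser has been seen since the script was installed
  }
  window.localStorage.setItem('ig_visitor_id', 'visitor-' + Date.now());
  return 'new'; // first visit we can observe, so treated as a new visitor
}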

Both of these options can be set up in the Targeting step of test set up, which you can read more about here.

Can I remove the Intelligems script once my test is complete?

While there are technically no issues with removing the Intelligems script from your site once your test is complete, we only recommend doing so in the rare case that the site is experiencing performance issues. Otherwise, we recommend leaving the lightweight script installed in your theme, as testing shouldn't be one-and-done. We find that stores that approach testing with a roadmap and a plan are the ones that unlock the most value. If you are stuck on what to test next, check out this article!

If you do need to remove the Intelligems script from your site, delete or comment it out in your theme.liquid file. For Shopify Plus customers, this may also be located in your checkout.liquid file. The JavaScript to remove or comment out will look like this:

<script src="https://cdn.intelligems.io/<your_customer_id>.js"></script>
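
For example, one way to comment the snippet out (rather than deleting it) is to wrap it in Liquid comment tags in your theme.liquid file:

{% comment %}
<script src="https://cdn.intelligems.io/<your_customer_id>.js"></script>
{% endcomment %}
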
Can I preview a Shopify theme and an Intelligems test at the same time?

Yes, you can! This is a very common practice we use here at Intelligems. You can do this by entering preview mode for your theme, and then entering preview mode for your test. The theme should be cached when you open your test preview.

What will my orders look like in Shopify?

Customers on your site will not see anything to indicate that an Intelligems price test is running, but there are three ways that Intelligems will show up on orders within your Shopify admin portal:

  1. Each visitor gets tagged with a unique ID that sorts them into a group. This ID is passed to the order via cart.attribute for tracking purposes and will show under "Additional Details" on the order page. This is hidden from the customer.

  2. In cases where we use a checkout script, we add a line_item.property to each item so that the checkout script provides the appropriate price.

  3. The checkout script applies a "line-item discount" in order to generate the correct price for a customer's product. While this shows as a discount in the admin view, it is always hidden from the customer (i.e. in this case they would just see "$13.99" for the product price).

Points 2 and 3 above only apply if a price test is implemented using checkout scripts, which are used for Shopify Plus customers only.
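
For reference, this is roughly how a cart attribute can be attached through Shopify's Ajax Cart API. The attribute name and value below are hypothetical and not necessarily what Intelligems uses.

// Rough illustration of attaching a cart attribute via Shopify's Ajax Cart API.
// The attribute name "_ig_group_id" and its value are hypothetical.
fetch('/cart/update.js', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ attributes: { _ig_group_id: 'group-a-1234' } }),
}).then(function (response) {
  return response.json(); // the attribute then shows under "Additional Details" on the order
});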

Does Intelligems integrate with GA4?

Yes! When you create a test, you can enable our GA4 integration. Enabling this will create Audiences in Google Analytics automatically for each experiment test group so you can easily filter for or compare one group to another. We use custom dimensions on the user to send experiment IDs and test group IDs and create the audiences based on those dimensions.
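
As a rough illustration of the mechanism, user-scoped custom dimensions can be populated with gtag.js along these lines. The property names here are hypothetical, and the integration sets them for you, so there is nothing you need to add manually.

// Rough illustration of user-scoped custom dimensions with gtag.js (assumes the
// GA4 gtag snippet is already on the page). The property names are hypothetical;
// the Intelligems integration sets these automatically.
gtag('set', 'user_properties', {
  ig_experiment_id: 'exp_123',
  ig_test_group_id: 'group_a',
});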

Can I filter out blog traffic from my test?

Yes! Some brands use their blog pages for SEO. This generates a lot of traffic for their site, but does not result in many conversions or revenue. Because of this, blog traffic can create a lot of noise in analytics, so it is often better to filter it out of your tests to keep your results clean and actionable. If you do not have any UTMs set up for your blog pages, this article will walk you through how you can exclude that traffic from your test.

Step 1: Add snippet to theme.liquid file.

Add the below code to your theme.liquid file directly above your Intelligems JavaScript snippet. An example Intelligems JavaScript snippet is included at the bottom of the snippet below to illustrate where the new snippet should be placed.

<script>
  // If the visitor is on a blog page, add a "blog=1" query parameter to the URL
  // before the Intelligems snippet loads, so Audience Targeting can exclude this traffic.
  if (window.history.pushState && window.location.pathname.includes('blog')) {
    const newURL = new URL(window.location.href);
    newURL.search = '?blog=1';
    window.history.pushState({ path: newURL.href }, '', newURL.href);
  }
</script>
<script src="https://cdn.intelligems.io/<your_customer_id>.js"></script>

Step 2: Set up UTM filter.

In the Targeting step of your test, set up Audience Targeting to exclude visitors whose URL contains the blog=1 query parameter added by the snippet above. Note that this will need to be added to each test that you want to exclude blog traffic from, and it will only exclude that traffic moving forward.

Can I schedule tests to start, pause or end at a specific date and time?

Yes! Intelligems allows you to schedule start times for all types of tests, including Price, Shipping, Content and Campaigns. Intelligems also allows you to schedule pause and end times for Content and Campaigns tests, but does not currently support scheduling pause or end times for Price or Shipping tests due to rollout requirements for ending those types of tests.

To schedule a start, pause or end time, follow these steps:

  1. Click on the More Options menu (the three dots) next to the test you'd like to set up a schedule for.

  2. Click "Schedule Test". If you do not see this option, please reach out to our support team here to request this feature be turned on for your account!

  3. If you are setting up a schedule for a test that has not yet been started, this will open a modal that will ask you to confirm that you have completed the integration & QA'd your test. If you have not already done so, please be sure to complete both of these items before proceeding.

  4. Once you have confirmed those items are complete, you will be taken to a modal where you can set up a scheduled start, pause, and/or end time.

A few things to keep in mind as you set these up:

  1. You must schedule items for times in the future only.

  2. Tests can only be scheduled in 5-minute increments.

  3. The times will be based on your device's current time zone, which will also be listed at the top of the modal.

  4. You can only schedule pause and end times for Content and Campaign tests that are either live, or that have a scheduled start time.

  5. Once you have scheduled a test to start, pause, or end at a specific time, you will see a clock symbol next to the status; hover over it to get more information on the scheduled times.
