Web Analytics World

Analytics, Mobile, Social Media and Digital Marketing Strategy

Archives: A/B Testing

Using Analytics to Grow Podcast Audiences

October 14, 2014 by April Wilson 1 Comment

With iOS 8, Podcasts is now a standard app on Apple's operating system. While a handful of articles have showcased the benefits of podcasting as part of a larger digital marketing strategy, there hasn't been much focus on how to use analytics and data to grow an audience. Many marketers still don't know how podcasts can extend their reach and grow their brand.

That being said, podcasters enthusiastically rave about their results and their audiences. Naresh Vissa, author of Podcastnomics: The Book of Podcasting… to Make You Millions, believes marketers are on the edge of the inflection point:

“By the end of the year, there will be over half a billion people with the Podcasts app on their iOS device. This number will grow to be close to one billion people by the end of 2015, and that is just on the iOS side. Podcasting is now entering into its true golden age of mass adoption.”

So how do you measure success?

[Read more…]

Good Thinking on Attribution Models

July 1, 2014 by Karen Bellin 4 Comments

Three hands stacked together in a classroom indicating agreement

Everyone wins when data models are used to understand how traffic sources work together to lift conversion rates.

The Twitter and LinkedIn algorithms both thought it would be a good idea for me to read Gary Angel’s latest post the other day. I’m glad they recommended it since it digs into some really interesting considerations around attribution models.

Giving Credit Where Credit is Due.

We are always attributing some outcome to some action in our analyses. At the most basic level, when we report how many visits came from each traffic source, we are attributing visits to a traffic source.

Attribution models help us understand how marketing channels work together to produce an engaged audience and impact business outcomes. For instance, I would not have visited Gary’s blog if I didn’t get a notification from my social media services that he had posted something new. [Read more…]
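As a concrete illustration (not from Gary's post), here is a minimal sketch of how two common attribution models split one conversion's worth of credit across an ordered path of touchpoints; the channel names are invented for the example:

```javascript
// Split one conversion's worth of credit across an ordered list of
// touchpoints under a chosen attribution model.
function attribute(touchpoints, model) {
  const credit = {};
  const add = (channel, amount) => {
    credit[channel] = (credit[channel] || 0) + amount;
  };
  if (model === "lastTouch") {
    add(touchpoints[touchpoints.length - 1], 1); // final channel gets everything
  } else if (model === "firstTouch") {
    add(touchpoints[0], 1); // first channel gets everything
  } else if (model === "linear") {
    touchpoints.forEach((ch) => add(ch, 1 / touchpoints.length)); // even split
  }
  return credit;
}

// A visitor who saw a tweet, then an email, then converted via organic search:
const path = ["twitter", "email", "organic-search"];
console.log(attribute(path, "lastTouch")); // all credit goes to organic-search
console.log(attribute(path, "linear"));    // a third of the credit to each channel
```

Last-touch is what most out-of-the-box reports give you; the point of richer models is to surface the assist channels that last-touch hides.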

Tracking Ad Testing in Google AdWords

December 10, 2013 by Mike Nierengarten Leave a Comment

Ad testing is an integral part of improving PPC campaigns, but tracking ad tests can be challenging. At Obility we have tried a number of different methods, including manual tracking in Excel, creating scripts in AdWords, bid management platforms, and SaaS options like BoostCTR, but all of these have serious drawbacks. Manual tracking and writing/updating scripts are incredibly time-consuming, and the automated options are far too costly. The net result was that ad testing occurred, but ad test reporting was sporadic. We needed a better solution.

Our solution arose from working with a PPC-savvy, data-centric client who challenged us to get a better system in place. She is a huge fan of the Dimensions tab in AdWords and often pulls her reporting directly from it. Obility uses AdGroup labels to differentiate between campaign categories (e.g. Brand, Competitor, Display, Partner, Retargeting, Search, etc.), and our savvy client would compare performance across these groups using the Segment by Label option in AdWords (see below & click to enlarge).

AdGroup Performance by Label

Lightbulb! Why not use this same approach to measure ad testing? Obility created six ad testing labels to differentiate the types of ad tests we run with our clients:

  • Original (to identify the control ad)
  • Ad Title
  • Ad Content
  • Ad Offer
  • Ad Landing Page
  • Ad Display URL

The great thing about this process is that each label has an inherent ad testing goal. For example, Ad Title, Ad Content, and Ad Display URL are most typically tied to improving CTR. Ad Landing Page is related to conversion rate (CVR). Ad Offer is a combination of CTR & CVR and is probably best measured by cost per acquisition (CPA). However, although each ad category has a “natural” tie to ad performance, your goal may be different. For example, if you are trying to improve lead quality through ad content, CTR is not a good metric for ad performance.
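Those goal metrics are simple ratios over the columns AdWords already reports; a quick sketch (all example numbers are made up):

```javascript
// Core ad-testing metrics from raw AdWords statistics.
function adMetrics({ impressions, clicks, conversions, cost }) {
  return {
    ctr: clicks / impressions,  // click-through rate
    cvr: conversions / clicks,  // conversion rate
    cpa: cost / conversions,    // cost per acquisition
  };
}

const original = adMetrics({ impressions: 10000, clicks: 200, conversions: 10, cost: 250 });
console.log(original); // { ctr: 0.02, cvr: 0.05, cpa: 25 }
```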

After labeling ads, Obility is able to produce ad testing reports quickly. We can pull aggregate data on all of our ad tests from the Dimensions tab, or we can analyze ad tests individually by filtering by label in the Ads tab (see below & click to enlarge).

Example Ad Test

This process works well for our agency due to its simplicity. For each test, we can quickly determine ad, landing page, and offer performance by ad group. For example, in the above screenshot we are running an ad content test. Based on performance so far, we can quickly determine that the new ad content has improved CTR. Once the data is statistically significant, we can pause or delete the Original ad and run with the new ad.
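The post doesn't say how Obility checks for significance; one common approach for a CTR test is a two-proportion z-test, sketched here with invented numbers:

```javascript
// Two-proportion z-test: is the variation's CTR meaningfully different
// from the original's, or could the gap just be noise?
function ctrZScore(clicksA, impressionsA, clicksB, impressionsB) {
  const pA = clicksA / impressionsA;
  const pB = clicksB / impressionsB;
  const pooled = (clicksA + clicksB) / (impressionsA + impressionsB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / impressionsA + 1 / impressionsB));
  return (pB - pA) / se; // |z| > 1.96 is roughly 95% confidence
}

const z = ctrZScore(200, 10000, 260, 10000);
console.log(z.toFixed(2), Math.abs(z) > 1.96 ? "significant" : "keep running");
```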

**Note: remove labels before deleting ads; you cannot remove labels after an ad is deleted.

We can also readily export ad performance by label and create robust reports in Excel for our clients. The updated process is clear enough that we can present it directly to our clients without filtering through our SEM Managers for analysis. Since clarity and transparency of our processes are key to Obility’s success to date, our new ad testing process fits right in.

Take Politics Out of the Decision Making Process with Testing Part 3

May 7, 2013 by Matt Aster 2 Comments

This is part 3 of a multi-part post on website testing.
Part 1 is available here
Part 2 is available here

Alas, we reach the end of this blog series on site testing, but hopefully this is just the beginning of your journey in conversion optimization.

As promised in part 2, this last post will focus on the differentiating features, the reporting functionality, and the summary of tools.

Differentiating Features

Based on my experience, I've found a lot of A/B testing tools to be very similar, if not downright identical. What differentiates one tool from the next often comes down to price alone. My goal here is to ignore price (all three tools cost roughly the same) and focus on the fundamental differences.

Convert Experiments

I was pretty excited to see the types of Google Analytics integrations Convert Experiments offers. All three tools integrated with Google Analytics to push test information into custom variables, but only Convert Experiments allows integration with Google Analytics goals and ecommerce tracking (with the exception of Google Analytics experiments).

Optimizely

I really liked Optimizely's straightforward approach. If you're just looking to set up a test and go, Optimizely has a very quick startup. Its standout feature is the implementation: one line of code, AND you get emailed when the code is added to a site. This is incredibly helpful.

Additionally, Optimizely integrates with KissMetrics, MixPanel, and SiteCatalyst. It also integrates with click tracking tools like ClickTale.

 

Optimizely-ClickTale

 

Visual Website Optimizer

I felt Visual Website Optimizer had the best overall feature set of the tools tested. Not only is the targeting incredibly comprehensive and flexible, it has direct integration for "focus group" testing, straight geo-targeting tests, and my absolute favorite… click maps!

Visual Website Optimizer eliminates some of the need for tools like CrazyEgg by incorporating click maps directly into the testing. Additionally, if you only want click maps, you can do that sort of testing alone.

VWO-ClickMap

 

Reporting

I already demonstrated in part 2 how the different tools report within Google Analytics, but let's dive into how each tool displays its results. I won't offer much commentary on the reporting data and graphs themselves, as everyone has a different preference for reporting.

Convert Experiments

As noted above, Convert Experiments is the only one of the three tools that integrates with Google Analytics goals and ecommerce tracking (Google Analytics Experiments aside), which enables revenue reporting by variation:

Convert-RevenueReport

Optimizely

The report summary is pretty clean: it's easy to tell which variation had the highest engagement effect, and scrolling to the right shows the different goals.

 Optimizely-ReportSummary

The detailed goal reporting is also pretty clean. It’s easy to tell what’s doing better or worse.

Optimizely-ReportChart Optimizely-ReportData

 

Visual Website Optimizer

Reporting is pretty similar to Optimizely's on the summary front, but I found the interface a little more convoluted.

VWO-ReportSummary

I like the detailed reporting a bit better in Visual Website Optimizer than in the other 2 tools; it's very clean and right to the point.

VWO-ReportGraph VWO-reportdata

 

Table of Features

| Feature | Google Experiments | Convert Experiments | Visual Website Optimizer | Optimizely |
| --- | --- | --- | --- | --- |
| Google Analytics report integration | ✔+ | Custom Variable Slot | Custom Variable Slot | Custom Variable Slot |
| Segmentation | | ✔ | ✔ | ✔ |
| IP Address exclusion | ✔ (filters) | ✔ (experiment level) | ✔ (account level) | ✔ (account level) |
| Import goals from Google Analytics | ✔ | ✔ | | |
| Heat/Click maps | | | ✔ | |
| Split URL Tests | ✔ | ✔ | ✔ | ✔ (redirect) |
| A/B & Multivariate Testing | | ✔ | ✔ | ✔ |
| Flexible Goals | ✔ | ✔ | ✔ | ✔ |
| Google Analytics revenue reporting | ✔ | ✔ | | |

Closing Remarks

At the end of the day, the important thing to take away from this blog series is that you have no reason not to test. The tools out there are easy to get the hang of, and the reporting is fairly straightforward. The real challenge is deciding where to start with your tests and what your success measurements are going to be.

So, with that said, go forth and test! Or, Tweet me (@MattAster) with your email and URL and I’ll give you 2 free optimization ideas to test!

Take Politics Out of the Decision Making Process with Testing Part 2

April 30, 2013 by Matt Aster 5 Comments

This is part 2 of a multi-part post on website testing. Part 1 is available here.

Part 3 is going to include reporting features and final summaries.

In my continuing journey to open eyes to A/B and multivariate testing, I'm going to show you three tools this week. On the docket: Visual Website Optimizer, Convert Experiments, and Optimizely.

For the purposes of this post, we are going to ignore Split URL tests. In my humble opinion, if you want to run Split URL testing, you only need Google Analytics Experiments. For this comparison I am going to focus on multivariate testing.

The easiest way to do this comparison is to break this up into 3 sections.

  1. Creating test variations
  2. Setting up the test (implementation, targeting, etc.)
  3. Results and reporting (I’ll cover this in Post 3)

Creating Test Variations

All three tools are WYSIWYG (what you see is what you get) editors. Simply enter the URL of the page you want to test and the tool returns the page with every element clickable, allowing you to edit each one. You'll notice the editing options are fairly similar across the tools: each platform lets you choose "Track Clicks" to determine goal success, and each lets you edit any element.

All 3 platforms let you edit the HTML directly, perhaps to add custom JavaScript or CSS styling. Optimizely has the added advantage of letting you add custom JavaScript directly (instead of editing the HTML first).

 

Visual Website Optimizer

Convert Experiments

Optimizely

 VWO-ElementOptions Convert-ElementOptions  Optimizely-ElementOptions

 

On our homepage, I wanted to test the effects of pausing our auto-rotating banner. Our site sits on a DotNetNuke CMS and the only way to turn off the auto-rotator was to insert a line of JavaScript telling the normal rotating banner to load as PAUSED. 

Since the JavaScript had to be applied at the page level, I could not utilize the Pause button highlighted in the screenshot below.

 PD-Banner

For this, I needed to edit the JavaScript of the page. All 3 platforms have this option. In Optimizely I had to select an element and go to its parent until I hit the body, and only then could I edit the HTML. (I felt this was very messy, as it required touching the base of the site for a very simple test.)

Visual Website Optimizer and Convert Experiments let you add custom JavaScript to a specific variation without touching the overall site code. *Note: I verified with Convert Experiments that the "global" label is being renamed to avoid confusion; the JavaScript DOES apply only to the variation.

Visual Website Optimizer

Convert Experiments

Optimizely

 VWO-CustomJavascript1 VWO-CustomJavascript2  Convert-AddingJavascriptToPage  Optimizely-CustomJavascript

 

Test Setup

Implementation

I'll start with the part most people (those who aren't developers) are going to hate: implementing the code needed to actually execute the test. All 3 platforms have a similar process: add some JavaScript to the head of your website and move on.

  • Convert Experiments executes asynchronously.
  • Visual Website Optimizer offers a choice of async or standard.
  • Optimizely's code is NOT async, though they do offer an async option if you contact them. However, the synchronous option was the easiest install, at only one line.

All 3 platforms offer a nifty "check code" step after you are instructed to implement the snippet. Optimizely even sent me an email to let me know the code had been implemented. This would be incredibly helpful if you work in an organization or agency where someone else has to implement the code.
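For the curious, the asynchronous installs all boil down to the same deferred-script pattern; here is a rough, generic sketch (the snippet URL is a placeholder, and `doc` is passed in only to keep the example self-contained, not something the vendors' real snippets do):

```javascript
// Generic async snippet loader: create a <script> element and insert it
// before the first script already on the page, so the testing tool loads
// without blocking the rest of the page from rendering.
function loadTestingSnippet(doc, src) {
  const s = doc.createElement("script");
  s.src = src;
  s.async = true;
  const first = doc.getElementsByTagName("script")[0];
  first.parentNode.insertBefore(s, first);
  return s;
}
```

In a real install the vendor typically ships this wrapped in a self-executing block that you paste into `<head>`.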

Targeting

All 3 platforms offer targeting customization in one way or another; look at the table below to get a sense of how each is structured. I'll note that Optimizely DOES offer geo-targeting, but only at the Platinum level (my testing was done at the Gold level).

I found Optimizely's targeting a little lacking compared to the options available in the other 2 tools, though in theory the Custom JS condition could replicate those functions. It would require some serious JavaScript knowledge.

 

Visual Website Optimizer

Convert Experiments

Optimizely

VWO-TargetingOptions 

 

 Convert-TargetingOptions  Optimizely-TargetingOptions

 

Traffic Allocation

As I mentioned in part 1, if you have a particularly sensitive test to run, choosing what percentage of your visitors to include is a key element of stakeholder buy-in. All 3 platforms have a traffic allocator.
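Under the hood, a traffic allocator just needs to assign each visitor to a bucket deterministically; a sketch of the general technique (not any vendor's actual code):

```javascript
// Hash a visitor id to a stable number in [0, 1], then compare it against
// the fraction of traffic included in the test. The same visitor always
// lands in the same bucket, keeping the experience consistent across visits.
function allocate(visitorId, testFraction) {
  let h = 0;
  for (const ch of visitorId) h = (h * 31 + ch.charCodeAt(0)) >>> 0;
  const u = h / 0xffffffff;
  if (u >= testFraction) return "excluded"; // not in the test at all
  return u < testFraction / 2 ? "original" : "variation";
}

// With a 50% allocation, half of all visitors never see the test:
console.log(allocate("visitor-42", 0.5));
```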

Visual Website Optimizer

 VWO-TrafficAllocation1

VWO-TrafficAllocation3

 

And they have a nifty tool to help you estimate how long a test will need to run at a specific allocation. This tool will be great when you’re writing up your testing schedule.

 VWO-TrafficAllocation2
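The kind of estimate that calculator produces can be approximated with the standard two-sample size formula; a rough sketch at ~95% confidence and 80% power (VWO's exact math may differ, and every number in the example is invented):

```javascript
// Visitors needed per variation to detect a relative lift in conversion
// rate, then the days required at a given daily traffic level.
function daysToRun(baseRate, relativeLift, dailyVisitors, allocation, variations) {
  const p1 = baseRate;
  const p2 = baseRate * (1 + relativeLift);
  const z = 1.96 + 0.84; // z for alpha = 0.05 (two-sided) plus z for 80% power
  const pBar = (p1 + p2) / 2;
  const perVariation = Math.ceil((z * z * 2 * pBar * (1 - pBar)) / ((p2 - p1) ** 2));
  const dailyPerVariation = (dailyVisitors * allocation) / variations;
  return Math.ceil(perVariation / dailyPerVariation);
}

// 3% base conversion rate, hoping to detect a 20% lift, 2,000 visitors/day,
// half of traffic in the test, split across 2 variations:
console.log(daysToRun(0.03, 0.2, 2000, 0.5, 2)); // about a month
```

Small lifts on low-conversion pages need surprisingly long runs, which is exactly why an estimator like this belongs in your testing schedule.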

Convert Experiments

Nothing fancy here, just straight to the point.

 Convert-TrafficAllocation

Optimizely

Similar to Visual Website Optimizer, you’re able to specify allocation among variations.

 

 Optimizely-TrafficAllocation

Analytics Integration

All three tools have native Google Analytics integration, done via custom variables.

The main difference between the three tools in this area is that Optimizely is the only one that lists what’s in the variation. This is incredibly beneficial for future reporting when you want to look back to remember what a test was.
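For reference, pushing a test into a classic (ga.js) Google Analytics custom variable slot looks roughly like this; the slot number, test name, and variation label are all illustrative:

```javascript
// Record which variation this visitor saw in a classic (ga.js) Google
// Analytics custom variable, the mechanism these tools used at the time.
var _gaq = _gaq || [];
_gaq.push([
  "_setCustomVar",
  1,                       // slot 1-5; must not collide with your own custom vars
  "Homepage banner test",  // the custom variable name (here, the test)
  "Variation 2",           // the value (which variation was served)
  2,                       // scope: 2 = session-level
]);
```

Slot collisions are the classic gotcha: if your own tracking already writes to the slot the testing tool uses, one value overwrites the other.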

 

For example, here is what both Convert Experiments and Visual Website Optimizer record:

VWO-Convert-CustomVariable

Optimizely shows what each combination was. Granted, it’s truncated text, but at least it gives a sense.

Optimizely-CustomVariables
But that’s all the reporting I’m going to show you now! Part 3 will have the final summary and all the reporting goodness.

Stay tuned!




© 2019 Web Analytics World • Privacy • Cookies