Analyzing the Analytics



Everybody is aware of some simple facts in the analytics landscape: an increase in Page Impressions is good, a high Bounce Rate is bad. Are you sure? Shouldn't you dig a little deeper into the context behind those facts? It is too naïve to say that generating more Page Impressions than the day or the month before is a positive development in your digital channels. One important aspect of Key Performance Indicators, for example, is that they should always be set against a time interval so you can judge whether the figures are growing or shrinking.

This is a key aspect for many metrics. So, again: is an increase in Server Calls good? The simple answer is: you do not know. If you want to find out, compare. Not only with your own historical data (and then, of course, with a comparable time frame: yesterday against the same weekday a week ago, rather than yesterday against the day before). You also have to compare your data with the market trend.

And I already see some of you readers have three arguments against this comparison:

  1. We are sooo unique. No, you are not – full stop. There is competition.
  2. We do not have the market data – a valid point. Many industries have some public overview of market data available, for example media or finance, but yes, it can be hard to find.
  3. We are not able to compare the data in our tool – then select a professional solution, not just a tool. The high-end vendors are very much able to integrate market data into the system via API and give you a clear overview of how your company's data compares with the market.

Back to the metrics, starting with an easy one: an increase in server calls is good. We all agree to add the additional aspects discussed above. What if the bounce rate is rather high? I would say: perfect, well done (if we are talking about a specific landing page with no further call to action). I would say: oh my god (if we are talking about an eCommerce shop). The perspective makes the difference, as do the website goals. Depending on content and goal, a high bounce rate can theoretically be the ideal setup: if you run a company that concentrates purely on services and you deliver all the needed information on a specific page, then there is no need for the visitor to click further – they received all the information they were searching for. Job done.

Another metric worth discussing concerns streaming media content. Is it bad if a video was interrupted partway through, or has not been viewed to the end? You simply cannot tell. It all depends on the placement of the content. If the relevant content has been seen (and was perhaps packed into the first minute), then an interruption after that minute is less relevant and less alarming, because the main message was received. So the video may be 5 minutes long and the completion rate below 20%, but you do not care, because the main information has been delivered.

How about a long duration on a website? Good or bad? Again: it depends. If you have fascinating content on your website, with lots of videos and detailed articles, a long duration seems more likely than on a news site. But the duration can also be high on a news site – because your editor is using the wrong words or overly complex descriptions. If you can check the duration per article and set it against the number of words, for example (again: good analytics solutions can combine different data sources and give you a more transparent overview), you can see whether the complexity level per article has an impact on the duration.

Finally, I also want to touch on a more detailed topic: picture galleries. Is it good or bad to add more pictures to a gallery? Does it depend on the area of the gallery, meaning the content or the page? Is there a limit to the number of pictures you can show before people get annoyed at clicking through too many of them? Try it out! Analytics is your friend here, too.

With the experience gained and the testing done on current clients, we were able to get some results. They are far from being representative official figures, but they are good to check against your own website. It was obvious that the completion rate was always around 45-55%, no matter how many pictures were included in the gallery. That means the costs for stock photos and the effort involved can pay off. You might have asked yourself whether it is worth putting more and more photos into that gallery.

The answer in short is: yes. If you include 40 pictures, visitors will most likely click through around 20, so 50%. If you include 15 pictures, visitors will most likely click through approximately 7 pictures, so roughly 50%. The quantity is less important; what matters is to include your most important pictures or messages within the first half. See the screenshot below, covering a range of galleries from 3 to 40 pictures – the completion rate is much the same across all of them.

[Screenshot: PhotoCompletion_ComparisonGraph – completion rates for picture galleries of 3 to 40 images]

So, how long should a picture gallery be? As long as you like, as long as you make sure you cover all your key material within the first 50%. Does it matter in which parts or channels of a website, or to which interest group, the gallery is shown? No. They all have a completion rate of around 50%.
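As a rough illustration of the metric itself, here is a minimal sketch of how such a completion rate could be computed, assuming hypothetical per-gallery data exported from your analytics tool (the field names and numbers are invented):

```javascript
// Hypothetical export: one row per gallery, with the total number of pictures
// and the average deepest picture visitors reached.
const galleries = [
  { name: "Gallery A", pictures: 40, avgDeepestViewed: 21 },
  { name: "Gallery B", pictures: 15, avgDeepestViewed: 7 },
  { name: "Gallery C", pictures: 3,  avgDeepestViewed: 2 },
];

// Completion rate = how far into the gallery visitors get, on average.
galleries.forEach(function (g) {
  const completionRate = (g.avgDeepestViewed / g.pictures) * 100;
  console.log(g.name + ": " + completionRate.toFixed(1) + "% completion");
});
```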

So what have we learned?

  1. We are not unique – there is competition
  2. Market data can be hard to find but it is very valuable
  3. You need to be able to compare your data
  4. Find answers to questions that have not been asked yet
  5. Go and start with something. But start!

7 Ways You Should Have Advanced Your Digital Strategy in 2013



Wow, it's January 2014 already! If you're like me, 2013 was a fast-moving combination of wins, near-misses, regrets, accomplishments, failures and fun. Amongst it all, I hope you can look back and count 2013 as a pivotal year for your online strategy. Before we dive into 2014, I'm thinking of seven ways you could have, no, make that should have, advanced your digital strategy.

1) Implemented a Tag Management System
I’ll start with this one; it’s a no-brainer. If you don’t have a Tag Management System in place yet, forget the other six items on this list and immediately go sign up for Google Tag Manager. It’s free, and it will save you countless hours and headaches in the future. It’s not the only good solution out there, but it’s a good place to start for 95% of businesses or organizations.
For more on Tag Management, check out these great posts: http://www.webanalyticsworld.net/category/tag-management

2) Planned a Migration to Universal Analytics
If you are using Google Analytics, you’ve undoubtedly heard about Google’s next generation framework dubbed Universal Analytics. It’s the first fundamental overhaul to the Google Analytics measurement model since Google purchased Urchin in 2005, and it will open the door for tons of great features in the future. Unfortunately, for many users, the value of Universal Analytics has not been made clear enough. While at the moment there are still some features missing that some GA users depend on (remarketing support, DFA integration, etc.), an upgrade already makes sense for many businesses. The simplified codebase, server-side options and ability to define and use custom metrics and dimensions make Universal Analytics appealing already.
Google has made transitioning to UA fairly easy, but there are still some coding changes required. If you've taken my advice on #1 and are utilizing a TMS, this shouldn't be so bad. Go ahead and put a migration plan together for Q1 of 2014 and make it a priority to switch to UA. You'll be able to quickly implement new features and functionality, including the oft-cited promise of tracking users across devices with the UserID override capabilities.
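For a sense of how compact the upgraded code is, here is a minimal sketch of the analytics.js calls involved (a sketch only: the property ID and custom dimension value are placeholders, and it assumes Google's standard analytics.js loader snippet is already on the page or delivered via your TMS):

```javascript
// Universal Analytics (analytics.js) – placeholder property ID and values.
ga('create', 'UA-XXXXXX-Y', 'auto');      // 'auto' takes care of cookie domain configuration
ga('set', 'dimension1', 'logged-in');     // custom dimension; its index is defined in the GA admin UI
ga('send', 'pageview');
```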

3) Executed a Smart PLA Strategy
It’s not much of a stretch to say that PLAs took over the SERPs in 2013. If you sell products online, but aren’t in the mix of Product Listing Ads, you missed the boat. PLAs offered merchants low CPCs and high conversion rates, but lately CPCs have been climbing fairly dramatically (53% increase since last year).

There is a difference, however, in having PLAs and having a smart PLA strategy. The default way of adding products to an AdWords account is actually very dumb, and it produces a lot of bad results for merchants. At this point, without a granular, product-specific PLA strategy, you may be underwhelmed with PLA performance. Smart merchants are using 3rd party tools and services to bid efficiently by inventory, category, profit margin and other key data points. Breaking out products efficiently into campaigns and ad groups allows for greater control and precision bidding – just what SEMs are used to.

4) Advanced Your Remarketing Campaigns
You do have a remarketing program, right? OK, good. But how sophisticated is it? The best remarketing campaigns are built with highly dynamic designs, reach audiences across multiple platforms and are based on specific visitor behavior and time. If you are still throwing all visitors into one bucket called “All Audiences,” it’s time to take it to the next level.

5) Cursed Google for Taking Away Keyword Data
Ok, this won’t exactly “advance” your strategy, but it needs to be done, am I right? Moving on…

6) Developed a Custom Attribution Model
Attribution modeling used to be only for the elite: the 1-percenters. Not anymore. Many folks don't even realize that there's a pretty nifty Attribution Modeling Tool just waiting to be used in their own Google Analytics account! Yep, and it comes with seven built-in models and the ability to create all kinds of exciting custom models. Further, in 2013 Google added the ability to integrate GDN display data to spice up those attribution models (see my earlier post about this here).
If you are still using Last Click, prepare to have coal in your stocking this year.

7) Invested in Conversion Rate Optimization
Acquiring traffic to a website has never been more difficult or expensive. On the SEO front, techniques that used to work don't work as well anymore, and Google is methodically pushing organic results farther and farther down the page. On the paid side, CPC costs continue to rise and competition is fiercer than ever. As traffic acquisition becomes harder and harder, it's surprising that more businesses aren't focusing their attention on Conversion Rate Optimization. For most organizations, it is much more affordable to increase their conversion rate by 20% than it is to increase their total traffic by 20%. And did you know that for every $92 spent on acquiring traffic, only $1 is spent helping that traffic convert to a lead or a sale?

Our clients are finding CRO drives tremendous value across all of their online channels, and at a fraction of the cost it would take to produce similar gains through acquisition alone. If you missed this opportunity in 2013, let next year be the year you start a Conversion Rate Optimization journey.

That’s my list, but you probably have others, right? Chime in below and let us know how you made out in 2013, and what’s on the list for this year!


Digital Marketing Data Gift Guide for 2013



Digital Marketing Data

GIFT GUIDE

2013

Gifts for the:

Digital Analyst | Content Marketer | CMO | CDO

With love and apologies to the fine writers of the New York Times Gift Guide 2013.

~

Marketing Data Gifts for the Digital Analyst

All good analysts know: What’s measured matters. Apply this general wisdom to gift-giving, and the results are deepened insight, increased mathematical rigor, and metrics to treasure.

  1. Multi-device, multi-browser analytics: One of the best gifts I ever got was access to Google’s Universal Analytics tool. Once it was assembled, I had access to cross-device, cross-browser behavior as well as a handy attribution model. The resulting insights spurred me into a happy analytics dance. Also available in Premium starting at MSRP $150K. Some assembly required.
  2. Freedom from pre-packaged reporting: If you thirst for a web analytics tool that lets you get under the hood – direct access to raw, structured data in an easy-to-query format – start with a demo of Snowplow Analytics. Snowplow Analytics is a powerful, open source platform from the UK. You can store data in your own AWS cloud. You can query it with any tool you want. In lieu of pre-packaged views, Snowplow gives you advanced analysis recipes that turn even average web analysts into marketing masterminds.
  3. Robust text analysis: Truly useful text analysis tools – as opposed to top lists and word clouds – are surprisingly hard to find. KNIME caters to a wide range of industries, and will support your enterprise, too. The suggested applications range from social media influencer analyses to recommendation engines. Whitepapers with repeatable workflows are useful, though training is required. Gift this to an analyst with a training budget – stateside training is rare and pricey.
  4. Meaningful engagement metrics in social media: Chaos and confusion continue to reign in the world of social media, as marketers find themselves caught between a rock and a hard place. On one hand there is too little time to authentically engage with customers; on the other, there is a slew of vanity metrics being touted with little business relevance. Fortunately there may be an out, as the more established social media networks continue to release increasingly compelling analytics tools. Here are some suggestions for diving deeper into your social media metrics.
    1. YouTube Analytics Groups: “Groups allow you to view aggregate data of the videos or channels in a group, which can help you analyze performance in an organized way. For example, you can create groups based on a common topic or type of video as well as by geography or the recency of the upload. You can see groups data for all the reports available in YouTube Analytics.” – YouTube
    2. Twitter Analytics: All statistics from Twitter, including follower characteristics, account growth and click-through rates on account tweets, can be accessed by setting up a $1 campaign (and then canceling it before a penny has been spent). A how-to guide is available from Econsultancy.


Big Data Gifts for the Content Marketer

It’s a hot topic! It’s a marketing channel! It’s a digital consumer product! We offer data in three surprising categories. Thanks to net neutrality, these items come not just from the marketing industry but from organizations focused on academics, philanthropy, fitness and technology.

  1. “Big Data” as a hot topic: This hot topic has intrigue, unlimited potential, and a series of inherent challenges that marketers can discover as they devise their 2014 content plans. Thought leaders are jotting down their “big data” metaphors in tweets, blog posts and status updates in an attempt to feed the insatiable demand for big data content coming through Google search.
  2. “Big Data” as a marketing channel: It is now possible to gain access to hard-to-reach IT Decision Makers through data itself. Host a contest on Kaggle (or if you are a non-profit, launch a project with DataKind) with a data set and a tantalizing problem statement. Participating data scientists who crave real-world data to develop and refine their techniques can then be recruited (with consent), if not simply incentivized, to transform the way you think about your business.
  3. “Big Data” as a consumer product: “Big data” digital products, in a variety of shapes and sizes, connect to the internet as they capture and quantify consumer behavior in real life. The classic example is Nike+ Fuelband. Don’t underestimate the staying power of these products. With advanced analytics algorithms, consumers will be as overwhelmed and entranced by their quantified selves as content marketers are with campaign optimization.


Big Data Gifts for the CMO

Big data does not have to cramp your style. Our selections add variety and velocity to the marketing dollar. Whether your marketing message is targeted towards B2B or B2C, these gifts speak the international language of ROI.

  1. Get results, fast! For a sped-up marketing campaign, combine real-time bidding with real-time analytics, to perk up both awareness and conversions.
  2. Pay for what you get: Put the kibosh on paying for bot clicks: a refreshing band of marketers is moving to crack down on impression and click fraud in advertising.
  3. The best things come in tiny packages: As tailored messaging is created for niche audience segments, smaller campaigns are becoming favorites among data-savvy marketers. This streamlined approach replaces spending beyond the point of diminishing returns by producing more campaigns with less spend on each.


Big Data Gifts for the CDO

In my office, filled as it is with go-getter entrepreneurs and millennials, revenue opportunities abound. We're learning not to grow too attached to “my next million dollar…” ideas. The best of this year's revenue opportunities are so lucrative, however, that I'm scheming to place them with the Chief Data Officer (CDO). It won't be too long until we'll need the money in the bank to fund the next big thing.

  1. Big data as a revenue stream: Sure you can sell products and services with differentiating features at competitive prices. But they’re not efficient if you need a lot of revenue in a hurry or want to supply a quickly growing demand. Consider instead monetizing your data. While more complicated than other potential offerings, monetized data captures a share of market currently up for grabs. In fact, in the first half of 2013, Twitter made $32 million in revenue from “data licensing.”



7 things to consider before blindly choosing Google Tag Manager



More and more companies are switching to a Tag Management System (TMS) these days. And many of those simply go with the easiest option – the free and readily available Google Tag Manager. But sometimes it is better to look a bit further. Consider these 7 thoughts before deciding. 

Google is so incredibly dominant. It has long since replaced “to do an internet search” with “to google”. Nowadays, many people seem to equate “Web Analytics” with “Google Analytics”. As if that were not enough, more and more often I get the impression that “Tag Management System” has become synonymous with “Google Tag Manager” – as if Google's TMS were the only one out there.

Far from it! Google Tag Manager is neither the most powerful, nor is it even the only free tool on the market! QuBit's “OpenTag” (free up to one million pageviews per month and very cheap after that) and DC Storm are the other popular free ones out there, and they are pretty good themselves. A sort-of-free tool is “Adobe Dynamic Tag Management” (formerly “Satellite”), which you get for free when you use a product of Adobe's Marketing Cloud.

Strengths of Google Tag Manager

That being said, I am a big fan of Google Tag Manager and use the tool almost every day (see my latest article here on Custom JavaScript Macros for example). It opened up the world of tag management for me. The major benefits of Google’s TMS in my opinion are:

  • free (simply pay by giving yet more information to Google)
  • available in seconds (just log in)
  • fast and very user-friendly
  • probably the best turn-key integrations with Google Analytics and AdWords
  • When you want to use a non-turn-key tag, you can usually just copy and paste the JavaScript code that your vendor gives you. No need to re-scope JavaScript variables or alter the code like you have to do in some enterprise tools.
  • probably the slickest testing functionality – I just love the ultra-fast in-browser testing on the live system
  • many other users, so the internet is full of tips and tricks 

Drawbacks of Google Tag Manager

But Google Tag Manager also has some drawbacks in comparison to enterprise tools like BrightTag, Ensighten, Tealium, TagMan or Adobe Dynamic Tag Management if you are an Adobe client. Not all of these tools offer all the functionalities I am going to describe below. But, as a general rule, the larger your enterprise and the more people involved, the more severe Google's weaknesses become:

1. Very few turn-key tags

“Turn-key tags” are the tags you create by simply filling out little forms instead of pasting and altering JavaScript code. Apart from the Google tools, turn-key support for other tags is sparse in Google Tag Manager. This has two consequences:

a) If your main Web Analytics tool is not Google Analytics, but, say, Adobe Analytics (“SiteCatalyst”) or Webtrends, you may have a hard time implementing those via GTM. 

b) All those other tags require you to paste JavaScript code into a “Custom HTML” tag in GTM. That, in turn, usually requires some IT folks to review the tags before they can go live – which in turn means that you may lose one of the main benefits of a TMS: speed. 
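To make the difference concrete, here is a sketch of the kind of JavaScript that typically ends up inside such a “Custom HTML” tag; the vendor hostname and parameters are invented for illustration:

```javascript
// Contents of a GTM "Custom HTML" tag (wrapped in <script>...</script>):
// a hypothetical third-party pixel pasted more or less verbatim from the vendor.
(function () {
  var img = new Image();
  // "tracking.example-vendor.com" and the query parameter are placeholders, not a real tag.
  img.src = 'https://tracking.example-vendor.com/pixel?page=' +
            encodeURIComponent(document.location.href);
})();
```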

2. No workflow management / only basic rights settings

This is crucial for many large enterprises. Some companies want to be able to give the marketing team the right to publish on the test platform, but not on the live website. This is impossible in Google Tag Manager. Another frequent request is that Marketing can only publish a tag to the live system after IT has reviewed it – a simple case of a 4-eye check that is so common, but not possible in Google Tag Manager (nor in some enterprise tools).

3. Security

With a TMS, you can easily bypass all the usual restrictions developers go through when they want to publish code to the website. So if someone hacks into your Google Tag Manager admin account, your website can be compromised in seconds: it could be taken down, serve trojans to your visitors, or push your tracking data to a competitor's system. In a time when we hear of new hacking scandals every month (see the recent one at Adobe, for example), this is a huge security issue for some companies.

So look for a tool that offers two-factor authentication. That can be an SMS pin code or an email that contains a link that has to be clicked before you are able to publish to the live system. Some tools also offer IP restrictions so only users inside your company are able to log into the TMS.

4. No functions that go beyond a content management system for tags

A content management system for tags – that is a TMS's most obvious benefit. Nevertheless, enterprise tag management systems are increasingly leaping into what I call “data integration upon collection”; others call it “digital data distribution” (see “BrightTag's Fuse” or this whitepaper by Tealium and Web Analytics Demystified, for example). Since all the visitor data you collect on your website goes through your TMS, there is no need to plug into multiple third-party tools' APIs anymore to get the data into your data warehouse or dashboard. Instead, you let your TMS push the data directly to wherever you want to integrate it (the data warehouse, for example). It sounds easier than it is, but I believe that this is the future.

Other handy features common to Enterprise Tools and currently not available in Google Tag Manager are:

  • Off-site tagging (Ensighten offers this for example)
  • Privacy law compliance: Some tools offer automatic compliance with “Do-not-Track” (some offer this to be respected on a per-tag basis) or the privacy laws of the country the visitor is coming from. This can be important especially when looking at how differently every EU member state has interpreted the “cookie directive” (see Ensighten’s “Privacy” platform or a related functionality by Tealium). 
  • Tag Performance Reports (how long does each tag load? Which tags are not working, and on which pages is that the case?)
  • Server-side tag execution (possible with some of BrightTag’s tags)
  • “Visual Tagging” tools that allow you to create tags and data layers by just clicking on elements on your website or even in your mobile app (see Ensighten Mobile)

5. Support

There are indeed a lot of tips and tricks on the internet for Google Tag Manager. But sometimes that is not enough. With an enterprise tool, you can turn to people who know their tool like no one else.

6. No way to change the order in which turn-key tags are fired.

Have a Google Analytics Event Tracking tag that should be fired upon pageload, but after the Google Analytics pageview tag (because if the event fires before the pageview, your page may not count as an “entry” page)? Currently, there is no way to do that other than through a workaround with custom JavaScript that means saying good-bye to your handsome turn-key tags (see CardinalPath’s Blog for an example).
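One common shape of such a workaround – a sketch, not CardinalPath's exact recipe – is to give up the two turn-key tags and send both hits from a single Custom HTML tag, since analytics.js commands issued in sequence are dispatched in order:

```javascript
// Inside one GTM "Custom HTML" tag: pageview first, then the event.
// Placeholder property ID and event names; assumes the analytics.js loader is on the page.
ga('create', 'UA-XXXXXX-Y', 'auto');
ga('send', 'pageview');
ga('send', 'event', 'Engagement', 'page-loaded', document.location.pathname);
```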

7. No support for Google Content Experiments (A/B testing)

If A/B testing is something you love, Google Tag Manager will not make it easier for you. It still does not support Content Experiments. That being said, even some Enterprise Tag Management Systems have this problem (luckily, not all of them!). The main reason for this is probably that the Content Experiments tag is supposed to load synchronously instead of asynchronously and it has to go into the start of the head section whereas GTM’s tag container goes after the opening body tag.

The points I think are especially important are 2, 3, and 4. So before just going with the seemingly easiest choice at hand, do invest in a tool evaluation or test-drive Google Tag Manager against an Enterprise tool on a smaller pilot website. Make sure to compare the page load time as well, since it can differ quite strongly from tool to tool – even though all TMS providers claim their tool is the fastest, some actually slow down your site instead of making it faster.

Which features do you miss in Google Tag Manager?

Now it is your turn: Which features would you like to see in Google Tag Manager? What does an Enterprise TMS  bring to the table that GTM just can’t? I am curious to read your comments.


It’s time to put the customer back into Behavioral Analysis with multi-screen, multi-device analytics



Measurement protocols are adapting to track customers instead of cookies.

I have spent an inordinate amount of time helping clients understand the difference between “unique visitors” and “people” when it comes to measuring web site audiences. The “unique visitor” metric comes from web analytics and is defined in terms of browser cookies. The “people” metric comes from IP/panel data (like Quantcast) and is typically an estimate based on a representative audience sample. Neither metric actually represents a person, like you or me or the people we interact with in real life (irl) every day.

The “unique visitor” metric is expected to be greater than the number of “irl people,” or customers that access your sites, and the “people” metric varies according to how representative the sample is of your audience.

If only there was a way to tell how many “irl people” are visiting sites.

Universal Analytics (UA) from Google moves us one step closer to being able to have the “irl people” metric and associated insights.

As the prevalence of multi-device browsing grows, brands are investing in mobile experiences in addition to web ones. But are they reaching the same customer twice – once on the web site and again on mobile? Or are they reaching a new audience on their mobile properties that they weren’t reaching before because they only had a website? Are customers converting at a higher rate thanks to being able to have a mobile touchpoint with the brand instead of just a web one?

Many web analytics tools let you capture the User-IDs for logged-in visits in a variable, but the reporting effort needed to answer questions about the impact of cross-device browsing is clunky since the measurement protocols are cookie based, rather than customer based. The measurement protocol for UA is designed around the User-ID, so once you start tracking that information, associated activity is attributed to one visitor in the reports – one “irl person.”

The seamlessness of the solution provided by the new measurement protocol in UA means clunky workarounds – like having to adjust unique visitor numbers by the number of times a User-ID shows up in visits reports – are obsolete.
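For orientation, here is a minimal sketch of how the User-ID is set with analytics.js. The “userId” field is the documented option; the helper that fetches the logged-in customer ID from your own system is hypothetical:

```javascript
// Tie this browser's hits to a known customer (never send PII such as an email address).
var customerId = getLoggedInCustomerId();   // hypothetical helper from your auth layer, e.g. "C-102938"

ga('create', 'UA-XXXXXX-Y', 'auto', { userId: customerId });
ga('send', 'pageview');
```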

Battle scars gained from explaining the difference between “unique visitors” and “people” have made me shy away from reporting on those metrics and made me recommend that analysts focus on discrete sessions which are better captured in cookie-based web analytics tools. Now, with UA, I am excited to see analysts putting the “irl people,” the customers, back into their analyses.

Are new measurement protocols, like cross-device tracking, changing how you measure your site audiences?


Google Analytics integration for Digital Business Intelligence




In our introductory post in this series, we outlined the structure of our coverage of the 5 broad categories of use cases for extending Google Analytics functionality through API integration. In this post, we take a deeper look at the first of these categories and explore how Google Analytics can be turned into a powerful repository for business intelligence through API integration. By using Google Analytics as a data collection engine rather than a reporting interface, serious marketers can engineer a truly enlightening, top-down transformation of the quality of insights generated from their digital channel data.

Following are some examples of the key features that can easily be enabled by plugging Google Analytics data into any capable business intelligence tool.

Number 1: Reporting using multi-source data

Google Analytics holds a gold mine of clickstream data which, by itself, is of limited utility in driving any strategic marketing optimization. The two largest data points not easily captured within Google Analytics are cost (not just media but also other categories including staff, third-party agencies, software etc.) and the behaviour of customers who transact either offline or on a different web property from the one used for acquisition. Google recently made a lightweight attempt at letting you send such data into Google Analytics with its cost data upload and Measurement Protocol features, but for all practical purposes these remain of little utility in enterprise marketing measurement. For most companies, digital marketing is only a fraction of overall marketing efforts, and measurement needs extend well beyond the boundaries of web analytics. Typically, such companies already deploy some form of business intelligence functionality for these wider measurements, and replicating this inside Google Analytics can never be justified as a sound investment or even a semi-literate Enterprise Architecture approach. As such, a holistic assessment of overall marketing performance remains an elusive goal when working within the Google Analytics interface.
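For completeness, here is a minimal sketch of what a Measurement Protocol hit looks like when pushing, say, an offline event into Universal Analytics. The endpoint and parameter names are the documented ones; the property ID, client ID and event values are placeholders:

```javascript
// Node.js sketch: send one offline event via the Universal Analytics Measurement Protocol.
var https = require('https');

var params = [
  'v=1',                                        // protocol version
  'tid=UA-XXXXXX-Y',                            // tracking/property ID (placeholder)
  'cid=35009a79-1a05-49d7-b876-2b884d0f825b',   // anonymous client ID (placeholder)
  't=event',                                    // hit type
  'ec=offline',                                 // event category
  'ea=phone-order',                             // event action
  'ev=250'                                      // event value
].join('&');

https.get('https://www.google-analytics.com/collect?' + params, function (res) {
  console.log('Measurement Protocol hit sent, status ' + res.statusCode);
});
```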

 

 

Number 2: Visualization

This is easily the single biggest reason why advanced marketers might want to integrate Google Analytics with a capable business intelligence tool. High-traffic sites routinely generate large amounts of data whose numerical output is hard to ingest and make sense of. A typical approach to the lack of visualization capabilities within Google Analytics is to export data into Excel and then run some elementary charting. This method is crude and primitive, and it fails serious marketers who need to quickly visualize big data sets along multiple dimensions. Business intelligence tools address this challenge by providing advanced visual aids (chart types, filters, overlays, colour intensity, graphic size etc.) that help marketers quickly identify patterns such as outliers, missing values in datasets, gainers/losers, percentage contributions, actual vs. target comparisons and so on.

 

 

Number 3: Calculated metrics and dimensions

Effective analysis routinely requires building custom metrics and dimensions on the fly, after the data has been collected by the analytics engine. Ever tried analysing an arbitrary metric within Google Analytics? For example, a bespoke mathematical relationship between cost and revenue? Or the weighted average of revenue realization across various channels? Or perhaps the running total of all website conversions in a given window? While not possible within Google Analytics, running such analysis in a business intelligence tool usually requires nothing more than a bit of common sense and a basic knowledge of mathematical expressions.
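As a minimal sketch of what is meant here, the calculations themselves are trivial once the data is out of GA; the per-channel rows below are hypothetical, as pulled via the GA API or a CSV export:

```javascript
// Hypothetical per-channel rows exported from Google Analytics.
const channels = [
  { name: 'organic', cost: 0,    revenue: 18000, conversions: 240 },
  { name: 'cpc',     cost: 9500, revenue: 26000, conversions: 310 },
  { name: 'email',   cost: 1200, revenue: 8000,  conversions: 95  },
];

// Bespoke metric: revenue per unit of cost (guarding against zero cost).
channels.forEach(function (c) {
  const roas = c.cost > 0 ? (c.revenue / c.cost).toFixed(2) : 'n/a';
  console.log(c.name + ': revenue per cost unit = ' + roas);
});

// Running total of conversions across channels.
let runningTotal = 0;
channels.forEach(function (c) {
  runningTotal += c.conversions;
  console.log(c.name + ': running total of conversions = ' + runningTotal);
});
```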

 

Apart from the above high-level drivers for integrating Google Analytics, there are a number of other advanced reporting capabilities that are not supported within the existing interface but which can easily be enabled by plugging Google Analytics data into a business intelligence tool.

 

Number 4: What-if analysis

This technique forms the bedrock of strategic planning for most established digital marketing departments at intermediate to advanced levels of marketing maturity. Examples include:

- How does the conversion rate change if traffic to landing page x is increased by y%?
- What is the impact on revenue if the paid advertising budget in channel z is decreased by 5%?
- How many additional new visitors would be required to lift the overall email signup rate by 2%?

Business intelligence tools typically answer such hypothetical questions using parameterized inputs and provide a visually appealing display of the output based on automatically calculated (or custom-generated) models. Marketers routinely use such analysis for better planning and prioritization of their optimization efforts.
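A toy sketch of the idea, assuming (naively) that revenue above a fixed baseline scales linearly with spend; a real BI tool would fit and parameterize the model for you, and all numbers here are invented:

```javascript
// What-if: projected revenue if the paid budget in a channel changes by some percentage.
function projectedRevenue(currentRevenue, budgetChangePct, baselineRevenue) {
  var variableRevenue = currentRevenue - baselineRevenue;   // the part assumed to follow spend
  return baselineRevenue + variableRevenue * (1 + budgetChangePct);
}

// Hypothetical channel "z": 200k revenue, 80k of which is baseline; budget cut by 5%.
console.log(projectedRevenue(200000, -0.05, 80000));        // -> 194000
```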

 

 

Number 5: Trend analysis and forecasting

Unlike what-if analysis, trend analysis techniques project the future value of metrics based on historical data, without any user input. For example, a trend analysis of website conversion rates can be used to estimate website revenue at a future point in time, and then to work back and calculate the marketing cost of delivering a certain cost per acquisition.
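The simplest possible version of this is a straight-line fit over historical values; a minimal sketch with a few weeks of hypothetical conversion rates:

```javascript
// Fit a least-squares line through weekly conversion rates and project one week ahead.
const weeklyRates = [2.1, 2.3, 2.2, 2.5, 2.6, 2.8];  // hypothetical % values, weeks 0..5

function linearTrend(values) {
  const n = values.length;
  const xMean = (n - 1) / 2;
  const yMean = values.reduce(function (a, b) { return a + b; }, 0) / n;
  let num = 0, den = 0;
  values.forEach(function (y, x) {
    num += (x - xMean) * (y - yMean);
    den += (x - xMean) * (x - xMean);
  });
  const slope = num / den;
  return { slope: slope, intercept: yMean - slope * xMean };
}

const trend = linearTrend(weeklyRates);
const nextWeek = trend.intercept + trend.slope * weeklyRates.length;
console.log('Projected conversion rate next week: ' + nextWeek.toFixed(2) + '%');
```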

 

 

Number 6: Outlier detection

Outliers can wreak havoc on any analysis output, no matter how well planned. Being able to quickly identify outliers through a visual display allows marketers to present a more accurate analysis of the data at hand. This is especially relevant in web analytics, where average values of metrics are used for practically all analysis output. For example, making judgements about overall conversion rates based on a few exceptionally high-converting days in a campaign is ill-advised and almost sure to produce misleading results.
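One simple way to flag such days is a z-score over the daily values; a quick sketch with invented numbers (the two-standard-deviation threshold is an arbitrary choice):

```javascript
// Flag days whose conversion rate is more than 2 standard deviations from the mean.
const dailyRates = [2.2, 2.4, 2.1, 2.3, 6.8, 2.5, 2.2];  // hypothetical %, one spike day

const mean = dailyRates.reduce(function (a, b) { return a + b; }, 0) / dailyRates.length;
const variance = dailyRates.reduce(function (s, r) { return s + Math.pow(r - mean, 2); }, 0) / dailyRates.length;
const stdDev = Math.sqrt(variance);

dailyRates.forEach(function (rate, day) {
  if (Math.abs(rate - mean) > 2 * stdDev) {
    console.log('Day ' + day + ' looks like an outlier: ' + rate + '%');
  }
});
```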

 

 

Number 7: On-the-fly calculations

Ever tried measuring the percentage change in conversion rates since the launch of a campaign? Or the change in each channel's revenue contribution as a percentage of overall paid advertising spend? Or perhaps the lifetime value of customers across a given period? There are numerous such analysis use cases that cannot be implemented within the native GA interface, but all of them form a critical part of any commercially relevant analysis.

 

 

These are just some of the use cases for extending Google Analytics for wider business intelligence. Excel-based reporting using CSV files downloaded from Google Analytics is a crude substitute for API integration, and in any case it runs out of utility when dealing with big data volumes or when data requires complex pre-processing. Excel “plugins” designed specifically for fetching Google Analytics data into Excel do away with the need for manual downloads, but remain practically useless when dealing with big data requiring complex transformations or advanced visualizations.

Using cloud-based data warehousing techniques that leverage the powerful Google Analytics API in conjunction with user-friendly business intelligence tools, marketers can now develop highly cost-effective digital insights infrastructures and give paid web analytics tools a serious run for their money.


Getting rid of the inflexible side of Google Tag Manager



Google Tag Manager has been a nice tool, but often a bit inflexible. With the new feature “Custom JavaScript for Macros”, that has changed. In this sample case, I’ll show you how this new feature helps you reduce redundancies and shrink your clutter from 2 tags, 2 macros, and 2 rules, to just 1 tag and 1 macro.

Google Tag Manager (GTM) may still be behind many of the paid enterprise tag management systems (TMS), but it is getting better and better. Recently, Google released a little feature with a lot of power – it is called “Custom JavaScript” for Macros. No, I do not mean the old “Custom HTML” tag that allows you to execute JavaScript tags. That one is mainly there to integrate the many tools for which GTM doesn’t offer a turnkey integration (clicking your tag together instead of coding it). “Custom JavaScript for Macros” instead is a feature that allows you to make your Macros and rules very flexible and reduce redundancies.

Example case: One GA Code, several domains and subdomains

In our case (here a simplified version of it) – a very common case I might add – the client used Google Analytics and wanted to track visitors across all of his many subdomains. For this to work, you need to set the domain in your tracking code to the name of the main domain. In Google Tag Manager, you can do this by entering the client’s main domain into the field “Domain Name” when setting up a Google Analytics tag.

[Screenshot: setting the “Domain Name” field in a Google Analytics tag in Google Tag Manager]

Now, this client also had one or several “test” platforms in addition to the “live” platform.  But the test and staging platforms ran under different domains (not just different subdomains), for example:

  • myclient-test.com
  • subdomain.myclient-test.com
  • etc.

Google Analytics was to run on the testing platforms as well. But if you set the domain name in the GA tracking code to “myclient.com”, yet the actual domain name is “myclient-test.com”, the code will not be executed because GA’s first-party cookie can’t be set by a third-party domain.

Until recently: 2 tags, 2 macros, 2 rules

So until recently, the recommended way to work around this was to create extra work for your developers by inserting the current main domain into the GA data layer (in the HTML code on the page) and then inserting this data layer variable's value into the “Domain” field in your GA tag. A very inflexible solution, and exactly the thing you wanted to avoid now that you have a Tag Management System. So if you wanted to do it without your IT, you had to create a lot of redundant clutter:

  • Two GA tags:
    • A: One tag with the domain name set to the main live domain (myclient.com)
    • B: One tag with the domain name set to the main testing domain (myclient-test.com)
  • Two “Constant String” Macros (recommended because if the testing domain changes one day, you don’t have to change all the tags where you’ve hard-coded the old domain name)
    • Main live domain
    • Main testing domain
  • Two rules:
    • When on hostname that belongs to the live platforms (rule for firing tag A)
    • When on hostname that belongs to the testing platforms (rule for firing tag B)

Now this is a simplified example. The client actually had yet another testing domain, and he needed other tags to run on both environments. Some of them had the same problem, so for these tags, we would have had this kind of tag proliferation too.

Now: One tag, one rule, one macro

What we would have loved to have had was a more dynamic form of a macro. One that doesn’t just return the current hostname, a data layer variable or a constant string, but one that would return the appropriate domain name to be set in the Google Analytics tag.

Now, with Custom JavaScript for Macros, we get exactly what we want. It allows you to put it all into one macro, no rule, and one tag. 

One Macro of the type “Custom JavaScript”: I called it “Return domain name to be set”.
 
[Screenshot: the “Return domain name to be set” Custom JavaScript Macro in Google Tag Manager]


Within a Custom JavaScript Macro, you define an anonymous function that returns a value. The return value becomes the value of the Macro.

In our example, our function checks the current hostname (the “domain”). If the hostname looks like we are on a testing platform, the return value becomes “myclient-test.com” (the main testing domain). If not, we return the main live domain. 
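In code, the macro looks roughly like this. It is a sketch: the hostnames are the placeholder domains from the example, and the exact test condition depends on how your staging hostnames are built:

```javascript
// GTM Custom JavaScript Macro: an anonymous function whose return value becomes the macro's value.
function () {
  var host = document.location.hostname;   // e.g. "subdomain.myclient-test.com"

  // On any testing hostname, return the main testing domain...
  if (host.indexOf('myclient-test.com') > -1) {
    return 'myclient-test.com';
  }
  // ...otherwise return the main live domain.
  return 'myclient.com';
}
```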

[Screenshot: the macro's Custom JavaScript code in Google Tag Manager]

 

(Remember that there can be subdomains, so we cannot just set the Macro to the current hostname.)

  • One Tag: Instead of two GA Tags, I need only one. In the field “Domain”, I refer to the Macro “Return domain name to be set” that I have just created. My tag will now set the domain to the value that my Macro returned, “myclient.com” or “myclient-test.com”: 

    [Screenshot: the GA tag's “Domain” field referring to the “Return domain name to be set” macro]

  • No rule: The only use for the two old rules was to fire tag A or tag B depending on whether I was on the live or test domain. But since my dynamic macro now automatically alters the tag to fit both scenarios, I can discard these rules. I can now simply fire my tag on any page. So I just use the preconfigured “All pages” rule – that’s it.

You could apply the same logic if you want to fill another field in your tag based on more dynamic rules. For example, you could use it to route your traffic into a different GA property ID depending on whether you are on your test or live system without having to set up 2 tags for that. 

Rules with dynamic instead of hard-coded patterns

But Custom JavaScript Macros can also be used to “dynamicize” tag rules: for example, you want to fire tags depending on whether there is a certain pattern in the URL (e.g., a Regular Expression to identify your Conversion page). Since many tags and rules need this pattern, without Custom JavaScript Macros you would need to hard-code the pattern into each of the rules:

[Screenshot: a rule with the URL pattern hard-coded]

Note that you can’t compare a macro-based value like {{url}} with another macro-based value. So this would not work:

[Screenshot: a rule that tries to compare the {{url}} macro with another macro – this does not work]

But with Custom JavaScript Macros, you can simply do it like this:

[Screenshot: the “Crazy Pattern URL Test Macro” Custom JavaScript Macro]

Instead of having the rule test the URL itself for the pattern, I let the rule test the return value of my Custom JavaScript Macro “Crazy Pattern URL Test Macro”. This Macro returns “true” if the pattern is found in the URL.
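A sketch of such a macro: the regular expression is a made-up stand-in for your real conversion-page pattern, and the string return values are what the rule condition then compares against:

```javascript
// GTM Custom JavaScript Macro "Crazy Pattern URL Test Macro" (sketch).
function () {
  // Placeholder pattern – replace with the expression that identifies your conversion pages.
  var conversionPattern = /\/checkout\/(thank-you|confirmation)/;

  // Return "true"/"false" as strings so a rule can test: {{Crazy Pattern URL Test Macro}} equals true.
  return conversionPattern.test(document.location.href) ? 'true' : 'false';
}
```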

If the pattern changes, I just change this one Macro – no need to edit all those rules, then forget one or mistype one and then spend hours finding your mistake.

I hope you can understand why the Custom JavaScript Macros are my favorite feature in Google Tag Manager: They save time, erase redundancies, and de-clutter my tag, rule and macro library.

One drawback: Yet more JavaScript

The one drawback of course is that you need some basic JavaScript know-how. But as with any Tag Management System, it is impossible to get the bigger benefits without JavaScript if you want to run more than a couple of simple tags.

Share your experiences

Have you used Custom JavaScript Macros as well? Please do share your case in the comments.
