TNT12: Make Quality Content Count with Web Analytics

Rick Allen 
Principal / Content Strategist, ePublish Media

The audio for this podcast can be downloaded at

Announcer: This is one in a series of podcasts from the HighEdWeb Conference in Austin 2011.

Rick Allen: Hi, I'm Rick. I've worked in higher education my entire professional career, in lots of different capacities on the Web, and in all of those different roles the common problem was content. That led me to focus on trying to come up with content solutions to a lot of these problems. So now I run a content strategy consultancy in Boston and aim to solve those content problems.

epublishmedia is the name up there, my Twitter handle. All the 'Rick Allen' variations were taken; I tried every one. I also co-founded Meet Content with Georgy Cohen, who's here and talked yesterday about higher ed news, and the two of us are big proponents of higher ed web content.


Meet Content is a blog/online resource dedicated to higher education content strategy, and our goal here is to enable higher ed to create and sustain web content that works. We're trying to elevate the discussion about higher ed web content and are looking to talk to lots of you about the problems that you're running into and share the solutions. So, yes, I'm definitely a fan of quality content.

Today, I want to talk about web analytics, though, and how analytics can support content planning and also be used to demonstrate content success. For anyone working on the Web, this is critical, because if we can't demonstrate success, we can't make the case for content work.

So what about analytics? Exactly. Analytics is the 'So what?' of content strategy, allowing us to quantitatively evaluate content quality including content delivery, communications, branding, almost every element of content strategy.


But measurement is typically considered a post-launch task. You're evaluating content after it's been published. But it's not the only time we need to be asking 'So what?' In fact, at every stage of the content strategy planning process, we need to be asking this question. And as a result, analytics can help inform our content work at every stage.

The topic of this presentation was born out of my own need to quantify the value of my work. And that's very hard with content, the reason being that content strategy is a process, not a project. That makes it very hard to measure the effectiveness of all these different deliverables and stages and make the case for content.


As Kristina Halvorson, author of "Content Strategy for the Web" says, "Web analytics changed the game for marketers and advertisers. But with respect to content quality, many practitioners are flip-flopping with regard to which numbers matter, and whether we can make use of them." This is one of the problems we need to fix.

Content folks, including myself, get hung up on the fact that analytics can't adequately answer 'Why?' It can't define user intent. And all the data that we have available to us will not replace a single person coming up to you and telling you why they came to your website and why they left.

Another problem is articulated by Clare O'Brien, a content strategist in London: "Broadly speaking, and thanks largely to the ubiquity and ease of access to Google Analytics, businesses have become fixated by traffic volumes, bounces, sources, journeys and subsequent destinations and the like and aren't looking to learn more." We need to move past dashboard metrics.


This is a typical dashboard view on Google Analytics, and this is typically where people start their analytics analysis. They're captivated by flashy, sexy data. Yes, I did say "sexy" and "data" in the same sentence.


Rick Allen: But this data is not useful on its own.

The content measurement roadblocks that we're facing are we don't know what numbers matter and we're looking in the wrong places. And the reason these are roadblocks is numbers rarely speak for themselves.

Two-thousand page views. Is this good or bad? Seventy percent returning visitors. Is this better than 70% new visitors? We don't want new visitors, right?


Rick Allen: Bounce rate. This could mean that someone came to your website, the content was not what they expected, and they immediately left, or it can mean they came to your website, found exactly what they needed, and left satisfied.


Numbers rarely speak for themselves.

So we can't start here. This is not where we should start for content measurement using web analytics. We need to start here. We need to ask, 'So what?' And use fancy Keynote transitions.


Rick Allen: Broadly, there are two approaches to web analytics.

One is the bottom-up approach. This is where you start with a broad set of metrics and you narrow your search to identify some insights. This is a process of insight through discovery. While this approach seems easier because you're jumping right into the data and it's right there, in reality it's much more difficult. It's hard to find answers when you don't know the questions to ask.


So I want to talk about the second approach to web analytics, and this is the top-down approach. This is where you start with a narrow set of goals, and then you broaden your search to identify a larger set of relevant web metrics to support those goals. This process allows you to answer your content questions.

My very first recommendation here is, to find answers with web analytics, start with meaningful content questions. And this is the reason why it's so important.

Avinash Kaushik is the author of "Web Analytics 2.0" and he is the web analytics evangelist for Google. Has an awesome blog, too. He says, "Often numbers don't speak as loudly as they should because they're missing one simple ingredient: context."


Without context, your data is meaningless. This is a big hurdle for analytics analysis. If your data isn't meaningful, people can't make decisions, and it's hard to care.

Now, I'm a 'content first' advocate for the Web. I advocate for a 'content first' approach. I guess it's not a surprise at this point. But admittedly, sometimes I get caught up in the campaign and I forget to highlight some of the important steps that actually do happen before content.

Kristina Halvorson had a tweet recently that kind of illustrates this point for me: "Before mobile, before first." Or more to the point, objectives. What is the purpose of your website? What do your users want from your website?

Through this process, goals provide context, and context provides meaning. And when we learn to make good use of analytics data, analytics can do more than report progress. It can inform process. It can inform content-planning for research and discovery, content audits, analysis.


Let's look at how analytics can support these content-planning efforts.

Unlike internal stakeholders, our web users, our audiences, are rarely at the ready to tell us exactly what they want from our websites. So we have to be proactive to gain these insights, and we rely on use cases, personas, and user interviews to be able to imagine our web visitors.

However, as the user experience consultancy User Intelligence describes here, these tactics have some inherent limitations.


Data usually comes from a small sample set. Most methods take a snapshot in time. It can be difficult to capture some behavior. And the setting is sometimes artificial, as in lab tests. Web analytics enables more comprehensive user research. We don't have to settle for those limitations.

Another element of content strategy is the content audit. A content audit is a critical early step in developing a content strategy. It helps us identify what content exists and whether it meets business needs. Simply put, a content audit is needed because if we don't know what content we have, how do we know what content we need, or whether that content is any good?


This is what a typical content audit spreadsheet looks like. During this process, we can do a ROT content audit, which is like an early content assessment, a superficial evaluation of content quality. If you're not familiar with the term 'ROT', it stands for 'redundant, outdated, trivial' content.

You can use this, or you can do a more comprehensive evaluation that also addresses whether content is appropriate, useful, and usable. During a content audit this is really helpful, because it surfaces the high-level content problems and identifies a lot of low-hanging fruit that you can tackle.

But when you're finished with an audit and you have a spreadsheet now with thousands of rows, how do you prioritize those content insights? It's typically not reasonable to expect that you can nail every single content problem.


Web analytics can help identify and evaluate the impact of content problems. And I'm going to actually get to some examples on how we can do that.

Content analysis. This is another element. It's often led through a qualitative evaluation: identifying whether content is useful and relevant, its clarity and accuracy, influence and engagement, completeness, voice and style, usability and findability. These are things that are really hard to evaluate quantitatively, so it's obviously appropriate to start this process from a qualitative perspective.

But it's not where the process should end. A lot of the questions that come up during this process require analytics to either answer or help answer, including, how do I know which part of our audience is getting our message? Are they sharing their ideas? What type of content is most effective at driving inquiries? What channel should we use to communicate with prospective students? Are people finding our content through search? When they find it, is it relevant to them? These are all questions that analytics can help answer.


So the recommendation here is: augment qualitative analysis with web analytics for deeper content insights.

What I want to do now is actually show you how you can do this and what process we can follow for using web analytics to answer our content problems. And to do this, I'm wondering if someone would be willing to help me out. Can I have a volunteer? You'll get a prize. No, the first person who volunteered before I said there was a prize gets the...



Rick Allen: If you wouldn't mind coming up.

I want to go through a process with you in how you can identify metrics, but I want to actually use some real-case examples here. I'm going to actually go through this with you.

So if you could just write down the answers that people shoot out, which they're going to shoot out in a second.

Audience 1: OK.

Rick Allen: So a web analytics framework looks like this. That funnel that I was talking about earlier where you're starting with a narrow set of goals and you're broadening it to identify a larger set of relevant web metrics, this is that process. And it starts with a deceptively simple question: what is the purpose of your website?

Can anyone tell me the purpose of their website? And there's more than one purpose, hopefully.


Audience 2: To recruit students.

Rick Allen: To recruit students. OK, if you wouldn't mind writing that down. OK.

Other objectives could be to improve customer service, right? Increase brand awareness. Improve communications and marketing. It could also be to improve operational efficiency, because your website is not just a marketing tool; it improves internal workflow.

One of the examples that I want to follow along in parallel with this one is, if you wouldn't mind writing the sentence, improve communication and feedback systems.

The second step is content goals. These are your content questions here.


What actions do you want people to take on your website? For recruit students, our objective there, can anyone offer some actions that you would want people to take on your website for recruiting students?

Speaker 1: Apply.

Rick Allen: Apply. OK. So you want them to click on the Apply button or complete the application. What are some other ideas?

Speaker 1: Request a visit.

Rick Allen: Request a visit. Any others?

Speaker 1: Request information. Like us on Facebook.

Rick Allen: Request information? Like us on Facebook. How does that directly relate to recruiting students? It can; it just keeps them connected. You can always have multiple objectives, and of course any website does.

But the purpose of the process is that you want the goals to be as tightly connected to the objective as possible, because what I'm trying to do is identify metrics that directly support these goals and objectives. OK. So we have Apply, Request Visit, Like Us, Request Info.

So for the other example that I put there, improve communication and feedback systems, maybe that means we want someone to read our blog or news blog, and maybe we want them to leave a comment. In order for content to be valuable, it needs to support some type of action. This is why analytics is so helpful. As Tim Nekritz says, "Great content moves people to action." Pretty good, Tim.



Rick Allen: The next step is KPIs, and this is a kind of jargony term for relevant web metrics. This is a collection of metrics; it could include page views and bounce rate, some of those seemingly meaningless ones mentioned earlier, but together they provide context and insight related to these goals and objectives.

So the question is: what relevant web metrics can be used to measure our content goals over time? One of our content goals here is Apply. Can somebody give me an example of how that works on their website? Is that clicking on a link, or downloading a PDF application? Submitting an online application?

Speaker 1: Either or?

Rick Allen: Either or? OK. So if they're downloading a PDF application, one of the metrics you'd want there, of course, would be to track that click. But what are some other metrics that relate to these goals? Request a Visit.

Can someone give me an example of how that works on their website and what actions we'd want to be evaluating?

Speaker 1: Submitting a form.

Rick Allen: Submitting a form? OK. Yup, so another KPI there would be to evaluate the confirmation page for a Request a Visit. Or if it was an email link, you know, measuring that click; that's a conversion.

Going back to my other example about the news site and comment rate, some typical engagement metrics, some common ones, are depth of visit or time on page. So this example here says average time on page is one minute 50 seconds, so that could be a KPI we want to evaluate as part of a collection of others.


And then, as part of evaluating comments, people leaving a comment on our website, we can look at a comment rate. This is a custom metric, so for this we're creating an analytics goal in Google Analytics or whatever program you use. Any time someone leaves a comment on your blog, it gets reported as a goal, as an action. So this is a custom metric, and then you can actually evaluate the conversion rate.
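To make the comment-rate idea concrete, here's a minimal sketch of the calculation, assuming you can pull raw comment and visit counts out of your analytics tool. The function name and the numbers are hypothetical, not from the talk:

```python
def comment_rate(comments: int, visits: int) -> float:
    """Comment rate: conversions (comments) divided by visits, as a percentage."""
    if visits == 0:
        # No traffic means no meaningful rate.
        return 0.0
    return round(comments / visits * 100, 2)

# e.g. 12 comments over 760 visits
print(comment_rate(12, 760))  # 1.58
```

In Google Analytics itself you'd get this for free as a goal conversion rate once the goal is configured; the point is just that the metric is nothing more than actions divided by visits.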

So in this example, this blog has a 1.58% comment rate. The next step is targets. This is important because, going back to the earlier example of 2,000 page views and whether that's good or bad, without a target we don't have that perspective. If the target is 1,000 page views, then 2,000 is looking pretty good. If your target is 3,000 page views, then 2,000 is not looking so great.


So, for these examples here, tracking the Apply clicks or Submit a Form, some of the targets could be evaluated based on previous numbers. I'll just offer you an example here.

So for time on page, for this example, we can look over a four-month period, from May to September 2011. In May the average time on page was a minute and 40 seconds, and in September it was a minute and 50 seconds. So if we're setting our targets based on a four-month lead, we could project our next target to be two-plus minutes. Targets could also cover a longer time period.
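The projection described above can be sketched as a simple linear trend, assuming monthly average time-on-page values exported from your analytics tool. The numbers are illustrative, filled in to match the 1:40-to-1:50 example:

```python
def project_target(observations: list[float], periods_ahead: int) -> float:
    """Extend the average per-period change linearly into the future."""
    per_period = (observations[-1] - observations[0]) / (len(observations) - 1)
    return observations[-1] + per_period * periods_ahead

# Average time on page in seconds, May through September:
# 1:40 (100s) rising to 1:50 (110s).
may_to_september = [100.0, 102.0, 105.0, 108.0, 110.0]

# Projecting four more months ahead gives 120s, i.e. the
# "two plus minutes" target mentioned in the talk.
print(project_target(may_to_september, 4))  # 120.0
```

A straight-line projection is crude, but for setting a target the point is only to ground the number in observed history rather than picking it out of the air.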


And if it was a comment rate, we could look month to month over a period of time and evaluate the low end of the comment rate and the high end. OK, here the low end is 0.67%, and the high end, here in January, is 1.58%.

So maybe we're setting a target of 2%. The last aspect is segmentation, and this is asking: what visitor attributes will provide meaningful insights? Segments are often considered an optional element of web analytics, but I highly encourage you to reconsider that perspective. Data in aggregate is not nearly as meaningful as data that's segmented.

And just considering the examples that we have here: if we're evaluating the effectiveness of our online application, whether it's a call to action or a page or a form process, and we're looking at it in aggregate, those numbers and the conversion rates are influenced by our internal traffic too.

And on most college websites, internal traffic can be upwards of 50% of the overall traffic. So building a segment for internal versus external traffic, and looking at the audience that you're trying to measure these goals against, is really, really important.
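As a hypothetical illustration of why that split matters, here's a sketch of the same conversion rate computed in aggregate and then restricted to one segment. In practice you'd build these segments in Google Analytics, for example by filtering on your campus IP range; all the data below is made up:

```python
# Hypothetical visit records tagged by segment.
visits = [
    {"segment": "internal", "applied": False},
    {"segment": "external", "applied": True},
    {"segment": "external", "applied": False},
    {"segment": "external", "applied": True},
]

def conversion_rate(records, segment=None):
    """Conversion rate as a percentage, optionally restricted to one segment."""
    pool = [r for r in records if segment is None or r["segment"] == segment]
    if not pool:
        return 0.0
    return round(sum(r["applied"] for r in pool) / len(pool) * 100, 1)

print(conversion_rate(visits))              # 50.0 (aggregate, diluted by staff traffic)
print(conversion_rate(visits, "external"))  # 66.7 (the audience you're measuring)
```

The aggregate number understates how well the content converts the audience it was actually written for, which is exactly the distortion the talk warns about.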


You know, another common segmentation element is new versus returning visitors. And these numbers each show very different insights. In this example here, there are 2,000 returning visitors and 7,000 new visitors.

Again, if you're trying to measure these particular goals, you want to build a segment based on your target audience. And some of these goals might be for internal audiences or returning visitors. You can also segment data based on referrals, so you can segment by social media sites or particular referring sites, affiliate websites, things like that.

So it's not just limited to the standard new versus returning, things like that. OK. So you're all set. Thank you very much, Julie. Can we give Julie a hand, please? And Julie, don't forget your prize. Thank you very much.



Rick Allen: OK. So you have a framework in place, you have plans, and you're able to identify some meaningful metrics. Once you do, the floodgates open and you're able to take otherwise generic numbers and make really good use of them. And this can be applied to all of your content work, not just the top-level content, but down to specific calls to action or landing pages or a particular video. It can all be measured.

And I'll just skip that one, and then we'll come back. Using analytics, I recommend that you ask "So what?" at every stage of the content planning process; identify and evaluate the impact of content problems; augment qualitative analysis for deeper content insights; and then make the case for content strategy by demonstrating content success and failure. Analytics is great for all of these.

To actually put this to work, some of the recommendations I have are: work with the content owners and subject experts to identify these objectives and content goals. Typically, the person creating and posting the content is not necessarily a content expert, so we need to go to the subject experts to identify those objectives and goals.

Then I recommend that you, or an analyst on staff, do the work of pairing those goals with meaningful metrics. There are a lot of resources out there to help you identify those metrics, plus the whole higher ed community. So yeah, take ownership of that aspect of it.

And then put a process in place for aligning new content and related goals to these established objectives. Because you'll always be creating new content, but it needs to somehow be aligned and fit into this framework. So it can either be aligned to existing goals, or you can add on to them.

Goals are going to change over time and your KPIs need to as well.  Objectives should not be changing too frequently, though.


And then, routinely demonstrate success. When you've talked to content owners and identified objectives and goals that support their part of the website, show how that content is working for them and get their buy-in.

This is also a great method of getting people invested in their content: when they can actually see that it's working and how it's working. And then finally, just remember to ask "So what?" You know when to stop, when you have enough web metrics, when you keep asking "So what?" and you don't have a good answer anymore.

So, you know, 2,000 page views. So what? From the answer to that question, you identify what other metrics are needed to help answer it, and then you just keep asking "So what?" again. Seventy percent bounce rate. So what? And just keep going.


So, I talk a lot on Meet Content about content measurement and a lot of other related topics. Many aspects of this presentation that I've touched on, including content audits and analysis, are reflected on Meet Content, so check it out if you're interested in learning more.

I'm also more than happy to talk to you about how this could work at your institution, kind of, you know, getting a plan in place. These things are always very particular to each organization. Content and analytics are stuff that I really love, and I'd be happy to chat with you more.

So, I guess we have some time for questions and stuff if you have any.

Speaker 1: Your comment that content should result in action. What about informational content?


Rick Allen: Sorry, yes. Informational content is tied to some goal. An action is not necessarily a click. An action could be reading a blog page. That's an action. You're just putting yourself in that mindset, thinking of things in terms of action so that you can actually quantify them.

If you’re not able to pair it to some type of action quantitatively, how do you demonstrate the value of that?  If it is information, then someone needs to be reading it, right? 

So that's an important thing you want to be able to identify: is it getting viewed? Something like time on page might be good to measure news articles. But do you have any other KPIs for measuring news? Let's walk through an example. You have a news...?

Speaker 1: A news story?

Rick Allen: A news story? Like, on your college website, you have...?

Speaker 1: Let's just say it's a story about a community program. So it's a featured story.

Rick Allen: It's a featured story. And so, why is that featured story there? Are you trying to inform?

Speaker 1: Inform all audiences.

Rick Allen: All audiences? OK. So can we narrow that to identify some of these audiences.



Speaker 1: Sure. Prospective students first, and the community, because that's the...

Rick Allen: OK. So prospective students. So why is that there for prospective students?

Speaker 1: So that they know, you know, that our school does work in different regions.

Rick Allen: I mean, all of these questions are leading with "So what?"


So what? Because you have to keep thinking deeper. So what?

Speaker 1: To show them our brands.

Rick Allen: To show your brand, and why is your brand important?

Speaker 1: Because it attracts a certain kind of student.

Rick Allen: So you're wanting to attract these particular students, you're wanting them to do what?


Rick Allen: Apply?


So, ultimately, you have to think of what action you want them to take. I mean, even just a simple, you know, blog on your website, it's there for a reason, right? So just think of what actions you want people to take. I don't mean to discount actions that can't be measured with analytics. I'm not saying analytics is the answer, because lots of actions don't happen on your website, right?

Some actions happen offline. With brand awareness, a lot of the discussions happen, you know, at the water cooler. They don't happen online, on Twitter. And so you have to look at other ways to build and measure that. But this framework is not limited to analytics, right? When you get down to the point of trying to figure out which metrics are appropriate, you'll discover that analytics may not be able to answer the question for you, and then you have to look at other options. I'm not discounting qualitative analysis. It's really important. Aaron, you had a...



Speaker 1: Yeah, I was just going to ask about the time on page metric. You know, it was a minute 40, a minute 50, but how long should it be? You know, I mean, I want people to read those pages...

Rick Allen: It's all relative, right? I can't tell you what the ideal time on page is. I mean, it's going to be different based on the content type, based on your audience, based on the purpose of that content, based on how many people are visiting your website, and that's why it's so important to provide context for all these numbers. And a long time on page is not necessarily a good thing, right? It could mean they're confused.


Right, and they can't find the information easily. And that's why you have to pair it with other relevant metrics that should provide that meaningful context.

Speaker 1: Yeah. So, there's always the sexy metric of the year. I mean, originally with analytics it was, like, how many hits, and then it was how long people spent on the site, and then all these different things that eventually all get knocked down. Well, because, like, 5,000 people came. So what?



Rick Allen: Right.

Speaker 1: They hit the wrong page. They spent two minutes on the site. Now they're spending two minutes forty because they're confused, or because they're engaged.

Rick Allen: Right. Absolutely.

Speaker 1: All this stuff can be read either way on the site. My new favorite metric of the year, you know, like, "Oh! That can be anything."

Rick Allen: Right.

Speaker 1: So, all this stuff sort of disappears down a black hole.

Rick Allen: Well, no.


That's not what I'm trying to convey.



Rick Allen:No.

Speaker 1: That's what you excel on, Dave.


Rick Allen: It's all for naught? No.

Speaker 1: It's all for naught.


Rick Allen: No, it's not all for naught. No. That's why you have to keep asking, "So what?" You have to keep asking it over and over and over again, because you can't just settle for one metric, and two metrics may not be enough. When we're going through this process here, even if we have maybe just one objective and one goal, we may have six metrics that we're looking at.



And it's not going to be foolproof. It's not going to be bulletproof. But it's going to provide some meaningful insights that will influence our decisions. Yes?

Speaker 1: Is there any way to attribute metrics to specific touchpoints? I mean, you know, we have a really old-school admissions team at our school, and they do lots of campus visits, high school visits, and fairs. How do we know people are going to the page because the content was so great, versus because our admissions team was at a fair? You know, they're great at things like that.

Rick Allen: Right. Well, if you have content goals and you're executing on them from multiple places, whether your website or even real-life information sessions, things like that, you have to put something in place to measure those things. On your website, that's easier to track.



But for offline stuff, you can use campaign tracking; you can set up unique URLs to track, you know, print links and things like that. So those are examples. And if you have different types of content on your website serving a similar purpose and you're trying to figure out which one is most effective, even on the same page, you can use event tracking. That's really, really valuable.
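Campaign tracking of the kind described above usually comes down to tagging URLs with campaign parameters. Here's a sketch, assuming Google Analytics-style utm parameters; the domain and the parameter values are made up for illustration:

```python
from urllib.parse import urlencode

def campaign_url(base_url: str, source: str, medium: str, campaign: str) -> str:
    """Append Google Analytics-style campaign parameters to a URL."""
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return f"{base_url}?{params}"

# A print piece would carry a short redirect that lands on this
# tagged URL, so offline-driven visits show up under the campaign.
print(campaign_url("https://www.example.edu/apply",
                   "viewbook", "print", "fall-open-house"))
```

Analytics then attributes any visit arriving through that URL to the named source, medium, and campaign, which is how an offline brochure or fair handout becomes measurable online.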

Speaker 1: Just to go back to time on page for a second. Say I'm reading a page, and I get a phone call and leave that page up for hours or something. I mean, how do you conceptualize that? How do you figure out what people were actually doing and what those quantitative measures mean?

Rick Allen: Well, first of all, Google Analytics will stop tracking after 30 minutes of inactivity.



Speaker 1: Well, yeah.


Rick Allen: But...

Speaker 1: The point is, you know, how do you assess the experience that is behind these quantitative measures? You know what I'm saying?

Rick Allen: I know. I don't.

Speaker 1: To go back to the point earlier, I mean, to me it's a good thing that people are coming here and getting out quick. It depends on the people...

Rick Allen: Right.


Rick Allen: How do you figure that out? Analytics can't answer that. It can't answer why.

Speaker 1: If analytics can't do it, is there a way that you can figure that out?

Rick Allen: Yep. User research, surveys, creating personas. I mean, there are lots of qualitative methods. They need to complement this stuff.



Speaker 1: I mean, seriously, would it take a super expensive test to get at all of these things?

Rick Allen: Yeah. I mean, analytics is going to uncover a lot of content problems, and it won't be able to answer them all. Then you're going to go back to some qualitative, you know, evaluation of content. But likewise, qualitative analysis uncovers a lot of questions that it can't answer. They complement each other.

One of the things I'm trying to convey is that analytics is severely underutilized; people aren't using it well, and there are a lot of opportunities for it to be used. Not just the traditional, you know, content measurement aspects and dashboard reporting and stuff like that, but actually applying it to all your different content work. It has a lot of value. Yes?


Speaker 1: If no one really owns these frameworks now, who's the best person for it?


Rick Allen: I don't know. Ideally, you know, you'd have an analytics analyst on hand, right? But if not, I think the content creator or editor is a great person for this, because they're the ones who have the answers to those content questions, and they're the ones who can pair those metrics with those goals. They're doing that already for content creation. Any other questions? OK. I guess that's it. Thank you very much.