7 Ways to make web analytics work better in companies
1. Over time, will departmentalised/siloed analytics areas become part of a larger research and analytics function reporting directly to the Financial Director or Chief Executive Officer? Presenting completed analysis or recommendations to executives can be far less effective on an ongoing basis than continuous, informal questioning and answering between managers and analysts. I think you need both: ongoing siloed analytics which really drills down into specific issues, and centralised strategic analytics to align web analytics recommendations and objectives with overall company objectives at a strategic level (so there is one number for how many customers you actually have, rather than a politicised struggle between, for example, the marketing and customer services departments).
2. Analytics is hard. Analytics takes resources. It takes effort for a company to create and assimilate learnings from analytics. Focus your analytics at the key leverage points of the business – for example, in the case of a lead generation site such as Rightmove, your lead conversion rate. Focus analytics where it will have the most impact to potentially help and change the business.
3. Getting to a culture of fact/data-driven decision making requires your business to have real, solid wins using analytics that make people care, from the top to the bottom of the company. Once it has been shown that, e.g., the company’s conversion rate has increased due to multi-variate testing and changes to the form process, or that PPC is generating a higher conversion rate, or whatever the wins might be, a process of “yes, analytics is important to me – it will help me – it could even help me get the bonus that I want” begins with people all over the company. I love multi-variate testing as the ultimate measurable way to prove value to senior management.
4. The analysts that you hire are extremely intelligent, humble, versatile and political creatures who are in constant communication and debate with key decision makers and are at home with numbers and all the advantages and pitfalls of various analytics solutions, but most importantly are able to move beyond creating key performance indicators to aligning the strategic objectives of the website with the company’s business.
5. Well thought-out metrics that everyone in the business understands. The challenge is creating a shared understanding of the right metrics and what they mean.
6. Don’t rely on one proven and tested way to get insight. Try usability testing, try feedback forms and surveys, try sophisticated metrics, try scoring systems, try multi-variate testing, try digging into the data to understand root causes and opportunities.
7. The most important thing with analytics is to get started. It is a journey not a competition. And each company’s journey will be different. Good analytics is an evolution of thinking and deciding.
Here are 8 conversion rate tactics to help increase conversion rates on your website.
1. People Click On What They Want
People navigate the web by “scent”, Bryan Eisenberg, conversion guru and persuasion architect, tells us. The term was coined at Xerox PARC to describe the parallels between a human’s information-gathering techniques on the Web and an animal’s food-gathering techniques in the wild. People seek information through the “scent” given off by their trigger words.
According to research performed by usability guru Jared Spool, when visitors found their trigger words on a landing page, they were successful at completing their task 72% of the time; if the trigger word wasn’t on the page, they were only successful 6% of the time. The scent of the keywords kept them on the right path; lacking that scent, they stopped searching that particular “trail”. One tip to make sure you have your visitors’ trigger words covered is to make sure each major button or link:
completes this sentence: “I want to _____”.
includes trigger words / strong scent
2. Start Using Persuasive Call To Action Words
Impotent call to action hyperlinks like “read more” and “submit” sometimes make me feel embarrassed for website owners. They should know better.
Persuasive call to action hyperlinks should include an imperative verb and a benefit. For example, which hyperlink is more persuasive: A or B?
A. George found an investment secret that changed his life. Read More
B. George found an investment secret that changed his life. See how George doubled his income in one year.
You can see from this comparison why the second example is more likely to induce action.
3. Better Product Images Are Worth A Thousand Calls to Action
Having better-looking product images than other sellers will do wonders. If research is any indication, product images are a major factor in converting visitors. In fact, 83% of eBay shoppers skip listings without images, sites with galleries get 15% more activity, and those with so-called super-size photos show a 24% spike in sales. The better photo wins every time. Many people skimp on the quality of their product images and use manufacturer-supplied images, which is a mistake.
4. B2B Products or Services Need Merchandising, Too
The same holds true if you are in B2B: Better product images are worth a thousand calls to action. Many B2B sites offer downloads of whitepapers or demos in exchange for completing a form, but fail to make the most basic of efforts to persuade visitors to do so. Don’t just tell them about your whitepaper… merchandise it. Show a cover, show them how easy it is to read with all your pretty charts. Test to see which pieces matter the most.
5. Headlines Must be Made to Stick
Most headlines (and copy, for that matter) suffer from what Chip and Dan Heath refer to in their book Made to Stick as the curse of knowledge: once you know something, it’s difficult to imagine what it is like to not know it. The headline on your page is the one thing that about 80% of your visitors will read. But while headlines are often crafted for their persuasive abilities, they often assume too much prior knowledge on the part of the reader. Make sure that everybody understands what your headline is about, even if they have no prior frame of reference. Then invest as much time as possible testing your headline’s ability to both (1) gather attention and (2) entice visitors to invest the next 30 seconds on your page by explaining what’s in it for them – in language they can understand!
6. Always be Testing
Doing A/B or multivariate testing used to require some in-house programming expertise or expensive third-party software. Thankfully, Google has provided us with a free alternative in the form of Google Website Optimizer. While it may not offer every feature some of the other solutions provide, it is quite an elegant solution and getting better all the time. I actually prefer that people don’t spend their money on a tool, but focus those resources on better copy and imagery instead. There are no more excuses for not testing regularly. Remember what Claude Hopkins wisely said in 1923: “Almost any question can be answered cheaply, quickly and finally, by a test campaign. And that’s the way to answer them – not by arguments around a table. Go to the court of last resort – buyers of your products.”
7. Should we be testing hundreds of thousands of variations?
This question illustrates the market’s misunderstanding of testing. For the vast majority of businesses, this is more like random testing. You can test thousands of combinations in a multivariate test, but being able to doesn’t mean you should. Let’s focus on this example. I’ve kept the numbers simple for clarity’s sake, but let’s assume:
Example I (not recommended):
1,000 = Test combinations (the number of page sections and variations in the test)
10,000 = Page views per day
100% = Visitors in experiment (we’ll run the experiment with all our traffic)
2.4% = Current conversion rate (average conversion rate)
20% = Expected improvement
The duration for this test: 34.9 days. (More than a month!)
Example II (recommended):
20 = Test combinations (focused on key drivers)
10,000 = Page Views per day
100% = Visitors in experiment
2.4% = Current conversion rate
20% = Expected improvement (focus on key drivers in the hierarchy of optimization rather than just random elements, and your expectations should be higher)
The duration for this test: 0.698 days. (Under a day!)
Under the guise of being “scientific”, the companies that originally offered these tools charged on a monthly basis. While they had plenty of experience in managing their software, they had little experience in identifying valuable tests. Plus, they had zero incentive to get quick results while customers paid a monthly fee.
Multivariate testing for the sake of conversion rate optimization should be scientific. However, testing is about improving your business results, not scientific experimentation. Unless you’re running a lab, you’re testing for profit. (No offense, non-profits… yes, you should be testing too.) Testing only what matters is how to recover opportunity cost. Time is money. Don’t waste it by testing which variables matter; rather, invest your time in improving those variables and your understanding of them. Fix the things that hurt your conversions as fast as possible, and make more money today.
8. Read the Reviews on Conversion
Reviews have been all the buzz the past couple of years. If you recently purchased something online, has a review influenced your purchase decision?
New research further illustrates the value of reviews:
77% of online shoppers use reviews and ratings when purchasing (Jupiter Research, August 2006)
63% of consumers indicate they are more likely to purchase from a site if it has product ratings and reviews. (CompUSA & iPerceptions study)
86.9% of respondents said they would trust a friend’s recommendation over a review by a critic, while 83.8% said they would trust user reviews over a critic. (MarketingSherpa)
Most people don’t seem to focus on all the factors involved in implementing reviews to enhance conversion. It’s important that you test and optimize for conversion and persuasion by focusing on the following areas:
Placement for Visibility:
- Above the fold
- Stars or other graphics
- Near the point of attention or action
- Ease of reading
- Use across the site

Single Dimension versus Multi-Dimension Reviews:
- What are the key attributes across different categories?
- Can review content influence the purchase decision?
- Negative and positive reviews
- Review approval policy

What Does a Review Mean:
- Number of reviews
- What questions are you asking?
- Qualitative versus quantitative
Reviews are just one example of the market trend demanding more authenticity and transparency, and they are key factors in getting your visitors to take action. Any time you have a choice between opening up more or less, always opt for giving your customers more.
What do you think? Do you have any ideas on how to make your website perform better?
Click here to see and listen to my presentation on how Rightmove are using Tealeaf, in this particular case, to understand their customer experience better and replay exactly what their visitors did, seeing it through their eyes (I’ve saved the presentation using Jing Project, which is absolutely fab).
The presentation is just under 5 minutes long and includes all my slides, my voice (audio) and a video of where a visitor’s journey went wrong, using Tealeaf’s session replay. It opens in a new window; press play and you can listen to all 5 minutes (if you have the time, that is). http://screencast.com/t/SadLainUI3
This Thursday 8th Nov I’ll be presenting at the London Stock Exchange, and next Wednesday 14th Nov at Web Analytics Wednesday in London’s Covent Garden. As we move beyond pure web analytics to trying to get inside our customers’ heads to understand their experience on our website, how can customer experience management tools help us?
There are tools available now, such as Tealeaf and SpeedTrap, that allow us to replay exactly what our customers did on our site – a bit like a video player of exactly what they did on your website. Very cool. I also had the pleasure of spending a couple of hours with Robert Wenig, CTO and founder of Tealeaf, yesterday.
What customers actually need compared to what they actually get
Why do we need to understand our customers’ experience, and how can tools help us?
1. The unique session replay functionality would allow the company to hone in on live technical and customer orientated issues to achieve a fix with a quick turnaround on their website.
2. Help customer services teams work with customers where they were having a problem on the website doing what they wanted to do – so the customer services team can replay the customer’s session and tell them where they missed a step or alternatively where the website didn’t deliver.
3. Identify poor customer experience in the customer journey on the website, e.g. look up sessions where a page or image had a problem loading, 404 errors, or where a customer couldn’t find an address or product.
4. Identify fraud or unexpected activity on financial services websites and look at the fraudster’s activity.
5. Weekly meetings to go through problem sessions, come up with ideas and identify solutions.
6. Compare click activity on a page to mouse movements on that page to identify, for example, elements that encourage a high number of mouse movements but are non-clickable (and hence should be clickable).
7. Real time data.
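Point 6 in the list above can be sketched as a simple report: flag non-clickable elements that attract heavy mouse activity. The element names and counts below are made up for illustration; a real version would pull them from your heat-mapping or session replay tool.

```python
# Hypothetical per-element interaction counts from a heat-mapping tool.
interactions = {
    "hero_banner":   {"mouse_overs": 950, "clicks": 0,   "clickable": False},
    "search_button": {"mouse_overs": 800, "clicks": 640, "clickable": True},
    "price_table":   {"mouse_overs": 500, "clicks": 2,   "clickable": False},
    "footer_logo":   {"mouse_overs": 40,  "clicks": 0,   "clickable": False},
}

def should_be_clickable(interactions, min_mouse_overs=100):
    """Flag non-clickable elements attracting heavy mouse activity -
    candidates to be turned into links."""
    return [name for name, stats in interactions.items()
            if not stats["clickable"] and stats["mouse_overs"] >= min_mouse_overs]

print(should_be_clickable(interactions))  # ['hero_banner', 'price_table']
```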
Are there others that I haven’t thought of? Are you getting inside your customers’ heads with pure web analytics tools, so don’t use these customer experience management tools?
But there are a lot of challenges associated with these tools:
1. The cost – they are extremely expensive.
2. Actually getting people internally within the company to use them, for example creating a culture where tech support and/or customer services actively use these tools in their daily role.
3. Lots of training, learning and time to master the product and configure the required events from scratch.
4. Set-up and configuration of the product across all hosting sites and servers – Tealeaf captures HTTP requests, for example.
5. Getting to grips with the “customer experience method of thinking” in terms of the perceived traditional web analytics definitions of items such as visits and page impressions versus “sessions”.
Have you found challenges delivering value using customer experience management tools?
This Web Analytics Wednesday is a discussion on how these tools could or should deliver value to drive commercial benefit and, of course, better customer experience.
How do you get inside the head of your customers?
I will also be drawing on my experience at Rightmove (26 million visits a month and tenth most popular website in the UK) where we have started using tealeaf to help.
Any thoughts or ideas – do you use customer experience tools, do you wish you did, or do you think that we don’t need them and can get inside our customers’ heads with traditional voice-of-customer surveys, polls, web analytics solutions, engagement metrics, etc.?
Thanks so much for reading, and if you are coming to either of the upcoming events, see you soon.
This is where customer experience management tools such as Tealeaf and SpeedTrap, which are not cheap, have created a niche. They include browser replay of individual web sessions, covering the specific pages a customer viewed and how he or she interacted with them, with or without mouse movements. Companies on a significantly smaller budget can use other tools to a certain extent, such as Techsmith’s Morae (a usability lab on CD-ROM) and Crazy Egg (free or nearly free heat-mapping of clicks and mouse movement over time).
Why we wish to understand the experience of our users/customers in greater detail:
The unique session replay functionality would allow the company to hone in on live technical and customer orientated issues to achieve a fix with a quick turnaround on their website.
Help customer services teams work with customers where they were having a problem on the website doing what they wanted to do – so the customer services team can replay the customer’s session and tell them where they missed a step or alternatively where the website didn’t deliver.
Identify problems in the customer journey on the website, e.g. look up sessions where a page or image had a problem loading, 404 errors, etc.
Identify fraud or unexpected activity on financial services websites
Compare click activity on a page to mouse movements on that page to identify, for example, elements that encourage a high number of mouse movements but are non-clickable (and hence should be clickable).
B2B customer issues: Identify issues that our B2B customers are having on the site. Allows customer services to replay customer sessions.
B2C customer issues: Identify issues that B2C customers – visitors to the site – are having on the site. Allows customer services to replay customer sessions.
Usability testing: B2B and B2C usability testing, where our test audience’s journeys on the website can be replayed to identify issues, be that on a form process or a product. Tools developed specifically for usability testing, such as Morae, provide metrics including time on task, error rate, satisfaction, mouse movement over time, mouse clicks, web page changes and survey results.
However, these customer experience tools will allow almost video recording replay of visitor sessions which can be used as part of a website usability process.
If mouse click and mouse movement activity is all that is wanted initially, Crazy Egg offers it free or very reasonably – but that is all one gets. On a page-by-page basis (each page needs to be tracked individually), it shows a heat map of that one page where one can break down mouse clicks and movements. However, there is no ability to replay sessions.
The ultimate aim is to drive improvements in the customer experience of the website you provide; small improvements to conversion rates can lead to significant increases in leads.
Successful capture of end-user sessions and session data.
Visual replay of end user sessions.
Successful search for sessions based on free-text and event constraints.
Click versus mouse movement activity.
Challenges associated with these tools:
Getting to grips with the “customer experience method of thinking” in terms of the perceived traditional web analytics definitions of items such as visits and page impressions versus “sessions”.
Learning how to use the product and to configure the events required from scratch.
Set up and configuration of the product across all hosting sites and servers.
The ability to configure the customer experience tool to your exact requirements, rather than accepting a standard “off the shelf” or black-box approach, can be a blessing or a curse depending on the processes in-house at your organisation. Because they are completely flexible, they rely on good internal processes to set them up and maximise their return on investment.
Customer experience management tools with their ability to replay sessions are a window to better customer experience understanding. I hasten to add, I say a window – as it is up to the web analytics manager / user experience manager to drive value and understanding from these tools, to drive a better understanding of the customer experience.
Emetrics has come to a close after a few hectic days in DC, interspersed with seeing the Dalai Lama in Georgetown, the solar-powered homes exhibition on the Mall, the White House, hours chatting in the Omni Shoreham lobby bar and swimming in the invigorating heated outdoor pool at sunrise.
That aside, what has been going on? Or, as my American counterparts would say, what are some of the key takeaways in terms of consumer understanding and behaviour? I’ll do another post about Google Analytics and Microsoft’s Gatineau this weekend.
Jim Novo, of Drilling Down fame, spoke about speaking the “exec level” language that CEOs/CFOs understand. If we think about our sales pipeline, it is the predictive/future likelihood of something happening that execs are interested in when it comes to understanding our online data, sales and consumer behaviour, so you can focus your efforts, marketing spend and optimisation where they will have the most impact. Which are your dreck customers, your former best customers, your new customers and your best customers? Map them out on a two-dimensional value map with an XY slope.
Use recency, frequency and latency (you can even begin looking at these with Google Analytics) to understand your best and worst customers and grow your best customers. And importantly, build your predictive customer performance pipeline with your CEO/CFO so that they understand it and help you build it – which helps significantly with buy-in. Buy-in, let’s face it, can be the biggest obstacle to taking action in any company.
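A recency/frequency view of the kind Jim Novo describes can be started with very little. A minimal sketch – the customer names and dates below are hypothetical, and a real version would pull them from your order database or analytics tool:

```python
from datetime import date

# Hypothetical order history: customer -> purchase dates.
orders = {
    "alice": [date(2007, 9, 1), date(2007, 10, 2), date(2007, 10, 9)],
    "bob":   [date(2007, 2, 14)],
    "carol": [date(2007, 8, 20), date(2007, 10, 8)],
}
today = date(2007, 10, 15)

def rf_score(purchases, today):
    """Recency (days since last purchase, lower is better) and
    frequency (number of purchases, higher is better)."""
    recency = (today - max(purchases)).days
    return recency, len(purchases)

# Place each customer on the two-dimensional value map:
# low recency + high frequency = best customers,
# high recency + low frequency = lapsed/"dreck" customers.
for name, purchases in orders.items():
    print(name, rf_score(purchases, today))
```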
Joseph Carrabis, the Web Analytics Association’s new anthropologist and neuro-behaviourist on the scene, spoke about really taking advantage of our hard wiring to make our audience do and think what we want them to think or do. As human beings we all apply our own stereotypical and prejudiced frame of reference to everything with which we come into contact. For example, in the context of images on a web page, which image, at what position and angle, will trigger which emotions or thoughts at a subconscious level? If an image is positioned at an angle, it implies motion. A photo of a couple, an elderly man, a teenager or a woman in her early thirties will each provoke different inferences from one’s audience. To illustrate this, Carrabis engaged 50 of us in a persona exercise where, after he described six photos, we had to tell him which one we thought was the economics professor in Beijing. Interestingly, most of us thought the middle-aged, conservative-looking white man was that character – and we were right. The key thing is the inferences that we draw.
In terms of multi-variate testing, The Weather Channel used a variety of different images – a couple, then a man, and also a woman – to see which image worked more successfully in optimising the page for its audience and hence had the highest conversion rate. Interestingly, the web page version with the image of a woman on her own had a much higher conversion rate than the other versions tested. This links back to Carrabis’s point about the power of associations, inferences and our psyche’s hard wiring in what we think about images, positioning and sound on a web page.
Neil Mason, a fellow Londoner, talked about segmenting consumers into tribes (richly developed personas, in other words), using data mining to provide statistically robust anomalies, patterns and associations that stand out from a commercial perspective, and using these to identify key drivers for purchase and the most valuable consumer segment. For example, in a case study on the Royal Mail, segments included price finders (10%), cottage industrialists (2%) and regular posters (1%) – the most valuable segment. They also identified that visitors who “saved a quote” on their first visit were significantly more likely to become buyers, continue to buy from the website, and be the website’s most commercially valuable segment (worth the most money). They used these consumer segments to drive email marketing segmentation and discovered that emails sent 4 to 5 days after the last visit were most likely to convert. Less than 4 days was too soon (the visitor was still thinking about it), and after more than 5 days the conversion rate began to drop. It’s all about the timing – oh, it’s recency again.
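The Royal Mail email-timing finding is the sort of result a simple bucket analysis surfaces. A sketch with invented (days-since-last-visit, converted) pairs, standing in for real email campaign data:

```python
# Hypothetical (days_since_last_visit, converted) pairs, one per email sent.
emails = [(2, False), (3, False), (3, True), (4, True), (4, True),
          (5, True), (5, False), (6, False), (7, False), (8, False)]

def conversion_by_recency(emails):
    """Conversion rate bucketed by days since last visit."""
    buckets = {}
    for days, converted in emails:
        total, conv = buckets.get(days, (0, 0))
        buckets[days] = (total + 1, conv + converted)
    return {d: conv / total for d, (total, conv) in sorted(buckets.items())}

print(conversion_by_recency(emails))
# If the Royal Mail pattern holds, the peak sits in the 4-5 day window.
```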
Thanks for an interesting Emetrics, everyone – I look forward to seeing those I met again soon.
I am a huge fan of management and marketing theory – not for its own sake, of course, but for applying it and finding ways to make the continuous job of improving overall marketing performance (web analytics, of course) a little easier. I developed my Activity Based Scorecard (ABS) after working with the balanced scorecard, a piece of traditional management theory. In my “web analytics scorecard” (see image below) I define KPIs, identify relationships, and benchmark these relationships (with trends) against themselves, asking the question: how effective is my website and marketing performance? This is my correlation between the effectiveness of the website under review and the web analyst’s approach to continuous improvement of company performance.
My ABS scorecard, large image above, measures the relationship(s) between:
1. Usability, multi-variate A/B testing, market research.
2. Management information and web analyst reports.
3. Web data and statistics.
4. Actionable insights and decisions.
These can all be classified as flows of information and should have separate agendas and metrics. Each component should be weighted according to its importance to the overall assessment metric – in other words, providing weighted scores for different components gives a more accurate picture of their value to the overall picture from the company’s perspective, aligned to its commercial objectives.
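The weighting described above reduces to simple arithmetic. A minimal sketch – the component scores and weights below are invented for illustration; in practice they come from the company’s commercial objectives:

```python
# Hypothetical component scores (0-10) and importance weights.
components = {
    "usability_and_testing": (7.0, 0.20),
    "management_reports":    (5.5, 0.20),
    "web_data_quality":      (8.0, 0.25),
    "actionable_insights":   (6.0, 0.35),
}

def weighted_score(components):
    """Combine per-component scores into one overall scorecard metric."""
    total_weight = sum(w for _, w in components.values())
    return sum(score * w for score, w in components.values()) / total_weight

print(round(weighted_score(components), 2))  # 6.6
```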
At the outset one should understand the objective, content and overall aim of the website – in other words, as Avinash often tells us, if everything went kaput, what is the most important thing for your business? My Activity Based Scorecard, analysis and decisions then flow from this. My analytics scorecard is a very similar assessment to a company’s cash flow forecast (finance people will love this): it takes basic inflows of information (whether product or service based) and links these inflows to outflows. A relationship forms. A cross-relationship also forms between the statistics, the reports generated and, not to forget, the decisions that flow from these relationships.
Separate metrics or cross-relationships develop and can be assessed alongside the overall assessment. Usability can in some cases prevail over, say, the numbers, leading straight through to decision flows that influence both the statistics and the reports generated. Through timely assessment of my ABS report, trends (and metrics) can be established for each component and for the overall assessment.
Actions can provide corrective or test feedback straight through to the inputs or changes of the core components. But, for me, the most important factor is that the activities of the analyst have direct influence on the drivers of the website. This ultimately impacts the assessment.
Strategic objectives can be best described as the common ground between what we focus on, what we do best and ultimately what our passion is. We are however constrained by our financial drivers. Our marketing plan directly links with our strategic objectives. So when senior management ask: how is our website performing? - we have a context and a weighted activity based scorecard within which we can measure our true performance and ongoing actions and decisions.
The reality is that for any of this to work at all requires a lot of hard work, perseverance and, at the outset, defining the key business objective(s) for the company. There are some tools on the market to help develop one’s scorecard; for example, with Balanced Scorecard Designer you can create a set of KPIs and group, categorise and weight them.
Do let me know if you have any questions or if you agree or completely disagree. Thanks so much for reading.
Consumer Generated Media (“CGM”) is the term that encompasses all social media content on the Internet authored by consumers. This content ranges from blogs, to social networks, consumer review sites, message boards, and videos.
Social networking and connecting with customers is all the buzz. For example, yesterday Forrester Research did a webcast, “Know your Customers’ Social Technographics and Craft the Right Social Marketing Strategy”, with Charlene Li from Forrester. She shared her insights on understanding one’s target audience’s attitudes and behaviours towards social technologies in order to craft the right social marketing strategy. These are great calls for marketers to learn more about getting their arms around social media, listening to the voice of the customer and engaging with consumers in social media.
Jeremiah, a fellow Web Analytics Association social media committee member, is a social media strategist and outlines how to approach positioning one’s company in the wider social media ecosystem. From a web analytics perspective, how does one even begin to gauge the influence of these conversations on one’s brand?
Social Technographics report by Forrester Research.
Some other stats, from Pew Internet and Jupiter research:
One blog is created every minute
27% (32M) read blogs
22% (27M) post reviews/ comments
44% (53M) are content creators (running their own blogs/sites, posting messages).
There are more than 1.5 billion comments per day; the collective voice of the consumer to influence brands and buying strategy has never been stronger, and will continue to be strong.
Applications such as Visible Technologies’ TruCast have recently appeared on the market that enable companies to monitor social media conversations, gain valuable insights and even engage with consumers, in order to better manage their brands on social media sites. For companies, these online conversations represent a new opportunity and challenge for brand monitoring, reputation management, word-of-mouth marketing and consumer engagement.
This is pretty powerful stuff: the ability to segment one’s potential customers by feeling, tone and message from the enormous pool of social media sites.
I wonder how scalable this tool – or any such tool – is; with the growth of the blogosphere appearing to be exponential, how much data will their databases be able to handle?
But assuming all social media data on the Internet, posts and comments are collected in a database with multi-tiered querying – there would be some pretty powerful information.
Influence engagement metrics and advanced analytics:
Identifies the most influential consumers for a particular topic or issue
Determines the sub-topics of conversations
Interactive dashboard allows clients to determine specific sites and authors wielding the most influence in conversations.
Sentiment scores (what are they talking about?):
For example, their intelligent sentiment technology evaluates the positive and negative sentiment and tone of conversations. Users establish sentiment criteria by scoring a sample of data, and TruCast automatically scores the rest. I’d like to put this to the test.
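TruCast’s internals aren’t public, but the workflow described above (an analyst hand-scores a sample of comments, and the tool scores the rest automatically) can be sketched in a few lines. This is a toy word-polarity model with invented training comments, purely illustrative; a real tool would use far more sophisticated natural language processing:

```python
from collections import defaultdict

def train_sentiment(scored_samples):
    """Learn a per-word polarity score from hand-scored sample comments.

    scored_samples: list of (text, score) pairs, score in {-1, +1},
    mirroring the 'score a sample by hand' step described above.
    """
    totals = defaultdict(float)
    counts = defaultdict(int)
    for text, score in scored_samples:
        for word in text.lower().split():
            totals[word] += score
            counts[word] += 1
    return {w: totals[w] / counts[w] for w in totals}

def score_comment(model, text):
    """Score an unseen comment as the mean polarity of its known words."""
    words = [w for w in text.lower().split() if w in model]
    if not words:
        return 0.0  # no known words: treat as neutral
    return sum(model[w] for w in words) / len(words)

# Hand-scored training sample (the analyst's job)...
sample = [
    ("love this brand great service", 1),
    ("terrible support awful experience", -1),
    ("great product love it", 1),
    ("awful brand terrible quality", -1),
]
model = train_sentiment(sample)

# ...then new comments are scored automatically.
print(score_comment(model, "great service love it"))   # positive
print(score_comment(model, "terrible awful product"))  # negative
```

The appeal of the tone-based segmentation mentioned above is that, once comments carry a score like this, they can be bucketed into positive, negative and neutral groups per topic or per site.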
There are others, such as Pythia, which give trended social media data for free, so there are tools that can help even SMEs.
I personally think that engagement metrics within the context of the broader social media ecosystem, put to use to positively impact a company’s brand and social media reputation management, are something we will all be working with in the not-too-distant future.
Any thoughts or questions or disagreements, please let the web analytics princess know.
Last weekend I went to the inaugural PodCamp UK and co-presented a session with Lucie Follett on the monetisation of podcasting and podcast measurement using engagement metrics, in the auspicious surroundings of Birmingham’s NTI (New Technology Institute). It was fast-paced and innovative. You may be able to spot us in one of the photos? There were a whole bunch of people there, including all the top UK podcasters, Twitter guys (I Twitter, do you?), bloggers, journalists and new media folk in general.
In the social media ecosystem, in which I would include podcasting, there is so much potential for businesses to use podcasting to generate brand awareness and interest in their product or service from a niche audience. At the same time, there is an increased awareness of the potential monetisation of podcasting, if it is done effectively. I am still a big believer in “Content is King”, i.e. create podcasts that genuinely interest and compel your target audience. I have seen examples where “view movie” (i.e. watch the podcast), with the right kind of engaging content, resulted in a tripling of lead generation on a particular car company’s website, such as brochure requests. So podcasting can and does work for business when done in the right way: you need a good story, and definitely not “my boss told me to do a podcast”!
However, how do you begin to measure a podcast’s effectiveness?
Due to the nature of downloadable media, there are a number of difficulties when it comes to getting accurate metrics from podcasting, and issues to consider which impede the efficient implementation of big marketing or advertising campaigns across multiple websites:
- How many podcast downloads are there? If the podcast is embedded in the website, is it still considered to be a podcast?
- How many viewers actually watch or listen to the podcast once it is downloaded?
- What proportion of the podcast is listened to? For example, if you have an ad towards the end of the podcast, how many people actually hear it?
- What true influence or buzz is actually generated by the podcast? Link count (popularity) does not equate to influence.
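If a podcast player reported playback positions back to the publisher (a big “if” with downloadable media, as noted above), the ad-exposure and listen-through questions could be answered with a simple calculation. The episode length, ad position and listener figures below are invented for illustration:

```python
# Hypothetical playback data: the furthest point (in seconds) each
# listener reached in a 30-minute podcast, with an ad at the 25-minute mark.
AD_POSITION = 25 * 60      # ad placed towards the end, per the question above
EPISODE_LENGTH = 30 * 60

listen_positions = [1800, 300, 1700, 1560, 120, 1650, 900, 1790]  # assumed sample

downloads = len(listen_positions)
heard_ad = sum(1 for p in listen_positions if p >= AD_POSITION)
avg_completion = sum(p / EPISODE_LENGTH for p in listen_positions) / downloads

print(f"Downloads: {downloads}")
print(f"Heard the late ad: {heard_ad} of {downloads}")
print(f"Average completion: {avg_completion:.0%}")
```

Even this toy example shows why download counts alone mislead: a download tells you nothing about whether the listener ever reached the ad.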
The key thing is to look at podcasting in the same way one does other social media.
Engagement metrics are key. Things to consider include (and please feel free to add any more via comments):
1. Visitor reviews of your podcast (for example on itunes).
2. Visitor comments, where a podcast:comment ratio is the most helpful measure, as it strips things down to pure engagement on a podcast-by-podcast basis.
3. Social capital/visitor influence: if an established reviewer, i.e. a top podcaster or specialist within the industry, writes a review or comment, this will carry a lot more influence than if Sam from Dunkirk did (sorry, Sam).
4. Ranking on established podcasting platforms, such as the Podcasting News top 25 or the Podcast Alley top 10.
5. Wisdom from the rest of the web, such as the reaction on the blogosphere, Twitter-sphere, Facebook-sphere, general search engine results, etc.
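The podcast:comment ratio in point 2 is simple to compute once you have per-episode downloads and comment counts. The episode names and figures below are hypothetical; the point is that raw downloads (popularity) and the comment ratio (engagement) can rank episodes quite differently:

```python
# Hypothetical per-episode figures: downloads and comments received.
episodes = {
    "ep01-intro": {"downloads": 4000, "comments": 12},
    "ep02-interview": {"downloads": 2500, "comments": 30},
    "ep03-howto": {"downloads": 5000, "comments": 10},
}

# Comments per 1,000 downloads: the podcast-by-podcast engagement
# ratio suggested in point 2 above.
ratios = {
    name: stats["comments"] / stats["downloads"] * 1000
    for name, stats in episodes.items()
}

for name, ratio in sorted(ratios.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {ratio:.1f} comments per 1,000 downloads")
```

In this made-up sample, the least-downloaded episode (ep02) is the most engaging by a wide margin, which echoes point 4 of the bullets above: popularity does not equate to influence.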
The monetisation of podcasting is not about corporates trying to strangle the life out of a vibrant, independent podcasting community, which will definitely continue to thrive. It is a marketing journey in which businesses that understand social media will use it to their advantage. Businesses that podcast will be able to measure the tangible and intangible (hence engagement metrics) benefits to their business. Eventually, having demonstrated the value of advertising on podcasts or on websites running podcasts, marketers and advertisers will be able to implement advertising efficiently across multiple podcasts, on a similar principle (though very different in practice) to the way Google AdWords runs its content network advertising, where you can run campaigns on a keyword/sector basis.
Thanks so much for reading and do let me know if you have any thoughts or ideas, or if you completely disagree.
I’m not going to presume to give the entire or perfect answer to the difference between reporting and analytics, but here is a start. Reporting could include the creation of KPI dashboards and the preparation of results and reports, through to multiple-data-source integration. Analytics goes much deeper: web analytics as a means to improve your site’s conversion rate and your company’s profitability, and the use of web analytics to create action, i.e. a real, strong sense of forward movement (not just ‘reporting’). Analytics services may include multivariate testing, customer experience analysis and conversion analysis, but the key is the actionable insights that emerge from these analyses.
The problem is that a lot of the time, most people, companies and even some analytics consultancies (!) don’t differentiate clearly enough between reporting services and analytics services; in some cases they don’t even see the difference between the two, or assume that a KPI dashboard will give the insights as to why the site’s performance or conversion rate has changed. With web analytics we really need to get our hands dirty to find out why things are happening, and come up with insights that are all about making things happen: taking action, testing and making things work better (an increased conversion rate, or other website holy grails such as increased engagement).
However, here is an absolutely lovely analogy (courtesy of James Dutton): “So every day I get in my car and I drive to work; I have defined KPIs for fuel, engine speed and speed (a dynamic metric), I have measures of success (for example averaging a certain mpg) and warning triggers (e.g. battery charge). In other words I have a dashboard, or in an analytics metaphor I have my site dashboard.
Now in the event that something goes wrong I may get an alert to tell me what has happened (e.g. “ABS failure, see technician”), but most of the time something will happen without warning, for example overheating. Overheating is an awful problem; it happens very quickly, is disabling and gives no direct signs of impending trouble. Just like our site dashboard: if our conversion rate falls from 8.4% to 2.1% over the course of a month, we may not realise until the next dashboard is due. The fix requires the diagnosis to proceed by ruling out causative events, such as blocked radiator pipes.
Just as with web analytics, the diagnosis process needs to be worked through so that elements are ruled out as causes. The process may be simple, or it may be a complex study, but it is still a process. Hence, our reporting services and our analysis services should be complementary, but without careful alignment in both process and definition they will not be.”
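The overheating scenario suggests an obvious automation: an alert that fires when the latest conversion rate drops sharply below its trailing baseline, rather than waiting for the monthly dashboard to reveal the damage. A minimal sketch, with assumed daily figures echoing the 8.4% to 2.1% example:

```python
def check_conversion_alert(history, threshold_drop=0.25):
    """Flag when the latest conversion rate falls more than
    threshold_drop (relative) below the trailing average -- an
    'engine warning light' for the site dashboard.
    """
    *past, latest = history
    baseline = sum(past) / len(past)
    drop = (baseline - latest) / baseline
    return drop > threshold_drop, baseline, drop

# Assumed daily conversion rates: steady around 8.4%, then a collapse.
rates = [0.084, 0.082, 0.086, 0.083, 0.021]
alert, baseline, drop = check_conversion_alert(rates)
if alert:
    print(f"ALERT: conversion fell {drop:.0%} below the {baseline:.1%} baseline")
```

The alert only tells you that something is wrong, not why; as the analogy says, the diagnosis that rules causes in or out is the analytics work that follows.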
I do love the idea of being a website mechanic (not very glamorous, but actually quite exciting): you sort out the problem (overheating), then you add some bigger wheels, turbo-charge the engine and voila, you increase your car’s speed (website conversion rate). Oh wait, I am getting carried away.
If you agree or disagree or have any ideas then please do share.
Web Analytics Princess by Marianina Chaplin