rubinaruma
Posts: 15
Joined: Sun Dec 22, 2024 4:24 am

will provide details on how

Post by rubinaruma »

Podcasting
Metrics Are the New Wizard of Oz
Author: Toni Becker
Date: September 11, 2021
Guest Post by James Piecowye

Any time we talk about the effectiveness of digital tools such as webpages, socials, podcasts, video and more, one of the first questions asked is: what do the metrics say?

Metrics are simply a way to measure something.

In 1974 Wenmouth Williams and David LeRoy conducted research into alternative ways to measure public radio audiences, and what they discovered then about metrics rings true today as we look at a plethora of digital delivery systems: the metrics we use give us insight into audience size and habits, but the functions of those mediums and how audiences perceive them remain largely neglected and unclear.

Metrics today, like the Wizard of Oz, have taken on a mythical stature, leading us to suspend reason and common sense. Like Dorothy, the Scarecrow, the Tin Man and the Lion following the Yellow Brick Road to find the Wizard of Oz, who turned out to be not what he was billed to be but just an ordinary man, we have placed an extraordinary amount of trust in metrics. Slowly we are coming to realise that metrics are not a panacea for all our questions about the effectiveness or veracity of the content we are creating.

Metrics are not the be-all and end-all when it comes to making our digital marketing decisions; they are just one tool, albeit an important one, in the mix.

The challenge we face is not to give metrics a ‘Wizard of Oz’ status simply because we don’t understand their variability, what they mean in the context they are being applied to, or how they are being interpreted.

Just to be clear, I am not for a moment suggesting that we outright ignore or disregard metrics as one of the tools in our digital marketing arsenal, but I am suggesting we take metrics for what they are and understand their limitations.

The ‘Wizard of Oz’ effect of metrics, blind trust in their ability to tell us everything we want to know, is rooted in the simple fact that we are often not clear on what type of data we are collecting and how it might be used to gain insights about our audiences.

There are two types of data we can collect that form our metrics: quantitative and qualitative.

Quantitative data, which is numerically represented and easy to display visually on a spreadsheet, graph or presentation, is by far the most popular type of metric we use today. Quantitative data is by and large easy to collect and easy to manipulate, and it is typically useful for giving us a sense of the reach of our digital products.

The challenge with quantitative data is that it is often pressed into service for tasks it is not really suited to, such as determining engagement or explaining why a person might like a webpage. Quantitative metrics are very useful in the quest to know what is happening, how many people listen to a podcast or how long people have stayed on a webpage, but they are woefully inefficient at answering the question of why they listened so long or stayed on the page at all.

To answer the question of why, or to gain insight into the engagement of an audience, we turn to qualitative data. The challenge with qualitative data is that it is more complicated and time-consuming to both collect and interpret, and it is less amenable to easy visual representation. Qualitative data is often obtained from an interview or perhaps a diary, where broad open-ended questions can be asked.
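
To make the distinction concrete, here is a minimal sketch in Python, with made-up field names and figures, of how the two kinds of data tend to be handled: the quantitative side reduces to arithmetic over a log of plays, while the qualitative side only gets as far as bucketing free-text answers for a human to read.

    from collections import Counter
    from statistics import mean

    # Quantitative: play records exported from a (hypothetical) podcast dashboard.
    plays = [
        {"episode": "ep-12", "seconds_listened": 1800, "country": "AE"},
        {"episode": "ep-12", "seconds_listened": 240, "country": "GB"},
        {"episode": "ep-13", "seconds_listened": 2100, "country": "AE"},
    ]

    total_plays = len(plays)                                    # how many
    avg_listen_s = mean(p["seconds_listened"] for p in plays)   # how long
    by_country = Counter(p["country"] for p in plays)           # where
    print(total_plays, round(avg_listen_s), by_country)

    # Qualitative: open-ended survey answers. A script can bucket them against a
    # hand-made codebook, but the "why" still needs a person to read them.
    answers = [
        "I listen on my commute because the host feels like a friend",
        "The interviews go deeper than the news coverage does",
    ]
    codebook = {"habit": ["commute", "routine"], "depth": ["deeper", "detail"]}
    coded = {
        code: [a for a in answers if any(k in a.lower() for k in keywords)]
        for code, keywords in codebook.items()
    }
    print(coded)

The first half answers what, how much and where; the second half only organises the raw material needed to start answering why.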

The goal with any metric is to understand what it is telling you and what it isn’t saying; this is the classic qualitative vs quantitative data conundrum.

But there is another challenge.

When we use metrics for insight, not only do we need to understand the type of data being used, but we also need to be aware of the different platforms we are talking about, because there is no universality in data between platforms and they do not necessarily collect or process the data that we use for metrics in the same way.

The data we are using for our metrics is not only not standardised in how it is collected, but each platform may also interpret what is collected differently.

What the lack of standardisation of data collection and interpretation means for our digital media is simply that we need to be aware of the limitations of using metrics to make firm pronouncements about reach and engagement, and not fall into the trap of the ‘Wizard of Oz’ effect, where we believe that metrics are all we need for successful decision making.
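
One practical way to stay aware of those limitations, assuming you have already pulled each platform’s export into a dictionary, is to normalise everything into a single record that carries a note on how each platform defines its numbers, so the caveat travels with the metric. The field names and definitions below are illustrative assumptions, not official platform specifications.

    from dataclasses import dataclass

    @dataclass
    class EpisodeMetric:
        platform: str
        episode: str
        plays: int
        plays_definition: str  # what this platform counts as a "play"

    # Hypothetical exports, already reduced to episode -> count.
    raw = {
        "anchor": {"ep-12": 950},
        "youtube": {"ep-12": 4200},
    }

    # Assumed definitions, recorded so the numbers are never read as comparable.
    definitions = {
        "anchor": "stream started (exact threshold assumed, not published here)",
        "youtube": "video view, not directly comparable to a podcast play",
    }

    normalised = [
        EpisodeMetric(platform, episode, count, definitions[platform])
        for platform, episodes in raw.items()
        for episode, count in episodes.items()
    ]

    for m in normalised:
        print(f"{m.platform:8} {m.episode} {m.plays:>6}  [{m.plays_definition}]")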

To highlight the challenge of metrics let’s consider the case of podcasting and how a few different platforms collect qualitative and quantitative data to create reach and engagement metrics.

YouTube is at the top of the pack in providing reach and engagement data. For reach metrics we have watch time, average percentage viewed, average view duration, audience retention and rewatch numbers, impression click-through rate, card click-through rate and unique viewers as a starting point. For engagement, we have comments, shares, likes, dislikes and playlist engagement, which give us both quantitative and qualitative data. Although the qualitative data is limited, it is a starting point.
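
Several of those reach figures are arithmetically related, which is worth remembering when reading them side by side. A rough sketch with invented numbers (YouTube reports these figures directly; the point is only how they relate to one another):

    # Invented figures for one video.
    video_length_s = 600        # a 10-minute video
    views = 2_000
    watch_time_s = 360_000      # total seconds watched across all views

    avg_view_duration_s = watch_time_s / views                     # 180 s
    avg_percentage_viewed = avg_view_duration_s / video_length_s   # 0.30, i.e. 30%

    print(f"{avg_view_duration_s:.0f}s per view, "
          f"{avg_percentage_viewed:.0%} of the video watched on average")

All of this tells us a great deal about what happened and almost nothing about why it happened.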

Anchor, Spotify’s podcast service, is rich in quantitative data but weak in qualitative data. What we do get from this platform is the total plays of a channel, estimated audience size, plays per episode, and listener location, gender and age. What we do not know, with any certainty, is why someone might like one show over another.

A further challenge with podcasting metrics is that while a podcast may live in one place, like Anchor, it may be distributed to many other podcast services, such as iTunes, Deezer and Google Podcasts to name a few, which also collect data and may not share that data with the place the podcast is hosted, creating inaccurate or even misleading metrics for a podcast. And there is always the possibility that the way the metrics are calculated can change at any given moment, as has been the case with Anchor over the past six months.
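
If you try to reassemble a whole-of-show figure yourself from whatever exports the distributors do provide, the safest approach is to keep the gaps visible rather than silently summing. A small sketch, with hypothetical numbers:

    # Per-platform plays for one episode; None means the platform collects data
    # but did not make it available to us. All numbers are hypothetical.
    plays_by_platform = {
        "anchor": 950,
        "apple_podcasts": 1200,   # opted-in listeners only, so an undercount
        "google_podcasts": None,
        "deezer": 310,
    }

    known = {k: v for k, v in plays_by_platform.items() if v is not None}
    missing = [k for k, v in plays_by_platform.items() if v is None]

    print(f"known plays: {sum(known.values())} from {sorted(known)}")
    if missing:
        print(f"no data from {missing}; treat the total as a floor, not a count")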

Apple is interesting because it shows how your content is being listened to, providing an analysis of listener patterns and locations against segments of an episode. But there are limitations. First, Apple only displays data from users who have agreed to share their diagnostics and usage information with providers. All data is aggregated, and no personally identifiable information for any customer is shown. Also, Apple does not share this data back with the other podcast platforms that push a show to it for distribution.

Deezer, the popular French streaming service, organises its data into analytics and audience categories, and both are essentially quantitative data sources with limited qualitative data. The analytics category shows the number of streams, unique listeners, fans and shares of your podcast, as well as the total and peak listening time. The audience category shows the age and gender of your fans, along with who’s listening on desktop, mobile or web.

What the limited examples above demonstrate is the variability and lack of standardisation of data collection, and the fact that in some cases the data is incomplete, yet this data is integral to making decisions about the effectiveness of digital products.

In the case of podcasting, what is also missing from the metrics is the impact of sharing and posting to other social media such as Twitter, LinkedIn and Instagram, which all have their own platform-specific metrics.

This takes us back to where we started.

The challenge we face when dealing with metrics is not to give them a ‘Wizard of Oz’ status simply because we don’t understand their variability, what they mean in the context they are being applied to, or how they are being interpreted. Metrics are important, but they are not all the same and they are not perfect or complete. What metrics do is give us insight upon which to make more informed decisions; they do not make the decisions for us.

Ready to conduct a complete digital audit?