This Week on CXL and Birthday!

A busy week in my studies and work. The days under Colombia's "smart quarantine" pass very much alike, one to the next. The workday usually ends late (8:00 pm), and then I start on my CXL program. This weekend was special, though: I turned 33, so I reunited with my family by videoconference, celebrating with my loved ones from afar, but with a lot of love from them.

This past week I studied the analytics module intensely, going into more detail on Google Analytics and Google Tag Manager within the implementation of digital marketing campaigns. I was very satisfied with what I learned in this module; it made me question how my Google Analytics account is set up and, even more, how I analyze the results of the strategy I designed from the point of view of the numbers.

In this sense, we should question whether the way we are reading the data from the different tools is the right one, not because of the way it is read, but because of the way the account is set up for analysis.

Before that, I had to finish the statistics fundamentals module. It is short compared to the others, but quite complex given the topics it covers. Here are some important notes on the subject.

We want to start with a heuristic approach in order to get familiar with the website, set a number of challenges, and later validate or refute them with data.

Be aware of biases:

● Bias blind spot: the tendency to notice biases in others while failing to see your own.

● Confirmation bias: favor information to confirm your beliefs.

“If you can’t explain what you are doing as a process, you don’t know what you are doing.” Great quote, right?

The 7 levels of conversion

Web Arts uses 7 levels to assess each page:

  1. Relevance.
  2. Trust.
  3. Orientation.
  4. Stimulants.
  5. Security.
  6. Convenience.
  7. Confirmation.

And Invesp uses 8 principles:

  1. Build buyer personas and focus on a few select personas when designing your layout, writing copy, and so on.
  2. Build user confidence, make them trust you by using all kinds of trust elements.
  3. Engagement. Entice visitors to spend a longer time, come back to visit, bookmark it, and/or refer others to it.
  4. Understand the impact of buying stages. Not everybody will buy something on their first visit, so build appropriate sales funnels and capture leads instead, and sell them later.
  5. Deal with fears, uncertainties, and doubts (FUDs). Address user's concerns, hesitations, doubts.
  6. Calm their concerns. Incentives are a great way to counter FUDs and relieve friction.
  7. Test, Test, Test.
  8. Implement in an iterative manner. Build smaller blocks, make smaller changes, and test them and improve their performance.

Conversion formula for marketing experiments

https://marketingexperiments.com/conversion-marketing

Steps used for heuristic analysis

Here are the steps that I personally use for performing a heuristic analysis of a given website.

I start by conducting thorough walkthroughs of the site with all the top browsers and each device category (desktop, tablet, mobile). I pay attention to the site structure and go through the checkout/form-filling process. The goal here is to familiarize myself with the site and its structure and to identify any cross-browser and cross-device issues (both UX and technical). Read the chapter on walkthroughs.

When evaluating a site, I will:

● Assess each page for clarity — is it perfectly clear and understandable what’s being offered and how it works? This is not just about the value proposition — it applies to all pages (pricing, featured, product pages, etc).

● Understand the context and evaluate page relevancy for visitors: does the web page relate to what the visitor thought they were going to see? Do pre-click and post-click messages and visuals align?

● Assess incentives to take action: Is it clear what people are getting for their money? Is there some sort of believable urgency? What kind of motivators are used? Is there enough product information? Is the sales copy persuasive?

● Evaluate all the sources of friction on the key pages. This includes difficult and long processes, insufficient information, poor readability and UX, bad error validation, fears about privacy & security, any uncertainties and doubts, unanswered questions.

● Pay attention to distracting elements on every high-priority page. Are there any blinking banners or automatic sliders stealing attention? Too much information unrelated to the main call to action? Any elements that are not directly contributing to visitors taking the desired action?

● Understand buying phases and see if visitors are rushed into too big of a commitment too soon. Are there paths in place for visitors in different stages (research, evaluation, etc)?

https://cxl.com/blog/give-your-advertising-roi-a-serious-boost-by-maintaining-scent/

Usability Evaluation

Jakob Nielsen defines usability along five dimensions:

Learnability: How easy is it for users to accomplish basic tasks the first time they encounter the design?

Efficiency: Once users have learned the design, how quickly can they perform tasks?

Memorability: When users return to the design after a period of not using it, how easily can they reestablish proficiency?

Errors: How many errors do users make, how severe are these errors, and how easily can they recover from the errors?

Satisfaction: How pleasant is it to use the design?

Use a Checklist

Usability Guidelines:

Task: Open your website in one window and the usability checklist in another. Write down every issue you find in a spreadsheet. Then rank each issue by ease of implementation, from 1 to 3.

Implement every issue-solution according to the prioritization defined.
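The ranking step can be sketched in a few lines. The issue names and ease scores below are invented examples, not items from the course checklist:

```python
# Hypothetical sketch: rank usability issues by ease of implementation
# (1 = easiest to fix), so quick wins get implemented first.

def prioritize(issues):
    """Return issues sorted so the easiest fixes come first."""
    return sorted(issues, key=lambda issue: issue["ease"])

backlog = [
    {"issue": "Error messages unclear on checkout form", "ease": 2},
    {"issue": "No visible search box on mobile", "ease": 3},
    {"issue": "CTA button low contrast", "ease": 1},
]

for item in prioritize(backlog):
    print(item["ease"], item["issue"])
```

The same spreadsheet could also carry an impact column, so you can sort by impact per unit of effort instead of effort alone.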

Survey Design Theory

Qualitative survey approach: Gather all the qualitative answers and group them into clusters according to keywords or insights found. Then run a quantitative analysis on top of that qualitative data.
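A minimal sketch of the clustering step, assuming a hand-made keyword-to-cluster map; the keywords, cluster labels, and answers below are invented for illustration:

```python
# Bucket open-ended survey answers into clusters by simple keyword matching,
# then count each cluster for the quantitative step.
from collections import Counter

CLUSTERS = {  # keyword -> cluster label (illustrative, not exhaustive)
    "price": "pricing",
    "expensive": "pricing",
    "slow": "performance",
    "confusing": "usability",
}

def cluster_answers(answers):
    counts = Counter()
    for answer in answers:
        words = answer.lower().split()
        # an answer may hit several clusters; unmatched answers go to "other"
        labels = {CLUSTERS[w] for w in words if w in CLUSTERS}
        for label in labels or {"other"}:
            counts[label] += 1
    return counts

print(cluster_answers([
    "Too expensive for what it does",
    "Checkout was confusing and slow",
    "Love the product",
]))
```

In practice you would refine the keyword map iteratively after reading a sample of answers by hand.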

Bouncing betas: When research is done on a very small audience, not every customer type will fit into the research, so some answers will come back as zero.

Common errors in surveys:

● Mixing behavior questions with attitude questions.

● Questions that don’t communicate clearly.

● Surveys that are too long (keep them to 5 to 10 minutes max). Fatigue increases and answers get worse.

● The error of central tendency: tired respondents answer “neither agree nor disagree,” and you can’t go further in the analysis.

Selective perception: when customers already agree with you, they tend to agree automatically with whatever you ask.

Survey Customers via Email

It is important to send out the survey as soon as possible after your customers purchase your service/product.

Keep it to 8 to 10 questions max, to avoid customer fatigue.

Usability Testing Vs A/B Testing

The difference between the two is that usability testing shows which issues are causing users problems or friction in accomplishing a goal, while A/B testing shows the probability that option A is better than option B, with statistical significance. A usability test needs only a handful of users; an A/B test needs a specific number of visitors, depending on your traffic, to validate a hypothesis.

The way to create a test on our websites is to first run a usability test to find the problems on the website, then create a hypothesis, run an A/B test, and get a result on your hypothesis.

Mouse tracking

Mouse tracking is useful to identify:

● Where people click and where they don’t

● How far they scroll on any given page

For heatmap analysis, it’s possible to use predictive tools, but take into consideration that these “instant” heatmaps are generated by machine-learning prediction algorithms, so they are not based on real end-user behavior.

Google Analytics Health Check

If I offered a Google Analytics health diagnosis as a paid service, it could build trust with customers, and people value fast delivery of their work. So it could work great.

The first thing is to check the company’s needs and whether they are tracked in GA. Ask several important questions:

● “Does it collect what we need?”

● “Can we trust this data?”

● “Where are the holes?”

● “Is there anything that can be fixed?”

● “Is anything broken?”

● “What reports should be avoided?”

A/B Testing Mastery Course

Types of experiments:

● Lift elements: delete elements on the page that don’t add value for your users and are negatively impacting your website.

● Optimization: lean deployment is the best way to A/B test individual elements.

The ROAR Model

  1. Risk
  2. Optimization
  3. Automation
  4. Re-Think

If you don't have at least 1,000 goal completions per month, you can't create A/B tests oriented to goal conversion optimization.

Which KPI to Pick

If you are a mature company, you might select your KPI from this list, in order of importance from top to bottom:

● Potential Lifetime Value

● Revenue per user

● Transactions (at least this, if you want a more business-oriented approach)

● Behavior

● Clicks

What can be optimized?

Customer behavior study: start by looking at what your customers want, their frictions, etc.

○ Get the most important insights into your customer journey

Track your website changes with several tools. We can also track any competitor's pages to see whether they make major changes to their site, so we can test those too, provided we share an audience with them.

Behavioral metrics for website

  1. % Light interactions in a website
  2. % High interactions in a website
  3. % Low intention to purchase
  4. % High intention to purchase

What to report when we have these numbers?

  1. Amount of users in every cluster
  2. Time it takes users to move from one cluster to another.

Also, it is important to talk with customer service or listen in on a call in order to understand what customers want and need from our product.

Create modules asking for feedback online, relying as much as possible on your current users: the ones who already interact with your service or product.

What types of test can we run to evaluate our assumptions?

  1. Five seconds test (measure users first impression)
  2. Question test (get users feedback)
  3. Click test (visualize where users click)
  4. Preference test (find out what users prefer)
  5. Navigation test (find how your users navigate in your site)

Google Optimize

When running an A/B test in Google Optimize, it’s important that the versions being compared are served as similarly as possible. So it’s recommended to create a set of pages: “Original”, “Default”, and “Variant”. The original receives 0% of the total traffic, while the default (an untouched copy of the original) and the variant receive 50% each. This way we make sure both tested versions are delivered the same way, so results are more accurate.
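The 0/50/50 split can be sketched as a weighted random assignment. This mirrors the setup described above; it is an illustration, not how Google Optimize works internally:

```python
# Illustrative traffic split: "Original" gets 0% so both test arms
# ("Default" and "Variant") are served the same way, 50% each.
import random

ARMS = [("Original", 0.0), ("Default", 0.5), ("Variant", 0.5)]

def assign(rng=random):
    """Pick an arm for one visitor according to the weights."""
    r = rng.random()
    cumulative = 0.0
    for name, weight in ARMS:
        cumulative += weight
        if r < cumulative:
            return name
    return ARMS[-1][0]

counts = {name: 0 for name, _ in ARMS}
for _ in range(10_000):
    counts[assign()] += 1
print(counts)  # "Original" stays at 0; the other two split roughly evenly
```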

Calculating A/B test length

● Why do we have to take a complete week for a test?

Weekday behavior affects results differently than weekend behavior, and evenings differ from business hours, so a test should always cover complete weeks.

● Why 1, 2, 3, or 4 weeks?

Because of sample dilution (or the lack of it), and test pace/velocity versus business cycles.

You also have to take into account how long it takes a visitor to convert on your website, so the experiment captures the complete effect over a customer's business cycle.

Statistics Fundamentals of testing

Statistics is how marketers can tell whether an A/B test result is real, according to the data, and more importantly, how they validate any hypothesis statistically.

Population: all the potential users in the group we want to measure.

Parameter: a variable of interest that can be measured.

Sample parameter (statistic): the value of that variable measured on a representative sample.

Population parameter: the parameter of interest measured over the entire population.

Mean: the average of all data points (a measure of central tendency).

Variance: describes how spread out the data is around the mean.

Standard deviation: the square root of the variance; it shows the variability of the data in its original units.
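A quick illustration of the three definitions, using Python's `statistics` module on a made-up sample of daily conversions:

```python
# Compute mean, sample variance, and standard deviation of a small sample.
import statistics

data = [12, 15, 11, 14, 18, 13, 15]  # illustrative daily conversion counts

mean = statistics.mean(data)          # central tendency
variance = statistics.variance(data)  # spread (sample variance, n - 1)
stdev = statistics.stdev(data)        # variability, in the data's own units

print(mean, round(variance, 2), round(stdev, 2))
```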

Confidence interval: the range of values that, with a specified probability (the confidence level), contains the true value of the parameter. It depends on:

● Mean

● Sample Size

● Variability

● Confidence Level
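A minimal sketch of a confidence interval for a conversion rate, using the normal approximation; the conversion counts are invented for illustration:

```python
# 95% confidence interval for a conversion rate (normal approximation).
import math

def proportion_ci(conversions, visitors, z=1.96):  # z = 1.96 -> ~95%
    p = conversions / visitors
    se = math.sqrt(p * (1 - p) / visitors)  # standard error shrinks as n grows
    return p - z * se, p + z * se

low, high = proportion_ci(120, 2400)  # 5% observed rate on 2,400 visitors
print(f"{low:.3%} to {high:.3%}")
```

Note how the interval reflects all four factors from the list: the mean (observed rate), the sample size, the variability p(1 - p), and the confidence level via z.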

Statistical Significance and P-Value

These quantify whether a result is real.

● P-value: the probability of observing a difference at least as large as the one in your sample, assuming there is no real difference (i.e., the null hypothesis is true).
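As an illustration, the p-value for an A/B test can be computed as a two-proportion z-test using only the standard library; the conversion counts are invented:

```python
# Two-sided p-value for the difference between two conversion rates.
import math

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the normal CDF (via the error function)
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

p = two_proportion_p_value(200, 4000, 250, 4000)  # 5.0% vs 6.25%
print(round(p, 4))
```

A p-value below the chosen threshold (commonly 0.05) is what "statistically significant" means in practice.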

Statistical Power

The probability that a test of significance will reject a false null hypothesis, i.e., detect a real effect.

It is determined by:

● Size of the effect you want to detect

● Size of the sample used

Sample size and how to calculate it

Sample size variables

● Control group expected conversion rate

● Minimum relative change in conversion you want to detect (Lift)

● Confidence Level

Held constant for the sample-size calculation: confidence level and power.
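The three inputs above plug into the standard two-proportion sample-size formula. This sketch holds confidence at 95% (z = 1.96) and power at 80% (z = 0.84), as suggested; the baseline rate and lift are illustrative:

```python
# Required sample size per variant to detect a given relative lift.
import math

def sample_size_per_variant(baseline, relative_lift, z_alpha=1.96, z_beta=0.84):
    p1 = baseline
    p2 = baseline * (1 + relative_lift)  # minimum rate we want to detect
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2)

# 5% baseline conversion rate, detect a 20% relative lift
print(sample_size_per_variant(0.05, 0.20))
```

Notice the trade-off: halving the detectable lift roughly quadruples the required sample, which is why small sites struggle to test small changes.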

● Statistics Trap #1

Regression to the mean & sampling error

Sampling error: every sample carries error, and small samples contain outliers among the data points, so an early result can sit far from the true mean. As more data accumulates, the measured values regress back toward the mean.
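A toy simulation of the effect: a variant that looks great after a handful of visitors is often just lucky, and its observed rate drifts back toward the true rate as the sample grows. All numbers are illustrative:

```python
# Small samples bounce around the true rate; large samples settle near it.
import random

random.seed(42)
TRUE_RATE = 0.05  # the real, unknown conversion rate

def observed_rate(n):
    """Simulate n visitors and return the measured conversion rate."""
    return sum(random.random() < TRUE_RATE for _ in range(n)) / n

early = observed_rate(100)      # noisy: can look far better or worse than 5%
late = observed_rate(100_000)   # settles close to the true 5%

print(f"after 100 visitors: {early:.1%}, after 100,000: {late:.1%}")
```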

● Statistics Trap #2

Too many variants

Optimization: Hypothesis-driven! You need a process behind it.

● Low error probability: even if you accept a low error probability per test (for example, 5%), the overall error probability increases when many variants are tested.

Correct for this yourself with:

● Analysis of variance

● Unifactorial analysis

Limit the number of test variants

● No more than 3 variants plus the control should be the max (expert advice).
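To see why the error probability grows with the number of variants: with alpha = 0.05 per comparison, the chance of at least one false winner is 1 - (1 - alpha)^k. The Bonferroni correction (alpha / k per comparison) is one simple, conservative fix. A small illustration:

```python
# Family-wise error rate for k independent comparisons at alpha = 0.05,
# and the Bonferroni-corrected per-comparison threshold.
ALPHA = 0.05

def family_wise_error(k, alpha=ALPHA):
    return 1 - (1 - alpha) ** k

def bonferroni_alpha(k, alpha=ALPHA):
    return alpha / k

for k in (1, 3, 10):
    print(k, round(family_wise_error(k), 3), bonferroni_alpha(k))
```

With 10 variants, the chance of a spurious "winner" is already around 40%, which is the math behind the "limit your variants" advice above.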

● Statistics Trap #3

Click Rates & Conversion Rate

Select and prioritize the main KPI before starting any test. Metrics that have a direct effect on the main (north-star) KPI are the ones called macro-conversions. There are two kinds of conversions, macro and micro: the first tells you “how much,” the other tells you “why.”

HOW MUCH — Macro conversions:

● Conversions

● Orders

● Revenue

● Profit

● Returns

WHY — Micro conversions:

● Clicks

● Visits

● Views

● Scrolls

● Bounces

● Statistics Trap #4

Frequentists vs Bayesian test procedures

In the Bayesian procedure, we assign a probability (a prior) before running the experiment and update it with the observed data. In the frequentist procedure, we do not assign any probability before the test.
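A sketch of the Bayesian side: start from a prior (here a flat Beta(1, 1)), update it with the observed conversions, then estimate P(B beats A) by sampling from both posteriors. The counts are invented, and only the standard library is used:

```python
# Monte Carlo estimate of P(variant B's true rate > variant A's true rate),
# using Beta posteriors under a uniform prior.
import random

random.seed(7)

def prob_b_beats_a(conv_a, n_a, conv_b, n_b, draws=20_000):
    wins = 0
    for _ in range(draws):
        # posterior is Beta(conversions + 1, failures + 1) with a flat prior
        rate_a = random.betavariate(conv_a + 1, n_a - conv_a + 1)
        rate_b = random.betavariate(conv_b + 1, n_b - conv_b + 1)
        wins += rate_b > rate_a
    return wins / draws

print(round(prob_b_beats_a(200, 4000, 250, 4000), 2))  # 5.0% vs 6.25%
```

The output reads directly as "the probability that B is better than A," which many practitioners find easier to act on than a p-value.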

Finally, next week we will continue with my progress in the growth marketing program.


I am a Digital Marketing Specialist who loves to learn as much as possible about food, analytics, X-games, history, and more. I hope this is an interesting blog.
