What is Player Multivariate (MV) Testing?


Multivariate testing is a form of A/B testing available on our Players that allows you to test and compare two or more versions of the same Player with your live traffic. You can change one or more settings between your player versions and control the percentage of traffic that is exposed to each variant.

This feature is especially useful when you want to improve your Player's performance and engagement statistics: you can change one or more settings in a controlled experiment while using your live audience.


What are the MV Testing Mechanisms?

Whenever you apply MV testing to your player, you can select one of two MV testing mechanisms. The table below summarizes the principal differences between them.

| Characteristic | Swap player | Swap player configuration |
| --- | --- | --- |
| Description | Real players are swapped and compared. | Player configurations are swapped and compared while the player itself remains the same. |
| Playlist input | Different playlists can be assigned to the original and variant players. | A playlist is assigned to the original player only. |
| Adding macros | Original player and variant players. | Original player only. |
| Traffic | Goes to the original player and variant players. | Goes to the original player only. |
| Reporting | All reporting metrics that your user has access to are available. | Only reporting metrics of the macro report type are available. |


How Does MV Testing Work?

The best way to describe how Multivariate Testing works is with an example:
Let's say we want to find out which player performs better: our "Click to Play" Player or an Autoplay one. In this case, we take the Player that is already embedded on our main sites and add MV testing variants to it.

In the following scenario we have taken our original "Click to Play" Player and added two Variant Players: Variant 1, which is an "Autoplay" player, and Variant 2, which is a "Click to Play" Player that also uses our "Animated Poster" feature.
Using the weighting property, we also defined that each Player should receive an equal share of delivery (roughly 33% each). This means that each Player (the Original, Variant 1, and Variant 2) will be served equally across the traffic.
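The weighted selection described above can be sketched in a few lines. This is a minimal illustration only; the function and field names (`pick_variant`, `weight`, `name`) are assumptions for the example, not the Delivery Service's actual API:

```python
import random

def pick_variant(variants, rng=random):
    """Pick one variant at random, in proportion to its 'weight' value.

    `variants` is a list of dicts with hypothetical 'name' and 'weight' keys.
    """
    total = sum(v["weight"] for v in variants)
    roll = rng.uniform(0, total)
    cumulative = 0.0
    for v in variants:
        cumulative += v["weight"]
        if roll <= cumulative:
            return v["name"]
    return variants[-1]["name"]  # guard against floating-point edge cases

# Three equally weighted players, as in the example above.
players = [
    {"name": "Original (Click to Play)", "weight": 33},
    {"name": "Variant 1 (Autoplay)", "weight": 33},
    {"name": "Variant 2 (Animated Poster)", "weight": 33},
]
```

Because the weights are equal, each player is chosen about a third of the time over many requests.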

The following flowchart and explanation describe how this is accomplished:




  1. At first, the embedded Player tag on our webpage requests the original Player from AOL's Delivery Service. This is the same process for all our Players across the network and nothing has changed here.

  2. The Delivery Service then locates the relevant Player (according to the Player ID passed by the webpage) and checks if the Player has any MV Testing variants. In this example, we have two additional variants to our player.

    Note: All MV Testing Players are requested under the original Player ID defined in the player tag embed code.

  3. The Delivery Service then cycles through the variants (including the original) and serves one of them back to the webpage according to the defined weights.

    Note: If the MV mechanism is set to 'Swap player', the actual player will be served. If the MV mechanism is set to 'Swap player configuration', the original player will be served together with the selected variant player configuration. 

  4. Because all of our Players have been weighted equally (33%), each one is delivered an equal share of the requests made for the player.
    Simply put, if the Player is viewed 150 times by random users on your webpage, each Player variant (including the original) will be delivered 50 times (Original: 50 times, Variant 1: 50 times, Variant 2: 50 times).

    Important! MV testing original players are regular players for all intents and purposes. Therefore all typical player reports are available.

    Important! MV testing variant players exist only for the purpose of MV testing and should not themselves be embedded on any sites.

    At the end of the test, you will be able to generate a report showing how each of the different settings affected your viewers. For instance, you might see an increase in clicks when using the Animated Poster, or a higher video completion rate with Autoplay players.
    Then, using your findings, you can either run more focused tests or apply the winning change to your original player to boost its performance.
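The four steps above can be sketched end to end. Everything here is an illustrative assumption (the catalog layout, the `mechanism`, `config`, `variants`, and `weight` fields are invented for the example); it only shows the logical difference between the two mechanisms, with "Swap player configuration" serving the original player with the variant's settings layered on top:

```python
import random

def weighted_choice(variants, rng):
    """Step 3: pick one variant in proportion to its weight."""
    total = sum(v["weight"] for v in variants)
    roll = rng.uniform(0, total)
    cumulative = 0.0
    for v in variants:
        cumulative += v["weight"]
        if roll <= cumulative:
            return v
    return variants[-1]

def deliver(player_id, catalog, rng=random):
    """Resolve a player tag request (steps 1-4) to a served configuration."""
    player = catalog[player_id]            # step 2: look up by Player ID
    variants = player.get("variants", [])
    if not variants:                       # no MV testing: serve as-is
        return {"config": player["config"]}
    chosen = weighted_choice(variants, rng)
    if player["mechanism"] == "swap player":
        # The chosen variant player itself is served.
        return {"config": chosen["config"]}
    # "swap player configuration": the original player is served with the
    # chosen variant's settings applied on top of its own.
    return {"config": {**player["config"], **chosen["config"]}}

# Hypothetical catalog entry: a 50/50 test of a static vs. animated poster.
catalog = {
    "player-123": {
        "mechanism": "swap player configuration",
        "config": {"mode": "click-to-play", "poster": "static"},
        "variants": [
            {"weight": 50, "config": {}},                      # original
            {"weight": 50, "config": {"poster": "animated"}},  # variant
        ],
    },
}
```

Over many requests for `player-123`, roughly half of viewers receive the static poster and half the animated one, while the player itself never changes.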
