Measuring the impact of dishonesty in football

2018-01-27

Note: This post was published on January 27th and updated on February 22nd to include the 2×2 matrix. Thanks to Avinash Kaushik for providing guidance to do that.

It’s clear there is very little awareness of the diving problem in football, and one of the main reasons is that nobody has measured how big a problem it is. We just see it happening in the moment, it’s an accepted part of the game, and given it’s part of the status quo very few people are interested in changing it. Measuring the problem is a way to raise awareness of it, but also a way to understand it better. Measurement is also the first step towards identifying relevant solutions and ultimately improving the situation being studied. As one of the leading business management experts said:

“If you can’t measure it, you can’t improve it.”
-Peter Drucker

It is not only about counting the number of dives in each game; the ultimate questions we want to answer are: How much is the course of games, leagues and World Cups negatively affected by diving? How many times would the winner of a game have been different had there been no dishonesty on the field? What percentage of game results is polluted by divers? Could we calculate an honesty index for football?

An easy way to visualize what we want to accomplish is by looking at the following 2×2 matrix. On the vertical axis we separate dives from non-dives; on the horizontal axis we differentiate situations that produce an unjust impact on the game from those that don’t.

We want to study the two lower quadrants: dives that go unidentified and their impact on the course of games, plus dives that are identified and contained by referees. In a world where all players engage with honour and integrity there should be no unfair results, and referees should not have to protect fair play from deceiving players. We hope that our effort will encourage players that dive to become the safeguards of fair play themselves.

Note: The top left quadrant is out of our scope: plays that don’t include dives but produce an unjust impact, such as cases of mistaken identity. Those cases could be addressed by the Video Assistant Referee (VAR).

Data analysis

As a data analytics problem, we need to take a few steps before arriving at a solution: (1) gather data, (2) give context to that data to enable analysis, (3) draw insights from the analysis, and finally (4) drive action to solve or improve the situation being analysed.

With all that in mind here we present our proposed measurement model to reveal the impact diving has on our beloved sport.

1) Building the dataset

Quantifying metrics

Metrics to count the types of dives that occur in a game.

  • Referee identified dive
      • Type: Integer, deterministic (there is no doubt)
      • Description: Counted when the dive is identified by the referee and not acknowledged as a fault. When that happens the referee often communicates with the diver (verbally or non-verbally) to let them know the decision. Sometimes the diver gets a yellow card. In any case, the referee has made it explicit that he or she identified a dive and has decided not to acknowledge it as a fault.
  • Video identified dive
      • Type: Integer, deterministic (there is no doubt)
      • Description: Counted when a play is acknowledged by the referee as a fault but later demonstrated by a video replay to be a clear non-contact play, thus a dive. Even in leagues that don’t use the Video Assistant Referee (VAR), the TV broadcast can easily be used to assess this.
  • Inconclusive dive
      • Type: Fractional (percentage), non-deterministic (there is doubt)
      • Description: A play acknowledged by the referee as a fault, where there is contact (also demonstrated by a video replay) but where there’s reasonable doubt about whether the contact caused the fall.

Drawbacks:
Judgement is subjective and opinions will fall into 3 possible discrete values: dive, no dive or unclear. As there is no deterministic way to judge these plays, the option is to qualify them with a crowdsourcing mechanism that asks the audience to assign one of those 3 values to the play. The metric would then report what percentage of the audience thinks it was a dive, what percentage thinks it wasn’t and what percentage thinks it’s unclear. The data should always be paired with its statistical significance, based on the number of crowdsourced responses, and it should be ensured that the participating audience (panel) is not biased towards one of the teams.
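
As a rough sketch of how such a crowdsourced judgement could be aggregated and reported, assuming a simple list of panel votes and a normal-approximation margin of error (the function name and labels are our own illustration):

```python
from collections import Counter
from math import sqrt

def aggregate_inconclusive_dive(votes):
    """Summarise crowdsourced votes ('dive', 'no dive', 'unclear') as shares,
    plus a rough 95% margin of error for the 'dive' share."""
    n = len(votes)
    counts = Counter(votes)
    shares = {label: counts[label] / n for label in ("dive", "no dive", "unclear")}
    # Normal-approximation margin of error for the 'dive' proportion, as a
    # simple stand-in for reporting statistical significance.
    p = shares["dive"]
    margin = 1.96 * sqrt(p * (1 - p) / n)
    return {"responses": n, "shares": shares, "dive_margin_95": margin}

# Example: 120 panel responses for one disputed play.
votes = ["dive"] * 70 + ["no dive"] * 30 + ["unclear"] * 20
print(aggregate_inconclusive_dive(votes))
```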

Mitigations:
Wikipedia lists a 2009 study that found 4 recognisable traits (we propose a fifth one) that can often be observed when a player is diving. People can be educated in recognising these traits to increase the accuracy of the crowdsourced judgement. To simplify panel bias prevention, a panel of proven experts could be appointed by a neutral organization.

Value:
Even if an accurate way of judging cannot be implemented, the value of this metric is not so much in the actual percentages as in the way it completes the overall picture of the diving practice. Counting inconclusive dives as an integer will reveal how many plays with reasonable doubt there are in a game. To put it in visual terms, see these simple Venn diagrams:

Impact metrics

  • Possession changed
      • Type: Enumeration (Changed, Maintained)
      • Description: Did the dive produce a change in ball possession in favour of the diver, or did the diver get to maintain possession despite an imminent ball loss?
      • Impacts: Ball possession time.
  • Card produced
      • Type: Enumeration (Yellow, Second Yellow, Red)
      • Description: Did the dive produce a card for the supposed offender? Either yellow, second yellow or red.
      • Impacts: Card count for the other player, player count in the game (in case the player gets dismissed from the match) and player suspensions (from card accumulation in a tournament).
  • Goal scored
      • Type: Boolean (yes/no)
      • Description: Did the dive produce a goal in favour of the diver?
      • Impacts: Goal difference.
  • Winner changed
      • Type: Enumeration (Losing to Tie, Tie to Winning)
      • Description: Did the dive produce a change in the match result, from losing a game to a tie or from a tie to winning a game? (See the points-swing sketch after this list.)
      • Impacts: Points gained per match.
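
To make the points impact concrete, here is a minimal sketch of the points swing implied by the standard win = 3, draw = 1, loss = 0 scoring system; the mapping is our own illustration, not part of any official metric:

```python
# Points swing for the diver's team and the opponent under the
# standard win = 3, draw = 1, loss = 0 scoring system.
POINTS_SWING = {
    "Losing to Tie":  {"diver_team": +1, "opponent": -2},  # 0 -> 1 vs 3 -> 1
    "Tie to Winning": {"diver_team": +2, "opponent": -1},  # 1 -> 3 vs 1 -> 0
}

print(POINTS_SWING["Tie to Winning"])  # {'diver_team': 2, 'opponent': -1}
```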

Qualifying metrics

Metrics to understand the effect dives have on the game and the other team.

  • Dive timestamp
      • Type: Time (minutes:seconds)
      • Description: The minute and second of the match at which the dive occurred. Earlier dives with an impact on the score leave more time for the affected team to recover than last-minute ones.
  • Total playing time per player
      • Type: Time (minutes:seconds)
      • Description: To measure dive frequency and how prone a player is to diving. A player with 5 dives in 90 minutes does not create the same disruption as a player with 5 dives in 5 minutes.

As a summary, here are all the proposed metrics:

  • Quantifying metrics: Referee identified dive, Video identified dive, Inconclusive dive
  • Impact metrics: Possession changed, Card produced, Goal scored, Winner changed
  • Qualifying metrics: Dive timestamp, Total playing time per player
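
As a rough sketch of how one observed dive could be stored for analysis, here is an illustrative record structure; the field names and types are our own assumptions, not an existing schema:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DiveRecord:
    """One observed dive, combining the quantifying, impact and qualifying metrics."""
    # Quantifying: how the dive was classified.
    classification: str            # "referee_identified", "video_identified" or "inconclusive"
    dive_share: Optional[float]    # crowdsourced share judging it a dive (inconclusive only)
    # Impact: what the dive changed.
    possession: str                # "Changed" or "Maintained"
    card_produced: Optional[str]   # "Yellow", "Second Yellow", "Red" or None
    goal_scored: bool
    winner_changed: Optional[str]  # "Losing to Tie", "Tie to Winning" or None
    # Qualifying: when it happened and how much the diver has played.
    dive_timestamp: str            # "minutes:seconds" of the match
    playing_time: str              # total "minutes:seconds" played by the diver
```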

2) Providing context to enable analysis

Basically, add as many dimensions as you want so that you can slice and dice the data in interesting ways. Some examples are:

  • Player name – this is the most important in order to build a personal record
  • Player position – goalkeepers are much less prone to dive than forwards
  • Team
  • League
  • Official or friendly match
  • etcetera
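
To illustrate the kind of slicing these dimensions enable, here is a minimal pandas sketch; the DataFrame, column names and values are illustrative assumptions:

```python
import pandas as pd

# One row per dive; the dimension columns let us slice the same counts many ways.
dives = pd.DataFrame([
    {"player": "Player A", "position": "Forward",    "team": "Team X", "league": "League 1"},
    {"player": "Player A", "position": "Forward",    "team": "Team X", "league": "League 1"},
    {"player": "Player B", "position": "Goalkeeper", "team": "Team Y", "league": "League 2"},
])

# Dives per league, per team and player, and per position.
print(dives.groupby("league").size())
print(dives.groupby(["team", "player"]).size())
print(dives.groupby("position").size())
```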

3) Drawing insights from the analysis

Metrics by themselves hardly produce any insights. Just counting dives will produce no greater good other than exposing divers a little more. There are many other interesting things to be discovered though. Some examples:

  • Are some leagues or teams more honest than others?
  • What is the diving behaviour per player? Calculate average dives per dimension (per league, per team, per position, by playing time), then compare each player’s output to those averages (see the sketch after this list).
  • Who are the players in the world with the most dives in the context of their position and time played?
  • Who is the player in the world with the fewest dives, and therefore a strong candidate for an honesty award?
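
Here is a sketch of that per-player comparison, assuming per-player totals of dives and minutes played (all names and numbers are made up for illustration):

```python
import pandas as pd

players = pd.DataFrame([
    {"player": "Player A", "position": "Forward",    "dives": 5, "minutes": 900},
    {"player": "Player B", "position": "Forward",    "dives": 1, "minutes": 850},
    {"player": "Player C", "position": "Goalkeeper", "dives": 0, "minutes": 900},
])

# Normalise by playing time: dives per 90 minutes.
players["dives_per_90"] = players["dives"] / players["minutes"] * 90

# Compare each player with the average for their position.
position_avg = players.groupby("position")["dives_per_90"].transform("mean")
players["vs_position_avg"] = players["dives_per_90"] - position_avg
print(players)
```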

And the most important questions to answer:

  • How many goals, games, championships have been influenced by dives?
  • What is the trend by month/year?
  • Are we getting more honest or more dishonest over time?

By answering these questions we might be able to compute the honesty index, and then many debates can be started. What would you think if you suddenly discovered that 80% (just to say a number) of the results in your favourite sport come from dishonest acts? That only 20% of the results are real and just? We believe this would propel action and change.
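
One possible way to compute such an index, assuming we can count how many results were changed by a dive, is simply the share of results that were not; this formula is our own proposal, not an established standard:

```python
def honesty_index(total_results: int, dive_changed_results: int) -> float:
    """Share of results that were not changed by a dive, between 0 and 1."""
    if total_results == 0:
        return 1.0
    return 1 - dive_changed_results / total_results

# Example: if 80 of 100 results were influenced by dives, the index is 0.2.
print(honesty_index(100, 80))  # 0.2
```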

4) Driving action to solve or improve the situation

Once enough data has been gathered and insights drawn, then maybe the proper actions can be taken. Until then, what we can do right now is appeal to players’ sense of honesty and integrity and ask them to demonstrate these values on the field, which is what we’re trying to do with #StopDiving.

