Playing in Europe does affect domestic results in the EPL

There’s recently been a bit of discussion in the media (e.g. Sky, the Guardian) on whether participation in European competitions has a negative impact on an EPL club’s domestic performance. This is partly motivated by the significant improvements shown by Liverpool and Chelsea this season: after 13 games they are 10 and 17 points better off, respectively, than at the same stage last season. Neither are playing in Europe this year. Leicester are showing a similar trend, albeit in the opposite direction: they are now 15 points worse off than last season. For them, the Champions League seems to have been a significant distraction.

Numerous studies have demonstrated that there is no ‘hangover’ effect from playing in Europe (see here and here): there is no evidence that EPL teams consistently perform worse in league matches that immediately follow a midweek European fixture. But what about the longer-term impact? Perhaps the mental and physical exertion of playing against the best teams in Europe manifests itself gradually over a season, rather than in the immediate aftermath of European games. If this is the case, we should be able to relate variations in an EPL team’s points haul from season to season to the difference in the number of European fixtures it played.

It turns out that there is indeed evidence of a longer-term impact. The scatter plot below shows the difference in the number of European games played by EPL teams in successive seasons against the change in their final points total, over the last 10 seasons. Each point represents a single club over successive seasons. For instance, the right-most point shows Fulham FC from the 08/09 to 09/10 season: in 09/10 they played 15 games in the Europa League (having not played in Europe in 08/09) and collected 7 fewer points in the EPL. Teams are only included in the plot if they played in European competitions in one or both of two successive seasons[1]. The green points indicate the results for this season relative to last (up to game week 13); the potential impact of European football (or the lack of it) on Chelsea, Liverpool, Southampton and Leicester is evident. Chelsea’s league performance from 2014/15 to 2015/16 is a clear outlier: they played the same number of Champions League games but ended last season 37 points worse off.
Figure: Effect of participation in European competitions on a team's points total in the EPL over successive seasons. Green diamonds show the latest results for this season compared to the same stage last season. The blue dashed line shows the results of a linear regression.
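If you want to reproduce this, here is a minimal sketch of how the season-to-season differences could be assembled. This is not the exact code behind the plot: the file name and column names are assumptions, and relegation/promotion gaps are glossed over.

```python
# Sketch of building the season-to-season differences used in the plot.
# Assumes a hypothetical file 'epl_seasons.csv' with one row per club
# per season and columns: club, season, european_games, league_points.
import pandas as pd

seasons = pd.read_csv("epl_seasons.csv").sort_values(["club", "season"])

# Change from the previous season for each club (assumes consecutive
# EPL seasons; handling of relegated/promoted clubs is omitted here)
seasons["d_euro_games"] = seasons.groupby("club")["european_games"].diff()
seasons["d_points"] = seasons.groupby("club")["league_points"].diff()

# Keep club-season pairs where the club played in Europe in one or both
# of the two successive seasons (see footnote [1])
in_europe = seasons.groupby("club")["european_games"].transform(
    lambda g: (g > 0) | (g.shift() > 0)
)
diffs = seasons[in_europe & seasons["d_euro_games"].notna()]
```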

The blue dashed line shows the results of a simple linear regression. Although the relationship is not particularly strong – the r-squared statistic is 0.2 – it’s certainly statistically significant[2]. The slope coefficient of the regression implies that, for each extra game a team plays in Europe, they can expect to lose half a point relative to the previous season. So, if a team plays 12 more games, it will be 6 points worse off (on average) than the previous season.
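For the curious, here is a rough sketch of how that regression could be run on the hypothetical diffs table from the snippet above; with the data behind the plot, it should recover a slope of roughly -0.5 and an r-squared of about 0.2.

```python
# Simple linear regression of change in points on change in European games.
from scipy import stats

fit = stats.linregress(diffs["d_euro_games"], diffs["d_points"])

print(f"slope     = {fit.slope:.2f} points per extra European game")
print(f"r-squared = {fit.rvalue ** 2:.2f}")
print(f"p-value   = {fit.pvalue:.4f}")

# A slope of roughly -0.5 implies that twelve additional European games
# cost about 6 points relative to the previous season, on average.
print(f"expected change for +12 games: {12 * fit.slope:+.1f} points")
```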

It’s worth noting that the CIES Football Observatory performed a similar analysis in a comprehensive report on this topic published earlier this year. They found no relationship between domestic form and European participation over successive seasons. However, their analysis combined results from 15 different leagues across Europe, so perhaps the effect is more pronounced in the EPL than in other leagues. This recent article in the Guardian, citing work by Omar Chaudhuri, suggests that the effects of playing in Europe may be more pronounced in highly competitive divisions. The lack of a winter break may also be a factor: while teams in Italy, Spain and Germany enjoy several weeks’ rest, EPL teams play four league matches over the Christmas period.

Finally, an obvious question is whether we are simply measuring the effects of playing more games across a season. To test this, we should apply the same analysis to progress in domestic cup competitions. However, I’ll leave that to the next blog.


----------------------

[1] The points along x=0 are teams that played the same number of European games in successive seasons (and did play in Europe in both). The only two teams omitted are Wigan and Birmingham City, both of whom played in the Europa League while in the Championship. Matches played in preliminary rounds are not counted.
[2] The null hypothesis of no correlation is resoundingly rejected.

Comments

Hi,

This doesn't correct for regression to the mean. To qualify for Europe, a team will on average have over-performed, and so will do worse in the following seasons.

Reply: Yeah, that's right, although it's a little tricky to correct for.

However, what I have done is repeat the exercise removing the smaller teams (Leicester, Southampton, etc.) – i.e., those that qualified for Europe after over-performing and are therefore likely to do worse the following season – retaining only the teams that have qualified for Europe in most seasons (Man Utd, Spurs, Arsenal, Chelsea and Liverpool). The regression coefficient (beta) does not change significantly, which implies that the main result of this post is not due to mean reversion.

