The Fantasy Impact of EPL Teams "On the Beach"

Let’s kick things off with a poll…


In the final few games of the season, when a team is “on the beach” (i.e., they have nothing left to play for), they…

  1. Play worse

  2. Play better

  3. Are involved in more high scoring games

  4. Are involved in more low scoring games

It’s a question that came up during discussions at Draft Society HQ over the weekend. On the Saturday, 9th placed (but nowhere near Europe) Leicester City took on 15th placed (but fairly safe) Aston Villa. According to Twitter, both teams were well and truly on the beach. It finished 0-0. Then on Sunday, 10th placed Brighton faced 13th placed Southampton. It finished 2-2. The previous weekend, Brighton had turned up and beaten a red-hot Spurs away from home, whilst Leicester had slipped to defeat at (then, lowly) Newcastle.


So what’s going on? Are teams on the beach good or bad? Do they score loads or concede loads? How does all of this affect our draft fantasy management? As always, I’ve turned to the numbers to take a look.


The Analysis

The analysis here is fairly straightforward: compare pre-beach performance with on-the-beach performance and see what's better, worse, or staying the same. What counts as being "on the beach" is subjective, but a points gap to relegation/Europe of at least twice the number of games remaining seems a reasonable threshold (i.e., 10 or more points with 5 games to go, 8 or more points with 4 games to go, and so on). The issue we have, though, is that in a 20-team, 38-game league season, with 3 spots afforded for relegation and 6 or 7 for European competition, time "on the beach" is inevitably short.
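To make the threshold concrete, here's a minimal sketch of the rule as a function (the function name and example gaps are my own, purely for illustration):

```python
def on_the_beach(points_gap: int, games_left: int) -> bool:
    """A side is 'on the beach' when the points gap to the nearest
    meaningful position (relegation or Europe) is at least twice
    the number of games remaining."""
    return points_gap >= 2 * games_left

# 10+ points of daylight with 5 games left: nothing to play for
assert on_the_beach(10, 5)

# 7 points with 4 games left: still (mathematically) involved
assert not on_the_beach(7, 4)
```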


In general, we can get away with a sample size of 8-10 games (preferably more, of course). But no team is on the beach in Gameweek 28. In fact, it’s not really until around the 34 game mark that we start seeing multiple teams with nothing, realistically, to play for. At this point, we have a “beach” sample size of just 4 games – far from ideal, but it will have to do.


So with our comparisons decided (first 34 games vs last 4 games), we now need to determine our indicators of performance. As alluded to at the start, team goals scored and team goals conceded make sense. Likewise team points. For these, we will collect data going back 12 Premier League seasons, which gives us a sample size of 58 teams. We’re also going to include probably the most important measure of all: fantasy football scores (and specifically, points per 90). For this, we will only look at last season (sample size of 43 players).
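The comparison itself boils down to a rate change: points (or goals) per game across the final 4 games minus the rate across the first 34, then the mean and median taken over all teams. A minimal sketch, with entirely invented team names and numbers just to show the arithmetic:

```python
from statistics import mean, median

# Illustrative per-team records: points from the first 34 games and
# from the final four "beach" games (figures invented for the sketch).
teams = {
    "Team A": {"first_34": 48, "last_4": 3},
    "Team B": {"first_34": 40, "last_4": 7},
    "Team C": {"first_34": 52, "last_4": 5},
}

# Change in points per game: on-the-beach rate minus pre-beach rate.
changes = [t["last_4"] / 4 - t["first_34"] / 34 for t in teams.values()]

print(f"mean change:   {mean(changes):+.2f} points per game")
print(f"median change: {median(changes):+.2f} points per game")
```

The same calculation, swapping points for goals scored or conceded, produces the other two graphs.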


(Team) Points per Game

The graph below shows the change in points per game for all 58 teams in our sample, ordered from lowest (team got worse) to highest (team got better). Marginally more teams get worse (56.9%) than get better (43.1%), with the mean change being a decrease of 0.12 points per game and the median change being a decrease of 0.24 points per game. Nine teams got substantially worse (a decrease of more than 0.75 points per game), with the 2010-11 Bolton Wanderers side (remember them?!) leading the charge with four losses to Blackburn Rovers, Sunderland, Blackpool, and Manchester City. Just three teams got substantially better, with the 2018-19 Crystal Palace side (who beat Arsenal, Cardiff City, and Bournemouth, and drew with Everton) top of the pile.

(Team) Goals Scored per Game

The graph below shows the change in goals scored per game. Marginally more teams see their goals scored decrease (53.4%) than increase (46.6%), with the median change being an essentially negligible decrease of 0.05 goals scored per game. Interestingly, the mean change was an increase of 0.06 goals scored per game, due to the relatively extreme changes seen by a few teams. Just one team scored substantially fewer goals when on the beach (a decrease of more than 0.75 goals per game): the 2017-18 Watford side that blanked against Crystal Palace, Tottenham Hotspur, and Manchester United. Seven teams scored considerably more when on the beach, with the 2018-19 Crystal Palace side previously mentioned bagging 11 in the games against Arsenal, Cardiff, and Bournemouth.

(Team) Goals Conceded per Game

The graph below shows the change in goals conceded per game. Considerably more teams see their goals conceded increase (62.1%) than decrease (37.9%), with the mean change being an increase of 0.23 goals conceded per game, and the median change being an increase of 0.25 goals conceded per game. Nine teams conceded substantially more goals when on the beach (an increase of more than 0.75 goals conceded per game), with the 2012-13 West Bromwich Albion side taking the crown for leakiest after they shipped 13 in matches against Wigan Athletic, Manchester City, Norwich City, and Manchester United. Only three teams improved on the defensive side of things by a considerable margin, with the 2013-14 Southampton side earning clean sheets against Aston Villa, Everton, and Swansea City, before conceding just the one to Manchester United on the final day of the season.

Fantasy Performance (Points per 90; PP90)

Finally, we come to the one we care about most: fantasy performance. The graph below shows the change in fantasy points per 90 (PP90) scored by all 43 players in our sample, ordered from lowest (player got worse) to highest (player got better). Here we see that 58.1% of players experienced a decline in their PP90, whilst 41.9% experienced an increase. The mean change was a decrease of 0.53 PP90 and the median change a decrease of 0.50. Four players saw a sizeable increase (more than 5 PP90), and, not surprisingly, two of those (Joe Willock, +6.77, and Paul Dummett, +5.19) came from Newcastle United, who had the second biggest increase in goals scored per game (as per the second graph). Three players saw a sizeable decrease: Ruben Neves (-5.45), Stuart Armstrong (-5.19), and Jeffrey Schlupp (-5.06). Joel Ward (-4.89) and Cheikhou Kouyate (-4.38) also struggled, which is to be expected given that the 2020-21 Crystal Palace side had the 8th biggest increase in goals conceded per game (as per the third graph).
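For anyone unfamiliar with the metric, points per 90 simply normalises a player's fantasy points to a full 90 minutes of play, so part-time players can be compared fairly with ever-presents. A quick sketch (the example figures are invented, not taken from the data above):

```python
def points_per_90(total_points: float, minutes_played: int) -> float:
    """Fantasy points normalised to a full 90 minutes of play."""
    return total_points / minutes_played * 90

# e.g. a hypothetical 120 points across 1,800 minutes works out at 6.0 PP90
assert points_per_90(120, 1800) == 6.0
```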

What This All Means...

If there’s one thing that this data tells us, it’s that teams on the beach produce random performances. The only consistency is inconsistency. Nothing exemplifies this more, perhaps, than the examples of Everton, Fulham, and Liverpool in 2011-12. The three sides were 7th, 8th, and 9th after 34 games, but were well adrift of Chelsea in a season in which only the top six earned a European berth. They were all on the beach. In the final four games, Everton would beat Fulham 4-0, then draw with Stoke (14th in the league) and Wolves (20th). Fulham would, of course, lose to Everton 4-0, then beat Liverpool 1-0. And Liverpool would, after losing to Fulham, beat Chelsea 4-1 then lose to Swansea 1-0.


Crystal Palace provide us with more evidence of this haphazard phenomenon. Their 2018-19 side produced one of the best on-the-beach performances of any side in the last 12 years, improving their points per game by 1.35 and their goals scored per game by 1.57. Fast-forward 12 months and their 2019-20 side produced one of the worst on-the-beach performances, declining in points per game by 0.46 and increasing in goals conceded per game by 0.85. Same manager, same on-the-beach position, many of the same players, completely different outcome.


There is a solid logic behind believing that a team (and therefore, from a fantasy perspective, its players) on the beach will get worse. The problem is that there’s an equally solid logic behind believing that said team and players will get better. With nothing to play for, a team will lack motivation. And motivation is key to performance. Ipso facto, the team’s performance worsens. Or…with nothing to play for, there is no longer the debilitating pressure on players. And pressure is often a negative thing. Thus, the team’s performances have to improve.


It is possible (and I say this very tentatively) that you could infer from the data that being on the beach – if it were to have an influence one way or another – is slightly more negative. The fact that 62% of teams see their goals conceded per game increase is not a trivial amount. For me, however, the numbers are too small in magnitude and the distribution too wide for them to have an influence on my decision making. If I were to take anything from this, it’s that being on the beach doesn’t tell you anything – and that in itself is something.


So let others believe the myth. Let others stack players against the likes of Brentford and Southampton for the remainder of this season. It will bite them, because these teams are just as likely to pull out a win as they are a loss (or a high-scoring thriller as they are a dour 0-0 draw). Some beach teams get better, and some get worse. The result is random and influenced to a large extent by their schedule. Most importantly, it is unpredictable, so bear all this in mind when weighing up a waiver pick-up of Daniel James (he plays flip-flop-wearing Brighton in Gameweek 37) or a trade for Ismaila Sarr (he takes on deckchair-sitting Crystal Palace in Gameweek 36) – don’t let this myth dictate your decisions.


For all the latest from The Inner Geek, follow @the_innergeek on Twitter!

And for more in-depth and exclusive resources, become a member of The Inner Circle.







