The Hinkley Factor

The tendency of Port Adelaide under Coach Hinkley to taper off towards the end of the season (e.g. missing the top four after being 10-1 in 2014, missing the finals altogether after being 11-4 in 2018, and major finals disappointments in the last few years) inspired me to analyse all AFL clubs over the period since 2013, when Kenny became coach, to see whether the perceived Power drop-off in the second half of the season is real and significant compared with other clubs, or not.
I first recall expressing the idea that, from about July onwards, Port’s on-field performances tend to start going down the gurgler, in a discussion on this topic here, with some input from @Trader:
viewtopic.php?f=11&t=43901&start=920
However, this followed many years of observing that Port tended to perform extremely well, even against good sides, in the first few months of the season, relative to their performance across the back half, possibly by focusing on the short-term, or possibly on account of lacking the required ruthless edge in important matches.
Criticism of Port’s performances towards the pointy end of the season feeds into more general scepticism over the years of their ability under Kenny to get the job done in ‘big’ games – e.g. Showdowns, Friday nights, matches that could decide ladder positions, and of course finals. Of course, some of these ‘big’ matches are more likely to occur from July onwards.
And so this led me to delve through the records to produce the following analysis, and to my discovery of what I shall term the Hinkley Factor.
The Hinkley Factor
The Hinkley Factor is determined by calculating a team’s win percentage for the period up to and including 30 June, and subtracting from it the team’s win percentage for the period from 1 July onwards. This can be done across a single season or multiple seasons.
Thus, the Hinkley Factor is a measure of a team’s underperformance (or otherwise) from July onwards, relative to their win-loss record over the first few months of the season.
For instance, Port have gone (including all minor and major round games) 88-51 (63.3% win rate) to June, and 60-51 (54.1% win rate) from July, since Kenny became senior coach ahead of the 2013 season. Therefore, Port’s Hinkley Factor since 2013 is +9.3 (after rounding).
In other words, their win rate is 9.3 percentage points lower after tax time as compared with before.
Why did I pick 30 June? Basically for simplicity. I could have chosen before/after the bye rounds, or before/after 11 games played, or a certain round etc. – picking a date seemed an easy choice for collecting the data, and as arbitrary as the other options. And I doubt that a different method would yield a significantly different result.
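For anyone who wants to run their own club's numbers, here's a minimal sketch of the calculation in Python, using Port's figures from above as the worked example (the function names are just mine, nothing official):

```python
# Minimal sketch of the Hinkley Factor calculation described above.
# Draws are simply folded into the losses column, per the note below the table.

def win_pct(wins, losses):
    """Win percentage from a wins-losses record (draws counted as losses)."""
    return 100 * wins / (wins + losses)

def hinkley_factor(pre_july, post_july):
    """Pre-30-June win % minus 1-July-onwards win %; positive = late-season fade."""
    return win_pct(*pre_july) - win_pct(*post_july)

# Port Adelaide since 2013, minor and major rounds combined:
# 88-51 up to 30 June, 60-51 from 1 July onwards.
print(f"{hinkley_factor((88, 51), (60, 51)):+.1f}")  # prints +9.3
```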
So, what’s your team’s Hinkley Factor during the same period? And are Port the worst in the comp? My calculations are as follows:
Gold Coast +11.5
Fremantle +9.4
Port Adelaide +9.3
Collingwood +8.1
Melbourne +5.4
Adelaide +2.7
Geelong +2.4
North +2.1
Sydney +1.9
Carlton +0.4
St. Kilda +0.0
Essendon -0.8
GWS -1.2
WB -3.8
West Coast -5.3
Hawthorn -5.4
Brisbane -12.1
Richmond -14.8
(Note: The league average is slightly above zero as, for simplicity, I have considered draws in minor round matches to be losses for each side, for the purposes of calculating win percentage. There have been more draws in the latter part of the season since 2013.)
As expected, Port are right up near the top, just pipped into second spot by Freo. There are probably some parallels between Ken Hinkley and Ross Lyon: I never really thought Rossco was going to take Freo to a flag despite a strong win-loss record in the minor round. Their record in finals is not that bad, though, which suggests that they too underperform in July and August relative to the earlier part of the season. West Coast’s negative Hinkley Factor probably rules out travel to and from Perth as a significant factor.
Gold Coast being on top is no surprise – they’ve always had a low average age, with plenty of rookies and second-year players who naturally struggle at the back end of the season before they’ve had a few pre-seasons under their belts. It’s vaguely possible that ‘selective tanking’ could also be a cause; if a team knows by the halfway mark of the season that they’re not likely to qualify for finals, the draft system obviously creates an incentive to lose their remaining matches in order to receive higher draft picks. While this could apply to a number of clubs every year, the Gold Coast have never qualified for finals and have seldom come close.
Collingwood’s high Hinkley Factor didn’t come as a surprise to me, although I suppose my expectations were influenced by the decades-long ‘Collywobbles’, which were in reality confined to Grand Finals. Perhaps some perceived similarities between Kenny and Nathan Buckley also influenced me. 2023 actually increased Collingwood’s Hinkley Factor very slightly, notwithstanding that they got the job done in September. (The absence of Nick Daicos for much of the latter part of the season perhaps contributed to this.)
On the flip side, Richmond have a fantastic Hinkley Factor over this period, perhaps suggesting that they use each game to put themselves in a position to win the one that counts (which would appear to have been a winning strategy for them). There is clearly an inverse relationship between Hinkley Factor and premiership success when looking at the figures above.
The biggest surprise for me was Brisbane – despite burning a number of good chances in finals across the last five seasons, they’ve still got a fantastic Hinkley Factor of below -12. I think that’s helped by some strong late-season recoveries in the early years of the period. It also shows the measure isn’t driven by finals alone: high Hinkley Factor teams like Port underperform not merely in finals, but also in the latter stages of the minor round.
Conclusion
So, has my hypothesis been borne out? Well, Port have the 16th best (or is that worst?) Hinkley Factor out of the 18 clubs, so I’m probably not far off the mark. It’s true that the Gold Coast have out-Hinkleyed them, but you’d probably expect that to be the case given the Suns’ age profile and, well, if you’re even comparable to the Suns, you’ve got problems.
Port Adelaide, Freo, and Collingwood all have very large Hinkley Factors since 2013. I appreciate that there have been coaching changes at the latter two – perhaps I could have limited analysis of each club to the seasons they’ve been coached by their current coach, since the implication is that coaching has a bearing on things – but then, with most clubs the sample size would be quite low.
Interestingly, if you compare Port’s Hinkley Factor in seasons where they have made finals (+14.9) against seasons where they did not qualify for finals (+5.7), it appears that Port ‘spud it up’ far more late in the season when they’ve had a strong start – potentially under the pressure they’ve created for themselves. Or perhaps it’s really the case that overperformance early in the season comes at a cost of winning games towards the pointy end. Again, in comparing Port under Kenny to clubs who have won premierships over the last decade, it's hard not to draw the conclusion that Port have placed more of a focus on early wins than their competitors.
There are probably other ways to analyse the Hinkley era. Undertaking the same analysis on points scored for and against, looking at strength of schedule or records against teams that made the top eight/top four, bringing in betting markets, or even calculating the average Hinkley Factor across seasons (rather than merely aggregating total wins and losses) would all be interesting. I know that many Port fans have pointed out Kenny’s terrible record in primetime matches, and conversely his near-flawless record against the Gold Coast, St. Kilda, and North Melbourne, even as compared with successful teams.
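On that last point – aggregating all wins and losses versus averaging each season's figure – here's a minimal sketch using made-up per-season records purely for illustration (none of these numbers are real). The two approaches can diverge when seasons contain different numbers of games (shortened seasons, deep finals runs, and so on):

```python
# Hypothetical per-season records: (pre-July W-L, post-July W-L). Illustrative only.
seasons = {
    "Year 1": ((12, 2), (5, 6)),
    "Year 2": ((6, 8), (4, 5)),
    "Year 3": ((13, 1), (6, 7)),
}

def win_pct(wins, losses):
    return 100 * wins / (wins + losses)

# Option A: pool every game across all seasons, then take one difference.
pre_w = sum(pre[0] for pre, _ in seasons.values())
pre_l = sum(pre[1] for pre, _ in seasons.values())
post_w = sum(post[0] for _, post in seasons.values())
post_l = sum(post[1] for _, post in seasons.values())
aggregate_hf = win_pct(pre_w, pre_l) - win_pct(post_w, post_l)

# Option B: compute a Hinkley Factor per season, then average, so each season
# counts equally regardless of how many games it contained.
per_season = [win_pct(*pre) - win_pct(*post) for pre, post in seasons.values()]
average_hf = sum(per_season) / len(per_season)

print(f"aggregate: {aggregate_hf:+.1f}, per-season average: {average_hf:+.1f}")
```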
Nevertheless, I think it’s clear that Port, under Kenny, for whatever reason, just can’t get it done in the back half of the season, and arguably a reason for that is prioritisation of early-season wins over long-term success.
And last, but not least, Geelong’s Hinkley Factor during Kenny’s playing days at the club (1989-1995)?
A mediocre but not terrible +2.3.