Our Five Favourite Talks from the 2017 Sports Analytics Innovation Summit

by Fusion Sport
 | 16th March, 2017

On March 8th and 9th, the Fusion Sport team trekked down to Melbourne to attend the Australian leg of the annual Sports Analytics Innovation Summit. Held at the MCG at the start of AFL season, Aussie rules was surprisingly not the only topic on the agenda. We heard about all things high-tech in speed skating; we learnt how expert NRL recruiters pit brains against instinct when finding the next superstar; and we saw how the NBL is using data to engage a whole new generation of fans. And there was a fair amount of footy sprinkled in there too!

Sports Analytics Innovation Summit 2017 — Melbourne

Below are our five favourite talks from the conference.

Robert Sorenson — Founder & Analyst at Litespeed
From the Front Lines: High Resolution Athlete Telemetry and Overcoming Adoption Problems

After we managed to peel our eyes away from the hallowed turf of the MCG — frankly mesmerised by the sun filtering through the silent stadium — Robert Sorenson opened the conference with an equally captivating presentation about his work in speed skating analytics.

An empty MCG

Here are three things we learnt from his presentation:

1. Speed skating is hard!

The difference between an elite athlete and a very-good-but-not-elite athlete can be painfully subtle. Sometimes it comes down to a minuscule difference in blade angle, or the smallest inefficiency when rolling a skate on or off the ice.

2. Technology and analytics can help

Litespeed specialises in wireless, real-time accelerometer technology that can track blade orientation in three axes — without the use of GPS. Sifting through the data generated by these sensors can reveal valuable insights; for example, maybe an athlete’s right blade is pitching up excessively when performing ‘cross-overs’.

Elma de Vries performing a cross-over

3. No buy-in from coaches and athletes? No analytics

Coming from a business/retail background, Robert emphasised that achieving acceptance is a multi-year process — and this is true across any industry breaking into analytics. Key takeaways: educate, then deliver, then demonstrate.

Sam Robertson — Senior Sports Scientist at the Western Bulldogs & Senior Research Fellow at Victoria University
A Hard Sell? Towards Automated Decision Support in the AFL

Dr Sam Robertson is one of a handful of experts in Australia who bridge the worlds of sports science and data science. A towering figure, he handed down from on high his lessons in implementing valid decision support systems.

Here are three things we learnt from his presentation:

1. Decision support systems can help communicate data to coaches

Sam’s recently published paper on traffic light systems¹ (see also our recent blog post summarising that paper) is a great example of a decision support system, whereby athletes are flagged as green, amber or red according to their predicted injury risk. Of course, traffic light systems can be applied to any complex process: wellness scores, fitness and fatigue, physiological testing, and more¹.
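
To make the idea concrete, here is a minimal sketch of a traffic light rule — not taken from Sam’s paper; the function name, the 0–1 risk scale, and the cut-points are all invented for illustration:

```python
# Toy traffic-light decision support: map a predicted injury risk
# (on a 0-1 scale) to a green/amber/red flag for coaching staff.
# The thresholds below are illustrative, not evidence-based.
def traffic_light(risk, amber_at=0.3, red_at=0.6):
    if risk >= red_at:
        return "red"
    if risk >= amber_at:
        return "amber"
    return "green"

# Hypothetical squad with model-predicted risks:
squad = {"Athlete A": 0.12, "Athlete B": 0.45, "Athlete C": 0.71}
flags = {name: traffic_light(r) for name, r in squad.items()}
```

The value of the system is that the coach never sees the raw risk score — just a colour they can act on.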

2. Validate your data

Sam also highlighted a few underappreciated data issues he sees within the sports industry.

For instance, experimental design: if your performance measures are not reliable, valid, responsive and feasible, can you really trust your data?

Collinearity is also important in sports science; i.e., when two or more independent variables in a linear model are highly correlated. Not only can collinearity make your model coefficients unstable, but it also means that you are collecting data on variables you don’t need to measure!
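
A quick way to spot pairwise collinearity is to compute the correlation between candidate predictors before modelling. The sketch below uses invented per-session numbers for two measures that often track each other in practice:

```python
import statistics

# Two hypothetical per-session measures: total distance covered (m)
# and accumulated 'player load' (arbitrary units). Values invented.
distance = [4200, 4800, 5100, 3900, 5600, 4400]
load     = [410,  455,  500,  380,  540,  430]

def pearson_r(x, y):
    """Sample Pearson correlation coefficient between two sequences."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

r = pearson_r(distance, load)
# r close to 1 means the two variables carry near-duplicate
# information: a linear model using both will have unstable
# coefficients, and one of the measures may be redundant.
```

For more than two predictors, variance inflation factors are the standard extension of this check.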

Cross-validation — or the lack thereof. Cross-validation means testing your model on data it has not seen before. If your model performs well on the training data but poorly on new data, the model is ‘overfitted’. In the words of Mladen from complementarytraining.net, an overfitted model is not predictive but merely ‘retrodictive’.
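
The extreme case makes the point: a ‘model’ that simply memorises its training examples is perfect on data it has seen and useless on anything new. All numbers below are invented:

```python
import statistics

# Toy data: some input measure -> some outcome score.
train = {5.0: 21.0, 6.5: 24.0, 7.2: 26.5}
test  = {5.8: 22.5, 7.0: 26.0}

def memorising_model(x, seen=train):
    # Returns the memorised answer if the input was in training,
    # else falls back to the mean of the training outputs.
    return seen.get(x, statistics.mean(seen.values()))

train_error = sum(abs(memorising_model(x) - y) for x, y in train.items())
test_error  = sum(abs(memorising_model(x) - y) for x, y in test.items())
# train_error is exactly 0, test_error is not: the model is
# 'retrodictive', not predictive. Only held-out data reveals this.
```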

3. Non-linear modelling techniques, like decision trees, are underutilised

Not only do many non-linear modelling techniques easily overcome issues like collinearity, they can also provide highly intuitive, interpretable results. Improve the accuracy of your decision support system by either upskilling in machine learning or bringing in new statistical talent.
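
Part of that interpretability is structural: a fitted decision tree is just nested if/else rules. This hand-written two-split sketch — with invented feature names and cut-points — shows the shape such a model takes:

```python
# A hand-written two-split 'decision tree' for flagging training
# readiness. Feature names and thresholds are invented; a real tree
# would learn its splits from data, but reads the same way.
def readiness(wellness_score, soreness):
    if wellness_score < 6:          # first split
        return "modify session"
    if soreness > 7:                # second split
        return "monitor closely"
    return "train as planned"
```

Each path through the tree is a sentence a coach can repeat back, which is hard to say of a tangle of regression coefficients.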

Jonathan Shepherd — Sports Engineer at the Queensland Academy of Sport & SABEL Labs
Data, data, data … but what about feedback?

PhD candidate Jonathan Shepherd (or Jono for short) is an emerging figure in the world of sports technology. Currently serving as a director on the board for the International Sports Engineering Association (ISEA), Jono’s multi-disciplinary doctoral research is bringing together engineering design, sports science and motor skill learning to push the boundaries of elite performance.

Here are three things we learnt from his presentation:

1. Data means nothing if no one is listening to it

In the sea of big data, it is sink or swim. Some will fight for air — struggling to wrangle the waves of GPS and wearable technology data into something useful — whereas others will feel for the current and let it guide them to shore. Jono is of the latter mould. He is building technology that feeds data back in a form an athlete can actually use to improve their preparation and motor skills. His current work revolves around the Oculus Rift — the burgeoning virtual reality platform.

2. The Cycling Simulator

A collaboration between SABEL Labs and Griffith University’s IDEA Lab, the cycling simulator combines the Oculus Rift with a Wattbike to provide a fully immersive race-day experience. You can even ride against the time of a world-class athlete or a ghost version of yourself. We’ll let Jono’s recently aired segment on SCOPE TV do most of the talking, but this may be a piece of tech that will engage fans and athletes alike.

3. There is a ‘right’ time to jump on the tech bandwagon

Sometimes it is smart to leave the pioneering work to the experts. Jono says the best time to adopt new tech is while part of the ‘Early Majority’; that is, after the ‘Innovators’ and ‘Early Adopters’ — who iron out all the bugs — but before the ‘Late Majority’ and the ‘Laggards’ — who relinquish any competitive advantage.

David Joyce — Head of Athletic Performance at the Greater Western Sydney Giants
We’re Better than Average

A giant in the industry, it fits that the GWS Giants snapped David Joyce up as their Head of Athletic Performance after a stint at the Western Force. It adds to an already impressive resume: nursing an injury-prone Harry Kewell back to his best at Galatasaray, helping Brett Emerton and Lucas Neill through one of their most successful periods at Blackburn Rovers, and working with the Chinese and British Olympic teams — amongst much more. Suffice it to say, when ‘Joycee’ talks, people listen.

Here are three things we learnt from his presentation:

1. Different athletes require different training programs

You wouldn’t treat a Ferrari like you would a semi-trailer, and you shouldn’t treat a wily veteran midfielder the way you would a young ruckman. Performance staff need to tailor their programs to recognise that all athletes require different training regimens.

2. The ‘mean’ can bring mayhem

So if we agree all our athletes are different, why do we often use group averages?

David recounted a stirring example of how the US Air Force committed a fatal abuse of the average in 1926. Building their first ever cockpit, they measured the dimensions of a few hundred pilots and sized the cockpit to the averages of those measurements. But twenty years later, at the dawn of jet-powered aviation, planes started dropping out of the sky, and no one knew why: it wasn’t pilot error, and it wasn’t mechanical error.

So the brass thought that maybe the average pilot had simply gotten fatter since 1926. They hired a young scientist to measure 4,000 pilots across 140 physical dimensions. Guess how many pilots actually fit within 30% of the average ranges of the 10 most important variables?

Zero. Not a single pilot.

And when choosing just three variables, only 3.5% of pilots fell within all the average ranges.

Sometimes there is no archetypal ‘average’ athlete. Check out our previous blog to learn how data mining can yield richer summaries than the measly mean.

One-dimensional thinking returns one-dimensional results
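
The arithmetic behind the cockpit story is easy to reproduce. This simulation uses invented parameters (independent measurements, a mean of 100, ±10% ‘average’ bands) rather than the Air Force’s real data, but the collapse it shows is the same:

```python
import random

random.seed(42)  # reproducible simulation

# Simulate 4000 'pilots', each with 10 independent body measurements
# drawn around a mean of 100. A pilot counts as 'average' on a
# dimension if they fall within +/-10% of the mean.
N_PILOTS, N_DIMS = 4000, 10
pilots = [[random.gauss(100, 15) for _ in range(N_DIMS)]
          for _ in range(N_PILOTS)]

def is_average(value, mean=100.0, tolerance=0.10):
    return abs(value - mean) <= tolerance * mean

# Fraction of pilots who are average on one dimension...
per_dim = sum(is_average(p[0]) for p in pilots) / N_PILOTS
# ...versus average on every dimension at once.
all_dims = sum(all(is_average(v) for v in p) for p in pilots) / N_PILOTS
# per_dim sits around one half, yet all_dims collapses towards zero:
# being average everywhere at once is vanishingly rare.
```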

3. If you are going to summarise data using averages, at least report the error

This was a point that came up in discussion after David’s talk, and it is spot-on. If a set of measurements is highly variable, reporting just the mean without the underlying variance is misleading. Report the standard deviation (or standard error) alongside the mean to give an idea of how much ‘trust’ you can put in it.
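
A minimal sketch of why the spread matters — two invented squads share the same mean sprint time, but the mean alone hides how differently they behave:

```python
import statistics

# Two hypothetical squads of sprint times (s) with identical means
# but very different spread. All values are invented.
squad_a = [4.9, 5.0, 5.1, 5.0, 5.0]
squad_b = [4.0, 6.0, 5.0, 4.2, 5.8]

for name, times in [("A", squad_a), ("B", squad_b)]:
    m = statistics.mean(times)
    sd = statistics.stdev(times)   # sample standard deviation
    print(f"Squad {name}: {m:.2f} +/- {sd:.2f} s")
```

Both report a mean of 5.00 s, but squad B’s much larger standard deviation says individual programming needs to differ far more there.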

Jeremy Loeliger — CEO of the National Basketball League (NBL)
The Role of Analytics in the Commercial Transformation of the NBL

Who can forget the NBL’s golden years, with perpetual MVP Andrew Gaze shredding the court?

But the league has seen better days since the Gaze era: 18 months ago there were no national sponsors and no TV deals. Then Mr Loeliger — an ex-partner in a national law firm — came on board and set things back on course.

Here are three things we learnt from his presentation:

1. Know your strengths

The NBA is the uncontested champion of basketball competitions. Rather than compete, the NBL has learnt to leverage their bigger brother. Consequently, they have enticed a legion of Aussie NBA fans back to the NBL. The NBL also streams into China where basketball is very popular.

2. Nothing but net

Data and analytics have been key to maximising match-day attendances and broadcast audiences. Understanding which promotions work — such as the classic ‘shoot from halfway’ competition — and which don’t has massively improved the league’s attendances, social media presence, and sponsorships.

3. Marry data with gut feel

Sports analytics is, all in all, still in its infancy compared to more established statistical fields. Instinct and experience still play a role when making decisions.

We love analytics here at Fusion Sport. Check out our Smartabase and Smartspeed products to see how we can facilitate successful analytics programs within your organisation.


1. Robertson, S., Bartlett, J., & Gastin, P. (2016). Red, Amber or Green? Athlete Monitoring in Team Sport: The Need for Decision Support Systems. International Journal of Sports Physiology and Performance, 1-24. http://dx.doi.org/10.1123/ijspp.2016-0541