I still remember one of my first sports science projects. The idea was to build an analytical tool to estimate the workload of return-to-play sessions from previous drills, and from there to estimate trends. I jumped straight onto the task, eager to prove my ability to deliver. I re-analyzed all the return-to-play sessions, cut every drill to complete the database, and ended up building the analytical tool for the S&C coaches in about 3 months. When I presented it to the group, I realized that no one was interested in the product. Indeed, their process for building a session – based on daily context, the positional roles of the players involved in training, and their experience – didn’t need a fancy tool. So, where did I fail?
We previously claimed that gaining insight into athletes’ dose-response relationships to training had been the holy grail of sport scientists for decades [1,2]. What if we were (somewhat) wrong? More than anything else, I think the holy grail of every practitioner should be to help athletes and teams win more games. To do so, we must have a significant impact on the programming puzzle, and for that, solving coaches’ challenges is key. Many people have already covered this topic in depth [e.g. 3,4,5], so why are we still so ineffective in the real world? To do it well, I believe we can adapt some tools and concepts from the business world.
Resources are scarce.
First, we need to acknowledge that we are in a resource-constrained situation, so choices are mutually exclusive. For entrepreneurs, that means if they allocate money or engineers to improve product A for a customer, they cannot allocate those same people and money to product B. You cannot simply do A and B. You are also unlikely to do A then B, because by the time you get to B, circumstances will very likely have changed. For a sports scientist, it means thinking harder about time and budget constraints and accepting that not every project can be handled – even by those working 80 hours a week. The opportunity cost of doing A is that you cannot also do B. Full stop.
Companies or teams that lack strategic bounds try to do too much and spread themselves too thin. Because they fail to concentrate their available resources, they can’t win in any key market. For us, that means jumping on every single “interesting” project we find, bouncing from idea to idea, exploring “what if” projects all day long. There are tons of interesting projects, and we could explore all of these fantastic topics. But if we want to have an impact on the field, we really need to focus on the most important ones and deliver them. We need to be known as great finishers, not merely good starters. We need to stay focused on the prize.
So how, as practitioners, can we maximize our impact on the field when it comes to new methods, processes or tools? How can we focus our energy on the most important projects? Using the concepts of design thinking and lean start-up, common in the business world, I will propose a simple framework to face these challenges. This process is mainly based on my own many failures, but I believe there is learning in failure. To keep it simple, I have broken the framework down into two essential steps:
- Ideation: using design thinking to be more customer-centered.
- Implementation: using lean start-up concepts to learn faster and get real insights.
Thinking as a designer
RESEARCH – OBSERVE: Design thinking is a discipline that uses the designer’s sensibility and methods (in our case, sports science!) to match people’s needs (staff or decision makers) with what is technologically feasible and can bring value to the team. Thus, during the earliest phase of the project (inspiration), the team in charge must talk with practitioners and try to understand their issues and constraints (e.g., time, lack of resources or knowledge). In this research phase, if we are not able to build effective relationships with the coaching staff and engage with them, we will likely fail to solve their biggest challenges.
IDENTIFY POTENTIAL NEEDS: As Martin Buchheit said, “it is only by sitting right next to them during training sessions and team debriefs, by sharing meals and coffees, being with them in the ‘trenches’” that sports scientists will improve their understanding of other practitioners’ challenges. In designer terms, we need to do some shadowing and have empathy toward the coaching staff to identify their potential needs. Thinking back to my project: had I engaged with the coaches from the beginning, during the inspiration phase, I could have gathered far better information about their true needs and constraints.
PRIORITIZE NEEDS / PROJECTS: With all the insights collected during these first two phases, we can identify potential challenges faced by the staff. A simple Impact / Effort scale can help prioritize projects (see fig 2). The scale is easy to use: draw a low-to-high impact scale on the abscissa and a simple-to-complex effort scale on the ordinate. We now have four clear buckets:
- Low-impact, complex projects are the ones we clearly need to forget – sometimes these are the fancy, interesting ones we would be keen to explore, but remember, we are in a resource-constrained world.
- Easy but low-impact projects can be tackled as simple tasks.
- High-impact, complex projects become the key projects we need to work on.
- Finally, high-impact, simple projects need to be done NOW – they are our quick wins. Everyone loves quick wins.
While it sometimes seems obvious which bucket a project falls into, I highly recommend actually plotting the identified challenges on the Impact / Effort scale, to avoid being biased toward projects we find interesting but that are not (clearly) important.
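For those who like to make the exercise concrete, the bucketing logic above can be sketched in a few lines of code. This is a hypothetical illustration only: the project names and 1–10 scores are invented placeholders, and the threshold of 5 is an arbitrary assumption, not a validated cut-off.

```python
# Hypothetical sketch: assigning projects to the four Impact / Effort buckets.
# Scores (1-10) and the threshold are illustrative assumptions.

def bucket(impact: int, effort: int, threshold: int = 5) -> str:
    """Return the Impact / Effort bucket for a project."""
    if impact > threshold:
        return "quick win" if effort <= threshold else "key project"
    return "simple task" if effort <= threshold else "forget it"

projects = {
    "RTP load-estimation tool": (8, 8),    # high impact, complex
    "Automate weekly GPS report": (7, 3),  # high impact, simple
    "Fancy 3D drill animation": (2, 9),    # low impact, complex
}

for name, (impact, effort) in projects.items():
    print(f"{name}: {bucket(impact, effort)}")
```

Even this toy version forces you to write a number down for impact and effort, which is exactly the de-biasing step recommended above.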
BRAINSTORM – CONCEPTS: Let’s say we have identified the need to structure the return-to-play (RTP) phases as our high-impact / complex challenge and we want to turn it into a project. First, I recommend turning the challenge into a question to favor ideation. “How might we (HMW)” questions let us do exactly that. The phrasing is purposely open-ended and optimistic, forcing us to look for opportunities and challenges rather than getting bogged down by problems or jumping to solutions too soon.
HOW MIGHT WE improve the rationale behind our RTP choices to become more effective?
Using this question as a starting point, we can now brainstorm potential concepts with the end-users, benefitting from the wisdom of the crowd. Again, I doubt (highly unlikely!) these brainstorming sessions will happen in a formal meeting format, with post-its on the walls (but who knows…). Rather, shared gym sessions or coffees (and/or beers!) are the perfect setting for that.
Rapid prototyping and iterations
Once an idea has been chosen, it is time to execute the plan. Other challenges come up during this phase. Over the weeks (or months) needed to run a major project, end-users’ needs can change, a pivot in strategy may be requested, and the apprehension of showing the end product grows exponentially.
THE BUILD – LEARN – ADAPT LOOP: To reduce the risk of failure, product development teams (yes, an analytics project is a product) now use a method called ‘Lean start-up’ or ‘Lean UX’. This methodology favors experimentation over elaborate planning and customer feedback (in our case, from the coaching staff) over intuition, and it comes with the concepts of the Minimum Viable Product (MVP) and validated learning. Lean development reduces wasted time and resources by developing the product iteratively and incrementally. Instead of long cycles (months for analytical projects, years for a product), the process is built around an MVP that feeds a loop of testing, learning and adaptation, hand-in-hand with the end-user. The goal of this iteration phase is simply to test the hypotheses and scale the validated ones. Asking questions is good – validating hypotheses is better!
Coming back to our previous project, one option could have been the following. From the ideation phase, we decided to improve our RTP process, and one S&C coach mentioned that he used the training-load reports to plan his sessions and anticipate the load of subsequent ones. From this insight, I developed the concept of a training-load prediction tool for S&C coaches. But before building it completely (remember my own story!), I need to validate some hypotheses:
How Might We (HMW) improve the rationale behind our RTP choices to become more effective?
Hypothesis: if I create a tool that estimates the training load of a session from the selected drills and shows the coaches their workload trends, they will use it regularly. I will validate this hypothesis if the coaches use the tool for 5 consecutive sessions.
MVP: split some drills and build a quick-and-dirty tool that does the job (or fake it!).
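A quick-and-dirty MVP of this kind really can be a few lines of code. The sketch below is purely hypothetical: the drill names and per-minute load values are invented placeholders (in a real tool they might come from historical sRPE or GPS-derived load in the club database), and the linear drill-by-drill sum is a deliberately naive first model to put in front of coaches.

```python
# Hypothetical MVP: estimate a session's load from the selected drills.
# Per-drill load-per-minute values are invented placeholders, not real data.

DRILL_LOAD_PER_MIN = {
    "small-sided game": 9.0,
    "passing circuit": 4.0,
    "finishing": 6.5,
}

def estimate_session_load(drills: list[tuple[str, int]]) -> float:
    """drills: (drill name, duration in minutes) pairs for the planned session."""
    return sum(DRILL_LOAD_PER_MIN[name] * minutes for name, minutes in drills)

session = [("passing circuit", 15), ("small-sided game", 20), ("finishing", 10)]
print(estimate_session_load(session))  # 4*15 + 9*20 + 6.5*10 = 305.0
```

The point is not the model’s accuracy – it is to have something concrete the S&C coaches can react to within days, not months.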
Then, based on the feedback I got from the S&C coaches, I could have adapted my ideas and concepts over several iterations – say, one a week – to build something that solved their “job-to-be-done” (i.e., preparing the next rehabilitation session), not something that I found fancy and technically advanced.
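The validation criterion stated above (“used for 5 consecutive sessions”) is also easy to make explicit. This is a hypothetical sketch under the assumption that each session is simply logged as used / not used; the function name and log format are mine, not part of any real tool.

```python
# Hypothetical validated-learning check: the hypothesis passes only if
# the coaches used the tool in 5 consecutive sessions.

def hypothesis_validated(usage_log: list[bool], required_streak: int = 5) -> bool:
    """usage_log: one boolean per session, True if the coach used the tool."""
    streak = 0
    for used in usage_log:
        streak = streak + 1 if used else 0
        if streak >= required_streak:
            return True
    return False

# Example: tool skipped once early on, then used five sessions in a row.
log = [True, False, True, True, True, True, True]
print(hypothesis_validated(log))  # True -> scale the tool; False -> adapt or kill
```

Writing the pass/fail rule down before building keeps the iteration honest: the loop ends with a decision (scale, adapt, or kill), not with a vague impression.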
Conclusion – Take-away
It is important for us, as practitioners, to understand that every choice we make, every project we decide to tackle, carries opportunities we are giving up. So, when we commit to a project, first we need to be clear about the true needs of our end-users (ideation phase). Second, we need to focus on rapid experimentation, because (i) we may realize that the identified need was not the right one and kill the project, and/or (ii) we can engage the coaches in the development, collect early feedback, and maximize our chances of building something relevant for them. I’m well aware that this sounds purely theoretical and is always easier said than done. Yet, isn’t this framework the best way to get us closer to the Grail?
If you liked the idea, in the coming weeks I’ll present some practical tools to help you in your quest for better challenges, and in your experimentation process, to learn faster and better. Meanwhile, I would be keen to hear your personal stories and how you handle project management in your club or organization. What tools do you use to prioritize projects (e.g., do you use backlogs)? How do you integrate end-users into the ideation or conception phases (e.g., do you build MVPs of reports before the real one, or do you go all-in on the build)? Do you collect feedback on the processes and products you are building? What are your success stories and your failures?
- Lacome M., Simpson B., Buchheit M. Monitoring training status with player-tracking technology. Aspetar Journal. 2018; 55-65 [Available here]
- Lacome M. GPS-related research in team sport – are we done or are there new windows of opportunity? Mathlacome.com. 2020 [Available here]
- Buchheit M. Chasing the 0.2. Int J Sports Physiol Perform. 2016;11(4):417-418. doi:10.1123/IJSPP.2016-0220
- Buchheit M. Want to see my report, coach? Aspetar J. 2017;(6).
- Jones B, Till K, Emmonds S, et al. Accessing off-field brains in sport; an applied research model to develop practice. Br J Sports Med. 2019;53(13):791-793. doi:10.1136/bjsports-2016-097082
- Collis D. Lean Strategy. 2016. Harvard Business Review.
- Grant H. How to become a great finisher. 2011. Harvard Business Review.
- Brown T. Design Thinking. Harv Bus Rev. June 2008:84-92.
- Knapp J. Sprint: How to Solve Big Problems and Test New Ideas in Just Five Days. 2016. Simon & Schuster
- Blank S. Why the Lean Start-up changes everything. Harvard business review. 2013