This conundrum exists because of the inherently slow nature of waterfall technology development: small changes to existing infrastructure cannot be accommodated by IT teams, either because they are always driving towards bigger, more fundamental changes or simply because they are hugely under-resourced relative to demand. Beyond these planning realities, IT teams feel that making small changes to existing content is pointless ‘because it is all going to change soon anyway’.
This is what gave rise to the growth of agile development methodologies, which are ultimately an attempt to do product and software development in a more iterative fashion, feeding directly off insight into customer needs or pain-points. However, whilst many companies have implemented these methods within their technology teams, they have left the rest of the organisation as it was. Marketing and business are still separate; they just throw their requirements over the wall to an agile team rather than a waterfall team.
The result of this is, in some ways, worse than what existed before: the concept of agile development is supposed to mean that the scrum team are autonomous, self-organising and own their customer experience, and yet in an enterprise environment this is rarely the case; they still get told what to do, they are just expected to deliver it faster.
This post is about Agile Analytics & Optimisation, but by that I do not mean ‘how to get Agile teams to do what your analytics team tells them’ and nor do I mean ‘how can we make an analytics team (in a BI or Marketing environment) work in a more Agile way’ – what I am aiming to show is how the Agile endeavour can itself become truly data-driven, analytical and experimental.
What is Agile?
For me, the concept of agile is ultimately about iterative customer collaboration – the perpetual ‘becoming’ of developing products or software gradually, based on both what customers want and/or need from them and what delivers value to the business, rather than on a strategy or grand narrative about the long-term vision for what that software should be. Now, if you had three customers you could simply put them in a room every week and ask them how they are using the product and what else they need, but when you have thousands or millions you cannot. This means that Agile relies on two things:
- Intelligence which provides an understanding of what customers want and need; how they use your product, the struggles they are having and the opportunities to make it more useful.
- Experimentation which allows you to test whether your interpretation of the findings of the first point is correct when delivered back to the customer.
Therefore, these skill-sets must be central to scrum ways of working and mindsets, otherwise agile becomes simply a different way of responding to business demand.
Why should Agile become Analytical?
So, becoming analytical benefits the scrum team because:
1. It is the foundation stone of iterative, continuous improvement
Iterative improvement is always improvement towards something, and that goal can only really be customer experience, the secondary benefit of which is business value. What customers want can occasionally be intuited, but real progress and innovation only happen through actually listening to real customers and understanding their needs.
2. It puts the scrum in control
Requirements can either come from the customer or they can come from the whim or desire of someone internal to the business. When the latter is the primary driver of the work flowing into the squad this generally results in a de-motivated team who feel disenfranchised from their work. If, conversely, the requirements are both data-driven and originate from within the squad, the squad gets the sense of control, autonomy and mastery of destiny which is the intention of Agile.
3. It delivers value
When executed effectively, a data-driven approach to product development will always deliver the most value, both to the customer and to the business, because it is focused on directing resources at where the greatest impact can be achieved. It doesn’t mean that intuition and creativity have no place, it simply points those efforts at the right part of the experience.
How does Agile become Analytical?
The short answer to this is that analytical skill-sets need to be embedded into the scrum team, and this happens in two ways:
- Embed digital analytics and optimization specialists into the scrum team
- Democratise insight and data and create a data-driven culture of experimentation to engage all team members into analytical ways of thinking
A digital optimisation analyst should be:
- Embedded – physically located with the squad or tribe if resources don’t allow for 1 per squad. They should be involved in all aspects of the daily life of the squad; stand-ups etc. They will particularly partner with the product owner/manager and also interface with any wider business stakeholders.
- Pro-active – in my experience, analytical types generally fall into one of two buckets: reactive, the kind of person who thrives in BI environments with structured processes and clear requirements to work to; or pro-active, whose more inquisitive nature suits an environment where they need to evangelise analytics and find their own projects. In agile the latter is essential, although unfortunately harder to come by than the former.
- Lean – they are uncovering customer pain points and suggesting fixes in a fluid, always-on manner, not (or maybe only very occasionally) going away into a black box to produce large presentations.
- Business-funded – from experience, analytics which is project-funded very rarely survives the project. As soon as budgets tighten due to e.g. scope creep, analytics will be the first thing to get the chop. The optimisation analysts therefore need to be Opex-funded by the business. If you need them to fill in time sheets, fair enough; however, they should not be paid for out of projects, because they will not end up doing very much.
- Product-specialists – the optimisation analyst is multi-skilled and covers all aspects of insight, analytics and optimisation, but is dedicated to a particular product team or area, and knows that product inside out in the same way the product owner does. It is virtually impossible to hire someone with all these skills out of the box, but if they are pro-active and analytical the rest generally comes fairly easily to them.
A culture of analytics and experimentation should be championed and driven by the analysts within their respective areas, and relies on:
- Clear KPIs and reporting of the benefits of features
- Gamification of the experimentation process – healthy competition about whose ideas deliver the greatest impact. Getting teams to vote on which variants of tests they believe will win is also a time-worn but incredibly effective way of creating this culture.
- Democratisation of tools/data and a focus on simplicity in those tools, for example ensuring that analytics implementation is clean and intuitive. The analyst should also regularly train the other team members on using these tools.
What does Agile Analytics & Optimisation look like?
The most important foundation stone of agile analytics and optimisation is a meaningful measurement framework and set of metrics and KPIs, which are entirely specific to the product in question and are essentially a description of what a productive customer experience looks like. The following example describes digital online self-serve help and support operations.
This [example] framework and its associated metrics provide the basis on which to use data to understand whether the product is achieving its customer experience goals and, if it isn’t, where to focus attention.
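As a sketch of what such a framework might look like when encoded for reporting, each KPI can be captured as a definition plus a target. The metric names and thresholds below are purely illustrative for a self-serve help and support product, not taken from any real framework:

```python
from dataclasses import dataclass

@dataclass
class Metric:
    """One KPI in the measurement framework (illustrative only)."""
    name: str
    definition: str
    target: float
    higher_is_better: bool

    def on_track(self, observed: float) -> bool:
        """Check whether an observed value meets the target."""
        if self.higher_is_better:
            return observed >= self.target
        return observed <= self.target

# Hypothetical KPIs for a digital self-serve help and support product
framework = [
    Metric("self_serve_resolution_rate",
           "share of support visits resolved without contacting an agent",
           target=0.70, higher_is_better=True),
    Metric("repeat_contact_rate",
           "share of resolved visits followed by a contact within 7 days",
           target=0.10, higher_is_better=False),
]

for m in framework:
    print(m.name, "on track:", m.on_track(0.65))
```

The point of making targets and directionality explicit like this is that ‘is the product achieving its goals?’ becomes a mechanical check rather than a debate.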
The following then describes the range of tasks that the optimisation analyst conducts. This does not represent a formal process, rather simply the flow of insight and analysis into experimentation and ultimately development:
Analytics should focus on 3 core areas:
- The detection of trends and/or anomalies in any of the aforementioned metrics, and root cause/insight analysis to determine the reasons and resolutions
- Pro-active deep dive insight analyses into specific metrics and their drivers. Ideally this is an ongoing analysis and will bring into play various forms of data such as session replays, heuristics, user research as well as digital analytics. Conversion Profiling is a good example of this kind of analysis.
- Monitoring and Listening. Every app store review and every verbatim comment in a VoC survey holds the potential to be a test hypothesis or simply a fix. If someone says they experienced an error on the checkout, can we quantify whether other customers experienced the same thing? If so, ticket it.
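The trend/anomaly detection in the first bullet can start very simply: flag metric values that deviate sharply from a trailing baseline. A minimal sketch (the window size and threshold are arbitrary choices, not recommendations):

```python
from statistics import mean, stdev

def flag_anomalies(series, window=14, threshold=3.0):
    """Flag indices whose value deviates more than `threshold`
    standard deviations from the trailing window's mean."""
    anomalies = []
    for i in range(window, len(series)):
        hist = series[i - window:i]
        mu, sigma = mean(hist), stdev(hist)
        if sigma > 0 and abs(series[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# e.g. a daily self-serve resolution rate that suddenly spikes
daily_rate = [0.49, 0.51] * 7 + [0.9]
print(flag_anomalies(daily_rate))
```

Anything this flags is only a prompt for the root cause analysis that follows; the detection itself proves nothing about why the metric moved.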
The output of all these analyses is a list of experiment hypotheses, articulated and prioritised in a standardised way which indicates which is likely to deliver the greatest impact with the lowest effort. Ideally hypotheses come from this data-driven approach, however it is important that ideas are democratised and can come from anywhere – every idea is valid, provided it is prioritised and processed in the same way.
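One lightweight way to do that standardised prioritisation is an ICE-style score (impact × confidence × ease, each rated 1–10). The hypotheses and ratings below are hypothetical, purely to show the mechanics:

```python
def ice_score(impact, confidence, ease):
    """ICE-style score: each input rated 1-10; higher is better."""
    return impact * confidence * ease

# Hypothetical backlog entries with illustrative ratings
backlog = [
    {"hypothesis": "Clearer error message on checkout reduces abandonment",
     "impact": 8, "confidence": 6, "ease": 9},
    {"hypothesis": "Reordering help topics lifts self-serve resolution",
     "impact": 5, "confidence": 4, "ease": 7},
]

# Rank the backlog so the highest-scoring hypothesis is tested first
ranked = sorted(
    backlog,
    key=lambda h: ice_score(h["impact"], h["confidence"], h["ease"]),
    reverse=True,
)
for h in ranked:
    print(ice_score(h["impact"], h["confidence"], h["ease"]), h["hypothesis"])
```

Because every idea, wherever it came from, passes through the same scoring, a pet idea from a stakeholder competes on equal terms with a data-driven one.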
This list then feeds into the testing process:
The most important factor in achieving volume and velocity of testing is ensuring that tests requiring developer/squad involvement (either because they need content or functionality building, or because a server-side testing approach is used) do not sit behind bigger and more interesting things in the main backlog. Capacity must be ring-fenced in order to ensure that testing activity is worked through continuously for its own sake.
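Once a test has run, evaluating it typically comes down to comparing conversion rates between control and variant. A minimal sketch using a two-sided two-proportion z-test (stdlib only; the sample numbers are made up):

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test on conversion counts.
    Returns (z statistic, p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled proportion under the null hypothesis of no difference
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Illustrative result: 200/1000 conversions in control, 260/1000 in variant
z, p = two_proportion_z(200, 1000, 260, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

In practice the testing tool will usually do this for you; the value of understanding the calculation is knowing why an under-powered test with a handful of conversions cannot be called a winner.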
Ultimately the experimentation process is what feeds the primary feature roadmap i.e. the learnings from the experimentation process often prove the value of a broader feature or set of features.
The whole point of Agile is to listen to customers and develop based on their needs, but for that to be successful the scrum team must be in control of this listening. What I have tried to show is that ‘listening’ really means ‘understanding customers’ frustrations and needs via data’ and ‘development’ really means ‘development driven by controlled experimentation’, which allows the team to truly validate what that listening is telling them.
But, above all else, the approach is about embedding the skills and resource within scrum, not taking a ‘requirements’ approach.