A few simple frameworks to pinpoint what the analytical needs of your organization are and how to make it more data-driven
Understanding your organization’s analytical maturity can give you a strong edge as a data professional. It will make your “non-analytical” decisions better informed (from “project prioritization” to “how to present your findings”) and help you formulate a long-term goal. And that’s truly an edge — not a lot of data professionals are taking this step back to design long-term goals (and even fewer are delivering against these long-term goals).
This article is split into 3 parts:
- Part 1: Understanding Analytical Maturity
- Part 2: Moving stages
- Part 3: What’s a “mature” organization
Let’s dive in!
Any organization (team, product, company, etc.) at a given time is at a certain stage of analytical maturity. Just like humans who “crawl, walk and then run”, organizations go through the same stages. It is one of those immutable laws of nature: everything goes through creation, development, and maturity.
There are a few interesting frameworks¹ regarding analytical maturity, with different components and emphases. From personal experience, I have found that looking at an organization through the following 4 components is the most useful and actionable:
- Its needs: Roger D. Peng and Elizabeth Matsui wrote in “The Art of Data Science” that there are 6 main types of questions: Descriptive, Exploratory, Inferential, Predictive, Causal, and Mechanistic. The types of questions you are being asked are usually a great indicator of the maturity level of your org — a low-maturity org will be mostly interested in descriptive and exploratory data studies, vs an advanced-maturity org that will ask more predictive and causal questions.
- Its people: another key component of analytical maturity is the people, in terms of both capabilities and capacity — how many data professionals the organization has, and how strong their skills are.
- Its tools & processes: are there standardized tools for the data professional? Do we have standardized processes (e.g. prioritization, templates, etc.) for the data team?
- Its culture: what’s the split between intuition and data in decision-making?
Depending on how your organization scores on each of those components, it will fall into one of these 3 stages:
- The launch stage: At this stage, the organization needs basic reporting to know what has already happened (“in hindsight”). There is no central data team, you might not even have data analysts — data studies are being done by a few data-savvy operators on top of their 9–5 jobs. There is also no tooling, no process, and no clear agreement on what lens should be used when looking at a particular phenomenon. This leads to a lot of noise (e.g. multiple teams with different churn definitions that lead to disagreement down the road). On the cultural side, while everyone agrees that data should inform the decision-making process, due to the lack of data (or the mistrust in the data) a lot of decisions are being made via “informed gut feeling”.
- The development stage: The organization has good visibility of its market and of some of the key metrics it should be tracking. Now it needs to understand why things are evolving in a certain way (“insight”). Teams start being supported by data professionals (either embedded inside them or within a centralized data team). The data infrastructure is slowly shifting from Google Spreadsheets to more robust tooling. To triage and prioritize all of the data requests, the first few data professionals establish basic prioritization principles and a ticketing system (e.g. a Google Form). Common lenses are being adopted across teams, and as a result, data is more and more relied upon for decision-making. Non-data professionals become smarter about what data questions to ask, and with the new tooling, they can start looking at the data themselves.
- The maturity stage: The organization understands why things are moving in a certain way and can now predict and influence future changes (“foresight”). Centralized data teams start forming, acting as proactive thought partners (vs “reactive support” from the previous stage). Tools, processes, and metrics are being standardized. Data is expected in every decision-making process.
The image above is a simplification of real life. In reality, organizations can score very differently on each component — but you get the gist of it. The beauty of this framework is that:
- It gives you a structured way to discover the critical factors hindering your organization’s analytical growth.
- It allows you to pinpoint where your org is in its journey — and what’s next for it.
That’s really why you get a strong edge when you know how to use this framework: it gives you a way to understand where you are and where you could be, and to diagnose why you are not there yet. Your job then is “only” to set a strategy to remove the roadblocks — which is exactly what we’ll see in the next part.
Richard Rumelt in “Good Strategy Bad Strategy” wrote: “The core of strategy work is always the same: discovering the critical factors in a situation and designing a way of coordinating and focusing actions to deal with those factors”.
That’s also true when you want to grow your organization’s analytical maturity: you need to pinpoint the critical factors that will help you move to the next stage and design a plan to get there. The framework we saw above — the one that broke down analytical maturity into 4 components: the needs of an organization, its people, its tools & processes, and its data culture — can help you pinpoint the gaps in your organization. But pinpointing is only 20% of the job. Let’s discuss the remaining 80%.
Good strategy, bad strategy framework
I love Richard Rumelt’s book and I think it gives a superb framework to think about this. He explains that a good strategy has 3 elements:
- A diagnostic: the most important part of the framework is the diagnostic — it is the basis of your whole logical approach. Your diagnostic should allow you to understand not only the current situation, but also the cause: “why” the organization is there.
- Some guiding principles: from this diagnostic, you can derive some guiding principles — principles that, once you are on your journey to grow analytical maturity, will make your decision-making process easier and help you stay on track over time.
- A coherent action plan following the above: Armed with your diagnostic and your guiding principles, your main task is to decide where you want to be under what timeline, and how you will get there.
Starting with a diagnostic
“A well-stated problem is half-solved” John Dewey
The idea is to understand the current situation and the true “why” behind it. You don’t want to tackle symptoms — your goal is to go all the way to the root cause and fix what needs to be fixed.
Here are a few tips on how to do a good diagnostic:
- Start from the 4 dimensions we saw previously (needs / people / tools & processes / culture): assess your organization through this lens and get to the root cause in each of those areas.
- Get data on current pain points and solutions:
- Interview people: get to know people, their jobs, their decision-making process, and how they use the data in their day-to-day job
- Shadow people: similarly, shadowing people can be a great way to get a deeper understanding of their day-to-day jobs, and can uncover insights that you wouldn’t have if you were to only interview them
- Send a survey: depending on the size of your organization, sending a survey can be helpful for you to get more quantitative data. Bonus: it can also allow you to start tracking the feelings of your org toward “analytics”, and give you a benchmark you can report against later on.
- Do a “literature review”, both internally (review previous work to understand how people tried to solve these pain points before, whether they were successful, and why) and externally (a lot of content is available for free on the web, and the issues you are thinking about have most likely been documented and discussed before, whether in a polished HBR article or on an obscure forum for analytics aficionados). It is always extremely helpful to get other people’s perspectives on how to solve different problems.
- Practice the 5 “whys”: ask yourself why every time you uncover a new insight. You want to take a bird’s eye view of things and understand the key reasons for the situation the organization is in. Note that it is not necessarily an easy task, especially if you have been inside the company for a long time and you are used to things the way they are.
Deriving guiding policies
“Everyone has a plan until they get punched in the mouth” Mike Tyson
The diagnostic will uncover some patterns which should allow you to derive guiding principles. Those guiding principles will come in handy in a couple of different situations:
- When defining your action plan: think of those as “guardrails” on the freeway: they will allow you to always stay on track and to make sure the issue you diagnosed will get solved
- When faced with a situation that you were not expecting: you can use your different principles to facilitate and guide your decision-making — that will give you incredible peace of mind
- When making trade-offs or saying no to stakeholders: saying no is always complicated — but this is essential for a good strategy. By making your principles clear and having your stakeholders agree to them, pushing back on their requests will be an easier pill for them to swallow.
The hardest part of the guiding principles is sticking to them — just like in life.
Setting up an Action Plan
This action plan needs to be coherent and cohesive and cover the different components of analytical maturity.
How to set up an action plan:
- Find subject matter experts in the organization you are supporting, and work with them on the plan:
- Walk them through your diagnosis and your guiding principles, and brainstorm with them on what should be the next steps and over which time frame.
- If you are in a fast-paced organization, consider optimizing for optionality — giving you time to move the maturity of the organization but also to be able to answer “fire drills” or time-sensitive questions.
- Think outside of your organization: if you are supporting one part of a larger company, also think about how you will interact with the other analytical functions, and have that added to your plan
- Set up success criteria: whenever qualitative work is being done, don’t forget to set success criteria. Like any other work, you should be able to say whether it was a success once you’re done. So set binary success criteria that will tell you how you did — and put some thought into them, making sure they properly represent what you are trying to solve.
- Set up reporting processes and timeline: doing the work is important, but if nobody knows about it or uses what you built, are you really creating value? Setting up a proper reporting process will allow you to achieve multiple goals at once:
- Gives visibility to your work to a larger audience and facilitates collaboration opportunities
- Facilitates a Go-To-Market strategy for your new analytical product (as you have a venue to advertise your new dashboards and reports)
- Ensure Leadership buy-in: you can’t build a culture around data without the support of your leads. Present the plan to them and garner their support to ensure smooth sailing towards your goal
The formula for success
The FS newsletter shared this tiny thought the other day:
“The recipe for success:
- The courage to start.
- The discipline to focus.
- The confidence to figure it out.
- The patience to know progress is not always visible.
- The persistence to keep going, even on the bad days.”
Ultimately — this is what it all boils down to. You need the courage to start the conversation around your organization’s analytical maturity and where it should be, the discipline to develop your action plan (while addressing the immediate fire drills), the confidence to find the right solution despite potential naysayers, and the patience and persistence to keep moving forward, even on the bad days.
And hopefully, you’ll reach the end goal: building an Analytically Mature Organization
I have been talking a lot about Analytically Mature Organizations (AMOs) and we’ve seen how to grow one — but I never concretely outlined what an analytically mature organization is, and why it is so great. So here is part 3 — with concrete examples of what a mature analytical organization does differently!
An AMO is an organization that understands the complex dynamics of its market, and which activities can influence them.
Analytically mature organizations have clear visibility into how their activities (“input metrics”) drive short-term results (“output metrics”) which in turn drive long-term outcomes (“outcome metrics”).
- Example: an analytically mature marketing org will know the impact of sending promotional emails (input: # of sent emails) on driving new sign-ups (output: # of sign-ups), and to what extent those sign-ups will convert to paying users down the line (outcome: # of paid users). They will use the different ratios (e.g. sign-ups per email sent) and benchmark their different campaigns against each other, helping them improve their craft.
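To make the ratio arithmetic concrete, here is a minimal sketch of how those funnel ratios could be computed. The campaign names and counts are made up for illustration:

```python
# Sketch: computing input -> output -> outcome ratios per campaign.
# Campaign names and numbers are hypothetical.
campaigns = {
    "spring_promo": {"emails_sent": 50_000, "sign_ups": 1_200, "paid_users": 180},
    "win_back":     {"emails_sent": 20_000, "sign_ups": 300,   "paid_users": 90},
}

for name, m in campaigns.items():
    sign_up_rate = m["sign_ups"] / m["emails_sent"]  # output per input
    paid_rate = m["paid_users"] / m["sign_ups"]      # outcome per output
    print(f"{name}: sign-up rate {sign_up_rate:.2%}, paid conversion {paid_rate:.2%}")
```

In practice these numbers would come from the org’s email and billing systems; the point is that each link in the chain (input, output, outcome) gets an explicit ratio that can be compared across campaigns.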
Mature orgs will also have a clear understanding of the key factors influencing their topline metrics. They can seamlessly perform root cause analysis, to understand the evolution of these top metrics, and take corrective actions.
- Example: a sales org will be able to determine which channels and customer segments to prioritize based on where there might be headwinds or lucrative opportunities. They have perfected their investigation process — to the point they were able to automate it, and at this stage, an algorithm directly surfaces the right insight to the right people.
The data needs have shifted toward more “complex” questions, such as opportunity sizing and causal impact tracking: harder questions that require deep domain expertise as well as advanced statistical methodologies.
- Example: an analytically mature HR org will want to start looking into what drives employee retention and/or success — and to do so, it will run causal impact analyses to extract the key factors that predict success.
An analytically mature organization is an organization where a few specialized data teams are collaborating.
- The whole backbone of an analytically mature organization relies on clean data — which is why, in an analytically mature organization, you have data engineers who create pipelines, datasets, and databases, and who commit to strict rules and service-level agreements (SLAs) so that the data can be easily consumed by the different downstream teams (such as data science or business intelligence).
- You also have product managers, working alongside those data engineers to make sure the right databases are being built to solve the most pressing pain points of the organization, and building tools to improve data discoverability (which is, even in a very mature organization, always a complicated topic).
- You have data scientists who consume all this data and turn it into deeper insights for product and business users — allowing the organization to make better decisions. They are usually a pretty central team, with their work influencing the roadmaps of both the upstream and the downstream teams (i.e. their needs will influence the roadmap of the data engineering team, and their findings will usually influence the work of other analyst teams).
- Finally, you have business/data/financial analysts, who support both strategic decisions and day-to-day operations.
To give a concrete example, consider a large retailer:
- Data engineers will build the right pipeline to make sure we have daily databases, with the store name, its location, its inventory, the # of sales per item, etc.
- Data scientists will use those databases to run “market basket analysis” — to uncover which items are the most bought together.
- Business Analysts will take those findings and look into how to operationalize them within the different stores. They will build metrics to track the “operationalization” (and potentially set OKRs for the different stores against those).
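A market basket analysis can start as simply as counting which item pairs co-occur in transactions. Here is a minimal sketch; the transactions are made up, and a real version would read from the daily sales tables the data engineers maintain:

```python
from collections import Counter
from itertools import combinations

# Hypothetical transactions; each basket is the set of items bought together.
transactions = [
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"milk", "cereal"},
    {"bread", "butter", "cereal"},
]

# Count how often each item pair appears in the same basket.
pair_counts = Counter()
for basket in transactions:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# The most frequently co-purchased pair.
top_pair, count = pair_counts.most_common(1)[0]
print(top_pair, count)  # ('bread', 'butter') co-occur in 3 baskets
```

Real-world versions add support/confidence/lift thresholds (as in the Apriori family of algorithms), but the co-occurrence count above is the core idea the business analysts then operationalize in stores.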
An analytically mature organization is an organization with robust tools and standardized processes — allowing the different teams to derive insights faster and with a higher level of quality.
- In an AMO, robust data governance processes have been implemented, making it easier for people to consume data. Analysts don’t have to spend hours double-checking each data source — they can trust a few certified databases and metrics, which greatly saves their precious time.
- Multiple tools have been built (or implemented) to standardize typical data studies — which leaves less room for error for the individual contributors and allows more people to get the insights they need.
- Example: instead of having to run the statistical tests for your A/B experiments yourself, you have a tool into which you just input the data, and it runs them for you automatically.
- Similarly, from a project management perspective — the usual “steps” of a study have been mapped, formalized & standardized (everything from the prioritization decision-making process to the internal go-to-market of a study). Thanks to those formalized processes, it is easier for the org to understand who is doing what, and how to collaborate with the different data teams.
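As an illustration of that kind of standardization, such an A/B-testing tool could wrap a standard two-proportion z-test. Here is a minimal stdlib sketch; the function name and the conversion counts are hypothetical, and a production tool would add guardrails such as sample-size and validity checks:

```python
from math import erf, sqrt

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test with a pooled variance estimate."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF (via the error function).
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: 120/2000 conversions in A vs 150/2000 in B.
z, p = two_proportion_z_test(conv_a=120, n_a=2_000, conv_b=150, n_b=2_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

Hiding this behind a simple "paste your counts" interface is exactly what removes room for error for individual contributors: everyone runs the same test, with the same assumptions, every time.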
Finally, an analytically mature organization is an organization where everyone is data-savvy.
- As knowledge management has been a priority (and not just an afterthought), people find it easy to find resources and support to answer their data requests.
- There are also a few inspiring and tenured “data leaders” who have started organizing an internal “data aficionado” ecosystem (more on that in the following article!)
- Internal training is available and upskilling people — no matter where they are on their data journey
- Data forums are “cool” — they are where great conversations happen and big decisions are made. Data teams are considered “thought partners” and are brought to the table when key decisions are being taken. Every decision is informed, if not driven, by data.
In summary, you have a very well-oiled machine. Everything is set up so that the data teams can focus on generating quality insights, and the barrier to entry for data usage has been lowered, allowing interested individuals to start deriving insights and improving their day-to-day jobs. It is a utopia.
This article was cross-posted to Analytics Explained, a newsletter where I distill what I learned at various analytical roles (from Singaporean startups to SF big tech), and answer reader questions about analytics, growth, and career.
¹: Analytics Maturity Models: An Overview by Karol Król and Dariusz Zdonek