Building intelligence from project features, by Alapon Sen
How is your organisation’s strategy impacting your workforce? Beyond the usual metrics of throughput volumes and project completion times, a curious management team might wonder where innovation is taking place and where teams are duplicating effort. This is something we’ve often wondered at dunnhumby, where we have 400+ data scientists working across 28 countries on consulting deliveries. So we developed a new centralised project tracking system to answer these questions.
Taking a data-science-inspired approach, we built an Intelligent Project Measurement System (IPMS) at dunnhumby to track the projects of our data scientists. In this article, we discuss the general components of an IPMS, give examples of the insights such a system can yield, and describe the organisational factors that drive its success.
Components of a typical IPMS
Every IPMS should have the following components:
(1) A stable database of project features
(2) An integrated project delivery process
(3) An inspiring front end for users
(4) A dynamic dashboard to be used by management
These components are discussed in detail below.
1. A stable database of project features
We are looking for a feature set of project attributes that goes beyond Gantt charts and burn-rates. For example, we might want to store which machine learning algorithm was used in a project and who the stakeholders are. These features need to be logged within the system so that we can keep track, centrally, of what we’re delivering to clients. If your project management systems provide this feature set, that’s great. Otherwise, you need to build a custom architecture for storing this data. Plenty of technologies are available for this, such as PostgreSQL.
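To make the idea concrete, here is a minimal sketch of what a project-features table might look like. The column names are hypothetical, not dunnhumby’s actual design, and SQLite stands in for a production store such as PostgreSQL:

```python
import sqlite3

# In-memory SQLite as a stand-in for a production database like PostgreSQL.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE projects (
        project_id   INTEGER PRIMARY KEY,
        name         TEXT NOT NULL,
        client       TEXT,
        algorithm    TEXT,   -- e.g. 'gradient boosting'
        stakeholders TEXT,   -- comma-separated for simplicity
        started_on   TEXT,
        completed_on TEXT
    )
""")

# Log one project with attributes that go beyond Gantt charts.
conn.execute(
    "INSERT INTO projects (name, client, algorithm, stakeholders, started_on) "
    "VALUES (?, ?, ?, ?, ?)",
    ("Demand forecast", "RetailCo", "gradient boosting", "Alice,Bob", "2021-03-01"),
)

# Later, anyone can ask: which algorithm did that project use?
row = conn.execute(
    "SELECT algorithm FROM projects WHERE name = 'Demand forecast'"
).fetchone()
print(row[0])  # gradient boosting
```

Storing attributes like the algorithm and stakeholders as first-class columns is what makes the later analyses queryable, rather than buried in free-text documents.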
2. Integrated project delivery process
No matter which delivery method you have adopted (agile, waterfall, or a mix of both according to the needs of the organisation), all projects follow a journey from inception to completion.
In our IPMS, we encourage data scientists to record project briefs, methodology reviews, cost reviews, scope changes (when necessary), delivery milestones, retrospectives, and so on. A clearly defined delivery process helps keep teams thinking about the right questions during the different stages of their project.
We’ve tightly integrated our IPMS with our cloud data science platform so that cloud compute costs can be tracked for individual projects and users. When data scientists start a new project, they must create it in the IPMS before they can start a cloud instance. Thanks to this feature, we now have much greater visibility of our compute costs.
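The gating logic described above can be sketched in a few lines. The function names and project IDs are illustrative, not the real platform API:

```python
# Hypothetical gate: a cloud instance can only start once the user has
# registered the project in the IPMS, so compute costs are always
# attributable to a known project.
registered_projects = {"proj-101", "proj-205"}  # projects logged in the IPMS

def can_start_instance(project_id: str) -> bool:
    """Allow compute only for projects already logged in the IPMS."""
    return project_id in registered_projects

def start_instance(project_id: str) -> str:
    if not can_start_instance(project_id):
        raise PermissionError(
            f"Register {project_id} in the IPMS before requesting compute."
        )
    # Tagging the instance with the project ID is what makes
    # per-project cost tracking possible downstream.
    return f"instance started, costs tagged to {project_id}"

print(start_instance("proj-101"))
```

The key design choice is that the check happens before provisioning, so untracked compute simply cannot exist.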
3. An inspiring front-end
The front-end is used by data scientists to log updates about the different stages of their projects. The quality and completeness of the data captured will drive much of the analysis later, which is why we need an inspiring and easy-to-use front-end. Utmost care must therefore be taken around aspects such as input formats and overwriting protocols, to make the system as resilient as possible against manual entry errors. UX considerations go a long way towards making it easy, even pleasurable, for teams to add documentation. For example, if projects are similar, you can save users’ time by allowing them to duplicate a past entry and change only the attributes that differ, e.g. dates.
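The duplicate-and-override idea is simple to express. A minimal sketch, with invented field names and project data:

```python
from copy import deepcopy

def duplicate_project(template: dict, **overrides) -> dict:
    """Copy a past project entry, changing only the attributes that differ."""
    new_entry = deepcopy(template)
    new_entry.update(overrides)
    return new_entry

# A past entry the user wants to reuse (data invented for illustration).
past = {
    "name": "Promo uplift model",
    "client": "RetailCo",
    "algorithm": "logistic regression",
    "started_on": "2020-05-01",
}

# Only the start date changes; everything else carries over.
new_entry = duplicate_project(past, started_on="2021-02-15")
print(new_entry["algorithm"], new_entry["started_on"])
```

Because everything carries over by default, the documentation for the new project starts out as complete as the old one, which directly improves data quality.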
4. A dynamic dashboard
A dashboard helps management keep track of what is happening across the organisation. There will be standard reporting requirements from time to time, such as project volumes and productivity metrics, and an efficient reporting dashboard can take care of these. More importantly, with time, more sophisticated needs will arise, such as search functionality or very team-specific KPIs; for example, a modelling team might want a leaderboard of forecasting-model accuracies. Visualisation technology choices should therefore allow the dashboard to evolve rapidly and easily, and to be extended by a whole community, not just one development team.
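As an example of such a team-specific KPI, the leaderboard of forecasting-model accuracies mentioned above is a one-liner over the project data. The records here are invented for illustration:

```python
# Accuracy figures logged against projects in the IPMS (invented data).
results = [
    {"model": "seasonal-naive", "accuracy": 0.84},
    {"model": "gradient-boost", "accuracy": 0.91},
    {"model": "arima",          "accuracy": 0.87},
]

# Rank models from best to worst accuracy.
leaderboard = sorted(results, key=lambda r: r["accuracy"], reverse=True)
for rank, r in enumerate(leaderboard, start=1):
    print(rank, r["model"], r["accuracy"])
```

The point is less the sorting than the fact that, once project features sit in one database, any team can layer its own views on top without waiting for a central development team.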
So now that we have covered the basics, what intelligence can we gather from this measurement system? At dunnhumby, we have been reporting performance metrics from our IPMS to the executive leadership team in an easy, automated manner for over a year now. These include metrics on volumes and efficiencies, as well as team feedback on projects.
But remember, we built this to glean deeper insights. From a well-designed system, we could even attempt the following:
- Given client situations and a set of assumptions, what could be a benchmark time for a project delivery? This can have implications for future staffing or pricing decisions.
- You could take a sample of projects and use machine learning to understand what drives the success of a project. Is success related to the use of specific algorithms, or to the time allocated?
- Similarly, we can understand what factors contribute to the cost of a project or the creation of technical debt, and work out mitigations. Is it inefficient coding, or a legacy solution that no longer delivers the same ROI?
- A more obvious “knowledge management” type of query could be looking for methods that have solved a similar problem before — why reinvent the wheel? The right search protocols can give us these answers, and they can even feed into a traditional KM ecosystem.
- Often, investment decisions in new tools and technologies need to consider past requirements and use cases. The database of projects could again be a perfect inventory to dip into for such fact-finding.
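The first analysis above — a benchmark delivery time — can be sketched directly from the project records. A toy illustration with invented data, grouping by project type and taking the median duration:

```python
from collections import defaultdict
from statistics import median

# Past project records pulled from the IPMS (invented for illustration).
projects = [
    {"type": "forecasting",  "days": 30},
    {"type": "forecasting",  "days": 42},
    {"type": "segmentation", "days": 21},
    {"type": "forecasting",  "days": 36},
    {"type": "segmentation", "days": 25},
]

# Group delivery times by project type.
by_type = defaultdict(list)
for p in projects:
    by_type[p["type"]].append(p["days"])

# Median per type is a simple, outlier-resistant benchmark that can
# feed staffing or pricing decisions.
benchmarks = {t: median(days) for t, days in by_type.items()}
print(benchmarks)
```

With richer features in the database (client situation, algorithm used, team size), the same grouping idea extends naturally to conditional benchmarks or a regression model.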
Organisational success factors
As you can see, the possibilities are many with the right level of configuration. So, what are some of the organisational success factors that will keep your IPMS in good health?
- Creating a culture of measurement and compliance early on is vital. Remember the old saying, “garbage in, garbage out”? The cleaner and more complete the data, the better positioned you are to derive insights from it. Keeping people engaged and interested in tracking their work in the IPMS is key to getting high-quality data.
- The system will evolve over time. So start small, but with intent and focus. Make the system visible to as many stakeholders as possible and be open to feedback from the community to continuously improve it. Learning from the data should also help you tweak the front-end and identify what additional project features to collect.
- Users should be supported in accessing their data, so that they can take action. Managers will be a key user group of this system. And, if necessary, these super users should be able to query the database directly (beyond the available reporting dashboards) to find out more.
- While a greater degree of detail about projects will drive the success of this system, care must be taken not to store sensitive client, customer, or employee information that can be queried at will.
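One simple safeguard for the last point is to strip a configurable set of sensitive fields before a record ever reaches the database. A minimal sketch, with hypothetical field names:

```python
# Fields that must never be stored in the queryable project database.
# The names here are illustrative, not an actual policy list.
SENSITIVE_FIELDS = {"client_contact_email", "employee_salary_band"}

def redact(record: dict) -> dict:
    """Return a copy of the record with sensitive fields removed."""
    return {k: v for k, v in record.items() if k not in SENSITIVE_FIELDS}

raw = {
    "name": "Churn model",
    "algorithm": "random forest",
    "client_contact_email": "someone@example.com",  # must not be stored
}
safe = redact(raw)
print(sorted(safe))  # ['algorithm', 'name']
```

Centralising the rule in one function (rather than trusting each form or script) makes the policy auditable and easy to extend.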
We have seen many benefits at dunnhumby because of this system:
- Decisions on efficiency improvements, tech migrations, and delivery benchmarks are more data-driven than ever.
- Past initiatives on a given theme can easily be referenced for best practices through a search function. This also helps identify experts in different subject areas.
- Front-end features such as project duplication make documentation quicker and more complete.
While we have spoken about the IPMS in the context of a data science company, the principles of building and using one, and the benefits it brings, apply to any organisation that handles large volumes of projects.
Just as businesses have discovered with customer and transaction data, using project-delivery data can be truly empowering. Responsible, non-intrusive use of this data, and allowing users to access their own data and create intelligence from it, can improve working lives. As the examples above show, the insights gained can help businesses and teams be smarter about the way they work.