How can Data/Analytics and AI/ML help us make better business decisions? How can we unlock the potential in our data?
Given a relatively volatile set of macroeconomic, socio-political, and geographical (e.g., supply-chain) business uncertainties on the one hand, and the promise of new technologies and the possibilities they open on the other, how can we capitalize on these potential opportunities?
Opportunities lie in the latent space between uncertainty and certainty, where data remains locked with the potential to be unleashed into innovative new solutions. How do you think upcoming technologies like AI/ML and Data/Analytics can help us simplify the complex organizational decision-making process?
Decisions require context
The richer the context, and the more contextually enriched data that stitches together with stronger probability, the more probable the conclusions.
Data finds data, building a web of ever-deeper context. That web yields an increasingly rich evidence base that makes a suspected trend increasingly probable; certainty rises, uncertainty falls, and the distant shadow of a definitive direction gradually takes shape as a decision we can hold with firm confidence.
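This intuition, that each corroborating piece of context raises the probability of a conclusion, can be made concrete with a simple Bayesian update. Here is a minimal sketch in Python; the hypothesis, the evidence names, the prior, and the likelihood ratios are all made-up numbers for illustration, not real data:

```python
# Illustrative sketch: each piece of corroborating context updates our
# confidence in a hypothesis (e.g., "demand for product X is trending up").
# Priors and likelihood ratios below are hypothetical.

def bayes_update(prior: float, likelihood_ratio: float) -> float:
    """Update P(hypothesis) given evidence with the stated likelihood ratio
    (P(evidence | hypothesis) / P(evidence | not hypothesis))."""
    odds = prior / (1.0 - prior)
    posterior_odds = odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

# Start uncertain, then stitch in three independent pieces of context.
evidence = {
    "search volume up":       3.0,   # hypothetical likelihood ratios
    "supplier lead times ok": 1.5,
    "competitor stock-outs":  2.0,
}

p = 0.50  # prior: a coin flip
for name, lr in evidence.items():
    p = bayes_update(p, lr)
    print(f"after '{name}': P = {p:.2f}")
```

Each new piece of context multiplies the odds, which is exactly the "data finds data" effect: independent signals compound, and confidence climbs from a coin flip (0.50) to 0.90.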
But how do we get there? Here are some suggested steps.
- Get the data house in order
- Leverage the data cloud to build a data lake: real-time streaming (e.g., Cloud Pub/Sub into Google BigQuery) plus batch processing into a lake storage layer such as BigLake.
- Clean, curate and label data for AI/ML training
- Publish features in a feature store, like Vertex AI Feature Store. This can become the governed gold standard for multiple use cases across the organization, to be used and reused by multiple teams.
- Build end-to-end pipelines: one that automates data flow into the data lake, and others that produce feature-store outputs for training.
- Gain insights into your data through visualization and analysis with tools like Looker and Data Studio.
- Use the formal construct of an Experiment in Vertex AI to try out various algorithms and optimization techniques.
- Evaluate the models produced and explore AI explainability for them.
- Publish the models in a Model Registry like Vertex AI Model Registry.
- Choose the model versions you wish to advocate and take them through the governance process, perhaps via CI/CD, to push into production.
- Use a pipeline to deliver models from the Vertex AI Model Registry, after metadata exploration, to a production endpoint.
- Monitor the endpoint using Vertex AI Model Monitoring.
- Serve using a managed endpoint, a GKE cluster, or GCE, depending on your organizational standards, but automate the process using Vertex AI Pipelines.
- As you monitor, re-trigger the data, training, and deployment pipelines, with the relevant governance processes implemented through Vertex AI Pipelines.
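Stitched together, the steps above form a repeatable loop: data pipeline, features, training, evaluation, registry, deployment, monitoring, re-trigger. Here is a deliberately simplified pure-Python sketch of that orchestration; the function names and the in-memory `ModelRegistry` are illustrative stand-ins for Vertex AI Pipelines components, the Feature Store, and the Model Registry, not real API calls:

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class ModelRegistry:
    """Stand-in for Vertex AI Model Registry: versioned, governed models."""
    versions: dict = field(default_factory=dict)

    def publish(self, name: str, version: int, metrics: dict) -> None:
        self.versions[(name, version)] = metrics

def data_pipeline(raw: list) -> list:
    # Stand-in for ingest + clean/curate (Pub/Sub -> BigQuery -> BigLake).
    return [x for x in raw if x is not None]

def build_features(rows: list) -> dict:
    # Stand-in for publishing to a feature store: reusable, governed features.
    return {"mean": sum(rows) / len(rows), "n": len(rows)}

def train(features: dict) -> Callable[[float], float]:
    # Stand-in for a Vertex AI training job: this "model" predicts the mean.
    mean = features["mean"]
    return lambda _x: mean

def evaluate(model, holdout: list) -> dict:
    mae = sum(abs(model(x) - x) for x in holdout) / len(holdout)
    return {"mae": mae}

def run_pipeline(raw, holdout, registry, name="demand-model", version=1):
    rows = data_pipeline(raw)
    features = build_features(rows)
    model = train(features)
    metrics = evaluate(model, holdout)
    registry.publish(name, version, metrics)  # governance gate would go here
    return model, metrics
```

In a real deployment, each function would become a pipeline component (e.g., a KFP component executed by Vertex AI Pipelines), and `publish` would upload a versioned model to the actual Model Registry behind a CI/CD approval gate.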
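The monitor-and-retrigger step in particular deserves a sketch: compare live feature statistics against the training baseline, and re-trigger the pipelines when drift crosses a threshold. The mean-shift metric and the 0.25 threshold below are illustrative assumptions; Vertex AI Model Monitoring provides richer drift and training-serving skew detection in production:

```python
import statistics

DRIFT_THRESHOLD = 0.25  # hypothetical governance-approved threshold

def drift_score(baseline: list, live: list) -> float:
    """Mean shift normalized by the baseline's standard deviation --
    a deliberately simple stand-in for production drift metrics."""
    sd = statistics.pstdev(baseline) or 1.0
    return abs(statistics.mean(live) - statistics.mean(baseline)) / sd

def should_retrain(baseline, live, threshold=DRIFT_THRESHOLD) -> bool:
    # In production this decision would re-trigger the data, training,
    # and deployment pipelines via Vertex AI Pipelines.
    return drift_score(baseline, live) > threshold

baseline = [10.0, 11.0, 9.0, 10.0]
stable   = [10.1, 9.9, 10.0, 10.2]
shifted  = [14.0, 15.0, 13.5, 14.5]

print(should_retrain(baseline, stable))   # False: live data matches the baseline
print(should_retrain(baseline, shifted))  # True: the mean has shifted, so retrain
```

The same boolean, surfaced from a monitoring job, is what would gate the automated re-run of the data, training, and deployment pipelines.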
What innovations can these technologies unlock?
What new business models are possible?
What new possibilities open up for innovative business strategies?
What are your thoughts? I’ll comment on the above in my next blog entry.