Source: Pega
Start Small and Specific, Then Scale Up
Businesses looking to map their customer journey with AI-driven systems and processes can start by focusing on Narrow AI – algorithms that answer only a specific, well-defined set of questions. This delivers a deeper and richer customer experience early on.
Think of Narrow AI as a specialist – like a neurosurgeon who is excellent with brain injuries but clueless when consulted about stomach pain. Similarly, an organization's first AI models can be specialized. As they grow in number, complexity, application, and usage, they can be bridged and functionally updated to know when to hand off interactions to a different AI or to a human.
Mobile carrier Sprint realized a 10% reduction in subscriber churn using a real-time AI they had up and running in only 13 weeks. Agile and DevOps approaches with a quick-win focus can bring results like these.
A common (and excellent) example is the use of chatbots to cut down on customer wait times. These can start as simple, response-coded bots that incorporate ML as data becomes available. They can then evolve into complex chatbots that reference the customer’s history of interactions, including purchases, complaints, and external touch points, to craft appropriate responses. The bots can also recommend solutions or products to the accounts most likely to purchase at relevant times. This frees support and service staff to address higher-level customer issues.
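As a loose illustration of that progression, the sketch below (in Python, with a hypothetical response table and a stubbed ML classifier, none of which come from the source) shows a bot that starts with coded responses, defers to an ML intent model once one exists, and hands low-confidence conversations to a human agent along with the customer's history.

```python
# Minimal sketch of a chatbot that starts with coded responses and
# grows into ML-assisted handling. All names here are hypothetical.

CODED_RESPONSES = {
    "store hours": "We're open 9am-9pm, Monday through Saturday.",
    "return policy": "You can return any item within 30 days with a receipt.",
}

def ml_intent_classifier(message: str) -> tuple[str, float]:
    """Placeholder for a trained model added once enough data exists.
    Returns a predicted intent and a confidence score."""
    return "unknown", 0.0  # stub until a real model is trained

def respond(message: str, customer_history: list[str]) -> str:
    # Phase 1: simple keyword matches against coded responses.
    for keyword, reply in CODED_RESPONSES.items():
        if keyword in message.lower():
            return reply

    # Phase 2: once data is available, defer to an ML intent classifier.
    intent, confidence = ml_intent_classifier(message)
    if confidence >= 0.8:
        return f"Here's what I found about {intent}."

    # Phase 3: low confidence -> hand off to a human agent with context.
    return ("Let me connect you with an agent who can help. "
            f"(Sharing your last {len(customer_history)} interactions.)")

print(respond("What is your return policy?", customer_history=[]))
```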
AI-based deep learning is compute- and storage-intensive. As more data comes in, enterprises will need multi-GPU parallel processing for training and testing AI models (a rough sketch follows the list below), along with reliable, flexible storage to access and mine large datasets. The training process is iterative, so the total AI workload grows over time. This means the enterprise will need an all-encompassing AI/ML solution that:
Ensures hardware and software can be managed and upgraded whenever necessary without disrupting a running model
Delivers local compute for IoT and edge devices
Doesn’t need re-architecting as algorithms or data sources scale and the model grows in complexity
Makes it simple to manage application lifecycles and push changes without manual intervention
Ensures security of sensitive data without the need for dedicated infrastructure
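As a rough sketch of the multi-GPU training capability mentioned above (assuming a PyTorch environment; the model and batch here are placeholders rather than a real workload), the same training loop can spread across however many GPUs are present and fall back to CPU where none are:

```python
# Rough sketch of multi-GPU training with PyTorch (assumed available).
# The model and data below are placeholders, not a production workload.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 2))

# Spread the forward/backward pass across all visible GPUs, if any;
# the same code falls back to CPU on a machine without accelerators.
device = "cuda" if torch.cuda.is_available() else "cpu"
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)
model = model.to(device)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Placeholder batch standing in for real customer-interaction features.
features = torch.randn(256, 128).to(device)
labels = torch.randint(0, 2, (256,)).to(device)

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(features), labels)
    loss.backward()
    optimizer.step()
```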
Evangelize and Organize the Data
As AI models within enterprises or even SMBs scale up and out, burning questions about the structure and organization of data arise and need to be answered. Where does the data live? How is it formatted, stored and accessed? What cloud infrastructure – private, public, or hybrid – is right for managing data and running the AI applications?
An article in MIT Sloan Management Review divides the data used to train AI models into three categories (a simple triage sketch follows the list):
The Trusted pool: Data that is current, reliable, relatively complete, and validated for use in training. Having a good taxonomy and tagging system is important here, particularly with digitized data.
The Queued pool: Data that may need validation, or that is incomplete or inaccurate in places. It may be useful for training after vetting. Data received through a merger or acquisition should always start here. Data stores move up or down this list as they are reviewed.
The Naysayer pool: Data that is known to be outdated or corrupted, contains errors or bias, or includes information that cannot be used in a given location or regulatory environment (think GDPR).
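As a loose illustration of how datasets might be triaged into these pools before training (the field names and rules below are hypothetical, not drawn from the article), a simple routine could tag each incoming data store:

```python
# Loose illustration of routing datasets into the three pools above.
# Field names and rules are hypothetical, not from the source article.
from dataclasses import dataclass

@dataclass
class Dataset:
    name: str
    validated: bool          # passed validation checks
    complete: bool           # no major gaps in the records
    known_bad: bool          # outdated, corrupted, or regulatory-restricted
    from_acquisition: bool   # arrived via merger or acquisition

def assign_pool(ds: Dataset) -> str:
    if ds.known_bad:
        return "naysayer"    # never used for training
    if ds.from_acquisition or not (ds.validated and ds.complete):
        return "queued"      # held for vetting before training
    return "trusted"         # ready to use; tag and catalog it

datasets = [
    Dataset("crm_contacts_2024", True, True, False, False),
    Dataset("acquired_company_orders", True, True, False, True),
    Dataset("legacy_eu_emails", False, False, True, False),
]
for ds in datasets:
    print(f"{ds.name}: {assign_pool(ds)} pool")
```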
AI can turn an enterprise with many data silos into a single persona for customer interactions. The combined complexity of multiple cloud and on-premises data stores can be daunting, but it doesn’t have to be. Seek to simplify access and management through a “single pane of glass” wherever possible.
Evaluate, Update, Transform
Adding AI to the organization introduces opportunities and uncertainties the business has never had to consider before. Discerning the difference between a game-changing insight and an anomaly can propel the business forward. This effort might turn into a major project of its own if the business doesn’t have a digital transformation strategy in place, but it will pay long-term dividends on numerous fronts once implemented.
With great rewards come great risks. When done well, however, the rewards far outweigh the risks, and the greater risk lies in not engaging with available technology. The secret to effectively using AI to engage and satisfy customers is to align the production of AI with the consumption of AI at every stage and touchpoint of their journey.
Deciding when, where, and how to start is the first step toward becoming an AI-driven enterprise. Considering the pace of adoption in every market, industry, and vertical, now is certainly the time.