By Dr. Roze Phillips, managing director for Accenture Consulting

Most of us get how a machine can beat a human at Jeopardy and Go. But what does it say about the learning capability of machines – which, by the way, is not fully understood – when a machine bluffs its way to victory in a game of Poker? Is it time to play the 'governance' card?

A machine wins a game of Jeopardy – this outcome is, or should be, relatively unsurprising. Two things are required for a win: substantial factual knowledge and the ability to understand the questions. Machines can be loaded with an abundance of the former; sophisticated natural language processing (NLP) handles the latter.

Similarly, when a machine emerges victorious in a game of Go – a complex, strategy-based game – the outcome remains comprehensible: simply encode the universe of possible moves and allow the machine to make the best choice at each stage.

But what about this Poker win? In a game of imperfect or 'hidden' knowledge, where key information is concealed from players, a machine beat 11 professional players. That raises complex questions. Rapid advances in machine learning – particularly of the unsupervised variety, in which machines 'learn' to interpret and draw conclusions from data with minimal or no human guidance – mean that digital governance is an issue we need to confront. And sooner rather than later.

Not everyone is rattled by the growing capabilities of artificial intelligence (AI). For the younger generation in particular, the notion of machines as entities that are somehow strange or unknowable is absent, or rapidly disappearing. Tolerance of machines is increasing. Yet the capacity of machines is growing exponentially, and the time and cognitive effort that humans must devote to getting machines to accomplish tasks is steadily decreasing.
As the decisions that machines make begin to impact human life directly (think autonomous vehicles), we need to consider how we want that relationship to develop. It's not too early to consider governance of human-machine interaction.

Machine vs human cognition – which do you trust more?

Having sought to recreate 'human' thinking in machines – and the ever-expanding computational capacity of deep neural networks is particularly relevant here – we have indeed created machines with the ability to 'learn'. However, we do not yet have a complete understanding of that process. What we do know is that machines' capacity to learn is not restricted to what is 'taught' directly; it is a free-floating capacity that can be applied without a guiding framework. Do we need to put one in place?

Computational capacity can be deepened and extended in a way that human intelligence cannot. Moreover, machines can be developed to process inputs barred to us humans, limited as we are by our set of senses. How will we ensure that the role of machines remains the augmentation of human capability – that AI remains an adjunct to human intelligence, and not the other way around?

The decisions we make today around AI will affect us for many years to come. The emphasis must remain on human beings; we must continue to come first. We cannot abdicate decision-making or subjugate human cognition to that of machines. Precisely the opposite is what will allow businesses and people to go beyond the current man-machine juncture.

The AI-driven business puts people first

What does this mean for CEOs? It means best you check yourself before you wreck your stack. Begin deciding today how rapidly and extensively – or slowly and selectively – you want to transform your organisation into one that uses AI-based technologies:
- Ensure your business case is not based purely on cost take-out, but unlocks value by pivoting employees from transaction processing and administrative tasks to more insight-driven work.
- Ensure that you link your ‘talent and skills for the future’ strategy with your robotics and AI strategy.
- Ensure that your talent strategy is linked with your learning strategy, so that employees can assimilate the new skills of the future workforce and lead with them.