Imagine a future workplace in which your colleagues are a medley of humans and robots: some human, some robots, some a combination of the two. Scary, eh? But wait! Did I write "future" workplace? Make that "current". Artificial intelligence (AI) and robots are already taking over our lives. Cars drive themselves. Factories are populated with robots doing the jobs that humans used to perform.
More and more routine tasks are being taken over by AI. Human accountants will probably one day be redundant. AI is even moving up the management chain into human resources and other activities.
Is there a limit? Yes, say some management specialists. Robots can never be MDs or CEOs because they lack humanity and empathy. Really? How many of us have worked for bosses without either?
The point is this: AI is transforming the world of work faster than we can follow and in ways that few of us can comprehend. As Regenesys Business School dean Penny Law puts it: "AI has progressed beyond our understanding."
It is the sharp end of what is called the fourth industrial revolution (4IR). Traditional skills are becoming redundant, new ones appearing almost daily. Among dozens of sectors where human job descriptions could disappear are agriculture, construction, civil engineering, real estate, insurance broking, retail and publishing.
Potential new jobs include energy consultant, YouTuber, 3D technician, medical mentor, robotics technician, AI engineer, drone tester, digital tailor, digital rehab counsellor (for people overwhelmed by social media), and end-of-life counsellor (for the elderly).
Various studies suggest that 47% of jobs in the US will be at risk from automation in the next few years; 65% of students entering high school in 2019 will end up in jobs that don’t exist today; and 60% of employees believe they don’t possess the skills required for the future.
On the plus side, it’s estimated that technology will create 60-million jobs by 2030.
That’s great for developed economies, which can provide the necessary education and skills. But what of developing economies like SA, where basic education is substandard and jobs are hard to find?
Regenesys chair Marko Saravanja says robots can take over dirty, dangerous jobs, like underground mining. Existing miners could be trained in robot management and maintenance. But it’s hard to see the government and unions in SA supporting an idea that, though saving lives, would still result in thousands of job losses.
Faced with these changes, business is casting around for advice on how to prepare for the unknown. It’s a situation custom-made for business schools with a bold vision of the future. Many are, indeed, stepping into the breach.
Lerato Mahlasela, director of customised executive education at the Gordon Institute of Business Science, says: "A lot of companies have some idea of what it could mean for their business but many haven’t considered the implications for their people. We give them context."
A number of schools have staged seminars on the subject. What is evident from all of them is that there are no obvious answers. Prof Raphael Mpofu, acting director of Unisa’s Graduate School of Business Leadership, says: "Essentially, we don’t know what’s coming but we have to get people ready for it when it comes."

Some countries are already experiencing that future. In Sweden, bank customers are having credit-card chips implanted in fingers for cardless payments. Elsewhere, scientists say it will be possible within the next five years to implant cellular chips in the brain so we can communicate without hand-held phones.
Last week, SA-born technology pioneer Elon Musk said his Neuralink company hopes to start linking human brains to computers next year. Musk, who once said the rapid development of AI and robotic intelligence would render humans "house cats" by comparison, says tiny electrodes inserted in the brain will feed information to a chip behind the ear, then on to a smartphone app.
One of the initial goals is to counter brain diseases and paralysis. In the longer term, Neuralink wants to enhance human knowledge by linking people’s brains directly to computers and AI.
Musk said at the announcement: "[The goal is to] achieve a sort of symbiosis with artificial intelligence. At the moment, we rely on an interface with technology, such as our laptops, that is slowed by our fingers or our eyes. Inserting a chip into our brains to speed things up will be key to overcoming that."
David Hanson, the creator of humanoid robots with the capacity to talk and "think", believes such symbiosis is inevitable and that mankind may not survive without super-intelligence. The human fear of "bad" robots taking over the world is ironic, he says, given mankind’s tendency towards genocide and the extinction of other creatures.
Hanson, who spoke at a Duke Corporate Education conference in Johannesburg this month, says humans must find a way to "breathe goodness" into machines which, in return, will use their intelligence to moderate mankind’s excesses.
A history of revolution
The Industrial Revolution, now also known as the first industrial revolution, was the transition to new manufacturing processes driven by steam power in Europe and the US, from about 1760 to sometime between 1820 and 1840.
The second industrial revolution began at the end of the 19th century with the advent of electricity, mass production and advances in communication.
The second half of the 20th century marked the start of the third industrial revolution, based on the development of electronics, telecommunications and computers.
The fourth industrial revolution (4IR), a 21st-century phenomenon, builds on the third through the development of robotics, artificial intelligence, nanotechnology, quantum computing, biotechnology, the Internet of Things and advanced wireless technologies.
— ECONOMIC AGES
But this raises ethical issues, says Cobus Oosthuizen, dean of Milpark Business School. Who decides what information is fed into the human mind, and how it is used? "It’s a quagmire of uncertainty," he says. "I don’t think people have thought this through."
The ethical question works at both ends of the scale. Regenesys’s Saravanja asks: "Can a robot solve an ethical issue? No."
Rhodes Business School director Prof Owen Skae goes further: "Would you trust a robot enough to write a legal judgment, to decide if someone goes to jail, to decide your annual bonus? We can’t jump into this blind and hope things sort themselves out. As with anything else, there is the potential for unintended consequences. There’s talk of using robots in education, in the classroom. Will they be able to pick up the subtlety of a question?"
Hanson understands why there are questions but is adamant. "We are in the midst of the greatest species extermination rate in history," he says. "If we don’t get super-intelligence or super-benevolence, humanity may not survive."
So how do schools prepare business for this incomprehensible change? Johannesburg Business School director Prof Lyal White says that business schools in general are currently ill-equipped to deal with the future "because of our obsession to churn out people with the right qualifications". He adds: "As a sector, we are committed to our old structures. I don’t think we are adapting fast enough to deal with AI."
Regent Business School director Ahmed Shaikh says: "There is a need for much more interdisciplinary teaching, research and innovation. In the technology-enabled workplace, skills such as creativity, critical thinking, collaboration and communication are increasingly important. It is well beyond the capabilities of traditional education to respond to the new exigencies of 4IR."
Instead of traditional business skills, the teaching emphasis must be on employability. Business schools have consistently said that one-off executive education or MBA programmes are of limited value and that students must undergo "lifelong learning" to remain relevant. Shaikh says AI will require them to "continuously learn and re-learn new competencies and skills".
Institutions, he adds, "need to adapt their education system to provide broader 21st-century skill sets and competencies and use new formats of education which are flexible and blended in order to close the digital divide".
To bank on one career, says Prof Brian Armstrong, chair of digital business at Wits Business School, "is like taking all your money and putting it into a single stock". Diversification of skills and potential careers will be increasingly important.
Standard Bank learning head Vish Sanghani says: "Studying what you are told to study is no longer enough. We have to tell people to learn much more than education can provide."
It’s certainly more than some business school lecturers can provide. Traditionally, says Henley Africa dean Jon Foster-Pedley, students expect to learn everything from their lecturers. That’s not the case with 4IR.
"We don’t assume we have the best insights on everything," he says. "We tell our students to get information from multiple resources. Often our lecturers learn as much in the classroom as the students do. But that’s fine. Modern education should be collaborative."
Prof Fulu Netswera, director of North-West University’s business school, isn’t convinced. He says the fact that many academics have not been properly exposed to AI is "limiting". And even if they had been, says MSA (formerly Monash SA) director Prof HB Klopper: "Most schools are too comfortable with the old student-teacher relationship to want to change."
But Foster-Pedley says: "What lecturers should be better at is knowing how people learn, getting them to open up, to challenge their assumptions. No-one knows what the AI future holds but we want to send people out into the world able to ask the right questions and draw the right conclusions."
Armstrong says that while it’s important to think about the future, it must not be at the expense of the present. Some skills and many human attributes will always be in demand. Don’t get so carried away by predicting potential change that you forget what is needed now.

Lee-Anne Vasi, of the Nelson Mandela University Business School, agrees: "It’s not all about futuristic concepts. There will always be a place for teaching the basics of management."
Kumeshnee West, head of executive education at Cape Town University’s Graduate School of Business, concurs. "We won’t be able to run businesses for the foreseeable future without the human touch," she says. "We’ll still need the ‘soft’ human skills but also the capacity for complex thinking. And rather than looking too far ahead, we need to empower people to use technology to support what they are doing now."
Sharmla Chetty, Duke’s Johannesburg-based global head for Europe & Africa, says business schools are best placed to offer the leadership business needs to face the future.
"There is huge fear around AI and robotics. Companies are understandably nervous. But this is where our strengths lie. Schools can call on both theoretical and practical expertise." Duke, for example, is an international school with access to hundreds of experts from around the world. "These aren’t new puzzles for business schools," says Chetty. "The context may be different, and the pace of change unprecedented, but these are issues we think about every day."
Schools may not always have an immediate answer, but they are the best bet for finding practical solutions.