As software, AI, IoT, and robotics swallow more and more of the world (as Marc Andreessen suggested they would), it is natural to wonder what the role of people will be. For years we’ve heard corporate leaders, as well as their HR lieutenants, tell us that their people are ‘their most important assets,’ yet at the same time the increasing pressure for efficiency seems to be steadily, and perhaps inevitably, replacing people with various smart technologies. It is easy to become pessimistic, and to see advances in AI and IoT as the ‘enemy of the people.’ But this would be a mistake.
Yes, millions of jobs globally are currently being made redundant, and anyone caught in that path will feel the pain. When Tim Cook went on TV recently to suggest that coding should be a central part of every kid’s basic education, tech journalist Kara Swisher seemed skeptical. She kept asking, essentially, “So, your answer to the growing knowledge-skills-jobs gap is that everyone should learn to be a programmer?” Cook sort of answered the question, but the way the exchange unfolded made it seem as though that was indeed what he was saying. I actually think he was saying something much more important, though unfortunately it didn’t come through in the conversation.
What constitutes knowledge, and useful knowledge, is changing. It always has. When we apply Moore’s Law analogically to other domains of human endeavor, such as learning and working, we see acceleration everywhere. As a former educator (a B-school professor), I know this all too well. I spent years teaching students things that were either outdated (in terms of the state of research knowledge) or more or less useless from any practical point of view. I am still getting over that.
Traditional university education, despite its becoming such a huge industry and an even larger parental obsession, is terribly wanting. New alternatives, such as MissionU, are a shot across the bow. Relevant tech education delivered over the course of one year produces tech-minded knowledge workers ready to earn a living at around 19 years of age. Of course, pure tech education isn’t sufficient to build a fully literate society. Perhaps one year to read books (lots of them) and one year to do the tech thing is enough to create an intellectual techie?
Back to Cook’s comments. What I think he meant when he said ‘all kids will need to know how to code’ is that they will need to ‘think like coders’ in order to solve complex problems. The exponential proliferation of data and smart devices simply confounds the nature and scope of human decision making. The unaided human mind cannot process all of that data on its own. We need apps, tools, shortcuts, and collaborations.
In the world of work, which is our company’s domain, tech is more important than ever. HR tech applications allow busy managers to see, in real time, what their people are working on, where they are, who they are working with, how engaged they are, and a whole host of other social data that matter. Yesterday’s annual performance reviews, or once-every-five-years culture surveys, are simply not up to the task; some of your best Millennial employees will have moved on to new gigs before the results come in. What is needed are real-time analytics at managers’ fingertips.
The impact that this is having on the field of HRM is really only just beginning to be understood. It has already (and quite easily) been recognized with respect to hiring, benefits, payroll, and the like. Now, however, the soft and elusive stuff (engagement, culture, trust, and so on) is also in the belly of the software. Even if you don’t know how to code (and I sure don’t), if you understand this basic fact, then Tim Cook’s point is successfully made.
This article was authored by OpenWork Agency Partner Drew Jones, PhD.