Hard as it may be for those in the job market to believe, many employers still think they are waging a War for Talent. Despite the tsunami of resumes flowing into their email inboxes, they remain convinced that there is a shortage of two kinds of workers: those with rare skills and those who are rare performers. The demand is there for pediatric nurses or clinical scientists and for “A” level performers in any field, but the supply – at least from a recruiter’s perspective – isn’t.
This talent shortfall has its roots in two developments that have significantly affected the American workplace over the past quarter century or so. One involves a change in the kind of capability American employers need, while the other requires that American workers use their talent in a new and different way. I’ll cover the first development in this week’s post and the second next week.
A Change in the Way Technology is Put to Work
The role of technology changed dramatically during the last two decades of the 20th Century and the first decade of the 21st Century. It became a much more important part of how work was accomplished in both the production of goods and the delivery of services.
Tens of thousands of organizations purchased a host of new hardware and software systems, including those for computer-aided design and manufacturing; data storage, processing and analysis; and internal as well as external communications. They put robots on the assembly line, computers on middle managers’ desks, automated tellers in banks and self-serve scanners in the checkout line of grocery stores.
Initially, the vast majority of companies had little if any understanding of how best to leverage this technology. By the mid-to-late 1980s, however, many organizations had figured out that the technology itself was worthless—just a bunch of expensive silicon wafers and 1s and 0s—without skilled employees who could put it to work on the job. To capture the process, productivity and quality gains locked within their hardware and software systems, employers had to hire people with the talent to deploy, integrate, operate and maintain their technology investments.
The significance of that realization is best illustrated by the contrast between two commercial airline flights in 2009. Both flights occurred in identical state-of-the-art aircraft—the Airbus A320—yet the journeys produced markedly dissimilar outcomes.
Thanks to the talent of its pilot, Chesley “Sully” Sullenberger, U.S. Airways flight 1549 survived a crippling bird strike and a dangerous water landing to become “the miracle on the Hudson.” Northwest Airlines flight 188, on the other hand, had a very different caliber of pilot at the controls—one who was qualified, but lacked a commitment to excellence—so it lost radio contact with air traffic controllers for over an hour and overflew its destination by 150 miles. The basis for the vastly different results—the tipping point between success and failure—was the talent of each organization’s employees.
Employers now recognize that investments in physical capital – whether it’s planes, trains or automobiles – must be accompanied by similar investments in human capital. Unfortunately, however, that realization hasn’t created more employment opportunities for those in the workforce. It has made employers pickier. They are recruiting new hires, and they are even paying hiring and retention bonuses, but they are doing so only for those they deem to have talent. And their definition is the only one that counts: a person who has a skill that is critical to their success, a commitment to doing superior work, or both.
Thanks for reading,
Note: The above post was drawn in part from my new book, The Career Activist Republic. To read more, get the book at Amazon.com, in many bookstores and on Weddles.com.