A recent survey of Americans and business leaders by Gallup (sponsored by the Lumina Foundation) had some pretty damning findings regarding public opinion of higher education. You can read the survey yourself (PDF download), but if you work in higher ed, I wouldn’t recommend doing so while you’re eating lunch.
Perhaps the most alarming finding in the survey was that fewer than half of Americans surveyed agreed that “college graduates in this country are well-prepared for success in the workforce,” and only 14% “strongly agreed” with that statement. Worse yet (especially if you’re a recent college grad or about to become one), only 33% of business leaders agreed that “higher education institutions in this country are graduating students with the skills and competencies that [their] business needs.” In fact, the employers surveyed were so negative about colleges’ ability to prepare graduates for the workforce that 71% of business respondents said that “all things being equal, including experience, ability, and company fit [they] would consider hiring someone without a post-secondary degree or credential over someone with a post-secondary degree.”
These findings are scary enough, but they get even scarier if you look at them in the context of another recent Gallup survey conducted on behalf of Inside Higher Ed. The 2014 Inside Higher Ed Survey of College & University Chief Academic Officers (PDF download) polled 842 Chief Academic Officers/Provosts from 418 public institutions, 261 private colleges and universities, and 42 for-profit institutions. And while the findings weren’t all that surprising (20% of CAOs surveyed “strongly agreed” that they wanted to be a college president some day, and only 12% agreed that the new Obama ratings initiative would help prospective students), the one that really jumped out at me was that 91% of survey respondents felt that their institution’s “academic health” (overall academic quality) was “good” or “excellent,” and 89% felt that their institution was “somewhat effective” or “very effective” at “preparing students for the world of work.”
Huh. On the one hand we have the majority of Americans and employers feeling that colleges and universities are doing a terrible job at preparing students for the workforce. On the other hand we’ve got the vast majority of academic leaders reporting that they think their institutions are doing a good job preparing students for the world of work. Why such a huge perceptual gap?
We can blame some of it on economics. In a soft job market such as the one we find ourselves in now, employers can be a lot choosier about who they hire because there are so many people looking for work. Experienced, skilled workers aren’t hard to find, and they’re willing to work for less than in the past, a trend supported by the stagnation in real wages that began during the Great Recession and continues today. The bad job market also means that employers can ask for more out of applicants when hiring for lower-paying or entry-level jobs. In fact, a 2012 report from Georgetown University’s Center on Education and the Workforce found that a post-secondary degree is a requirement for an increasing number of jobs, with 2.2 million jobs created between 2007 and 2012 requiring a bachelor’s degree.
As a former employer (I ran a digital agency for about 10 years and was in charge of hiring most of the employees), these numbers make perfect sense to me. Like it or not, all employers seek to hire the best people they can get for the least amount of money possible. When I started my agency in the midst of the mid-to-late ’90s dot-com boom, I had to pay through the nose for qualified web developers and designers because there just weren’t that many in the job market, and those who were looking for work were in high demand from startups flush with VC cash. Today there’s a surplus of web developers and designers looking for work and, based on anecdotal evidence I’ve gathered from friends still in the industry, they’re getting paid less than they were 10-15 years ago. Considering that many of these kinds of entry-level “creative economy” jobs require more hard skills and demonstrable talent (proven via a portfolio), it doesn’t surprise me in the least that Gallup discovered that many employers are willing to look at candidates without degrees…all things being equal, they’ll probably work for less.
But economics doesn’t necessarily explain the gap between employer dissatisfaction with college grads’ skill levels and academic leaders feeling like they’re doing a great job preparing students for the job market. Perhaps the “skills gap” isn’t as much about measurable skills as it is about perceptual differences.
If you look at what employers value in job candidates (rather than their satisfaction/dissatisfaction with applicants), the perennial answer coming out of “what employers want” surveys is that the so-called “soft skills” lead the list (a topic my colleague Brian Etheridge posted about yesterday on this blog). In the most recent “employer wants” survey published by the National Association of Colleges and Employers (NACE), “technical knowledge related to the job” was ranked 7th out of the 10 traits employers were looking for in new hires (see below).
As you can clearly see, the most desirable qualities are those typically associated with the classic liberal arts education: teamwork, problem solving, organization, communication, research and analysis.
If you look at the “skills gap” between employers and academics in the context of the NACE report, the reason for the gap starts to become clearer. Since many academic leaders came up through a more traditional liberal arts education, they’re going to hold these “softer” skills in high regard and feel satisfied with their institution’s ability to prepare students for the job market if they feel these skills are being emphasized in the curriculum. Many are also bolstered by studies such as “How Liberal Arts and Sciences Majors Fare in Employment,” published in January by AAC&U, which found that over the long term liberal arts graduates actually do pretty well for themselves…provided they get a master’s degree at some point. Apparently, Mr. President, it turns out that art history grads actually can make a good living.
But why do employers seem to be talking out of both sides of their mouths when it comes to the skills and qualities they’re looking for in job candidates? How can Gallup find them pessimistic about higher education’s ability to prepare grads for the workforce while the NACE survey seems to say that employers aren’t looking for specific skills as much as they are looking for employees who can collaborate with colleagues, communicate effectively, and find and analyze information? If the softer skills are so important, why are studies finding higher rates of unemployment (or underemployment) among liberal arts majors who, it can be assumed, graduate with the kinds of skills employers say they’re looking for?
The answer, I think, is time. Looking back on my hiring days, I can safely say that when we needed to hire a new employee, we probably needed them yesterday. If a gap in the company opened up due to employee turnover, increased business, or someone being let go, that gap could become a gaping hole in our business if it wasn’t filled quickly. There was work to be done and it needed to be done now, not at some later date after a new hire had time to learn specific skills on the job. Every hour they spent learning (and not producing) meant another hour that couldn’t be billed. And when belts have to be tightened during a recession few businesses are going to be interested in paying the costs required to teach new employees how to do their jobs…especially if there are lots of qualified candidates to choose from willing to work for lower wages.
Educators think long-term. Employers, unfortunately, often think short-term. But neither way of thinking is necessarily wrong. Colleges and universities have traditionally considered their role to be preparing graduates for life, while employers rarely have the luxury of thinking beyond the next quarter, especially in tough economic times. In times of economic stability, however, they can begin to look longer-term. Unfortunately we’re not there yet, and haven’t been for a while.
Perhaps the answer to closing (or at least narrowing) the skills gap is to recognize the need to strike a balance between employers’ short-term, easily-definable skills needs and the benefits of developing skills such as critical thinking, problem solving, communication, and collaboration that will benefit students over the course of their lives. Of course, many of us in higher education say that we do this now through general education requirements, “writing across the curriculum” initiatives, and other programs designed to develop the skills and qualities that define what it means to be a “college graduate” no matter what a student decides to major in. But it can be tough to maintain these ideals when everyone from parents to the President is focused on the short-term employability of graduates, the need to increase participation in STEM disciplines, and anxiety over the increasingly rapid pace of technological development and its impact on society.
Striking a balance between long-term and short-term needs can be tough in any situation, but it seems to be particularly tough when it comes to preparing undergraduates to thrive in today’s world. The structure and pace of undergraduate education was developed over a long time and is highly resistant to change. But we have to recognize that the way we educate undergraduates was, for the most part, developed during a time when the pace of change was much slower and the need for a college education much lower for those looking to enter the workforce. According to the Georgetown Center on Education and the Workforce (PDF download), by 2018, 63% of job openings will require workers with some college education. In 1973, that number was 28%.
If employers are expecting job candidates to have college degrees and specific job skills (many of which are technology-related), it may be a mistake to think that we can teach those skills over the course of the four (or five, or six, or longer) years it takes an undergraduate to earn a degree. The pace of change is just too fast. It’s no wonder that most employers think that graduates aren’t prepared…the skills they’ve learned are obsolete by the time they graduate.
The answer to striking the balance that will narrow the skills gap perhaps lies in several avenues. A greater emphasis on experiential learning through internships, co-op programs, practica, and even apprenticeship-style training would allow students to gain valuable real-life workplace experience. Re-thinking the structure of the undergraduate experience would provide greater flexibility for students and more intensive development of their skill base: more traditional semester-length (or even longer) experiences could build long-term foundational knowledge and critical skills, while shorter, intensive formats could develop specific technical skills with an emphasis on real-world applications within the student’s chosen discipline. And working to build bridges across disciplines would help better prepare students for a world that’s increasingly interdisciplinary and would encourage the kind of innovative thinking that’s essential to their future success.
The “skills gap” may be a combination of economic conditions, perceptions, and priorities, but that doesn’t mean it’s not real. We can either stick our heads in the sand and keep pretending that everything’s OK (and suffer the consequences for not changing), or we can work to innovate how we educate in order to meet the challenges (and realities) of today…and the future.