A Key Component of Leadership: Knowing When to Be Directive vs. Building Consensus

The old-fashioned view of leadership is of the strong, decisive leader throwing out orders to willing underlings, who faithfully execute on behalf of the boss. Although this model of “leadership” fails to leverage the incredibly valuable input and due diligence that come from a more collaborative, consensus-oriented approach, it can actually be the preferred method in some situations.

So, when should you as a leader be more directive?

In my experience, the answer lies in both operational and cultural contexts.

Operationally, it is often preferable to be more directive when you simply don’t have the time for a more consensus-oriented approach. In such situations, you need to be fairly confident that your decisions are more likely to be correct than not, but if time is genuinely critical, then making “executive” decisions can be valuable to the organization. A recent example from my own experience was the opportunity to save 75% of the normal acquisition cost of an ERP-level software application, but with a very tight deadline to accept or decline the offer. I did have colleagues evaluate the application against our current system, but ultimately made the decision myself. In this particular case, I had the advantage that a sister organization was already using the software platform and thus had a real-time, real-world consult, but it was a good example of an executive-decision opportunity regardless.

Another operational example is when you have time, but your colleagues or subordinates do not. In other words, they are already stretched thin, and you can do them a favor by making an executive decision that saves them the time and effort of doing so collectively. Similarly, sometimes you have dedicated the time to a collaborative approach, but that approach has produced two or three equally viable options. It is better to make an executive decision as a leader than to drag your colleagues through a never-ending process trying to get to one team decision.

There are also times when being more directive makes sense for cultural reasons. In organizations where hierarchy and autocracy are the norm, it can be quite disconcerting to employees when they are asked to contribute to consensus or even to make their own decisions. In fact, it can even be unfair: asking people to do something with which they have no experience can not only create a lot of dissonance but also result in poor decision making and execution. The same dynamic applies to individuals. Even if an organization, broadly speaking, is amenable to a collaborative, consensus-based approach, not everyone in the organization has the experience or confidence to participate effectively in such a model. In those cases, a good leader should develop that capacity in such a colleague over time, eventually weaning him or her off dependence on decision making by the boss.

And sometimes, being decisive or “executive” as a leader, particularly in a time of stress or crisis, can be valuable because it can give others in the organization confidence that someone is “in charge” and will get them through whatever the challenge happens to be.

To be clear, generally speaking, leaders will arrive at higher quality decisions, and will get higher quality buy-in and execution on those decisions, using a more collaborative, consensus-based approach. This is true simply because multiple people almost always have better ideas than individuals, and teams almost always generate better work than individuals or groups that do not function as teams. However, part of good leadership is knowing when being more directive is preferable.

Why a Compelling Value Proposition is More Important to Your Organization than Mission and Vision

Over the last quarter century I have participated in and led dozens of exercises designed to define organizations through mission, vision, and values. These can be very worthwhile exercises because they help organizations understand really fundamental things like why they exist, what they do, and what they aspire to be. Mission and vision statements, and core values, serve as guideposts that should inform important decisions about strategy, resource allocation, and other issues that drive consistency and sustainability.

However, in the rapidly changing, hyper-competitive, highly commoditized world that virtually every organization in every industry now operates in, a more important question to ask when defining an organization might be: Why would a customer choose us?

It is becoming clearer to me, both as a CEO and as a consultant in the current VUCA environment, that what differentiates organizations that thrive from those that just survive or even fail is not their mission or vision statements or even their values (values can, however, support critical behaviors). While those are important, there is a highly dynamic, if not volatile, “where the rubber meets the road” imperative faced by virtually all customer- or client-driven organizations today, which boils down to your value proposition to the customer compared to the value propositions of the other choices offered by the competition. We can often attract customers or students or patients or clients into an initial transaction with catchy marketing or steep discounts or convenience, but over time, and particularly with “big ticket” commitments, our customers must believe that there is genuine value for them in the product or service we are selling—and not only genuine value, but value that is demonstrably greater than what is available from the competition. The value proposition itself becomes a key differentiator that sustains customer loyalty and advocacy.

While I rarely engage an organization that does not have a mission or vision statement, I frequently find organizations that do not have an articulated value proposition. That value proposition can reflect many things for a given organization or an identified set of customers, such as quality, cost, service, benefit, flexibility, or support, but with rare exception, consumers today have choices, usually many choices, of where to buy a given product or service, so there has to be a good reason they will choose your organization over another. In some cases (usually in retail contexts) the value can even be related to “prestige” or “image,” but it still has to exist for the consumer.

Even in organizations that have thought about and committed to a compelling value proposition for their customers or clients, only the most sophisticated have a truly customer-centered notion of what is valuable and important to the consumer. It is actually more common to see such statements based on what the organization or vendor thinks is valuable. This happens simply because we human beings get attached to what we are invested in or have experience in or think we’re good at. That is a trap that gets in the way of creating value for the customer rather than for ourselves.

What this all boils down to for leaders is that self-definition is still important for organizations today, but with limited time and resources, having a deep understanding of why customers would choose you over a competing organization (and acting on that) is probably more important than having clarity around mission and vision.

The Employer-Employee Relationship: Things Have Really Changed!

In the early industrial era when Americans migrated in large numbers to urban areas for factory work, employees worked very long hours under grim, even dangerous conditions. There were also no protections for child laborers. This reality lasted through much of the 19th century and into the early 20th century. While factory wages were higher than other alternatives, it was punishing work in which the company, in many ways, “owned” the employee.

There was a brief period of American history, roughly post-Depression through about 1980, in which at least a couple of generations experienced the ability to work for one or two employers for most of their lives, with relatively safe conditions and 40-hour workweeks, for a living wage, and then retire in a home they likely owned. They were also able to provide advantages to their children that allowed millions of Americans to experience a more prosperous life than their parents had. This was possible for several reasons. Unions played a significant role in ensuring safer conditions, living wages, and benefits for moderately skilled workers. Social programs such as Social Security and Medicare kept millions of elderly people out of poverty. And a slower pace of change allowed companies to do roughly the same work in roughly the same way for decades. This, in turn, facilitated long-term relationships between employers and employees that supported an unwritten social contract: the employee worked hard, learned new skills over time, and became more valuable, and the employer made an informal commitment to steadily improving wages and job security.

During this same period, management had a somewhat different “contract,” but similar benefits. In return for accepting longer and less predictable hours as “salaried” employees, managers generally enjoyed higher compensation, better benefits, and higher status within their organizations. However, even for management, there were much clearer boundaries between work and private life than there are now. There were no laptop computers, smart phones, or tablets. Vacations generally meant completely leaving the workplace behind, usually with zero contact for up to two full weeks. Weekends were generally fully disconnected from the workplace, and managers rarely took work home with them. Hourly workers never did.

Fast-forward to the 1980s and several things began to change in substantial ways. Unions began a precipitous decline. “Globalization” began to put downward pressure on wages. The pace of change within organizations, driven by technology and the competitive landscape, made it impossible for employees to do the same work in the same way for more than a few years, let alone decades. These disruptions combined, or more accurately conspired, to weaken the social contract between employers and employees. Today, several decades later, that contract has been obliterated. We are now in an economy in which all employees, hourly and management alike, are generally seen as “expendable,” and in which organizations, broadly speaking, feel no obligation or loyalty to the individuals who make the organization’s work and survival possible. What this means is that although it is still in one’s own interest to act professionally in terms of meeting obligations, doing one’s best work, and representing one’s organization well, it would be naïve to think that, with rare exception, sacrificing for one’s employer will be “repaid” with job security or deferential treatment in tough times. While this phenomenon is more prevalent in the U.S., it is becoming more common in other regions of the world as well.

Similarly, unlike the clear line that used to exist between work and private life, organizations now typically see no line whatsoever and claim some level of ownership over all of an employee’s time (it tends to be even worse for managers). In what amounts to a kind of bait and switch, employers spend several hundred dollars on smart phones and tablets at “no cost” to the employee, then get tens or hundreds of thousands of dollars of the employee’s “off the clock” time in return. Again, this dynamic is more pronounced in American and Western contexts, but it exists at some level in most global contexts.
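To put rough numbers on that trade, here is a quick, illustrative calculation. Every figure in it (the device cost, the off-the-clock hours, the hourly value of a salaried employee’s time, the number of years) is an assumption chosen only to show the order of magnitude, not data from any particular employer.

```python
# Quick, hedged arithmetic behind the "bait and switch" point above.
# All figures are illustrative assumptions, not data from any real employer.

device_cost = 600            # assumed employer spend on a phone/tablet
off_clock_hours_per_day = 1  # assumed evening/weekend email, calls, and "quick" tasks
workdays_per_year = 250
effective_hourly_value = 40  # assumed value of one hour of a salaried employee's time
years = 5

off_clock_value = (off_clock_hours_per_day * workdays_per_year
                   * effective_hourly_value * years)   # $50,000 over five years

print(f"Employer outlay on the device:      ${device_cost:,}")
print(f"Value of off-the-clock time gained: ${off_clock_value:,}")
```

Under these assumptions, a one-time outlay of a few hundred dollars returns tens of thousands of dollars of uncompensated time; stretch the hours or the years and the figure moves into the hundreds of thousands.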

And this has happened at the same time that smart phones work essentially anywhere, any time, domestically and internationally. One cannot even escape wi-fi at 35,000 feet in an airplane anymore!

The ultimate irony is that even as we work more hours and have less space between our work and private lives, we are actually becoming less productive overall and certainly less creative. How is that possible?

The short answer is that not only is there no correlation between working longer hours and being more productive, but recent research shows that the opposite is actually true. This is palpably ironic for both employees and the organizations they work for. See a lengthy post on this topic here.

While this new state of affairs can be stressful for employees, who seem to be working harder, with less work-life balance, at the same time that employer loyalty and job security are becoming ever more rare, it also provides a kind of freedom: employees are less tied to employers and thus more free to pursue new opportunities. Change can be difficult, but it is also a catalyst for growth, and the flip side of less job security is more professional options. Another potential benefit is that people generally realize faster upward mobility by moving from one organization to another than by waiting for opportunities within a single organization.

As leaders, we can actually take advantage of the current reality by identifying the employees who bring the greatest value, have the best attitudes, demonstrate integrity, and so on, and finding ways to support and reward those individuals. We frankly want longevity in our best people. The fact is that organizations today are often driven by short-term thinking, fear of failure, and limited innovation because of a reluctance to take risks when times are hard. However, that same reality also provides an opportunity to stand out in the crowd for leaders who care about others, take a long view, and embrace risk!

Postscript:

While all of my posts and articles are based at some level on my own experience over 35 years in the workplace, I rarely share specific personal stories. In this particular post, however, I think it is helpful to make an exception. The reality is that you can do the right thing as a leader and still pay a heavy price. You can focus on quality or sustainability or transformational change and be penalized or even fired by a boss or a board who does not care about those things. Out of your own sense of integrity you might prioritize compliance or legality or ethics over financial imperatives when those who employ you are more “flexible” in their own interpretations of right and wrong. You may care about human beings and thus make decisions that value people over process or short-term financial gain.

Fortunately, in most cases, when you do the right or smart or compassionate thing, you will be respected and rewarded, but not always. I have been separated from organizations for doing the things noted above when those things did not reflect the value systems of my employer. However, just as a good leader takes the long view in terms of performance outcomes, taking the long view in terms of your own integrity over a professional lifetime is also the right thing to do. In 25 years in management positions, I have been rewarded and have benefitted from doing the right thing far more often than I have been penalized.

In the end, none of us wants our legacy to be how many people we fired or laid off rather than how many we coached and saved, or how many short-term profit or sales goals we exceeded while sacrificing quality or the survival of the organization. I have seen short-term, unethical thinking literally destroy a healthy organization, wiping out thousands of jobs, stranding tens of thousands of customers, and erasing hundreds of millions of dollars in equity. I have seen greed overrule compassion and brilliant strategy sacrificed to ego. And I have personally paid a price for challenging the status quo, but I have also been inspired by servant leaders and experienced the joy of leading a team through transformational change to achieve things they never thought possible. In the end, we will not be judged by our wealth or our status or our conquests, but by the good we have done, and that is a legacy for which we should be willing to occasionally sacrifice our own well-being.

Our Faustian Bargain With Technology

Most of us reflexively think of technology as more of a “good” thing than a “bad” thing, and that is probably objectively true in some ways and contexts. If we look at fields such as medicine, transportation, and communications, for example, and compare how technology has affected those arenas today vs., say, 25 years ago, most of us would say that we are better off.

It is because of technology that someone in Abu Dhabi can buy an avocado grown in Kenya (and that farmers in Kenya know in real time where to get the best price for their products). It is because of technology that early detection of colon cancer has greatly increased survival rates and that surgeons can correct heart defects before babies are even born. It is because of communications technology that two people, regardless of location, can connect in real time and share virtually unlimited information of interest to both. Based on these examples, most people would say that technology represents a positive influence in our world and our lives.

On the other hand, technology has become so pervasive in virtually every aspect of our personal and professional lives that we have become fundamentally dependent on it to transact even the most basic tasks of our daily existence, to the point that most people now mediate their primary human relationships through technology.

Technology has done big things in the very recent past. It has democratized access to information. It has effectively eliminated time and geography as barriers to communication. It has facilitated automation that has completely changed how entire industries function—and how the humans in those industries work and interact. It has made some products and services much cheaper (or even free).

It has also been incredibly disruptive, both in how it impacts our lives and because of the increasingly rapid pace at which new technologies enter the mainstream of human life and enterprise, fundamentally changing how we communicate, work, behave, and interact with one another. It has wiped out entire labor markets, and it has shifted the risk of failure from single entities to entire systems, and this is a really, really big deal.

In the “old days,” when technology was mechanical or electro-mechanical, machines failed all the time. However, those failures were limited to individual machines (a car, a loom, a washing machine, etc.). Now, with almost all technology employing some sort of software, software that is connected to other things also running software, we are vulnerable to massive, even catastrophic failures—and it’s already happening. Entire airlines are grounded across the globe. Power grids go down. Vehicle fleets are pulled off the roads—because a string of software code fails or is hacked or was flawed to begin with. The current process for writing that code is deeply flawed as well, but that is another post…

In short, a new car, for example, typically requires millions of lines of code running on multiple processors. There is no way for any programmer, or group of programmers, to anticipate the billions of potential combinations of scenarios that a driver and the car (and hundreds of thousands of other cars and drivers) will encounter over millions of miles in highly diverse environments. As a result, a piece of code that is supposed to stop a car from accelerating, for example, given enough time and scenarios, will inevitably fail, and the car will keep accelerating even after the driver takes her foot off the gas pedal. This has already happened, resulting in accidents and even deaths. And the problem affects every single car running the same code.
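To see why the scenario space is so intractable, here is a minimal counting sketch. The number of interacting inputs, the states per input, and the testing rate are invented, deliberately conservative assumptions, not figures from any actual vehicle platform.

```python
# Minimal sketch of the combinatorial explosion behind vehicle software testing.
# The input count, states per input, and test rate are assumed, illustrative
# numbers, not figures from any real car.

inputs = 40                # assumed interacting signals (pedal position, speed, gear, brake, ...)
states_per_input = 10      # assumed coarse discretization of each signal

total_combinations = states_per_input ** inputs   # 10**40 distinct input states

tests_per_second = 1_000_000                      # assumed very fast automated test rig
seconds_per_year = 60 * 60 * 24 * 365
years_to_exhaust = total_combinations / (tests_per_second * seconds_per_year)

print(f"Distinct input combinations: {total_combinations:.3e}")
print(f"Years to exercise every one at 1M tests/second: {years_to_exhaust:.3e}")
```

Even with these modest assumptions, exhaustively testing every combination would take longer than the age of the universe, which is why rare, unanticipated interactions inevitably slip through.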

It is becoming clear that there are at least a few areas in which our “bargain” with technology may have come at a very steep price: privacy, independence, vulnerability to failure, human development, mental health, and human relationships—and we have barely begun the next technology era of artificial intelligence and mixed/virtual reality! The reality is that we are all subjects in a very big experiment, and we frankly don’t know what the outcome will be, particularly for young people (tech natives) who represent the first generation in human history to have lived their entire lives mediated through technology and tethered to smart devices.

This horse is way, way out of the barn. We are not going back to a world that is not driven by software, and on balance, most of us are relatively satisfied with the technology we use every day (and depend on without even knowing it). But there are two realities that we should be thoughtful about. One is that there will continue to be massive, catastrophic systems failures, and it will get worse before it gets better. Part of this isn’t even related to technology failing directly—it comes from support systems failing. Ask the people of Puerto Rico about life without smart phones, ATMs, and internet, all of which need electricity and other infrastructure to function.

The second reality is that we can still individually carve out space in our lives that is mostly tech-free if we choose to, and we should do that on a regular basis. As human beings, our relationships probably need time with others that is not mediated through technology. We know that spending significant, unbroken hours engaged with laptops, tablets, smart phones, and video games has documented effects on our brains and bodies (think concentration, sleep, and eating patterns). We also know that social media without breaks can increase anxiety and decrease self-esteem, while also generating a great deal of stress. A growing body of research shows that human beings need and benefit from extended periods of time in social and natural environments that are not mediated in any way through technology. At least for now we still have some control over that, and we should regularly exercise that control for our own benefit and the benefit of others.

Why Organizations Are Often Disappointed with Consultants

Many organizations invest significant amounts of time and energy in consultants only to later hear employees and managers say things like, “they didn’t tell us anything we didn’t already know,” or “their ideas don’t work in the real world.”

As someone who has worked extensively as a consultant and hired consultants, I’ve had a broad range of experiences. Under the right circumstances, consultants can be very valuable, but often they don’t seem to provide any measurable ROI, while creating significant distraction in the process.

Why is this so?

At the most basic level, many organizations contract with consultants to solve problems or achieve outcomes for which a consultancy model is simply not the right choice to begin with. Things like “improving quality” or “developing growth strategies” are reasonable organizational goals, but they should probably be core competencies developed by the organization itself rather than something that comes from unaccountable external consultants. There is certainly value in having an external set of eyes validate a growth strategy or quality initiative, but if an organization is asking consultants to determine the course of action, that arrangement is likely to fail. It is likely to fail because even if consultants have a high level of appropriate industry knowledge (and they often don’t), they have no accountability for executing the strategies or suggestions they present. Moreover, by definition, consultants are “outsiders.” They rarely have a meaningful understanding of organizational culture or history or politics. They also tend to work in a silo defined by the nature of their contract. As a result, they often have no idea how their recommendations for “improving sales,” for example, are connected to the organization’s training or marketing capabilities. Lastly, and this is really important, consultants are almost never involved in the hard work of operationalizing and implementing their recommendations. Most often, they provide a report, do some follow-up conversation, and move on to the next client, while the employees in the organization are left to implement strategies that many do not fully agree with, that are often not appropriately resourced, that don’t fit into “normal” operations, and that no one in the organization owns.

Organizations are more likely to solve the kinds of challenges noted above by providing training and development opportunities for their own employees and managers, and having them use their newfound knowledge and capabilities to develop, and then implement, new operational strategies that they own and are enthusiastic about. The training and development activities can certainly come from external sources, including consultants, but that is different from asking consultants to provide the new strategies themselves.

On the other hand, consulting projects are much more likely to provide valuable outcomes if they are designed to produce tangible products or resources that enhance daily operations. For example, hiring a consultant to write a training curriculum or to code scripts that add functionality to a back-office or customer-service application tends to provide great ROI because the engagement results in concrete tools the organization can use to do things it couldn’t do before. I have also found that consulting projects that directly transfer skills and knowledge (rather than abstract ideas or strategies) are generally very successful because they empower employees (who are the ones left behind when the consultant leaves) to work in different, more effective ways. Similarly, consultants who can identify and articulate very specific, actionable changes in a process, or who can provide an instrument to accomplish articulated goals, are generally seen to bring real value.

Additionally, in the big corporate consulting firms, most of the consultants put in the field are very young, very inexperienced entry-level professionals, often in their first jobs out of college, who have little or even zero actual operational experience. They work very hard and they are very smart, but they have no internal “gut check” when it comes to the real-life viability of their observations and recommendations. It is also common to see consulting teams in which, despite their intelligence and work ethic, not a single member is actually an operational expert in the industry or subject area of the consulting project.

The big firms have great resources, particularly when it comes to research and technology, but they also tend to have deep biases toward what they have already done for other clients. Because of that, contracting organizations often get modified versions of previously dispensed advice, suggestions, and models. In fact, in certain areas, the consulting is actually based on previously designed templates that get repurposed for new clients. Sometimes the templates are the product of very high quality work, but the very nature of the template framework pushes consultants in predetermined directions that can miss critical context for a specific organization.

When organizations find themselves dissatisfied with consultants, the problem is generally not the concept of consulting itself. In fact, under the right circumstances, a good consultancy can bring significant value to an organization. The reason consulting arrangements often do not bring the desired value is typically because of a mismatch between what the organization needs vs. how consultancy models usually work. For most operational issues, I have found that “boutique” consultancies that are experts in the subject area and industry of the consulting work are often more effective than larger, generalist firms. Of course, the opposite can also be true, particularly if a project requires deep pockets and resources on the part of the consultant. This can be the case with large research projects, for example.

In short, if an organization finds itself using a consultancy to compensate for what should be core competencies or if the consultants are not experts in the work of the organization, there will likely be dissatisfaction on the horizon. On the other hand, the more tangible, applicable, and customized a consulting product, the more likely it is to bring value.

The Two Monopolies that Keep American Higher Education Afloat

In a previous post, I wrote about how tectonic shifts in the macro environment in which American colleges and universities operate would almost certainly result in drastic change over the next couple of decades. Those shifts have profoundly affected the financial model of higher education as well as the value proposition for many millions of potential students. On the other hand, higher education as a societal institution or as an industry has had remarkable staying power in the face of profound change that has wholly remade other sectors of society. How is it that higher education has broadly managed to survive in much the same form it has been in for a century while other sectors and industries have undergone foundational change, and in many cases, simply vanished?

This question is even more interesting considering that educational institutions are generally more inertia-bound than just about any other type of organization one can think of.

The simple fact is that American higher education has continued to operate with little systemic innovation, and at times something close to disdain for the needs of its “customers,” because of two monopolies that have protected it from the market realities that affect other industries. The first monopoly is access to federal financial aid. The federal government disburses many billions of dollars a year to institutions of higher education (IHEs) in the names of their students. This largesse comes in the form of grants and guaranteed loans, without which no more than a handful of the thousands of Title IV-eligible institutions could operate. This financial aid is available only to the “cartel” of educational institutions that follow the narrow, static requirements of recognized accreditors. The second monopoly is less structurally assured but equally important: the control that accredited IHEs have had over the credentials that graduates use to gain employment and take licensure exams. In another article I noted that the monopoly over credentials is weakening in the face of the growth of the gig economy and the alternative ways that Millennials are learning job skills that do not require attendance at universities. Even if weakened, that monopoly will continue to exist in some form as long as employers and licensing boards support it. It is actually stronger in many countries outside the U.S., particularly in developing countries and regions that have very traditional notions of higher education.

To be clear, even with the monopolies protecting its existence, higher education is not thriving. It is surviving, trapped by a powerful structural dilemma. The accreditation system that IHEs must adhere to in order to qualify for federal aid is a painfully stifling force against innovation, as are the arcane, century-old Department of Education rules governing the kinds of educational experiences that are eligible for federal aid. This status quo creates a serious quandary for IHEs, which must innovate to survive from a market perspective, but which are prevented from being truly creative if they want continued access to federal financial aid. To make matters worse, the U.S. Department of Education is hampered by as much or more inertia than the IHEs themselves!

While it is possible that education models that do not depend on federal financial aid will begin to chip away at the edges of the traditional higher education edifice, offering limited innovative alternatives to students, until the current anachronistic and innovation-killing higher education funding model changes, the traditional system will continue its slow and painful slide toward declining enrollment and declining relevance.

How Millennials Are Driving the Evolution of Online Learning: It’s Not What You Think

Online learning as we currently understand it has existed in some form since roughly 1995, about a year after the introduction of the World Wide Web. By some measures, online learning has been a huge success in the sense that today millions of college students in the U.S. take some or all of their courses via online delivery. Globally, the numbers are much larger. In fact, about 25% of all undergraduate content delivered in U.S. higher education today is accessed via learning management systems (LMSs) like Blackboard or Moodle, and over half of graduate-level content is accessed the same way. However, this form of online learning is dwarfed by the learner-driven activity taking place every day, driven largely by Millennials. While there is a broad consensus that MOOCs (Massive Open Online Courses) have flopped relative to their initial, lofty goals, the fact is that MOOCs are fast becoming the primary mode of learning and skill development for tens of millions of people in the U.S. and around the globe. It’s called YouTube. And Wikipedia and Instagram and Tumblr and Vimeo and Facebook and other social media platforms.

The single greatest repository of educational content in the world is on YouTube, which hosts millions of tutorials on virtually every conceivable subject. Research by Google (the owner of YouTube) has found that nearly 70% of Millennials believe they can learn anything via YouTube tutorials, and over 90% of young professionals access social media sites for ideas, data, and instruction when completing projects at work. What is really important about this phenomenon is that younger users in particular see their learner relationship with social media as organic and undifferentiated from their other relationships with, and uses of, social media platforms. They have grown up with this self-defined version of eLearning and they are exceptionally confident learners in this medium. While people of all ages and backgrounds are using YouTube and wikis for self-education, Millennials are fearless when it comes to teaching themselves how to do virtually anything—and they are largely successful. They are also the creators of much of the educational content on YouTube and other platforms, completing a circle of learner and tutor.

Why does this reality matter for “traditional” online learning and education in general?

As with other user-driven trends, many millions of people are organically shaping eLearning in ways that diverge sharply from the online education currently offered by colleges and universities. In other words, although traditional eLearning delivered by institutions of higher education (IHEs) has become quite sophisticated in terms of the pedagogical models embedded in the LMSs those institutions use, these formal eLearning models force learners to abandon the behaviors they use daily in other internet activities, especially those connected to self-directed learning.

The way that many, many more “students” learn online, via YouTube, wikis, and knowledge shared across multiple social media platforms, is moving sharply away from how online learning is designed and delivered in universities. This does not bode well for IHEs, which are frankly asking students to use tools they find less effective and with which they are less comfortable, in a medium that they otherwise use pervasively to transact every aspect of their lives.

So far, traditional IHEs have been protected by a monopoly over credentials, which until recently, has been largely supported by employers. That monopoly is now at significant risk due to a major shift, not in higher education, but in the economy.

The rise of the gig economy has many implications for those who work in that sector of the economy, implications which have been discussed at length by economists, sociologists, policy makers, and others. What has not been discussed is the extent to which this shift to on-demand piecework will have huge impacts on the educational paths that people have traditionally taken on their way to skilled positions. The reason is simply that the gig economy is based on deliverables, not on educational credentials. A video editing project or a training curriculum is successful if it meets the needs of the contracting entity; the contractor’s academic credentials are close to irrelevant. Doing good work, to spec and on time, is the marker of value (and of continued access to subsequent contracts). Young people working in both the gig and traditional economies go to social media and other internet platforms (not colleges and universities) as their first choice for professional and skills development. They are also much less “proprietary” in their thinking about intellectual property, either their own or that of others, and see the sharing of knowledge and skills as a “public domain” activity. Finally, Millennials are proving to be quite utilitarian and pragmatic and much less “hung up” on the formalities and conventions of previous generations. That, combined with an economy that is also shifting in fundamental ways, has huge implications for education in general and online education in particular.

Change Management Has Become a Core Leadership Requirement: A Few Things You Need to Know

In a previous article I suggested that the current demands on executive leadership have become so deep, broad, and complex that it is basically impossible for any one person to possess all of the competencies required by contemporary organizations and operating environments. As a result, I further suggested that we are probably better off focusing on a narrow set of competencies and traits that are essential to effective leadership. One of those—and possibly the most critical at this point—is change management. The reason is that contemporary environments are so dynamic that much of what a leader does on a day-to-day basis is to identify and manage initiatives, strategies, projects, and the like that all represent change of one kind or another. It is change management itself that ties most of a leader’s efforts together. This phenomenon has been in play for at least a few decades, but the rate and volatility of change are qualitatively different and more challenging now than in the past. One key reason is that technology is an accelerant of change and a disruptor of the status quo. I will address that in detail in a subsequent post.

So, what are the implications of this reality for leaders and what do you do about it?

First, both individuals and organizations are inherently stressed by change itself. In other words, just the presence of change elicits reflexive stress responses from people individually and from organizations collectively. Because of that reality, it is important to limit the depth and breadth of change that anyone or any organization must absorb at one time. This can be really challenging for leaders who possess a bias for action. We often evaluate ourselves based on how much activity we are driving, how many strategies are in play, etc. While understandable, a much more productive approach in the current environment is to very carefully and purposefully prioritize initiatives or strategies based on the relative return on investment (not just monetary) and/or criticality of each initiative. Because there is a limit to what people and organizations can effectively absorb relative to change, both leaders and their organizations are better off if they engage in fewer distinct initiatives (changes) at a time, but get more value out of the ones they choose.

Second, because of the first point above, it is critical to realize that the success of virtually any initiative or project depends as much on the extent to which it is treated as a change management challenge as it does on the actual strategies and resources connected to executing the project itself. As such, leaders must build a change management plan into the process of executing the initiative, project, or strategy.

Lastly, we frequently talk about “change management” as this thing that happens rather than a thing we do. Leaders must approach change management itself as a formal process with best practice steps and components in the same way they would address conflict resolution or budgeting or strategic planning, etc. It is beyond the scope of this article to present an entire change management process, but you can see an example here.

In short, it is extremely unlikely that leaders can be effective today without understanding that change management is core to just about everything they do and that because of that reality, leaders must be as purposeful about managing change as they are about executing on any strategy, project or initiative deemed critical to organizational success.

Don’t Just Fix Problems. Let People Know that You Are Fixing Problems.

Some of my posts are about “big” issues like how effective leaders treat other people and some are little nuggets like how to schedule one’s day to include things you actually want to do. This post is one of the little nuggets that can help to smooth day to day operations.

A characteristic common to all good managers is the ability to fix problems. People respect and appreciate individuals who can size up a challenge and implement a solution to that challenge. And they expect managers to help them be successful. However, many of the best solutions take time to develop and implement. In the meantime, if you’re working feverishly on a great solution to a problem, does anyone know that? A key, but often overlooked, component of good management is clear communication about what is being done at any given time to solve a problem or challenge.

I learned several years ago that when someone brings a problem or challenge to your attention, even if you are dedicating significant time and resources to helping that person, if you don’t explicitly tell him or her what you are doing, he or she is likely to think you are doing nothing, which creates a double problem. First, the employee feels anxiety over the fact that he or she still has the problem, and second, the person is also likely to feel that you didn’t hear him or her, or don’t care about the person (or the problem), because you aren’t doing anything about it. Ninety percent of satisfaction is managing expectations. So, if, on the other hand, you let people know what you are doing to solve or help solve their problem, what the likely timeline is, and what kind of resolution they can expect, they will feel confident about the problem—because they know you’re working on it—and they will appreciate your efforts, even though the problem hasn’t been solved yet. Periodic updates, even a quick email or SMS, while you are working on a solution to a problem or an answer to a request will go a long way toward keeping employees engaged and appreciative, and they will know you haven’t forgotten them or their issue!

Is There a Future for Higher Education As We Know It?

A decade ago pundits began predicting the demise of traditional higher education at the hands of online education and, more recently, at the hands of MOOCs (Massive Open Online Courses). While that has not come to pass, something else has, and it does not bode well for higher education, at least in the U.S.

Over the last few years, for the first time in modern history, large numbers of Americans have begun to question the underlying value proposition of higher education. Within some sectors of society, a majority of people have decided that colleges and universities are “not good for the country.” This is astonishing.

My personal experience in higher education as a student and a professional educator spans 35 years. I came into the system in the early 1980s at the tail end of its “glory days,” as an exclusive, well-funded, highly respected institution of American society and enterprise. While significant structural change has been in play for a good 25 years, higher education is entering the most turbulent time in its history.

So, what has happened?

Higher Education as an “industry” has hit three simultaneous tipping points that collectively will almost certainly change the very foundation of post-secondary education:

  • Student debt (in excess of 1.2 TRILLION dollars)
  • An unsustainable financial model for traditional colleges and universities
  • A disconnect between what colleges and universities do and what employers and the economy need

Over the next couple of decades, these three tipping points will combine to:

  • materially reduce the overall number of colleges and universities (and locations)
  • continue the decline in the number of “traditional” students who spend four years on a residential campus
  • virtually eliminate career, full time, tenure track professorships
  • bifurcate higher education into a smaller market of institutions that have the student demand and financial means to operate in more traditional ways and a larger set of institutions based on quickly changing, market-driven professional programs, many of which will not lead to traditional credentials such as bachelor’s degrees.

The lines between for-profit and not-for-profit will blur as all institutions will have to operate more as “retail” entities (think aggressive marketing, fierce competition for students, intense focus on financial metrics) than what we think of as traditional colleges and universities. And unfortunately for institutions of higher education, supply is currently in excess of demand.

Tipping Point 1: Student Debt

Student debt is not new. What is new is the enormous shift from collective funding of higher education by states (taxpayers) to individual funding by students and their families. This represents a massive paradigm shift in both funding and political advocacy that has resulted in extraordinary, and unsustainable, levels of private debt. Whereas students used to take on debt that was relatively small and could be paid off within several years of graduation, the average student debt today is tens of thousands of dollars, and hundreds of thousands in some fields of study. Despite the conventional wisdom, fueled by media reports, the highest levels of student debt are not held by students of for-profit institutions (although that debt is problematic as well). The highest per capita debt is held by students attending “middle of the road” not-for-profit private liberal arts colleges that do not have the means to provide large scholarships and grants (but still charge high tuition). As a result, students have been borrowing heavily from both the government and private lenders in order to afford what is typically very high cost education. In just the last few years, however, a tipping point has been reached: borrowing by students and families declined 20% between 2010 and 2015. In basic terms, students and their families have determined that the return on investment of a lifetime of student debt is simply not favorable and, in fact, in many cases will leave them worse off than if they had never attended college at all!

In short, students and families, in large numbers, are now refusing to incur such unsustainable debt, which is translating into financial crises across much of the private, not-for-profit liberal arts sector of higher education, resulting in unsustainable deficit spending and increasing school closures. Roughly one third of such colleges now operate at a deficit. Approximately 15 not-for-profit private colleges are closing per year, and that number will likely increase going forward. In the for-profit sector there has been a veritable bloodbath, with nearly a thousand locations closing in the last five years, and it has not hit bottom yet.

Tipping Point 2: An Unsustainable Financial Model

Tipping point two is related to tipping point one. In most private institutions, which have always been more tuition-dependent than their public brethren, the reluctance of students to continue borrowing has hit revenues hard. In the short term, the situation is exacerbated by the fact that we are in a demographic trough of high school graduates, so enrollments are also depressed by the simple fact of fewer available students. The underlying reality is that the overhead of a typical private liberal arts college or university is far in excess of what a given student population could possibly pay in tuition. As an example, even a small liberal arts college is likely to have an annual budget for physical plant, personnel, and operations of thirty to fifty million dollars. If we split the difference and use $40,000,000, then with a student population of 800 students (common for a small liberal arts college), each student would have to pay $50,000 per year (not including room and board) for the college to break even. In a typical institution of this type, the average student’s tuition is, at most, half of that $50,000 figure. Why? Most liberal arts colleges today are heavily discounting tuition in an attempt to hit their enrollment goals because the market simply will not support the “retail” tuition rates. The vast majority of small to medium size private colleges do not have adequate alternative sources of revenue (endowments, legislative appropriations, research grants, etc.) to fund the half of their expense budgets not paid for by students. Many of them are technically insolvent already and are surviving on deficit spending. There is simply no viable business model for most of the institutions of this type operating today. Many will merge, change, or close over the next couple of decades. And, as it relates to tipping point one, even with heavily discounted rates and scholarships, the average student still must borrow tens of thousands of dollars a year to cover the portion of tuition, books, and room and board that the institution does not fund.
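To make the break-even arithmetic explicit, here is a minimal back-of-the-envelope sketch using the rough figures above: a $40 million operating budget, 800 students, and net tuition discounted to roughly half of the break-even price. These are the illustrative assumptions from the paragraph, not data from any particular college.

```python
# Illustrative back-of-the-envelope model of the liberal arts tuition gap.
# Budget, enrollment, and discount figures are the rough, assumed numbers
# from the discussion above, not data from any real institution.

annual_budget = 40_000_000   # assumed budget for plant, personnel, and operations
enrollment = 800             # assumed student headcount
avg_net_tuition = 25_000     # assumed tuition actually paid (~50% discount rate)

break_even_per_student = annual_budget / enrollment    # $50,000 per student
tuition_revenue = enrollment * avg_net_tuition         # $20,000,000
funding_gap = annual_budget - tuition_revenue          # $20,000,000

print(f"Break-even price per student: ${break_even_per_student:,.0f}")
print(f"Net tuition revenue:          ${tuition_revenue:,.0f}")
print(f"Gap to cover from endowment, gifts, or deficit spending: ${funding_gap:,.0f}")
```

Under these assumptions, half of the operating budget is simply unfunded by tuition, which is the structural hole that endowments, appropriations, or deficit spending must fill.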

In public institutions, which are far more tuition-dependent than they were 25 or 30 years ago, there is an uneven story of haves and have-nots, but broadly speaking, the “second tier” (non-flagship) colleges and universities are struggling with both declining enrollment and declining funding. A similar situation is affecting community colleges, but they have the advantage of attracting students based on lower cost.

Through the early 1980s, roughly 75% of public college budgets were funded by state taxpayers. Today, that number averages less than 25% and, in many cases, is less than 10%. In fact, many of these institutions are actually insolvent but are still open because, interestingly, there are no formal mechanisms to close them. As a result we are seeing program closures, layoffs, deferrals of critical maintenance, failing technology, defaults on loans and bond payments, and other serious problems. Accrediting bodies are loath to sanction public institutions, even when they are clearly failing to meet articulated standards. If that were not the case we would see large numbers of institutions on probation for “financial exigency” and other failures to meet standards. The accreditors have some cover on the financial exigency issue due to arcane formulas that often allow institutions to count the “full faith and credit” of their states even when the institution itself is broke and not actually benefiting in any way from unallocated resources in state budgets.

Tipping Point 3: The Disconnect between What Colleges Do and What Society Needs

This may be the most interesting of the three tipping points because it offers options to the most entrepreneurial institutions. From the employer perspective, traditional colleges and universities continue to graduate large numbers of students who are nowhere near “job ready” on graduation day. Moreover, the bulk of job growth today and for the foreseeable future is connected to technical fields that require post-secondary education but do not require bachelor’s degrees. Examples are as varied as allied health and HVAC (heating, ventilation, and air conditioning). The sector of higher education that is currently best aligned with employer needs is the career college sector, which is predominantly comprised of for-profit institutions. Community colleges also tend to be better aligned with employer needs, but graduation rates in those institutions are so low that comparatively few students actually benefit from their programs.

Broadly speaking, the very core of higher education for over a century—the college degree—is at risk of obsolescence. And the model of dedicating four to six years of study to a bachelor’s degree is increasingly out of step with the reality of most students. The financial return on investment is becoming more and more dubious in many academic majors, as is the curricular model itself. Concepts such as competency-based learning, certificates, and badges are gaining currency in the labor market and, in many cases, carry greater value (and certainly greater ROI) than higher-level academic degrees for large numbers of students.

The reality is that higher education (particularly for private institutions) has become as susceptible to market forces as just about any other sector of the economy. Students are customers, price matters, and competition is fierce. As noted, supply is in excess of demand as well, which will contribute to school closures, consolidations, and reorganizations. The recent acquisition of Kaplan University by Purdue University, a stunning reversal of the typical acquisition process, is a powerful indication of how even a public flagship institution like Purdue recognizes that its future is largely dependent on the ability to expand the student market and, importantly, to reach students via delivery models (Kaplan has a very robust online infrastructure) that are sensitive to market realities.

To be clear, there is a future, a bright future, for post-secondary education, but it will look very different from the higher education system of the last century. High-demand elite private institutions with robust non-tuition revenue streams will survive and even thrive into the foreseeable future because they serve a small, highly exclusive minority of the student population to begin with (think Ivy League, Stanford, etc.). We can expect a similar outcome for large, flagship state universities with political advocacy and robust alternative revenue streams. Many community colleges will persevere as long as they continue to receive taxpayer funding and serve a broad-based community mission, but they will have to solve their current graduation problem. A material number of four-year state institutions will also survive in some form as long as they are perceived to offer a reasonable return on investment for their students and students’ families. A smaller, but likely very robust, career college sector will survive and likely thrive because it is structured to provide value to students and employers with job-focused, generally shorter-term programs. The biggest losers over the next few decades will be small to medium size, less exclusive liberal arts colleges, many of which will simply close. We will also see far fewer humanities programs across all sectors of higher education, simply because financial models in most institutions will no longer support programs that are net revenue consumers rather than producers. Ph.D. programs in the humanities and other arcane fields will also close in large numbers. Most institutions that survive will shift toward market-driven professional programs and even more “contingent” faculty (adjuncts). Anachronisms such as faculty rank and tenure will still exist, but in a continually contracting number of institutions.

The biggest winners may be institutions that don’t even exist in any material sense now—those that can provide affordable (mostly non-degree) credentials that lead directly to employment, followed by “re-credentialing” and back to employment as a repeated cycle over professional lifetimes. These institutions are likely to be much less dependent or completely non-dependent on Title IV financial aid than mainstream higher education institutions.

While it is difficult to know exactly how much “traditional” higher education will contract overall, if we extrapolate recent trends and assume a slight increase in those trends based on lower demand, it is not unreasonable to think that upwards of a third of all higher education institutions could consolidate or close in the next 20 years—and a potentially larger percentage among private liberal arts institutions. The situation would be even worse for for-profit career colleges except that the sector has already contracted so dramatically that it is reaching a new equilibrium. It may end up providing the best return on investment for students who complete career focused programs in high demand fields.
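For what it is worth, the “upwards of a third” figure can be reproduced with a very simple compounding model. The starting and ending attrition rates below are assumptions chosen only to illustrate the shape of the extrapolation, not estimates drawn from sector data.

```python
# Toy extrapolation behind the "upwards of a third" estimate above.
# Every rate here is an assumption for illustration, not a forecast
# built on actual closure statistics.

start_rate = 0.010   # assumed annual closure/consolidation rate today (~1%)
end_rate = 0.025     # assumed rate after 20 years of increasing pressure (~2.5%)
years = 20

surviving_share = 1.0
for year in range(years):
    # linearly ramp the attrition rate as demand pressure grows
    rate = start_rate + (end_rate - start_rate) * year / (years - 1)
    surviving_share *= (1.0 - rate)

print(f"Cumulative share closed or consolidated after {years} years: "
      f"{1.0 - surviving_share:.0%}")   # roughly 30% under these assumptions
```

A modest, steadily rising attrition rate compounds to roughly a third of institutions over two decades, which is why the estimate is not as dramatic as it may first sound.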

In short, some parts of the American higher education system will survive and look remarkably similar over the next few decades, but that will occur in the midst of an overall contraction, a continued move toward more market driven programs, more contingent labor, and more retail-like business and customer models. One can debate the overall plusses and minuses of such a future, but the current reality will no longer support the higher education model of the last century at anywhere near its historical scale.