Things I Now Know For Sure About Successful Leadership

I have been truly fortunate over a long management career to have experienced an incredible diversity of leadership and management opportunities in different kinds of organizations, large and small, in multiple countries, across a variety of structures, missions, business objectives, etc. From my first management position 25 years ago, leading a small department in a rural Idaho school district with one employee, to heading a multinational corporation with thousands of employees based outside of the U.S., I’ve been blessed with learning opportunities most leaders will not have in their entire careers.

I have also been challenged with running organizations in different cultural and national contexts and have had to decide in what ways I would adjust my own leadership strategies and methods vs. how I would ask the organizations to adapt to me!

Now, in my 25th year of management and 30th year in the “professional” workplace, through many failures and successes, I have come to understand several truths that transcend culture, organizational structure, financial model, etc. There are no profound secrets here, but I have both scars and victories that serve to identify what has been most important and what I believe is transferable to almost any leadership context, from entry-level manager to CEO, in virtually any organizational setting. Some of these things relate to leaders as individuals and some relate to organizations. You can also see how “less is more” for most leaders here.

Teams do better, more valuable work than even the most talented individuals.

Therefore, building and supporting teams is key to your success as a leader.

At every level of leadership, listening is more important than talking.

Yes, you must articulate a compelling vision and communicate in convincing ways, but your efficacy is based more on what you learn from listening than what you gain from talking.

Mistakes are better teachers than successes.

The best leaders have generally made the most mistakes, and importantly, learned the most profound lessons from those mistakes. Moreover, someone who makes good faith mistakes is taking risks—and growing as a person and a leader.

Results matter, but they are not the whole story.

Most people at the top of organizations are there because they are “results oriented” and have a sense of personal accountability. They have consistently achieved objectives over time. However, today’s organizational realities typically present leadership challenges that go far beyond just “hitting the numbers.” See an in-depth discussion here.

Make sure you know what matters to your leader(s).

Whether you report to a manager above you in the organization or to a Board of Directors, you can save yourself a lot of grief by knowing what the priorities above you are.

Related: Make sure your bosses and your team know what you are doing.

Almost all of us communicate less well than we think we do. Ensuring that your bosses and colleagues are fully informed about what you’re doing and why, all the time, will head off problems before they happen while also increasing the likelihood that everyone is rowing in the same direction.

Organizational culture is more powerful than strategy or planning or just about anything else.

As the management guru Peter Drucker once said, “culture eats strategy for breakfast.” As a leader your success is inextricably linked to your ability to align culture with strategy. Ignore this at your own peril.

Human capital will get you farther than financial capital, technology or even strategy.

In an ideal world your organization or division or department would be well capitalized and you’d be implementing brilliant strategy supported by world class technology, but even if those things are true—and they often aren’t—your ability to execute on plans and strategies, to make change, to innovate, comes from people.

Related: Leading people well pays greater dividends than managing processes.

See the point above. Your colleagues are “force multipliers” for your vision, strategies, operational plans, etc. Focus on what will make them successful and you will be successful.

Related: What you say and do has a bigger impact on your subordinates than you realize.

Be careful. You can empower and hurt your subordinates more easily than you think you can.

Take care of yourself at least as well as you take care of others.

Leadership is stressful and the stakes are high, but the most important thing you can do for your own success is to care for yourself. Get adequate sleep and create space to relax and think. Meditate. Exercise.

Integrity will sustain you over time.

You can certainly achieve short-term wins with “flexible” ethics or by using/abusing people or by saying one thing and doing another. It’s also true that you can operate with unimpeachable integrity and experience failures. But your longevity as a leader, and sustainable success, requires integrity more than just about anything else.

Related: Have a personal vision statement.

This is really important and surprisingly rare. Without a “true north” as a person, it is impossible to have a true north as a leader. Take the time to figure out what matters to you and who you want to be. See more about this here.

A surprising part of successful leadership is simply being stable and predictable.

Most organizations today find themselves operating in turbulent circumstances on a regular basis. It is hard to overestimate the value to the crew of seeing the captain calmly and competently steering the ship, particularly in rough seas.

Over time being a decent human being is a far better legacy than being a rich and powerful jerk.

Some of the most “successful” people in business are also some of the worst people in business. However, at the end of the last day, having hurt or cheated or disrespected people in the pursuit of riches or power is worth zero. Living on in those you have helped is priceless.

Of course, this is not an exhaustive list of the most important things I’ve learned as a manager/leader so far, and I have no doubt that I have more important lessons to learn. However, a common thread among the items shared here is that essentially all of them made the list because I made some sort of mistake related to each one at one time or another. In some cases, I just didn’t know how important they were, or I ignored them, or I thought I could finesse my way around them. Another common thread is that I think they apply across organizational and cultural contexts. I have learned other things that might apply in a traditional, not-for-profit liberal arts college in the U.S., but not so much in a private equity-owned, for-profit business in Latin America. My sense is that the list above applies at some level just about everywhere. I hope some of the lessons I’ve learned are helpful in your leadership journey as well!

Higher Education’s Dirty Little Secret: Most Professors Know Little to Nothing about Teaching

At a high level, to be a truly effective teacher, one must, consciously or unconsciously, have a theory of practice about pedagogy (teaching and learning) itself. Imagine for a moment if a pilot had no theory of aerodynamics or a physician had no theory for diagnosis or treatment. That doesn’t seem likely, but it is standard operating procedure for professors in most institutions of higher education.

Surprisingly (if not shockingly), many instructors at the university level enter classrooms every day without any formal training on how their choices and behaviors will support, or undermine, learning in their students. The dirty little secret about college-level instructors is that the vast, vast majority of them have teaching roles because they have content expertise in some discipline—not because they have been taught anything about teaching. Some have natural proclivities or amenable traits, which certainly helps, but virtually no full-time, tenure-track professors outside of Education departments were or are hired because of their teaching expertise or effectiveness. Some colleges and universities do provide occasional, usually ad hoc, workshops to help instructors improve their effectiveness in the classroom (typically based on isolated tasks such as creating an assessment rubric or writing a lesson plan), but relative to the time, effort, and credentials dedicated to their content expertise, even the most ambitious training programs are drops in the bucket by comparison. It would be as if medical doctors completed training in the basic sciences but received no clinical training before engaging with patients!

There really isn’t a comparable situation in other professions (other than when experts in a given field are asked to teach neophytes). It is only in higher education that there is essentially no expectation or requirement that the practitioner have any expertise or credential for doing a large part, if not the majority, of his or her job. It’s frankly a little disconcerting. Of course, many faculty have spent long hours in the classroom, and through that experience have intuitively internalized ways of doing things that are likely to be more effective than others, but despite many years of such on-the-job training, most university professors cannot explain even the most basic learning or assessment theory, let alone the neuroscience behind how we make memories, learn, and develop new skills.

One exception to this reality can be found in most online or eLearning classes and programs. The reason is simply that online classes generally run on a learning management system (LMS) of some kind that requires instructors, instructional designers, or both to make pedagogical decisions as part of the course design and delivery process. Even if professors don’t understand the underlying theory, they are at least forced to think about fundamental questions such as how they will present content and assess student progress, how students will practice new skills, and how learners will discuss and process what they are learning. Another place where there is more focus on teaching is in career colleges, where curricula are more applied and instructors are more likely to be professional practitioners in the field being taught. Traditional, campus-based courses typically have no such pedagogical/application imperative, and professional practitioners are far less valued than those with terminal academic degrees. More on that in another post!

The good news is that a theory of practice, or pedagogy, can be learned by any professor, and even remedial training can greatly improve teaching effectiveness. However, professors rarely invest substantial effort in developing pedagogical expertise because being an effective teacher really isn’t important on many campuses—and such training is rarely available in any sustained way regardless. This is particularly true in large research universities, where getting hired, getting promoted, and getting tenure have very little to do with teaching. The situation is a little better today than, say, 20-plus years ago, but in traditional institutions, faculty are still broadly rewarded for research, publishing, and securing grant funding far more than they are for excellence in teaching. As large universities shift to more and more contingent (adjunct) instructors, the overall faculty focus on research and publishing will decrease, but there is no indication that institutional focus on teaching will grow in any appreciable way. Any material improvement will likely have to come from the adjunct faculty themselves.

So, what does this all mean for the institution of higher education and the people, both faculty and students, in it? First, as an educational enterprise, the university is structurally flawed and has been from its inception. This is a global phenomenon. That does not mean that students don’t learn or that college degrees don’t provide value. However, it does mean that many students do not learn nearly as much as they are capable of, and many others are so poorly served that they never complete their programs of study. In fact, most people do not realize that in the United States, over the entire history of higher education, the system has failed far more students than it has successfully served. Well over half of all the students who enroll in a college or university never complete a degree program! Even among the minority of students who do complete a degree, many succeed not because they have edifying learning experiences, but because they effectively manage (survive) the system for long enough to earn the credential. In fact, in many cases, employers want employees with college degrees not because of what the students have learned, but because having a degree proves that a person can start, persevere through, and finish a challenging long-term project.

How is it possible that a structurally flawed system that fails over half of its constituents has continued to operate without meaningful change for centuries? The simplest answer is that there is no viable, scalable alternative for post-secondary education. That will likely change in the not-too-distant future as more employers in certain fields move away from requiring college credentials and toward industry-based credentials. This is already happening in IT/Computer Science and will expand to other technical fields as well. Some disciplines, such as those in the health sciences, will continue to require college degrees as long as professional licensure for those positions continues to require degrees. If that barrier falls, then so will the university monopoly. Similarly, professions that require graduate degrees will also support the university monopoly in post-secondary education for some time into the future. Whether non-university, post-secondary educational models will provide teachers with pedagogical knowledge remains to be seen. However, a primary difference that already exists is that such models tend to be competency-based, providing a greater likelihood that the learning students do accomplish is immediately applicable in work settings. One can debate the relative value of a liberal vs. technical/vocational/professional education, but the same structural flaw applies regardless.

All stakeholders would certainly benefit if university leaders dedicated as much attention to the teaching skills of their faculty as they do to their academic credentials, research, fundraising, etc. In fact, institutions that are forward-thinking enough to make pedagogical excellence a core objective and priority will differentiate themselves from the crowd, creating a significant competitive advantage in the higher education market.

“I’m Sorry to Bother You. I Know You’re Busy.”

I would speculate that virtually every single person reading this post has uttered those words, or something very close, upon walking into your boss’s or some other senior manager’s office. I’m also confident that if you are in a senior leadership role, you hear something similar from subordinates almost every day. It’s time for all of us to stop saying those words.

I realize that in most cases we are simply trying to be polite or deferential, but the fact is that when you say, “I’m sorry to bother you,” you are suggesting that your reason for wanting to speak with the person is, by definition, less important than whatever he or she is already doing. When you say, “I know you are busy,” you are saying that the other things that keep the person busy all day are justified, but that your reason for adding to that busyness is not.

This silly need to be deferential or polite is an historical legacy of hierarchical organizations in which the boss’s time and ideas and decisions were always considered to be more important and smarter than anyone else’s. Regardless of how one feels about the hierarchy itself, the reality is that the complexity and volatility of day-to-day operations in virtually any contemporary organization, in any industry or field, is such that bosses no longer bring value (if they ever did) by controlling agendas and time and decisions. They bring value today by empowering many other people in the organization to do better work—ideally, collaboratively, in teams.

So, when someone walks into a superior’s workspace or calls on the phone, with or without a formal appointment, it is because that person has determined that he or she needs something (an opinion, a resource, an approval, etc.) that he or she will use to do his or her job. That interaction should not begin with, “I’m sorry to bother you. I know you are busy.” This may sound like a small thing, but it is actually a big thing. It is a mistake for anyone to start a conversation with words that immediately devalue the importance of their reason for being there in the first place!

For those of you who are senior managers and leaders, it is important to overtly redirect people when they say those self-deprecating words to you. I find myself doing this every day, but I feel strongly about it, so I do it. When anyone comes into my office to speak with me and they start the dialogue with, “I’m sorry to bother you. I know you’re busy,” I immediately reply with, “You aren’t bothering me. I want to hear what you have to say. And, yes, I’m busy, but I’m busy doing important things like meeting with you.” It is often a joy to see the look (surprise?) on the face of a colleague or customer when I do that!

For what it’s worth, I also do not apologize to my board members when I “take their time.” I never want them to think that the time they’re going to spend with me is any less important than anything else they could be doing!

A Key Component of Leadership: Knowing When to Be Directive vs. Building Consensus

The old-fashioned view of leadership is of the strong, decisive leader throwing out orders to willing underlings, who faithfully execute on behalf of the boss. Although this model of “leadership” fails to leverage the incredibly valuable input and due diligence that come from a more collaborative, consensus-oriented approach, it can actually be the preferred method in some situations.

So, when should you as a leader be more directive?

In my experience, the answer lies in both operational and cultural contexts.

Operationally, it is often preferable to be more directive when you simply don’t have the time for a more consensus-oriented approach. In such situations, you need to be fairly confident that your choices are more likely to be correct than not, but if time is genuinely critical, then making “executive” decisions can be valuable to the organization. A recent example in my own experience was the opportunity to save 75% of the normal acquisition cost of an ERP-level software application, but with a very tight deadline to accept or decline the offer. I did have colleagues evaluate the application against our current system, but ultimately made the decision myself. In this particular case, I had the advantage that a sister organization was already using the software platform and thus had a real-time, real-world consult, but it was a good example of an executive decision opportunity regardless. Another operational example might be when you have time, but your colleagues/subordinates do not. In other words, they are already stretched thin and you can do them a favor by making an executive decision that saves them the time and effort of doing so collectively. Similarly, sometimes you have dedicated the time to a collaborative approach, but that approach has produced two or three equally viable options. It is better to make an executive decision as a leader than to drag your colleagues through a never-ending process trying to get to one team decision.

There are also times when being more directive makes sense for cultural reasons. In organizations where hierarchy and autocracy are the norm, it can be quite disconcerting to employees when they are asked to contribute to consensus or even to make their own decisions. In fact, it can even be unfair in the sense that asking people to do something with which they have no experience can not only create a lot of dissonance, it can also result in poor decision making and execution. This same dynamic applies to individuals. Even if an organization, broadly speaking, is amenable to a collaborative, consensus-based approach, not everyone in the organization has the experience or confidence to participate effectively in such a model. In those cases, a good leader should develop that capacity in such a colleague over time, eventually weaning him or her off of dependence on decision making by the boss.

And sometimes, being decisive or “executive” as a leader, particularly in a time of stress or crisis, can be valuable because it can give others in the organization confidence that someone is “in charge” and will get them through whatever the challenge happens to be.

To be clear, generally speaking, leaders will arrive at higher quality decisions, and will get higher quality buy-in and execution on those decisions, using a more collaborative, consensus-based approach. This is true simply because multiple people almost always have better ideas than individuals, and teams almost always generate better work than individuals or groups that do not function as teams. However, part of good leadership is knowing when being more directive is preferable.

Why a Compelling Value Proposition is More Important to Your Organization than Mission and Vision

Over the last quarter century I have participated in and led dozens of exercises designed to define organizations through mission, vision, and values. These can be very worthwhile exercises because they help organizations understand really fundamental things like why they exist, what they do, and what they aspire to be. Mission and vision statements, and core values, serve as guideposts that should inform important decisions about strategy, resource allocation, and other issues that drive consistency and sustainability.

However, in the rapidly changing, hyper-competitive, highly commoditized world that virtually every organization in every industry now operates in, a more important question to ask when defining an organization might be: Why would a customer choose us?

It is becoming clearer to me, both as a CEO and a consultant in the current VUCA environment, that what differentiates organizations that thrive from those that just survive or even fail is not their mission or vision statements or even their values (values can, however, support critical behaviors). While those are important, there is a highly dynamic, if not volatile, “where the rubber meets the road” imperative faced by virtually all customer or client driven organizations today: your value proposition to the customer compared with the value propositions of the other choices offered by the competition. We can often attract customers or students or patients or clients into an initial transaction with catchy marketing or steep discounts or convenience. But over time, and particularly with “big ticket” commitments, succeeding in the hyper-competitive and commoditized environment most of us work in requires that our customers believe there is genuine value for them in the product or service we are selling—and not only genuine value, but value that is demonstrably greater than what is available from the competition. The value proposition itself becomes a key differentiator that sustains customer loyalty and advocacy.

While I rarely engage an organization that does not have a mission or vision statement, I frequently find organizations that do not have an articulated value proposition. That value proposition can reflect many things in a given organization or for an identified set of customers, such as quality, cost, service, benefit, flexibility, or support, but with rare exception, consumers today have choices, and usually many choices, of where to buy a given product or service, so there has to be a good reason they will choose your organization over another. In some cases (usually in retail contexts) the value can even be related to “prestige” or “image,” but it still has to exist for the consumer.

Even in organizations that have thought about and committed to a compelling value proposition for their customers or clients, only the most sophisticated have a truly customer-centered notion of what is valuable and important to the consumer. It is actually more common to see such statements based on what the organization or vendor thinks is valuable. This happens simply because we human beings get attached to what we are invested in or have experience in or think we’re good at. That is a trap that gets in the way of creating value for the customer rather than for ourselves.

What this all boils down to for leaders is that self-definition is still important for organizations today, but with limited time and resources, having a deep understanding of why customers would choose you over a competing organization (and acting on that) is probably more important than having clarity around mission and vision.

The Employer-Employee Relationship: Things Have Really Changed!

In the early industrial era when Americans migrated in large numbers to urban areas for factory work, employees worked very long hours under grim, even dangerous conditions. There were also no protections for child laborers. This reality lasted through much of the 19th century and into the early 20th century. While factory wages were higher than other alternatives, it was punishing work in which the company, in many ways, “owned” the employee.

There was a brief period of American history, roughly post-Depression through about 1980, in which at least a couple of generations experienced the ability to work for one or two employers for most of their lives, with relatively safe conditions and 40-hour work weeks, for a living wage, then retire in a home they likely owned. They were also able to provide advantages to their children that allowed millions of Americans to experience a more prosperous life than their parents. This was possible for several reasons. Unions played a significant role in ensuring safer conditions, living wages, and benefits for moderately skilled workers. Social programs such as Social Security and Medicare kept millions of elderly people out of poverty. And a slower pace of change allowed companies to do roughly the same work in roughly the same way for decades. This, in turn, facilitated long-term relationships between employers and employees that supported an unwritten social contract. The employee worked hard, learned new skills over time, and became more valuable, and the employer made an informal commitment to steadily improving wages and job security.

During this same period, management had a somewhat different “contract,” but similar benefits. In return for accepting longer and less predictable hours as “salaried” employees, managers generally enjoyed higher compensation, better benefits, and higher status within their organizations. However, even for management, there were much clearer boundaries between work and private life than there are now. There were no laptop computers, smart phones, tablets, etc. Vacations generally meant completely leaving the workplace behind, usually with zero contact for up to two full weeks. Weekends were generally fully disconnected from the workplace, and managers rarely took work home with them. Hourly workers never took work home with them.

Fast-forward to the 1980s and several things began to change in substantial ways. Unions began a precipitous decline. “Globalization” began to put downward pressure on wages. The pace of change within organizations, related to technology and the competitive landscape, made it impossible for employees to do the same work in the same way for more than several years, let alone decades. These disruptions combined, or more accurately conspired, to weaken the social contract between employers and employees. Today, several decades later, the contract has been obliterated. We are now in an economy in which all employees, hourly and management, are generally seen as “expendable,” and in which organizations, broadly speaking, feel no obligation or loyalty to the individuals who make the organization’s work and survival possible. What this means is that although it is still in one’s own interest to act professionally in terms of meeting obligations, doing one’s best work, and representing one’s organization well, it would be a naïve mistake to think that, with rare exception, sacrificing for one’s employer will be “repaid” with job security or deferential treatment in tough times. While this phenomenon is more prevalent in the U.S., it is becoming more common in other regions of the world as well.

Similarly, unlike the clear line that used to exist between work and private life, organizations now typically see no line whatsoever and claim some level of ownership over all of an employee’s time (it tends to be even worse for managers). In what amounts to a kind of bait and switch, employers spend several hundred dollars on smart phones and tablets at “no cost” to the employee, then get tens or hundreds of thousands of dollars of the employee’s “off the clock” time in return. Again, this dynamic is more pronounced in American and Western contexts, but it exists at some level in most global contexts.

And this has happened at the same time that smart phones work essentially anywhere, any time, domestically and internationally. One cannot even escape wi-fi at 35,000 feet in an airplane anymore!

The ultimate irony is that even as we work more hours and have less space between our work and private lives, we are actually becoming less productive overall and certainly less creative. How is that possible?

The short answer is that not only is there no correlation between working longer hours and being more productive, but recent research shows that the opposite is actually true. This is palpably ironic for both employees and the organizations they work for. See a lengthy post on this topic here.

While this new state of affairs can be stressful for employees, who seem to be working harder with less work-life balance at the same time that employer loyalty and job security are becoming ever more rare, it also provides a kind of freedom: employees are less tied to employers and thus more free to pursue new opportunities. Change can be difficult, but it is also a catalyst for growth, and the flip side of less job security is more professional options. Another potential benefit is that people generally realize faster upward mobility by moving from one organization to another than by waiting for opportunities within a single organization.

As leaders, we can actually take advantage of the current reality by identifying employees who bring the greatest value, have the best attitudes and integrity, etc., and finding ways to support and reward those individuals. We frankly want longevity in our best people. The fact is that organizations today are often driven by short-term thinking, fear of failure, and limited innovation because of a reluctance to take risks when times are hard. However, that same reality also provides an opportunity to stand out in the crowd for leaders who care about others, take a long view, and embrace risk!

Postscript:

While all of my posts and articles are based at some level on my own experience over 35 years in the workplace, I rarely share specific personal stories. In this particular post, however, I think it is helpful to make an exception. The reality is that you can do the right thing as a leader and still pay a heavy price. You can focus on quality or sustainability or transformational change and be penalized or even fired by a boss or a board who does not care about those things. Due to your own sense of integrity you might prioritize compliance or legality or ethics over financial imperatives when those who employ you are more “flexible” in their own interpretations of right and wrong. You may care about human beings and thus make decisions that value people over process or short-term financial gain. Fortunately, in most cases, when you do the right or smart or compassionate thing, you will be respected and rewarded, but not always. I have been separated from organizations for doing the things noted above when those things were not reflective of the value systems of my employer.

However, just as a good leader takes the long view in terms of performance outcomes, over a professional lifetime, taking the long view in terms of your own integrity is also the right thing to do. In 25 years in management positions, I have been rewarded and have benefitted from doing the right thing far more often than I have been penalized. In the end, none of us want our legacy to be how many people we fired or laid off rather than how many we coached and saved, or how many short-term profit or sales goals we exceeded while sacrificing quality or the survival of the organization.

I have seen short-term, unethical thinking literally destroy a healthy organization, wiping out thousands of jobs, stranding tens of thousands of customers, and erasing hundreds of millions of dollars in equity. I have seen greed overrule compassion and brilliant strategy sacrificed to ego. And I have personally paid a price for challenging the status quo, but I have also been inspired by servant leaders and experienced the joy of leading a team through transformational change to achieve things they never thought possible. In the end, we will not be judged by our wealth or our status or our conquests, but by the good we have done, and that is a legacy for which we should be willing to occasionally sacrifice our own well-being.

Our Faustian Bargain With Technology

Most of us reflexively think of technology as more of a “good” thing than a “bad” thing and that is probably objectively true in some ways and contexts. If we look at fields such as medicine, transportation, and communications, for example, and compare how technology has affected those arenas today vs., say, 25 years ago, most of us would say that we are better off.

It is because of technology that someone in Abu Dhabi can buy an avocado grown in Kenya (and that farmers in Kenya know in real time where to get the best price for their products). It is because of technology that early detection of colon cancer has greatly increased survival rates and that surgeons can correct heart defects before babies are even born. It is because of communications technology that two people, regardless of location, can connect in real time and share virtually unlimited information of interest to both. Based on these examples, most people would say that technology represents a positive influence in our world and our lives.

On the other hand, technology has become so pervasive in virtually every aspect of our personal and professional lives that we have become fundamentally dependent on it to transact even the most basic tasks of our daily existence, to the point that most people now mediate even their primary human relationships through technology.

Technology has done big things in the very recent past. It has democratized access to information. It has effectively eliminated time and geography as barriers to communication. It has facilitated automation that has completely changed how entire industries function—and how the humans in those industries work and interact. It has made some products and services much cheaper (or even free).

It has also been incredibly disruptive, both in how it impacts our lives and in the increasingly rapid pace with which new technologies enter the mainstream of human life and enterprise, fundamentally changing how we communicate, work, behave, and interact with one another. It has wiped out entire labor markets, and it has shifted the risk of failure from single entities to entire systems. This is a really, really big deal.

In the “old days,” when technology was electro-mechanical or just mechanical, machines failed all the time. However, those failures were limited to individual machines (a car, a loom, a washing machine, etc.). Now, with almost all technology employing some sort of software, software that is connected to other things also employing software, we are vulnerable to massive, even catastrophic failures—and it’s already happening. Entire airlines are grounded across the globe. Power grids go down. Vehicle fleets are pulled off the roads—because a piece of software code fails or is hacked or was flawed to begin with. The current process for writing that code is deeply flawed as well, but that is another post…

Consider that a new car, for example, typically requires millions of lines of code to operate multiple processors. There is no way for any programmer or programmers to anticipate the billions of potential combinations of scenarios that a driver and the car (and hundreds of thousands of other cars and drivers) will encounter over millions of miles in highly diverse environments. As a result, a piece of code that is supposed to stop a car from accelerating, for example, given enough time and scenarios, will inevitably fail, and the car will keep accelerating even after the driver takes her foot off the gas pedal. This has already happened, resulting in accidents and even deaths. And the problem affects every single car running the same code.
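To make the combinatorial problem concrete, here is a deliberately tiny, hypothetical sketch (written in Python purely for illustration; it is in no way real automotive code): a throttle routine whose individual branches each look reasonable, but which misbehaves in one combination of inputs its author never imagined, such as a slightly faulty pedal sensor.

```python
# Hypothetical toy example only -- not real automotive software.
# Each branch looks sensible in isolation, but one untested combination
# of inputs keeps the throttle open after the driver lets off the pedal.

def throttle_command(pedal_position, cruise_engaged, current_speed, target_speed):
    """Return a throttle opening between 0.0 (closed) and 1.0 (fully open)."""
    if pedal_position > 0.0:
        # Driver appears to be pressing the accelerator: follow the pedal.
        return pedal_position
    if cruise_engaged and current_speed < target_speed:
        # Cruise control nudges the car toward its target speed.
        return 0.3
    # Otherwise, close the throttle.
    return 0.0

# The scenario the developers tested: foot off the pedal, no cruise control.
assert throttle_command(0.0, False, 50, 0) == 0.0  # throttle closes, as expected

# The scenario nobody tested: a worn sensor reports a tiny positive pedal value
# even though the driver's foot is off the pedal. The first branch wins, the
# throttle never fully closes, and every car running this same code shares the flaw.
print(throttle_command(0.02, False, 50, 0))  # -> 0.02 instead of 0.0
```

The point is not that real engine controllers are this naive; it is that even a fifteen-line routine hides edge cases, and a production vehicle runs millions of lines interacting across dozens of processors.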

It is becoming clear that there are at least a few areas in which our “bargain” with technology may have come at a very steep price: privacy, independence, vulnerability to failures, human development, mental health, and human relationships—and we have not even truly begun the next technology era of artificial intelligence and mixed/virtual reality! The reality is that we are all subjects in a very big experiment, and we frankly don’t know what the outcome will be, particularly for young people (tech natives), who represent the first generation in human history to have lived their entire lives mediated through technology and tethered to smart devices.

This horse is way, way out of the barn. We are not going back to a pre-software world, and on balance, most of us are relatively satisfied with the technology we use every day (and depend on without even knowing it). But there are two realities we should be thoughtful about. One is that there will continue to be massive, catastrophic systems failures, and it will get worse before it gets better. Part of this isn’t even related to technology failing directly—it comes from support systems failing. Ask the people of Puerto Rico about life without smart phones, ATMs, and internet, all of which need electricity and other infrastructure to function. The second reality is that we can still individually carve out space in our lives that is mostly tech-free if we choose to, and we should do that on a regular basis. As human beings, our relationships probably need time with others that is not mediated through technology. We know that spending significant, unbroken hours engaged with laptops, tablets, smart phones, and video games has documented effects on our brains and bodies (think concentration, sleep, and eating patterns). We also know that social media without breaks can increase anxiety and decrease self-esteem, while also generating a great deal of stress. A growing body of research shows that human beings need and benefit from extended periods of time in social and natural environments that are not mediated in any way through technology. At least for now we still have some control over that, and we should regularly exercise that control for our own benefit and the benefit of others.

Why Organizations Are Often Disappointed with Consultants

Many organizations invest significant amounts of time and energy in consultants only to later hear employees and managers say things like, “they didn’t tell us anything we didn’t already know,” or “their ideas don’t work in the real world.”

As someone who has worked extensively as a consultant and hired consultants, I’ve had a broad range of experiences. Under the right circumstances, consultants can be very valuable, but often they don’t seem to provide any measurable ROI, while creating significant distraction in the process.

Why is this so?

At the most basic level, many organizations contract with consultants to solve problems or achieve outcomes for which a consultancy model is just not the right choice to begin with. Things like “improving quality” or “developing growth strategies” are reasonable organizational goals, but those things should probably be core competencies developed by the organization itself rather than something that comes from unaccountable external consultants. There is certainly value in having an external set of eyes validate a growth strategy or quality initiative, but if an organization is asking consultants to determine the course of action, that arrangement is likely to ultimately fail. It is likely to fail because even if consultants have a high level of appropriate industry knowledge (and they often don’t), they have no accountability for executing the strategies or suggestions they present. Moreover, by definition, consultants are “outsiders.” They rarely have a meaningful understanding of organizational culture or history or politics. They also tend to work in a silo based on the nature of their contract. As a result, they often have no idea how their recommendations relative to “improving sales,” for example, are connected to the organization’s training or marketing capabilities. Lastly, and this is really important, consultants are almost never involved in the hard work of operationalizing and implementing their recommendations. Most often, they provide a report, have some follow-up conversations, and move on to the next client, while the employees in the organization are left to implement strategies that many do not fully agree with, that are often not appropriately resourced, that don’t fit into “normal” operations, and that no one in the organization owns.

Organizations are more likely to solve the kinds of challenges noted above by providing training and development opportunities for their own employees and managers, and having them use their newfound knowledge and capabilities to develop, then implement, new operational strategies that they own and are enthusiastic about. The training and development activities can certainly come from external sources, including consultants, but that is different than asking consultants to provide the new strategies themselves.

On the other hand, consulting projects are much more likely to provide valuable outcomes if they are designed to produce tangible products or resources that enhance daily operations. For example, hiring a consultant to write a training curriculum or to code scripts that add functionality to a back-office or customer-service application tends to provide great ROI because it results in concrete tools the organization can use to do things it couldn’t do before. I have also found that consulting projects that directly transfer skills and knowledge (rather than abstract ideas or strategies) are generally very successful because they empower employees (who are the ones left behind when the consultant leaves) to work in different, more effective ways. Similarly, consultants who can identify and articulate very specific, actionable changes in a process, or who can provide an instrument to accomplish articulated goals, are generally seen to bring real value.

Additionally, in the big corporate consulting firms, most of the consultants put in the field are very young, inexperienced, entry-level professionals, often in their first jobs out of college, who have little or even zero actual operational experience. They work very hard and they’re very smart, but they have no internal “gut check” capabilities when it comes to the real-life viability of their observations and recommendations. It is also common to see consulting teams in which, despite their intelligence and work ethic, not a single member is actually an operational expert in the industry or subject area of the consulting project.

The big firms have great resources, particularly when it comes to research and technology, but they also tend to have deep biases toward what they’ve already done for other clients. Because of that, contracting organizations often get modified versions of previously dispensed advice, suggestions, models, etc. In fact, in certain areas, the consulting is actually based on previously designed templates that get repurposed for new clients. Sometimes the templates are the product of very high quality work, but the very nature of the template framework pushes consultants in pre-determined directions that can miss critical context for a specific organization.

When organizations find themselves dissatisfied with consultants, the problem is generally not the concept of consulting itself. In fact, under the right circumstances, a good consultancy can bring significant value to an organization. The reason consulting arrangements often do not bring the desired value is typically because of a mismatch between what the organization needs vs. how consultancy models usually work. For most operational issues, I have found that “boutique” consultancies that are experts in the subject area and industry of the consulting work are often more effective than larger, generalist firms. Of course, the opposite can also be true, particularly if a project requires deep pockets and resources on the part of the consultant. This can be the case with large research projects, for example.

In short, if an organization finds itself using a consultancy to compensate for what should be core competencies or if the consultants are not experts in the work of the organization, there will likely be dissatisfaction on the horizon. On the other hand, the more tangible, applicable, and customized a consulting product, the more likely it is to bring value.

The Two Monopolies that Keep American Higher Education Afloat

In a previous post, I wrote about how tectonic shifts in the macro environment in which American colleges and universities operate would almost certainly result in drastic change over the next couple of decades. Those shifts have profoundly affected the financial model of higher education as well as the value proposition for many millions of potential students. On the other hand, higher education as a societal institution or as an industry has had remarkable staying power in the face of profound change that has wholly remade other sectors of society. How is it that higher education has broadly managed to survive in much the same form it has been in for a century while other sectors and industries have undergone foundational change, and in many cases, simply vanished?

This question is even more interesting considering that educational institutions are generally more inertia-bound than just about any other type of organization one can think of.

The simple fact is that American higher education has continued to operate with little systemic innovation, and at times something close to disdain for the needs of its “customers,” because of two monopolies that have protected it from the market realities that affect other industries. The first monopoly is access to federal financial aid. The federal government disburses many billions of dollars a year to institutions of higher education (IHEs) in the names of their students. This largesse comes in the form of grants and guaranteed loans, without which no more than a handful of the thousands of Title IV-eligible institutions could operate. This financial aid is only available to the “cartel” of educational institutions that follow the narrow, static requirements of recognized accreditors. The second monopoly is less structurally assured, but equally important: the control that accredited IHEs have had over the credentials that graduates use to gain employment and take licensure exams. In another article I noted that the monopoly over credentials is weakening in the face of the growth of the gig economy and the alternative ways that Millennials are learning job skills that do not require attendance at universities. Even if weakened, that monopoly will continue to exist in some form as long as employers and licensing boards support it. This monopoly is actually stronger in many countries outside the U.S., particularly in developing countries and regions with very traditional notions of higher education.

To be clear, even with the monopolies protecting its existence, higher education is not thriving. It is surviving, trapped by a powerful structural dilemma. The accreditation system that IHEs must adhere to in order to qualify for federal aid is a painfully stifling force against innovation, as are the arcane, century-old Department of Education rules governing the kinds of educational experiences that are eligible for federal aid. This status quo creates a serious quandary for IHEs, which must innovate to survive from a market perspective, but which are prevented from being truly creative if they want continued access to federal financial aid. To make matters worse, the U.S. Department of Education is burdened by as much inertia as the IHEs themselves, if not more!

While it is possible that education models that do not depend on federal financial aid will begin to chip away at the edges of the traditional higher education edifice, offering limited innovative alternatives to students, until the current anachronistic and innovation-killing higher education funding model changes, the traditional system will continue its slow and painful slide toward declining enrollment and declining relevance.

How Millennials Are Driving the Evolution of Online Learning–It’s Not What You Think

Online learning as we currently understand it has existed in some form since roughly 1995, about a year after the introduction of the World Wide Web. By some measures, online learning has been a huge success, in the sense that today millions of college students in the U.S. take some or all of their courses via online delivery. Globally, the numbers are much larger. In fact, about 25% of all undergraduate content delivered in U.S. higher education today is accessed via learning management systems (LMSs) like Blackboard or Moodle, and over half of graduate-level content is accessed the same way. However, this form of online learning is dwarfed by the learner-driven activity taking place every day, driven largely by Millennials. While there is a broad consensus that MOOCs (Massive Open Online Courses) have flopped relative to their initial, lofty goals, the fact is that MOOCs are fast becoming the primary mode of learning and skill development for tens of millions of people in the U.S. and around the globe. It’s called YouTube. And Wikipedia and Instagram and Tumblr and Vimeo and Facebook and other social media platforms.

The single greatest repository of educational content in the world is on YouTube, which has millions of tutorials on virtually every conceivable subject. Research by Google (the owner of YouTube) has found that nearly 70% of Millennials believe that they can learn anything via YouTube tutorials, and over 90% of young professionals access social media sites for ideas, data, and instruction when completing projects at work. What is really important about this phenomenon is that younger users in particular see this learner relationship with social media as organic and undifferentiated from their other relationships with, and uses of, those platforms. They have grown up with this self-defined version of eLearning, and they are exceptionally confident learners in this medium. While people of all ages and backgrounds are using YouTube and wikis for self-education, Millennials are fearless when it comes to teaching themselves how to do virtually anything—and they are largely successful. They are also the creators of much of the educational content on YouTube and other platforms, completing a circle of learner and tutor.

Why does this reality matter for “traditional” online learning and education in general?

As with other user-driven trends, many millions of people are organically shaping eLearning in ways that are sharply diverging from the online education currently offered by colleges and universities. In other words, although traditional eLearning delivered by institutions of higher education (IHEs) has become quite sophisticated in terms of the pedagogical models embedded in the LMSs used by those institutions, these formal eLearning models force users (learners) to abandon the behaviors they normally use daily in other internet activities, especially those connected to self-directed learning.

The way that many, many more “students” learn online, via YouTube and wikis and knowledge shared across multiple social media platforms, is moving sharply away from how online learning is designed and delivered in universities. This does not bode well for IHEs, which are frankly asking students to use tools they find less effective and with which they are less comfortable, in a medium they otherwise use pervasively to transact every aspect of their lives.

So far, traditional IHEs have been protected by a monopoly over credentials, which until recently, has been largely supported by employers. That monopoly is now at significant risk due to a major shift, not in higher education, but in the economy.

The rise of the gig economy has many implications for those who work in that sector of the economy, implications which have been discussed at length by economists, sociologists, policy makers, and others. What has not been discussed is the extent to which this shift to on-demand piece work will have huge impacts on the educational paths that people have traditionally taken on their way to skilled positions. The reason is simply that the gig economy is based on deliverables, not on educational credentials. A video editing project or a training curriculum is successful if it meets the needs of the contracting entity; the contractor’s academic credentials are close to irrelevant. Doing good work, to spec, and on time: those are the markers of value (and of continued access to subsequent contracts). Young people working in both the gig and traditional economies go to social media and other internet platforms (not colleges and universities) as their first choice for professional and skills development. They are also much less “proprietary” in their thinking about intellectual property, either their own or that of others, and see the sharing of knowledge and skills as a “public domain” activity. Finally, Millennials are proving to be quite utilitarian and pragmatic, and much less “hung up” on the formalities and conventions of previous generations. That, combined with an economy that is also shifting in fundamental ways, has huge implications for education in general, and for online education in particular.