Why Organizations Must Embrace (and Redefine) Failure

In most organizations, people from executive leaders to line employees are socialized to fear failure. Although there are a lot of unhealthy values in many organizational cultures, this is one of the more damaging ones. It leads people to hide mistakes, refuse to share bad news, and most importantly, to avoid risk. Not only does fear of failure lead to negative behaviors, it also militates against necessary, healthy behaviors.

This very unfortunate socialization has many sources, but at its core is the terrible reality that in many organizations people are often punished for “failure”—the failure to achieve targets; the failure of a new product or technology to deliver desired results; the failure to reach a goal within a prescribed time frame; the failure of an investment to pay off; and so on. In such organizations, people learn to “keep their heads down” and do only what they already know how to do, what doesn’t make waves, and what they won’t have to explain to others. While this may be a “safe” way to behave, it is often disastrous for the organization, particularly in today’s environment of constant, high-paced change. James Quincey, the CEO of Coca-Cola, said it very simply: “If we’re not making mistakes, we’re not trying hard enough.” Fear of failure is the bedfellow of complacency.

Complacency is an obvious but, frankly, “lower-level” problem. So, how else does the fear of failure hurt organizations?

First, both individuals and organizations learn as much from failure as from success—often more. We all know this intuitively, but the cultural pressure against failure is often so powerful that organizations (and leaders) willingly sacrifice critical learning opportunities in the interest of avoiding what looks like failure. Organizations must “fail” in order to learn, and the organizations with the greatest capacity to learn and improve are also those with the highest tolerance for (or even encouragement of) failure.

Relatedly, in a highly competitive environment in which much of what organizations do and sell can be commoditized or rendered obsolete overnight, organizations must have a culture of experimentation in order to stay a step ahead of the competition and in step with customers. They must be willing to experiment with new ways of doing things ALL THE TIME. And the most powerful, highest-return outcomes tend to come from the highest-risk experimentation. As Jeff Bezos said in a recent Harvard Business Review article, “If you’re going to take bold bets, they’re going to be experiments.” He would know.

Change for change’s sake is not necessarily productive, but innovation is critical to success. And innovation may be the biggest casualty of a culture in which people fear making mistakes and failing. By definition, innovation requires “failed” attempts. The greatest commonality among the most innovative people and organizations is that their new ideas fail to work out far more often than they hit the jackpot. Interestingly, what may be the most innovative culture in the world, Silicon Valley, rarely even uses the word “failure.” When something doesn’t work out as hoped, they “pivot” to the next idea or opportunity, using what they learned from what didn’t work. Venture capitalists know that they will lose money on many more investments than will pay off. However, they also know that it only takes a small number of big winners to compensate for all the risk taken on the “losers.” Even in the most successful organizations, such as Apple or Google, we only know about the experiments that worked out—not the hundreds that didn’t.

As a leader, you must support risk taking and thus encourage (yes, actually encourage) “failure.” And importantly, you must actively redefine failure from a discrete event that is “bad” to a process that is necessary, if not critical. You must not only make it safe to “fail,” you must reward that behavior by celebrating what was learned, then “pivot” to the next experiment!

Post Script

Strangely, I have sat in Board meetings in which Directors or Trustees were almost paralyzed when it came to committing to strategies due to some level of risk. Not long ago, the mantra in most organizations (and Board rooms) was “data-based decision making!” While certainly a plus if possible, in today’s rapid-change environment, data-driven decision making is a luxury. If you insist on waiting for compelling data to support your decision, it will probably be too late to realize much benefit—the train will be out of the station, and someone else who took the risk early on will already be “first to market” and reaping the benefits—and generating the data 🙂

Everyone, from Line Employees to the CEO, Needs Encouragement

When we think of leadership, most of us think about skills or traits or work-related behaviors directly tied to whatever results an organization is supposed to achieve: the greater the results, the greater the leader. While there is certainly some truth to that, there are also psychological components of leadership—both how leaders interact with others and how they function themselves. These are things that do not show up in a spreadsheet or a quarterly report, but they are often critical to the health of leaders themselves and the folks they support in an organization. And, for what it’s worth, there is also interesting research on how at least some psychological constructs, such as sense of belonging and sense of purpose, affect employee retention and productivity.

So, speaking of the psychological or affective elements of leadership, one of the most important and simple things you can do as a leader is to offer encouragement to your employees. Even the most confident and successful people have moments of self-doubt. The rest of us are frankly more vulnerable than the “most confident” people and have many moments of self-doubt!

There are a couple of really good reasons to look for opportunities to reassure and support the folks you manage. One altruistic reason is that you will make people feel better—less stressed, anxious, etc. A practical and even self-serving reason is that if your employees feel more confident, they will commit to their work more energetically, will take risks, and will be less distracted by doubt. And in more extreme situations, a good pep talk at the right time may keep someone from looking for other opportunities. Unlike a lot of leadership tasks, this one is really easy. You just have to be genuine.

Interestingly (and unfortunately), the higher up the ladder a manager is in an organization, the less likely he or she is to get encouragement or a pep talk. We have this strange belief that senior managers and executives, magically, by virtue of their position, somehow have superhuman strength and invulnerability. The phrase “it’s lonely at the top” speaks to the fact that the most senior leaders tend to be the least supported psychologically. This is further ironic because C-level executives, for example, tend to experience the greatest stress because they have the greatest responsibilities (both in quantity and criticality). The reasons are partly structural—if you’re at the top, there aren’t many people above you to offer support—and partly based on the misconception that the most senior leaders don’t need support and encouragement precisely because they are senior leaders.

The good news is that encouragement can come from anyone, not just a boss. It is valuable regardless of the source, and subordinates should feel comfortable offering words of encouragement to their supervisors as well. However, one reason that supportive comments from someone above are important is that our superiors control our destiny! As a result, if they have confidence in us, we are likely to have more confidence in ourselves.

So, look for opportunities to offer words of encouragement—a quick written note, a pat on the back, a one-minute pep talk—any of these will provide value well in excess of the effort itself.

Post Script: This post is particularly timely because, for a variety of reasons, people (and thus employees) are feeling more and deeper stress than at any time since such measurements began. Research recently conducted by the American Psychological Association found that Americans have hit record highs of stress and anxiety. Similarly, the World Health Organization recently declared that stress has become a worldwide crisis and is “the health epidemic of the 21st century.” There are many reasons for the increase in stress, some of which employees bring to the workplace and some of which are caused by the workplace. Regardless, it has become such a pervasive issue that it is not possible for leaders to effectively run organizations without taking stress mitigation into account (in themselves and their employees). While offering words of encouragement to employees and colleagues is not a stress mitigation plan in itself, it does often relieve the stress caused by self-doubt in the workplace and provides high returns for fairly low effort.

Good Teams Are More Valuable Than Even High-Performing Individuals

Within all types of organizations, the performance of individuals—no matter how productive, insightful, or connected to organizational goals—is rarely more important or valuable over time than the performance of the team or teams that person is (or should be) a part of.

We know this intuitively by watching teams that compete at the highest levels in sports. Those teams often have a “superstar,” but the most successful teams are those in which all the players work well together to make each other better.

Talented people working in isolation often achieve quick wins (or score a goal), but on the whole, they are far less valuable than even “less” talented people working well together. As Patrick Lencioni notes in his book The Five Dysfunctions of a Team, there is no greater competitive advantage in organizations than teamwork. Why? Because cohesive, well-functioning teams create synergies. They have a multiplying effect on the abilities of individuals within the team. They “see” things, all looking together, that individuals do not see by themselves. And most importantly, at any given moment, teamwork produces a multiplicity of ideas, approaches, solutions, and shared actions greater than even the most capable individuals working by themselves.

Great teams within an organization represent a competitive advantage because most organizations prioritize and value strategy, technology, operational capital, etc. over people and teams. Virtually every organization says that its people are important or “their greatest asset,” but organizations often do not operate as if that were actually true. This reality is at least partly culturally embedded—institutions in the West tend to be less relational and to see people as more expendable, or even disposable, than those in the Middle East and East—but in both contexts, organizations tend to see capital, technology, and strategy as more important variables for success. This reality provides a great advantage to leaders who choose to support and leverage teams for success.

As a result, while it is important as a leader to nurture individuals, and certainly to care about strategy, finance, etc., you and your organization will benefit more if you nurture the growth and efficacy of people working together.

Things I Now Know For Sure About Successful Leadership

I have been truly fortunate over a long management career to have experienced an incredible diversity of leadership and management opportunities in different kinds of organizations, large and small, in multiple countries, across a variety of structures, missions, business objectives, etc. From my first management position 25 years ago, leading a small department in a rural Idaho school district with one employee, to heading a multinational corporation with thousands of employees based outside of the U.S., I’ve been blessed with learning opportunities most leaders will not have in their entire careers.

I have also been challenged with running organizations in different cultural and nationality contexts and had to decide in what ways I would adjust my own leadership strategies and methods vs. how I would ask the organizations to adapt to me!

Now, in my 25th year of management and 30th year in the “professional” work place, through many failures and successes, I have come to understand several truths that transcend culture or organizational structure or financial model, etc. There are no profound secrets here, but I have both scars and victories that serve to identify what has been most important and what I believe is transferable to almost any leadership context from entry level manager to CEO, in virtually any organizational context. Some of these things relate to leaders as individuals and some relate to organizations. You can also see how “less is more” for most leaders here.

Teams do better, more valuable work than even the most talented individuals.

Therefore, building and supporting teams is key to your success as a leader.

At every level of leadership, listening is more important than talking.

Yes, you must articulate a compelling vision and communicate in convincing ways, but your efficacy is based more on what you learn from listening than what you gain from talking.

Mistakes are better teachers than successes.

The best leaders have generally made the most mistakes, and importantly, learned the most profound lessons from those mistakes. Moreover, someone who makes good faith mistakes is taking risks—and growing as a person and a leader.

Results matter but that is not the whole story.

Most people at the top of organizations are there because they are “results oriented” and have a sense of personal accountability. They have consistently achieved objectives over time. However, today’s organizational realities typically present leadership challenges that go far beyond just “hitting the numbers.” See an in-depth discussion here.

Make sure you know what matters to your leader(s).

Whether you report to a manager above you in the organization or to a Board of Directors, you can save yourself a lot of grief by knowing what the priorities above you are.

Related: Make sure your bosses and your team know what you are doing.

Almost all of us communicate less well than we think we do. Ensuring that your bosses and colleagues are fully informed about what you’re doing and why, all the time, will head off problems before they happen while also increasing the likelihood that everyone is rowing in the same direction.

Organizational culture is more powerful than strategy or planning or just about anything else.

As the management guru Peter Drucker once said, “culture eats strategy for breakfast.” As a leader your success is inextricably linked to your ability to align culture with strategy. Ignore this at your own peril.

Human capital will get you farther than financial capital, technology or even strategy.

In an ideal world your organization or division or department would be well capitalized and you’d be implementing brilliant strategy supported by world class technology, but even if those things are true—and they often aren’t—your ability to execute on plans and strategies, to make change, to innovate, comes from people.

Related: Leading people well pays greater dividends than managing processes.

See the point above. Your colleagues are “force multipliers” for your vision, strategies, operational plans, etc. Focus on what will make them successful and you will be successful.

Related: What you say and do has a bigger impact on your subordinates than you realize.

Be careful. You can empower and hurt your subordinates more easily than you think you can.

Take care of yourself at least as well as you take care of others.

Leadership is stressful and the stakes are high, but the most important thing you can do for your own success is to care for yourself. Get adequate sleep and create space to relax and think. Meditate. Exercise.

Integrity will sustain you over time.

You can certainly achieve short-term wins with “flexible” ethics or by using/abusing people or by saying one thing and doing another. It’s also true that you can operate with unimpeachable integrity and experience failures. But your longevity as a leader, and sustainable success, requires integrity more than just about anything else.

Related: Have a personal vision statement.

This is really important and surprisingly rare. Without a “true north” as a person, it is impossible to have a true north as a leader. Take the time to figure out what matters to you and who you want to be. See more about this here.

A surprising part of successful leadership is simply being stable and predictable.

Most organizations today find themselves operating in turbulent circumstances on a regular basis. It is hard to overestimate the value to the crew of seeing the captain calmly and competently steering the ship, particularly in rough seas.

Over time being a decent human being is a far better legacy than being a rich and powerful jerk.

Some of the most “successful” people in business are also some of the worst people in business. However, at the end of the last day, having hurt or cheated or disrespected people in the pursuit of riches or power is worth zero. Living on in those you have helped is priceless.

Of course, this is not an exhaustive list of the most important things I’ve learned as a manager/leader so far, and I have no doubt that I have more important lessons to learn. However, a common thread among the items shared here is that essentially all of them made the list because I made some sort of mistake related to each one at one time or another. In some cases, I just didn’t know how important they were, or I ignored them, or I thought I could finesse my way around them. Another common thread is that I think they apply across organizational and cultural contexts. I have learned other things that might apply in a traditional, not-for-profit liberal arts college in the U.S., but not so much in a private-equity-owned, for-profit business in Latin America. My sense is that the list above applies at some level just about everywhere. I hope some of the lessons I’ve learned are helpful in your leadership journey as well!

Higher Education’s Dirty Little Secret: Most Professors Know Little to Nothing about Teaching

At a high level, to be a truly effective teacher, one must, consciously or unconsciously, have a theory of practice about pedagogy (teaching and learning) itself. Imagine for a moment if a pilot had no theory of aerodynamics or a physician had no theory for diagnosis or treatment. Doesn’t seem likely, but that is standard operating procedure for professors in most institutions of higher education.

Surprisingly (if not shockingly), many instructors at the university level enter classrooms every day without any formal training on how their choices and behaviors will support, or undermine, learning in their students. The dirty little secret about college level instructors is that the vast, vast majority of them have teaching roles because they have content expertise in some discipline—not because they have been taught anything about teaching. Some have natural proclivities or amenable traits, which certainly helps, but virtually no full-time, tenure-track professors outside of Education departments were or are hired because of their teaching expertise or effectiveness. Some colleges and universities do provide occasional, usually ad-hoc workshops to help instructors improve their effectiveness in the classroom (typically based on isolated tasks such as creating an assessment rubric or writing a lesson plan), but relative to the time, effort, and credentials dedicated to their content expertise, even the most ambitious training programs are comparative drops in the bucket. This would be similar to medical doctors completing training in basic sciences but receiving no clinical training before engaging with patients!

There really isn’t a comparable situation in other professions (other than when experts in a given field are asked to teach neophytes). It is only in higher education that there is essentially no expectation or requirement that the practitioner have any expertise or credential for doing a large part, if not the majority, of his or her job. It’s frankly a little disconcerting. Of course, many faculty have spent long hours in the classroom, and through that experience have intuitively internalized ways of doing things that are likely to be more effective than others, but despite many years of such on-the-job training, most university professors cannot explain even the most basic learning or assessment theory, let alone the neuroscience behind how we make memories, learn, and develop new skills.

One exception to this reality can be found in most online or eLearning classes and programs. The reason for this is simply that online classes generally have a learning management system (LMS) of some kind that requires instructors or instructional designers or both to make pedagogical decisions as part of the course design and delivery process. Even if professors don’t understand the underlying theory, they are at least forced to think about fundamental questions such as how they will present content and assess student progress, how students will practice new skills, and how learners will discuss and process what they are learning. Another place where there is more focus on teaching is in career colleges, where curricula are more applied and instructors are more likely to be professional practitioners in the field being taught. Traditional, campus-based courses typically have no such pedagogical/application imperative, and professional practitioners are far less valued than those with terminal academic degrees. More on that in another post!

The good news is that a theory of practice, or pedagogy, can be learned by any professor, and even remedial training can greatly improve teaching effectiveness. However, the reason that professors rarely invest substantial effort in developing pedagogical expertise is that being an effective teacher really isn’t important on many campuses—and such training is rarely available in any sustained way regardless. This is particularly true in large research universities, where getting hired, getting promoted, and getting tenure have very little to do with teaching. The situation is a little better today than, say, 20-plus years ago, but in traditional institutions, faculty are still broadly rewarded for research, publishing, and securing grant funding far more than they are for excellence in teaching. As large universities shift to more and more contingent (adjunct) instructors, the overall faculty focus on research and publishing will decrease, but there is no indication that institutional focus on teaching will grow in any appreciable way. Any material improvement will likely have to come from the adjunct faculty themselves.

So, what does this all mean for the institution of higher education and the people, both faculty and students, in it? First, as an educational enterprise, the university is structurally flawed and has been from its inception. This is a global phenomenon. That does not mean that students don’t learn or that college degrees don’t provide value. However, it does mean that most students do not learn nearly as much as they are capable of, and that many are so poorly served that they are unable to complete programs of study. In fact, most people do not realize that in the United States, over the entire history of higher education, the system has failed far more students than it has successfully served. Well over half of all the students who enroll in a college or university never complete a degree program! Even among the minority of students who are successful in the sense that they complete a degree, many succeed not because they have edifying learning experiences, but because they effectively manage (survive) the system for long enough to earn a degree. In fact, in many cases, employers want employees with college degrees not because of what the students have learned, but because having a degree proves that a person can start, persevere through, and finish a challenging long-term project.

How is it possible that a structurally flawed system that fails over half of its constituents has continued to operate without meaningful change for centuries? The simplest answer is that there is no viable, scalable alternative for post-secondary education. That will likely change in the not-too-distant future as more employers in certain fields move away from requiring college credentials and toward industry-based credentials. This is already happening in IT/Computer Science and will expand to other technical fields as well. Some disciplines, such as those in the health sciences, will continue to require college degrees as long as professional licensure for those positions continues to require degrees. If that barrier falls, then so will the university monopoly. Similarly, professions that require graduate degrees will also support the university monopoly in post-secondary education for some time into the future. Whether non-university, post-secondary educational models will provide teachers with pedagogical knowledge remains to be seen. However, a primary difference that already exists is that such models tend to be competency-based, providing a greater likelihood that the learning students do accomplish is more immediately applicable in work settings. One can debate the relative value of a liberal vs. technical/vocational/professional education, but the same structural flaw applies regardless.

All stakeholders would certainly benefit if university leaders dedicated as much attention to the teaching skills of their faculty as they do to their academic credentials, research, fundraising, etc. In fact, institutions that are forward-thinking enough to make pedagogical excellence a core objective and priority will differentiate themselves from the crowd, creating a significant competitive advantage in the higher education market.

“I’m Sorry to Bother You. I Know You’re Busy.”

I would speculate that virtually every single person reading this post has uttered those words, or something very close, upon walking into a boss’s or some other senior manager’s office. I’m also confident that if you are in a senior leadership role, you hear something similar from subordinates almost every day. It’s time for all of us to stop saying those words.

I realize that in most cases we are simply trying to be polite or deferential, but the fact is when you say, “I’m sorry to bother you,” you are suggesting that whatever your reason for wanting to speak to the person is, it is, by definition, less important than what he or she is already doing. When you say, “I know you are busy,” you are saying that the other things the person does all day that make him or her busy are justified, but your reason for contributing to the person’s busyness is not equally justified.

This silly need to be deferential or polite is an historical legacy of hierarchical organizations in which the boss’s time and ideas and decisions were always considered to be more important and smarter than anyone else’s. Regardless of how one feels about the hierarchy itself, the reality is that the complexity and volatility of day-to-day operations in virtually any contemporary organization, in any industry or field, is such that bosses no longer bring value (if they ever did) by controlling agendas and time and decisions. They bring value today by empowering many other people in the organization to do better work—ideally, collaboratively, in teams.

So, when someone walks into a superior’s workspace or calls on the phone, with or without a formal appointment, it is because that person has determined that he or she needs something (an opinion, a resource, an approval, etc.) that he or she will use to do his or her job. That interaction should not begin with, “I’m sorry to bother you. I know you are busy.” This may sound like a small thing, but it is actually a big thing. It is a mistake for anyone to start a conversation with words that immediately devalue the importance of their reason for being there in the first place!

For those of you who are senior managers and leaders, it is important to overtly redirect people when they say those self-deprecating words to you. I find myself doing this every day, but I feel strongly about it, so I do it. When anyone comes into my office to speak with me and they start the dialogue with, “I’m sorry to bother you. I know you’re busy,” I immediately reply with, “You aren’t bothering me. I want to hear what you have to say. And, yes, I’m busy, but I’m busy doing important things like meeting with you.” It is often a joy to see the look (surprise?) on the face of a colleague or customer when I do that!

For what it’s worth, I also do not apologize to my board members when I “take their time.” I never want them to think that the time they’re going to spend with me is any less important than anything else they could be doing!

A Key Component of Leadership: Knowing When to Be Directive vs. Building Consensus

The old-fashioned view of leadership is of the strong, decisive leader throwing out orders to willing underlings, who faithfully execute on behalf of the boss. Although this model of “leadership” fails to leverage the incredibly valuable input and due diligence that come from a more collaborative, consensus-oriented approach, it can actually be the preferred method in some situations.

So, when should you as a leader be more directive?

In my experience, the answer lies in both operational and cultural contexts.

Operationally, it is often preferable to be more directive when you simply don’t have the time for a more consensus-oriented approach. In such situations, you need to be fairly confident that your choices/decisions are more likely to be correct than not, but if time is genuinely critical, then making “executive” decisions can be valuable to the organization. A recent example in my own experience was the opportunity to save 75% of the normal acquisition cost of an ERP-level software application, but with a very tight deadline to accept the offer or not. I did have colleagues evaluate the application against our current system, but ultimately made the decision myself. In this particular case, I had the advantage that a sister organization was already using the software platform and thus had a real-time, real-world consult, but it was a good example of an executive decision opportunity regardless. Another operational example might be when you have time, but your colleagues/subordinates do not. In other words, they are already stretched thin, and you can do them a favor by making an executive decision that saves them the time and effort of doing so collectively. Similarly, sometimes you’ve dedicated the time to a collaborative approach, but that approach has produced two or three equally viable options. It is better to make an executive decision as a leader than to drag your colleagues through a never-ending process trying to get to one team decision.

There are also times when being more directive makes sense for cultural reasons. In organizations where hierarchy and autocracy are the norm, it can be quite disconcerting to employees when they are asked to contribute to consensus or even to make their own decisions. In fact, it can even be unfair in the sense that asking people to do something with which they have no experience can not only create a lot of dissonance but can also result in poor decision making and execution. This same dynamic also applies to individuals. Even if an organization, broadly speaking, is amenable to a collaborative, consensus-based approach, not everyone in the organization has the experience or confidence to participate effectively in such a model. In those cases, a good leader should develop that capacity in such a colleague over time, eventually weaning him or her off of dependence on decision making by the boss.

And sometimes, being decisive or “executive” as a leader, particularly in a time of stress or crisis, can be valuable because it can give others in the organization confidence that someone is “in charge” and will get them through whatever the challenge happens to be.

To be clear, leaders will generally arrive at higher-quality decisions, and get stronger buy-in and execution on those decisions, using a more collaborative, consensus-based approach. This is true simply because multiple people almost always generate better ideas than individuals, and teams almost always produce better work than individuals or groups that do not function as teams. However, part of good leadership is knowing when being more directive is preferable.

Why a Compelling Value Proposition is More Important to Your Organization than Mission and Vision

Over the last quarter century I have participated in and led dozens of exercises designed to define organizations through mission, vision, and values. These can be very worthwhile exercises because they help organizations understand really fundamental things like why they exist, what they do, and what they aspire to be. Mission and vision statements, and core values, serve as guideposts that should inform important decisions about strategy, resource allocation, and other issues that drive consistency and sustainability.

However, in the rapidly changing, hyper-competitive, highly commoditized world in which virtually every organization in every industry now operates, a more important question to ask when defining an organization might be: Why would a customer choose us?

It is becoming clearer to me, both as a CEO and as a consultant in the current VUCA environment, that what differentiates organizations that thrive from those that merely survive, or even fail, is not their mission or vision statements, or even their values (values can, however, support critical behaviors). While those are important, there is a highly dynamic, even volatile, “where the rubber meets the road” imperative faced by virtually all customer- or client-driven organizations today: your value proposition to the customer compared with the value propositions of the other choices offered by the competition. We can often attract customers or students or patients or clients into an initial transaction with catchy marketing, steep discounts, or convenience. But over time, and particularly with “big ticket” commitments, succeeding in this hyper-competitive, commoditized environment requires that our customers believe there is genuine value for them in the product or service we are selling: not only genuine value, but value that is demonstrably greater than what is available from the competition. The value proposition itself becomes a key differentiator that sustains customer loyalty and advocacy.

While I rarely engage an organization that does not have a mission or vision statement, I frequently find organizations that do not have an articulated value proposition. A value proposition can reflect many things for a given organization or an identified set of customers, such as quality, cost, service, benefit, flexibility, or support. But with rare exception, consumers today have choices, usually many choices, of where to buy a given product or service, so there has to be a good reason they will choose your organization over another. In some cases (usually in retail contexts) the value can even be related to “prestige” or “image,” but it still has to exist for the consumer.

Even in organizations that have thought about and committed to a compelling value proposition for their customers or clients, only the most sophisticated have a truly customer-centered notion of what is valuable and important to the consumer. It is actually more common to see such statements based on what the organization or vendor thinks is valuable. This happens simply because we human beings get attached to what we are invested in or have experience in or think we’re good at. That is a trap that gets in the way of creating value for the customer rather than for ourselves.

What this all boils down to for leaders is that self-definition is still important for organizations today, but with limited time and resources, having a deep understanding of why customers would choose you over a competing organization (and acting on that) is probably more important than having clarity around mission and vision.

The Employer-Employee Relationship: Things Have Really Changed!

In the early industrial era when Americans migrated in large numbers to urban areas for factory work, employees worked very long hours under grim, even dangerous conditions. There were also no protections for child laborers. This reality lasted through much of the 19th century and into the early 20th century. While factory wages were higher than other alternatives, it was punishing work in which the company, in many ways, “owned” the employee.

There was a brief period of American history, roughly from the post-Depression years through about 1980, in which at least a couple of generations could work for one or two employers for most of their lives, in relatively safe conditions and 40-hour work weeks, for a living wage, and then retire in a home they likely owned. They were also able to provide advantages to their children that allowed millions of Americans to live more prosperous lives than their parents. This was possible for several reasons. Unions played a significant role in ensuring safer conditions, living wages, and benefits for moderately skilled workers. Social programs such as Social Security and Medicare kept millions of elderly people out of poverty. And a slower pace of change allowed companies to do roughly the same work in roughly the same way for decades. This, in turn, facilitated long-term relationships between employers and employees that supported an unwritten social contract: the employee worked hard, learned new skills over time, and became more valuable, and the employer made an informal commitment to steadily improving wages and job security.

During this same period, management had a somewhat different “contract,” but similar benefits. In return for accepting longer and less predictable hours as “salaried” employees, managers generally enjoyed higher compensation, better benefits, and higher status within their organizations. Even for management, however, the boundaries between work and private life were much clearer than they are now. There were no laptop computers, smart phones, or tablets. Vacations generally meant leaving the workplace completely behind, usually with zero contact for up to two full weeks. Weekends were fully disconnected from the workplace, managers rarely took work home with them, and hourly workers never did.

Fast-forward to the 1980s and several things began to change in substantial ways. Unions began a precipitous decline. “Globalization” began to put downward pressure on wages. The pace of change within organizations, driven by technology and the competitive landscape, made it impossible for employees to do the same work in the same way for more than several years, let alone decades. These disruptions combined, or more accurately conspired, to weaken the social contract between employers and employees. Today, several decades later, the contract has been obliterated. We are now in an economy in which all employees, hourly and management, are generally seen as “expendable,” and in which organizations, broadly speaking, feel no obligation or loyalty to the individuals who make the organization’s work and survival possible. What this means is that while it is still in one’s own interest to act professionally (meeting obligations, doing one’s best work, representing one’s organization well), it would be naïve to think that sacrificing for one’s employer will, with rare exception, be “repaid” with job security or deferential treatment in tough times. While this phenomenon is more prevalent in the U.S., it is becoming more common in other regions of the world as well.

Similarly, unlike the clear line that used to exist between work and private life, organizations now typically see no line whatsoever and claim some level of ownership over all of an employee’s time (it tends to be even worse for managers). In what amounts to a kind of bait and switch, employers spend several hundred dollars on smart phones and tablets at “no cost” to the employee, then get tens or hundreds of thousands of dollars of the employee’s “off the clock” time in return. Again, this dynamic is more pronounced in American and Western contexts, but it exists at some level in most global contexts.

And this has happened at the same time that smart phones work essentially anywhere, any time, domestically and internationally. One cannot even escape wi-fi at 35,000 feet in an airplane anymore!

The ultimate irony is that even as we work more hours and have less space between our work and private lives, we are actually becoming less productive overall and certainly less creative. How is that possible?

The short answer is that working longer hours does not correlate with being more productive; in fact, recent research shows that the opposite is true. This is palpably ironic for both employees and the organizations they work for. See a lengthy post on this topic here.

While this new state of affairs can be stressful for employees, who seem to be working harder, with less work-life balance, at the same time that employer loyalty and job security are becoming ever more rare, it also provides a kind of freedom: employees are less tied to employers and thus more free to pursue new opportunities. Change can be difficult, but it is also a catalyst for growth, and the flip side of less job security is more professional options. Another potential benefit is that people generally realize faster upward mobility by moving from one organization to another than by waiting for opportunities within a single organization.

As leaders, we can actually take advantage of the current reality by identifying the employees who bring the greatest value, attitude, and integrity and finding ways to support and reward those individuals. We frankly want longevity in our best people. The fact is that organizations today are often driven by short-term thinking, fear of failure, and limited innovation because of a reluctance to take risks when times are hard. However, that same reality provides an opportunity to stand out in the crowd for leaders who care about others, take a long view, and embrace risk!

Postscript:

While all of my posts and articles are based at some level on my own experience over 35 years in the workplace, I rarely share specific personal stories. In this particular post, however, I think it is helpful to make an exception.

The reality is that you can do the right thing as a leader and still pay a heavy price. You can focus on quality or sustainability or transformational change and be penalized or even fired by a boss or a board who does not care about those things. Out of your own sense of integrity, you might prioritize compliance, legality, or ethics over financial imperatives when those who employ you are more “flexible” in their own interpretations of right and wrong. You may care about human beings and thus make decisions that value people over process or short-term financial gain. Fortunately, in most cases, when you do the right or smart or compassionate thing, you will be respected and rewarded, but not always. I have been separated from organizations for doing the things noted above when those things did not reflect the value systems of my employer.

However, just as a good leader takes the long view on performance outcomes, taking the long view on your own integrity over a professional lifetime is also the right thing to do. In 25 years in management positions, I have been rewarded and have benefited from doing the right thing far more often than I have been penalized. In the end, none of us wants our legacy to be how many people we fired or laid off rather than how many we coached and saved, or how many short-term profit or sales goals we exceeded while sacrificing quality or the survival of the organization. I have seen short-term, unethical thinking literally destroy a healthy organization, wiping out thousands of jobs, stranding tens of thousands of customers, and erasing hundreds of millions of dollars in equity. I have seen greed overrule compassion and brilliant strategy sacrificed to ego. And I have personally paid a price for challenging the status quo, but I have also been inspired by servant leaders and experienced the joy of leading a team through transformational change to achieve things they never thought possible. In the end, we will not be judged by our wealth or our status or our conquests, but by the good we have done, and that is a legacy for which we should be willing to occasionally sacrifice our own well-being.

Our Faustian Bargain With Technology

Most of us reflexively think of technology as more of a “good” thing than a “bad” thing and that is probably objectively true in some ways and contexts. If we look at fields such as medicine, transportation, and communications, for example, and compare how technology has affected those arenas today vs., say, 25 years ago, most of us would say that we are better off.

It is because of technology that someone in Abu Dhabi can buy an avocado grown in Kenya (and that farmers in Kenya know in real time where to get the best price for their products). It is because of technology that early detection of colon cancer has greatly increased survival rates and that surgeons can correct heart defects before babies are even born. It is because of communications technology that two people, regardless of location, can connect in real time and share virtually unlimited information of interest to both. Based on these examples, most people would say that technology represents a positive influence in our world and our lives.

On the other hand, technology has become so pervasive in virtually every aspect of our personal and professional lives that we have become fundamentally dependent on it to transact even the most basic tasks of our daily existence, to the point that most people now mediate their primary human relationships through technology.

Technology has done big things in the very recent past. It has democratized access to information. It has effectively eliminated time and geography as barriers to communication. It has facilitated automation that has completely changed how entire industries function—and how the humans in those industries work and interact. It has made some products and services much cheaper (or even free).

It has also been incredibly disruptive, both in how it impacts our lives and in the increasingly rapid pace with which new technologies enter the mainstream of human life and enterprise, fundamentally changing how we communicate, work, behave, and interact with one another. It has wiped out entire labor markets, and it has shifted the risk of failure from single entities to entire systems, which is a really, really big deal.

In the “old days,” when technology was mechanical or electro-mechanical, machines failed all the time. However, those failures were limited to individual machines (a car, a loom, a washing machine). Now, with almost all technology employing some sort of software, software that is connected to other things also employing software, we are vulnerable to massive, even catastrophic failures—and it’s already happening. Entire airlines are grounded across the globe. Power grids go down. Vehicle fleets are pulled off the roads—all because a string of software code fails, is hacked, or was flawed to begin with. The current process for writing that code is deeply flawed as well, but that is another post…

Consider that a new car typically requires millions of lines of code running on multiple processors. There is no way for any programmer, or team of programmers, to anticipate the billions of potential combinations of scenarios that a driver and the car (and hundreds of thousands of other cars and drivers) will encounter over millions of miles in highly diverse environments. As a result, a piece of code that is supposed to stop a car from accelerating, for example, given enough time and scenarios, will inevitably fail to cut the throttle even when the driver takes her foot off the gas pedal. This has already happened, resulting in accidents and even deaths. And the problem affects every single car running the same code.
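To make the combinatorial argument concrete, here is a deliberately toy sketch (hypothetical logic, not real automotive code): even a throttle-cut routine with just four yes/no inputs has sixteen possible input combinations, and a single untested branch can hide a case where the pedal is released but acceleration continues. Real vehicle software combines thousands of continuously varying sensor values and timing states, so this kind of exhaustive check is impossible at scale.

```python
from itertools import product

def throttle_should_cut(pedal_released: bool, cruise_on: bool,
                        brake_pressed: bool, fault_flag: bool) -> bool:
    """Hypothetical throttle-cut logic: return True when the car should
    stop accelerating. Contains a deliberate bug: when a fault flag is
    raised while cruise control is on, the pedal-release check is skipped."""
    if fault_flag and cruise_on:
        return brake_pressed  # bug: ignores pedal_released on this branch
    return pedal_released or brake_pressed

# Enumerate all 2**4 = 16 input combinations and find the cases where the
# driver has released the pedal but the throttle is NOT cut.
misses = [combo for combo in product([True, False], repeat=4)
          if combo[0] and not throttle_should_cut(*combo)]
print(len(misses))  # → 1 (cruise on + fault flag + pedal released, no brake)
```

With four inputs the dangerous combination is findable by brute force in microseconds; with millions of lines of interacting code and a continuous, open-ended environment, it is not, which is exactly the article’s point.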

It is becoming clear that there are at least a few areas in which our “bargain” with technology may have come at a very steep price: privacy, independence, vulnerability to failures, human development, mental health, and human relationships. And we have not even truly begun the next technology era of artificial intelligence and mixed/virtual reality! The reality is that we are all subjects in a very big experiment, and we frankly don’t know what the outcome will be, particularly for young people (tech natives), who represent the first generation in human history to have lived their entire lives mediated through technology and tethered to smart devices.

This horse is way, way out of the barn. We are not going back to a pre-software world, and on balance, most of us are relatively satisfied with the technology we use every day (and depend on without even knowing it). But there are two realities we should be thoughtful about.

One is that there will continue to be massive, catastrophic systems failures, and it will get worse before it gets better. Part of this isn’t even technology failing directly; it comes from support systems failing. Ask the people of Puerto Rico about life without smart phones, ATMs, and internet, all of which need electricity and other infrastructure to function.

The second reality is that we can still individually carve out space in our lives that is mostly tech-free, and we should do so on a regular basis. As human beings, our relationships probably need time with others that is not mediated through technology. We know that spending long, unbroken hours engaged with laptops, tablets, smart phones, and video games has documented effects on our brains and bodies (think concentration, sleep, and eating patterns). We also know that social media without breaks can increase anxiety, decrease self-esteem, and generate a great deal of stress. A growing body of research shows that human beings need and benefit from extended periods of time in social and natural environments that are not mediated in any way by technology. At least for now we still have some control over that, and we should regularly exercise it for our own benefit and the benefit of others.