At a high level, to be a truly effective teacher, one must, consciously or unconsciously, have a theory of practice about pedagogy (teaching and learning) itself. Imagine for a moment if a pilot had no theory of aerodynamics, or a physician had no theory for diagnosis or treatment. Doesn’t seem likely, but that is standard operating procedure for professors in most institutions of higher education.
Surprisingly (if not shockingly), many instructors at the university level enter classrooms every day without any formal training on how their choices and behaviors will support, or militate against, learning in their students. The dirty little secret about college-level instructors is that the vast majority of them have teaching roles because they have content expertise in some discipline—not because they have been taught anything about teaching. Some have natural proclivities or amenable traits, which certainly helps, but virtually no full-time, tenure-track professors outside of Education departments were or are hired because of their teaching expertise or effectiveness. Some colleges and universities do provide occasional, usually ad-hoc, workshops to help instructors improve their effectiveness in the classroom (typically focused on isolated tasks such as creating an assessment rubric or writing a lesson plan), but relative to the time, effort, and credentials dedicated to their content expertise, even the most ambitious training programs are comparative drops in a bucket. This would be similar to medical doctors completing training in the basic sciences but receiving no clinical training before engaging with patients!
There really isn’t a comparable situation in other professions (other than when experts in a given field are asked to teach neophytes). It is only in higher education that there is essentially no expectation or requirement that the practitioner have any expertise or credential for doing a large part, if not the majority, of his or her job. It’s frankly a little disconcerting. Of course, many faculty have spent long hours in the classroom, and through that experience have intuitively internalized ways of doing things that are likely to be more effective than others. But despite many years of such on-the-job training, most university professors cannot explain even the most basic learning or assessment theory, let alone the neuroscience behind how we make memories, learn, and develop new skills.
One exception to this reality can be found in most online or eLearning classes and programs. The reason is simply that online classes generally run on a learning management system (LMS) of some kind that requires instructors, instructional designers, or both to make pedagogical decisions as part of the course design and delivery process. Even if professors don’t understand the underlying theory, they are at least forced to think about fundamental questions such as how they will present content and assess student progress, how students will practice new skills, and how learners will discuss and process what they are learning. Another place where there is more focus on teaching is in career colleges, where curricula are more applied and instructors are more likely to be professional practitioners in the field being taught. Traditional, campus-based courses typically have no such pedagogical/application imperative, and professional practitioners are far less valued than those with terminal academic degrees. More on that in another post!
The good news is that a theory of practice, or pedagogy, can be learned by any professor, and even remedial training can greatly improve teaching effectiveness. However, the reason that professors rarely invest substantial effort in developing pedagogical expertise is that being an effective teacher really isn’t important on many campuses—and such training is rarely available in any sustained way regardless. This is particularly true in large research universities, where getting hired, getting promoted, and getting tenure have very little to do with teaching. The situation is a little better today than, say, 20-plus years ago, but in traditional institutions, faculty are still broadly rewarded for research, publishing, and securing grant funding far more than they are for excellence in teaching. As large universities shift to more and more contingent (adjunct) instructors, the overall faculty focus on research and publishing will decrease, but there is no indication that institutional focus on teaching will grow in any appreciable way. Any material improvement will likely have to come from the adjunct faculty themselves.
So, what does this all mean for the institution of higher education and the people, both faculty and students, in it? First, as an educational enterprise, the university is structurally flawed and has been from its inception. This is a global phenomenon. That does not mean that students don’t learn or that college degrees don’t provide value. It does mean, however, that most students do not learn nearly as much as they are capable of, and that many others are so poorly served that they are unable to complete programs of study. In fact, most people do not realize that in the United States, over the entire history of higher education, the system has failed far more students than it has successfully served. Well over half of all the students who enroll in a college or university never complete a degree program! Even among the minority of students who are successful in the sense that they complete a degree, many succeed not because they have edifying learning experiences, but because they effectively manage (survive) the system for long enough to earn a degree. Indeed, in many cases, employers want employees with college degrees not because of what the students have learned, but because having a degree proves that a person can start, persevere through, and finish a challenging long-term project.
How is it possible that a structurally flawed system that fails over half of its constituents has continued to operate without meaningful change for centuries? The simplest answer is that there is no viable, scalable alternative for post-secondary education. That will likely change in the not-too-distant future as more employers in certain fields move away from requiring college credentials and toward industry-based credentials. This is already happening in IT/Computer Science and will expand to other technical fields as well. Some disciplines, such as those in the health sciences, will continue to require college degrees as long as professional licensure for those positions continues to require degrees. If that barrier falls, then so will the university monopoly. Similarly, professions that require graduate degrees will also support the university monopoly in post-secondary education for some time into the future. Whether non-university, post-secondary educational models will provide teachers with pedagogical knowledge remains to be seen. However, a primary difference that already exists is that such models tend to be competency based, providing a greater likelihood that the learning students do accomplish is more immediately applicable in work settings. One can debate the relative value of a liberal vs. technical/vocational/professional education, but the same structural flaw applies regardless.
All stakeholders would certainly benefit if university leaders dedicated as much attention to the teaching skills of their faculty as they do to their academic credentials, research, fundraising, etc. In fact, institutions that are forward-thinking enough to make pedagogical excellence a core objective and priority will differentiate themselves from the crowd, creating a significant competitive advantage in the higher education market.