Let's start with the reasons for research:
- Research is necessary to maintain and grow knowledge. It is impossible for me to get published unless I stay abreast of what is going on in industry and in academic journals. I must read tons of articles, discerning who has said what, why it is important, and how it applies to my thoughts and industry. Research requirements force me to learn. And as knowledge expands over time, new discoveries often replace old ideas. By staying current with research, I stay in better touch with reality.
- Research (usually) pushes professors to think objectively and scientifically. I have received rejections from a number of journals. Sad, but true. Often these rejections highlight flaws in my thinking or in the way I present my claims. While I occasionally disagree with reviewer comments, they have by and large been well founded. Their comments push me to improve my research methods, thinking, and writing skills to avoid biases, subjective interpretations, and sloppy writing.
- Research helps verify that a PhD was not an accident, a fluke, or an aberration from a diploma mill. Because let's face it, not all PhDs are equal. The research requirements at many universities independently verify a professor's knowledge and ability to think. By using blind peer-reviewed journal publications, universities can better assess a professor's expertise while avoiding conflicts of interest, confirmation bias, and groupthink.
- Related to the previous points, research output becomes a measure of expertise. This expertise is not always what academics like to pretend it is, as I explain later, but it is important in assessing a professor's competence. It establishes a professor as an expert in their field.
- To a large extent, and often contrary to many university missions, research is still emphasized more than teaching. I have not heard of a single university that requires new faculty to take an introductory course on "how to teach". That's not to say I haven't had some excellent classes on specific techniques and technologies, such as writing in the curriculum and online education technologies. But no single introduction-to-teaching course has been required. Even during my interview process, only one school required me to teach a class in front of students. And when it comes to tenure decisions, research can make or break you, while "adequate" teaching skills often suffice.
- Research can often be tangential to what you teach. This is in part because research is so specialized that it's difficult to find topics that readily translate into class topics. In my web development class, I spend maybe 95-98% of my time talking about technology and business processes that have existed for 10 years or more. It is important that I do that, so the students understand the fundamentals. Yet in most academic journals, 10 years ago is old news and not worthy of publication. While I do research in the area I teach (web technologies), most of my topics are so specialized that no undergraduates and few graduate students could comprehend the topic, much less appreciate the new knowledge.
- By its nature, research emphasizes depth, not breadth, of knowledge. I get zero credit for my background in science, or for my significant reading and understanding of philosophy and economics. And yet those experiences help me develop examples for class material that integrate knowledge across disciplines, enriching the educational experience.
- Nor do I get credit if I develop external websites, do significant consulting, or create popular workshops or seminars in my field. These can all serve as means of demonstrating expertise (assuming I am in demand), but they are rarely considered in tenure decisions. This is unfortunate, because some of the richest learning occurs when doing an activity, not just observing someone else doing it. What better way for a professor to build his understanding of a subject than to actually do it?
Where did things go wrong? The traditional research paradigm developed when the humanities and hard sciences represented the core of higher education. At the time, research was not as specialized, so new findings were far more applicable to undergraduate and graduate courses. As universities added more professional programs, like engineering, business, medicine, and law, scientific research continued to serve as the primary determinant of expertise. Simultaneously, the basic theories and frameworks became well established, so new research focused on ever more abstract and abstruse subjects.
There is no question that professors need to deeply understand the concepts in their field (be experts) and that universities need verification of that expertise. Because there are established and rigorous methods for judging expertise, those measures are emphasized. But who says universities cannot adopt new means of measuring expertise? Let's not forget that establishing expertise is only part of the problem. Professors need to know which of their field's core concepts are most important for students to understand. They need to know how best to present those concepts. And they need to do it effectively. These pedagogical skills are not adequately emphasized.
I'm not sure I have an answer to what I see as a dilemma in higher education today. I certainly see room for more pedagogical research, especially in schools nominally claiming to be "teaching" oriented. I also think more consideration should be given to establishing objective measures of teaching effectiveness. The current system of student evaluations is widely known to be useless, yet it persists. And I see room for more avenues for establishing and maintaining expertise within a given field, especially in the business schools where I reside.
What do you think?