Dr. Sven van de Wetering: The University after the Year 2000

October 20, 2012

Interviewer: Scott Douglas Jacobsen

Numbering: Issue 1.B, Subject: Psychology

Place of Publication: Langley, British Columbia, Canada

Title: In-Sight: Independent Interview-Based Undergraduate Journal

Web Domain:

Individual Publication Date: October 20, 2012

Issue Publication Date: May 1, 2013

Name of Publisher: In-Sight Publishing

Frequency: Three Times Per Year

Words: 5,901

ISSN 2369-6885

{Editor Note: Written prior to 2000}

The University after the Year 2000

I am a product of university education.  I have three university degrees, and am well on my way to earning a fourth.  I should be trying to use myself as a model of what is good about university education.  And yet, my first response to a competition for the most interesting essay on the topic of “The University after the Year 2000” was to write a truly boring essay on that topic.  What does this say about the education I have received?

I wish I could say I am an exception, that the university is in fact a highly interesting, stimulating place in which undergraduate students, intoxicated with learning, move eagerly from class to class, enjoying a heady mix of exciting, cutting-edge knowledge and profoundly engaging instructional techniques.  I would love to say those students are brimming with enthusiasm, and that everything they do is imbued with that enthusiasm.  It would be good to believe that their discussions are animated and their papers overflowing with intellectual joie de vivre.

Sadly, I have come to the conclusion that this is not so.  For one thing, it is evident that undergraduate students are being induced to write papers just as boring as mine.  I know, because I have read numerous student papers as a teaching assistant.  Many student papers come close to putting me to sleep.  Furthermore, students are often bored, as well as often boring.  I can see it in the glazed eyes at lectures, the apathetic silence in tutorials, the slumped postures in library carrels.  The primary motivating factor for undergraduates at every university I have attended is the same: terror of getting poor marks.  Compared to this, the intrinsic joy of acquiring exciting new knowledge seems to be a feeble motivator; sometimes students actively suppress their drive for new knowledge for the sake of greater efficiency in chasing marks.

How did the university get to be such a boring place?  Part of the problem, of course, is the competition for marks, which is fueled by the equally frantic competition for various other goods that are dependent on marks, such as scholarships, places in graduate school, jobs, and maybe even self-esteem.  For better or for worse, we live in a competitive society, and this society creates a context where competition for marks may be inevitable.

Whenever students focus on marks or other extrinsic sources of motivation, they are bound to lose awareness of their intrinsic sources of academic motivation, such as joy in acquiring new knowledge.  If students don’t believe they are motivated by love of knowledge, they genuinely do come to value knowledge less (except as a means to various ends).  One result of this is the plethora of competent but uninspired term papers that afflicts university markers.  Another is the large admixture of cynicism and apathy in students’ attitudes toward higher education.

Competition for marks is not the only source of the problem.  It is true that competition for marks tends to drive out students’ intrinsic desire for knowledge.  Nevertheless, this intrinsic desire for knowledge would not be so easy to drive out if this desire were firmly entrenched in the first place.  Something that truly excites a person will continue to excite them even after they find themselves doing it for ulterior motives.  In this essay I want to discuss two problems that I believe have undermined the inherent excitement of learning by weakening the esteem in which academic knowledge is held.  These problems are the breakdown of metanarratives of legitimation and the fragmentation of knowledge.

The Breakdown of Metanarratives of Legitimation

Jean-François Lyotard (1979) defines the postmodern condition as a state of incredulity toward metanarratives.  A metanarrative is a large narrative structure within which the day-to-day stories that help us make sense of our lives are embedded.  The Christian construal of the course of world history, centering on the fall from grace, the incarnation of God, and the subsequent salvation of the faithful is one sort of metanarrative.  Within this grand metanarrative, people could give meaning to their day-to-day activities by asserting that those activities helped glorify God, or else that they served to improve their personal chances of doing well in the next world.  The Enlightenment ideal of human progress was a very different sort of metanarrative, one that was particularly valuable in legitimating organized inquiry and making it seem meaningful.  Marxism was one variant of that metanarrative.

Lyotard asserts that skepticism toward such metanarratives has become a standard feature of late 20th century discourse.  He also claims that such skepticism is not necessarily a bad thing.  I disagree.  I believe that the inability of most people to heartily believe in some metanarrative has had very destructive consequences.  For all their faults (chief among them being the fostering of intolerance and dogmatism), metanarratives do also have one important virtue: they give people a sense of being involved in an important shared enterprise.  This sense of doing something important together is practically a prerequisite for enthusiasm.  Without this sense, desire for individual accomplishment is the only spur to purposeful activity.

This individual competition is a poor substitute for shared goals as a motivator.  Desire for individual accomplishment in the absence of superordinate common goals fosters competition for its own sake, without providing any sense that the activities that constitute this competition are meaningful in their own right.  People who lose in the great competition have little with which to console themselves, while those who win must enjoy their laurels in an atmosphere poisoned by the resentment of those they have defeated.  The people who hand out the winners’ laurels find the atmosphere even more poisoned, because the competitors harbor lingering suspicions that the whole evaluation process was unfair.

It was not too long ago that the universities, and people engaged in the organized acquisition of knowledge in general, still had a metanarrative that helped them imbue themselves with an overarching sense of purpose within the larger society.  This metanarrative was the story of human progress, a story that presumably ended with the protagonists living happily ever after.  The systematic quest for knowledge that academics engaged in was at the cutting edge of the quest to improve the human lot.  Knowledge meant progress because it led to the improvement of techniques for wresting the good things in life from intransigent nature, as well as helping to create more rational human institutions to take the place of institutions that had been built in ignorance and that therefore caused needless suffering.

In recent times, this dream of using knowledge to bring about steady progress in the human condition has become much less credible. The Holocaust, the invention of the atomic bomb, and other horrors of the 20th century have made it much more difficult to equate the acquisition of technical, scientific, and social scientific knowledge with the general betterment of humanity.  Academics can no longer assert that they are acquiring knowledge for the sake of a better world, at least not when they are trying to legitimate their demands on the public purse.  Instead, they must limit themselves to more limited and specific claims.  “It may be that knowledge in general does not serve the human race,” they will say, “but the particular type of knowledge I am trying to produce will be cost effective.  It will have practical applications.  Students who learn what I am finding out will be able to get better jobs, earn more money, and pay more taxes to the government.”  In other words, academic teaching and research are no longer an important, grand enterprise; at best, they are still somewhat useful ones.  Small, practical goals are the order of the day.  For an academic to claim any more grandiose ambitions would smack of megalomania.  The professor who gets genuinely excited about what he or she is doing becomes an anomaly, a true believer in a world full of skeptics.  It is often better for such enthusiasts to hide their enthusiasm beneath a veneer of hard-nosed pragmatism, at least in front of the uninitiated.  The undergraduates in the lecture theaters are the first to feel the effects of this veneer, and we already know what happens to them: They get bored.

The Fragmentation of Knowledge

My personal epiphany concerning the fragmentation of knowledge came when I was doing background reading on theories of prejudice, my personal area of graduate research in psychology.  My interest in prejudice stems from my strong conviction that there is too much hatred in the world, and that the separation of people into myriads of mutually hostile groups is bad for everybody.  As far as I can make out, virtually all researchers in prejudice share my convictions.  Thus, I expected that researchers in prejudice would practice what they preach and reach out to all other researchers in prejudice, without regard to minor differences of research emphasis, departmental affiliation, or theoretical orientation.

This is not what I found.  Instead, the study of prejudice is profoundly divided.  Researchers who study prejudice from a psychological point of view write as if the sociological theories of prejudice did not exist.  Most sociologists return the favor.  Cognitively oriented psychologists who assert that much of prejudice is due to processes common to all of us tend to dismiss psychoanalytic theories that emphasize the role of bad child rearing in creating prejudiced individuals; however, they do not then replace this with a theory of their own that explains why some individuals are more prejudiced than others.  The attitude of Marxist sociologists of prejudice toward neoliberal sociologists of prejudice borders on contempt.  The attitude is mutual.  There are many other divisions in the field of prejudice research; there are at least 28 different theories of prejudice.  This sampling should give some indication of the extent to which prejudice researchers exhibit the same incomprehension, intolerance, and outright hostility among themselves that they decry among others.  Even more disturbing, these researchers seem unaware of their own hypocrisy in this matter.  The study of prejudice, with its obvious practical applications, is severely hampered by the many divisions within what should be a seamless web of knowledge and understanding.

Like the breakdown of metanarratives, the fragmentation of knowledge makes it more difficult for people in universities to believe that they are involved in a coherent, important enterprise.  The causes of this fragmentation are quite different from those for the breakdown of metanarratives, though.  One very simple cause of this fragmentation is the explosive growth of systematic knowledge, combined with the inability of individual human beings, with their relatively fixed resources of time and attention, to learn any substantial proportion of that body of knowledge (Thorngate, 1990).  Collectively, academics come to know more and more, but one of the main effects of this growth of collective knowledge is that, individually, academics come to be ignorant of more and more.

The growth of systematic knowledge to the point where it exceeds the capacity of individual knowers is an irreversible process.  Nevertheless, this process does not have to lead to the ever greater fragmentation of knowledge.  Other factors that encourage this fragmentation can potentially be reversed.

One such factor is a widespread contempt for generalization and synthesis, at least within the hard sciences (Greene, 1997).  Such generalization and synthesis are often equated with popularization, which is not considered a serious scientific activity.  Progress in science is seen as being constituted exclusively by the discovery of previously undiscovered facts of nature.  Organizing already discovered facts tends to be considered a reshuffling of existing knowledge, rather than the creation of new knowledge.  Thus, organization and synthesis of existing findings is not considered research.  A similar ethos exists in psychology, where people publishing empirical articles are considered to be engaging in active research, while those publishing review articles are not.

This blinkered attitude toward integration can and should be changed.  Incoherent knowledge is a contradiction in terms, yet incoherent knowledge seems to be the goal toward which we are steering.  Greene (1997) fears that we are heading for a state of affairs in which the world is dominated by the products of hard science, but in which nobody within that world has a scientific world view.  Such a state of affairs would be more than ironic; it could be catastrophic.  The growth of human knowledge, even in its present fragmented form, has resulted in a growth of human power to change the environment.  If this growth of power is not matched by a growth of wisdom, a growth of the capacity to understand the manifold consequences of human actions, then the human capacity for inadvertent destruction will also increase.  It is hard to know how much more inadvertent destruction the world can tolerate before true disaster strikes.

What Can Be Done?

People in the 21st century are not doomed to be unable to feel a sense of shared purpose.  The academic world is not destined to break up into ever more numerous, more specialized, and more mutually uncomprehending fields of study.  The students after the year 2000 are not yet condemned to four years of academic boredom.  Such outcomes look probable, but they can be avoided if the problems discussed above are recognized and appropriate steps are taken to alleviate them.

Three changes will need to be made if universities are to combat the drifting purposelessness of postmodern skepticism and the stultification of fragmented knowledge.  These changes will consist of the formulation of a new, credible metanarrative justification for the organized pursuit of knowledge; a change in incentives to professors to encourage the integration of knowledge; and a change of the undergraduate curriculum to encourage students to develop broad understanding.  In the process of implementing these changes, professors and students may find that their enthusiasm for the life of the university is at last rekindled.

New Metanarratives of Legitimation

When I speak of universities needing new metanarratives of legitimation, I am speaking of something more general than mere statements of purpose, such as the one recently drafted for Simon Fraser University by David Gagan (1998).  This statement and others like it set out specific goals relating to teaching, research, support for international students, etc.  However, a true metanarrative of legitimation does more than just set out specific goals for the institution: It narrates a project that is thought to comprise a goal of the society as a whole, and attempts to delineate the institution’s function within that project.  Statements of purpose talk about the goals of the institution, but leave the larger goals of the society within which the institution is embedded implicit.

Many years ago, the organized pursuit of knowledge derived its sense of legitimacy from its pivotal role in promoting human progress.  Nowadays, the grand epic of progress looks more like a farce, and the development of technology looks more like a way of creating amusing playthings to fuel increased consumer spending than it does like the best hope for the happiness of the human species.  Nevertheless, the dream of progress was not a fraud.  Many of the goals for which proponents of progress strove now look silly not because they were unrealistic, but rather because they have already been achieved, and are therefore seen as trivial.  There are large sections of the world where nobody ever starves to death, where people seldom work themselves to death at mind-numbing manual labor, where capricious and arbitrary power is, if not eliminated, at least held within strict bounds.  Access to education has become enormously easier.  The sort of luxury and comfort that used to be the private preserve of the very rich and powerful has become common to all but the very poor.

The main reason for disappointment in the achievements of progress is not a shortfall of achievement compared to expectation, but rather the failure of people to be made happy by the fact that they are materially much better off than their remote ancestors were.  Happiness does not come from the absolute level of one’s comfort, but rather from the match between expectations and reality.  Reality has improved, but expectations have increased apace, and the ratio of the two remains about the same.

The metanarrative of progress has not been discredited, but rather ended.  Now we’re in the part of the metanarrative that says “and they all lived happily ever after.”  Even as a child, I always thought that was the most boring part of any story.  The ultimate goal is not to live happily ever after, but rather to be involved in the sequel to the story.  The end of one story is not the end of all stories.  We have not reached the end of history, as Francis Fukuyama (1992) asserts.  We are at the stage where we ask, “Where do we go from here?”  This sort of identity crisis is difficult and painful, but should not last forever.

At the time of the rise of the metanarrative of progress, people were powerless in many ways.  They had little control over the natural world, which sometimes bestowed its bounty, but sometimes brought plagues or starvation.  Most people had little control over the course of their lives, which were heavily determined by the traditions governing their authoritarian societies, and by the positions into which they had been born.

Now technology greatly increases people’s power over the environment, while liberal democracy allows ordinary individuals to have a greater degree of control over their lives than would have been imaginable a few centuries ago.  The problem now is not powerlessness, at least not powerlessness of the sort that troubled our ancestors.  The problem now is that power has outstripped understanding.  As a result, it becomes increasingly easy to be harmed by exercising one’s power, rather than by being unable to exercise it.  People in the richer countries no longer die of starvation because they cannot exercise the option of eating food, as opposed to not eating food.  Instead, they die of heart attacks because they can exercise the option of eating food high in fats and salt, as opposed to food high in vitamins and complex carbohydrates.  People are no longer at the mercy of arbitrary despots.  Instead they are at the mercy of their own inability to distinguish an inspiring demagogue from a true leader at voting time.  What people need now is not greater power over the environment and the course of their own lives, but rather sufficient understanding of the consequences of their actions to be able to make intelligent use of the power they already have.

This acknowledgment of the need for understanding is not the same as the widespread truism that we now live in an information age.  Information can take the form of thousands of unconnected pieces.  Information is the sort of thing computers deal with much better than humans do, yet computers are still more or less devoid of understanding.  Computers can easily process huge volumes of information, but they are still incapable of doing many tasks that are easy for human beings.  A large part of this inability has to do with something artificial intelligence researchers call the frame problem: Computers don’t know when to invoke information from outside a specific knowledge domain to solve a particular problem, nor which information is likely to be useful.  In other words, something more than mere information is needed for understanding.  That something more is the integration of that information into a coherent whole, leading to an intuitive feel for what sort of information should be used for decision-making in what sort of context.

I believe that the search for understanding has the potential to be the next great epic, the grand quest our society can undertake now that the quest for material and social progress has reached the point of diminishing returns.  This is the new metanarrative we need to tell ourselves.  Our situation used to be like that of a gardener who had trouble with her garden because she had no way of killing weeds.  She set out to acquire ways of killing weeds: detonating bombs, spraying with herbicides, setting fire to the garden, strewing salt over the ground.  Now she has virtually unlimited power over the weeds.  What is needed is not more power, but rather enough understanding to be able to use that power wisely, so that a healthy garden will be able to grow over the corpses of the weeds.

This quest for the understanding and wisdom needed to make good decisions is a long-term project for the society as a whole; nevertheless, it is clear that universities have a special role to play within this project.  No other institution is so well equipped to encourage people to think and to organize their understanding of the world.  The university is the place where the atoms of knowledge that gave us power have the potential to be assembled into the coherent knowledge structures that may eventually allow us to use our power wisely.  Of the various institutions that engage in research, only the university is sufficiently detached from short-term practical applications of research findings to be able to think about long-term costs as well as short-term benefits of new technologies and new knowledge.

If universities decide to tell this sort of story in order to legitimate themselves, they will have to change direction.  One thing they will have to do is resist excessive encroachment of purely practical concerns in the curriculum.  Practical knowledge is an important part of what universities have to teach, but it can never constitute the whole.  Many people take universities to be little more than vocational training institutes.  Vocational training is undeniably important, but the university itself will be dead if it ever devotes itself exclusively to such training.  Short-term practical concerns create an atmosphere of excessive urgency.  Urgency is the enemy of reflective, integrative thought of the type that leads to broad understanding.  If the university devotes itself primarily to the immediately practical, it will have sacrificed the living, breathing metanarrative of the quest for understanding to the moribund god of the quest for material progress.

The other changes that this new metanarrative will necessitate will involve increasing the importance placed on integration of knowledge and the creation of broad understanding.  Specific mechanisms for doing this will be discussed in the next two sections.

Change of Incentives to Professors

People make fun of the “publish or perish” incentive structure that governs the careers of professors.  Actually, the “publish or perish” mandate is not even the most pernicious pressure on academics.  The worst problem is the type of publication that is taken seriously.  In the hard sciences, and in many social sciences as well, what is expected is the publication of a relatively steady stream of empirical research articles in high-status journals.  Professors have to demonstrate that they are at the cutting edge of new knowledge creation by designing and carrying out empirical studies nobody has ever carried out before.  Writing a book that integrates existing knowledge into a compelling new framework is much less consistently rewarded, unless one hits the jackpot and achieves instant international fame with one’s book.

The result of the mandate for academics to constantly carry out new empirical studies is that the academic world produces an enormous quantity of research.  This can be considered good news, bad news, or terrible news.  The good news is that most of this research is methodologically sound, and most of the findings are reliable.  The bad news is that most of this research investigates completely trivial questions, questions whose answers, however reliable they are, have virtually no capacity to enrich our understanding of the way the world works.  The really terrible news, though, is that it is becoming increasingly difficult to distinguish between profound findings and trivial findings, because nobody is rewarded for sifting through this great mass of findings and trying to figure out what they all mean.  Occasionally a review article is published that attempts to survey the work in an area; even more rarely a book appears that tries to integrate the findings from several areas into a coherent framework.  More often than not, these books are written by science journalists, rather than academic scientists, which is surely an indicator of the weak incentives present for this sort of integrating activity.

This lack of incentives for integration is not just a lack of incentives to publish integrative works; it also appears to consist of a disincentive against personal thought and the private integration of knowledge.  Thorngate (1990) reports that professors of psychology typically spend only 3-6 hours a week reading scholarly literature.  This is far too little to allow them to construct a comprehensive personal understanding of the context within which they work.  Once again, the result is the sort of lack of perspective that encourages the publication of methodologically sound but trivial investigations.

The reason professors read so little is not hard to find.  Professors are under tremendous pressure to teach, conduct research, and perform various administrative duties.  Something has to be sacrificed in their busy schedules, and unless they want to give up sleep, reading and thinking are likely to be the first activities to be squeezed out.  In order to reverse this trend, the universities after the year 2000 will have to either find incentives to encourage professors to read more or (probably more effectively) decrease the pressures for other sorts of activities.

Even if professors are allowed and encouraged to read a little more, the field of organized knowledge is too vast for individual scholars to completely understand the entire context within which their research fits.  For this reason, Campbell (1969) advocates what he calls a fish-scale model of omniscience.  This means that, although no individual can possibly grasp the whole of organized knowledge, a large number of individuals have a better chance of evenly covering the field of what is known (instead of being sequestered in isolated subspecialties and subsubspecialties) if every scholar attempts to be reasonably well versed in several separate fields, rather than thoroughly grounded in one specialty and almost completely ignorant of neighboring specialties.  This can be done by promoting contacts between different departments, encouraging faculty members to subscribe to idiosyncratic mixtures of journals, and fostering conventions that cross conventional disciplinary boundaries.

In addition to this emphasis on acquiring knowledge in different disciplines, there also has to be more reward given for active efforts at synthesis.  Ongoing theoretical work that may eventually result in an integrative book should be rewarded in just the same way (i.e. in terms of its effects on career advancement, tenure, etc.) as ongoing empirical work that results in an extended series of short journal articles.  After all, the work load involved in such theoretical, integrative work is equivalent to or greater than that entailed by empirical work, and the benefits to knowledge (keeping in mind that knowledge must be known by somebody, and not just be sitting scattered and disorganized on library shelves) are also potentially greater.

Undergraduate Curriculum

Because taking electives is a requirement for graduation, undergraduate students at Simon Fraser University and other North American universities are already encouraged to cross disciplinary boundaries far more than are the faculty members who teach them.  Nevertheless, still more needs to be done to encourage undergraduate students to acquire the breadth of perspective needed to develop the kind of understanding I have been promoting in this essay.

One weakness of electives in promoting breadth of understanding is that there is no incentive for integration.  Students are evaluated in each of the courses they take, and therefore spend a great deal of time memorizing course contents before exams.  However, they are never, under any circumstances, required to make use of information from two different courses simultaneously.  It is perfectly possible for an undergraduate student to take a course in ecology, another in economics, and a third in political science, and yet never have to deal with the fact that ecological decisions have economic consequences, economic decisions have ecological consequences, and both types of decisions have political consequences.

What I would like to advocate is the creation of an undergraduate course called “synthesis”, which would be compulsory for all undergraduates at the second and again at the fourth year level.  This course would attempt to teach the ways in which several different disciplines can be brought to bear on a single problem.  Each student would choose the problem to which they would try to bring several disciplines to bear.  This problem could be either theoretical (e.g. “What does current knowledge on the psychology of motivation tell us about the plausibility of economic concepts of utility, and how does this relate to the economics of environmental protection?”) or practical (e.g. “How can I promote racial and ethnic cooperation on campus?”).  A requirement of the synthesis course would be that information from courses in at least three disciplines would have to be brought to bear on the chosen problem.

One important obstacle to the integration of knowledge is the fact that any given field tends to make knowledge claims that either contradict or are incommensurable with the claims of other fields.  Thus, the budding undergraduate synthesist will have to have tools to assimilate diverging knowledge claims.  This means that every student, regardless of their field of study, will have to study logic, rhetoric, and epistemology.  This should probably accompany a more general grounding in philosophy.  Many students seem to hate philosophy, but this does not mean they don’t need it.

It will be recalled that the objective of increasing the breadth of understanding cultivated by undergraduates is to promote wise, knowledgeable decision making by people who graduate from university, in order to allow us as a species to use our great power without producing horrible side effects.  Not all branches of knowledge are equally valuable in helping people assess the potential side effects of powerful actions.  Most of the side effects of human actions are either social or ecological in nature.  Thus, the range of different courses students take should ideally include at least one, and preferably several courses in both the social sciences (e.g. anthropology, sociology, political science, psychology, economics) and in sciences related to ecology (ecology itself, other biology courses, chemistry, climatology, geography, etc.).  It may also be appropriate to introduce the occasional problem-focused course, one that examines a single problem from several different disciplinary perspectives.

Needless to say, these rather elaborate breadth and integration requirements would exist alongside the more usual requirements for professional training in the field of the student’s choice.  The likely outcome of adding these requirements would therefore be to lengthen the time it takes to earn a degree, perhaps from four years to five.  This is not necessarily a disadvantage; the increase in the capacities fostered by such an undergraduate program would more than compensate students for the extra time.

History Repeats Itself

The present essay has focused on the evils of excessive specialization and on the potential benefits of encouraging integration and the ability of both students and professors to perceive knowledge as an organized whole.  This plea is not novel.  Spranger (1910) reports that the same problem of overspecialization in higher education was widely perceived by intellectuals early in the nineteenth century, and that somewhat similar solutions were advocated.  The fundamental unity of knowledge was a basic premise of this intellectual movement.  The actions advocated to foster the ability of academics to perceive this unity were the integration of all branches of academic learning into a single institution and the centering of that institution on the faculty of philosophy.  The first of these proposed actions was undertaken and has not been undone: most higher learning still takes place at universities that contain several different faculties.  The second was also undertaken, but has since come almost completely unraveled: philosophy has assumed a very subordinate role in the university, and no other integrative discipline has taken its place.  The present proposal to have students complete courses on integration, logic, rhetoric, and epistemology would effectively put philosophy back at the center of the university.  If properly applied, this proposal could also increase the intellectual sophistication of the university’s graduates and improve the general populace’s ability to call on a wide range of knowledge when making important decisions.  It might just make the world a better place, one in which the tremendous powers we have gained from sophisticated technology are used wisely, with an eye to both the benefits and the long-term costs.

Best of all, students might once again come to believe that they are involved in an important shared enterprise, one that enhances their dignity regardless of how well they do in competition with other students.  This belief could make them more enthusiastic, and banish the boredom of student life.


Universities have the potential to go in two different directions after the year 2000.  One possibility is a continuation of the present course, in which universities are seen rather cynically as factories producing graduates who can get good jobs in a basically directionless society.  Such a course, in addition to its potentially destructive consequences for the world as a whole, is damaging to the morale of students, who come to see university education as little more than a hoop to jump through on the way to their half-hearted hopes for a good life.

The alternative I advocate for universities in the new millennium is a revitalization based on a rethinking of the role of universities in the larger society.  If it is recognized that the major lack in Western societies is no longer wealth but understanding, then universities will no longer be the servants of those who promote the production of ever greater levels of wealth.  Instead, they will constitute the driving force of a cultural renewal with long-term beneficial consequences.  Such a change in the perceived role of the university could not help but improve the morale of those associated with it, as well as the quality of both the written products of academics and the learning process of students.


Campbell, D. T. (1969).  Ethnocentrism of disciplines and the fish-scale model of omniscience.  In M. Sherif & C. W. Sherif (Eds.), Interdisciplinary relationships in the social sciences (pp. 328-348).  Chicago: Aldine.

Fukuyama, F. (1992).  The end of history and the last man.  New York: Avon.

Gagan, D. (1998).  Proposed statement of purpose.  Simon Fraser News, 12(2), 2.

Greene, M. T. (1997).  What cannot be said in science.  Nature, 388, 619-620.

Lyotard, J.-F. (1979).  La condition postmoderne: Rapport sur le savoir.  Paris: Éditions de Minuit.

Spranger, E. (1910).  Wilhelm von Humboldt und die Reform des Bildungswesens.  Tübingen, Germany: Max Niemeyer.

Thorngate, W. (1990).  The economy of attention and the development of psychology.  Canadian Psychology, 31, 262-271.


In-sight by Scott Douglas Jacobsen is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.


© Scott Douglas Jacobsen and In-sight, 2012. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Scott Douglas Jacobsen and In-sight with appropriate and specific direction to the original content.



  1. Wayne Podrouzek

    I, for the most part, agree with your analysis – BORING, and why? A couple of reasons: we have abandoned the foundations of “education” in the classical sense. There are several issues here. First, we, as universities, no longer “educate,” we “credential.” Instead of modelling the foundations of critical thinking and analysis, we pour factoids into students’ heads so we can meet “measurable” learning objectives (students will be able to name four reasons for …). Second, we let the sociologists, so-called humanists, and related folks get control of the university system. For example, there is the notion of the “safe learning environment.” This is a space where students never, never need feel uncomfortable, never have to be in a state of disequilibrium regarding their beliefs, where if you question their religious faith or other basic beliefs (O! heaven forbid) then you are a “bad” prof – that students should ever be in a state of discomfort! Related, but third, we have embraced the notion of postmodernism, which I agree with you is not optimal, but for a somewhat different reason. Postmodernism, at least as generally (mis?)understood, leads to the position that everything is just opinion, that one opinion is just as good as another, and that we need to just agree to disagree …, and like that. How silly! I like Dawkins’s position on this: if you think that gravity (a scientific principle) is just a matter of belief or opinion, you are invited to step out of a 10th-floor window at your leisure. Believe what you will, I suspect “splat” will be the outcome. Or Nietzsche’s pithy observation that “A stroll through the lunatic asylum shows that faith does not prove anything.” Or Shermer’s “Wronger than wrong” article. Or Pauli’s principle, and the list goes ever on and on.
    And you can see the lack of engagement here. Where are the responses, where is the conversation, the dialogue? Ain’t no-where, eh. No one wants to put her- or himself out there, taking positions that might be controversial. The universities are now one of the LAST places where you can find free and open debate on controversial issues! They are now the main bastions of “official thought and speech.” If you disagree with the prevailing attitudes of your professors or administration, you run a risk. This is especially the case in the “special” universities and colleges in which there is (I think intentionally) no tenure. Although I thought we tried this case in 399 BCE, and again in the 17th century (Socrates and Galileo, respectively), this makes no impression on us. We have learned NOTHING from the McCarthy years. So, I think we need to look to ourselves for the difficulty here, our weakness, our cowardice, our inability to maintain standards and vision. The fault, dear colleagues, is not in our stars but in ourselves that the education system SUCKS. See also George Carlin, and rethinking education.

    Look forward to the CONVERSATION – should anyone give a flying F**K.


    • Sven van de Wetering


      It’s funny, we had a discussion on some of these issues yesterday, and it seems our more engaged students perceive a lot of the same issues you do, plus one more: the cult of self-esteem means we reward people for trivial accomplishments, tell them they’re brighter than they really are, and leave them feeling threatened in their self-esteem every time they encounter an idea that they don’t comprehend instantly.
      I’m not sure I agree with George Carlin. I don’t think the masters of the universe are too upset with us being able to think critically (though they would prefer us not to); I suspect a bigger issue may be that they want us atomized and isolated from one another, so that we are incapable of mobilizing. Was it Frederick the Great who said people can think whatever they want, as long as they OBEY? That, of course, gets us back to measurable educational outcomes: we reward people for jumping through hoops, not for independent thought, because jumping through hoops is what future employers want from them.


    • Brandon Tomm

      To add to Wayne’s idea of the “safe learning environment”: the apparent castration of meaningful dialogue on certain topics (such as religion and ethics) for fear of offending someone isn’t wholly attributable to the “business” aspect of modern university education (or the people who ‘control’ the university). In my experience (however short it may be), I see blame to be shared among students themselves. There is a perceivable splash of apprehension on the faces of students when I (or anyone else) take a shot at another person’s view. It’s in the blood of most students I know to stay within their shell, and they do their best not to roll and break anyone else’s. Yes, I get a good conversation going from time to time, but too often I get looks from others who prefer not to rock the boat. Of course, it is natural for most people to seek to avoid conflict, but my peers have failed to decouple conflict from mere difference, dialogue, and comparison. I want us all to continue to encourage meaningful discourse, and to discourage the complacent, apathetic relativism (in university classrooms, at the very least).

      That said, I should mention that I have had many, many professors, past and present, who try their darndest to inspire the more social aspects of education in my classes (the professors posting on this page are some of the most productive in doing so). It remains up to the students to utilize their education, not just their credentials.

      • Sven van de Wetering

        A very interesting comment, Brandon. This seems to be true all over the world. Censorship is never as paralyzing as self-censorship, because we never get all that hostile to the censor when it’s us. I think professors can do their bit to alleviate this by fostering appropriate norms, but sometimes I wonder if seminar classes should be held in the bar.
