One particularly interesting lecture was Jim Duderstadt's presentation on the history of the institution of "the university": its development in Britain, in the early United States, and at The University of Michigan. His concluding remarks about the future of higher education, as well as the Spellings Commission, were also illuminating.
A few points of interest that came up for me were:
· the continued insistence of higher education on requiring a liberal arts education;
· research universities teaching basic classes; and
· an (apparent) lack of connection between secondary and higher education.
The first point – the continued insistence on a liberal arts education – is, I believe, one of the major weaknesses of the US higher education system. While it is usually seen as a major strength of the US system, I am conscious of the continually quickening pace of the world's economies. Students in US higher education wait two years – half of their potential time of training at university – to decide their major field of study. By comparison, students in the UK are admitted directly into their majors and take courses almost exclusively within them. In Taiwan – as far as I understand the curriculum – all students are required to take a series of "core" courses ("colloquia"), but otherwise stay largely confined to their disciplines.
The idea that the US-educated university graduate knows a little about everything and a great deal about their own subject is appealing, but I believe there are enough students in the system who have little desire to learn outside their field. This leaves me at an intellectual fork in the road: should students be forced to take courses they do not want (and should professors and lecturers teach courses to students who do not wish to learn) in order to provide a liberal education, or should the requirements of education be changed to align with a different goal?
I feel that a student body should not be forced to take courses for the sake of broadening their education – that is what personal development (for which everyone has a whole lifetime) is for. Instead, the US university system should use the first two years of each student's academic tenure to try to instill the traits it desires in its graduates. Ideally, these would include a strong grounding in ethics; training in thoughtful reasoning, leadership, and cooperation; and the basic tools each student would need to continue within their chosen field.
This last point brings me to my second point of interest: research universities teaching basic classes. Should UofM be offering classes in basic algebra, biology, chemistry, and the like? What difference does it make for a student to take such courses at an expensive research university rather than at a community college? A lecturer at a community college should have as firm a command of such foundational material as any professor, lecturer, or graduate student at a research university. Indeed, I would argue that a professor at a research university who is focused on the cutting edge of science might be less interested in rehashing the basics of his or her field every four months. If that is the case, such a professor has little incentive to teach such a class and will leave it (or as much of the running of it as possible) to his or her graduate students – which is what happens in many of the introductory 100-level courses around campus. So why should a student (or the parents of a student) pay thousands or tens of thousands of dollars to learn material from a graduate student when they can pay a fraction of the cost to learn the same material at a community college? Although Jim didn't go into it very much in his presentation, I subsequently learned from him that the University of California system was originally designed to focus only on upper-level and graduate courses, but that plan was eventually scrapped due to recruitment concerns.
Of course, this could all potentially be solved by a much-improved secondary education system in the United States. This gap between secondary and higher education was one of the major findings of the Spellings Commission (setting aside the meta-findings of political "tampering"). In the UK, the Department for Education and Skills can set teaching (and presumably learning) requirements nationwide. Although many debates occurred in the UK about the benefits and pitfalls of teaching to tests, during my time living there I felt that students entering universities showed a greater level of academic maturity than many of the Junior Year Abroad (JYA) students from the United States. While the British education system has definite problems of its own, the United States should place a greater emphasis on connecting the skills learned in secondary education with those needed at university – potentially by having upper-tier universities require certain prerequisites for entry, or by offering a "catch-up" year or semester for students who need one before starting their actual degree.
And this leads me back, full circle, to the questionable insistence of United States universities on sticking to a liberal arts education. I feel that the problems with holding onto this tradition – decreased specialization, recalcitrant students being taught by uninterested lecturers and professors, high financial costs to learn basic subject matter, and so on – will outweigh the benefits of a broader education. A breadth of education should be encouraged throughout a person's schooling and life, not forced upon those individuals who choose to be specialists. At the same time, universities should strive to teach all their students a set of tools – thoughtful reasoning, leadership, cooperation, and the like – to help them study and solve the field-spanning problems we are beginning to see today.