Mainly about the practice of teaching and the experience of learning.
Used to be "Recent Reflection", but I've given up on the "reflection" business--it's so dilute, it's homeopathic.
Tyler Cowen on stories (transcript of TEDx talk). A kind-of-interesting talk on the limitations of story construction as a way of thinking, but offered with apparently little grounding in the substantial literature apart from Booker, 2005* (which is odd given Cowen's reputation as a voracious reader), so it serves only as an introduction. You can watch the video here, but it is so far out of sync as to be almost unwatchable.
This may be a rather confused post: it is prompted by the season and its religious expression, without being particularly religious in content. It is also full of sweeping generalisations: I've left them like that because to qualify them would make the argument, such as it is, even less coherent than it is already.
On Christmas Eve I went as usual to Midnight Communion at a local (fairly high) Anglican church. I gave up on our actual parish church some time ago, because of its minimalist evangelical one-dimensional logocentrism--there is a point to this, I'm not merely being rude!
The church I attended had sung responses, and anthems from the choir as well as hymns, and rich vestments, and candles and even incense. It offered a multi-layered experience, at whatever level one wanted to take it--artistic, cultural, social or even "spiritual". (I'm not going to refer to "worship"--it is too pre-emptive a term in this context).
Two initial points: First, the evangelical church, with a faith centred on propositional assent to a creed, would not like such a multi-layered experience, to which different participants brought different backgrounds and commitments and from which they also took different things. Such a church prefers to sing in unison, or parallel. Ambiguity is not highly valued.*
It is highly unlikely that the congregation at the church I attended shared an articulated belief system in the same way as their more evangelical brethren*. The multiple layers of meaning are more flexible and tolerant than those which rely so heavily on an intellectual assent to the propositions of a creed.
That, however, poses the question of which layers the participants at the service I attended had access to. And of course what significance they would have attributed to what they were witnessing: some might readily see the sumptuous vestments as offensive to the ideal of poverty preached by Jesus, while for others those vestments may represent an offering of the very best materials and craft skills--anything less would be unworthy of their high purpose. For some, familiar with the ecclesiastical calendar, the colours of those vestments and of the altar-cloths will be redolent with meaning--but probably completely arbitrary to most of us.
There were quite a few younger people present, and I wondered what they were making of the music, the chant, the solemnity, the symbolism, the arcane and archaic language (e.g. "...a full, perfect, and sufficient sacrifice, oblation and satisfaction,") --and even I had to admit that I didn't really understand a word of the sermon. But they come from a completely different world, or rather worlds.
So much of our ability to understand and feel at home in a culture rests on taken-for-granted understandings and allusions and "common sense" which needs no explanation. But it does need commonality, and I'm beginning to wonder whether that common foundation of shared experience and understanding is being eroded by the speed of change and the personalisation of technology.
In all the celebrations of the 400th anniversary of the publication of the Authorised Version of the Bible (although everyone now refers to it as the "King James version"), there have been many comments about how much we owe to it for common phrases, and characters. See, for example, here.
...the King James Bible version swept round the globe in school assemblies, far flung churches, remotely stationed battalions ...it was the Book of the community of English speaking peoples. [...]
New words - we use them still: "scapegoat", "let there be light", "the powers that be", "my brother's keeper", "filthy lucre", "fight the good fight", "sick unto death", "flowing with milk and honey", "the apple of his eye", "a man after his own heart", "the spirit is willing but the flesh is weak", "signs of the times", "ye of little faith", "eat drink and be merry", "broken hearted", "clear eyed". And hundreds more: "fishermen", "landlady", "sea-shore", "stumbling block", "taskmaster", "two-edged", "viper", "zealous" and even "Jehovah" and "Passover" come into English through Tyndale. "Beautiful", a word which has meant only human beauty, was greatly widened by Tyndale, as were many others. [From here]
How many people now have access to that range of reference and connotation, as the emphasis turns to functionality and clarity and simplicity, even in biblical language?
More broadly, of course, the emphasis in education is firmly on utility and measurable outcomes and "impact"; in relation to higher education there is a defensive debate in which the arts and humanities are increasingly being called upon (or their advocates feel that they are increasingly being called upon) to justify their existence, see for example Matthew Reisz here, Roger Lister here, and--with particular reference to the religious tradition--Eduardo de la Fuente here.
Things have moved on since I started teaching. In the late 'sixties I was appointed to teach "Liberal Studies" in a technical college. The very existence of the subject and its requirement as a part of technical and vocational courses testified to the assumption that cultural, social and even political issues could not be neglected in a modern educational system. As Bailey and Unwin (2008) document, the tide of humanism ebbed in the '70s, and the idea that it was possible for all young people to be liberally educated was eventually abandoned. Certainly I personally gave up on it and moved into more vocational areas--the heirs of Liberal Studies in technical education now are termed "functional skills", which in itself shows how things have changed. But perhaps the mistake was to believe that the desired appreciation of culture and history and society had to come through individuals. As Mary Beard put it just a few days ago at rather a higher level (my emphasis):
The important cultural point is that some people should have read Virgil and Dante. To put it another way, the overall strength of the classics is not to be measured by exactly how many young people know Latin and Greek from high school or university. It is better measured by asking how many believe that there should be people in the world who do know Latin and Greek, how many people think that there is an expertise in that worth taking seriously—and ultimately paying for. [Mary Beard, here]
The cumulative heritage of sensibility (wow! How's that for pomposity?) resides in the community rather than individuals...
"Hot damn, I thought, I must blog on this. But then I read it again and,
well, what more is there to be said? So just read it and weep with
gratitude for Marilynne, the New York Times, for the Bible, for all the
wonders of the religious imagination and with pity for those poor
militant atheists."
Quite...
I covered some of the same points from a different starting point in this earlier post.
* As I write this, I am reminded of precisely this issue emerging from my research for my dissertation in the sociology of religion (Dependence and the Practice of Religion, unpublished M.Litt thesis, University of Lancaster, 1974) and some of the research which contributed to it (Walker A G and Atherton J S (1971) "An Easter Pentecostal Convention: the successful management of a 'time of blessing'" Sociological Review vol. 19, no. 3, pp. 367-387).
Reference: Bailey B and Unwin L (2008) "Fostering ‘habits of reflection, independent study and free inquiry’: an analysis of the short-lived phenomenon of General/Liberal Studies in English vocational education and training" Journal of Vocational Education and Training vol. 60, no. 1, pp. 61–74.
Miracles and the Historians.
It's Christmas, so this post is apposite (although I don't find it entirely satisfactory), but I've shared it as much because of who it is by: Peter Berger. Yes, that Peter Berger, now 82, author of Invitation to Sociology (1963), a slim volume that probably introduced more people to sociology than any other text, and co-author with Thomas Luckmann of The Social Construction of Reality (1966)--one of the most influential works of sociological theory ever. His blog is invariably worth reading.
Walking through doorways causes forgetting. The paper does not mention going upstairs, but the model they propose would work for that as well. There's a friendlier version from Scientific American here.
I'm not sure why I haven't bothered to comment on this before; but I have just received yet another peremptory demand to referee an article for an academic journal. It came via boiler-plate email, demanding a report within 30 days, or that I go to a website to excuse myself...
It creeps up on us. I remember, 30-odd years ago, how flattered I was to be asked to review an article for a journal--and to be paid for it. In 1978, I received £25.00 for reviewing one such article. How things have changed!
Let's be clear. The "request" I received has nothing to do with the journal itself. It is entirely a manipulative bluff on the part of publishers who have cornered the market in journals, with an amazing business model:
The content is provided for free.
Publishers may even demand that authors or their institutions pay for publication, at the rate of hundreds of dollars per page.
The editors work for free.
As do the editorial boards.
And the referees.
And increasingly the journals are not even printed in hard copy.
And subscriptions are bundled together so that libraries which want just a few useful ones are compelled to accept a lot of dross as well--and to pay for it. At least they no longer have to find shelf space for it.
Be amazed at how much just one university (UCL) is paying these publishers! And then...
Discover how little most journals are read--in many cases not at all.
George Monbiot made similar points in the Guardian here.
It occurs to me that the situation makes a nonsense of any attempt to incorporate "impact" as a factor in evaluating research within the "Research Excellence Framework" (the notorious new system for assessing the quality of research in UK higher education institutions).
I'm really glad I am no longer in the rat-race--going to enormous lengths to produce articles no-one will read. My stuff is not peer-reviewed pre-publication (although I get plenty of feedback afterwards) but it reaches a lot more people (more than the median annual usage of the Elsevier journals taken by UCL, every day). But the cartoon minutes of the panel DC attended (brilliant idea, incidentally) sum it up:
Of course there is a radical alternative--it addresses most of the problems, but doesn't actually publish anything...
In her classic Patterns of Culture (1934), Ruth Benedict discusses the "apollonian" culture of the Zuni pueblo native Americans of New Mexico. I was particularly struck by their principle (which I caricature for effect) that the only disqualification for public office is to seek it.
I've just been to the retirement celebration of a former colleague at another university. It was a very pleasant event, and it is really grossly unfair to her, and the respect and affection in which she is held by her colleagues and peers, to concentrate on one aspect of it which grated. Nevertheless...
I refer to the presence and manner of one very senior member of the university (a PVC), who came late, after the speeches and the presentation (no, not a PowerPoint) of gifts and tokens of appreciation. I happened to be chatting to my former colleague at the time. The PVC arrived, excused himself perfunctorily, and engaged her in conversation about Optical Mark Readers, or OMRs, as he referred to them throughout (it took me a while to catch up). It had evidently not occurred to him that someone who is about to leave the university at the end of next week could not care less about how out-dated the OMR software is. Nor that she was missing out on farewells from friends and colleagues who did not want to barge in on her "conversation" with a VIP...
The gentleman in question is clearly a self-serving boorish person. He was however, very skilled at communicating his own appreciation of his talents and meteoric rise to his present rank in a few seconds... "But that's enough about me! Let's talk about me!"
I suspect it was ever thus. The (ideal-type) university, wrapped in mythic collegiality, has been in denial about ambition for ever. Cornford's wonderfully waspish Microcosmographia Academica (1908) testifies to that as well as a string of novels from Cannan and Snow to Lodge and beyond. But as Ginsberg argues, the trend has accelerated over the last twenty or so years, on both sides of the pond.
I was coming to believe that the only effective prophylactic is the Zuni principle.
But I got home to watch "Rev", with the creepy archdeacon protesting "nolo episcopari" a little too much. Sorry, Zuni, the creeps are ahead of you!
The Evolved Self-Management System: Nicholas Humphrey in the Edge musing on what the placebo effect suggests about releasing human capacities in other directions.
Race And Intelligence: A Wrap from Andrew Sullivan; I don't claim to have read it all, but this is the kind of conversation which only a popular, moderated blog can deliver.
Inland Revenue has a sense of humour? This has been going around via email for quite a long time, but it was new to me: the link is to the original source.
More on the Khan Academy; getting behind the scenes and reporting on how analysts are drawing on their finance backgrounds to evaluate the effectiveness of the programmes.
Interesting introduction to the work of Jonathan Haidt, a psychologist who studies morality, as much as possible without preconceptions. I haven't read his latest, The Righteous Mind: why good people are divided by politics and religion, but I did enjoy his 2006 offering The Happiness Hypothesis: putting ancient wisdom and philosophy to the test of modern science (Arrow Books).
The very title is interesting; a few years ago, it would have implied that the "research" was pointing clearly in favour of learning styles--but read the article and you will see that it argues in exactly the opposite direction.
There are one or two unexplained abbreviations:
"ISPI" is International Society for Performance Improvement
"ATI" is--at a guess--Analysis of Training Interaction or something similar. (9.12.11: David Stone puts me right--it's Aptitude-Treatment Interaction)
I have just been re-working a presentation from a couple of weeks ago to put out on the net as support material for the session I used it for. As usual it has taken much longer than I had intended. Good! The delay has helped me to evaluate some parts of the argument, and modify it sensibly in the light of some of the remarks made in discussion.
I use SlideShare to post presentations to my blogs and web-pages, but it doesn't handle fancy effects well (not that I use many of them, other than building up graphics). So I have to break down many slides into their components, and run them as a sequence...
What I have learned over the years, but never condensed/collapsed until now to the extent that I could post it or teach it (and probably all you readers are way ahead of me on this) is that powerpoint is rubbish at handling arguments and needs to be wrestled into submission.
I've written about this general issue before (here, with links to previous stuff) but not about the epistemology of presentation packages (you can get at Edward Tufte's and others' takes on this via the link above). There's a strictly practical view here.
In short, these packages are about hierarchical knowledge structures. As Tufte points out in relation to the NASA Columbia disaster enquiry, the presentation template allowed for six levels of detail. So the engineers followed that default model, and missed the point because it didn't fit--the package did not readily accommodate (the mot juste) impact from bottom to top as much as from top to bottom.
I'm interested in Prezi, as an alternative to powerpoint (I concede the term has achieved default status like "hoover" and "xerox", so the "tm" stuff is pointless), but its zoom structure is still based on hierarchy, and it's not easy to create a pan-and-zoom display which neither induces nausea nor attracts attention to itself to the detriment of the content. Nevertheless, its general approach of offering a large (pretty well infinite) virtual canvas, which can be examined in greater and greater detail--and then in broader and broader context, so that relations between material are clear--is promising.
C-map tools, which can also be persuaded to work as a presentation package--although the process is not exactly intuitive--is good at presenting connected components of an argument. The nodes are simple labels, but the connections invite labels by default, such as A implies B, or C includes D. However, it is the least flexible package in terms of the incorporation of any other media or external material.
The presentation as part of a system
When I use a presentation in a live lecture, it is of course subordinate to the address itself, which carries the burden of the argument. I may choose to put it on line, or distribute a handout based on it, but it does not stand alone; it is a gloss on a verbal event.
Of course that is why simply making your slides available on the VLE is pretty useless. In most cases they just do not make sense as they stand. The same tends to be true of handouts. We often sort-of acknowledge this, and make the slides ever more verbose and comprehensive so that they will make stand-alone sense--but in so doing we make them less effective as a supporting act for the lecture. We end up reading out verbatim the content of the slides (often facing the screen to do so and thus turning our backs to the class)... Yuch!
Everything in teaching, including syllabi, schemes of work, session plans, presentations, exercises, assessments, evaluations... Everything needs to be considered as part of, and interacting with, the rest of the teaching and learning system. So everything needs to be modified according to its place in the system.
(One of my least pleasant experiences in thirty-five years of teaching in colleges and universities occurred this summer, when, for reasons which are neither fully understood nor relevant, but which were clearly motivated by disproportionate vitriolic animus, a kangaroo court was mounted under the guise of an "internal review" of a course with which I had a long-standing relationship. The part of this which most clearly affected me was the "critique" of the course handbook, which I have edited for fifteen years, and indeed the only outcome of this review process [rant deleted...] was an annotated Word file of the handbook demanding more than two hundred revisions [including, I concede, some useful observations--perhaps five of them.] The supposed justification was variation from the formal quality assurance template. But the handbook is for students. Their concerns are different from QA mavens. [And incidentally, the handbook had been commended by QAA and Ofsted and the external examiners, and even the university's head of quality assurance, as a model of its kind.])
Sorry! But the comprehensible part of the dispute can be attributed in part to the assumption by QA obsessives that everyone needs to be told the same thing in the same way at the same level of detail... An insistence on (too many) absolute (and potentially incompatible) values distorts the system. (The same mistake may lead to the collapse of the euro. I did try warning them in the late '90s, but no-one was listening...)
Back at the important stuff! If I post the material on a blog or SlideShare or the VLE, even with podcast support, the burden of the argument is borne by the visuals. If you have looked at the page where all this started, you will have seen that my solution (for which I make no great claims--I am sure there are better ones), is to include explanatory call-outs on most pages which at least hint at what I said in person at the live event.
Forms of knowledge and media for presentation
As the presentation in question touches on, I'm renewing my interest in the distinction Hudson articulated in 1966, between convergent and divergent thinkers. I've revisited the original account, in which he discusses testing the intelligence of schoolboys (forgive the dated expression):
"Initially I had hoped, [...] that open-ended tests would cut across the arts/science distinction, and give some reflection of boys' brightness; of their level, in other words, rather than their bias. The results were a surprise. Far from cutting across the arts/science distinction, the open-ended tests provided one of my best correlates of it. Most arts specialists, weak at the IQ tests, were much better at the open-ended ones; most scientists were the reverse. Arts specialists are on the whole divergers, physical scientists convergers. Between three and four divergers go into arts subjects like history, English literature and modern languages for every one that goes into physical science. And, vice versa, between three and four convergers do mathematics, physics and chemistry for every one that goes into the arts. As far as one can tell from the samples available, classics belong with physical science, while biology, geography, economics, and general arts courses attract convergers and divergers in roughly equal proportions." (Hudson, 1966: 42. My emphasis.)
It is, I concede, principally on the basis of this (and similar) passage(s) that I argue that convergence and divergence are not primarily attributes of people but of disciplines. The distinction is epistemological rather than psychological (on balance, of course; like all these constructs it's a bit of both. And the issue of match between discipline and learner makes a difference. And I bow to Jim Hamlyn's point about the plethora of categories of knowledge...)
It's not a particularly original step to argue that different disciplines call for different pedagogies, but the proliferation of technologies now poses new questions about what suits what. I got into this in a minor way, years ago, when the options were limited (here). The questions are still more important than the answers. In relation to the session I started this reflection from, you can see the resulting revision and make your own judgement about what I got right and what wrong...
But that is the wrong question...
What did this approach to presentation "privilege" (emphasise) or "deprecate" (play down)?
How did the choice of this medium affect the power-balance in the room?
How would the choice of any other method/medium have covertly affected the expectations/experience of the participants?
How seriously would they nowadays take a presenter who walked in with no notes and just occasionally wrote a keyword or drew on a flip chart--although she could be much more responsive to the group*? **
What difference does it make how handouts are handled (no, there is no universal rule)?
What does the manner of presentation (and of course discussion etc.) say about the presumed status of the material? Is it an object, a tool or a frame?
Enough. Bottom line: Media, methods*** and content all influence and constrain each other in an elaborate dance. You can't (or at least shouldn't) treat any element in isolation. But the conventional theories of pedagogy don't help. Ask the questions, and develop your own answers.
* There's another rub. To what extent do your class members identify with the class group? Does "responding to the group" largely mean going along with the louder members' concerns, regardless of their self-appointed status...
** There'll be more on the bottom-line of this in a different context, in a day or two...
*** This post has been deliberately conservative about "methods", partly because the original stimulus concerned a very conventional seminar presentation, but more because even that was very complex under the surface, and I wanted the post to make at least some semblance of sense.
"...I think lots of people are beginning to realize that accusing your
audience, and depressing your audience, and guilt tripping your
audience, and trashing your opponents is not a winning formula."
"...the most beguiling thing to me was the notion that had not Judaea
come together and formed a nucleus of a serious Jewish state, then
Judaism would not have thrived in the way it did and led to Christianity
and then to Islamism, and the world would be a totally different place.
It's one of the great "might not have beens" of history. (Melvyn Bragg's newsletter)
Another spuriously precise and prescriptive model of learning contested.
And a continuing thread on Andrew Sullivan's blog about the "race" and intelligence debate--conducted in, as he puts it, a "surprisingly civil" way. Simply, is the issue too hot to handle? What are the conceptual and epistemological issues involved in formulating it? And what are its possible consequences? None of the answers are simple... (Up-date 6 December--the wrap)
I didn't buy one for myself for many reasons, chief of which was that it purported to be a solution to a non-existent problem (leaving aside the relatively trivial sustainability argument--clearly it is trivial when Amazon, pleading VAT regulations, can charge more for a virtual edition than for a lovely case-bound hardback).
Granted, I discovered that I could download free sample chapters from some prospective purchases, which may be a step up from bookshop browsing. And there is a substantial back catalogue of free or almost-free texts; most of them could be found on the net, but I have to concede that they are easier to read on the Kindle.
The flicker of a constantly refreshing screen is subliminal, but even so it is there on a monitor, and although the Kindle screen has a way to go, it is more comfortable for prolonged reading. Indeed, its font and format options are great for anyone whose sight is at all impaired. The days of the large-print book may well be numbered.
There are nevertheless limitations to the display. The contrast and grey-scale, and apparently limited standard templates, restrict the layout and formatting, not to mention the graphics. Of course, embedded links to colour graphics and even video can be incorporated, just not implemented in this incarnation.
I have just made my first proper purchase.
(Yes, it is legitimate to ask what right I have to pontificate on this on the basis of one purchase. But sometimes it is useful to remember the first time for technical reasons... Oh, get over it :-) )
It is frighteningly, seductively easy. My security system requires me to register specific items of kit with the wifi router via their MAC address; that took less than five minutes from a baseline of total ignorance. And I already had one-click enabled.
So I can find any Kindled item on Amazon, and buy and download it to my Kindle with three clicks. They say it takes a minute, but that's pessimistic.
At one level that is brilliant, but I can see myself acquiring a backlog of unread stuff because of momentary impulses. I do that now, of course, but at least I can see the reproachfully growing piles of unread physical books, prompting me to make inroads.
I checked out the hard-copy version of the book I had downloaded, courtesy of Heffers in Cambridge (not that I asked them; they only had three copies, and none in the front-of-house displays). I had been disappointed by the referencing of the Kindle version, and had wondered whether it was dumbed-down for the digital market. It wasn't--but then it would require more editorial effort to do so than to leave it as it was... and even more to plug it into the infinite net.
But of course Kindle couldn't have an index, when the pagination is dynamic, could it? Well, yes, with one-click links. And a search function--which has a system of identifying "locations" within a text; I have not yet worked out whether those remain consistent despite formatting changes, or whether they can be shared. A blogger I follow has just posted a link to a key passage in a Kindle text, but the link just took me to my own notes page rather than his... this is not yet intuitive practice.
So the annotation/ highlighting/ mark-up facilities are limited and a pain to use at the moment--in the case of the Kindle 4 in large measure because of the clunky virtual keyboard. But they do exist, however rudimentary the form. And since a substantial proportion of users will never use them, and older versions of the kit do include a basic keyboard, and there is at least the opportunity to share annotations, which is a great idea... Amazon is on a good track.
OK, I'm 67, from a receding generation, and I'm hooked on physical books; books do furnish a room after all. I can't see myself adopting Kindle as my default reading format. I can't quite put my finger on it, but I don't read the page in quite the same way as a book. The line length is shorter, by default, and so I don't have to scan as much as I do on a normal printed text. I can skim more easily. That's good for some things and less so for others.
I have not mentioned that the Kindle is not restricted to displaying Amazon material, and the fraught question of online availability is one I shall leave aside here; but it is easy to transfer rich text files so that you can readily have to hand all the documentation for a meeting without a single piece of paper. I very rarely do meetings like that any more, thank goodness, and there are other ways of achieving the same end, but for the moment this is an elegant solution.
Somehow I suspect that just as the afterthought of SMS messaging became the unexpected killer app of the mobile phone, the e-reader may find its niche far from the novel-reading commuter who is its present target.
Daniel Kahneman explains the essence of his book on fast and slow thinking. I know he was first with many of the ideas behind the current (understandable) obsession with risk and judgement (see Nassim Nicholas Taleb, Dan Gardner--who is particularly good on Tetlock--Dan Ariely, even Malcolm Gladwell in 2005) but he's late to the party with a popular account, so I hope it's good.
Barry Schwartz talking about the "paradox of choice" and the possible up-side of economic downturn. I hope his work is more stimulating than Renata Salecl's The Tyranny of Choice (Profile, 2011) which gathers opinions from a formidable range of writers, but very little evidence for anything.
We have our milk delivered. It does cost more than buying from a supermarket, and I suppose I am motivated a little by trying to keep alive an old-fashioned model of service (and more to the point, a service which ensures that a customer is visited at least every other day, and notices if the milk has not been taken in...).
So S. is away for a couple of weeks. I put out the milk bottles, and knowing that I won't use much, I stick a rolled-up piece of paper in the neck of one which says, "No more milk until December, thanks," and includes the house number to save the milk-person from having to remember it. Simples! Elegant! Effective!
I get a call from Patrick*. Come to think of it, when did I give the company my number? I must have started on the slippery slope when I opted to pay by direct debit... He wants me to sign up to place my milk order (and overpriced orange juice and... I shouldn't mock; if I were housebound I'd really value this service) online.
Why? Because then I wouldn't have to leave notes for the milk-person if I wanted to change the order. Use the website up to 9pm the night before, and I can change whatever I like.
But I can leave a note in a milk-bottle up to a second before the delivery, or I can actually speak to a real person if I catch them at the right moment! This is progress?
Oh, I know where the company is coming from. Orders changed on the doorstep make for inefficient loading... and when this is a premium service they need to preserve their edge.
But! It's simple. It's reliable. It has no intermediaries. It requires no more technology than a pen and paper. It even permits a degree of human interaction.
* But all credit to Patrick (that's the name he gave me). When I said "no", he didn't push it; he signed off in a resigned manner. I got the feeling he had got the same message many, many times that day.
Here is a lesson from the Khan Academy on basic physics. If you are in the teaching business, get together with colleagues and discuss and evaluate it...
"The Educational Lottery" on the four kinds of heretics attacking the gospel of education. American, but also goes for the UK; thanks to The Browser for the link.
And this about grammatical errors and, in the linked paper, the hypocrisy of the language "mavens", as Pinker calls them. The linked paper is indeed a classic, and great fun (in a nerdy way), but what is really interesting about this piece is that the topic is a regular concern in the general US higher-education press and blogosphere, and very rarely appears over here. We merely bemoan our students' lack of writing skills, and provide individual remedial services; for them, composition is a routine and obligatory freshman requirement...
It's been fun collecting this ("curating" is the in-word)--I think I'll carry on....
(Nothing below is intended to disparage remembrance and respect for the fallen and maimed in any military conflict, including honorable foes.)
I've just been watching this ceremony of remembrance as I always do. (The link will expire a week from posting, and may not work outside the UK). I had my reservations about the show-biz contributions, and I wonder about the (post-Diana?) shift to sentimentality rather than stoicism, but sensibilities and fashion change, even at this level.
It's the context and the communal dimension which confer on this ceremony its genuinely moving quality.
In 2006 I attended the changing of the guard at the Tomb of the Unknowns in Arlington National Cemetery in Washington D.C. (the video is not mine--it's from YouTube).
I was very disappointed. For all the precision and spit-and-polish and solemnity, it struck me as camp. More mincing than marching.
Then this year I witnessed the parallel ceremony in front of the Presidential Palace in Athens.
Our guide included it, I think, because her son had been a member of the honour guard when he did his obligatory military service. She was clearly very proud. But I'm afraid it reminded me too much of Monty Python...
I'm not saying this to mock. I just found myself speculating about how these peacocks' tails of rituals came about. Perhaps the ultimate example is the pantomime at the Wagah border crossing between India and Pakistan:
In each case (apart from the Ministry, of course), there seems to be something important and solemn to be commemorated, which confers prestige on those who perform the ritual, and apparently permits its evolution through generations of young men without checks and balances. And of course the significance of the ritual lies simply in the fact that it takes place. While the performance at Wagah does refer to the rhetoric of contempt, as Michael Palin comments, there is no meaningful symbolism in the presenting of arms or the goose-stepping of the other ceremonies.
Cut loose from meaning, there is merely performance. On the one hand, those who perished deserve better than that. On the other, they too were young men like these--perhaps it is what they would have done--before they did the other things young soldiers do given the chance...
Another two graduation ceremonies, yesterday. I had thought six years ago that one might have been my last, but a year later it was still going strong, and it has done ever since. But circumstances have changed; from three graduands six years ago, the PCE courses now dominate not one but two ceremonies, albeit in a smaller venue. Pity that the non-graduate and the post-graduate students, who have studied in the same rooms and times for two years, were separated arbitrarily by label for the ceremonies, rather than by grouped locations--but it has still been progress. I think.
After all, one of our nominations for the award of an honorary doctorate was accepted! Pity that the ceremony selected had nothing to do with education, and none of the people who nominated him were able to attend, and the audience who heard his address probably had little idea of what he was talking about. To be fair, he did feature as a guest speaker yesterday, for one of the ceremonies, before he jetted off for a conference elsewhere--and his address was as pointed as can be. It's not often that you hear an academic--admittedly claiming Glasgow dialect--using the term "bulls**t" in a formal speech...
And the guest speaker for the evening ceremony may not have had the same credibility for those of us in the PCE community, but he was very accomplished and had his own impressive record.
But... for all the rhetoric about celebrating and valuing achievement, what does it say to our graduating students attending, that in the opening address the Deputy Vice-Chancellor celebrated at some length the achievements of the Faculty without a single mention of the post-compulsory education sector--from which hailed about 75% of the graduands about to be presented? (To be fair to her, her remarks were probably scripted by someone else, but that someone should have known better.)
Moreover, every single graduating student present passed across the dais to shake hands with the officiating... officials. Somehow, one would have thought that after shaking hands and exchanging a few banal phrases with mature students (mean age 35+ and current maximum 68--or was that when you started the course, Maurice?), the platform party would have twigged that these are not callow 21-year-olds taking their first steps into a big scary world. (This stereotype is both unfair and disturbingly accurate.)
However, although our invited guest speaker adopted the ingenious rhetorical device of re-evaluating the advice he had been given at his own graduation in 1964,* and recognised that the world is changing so fast that any advice for today will be useless tomorrow, the tone inexorably tended towards "wise advice".
Afterwards, my colleagues and I rated the performances (it's what we do!). No, we don't keep a league table... We were pretty scathing that no-one performing on the platform seemed to have given any thought to the context of the event, and that this failure to learn has characterised them for years. They simply dust off last year's remarks (after all, very few people other than the academics ever hear them twice) and just possibly update them with a reference to the ever-tougher job market (in which your new degree will stand you in good stead) or some other nugget of news recognisably of the current year. If I had been one of this cohort of graduates sitting through the ceremony, I should have been pretty insulted to be ignored and treated as a stereotyped 21-year-old...
But it occurs to me that we may have got this entirely wrong. Our speaker did refer to the ritual aspects of academic life--the robes, the processions and the certificates. And both speakers recognised that the status of "graduand", like that of bride, is a very ephemeral one. It is transitional, or indeed liminal. Graduation (like the whole business of going to "uni" for mainstream students) is a rite of passage, the principal object of which is to manage and communicate the change in status, to self and others.
* I didn't believe a word of it. At my graduation the address was given by the then Prime Minister, Harold Wilson. I have no idea what he said.
Thanks to the Advances in the History of Psychology blog for publicising this new initiative from the University of Chicago, digitising and making available some classic clips, including Watson's "Little Albert" conditioning experiments from 1920.
Until recently, I should just have added it with one click to my "Shared Items", and a link would automatically have popped up in the side bar, but Google Reader has now dropped that feature (in favour of forcing one to use Google+, which is not the same thing at all). So "Shared Items" will gradually dwindle and I'll have to post directly about stuff which catches my eye.
The heading link is to an interesting argument calling for a less obsessional approach to citation and referencing in student work, because it distracts attention from content and style. As might be expected, it generates a lot of comments.
My own take on it has been covered here (and here and here).
My son (30), in the pub, after work. He called me. We've been discussing a book I passed on to him. He is not particularly serious:
"I hate this! I'm turning into you! In a few weeks' time, when it gets cold and dark outside, and they light the fire here... [Gestures to the genuine open fire-place beside our seats] I'll think there is nothing better than to come in here after work, and sit in this corner with a pint and a book. And I'll glare 'meaningly' (like Paddington Bear) at anyone who tries to interrupt."
Occasionally I stumble across some site which represents to me the best the web can do. I've got a lot of respect for such sites whenever I find them, but when they are personal efforts, self-funded, and with no axe to grind, I am really impressed. Add in the research behind something such as this, and I am blown away.
At first sight, the history of workhouses and the Poor Laws is not a particularly fun project. Indeed, it isn't, but the history, policy and practice of dealing with the poorest in society have ramifications to the present day. Someone needed to take it on and indeed humanise the dry records, and in this case it was Peter Higginbotham. He's done a fantastic job--I've checked out some locations with which I am familiar, and there are maps and photographs (many taken by him personally) and census records...
What a change it makes, from the necessarily critical and sceptical tone of my posts, to endorse a truly public-spirited and selfless contribution. Thank you.
I've taken on doing some marking (grading) for another university, for a (postgraduate) course with which I have no other involvement.
I'm fascinated by the difference. Not in the quality of the work I am marking, but in the experience of doing it.
I approach with trepidation the task of marking assessments of material I have taught myself. I have no such concern about this marking, or about reviewing the assessments of courses for which I serve as an external examiner.
Why?
I suspect the difference lies in the extent of the marker's engagement with the project (sorry! I just need a term to refer to the whole business of producing and evaluating some assessable work).
Bottom line: If I taught the module/course/unit/whatever*, when I read the assessment I am looking in a mirror. I am seeing my teaching reflected back at me (with glosses of course from the student's experience).
I am assessing myself.
And if several students have missed the point of, say, behaviourism, but have got the other stuff... That poses a question about how I taught it as much as about how they learned it. I chose the example deliberately; it's not a difficult topic, and although I continually revise the presented material and the exercises etc. the evidence of the assessments has generally been that it has been well understood.
I should have been (I was--to a certain extent) alerted to a problem by one student (a military trainer, for whom behavioural objectives etc. are meat and drink) on this run of the course, who didn't get behavioural theory as an approach to teaching. He used it every day, but I had conspicuously failed to get over its underlying principles; we discussed them in class several times, but I suspect that in practice, instead of clarifying matters for him, our conversations served to confuse them for other class members.
Aside from (or, in this strangely infiltrating usage "absent") the ethical/professional issues raised by this particular example--should I be more lenient in my marking because I realise that this time I did not teach the topic as well as previously? ...
...aside from that, and assuming that the assessment is well designed, this is raw gold-standard evidence of the effectiveness (at least Kirkpatrick 2) of your teaching.
So I approach it with trepidation. My apologies to the 2010 Unit 2 cohort; I clearly pitched some stuff wrong, and did not manage to retrieve it in our subsequent discussions.
Now I'm approaching this other material, solely as an assessor (who is agreed not to be a subject expert).
When I come across an issue in this material it does not point (potentially) backwards in time to my teaching, but forwards to a new area the student might explore. It's not a mirror but a window. Of course, windows are always a little reflective...
* I think it is the standup (interesting label in itself) Michael McIntyre who jokes that practically any adjective can be used as a synonym/euphemism for "drunk". The same might be said of nouns in the context of eduspeak, although I confess I am struggling with epiglottis and ... (this started as a rather silly exercise of finding another noun which had no educational connotations. An hour later, I haven't found one! This may be the basis of a great teaching exercise. Or not. In any event if you follow it up, please tell me/us about it... Or perhaps it is simply evidence of my being moduled if not totally epiglottised?)
"...this story is also a reminder that we should always be cautious with "surrogate" outcomes. The biological change measured was important, and good grounds for optimism, because it shows the treatment is doing what it should in the body. But things that work in theory do not always work in practice, and while a measurable biological indicator is a hint something is working, such outcomes can often be misleading."
And later...
"...improvements on surrogate biological outcomes that can be measured in the body are a strong hint that something works – and I hope this new DMD treatment does turn out to be effective – but even in the most well-established surrogate measures, and drugs, these endpoints can turn out to be misleading."
A fairly basic point, of course, but it did set me thinking about the extent to which our obsession with measurement in many fields, including teaching and learning, has led to increasing reliance on fairly dubious surrogates.
And then I came across this commentary on Standard and Poor's revision of the US credit rating:
"[Ratings agencies] ..are human enterprises, fallible institutions—and like other institutions, they have procedures, interests, and histories. Their records deserve inspection. In the scientific spirit, in the spirit of show me, they deserve scrutiny."
A credit rating is a complex construct (I presume). Since it is supposed to have predictive value (other than merely being part of a self-fulfilling prophecy), it must be put together from a raft of surrogate measures, presumably of directly observable factors which co-vary with an institution's credit-worthiness. But it is only as good as the choice of those surrogates*.
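The point can be made with a toy simulation (the proxy names and numbers are entirely invented, purely for illustration): a composite score built from proxies which genuinely co-vary with the latent quality tracks it reasonably well, while quietly swapping in one merely plausible-looking proxy degrades the composite's predictive value, even though the score looks just as authoritative either way.

```python
import random
import statistics

random.seed(42)


def pearson(xs, ys):
    """Pearson correlation coefficient, from first principles."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (statistics.pstdev(xs) * statistics.pstdev(ys) * len(xs))


# A latent "credit-worthiness" we can never observe directly...
latent = [random.gauss(0, 1) for _ in range(1000)]

# ...two (hypothetical) proxies that genuinely co-vary with it...
debt_ratio = [x + random.gauss(0, 0.5) for x in latent]
cash_flow = [x + random.gauss(0, 0.5) for x in latent]

# ...and one that merely looks plausible but is pure noise.
noise_proxy = [random.gauss(0, 1) for _ in range(1000)]

# Two composite "ratings", differing only in the choice of surrogates.
good_rating = [(d + c) / 2 for d, c in zip(debt_ratio, cash_flow)]
bad_rating = [(d + n) / 2 for d, n in zip(debt_ratio, noise_proxy)]

# The first tracks the latent quality much more closely (roughly 0.94
# vs 0.67 under these assumptions), yet both look equally official.
print(round(pearson(good_rating, latent), 2))
print(round(pearson(bad_rating, latent), 2))
```

Nothing here is specific to credit ratings, of course; the same arithmetic applies to any composite indicator, including an examination grade assembled from assessment criteria.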
Which led to some general thoughts in relation to education.
Today is the day when GCSE results are published (the exams taken at age 16 by practically all pupils in the UK). The press stories are predictable, suggesting grade inflation and the exams being dumbed-down. (Or of course if the pass-rates were not an improvement on last year, there would be jeremiads about further decline in educational standards...) The press discussion will not be sophisticated, but it will at least acknowledge what the politicians and the educational establishment will deny, namely that the examinations are not realistic proxies for educational achievement.
This is leaving aside the issue of the tail wagging the dog, of "teaching to the test" without ever asking whether the test is valid or reliable. Beyond that, the logistics and practicalities of mass assessment distort the process, and it has ever been thus. When Liam Hudson (1967) discussed convergent and divergent thinking styles, he noted that convergent thinking was privileged in school at least in part because its testing could be standardised.
But these artificial surrogate assessments are increasingly separating the formal educational system from the "real world", particularly that of communities of practice. This is not an original observation; while I'm on "golden oldies", I'll refer to Becker's wonderful 1972 paper School is a Lousy Place to Learn Anything In, which is based inter alia on a similar argument.
It is out of an awareness of the intrinsic limitations of such surrogacy that a course on teaching with which I have long been involved has attempted to develop a more authentic assessment strategy. Of course, teaching courses have always routinely involved direct observation of teaching, but not everything is amenable to direct observation. The traditional solution on most** other courses has been set assignments; our course moved away from that to negotiated submissions based on a learning contract. Learning outcomes are specified and students decide, in consultation with a tutor, of course, what evidence they will submit to demonstrate that the outcomes have been met. This is a step closer to reality, but of course only insofar as the specified learning outcomes correspond to the real world.
The course has just been internally reviewed for routine reasons, and it is apparent that the bureaucrats hate the assessment scheme. Work is not graded, for example. The scheme is not suited to anonymous submission, because the students are talking about their own practice and work setting (it is an in-service course). Not all work is suited to electronic submission via Turnitin.... the list of complaints goes on.
The real problem is that validity, reliability and fairness--the traditional requirements of an assessment scheme--are now subordinated to standardisation, administrative convenience, and security***.
These are considerations for the legitimation of surrogates and proxies--the same kind of consideration as applies to the regulation of second or third-order derived financial instruments which no longer bear any relation to buying and selling stuff which is any actual use.
* I am not relying entirely on a single blog-post here! See also Dan Gardner's excellent and accessible Future Babble: Why expert predictions fail and why we believe them anyway. (London; Virgin Books, 2011) It's a great corrective to all the doom and gloom surrounding us. Incidentally, he draws a lot on the work of Philip Tetlock, the subject of this interview by Jonah Lehrer in Wired.
** Most but not all, our approach owes much to work at the University of Huddersfield, particularly in the early '90s.
*** Security in the sense of not being vulnerable to plagiarism, although the emphasis on discussion of one's own practice and production of examples and resources means that the approach is fairly protected in any case.
Becker H (1972) “School is a Lousy Place to Learn Anything In” American Behavioral Scientist (1972):85-105, reproduced in R G Burgess (ed.) (1995) Howard Becker on Education Buckingham: OU Press
(Update later today: Many thanks to David Stone, who writes; "I was happy to discover that my institutional subscription gave me access to the original Becker article. Just in case others should be as lucky, here is the DOI link: http://dx.doi.org/10.1177/000276427201600109 ")
Hudson L (1967) Contrary Imaginations; a psychological study of the English Schoolboy Harmondsworth: Penguin
This post is prompted by interesting points made by Bruno Setola in a substantial post on his blog Gamification.nu, which is well worth reading (and I'm not just saying that because he makes some kind remarks about my sites). The relevant piece is headed "Levelling Up".
His post is packed with ideas and efforts to synthesise them into an approach to teaching Cross-Media Communication, amongst which he finds Threshold Concepts to be a very useful tool. When I first read his thoughts, though, I thought he hadn't really got the idea; he was emphasising the acquisition of a frame of reference rather than the actual content of the concepts.
However, in the course of the discussion he refers to a keynote at the Third Biennial Threshold Concepts Symposium in Sydney last year, given by David Perkins. Unfortunately I couldn't get to that meeting, but that made me even keener to watch the video, below. (Note that it is almost an hour long, but well worth the time. You might find it helpful to have the .pdf file of the full set of slides open so that you can switch to them, because the camera does not dwell on the screen.)
In essence, Perkins is now talking about threshold experiences rather than concepts, and explores the epistemic shifts which take place as they develop from object to tool to frame. (Hence the shift of emphasis in Bruno's account.)
Selectively, because there's a lot in the address, my attention was drawn to Perkins' thoughts about what is involved in managing these shifts and teaching material to serve as a tool rather than an object. (For more detail on the content of the tables, do watch the video; these notes are only about the gist of some parts which strike me on the basis of current interests.)
I was reminded of a couple of pretty poor classes I've observed this year, commented on here and here. In both cases the problem was not really with the actual teaching, but with the syllabus, and the way in which it treats each item of learning as a gobbet of what Perkins elsewhere calls "inert knowledge". Each item was to be stored in the students' brains, to be taken out and shown when called for, but there was no sense of doing anything with it. The academic level Perkins was talking about was higher than the classes I had observed, but he discussed how using the approaches in the left-hand column of the table below tends to promote learning of material as a set of objects, rather than tools to work with.
Object role                          | Tool role
Key features, 'toy' applications     | Fully developed applications
Rival academic concepts              | Rival tacit operative concepts
Comparison and critique              | Select among several and apply
(Do not be tempted, incidentally, to see "Object" as merely equivalent to the lower levels of Bloom in the cognitive domain, and "Tool" as signifying applying the material. It is possible to teach at a very advanced level, still working with objects--and indeed as Perkins notes, that is often entirely appropriate, when the material is a "destination" rather than a "route"*, an end in itself or object of scholarship rather than something which earns its keep by serving as a tool.)
Tool role                            | Frame role
Several concepts                     | One concept
Somewhat closed problems             | Somewhat open problems
Abundant time                        | Low-stress real time
Solo or large group                  | Small group, rapid turns
Tools have specific tasks, and need to be selected appropriately, and although they may become "extensions of the body" in practical tasks, they are nevertheless also objects which can be studied and refined (Setola discusses the "extensions" point in his post).
The third way in which ideas/knowledge/concepts etc. may be used is as a frame. A frame is an idea through which one sees stuff; a tool is an idea with which one works; an object is an idea one knows about. The critical difference is that by default a frame is part of oneself. It is not experienced as something other; indeed it may be very difficult to step outside one's habitual way of seeing things and take "my habitual way of seeing things" as an object of study.
Frames are what reveal the "inner game" of topics of study, for better or for worse, as Perkins (2009: ch.5) discusses. It needs to be emphasised that frames are not "superior" forms of knowledge (or skill, or values) to tools or objects. As Perkins' use of the term "role" suggests, it is a matter of what job you want this knowledge to do, and so how you teach it.
Bruno's concern is principally about how these transitions might be managed and "taught". Scaffolding, for example, with its implications of incremental development, no longer works when one reaches a discontinuity, such as this kind of epistemic shift between object, tool, and frame.
In short, I'm not sure it can reliably be managed. That is the nature of a threshold experience--the liminality, uncertainty, and indeed risk (although I don't want to over-dramatise) of how experience is re-organised by a new idea.
On the other hand, does it need to be managed? Does trying to manage it make it more likely to happen? Or is it wasted effort? But that's a question which might actually succumb to ingenious empirical research...
I'm reminded of Gestalt shifts in perception. But also of Ramsey (1967). I remember, almost 45 years ago, listening to Ian Ramsey delivering a guest lecture at Sussex on religious language--he must have been speaking about work in progress, because this was before 1967. He spoke about parallelisms in the psalms (I'm not going to digress that far) and the analogy of the polygon and the circle. Start with the simplest regular polygon--an equilateral triangle. Add a side = a square. Go on and on and the figure gets more and more circular, until at some point it is indistinguishable from a circle, and so it is a circle***.
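Ramsey's polygon can be checked with a few lines of arithmetic (a sketch of the limit, nothing more): the perimeter of a regular n-gon inscribed in a unit circle creeps up on the circle's circumference, although strictly speaking no finite polygon ever is a circle.

```python
import math


def ngon_perimeter(n: int) -> float:
    """Perimeter of a regular n-gon inscribed in a circle of radius 1.

    Each of the n sides subtends an angle of 2*pi/n at the centre, and
    so has length 2*sin(pi/n); the perimeter is therefore 2*n*sin(pi/n).
    """
    return 2 * n * math.sin(math.pi / n)


for n in (3, 4, 6, 100, 10_000):
    print(f"{n:>6} sides: perimeter {ngon_perimeter(n):.7f}")

print(f"circle: circumference {2 * math.pi:.7f}")

# The gap closes like 1/n**2: by 10,000 sides the perimeter agrees
# with 2*pi to within about one part in ten million--and yet the
# figure is still, stubbornly, a polygon.
```

Whether "indistinguishable from" ever becomes "is" is, of course, exactly the kind of threshold the analogy was coined to illuminate.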
And I hazily remember some basic physics from even longer ago! I seem to remember that phase transitions (such as ice melting, or water boiling) require an extra input of energy--"latent heat" is the proper term--over and above any rise in temperature... A catalyst may help, chemically, but the basic transition is the product of "more of the same". It's just that in teaching, the "more of the same" needs to be about the epistemic status one is aiming at, not that which one is emerging from.
These properties are emergent...
This kind of thinking underpins Perkins (2009), where he is concerned about developing appropriate approaches to teaching to promote learning for understanding. (It's a term he is quite comfortable with, and discusses at some length on p.48 ff.)
The book is to a certain extent a reflection on his experience of learning to play baseball as a child; he found it easy, he argues, because he was exposed to the whole game. He practised the components, of course, but he knew where they all fitted in and he saw them in context.
In formal education, on the other hand, there is in many cases no overall introduction to the whole game of a subject or discipline. Instead, each element of the knowledge base and skill set is likely to be introduced separately, and in isolation. Clearly this inhibits understanding of how it fits together; he calls this unfortunate curriculum strategy "elementitis".
And even if the whole is introduced, it is often discussed at a distance. In baseball (or other sport, or music, or language learning) newcomers get to play, from very early on. In education, the subject is described rather than participated in; he calls this "aboutitis". (Perkins does address the question of how the "whole game" can be introduced when it is enormous--such as mathematics, or science. He argues that just as baseball is introduced through a simplified form--simpler even than Little League--it is possible to develop an appropriate "junior" form of the game which students, of whatever age, can grasp.)
Back to practice. The sessions I observed were--inappropriately--focused on learning objects rather than tools, still less frames. But that was what the syllabus required. The mechanistic fragmentation of the whole into learning units and outcomes and assessment criteria effectively precluded any other approach. Moreover, the "whole game" was almost inconceivable. As the Wolf report suggests--although one could have wished for more detail--the arbitrary assemblage of "competences" into courses does not make for coherent and teachable programmes.
I may be critical of my students' application and implementation of their learning, but seen through this frame (or "lens" as Brookfield puts it) it is not clear how they can get better. Bottom line: if you are forced to teach a whole which does not make sense, the parts can't make sense either.
So that is what I did on my holidays.
7/10. You need to get out more.
I agreed with practically all of Perkins' book. I also found it highly readable, in part because he does not let his references interrupt his flow--the evidence is there, but it is in the very accessible notes at the end.
Indeed, I recognise much of his approach in mine, although he is more rigorous than me on "working on the hard parts" (ch.3), which is my failing. I would promise to do better next time, but at my age, there may not be a next time!
His chapter on the "inner game" is a classic (ch.5), particularly on the hidden curriculum embodied in the physical and logistical elements of the classroom**.
I'm being presumptuous here, but he does divide the basic idea, of concentrating on the whole, into seven principles, each of which has several aspects, each of which can in turn generate several strategies or exercises... Of course, if you approach the material as a tool-kit or even a frame, that's good. But, although I say it myself, I'm very good at that. I try to employ it all the time, but I did find I could not sustain the necessary frame all the way through the book. Periodically, I did lapse into thinking, "Do I have to learn all these particular techniques?" (Object orientation)
P. writes in a US context. Syllabi in the UK (particularly in vocational, professional and further education) are much more prescribed and regulated. Frequently very badly. With very little understanding on the part of awarding and validating bodies about what it is like to study on their programmes. (See here on who writes syllabi, if you've not been there already.)
* my terms rather than Perkins'.
** This excerpt concerns the explication or deconstruction of the chair desk (chair with flap-over writing surface) based on Luttrell (2004) (full source on Perkins p.238; author referred to here as "Wendy")
A chair and a desk are fused into the same convenient unit, the desk component a rather small platform upon which the student can rest a book or a notepad. Books usually can be stored under the seat. Wendy provokes people to realize that this very ordinary instrument of education embodies numerous tacit assumptions and expectations that deserve a second thought. [...]
[...][T]he conventional chair-desk favors right-handed students; the writing platform is almost always to the right. The working surface is not very large, so apparently students are not expected to coordinate multiple sources of written information or develop complex representations. Also, the chair-desk gets in the way of students forming working circles and deprives them of common desk space, as when five or six pupils sit around a table. Learners work alone! Normally chair-desks come in one size for a classroom. One-size-fits-all!
And there is more...
*** (Update 29 August) I now discover that this idea originates from Nicholas of Cusa (1401-1464). See here for a brief but more detailed exposition than mine, and a discussion of how he attempts to use it as a proof of the existence of God--an argument which the writer claims eventually proves exactly the opposite.
References
Perkins D N (2009) Making Learning Whole: how seven principles of teaching can transform education San Francisco: Jossey-Bass
Ramsey I T (1967) Religious Language London: Macmillan
The link is to an enjoyable, accessible and iconoclastic article by Eric Dietrich, entitled "There is no Progress in Philosophy". (The first four sections are the most entertaining; the remainder is more technical, but still not particularly hard going.) From the Abstract:
Except for a patina of twenty-first century modernity, in the form of logic and language, philosophy is exactly the same now as it ever was; it has made no progress whatsoever. We philosophers wrestle with the exact same problems the Pre-Socratics wrestled with. Even more outrageous than this claim, though, is the blatant denial of its obvious truth by many practicing philosophers. [...] The final section offers an explanation for philosophy’s inability to solve any philosophical problem, ever. The paper closes with some reflections on philosophy’s future.
This is in contrast, of course, to the achievements of science.
If you accept the argument (which I think I do, with some reservations), it is interesting to speculate whether the same can be said of the rest of the humanities, albeit in a weaker form. It is fair to argue that no progress has been made in the study of literature, for example, partly on the contingent basis that determining what constitutes "progress" in such a field is itself a philosophical question. Of course the stock of literature is ever-increasing, so we may have quantitative growth if not qualitative. I take it that the sterile deviation of "theory" (now apparently in retreat) is evidence that attempts at "progress" can only achieve the feat of disappearing up the proponents' own nether regions. A similar argument applies to the study of history (but again not to the creation of history)... As Alan Ryan observes in today's Times Higher Education (a propos of a nuanced discussion of the relationship between teaching and research):
The corpus of available Greek literature that has escaped the ravages of time is finite and scholars have just about all of it under their belts. Interpretations of that finite corpus are another matter; they are, if not infinite, certainly indefinitely many. Nor is there any particular technique likely to yield insights that will be definitive, irresistible, part of a cumulative project of explaining everything there is to explain about Greek literature. Physicists may fantasise about finally reaching the "theory of everything", but it is unimaginable that anyone will produce the definitive way to read Aeschylus.
This is of course not good news for the practitioners of the humanities, which are under threat in the academy yet again. But are these disciplines about "making progress"? Or are they the stuff of Oakeshott's "conversation across the ages" (quoted here by Mike Love)?
As civilized human beings, we are the inheritors, neither of an inquiry about ourselves and the world, nor of an accumulating body of information, but of a conversation, begun in the primeval forests and extended and made more articulate in the course of centuries.
But like all conversations (including the arguments of philosophers which are Dietrich's starting point):
"In a conversation the participants are not engaged in an inquiry or a debate; there is no 'truth' to be discovered, no proposition to be proved, no conclusion sought. They are not concerned to inform, to persuade, or to refute one another, and therefore the cogency of their utterances does not depend upon their all speaking in the same idiom; they may differ without disagreeing. [...] It is with conversation as with gambling, its significance lies neither in winning nor in losing, but in wagering. Properly speaking, it is impossible in the absence of a diversity of voices: in it different universes of discourse meet, acknowledge each other and enjoy an oblique relationship which neither requires nor forecasts their being assimilated to one another."
And that is a delight. But does it make sense to try to "professionalise" it? And what are the implications for higher education of accepting such an argument--namely, that the humanities are not "subjects" in the same way as other subjects, which do make progress?
(The original liberal arts, in the trivium and quadrivium, for example, are more focused than the usage adopted today in the US traditional liberal arts college; one might argue that only philosophy got a look in, under the heading of "logic" or "dialectic". So the historical argument for their centrality to the curriculum, weak as it already is, doesn't wash. And the study of English Literature is positively new--the University of Cambridge only appointed its first endowed chair in this dubious area of study in 1911, although interestingly there was a chair at the University of Glasgow from 1862.)
There is of course a recurrent debate in educational circles about knowledge and skills--which I am not going to reference because of its ubiquity. It's not merely a matter of liberal arts versus practical and vocational arts, lively though that discussion is. It is about how one goes about cultivating the higher reaches of critical understanding. Is it a matter of cultivating the skills of critical thinking first, with the knowledge base as an underpinning resource? Or is it a matter of transmitting the knowledge base, so that students are equipped to make judgements on the basis of real knowledge--and trusting that the skills will emerge?
False dichotomy of course. Both-and rather than either/or. But Bloom implies that the way to the skills is through the knowledge. And if the point of the knowledge is ultimately that the skills of creative thinking are engendered ("Creating" is the highest stage in the Krathwohl and Anderson revision of the Cognitive Domain) then it may not matter that the knowledge base itself is not going anywhere at a scholarly or cultural level. It is going somewhere for a particular learner.
And--just possibly--there may be something to learn from its substance regarded substantively rather than instrumentally, for its own sake rather than in the service of some other objective.
Jim Hamlyn touches on some associated questions here. (Although, as I have revised this post, we may have diverged.)
Reference
Oakeshott, M. (1962) "The Voice of Poetry in the Conversation of Mankind," Rationalism in Politics and Other Essays. London: Methuen, 197-247.
The original pointer to the Dietrich essay was from the Browser. Many thanks.