25 August 2011

On surrogates and assessment

I'm starting off with two observations about quite different things.

The first is this one from Ben Goldacre's brilliant "Bad Science" blog at the Guardian. His topic concerns a press release about a potential new treatment for Duchenne muscular dystrophy.
"...this story is also a reminder that we should always be cautious with "surrogate" outcomes. The biological change measured was important, and good grounds for optimism, because it shows the treatment is doing what it should in the body. But things that work in theory do not always work in practice, and while a measurable biological indicator is a hint something is working, such outcomes can often be misleading."
And later...
"...improvements on surrogate biological outcomes that can be measured in the body are a strong hint that something works – and I hope this new DMD treatment does turn out to be effective – but even in the most well-established surrogate measures, and drugs, these endpoints can turn out to be misleading."
A fairly basic point, of course, but it did set me thinking about the extent to which our obsession with measurement in many fields, including teaching and learning, has led to increasing reliance on fairly dubious surrogates.

And then I came across this commentary on Standard and Poor's revision of the US credit rating:
"[Ratings agencies] ...are human enterprises, fallible institutions—and like other institutions, they have procedures, interests, and histories. Their records deserve inspection. In the scientific spirit, in the spirit of show me, they deserve scrutiny."
A credit rating is a complex construct (I presume). Since it is supposed to have predictive value (other than merely being part of a self-fulfilling prophecy), it must be put together from a raft of surrogate measures, presumably of directly observable factors which co-vary with an institution's credit-worthiness. But it is only as good as the choice of those surrogates*.

Which led to some general thoughts in relation to education.

Today is the day when GCSE results are published (the exams taken at age 16 by practically all pupils in the UK). The press stories are predictable, suggesting grade inflation and the exams being dumbed-down. (Or of course if the pass-rates were not an improvement on last year, there would be jeremiads about further decline in educational standards...) The press discussion will not be sophisticated, but it will at least acknowledge what the politicians and the educational establishment will deny, namely that the examinations are not realistic proxies for educational achievement.

This is leaving aside the issue of the tail wagging the dog, of "teaching to the test" without ever asking whether the test is valid or reliable. Beyond that, the logistics and practicalities of mass assessment distort the process, and it has ever been thus. When Liam Hudson (1967) discussed convergent and divergent thinking styles, he noted that convergent thinking was privileged in school at least in part because its testing could be standardised.

But these artificial surrogate assessments are increasingly separating the formal educational system from the "real world", particularly that of communities of practice. This is not an original observation; while I'm on "golden oldies", I'll refer to Becker's wonderful 1972 paper School is a Lousy Place to Learn Anything In, which is based inter alia on a similar argument.

It is out of an awareness of the intrinsic limitations of such surrogacy that a course on teaching with which I have long been involved has attempted to develop a more authentic assessment strategy. Of course, teaching courses have always routinely involved direct observation of teaching, but not everything is amenable to direct observation. The traditional solution on most** other courses has been set assignments; our course moved away from that to negotiated submissions based on a learning contract. Learning outcomes are specified and students decide, in consultation with a tutor, of course, what evidence they will submit to demonstrate that the outcomes have been met. This is a step closer to reality, but of course only insofar as the specified learning outcomes correspond to the real world.

The course has just been internally reviewed for routine reasons, and it is apparent that the bureaucrats hate the assessment scheme. Work is not graded, for example. The scheme is not suited to anonymous submission, because the students are talking about their own practice and work setting (it is an in-service course). Not all work is suited to electronic submission via Turnitin.... the list of complaints goes on.

The real problem is that validity, reliability and fairness--the traditional requirements of an assessment scheme--are now subordinated to standardisation, administrative convenience, and security***.

These are considerations for the legitimation of surrogates and proxies--the same kind of consideration as applies to the regulation of second or third-order derived financial instruments which no longer bear any relation to buying and selling stuff which is any actual use.


* I am not relying entirely on a single blog-post here! See also Dan Gardner's excellent and accessible Future Babble: Why expert predictions fail and why we believe them anyway. (London: Virgin Books, 2011) It's a great corrective to all the doom and gloom surrounding us. Incidentally, he draws a lot on the work of Philip Tetlock, the subject of this interview by Jonah Lehrer in Wired.

** Most but not all: our approach owes much to work at the University of Huddersfield, particularly in the early '90s.

*** Security in the sense of not being vulnerable to plagiarism, although the emphasis on discussion of one's own practice and production of examples and resources means that the approach is fairly protected in any case.

Becker H (1972) “School is a Lousy Place to Learn Anything In” American Behavioral Scientist 16(1): 85-105, reproduced in R G Burgess (ed.) (1995) Howard Becker on Education Buckingham: OU Press
(Update later today: Many thanks to David Stone, who writes; "I was happy to discover that my institutional subscription gave me access to the original Becker article. Just in case others should be as lucky, here is the DOI link:
http://dx.doi.org/10.1177/000276427201600109 ")

Hudson L (1967) Contrary Imaginations; a psychological study of the English Schoolboy Harmondsworth: Penguin


20 August 2011

On teaching objects, tools, and frames

This post is prompted by interesting points made by Bruno Setola in a substantial post on his blog Gamification.nu, which is well worth reading (and I'm not just saying that because he makes some kind remarks about my sites). The relevant piece is headed "Levelling Up".

His post is packed with ideas and efforts to synthesise them into an approach to teaching Cross-Media Communication, amongst which he finds Threshold Concepts to be a very useful tool. When I first read his thoughts, though, I thought he hadn't really got the idea; he was emphasising the acquisition of a frame of reference rather than the actual content of the concepts.

However, in the course of the discussion he refers to a keynote at the Third Biennial Threshold Concepts Symposium in Sydney last year, given by David Perkins. Unfortunately I couldn't get to that meeting, but that made me even keener to watch the video, below. (Note that it is almost an hour long, but well worth the time. You might find it helpful to have the .pdf file of the full set of slides open so that you can switch to them, because the camera does not dwell on the screen.)

[Embedded video: David Perkins' keynote at the Third Biennial Threshold Concepts Symposium]
In essence, Perkins is now talking about threshold experiences rather than concepts, and explores the epistemic shifts which take place as they develop from object to tool to frame. (Hence the shift of emphasis in Bruno's account.)

Selectively, because there's a lot in the address, my attention was drawn to Perkins' thoughts about what is involved in managing these shifts and teaching material to serve as a tool rather than an object. (For more detail on the content of the tables, do watch the video; these notes are only about the gist of some parts which strike me on the basis of current interests.)

I was reminded of a couple of pretty poor classes I've observed this year, commented on here and here. In both cases the problem was not really with the actual teaching, but with the syllabus, and the way in which it treats each item of learning as a gobbet of what Perkins elsewhere calls "inert knowledge". Each item was to be stored in the students' brains, to be taken out and shown when called for, but there was no sense of doing anything with it. The academic level Perkins was talking about was higher than the classes I had observed, but he discussed how using the approaches in the left-hand column of the table below tends to promote learning of material as a set of objects, rather than as tools to work with.

Object role                      | Tool role
Key features, 'toy' applications | Fully developed applications
Rival academic concepts          | Rival tacit operative concepts
Comparison and critique          | Select among several and apply

(Do not be tempted, incidentally, to see "Object" as merely equivalent to the lower levels of Bloom in the cognitive domain, and "Tool" as signifying applying the material. It is possible to teach at a very advanced level, still working with objects--and indeed as Perkins notes, that is often entirely appropriate, when the material is a "destination" rather than a "route"*, an end in itself or object of scholarship rather than something which earns its keep by serving as a tool.)

Tool role                | Frame role
Several concepts         | One concept
Somewhat closed problems | Somewhat open problems
Abundant time            | Low-stress real time
Solo or large group      | Small group, rapid turns

Tools have specific tasks, and need to be selected appropriately, and although they may become "extensions of the body" in practical tasks, they are nevertheless also objects which can be studied and refined (Setola discusses the "extensions" point in his post).

The third way in which ideas/knowledge/concepts etc. may be used is as a frame. A frame is an idea through which one sees stuff; a tool is an idea with which one works; an object is an idea one knows about. The critical difference is that by default a frame is part of oneself. It is not experienced as something other; indeed it may be very difficult to step outside one's habitual way of seeing things and take "my habitual way of seeing things" as an object of study.

Frames are what reveal the "inner game" of topics of study, for better or for worse, as Perkins (2009: ch.5) discusses. It needs to be emphasised that frames are not "superior" forms of knowledge (or skill, or values) to tools or objects. As Perkins' use of the term "role" suggests, it is a matter of what job you want this knowledge to do, and so how you teach it.

Bruno's concern is principally about how these transitions might be managed and "taught". Scaffolding, for example, with its implications of incremental development, no longer works when one reaches a discontinuity, such as this kind of epistemic shift between object, tool, and frame.

In short, I'm not sure it can reliably be managed. That is the nature of a threshold experience--the liminality, uncertainty, and indeed risk (although I don't want to over-dramatise) of how experience is re-organised by a new idea.

On the other hand, does it need to be managed? Does trying to manage it make it more likely to happen? Or is it wasted effort? But that's a question which might actually succumb to ingenious empirical research...

I'm reminded of Gestalt shifts in perception. But also of Ramsey (1967). I remember, almost 45 years ago, listening to Ian Ramsey delivering a guest lecture at Sussex on religious language--he must have been speaking about work in progress, because this was before 1967. He spoke about parallelisms in the psalms (I'm not going to digress that far) and the analogy of the polygon and the circle. Start with the simplest regular polygon--an equilateral triangle. Add a side = a square. Go on and on and the figure gets more and more circular, until at some point it is indistinguishable from a circle, and so it is a circle***.
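(For anyone who wants the geometry of the analogy made explicit, here is a minimal sketch of the limit involved--standard schoolbook geometry, not anything Ramsey himself spelled out:)

```latex
% Perimeter of a regular n-gon inscribed in a circle of radius r:
P_n = 2nr\sin\!\left(\frac{\pi}{n}\right)
% For large n, \sin(\pi/n) \approx \pi/n, so
\lim_{n\to\infty} P_n
  = \lim_{n\to\infty} 2nr\cdot\frac{\pi}{n}
  = 2\pi r
```

The perimeter approaches the circumference of the circle, yet for every finite number of sides the figure remains a polygon--which is precisely the point: the transition is reached only in the limit, not at any single incremental step.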

And I hazily remember some basic physics from even longer ago! I seem to remember that phase transitions (such as ice melting, or water boiling) require an energy premium (not the correct phrase, I know)... A catalyst may help, chemically, but the basic transition is the product of "more of the same". It's just that in teaching, the "more of the same" needs to be about the epistemic status one is aiming at, not that which one is emerging from.
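(The term I was groping for above is, I now recall, "latent heat": during a phase transition the energy supplied goes into re-organising the substance, with no rise in temperature until the transition is complete. A rough sketch, using the standard textbook value for ice:)

```latex
% Energy required to melt a mass m at the melting point:
Q = m L_f, \qquad L_f \approx 334\ \text{kJ/kg for ice}
% e.g. melting 1 kg of ice at 0°C absorbs roughly 334 kJ,
% while the temperature stays at 0°C throughout.
```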

These properties are emergent...

This kind of thinking underpins Perkins (2009), where he is concerned about developing appropriate approaches to teaching to promote learning for understanding. (It's a term he is quite comfortable with, and discusses at some length on p.48 ff.)

The book is to a certain extent a reflection on his experience of learning to play baseball as a child; he found it easy, he argues, because he was exposed to the whole game. He practised the components, of course, but he knew where they all fitted in and he saw them in context.

In formal education, on the other hand, there is in many cases no overall introduction to the whole game of a subject or discipline. Instead, each element of the knowledge base and skill set is likely to be introduced separately, and in isolation. Clearly this inhibits understanding of how it fits together; he calls this unfortunate curriculum strategy "elementitis".

And even if the whole is introduced, it is often discussed at a distance. In baseball (or other sport, or music, or language learning) newcomers get to play, from very early on. In education, the subject is described rather than participated in; he calls this "aboutitis". (Perkins does address the question of how the "whole game" can be introduced when it is enormous--such as mathematics, or science. He argues that just as baseball is introduced through a simplified form--simpler even than Little League--it is possible to develop an appropriate "junior" form of the game which students, of whatever age, can grasp.)

Back to practice. The sessions I observed were--inappropriately--focused on learning objects rather than tools, still less frames. But that was what the syllabus required. The mechanistic fragmentation of the whole into learning units and outcomes and assessment criteria effectively precluded any other approach. Moreover, the "whole game" was almost inconceivable. As the Wolf report suggests--although one could have wished for more detail--the arbitrary assemblage of "competences" into courses does not make for coherent and teachable programmes.

I may be critical of my students' application and implementation of their learning, but seen through this frame (or "lens" as Brookfield puts it) it is not clear how they can get better. Bottom line: if you are forced to teach a whole which does not make sense, the parts can't make sense either.

So that is what I did on my holidays.

7/10. You need to get out more.
Teacher


Notes/Asides

  • I agreed with practically all of Perkins' book. I also found it highly readable, in part because he does not let his references interrupt his flow--the evidence is there, but it is in the very accessible notes at the end.
  • Indeed, I recognise much of his approach in mine, although he is more rigorous than me on "working on the hard parts" (ch.3), which is my failing. I would promise to do better next time, but at my age, there may not be a next time!
  • His chapter on the "inner game" is a classic (ch.5), particularly on the hidden curriculum embodied in the physical and logistical elements of the classroom**.
  • I'm being presumptuous here, but he does divide the basic idea, of concentrating on the whole, into seven principles, each of which has several aspects, each of which can in turn generate several strategies or exercises... Of course, if you approach the material as a tool-kit or even a frame, that's good. But, although I say it myself, I'm very good at that. I try to employ it all the time, but I did find I could not sustain the necessary frame all the way through the book. Periodically, I did lapse into thinking, "Do I have to learn all these particular techniques?" (Object orientation)
  • Perkins writes in a US context. Syllabi in the UK (particularly in vocational, professional and further education) are much more prescribed and regulated. Frequently very badly. With very little understanding on the part of awarding and validating bodies about what it is like to study on their programmes. (See here on who writes syllabi, if you've not been there already.)
* my terms rather than Perkins'.

** This excerpt concerns the explication or deconstruction of the chair-desk (chair with flap-over writing surface) based on Luttrell (2004) (full source on Perkins p.238; author referred to here as "Wendy")
A chair and a desk are fused into the same convenient unit, the desk component a rather small platform upon which the student can rest a book or a notepad. Books usually can be stored under the seat. Wendy provokes people to realize that this very ordinary instrument of education embodies numerous tacit assumptions and expectations that deserve a second thought. [...]

[...][T]he conventional chair-desk favors right-handed students; the writing platform is almost always to the right. The working surface is not very large, so apparently students are not expected to coordinate multiple sources of written information or develop complex representations. Also, the chair-desk gets in the way of students forming working circles and deprives them of common desk space, as when five or six pupils sit around a table. Learners work alone! Normally chair-desks come in one size for a classroom. One-size-fits-all!
And there is more...

*** (Update 29 August) I now discover that this idea originates from Nicholas of Cusa (1401-1464). See here for a brief but more detailed exposition than mine, and a discussion of how he attempts to use it as a proof of the existence of God, although the writer claims it eventually proves exactly the opposite.

References

Perkins D N (2009) Making Learning Whole; how seven principles of teaching can transform education San Francisco: Jossey-Bass

Ramsey I T (1967) Religious Language London: Macmillan

11 August 2011

On the impossibility of philosophical progress

The link is to an enjoyable, accessible and iconoclastic article by Eric Dietrich, entitled "There is no Progress in Philosophy". (The first four sections are the most entertaining; the remainder is more technical, but still not particularly hard going.) From the Abstract:
Except for a patina of twenty-first century modernity, in the form of logic and language, philosophy is exactly the same now as it ever was; it has made no progress whatsoever. We philosophers wrestle with the exact same problems the Pre-Socratics wrestled with. Even more outrageous than this claim, though, is the blatant denial of its obvious truth by many practicing philosophers. [...] The final section offers an explanation for philosophy’s inability to solve any philosophical problem, ever. The paper closes with some reflections on philosophy’s future.
This is in contrast, of course, to the achievements of science.

If you accept the argument (which I think I do with some reservations), it is interesting to speculate whether the same can be said of the rest of the humanities, albeit in a weaker form. It is fair to argue that no progress has been made in the study of literature, for example, partly on the contingent basis that determining what constitutes "progress" in such a field is a philosophical question. Of course the stock of literature is ever-increasing, so we may have quantitative growth if not qualitative. I take it that the sterile deviation of "theory" (now apparently in retreat) is evidence that attempts at "progress" can only achieve the feat of disappearing up the proponents' own nether regions. A similar argument applies to the study of history (but again not to the creation of history)... As Alan Ryan observes in today's Times Higher Education (apropos of a nuanced discussion of the relationship between teaching and research):
The corpus of available Greek literature that has escaped the ravages of time is finite and scholars have just about all of it under their belts. Interpretations of that finite corpus are another matter; they are, if not infinite, certainly indefinitely many. Nor is there any particular technique likely to yield insights that will be definitive, irresistible, part of a cumulative project of explaining everything there is to explain about Greek literature. Physicists may fantasise about finally reaching the "theory of everything", but it is unimaginable that anyone will produce the definitive way to read Aeschylus.
This is of course not good news for the practitioners of the humanities, which are under threat in the academy yet again. But are these disciplines about "making progress"? Or are they the stuff of Oakeshott's "conversation across the ages" (quoted here by Mike Love)?
As civilized human beings, we are the inheritors, neither of an inquiry about ourselves and the world, nor of an accumulating body of information, but of a conversation, begun in the primeval forests and extended and made more articulate in the course of centuries.
But like all conversations (including the arguments of philosophers which are Dietrich's starting point):
"In a conversation the participants are not engaged in an inquiry or a debate; there is no 'truth' to be discovered, no proposition to be proved, no conclusion sought. They are not concerned to inform, to persuade, or to refute one another, and therefore the cogency of their utterances does not depend upon their all speaking in the same idiom; they may differ without disagreeing. [...] It is with conversation as with gambling, its significance lies neither in winning nor in losing, but in wagering. Properly speaking, it is impossible in the absence of a diversity of voices: in it different universes of discourse meet, acknowledge each other and enjoy an oblique relationship which neither requires nor forecasts their being assimilated to one another."
And that is a delight. But does it make sense to try and "professionalise" it? And what are the implications for higher education of accepting such an argument--namely, that the humanities are not "subjects" in the same way as other subjects which do make progress?

(The original liberal arts, in the trivium and quadrivium, for example, are more focused than the usage adopted today in the traditional US liberal arts college; one might argue that only philosophy got a look in, under the heading of "logic" or "dialectic". So the historical argument for their centrality to the curriculum, weak as it already is, doesn't wash. And the study of English Literature is positively new--the University of Cambridge only appointed its first endowed chair in this dubious area of study in 1911, although interestingly there was a chair at the University of Glasgow from 1862.)

There is of course a recurrent debate in educational circles about knowledge and skills--which I am not going to reference because of its ubiquity. It's not merely a matter of liberal arts versus practical and vocational arts, lively though that discussion is. It is about how one goes about cultivating the higher reaches of critical understanding. Is it a matter of cultivating the skills of critical thinking first, with the knowledge base as an underpinning resource? Or is it a matter of transmitting the knowledge base, so that students are equipped to make judgements on the basis of real knowledge--and trusting that the skills will emerge?

False dichotomy of course. Both-and rather than either/or. But Bloom implies that the way to the skills is through the knowledge. And if the point of the knowledge is ultimately that the skills of creative thinking are engendered ("Creating" is the highest stage in the Krathwohl and Anderson revision of the Cognitive Domain) then it may not matter that the knowledge base itself is not going anywhere at a scholarly or cultural level. It is going somewhere for a particular learner.

And--just possibly--there may be something to learn from its substance regarded substantively rather than instrumentally, for its own sake rather than in the service of some other objective.

Jim Hamlyn touches on some associated questions here. (Although, as I have revised this post, we may have diverged.)

Reference
Oakeshott, M. (1962) "The Voice of Poetry in the Conversation of Mankind," Rationalism in Politics and Other Essays. London: Methuen, 197-247.

The original pointer to the Dietrich essay was from the Browser. Many thanks.

10 August 2011

On evidence-based study tips

I don't often simply post a link on this blog, but this is a digest from the British Psychological Society of study tips with links to sources--very useful, in preparation for a new academic year.

04 August 2011

On hostages to fortune

I'm quite pleased with myself. It appears that after forty or so years in the further and higher education system, I am still naif. (The masculine--if connotationally effete--form of "naive". Not the same as "naff", although that may also apply...)

How do I know? Some time I may tell the tale of the great Course Review (but, like that of the Giant Rat of Sumatra, the world is not yet ready). But my bit of the fallout is to revise the Course Handbook. There are two hundred or so revisions required.
  • Since many of them are uncontentious technicalities such as updating the names of committees to this year's fashion, why can't the administrators--who know the answers--just correct them, instead of telling academics they are wrong and they need to look up the latest regulations?

  • It's not surprising the committee wants changes. The course has now been running for fifteen years, and we have amended the handbook every year to accommodate its growth and perpetual updating. It was last formally reviewed in 2007, with no comments on the handbook or regulations. Because no-one read them. For years I've been slipping in asides and jokes and odd footnotes--and I admit many of them have been self-indulgent and even confusing for students who don't share my odd sense of humour. No-one has noticed because no-one has read it. (Tip for authors of this stuff. Plant an "easter egg" in the middle of the verbiage, and see whether anyone finds it. More.)
But beyond the legitimate stuff (and of course there is some) and other issues which are primarily attributable to the culpable negligence of a succession of dubiously competent "senior" staff ...

...my principal concern is the number of occasions on which I have been peremptorily instructed to insert the standard university material, rather than what we developed for ourselves to suit our course in the light of our experience. Invariably (and I do know what that means, and I use the term advisedly) the standard bumf is a qualified, weasel-worded, diluted fudge of our undertakings.

(One exception! Although I may argue with the approach and style of the Library's guide to citation [it's prescriptive without sufficient attention to the underlying principles], it is both compact and comprehensive, and I have happily replaced our own amateur guidance with it.)

In particular, we have been instructed not to refer to an independent support website (because it "might confuse the students").

Critical remarks about items on the reading lists are not allowed: but all the texts are curate's eggs, we know (including mine). And we can't refer to a "curate's egg" or Latin abbreviations such as "q.v." or "inter al." because they too might confuse the students!

This is a deeply patronising and downright insulting approach to mature students.

If it is naif to believe that, then so be it.