29 July 2010

On one kind of American psyche

 Or--"Why I love my revolver more than my mobile phone."

I like Edwin Leap; I get his blog through RSS regularly, and if he is a little down-home Will Rogers-ish for a European taste, so be it. The list of awards to the right of the page testifies to how he is received in the US.

Every so often, though, a post from one of the US bloggers I casually follow acutely exposes a massive gulf between their mind-set and mine. (I am not so daft as to generalise this beyond the few people I follow, none of whom have been chosen for their extremism or freakery but simply for their interesting take on their world.) And this one is not merely any old gulf. It's a Grand Canyon (sorry about mixing geographical metaphors).

Actually, it is a classic statement and it is worth quoting at some length:
"The phone represents connection.  Especially a smart-phone, since it connects me not only to those I love, but in essence to the entire world in a way unfathomable a few short decades ago.  I can tap the World Wide Web from the comfort of my car or anywhere else, simply by pulling out my phone.  The phone represents the collective knowledge of humanity and our interdependency. [...] It allows me,  [...] to call others to my rescue, to ask others’ opinions, to check the collective opinion on millions of potential websites.

What about my little Smith and Wesson model 640?  Well, it isn’t about collective communication; though the sound of it being discharged will probably bring interested parties to investigate.  It has no communication device located anywhere on the frame.  It has three functions, as I see it.  It reassures me.  It may discourage those who would harm me or my family.  It is capable of causing harm.

Philosophically, it is light years from the phone.  Because it requires that I be master of my own fate.  It says, ‘no one will help you, but you, when the chips are down.’  The help summoned by a phone may not arrive for a very long time, as I live in ‘the sticks.’  The revolver says, ‘you must be accountable for this decision; you must carry and use me safely and you must use my capacity for injury with a profound awareness of morality, ethics and of the value of life.’   The revolver is about individuality.

Americans, Westerners in general, are slowly abdicating responsibility for self and embracing the collective.  The question is always, ‘who will help me?  Who will pay for me?  Who will give me? Who will come to me?  Who will allow me to do nothing while they do something?  Who can I blame?  Who will excuse me?’

This is a travesty.  This will be one of the deepest wounds to our nation, to our way of life.  Not the phone itself, which is merely a tool, but the abdication of accountability that communication can falsely represent.

If the law were just, I would gladly carry my revolver everywhere.  I like the phone, and I find it useful.  But I am old school.  I am old Southern.  I am descended from patriots.  I have been responsible for others and continue to be for myself.  I believe in the individual.

Furthermore, I am a physician in an emergency department.  I know what humans are capable of doing.  I love people.  Some of my favorite patients have been in handcuffs.  I joke with them, I like them.  But humans are dangerous.  If you doubt it, read the newspaper.

So in the end, based on my life, my experiences and my philosophy, I can only say that when I reach around my side, I love the feel of that handle far more than the feel of that phone.  And I love what it represents far more than any capacity for communication."

26 July 2010

On the difference between knowledge and wisdom

As the late Miles Kington put it (I paraphrase; original in the heading link...)
  • Knowledge is knowing that a tomato is a fruit...
  • Wisdom is knowing better than to put it in a fruit salad.
(Thanks to "Quote ... Unquote" on BBC Radio 4; I can't get at it to link at the moment.)

23 July 2010

On ritual knowledge

Last Friday the hard drive on my principal machine unexpectedly and suddenly died, gave up the ghost, expired, exited this mortal coil...

Fortunately pretty well all my data was backed up, and so at one level, and thanks to a very helpful (and reasonably priced) engineer --Paul West of Bedford Home Computers deserves the plug-- it was relatively simple to install a new drive (twice the capacity, of course) and then to re-populate it with everything from scratch. The operating system, the main packages, then the drivers and the add-ons and the tuning, and the little programs installed free from magazine cover-discs years ago which have become integral to how I work (particularly ABC Snapgraphics, a really simple template-based vector drawing programme which generated most of the graphics on my web-sites; it runs under Windows 3.1 and dates from 1995, and I have many back-ups of it, some on a single floppy!). The list goes on, and it took most of the weekend to get minimally functional again.

But that's the background. As I engaged in the tedium of finding stuff, installing it, and scrabbling for authorisation codes on packaging that had only narrowly escaped being thrown away years ago, I was reminded once again of David Perkins' discussion of the forms of "troublesome knowledge", particularly in the context of threshold concepts (of course). I'd like to link to his 1999 paper at this point [Perkins D (1999) "The constructivist classroom - the many faces of constructivism" Educational Leadership, Volume 57, Number 3], but it has disappeared from its former open-access home, so what follows is based on my understanding of part of his argument.

Troublesome knowledge may be associated with
  • ritual knowledge
  • inert knowledge
  • conceptually difficult knowledge
  • the defended learner 
  • alien knowledge
  • tacit knowledge
  • troublesome language
Without going into all the detail, the point is that these features of the material to be learned--alone or in combination--make it difficult to learn.

And the longer we go on in teaching, the more inured we get to the difficulty of these forms of knowledge, because the more we are initiated into the little worlds of our disciplines and practices the more they come to make sense to us. And the longer that goes on, the more difficult it becomes to empathise with the difficulties experienced by our students. And of course to engage constructively with those difficulties...

Until it hits home. I have a lot of ritual knowledge about PCs. I know what I have to do to make certain things happen. I haven't a clue why they work (just as I no longer have a clue about how my car engine works, despite having done basic servicing on its predecessors for thirty or so years). But I know I have to go through the motions.

I've never been much of a computer geek, but in the 'eighties I could and did create (rather boring) programs for my son to practise basic maths in BBC BASIC, and later created crude interfaces in CP/M and MS-DOS. Then, as with the car, I at least enjoyed the delusion that I knew what was going on. Now I am disabused of that.
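Something in that spirit, for the curious (a minimal sketch in modern Python rather than BBC BASIC; the questions, scoring and wording are my own illustrative inventions, not a reconstruction of the originals):

```python
# A minimal, hypothetical sketch of a 1980s-style basic-maths drill,
# in the spirit of those BBC BASIC programs. Details are illustrative.
import random

def quiz(rounds: int = 10) -> None:
    score = 0
    for _ in range(rounds):
        a, b = random.randint(2, 12), random.randint(2, 12)
        op = random.choice("+-x")
        answer = {"+": a + b, "-": a - b, "x": a * b}[op]
        try:
            reply = int(input(f"What is {a} {op} {b}? "))
        except ValueError:
            reply = None  # a non-numeric answer counts as wrong
        if reply == answer:
            score += 1
            print("Right!")
        else:
            print(f"No -- the answer is {answer}.")
    print(f"You scored {score} out of {rounds}.")

if __name__ == "__main__":
    quiz()
```

The point, then as now, is not the code itself but that I could read every line of it, and (then, at least, or so I believed) of everything between it and the hardware.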

More important, I suspect that nobody knows what is going on.

In the 'eighties, the computer mags (my favourite was PCW Plus, for the Amstrad PCW 8000/9000 pre-PC series) generally included in each issue an arcane article about programming in assembler, just one step away from machine code. It's of course possible that even in those days only the authors knew what they were talking about, but I wonder whether any person (as opposed to a succession of programs and of course millions of machines) exists who can read a few hundred lines of low-level code and recognise them as the nuts and bolts of, say, a state-of-the-art image editor.

Analogously, that is like reading a list of thousands of pixels defined by their colour properties, and then answering Rolf Harris' ritual question, "Can you tell what it is yet?" I do remember, from one of those magazines, a throw-away remark that the difficult thing about programming was not writing code, but being able to read it.

The point? That in all my dutiful behaviour re-creating my desk-top, I haven't actually learned anything. Apart, perhaps, from obeying orders.

More important, for most of us there isn't anything else to learn. I can't reference it, but an op-ed piece I read a few weeks ago pointed out that there is no single person on the planet who knows how your mobile phone works. Its operating system and "apps" have been assembled by specialists across the world who produce a library of "black box", mysterious modules which happen to work. But those who use them--not only the consumers but also the engineers and designers who package them for public consumption--neither need to know how they work, nor do they.

I've just been watching a re-run of one of the marvellous series on BBC TV on the Indian railways. Some of their locomotives are still steam-powered, and I admired the wonderful dance of well-lubricated pistons and shafts and eccentrics and wheels, and the knowledge and skill of the people who made them work. But then I thought--even I could potentially learn how these things work. I can see the components and their linkages. They are concrete objects harnessing abstract principles.

But that is not true of ICT, or of very much modern technology--even the ills of the internal combustion engine can nowadays only reliably be diagnosed by sensors and readouts, at several removes from the actual events causing the mis-fire or the overheating. Instead, one is at the mercy of the oracles...

I'm tangentially reminded of a birthday event I attended a few years ago, where I met up again with a former housemate, an early-retired (full) professor of maths, then working part-time with doctoral candidates at another university on the mathematics of financial derivatives. He, with an Oxford D.Phil long under his belt, was unstinting in his admiration for the technical competence of his high-flying candidates. He admitted that sometimes he was, if not struggling, at least exerting himself to keep up. Dateline: February 2006. About 18 months later the significance became apparent of the disconnection between the capacity of the analytical disciplines and the complexity of those all-too-real deals.

The inaccessibility to most of us of the technology which we can merely consume makes us dependent, and perhaps promotes learned helplessness (and hyperbole). Discuss!

"Any sufficiently advanced technology is indistinguishable from magic."
Arthur C. Clarke (1962) "Profiles of the Future"

18 July 2010

On neuromyths

The linked site focusses largely on school-based education, but it sets out to produce a balanced account of the rather faddish, fashionable, and often downright misleading or plain wrong contributions of neuroscience to learning and teaching. Based at the School of Education at Bristol University, it draws on sound research and scholarship both to address myths and to promote neuroscience-based approaches to teaching.

In particular, the site links to an excellent balanced chapter on "neuromyths" (pdf available here) from a recent book by the co-ordinator of the site (Howard-Jones P (2009) Introducing Neuroeducational Research London; Routledge). It's not as entertaining as Ben Goldacre's Bad Science but more comprehensive and informative, covering
  • Multiple intelligences
  • Learning styles
  • Enriched environments
  • Brain gym
  • Water
  • Omega-3
  • Sugary snacks and drinks
--and even a sympathetic discussion of why there are (so many) neuromyths in education, and how to spot one. Unfortunately, since this is simply a pdf of the chapter, the bibliography is not included--you'll have to get the actual book for that!

The site was recommended by Tony Fisher—many thanks.

16 July 2010

On dissonance by any other name

The linked blog post is in turn based on this article (Nyhan and Reifler, 2010), which discusses why attempts to change people's political misperceptions and misunderstandings often don't work, and can in some cases backfire. The blog post relates the same processes to the teaching of science.

There is a certain sense of reinventing the wheel here, and I'm interested that there is no reference to the vast and venerable literature on cognitive dissonance (here is a page of mine on that) which goes back to the 1950s. On the other hand, it is worthy of note that if you follow the first external link from my page you come to a .pdf file of a textbook chapter on cognitive dissonance, with a note to say that it is on the web because it is no longer included in the latest versions of the textbook; so the whole thing may be a matter of fashion.

Incidentally, the epigraph to the substantive article is attributed to Mark Twain, as such things often are; I think the correct ascription is to Henry Wheeler Shaw writing as Josh Billings:
The trouble with people is not that they don't know but that they know so much that ain't so.
(Josh Billings' Encyclopedia of Wit and Wisdom, 1874)

Nyhan B & Reifler J (2010). "When Corrections Fail: The Persistence of Political Misperceptions" Political Behavior, 32 (2), 303-330 DOI: 10.1007/s11109-010-9112-2

14 July 2010

On being first with the latest moral panic!

You read it here first! Be in the advance guard! Worry! Act! But mainly worry!

How long before the Daily Mail gets it? The Sun with added fear!

(I thought Mustang was a horse, and of course the great North American P-51B...)

Thanks to Tyler Cowen for the link.

On filtering and abstracting

This is pure ignorant speculation, but I was interested in the linked article and its argument that much infant learning is so efficient (particularly, it suggests, early language learning) because the child's brain is insufficiently developed to filter information, and has to swallow it whole, as it were.

One issue that frequently comes up when covering the basics of memory as a precursor to learning theories is the existence (or not) and form of photographic (or eidetic) memory. I'm not going into detail because I do know I don't know enough about it even to be a reliable guide to other sites. Insofar as it is agreed to exist, it is much more common in children than in adults, which would seem to fit with the argument of the post and article. It appears that an eidetic memory stands in the same relation to an "ordinary" memory as a picture of a document does to a version scanned with a text-recognition (OCR) utility. The eidetic memory takes up much more of (in this case a computer's) memory because it includes all the information about the document as a physical object (such as folds and smudges in the paper) regardless of its importance. The scanned text of course loses such irrelevant information, becomes much leaner, and of course editable.
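To put rough numbers on the analogy (a back-of-envelope sketch; the page size, resolution and word counts are my own assumptions, not figures from the article):

```python
# Back-of-envelope comparison of an "eidetic" bitmap of a page versus
# the extracted text. All figures here are illustrative assumptions.

dpi = 300                        # assumed scan resolution
page_inches = (8.27, 11.69)      # an A4 page, in inches
pixels = int(page_inches[0] * dpi) * int(page_inches[1] * dpi)
bitmap_bytes = pixels            # 8-bit greyscale: one byte per pixel

words_per_page = 500             # assumed
bytes_per_word = 6               # average characters, incl. the space
text_bytes = words_per_page * bytes_per_word

print(f"Raw bitmap : {bitmap_bytes / 1_000_000:.1f} MB")
print(f"Plain text : {text_bytes / 1_000:.1f} kB")
print(f"Ratio      : roughly {bitmap_bytes // text_bytes:,} to 1")
```

On those (generous) assumptions the bitmap is nearly three thousand times bulkier than the text--which is the sense in which the "scanned" memory is leaner, at the price of discarding the folds and the smudges.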

I am reminded of Bruner on the enactive, iconic and symbolic modes of representation and the issue of whether being almost confined to an enactive mode (as an eidetic image is--granted that an image is by definition iconic to some degree) is as much a limitation to learning as a resource. Certainly that is the case for people with autistic spectrum disorders.

07 July 2010

On giving handouts

The link is to an account of some empirical research on the much-debated issue of when to give out handouts accompanying a lecture--at the beginning or the end?
The findings provide preliminary evidence that lecturers should provide their students with handouts during the lecture. ... In no case ... did having the handouts during a lecture impair performance on the final tests. Even when there were no differences in final test performance, students still benefited in the sense that they reached the same level of learning with less work.
In my view it all depends on the task of the lecture, the kind of knowledge one is imparting, and how one hopes that students will engage with it. The reported research relates to students watching videos of short (12-minute) presentations on science topics; whether the findings would also hold for other content I don't know. (But it is easy to research, and might make a useful small-scale project for students.)

I don't use handouts at all any more--I set up a blog or a web-page on which I can post not only the presentation, but also the references and links, and photos of points made in discussion written up on a flip chart or white-board. I may make it available before the session, but I invariably edit it afterwards to reflect the actual material taught.

This is more like the minutes of the meeting which actually took place than notes for the session I planned to give. But that suits me because my subject areas are typically both "soft" and applied, and the discussion in the (relatively small) group is more important than the content I plan to "cover". I rarely stick to a session plan in any case, so advance notes could actively confuse the students.

It's probably more important to have a worked-out rationale for doing what you do (including telling the students what you expect of them) than doing it in a particular way.

Marsh, E., & Sink, H. (2009). Access to handouts of presentation slides during lecture: Consequences for learning. Applied Cognitive Psychology, 24 (5), 691-706 DOI: 10.1002/acp.1579

and acknowledgements to the BPS Research Digest blog for the pointer

06 July 2010

On validations and grading

I'm now quite an old hand on validation panels.

(New readers start here: it is standard practice in the UK that a new course, and indeed a substantially modified established course, is scrutinised by academic peers --colleagues within a school, within the university but from a different school, and beyond the university-- before it receives the university's imprimatur.)

I have taken part in two in the last couple of weeks. One was a complex master's programme with generic and named routes, and different but sometimes unexplored understandings of what constitutes M level work. And today's was a humble 60-credit certificate at NQF 5.

In both cases I have been really impressed by the quality of discussion. Something has changed for the better over the past few years (in my experience).

On the Master's programme we had a great and appropriately unresolved discussion about transformative learning and whether one can "require" it in assessment (among other things) [Yes, I will post on that in more detail as promised as soon as I know what I think...]

...and today we grappled with how to assess "reflection" and whether its proxy for assessment could be graded. And I found myself arguing that it could be graded!

Thinking back over a dozen or so validations in which I have participated recently --some internally but more externally, at all levels from level 4 certificates to professional doctorates-- for once I have to concede that the system has got better. Not necessarily at the level of formal regulations--in practice one only hits those when really difficult technical issues obtrude--but in both the quality and the culture of the debate.

I (and I'm not a lone voice, albeit a timorous one) have inveighed against (moaned about) the "compliance culture" stifling serious discussion of course content, structure and processes. But that seems to come principally from the accreditation bodies rather than academic institutions themselves.

I'm very pleased to find that within those institutions, "quality assurance" has moved on from covering institutional arses by ticking boxes, to a genuine enquiry into how learning can be promoted and the student experience enhanced.

I'll have to lie down--it has all been a bit of a shock!

02 July 2010

On ecstasy, sort of.

In the sense of ec-stasy, or standing-outside-of.

In the informal coffee and cake melee after my session on Monday (more on that to follow), several people approached me to discuss "standing outside one's taken-for-granted world" as a threshold concept.

Both channels are wonderful correctives to the usual perspectives on news. It misses the point to allege that they are "biassed" or "spun". In many cases, the stories they report just do not figure on our news--just as many of our domestic stories do not reach their lists. This is not like the Mail or the Sun or the Guardian spinning the same stories differently. It is about quite different priorities about what counts as a story.

Encourage students to watch!