31 December 2012

Items to Share: 31 December 2012

Education Focus 
  • Neuroskeptic: How Intelligent is IQ? IQ's in the news at the moment thanks to a paper called Fractionating Human Intelligence from Canadian psychologists Adam Hampshire and colleagues. Some say it 'debunks the IQ myth'--but does it? The study started out with a huge online IQ test...
Other Business
  • 100 Diagrams That Changed the World | Brain Pickings "Since the dawn of recorded history, we’ve been using visual depictions to map the Earth, order the heavens, make sense of time, dissect the human body, organize the natural world, perform music, and even concretize abstract concepts like consciousness and love."
  • Archbishop's Thought for the Day: 'if all you have is a gun,… 'In his final Thought for the Day this morning on BBC Radio 4, the Archbishop of Canterbury talks about the recent killings in Connecticut and discounts the argument often put forward that "it's not guns that kill, it's people", saying: "People use guns. But in a sense guns use people, too."' ... a thoughtful gloss on Maslow's "if your only tool is a hammer..." and the aftermath of Sandy Hook. (Re-posted from Christmas Eve)
Very best wishes for 2013!

29 December 2012

On IT skills

I've just had an irritating evening. Both my sons decided they needed to avail themselves of Dad's Back-office Services (TM) to create some quotes and invoices. That's OK, up to a point, but what annoys me is that they don't have the content worked out clearly and expect me to sit there while they leaf through suppliers' catalogues deciding what mark-up they can get away with on various materials. If I were being paid for my time (or at all) I wouldn't mind so much...

OK--I was a teacher. Why do I not teach them to do these invoices themselves? I have done. Very badly. Last time I was working away from home for a few days, S. let them loose on my laptop to do the paperwork for themselves. She complained that they were up in the study for most of the evening, "helping" each other, generating piles of scrap paper--and littering my hard drive with unlabelled files just dropped in whatever folder I had last used.

R. has some excuse. He's now in his mid-forties, and all this stuff never featured in his formal education, and he has never owned a computer, so although I get irritated with his failure even to make a note of his customers' surnames or postcodes, I don't expect him to handle file-management or version control or even typing beyond painful hunt-and-peck level.

But CJ is younger, and computers have been around in his life since I got my first (Amstrad 8256) when he was five years old, and he got a Sinclair Spectrum +2 at six or seven, before being led astray by games consoles a year or two later. IT (not yet ICT) featured--rather falteringly, granted--in his primary school education and was routine at high school from 1994 onwards...

I'm not just ranting about his lack of skill, though. He is clearly accomplished in many tasks--ICT is after all not just one thing. His gaming skills may be a little rusty, but they are still accessible. He can do all kinds of things on his smartphone that I don't even understand. It's just that he uses a word-processor when he should use a spreadsheet--we're not yet anywhere near an accounts package...

In fact, he's very accomplished at all the things he has never been taught. I'd go further and hypothesise that in real-world ICT practice there is an inverse correlation between the skill level achieved by learners and the amount of formal teaching they receive. The speed at which I see adolescents texting in the street or on the bus amazes me. I'm sure that any efforts to incorporate social media into a formal curriculum would only hinder, slow or even halt the growth of its use.

In part, then, I am for once inclined to applaud Michael Gove's recognition of the inadequacies of the current national ICT curriculum.

A clear disconnection has developed between curriculum aspirations and outcomes, parallel to the divergence in the late '50s and early '60s between formal and informal curricula in music. In secondary school, my classmates and I--very mildly by current standards--subverted formal music lessons and affected to despise them. It was an attitude reciprocated by our teachers, who referred to popular musicians as three-chord wonders. Nevertheless one of my friends--while affecting the same disdain as the rest of us for "music appreciation"--was practising for countless hours to master all 88 tracks of Django Reinhardt's solos in his LP collection, and he was not alone.*

As Illich put it:
A second major illusion on which the school system rests is that most learning is the result of teaching. Teaching, it is true, may contribute to certain kinds of learning under certain circumstances. But most people acquire most of their knowledge outside school, and in school only insofar as school, in a few rich countries, has become their place of confinement during an increasing part of their lives. (Illich, 1970: 12)

...and I had a J R Hartley moment: R. had told his girlfriend about how he is credited in the acknowledgements of one of my books--had I a copy to spare? No, but--it's on Amazon, for £0.01 (+ £2.80 p&p!), withdrawn from library stock somewhere. Not a boost to my literary ego, but a reasonable present!

*  He has no direct web presence, but search for him and get countless references to professional musicians who "studied under David Taplin at the University of Huddersfield".

22 December 2012

On being a "deviationist" in several senses...

It's not something you claim, more something you confess (usually under duress).

I recently watched "The Golden Age of Steam Railways" on BBC4 about the redevelopment of the Talyllyn and Ffestiniog narrow-gauge railways in North Wales; the last 15 minutes or so concerned the construction of the "deviation", a loop from Dduallt round the Tanygrisiau reservoir to restore the full track to Blaenau Ffestiniog. Like the page I linked to, that last sentence probably told you more than you ever wanted to know about the potentially obsessional world of the railway enthusiast.

I have enormous respect for the thousands of people who worked on what must count as one of the greatest purely voluntary civil engineering projects ever undertaken, over a ten year period. My involvement was trivial and incidental, so I claim no credit--but I do have a debt to the project, for what I learned.

First, I participated in a weekend working-party. Leave London after work on Friday, pile into a minibus, drive up the A5--largely pre-motorway--arrive on site at 11pm in the dark and climb to the base--a Nissen hut (no plumbing)--work every possible hour until dark on Sunday, and then travel back. And pay expenses for the privilege...

... and a week's working party, noted here, rather nostalgically.

My previous account concluded, "Wouldn't have missed it for the world." That is rather over-stating it. I can come clean. I hated every minute of it. I did it out of some perverse sense of duty and allegiance to an organisation which I really enjoyed belonging to in my teens, and out of loyalty to a good friend who was leading the group and who wanted a sidekick...

However, some memories do abide...
  • From today's perspective, the whole set-up appears naively trusting--I don't remember even having to provide a character reference to become a leader of a residential group of 12-15 year old boys, based in a Nissen hut up a mountain and totally isolated from the rest of the world (other than on foot over difficult terrain) between 8 pm and 8 am. It was not simply a "safeguarding" issue--that term and indeed that concern were unknown then. Indeed, I suspect that to have raised concerns about potential abuse would have been one of the few grounds for disqualifying anyone from participation--it would have drawn attention to an unhealthy preoccupation with such matters.
  • As I recall, the team leader received a cash float for expenses. He did keep it in a locked cash-box--but I seem to remember he brought his own box, and he kept track of expenditure in a notebook. In those days printed receipts were unknown, so if someone forgot to ask for a written one, we had to rely on memory and price labels.
  • We did have a first-aid box, with the usual complement of bandages and the like--but as far as I know there was no check on the first-aid qualifications of the leaders. (That may not have been noticed, because one of us was a medical student--which status does not of course guarantee practical first-aid competence.)
  • We spent our days hacking at rocks with picks and shovels. The health and safety precautions surrounding the use of explosives were indeed quite tight--only the Colonel (Campbell, who lived half-way down the mountain) had explosives clearance--and I remember going to pick up the gelignite from the depot in the valley, which was plastered with score-boards announcing "220 days since the last accident", and the like.
  • But I don't recall any hard hats, steel-capped boots, eye- or ear-protection...
I could go on.

But in a related vein: a few days ago, I and a few former colleagues--all with a background in education, although I was probably the oldest--met for one of our occasional walks. The conversation turned to the Jimmy Savile case, and an animated discussion about whether "it was all different in those days" was a legitimate defence for the conspiracy of silence which surrounded his abusive acts. (I won't say "alleged". He's dead, and if I had heard those rumours then so had everyone with any connection with children's services. And I've written about abuse here, and here, in particular.)

It strikes me that:
  • We assume (ideological hegemony) that our current values are best/correct/right... (How dare I suggest otherwise? "I was beaten every day and b****red every night at [boarding] School, and it didn't do me any harm!"---is no longer a legitimate claim.*) ....
  • Our predecessors were not (1) naive, (2) ignorant, (3) in denial, (4) complicit, or (5) participants in relation to abuse. That categorisation is itself predicated on a frame of reference focused on the detection of presumed abuse.
  • There's a clear divergence here between an approach to any initiative which is about the maximisation of happiness (the utilitarian "hedonic calculus" in its most benign form) and about the prevention/mitigation of harm, even when the latter is a pre-requisite of the former. The harm perspective claims the high ground so much that it can never be claimed that enough has been done on that front, and so there is never enough energy left for the happiness agenda**. (I leave aside the possibility that some organisations in the field have a vested interest in exaggerating risk in order to raise their profile and their funds...)
What has happened to our perspective and discourse, particularly relating to young people and risk, in the past thirty years? Why? Can and should the pendulum swing back? What are the costs of the current defensive approach--both social and personal?

I'm reminded of Daddy Walker's famous telegram in Swallows and Amazons (1930) in response to the children's request to go sailing: "BETTER DROWNED THAN DUFFERS IF NOT DUFFERS WON'T DROWN". Today that approach would be regarded as tantamount to abuse.

* For the record, I do agree.

** Of course, the reality of some risk, however unlikely, will always remain. Since I started drafting this post, the Newtown massacre has taken place. Followed by the bizarre and chilling take on it by Wayne LaPierre of the NRA.

18 December 2012

On a culinary con

S. gave me the book of Jamie's 15-Minute Meals as part of my birthday present, concluding her inscription with "Now feed me!"

I've now cooked a dozen or so of his recipes--more or less. Why so vague? That goes to the heart of the issue...
  • Some of the ingredients are rather obscure, so I substituted others. "Prepared polenta" is a cheat. I can't even find ordinary polenta at Waitrose. And yesterday it took three members of (brilliantly keen to help) staff to locate the bulgur wheat. Frankly, all these staples taste of very little, so why not list substitutes in the recipes? So I can legitimately be accused of not following instructions properly. OK--I've not done that for forty years... Read on...
  • because the whole project is driven by a silly and arbitrary constraint--fifteen minutes. Get real. Yes, assuming that you have lined up all the prep at the head of the recipe, zapping ingredients and cooking them may take only a few minutes, but that is catering-speak, not home-cooking-speak. "15-minute meals" is an appealing marketing idea, but it imposes a stupid structure on almost all of the recipes (exempting some of the fish and veggie ones). 
Assuming that each "meal" (read "course") consists of meat/fish + veg/salad + staple  (potato/ pasta/ rice-based carbohydrate), Jamie's staff* have been driven down the line of creating:
  • something which can be portioned, beaten and flavoured/seasoned to be fried or grilled in a few minutes.
  • a salad or quick-cooked vegetables
  • a filler--probably not potatoes because they take too long
--and there are only so many variations on the theme. So most of the recipes I have tried share:
"On a large sheet of greaseproof paper, toss the [meat] with salt, pepper [herb/spice 1] and [herb/spice 2]. Fold over the paper then bash and flatten the [meat] to 1.5 cm thick with a rolling pin."
(I don't know how to 'toss' with those dry ingredients on a sheet of greaseproof paper without making a dreadful mess, incidentally--but I do admit that greaseproof paper works better than the clingfilm I have been wont to use...) There isn't very much alternative if you are going to cook meat in about ten minutes--the cuts, moreover, have to be premium and expensive.

And there is no time for subtlety in seasoning, so most recipes call for chillies and garlic. But if you are going down that route and time is of the essence, why can't you use chilli and/or garlic and/or ginger and/or anchovy... puree? Delia conceded that in How to Cheat at Cooking so why not Jamie?

What both Jamie and Delia have in common, however, is far too many ingredients (Delia's first edition in the '70s was much better in this respect). On my latest effort with Jamie (p.40), not only did I substitute couscous for polenta (with some knock-on changes), but I dumped the asparagus and the spinach. Even so, my notes read: "lots of flavours--not much flavour overall. Confused."

I know little in this field, but I would have expected that the test cooks would have systematically tried recipes with and without the seasonings and sides, and decided which made a difference and which were swamped. I'm sure that the 80/20 rule applies here as much as in the rest of the world.

So--why this post? It's not a formal review, and apparently the book is not doing as well as Jamie's previous Christmas offerings, in any case.

It's not about the book, or the Jamie brand*--it's about the approach. Apparently, for all the popularity of cooking shows on TV and the (tie-in, of course) books, fewer and fewer people are cooking from scratch at home (sorry, too many sources of variable quality to evaluate).

Recipe books are a dead end, a cul-de-sac. They reinforce the difference between the chef (chief) who originates the recipe, and the compliant cook, who follows it. They are the culinary counterpart of Freire's banking model of education.

It's interesting to think about alternatives. As long as they taste good!

* "Jamie Oliver" ceased to be a person and became a brand about a decade ago. I'm sure he signs off each project which bears his name, but he cannot possibly develop them all personally.

16 December 2012

Items to Share: 16 December

09 December 2012

Items to Share: 9 December

Education Focus
Other Business

08 December 2012

On what to do after the end of the world...

According to NASA, the world will not end on St Lucy's Day*. I'm going to stick my neck out and believe them.

It's the original cognitive dissonance scenario: celebrate instead with this classic sketch.

And just carry on...

* This dating may not work any more, although it would be a pity to lose it; Donne was of course writing before the calendar adjustment of 1752.... Still, 21 December 2012 seems to be the deadline.

02 December 2012

Items to share: 2 December 12

20 November 2012

On returning to the MOOC...

Sorry! This post arose because I ran out of words on the commenting facility of a discussion of a Massive Open Online Course (aka MOOC) which I left a while ago (see here).

Jonathan Rees has stuck with the MOOC longer than me, and he is now engaged in a discussion with the principal contributor Jeremy Adelman, who has effectively laid down the challenge, "OK, how would you do it better?" Jonathan opened the discussion to all and sundry, and I replied...

[Here is the lead-in] 

The first requirement is to get away from the economic model. An appendix to the Dearing report on higher education in the UK, in the late '90s (sorry, I'm just responding on the hoof, so referencing takes a back seat), argued that simple lecturing requires about 6 person-hours of prep for each hour of "delivery", while "resource-based learning", as it was then known, might require up to 100 person-hours of investment for each hour... And that was just about delivery of content...

So on-line learning should be prepared to spend about 15 times as much per hour as face to face. But the student numbers are a thousand times as great or more!
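The back-of-envelope arithmetic can be sketched out explicitly (the prep-hour figures are the ones recalled above from the Dearing appendix, and the audience sizes are purely illustrative assumptions of mine):

```python
# Rough cost-per-student-hour comparison. Prep-hour figures are the
# (recalled, illustrative) Dearing ones: ~6 person-hours of preparation
# per delivered lecture hour, up to ~100 for resource-based material.
def cost_per_student_hour(prep_hours: float, students: int) -> float:
    """Preparation effort per delivered hour, spread across the audience."""
    return prep_hours / students

# Illustrative audience sizes: a lecture hall vs. a MOOC enrolment.
lecture = cost_per_student_hour(prep_hours=6, students=100)
online = cost_per_student_hour(prep_hours=100, students=100_000)

# Roughly 15x the preparation per delivered hour...
print(f"prep ratio: {100 / 6:.1f}x")
# ...but the per-student cost collapses by orders of magnitude.
print(f"lecture: {lecture} person-hours per student-hour")
print(f"online:  {online} person-hours per student-hour")
```

Which is the point: the investment per hour is modest next to the scale of the audience, so economy is no excuse for impoverished preparation.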

Whatever recent commentators such as Alex Tabarrok have argued--from an economic rather than educational perspective--the online model is impoverished--and especially if it relies on the "lecture" model.

(And incidentally, I'm sorry to say that the lectures on this MOOC were, at least until I dropped out, very poor. Not only were they poor on-line, they would also have been bad in the lecture theatre [although the sense of a live "performance" would have helped]. The visual accompaniments were frankly amateurish, and there was clearly no investment in professional advice in lecturing for distant delivery. This is not about spoon-feeding, but judicious enhancement. And--brutal bottom line--voice coaching for Jeremy.)

The "Global Dialogues" show something of the way forward.

(for reasons of technical ineptitude on my part, I'm publishing this incomplete stuff now, because it is getting late. I hope to finish it shortly)

(A week later...)

The second consideration is implicit in the very pertinent comment added by Contingent Cassandra to the initial version of this post--it concerns the transparency of the medium. She concludes that:
The minute a lecture lands on a screen, students seem to become critics, and to use as their standard the highest-quality material they've seen. That's a tough standard to meet.
(I'm tempted to quote George Herbert again--but I've done that three times already.) The medium has become a message, if not the message. I suspect that only routine exposure to such a medium will overcome that issue, if indeed anything will.

And that prompts the third consideration, which is that simply porting a method (lecturing), which hasn't had much of a good press since alternatives became available and is far from the gold standard of teaching, suggests a certain lack of nerve on the part of the authors of the course, and, as I argued on 29 September, a degree of laziness which sends a message: preparedness to re-configure the course,
...requires conscious and deliberate effort. And if it doesn't happen, it sends a message to the student. You ain't worth it.
More positively, there are ideas out there which could be used, although they would need some root-and-branch re-design. The most obvious are the Khan Academy (I note that the strap-line for their history listing is "The history of the world--eventually!") and RSA Animate. They are short (shorter than the MOOC's 15-minute segments), and use fairly basic animation to keep track of the arguments. Moreover, the preparation is necessarily a team enterprise (even if Khan did start out on his own) --the very act of discussing the integration and juxtaposition of channels of communication forces the team to make the most of the process. And that in its turn may promote the creation of more innovative models...

I'd better stop while I'm being uncharacteristically positive!

(But do follow Jonathan.)

19 November 2012

Items to Share: 18 November

15 November 2012

On finding one's own way

I've finally bought a tablet. It's not an Apple product; I admire their design, of course, but as a colleague remarked the other day, "I refuse to be dictated to about what I can and cannot buy to run on the machine." My first netbooks ran Linux--the ideological antithesis to the Jobs model (despite sharing its Unix heritage with the Mac OS)--and they were brilliant. Up to a point.

But the iPad famously comes without a manual--an act of typically Jobsian chutzpah--and it appears that its emulators feel obliged to demonstrate their intuitiveness in an equally minimal way. Indeed this story suggests that a machine might be the ultimate stimulus for discovery learning. (Caveat; it does originate from here--not that that is necessarily an issue.)

I'm less interested in the arguments and the one-upmanship (Stephen Potter, of course) than in the experience of learning by experience forced upon one by the lack of guidance.

But I did not have the advantage of ignorance. I came to the tablet with preconceptions about how it should work. For years, my PC use has been based on folders and files; files are tied to default applications (with some exceptions), so if I want to edit something, I find it in a file manager and click on it--it finds its default application--and there it is. Very rarely do I go to the application first. Even when I want to create something from scratch, I am usually already using the application, so I just find whatever passes for File -> New File nowadays.

The File Manager on Android is so clunky as to be almost useless. That's OK; in this model of user interaction, it is "back office" stuff. It's routine business the user should not have to bother his pretty little head about.

It's all about Apps. I've carefully referred to "applications" so far for earlier interfaces, where they have been [more clearly] servants. They have now been promoted to maître d' or gatekeeper status. "There's an app for that!" is the new mantra.

They have become easier to use, at the expense of flexibility. I have spent hours--yes, hours*--trying to work out how to do things I no longer need to be able to do. Or at least, someone has decided I no longer need to do them...
  • the proper url of a webpage appears fleetingly, and there is no way to copy it...
  • because the keyboard shortcuts (Ctrl+C etc.) don't work (I've searched for an app. to reinstate them--there isn't one. Come on, someone!) [Shift + > or < does highlight, but Ctrl+ anything just leaves the page, it appears. Granted, formatting can be done by poking a finger at the toolbar, but even Windows of umpteen earlier editions ago allowed several routes to the same result...]
  • there is no right-click menu.
(Note: I started writing this on the tablet--it's an Asus Transformer, with dedicated keyboard--but had to revert to the netbook, in order to manage some basic formatting. Like italicising this.)

OK--that's the grumpy stuff. What's the point?
  • I'm groping about in the dark. This is ultimate problem-based learning. But I don't even know what I need to know.
    • So I pass through various procedures without noticing them because I don't know that they are important stepping stones to doing something I want to do. It's only when I realise, "I've been here before!" that I can start to chain them together.
    • They worked. Not necessarily very well, but they did work. And I remember them. So that's what I'll do again... (Yes, behavioral** theory can account for that.) It's a dead end***.
    • So how will I back-track to find an even better way?
  • I've complained principally on the basis that things don't work as I am used to. But what I am used to says more about me than about the quality of the UX (user experience).
  • The tablet interface is designed to facilitate consumption. Music, movies, news... and  elaboration (social networking) and conversation (ditto) or even modification (wikis) but not origination. I'd wager that few blogs are by default written on tablets, and even fewer stand-alone sites.
  • And that is what happens with learning purely from the bottom up... I'm really intrigued about Negroponte's experiment (referred to above) because if his subjects did indeed hack into the tablets and change the rules of the game by activating the cameras, they went beyond that stage. Without teachers. 
Next week I start teaching a unit on "curriculum"... I think I'll use this as a starting point. And of course Allen Tough's work. Except these are not on the curriculum...

* I would say "literally, hours", had "literally" not become devalued in current usage into its opposite--"apparently" or "virtually". Hey-ho--my baggage is showing again...

** I know there is a missing "u" there, but site statistics are potent reinforcers! 

*** Getting to "just about works"--survival level--and then getting stuck there, is a real problem. For many years--16 at the last count--I have obsessionally edited, written and up-dated a course handbook. I have used at least 70% of Word's capabilities (and some of its incapabilities), from elementary styles to labour-saving cross-references and table formats to definitive tables of contents and glossaries. All automatic. All automatically up-dating with every revision. See page x was always the correct x. It appears no-one else in the institution knows about these brilliant features--if I let anyone else touch the file it will come back broken... This year I had to pass it on. Disaster.

And strangely enough that disaster is the product of colleagues' learning from experience. That tends to be self-limiting. It needs not so much a teacher, as someone who can hold out a vision of something even better, to transcend the immediate answer and push on to the next level. (I'm amazed that I can draw such meta-lessons from Word, but so it goes.)

12 November 2012

Items to Share: 11 November

Education Focus 
Other Business

* the single "l" is correct British spelling, before anyone else complains!

04 November 2012

Items to Share: 4 November

Education Focus
Other Business

01 November 2012

On a master's class

... or at least, lecture. The lecture is the simplest--perhaps the crudest--form of "teaching", but it nonetheless has nuances...

A week or so ago, I got an email from an old friend and former colleague. He has forsaken mainstream academe to become a freelance lecturer, working nowadays across the world, but it so happened that this week he was speaking just a few miles away. Would I like to attend as his guest, and catch up over lunch afterwards?
    (I just googled his name, and his site came up second out of almost 9m hits. He must be doing something right, and I withdraw any silly remarks I made to him earlier about his site! It only goes to show the triumph of substance over style. 
    And I recalled the first time we met, when he was doing an informal staff-development session in the college library on the use of the Overhead Projector, about thirty years ago. I learned more in half an hour about reveals, overlays, layout, colour... than I could have believed possible. The technology has moved on, and he has kept up with it, but not for its own sake.)
We met briefly before he went on, and he asked me whether I had brought my (paper) notebook, because he wanted me to act as a "critical friend" and give him some feedback. Somehow, I'd rather expected that. But it's the first point which makes this worth blogging about.

J. has been lecturing for 30+ years. In his present incarnation, as it were, he has a standard repertoire of about a dozen talks, so goodness knows how many times he has delivered each of them (well, I shouldn't be surprised if he also keeps track...) And yet, as we talked afterwards, it became clear that they never stand still--much of our discussion was about what to put in and what to leave out, and how to get even better. More of that later (mini-point: he was of course very good at flagging points within the talk for later comment, creating a little sense of expectation, on which he of course duly delivered).

And yet he saw an opportunity to get some honest (and, dare I say it, professional) feedback--and took it. It takes confidence to do that. (Contrast with this incident.)

The context, of course, was a little different from my usual observations:
  • It is debatable whether he was teaching. The lecture was to a decorative and fine arts society;  the audience were certainly interested, and they may well have picked up some fascinating information along the way, but they were not primarily there to learn. After all, the programme of events for the society was never designed as a curriculum.
  • So, in terms of evaluation, using dear old Kirkpatrick, only level one [Reaction--what participants thought and felt about the training (satisfaction; "smile sheets")] really matters. Or does it? And how should it be construed? In other words, is this simply entertainment?
However, there were many useful practical points: I am confining myself to what I could observe and/or discuss. So although we talked about the perennially fraught issue of what to include and what to leave out, I'll just touch on that--it's a content rather than process issue.

As PhD Comics fortuitously put it the other day:

(The small text reads: "Where = the number of academics in the room who think they know how to fix it, and 1 = the person who finally calls the A/V technician")

J brings his own equipment, and had it set up well in advance. He knows it thoroughly, and its capability to fill the screen in a large hall. It was just ironic that the society chairwoman's opening remarks included a vote of thanks to the couple who were just stepping down from arranging the audio-visual facilities, having done it for many years.

Not only did he set it up, he also had a rolling presentation--of quotations about wine--which he set running as the audience arrived. Some knew each other so there was some lively chatter, but there were quite a few singletons, who probably appreciated something to look at while waiting. The screen went blank for the notices about forthcoming events, and the introductory remarks.

Having been introduced, he took up his position at the side of the stage, where a boom microphone had been set up. As we discussed later, he prefers a lavalier radio mic. because it does not restrict you to one spot, but sometimes it is simpler to adapt to the normal provision.

He did not launch into the talk at once. As the lights were being dimmed he requested that those at the rear of the hall be left on, lest their dimming encourage people to nod off. I've no idea whether this really happens or not, but it did--more importantly in my view--send a message, "I know what I'm doing; I'm an old hand; you can relax." 

He asked if everyone could hear, stepped up to the mic. and away from it, and back again. It was clear the amplification was not necessary, so that gave him a little more flexibility, although he kept it on. Later he told me that when he started doing these talks, he arranged several sessions with a voice coach to ensure he would not wear his voice out. The coach invoked John Wesley as testimony to the capacity of the unaided voice to reach thousands of people, even in the open air.

And then he told a self-deprecating and entertaining anecdote about his late mother's opinion of his ability as a public speaker, while the audience settled down to listen...and so to the lecture itself, marked by the first "slide". 
    I've gone on about this at some length because it was all about that "sending a message" to the audience, and it is (I generalise wildly) largely neglected in academe. Of course, as the link above suggests, you can't not send a message, but often--as academic staff of an accredited institution--we rely on its prestige, and even its branding, to send an institutional message to students, so we think we don't have to bother. After all, our audience members are not lively nudging-elderly people who could readily choose to do many other things this Tuesday morning, and who generally trust the Society to book interesting and entertaining speakers, but who also know that they occasionally get deadly bores, and who need to be re-assured on each occasion... our students are signed up to a package of which we are a part, and are largely stuck with us. So we don't have to pay attention to these messages. So we don't. And they default to "It's not worth the bother", or worse, "You're not worth the bother". No--attention to these messages is not beneath us.
(The lecture was about the Shakers of North America. The link is not very informative, but the lecture itself is!)

As befits his subject, J's presentation package was simple*, even plain, but beautifully crafted. As Thomas Merton apparently said (or--being a Trappist monk--wrote), "The peculiar grace of a Shaker chair is due to the fact that an angel might come by and sit on it." But he would probably have adopted the same style for any of his presentations.

The slides mostly had plain blue backgrounds, with white sans-serif text. Sometimes, where relevant, a monochrome picture, appropriately faded, stood in for the background--that's an effective way of tying subordinate points into a common heading. No flashy transitions or animations, just occasional highlighting. (I did suggest that some of the highlighting on a map of settlement locations was too subtle and did not really do its job--but it is when you get to that kind of nit-picking that you are aware of the underlying quality!)

We had a slightly more committed difference about the wordiness of some of the text. J argues that while the visuals complement the lecture, they should be able to be read sensibly as free-standing text. For his predominantly elderly audience, perhaps compensating between visual and aural channels**, he may well be right. I'm more inclined to minimalism; the fewer words the better, each one a keyword for an idea. Coherent prose comes way down the field. (Indeed, switch the presentation package to Prezi, for example, and that is almost forced on you...)

Nevertheless, J never turned to look at the screen behind him (after all, the content was on his laptop before him) and when he showed a quotation on screen, he did not read it out loud. But he did keep quiet to allow the audience to read it. Again, I'm sure that works well with his constituency, and indeed his aspirations (Kirkpatrick levels 1 and perhaps 2). But I wouldn't want (as J wouldn't) to recommend it context-free. 

Most of the time the visuals ran in parallel with the lecture itself--I think I remember some counter-point juxtapositions, but subtly done. 

J did not announce his objectives at the start of the session--as I noted with interest, because when we first met many years ago he was very keen on making objectives clear. But of course the context was different--no-one was going to ask, "Why do we have to learn this?" or, "Will this be in the exam?" Even so, he was careful to signpost each section of the lecture. It was billed as "The Shakers: Their Beliefs, Architecture and Artefacts", and so it was made clear how we were progressing through the material. The only part which was not signposted was the final few minutes, in which J talked about being in touch (by email--they never eschewed technology) with one of the last three surviving Shakers, and told of the agony of some recent events among them. That touch of revealing a personal acquaintance framed the preceding material quite differently--and could not have been announced in advance. (I suspect, although I have not checked this out, that J makes an ad hoc judgement call about whether to include that, depending of course on time, but perhaps also on the "feel" of the audience...)

Just one other point to mention here. The whole lecture came in at 67 minutes, which was about what J. had expected. I began to notice a little fidgetiness in the audience at around 25 minutes: just time for a humorous but relevant diversion about the millennial laws of Shaker propriety, which lightened the mood for the transition to the examination of architecture and artefacts. There had, of course (of course?), been lighter moments and a few jokes earlier, but this was a couple of minutes' breathing space. I suspect that with a younger audience, J. would have gone for shorter units. (General background on lectures here.)
    Arising from our discussion I'm beginning to think of a model of trade-offs in constructing primarily informative sessions--but I'll come back to that in another post.

* "'tis the gift to be simple..." is the Shaker hymn which is a theme in Copland's Appalachian Spring suite, later well-known as the tune to the popular carol "Lord of the Dance" by Sydney Carter. Naturally J and I discussed the issue of the inclusion of music in the lecture; it's a trade-off, of course, and he does include it in his study days when he has more time. But music takes time to listen to, and the challenge is always to manage content in the available time.

** ...but nothing above should be construed as any kind of endorsement of the egregious bull***t about "sensory learning styles"--that is quite a different matter.

29 October 2012

Items to Share: 29 October

 Education Focus

Other Business

24 October 2012

On using a title

Years ago, I was asked for my name for some National Health Service form, by a receptionist whose first language was not English. After she struggled with the spelling of my surname, I just passed over a credit card so she could copy it. It showed my title as "Dr", and I lost my mini-campaign to be treated as normal by the NHS. Not that I am discriminated against in my thankfully so far limited contact with them; it is just that using my academic title confuses the system, wastes time as I have to explain I'm not a medic., and has medical staff "explaining" things to me in technical language which means nothing... All--admittedly rather desultory--efforts to get it changed have so far had no effect, despite assurances.

But I do remember one wise doctor on a course who greeted me with, "Are you one of us, or a proper one?" In the UK, strangely enough, the medical "Dr" is a courtesy title, because the MD is rare; most medics are "MB, BS" (or "MB, ChB"--two unclassified bachelor's degrees, in any case) and of course to the disconcertion of foreigners taken ill in the UK, surgeons are "Mr" or "Miss"*. By "proper one" he meant an academic doctorate, although of course there was no telling what its private connotations were ...

It came up again last week, in an out-patients' waiting room--and everyone looked up as I responded to the call for "Dr Atherton". I'm probably being hyper-sensitive and it's all trivial but I mention it because...

I've just been watching a Newsnight piece on the postponement of the badger cull. The two interviewees were a representative of the National Farmers' Union, and Dr Brian May speaking for "Team Badger". The title appeared in his on-screen label, as well as in Paxman's introduction. There is no doubt as to his entitlement to the title: PhD, Imperial College London, 2007**. In astrophysics. How does that--or indeed the reason which probably actually got him onto the programme, his previous association with some popular musical group called "Queen"--relate to his credibility as a badger advocate? (I'm with him on the badgers, as a cause, so I feel let down by his apparent need to appeal to false--OK, irrelevant--credentials.)

*  this is entirely a matter of convention; as far as I know, the standard and equivalence of UK medical qualifications is globally acknowledged.

**  and all credit to him for completing it thirty years after starting it--lucky he did not fall foul of the pernicious "Levinsky rules".

22 October 2012

Items to Share: 21 October