26 June 2010
On one reason not to grade...
It presupposes a degree of maturity on the part of students, but the course with which I am still associated operates only on a pass/refer basis (one opportunity to re-submit referred work; failure at that point means failing the module and, at best, re-taking it in its entirety). Many students don't like it--particularly those who are themselves graduates and have a long history of assessing themselves by their marks, often of course relative to their class-mates.
The situation Mary Beard describes in the linked post is one reason (and it is incredibly time-consuming to adopt her solution, particularly for big courses). But it goes a bit further than that.
Let's apply the Dale/Bruner “Cone of Experience” to the situation--slightly odd, but it fills the bill, and such tools are never more than pragmatic devices. In terms of assessment:
- what the students actually know/can do etc. is at the enactive level. It's real life. It's incredibly rich and complex and inherently unassessable because of that, so
- we devise means of assessing which necessarily lose a lot of detail in order to be manageable. Sometimes the detail we actually lose is precisely what we wanted to assess in the first place, but we just get it wrong. This is at the iconic level. The students do the assessment, but in order to compare them against a standard, or even against each other, or previous performance...
- we have to assess the assessment, and that gets even more abstract and symbolic. And attaching a single number or letter is as abstract as one can get (apart of course from manipulating those numbers mathematically)--or in other words, it loses the most information.
"Yes, it did only get a C+, but that was solely down to those daft mistakes you made with the statistics; if you'd got them right it would probably have been an A-"
"Yeah, right..."Such practices are actively inimical to effective feedback and assessment for learning, in the current jargon.
22 June 2010
On "benign" myths
In the car this morning, I caught part of "Woman's Hour" on Radio 4. (Heading link goes there, available for seven days, relevant bit about 30 minutes in.)
It was a discussion about drinking in pregnancy, and it was very refreshing. One of the contributors talked about how some of the strident advocates of prohibiting alcohol consumption in pregnancy saw the topic of possible foetal damage principally as a way of "telling a good story" which would command attention in a wider campaign against the evils of drink. Another, asked about the labelling of alcohol in the US with warnings about drinking during pregnancy, agreed that the initiative was principally traceable to the desire of drinks companies to cover themselves against litigation, regardless of the truth of the claims. Asked further whether the labels had made any difference to the incidence of foetal alcohol syndrome, she was very clear--none at all.
This discussion exposed, more clearly than anything I have heard or seen in mainstream media for years, the drift from factual reporting of risk to supposedly benign but infantilising myth-making, which not only undermines the capacity of grown-up people to make their own judgements of risk, but also undermines their confidence in anything they are told by the Ministry of Truth.
Not that it is new; Plato started it. Different context, same principle:
How then may we devise one of those needful falsehoods of which we lately spoke—just one royal lie which may deceive the rulers, if that be possible, and at any rate the rest of the city?....
...Such is the tale; is there any possibility of making our citizens believe in it?
Not in the present generation, he replied; there is no way of accomplishing this; but their sons may be made to believe in the tale, and their sons' sons, and posterity after them. (Plato, The Republic, Book III)
On "potential" (self-indulgent rant)
I do realise that this is of little interest to anyone else, but it's my blog, so read or move on!
I am prompted by hearing, on BBC Radio 4 news of all places, that something or someone is a "potential threat" to whatever.
This is not merely oxymoronic; it also dilutes the original idea to a "threat of a threat".
The following list is not exhaustive, but the notion of potentiality is not merely implicit but clearly embedded in the following:
- chance
- danger
- opportunity
- probability
- risk
- threat
"Potential" is a useful qualifier for something concrete--indeed certain-- such as "result", "outcome" "consequence"--but whose qualities are as yet unknown.
[Incidentally... "more/few/fewer" is the term for items which can be counted. "more/less" is the term for quantities which need to be measured. (Yes, I know, "term" is not the best term...]
Who says? You may well ask, but that's a whole new ball-game.
21 June 2010
On the problem of self-limiting adequacy
I'm not sure whether or not it is true, but the story goes that Ofsted has declared that their Grade 3, "Satisfactory", is no longer satisfactory. Or perhaps it is (i.e. "satisfactory"), once again, given that the common inspection framework has "raised the bar". (I've looked up the URLs for the links, but don't expect me actually to read this stuff. I'm retired, remember?)
I hate to say it, but they may be right.
For Father's Day one of our sons gave me a premium, organic, long-matured, etc. rib of beef. I don't usually roast a joint on the bone, so I consulted Delia, naturally (in book form rather than online). Of course her advice was impeccable, and even Susi commented on the better flavour and texture than our usual supermarket joints. (This may prove to be expensive if we have also raised Susi's bar...)
For me, and indeed a generation of conscientious UK cooks (domestic cooks, rather than "chefs"; it's a very different discipline), Delia's books are the bible [that statement is grammatically "correct"].
Next to the guidance on roasting beef was, of course, a recipe for Yorkshire Pudding. (No, I am not going to post a link to a similar recipe.)
It was different from my usual one. That is of course fine; there is no canonical recipe for a traditional dish. And mine works very well, thank you. Most important, it is reliable. Sometimes a pudding comes out as a dome rather than a bowl for some unknown reason; but turn it upside down and it is like all the others.
I didn't try it. I know what I always get with my own recipe, so why risk it? If I were floundering; if my puddings came out like biscuits, I would go for it. And perhaps if I were cooking just for me, I might try it; but I am trying to produce a meal for Susi, too. (And the dog, but being diabetic, he couldn't have the pud in any case...)
So in Ofsted's insistence on "capacity to develop further" they may just be onto something. "Satisfactory" is not only not good enough, but inimical to further development. Discuss...
(Principally, Delia says you don't have to let the batter stand, but then she does add water as well as milk. I've always believed that standing the batter in the fridge until the last half-hour of roasting lets the liquid do something to the flour which helps with the rising process--because it is plain flour, after all, and the egg is not whipped to retain air, soufflé-style. A proper scientist would not take any of this on the authority of the blessed Delia or even St Jamie, but would rely on experiment. OK, fund me!)
17 June 2010
On manual work, feedback and fulfilment
I've just finished Matthew Crawford's (2010) The Case for Working with Your Hands: or why office work is bad for us and fixing things feels good (London: Penguin/Viking), published in the USA last year as Shop Class as Soulcraft: an inquiry into the value of work. It's not very long, and highly readable.
To a certain extent, Crawford is in the same territory as Richard Sennett (see links below) in lauding craftsmanship and painstaking skill, but whereas Sennett rambles self-indulgently around the workshops of violin-makers in Cremona, and discusses literary approaches to cooking a chicken, Crawford is more grounded. He does not use the effete term "craft", but concentrates on the manual "trade". And his paradigmatic trade is repairing motorbikes.
He has an impressive academic background, including a degree in physics and a doctorate in political philosophy, and enough form as an office worker to reject it on an informed basis. He can also appeal to having practised as an electrician on small-scale construction sites.
He runs a motorbike repair shop as his main business. He knows what he is talking about, as some fascinating stories testify, even if readers may not really understand them.
There's much too much for a blog post here, and of course if I go into it too much you may not read the book, and there are reviews here, here and here. But...
- He celebrates work which gives direct and unmediated feedback; do it right and the machine works again. Do it wrong and it doesn't.
- He bemoans current systems which don't do that--but he talks about them as if they were deliberately constructed to obscure feedback. They aren't. It's just that in most areas of practice today you don't find out at once whether something is working or not (see Jaques' "time-span of discretion").
- And he castigates managers and modern work practices for focusing on process and compliance rather than outcomes and results. I have a lot of sympathy, and I moan about it myself, but complex organisations are just like that. Sorry!
- Crawford's conception of manual work is somewhat idealised. He nods in that direction in his acknowledgement that working as an electrician on new-build work is less interesting than repairing faults. And of course, manufacturing new motor-bikes on a production-line represents exactly the de-humanising work against which he inveighs.
- It's symptomatic of the fragmentation of work about which Crawford complains that some of the most relevant work of recent years, such as situated learning, deliberate practice and even threshold concepts, does not get a mention.
14 June 2010
On learning and being taught
My son has just called round (he's 28), initially to print out a spreadsheet, but then to email it. Before that, though, he wanted me to show him how to add up a column of figures on the spreadsheet.
He couldn't email it from his own laptop because of the security settings, so he had to copy the file across via a flash-drive, with which he needed my help.
The file was simply in his main "My Documents" folder, called "Expenses.xls". I pointed that out. He didn't understand that it was a problem...
I was irritated, to say the least. Apart from games consoles, he has had his own computer since before he went to university. I remember helping him to create graphs for a project when he was still at school.
After he left, I had something of a moan to Susi. In her inimitable way, she suggested that it was not surprising, because I am a useless teacher (guilty! I am hopelessly impatient when it comes to procedures), and when I responded by pointing out how long he has had to acquire these basic skills, she wondered whether he had ever been taught them at school or university.
I love learning. I hate being taught.
In particular, I don't understand the assumption which underlies so much current discourse in the field, that learning depends on being taught.
Texting is not taught at school, but the "achievement" of young people in this field is phenomenal. (Hint to Michael Gove: if you ever want to massage figures upwards, introduce a Diploma in "Social Communication", where txtg is a major component.)
I remember an epiphany in 1974, after a discussion at an annual Easter houseparty with friends in a rented cottage, in this case on the Nefyn peninsula... That beyond the basic three "R"s and foreign languages (and some of that was dodgy--did you ever meet a French person who enquired, "Comment allez-vous?"?)... beyond that, almost everything I had been taught at school was:
- trivial and/or
- irrelevant (to what? Almost anything in the real world.) and/or
- wrong
But somehow the conventional wisdom is that being taught is a prerequisite of learning.
And of course that a paper qualification is a reliable proxy for learning, knowledge and skill.
No. It's merely a proxy for having been taught.
12 June 2010
On pictures and words
The other day a friend and former colleague and I met for lunch, as we do every few months. He, being a resident of the Potteries (that part of Staffordshire which came to prosperity in the early 19th century through the entrepreneurship of people like Josiah Wedgwood), gave me these two plates. Based on the familiar Willow Pattern tableware (originally from the Minton pottery), these update the scene to reflect the Potteries (on the left) and London (on the right).
RK explained how he made use of them in short courses on promoting entrepreneurial activity; instead of setting a groupwork brief of writing the characteristics of a proposed project, he insists on it being drawn. He described the way in which the task divided groups, but also how profitable it was because of the effectiveness of the post-presentation discussion--what would be known in an art school context as the critique. (Identified by Lee Shulman as their "signature pedagogy".)
I was prompted to recall, on the train on the way back, an exercise I used to use with students (on teaching programmes) to get them to define their view of the teaching and learning process. Some of the more obvious images, from the trigger slide, are below; I used this in order to pre-empt participants from latching on to the clichés...
I used to ask them to work in groups to generate their own images of the process, to draw them on a flipchart or acetates, and then to present them (the delights of primitive technology--you can do it with an electronic whiteboard but it's a lot of hassle) to each other, and get the crit. going in that way. A useful variant was not to allow any verbal explanation by the presenters--the rest of the class had to deconstruct their offering.
And that in turn set me thinking about one of the texts I read long ago about a technique used in health-care for exploring critical incidents, known as "Illuminative Incident Analysis" (Cortazzi and Roote, 1975). This too centres around drawing the incident, in as cartoon-like a way as the group members are capable of, in response to guidance and structured questions. The authors insist on the use of pictures as a way of getting around how words can so easily be used to fudge meaning rather than reveal it.
(Of course it is precisely the capacity of language to carry so many connotations and levels of meaning which makes it so rich, but in this context that was seen as getting in the way of the task.)
As I recall, for example, a review of a potentially serious incident on a hospital ward included the phrase "a nurse in a tizzy". The authors draw out the possible connotations of that phrase, including implications about competence and stress (and nowadays they might note that for staff whose first language is not English, such an informal usage might be meaningless...). But if the speaker had to draw that nurse, so much more might emerge--she might be surrounded by clocks; cartoon patients, doctors, managers and even paper-work might be depicted pulling her in different directions; her head might be spinning. There would be the role-set, a picture of the demands on her time, all there at once (a completed picture is an instantaneous, non-linear representation)...
Prompted by the latest round of marking, I am struck by the current ubiquity of the bullet-point. The use of a bulleted list is quite a pointer to the student not having really got the idea, because paradoxically it removes the structural dimension from the account. Indeed, my contribution to an educational Devil's Dictionary would include:
- Bullet points: the most effective way of reducing an idea to less than the sum of its parts.
I wonder what is gained and lost in the adaptation of, say, a novel into a graphic novel (or vice versa), as well as in the re-telling of tales on film and video.
I have no time for the "learning styles" rubbish, but the use of many different media in communication for teaching is not merely a matter of pandering to that fad; it also provides opportunities to explore different features of the content.
Cortazzi, D. and Roote, S. (1975) Illuminative Incident Analysis London: McGraw-Hill. See also: Rich, A. and Parker, D. (1995) "Reflection and critical incident analysis: ethical and moral implications of their use within nursing and midwifery education" Journal of Advanced Nursing 22(6) pp. 1050-1057
04 June 2010
On trivial learning
This is tricky. What I learned about was far from trivial. The fact that I learned it, was.
I have just watched a programme about aerial warfare in the first world war. It was fascinating and haunting. And irritating.
Because Albert Ball was exalted as the great ace (crudely, 44 kills). Mick Mannock (61) was not even mentioned. Surely class was not a factor, 95 years later?
And the position of the propeller shaft of the SE5 was wrong, probably because of the exigencies of reconstruction. I'm sure someone even geekier than me will make a devastating contrary case about this. But the objective facts are not the point here...
My interest is in the fact that I read something about this when I was about 12. More than 50 years ago. I made no effort at all to "learn" it. Amazingly, I have just found the source (Jones, 1954). Watching the programme, I was irritated by the continual references to 56 Squadron. I wanted to claim the primacy of 73 Squadron. Actually it should have been 74 Squadron...
I am drawing no conclusions, just questions.
03 June 2010
On distraction
Nicholas Carr has written The Shallows: what the internet is doing to our brains. Note that I have not linked that title to Amazon or anywhere. That's because his blog post (which is of course linked from the header) explores whether embedded links (as opposed to those gathered at the end of a post or a page in the manner of a bibliography) tend to distract the reader and to fragment the reading experience, and possibly the capacity to follow an argument:
"Links are wonderful conveniences, as we all know (from clicking on them compulsively day in and day out). But they're also distractions. Sometimes, they're big distractions - we click on a link, then another, then another, and pretty soon we've forgotten what we'd started out to do or to read. Other times, they're tiny distractions, little textual gnats buzzing around your head. Even if you don't click on a link, your eyes notice it, and your frontal cortex has to fire up a bunch of neurons to decide whether to click or not. You may not notice the little extra cognitive load placed on your brain, but it's there and it matters. People who read hypertext comprehend and learn less, studies show, than those who read the same material in printed form. The more links in a piece of writing, the bigger the hit on comprehension."So if we place a premium on clarity and communication, is it stylistically better to gather the links at the end?