The link is to a piece by Steven Levitt at the Freakonomics blog.
He's just learned a new approach to putting (in golf). That kind of change is often resisted, as I explored here. Levitt does acknowledge a minor downside, but reports no sensation of loss, even after, by his own admission, barking up the wrong tree for the putting proportion of 5,000 hours of golf.
Is it a matter of mind-set? And is sport an activity which promotes an incremental mind-set?
25 July 2011
21 July 2011
On being condemned to expertise
I've just been reading Matthew Syed's excellent Bounce: The Myth of Talent and the Power of Practice (2011, Fourth Estate*). I'm not a great sports fan, nor a music or chess buff, and those are the fields he discusses most, but he has an entertaining approach to Ericsson, Dweck and the other usual suspects.
I may be perverse but I do wonder about the opportunity cost of acquiring such expertise. In other words, what is the trade-off between what these people could have done with their childhood and adolescence, and what they ended up doing with it? I'm sure that they learned a great deal about perseverance and commitment and mind-set; but... Often it wasn't their idea--the Williams sisters and Tiger Woods and many more had to forgo many "normal" experiences of growing up because they were too busy and too focused--and it was not necessarily their vision which drove them. See the controversy provoked by Chua's (2011) Battle Hymn of the Tiger Mother (London: Bloomsbury).
See also this post and its links
* I'm not signed up to Adsense or any other scheme, and I have decided not to link to Amazon any more. If you want it you can find it--preferably at an independent local, physical bookshop. I'm prompted to this by a belated epiphany. I had occasion to go to Ampthill yesterday and came across Horatio's bookshop (and artist's materials purveyor...). Quirky and independent--and I'm sure endangered (although according to the website it started in 2009). And what about Topping's in Bath and Ely? Or even grumpy County Town Books in Bedford... Use them or lose them! Yes, they're not as cheap, but buying the book is only a small part of your investment in it--consider how long you spend reading it.
17 July 2011
On learning in a technological age
No, this is not about the use of technology to enhance learning and teaching--it's a different angle.
Nicholas Carr's (2011) The Shallows: How the Internet is Changing the Way We Think, Read and Remember (London: Atlantic Books) is a well-written popular account of what it says in the title. Naturally it is selective and tends to assume its conclusions from the outset, but it's fair.
There have been a couple of substantial blog posts this week, too, exploring similar issues:
- from Ed Yong at Not Exactly Rocket Science on "The extended mind – how Google affects our memories" and
- from The Neurocritic on "The Google Stroop Effect?" (read it to find out what a stroop effect is!)
The Yong article seems to have been the source for a number of shorter pieces in the serious newspapers.
The very basic underlying argument is a variation on Dr Johnson (1775);
Knowledge is of two kinds. We know a subject ourselves, or we know where we can find information upon it.

Broadly it suggests that since the net (or Google as its metonym) has made it so easy to know where to find out, it has made it unnecessary to hold the information ourselves. There is a further argument that just as Socrates argued against writing in the Phaedrus (274e-275b);
...writing is inferior to speech. For it is like a picture, which can give no answer to a question, and has only a deceitful likeness of a living creature. It has no power of adaptation, but uses the same words for all. It is not a legitimate son of knowledge, but a bastard, and when an attack is made upon this bastard neither parent nor any one else is there to defend it. The husbandman will not seriously incline to sow his seed in such a hot–bed or garden of Adonis; he will rather sow in the natural soil of the human soul which has depth of earth; and he will anticipate the inner growth of the mind, by writing only, if at all, as a remedy against old age. The natural process will be far nobler, and will bring forth fruit in the minds of others as well as in his own...

...so the adoption of technological extensions to human capabilities ultimately undermines those abilities. Calculators replace the ability to do mental arithmetic, for example. It's an old argument.*
And to a certain, nuanced, extent it is true. As it has been through the ages. The introductions of writing, of printing, of local printing, etc. have all had their impact. They have changed what it means to "learn". The challenge for education... Sorry! Scrub that cliche! So how have they changed what it means to "teach"?
(I recidivistically** and opportunistically try to weave too many themes into a post, but this is about "reflection" and this does reflect how I think, for better or for worse...)
My colleagues and I have just been subject to a (insert derogatory adjective of your choice but don't forget nugatory) review of a course. One perfectly proper and reasonable question focused on the assessment strategy. "Why don't you use a wider variety of assessments? Quizzes? Timed tests?..." etc.
We didn't answer in these terms, but it did occur to me that the choice of assessment is an epistemological issue. What kind of knowledge/skill/value do you think you are testing? And on this kind of professional course, sheer memorisation as assessed by a multiple-choice test does not matter very much.***
It's a matter calling for thoughtful consideration, not knee-jerk answers.
[Although the other book I've just been reading--which poses similar questions from a very different angle--is Matthew Syed's (2011) Bounce: The Myth of Talent and the Power of Practice (London: Fourth Estate), which touches on similar material to that I've discussed earlier.
What I don't see in much of the popular literature--and the textbooks, indeed--is much recognition of how fundamentally different the learning issues are in different disciplines and contexts.]
* ...and of course my discovery of these links and references relied heavily on the availability of the linked material online.
** I'm not sure if this is a "proper" word (it only gets three hits on Google, at the time of writing, in the adverbial form), but I commend it as "in the manner of a repeat offender".
*** which is not to say it doesn't matter a great deal on other professional courses, such as medically- or perhaps legally-based programmes, where there is indeed a body of knowledge to be acquired for instant access--"there's a fracture of the thingy-bone and a puncture of the whatsit-artery. Sorry! On the tip of my tongue! I'll just go and look them up..."
and--great minds think alike corner--here is another piece.
Labels:
disciplines,
learning,
on-line learning,
practice,
reflection
08 July 2011
On insight
The link is to an excellent article from Edge magazine. It's a conversation with Gary Klein on the nature of intuition and the decision-making of experts.
Judgments based on intuition seem mysterious because intuition doesn't involve explicit knowledge. It doesn't involve declarative knowledge about facts. Therefore, we can't explicitly trace the origins of our intuitive judgments. They come from other parts of our knowing. They come from our tacit knowledge and so they feel magical. Intuitions sometimes feel like we have ESP, but it isn't magical, it's really a consequence of the experience we've built up.
- There's enough material here for an entire course.
- In particular, it poses interesting questions about the much-vaunted notion of reflection.
- Because it is an interview piece (although we only get Klein's contribution) it is very accessible. It's even exciting in parts!
Labels:
decision-making,
expertise,
intuition,
tacit knowledge
07 July 2011
On--I'm exasperated about silly arguments!
This one is about pedagogy vs. andragogy. If you need primary sources, see here and here.
I have been teaching for forty+ years (whether my students have been learning for the same period is of course contestable).
I have never yet encountered any learning that was exclusively "behavioural" (even with our wonderful Westies), or "cognitive", or "humanistic", or that fitted any other single label.
And as for pedagogy versus andragogy... It's a spurious distinction whose primary achievement was to raise the profile of Malcolm Knowles. But read his piece here. Is that about treating people as grown-ups?
What these theorists never (OK! rarely) discuss is what is being learned. You want to learn a foreign language? At some point you are going to have to memorise a lot of vocabulary. You want to throw pots or improve your tennis? You need to practise, and there are people who can do it much better than you and you would be stupid to ignore their advice. You want to study political philosophy (OK, this is the most difficult)? Most people, of course, don't study it. They go straight to teaching it in the saloon bar... But...
And that is to say nothing about STEM subjects (Science, Technology, Engineering and Maths), where it is self-evident that there is an enormous (and growing) body of knowledge into which you must be inducted before you can make any contribution.
As I complain here, andragogy is not an approach to teaching--it's a brand. And one which gets in the way of making appropriate decisions about how to teach a particular subject/topic/skill to a particular group of learners.
01 July 2011
On angels, pinheads and APA
I've been here before, and here. And I'm marking again, so the old referencing bug-bears come out of hibernation.
The obsessional scholasticism of the "style" manuals fascinates me, and this post from the APA blog apparently takes the biscuit. Read it before continuing.
What really irritates me is of course that it privileges a US convention, that colons are followed by capitals: that is not the British style. But the convention imposes an arbitrary divide: it's no longer a matter of style, taste and preference, but of right/wrong, good/bad. (Which may translate into assessment grades and even immediate career prospects.) [Check out the arguable solecisms of this paragraph.]
There are several standards in play here. I hold to my opinion that we should not get our knickers in a twist about material up to Master's level. (Including such colloquial phrases).
That is what we might call domestic-level citation. It's a matter between student and tutor/supervisor. Its test is whether the t/s can trace evidence back to source. In the fairly rare event that an assessor does such a trace, it will be done manually, with considerable tolerance of punctuation, case, and formatting. All my complaints about obsessional and pedantic compliance requirements stand, up to this point.
But I'm reluctantly changing my mind beyond that point.
I don't use EndNote or any other citation management software: I don't understand what most of its fields mean, and the prospect of importing a lifetime's references is unattractive in any case. And I don't need it. But I can see why many people do need something like it. Preparing literature reviews or surveys for publication is made so much easier; such tasks go beyond the domestic level and call for industrial strength solutions. And once data has to be "normalised" (made consistent) for use in a database, such arcane minutiae as the place of a colon matter. The question of whether the colon is followed by an upper-case letter could determine whether the whole title is treated as one field, or as two--title and sub-title.
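To make the point concrete, here is a minimal sketch (in Python, and purely illustrative--no real citation manager necessarily works this way; the field names and the capitalisation rule are my assumptions) of how a naive importer might use the capital letter after the colon to decide whether it is looking at one field or two:

```python
# Illustrative only: shows why the case after a colon can change how a
# title is normalised into database fields. Field names are assumptions.

def split_title(raw: str) -> dict:
    """Split a work's title on the first colon into title and subtitle.

    The (assumed) rule: only treat the text after the colon as a subtitle
    when it begins with a capital letter, per the APA convention;
    otherwise keep the whole string in a single title field.
    """
    head, sep, tail = raw.partition(":")
    tail = tail.strip()
    if sep and tail and tail[0].isupper():
        return {"title": head.strip(), "subtitle": tail}
    return {"title": raw.strip(), "subtitle": None}

# Two records differing only in the case after the colon end up
# structured quite differently:
print(split_title("Bounce: The Myth of Talent and the Power of Practice"))
# {'title': 'Bounce', 'subtitle': 'The Myth of Talent and the Power of Practice'}
print(split_title("Bounce: the myth of talent and the power of practice"))
# {'title': 'Bounce: the myth of talent and the power of practice', 'subtitle': None}
```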
Such is the stupidity of machines that we have to do their bidding.
Incidentally, in one of those earlier posts I noted that Jude Carroll;

"...went so far as to say that this obsession with "correct" referencing was a phenomenon of the last ten years, and implied that it was symptomatic of a crisis of confidence on the part of academics in their own authority in a post-modern world."

But perhaps that is in itself a humanistic perspective, and the same time-scale could be accounted for by the rise of the machines?

But the Oxford comma? (And here.) I don't think there is any excuse for getting worked up over that.