As I noted the other day, I've been exploring assessment recently, and in particular the problems inherent in the process, including Type I and Type II errors. These are usually discussed in relation to medical tests rather than educational ones, but the same principles hold. Type I errors are so-called "false positive" results--when the test "discovers" something which is not there. Type II errors are of course the opposite, where the test fails to identify something which really is there. I've discussed them in relation to assessment here.
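For concreteness, here is the mapping onto assessment outcomes as a minimal Python sketch--the pass/fail and competent/incompetent labels are just my own shorthand, and "positive" here means the assessment awards a pass:

    # Mapping assessment outcomes onto the two error types.
    # "Positive" = the assessment awards a pass; the labels are illustrative only.
    outcomes = {
        ("pass", "competent"):   "true positive  (correct award)",
        ("pass", "incompetent"): "false positive (Type I error)",
        ("fail", "competent"):   "false negative (Type II error)",
        ("fail", "incompetent"): "true negative  (correct refusal)",
    }

    for (verdict, reality), label in outcomes.items():
        print(f"assessment says {verdict}, candidate is {reality:>11}: {label}")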
(Incidentally, I got myself into a mess trying to explain to students why simply making the test harder in order to reduce the false positives--arguably the more serious error in the system--doesn't work. Not only does it not work, it counter-intuitively increases the number of such errors. I've fudged that on the linked page, frankly, but I still don't understand what's happening! I may return to this topic. Advice welcome!)
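As a sandpit for that puzzle, here is a toy simulation. Every number in it--the score distributions, and the base rate of not-yet-competent candidates--is an assumption invented for illustration; it does not resolve the paradox, but it makes the trade-off between the two error types explicit as the pass mark moves:

    # Toy model: competent and not-yet-competent candidates produce
    # overlapping score distributions; a pass mark turns scores into
    # pass/fail decisions. All parameters are illustrative assumptions.
    import random

    random.seed(1)
    N = 100_000
    P_NOT_COMPETENT = 0.3  # assumed share of candidates who ought to fail

    candidates = []
    for _ in range(N):
        competent = random.random() > P_NOT_COMPETENT
        score = random.gauss(65 if competent else 50, 10)  # assumed means/sd
        candidates.append((competent, score))

    for pass_mark in (50, 55, 60, 65):
        fp = sum(not c and s >= pass_mark for c, s in candidates)  # Type I
        fn = sum(c and s < pass_mark for c, s in candidates)       # Type II
        passed = sum(s >= pass_mark for _, s in candidates)
        print(f"pass mark {pass_mark}: false passes {fp}, false fails {fn}, "
              f"false passes as share of all passes {fp / passed:.1%}")

In this toy world, raising the pass mark does cut the false passes (at the cost of more false fails), so whatever produces the counter-intuitive increase must depend on something this simple model leaves out--which rather reinforces the request for advice.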
This post is about trying to get a handle on (some of) the mismatches between education/training programmes and the real-world occupations they purport to prepare people for. They do both less than and more than they are supposed to (Type II and Type I errors respectively). And despite simplistic nostrums like "competence-based assessment", there are no simple solutions...
Nursing and Medicine
I'm starting with a bit of background, because there have been several news stories recently, mainly about nursing (the Staffordshire report and the proposal that prospective nursing students start as Health-Care Assistants among them), which focus on the supposed deficiencies of training in that field (and of course in others).
I've written about this earlier, but this time I want to look at a more general issue implicit in the story.
That is the untested assumption that the outcomes of professional formation in general, and training in particular, are those intended and only those intended, and that they are beneficial and desirable. We recognise that medical treatments have side-effects, and indeed that sometimes "the cure is worse than the disease", but rarely does that issue surface in the discussion of education and training. Unintended effects are a kind of Type I error.
Studies such as this meta-analysis and this, however, suggest that those "caring" and "empathic" qualities which are currently being lauded so much actually diminish in the course of medical training. That is not new; I have a very tattered reprint of this study from 1960 by Isabel Menzies Lyth, snappily entitled (originally) A case-study in the functioning of social systems as a defence against anxiety, in which she discusses how the system of training for nurses at that time set out to encourage barriers between nurse and patient, so as to preserve the nurse from being overwhelmed by the suffering of their patients. A range of strategies--such as frequent rotation between wards; concentration on tasks to be performed rather than on patients to be cared for (on the button for mid-Staffs); de-humanisation of patients by referring to their disorders rather than to them as individuals ("the emphysema in bed 13"); and the primacy of the nursing hierarchy (matron)--were deliberately employed to cultivate psychological distance between nurse and patient, on the grounds that a nurse rendered incapable by emotional involvement with a patient was worse than useless.
It took time, but the implications of that perspective were eventually recognised, and underpinned (alongside other concerns) the introduction of the Project 2000 reforms in 1986: the present state of play is discussed in a series of articles from just a year ago in the Independent, starting here. The comments from practitioners and others are worth reading.
Social work
I wrote the above several weeks ago, but an associated issue has popped up again, this time in child protection. Social work education is again under attack, now as the scapegoat for the failure of the police and children's services to act in the Oxfordshire grooming case. Harriet Sergeant wrote a coruscating article about the shortcomings of the residential child care system in particular in the Sunday Times on 19 May--that is behind a paywall, but it was picked up and quotations from it were posted here on the Conservative Home blog. That re-posting concentrated on one of Ms Sergeant's (seconding Mr Gove) rather strange ideas--that problems in social work with children and families would be addressed by appointing social workers without a social work qualification*.
There are many things wrong with this idea--as one of the commenters notes, it smacks of naive consultancy "solutions" with no clue about the realities of practice, and as another points out, this may be one area where "graduates from top universities" may well have even less of a clue than the average punter. Graduates of the care system, on the other hand? (And I'm privileged to have known, trained and worked with some truly impressive people who had to follow that route.)
There are also some things right with it--learning on the job is great, when properly supported and supervised--but the balance has to be struck. It is hard to regulate what might be learned--a placement in an over-stretched team (or care home) with a churning staff group, more than half of whom are agency locum staff (not their fault), can teach cynicism and defeatism with frightening effectiveness. And without very skilled mentoring by very accomplished practitioners, on-the-job learning can plateau at survival level, far short of best practice.
However, the interesting issue here is the insistence on the "pernicious nature" of the social work degree, based on some skilfully cherry-picked claims in a linked post. Sergeant's informant is of course correct about the "rubbish you have to pretend to believe."
I commented on that some time ago. Frankly, I don't remember anyone who fell for that. Indeed, given that bullsnit-detection is a key skill of a social worker (from both ends of the role-set!), failure to see through it could/should have been a criterion for graduation. Having said that, my experience predates the three-year standalone degree. I had grave reservations about accepting school-leavers onto qualifying courses and argued at one point for a minimum entry age of 25, which was defeated in committee as "discriminatory"! Never mind...
The argument is that the qualifying process actually de-skills the practitioner. So an unqualified practitioner is more skilled than a qualified person.
There are two versions of this argument: the first is that the unintended consequences are more potent than the intended ones, so the training is incompetent. The second, which approaches a conspiracy theory, and which seems to be espoused by many of the Conservative Home commenters, is that it is a deliberate policy to undermine "society". There were--perhaps still are--some individuals for whom the political issues which cannot be avoided in social work are much more important than the experiences of individuals, families and communities, for whom traditional "good practice" is merely the expression of Marxist "false consciousness", and who are dogmatically, even demagogically, vocal about it. But it is not wise to judge the training by this minority.
Nevertheless, their ideology, and endemic "groupthink" in social care organisations under pressure and with ill-defined tasks and strategies, may in some cases make it harder for junior practitioners to stand up for their own beliefs and practices. The limitation of the training is that it does tend to buy into a spurious "politically correct" consensus, and shies away from equipping practitioners to argue their corner.
Teaching
I moved from social work education to teacher education, still low down in the professional pecking order, but not quite as far down. There too I encountered similar claims, although made more stridently in the USA than in the UK, and again largely from the political right. In both countries, the concern has been sufficient for governments to accept or even promote alternative routes to professional accreditation, in the form of TeachFirst and TeachForAmerica.
And in the UK, the latest idea is that redundant non-graduate former service personnel will be able to train as teachers in two years; their experience in the forces will carry credit towards the qualification. The underlying assumption seems to be that such experience is more relevant than that acquired in the university classroom. The experiment harks back to the 13-month emergency teacher training scheme of the immediate post-war era.
Implicit in the scheme is a recognition (warranted or not) that academic preparation may be irrelevant or perhaps even inimical to effective practice. I can't imagine that this can ever be empirically researched--given the shifting discourses surrounding education and the enormous vested interests involved--but perhaps it is a legitimate question. Even Carl Rogers admitted it; human beings, he said,
"[...] are curious about their world, until and unless this curiosity is blunted by their experience in our educational system."
Rogers, C. R. (1969) Freedom to Learn. Columbus, OH: Merrill (ch. 7)
For the past 15 years my work has been in teacher education in the post-compulsory, mainly vocational, sector; I can certainly testify that, in terms of orientation and values at least, the non-graduate students have often been more capable and resourceful than their more academic fellows. It is just possible that the experience of university does actually blunt that enthusiasm... Ken Robinson has been on Radio 2 this week making his usual point about that--I won't link, because there was nothing new.
Discussion
What to make of this rather rambling discussion?
Curriculum planning in vocational or professional education areas such as these assumes that we are painting on a flat, blank, white canvas--not in terms of a personal tabula rasa, but of a neutral context in which practitioners will work. But the context is not neutral. It is not flat and it is not blank. To push the metaphor closer to its limits, the background colours show through, and the irregularities of the practice landscape mean that in some places formal learning "takes" readily, and in others it just drains away.
Enter the community of practice (CoP). Generally speaking, the CoP aspect of professional learning has got a good press, and although the terminology may not be used, the proponents of alternative approaches to preparation for the (semi-)professions discussed here are implicitly endorsing learning on the job and the occupational socialisation which goes with it. However, even Wenger warns against such a view:
"Communities of practice are not intrinsically beneficial or harmful. They are not privileged in terms of positive or negative effects. Yet they are a force to be reckoned with, for better or for worse."
Wenger, E. (1998) Communities of Practice [...]. Cambridge: Cambridge University Press, p. 85
It is the potency of the sub-cultures within organisations which makes a CoP something of a wild card. They are very effective at socialising new members, but there is no guarantee that the socialisation will be into best practice. Indeed, in many cases it will be survival-oriented, and, as mentioned above, when survival is your aspiration, it rather puts a cap on quality. (See also the associated discussion here.)
This has two implications. The first is a warning about over-reliance on practitioners and the practice setting as a major agency for training.
The second is the importance of the environment of the practice. It would perhaps be too sweeping to claim that the survival sub-culture will always be less common in settings where practitioners feel valued and not under threat--and this is not an argument for the naivety of Theory Y management styles--but nevertheless, "When you're up to your armpits in alligators, it's hard to remember to drain the swamp" (one of many variations, attributed to Ronald Reagan). The Menzies study of nurse training mentioned at the beginning recognised the pressure and anxiety inherent in the job. The pressures of social work are self-evident. Teaching is also recognised as a stressful occupation. And it may also be worth mentioning the extent to which sub-cultures thrive within the emergency and the armed services--because of the security and trust they can engender for those within them, however crude, self-serving and oppressive they may be from the point of view of the outsider.
The training process rarely engages effectively with such issues. Indeed, formal curricula rarely even recognise their existence--except through addressing codes of practice which often seem to be prepared by people who have been away from practice for almost as long as I have, and whose feet are a long way off the ground. That is not to say that good programmes do not try hard to get at the issues--but it is very difficult to do so without blaming. The whole history of political correctness in professional education testifies to the malignant synergy which develops when ideological interests among the staff--perhaps vying to be "more sanctimonious than thou" because their disciplines and claims to academic or professional credibility are soft and weak--meet with the denial, or at least silence, of the students about attitudes and values which it is dangerous to express.
In this recent post, I discussed Brookfield's use of critical theory as a framework for reflection, and referred briefly to its own hegemonic status; but I didn't make much of how it can be used to stifle dissent. He made much of how brave one has to be (particularly as a student) to "hunt assumptions" and challenge them from a critical (Frankfurt school) standpoint, but said nothing of how much braver it might sometimes be to argue that just occasionally the Daily Mail might get something right.
All of which is a rather convoluted way of saying that driving less-than-good practice underground is not an effective way of dealing with it. Two thoughts:
- Primum non nocere (first, do no harm) underpins not only medicine but also the other services I have mentioned in this post. But it is not possible to enact. All of them are harmful to some of the people who come into contact with them, as practitioners, clients and managers. The probability that some aspects of our practice are toxic, and that this is a necessary feature of it which cannot be made to go away, needs to be discussed on training programmes in a constructive rather than defensive manner. That is hard work. Failure to tackle it, however, is a major Type II error.
- Some of this poor practice serves a defensive purpose; you can't just take it away and not offer an alternative way of dealing with the problems--if you do, you make the situation worse. As Jesus put it, "When the unclean spirit is gone out of a man, he walketh through dry places, seeking rest; and finding none, he saith, I will return unto my house whence I came out. And when he cometh, he findeth it swept and garnished. Then goeth he, and taketh to him seven other spirits more wicked than himself; and they enter in, and dwell there: and the last state of that man is worse than the first." (Luke 11:24-26 AV).
Trying to strip away resources which help practitioners to survive is a major Type I error.
* Disclosure: I spent 20+ years in social work education, having worked on and led courses leading to the Certificate in the Residential Care of Children and Young People, the Certificate of Qualification in Social Work, and the Diploma in Social Work. All were in their time the standard nationally accredited qualifications in the field. I left the field before the introduction of the undergraduate degree. I have never worked as a social worker, and I do not myself hold any of the qualifications above. If you think that's strange, so do I; more here.
This is an even less coherent post than usual. Sorry--very much work in progress! Feel free to comment.