17 December 2009 by Jennifer
The authors of the Impact Assessment calculate a theoretical net financial benefit from theoretical improvements to the education of children in elective home-based education, and from those children therefore theoretically earning more money as adults.
In the totals of EHE children who are currently said to have no or poor education, the authors mistakenly include figures for other categories, including children who had simply not been assessed by the LA staff.
Re-done with correct figures, the calculations show either a small quantifiable net gain over ten years or (more likely) an overall net loss.1
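The sensitivity of that projection to the headcount can be sketched in a few lines. All figures below are invented purely for illustration (the real numbers are in Dr Anderson's review, footnote 1); the point is only that the projected benefit scales with the number of children counted, while the scheme's cost is roughly fixed, so an inflated headcount can flip a loss into an apparent gain:

```python
# Hypothetical sketch of the Impact Assessment's style of calculation.
# Every number here is invented for illustration, not taken from the IA.

def net_ten_year_figure(children_helped, uplift_per_child, programme_cost):
    """Projected earnings uplift minus the cost of the scheme, over ten years."""
    return children_helped * uplift_per_child - programme_cost

UPLIFT = 4_000        # assumed earnings gain per genuinely helped child (GBP)
COST = 60_000_000     # assumed ten-year cost of the monitoring scheme (GBP)

# Inflated headcount: wrongly includes children who were merely "not assessed"
print(net_ten_year_figure(20_000, UPLIFT, COST))   # apparent net gain

# Corrected headcount: only children plausibly with no or poor education
print(net_ten_year_figure(10_000, UPLIFT, COST))   # net loss
```

With these made-up inputs, halving the headcount turns a 20 million gain into a 20 million loss, which is why the miscounted categories matter so much.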
But even the correctly calculated figures include two other problems:
The authors assume that LA staff from a traditional teaching background are able to assess child-led education accurately; this is questionable, because those are very different areas of expertise. (It’s difficult anyway to assess the environment of a child one has only just met, who may have unusual special needs influencing their behaviour.) For instance, some LA staff appear to believe that no “written work” equals no learning. Therefore, even the correctly calculated figures for LAs’ beliefs about these children’s education don’t necessarily reflect the children’s actual status.
The authors assume that the LAs’ interventions will have a net result of improving children’s education; this is unproven. (Some qualitative research would be useful, exploring families’ existing experience of visits from LA personnel. Anecdotal evidence suggests to me strongly that although some families may be helped, some are likely to be hindered, e.g. by misguided advice, insensitive comments to children, etc.)
Key point: Don’t count on any such theoretical financial benefit.
The authors state that the cost of School Attendance Orders (SAOs) does not need to be calculated separately, as there will be so few. They estimate five per ten thousand children per year: 20 to 40 in total, if there are 40k to 80k EHE children.
This figure is spurious: the 2008-2009 total is not the 10 SAOs that rate would predict for the estimated 20k children on LAs’ official lists; it’s nearer 80, eight times the assumed rate.
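The gap between the assumed and observed rates is simple arithmetic, and worth setting out. The 80 and 20k figures are the ones quoted above; everything else follows from them:

```python
# The Impact Assessment's assumed SAO rate, versus the observed 2008-09 rate.
assumed_rate = 5 / 10_000   # SAOs per child per year, per the IA's authors

# The authors' own projection for the estimated EHE population:
for population in (40_000, 80_000):
    print(population, round(population * assumed_rate))   # 20 and 40 per year

# Observed: roughly 80 SAOs among the ~20,000 children on LAs' official lists
observed_rate = 80 / 20_000
print(observed_rate / assumed_rate)   # observed rate is 8x the assumption
```

So even before considering the new Bill's changes to the SAO framework, the baseline rate used in the costing is out by a factor of eight.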
But in any case, the current annual total of SAOs can’t be used to predict the likely total if the new Bill were to become law. In the current draft of the new Bill, the legal framework around SAOs would change radically, making them (if I understand correctly) significantly easier for LAs to deploy and defend.
It’s probably not possible at this stage to predict how LAs would in fact use the new SAOs, but under the proposed new system, the total could easily leap to a much higher level, for two main reasons:
Currently, LAs don’t make as much use of SAOs as they supposedly ought to: of the children they claim are getting no education, it appears that fewer than a quarter have been given a SAO.
New-style SAOs could be issued purely on the basis of “paperwork shortcomings”, rather than being a response specifically to educational concerns.
It’s likely that most SAOs would be against the parents’ wishes, so a related missing cost here is for appeals processes. A draft structure for the appeals process has not yet been published.
Key point: SAOs and appeals against them are potentially a large cost, which has not been included.
A “false positive” is an indicator of a problem, where no problem in fact exists. (An example would be a positive result from an HIV test, where the person did not in fact have HIV.)
Sending LA staff to inspect children in home-based education is likely to throw up some “false positive” results for both abuse and poor education, which would need further investigation. Unfounded concerns about abuse or neglect would divert Social Services time from children who really need it; unfounded concerns about poor education would potentially get children sent back to school when their home-based education was in fact serving them well.
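The scale of the problem follows from the base-rate effect: when genuine problems are rare, even a fairly accurate inspection regime flags more well-served families than troubled ones. The numbers below are invented for illustration only; none come from the Impact Assessment:

```python
# Hypothetical base-rate sketch. All numbers are invented for illustration.

population = 40_000          # EHE children inspected
prevalence = 0.02            # assumed true rate of genuine problems
sensitivity = 0.90           # chance an inspection flags a genuine problem
false_positive_rate = 0.05   # chance it wrongly flags a well-served child

true_problems = population * prevalence
true_positives = true_problems * sensitivity
false_positives = (population - true_problems) * false_positive_rate

# Cases flagged for follow-up investigation; most turn out to be unfounded
print(round(true_positives), round(false_positives))
```

With these made-up rates, unfounded cases outnumber genuine ones by nearly three to one, and each unfounded case still consumes Social Services or LA time. The real ratio would depend on the actual prevalence and inspection accuracy, neither of which the Impact Assessment estimates.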
The Impact Assessment includes no acknowledgement of this issue and no costings for it.
Key point: “False positives” for abuse and/or poor education would generate costs which have not been included.
It’s not clear how the authors arrived at their conclusions about the costs of staff training. However, as the DCSF has consistently underestimated the shortfall in expertise in respect of (a) autonomous/child-led education and (b) children with special educational needs, it seems likely that their figures for training are also underestimated.
Key point: An inadequate budget for training and recruitment would undermine any chance of the scheme doing good at all. Caution required.
The Impact Assessment uncritically incorporates wrong statistics on CPPs (Child Protection Plans) and NEET percentages (Not in Education, Employment or Training).
There are several other places where the authors’ assumptions are either unclear or clearly wrong (e.g. in assuming that EHE is always planned using frameworks or curricula similar to school).
My earlier longer article gives references for all figures and some of the other assertions… plus more opinions :-)
(But, actually, I think there’s the potential for yet another iteration of this explanation, succinct-ish like this one but more thoroughly referenced than either. Open to offers of collaboration…)
Only one footnote today…
1. Dr Ben Anderson’s review of the stats used to produce the financial predictions: http://docs.google.com/View?id=dfjpcgdp_279fjczvdx.