An Education Bureaucracy that Works

Steve Heyneman is Professor of International Educational Policy at Vanderbilt University.

Most ministries of education are situated in old buildings and work with outdated equipment and outdated people. The Department for Children, Schools and Families (DCSF) in London is different. Like others near the Houses of Parliament, the building's façade is Victorian. In this case, however, a decade of significant investment has altered everything on the inside. There are eight floors of color and glass. Waiting to be escorted upstairs, visitors are invited to sit in a Danish modern waiting room and are served cappuccino. All data sources are encrypted (no flash drives are allowed into the building). On each floor, 500 people work at 'hot desks' (books and personal papers are stored overnight in a personal locker), where each person uses whatever desk and computer are available. There are no walls, and no barriers separate staff, regardless of seniority. Scattered around are numerous spaces for meetings. Chairs come in three styles of 'laid back': a little bit, more, and a lot. Large glass seminar rooms (fully booked) have modern equipment. Instead of PowerPoint projectors, slides are stored at a common source and electronically called up on large, flat video screens. Recording and podcasting can be done from any meeting room. Coffee, tea, sandwiches, fruit, and soft drinks are available at sites on each floor. No one leaves at lunch. Meetings start (and stop) exactly on time. Schedules are circulated and followed.

I was treated first to a presentation by the Lead Schools Standards Advisor, who described the work of the standards group. He showed me charts with stick figures in five colors (representing percentages of English schoolchildren) that are used to explain to teachers and parents what the particular problems are for each group of students and how to fix them. The colorful stick figures appeared next to a large arrow pointing toward a list of problems and their proposed solutions. A great deal of attention was paid to making the explanations simple to interpret.

Each of the 4,000 secondary schools in Britain will be given an individual 'report card' this fall, which will be available online. Each school will be graded in six areas: (i) pupil academic progress (gain scores); (ii) pupil attainment (of particular academic goals); (iii) the narrowing of gaps between high- and low-performing pupils in particular categories (low SES, minority, gender); (iv) parent opinions of the school's quality; (v) teacher and staff opinions of the school's quality; and (vi) pupil opinions of the school's quality. On the day of my visit there was a debate over whether the six should be summarized into one grade and, if so, how the six items should be weighted. The balance of opinion seemed to favor giving each school a single summary grade, on the grounds that if it wasn't done by the DCSF it would be done in 10 minutes by the nation's many newspapers, and not necessarily done well or accurately.
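The weighting debate can be made concrete with a small sketch. Assume, purely for illustration, that each of the six areas is scored on a 0–100 scale and combined by a weight vector; the scale, the weights, and the letter-grade cutoffs below are all hypothetical, since the DCSF had not settled them at the time of my visit.

```python
# Hypothetical sketch of combining six report-card areas into one grade.
# The 0-100 scale, the weights, and the letter cutoffs are assumptions
# for illustration only; the DCSF had not decided any of them.

AREAS = ["progress", "attainment", "gap_narrowing",
         "parent_opinion", "staff_opinion", "pupil_opinion"]

# Example weights favoring the academic measures (must sum to 1.0).
WEIGHTS = {"progress": 0.25, "attainment": 0.25, "gap_narrowing": 0.20,
           "parent_opinion": 0.10, "staff_opinion": 0.10, "pupil_opinion": 0.10}

def summary_grade(scores: dict) -> tuple:
    """Weighted average of the six area scores, mapped to a letter grade."""
    total = sum(WEIGHTS[a] * scores[a] for a in AREAS)
    for cutoff, letter in [(85, "A"), (70, "B"), (55, "C"), (40, "D")]:
        if total >= cutoff:
            return total, letter
    return total, "E"

school = {"progress": 80, "attainment": 75, "gap_narrowing": 60,
          "parent_opinion": 90, "staff_opinion": 85, "pupil_opinion": 70}
print(summary_grade(school))
```

Any such scheme makes the trade-off visible: the moment the weights are fixed, so is each school's single grade, which is precisely why the choice of weights was being debated.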

The system is managed from four levels. At the center is the National Strategies Group for school improvement (about 300 staff members); in each of the nine regions there is a regional strategies group; in each of the 150 local authorities there is a local authority strategies group; and at each school there is a team made up of the headmaster, parents, teachers, and local business leaders.

Schools are divided into five categories: (i) 'gaining ground' schools, which have met standards but have only minimal gain scores; (ii) 'volatile' schools, whose performance varies significantly from year to year and from criterion to criterion; (iii) 'national challenge' schools, which meet neither the norm standards nor the gain-score standards; (iv) 'good to great' schools, which do not meet norm criteria but have high gain scores; and (v) 'great' schools, which have both high norm results and high gain scores. Both teachers and headmasters are judged by their success at moving a school into a new and more challenging category. Remuneration and promotion are influenced by their achievements, though it was not clear whether the relationship was direct or indirect.
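The categorization above reduces, at least in sketch form, to three judgments about a school: does it meet the norm standard, are its gain scores high, and is its year-to-year performance volatile. The function below is a hypothetical illustration of that decision logic; the actual DCSF criteria and thresholds are not given in this account.

```python
# Hypothetical sketch of the school categorization described above.
# The three boolean inputs and the decision order are assumptions for
# illustration; the DCSF's actual criteria are more detailed.

def categorize(meets_norm: bool, high_gain: bool, volatile: bool) -> str:
    if volatile:                      # performance swings year to year
        return "volatile"
    if meets_norm and high_gain:
        return "great"
    if meets_norm:
        return "gaining ground"       # met standards, minimal gains
    if high_gain:
        return "good to great"        # below norm, but improving fast
    return "national challenge"       # meets neither standard

print(categorize(meets_norm=False, high_gain=False, volatile=False))
```

The ordering matters: volatility trumps the other two judgments, since a school whose results swing wildly cannot credibly be called either 'great' or 'challenged' on one year's numbers.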

Prior to lunch I was introduced to the director of the national ombudsman office. What does your office do? We adjudicate disputes with government schools. We act like a court. The office has a half dozen 'judges' who try cases in the regions where they occur. The judgments can be appealed on the grounds of a 'faulty procedure,' but not on the grounds that the content of the ruling was faulty.

What are the most common cases that come before you? They fall into two general categories. One is over property: who actually owns the school land: the council, the central government, or a church? Often authorities may wish to use the land for purposes other than education, and that may generate a dispute. Since Jewish synagogues, Catholic and Anglican churches, and now Islamic mosques all run state schools, the rights of the property owners can get complicated.

The second most common category is fairness in admissions. A suit may be brought against a school for not adhering to the public interest (such as not admitting a proper proportion of students from low-income backgrounds) or to its own claim of how it plans to meet the public interest. In fact, two private schools today stand to lose their tax-exempt status because they have been accused of not sufficiently adhering to the public interest.

Over a lunch of quiche, sandwiches, fruit, and tea, I was introduced to the 'Narrowing the Gap' division. Its manager is responsible for narrowing the gaps for economically disadvantaged pupils, and she also has overall responsibility for gender gaps. Her responsibilities include the 'Extra Mile Project,' which began by studying schools in deprived areas that, for one reason or another, have performed significantly better than expected over time.

After lunch I was introduced to the staff at the DCSF and its agencies with responsibility for collecting and analyzing data from international comparisons. This included those responsible for the IEA (TIMSS) and OECD (PISA and IALS) surveys, and for the advisory boards of the UNESCO Institute for Statistics. The meeting was chaired by the Deputy Director for School Standards and attended by the team leader of the DCSF's Schools Research Team, the DCSF International Comparisons Programme Manager, the Director of Assessment and Statistics at the National Foundation for Educational Research (their main external contractor), and directors of the Training and Development Agency for Schools and the Qualifications and Curriculum Authority. International data are seen as 'drivers' for national and local reform in terms of teacher qualifications, curriculum improvement, and sources of comparative information on narrowing the gaps and on successful school management. International surveys were 'mined' for new ideas not available from domestic sources. The discussion covered issues of long-term institutional stability in UNESCO, IEA, and OECD, problems of inadequate or unstable financing, and poor technical supervision of non-OECD countries. (The fact that Kazakhstan was well ahead of Britain in the TIMSS results in science and math was treated as an illustration of possible corruption of the international data-collection exercise.) The discussion also included comparisons of TIMSS and PISA results. Britain had made progress over time on TIMSS but not on PISA. Because the former is curriculum-based, was this a sign that the British curriculum reforms and school management interventions were in fact working? On the issue of institutional responsiveness, it was felt that IEA was 'soft' and unable to manage effectively its many country members (often represented not by the ministry of education but by a university or research institute).
OECD was felt to be responsive, but the main contractor for the PISA study (the Australian Council for Educational Research) was thought to be 'overly protective' and hence resistant to change. The opinion was expressed that OECD should pay less attention to achievement and more attention to the 'future of testing in an environment of universal internet access.'

In the next meeting I was introduced to the director of the 'value for money' group, with responsibility for the distribution of resources and the efficiency of resource use. The director had studied history at university, then held positions at Arthur Andersen and several local authorities before being selected to direct the nation's education efficiency. He needed a haircut and was clearly brilliant. He explained that resources were distributed from the DCSF center to local authorities according to a formula. The formula included weights for housing prices, minority status, English-as-a-second-language learners, children with special needs, children who are entitled to free school meals, schools located in underprivileged rural areas or declining ('rusting') economic areas, and schools with high percentages of 'working class' pupils. This last category appeared time and again as the main issue for the ministry. In fact, with the possible exception of minority boys from the Caribbean, schools whose student population had shifted from white working class to minority were usually schools with high gain scores. The schools most likely to have low attainment, low gain, and low closing-the-gap scores were schools with high percentages of white working-class males. This appeared to be a special target, including for the value for money group.
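A weighted per-pupil formula of the kind the director described might look like the following sketch. The base amount, the factor names, and every weight value here are invented for illustration; the article does not report the actual formula.

```python
# Hypothetical sketch of a weighted funding formula of the kind described.
# The base amount and all weights are invented for illustration only.

BASE_PER_PUPIL = 4000.0  # pounds per pupil, an assumed figure

# Assumed per-pupil uplift for each factor, as a fraction of the base.
WEIGHTS = {
    "high_housing_cost": 0.15,
    "minority": 0.10,
    "esl": 0.12,              # English as a second language
    "special_needs": 0.30,
    "free_school_meals": 0.20,
    "deprived_rural": 0.08,
    "declining_economy": 0.08,
    "working_class": 0.10,
}

def allocation(pupils: int, shares: dict) -> float:
    """Total grant: base funding plus weighted uplifts.

    `shares` maps each factor to the fraction of pupils it applies to;
    factors not present are treated as zero.
    """
    uplift = sum(WEIGHTS[f] * shares.get(f, 0.0) for f in WEIGHTS)
    return pupils * BASE_PER_PUPIL * (1.0 + uplift)

# A 600-pupil school: 40% free school meals, 25% ESL, 30% working class.
grant = allocation(600, {"free_school_meals": 0.40, "esl": 0.25,
                         "working_class": 0.30})
print(round(grant, 2))
```

The design point such a formula embodies is that disadvantage is additive: each qualifying characteristic raises the per-pupil amount independently, so a school concentrating several forms of disadvantage receives a proportionally larger grant.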

Money is divided into two categories. Dedicated School Grants account for 90% of expenditures. Once the dedicated school grant money is distributed, each local authority re-distributes it to local schools according to its own funding formula. Is it possible that a local authority might distribute money according to a formula that counteracts the intention of the national formula? Yes, it is. What then can you do about that? We can monitor and, when we feel it necessary, intervene. Is that sufficient? No.

The remaining 10% of the money is called the School Standards Fund. This money is dedicated to specific line-item purposes that support each school's strategic plan.

Local authorities help support education through a local council tax. It accounts for 20% of the overall education financing in Britain, but this percentage is on the decline.

School management is graded according to whether schools meet 'national financial management standards' designed by the value for money group. These include standards of transparency and reporting accuracy. The school management committee is held accountable. The committee includes between 12 and 20 members appointed by the headmaster from among parents, teachers, and community leaders. In Britain, 600,000 people serve on school management committees. In the opinion of the director of the value for money group, these constitute 'a moral army' to oversee the budgeting and management process.

Schools in Britain have a budget surplus; they find it hard to spend the money they have been allocated. The amount of money 'banked' by British schools was over 'one thousand million pounds.' The money, often set aside for capital construction projects, earns interest. The worry was expressed that this money might be 'found' by another ministry and used for purposes other than education.

Britain has undertaken a national school construction project, Building Schools for the Future, which intends to construct 4,000 new secondary schools. The new schools will be modern, electronic campuses. But that is a story for another time.
