
Social mobility in Britain’s universities – the problems with UCAS’s figures


A shorter version of this post appeared on the Young Fabians’ blog on 29 September 2014. It was also published in print, as a feature article in the Winter edition (Issue 4, 2014) of Anticipations, the Young Fabian journal. 

In the wake of mid-August’s A-Level results, the Guardian’s education editor Richard Adams penned a particularly positive article on social mobility in Britain’s Universities, and specifically the spiffingness of UCAS’s enrolment data.

Seemingly basing his analysis entirely on UCAS’s recent figures on University admissions, Adams noted the spiffing fact that students from deprived areas “with historically low rates of entry to higher education” have been admitted to University in record numbers this year – 8% more have gained places than in the previous admissions cycle. Even more spiffingly, the gap between rich and poor has also narrowed to its lowest level yet. This is because:

“There has been a slight dip in the number of places filled by students from better-off regions of the UK, down by 1% this year. The combined effect [of this fall and the 8% rise for deprived students] means that students from better-off areas are still two-and-a-half times more likely to attend university than those from the lowest participation areas – but a steep improvement from 2012, when they were more than three times more likely.”

Obviously, this is really really lovely news. What truly places it in the upper echelons of the spiffingness-scale is the mere fact that this has been achieved in the grim context of, amongst other things, increased economic inequality, a growing social gulf between rich and poor in schools and, of course, a near trebling of tuition fees.

However, it’s precisely this grim context in which this feat’s been achieved that leads me to suspect things aren’t as spiffing as the Guardian and UCAS imply.

Fortunately, an absolute shedload of interesting reports have been written on this topic. Addressing the deficiencies in UCAS’ stats, they’ve analysed the issue in a wealth of different ways – looking, amongst other things, at actual enrolments in University as well as applications; mature students rather than just College/Sixth Form leavers; and outcomes after graduation, rather than just inputs. They’ve also defined “disadvantaged students” in different ways and – crucially – looked at the difference in participation levels within the University system, rather than just giving overall figures for the sector as a whole.

Having taken the liberty of reading a bit more widely on the topic, I must belatedly report that things aren’t as spiffing as the headline stats suggest – so much so, in fact, that I am going to desist from using the word “spiffing” for the remainder of this post (it’s an antiquated, “tinny” sort of word anyway, and should be purged from the English language). Four key facts are set out below, followed up (in the end) with some suggestions about what we could do to reform the system:

Fact 1: If you measure “advantage” / “disadvantage” in different ways, the rate of improvement is less profound

UCAS’s statistics do show that over the past decade, the number of “disadvantaged” students both applying to, and gaining admittance into, Universities has increased at a faster rate than for “advantaged” students – naturally, this means the “advantaged”/”disadvantaged” gap in applications and admissions has narrowed.

But how you define “advantaged” or “disadvantaged” really matters. At present, UCAS bases its measure of disadvantage on a data set called Participation of Local Areas 2 (POLAR 2), which splits the UK into equally-populated neighbourhoods and measures the proportion of their inhabitants who go on to University. These areas are then split into quintiles, with the least disadvantaged – those where the highest proportion of the population goes on to University – in Quintile 5 (POLAR 2 Q5); and the most disadvantaged in Quintile 1 (POLAR 2 Q1). UCAS knows the quintile of each student’s neighbourhood, and so can map differences in applications and admissions over time.
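(For the quantitatively-minded: here’s a minimal Python sketch of how this sort of quintile banding works. The areas and participation rates are invented for illustration – this shows the general principle, not UCAS’s or HEFCE’s actual methodology.)

```python
# Illustrative sketch only: invented areas and rates, not UCAS's or HEFCE's data.
# Areas are ranked by their HE participation rate and split into five
# equally-sized bands, from Quintile 1 (lowest participation) to Quintile 5 (highest).
participation_rates = {
    "Area A": 0.12, "Area B": 0.45, "Area C": 0.23, "Area D": 0.31, "Area E": 0.52,
    "Area F": 0.18, "Area G": 0.27, "Area H": 0.39, "Area I": 0.09, "Area J": 0.61,
}

ranked = sorted(participation_rates, key=participation_rates.get)  # lowest rate first
band_size = len(ranked) / 5

quintiles = {area: int(i // band_size) + 1 for i, area in enumerate(ranked)}
print(quintiles)  # Areas I and A land in Quintile 1; Areas E and J in Quintile 5
```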

The problem here is that the POLAR 2 data on these neighbourhoods was compiled in 2007. It’s a bit dated now (indeed it’s been superseded by POLAR 3, but UCAS haven’t shifted to the new measure because this would make it harder to draw comparisons with previous years). An excellent Channel 4 Fact Check article goes into detail on this problem. Channel 4 also show that if you measure advantage/disadvantage in different ways – say, using POLAR 3 data; measuring students’ parental social class; or simply whether students are privately or comprehensively schooled – then the pace of improvement over the past few years is less remarkable. Which measure of disadvantage we use becomes important when we consider Fact 2 below…

Fact 2: On some measures, the most competitive Universities are just as exclusive as they have ever been

UCAS’ statistics only analyse the University sector as a whole – they don’t break it down any further. It’s thus conceivable that the most selective, high-end Universities are just as socially-exclusive as ever; and that disadvantaged students have mostly just gone to the less prestigious institutions.

Well, it’s more than conceivable actually. Back in 2010, OFFA commissioned some analysis from HEFCE [1] which appears to show precisely this. Their findings are presented below.


Social Mobility in High, Medium and Low Entry Tariff Universities. Data compiled by HEFCE and presented in this OFFA report (p. 112).

HEFCE split all Universities into three groups – “higher”, “medium” or “lower” – depending on the average UCAS entry tariff points they require for enrolment on their courses. They use their own measure of disadvantage [2] and plot on the Y axis the ratio of “advantaged” to “disadvantaged” University participants in each year. Whilst, over the past fifteen years, the gap in participation has narrowed for medium and low entry tariff Universities – to the point that it’s almost completely dissipated at the low end of the University sector – it’s actually increased in the most competitive Universities. One good thing to say about this data, however, is that the gap at the top hasn’t risen from the 2005/06 academic year onwards – it’s only stayed the same.
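(The “gap” plotted in charts like this is simply the ratio of advantaged to disadvantaged entrants. A quick Python sketch with invented figures – not HEFCE’s actual numbers – shows the arithmetic:)

```python
# Invented figures for illustration only - not HEFCE's actual data.
# The participation "gap" is the ratio of advantaged to disadvantaged entrants.
entrants = {
    "higher tariff": {"advantaged": 6000, "disadvantaged": 1000},
    "medium tariff": {"advantaged": 5000, "disadvantaged": 2500},
    "lower tariff":  {"advantaged": 4000, "disadvantaged": 3800},
}

for group, counts in entrants.items():
    ratio = counts["advantaged"] / counts["disadvantaged"]
    print(f"{group}: {ratio:.1f} advantaged entrants per disadvantaged entrant")
```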

But this only takes us up to the 2009-2010 academic year. What about more recent years? Here – until HEFCE answer an FOI request I made asking for more recent data along the lines above [3] – I’ve had to rely on POLAR 2 data running only up to 2013, published in a report by the Independent Commission on Fees [4] – a body chaired by Will Hutton to analyse the impact of the Coalition’s £9k fees rise. Like the HEFCE study, it uses a ratio of advantaged vs. disadvantaged admissions to measure changes in social mobility, but it defines the top Universities as the “Sutton Trust 30” group of the most prestigious institutions. Their results are presented below (full tables are accessible in an Excel spreadsheet in my gallery):

Changes in Social Mobility – Sutton Trust 30 Vs. Other Universities. Source: Independent Commission on Fees. For the data in full, see the gallery.

You can’t really draw many conclusions from this data. It shows the trend in the most prestigious Universities broadly following that for the rest of the sector for three of the four years. We see a small rise in social exclusivity in 2011, followed by a fall thereafter. There’s a particularly dramatic fall in 2013 for the most exclusive institutions, but it remains to be seen whether this is an anomaly or part of a longer-term trend (we don’t yet have data for 2014).

So why are the most prestigious institutions more socially-exclusive, and why – at least according to the HEFCE data set – haven’t things changed at all in over a decade? Three reports – the aforementioned OFFA report; a 2012 report by Prof. Michael Brown for Centre Forum [5]; and a report by the Social Mobility and Child Poverty Commission, chaired by Alan Milburn [6] – suggest four interesting reasons.

Firstly, although disadvantaged students have entered University in record numbers, their A-level grades haven’t actually improved proportionally. This means they haven’t attained the grades necessary to get into the most prestigious institutions, which naturally ask for higher grades. The difference in A-level performance between advantaged and disadvantaged students has become more, not less, pronounced: the percentage of A-level A grades awarded to pupils from selective schools increased at twice the rate of those from non-selective schools. Private schools – teaching 15% of A-level students – take up 30% of A-level A grades.

Secondly, however, part of the problem is that many disadvantaged students who are good enough – some 4,500 according to OFFA’s 2010 report – don’t actually apply to the most selective institutions. Many cite a (mis)perception that it’s more expensive, whilst many simply say they wouldn’t fit into such an environment, away from their less academically-successful peers.

Thirdly and more worryingly, there’s some evidence that even when disadvantaged students do apply, they stand less of a chance of gaining admittance than their more privileged peers. Again, the OFFA report suggests there’s a “state school penalty” of one A-level grade: generally, a state school student would have to get one grade higher than their privately-educated peers to win the same place at a Russell Group University.

The fourth and final reason concerns the broader incentives prestigious institutions actually have to take on disadvantaged students. University league table rankings from the Times, Guardian and others base institutions’ positions heavily on admissions grades, degree outcomes and student satisfaction. But, the Centre Forum report argues, admitting disadvantaged students will tend to lower your average admissions grades, and students from poorer backgrounds also tend to report lower satisfaction levels and graduate with lower degree classifications. Universities thus have no incentive to actually take them in.

Fact 3: Figures focus only on outputs – not the actual impact Universities are having on social mobility and student outcomes

One of the key issues raised in the aforementioned Centre Forum report is that the way the Government, UCAS and press currently assess Universities is almost exclusively based on outputs rather than outcomes: how many students they educate from disadvantaged backgrounds, rather than the actual effect this has on their life trajectories.

A lot of the reporting from Government and the media implies that so long as disadvantaged students are entering Universities in greater numbers and narrowing the overall acceptances gap, this will automatically have an effect in terms of enhancing social mobility. This is a pretty big logical jump!

On the contrary, you could well argue that because a lot of these students are going to less prestigious Universities, they aren’t really getting an education that in any way changes their life trajectories. In addition, even those who do gain admittance to the high-end institutions aren’t necessarily given the necessary help, because – as noted before – those institutions currently have no real incentive to provide it, what with the way the League Tables system measures “success”. Disadvantaged students lack the soft skills and parental social connections to negotiate their way through the modern-day jobs market. It should be Universities which help provide them with these skills. Too often, Centre Forum suggests, they don’t, and Universities thus end up having little impact on disadvantaged students’ lives.

Does the evidence back up this claim? Well, in 2013 HEFCE produced a report [7] which analysed differences in outcomes – in terms of entrance into further study or professional jobs post-graduation – by social class, for just one year group of students. It found, unsurprisingly, that there was a gap. Sadly, with the lack of data for other years, we don’t know whether there’s been any commensurate improvement in bridging this outcomes gap in line with the increased participation of disadvantaged students in these institutions.


The professions are becoming less, not more, accessible – the parents of most modern-day professionals are richer now than they were in the post-war period. Source: Milburn’s 2009 report on fair access to the professions.

Looking elsewhere, however: in 2009 Alan Milburn produced a report [8] on the accessibility of the professions to people from disadvantaged backgrounds. It highlighted (see the figure above) that the professions have become less accessible since the post-war boom in social mobility in Britain. Most modern-day professionals come from richer backgrounds than they did in the post-war period.

Of course, these figures are for previous generations and it remains to be seen whether this problem will persist for the current generation. But the context – what with rising income inequality, increased child poverty and a much more impermeable professional labour market than in the post-war era – could hardly be bleaker. I suggest that the admittance of more disadvantaged students to Universities may well end up having little practical effect on the life chances of these people. The fact that there’s so little data on actual outcomes, though, just shows how low this is on the agenda of Government, UCAS and others – they’re far too content with publishing positive outputs data and then pretending it shows something it really, really doesn’t.

Fact 4: Mature student enrolment fell in England following the Coalition Fees Rise, and still hasn’t recovered

In the first year of admissions following the Coalition’s 2011 tuition fees rise, University applications fell, including amongst the most deprived students [9]; and there were fears they would continue to do so.

UCAS’s figures clearly show these worries were misplaced … for 18-year-olds. But they’re an imperfect measure of participation levels for mature students, who constitute a smaller slice of the overall admissions pie and often don’t apply via UCAS anyway. Helpfully, just last month, the Independent Commission on Fees provided data – running up to 2013 – on the change in enrolments for two age groups of mature students:

Mature Student Enrolments in England Still Down following Coalition Fees Rise. Source: Independent Commission on Fees (2014). For the data in full, see my gallery.

The number of mature students admitted to higher education increased by 50% from 2007 to 2010 [10]. But admittances fell sharply following the fees rise and still haven’t recovered to their pre-2011 levels. It’s worth noting that over the same period, mature student participation rates haven’t fallen in Scotland, Wales or Northern Ireland – this phenomenon appears to be specific to the English University system, which just so happens to be the one that charges £9k fees!

Conclusion: How can we turn this around?

Needless to say, there’s a limit to what Universities alone can do to foster social mobility. Income inequality is set to rise to unprecedented levels and our schools sector delivers drastically unequal outcomes for pupils – the Higher Education system is bound to reproduce these inequities. Nevertheless, a lot of interesting measures have been suggested to encourage Universities themselves to act as drivers of social mobility. The most promising options are outlined below:

(A) Create an Outcomes-Focussed Measure of University Success

Both Prof. Michael Brown for Centre Forum and Alan Milburn agree: the Government should take the initiative and set up its own progressive, outcomes-focussed League Table for Universities. Indeed, Brown even produces a ready-made index for them to use – the Social Mobility Graduate Index (SMGI).

To over-simplify it slightly, the SMGI ranks Universities according to how many graduates they get into professional employment or further study, but it gives an institution more “points” if they achieve these outcomes for an individual from a more disadvantaged background. To gain a higher SMGI ranking, more socially-exclusive institutions would have to admit more students from disadvantaged backgrounds.
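To make the mechanics concrete, here’s a toy Python sketch of a disadvantage-weighted outcomes index along those lines. The weights and cohort below are my own invented placeholders, not Brown’s actual SMGI formula:

```python
# Toy sketch of a disadvantage-weighted outcomes index.
# Weights and cohort are invented placeholders, not the actual SMGI formula.
QUINTILE_WEIGHTS = {1: 2.0, 2: 1.5, 3: 1.2, 4: 1.0, 5: 1.0}  # more "points" for lower quintiles

def toy_index(cohort):
    """cohort: list of (polar_quintile, achieved_professional_outcome) tuples."""
    points = sum(QUINTILE_WEIGHTS[quintile] for quintile, achieved in cohort if achieved)
    return points / len(cohort)  # normalise by cohort size

cohort = [(1, True), (2, True), (3, True), (5, True), (5, False)]
print(f"Toy index score: {toy_index(cohort):.2f}")  # 1.14 for this invented cohort
```

The key design choice is simply that a professional outcome achieved for a Quintile 1 graduate counts for more than the same outcome for a Quintile 5 graduate, so an institution can’t climb the ranking on well-heeled intakes alone.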

(B) Allocate more student places to better-performing, more socially-inclusive Universities:

This is a particularly radical proposal from Centre Forum. Currently, the Government decides every year how much money they want to allocate to Universities for places, and then HEFCE allocates places to each University [11].

There’s scope within this system, Centre Forum argues, for changing the formula so that more places are allocated to Universities which show better graduate outcomes, and which recruit more students from lower POLAR 3 quintiles – i.e. precisely the kind of thing reflected in the SMGI.

(C) Increase and reform Student Opportunity Fund allocations to Universities with disadvantaged students:

Milburn’s commission has also been consistently calling on the Government to review the whole system of financial support it provides to Universities through various schemes. Milburn makes a lot of sound proposals about a whole raft of separate funds, and suggests that a kind of Pupil Premium for Universities might be a way to pool all the money together and spend it in the most cost-effective way possible. Building on this Pupil Premium idea, I suggest my own potential reform below…

Break-Down of HEFCE Recurrent Funding Allocations for all Universities – 2014/15. Source: HEFCE. For full data, see the gallery.

Much (but certainly not all) Government funding to Universities is allocated through HEFCE – the Higher Education Funding Council for England. Presently 11% of their “recurrent funding” (i.e. the funding for programmes that carry on year-on-year, rather than for one-off plans) is allocated as so-called “Student Opportunity Funding” (see right for HEFCE’s 2014/15 allocations). Unlike other parts of HEFCE’s budget, this funding is allocated to Universities based on the number of disadvantaged students they have: institutions with more deprived students get more of the money. It’s supposed to help pay for the higher costs of teaching such students, and for outreach activities.
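(As a rough illustration of this headcount-based principle, here’s a short Python sketch. The pot size and headcounts are hypothetical, and this is emphatically not HEFCE’s actual funding formula:)

```python
# Hypothetical pot and headcounts - a sketch of the principle, not HEFCE's formula.
total_pot = 100_000_000  # hypothetical Student Opportunity pot, in pounds

disadvantaged_headcount = {"University A": 4000, "University B": 1500, "University C": 500}
total_students = sum(disadvantaged_headcount.values())

for uni, headcount in disadvantaged_headcount.items():
    allocation = total_pot * headcount / total_students  # share scales with headcount
    print(f"{uni}: £{allocation:,.0f}")
```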

The problem is that it’s not much in the context of HEFCE’s overall spend, and it’s also considerably dwarfed by the private endowments of the most prestigious institutions. Back in 2012, Prof. Roger Brown of Liverpool Hope University set out the problem in an article for the Guardian. He highlighted how the 24 Russell Group institutions alone possess 52% of the sector’s total net assets. Spend per student was £22k-£65k in the richest ten Universities, versus just £7k-£8k for the bottom ten.

Although I’m trying to obtain more data to further explore this issue [12], I think it’s pretty clear that the Opportunity Fund money couldn’t possibly act as a sufficient incentive for prestigious institutions to take up poorer students, in the context of their wider endowments. I also don’t think it’s nearly enough to give disadvantaged students a genuinely life-changing education – and there doesn’t appear to be much of a way to ensure Universities are spending what Opportunity Fund money they have in the most cost-effective way possible.

So I’d suggest an alternative. We could considerably increase Opportunity Fund allocations, but rather than giving the money direct to institutions, it could be drawn into a central pool for institutions to bid for. The most successful projects would benefit the most students (so naturally, institutions with more deprived students would get the lion’s share) and would give them a broader range of skills with which to gain professional employment. Crucially, it strikes me that the best way of funding such a Premium would be to apply some kind of financial penalty on the most socially-exclusive institutions. That way, they’re incentivised to become more socially-inclusive – and even if they don’t, they provide the University system with some capital to help give disadvantaged students the education they never got when they were at school.

Footnotes

  1. See Annex C of OFFA’s report What more can be done to widen access to highly selective Universities? (2010)
  2. They classify local areas into five groups based on the likelihood of individuals having an HE-qualified parent, with the most disadvantaged 20% of the population least likely (Quintile 1), and the most advantaged 20% most likely (Quintile 5). For their graph, they combine the number of applicants from Quintiles 1 and 2 together because not enough of them go to the most exclusive Universities to sufficiently guard against random error year-on-year; and use this to compile the ratio of deprived (Q1 & Q2) to non-deprived (Q5) entrants. This is not the methodology employed in the later study by the Independent Commission on Fees which I reference – who base their ratio on Q1 / Q5 – even though presumably they’d have come across the same problem.
  3. I made this FOI request to HEFCE asking them to provide the actual figures used as a basis for the above graph. I also asked for them to provide figures for future years, up to the latest available year.
  4. Independent Commission on Fees (2014), Analysis of Trends in Higher Education Applications, Admissions and Enrolments, p. 17
  5. Prof. Michael Brown, Centre Forum, Higher Education as a Tool of Social Mobility.
  6. Social Mobility and Child Poverty Commission (2014), Higher Education: The Fair Access Challenge.
  7. HEFCE (2013), Higher education and beyond: outcomes from full-time first degree study.
  8. Alan Milburn, Panel on Fair Access to the Professions (2009), Unleashing Aspiration, p. 20.
  9. Oddly enough, the fall was actually faster amongst the less deprived students. It’s worth noting that there were also falls in applications when Labour introduced tuition fees, and then again when they increased them.
  10. Source: the Independent Commission on Fees’ report.
  11. This will change in 2015/16, when the Government will remove the cap on places.
  12. I’m trying to obtain some figures from the Higher Education Statistics Agency (HESA) to get a better picture of how these spending allocations fit into the broader picture. Quite simply, I’ve asked HESA to provide me with figures on the undergraduate student population of each institution, so as to give some context to each institution’s allocation. I’ve also asked them to give me information on the private spending of each institution. However, as they’re a private body not subject to FOI law, I suspect they’ll tell me to shove it.

