
Actual numbers, an update

Our last post drew attention to the publicly available WGEA reports, in which large Australian employer organisations (including Australian universities) report their staffing mix as part of a national commitment to collecting and reporting data on workplace gender equity. This data doesn’t have casualisation as its focus, but casualisation is one of the factors reported.

It’s very detailed, and you can search for the reports from your own university here.

In our original post we also shared a work-in-progress spreadsheet pulling together information reported by different universities to see how they compare. The comparisons were so extraordinary that we took another look, and we had a follow-up conversation with researchers at the NTEU who are looking at the same data. The problem is that the very poor fit between the WGEA categories and university staffing is likely to create inconsistencies in how universities interpret those categories.

Today we’ve done some further checking against other sources, including Annual Reports, and it turns out we’d guessed wrongly about where universities were reporting the bulk of their senior academic staff. So we’ve taken the spreadsheet down to have another think, as we’re still chasing this up. In at least one case, most Level D/E academics are reported as Senior Managers, a very small number as Managers, and most other academics (including most casuals) below that level. But we don’t know if this is consistent across the sector. In fact, we now know we know less than we did before we started, despite this lovely tweet from the Research Whisperers yesterday:

In terms of our analysis, the spread of academic staffing data across categories certainly shifts the apparent ratio of casual to permanent staff—the very thing that we’re all curious about. And collapsing academic and professional managers into the same WGEA category makes clarifying this much, much harder to do.

The other question we’ve been asking since looking at the WGEA data: what does that reported number of casual staff itself represent, and do all universities report according to the same assumptions? Universities have a very complicated relationship to casual hiring in all categories, and in terms of academic staffing we deal with the slippery category of “adjunct” in the Australian context: professionals in other fields who take up some kind of university role, including in medical and other trainee supervision. Should we expect this to be a factor?

Overall, these complications suggest a significant correction to our optimism that we’d found a publicly available source of data that could be used by people like us—people without direct access to HR or organisational research divisions—to get a clear look at casualisation. And it’s strongly confirmed our suspicion that the common sense questions that people are asking about casualisation have generated no simple answers.

Working this out in public is a project we’re committed to. We want to demonstrate something about how Australia both counts and hides casualisation, which we have argued since we began CASA is a critical measure of three things: business sustainability, staff wellbeing and student quality of learning. As qualitative researchers, we really want to know: why is it so hard to get an answer to the question of casualisation as a proportion of the staffing experience in Australian universities?

So we’re reporting an adjustment to our first assumption about proportion, and we’ll be back with more reports on what we learn. This week’s key message: universities have tremendous capacity to collect and report data with clarity, so when they don’t, there’s a reason to be curious. The QILT dashboard is an excellent example of the way in which this kind of data can then be put to work in a public-facing way. If you want to compare performance in student satisfaction, graduate employability or other teaching indicators across a collection of up to 6 universities, right down to degree level, you can do so quite easily. So in the spirit of Christmas wishing, we’d really like to ask Australian higher education to put its data mining muscle to developing a public-facing comparison dashboard for university staffing so that we can all know what’s going on.

And on that note, no one has really bettered Donald Rumsfeld:

Reports that say that something hasn’t happened are always interesting to me, because as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns – the ones we don’t know we don’t know.

If you shared our post this week, please pass on this update.

Thanks!


About Kate Bowles

Now blogging over at musicfordeckchairs.com

Discussion

2 thoughts on “Actual numbers, an update”

  1. Hi Kate, An interesting post, and possibly no surprise that there are all sorts of blurry issues around the staffing mix, particularly regarding casualisation. Just an issue with reference to the QILT database. I have had to do some work on this, and QILT is not (repeat, not) a valid source of information and should be treated with caution. The data is cumulative, so that multiple responses are recorded for a particular year in a degree program, but are not necessarily representative of the selected cohort. If you look at the number of responses for a degree program at X university, you may find that there are more responses than actual students enrolled in that year! The QILT methodology document also notes that the data is not complete, nor entirely reliable. If the universities are being assessed (and compared) for their degree programs by students, using faulty methodology by the data-collectors – in an era where student experience is being assessed to the nth degree – we have real problems. So the murky categories being applied to staff (professional and academic) are really another example of trying to compare apples/oranges/grapes/peaches. I am curious regarding definitions (on casual/adjunct/fixed term/ongoing, levels, blended employment (eg casual teacher/professional full time, etc) as these will affect the data collection/collation process. Maybe I have missed it, but if you could put these up on the CASA website, I would be interested 🙂

    Posted by Mandalay | December 10, 2015, 5:42 pm
    • Thanks so much for this really helpful insight into QILT’s datasets. Our thoughts were really on the interface that QILT provided, but you’re absolutely right that dodgy data can easily be cleaned up by a nice looking set of charts and a useable search function. And in this, QILT has certainly won friends.

      Like you, we’re also curious about staffing definitions, particularly when individuals hold multiple contracts of different kinds, sometimes at different institutions, and institutions have a tendency to use different names for similar things.

      Although this blurriness offers some convenient cover to institutions who aren’t tremendously keen for the rate of casualisation to be made an issue, useable public data on staffing has to be a good goal.

      Posted by Kate Bowles | December 10, 2015, 9:49 pm
