Stefan Helgesson: A quality system worthy of the name

Words from the management: SU’s quality assurance system for research is, in my view, a nimble system that avoids the worst pitfalls of New Public Management.

Stefan Helgesson, Deputy Vice President. Photo: Sören Andersson

If you search for the words “kvalitetssäkring” (“quality assurance”) and “kvalitetssystem” (“quality system”) in the Korp database at Språkbanken, you learn that they entered Swedish usage in the 1990s. The terms are thus connected to the rise of New Public Management (NPM), with its emphasis on goal orientation and result assessment. The intention to guarantee an appropriate use of available resources is not bad in itself, but NPM’s downsides are all too familiar these days: increased bureaucratisation and attenuated trust (and self-confidence) in professional judgement. The ambition to run things more efficiently has often had the opposite result, as testimonies from the school and health sectors in Sweden show with depressing clarity.

As a higher education institution (HEI) and state agency, Stockholm University is also subject to the regime of quality assurance, with its undeniable NPM profile, that obtains in Sweden today. This regime was, however, tweaked when UKÄ (the Swedish Higher Education Authority) some years ago handed over the responsibility for quality assurance to the HEIs themselves. That shift is the background to SU’s two internal systems for quality assurance, one for education and the other for research.

On 11 March, there was a half-day conference at Albano focusing on SU’s quality assurance system for research. This is a topical theme, since the system has only now completed its first cycle; cycle number two has recently begun. The system has a few clear components: indicator reports compiled by the SU president’s office, comments on these reports from the departments, and dialogues between department representatives and the deputy vice-presidents.

One intention behind the indicator reports is to show how research is continually assessed for quality. The reports therefore present statistics on external funding, peer-reviewed publications, and so on. The dialogues then cover – among other things – recruitment, the department’s use of the basic allocation for research, and outreach and collaboration.

This is, in my view, a nimble system that avoids the worst pitfalls of NPM. By emphasising the dialogue component, all those involved have a chance to address, in a collegial spirit, what they see as the central issues in the research environment. The conversations are engaging and informative, with notes being taken for future reference.

In addition to these standard ingredients in the quality system, SU has a tool called “fokusutvärdering”, a limited external evaluation. As a rule, one or two such evaluations are conducted yearly, with the aim of assessing a particular environment or practice at SU. At present, an evaluation of centres – “centrumbildningar” – at SU is under way.

At the quality conference, two heads of department – one from each scientific area – reflected on their experiences of being externally evaluated. Such an evaluation requires some work on the department’s part, and care needs to be taken in the selection of evaluators. But by and large, the positive aspects of the exercise were underlined. When these external evaluations function as intended, they both produce valuable knowledge and strengthen the quality of our research. That makes them part of a quality system in the strict sense, and not merely a bureaucratic burden.

This text is written by Stefan Helgesson, Deputy Vice President. It appears in the section “Words from the University’s senior management team”, where the management take turns to write about topical issues. The section appears in News for staff.

Last updated: 2025-03-31

Source: Communications Office