Measure what matters: Ranking universities in the age of pandemic

Universities are caught in the crossfire of the pandemic. While the value of their research has never been more evident, the quality of online learning is questioned. How will university rankings respond?

It is the best of times and the worst of times for universities. On the one hand, their researchers have won acclaim for the rapid development of vaccines. On the other, many students are unhappy with campus closures and online learning. So, when the COVID-19 dust settles, what grades will each university get?

That partly depends on how university ranking systems ultimately factor in all this upheaval. And, although no one yet knows the answer, how these self-appointed university assessors handle the challenge will also shape their own future. For years, a small group of private ranking organisations, from Shanghai to New York, has been collecting and publishing data on university performance. They are often criticised, most frequently by lower-ranked universities. But they are never ignored, whether by university administrators, prospective funders, students or academics.

So far, the ranking industry is doing its best to cope with the fallout of COVID-19. In the UK, for example, QS World University Rankings, one of the largest university ranking organisations, which published its 2021 table this week, has not yet changed its methodology to reflect shifts in research and education, a spokesperson confirmed.

There are two reasons for this: first, the data used to compile the ranking are gathered over a five-year period; second, the pandemic's impact on higher education is still unclear. The company also has a policy of refraining from hasty methodological changes, as these could compromise the year-on-year comparability of its rankings.

But QS may change in the future. "Naturally, we understand that if the accepted ways of delivering higher education change in the long term as a result of this pandemic, then perhaps our methods of evaluating university performance should change accordingly. However, it is too early to make such decisions," the spokesperson said.

The composition of the QS top ten is unchanged, with MIT taking first place for the ninth year in a row. There was some minor reshuffling: University College London moved from 8th place in 2020 to 10th, while Imperial College London climbed one place from 9th to 8th. The University of Cambridge and ETH Zurich held their positions at 7th and 6th respectively, and the University of Oxford slipped from 4th to 5th.

Among the leading US universities, Stanford retains second place and Harvard third. Caltech ranks 4th, one place higher than in 2020.

Monitoring COVID-19 disruption

On the other side of the Atlantic, leading ranking organisation US News is tracking the disruption to higher education caused by the COVID-19 pandemic. While the company does not plan to share its methodology until its annual Best Global Universities rankings are published this autumn, it is notably "tracking the extent, if any, to which COVID has impacted faculty publications, citations and collaborations," said Robert Morse, chief data analyst at US News.

Back in the UK, the Times Higher Education (THE) rankings are attempting to revise their methodology, with a particular focus on updating their approach to bibliometric data, which has come in for sharp criticism. THE first announced plans to reform its methodology two years ago, but at the start of the pandemic the company decided to postpone the overhaul until the storm subsided, with plans to update the 2023 world university rankings, to be published in September 2022.

By then, THE hopes, the impact of the crisis on universities around the world will be clearer, especially when it comes to the numbers of international students and staff, which account for 7.5% of the current methodology.

Problematic rankings

Even before the pandemic, university ranking organisations were seen by some as using outdated metrics and failing to keep up with the times.

“We are in the age of open science,” said Ellen Hazelkorn, joint managing partner at BH Associates, an education consultancy. “As we move increasingly away from those kind of very traditional metrics to ‘alt-metrics’ and open access publishing, that is not where the rankings are at.”

There is another problem: ranking bodies are self-appointed and answer to nobody. In 2016, the International Network of Research Management Societies (IORMS), which brings together research management societies and associations from across the globe, set up a working group to look into how to make research evaluation "more meaningful, responsible and effective."

“Independent oversight of the rankings is non-existent. What we wanted to do is basically develop a set of criteria by which the ranking agencies’ individual rankings could be assessed,” said Elizabeth Gadd, chair of the expert group, who is research policy manager at Loughborough University.

After long deliberation, the expert group set out four criteria against which to judge university ranking organisations: good governance, rigour of their methodology, transparency and whether they measure what matters.

The initial assessment of six global university ranking organisations found that none was up to par, with none scoring 100% on all four criteria.

For Gadd, the key criterion of the four is measuring what really matters. There are many metrics that do not say much about a university, such as how many Nobel prize winners have walked through its door, or how many international students it hosts.

Meanwhile, useful indicators, such as a university’s commitment to open science and to fixing the gender pay-gap, are often overlooked.

Current university rankings largely rely on two indicators: research performance and reputation. The former is based on bibliometric and citation data, the usefulness of which has been disputed. For example, the journal impact factor, which measures the average number of citations papers in a journal receive, says little about the quality of any individual paper.

Reputation, too, arguably says little about the quality of teaching or research. It rates a university's popularity, ensuring that the biggest, best-known institutions sit at the top of the league tables each year. A good reputation boosts the rank, and the rank boosts the reputation.

Ranking organisations collect a variety of such data, of varying degrees of meaningfulness, boil them down to individual indicators and produce a rank for each university.
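In schematic terms, most league tables reduce a handful of indicator scores to a single weighted composite and sort on it. The sketch below is a hypothetical illustration of that reduction, not any ranker's actual formula; the indicator names, weights and scores are invented for the example.

```python
# Hypothetical composite-score ranking: each indicator is assumed to be
# pre-normalised to a 0-100 scale, then combined with fixed weights.
# Names, weights and scores are illustrative only.

WEIGHTS = {"research": 0.4, "reputation": 0.4, "international": 0.2}

universities = {
    "Alpha": {"research": 92, "reputation": 88, "international": 70},
    "Beta": {"research": 85, "reputation": 95, "international": 90},
    "Gamma": {"research": 78, "reputation": 60, "international": 95},
}

def composite(scores):
    # Weighted sum of the normalised indicator scores.
    return sum(WEIGHTS[k] * v for k, v in scores.items())

# Sort descending by composite score to produce the league table.
table = sorted(universities, key=lambda u: composite(universities[u]), reverse=True)
for rank, name in enumerate(table, start=1):
    print(rank, name, round(composite(universities[name]), 1))
```

The collapse into one number is exactly what critics object to: two universities with very different indicator profiles can end up with near-identical composite scores, and the weights themselves are an editorial choice rather than a measurement.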

U-Multirank, the ranking launched six years ago by the European Commission as the EU's answer to the long-running arguments about league tables, scored the best in the assessment by IORMS. The CWTS Leiden Ranking, based on bibliometric indicators, was a close second.

CWTS Leiden also did the best in terms of rigour by avoiding the use of opinion surveys and in being open about the validity of its indicators.

None of the ranking bodies scored well for good governance and transparency, with QS World University Rankings taking a small lead in best governance practices.

In the end, none of the six ranking organisations met the standards outlined by IORMS. "I think that certain data about universities can be helpful if you are looking at one thing at a time," Gadd told Science|Business. "But the methods used to make those assessments need to be rigorous to ensure they are fair and meaningful. Otherwise, people are making all sorts of decisions based on this data, which are not the best decisions because they are not based on the best evidence."

Concentration of excellence

Some of the rankings, such as THE, date back to the 1980s, but at that time the focus was on compiling national rankings. The move to rank universities globally really took off in 2003 when the Chinese government set out to measure the impact of its increasing investments in university research, launching the Shanghai Ranking.

They may be charged with many shortcomings, but there is no denying global league tables are influential.  
For example, the poor standing of France’s universities in global rankings was one of the spurs for higher education reform and restructuring from 2007 onwards. One upshot was the merger of several institutions to create Paris-Saclay University, which in June 2020 moved into 14th place in the world in the Shanghai Rankings.

Similarly, India recently introduced Institutes of Eminence, a title given to research institutions the country wants to turn into world leaders.  But the latest QS league table shows the programme is struggling to yield results, with no increase in representation for India’s public Institutes of Eminence recorded this year.

In recent years, governments have issued a flurry of such policies in an attempt to better position their chosen universities. As a result, Hazelkorn says, "We have a concentration of excellence in a few sets of institutions and very little attention being spent elsewhere."

These policies, though flawed, perhaps made sense in times of growing globalisation, but today we live in a very different environment. "The demands for what universities do and how they contribute and impact on society are now forefront," said Hazelkorn.

Universities are central to addressing societal challenges. Yet, what rankings measure “in no way bears any resemblance to the kinds of issues we are asking universities to deal with today,” Hazelkorn said.

Despite this, university rankings are a powerful influence that cannot simply be ignored. Gadd observes that university leaders have an uneasy relationship with league tables. "To ignore the rankings is financial and reputational suicide for institutions," she said. "So, we have to engage with them."

What about the pandemic?

The issues with university rankings have long been debated, and the research management literature on the topic keeps expanding. But last year, the global pandemic upended the landscape of higher education, forcing universities to rapidly adopt online teaching while international student numbers fell. As the comments from THE indicate, rankers are not yet sure if, or how, to react.

One long-term change is likely to be the COVID-19-driven acceleration towards digital learning. Hazelkorn says this switch has been a long time coming, but pre-pandemic universities were reluctant to move towards online teaching. "As a consequence, the pandemic is pushing everyone to look at teaching and the quality of what they are delivering. But there is a big difference between emergency online teaching and quality online teaching," she said.

Reduced student mobility could take a toll on university finances. Here, the inequalities between universities will continue to play a role. Wealthy institutions such as Harvard or Oxford University may weather the storm, but smaller universities may not be able to, possibly leading to growing inequalities. "No one is immune, but some of us are more immune than others," Gadd says.

It’s too early to say what the long-term impact of the pandemic on higher education will be, let alone on the way universities are ranked. “What we are finding with all sorts of research evaluation issues is that the pandemic is not going to impact in the next 12 months. It’s going to be impacting over the next ten years,” said Gadd.

Too big to listen

Last November, Gadd wrote an op-ed for Nature, outlining the findings of the group’s assessment of six top ranking organisations. Given the results are not particularly favourable to the rankers, she was concerned there would be a backlash against the study.

“We’ve been largely ignored, to be fair,” she told Science|Business. However, that proves the point: the ranking organisations recognise they are powerful global organisations. “They don’t need to listen to grassroots organisations like us,” Gadd said.

University ranking organisations are businesses. They may say they do not make money from the rankings, but their accounts are hidden. “I have tried, and my colleagues have tried, to understand the financial modelling. [They] refused to produce it,” Hazelkorn told Science|Business.

A larger issue, she says, is that these businesses hold immense amounts of university data, which should be public. Although extremely valuable, this "evidence" is kept behind paywalls, as are the methods for assessing it. For example, the Times Higher Education Sustainable Development Goals impact ranking is compiled internally, which makes independent verification of the results impossible.

“If rankings did adhere to our criteria, they could provide some useful data to the community,” Gadd said.

Yet, if the rankings were fair and measured one thing at a time, using indicators that are a good proxy for the thing they seek to measure, end users would not find them as interesting. Most are interested in a single figure, a point of reference, and do not have the patience to judge which indicators matter to them.

"There are ways to improve the rankings, but they would become less attractive to the end user who wants to use them as a lazy proxy for university quality. But it could happen," says Gadd.

Source: https://sciencebusiness.net/covid-19/news/measure-what-matters-ranking-universities-age-pandemic