Decode, Track, and Use Ranking Information
Abstract
Understanding the impact rankings have on universities, the Mānoa Institutional Research Office (MIRO) has made great efforts to track ranking data and communicate this information to people within and outside of the UHM community. Each year, when a new ranking is published, MIRO staff extract the relevant information and save it to the office’s centralized database. MIRO created an interactive data visualization web app to help university data users easily locate and use ranking information, so the university can use ranking data more efficiently in areas such as marketing, recruiting, and student/community engagement initiatives.
Introduction
Hawai‘i has a unique geographic location that connects the east and the west, and it has one of the most diverse cultural and racial compositions in the United States. It is one of the most beautiful places on earth and attracts people from all across the globe to study, teach, and conduct research. The University of Hawai‘i at Mānoa (UHM) is a world-renowned university with excellent teaching and research opportunities, and it is highly recognized by major international rankings such as U.S. News & World Report (USNWR), QS, Shanghai Ranking, and Times Higher Education.
Rankings, especially international rankings, can have major impacts on colleges and universities, including:
- Student and Talent Recruitment
- International Students’ Eligibility for Government Scholarships
- Work or Resident Visa Eligibility
- Graduate and Alumni Employability
- Choices of Institutional and Program Partnership
- Community and Alumni Engagement
- Public Support and Funding Opportunities
Because of how important rankings are to a university, institutional researchers are often asked by administrators to interpret and explain ranking results and methodology, and sometimes even to provide strategies to improve rankings. Over the past decade, MIRO has continued to gain knowledge and a greater understanding of university rankings by partnering with university colleagues and collaborating with major ranking agencies on a daily basis. MIRO taught itself how to address data requests from ranking agencies and ranking questions from offices and programs on campus, and the office has built considerable expertise through the new skills it acquired.
Our View Towards the Ranking Phenomenon
MIRO is often asked why UH Mānoa is ranked so differently by different rankings. For example, in 2020, UH Mānoa was ranked 60th in the United States by Times Higher Education and 62nd by QS. With U.S. News & World Report, however, Mānoa placed 109th in the Best Global Universities Rankings and 170th in the Best Colleges Rankings. This is because rankings use different methodologies and data sources. Showing how Mānoa is ranked by various rankings helps remind people that each ranking is a different lens through which to view the institution and its programs. MIRO constantly encourages the university to use its higher rankings for marketing and promotion efforts and advises people not to worry if the university moves up or down a few positions.
The variation in ranking placements can be caused by a number of factors. For example, the ranking method of a specific agency may have changed, or some universities may have decided to submit partial data (or not submit data at all). Another scenario is that the data submitted to ranking agencies were inaccurate. Even when ranking organizations decide not to use data provided by the university, major institutional changes, such as merging with another university or receiving significant government funding, can still affect the university’s standing.
It is important to be aware that a ranking is inherently contextual and relative to other universities; when one university’s rank changes, it can affect other universities’ rankings. In other words, even without major changes at a university, its ranking can still change if other universities’ situations change. Because of the many uncertainties that can affect a university’s ranking, it is important to recognize that ranking methods may vary depending on the ranking agency and the pool of institutions being ranked that year. That is why MIRO recommends using ranking results to promote a university’s strengths from a marketing perspective rather than as a means of deciding how to manage the organization or allocate resources.
Ranking methodologies include multiple data points (some of which IR offices do not even have access to), and how other universities report their data can also affect a university’s ranking. It became clear that trying to improve UH Mānoa’s rankings by examining the data was not necessarily the best way to use MIRO’s energy and time, so the office decided to focus on a different approach: tracking some of the most well-known rankings, making the data readily available and easy to access, and helping faculty and staff better understand rankings so they can use them for marketing and promotion.
For a public research university like the University of Hawai‘i at Mānoa, international rankings are more important than domestic rankings for two reasons: (1) in order to recruit students, researchers, faculty, and staff from around the world, the university needs to advertise where it stands in an international context; and (2) because international rankings mainly use research data in their methodologies rather than student profile data, UH Mānoa tends to be ranked better by international ranking organizations than by domestic rankings. This is why, when deciding which rankings to track and report as an IR office, MIRO chose to focus on a few international rankings (USNWR, QS, Times Higher Education, and Shanghai Ranking) that are well known and established. These ranking agencies have the resources to purchase data sources and hire staff to develop ranking methodologies, making them more credible and giving them more channels to reach a wider audience.
Sometimes universities find that their mission, vision, and values do not necessarily align with some of the important rankings. As a public university, UH Mānoa’s mission is to make the college experience more accessible to Hawai‘i residents. Some popular domestic rankings, such as U.S. News & World Report’s Best Colleges Ranking, use measures such as student selectivity and alumni giving rates that do not put Mānoa in a fair position to compete with universities that only admit the best-performing students and those from higher socioeconomic backgrounds. Hence, MIRO chooses to focus on international rankings that emphasize research excellence, so as to highlight Mānoa’s research competence while showcasing higher rankings.
Learning from this ranking-related work, the office developed a new approach to handling ranking-related questions and matters. MIRO created a ranking web page, began publishing an annual ranking Analysis Brief, developed a ranking data web app, helped the university publish dozens of ranking-related news stories, and conducted training seminars for faculty and staff. MIRO also collaborated with major international ranking organizations to develop positive relationships and learn more about their ranking decisions and changes.
Ranking Data Preparation
In order to be considered for certain rankings, an IR office must provide timely data for the agencies’ annual data collections. If an office misses the deadline or does not provide sufficient data, the university will either be negatively affected or not ranked at all for the following year. This is what happened to UH Mānoa: the university was not ranked by Times Higher Education for two years because the IR office at the time did not provide enough data.
Preparing data for rankings is not an easy task: information needs to be collected from many different offices on campus and must go through substantial data processing and analysis before an IR office sends it to the ranking organizations. To make reporting efficient and consistent, MIRO created SPSS syntax and Excel pivot tables that allow staff to quickly auto-generate standard ranking reports.
After examining the data requirements carefully, MIRO found reporting errors from earlier reporting cycles conducted by previous staff and was able to correct them; those errors may have affected the university’s overall ranking. For example, when reporting international students and faculty, MIRO’s previous staff did not count green card holders as international students/faculty. Although that is common practice in the United States, international rankings often count green card holders as “international.” Having two different reporting rules can lead to different counts and can potentially affect a university’s ranking status. To avoid these kinds of mistakes, MIRO went to great efforts to make sure all Mānoa offices involved in providing data clearly understood the data requirements of the different ranking organizations, and reminded them to be very cautious when extracting data and entering numbers. Before any submission, MIRO staff also check the data multiple times. In other words, preparing data for different rankings not only requires expertise and caution but also relies on interdepartmental collaboration to retrieve the necessary data.
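To illustrate the kind of standardized tabulation this involves, the short Python sketch below reclassifies permanent residents as international before pivoting headcounts into a reporting table. It is only a minimal illustration with hypothetical field names and made-up rows; MIRO’s actual workflow uses SPSS syntax and Excel pivot tables against institutional data.

```python
import pandas as pd

# Hypothetical student-level extract (field names and rows are illustrative only).
students = pd.DataFrame({
    "level":       ["Undergraduate", "Undergraduate", "Graduate", "Graduate"],
    "citizenship": ["US Citizen", "Permanent Resident", "International", "US Citizen"],
    "headcount":   [1, 1, 1, 1],
})

# Many international rankings count green card holders as "international,"
# unlike common U.S. domestic reporting practice.
intl_statuses = {"International", "Permanent Resident"}
students["ranking_status"] = students["citizenship"].apply(
    lambda c: "International" if c in intl_statuses else "Domestic"
)

# Pivot into the shape a ranking questionnaire typically asks for.
report = students.pivot_table(
    index="level", columns="ranking_status",
    values="headcount", aggfunc="sum", fill_value=0,
)
print(report)
```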
Cost-Efficient Ways to Improve Rankings
Many people often wonder how they can help improve their university’s rankings. Most ranking agencies provide paid services to help institutions improve their rankings, but the consulting fee is not cheap and universities’ budgets might be tight, especially after a global pandemic. MIRO chose to find other strategies to improve ranking without impacting the university’s budget. These strategies were developed from MIRO’s ranking experience and from what staff members learned at ranking conferences.
As mentioned before, providing data to ranking agencies in a timely manner requires collaboration with many offices across campus. A university’s leadership can help by emphasizing the importance of this task and by encouraging offices to respond and cooperate when IR collects the data from them. Universities may also improve rankings by creating a template or various guidelines on how to give appropriate credit when publishing papers or books.
When ranking calculations are made, research data is often pulled using the university’s name. It is fascinating how many variations of a university’s name appear when faculty, researchers, and students publish their work. Most popular ranking agencies are equipped to track variations of university names, but they may not include all possible variations. Therefore, a university might not receive credit for a research outcome if a variation of its name is not included in the ranking’s database. Some universities have established specific requirements on how the university’s name should be written, even specifying the situations in which a researcher should credit the university. For example, if researchers publish a paper after they no longer work at the university, they may be required to credit the institution where they did the majority of the research. Universities can adopt specific policies to ensure that they receive the credit they deserve and that all research outcomes are reflected in their rankings. Implementing such a policy, however, is beyond the scope of an IR office; institutional researchers can make recommendations, but it is more appropriate for other offices or officials to create such guidelines.
Another action universities can take is to maintain and expand a list of contacts for academic and employment reputation surveys conducted by ranking agencies. Some popular rankings heavily weigh reputation surveys in their ranking methodology; they even allow universities to submit a list of people they would like the reputation surveys sent to. It is assumed that people on a university’s list might have stronger ties with the university and are more likely to fill out the reputation surveys and respond in favor of the university. However, maintaining this list and maintaining the relationships with people on the list is beyond an IR office’s scope and capacity. Although it is a debatable gray area whether or not universities should invest more time and resources to maintain the list and relationships, it’s simply worth mentioning as a way to improve university rankings.
Not all universities invest the same levels of time, money, and effort into rankings, which may lead to different ranking results and makes it difficult to predict and manage a university’s ranking position. That is why we at MIRO chose to spend more energy on tracking ranking data and helping university officials use ranking results rather than on figuring out why rankings increased or decreased.
We learned from our interactions with ranking agencies that if universities are dissatisfied with the ranking results and sincerely believe there is a data discrepancy, they may request a data comparison and verification with the ranking organization. However, this is extremely time-consuming, and universities may not have access to the expensive research databases from which the ranking organizations purchase their data. Also, based on MIRO’s past experience, fewer and fewer ranking organizations agree to accommodate such requests, so this may no longer be a practical approach.
Communicating Ranking Results
Beyond providing data to ranking agencies and tracking ranking results, MIRO has started investing more time in communicating ranking data to the UH Mānoa community and the general public. These efforts can benefit the university in various ways while also having long-term impacts.
One of UH Mānoa’s major branding challenges is the “tourism” image of Hawai‘i that is so deeply ingrained in popular culture. Many people do not associate Hawai‘i with world-class teaching and research excellence, which is why international rankings can be extremely helpful: they compare universities in a global context. A few major rankings, such as U.S. News & World Report, Shanghai Ranking, QS, and Times Higher Education (THE), are highly acknowledged by the general public, including students, parents, and even government agencies. UH Mānoa is ranked exceptionally well by these established rankings. For example, as of 2021, QS and Times Higher Education have both ranked UH Mānoa around 60th in the United States, and THE has ranked Mānoa among the top 250 universities in the world. These recognitions can help UH Mānoa move beyond the “tourism” image often associated with the state and showcase Mānoa as a world-class university to international audiences and constituencies.
When choosing schools and employers, people want more information about the university and may search for ranking information on their own. They may find all kinds of rankings and make decisions based on their own interpretations. It is important to be proactive by making the ranking information we want to emphasize easily accessible on the university’s web page, along with explanations and interpretations, so as to minimize the negative impact of rankings found at random.
Creating a Ranking Webpage (Strategy 1)
MIRO’s first strategy for sharing ranking information was to create a centralized place to publish ranking data; that is why there is a separate section of the web page specifically dedicated to rankings. This page houses and publicly tracks ranking information, making ranking statistics easier to access for both internal and external users. There are numerous existing rankings, and new ones are published frequently. MIRO does not have the capacity to track them all, so we focus on a select few of the most well-established rankings. Too many rankings can also overwhelm an audience, so focusing on fewer benefits the office and its users in the long run.
When thinking about how a university is ranked, people often refer to the overall ranking. MIRO took all the information pertaining to UH Mānoa’s national and international rankings, summarized it, and placed it into one centralized table (see Figure 4). This table is very helpful in demonstrating Mānoa’s research profile, and MIRO frequently includes it in presentations, demonstrations, and publications about the university.
Tracking ranking information in a timely manner can be tedious and often requires specific knowledge and expertise. To reduce the burden on colleges and departments, MIRO created a template to consistently track and easily share ranking results. Each ranking published on MIRO’s website follows the exact same format, including an overview of the specific ranking and a link to the ranking agency’s website. The general introduction also includes the number of universities ranked during that period because this is a commonly asked question. Since ranking agencies release their rankings at different times throughout the year, MIRO lists the release date as a reminder of when each ranking comes out and when the page should be updated.
First on the list are institutional rankings, separated into national and global rankings. The overall rankings themselves, however, do not always provide a comprehensive picture of a university’s academic and research strength. At UH Mānoa, we always emphasize the importance of subject rankings alongside overall rankings by conducting presentations and publishing ranking-related news stories. Subject rankings are more helpful for the marketing and promotional needs of colleges, departments, and programs. For Mānoa, having a wide variety of disciplines ranked is an important strength: it means students have more world-renowned majors to choose from, and it signals that the university has greater potential to conduct high-quality interdisciplinary research.
Other valuable resources MIRO created while collaborating with major ranking agencies are located at the bottom of MIRO’s ranking webpage. These include materials from previous presentations (video and PowerPoint) and Q&A documents that address ranking questions raised by institutional researchers and other higher education professionals. These documents save users time searching ranking websites for answers and, most importantly, help them deepen their understanding of university rankings.
Tracking Ranking Data (Strategy 2)
MIRO publishes the most up-to-date ranking data on the Rankings web page, but sometimes previous years’ ranking data is needed to answer inquiries about how UH Mānoa’s ranking has changed. Some rankings, like Shanghai Ranking, show records of previous years’ rankings on their websites, but others do not. Therefore, if we do not keep these records ourselves, we cannot access that information when needed. Meanwhile, many offices and programs also need to locate specific ranking information, and delving into each ranking’s website to search for it is quite time consuming. Knowing these various needs, MIRO started brainstorming ways to share the rankings we were already tracking so that other offices would not have to spend extra time duplicating the work.
MIRO started tracking the ranking information internally using an Excel spreadsheet. As time went by and more ranking data was tracked and collected, the spreadsheet was no longer efficient or sufficient.
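For context, a structured “long” layout along the following lines keeps multi-year records easy to filter and extend as the collection grows, which a wide, hand-edited spreadsheet does not. The sketch is in Python, and the field names are hypothetical rather than MIRO’s actual schema; the sample rows reflect results mentioned elsewhere in this article.

```python
import pandas as pd

# One row per ranking result; field names are hypothetical, not MIRO's actual schema.
rankings = pd.DataFrame(
    [
        (2021, "Times Higher Education", "World University Rankings", "World", "201-250"),
        (2021, "Times Higher Education", "World University Rankings", "U.S.", "60"),
        (2020, "QS", "World University Rankings", "U.S.", "62"),
    ],
    columns=["release_year", "agency", "ranking_name", "scope", "rank"],
)

# A long ("tidy") layout stays easy to filter as more years and rankings are added.
print(rankings[rankings["agency"] == "Times Higher Education"])
```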
Eventually, the office created the UHM Rankings web app to help ourselves and other data users quickly locate the desired ranking information. For example, the Office of Communications often needs previous years’ information for ranking-related news stories; staff there can easily access the data by using the UHM Rankings web app.
This web app has multiple filters that aim to address Mānoa data users’ specific questions. If no filters are selected, users see a full report containing all the ranking data that MIRO tracks. Once a report is generated, a navigation bar appears in the top left corner of the web page and remains visible as users scroll up or down the report. This feature is very useful, especially when the report is long. The navigation bar is organized into three layers: (1) MIRO’s top ranking agencies (Times Higher Education, QS, ShanghaiRanking, and U.S. News & World Report), (2) ranking names, and (3) institutional and subject ranking comparisons. Although many other ranking agencies publish numerous rankings, MIRO decided to focus on these four major agencies.
The second layer is the ranking names. MIRO tracks the World University Rankings from the four ranking agencies, plus the Best Graduate Schools Rankings and the Best Colleges Rankings from U.S. News & World Report. The third layer distinguishes between institutional rankings and subject rankings.
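As a rough sketch of how such a three-layer grouping can be built, the snippet below nests records by agency, then ranking name, then comparison type. The helper function and sample records are illustrative only, not MIRO’s actual implementation.

```python
from collections import defaultdict

def build_navigation(records):
    """Group ranking records into the three layers described above:
    agency -> ranking name -> comparison type (institutional or subject)."""
    nav = defaultdict(lambda: defaultdict(set))
    for rec in records:
        nav[rec["agency"]][rec["ranking_name"]].add(rec["comparison"])
    return nav

# Illustrative records only; the actual web app is driven by MIRO's tracked data.
records = [
    {"agency": "Times Higher Education", "ranking_name": "World University Rankings",
     "comparison": "Institutional"},
    {"agency": "Times Higher Education", "ranking_name": "World University Rankings",
     "comparison": "Subject"},
    {"agency": "U.S. News & World Report", "ranking_name": "Best Graduate Schools Rankings",
     "comparison": "Subject"},
]

for agency, names in build_navigation(records).items():
    print(agency)
    for name, comparisons in names.items():
        print("  " + name + ": " + ", ".join(sorted(comparisons)))
```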
In the primary table of the report, users can view multiple years of data from different rankings. Sometimes people want to see a university’s ranking history, but not every ranking agency shows historical data on its website; the Rankings web app incorporates multi-year data that MIRO pulls annually from each ranking. In addition to tracking how UH Mānoa is ranked globally, MIRO also includes the university’s ranking position in the United States. In the 2021 Times Higher Education ranking, for example, Mānoa was ranked 201-250 in the world and 60th in the U.S.
In addition to institutional rankings, MIRO also tracks subject rankings. Many subjects at Mānoa were ranked by ranking agencies, which demonstrates UH Mānoa’s strength in many academic disciplines that are recognized internationally.
To select new filters, users do not have to scroll down to the bottom of the page; instead, they can click the “Change Search Criteria” function above the table to quickly jump to the web app filters.
The first filter is the “release year”: if users want to know the ranking results released in a specific year, they can simply select the desired year, or select multiple years to see more data for comparison. This can be confusing, however, because ranking agencies name their rankings differently: some label the ranking with the year it was published, whereas others label it with the following year. To keep ranking data consistent in MIRO’s web app, we decided to file each ranking under the year it was actually published, regardless of the year label the ranking agency uses.
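A minimal sketch of this convention, using a hypothetical edition label and publication date:

```python
import datetime

def normalize_release_year(publication_date):
    """File a ranking under the year it was actually published,
    regardless of the edition label the agency uses."""
    return publication_date.year

# Hypothetical example: an edition the agency labels "2022" but publishes
# in the fall of 2021 is recorded in the web app as a 2021 release.
print(normalize_release_year(datetime.date(2021, 9, 15)))  # -> 2021
```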
If users want to see Mānoa’s rankings from a specific ranking agency, such as U.S. News and World Report, they can choose from the filter “ranking agency” to see all rankings published by USNWR. Sometimes users may only be interested in a specific ranking published by a ranking agency, so they can use the filter “ranking name” and select the desired ranking, like the Best Graduate School Ranking.
The filters introduced thus far can help users extract the institutional ranking data and the ranking data for a wide variety of subjects. However, if users from particular academic units only want to know the rankings related to their department, they would still need to go through each table to find the answers. Additional filters therefore allow users to see data by college, department, major, and program.
Figure 8 shows the School of Ocean and Earth Science and Technology (SOEST) as an example. From the report, we can see that QS classifies Earth & Marine Sciences, Environmental Sciences, Geology, and Geophysics under Natural Sciences, whereas UH Mānoa houses those majors in SOEST. How ranking agencies define academic disciplines does not always match how universities organize their colleges and departments, so to help departments find relevant ranking information, MIRO linked the subject ranking data to the university’s programs, departments, and colleges using Classification of Instructional Programs (CIP) codes.
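The sketch below illustrates this kind of CIP-based linkage with a small join between a ranking agency’s subjects and campus programs. The CIP assignments and program names shown here are examples only, not MIRO’s actual crosswalk.

```python
import pandas as pd

# How a ranking agency's subjects map to CIP codes (illustrative assignments only).
subject_to_cip = pd.DataFrame(
    [
        ("QS", "Earth & Marine Sciences", "40.06"),
        ("QS", "Environmental Sciences", "03.01"),
    ],
    columns=["agency", "ranking_subject", "cip4"],
)

# How the university's own programs map to the same CIP codes (illustrative).
program_to_cip = pd.DataFrame(
    [
        ("SOEST", "Earth Sciences", "40.06"),
        ("SOEST", "Environmental Science", "03.01"),
    ],
    columns=["college", "program", "cip4"],
)

# Joining on the CIP code lets a college pull every subject ranking relevant to it,
# even though the agency's subject labels differ from the university's program names.
linked = subject_to_cip.merge(program_to_cip, on="cip4")
print(linked[linked["college"] == "SOEST"])
```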
Another strength of MIRO’s web app is that it provides as much relevant information as possible. It is very likely that some majors are not ranked simply because no such subject rankings exist yet. Instead of showing no data at all, academic programs might find it helpful to know how the university was ranked in a broader subject area even if their program is not included in a subject ranking. For example, there are currently no subject rankings that rank Chinese as a major, but when Mānoa data users select “Chinese” in the “major” filter, they will see how the university’s broader subject area of Arts & Humanities is ranked (see Figure 9). MIRO hopes this mapping approach provides more relevant and useful ranking information that would otherwise not be available.
In addition to the academic unit filters, MIRO also incorporates 2-digit, 4-digit, and 6-digit CIP codes into the web app as different levels of broadness for academic disciplines. CIP codes were originally developed by the U.S. Department of Education’s National Center for Education Statistics and are used to track and report fields of study and program completion activity. Because CIP codes are a universal way to classify academic disciplines in the United States, they are more helpful for linking to the various disciplines in the rankings than classifying programs by major, since universities organize their majors in very different ways.
Figure 10 shows the engineering-related ranking data retrieved when the number 14 is selected in the “CIP 2” filter. The report shows that QS considers Life Sciences & Medicine and Natural Sciences as part of engineering, whereas U.S. News & World Report includes Engineering and Natural Sciences. MIRO implemented the CIP code filters in its web app to provide a different angle from which to look at the academic disciplines reflected in the rankings.
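One way to implement the broader-area fallback described above is to walk up the CIP hierarchy, trying the full 6-digit code first and then its 4-digit and 2-digit prefixes. The sketch below assumes a simple lookup table keyed by CIP prefix, and its entries are illustrative only.

```python
# Illustrative lookup from CIP prefixes to ranked subject areas (examples only).
ranked_subjects_by_cip = {
    "14": "Engineering",        # 2-digit CIP 14 covers engineering programs
    "16": "Arts & Humanities",  # 2-digit CIP 16 covers foreign languages and literatures
}

def find_ranked_subject(cip6):
    """Try the full 6-digit CIP code, then its 4-digit and 2-digit prefixes."""
    for prefix in (cip6, cip6[:5], cip6[:2]):
        if prefix in ranked_subjects_by_cip:
            return ranked_subjects_by_cip[prefix]
    return None

# Chinese (CIP 16.0301) has no subject ranking of its own, so the broader
# Arts & Humanities ranking is shown instead.
print(find_ranked_subject("16.0301"))  # -> Arts & Humanities
```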
It may be challenging for programs and offices to explain the ranking data available in the web app, so MIRO provides published ranking stories as examples of how units can write their own ranking-related publications. Anticipating users’ needs, MIRO gathered UH news stories and placed them on the navigation bar for easy access (see Figure 11).
MIRO’s UHM Rankings web app not only tracks the ranking data from specific years, specific ranking agencies, and ranking names, but the app also extracts data for specific academic units or subject fields. This ranking tool allows UH Mānoa’s faculty and staff to quickly locate the ranking information that MIRO collects and tracks. This also helps save time when looking for information across scattered locations, gives users a bigger picture of the rankings rather than focusing on a specific ranking, and shares access to historical ranking information.
Improving Understanding about Rankings
Although MIRO shares data on the office website and has created useful web apps, that alone does not guarantee that people will know about or use the tools available. This is why MIRO collaborates with UH Mānoa’s Office of Communications to develop news stories on rankings, giving the whole campus community greater access and visibility. These news stories also help MIRO slowly but effectively tell the ranking stories the way the university wants to frame them. When choosing how to tell the stories, we recommend selecting the results that highlight the strengths of the university and its programs; it is also helpful to mention different rankings and subject rankings to help people look at rankings in context. By doing so, we can help prospective students realize that there are actually many rankings they can look at. Subject rankings can also be helpful for students when choosing schools and majors.
MIRO understands that UH Mānoa’s deans, department chairs, and other faculty and staff need to address ranking-related questions directly from students, so we make ourselves available for consultation. We use what we know about rankings to help our colleagues better understand the meaning behind the ranking results and to help them communicate the results with their prospective students.
Many offices, colleges, and departments often ask MIRO whether a specific ranking is worth promoting. If it’s not a popular ranking that MIRO already tracks, we usually suggest they wait and see how the ranking is accepted and promoted by other universities. If a majority of universities start using data from a new ranking agency, our office may give that ranking more consideration as well.

In addition to working with the Office of Communications on ranking stories when a new ranking is published, MIRO publishes a Ranking Analysis Brief on the office’s website. This brief summarizes major rankings published in the corresponding year, explains commonly asked questions about rankings, and sets the tone for how to interpret and use the ranking information.
Closing Remarks
While the office has done a lot of ranking-related work at the university and has established an expert-level role in ranking-related issues, none of MIRO’s staff had much knowledge about or prior experience with rankings when they first started. Our knowledge about rankings came from information available on the ranking agencies’ websites, our experience working with ranking data, and collaboration with colleagues within and outside our university.
Ranking websites contain a plethora of helpful information. MIRO usually looks for two things on a ranking website: (1) the data sources and methodology, which help us understand what the ranking emphasizes and whether the data sources are credible, and (2) the types of universities the ranking agency favors. Rankings that rely heavily on research data tend to rank research universities well, while prestigious universities are usually ranked better in rankings that weight student profile data.
MIRO also looks closely at, and questions, the characteristics of each ranking. Is it an institutional ranking or a subject ranking? Is it international or domestic? Does it focus only on graduate programs or only on online programs? If a ranking agency uses student survey results in its methodology (such as the WSJ/THE College Ranking), we look at or ask for the number of responses and the survey response rates because they help us understand how representative the survey data is. After collecting all the important pieces of information, we use them to advise our colleagues on whether the ranking is worth promoting. We also decide which rankings to track and how to display the ranking information in a structured way on our ranking webpage. Our goal is to help our viewers easily digest the ranking information that we choose to focus on.
For institutional researchers who are interested in establishing ranking expertise at their own organizations, MIRO has created a list of suggestions summarized from our own experience and the available resources.
- Create a ranking page to centralize your institution’s ranking information. MIRO’s ranking page can be used as a reference.
- Familiarize yourself with basic ranking information. The Q&A documents and recorded videos on MIRO’s ranking page can be very helpful.
- Once you’ve gained some understanding of your university’s rankings, offer informational or training sessions to different constituencies at your university and collaborate with your communications office to generate ranking news stories to share with the whole campus. It is also beneficial to link your ranking page at the bottom of the ranking stories so people know where to find more information.
- Last but not least, it is important to attend ranking conferences or virtual symposiums to stay up to date with the ever-changing field of university rankings.
It is not always easy to get people to tell their own stories using the data that IR offices collect. IR does not have the authority to tell others what to do, but we can try our best to make the data easier to access. We can help set the tone of the ranking stories by creating ranking reports and by listing and tracking the rankings that we believe are worth emphasizing. It is also beneficial to work with the communications office to consistently tell our own ranking stories. It takes time to infuse the ranking data tool, or any data tool that MIRO builds, into the campus culture and operations, but we believe that our efforts will eventually pay off. Although rankings have limitations and can often be controversial, they will continue to have tangible impacts on our personal and professional lives and will most likely remain relevant and play an important role in higher education for a long time. The best thing institutional researchers can do is to be prepared and stay informed on how to understand and utilize these rankings.