Humble Brag: How Seriously Should We Take National Student Survey Results?

Linnet Humble is the Writing Centre Coordinator at St. Thomas University in Fredericton, New Brunswick.

In April, a Maclean’s article shared by a colleague on Facebook caught my eye. This colleague noticed our university ranked first in a particular category on Maclean’s second annual Student Survey. When asked if their university was helping them write clearly and concisely, 55% of St. Thomas University students strongly agreed and 31% somewhat agreed, placing our university at the top of the list for that performance indicator—ahead of other similar schools in the region, like Acadia and Mount Allison, as well as much larger schools from Ontario, such as Queen’s.

When I saw our university ranked first in an infographic related to writing, I let out a whoop and immediately reposted the article. Just as I was wearily approaching my year-end reports, here was some external validation—from a prominent national publication, no less! It was a shot in the arm for me as the Writing Centre Coordinator. Friends and colleagues offered their congratulations; I walked around for half a day feeling quite chuffed.

But I soon began to second-guess this good news. Did these results actually mean anything? Is Maclean’s a reliable source of information? Can surveys like this accurately measure our students’ writing abilities, either on their own or in comparison with students at other Canadian universities? And how much credit could the Writing Centre reasonably take for such scores?

To learn more, I contacted Garry Hansen, Director of Institutional Research at St. Thomas University. Hansen is responsible for collecting, analyzing, and ensuring effective use of university data, both internally, where it informs strategic decision-making at the university, and externally, where it is provided to agencies like Statistics Canada and Maclean’s.


Hansen cautioned me against placing too much stock in these survey results. In addition to lending his critical perspective on this particular publication, he was able to suggest other tools and organizations that can provide a more accurate perspective on our students’ writing skills and how they fare in comparison to others’.

First, though: a word on Maclean’s.

As anyone near a newsstand in this country knows, Maclean’s publishes an annual guide to Canadian universities, aimed at parents and prospective students, that ranks institutions based on 14 metrics, including the student-teacher ratio, the volume of library acquisitions, and the amount of research funding the university receives. Scores on these individual factors are then aggregated into an overall ranking for each university. Maclean’s has been publishing this guide since 1991, making it the longest-running ranking system of its kind in Canada. Its longevity would seem to add weight to its results. But as I encourage students to do when evaluating sources for their research assignments, I first questioned the authority and appropriateness of Maclean’s as a source of information on this topic. I wondered why a current affairs weekly would take it upon itself to comment on the merit of Canadian universities. Why are our students’ writing abilities being evaluated by a general interest magazine instead of, say, a trade publication like University Affairs?

Hansen informed me that the Maclean’s survey is not unique in its genre: south of the border, several news publications produce national rankings of this kind, most notably U.S. News & World Report, which the Maclean’s survey is likely modelled after. (The Globe & Mail also offers profiles of universities in its Canadian University Report, but stops short of ranking them.) And while those of us who work within post-secondary education (PSE) might question the authority of these non-specialized periodicals to judge our institutions, Hansen reminded me that those outside of PSE tend to view university education as a commodity like any other. For this reason, most readers would accept university rankings from a magazine like Maclean’s just as they would reviews of any other product or business.

That doesn’t mean we should take Maclean’s to be the Blue Book for BAs. As Garry Hansen explains, the Maclean’s rankings “lack the methodological rigour of consumer reports.” Unlike academic studies that explain their methods of statistical analysis in detail, Maclean’s doesn’t fully disclose how it calculates its rankings. According to Hansen, the magazine does reveal how a university has scored on some indicators, but it doesn’t share the formula for how these factors are aggregated and combined into an overall score.
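Since Maclean’s doesn’t disclose that formula, any illustration of it is necessarily hypothetical. Still, to make concrete what “aggregating indicators into an overall score” usually involves, here is a minimal sketch of a weighted composite: the indicators, ranges, and weights below are invented for illustration and are not a reconstruction of Maclean’s method.

```python
# Purely hypothetical sketch of how a composite ranking might aggregate
# indicators. The indicators, values, ranges, and weights are invented;
# Maclean's does not disclose its actual formula.

# Raw indicator values for a fictional university (arbitrary units).
indicators = {
    "reputation_survey": 72.0,              # mean rating from a reputational survey
    "student_faculty_ratio": 18.5,          # lower is better
    "research_funding_per_faculty": 110_000.0,
    "library_acquisitions": 41_000.0,
}

# Hypothetical min/max ranges used to normalize each indicator to 0..1.
ranges = {
    "reputation_survey": (0.0, 100.0),
    "student_faculty_ratio": (10.0, 30.0),
    "research_funding_per_faculty": (20_000.0, 400_000.0),
    "library_acquisitions": (5_000.0, 100_000.0),
}

# Indicators where a smaller raw value should score higher.
lower_is_better = {"student_faculty_ratio"}

# Invented weights (summing to 1.0); reputation deliberately dominates
# to echo the critique discussed below.
weights = {
    "reputation_survey": 0.40,
    "student_faculty_ratio": 0.25,
    "research_funding_per_faculty": 0.25,
    "library_acquisitions": 0.10,
}

def normalize(name: str, value: float) -> float:
    """Scale a raw indicator value to 0..1, flipping it if lower is better."""
    lo, hi = ranges[name]
    score = max(0.0, min(1.0, (value - lo) / (hi - lo)))
    return 1.0 - score if name in lower_is_better else score

overall = sum(weights[name] * normalize(name, value)
              for name, value in indicators.items())
print(f"Hypothetical overall score: {overall:.3f}")
```

Even in this toy version, whichever indicator carries the largest weight dominates the final score, which is the crux of the reputation critique that follows.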

The factors Maclean’s takes into account have also been criticized. Hansen surmises that reputation, even though its weight has been lowered slightly in recent years, is still “the largest single indicator in the ranking.” To assess a university’s reputation, Maclean’s surveys university administrators, high school guidance counsellors, and members of the business sector (e.g., CEOs, recruiters) for their perspective on the overall quality of education and R&D at various universities. As Paul Axelrod points out in an article for University Affairs, this makes it possible for the Maclean’s rankings to be based largely on “impressionistic and often ill-informed opinions about an institution’s performance or status.” Hansen informed me that this aspect of the ranking system also greatly benefits some universities over others: universities that have more ties to the corporate world—i.e., that are located within urban centres, that offer more professional programs, and that have industry partners—tend to rank higher simply because respondents in the private sector are more familiar with these types of institutions. And because the Maclean’s rankings have such a strong influence on reputation, the evaluation process becomes self-confirming: universities that rank high in Maclean’s become better known, and this enhanced reputation ensures high scores in future years.

Maclean’s has also been criticized for its focus on numbers—on factors that can be counted and tallied. Where it once relied on inputs submitted directly by the universities (e.g., the average grade of students entering university), in the mid-oughts Maclean’s changed its approach and came to rely instead on third-party data sources, such as Statistics Canada. This shifted the focus further toward quantifiable data, including the amount of external research funding a university receives, the size of its operating budget, and the number of citations faculty research receives. The change disadvantaged smaller universities with fewer faculty and students; it also disadvantaged schools that specialize in the arts and humanities, where research is largely text-based, doesn’t require expensive facilities, and therefore tends to attract lower-value Tri-Council grants. Hansen remembers this change in formula adversely affecting St. Thomas’s placement in the Maclean’s overall ranking circa 2007.

Adverse effects aside, Hansen says that any major shift in methods should raise a red flag. “If it’s a good tool, it shouldn’t be changing,” Hansen says. The results shouldn’t be changing, either: “At the end of the day, universities don’t change very quickly…So if you have a ranking that changes dramatically from year to year, there’s something wrong with the tool.” While these changes could represent an effort on the magazine’s part to improve its methods and arrive at more accurate assessments in response to stakeholder complaints, there may be other, less laudable reasons as well. Hansen wryly points out that “if [the ranking] doesn’t change much from year to year, you aren’t going to sell very many magazines.”

One change that few participating universities objected to, though, was the addition of the Student Survey. This survey was introduced two years ago in response to criticisms that Maclean’s focus on numbers did not capture the student experience. For this part of its study, Maclean’s canvassed students at universities across Canada, asking questions like how many hours each week they spent studying versus partying, how prepared they felt to enter the workforce, and so on. It was this 2016 survey of 17,000 students that yielded the results I had been so quick to share online—the survey that placed our students first in the country for writing ability.

Regardless of how relevant the other indicators are, surely this part of the Maclean’s ranking system must be useful, since it asks students directly about their academic skills and experience?

Well, no. Once I looked beyond the bar graph, I realized that this survey asks students to assess their own abilities: those abilities aren’t being objectively tested or measured in any way. As with any study that relies on self-perception and self-reported data, these results may be unreliable. Hansen points out that there may be a confirmation bias at work here, too. Students would expect their educational experience to make them better communicators: after all, it’s one of the selling points of a university degree, particularly in the liberal arts. In asking them whether their university was helping them write clearly and concisely, the survey was playing upon this expectation.

This last point raises another issue: the “apples to oranges” argument. As pleased as I was to see St. Thomas ranked #1, our standing likely does not reflect genuine superiority over the other schools on that list, since the institutions being ranked alongside us aren’t comparable. There are vast differences between these universities in terms of size and programming, differences that are flattened out by the Maclean’s system and given the appearance of a level playing field. It just so happens that these differences benefitted St. Thomas University on this part of the Student Survey. As students at a liberal arts university, our undergraduates have more opportunities than those in other disciplines to practice and improve their writing over the course of their degree. With smaller class sizes and marking loads (all courses here are capped at 60), our profs are also able to assign more essays and reports, with higher page counts. It makes sense, then, that students from our institution would rank higher on this type of question than students from a school with larger class sizes and a greater proportion of students in sciences and engineering. Considering these differences, Hansen says that the Maclean’s rankings would mean more if they compared individual programs instead of whole institutions.

Given all these flaws, why do universities like ours continue to take part in the Maclean’s survey? As it turns out, some larger universities in central Canada have stopped providing data after years of doing so, citing complaints about the ranking system, its lack of transparency, and their lack of control over the data. (These universities, including the University of Toronto, went on to form the Common University Data Set Ontario, which collects and publishes its own data from participating universities.)

As for schools like STU that continue to volunteer information to Maclean’s, I had assumed their participation must be recruitment-driven. According to the Maritime Provinces Higher Education Commission, university enrolment in this province (New Brunswick) has declined 13% over the last five years, from 22,246 full- and part-time students in 2011-2012 to 19,394 in 2015-2016. Over the same period, the number of students at my own institution has decreased by 20.5%. Within the context of declining enrolment at our school and in our province, and increased competition between universities over a dwindling number of university-aged residents, I figured free publicity like that offered by Maclean’s would be welcome.
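For anyone who wants to check that arithmetic, the MPHEC figures quoted above do work out to roughly 13%; a two-line calculation, using only the enrolment numbers already cited, confirms it.

```python
# Check the provincial enrolment decline reported by the MPHEC.
enrolment_2011_12 = 22_246   # full- and part-time students, New Brunswick
enrolment_2015_16 = 19_394

decline = (enrolment_2011_12 - enrolment_2015_16) / enrolment_2011_12
print(f"Decline over the period: {decline:.1%}")  # about 12.8%, i.e. roughly 13%
```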

Hansen admits that universities do “hold their noses and play the game” in part because the Maclean’s survey is so popular among readers, and because our absence from the survey might be more harmful than any score we might receive. However, Hansen says that it’s difficult to determine the extent to which these rankings actually affect our enrolment. We can only measure the impact of the rankings by asking students who come to our university whether exposure to Maclean’s influenced their decision to attend St. Thomas; obviously, there is no way of measuring the publication’s impact on prospective students who choose not to attend STU. Hansen also contends that though our inclusion in Maclean’s may influence the recruitment of international students who might not otherwise be aware of STU, these rankings do not have an impact on our overall enrolment, since so many of our students come from in-province, where St. Thomas is relatively well known. (The rankings may have a greater impact on enrolment for universities located in larger metro centres, where there is more choice and greater competition for domestic students among post-secondary institutions, or where universities have to attract more students from abroad through publications like Maclean’s.)

Instead of enrolment, Hansen identifies the desire to make data-driven decisions as the main reason for participating in such studies. This desire comes not just from upper administration but also from external stakeholders, including provincial governments, which want programming and budgetary decisions at publicly funded universities to be based on hard data. Though the tendency to use these results as key performance indicators is an unfortunate and unintended use of such data, universities also want to use this information to enhance the student experience and increase retention and recruitment.

To this end, our university actually participates in a number of other national student surveys, as I discovered—surveys that Hansen claims are better designed than Maclean’s and that might be more useful to me and to other Writing Centre practitioners.

Perhaps the best alternative to the Maclean’s university rankings, according to Hansen, is the National Survey of Student Engagement (NSSE). The NSSE began as a research study out of Indiana University that attempted to understand the relationship between retention and student engagement. It has a much larger sample size than Maclean’s: first piloted in 1999, the survey had grown by 2016 to include 560 institutions and 322,582 participating students. Its College Student Report surveys first-year and senior students at four-year US colleges and Canadian universities about their participation in activities that add to their learning and personal growth. In doing so, the survey aims to measure not just how engaged students are in their course work, but how often they take advantage of other resources and learning opportunities available to them while at university. The NSSE also asks students to provide a lot of quantifiable data (e.g., “During the current school year, about how many papers, reports or writing tasks of the following lengths have you been assigned?”), which provides a better snapshot of student activities. And while the NSSE does provide benchmarks that allow schools to compare their results to those of other schools, and while this data can be used by prospective students in their decision-making, the NSSE does not rank institutions the way Maclean’s does, making it more an information tool than a marketing tool.

Whatever advantages the NSSE might have in terms of its design and data collection, it doesn’t appear to be any more useful for Writing Centre practitioners than Maclean’s. Writing Centres are mentioned only in the context of promoting academic support services (“How much does your institution emphasize the following: Using learning support services [tutoring services, writing center, etc.]?”). And since I am not an administrator, I don’t have access to our university’s NSSE results and can’t drill down to view responses to specific questions like this. I can only share the overall finding that, of the 79 Canadian institutions that took part in the 2014 NSSE survey, “Students from St. Thomas were significantly more likely than their peers at other Canadian universities to report they had developed skills in writing clearly and thinking critically” (a result that doesn’t tell me much more than Maclean’s does).

A better alternative source of information for Writing Centre practitioners might be the Canadian University Survey Consortium (CUSC), which STU participates in occasionally. The CUSC is a group of Canadian universities that were independently conducting satisfaction surveys and realized it would be mutually beneficial to collect and share comparative data. According to Hansen, the CUSC hired Prairie Research Associates to design and conduct the joint survey, which was first administered in 1994. The CUSC now operates on a three-year cycle, alternating between surveys of first-year, mid-degree, and graduating students. Hansen prefers this survey to Maclean’s because it addresses one of the problems identified earlier: it provides good comparative data for our university, focusing primarily on the Canadian undergraduate student experience.

For the purposes of Canadian Writing Centres, the CUSC is more helpful than Maclean’s and the NSSE combined. Rather than asking students about their writing ability or about how their school promotes writing services, this survey asks about their use of and satisfaction with various facilities and services, including the Writing Centre. I was pleased to discover that in a survey of graduating St. Thomas University students in 2015, the CUSC found that “100% of students who used academic services for writing skills were satisfied or very satisfied.” Ever the pragmatist, though, Hansen suggests that I take glowing reviews like this with a grain of salt. He notes that satisfaction surveys like this one tend to provoke extreme responses. Those who take the time to answer questions modelled around a Likert scale (i.e., with answers that range from “very satisfied” to “very dissatisfied”) are often “motivated respondents”: that is, they tend to feel either very positively or very negatively about the services they have received. Hansen also points out that, as with any other exit survey, this one has an implicit sample bias: by including only graduating students, it limits feedback to those whose academic skills were sufficient to carry them through their degree program.

In the end, I’ve decided to focus my attention more on internal measures of success, drawn from the data we collect ourselves here at the Centre, rather than results from national student surveys. Like many other centres’, our year-end reports include data on how many students we’ve served this year, how many hours of tutoring we’ve been able to offer, our utilization rate, and so on. For those who are interested, these results were much more modest: there had been only a 7% increase in the number of students we served this year, and a 5% increase in the number of hours we offered. Our booking rate is fairly consistent with last year’s, at 87%, but despite our best efforts to discourage late cancellations and no-shows, our utilization rate continues to hover at a humbling 69%. While I’m certain these numbers would not place us at the top of the country on any national survey designed to measure and rank performance indicators for Writing Centres, they do give me a more accurate and detailed view of our operations.
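For readers curious how figures like these are typically derived, here is a minimal sketch under assumed definitions: booking rate as booked appointments over available slots, and utilization rate as attended appointments over available slots. The slot and appointment counts in the code are invented for illustration, and the definitions are my working assumptions rather than a standard every centre follows.

```python
# Minimal sketch of booking and utilization rates for a writing centre,
# under assumed definitions:
#   booking rate     = appointments booked   / slots offered
#   utilization rate = appointments attended / slots offered
# The counts below are invented for illustration only.

slots_offered = 1000          # tutoring slots available over the year
appointments_booked = 870     # slots reserved by students
appointments_attended = 690   # booked slots not lost to no-shows or late cancellations

booking_rate = appointments_booked / slots_offered
utilization_rate = appointments_attended / slots_offered

print(f"Booking rate: {booking_rate:.0%}")          # 87%
print(f"Utilization rate: {utilization_rate:.0%}")  # 69%
```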

I’m also happy to report that in our own survey of Writing Centre users, all but three respondents found their visits to the Writing Centre either very helpful (69%) or somewhat helpful (25%), and all but one found the staff either very knowledgeable/approachable (80%) or somewhat knowledgeable/approachable (18%)—although, thanks to Garry Hansen, I’m now rethinking the way I phrase and format these survey questions.

I’ll still include a link to that Maclean’s survey in my report, along with the results from the NSSE and CUSC—but I’ll be less quick to brag in the future.

Sources
“About NSSE.” National Survey of Student Engagement. Accessed June 3, 2017. http://nsse.indiana.edu/html/about.cfm
Axelrod, Paul. “The Trouble with University Rankings.” University Affairs, December 6, 2010. http://www.universityaffairs.ca/opinion/in-my-opinion/the-trouble-with-university-rankings/
Hansen, Garry. Interview by Linnet Humble. May 26, 2017.
Hutchins, Aaron. “Which Universities Best Prepare Students for Employment?” Maclean’s, April 17, 2017. http://www.macleans.ca/education/numbers-to-study/
Maritime Provinces Higher Education Commission. “Table 1: Total Enrolment by Province, Institution, and Registration Status.” October 17, 2016. http://www.mphec.ca/research/maritimeuniversitystatistics/enrolment.aspx
“What Students Say About STU.” St. Thomas University. Accessed June 3, 2017. http://w3.stu.ca/stu/futurestudents/whatstudentssay