Universities have been struggling to cope with ransomware attacks, as threat actors turn flaws in their administrative systems into attack vectors. High-profile breaches have recently brought the issue into sharp focus, vindicating experts who have called for the higher education sector to shore up its cyber-defenses.
This week, Lincoln College, an Illinois institution dating back to the nineteenth century, announced it was closing permanently, citing financial strain caused by COVID-19 and a cyberattack in December that wreaked havoc on its admissions process.
Meanwhile, another university, Austin Peay in Tennessee, suffered its own ransomware attack last month, forcing it to shut down its systems and temporarily suspend operations. Though at the time of writing it appears to be weathering the storm, students and other commenters were quick to voice their condemnation of the university’s apparent lack of preparedness.
“You would think a university would have ransomware protection,” noted one dryly on Twitter. “All revert to paper!” cried another sarcastically. This is the kind of publicity any university should dread, and underscores the damage cyberattacks can do not only to an institution’s finances and functionality but also to its wider reputation.
Industry analysts are unsurprised by academia's cyber woes. A report published in April by Sophos described the higher education sector as having the slowest recovery time from cyberattacks of any sector apart from government. Other experts have warned that university printers in particular present an exposed attack surface, because institutions do not segment them from machines used for research and administrative purposes.
One analyst who has been delving into the education sector’s cybersecurity issues for some time is Rick McElroy, principal strategist at VMware. He believes that universities face multiple challenges before they can overcome the threat to their research and education programs – but with diligence and patience, he believes that they can.
Another cybersecurity analyst I spoke to mentioned that higher education is particularly vulnerable to ransomware attacks. Why do you think that is the case?
I would say typically it's a question of resources and funds. In most of the universities I work with, their programs are understaffed and underfunded, and then you have academic freedom, as you should have – a professor has the ability to teach their classes however they see fit for that curriculum. Generally speaking, that involves lots of different professors and applications, and those aren't always maintained. Then you have a very young set of interactors with the computer itself, people fresh out of high school. They probably haven't had a good amount of security training, click on lots of things, and that generally leads to some of the issues.
And in terms of the applications, does that tie in with the printer problem I’ve been told about, where the smart computers running them are vulnerable because they are not segmented?
Yes, certainly, I would say printers are probably the biggest example of that. Other devices as well: there are all kinds of things that people plug in that don't go through the IT ticketing system. Printers happen to be one of the most vulnerable devices, we've seen them get breached a number of times, and generally, it's because there are a lot of local print options in the classroom not managed as part of a print network like an enterprise would. And not having that segmentation essentially means, as an attacker, when I go after that printer I can now pivot to all kinds of different things: I can start to steal credentials and so on.
I had a look at the recent Sophos report, and a couple of things jumped out at me. It said that last year, two-thirds of higher education institutions were attacked by ransomware, but it also mentioned that seven in ten do take security measures. Does that suggest that threat actors are targeting institutions even if they are security conscious, or does it mean that they’ve gone in and observed things quietly – as they tend to – and found the measures to be inadequate?
Both. From a ransomware evolution perspective, they're really smart: they monitor all of their different attacks, they get a lot of “recon data” as we refer to it, and they leverage that. So they understand the softer targets that don't have security programs [and] go after them first, because it's a larger ROI [return on investment].
Now when you look at the code and the changes from a ransomware perspective, one of the things we've seen over the last eighteen months is specifically identifying whether or not the system was backed up, and then going after those back-ups. So from a Windows perspective, they'll try to delete the shadow volume copy that lives there; if it's a server, they'll try to figure out the back-up application that's running [and] shut it down.
So all that changes for the attackers if you have back-ups is the complexity of their attack chain: it certainly doesn't make you invulnerable. The attackers will already be on a system, that system will be backed up, and that "persistence method," as we refer to it, is then restored back into production – that's not good. Back-ups need to be air-gapped and offline from critical systems. You can't do that everywhere: obviously you're not going to do it with student laptops. But your student record management system, you're probably going to want to do that.
So just to clarify, you mean any machine a university uses to store data and records should be offline?
Yes. And then [do] periodic testing. Organizations don't do a great job [when they] have something backed up, and just assume it's all going to be good. Just do an exercise. The most important thing is to pose the question: this university just got ransomed – what do we do about that? As they start to walk through that exercise [and] ask that question, they're going to identify a bunch of things that they need to do. And then putting a plan around that is a great idea for universities.
So like a sort of roleplaying or wargaming exercise, where you play out the attack scenario as a test?
Yeah, in the industry we call them tabletops. Essentially it's all the folks that would need to be involved in the decision-making up to the chancellor, and you're going to have your legal team weighing in on that. Because a lot of these ransomware crews have sanctions levied against them by banking systems in the West, and so paying them actually runs you afoul of those sanctions, which has other implications. So doing it when you're not under attack is the key time to identify all those areas and risks.
Talking about ransoms, the Sophos report said that half of universities that are attacked pay up in the end. Do you see this figure being reduced in the next five years?
In the US, I believe it's going to go up, because it's now mandatory to report [a cyberattack] to [US cyber watchdog] CISA and our Homeland Security department. I think we'll actually be able to measure the problem, and then we can say whether there will be a reduction over time.
I wanted to talk about the impact insurance is having on universities’ response to ransomware incidents. In 87% of cases, the companies pay at least clean-up costs, and all of them pay something for a cyberattack. Do you think there is an “insurance culture” that is making higher education complacent about taking precautions?
Great question. I would say yes: some organizations have adopted insurance [but] it's not a real control for cyberattacks. That being said, the cyber insurers are raising the bar now, because they are paying most of these costs. An insurer with some loose definition of [cyber]security is where it started: well, now they have 20 years’ worth of data. They're requiring organizations to do more than ever before they can get insurance.
So I think that overall it's positive for security, because organizations who didn't invest are going to be forced to get a minimum of security. And when that cyberattack does occur, they can expect way more scrutiny over decisions made, including things like non-wilful statements: “you didn't mean to say you weren't doing security – but you didn't.” I think they'll use that to lessen the amount they have to pay out.
In other words, the insurance culture will reverse course, impelling universities to smarten up their act in the future?
I think so, at least based on conversations that we're having with customers who perhaps were a little immature in their security programs, or a little late to start.
There's long been a pecking order within universities for academic excellence, but is that replicated in cybersecurity? For instance, would Ivy League universities tend to have much better measures in place, in your experience?
That is a very interesting question. I would say that generally when you look across higher education, they build good cybersecurity programs in their universities. They are graduating students that are going on to build security products, lead teams, do all of that stuff. One of the areas where I think we could do better is – and I've seen some universities do this – the senior-year cyber students, why aren't they providing services back to the university? Day one on their job, they're going to be expected to triage alerts, what's the firewall, these things. Some of the very progressive institutions out there are not only training cyber people, but they're using them to defend themselves as part of their graduating course or whatever the case may be.
So I do think there's an opportunity to improve how they're doing cybersecurity, and it makes sense when you look at the salaries: maybe they can get some entry-level folks. And maybe those folks stay a year or two, but I can guarantee after that they're going to price themselves out. And that's really tough. If you put yourself in the seat of a security professional who's leading a team at a university, one of their key challenges is hiring and retaining talent. Because they're competing against the large corporations [and] governments now, and all of our salaries have really gone up.
That coupled with when you look at elite institutions, they have massive research programs as well – whether that's medical or information technology – and that makes them ripe for one particular nation-state and their hacking teams to come after. So they have to worry about cybercrime groups, people who want to ransom them, [and] hacktivists – because maybe the university made a decision the students don't agree with. Then they have this very advanced nation-state that comes after research – to build a program to stop that is very expensive.
When you say nation-state, are we talking about China here?
Yes. I think it accounts for the bulk of research that is pilfered out of universities. That in particular makes it extremely tough. Especially when you have an entire program that may or may not send students to that university for reconnaissance purposes.
So when it comes to cyber espionage in the US, I’m guessing MIT would be the prime target?
MIT, Stanford… most of the high-end technical universities. There are a number of benefits to stealing someone else's research: you get to leapfrog the research [and in] some cases, these groups have destroyed data. Obviously that's going to put you even further behind, because you've got to recreate all your software code. And so again when I think of the people that are defending these systems, it's a big challenge.
It almost sounds like running to stand still – the top universities might have more resources, but they are also higher-priority targets?
Definitely. And it means more from an attacker perspective when you get that brand, [for] advertising services: the efficacy, how fast they get paid, stuff like that.
Going back to the problem of retaining cyber graduates, here’s a thought. Higher education fees are exorbitant in the US – do any of the more technical colleges have a policy whereby they slash these for students who agree to come back and work for them after graduating?
Sounds amazing. I bet there are some pilots like that, I haven't run into them. But that's a really cool idea, I'll take it to my local universities because I work with them a lot! You graduate and get a year internship, that could be seminal.
Although you say you'd like to see more students recruited, you have also pointed to their lack of training and expertise – what more could be done to alleviate the vulnerabilities caused by raw recruits at university?
It's a journey. They're going to get some security education when they arrive at the university. It's just part of the curriculum: don't share your passwords, basic stuff. We have seen a lot of universities across the globe create awesome specialty programs. A lot of these are approved by [intelligence agencies such as] GCHQ and the NSA, which really look at the curriculum and make sure they're meeting the grade. And so I think that's really where I'd like to see it, and then see some of those folks have work experience by the time they graduate. Why can't we start building security operations around universities, staffed by students, when we have a training curriculum for that? It would be great if those people could ultimately end up in positions because they've already got a couple of years and understand the systems. All of that would make for a better higher-education ecosystem around the human capital for security.
And do you actually see this happening? What are the obstacles, if any, that you envisage?
The obstacles are what they always are: paperwork, politics, and humans. Especially in higher education: they're some of the smartest people you'll ever work with, but they're very smart in a niche. What we have to do is get someone who has spent twenty years studying physics or quantum theory, and really get them to [understand] why security is important. And then make that relatable, to still allow them to do their job. That's not putting a bunch of logins in front of their faces, that's not making their jobs harder, right?
In security, we can do a bunch of stuff to enable that, but again another key challenge for that sector is reeducating folks, being creative in how those programs are managed. But I assure you it can be done, even in the face of resistance. Program leaders should think a little less about control frameworks, and a little more about cultural change. And then, patience: because if you're walking the right path, that's going to take years, not months. But ultimately, if you put the work into it, I've seen organizations change.
You did mention patience just now, and I couldn't help but notice the Twitter blowback on Austin Peay when they sent out the tweet telling students to shut down their computers. I had a look at the thread and saw a lot of sarcastic comments – do you think they were a bit harsh in their criticisms?
It's fair criticism. Any consumer should have an absolute stake in their privacy. That being said, I can assure you that when I was a younger security professional and a breach would happen, we did all kinds of "oh, they shoulda done this and that." What happens when you've had a bit more time in the seat is you realize generally it's good people trying to do the right thing. Someone is there to represent security, someone has probably not slept, lost some hair, all of that stuff – because they are passionate.
So I think again they're absolutely right in that take, but I also think there's a lot of nuance. It's easy to always say: “we shoulda done X.” Much harder to be part of the solution and actually drive it to that place. So what I would say to them is: “awesome, you've identified a problem – now what are we going to do to solve it?” Let's take some of that passion into projects that help that university. Let's give them some horsepower, get some of those folks into cybersecurity. And I think it's a good opportunity for the university to have that discussion with their students.