
I recently volunteered as an AV tech at a science communication conference in Portland, OR. There, I handled the computers of a large number of presenters, all scientists and communicators who were passionate about their topic and occasionally laissez-faire about their system security.

As exacting as they were with the science, many saw little point in their institutions’ security policies, or had actively circumvented them.

A short survey turned up reasoning like:

  • My college doesn’t actually care.
  • It takes too long, so I disabled it.
  • I *want* my data to be accessible by other scientists. Why should we secure it?
  • I have bigger things to worry about – you know my research is on <insert critical issue>?
  • Too many systems require passwords, so I just use the same one.
  • I travel a lot, so I automatically connect to open WiFi networks.
  • All my stuff is on Google Docs anyway.

A reasonably large portion of this group didn’t think security mattered and found it too annoying to put up with. It’s well known that security is the enemy of convenience, but since it is important (and it is – their universities do care), there are ways to make it as habitual as brushing your teeth – unless your research is on gum decay.

In general, a security program is concerned with controlling access: ensuring that only the people who are supposed to reach the computer, the email, the network, the data store, and so on can actually read or edit what is there.

What could happen if someone who is not supposed to have write or edit access gets it? The possibilities run from simple mistakes to malicious sabotage, any of which could affect important research or an individual’s scientific reputation.

Let’s take a Google Drive store of data that a group of labmates has collected over several years. They may be using Python or R to clean their records, append new data, and run analyses – great tools for handling and updating large quantities of information. A student finds the data and tries to “help” by running an update that buckets the data and deletes the original field … and the student doesn’t mention it. Now the source is incorrect, and the researchers may be producing bad science without knowing it.
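The safer habit is easy to show in code. Here is a minimal Python sketch (field names and bucket thresholds are purely illustrative) of bucketing a measurement into a *new* field so the original value is never destroyed:

```python
# Illustrative records; in practice these would be loaded from the shared store.
records = [
    {"sample_id": 1, "temp_c": 4.2},
    {"sample_id": 2, "temp_c": 21.7},
    {"sample_id": 3, "temp_c": 37.0},
]

def bucket(temp):
    """Coarse label for a temperature reading (thresholds are made up)."""
    if temp < 10:
        return "cold"
    if temp < 30:
        return "ambient"
    return "warm"

for rec in records:
    # Write the derived value into a NEW field; temp_c is left untouched,
    # so the analysis can always be re-run from the original source data.
    rec["temp_bucket"] = bucket(rec["temp_c"])

assert all("temp_c" in rec for rec in records)
```

Keeping the raw field alongside the derived one means a later reader can verify the transformation – or undo it – which is exactly the kind of provenance an overzealous “cleanup” destroys.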

What if it was not a student but a company that feels this group’s research will harm its profits? What if there is a rival who wants to know what avenues others are researching, with a hacker friend who can read unencrypted email? Might a research team get scooped, or have their findings made moot by a shady advertising campaign?

Watson and Crick went to some lengths to acquire the images that Rosalind Franklin developed. Alexander Graham Bell happened to be working at the Western Union from which Antonio Meucci’s telephone patent application seemingly disappeared. And there are entire movies about Thomas Edison. Sometimes there’s a known rivalry, sometimes there’s a trusted mentor or colleague, but keeping records of the data, where it came from and who came up with it can prevent these misunderstandings and mis-attributions.

Preventing breaches of these processes is what an institution’s security policy is all about – a job made harder because the institution *knows* its researchers want to collaborate. It has a policy written up and best practices outlined, but everything depends on the researcher being willing to follow a few inconvenient steps to keep their systems up to date and their access secure.

Habits and tools a scientist can put in place to protect the provenance of their work include:

  • Set up the computers to take updates automatically. An old or unpatched OS has known security vulnerabilities that can be exploited or even just bugs you could hit by accident. This might include putting a habitual reboot into your schedule, which many folks forget to do for weeks on end.
  • Use a password vault and have different passwords for each tool or data store. The vault will have a master password which can be something a human can remember – for instance, four unrelated words strung together – and then the various tools can have unintelligible passwords if required, but one can log in via the vault and not have to retype the long, secure passwords.
  • Have an antivirus tool installed and confirm the signatures get updated regularly. If it needs to be turned off to do an install, be sure it gets turned on again. If it stops working, get assistance.
  • Only connect to trusted networks. The coffee shop may be fine, but be skeptical of miscellaneous hotspots, which may be inspecting the data you transmit.
  • Read the institution’s security policy.
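The “four unrelated words” master password from the list above can even be generated programmatically. A minimal Python sketch, using the standard library’s cryptographic random source – the ten-word list here is a tiny illustrative stand-in; a real passphrase should draw from a large list of thousands of words:

```python
import secrets

# Tiny stand-in word list for illustration; use a large (diceware-style)
# list in practice so the passphrase has enough entropy.
WORDS = ["granite", "pelican", "harvest", "voltage", "saddle",
         "lantern", "octave", "mangrove", "ripple", "comet"]

def passphrase(n_words=4, sep="-"):
    """Join n randomly chosen words, using a cryptographically secure RNG."""
    return sep.join(secrets.choice(WORDS) for _ in range(n_words))

print(passphrase())  # e.g. something like "comet-saddle-granite-ripple"
```

Note the use of `secrets` rather than `random`: the latter is fine for simulations but is not designed to resist prediction, which matters for anything guarding access.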

Security may be considered an inconvenience, but it’s also a solution to the risks discussed above. Scientists need to be very exacting about their research – and protecting the data that come from it.