Anthropology professor discusses what we can learn from Sweden to protect our personal data
"CU data cyberattack" was the subject line that appeared in thousands of university-affiliated inboxes on Feb. 9, 2021. On that date, former CU President Mark Kennedy reported that individual records of students and employees may have been compromised.
CU was a victim of this cyberattack on Accellion, a file-sharing service that CU used, which also hit organizations like Shell and the University of California, and it was far from the only such attack in the past year.
As these attacks grow in frequency and complexity, the U.S. government is rushing to address the threat. A University of Colorado Boulder professor, though, suggests we look to Sweden for possible solutions while also maintaining a crucial element: the public's trust.
Alison Cool, an assistant professor in anthropology, studies data law and practice, particularly in Sweden, and she says her research has potentially broader implications, namely ways to regulate data that balance the interests of citizens, consumers, corporations and governments.
Finding a definitive compromise for all stakeholders is not straightforward, she says.
"Sweden had to grapple early on with the question of 'how do we use data to create a better society while also protecting individuals from the potential harms that can come from the availability of this information?'" Cool said.
She has been conducting research in Sweden since 2005. Initially, she planned to study women who gave birth to twins after in vitro fertilization (IVF), which led her to the largest twin registry in the world, the Swedish Twin Registry. Her research focus, however, shifted as she saw the large-scale national collection of personal data and what seemed like the public's trust in scientific research and the state.
"In Sweden people have historically a lot of trust in science and they want to trust in the state," Cool said. "The narrative is that it doesn't feel threatening to have that information collected, because you feel very strongly that it's going to benefit society, but the people who are telling me this are the scientists."
However, this history of trust was called into question in 2010 when LifeGene, a biobank that planned to collect genomic data from half a million people, was launched in Sweden. The biobank was at the center of a public controversy about biomedical ethics and value creation in the age of big data.
"When they started LifeGene, it put things in a new light because it was no longer this older model of citizens participating in scientific research and it helps the state," Cool continued. "When you have a public-private database with financially valuable data, it changes that kind of relationship of trust."
Two published journal articles stemmed from Cool's research in Sweden. In them, she examined the relationship between data use and data regulation, and the benefits and drawbacks of each.
In capitalism, data distribution is a large and profitable new market. Data has been called "the new oil" of this market, better described as the digital economy, and, as with oil, it has political ramifications.
"There is such a vast power imbalance between the parties (governments and corporations on one side, individuals on the other) that are coming together," said Janet Ruppert, a graduate student studying information science at CU Boulder. Ruppert was in Cool's graduate seminar, and Cool is a member of her PhD committee.
"People think about elite programmers that are doing all of this stuff, but I would really point more to the company as an institution and the legal and economic sides. The business and legal side of the company are really the ones who are pushing for exploitation because that's how you make it under capitalism."
The data economy encourages exploitative practices, though there are well-intentioned data users too. Even in clearly negative cases, however, illegal conduct is tough to pinpoint and prosecute.
Developing clear guidelines or regulations is difficult, though, because of the legal and cultural complexity of regulating a global data economy. As a result, "there is a widespread sense of uncertainty about data law," said Cool in an article discussing the European Union's General Data Protection Regulation (GDPR).
The GDPR regulates organizations that collect data related to people in the European Union. It took effect on May 25, 2018, and organizations that violate its guidelines face financial penalties. "The regulation itself is large, far-reaching and fairly light on specifics," as one overview of the law puts it.
Cool interviewed scientists, data managers, legal scholars, lawyers, ethicists and activists in Sweden to understand their opinions on the GDPR. She found that scientists and data managers found the law "incomprehensible," meaning they felt it lacked clarity and specificity. For example, scientists worried that if they were to manipulate data outside the law's parameters unknowingly, they could find themselves in court.
Legal scholars and lawyers believed that the law's loosely defined stipulations were not the result of poor legislation, but rather the fact that technology advances more quickly than legislation does.
Creating laws that consider ethics, data use and social norms, as Cool explains, is difficult, given the varied interests of stakeholders. However, Cool argues that the GDPR is "the most comprehensive (most people would say the best) privacy law that we've ever had."
"I would say that Europe really set the standards for data privacy for the world and is really the world leader in how to best regulate personal data in the best interest of individuals," Cool said.
For companies, an ideal world would consist of completely open, free data use and distribution, but that world is not feasible when individuals value personal security. Balancing the goals of institutions with the values of individuals continues to be the point of contention.
"Everyone has to find a place in between those two extremes where research can get done, but in a way that is secure," Cool said.
Ruppert, Cool and other CU data experts collaborated on a guide for CU employees and students affected by the data breach. The document recommends best practices for data security and privacy tools.
"I think it's important that people understand that transparency (of how data is used) is a very low bar. If tech companies really want to be transparent, maybe they can actually start writing the terms of service and privacy policies to include relevant, specific information that would meet a reasonable standard of transparency," Ruppert said.
Developing strict and clear data regulations that allow flexibility on both sides of the spectrum is a tall order. Yet, as Cool concluded in one of her articles, "Pragmatic guidelines that make sense to people who work with data might do a lot more to protect our personal data than a law that promises to change the internet, but can't explain how."