Knowledge is vulnerable when centralized. We have seen this both in the East, with the censorship of Wikipedia in Turkey, and in the West, with the whitewashing of climate research data in the aftermath of the US presidential elections. There is a solution: through decentralization and blockchain, it becomes possible to leverage otherwise wasted resources on our computers, such as empty hard-drive space or idle processing power.
Decentralized technologies are more than blockchain hype. They are not only about providing anonymity to whistle-blowers, but to anyone who wishes to use technology freely without being tracked. And they give everyone, not just the powerful intermediaries, the possibility to benefit financially from a peer-to-peer architecture.
The domino effect of greed
Among such intermediaries are scientific publishers, who have been openly called out as corruptors of the scientific ecosystem. They feed on scarce governmental funding for basic research, profiting from what has been described as “producing envelopes without having any idea of the contents”: researchers do the peer review for free, yet have to spend thousands of dollars from their modest research funding to share their work as scientific articles. Later, to regain access to those outputs, their universities are required to pay millions of dollars to cross paywalls.
IN THE PHOTO: GRAPHIC REPRESENTATION OF THE FRAGILE EFFECT PHOTO CREDIT: VIRGIL CAYASA – UNSPLASH
You may wonder why such a relationship persists. It hardly seems fair, but to this day research funding organisations and hiring committees wrongly rely on the journal impact factor to judge the quality of an individual researcher’s work. The San Francisco Declaration on Research Assessment (DORA) has already recommended that institutions “eliminate the use of journal-based metrics, such as Journal Impact Factors, in funding, appointment, and promotion considerations”. It cautions that “the Journal Impact Factor, as calculated by Thomson Reuters, was originally created as a tool to help librarians identify journals to purchase, not as a measure of the scientific quality of research in an article.”
But the transition from recommendation to widespread adoption will not happen overnight. Researchers, who face weak prospects of permanent employment because funding is scarce relative to the number of candidates, have no choice but to comply, reinforcing both the metric and the position of the intermediaries. Thus first-author publications in respected journals have become the Holy Grail of science or, as others say, the currency of research, a currency now reaching the point of inflation, with 2.5 million new scientific papers added to the vault of gold each year.
The pursuit of reputation points has further side effects. The performance-driven culture is ruining scientific research, leading to “showboat science that under-investigates less eye-catching — but ultimately more useful — areas.” The fierce competition and pressure to publish have also damaged mental health in academia: 33% of PhD students are at risk of a common psychiatric disorder, attributed in part to high job demands and low job control, or to juggling work and family. From the perspective of senior researchers, “everywhere, supervisors ask PhD students to publish in high-impact journals and acquire external funding before they are ready.”
Altogether, it adds up to a domino effect: scarce funding, lucrative businesses feeding off public money, an ever more competitive market, an inflation of publications, the inability to judge the quality of scientific output or to reproduce results, showboat science (“better be first than right”), and so on. This is a carefully spun web that cannot be easily untangled. It has to be undone collaboratively, taking into account the complexity of science economics, the challenges of data management, research assessment metrics, and the wobbling marble structure of reputation on which the foreseeable future of individual researchers and the credibility of science as a whole equally depend.
Libraries 2.0.
The political landscape is evolving – Plan S comes into effect, funding organisations require data management plans, and universities begin to self-organise to put pressure on publishers. Libraries stand a fair chance to step up and take back control of university-funded research outcomes, rather than buying them back from publishers who make profits of nearly a billion dollars a year.
IN THE PHOTO: PUBLIC LIBRARY PHOTO CREDIT: DAVIDE CANTELLI – UNSPLASH
Imagine what would happen if universities established a decentralized network of higher education institutions protecting the legacy of humankind and maintaining its quality directly, not via the services of monopolistic middlemen.
– Governmental funding could be invested in replication studies and could give jobs, as content curators, to PhD students and postdocs leaving academia. This would spare thousands of highly educated people the identity crisis that follows not getting a single job offer in academia.
– As a side effect, active involvement in research pipelines would bring libraries closer to researchers and let them maintain well-documented, fully traceable project timelines and professional data management plans. Within such a framework, new research assessment methods would have solid touch-points.
– Currently, the outdated manuscript format of communication floods the world with an indigestible amount of text every single day, to the point that it is hard to distinguish real signal from noise. Scholarly communication could be revolutionized by leveraging new technologies such as machine learning, blockchain and, in the future, virtual reality. New infrastructure could make research more transparent while ensuring immutable attribution of discoveries, data, interpretations, code and other research outputs; a sketch of how such attribution could work follows below.
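To make the idea of immutable attribution concrete, here is a minimal sketch in Python. It is illustrative only: the record format, field names and author identifiers are assumptions, not the API of any existing platform, and real decentralized infrastructure would add digital signatures and network-wide consensus on top. The core trick is that each attribution record embeds the hash of the previous one, so rewriting history becomes detectable.

```python
import hashlib
import json
import time

def sha256_hex(data: bytes) -> str:
    # Content fingerprint: any change to the input changes this value.
    return hashlib.sha256(data).hexdigest()

def make_record(prev_hash: str, author: str, artifact: bytes, note: str) -> dict:
    # An attribution record linking an author to an artifact's fingerprint.
    # Embedding the previous record's hash chains the records together.
    record = {
        "prev": prev_hash,
        "author": author,                         # hypothetical identifier
        "artifact_sha256": sha256_hex(artifact),  # fingerprint of the output
        "note": note,
        "timestamp": time.time(),
    }
    record["hash"] = sha256_hex(json.dumps(record, sort_keys=True).encode())
    return record

# A toy ledger: a dataset first, then analysis code that builds on it.
genesis = make_record("0" * 64, "alice@uni.example", b"raw dataset v1", "dataset")
followup = make_record(genesis["hash"], "bob@uni.example", b"analysis code v1", "code")

print(followup["prev"] == genesis["hash"])  # True: the records are chained
```

Altering the dataset after the fact would change its fingerprint, break the chain and expose the tampering, which is exactly the property a public record of scientific attribution needs.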
As we look into the future, decentralized storage brings the prospect of permanent, censorship-resistant content on Web 3.0, and that should include scientific data as well. Who, if not libraries, will be the guardians of humanity’s body of knowledge, especially in an era when drastic changes on the political scene challenge its resilience?
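A key ingredient behind that censorship resistance is content addressing: a file’s address is derived from its bytes, so identical copies held by many independent nodes share one address, and a reader can verify what they received without trusting any single host. The following Python sketch is deliberately simplified (production systems such as IPFS use multihash-based content identifiers and chunk large files), but it captures the principle:

```python
import hashlib

def content_address(data: bytes) -> str:
    # Derive the address from the content itself: a simplified stand-in
    # for IPFS-style content identifiers.
    return "sha256:" + hashlib.sha256(data).hexdigest()

paper = b"Results of a replication study, v1.0"
addr = content_address(paper)

# Imagine these bytes arrived from an untrusted peer: the reader recomputes
# the address, and any tampering with the content makes the check fail.
received = paper
assert content_address(received) == addr
print(addr)
```

Because the address names the content rather than a location, taking one server offline does not remove the document: any node holding a copy can keep serving it under the same address.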
Norms in science vs. blockchain
Blockchain is only one part of the landscape of decentralized technologies, but it has attracted the most attention due to the impact it has exerted on financial institutions, policies and governments. Its reach, however, will go beyond that. On closer inspection, blockchain’s characteristics – decentralization, disintermediation, transparency and economic incentives – resonate very well with what we agree to be the norms of science.
IN THE PHOTO: ACTUAL BLOCKS CHAIN PHOTO CREDIT: VALIDITY LABS
Take communality – the principle that all research findings should be held in common – which is encompassed by the notion of decentralization: being democratic and impossible for any one party to control. Transparency and scepticism both call for making scientific findings available for scrutiny.
Disintermediation, which questions trusting parties on the basis of reputation alone, is a close friend of assessing research outputs on merit, not on the journal, university or research group they come from. And as research is now done within a framework of competition for funding and scarce positions, adding economic incentives could push self-interested motivation into the background and bring solid science to the foreground.
A beacon of hope
Just recently, Validity Labs, a blockchain startup from the Swiss Crypto Valley in Zug, partnered with ETH Library Lab, a joint initiative of the ETH Zurich Library and the Library of the Karlsruhe Institute of Technology, to pioneer the incorporation of decentralised technologies into the scientific ecosystem. The announcement of the partnership marks the launch of SEED – Scientific Ecosystem Experimentation with Decentralisation – a mix of educational conference, interdisciplinary think tank and incubator.
IN THE PHOTO: SEED OFFICIAL ACCOUNT TWEET PHOTO CREDIT: VALIDITY LABS
What is unique about this program is its practical approach. Academic and industry experts have joined forces to map the global landscape of science and, while learning about distributed-ledger and other decentralized technologies, will look for ways to improve how science works today. Disintermediating scholarly communication is one topic of interest. The building blocks of today’s research infrastructure – the economics of science, patents and IP, research funding, assessment, and scholarly communication – will be its focus. And hacktivists will have the chance to join Hack & SEED, an SDG-themed parallel hackathon.
In science lies much hope for the future
No matter what, the way science works is bound to change. The simple problems have already been solved, and we are entering the Science 2.0 era, in which even incremental progress demands collaborative effort. Soon we will need to stop relying on personal satisfaction and individual reputation as the only repayment for doing research.
Many open questions remain. How can we empower the scientific community to be appropriately rewarded for its work, and perhaps even find new ways of funding research projects? And how do we break the vicious cycle of secrecy and competition with the right incentives for collaboration and openness, which are in the interest of science?
Embracing peer-to-peer networks and blockchain technology has shaken up the financial system through the back door. The scientific ecosystem needs the same shake-up, only far more urgently.