We need to decentralize science

We are screwed as a species without reliable, accessible and trustworthy scientific research

OPINION

Scientific research is built on facts, but the keystone holding the arch of our scientific institution together is starting to crack.  

We expect scientific institutions and their scientists to be infallible, their experiments proven and taken as fact so others can accurately replicate — and advance — their work in humanity’s collective quest for answers. Unfortunately, that is not always the case: A growing catalog of reports shows that several landmark studies and significant scientific discoveries were based on false, plagiarized or manipulated data.  

Last year, the journal Science found that the amyloid hypothesis, used for decades as the basis for studying and treating Alzheimer’s disease, rested partly on falsified data and images. In 2012, biotech pioneer Amgen attempted to reproduce the results of 53 landmark cancer studies and succeeded with only six — an 11% success rate. This summer, three prominent researchers from Duke, Stanford and Harvard faced separate accusations of data manipulation in past research, accusations so damaging that Marc Tessier-Lavigne, the Stanford researcher and university president, was forced to resign.   

Trust in research is waning within the scientific community and among the public, with far-ranging implications for our ability to scale economies, improve quality of life and mitigate disease (not to mention the taxpayer funds sunk into questionable research).  

There are many reasons for this crisis, including the lack of incentives and good infrastructure for scientists to share underlying data and code in their publications. Most data and code are lost or inaccessible on centralized servers or clouds, making it impossible to check the reproducibility of most empirical results. 

The fidelity and accessibility of scientific research won’t improve if we continue like this. Scientific research is too critical for humanity; it should live in an open dataverse secured by a verifiable index accessible to humans and machines.   

The problems of centralization 

Centralization has contributed to the replication crisis and the erosion of trust in scientific institutions and research. Its disadvantages include limited scalability and flexibility, compromised data sovereignty and single points of failure. 

Centralization also fragments data into silos with low cross-team visibility, making information difficult to access, reproduce and verify. As I wrote above, third-party researchers and internal academic investigators face significant roadblocks to accessing original data, making it nearly impossible to reproduce results or detect problems.   


The emergence of Web3 and blockchain technology provides a compelling technical solution to the problem of siloed data systems. Content-addressed storage tools like IPFS (InterPlanetary File System) and Filecoin enable scientists to redesign data storage and accessibility in ways Web2 can’t, ensuring data integrity within the FAIR principles (findable, accessible, interoperable, reusable). 

In Web2, URLs point to where a file is stored, which leads to problems such as link rot or content drift if the file is moved or its content is changed — both of which happen frequently. In Web3, by contrast, content addressing generates a unique hash for each file, meaning even the tiniest change in content produces an entirely different hash. Using these content hashes as identifiers solves both link rot and content drift. It also allows multiple entities to store the same file in different places, enabling institutional autonomy and improving content availability. This breaks up data silos in favor of distributed, open systems that guarantee content availability without paywalls and preserve institutional sovereignty.  
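The core property of content addressing can be sketched in a few lines of Python. This is a simplified illustration of the principle only — real IPFS CIDs are built from multihash-encoded hashes of chunked data, not a bare SHA-256 digest:

```python
import hashlib

def content_address(data: bytes) -> str:
    # Simplified stand-in for an IPFS CID: the identifier is derived
    # from the content itself, so it changes whenever the content does.
    return hashlib.sha256(data).hexdigest()

original = b"Measured effect size: 0.42"
tampered = b"Measured effect size: 0.43"  # a one-character "data manipulation"

addr_a = content_address(original)
addr_b = content_address(tampered)

print(addr_a == content_address(original))  # True: same content, same address
print(addr_a == addr_b)                     # False: any change yields a new address
```

Because the identifier is a function of the bytes rather than a storage location, anyone holding the hash can verify that a retrieved file is exactly the one that was published, no matter which server delivered it.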

Open access to scientific manuscripts is only one step toward solving the replication crisis. 

We must also move beyond the PDF as the dominant format for publishing science and embrace a new model based on versionable, FAIR digital research objects that contain all relevant components of a research project — manuscript, data, code, videos and more — to enable the reproducibility and reuse of invaluable information. 

In decentralized scholarly publication systems, qualified third parties — including publishers, funding agencies, academic societies and field experts — can use cryptographically signed attestations to evaluate and verify desirable characteristics of research. For example, badges for data availability or computational reproducibility would be clearly visible on the research objects, allowing readers to filter their searches for such content and creating valuable metadata that can be used to improve incentives for scientists. Based on IPFS, a protocol for decentralized persistent identifiers for science (DPIDs) is under development.  
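A signed attestation over a research object might look roughly like the sketch below. This is a hypothetical illustration, not the DPID protocol itself (which is still under development), and it uses an HMAC as a stand-in for a real public-key signature to keep the example self-contained:

```python
import hashlib
import hmac
import json

def attest(content_hash: str, badge: str, attester_key: bytes) -> dict:
    # An attester (publisher, funder, academic society) signs a claim
    # about a specific content-addressed research object. HMAC stands
    # in for an asymmetric signature in this sketch.
    claim = {"object": content_hash, "badge": badge}
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(attester_key, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": sig}

def verify(attestation: dict, attester_key: bytes) -> bool:
    payload = json.dumps(attestation["claim"], sort_keys=True).encode()
    expected = hmac.new(attester_key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, attestation["sig"])

key = b"attester-secret"  # hypothetical key material
obj = hashlib.sha256(b"manuscript + data + code").hexdigest()
badge = attest(obj, "computational-reproducibility", key)

print(verify(badge, key))  # True: the attestation is intact
badge["claim"]["badge"] = "data-availability"
print(verify(badge, key))  # False: a tampered claim fails verification
```

Because both the research object and the badge are bound to a content hash, a reader filtering for, say, reproducibility badges can check that the badge really applies to the exact version of the work they downloaded.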

Answering the arguments against open science 

Open science practices are the most promising way forward for the scientific community, but they face headwinds. Common arguments against their implementation include: 

  • Data Privacy: Scientific data often contains sensitive information that should not and cannot legally be shared openly, including genetic data, health records and financial history. As we become more digitally dependent, the risk of cyberattacks that threaten information safety increases significantly. 
  • Lack of Incentives: Researchers lack incentives to share their data and code openly because doing so could eliminate competitive advantages over other scientists. Without tangible returns, sharing creates more work for researchers, and the added transparency allows colleagues to highlight their mistakes.  

Objections from open science detractors partially stem from resistance to change, which has plagued many industries amid digital transformation and the emergence of blockchain technologies. It’s essential to understand how open science addresses these concerns: 

  • Data Privacy: FAIR doesn’t dictate blanket data accessibility; rather, there should be a clear path to access for those with a legitimate reason. IPFS nodes can run on private servers with content addressing, provenance identifiers and identity checks. Open science can employ blockchain tooling even on servers with privacy restrictions. 
  • Lack of Incentives: Funding agencies increasingly encourage data and code sharing. In 2022, the White House Office of Science and Technology Policy (OSTP) directed federal agencies to make the publications and data from federally funded research freely available to the public. Encouragement from policymakers to create incentives for open science can be enabled and supported by a new, decentralized scientific infrastructure.  

Enabling progress 

Science is about progress, new developments and, crucially, facts. But for too long, the scientific community has stagnated because of outdated and inefficient methods of storing, preserving and accessing research. The result is an industry beset by wasted time, questionable incentives, fraud and manipulated data that directly impact real people. 

Scientific research holds too much importance to remain siloed and inaccessible in centralized systems. The emergence of open science and decentralized blockchain tools can and will solve this issue, enabling scientists to use new methods of storing and accessing research that the current Web2 system cannot match. Without reliable, accessible and trustworthy scientific research, we are screwed as a species.


