News & events

Stay up to date and know where to meet us

There are several ways to stay updated, keep in touch with us, and follow the latest topics in the industry. Here you can find news and updates, plus upcoming events that we attend and host, both online and in person.

Stay up to date: Our news articles cover market trends, subject areas, expert interviews with authors, librarians and editors, product applicability and updates, and more. We'd love to hear from you on any of our article topics, or on a topic you'd like to see covered.


News, Updates and Industry Initiatives

Fixing the incentive problem in research assessment - lessons learned from global practice

Research Publishing
Thu Feb 5 2026

What happens when the systems meant to reward research excellence come to be seen as one of the causes of a loss of trust in science? Does an overreliance on publication metrics mean that researchers feel compelled to target those metrics? These questions shaped one of the discussions during COPE's Publication Integrity Week, an annual initiative that brings together the global research publishing community to highlight issues that define good practice.

As a COPE Trustee Board member, I had the pleasure of moderating the session Boosting Incentives for Ethical Conduct. With three fascinating speakers, we asked ourselves:

Why do incentives matter for trust in science?  

Incentives shape behaviour. When institutions offer cash bonuses or base promotions on the number or prestige of publications, the pressure to publish can lead to shortcuts, and sometimes misconduct. It’s a reality for many researchers today, and my team and I have heard many of them express these pressures during our research integrity investigations.

In the session, our panel of international experts discussed how the current research assessment systems influence behaviour, the unintended consequences of incentives, and the work underway to rethink how we reward researchers.  

What we can learn from the Cobra effect 

Achal Agrawal, founder of India Retraction Watch, offered a striking analogy to the use of publication counts as prerequisites for bonuses or promotions: “the Cobra effect”. In colonial India, a bounty was introduced to reduce the venomous cobra population. Initially, the programme worked and numbers fell, but soon people began breeding cobras to claim rewards. When the programme ended, breeders released their snakes, leaving the region with an even bigger problem.

This tale mirrors what happens in academia when incentives prioritise quantity over quality. Achal shared examples of universities offering direct payment for papers and patents, alongside punitive measures for those who fail to meet unrealistic research goals. When these incentives are combined with rankings that heavily weight research output, researchers face immense pressure to publish, sometimes at any cost.

These pressures, he argued, are driving misconduct and inflating retraction rates worldwide.   

Achal warned: “When you reward papers, people start gaming the system.”

“Incentives are powerful,” he added. “If we don’t monitor and evolve them, they will lead to unintended consequences.”

Achal’s call to action included:  

  • Better metrics that reward integrity, openness and collaboration  
  • Oversight and transparency in journal practices, clearer retraction notices and audits of COPE compliance  
  • System-level penalties for institutions with repeated misconduct, such as a research integrity risk index

Many of Achal’s recommendations align closely with our work to promote research integrity, transparency and accountability.

Through Springer Nature’s India Research Tour, in collaboration with the Ministry of Education, Government of India, we delivered workshops and training across institutions to promote ethical, transparent and inclusive research practices and to strengthen research integrity awareness.

We also collaborate with organisations such as ORCID, DORA, STM and COPE to develop shared solutions and resources, including guidance for publishers on how to clearly communicate retractions, which are a neutral mechanism to correct the scholarly literature. These collaborations help create the interconnected systems required for incentives that promote integrity and good practice.

Moving beyond metrics

Sanli Faez, national manager of the Dutch Recognition and Rewards programme, challenged the reliance on traditional metrics, which tend to focus on counting papers and citations while ignoring the many different contributions that drive discovery and advance knowledge.

“What we aim for is concrete benefit to society and reliable science. But that’s not what we’re rewarding.”

Drawing on Bruno Latour’s work, Sanli explained that the day-to-day incentives are social rather than philosophical. “What scientists want is credit, from peers, because credit unlocks resources to keep doing the work.”

This is the traditional credit cycle, where initial recognition leads to resources, which enable experiments and data collection, which turn into publications and presentations, generating more recognition and resources. However, this model does not reflect the reality of modern science, with the growth of multi-disciplinary work and blurred discipline boundaries.

Sanli introduced a compelling idea: moving away from the traditional credit cycle to a credibility network, a cloud of contributions where value is distributed across people, teams and institutions.

He added: “We now conduct research in a more diverse and generally collaborative way, through countless micro-contributions: datasets, code snippets, preprints, peer reviews, blog posts, mentorship and collaborative discussions.”

These smaller, interconnected outputs reflect the reality of interdisciplinary research and open science.  However, the current reward systems struggle to aggregate and meaningfully reward these contributions.  

The Netherlands is considering bold steps: removing H-index reporting from grant proposals and attempting to measure the quality of the different types of academic work across disciplines. Sanli said, “This moves us away from marketing or promoting the individual researcher and towards honouring the collective research output: the cloud of micro-contributions advancing discovery.”

At the institutional level, the Dutch Recognition and Rewards programme is creating room for diverse talent profiles, valuing the contributions of teachers, researchers, managers, data scientists and more, and rewarding team science and cross-disciplinary collaboration. By surveying over 8,000 researchers, the Dutch Recognition and Rewards culture barometer gives insight into the success of the programme and how it has been recognised, shared and experienced in the workplace.

These approaches resonate with my experiences in integrity and with the position Springer Nature has taken. As a signatory of the Declaration on Research Assessment (DORA), we have long advocated for a more balanced approach to research assessment, one that rewards integrity, openness and collaboration. Our recent white paper on the state of research assessment reflects this commitment, drawing on insights from over 6,600 researchers to understand their experiences of how their contributions are evaluated.

We also continue to support the wider researcher community with free-to-access training to promote good research practices, from the fundamentals of research integrity to conducting peer review. By working collaboratively across the ecosystem, we aim to shape systems that reflect the realities of modern science and strengthen trust in research.

Recognising micro-contributions means valuing:  

  • Data sharing and curation that enable reproducibility
  • Open-source code that accelerates innovation
  • Peer review and mentoring that strengthen research culture
  • Collaborative outputs that cross disciplinary boundaries

Building systems to capture and credit these contributions is essential. It’s not just about fairness; it’s about creating incentives that align with the realities of modern research.  

Redesigning academic rewards

Caitlin Schleicher introduced the MA³ Challenge, a $1.5 million initiative to help institutions rethink hiring, promotion and tenure systems, with a goal of rewarding bold strategies to foster a culture of openness, collaboration and transparency. She highlighted Stanford’s School of Medicine as a case study: faculty can now include an optional CV addendum showcasing open practices - data sharing, reproducibility, and more - while impact factors are explicitly excluded.  

“Show us your open data practices, rigor and reproducibility, not your H-index,” Caitlin emphasised. 

Key points of agreement 

All the speakers made compelling cases and, unsurprisingly, converged on some key points:  

  • Metrics aren’t going away, but they must evolve: Rankings drive behaviour, so better, more robust measures are essential.
  • Quality over quantity: Recognise diverse contributions, teaching, collaboration and open science.
  • System-level change matters: Funders, publishers and institutions must align incentives with integrity.
  • Global collaboration is crucial: Solutions must be inclusive, designed to support diverse academic contexts and avoid widening gaps between resource-rich universities and those with limited resources.

What struck me most was the sense of urgency and optimism. Reforming incentives is complex, but the examples our panellists shared show that progress is possible when institutions, funders, publishers and other stakeholders across the sector work together. To build trust in research, we need systemic reforms that make ethical conduct the rational choice. At Springer Nature, we are committed to supporting this shift through advocacy, resources and training that empower researchers to conduct research with integrity. Learn more about our work here.

The 2025 State of Open Data report: Can technology push openness forward?

The Link
Mon Feb 2 2026

The State of Open Data survey has been capturing researchers’ attitudes and experiences with open data since 2016. The 2025 report celebrates the survey’s tenth year with new insights and findings. Understanding how researchers feel about data sharing and open data mandates helps institutions design services, training and infrastructure that genuinely match researchers’ needs and behaviours. By grounding policy and support in these insights, institutions can promote stronger adoption and compliance. They can create an environment that empowers, rather than pressures, their researchers. In a special blog contribution, Springer Nature’s Ed Gerstner, Director, Research Environment Alliances, Academic Affairs, shares his thoughts on the 2025 report, and why he is (cautiously) hopeful that technology could push openness forward.

With the 2025 State of Open Data report, titled 'A decade of progress and challenges,' we mark 10 years of tracking how researchers think about and engage with open data. The annual reports on the survey results have become a key reference point for understanding the open data landscape. The 2025 report shows that we’ve come a long way in ten years, but we’ve still got a way to go.

The 10th anniversary report examines the current state of open data as reflected in the 2025 survey results, as well as how attitudes and practices have evolved over the past decade. It provides valuable insight into researchers’ experiences with data sharing, and into their attitudes and practices. You’ll also find input from experts on related topics, from funder mandates to data sharing challenges, and from recognition to reproducibility.

“Looking back, data sharing is still woefully unrecognised by funders and institutions. Looking forward, technology has finally reached a point where it might be able to help.”

For institutions, this long-term view of researchers’ motivations and the challenges they face offers insights that help them better design the essential services they offer to their researchers. With evidence-based support and advocacy, institutional stakeholders are invaluable partners in fostering open, transparent and reproducible research.

Making data FAIR, without sufficient credit for time and effort

The 2025 State of Open Data report shows that researchers’ familiarity with the idea of FAIR data (ensuring data are findable, accessible, interoperable, and reusable) has dramatically improved.

But while the proportion of researchers who are now familiar with the FAIR principles has increased substantially, the 2025 report finds that recognition for researchers who make their data FAIR, or even just open, has not. Two-thirds of respondents told us that they feel researchers still don’t receive sufficient credit for making their research data open. This misalignment between efforts and recognition is one of the most significant barriers to widespread adoption of data sharing.

Data sharing mandates: Compliance burden with no support?

This in itself isn’t news, and has been discussed often, including in previous State of Open Data reports. What is striking to see is the corresponding drop in net support for national open data mandates.

In the first report in 2016, a clear majority of respondents told us that they strongly supported open data mandates. In 2025, that strong support has fallen to around 40%, and to less than a third in the United States. National or funder mandates can be powerful drivers of data sharing, by setting clear standards for the practice. But without resources, tools and guidance for researchers, these mandates may be seen as burdening researchers with compliance obligations without enough support.

Even so, support for mandates outweighs opposition when comparing the two groups, which is encouraging to see. Institutional stakeholders, from libraries and research offices to data support units, play a central role in helping researchers understand and meet data sharing mandates. Understanding the challenges researchers face when implementing these mandates enables institutional stakeholders to better support them, design relevant services, and advocate for meaningful institutional change.

The challenges of sharing data and doing it right

But even with the best will in the world, researchers have precious little time to share their data (which is why it is so important to understand what drives successful data sharing). Data sharing competes with the need to manage labs, write papers, chase research funds, teach undergrads and much more, on top of doing actual research.

What’s more, sharing data on its own is not enough. If research data is released without sufficient metadata, it has little value. Metadata is the information that tells others not just when, where, and by whom the data were collected, but what they represent. Without metadata, the potential of shared data to be found and reused (the F and the R in FAIR) is limited.
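To make this concrete, here is a minimal sketch of what such a metadata record might look like, written as a Python dictionary. The field names loosely follow the DataCite schema, and every value is a hypothetical placeholder; the exact fields a given repository requires will vary.

# A minimal, illustrative metadata record for a shared dataset.
# Field names loosely follow the DataCite schema; all values below
# are hypothetical placeholders, and real repositories will require
# their own (often richer) set of fields.
dataset_metadata = {
    "identifier": "10.1234/example-doi",              # a persistent DOI makes the data Findable
    "title": "Hourly air temperature readings, 2016-2025",
    "creators": [{"name": "Doe, J.", "orcid": "0000-0000-0000-0000"}],
    "publicationYear": 2025,
    "description": "What the data represent: sensor type, "
                   "units (degrees Celsius), sampling interval.",
    "dates": {"collected": "2016-01-01/2025-01-01"},  # when the data were collected
    "geoLocations": ["Utrecht, the Netherlands"],     # where they were collected
    "formats": ["text/csv"],                          # a standard format aids Interoperability
    "rightsList": ["CC BY 4.0"],                      # an explicit licence enables Reuse
}

Even a record this small answers the when, where, who and what questions above; without it, a file of bare numbers is effectively unusable by anyone else.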

The curation of data and the creation of metadata that make it FAIR are specialist skills. If we require all researchers to also be data scientists, the cost will go far beyond the investment in building and maintaining digital infrastructure: the more time researchers spend making their data open, the less time they will have to do research.

“How do you guide people to structure data in a way that others can use it? There is enormous capability, for example, in the academic librarian community that knows how to do these sorts of things well.” - Brian Nosek, Co-founder and Executive Director of the Center for Open Science and Psychology Professor at the University of Virginia, from the 2025 State of Open Data Report

Institutions can make the difference in supporting researchers with preparing high-quality datasets. The report identifies librarians as enablers of metadata, standards and licensing. Such institutional support can help researchers make their data available and accessible, thus promoting data sharing adoption and compliance.

AI supporting data sharing: The importance of maintaining trust

Recognition of researchers’ efforts to share data would be a first step. The support of data specialists would be even better. Short of both, it seems that technology might soon fill some of the gaps.

In the 2025 State of Open Data survey, a quarter of respondents told us that they were using artificial intelligence to help them create metadata. That’s a promising sign, but it comes with a warning. Although AI can help researchers manage data, it can also help dishonest actors generate fake data. Any incentives to share data need to be designed so that they don’t reward bad behaviour.

“AI empowers researchers to make their data FAIR, but it also allows for the generation of fraudulent data. Credit systems should promote the former and deter the latter.”

One of the greatest benefits of open data is the insight that can be gained by combining many datasets. If the authenticity of any one of these datasets cannot be trusted, the value of the whole is lost. It’s critical, then, that we develop ways to validate trust in the data that are shared.

Explore the full State of Open Data 2025 report to learn more about researchers’ views on data sharing and for expert insights on related topics, from funder mandates to data sharing challenges, and from recognition to reproducibility.

Don't miss the latest news & blogs: subscribe to The Link Alerts!

Meet with us & Learn with us

Attend a webinar!

Explore our series of live, educational, online talks designed for librarians and information managers. 

These webinars are free to attend, and you can expect to gain insight into emerging information technologies and some of the most pressing issues in the industry today.

Meet us in person! 

We attend key librarian conferences around the globe and would love to talk to you. You can also find out what special events and workshops we are hosting on our dedicated regional pages.

Conferences and events in: