A perspective from Account Development
Publishers and libraries are part of the scientific community at large seeking ways to address the most pressing societal challenges. As collaborators, we want to support the combined efforts from all stakeholders, including researchers, policy makers, entrepreneurs and science communicators, in pushing research developments to the forefront of change at a local and global scale.
As this project enters its second phase, we spoke to Timon about the partnership, his role in facilitating it, and his hopes for future partnerships like this in helping institutions and researchers accelerate impact and meaningful change. You can also find out more about the results of the partnership itself in an interview featuring Jürgen Wastl, Director of Academic Relations and Consultancy, Digital Science, on our sister blog, The Source.
Tell us about the relationship with VSNU/UKB, and your role in supporting their ongoing agreements with us since 2015.
In 2015, we launched one of Europe’s first national open access arrangements with VSNU/UKB, both extremely innovation-minded organizations. The deal proved hugely successful, with over 6,000 articles made openly accessible worldwide within just three years. Towards the end of the first deal cycle, in late 2017, I was asked to do an initial “impact” analysis of the entire article set. At the time, I focused mainly on exploring the nature and scope of the online attention, captured through Altmetric data, that the 6,000 articles had accrued, and how this data correlated with citations and downloads.
VSNU welcomed the new insights and offered to work together on expanding the analysis, by exploring the societal impact of content relevant to the UN’s Sustainable Development Goals, and, furthermore, to explore the role of the library in supporting researchers in optimizing societal impact. The Dutch funding landscape (VSNU, KNAW, NWO, NFU, ZonMw) is highly committed to using new indicators of societal impact to recognize and reward academic work. To address these issues, we set up an impact working group consisting of representatives from all three organizations. The group’s work will last at least until the end of 2020.
Tell us more about what the first impact analysis demonstrated. How did the results lead to the new working group?
In the first impact analysis the focus was mainly on altmetric indicators of impact and their correlation with citations and downloads. It was the first time that we had ever undertaken such an analysis across all channels, ranging from Twitter and Reddit to blog posts, Wikipedia, and Mendeley. We quantified and visualized the intensity of activity, including response time, across all the signals for each of the 6,000 articles. We could not find a connection between online attention and citations, but we did find one of moderate strength between downloads and citations. This depth of analysis for a customer was unique, to my knowledge. This work then led to two further studies, for Sweden and Austria, which compared the performance, based on these metrics, of OA and non-OA articles published in the same journal set and time period. As suspected, the open access articles outperformed the non-OA articles. Together, the data points provided good early indications of the impact or ROI that can be expected of such a transformative agreement.
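The core of a correlation check like the one described, comparing downloads against citations per article, can be sketched as follows. This is a minimal illustration, not the actual analysis code: the function and the sample figures below are my own stand-ins, not data from the study.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series
    (e.g. per-article download counts vs. citation counts)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-article counts, for illustration only.
downloads = [120, 450, 90, 700, 310]
citations = [3, 9, 2, 15, 7]
r = pearson(downloads, citations)  # a value between -1 and 1
```

In practice an analysis of skewed count data like this would more often use a rank-based measure such as Spearman’s rho, which applies the same formula to the ranks of the values rather than the raw counts.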
This analysis method was also applied at a global level and compared with data sets for the UK, and a resulting study was published here.
This project required coordination across technical, editorial and commercial teams from across Springer Nature, VSNU/UKB, and Digital Science. What was the most exciting part of this, and why is it such a unique initiative?
Yes, in total, some 10-20 colleagues from multiple organisations contributed to developing the mapping technology, including several colleagues from both Maastricht University and VU Amsterdam. The various tasks were delegated to different sub-teams. Our editorial network teamed up with colleagues from Maastricht University for quality assurance; Digital Science spearheaded the technical workflow and Machine Learning; VU Amsterdam visualized the results in an online dashboard. Excellent teamwork all round.
There are many aspects of this project that make it unique. Perhaps the most novel is that together we have developed one of the world’s first online SDG content classifiers, drawing on both extensive subject matter expert input and Machine Learning. The first release of our new tool can map content for SDGs 3, 4, 7, 11 and 16. This has enabled us to make a strong contribution to existing pioneering initiatives in this area, e.g. the AURORA network.
The project has been made up of three workstreams, set out here. Can you give us some insights into the developments of these and what we hope to achieve with and for our customers?
We have set up three interlinked workstreams (or project teams), with the first and second sequenced and the third running in parallel. I am leading the first workstream, which was tasked with creating a new SDG content mapping technology to classify all Dutch scholarly outputs from the past 10 years into one of five SDGs based on relevancy. Here, we contracted Digital Science as a technology vendor to build such a tool. But as this had not been done before, there was no blueprint to follow. We quite literally had to start from scratch and had many discussions on how best to do this technically. In the end we agreed on a two-phase method: first, creating a training set per goal based on a human-curated search string in Dimensions, with the search strings carefully checked by our editorial network and Dutch colleagues. In a second step, the resulting DOIs were fed to a self-correcting Machine Learning-based algorithm. In a final step, the model was applied across the entire Dimensions database, over 100 million DOIs, to categorise all content into one of five SDGs. Digital Science carried out all technical work. You can find out more about this in our interview featuring Jürgen Wastl, Director of Academic Relations and Consultancy at Digital Science.
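In outline, the two-phase method can be sketched as follows. This is a minimal illustrative sketch, not the actual Dimensions pipeline: the goals, search terms, and the simple centroid classifier below are stand-ins of my own for the human-curated search strings and the self-correcting Machine Learning model described above.

```python
from collections import Counter
import math

# Phase 1 (illustrative): curated terms per goal stand in for the
# human-reviewed Dimensions search strings described in the interview.
SDG_SEARCH_TERMS = {
    "SDG 3": {"health", "disease", "vaccine"},
    "SDG 7": {"energy", "solar", "renewable"},
}

def build_training_set(records):
    """Label each (doi, abstract) pair whose abstract matches a goal's terms."""
    labelled = []
    for doi, abstract in records:
        words = set(abstract.lower().split())
        for goal, terms in SDG_SEARCH_TERMS.items():
            if words & terms:
                labelled.append((doi, abstract, goal))
    return labelled

# Phase 2 (illustrative): a bag-of-words centroid classifier stands in
# for the Machine Learning model that Digital Science built.
def train_centroids(labelled):
    """Aggregate the word counts of each goal's training abstracts."""
    centroids = {}
    for _, abstract, goal in labelled:
        centroids.setdefault(goal, Counter()).update(abstract.lower().split())
    return centroids

def cosine(c1, c2):
    dot = sum(c1[w] * c2[w] for w in c1)
    n1 = math.sqrt(sum(v * v for v in c1.values()))
    n2 = math.sqrt(sum(v * v for v in c2.values()))
    return dot / (n1 * n2) if n1 and n2 else 0.0

def classify(abstract, centroids):
    """Assign an unseen abstract to the most similar goal."""
    vec = Counter(abstract.lower().split())
    return max(centroids, key=lambda g: cosine(vec, centroids[g]))
```

In the final step of the real pipeline, the trained model is then run over every record in the database; in this sketch that would simply mean calling `classify` on each abstract in turn.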
The work of this first workstream has now triggered the second workstream, which is conducting extensive bibliometric analysis on all resulting data points from the mapping. Using qualitative methods such as surveys, they are also looking into the impact on non-academic actors, including business, politics, industry, and interest groups.
A third workstream is providing an overview of the existing tools and support services, as offered by libraries and institutions, used by researchers to facilitate impact. This team will also develop a set of best practices on the most effective ways that researchers can do this, which will be included on our tools & services pages.
Give us a little insight into how this work integrates with your role in Account Development.
Whilst my Account Development (AD) work is at the institutional or university level, the Strategic Partnership work operates at the consortia, national, or even pan-European level. The topics are also slightly different with my AD work focusing on usage statistics, product training, discovery support, and company updates, and my strategic work covering topics around research impact, open science, and emerging technologies. The common denominator that brings all my work together: supporting the library community, at whatever level, in servicing the information needs of researchers.
How are you seeing the research evaluation landscape evolve, and how does this new collaboration contribute to moving it forward?
Whilst science is advancing discovery at often great speeds, many funders and institutions continue to rely on old and often bias-prone evaluation tools (based on journal-level metrics) for key academic decisions (tenure, promotion, funding). This frustrates many in the academic community, especially those whose work has impact in equally valid ways that have, until now, been more difficult to capture. It is also why around 2,000 research organisations, including the Wellcome Trust and the European Commission, now support the use of alternative indicators of impact through DORA. But there are still many questions about exactly which new tools, methods, and standards we should be using, and how best to capture wider societal impact. With this project, we are addressing these concerns and contributing one such new tool, a kind of SDG relevancy checker, and my hope is that evaluation officers and research directors will try using it, and that it will inspire others to come up with additional tools.
We will publish a series of interviews and insights around Sustainable Development Goals, our partnerships that support research with societal relevance, and the impact for researchers and organizations globally. Keep up to date by signing up for Librarian Alerts, or visiting our Springer Nature SDG Programme hub.