Unlocking Libraries’ Potential: Monitoring Open Science through Initiatives, Methods, and Tools
Report from a LIBER Annual Conference Workshop, held on 2 July 2025 in Lausanne, Switzerland.
Open Science has changed the way research is shared, assessed, and reused, but tracking that progress remains a challenge. During the LIBER Annual Conference 2025 in Lausanne, members of the LIBER Open Access Working Group hosted a lively workshop titled “Unlocking Libraries’ Potential: Initiatives, Methods and Tools to Monitor Open Science.”
More than thirty participants from across Europe joined to exchange experiences and reflect on how libraries can take a leading role in monitoring Open Science. The conversation revolved around a simple but crucial question: how can we measure openness responsibly?
A Shared Starting Point
The workshop opened with a reminder that libraries have long been at the forefront of supporting Open Access, but the next challenge is to make the impact of those efforts visible: considering what data should be collected, and how those data should be used to guide change.
The discussion was framed around two key initiatives shaping research assessment in Europe: the Coalition for Advancing Research Assessment (CoARA) and the Barcelona Declaration on Open Research Information. Both call for transparent and contextual approaches to evaluating research, areas where libraries already bring essential expertise.
The session included three presentations by Alicia Fátima Gómez, Ivana Matijević, and Andrea Solieri.
Responsible Use of Metrics for Assessment of (Open) Research
Access the slides here
by Alicia Fátima Gómez, IE University Library (Madrid, Spain)
The presentation explores the shift from traditional research metrics to more responsible and open approaches to assessment, highlighting the growing alignment between Open Science practices and the reform of evaluation systems.
Starting with the limitations of conventional indicators such as the impact factor and h-index, the talk traces key milestones in the movement towards responsible metrics, including DORA, the Leiden Manifesto, The Metric Tide, the Hong Kong Principles, and the CoARA Agreement on Reforming Research Assessment. It also references the Barcelona Declaration on Open Research Information as a recent commitment to openness and transparency in research evaluation.
The presentation reviews major European and global initiatives that promote responsible assessment, from the European Commission’s Next Generation Metrics and Evaluation of Research Careers reports to projects such as OPUS (Open and Universal Science) and TARA (Tools to Advance Research Assessment). It also highlights institutional examples like TU Delft and Utrecht University, where recognition and rewards systems are being redefined to reflect Open Science values.
By contrasting “closed” and “open” approaches to science, Gómez emphasises a cultural shift: from publication-based, individualistic models to collaborative, multi-output, and socially engaged research. The session concludes that libraries, with their expertise in metadata and open infrastructures, are crucial allies in implementing and monitoring responsible assessment, ensuring that indicators remain transparent, contextual, and aligned with the goals of Open Science.
The Role of Libraries in Open Science Monitoring: A Croatian Perspective
Access the slides here
by Ivana Matijević, National and University Library in Zagreb (Croatia)
The presentation provides an overview of the current state of Open Science (OS) in Croatia, highlighting how libraries can drive OS monitoring and promotion.
Focusing on methods, tools, and infrastructures rather than raw figures, the presentation shows how libraries can collect and interpret OS-related data; advise researchers and university leadership; build capacity through training and collaboration; and lead practical monitoring activities: tracking Open Access publishing, assessing the FAIRness of research data, analysing repository usage, and providing evidence for institutional and national reporting.
While Croatia lacks a formal system for evaluating OS progress, the presentation shows that valuable tools, datasets and expertise are already in place. Libraries are shown as key intermediaries who can enhance transparency, accessibility and accountability in research by translating complex data into actionable insights for decision-makers, institutions and end users.
The concluding note underlines that libraries are essential enablers of Open Science – fostering trust, supporting implementation at both institutional and national levels and shaping the future development of OS monitoring.
10-step guide to monitor APCs
Access the slides here
by Andrea Solieri, head of Open Science Services at University of Modena and Reggio Emilia (Italy)
The presentation outlines a practical, step-by-step approach to establishing a monitoring system for article processing charges (APCs).
Rather than presenting data or charts, the talk focuses on methods and tools to track real, not estimated, APC expenditures, in a way that overcomes the initial challenges, including inconsistent payment practices, lack of metadata, and absence of APC-specific accounting categories.
Through a ten-step guide, the talk illustrates how to implement targeted improvements: integrating APC payment requests within the institutional repository, creating dedicated accounting codes, and ensuring communication and cooperation among researchers, accountants, IT developers, libraries and administrative offices.
A key element is the automation and interoperability between the repository and the accounting system, allowing for effective linkage between costs and related publications.
The final steps cover how to clean, enrich, and report data, with suggestions for stakeholder-oriented reporting and data sharing with initiatives like OpenAPC. The talk concludes with the message that APC monitoring is a continuous, iterative process, one that demands persistence, collaboration, and a good set of tools.
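To make the repository–accounting linkage concrete, here is a minimal Python sketch. It is an illustration only, not the speaker's actual implementation: the field names, the sample records, and the assumption that OpenAPC's core columns are `institution`, `period`, `euro`, `doi`, and `is_hybrid` plus publisher metadata should all be checked against the current OpenAPC submission documentation before reporting real data.

```python
import csv
import io

# Hypothetical sample data: an accounting export of invoices tagged with a
# dedicated APC cost code, and repository metadata keyed by DOI.
invoices = [
    {"doi": "10.1234/abc.001", "euro": "1850.00", "period": "2024"},
    {"doi": "10.1234/abc.002", "euro": "2300.00", "period": "2024"},
]
repository = {
    "10.1234/abc.001": {"publisher": "Example Press", "is_hybrid": "FALSE"},
    "10.1234/abc.002": {"publisher": "Example Press", "is_hybrid": "TRUE"},
}

def build_openapc_rows(invoices, repository, institution):
    """Link each paid invoice to its repository record via DOI and emit
    rows shaped like an OpenAPC-style core record (column set assumed)."""
    rows = []
    for inv in invoices:
        meta = repository.get(inv["doi"])
        if meta is None:
            # Unmatched payments would go to a manual-review queue.
            continue
        rows.append({
            "institution": institution,
            "period": inv["period"],
            "euro": inv["euro"],
            "doi": inv["doi"],
            "is_hybrid": meta["is_hybrid"],
            "publisher": meta["publisher"],
        })
    return rows

def to_csv(rows):
    """Serialise the linked rows to CSV for reporting or data sharing."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

rows = build_openapc_rows(invoices, repository, "Example University")
print(to_csv(rows))
```

The DOI is the pivot here: it is the one identifier that both the accounting system (via a dedicated APC field on the payment request) and the repository can reliably share, which is why the talk stresses dedicated accounting codes and metadata cooperation.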
Taken together, the three contributions sketch a practical pathway for libraries to monitor Open Science: start by aligning assessment with responsible-metrics principles, build on national infrastructures and open data streams, and operationalise the work through automated, reproducible workflows at institutional level.
After the talks, a 45-minute breakout in groups explored two questions on how practices and tools relate to monitoring Open Science: which methods are used to collect data on open-research practices, and which tools support them. Groups then mapped their answers onto a four-field matrix (traditional↔responsible; low↔high cultural impact).
Main takeaways:
- Tools do not generate impact by themselves; outcomes depend on how they are used and to what ends.
- There is continued reliance on proprietary services; alongside adopting new approaches, low-impact routines should be discontinued.
- National and institutional contexts shape how activities and tools are interpreted and what is considered a driver of change.
- Using open data alone does not necessarily change practices; an explicit impact logic is required.
- Monitoring tends to default to Open Access; participants recommended broadening to the wider Open Science spectrum (data, software, collaboration, peer review, etc.).
Overall, the workshop outlined a pragmatic path for monitoring Open Science:
- Anchor efforts in responsible-assessment principles (DORA, Leiden, The Metric Tide, CoARA) and the Barcelona Declaration so indicators stay transparent and contextual.
- Build on what already exists by linking repositories, CRIS, journal portals, and bibliometric services via open, PID-based data flows, and treat monitoring as workflow design with standard categories, shared metadata, and clear roles across libraries, research offices, IT, and finance.
- Automate across systems to reveal real costs and connect outputs to expenditures for decision support. Invest in skills, recognition and rewards, and library mediation, and keep iterating: clean, enrich, report, and publish aggregates (e.g., to OpenAPC).