Monthly Archives: May 2024

W-log #4: events, reflections, and readings about reproducibility

What happened? A few days ago, the University of Bologna organised a half-day event entirely dedicated to the concepts of reliability, transparency and reproducibility of research. The programme featured international experts, including Sabina Leonelli and Daniele Fanelli, who gave two exciting keynote speeches, and two roundtables: one with several institutional representatives (from the National Agency for the Evaluation of Universities and Research Institutes, the Ministry of University and Research, the UK Reproducibility Network, and the Jagiellonian University) and one with UNIBO’s researchers (from PhD students to professors), who presented their take on reproducibility within their own research and discipline. It was a wonderful moment of discussion within our community, and it will be followed by two additional events (in October and December) dedicated, respectively, to interdisciplinarity and ethics.

What I did. Prompted by my take-home messages from the event, and by the slides I prepared for a lecture to second-year students of the Cultural Heritage in the Digital Ecosystem PhD programme, I spent several hours reflecting on the concept of transparency, which is strongly tied to reproducibility. In several meetings and papers, I have seen transparency introduced too many times as a synonym for quality. While I understand the rationale behind such a conclusion, I believe this claim is rather simplistic – and, from my perspective, wrong. Outlining the whole rationale of my thoughts rigorously would require more effort (and words) than a short paragraph in a blog post. Still, the main point is this: quality is a goal to achieve (e.g. to perform qualitatively sound work), while transparency is a tool that enables others to check whether we have reached that goal. In practice, quality can exist without transparency, but transparency is needed if we want people to scrutinise quality. Thus, transparency does not build quality; it builds trust.

What to read. Given how much these topics have absorbed me this week, the weekly suggestion is, indeed, about research reproducibility. It shows that there is no single definition that fits all cases but, rather, a plethora of different lenses one may need to adopt, depending on the contextual and/or disciplinary situation:

Goodman, S. N., Fanelli, D., & Ioannidis, J. P. A. (2016). What does research reproducibility mean? Science Translational Medicine, 8(341). https://doi.org/10.1126/scitranslmed.aaf5027 – Open Access at https://osf.io/dw23g

W-log #3: Application of maturity dimensions for catalogues, digitisation of museums, responsible use of indicators

What happened? In the previous log, I introduced the article I co-authored describing the work done in the context of the EOSC Task Force on Semantic Interoperability to identify twelve maturity dimensions (and related sub-features) for assessing catalogues of semantic artefacts. A few days ago, FAIRsharing, one of the catalogues we assessed in that article, published a blog post that revised and updated the analysis concerning their service.

What I did. The Creators Day 2024 was an event organised in Bologna and focused on sharing with society the effort and dedication that several institutions, including the University of Bologna, put into the Cultural and Creative Industries. I had the honour of being part of the programme as one of the speakers, sharing the experience I have gained over the past year – thanks to the many researchers working at the University of Bologna and the Institute of Heritage Science of the Italian National Research Council – on the digitisation of museums and, more generally, of cultural heritage.

What to read. Over the past years, thanks to several initiatives worldwide supported by a variety of institutions, the Open Science community has often engaged in discussions about research assessment and the importance of the responsible use of quantitative indicators. One of the most recent contributions to this discussion was published by DORA just a few days ago. It is a short and pleasant read on why the misuse of specific indicators – e.g. the Journal Impact Factor and the Hirsch index (h-index) – should be avoided, and on how to guide interested communities in using such quantitative indicators in research assessment exercises:

Declaration on Research Assessment (DORA). (2024). Guidance on the responsible use of quantitative indicators in research assessment. Declaration on Research Assessment (DORA). https://doi.org/10.5281/zenodo.11156568

W-log #2: Italian situation on digital cultural heritage, semantic artefacts, science and society

What happened? One year ago, the Italian Ministry of Culture published guidelines that imposed heavy payments for the (re)use of images of state-owned cultural heritage objects (CHOs), even in scientific publications derived from publicly funded (e.g. by the European Union) research. These guidelines ran contrary to the European Union’s indications on the digitisation of cultural heritage, which for years have pushed for measures guaranteeing that digitised CHOs (DCHOs) in the public domain remain as open as possible and compliant with the FAIR (findable, accessible, interoperable, reusable) principles, for any purpose – the motto here is: what is in the public domain stays in the public domain. Thus, a few weeks ago, after several months of objections, the Italian Ministry of Culture revised its guidelines to relax the conditions under which such heavy payments apply to scientific publications. However, we are still far from the goal, considering that there remain massive restrictions, regulated by law, on applying open licenses (e.g. Creative Commons CC0, CC-BY, CC-BY-SA) to these DCHOs – a mandatory requirement in several funding programmes, including Horizon Europe.

What I did. As the corresponding author, I finalised the paperwork with the publisher for the open-access publication of a work co-authored with Andras, Clement, Emanuele, Fajar, Ivan, and Oscar, just published in Scientific Data. The article, entitled A maturity model for catalogues of semantic artefacts, focuses entirely on the key components that enable the implementation of semantic interoperability, i.e. semantic artefacts, which are machine-actionable formalisations of a conceptualisation enabling sharing and reuse by humans and machines. In particular, we reflected on the catalogues where semantic artefacts are stored – i.e. web-based systems that foster the availability, discoverability, long-term preservation and maintenance of semantic artefacts – and provided a set of twelve dimensions for measuring the maturity of such catalogues.
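To give a rough idea of what assessing a catalogue against such dimensions might look like in practice, here is a minimal sketch in Python. It is purely illustrative: the dimension names, checks, and example record below are hypothetical placeholders I made up for this post, not the twelve dimensions (or the assessment method) defined in the article, where each dimension also comes with richer sub-features rather than a single yes/no check.

```python
# Illustrative sketch only: scores a fictional catalogue record against a few
# made-up maturity dimensions, each reduced to a simple yes/no check.

from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class Dimension:
    name: str
    check: Callable[[Dict], bool]  # True if the catalogue satisfies the dimension


# Hypothetical dimensions (NOT the twelve defined in the article).
DIMENSIONS = [
    Dimension("persistent identifiers", lambda c: bool(c.get("pids"))),
    Dimension("machine-readable metadata", lambda c: bool(c.get("metadata_formats"))),
    Dimension("versioning of artefacts", lambda c: bool(c.get("versioning"))),
    Dimension("open API access", lambda c: bool(c.get("api_url"))),
]


def assess(catalogue: Dict) -> Dict[str, bool]:
    """Return, for each dimension, whether the catalogue satisfies it."""
    return {d.name: d.check(catalogue) for d in DIMENSIONS}


if __name__ == "__main__":
    example_catalogue = {  # entirely fictional record
        "name": "Example Semantic Artefact Catalogue",
        "pids": True,
        "metadata_formats": ["DCAT", "schema.org"],
        "versioning": False,
        "api_url": "https://example.org/api",
    }
    results = assess(example_catalogue)
    for name, ok in results.items():
        print(f"{'yes' if ok else 'no '} - {name}")
    print(f"Maturity: {sum(results.values())}/{len(DIMENSIONS)} dimensions satisfied")
```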

What to read. Since its introduction, my course on Open Science within the second-cycle degree in Digital Humanities and Digital Knowledge has closed with a lecture in which I invite an expert to talk about the current status of Open Science in Europe. This year, continuing the tradition of previous editions, I had the pleasure of hosting a friend of mine, Elena Giglia, who gave a presentation entitled Open Science: in dialogue with society. My reading suggestion for this week is the slides of her talk, which have been deposited on Zenodo:

Giglia, E. (2024, May 7). Open Science: In dialogue with society. Open Science course, a.a. 2023/2024, Digital Humanities and Digital Knowledge, University of Bologna, Bologna, Italy. https://doi.org/10.5281/zenodo.11127310