Open Science: materials and tools

This section, which is a work in progress, provides some insights on open science issues.

It is intended to offer updated tools and materials on the new frontiers in open knowledge.

For advice on, and screening of, any journals approaching you, contact:

[email protected]

Where not to publish in open access: predatory publishing

Predatory publishing is an exploitative publishing model that has emerged in recent years. It lures researchers who need to publish quickly with the false promise of fast peer review and publication, and of wide-ranging dissemination.

The goal of predatory publishing is not to disseminate the results of scientific research through open access, though it exploits its logic, but to earn by deceiving academics, offering no or poor-quality services and hiding behind open access as an ethical way of doing research.

More information

Learn more about the world of scientific publishing and how to publish safely and with quality.



Predatory publishing exploits the logic of open access, as it does not charge readers but authors (or their institutions), who are attracted by the promise of fast publication with a view to meeting funding or career development requirements. The victims of predatory publishing are often young researchers with few publications, who need to publish quickly.

Predatory publishing promises rapid publication through fast but inaccurate peer review, or no review at all. By contrast, accurate peer review is a process that takes time and ensures the validity of scientific research.

The contributions published in predatory journals with no peer review cannot be deemed valid.

Furthermore, predatory publishers do not have archiving policies that ensure the conservation of published works, precisely because their only interest is in profit.

The sites on which your articles have been published may also disappear at any moment without a trace.

It is not easy to recognize predatory publishers, because they create convincingly realistic journal websites. They may also state Impact Factors (or similar indicators) that turn out to be fake, or falsely declare to be indexed in recognized databases, such as Scopus, Web of Science or PubMed.

Often the declared scientific committee members are not professionals, or they are professionals listed without their knowledge, and the published articles may be plagiarized.

Typically, predatory publishers contact researchers by e-mail and flatter them into contributing to their journals, described in an enthusiastic and unprofessional manner. Publishing fees are not clear and are often increased once the article is published.

Researchers who realize they have fallen into a trap before publication may not be able to withdraw their contributions, which may be published without the author's consent.

When submitting your contribution to a scientific journal, please consider the following:

  • Email invitation: is the email invitation to publish well written or does it have typos, spelling and grammatical errors? Does it use unprofessional language? Was it sent from an institutional or general account (e.g. gmail, yahoo, etc.)?
  • Journal title: is it similar to that of a leading journal in the field? Predatory journals often use titles reminiscent of prestigious, internationally recognized journals.
  • Geographical information in the title: a title may suggest that the journal is based in the US or the UK, while actually being based elsewhere.
  • "About" section: is the information accurate? Is the research area specific or too vast for the journal to be reliable? Predatory journals often state broad areas of research to attract more submissions.
  • Contact details: is there a physical address, phone number, institutional email (as opposed to a general one, such as gmail, yahoo ...)?
  • Editorial board: is it possible to contact its members? Are they well-known professionals in the research field? Even if they are, be careful, as predatory journals often list well-known professionals as members of the scientific committee without their knowledge. You may write to the contact persons to dispel any doubts.
  • Published articles: consult the published articles and check the professionalism of the authors. Predatory journals often publish plagiarized or unscientific contributions; sometimes they have no content at all.
  • False indexing: predatory journals often declare that they are indexed in well-known databases, such as Web of Science, Scopus, PubMed, DOAJ. You can fact-check these statements by querying the databases themselves or by doing a preliminary search on MIAR (Matriu d'Informació per l'Anàlisi de Revistes).
  • Impact Factor: predatory journals often indicate that they have an impact factor which proves to be fake. You can check it on Journal Citation Reports.
  • Invented metrics: predatory journals often cite metrics that turn out not to exist. It is important to check whether these metrics are used by other accredited journals.
  • Author charges: are article processing charges (APC) or article submission charges stated clearly? Are they comparable to other open access journals in the same field? Are payment times clear?
  • Instructions for authors: are there any submission instructions? Is it clear how submissions will be treated?
  • Peer review: is the type of peer review specified? Be wary of journals promising quick peer review, as it means that there will be no peer review or that it will be inaccurate, thus affecting the validity of the publication.
  • Digital archiving: is information about digital archiving available? Predatory journals provide no archiving guarantee.
  • Copyright:  is copyright information clear?
  • ISSN: does the journal have an ISSN? You can check the accuracy on the ISSN portal.
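As a quick first check of the last item above, the ISSN's final character is a check digit that can be verified offline before consulting the ISSN portal. A minimal sketch of the standard ISSN checksum (the sample values in the comments are illustrative):

```python
def valid_issn(issn: str) -> bool:
    """Validate an ISSN check digit: weights 8..1 over the eight characters;
    the weighted sum must be divisible by 11. 'X' stands for 10 and is only
    allowed in the last position."""
    chars = issn.replace("-", "").upper()
    if len(chars) != 8:
        return False
    total = 0
    for weight, ch in zip(range(8, 0, -1), chars):
        if ch == "X" and weight == 1:
            value = 10
        elif ch.isdigit():
            value = int(ch)
        else:
            return False
        total += weight * value
    return total % 11 == 0

# e.g. valid_issn("0378-5955") -> True, valid_issn("0378-5954") -> False
```

A passing checksum only means the number is well-formed; whether it actually belongs to the journal still needs to be confirmed on the ISSN portal.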

The phenomenon of predatory publishing can also concern those who intend to publish books or book chapters.

Here are some things to consider when publishing a book or chapter:    

  • Publisher: Is the publisher known in the industry? Is it easy to contact them by phone, email, post?
  • Searchability: Is it easy to find the publisher's latest books?
  • Editors: Are editors well-known in your field? Is it easy to contact them by phone, e-mail, post?
  • Peer-review: Is the type of peer-review clearly stated?
  • Indexing: Are the publisher's books indexed in the databases for your field?
  • Archiving: Does the publisher provide long-term archiving services?
  • Author charges: Are these clearly stated?
  • Author guidelines: Are author guidelines published?
  • License: Does the publisher use a clear license for Open Access books and state whether exceptions are allowed based on the author's needs?
  • Copyright: Are the authors' rights clearly expressed? For instance, is it allowed to publish the electronic version of a book or chapter on an OA website, such as an institutional repository?
  • Recognized organizations: Is the publisher a member of a recognized organization? For example, does it comply with the guidelines of the Committee on Publication Ethics (COPE), and are its OA books listed in the Directory of Open Access Books (DOAB)?

Pre-print

A preprint is a complete scientific manuscript that is uploaded by its authors to a public server. A preprint contains data and methods; it is often the same manuscript that is submitted to a journal. After a quick quality check to ensure that the work is scientific in nature, the manuscript is published on the web without peer review and can be viewed for free by anyone in the world. Based on feedback and/or new data, new versions of a preprint may be uploaded, while previous versions are also retained.

Preprint servers allow scientists to directly control the dissemination of their work in the global scientific community. In most cases, the same paper published as a preprint is peer reviewed for publication in a journal or, according to the latest developments, on a platform for open peer review. In the first case, if the paper is accepted it will be published as the version of record; in the second case, the preprint will be accessible along with the review reports as a peer-reviewed preprint. Some journals allow portability of this type of open peer review.

Preprint servers and open peer review platforms arose as, and should remain, public infrastructure.

 

The office is available to hold training events on the subject of preprints: [email protected]

Until a few years ago, arXiv was the only preprint server. Over the past few years, however, preprint archives have multiplied across many areas, including philosophy, literature, history, social sciences, agricultural sciences, and psychology.


Dissemination through a preprint server offers the following advantages to authors and readers:

  1. It allows authors to certify priority on a new idea;
  2. Preprints on some servers are open to comments and feedback that can help the author improve the manuscript before or when submitting it to a journal;
  3. It ensures rapid dissemination and timely sharing of research results;
  4. There are no costs for the author or reader;
  5. Following peer comments, preprint servers archive subsequent versions, while retaining all previous ones;
  6. Some funding institutions (e.g. ERC) support the use of preprint servers;
  7. Most journals accept manuscripts that have previously been distributed on preprint servers (usually not-for-profit ones), so it is important to always review a journal's preprint policies;
  8. Authors usually retain copyright on papers published on preprint servers.

OSF Preprints: this search engine queries numerous preprint servers.

Google Scholar: also includes preprint servers in its searches, but with no distinction from other types of papers; preprints can be identified by their URLs.
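That identification can be sketched programmatically by matching a result's host against known preprint servers. A rough heuristic only; the host list below is illustrative and far from complete:

```python
from urllib.parse import urlparse

# Illustrative, non-exhaustive list of preprint server hosts.
PREPRINT_HOSTS = ("arxiv.org", "biorxiv.org", "medrxiv.org", "osf.io", "ssrn.com")

def looks_like_preprint(url: str) -> bool:
    """Guess whether a search-result URL points to a known preprint server."""
    host = urlparse(url).netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    return any(host == h or host.endswith("." + h) for h in PREPRINT_HOSTS)
```

For example, `looks_like_preprint("https://arxiv.org/abs/2101.00001")` returns True, while a journal URL such as one on nature.com does not match.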

ASAPbio (Accelerating Science and Publication in biology) is a researcher-driven initiative promoting the rapid dissemination of research work in the biological sciences.

A list of the policies of research funding bodies with respect to the publication of preprints is available on the ASAPbio website.

ASAPbio and EMBO have launched Review Commons, a peer review project for preprints in which one or more reviewers evaluate a preprint before it is submitted to a journal.

Post-print

A postprint (Author’s Accepted Manuscript, AAM) is the final draft of a manuscript that has already undergone peer review. It incorporates the changes requested by the reviewers, but not the publisher's layout: graphic design, pagination, logo and copyright notice. Page numbers, if any, start from 1, and any tables can be found at the end of the text.

The postprint is the version generally accepted for open access dissemination and use. It is also the first version admitted for self-archiving in institutional repositories, often subject to an embargo period during which access is closed and only the bibliographic metadata are visible. For this reason, it is fundamental to check the conditions established in the publishing contract, including the choice of the license (normally a Creative Commons license) applicable to the manuscript.

 

 

For information

[email protected]

Open Peer Review

Peer review is a recognised process that aims to validate research works to ensure the scientific value of a publication. Peer review can come in different forms, depending on the journal.  More recently, following the widespread use of the Internet and the success of open access publishing, open peer review has been gaining a foothold, as a type of peer review which entails some changes to the traditional academic review process.

Open peer review does not have a standard structure, but can be generally defined as a peer review model which involves the disclosure of the authors’ and referees’ identities (at least to each other), and where specific aspects of the peer review process itself are made public.

Open peer review is different from traditional peer review in some or all of the following: authors and referees know each other; edits are published along with the article; authors, referees and editors can engage in open discussions; other members of the academic community can add comments. This also results in a clear separation between the peer review process and the publishing process, as the review may be carried out by an organisation other than the publisher. An equally important aspect is the fact that pre-print manuscripts are immediately available and can be edited before the version of record is published, while further comments and evaluations can be received after publication.

The debate on the pros and cons of traditional and open peer review is still ongoing and will continue for years. Nevertheless, it is undeniable that open peer review significantly reduces the costs and time required for the dissemination of articles, and makes it possible to immediately start an open discussion within the academic community, not limited to the referees involved. This openness leads to a richer debate and facilitates innovation and further developments in research.

One of the most interesting aspects is the opportunity for the scientific community to have a clearer overview of the decision-making processes within a journal. Transparency is in fact one of the distinctive features of open science and of the tools used to implement its principles.

There is another fundamental aspect: if referees decide to give up their anonymity (this being a choice in open peer review, not a requirement), the value of their work can be recognised. This could set in motion a virtuous mechanism to expand the range of evaluation parameters considered for career advancement in academia. As a consequence, review processes would be more accurate and impartial, as they would be open to criticism by other members of the scientific community.

For information

[email protected]

Open Access search engines

Below is a list of the main search engines in the Open Access academic world.

OpenAIRE: provides access to Open Access scientific literature funded by the EU, by indexing metadata or full-text contributions filed in institutional or thematic repositories, Open Access journals and publishers, dataset archives, service aggregators such as DataCite, BASE, DOAJ.

CORE - COnnecting REpositories: is the leading aggregator of Open Access scientific publications from all over the world.

ScienceOpen: a platform providing access to Open Access publications as well as a range of services (e.g. user metrics).

DOAJ - Directory of Open Access Journals: indexes Open Access Gold scientific journals subject to peer-review.

DOAB - Directory of Open Access Books: indexes almost 30,000 Open Access peer-reviewed academic books by approx. 380 publishers.

Lens: is an Open Access platform collecting patent data from multiple repositories worldwide, and integrating it with bibliographies and research data (from PubMed, CrossRef, Microsoft Academic).

Dimensions: searches different contexts, including clinical trials, grants, patents, datasets, cross-checking and relating data; for instance, it is possible to trace  the sponsors behind a publication, or a publication which resulted in a patent.

Focus
Tools

Please find below two tools that will help you find the free version of articles of which you only know the paid version.

Unpaywall: available for Chrome and Firefox, it is an extension that allows you to search the web for the free version of a contribution, of which only the paid version has been identified.

 

Once you have downloaded the extension, when you find a paid item, a padlock icon on the right side of the browser will signal any versions available for free; by clicking on the padlock you can directly access the Open Access version.
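For scripted lookups, Unpaywall also exposes a public REST API (v2, queried as https://api.unpaywall.org/v2/<DOI>?email=<address>). A minimal sketch of reading one of its responses; the sample record below is invented for illustration, while the `best_oa_location` field name follows the public API:

```python
def best_oa_url(record: dict):
    """Return the URL of the best open-access copy from an Unpaywall
    API record, or None when the work has no free version."""
    best = record.get("best_oa_location")
    return best.get("url") if best else None

# Invented sample, shaped like an Unpaywall v2 response:
sample = {
    "doi": "10.1234/example",  # hypothetical DOI
    "best_oa_location": {"url": "https://example.org/oa.pdf"},
}
```

`best_oa_location` is null in the API response when no free copy is known, which is why the helper returns None in that case.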


Open Access Button: available both as a website and as a Firefox extension, it allows you to locate the Open Access version of a contribution of which you only know the paid version or, if none is available for free, to send the author a request to make a copy available. The Open Access Button is supported by SPARC (Scholarly Publishing and Academic Resources Coalition), whose goal is to promote Open Access as a form of dissemination of scientific research.

 

 

Altmetrics (Alternative Metrics) are alternative scientific research impact indicators, as opposed to traditional ones (e.g. Impact Factor, H-Index).

Altmetrics have spread in recent years. Compared to traditional metrics, they find scientific contributions more rapidly, as they also consider citations outside the academic field, e.g. social media, downloads, how many times an article is visited online, and how often it is mentioned in social media and other channels such as blogs, websites and preprint servers.

It may take years before a scientific work is first cited in other works. As opposed to traditional metrics, Altmetrics will detect the dissemination of a work on the web, through citation tools such as Mendeley, CiteULike, Zotero. They also consider the relevance of the article itself, not the journal in which it is published.

Altmetrics are fast because they take into consideration significant scientific works with no citations yet, as well as works that have not been peer-reviewed (which takes a long time before publication). Moreover, they use public APIs to collect data, and the scripts and algorithms that collect and interpret the data are also open.

Altmetrics don't just consider the citation count, they also examine semantic content (detecting data such as username, timestamp, tag). They differ from citations and webmetrics, which take a long time, are not structured and are closed.

Altmetrics' fast registration times allow you to create timely research suggestions, for example by offering an overview of the week's most significant contributions in a specific research area.
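As a sketch of how such open APIs can be consumed, the snippet below tallies a work's mentions per source from a list of event records shaped like those returned by Crossref Event Data. The sample events are invented; the `source_id` field name follows the public Event Data schema:

```python
from collections import Counter

def mentions_by_source(events: list) -> Counter:
    """Tally a work's mentions per source (e.g. twitter, wikipedia)."""
    return Counter(e["source_id"] for e in events)

# Invented sample, shaped like Crossref Event Data records:
sample_events = [
    {"source_id": "twitter"},
    {"source_id": "wikipedia"},
    {"source_id": "twitter"},
]
```

Such a tally is the kind of per-channel breakdown that altmetrics dashboards present alongside the plain citation count.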

Transformative agreements

Transformative agreements cover hybrid journals, i.e. subscription journals in which authors can make their papers available open access for a fee. With increasing pressure for public access to research from both research funders and society as a whole, the hybrid model has become unsustainable for institutions: some publishers are double dipping, charging readers in the form of a subscription as well as authors through article processing charges (APC). Transformative agreements are negotiated between publishers and institutions (libraries and consortia) with the aim of moving from a subscription-based model to a model in which publishers are paid for open access publishing services. Ideally, subscription costs should be replaced by the costs for reading and publishing, with no extra charge; in practice, there are always extra charges.

These are temporary and transitional agreements, with an ideal duration of two to three years. Upon expiry, the agreement should be re-negotiated. Publishers are currently not required to switch from hybrid to gold open access. The agreements will have to be revised from time to time based on new scenarios. Transformative agreements are mostly available on the ESAC website (Efficiency and Standards for Article Charges) according to a principle of publicity and transparency.

Creative Commons licenses

Creative Commons (CC) licenses are copyright licenses allowing authors to inform users how they can (re)use their work.

There are four conditions of use, each marked by a graphic symbol:

  • BY – Attribution: always present in every license
  • NC – NonCommercial
  • SA – ShareAlike
  • ND – NoDerivatives

Six combinations can be obtained by combining these four clauses.
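The count of six follows from the fact that BY is always present, NC is optional, and SA and ND are mutually exclusive (a work cannot both forbid derivatives and require derivatives to be shared alike). A quick sketch enumerating the combinations:

```python
from itertools import product

def cc_licenses():
    """Enumerate the six CC license combinations: BY is always present,
    NC is optional, and SA and ND are optional but mutually exclusive."""
    licenses = []
    for nc, share in product((False, True), (None, "SA", "ND")):
        parts = ["BY"] + (["NC"] if nc else []) + ([share] if share else [])
        licenses.append("CC " + "-".join(parts))
    return licenses
```

This yields CC BY, CC BY-SA, CC BY-ND, CC BY-NC, CC BY-NC-SA and CC BY-NC-ND.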

Source: Wikimedia Commons


CC0 is the public domain license indicating waiver of copyright on the work worldwide. It is one of the licenses that can be associated, for example, with research data.

Licenses come in three different forms:

Legal code – the actual license.

Commons deed – the simplified and abbreviated version (a sort of license summary, though with no legal effect).

Digital code – set of machine-readable metadata allowing for automated cataloguing.

CC licenses do not grant additional protection to a work: the work is already protected by copyright law; the licenses only state the conditions under which it may be used.

Open Research Europe Publishing Platform (ORE) is a scientific publication platform funded by the European Commission and intended for beneficiaries of H2020 projects and the upcoming Horizon Europe framework programme. It allows users to manage the entire publication process quickly and transparently, starting from pre-print through open peer review, up to post-print.

It publishes contributions in the following subjects: Natural Sciences, Engineering and Technology, Medical Sciences, Agricultural Sciences, Social Sciences and Humanities.

Each publication must have at least one author who is a beneficiary of a Horizon 2020 grant.

The author submits an original contribution and proposes 5 reviewers; otherwise, reviewers will be proposed by the platform.

Then the editorial team carries out a series of checks (for example for plagiarism and the presence of at least one author with Horizon 2020 funding), after which the article is published with a CC-BY license, a DOI and a publication status.

After publication, the review begins, which is open, and must include at least 2 reviewers for each article. Reviewers know that all their comments will be made public. All subsequent versions of the contribution, along with reviews and the review status, remain public.

Once the contribution passes the review, it will be indexed in the bibliographic databases that have approved Open Research Europe and in the appropriate repositories. If it does not pass the review, it remains published on the site, marked by a specific publication status.

Publication fees (APC), set at 780 euros per contribution in the call for tenders, are funded by the European Commission at no cost to the author.

DORA

Launched in 2012 on the occasion of the annual meeting of the American Society for Cell Biology in San Francisco, the San Francisco Declaration on Research Assessment (DORA) aims to identify research assessment parameters that are scientifically sounder than traditional metrics, such as the Impact Factor.

DORA promotes a series of recommendations to use research assessment criteria other than journal-related metrics, with a focus on the quality of each individual article.

The recommendations advise against the use of indicators based on traditional metrics as a tool for evaluating the work of individual researchers.

DORA calls for transparency from institutions and funding bodies assessing scientific production for the purposes of recruiting researchers and career management. It encourages them to consider not only publications, but all research products, including datasets and software, for evaluation purposes.

Publishers are recommended to reduce the emphasis on the Impact Factor, either by ceasing to promote it or by presenting it alongside other metrics that allow for a journal evaluation in a broader context. The suggestion is to adopt metrics that assess a specific contribution, as opposed to the journal in which it is published. Open Access journals are recommended to allow reuse of content through Public Domain licenses.

Metrics providers should prioritize transparency about their data and calculation methods, distribute data under a license allowing reuse, and ban the manipulation of metrics.

Researchers evaluating research are recommended to promote research quality, not traditional metrics.

 

Embracing DORA means committing to adopt scientifically and ethically correct research assessment practices.

In its paper "Open Science and its role in universities", the League of European Research Universities (LERU) provides its 23 members with an overview of the 8 open science pillars defined by the European Commission: Open data, EOSC, New generation metrics, Rewards and incentives, Future of scholarly communication, Research integrity, Education and skills, Citizen science. The document allows academia to monitor the implementation of these pillars.

In November 2021, UNESCO adopted its Recommendation on Open Science, addressed to member states.

Video

UNESCO Recommendation on Open Science

EOSC (European Open Science Cloud) is a multidisciplinary open-access platform where researchers can publish, share and reuse data and research products.


It is a project sponsored by the European Union with the mission to promote Open Science in Europe by making research data available to all in a FAIR (Findable, Accessible, Interoperable, Reusable) way.

 

Aguillo, Isidro F. (2023). Chart Open Science. figshare. Figure. https://doi.org/10.6084/m9.figshare.24541621.v1