Warnings, Caveats and Weasel Words

Most of the experiments linked to here are running on R&D equipment in a non-production environment. They may disappear without warning and/or perform erratically. If one of them isn’t working for some reason, come back later and try again.

Labs


Crossref Labs is the dedicated R&D arm of Crossref. In late 2021 we announced that we are re-energizing Labs after a period of working mostly on development tasks.

What’s our focus?

The division between what the R&D group does and what the wider organisation does will always be more of a gradient than a line. But at the highest level, we'd say that R&D will focus on projects that:

  • Address new constituencies.
  • Involve fundamentally new approaches (technology or process or both).
  • Are exploratory with no clear product application yet.

And that a “strategic initiative” (as opposed to a new feature, service, or product) is something that:

  • Involves something we’ve never done before.
  • Involves potential changes to our membership model and fees.
  • Would require a large investment of resources outside of normal budget.

We’re certainly not the only group at Crossref that experiments, builds proofs of concept (POCs), and does research, but we hope to support other groups doing the same - both inside our organization and in the wider research ecosystem.

Sound right up your street? Interested in collaborating on something you’re working on? Let us know.


What are we working on now?

All the projects we are working on can be browsed here. We also have a list of Labs ideas.

A few current highlights are:

  • Measuring how well our community preserves its content. A dashboard is available at https://the-vault.fly.dev/, and the results are also displayed on the Labs prototype (see the next item!)
  • Playing around with how we might evolve our Participation Reports.
  • Looking at running a “labs” version of our API where we can experiment with exposing new functionality and metadata in order to get quick feedback from the community.
  • Making the Retraction Watch data openly available via the Labs API (and in .csv format) so we can get feedback on it before integrating it into our REST API.
  • Looking at how we can extend the classification information we currently make available via our API across more journal titles.
  • Exploring how we currently do citation matching, with a view to evolving this approach, i.e., making it better, more transparent, and open to community contributions.
  • Creating a sampling framework that can be used to extract samples of works and make them publicly available.
  • Building POC tools to help our members and support team more easily accomplish common tasks.
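Several of the items above involve the experimental Labs API, including the openly available Retraction Watch data. As a sketch of how one might fetch such a dataset, here is a minimal Python example. The host name and the `/data/retractionwatch` path are assumptions for illustration - check the Labs announcements for the actual endpoints before relying on them.

```python
from urllib.parse import urlencode

# Assumed Labs API host - the experimental counterpart to api.crossref.org.
LABS_API = "https://api.labs.crossref.org"

def labs_url(path, **params):
    """Build a Labs API request URL.

    Including a 'mailto' parameter is the polite way to identify
    yourself to Crossref APIs so they can contact you about usage.
    """
    query = urlencode(params)
    return f"{LABS_API}{path}?{query}" if query else f"{LABS_API}{path}"

# Hypothetical path for the Retraction Watch dataset in CSV form.
url = labs_url("/data/retractionwatch", mailto="you@example.org")
print(url)
```

Fetching the URL (e.g. with `urllib.request.urlopen`) would then stream the CSV; the URL-building step is separated out so it can be checked without network access.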

A flavor of Labs research and experiments

Graduated to production services (or became part of them)

Once upon a time, PLOS ran a project called Article Level Metrics (ALM). It worked well for them internally and had garnered interest and support from other organizations. It had enough potential that we decided to pick it up as a Labs project. The idea was to collect information from various online sources mentioning their DOIs - a form of altmetrics.

The project had a couple of aims:

  • Scale effectively beyond one publisher to all Crossref DOIs.
  • Work out what would be needed to create a production version of such a tool.

The output was intended to be open data providing context around outputs, while avoiding creating yet another set of metrics. If successful, it had the potential to centralise the collection of this kind of data, providing significant efficiencies to publishers and sources. Read more about the early stages of the project here.

The initial experimenting proved successful and by Spring 2014 we were in a position to run a pilot with the cooperation of a number of organisations. The project had become the DOI Event Tracker (DET), which built on Lagotto, the successor of ALM. Read about the DET pilot here.

DET was then ready to be handed over to the production team, and in 2017 it entered a beta phase.

It has continued as Crossref Event Data, with over 800 million events collected (as of October 2021) and available via a public API.
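As a sketch of what querying that public API looks like, the Python below builds a request for events whose object is a given DOI. The host and parameter names (`obj-id`, `rows`, `mailto`) follow the Event Data API as we understand it, but double-check them against the current documentation before relying on them.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen  # used only in the commented-out fetch below

# Public Crossref Event Data API endpoint.
EVENT_DATA_API = "https://api.eventdata.crossref.org/v1/events"

def events_url(doi, mailto, rows=10):
    """Build a query URL for events whose object is the given DOI."""
    params = {"mailto": mailto, "obj-id": doi, "rows": rows}
    return f"{EVENT_DATA_API}?{urlencode(params)}"

url = events_url("10.5555/12345678", "you@example.org")
print(url)

# Uncomment to fetch live data (requires network access):
# with urlopen(url) as resp:
#     events = json.load(resp)["message"]["events"]
#     for e in events:
#         print(e["source_id"], e["relation_type_id"], e["occurred_at"])
```

Splitting URL construction from the fetch keeps the query logic testable offline; the commented-out loop shows the kind of per-event fields (source, relation type, timestamp) the API returns.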

Page owner: A Crossref Labs creature   |   Last updated 2023-February-20