Crowd Editions:
Call for Reviews (English)

October 2023

The Institute for Documentology and Scholarly Editing (IDE) is calling for reviews dedicated to crowd edited and peer sourced Digital Scholarly Editions (DSE). The reviews are planned for publication in RIDE, the online review journal for digital editions and resources, under the topic “Crowd Edited Digital Scholarly Editions”, as a rolling release starting in 2024.

We are inviting reviews of such Digital Scholarly Editions from all Humanities disciplines. “Crowd edited” is understood in a broad sense: it describes a form of digitally enabled participation in the tasks that the creation and publication of a digital scholarly edition entail. It can mean that typical tasks, e.g., transcription, collation, annotation, (cross-)linking, or commentary, are “crowd sourced” or “peer sourced”, that is, undertaken by an informal group of committed persons through a publicly accessible platform. It can also mean that the edition is created entirely by an independent public user group, e.g., on a social media platform (“social edition”).

To guide reviewers through the review process and to create a framework for evaluating crowd edited DSE, specific guidelines are provided at the end of this call. The general guidelines for reviewing DSE (Version 1.1) can be found here: https://www.i-d-e.de/publikationen/weitereschriften/kriterien-version-1-1/.

Before reviewing a crowd edited DSE, we kindly ask you to email us at ride-crowd@i-d-e.de with a suggestion of which resource you would like to review and a short explanation of your affiliation, your connection to the DSE/project to be reviewed, and your area of expertise. This is important to avoid multiple reviews of the same resource, and also to recognise, disclose, and, as far as possible, avoid conflicts of interest.

Please also consult the information on the general RIDE writing process and the RIDE submission guidelines. Reviews published in RIDE are peer-reviewed articles. To allow for sufficiently in-depth discussion, reviews should have a minimum length of 2000 words. Reviews can be submitted in German or English. Publications will be licensed under CC BY 4.0. RIDE does not charge submission, processing, or publication fees of any sort.

Background

Crowd editing is not yet among the traditional means of creating a scholarly edition. However, with the emergence of digital environments, activating the power of the crowd for highly specialized, labor-intensive, and/or tedious tasks in scholarly editing has not only become much easier; it has also become a potentially relevant method for engaging with the politically attractive area of so-called citizen science. The idea of crowd edited DSE emerged in the Anglo-American sphere about fifteen years ago (cf. Busch/Roeder 2023), yet finding the break-even point between the workload taken on by the crowd and the workload of crowd management and quality control has remained a challenge for which no standard manual exists. One goal of the RIDE issue is to compare different experiences and identify generic best practices for organizing “crowd editions”.

The tasks addressed by peers are diverse: basic transcription, post-OCR correction, collation, annotation, (cross-)linking, named entity recognition, image editing or annotation, data cleaning, and so on. There are also different approaches to how much quality control the crowd needs: Does the “community brain” have enough experience to identify mistakes or ambiguities, or is there an acceptable error rate? Does every task have to be checked by an academic senior editor? This is not only important for calculating the “overhead cost” of crowd editing; it also defines the level of confidence in the relationship between the public and science. Participants often emphasize that a successful crowd editing approach relies much more on mutual trust than on actual crowd control (cf. Moczek 2023). More literature on the topic of crowd editions can be found in the Zotero open bibliography.

The journal RIDE was founded in 2014 to provide a framework for the evaluation of digital scholarly editions and is now extending its scope to crowd edited DSE. RIDE is Open Access; reviews are published as HTML and can be downloaded as TEI. The RIDE issue on Crowd Edited DSE is edited by Anna Busch, Martin Prell, and Torsten Roeder.

Submission details

If you are interested in writing a review, please contact us in advance so as to avoid overlapping content. Reviews are accepted in English and German. The length of the review can vary (approximately 2000-5000 words).

Please submit your review as an editable text file (preferably, but not necessarily, as docx to facilitate conversion to TEI). Please send illustrations as separate image files (PNG) and leave a note in the text as a placeholder for each image. In addition to the text, we collect keywords (up to five per review). Each review should be accompanied by a short abstract in English, regardless of the language of the main text.

For further information please check the general RIDE guidelines for writing and submitting at https://ride.i-d-e.de/reviewers/submission-guidelines/. The questionnaire mentioned in the submission checklist has been designed for scholarly digital editions in general.

All reviews will be peer-reviewed to ensure a high quality of evaluation. We believe this is important, as the evaluation of digital scholarly resources usually requires dual expertise in digital methods as well as in the respective discipline.

Specific Guidelines

We would appreciate special attention to the following points within the review:

  • Conceptual aspects
    • What does the edition call itself? How does it emphasize the crowd aspects?
    • Are there references to role models? Do partner projects exist?
    • What does the edition hope to achieve by involving the crowd?
    • What are the benefits for the individual participants? (Gamification, social inclusion, study credits, etc.)
  • Practical implementation
    • How can people get involved? (Sign-up process, learning curve)
    • What does the crowd do? Typically, for example, transcription and annotation; but to what extent are “editorial” tasks also taken on, such as textual criticism and commentary?
    • Who manages the process (the crowd itself = social edition, or the project management = crowdsourcing)? How is quality managed?
    • Which platform is used? (e.g., MediaWiki, a custom platform)
    • Technical aspects: how is it implemented?
  • Legal aspects
    • Is there an assignment of rights of use or a volunteer agreement?
    • How will participation be credited?
    • How is the crowdsourced content licensed?