From Field to Web: Pre-entry point digitisation

Author(s):  
Vladimir Blagoderov

Most digitisation workflows focus on legacy material, owing to the sheer number of objects already collected. However, it is just as important to develop protocols for digitising incoming material, to avoid accumulating an additional backlog. This is especially crucial with the advent of molecular collections and field sequencing. In-the-field extraction and sequencing (Oxford Nanopore Technologies 2018) may lead to increasing numbers of voucher specimens without proper collection data and labels, or specimens dissociated from their data. It is easy for researchers occupied with collecting and sequencing to postpone proper documentation until a later date. As a curator, I can vouch that specimens without properly recorded data (with only collecting codes, for example) are lost to science. Fortunately, a combination of the best collecting and curatorial practices, simple online and offline tools, and modern technologies makes in-the-field digitisation a reality. In the last couple of years, entomologists at National Museums Scotland (NMS) have been testing the following workflow:

1. Collecting routes and points are recorded with ViewRanger (Augmentra Ltd 2019), available as an app for mobile phones;
2. At the moment of collecting, event data are recorded with Epicollect5 (Imperial College London 2019), available as an Android app. The software's form generator allows the creation of different scenarios, depending on the method or circumstances of collection, and records the main types of data: text, dates, times, coordinates. An individual collecting code is associated with each record;
3. Specimens collected are prepared (pinned, stored in preservative, dried, etc.) and associated with the corresponding collecting code;
4. Additional data (diary records) are recorded in a notebook with a Neo Smartpen (NEO SMARTPEN Inc. 2017) and digitised;
5. Collecting event records are imported into a collection management system (CMS), such as PAPIS (Pape and Ioannou 2019) or EarthCape (EarthCape 2019);
6. Specimen lots (if relevant) are sorted to a desirable level;
7. Multiple specimen or lot records are created in the CMS based on the collecting event records;
8. Data labels and UID labels are printed and physically associated with specimens or lots;
9. Additional data (a kml file of the collecting route, diary records) are imported and associated with collecting events.

Steps 1-4 and, depending on available facilities, steps 5-9 can be performed in the field, before specimens reach the depository. Alternatively, steps 5-9 should be performed immediately on returning from the field.
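The expansion of collecting-event records into specimen records (step 7) can be sketched in a few lines of Python. This is a minimal illustration, not NMS code: the CSV column names and the collecting-code format are hypothetical, standing in for whatever an Epicollect5 export and the local CMS actually use.

```python
import csv
import io

# Hypothetical collecting-event export: one row per event, keyed by the
# individual collecting code assigned in the field. Column names are
# illustrative assumptions, not Epicollect5's actual schema.
EXPORT = """collecting_code,date,lat,lon,method
VB-2019-001,2019-06-12,55.95,-3.19,Malaise trap
VB-2019-002,2019-06-13,55.96,-3.20,sweep net
"""

def event_to_specimen_records(event, n_specimens):
    """Expand one collecting-event row into n specimen records, each
    inheriting the event data via the shared collecting code."""
    return [
        {
            "uid": f"{event['collecting_code']}-{i:03d}",
            "collecting_code": event["collecting_code"],
            "date": event["date"],
            "latitude": float(event["lat"]),
            "longitude": float(event["lon"]),
            "method": event["method"],
        }
        for i in range(1, n_specimens + 1)
    ]

events = list(csv.DictReader(io.StringIO(EXPORT)))
records = event_to_specimen_records(events[0], 3)
print(records[0]["uid"])  # VB-2019-001-001
```

Because every specimen record carries the collecting code, the physical label printed in step 8 only needs that code and a UID; the full event data stays linked in the CMS.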
There is no excuse for newly collected material not to be digitised before it reaches the collection. Recent NMS entomological collecting trips yielded 7358 specimens from 72 collecting events, fully documented and digitised in a matter of hours.

2018, Vol. 2, e25579
Author(s):  
Falko Glöckler ◽  
Markus Englund

The DINA system (“DIgital information system for NAtural history data”, https://dina-project.net) consists of several web-based services that each fulfil specific tasks. Most of the existing services cover single core features of the collection management system and can be used either as integrated components in the DINA environment or as stand-alone services. In this presentation, individual services will be highlighted, as they represent technically interesting approaches and practical solutions to daily challenges in collection management, data curation and migration workflows. The focus will be on the following topics: (1) a generic reporting and label printing service, (2) practical decisions on taxonomic references in collection data, and (3) the generic management and referencing of related research data and metadata. Reporting, as presented in this context, is defined as the extraction and subsequent compilation of information from the collection management system, rather than mere summary statistics. With this quite broad understanding of the term, the DINA Reports & Labels Service (Museum für Naturkunde Berlin 2018) can assist in several different collection workflows, such as generating labels, barcodes, specimen lists, vouchers, paper loan forms, etc. As it is based on customizable HTML templates, it can even be used for creating customized web forms for any kind of interaction (e.g. annotations). Many collection management systems try to cope with taxonomic issues, because in practice taxonomy is used not only for determinations, but also for organizing the collections and categorizing storage units (e.g. a “Coleoptera hall”). Addressing taxonomic challenges in a collection management system can slow down development and add complexity for the users. The DINA system uncouples these issues in a simple taxonomic service for the sole assignment of names to specimens, for example in determinations.
This draws a clear line between collection management and taxonomic research, the latter of which can be supported in a separate service. As the digitization of collection data and workflows proceeds, linking related data becomes essential for data management and enrichment. In many institutions, research data are disconnected from the collection specimen data because their type and structure cannot easily be included in the collection management databases. The DINA Generic Data Module (Museum für Naturkunde Berlin 2017) is a service that allows any relational data structure to be attached to the DINA system. It can also be used as a stand-alone service that accommodates structured data within a DINA-compliant interface for data management.
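The template-driven idea behind the Reports & Labels Service can be illustrated with a small Python sketch. The HTML snippet, placeholder names, and the specimen record below are hypothetical stand-ins, not the service's actual template language or schema:

```python
from string import Template

# A label template in the spirit of a customizable HTML report template:
# placeholders are filled from a specimen record. Field names here are
# illustrative assumptions only.
LABEL_TEMPLATE = Template(
    "<div class='label'>"
    "<b>$scientific_name</b><br/>"
    "$locality, $date<br/>"
    "leg. $collector ($catalog_number)"
    "</div>"
)

record = {
    "scientific_name": "Sciara hemerobioides",
    "locality": "Edinburgh, Scotland",
    "date": "2019-06-12",
    "collector": "V. Blagoderov",
    "catalog_number": "NMS.Z.2019.001",
}

label_html = LABEL_TEMPLATE.substitute(record)
print(label_html)
```

Because the output is plain HTML, the same mechanism that renders a printable label can equally render a specimen list, a loan form, or an annotation web form, which is what makes a single generic service sufficient for so many workflows.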


Author(s):  
Brecht Declercq ◽  
Loes Nijsmans

Both traditional and more recent audiovisual carriers degrade. Even CD-ROMs typically have an expected life span of only ten years. In addition, playback equipment for both analogue and digital carriers will ultimately grow scarcer and more expensive to repair or replace. Archives and museums are inevitably faced with the decision of whether to preserve audiovisual carriers after their content has been digitized. This paper offers a draft decision-making framework developed by the Flemish Institute for Archiving (VIAA). Assuming that an institution already has a digital collection management system in place, the proposed framework addresses the concepts of favourability, possibility, value, preservation conditions and the risk to other carriers through a series of questions. The paper also addresses the disposal of carriers, should an organization decide that disposal is in the best interests of its collections.


2018, Vol. 2, e26479
Author(s):  
Sharon Grant ◽  
Janeen Jones ◽  
Kate Webbink ◽  
Rob Zschernitz

On the 9th of April 2010, the Field Museum received a momentous email from the ORNIS (ORnithology Network Information System) team informing them that they could now access the products of a nationwide georeferencing project; its bird collection could be, quite literally, put on the map. On the 7th of August 2017, those data (along with the sister datasets from FISHNet (FISH NETwork) and MaNIS (Mammal Network Information System)) finally made their way into the Museum’s collection management system. It's easy to get data out, so why is it so hard to get it back? To make it easier, what do we need to do in terms of coordination, staffing, and/or technological resources? How can tools like data quality flags better accommodate the needs of data providers as well as data users elsewhere along the collections data pipeline? We present a real-life case study of repatriating an enhanced dataset to its institution of origin, including details on timelines, estimates of effort, and lessons learned. The best-laid repatriation protocols might not prepare us for everything, but following them more closely might save us some sanity.
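One core step of such a repatriation, matching the enhanced records back to local ones and importing only what can be safely merged, can be sketched as follows. The record structures, catalogue-number format, and field names are hypothetical illustrations, not the Field Museum's actual data model:

```python
# A minimal sketch of merging georeferences from an aggregator's enhanced
# dataset back into local CMS records, matched on catalogue number.
# Structures and values below are invented for illustration.

local_records = {
    "FMNH-Birds-12345": {"locality": "Cook Co., Illinois", "lat": None, "lon": None},
    "FMNH-Birds-67890": {"locality": "Lake Co., Indiana", "lat": None, "lon": None},
}

repatriated = [
    {"catalog_number": "FMNH-Birds-12345", "lat": 41.88, "lon": -87.63,
     "uncertainty_m": 5000},
    {"catalog_number": "FMNH-Birds-99999", "lat": 40.0, "lon": -86.0,
     "uncertainty_m": 1000},  # no local match: flag for review, do not import
]

unmatched = []
for row in repatriated:
    rec = local_records.get(row["catalog_number"])
    if rec is None:
        unmatched.append(row["catalog_number"])
    elif rec["lat"] is None:  # never silently overwrite existing coordinates
        rec["lat"], rec["lon"] = row["lat"], row["lon"]

print(local_records["FMNH-Birds-12345"]["lat"])  # 41.88
print(unmatched)  # ['FMNH-Birds-99999']
```

The unmatched list is where much of the real effort hides: identifiers drift between export and repatriation, so every non-match needs a human decision rather than an automatic import.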


2018, Vol. 2, e26083
Author(s):  
Teresa Mayfield

At an institution without a permanent collections manager or curators, who has time to publish data or research issues in those data? Collections with little or no institutional support often benefit from passionate volunteers who continually seek ways to keep them relevant. The University of Texas at El Paso Biodiversity Collections (UTEP-BC) have been cared for in this manner by a small group of dedicated faculty and emeritus curators who, with no budget, have managed to care for the specimens, perform and publish research about them, and publish a good portion of the collections data. An IMLS grant allowed these dedicated volunteers to hire a Collections Manager to migrate the already published data and add unpublished specimen records from the in-house FileMaker Pro database to a new collection management system (Arctos) that allows for better records management and ease of publication. Arctos is a publicly searchable web-based system, but most collections also see the benefit of participating in biodiversity data aggregators such as the Global Biodiversity Information Facility (GBIF), iDigBio, and a multitude of discipline-specific aggregators. Publication of biodiversity data to aggregators is loaded with hidden pathways, acronyms, and tech-speak with which a curator, registrar, or collections manager may not be familiar. After navigating the publication process, the reward is feedback! Now the data can be improved, and everyone wins, right? In the case of the UTEP-BC data, the feedback sits idle as the requirements of the grant under which the Collections Manager was hired take precedence, and it will likely remain buried until long after the grant has run its course. Fortunately, the selection of Arctos as a collection management system allowed the UTEP-BC Collections Manager to confer with others publishing biodiversity data to the aggregators.
Members of the Arctos Community have carried on multiple conversations about publishing to aggregators and how to handle the resulting data quality flags. These conversations provide a synthesis of the challenges experienced by collections in over 20 institutions when publishing biodiversity data to aggregators and responding (or not) to their data quality flags. This presentation will cover the experiences and concerns of one Collections Manager, as well as those of the Arctos Community, related to publishing data to aggregators, deciphering their data quality flags, and developing appropriate responses to those flags.


2021
Author(s):  
Bing Wang

This thesis comprises a description and analysis of a personal scrapbook containing 123 albumen prints compiled in the nineteenth century. Originally attributed to the photographer John Thomson (1837–1921), it is held by George Eastman House, International Museum of Photography and Film. The goal of this thesis is to critically examine the original catalogue records and provide more appropriate and searchable catalogue information in the Eastman House's collection management system. Histories of photography in the Asian countries featured in the scrapbook, together with an account of Thomson's career, contextualize the album. A detailed analysis focuses on the attribution and subjects of the photographs and on the compiler of the scrapbook. While the attributions of all the images cannot be confirmed, this thesis provides more accurate and reliable information about the scrapbook and thereby paves the way for future research on it and on the history of nineteenth-century photography of the Far East.

