Weekly Report - February 5 - 9 2024

Tasks / Worked On

  • DOIs / Vol Journals / Trace
    • Reviewed and identified problematic DOIs for Crossref Resolution Report (Details Here)
    • Minted DOIs for JASM Vol 15 Issue 3 and JASM Vol 15 Issue 4.
    • Discovered and minted DOIs for latest issue of Teaching and Supervision in Counseling.
    • Discovered and minted legacy NQSP DOIs.
    • [] Identified problematic IJNS articles that have minted DOIs but whose metadata refers to the wrong DOI pattern.
    • [] Started drafting procedures, policies, and practices for DOI minting at UTK beyond the Libraries.
    • Created and released v0.0.1 of digital_commons, a Python library that wraps the Digital Commons v2 real-time API.
    • Met with Peter to discuss problems with DOIs and expanding the use of Crossref services at UTK beyond the Libraries.
    • Install with pip install digital-commons for use in Jupyter or any Python application.
    • Import with from digital_commons import DigitalCommons in Jupyter or your own Python applications, projects, or libraries.
    • Find 100 Trace records with "video" in any metadata field and "2013" in the title, from any series, with x = DigitalCommons(site_url='trace.tennessee.edu', key='my-digital-commons-key').query(('q', 'video'), ('title', '2013'), ('limit', '100')).
    • Export all of the "indexed" metadata from Trace with x = DigitalCommons(site_url='trace.tennessee.edu', key='my-digital-commons-key'), then y = x.export_full() and x.download(y). A combined usage sketch appears after this list.
    • See docs for more details and examples.
  • Dark Archiving, Preservation, etc.
    • Modified Islandora Bagit to include AIP, METS, and other binary files related to preservation for web archives and born-digital objects (a rough bagging sketch appears after this list).
    • Bagged, transferred, and deposited Database of the Smokies and Chronicling COVID-19 into Chronopolis.
  • Rising from the Ashes and Primo
    • [] Investigated issues with Primo where "Rising from the Ashes" works are missing thumbnails and links to the works.
    • Assumed responsibility from Meredith for seeking a solution to the issue.
    • Opened a Salesforce ticket with Ex Libris describing the problem and what we've done thus far.
    • Created, opened, and merged a pull request to rule out un-normalized space as an issue.
  • Submitted session for IIIF Annual
    • With Nuno Freire from Europeana and Emily Lynema from Indiana University, submitted a session proposal for the IIIF Annual Conference in June titled "Handling special AV annotations: captions, subtitles, audio description, transcripts, and translations".
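
A combined sketch of the digital_commons calls described above, assuming pip install digital-commons and a valid Digital Commons API key. The site URL, key placeholder, and query parameters are taken from the notes in this report; treat the exact return shapes as assumptions and see the docs for authoritative examples.

```python
# Sketch of the digital_commons usage described in the notes above.
# Assumes `pip install digital-commons` and a valid Digital Commons v2 API key.
from digital_commons import DigitalCommons

# Connect to Trace; the key value here is a placeholder.
dc = DigitalCommons(site_url="trace.tennessee.edu", key="my-digital-commons-key")

# Find 100 records with "video" in any metadata field and "2013" in the title.
results = dc.query(("q", "video"), ("title", "2013"), ("limit", "100"))

# Export all of the "indexed" metadata from Trace and download the export.
export = dc.export_full()
dc.download(export)
```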
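
The Islandora Bagit work above happened in the module itself, but as a rough illustration of the bagging step, here is a minimal sketch using the Python bagit library (not Islandora Bagit); the staging path and bag-info values are hypothetical.

```python
# Minimal sketch of bagging a directory that already contains AIP, METS, and
# other preservation binaries, using the Python `bagit` library (an assumption,
# not the Islandora Bagit module mentioned above).
import bagit

# Hypothetical staging directory holding the object plus its AIP/METS files.
staging_dir = "/tmp/chronicling_covid19_object"

# Create the bag in place; bag-info values here are placeholders.
bag = bagit.make_bag(
    staging_dir,
    {
        "Source-Organization": "University of Tennessee Libraries",
        "External-Description": "Born-digital object with AIP and METS files",
    },
)

# Verify checksums and completeness before transfer to dark storage (Chronopolis).
bag.validate()
```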

Training / Learning

Thinking about Next Week

  • [] Continue drafting procedures, policies, and practices for DOI minting at UTK beyond the Libraries.
  • [] Meet with the graduate school and scholarly communication group to discuss rethinking ETD submission processes.
  • [] Meet with IIIF editors and other coauthors of IIIF AV Annotations Extension to discuss next steps and draft writing.
  • [] Continue working on Primo and "Rising from the Ashes" issue.
  • [] Be prepared to work with Scientist.com on anything regarding migrations, testing, or blocking issue 587.
  • [] Follow through on commitment to investigate and report on the regression A/V problem in UniversalViewer.
  • [] Meet with Ag Extension to discuss their needs for DOIs and Crossref services.
  • [] Meet with Chris Bombardo to discuss infrastructure support from Law for the Abolition Now project (research and creative activity).
  • [] Other stuff I've probably forgotten about.