
MARC Records

  • Improving Foreign Language Service – Part II

    A word cloud of "hello" in many languages, written in different colors and directions. The translations include aloha, dia duit, bonjour, verwelkoming, and sannu.

    During our latest New Features Webinar, we had a question about getting the complete list of all languages recognized by PIMMS. Here are some things to know about that list:

    • The list that PIMMS adopted is based on the ISO 639-2 Bibliographic list, a standard list maintained by the Library of Congress. There are 487 languages on this list.
    • This list is designed to represent all the languages that you might expect to find a book written in, but it is not necessarily a list of all spoken languages.
    • The code values in the ISO 639-2 list are used in MARC records to indicate the language(s) of a title.
    • PIMMS, however, accepts numeric codes rather than the alphabetic values defined by the standard it is attempting to match.

    There are also some things to consider regarding language codes in KLAS:

    • The list of Patron Language codes (PatLang) is mapped to the numeric PIMMS codes.
    • The list of Language Codes for title records (LangCode) is mapped to coincide with incoming MARC data.
    • The local codes for Patron Languages and the Language Codes for title records need to match for the Nightly and Book Search exclusion checks (see the sketch after this list).
    • The list of values in a drop-down is sorted by Code Value instead of description.
    • The more values you have in a list of codes, the more difficult it gets for your staff to use.
    • Changes to a Code list are not applied retroactively; cleanup has to be done on all affected records.
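
    To make that matching requirement concrete, here is a minimal sketch (plain Python, not KLAS internals) of how a patron-side code and a title-side code only line up when both map to the same MARC value. The local codes are hypothetical examples; the three-letter values are real ISO 639-2/B codes.

    ```python
    # Toy stand-ins for the PatLang and LangCode Code Files.
    PAT_LANG = {           # patron side: local code -> MARC language code
        "S": "spa",        # Spanish
        "F": "fre",        # French
        "CAM": "khm",      # "Cambodian" locally, Khmer in MARC
    }

    LANG_CODE = {          # catalog side: local code -> MARC language code
        "S": "spa",
        "F": "fre",
        "CAM": "khm",
    }

    def language_match(patron_code: str, title_code: str) -> bool:
        """Rough stand-in for the Nightly / Book Search language check:
        two local codes only line up if they map to the same MARC value."""
        marc = PAT_LANG.get(patron_code)
        return marc is not None and marc == LANG_CODE.get(title_code)

    print(language_match("CAM", "CAM"))  # True
    print(language_match("S", "F"))      # False
    ```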

    In short: there are a ton of language possibilities out there. However, when you balance out the needs of your patrons, your collection, and your staff, the only languages you really want to have in KLAS are the ones you need right now.

    You can certainly add more languages than you have now without causing headaches—and we’re happy to help you do it! Languages that your patrons speak, that appear in the collection, and/or that you want to track interest in should definitely be added to your code lists.

    Just know that too many languages can make things complicated, and too many changes later on can be a pain.

    If you want the long version, read on for more details.

    Difficulties of a Long List

    Having “too many” options in a drop-down field introduces some really interesting ways to mess up data entry. For example, while the full list of PIMMS languages was briefly available, one patron was given “Middle English” as their default Language! The more options there are, the easier it is to accidentally select the wrong one, either by misreading or mis-clicking.

    In addition to the unwieldiness of such a long list and the ease of selecting an unintended entry, these lists sort by code, not by full description. This means you should take care with how you set up the codes, or the list will not be in a straightforward, alphabetical order. Once you get more than 30 or so entries in a combo-box, having a logical sort order becomes very important! For example, browse through the list of states on the Contact tab. The state codes are well-known—but they don’t sort the same as the names of the states. If you weren’t already familiar with them, could you easily use this field? What if the states weren’t in order at all?
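
    Here is a purely illustrative example of that sort-order problem (the codes and descriptions below are hypothetical): sorting by code does not keep the descriptions in alphabetical order.

    ```python
    # A hypothetical language combo-box: (code, description) pairs.
    languages = [
        ("C", "Cambodian"),
        ("HRV", "Croatian"),
        ("PER", "Farsi (Persian)"),
        ("G", "German"),
        ("S", "Spanish"),
    ]

    # Combo-boxes sort by code, so this is the order your staff scroll through:
    for code, description in sorted(languages):
        print(f"{code:<4}{description}")
    # C   Cambodian
    # G   German
    # HRV Croatian
    # PER Farsi (Persian)
    # S   Spanish
    # German files before Croatian, and Farsi lands under P -- easy to scroll past.
    ```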

    Getting back to the list of languages you are managing in your database – another thing to keep in mind is the way a language might be referred to by a patron vs. its official name or designation. Many libraries have had Cambodian in their Patron Language list for years as C or CAM, but the MARC value is khm, for Khmer. Serbo-Croatian was previously used for what are now distinguished as four different languages, whose official codes are based on the spelling of the language in that language – hence HRV for Croatian. The MARC code for Persian is PER, but the language is frequently referred to as Farsi.

    Will those codes help your staff browse to the right part of the list? It’s important to understand how patrons might ask for materials in these languages, and choose local codes and descriptions such that your staff can match what a patron is saying to the proper selection.

    Fortunately, the Code Files allow you to use a local code for the Patron and Catalog modules while still mapping to PIMMS and to the codes used on MARC records. Think about the local codes you may want to use, and how you can use them to enforce a reasonable sort order.

    Remember: the shorter you keep the list, the easier it will be to choose logical, well-sorted codes and to select the right one every time.

    Changes to Code Files are not retroactive

    Code Files can be tricky, especially when you are dealing with large numbers of records, mapping between modules, and mapping to both PIMMS and MARC records. If you’re up to it, you can add to your PatLang codes, but we recommend you leave the LangCode file to us—and don’t go hog wild changing the codes.

    Here’s why:

    Code Files aren’t like Headings. Headings are linked on each record; when you update a Heading, the changes appear on each record. Codes get stored on each record instead, and the Code File simply defines what that code means. If you change a Description, that’s fine! But if you change a Code, all the records with the old code still have the old code. Records with the old code won’t be able to find it in the Code File to define it, triggering error messages and generally breaking things.
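
    Here is a toy illustration (plain Python, not how KLAS actually stores data) of why that matters: each record keeps the raw code, so renaming the code in the Code File alone leaves the record pointing at a definition that no longer exists.

    ```python
    # The Code File just defines what each code means...
    code_file = {"CAM": "Cambodian"}

    # ...while each record stores the code itself.
    patron = {"name": "Example Patron", "language": "CAM"}

    # "Changing" the code in the Code File alone:
    code_file = {"KHM": "Khmer"}

    # The record still carries the old value, so the lookup now fails:
    print(patron["language"])                 # CAM
    print(code_file.get(patron["language"]))  # None -- the kind of gap that breaks things
    ```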

    Plus, for this Code File, mapping must be maintained between the Patron and Catalog modules. Doing so means Nightly can match the patrons’ preferred Language to the Language on title records.

    So, if you want to change one of your language codes:

    1. Add the new Code to the PatLang Code File, setting the CMLS Lang Code to match the code you want to update.
    2. Find all of the patrons with the original Code on their record, updating each of them to use the new Code instead.
    3. Only delete the original Code from the PatLang Code File once it’s no longer associated with any patron records.
    4. Repeat the process in the LangCode file, with the catalog records, making sure to maintain the CodeXref (MARC record mapping).

    Or... better yet, have us do it.

    Any time you need a language added to the list, or if you want to adjust the local codes in use for sorting the list, please send the details to Customer Support, and we’ll take it from there.

    Policy Changes are also not retroactive – About Bilingual Titles

    While you’re thinking about all this, here’s one last thing to keep in mind:

    In the past, NLS issued MARC records for bilingual titles with a combined code, such as E/S or S/E for books with both Spanish and English parts. That policy changed! According to NLS, these books are now given a language code reflecting the “primary” language of the title, even if one or more additional languages are included. For example, “Drivetime German,” which is marked as E/G in a number of databases, is primarily in English (with some German) according to NLS. Meanwhile, “eTicket Ingles” is a Spanish title (which also includes English).

    New MARC records arrive cataloged this way. It’s up to you whether you want to update their records to reflect the bilingual nature of the titles, or update your back catalog to follow the primary-language approach.

    For more information about this, or for advice on how to apply changes in the direction of your choice, please let us know!

  • Let's Talk about Cataloging

    Line drawing of an open book, with circuit board style lines running into and out from it.

    If you attended one of the NLS Regional conferences (or just checked out the slide deck), you already know we are considering offering a Cataloging Service. But what's involved, and how will it work? To get there, let's start with a look at what Keystone is already doing for your catalog.

    Current Catalog Enhancements:

    Every MARC Records file posted by NLS is then reviewed by Mitake here at Keystone, before being posted for download here. That review, and accompanying corrections as needed, covers:

    • Language code
    • Subtitle formatting 
    • Series in non-English languages
    • Remove publisher imprints from series (ex: Penguin Classics; Pocketbook)
    • Audience Notes (typos & inconsistencies in 521/546, ex: split “Contains Sex & Strong Language” into two headings)
    • Diacritics clean-up
    • Annotation (combine tags so Audience Notes are included after standard annotation)
    • Check for subjects with “stories” vs “fiction” (ex: combine “Mystery & Detective Stories” and “Mystery & Detective Fiction”)
    • Ensure 082 & 072 exist (see the sketch after this list)
    • Validate & normalize MARC
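
    As one example of those checks, here is a minimal sketch (using the pymarc library and a hypothetical batch file name, not our actual review tooling) that flags records missing an 082 or 072 tag:

    ```python
    from pymarc import MARCReader  # assumes pymarc is installed

    # Flag any records in an NLS batch file that lack an 082 (Dewey) or 072 tag.
    with open("nls_batch.mrc", "rb") as fh:  # hypothetical file name
        for record in MARCReader(fh):
            missing = [tag for tag in ("082", "072") if not record.get_fields(tag)]
            if missing:
                control = record.get_fields("001")
                ident = control[0].data if control else "(no 001)"
                print(f"{ident}: missing {', '.join(missing)}")
    ```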

    Additionally, we generate a query set of titles included in each Talking Book Topics (TBT) issue, pull the MARC record for the TBT monograph record if not included in one of the batch files, and pull MARC records on demand for Titles needed to load BARD transactions. For Keystone-hosted customers, we also load all MARC records, including both the regular batch files and any on-demand titles.

    For a one-time setup charge, we can provide subject mapping and series mapping services, which involve combing through your headings to match them up with the ones generally included in certain MARC tags of the NLS records, and creating the filters and heading data to ensure the NLS headings load automatically from the MARC to the BibRec. 

    Like everyone else, we have found that the rapid increase in the number of new titles has made it harder and harder to keep up the current level of cataloging. For example, as part of ongoing Series maintenance, we're adding 50-100 new Series headings a month for each Series customer--way more than we anticipated when we initiated the Series maintenance service and offered upkeep of new headings at no charge after the initial set-up fee. The absolute last thing we want is to offer less right when you need more, but Katharina spent nearly an entire week this month on just Series maintenance (and she’s faster at it than any of the rest of us). While we will absolutely honor the no-maintenance-charge arrangement for the first year, at renewal, ongoing support of new series headings will need to become a charged service for us to continue offering it.

    Proposed Cataloging Service

    To make our cataloging service more sustainable, we're looking to start with some structural changes.

    Instead of working with the batch MARC Records files which come out twice a month (or so) from NLS, we will instead pull in the titles as they become available with a PICS API integration. This should give us a steady flow of Titles to review, rather than a twice-monthly blast from a firehose.

    The other big change will be where we make the record updates. Rather than loading those files into everyone's separate database and then having to do a bunch of maintenance in each of those databases, we plan to set up a centralized cataloging database. We will do any cleanup and corrections to the titles there. Once a title is ready, we will push it into the subscribing customer databases via a batch program. This will occur routinely overnight for standard new files, but can also be done immediately to push out a freshly reviewed on-demand back catalog title.

    Because we know everyone's patron base, staff and institutional preferences, and service approach is unique, subscribers will still have some options. You can choose to import the full record "our way" from the central database; to exclude one or more specific MARC tags (ex – if you want to maintain your own series, you can get the rest of the record without the series tag); or to overlay only specific tags onto the NLS original (ex – if you chose not to get the full cataloging service but want to buy into Series maintenance, we can overlay that one MARC tag onto the original title as it comes from NLS).
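
    As a rough sketch of what those options could look like under the hood (illustrative pymarc code with hypothetical file names, not the actual batch program), excluding a tag and overlaying a tag might work along these lines:

    ```python
    from pymarc import MARCReader, MARCWriter  # assumes pymarc is installed

    def strip_tag(record, tag="490"):
        """Full-record option, minus one tag (e.g. keep your own series headings)."""
        record.remove_fields(tag)
        return record

    def overlay_tag(nls_record, reviewed_record, tag="490"):
        """Keep the NLS original, but overlay a single reviewed tag
        (e.g. Series maintenance only)."""
        nls_record.remove_fields(tag)
        for field in reviewed_record.get_fields(tag):
            nls_record.add_ordered_field(field)
        return nls_record

    # Hypothetical files, assumed to contain the same titles in the same order.
    with open("reviewed.mrc", "rb") as rev, open("nls_original.mrc", "rb") as nls:
        writer = MARCWriter(open("to_load.mrc", "wb"))
        for reviewed_rec, nls_rec in zip(MARCReader(rev), MARCReader(nls)):
            writer.write(overlay_tag(nls_rec, reviewed_rec))
        writer.close()
    ```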

    Planned additional review & correction before pushing titles to subscriber databases:

    • Narrator
    • Alt length
    • Pub year (print publication)
    • Alt pub year (recording year) 
    • Fiction/non-fiction checkmark
    • Reading level
    • Headings merges for typos / alt forms / etc
    • Sequel heading
    • Addition & removal of "Current TBT Issue" heading

    So what will it cost? 

    While we would love to just offer this to everyone, it is going to be a major time commitment. However, by distributing the cost of this staff time across customers, we hope to keep the charge well below what it would cost your library to hire even a part-time cataloger.

    As we finish developing this service and as the situation with incoming records evolves, these estimates are subject to change. Set-up fees may apply and will depend on the level of subject mapping and/or series set-up needed to make your database compatible with the centralized records. Please contact Customer Support for a full quote for your library.

    1. Series Service: one-time set-up fee + $200/month ongoing at renewal
    2. Basic Cataloging Service (no series): $250/month*
    3. All-in: $400/month*

    *An initial set-up charge may be needed if subject mapping and series setup have not already been done.


  • MARC Record Corrections - 2020

    The following MARC records have had diacritic marks or other issues corrected. Please upload these files in place of the NLS version.

    To avoid overwriting your cataloging data, we are removing the older NEW files. Updated versions of the New records will eventually be included in the Completed file that NLS provides.

    For more information on uploading MARC Records, see Are Your Records Up-To-Date? on the forums.

    Note: You must be logged in to download these files.

  • MARC Record Corrections - 2021

    The following MARC records have had diacritic marks or other issues corrected. Please upload these files in place of the NLS version.

    For more information on uploading MARC Records, see Are Your Records Up-To-Date? on the forums.

    Note: You must be logged in to view and download these files.

    Latest Updates

    The initial December files were uploaded on 12/07/2021. NEW File 2 was uploaded on 12/22/2021.

  • MARC Record Corrections - 2022

    The following MARC records have had diacritic marks or other issues corrected. Please upload these files in place of the NLS version.

    For more information on uploading MARC Records, see Are Your Records Up-To-Date? on the forums.

    Note: You must be logged in to view and download these files.

    Latest Updates

    The Mid-December file was uploaded on 12/20/2022.

    The Sept / October TBT record was added on 11/08/2022.

    The Nov / Dec TBT query set was added on 12/01/2022.

  • MARC Record Corrections - 2023

    The following MARC records have had diacritic marks or other issues corrected. Please upload these files in place of the NLS version.

    For more information on uploading MARC Records, see Are Your Records Up-To-Date? on the forums.

    Note: You must be logged in to view and download these files.

    Latest Updates

    The December files were uploaded 12/28/2023.

    The Nov/Dec TBT Query Set was uploaded on 12/13/2023.

  • MARC Record Corrections - 2024

    The following MARC records have had diacritic marks or other issues corrected. Please upload these files in place of the NLS version.

    For more information on uploading MARC Records, see Are Your Records Up-To-Date? on the forums.

    Note: You must be logged in to view and download these files.

    Latest Updates

    The March NEW file 2 was uploaded on 3/29/2024.

    The Mar/Apr TBT Query Set was uploaded on 3/29/2024.