Library Journal, June 15, 1984, pages 1197-1203

Automation and the Future of Technical Services

By Maurice J. Freedman

THE DEMISE of technical services has so often been predicted that our continuing discussions of that part of librarianship could be called a tribute to its heartiness or, perhaps, in a darker light, a manifestation of its diehardiness.  One of the early crêpehangers was Charles Coffin Jewett, the head of the Smithsonian Institution library in the mid-19th century.  Jewett, the unsuccessful precursor of first the LC card service and later OCLC, attempted to centralize the nation's catalog control and production.  His use of stereotype printing technology was an important reason for his failure, although Fred Kilgour, the founding father of OCLC, gives Jewett's conception (not Jewett's technology) much credit for the development of OCLC's service.

The “golden age” goes on

Charles Ammi Cutter had no such imperial view, but as an observer he predicted that the happy and creative occupation of devising and implementing cataloging rules would wind to a halt once the LC catalog card service achieved success.  In the fourth edition of his cataloging rules, published after his death in 1904, Cutter viewed the implementation of the LC card service as the end of this golden age of cataloging.  Had Cutter only known!  The debates concerning rule construction, interpretation, and local policy are indeed endless.  Our dear colleagues at LC and elsewhere who have so befriended us with the new codes, editions, and published interpretations are not letting up in their thankless mission to help us.  Finally, in our own shops, we who know so much better now have at our disposal a technology that fosters, that even cries out for, local options, manipulation, and change.  We act on this superior knowledge with the use of that technology.  The golden age may not be over.  The debates and the "harmless diversion" of cataloging argumentation continue.  Clearly we have not lost the future when the present thirst for debate appears so unquenchable.

The flaming optimism and misguided acceptance and worship of computer technology brought the third major prediction that technical services was without a future.  The rampant belief in the computer panacea originated in the 1960s.  The prediction and the technology upon which it was based have had their ups and downs since.  Perhaps "failures" would be a more accurate characterization than "downs" of what has happened with at least some computer applications.  Yes, there have been notable successes in automation, but our profession seems to suffer from a propensity to project, on the basis of those successes, expectations which tend to be impractical if not unreasonable.  Two instances of such willfulness on our part come immediately to mind.

Turnkey system problems

The new turnkey online circulation control systems were impressively successful in single-facility libraries or libraries which, if they had multiple branches or agencies, did not have very many.  There have been examples in which it has been abundantly clear that forcing the technology and its system architecture to serve many more agencies than it was designed to do has produced results varying from qualified success to apparent failure, at least to date.  We must cherish our successes, modest though they be, without unduly burdening them with things for which they were neither intended nor designed.

Redefining “catalog”

            The second area in which too much is demanded of turnkey circulation systems is the hot pursuit of the online catalog.  Forcing the vendors to superimpose an online catalog on a system working quite hard at counting and controlling books, plus a host of other “nicety” services, has caused double trouble.

            First, in the words of the underworld, the turnkey systems cannot take the weight.  Second, in honor of Orwell’s Ministry of Truth and the annum 1984, the term “catalog,” as in public access online catalog, is being redefined to meet the more limited capability of the system based on circulation control and the hardware on which it is forced to reside.  For example, arbitrarily limiting the number of subject headings because of technical limitations is catalog degradation.

            So, despite the wisdom and prognostication of our best thinkers and the qualified success of our latest technology, I submit that technical services has a future, at least for a while.

The traditional technical services

Acquisitions has traditionally been perceived as the process of acquiring materials for the library in any and all of their forms.  Typically there are several categories of record maintained in the order department: 1) outstanding items ordered; 2) orders recently cancelled or received; 3) financial records pertaining to 1) and 2); 4) lists of vendor names, addresses, specialties, performance information, discount policy, etc.; and 5) miscellaneous information such as desiderata, problems, and any data which might be useful in improving the acquisitions process.  A mere listing of these acquisitions files does not do justice to the range of sources searched in the acquisition process.  From lookups in the library's catalog (as well as in the aforementioned files), to searches in established bibliographic tools and endless trade and other publications to identify or verify a given item, searching is a primary acquisitions activity.

Cataloging is one of the all-time fun pursuits, particularly such glamorous work as file creation and maintenance.  In a way, cataloging is a misnomer.  Most of what is done in North American libraries is recataloging according to the Library of Congress.  The cataloging process can be readily characterized as the search for LC copy for the item in hand.  The other sources checked by catalogers (or their paraprofessional surrogates) to avoid having to do the cataloging themselves are simply cataloging done by others.  When these sources turn up empty of a record for the item in hand, a cataloger will be forced to create the record.  Note that the particularly cost-conscious, but not service-conscious, administrator will have the unfound item sit for a period of time and then be researched again in the same sources, all to avoid the cost of original cataloging.  The user will, of course, be denied catalog access to the item during the wait, but this is the kind of cost saving against which such administrative practicality is balanced.

The cataloger, to borrow from Maurice Line, leaves several spoors behind.  First, there is a catalog which yields author, title, and subject access to the materials acquired by the library.  Second, there is a shelflist, which is a form of catalog itself.  The shelflist provides access to the collection as a representation of the materials in the sequence they appear on the library's shelves.  The third file usually associated with catalogers is something called the in-process file.  This file is a byproduct of the deliberate speed of the cataloging process.

In the case of both the acquisitions files and the catalog files, in the pre-automation state they are nonreplicable: only a single copy of them is available for use, and anyone wishing to search them must travel to them to do so.  Not being wholly mischievous, librarians have tended to make the catalog available in reasonably conspicuous and accessible settings.  The invaluable acquisitions files, the cataloging in-process file, and the shelflist are usually found in the private fiefdoms of their progenitors.

Processing is the last functional area associated with traditional technical services.  Processing is the physical preparation of materials for use (or reuse in the case of binding or some other form of preservation) in the library.  This activity usually involves some form of identity imposed on the item to indicate that it is the property of the library, the physical placement of a label on the item to indicate its proper location in the library, and, less often, affixing a record to the item for the purposes of circulation control (most notably a book pocket and/or circulation card).  The last processing activity is more often associated with public rather than research libraries.

“There have been examples in which it has been abundantly clear that forcing the technology and its system architecture to serve many more agencies than it was designed to do has produced results varying from qualified success to apparent failure…”

          Circulation control is a function that has not been viewed traditionally as a technical services responsibility, but in the world of automation, this perception has changed.  The major task with regard to circulation is the file building associated with the marriage between the item record and the patron record.  Since the catalog record is the basis for the item record, the creation and maintenance of these machine records within technical services argues for that functional unit to manage circulation.  This is a trend in public and academic libraries.

            As with the other technical services, circulation has its own share of files, and at least some of them overlap with those already separately and redundantly created in acquisitions, cataloging, and processing.  In the manual systems, circulation files almost always include some combination of the author’s name, the title, and the call number, as well as patron and charge, discharge, hold, reserve, etc., information.

The maintenance of the paper files in circulation control probably has been the squeakiest wheel in libraries.  One widely accepted but terrible solution, in terms of overall information control, is the microfilm-based circulation system.  No access is provided to outstanding (or returned) materials by author, title, call number, or borrower.  Libraries rushed to adopt this system because it eliminated the paper files of earlier manual circulation systems, and thus eliminated the labor costs associated with them.  Costs for overdue notices on the microfilm-based systems are quite high, and usually the borrower is poorly served in terms of reserves and overdue notification.

For research librarians, preservation and conservation should be mentioned in a list of the technical services.  Stated simply, this function treats those materials suffering from (or likely to suffer from) overuse or decay.  Several options are available.  The simplest is to discard them.  The second option is to replace those which have been discarded.  Finally, the library can refurbish or rebind those which have suffered.  Conservation can be viewed as the preventive maintenance undertaken to avoid premature wear, disuse, misuse, or other controllable degradation of the print image.  In terms of the future of technical services, the maintenance of preservation and conservation records (what has been shipped to the bindery or elsewhere for regeneration) with their associated bibliographic and related information yields, in part, the same kind of data, problems, and solutions as the other technical services.

The salient features

            Some of the salient features of pre-automation technical services that emerge from this overall analysis are the following:

1.      In a manual environment, files are created or kept in their respective functional technical service areas.  Anyone wanting to use a given file must either travel to the file or telephone someone already there.

2.      The files have elements of redundancy.  Every file is concerned with, and repeats the name of, the author and the title.  Every file has, as well, unique elements specific to the given function for which the file was built.  An order file has such information as dates ordered and received, the vendor, cost, and more, in addition to the redundant information.

3.      In a manual environment, because of the nature of the files and records created and maintained there, the technical services people are necessarily localized and specialized workers engaging in a unique library service.  The key elements are “localized” and “unique.”  “Unique” follows from the local and exclusive nature of the files and, to varying extents, the intellectual and physical complexity of the tasks of creating and maintaining those files.  “Local” simply means that to use the files, anyone must go to where they are located, or call someone who is attending the files.

“Forcing the vendors to superimpose an online catalog on a system working quite hard at counting and controlling books, plus a host of other ‘nicety’ services, has caused double trouble”

            The terrible redundancy of information found in the various manual files that are built in the traditional technical services operation cries out for automation.  There is no future for technical services if it is seen as the enterprise which concerns itself with creating little (and not so little) files of paper which to varying extents contain the same information.  That is the good news for library administrators and for the editorial writers who, for decades, have taken cheap shots at catalogers and their fellow technical services workers for trying to maintain adequate and accurate access to the library’s most precious possession, its collections.  That’s the good news.

            There is validity to the argument that the fundamental information created by the technical services worker is essential to libraries.  Neither the computer, the pundit, nor any other naysayer will eliminate the need for that information.  Someone, somewhere is going to have to record and store the bibliographic information required for adequate and accurate access to and control of library materials.  That this process has been simplified, rendered more efficient, and still has not enjoyed the respect of administrators and editorial writers is, indeed, partly the responsibility of those who work in technical services.  LJ once quoted Richard De Gennaro’s remark that AACR2 was “a self-inflicted wound,” and our eldest of elder statespeople, the legendary Seymour Lubetzky, characterized the impact of the implementation of AACR2 on our catalogs as “dismemberment.”  This is not intended to rekindle the fires of the AACR2 debates, but simply to note that technical services people have, in part, contributed to the state of opprobrium they enjoy in certain quarters.

The theses

            The two theses of this paper emerge from the discussion to this point:

1)      Automation saves on the labor expense of managing manual files.

2)      Automation improves services to the user by enabling increased access to, and distribution of, information that has traditionally been included in those manual files.

Computer technology

My introduction to data processing equipment came in two steps.  In a 1965 library school course, my colleagues and I keypunched 80-column cards to produce a list of serials, one that included all the keypunched cards sorted into alphabetic sequence by the IBM tabulating equipment.

Three years later I went to work for a company that was generating (albeit defectively) automated upper- and lowercase line-printer book catalogs for one of NASA's research centers on a computer that was, for its time, a good-sized mainframe.  (This was almost the "right stuff.")  Being a powerful and sophisticated beast, it required very special environmental accommodations (a raised floor, dropped ceiling, and special air conditioning) and specially trained operators, programmers, and systems analysts.  That computer had a mystique that both enthralled and intimidated all but these high priests.

            This 1968, third generation, revolutionary IBM 360/30 computer had a central processing unit with a memory capacity of 32,000 bytes.  Attached to it were three disk drives, each of which was capable of storing either seven or 21 million bytes of information.

            Today one can go into any computer store and purchase a quite functional “micro” computer which has 512K bytes of memory, and one or more 10 megabyte (i.e. millions of bytes) disk drives for home use.  This microcomputer, a desktop device, requires no special environmental conditions, nor any specialist operators or others from that high priest cult, unless the owner is especially helpless or intimidated.  Today’s mainframes are correspondingly massive, and current “state of the art” minis and micros dwarf their earlier counterparts in capacity and speed.

            In light of this growth in computer storage space and processing speed, and the increased buyer purchasing power such growth implies, we can make some generalizations about data processing technology.

The development of technology, or its rate of change, makes a shambles of our definitions.  What constitutes mainframe, mini-, and microcomputers is constantly being redefined, and is more a function of the hardware's traits at a moment in time than any sustainable logical definition.  The mainframe of 1968 is the lower-end micro of today.  The mainframe of 1974 is the supermini of today, enjoying a central memory store of several megabytes and disk drives of 399+ megabyte capacity.  Such minis can support up to 100 terminals in functional library applications.

The microcomputer which served as a single-function data collection device in the recent past (reading, recording, comparing, and playing back identification numbers) is now being programmed to support an entire library function, such as acquisitions.  The INNOVACQ system from Innovative Interfaces, Inc., handles all ordering, bookkeeping, and ancillary activity, while supporting eight or more terminals online, all on a high-powered microcomputer.

            The constant redefinition of these devices occurs in three primary areas: storage capacity, computing speed, and cost.  Change seems to be a constant, as demonstrated by this equipment review.

            In an issue of Scientific American published two or three years ago, it was shown that computer costs (since 1950!) have decreased by 25 percent every single year.  You might wish to compare these data processing costs with the rise in salaries of workers since 1950.  You won’t need a “weatherperson” to tell you why administrators and budget officers prefer to see equipment rather than people entering the library work area.
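Compounded over that span, the cited rate is staggering.  A back-of-the-envelope sketch makes the point, assuming (purely for illustration) that the Scientific American figure of 25 percent held uniformly every year from 1950 through 1984:

    # Illustrative only: assumes the cited 25 percent annual cost decline
    # held uniformly from 1950 through 1984.
    start_year, end_year = 1950, 1984
    annual_factor = 0.75          # a 25 percent decrease leaves 75 percent

    remaining = annual_factor ** (end_year - start_year)
    print(f"Fraction of 1950 cost remaining by {end_year}: {remaining:.6f}")
    # prints roughly 0.000057, i.e. about 1/17,700 of the 1950 cost

By that arithmetic, a unit of computing that cost a dollar in 1950 costs a small fraction of a cent today, while labor costs have moved in the opposite direction.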

Computer functions and products

A few simple points, perhaps too simple, should be noted about the computer's basic functions from the view of your basic professional librarian and administrator.  The computer has the capacity to store huge files of data in tiny places, and it has the capacity, at extraordinarily high speed, to find the data, and to rearrange and display any and all elements of it in an almost unlimited number of combinations.

For our purposes, the limitations on the computer's ability to rearrange the data, exclusive of the computer's size and speed, are: 1) the quality of the data entry, that is, the care and detail with which the data is entered, labeled, and delimited; and 2) the intelligence and sophistication of the programs which determine what data the computer examines, and how the computer manipulates and, as a result, displays that data.  The computer also has the capability to distribute the reformulated or reconfigured data to a number of locations in any of several ways.
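To make point 1) concrete, here is a minimal sketch of what "labeled and delimited" means in practice; the field names and record are hypothetical, not those of any particular system:

    # Hypothetical illustration of labeled, delimited data entry.
    poorly_entered = "Melville, Herman Moby-Dick 1851"   # one undifferentiated string

    well_entered = {                                     # each element labeled
        "author": "Melville, Herman",
        "title": "Moby-Dick",
        "date": "1851",
    }

    # Only the labeled form lets a program reliably select and rearrange
    # elements, e.g. a title-first display:
    print(well_entered["title"], "/", well_entered["author"])

The undifferentiated string can be stored and printed, but no program can dependably pull the author out of it; the labeled record can be rearranged into any of the combinations mentioned above.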

Online distribution

            Online distribution means the user has a terminal in direct communication with the computer, which in turn displays on that terminal the reconfigured data, on demand, per the commands of the terminal user.  Here the limitations are computer and disk storage costs, line charges if phone lines are used, and so forth.  These costs are traded off for the value to the user of the immediacy of the online transactions.  Remember, “online” does not necessarily mean “current.”  Online only involves the immediacy of communication with the machine.  As we find with Dialog, Orbit, and the other batch update “online” services, one is often online to static, outdated records.

Offline distribution

Offline distribution has had great success in combination with other technologies to create several valuable products.  To produce COM (computer output microform), the computer sorts the data, which is then frozen in place on microform via the projection of a CRT (cathode ray tube) display.  The duplicates made from the microform masters are, in principle, endlessly distributable and usable with microform readers.  Both of the technologies used for COM, data processing and micrographics, are quite cost-effective.

Through a different CRT display, the computer-produced data is projected onto positive film which serves as a photoreproducible master for printed products such as book catalogs.  Those are distributable, too, depending only on printing costs.  Note these require no special readers, need only a minimum of available light, and can be used easily without electricity.

            A third form of distributed data employs only print output, and no CRT.  Computer line printer products such as cards, printouts, etc. are static or finished displays of data.  The number of copies printed determines the distributability of such products.  Typically, in libraries, one set of cards is made for one catalog.

“…the term ‘catalog,’ as in public access online catalog, is being redefined to meet the more limited capability of the system based on circulation control and the hardware on which it is forced to reside”

The laser printer is one of the most recent distribution devices, with OCLC, UTLAS, LC, and others making good use of it.  Graphics-quality print output is produced at high speed without the intermediate photoreproduction step associated with the book catalog process mentioned above.  The costs are obviously reduced here, and of course as costs are reduced, broader distribution is encouraged, if not increased.  For example, it will be even easier to produce special-collection hard copy catalogs of segments of one's database.  Using this method, the Westchester Library System will economically and expeditiously produce a large-print catalog of its large-print holdings.

            These online and offline means for the distribution of processed data manifest, at least in part, the computer’s data processing capability, and its capacity in combination with other technologies to successfully disseminate vital library information.

Technology and those files

In general, technology can have a positive impact on the localized, yet redundant, technical services files described earlier.

The character of the work performed to build and maintain those files changes dramatically.  People who manually typed, arranged, filed, withdrew, refiled, or discarded 3" x 5" slips will no longer do that.  They will communicate with computer-stored files and no longer interact with paper files; any residue of paper files will be surrogates or backups of the computer files.

            The accessibility of files is radically altered through their distribution.  Whether the files are available online or through multiple copies of offline products, the information will no longer be localized.  Note that the overwhelming number of North American libraries still have card catalogs, an obvious exception to the online or offline distribution of files discussed here.  Despite the advent of the computer, those card catalogs clearly show that it is still the future we are discussing, not the present.

            Through the increased search capability, the information available is made far more accessible.  The files are better indexed than they have ever been.  Such examples as author and title access to on-order information, either online or via COM products; Boolean searching of catalog data elements; and other enhanced means of access to the vital data managed in technical services abound.
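As a sketch of what Boolean searching over catalog data elements amounts to, consider the following; the records, field names, and query are hypothetical, for illustration only:

    # A minimal sketch of Boolean AND searching across catalog records.
    # Records, field names, and values are hypothetical.
    catalog = [
        {"author": "Cutter, Charles Ammi",
         "title": "Rules for a Dictionary Catalog",
         "subjects": "Cataloging"},
        {"author": "Jewett, Charles Coffin",
         "title": "On the Construction of Catalogues of Libraries",
         "subjects": "Cataloging; Union catalogs"},
    ]

    def search(records, **criteria):
        """Return records in which every criterion matches (Boolean AND)."""
        return [rec for rec in records
                if all(value.lower() in rec.get(field, "").lower()
                       for field, value in criteria.items())]

    for rec in search(catalog, author="jewett", subjects="union"):
        print(rec["title"])   # -> On the Construction of Catalogues of Libraries

No card file can answer "author contains X AND subject contains Y" without someone manually intersecting two drawers; the machine does it in one pass.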

            The files will no longer be physically separated and redundantly produced.  The power and size of computers will permit the integration, or more likely, cross-talk of information in functionally different files.  No longer will author and title have to be separately entered for each technical services file, manual or (as it was in some cases) automated.
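Schematically, integration means the author and title live in one bibliographic record to which the other files merely refer; the structures and identifiers below are invented for illustration:

    # Schematic sketch: one bibliographic record, keyed by an identifier;
    # the order and circulation files reference it instead of re-entering
    # author and title. All identifiers and structures are hypothetical.
    bibliographic = {
        "b1001": {"author": "Melville, Herman", "title": "Moby-Dick"},
    }
    order_file = {
        "o5050": {"bib_id": "b1001", "vendor": "Vendor A", "ordered": "1984-03-01"},
    }
    circulation_file = {
        "c7777": {"bib_id": "b1001", "borrower": "p42", "due": "1984-07-01"},
    }

    # Both functions see the same, singly entered bibliographic data:
    for rec in (order_file["o5050"], circulation_file["c7777"]):
        print(bibliographic[rec["bib_id"]]["title"], rec)

A correction to the author or title is made once, in one place, and every function sees it; in the manual world the same correction had to chase the same data through every file that had copied it.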

A selective automation chronology

Most libraries could not even fantasize about automation in the 1960s.  It became possible for them, in a limited way, in the 1970s.  In the 1980s, it will be tough to keep libraries away from automation.  The 1960s were dominated by the mainframe computers and their use by the nation's largest research libraries: the New York Public Library, Stanford, Chicago, Yale, Columbia, and some others.  Looking more closely at the 1970s, we observe three levels of computing ability which brought automation to technical services in medium-sized and larger public and academic libraries.

            OCLC and its mainframe computers made it possible for any institutional member to avail itself of centralized cataloging data, largely in the manner envisioned by Jewett and reified by Kilgour.  The local library no longer has to be big.  It simply needs to be able to afford the membership, line, and use charges, to buy a terminal, modem, etc.  It can search for LC copy via the terminal, instead of maintaining and searching proofslip files and the National Union Catalog.  However, for most libraries, once OCLC mails arranged cards to them, they are just about back to business as usual regarding the card catalog.

            CLSI introduced the minicomputer in 1973 with its original truncated record circulation control system.  There was also an acquisitions system, but it did not achieve any but the most limited success.  Smaller libraries could now have their own computer, terminals, and online system—in-house!  For the first time for most libraries, computing capability had now become wholly available within the library.  CLSI’s minicomputers coupled with OCLC’s success represented the beginnings of the democratization of library automation.

            In the late 1970s another level of minicomputer with far greater capacity than the 360/30 mainframe of 1968 or the original machine used by CLSI became available for the same group of libraries.

            The entire catalog record (almost) could be stored in the minicomputer, and online access could be provided somewhat successfully to 50-85 terminal locations.  Claims of even greater success from networked minicomputers have recently been made.

Today’s computers

OCLC is now providing cataloging, serials control, acquisitions, and interlibrary loan services to any library that wants them on its large array of mainframes and other computers.  All these services are conditionally open to OCLC members.  The Research Libraries Information Network (RLIN), another mainframe-based network, is in the process of selling itself and its services, but with considerably less achievement than its competition.  Compassion dictates little further discussion of RLIN's efforts.  UTLAS, the Canadian-based network, announced that it is replacing its Sigma mainframes (the same as used by OCLC) with TANDEM computer equipment (also used, but differently, by OCLC), and UTLAS promises that all, not just a percentage, of a user library's local catalog records will be actively online to the user.

            The minicomputer as used for online circulation control has gotten bigger and bigger.  The marketplace is now demanding of the vendors that the mini-based systems serve many functions.  Currently being demanded of the turnkey vendors are online circulation control; film booking; reserve book room control; online catalog functions with Boolean search capability; full authority control; acquisitions and serials control; the capacity to hang 500 or more terminals online; and who knows what else!  The vendors, of course, are scrambling to provide all of the above, and somewhat to their discredit are involved in a cutthroat battle of promising systems that they are months, years (or forever) away from delivering.

With the emergence of the microcomputer, three progressively more sophisticated applications presented themselves.  In the 1970s, microcomputers were used as simple data collection devices to capture circulation transaction numbers and play them back to a mainframe computer.

The 1970s also saw the use of the microcomputer by Innovative Interfaces, Inc. as a black-box conversion device that took OCLC records via the OCLC terminal, reformatted them (or should one say "deformatted" them), and then passed them along to the CLSI minicomputer.

            The use of vendor-supplied or in-house microcomputers to back up turnkey, minicomputer-based circulation systems (much in the way the data collection devices were used) has been most successful.  Unlike their predecessors, more recent microcomputers are accessible to one and all, and do not require specialized knowledge.  Exhibitors at any current library convention will demonstrate their successful use as stand-alone card production, circulation, serials check-in, and other systems for smaller libraries.

The microcomputer is now surpassing the computing capacity of the mid-1970s minicomputer; it is also appreciably less costly and far more cordial for library and personal use.  It also is showing up as a stand-alone functional device (as with the INNOVACQ system).  Super-minis, eclipsing the mainframes of the past and present, are involved in many applications, which in many respects completely transcend any dedicated local library application.

The future of technical services

            There are recurring questions about the impact of library automation on the future of technical services.  Some of them are listed below, with my answers:

            1. How will automation change the positions and responsibilities of professional and paraprofessional staff?

The work of a professional librarian is professional.  It requires the exercise of judgment based on education, experience, and understanding of the individual's specific responsibility, all in the context of the library's overall mission.  The acquisitions librarian is not just mailing orders.  She or he selects and evaluates vendors, negotiates favorable contracts and purchasing relationships, and analyzes charges, discounts, and book commerce in general, all with an eye for acquiring materials as economically and effectively as possible within the scope of the overall service requirements of the given library.  Automation will not displace this person, but this person will undoubtedly perform these functions with better technological support and far more and better information upon which to base decisions.  The machine will make data readily available which will facilitate decision-making, and replace much of the all-too-intuitive judgment required because of the former lack of formal data.  The professional will routinely work and interact with some form of computer.
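The kind of management data meant here can be as plain as a vendor performance summary compiled from the order records; the sketch below is hypothetical, and every figure and field name in it is invented:

    # Hypothetical sketch: summarizing vendor performance from order
    # records, the sort of management data automation puts in front of
    # the acquisitions librarian. All figures are invented.
    orders = [
        {"vendor": "Vendor A", "days_to_deliver": 21, "discount": 0.33},
        {"vendor": "Vendor A", "days_to_deliver": 35, "discount": 0.33},
        {"vendor": "Vendor B", "days_to_deliver": 14, "discount": 0.25},
    ]

    summary = {}
    for o in orders:
        s = summary.setdefault(o["vendor"], {"n": 0, "days": 0, "disc": 0.0})
        s["n"] += 1
        s["days"] += o["days_to_deliver"]
        s["disc"] += o["discount"]

    for vendor, s in sorted(summary.items()):
        print(f"{vendor}: {s['n']} orders, avg delivery "
              f"{s['days'] / s['n']:.0f} days, avg discount "
              f"{100 * s['disc'] / s['n']:.0f}%")

A report like this replaces the intuition that "Vendor A seems slow lately" with figures the librarian can act on when negotiating the next contract.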

            The cataloger’s professional work should not diminish, as long as the library still wishes to meet the basic objectives of the catalog enunciated by Panizzi and his Anglo-American followers.  The authority work can be aided by machine processes, but it will still be the cataloger’s responsibility in original cataloging, and even for copy cataloging, to ensure that the appropriate references and notes appear in the catalog.  With automated files and online authority control, the cataloger can liberate the library from some of the silly practices LC’s cataloging and automation failures have imposed upon us.  The cataloger can change subject headings affecting tens or hundreds of entries so that they will reflect the user’s language, as opposed to LC’s formulation of these concepts 10 to 50 years ago, or worse, LC’s total nonrecognition of certain notions.  Boolean searches will be a tremendous help to the cataloger, not just the library public.

The computer will enhance the professional's performance by creating options totally unavailable with manual files.  The computer's capacity to supply management data is phenomenal, and that will provide better support for, and thus enhance, the professional's judgment.

Paraprofessionals have been used to replace professionals in many positions where it was thought that professional judgment was not required.  The classic example is the use of LC cataloging copy.  From the 1960s on, libraries all over the country replaced catalogers with paraprofessionals for the purpose of cataloging with LC copy.  This was true with the older LC card, proof slip, or NUC copy; the more recent, and more defective, cataloging-in-publication records; and the displayed MARC records from OCLC and other utilities.  The rationale, going back to Jewett, is that cataloging an item once is enough; a paraprofessional can successfully locate, verify, and accept the LC copy, as opposed to the cataloger's more expensive revision, re-do, or even blanket acceptance of it.

            Two points emerge.  First, the paraprofessional’s work may shift as technology shifts, but the paraprofessional, as well as the clerk, will also be kept on the job and perform many more of their duties with some form of data processing equipment.  Most of the paper records and files will be replaced with machine files.  As long as response time for some of the networks is as bad as it has been at times, the paraprofessionals will be kept occupied, if not always busy.  It is conceivable that automation will reduce the time required to perform specific tasks of paraprofessionals, and some staff time may be reallocated.  Significant labor reductions have been achieved where large paper files have been eliminated.

“Note that the overwhelming number of North American libraries still have card catalogs… Despite the advent of the computer, those card catalogs clearly show that it is still the future we are discussing, not the present”

2.      What kind of reorganization may be necessary?

This is where things get turned around.  The distribution of files, online or offline, changes functional responsibilities throughout a library system.  Making a copy of the catalog, the on-order file, and serials and other library holdings available at every desk and service location, including even those of administrators, radically affects service to the user and the performance and responsibilities of these library workers.

In my earlier life at the Hennepin County Library, Minnesota, the on-order file and history report of everything ordered over the previous two years were distributed in COM format to all 23 branches and the centralized selection and technical services staff.  The distribution of these files freed the equivalent of one full-time person in the selection department.  Prior to the distribution to the branches, all reader requests for items not found in the catalog were sent to the selection department for searching in the acquisitions department’s single entry on-order file.  Since the branch staffs now had both the on-order file and the order history file, by author and title, they could do the searching themselves, inform the reader immediately as to the status of the item, and free the selection person for other work.  The distributed acquisitions file affected the selection people too.  No longer would queries of the on-order file accumulate in the “wait for the trip to tech services” pile.  Now whenever a question arose, the answer was on the COM reader on the desk of the selection person or, at worst, a desk or two away.

Overall, organizational shifts are possible, but I shy away from recommending outright reorganization.  Computers permit the preparation and input of information from any location.  The only limits are imposed by hardware, line costs, or system design.

Those tasks that do not require specific expertise or immediate supervision can be distributed to all appropriate places throughout the library.  Specific tasks no longer need to be restricted to a file in one physical location.  Support activity will be possible from any terminal.

3.      Will flexibility of personnel assignments be greater or less with increased automation?

The distribution of files and their integration with each other will permit, if not encourage, flexibility of personnel assignments.  With all of the files accessible throughout the library, people who otherwise had no effective access to them now have them at their fingertips.  A host of duties (catalog searching, on-order searching, catalog maintenance, addition of cross-references, etc.) can, in principle, be done outside of technical services.  Inhibiting or limiting professional or paraprofessional activity will be a matter of library policy.  The organization of the library will not be predicated on the physical location or custody of files.  Radical changes will be limited by many of the processes required.  The files can be spread all over creation but their professional control and organization are not so easily diffused.  I do admit that this is open to question.

4.      How will quality control be affected?

“You get what you pay for” is not a thought which originates with this speaker.  There is much the computer can do to support and facilitate quality and accuracy.  The kind of authority control system developed at NYPL and emulated by UTLAS, WLN, and the National Library of Canada, and transferred to my alma mater in Minnesota (Hennepin County Library) can assist tremendously in catalog quality.  Typos in authority terms are literally not permitted without the cataloger’s approval—the system kicks out (or highlights) any term or name which does not match the names and terms on the authority file.  On the other hand online input may tend to work against quality assurance, unless some random or systematic review is made of input data.  And we all know that the computer’s shortcomings tend to have far more devastating consequences than manual foul-ups.  When there is programming error, it is possible that every single record in a file can be affected or, perish the thought, destroyed.  This can be catastrophic.  Manual operations tend not to produce such far-reaching consequences.
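In outline, the check described above is straightforward, whatever the actual internals of the NYPL system were; what follows is a minimal, hypothetical sketch, not anyone's production code:

    # Minimal sketch of authority control as described above: a heading
    # not on the authority file is kicked out for the cataloger's
    # approval rather than silently filed. Data is hypothetical.
    authority_file = {"Whales", "Sea stories", "Melville, Herman, 1819-1891"}

    def validate(headings):
        """Split headings into accepted and flagged-for-review lists."""
        accepted = [h for h in headings if h in authority_file]
        flagged = [h for h in headings if h not in authority_file]
        return accepted, flagged

    ok, needs_approval = validate(["Whales", "Wales"])   # "Wales" is a typo
    print("accepted:", ok)                          # ['Whales']
    print("cataloger must approve:", needs_approval)  # ['Wales']

The point is that the typo never enters the catalog unexamined; a human decision is forced at exactly the moment the bad heading appears.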

The real issue as regards quality control is the willingness of the library to pay for it.  Redundancy in procedures, in hardware, etc., is an expensive option in automated or manual systems, but it will help ensure quality and security.  Aids can be built into the system.  Authority control, and control over data fields in which only numeric (or only alphabetic) data is permitted, can provide some quality assurance.  As vendors and utilities know, authority control is pretty darned expensive.

5.      What will be the relationship between technical services and other areas of the library?

The distribution of files will enhance the work of all library people.  Workers outside technical services will have access to the information formerly kept under guard there.  Tech services will still have responsibility for that data, but the data will be shared by anyone with a terminal or COM reader.  Technical services people should accept the participation of the auslanders as unthreatening and conducive to better library service.  Working relations will be based on shared knowledge and greater information.

Differences between departments will be based on professional responsibilities and not strictly on the custody of and control over access to physical files.

6.      Assuming automation is not a fad and is here to stay, what shall we do about it?  How can we plan now to facilitate future changes?

I suggest, to all save the bold, those with ample funds for experimentation, or those with a commitment to the use of cutting-edge technology, that automation has been oversold, if not overrated.  Its success, every step of the way, has tended to have concomitant negative side effects which have often been swept under the rug or ignored by at least some of its chief advocates and vendors.  As an automation consultant, I have been called in to perform autopsies on automated systems which by their failure crippled library operations.  A remedy in one case required a six-figure reinvestment in hardware erroneously estimated by an overconfident and ignorant vendor.  I do not know exactly how many turnkey circulation systems, at least in larger installations, have suffered from unacceptable response time, much slower than promised.  I don't know how many libraries have been advised by their suppliers that, because of system design limitations not understood at the time of purchase, they need to re-input their data and buy new hardware if they want to avail themselves of upgrades "automatically" available to all customers.  One vendor has not yet been able to upgrade a library to its newest system, the first system it designed, without shutting the library's circulation system down for at least a month, and possibly longer.

OCLC has revolutionized library service for its members.  New efforts at resource sharing, new cooperative agreements for collection development, and new union catalogs abound because of the incredible shared cataloging resource that has been created.

The dark side of OCLC is its recurring problems with response time, its denial of access to local records, its lack of authority control, and its mad obsession regarding copyright.  A while ago it was not a unique experience for OCLC terminal operators to read books or magazines while waiting for system responses.  RLIN, I was told, measured its response time in minutes rather than seconds at one stage of development.

“The work of a professional librarian is professional.  It requires the exercise of judgment…Automation will not displace this person, but this person will undoubtedly perform…with better technological support and …better information upon which to base decisions”

A healthy skepticism

A healthy skepticism regarding all automation claims is always in order.  Be especially skeptical when cost savings are claimed or promised.  Attend meetings, read the literature, and do not be shy about asking questions.  The vendors need to be watched and checked up on.  If there is a system in your library’s future, visit previous installations by the vendors of the systems you might select.  It is unreasonable to expect the vendor to reveal a product’s shortcomings, and unfortunately, for many reasons, colleagues may not be 100 percent forthcoming either.

You do not necessarily have to hire or fire to get someone to be responsible for automation.  Technical services staff are natural, if not always the best, candidates to manage an automation project.  Successful automation, in the largest part, will depend on the participation of all affected staff in its planning, development, and implementation.  The person responsible for the activity to be automated, unless they decline or simply are not suited to manage the change, should play a key role in automating.

As for controlling the future, a librarian should stay on top of developments without being blinded by the glitter of technology or seduced by its siren song.

The future of automated technical services in libraries presented here is fundamentally a rosy one, but the reality still visible is that the promise of automation, at least for some libraries, is, as yet, unfulfilled or compromised.

I have the greatest optimism for the success of automation in medium-sized libraries.  It must be understood, however, that with automation, one set of problems indigenous to manual systems is being exchanged for another set of problems peculiar to automated systems.

Automation is not a panacea; it is a tool.  Automation is a means of redistributing needed information, and possibly the work, so as to enhance the delivery of library service and improve operating efficiency.  When you decide to automate, know that a complete set of new problems and costs comes with this new technology.

For better or worse, the future of technical services is inextricably tied up with automation.