Tag Archives: panopticism

Ireland’s new National Public Library Catalogue

I would like to begin this post on an optimistic note, because I want to give some thoughts on the concept of Ireland’s national catalogue in our public libraries without focusing on the inevitable teething problems that accompany any monumental change of direction in a service. So, let’s assume that the new national catalogue in Ireland works exactly as it is supposed to. Let’s take that as our starting point. Let’s forget about problems with the actual catalogue. Let’s also forget about SIERRA’s awful search engine, which all too often returns unexpected and wildly inaccurate results when you try to find a common title. Let’s not worry for now about the fact that paging (item request) lists cannot be trusted, or that the system often refuses to clear or move along requests. Let’s not worry too much about barcodes not matching the barcodes on actual items, or the duplication of item records. Let’s not trouble ourselves with the fact that SIERRA was never designed to understand things like text messages or fines in the public library service. Let’s forget about its love of connecting to printers and its complete disregard for the environment. Let us put all that aside, be positive, and from there take a look at the idea of our new national catalogue from a conceptual and slightly philosophical perspective.

So what is the idea really? Well, that every public library in the country is connected, sharing items and services. A patron can search the catalogue, find an item anywhere in the country and have that item delivered to their local library in three to five days. In many ways it reminds me of the EU. Who did not love the idea of the freedom of movement of both people and goods across the continent? Like all ideas, it was perfect in its ideological state. But putting an idea into practice requires a measure of control and regulation. And the moment you try to regulate freedom, well, you destroy it. Library members express their surprise and excitement at the new national catalogue because, in truth, it is a brilliant idea that makes more information more accessible to more people. However, what is it like in practice? How well thought out has the idea been? How well has it been executed? And what are the future implications of this new departure for our public library service?

Cost of transportation and the environment

After expressing their joy at such a service, the next questions patrons ask are: ‘how much will that cost?’ and ‘who’s going to pay for it?’ The figures will likely never be released by the LGMA, who have a penchant for secrecy at the best of times, but from what I understand the cost of transporting one item stands at around 75 cents (€0.75). A request list in my local library, small and rural, is usually 20–30 items long. Not only this, but it is the library that sends the item out that incurs the cost. I work in a smaller county that has spent a lot of its budget in recent years developing excellent stock. Because it is a smaller county it has a smaller budget. The worry is that this budget is now going to be swallowed up by the cost of transporting its excellent stock all around the country. Of course, on the upside it does mean that borrowers in poorly stocked libraries will now have a much better range of items to borrow. However, does transporting books around the country really make sense? Rather than sending a book out to a library ten times a year, would it not be better to spend that money on buying an extra copy for the county? Not to mention it being kinder on the environment too. What will happen to library stock in the long term? If book-buying budgets shrink due to the cost of book transportation, won’t that simply increase the demand for requests in the future, because individual libraries will be buying fewer books every year? So while right now you can get books quicker, in the future the queues may get much longer if the service is not properly supported with generous budgets.
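To put rough numbers on that trade-off, here is a back-of-the-envelope sketch only: the 75-cent courier figure and the ten-shipments-a-year title come from the paragraph above, while the book price is my own assumption.

```python
# Back-of-the-envelope: courier fees vs. buying an extra local copy.
# The per-item transport cost and shipments-per-year come from the post;
# the book price is an assumed average and is NOT an official figure.

TRANSPORT_COST = 0.75    # euro per item movement (figure cited above)
SHIPMENTS_PER_YEAR = 10  # out-of-county loans of one popular title
BOOK_PRICE = 12.00       # assumed average paperback price

def years_to_match_book_price(price=BOOK_PRICE,
                              per_shipment=TRANSPORT_COST,
                              shipments=SHIPMENTS_PER_YEAR):
    """Years of courier fees before an extra local copy pays for itself."""
    annual_cost = per_shipment * shipments
    return price / annual_cost

print(f"Annual courier cost for one title: €{TRANSPORT_COST * SHIPMENTS_PER_YEAR:.2f}")
print(f"An extra copy pays for itself in {years_to_match_book_price():.1f} years")
```

On these assumptions a single popular title racks up €7.50 a year in courier fees, so an extra copy would pay for itself in under two years — and that is before counting return journeys.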

It is interesting to consider how wasteful this new system might be. A scenario could arise in which I send a book off to Donegal from Laois. A few hours later a patron comes into the library in Laois and requests the book I just posted to Donegal. I go onto the catalogue and place another request, and the system grabs another copy of the same book from Kerry and transports it to Laois. Is this not a wasteful system? In work on Saturday I sent books from Laois to Clare, Sligo, Mayo, Dublin, Wicklow, Limerick and Cork. I checked the LMS to see where other copies of those books were available. The book I sent to Limerick was also available in Clare, which is obviously much closer to Limerick than Laois is. The LMS does not understand geography. It will search the home county first, but after that it will simply take the next available copy it finds, irrespective of where that copy is located. This means that every book that leaves a county is potentially leaking efficiency. Sending the Clare copy instead could have saved 150 km of travel, which means less money and lower CO2 emissions. And anyone who works in a public library will know that many requested items are never actually picked up in time by patrons. I wonder how much money is wasted transporting items around the country only for them never to be collected and read? It seems to be a system that is very wasteful and inefficient.
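A geography-aware hold router is not complicated to imagine. Here is a minimal sketch of the idea; the branch coordinates, the `pick_copy` function and the nearest-copy rule are all my own illustration of what such a system *could* do, not how SIERRA actually behaves.

```python
import math

# Rough county-town coordinates (lat, lon) -- illustrative values only.
BRANCHES = {
    "Laois":    (53.03, -7.30),
    "Clare":    (52.85, -8.98),
    "Limerick": (52.66, -8.63),
    "Kerry":    (52.27, -9.70),
    "Donegal":  (54.65, -8.11),
}

def distance_km(a, b):
    """Great-circle (haversine) distance between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

def pick_copy(requesting_branch, branches_with_copy):
    """Prefer a local copy; otherwise take the geographically nearest one."""
    if requesting_branch in branches_with_copy:
        return requesting_branch
    home = BRANCHES[requesting_branch]
    return min(branches_with_copy, key=lambda b: distance_km(home, BRANCHES[b]))

# The example from above: a Limerick request with copies in Clare and Laois
# should pull the Clare copy, not the Laois one.
print(pick_copy("Limerick", {"Clare", "Laois"}))  # → Clare
```

One `min()` over a distance function is the entire "understanding of geography" that the courier scenario above is missing.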

What ever happened to e-books/ e-resources?

Libraries elsewhere have been pumping money into e-reading services. Why? Well, because they are extremely efficient and location is not really a barrier to reading. Would it not be wiser to allow libraries to develop their own stock and for the money to be put into e-reading services nationally? In fact, we do have some great e-resources available through public libraries (including books and magazines), but currently the e-book service is not actually connected to the national catalogue, so when a patron searches for a book it only gives them the option of physical items and ignores the fact that we do actually have a few thousand e-books as well. If every university catalogue can be connected to electronic resources, why can’t the public library catalogue?

My search for the National Geographic magazine returned the following result on Libraries Ireland:

[Screenshot: Libraries Ireland search results for National Geographic]

The results do not give me the option of an electronic copy, despite the fact that I know one is available online through the Zinio Magazine Collection on the Laois Libraries website.

Searching for ‘My Husband’s Wife’ using the e-books filter returns the following result:

[Screenshot: e-books filter search results for ‘My Husband’s Wife’]

This book is actually available to library members through the Bolinda e-book service, again via your local library’s website. Maybe spending some money connecting our already existing e-services into the national catalogue might actually prevent transportation wastage on physical items? Of course, the defence will be that it is not yet a finished product. But how much money is going to be wasted in the meantime while the LGMA work on finishing what they have started? Would it not be better to put the infrastructure in place first before rolling out a service? It certainly calls into question the ability of the LGMA and the government to deliver an adequate and effective library service. In a bicycle review I read recently, the reviewer commented, “when you buy quality, it only hurts once”. Proper planning, in this sense, is a lot less painful in the long run, and the LGMA seem incapable of thorough research and planning.

Let’s digress: using an academic library LMS in public libraries

I worked in the UK at the University of Surrey last year, where we used an academic LMS called ALMA. It is important to question why SIERRA was chosen as the LMS for this new national catalogue. SIERRA was first used in Ireland by Trinity College and UCC, which are, naturally, two academic libraries. In fact, SIERRA is a system designed for academic libraries. I noted earlier that it does not understand geography very well. It has an option when requesting books to ‘hold copy returned soonest’. In an academic library, which may have at most four library buildings in close proximity to each other, holding the next copy returned is sensible, especially because if a university has more than one library they will be stocked by subject anyway. The books do not have to be transported from one location to the next. In the public library service, we have more than 300 library branches, so holding the next copy returned means a book could be travelling 300 km from one branch to another. SIERRA has no understanding of geography; in an academic library it does not need one. Of course, it will search the local authority first and then default to the next copy outside the county if none are available in county. However, by my estimation, this only actually happens about 80% of the time. I have seen items requested by members of my library arrive from outside the county when I knew there was a copy sitting on the shelf just a few feet away. Alternatively, you can find a copy of an item and make a specific item request. However, many items that were lost on HORIZON transferred over to SIERRA as ‘on shelf’, so the item you request may never actually arrive. SIERRA does not understand time either. If a copy is due back the next day into the authority where it has been requested, it does not stall the request and wait for that copy to return. Instead, it goes outside the county and pulls a copy from elsewhere. There does not seem to be any pattern or logic to how or where it pulls items from. It is hugely wasteful of resources and is not a system that understands public library processes or procedures.
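The time-blindness is just as easy to sketch. A hold router that understood due dates might stall a request whenever a local copy is due back within the courier’s own three-to-five-day delivery window. The stall window, function name and return values below are my own assumptions, not anything SIERRA offers.

```python
from datetime import date, timedelta

def route_hold(local_due_dates, stall_window_days=3, today=None):
    """Decide whether to wait for a local copy or courier one in.

    local_due_dates -- due dates of copies checked out within the home
                       authority (empty if no local copies exist).
    Returns 'wait-local' if a local copy is due back within the stall
    window, otherwise 'courier'. Illustrative logic only.
    """
    today = today or date.today()
    window = timedelta(days=stall_window_days)
    if any(today <= due <= today + window for due in local_due_dates):
        return "wait-local"
    return "courier"

# A copy due back tomorrow should be waited for, not replaced by a
# copy couriered across the country.
print(route_hold([date(2017, 4, 3)], today=date(2017, 4, 2)))  # → wait-local
```

Since the courier itself takes three to five days, waiting a day or two for the local copy would often be faster as well as cheaper; that is the comparison the current system never makes.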

Also, in an academic library you have a small team of cataloguers all working to the same standards. At the University of Surrey, we worked to British Library standards. With the previous LMS, HORIZON, cataloguing was a problem, but it was manageable. You see, some of the people cataloguing started out card cataloguing and have never updated their understanding of cataloguing in the last 30 years. When we moved over to electronic cataloguing and to AACR2, there were huge gaps in knowledge and new cataloguing standards were not followed consistently. And I imagine very few cataloguers in the authorities today really understand RDA/MARC21, because they are simply too busy to continuously upskill. In the past these inconsistencies were limited because, even though the cataloguing standards may not have been the best, they were at least somewhat consistent within each authority. We now have a catalogue of 15 million items, and god only knows how many people are adding records on a daily basis in the individual branches. The catalogue is currently a huge problem. I frequently have to go back into HORIZON in order to find what I am looking for. I have seen items with misspellings in the author’s name. All item records should be attached to the same bib record so that when requests are placed a nice orderly queue is formed. However, I saw one item with multiple bib records. The first had 6 items attached and more than 40 requests on the record. The second actually had 60 items attached but no requests on it. It was the same book, but the bib record had been duplicated. The main benefit of a national catalogue is the request system and the ability to share information, but the catalogue as it stands is making the request system very inefficient and, at times, ineffective.

I could go on and on about the catalogue and the inaccessibility it creates. However, it is important to ask why SIERRA and Innovative were chosen. The LMS is simply not fit for purpose because it is an academic library system being used in a drastically different public library setting.

The scenic route to ‘National Procurement’

Of course, national procurement is coming down the line as part of the national catalogue, under which all libraries’ stock will be purchased centrally. The current national catalogue is impressively diverse, but this diversity will diminish with national procurement. Eventually, will all libraries in the country simply have the same stock anyway, thereby removing the need for a courier system? If every library has the same books and is equally stocked in terms of quantity, then there will be no need to borrow from around the country. Why not just do this now and save us all the hassle and expense? Or better still, develop e-book services instead?

My Open Library, ethics, surveillance, democracy, diversity and the future of public libraries

There are other questions, of course. How does the national library catalogue tie in to the Open Library plans? Surely these two systems have been considered by the LGMA together? Surely they are part of the same long-term vision. The national catalogue certainly seems to suggest that users will get a better service if they go online, order the items they want, and then drop in to pick them up at the branch. The national library catalogue does seem to be pushing people towards online services, and may well diminish services for those who are neither able nor inclined to use their library online, because stock in libraries looks like it is about to be negatively affected by the cost of the national catalogue. So, is the national catalogue simply a prelude to, or set-up for, the open library agenda?

Philosophically, there is a tendency in this strategy to treat all authorities as the same. There are fundamental problems with this, not only in terms of reducing the diversity of library stock, but also in terms of failing to understand the diverse needs of library users, which vary greatly from county to county. There is a bigger issue of controlling information rather than freeing it up. National procurement, coupled with a national catalogue, coupled with an open library, results in greater control of library members and of library stock. It leads to greater surveillance in libraries (something fundamentally at odds with library principles), but also greater control over the information people have access to. The cynics amongst us might suggest that what is really happening is that the government is exerting greater control over our freedoms. A national catalogue creates a national database of civilians whose personal information and reading habits are now accessible to government where before they were localised. A librarian in any branch in the country has access to a huge national database of phone numbers, emails and personal addresses, and while we cannot see reading history, we can see what anyone in the country is currently reading. I wonder how members of the public feel about this from a privacy and security perspective. National procurement threatens to centralise the control of information and people’s access to it, as well as exerting greater control over what information the public can access freely through their libraries. And finally, add to this an Open Library system in which members of the public are no longer able to enter a public building without being video recorded and personally identified, and you have the destruction of a key pillar of freedom in a democratic society. In fact, libraries could be the last truly free space left in our society, and the national catalogue, for all of its promise, should not be viewed separately from other government initiatives.

Part 3: The Semiotic Approach to Citation Indexing


This brings us to the final approach to theorizing citation indexing, termed by Chubin and Moitra (1975) ‘phenomenological’ in that it looks at citing as a symbolic exchange. Small (1980) puts forward the idea that citations become markers or symbols indicative of theories, concepts, ideas or methods. Blaise Cronin has developed this approach in a more interesting way by looking at citations as both sign and symbol in his essay ‘Symbolic Capitalism’. Cronin (2005, 143) asserts that a citation is a signaling device or action indicating that one is familiar with, and has drawn upon, a particular author and work. However, here Cronin once again places equal emphasis on the author and the work, meaning his semiotic approach draws more from structuralism than post-structuralism. The concept of the author becomes a regulating force over all future iterations of a text, meaning that the text can never be re-conceptualized into something more innovative, finally released from its original hegemonic authorial context and given a life of its own. Cronin is not alone in positing a semiotic, symbolic relationship between texts. Wouters (1993, 7) suggests that citations act as two different signs, one that points back to the original text and one that refers to its own context. Warner (1990, 28) rejects this approach, arguing that “the ambiguity of citation in aggregate form can be seen as a special case of the indeterminacy other written signifiers, such as words, can acquire when torn from their discursive context”. However, while there is some validity to this argument, Warner is still reliant on a citation being held within an original authoritative context.
Cronin’s approach (2005, 156) is successful with regard to his focus on sign systems, arguing that “references and citations need to be unraveled in respect of their respective sign systems.” He (Cronin 2005, 159) goes on to suggest that this sign system is triadic in nature: “The referent of the bibliographic reference is a specific work; the referent of a citation the absent text that it denotes; in the case of large-scale citation counts, the referents are the cited authors.”

The problem with Cronin’s approach is that he views language from a structuralist perspective, as is clearly evident from his triangular structure of language in which signs fall back upon an original context. But reference to Roland Barthes’ theory above demonstrates that signs do not necessarily operate in such a coherent direction. Rather, signs are dispersive entities that ripple out into the past, present and future, thereby creating multiple contexts. They do not necessarily fold back upon the original text, but rather re-conceptualize that text, pushing it into the future as a ‘new’ work. Baudrillard (1981, 150) would refer to Cronin’s sign system as the “mirage of the referent”. This essay supports Baudrillard’s concept of the sign as a kind of false referent that merely appears to signal back to the original text. So, citations as signs do not really contain a past; rather, they only push past texts into a newly imagined future. In many ways, post-structuralism depicts citations as signs in terms of what Brian McHale (1987, 166) defined as heteroglossia, that is, “a plurality of discourse […] which serves as the vehicle for the confrontation and dialogue among world-views”. What this recognition must do is destroy any sense of hierarchy within the citation process. It can not only tear citations from their authorial and hierarchical structure, but also seriously undermine the normative approach into which all theories appear to fall back, no longer allowing citations to be retained under a hegemonic capitalist scheme.

In conclusion, this paper has attempted to explicate the three main approaches to understanding and theorizing citation indexing, through a brief review of the literature available in the academic field. The suggestion is that citation indexing has become blinded to the hierarchy that now controls it. In this sense, Sosteric’s (1999) warning that hegemonic control over scholarship would be realized through the proliferation and globalization of citation practices in the wake of the technological revolution has well and truly come to pass. Scholars have become blinded to the underlying capitalism that controls scholarly thinking by embedding scholarship within a fragmented and contradictory paradigm. Many scholars argue for a more expansive theory through the interpretative, phenomenological and semiotic approaches, but these remain retained within authoritative contexts and ultimately collapse back into a normative approach. By identifying the persistence of an underlying capitalist structure, this essay has attempted to take a more holistic and ontological approach to the subject. It has also attempted to utilise some post-structuralist theory in order to develop the semiotic approach of Cronin. In doing so, this paper argues for the freeing up of Cronin’s sign system to incorporate a more dispersed, heterogeneous theory that could ultimately create a freer, more autonomous citation system.


Baudrillard, J. (1981), For a Critique of the Political Economy of the Sign, London: Telos Press

Chubin, D.E. & Moitra, S.D. (1975), Content analysis of references: adjunct or alternative to citation counting? Social Studies of Science, 5, 423-441

Cronin, B. (2005), ‘Symbolic capitalism’, The Hand of Science: Academic writing and its rewards. Lanham, MD: Scarecrow.

McHale, Brian (1987), Postmodernist Fiction, New York and London: Methuen

Small, H.G. (1980), Co-citation context analysis and the structure of paradigms, Journal of Documentation, 36(3), 183-196

Sosteric, M. (1999). Endowing mediocrity: Neoliberalism, information technology, and the decline of radical pedagogy. Radical Pedagogy. http://www.radicalpedagogy.org/radicalpedagogy.org/Endowing_Mediocrity__Neoliberalism,_Information_Technology,_and_the_Decline_of_Radical_Pedagogy.html

Warner, J. (1990), Semiotics, information science, documents and computers, Journal of Documentation, 46(1), 16-32

Wouters, P. (1993), Writing histories of scientometrics or what precisely is scientometrics?

Digital Curation & Preservation: At what cost?

It continues to astound me on this MLIS course how many ideas, theories and practices blindly press ahead with the supposed ‘advancement’ of the industry without ever addressing important fundamental questions about the underlying nature, impact and value of the work being undertaken. It also amazes me how information studies academics continue to theorise while passively ignoring the poststructuralist theory that has been informing many other disciplines, uninterrupted, for the last 50 years. Reading Helen Shenton’s work has left me no less bemused.

Digital curation and preservation takes as its starting point the mantra ‘we must preserve’ without ever asking whether it is right, or indeed valuable, to preserve. Poststructuralism has worked hard to ensure that history and culture are not controlled as homogeneous entities, but digital curation is now threatening to undo much of that good work. Poststructuralism is a theory of language that denies words as static, culture-building objects, and instead views language as a highly dispersive, subjective, heterogeneous experience. It is the theory that underlies so many of our achievements of the last 50 years. It led to the feminist movement, to the reconceptualisation of history as a discipline, and to the destruction of periodisation in literature. With real-world artifacts we still have the potential to make new discoveries about the past. However, with born-digital objects, which may only have a lifespan of up to 25 years, we will not have the capacity to re-write the past through new discoveries. As a result, the digital curators of today are essentially the historians of tomorrow. The files they choose to save will create a static history that cannot be questioned in the future. Howard Zinn, a postmodern historian, argued that history has traditionally been written by those who win wars. Digital curation, funded by governments or private organizations, is in danger of destroying the culture it aims to preserve in what could become a Big Brother-like scenario.


Helen Shenton’s work in ‘Virtual reunification, virtual preservation and enhanced conservation’ focuses on the digitisation of dispersed works. It is in many ways a hugely interesting project, but its real underlying value needs to be questioned. It is disturbing that Shenton’s work has ‘reunification’ as its goal. This word summons forth a whole litany of other terms like ‘empire’, ‘colonisation’, ‘power’, ‘race’, ‘slavery’ and ‘control’, to name but a few. It inherently references imperialism at a time when the breaking apart of the United Kingdom has become a real possibility in the near future. The fact that some important texts exist in a dispersed format is in itself culturally significant, because it is indicative of the breaking apart of empire itself. Bringing these texts together has the potential to create a false narrative and a homogeneous cultural discourse, and in this sense Shenton, like many of her contemporary information professionals, uses an outmoded form of structuralism to inform her ideas. She argues, in relation to the Sinaiticus Project, that it requires ‘the production of an historical account of the document’ that needs to be objective. The very idea that a homogeneous ‘objective’ narrative is being added to these documents is a regulating process that ignores the lessons learned in the arts through poststructuralism. Structuralism is also implicitly present in the layer of information, in the form of digital links, overlaid on the manuscripts, which again inherently asserts control and authority over the material. Shenton has not stopped to ask what the cost of such a project is. Nor has she asked why the British Library feel they have the right to oversee the reunification of material from different cultures around the world.

The British Library is not only collecting material; it is seeking to play a role in culture building. I thought the function of a library was to provide non-judgemental access to information. Shenton talks about ‘enhancing’ culture through diplomacy, insofar as cultural diplomacy can play a role in international relations. This shows that there is an implicit and dangerous politics behind these preservation projects. Questions need to be posed as to for whom the British Library is attempting to play a role in international relations, and to what end. This project seems to go beyond simply collecting material; it is ‘using’ material to re-tell an old story of empire. It feeds into an attempt by governments to create and control fake grand narratives. Howard Zinn’s principle of postmodern history was a way of challenging power by telling history through dispersed narratives. Shenton’s digitisation project runs the risk of cutting off avenues to the past for us here in the present and, more dangerously, for people in the future. It poses the danger of manipulating information in ways that reassert a new kind of imperialism, a new homogeneity of information, and an oppressive future in which subjectivity is no longer valued.


IT = Innovative Management System or Panoptic Hegemonic Control?


Fatat Bouraad, in ‘The Emerging Operations Manager’, puts forward the thesis that the increasing reliance on IT services and IT-skilled staff needs a framework in order to develop new methods of management. This is because, as IT becomes more prevalent, new methods of observing, evaluating and managing staff also emerge, allowing for shifts in management styles. However, I think it is important to ask whether we are managing staff through IT, or whether IT is becoming a mechanism for a more totalitarian style of management in which the machine allows for an even stricter top-down management style.

I recently read an article by Mike Sosteric called ‘Endowing Mediocrity’, in which the author posits that IT in all its forms comes out of an increasingly prevalent surveillance culture within business, education and social media. This surveillance is of course facilitated more easily through the use of IT, but rather than creating a flatter structure, it tends towards a deceitful panopticism. Sosteric (1999) argues that “Panoptic systems thus function as systems of behavioural and ideational (hegemonic) manipulation and control.” So Bouraad may argue that IT allows for greater efficiency and a tendency towards a flat system, but he also argues for a framework through which this flat system should operate, which is somewhat contradictory. There needs to be an understanding that, as we move further into the realm of IT-based systems, all of our actions are constantly under surveillance by the hierarchy we work within.


Furthermore, modern communication systems were actually designed to create greater control over human targets. I use this language deliberately. Norbert Wiener is the father of modern IT-based communication systems. He developed them as a way of controlling military missiles during flight so that they could become more accurate. The endgame was always to gain greater control over the end user/target, and the same principles are now used in modern computing. In the ‘know how to be’ stage of his new operations management theory, Bouraad argues that employees must remain up to date in order to develop a continued propensity towards innovation. However, innovation rarely comes about within an environment of surveillance. Most companies are either trying to control employees or attempting to control the consumer habits of targeted customers. Information industries have contributed increasingly to this manipulation of end users through the spread of Big Data and internet monitoring. These issues have serious implications for libraries too as they move towards digital and online forms of dissemination. We should of course embrace the many benefits that IT gives us, but we should never lose sight of where this IT has come from and the negative impact it can have on the personal liberty of our information professionals and the public they serve.