
Over-qualified & Under-experienced

Intro Slide

The above slide best sums up my first year seeking employment as a new Information Professional. While I have made it to a few interviews for Assistant Librarian roles, I have failed to secure these posts because I do not have enough direct experience in them. This means I need to find an entry-level Library Assistant role in order to build that experience. However, I have found it much more difficult to get interviews for entry-level Library Assistant roles: experienced librarians I have spoken to about this problem tell me I am over-qualified for them.

Slide 1

So what makes me over-qualified and under-experienced? I have 12 years' experience in teaching and education. I have taught Secondary English and worked as an Examiner for the Leaving Certificate Mock Exams; I have worked as an Assistant Lecturer in English Literature at Maynooth University; I have taught English Language, Technical English and Academic English in universities and private companies in three countries, including UCD here in Ireland; and I have taught Academic Writing, Research Skills and Information Literacy at third level.

Since graduating from my MLIS at UCD, I worked for a year as a Library Assistant at the University of Surrey in the UK because I could not find a job here in Ireland. For the past three months, I have been working as a Library Assistant in Laois Libraries on a temporary contract that could end at any moment. I also have two postgraduate degrees.

Slide 2

In 2016, I applied for Library Assistant jobs in 6 county libraries and 5 academic libraries, but have so far been invited to only one interview, in Laois. In that county, there are more than 17 Library Assistants, yet I am the only one with a library qualification. I finished 10th on that panel, from which 5 people were hired on full-time permanent contracts. One of the 5 had meaningful library experience, and none had a qualification. I am happy for anyone who finds a job, but the numbers in public libraries are too heavily weighted towards those without library qualifications, suggesting the qualification is not valued for these positions. I personally have no objection to libraries hiring Assistants without a qualification, because I understand their need for security and continuity in their staff: many Library Assistants are not interested in being upwardly mobile, unlike most MLIS graduates. Also, the people I work with at the University of Surrey and Laois Libraries are excellent at their jobs, and I learn from them every day. My experience illustrates just how challenging it is for new professionals to land that first position in the current market.

Slide 3

So how can you maximise your opportunities? You need to work hard on your CV, creating a new one for every job. I delete my CV every time I apply for a job because it forces me to write a fresh one for each application, using the job description to guide how I describe my experience. The Library Association of Ireland is also an amazing community of professionals from whom you can learn and with whom you can build in-person and online networks; it provides many CPD opportunities. This is something I need to work on more myself. Using your new CV and networking powers, you can land that first job. But from there you need to make that job work for you. Rather than sitting at the information desk at the University of Surrey, I got involved in Teaching & Learning projects, shadowed Subject Librarians, and took on cataloguing projects when I heard the Cataloguing team was very busy. I travelled outside my own job description in order to develop experience that strengthens my employment opportunities. So long as you are doing your primary job well, your Line Manager will likely be open to supporting your development. In my current role, I catalogue donated items rather than sending them back to Library HQ; I get involved in collection development and weeding; I am currently organising workshops for Leaving Certificate students in the area; and I am planning to run basic computer and web design courses in the library in the near future.

Slide 4

However, while there is a vibrant, helpful library community in this country, we have to balance our view by recognising that there is also a political and economic reality at work. Many librarians who are active in the community and through the LAI go back to their daily professional posts and fail to act upon the needs of new professionals in their hiring policies. They create CPD opportunities through the LAI but fail to realise that, for career development to really happen, more opportunities to apply those skills in a professional role need to be created. So by all means work hard on your CV, and doubly so on networking. But don't forget to remind the senior librarians you meet at CPD events that their obligation to nurture the profession does not stop once they leave their LAI committee meetings, and that they need to carry that sentiment back into real professional environments, to their hiring committees and interview panels, so that we can be afforded the opportunity they themselves were once given. Thank you.

Slide 5

Part 3: Assessing Ireland’s Open Library Initiative

Will the open library provide more or less access to information for all people within the community?

I started this article by suggesting that open libraries could provide greater access to information, and that this could potentially be a good thing. They provide greater access simply because the library is open for longer periods of time. However, does an open library provide greater access to information than a less technologically enabled, staffed library?

If the user is computer literate, then the staffed library and the open library provide the same access to information. However, if the user is not computer literate, then the open library provides less access to information, because many people rely on librarians to help them use the IT facilities in the library. Many users do not know how to search the catalogue in order to find the books they are looking for; others do not know how to log on to a computer, let alone search a digital database effectively; most users cannot use the photocopying and scanning facilities without help; and even in libraries with self-service machines, most users come to the desk with their items anyway, preferring the human interaction and service they get from the staff. Open libraries exclude all of these people. So, if it is cheaper to staff a library than to set it up for open access, and if open access excludes users, is it not true that staffed libraries provide greater access to information than open libraries, given that a staffed library service in the evening serves more people for longer periods of time? The simple truth is that librarians are as much a part of the access infrastructure of a library as computers are.


Part 2: Assessing Ireland’s Open Library Initiative

Who are the designated end users and does the Open Library truly serve them?

The designated community of an open library is users who are computer literate and technologically enabled. Many people who attend the library during the day are excluded because they do not know how to use the technology in the library. If the open library scheme is targeting people who are working during the day, then that is fine; it is acceptable to target these users if the library is open to everyone else throughout the day. However, if the end user is someone in full-time employment or education and is already technologically enabled, is the open library the best solution for their needs?

The bottom line is that the library now provides more electronic resources than physical copies. You can borrow e-books and e-magazines through the library website, you can take courses through University Class, and you can learn languages too. True open access for a technologically enabled user actually means 'remote access'. That user can access electronic resources from home, from work, on the train or on a lunch break. That user already has a broadband or 3G connection and is already connected to the information they need. Does it make sense to invest millions countrywide to set up individual open libraries to serve people who are already online? Would it not make more sense to invest this money in electronic and digital resources that can be accessed 24/7, from anywhere, by anyone with an internet connection? This matters because one countrywide electronic resources licence serves the entire country, whereas the open library initiative is setting up single, individual libraries in every county to provide greater access to information. It is like choosing to pay for hundreds of Windows licences when one would cover the entire country at a lower cost.

So, if the actual information can be provided electronically, why else would someone need an open library? Of course, people will use it not just for the information it holds, but also for the facilities, i.e. copying, computing and studying. Again, the designated or targeted user likely already has a computer and printer at home, or will use printing and photocopying services at work. So if the open library will only be open in the evening, and the people using it will already be technologically enabled, is it worth the investment to open the library as a study space for professionals and students? There is no doubt that it would be useful for people to study and hold meetings in the library, but do we really need all of the additional security and technology to provide people with a desk, a chair, or a group meeting room? And will people feel safe and comfortable enough to use it anyway?

It appears that the reasons behind the open library are misguided. The attempt to appease disgruntled librarians and patrons by arguing that the open library will only be open for a few hours in the evenings and on Sundays simply does not add up, financially or from an end-user perspective.


Solutions to Controlled Vocabularies, Part Two


Structuralism can be understood as a normative science in which a classification system begins with the norm and afterwards, if necessary, deals with any exceptions. This goes a long way towards explaining how some of the solutions to classification biases are conceptualised. Both Olson and Mai have essentially inserted into the old systems a new way to deal with exceptions to the norm. They both imply an understanding and incorporation of difference into their models, but only once the normative system has first been applied. In poststructuralism, however, there emerges a more appropriate definition of difference: not as an exception or 'limit' to the 'core', but as a regulating principle that functions to define the core of a subject. As James Williams (2014) demonstrates, in poststructuralism "the limit is not compared with the core, or balanced with it […] the limit is the core". Poststructuralism sees dualism as a problematic approach to understanding language, and what is even more problematic about the dualist approach of classification theorists is that they unbalance the dualism between sameness and difference, lending more significance to sameness than to difference. Perhaps it is better to explain the principle of 'core' and 'limit' through an example. Something like 'Irishness' is traditionally understood through what is at its core: being born in a certain place and time, to certain parents, being a certain skin colour, speaking a certain language, and so on. This understanding of Irishness is what regulates our political system and society as a whole. The 'limit' in this example would be the problems that arise with the definition once we factor in ethnic minorities who become naturalised with their own sets of cultures and histories.
But in a traditional, structuralist understanding, these minorities are the exception, and while the Irish government may legislate to create better conditions for ethnic minorities, there will always be discrimination because the limit does not change the core. Poststructuralism, however, in Williams' (2014) words, would argue that "The truth of a population is where it is changing. A nation is defined at its borders"; that is, at the point of difference, because everything that happens at the borders of a country changes how the core is defined. It is the 'difference' of ethnic minorities that is most representative of where a nation is going in the future, and so the 'limit' is, for poststructuralism, the most meaningful way in which a nation, or a text, or indeed a classification system can be defined. In terms of vocabulary control, then, what defines how a text is classified should not be a biased and static system. Texts should be classified by emphasising how they are being, or could be, used in 'different' ways by different disciplines. A book like 'Words of Power', which would be classified under LCSH and DDC as Philosophy: Logic, would no longer be limited to such static systems. It could also be used by feminist scholars, or by those working in ethics, anthropology, sociology, psychology, gender studies, linguistics, and so on. Defining the book under strict controlled vocabularies denies access to 'exceptional' groups, which results in a lack of real innovation and creativity in academia. In this sense, there are lessons to be learned from the poststructuralist theorist Deleuze, who argues for the power of openness in creativity.

To reassert, then: poststructuralism denies the traditional approach to classification through controlled vocabularies and aims to positively disrupt traditional classification systems in order to achieve greater autonomy for texts and their users. The problem with the proposed solutions is that they are developed from the point of view that it would be too disruptive to completely change our classification systems. But here disruption is seen negatively rather than positively. The seriousness of this ultimately limiting attitude, and of the reliance on outmoded classifications, can best be understood by applying the work of Jacques Derrida to the topic. Derrida's (1976) 'textual positivism' would not ask 'what is this book about?', but rather, 'what does this book do?' This question radically changes the way we would categorise texts because it places the emphasis on multiplicity of use rather than on the singularity of meaning as defined by 'specialists'. Derrida's approach rejects anthropocentrism and sees the classification system as onto-theological. Derrida's 'origin' is constantly affected by a text's 'presence', that is, what a text is doing in the moment. It is this 'presence' that leads to both the future and the past, essentially pulling the 'origin' into ever-evolving new contexts. A 'sign', then, for Derrida is nothing more than a 'trace' of that change, a trace that can be followed to a point of difference so long as we understand that once we reach the trace, we too have altered its origin, which has moved off beyond our grasp. It is this point of difference that allows creativity to emerge in what Derrida defines as 'play'.
In this sense, applied to classification systems, the moment at which a reader reads 'Words of Power' in relation to psychoanalytical studies is the moment that changes the origin indefinitely, opening that text up to new contexts in the future and the past, and thereby changing how it should be defined in a classification system.


This essay will argue, then, that classification systems should take 'difference' as a key underlying principle in the way we organise information. To do otherwise is to consciously do 'violence' by excluding individuals or groups from our knowledge economy. As Derrida (1976, 140) writes, "There is no ethics without the presence of the other but also, and consequently, without absence, dissimulation, detour, differance, writing". Traditional classification systems, and the solutions outlined in this paper, try to regulate the past, present and future of information in order to make it more accessible. However, the adoption of a classical scientific approach only makes it possible to categorise texts if language is seen as static. But Derrida (1976, 67) teaches us that "The concepts of present, past, and future, everything in the concepts of time and history that assumes their classical evidence – the general metaphysical concepts of time – cannot describe the structure of the trace adequately". This constitutes a denial that there is any 'final' past, present or future of a text. Derrida (1976, 69) would argue that what is really happening in controlled vocabularies is a violent and unethical act of control, borne out of a fear and misunderstanding of 'death': "Spacing as writing is the becoming-absent and the becoming-unconscious of the subject. By the movement of its drift/derivation the emancipation of the sign constitutes in return the desire of the presence. That becoming – or that drift/derivation – does not befall the subject which would choose it or would passively let itself be drawn along by it. As the subject's relationship with its own death, this becoming is the constitution of subjectivity." Death in this sense is seen as continuity from one context to the next rather than as a final end.
In many ways, traditional classification systems are a kind of death sentence in the traditional sense, in that they render entire texts and disciplines static and irrelevant as new contexts emerge and are classified in inadequate ways. Perhaps the point is best made by Williams (2014), who claims that "The demand for clarity is dangerous because clarity justifies violent judgements and exclusions on the basis of a promise of a world of understanding and togetherness."

What is happening in our classification systems, then, is that subject specialists attempt to create greater accessibility to information by categorising it under specific, specialist subject headings, in order to create a sense of clarity when one is searching for that information. Texts are gathered together around a principle of sameness and difference, in which things that are similar are classified under the same subject headings. The reality is that this system, based on controlled vocabularies, is extremely biased and fails to account for real difference, heterogeneity and multiplicity in our information world. There have been attempts to create new systems that, for example, are more appropriate for those interested in feminist studies, but these systems simply shift the control from one universal group to another, smaller one. They do highlight a very important politics and injustice in the way texts are classified, but they do so by reverting to an equally biased system. It is at the point of difference that real innovation and creativity occur. Our universities are set up as places in which creativity and independent thinking are supposed to lead to new innovations, while our public libraries are moving more and more towards providing creative spaces for communities to grow and develop. Yet the way in which we search for information is limited and contradictory, and no longer fit for purpose. The argument that 'difference' should be prevalent in classification systems is not absurd or contradictory, but it would require a complete overhaul of the way in which we understand and categorise information. Perhaps there is already a working model in the way internet search engines like Google operate. Websites on Google are presented to us in a way that can promote difference as a classifying principle.
This is because websites with the most links to other active sites are presented as more prevalent and relevant. What would happen if a similar approach were adopted by libraries? What if, when we search for information under a certain topic, we were presented with a list of texts organised around the number of connections those texts have to other texts, and thus to other disciplines? This would perhaps provide a classification system that celebrates multiplicity, ranking texts according to the many possible ways they can be used and interpreted. In the field of literature, I remember my PhD supervisor suggesting that I include some comparative work between John Banville and Gabriel Garcia Marquez in my thesis, because the postcolonial contexts of Ireland and Colombia bear many similarities. As a researcher, in all the hours spent searching for information on Banville and postcolonialism, I was never presented with any texts that implied real difference or multiplicity. Searches were singular and restrictive, never indicating that any different approach was possible. The classification system provided no new spark of creativity for young researchers to pursue. In fact, using Boolean logic, I found that the more search terms I entered in different searches, the more relevant the results that were presented. If difference were to become an organising principle, then information would be retrieved that prioritised multiplicity, and that would lead to greater inclusion and thus innovation. The problem at the moment is that the way we search the digital databases is dictated by the way information is placed on the stacks. The stacks, or the numbers on the books, do not have to change. Where the information is located in the library is irrelevant, because we rarely search the stacks nowadays anyway. What needs to change is the way information is classified in the digital databases.
There is nothing stopping us from radically changing this digital system to one that is more inclusive of difference and that contains less ‘violent’ vocabularies.
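To make the proposal above concrete, here is a minimal sketch (in Python, with invented records and link counts, purely for illustration) of what ranking by 'difference' might look like: texts are ordered by the number of distinct disciplines linking to them, rather than by a single fixed subject heading.

```python
# Illustrative sketch: rank catalogue records by cross-disciplinary links,
# analogous to how search engines weight pages by inbound links.
# The records and link data below are invented for the example.

from collections import defaultdict

# Hypothetical link data: (record, discipline that cites or links to it)
links = [
    ("Words of Power", "philosophy"),
    ("Words of Power", "gender studies"),
    ("Words of Power", "linguistics"),
    ("Intro to Logic", "philosophy"),
    ("Intro to Logic", "philosophy"),
]

# Score each record by the number of *distinct* disciplines linking to it,
# so multiplicity of use, not volume within one field, drives the ranking.
disciplines = defaultdict(set)
for record, discipline in links:
    disciplines[record].add(discipline)

ranking = sorted(disciplines, key=lambda r: len(disciplines[r]), reverse=True)
print(ranking)  # 'Words of Power' ranks first: three disciplines versus one
```

A real system would of course need richer link data (citations, co-loans, reading lists), but the ordering principle itself is this simple: the more disciplines a text connects to, the more prominently it is surfaced.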


Beghtol, C. (1986). "Semantic validity: concepts of warrant in bibliographic classification systems". Library Resources & Technical Services, Vol. 30 No. 2, pp. 109-125.

Borges, J.L. (1952). “The analytical language of John Wilkins”, Other Inquisitions 1937-1952. Souvenir Press, London, 1973.

Bowker, G.C. and Star, S.L. (1999). Sorting Things Out: Classification and Its Consequences. MIT Press, Cambridge, MA.

Derrida, Jacques (1976). Of Grammatology. Johns Hopkins University Press, Baltimore, MD.

Lakoff, G. (1987). Women, Fire, and Dangerous Things: What Categories Reveal about the Mind. University of Chicago Press, Chicago, IL.

Mai, Jens-Erik (2010). "Classification in a social world: bias and trust". Journal of Documentation, Vol. 66 No. 5, pp. 627-642.

Miksa, F. (1998). The DDC, the Universe of Knowledge, and the Post-Modern Library. Forest Press, Albany, NY.

Olson, Hope A. (2001). "Sameness and difference: a cultural foundation of classification". Library Resources & Technical Services, Vol. 45 No. 3, pp. 115-122.

Shirky, C. (2005). "Ontology is overrated: categories, links, and tags". Clay Shirky's Writings about the Internet. Economics & Culture, Media & Community, available at: http://www.shirky.com/writings/ontology_overrated.html (accessed 28 April 2015).

Williams, James (2014). Understanding Poststructuralism. Routledge, London.

Part 3: The Semiotic Approach to Citation Indexing


This brings us to the final approach to theorizing citation indexing, termed by Chubin and Moitra (1975) the 'phenomenological' approach, in that it treats citing as a symbolic exchange. Small (1980) puts forward the idea that citations become markers or symbols indicative of theories, concepts, ideas or methods. Blaise Cronin has developed this approach in a more interesting way by looking at citations as both sign and symbol in his essay 'Symbolic Capitalism'. Cronin (2005, 143) asserts that a citation is a signaling device or action indicating that one is familiar with, and has drawn upon, a particular author and work. However, here Cronin once again places equal emphasis on the author and the work, meaning his semiotic approach draws more from structuralism than from post-structuralism. The concept of the author becomes a regulating force over all future iterations of the text, meaning the text can never be re-conceptualized in ways that lead to greater innovation, never released from its original hegemonic authorial context and given a life of its own. Cronin is not alone in positing a semiotic, symbolic relationship between texts. Wouters (1993, 7) suggests that citations act as two different signs, one that points back to the original text and one that refers to its own context. Warner (1990, 28) rejects this approach, arguing that "the ambiguity of citation in aggregate form can be seen as a special case of the indeterminacy other written signifiers, such as words, can acquire when torn from their discursive context". However, while there is some validity to this argument, Warner is still reliant on a citation being held within an original authoritative context.
Cronin’s approach (2005, 156) is successful with regard to his focus on sign systems, arguing that “references and citations need to be unraveled in respect of their respective sign systems.” He (Cronin 2005, 159) goes on to suggest that this sign system is triadic in nature: “The referent of the bibliographic reference is a specific work; the referent of a citation the absent text that it denotes; in the case of large-scale citation counts, the referents are the cited authors.”

The problem with Cronin's approach is that he views language from a structuralist perspective, as is clearly evident from his triangular structure of language in which signs fall back upon an original context. But reference to Roland Barthes's theory above demonstrates that signs do not necessarily operate in such a coherent direction. Rather, signs are dispersive entities that ripple out into the past, present and future, thereby creating multiple contexts. They do not necessarily fold back upon the original text, but rather re-conceptualize that text, pushing it into the future as a 'new' work. Baudrillard (1981, 150) would refer to Cronin's sign system as the "mirage of the referent". This essay supports Baudrillard's concept of the sign as a kind of false referent that signals back to the original text. So, citations as signs do not really contain a past; rather, they only push past texts into a newly imagined future. In many ways, post-structuralism depicts citations as signs in terms of what Brian McHale (1987, 166) defined as heteroglossia, that is, "a plurality of discourse […] which serves as the vehicle for the confrontation and dialogue among world-views". What this recognition must do is destroy any sense of hierarchy within the citation process. It can not only tear citations from their authorial and hierarchical structure, but it can also seriously undermine the normative approach into which all theories appear to fall back, no longer allowing citations to be retained under a hegemonic capitalist scheme.

In conclusion, this paper has attempted to explicate the three main approaches to understanding and theorizing citation indexing, through a brief review of the literature available in the academic field. The suggestion is that citation indexing has become blind to the hierarchy that now controls it. In this sense, Sosteric's (1999) warning about hegemonic control over scholarship, through the proliferation and globalization of citation practices in the wake of the technological revolution, has well and truly been realized: scholars have become blind to the underlying capitalism that controls scholarly thinking by embedding scholarship within a fragmented and contradictory paradigm. Many scholars argue for a more expansive theory through the interpretative, phenomenological, and semiotic approaches, but these remain retained within authoritative contexts and ultimately collapse back into a normative approach. By identifying the persistence of an underlying capitalist structure, this essay has attempted to take a more holistic and ontological approach to the subject. It has also attempted to utilise some post-structuralist theory in order to develop the semiotic approach of Cronin. In doing so, this paper argues for the freeing up of Cronin's sign system to incorporate a more dispersed, heterogeneous theory that could ultimately create a freer, more autonomous citation system.


Baudrillard, J. (1981), For a Critique of the Political Economy of the Sign, London: Telos Press

Chubin, D.E. & Moitra, S.D. (1975), "Content analysis of references: adjunct or alternative to citation counting?", Social Studies of Science, 5, 423-441

Cronin, B. (2005), ‘Symbolic capitalism’, The Hand of Science: Academic writing and its rewards. Lanham, MD: Scarecrow.

McHale, Brian (1987), Postmodernist Fiction, New York and London: Methuen

Small, H.G. (1980), Co-citation context analysis and the structure of paradigms, Journal of Documentation, 36(3), 183-196

Sosteric, M. (1999). Endowing mediocrity: Neoliberalism, information technology, and the decline of radical pedagogy. Radical Pedagogy. http://www.radicalpedagogy.org/radicalpedagogy.org/Endowing_Mediocrity__Neoliberalism,_Information_Technology,_and_the_Decline_of_Radical_Pedagogy.html

Warner, J. (1990), Semiotics, information science, documents and computers, Journal of Documentation, 46(1), 16-32

Wouters, P. (1993), Writing histories of scientometrics or what precisely is scientometrics?

Part 2: The Interpretative Approach to Citation Indexing


The second, and again fragmented, approach to citation indexing is best described as the interpretative approach, which relies on the idea of citation as a communicative act that forms a relationship between texts. Firstly, May (1967, 890) suggests that citations are 'deviants' and are partly informed by the 'scientific, political and personal motivations' of the user. This has led to citation indexing being viewed as a social science, with the emphasis on communication between texts and also between authors. Mitroff (1972) questioned the normative approach in this way by suggesting that referencing relies on subjective behavior in the methods scientists use to cite. What this means, according to Gilbert and Mulkay (1982), is that citation behavior is context dependent. The problem with this approach was best summarized by Blackwell and Kochtanek (1981), who point out that while citation indexing is a communicable relationship involving two texts, it does not make explicitly clear what the nature of the relationship actually is. This leads to the inclusion of psychological analysis in the debate. Harter, Nisonger and Weng (1993) suggest that there is psychological validity to citation usage, in that citations don't always retain a clear topical relevance. However, little consensus was reached regarding what the texts are actually saying to each other, given the rather slippery application of terms like 'subjective' and 'context-based'.

Once again, as long as a fragmented, rather than holistic, approach is taken to the subject, the real value of citation indexing, if any at all, will not be realized. For example, Stanley Fish (1989, 164) argues from a reader-response perspective that "the convention is a way of acknowledging that we are involved in a community activity in which the value of one's work is directly related to the work that has been done by others; that is, in this profession you earn the right to say something because it has not been said by anyone else, or because while it has been said, its implications have not been spelled out." However, Fish's explanation only assesses the citation process through an insular relationship between two texts, rather than assessing qualitatively 'why' a specific cited text is valuable. His approach still falls victim to the idea expressed by Voos and Dagaev (1976) that citations function on the assumption that they have an equivalent value. In this sense, citations fail to distinguish between degrees of importance between differently weighted texts. This has led to Czarniawska-Joerges' (1998, 63) supposition that citations act as a "trace of conversations between texts". In this sense, Merton's (1977, 84) early argument that citations are too cognitively complex to be accurate and comprehensive in their citation behavior still holds true. However, this does not prevent Brodkey (1987, 4) and Bourdieu (1991, 20) from falling back on the idea that there are normative procedures regulating citation practices.

The key problem with the interpretative approach is that it adopts the scientific tenet that the process of citing, and the relationship between texts, must be clearly defined. Post-structuralist theory is useful here in that it demonstrates that one cannot assert clear definitions based around authorial intention onto context-based reading processes. Roland Barthes’s (1967) essay ‘The Death of the Author’ argues that understanding the intention of an author is neither useful nor desirable when interpreting textual referents, because language operates not as a circular, reciprocal structure but as a more dispersed set of signs. A citation marker, then, cannot refer backwards to highlight the importance of an older text; it can only refer forward into the future of that ‘old’ text. Language does not work as a static system: in post-structuralism, language is a highly dispersive and heterogeneous set of markers that pushes ‘past’ texts into the future while re-contextualizing them in the process. The interpretative approach, by contrast, views language as something ordered and permanent. The fact that its proponents cannot work out what these ‘ordered’ citation markers are actually saying should act as a solid indicator that the markers do not conduct a conversation between the original cited work and the citing work; rather, each time a work is cited it is transformed into a new context and takes on new signification. In this sense, the author as an authority becomes irrelevant and dispersed, in that he or she cannot possibly retain control of the original information. It is here that the academics theorizing citation indexing come unstuck.
If they fall back upon a normative approach, then they must recognize how ideologically corrupt that approach is, given the overarching commodification of education and research, not to mention that normative theory is an attempt to assert control and authority by prescribing predictable practices. However, if they embrace contemporary linguistic and cultural theory, then they must accept that they will lose control of the hierarchy altogether.


Barthes, Roland (1967), The Death of the Author, Aspen, No. 5-6

Blackwell, P.K. & Kochtanek, T.R. (1981), An iterative technique for document retrieval using descriptors and relations, Proceedings of the 44th American Society for Information Science Annual Meeting, Washington: ASIS, 215-217

Bourdieu, P. (1991), Language and Symbolic Power, Cambridge, MA: Harvard University Press

Brodkey, L. (1987), Academic Writing as Social Practice, Philadelphia: Temple University Press

Czarniawska-Joerges, B. (1998), Narrative Approach to Organization Studies, London: Sage

Fish, S. (1989), Doing What Comes Naturally: Change, Rhetoric, and the Practice of Theory in Literary and Legal Studies, Durham, NC: Duke University Press

Gilbert, G.N. & Mulkay, M. (1980), Contexts of scientific discourse: social accounting in experimental papers, in Knorr, K.D. et al. (eds.), The social process of scientific investigation, Dordrecht: Reidel, 269-294

Harter, S.P., Nisonger, T.E. & Weng, A. (1993), Semantic relationships between cited and citing articles in library and information science journals, Journal of the American Society for Information Science, 44(9), 543-552

May, K.O. (1967), Abuses of citation indexing, Science, 156, 890-892

Merton, R.K. (1977), The sociology of science: an episodic memoir, in The Sociology of Science in Europe, Carbondale: Southern Illinois University Press, 3-141

Mitroff, I.I. (1972), The myth of objectivity or why science needs a new psychology of science, Management Science, 18, 613-618

Voos, H. & Dagaev, K.S. (1976), Are all citations equal? Or did we op.cit. your idem? Journal of Academic Librarianship, 1(6), 19-21

Digital Curation & Preservation: At what cost?

It continues to astound me on this MLIS course how many ideas, theories and practices blindly press ahead with the supposed ‘advancement’ of the industry without ever addressing fundamental questions about the underlying nature, impact and value of the work being undertaken. It also amazes me how information studies academics continue to theorise while passively ignoring the poststructuralist theory that has informed many other disciplines, uninterrupted, for the last 50 years. Reading Helen Shenton’s work has left me no less bemused.

Digital curation and preservation takes as its starting point the mantra ‘we must preserve’ without ever asking whether it is right, or indeed valuable, to preserve. Poststructuralism has worked hard to ensure that history and culture are not controlled as homogeneous entities, but digital curation now threatens to undo much of that good work. Poststructuralism is a theory of language that denies that words are static, culture-building objects, viewing language instead as a highly dispersive, subjective and heterogeneous experience. It is the theory that underlies so many of our achievements in the last 50 years: it led to the feminist movement, to the reconceptualisation of history as a discipline, and to the destruction of periodisation in literature. With real-world artifacts we still have the potential to make new discoveries about the past. However, with born-digital objects, which have a lifespan of only up to 25 years, we will not have the capacity to re-write the past through new discoveries. As a result, the digital curators of today are essentially the historians of tomorrow. The files they choose to save will create a static history that cannot be questioned in the future. Howard Zinn, a postmodern historian, argued that history has traditionally been written by those who win wars. Digital curation, funded by governments or private organisations, is in danger of destroying the very culture it aims to preserve in what could become a Big Brother-like scenario.


Helen Shenton’s work in ‘Virtual reunification, virtual preservation and enhanced conservation’ focuses on the digitisation of dispersed works. It is in many ways a hugely interesting project, but its real underlying value needs to be called into question. It is disturbing that Shenton’s work has ‘reunification’ as its goal. This word summons forth a whole litany of other terms like ‘empire’, ‘colonisation’, ‘power’, ‘race’, ‘slavery’ and ‘control’, to name but a few. It inherently references imperialism at a time when the breaking apart of the United Kingdom has become a real possibility in the near future. The fact that some important texts exist in a dispersed format is in itself culturally significant, because it is indicative of the breaking apart of empire itself. Bringing these texts together has the potential to create a false narrative and a homogeneous cultural discourse, and in this sense Shenton, like many of her contemporary information professionals, uses an outmoded form of structuralism to inform her ideas. She argues, in relation to the Sinaiticus Project, that it requires ‘the production of an historical account of the document’, one that needs to be objective. The very idea that a homogeneous, ‘objective’ narrative is being added to these documents is a regulating process that ignores the lessons learned in the arts through poststructuralism. Structuralism is also implicit in the layer of information, in the form of digital links, laid over the manuscripts, which again asserts control and authority over the material. Shenton has not stopped to ask what the cost of such a project is. Nor has she asked why the British Library feels it has the right to oversee the reunification of material from different cultures around the world.

The British Library is not only collecting material but also seeking to play a role in culture building. I thought the function of a library was to provide non-judgemental access to information. Shenton talks about ‘enhancing’ culture through diplomacy, insofar as cultural diplomacy can play a role in international relations. This shows that there is an implicit and dangerous politics behind these preservation projects. Questions need to be posed about for whom the British Library is attempting to play a role in international relations, and to what end. This project goes beyond simply collecting material: it ‘uses’ material to re-tell an old story of empire. It feeds into an attempt by governments to create and control fake grand narratives. Howard Zinn’s principle of postmodern history was a way of challenging power by telling history through dispersed narratives. Shenton’s digitisation project runs the risk of cutting off avenues to the past for us here in the present and, more dangerously, for people in the future. It poses the danger of manipulating information in ways that reassert a new kind of imperialism, a new homogeneity of information, and an oppressive future in which subjectivity is no longer valued.