
Part 3: Assessing Ireland’s Open Library Initiative

Will the open library provide more or less access to information for all people within the community?

I began this article by suggesting that open libraries could provide greater access to information and that this could be a good thing. They provide greater access simply because the library is open for longer periods of time. However, does an open library provide greater access to information than a less technologically enabled, staffed library?

If the user is computer literate, then the staffed library and the open library provide the same access to information. If the user is not computer literate, however, the open library provides less access, because many people rely on librarians to help them use the library's IT facilities. Many users do not know how to search the catalogue to find the books they are looking for; others do not know how to log on to a computer, let alone search a digital database effectively; most users cannot use the photocopying and scanning facilities without help; and even in libraries with self-service machines, most users still bring their items to the desk, preferring the human interaction and service they get from staff. Open libraries exclude all of these people. So, if it is cheaper to staff a library than to set it up for open access, and if open access excludes users, is it not true that staffed libraries provide greater access to information than open libraries, given that a staffed evening service reaches more people for longer periods of time? The simple truth is that librarians are as much a part of a library's access infrastructure as its computers.


Part 2: Assessing Ireland’s Open Library Initiative

Who are the designated end users and does the Open Library truly serve them?

The designated community of an open library consists of users who are computer literate and technologically enabled. Many people who attend the library during the day are excluded because they do not know how to use the technology in the library. If the open library scheme is targeting people who are working during the day, then that is fine; it is acceptable to target these users as long as the library remains open to everyone else throughout the day. However, if the end user is someone in full-time employment or education and is already technologically enabled, is the open library the best solution for their needs?

The bottom line is that the library now provides more electronic resources than physical copies. You can borrow e-books and e-magazines through the library website, take courses through University Class and learn languages. True open access for a technologically enabled user actually means 'remote access': that user can reach electronic resources from home, from work, on the train or during a lunch break. That user already has a broadband or 3G connection and is already connected to the information they need. Does it make sense to invest millions countrywide in setting up individual open libraries to serve people who are already online? Would it not make more sense to invest this money in electronic and digital resources that can be accessed 24/7, from anywhere, by anyone with an internet connection? This matters because a single national electronic resources licence serves the entire country, whereas the open library initiative sets up individual libraries in every county to provide greater access to information. It is like choosing to pay for hundreds of Windows licences when one would cover the entire country at a lower cost.
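To make the licensing analogy concrete, the comparison reduces to very simple arithmetic. The figures below are purely hypothetical placeholders, not actual costs from the Open Library initiative; the only point is that a per-branch capital cost scales with the number of branches, while a national e-resources licence is a single cost.

```python
# Purely illustrative arithmetic: every figure here is a hypothetical placeholder,
# not a real cost from the Open Library initiative.

branches = 330                 # hypothetical number of branch libraries nationwide
fitout_per_branch = 50_000     # hypothetical cost to fit one branch with open-access
                               # technology (access control, CCTV, self-service kiosks)
national_licence = 500_000     # hypothetical one-off national e-resources licence

open_library_total = branches * fitout_per_branch
print(f"Per-branch open-access fit-out, nationwide: {open_library_total:,} euro")
print(f"Single national e-resources licence:        {national_licence:,} euro")
print(f"Cost ratio: {open_library_total / national_licence:.1f}x")
```

Whatever the real numbers turn out to be, the structure of the sum is the same: one cost multiplies with every branch added, the other does not.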

So, if the actual information can be provided electronically, why else would someone need an open library? Of course, people will use it not just for the information it holds, but also for the facilities, i.e. copying, computing and studying. Again, the designated or targeted user likely already has a computer and printer at home, or will use printing and photocopying services at work. So if the open library will only be open in the evening, and the people using it are already technologically enabled, is it worth the investment to open the library as a study space for professionals and students? There is no doubt that it would be useful for people to study and hold meetings in the library, but do we really need all of the additional security and technology to provide people with a desk, a chair, or a group meeting room? And will people feel safe and comfortable enough to use it anyway?

It appears that the reasons behind the open library are misguided. The attempt to appease disgruntled librarians and patrons by arguing that the open library will only operate for a few hours in the evenings and on Sundays simply does not add up, from either a financial or an end-user perspective.


Solutions to Controlled Vocabularies Part Two……


Structuralism can be understood as a normative science in which a classification system would begin with the norm and afterwards, if necessary, deal with any exceptions. This goes a long way towards explaining how some of the solutions to classification biases are conceptualised. Both Olson and Mai have essentially inserted into the old systems a new way to deal with exceptions to the norm. They both imply an understanding and incorporation of difference into their models, but only once the normative system has first been applied. In poststructuralism, however, there emerges a more appropriate definition of difference: not as an exception or 'limit' to the 'core', but as a regulating principle that functions to define the core of a subject. As James Williams (2014) demonstrates, in poststructuralism "the limit is not compared with the core, or balanced with it […] the limit is the core". Poststructuralism sees dualism as a problematic approach to understanding language, and what is even more problematic about the dualist approach of classification theorists is that they unbalance the dualism between sameness and difference, lending more significance to sameness than to difference.

Perhaps it is better to explain the principle of 'core' and 'limit' through an example. Defining something like 'Irishness' is traditionally understood through what is at its core, that is, being born in a certain place and time, to certain parents, being a certain skin colour, speaking a certain language, and so on. This understanding of Irishness is what regulates our political system and society as a whole. The 'limit' in this example would be the problems that arise with the definition once we factor in ethnic minorities who become naturalised with their own sets of cultures and histories. In a traditional, structuralist understanding, these minorities are the exception, and while the Irish government may legislate to create better conditions for ethnic minorities, there will always be discrimination because the limit does not change the core. Poststructuralism, in Williams' (2014) words, would argue that "The truth of a population is where it is changing. A nation is defined at its borders", that is, at the point of difference, because everything that happens at the borders of a country changes how the core is defined. It is the 'difference' of ethnic minorities that is most representative of where a nation is going in the future, and so the 'limit' is, for poststructuralism, the most meaningful way that a nation, or a text, or indeed a classification system can be defined.

In terms of vocabulary control, then, what defines how a text is classified should not be a biased and static system. Texts should be classified by emphasising how they are being, or could be, used in 'different' ways by different disciplines. A book like 'Words of Power', which would be classified under LCSH and DDC as Philosophy: Logic, would no longer be limited to such static systems. It could also be used by feminist scholars, or by those working in ethics, anthropology, sociology, psychology, gender studies, linguistics, and so on. Defining the book under strict controlled vocabularies denies access to 'exceptional' groups, which results in a lack of real innovation and creativity in academia. In this sense, there are lessons to be learned from the poststructuralist theorist Deleuze, who argues for the power of openness in creativity.

To reassert, then: poststructuralism denies the traditional approach to classification through controlled vocabularies and aims to positively disrupt traditional classification systems in order to achieve greater autonomy for texts and their users. The problem with the proposed solutions is that they are developed from the point of view that it would be too disruptive to completely change our classification systems. Here, disruption is seen negatively rather than positively. The seriousness of this ultimately limiting attitude, and of the reliance on outmoded classifications, can best be understood by applying the work of Jacques Derrida to the topic. Derrida's (1976) 'textual positivism' would not ask 'what is this book about?', but rather 'what does this book do?' This question radically changes the way we would categorise texts because it places an emphasis on multiplicity of use rather than the singularity of meaning as defined by 'specialists'. Derrida's approach distinctly adopts anthropocentrism and sees the classification system as onto-theological. Derrida's 'origin' is constantly being affected by a text's 'presence', that is, what a text is doing in the moment. It is this 'presence' that leads to both the future and the past, that essentially pulls the 'origin' into ever-evolving new contexts. A 'sign', then, for Derrida is nothing more than a 'trace' of that change, a trace that can be followed to a point of difference so long as we understand that once we reach the trace, we have also altered its origin, which has moved off beyond our grasp. It is this point of difference that allows for creativity to emerge in what Derrida defines as 'play'. In this sense, and applied to classification systems, the moment at which a reader reads 'Words of Power' in relation to psychoanalytical studies is the moment that changes the origin indefinitely, opening that text up to new contexts in the future and the past, and thereby changing how it should be defined in a classification system.


This essay will argue, then, that classification systems should take 'difference' as a key underlying principle in the way we organise information. To do otherwise is to consciously do 'violence' by excluding individuals or groups from our knowledge economy. As Derrida (1976, 140) writes, "There is no ethics without the presence of the other but also, and consequently, without absence, dissimulation, detour, differance, writing". This is because traditional classification systems, and the solutions outlined in this paper, try to regulate the past, present and future of information in order to make it more accessible. However, the adoption of a classical scientific approach only makes it possible to categorise texts if texts, if language, are seen as static. But Derrida (1976, 67) teaches us that "The concepts of present, past, and future, everything in the concepts of time and history that assumes their classical evidence – the general metaphysical concepts of time – cannot describe the structure of the trace adequately". This constitutes a denial that there is any 'final' past, present or future of a text. Derrida (1976, 69) would argue that what is really happening in controlled vocabularies is a violent and unethical act of control, born out of a fear and misunderstanding of 'death': "Spacing as writing is the becoming-absent and the becoming-unconscious of the subject. By the movement of its drift/derivation the emancipation of the sign constitutes in return the desire of the presence. That becoming – or that drift/derivation – does not befall the subject which would choose it or would passively let itself be drawn along by it. As the subject's relationship with its own death, this becoming is the constitution of subjectivity." Death in this sense is seen as continuity from one context to the next rather than a final end. In many ways, traditional classification systems are a kind of death sentence in the traditional sense, in that they render entire texts and disciplines static and irrelevant as new contexts emerge and are classified in inadequate ways. Perhaps the point is best represented by Williams (2014), who claims that "The demand for clarity is dangerous because clarity justifies violent judgements and exclusions on the basis of a promise of a world of understanding and togetherness."

What is happening in our classification systems, then, is that subject specialists attempt to create greater accessibility to information by categorising it under specific, specialist subject headings, in order to create a sense of clarity when one is searching for that information. Texts are gathered together around a principle of sameness and difference, in which things that are similar are classified under the same subject headings. The reality is that this system, based on controlled vocabularies, is extremely biased and fails to account for real difference, heterogeneity and multiplicity in our information world. There have been attempts to create new systems that, for example, are more appropriate for those interested in feminist studies, but these systems simply shift control from one universal group to another, smaller one. They do highlight a very important politics and injustice in the way texts are classified, but they do so by reverting to an equally biased system.

It is at the point of difference that real innovation and creativity occur. Our universities are set up as places in which creativity and independent thinking are supposed to lead to new innovations, while our public libraries are moving more and more towards providing creative spaces for communities to grow and develop. Yet the way in which we search for information is limited, contradictory and no longer fit for purpose. The argument that 'difference' should be prevalent in classification systems is not absurd or contradictory, but it would require a complete overhaul of the way in which we understand and categorise information. Perhaps a working model already exists in the way internet search engines like Google operate. Websites on Google are presented to us in a way that can promote difference as a classifying principle, because those websites that have the most links to other active sites are presented as more prevalent and relevant. What would happen if a similar approach were adopted by libraries? What if, when we search for information on a certain topic, we were presented with a list of texts organised around the number of connections those texts have to other texts and thus to other disciplines? This would perhaps provide a classification system that celebrates multiplicity, ranking texts according to the many possible ways they can be used and interpreted.

In the field of literature, I remember my PhD supervisor suggesting that I include some comparative work between John Banville and Gabriel Garcia Marquez in my thesis, because the postcolonial contexts of Ireland and Colombia bear many similarities. As a researcher, in all the hours spent searching for information on Banville and postcolonialism, I was never presented with any texts that implied real difference or multiplicity. Searches were singular and restrictive, never indicating that any different approach was possible. The classification system provided no new spark of creativity for young researchers to pursue. In fact, using Boolean logic, I found that the more search terms I entered in different searches, the more narrowly 'relevant' the results that were presented. If difference were to become an organising principle, then the information retrieved would prioritise multiplicity, and that would lead to greater inclusion and thus innovation.
The problem at the moment is that the way we search digital databases is dictated by the way information is arranged on the stacks. The stacks, or the numbers on the books, do not have to change; where the information is physically located in the library is almost irrelevant, because we rarely browse the stacks directly nowadays. What needs to change is the way information is classified in the digital databases. There is nothing stopping us from radically changing this digital system to one that is more inclusive of difference and that contains less 'violent' vocabularies.
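As a thought experiment only, here is a minimal sketch of what such a difference-oriented catalogue ranking might look like. The records, titles (other than 'Words of Power') and discipline links are invented for illustration; the idea, borrowed loosely from the link-counting logic attributed to search engines above, is simply to rank results by how many connections a text has to disciplines outside its 'home' heading, rather than by a single fixed classification.

```python
# A minimal, purely illustrative sketch of ranking catalogue records by the
# number of connections they have to texts in *other* disciplines, so that
# 'difference' (cross-disciplinary reach) drives the ordering of results.
# All records and links below are invented examples, not real catalogue data.

catalogue = {
    "Words of Power": ["philosophy", "feminist studies", "linguistics",
                       "psychology", "gender studies"],
    "A Logic Primer": ["philosophy", "philosophy", "mathematics"],
    "Symbolic Logic in Practice": ["philosophy"],
}

def difference_score(home_discipline: str, linked_disciplines: list[str]) -> int:
    """Count the links that reach outside the record's 'home' discipline."""
    return sum(1 for d in linked_disciplines if d != home_discipline)

def search(query_discipline: str = "philosophy"):
    # Rank results by cross-disciplinary reach rather than by a fixed heading.
    scored = {
        title: difference_score(query_discipline, links)
        for title, links in catalogue.items()
    }
    return sorted(scored.items(), key=lambda item: item[1], reverse=True)

for title, score in search():
    print(f"{title}: {score} cross-disciplinary connections")
```

Under this toy scoring, the text with the widest reach beyond its home discipline surfaces first, which is the opposite of what a fixed subject heading does.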

Bibliography

Beghtol, C. (1986). “Semantic validity: concepts of warrant in bibliographic classification systems”. Library Resources & Technical Services, Vol. 30 No. 2, pp. 109-25.

Borges, J.L. (1952). “The analytical language of John Wilkins”, Other Inquisitions 1937-1952. Souvenir Press, London, 1973.

Bowker, G.C. and Star, S.L. (1999). Sorting Things Out: Classification and Its Consequences. MIT Press, Cambridge, MA.

Derrida, Jacques (1976). Of Grammatology. Johns Hopkins University Press, Baltimore, MD.

Lakoff, G. (1987). Women, Fire, and Dangerous Things: What Categories Reveal about the Mind. University of Chicago Press, Chicago, IL.

Mai, Jens-Erik (2010). "Classification in a social world: bias and trust". Journal of Documentation, Vol. 66 No. 5, pp. 627-642.

Miksa, F. (1998). The DDC, the Universe of Knowledge, and the Post-Modern Library. Forest Press, Albany, NY.

Olson, Hope A. (2001). "Sameness and difference: a cultural foundation of classification". Library Resources and Technical Services, Vol. 45 No. 3, pp. 115-122.

Shirky, C. (2005). "Ontology is overrated: categories, links, and tags". Clay Shirky's Writings about the Internet. Economics & Culture, Media & Community, available at: http://www.shirky.com/writings/ontology_overrated.html (Date Accessed 28th April 2015)

Williams, James (2014). Understanding Poststructuralism. Routledge, London.

Part 1: The Normative Approach to Citation Indexing


This is the first part of three short critiques of citation indexing….

The first theoretical approach to citation indexing is the normative approach. Much of the discussion around this approach remains fragmented, however, because its protagonists maintain an outlook that assesses normative measures by analysing codes and processes 'within' the practice of citing. Cronin (1984, 2) explains that "Implicit in this is the assumption that authors' citing habits display conformity and consistency." This view was originally developed by Garfield (1963), who argues that citation indexes are quantitative and valuable if they adhere to scientific principles. The fact that this argument requires codified modes of behaviour demonstrates that the approach looks only at the processes of citing, rather than asking questions about the value of citing itself and the motivations that encourage or dictate authors to cite in the first place. After Kaplan (1965) argued for a sociological approach to citation, in which citations relate to other kinds of social data, Merton (1973) developed the normative approach to include four categories upon which this code can be identified and understood: Universalism; Organised Skepticism; Communism; and Disinterestedness. These four categories were then expanded by Mitroff (1974) to eleven.
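For readers unfamiliar with what a citation index actually computes, the following is a minimal sketch, not Garfield's actual system: given a corpus of papers and their reference lists, the quantitative core of citation indexing is simply an inverted count of how often each work is cited, which is then treated as a proxy for influence. The papers and reference lists below are invented for illustration.

```python
# A minimal sketch of the quantitative core of citation indexing: invert the
# reference lists of a corpus to count how often each work is cited.
# The papers and their reference lists below are invented for illustration only.

from collections import Counter

papers = {
    "Paper A": ["Merton 1973", "Kaplan 1965"],
    "Paper B": ["Merton 1973", "Garfield 1963", "Kaplan 1965"],
    "Paper C": ["Merton 1973"],
}

# The citation index: for each cited work, how many papers in the corpus cite it.
citation_counts = Counter(ref for refs in papers.values() for ref in refs)

for work, count in citation_counts.most_common():
    print(f"{work}: cited {count} times")
```

Everything that follows in the normative debate is an argument about what these counts mean and whose behaviour they reward, not about the counting itself.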

However, this method of assessing only an implicit code of reference within citation practices ultimately falls victim to hierarchy, in which a few elite or powerful authors become dominant players in influencing new research. Whitley (1969, 219) argues that "The formal communication system also forms the basis for the allocation of rewards: instrumental and consummatory. Thus it is a means of exercising social control . . . Publication of an article in an archival journal signifies a degree of recognition for the author, while legitimizing the object of research and methodology." Thus, the danger of any normative approach that relies on established rules or codes of practice to regulate citation is that it is prone to become part of a system of control in which influential academics benefit from a normative approach that acts as a kind of pyramid scheme. Cronin (1984, 12-13) seems to celebrate the concept that "Maverick ideas, or notions which are, scientifically speaking, revolutionary, are thus effectively debarred from the official record of science – the journal archive". Storer (1966) highlights that citations will continue to be used out of a principle of self-interest, in which scientists adhere to the norms because citations are necessary commodities in which colleagues share a mutual interest. This monetization of citations is confirmed by Hagstrom (1971), who goes on to argue that citations coincide with the value of grants, funding and university rewards. However, the fact that academics are engaging in a discourse that commonly accepts the commodification of ideas within an educational setting is ethically reprehensible. It also demonstrates a lack of real interest in exploring the core value of citation indexes, because the academics in question benefit from being cited. It can clearly be seen from the literature that there is an acceptance of the monetization of citations as part of normative practice. Yet the normative argument is highly fragmentary in that it fails to acknowledge that the citing norms are only compliant with an underlying monetized hierarchy. All the norms do is reinforce a homogenous and hierarchical academic system. The approach cannot claim to be truly normative because the norms are actually imposed.

Mike Sosteric, in his essay 'Endowing Mediocrity', takes a more holistic approach to the subject as he attempts to expose the narrative that underlies and informs the normative codes in citation analysis. In doing so he gives greater context to some of the problems with the normative approach outlined above. Sosteric (1999) examines the influence of capitalism and cybernetics on bibliometrics, asserting that citation indexing creates a homogenous narrative that reasserts hierarchy within education. Sosteric expands upon Teeple's (1995, 1) suggestion that the 1980s "signified the beginning of what has been called the triumph of capitalism", arguing that "as a result of the neoliberal push, universities are being colonized, both physically and intellectually, by capital, its representatives, and its ideologies." What can be seen here is that the normative trends that regulate citation indexing are monopolized by capitalist processes. Senior or established academics at the top of the hierarchy benefit directly from the establishment of normative modes of practice, because the more their work is cited, the greater the monetary and symbolic gain. Less established academics cannot become more visible unless they pay tribute, through normative citation practices, to the established scholars and universities who exert significant authority over the career trajectories of younger and emerging researchers. In this sense, normative practice within citation indexing is regulated under hegemonic control. And as Boor (1982) points out, it is highly susceptible to manipulation, especially now that it has come under the control of cybernetic processes, insofar as citation counts can be 'engineered' through unfair means to create inflated citation scores. Therefore, Nelson (1997, 39) may refer to citation indexing as "academia's version of applause", and Grafton (1997, 5) may insist that it is codified by "ideology and technical practices", but their assessments remain fragmentary. Once we assess the processes of citation from a more holistic perspective, we must question the very ideology that is creating such practices and consider more deeply the true value that they have.

References:

Cronin, Blaise (1984), The Citation Process: The Role and Significance of Citations in Scientific Communication, Taylor Graham

Garfield, E. (1963), Citation indexes in sociological research, American Documentation, 14(4), 289-291

Grafton, A. (1997), The Footnote: A Curious History, Cambridge, MA: Harvard University Press

Hagstrom, W.O. (1971), Inputs, outputs and the prestige of university science departments, Sociology of Education, 44(4), 375-397

Kaplan, N. (1965), The norms of citation behaviour: prolegomena to the footnote, American Documentation, 16(3), 179 – 184

Merton, R.K. (1973), The sociology of science: theoretical and empirical investigations, Chicago University Press

Mitroff, I.I. (1974), The subjective side of science: a philosophical inquiry into the psychology of the Apollo moon scientists, Amsterdam: Elsevier

Nelson, P. (1997), Superstars, Academe, 87(1), 38-54

Sosteric, M. (1999). Endowing mediocrity: Neoliberalism, information technology, and the decline of radical pedagogy. Radical Pedagogy. http://www.radicalpedagogy.org/radicalpedagogy.org/Endowing_Mediocrity__Neoliberalism,_Information_Technology,_and_the_Decline_of_Radical_Pedagogy.html

Storer, N.W. (1966), The social system of science, New York: Holt Rinehart & Winston

Teeple, Gary (1995), Globalization and the Decline of Social Reform, Toronto: Garamond Press

Whitley, R.D. (1969), Communication nets in science: status and citation patterns in animal