Statement of Competency
Introduction
Information retrieval systems (IR systems) store, organize, and make information accessible to users with particular information needs. IR systems take many forms: catalogs, indexes, digital asset management systems, internet search engines, and databases. LIS professionals who work with IR systems should understand the concepts of design, querying, and evaluation so that all patrons and users can locate relevant information. The evidence supporting this competency draws on foundational IR systems knowledge, design evaluation, and advanced search strategies, demonstrating a broad understanding of this competency.
Design
An effective IR system connects a user with relevant information that addresses their information need(s). Judith Weedman (2019) describes design as a process that spans from ideation to final product or service. The design process begins with a requirements analysis: Who is the user? What type of information will the IR system contain? How will users try to find or access that information? How will they determine relevance or best fit? Answers to these questions can be gathered through surveys, questionnaires, focus groups, and/or user interviews (Weedman, 2018). Once the requirements analysis is complete, the designer develops a concept design that articulates the idea to stakeholders and users for feedback. Next comes the development of an interface (working or not) that translates the initial concept research into a general sketch of the IR system solution. This precedes the prototype, defined as an “incomplete, but usable model” (Weedman, 2019, p. 224). Prototyping is the stage in which subject representation (indexing terms and controlled vocabularies) begins to be built into the IR system. The prototype is then beta-tested with potential users and industry stakeholders to identify issues within the system. At this point the design process enters an iterative cycle, in which testing and changes repeatedly inform one another.
Query
Querying is the process of asking a system for information. Weedman stresses the importance of having the “search fit the system” (2018, p. 178), meaning that the system’s organization must inform the process by which you search for information. What fields are available? What type of information does the system contain? This knowledge helps users formulate both their query and their overall search strategy. Christopher Brown (2021) outlines seven main search strategies information professionals employ for targeted results: (1) Boolean logic; (2) controlled vocabularies; (3) field searching; (4) proximity searching; (5) truncation and wildcards; (6) limits; and (7) learning from results. Any combination of these strategies should help a searcher obtain relevant results from an IR system. Searching is an iterative process, and the most relevant results usually require applying multiple strategies in succession (Weedman, 2018).
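The first of these strategies, Boolean logic, can be sketched in a few lines of Python. This is a minimal illustration, not any particular database's implementation: the inverted index, its terms, and its document IDs are all invented for the example. AND, OR, and NOT correspond to the set operations intersection, union, and difference.

```python
# A toy inverted index: each term maps to the set of document IDs
# that contain it. Terms and IDs are invented for illustration.
index = {
    "cheese":  {1, 2, 4},
    "cooking": {2, 3, 4},
    "history": {1, 5},
}

def boolean_and(term_a, term_b):
    """AND: documents containing both terms (set intersection)."""
    return index.get(term_a, set()) & index.get(term_b, set())

def boolean_or(term_a, term_b):
    """OR: documents containing either term (set union)."""
    return index.get(term_a, set()) | index.get(term_b, set())

def boolean_not(term_a, term_b):
    """NOT: documents with the first term but not the second (set difference)."""
    return index.get(term_a, set()) - index.get(term_b, set())

print(sorted(boolean_and("cheese", "cooking")))  # [2, 4]
print(sorted(boolean_not("cheese", "history")))  # [2, 4]
```

Even this toy version shows why Boolean searching narrows or widens a result set so sharply: AND can only shrink the set, while OR can only grow it, which is why searchers combine these operators iteratively.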
Evaluate
Evaluation is critical to IR system design, and it occurs throughout the life cycle of the system: ideation, development, testing, searches, and results (Tucker, 2023; Weedman, 2018). In other words, evaluation does not happen solely at the end of the design process as a wrap-up exercise. So how can we evaluate an IR system? Manning et al. (2008) suggest assessing both system quality and user utility, an approach that measures both the technical and the nuanced aspects of the system. System quality evaluation, for example, involves measuring relevance through precision (the number of relevant items retrieved divided by the total number of items retrieved) and recall (the number of relevant items retrieved divided by the total number of relevant items), among other metrics (Manning et al., 2008, pp. 42-43). User utility involves measuring user satisfaction based on the relevant results retrieved for each user’s unique information need(s). Synthesizing both subjective and objective data makes a holistic evaluation nuanced, but evaluation is a necessary part of the design process and should not be ignored.
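The precision and recall definitions above reduce to simple arithmetic over two sets. The following sketch uses invented document IDs to show the calculation; only the formulas themselves come from Manning et al. (2008).

```python
# Precision and recall for a single invented search.
retrieved = {"doc1", "doc2", "doc3", "doc4"}  # what the system returned
relevant  = {"doc2", "doc4", "doc5"}          # what actually meets the need

# Relevant items that were retrieved (the overlap of the two sets).
relevant_retrieved = retrieved & relevant

precision = len(relevant_retrieved) / len(retrieved)  # 2 / 4 = 0.50
recall    = len(relevant_retrieved) / len(relevant)   # 2 / 3 ≈ 0.67

print(f"precision={precision:.2f}, recall={recall:.2f}")
```

The example also makes the tradeoff concrete: retrieving more documents can only raise recall, but it tends to lower precision by pulling in irrelevant items, which is exactly the tension between overly broad and overly narrow searches described below.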
Evidence
INFO 244: Dialog Database Evaluation
This evidence comes from INFO 244: Online Searching (Spring 2024). In this exercise, we evaluated the Dialog (Clarivate) database through guided searches using techniques such as thesaurus terms, controlled vocabularies, and combining individual search sets. These searches pushed me to think carefully about how results are generated and how factors like metadata, subject terms, and indexing make retrieval more precise or more frustrating.
Through this project, I demonstrated my ability to retrieve information from different systems and to use multiple search strategies effectively. Dialog is structured very differently from a typical web search engine, and learning to navigate its controlled vocabulary and Boolean logic helped me see how system organization directly affects the search process. I could see that the way a database is designed (the fields it uses, how records are indexed, and whether it makes use of a hierarchical thesaurus) shapes how efficiently a user can find information.
I also gained experience evaluating components of an information retrieval system, especially when it came to balancing precision and recall. Some searches brought back far too many irrelevant results, while other search results were too narrow. Experimenting with both extremes helped me understand how database design and search strategy interact. These exercises could be tedious, but they gave me a real appreciation for how much thought goes into designing and maintaining systems that support effective retrieval.
INFO 244: Building Block Search Technique Presentation
This assignment, also from INFO 244, asked us to visually explain a search concept. I created an infographic in Canva illustrating the building block search technique, showing how ideas can be broken down and recombined to create stronger searches. Developing this presentation helped me practice designing search strategies and thinking about how search components interact.
In building the infographic, I reflected on how different search tactics affect retrieval results. Turning the concept into a visual made me more aware of the design principles that support usability and comprehension, both in database interfaces and in how information is presented. This project shows that I can apply different search techniques and communicate them clearly. Translating a technical concept into something visual and approachable deepened my understanding of how searching works behind the scenes and how thoughtful design supports learning and retrieval.
INFO 202: Project 1 Alpha Prototype: Cheese
This group project from INFO 202 with Dr. Ghosh (Fall 2023) was one of my first introductions to database design and information system structure. My team created a prototype database around the theme of “cheese,” and I worked on developing metadata fields, controlled vocabularies, and indexing rules which are all essential components of an information retrieval system.
At first, our intended user group was “anyone who likes cheese.” After Dr. Ghosh’s feedback, we refined it to “home chefs who want to incorporate cheese into their cooking.” This change taught us to think critically about user needs and how system organization supports different search behaviors. We re-evaluated our structure to ensure that our metadata and controlled vocabulary terms reflected how users would actually look for information.
This project demonstrates my awareness of design principles and my ability to evaluate and refine a retrieval system based on user feedback. It showed me that even a simple database requires careful attention to structure, consistency, and vocabulary. Working through these challenges helped me see how design, usability, and retrieval are connected and how every design decision affects how people find and use information.
Conclusion
Competency E emphasizes the importance of understanding how information retrieval systems are designed, queried, and evaluated to meet diverse user needs. Through my coursework and projects, I have developed both a technical understanding of IR system design and a user-centered awareness of how search strategies affect the retrieval of relevant documents. As I make the transition from student to professional, I will continue to use IR systems for both my professional and personal needs. I have the expertise to query a system effectively, and through my understanding of IR system evaluation and design, I also have the ability to structure and organize data and information so that it can be found and used by others.
References
Brown, C. C. (2021). Librarian’s guide to online searching: Cultivating database skills for research and instruction (6th ed.). Libraries Unlimited.
Manning, C. D., Raghavan, P., & Schütze, H. (2008). Introduction to information retrieval. Cambridge University Press.
Tucker, V. M. (2023). Lecture 7: Evaluation. In V. M. Tucker (Ed.), Information retrieval system design: Principles & practice (6.2 ed., pp. 349-357). Academic Pub.
Weedman, J. (2018). Information retrieval: Designing, querying, and evaluating information systems. In K. Haycock & M.-J. Romaniuk (Eds.), The portable MLIS: Insights from the experts (2nd ed., pp. 171-186). Libraries Unlimited.
Weedman, J. (2019). Chapter 4 lecture: Design processes. In V. M. Tucker (Ed.), Information retrieval system design: Principles & practice (6.0 ed., pp. 220-232). Academic Pub.