Information retrieval
Information retrieval is defined as "a branch of computer or library science relating to the storage, locating, searching, and selecting, upon demand, relevant data on a given subject."[1] As noted by Carl Sagan, "human beings have, in the most recent few tenths of a percent of our existence, invented not only extra-genetic but also extrasomatic knowledge: information stored outside our bodies, of which writing is the most notable example."[2] The benefits of enhancing personal knowledge with retrieval of extrasomatic knowledge have been shown in a controlled comparison with rote memory.[3]
Although information retrieval is usually thought of as being done by computers, retrieval can also be done by humans for other humans.[4] In addition, some Internet search engines, such as mahalo.com and http://www.chacha.com/, may have human supervision or editors.
Some Internet search engines, such as http://www.deeppeep.org and http://www.deepdyve.com/, attempt to index the Deep Web, which consists of web pages that are not normally accessible to the public.[5]
Classification by user purpose
Information retrieval can be divided into information discovery, information recovery, and information awareness.[6]
Information discovery
Information discovery is searching for information that the searcher has not seen before and the searcher does not know for sure that the information exists. Information discovery includes searching in order to answer a question at hand, or searching for a topic without a specific question in order to improve knowledge of a topic.
Information recovery
Information recovery is searching for information that the searcher has seen before and knows to exist.
Information awareness
Information awareness has also been described as "'systematic serendipity' - an organized process of information discovery of that which he [the searcher] did not know existed".[6] Examples that predate the Internet include reading print periodicals. With the Internet, new methods include email newsletters,[7] email alerts, and RSS feeds.[8]
Classification by indexing methods used
Document retrieval
- Boolean
- Vector space model (relevancy)
- Probabilistic (Bayes)
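The first two indexing methods above can be contrasted with a minimal sketch. The tiny corpus and query below are hypothetical, chosen only to show that Boolean retrieval returns an unranked set of documents containing every query term, while the vector space model ranks documents by similarity to the query:

```python
from collections import Counter
from math import sqrt

# Hypothetical toy corpus for illustration.
docs = {
    "d1": "information retrieval and storage",
    "d2": "retrieval of medical information",
    "d3": "library storage systems",
}

def tokens(text):
    return text.lower().split()

# Boolean retrieval: a document matches only if it contains every query term.
def boolean_search(query, docs):
    q = set(tokens(query))
    return [d for d, text in docs.items() if q <= set(tokens(text))]

# Vector space retrieval: rank documents by cosine similarity of term counts.
def cosine(a, b):
    num = sum(a[t] * b[t] for t in a)
    den = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def vector_search(query, docs):
    q = Counter(tokens(query))
    scored = [(cosine(q, Counter(tokens(t))), d) for d, t in docs.items()]
    return [d for s, d in sorted(scored, reverse=True) if s > 0]

print(boolean_search("information retrieval", docs))  # d1 and d2 contain both terms
print(vector_search("information retrieval", docs))   # d3 is excluded (no shared terms)
```

A probabilistic (Bayes) index would instead estimate, for each document, the probability of relevance given the query terms; the ranking principle is similar but the scoring function is derived from probability theory rather than vector geometry.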
Factors associated with unsuccessful retrieval
The field of medicine provides much research on the difficulties of information retrieval. Barriers to successful retrieval include:
- Lack of prior experience with the information retrieval system being used[9][3]
- Low visual spatial ability[9]
- Poor formulation of the question to be searched[10]
- Difficulty designing a search strategy when multiple resources are available[10]
- "Uncertainty about how to know when all the relevant evidence has been found so that the search can stop"[10]
- Difficulty synthesizing an answer across multiple documents[10]
Factors associated with successful retrieval
Characteristics of how the information is stored
For storage of text content, the quality of the index to the content is important. For example, the use of stemming, or truncating words by removing suffixes, may help.[11]
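A crude suffix-stripping sketch illustrates the idea behind stemming. The suffix list and minimum stem length here are illustrative only; Porter's actual algorithm uses a more careful set of rules and measures:

```python
# Illustrative suffix list; not Porter's actual rules.
SUFFIXES = ("ational", "ization", "ing", "ed", "es", "s")

def crude_stem(word):
    """Strip the first matching suffix, keeping a stem of at least 3 letters."""
    word = word.lower()
    for suffix in SUFFIXES:
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word

# Different surface forms collapse to a shared index term.
print(crude_stem("searching"))  # search
print(crude_stem("searched"))   # search
print(crude_stem("searches"))   # search
```

Indexing the stem rather than each surface form lets a query for "search" match documents containing "searching" or "searched".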
Display of information
A controlled study found structured information to be more effective than unstructured information.[12] In addition, the structure should be layered, with a summary of the content being the first layer that the reader sees.[13] This allows the reader to take only an overview, or to choose more detail. Some Internet search engines, such as http://www.kosmix.com/, try to organize search results beyond a one-dimensional list of results.
Regarding display of results from search engines, an interface designed to reduce anchoring and order bias may improve decision making.[14]
Characteristics of the search engine
John Battelle has described features of the perfect search engine of the future.[15] For example, the use of Boolean searching may be inefficient.[16]
Characteristics of the searcher
In healthcare, searchers are more likely to be successful if they answered the question correctly before searching, they have experience with the system they are searching, and they have a high spatial visualization score.[9] Also in healthcare, physicians with less experience are more likely to want more information.[17] Physicians who report stress when uncertain are more likely to search textbooks than source evidence.[18]
In healthcare, using expert searchers on behalf of physicians led to increased satisfaction by the physicians with the search results.[19]
Evaluation of the quality of information retrieval
Various methods exist to evaluate the quality of information retrieval.[20][21][22] Hersh[21] noted the classification of evaluation developed by Lancaster and Warner,[20] in which the first level of evaluation is:
- Costs/resources consumed in learning and using a system
- Time needed to use the system
- Quality of the results.
- Coverage. An estimate of coverage can be crudely automated.[23] However, more accurate judgment of relevance requires a human judge, which introduces subjectivity.[24]
- Precision and recall
- Novelty. This has been judged by independent reviewers.[25]
- Completeness and accuracy of results. An easy method of assessing this is to let the searcher make a subjective assessment.[26][27][28] A better method may be to use a panel of judges.[9]
Precision and recall
Recall is the fraction of relevant documents that are successfully retrieved. This is the same as sensitivity.
Precision is the fraction of retrieved documents that are relevant to the search. This is the same as positive predictive value.
F1 is the unweighted harmonic mean of the recall and precision.[22]
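The three measures above can be computed directly once the retrieved set and the relevant set are known. The document sets below are hypothetical:

```python
# Hypothetical evaluation data: which documents a search returned,
# and which documents a judge deemed relevant.
retrieved = {"d1", "d2", "d3", "d4"}
relevant = {"d2", "d4", "d5"}

true_positives = len(retrieved & relevant)

precision = true_positives / len(retrieved)  # fraction of retrieved that are relevant
recall = true_positives / len(relevant)      # fraction of relevant that are retrieved
f1 = 2 * precision * recall / (precision + recall)  # unweighted harmonic mean

print(precision)  # 0.5  (2 of 4 retrieved documents are relevant)
print(recall)     # ~0.667  (2 of 3 relevant documents were retrieved)
print(f1)         # ~0.571
```

Note that precision and recall trade off against each other: returning more documents tends to raise recall while lowering precision, which is why the single F1 summary is often reported.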
References
- ↑ National Library of Medicine. Information Storage and Retrieval. Retrieved on 2007-12-12.
- ↑ Sagan, Carl (1993). The Dragons of Eden: Speculations on the Evolution of Human Intelligence. New York: Ballantine Books. ISBN 0-345-34629-7.
- ↑ 3.0 3.1 de Bliek R, Friedman CP, Wildemuth BM, Martz JM, Twarog RG, File D (1994). "Information retrieved from a database and the augmentation of personal knowledge". J Am Med Inform Assoc 1 (4): 328–38. PMID 7719819. [e]
- ↑ Mulvaney SA, Bickman L, Giuse NB, Lambert EW, Sathe NA, Jerome RN (2008). "A randomized effectiveness trial of a clinical informatics consult service: impact on evidence-based decision-making and knowledge implementation". J Am Med Inform Assoc 15 (2): 203–11. DOI:10.1197/jamia.M2461.
- ↑ Wright A. (2009) Exploring a ‘Deep Web’ That Google Can’t Grasp. New York Times.
- ↑ 6.0 6.1 Garfield, E. “ISI Eases Scientists’ Information Problems: Provides Convenient Orderly Access to Literature,” Karger Gazette No. 13, pg. 2 (March 1966). Reprinted as “The Who and Why of ISI,” Current Contents No. 13, pages 5-6 (March 5, 1969), which was reprinted in Essays of an Information Scientist, Volume 1: ISI Press, pages 33-37 (1977). http://www.garfield.library.upenn.edu/essays/V1p033y1962-73.pdf
- ↑ Roland M. Grad et al., “Impact of Research-based Synopses Delivered as Daily email: A Prospective Observational Study,” J Am Med Inform Assoc (December 20, 2007), http://www.jamia.org/cgi/content/abstract/M2563v1 (accessed December 21, 2007).
- ↑ Koerner B (2008). Algorithms Are Terrific. But to Search Smarter, Find a Person. Wired Magazine. Retrieved on 2008-04-04.
- ↑ 9.0 9.1 9.2 9.3 Hersh WR, Crabtree MK, Hickam DH, et al (2002). "Factors associated with success in searching MEDLINE and applying evidence to answer clinical questions". J Am Med Inform Assoc 9 (3): 283–93. PMID 11971889. PMC 344588. [e]
- ↑ 10.0 10.1 10.2 10.3 Ely JW, Osheroff JA, Ebell MH, et al (March 2002). "Obstacles to answering doctors' questions about patient care with evidence: qualitative study". BMJ 324 (7339): 710. PMID 11909789. PMC 99056. [e]
- ↑ Porter MF. An algorithm for suffix stripping. Program. 1980;14:130–7.
- ↑ Beck AL, Bergman DA (September 1986). "Using structured medical information to improve students' problem-solving performance". J Med Educ 61 (9 Pt 1): 749–56. PMID 3528494. [e]
- ↑ Nielsen J (1996). Writing Inverted Pyramids in Cyberspace (Alertbox). Retrieved on 2007-12-12.
- ↑ Lau AY, Coiera EW (October 2008). "Can cognitive biases during consumer health information searches be reduced to improve decision making?". J Am Med Inform Assoc. DOI:10.1197/jamia.M2557. PMID 18952948. Research Blogging.
- ↑ John Battelle. The Search: How Google and Its Rivals Rewrote the Rules of Business and Transformed Our Culture. Portfolio Trade. ISBN 1-59184-141-0.
- ↑ Verhoeff, J (1961). "Inefficiency of the use of Boolean functions for information retrieval systems". Communications of the ACM 4: 557. DOI:10.1145/366853.366861
- ↑ Gruppen LD, Wolf FM, Van Voorhees C, Stross JK (1988). "The influence of general and case-related experience on primary care treatment decision making". Arch. Intern. Med. 148 (12): 2657–63. PMID 3196128. [e]
- ↑ McKibbon KA, Fridsma DB, Crowley RS (2007). "How primary care physicians' attitudes toward risk and uncertainty affect their use of electronic information resources". J Med Libr Assoc 95 (2): 138–46, e49–50. DOI:10.3163/1536-5050.95.2.138. PMID 17443246. Research Blogging.
- ↑ Shelagh A. Mulvaney et al., “A Randomized Effectiveness Trial of a Clinical Informatics Consult Service: Impact on Evidence Based Decision-Making and Knowledge Implementation,” J Am Med Inform Assoc (December 20, 2007), http://www.jamia.org/cgi/content/abstract/M2461v1 (accessed December 21, 2007).
- ↑ 20.0 20.1 Lancaster, Frederick Wilfrid; Warner, Amy J. (1993). Information retrieval today. Arlington, Va: Information Resources Press. ISBN 0-87815-064-1.
- ↑ 21.0 21.1 Hersh, William R. (2008). Information Retrieval: A Health and Biomedical Perspective (Health Informatics). Berlin: Springer. ISBN 0-387-78702-X. Google books
- ↑ 22.0 22.1 Trevor Strohman; Croft, Bruce; Donald Metzler (2009). Search Engines: Information Retrieval in Practice. Harlow: Addison Wesley. ISBN 0-13-607224-0.
- ↑ Fenton SH, Badgett RG (July 2007). "A comparison of primary care information content in UpToDate and the National Guideline Clearinghouse". J Med Libr Assoc 95 (3): 255–9. DOI:10.3163/1536-5050.95.3.255. PMID 17641755. PMC 1924927. Research Blogging.
- ↑ Hersh WR, Buckley C, Leone TJ, Hickam DH, OHSUMED: An interactive retrieval evaluation and new large test collection for research, Proceedings of the 17th Annual ACM SIGIR Conference, 1994, 192-201.
- ↑ Lucas BP, Evans AT, Reilly BM, et al (May 2004). "The impact of evidence on physicians' inpatient treatment decisions". J Gen Intern Med 19 (5 Pt 1): 402–9. DOI:10.1111/j.1525-1497.2004.30306.x. PMID 15109337. PMC 1492243. Research Blogging.
- ↑ Ely JW, Osheroff JA, Chambliss ML, Ebell MH, Rosenbaum ME (2005). "Answering physicians' clinical questions: obstacles and potential solutions". J Am Med Inform Assoc 12 (2): 217–24. DOI:10.1197/jamia.M1608. PMID 15561792. PMC 551553. Research Blogging.
- ↑ Gorman P (2001). "Information needs in primary care: a survey of rural and nonrural primary care physicians". Stud Health Technol Inform 84 (Pt 1): 338–42. PMID 11604759. [e]
- ↑ Alper BS, Stevermer JJ, White DS, Ewigman BG (November 2001). "Answering family physicians' clinical questions using electronic medical databases". J Fam Pract 50 (11): 960–5. PMID 11711012. [e]