Information retrieval

From Citizendium
Revision as of 07:23, 17 April 2009 by imported>Robert Badgett (→‎Evaluation of the quality of information retrieval)

Information retrieval is defined as "a branch of computer or library science relating to the storage, locating, searching, and selecting, upon demand, relevant data on a given subject."[1] As noted by Carl Sagan, "human beings have, in the most recent few tenths of a percent of our existence, invented not only extra-genetic but also extrasomatic knowledge: information stored outside our bodies, of which writing is the most notable example."[2] The benefits of enhancing personal knowledge with retrieval of extrasomatic knowledge have been shown in a controlled comparison with rote memory.[3]

Although information retrieval is usually thought of as a task performed by computers, retrieval can also be done by humans for other humans.[4] In addition, some Internet search engines, such as mahalo.com and http://www.chacha.com/, may use human supervision or editors.

Some Internet search engines, such as http://www.deeppeep.org and http://www.deepdyve.com/, attempt to index the Deep Web: web pages that are not normally public.[5]

The usefulness of a search engine has been proposed to be the relevance of its results multiplied by their validity, divided by the work required to obtain them: usefulness = (relevance × validity) / work.[6]

Classification by user purpose

Information retrieval can be divided into information discovery, information recovery, and information awareness.[7]

Information discovery

Information discovery is searching for information that the searcher has not seen before and does not know for certain exists. It includes searching to answer a question at hand, or searching a topic without a specific question in order to improve one's knowledge of it.

Information recovery

Information recovery is searching for information that the searcher has seen before and knows to exist.

Information awareness

Information awareness has also been described as "'systematic serendipity' - an organized process of information discovery of that which he [the searcher] did not know existed".[7] Examples of this prior to the Internet include reading print and online periodicals. With the Internet, new methods include email newsletters,[8] email alerts, and RSS feeds.[9]

Classification by indexing methods used

Document retrieval

  • Boolean
  • Vector space model (relevancy)
  • Probabilistic (Bayes)
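The vector space model above can be sketched as a toy TF-IDF ranker that scores documents by cosine similarity to the query. The corpus, tokenization, and weighting scheme below are illustrative assumptions, not a production implementation:

```python
import math
from collections import Counter

# Hypothetical toy corpus; any tokenized documents would do.
docs = [
    "information retrieval from a database",
    "boolean retrieval uses exact matching",
    "the vector space model ranks documents by relevance",
]

tokenized = [d.split() for d in docs]
N = len(tokenized)

# Inverse document frequency: terms appearing in fewer documents weigh more.
df = Counter(term for doc in tokenized for term in set(doc))
idf = {t: math.log(N / df[t]) for t in df}

def tfidf(tokens):
    """Term frequency times inverse document frequency for one document."""
    tf = Counter(tokens)
    return {t: tf[t] * idf.get(t, 0.0) for t in tf}

def cosine(a, b):
    """Cosine similarity between two sparse term-weight vectors."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def search(query):
    """Rank all documents by similarity to the query, best first."""
    q = tfidf(query.split())
    scores = [(cosine(q, tfidf(doc)), i) for i, doc in enumerate(tokenized)]
    return sorted(scores, reverse=True)

ranking = search("vector space relevance")  # best match: the third document
```

A Boolean engine, by contrast, would return only documents containing all (or any) query terms, with no ranking by relevance.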

Factors associated with unsuccessful retrieval

The field of medicine provides much research on the difficulties of information retrieval. Barriers to successful retrieval include:

  • Lack of prior experience with the information retrieval system being used[10][3]
  • Low visual spatial ability[10]
  • Poor formulation of the question to be searched[11]
  • Difficulty designing a search strategy when multiple resources are available[11]
  • "Uncertainty about how to know when all the relevant evidence has been found so that the search can stop"[11]
  • Difficulty synthesizing an answer across multiple documents[11]

Factors associated with successful retrieval

Characteristics of how the information is stored

For storage of text content, the quality of the index to the content is important. For example, the use of stemming, or truncating, words by removing suffixes may help.[12]
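A deliberately simplified suffix stripper illustrates the idea behind Porter-style stemming: word variants are conflated to a common index term. This is not the actual Porter algorithm (which applies ordered rule phases with measure conditions); the suffix list here is an arbitrary assumption:

```python
# Longest suffixes are tried first so "searching" strips "ing", not just "g".
SUFFIXES = ["ational", "ization", "fulness", "ousness", "iveness",
            "tional", "ement", "ments", "ment", "ness", "ions", "ing",
            "ion", "ers", "ies", "ed", "er", "es", "s"]

def stem(word):
    """Strip the first matching suffix, keeping a stem of at least 3 letters.
    A crude illustration only; the real Porter stemmer is more accurate."""
    word = word.lower()
    for suffix in SUFFIXES:
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word

# "searching", "searched", and "searches" all collapse to one index term
print(stem("searching"), stem("searched"), stem("searches"))
```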

Display of information

A controlled study found structured information to be more effective.[13] In addition, the structure should be layered, with a summary of the content as the first layer the reader sees.[14] This allows the reader to take only an overview, or choose more detail. Some Internet search engines, such as http://www.kosmix.com/, try to organize search results beyond a one-dimensional list.

Regarding display of results from search engines, an interface designed to reduce anchoring and order bias may improve decision making.[15]

Characteristics of the search engine

John Battelle has described features of the perfect search engine of the future.[16] For example, reliance on Boolean searching has been argued to be inefficient.[17]

Characteristics of the searcher

In healthcare, searchers are more likely to be successful if their answer before searching is correct, if they have experience with the system they are searching, and if they have a high spatial visualization score.[10] Also in healthcare, physicians with less experience are more likely to want more information.[18] Physicians who report stress when uncertain are more likely to search textbooks than source evidence.[19]

In healthcare, using expert searchers on behalf of physicians led to increased satisfaction by the physicians with the search results.[20]

Impact of information retrieval

The benefits of enhancing personal knowledge with retrieval of extrasomatic knowledge have been shown in a controlled comparison with rote memory.[3]

Various before-and-after comparisons are summarized below.

Impact of medical searching by physicians and medical students[21][22][23][10]

  • Quick Clinical (federated search)[22][23]: 73 practicing doctors and clinical nurse consultants, eight clinical questions; 37% of answers correct before searching, 50% correct after searching, and 7% of answers moved from correct to incorrect.
  • User's own choice of search engine[21]: 23 primary care physicians, 23 clinical questions from Hersh[10]; 39% correct before searching, 42% correct after searching, and 11% moved from correct to incorrect.
  • OVID[10]: 45 senior medical students (data also available for nursing students), 23 clinical questions from Hersh[10]; 32% correct before searching, 52% correct after searching, and 13% moved from correct to incorrect.

Frequency that searching changed medical care[24][25][26]

  • Crowley[24]: 625 self-initiated searches; useful information found in 83% of searches, care changed in 39%.
  • Rochester study[25]: care changed in 80% of searches.
  • Chicago study[26]: care changed in 74% of searches.

Evaluation of the quality of information retrieval

Various methods exist to evaluate the quality of information retrieval.[27][28][29] Hersh[28] noted the classification of evaluation developed by Lancaster and Warner,[27] in which the first level of evaluation is:

  • Costs/resources consumed in learning and using a system
  • Time needed to use the system[30]
  • Quality of the results.
    • Coverage. An estimate of coverage can be crudely automated.[31] However, more accurate judgment of relevance requires a human judge, which introduces subjectivity.[32]
    • Precision and recall
    • Novelty. This has been judged by independent reviewers.[33]
    • Completeness and accuracy of results. An easy method of assessing this is to let the searcher make a subjective assessment.[24][34][35][36] Other methods may be to use a bank of questions with known target documents[37] or known answers[10][21].
  • Usage
    • Self-reported
    • Measured[30]

Number needed to read

The number needed to read (NNR) is "how many papers in a journal have to be read to find one of adequate clinical quality and relevance."[38][39][40][41] Of note, the NNR has been proposed as a metric to help libraries decide which journals to subscribe to.[38]
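Under this definition, the NNR is the reciprocal of the proportion of papers that are relevant and of adequate quality. A minimal sketch, with hypothetical counts:

```python
def number_needed_to_read(relevant_and_valid, total_read):
    """NNR: reciprocal of the proportion of papers read that proved
    relevant and of adequate quality; higher NNR means more reading
    effort per useful paper."""
    if relevant_and_valid == 0:
        return float("inf")
    return total_read / relevant_and_valid

# Hypothetical journal: 400 articles scanned, 8 judged clinically useful
nnr = number_needed_to_read(8, 400)  # -> 50.0 papers per useful article
```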

Hit curve

A hit curve is the number of relevant documents retrieved among the first n results.[42][43]
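Given binary relevance judgments for a ranked result list (the flags below are hypothetical; real judgments would come from human reviewers), the hit curve is just a running total:

```python
from itertools import accumulate

def hit_curve(relevance_flags):
    """Cumulative count of relevant documents among the first n results.
    relevance_flags[i] is True if the (i+1)-th retrieved document is
    relevant."""
    return list(accumulate(int(flag) for flag in relevance_flags))

# Hypothetical judged ranking: relevant, relevant, not, relevant, not
curve = hit_curve([True, True, False, True, False])  # -> [1, 2, 2, 3, 3]
```

A steeper early rise means the algorithm places relevant documents nearer the top, which is how hit curves are used to compare search algorithms.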

Decision velocity

Time needed to answer a question can be compared between two systems with a Kaplan-Meier survival analysis.[23] In addition, if the correct answer to the search question is known, a survival analysis can compare the time to obtain a correct answer. The result is an S-curve (also called a sigmoid or logistic growth curve) in which most questions are answered after an initial delay; however, a minority of questions take much longer.
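A minimal sketch of a Kaplan-Meier estimator applied to time-to-answer data, treating questions that were never answered correctly as censored observations. The function and data are illustrative assumptions, not the exact method of the cited study:

```python
def kaplan_meier(times, answered):
    """Kaplan-Meier estimate of the fraction of questions still
    unanswered over time. times[i] is seconds until a correct answer
    (or until the searcher gave up); answered[i] is False for censored
    questions that were never answered correctly. Simplified: assumes
    no tied event times."""
    events = sorted(zip(times, answered))
    n_at_risk = len(events)
    survival, s = [], 1.0
    for t, was_answered in events:
        if was_answered:                      # an "event": correct answer found
            s *= (n_at_risk - 1) / n_at_risk
        survival.append((t, s))               # censored times leave s unchanged
        n_at_risk -= 1
    return survival

# Hypothetical times (seconds); the second question was never answered
curve = kaplan_meier([45, 120, 30, 300], [True, False, True, True])
```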

Precision and recall

Recall is the fraction of relevant documents that are successfully retrieved. This is the same as sensitivity.

Precision is the fraction of retrieved documents that are relevant to the search. This is the same as positive predictive value.

F1 is the unweighted harmonic mean of recall and precision.[29]
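All three measures can be computed directly from the sets of retrieved and truly relevant documents; the document identifiers below are hypothetical:

```python
def precision_recall_f1(retrieved, relevant):
    """Precision (fraction of retrieved documents that are relevant),
    recall (fraction of relevant documents that were retrieved), and
    F1, their unweighted harmonic mean."""
    retrieved, relevant = set(retrieved), set(relevant)
    hits = len(retrieved & relevant)
    precision = hits / len(retrieved) if retrieved else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# 4 documents retrieved, 3 of them relevant, out of 6 relevant overall
p, r, f = precision_recall_f1({1, 2, 3, 4}, {2, 3, 4, 5, 6, 7})
# p = 0.75, r = 0.5, f1 = 0.6
```

The harmonic mean punishes imbalance: a system with perfect recall but near-zero precision still scores a near-zero F1.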

References

  1. National Library of Medicine. Information Storage and Retrieval. Retrieved on 2007-12-12.
  2. Sagan, Carl (1993). The Dragons of Eden: Speculations on the Evolution of Human Intelligence. New York: Ballantine Books. ISBN 0-345-34629-7. 
  3. 3.0 3.1 3.2 de Bliek R, Friedman CP, Wildemuth BM, Martz JM, Twarog RG, File D (1994). "Information retrieved from a database and the augmentation of personal knowledge". J Am Med Inform Assoc 1 (4): 328–38. PMID 7719819[e]
  4. Mulvaney, S. A., Bickman, L., Giuse, N. B., Lambert, E. W., Sathe, N. A., & Jerome, R. N. (2008). A randomized effectiveness trial of a clinical informatics consult service: impact on evidence-based decision-making and knowledge implementation, J Am Med Inform Assoc, 15(2), 203-211. doi: 10.1197/jamia.M2461.
  5. Wright A. (2009) Exploring a ‘Deep Web’ That Google Can’t Grasp. New York Times.
  6. Shaughnessy AF, Slawson DC, Bennett JH (November 1994). "Becoming an information master: a guidebook to the medical information jungle". J Fam Pract 39 (5): 489–99. PMID 7964548[e]
  7. 7.0 7.1 Garfield, E. “ISI Eases Scientists’ Information Problems: Provides Convenient Orderly Access to Literature,” Karger Gazette No. 13, pg. 2 (March 1966). Reprinted as “The Who and Why of ISI,” Current Contents No. 13, pages 5-6 (March 5, 1969), which was reprinted in Essays of an Information Scientist, Volume 1: ISI Press, pages 33-37 (1977). http://www.garfield.library.upenn.edu/essays/V1p033y1962-73.pdf
  8. Roland M. Grad et al., “Impact of Research-based Synopses Delivered as Daily email: A Prospective Observational Study,” J Am Med Inform Assoc (December 20, 2007), http://www.jamia.org/cgi/content/abstract/M2563v1 (accessed December 21, 2007).
  9. Koerner B (2008). Algorithms Are Terrific. But to Search Smarter, Find a Person. Wired Magazine. Retrieved on 2008-04-04.
  10. 10.0 10.1 10.2 10.3 10.4 10.5 10.6 10.7 Hersh WR, Crabtree MK, Hickam DH, et al (2002). "Factors associated with success in searching MEDLINE and applying evidence to answer clinical questions". J Am Med Inform Assoc 9 (3): 283–93. PMID 11971889. PMC 344588[e]
  11. 11.0 11.1 11.2 11.3 Ely JW, Osheroff JA, Ebell MH, et al (March 2002). "Obstacles to answering doctors' questions about patient care with evidence: qualitative study". BMJ 324 (7339): 710. PMID 11909789. PMC 99056[e]
  12. Porter MF. An algorithm for suffix stripping. Program. 1980;14:130–7.
  13. Beck AL, Bergman DA (September 1986). "Using structured medical information to improve students' problem-solving performance". J Med Educ 61 (9 Pt 1): 749–56. PMID 3528494[e]
  14. Nielsen J (1996). Writing Inverted Pyramids in Cyberspace (Alertbox). Retrieved on 2007-12-12.
  15. Lau AY, Coiera EW (October 2008). "Can cognitive biases during consumer health information searches be reduced to improve decision making?". J Am Med Inform Assoc. DOI:10.1197/jamia.M2557. PMID 18952948. Research Blogging.
  16. John Battelle. The Search: How Google and Its Rivals Rewrote the Rules of Business and Transformed Our Culture. Portfolio Trade. ISBN 1-59184-141-0. 
  17. Verhoeff, J (1961). "Inefficiency of the use of Boolean functions for information retrieval systems". Communications of the ACM 4: 557. DOI:10.1145/366853.366861
  18. Gruppen LD, Wolf FM, Van Voorhees C, Stross JK (1988). "The influence of general and case-related experience on primary care treatment decision making". Arch. Intern. Med. 148 (12): 2657–63. PMID 3196128[e]
  19. McKibbon KA, Fridsma DB, Crowley RS (2007). "How primary care physicians' attitudes toward risk and uncertainty affect their use of electronic information resources". J Med Libr Assoc 95 (2): 138–46, e49–50. DOI:10.3163/1536-5050.95.2.138. PMID 17443246. Research Blogging.
  20. Shelagh A. Mulvaney et al., “A Randomized Effectiveness Trial of a Clinical Informatics Consult Service: Impact on Evidence Based Decision-Making and Knowledge Implementation,” J Am Med Inform Assoc (December 20, 2007), http://www.jamia.org/cgi/content/abstract/M2461v1 (accessed December 21, 2007).
  21. 21.0 21.1 21.2 McKibbon KA, Fridsma DB (2006). "Effectiveness of clinician-selected electronic information resources for answering primary care physicians' information needs". J Am Med Inform Assoc 13 (6): 653–9. DOI:10.1197/jamia.M2087. PMID 16929042. PMC 1656967. Research Blogging.
  22. 22.0 22.1 Westbrook JI, Gosling AS, Coiera EW (2005). "The impact of an online evidence system on confidence in decision making in a controlled setting". Med Decis Making 25 (2): 178–85. DOI:10.1177/0272989X05275155. PMID 15800302. Research Blogging.
  23. 23.0 23.1 23.2 Coiera E, Westbrook JI, Rogers K (2008). "Clinical Decision Velocity is Increased when Meta-search Filters Enhance an Evidence Retrieval System". J Am Med Inform Assoc 15 (5): 638–46. DOI:10.1197/jamia.M2765. PMID 18579828. PMC 2528038. Research Blogging.
  24. 24.0 24.1 24.2 Crowley SD, Owens TA, Schardt CM, et al (March 2003). "A Web-based compendium of clinical questions and medical evidence to educate internal medicine residents". Acad Med 78 (3): 270–4. PMID 12634206[e]
  25. 25.0 25.1 Marshall JG (April 1992). "The impact of the hospital library on clinical decision making: the Rochester study". Bull Med Libr Assoc 80 (2): 169–78. PMID 1600426. PMC 225641[e]
  26. King DN (October 1987). "The contribution of hospital library information services to clinical care: a study in eight hospitals". Bull Med Libr Assoc 75 (4): 291–301. PMID 3450340. PMC 227744[e]
  27. 27.0 27.1 Lancaster, Frederick Wilfrid; Warner, Amy J. (1993). Information retrieval today. Arlington, Va: Information Resources Press. ISBN 0-87815-064-1. 
  28. 28.0 28.1 Hersh, William R. (2008). Information Retrieval: A Health and Biomedical Perspective (Health Informatics). Berlin: Springer. ISBN 0-387-78702-X.
  29. 29.0 29.1 Trevor Strohman; Croft, Bruce; Donald Metzler (2009). Search Engines: Information Retrieval in Practice. Harlow: Addison Wesley. ISBN 0-13-607224-0. 
  30. 30.0 30.1 Cabell CH, Schardt C, Sanders L, Corey GR, Keitz SA (December 2001). "Resident utilization of information technology". J Gen Intern Med 16 (12): 838–44. PMID 11903763. PMC 1495306[e]
  31. Fenton SH, Badgett RG (July 2007). "A comparison of primary care information content in UpToDate and the National Guideline Clearinghouse". J Med Libr Assoc 95 (3): 255–9. DOI:10.3163/1536-5050.95.3.255. PMID 17641755. PMC 1924927. Research Blogging.
  32. Hersh WR, Buckley C, Leone TJ, Hickam DH, OHSUMED: An interactive retrieval evaluation and new large test collection for research, Proceedings of the 17th Annual ACM SIGIR Conference, 1994, 192-201.
  33. Lucas BP, Evans AT, Reilly BM, et al (May 2004). "The impact of evidence on physicians' inpatient treatment decisions". J Gen Intern Med 19 (5 Pt 1): 402–9. DOI:10.1111/j.1525-1497.2004.30306.x. PMID 15109337. PMC 1492243. Research Blogging.
  34. Ely JW, Osheroff JA, Chambliss ML, Ebell MH, Rosenbaum ME (2005). "Answering physicians' clinical questions: obstacles and potential solutions". J Am Med Inform Assoc 12 (2): 217–24. DOI:10.1197/jamia.M1608. PMID 15561792. PMC 551553. Research Blogging.
  35. Gorman P (2001). "Information needs in primary care: a survey of rural and nonrural primary care physicians". Stud Health Technol Inform 84 (Pt 1): 338–42. PMID 11604759[e]
  36. Alper BS, Stevermer JJ, White DS, Ewigman BG (November 2001). "Answering family physicians' clinical questions using electronic medical databases". J Fam Pract 50 (11): 960–5. PMID 11711012[e]
  37. Haynes RB, McKibbon KA, Walker CJ, Ryan N, Fitzgerald D, Ramsden MF (January 1990). "Online access to MEDLINE in clinical settings. A study of use and usefulness". Ann. Intern. Med. 112 (1): 78–84. PMID 2403476[e]
  38. 38.0 38.1 Toth B, Gray JA, Brice A (2005). "The number needed to read-a new measure of journal value". Health Info Libr J 22 (2): 81–2. DOI:10.1111/j.1471-1842.2005.00568.x. PMID 15910578. Research Blogging.
  39. McKibbon KA, Wilczynski NL, Haynes RB (2004). "What do evidence-based secondary journals tell us about the publication of clinically important articles in primary healthcare journals?". BMC Med 2: 33. DOI:10.1186/1741-7015-2-33. PMID 15350200. Research Blogging.
  40. Bachmann LM, Coray R, Estermann P, Ter Riet G (2002). "Identifying diagnostic studies in MEDLINE: reducing the number needed to read". J Am Med Inform Assoc 9 (6): 653–8. PMID 12386115[e]
  41. Haase A, Follmann M, Skipka G, Kirchner H (2007). "Developing search strategies for clinical practice guidelines in SUMSearch and Google Scholar and assessing their retrieval performance". BMC Med Res Methodol 7: 28. DOI:10.1186/1471-2288-7-28. PMID 17603909. Research Blogging.
  42. Herskovic JR, Iyengar MS, Bernstam EV (2007). "Using hit curves to compare search algorithm performance". J Biomed Inform 40 (2): 93–9. DOI:10.1016/j.jbi.2005.12.007. PMID 16469545. Research Blogging.
  43. Bernstam EV, Herskovic JR, Aphinyanaphongs Y, Aliferis CF, Sriram MG, Hersh WR (2006). "Using citation data to improve retrieval from MEDLINE". J Am Med Inform Assoc 13 (1): 96–105. DOI:10.1197/jamia.M1909. PMID 16221938. Research Blogging.