
Assessment of Measures for Information Retrieval System Evaluation: A User-centered Approach

International Journal of Computer Applications
© 2011 by IJCA Journal
Volume 25, Number 7 - Article 7
Year of Publication: 2011
Authors:
Bernard Ijesunor Akhigbe
Babajide Samuel Afolabi
Emmanuel Rotimi Adagunodo
DOI: 10.5120/3046-4138

Bernard Ijesunor Akhigbe, Babajide Samuel Afolabi and Emmanuel Rotimi Adagunodo. Assessment of Measures for Information Retrieval System Evaluation: A User-centered Approach. International Journal of Computer Applications 25(7):6-12, July 2011.

@article{akhigbe2011assessment,
	author = {Bernard Ijesunor Akhigbe and Babajide Samuel Afolabi and Emmanuel Rotimi Adagunodo},
	title = {Assessment of Measures for Information Retrieval System Evaluation: A User-centered Approach},
	journal = {International Journal of Computer Applications},
	year = {2011},
	volume = {25},
	number = {7},
	pages = {6-12},
	month = {July}
}

Abstract

The ever-increasing global need for information is a primary reason information retrieval (IR) systems are used so heavily every day. There is therefore a need to evaluate these systems from a more holistic perspective, one that covers both the system and the user. At present, system-centered measures are not usable within a user-centered approach. This paper therefore attempts to determine, and to suggest, measures and methods that meet this need. The factor analytic technique was applied for this purpose, and the structural equation modeling technique was used to estimate the resultant model. The results show high statistical significance, so the statistics presented can inspire further work on evaluating IR systems from the user's perspective.
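As a minimal illustration of the factor-analytic step the abstract describes, the sketch below simulates questionnaire-style responses driven by two latent factors and applies the Kaiser criterion (retain factors whose correlation-matrix eigenvalues exceed 1) to decide how many factors to extract. The data, item count, and factor structure are invented for illustration; they are not the paper's actual instrument or results.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate 200 respondents answering 6 Likert-style items driven by
# two latent factors (the two-factor structure is an assumption made
# for this example, not taken from the paper).
latent = rng.normal(size=(200, 2))
loadings = np.array([[0.9, 0.0], [0.8, 0.1], [0.7, 0.0],
                     [0.0, 0.9], [0.1, 0.8], [0.0, 0.7]])
X = latent @ loadings.T + 0.3 * rng.normal(size=(200, 6))

# Inter-item correlation matrix and its eigenvalues (descending).
R = np.corrcoef(X, rowvar=False)
eigvals = np.linalg.eigvalsh(R)[::-1]

# Kaiser criterion: retain factors with eigenvalue > 1.
n_factors = int(np.sum(eigvals > 1.0))
print(n_factors)  # expect 2 for this two-factor simulation
```

In a full analysis the retained factors would then be rotated and interpreted, and the resulting measurement model estimated with structural equation modeling, as the paper does.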
