Paper title:

Evaluating Web-based Technologies: The Paradigm of User-centricity

DOI: https://doi.org/10.4316/JACSM.201602005
Published in: Issue 2, (Vol. 10) / 2016
Publishing date: 2016-10-20
Pages: 32-39
Author(s): AKHIGBE Bernard Ijesunor, ADERIBIGBE Stephen Ojo, AFOLABI Babajide Samuel
Abstract. Web Search Engines (WeSEs) are information systems with large-scale distributed capabilities; they are dynamic and in constant flux, and therefore need to be reviewed continually. Replicable methods and metrics are needed, and their outcomes will benefit design teams and content experts. Such results could form the basis for policies and strategies that, when implemented, translate into user requirements for better interactive user experiences. This paper investigates the WeSE and proposes evaluative metrics. It relies on the guidance of the Web Analytic Framework (WAF), drawing synergistic support from a subjective evaluative modelling technique and a reflective measurement approach. The WAF was used to interpret the conceptualization of metrics as the process of assigning measures, that is values, to a phenomenon. The results obtained are replicable and proved effective when applied to the assessment of distributed information systems such as the WeSE. The proposed metrics should next be applied to other distributed information systems in a perception-oriented context in order to ascertain their generalizability and replicability.
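The abstract frames metric development as assigning values to a perceived phenomenon, with factor analysis listed among the keywords. As a minimal, hypothetical sketch only, and not the authors' actual procedure, the Python snippet below shows how Likert-scale perception items about a Web Search Engine might be reduced to latent user-centric metrics with exploratory factor analysis and checked for internal consistency; the item names, factor count, and simulated data are assumptions made purely for illustration.

# Illustrative sketch only (not taken from the paper): reducing hypothetical
# WeSE perception survey items to latent user-centric metrics and checking
# the internal consistency of a candidate scale.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(42)

# Hypothetical 5-point Likert responses: 200 users x 8 perception items
items = ["speed", "relevance", "coverage", "layout",
         "readability", "trust", "novelty", "ease_of_use"]
responses = rng.integers(1, 6, size=(200, len(items))).astype(float)

# Extract two latent factors (assumed here, e.g. system vs. content quality)
fa = FactorAnalysis(n_components=2, random_state=0)
scores = fa.fit_transform(responses)      # per-user factor scores
loadings = fa.components_.T               # item-by-factor loadings

for item, load in zip(items, loadings):
    print(f"{item:12s} loadings: {np.round(load, 2)}")

def cronbach_alpha(block: np.ndarray) -> float:
    """Internal-consistency (reliability) estimate for a block of items."""
    k = block.shape[1]
    item_var = block.var(axis=0, ddof=1).sum()
    total_var = block.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Reliability of the first four items as one candidate metric scale
print("alpha:", round(cronbach_alpha(responses[:, :4]), 2))

A real study would use collected questionnaire data rather than simulated responses, and would judge loadings and alpha against conventional thresholds before naming the resulting metrics.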
Keywords: Web Search Engines, User-centric Metrics, Web Analytic Framework, Factor Analysis
References:

1. Akhigbe B I, Afolabi B S, and Adagunodo E R. 2012. Item’s Characterization for Model’s Development in Information Retrieval System Evaluation. In Proceedings of Information Systems, Technology and Management (Grenoble, France), Springer, pp. 101-111.

2. Akhigbe B I, Afolabi B S, and Adagunodo E R. 2014. A Baseline Model for Relating Users’ Requirements of Web Search Engines. Advances in Knowledge Organization 14, pp. 374 - 381.

3. Akhigbe B I, Afolabi B S, and Adagunodo E R. 2015. Modelling User-centred Attributes: The Web Search Engine as a Case. Knowledge Organization 42(1), pp. 25 – 39.

4. Akhigbe B I. 2012. Development of a User-Centered Evaluative Model for IR Systems: An Empirical Perspective. Saarbrücken, Germany: Lambert Academic Publishing.

5. Aladwani A M and Palvia P C. 2002. Developing and Validating an Instrument for Measuring User-perceived Web Quality. Information and Management 39, pp. 467–476.

7. Ataloglou M and Economides A. 2009. Evaluating European Ministries' Websites. International Journal of Public Information Systems 5(3), pp. 147-177.

8. Ballard P. 2014. 5 Web Technologies You Can’t Afford To Ignore. Retrieved from http://www.informit.com/articles/article.aspx?p=2184061 on 30/11/2015.

9. Bertini M, Del Bimbo A, Ferracani A, Landucci L, and Pezzatini D. 2013. Interactive Multi-user Video Retrieval Systems. Multimedia Tools and Applications 62(1), pp. 111-137.

10. Bilal D and Boehm M. 2013. Towards New Methodologies for Assessing Relevance of Information Retrieval from Web Search Engines on Children’s Queries. Qualitative and Quantitative Methods in Libraries 1, pp. 93 – 100.

11. Bollen D, Knijnenburg B P, Willemsen M C, and Graus M. 2010. Understanding Choice Overload in Recommender Systems. In Proc. of the 4th ACM Conference on Recommender Systems, RecSys'10, New York, NY, USA, pp. 63 - 70.

12. Bughin J, Corb L, Manyika J, Nottebohm O, Chui M, Borja de M. B., and Said R. 2011. The Impact of Internet Technologies: Search. Retrieved from http://www.mckinsey.com/~/media/mckinsey/dotcom/client_service/high%20tech/pdfs/impact_of_internet_technologies_search_final2.ashx on 17/11/2015.

13. Carterette B, Kanoulas E, and Yilmaz E. 2012a. Advances on the Development of Evaluation Measures. In Proceedings of the 35th international ACM SIGIR conference on Research and development in information retrieval, pp. 1200-1201.

14. Carterette B, Kanoulas E, and Yilmaz E. 2012b. Evaluating Web Retrieval Effectiveness. In D. Lewandowski (Ed.), Web Search Engine Research. Bingley: Emerald, pp. 105-137.

15. Carvalho C, Conboy J, Santos J, Fonseca J, Tavares D, Martins D,..., and Gama A P. 2015. An Integrated Measure of Student Perceptions of Feedback, Engagement and School Identification. Procedia-Social and Behavioural Sciences 174, pp. 2335-2342.

16. Dunlop M D. 2000. Reflections on Mira: Interactive Evaluation in Information Retrieval. Journal of the American Society for Information Science 51(14), pp. 1269-1274.

17. Edwards J R and Bagozzi R P. 2000. On the Nature and Direction of Relationships between Constructs and Measures. Psychological Methods 5(2), pp. 155-174.

18. Fagan J C. 2014. The Suitability of Web Analytics Key Performance Indicators in the Academic Library Environment. Journal of Academic Librarianship 40, pp. 25–34.

19. Fornell C and Larcker D F. 1981. Evaluating Structural Equation Models with Unobservable Variables and Measurement Error. Journal of Marketing Research 18, pp. 39–50.

20. Hair J F, Black W C, Babin B J, and Anderson R E. 2010. Multivariate Data Analysis (7th ed.). New Jersey: Prentice Hall.

21. Hargittai E and Hsieh Y P. 2012. Succinct Survey Measures of Web-use Skills. Social Science Computer Review 30(1), pp. 95-107.

22. Henson R K and Roberts J K. 2006. Use of Exploratory Factor Analysis in Published Research: Common Errors and Some Comment. Educational and Psychological Measurement 66 (3), pp. 393-416.

23. Jansen B J and Rieh S Y. 2010. The Seventeen Theoretical Constructs of Information Searching and Information Retrieval. Journal of the American Society for Information Science and Technology 61(8), pp. 1517–1534.

24. Jansen B J and Spink A. 2006. How are we Searching the World Wide Web? A Comparison of Nine Search Engine Transaction Logs. Information Processing and Management 42(1), pp. 248-263.

25. Jerome S. 2015. What is Web-based Technology? Retrieved from http://digitaldealer.com/sales-marketing/digital-dealer/what-is-web-based-technology on 18/11/2015.

26. Lamm K, Mandl T, Womser-Hacker C, and Greve W. 2010. User Experiments with Search Services: Methodological Challenges for Measuring the Perceived Quality. In Proc. of 3rd ISCA/DEGA Tutorial and Research Workshop on Perceptual Quality of Systems, pp. 64 – 69.

27. Lewandowski D and Hochstotter N. 2008. Web searching: A quality measurement perspective, in: A. Spink, M. Zimmer (Eds.), Web Search: Multidisciplinary Perspectives, Springer, Berlin, Heidelberg, pp. 309–340.

28. McNee S M, Riedl J, and Konstan J A. 2006. Being Accurate is not enough: How Accuracy Metrics have Hurt Recommender Systems. In Proceedings of the CHI '06 Extended Abstracts on Human Factors in Computing Systems, CHI EA '06, ACM (N.Y., USA), pp. 1097-1101.

29. Ong C-S, Day M-Y, and Hsu W-L. 2009. The Measurement of User Satisfaction with Question Answering Systems. Information and Management 46, pp. 397–403.

30. Phippen A, Sheppard L, and Furnell S. 2004. A Practical Evaluation of Web Analytics. Internet Research 14(4), pp. 284 - 293.

31. Said A, Fields B, Jain B J, and Albayrak S. 2013a. User-centric Evaluation of a k-furthest Neighbour Collaborative Filtering Recommender Algorithm. In Proceedings of the 2013 ACM Conference on Computer Supported Cooperative Work, pp. 1399-1408.

32. Said A, Jain B J, Lommatzsch A, and Albayrak S. 2012. Correlating Perception-oriented Aspects in User-centric Recommender System Evaluation. In Proceedings of the 4th ACM Conference on Information Interaction in Context Symposium, pp. 294-297.

33. Saracevic T. 1995. Evaluation of Evaluation in Information Retrieval. In Proceedings of SIGIR 95, pp. 138-146.

34. Sirotkin P. 2013. On Search Engine Evaluation Metrics. arXiv preprint arXiv:1302.2318, pp. 1-192.

35. Soffer A and Dori D. 2013. Model-Based Requirements Engineering Framework for Systems Life-Cycle Support. In Managing Requirements Knowledge. Springer, Berlin Heidelberg, pp. 291-311.

36. Sumak B, Hericko M, Pusnik M, and Polancic G. 2011. Factors Affecting Acceptance and Use of Moodle: An Empirical Study Based on TAM. Informatica 35, pp. 91–100.

37. Teo T and Fan X. 2013. Coefficient Alpha and Beyond: Issues and Alternatives for Educational Research. The Asia-Pacific Education Researcher 22(2), pp. 209–213.

38. Teo T and Zhou M. 2014. Explaining the Intention to Use Technology among University Students: a Structural Equation Modeling Approach. Journal of Computing In Higher Education 26(2), pp. 124-142.

39. Thompson B and Daniel L G. 1996. Factor Analytic Evidence for the Construct Validity of Scores: A Historical Overview and Some Guidelines. Educational and Psychological Measurement 56, pp. 197-208.

40. Waisberg D and Kaushik A. 2009. Web Analytics 2.0: Empowering Customer Centricity. The Original Search Engine Marketing Journal, 2(1), pp. 5-11.

41. Wang X, Shen D, Chen H-L, and Wedman L. 2011. Applying Web Analytics in a K-12 Resource Inventory. The Electronic Library 29(1), pp. 20-35.

42. Yang Z, Cai S, Zhou Z, and Zhou N. 2005. Development and Validation of an Instrument to Measure User Perceived Service Quality of Information Presenting Web Portals. Information and Management 42, pp. 575–589.

43. Zahran D I, Al-Nuaim H A, Rutter M J, and Benyon D. 2014. A Comparative Approach to Web Evaluation and Website Evaluation Methods. International Journal of Public Information Systems 10(1), pp. 20–39.

44. Sundgren B. 2005. What is a Public Information System? International Journal of Public Information Systems 1(1), pp. 81-99.

45. Katre D and Gupta. 2011. Expert Usability Evaluation of 28 State Government Web Portals of India. International Journal of Public Information Systems 10(1).

46. Brin S and Page L. 2012. Reprint of: The Anatomy of a Large-scale Hypertextual Web Search Engine. Computer Networks 56(18), pp. 3825-3833.

47. Christophersen T and Konradt U. 2012. Development and Validation of a Formative and a Reflective measure for the Assessment of Online Store Usability. Behaviour and Information Technology 31(9), pp. 839-857.

48. Abhari K, Davidson E J, and Xiao B. 2016. Measuring the Perceived Functional Affordances of Collaborative Innovation Networks in Social Product Development. In Proceedings of the 49th Hawaii International Conference on System Sciences (HICSS), IEEE, pp. 929-938.

49. Wu J-H, Shen W-S, Lin L-M, Greenes R A, and Bates D W. 2008. Testing the Technology Acceptance Model for Evaluating Healthcare Professionals’ Intention to Use an Adverse Event Reporting System. International Journal for Quality in Health Care 20, pp. 123-129.

This article is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.