Quantitative & Qualitative Evaluation of Three Search Engines (Google, Yahoo, and Bing)

Authors

  • Mohammad Asmaran, Al-Balqa’ Applied University, Al-Salt 19117, Jordan

Keywords:

Information Retrieval, Data Ranking, Search Engine Evaluation, Search Engines Comparison.

Abstract

As the Internet grows larger every day, people become increasingly interested in finding the services and information spread across the web. This is a difficult task because of the enormous number of sites released every day, which makes search engines critically important to web users who want to navigate the web easily. Search engines are expected to provide appropriate and accurate results to users looking for any information. This research evaluates three of the top search engines on the Internet (i.e., Google, Yahoo, and Bing) using well-known qualitative and quantitative approaches. According to the results, Google is the best of the three engines in terms of quality of search results, while Yahoo is the best in terms of search speed.
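
To make the quantitative side of such an evaluation concrete, below is a minimal Python sketch (not taken from the paper) that scores ranked result lists with two standard information-retrieval measures, Precision@k and discounted cumulative gain (DCG@k). The engine names are real, but the query, the result lists, and the graded relevance judgments are hypothetical placeholders.

    from math import log2

    def precision_at_k(relevance, k):
        # Fraction of the top-k results judged relevant (grade > 0).
        return sum(1 for r in relevance[:k] if r > 0) / k

    def dcg_at_k(relevance, k):
        # Discounted cumulative gain: graded relevance discounted by log of rank.
        return sum(rel / log2(rank + 2) for rank, rel in enumerate(relevance[:k]))

    # Hypothetical graded judgments (0 = irrelevant, 2 = highly relevant) for the
    # top five results each engine returned for a single test query.
    judgments = {
        "Google": [2, 2, 1, 0, 1],
        "Yahoo":  [1, 2, 0, 1, 0],
        "Bing":   [2, 0, 1, 0, 0],
    }

    for engine, rel in judgments.items():
        print(f"{engine}: P@5 = {precision_at_k(rel, 5):.2f}, DCG@5 = {dcg_at_k(rel, 5):.2f}")

In a study of this kind, scores like these are typically averaged over a set of test queries to obtain a per-engine quality figure, while search speed is measured separately, for example with page-load timing tools.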

Published

2016-10-31

How to Cite

Asmaran, M. (2016). Quantitative & Qualitative Evaluation of Three Search Engines (Google, Yahoo, and Bing). American Scientific Research Journal for Engineering, Technology, and Sciences, 26(2), 97–106. Retrieved from https://asrjetsjournal.org/index.php/American_Scientific_Journal/article/view/2305

Issue

Volume 26, Issue 2 (2016)
Section

Articles