A bibliometric study of video retrieval evaluation benchmarking (TRECVid): a methodological analysis

DC Field | Value | Language
dc.contributor.author | Thornley, Clare V. | -
dc.contributor.author | McLoughlin, Shane J. | -
dc.contributor.author | Johnson, Andrea C. | -
dc.contributor.author | Smeaton, Alan F. | -
dc.date.copyright | 2011 The authors | en
dc.identifier.citation | Journal of Information Science | en
dc.description.abstract | This paper provides a discussion and analysis of methodological issues encountered during a scholarly impact and bibliometric study, within the field of computer science, of TRECVid (the TREC Video Retrieval Evaluation). The purpose of this paper is to reflect on and analyse the methods used, in order to provide useful information and guidance for those who may wish to undertake similar studies; it is of particular relevance for academic disciplines whose publication and citation norms may not perform well using traditional tools. Scopus and Google Scholar are discussed, and a detailed comparison is provided of the effects of different search methods and cleaning methods, within and between these tools, for subject and author analysis. The additional database capabilities and usefulness of "Scopus More" over "Scopus General" are discussed and evaluated. Scopus's paper coverage is found to compare favourably to Google Scholar's, but Google Scholar consistently performs better at finding citations to those papers. These additional citations significantly increase the citation totals and also change the relative ranking of papers. Publish or Perish (PoP), a software wrapper for Google Scholar, is also examined, and its limitations and some possible solutions are described. Data cleaning methods, including duplicate checks, expert domain checking of bibliographic data, and content checking of retrieved papers, are compared, and their relative effects on paper and citation counts are discussed. Google Scholar and Scopus are also compared as tools for collecting bibliographic data for visualisations of developing trends; due to the comparative ease of collecting abstracts, Scopus is found to be far more effective. | en
dc.description.sponsorship | Not applicable | en
dc.format.extent | 698972 bytes | -
dc.relation.requires | CLARITY Research Collection | en
dc.subject | Video retrieval | en
dc.subject | Research evaluation | en
dc.subject | Scholarly impact | en
dc.subject | Computer science | en
dc.subject | Citation analysis | en
dc.subject.lcsh | Information retrieval--Research | en
dc.subject.lcsh | Digital video--Research | en
dc.title | A bibliometric study of video retrieval evaluation benchmarking (TRECVid): a methodological analysis | en
dc.type | Journal Article | en
dc.internal.availability | Full text available | en
dc.status | Peer reviewed | en
dc.neeo.contributor | Thornley|Clare V.|aut | -
dc.neeo.contributor | McLoughlin|Shane J.|aut | -
dc.neeo.contributor | Johnson|Andrea C.|aut | -
dc.neeo.contributor | Smeaton|Alan F.|aut | -
dc.description.admin | 12M embargo - AV 20/7/2011. Author says it will be online in August; check then for published version - OR 25/7/11. Author has confirmed with publisher that the accepted version can be made available when published online at end of August - OR 02/08/2011 | en
item.fulltext | With Fulltext | -
Appears in Collections:
CLARITY Research Collection
Information and Communication Studies Research Collection
Files in This Item:
File | Description | Size | Format
JIS-1410-v4.pdf | - | 682.59 kB | Adobe PDF


This item is made available under the Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Ireland licence. No item may be reproduced for commercial purposes. For other possible restrictions on use, please refer to the publisher's URL where this is made available, or to notes contained in the item itself. Other terms may apply.