As specialized Web crawlers like CiteSeerX and Quertle become more sophisticated, they gain recognition in user communities. Perhaps these Web tools do a better job of gathering references to conference publications and grey literature than do traditional databases. Librarians need to compare traditional databases with Web-crawler-generated databases when they make collection development decisions and when they choose which tools to highlight during instruction sessions. Traditional databases are often compared by reviewing which journals are indexed in each resource. This type of comparison is not applicable when evaluating Web search engines. First, there is no publication list for databases created by Web crawlers. Second, for engineering-related search tools, journal title comparison does not cover other important publication types such as conference articles, government documents, reports, standards, patents, and grey literature. One method used to assess Web crawlers is to compare the results of subject-based searches against the same search in a traditional database. Reviewing the number of items retrieved and the appropriateness of a subset of the citations retrieved provides an assessment of how well a search engine performs. However, this method does not show the depth and breadth of content found within a database. This paper explores a method to quantitatively compare the content gathered by Web crawlers to the content provided by traditional databases. A checklist is developed using researchers' resumes. The researcher publication checklist methodology is evaluated and validated with a test project. © 2012 American Society for Engineering Education.
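The checklist comparison described in the abstract can be sketched in a few lines: given a list of publications compiled from faculty resumes, count how many each tool retrieves and report the coverage fraction. The data and names below are purely illustrative assumptions, not figures from the paper.

```python
# Hypothetical sketch of the researcher publication checklist method.
# The checklist and database contents here are invented for illustration.

def coverage(checklist, indexed):
    """Fraction of checklist items found in a database's index."""
    found = sum(1 for item in checklist if item in indexed)
    return found / len(checklist)

# Toy checklist drawn from (hypothetical) faculty resumes,
# mixing journal, conference, report, and patent items.
checklist = ["journal-A-2010", "conf-B-2011", "report-C-2009", "patent-D-2012"]

# Which checklist items each tool retrieves (illustrative only).
traditional_db = {"journal-A-2010", "conf-B-2011"}
web_crawler = {"journal-A-2010", "conf-B-2011", "report-C-2009"}

print(f"Traditional DB coverage: {coverage(checklist, traditional_db):.0%}")  # 50%
print(f"Web crawler coverage:    {coverage(checklist, web_crawler):.0%}")     # 75%
```

Because the same checklist is scored against every tool, the resulting percentages are directly comparable, which is the quantitative advantage over subjective relevance review of search results.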
CITATION STYLE
Kirkwood, P. E. (2012). Faculty publication checklists: A quantitative method to compare traditional databases to Web search engines. In ASEE Annual Conference and Exposition, Conference Proceedings. American Society for Engineering Education. https://doi.org/10.18260/1-2--21391