An architecture for SCS: A specialized Web crawler on the topic of security

Abstract

Mining for correct and relevant information on the World Wide Web is a difficult task, handled by Web crawlers. This study outlines the components of a specialized crawler on the topic of security (SCS) that makes heavy use of artificial neural networks and rule-based expert systems to establish successful focused crawling on the topic of security. SCS is designed to find, index, and track updates to Web pages of interest, and it proposes new approaches for reaching relevant pages that might remain hidden from other crawling approaches. SCS consists of four new page explorers, a database of relevant pages, a relevance evaluator using artificial neural networks, and an updater using rule-based expert systems. SCS is a multi-threaded, multi-object combination of a Java applet and application with embedded SQL and PHP elements; its expandable, modular structure allows it to run on a single machine or on multiple machines through parallel processing.
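The abstract gives only a component-level view of this pipeline, so the following is a minimal, hypothetical Java sketch of such an architecture: explorer threads pull candidate URLs from a shared frontier, a relevance evaluator scores each fetched page, and pages above a threshold are stored for later re-visiting. The keyword-based evaluator, the 0.5 threshold, the fetch stub, and the example URLs are all illustrative assumptions, not the paper's method; the actual SCS uses an artificial neural network for relevance evaluation and a rule-based expert system for updating, neither of which is reproduced here.

import java.util.List;
import java.util.Map;
import java.util.concurrent.*;

// Hypothetical sketch of the SCS-style pipeline described in the abstract:
// explorer threads pull candidate URLs, a relevance evaluator scores them
// (the paper uses an artificial neural network; a toy keyword stub stands
// in here), and relevant pages are kept for an updater to re-visit.
public class ScsSketch {

    /** Stand-in for the paper's neural-network relevance evaluator. */
    interface RelevanceEvaluator {
        double score(String pageText);
    }

    /** Toy evaluator: fraction of assumed security keywords present. */
    static final RelevanceEvaluator EVALUATOR = text -> {
        List<String> keywords = List.of("security", "vulnerability", "encryption");
        long hits = keywords.stream().filter(text::contains).count();
        return (double) hits / keywords.size();
    };

    static final double THRESHOLD = 0.5;                        // assumed cutoff
    static final BlockingQueue<String> FRONTIER = new LinkedBlockingQueue<>();
    static final Map<String, Double> RELEVANT = new ConcurrentHashMap<>();

    /** One explorer thread (the paper describes four explorers). */
    static Runnable explorer() {
        return () -> {
            String url;
            while ((url = FRONTIER.poll()) != null) {
                String text = fetch(url);                       // placeholder fetch
                double score = EVALUATOR.score(text);
                if (score >= THRESHOLD) {
                    RELEVANT.put(url, score);                   // "database" stand-in
                }
            }
        };
    }

    /** Placeholder for a real HTTP fetch of the page body. */
    static String fetch(String url) {
        return "page about security and encryption at " + url;
    }

    public static void main(String[] args) throws InterruptedException {
        FRONTIER.addAll(List.of("http://example.org/a", "http://example.org/b"));
        ExecutorService pool = Executors.newFixedThreadPool(4); // four explorers
        for (int i = 0; i < 4; i++) pool.submit(explorer());
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.SECONDS);
        // The paper's rule-based updater would periodically re-visit the
        // RELEVANT entries; here we simply print what was kept.
        RELEVANT.forEach((u, s) -> System.out.printf("%s -> %.2f%n", u, s));
    }
}

A shared blocking queue keeps the explorers decoupled from one another, which loosely mirrors the expandable, modular structure the abstract claims allows SCS to run on one or several machines.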

Cite (APA)

Özmutlu, H. C., & Özmutlu, S. (2004). An architecture for SCS: A specialized Web crawler on the topic of security. In Proceedings of the ASIST Annual Meeting (Vol. 41, pp. 317–326). https://doi.org/10.1002/meet.1450410138
