BRACU-Crawler is a pilot project designed to crawl BD sites only. First, the Initiate crawler module provides the current status of the URL queue to the Schedule policy module and receives a policy map and a crawling threshold number. Until a threshold-reached flag is received from the Fetch URL module, the Crawler receives a link from the Fetch URL module. It then retrieves a new page by providing the page link to the Fetch site page module. The page is then sent to the Extract URL module, which returns a set of newly extracted links to the Crawler. To do this, the Extract URL module first generates raw links using the Parse raw links module and sends them to the Filter valid links module, which returns only the URLs that the Crawler should crawl in the future. After receiving the filtered links, the Extract URL module formats them using a library module called Format link and finally sends the filtered-formatted links back to the Crawler. Based on the policy map, the Crawler can either add the links to the Update URL queue module, which is an off-page connector, or dispatch them to the Reschedule policy module, which is an on-page connector.

Design a structure chart based on the above information.
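The module hierarchy in the statement can be sketched as ordinary function calls, which makes the structure-chart relationships (who calls whom, what data flows back) easier to see before drawing it. This is a minimal illustrative sketch: all module names come from the statement, but the stub bodies, the sample links, the BD-domain filter, and the threshold of 3 are invented placeholders so the sketch can run.

```python
import re


def schedule_policy(queue_status):
    """Schedule policy module: takes queue status, returns policy map and threshold."""
    return {"action": "update_queue"}, 3  # placeholder policy and threshold


def fetch_url(threshold):
    """Fetch URL module: yields links; stopping models the threshold-reached flag."""
    pending = ["http://example.com.bd/a", "http://example.com.bd/b",
               "http://example.com.bd/c", "http://example.com.bd/d"]
    yield from pending[:threshold]


def fetch_site_page(link):
    """Fetch site page module: retrieves the page (stubbed as a fake HTML string)."""
    return f"<html><a href='{link}/next'></a><a href='http://other.com/x'></a></html>"


def parse_raw_links(page):
    """Parse raw links module: generates raw links from the page (naive stub)."""
    return re.findall(r"href='([^']+)'", page)


def filter_valid_links(raw_links):
    """Filter valid links module: keeps only URLs the Crawler should crawl (BD sites)."""
    return [u for u in raw_links if ".bd" in u]


def format_link(url):
    """Format link library module: normalizes a single link."""
    return url.rstrip("/").lower()


def extract_url(page):
    """Extract URL module: parse -> filter -> format, then return links to the Crawler."""
    raw = parse_raw_links(page)
    valid = filter_valid_links(raw)
    return [format_link(u) for u in valid]


def update_url_queue(queue, links):
    """Update URL queue module (off-page connector): appends new links to the queue."""
    queue.extend(links)


def reschedule_policy(links):
    """Reschedule policy module (on-page connector): dispatch for rescheduling (stub)."""


def crawler():
    """Crawler: the top-level module coordinating all the calls above."""
    url_queue = []
    policy_map, threshold = schedule_policy(queue_status=len(url_queue))
    for link in fetch_url(threshold):
        page = fetch_site_page(link)
        links = extract_url(page)
        if policy_map["action"] == "update_queue":
            update_url_queue(url_queue, links)
        else:
            reschedule_policy(links)
    return url_queue


print(crawler())
```

In the structure chart itself, each `def` above becomes a box, each call becomes a connecting arrow with its data couples (link, page, links, policy map, threshold) and the threshold-reached control flag, Format link gets the library-module marking, and Update URL queue / Reschedule policy get the off-page and on-page connector symbols.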
Jun 11, 2022