Crawler4j setup guide

I would like to configure the crawler to crawl a website, say a blog, extract only the links on the site, and write those links into a text file. Can you guide me step by step through setting up the crawler? I am using Eclipse.


Jsoup will do everything you need with respect to HTML parsing. Jsoup is a Java API for processing HTML source code. With it you can get:

  • A table from which you can analyze each row or column.
  • A list of all the links and the resources imported into the HTML (imports such as CSS and JS files).
  • The data inside a specific tag.

and more.
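As a minimal sketch of the link-extraction part of your question: the snippet below uses Jsoup to fetch a page, select every anchor with an href attribute, and write the absolute URLs to a text file. The URL https://example-blog.com and the output file name links.txt are placeholders you would replace with your own values.

public class LinkExtractor {
    public static void main(String[] args) throws java.io.IOException {
        // Fetch and parse the page (placeholder URL)
        org.jsoup.nodes.Document doc = org.jsoup.Jsoup.connect("https://example-blog.com").get();

        // Select all anchor tags that carry an href attribute
        org.jsoup.select.Elements links = doc.select("a[href]");

        // Write the absolute form of each link to a text file (placeholder file name)
        try (java.io.PrintWriter out = new java.io.PrintWriter("links.txt")) {
            for (org.jsoup.nodes.Element link : links) {
                out.println(link.attr("abs:href"));
            }
        }
    }
}

To run this in Eclipse, add the Jsoup JAR to your project's build path; if you want to crawl many pages rather than a single one, you would plug similar parsing code into crawler4j's visit() callback.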
