Crawley crawls web pages and prints every link it can find.

  • Fast HTML SAX-parser (powered by golang.org/x/net/html)
  • Small (below 1500 SLOC), idiomatic, 100% test-covered codebase
  • Grabs most useful resource URLs (images, videos, audio, forms, etc.)
  • Found URLs are streamed to stdout and guaranteed to be unique (with fragments omitted)
  • Configurable scan depth (limited to the starting host and path; 0 by default)
  • Can crawl rules and sitemaps from robots.txt
  • Brute mode: scans HTML comments for URLs (this can lead to bogus results)
  • Honors the HTTP_PROXY / HTTPS_PROXY environment variables and handles proxy auth
  • Directory-only scan mode (aka fast-scan)
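The uniqueness guarantee above (fragments dropped before deduplication) can be sketched in Go with the standard net/url package; the URLs and the helper name `normalize` below are illustrative, not crawley's actual internals:

```go
package main

import (
	"fmt"
	"net/url"
)

// normalize strips the fragment so that page#a and page#b
// collapse to the same URL before deduplication.
func normalize(raw string) (string, error) {
	u, err := url.Parse(raw)
	if err != nil {
		return "", err
	}
	u.Fragment = ""
	return u.String(), nil
}

func main() {
	seen := make(map[string]bool)
	for _, raw := range []string{
		"https://example.com/page#top",
		"https://example.com/page#bottom",
		"https://example.com/other",
	} {
		n, err := normalize(raw)
		if err != nil || seen[n] {
			continue // skip malformed and already-seen URLs
		}
		seen[n] = true
		fmt.Println(n) // each unique URL is streamed to stdout once
	}
}
```

Running this prints `https://example.com/page` and `https://example.com/other`: the two fragment variants collapse into one entry.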

Features

  • Idiomatic, 100% test-covered codebase
  • Below 1500 SLOC
  • Grabs most useful resource URLs
  • Directory-only scan mode
  • Can exclude URLs containing matched substrings from crawling
  • Extracts API endpoints from JS files
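crawley's actual JS endpoint extraction is not documented here; a minimal illustrative heuristic, assuming endpoints appear in JS source as quoted absolute paths (the regex and `extractEndpoints` helper are assumptions for this sketch):

```go
package main

import (
	"fmt"
	"regexp"
)

// endpointRe is an assumed heuristic: it matches quoted absolute
// paths such as "/api/v1/users". The real tool's logic may differ.
var endpointRe = regexp.MustCompile(`["'](/[A-Za-z0-9_\-./]+)["']`)

// extractEndpoints returns the unique quoted paths found in js,
// in order of first appearance.
func extractEndpoints(js string) []string {
	var out []string
	seen := make(map[string]bool)
	for _, m := range endpointRe.FindAllStringSubmatch(js, -1) {
		if p := m[1]; !seen[p] {
			seen[p] = true
			out = append(out, p)
		}
	}
	return out
}

func main() {
	js := `fetch("/api/v1/users"); axios.get('/api/v1/items');`
	fmt.Println(extractEndpoints(js)) // [/api/v1/users /api/v1/items]
}
```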

Categories

Web Scrapers

License

MIT License


Additional Project Details

Operating Systems

Mac, Windows

Programming Language

Go

Related Categories

Go Web Scrapers

Registered

2023-04-12