crawler is a toolchain for bringing web2 content to web3. Before running it, launch an IPFS daemon and download the enwiki-latest-all-titles dump into the crawler root directory. The tool provides two main functions: it parses wiki titles and submits links between keywords and wiki pages, and a separate upload-duras-to-ipfs command uploads files to the local IPFS node. All DURAs are collected under a single root unixfs directory.
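The two prerequisites can be set up roughly as follows. This is a sketch, not a documented install procedure: the dump URL points at the standard Wikimedia dumps mirror, and the exact archive name may differ between dump runs.

```shell
# start a local IPFS daemon in the background (assumes ipfs is installed and initialized)
ipfs daemon &

# fetch the English Wikipedia titles dump into the crawler root directory
wget https://dumps.wikimedia.org/enwiki/latest/enwiki-latest-all-titles.gz
gunzip enwiki-latest-all-titles.gz
```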
Features
- Requires Go 1.12+
- Requires a running IPFS daemon
- Expects the enwiki-latest-all-titles dump in the crawler root directory
- Parses wiki titles and submits links between keywords and wiki pages
- Uploads DURAs to IPFS under a single root unixfs directory
Categories: File Sharing
License: MIT License