# csv-crawler

**Repository Path**: mirrors-dev/csv-crawler

## Basic Information

- **Project Name**: csv-crawler
- **Description**: No description available
- **Primary Language**: Unknown
- **License**: Not specified
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2020-11-09
- **Last Updated**: 2025-09-22

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

# Crawl Packages According to CSV File

## Description

Sync artifacts from the upstream source according to the artifact GAV (Group, Artifact, Version) coordinates listed in a CSV file.

## Usage

```shell
usage: crawler.py [-h] -f FILE -t TYPE [-s SEP] [--no-header] [-o LOG_DIR]
                  [--last-sync LAST_SYNC]

Sync packages according to csv file

optional arguments:
  -h, --help            show this help message and exit
  -f FILE, --file FILE  specify the csv file which contains package info to sync
  -t TYPE, --type TYPE  specify the type of packages ready to sync
  -s SEP, --sep SEP     specify the sep of csv file (default: " ")
  --no-header           whether the csv file has header
  -o LOG_DIR, --log-dir LOG_DIR
                        specify log directory (default: /tmp)
  --last-sync LAST_SYNC
                        specify last sync package
```
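
The flags above are the only interface documented in this README; the exact CSV column layout, the accepted values for `--type`, and the expected format of `--last-sync` are not specified here. The walkthrough below is a minimal sketch under those assumptions (the file contents and the `maven` type are guesses for illustration, not documented behavior):

```shell
# Hypothetical CSV of GAV coordinates, space-separated (the default --sep),
# with a header row. The column layout is an assumption, not documented here.
cat > packages.csv <<'EOF'
group artifact version
org.apache.commons commons-lang3 3.12.0
com.google.guava guava 31.1-jre
EOF

# Sync the listed packages, writing logs under ./logs instead of the default /tmp.
# "maven" is a guessed package type based on the GAV naming; check the script
# for the values actually accepted by --type.
python crawler.py -f packages.csv -t maven -s " " -o ./logs

# Resume a previous run from the last successfully synced package.
# The value format for --last-sync is not documented; a GAV string is assumed here.
python crawler.py -f packages.csv -t maven --last-sync "org.apache.commons commons-lang3 3.12.0"
```

If the CSV has no header row, pass `--no-header` so the first line is treated as data rather than column names.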