Binance Syncer is a high-performance synchronization tool for asynchronously downloading and managing Binance historical data (klines, trades, etc.). It supports both local and S3 storage, with intelligent data optimization that prefers monthly files over daily ones when both are available.
- Asynchronous synchronization: Concurrent downloading optimized for large volumes
- Multi-market support: SPOT, FUTURES (UM/CM), and OPTIONS markets
- Various data types: KLINES, TRADES, AGG_TRADES, BOOK_DEPTH, etc.
- Flexible storage: Local filesystem or cloud (AWS S3)
- Intelligent optimization: Automatic preference for monthly data
- CLI interface: Complete command-line interface with dry-run mode
- Error handling: Automatic retry and robust SSL management
- Progress tracking: Progress bar with rich console
- Advanced logging: Structured logs with automatic rotation
```bash
pip install binance-syncer
```

Or with uv:

```bash
uv tool install binance-syncer
```

Or from source:

```bash
git clone https://github.com/caymaar/binance-syncer.git
cd binance-syncer
pip install .
```

On first launch, the syncer automatically creates:
- `~/utilities/config/binance_syncer.ini`
- `~/utilities/logs/binance_syncer/`
Edit `~/.config/binance_syncer/config.ini` and replace the defaults with your own configuration:
```ini
[LOCAL]
PATH = ~/binance-vision

[S3]
BUCKET = my-binance-data-bucket
PREFIX = binance-vision

[SETTINGS]
MAX_CONCURRENT_DOWNLOADS = 100
SYMBOL_CONCURRENCY = 10
BATCH_SIZE_SYNC = 20
BATCH_SIZE_DELETE = 1000
```

To set up your S3 credentials, export them as environment variables:
```bash
export AWS_ACCESS_KEY_ID=your_access_key
export AWS_SECRET_ACCESS_KEY=your_secret_key
export AWS_DEFAULT_REGION=us-east-1
```

```bash
# Sync all SPOT KLINES 1d data
binance-syncer --market-type spot --data-type klines --interval 1d

# Specific symbols with progress bar
binance-syncer --market-type spot --data-type klines --interval 1d \
    --symbols BTCUSDT ETHUSDT ADAUSDT --progress

# S3 storage
binance-syncer --market-type spot --data-type klines --interval 1d \
    --storage s3 --progress

# Dry-run mode (simulation)
binance-syncer --market-type spot --data-type klines --interval 1d \
    --dry-run

# Trade data (no interval required)
binance-syncer --market-type spot --data-type trades --storage local
```

```bash
# Verbose logging
binance-syncer --market-type spot --data-type klines --interval 1d \
    --verbose --progress

# Futures with 4h interval
binance-syncer --market-type futures/um --data-type klines --interval 4h \
    --storage s3 --progress
```

```python
import asyncio
from binance_syncer import BinanceDataSync, MarketType, DataType, KlineInterval

async def main():
    # Configure syncer
    syncer = BinanceDataSync(
        storage_mode="local",  # or "s3"
        market_type=MarketType.SPOT,
        data_type=DataType.KLINES,
        interval=KlineInterval.D1,
        progress=True,
    )

    # Sync specific symbols
    await syncer.sync(["BTCUSDT", "ETHUSDT"])

    # Or sync all available symbols
    # await syncer.sync()

# Execute
asyncio.run(main())
```

```
~/binance-vision/
├── data/
│   ├── spot/
│   │   ├── klines/
│   │   │   ├── BTCUSDT/
│   │   │   │   ├── 1d/
│   │   │   │   │   ├── 2024-01.parquet     # Monthly data
│   │   │   │   │   ├── 2024-02-15.parquet  # Daily data
│   │   │   │   │   └── ...
│   │   │   │   └── 1h/
│   │   │   └── ETHUSDT/
│   │   └── trades/
│   ├── futures/
│   └── option/
```
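Since each file is plain Parquet, synced data can be loaded directly with pandas or any other Parquet reader. A minimal sketch, assuming the default local path above and an already-synced BTCUSDT 1d dataset (the path is illustrative; adjust it to your configuration):

```python
from pathlib import Path

import pandas as pd  # needs pyarrow or fastparquet for Parquet support

# Illustrative path following the layout above.
data_dir = Path("~/binance-vision/data/spot/klines/BTCUSDT/1d").expanduser()

# Load every monthly/daily file for the symbol and concatenate them.
frames = [pd.read_parquet(f) for f in sorted(data_dir.glob("*.parquet"))]
df = pd.concat(frames, ignore_index=True)
print(df.head())
```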
```
s3://your-bucket/
├── binance-vision/
│   ├── data/
│   │   ├── spot/
│   │   │   ├── klines/
│   │   │   │   ├── BTCUSDT/
│   │   │   │   │   ├── 1d/
│   │   │   │   │   │   ├── 2024-01.parquet
│   │   │   │   │   │   └── ...
```
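The S3 layout mirrors the local one, so the same files can be read straight from the bucket, for example with pandas plus the s3fs package. A sketch using the example bucket and prefix from the configuration above (the object key is illustrative):

```python
import pandas as pd  # reading s3:// paths requires the s3fs package

# Illustrative object key following the S3 layout above.
df = pd.read_parquet(
    "s3://my-binance-data-bucket/binance-vision/data/spot/klines/BTCUSDT/1d/2024-01.parquet"
)
print(df.shape)
```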
| Data Type | Market Types | Interval Required |
|---|---|---|
| KLINES | SPOT, FUTURES_UM, FUTURES_CM | Yes |
| TRADES | SPOT, FUTURES_UM, FUTURES_CM | No |
| AGG_TRADES | SPOT, FUTURES_UM, FUTURES_CM | No |
| BOOK_DEPTH | SPOT, FUTURES_UM, FUTURES_CM | No |
| BOOK_TICKER | SPOT, FUTURES_UM, FUTURES_CM | No |
| EOH_SUMMARY | OPTION | No |
- Seconds: 1s
- Minutes: 1m, 3m, 5m, 15m, 30m
- Hours: 1h, 2h, 4h, 6h, 8h, 12h
- Days: 1d
- Weeks: 1w
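Combining the table and interval list above: in the Python API, an interval is only needed for KLINES, while other data types are constructed without one. A minimal sketch under that assumption (`DataType.TRADES` is inferred from the table and CLI examples; only `DataType.KLINES` appears in the API example above):

```python
import asyncio

from binance_syncer import BinanceDataSync, MarketType, DataType, KlineInterval

async def main():
    # KLINES require an interval (1d here, from the list above).
    klines = BinanceDataSync(
        storage_mode="local",
        market_type=MarketType.SPOT,
        data_type=DataType.KLINES,
        interval=KlineInterval.D1,
    )
    # Trade data needs no interval (enum member inferred from the table).
    trades = BinanceDataSync(
        storage_mode="local",
        market_type=MarketType.SPOT,
        data_type=DataType.TRADES,
    )
    await klines.sync(["BTCUSDT"])
    await trades.sync(["BTCUSDT"])

asyncio.run(main())
```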
Logs are automatically created in `~/utilities/logs/binance_syncer/`:
- Retention: 7 days
- Levels: DEBUG, INFO, WARNING, ERROR
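The syncer sets this up internally; for reference, a 7-day rotating file log like the one described can be reproduced with the standard library alone. A minimal sketch (file and logger names are illustrative, not the syncer's actual configuration):

```python
import logging
from logging.handlers import TimedRotatingFileHandler
from pathlib import Path

log_dir = Path("~/utilities/logs/binance_syncer").expanduser()
log_dir.mkdir(parents=True, exist_ok=True)

# Rotate at midnight and keep 7 backups, matching the 7-day retention above.
handler = TimedRotatingFileHandler(log_dir / "example.log", when="midnight", backupCount=7)
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(name)s %(message)s"))

logger = logging.getLogger("binance_syncer_example")  # illustrative name
logger.setLevel(logging.DEBUG)
logger.addHandler(handler)
logger.info("logging configured")
```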
- Downloads: 100 simultaneous tasks per symbol
- Symbols: 10 symbols processed in parallel
- Batches: Processing in batches of 20 to avoid overload
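These limits map naturally onto asyncio semaphores: one bounding the number of symbols in flight, one bounding simultaneous download tasks. A simplified sketch of that pattern using the defaults from the configuration above (not the syncer's actual internals):

```python
import asyncio

SYMBOL_CONCURRENCY = 10          # symbols processed in parallel
MAX_CONCURRENT_DOWNLOADS = 100   # simultaneous download tasks

symbol_sem = asyncio.Semaphore(SYMBOL_CONCURRENCY)
download_sem = asyncio.Semaphore(MAX_CONCURRENT_DOWNLOADS)

async def download_file(url: str) -> None:
    async with download_sem:
        await asyncio.sleep(0)  # placeholder for the real HTTP download

async def sync_symbol(symbol: str, urls: list[str]) -> None:
    async with symbol_sem:
        # Each symbol fans out into many bounded download tasks.
        await asyncio.gather(*(download_file(u) for u in urls))

async def main() -> None:
    symbols = {"BTCUSDT": ["url1", "url2"], "ETHUSDT": ["url3"]}  # illustrative
    await asyncio.gather(*(sync_symbol(s, u) for s, u in symbols.items()))

asyncio.run(main())
```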
- SSL: Robust configuration with fallback
- Timeouts: Configurable (connection: 30s, read: 120s, total: 300s)
- Retry: Automatic retry logic with exponential backoff
- Chunked reading: Reading in 8KB chunks for large files
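A simplified sketch of what these timeouts, 8 KB chunked reads, and exponential-backoff retries look like with aiohttp (illustrative only; the syncer's real implementation may differ):

```python
import asyncio

import aiohttp

# Connection: 30s, read: 120s, total: 300s, as listed above.
TIMEOUT = aiohttp.ClientTimeout(total=300, connect=30, sock_read=120)

async def fetch_with_retry(url: str, dest: str, retries: int = 5) -> None:
    async with aiohttp.ClientSession(timeout=TIMEOUT) as session:
        for attempt in range(retries):
            try:
                async with session.get(url) as resp:
                    resp.raise_for_status()
                    with open(dest, "wb") as fh:
                        # Stream the body in 8 KB chunks to keep memory flat.
                        async for chunk in resp.content.iter_chunked(8192):
                            fh.write(chunk)
                    return
            except aiohttp.ClientError:
                if attempt == retries - 1:
                    raise
                # Exponential backoff: 1s, 2s, 4s, ...
                await asyncio.sleep(2 ** attempt)

# Example: asyncio.run(fetch_with_retry("https://example.com/archive.zip", "archive.zip"))
```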
- Monthly preference: Automatic removal of daily data when monthly data is available
- Deduplication: Avoids downloading existing files
- Compression: Parquet format with Snappy compression
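The monthly-preference rule can be pictured as a filename filter: once a month's file exists, the daily files inside that month are redundant and can be removed. A simplified sketch of that idea, together with writing Parquet with Snappy compression via pandas (illustrative, not the syncer's actual code):

```python
import pandas as pd

def redundant_daily_files(filenames: list[str]) -> list[str]:
    """Return daily files (YYYY-MM-DD.parquet) whose month already has a monthly file (YYYY-MM.parquet)."""
    months = {f[:7] for f in filenames if len(f.removesuffix(".parquet")) == 7}
    return [f for f in filenames if len(f.removesuffix(".parquet")) == 10 and f[:7] in months]

print(redundant_daily_files(["2024-01.parquet", "2024-01-15.parquet", "2024-02-15.parquet"]))
# -> ['2024-01-15.parquet']

# Parquet with Snappy compression (Snappy is the pandas/pyarrow default codec).
df = pd.DataFrame({"open": [1.0], "close": [2.0]})
df.to_parquet("2024-01.parquet", compression="snappy")
```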
```bash
# On macOS, install Python certificates
/Applications/Python\ 3.11/Install\ Certificates.command

# Check SSL configuration
python -c "import ssl, certifi; print(certifi.where())"
```

```bash
# Check AWS credentials
aws configure list
aws s3 ls s3://your-bucket/
```