
Cron/Task Scheduler - SignalDashPro (Binance-first)

Checklist for running the automated jobs consistently on spot/futures.

Note (external deployment)

On the VPS/Proxmox container (CT110) we are using systemd timers instead of cron for the 30-day operation:

  • signaldashpro-daily.timer: daily smoke test + evidence capture (see deploy/ops/daily.sh).
  • signaldashpro-alerts.timer: Slack/Teams alerts every 5 min (see deploy/ops/alerts.sh).
  • signaldashpro-storage-backup.timer: daily backup of storage/ (see deploy/ops/storage_backup.sh).
  • signaldashpro-weekly-report.timer: generates the weekly report (MD/JSON) on Mondays at 07:00 UTC (see deploy/ops/weekly_report.sh).

Commands:

systemctl list-timers --all | grep signaldashpro
journalctl -u signaldashpro-daily.service -n 200 --no-pager
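
To verify a timer's job without waiting for the next scheduled trigger, a minimal sketch (assumes the service units can be started on demand and that you have sudo on CT110):

sudo systemctl start signaldashpro-daily.service    # one-off run of the daily job
journalctl -u signaldashpro-daily.service -f        # follow its output live
systemctl list-timers signaldashpro-daily.timer     # confirm the next scheduled trigger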

Main jobs

| Job | *_AUTO_RUN variable | Script/command | Suggested frequency | Verification |
|---|---|---|---|---|
| Execution queue | EXECUTION_QUEUE_AUTO_RUN | Built into the backend (SignalQueueService) | EXECUTION_QUEUE_POLL_SECONDS | GET /jobs/status + processing_* events in signal_queue_events |
| News import | NEWS_AUTO_RUN | python scripts/import_market_news.py --symbols BTCUSDT ETHUSDT --brainstorm | 15 min | Logs in storage/logs/signaldashpro.log |
| News summary | NEWS_SUMMARY_AUTO_RUN | python scripts/import_market_news.py --summary-only --symbols BTCUSDT ETHUSDT | 30 min | New summary_ai entries in market_news |
| Data pipeline | DATA_PIPELINE_AUTO_RUN | python scripts/run_data_pipeline.py --symbols BTCUSDT ETHUSDT --timeframe H1 | 60 min | New records in market_windows |
| BI export | BI_EXPORT_AUTO_RUN | python scripts/export_bi_datasets.py --datasets trading_performance targets_progress --formats csv json --summary | 4 h | Files in storage/bi_exports/ |
| Training pipeline | TRAINING_AUTO_RUN | python scripts/run_training_pipeline.py --lookback-hours 240 --model-type random_forest | Daily | New artifacts in storage/models/ |
| Goal monitor | META_GOAL_AUTO_RUN | python scripts/run_goal_monitor.py --lookback-days 7 --alert-grace 3 | 60 min | New goal_snapshots/goal_alerts |
| External alerts | (no flag) | python scripts/notify_external_alerts.py --backend-url http://127.0.0.1:8096 --slack-webhook <url> | 5 min | Delivery to Slack/Teams + state in storage/alerts/alert_state.json |
| Weekly report | (no flag) | python scripts/run_weekly_report.py | Weekly (Mondays 07:00 UTC) | Files in storage/reports/weekly/ (JSON + MD) |
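
The Verification column leans on GET /jobs/status. A minimal check from the VPS, assuming the backend listens on 127.0.0.1:8096 (same base URL as the alerts job) and jq is installed; the .execution_queue field path follows the evidence checklist at the end of this page and may differ in the real schema:

curl -s http://127.0.0.1:8096/jobs/status | jq '.execution_queue'   # expect auto_run=true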

Recommended variables (Binance)

  • EXECUTION_MODE=binance
  • EXECUTION_QUEUE_AUTO_RUN=true
  • EXECUTION_QUEUE_POLL_SECONDS=5
  • EXECUTION_QUEUE_BATCH=5
  • EXECUTION_RETRY_BACKOFF_BASE_SECONDS=5
  • EXECUTION_RETRY_BACKOFF_MAX_SECONDS=300
  • ALERT_BACKEND_BASE_URL=http://127.0.0.1:8096
  • SLACK_WEBHOOK_URL=... or TEAMS_WEBHOOK_URL=...
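
Before enabling the flags above, a quick sanity check that the env file actually exports them; a sketch assuming the file lives at env/.env.binance as in the evidence section below (webhook URLs are secrets, so avoid pasting the output into shared logs):

set -a; source env/.env.binance; set +a
env | grep -E '^(EXECUTION_|ALERT_BACKEND_)'    # queue + alert settings loaded?
env | grep -cE '^(SLACK|TEAMS)_WEBHOOK_URL='    # number of webhook URLs configured (expect >= 1)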

Cron template (Linux/macOS)

SHELL=/bin/bash
BASH_ENV=/opt/signaldashpro/.env.binance
PATH=/usr/local/bin:/usr/bin:/bin
LOG_DIR=/var/log/signaldashpro

*/15 * * * * cd /opt/SignalDashPro && python scripts/import_market_news.py --symbols BTCUSDT ETHUSDT --brainstorm >> $LOG_DIR/cron.log 2>&1
*/30 * * * * cd /opt/SignalDashPro && python scripts/import_market_news.py --summary-only --symbols BTCUSDT ETHUSDT >> $LOG_DIR/cron.log 2>&1
0 * * * * cd /opt/SignalDashPro && python scripts/run_data_pipeline.py --symbols BTCUSDT ETHUSDT --timeframe H1 >> $LOG_DIR/cron.log 2>&1
0 */4 * * * cd /opt/SignalDashPro && python scripts/export_bi_datasets.py --datasets trading_performance targets_progress --formats csv json --summary >> $LOG_DIR/cron.log 2>&1
30 2 * * * cd /opt/SignalDashPro && python scripts/run_training_pipeline.py --lookback-hours 240 --model-type random_forest >> $LOG_DIR/cron.log 2>&1
0 * * * * cd /opt/SignalDashPro && python scripts/run_goal_monitor.py --lookback-days 7 --alert-grace 3 >> $LOG_DIR/cron.log 2>&1
*/5 * * * * cd /opt/SignalDashPro && python scripts/notify_external_alerts.py --backend-url http://127.0.0.1:8096 --slack-webhook https://hooks.slack.com/services/... >> $LOG_DIR/cron.log 2>&1
0 7 * * 1 cd /opt/SignalDashPro && python scripts/run_weekly_report.py >> $LOG_DIR/cron.log 2>&1
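
On a small VPS, run_data_pipeline.py or run_training_pipeline.py can occasionally take longer than their interval. A sketch (not part of the repo) that wraps the hourly pipeline with flock so an overlapping trigger is skipped instead of stacking runs; it reuses the LOG_DIR defined above, and the lock file path is arbitrary:

0 * * * * cd /opt/SignalDashPro && flock -n /tmp/signaldashpro_pipeline.lock python scripts/run_data_pipeline.py --symbols BTCUSDT ETHUSDT --timeframe H1 >> $LOG_DIR/cron.log 2>&1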

Task Scheduler template (Windows)

Executable:

C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe

Arguments (pipeline example):

-NoProfile -ExecutionPolicy Bypass -Command "Set-Location 'D:\Repos\SignalDashPro'; . .\setenv.ps1 env\.env.binance; python scripts\run_data_pipeline.py --symbols BTCUSDT ETHUSDT --timeframe H1 >> .\storage\logs\cron.log 2>&1"

Operational evidence

Captures an operational snapshot (HTTP + DB) and saves the JSON under storage/logs/evidence/:

set -a; source env/.env.binance; set +a
.venv/bin/python scripts/capture_operational_evidence.py --env-name binance-spot --backend-url http://127.0.0.1:8096

Key fields to review in the evidence (a quick cross-check sketch follows this list):

  • /jobs/status (execution_queue.auto_run=true)
  • /risk/kill-switch/status (enabled=false during normal operation)
  • /binance/status (expected market/symbols/limits)
  • db_health.checks with no critical warnings
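
A minimal cross-check sketch against the live endpoints listed above (same base URL as the capture command; the jq filters are illustrative and the exact response schema may differ):

curl -s http://127.0.0.1:8096/risk/kill-switch/status | jq .   # expect enabled=false
curl -s http://127.0.0.1:8096/binance/status | jq .            # market/symbols/limits
ls -t storage/logs/evidence/*.json | head -n 1                 # newest snapshot on disk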