Commit 751370a

Merge quant platform into trading bot and add AI-only smoke test

1 parent 3602e31 · 111 files changed · 9454 additions & 86 deletions
.github/workflows/ai_trading_smoke.yml

Lines changed: 39 additions & 0 deletions

```yaml
name: AI Trading Smoke

on:
  workflow_dispatch:

jobs:
  ai-smoke:
    runs-on: ubuntu-latest
    timeout-minutes: 30
    env:
      PYTHONUNBUFFERED: "1"
      TRAINED_MODEL_INFERENCE_URL: ${{ secrets.TRAINED_MODEL_INFERENCE_URL }}
      TRAINED_MODEL_API_KEY: ${{ secrets.TRAINED_MODEL_API_KEY }}
      TWELVEDATA_API_KEYS: ${{ secrets.TWELVEDATA_API_KEYS }}
      ALPHAVANTAGE_API_KEYS: ${{ secrets.ALPHAVANTAGE_API_KEYS }}
      NVIDIA_API_KEY: ${{ secrets.NVIDIA_API_KEY }}
      AI_SMOKE_TICKERS: "AAPL,MSFT,NVDA,TSLA,SPY"
    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: "3.11"

      - name: Install dependencies
        run: pip install -r requirements.txt

      - name: Run AI-only smoke test
        run: python run_ai_trading_smoke.py

      - name: Upload AI smoke artifacts
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: ai-trading-smoke
          path: results/ai_smoke_*.json
          retention-days: 7
```
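The workflow drives `run_ai_trading_smoke.py`, whose contents are not shown in this commit view. A minimal sketch of what such a driver could look like, assuming it simply batches the tickers from `AI_SMOKE_TICKERS` into one HTTP request to `TRAINED_MODEL_INFERENCE_URL` (the payload shape, output filename, and function names here are hypothetical):

```python
import json
import os
import urllib.request
from pathlib import Path


def parse_tickers(raw: str) -> list[str]:
    """Split a comma-separated ticker list, dropping blanks and whitespace."""
    return [t.strip().upper() for t in raw.split(",") if t.strip()]


def run_smoke() -> dict:
    """Post one batched request to the trained-model endpoint and save the result."""
    url = os.environ["TRAINED_MODEL_INFERENCE_URL"]
    tickers = parse_tickers(os.environ.get("AI_SMOKE_TICKERS", "AAPL,MSFT"))

    # Hypothetical request body; the real endpoint's schema is not shown here.
    payload = json.dumps({"tickers": tickers}).encode()
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )
    api_key = os.environ.get("TRAINED_MODEL_API_KEY")
    if api_key:  # auth is optional per the README
        req.add_header("Authorization", f"Bearer {api_key}")

    with urllib.request.urlopen(req, timeout=120) as resp:
        result = json.load(resp)

    # Write where the workflow's artifact glob (results/ai_smoke_*.json) will find it.
    out_dir = Path("results")
    out_dir.mkdir(exist_ok=True)
    (out_dir / "ai_smoke_result.json").write_text(json.dumps(result, indent=2))
    return result
```

The artifact path in the upload step only matches if the script writes under `results/` with an `ai_smoke_` prefix, which is what the sketch assumes.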

README.md

Lines changed: 80 additions & 86 deletions

````diff
@@ -1,99 +1,93 @@
-# Daily Trading Bot
+# Trading Bot
 
-This is a Python-based trading bot designed to run daily, ingest market data, generate signals, and execute trades (simulated or real).
+This repo now contains both parts of the system in one place:
 
-## Setup
+1. The daily trading bot at the repo root
+2. The train-once quant model platform under `quant_platform/`
 
-### Prerequisites
-- Python 3.10+
-- `pip`
+## Repo Layout
 
-### Installation
-1. Clone the repository.
-2. Install dependencies:
-```bash
-pip install -r requirements.txt
-```
-3. Create a `.env` file with your API keys:
-```bash
-cp .env.example .env
-```
-4. Open `.env` and paste your keys (do not commit `.env`).
+- Root: live trading bot, daily orchestration, SQLite state, email/reporting
+- `quant_platform/`: corpus building, one-time GPU training, frozen LoRA adapter workflow, backtesting/research platform
 
-## Usage
+## Current Architecture
 
-### Local Run
-To run the daily job manually:
+```
+trading_bot/
+├── main.py                         # Daily core bot + AI bot orchestration
+├── llm_trader.py                   # AI trading branch using the trained model
+├── trained_model_client.py         # Remote HTTP client for trained-model inference
+├── modal_trained_model_service.py
+├── backtesting/                    # Existing research stack in the bot repo
+└── quant_platform/                 # Merged train-once quant platform repo
+```
+
+## Core vs AI
+
+- Core bot remains unchanged in principle: price ingestion, feature generation, OLS ranking, meta-learner, portfolio logic
+- AI trading bot is separate and now uses the trained quant model over HTTP
+- The AI path is batched and designed to call the Modal CPU endpoint, not a local model
+
+## Secrets
+
+### Still used
+- `NVIDIA_API_KEY`: news sentiment path
+- `TRAINED_MODEL_INFERENCE_URL`: deployed Modal CPU inference URL for the AI trading bot
+- `TRAINED_MODEL_API_KEY`: optional auth for the trained-model endpoint
+- `TWELVEDATA_API_KEYS`, `ALPHAVANTAGE_API_KEYS`: optional price providers
+
+### No longer used by the AI trading bot
+- `NVIDIA_REASONING_API_KEY`
+
+## Main Workflows
+
+- `.github/workflows/daily_trading_bot.yml`
+  - Daily root bot workflow
+  - Core + AI orchestration
+- `.github/workflows/ai_trading_smoke.yml`
+  - AI-only smoke test against the trained model endpoint
+  - Does not run the core strategy
+
+## AI-Only Smoke Test
+
+Manual:
 ```bash
+python run_ai_trading_smoke.py
+```
+
+GitHub Actions:
+- Actions -> **AI Trading Smoke**
+- This tests only the AI trading branch and the trained-model endpoint
+
+## Quant Platform
+
+The full train-once quant platform has been merged into:
+
+- [quant_platform/](./quant_platform)
+
+That subtree contains:
+- corpus builders
+- training scripts
+- backtest engine
+- inference/API scaffolding
+- configs, docs, and tests from the original train-once repo
+
+Start there if you want to inspect the model/training system rather than the daily bot.
+
+## Local Setup
+
+```bash
+pip install -r requirements.txt
 python main.py daily_job
 ```
 
-### Full Pipeline
-To run the full pipeline (ingest, train, backtest):
+For AI-only testing:
 ```bash
-python main.py full
+python run_ai_trading_smoke.py
 ```
 
-## API Keys (Step-by-Step)
-
-This repo is set up so secrets are **not** stored in git. A fresh `git clone` will **not** include your API keys.
-
-### A) Run Locally (Recommended)
-1. Clone:
-```bash
-git clone https://github.com/Rohan5commit/trading_bot.git
-cd trading_bot
-```
-2. Create `.env`:
-```bash
-cp .env.example .env
-```
-3. Open `.env` and paste your keys:
-- In VS Code: `code .` then click `.env` and paste values.
-- In Terminal: `nano .env` then paste values and save.
-4. Required env vars for the current setup:
-- `NVIDIA_API_KEY` (news/LLM sentiment)
-- `NVIDIA_REASONING_API_KEY` (AI strategy trade selection)
-- Email: `SMTP_SERVER`, `SMTP_PORT`, `SENDER_EMAIL`, `SENDER_PASSWORD`, `RECIPIENT_EMAIL`
-- Optional (faster S&P500 ingestion): `TWELVEDATA_API_KEYS` (comma-separated)
-5. Run:
-```bash
-python3 main.py daily_job
-```
-
-### B) Run In GitHub Actions (Cloud)
-1. Go to your repo on GitHub.
-2. `Settings` -> `Secrets and variables` -> `Actions`.
-3. Click `New repository secret`.
-4. Add secrets (names must match exactly):
-- `NVIDIA_API_KEY`
-- `NVIDIA_API_KEY_ID` (optional)
-- `NVIDIA_REASONING_API_KEY`
-- `NVIDIA_REASONING_API_KEY_ID` (optional)
-- `TWELVEDATA_API_KEYS` (optional, comma-separated)
-- Email secrets as used by your deployment (see your workflow / `send_email_report.py` configuration).
-
-## State Persistence (Important)
-Positions and account state live in SQLite at `data/trading_bot.db`.
-- GitHub Actions keeps this via cache for cloud runs.
-- A new local clone starts "fresh" unless you restore a saved copy of `data/trading_bot.db`.
-
-## GitHub Actions Deployment
-
-This repository includes a GitHub Actions workflow (`.github/workflows/daily_trading_bot.yml`) that runs the bot daily at 9:30 AM EST (Mon-Fri).
-
-### Configuration
-To execute successfully on GitHub Actions, you must configure the following **Repository Secrets** in your GitHub repository (Settings -> Secrets and variables -> Actions):
-
-| Secret Name | Description |
-|---|---|
-| `TWELVEDATA_API_KEYS` | Comma-separated API keys for TwelveData (if used). |
-| `ALPHAVANTAGE_API_KEYS` | Comma-separated API keys for AlphaVantage (if used). |
-| `MAILGUN_API_KEY` | API Key for Mailgun (for email reports). |
-| `MAILGUN_DOMAIN` | Mailgun domain (e.g., `mg.yourdomain.com`). |
-| `EMAIL_RECIPIENTS` | Comma-separated list of email addresses to receive reports. |
-| `EMAIL_SENDER` | Email address to send from (e.g., `bot@yourdomain.com`). |
-| `OPENAI_API_KEY` | OpenAI API Key (for LLM analysis). |
-
-### Workflows
-- **Daily Trading Bot**: Triggers on schedule (Mon-Fri) and can be manually triggered via the "Run workflow" button in the Actions tab.
+## Notes
+
+- The AI bot is remote-only and expects the trained model to be served externally.
+- The current deployment target is Modal CPU.
+- The core bot and AI bot remain logically separate even though they now live in one combined repo.
````
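The new README describes `TWELVEDATA_API_KEYS` and `ALPHAVANTAGE_API_KEYS` as comma-separated lists of keys. A hedged sketch of how a bot might read and rotate through such a list to spread requests across keys (the helper names are illustrative, not code from this commit):

```python
import itertools
import os


def load_keys(env_var: str) -> list[str]:
    """Read a comma-separated API-key list from the environment."""
    raw = os.environ.get(env_var, "")
    return [k.strip() for k in raw.split(",") if k.strip()]


def key_cycle(keys: list[str]):
    """Round-robin iterator so successive requests rotate across keys."""
    if not keys:
        raise ValueError("no API keys configured")
    return itertools.cycle(keys)


# Usage with a fake value standing in for the real secret:
os.environ["TWELVEDATA_API_KEYS"] = "key_a, key_b"
cycler = key_cycle(load_keys("TWELVEDATA_API_KEYS"))
first_three = [next(cycler) for _ in range(3)]
```

Rotating like this is one common way to stay under per-key rate limits when several free-tier keys are available; the commit itself does not show how the bot actually consumes these lists.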
Lines changed: 24 additions & 0 deletions

```yaml
name: Backtest CPU

on:
  workflow_dispatch:

jobs:
  backtest:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: '3.11'
      - name: Install dependencies
        run: pip install -r requirements.txt
      - name: Build features and run backtest
        run: |
          python -m src.cli build-data --config configs/data.yaml
          python -m src.cli backtest --config configs/backtest.yaml
      - name: Upload reports
        uses: actions/upload-artifact@v4
        with:
          name: backtest-reports
          path: reports/backtest
```
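The two `python -m src.cli` invocations above imply a subcommand-style CLI with a shared `--config` option. The actual `src/cli.py` is not part of this commit view; a minimal sketch of that shape, with placeholder handlers, could look like:

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    """CLI skeleton matching the `build-data` / `backtest` calls in the workflow."""
    parser = argparse.ArgumentParser(prog="src.cli")
    sub = parser.add_subparsers(dest="command", required=True)

    # Both subcommands take a required YAML config path, per the workflow.
    for name in ("build-data", "backtest"):
        cmd = sub.add_parser(name)
        cmd.add_argument("--config", required=True, help="Path to a YAML config file")

    return parser


# Usage mirroring the first workflow invocation:
args = build_parser().parse_args(["build-data", "--config", "configs/data.yaml"])
```

Keeping one parser with per-subcommand options is what lets the same module serve both the feature-building and backtest steps.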
Lines changed: 67 additions & 0 deletions

```yaml
name: Build Corpus Chunk (Modal CPU)

on:
  workflow_dispatch:
    inputs:
      chunk_id:
        description: "Chunk index (0-based)"
        required: true
        default: "0"
      chunk_size:
        description: "Tickers per chunk"
        required: true
        default: "200"
      cpu:
        description: "Modal CPU"
        required: true
        default: "16"
      memory:
        description: "Modal memory (MB)"
        required: true
        default: "65536"
      config:
        description: "Config path"
        required: true
        default: "configs/data_large.yaml"

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - name: Install dependencies
        run: pip install -r requirements-data.txt
      - name: Set Modal tokens
        env:
          MODAL_TOKEN_ID: ${{ secrets.MODAL_TOKEN_ID }}
          MODAL_TOKEN_SECRET: ${{ secrets.MODAL_TOKEN_SECRET }}
        run: |
          modal token set --token-id "$MODAL_TOKEN_ID" --token-secret "$MODAL_TOKEN_SECRET"
      - name: Build chunk on Modal CPU
        env:
          SEC_USER_AGENT: ${{ secrets.SEC_USER_AGENT }}
          TRADINGVIEW_CSV_URL: ${{ secrets.TRADINGVIEW_CSV_URL }}
          TRADINGVIEW_CSV_PATH: ${{ secrets.TRADINGVIEW_CSV_PATH }}
          CHECKPOINT_S3_BUCKET: ${{ secrets.CHECKPOINT_S3_BUCKET }}
          CHECKPOINT_S3_ACCESS_KEY: ${{ secrets.CHECKPOINT_S3_ACCESS_KEY }}
          CHECKPOINT_S3_SECRET_KEY: ${{ secrets.CHECKPOINT_S3_SECRET_KEY }}
          CHECKPOINT_S3_REGION: ${{ secrets.CHECKPOINT_S3_REGION }}
          CHECKPOINT_S3_ENDPOINT: ${{ secrets.CHECKPOINT_S3_ENDPOINT }}
          CHECKPOINT_S3_PREFIX: ${{ secrets.CHECKPOINT_S3_PREFIX }}
          CHECKPOINT_S3_USE_PATH_STYLE: ${{ secrets.CHECKPOINT_S3_USE_PATH_STYLE }}
        run: >-
          python scripts/modal_build_chunk.py
          --config ${{ inputs.config }}
          --chunk-id ${{ inputs.chunk_id }}
          --chunk-size ${{ inputs.chunk_size }}
          --cpu ${{ inputs.cpu }}
          --memory ${{ inputs.memory }}
          --summary chunk_summary.json
      - name: Upload chunk summary
        uses: actions/upload-artifact@v4
        with:
          name: chunk-summary
          path: chunk_summary.json
```
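The `chunk_id`/`chunk_size` inputs suggest the ticker universe is sliced into fixed-size pieces, one per workflow run. A sketch of that slicing, assuming 0-based chunk indexing as the input description states (the function name is illustrative; the real logic lives in `scripts/modal_build_chunk.py`, which is not shown here):

```python
def chunk_tickers(tickers: list[str], chunk_id: int, chunk_size: int) -> list[str]:
    """Return the 0-based chunk of the ticker universe assigned to this run."""
    if chunk_id < 0 or chunk_size <= 0:
        raise ValueError("chunk_id must be >= 0 and chunk_size > 0")
    start = chunk_id * chunk_size
    # Slicing past the end yields a short final chunk, then empty chunks.
    return tickers[start:start + chunk_size]


# With the default chunk_size of 200, a 450-ticker universe needs 3 runs.
universe = [f"TICKER{i}" for i in range(450)]
```

Since an out-of-range `chunk_id` simply yields an empty chunk, a dispatcher can keep launching runs until a chunk comes back empty rather than precomputing the chunk count.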
Lines changed: 52 additions & 0 deletions

```yaml
name: Build Corpus Chunk (GitHub CPU)

on:
  workflow_dispatch:
    inputs:
      chunk_id:
        description: "Chunk index (0-based)"
        required: true
        default: "0"
      chunk_size:
        description: "Tickers per chunk (keep small for 6h limit)"
        required: true
        default: "50"
      config:
        description: "Config path"
        required: true
        default: "configs/data_large.yaml"

jobs:
  build:
    runs-on: ubuntu-latest
    timeout-minutes: 350
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - name: Install dependencies
        run: pip install -r requirements-data.txt
      - name: Build chunk on GitHub CPU
        env:
          SEC_USER_AGENT: ${{ secrets.SEC_USER_AGENT }}
          TRADINGVIEW_CSV_URL: ${{ secrets.TRADINGVIEW_CSV_URL }}
          TRADINGVIEW_CSV_PATH: ${{ secrets.TRADINGVIEW_CSV_PATH }}
          CHECKPOINT_S3_BUCKET: ${{ secrets.CHECKPOINT_S3_BUCKET }}
          CHECKPOINT_S3_ACCESS_KEY: ${{ secrets.CHECKPOINT_S3_ACCESS_KEY }}
          CHECKPOINT_S3_SECRET_KEY: ${{ secrets.CHECKPOINT_S3_SECRET_KEY }}
          CHECKPOINT_S3_REGION: ${{ secrets.CHECKPOINT_S3_REGION }}
          CHECKPOINT_S3_ENDPOINT: ${{ secrets.CHECKPOINT_S3_ENDPOINT }}
          CHECKPOINT_S3_PREFIX: ${{ secrets.CHECKPOINT_S3_PREFIX }}
          CHECKPOINT_S3_USE_PATH_STYLE: ${{ secrets.CHECKPOINT_S3_USE_PATH_STYLE }}
        run: >-
          python scripts/gha_build_chunk.py
          --config ${{ inputs.config }}
          --chunk-id ${{ inputs.chunk_id }}
          --chunk-size ${{ inputs.chunk_size }}
          --summary chunk_summary.json
      - name: Upload chunk summary
        uses: actions/upload-artifact@v4
        with:
          name: chunk-summary
          path: chunk_summary.json
```
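Both chunk workflows pass the same `CHECKPOINT_S3_*` secrets through to the build script. One way such settings might be collected before constructing an S3 client (a sketch only; the real handling inside the scripts is not shown, and `s3_settings` is a made-up helper):

```python
def s3_settings(env: dict[str, str]) -> dict:
    """Collect CHECKPOINT_S3_* variables into a plain settings dict."""
    settings = {
        "bucket": env.get("CHECKPOINT_S3_BUCKET"),
        "prefix": env.get("CHECKPOINT_S3_PREFIX", ""),
        "region_name": env.get("CHECKPOINT_S3_REGION"),
        # Empty string means "use the default AWS endpoint".
        "endpoint_url": env.get("CHECKPOINT_S3_ENDPOINT") or None,
        "aws_access_key_id": env.get("CHECKPOINT_S3_ACCESS_KEY"),
        "aws_secret_access_key": env.get("CHECKPOINT_S3_SECRET_KEY"),
    }
    # Path-style addressing is typically required for non-AWS S3-compatible
    # endpoints (MinIO, R2, etc.), which is presumably why it is configurable.
    flag = env.get("CHECKPOINT_S3_USE_PATH_STYLE", "").lower()
    settings["use_path_style"] = flag in ("1", "true", "yes")
    return settings


example_env = {
    "CHECKPOINT_S3_BUCKET": "corpus-checkpoints",
    "CHECKPOINT_S3_REGION": "us-east-1",
    "CHECKPOINT_S3_USE_PATH_STYLE": "true",
}
```

Keeping the mapping in one place means both the Modal and GitHub CPU scripts can resume a partially built chunk from the same checkpoint bucket.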
