Initial release v1.1.0
- Complete MVP for tracking Fidelity brokerage account performance
- Transaction import from CSV with deduplication
- Automatic FIFO position tracking with options support
- Real-time P&L calculations with market data caching
- Dashboard with timeframe filtering (30/90/180 days, 1 year, YTD, all time)
- Docker-based deployment with PostgreSQL backend
- React/TypeScript frontend with TailwindCSS
- FastAPI backend with SQLAlchemy ORM

Features:
- Multi-account support
- Import via CSV upload or filesystem
- Open and closed position tracking
- Balance history charting
- Performance analytics and metrics
- Top trades analysis
- Responsive UI design

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
.gitignore (new file, 95 lines, vendored)
@@ -0,0 +1,95 @@
```gitignore
# Environment variables
.env
.env.local
.env.*.local

# Python
__pycache__/
*.py[cod]
*.class
*.so
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST
venv/
ENV/
env/

# Node / Frontend
node_modules/
npm-debug.log*
yarn-debug.log*
yarn-error.log*
.pnpm-debug.log*
dist/
dist-ssr/
*.local

# IDEs
.vscode/
.idea/
*.swp
*.swo
*~
.DS_Store

# Database
*.db
*.sqlite
*.sqlite3
postgres_data/

# Docker volumes
imports/*.csv
!imports/.gitkeep

# Logs
*.log
logs/

# Testing
.coverage
htmlcov/
.pytest_cache/
.tox/

# Misc
*.bak
*.tmp
.cache/

# Temporary fix files
*FIX*.md
*FIX*.txt
*FIX*.sh
*fix*.sh
diagnose*.sh
transfer*.sh
rebuild.sh
verify*.sh
apply*.sh
deploy*.sh
emergency*.sh
nuclear*.sh
complete*.sh

# Sample/test CSV files
History_for_Account*.csv

# Diagnostic files
DIAGNOSTIC*.md
SETUP_STATUS.md
```
CHANGELOG.md (new file, 64 lines)
@@ -0,0 +1,64 @@
# Changelog

All notable changes to myFidelityTracker will be documented in this file.

## [Unreleased]

## [1.1.0] - 2026-01-22

### Added
- **Timeframe Filtering on Dashboard**: Users can now filter dashboard metrics and balance history by timeframe
  - Available timeframes: All Time, Last 30 Days, Last 90 Days, Last 180 Days, Last 1 Year, Year to Date
  - Filters both the metrics cards (Total P&L, Win Rate, etc.) and the Balance History chart
  - Implemented in the `DashboardV2.tsx` component
- **Backend Date Filtering**: Added `start_date` and `end_date` parameters to the `/analytics/overview` endpoint
  - Updated the `calculate_account_stats()` method in `PerformanceCalculatorV2` to filter positions by open date
  - Allows the frontend to request statistics for specific date ranges
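The date filtering added in this release can be sketched as follows. This is a simplified, hypothetical illustration of the position filtering that `calculate_account_stats()` performs, not the actual implementation (the real method queries SQLAlchemy models; the dict-based `opened_at` field here is an assumption):

```python
from datetime import date

def filter_positions_by_open_date(positions, start_date=None, end_date=None):
    """Keep only positions whose open date falls inside the optional range."""
    kept = []
    for pos in positions:
        opened = pos["opened_at"]  # hypothetical field name for illustration
        if start_date is not None and opened < start_date:
            continue
        if end_date is not None and opened > end_date:
            continue
        kept.append(pos)
    return kept

positions = [
    {"symbol": "AAPL", "opened_at": date(2026, 1, 5)},
    {"symbol": "TSLA", "opened_at": date(2025, 6, 1)},
]
# A "Year to Date" request would send start_date=2026-01-01 and no end_date
ytd = filter_positions_by_open_date(positions, start_date=date(2026, 1, 1))
```

Omitting both parameters returns every position, which matches the "All Time" timeframe.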
### Changed
- Updated `analyticsApi.getOverview()` to accept optional `start_date` and `end_date` parameters
- Modified the balance history query to dynamically adjust days based on the selected timeframe
- Enhanced the `DashboardV2` component with timeframe state management

### Technical Details
- Files modified:
  - `frontend/src/components/DashboardV2.tsx` - added timeframe filter UI and logic
  - `frontend/src/api/client.ts` - updated API types
  - `backend/app/api/endpoints/analytics_v2.py` - added date parameters to the overview endpoint
  - `backend/app/services/performance_calculator_v2.py` - added date filtering to position queries

## [1.0.0] - 2026-01-21

### Initial Release
- Complete MVP for tracking Fidelity brokerage account performance
- Transaction import from CSV files
- Automatic position tracking with FIFO matching
- Real-time P&L calculations with Yahoo Finance integration
- Dashboard with metrics and charts
- Docker-based deployment
- Support for stocks, calls, and puts
- Deduplication of transactions
- Multi-account support

### Components
- Backend: FastAPI + PostgreSQL + SQLAlchemy
- Frontend: React + TypeScript + TailwindCSS
- Infrastructure: Docker Compose + Nginx

---

## Current Status

**Version**: 1.1.0
**Deployment**: Remote server (starship2) via Docker
**Access**: http://starship2:3000
**Last Updated**: 2026-01-22

## Next Steps

Development priorities for future versions:
1. Additional broker support (Schwab, E*TRADE)
2. Tax reporting features
3. Advanced filtering and analytics
4. User authentication for multi-user support
5. Mobile app development
LINUX_DEPLOYMENT.md (new file, 540 lines)
@@ -0,0 +1,540 @@
# Linux Server Deployment Guide

Complete guide for deploying myFidelityTracker on a Linux server.

## Prerequisites

### Linux Server Requirements
- **OS**: Ubuntu 20.04+, Debian 11+, CentOS 8+, or similar
- **RAM**: 4GB minimum (8GB recommended)
- **Disk**: 20GB free space
- **Network**: Open ports 3000, 8000 (or configure firewall)

### Required Software
- Docker Engine 20.10+
- Docker Compose 1.29+ (or Docker Compose V2)
- Git (optional, for cloning)

## Step 1: Install Docker on Linux

### Ubuntu/Debian

```bash
# Update package index
sudo apt-get update

# Install dependencies
sudo apt-get install -y \
    ca-certificates \
    curl \
    gnupg \
    lsb-release

# Add Docker's official GPG key
sudo mkdir -p /etc/apt/keyrings
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg

# Set up repository
echo \
  "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.gpg] https://download.docker.com/linux/ubuntu \
  $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null

# Install Docker Engine
sudo apt-get update
sudo apt-get install -y docker-ce docker-ce-cli containerd.io docker-compose-plugin

# Add your user to the docker group (optional, to run without sudo)
sudo usermod -aG docker $USER
newgrp docker

# Verify installation
docker --version
docker compose version
```

### CentOS/RHEL

```bash
# Install Docker
sudo yum install -y yum-utils
sudo yum-config-manager --add-repo https://download.docker.com/linux/centos/docker-ce.repo
sudo yum install -y docker-ce docker-ce-cli containerd.io docker-compose-plugin

# Start Docker
sudo systemctl start docker
sudo systemctl enable docker

# Add user to docker group (optional)
sudo usermod -aG docker $USER
newgrp docker

# Verify
docker --version
docker compose version
```

## Step 2: Transfer Files to Linux Server

### Option A: Direct Transfer (from your Mac)

```bash
# From your Mac, transfer the entire project directory
# Replace USER and SERVER_IP with your values
cd /Users/chris/Desktop
scp -r fidelity USER@SERVER_IP:~/

# Example:
# scp -r fidelity ubuntu@192.168.1.100:~/
```

### Option B: Using rsync (faster for updates)

```bash
# From your Mac
rsync -avz --progress /Users/chris/Desktop/fidelity/ USER@SERVER_IP:~/fidelity/

# Exclude node_modules and other large dirs
rsync -avz --progress \
  --exclude 'node_modules' \
  --exclude '__pycache__' \
  --exclude '*.pyc' \
  /Users/chris/Desktop/fidelity/ USER@SERVER_IP:~/fidelity/
```

### Option C: Git (if using version control)

```bash
# On your Linux server
cd ~
git clone YOUR_REPO_URL fidelity
cd fidelity
```

### Option D: Manual ZIP Transfer

```bash
# On your Mac - create zip
cd /Users/chris/Desktop
zip -r fidelity.zip fidelity/ -x "*/node_modules/*" "*/__pycache__/*" "*.pyc"

# Transfer the zip
scp fidelity.zip USER@SERVER_IP:~/

# On Linux server - extract
cd ~
unzip fidelity.zip
```

## Step 3: Configure for Linux Environment

SSH into your Linux server:

```bash
ssh USER@SERVER_IP
cd ~/fidelity
```

### Make scripts executable

```bash
chmod +x start-linux.sh
chmod +x stop.sh
```

### Configure environment variables

```bash
# Create .env file
cp .env.example .env

# Edit .env file to add your server IP for CORS
nano .env  # or use vim, vi, etc.
```

Update the CORS_ORIGINS line:
```env
CORS_ORIGINS=http://localhost:3000,http://YOUR_SERVER_IP:3000
```

Replace `YOUR_SERVER_IP` with your actual server IP address.

### Create imports directory

```bash
mkdir -p imports
```

## Step 4: Start the Application

```bash
# Start all services
./start-linux.sh

# Or manually:
docker-compose up -d
```
The script will:
- Check that Docker is running
- Create the necessary directories
- Start all containers (postgres, backend, frontend)
- Display access URLs

## Step 5: Access the Application

### From the Server Itself
- Frontend: http://localhost:3000
- Backend API: http://localhost:8000
- API Docs: http://localhost:8000/docs

### From Other Computers on the Network
- Frontend: http://YOUR_SERVER_IP:3000
- Backend API: http://YOUR_SERVER_IP:8000
- API Docs: http://YOUR_SERVER_IP:8000/docs

### From the Internet (if the server has a public IP)
First configure the firewall (see the Security section below), then:
- Frontend: http://YOUR_PUBLIC_IP:3000
- Backend API: http://YOUR_PUBLIC_IP:8000

## Step 6: Configure Firewall (Ubuntu/Debian)

```bash
# Allow SSH (important - don't lock yourself out!)
sudo ufw allow 22/tcp

# Allow application ports
sudo ufw allow 3000/tcp  # Frontend
sudo ufw allow 8000/tcp  # Backend API

# Enable firewall
sudo ufw enable

# Check status
sudo ufw status
```

### For CentOS/RHEL (firewalld)

```bash
# Allow ports
sudo firewall-cmd --permanent --add-port=3000/tcp
sudo firewall-cmd --permanent --add-port=8000/tcp
sudo firewall-cmd --reload

# Check status
sudo firewall-cmd --list-all
```

## Step 7: Load Demo Data (Optional)

```bash
# Copy your CSV to the imports directory
cp History_for_Account_X38661988.csv imports/

# Run the seeder
docker-compose exec backend python seed_demo_data.py
```

## Common Linux-Specific Commands

### View Logs
```bash
# All services
docker-compose logs -f

# Specific service
docker-compose logs -f backend
docker-compose logs -f frontend
docker-compose logs -f postgres

# Last 100 lines
docker-compose logs --tail=100
```

### Check Container Status
```bash
docker-compose ps
docker ps
```

### Restart Services
```bash
docker-compose restart
docker-compose restart backend
```

### Stop Application
```bash
./stop.sh
# or
docker-compose down
```

### Update Application
```bash
# Stop containers
docker-compose down

# Pull latest code (if using git)
git pull

# Rebuild and restart
docker-compose up -d --build
```

### Access Database
```bash
docker-compose exec postgres psql -U fidelity -d fidelitytracker
```

### Shell Access to Containers
```bash
# Backend shell
docker-compose exec backend bash

# Frontend shell
docker-compose exec frontend sh

# Database shell
docker-compose exec postgres bash
```

## Troubleshooting

### Port Already in Use
```bash
# Check what's using the port
sudo lsof -i :3000
sudo lsof -i :8000
sudo lsof -i :5432

# Or use netstat
sudo netstat -tlnp | grep 3000

# Kill the process
sudo kill <PID>
```

### Permission Denied Errors
```bash
# If you get permission errors with Docker
sudo usermod -aG docker $USER
newgrp docker

# If the import directory has permission issues
sudo chown -R $USER:$USER imports/
chmod 755 imports/
```

### Docker Out of Space
```bash
# Clean up unused containers, images, volumes
docker system prune -a

# Remove only dangling images
docker image prune
```

### Services Won't Start
```bash
# Check Docker is running
sudo systemctl status docker
sudo systemctl start docker

# Check logs for errors
docker-compose logs

# Rebuild from scratch
docker-compose down -v
docker-compose up -d --build
```

### Cannot Access from Other Computers
```bash
# Check firewall
sudo ufw status
sudo firewall-cmd --list-all

# Check if services are listening on all interfaces
sudo netstat -tlnp | grep 3000
# Should show 0.0.0.0:3000, not 127.0.0.1:3000

# Update CORS in .env
nano .env
# Add your server IP to CORS_ORIGINS
```
## Production Deployment (Optional)

### Use Docker Compose in Production Mode

Create `docker-compose.prod.yml`:

```yaml
version: '3.8'

services:
  postgres:
    restart: always

  backend:
    restart: always
    environment:
      - POSTGRES_PASSWORD=${POSTGRES_PASSWORD}  # Use a strong password

  frontend:
    restart: always
```

Start with:
```bash
docker-compose -f docker-compose.yml -f docker-compose.prod.yml up -d
```

### Set Up as a System Service (Systemd)

Create `/etc/systemd/system/fidelity-tracker.service`:

```ini
[Unit]
Description=myFidelityTracker
Requires=docker.service
After=docker.service

[Service]
Type=oneshot
RemainAfterExit=yes
WorkingDirectory=/home/YOUR_USER/fidelity
ExecStart=/usr/bin/docker-compose up -d
ExecStop=/usr/bin/docker-compose down
TimeoutStartSec=0

[Install]
WantedBy=multi-user.target
```

Enable and start:
```bash
sudo systemctl daemon-reload
sudo systemctl enable fidelity-tracker
sudo systemctl start fidelity-tracker
sudo systemctl status fidelity-tracker
```

### Enable HTTPS with an Nginx Reverse Proxy

Install Nginx:
```bash
sudo apt-get install nginx certbot python3-certbot-nginx
```

Configure `/etc/nginx/sites-available/fidelity`:
```nginx
server {
    listen 80;
    server_name your-domain.com;

    location / {
        proxy_pass http://localhost:3000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }

    location /api {
        proxy_pass http://localhost:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```

Enable the site and get SSL:
```bash
sudo ln -s /etc/nginx/sites-available/fidelity /etc/nginx/sites-enabled/
sudo nginx -t
sudo systemctl restart nginx
sudo certbot --nginx -d your-domain.com
```

### Back Up the Database

```bash
# Create backup script
cat > backup-db.sh << 'EOF'
#!/bin/bash
DATE=$(date +%Y%m%d_%H%M%S)
docker-compose exec -T postgres pg_dump -U fidelity fidelitytracker > backup_$DATE.sql
gzip backup_$DATE.sql
echo "Backup created: backup_$DATE.sql.gz"
EOF

chmod +x backup-db.sh

# Run backup
./backup-db.sh

# Schedule with cron (daily at 2 AM)
crontab -e
# Add: 0 2 * * * /home/YOUR_USER/fidelity/backup-db.sh
```

## Security Best Practices

1. **Change default passwords** in `.env`
2. **Use a firewall** to restrict access
3. **Enable HTTPS** for production
4. **Take regular backups** of the database
5. **Keep Docker updated**: `sudo apt-get update && sudo apt-get upgrade`
6. **Monitor logs** for suspicious activity
7. **Use strong passwords** for PostgreSQL
8. **Don't expose ports** to the internet unless necessary
## Performance Optimization

### Cap Docker Log Growth

Edit `/etc/docker/daemon.json` to rotate container logs before they fill the disk:
```json
{
  "log-driver": "json-file",
  "log-opts": {
    "max-size": "10m",
    "max-file": "3"
  }
}
```
Restart Docker:
```bash
sudo systemctl restart docker
```

### Monitor Resources

```bash
# Container resource usage
docker stats

# System resources
htop
free -h
df -h
```

## Summary

Your app is now running on Linux! The main differences from macOS:
- Use `start-linux.sh` instead of `start.sh`
- Configure the firewall for remote access
- CORS needs your server IP
- Use `systemctl` for Docker management

The application itself runs identically - Docker handles all the platform differences.

---

**Questions?** Check the main README.md or run `docker-compose logs` to diagnose issues.
LINUX_QUICK_REFERENCE.txt (new file, 101 lines)
@@ -0,0 +1,101 @@
```text
════════════════════════════════════════════════════════════════
  myFidelityTracker - Linux Deployment Quick Reference
════════════════════════════════════════════════════════════════

📦 TRANSFER TO LINUX SERVER
────────────────────────────────────────────────────────────────
From your Mac:
  scp -r /Users/chris/Desktop/fidelity USER@SERVER_IP:~/

Or with rsync:
  rsync -avz /Users/chris/Desktop/fidelity/ USER@SERVER_IP:~/fidelity/

════════════════════════════════════════════════════════════════

🚀 FIRST-TIME SETUP ON LINUX
────────────────────────────────────────────────────────────────
ssh USER@SERVER_IP
cd ~/fidelity

# Make scripts executable
chmod +x start-linux.sh stop.sh

# Configure CORS (edit .env file)
cp .env.example .env
nano .env
# Change: CORS_ORIGINS=http://localhost:3000,http://YOUR_SERVER_IP:3000

# Start the app
./start-linux.sh

════════════════════════════════════════════════════════════════

🌐 ACCESS URLs
────────────────────────────────────────────────────────────────
From the server:      http://localhost:3000
From other computers: http://SERVER_IP:3000
API documentation:    http://SERVER_IP:8000/docs

════════════════════════════════════════════════════════════════

🔥 FIREWALL SETUP (Ubuntu)
────────────────────────────────────────────────────────────────
sudo ufw allow 22/tcp
sudo ufw allow 3000/tcp
sudo ufw allow 8000/tcp
sudo ufw enable

════════════════════════════════════════════════════════════════

📝 DAILY COMMANDS
────────────────────────────────────────────────────────────────
Start:     ./start-linux.sh
Stop:      ./stop.sh
View logs: docker-compose logs -f
Status:    docker-compose ps
Restart:   docker-compose restart

════════════════════════════════════════════════════════════════

🌱 LOAD DEMO DATA
────────────────────────────────────────────────────────────────
cp History_for_Account_X38661988.csv imports/
docker-compose exec backend python seed_demo_data.py

════════════════════════════════════════════════════════════════

⚙️ WHAT CHANGED FROM macOS?
────────────────────────────────────────────────────────────────
✓ Use start-linux.sh (not start.sh)
✓ Add server IP to CORS_ORIGINS in .env
✓ Configure firewall to allow ports 3000, 8000
✓ Everything else works the same!

════════════════════════════════════════════════════════════════

🆘 TROUBLESHOOTING
────────────────────────────────────────────────────────────────
Port in use:
  sudo lsof -i :3000
  sudo kill <PID>

Can't access from other computers:
  1. Check firewall: sudo ufw status
  2. Check CORS in .env has your server IP
  3. Verify services running: docker-compose ps

Permission errors:
  sudo usermod -aG docker $USER
  newgrp docker

Out of space:
  docker system prune -a

════════════════════════════════════════════════════════════════

📚 FULL DOCUMENTATION
────────────────────────────────────────────────────────────────
See LINUX_DEPLOYMENT.md for the complete guide
See README.md for full application documentation

════════════════════════════════════════════════════════════════
```
PROJECT_SUMMARY.md (new file, 193 lines)
@@ -0,0 +1,193 @@
# myFidelityTracker - Project Summary

## Overview
Complete MVP for tracking Fidelity brokerage account performance with transaction import, position tracking, and real-time P&L calculations.

## What's Been Built

### ✅ Backend (Python/FastAPI)
- **Database Models**: Account, Transaction, Position (with junction tables)
- **CSV Parser**: Fidelity-specific parser with deduplication
- **Services**:
  - Import Service (file upload + filesystem import)
  - Position Tracker (FIFO matching, options support)
  - Performance Calculator (with Yahoo Finance integration)
- **API Endpoints**:
  - Accounts (CRUD)
  - Transactions (list, filter, pagination)
  - Positions (open/closed, stats)
  - Analytics (overview, balance history, top trades)
  - Import (upload + filesystem)
- **Database**: PostgreSQL with Alembic migrations
- **Features**: Deduplication, real-time P&L, market data caching

### ✅ Frontend (React/TypeScript)
- **Components**:
  - Dashboard (metrics cards + charts)
  - Account Manager (create/list/delete accounts)
  - Import Dropzone (drag-drop + filesystem import)
  - Transaction Table (filterable, sortable)
  - Position Cards (open/closed with P&L)
  - Performance Chart (balance over time)
  - Metrics Cards (KPIs)
- **Styling**: TailwindCSS with Robinhood-inspired design
- **State Management**: React Query for data fetching
- **Routing**: Tab-based navigation

### ✅ Infrastructure
- **Docker Compose**: Multi-container setup (postgres, backend, frontend)
- **Nginx**: Reverse proxy for SPA routing + API proxying
- **Multi-arch**: Supports amd64 and arm64
- **Volumes**: Persistent database + import directory
- **Health Checks**: Service readiness monitoring

### ✅ Developer Experience
- **Documentation**:
  - README.md (comprehensive guide)
  - QUICKSTART.md (2-minute setup)
  - API docs (auto-generated at /docs)
- **Scripts**:
  - start.sh (automated startup with health checks)
  - stop.sh (graceful shutdown)
  - seed_demo_data.py (demo data loader)
- **Environment**: .env.example template
- **Git**: .gitignore configured

## Key Features

### Transaction Management
- Import via CSV upload or filesystem
- Automatic deduplication using SHA-256 hashing
- Support for stocks, calls, puts
- Handles assignments, expirations, rolls
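As a rough sketch of how SHA-256 deduplication can work on import — this is an illustration under assumptions, not the project's actual parser, and the CSV field names shown are hypothetical:

```python
import hashlib

def transaction_fingerprint(row: dict) -> str:
    """Build a stable SHA-256 fingerprint from a parsed CSV row."""
    # Sort keys so the column order in the CSV cannot change the hash
    key = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(key.encode("utf-8")).hexdigest()

def import_rows(rows, seen=None):
    """Insert only rows whose fingerprint has not been seen before."""
    seen = set() if seen is None else seen
    imported = []
    for row in rows:
        fp = transaction_fingerprint(row)
        if fp in seen:
            continue  # duplicate, e.g. from a re-imported file
        seen.add(fp)
        imported.append(row)
    return imported

rows = [
    {"date": "01/15/2026", "symbol": "AAPL", "qty": "10", "price": "185.50"},
    {"date": "01/15/2026", "symbol": "AAPL", "qty": "10", "price": "185.50"},
]
unique = import_rows(rows)  # the second, identical row is dropped
```

Persisting the fingerprint alongside each transaction (e.g. in a unique-indexed column) makes re-importing the same CSV a no-op.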
### Position Tracking
- Automatic FIFO matching
- Multi-leg position support
- Open vs. closed positions
- Partial position closes
- Average entry/exit prices
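A minimal sketch of FIFO matching for a single symbol, simplified to long stock trades (the real tracker also covers options, multi-leg positions, and per-position partial closes):

```python
from collections import deque

def fifo_realized_pnl(trades):
    """Match each sell against the earliest open buy lots (FIFO)."""
    lots = deque()   # open lots as mutable [qty, price] pairs
    realized = 0.0
    for side, qty, price in trades:
        if side == "BUY":
            lots.append([qty, price])
        else:  # SELL: consume the oldest lots first
            remaining = qty
            while remaining > 0:
                lot = lots[0]
                matched = min(remaining, lot[0])
                realized += matched * (price - lot[1])
                lot[0] -= matched
                remaining -= matched
                if lot[0] == 0:
                    lots.popleft()  # lot fully closed
    return realized

# 100 shares sold from the $10 lot, 20 from the $12 lot
pnl = fifo_realized_pnl([("BUY", 100, 10.0), ("BUY", 50, 12.0), ("SELL", 120, 15.0)])
```

Whatever is left in `lots` after processing is the open position, from which an average entry price can be derived.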
### Performance Analytics
- Realized P&L (closed positions)
- Unrealized P&L (open positions with live prices)
- Win rate calculation
- Average win/loss metrics
- Top trades analysis
- Balance history charting
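The win-rate and average win/loss metrics reduce to simple arithmetic over realized P&L per closed position. A sketch (breakeven trades are excluded from the win rate here, which is an assumption and may differ from the app's actual convention):

```python
def performance_stats(realized_pnls):
    """Compute win rate and average win/loss from per-position realized P&L."""
    wins = [p for p in realized_pnls if p > 0]
    losses = [p for p in realized_pnls if p < 0]
    total = len(wins) + len(losses)
    return {
        "win_rate": len(wins) / total if total else 0.0,
        "avg_win": sum(wins) / len(wins) if wins else 0.0,
        "avg_loss": sum(losses) / len(losses) if losses else 0.0,
    }

stats = performance_stats([120.0, -40.0, 75.0, -10.0])
# 2 wins out of 4 closed positions -> 50% win rate
```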
|
||||||
|
|
||||||
|
### User Experience
|
||||||
|
- Clean, modern UI (Robinhood-inspired)
|
||||||
|
- Mobile-responsive design
|
||||||
|
- Real-time data updates
|
||||||
|
- Intuitive navigation
|
||||||
|
- Error handling with user feedback
|
||||||
|
|
||||||
|
## Architecture
|
||||||
|
|
||||||
|
### Data Flow
|
||||||
|
```
|
||||||
|
CSV File → Parser → Deduplication → Database (Transactions)
|
||||||
|
↓
|
||||||
|
Position Tracker (FIFO)
|
||||||
|
↓
|
||||||
|
Positions DB
|
||||||
|
↓
|
||||||
|
Performance Calculator + Yahoo Finance
|
||||||
|
↓
|
||||||
|
Analytics API
|
||||||
|
↓
|
||||||
|
React Frontend
|
||||||
|
```
|
||||||
|
|
||||||
|
### Tech Stack
|
||||||
|
- **Backend**: Python 3.11, FastAPI, SQLAlchemy, PostgreSQL, Pandas, yfinance
|
||||||
|
- **Frontend**: React 18, TypeScript, Vite, TailwindCSS, React Query, Recharts
|
||||||
|
- **Infrastructure**: Docker, Docker Compose, Nginx
|
||||||
|
|
||||||
|
## File Structure
```
fidelity/
├── backend/
│   ├── app/
│   │   ├── api/endpoints/   # API routes
│   │   ├── models/          # Database models
│   │   ├── schemas/         # Pydantic schemas
│   │   ├── services/        # Business logic
│   │   ├── parsers/         # CSV parsers
│   │   └── utils/           # Helper functions
│   ├── alembic/             # DB migrations
│   ├── Dockerfile
│   ├── requirements.txt
│   └── seed_demo_data.py
├── frontend/
│   ├── src/
│   │   ├── components/      # React components
│   │   ├── api/             # API client
│   │   ├── types/           # TypeScript types
│   │   └── styles/          # CSS
│   ├── Dockerfile
│   ├── nginx.conf
│   └── package.json
├── imports/                 # CSV import directory
├── docker-compose.yml
├── start.sh
├── stop.sh
├── README.md
├── QUICKSTART.md
└── .env.example
```

## Getting Started

### Quick Start
```bash
cd /Users/chris/Desktop/fidelity
./start.sh
```

### Access
- Frontend: http://localhost:3000
- Backend: http://localhost:8000
- API Docs: http://localhost:8000/docs

### Demo Data
```bash
cp History_for_Account_X38661988.csv imports/
docker-compose exec backend python seed_demo_data.py
```

## Testing Checklist

### ✅ To Test
1. Start application (`./start.sh`)
2. Create account via UI
3. Import sample CSV
4. Verify transactions imported
5. Check positions calculated
6. View dashboard metrics
7. Test filters and sorting
8. Verify P&L calculations
9. Check responsive design
10. Test re-import (deduplication)

## Future Enhancements
- [ ] Additional brokerages (Schwab, E*TRADE, Robinhood)
- [ ] Authentication/multi-user
- [ ] Tax reporting (wash sales, capital gains)
- [ ] Email notifications
- [ ] Dark mode
- [ ] PDF export
- [ ] AI trade recommendations
- [ ] Backtesting

## Notes
- Uses FIFO for position matching
- Market data cached for 60 seconds
- Options pricing uses Yahoo Finance (may not be perfect)
- Designed for personal use (single-user)

---

**Status**: ✅ MVP Complete and Ready for Testing
**Last Updated**: January 2026

37
QUICKSTART.md
Normal file
@@ -0,0 +1,37 @@
# Quick Start - Fix Yahoo Finance Rate Limiting

## The Problem
Your dashboard is hitting Yahoo Finance rate limits (HTTP 429 errors) and taking forever to load.

## The Fix
A complete solution with database-backed caching, rate limiting, and instant dashboard loading.

## Deploy in 3 Minutes

### Step 1: Transfer Files (on your Mac)
```bash
cd /Users/chris/Desktop/fidelity
./deploy-rate-limiting-fix.sh
```

### Step 2: Apply Fix (on your Linux server)
```bash
ssh pi@starship2
cd ~/fidelity
./apply-rate-limiting-patches.sh
docker compose down
docker compose build --no-cache backend frontend
docker compose up -d
sleep 30
docker compose exec backend alembic upgrade head
```

### Step 3: Test
Open http://starship2:3000 - the dashboard should load instantly.

## What You Get

Before: ❌ 30+ second load, 429 errors, timeouts

After: ✅ <1 second load, cached prices, no errors

See RATE_LIMITING_SOLUTION.md for full details.

363
RATE_LIMITING_SOLUTION.md
Normal file
@@ -0,0 +1,363 @@
# Rate Limiting & Caching Solution for Yahoo Finance API

## Problem

The Yahoo Finance API has rate limits and was returning **HTTP 429 (Too Many Requests)** errors when the dashboard loaded. The dashboard would:
1. Fetch prices for every open position synchronously
2. Block the UI until all prices were loaded
3. Hit rate limits quickly with multiple open positions
4. Lose all cached data on container restart (in-memory cache only)

## Solution Overview

Implemented a multi-layered approach:

1. **Database-backed price cache** - Persistent across restarts
2. **Rate limiting with exponential backoff** - Respects Yahoo Finance limits
3. **Batch processing** - Fetches multiple prices efficiently
4. **Stale-while-revalidate pattern** - UI shows cached data immediately
5. **Background refresh** - Optional manual price updates
6. **Configurable API call limits** - Control how many API calls to make

## Architecture

### New Components

#### 1. `MarketPrice` Model (`backend/app/models/market_price.py`)
Database table that caches prices with timestamps:
```
- symbol: Stock ticker (indexed, unique)
- price: Current price
- fetched_at: When the price was fetched
- source: Data source (yahoo_finance)
```

#### 2. `MarketDataService` (`backend/app/services/market_data_service.py`)
Core service handling all market data.

**Features:**
- **Database caching**: Stores prices in PostgreSQL
- **Rate limiting**: 500ms delay between requests, with exponential backoff on 429 errors
- **Retry logic**: Up to 3 retries with increasing delays
- **Batch fetching**: `get_prices_batch()` fetches multiple symbols efficiently
- **Stale data support**: Returns old cached data if a fresh fetch fails
- **Background refresh**: `refresh_stale_prices()` for periodic maintenance

**Key Methods:**
```python
get_price(symbol, allow_stale=True)
# Returns cached price if fresh, or fetches from Yahoo

get_prices_batch(symbols, allow_stale=True, max_fetches=10)
# Fetches multiple symbols with rate limiting

refresh_stale_prices(min_age_seconds=300, limit=20)
# Background task to refresh old prices
```
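
The cache-first, stale-fallback behavior behind these methods can be sketched in a few lines. This is a minimal, in-memory illustration (the real service is backed by PostgreSQL and yfinance; the `PriceCache` name and `fetcher` callable are hypothetical stand-ins):

```python
import time

class PriceCache:
    """Minimal in-memory sketch of the stale-while-revalidate price cache."""

    def __init__(self, fetcher, ttl_seconds=300):
        self.fetcher = fetcher      # callable: symbol -> price (may raise on failure)
        self.ttl = ttl_seconds
        self._cache = {}            # symbol -> (price, fetched_at)

    def get_price(self, symbol, allow_stale=True, now=None):
        now = time.time() if now is None else now
        entry = self._cache.get(symbol)
        if entry and now - entry[1] < self.ttl:
            return entry[0]         # fresh cache hit
        try:
            price = self.fetcher(symbol)
        except Exception:
            if allow_stale and entry:
                return entry[0]     # upstream failed: fall back to stale data
            return None
        self._cache[symbol] = (price, now)
        return price

    def get_prices_batch(self, symbols, allow_stale=True, max_fetches=10, now=None):
        """Serve fresh entries from cache; fetch at most max_fetches misses."""
        now = time.time() if now is None else now
        results, fetches = {}, 0
        for sym in symbols:
            entry = self._cache.get(sym)
            if entry and now - entry[1] < self.ttl:
                results[sym] = entry[0]
            elif fetches < max_fetches:
                results[sym] = self.get_price(sym, allow_stale, now)
                fetches += 1
            else:
                results[sym] = entry[0] if (allow_stale and entry) else None
        return results
```

Note how `max_fetches` caps upstream calls per batch: symbols beyond the limit get stale data (or `None`) rather than triggering more requests.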

#### 3. `PerformanceCalculatorV2` (`backend/app/services/performance_calculator_v2.py`)
Enhanced calculator using `MarketDataService`.

**Features:**
- Batch price fetching for all open positions
- Configurable API call limits
- Returns cache statistics
- Non-blocking operation

**Key Changes:**
```python
calculate_account_stats(
    account_id,
    update_prices=True,   # Set to False to use only the cache
    max_api_calls=10      # Limit Yahoo Finance API calls
)
```

#### 4. Enhanced Analytics Endpoints (`backend/app/api/endpoints/analytics_v2.py`)

**New/Updated Endpoints:**

```
GET /api/analytics/overview/{account_id}?refresh_prices=false&max_api_calls=5
# Default: uses cached prices only (fast!)
# Set refresh_prices=true to fetch fresh data

POST /api/analytics/refresh-prices/{account_id}?max_api_calls=10
# Manual refresh - waits for completion

POST /api/analytics/refresh-prices-background/{account_id}?max_api_calls=20
# Background refresh - returns immediately

POST /api/analytics/refresh-stale-cache?min_age_minutes=10&limit=20
# Maintenance endpoint for periodic cache refresh

DELETE /api/analytics/clear-old-cache?older_than_days=30
# Clean up old cached prices
```

#### 5. `DashboardV2` Component (`frontend/src/components/DashboardV2.tsx`)

**Features:**
- **Instant loading**: Shows cached data immediately
- **Data freshness indicator**: Shows when data was last updated
- **Manual refresh button**: User can trigger a fresh price fetch
- **Cache statistics**: Shows how many prices were cached vs fetched
- **Background updates**: Refetches on window focus
- **Stale-while-revalidate**: Keeps old data visible while fetching new data

**User Experience:**
1. Dashboard loads instantly with cached prices
2. User sees "Last updated: 2m ago"
3. Click "Refresh Prices" to get fresh data
4. Background spinner shows refresh in progress
5. Data updates when the refresh completes

## How It Works

### First Load (No Cache)
```
1. User opens dashboard
2. Frontend calls GET /api/analytics/overview/{id}?refresh_prices=false
3. Backend checks database cache - empty
4. Returns stats with unrealized_pnl = null for open positions
5. Dashboard shows data immediately (without prices)
6. User clicks "Refresh Prices"
7. Fetches first 10 symbols from Yahoo Finance
8. Caches results in database
9. Updates dashboard with fresh prices
```

### Subsequent Loads (With Cache)
```
1. User opens dashboard
2. Frontend calls GET /api/analytics/overview/{id}?refresh_prices=false
3. Backend checks database cache - HIT!
4. Returns stats with cached prices (instant!)
5. Dashboard shows: "Last updated: 3m ago | 📦 8 cached"
6. User can optionally click "Refresh Prices" for fresh data
```

### Background Refresh
```
1. Cron job calls POST /api/analytics/refresh-stale-cache
2. Finds prices older than 10 minutes
3. Refreshes up to 20 prices with rate limiting
4. Next dashboard load has a fresher cache
```

## Configuration

### Backend Settings (`backend/app/config.py`)
```python
MARKET_DATA_CACHE_TTL: int = 300  # 5 minutes (adjust as needed)
```

### Frontend Settings (`frontend/src/components/DashboardV2.tsx`)
```typescript
staleTime: 30000,             // Keep cache for 30 seconds
refetchOnWindowFocus: true,   // Auto-refresh when the user returns
```

### Per-Request Controls
```typescript
// Fast load with cached data only
analyticsApi.getOverview(accountId, {
  refresh_prices: false,
  max_api_calls: 0
})

// Fresh data with limited API calls
analyticsApi.getOverview(accountId, {
  refresh_prices: true,
  max_api_calls: 10  // Fetch up to 10 symbols
})
```

## Rate Limiting Strategy

The `MarketDataService` implements smart rate limiting:

1. **Initial delay**: 500ms between requests
2. **Exponential backoff**: Doubles the delay on 429 errors (up to a 10s max)
3. **Gradual recovery**: Decreases the delay by 10% on successful requests
4. **Retry logic**: Up to 3 retries with increasing delays

Example flow:
```
Request 1: Success (500ms delay)
Request 2: Success (450ms delay)
Request 3: 429 Error (delay → 900ms)
Request 3 retry 1: 429 Error (delay → 1800ms)
Request 3 retry 2: Success (delay → 1620ms)
Request 4: Success (delay → 1458ms)
...gradually returns to 500ms
```
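
The delay adjustment in the flow above can be sketched as a small helper. This is an illustrative sketch only (the class name is hypothetical; the real logic lives inside `MarketDataService`):

```python
class BackoffDelay:
    """Adaptive request delay: doubles on 429, decays 10% on success (sketch)."""

    def __init__(self, initial=0.5, maximum=10.0):
        self.maximum = maximum
        self.delay = initial  # seconds; start at the 500ms initial delay

    def on_rate_limited(self):
        # 429 received: double the delay, capped at the maximum
        self.delay = min(self.delay * 2, self.maximum)
        return self.delay

    def on_success(self):
        # Successful request: decay the delay by 10%
        self.delay = self.delay * 0.9
        return self.delay
```

Walking it through the example flow: success → 450ms, then two 429s → 900ms → 1800ms, then successes → 1620ms → 1458ms.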

## Database Migration

Run the migration to add the `market_prices` table:
```bash
docker compose exec backend alembic upgrade head
```

## Deployment Steps

### 1. Transfer new files to the server:
```bash
# On Mac
cd /Users/chris/Desktop/fidelity

# Backend files
scp backend/app/models/market_price.py pi@starship2:~/fidelity/backend/app/models/
scp backend/app/services/market_data_service.py pi@starship2:~/fidelity/backend/app/services/
scp backend/app/services/performance_calculator_v2.py pi@starship2:~/fidelity/backend/app/services/
scp backend/app/api/endpoints/analytics_v2.py pi@starship2:~/fidelity/backend/app/api/endpoints/
scp backend/alembic/versions/add_market_prices_table.py pi@starship2:~/fidelity/backend/alembic/versions/
scp backend/app/models/__init__.py pi@starship2:~/fidelity/backend/app/models/

# Frontend files
scp frontend/src/components/DashboardV2.tsx pi@starship2:~/fidelity/frontend/src/components/
scp frontend/src/api/client.ts pi@starship2:~/fidelity/frontend/src/api/
```

### 2. Update main.py to use the new analytics router:
```python
# backend/app/main.py
from app.api.endpoints import analytics_v2

app.include_router(
    analytics_v2.router,
    prefix=f"{settings.API_V1_PREFIX}/analytics",
    tags=["analytics"]
)
```

### 3. Update App.tsx to use DashboardV2:
```typescript
// frontend/src/App.tsx
import DashboardV2 from './components/DashboardV2';

// Replace <Dashboard /> with <DashboardV2 />
```

### 4. Run migration and rebuild:
```bash
ssh pi@starship2
cd ~/fidelity

# Stop containers
docker compose down

# Rebuild
docker compose build --no-cache backend frontend

# Start
docker compose up -d

# Run migration
docker compose exec backend alembic upgrade head

# Verify the table was created
docker compose exec postgres psql -U fidelity -d fidelitytracker -c "\d market_prices"
```

## Testing

### Test the improved dashboard:
```bash
# 1. Open dashboard - should load instantly with cached data
open http://starship2:3000

# 2. Check logs - should see cache HITs, not Yahoo Finance requests
docker compose logs backend | grep -i "cache\|yahoo"

# 3. Click the "Refresh Prices" button
# Should see rate-limited requests in the logs

# 4. Check the database cache
docker compose exec postgres psql -U fidelity -d fidelitytracker -c "SELECT symbol, price, fetched_at FROM market_prices ORDER BY fetched_at DESC LIMIT 10;"
```

### Test API endpoints directly:
```bash
# Fast load with cache only
curl "http://localhost:8000/api/analytics/overview/1?refresh_prices=false&max_api_calls=0"

# Fresh data with limited API calls
curl "http://localhost:8000/api/analytics/overview/1?refresh_prices=true&max_api_calls=5"

# Manual refresh
curl -X POST "http://localhost:8000/api/analytics/refresh-prices/1?max_api_calls=10"

# Background refresh (returns immediately)
curl -X POST "http://localhost:8000/api/analytics/refresh-prices-background/1?max_api_calls=15"
```

## Benefits

### Before:
- ❌ Dashboard blocked for 30+ seconds
- ❌ Hit rate limits constantly (429 errors)
- ❌ Lost all cache data on restart
- ❌ No way to control API usage
- ❌ Poor user experience

### After:
- ✅ Dashboard loads instantly (<1 second)
- ✅ Respects rate limits with exponential backoff
- ✅ Persistent cache across restarts
- ✅ Configurable API call limits
- ✅ Shows stale data while refreshing
- ✅ Manual refresh option
- ✅ Background updates
- ✅ Cache statistics visible to the user

## Maintenance

### Periodic cache refresh (optional):
```bash
# Add to crontab for periodic refresh
*/10 * * * * curl -X POST "http://localhost:8000/api/analytics/refresh-stale-cache?min_age_minutes=10&limit=20"
```

### Clear old cache:
```bash
# Monthly cleanup
curl -X DELETE "http://localhost:8000/api/analytics/clear-old-cache?older_than_days=30"
```

## Future Enhancements

1. **WebSocket updates**: Push price updates to the frontend in real time
2. **Batch updates**: Update all accounts' prices in a background job
3. **Multiple data sources**: Fall back to alternative APIs if Yahoo fails
4. **Historical caching**: Store price history for charting
5. **Smart refresh**: Only refresh prices during market hours

## Troubleshooting

### Still getting 429 errors:
- Increase `_rate_limit_delay` in `MarketDataService`
- Decrease `max_api_calls` in API requests
- Use a longer `cache_ttl` (e.g., 600 seconds = 10 minutes)

### Dashboard shows old data:
- Check the `cache_ttl` setting
- Click the "Refresh Prices" button
- Check the database: `SELECT * FROM market_prices;`

### Prices not updating:
- Check backend logs for errors
- Verify the migration ran: `\d market_prices` in postgres
- Check that symbols are valid (Yahoo Finance format)

## Summary

This solution provides a production-ready approach to handling rate-limited APIs, with:
- A fast, responsive UI
- Persistent caching
- Graceful degradation
- User control
- Clear feedback

Users get instant dashboard loads with cached data, and can optionally refresh for the latest prices without blocking the UI.

420
README.md
Normal file
@@ -0,0 +1,420 @@
# myFidelityTracker

A modern web application for tracking and analyzing Fidelity brokerage account performance. Track individual trades, calculate P&L, and gain insights into your trading performance over time.

  

## Features

### Core Features
- **Multi-Account Support**: Manage multiple brokerage accounts in one place
- **CSV Import**: Import Fidelity transaction history via CSV upload or the filesystem
- **Automatic Deduplication**: Prevents duplicate transactions when re-importing files
- **Position Tracking**: Automatically matches opening and closing transactions using FIFO
- **Real-Time P&L**: Calculate both realized and unrealized profit/loss with live market data
- **Performance Analytics**: View win rate, average win/loss, and top-performing trades
- **Interactive Dashboard**: Beautiful Robinhood-inspired UI with charts and metrics
- **Responsive Design**: Works seamlessly on desktop, tablet, and mobile

### Technical Features
- **Docker Deployment**: One-command setup with Docker Compose
- **Multi-Architecture**: Supports both amd64 and arm64 platforms
- **RESTful API**: FastAPI backend with automatic OpenAPI documentation
- **Type Safety**: Full TypeScript frontend for robust development
- **Database Migrations**: Alembic for a version-controlled database schema
- **Market Data Integration**: Yahoo Finance API for current stock prices

## Screenshots

### Dashboard
View your account overview with key metrics and performance charts.

### Transaction History
Browse and filter all your transactions with advanced search.

### Import Interface
Drag-and-drop CSV files or import from the filesystem.

## Tech Stack

### Backend
- **FastAPI** - Modern Python web framework
- **SQLAlchemy** - SQL toolkit and ORM
- **PostgreSQL** - Relational database
- **Alembic** - Database migrations
- **Pandas** - Data manipulation and CSV parsing
- **yfinance** - Real-time market data

### Frontend
- **React 18** - UI library
- **TypeScript** - Type-safe JavaScript
- **Vite** - Fast build tool
- **TailwindCSS** - Utility-first CSS framework
- **React Query** - Data fetching and caching
- **Recharts** - Charting library
- **React Dropzone** - File upload component

### Infrastructure
- **Docker** - Containerization
- **Docker Compose** - Multi-container orchestration
- **Nginx** - Web server and reverse proxy
- **PostgreSQL 16** - Database server

## Quick Start

### Prerequisites
- Docker Desktop (or Docker Engine + Docker Compose)
- 4GB+ RAM available
- Ports 3000, 8000, and 5432 available

### Installation

1. **Clone or download this repository**
   ```bash
   cd /path/to/fidelity
   ```

2. **Place your sample CSV file** (optional, for demo data)
   ```bash
   cp History_for_Account_X38661988.csv imports/
   ```

3. **Start the application**
   ```bash
   docker-compose up -d
   ```

   This will:
   - Build the backend, frontend, and database containers
   - Run database migrations
   - Start all services

4. **Seed demo data** (optional)
   ```bash
   docker-compose exec backend python seed_demo_data.py
   ```

5. **Access the application**
   - Frontend: http://localhost:3000
   - Backend API: http://localhost:8000
   - API Docs: http://localhost:8000/docs

### First-Time Setup

1. **Create an Account**
   - Navigate to the "Accounts" tab
   - Click "Add Account"
   - Enter your account details

2. **Import Transactions**
   - Go to the "Import" tab
   - Either:
     - Drag and drop a Fidelity CSV file, or
     - Place CSV files in the `./imports` directory and click "Import from Filesystem"

3. **View Dashboard**
   - Return to the "Dashboard" tab to see your portfolio performance

## Usage Guide

### Importing Transactions

#### CSV Upload (Recommended)
1. Navigate to the Import tab
2. Drag and drop your Fidelity CSV file, or click to browse
3. The system will automatically:
   - Parse the CSV
   - Deduplicate existing transactions
   - Calculate positions
   - Update P&L metrics

#### Filesystem Import
1. Copy CSV files to the `./imports` directory on your host machine
2. Navigate to the Import tab
3. Click "Import from Filesystem"
4. All CSV files in the directory will be processed

### Understanding Positions

The application automatically tracks positions using FIFO (First-In-First-Out) logic:

- **Open Positions**: Currently held positions with unrealized P&L
- **Closed Positions**: Fully exited positions with realized P&L
- **Options**: Supports calls and puts, including assignments and expirations
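
FIFO matching means each sell is matched against the oldest remaining buy lots first. A minimal sketch of the idea (illustrative only, not the app's actual position-tracking code):

```python
def fifo_realized_pnl(buys, sells):
    """Match sells against the oldest buy lots first (FIFO) and sum realized P&L.

    buys and sells are lists of (quantity, price) tuples in chronological order.
    """
    lots = [list(lot) for lot in buys]       # mutable [qty, price] lots, oldest first
    realized = 0.0
    for sell_qty, sell_price in sells:
        remaining = sell_qty
        while remaining > 0 and lots:
            lot = lots[0]                    # oldest open lot
            matched = min(remaining, lot[0])
            realized += matched * (sell_price - lot[1])
            lot[0] -= matched
            remaining -= matched
            if lot[0] == 0:
                lots.pop(0)                  # lot fully consumed
    return realized
```

For example, buying 100 shares at $10, then 100 at $12, and selling 150 at $15 realizes 100 × $5 + 50 × $3 = $650, because the $10 lot is consumed before the $12 lot.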

### Viewing Analytics

#### Dashboard Metrics
- **Account Balance**: Current cash balance from the latest transaction
- **Total P&L**: Combined realized and unrealized profit/loss
- **Win Rate**: Percentage of profitable closed trades
- **Open Positions**: Number of currently held positions
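
The win-rate metric is simply the share of closed trades with positive realized P&L. A quick sketch (the function name and input shape are illustrative, not the app's API):

```python
def win_rate(closed_pnls):
    """Percentage of closed trades with positive realized P&L.

    closed_pnls: one realized P&L value per closed position.
    """
    if not closed_pnls:
        return 0.0                       # no closed trades yet
    wins = sum(1 for pnl in closed_pnls if pnl > 0)
    return 100.0 * wins / len(closed_pnls)
```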

#### Charts
- **Balance History**: View account balance over time (6 months by default)
- **Top Trades**: See your most profitable closed positions

## Development

### Local Development Setup

#### Backend
```bash
cd backend

# Create virtual environment
python -m venv venv
source venv/bin/activate  # or `venv\Scripts\activate` on Windows

# Install dependencies
pip install -r requirements.txt

# Set environment variables
export POSTGRES_HOST=localhost
export POSTGRES_USER=fidelity
export POSTGRES_PASSWORD=fidelity123
export POSTGRES_DB=fidelitytracker

# Run migrations
alembic upgrade head

# Start development server
uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
```

#### Frontend
```bash
cd frontend

# Install dependencies
npm install

# Start development server
npm run dev
```

Access the dev server at http://localhost:5173

### Database Access

Connect to PostgreSQL:
```bash
docker-compose exec postgres psql -U fidelity -d fidelitytracker
```

### View Logs

```bash
# All services
docker-compose logs -f

# Specific service
docker-compose logs -f backend
docker-compose logs -f frontend
docker-compose logs -f postgres
```

## API Documentation

### Interactive API Docs
Visit http://localhost:8000/docs for interactive Swagger UI documentation.

### Key Endpoints

#### Accounts
- `POST /api/accounts` - Create account
- `GET /api/accounts` - List accounts
- `GET /api/accounts/{id}` - Get account details
- `PUT /api/accounts/{id}` - Update account
- `DELETE /api/accounts/{id}` - Delete account

#### Import
- `POST /api/import/upload/{account_id}` - Upload CSV file
- `POST /api/import/filesystem/{account_id}` - Import from filesystem

#### Transactions
- `GET /api/transactions` - List transactions (with filters)
- `GET /api/transactions/{id}` - Get transaction details

#### Positions
- `GET /api/positions` - List positions (with filters)
- `GET /api/positions/{id}` - Get position details
- `POST /api/positions/{account_id}/rebuild` - Rebuild positions

#### Analytics
- `GET /api/analytics/overview/{account_id}` - Get account statistics
- `GET /api/analytics/balance-history/{account_id}` - Get balance history
- `GET /api/analytics/top-trades/{account_id}` - Get top trades
- `POST /api/analytics/update-pnl/{account_id}` - Update unrealized P&L

## Architecture

### Directory Structure
```
myFidelityTracker/
├── backend/                 # FastAPI backend
│   ├── app/
│   │   ├── api/             # API endpoints
│   │   ├── models/          # SQLAlchemy models
│   │   ├── schemas/         # Pydantic schemas
│   │   ├── services/        # Business logic
│   │   ├── parsers/         # CSV parsers
│   │   └── utils/           # Utilities
│   ├── alembic/             # Database migrations
│   └── Dockerfile
├── frontend/                # React frontend
│   ├── src/
│   │   ├── components/      # React components
│   │   ├── api/             # API client
│   │   ├── types/           # TypeScript types
│   │   └── styles/          # CSS styles
│   ├── Dockerfile
│   └── nginx.conf
├── imports/                 # CSV import directory
└── docker-compose.yml       # Docker configuration
```

### Data Flow

1. **Import**: CSV → Parser → Deduplication → Database
2. **Position Tracking**: Transactions → FIFO Matching → Positions
3. **Analytics**: Positions → Performance Calculator → Statistics
4. **Market Data**: Open Positions → Yahoo Finance API → Unrealized P&L

### Database Schema

#### accounts
- Account details and metadata

#### transactions
- Individual brokerage transactions
- Unique hash for deduplication
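
The dedup idea: hash a row's identifying fields so that re-importing the same CSV produces the same hash and the row is skipped. An illustrative sketch (the field names here are hypothetical; the fields actually hashed are defined in the CSV parser):

```python
import hashlib

def transaction_hash(date, symbol, action, quantity, amount):
    """Stable hash over a transaction's identifying fields (illustrative sketch).

    The same CSV row always yields the same hash, so a unique constraint
    on the hash column rejects duplicates on re-import.
    """
    key = "|".join(str(v) for v in (date, symbol, action, quantity, amount))
    return hashlib.sha256(key.encode("utf-8")).hexdigest()
```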

#### positions
- Trading positions (open/closed)
- P&L calculations

#### position_transactions
- Junction table linking positions to transactions

## Configuration

### Environment Variables

Create a `.env` file (or copy `.env.example`):

```bash
# Database
POSTGRES_HOST=postgres
POSTGRES_PORT=5432
POSTGRES_DB=fidelitytracker
POSTGRES_USER=fidelity
POSTGRES_PASSWORD=fidelity123

# API
API_V1_PREFIX=/api
PROJECT_NAME=myFidelityTracker

# CORS
CORS_ORIGINS=http://localhost:3000,http://localhost:5173

# Import Directory
IMPORT_DIR=/app/imports

# Market Data Cache (seconds)
MARKET_DATA_CACHE_TTL=60
```

## Troubleshooting

### Port Already in Use
If ports 3000, 8000, or 5432 are already in use:
```bash
# Stop conflicting services
docker-compose down

# Or modify the ports in docker-compose.yml
```

### Database Connection Issues
```bash
# Reset the database
docker-compose down -v
docker-compose up -d
```

### Import Errors
- Ensure the CSV is in Fidelity format
- Check for encoding issues (use UTF-8)
- Verify all required columns are present

### Performance Issues
- Check Docker resource limits
- Increase PostgreSQL memory if needed
- Reduce the balance history timeframe

## Deployment
|
||||||
|
|
||||||
|
### Production Considerations
|
||||||
|
|
||||||
|
1. **Use strong passwords** - Change default PostgreSQL credentials
|
||||||
|
2. **Enable HTTPS** - Add SSL/TLS certificates to Nginx
|
||||||
|
3. **Secure API** - Add authentication (JWT tokens)
|
||||||
|
4. **Backup database** - Regular PostgreSQL backups
|
||||||
|
5. **Monitor resources** - Set up logging and monitoring
|
||||||
|
6. **Update regularly** - Keep dependencies up to date
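For point 1, credentials can be generated rather than hand-picked. A minimal standard-library sketch (the 24-character default is an arbitrary choice):

```python
import secrets
import string

def generate_password(length: int = 24) -> str:
    """Generate a random alphanumeric password, e.g. for POSTGRES_PASSWORD."""
    alphabet = string.ascii_letters + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # different every run
```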

### Docker Multi-Architecture Build

Build for multiple platforms:

```bash
docker buildx create --use
docker buildx build --platform linux/amd64,linux/arm64 -t myfidelitytracker:latest .
```

## Roadmap

### Future Enhancements

- [ ] Additional brokerage support (Schwab, E*TRADE, Robinhood)
- [ ] Authentication and multi-user support
- [ ] AI-powered trade recommendations
- [ ] Tax reporting (wash sales, capital gains)
- [ ] Email notifications for imports
- [ ] Dark mode theme
- [ ] Export reports to PDF
- [ ] Advanced charting with technical indicators
- [ ] Paper trading / backtesting

## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

### Development Guidelines

- Follow the existing code style
- Add comments for complex logic
- Write type hints for Python code
- Use TypeScript for frontend code
- Test thoroughly before submitting

## License

This project is licensed under the MIT License - see the LICENSE file for details.

## Support

For issues, questions, or suggestions:

- Open an issue on GitHub
- Check the existing documentation
- Review the API docs at `/docs`

## Acknowledgments

- Inspired by Robinhood's clean UI design
- Built with modern open-source technologies
- Market data provided by Yahoo Finance

---

**Disclaimer**: This application is for personal portfolio tracking only. It is not financial advice. Always consult a financial advisor before making investment decisions.
READ_ME_FIRST.md (new file, 153 lines)
# READ THIS FIRST

## Your Current Problem

You're still getting:

1. **HTTP 307 redirects** when trying to create accounts
2. **Database "fidelity" does not exist** errors

This means **the previous rebuild did NOT work**. The backend container is still running old code.

## Why This Keeps Happening

Your backend container has old code baked in, and Docker's build cache keeps bringing it back even when you think you're rebuilding.

## The Solution

I've created **ULTIMATE_FIX.sh**, the most aggressive fix possible. It will:

1. Completely destroy everything (containers, images, volumes, networks)
2. Fix the docker-compose.yml healthcheck (which was trying to connect to the wrong database)
3. Verify your .env file is correct
4. Rebuild with absolutely no caching
5. Test everything automatically
6. Tell you clearly whether it worked

## What To Do RIGHT NOW

### Step 1: Transfer files to your server

On your Mac:

```bash
cd /Users/chris/Desktop/fidelity

# Transfer the ultimate fix script and the fixed files
scp ULTIMATE_FIX.sh pi@starship2:~/fidelity/
scp diagnose-307.sh pi@starship2:~/fidelity/
scp docker-compose.yml pi@starship2:~/fidelity/
scp backend/app/main.py pi@starship2:~/fidelity/backend/app/
```

### Step 2: Run the ultimate fix on your server

SSH to your server:

```bash
ssh pi@starship2
cd ~/fidelity
./ULTIMATE_FIX.sh
```

Watch the output carefully. At the end it will tell you:

- ✅ **SUCCESS!** - Everything works; you can use the app
- ❌ **STILL FAILING!** - The backend is still running old code

### Step 3: If it still fails

If you see "STILL FAILING" at the end, run the diagnostic:

```bash
./diagnose-307.sh
```

Then send me the output. The diagnostic will show exactly what code is running in the container.

## What I Fixed

I found and fixed two issues:

### Issue 1: Healthcheck Database Name

The docker-compose.yml healthcheck was:

```yaml
test: ["CMD-SHELL", "pg_isready -U fidelity"]
```

This doesn't specify a database, so PostgreSQL defaults to a database named "fidelity" (same as the username), which doesn't exist.

I fixed it to:

```yaml
test: ["CMD-SHELL", "pg_isready -U fidelity -d fidelitytracker"]
```
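In context, the full stanza looks roughly like this. The `interval`, `timeout`, and `retries` values are illustrative assumptions, not taken from the actual compose file:

```yaml
services:
  postgres:
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U fidelity -d fidelitytracker"]
      interval: 5s
      timeout: 5s
      retries: 5
```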

### Issue 2: Docker Cache

Even with `--no-cache`, Docker can still reuse cached layers in certain conditions. The ULTIMATE_FIX.sh script:

- Manually removes all fidelity images
- Prunes all volumes
- Uses `DOCKER_BUILDKIT=1` with `--pull` to force fresh base images
- Removes Python `__pycache__` directories

## Alternative: Manual Nuclear Option

If you prefer to do it manually:

```bash
cd ~/fidelity

# Stop everything
docker compose down -v --remove-orphans

# Delete images manually
docker rmi -f $(docker images | grep fidelity | awk '{print $3}')

# Clean everything
docker system prune -af --volumes

# Clear Python cache
find ./backend -type d -name "__pycache__" -exec rm -rf {} +

# Rebuild and start
DOCKER_BUILDKIT=1 docker compose build --no-cache --pull
docker compose up -d

# Wait 45 seconds for startup
sleep 45

# Test
curl -i http://localhost:8000/api/accounts
```

If you see HTTP 200, it worked! If you see HTTP 307, the old code is still there somehow.

## Files Included

- **ULTIMATE_FIX.sh** - Main fix script (USE THIS)
- **diagnose-307.sh** - Diagnostic for when the ultimate fix fails
- **docker-compose.yml** - Fixed healthcheck
- **backend/app/main.py** - Fixed (no redirect_slashes=False)

## Next Steps After Success

Once you see "SUCCESS!" from the ultimate fix:

1. Open your browser: `http://starship2:3000` (or use the IP address)
2. Click "Create Account"
3. Fill in the form:
   - Account Number: X12345678
   - Account Name: Main Trading
   - Account Type: Margin
4. Click Create
5. It should work!

## If Nothing Works

If ULTIMATE_FIX.sh still shows "STILL FAILING", there might be:

1. A file permission issue preventing the rebuild
2. A Docker daemon issue
3. Something modifying files during the build

Run the diagnostic and share the output:

```bash
./diagnose-307.sh > diagnostic-output.txt
cat diagnostic-output.txt
```

Send me that output and I'll figure out what's going on.
ROOT_CAUSE_FOUND.md (new file, 200 lines)
# ROOT CAUSE FOUND! 🎯

## The Diagnostic Revealed Everything

Your diagnostic output showed the **exact problem**:

### What We Saw

**Registered routes (from the diagnostic):**

```
POST /api/accounts/
GET  /api/accounts/
```

Notice the **trailing slash** (`/api/accounts/`)?

**HTTP response (from the diagnostic):**

```
HTTP/1.1 307 Temporary Redirect
location: http://localhost:8000/api/accounts/
```

The backend was redirecting FROM `/api/accounts` TO `/api/accounts/`.

## Why This Was Happening

### The Route Definition

In `accounts.py`, the routes were defined as:

```python
@router.get("/", response_model=List[AccountResponse])
def list_accounts(...):
    ...
```

### How FastAPI Combines Paths

When you register the router in `main.py`:

```python
app.include_router(
    accounts.router,
    prefix="/api/accounts",  # <-- prefix
    tags=["accounts"]
)
```

FastAPI concatenates them:

```
prefix: "/api/accounts" + route: "/" = "/api/accounts/"
                                                      ↑ trailing slash!
```

### What the Frontend Was Doing

Your React frontend was calling:

```javascript
fetch('http://starship2:8000/api/accounts')  // No trailing slash
```

### The Result

1. Frontend: `GET /api/accounts` (no slash)
2. Backend: "I only have `/api/accounts/` (with slash)"
3. Backend: "Let me redirect you there: HTTP 307"
4. Frontend: the redirect isn't followed, so the request fails
5. UI: spinning loading indicator forever

## The Fix

Changed all route decorators from:

```python
@router.get("/", ...)  # Creates /api/accounts/
```

To:

```python
@router.get("", ...)  # Creates /api/accounts
```

Now when combined:

```
prefix: "/api/accounts" + route: "" = "/api/accounts"
                                                     ↑ NO trailing slash!
```

A perfect match with what the frontend calls!

## Files Fixed

1. **backend/app/api/endpoints/accounts.py**
   - Changed `@router.post("/")` → `@router.post("")`
   - Changed `@router.get("/")` → `@router.get("")`

2. **backend/app/api/endpoints/positions.py**
   - Changed `@router.get("/")` → `@router.get("")`

3. **backend/app/api/endpoints/transactions.py**
   - Changed `@router.get("/")` → `@router.get("")`

## Why Previous Fixes Didn't Work

We spent time trying to fix:

- Docker cache (not the issue)
- Database connection (not the issue)
- The redirect_slashes parameter (not the issue)
- Environment variables (not the issue)

**The real issue was simply the trailing slash in the route paths!**

## How To Apply The Fix

### Option 1: Quick Transfer (Recommended)

On your Mac:

```bash
cd /Users/chris/Desktop/fidelity
./transfer-final-fix.sh
```

Then on your server:

```bash
cd ~/fidelity
./FINAL_FIX.sh
```

### Option 2: Manual Transfer

```bash
# On Mac
cd /Users/chris/Desktop/fidelity

scp backend/app/api/endpoints/accounts.py pi@starship2:~/fidelity/backend/app/api/endpoints/
scp backend/app/api/endpoints/positions.py pi@starship2:~/fidelity/backend/app/api/endpoints/
scp backend/app/api/endpoints/transactions.py pi@starship2:~/fidelity/backend/app/api/endpoints/
scp FINAL_FIX.sh pi@starship2:~/fidelity/

# On server
ssh pi@starship2
cd ~/fidelity
chmod +x FINAL_FIX.sh
./FINAL_FIX.sh
```

## What Will Happen

The FINAL_FIX.sh script will:

1. Stop containers
2. Remove the backend image
3. Rebuild the backend with the fixed code
4. Start services
5. Test automatically
6. Show **SUCCESS!** if it works

## Expected Result

After the fix:

- ✅ `GET /api/accounts` returns HTTP 200 (not 307!)
- ✅ Response: `[]` (empty array)
- ✅ Account creation works in the UI
- ✅ No more spinning/loading forever

## Why The Diagnostic Was So Helpful

The diagnostic showed:

1. ✅ The backend had the correct main.py (no redirect_slashes=False)
2. ✅ The database connection worked perfectly
3. ✅ Environment variables were correct
4. ✅ The image was freshly built (2 minutes ago)
5. ❌ But routes were registered WITH trailing slashes
6. ❌ And the HTTP test returned a 307 redirect

This pointed directly at the route path issue!

## Lesson Learned

FastAPI's route registration is simple but subtle:

```python
# These are DIFFERENT:
@router.get("/")  # With trailing slash
@router.get("")   # Without trailing slash

# When combined with prefix "/api/accounts":
"/api/accounts" + "/" = "/api/accounts/"  # Not what we want
"/api/accounts" + ""  = "/api/accounts"   # Perfect!
```
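The matching behaviour can be illustrated without FastAPI at all. This is a toy model written for this explanation (not Starlette's actual matcher): an exact path matches, a path differing only by a trailing slash gets a 307 redirect, and anything else is a 404.

```python
def resolve(registered: str, requested: str) -> str:
    """Toy model of route matching with redirect_slashes enabled."""
    if requested == registered:
        return "200 OK"
    # Paths that differ only by a trailing slash get a redirect, not a match.
    if requested.rstrip("/") == registered.rstrip("/"):
        return f"307 Temporary Redirect -> {registered}"
    return "404 Not Found"

print(resolve("/api/accounts/", "/api/accounts"))  # the bug: a 307 redirect
print(resolve("/api/accounts", "/api/accounts"))   # the fix: 200 OK
```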

## Final Note

This is a common FastAPI gotcha. The framework's `redirect_slashes=True` behaviour is supposed to bridge the mismatch, but when routes are registered with explicit trailing slashes it produces the redirect we saw instead of a direct match.

By using the empty string `""` for the root route of each router, we match exactly what the frontend expects, and everything works!

---

**Status:** ✅ Root cause identified and fixed!
**Next:** Transfer files and rebuild
**Expected:** Account creation should work perfectly!
SIMPLE_DEPLOYMENT.md (new file, 115 lines)
# Simple Deployment Guide

## Quick Fix for Rate Limiting

You can deploy the rate limiting fix without manually editing files. I've created two approaches:

### Approach 1: Automatic (Recommended)

I'll create scripts that automatically update the necessary files.

### Approach 2: Manual (if you prefer)

Just two small changes are needed:

#### Change 1: Update main.py (backend)

File: `backend/app/main.py`

**Find this line:**

```python
from app.api.endpoints import accounts, transactions, positions, analytics
```

**Change to:**

```python
from app.api.endpoints import accounts, transactions, positions, analytics_v2 as analytics
```

That's it! By importing `analytics_v2 as analytics`, the rest of the file works unchanged.

#### Change 2: Update App.tsx (frontend)

File: `frontend/src/App.tsx`

**Find this line:**

```typescript
import Dashboard from './components/Dashboard';
```

**Change to:**

```typescript
import Dashboard from './components/DashboardV2';
```

**That's it!** The component props are identical, so nothing else needs to change.

### Deploy Steps

```bash
# 1. Transfer files (on your Mac)
cd /Users/chris/Desktop/fidelity
./deploy-rate-limiting-fix.sh

# 2. SSH to the server
ssh pi@starship2
cd ~/fidelity

# 3. Make the two changes above, then rebuild
docker compose down
docker compose build --no-cache backend frontend
docker compose up -d

# 4. Run the migration (adds the market_prices table)
sleep 30
docker compose exec backend alembic upgrade head

# 5. Verify
curl "http://localhost:8000/api/analytics/overview/1?refresh_prices=false"
```

### Testing

1. Open the dashboard: `http://starship2:3000`
2. It should load instantly!
3. Click the account dropdown and select your account
4. The Dashboard tab loads immediately with cached data
5. Click the "🔄 Refresh Prices" button to get fresh data

### Logs to Expect

**Before (with rate limiting issues):**

```
429 Client Error: Too Many Requests
429 Client Error: Too Many Requests
429 Client Error: Too Many Requests
```

**After (with the fix):**

```
Cache HIT (fresh): AAPL = $150.25 (age: 120s)
Cache HIT (stale): TSLA = $245.80 (age: 320s)
Cache MISS: AMD, fetching from Yahoo Finance...
Fetched AMD = $180.50
```
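The fresh/stale/miss behaviour in those logs can be sketched as a small TTL cache. This is an illustrative model only: the class name, thresholds, and field layout are assumptions, not the actual `market_prices` implementation.

```python
import time

FRESH_TTL = 300   # serve without refetching (illustrative threshold)
STALE_TTL = 3600  # still serve, but flag the entry for a refresh

class PriceCache:
    """Toy two-tier TTL cache mirroring the fresh/stale/miss log lines."""

    def __init__(self, clock=time.time):
        self._clock = clock
        self._entries = {}  # symbol -> (price, stored_at)

    def put(self, symbol, price):
        self._entries[symbol] = (price, self._clock())

    def get(self, symbol):
        """Return (price_or_None, status) where status is fresh/stale/miss."""
        entry = self._entries.get(symbol)
        if entry is None:
            return None, "miss"
        price, stored_at = entry
        age = self._clock() - stored_at
        if age <= FRESH_TTL:
            return price, "fresh"
        if age <= STALE_TTL:
            return price, "stale"
        return None, "miss"
```

Injecting the clock makes the aging behaviour testable without real waiting; a "stale" hit is what lets the dashboard render immediately while prices refresh in the background.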

### Rollback (if needed)

To go back to the old version:

```bash
# In main.py, change back to:
from app.api.endpoints import accounts, transactions, positions, analytics

# In App.tsx, change back to:
import Dashboard from './components/Dashboard';

# Rebuild
docker compose build backend frontend
docker compose up -d
```

The `market_prices` table will remain (it doesn't hurt anything), or you can drop it:

```sql
DROP TABLE market_prices;
```
SOLUTION_SUMMARY.md (new file, 167 lines)
# Solution Summary - Account Creation Fix

## Problem Identified

Your backend is running **old cached code** from a previous Docker build. Even though you updated the files on your Linux server, the running container has the old version because:

1. Docker cached the old code during the initial build
2. Rebuilding without `--no-cache` reused those cached layers
3. The old code had `redirect_slashes=False`, which causes 307 redirects
4. Result: account creation fails because API calls get redirected instead of processed

## The Fix

Run the **nuclear-fix.sh** script on your Linux server. This script:

- Completely removes all old containers, images, and cache
- Rebuilds everything from scratch with `--no-cache`
- Tests that the correct code is running
- Verifies all endpoints work

## Files Created for You

### 1. nuclear-fix.sh ⭐ MAIN FIX

Complete rebuild script that fixes everything. **Run this first**.

### 2. verify-backend-code.sh

Diagnostic script that shows exactly what code is running in the container. Use this if the nuclear fix doesn't work.

### 3. CRITICAL_FIX_README.md

Detailed explanation of the problem and multiple solution options.

### 4. transfer-to-server.sh

Helper script to transfer all files to your Linux server via SSH.

## Quick Start

### On your Mac:

```bash
cd /Users/chris/Desktop/fidelity

# Option A: Transfer files with the helper script
./transfer-to-server.sh pi@starship2

# Option B: Manual transfer
scp nuclear-fix.sh verify-backend-code.sh CRITICAL_FIX_README.md pi@starship2:~/fidelity/
scp backend/app/main.py pi@starship2:~/fidelity/backend/app/
```

### On your Linux server (starship2):

```bash
cd ~/fidelity

# Read the detailed explanation (optional)
cat CRITICAL_FIX_README.md

# Run the nuclear fix
./nuclear-fix.sh

# Watch the output - it will test everything automatically
```

## Expected Results

After running nuclear-fix.sh, you should see:

```
✓ Backend health check: PASSED
✓ Accounts endpoint: PASSED (HTTP 200)
✓ Frontend: PASSED (HTTP 200)
```

Then when you create an account in the UI:

- The form submits successfully
- No spinning/loading forever
- The account appears in the list

## If It Still Doesn't Work

Run the verification script:

```bash
./verify-backend-code.sh
```

This will show:

- What version of main.py is actually running
- Database connection details
- Registered routes
- Any configuration issues

Share the output and I can help further.

## Technical Details

### Why --no-cache Is Critical

Your current workflow:

1. ✅ Update files on Mac
2. ✅ Transfer to the Linux server
3. ❌ Run `docker compose build` (WITHOUT --no-cache)
4. ❌ Docker reuses cached layers with OLD CODE
5. ❌ Container runs old code; account creation fails

Correct workflow:

1. ✅ Update files on Mac
2. ✅ Transfer to the Linux server
3. ✅ Run `docker compose build --no-cache`
4. ✅ Docker rebuilds every layer with NEW CODE
5. ✅ Container runs new code; everything works

### The Volume Mount Misconception

docker-compose.yml has:

```yaml
volumes:
  - ./backend:/app
```

You might think: "Code changes should be automatic!"

Reality:

- The volume mount puts files in the container ✅
- But uvicorn runs WITHOUT the --reload flag ❌
- Python has already loaded the modules into memory ❌
- Changing files doesn't restart the process ❌

For production (your setup), code is baked into the image at build time.
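If live reload is actually wanted for development, the mount needs to be paired with uvicorn's `--reload` flag so the process restarts when mounted files change. A hypothetical `docker-compose.override.yml` (an assumption, not part of this commit) might look like:

```yaml
# Development-only override: --reload makes the volume mount take effect
# without rebuilding the image.
services:
  backend:
    volumes:
      - ./backend:/app
    command: uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload
```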

### Why You See 307 Redirects

The old main.py had:

```python
app = FastAPI(
    redirect_slashes=False,  # This was the problem!
    ...
)
```

This caused:

- Frontend calls: `GET /api/accounts` (no trailing slash)
- Route registered as: `/api/accounts/` (with trailing slash)
- FastAPI can't match the path and returns a 307 redirect
- The frontend doesn't follow the redirect and gets stuck

The new main.py (fixed):

```python
app = FastAPI(
    # redirect_slashes defaults to True
    # Handles both /api/accounts and /api/accounts/
    ...
)
```

This works:

- Frontend calls: `GET /api/accounts` (no trailing slash)
- FastAPI redirects the request to `/api/accounts/`
- The route matches and returns 200 with data ✅

## Summary

**Problem**: Old code in the Docker container
**Cause**: Docker build cache
**Solution**: Rebuild with --no-cache
**Script**: nuclear-fix.sh does this automatically

Transfer the files and run the script. It should work!
apply_patches.py (new executable file, 157 lines)
#!/usr/bin/env python3
"""
Apply patches for the rate limiting fix.
This Python script works across all platforms.
"""
import sys
import shutil
from pathlib import Path


def backup_file(filepath):
    """Create a backup of the file."""
    backup_path = f"{filepath}.backup"
    shutil.copy2(filepath, backup_path)
    print(f"✓ Backed up {filepath} to {backup_path}")
    return backup_path


def patch_main_py():
    """Patch backend/app/main.py to use analytics_v2."""
    filepath = Path("backend/app/main.py")

    if not filepath.exists():
        print(f"❌ Error: {filepath} not found")
        return False

    print(f"\n[1/2] Patching {filepath}...")

    # Backup
    backup_file(filepath)

    # Read file
    with open(filepath, 'r') as f:
        content = f.read()

    # Check if already patched
    if 'analytics_v2 as analytics' in content or 'import analytics_v2' in content:
        print("✓ Backend already patched (analytics_v2 found)")
        return True

    # Apply patch
    old_import = "from app.api.endpoints import accounts, transactions, positions, analytics"
    new_import = "from app.api.endpoints import accounts, transactions, positions, analytics_v2 as analytics"

    if old_import in content:
        content = content.replace(old_import, new_import)

        # Write back
        with open(filepath, 'w') as f:
            f.write(content)

        print("✓ Backend patched successfully")
        return True
    else:
        print("❌ Could not find expected import line")
        print("\nLooking for:")
        print(f"  {old_import}")
        print("\nPlease manually edit backend/app/main.py")
        return False


def patch_app_tsx():
    """Patch frontend/src/App.tsx to use DashboardV2."""
    filepath = Path("frontend/src/App.tsx")

    if not filepath.exists():
        print(f"❌ Error: {filepath} not found")
        return False

    print(f"\n[2/2] Patching {filepath}...")

    # Backup
    backup_file(filepath)

    # Read file
    with open(filepath, 'r') as f:
        content = f.read()

    # Check if already patched
    if 'DashboardV2' in content:
        print("✓ Frontend already patched (DashboardV2 found)")
        return True

    # Apply patch - handle both single and double quotes
    old_import1 = "import Dashboard from './components/Dashboard'"
    new_import1 = "import Dashboard from './components/DashboardV2'"
    old_import2 = 'import Dashboard from "./components/Dashboard"'
    new_import2 = 'import Dashboard from "./components/DashboardV2"'

    changed = False
    if old_import1 in content:
        content = content.replace(old_import1, new_import1)
        changed = True
    if old_import2 in content:
        content = content.replace(old_import2, new_import2)
        changed = True

    if changed:
        # Write back
        with open(filepath, 'w') as f:
            f.write(content)

        print("✓ Frontend patched successfully")
        return True
    else:
        print("❌ Could not find expected import line")
        print("\nLooking for:")
        print(f"  {old_import1}")
        print(f"  or {old_import2}")
        print("\nPlease manually edit frontend/src/App.tsx")
        return False


def main():
    print("=" * 60)
    print("Applying Rate Limiting Fix Patches (Python)")
    print("=" * 60)

    # Check we're in the right directory
    if not Path("docker-compose.yml").exists():
        print("\n❌ Error: docker-compose.yml not found")
        print("Please run this script from the fidelity project directory")
        sys.exit(1)

    # Apply patches
    backend_ok = patch_main_py()
    frontend_ok = patch_app_tsx()

    print("\n" + "=" * 60)

    if backend_ok and frontend_ok:
        print("✅ All patches applied successfully!")
        print("=" * 60)
        print("\nNext steps:")
        print("")
        print("1. Rebuild containers:")
        print("   docker compose down")
        print("   docker compose build --no-cache backend frontend")
        print("   docker compose up -d")
        print("")
        print("2. Run the migration:")
        print("   sleep 30")
        print("   docker compose exec backend alembic upgrade head")
        print("")
        print("3. Test:")
        print("   curl http://localhost:8000/api/analytics/overview/1?refresh_prices=false")
        print("")
        sys.exit(0)
    else:
        print("⚠️  Some patches failed - see the manual instructions above")
        print("=" * 60)
        sys.exit(1)


if __name__ == "__main__":
    main()
backend/Dockerfile (new file, 42 lines)
# Multi-stage build for Python FastAPI backend
FROM python:3.11-slim as builder

WORKDIR /app

# Install build-time system dependencies
RUN apt-get update && apt-get install -y \
    gcc \
    postgresql-client \
    && rm -rf /var/lib/apt/lists/*

# Copy requirements and install Python dependencies
COPY requirements.txt .
RUN pip install --no-cache-dir --user -r requirements.txt

# Final stage
FROM python:3.11-slim

WORKDIR /app

# Install runtime dependencies
RUN apt-get update && apt-get install -y \
    postgresql-client \
    && rm -rf /var/lib/apt/lists/*

# Copy Python dependencies from the builder stage
COPY --from=builder /root/.local /root/.local

# Copy application code
COPY . .

# Make sure scripts installed in .local are on PATH
ENV PATH=/root/.local/bin:$PATH

# Create the imports directory
RUN mkdir -p /app/imports

# Expose port
EXPOSE 8000

# Run migrations and start the server
CMD alembic upgrade head && uvicorn app.main:app --host 0.0.0.0 --port 8000
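Because `COPY . .` copies the whole build context, stale local artifacts like `__pycache__` (a recurring problem in the guides above) can end up baked into the image. A `.dockerignore` in the backend directory would prevent that; this is a suggestion, not a file included in this commit:

```
__pycache__/
*.py[cod]
.env
venv/
```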
backend/alembic.ini (new file, 52 lines)
# Alembic configuration file
|
||||||
|
|
||||||
|
[alembic]
|
||||||
|
# Path to migration scripts
|
||||||
|
script_location = alembic
|
||||||
|
|
||||||
|
# Template used to generate migration files
|
||||||
|
file_template = %%(year)d%%(month).2d%%(day).2d_%%(hour).2d%%(minute).2d_%%(rev)s_%%(slug)s
|
||||||
|
|
||||||
|
# Timezone for migration timestamps
|
||||||
|
timezone = UTC
|
||||||
|
|
||||||
|
# Prepend migration scripts with proper encoding
|
||||||
|
prepend_sys_path = .
|
||||||
|
|
||||||
|
# Version location specification
|
||||||
|
version_path_separator = os
|
||||||
|
|
||||||
|
# Logging configuration
|
||||||
|
[loggers]
|
||||||
|
keys = root,sqlalchemy,alembic
|
||||||
|
|
||||||
|
[handlers]
|
||||||
|
keys = console
|
||||||
|
|
||||||
|
[formatters]
|
||||||
|
keys = generic
|
||||||
|
|
||||||
|
[logger_root]
|
||||||
|
level = WARN
|
||||||
|
handlers = console
|
||||||
|
qualname =
|
||||||
|
|
||||||
|
[logger_sqlalchemy]
|
||||||
|
level = WARN
|
||||||
|
handlers =
|
||||||
|
qualname = sqlalchemy.engine
|
||||||
|
|
||||||
|
[logger_alembic]
|
||||||
|
level = INFO
|
||||||
|
handlers =
|
||||||
|
qualname = alembic
|
||||||
|
|
||||||
|
[handler_console]
|
||||||
|
class = StreamHandler
|
||||||
|
args = (sys.stderr,)
|
||||||
|
level = NOTSET
|
||||||
|
formatter = generic
|
||||||
|
|
||||||
|
[formatter_generic]
|
||||||
|
format = %(levelname)-5.5s [%(name)s] %(message)s
|
||||||
|
datefmt = %H:%M:%S
|
||||||
72
backend/alembic/env.py
Normal file
@@ -0,0 +1,72 @@
"""Alembic environment configuration for database migrations."""
import sys
from logging.config import fileConfig
from pathlib import Path

from sqlalchemy import engine_from_config, pool
from alembic import context

# Add parent directory to path to import app modules
sys.path.insert(0, str(Path(__file__).resolve().parent.parent))

from app.config import settings
from app.database import Base
from app.models import Account, Transaction, Position, PositionTransaction

# Alembic Config object
config = context.config

# Override sqlalchemy.url with our settings
config.set_main_option("sqlalchemy.url", settings.database_url)

# Interpret the config file for Python logging
if config.config_file_name is not None:
    fileConfig(config.config_file_name)

# Target metadata for autogenerate support
target_metadata = Base.metadata


def run_migrations_offline() -> None:
    """
    Run migrations in 'offline' mode.

    This configures the context with just a URL and not an Engine,
    though an Engine is acceptable here as well. By skipping the Engine
    creation we don't even need a DBAPI to be available.
    """
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url,
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
    )

    with context.begin_transaction():
        context.run_migrations()


def run_migrations_online() -> None:
    """
    Run migrations in 'online' mode.

    In this scenario we need to create an Engine and associate a
    connection with the context.
    """
    connectable = engine_from_config(
        config.get_section(config.config_ini_section),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )

    with connectable.connect() as connection:
        context.configure(connection=connection, target_metadata=target_metadata)

        with context.begin_transaction():
            context.run_migrations()


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()
25
backend/alembic/script.py.mako
Normal file
@@ -0,0 +1,25 @@
"""${message}

Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}

"""
from typing import Sequence, Union

from alembic import op
import sqlalchemy as sa
${imports if imports else ""}

# revision identifiers, used by Alembic.
revision: str = ${repr(up_revision)}
down_revision: Union[str, None] = ${repr(down_revision)}
branch_labels: Union[str, Sequence[str], None] = ${repr(branch_labels)}
depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)}


def upgrade() -> None:
    ${upgrades if upgrades else "pass"}


def downgrade() -> None:
    ${downgrades if downgrades else "pass"}
83
backend/alembic/versions/001_initial_schema.py
Normal file
@@ -0,0 +1,83 @@
"""Initial schema

Revision ID: 001_initial_schema
Revises:
Create Date: 2026-01-20 10:00:00.000000

"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql

# revision identifiers, used by Alembic.
revision = '001_initial_schema'
down_revision = None
branch_labels = None
depends_on = None


def upgrade() -> None:
    # Create accounts table
    op.create_table(
        'accounts',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('account_number', sa.String(length=50), nullable=False),
        sa.Column('account_name', sa.String(length=200), nullable=False),
        sa.Column('account_type', sa.Enum('CASH', 'MARGIN', name='accounttype'), nullable=False),
        sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False),
        sa.Column('updated_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False),
        sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_accounts_id'), 'accounts', ['id'], unique=False)
    op.create_index(op.f('ix_accounts_account_number'), 'accounts', ['account_number'], unique=True)

    # Create transactions table
    op.create_table(
        'transactions',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('account_id', sa.Integer(), nullable=False),
        sa.Column('run_date', sa.Date(), nullable=False),
        sa.Column('action', sa.String(length=500), nullable=False),
        sa.Column('symbol', sa.String(length=50), nullable=True),
        sa.Column('description', sa.String(length=500), nullable=True),
        sa.Column('transaction_type', sa.String(length=20), nullable=True),
        sa.Column('exchange_quantity', sa.Numeric(precision=20, scale=8), nullable=True),
        sa.Column('exchange_currency', sa.String(length=10), nullable=True),
        sa.Column('currency', sa.String(length=10), nullable=True),
        sa.Column('price', sa.Numeric(precision=20, scale=8), nullable=True),
        sa.Column('quantity', sa.Numeric(precision=20, scale=8), nullable=True),
        sa.Column('exchange_rate', sa.Numeric(precision=20, scale=8), nullable=True),
        sa.Column('commission', sa.Numeric(precision=20, scale=2), nullable=True),
        sa.Column('fees', sa.Numeric(precision=20, scale=2), nullable=True),
        sa.Column('accrued_interest', sa.Numeric(precision=20, scale=2), nullable=True),
        sa.Column('amount', sa.Numeric(precision=20, scale=2), nullable=True),
        sa.Column('cash_balance', sa.Numeric(precision=20, scale=2), nullable=True),
        sa.Column('settlement_date', sa.Date(), nullable=True),
        sa.Column('unique_hash', sa.String(length=64), nullable=False),
        sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False),
        sa.Column('updated_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False),
        sa.ForeignKeyConstraint(['account_id'], ['accounts.id'], ondelete='CASCADE'),
        sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_transactions_id'), 'transactions', ['id'], unique=False)
    op.create_index(op.f('ix_transactions_account_id'), 'transactions', ['account_id'], unique=False)
    op.create_index(op.f('ix_transactions_run_date'), 'transactions', ['run_date'], unique=False)
    op.create_index(op.f('ix_transactions_symbol'), 'transactions', ['symbol'], unique=False)
    op.create_index(op.f('ix_transactions_unique_hash'), 'transactions', ['unique_hash'], unique=True)
    op.create_index('idx_account_date', 'transactions', ['account_id', 'run_date'], unique=False)
    op.create_index('idx_account_symbol', 'transactions', ['account_id', 'symbol'], unique=False)


def downgrade() -> None:
    op.drop_index('idx_account_symbol', table_name='transactions')
    op.drop_index('idx_account_date', table_name='transactions')
    op.drop_index(op.f('ix_transactions_unique_hash'), table_name='transactions')
    op.drop_index(op.f('ix_transactions_symbol'), table_name='transactions')
    op.drop_index(op.f('ix_transactions_run_date'), table_name='transactions')
    op.drop_index(op.f('ix_transactions_account_id'), table_name='transactions')
    op.drop_index(op.f('ix_transactions_id'), table_name='transactions')
    op.drop_table('transactions')
    op.drop_index(op.f('ix_accounts_account_number'), table_name='accounts')
    op.drop_index(op.f('ix_accounts_id'), table_name='accounts')
    op.drop_table('accounts')
    op.execute('DROP TYPE accounttype')
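The unique index on `unique_hash` (64 chars, the length of a SHA-256 hex digest) is what backs the import deduplication mentioned in the release notes. A minimal sketch of how an importer might derive such a hash from a CSV row follows; the function name and the choice of fields are illustrative assumptions, not the project's actual implementation:

```python
import hashlib


def transaction_hash(account_id: int, run_date: str, action: str,
                     symbol: str, amount: str) -> str:
    """Derive a stable 64-char SHA-256 hex digest from identifying fields."""
    # Join with a separator unlikely to appear in the data, so that
    # ("AB", "C") and ("A", "BC") produce different keys.
    key = "|".join([str(account_id), run_date, action, symbol or "", amount])
    return hashlib.sha256(key.encode("utf-8")).hexdigest()


h1 = transaction_hash(1, "2026-01-15", "YOU BOUGHT", "AAPL", "-1500.00")
h2 = transaction_hash(1, "2026-01-15", "YOU BOUGHT", "AAPL", "-1500.00")
h3 = transaction_hash(1, "2026-01-16", "YOU BOUGHT", "AAPL", "-1500.00")
```

Re-importing the same row yields the same digest, so the unique index rejects the duplicate; any differing field produces a new hash.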
70
backend/alembic/versions/002_add_positions.py
Normal file
@@ -0,0 +1,70 @@
"""Add positions tables

Revision ID: 002_add_positions
Revises: 001_initial_schema
Create Date: 2026-01-20 15:00:00.000000

"""
from alembic import op
import sqlalchemy as sa

# revision identifiers, used by Alembic.
revision = '002_add_positions'
down_revision = '001_initial_schema'
branch_labels = None
depends_on = None


def upgrade() -> None:
    # Create positions table
    op.create_table(
        'positions',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('account_id', sa.Integer(), nullable=False),
        sa.Column('symbol', sa.String(length=50), nullable=False),
        sa.Column('option_symbol', sa.String(length=100), nullable=True),
        sa.Column('position_type', sa.Enum('STOCK', 'CALL', 'PUT', name='positiontype'), nullable=False),
        sa.Column('status', sa.Enum('OPEN', 'CLOSED', name='positionstatus'), nullable=False),
        sa.Column('open_date', sa.Date(), nullable=False),
        sa.Column('close_date', sa.Date(), nullable=True),
        sa.Column('total_quantity', sa.Numeric(precision=20, scale=8), nullable=False),
        sa.Column('avg_entry_price', sa.Numeric(precision=20, scale=8), nullable=True),
        sa.Column('avg_exit_price', sa.Numeric(precision=20, scale=8), nullable=True),
        sa.Column('realized_pnl', sa.Numeric(precision=20, scale=2), nullable=True),
        sa.Column('unrealized_pnl', sa.Numeric(precision=20, scale=2), nullable=True),
        sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False),
        sa.Column('updated_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False),
        sa.ForeignKeyConstraint(['account_id'], ['accounts.id'], ondelete='CASCADE'),
        sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_positions_id'), 'positions', ['id'], unique=False)
    op.create_index(op.f('ix_positions_account_id'), 'positions', ['account_id'], unique=False)
    op.create_index(op.f('ix_positions_symbol'), 'positions', ['symbol'], unique=False)
    op.create_index(op.f('ix_positions_option_symbol'), 'positions', ['option_symbol'], unique=False)
    op.create_index(op.f('ix_positions_status'), 'positions', ['status'], unique=False)
    op.create_index('idx_account_status', 'positions', ['account_id', 'status'], unique=False)
    op.create_index('idx_account_symbol_status', 'positions', ['account_id', 'symbol', 'status'], unique=False)

    # Create position_transactions junction table
    op.create_table(
        'position_transactions',
        sa.Column('position_id', sa.Integer(), nullable=False),
        sa.Column('transaction_id', sa.Integer(), nullable=False),
        sa.ForeignKeyConstraint(['position_id'], ['positions.id'], ondelete='CASCADE'),
        sa.ForeignKeyConstraint(['transaction_id'], ['transactions.id'], ondelete='CASCADE'),
        sa.PrimaryKeyConstraint('position_id', 'transaction_id')
    )


def downgrade() -> None:
    op.drop_table('position_transactions')
    op.drop_index('idx_account_symbol_status', table_name='positions')
    op.drop_index('idx_account_status', table_name='positions')
    op.drop_index(op.f('ix_positions_status'), table_name='positions')
    op.drop_index(op.f('ix_positions_option_symbol'), table_name='positions')
    op.drop_index(op.f('ix_positions_symbol'), table_name='positions')
    op.drop_index(op.f('ix_positions_account_id'), table_name='positions')
    op.drop_index(op.f('ix_positions_id'), table_name='positions')
    op.drop_table('positions')
    op.execute('DROP TYPE positionstatus')
    op.execute('DROP TYPE positiontype')
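The `realized_pnl` column on positions is filled by the FIFO matching the release notes describe: sells consume the oldest buy lots first. A self-contained sketch of that lot-matching rule, under the assumption of simple (quantity, price) lots and ignoring fees and options multipliers (not the project's actual code):

```python
from collections import deque


def fifo_realized_pnl(buys, sells):
    """Match sells against the oldest buy lots first; return realized P&L.

    buys and sells are lists of (quantity, price) tuples, oldest first.
    Assumes total sell quantity does not exceed total buy quantity.
    """
    lots = deque(buys)  # open lots, oldest at the left
    pnl = 0.0
    for sell_qty, sell_price in sells:
        while sell_qty > 0:
            lot_qty, lot_price = lots[0]
            matched = min(sell_qty, lot_qty)
            pnl += matched * (sell_price - lot_price)
            sell_qty -= matched
            if matched == lot_qty:
                lots.popleft()          # lot fully consumed
            else:
                lots[0] = (lot_qty - matched, lot_price)  # partial fill
    return pnl


# Buy 100 @ 10, buy 100 @ 12; sell 150 @ 15:
# first 100 matched at 10 (+500), next 50 at 12 (+150) -> 650.0
result = fifo_realized_pnl([(100, 10.0), (100, 12.0)], [(150, 15.0)])
```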
40
backend/alembic/versions/add_market_prices_table.py
Normal file
@@ -0,0 +1,40 @@
"""Add market_prices table for price caching

Revision ID: 003_market_prices
Revises: 002_add_positions
Create Date: 2026-01-20 16:00:00.000000

"""
from alembic import op
import sqlalchemy as sa
from datetime import datetime

# revision identifiers, used by Alembic.
revision = '003_market_prices'
down_revision = '002_add_positions'
branch_labels = None
depends_on = None


def upgrade() -> None:
    # Create market_prices table
    op.create_table(
        'market_prices',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('symbol', sa.String(length=20), nullable=False),
        sa.Column('price', sa.Numeric(precision=20, scale=6), nullable=False),
        sa.Column('fetched_at', sa.DateTime(), nullable=False, default=datetime.utcnow),
        sa.Column('source', sa.String(length=50), default='yahoo_finance'),
        sa.PrimaryKeyConstraint('id')
    )

    # Create indexes
    op.create_index('idx_market_prices_symbol', 'market_prices', ['symbol'], unique=True)
    op.create_index('idx_symbol_fetched', 'market_prices', ['symbol', 'fetched_at'])


def downgrade() -> None:
    op.drop_index('idx_symbol_fetched', table_name='market_prices')
    op.drop_index('idx_market_prices_symbol', table_name='market_prices')
    op.drop_table('market_prices')
2
backend/app/__init__.py
Normal file
@@ -0,0 +1,2 @@
"""myFidelityTracker backend application."""
__version__ = "1.0.0"
1
backend/app/api/__init__.py
Normal file
@@ -0,0 +1 @@
"""API routes and endpoints."""
19
backend/app/api/deps.py
Normal file
@@ -0,0 +1,19 @@
"""API dependencies."""
from typing import Generator
from sqlalchemy.orm import Session

from app.database import SessionLocal


def get_db() -> Generator[Session, None, None]:
    """
    Dependency that provides a database session.

    Yields:
        Database session
    """
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()
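`get_db` works because FastAPI drives the generator: the request handler runs while execution is paused at `yield`, and the `finally` clause closes the session afterwards, even if the handler raised. The same shape can be exercised without FastAPI; this stdlib-only sketch uses a dummy resource in place of `SessionLocal`:

```python
class DummySession:
    """Stand-in for a SQLAlchemy session with a close() method."""
    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True


def get_resource():
    """Same shape as get_db: yield once, clean up in finally."""
    res = DummySession()
    try:
        yield res
    finally:
        res.close()


# Drive the generator the way a framework would.
gen = get_resource()
session = next(gen)       # handler receives the open session
in_use = session.closed   # still open while the request runs
gen.close()               # framework finishes the request
after = session.closed    # finally has run; session is closed
```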
1
backend/app/api/endpoints/__init__.py
Normal file
@@ -0,0 +1 @@
"""API endpoint modules."""
151
backend/app/api/endpoints/accounts.py
Normal file
@@ -0,0 +1,151 @@
"""Account management API endpoints."""
from fastapi import APIRouter, Depends, HTTPException, status
from sqlalchemy.orm import Session
from typing import List

from app.api.deps import get_db
from app.models import Account
from app.schemas import AccountCreate, AccountUpdate, AccountResponse

router = APIRouter()


@router.post("", response_model=AccountResponse, status_code=status.HTTP_201_CREATED)
def create_account(account: AccountCreate, db: Session = Depends(get_db)):
    """
    Create a new brokerage account.

    Args:
        account: Account creation data
        db: Database session

    Returns:
        Created account

    Raises:
        HTTPException: If account number already exists
    """
    # Check if account number already exists
    existing = (
        db.query(Account)
        .filter(Account.account_number == account.account_number)
        .first()
    )

    if existing:
        raise HTTPException(
            status_code=status.HTTP_400_BAD_REQUEST,
            detail=f"Account with number {account.account_number} already exists",
        )

    # Create new account
    db_account = Account(**account.model_dump())
    db.add(db_account)
    db.commit()
    db.refresh(db_account)

    return db_account


@router.get("", response_model=List[AccountResponse])
def list_accounts(skip: int = 0, limit: int = 100, db: Session = Depends(get_db)):
    """
    List all accounts.

    Args:
        skip: Number of records to skip
        limit: Maximum number of records to return
        db: Database session

    Returns:
        List of accounts
    """
    accounts = db.query(Account).offset(skip).limit(limit).all()
    return accounts


@router.get("/{account_id}", response_model=AccountResponse)
def get_account(account_id: int, db: Session = Depends(get_db)):
    """
    Get account by ID.

    Args:
        account_id: Account ID
        db: Database session

    Returns:
        Account details

    Raises:
        HTTPException: If account not found
    """
    account = db.query(Account).filter(Account.id == account_id).first()

    if not account:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"Account {account_id} not found",
        )

    return account


@router.put("/{account_id}", response_model=AccountResponse)
def update_account(
    account_id: int, account_update: AccountUpdate, db: Session = Depends(get_db)
):
    """
    Update account details.

    Args:
        account_id: Account ID
        account_update: Updated account data
        db: Database session

    Returns:
        Updated account

    Raises:
        HTTPException: If account not found
    """
    db_account = db.query(Account).filter(Account.id == account_id).first()

    if not db_account:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"Account {account_id} not found",
        )

    # Update fields
    update_data = account_update.model_dump(exclude_unset=True)
    for field, value in update_data.items():
        setattr(db_account, field, value)

    db.commit()
    db.refresh(db_account)

    return db_account


@router.delete("/{account_id}", status_code=status.HTTP_204_NO_CONTENT)
def delete_account(account_id: int, db: Session = Depends(get_db)):
    """
    Delete an account and all associated data.

    Args:
        account_id: Account ID
        db: Database session

    Raises:
        HTTPException: If account not found
    """
    db_account = db.query(Account).filter(Account.id == account_id).first()

    if not db_account:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"Account {account_id} not found",
        )

    db.delete(db_account)
    db.commit()
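`update_account` applies only the fields the client actually sent: `model_dump(exclude_unset=True)` drops untouched fields, and a `setattr` loop copies the rest onto the ORM row. The same partial-update loop, sketched with a plain dataclass standing in for the Pydantic schema and SQLAlchemy model:

```python
from dataclasses import dataclass


@dataclass
class AccountRow:
    """Stand-in for the SQLAlchemy Account row."""
    account_name: str
    account_type: str


def apply_partial_update(row, update_data: dict):
    """Set only the provided fields, leaving the rest untouched."""
    for field, value in update_data.items():
        setattr(row, field, value)
    return row


row = AccountRow(account_name="Old Name", account_type="CASH")
# Simulates a PUT body that only included account_name.
apply_partial_update(row, {"account_name": "New Name"})
```

Without `exclude_unset=True`, omitted optional fields would arrive as `None` and overwrite stored values, which is why the endpoint dumps the schema that way.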
111
backend/app/api/endpoints/analytics.py
Normal file
@@ -0,0 +1,111 @@
"""Analytics API endpoints."""
from fastapi import APIRouter, Depends, Query
from sqlalchemy.orm import Session
from typing import Optional

from app.api.deps import get_db
from app.services.performance_calculator import PerformanceCalculator

router = APIRouter()


@router.get("/overview/{account_id}")
def get_overview(account_id: int, db: Session = Depends(get_db)):
    """
    Get overview statistics for an account.

    Args:
        account_id: Account ID
        db: Database session

    Returns:
        Dictionary with performance metrics
    """
    calculator = PerformanceCalculator(db)
    stats = calculator.calculate_account_stats(account_id)
    return stats


@router.get("/balance-history/{account_id}")
def get_balance_history(
    account_id: int,
    days: int = Query(default=30, ge=1, le=3650),
    db: Session = Depends(get_db),
):
    """
    Get account balance history for charting.

    Args:
        account_id: Account ID
        days: Number of days to retrieve (default: 30)
        db: Database session

    Returns:
        List of {date, balance} dictionaries
    """
    calculator = PerformanceCalculator(db)
    history = calculator.get_balance_history(account_id, days)
    return {"data": history}


@router.get("/top-trades/{account_id}")
def get_top_trades(
    account_id: int,
    limit: int = Query(default=20, ge=1, le=100),
    db: Session = Depends(get_db),
):
    """
    Get top performing trades.

    Args:
        account_id: Account ID
        limit: Maximum number of trades to return (default: 20)
        db: Database session

    Returns:
        List of trade dictionaries
    """
    calculator = PerformanceCalculator(db)
    trades = calculator.get_top_trades(account_id, limit)
    return {"data": trades}


@router.get("/worst-trades/{account_id}")
def get_worst_trades(
    account_id: int,
    limit: int = Query(default=20, ge=1, le=100),
    db: Session = Depends(get_db),
):
    """
    Get worst performing trades (biggest losses).

    Args:
        account_id: Account ID
        limit: Maximum number of trades to return (default: 20)
        db: Database session

    Returns:
        List of trade dictionaries
    """
    calculator = PerformanceCalculator(db)
    trades = calculator.get_worst_trades(account_id, limit)
    return {"data": trades}


@router.post("/update-pnl/{account_id}")
def update_unrealized_pnl(account_id: int, db: Session = Depends(get_db)):
    """
    Update unrealized P&L for all open positions in an account.

    Fetches current market prices and recalculates P&L.

    Args:
        account_id: Account ID
        db: Database session

    Returns:
        Number of positions updated
    """
    calculator = PerformanceCalculator(db)
    updated = calculator.update_open_positions_pnl(account_id)
    return {"positions_updated": updated}
273
backend/app/api/endpoints/analytics_v2.py
Normal file
273
backend/app/api/endpoints/analytics_v2.py
Normal file
@@ -0,0 +1,273 @@
|
|||||||
|
"""
|
||||||
|
Enhanced analytics API endpoints with efficient market data handling.
|
||||||
|
|
||||||
|
This version uses PerformanceCalculatorV2 with:
|
||||||
|
- Database-backed price caching
|
||||||
|
- Rate-limited API calls
|
||||||
|
- Stale-while-revalidate pattern for better UX
|
||||||
|
"""
|
||||||
|
from fastapi import APIRouter, Depends, Query, BackgroundTasks
|
||||||
|
from sqlalchemy.orm import Session
|
||||||
|
from typing import Optional
|
||||||
|
from datetime import date
|
||||||
|
|
||||||
|
from app.api.deps import get_db
|
||||||
|
from app.services.performance_calculator_v2 import PerformanceCalculatorV2
|
||||||
|
from app.services.market_data_service import MarketDataService
|
||||||
|
|
||||||
|
router = APIRouter()
|
||||||
|
|
||||||
|
|
||||||
|
@router.get("/overview/{account_id}")
|
||||||
|
def get_overview(
|
||||||
|
account_id: int,
|
||||||
|
refresh_prices: bool = Query(default=False, description="Force fresh price fetch"),
|
||||||
|
max_api_calls: int = Query(default=5, ge=0, le=50, description="Max Yahoo Finance API calls"),
|
||||||
|
start_date: Optional[date] = None,
|
||||||
|
end_date: Optional[date] = None,
|
||||||
|
db: Session = Depends(get_db)
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Get overview statistics for an account.
|
||||||
|
|
||||||
|
By default, uses cached prices (stale-while-revalidate pattern).
|
||||||
|
Set refresh_prices=true to force fresh data (may be slow).
|
||||||
|
|
||||||
|
Args:
|
||||||
|
account_id: Account ID
|
||||||
|
refresh_prices: Whether to fetch fresh prices from Yahoo Finance
|
||||||
|
max_api_calls: Maximum number of API calls to make
|
||||||
|
start_date: Filter positions opened on or after this date
|
||||||
|
end_date: Filter positions opened on or before this date
|
||||||
|
db: Database session
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Dictionary with performance metrics and cache stats
|
||||||
|
"""
|
||||||
|
calculator = PerformanceCalculatorV2(db, cache_ttl=300)
|
||||||
|
|
||||||
|
# If not refreshing, use cached only (fast)
|
||||||
|
if not refresh_prices:
|
||||||
|
max_api_calls = 0
|
||||||
|
|
||||||
|
stats = calculator.calculate_account_stats(
|
||||||
|
account_id,
|
||||||
|
update_prices=True,
|
||||||
|
max_api_calls=max_api_calls,
|
||||||
|
start_date=start_date,
|
||||||
|
end_date=end_date
|
||||||
|
)
|
||||||
|
|
||||||
|
return stats
|
||||||
|
|
||||||
|
|
||||||
|
@router.get("/balance-history/{account_id}")
|
||||||
|
def get_balance_history(
|
||||||
|
account_id: int,
|
||||||
|
days: int = Query(default=30, ge=1, le=3650),
|
||||||
|
db: Session = Depends(get_db),
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Get account balance history for charting.
|
||||||
|
|
||||||
|
This endpoint doesn't need market data, so it's always fast.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
account_id: Account ID
|
||||||
|
days: Number of days to retrieve (default: 30)
|
||||||
|
db: Database session
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
List of {date, balance} dictionaries
|
||||||
|
"""
|
||||||
|
calculator = PerformanceCalculatorV2(db)
|
||||||
|
history = calculator.get_balance_history(account_id, days)
|
||||||
|
return {"data": history}
|
||||||
|
|
||||||
|
|
||||||
|
@router.get("/top-trades/{account_id}")
|
||||||
|
def get_top_trades(
|
||||||
|
account_id: int,
|
||||||
|
limit: int = Query(default=10, ge=1, le=100),
|
||||||
|
start_date: Optional[date] = None,
|
||||||
|
end_date: Optional[date] = None,
|
||||||
|
db: Session = Depends(get_db),
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Get top performing trades.
|
||||||
|
|
||||||
|
This endpoint only uses closed positions, so no market data needed.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
account_id: Account ID
|
||||||
|
limit: Maximum number of trades to return (default: 10)
|
||||||
|
start_date: Filter positions closed on or after this date
|
||||||
|
end_date: Filter positions closed on or before this date
|
||||||
|
db: Database session
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
List of trade dictionaries
|
||||||
|
"""
|
||||||
|
calculator = PerformanceCalculatorV2(db)
|
||||||
|
trades = calculator.get_top_trades(account_id, limit, start_date, end_date)
|
||||||
|
return {"data": trades}
|
||||||
|
|
||||||
|
|
||||||
|
@router.get("/worst-trades/{account_id}")
def get_worst_trades(
    account_id: int,
    limit: int = Query(default=10, ge=1, le=100),
    start_date: Optional[date] = None,
    end_date: Optional[date] = None,
    db: Session = Depends(get_db),
):
    """
    Get worst performing trades.

    This endpoint only uses closed positions, so no market data is needed.

    Args:
        account_id: Account ID
        limit: Maximum number of trades to return (default: 10)
        start_date: Filter positions closed on or after this date
        end_date: Filter positions closed on or before this date
        db: Database session

    Returns:
        List of trade dictionaries
    """
    calculator = PerformanceCalculatorV2(db)
    trades = calculator.get_worst_trades(account_id, limit, start_date, end_date)
    return {"data": trades}


@router.post("/refresh-prices/{account_id}")
def refresh_prices(
    account_id: int,
    max_api_calls: int = Query(default=10, ge=1, le=50),
    db: Session = Depends(get_db),
):
    """
    Manually trigger a price refresh for open positions.

    This is useful when you want fresh data but don't want to wait
    for the dashboard load.

    Args:
        account_id: Account ID
        max_api_calls: Maximum number of Yahoo Finance API calls
        db: Database session

    Returns:
        Update statistics
    """
    calculator = PerformanceCalculatorV2(db, cache_ttl=300)

    stats = calculator.update_open_positions_pnl(
        account_id,
        max_api_calls=max_api_calls,
        allow_stale=False,  # Force fresh fetches
    )

    return {
        "message": "Price refresh completed",
        "stats": stats,
    }


@router.post("/refresh-prices-background/{account_id}")
def refresh_prices_background(
    account_id: int,
    background_tasks: BackgroundTasks,
    max_api_calls: int = Query(default=20, ge=1, le=50),
    db: Session = Depends(get_db),
):
    """
    Trigger a background price refresh.

    This returns immediately while prices are fetched in the background.
    Client can poll the /overview endpoint to see updated data.

    Args:
        account_id: Account ID
        background_tasks: FastAPI background tasks
        max_api_calls: Maximum number of Yahoo Finance API calls
        db: Database session

    Returns:
        Acknowledgment that background task was started
    """
    def refresh_task():
        # Open a fresh session: the request-scoped session from get_db may
        # already be closed by the time the background task runs.
        from app.database import SessionLocal

        task_db = SessionLocal()
        try:
            calculator = PerformanceCalculatorV2(task_db, cache_ttl=300)
            calculator.update_open_positions_pnl(
                account_id,
                max_api_calls=max_api_calls,
                allow_stale=False,
            )
        finally:
            task_db.close()

    background_tasks.add_task(refresh_task)

    return {
        "message": "Price refresh started in background",
        "account_id": account_id,
        "max_api_calls": max_api_calls,
    }


@router.post("/refresh-stale-cache")
def refresh_stale_cache(
    min_age_minutes: int = Query(default=10, ge=1, le=1440),
    limit: int = Query(default=20, ge=1, le=100),
    db: Session = Depends(get_db),
):
    """
    Background maintenance endpoint to refresh stale cached prices.

    This can be called periodically (e.g., via cron) to keep the cache fresh.

    Args:
        min_age_minutes: Only refresh prices older than this many minutes
        limit: Maximum number of prices to refresh
        db: Database session

    Returns:
        Number of prices refreshed
    """
    market_data = MarketDataService(db, cache_ttl_seconds=300)

    refreshed = market_data.refresh_stale_prices(
        min_age_seconds=min_age_minutes * 60,
        limit=limit,
    )

    return {
        "message": "Stale price refresh completed",
        "refreshed": refreshed,
        "min_age_minutes": min_age_minutes,
    }


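`MarketDataService.refresh_stale_prices` is not part of this diff, but the endpoint's `min_age_minutes * 60` conversion implies an age check like the following. `is_stale` is a hypothetical helper shown only to illustrate the staleness predicate:

```python
from datetime import datetime, timedelta


def is_stale(fetched_at: datetime, min_age_seconds: int, now: datetime) -> bool:
    """A cached price is stale once it is older than min_age_seconds."""
    return (now - fetched_at).total_seconds() > min_age_seconds


now = datetime(2024, 1, 1, 12, 0, 0)
fresh = now - timedelta(minutes=5)   # within the default 10-minute window
old = now - timedelta(minutes=30)    # past the window: eligible for refresh
```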
@router.delete("/clear-old-cache")
def clear_old_cache(
    older_than_days: int = Query(default=30, ge=1, le=365),
    db: Session = Depends(get_db),
):
    """
    Clear old cached prices from the database.

    Args:
        older_than_days: Delete prices older than this many days
        db: Database session

    Returns:
        Number of records deleted
    """
    market_data = MarketDataService(db)

    deleted = market_data.clear_cache(older_than_days=older_than_days)

    return {
        "message": "Old cache cleared",
        "deleted": deleted,
        "older_than_days": older_than_days,
    }
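`clear_cache` is defined elsewhere; presumably it deletes rows whose `fetched_at` precedes a cutoff derived from `older_than_days`. A sketch of that cutoff arithmetic, with `cache_cutoff` a hypothetical name:

```python
from datetime import datetime, timedelta


def cache_cutoff(older_than_days: int, now: datetime) -> datetime:
    """Rows with fetched_at earlier than this cutoff are eligible for deletion."""
    return now - timedelta(days=older_than_days)


now = datetime(2024, 3, 31)
cutoff = cache_cutoff(30, now)
```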

128 backend/app/api/endpoints/import_endpoint.py Normal file
@@ -0,0 +1,128 @@
"""Import API endpoints for CSV file uploads."""
from fastapi import APIRouter, Depends, HTTPException, UploadFile, File, status
from sqlalchemy.orm import Session
from pathlib import Path
import tempfile
import shutil

from app.api.deps import get_db
from app.services import ImportService
from app.services.position_tracker import PositionTracker
from app.config import settings

router = APIRouter()


@router.post("/upload/{account_id}")
def upload_csv(
    account_id: int, file: UploadFile = File(...), db: Session = Depends(get_db)
):
    """
    Upload and import a CSV file for an account.

    Args:
        account_id: Account ID to import transactions for
        file: CSV file to upload
        db: Database session

    Returns:
        Import statistics

    Raises:
        HTTPException: If import fails
    """
    if not file.filename or not file.filename.lower().endswith(".csv"):
        raise HTTPException(
            status_code=status.HTTP_400_BAD_REQUEST, detail="File must be a CSV"
        )

    # Save uploaded file to a temporary location
    tmp_path = None
    try:
        with tempfile.NamedTemporaryFile(delete=False, suffix=".csv") as tmp_file:
            shutil.copyfileobj(file.file, tmp_file)
            tmp_path = Path(tmp_file.name)

        # Import transactions
        import_service = ImportService(db)
        result = import_service.import_from_file(tmp_path, account_id)

        # Rebuild positions after import
        if result.imported > 0:
            position_tracker = PositionTracker(db)
            positions_created = position_tracker.rebuild_positions(account_id)
        else:
            positions_created = 0

        return {
            "filename": file.filename,
            "imported": result.imported,
            "skipped": result.skipped,
            "errors": result.errors,
            "total_rows": result.total_rows,
            "positions_created": positions_created,
        }

    except Exception as e:
        raise HTTPException(
            status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
            detail=f"Import failed: {str(e)}",
        )
    finally:
        # Clean up the temporary file even if the import fails
        if tmp_path is not None:
            tmp_path.unlink(missing_ok=True)


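The upload endpoint's temp-file hand-off can be exercised without FastAPI, since `UploadFile.file` is just a file-like object. A self-contained sketch (the CSV columns are illustrative, not Fidelity's actual export format):

```python
import io
import shutil
import tempfile
from pathlib import Path

# Simulate the uploaded stream that UploadFile.file would expose
upload_stream = io.BytesIO(b"Run Date,Action,Symbol\n01/02/2024,BUY,AAPL\n")

with tempfile.NamedTemporaryFile(delete=False, suffix=".csv") as tmp_file:
    shutil.copyfileobj(upload_stream, tmp_file)
    tmp_path = Path(tmp_file.name)

# The import service would read from tmp_path here
contents = tmp_path.read_text()

tmp_path.unlink()  # clean up, as the endpoint's finally block does
```

Writing with `delete=False` and unlinking afterwards is what lets the import service open the file by path on all platforms.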
@router.post("/filesystem/{account_id}")
def import_from_filesystem(account_id: int, db: Session = Depends(get_db)):
    """
    Import all CSV files from the filesystem import directory.

    Args:
        account_id: Account ID to import transactions for
        db: Database session

    Returns:
        Import statistics for all files

    Raises:
        HTTPException: If import directory doesn't exist
    """
    import_dir = Path(settings.IMPORT_DIR)

    if not import_dir.exists():
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"Import directory not found: {import_dir}",
        )

    try:
        import_service = ImportService(db)
        results = import_service.import_from_directory(import_dir, account_id)

        # Rebuild positions if any transactions were imported
        total_imported = sum(r.imported for r in results.values())
        if total_imported > 0:
            position_tracker = PositionTracker(db)
            positions_created = position_tracker.rebuild_positions(account_id)
        else:
            positions_created = 0

        return {
            "files": {
                filename: {
                    "imported": result.imported,
                    "skipped": result.skipped,
                    "errors": result.errors,
                    "total_rows": result.total_rows,
                }
                for filename, result in results.items()
            },
            "total_imported": total_imported,
            "positions_created": positions_created,
        }

    except Exception as e:
        raise HTTPException(
            status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
            detail=f"Import failed: {str(e)}",
        )
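`ImportService.import_from_directory` is defined in another file; judging by the `.gitignore` (`imports/*.csv`), it scans the import directory for CSV files. A hypothetical stand-in for that discovery step, runnable against a throwaway directory:

```python
import tempfile
from pathlib import Path


def find_csv_files(import_dir: Path) -> list[str]:
    """Hypothetical sketch of the directory scan: CSV names, sorted for stable output."""
    return sorted(p.name for p in import_dir.glob("*.csv"))


with tempfile.TemporaryDirectory() as d:
    import_dir = Path(d)
    (import_dir / "jan.csv").write_text("header\n")
    (import_dir / "feb.csv").write_text("header\n")
    (import_dir / "notes.txt").write_text("ignored\n")  # non-CSV files are skipped
    found = find_csv_files(import_dir)
```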

104 backend/app/api/endpoints/positions.py Normal file
@@ -0,0 +1,104 @@
"""Position API endpoints."""
from fastapi import APIRouter, Depends, HTTPException, Query, status
from sqlalchemy.orm import Session
from typing import List, Optional

from app.api.deps import get_db
from app.models import Position
from app.models.position import PositionStatus
from app.schemas import PositionResponse

router = APIRouter()


@router.get("", response_model=List[PositionResponse])
def list_positions(
    account_id: Optional[int] = None,
    status_filter: Optional[PositionStatus] = Query(
        default=None, alias="status", description="Filter by position status"
    ),
    symbol: Optional[str] = None,
    skip: int = 0,
    limit: int = Query(default=100, le=500),
    db: Session = Depends(get_db),
):
    """
    List positions with optional filtering.

    Args:
        account_id: Filter by account ID
        status_filter: Filter by status (open/closed)
        symbol: Filter by symbol
        skip: Number of records to skip (pagination)
        limit: Maximum number of records to return
        db: Database session

    Returns:
        List of positions
    """
    query = db.query(Position)

    # Apply filters
    if account_id:
        query = query.filter(Position.account_id == account_id)

    if status_filter:
        query = query.filter(Position.status == status_filter)

    if symbol:
        query = query.filter(Position.symbol == symbol)

    # Order by most recent first
    query = query.order_by(Position.open_date.desc(), Position.id.desc())

    # Pagination
    positions = query.offset(skip).limit(limit).all()

    return positions


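The `skip`/`limit` pair maps directly onto SQL `OFFSET`/`LIMIT`. The same windowing on a plain list, as a sketch of what `query.offset(skip).limit(limit)` returns:

```python
def paginate(items: list, skip: int = 0, limit: int = 100) -> list:
    """Apply the same skip/limit windowing as query.offset(skip).limit(limit)."""
    return items[skip: skip + limit]


rows = list(range(10))
page1 = paginate(rows, skip=0, limit=4)
page2 = paginate(rows, skip=4, limit=4)
```

A client pages through results by incrementing `skip` by `limit` until a short (or empty) page comes back.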
@router.get("/{position_id}", response_model=PositionResponse)
def get_position(position_id: int, db: Session = Depends(get_db)):
    """
    Get position by ID.

    Args:
        position_id: Position ID
        db: Database session

    Returns:
        Position details

    Raises:
        HTTPException: If position not found
    """
    position = db.query(Position).filter(Position.id == position_id).first()

    if not position:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"Position {position_id} not found",
        )

    return position


@router.post("/{account_id}/rebuild")
def rebuild_positions(account_id: int, db: Session = Depends(get_db)):
    """
    Rebuild all positions for an account from transactions.

    Args:
        account_id: Account ID
        db: Database session

    Returns:
        Number of positions created
    """
    from app.services.position_tracker import PositionTracker

    position_tracker = PositionTracker(db)
    positions_created = position_tracker.rebuild_positions(account_id)

    return {"positions_created": positions_created}

227 backend/app/api/endpoints/transactions.py Normal file
@@ -0,0 +1,227 @@
"""Transaction API endpoints."""
from fastapi import APIRouter, Depends, HTTPException, Query, status
from sqlalchemy.orm import Session
from typing import List, Optional, Dict
from datetime import date

from app.api.deps import get_db
from app.models import Transaction, Position, PositionTransaction
from app.schemas import TransactionResponse

router = APIRouter()


@router.get("", response_model=List[TransactionResponse])
def list_transactions(
    account_id: Optional[int] = None,
    symbol: Optional[str] = None,
    start_date: Optional[date] = None,
    end_date: Optional[date] = None,
    skip: int = 0,
    limit: int = Query(default=50, le=500),
    db: Session = Depends(get_db),
):
    """
    List transactions with optional filtering.

    Args:
        account_id: Filter by account ID
        symbol: Filter by symbol
        start_date: Filter by start date (inclusive)
        end_date: Filter by end date (inclusive)
        skip: Number of records to skip (pagination)
        limit: Maximum number of records to return
        db: Database session

    Returns:
        List of transactions
    """
    query = db.query(Transaction)

    # Apply filters
    if account_id:
        query = query.filter(Transaction.account_id == account_id)

    if symbol:
        query = query.filter(Transaction.symbol == symbol)

    if start_date:
        query = query.filter(Transaction.run_date >= start_date)

    if end_date:
        query = query.filter(Transaction.run_date <= end_date)

    # Order by date descending
    query = query.order_by(Transaction.run_date.desc(), Transaction.id.desc())

    # Pagination
    transactions = query.offset(skip).limit(limit).all()

    return transactions


@router.get("/{transaction_id}", response_model=TransactionResponse)
def get_transaction(transaction_id: int, db: Session = Depends(get_db)):
    """
    Get transaction by ID.

    Args:
        transaction_id: Transaction ID
        db: Database session

    Returns:
        Transaction details

    Raises:
        HTTPException: If transaction not found
    """
    transaction = (
        db.query(Transaction).filter(Transaction.id == transaction_id).first()
    )

    if not transaction:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"Transaction {transaction_id} not found",
        )

    return transaction


@router.get("/{transaction_id}/position-details")
def get_transaction_position_details(
    transaction_id: int, db: Session = Depends(get_db)
) -> Dict:
    """
    Get full position details for a transaction, including all related transactions.

    This endpoint finds the position associated with a transaction and returns:
    - All transactions that are part of the same position
    - Position metadata (type, status, P&L, etc.)
    - Strategy classification for options (covered call, cash-secured put, etc.)

    Args:
        transaction_id: Transaction ID
        db: Database session

    Returns:
        Dictionary with position details and all related transactions

    Raises:
        HTTPException: If transaction not found or not part of a position
    """
    # Find the transaction
    transaction = (
        db.query(Transaction).filter(Transaction.id == transaction_id).first()
    )

    if not transaction:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"Transaction {transaction_id} not found",
        )

    # Find the position this transaction belongs to
    position_link = (
        db.query(PositionTransaction)
        .filter(PositionTransaction.transaction_id == transaction_id)
        .first()
    )

    if not position_link:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"Transaction {transaction_id} is not part of any position",
        )

    # Get the position with all its transactions
    position = (
        db.query(Position)
        .filter(Position.id == position_link.position_id)
        .first()
    )

    if not position:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail="Position not found",
        )

    # Get all transactions for this position
    all_transactions = []
    for link in position.transaction_links:
        txn = link.transaction
        all_transactions.append({
            "id": txn.id,
            "run_date": txn.run_date.isoformat(),
            "action": txn.action,
            "symbol": txn.symbol,
            "description": txn.description,
            "quantity": float(txn.quantity) if txn.quantity is not None else None,
            "price": float(txn.price) if txn.price is not None else None,
            "amount": float(txn.amount) if txn.amount is not None else None,
            "commission": float(txn.commission) if txn.commission is not None else None,
            "fees": float(txn.fees) if txn.fees is not None else None,
        })

    # Sort transactions by date
    all_transactions.sort(key=lambda t: t["run_date"])

    # Determine strategy type for options
    strategy = _classify_option_strategy(position, all_transactions)

    return {
        "position": {
            "id": position.id,
            "symbol": position.symbol,
            "option_symbol": position.option_symbol,
            "position_type": position.position_type.value,
            "status": position.status.value,
            "open_date": position.open_date.isoformat(),
            "close_date": position.close_date.isoformat() if position.close_date else None,
            "total_quantity": float(position.total_quantity),
            "avg_entry_price": float(position.avg_entry_price) if position.avg_entry_price is not None else None,
            "avg_exit_price": float(position.avg_exit_price) if position.avg_exit_price is not None else None,
            "realized_pnl": float(position.realized_pnl) if position.realized_pnl is not None else None,
            "unrealized_pnl": float(position.unrealized_pnl) if position.unrealized_pnl is not None else None,
            "strategy": strategy,
        },
        "transactions": all_transactions,
    }


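The `float(...) if ... is not None else None` pattern above converts `Numeric` columns (SQLAlchemy returns `Decimal`) to JSON-friendly floats. The `is not None` test matters: a plain truthiness check would silently turn a legitimate zero price or fee into `None`. A small sketch of the pattern, with `to_float` a hypothetical helper name:

```python
from decimal import Decimal


def to_float(value):
    """Serialize a Numeric column value, preserving None - and zero."""
    return float(value) if value is not None else None
```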
def _classify_option_strategy(position: Position, transactions: List[Dict]) -> str:
    """
    Classify the option strategy based on position type and transactions.

    Args:
        position: Position object
        transactions: List of transaction dictionaries

    Returns:
        Strategy name (e.g., "Long Call", "Covered Call", "Cash-Secured Put")
    """
    if position.position_type.value == "stock":
        return "Stock"

    # Check if this is a short or long position
    is_short = position.total_quantity < 0

    # For options
    if position.position_type.value == "call":
        if is_short:
            # Short call - could be covered or naked. We'd need to check for a
            # corresponding stock position to tell which; for now, just return
            # "Short Call" (could enhance later).
            return "Short Call (Covered Call)"
        else:
            return "Long Call"
    elif position.position_type.value == "put":
        if is_short:
            # Short put - could be cash-secured or naked
            return "Short Put (Cash-Secured Put)"
        else:
            return "Long Put"

    return "Unknown"
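The classification above only depends on the position type and the sign of the net quantity. That logic can be demonstrated standalone; `FakePosition` is a minimal stand-in for the real `Position` model, used here for illustration only:

```python
from dataclasses import dataclass


@dataclass
class FakePosition:
    """Illustrative stand-in for Position: just the two fields classify() reads."""
    position_type: str   # "stock", "call", or "put"
    total_quantity: float  # negative for short positions


def classify(position: FakePosition) -> str:
    if position.position_type == "stock":
        return "Stock"
    is_short = position.total_quantity < 0
    if position.position_type == "call":
        return "Short Call (Covered Call)" if is_short else "Long Call"
    if position.position_type == "put":
        return "Short Put (Cash-Secured Put)" if is_short else "Long Put"
    return "Unknown"
```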

53 backend/app/config.py Normal file
@@ -0,0 +1,53 @@
"""
Application configuration settings.
Loads configuration from environment variables with sensible defaults.
"""
from pydantic_settings import BaseSettings


class Settings(BaseSettings):
    """Application settings loaded from environment variables."""

    # Database configuration
    POSTGRES_HOST: str = "postgres"
    POSTGRES_PORT: int = 5432
    POSTGRES_DB: str = "fidelitytracker"
    POSTGRES_USER: str = "fidelity"
    POSTGRES_PASSWORD: str = "fidelity123"

    # API configuration
    API_V1_PREFIX: str = "/api"
    PROJECT_NAME: str = "myFidelityTracker"

    # CORS configuration - allow all origins for local development
    CORS_ORIGINS: str = "*"

    @property
    def cors_origins_list(self) -> list[str]:
        """Parse CORS origins from comma-separated string."""
        if self.CORS_ORIGINS == "*":
            return ["*"]
        return [origin.strip() for origin in self.CORS_ORIGINS.split(",")]

    # File import configuration
    IMPORT_DIR: str = "/app/imports"

    # Market data cache TTL (seconds)
    MARKET_DATA_CACHE_TTL: int = 60

    @property
    def database_url(self) -> str:
        """Construct PostgreSQL database URL."""
        return (
            f"postgresql://{self.POSTGRES_USER}:{self.POSTGRES_PASSWORD}"
            f"@{self.POSTGRES_HOST}:{self.POSTGRES_PORT}/{self.POSTGRES_DB}"
        )

    class Config:
        env_file = ".env"
        case_sensitive = True


# Global settings instance
settings = Settings()
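The `cors_origins_list` parsing is pure string manipulation, so its behavior can be checked without pydantic. The same logic as a free function, for illustration:

```python
def cors_origins_list(cors_origins: str) -> list[str]:
    """Mirror of Settings.cors_origins_list, taking the raw string directly."""
    if cors_origins == "*":
        return ["*"]
    return [origin.strip() for origin in cors_origins.split(",")]
```

Note the `strip()`: it makes `CORS_ORIGINS="http://a.local, http://b.local"` (with a space after the comma) parse the same as the tightly packed form.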

38 backend/app/database.py Normal file
@@ -0,0 +1,38 @@
"""
Database configuration and session management.
Provides SQLAlchemy engine and session factory.
"""
from sqlalchemy import create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

from app.config import settings

# Create SQLAlchemy engine
engine = create_engine(
    settings.database_url,
    pool_pre_ping=True,  # Enable connection health checks
    pool_size=10,
    max_overflow=20,
)

# Create session factory
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)

# Base class for SQLAlchemy models
Base = declarative_base()


def get_db():
    """
    Dependency function that provides a database session.
    Automatically closes the session after the request is completed.

    Yields:
        Session: SQLAlchemy database session
    """
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()
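The `get_db` yield/finally shape is what lets FastAPI run teardown (closing the session) after each request. The generator mechanics can be shown without SQLAlchemy; `DummySession` is a stand-in used only for illustration:

```python
class DummySession:
    """Stand-in for a SQLAlchemy Session: records whether close() was called."""
    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True


def get_session():
    # Same yield/finally shape FastAPI relies on in get_db
    db = DummySession()
    try:
        yield db
    finally:
        db.close()


gen = get_session()
session = next(gen)        # dependency resolution: session handed to the endpoint
was_open = not session.closed
gen.close()                # teardown: raises GeneratorExit at the yield, running finally
```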

66 backend/app/main.py Normal file
@@ -0,0 +1,66 @@
"""
FastAPI application entry point for myFidelityTracker.

This module initializes the FastAPI application, configures CORS,
and registers all API routers.
"""
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

from app.config import settings
from app.api.endpoints import accounts, transactions, positions, import_endpoint
from app.api.endpoints import analytics_v2 as analytics

# Create FastAPI application
app = FastAPI(
    title=settings.PROJECT_NAME,
    description="Track and analyze your Fidelity brokerage account performance",
    version="1.0.0",
)

# Configure CORS middleware - allow all origins for local network access
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],  # Allow all origins for local development
    allow_credentials=False,  # Must be False when using allow_origins=["*"]
    allow_methods=["*"],
    allow_headers=["*"],
)


# Register API routers
app.include_router(
    accounts.router, prefix=f"{settings.API_V1_PREFIX}/accounts", tags=["accounts"]
)
app.include_router(
    transactions.router,
    prefix=f"{settings.API_V1_PREFIX}/transactions",
    tags=["transactions"],
)
app.include_router(
    positions.router, prefix=f"{settings.API_V1_PREFIX}/positions", tags=["positions"]
)
app.include_router(
    analytics.router, prefix=f"{settings.API_V1_PREFIX}/analytics", tags=["analytics"]
)
app.include_router(
    import_endpoint.router,
    prefix=f"{settings.API_V1_PREFIX}/import",
    tags=["import"],
)


@app.get("/")
def root():
    """Root endpoint returning API information."""
    return {
        "name": settings.PROJECT_NAME,
        "version": "1.0.0",
        "message": "Welcome to myFidelityTracker API",
    }


@app.get("/health")
def health_check():
    """Health check endpoint."""
    return {"status": "healthy"}

7 backend/app/models/__init__.py Normal file
@@ -0,0 +1,7 @@
"""SQLAlchemy models for the application."""
from app.models.account import Account
from app.models.transaction import Transaction
from app.models.position import Position, PositionTransaction
from app.models.market_price import MarketPrice

__all__ = ["Account", "Transaction", "Position", "PositionTransaction", "MarketPrice"]

41 backend/app/models/account.py Normal file
@@ -0,0 +1,41 @@
"""Account model representing a brokerage account."""
from sqlalchemy import Column, Integer, String, DateTime, Enum
from sqlalchemy.orm import relationship
from sqlalchemy.sql import func
import enum

from app.database import Base


class AccountType(str, enum.Enum):
    """Enumeration of account types."""
    CASH = "cash"
    MARGIN = "margin"


class Account(Base):
    """
    Represents a brokerage account.

    Attributes:
        id: Primary key
        account_number: Unique account identifier
        account_name: Human-readable account name
        account_type: Type of account (cash or margin)
        created_at: Timestamp of account creation
        updated_at: Timestamp of last update
        transactions: Related transactions
        positions: Related positions
    """
    __tablename__ = "accounts"

    id = Column(Integer, primary_key=True, index=True)
    account_number = Column(String(50), unique=True, nullable=False, index=True)
    account_name = Column(String(200), nullable=False)
    account_type = Column(Enum(AccountType), nullable=False, default=AccountType.CASH)
    created_at = Column(DateTime(timezone=True), server_default=func.now(), nullable=False)
    updated_at = Column(DateTime(timezone=True), onupdate=func.now(), server_default=func.now(), nullable=False)

    # Relationships
    transactions = relationship("Transaction", back_populates="account", cascade="all, delete-orphan")
    positions = relationship("Position", back_populates="account", cascade="all, delete-orphan")

29 backend/app/models/market_price.py Normal file
@@ -0,0 +1,29 @@
"""Market price cache model for storing Yahoo Finance data."""
from sqlalchemy import Column, Integer, String, Numeric, DateTime, Index
from datetime import datetime

from app.database import Base


class MarketPrice(Base):
    """
    Cache table for market prices from Yahoo Finance.

    Stores the last fetched price for each symbol to reduce API calls.
    """

    __tablename__ = "market_prices"

    id = Column(Integer, primary_key=True, index=True)
    symbol = Column(String(20), unique=True, nullable=False, index=True)
    price = Column(Numeric(precision=20, scale=6), nullable=False)
    fetched_at = Column(DateTime, nullable=False, default=datetime.utcnow)
    source = Column(String(50), default="yahoo_finance")

    # Index for quick lookups by symbol and freshness checks
    __table_args__ = (
        Index('idx_symbol_fetched', 'symbol', 'fetched_at'),
    )

    def __repr__(self):
        return f"<MarketPrice(symbol={self.symbol}, price={self.price}, fetched_at={self.fetched_at})>"

104 backend/app/models/position.py Normal file
@@ -0,0 +1,104 @@
"""Position model representing a trading position."""
from sqlalchemy import Column, Integer, String, DateTime, Numeric, ForeignKey, Date, Enum, Index
from sqlalchemy.orm import relationship
from sqlalchemy.sql import func
import enum

from app.database import Base


class PositionType(str, enum.Enum):
    """Enumeration of position types."""
    STOCK = "stock"
    CALL = "call"
    PUT = "put"


class PositionStatus(str, enum.Enum):
    """Enumeration of position statuses."""
    OPEN = "open"
    CLOSED = "closed"


class Position(Base):
    """
    Represents a trading position (open or closed).

    A position aggregates related transactions (entries and exits) for a specific security.
    For options, tracks strikes, expirations, and option-specific details.

    Attributes:
        id: Primary key
        account_id: Foreign key to account
        symbol: Base trading symbol (e.g., AAPL)
        option_symbol: Full option symbol if applicable (e.g., -AAPL260116C150)
        position_type: Type (stock, call, put)
        status: Status (open, closed)
        open_date: Date position was opened
        close_date: Date position was closed (if closed)
        total_quantity: Net quantity (can be negative for short positions)
        avg_entry_price: Average entry price
        avg_exit_price: Average exit price (if closed)
        realized_pnl: Realized profit/loss for closed positions
        unrealized_pnl: Unrealized profit/loss for open positions
        created_at: Timestamp of record creation
        updated_at: Timestamp of last update
    """
    __tablename__ = "positions"

    id = Column(Integer, primary_key=True, index=True)
    account_id = Column(Integer, ForeignKey("accounts.id", ondelete="CASCADE"), nullable=False, index=True)

    # Symbol information
    symbol = Column(String(50), nullable=False, index=True)
    option_symbol = Column(String(100), index=True)  # Full option symbol for options
    position_type = Column(Enum(PositionType), nullable=False, default=PositionType.STOCK)

    # Status and dates
    status = Column(Enum(PositionStatus), nullable=False, default=PositionStatus.OPEN, index=True)
    open_date = Column(Date, nullable=False)
    close_date = Column(Date)

    # Position metrics
||||||
|
total_quantity = Column(Numeric(20, 8), nullable=False) # Can be negative for short
|
||||||
|
avg_entry_price = Column(Numeric(20, 8))
|
||||||
|
avg_exit_price = Column(Numeric(20, 8))
|
||||||
|
|
||||||
|
# P&L tracking
|
||||||
|
realized_pnl = Column(Numeric(20, 2)) # For closed positions
|
||||||
|
unrealized_pnl = Column(Numeric(20, 2)) # For open positions
|
||||||
|
|
||||||
|
# Timestamps
|
||||||
|
created_at = Column(DateTime(timezone=True), server_default=func.now(), nullable=False)
|
||||||
|
updated_at = Column(DateTime(timezone=True), onupdate=func.now(), server_default=func.now(), nullable=False)
|
||||||
|
|
||||||
|
# Relationships
|
||||||
|
account = relationship("Account", back_populates="positions")
|
||||||
|
transaction_links = relationship("PositionTransaction", back_populates="position", cascade="all, delete-orphan")
|
||||||
|
|
||||||
|
# Composite indexes for common queries
|
||||||
|
__table_args__ = (
|
||||||
|
Index('idx_account_status', 'account_id', 'status'),
|
||||||
|
Index('idx_account_symbol_status', 'account_id', 'symbol', 'status'),
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
class PositionTransaction(Base):
|
||||||
|
"""
|
||||||
|
Junction table linking positions to transactions.
|
||||||
|
|
||||||
|
A position can have multiple transactions (entries, exits, adjustments).
|
||||||
|
A transaction can be part of multiple positions (e.g., closing multiple lots).
|
||||||
|
|
||||||
|
Attributes:
|
||||||
|
position_id: Foreign key to position
|
||||||
|
transaction_id: Foreign key to transaction
|
||||||
|
"""
|
||||||
|
__tablename__ = "position_transactions"
|
||||||
|
|
||||||
|
position_id = Column(Integer, ForeignKey("positions.id", ondelete="CASCADE"), primary_key=True)
|
||||||
|
transaction_id = Column(Integer, ForeignKey("transactions.id", ondelete="CASCADE"), primary_key=True)
|
||||||
|
|
||||||
|
# Relationships
|
||||||
|
position = relationship("Position", back_populates="transaction_links")
|
||||||
|
transaction = relationship("Transaction", back_populates="position_links")
|
||||||
81
backend/app/models/transaction.py
Normal file
@@ -0,0 +1,81 @@
"""Transaction model representing a brokerage transaction."""
from sqlalchemy import Column, Integer, String, DateTime, Numeric, ForeignKey, Date, Index
from sqlalchemy.orm import relationship
from sqlalchemy.sql import func

from app.database import Base


class Transaction(Base):
    """
    Represents a single brokerage transaction.

    Attributes:
        id: Primary key
        account_id: Foreign key to account
        run_date: Date the transaction was recorded
        action: Description of the transaction action
        symbol: Trading symbol
        description: Full transaction description
        transaction_type: Type (Cash/Margin)
        exchange_quantity: Quantity in exchange currency
        exchange_currency: Exchange currency code
        currency: Transaction currency
        price: Transaction price per unit
        quantity: Number of shares/contracts
        exchange_rate: Currency exchange rate
        commission: Commission fees
        fees: Additional fees
        accrued_interest: Interest accrued
        amount: Total transaction amount
        cash_balance: Account balance after transaction
        settlement_date: Date transaction settles
        unique_hash: SHA-256 hash for deduplication
        created_at: Timestamp of record creation
        updated_at: Timestamp of last update
    """
    __tablename__ = "transactions"

    id = Column(Integer, primary_key=True, index=True)
    account_id = Column(Integer, ForeignKey("accounts.id", ondelete="CASCADE"), nullable=False, index=True)

    # Transaction details from CSV
    run_date = Column(Date, nullable=False, index=True)
    action = Column(String(500), nullable=False)
    symbol = Column(String(50), index=True)
    description = Column(String(500))
    transaction_type = Column(String(20))  # Cash, Margin

    # Quantities and currencies
    exchange_quantity = Column(Numeric(20, 8))
    exchange_currency = Column(String(10))
    currency = Column(String(10))

    # Financial details
    price = Column(Numeric(20, 8))
    quantity = Column(Numeric(20, 8))
    exchange_rate = Column(Numeric(20, 8))
    commission = Column(Numeric(20, 2))
    fees = Column(Numeric(20, 2))
    accrued_interest = Column(Numeric(20, 2))
    amount = Column(Numeric(20, 2))
    cash_balance = Column(Numeric(20, 2))

    settlement_date = Column(Date)

    # Deduplication hash
    unique_hash = Column(String(64), unique=True, nullable=False, index=True)

    # Timestamps
    created_at = Column(DateTime(timezone=True), server_default=func.now(), nullable=False)
    updated_at = Column(DateTime(timezone=True), onupdate=func.now(), server_default=func.now(), nullable=False)

    # Relationships
    account = relationship("Account", back_populates="transactions")
    position_links = relationship("PositionTransaction", back_populates="transaction", cascade="all, delete-orphan")

    # Composite index for common queries
    __table_args__ = (
        Index('idx_account_date', 'account_id', 'run_date'),
        Index('idx_account_symbol', 'account_id', 'symbol'),
    )
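The `unique_hash` column (64 characters, unique, indexed) drives deduplication on import. The real `generate_transaction_hash` helper lives in `app.utils` and is not shown in this commit view; the sketch below is a hypothetical illustration of how such a SHA-256 hex digest can be derived from the identifying fields, not the project's actual implementation:

```python
import hashlib

def transaction_hash(account_id, run_date, symbol, action, amount, quantity, price) -> str:
    """Hypothetical sketch: join the identifying fields and hash them.

    The project's generate_transaction_hash may differ; this only shows why
    the column is String(64) -- a SHA-256 hex digest is 64 characters.
    """
    key = "|".join(str(f) for f in (account_id, run_date, symbol, action, amount, quantity, price))
    return hashlib.sha256(key.encode("utf-8")).hexdigest()

# Identical rows hash identically (the duplicate is skipped on import);
# any differing field yields a different hash.
h1 = transaction_hash(1, "2024-01-02", "AAPL", "YOU BOUGHT", "-1500.00", "10", "150.00")
h2 = transaction_hash(1, "2024-01-02", "AAPL", "YOU BOUGHT", "-1500.00", "10", "150.00")
print(len(h1), h1 == h2)
```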
5
backend/app/parsers/__init__.py
Normal file
@@ -0,0 +1,5 @@
"""CSV parser modules for various brokerage formats."""
from app.parsers.base_parser import BaseParser, ParseResult
from app.parsers.fidelity_parser import FidelityParser

__all__ = ["BaseParser", "ParseResult", "FidelityParser"]
99
backend/app/parsers/base_parser.py
Normal file
@@ -0,0 +1,99 @@
"""Base parser interface for brokerage CSV files."""
from abc import ABC, abstractmethod
from typing import List, Dict, Any, NamedTuple
from pathlib import Path
import pandas as pd


class ParseResult(NamedTuple):
    """
    Result of parsing a brokerage CSV file.

    Attributes:
        transactions: List of parsed transaction dictionaries
        errors: List of error messages encountered during parsing
        row_count: Total number of rows processed
    """
    transactions: List[Dict[str, Any]]
    errors: List[str]
    row_count: int


class BaseParser(ABC):
    """
    Abstract base class for brokerage CSV parsers.

    Provides a standard interface for parsing CSV files from different brokerages.
    Subclasses must implement the parse() method for their specific format.
    """

    @abstractmethod
    def parse(self, file_path: Path) -> ParseResult:
        """
        Parse a brokerage CSV file into standardized transaction dictionaries.

        Args:
            file_path: Path to the CSV file to parse

        Returns:
            ParseResult containing transactions, errors, and row count

        Raises:
            FileNotFoundError: If the file does not exist
            ValueError: If the file format is invalid
        """
        pass

    def _read_csv(self, file_path: Path, **kwargs) -> pd.DataFrame:
        """
        Read CSV file into a pandas DataFrame with error handling.

        Args:
            file_path: Path to CSV file
            **kwargs: Additional arguments passed to pd.read_csv()

        Returns:
            DataFrame containing CSV data

        Raises:
            FileNotFoundError: If file does not exist
            pd.errors.EmptyDataError: If file is empty
        """
        if not file_path.exists():
            raise FileNotFoundError(f"CSV file not found: {file_path}")

        return pd.read_csv(file_path, **kwargs)

    @staticmethod
    def _safe_decimal(value: Any) -> Any:
        """
        Safely convert value to decimal-compatible format, handling NaN and None.

        Args:
            value: Value to convert

        Returns:
            Converted value or None if invalid
        """
        if pd.isna(value):
            return None
        if value == "":
            return None
        return value

    @staticmethod
    def _safe_date(value: Any) -> Any:
        """
        Safely convert value to date, handling NaN and None.

        Args:
            value: Value to convert

        Returns:
            Converted date or None if invalid
        """
        if pd.isna(value):
            return None
        if value == "":
            return None
        return value
257
backend/app/parsers/fidelity_parser.py
Normal file
@@ -0,0 +1,257 @@
"""Fidelity brokerage CSV parser."""
from pathlib import Path
from typing import List, Dict, Any, Optional
import pandas as pd
from datetime import datetime
import re

from app.parsers.base_parser import BaseParser, ParseResult


class FidelityParser(BaseParser):
    """
    Parser for Fidelity brokerage account history CSV files.

    Expected CSV columns:
    - Run Date
    - Action
    - Symbol
    - Description
    - Type
    - Exchange Quantity
    - Exchange Currency
    - Currency
    - Price
    - Quantity
    - Exchange Rate
    - Commission
    - Fees
    - Accrued Interest
    - Amount
    - Cash Balance
    - Settlement Date
    """

    # Expected column names in Fidelity CSV
    EXPECTED_COLUMNS = [
        "Run Date",
        "Action",
        "Symbol",
        "Description",
        "Type",
        "Exchange Quantity",
        "Exchange Currency",
        "Currency",
        "Price",
        "Quantity",
        "Exchange Rate",
        "Commission",
        "Fees",
        "Accrued Interest",
        "Amount",
        "Cash Balance",
        "Settlement Date",
    ]

    def parse(self, file_path: Path) -> ParseResult:
        """
        Parse a Fidelity CSV file into standardized transaction dictionaries.

        Args:
            file_path: Path to the Fidelity CSV file

        Returns:
            ParseResult containing parsed transactions, errors, and row count

        Raises:
            FileNotFoundError: If the file does not exist
            ValueError: If the CSV format is invalid
        """
        errors = []
        transactions = []

        try:
            # Read CSV, skipping empty rows at the beginning
            df = self._read_csv(file_path, skiprows=self._find_header_row(file_path))

            # Validate columns
            missing_cols = set(self.EXPECTED_COLUMNS) - set(df.columns)
            if missing_cols:
                raise ValueError(f"Missing required columns: {missing_cols}")

            # Parse each row
            for idx, row in df.iterrows():
                try:
                    transaction = self._parse_row(row)
                    if transaction:
                        transactions.append(transaction)
                except Exception as e:
                    errors.append(f"Row {idx + 1}: {str(e)}")

            return ParseResult(
                transactions=transactions, errors=errors, row_count=len(df)
            )

        except FileNotFoundError:
            raise
        except Exception as e:
            raise ValueError(f"Failed to parse Fidelity CSV: {str(e)}") from e

    def _find_header_row(self, file_path: Path) -> int:
        """
        Find the row number where the header starts in Fidelity CSV.

        Fidelity CSVs may have empty rows or metadata at the beginning.

        Args:
            file_path: Path to CSV file

        Returns:
            Row number (0-indexed) where the header is located
        """
        with open(file_path, "r", encoding="utf-8-sig") as f:
            for i, line in enumerate(f):
                if "Run Date" in line:
                    return i
        return 0  # Default to first row if not found

    def _extract_real_ticker(self, symbol: str, description: str, action: str) -> Optional[str]:
        """
        Extract the real underlying ticker from option descriptions.

        Fidelity uses internal reference numbers (like 6736999MM) in the Symbol column
        for options, but the real ticker is in the Description/Action in parentheses.

        Examples:
        - Description: "CALL (OPEN) OPENDOOR JAN 16 26 (100 SHS)"
        - Action: "YOU SOLD CLOSING TRANSACTION CALL (OPEN) OPENDOOR..."

        Args:
            symbol: Symbol from CSV (might be Fidelity internal reference)
            description: Description field
            action: Action field

        Returns:
            Real ticker symbol, or original symbol if not found
        """
        # If symbol looks normal (letters only, not Fidelity's numeric codes), return it
        if symbol and re.match(r'^[A-Z]{1,5}$', symbol):
            return symbol

        # Try to extract from description first (more reliable)
        # Pattern: (TICKER) or CALL (TICKER) or PUT (TICKER)
        if description:
            # Look for pattern like "CALL (OPEN)" or "PUT (AAPL)"
            match = re.search(r'(?:CALL|PUT)\s*\(([A-Z]+)\)', description, re.IGNORECASE)
            if match:
                return match.group(1)

            # Look for standalone (TICKER) pattern
            match = re.search(r'\(([A-Z]{1,5})\)', description)
            if match:
                ticker = match.group(1)
                # Make sure it's not something like (100 or (Margin)
                if not ticker.isdigit() and ticker not in ['MARGIN', 'CASH', 'SHS']:
                    return ticker

        # Fall back to action field
        if action:
            match = re.search(r'(?:CALL|PUT)\s*\(([A-Z]+)\)', action, re.IGNORECASE)
            if match:
                return match.group(1)

        # Return original symbol if we couldn't extract anything better
        return symbol if symbol else None

    def _parse_row(self, row: pd.Series) -> Optional[Dict[str, Any]]:
        """
        Parse a single row from Fidelity CSV into a transaction dictionary.

        Args:
            row: Pandas Series representing one CSV row

        Returns:
            Dictionary with transaction data, or None if row should be skipped

        Raises:
            ValueError: If required fields are missing or invalid
        """
        # Parse dates
        run_date = self._parse_date(row["Run Date"])
        settlement_date = self._parse_date(row["Settlement Date"])

        # Extract raw values
        raw_symbol = self._safe_string(row["Symbol"])
        description = self._safe_string(row["Description"])
        action = str(row["Action"]).strip() if pd.notna(row["Action"]) else ""

        # Extract the real ticker (especially important for options)
        actual_symbol = self._extract_real_ticker(raw_symbol, description, action)

        # Extract and clean values
        transaction = {
            "run_date": run_date,
            "action": action,
            "symbol": actual_symbol,
            "description": description,
            "transaction_type": self._safe_string(row["Type"]),
            "exchange_quantity": self._safe_decimal(row["Exchange Quantity"]),
            "exchange_currency": self._safe_string(row["Exchange Currency"]),
            "currency": self._safe_string(row["Currency"]),
            "price": self._safe_decimal(row["Price"]),
            "quantity": self._safe_decimal(row["Quantity"]),
            "exchange_rate": self._safe_decimal(row["Exchange Rate"]),
            "commission": self._safe_decimal(row["Commission"]),
            "fees": self._safe_decimal(row["Fees"]),
            "accrued_interest": self._safe_decimal(row["Accrued Interest"]),
            "amount": self._safe_decimal(row["Amount"]),
            "cash_balance": self._safe_decimal(row["Cash Balance"]),
            "settlement_date": settlement_date,
        }

        return transaction

    def _parse_date(self, date_value: Any) -> Any:
        """
        Parse date value from CSV, handling various formats.

        Args:
            date_value: Date value from CSV (string or datetime)

        Returns:
            datetime.date object or None if empty/invalid
        """
        if pd.isna(date_value) or date_value == "":
            return None

        # If already a datetime object
        if isinstance(date_value, datetime):
            return date_value.date()

        # Try parsing common date formats
        date_str = str(date_value).strip()
        if not date_str:
            return None

        # Try common formats
        for fmt in ["%m/%d/%Y", "%Y-%m-%d", "%m-%d-%Y"]:
            try:
                return datetime.strptime(date_str, fmt).date()
            except ValueError:
                continue

        return None

    def _safe_string(self, value: Any) -> Optional[str]:
        """
        Safely convert value to string, handling NaN and empty values.

        Args:
            value: Value to convert

        Returns:
            String value or None if empty
        """
        if pd.isna(value) or value == "":
            return None
        return str(value).strip()
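The description-based branch of `_extract_real_ticker` can be exercised standalone. A minimal sketch; the `extract_ticker` helper is illustrative and only mirrors the Description-field logic of the parser, without the symbol fast-path or Action fallback:

```python
import re
from typing import Optional

def extract_ticker(description: str) -> Optional[str]:
    """Mirror of the parser's description-based ticker extraction (sketch).

    Tries "CALL (TICKER)" / "PUT (TICKER)" first, then any bare "(TICKER)"
    that is not a false positive like (100 SHS) or (Margin).
    """
    match = re.search(r'(?:CALL|PUT)\s*\(([A-Z]+)\)', description, re.IGNORECASE)
    if match:
        return match.group(1)
    match = re.search(r'\(([A-Z]{1,5})\)', description)
    if match and match.group(1) not in ('MARGIN', 'CASH', 'SHS'):
        return match.group(1)
    return None

# Opendoor's ticker really is OPEN -- the parenthesized token after CALL.
print(extract_ticker("CALL (OPEN) OPENDOOR JAN 16 26 (100 SHS)"))
```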
14
backend/app/schemas/__init__.py
Normal file
@@ -0,0 +1,14 @@
"""Pydantic schemas for API request/response validation."""
from app.schemas.account import AccountCreate, AccountUpdate, AccountResponse
from app.schemas.transaction import TransactionCreate, TransactionResponse
from app.schemas.position import PositionResponse, PositionStats

__all__ = [
    "AccountCreate",
    "AccountUpdate",
    "AccountResponse",
    "TransactionCreate",
    "TransactionResponse",
    "PositionResponse",
    "PositionStats",
]
34
backend/app/schemas/account.py
Normal file
@@ -0,0 +1,34 @@
"""Pydantic schemas for account-related API operations."""
from pydantic import BaseModel, Field
from datetime import datetime
from typing import Optional

from app.models.account import AccountType


class AccountBase(BaseModel):
    """Base schema for account data."""
    account_number: str = Field(..., description="Unique account identifier")
    account_name: str = Field(..., description="Human-readable account name")
    account_type: AccountType = Field(default=AccountType.CASH, description="Account type")


class AccountCreate(AccountBase):
    """Schema for creating a new account."""
    pass


class AccountUpdate(BaseModel):
    """Schema for updating an existing account."""
    account_name: Optional[str] = Field(None, description="Updated account name")
    account_type: Optional[AccountType] = Field(None, description="Updated account type")


class AccountResponse(AccountBase):
    """Schema for account API responses."""
    id: int
    created_at: datetime
    updated_at: datetime

    class Config:
        from_attributes = True
45
backend/app/schemas/position.py
Normal file
@@ -0,0 +1,45 @@
"""Pydantic schemas for position-related API operations."""
from pydantic import BaseModel, Field
from datetime import date, datetime
from typing import Optional
from decimal import Decimal

from app.models.position import PositionType, PositionStatus


class PositionBase(BaseModel):
    """Base schema for position data."""
    symbol: str
    option_symbol: Optional[str] = None
    position_type: PositionType
    status: PositionStatus
    open_date: date
    close_date: Optional[date] = None
    total_quantity: Decimal
    avg_entry_price: Optional[Decimal] = None
    avg_exit_price: Optional[Decimal] = None
    realized_pnl: Optional[Decimal] = None
    unrealized_pnl: Optional[Decimal] = None


class PositionResponse(PositionBase):
    """Schema for position API responses."""
    id: int
    account_id: int
    created_at: datetime
    updated_at: datetime

    class Config:
        from_attributes = True


class PositionStats(BaseModel):
    """Schema for aggregate position statistics."""
    total_positions: int = Field(..., description="Total number of positions")
    open_positions: int = Field(..., description="Number of open positions")
    closed_positions: int = Field(..., description="Number of closed positions")
    total_realized_pnl: Decimal = Field(..., description="Total realized P&L")
    total_unrealized_pnl: Decimal = Field(..., description="Total unrealized P&L")
    win_rate: float = Field(..., description="Percentage of profitable trades")
    avg_win: Decimal = Field(..., description="Average profit on winning trades")
    avg_loss: Decimal = Field(..., description="Average loss on losing trades")
44
backend/app/schemas/transaction.py
Normal file
@@ -0,0 +1,44 @@
"""Pydantic schemas for transaction-related API operations."""
from pydantic import BaseModel, Field
from datetime import date, datetime
from typing import Optional
from decimal import Decimal


class TransactionBase(BaseModel):
    """Base schema for transaction data."""
    run_date: date
    action: str
    symbol: Optional[str] = None
    description: Optional[str] = None
    transaction_type: Optional[str] = None
    exchange_quantity: Optional[Decimal] = None
    exchange_currency: Optional[str] = None
    currency: Optional[str] = None
    price: Optional[Decimal] = None
    quantity: Optional[Decimal] = None
    exchange_rate: Optional[Decimal] = None
    commission: Optional[Decimal] = None
    fees: Optional[Decimal] = None
    accrued_interest: Optional[Decimal] = None
    amount: Optional[Decimal] = None
    cash_balance: Optional[Decimal] = None
    settlement_date: Optional[date] = None


class TransactionCreate(TransactionBase):
    """Schema for creating a new transaction."""
    account_id: int
    unique_hash: str


class TransactionResponse(TransactionBase):
    """Schema for transaction API responses."""
    id: int
    account_id: int
    unique_hash: str
    created_at: datetime
    updated_at: datetime

    class Config:
        from_attributes = True
6
backend/app/services/__init__.py
Normal file
@@ -0,0 +1,6 @@
"""Business logic services."""
from app.services.import_service import ImportService, ImportResult
from app.services.position_tracker import PositionTracker
from app.services.performance_calculator import PerformanceCalculator

__all__ = ["ImportService", "ImportResult", "PositionTracker", "PerformanceCalculator"]
149
backend/app/services/import_service.py
Normal file
@@ -0,0 +1,149 @@
|
|||||||
|
"""Service for importing transactions from CSV files."""
|
||||||
|
from pathlib import Path
|
||||||
|
from typing import List, Dict, Any, NamedTuple
|
||||||
|
from sqlalchemy.orm import Session
|
||||||
|
from sqlalchemy.exc import IntegrityError
|
||||||
|
|
||||||
|
from app.parsers import FidelityParser
|
||||||
|
from app.models import Transaction
|
||||||
|
from app.utils import generate_transaction_hash
|
||||||
|
|
||||||
|
|
||||||
|
class ImportResult(NamedTuple):
|
||||||
|
"""
|
||||||
|
Result of an import operation.
|
||||||
|
|
||||||
|
Attributes:
|
||||||
|
imported: Number of successfully imported transactions
|
||||||
|
skipped: Number of skipped duplicate transactions
|
||||||
|
errors: List of error messages
|
||||||
|
total_rows: Total number of rows processed
|
||||||
|
"""
|
||||||
|
imported: int
|
||||||
|
skipped: int
|
||||||
|
errors: List[str]
|
||||||
|
total_rows: int
|
||||||
|
|
||||||
|
|
||||||
|
class ImportService:
|
||||||
|
"""
|
||||||
|
Service for importing transactions from brokerage CSV files.
|
||||||
|
|
||||||
|
Handles parsing, deduplication, and database insertion.
|
||||||
|
"""
|
||||||
|
|
||||||
|
def __init__(self, db: Session):
|
||||||
|
"""
|
||||||
|
Initialize import service.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
db: Database session
|
||||||
|
"""
|
||||||
|
self.db = db
|
||||||
|
self.parser = FidelityParser() # Can be extended to support multiple parsers
|
||||||
|
|
||||||
|
def import_from_file(self, file_path: Path, account_id: int) -> ImportResult:
|
||||||
|
"""
|
||||||
|
Import transactions from a CSV file.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
file_path: Path to CSV file
|
||||||
|
account_id: ID of the account to import transactions for
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
ImportResult with statistics
|
||||||
|
|
||||||
|
Raises:
|
||||||
|
FileNotFoundError: If file doesn't exist
|
||||||
|
ValueError: If file format is invalid
|
||||||
|
"""
|
||||||
|
# Parse CSV file
|
||||||
|
parse_result = self.parser.parse(file_path)
|
||||||
|
|
||||||
|
imported = 0
|
||||||
|
skipped = 0
|
||||||
|
errors = list(parse_result.errors)
|
||||||
|
|
||||||
|
# Process each transaction
|
||||||
|
for txn_data in parse_result.transactions:
|
||||||
|
try:
|
||||||
|
# Generate deduplication hash
|
||||||
|
                unique_hash = generate_transaction_hash(
                    account_id=account_id,
                    run_date=txn_data["run_date"],
                    symbol=txn_data.get("symbol"),
                    action=txn_data["action"],
                    amount=txn_data.get("amount"),
                    quantity=txn_data.get("quantity"),
                    price=txn_data.get("price"),
                )

                # Check if transaction already exists
                existing = (
                    self.db.query(Transaction)
                    .filter(Transaction.unique_hash == unique_hash)
                    .first()
                )

                if existing:
                    skipped += 1
                    continue

                # Create new transaction
                transaction = Transaction(
                    account_id=account_id,
                    unique_hash=unique_hash,
                    **txn_data
                )

                self.db.add(transaction)
                self.db.commit()
                imported += 1

            except IntegrityError:
                # Duplicate hash (edge case if concurrent imports)
                self.db.rollback()
                skipped += 1
            except Exception as e:
                self.db.rollback()
                errors.append(f"Failed to import transaction: {str(e)}")

        return ImportResult(
            imported=imported,
            skipped=skipped,
            errors=errors,
            total_rows=parse_result.row_count,
        )

    def import_from_directory(
        self, directory: Path, account_id: int, pattern: str = "*.csv"
    ) -> Dict[str, ImportResult]:
        """
        Import transactions from all CSV files in a directory.

        Args:
            directory: Path to directory containing CSV files
            account_id: ID of the account to import transactions for
            pattern: Glob pattern for matching files (default: *.csv)

        Returns:
            Dictionary mapping filename to ImportResult
        """
        if not directory.exists() or not directory.is_dir():
            raise ValueError(f"Invalid directory: {directory}")

        results = {}

        for file_path in directory.glob(pattern):
            try:
                result = self.import_from_file(file_path, account_id)
                results[file_path.name] = result
            except Exception as e:
                results[file_path.name] = ImportResult(
                    imported=0,
                    skipped=0,
                    errors=[str(e)],
                    total_rows=0,
                )

        return results
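Deduplication above hinges on `generate_transaction_hash` producing a stable digest for identical rows. A minimal sketch of such a helper (the canonicalization and hashing scheme here are assumptions for illustration, not the project's actual implementation):

```python
import hashlib


def generate_transaction_hash(**fields) -> str:
    """Hypothetical sketch: deterministic digest over transaction fields.

    Keys are sorted so argument order never changes the hash, and values
    (including None) are stringified consistently.
    """
    canonical = "|".join(f"{k}={fields[k]}" for k in sorted(fields))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()


# Identical rows collide by construction; any field change alters the hash.
h1 = generate_transaction_hash(account_id=1, symbol="AAPL", amount="150.00")
h2 = generate_transaction_hash(symbol="AAPL", account_id=1, amount="150.00")
h3 = generate_transaction_hash(account_id=1, symbol="AAPL", amount="151.00")
assert h1 == h2 and h1 != h3
```

Because the hash is derived purely from row content, re-importing the same CSV is idempotent: the `unique_hash` lookup (and the unique constraint backing the `IntegrityError` path) catches every duplicate.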
330
backend/app/services/market_data_service.py
Normal file
@@ -0,0 +1,330 @@
"""
Market data service with rate limiting, caching, and batch processing.

This service handles fetching market prices from Yahoo Finance with:
- Database-backed caching to survive restarts
- Rate limiting with exponential backoff
- Batch processing to reduce API calls
- Stale-while-revalidate pattern for better UX
"""
import time
import yfinance as yf
from sqlalchemy.orm import Session
from sqlalchemy import and_
from typing import Dict, List, Optional
from decimal import Decimal
from datetime import datetime, timedelta
import logging

from app.models.market_price import MarketPrice

logger = logging.getLogger(__name__)


class MarketDataService:
    """Service for fetching and caching market prices with rate limiting."""

    def __init__(self, db: Session, cache_ttl_seconds: int = 300):
        """
        Initialize market data service.

        Args:
            db: Database session
            cache_ttl_seconds: How long cached prices are considered fresh (default: 5 minutes)
        """
        self.db = db
        self.cache_ttl = cache_ttl_seconds
        self._rate_limit_delay = 0.5  # Start with 500ms between requests
        self._last_request_time = 0.0
        self._consecutive_errors = 0
        self._max_retries = 3

    @staticmethod
    def _is_valid_stock_symbol(symbol: str) -> bool:
        """
        Check if a symbol is a valid stock ticker (not an option symbol or CUSIP).

        Args:
            symbol: Symbol to check

        Returns:
            True if it looks like a valid stock ticker
        """
        if not symbol or len(symbol) > 5:
            return False

        # Stock symbols should start with a letter, not a number
        # Numbers indicate CUSIP codes or option symbols
        if symbol[0].isdigit():
            return False

        # Should be mostly uppercase letters
        # Allow '.' and '-' for share classes (e.g., BRK.B is sometimes written BRK-B)
        return symbol.replace('-', '').replace('.', '').isalpha()

    def get_price(self, symbol: str, allow_stale: bool = True) -> Optional[Decimal]:
        """
        Get current price for a symbol with caching.

        Args:
            symbol: Stock ticker symbol
            allow_stale: If True, return stale cache data instead of None

        Returns:
            Price or None if unavailable
        """
        # Skip invalid symbols (option symbols, CUSIPs, etc.)
        if not self._is_valid_stock_symbol(symbol):
            logger.debug(f"Skipping invalid symbol: {symbol} (not a stock ticker)")
            return None

        # Check database cache first
        cached = self._get_cached_price(symbol)

        if cached:
            price, age_seconds = cached
            if age_seconds < self.cache_ttl:
                # Fresh cache hit
                logger.debug(f"Cache HIT (fresh): {symbol} = ${price} (age: {age_seconds}s)")
                return price
            elif allow_stale:
                # Stale cache hit, but we'll return it
                logger.debug(f"Cache HIT (stale): {symbol} = ${price} (age: {age_seconds}s)")
                return price

        # Cache miss or expired - fetch from Yahoo Finance
        logger.info(f"Cache MISS: {symbol}, fetching from Yahoo Finance...")
        fresh_price = self._fetch_from_yahoo(symbol)

        if fresh_price is not None:
            self._update_cache(symbol, fresh_price)
            return fresh_price

        # If fetch failed and we have stale data, return it
        if cached and allow_stale:
            price, age_seconds = cached
            logger.warning(f"Yahoo fetch failed, using stale cache: {symbol} = ${price} (age: {age_seconds}s)")
            return price

        return None

    def get_prices_batch(
        self,
        symbols: List[str],
        allow_stale: bool = True,
        max_fetches: int = 10
    ) -> Dict[str, Optional[Decimal]]:
        """
        Get prices for multiple symbols with rate limiting.

        Args:
            symbols: List of ticker symbols
            allow_stale: Return stale cache data if available
            max_fetches: Maximum number of API calls to make (remaining use cache)

        Returns:
            Dictionary mapping symbol to price (or None if unavailable)
        """
        results = {}
        symbols_to_fetch = []

        # First pass: Check cache for all symbols
        for symbol in symbols:
            # Skip invalid symbols
            if not self._is_valid_stock_symbol(symbol):
                logger.debug(f"Skipping invalid symbol in batch: {symbol}")
                results[symbol] = None
                continue
            cached = self._get_cached_price(symbol)

            if cached:
                price, age_seconds = cached
                if age_seconds < self.cache_ttl:
                    # Fresh cache - use it
                    results[symbol] = price
                elif allow_stale:
                    # Stale but usable
                    results[symbol] = price
                    if age_seconds < self.cache_ttl * 2:  # Not TOO stale
                        symbols_to_fetch.append(symbol)
                else:
                    # Stale and not allowing stale - need to fetch
                    symbols_to_fetch.append(symbol)
            else:
                # No cache at all
                symbols_to_fetch.append(symbol)

        # Second pass: Fetch missing/stale symbols (with limit)
        if symbols_to_fetch:
            logger.info(f"Batch fetching {len(symbols_to_fetch)} symbols (max: {max_fetches})")

            for i, symbol in enumerate(symbols_to_fetch[:max_fetches]):
                if i > 0:
                    # Rate limiting delay
                    time.sleep(self._rate_limit_delay)

                price = self._fetch_from_yahoo(symbol)
                if price is not None:
                    results[symbol] = price
                    self._update_cache(symbol, price)
                elif symbol not in results:
                    # No cached value and fetch failed
                    results[symbol] = None

        return results

    def refresh_stale_prices(self, min_age_seconds: int = 300, limit: int = 20) -> int:
        """
        Background task to refresh stale prices.

        Args:
            min_age_seconds: Only refresh prices older than this
            limit: Maximum number of prices to refresh

        Returns:
            Number of prices refreshed
        """
        cutoff_time = datetime.utcnow() - timedelta(seconds=min_age_seconds)

        # Get stale prices ordered by oldest first
        stale_prices = (
            self.db.query(MarketPrice)
            .filter(MarketPrice.fetched_at < cutoff_time)
            .order_by(MarketPrice.fetched_at.asc())
            .limit(limit)
            .all()
        )

        refreshed = 0
        for cached_price in stale_prices:
            time.sleep(self._rate_limit_delay)

            fresh_price = self._fetch_from_yahoo(cached_price.symbol)
            if fresh_price is not None:
                self._update_cache(cached_price.symbol, fresh_price)
                refreshed += 1

        logger.info(f"Refreshed {refreshed}/{len(stale_prices)} stale prices")
        return refreshed

    def _get_cached_price(self, symbol: str) -> Optional[tuple[Decimal, float]]:
        """
        Get cached price from database.

        Returns:
            Tuple of (price, age_in_seconds) or None if not cached
        """
        cached = (
            self.db.query(MarketPrice)
            .filter(MarketPrice.symbol == symbol)
            .first()
        )

        if cached:
            age = (datetime.utcnow() - cached.fetched_at).total_seconds()
            return (cached.price, age)

        return None

    def _update_cache(self, symbol: str, price: Decimal) -> None:
        """Update or insert price in database cache."""
        cached = (
            self.db.query(MarketPrice)
            .filter(MarketPrice.symbol == symbol)
            .first()
        )

        if cached:
            cached.price = price
            cached.fetched_at = datetime.utcnow()
        else:
            new_price = MarketPrice(
                symbol=symbol,
                price=price,
                fetched_at=datetime.utcnow()
            )
            self.db.add(new_price)

        self.db.commit()

    def _fetch_from_yahoo(self, symbol: str) -> Optional[Decimal]:
        """
        Fetch price from Yahoo Finance with rate limiting and retries.

        Returns:
            Price or None if fetch failed
        """
        for attempt in range(self._max_retries):
            try:
                # Rate limiting
                elapsed = time.time() - self._last_request_time
                if elapsed < self._rate_limit_delay:
                    time.sleep(self._rate_limit_delay - elapsed)

                self._last_request_time = time.time()

                # Fetch from Yahoo
                ticker = yf.Ticker(symbol)
                info = ticker.info

                # Try different price fields
                for field in ["currentPrice", "regularMarketPrice", "previousClose"]:
                    if field in info and info[field]:
                        price = Decimal(str(info[field]))

                        # Success - reset error tracking
                        self._consecutive_errors = 0
                        self._rate_limit_delay = max(0.5, self._rate_limit_delay * 0.9)  # Gradually decrease delay

                        logger.debug(f"Fetched {symbol} = ${price}")
                        return price

                # No price found in response
                logger.warning(f"No price data in Yahoo response for {symbol}")
                return None

            except Exception as e:
                error_str = str(e).lower()

                if "429" in error_str or "too many requests" in error_str:
                    # Rate limit hit - back off exponentially
                    self._consecutive_errors += 1
                    self._rate_limit_delay = min(10.0, self._rate_limit_delay * 2)  # Double delay, max 10s

                    logger.warning(
                        f"Rate limit hit for {symbol} (attempt {attempt + 1}/{self._max_retries}), "
                        f"backing off to {self._rate_limit_delay}s delay"
                    )

                    if attempt < self._max_retries - 1:
                        time.sleep(self._rate_limit_delay * (attempt + 1))  # Longer wait for retries
                        continue
                else:
                    # Other error
                    logger.error(f"Error fetching {symbol}: {e}")
                    return None

        logger.error(f"Failed to fetch {symbol} after {self._max_retries} attempts")
        return None

    def clear_cache(self, older_than_days: int = 30) -> int:
        """
        Clear old cached prices.

        Args:
            older_than_days: Delete prices older than this many days

        Returns:
            Number of records deleted
        """
        cutoff = datetime.utcnow() - timedelta(days=older_than_days)

        deleted = (
            self.db.query(MarketPrice)
            .filter(MarketPrice.fetched_at < cutoff)
            .delete()
        )

        self.db.commit()
        logger.info(f"Cleared {deleted} cached prices older than {older_than_days} days")
        return deleted
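The backoff policy in `_fetch_from_yahoo` doubles the inter-request delay on HTTP 429 (capped at 10 s) and decays it by 10% on each success (floored at the 0.5 s starting value). The dynamics of that policy, sketched in isolation:

```python
def on_rate_limited(delay: float) -> float:
    # Double the inter-request delay, capped at 10 seconds
    return min(10.0, delay * 2)


def on_success(delay: float) -> float:
    # Relax the delay by 10% per success, floored at 0.5 seconds
    return max(0.5, delay * 0.9)


delay = 0.5
for _ in range(6):  # six consecutive 429 responses
    delay = on_rate_limited(delay)
assert delay == 10.0  # 0.5 -> 1 -> 2 -> 4 -> 8 -> 10 -> 10 (capped)

for _ in range(100):  # a long run of successful fetches
    delay = on_success(delay)
assert delay == 0.5  # decays back to the floor
```

The asymmetry (multiplicative increase, gentle decay) means one burst of 429s slows the service sharply, while recovery to full speed requires a sustained run of successes.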
364
backend/app/services/performance_calculator.py
Normal file
@@ -0,0 +1,364 @@
"""Service for calculating performance metrics and unrealized P&L."""
from sqlalchemy.orm import Session
from sqlalchemy import and_, func
from typing import Dict, Optional
from decimal import Decimal
from datetime import datetime, timedelta
import yfinance as yf
from functools import lru_cache

from app.models import Position, Transaction
from app.models.position import PositionStatus


class PerformanceCalculator:
    """
    Service for calculating performance metrics and market data.

    Integrates with Yahoo Finance API for real-time pricing of open positions.
    """

    def __init__(self, db: Session, cache_ttl: int = 60):
        """
        Initialize performance calculator.

        Args:
            db: Database session
            cache_ttl: Cache time-to-live in seconds (default: 60)
        """
        self.db = db
        self.cache_ttl = cache_ttl
        self._price_cache: Dict[str, tuple[Decimal, datetime]] = {}

    def calculate_unrealized_pnl(self, position: Position) -> Optional[Decimal]:
        """
        Calculate unrealized P&L for an open position.

        Args:
            position: Open position to calculate P&L for

        Returns:
            Unrealized P&L or None if market data unavailable
        """
        if position.status != PositionStatus.OPEN:
            return None

        # Get current market price
        current_price = self.get_current_price(position.symbol)
        if current_price is None:
            return None

        if position.avg_entry_price is None:
            return None

        # Calculate P&L based on position direction
        quantity = abs(position.total_quantity)
        is_short = position.total_quantity < 0

        if is_short:
            # Short position: profit when price decreases
            pnl = (position.avg_entry_price - current_price) * quantity * 100
        else:
            # Long position: profit when price increases
            pnl = (current_price - position.avg_entry_price) * quantity * 100

        # Subtract fees and commissions from opening transactions
        total_fees = Decimal("0")
        for link in position.transaction_links:
            txn = link.transaction
            if txn.commission:
                total_fees += txn.commission
            if txn.fees:
                total_fees += txn.fees

        pnl -= total_fees

        return pnl

    def update_open_positions_pnl(self, account_id: int) -> int:
        """
        Update unrealized P&L for all open positions in an account.

        Args:
            account_id: Account ID to update

        Returns:
            Number of positions updated
        """
        open_positions = (
            self.db.query(Position)
            .filter(
                and_(
                    Position.account_id == account_id,
                    Position.status == PositionStatus.OPEN,
                )
            )
            .all()
        )

        updated = 0
        for position in open_positions:
            unrealized_pnl = self.calculate_unrealized_pnl(position)
            if unrealized_pnl is not None:
                position.unrealized_pnl = unrealized_pnl
                updated += 1

        self.db.commit()
        return updated

    def get_current_price(self, symbol: str) -> Optional[Decimal]:
        """
        Get current market price for a symbol.

        Uses Yahoo Finance API with caching to reduce API calls.

        Args:
            symbol: Stock ticker symbol

        Returns:
            Current price or None if unavailable
        """
        # Check cache
        if symbol in self._price_cache:
            price, timestamp = self._price_cache[symbol]
            if datetime.now() - timestamp < timedelta(seconds=self.cache_ttl):
                return price

        # Fetch from Yahoo Finance
        try:
            ticker = yf.Ticker(symbol)
            info = ticker.info

            # Try different price fields
            current_price = None
            for field in ["currentPrice", "regularMarketPrice", "previousClose"]:
                if field in info and info[field]:
                    current_price = Decimal(str(info[field]))
                    break

            if current_price is not None:
                # Cache the price
                self._price_cache[symbol] = (current_price, datetime.now())
                return current_price

        except Exception:
            # Failed to fetch price
            pass

        return None

    def calculate_account_stats(self, account_id: int) -> Dict:
        """
        Calculate aggregate statistics for an account.

        Args:
            account_id: Account ID

        Returns:
            Dictionary with performance metrics
        """
        # Get all positions
        positions = (
            self.db.query(Position)
            .filter(Position.account_id == account_id)
            .all()
        )

        total_positions = len(positions)
        open_positions_count = sum(
            1 for p in positions if p.status == PositionStatus.OPEN
        )
        closed_positions_count = sum(
            1 for p in positions if p.status == PositionStatus.CLOSED
        )

        # Calculate P&L
        total_realized_pnl = sum(
            (p.realized_pnl or Decimal("0"))
            for p in positions
            if p.status == PositionStatus.CLOSED
        )

        # Update unrealized P&L for open positions
        self.update_open_positions_pnl(account_id)

        total_unrealized_pnl = sum(
            (p.unrealized_pnl or Decimal("0"))
            for p in positions
            if p.status == PositionStatus.OPEN
        )

        # Calculate win rate and average win/loss
        closed_with_pnl = [
            p for p in positions
            if p.status == PositionStatus.CLOSED and p.realized_pnl is not None
        ]

        if closed_with_pnl:
            winning_trades = [p for p in closed_with_pnl if p.realized_pnl > 0]
            losing_trades = [p for p in closed_with_pnl if p.realized_pnl < 0]

            win_rate = (len(winning_trades) / len(closed_with_pnl)) * 100

            avg_win = (
                sum(p.realized_pnl for p in winning_trades) / len(winning_trades)
                if winning_trades
                else Decimal("0")
            )

            avg_loss = (
                sum(p.realized_pnl for p in losing_trades) / len(losing_trades)
                if losing_trades
                else Decimal("0")
            )
        else:
            win_rate = 0.0
            avg_win = Decimal("0")
            avg_loss = Decimal("0")

        # Get current account balance from latest transaction
        latest_txn = (
            self.db.query(Transaction)
            .filter(Transaction.account_id == account_id)
            .order_by(Transaction.run_date.desc(), Transaction.id.desc())
            .first()
        )

        current_balance = (
            latest_txn.cash_balance if latest_txn and latest_txn.cash_balance else Decimal("0")
        )

        return {
            "total_positions": total_positions,
            "open_positions": open_positions_count,
            "closed_positions": closed_positions_count,
            "total_realized_pnl": float(total_realized_pnl),
            "total_unrealized_pnl": float(total_unrealized_pnl),
            "total_pnl": float(total_realized_pnl + total_unrealized_pnl),
            "win_rate": float(win_rate),
            "avg_win": float(avg_win),
            "avg_loss": float(avg_loss),
            "current_balance": float(current_balance),
        }

    def get_balance_history(
        self, account_id: int, days: int = 30
    ) -> list[Dict]:
        """
        Get account balance history for charting.

        Args:
            account_id: Account ID
            days: Number of days to retrieve

        Returns:
            List of {date, balance} dictionaries
        """
        cutoff_date = datetime.now().date() - timedelta(days=days)

        transactions = (
            self.db.query(Transaction.run_date, Transaction.cash_balance)
            .filter(
                and_(
                    Transaction.account_id == account_id,
                    Transaction.run_date >= cutoff_date,
                    Transaction.cash_balance.isnot(None),
                )
            )
            .order_by(Transaction.run_date)
            .all()
        )

        # Get one balance per day (use last transaction of the day)
        daily_balances = {}
        for txn in transactions:
            daily_balances[txn.run_date] = float(txn.cash_balance)

        return [
            {"date": date.isoformat(), "balance": balance}
            for date, balance in sorted(daily_balances.items())
        ]

    def get_top_trades(
        self, account_id: int, limit: int = 10
    ) -> list[Dict]:
        """
        Get top performing trades (by realized P&L).

        Args:
            account_id: Account ID
            limit: Maximum number of trades to return

        Returns:
            List of trade dictionaries
        """
        positions = (
            self.db.query(Position)
            .filter(
                and_(
                    Position.account_id == account_id,
                    Position.status == PositionStatus.CLOSED,
                    Position.realized_pnl.isnot(None),
                )
            )
            .order_by(Position.realized_pnl.desc())
            .limit(limit)
            .all()
        )

        return [
            {
                "symbol": p.symbol,
                "option_symbol": p.option_symbol,
                "position_type": p.position_type.value,
                "open_date": p.open_date.isoformat(),
                "close_date": p.close_date.isoformat() if p.close_date else None,
                "quantity": float(p.total_quantity),
                "entry_price": float(p.avg_entry_price) if p.avg_entry_price else None,
                "exit_price": float(p.avg_exit_price) if p.avg_exit_price else None,
                "realized_pnl": float(p.realized_pnl),
            }
            for p in positions
        ]

    def get_worst_trades(
        self, account_id: int, limit: int = 20
    ) -> list[Dict]:
        """
        Get worst performing trades (biggest losses by realized P&L).

        Args:
            account_id: Account ID
            limit: Maximum number of trades to return

        Returns:
            List of trade dictionaries
        """
        positions = (
            self.db.query(Position)
            .filter(
                and_(
                    Position.account_id == account_id,
                    Position.status == PositionStatus.CLOSED,
                    Position.realized_pnl.isnot(None),
                )
            )
            .order_by(Position.realized_pnl.asc())
            .limit(limit)
            .all()
        )

        return [
            {
                "symbol": p.symbol,
                "option_symbol": p.option_symbol,
                "position_type": p.position_type.value,
                "open_date": p.open_date.isoformat(),
                "close_date": p.close_date.isoformat() if p.close_date else None,
                "quantity": float(p.total_quantity),
                "entry_price": float(p.avg_entry_price) if p.avg_entry_price else None,
                "exit_price": float(p.avg_exit_price) if p.avg_exit_price else None,
                "realized_pnl": float(p.realized_pnl),
            }
            for p in positions
        ]
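Both calculators share the same directional P&L formula: the entry/current spread, times quantity, times the 100-share option contract multiplier, with the sign of the quantity encoding direction (note that as written, the service applies the 100x multiplier to every position). The core arithmetic, isolated with `Decimal` as in the service:

```python
from decimal import Decimal


def directional_pnl(entry: Decimal, current: Decimal, signed_qty: Decimal) -> Decimal:
    """Unrealized P&L before fees; the quantity's sign encodes direction."""
    qty = abs(signed_qty)
    if signed_qty < 0:
        # Short: profit when price falls below entry
        return (entry - current) * qty * 100
    # Long: profit when price rises above entry
    return (current - entry) * qty * 100


# Long 2 contracts, entry 1.50, now 2.10 -> +120.00
assert directional_pnl(Decimal("1.50"), Decimal("2.10"), Decimal("2")) == Decimal("120.00")
# Short 1 contract, entry 3.00, now 2.25 -> +75.00
assert directional_pnl(Decimal("3.00"), Decimal("2.25"), Decimal("-1")) == Decimal("75.00")
```

Keeping everything in `Decimal` until the final `float(...)` conversion in the API-facing dictionaries avoids the accumulation of binary floating-point error across many positions.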
433
backend/app/services/performance_calculator_v2.py
Normal file
@@ -0,0 +1,433 @@
|
|||||||
|
"""
|
||||||
|
Improved performance calculator with rate-limited market data fetching.
|
||||||
|
|
||||||
|
This version uses the MarketDataService for efficient, cached price lookups.
|
||||||
|
"""
|
||||||
|
from sqlalchemy.orm import Session
|
||||||
|
from sqlalchemy import and_
|
||||||
|
from typing import Dict, Optional
|
||||||
|
from decimal import Decimal
|
||||||
|
from datetime import datetime, timedelta
|
||||||
|
import logging
|
||||||
|
|
||||||
|
from app.models import Position, Transaction
|
||||||
|
from app.models.position import PositionStatus
|
||||||
|
from app.services.market_data_service import MarketDataService
|
||||||
|
|
||||||
|
logger = logging.getLogger(__name__)
|
||||||
|
|
||||||
|
|
||||||
|
class PerformanceCalculatorV2:
|
||||||
|
"""
|
||||||
|
Enhanced performance calculator with efficient market data handling.
|
||||||
|
|
||||||
|
Features:
|
||||||
|
- Database-backed price caching
|
||||||
|
- Rate-limited API calls
|
||||||
|
- Batch price fetching
|
||||||
|
- Stale-while-revalidate pattern
|
||||||
|
"""
|
||||||
|
|
||||||
|
def __init__(self, db: Session, cache_ttl: int = 300):
|
||||||
|
"""
|
||||||
|
Initialize performance calculator.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
db: Database session
|
||||||
|
cache_ttl: Cache time-to-live in seconds (default: 5 minutes)
|
||||||
|
"""
|
||||||
|
self.db = db
|
||||||
|
self.market_data = MarketDataService(db, cache_ttl_seconds=cache_ttl)
|
||||||
|
|
||||||
|
def calculate_unrealized_pnl(self, position: Position, current_price: Optional[Decimal] = None) -> Optional[Decimal]:
|
||||||
|
"""
|
||||||
|
Calculate unrealized P&L for an open position.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
position: Open position to calculate P&L for
|
||||||
|
current_price: Optional pre-fetched current price (avoids API call)
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Unrealized P&L or None if market data unavailable
|
||||||
|
"""
|
||||||
|
if position.status != PositionStatus.OPEN:
|
||||||
|
return None
|
||||||
|
|
||||||
|
# Use provided price or fetch it
|
||||||
|
if current_price is None:
|
||||||
|
current_price = self.market_data.get_price(position.symbol, allow_stale=True)
|
||||||
|
|
||||||
|
if current_price is None or position.avg_entry_price is None:
|
||||||
|
return None
|
||||||
|
|
||||||
|
# Calculate P&L based on position direction
|
||||||
|
quantity = abs(position.total_quantity)
|
||||||
|
is_short = position.total_quantity < 0
|
||||||
|
|
||||||
|
if is_short:
|
||||||
|
# Short position: profit when price decreases
|
||||||
|
pnl = (position.avg_entry_price - current_price) * quantity * 100
|
||||||
|
else:
|
||||||
|
# Long position: profit when price increases
|
||||||
|
pnl = (current_price - position.avg_entry_price) * quantity * 100
|
||||||
|
|
||||||
|
# Subtract fees and commissions from opening transactions
|
||||||
|
total_fees = Decimal("0")
|
||||||
|
for link in position.transaction_links:
|
||||||
|
txn = link.transaction
|
||||||
|
if txn.commission:
|
||||||
|
total_fees += txn.commission
|
||||||
|
if txn.fees:
|
||||||
|
total_fees += txn.fees
|
||||||
|
|
||||||
|
pnl -= total_fees
|
||||||
|
|
||||||
|
return pnl
|
||||||
|
|
||||||
|
def update_open_positions_pnl(
|
||||||
|
self,
|
||||||
|
account_id: int,
|
||||||
|
max_api_calls: int = 10,
|
||||||
|
allow_stale: bool = True
|
||||||
|
) -> Dict[str, int]:
|
||||||
|
"""
|
||||||
|
Update unrealized P&L for all open positions in an account.
|
||||||
|
|
||||||
|
Uses batch fetching with rate limiting to avoid overwhelming Yahoo Finance API.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
account_id: Account ID to update
|
||||||
|
max_api_calls: Maximum number of Yahoo Finance API calls to make
|
||||||
|
allow_stale: Allow using stale cached prices
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Dictionary with update statistics
|
||||||
|
"""
|
||||||
|
open_positions = (
|
||||||
|
self.db.query(Position)
|
||||||
|
.filter(
|
||||||
|
and_(
|
||||||
|
Position.account_id == account_id,
|
||||||
|
Position.status == PositionStatus.OPEN,
|
||||||
|
)
|
||||||
|
)
|
||||||
|
.all()
|
||||||
|
)
|
||||||
|
|
||||||
|
if not open_positions:
|
||||||
|
return {
|
||||||
|
"total": 0,
|
||||||
|
"updated": 0,
|
||||||
|
"cached": 0,
|
||||||
|
"failed": 0
|
||||||
|
}
|
||||||
|
|
||||||
|
# Get unique symbols
|
||||||
|
symbols = list(set(p.symbol for p in open_positions))
|
||||||
|
|
||||||
|
logger.info(f"Updating P&L for {len(open_positions)} positions across {len(symbols)} symbols")
|
||||||
|
|
||||||
|
# Fetch prices in batch
|
||||||
|
prices = self.market_data.get_prices_batch(
|
||||||
|
symbols,
|
||||||
|
allow_stale=allow_stale,
|
||||||
|
max_fetches=max_api_calls
|
||||||
|
)
|
||||||
|
|
||||||
|
# Update P&L for each position
|
||||||
|
updated = 0
|
||||||
|
cached = 0
|
||||||
|
failed = 0
|
||||||
|
|
||||||
|
for position in open_positions:
|
||||||
|
price = prices.get(position.symbol)
|
||||||
|
|
||||||
|
if price is not None:
|
||||||
|
unrealized_pnl = self.calculate_unrealized_pnl(position, current_price=price)
|
||||||
|
|
||||||
|
if unrealized_pnl is not None:
|
||||||
|
position.unrealized_pnl = unrealized_pnl
|
||||||
|
updated += 1
|
||||||
|
|
||||||
|
# Check if price was from cache (age > 0) or fresh fetch
|
||||||
|
cached_info = self.market_data._get_cached_price(position.symbol)
|
||||||
|
if cached_info:
|
||||||
|
_, age = cached_info
|
||||||
|
if age < self.market_data.cache_ttl:
|
||||||
|
cached += 1
|
||||||
|
else:
|
||||||
|
failed += 1
|
||||||
|
else:
|
||||||
|
failed += 1
|
||||||
|
logger.warning(f"Could not get price for {position.symbol}")
|
||||||
|
|
||||||
|
self.db.commit()
|
||||||
|
|
||||||
|
logger.info(
|
||||||
|
f"Updated {updated}/{len(open_positions)} positions "
|
||||||
|
f"(cached: {cached}, failed: {failed})"
|
||||||
|
)
|
||||||
|
|
||||||
|
return {
|
||||||
|
"total": len(open_positions),
|
||||||
|
"updated": updated,
|
||||||
|
"cached": cached,
|
||||||
|
"failed": failed
|
||||||
|
}
|
||||||
|
|
||||||
    def calculate_account_stats(
        self,
        account_id: int,
        update_prices: bool = True,
        max_api_calls: int = 10,
        start_date: Optional[datetime] = None,
        end_date: Optional[datetime] = None,
    ) -> Dict:
        """
        Calculate aggregate statistics for an account.

        Args:
            account_id: Account ID
            update_prices: Whether to fetch fresh prices (if False, uses cached only)
            max_api_calls: Maximum number of Yahoo Finance API calls
            start_date: Filter positions opened on or after this date
            end_date: Filter positions opened on or before this date

        Returns:
            Dictionary with performance metrics
        """
        # Get all positions with optional date filtering
        query = self.db.query(Position).filter(Position.account_id == account_id)

        if start_date:
            query = query.filter(Position.open_date >= start_date)
        if end_date:
            query = query.filter(Position.open_date <= end_date)

        positions = query.all()

        total_positions = len(positions)
        open_positions_count = sum(
            1 for p in positions if p.status == PositionStatus.OPEN
        )
        closed_positions_count = sum(
            1 for p in positions if p.status == PositionStatus.CLOSED
        )

        # Calculate realized P&L (doesn't need market data)
        total_realized_pnl = sum(
            (p.realized_pnl or Decimal("0"))
            for p in positions
            if p.status == PositionStatus.CLOSED
        )

        # Update unrealized P&L for open positions
        update_stats = None
        if update_prices and open_positions_count > 0:
            update_stats = self.update_open_positions_pnl(
                account_id,
                max_api_calls=max_api_calls,
                allow_stale=True
            )

        # Calculate total unrealized P&L
        total_unrealized_pnl = sum(
            (p.unrealized_pnl or Decimal("0"))
            for p in positions
            if p.status == PositionStatus.OPEN
        )

        # Calculate win rate and average win/loss
        closed_with_pnl = [
            p for p in positions
            if p.status == PositionStatus.CLOSED and p.realized_pnl is not None
        ]

        if closed_with_pnl:
            winning_trades = [p for p in closed_with_pnl if p.realized_pnl > 0]
            losing_trades = [p for p in closed_with_pnl if p.realized_pnl < 0]

            win_rate = (len(winning_trades) / len(closed_with_pnl)) * 100

            avg_win = (
                sum(p.realized_pnl for p in winning_trades) / len(winning_trades)
                if winning_trades
                else Decimal("0")
            )

            avg_loss = (
                sum(p.realized_pnl for p in losing_trades) / len(losing_trades)
                if losing_trades
                else Decimal("0")
            )
        else:
            win_rate = 0.0
            avg_win = Decimal("0")
            avg_loss = Decimal("0")

        # Get current account balance from latest transaction
        latest_txn = (
            self.db.query(Transaction)
            .filter(Transaction.account_id == account_id)
            .order_by(Transaction.run_date.desc(), Transaction.id.desc())
            .first()
        )

        current_balance = (
            latest_txn.cash_balance if latest_txn and latest_txn.cash_balance else Decimal("0")
        )

        result = {
            "total_positions": total_positions,
            "open_positions": open_positions_count,
            "closed_positions": closed_positions_count,
            "total_realized_pnl": float(total_realized_pnl),
            "total_unrealized_pnl": float(total_unrealized_pnl),
            "total_pnl": float(total_realized_pnl + total_unrealized_pnl),
            "win_rate": float(win_rate),
            "avg_win": float(avg_win),
            "avg_loss": float(avg_loss),
            "current_balance": float(current_balance),
        }

        # Add update stats if prices were fetched
        if update_stats:
            result["price_update_stats"] = update_stats

        return result

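The win-rate and average win/loss formulas in `calculate_account_stats` can be sanity-checked standalone. A minimal sketch with invented realized P&L values (note that break-even trades count toward the denominator but are neither wins nor losses):

```python
from decimal import Decimal

# Hypothetical realized P&L for five closed positions
realized = [Decimal("120"), Decimal("-40"), Decimal("80"), Decimal("-20"), Decimal("0")]

wins = [p for p in realized if p > 0]
losses = [p for p in realized if p < 0]

# Same arithmetic as the method above
win_rate = (len(wins) / len(realized)) * 100
avg_win = sum(wins) / len(wins) if wins else Decimal("0")
avg_loss = sum(losses) / len(losses) if losses else Decimal("0")

print(win_rate, avg_win, avg_loss)  # 40.0 100 -30
```

Keeping the arithmetic in `Decimal` until the final `float(...)` conversion in the result dict avoids accumulating binary floating-point error across many positions.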
    def get_balance_history(
        self, account_id: int, days: int = 30
    ) -> list[Dict]:
        """
        Get account balance history for charting.

        This doesn't need market data, just transaction history.

        Args:
            account_id: Account ID
            days: Number of days to retrieve

        Returns:
            List of {date, balance} dictionaries
        """
        cutoff_date = datetime.now().date() - timedelta(days=days)

        transactions = (
            self.db.query(Transaction.run_date, Transaction.cash_balance)
            .filter(
                and_(
                    Transaction.account_id == account_id,
                    Transaction.run_date >= cutoff_date,
                    Transaction.cash_balance.isnot(None),
                )
            )
            .order_by(Transaction.run_date)
            .all()
        )

        # Get one balance per day (use last transaction of the day)
        daily_balances = {}
        for txn in transactions:
            daily_balances[txn.run_date] = float(txn.cash_balance)

        return [
            {"date": date.isoformat(), "balance": balance}
            for date, balance in sorted(daily_balances.items())
        ]

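The one-balance-per-day collapse above works because the query returns rows in `run_date` order, so the last write for each date key wins. A standalone sketch with invented dates and balances:

```python
from datetime import date

# (run_date, cash_balance) pairs, already sorted by date as the query guarantees
txns = [
    (date(2025, 1, 2), 1000.0),
    (date(2025, 1, 2), 950.0),   # later transaction on the same day wins
    (date(2025, 1, 3), 1100.0),
]

daily_balances = {}
for run_date, balance in txns:
    daily_balances[run_date] = balance  # last write per date key wins

history = [
    {"date": d.isoformat(), "balance": b}
    for d, b in sorted(daily_balances.items())
]
print(history)
# [{'date': '2025-01-02', 'balance': 950.0}, {'date': '2025-01-03', 'balance': 1100.0}]
```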
    def get_top_trades(
        self, account_id: int, limit: int = 10, start_date: Optional[datetime] = None, end_date: Optional[datetime] = None
    ) -> list[Dict]:
        """
        Get top performing trades (by realized P&L).

        This doesn't need market data, just closed positions.

        Args:
            account_id: Account ID
            limit: Maximum number of trades to return
            start_date: Filter positions closed on or after this date
            end_date: Filter positions closed on or before this date

        Returns:
            List of trade dictionaries
        """
        query = self.db.query(Position).filter(
            and_(
                Position.account_id == account_id,
                Position.status == PositionStatus.CLOSED,
                Position.realized_pnl.isnot(None),
            )
        )

        # Apply date filters if provided
        if start_date:
            query = query.filter(Position.close_date >= start_date)
        if end_date:
            query = query.filter(Position.close_date <= end_date)

        positions = query.order_by(Position.realized_pnl.desc()).limit(limit).all()

        return [
            {
                "symbol": p.symbol,
                "option_symbol": p.option_symbol,
                "position_type": p.position_type.value,
                "open_date": p.open_date.isoformat(),
                "close_date": p.close_date.isoformat() if p.close_date else None,
                "quantity": float(p.total_quantity),
                "entry_price": float(p.avg_entry_price) if p.avg_entry_price else None,
                "exit_price": float(p.avg_exit_price) if p.avg_exit_price else None,
                "realized_pnl": float(p.realized_pnl),
            }
            for p in positions
        ]

    def get_worst_trades(
        self, account_id: int, limit: int = 10, start_date: Optional[datetime] = None, end_date: Optional[datetime] = None
    ) -> list[Dict]:
        """
        Get worst performing trades (by realized P&L).

        This doesn't need market data, just closed positions.

        Args:
            account_id: Account ID
            limit: Maximum number of trades to return
            start_date: Filter positions closed on or after this date
            end_date: Filter positions closed on or before this date

        Returns:
            List of trade dictionaries
        """
        query = self.db.query(Position).filter(
            and_(
                Position.account_id == account_id,
                Position.status == PositionStatus.CLOSED,
                Position.realized_pnl.isnot(None),
            )
        )

        # Apply date filters if provided
        if start_date:
            query = query.filter(Position.close_date >= start_date)
        if end_date:
            query = query.filter(Position.close_date <= end_date)

        positions = query.order_by(Position.realized_pnl.asc()).limit(limit).all()

        return [
            {
                "symbol": p.symbol,
                "option_symbol": p.option_symbol,
                "position_type": p.position_type.value,
                "open_date": p.open_date.isoformat(),
                "close_date": p.close_date.isoformat() if p.close_date else None,
                "quantity": float(p.total_quantity),
                "entry_price": float(p.avg_entry_price) if p.avg_entry_price else None,
                "exit_price": float(p.avg_exit_price) if p.avg_exit_price else None,
                "realized_pnl": float(p.realized_pnl),
            }
            for p in positions
        ]
465 backend/app/services/position_tracker.py Normal file
@@ -0,0 +1,465 @@
"""Service for tracking and calculating trading positions."""
from sqlalchemy.orm import Session
from sqlalchemy import and_
from typing import List, Optional, Dict
from decimal import Decimal
from collections import defaultdict
from datetime import datetime
import re

from app.models import Transaction, Position, PositionTransaction
from app.models.position import PositionType, PositionStatus
from app.utils import parse_option_symbol

class PositionTracker:
    """
    Service for tracking trading positions from transactions.

    Matches opening and closing transactions using the FIFO (first-in, first-out) method.
    Handles stocks, calls, and puts, including complex scenarios like assignments and expirations.
    """

    def __init__(self, db: Session):
        """
        Initialize position tracker.

        Args:
            db: Database session
        """
        self.db = db

    def rebuild_positions(self, account_id: int) -> int:
        """
        Rebuild all positions for an account from transactions.

        Deletes existing positions and recalculates from scratch.

        Args:
            account_id: Account ID to rebuild positions for

        Returns:
            Number of positions created
        """
        # Delete existing positions
        self.db.query(Position).filter(Position.account_id == account_id).delete()
        self.db.commit()

        # Get all transactions ordered by date
        transactions = (
            self.db.query(Transaction)
            .filter(Transaction.account_id == account_id)
            .order_by(Transaction.run_date, Transaction.id)
            .all()
        )

        # Group transactions by symbol and option details.
        # For options, group by the full option contract (symbol + strike + expiration);
        # for stocks, group by symbol only.
        symbol_txns = defaultdict(list)
        for txn in transactions:
            if txn.symbol:
                # Create a unique grouping key
                grouping_key = self._get_grouping_key(txn)
                symbol_txns[grouping_key].append(txn)

        # Process each symbol/contract group
        position_count = 0
        for grouping_key, txns in symbol_txns.items():
            positions = self._process_symbol_transactions(account_id, grouping_key, txns)
            position_count += len(positions)

        self.db.commit()
        return position_count

    def _process_symbol_transactions(
        self, account_id: int, symbol: str, transactions: List[Transaction]
    ) -> List[Position]:
        """
        Process all transactions for a single symbol to create positions.

        Args:
            account_id: Account ID
            symbol: Trading symbol
            transactions: List of transactions for this symbol

        Returns:
            List of created Position objects
        """
        positions = []

        # Determine position type from first transaction
        position_type = self._determine_position_type_from_txn(transactions[0]) if transactions else PositionType.STOCK

        # Track open positions using FIFO
        open_positions: List[Dict] = []

        for txn in transactions:
            action = txn.action.upper()

            # Determine if this is an opening or closing transaction
            if self._is_opening_transaction(action):
                # Create new open position
                open_pos = {
                    "transactions": [txn],
                    "quantity": abs(txn.quantity) if txn.quantity else Decimal("0"),
                    "entry_price": txn.price,
                    "open_date": txn.run_date,
                    "is_short": "SELL" in action or "SOLD" in action,
                }
                open_positions.append(open_pos)

            elif self._is_closing_transaction(action):
                # Close positions using FIFO
                close_quantity = abs(txn.quantity) if txn.quantity else Decimal("0")
                remaining_to_close = close_quantity

                while remaining_to_close > 0 and open_positions:
                    open_pos = open_positions[0]
                    open_qty = open_pos["quantity"]

                    if open_qty <= remaining_to_close:
                        # Close entire position
                        open_pos["transactions"].append(txn)
                        position = self._create_position(
                            account_id,
                            symbol,
                            position_type,
                            open_pos,
                            close_date=txn.run_date,
                            exit_price=txn.price,
                            close_quantity=open_qty,
                        )
                        positions.append(position)
                        open_positions.pop(0)
                        remaining_to_close -= open_qty
                    else:
                        # Partially close position: split off the closed portion
                        closed_portion = {
                            "transactions": open_pos["transactions"] + [txn],
                            "quantity": remaining_to_close,
                            "entry_price": open_pos["entry_price"],
                            "open_date": open_pos["open_date"],
                            "is_short": open_pos["is_short"],
                        }
                        position = self._create_position(
                            account_id,
                            symbol,
                            position_type,
                            closed_portion,
                            close_date=txn.run_date,
                            exit_price=txn.price,
                            close_quantity=remaining_to_close,
                        )
                        positions.append(position)

                        # Update open position with remaining quantity
                        open_pos["quantity"] -= remaining_to_close
                        remaining_to_close = Decimal("0")

            elif self._is_expiration(action):
                # Handle option expirations
                expire_quantity = abs(txn.quantity) if txn.quantity else Decimal("0")
                remaining_to_expire = expire_quantity

                while remaining_to_expire > 0 and open_positions:
                    open_pos = open_positions[0]
                    open_qty = open_pos["quantity"]

                    if open_qty <= remaining_to_expire:
                        # Expire entire position
                        open_pos["transactions"].append(txn)
                        position = self._create_position(
                            account_id,
                            symbol,
                            position_type,
                            open_pos,
                            close_date=txn.run_date,
                            exit_price=Decimal("0"),  # Expired worthless
                            close_quantity=open_qty,
                        )
                        positions.append(position)
                        open_positions.pop(0)
                        remaining_to_expire -= open_qty
                    else:
                        # Partially expire
                        closed_portion = {
                            "transactions": open_pos["transactions"] + [txn],
                            "quantity": remaining_to_expire,
                            "entry_price": open_pos["entry_price"],
                            "open_date": open_pos["open_date"],
                            "is_short": open_pos["is_short"],
                        }
                        position = self._create_position(
                            account_id,
                            symbol,
                            position_type,
                            closed_portion,
                            close_date=txn.run_date,
                            exit_price=Decimal("0"),
                            close_quantity=remaining_to_expire,
                        )
                        positions.append(position)
                        open_pos["quantity"] -= remaining_to_expire
                        remaining_to_expire = Decimal("0")

        # Create positions for any remaining open positions
        for open_pos in open_positions:
            position = self._create_position(
                account_id, symbol, position_type, open_pos
            )
            positions.append(position)

        return positions

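The open/close loop above is easier to see stripped of the database plumbing. A minimal FIFO lot matcher over plain tuples (the quantities are invented; the real method also carries the transactions, prices, and dates alongside each lot):

```python
from decimal import Decimal

def match_fifo(events):
    """events: list of ("open"|"close", quantity). Returns (closed lots, open lots)."""
    open_lots = []   # FIFO queue of remaining open quantities
    closed = []
    for kind, qty in events:
        if kind == "open":
            open_lots.append(qty)
        else:
            remaining = qty
            while remaining > 0 and open_lots:
                if open_lots[0] <= remaining:
                    lot = open_lots.pop(0)      # close the whole oldest lot
                    closed.append(lot)
                    remaining -= lot
                else:
                    open_lots[0] -= remaining   # partially close the oldest lot
                    closed.append(remaining)
                    remaining = Decimal("0")
    return closed, open_lots

closed, still_open = match_fifo([
    ("open", Decimal("100")),
    ("open", Decimal("50")),
    ("close", Decimal("120")),  # closes all of lot 1 and 20 of lot 2
])
print(closed, still_open)  # [Decimal('100'), Decimal('20')] [Decimal('30')]
```

The same two branches appear in the method above: a full close pops the oldest lot, a partial close splits it and leaves the remainder at the head of the queue.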
    def _create_position(
        self,
        account_id: int,
        symbol: str,
        position_type: PositionType,
        position_data: Dict,
        close_date: Optional[datetime] = None,
        exit_price: Optional[Decimal] = None,
        close_quantity: Optional[Decimal] = None,
    ) -> Position:
        """
        Create a Position database object.

        Args:
            account_id: Account ID
            symbol: Trading symbol
            position_type: Type of position
            position_data: Dictionary with position information
            close_date: Close date (if closed)
            exit_price: Exit price (if closed)
            close_quantity: Quantity closed (if closed)

        Returns:
            Created Position object
        """
        is_closed = close_date is not None
        quantity = close_quantity if close_quantity else position_data["quantity"]

        # Calculate P&L if closed. Option prices are quoted per share but each
        # contract covers 100 shares; stock positions use a multiplier of 1.
        multiplier = 1 if position_type == PositionType.STOCK else 100

        realized_pnl = None
        if is_closed and position_data["entry_price"] and exit_price is not None:
            if position_data["is_short"]:
                # Short position: profit when price decreases
                realized_pnl = (
                    position_data["entry_price"] - exit_price
                ) * quantity * multiplier
            else:
                # Long position: profit when price increases
                realized_pnl = (
                    exit_price - position_data["entry_price"]
                ) * quantity * multiplier

            # Subtract fees and commissions
            for txn in position_data["transactions"]:
                if txn.commission:
                    realized_pnl -= txn.commission
                if txn.fees:
                    realized_pnl -= txn.fees

        # Extract option symbol from first transaction if this is an option
        option_symbol = None
        if position_type != PositionType.STOCK and position_data["transactions"]:
            first_txn = position_data["transactions"][0]
            # Try to extract option details from description
            option_symbol = self._extract_option_symbol_from_description(
                first_txn.description, first_txn.action, symbol
            )

        # Create position
        position = Position(
            account_id=account_id,
            symbol=symbol,
            option_symbol=option_symbol,
            position_type=position_type,
            status=PositionStatus.CLOSED if is_closed else PositionStatus.OPEN,
            open_date=position_data["open_date"],
            close_date=close_date,
            total_quantity=quantity if not position_data["is_short"] else -quantity,
            avg_entry_price=position_data["entry_price"],
            avg_exit_price=exit_price,
            realized_pnl=realized_pnl,
        )

        self.db.add(position)
        self.db.flush()  # Get position ID

        # Link transactions to position
        for txn in position_data["transactions"]:
            link = PositionTransaction(
                position_id=position.id, transaction_id=txn.id
            )
            self.db.add(link)

        return position

    def _extract_option_symbol_from_description(
        self, description: str, action: str, base_symbol: str
    ) -> Optional[str]:
        """
        Extract option symbol from transaction description.

        Example: "CALL (TGT) TARGET CORP JAN 16 26 $95 (100 SHS)"
        Returns: "-TGT260116C95"

        Args:
            description: Transaction description
            action: Transaction action
            base_symbol: Underlying symbol

        Returns:
            Option symbol in standard format, or None if it can't be parsed
        """
        if not description:
            return None

        # Determine if CALL or PUT
        call_or_put = None
        if "CALL" in description.upper():
            call_or_put = "C"
        elif "PUT" in description.upper():
            call_or_put = "P"
        else:
            return None

        # Extract date and strike: "JAN 16 26 $95"
        # Pattern: MONTH DAY YY $STRIKE
        date_strike_pattern = r'([A-Z]{3})\s+(\d{1,2})\s+(\d{2})\s+\$([\d.]+)'
        match = re.search(date_strike_pattern, description)

        if not match:
            return None

        month_abbr, day, year, strike = match.groups()

        # Convert month abbreviation to number
        month_map = {
            'JAN': '01', 'FEB': '02', 'MAR': '03', 'APR': '04',
            'MAY': '05', 'JUN': '06', 'JUL': '07', 'AUG': '08',
            'SEP': '09', 'OCT': '10', 'NOV': '11', 'DEC': '12'
        }

        month = month_map.get(month_abbr.upper())
        if not month:
            return None

        # Format: -SYMBOL + YYMMDD + C/P + STRIKE
        # Remove decimal point from strike if it's a whole number
        strike_num = float(strike)
        strike_str = str(int(strike_num)) if strike_num.is_integer() else strike.replace('.', '')

        option_symbol = f"-{base_symbol}{year}{month}{day.zfill(2)}{call_or_put}{strike_str}"
        return option_symbol

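The date/strike extraction above can be exercised on its own. This sketch mirrors the regex and formatting logic outside the class, run against the docstring's sample description:

```python
import re

# Sample description from the docstring above
description = "CALL (TGT) TARGET CORP JAN 16 26 $95 (100 SHS)"

# Same pattern as _extract_option_symbol_from_description: MONTH DAY YY $STRIKE
match = re.search(r'([A-Z]{3})\s+(\d{1,2})\s+(\d{2})\s+\$([\d.]+)', description)
month_abbr, day, year, strike = match.groups()

month_map = {
    'JAN': '01', 'FEB': '02', 'MAR': '03', 'APR': '04',
    'MAY': '05', 'JUN': '06', 'JUL': '07', 'AUG': '08',
    'SEP': '09', 'OCT': '10', 'NOV': '11', 'DEC': '12',
}
month = month_map[month_abbr]

call_or_put = "C" if "CALL" in description.upper() else "P"
strike_num = float(strike)
strike_str = str(int(strike_num)) if strike_num.is_integer() else strike.replace('.', '')

option_symbol = f"-TGT{year}{month}{day.zfill(2)}{call_or_put}{strike_str}"
print(option_symbol)  # -TGT260116C95
```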
    def _determine_position_type_from_txn(self, txn: Transaction) -> PositionType:
        """
        Determine position type from transaction action/description.

        Args:
            txn: Transaction to analyze

        Returns:
            PositionType (STOCK, CALL, or PUT)
        """
        # Check action and description for option indicators
        action_upper = txn.action.upper() if txn.action else ""
        desc_upper = txn.description.upper() if txn.description else ""

        # Look for CALL or PUT keywords
        if "CALL" in action_upper or "CALL" in desc_upper:
            return PositionType.CALL
        elif "PUT" in action_upper or "PUT" in desc_upper:
            return PositionType.PUT

        # Fall back to checking symbol format (for backwards compatibility)
        if txn.symbol and txn.symbol.startswith("-"):
            option_info = parse_option_symbol(txn.symbol)
            if option_info:
                return (
                    PositionType.CALL
                    if option_info.option_type == "CALL"
                    else PositionType.PUT
                )

        return PositionType.STOCK

    def _get_base_symbol(self, symbol: str) -> str:
        """Extract base symbol from option symbol."""
        if symbol.startswith("-"):
            option_info = parse_option_symbol(symbol)
            if option_info:
                return option_info.underlying_symbol
        return symbol

    def _is_opening_transaction(self, action: str) -> bool:
        """Check if action represents opening a position."""
        opening_keywords = [
            "OPENING TRANSACTION",
            "YOU BOUGHT OPENING",
            "YOU SOLD OPENING",
        ]
        return any(keyword in action for keyword in opening_keywords)

    def _is_closing_transaction(self, action: str) -> bool:
        """Check if action represents closing a position."""
        closing_keywords = [
            "CLOSING TRANSACTION",
            "YOU BOUGHT CLOSING",
            "YOU SOLD CLOSING",
            "ASSIGNED",
        ]
        return any(keyword in action for keyword in closing_keywords)

    def _is_expiration(self, action: str) -> bool:
        """Check if action represents an expiration."""
        return "EXPIRED" in action

    def _get_grouping_key(self, txn: Transaction) -> str:
        """
        Create a unique grouping key for transactions.

        For options, returns: symbol + option details (e.g., "TGT-JAN1626-100C")
        For stocks, returns: just the symbol (e.g., "TGT")

        Args:
            txn: Transaction to create key for

        Returns:
            Grouping key string
        """
        # Determine if this is an option transaction
        action_upper = txn.action.upper() if txn.action else ""
        desc_upper = txn.description.upper() if txn.description else ""

        is_option = "CALL" in action_upper or "CALL" in desc_upper or "PUT" in action_upper or "PUT" in desc_upper

        if not is_option or not txn.description:
            # Stock transaction - group by symbol only
            return txn.symbol

        # Option transaction - extract strike and expiration to create a unique key
        # Pattern: "CALL (TGT) TARGET CORP JAN 16 26 $100 (100 SHS)"
        date_strike_pattern = r'([A-Z]{3})\s+(\d{1,2})\s+(\d{2})\s+\$([\d.]+)'
        match = re.search(date_strike_pattern, txn.description)

        if not match:
            # Can't parse option details, fall back to symbol only
            return txn.symbol

        month_abbr, day, year, strike = match.groups()

        # Determine call or put
        call_or_put = "C" if "CALL" in desc_upper else "P"

        # Create key: SYMBOL-MONTHDAYYEAR-STRIKEC/P, e.g. "TGT-JAN1626-100C".
        # The two-digit year is included so contracts that differ only by
        # expiration year don't get merged into one group.
        strike_num = float(strike)
        strike_str = str(int(strike_num)) if strike_num.is_integer() else strike

        grouping_key = f"{txn.symbol}-{month_abbr}{day}{year}-{strike_str}{call_or_put}"
        return grouping_key
5 backend/app/utils/__init__.py Normal file
@@ -0,0 +1,5 @@
"""Utility functions and helpers."""
from app.utils.deduplication import generate_transaction_hash
from app.utils.option_parser import parse_option_symbol, OptionInfo

__all__ = ["generate_transaction_hash", "parse_option_symbol", "OptionInfo"]
65 backend/app/utils/deduplication.py Normal file
@@ -0,0 +1,65 @@
"""Transaction deduplication utilities."""
import hashlib
from datetime import date
from decimal import Decimal
from typing import Optional


def generate_transaction_hash(
    account_id: int,
    run_date: date,
    symbol: Optional[str],
    action: str,
    amount: Optional[Decimal],
    quantity: Optional[Decimal],
    price: Optional[Decimal],
) -> str:
    """
    Generate a unique SHA-256 hash for a transaction to prevent duplicates.

    The hash is generated from key transaction attributes that uniquely identify
    a transaction: account, date, symbol, action, amount, quantity, and price.

    Args:
        account_id: Account identifier
        run_date: Transaction date
        symbol: Trading symbol
        action: Transaction action description
        amount: Transaction amount
        quantity: Number of shares/contracts
        price: Price per unit

    Returns:
        str: 64-character hexadecimal SHA-256 hash

    Example:
        >>> generate_transaction_hash(
        ...     account_id=1,
        ...     run_date=date(2025, 12, 26),
        ...     symbol="AAPL",
        ...     action="YOU BOUGHT",
        ...     amount=Decimal("-1500.00"),
        ...     quantity=Decimal("10"),
        ...     price=Decimal("150.00")
        ... )
        'a1b2c3d4...'
    """
    # Convert values to strings, handling None values
    symbol_str = symbol or ""
    amount_str = str(amount) if amount is not None else ""
    quantity_str = str(quantity) if quantity is not None else ""
    price_str = str(price) if price is not None else ""

    # Create hash string with pipe delimiter
    hash_string = (
        f"{account_id}|"
        f"{run_date.isoformat()}|"
        f"{symbol_str}|"
        f"{action}|"
        f"{amount_str}|"
        f"{quantity_str}|"
        f"{price_str}"
    )

    # Generate SHA-256 hash
    return hashlib.sha256(hash_string.encode("utf-8")).hexdigest()
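A standalone illustration of the hash construction above, using the docstring's example values: the fields are pipe-joined and run through SHA-256, so the same transaction always produces the same digest.

```python
import hashlib
from datetime import date
from decimal import Decimal

fields = [
    "1",                              # account_id
    date(2025, 12, 26).isoformat(),   # run_date -> "2025-12-26"
    "AAPL",                           # symbol
    "YOU BOUGHT",                     # action
    str(Decimal("-1500.00")),         # amount -> "-1500.00"
    str(Decimal("10")),               # quantity
    str(Decimal("150.00")),           # price
]
hash_string = "|".join(fields)
digest = hashlib.sha256(hash_string.encode("utf-8")).hexdigest()

print(hash_string)   # 1|2025-12-26|AAPL|YOU BOUGHT|-1500.00|10|150.00
print(len(digest))   # 64
```

One consequence worth noting: `str(Decimal(...))` preserves trailing zeros, so `Decimal("150.00")` and `Decimal("150")` hash differently. Whatever builds the `Decimal` values (here, the CSV importer) must format them consistently for deduplication to work.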
91 backend/app/utils/option_parser.py Normal file
@@ -0,0 +1,91 @@
|
|||||||
|
"""Option symbol parsing utilities."""
|
||||||
|
import re
|
||||||
|
from datetime import datetime
|
||||||
|
from typing import Optional, NamedTuple
|
||||||
|
from decimal import Decimal
|
||||||
|
|
||||||
|
|
||||||
|
class OptionInfo(NamedTuple):
|
||||||
|
"""
|
||||||
|
Parsed option information.
|
||||||
|
|
||||||
|
Attributes:
|
||||||
|
        underlying_symbol: Base ticker symbol (e.g., "AAPL")
        expiration_date: Option expiration date
        option_type: "CALL" or "PUT"
        strike_price: Strike price
    """

    underlying_symbol: str
    expiration_date: datetime
    option_type: str
    strike_price: Decimal


def parse_option_symbol(option_symbol: str) -> Optional[OptionInfo]:
    """
    Parse Fidelity option symbol format into components.

    Fidelity format: -SYMBOL + YYMMDD + C/P + STRIKE
    Example: -AAPL260116C150 = AAPL Call expiring Jan 16, 2026 at $150 strike

    Args:
        option_symbol: Fidelity option symbol string

    Returns:
        OptionInfo object if parsing is successful, None otherwise

    Examples:
        >>> parse_option_symbol("-AAPL260116C150")
        OptionInfo(
            underlying_symbol='AAPL',
            expiration_date=datetime(2026, 1, 16),
            option_type='CALL',
            strike_price=Decimal('150')
        )

        >>> parse_option_symbol("-TSLA251219P500")
        OptionInfo(
            underlying_symbol='TSLA',
            expiration_date=datetime(2025, 12, 19),
            option_type='PUT',
            strike_price=Decimal('500')
        )
    """
    # Regex pattern: -SYMBOL + YYMMDD + C/P + STRIKE
    # Symbol: one or more uppercase letters
    # Date: 6 digits (YYMMDD)
    # Type: C (call) or P (put)
    # Strike: digits with optional decimal point
    pattern = r"^-([A-Z]+)(\d{6})([CP])(\d+\.?\d*)$"

    match = re.match(pattern, option_symbol)
    if not match:
        return None

    symbol, date_str, option_type, strike_str = match.groups()

    # Parse date (YYMMDD format)
    try:
        # Assume 20XX for years (works until 2100)
        year = 2000 + int(date_str[:2])
        month = int(date_str[2:4])
        day = int(date_str[4:6])
        expiration_date = datetime(year, month, day)
    except (ValueError, IndexError):
        return None

    # Parse option type
    option_type_full = "CALL" if option_type == "C" else "PUT"

    # Parse strike price
    try:
        strike_price = Decimal(strike_str)
    except (ValueError, ArithmeticError):
        return None

    return OptionInfo(
        underlying_symbol=symbol,
        expiration_date=expiration_date,
        option_type=option_type_full,
        strike_price=strike_price,
    )
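As a quick sanity check on the symbol format described above, the following standalone sketch re-declares the same regex and exercises it. It intentionally duplicates the pattern so it runs outside the app; the `parse` helper and its dict return shape are illustrative, not the module's actual API.

```python
# Standalone check of the Fidelity option symbol pattern:
#   -SYMBOL + YYMMDD + C/P + STRIKE
import re
from datetime import datetime
from decimal import Decimal

PATTERN = re.compile(r"^-([A-Z]+)(\d{6})([CP])(\d+\.?\d*)$")

def parse(symbol: str):
    """Return a dict of option components, or None if the symbol doesn't match."""
    m = PATTERN.match(symbol)
    if not m:
        return None
    sym, ymd, cp, strike = m.groups()
    return {
        "underlying": sym,
        # YY is assumed to mean 20YY, matching the parser above
        "expiration": datetime(2000 + int(ymd[:2]), int(ymd[2:4]), int(ymd[4:6])),
        "type": "CALL" if cp == "C" else "PUT",
        "strike": Decimal(strike),
    }

print(parse("-AAPL260116C150"))
print(parse("not-an-option"))  # no leading dash, lowercase: returns None
```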
12
backend/requirements.txt
Normal file
@@ -0,0 +1,12 @@
fastapi==0.109.0
uvicorn[standard]==0.27.0
sqlalchemy==2.0.25
alembic==1.13.1
psycopg2-binary==2.9.9
pydantic==2.5.3
pydantic-settings==2.1.0
python-multipart==0.0.6
pandas==2.1.4
yfinance==0.2.35
python-dateutil==2.8.2
pytz==2024.1
94
backend/seed_demo_data.py
Normal file
@@ -0,0 +1,94 @@
"""
Demo data seeder script.
Creates a sample account and imports the provided CSV file.
"""
import sys
from pathlib import Path

# Add parent directory to path
sys.path.insert(0, str(Path(__file__).parent))

from app.database import SessionLocal, engine, Base
from app.models import Account
from app.services import ImportService
from app.services.position_tracker import PositionTracker


def seed_demo_data():
    """Seed demo account and transactions."""
    print("🌱 Seeding demo data...")

    # Create tables
    Base.metadata.create_all(bind=engine)

    # Create database session
    db = SessionLocal()

    try:
        # Check if demo account already exists
        existing = (
            db.query(Account)
            .filter(Account.account_number == "DEMO123456")
            .first()
        )

        if existing:
            print("✅ Demo account already exists")
            demo_account = existing
        else:
            # Create demo account
            demo_account = Account(
                account_number="DEMO123456",
                account_name="Demo Trading Account",
                account_type="margin",
            )
            db.add(demo_account)
            db.commit()
            db.refresh(demo_account)
            print(f"✅ Created demo account (ID: {demo_account.id})")

        # Check for CSV file
        csv_path = Path("/app/imports/History_for_Account_X38661988.csv")
        if not csv_path.exists():
            # Try alternative path (development)
            csv_path = Path(__file__).parent.parent / "History_for_Account_X38661988.csv"

        if not csv_path.exists():
            print("⚠️  Sample CSV file not found. Skipping import.")
            print("   Place the CSV file in /app/imports/ to seed demo data.")
            return

        # Import transactions
        print(f"📊 Importing transactions from {csv_path.name}...")
        import_service = ImportService(db)
        result = import_service.import_from_file(csv_path, demo_account.id)

        print(f"✅ Imported {result.imported} transactions")
        print(f"   Skipped {result.skipped} duplicates")
        if result.errors:
            print(f"   ⚠️  {len(result.errors)} errors occurred")

        # Build positions
        if result.imported > 0:
            print("📈 Building positions...")
            position_tracker = PositionTracker(db)
            positions_created = position_tracker.rebuild_positions(demo_account.id)
            print(f"✅ Created {positions_created} positions")

        print("\n🎉 Demo data seeded successfully!")
        print("\n📝 Demo Account Details:")
        print(f"   Account Number: {demo_account.account_number}")
        print(f"   Account Name: {demo_account.account_name}")
        print(f"   Account ID: {demo_account.id}")

    except Exception as e:
        print(f"❌ Error seeding demo data: {e}")
        db.rollback()
        raise

    finally:
        db.close()


if __name__ == "__main__":
    seed_demo_data()
67
docker-compose.yml
Normal file
@@ -0,0 +1,67 @@
services:
  # PostgreSQL database
  postgres:
    image: postgres:16-alpine
    container_name: fidelity_postgres
    environment:
      POSTGRES_USER: fidelity
      POSTGRES_PASSWORD: fidelity123
      POSTGRES_DB: fidelitytracker
    ports:
      - "5432:5432"
    volumes:
      - postgres_data:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U fidelity -d fidelitytracker"]
      interval: 10s
      timeout: 5s
      retries: 5
    networks:
      - fidelity_network

  # FastAPI backend
  backend:
    build:
      context: ./backend
      dockerfile: Dockerfile
    container_name: fidelity_backend
    depends_on:
      postgres:
        condition: service_healthy
    environment:
      POSTGRES_HOST: postgres
      POSTGRES_PORT: 5432
      POSTGRES_DB: fidelitytracker
      POSTGRES_USER: fidelity
      POSTGRES_PASSWORD: fidelity123
      IMPORT_DIR: /app/imports
    ports:
      - "8000:8000"
    volumes:
      - ./imports:/app/imports
      - ./backend:/app
    networks:
      - fidelity_network
    restart: unless-stopped

  # React frontend
  frontend:
    build:
      context: ./frontend
      dockerfile: Dockerfile
    container_name: fidelity_frontend
    depends_on:
      - backend
    ports:
      - "3000:80"
    networks:
      - fidelity_network
    restart: unless-stopped

volumes:
  postgres_data:
    driver: local

networks:
  fidelity_network:
    driver: bridge
199
docs/TIMEFRAME_FILTERING.md
Normal file
@@ -0,0 +1,199 @@
# Timeframe Filtering Feature

## Overview
The timeframe filtering feature allows users to view dashboard metrics and charts for specific date ranges, providing better insight into performance over different time periods.

## User Interface

### Location
- Dashboard page (DashboardV2 component)
- Dropdown filter positioned at the top of the dashboard, above the metrics cards

### Available Options
1. **All Time** - Shows all historical data
2. **Last 30 Days** - Shows data from the past 30 days
3. **Last 90 Days** - Shows data from the past 90 days
4. **Last 180 Days** - Shows data from the past 180 days (default for the chart)
5. **Last 1 Year** - Shows data from the past 365 days
6. **Year to Date** - Shows data from January 1st of the current year to today

## What Gets Filtered

### Metrics Cards (Top of Dashboard)
When a timeframe is selected, the following metrics are filtered by position open date:
- Total Positions count
- Open Positions count
- Closed Positions count
- Total Realized P&L
- Total Unrealized P&L
- Win Rate percentage
- Average Win amount
- Average Loss amount
- Current Balance (always shows the latest value)

### Balance History Chart
The chart adjusts to show the requested number of days:
- All Time: ~10 years (3650 days)
- Last 30 Days: 30 days
- Last 90 Days: 90 days
- Last 180 Days: 180 days
- Last 1 Year: 365 days
- Year to Date: dynamic calculation from Jan 1 to today
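The timeframe-to-days conversion above can be sketched in a few lines. This is a back-of-the-envelope illustration only; `days_for_timeframe` is a hypothetical name, and the real conversion lives in the frontend helper `getDaysFromTimeframe`.

```python
# Illustrative timeframe -> days mapping; keys mirror the dropdown options.
from datetime import date

def days_for_timeframe(tf: str, today: date) -> int:
    fixed = {"last30days": 30, "last90days": 90, "last180days": 180, "last1year": 365}
    if tf == "all":
        return 3650  # "All Time" cap of ~10 years
    if tf == "ytd":
        # Dynamic: days elapsed since January 1st of the current year
        return (today - date(today.year, 1, 1)).days
    return fixed[tf]

print(days_for_timeframe("ytd", date(2026, 3, 1)))  # 59 days since Jan 1
```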
## Implementation Details

### Frontend

#### Component: `DashboardV2.tsx`
```typescript
// State management
const [timeframe, setTimeframe] = useState<TimeframeOption>('all');

// Convert timeframe to days for balance history
const getDaysFromTimeframe = (tf: TimeframeOption): number => {
  switch (tf) {
    case 'last30days': return 30;
    case 'last90days': return 90;
    // ... etc
  }
};

// Get date range for filtering
const { startDate, endDate } = getTimeframeDates(timeframe);
```

#### API Calls
1. **Overview Stats**:
   - Endpoint: `GET /analytics/overview/{account_id}`
   - Parameters: `start_date`, `end_date`
   - Query key includes the timeframe for proper caching

2. **Balance History**:
   - Endpoint: `GET /analytics/balance-history/{account_id}`
   - Parameters: `days` (calculated from the timeframe)
   - Query key includes the timeframe for proper caching

### Backend

#### Endpoint: `analytics_v2.py`
```python
@router.get("/overview/{account_id}")
def get_overview(
    account_id: int,
    refresh_prices: bool = False,
    max_api_calls: int = 5,
    start_date: Optional[date] = None,  # NEW
    end_date: Optional[date] = None,  # NEW
    db: Session = Depends(get_db)
):
    # Passes dates to the calculator
    stats = calculator.calculate_account_stats(
        account_id,
        update_prices=True,
        max_api_calls=max_api_calls,
        start_date=start_date,
        end_date=end_date
    )
```

#### Service: `performance_calculator_v2.py`
```python
def calculate_account_stats(
    self,
    account_id: int,
    update_prices: bool = True,
    max_api_calls: int = 10,
    start_date=None,  # NEW
    end_date=None  # NEW
) -> Dict:
    # Filter positions by open date
    query = self.db.query(Position).filter(Position.account_id == account_id)

    if start_date:
        query = query.filter(Position.open_date >= start_date)
    if end_date:
        query = query.filter(Position.open_date <= end_date)

    positions = query.all()
    # ... rest of calculation logic
```

## Filter Logic

### Position Filtering
Positions are filtered based on their `open_date`:
- Only positions opened on or after `start_date` are included
- Only positions opened on or before `end_date` are included
- Open positions are always included if they match the date criteria

### Balance History
The balance history shows the account balance at the end of each day:
- Calculated from transactions within the specified number of days
- Does not filter by open date; shows actual historical balances
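The "balance at end of each day" idea can be illustrated with a toy ledger. The data below is invented for illustration; the real calculation lives in the backend analytics service.

```python
# Toy illustration: fold a transaction ledger into end-of-day balances.
from datetime import date
from decimal import Decimal

txns = [
    (date(2025, 1, 2), Decimal("1000")),  # deposit
    (date(2025, 1, 2), Decimal("-250")),  # buy
    (date(2025, 1, 3), Decimal("300")),   # sell proceeds
]

balances: dict[date, Decimal] = {}
running = Decimal("0")
for day, amount in sorted(txns):
    running += amount
    balances[day] = running  # last write per day = end-of-day balance

print(balances)  # Jan 2 ends at 750, Jan 3 at 1050
```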
## Caching Strategy

React Query cache keys include the timeframe parameters to ensure:
1. Different timeframes don't conflict in the cache
2. Changing the timeframe triggers new API calls
3. Cache invalidation works correctly

Cache keys:
- Overview: `['analytics', 'overview', accountId, startDate, endDate]`
- Balance: `['analytics', 'balance-history', accountId, timeframe]`

## User Experience

### Performance
- Balance history queries are fast (no market data needed)
- Overview queries use cached prices by default (fast)
- Users can still trigger a price refresh within the filtered timeframe

### Visual Feedback
- The filter immediately updates both metrics and chart
- Loading states are handled by React Query
- Stale data is shown while fetching (stale-while-revalidate pattern)

## Testing Checklist

- [ ] All timeframe options work correctly
- [ ] Metrics update when the timeframe changes
- [ ] Balance history chart adjusts to show the correct date range
- [ ] "All Time" shows complete data
- [ ] Year to Date calculation is accurate
- [ ] Filter persists during price refresh
- [ ] Cache invalidation works properly
- [ ] UI shows loading states appropriately

## Future Enhancements

Potential improvements:
1. Add a custom date range picker
2. Compare multiple timeframes side-by-side
3. Save the preferred timeframe in user settings
4. Add a timeframe filter to the Transactions table
5. Add timeframe presets for tax years and quarters
6. Export filtered data to CSV

## Related Components

- `TimeframeFilter.tsx` - Reusable dropdown component
- `getTimeframeDates()` - Helper function to convert a timeframe to dates
- `TransactionTable.tsx` - Already uses timeframe filtering

## API Reference

### GET /analytics/overview/{account_id}
```
Query Parameters:
- refresh_prices: boolean (default: false)
- max_api_calls: integer (default: 5)
- start_date: date (optional, format: YYYY-MM-DD)
- end_date: date (optional, format: YYYY-MM-DD)
```

### GET /analytics/balance-history/{account_id}
```
Query Parameters:
- days: integer (default: 30, max: 3650)
```
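Putting the overview endpoint's parameters together, a request URL might be assembled like this. The host `localhost:8000` (the backend's published port) and account id `1` are placeholder values.

```python
# Assemble a request URL for the overview endpoint documented above.
from urllib.parse import urlencode

base = "http://localhost:8000/analytics"  # placeholder host
account_id = 1                            # placeholder account id
params = urlencode({"start_date": "2025-01-01", "end_date": "2025-03-31"})
url = f"{base}/overview/{account_id}?{params}"
print(url)
```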
33
frontend/Dockerfile
Normal file
@@ -0,0 +1,33 @@
# Multi-stage build for the React frontend

# Build stage
FROM node:20-alpine AS build

WORKDIR /app

# Copy package files
COPY package*.json ./

# Install dependencies
# Use npm install instead of npm ci since package-lock.json may not exist
RUN npm install

# Copy source code
COPY . .

# Build application
RUN npm run build

# Production stage with nginx
FROM nginx:alpine

# Copy built files from the build stage
COPY --from=build /app/dist /usr/share/nginx/html

# Copy nginx configuration
COPY nginx.conf /etc/nginx/conf.d/default.conf

# Expose port
EXPOSE 80

CMD ["nginx", "-g", "daemon off;"]
13
frontend/index.html
Normal file
@@ -0,0 +1,13 @@
<!doctype html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <link rel="icon" type="image/svg+xml" href="/vite.svg" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <title>myFidelityTracker</title>
  </head>
  <body>
    <div id="root"></div>
    <script type="module" src="/src/main.tsx"></script>
  </body>
</html>
35
frontend/nginx.conf
Normal file
@@ -0,0 +1,35 @@
server {
    listen 80;
    server_name localhost;
    root /usr/share/nginx/html;
    index index.html;

    # Gzip compression
    gzip on;
    gzip_types text/plain text/css application/json application/javascript text/xml application/xml application/xml+rss text/javascript;

    # SPA routing - serve index.html for all routes
    location / {
        try_files $uri $uri/ /index.html;
        # Don't cache HTML to ensure new builds are loaded
        add_header Cache-Control "no-cache, no-store, must-revalidate";
        add_header Pragma "no-cache";
        add_header Expires "0";
    }

    # Proxy API requests to the backend
    location /api {
        proxy_pass http://backend:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }

    # Cache static assets with versioned filenames (hash in the name)
    # The hash changes when content changes, so a long cache is safe
    location ~* \.(js|css|png|jpg|jpeg|gif|ico|svg)$ {
        expires 1y;
        add_header Cache-Control "public, immutable";
    }
}
4544
frontend/package-lock.json
generated
Normal file
File diff suppressed because it is too large
38
frontend/package.json
Normal file
@@ -0,0 +1,38 @@
{
  "name": "myfidelitytracker-frontend",
  "private": true,
  "version": "1.0.0",
  "type": "module",
  "scripts": {
    "dev": "vite",
    "build": "tsc && vite build",
    "preview": "vite preview",
    "lint": "eslint . --ext ts,tsx --report-unused-disable-directives --max-warnings 0"
  },
  "dependencies": {
    "react": "^18.2.0",
    "react-dom": "^18.2.0",
    "react-router-dom": "^6.21.1",
    "@tanstack/react-query": "^5.17.9",
    "axios": "^1.6.5",
    "recharts": "^2.10.3",
    "react-dropzone": "^14.2.3",
    "date-fns": "^3.0.6",
    "clsx": "^2.1.0"
  },
  "devDependencies": {
    "@types/react": "^18.2.48",
    "@types/react-dom": "^18.2.18",
    "@typescript-eslint/eslint-plugin": "^6.19.0",
    "@typescript-eslint/parser": "^6.19.0",
    "@vitejs/plugin-react": "^4.2.1",
    "autoprefixer": "^10.4.16",
    "eslint": "^8.56.0",
    "eslint-plugin-react-hooks": "^4.6.0",
    "eslint-plugin-react-refresh": "^0.4.5",
    "postcss": "^8.4.33",
    "tailwindcss": "^3.4.1",
    "typescript": "^5.3.3",
    "vite": "^5.0.11"
  }
}
6
frontend/postcss.config.js
Normal file
@@ -0,0 +1,6 @@
export default {
  plugins: {
    tailwindcss: {},
    autoprefixer: {},
  },
}
118
frontend/src/App.tsx
Normal file
@@ -0,0 +1,118 @@
import { useState, useEffect } from 'react';
import { useQuery } from '@tanstack/react-query';
import { accountsApi } from './api/client';
import DashboardV2 from './components/DashboardV2';
import AccountManager from './components/AccountManager';
import TransactionTable from './components/TransactionTable';
import ImportDropzone from './components/ImportDropzone';
import type { Account } from './types';

/**
 * Main application component.
 * Manages navigation and selected account state.
 */
function App() {
  const [selectedAccountId, setSelectedAccountId] = useState<number | null>(null);
  const [currentView, setCurrentView] = useState<'dashboard' | 'transactions' | 'import' | 'accounts'>('dashboard');

  // Fetch accounts
  const { data: accounts, isLoading, refetch: refetchAccounts } = useQuery({
    queryKey: ['accounts'],
    queryFn: async () => {
      const response = await accountsApi.list();
      return response.data;
    },
  });

  // Auto-select the first account
  useEffect(() => {
    if (accounts && accounts.length > 0 && !selectedAccountId) {
      setSelectedAccountId(accounts[0].id);
    }
  }, [accounts, selectedAccountId]);

  return (
    <div className="min-h-screen bg-robinhood-bg">
      {/* Header */}
      <header className="bg-white shadow-sm border-b border-gray-200">
        <div className="max-w-7xl mx-auto px-4 sm:px-6 lg:px-8 py-4">
          <div className="flex items-center justify-between">
            <h1 className="text-2xl font-bold text-gray-900">myFidelityTracker</h1>

            {/* Account Selector */}
            {accounts && accounts.length > 0 && (
              <div className="flex items-center gap-4">
                <select
                  value={selectedAccountId || ''}
                  onChange={(e) => setSelectedAccountId(Number(e.target.value))}
                  className="input max-w-xs"
                >
                  {accounts.map((account: Account) => (
                    <option key={account.id} value={account.id}>
                      {account.account_name} ({account.account_number})
                    </option>
                  ))}
                </select>
              </div>
            )}
          </div>
        </div>
      </header>

      {/* Navigation */}
      <nav className="bg-white border-b border-gray-200">
        <div className="max-w-7xl mx-auto px-4 sm:px-6 lg:px-8">
          <div className="flex space-x-8">
            {['dashboard', 'transactions', 'import', 'accounts'].map((view) => (
              <button
                key={view}
                onClick={() => setCurrentView(view as typeof currentView)}
                className={`py-4 px-1 border-b-2 font-medium text-sm transition-colors ${
                  currentView === view
                    ? 'border-robinhood-green text-robinhood-green'
                    : 'border-transparent text-gray-500 hover:text-gray-700 hover:border-gray-300'
                }`}
              >
                {view.charAt(0).toUpperCase() + view.slice(1)}
              </button>
            ))}
          </div>
        </div>
      </nav>

      {/* Main Content */}
      <main className="max-w-7xl mx-auto px-4 sm:px-6 lg:px-8 py-8">
        {isLoading ? (
          <div className="flex items-center justify-center h-64">
            <div className="text-gray-500">Loading...</div>
          </div>
        ) : !selectedAccountId && currentView !== 'accounts' ? (
          <div className="text-center py-12">
            <h3 className="text-lg font-medium text-gray-900 mb-2">No accounts found</h3>
            <p className="text-gray-500 mb-4">Create an account to get started</p>
            <button onClick={() => setCurrentView('accounts')} className="btn-primary">
              Create Account
            </button>
          </div>
        ) : (
          <>
            {currentView === 'dashboard' && selectedAccountId && (
              <DashboardV2 accountId={selectedAccountId} />
            )}
            {currentView === 'transactions' && selectedAccountId && (
              <TransactionTable accountId={selectedAccountId} />
            )}
            {currentView === 'import' && selectedAccountId && (
              <ImportDropzone accountId={selectedAccountId} />
            )}
            {currentView === 'accounts' && (
              <AccountManager onAccountCreated={refetchAccounts} />
            )}
          </>
        )}
      </main>
    </div>
  );
}

export default App;
108
frontend/src/api/client.ts
Normal file
@@ -0,0 +1,108 @@
/**
 * API client for communicating with the backend.
 */
import axios from 'axios';
import type {
  Account,
  Transaction,
  Position,
  AccountStats,
  BalancePoint,
  Trade,
  ImportResult,
} from '../types';

// Configure the axios instance
const api = axios.create({
  baseURL: '/api',
  headers: {
    'Content-Type': 'application/json',
  },
});

// Account APIs
export const accountsApi = {
  list: () => api.get<Account[]>('/accounts'),
  get: (id: number) => api.get<Account>(`/accounts/${id}`),
  create: (data: {
    account_number: string;
    account_name: string;
    account_type: 'cash' | 'margin';
  }) => api.post<Account>('/accounts', data),
  update: (id: number, data: Partial<Account>) =>
    api.put<Account>(`/accounts/${id}`, data),
  delete: (id: number) => api.delete(`/accounts/${id}`),
};

// Transaction APIs
export const transactionsApi = {
  list: (params?: {
    account_id?: number;
    symbol?: string;
    start_date?: string;
    end_date?: string;
    skip?: number;
    limit?: number;
  }) => api.get<Transaction[]>('/transactions', { params }),
  get: (id: number) => api.get<Transaction>(`/transactions/${id}`),
  getPositionDetails: (id: number) => api.get<any>(`/transactions/${id}/position-details`),
};

// Position APIs
export const positionsApi = {
  list: (params?: {
    account_id?: number;
    status?: 'open' | 'closed';
    symbol?: string;
    skip?: number;
    limit?: number;
  }) => api.get<Position[]>('/positions', { params }),
  get: (id: number) => api.get<Position>(`/positions/${id}`),
  rebuild: (accountId: number) =>
    api.post<{ positions_created: number }>(`/positions/${accountId}/rebuild`),
};

// Analytics APIs
export const analyticsApi = {
  getOverview: (accountId: number, params?: { refresh_prices?: boolean; max_api_calls?: number; start_date?: string; end_date?: string }) =>
    api.get<AccountStats>(`/analytics/overview/${accountId}`, { params }),
  getBalanceHistory: (accountId: number, days: number = 30) =>
    api.get<{ data: BalancePoint[] }>(`/analytics/balance-history/${accountId}`, {
      params: { days },
    }),
  getTopTrades: (accountId: number, limit: number = 10, startDate?: string, endDate?: string) =>
    api.get<{ data: Trade[] }>(`/analytics/top-trades/${accountId}`, {
      params: { limit, start_date: startDate, end_date: endDate },
    }),
  getWorstTrades: (accountId: number, limit: number = 10, startDate?: string, endDate?: string) =>
    api.get<{ data: Trade[] }>(`/analytics/worst-trades/${accountId}`, {
      params: { limit, start_date: startDate, end_date: endDate },
    }),
  updatePnL: (accountId: number) =>
    api.post<{ positions_updated: number }>(`/analytics/update-pnl/${accountId}`),
  refreshPrices: (accountId: number, params?: { max_api_calls?: number }) =>
    api.post<{ message: string; stats: any }>(`/analytics/refresh-prices/${accountId}`, null, { params }),
  refreshPricesBackground: (accountId: number, params?: { max_api_calls?: number }) =>
    api.post<{ message: string; account_id: number }>(`/analytics/refresh-prices-background/${accountId}`, null, { params }),
};

// Import APIs
export const importApi = {
  uploadCsv: (accountId: number, file: File) => {
    const formData = new FormData();
    formData.append('file', file);
    return api.post<ImportResult>(`/import/upload/${accountId}`, formData, {
      headers: {
        'Content-Type': 'multipart/form-data',
      },
    });
  },
  importFromFilesystem: (accountId: number) =>
    api.post<{
      files: Record<string, Omit<ImportResult, 'filename'>>;
      total_imported: number;
      positions_created: number;
    }>(`/import/filesystem/${accountId}`),
};

export default api;
177
frontend/src/components/AccountManager.tsx
Normal file
@@ -0,0 +1,177 @@
import { useState } from 'react';
import { useQuery, useMutation, useQueryClient } from '@tanstack/react-query';
import { accountsApi } from '../api/client';

interface AccountManagerProps {
  onAccountCreated: () => void;
}

/**
 * Component for managing accounts (create, list, delete).
 */
export default function AccountManager({ onAccountCreated }: AccountManagerProps) {
  const [showForm, setShowForm] = useState(false);
  const [formData, setFormData] = useState({
    account_number: '',
    account_name: '',
    account_type: 'cash' as 'cash' | 'margin',
  });

  const queryClient = useQueryClient();

  // Fetch accounts
  const { data: accounts, isLoading } = useQuery({
    queryKey: ['accounts'],
    queryFn: async () => {
      const response = await accountsApi.list();
      return response.data;
    },
  });

  // Create account mutation
  const createMutation = useMutation({
    mutationFn: accountsApi.create,
    onSuccess: () => {
      queryClient.invalidateQueries({ queryKey: ['accounts'] });
      setFormData({ account_number: '', account_name: '', account_type: 'cash' });
      setShowForm(false);
      onAccountCreated();
    },
  });

  // Delete account mutation
  const deleteMutation = useMutation({
    mutationFn: accountsApi.delete,
    onSuccess: () => {
      queryClient.invalidateQueries({ queryKey: ['accounts'] });
    },
  });

  const handleSubmit = (e: React.FormEvent) => {
    e.preventDefault();
    createMutation.mutate(formData);
  };

  return (
    <div className="space-y-6">
      {/* Header */}
      <div className="flex items-center justify-between">
        <h2 className="text-2xl font-bold">Accounts</h2>
        <button onClick={() => setShowForm(!showForm)} className="btn-primary">
          {showForm ? 'Cancel' : 'Add Account'}
        </button>
      </div>

      {/* Create Form */}
      {showForm && (
        <div className="card">
          <h3 className="text-lg font-semibold mb-4">Create New Account</h3>
          <form onSubmit={handleSubmit} className="space-y-4">
            <div>
              <label className="label">Account Number</label>
              <input
                type="text"
                required
                value={formData.account_number}
                onChange={(e) =>
                  setFormData({ ...formData, account_number: e.target.value })
                }
                className="input"
                placeholder="X38661988"
              />
            </div>

            <div>
              <label className="label">Account Name</label>
              <input
                type="text"
                required
                value={formData.account_name}
                onChange={(e) =>
                  setFormData({ ...formData, account_name: e.target.value })
                }
                className="input"
                placeholder="My Trading Account"
              />
            </div>

            <div>
              <label className="label">Account Type</label>
              <select
                value={formData.account_type}
                onChange={(e) =>
                  setFormData({
                    ...formData,
                    account_type: e.target.value as 'cash' | 'margin',
                  })
                }
                className="input"
              >
                <option value="cash">Cash</option>
                <option value="margin">Margin</option>
              </select>
            </div>

            <button
              type="submit"
              disabled={createMutation.isPending}
              className="btn-primary w-full disabled:opacity-50"
            >
              {createMutation.isPending ? 'Creating...' : 'Create Account'}
            </button>

            {createMutation.isError && (
              <div className="p-4 bg-red-50 border border-red-200 rounded-lg text-red-800 text-sm">
                Error: {(createMutation.error as any)?.response?.data?.detail || 'Failed to create account'}
              </div>
            )}
          </form>
        </div>
      )}

      {/* Accounts List */}
|
<div className="card">
|
||||||
|
<h3 className="text-lg font-semibold mb-4">Your Accounts</h3>
|
||||||
|
|
||||||
|
{isLoading ? (
|
||||||
|
<div className="text-center py-12 text-gray-500">Loading accounts...</div>
|
||||||
|
) : !accounts || accounts.length === 0 ? (
|
||||||
|
<div className="text-center py-12 text-gray-500">
|
||||||
|
No accounts yet. Create your first account to get started.
|
||||||
|
</div>
|
||||||
|
) : (
|
||||||
|
<div className="space-y-4">
|
||||||
|
{accounts.map((account) => (
|
||||||
|
<div
|
||||||
|
key={account.id}
|
||||||
|
className="flex items-center justify-between p-4 border border-gray-200 rounded-lg hover:bg-gray-50"
|
||||||
|
>
|
||||||
|
<div>
|
||||||
|
<h4 className="font-semibold text-lg">{account.account_name}</h4>
|
||||||
|
<p className="text-sm text-gray-600">
|
||||||
|
{account.account_number} • {account.account_type}
|
||||||
|
</p>
|
||||||
|
<p className="text-xs text-gray-500 mt-1">
|
||||||
|
Created {new Date(account.created_at).toLocaleDateString()}
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<button
|
||||||
|
onClick={() => {
|
||||||
|
if (confirm(`Delete account ${account.account_name}? This will delete all transactions and positions.`)) {
|
||||||
|
deleteMutation.mutate(account.id);
|
||||||
|
}
|
||||||
|
}}
|
||||||
|
disabled={deleteMutation.isPending}
|
||||||
|
className="btn-danger disabled:opacity-50"
|
||||||
|
>
|
||||||
|
Delete
|
||||||
|
</button>
|
||||||
|
</div>
|
||||||
|
))}
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
);
|
||||||
|
}
|
||||||
195 frontend/src/components/Dashboard.tsx Normal file
@@ -0,0 +1,195 @@
import { useQuery } from '@tanstack/react-query';
import { analyticsApi, positionsApi } from '../api/client';
import MetricsCards from './MetricsCards';
import PerformanceChart from './PerformanceChart';
import PositionCard from './PositionCard';

interface DashboardProps {
  accountId: number;
}

/**
 * Parse option symbol to extract expiration and strike.
 * Format: -SYMBOL251017C6 -> Oct 17 '25 $6C
 */
function parseOptionSymbol(optionSymbol: string | null): string {
  if (!optionSymbol) return '-';

  // Extract components: -OPEN251017C6 -> YYMMDD + C/P + Strike
  const match = optionSymbol.match(/(\d{6})([CP])([\d.]+)$/);
  if (!match) return optionSymbol;

  const [, dateStr, callPut, strike] = match;

  // Parse date: YYMMDD
  const year = '20' + dateStr.substring(0, 2);
  const month = dateStr.substring(2, 4);
  const day = dateStr.substring(4, 6);

  const date = new Date(`${year}-${month}-${day}`);
  const monthName = date.toLocaleDateString('en-US', { month: 'short' });
  const dayNum = date.getDate();
  const yearShort = dateStr.substring(0, 2);

  return `${monthName} ${dayNum} '${yearShort} $${strike}${callPut}`;
}

/**
 * Main dashboard showing overview metrics, charts, and positions.
 */
export default function Dashboard({ accountId }: DashboardProps) {
  // Helper to safely convert to number
  const toNumber = (val: any): number | null => {
    if (val === null || val === undefined) return null;
    const num = typeof val === 'number' ? val : parseFloat(val);
    return isNaN(num) ? null : num;
  };

  // Fetch overview stats
  const { data: stats, isLoading: statsLoading } = useQuery({
    queryKey: ['analytics', 'overview', accountId],
    queryFn: async () => {
      const response = await analyticsApi.getOverview(accountId);
      return response.data;
    },
  });

  // Fetch balance history
  const { data: balanceHistory } = useQuery({
    queryKey: ['analytics', 'balance-history', accountId],
    queryFn: async () => {
      const response = await analyticsApi.getBalanceHistory(accountId, 180);
      return response.data.data;
    },
  });

  // Fetch open positions
  const { data: openPositions } = useQuery({
    queryKey: ['positions', 'open', accountId],
    queryFn: async () => {
      const response = await positionsApi.list({
        account_id: accountId,
        status: 'open',
        limit: 10,
      });
      return response.data;
    },
  });

  // Fetch top trades
  const { data: topTrades } = useQuery({
    queryKey: ['analytics', 'top-trades', accountId],
    queryFn: async () => {
      const response = await analyticsApi.getTopTrades(accountId, 5);
      return response.data.data;
    },
  });

  if (statsLoading) {
    return <div className="text-center py-12 text-gray-500">Loading dashboard...</div>;
  }

  return (
    <div className="space-y-8">
      {/* Metrics Cards */}
      <MetricsCards stats={stats!} />

      {/* Performance Chart */}
      <div className="card">
        <h2 className="text-xl font-semibold mb-4">Balance History</h2>
        <PerformanceChart data={balanceHistory || []} />
      </div>

      {/* Open Positions */}
      <div className="card">
        <h2 className="text-xl font-semibold mb-4">Open Positions</h2>
        {openPositions && openPositions.length > 0 ? (
          <div className="grid grid-cols-1 md:grid-cols-2 gap-4">
            {openPositions.map((position) => (
              <PositionCard key={position.id} position={position} />
            ))}
          </div>
        ) : (
          <p className="text-gray-500 text-center py-8">No open positions</p>
        )}
      </div>

      {/* Top Trades */}
      <div className="card">
        <h2 className="text-xl font-semibold mb-4">Top Performing Trades</h2>
        {topTrades && topTrades.length > 0 ? (
          <div className="overflow-x-auto">
            <table className="min-w-full divide-y divide-gray-200">
              <thead>
                <tr>
                  <th className="px-4 py-3 text-left text-xs font-medium text-gray-500 uppercase">
                    Symbol
                  </th>
                  <th className="px-4 py-3 text-left text-xs font-medium text-gray-500 uppercase">
                    Type
                  </th>
                  <th className="px-4 py-3 text-left text-xs font-medium text-gray-500 uppercase">
                    Contract
                  </th>
                  <th className="px-4 py-3 text-left text-xs font-medium text-gray-500 uppercase">
                    Dates
                  </th>
                  <th className="px-4 py-3 text-right text-xs font-medium text-gray-500 uppercase">
                    Entry
                  </th>
                  <th className="px-4 py-3 text-right text-xs font-medium text-gray-500 uppercase">
                    Exit
                  </th>
                  <th className="px-4 py-3 text-right text-xs font-medium text-gray-500 uppercase">
                    P&L
                  </th>
                </tr>
              </thead>
              <tbody className="divide-y divide-gray-200">
                {topTrades.map((trade, idx) => {
                  const entryPrice = toNumber(trade.entry_price);
                  const exitPrice = toNumber(trade.exit_price);
                  const pnl = toNumber(trade.realized_pnl);
                  const isOption = trade.position_type === 'call' || trade.position_type === 'put';

                  return (
                    <tr key={idx}>
                      <td className="px-4 py-3 font-medium">{trade.symbol}</td>
                      <td className="px-4 py-3 text-sm text-gray-500 capitalize">
                        {isOption ? trade.position_type : 'Stock'}
                      </td>
                      <td className="px-4 py-3 text-sm text-gray-500">
                        {isOption ? parseOptionSymbol(trade.option_symbol) : '-'}
                      </td>
                      <td className="px-4 py-3 text-sm text-gray-500">
                        {new Date(trade.open_date).toLocaleDateString()} →{' '}
                        {trade.close_date
                          ? new Date(trade.close_date).toLocaleDateString()
                          : 'Open'}
                      </td>
                      <td className="px-4 py-3 text-sm text-right">
                        {entryPrice !== null ? `$${entryPrice.toFixed(2)}` : '-'}
                      </td>
                      <td className="px-4 py-3 text-sm text-right">
                        {exitPrice !== null ? `$${exitPrice.toFixed(2)}` : '-'}
                      </td>
                      <td
                        className={`px-4 py-3 text-right font-semibold ${
                          pnl !== null && pnl >= 0 ? 'text-profit' : 'text-loss'
                        }`}
                      >
                        {pnl !== null ? `$${pnl.toFixed(2)}` : '-'}
                      </td>
                    </tr>
                  );
                })}
              </tbody>
            </table>
          </div>
        ) : (
          <p className="text-gray-500 text-center py-8">No closed trades yet</p>
        )}
      </div>
    </div>
  );
}
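For reference, the date handling in `parseOptionSymbol` can be exercised outside React. Below is a minimal standalone sketch of the same parsing logic; it formats the month from the digits directly rather than round-tripping through `Date`, which avoids the timezone sensitivity of `getDate()` on an ISO string. The sample symbol is hypothetical.

```typescript
// Standalone sketch of the option-symbol parsing logic above.
// Symbols look like "-OPEN251017C6": trailing YYMMDD + C/P + strike.
const MONTHS = ['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun', 'Jul', 'Aug', 'Sep', 'Oct', 'Nov', 'Dec'];

function parseOptionSymbol(optionSymbol: string | null): string {
  if (!optionSymbol) return '-';
  const match = optionSymbol.match(/(\d{6})([CP])([\d.]+)$/);
  if (!match) return optionSymbol; // not an option symbol: pass through unchanged
  const [, dateStr, callPut, strike] = match;
  const monthName = MONTHS[parseInt(dateStr.substring(2, 4), 10) - 1];
  const day = parseInt(dateStr.substring(4, 6), 10);
  const yearShort = dateStr.substring(0, 2);
  return `${monthName} ${day} '${yearShort} $${strike}${callPut}`;
}

console.log(parseOptionSymbol('-OPEN251017C6')); // "Oct 17 '25 $6C"
```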
316 frontend/src/components/DashboardV2.tsx Normal file
@@ -0,0 +1,316 @@
import { useQuery, useQueryClient, useMutation } from '@tanstack/react-query';
import { useState } from 'react';
import { analyticsApi, positionsApi } from '../api/client';
import MetricsCards from './MetricsCards';
import PerformanceChart from './PerformanceChart';
import PositionCard from './PositionCard';
import TimeframeFilter, { TimeframeOption, getTimeframeDates } from './TimeframeFilter';

interface DashboardProps {
  accountId: number;
}

/**
 * Enhanced dashboard with stale-while-revalidate pattern.
 *
 * Shows cached data immediately, then updates in background.
 * Provides manual refresh button for fresh data.
 */
export default function DashboardV2({ accountId }: DashboardProps) {
  const queryClient = useQueryClient();
  const [isRefreshing, setIsRefreshing] = useState(false);
  const [timeframe, setTimeframe] = useState<TimeframeOption>('all');

  // Convert timeframe to days for balance history
  const getDaysFromTimeframe = (tf: TimeframeOption): number => {
    switch (tf) {
      case 'last30days': return 30;
      case 'last90days': return 90;
      case 'last180days': return 180;
      case 'last1year': return 365;
      case 'ytd': {
        const now = new Date();
        const startOfYear = new Date(now.getFullYear(), 0, 1);
        return Math.ceil((now.getTime() - startOfYear.getTime()) / (1000 * 60 * 60 * 24));
      }
      case 'all':
      default:
        return 3650; // ~10 years
    }
  };

  // Get date range from timeframe for filtering
  const { startDate, endDate } = getTimeframeDates(timeframe);

  // Fetch overview stats (with cached prices - fast!)
  const {
    data: stats,
    isLoading: statsLoading,
    dataUpdatedAt: statsUpdatedAt,
  } = useQuery({
    queryKey: ['analytics', 'overview', accountId, startDate, endDate],
    queryFn: async () => {
      // Default: use cached prices (no API calls to Yahoo Finance)
      const response = await analyticsApi.getOverview(accountId, {
        refresh_prices: false,
        max_api_calls: 0,
        start_date: startDate,
        end_date: endDate,
      });
      return response.data;
    },
    // Keep showing old data while fetching new
    staleTime: 30000, // 30 seconds
    // Refetch in background when window regains focus
    refetchOnWindowFocus: true,
  });

  // Fetch balance history (doesn't need market data - always fast)
  const { data: balanceHistory } = useQuery({
    queryKey: ['analytics', 'balance-history', accountId, timeframe],
    queryFn: async () => {
      const days = getDaysFromTimeframe(timeframe);
      const response = await analyticsApi.getBalanceHistory(accountId, days);
      return response.data.data;
    },
    staleTime: 60000, // 1 minute
  });

  // Fetch open positions
  const { data: openPositions } = useQuery({
    queryKey: ['positions', 'open', accountId],
    queryFn: async () => {
      const response = await positionsApi.list({
        account_id: accountId,
        status: 'open',
        limit: 10,
      });
      return response.data;
    },
    staleTime: 30000,
  });

  // Fetch top trades (doesn't need market data - always fast)
  const { data: topTrades } = useQuery({
    queryKey: ['analytics', 'top-trades', accountId],
    queryFn: async () => {
      const response = await analyticsApi.getTopTrades(accountId, 5);
      return response.data.data;
    },
    staleTime: 60000,
  });

  // Mutation for manual price refresh
  const refreshPricesMutation = useMutation({
    mutationFn: async () => {
      // Trigger background refresh
      await analyticsApi.refreshPricesBackground(accountId, { max_api_calls: 15 });

      // Wait a bit, then refetch overview
      await new Promise((resolve) => setTimeout(resolve, 2000));

      // Refetch with fresh prices
      const response = await analyticsApi.getOverview(accountId, {
        refresh_prices: true,
        max_api_calls: 15,
      });
      return response.data;
    },
    onSuccess: (data) => {
      // Update the cache with fresh data
      queryClient.setQueryData(['analytics', 'overview', accountId], data);
      setIsRefreshing(false);
    },
    onError: () => {
      setIsRefreshing(false);
    },
  });

  const handleRefreshPrices = () => {
    setIsRefreshing(true);
    refreshPricesMutation.mutate();
  };

  // Calculate data age
  const getDataAge = () => {
    if (!statsUpdatedAt) return null;
    const ageSeconds = Math.floor((Date.now() - statsUpdatedAt) / 1000);

    if (ageSeconds < 60) return `${ageSeconds}s ago`;
    const ageMinutes = Math.floor(ageSeconds / 60);
    if (ageMinutes < 60) return `${ageMinutes}m ago`;
    const ageHours = Math.floor(ageMinutes / 60);
    return `${ageHours}h ago`;
  };

  // Check if we have update stats from the API
  const hasUpdateStats = stats?.price_update_stats;
  const updateStats = stats?.price_update_stats;

  if (statsLoading && !stats) {
    // First load - show loading
    return (
      <div className="text-center py-12 text-gray-500">
        Loading dashboard...
      </div>
    );
  }

  if (!stats) {
    // Error state or no data
    return (
      <div className="text-center py-12 text-gray-500">
        Unable to load dashboard data. Please try refreshing the page.
      </div>
    );
  }

  return (
    <div className="space-y-8">
      {/* Timeframe Filter */}
      <div className="flex items-center justify-between">
        <div>
          <label className="block text-sm font-medium text-gray-700 mb-2">
            Timeframe
          </label>
          <TimeframeFilter value={timeframe} onChange={(value) => setTimeframe(value as TimeframeOption)} />
        </div>
      </div>

      {/* Data freshness indicator and refresh button */}
      <div className="flex items-center justify-between bg-gray-50 px-4 py-3 rounded-lg border border-gray-200">
        <div className="flex items-center space-x-4">
          <div className="text-sm text-gray-600">
            {stats && (
              <>
                <span className="font-medium">Last updated:</span>{' '}
                {getDataAge() || 'just now'}
              </>
            )}
          </div>

          {hasUpdateStats && updateStats && (
            <div className="text-xs text-gray-500 border-l border-gray-300 pl-4">
              {updateStats.cached > 0 && (
                <span className="mr-3">
                  📦 {updateStats.cached} cached
                </span>
              )}
              {updateStats.failed > 0 && (
                <span className="text-orange-600">
                  ⚠️ {updateStats.failed} unavailable
                </span>
              )}
            </div>
          )}
        </div>

        <button
          onClick={handleRefreshPrices}
          disabled={isRefreshing}
          className={`px-4 py-2 text-sm font-medium rounded-lg transition-colors ${
            isRefreshing
              ? 'bg-gray-300 text-gray-500 cursor-not-allowed'
              : 'bg-primary text-white hover:bg-primary-dark'
          }`}
        >
          {isRefreshing ? (
            <>
              <svg className="animate-spin -ml-1 mr-2 h-4 w-4 inline" fill="none" viewBox="0 0 24 24">
                <circle className="opacity-25" cx="12" cy="12" r="10" stroke="currentColor" strokeWidth="4"/>
                <path className="opacity-75" fill="currentColor" d="M4 12a8 8 0 018-8V0C5.373 0 0 5.373 0 12h4zm2 5.291A7.962 7.962 0 014 12H0c0 3.042 1.135 5.824 3 7.938l3-2.647z"/>
              </svg>
              Refreshing...
            </>
          ) : (
            <>
              🔄 Refresh Prices
            </>
          )}
        </button>
      </div>

      {/* Show info banner if using stale data */}
      {stats && !hasUpdateStats && (
        <div className="bg-blue-50 border border-blue-200 rounded-lg p-4 text-sm text-blue-800">
          <strong>💡 Tip:</strong> Showing cached data for fast loading. Click "Refresh Prices" to get the latest market prices.
        </div>
      )}

      {/* Metrics Cards */}
      <MetricsCards stats={stats} />

      {/* Performance Chart */}
      <div className="card">
        <h2 className="text-xl font-semibold mb-4">Balance History</h2>
        <PerformanceChart data={balanceHistory || []} />
      </div>

      {/* Open Positions */}
      <div className="card">
        <h2 className="text-xl font-semibold mb-4">Open Positions</h2>
        {openPositions && openPositions.length > 0 ? (
          <div className="grid grid-cols-1 md:grid-cols-2 gap-4">
            {openPositions.map((position) => (
              <PositionCard key={position.id} position={position} />
            ))}
          </div>
        ) : (
          <p className="text-gray-500 text-center py-8">No open positions</p>
        )}
      </div>

      {/* Top Trades */}
      <div className="card">
        <h2 className="text-xl font-semibold mb-4">Top Performing Trades</h2>
        {topTrades && topTrades.length > 0 ? (
          <div className="overflow-x-auto">
            <table className="min-w-full divide-y divide-gray-200">
              <thead>
                <tr>
                  <th className="px-4 py-3 text-left text-xs font-medium text-gray-500 uppercase">
                    Symbol
                  </th>
                  <th className="px-4 py-3 text-left text-xs font-medium text-gray-500 uppercase">
                    Type
                  </th>
                  <th className="px-4 py-3 text-left text-xs font-medium text-gray-500 uppercase">
                    Dates
                  </th>
                  <th className="px-4 py-3 text-right text-xs font-medium text-gray-500 uppercase">
                    P&L
                  </th>
                </tr>
              </thead>
              <tbody className="divide-y divide-gray-200">
                {topTrades.map((trade, idx) => (
                  <tr key={idx}>
                    <td className="px-4 py-3 font-medium">{trade.symbol}</td>
                    <td className="px-4 py-3 text-sm text-gray-500 capitalize">
                      {trade.position_type}
                    </td>
                    <td className="px-4 py-3 text-sm text-gray-500">
                      {new Date(trade.open_date).toLocaleDateString()} →{' '}
                      {trade.close_date
                        ? new Date(trade.close_date).toLocaleDateString()
                        : 'Open'}
                    </td>
                    <td
                      className={`px-4 py-3 text-right font-semibold ${
                        trade.realized_pnl >= 0 ? 'text-profit' : 'text-loss'
                      }`}
                    >
                      ${trade.realized_pnl.toFixed(2)}
                    </td>
                  </tr>
                ))}
              </tbody>
            </table>
          </div>
        ) : (
          <p className="text-gray-500 text-center py-8">No closed trades yet</p>
        )}
      </div>
    </div>
  );
}
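The timeframe-to-days mapping in `DashboardV2` is a pure function and straightforward to test in isolation. Below is a sketch of the same logic with an injectable `now` parameter added for determinism (the component version always uses the current date):

```typescript
type TimeframeOption = 'last30days' | 'last90days' | 'last180days' | 'last1year' | 'ytd' | 'all';

const MS_PER_DAY = 1000 * 60 * 60 * 24;

function getDaysFromTimeframe(tf: TimeframeOption, now: Date = new Date()): number {
  switch (tf) {
    case 'last30days': return 30;
    case 'last90days': return 90;
    case 'last180days': return 180;
    case 'last1year': return 365;
    case 'ytd': {
      // Days elapsed since Jan 1 of the current (local) year
      const startOfYear = new Date(now.getFullYear(), 0, 1);
      return Math.ceil((now.getTime() - startOfYear.getTime()) / MS_PER_DAY);
    }
    case 'all':
    default:
      return 3650; // ~10 years
  }
}

console.log(getDaysFromTimeframe('ytd', new Date(2025, 0, 31))); // 30
```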
184 frontend/src/components/ImportDropzone.tsx Normal file
@@ -0,0 +1,184 @@
import { useState, useCallback } from 'react';
import { useDropzone } from 'react-dropzone';
import { useMutation, useQueryClient } from '@tanstack/react-query';
import { importApi } from '../api/client';

interface ImportDropzoneProps {
  accountId: number;
}

/**
 * File upload component with drag-and-drop support.
 */
export default function ImportDropzone({ accountId }: ImportDropzoneProps) {
  const [importResult, setImportResult] = useState<any>(null);
  const queryClient = useQueryClient();

  // Upload mutation
  const uploadMutation = useMutation({
    mutationFn: (file: File) => importApi.uploadCsv(accountId, file),
    onSuccess: (response) => {
      setImportResult(response.data);
      // Invalidate queries to refresh data
      queryClient.invalidateQueries({ queryKey: ['transactions', accountId] });
      queryClient.invalidateQueries({ queryKey: ['positions'] });
      queryClient.invalidateQueries({ queryKey: ['analytics'] });
    },
  });

  // Filesystem import mutation
  const filesystemMutation = useMutation({
    mutationFn: () => importApi.importFromFilesystem(accountId),
    onSuccess: (response) => {
      setImportResult(response.data);
      queryClient.invalidateQueries({ queryKey: ['transactions', accountId] });
      queryClient.invalidateQueries({ queryKey: ['positions'] });
      queryClient.invalidateQueries({ queryKey: ['analytics'] });
    },
  });

  const onDrop = useCallback(
    (acceptedFiles: File[]) => {
      if (acceptedFiles.length > 0) {
        setImportResult(null);
        uploadMutation.mutate(acceptedFiles[0]);
      }
    },
    [uploadMutation]
  );

  const { getRootProps, getInputProps, isDragActive } = useDropzone({
    onDrop,
    accept: {
      'text/csv': ['.csv'],
    },
    multiple: false,
  });

  return (
    <div className="space-y-6">
      {/* File Upload Dropzone */}
      <div className="card">
        <h2 className="text-xl font-semibold mb-4">Upload CSV File</h2>

        <div
          {...getRootProps()}
          className={`border-2 border-dashed rounded-lg p-12 text-center cursor-pointer transition-colors ${
            isDragActive
              ? 'border-robinhood-green bg-green-50'
              : 'border-gray-300 hover:border-gray-400'
          }`}
        >
          <input {...getInputProps()} />
          <div className="space-y-2">
            <svg
              className="mx-auto h-12 w-12 text-gray-400"
              stroke="currentColor"
              fill="none"
              viewBox="0 0 48 48"
            >
              <path
                d="M28 8H12a4 4 0 00-4 4v20m32-12v8m0 0v8a4 4 0 01-4 4H12a4 4 0 01-4-4v-4m32-4l-3.172-3.172a4 4 0 00-5.656 0L28 28M8 32l9.172-9.172a4 4 0 015.656 0L28 28m0 0l4 4m4-24h8m-4-4v8m-12 4h.02"
                strokeWidth={2}
                strokeLinecap="round"
                strokeLinejoin="round"
              />
            </svg>
            {isDragActive ? (
              <p className="text-lg text-robinhood-green font-medium">
                Drop the CSV file here
              </p>
            ) : (
              <>
                <p className="text-lg text-gray-600">
                  Drag and drop a Fidelity CSV file here, or click to select
                </p>
                <p className="text-sm text-gray-500">Only .csv files are accepted</p>
              </>
            )}
          </div>
        </div>

        {uploadMutation.isPending && (
          <div className="mt-4 text-center text-gray-600">Uploading and processing...</div>
        )}

        {uploadMutation.isError && (
          <div className="mt-4 p-4 bg-red-50 border border-red-200 rounded-lg text-red-800">
            Error: {(uploadMutation.error as any)?.response?.data?.detail || 'Upload failed'}
          </div>
        )}
      </div>

      {/* Filesystem Import */}
      <div className="card">
        <h2 className="text-xl font-semibold mb-4">Import from Filesystem</h2>
        <p className="text-gray-600 mb-4">
          Import all CSV files from the <code className="bg-gray-100 px-2 py-1 rounded">/imports</code> directory
        </p>
        <button
          onClick={() => {
            setImportResult(null);
            filesystemMutation.mutate();
          }}
          disabled={filesystemMutation.isPending}
          className="btn-primary disabled:opacity-50"
        >
          {filesystemMutation.isPending ? 'Importing...' : 'Import from Filesystem'}
        </button>

        {filesystemMutation.isError && (
          <div className="mt-4 p-4 bg-red-50 border border-red-200 rounded-lg text-red-800">
            Error: {(filesystemMutation.error as any)?.response?.data?.detail || 'Import failed'}
          </div>
        )}
      </div>

      {/* Import Results */}
      {importResult && (
        <div className="card bg-green-50 border border-green-200">
          <h3 className="text-lg font-semibold text-green-900 mb-4">Import Successful</h3>

          {importResult.filename && (
            <div className="mb-4">
              <p className="text-sm text-gray-600">File: {importResult.filename}</p>
            </div>
          )}

          <div className="grid grid-cols-2 md:grid-cols-4 gap-4 text-center">
            <div>
              <div className="text-2xl font-bold text-green-700">{importResult.imported || importResult.total_imported}</div>
              <div className="text-sm text-gray-600">Imported</div>
            </div>
            <div>
              <div className="text-2xl font-bold text-gray-700">{importResult.skipped || 0}</div>
              <div className="text-sm text-gray-600">Skipped</div>
            </div>
            <div>
              <div className="text-2xl font-bold text-gray-700">{importResult.total_rows || 0}</div>
              <div className="text-sm text-gray-600">Total Rows</div>
            </div>
            <div>
              <div className="text-2xl font-bold text-blue-700">{importResult.positions_created}</div>
              <div className="text-sm text-gray-600">Positions</div>
            </div>
          </div>

          {importResult.errors && importResult.errors.length > 0 && (
            <div className="mt-4 p-4 bg-red-50 rounded-lg">
              <p className="text-sm font-medium text-red-800 mb-2">Errors:</p>
              <ul className="text-sm text-red-700 space-y-1">
                {importResult.errors.slice(0, 5).map((error: string, idx: number) => (
                  <li key={idx}>• {error}</li>
                ))}
                {importResult.errors.length > 5 && (
                  <li>... and {importResult.errors.length - 5} more</li>
                )}
              </ul>
            </div>
          )}
        </div>
      )}
    </div>
  );
}
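The error-list truncation in the Import Results panel (show the first five errors, then a count of the rest) is small enough to pull out as a pure function. Below is a sketch with a hypothetical `summarizeErrors` helper that is not part of the commit, just an illustration of the same rule:

```typescript
// Mirror of the truncation rule in the Import Results panel:
// render at most `limit` errors, then a "... and N more" suffix.
function summarizeErrors(errors: string[], limit = 5): string[] {
  const shown = errors.slice(0, limit).map((e) => `• ${e}`);
  if (errors.length > limit) {
    shown.push(`... and ${errors.length - limit} more`);
  }
  return shown;
}

console.log(summarizeErrors(['bad row 12', 'bad row 40']));
```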
82 frontend/src/components/MetricsCards.tsx Normal file
@@ -0,0 +1,82 @@
import type { AccountStats } from '../types';

interface MetricsCardsProps {
  stats: AccountStats;
}

/**
 * Display key performance metrics in card format.
 */
export default function MetricsCards({ stats }: MetricsCardsProps) {
  // Safely convert values to numbers
  const safeNumber = (val: any): number => {
    const num = typeof val === 'number' ? val : parseFloat(val);
    return isNaN(num) ? 0 : num;
  };

  const metrics = [
    {
      label: 'Account Balance',
      value: `$${safeNumber(stats.current_balance).toLocaleString(undefined, {
        minimumFractionDigits: 2,
        maximumFractionDigits: 2,
      })}`,
      change: null,
    },
    {
      label: 'Total P&L',
      value: `$${safeNumber(stats.total_pnl).toLocaleString(undefined, {
        minimumFractionDigits: 2,
        maximumFractionDigits: 2,
      })}`,
      change: safeNumber(stats.total_pnl),
    },
    {
      label: 'Realized P&L',
      value: `$${safeNumber(stats.total_realized_pnl).toLocaleString(undefined, {
        minimumFractionDigits: 2,
        maximumFractionDigits: 2,
      })}`,
      change: safeNumber(stats.total_realized_pnl),
    },
    {
      label: 'Unrealized P&L',
      value: `$${safeNumber(stats.total_unrealized_pnl).toLocaleString(undefined, {
        minimumFractionDigits: 2,
        maximumFractionDigits: 2,
      })}`,
      change: safeNumber(stats.total_unrealized_pnl),
    },
    {
      label: 'Win Rate',
      value: `${safeNumber(stats.win_rate).toFixed(1)}%`,
      change: null,
    },
    {
      label: 'Open Positions',
      value: String(stats.open_positions || 0),
      change: null,
    },
  ];

  return (
    <div className="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-6">
      {metrics.map((metric, idx) => (
        <div key={idx} className="card">
          <div className="text-sm text-gray-500 mb-1">{metric.label}</div>
          <div
            className={`text-2xl font-bold ${
              metric.change !== null
                ? metric.change >= 0
                  ? 'text-profit'
                  : 'text-loss'
                : 'text-gray-900'
            }`}
          >
            {metric.value}
          </div>
        </div>
      ))}
    </div>
  );
}
|
||||||
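The card values above coerce possibly-string API values to numbers and then format them with `toLocaleString` and fixed fraction digits. A minimal standalone sketch of that formatting, pinned to `'en-US'` for a deterministic result (the component itself passes `undefined` and so uses the browser locale):

```typescript
// Sketch of the currency formatting used in MetricsCards (assumed helper name).
function formatUsd(val: unknown): string {
  const num = typeof val === 'number' ? val : parseFloat(String(val));
  const safe = Number.isNaN(num) ? 0 : num;
  return `$${safe.toLocaleString('en-US', {
    minimumFractionDigits: 2,
    maximumFractionDigits: 2,
  })}`;
}

// formatUsd(1234.5) → "$1,234.50"
// formatUsd("98.7") → "$98.70"
// formatUsd(null)   → "$0.00"  (non-numeric input falls back to 0)
```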
70
frontend/src/components/PerformanceChart.tsx
Normal file
@@ -0,0 +1,70 @@
import { LineChart, Line, XAxis, YAxis, CartesianGrid, Tooltip, ResponsiveContainer } from 'recharts';
import type { BalancePoint } from '../types';

interface PerformanceChartProps {
  data: BalancePoint[];
}

/**
 * Line chart showing account balance over time.
 */
export default function PerformanceChart({ data }: PerformanceChartProps) {
  if (!data || data.length === 0) {
    return (
      <div className="h-64 flex items-center justify-center text-gray-500">
        No balance history available
      </div>
    );
  }

  // Format data for Recharts
  const chartData = data.map((point) => ({
    date: new Date(point.date).toLocaleDateString('en-US', {
      month: 'short',
      day: 'numeric',
    }),
    balance: point.balance,
  }));

  return (
    <div className="h-64">
      <ResponsiveContainer width="100%" height="100%">
        <LineChart data={chartData}>
          <CartesianGrid strokeDasharray="3 3" stroke="#E5E7EB" />
          <XAxis
            dataKey="date"
            stroke="#6B7280"
            style={{ fontSize: '12px' }}
          />
          <YAxis
            stroke="#6B7280"
            style={{ fontSize: '12px' }}
            tickFormatter={(value) =>
              `$${value.toLocaleString(undefined, { maximumFractionDigits: 0 })}`
            }
          />
          <Tooltip
            formatter={(value: number) =>
              `$${value.toLocaleString(undefined, {
                minimumFractionDigits: 2,
                maximumFractionDigits: 2,
              })}`
            }
            contentStyle={{
              backgroundColor: 'white',
              border: '1px solid #E5E7EB',
              borderRadius: '8px',
            }}
          />
          <Line
            type="monotone"
            dataKey="balance"
            stroke="#00C805"
            strokeWidth={2}
            dot={false}
          />
        </LineChart>
      </ResponsiveContainer>
    </div>
  );
}
76
frontend/src/components/PositionCard.tsx
Normal file
@@ -0,0 +1,76 @@
import type { Position } from '../types';

interface PositionCardProps {
  position: Position;
}

/**
 * Card displaying position information.
 */
export default function PositionCard({ position }: PositionCardProps) {
  const pnl = position.status === 'open' ? position.unrealized_pnl : position.realized_pnl;
  const isProfitable = pnl !== null && pnl >= 0;

  return (
    <div className={`border-2 rounded-lg p-4 ${isProfitable ? 'border-green-200 bg-profit' : 'border-red-200 bg-loss'}`}>
      <div className="flex items-start justify-between mb-2">
        <div>
          <h3 className="font-bold text-lg">{position.symbol}</h3>
          <p className="text-sm text-gray-600 capitalize">
            {position.position_type}
            {position.option_symbol && ` • ${position.option_symbol}`}
          </p>
        </div>
        <span
          className={`px-2 py-1 rounded-full text-xs font-medium ${
            position.status === 'open'
              ? 'bg-blue-100 text-blue-800'
              : 'bg-gray-100 text-gray-800'
          }`}
        >
          {position.status}
        </span>
      </div>

      <div className="grid grid-cols-2 gap-4 text-sm mb-3">
        <div>
          <div className="text-gray-600">Quantity</div>
          <div className="font-medium">{position.total_quantity}</div>
        </div>
        <div>
          <div className="text-gray-600">Entry Price</div>
          <div className="font-medium">
            ${typeof position.avg_entry_price === 'number' ? position.avg_entry_price.toFixed(2) : 'N/A'}
          </div>
        </div>
        <div>
          <div className="text-gray-600">Open Date</div>
          <div className="font-medium">
            {new Date(position.open_date).toLocaleDateString()}
          </div>
        </div>
        {position.status === 'closed' && position.close_date && (
          <div>
            <div className="text-gray-600">Close Date</div>
            <div className="font-medium">
              {new Date(position.close_date).toLocaleDateString()}
            </div>
          </div>
        )}
      </div>

      {pnl !== null && typeof pnl === 'number' && (
        <div className="pt-3 border-t border-gray-300">
          <div className="flex items-center justify-between">
            <span className="text-sm text-gray-600">
              {position.status === 'open' ? 'Unrealized P&L' : 'Realized P&L'}
            </span>
            <span className={`text-lg font-bold ${isProfitable ? 'text-profit' : 'text-loss'}`}>
              {isProfitable ? '+' : ''}${pnl.toFixed(2)}
            </span>
          </div>
        </div>
      )}
    </div>
  );
}
90
frontend/src/components/TimeframeFilter.tsx
Normal file
@@ -0,0 +1,90 @@
interface TimeframeFilterProps {
  value: string;
  onChange: (value: string) => void;
}

export type TimeframeOption =
  | 'last30days'
  | 'last90days'
  | 'last180days'
  | 'last1year'
  | 'ytd'
  | 'all';

export interface TimeframeDates {
  startDate?: string;
  endDate?: string;
}

/**
 * Calculate date range based on timeframe selection
 */
export function getTimeframeDates(timeframe: TimeframeOption): TimeframeDates {
  const today = new Date();
  const todayStr = today.toISOString().split('T')[0];

  switch (timeframe) {
    case 'last30days': {
      const startDate = new Date(today);
      startDate.setDate(startDate.getDate() - 30);
      return {
        startDate: startDate.toISOString().split('T')[0],
        endDate: todayStr,
      };
    }
    case 'last90days': {
      const startDate = new Date(today);
      startDate.setDate(startDate.getDate() - 90);
      return {
        startDate: startDate.toISOString().split('T')[0],
        endDate: todayStr,
      };
    }
    case 'last180days': {
      const startDate = new Date(today);
      startDate.setDate(startDate.getDate() - 180);
      return {
        startDate: startDate.toISOString().split('T')[0],
        endDate: todayStr,
      };
    }
    case 'last1year': {
      const startDate = new Date(today);
      startDate.setFullYear(startDate.getFullYear() - 1);
      return {
        startDate: startDate.toISOString().split('T')[0],
        endDate: todayStr,
      };
    }
    case 'ytd': {
      const year = today.getFullYear();
      return {
        startDate: `${year}-01-01`,
        endDate: todayStr,
      };
    }
    case 'all':
    default:
      return {}; // No date filters
  }
}

/**
 * Dropdown filter for selecting timeframe
 */
export default function TimeframeFilter({ value, onChange }: TimeframeFilterProps) {
  return (
    <select
      value={value}
      onChange={(e) => onChange(e.target.value)}
      className="input max-w-xs"
    >
      <option value="all">All Time</option>
      <option value="last30days">Last 30 Days</option>
      <option value="last90days">Last 90 Days</option>
      <option value="last180days">Last 180 Days</option>
      <option value="last1year">Last 1 Year</option>
      <option value="ytd">Year to Date</option>
    </select>
  );
}
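The date-range branches in `getTimeframeDates` can be exercised in isolation. This sketch re-implements two of them with an injectable `today` so the output is deterministic (the component always uses `new Date()`); the branch bodies mirror the code above:

```typescript
// Deterministic sketch of the timeframe date-range logic (assumed helper name).
type Range = { startDate?: string; endDate?: string };

function rangeFor(timeframe: string, today: Date = new Date()): Range {
  const iso = (d: Date) => d.toISOString().split('T')[0];
  switch (timeframe) {
    case 'last30days': {
      const start = new Date(today);
      start.setDate(start.getDate() - 30); // Date handles month rollover
      return { startDate: iso(start), endDate: iso(today) };
    }
    case 'ytd':
      return { startDate: `${today.getFullYear()}-01-01`, endDate: iso(today) };
    default:
      return {}; // 'all' applies no date filters
  }
}

// With a fixed date the ranges are deterministic:
const fixed = new Date('2025-03-15T12:00:00Z');
rangeFor('last30days', fixed); // { startDate: '2025-02-13', endDate: '2025-03-15' }
rangeFor('ytd', fixed);        // { startDate: '2025-01-01', endDate: '2025-03-15' }
```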
399
frontend/src/components/TransactionDetailModal.tsx
Normal file
@@ -0,0 +1,399 @@
import { useQuery } from '@tanstack/react-query';
import { transactionsApi } from '../api/client';

interface TransactionDetailModalProps {
  transactionId: number;
  onClose: () => void;
}

interface Transaction {
  id: number;
  run_date: string;
  action: string;
  symbol: string;
  description: string | null;
  quantity: number | null;
  price: number | null;
  amount: number | null;
  commission: number | null;
  fees: number | null;
}

interface Position {
  id: number;
  symbol: string;
  option_symbol: string | null;
  position_type: string;
  status: string;
  open_date: string;
  close_date: string | null;
  total_quantity: number;
  avg_entry_price: number | null;
  avg_exit_price: number | null;
  realized_pnl: number | null;
  unrealized_pnl: number | null;
  strategy: string;
}

interface PositionDetails {
  position: Position;
  transactions: Transaction[];
}

/**
 * Modal displaying full position details for a transaction.
 * Shows all related transactions, strategy type, and P&L.
 */
export default function TransactionDetailModal({
  transactionId,
  onClose,
}: TransactionDetailModalProps) {
  const { data, isLoading, error } = useQuery<PositionDetails>({
    queryKey: ['transaction-details', transactionId],
    queryFn: async () => {
      const response = await transactionsApi.getPositionDetails(transactionId);
      return response.data;
    },
  });

  const parseOptionSymbol = (optionSymbol: string | null): string => {
    if (!optionSymbol) return '-';

    // Extract components: -SYMBOL251017C6 -> YYMMDD + C/P + Strike
    const match = optionSymbol.match(/(\d{6})([CP])([\d.]+)$/);
    if (!match) return optionSymbol;

    const [, dateStr, callPut, strike] = match;

    // Parse date: YYMMDD
    const year = '20' + dateStr.substring(0, 2);
    const month = dateStr.substring(2, 4);
    const day = dateStr.substring(4, 6);

    const date = new Date(`${year}-${month}-${day}`);
    const monthName = date.toLocaleDateString('en-US', { month: 'short' });
    const dayNum = date.getDate();
    const yearShort = dateStr.substring(0, 2);

    return `${monthName} ${dayNum} '${yearShort} $${strike}${callPut}`;
  };

  return (
    <div
      className="fixed inset-0 bg-black bg-opacity-50 flex items-center justify-center z-50 p-4"
      onClick={onClose}
    >
      <div
        className="bg-white rounded-lg shadow-xl max-w-4xl w-full max-h-[90vh] overflow-y-auto"
        onClick={(e) => e.stopPropagation()}
      >
        {/* Header */}
        <div className="sticky top-0 bg-white border-b border-gray-200 px-6 py-4 flex items-center justify-between">
          <h2 className="text-2xl font-semibold">Trade Details</h2>
          <button
            onClick={onClose}
            className="text-gray-400 hover:text-gray-600 text-2xl font-bold"
          >
            ×
          </button>
        </div>

        {/* Content */}
        <div className="px-6 py-4">
          {isLoading && (
            <div className="text-center py-12 text-gray-500">
              Loading trade details...
            </div>
          )}

          {error && (
            <div className="text-center py-12">
              <p className="text-red-600">Failed to load trade details</p>
              <p className="text-sm text-gray-500 mt-2">
                {error instanceof Error ? error.message : 'Unknown error'}
              </p>
            </div>
          )}

          {data && (
            <div className="space-y-6">
              {/* Position Summary */}
              <div className="bg-gray-50 rounded-lg p-4">
                <h3 className="text-lg font-semibold mb-3">Position Summary</h3>
                <div className="grid grid-cols-2 md:grid-cols-3 gap-4">
                  <div>
                    <p className="text-xs text-gray-500 uppercase">Symbol</p>
                    <p className="font-semibold">{data.position.symbol}</p>
                  </div>

                  <div>
                    <p className="text-xs text-gray-500 uppercase">Type</p>
                    <p className="font-semibold capitalize">
                      {data.position.position_type}
                    </p>
                  </div>

                  <div>
                    <p className="text-xs text-gray-500 uppercase">Strategy</p>
                    <p className="font-semibold">{data.position.strategy}</p>
                  </div>

                  {data.position.option_symbol && (
                    <div>
                      <p className="text-xs text-gray-500 uppercase">Contract</p>
                      <p className="font-semibold">
                        {parseOptionSymbol(data.position.option_symbol)}
                      </p>
                    </div>
                  )}

                  <div>
                    <p className="text-xs text-gray-500 uppercase">Status</p>
                    <p
                      className={`font-semibold capitalize ${
                        data.position.status === 'open'
                          ? 'text-blue-600'
                          : 'text-gray-600'
                      }`}
                    >
                      {data.position.status}
                    </p>
                  </div>

                  <div>
                    <p className="text-xs text-gray-500 uppercase">Quantity</p>
                    <p className="font-semibold">
                      {Math.abs(data.position.total_quantity)}
                    </p>
                  </div>

                  <div>
                    <p className="text-xs text-gray-500 uppercase">
                      Avg Entry Price
                    </p>
                    <p className="font-semibold">
                      {data.position.avg_entry_price !== null
                        ? `$${data.position.avg_entry_price.toFixed(2)}`
                        : '-'}
                    </p>
                  </div>

                  {data.position.avg_exit_price !== null && (
                    <div>
                      <p className="text-xs text-gray-500 uppercase">
                        Avg Exit Price
                      </p>
                      <p className="font-semibold">
                        ${data.position.avg_exit_price.toFixed(2)}
                      </p>
                    </div>
                  )}

                  <div>
                    <p className="text-xs text-gray-500 uppercase">P&L</p>
                    <p
                      className={`font-bold text-lg ${
                        (data.position.realized_pnl || 0) >= 0
                          ? 'text-profit'
                          : 'text-loss'
                      }`}
                    >
                      {data.position.realized_pnl !== null
                        ? `$${data.position.realized_pnl.toFixed(2)}`
                        : data.position.unrealized_pnl !== null
                        ? `$${data.position.unrealized_pnl.toFixed(2)}`
                        : '-'}
                    </p>
                  </div>
                </div>
              </div>

              {/* Transaction History */}
              <div>
                <h3 className="text-lg font-semibold mb-3">
                  Transaction History ({data.transactions.length})
                </h3>
                <div className="overflow-x-auto">
                  <table className="min-w-full divide-y divide-gray-200">
                    <thead className="bg-gray-50">
                      <tr>
                        <th className="px-4 py-3 text-left text-xs font-medium text-gray-500 uppercase">
                          Date
                        </th>
                        <th className="px-4 py-3 text-left text-xs font-medium text-gray-500 uppercase">
                          Action
                        </th>
                        <th className="px-4 py-3 text-right text-xs font-medium text-gray-500 uppercase">
                          Quantity
                        </th>
                        <th className="px-4 py-3 text-right text-xs font-medium text-gray-500 uppercase">
                          Price
                        </th>
                        <th className="px-4 py-3 text-right text-xs font-medium text-gray-500 uppercase">
                          Amount
                        </th>
                        <th className="px-4 py-3 text-right text-xs font-medium text-gray-500 uppercase">
                          Fees
                        </th>
                      </tr>
                    </thead>
                    <tbody className="bg-white divide-y divide-gray-200">
                      {data.transactions.map((txn) => (
                        <tr key={txn.id} className="hover:bg-gray-50">
                          <td className="px-4 py-3 text-sm">
                            {new Date(txn.run_date).toLocaleDateString()}
                          </td>
                          <td className="px-4 py-3 text-sm text-gray-600">
                            {txn.action}
                          </td>
                          <td className="px-4 py-3 text-sm text-right">
                            {txn.quantity !== null ? txn.quantity : '-'}
                          </td>
                          <td className="px-4 py-3 text-sm text-right">
                            {txn.price !== null
                              ? `$${txn.price.toFixed(2)}`
                              : '-'}
                          </td>
                          <td
                            className={`px-4 py-3 text-sm text-right font-medium ${
                              txn.amount !== null
                                ? txn.amount >= 0
                                  ? 'text-profit'
                                  : 'text-loss'
                                : ''
                            }`}
                          >
                            {txn.amount !== null
                              ? `$${txn.amount.toFixed(2)}`
                              : '-'}
                          </td>
                          <td className="px-4 py-3 text-sm text-right text-gray-500">
                            {txn.commission || txn.fees
                              ? `$${(
                                  (txn.commission || 0) + (txn.fees || 0)
                                ).toFixed(2)}`
                              : '-'}
                          </td>
                        </tr>
                      ))}
                    </tbody>
                  </table>
                </div>
              </div>

              {/* Trade Timeline and Performance Summary */}
              <div className="grid grid-cols-1 md:grid-cols-2 gap-4">
                {/* Trade Timeline */}
                <div className="bg-blue-50 border border-blue-200 rounded-lg p-4">
                  <h4 className="font-semibold text-blue-900 mb-2">
                    Trade Timeline
                  </h4>
                  <div className="text-sm text-blue-800">
                    <p>
                      <span className="font-medium">Opened:</span>{' '}
                      {new Date(data.position.open_date).toLocaleDateString()}
                    </p>
                    {data.position.close_date && (
                      <p>
                        <span className="font-medium">Closed:</span>{' '}
                        {new Date(data.position.close_date).toLocaleDateString()}
                      </p>
                    )}
                    <p>
                      <span className="font-medium">Duration:</span>{' '}
                      {data.position.close_date
                        ? Math.floor(
                            (new Date(data.position.close_date).getTime() -
                              new Date(data.position.open_date).getTime()) /
                              (1000 * 60 * 60 * 24)
                          ) + ' days'
                        : 'Ongoing'}
                    </p>
                  </div>
                </div>

                {/* Annual Return Rate */}
                {data.position.close_date &&
                  data.position.realized_pnl !== null &&
                  data.position.avg_entry_price !== null && (
                    <div className="bg-green-50 border border-green-200 rounded-lg p-4">
                      <h4 className="font-semibold text-green-900 mb-2">
                        Annual Return Rate
                      </h4>
                      <div className="text-sm text-green-800">
                        {(() => {
                          const daysHeld = Math.floor(
                            (new Date(data.position.close_date).getTime() -
                              new Date(data.position.open_date).getTime()) /
                              (1000 * 60 * 60 * 24)
                          );

                          if (daysHeld === 0) {
                            return (
                              <p className="text-gray-600">
                                Trade held less than 1 day
                              </p>
                            );
                          }

                          // Calculate capital invested
                          const isOption =
                            data.position.position_type === 'call' ||
                            data.position.position_type === 'put';
                          const multiplier = isOption ? 100 : 1;
                          const capitalInvested =
                            Math.abs(data.position.avg_entry_price) *
                            Math.abs(data.position.total_quantity) *
                            multiplier;

                          if (capitalInvested === 0) {
                            return (
                              <p className="text-gray-600">
                                Unable to calculate (no capital invested)
                              </p>
                            );
                          }

                          // ARR = (Profit / Capital) × (365 / Days) × 100%
                          const arr =
                            (data.position.realized_pnl / capitalInvested) *
                            (365 / daysHeld) *
                            100;

                          return (
                            <>
                              <p>
                                <span className="font-medium">ARR:</span>{' '}
                                <span
                                  className={`font-bold text-lg ${
                                    arr >= 0 ? 'text-profit' : 'text-loss'
                                  }`}
                                >
                                  {arr.toFixed(2)}%
                                </span>
                              </p>
                              <p className="text-xs mt-1 text-green-700">
                                Based on {daysHeld} day
                                {daysHeld !== 1 ? 's' : ''} held
                              </p>
                            </>
                          );
                        })()}
                      </div>
                    </div>
                  )}
              </div>
            </div>
          )}
        </div>

        {/* Footer */}
        <div className="sticky bottom-0 bg-gray-50 border-t border-gray-200 px-6 py-4 flex justify-end">
          <button onClick={onClose} className="btn-secondary">
            Close
          </button>
        </div>
      </div>
    </div>
  );
}
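The annualized-return computation inside the IIFE above reduces to a small pure function. A sketch with assumed example numbers (a hypothetical two-contract call trade; `annualReturnRate` is not a function in the codebase):

```typescript
// ARR = (Profit / Capital) × (365 / Days) × 100%, as in the comment above.
function annualReturnRate(
  realizedPnl: number,
  avgEntryPrice: number,
  quantity: number,
  isOption: boolean,
  daysHeld: number
): number {
  const multiplier = isOption ? 100 : 1; // one option contract controls 100 shares
  const capital = Math.abs(avgEntryPrice) * Math.abs(quantity) * multiplier;
  return (realizedPnl / capital) * (365 / daysHeld) * 100;
}

// e.g. 2 call contracts bought at $1.50, closed for a $60 gain after 30 days:
// capital = 1.50 × 2 × 100 = $300
// ARR = (60 / 300) × (365 / 30) × 100 ≈ 243.33%
```

Annualizing a short holding period extrapolates aggressively, which is why the component guards the zero-day and zero-capital cases before dividing.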
202
frontend/src/components/TransactionTable.tsx
Normal file
@@ -0,0 +1,202 @@
import { useState } from 'react';
import { useQuery } from '@tanstack/react-query';
import { transactionsApi } from '../api/client';
import TransactionDetailModal from './TransactionDetailModal';
import TimeframeFilter, { TimeframeOption, getTimeframeDates } from './TimeframeFilter';

interface TransactionTableProps {
  accountId: number;
}

/**
 * Table displaying transaction history with filtering.
 * Rows are clickable to show full trade details.
 */
export default function TransactionTable({ accountId }: TransactionTableProps) {
  const [symbol, setSymbol] = useState('');
  const [page, setPage] = useState(0);
  const [timeframe, setTimeframe] = useState<TimeframeOption>('all');
  const [selectedTransactionId, setSelectedTransactionId] = useState<number | null>(null);
  const limit = 50;

  // Helper to safely convert to number
  const toNumber = (val: any): number | null => {
    if (val === null || val === undefined) return null;
    const num = typeof val === 'number' ? val : parseFloat(val);
    return isNaN(num) ? null : num;
  };

  // Get date range based on timeframe
  const { startDate, endDate } = getTimeframeDates(timeframe);

  // Fetch transactions
  const { data: transactions, isLoading } = useQuery({
    queryKey: ['transactions', accountId, symbol, timeframe, page],
    queryFn: async () => {
      const response = await transactionsApi.list({
        account_id: accountId,
        symbol: symbol || undefined,
        start_date: startDate,
        end_date: endDate,
        skip: page * limit,
        limit,
      });
      return response.data;
    },
  });

  return (
    <div className="card">
      <div className="mb-6">
        <h2 className="text-xl font-semibold mb-4">Transaction History</h2>

        {/* Filters */}
        <div className="flex items-center gap-4">
          <div className="flex-1">
            <label className="block text-xs text-gray-500 uppercase mb-1">
              Symbol
            </label>
            <input
              type="text"
              placeholder="Filter by symbol..."
              value={symbol}
              onChange={(e) => {
                setSymbol(e.target.value);
                setPage(0);
              }}
              className="input w-full max-w-xs"
            />
          </div>

          <div className="flex-1">
            <label className="block text-xs text-gray-500 uppercase mb-1">
              Timeframe
            </label>
            <TimeframeFilter
              value={timeframe}
              onChange={(value) => {
                setTimeframe(value as TimeframeOption);
                setPage(0);
              }}
            />
          </div>
        </div>
      </div>

      {isLoading ? (
        <div className="text-center py-12 text-gray-500">Loading transactions...</div>
      ) : !transactions || transactions.length === 0 ? (
        <div className="text-center py-12 text-gray-500">No transactions found</div>
      ) : (
        <>
          <div className="overflow-x-auto">
            <table className="min-w-full divide-y divide-gray-200">
              <thead className="bg-gray-50">
                <tr>
                  <th className="px-4 py-3 text-left text-xs font-medium text-gray-500 uppercase">
                    Date
                  </th>
                  <th className="px-4 py-3 text-left text-xs font-medium text-gray-500 uppercase">
                    Symbol
                  </th>
                  <th className="px-4 py-3 text-left text-xs font-medium text-gray-500 uppercase">
                    Action
                  </th>
                  <th className="px-4 py-3 text-right text-xs font-medium text-gray-500 uppercase">
                    Quantity
                  </th>
                  <th className="px-4 py-3 text-right text-xs font-medium text-gray-500 uppercase">
                    Price
                  </th>
                  <th className="px-4 py-3 text-right text-xs font-medium text-gray-500 uppercase">
                    Amount
                  </th>
                  <th className="px-4 py-3 text-right text-xs font-medium text-gray-500 uppercase">
                    Balance
                  </th>
                </tr>
              </thead>
              <tbody className="bg-white divide-y divide-gray-200">
                {transactions.map((txn) => {
                  const price = toNumber(txn.price);
                  const amount = toNumber(txn.amount);
                  const balance = toNumber(txn.cash_balance);

                  return (
                    <tr
                      key={txn.id}
                      className="hover:bg-gray-50 cursor-pointer transition-colors"
                      onClick={() => setSelectedTransactionId(txn.id)}
                    >
                      <td className="px-4 py-3 text-sm">
                        {new Date(txn.run_date).toLocaleDateString()}
                      </td>
                      <td className="px-4 py-3 text-sm font-medium">
                        {txn.symbol || '-'}
                      </td>
                      <td className="px-4 py-3 text-sm text-gray-600 max-w-xs truncate">
                        {txn.action}
                      </td>
                      <td className="px-4 py-3 text-sm text-right">
                        {txn.quantity !== null ? txn.quantity : '-'}
                      </td>
                      <td className="px-4 py-3 text-sm text-right">
                        {price !== null ? `$${price.toFixed(2)}` : '-'}
                      </td>
                      <td
                        className={`px-4 py-3 text-sm text-right font-medium ${
                          amount !== null
                            ? amount >= 0
                              ? 'text-profit'
                              : 'text-loss'
                            : ''
                        }`}
                      >
                        {amount !== null ? `$${amount.toFixed(2)}` : '-'}
                      </td>
                      <td className="px-4 py-3 text-sm text-right font-medium">
                        {balance !== null
                          ? `$${balance.toLocaleString(undefined, {
                              minimumFractionDigits: 2,
                              maximumFractionDigits: 2,
                            })}`
                          : '-'}
                      </td>
                    </tr>
                  );
                })}
              </tbody>
            </table>
          </div>

          {/* Pagination */}
          <div className="mt-6 flex items-center justify-between">
            <button
              onClick={() => setPage((p) => Math.max(0, p - 1))}
              disabled={page === 0}
              className="btn-secondary disabled:opacity-50 disabled:cursor-not-allowed"
            >
              Previous
            </button>
            <span className="text-sm text-gray-600">Page {page + 1}</span>
            <button
              onClick={() => setPage((p) => p + 1)}
              disabled={!transactions || transactions.length < limit}
              className="btn-secondary disabled:opacity-50 disabled:cursor-not-allowed"
            >
              Next
            </button>
          </div>
        </>
      )}

      {/* Transaction Detail Modal */}
      {selectedTransactionId && (
        <TransactionDetailModal
          transactionId={selectedTransactionId}
          onClose={() => setSelectedTransactionId(null)}
        />
      )}
    </div>
  );
}
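The table above uses plain offset pagination: the query sends `skip = page * limit`, and the Next button is disabled once a page comes back shorter than the limit. Sketched standalone (helper names are illustrative, not from the codebase):

```typescript
// Offset-paging arithmetic matching the TransactionTable query parameters.
const limit = 50;

function queryParams(page: number): { skip: number; limit: number } {
  return { skip: page * limit, limit };
}

// A full page suggests more rows may exist; a short page means we are at the end.
function hasNextPage(rowsReturned: number): boolean {
  return rowsReturned === limit;
}

// queryParams(0) → { skip: 0, limit: 50 }
// queryParams(2) → { skip: 100, limit: 50 }
```

One caveat of this heuristic: when the row count is an exact multiple of `limit`, the final full page still enables Next and the user lands on an empty page.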
108
frontend/src/api/client.ts
Normal file
@@ -0,0 +1,108 @@
/**
 * API client for communicating with the backend.
 */
import axios from 'axios';
import type {
  Account,
  Transaction,
  Position,
  AccountStats,
  BalancePoint,
  Trade,
  ImportResult,
} from '../types';

// Configure axios instance
const api = axios.create({
  baseURL: '/api',
  headers: {
    'Content-Type': 'application/json',
  },
});

// Account APIs
export const accountsApi = {
  list: () => api.get<Account[]>('/accounts'),
  get: (id: number) => api.get<Account>(`/accounts/${id}`),
  create: (data: {
    account_number: string;
    account_name: string;
    account_type: 'cash' | 'margin';
  }) => api.post<Account>('/accounts', data),
  update: (id: number, data: Partial<Account>) =>
    api.put<Account>(`/accounts/${id}`, data),
  delete: (id: number) => api.delete(`/accounts/${id}`),
};

// Transaction APIs
export const transactionsApi = {
  list: (params?: {
    account_id?: number;
    symbol?: string;
    start_date?: string;
    end_date?: string;
    skip?: number;
    limit?: number;
  }) => api.get<Transaction[]>('/transactions', { params }),
  get: (id: number) => api.get<Transaction>(`/transactions/${id}`),
  getPositionDetails: (id: number) => api.get<any>(`/transactions/${id}/position-details`),
};

// Position APIs
export const positionsApi = {
  list: (params?: {
    account_id?: number;
    status?: 'open' | 'closed';
    symbol?: string;
    skip?: number;
    limit?: number;
  }) => api.get<Position[]>('/positions', { params }),
  get: (id: number) => api.get<Position>(`/positions/${id}`),
  rebuild: (accountId: number) =>
    api.post<{ positions_created: number }>(`/positions/${accountId}/rebuild`),
};

// Analytics APIs
export const analyticsApi = {
  getOverview: (accountId: number, params?: { refresh_prices?: boolean; max_api_calls?: number; start_date?: string; end_date?: string }) =>
    api.get<AccountStats>(`/analytics/overview/${accountId}`, { params }),
  getBalanceHistory: (accountId: number, days: number = 30) =>
    api.get<{ data: BalancePoint[] }>(`/analytics/balance-history/${accountId}`, {
      params: { days },
    }),
  getTopTrades: (accountId: number, limit: number = 10, startDate?: string, endDate?: string) =>
    api.get<{ data: Trade[] }>(`/analytics/top-trades/${accountId}`, {
      params: { limit, start_date: startDate, end_date: endDate },
    }),
  getWorstTrades: (accountId: number, limit: number = 10, startDate?: string, endDate?: string) =>
    api.get<{ data: Trade[] }>(`/analytics/worst-trades/${accountId}`, {
      params: { limit, start_date: startDate, end_date: endDate },
    }),
  updatePnL: (accountId: number) =>
    api.post<{ positions_updated: number }>(`/analytics/update-pnl/${accountId}`),
  refreshPrices: (accountId: number, params?: { max_api_calls?: number }) =>
    api.post<{ message: string; stats: any }>(`/analytics/refresh-prices/${accountId}`, null, { params }),
  refreshPricesBackground: (accountId: number, params?: { max_api_calls?: number }) =>
    api.post<{ message: string; account_id: number }>(`/analytics/refresh-prices-background/${accountId}`, null, { params }),
};

// Import APIs
export const importApi = {
  uploadCsv: (accountId: number, file: File) => {
    const formData = new FormData();
    formData.append('file', file);
    return api.post<ImportResult>(`/import/upload/${accountId}`, formData, {
      headers: {
        'Content-Type': 'multipart/form-data',
      },
    });
  },
  importFromFilesystem: (accountId: number) =>
    api.post<{
      files: Record<string, Omit<ImportResult, 'filename'>>;
      total_imported: number;
      positions_created: number;
    }>(`/import/filesystem/${accountId}`),
};

export default api;
26  frontend/src/main.tsx  Normal file
@@ -0,0 +1,26 @@
import React from 'react';
import ReactDOM from 'react-dom/client';
import { QueryClient, QueryClientProvider } from '@tanstack/react-query';
import { BrowserRouter } from 'react-router-dom';
import App from './App';
import './styles/tailwind.css';

// Create React Query client
const queryClient = new QueryClient({
  defaultOptions: {
    queries: {
      refetchOnWindowFocus: false,
      retry: 1,
    },
  },
});

ReactDOM.createRoot(document.getElementById('root')!).render(
  <React.StrictMode>
    <QueryClientProvider client={queryClient}>
      <BrowserRouter>
        <App />
      </BrowserRouter>
    </QueryClientProvider>
  </React.StrictMode>
);
61  frontend/src/styles/tailwind.css  Normal file
@@ -0,0 +1,61 @@
@tailwind base;
@tailwind components;
@tailwind utilities;

@layer base {
  body {
    @apply bg-robinhood-bg text-gray-900 font-sans;
  }

  h1, h2, h3, h4, h5, h6 {
    @apply font-semibold;
  }
}

@layer components {
  .btn {
    @apply px-4 py-2 rounded-lg font-medium transition-colors duration-200 focus:outline-none focus:ring-2 focus:ring-offset-2;
  }

  .btn-primary {
    @apply btn bg-robinhood-green text-white hover:bg-green-600 focus:ring-green-500;
  }

  .btn-secondary {
    @apply btn bg-gray-200 text-gray-900 hover:bg-gray-300 focus:ring-gray-500;
  }

  .btn-danger {
    @apply btn bg-robinhood-red text-white hover:bg-red-600 focus:ring-red-500;
  }

  .card {
    @apply bg-white rounded-xl shadow-sm border border-gray-200 p-6;
  }

  .input {
    @apply w-full px-4 py-2 border border-gray-300 rounded-lg focus:outline-none focus:ring-2 focus:ring-green-500 focus:border-transparent;
  }

  .label {
    @apply block text-sm font-medium text-gray-700 mb-1;
  }
}

@layer utilities {
  .text-profit {
    @apply text-robinhood-green;
  }

  .text-loss {
    @apply text-robinhood-red;
  }

  .bg-profit {
    @apply bg-green-50;
  }

  .bg-loss {
    @apply bg-red-50;
  }
}
94  frontend/src/types/index.ts  Normal file
@@ -0,0 +1,94 @@
/**
 * TypeScript type definitions for the application.
 */

export interface Account {
  id: number;
  account_number: string;
  account_name: string;
  account_type: 'cash' | 'margin';
  created_at: string;
  updated_at: string;
}

export interface Transaction {
  id: number;
  account_id: number;
  run_date: string;
  action: string;
  symbol: string | null;
  description: string | null;
  transaction_type: string | null;
  price: number | null;
  quantity: number | null;
  commission: number | null;
  fees: number | null;
  amount: number | null;
  cash_balance: number | null;
  settlement_date: string | null;
  created_at: string;
}

export interface Position {
  id: number;
  account_id: number;
  symbol: string;
  option_symbol: string | null;
  position_type: 'stock' | 'call' | 'put';
  status: 'open' | 'closed';
  open_date: string;
  close_date: string | null;
  total_quantity: number;
  avg_entry_price: number | null;
  avg_exit_price: number | null;
  realized_pnl: number | null;
  unrealized_pnl: number | null;
  created_at: string;
}

export interface PriceUpdateStats {
  total: number;
  updated: number;
  cached: number;
  failed: number;
}

export interface AccountStats {
  total_positions: number;
  open_positions: number;
  closed_positions: number;
  total_realized_pnl: number;
  total_unrealized_pnl: number;
  total_pnl: number;
  win_rate: number;
  avg_win: number;
  avg_loss: number;
  current_balance: number;
  price_update_stats?: PriceUpdateStats;
}

export interface BalancePoint {
  date: string;
  balance: number;
}

export interface Trade {
  symbol: string;
  option_symbol: string | null;
  position_type: string;
  open_date: string;
  close_date: string | null;
  quantity: number;
  entry_price: number | null;
  exit_price: number | null;
  realized_pnl: number;
}

export interface ImportResult {
  filename: string;
  imported: number;
  skipped: number;
  errors: string[];
  total_rows: number;
  positions_created: number;
}
21  frontend/tailwind.config.js  Normal file
@@ -0,0 +1,21 @@
/** @type {import('tailwindcss').Config} */
export default {
  content: [
    "./index.html",
    "./src/**/*.{js,ts,jsx,tsx}",
  ],
  theme: {
    extend: {
      colors: {
        'robinhood-green': '#00C805',
        'robinhood-red': '#FF5000',
        'robinhood-bg': '#F8F9FA',
        'robinhood-dark': '#1E1E1E',
      },
      fontFamily: {
        sans: ['Inter', 'system-ui', 'sans-serif'],
      },
    },
  },
  plugins: [],
}
25  frontend/tsconfig.json  Normal file
@@ -0,0 +1,25 @@
{
  "compilerOptions": {
    "target": "ES2020",
    "useDefineForClassFields": true,
    "lib": ["ES2020", "DOM", "DOM.Iterable"],
    "module": "ESNext",
    "skipLibCheck": true,

    /* Bundler mode */
    "moduleResolution": "bundler",
    "allowImportingTsExtensions": true,
    "resolveJsonModule": true,
    "isolatedModules": true,
    "noEmit": true,
    "jsx": "react-jsx",

    /* Linting */
    "strict": true,
    "noUnusedLocals": true,
    "noUnusedParameters": true,
    "noFallthroughCasesInSwitch": true
  },
  "include": ["src"],
  "references": [{ "path": "./tsconfig.node.json" }]
}
10  frontend/tsconfig.node.json  Normal file
@@ -0,0 +1,10 @@
{
  "compilerOptions": {
    "composite": true,
    "skipLibCheck": true,
    "module": "ESNext",
    "moduleResolution": "bundler",
    "allowSyntheticDefaultImports": true
  },
  "include": ["vite.config.ts"]
}
17  frontend/vite.config.ts  Normal file
@@ -0,0 +1,17 @@
import { defineConfig } from 'vite'
import react from '@vitejs/plugin-react'

// https://vitejs.dev/config/
export default defineConfig({
  plugins: [react()],
  server: {
    host: true,
    port: 5173,
    proxy: {
      '/api': {
        target: 'http://backend:8000',
        changeOrigin: true,
      },
    },
  },
})
0  imports/.gitkeep  Normal file
1  imports/README.txt  Normal file
@@ -0,0 +1 @@
CSV import files go here
35  quick-transfer.sh  Executable file
@@ -0,0 +1,35 @@
#!/bin/bash

# Quick transfer script - sends all necessary files to the server

SERVER="pi@starship2"
REMOTE_DIR="~/fidelity"

echo "Transferring files to $SERVER..."
echo ""

# Critical fix files
echo "1. Transferring ULTIMATE_FIX.sh..."
scp ULTIMATE_FIX.sh $SERVER:$REMOTE_DIR/

echo "2. Transferring diagnose-307.sh..."
scp diagnose-307.sh $SERVER:$REMOTE_DIR/

echo "3. Transferring docker-compose.yml (with fixed healthcheck)..."
scp docker-compose.yml $SERVER:$REMOTE_DIR/

echo "4. Transferring main.py (without redirect_slashes)..."
scp backend/app/main.py $SERVER:$REMOTE_DIR/backend/app/

echo "5. Transferring README..."
scp READ_ME_FIRST.md $SERVER:$REMOTE_DIR/

echo ""
echo "✓ All files transferred!"
echo ""
echo "Next steps:"
echo "  1. ssh $SERVER"
echo "  2. cd ~/fidelity"
echo "  3. cat READ_ME_FIRST.md"
echo "  4. ./ULTIMATE_FIX.sh"
echo ""
115  start-linux.sh  Executable file
@@ -0,0 +1,115 @@
#!/bin/bash

# myFidelityTracker Start Script (Linux)

echo "🚀 Starting myFidelityTracker..."
echo ""

# Check if Docker is running
if ! docker info > /dev/null 2>&1; then
    echo "❌ Docker is not running. Please start Docker and try again."
    echo "   On Linux: sudo systemctl start docker"
    exit 1
fi

# Check if docker compose is available (V2 or V1)
if docker compose version &> /dev/null; then
    DOCKER_COMPOSE="docker compose"
elif command -v docker-compose &> /dev/null; then
    DOCKER_COMPOSE="docker-compose"
else
    echo "❌ Docker Compose not found. Please install it:"
    echo "   sudo apt-get install docker-compose-plugin   # Debian/Ubuntu"
    echo "   sudo yum install docker-compose-plugin       # CentOS/RHEL"
    exit 1
fi

echo "📦 Using: $DOCKER_COMPOSE"

# Check if .env exists, if not copy from example
if [ ! -f .env ]; then
    echo "📝 Creating .env file from .env.example..."
    cp .env.example .env
fi

# Create imports directory if it doesn't exist
mkdir -p imports

# Copy sample CSV if it exists in the root
if [ -f "History_for_Account_X38661988.csv" ] && [ ! -f "imports/History_for_Account_X38661988.csv" ]; then
    echo "📋 Copying sample CSV to imports directory..."
    cp History_for_Account_X38661988.csv imports/
fi

# Start services
echo "🐳 Starting Docker containers..."
$DOCKER_COMPOSE up -d

# Wait for services to be healthy
echo ""
echo "⏳ Waiting for services to be ready..."
sleep 5

# Check if backend is up
echo "🔍 Checking backend health..."
for i in {1..30}; do
    if curl -s http://localhost:8000/health > /dev/null 2>&1; then
        echo "✅ Backend is ready!"
        break
    fi
    if [ $i -eq 30 ]; then
        echo "⚠️  Backend is taking longer than expected to start"
        echo "   Check logs with: docker compose logs backend"
    fi
    sleep 2
done

# Check if frontend is up
echo "🔍 Checking frontend..."
for i in {1..20}; do
    if curl -s http://localhost:3000 > /dev/null 2>&1; then
        echo "✅ Frontend is ready!"
        break
    fi
    if [ $i -eq 20 ]; then
        echo "⚠️  Frontend is taking longer than expected to start"
        echo "   Check logs with: docker compose logs frontend"
    fi
    sleep 2
done

# Get server IP
SERVER_IP=$(hostname -I | awk '{print $1}')

echo ""
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo "✨ myFidelityTracker is running!"
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo ""
echo "🌐 Access from this server:"
echo "   Frontend: http://localhost:3000"
echo "   Backend:  http://localhost:8000"
echo "   API Docs: http://localhost:8000/docs"
echo ""
echo "🌐 Access from other computers:"
echo "   Frontend: http://${SERVER_IP}:3000"
echo "   Backend:  http://${SERVER_IP}:8000"
echo "   API Docs: http://${SERVER_IP}:8000/docs"
echo ""
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo ""
echo "📖 Quick Start Guide:"
echo "   1. Open http://${SERVER_IP}:3000 in your browser"
echo "   2. Go to the 'Accounts' tab to create your first account"
echo "   3. Go to the 'Import' tab to upload a Fidelity CSV file"
echo "   4. View your dashboard with performance metrics"
echo ""
echo "🌱 To seed demo data (optional):"
echo "   docker compose exec backend python seed_demo_data.py"
echo ""
echo "📊 To view logs:"
echo "   docker compose logs -f"
echo ""
echo "🛑 To stop:"
echo "   ./stop.sh or docker compose down"
echo ""
91  start.sh  Executable file
@@ -0,0 +1,91 @@
#!/bin/bash

# myFidelityTracker Start Script

echo "🚀 Starting myFidelityTracker..."
echo ""

# Check if Docker is running
if ! docker info > /dev/null 2>&1; then
    echo "❌ Docker is not running. Please start Docker Desktop and try again."
    exit 1
fi

# Check if .env exists, if not copy from example
if [ ! -f .env ]; then
    echo "📝 Creating .env file from .env.example..."
    cp .env.example .env
fi

# Create imports directory if it doesn't exist
mkdir -p imports

# Copy sample CSV if it exists in the root
if [ -f "History_for_Account_X38661988.csv" ] && [ ! -f "imports/History_for_Account_X38661988.csv" ]; then
    echo "📋 Copying sample CSV to imports directory..."
    cp History_for_Account_X38661988.csv imports/
fi

# Start services
echo "🐳 Starting Docker containers..."
docker-compose up -d

# Wait for services to be healthy
echo ""
echo "⏳ Waiting for services to be ready..."
sleep 5

# Check if backend is up
echo "🔍 Checking backend health..."
for i in {1..30}; do
    if curl -s http://localhost:8000/health > /dev/null 2>&1; then
        echo "✅ Backend is ready!"
        break
    fi
    if [ $i -eq 30 ]; then
        echo "⚠️  Backend is taking longer than expected to start"
        echo "   Check logs with: docker-compose logs backend"
    fi
    sleep 2
done

# Check if frontend is up
echo "🔍 Checking frontend..."
for i in {1..20}; do
    if curl -s http://localhost:3000 > /dev/null 2>&1; then
        echo "✅ Frontend is ready!"
        break
    fi
    if [ $i -eq 20 ]; then
        echo "⚠️  Frontend is taking longer than expected to start"
        echo "   Check logs with: docker-compose logs frontend"
    fi
    sleep 2
done

echo ""
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo "✨ myFidelityTracker is running!"
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo ""
echo "🌐 Frontend:  http://localhost:3000"
echo "🔌 Backend:   http://localhost:8000"
echo "📚 API Docs:  http://localhost:8000/docs"
echo ""
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo ""
echo "📖 Quick Start Guide:"
echo "   1. Open http://localhost:3000 in your browser"
echo "   2. Go to the 'Accounts' tab to create your first account"
echo "   3. Go to the 'Import' tab to upload a Fidelity CSV file"
echo "   4. View your dashboard with performance metrics"
echo ""
echo "🌱 To seed demo data (optional):"
echo "   docker-compose exec backend python seed_demo_data.py"
echo ""
echo "📊 To view logs:"
echo "   docker-compose logs -f"
echo ""
echo "🛑 To stop:"
echo "   ./stop.sh or docker-compose down"
echo ""
20  stop.sh  Executable file
@@ -0,0 +1,20 @@
#!/bin/bash

# myFidelityTracker Stop Script

echo "🛑 Stopping myFidelityTracker..."

# Check if docker compose is available (V2 or V1)
if docker compose version &> /dev/null; then
    docker compose down
elif command -v docker-compose &> /dev/null; then
    docker-compose down
else
    echo "❌ Docker Compose not found"
    exit 1
fi

echo "✅ All services stopped"
echo ""
echo "💡 To restart: ./start-linux.sh or docker compose up -d"
echo "🗑️  To remove all data: docker compose down -v"