init

.gitignore · vendored · Normal file · 49 lines
@@ -0,0 +1,49 @@
# Dependencies
node_modules/
vendor/

# Build outputs
backend/bin/
backend/tmp/
frontend/.next/
frontend/out/

# Environment files
.env
.env.local
.env.*.local

# IDE
.idea/
.vscode/
*.swp
*.swo

# OS
.DS_Store
Thumbs.db

# Logs
*.log
air.log
backend.log
frontend.log

# Process IDs
.backend.pid
.frontend.pid

# Cache
backend/data/
*.cache

# Test coverage
coverage/
*.coverage

# Docker
.docker/

# Kubernetes secrets (keep templates)
k8s/*-secrets.yaml
!k8s/*-secrets.yaml.example

DEV_SETUP.md · Normal file · 243 lines
@@ -0,0 +1,243 @@
# Local Development Setup

This guide shows how to run the application locally for faster development iteration.

## Overview

Instead of running everything in containers, you can run just PostgreSQL in a container and run the backend (Go) and frontend (Next.js) natively on your machine. This provides:

- ✨ Faster hot-reload and rebuild times
- 🔧 Better debugging experience
- 📦 Smaller resource footprint
- 🚀 Quicker iteration cycles

## Prerequisites

- **Go** 1.24+ installed
- **Node.js** 18+ and npm (or Bun)
- **Podman** or **Docker** (for PostgreSQL only)

## Quick Start

### Option 1: Single Command - Everything! (Recommended)

```bash
# Setup AND start all services in background
./dev.sh --start
```

**That's it!** This one command:
- ✓ Starts PostgreSQL (if not running)
- ✓ Creates `.env` files (if they don't exist)
- ✓ Runs database migrations (if needed)
- ✓ Starts backend in background
- ✓ Starts frontend in background

View logs: `tail -f backend.log` or `tail -f frontend.log`
Stop services: `./dev.sh --stop`
Check status: `./dev.sh --status`

### Option 2: Setup Only (Manual Start)

```bash
# Run setup without starting services
./dev.sh

# Then in separate terminals:
# Terminal 1 - Backend
make run-backend

# Terminal 2 - Frontend
make run-frontend
```

The `dev.sh` script is **idempotent** - you can run it multiple times safely. It will:
- ✓ Check if PostgreSQL is already running (won't restart if running)
- ✓ Create .env files only if they don't exist
- ✓ Run migrations only if needed
- ✓ Skip steps that are already complete
- ✓ Detect if services are already running (won't duplicate)

### Option 3: Using Make

```bash
# Start PostgreSQL and run migrations
make dev-local

# Then in separate terminals:
# Terminal 1 - Backend
make run-backend

# Terminal 2 - Frontend
make run-frontend
```

### Option 4: Manual Setup

1. **Start PostgreSQL**
```bash
make dev-db
```

2. **Configure Backend**
```bash
# Copy example env file
cp backend/.env.example backend/.env

# Edit if needed (defaults should work)
# vim backend/.env
```

3. **Run Database Migrations**
```bash
make migrate-up DATABASE_URL="postgres://dev:devpass@localhost:5432/paragliding?sslmode=disable"
```

4. **Start Backend** (in one terminal)
```bash
cd backend
go run ./cmd/api
```
Backend will be available at: http://localhost:8080

5. **Start Frontend** (in another terminal)
```bash
cd frontend
npm run dev
# or with bun:
# bun run dev
```
Frontend will be available at: http://localhost:3000

## Configuration

### Backend Environment Variables

The backend uses environment variables defined in `backend/.env`:

```bash
DATABASE_URL=postgres://dev:devpass@localhost:5432/paragliding?sslmode=disable
PORT=8080
LOCATION_LAT=32.8893 # Torrey Pines Gliderport
LOCATION_LON=-117.2519
LOCATION_NAME=Torrey Pines Gliderport
TIMEZONE=America/Los_Angeles
FETCH_INTERVAL=15m
CACHE_TTL=10m
```
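
Any of these can also be overridden for a single run without editing the file, assuming the backend reads them from its process environment (values below are illustrative only):

```bash
# Override settings for one invocation; backend/.env is left untouched
PORT=9090 FETCH_INTERVAL=5m make run-backend
```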

### Frontend Environment Variables

The frontend uses `frontend/.env.local`:

```bash
NEXT_PUBLIC_API_URL=http://localhost:8080/api/v1
```

## Database Management

```bash
# Start PostgreSQL
make dev-db

# Stop PostgreSQL
make dev-db-down

# Run migrations up
make migrate-up DATABASE_URL="postgres://dev:devpass@localhost:5432/paragliding?sslmode=disable"

# Run migrations down
make migrate-down DATABASE_URL="postgres://dev:devpass@localhost:5432/paragliding?sslmode=disable"

# Create a new migration
make migrate-create name=add_new_table
```
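
For illustration, `migrate-create` uses golang-migrate's sequential (`-seq`) naming, so it should leave a pair of empty up/down SQL files to fill in; the exact number prefix depends on how many migrations already exist:

```bash
make migrate-create name=add_new_table
# Typical result under backend/migrations/ (prefix is sequential):
#   00000N_add_new_table.up.sql
#   00000N_add_new_table.down.sql
```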

## Using Bun Instead of npm

If you prefer Bun for faster package installation and execution:

```bash
cd frontend

# Install dependencies
bun install

# Run dev server
bun run dev

# Run build
bun run build
```

## Switching Between Local and Container Development

### To Local Development:
```bash
# Stop all containers
make dev-down

# Start only PostgreSQL
make dev-db

# Run apps locally (separate terminals)
make run-backend
make run-frontend
```

### Back to Full Container Development:
```bash
# Stop local PostgreSQL
make dev-db-down

# Start all services in containers
make dev
```

## Troubleshooting

### Backend won't connect to PostgreSQL
- Ensure PostgreSQL is running: `podman ps` or `docker ps`
- Check connection string in `backend/.env`
- Verify PostgreSQL is healthy: `podman logs paragliding-postgres`

### Frontend can't reach backend
- Ensure backend is running on port 8080
- Check `NEXT_PUBLIC_API_URL` in `frontend/.env.local`
- Verify backend health: `curl http://localhost:8080/api/v1/health`
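
A healthy backend answers that check with a small JSON document (shape taken from the health handler; the timestamp and location shown here are illustrative):

```bash
curl -s http://localhost:8080/api/v1/health
# {"status":"healthy","timestamp":"2025-01-01T00:00:00Z","location":"Torrey Pines Gliderport"}
```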

### Port already in use
- Backend (8080): Change `PORT` in `backend/.env`
- Frontend (3000): Next.js automatically falls back to the next available port
- PostgreSQL (5432): Change port mapping in `docker-compose.dev.yml`

## Hot Reload & Live Development

- **Frontend**: Next.js automatically hot-reloads on file changes
- **Backend**: For auto-reload, install Air:
```bash
go install github.com/air-verse/air@latest
cd backend
air
```
Air config is already set up in `backend/.air.toml`.

## Running Tests

```bash
# Backend tests
make test-backend

# Frontend tests
make test-frontend
```
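
To iterate on a single backend package instead of the whole suite, the usual `go test` selectors work; for example, to run only the Open-Meteo client test:

```bash
cd backend
go test ./internal/client/ -run TestOpenMeteoClient_GetWeatherForecast -v
```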

## API Endpoints

Once running, you can access:

- Frontend: http://localhost:3000
- Backend API: http://localhost:8080/api/v1
- Health Check: http://localhost:8080/api/v1/health
- Current Weather: http://localhost:8080/api/v1/weather/current
- Forecast: http://localhost:8080/api/v1/weather/forecast
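
As a quick smoke test, the current-conditions endpoint can be queried directly; its response contains `timestamp`, `location`, `current`, `assessment`, and `source` fields (`jq` is optional, used here only for pretty-printing):

```bash
curl -s http://localhost:8080/api/v1/weather/current | jq .
```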

Makefile · Normal file · 170 lines
@@ -0,0 +1,170 @@
.PHONY: dev dev-up dev-down test migrate migrate-up migrate-down build clean

# Development
dev: dev-up
	@echo "Development environment is running"
	@echo "Backend: http://localhost:8080"
	@echo "Frontend: http://localhost:3000"
	@echo ""
	@echo "To view logs:"
	@echo " make dev-logs"
	@echo ""
	@echo "To stop:"
	@echo " make dev-down"

dev-up:
	@echo "Creating podman network..."
	-podman network create paragliding-net 2>/dev/null || true
	@echo "Starting PostgreSQL..."
	-podman rm -f paragliding-postgres 2>/dev/null || true
	podman run -d \
	--name paragliding-postgres \
	--network paragliding-net \
	-e POSTGRES_DB=paragliding \
	-e POSTGRES_USER=dev \
	-e POSTGRES_PASSWORD=devpass \
	-p 5432:5432 \
	postgres:16-alpine
	@echo "Waiting for PostgreSQL to be ready..."
	@sleep 5
	@echo "Building backend..."
	podman build -t paragliding-backend:dev -f backend/Dockerfile.dev backend/
	@echo "Starting backend..."
	-podman rm -f paragliding-backend 2>/dev/null || true
	podman run -d \
	--name paragliding-backend \
	--network paragliding-net \
	-e DATABASE_URL="postgres://dev:devpass@paragliding-postgres:5432/paragliding?sslmode=disable" \
	-e PORT=8080 \
	-e LOCATION_LAT=32.8893 \
	-e LOCATION_LON=-117.2519 \
	-e LOCATION_NAME="Torrey Pines Gliderport" \
	-e TIMEZONE="America/Los_Angeles" \
	-e FETCH_INTERVAL=15m \
	-e CACHE_TTL=10m \
	-p 8080:8080 \
	-v $(PWD)/backend:/app:z \
	paragliding-backend:dev
	@echo "Building frontend..."
	podman build -t paragliding-frontend:dev --target dev -f frontend/Dockerfile frontend/
	@echo "Starting frontend..."
	-podman rm -f paragliding-frontend 2>/dev/null || true
	podman run -d \
	--name paragliding-frontend \
	--network paragliding-net \
	-e NEXT_PUBLIC_API_URL=http://localhost:8080/api/v1 \
	-p 3000:3000 \
	-v $(PWD)/frontend:/app:z \
	-v /app/node_modules \
	-v /app/.next \
	paragliding-frontend:dev

dev-down:
	@echo "Stopping containers..."
	-podman rm -f paragliding-frontend paragliding-backend paragliding-postgres 2>/dev/null || true
	@echo "Containers stopped"

dev-logs:
	@echo "=== Backend Logs ==="
	podman logs -f paragliding-backend

dev-logs-frontend:
	@echo "=== Frontend Logs ==="
	podman logs -f paragliding-frontend

dev-logs-postgres:
	@echo "=== PostgreSQL Logs ==="
	podman logs -f paragliding-postgres

# Database migrations
migrate-up:
	cd backend && go run -tags 'postgres' github.com/golang-migrate/migrate/v4/cmd/migrate@latest \
	-path ./migrations \
	-database "$(DATABASE_URL)" \
	up

migrate-down:
	cd backend && go run -tags 'postgres' github.com/golang-migrate/migrate/v4/cmd/migrate@latest \
	-path ./migrations \
	-database "$(DATABASE_URL)" \
	down 1

migrate-create:
	cd backend && go run -tags 'postgres' github.com/golang-migrate/migrate/v4/cmd/migrate@latest \
	create -ext sql -dir ./migrations -seq $(name)

# Testing
test:
	docker compose -f docker-compose.test.yml up --build --abort-on-container-exit
	docker compose -f docker-compose.test.yml down -v

test-backend:
	cd backend && go test -v ./...

test-frontend:
	cd frontend && npm test

# Build
build-backend:
	cd backend && go build -o bin/api ./cmd/api

build-frontend:
	cd frontend && npm run build

build: build-backend build-frontend

# Clean
clean:
	rm -rf backend/bin
	rm -rf frontend/.next
	rm -rf frontend/node_modules

# Local development without Docker
# Start postgres only (for local dev)
dev-db:
	@echo "Starting PostgreSQL..."
	podman-compose -f docker-compose.dev.yml up -d
	@echo "PostgreSQL is running on localhost:5432"
	@echo "Database: paragliding"
	@echo "User: dev"
	@echo "Password: devpass"

dev-db-down:
	@echo "Stopping PostgreSQL..."
	podman-compose -f docker-compose.dev.yml down
	@echo "PostgreSQL stopped"

# Run backend locally (requires postgres)
run-backend:
	@if [ ! -f backend/.env ]; then \
	echo "Creating backend/.env from .env.example..."; \
	cp backend/.env.example backend/.env; \
	fi
	cd backend && go run ./cmd/api

# Run frontend locally
run-frontend:
	cd frontend && npm run dev

# Complete local development setup (using idempotent script)
dev-local:
	@./dev.sh

# Alternative: step-by-step setup
dev-local-manual:
	@echo "🚀 Setting up local development environment..."
	@echo ""
	@echo "1. Starting PostgreSQL..."
	@make dev-db
	@echo ""
	@echo "2. Waiting for PostgreSQL to be ready..."
	@sleep 3
	@echo ""
	@echo "3. Running migrations..."
	@make migrate-up DATABASE_URL="postgres://dev:devpass@localhost:5432/paragliding?sslmode=disable"
	@echo ""
	@echo "✅ Setup complete!"
	@echo ""
	@echo "Now run these commands in separate terminals:"
	@echo " Terminal 1: make run-backend"
	@echo " Terminal 2: make run-frontend"

README.md · Normal file · 218 lines
@@ -0,0 +1,218 @@
# Paragliding Weather Dashboard

Real-time paragliding weather conditions and forecasting application.

## Features

- 🌤️ Real-time weather data from Open-Meteo
- 🪂 Paragliding-specific condition assessment
- 📊 Historical weather data tracking
- 🎯 Customizable safety thresholds
- ⚡ Fast API with caching
- 🔄 Automatic weather data updates

## Tech Stack

- **Backend**: Go 1.24, Chi router, PostgreSQL
- **Frontend**: Next.js 14, React 18, TailwindCSS, Recharts
- **Database**: PostgreSQL 16
- **Deployment**: Docker/Podman, Kubernetes

## Quick Start

### Local Development (Fastest)

Run only PostgreSQL in a container, everything else natively:

```bash
# One command - setup AND start everything!
./dev.sh --start

# Or setup only, then start manually:
./dev.sh
make run-backend # Terminal 1
make run-frontend # Terminal 2
```

**That's it!** The `--start` flag sets up PostgreSQL, creates config files, runs migrations, AND starts both backend and frontend in the background.

**Why local development?**
- ✨ Faster hot-reload and rebuild times
- 🔧 Better debugging experience
- 📦 Smaller resource footprint
- 🚀 Quicker iteration cycles

See [DEV_SETUP.md](./DEV_SETUP.md) for a detailed local development guide.

### Full Container Development

Run everything in containers:

```bash
make dev
```

Access at:
- Frontend: http://localhost:3000
- Backend API: http://localhost:8080/api/v1

## Available Make Commands

### Local Development
- `./dev.sh` - **Recommended**: Idempotent setup script
- `make dev-local` - Setup local development (runs dev.sh)
- `make dev-db` - Start PostgreSQL only
- `make dev-db-down` - Stop PostgreSQL
- `make run-backend` - Run backend locally
- `make run-frontend` - Run frontend locally

### Container Development
- `make dev` - Start all services in containers
- `make dev-down` - Stop all containers
- `make dev-logs` - View backend logs
- `make dev-logs-frontend` - View frontend logs
- `make dev-logs-postgres` - View PostgreSQL logs

### Database
- `make migrate-up` - Run migrations
- `make migrate-down` - Rollback last migration
- `make migrate-create name=<name>` - Create new migration

### Testing
- `make test` - Run all tests
- `make test-backend` - Run backend tests only
- `make test-frontend` - Run frontend tests only

### Build
- `make build` - Build backend and frontend
- `make build-backend` - Build backend only
- `make build-frontend` - Build frontend only

## API Endpoints

- `GET /api/v1/health` - Health check
- `GET /api/v1/weather/current` - Current conditions
- `GET /api/v1/weather/forecast` - Weather forecast
- `GET /api/v1/weather/historical?date=YYYY-MM-DD` - Historical data
- `POST /api/v1/assess` - Assess conditions with custom thresholds
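
For illustration, a custom-threshold assessment can be requested as below; the field names inside `thresholds` depend on the backend's `model.Thresholds` struct and are shown here only as hypothetical placeholders:

```bash
# Hypothetical threshold field names - check model.Thresholds for the real ones
curl -s -X POST http://localhost:8080/api/v1/assess \
  -H 'Content-Type: application/json' \
  -d '{"thresholds": {"maxWindSpeedMph": 15, "maxGustMph": 20}}'
```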

## Configuration

### Backend Environment Variables

See `backend/.env.example` for all configuration options:

- `DATABASE_URL` - PostgreSQL connection string
- `PORT` - Server port (default: 8080)
- `LOCATION_LAT` - Latitude for weather data
- `LOCATION_LON` - Longitude for weather data
- `LOCATION_NAME` - Location display name
- `TIMEZONE` - Timezone for weather data
- `FETCH_INTERVAL` - How often to fetch weather (default: 15m)
- `CACHE_TTL` - API response cache duration (default: 10m)

### Frontend Environment Variables

See `frontend/.env.local.example`:

- `NEXT_PUBLIC_API_URL` - Backend API URL

## Project Structure

```
.
├── backend/                  # Go backend
│   ├── cmd/api/              # Application entry point
│   ├── internal/             # Internal packages
│   │   ├── cache/            # Caching layer
│   │   ├── client/           # External API clients
│   │   ├── config/           # Configuration
│   │   ├── model/            # Data models
│   │   ├── repository/       # Database layer
│   │   ├── server/           # HTTP server
│   │   └── service/          # Business logic
│   └── migrations/           # Database migrations
├── frontend/                 # Next.js frontend
│   ├── app/                  # Next.js app directory
│   ├── components/           # React components
│   ├── hooks/                # Custom React hooks
│   ├── lib/                  # Utilities
│   └── store/                # State management
├── docker-compose.yml        # Full stack containers
├── docker-compose.dev.yml    # PostgreSQL only
├── dev.sh                    # Idempotent dev setup script
├── Makefile                  # Development commands
└── DEV_SETUP.md              # Detailed dev guide
```

## Development Workflow

### Initial Setup
```bash
# 1. Clone the repository
git clone <repo-url>
cd paragliding

# 2. Run setup script
./dev.sh

# 3. Start services (in separate terminals)
make run-backend
make run-frontend
```

### Daily Development
```bash
# Start PostgreSQL (if not running)
make dev-db

# Start backend
make run-backend

# Start frontend
make run-frontend
```

### Making Changes

- **Backend**: Changes auto-reload with Air (if installed) or restart manually
- **Frontend**: Changes auto-reload with Next.js Fast Refresh
- **Database**: Create migrations with `make migrate-create name=<name>`

## Deployment

### Docker/Podman

```bash
# Build and start all services
docker-compose up -d

# Or with podman
podman-compose up -d
```

### Kubernetes

```bash
# Apply configurations
kubectl apply -f k8s.yaml

# Check status
kubectl get pods
```

## Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Run tests: `make test`
5. Submit a pull request

## License

ISC

## Support

For issues and questions, please open a GitHub issue.

TEST_RESULTS.md · Normal file · 72 lines
@@ -0,0 +1,72 @@
# Playwright Test Results and Issues Found

## Test Execution Summary
- **Total Tests**: 6 tests across 2 test files
- **Passed**: 5 tests
- **Failed**: 1 test (slider interaction timeout - fixed in updated test)

## Issues Identified

### 1. ✅ FIXED: React Key Prop Warning
**Location**: `frontend/components/weather/wind-direction-chart.tsx`
**Issue**: Missing `key` prop on circle elements in Line chart dots
**Status**: Fixed by adding ``key={`dot-${index}`}`` to circle elements

### 2. ⚠️ WARNING: Recharts Name Property Warnings
**Location**: Both `wind-speed-chart.tsx` and `wind-direction-chart.tsx`
**Issue**: Recharts warns about missing `name` property when using functions for `dataKey`
**Status**: Components already have `name` props, but warnings persist. This appears to be a Recharts library quirk when using function-based dataKeys. The charts function correctly despite the warnings.

### 3. ✅ VERIFIED: API Functionality
**Status**: All API endpoints responding correctly:
- `/api/v1/weather/current` - ✅ 200 OK
- `/api/v1/weather/forecast` - ✅ 200 OK
- `/api/v1/weather/historical` - ✅ 200 OK

**Backend Logs**: No errors found, all requests returning 200 status codes

### 4. ✅ VERIFIED: Threshold Controls
**Status**: Threshold controls are working correctly:
- Sliders are found and interactive
- Values update when sliders are moved
- No API calls triggered on threshold change (expected - thresholds are client-side only)

### 5. ✅ VERIFIED: Frontend Loading
**Status**: Frontend loads successfully:
- Dashboard renders correctly
- Weather data displays
- Charts render properly
- No blocking errors

## Test Coverage

### Interactive Tests (`interactive.spec.ts`)
1. ✅ Dashboard loads and displays weather data
2. ✅ All interactive elements can be tested
3. ✅ Slider interactions work (after fix)
4. ⚠️ One test had timeout issue with slider click (fixed by using mouse.click on track)

### API Interaction Tests (`api-interactions.spec.ts`)
1. ✅ API calls are monitored correctly
2. ✅ Console errors and warnings are captured
3. ✅ Network requests are logged

## Recommendations

1. **Recharts Warnings**: Consider updating Recharts or using a different approach for conditional data rendering if warnings become problematic
2. **Slider Interaction**: The current approach of clicking on the slider track works well and avoids pointer interception issues
3. **Error Monitoring**: Consider adding error tracking (e.g., Sentry) for production to catch runtime errors

## Backend Status
- ✅ Database migrations applied
- ✅ API endpoints working
- ✅ Weather data fetching successful
- ✅ No errors in logs during test execution

## Frontend Status
- ✅ Application loads successfully
- ✅ Data fetching works
- ✅ Charts render correctly
- ⚠️ Minor React warnings (non-blocking)
- ⚠️ Recharts warnings (non-blocking)

backend/.air.toml · Normal file · 30 lines
@@ -0,0 +1,30 @@
root = "."
tmp_dir = "tmp"

[build]
cmd = "go build -o ./tmp/main ./cmd/api"
bin = "tmp/main"
full_bin = "./tmp/main"
include_ext = ["go", "tpl", "tmpl", "html"]
exclude_dir = ["assets", "tmp", "vendor", "testdata", "migrations"]
exclude_file = []
exclude_regex = ["_test.go"]
exclude_unchanged = true
follow_symlink = true
log = "air.log"
delay = 1000 # ms
stop_on_error = true
send_interrupt = false
kill_delay = 500 # ms

[log]
time = true

[color]
main = "magenta"
watcher = "cyan"
build = "yellow"
runner = "green"

[misc]
clean_on_exit = true

backend/.env.example · Normal file · 19 lines
@@ -0,0 +1,19 @@
# Database Configuration
DATABASE_URL=postgres://dev:devpass@localhost:5432/paragliding?sslmode=disable

# Server Configuration
PORT=8080

# Location Configuration (Torrey Pines Gliderport by default)
LOCATION_LAT=32.8893
LOCATION_LON=-117.2519
LOCATION_NAME="Torrey Pines Gliderport"

# Timezone
TIMEZONE=America/Los_Angeles

# Weather Fetcher Configuration
FETCH_INTERVAL=15m

# Cache Configuration
CACHE_TTL=10m

backend/Dockerfile · Normal file · 37 lines
@@ -0,0 +1,37 @@
# Build stage
FROM golang:1.24-alpine AS builder

WORKDIR /app

# Install git for fetching dependencies
RUN apk add --no-cache git ca-certificates

# Copy go mod files
COPY go.mod go.sum ./
RUN go mod download

# Copy source code
COPY . .

# Build the application
RUN CGO_ENABLED=0 GOOS=linux go build -ldflags="-w -s" -o /api ./cmd/api

# Final stage
FROM alpine:3.19

WORKDIR /app

# Install ca-certificates for HTTPS requests
RUN apk --no-cache add ca-certificates tzdata

# Copy binary from builder (migrations are embedded in the binary)
COPY --from=builder /api /app/api

# Non-root user
RUN adduser -D -g '' appuser
RUN chown -R appuser:appuser /app
USER appuser

EXPOSE 8080

CMD ["/app/api"]

backend/Dockerfile.dev · Normal file · 27 lines
@@ -0,0 +1,27 @@
# Development Dockerfile with hot reload
FROM golang:1.24-alpine

WORKDIR /app

# Install air for hot reload
# Use GOTOOLCHAIN=auto to allow Go to download a newer toolchain if needed
ENV GOTOOLCHAIN=auto
RUN go install github.com/air-verse/air@latest

# Install migrate for database migrations
RUN go install -tags 'postgres' github.com/golang-migrate/migrate/v4/cmd/migrate@latest

# Copy go mod files first for caching
COPY go.mod ./
RUN go mod download

# Copy source code (will be overridden by volume mount)
COPY . .

# Create data directory
RUN mkdir -p /app/data

EXPOSE 8080

# Use air for hot reload
CMD ["air", "-c", ".air.toml"]

backend/cmd/api/main.go · Normal file · 421 lines
@@ -0,0 +1,421 @@
package main

import (
	"context"
	"encoding/json"
	"fmt"
	"log/slog"
	"net/http"
	"os"
	"os/signal"
	"syscall"
	"time"

	"github.com/jackc/pgx/v5/pgxpool"
	"github.com/scottyah/paragliding/internal/client"
	"github.com/scottyah/paragliding/internal/config"
	"github.com/scottyah/paragliding/internal/database"
	"github.com/scottyah/paragliding/internal/model"
	"github.com/scottyah/paragliding/internal/repository"
	"github.com/scottyah/paragliding/internal/server"
	"github.com/scottyah/paragliding/internal/service"
)

func main() {
	// Setup structured logging
	logger := slog.New(slog.NewJSONHandler(os.Stdout, &slog.HandlerOptions{
		Level: slog.LevelInfo,
	}))
	slog.SetDefault(logger)

	// Load configuration
	cfg, err := config.Load()
	if err != nil {
		logger.Error("failed to load configuration", "error", err)
		os.Exit(1)
	}

	logger.Info("configuration loaded",
		"port", cfg.Port,
		"location", cfg.LocationName,
		"fetch_interval", cfg.FetchInterval,
	)

	// Create context for graceful shutdown
	ctx, cancel := context.WithCancel(context.Background())
	defer cancel()

	// Connect to PostgreSQL with retry
	dbpool, err := connectDatabaseWithRetry(ctx, cfg.DatabaseURL, logger, 30*time.Second)
	if err != nil {
		logger.Error("failed to connect to database", "error", err)
		os.Exit(1)
	}
	defer dbpool.Close()

	logger.Info("connected to database")

	// Run database migrations
	if err := database.RunMigrations(cfg.DatabaseURL, logger); err != nil {
		logger.Error("failed to run migrations", "error", err)
		os.Exit(1)
	}

	// Initialize services
	weatherClient := client.NewOpenMeteoClient(client.OpenMeteoConfig{
		Latitude:  cfg.LocationLat,
		Longitude: cfg.LocationLon,
		Timezone:  cfg.Timezone,
	})

	weatherRepo := repository.NewWeatherRepository(dbpool)
	assessmentSvc := service.NewAssessmentService()

	// Create weather service (handles caching and API rate limiting)
	weatherSvc := service.NewWeatherService(service.WeatherServiceConfig{
		Client: weatherClient,
		Repo:   weatherRepo,
		Logger: logger,
	})

	// Create HTTP server
	srv := server.New(cfg.Addr(), logger)

	// Setup routes with handler
	handler := &Handler{
		logger:        logger,
		db:            dbpool,
		config:        cfg,
		weatherSvc:    weatherSvc,
		assessmentSvc: assessmentSvc,
	}
	srv.SetupRoutes(handler)

	// Start background weather fetcher
	fetcher := &WeatherFetcher{
		logger:     logger,
		config:     cfg,
		weatherSvc: weatherSvc,
		stopChan:   make(chan struct{}),
	}
	go fetcher.Start(ctx)

	// Start HTTP server in a goroutine
	serverErrors := make(chan error, 1)
	go func() {
		serverErrors <- srv.Start()
	}()

	// Wait for interrupt signal or server error
	shutdown := make(chan os.Signal, 1)
	signal.Notify(shutdown, os.Interrupt, syscall.SIGTERM)

	select {
	case err := <-serverErrors:
		logger.Error("server error", "error", err)
	case sig := <-shutdown:
		logger.Info("shutdown signal received", "signal", sig)

		// Stop background fetcher
		close(fetcher.stopChan)

		// Give outstanding requests time to complete
		ctx, cancel := context.WithTimeout(context.Background(), 15*time.Second)
		defer cancel()

		if err := srv.Shutdown(ctx); err != nil {
			logger.Error("graceful shutdown failed", "error", err)
			os.Exit(1)
		}

		logger.Info("server stopped gracefully")
	}
}

// connectDatabaseWithRetry establishes a connection pool to PostgreSQL with exponential backoff
func connectDatabaseWithRetry(ctx context.Context, databaseURL string, logger *slog.Logger, maxWait time.Duration) (*pgxpool.Pool, error) {
	pgConfig, err := pgxpool.ParseConfig(databaseURL)
	if err != nil {
		return nil, fmt.Errorf("failed to parse database URL: %w", err)
	}

	// Set connection pool settings
	pgConfig.MaxConns = 25
	pgConfig.MinConns = 5
	pgConfig.MaxConnLifetime = time.Hour
	pgConfig.MaxConnIdleTime = 30 * time.Minute
	pgConfig.HealthCheckPeriod = time.Minute

	// Exponential backoff retry
	var pool *pgxpool.Pool
	backoff := 1 * time.Second
	maxBackoff := 8 * time.Second
	deadline := time.Now().Add(maxWait)

	for {
		pool, err = pgxpool.NewWithConfig(ctx, pgConfig)
		if err == nil {
			// Verify connection
			if pingErr := pool.Ping(ctx); pingErr == nil {
				return pool, nil
			} else {
				pool.Close()
				err = pingErr
			}
		}

		if time.Now().After(deadline) {
			return nil, fmt.Errorf("failed to connect to database after %v: %w", maxWait, err)
		}

		logger.Warn("database connection failed, retrying",
			"error", err,
			"retry_in", backoff,
		)

		select {
		case <-ctx.Done():
			return nil, ctx.Err()
		case <-time.After(backoff):
		}

		// Exponential backoff with cap
		backoff *= 2
		if backoff > maxBackoff {
			backoff = maxBackoff
		}
	}
}

// Handler implements the RouteHandler interface
type Handler struct {
	logger        *slog.Logger
	db            *pgxpool.Pool
	config        *config.Config
	weatherSvc    *service.WeatherService
	assessmentSvc *service.AssessmentService
}

// Health handles health check requests
func (h *Handler) Health(w http.ResponseWriter, r *http.Request) {
	ctx, cancel := context.WithTimeout(r.Context(), 5*time.Second)
	defer cancel()

	// Check database connectivity
	if err := h.db.Ping(ctx); err != nil {
		h.logger.Error("health check failed", "error", err)
		server.RespondError(w, 503, "database unavailable")
		return
	}

	server.RespondJSON(w, 200, map[string]interface{}{
		"status":    "healthy",
		"timestamp": time.Now().UTC(),
		"location":  h.config.LocationName,
	})
}

// GetCurrentWeather handles current weather requests
func (h *Handler) GetCurrentWeather(w http.ResponseWriter, r *http.Request) {
	ctx := r.Context()

	// Get current weather from service (handles caching, DB, and rate-limited API fallback)
	weatherData, err := h.weatherSvc.GetCurrentWeather(ctx)
	if err != nil {
		h.logger.Error("failed to get current weather", "error", err)
		server.RespondError(w, 500, "failed to fetch weather data")
		return
	}

	if len(weatherData.Points) == 0 {
		server.RespondError(w, 500, "no current weather data available")
		return
	}

	current := weatherData.Points[0]

	// Get all points for assessment
	allPoints, err := h.weatherSvc.GetAllPoints(ctx)
	if err != nil {
		h.logger.Warn("failed to get points for assessment", "error", err)
		allPoints = weatherData.Points
	}

	// Get default thresholds and assess conditions
	thresholds := model.DefaultThresholds()
	assessment := h.assessmentSvc.Evaluate(allPoints, thresholds)

	response := map[string]interface{}{
		"timestamp": weatherData.FetchedAt.UTC(),
		"location": map[string]interface{}{
			"name": h.config.LocationName,
			"lat":  h.config.LocationLat,
			"lon":  h.config.LocationLon,
		},
		"current": map[string]interface{}{
			"windSpeed":     current.WindSpeedMPH,
			"windDirection": current.WindDirection,
			"windGust":      current.WindGustMPH,
			"time":          current.Time,
		},
		"assessment": assessment,
		"source":     weatherData.Source,
	}

	server.RespondJSON(w, 200, response)
}

// GetForecast handles weather forecast requests
func (h *Handler) GetForecast(w http.ResponseWriter, r *http.Request) {
	ctx := r.Context()

	// Get forecast from service (handles caching, DB, and rate-limited API fallback)
	weatherData, err := h.weatherSvc.GetForecast(ctx)
	if err != nil {
		h.logger.Error("failed to get forecast", "error", err)
		server.RespondError(w, 500, "failed to fetch forecast data")
		return
	}

	// Get default thresholds
	thresholds := model.DefaultThresholds()

	// Find flyable windows
	windows := h.assessmentSvc.FindFlyableWindows(weatherData.Points, thresholds)

	response := map[string]interface{}{
		"generated": weatherData.FetchedAt.UTC(),
		"location": map[string]interface{}{
			"name": h.config.LocationName,
			"lat":  h.config.LocationLat,
			"lon":  h.config.LocationLon,
		},
		"forecast":          weatherData.Points,
		"flyableWindows":    windows,
		"defaultThresholds": thresholds,
		"source":            weatherData.Source,
	}

	server.RespondJSON(w, 200, response)
}

// GetHistorical handles historical weather requests
func (h *Handler) GetHistorical(w http.ResponseWriter, r *http.Request) {
	ctx := r.Context()

	// Parse date from query param (default: yesterday)
	dateStr := r.URL.Query().Get("date")
	var date time.Time
	if dateStr == "" {
		date = time.Now().AddDate(0, 0, -1)
	} else {
		var err error
		date, err = time.Parse("2006-01-02", dateStr)
		if err != nil {
			server.RespondError(w, 400, "invalid date format, use YYYY-MM-DD")
			return
		}
	}

	// Get historical data from service (handles caching and DB lookup)
	weatherData, err := h.weatherSvc.GetHistorical(ctx, date)
	if err != nil {
		h.logger.Error("failed to get historical data", "error", err)
		server.RespondError(w, 500, "failed to fetch historical data")
		return
	}

	response := map[string]interface{}{
		"date":   date.Format("2006-01-02"),
		"data":   weatherData.Points,
		"source": weatherData.Source,
	}

	server.RespondJSON(w, 200, response)
}

// AssessConditions handles paragliding condition assessment requests
func (h *Handler) AssessConditions(w http.ResponseWriter, r *http.Request) {
	ctx := r.Context()

	// Limit request body size
	r.Body = http.MaxBytesReader(w, r.Body, 1<<20) // 1MB max

	// Parse thresholds from request body
	var req struct {
		Thresholds model.Thresholds `json:"thresholds"`
	}
	if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
		// Use default thresholds if not provided or invalid
		req.Thresholds = model.DefaultThresholds()
	}

	// Get weather data from service (DB-first, no API call triggered here)
	points, err := h.weatherSvc.GetAllPoints(ctx)
	if err != nil {
		h.logger.Error("failed to get weather for assessment", "error", err)
		server.RespondError(w, 500, "failed to fetch weather data")
		return
	}

	// Run assessment with provided thresholds
	assessment := h.assessmentSvc.Evaluate(points, req.Thresholds)
	windows := h.assessmentSvc.FindFlyableWindows(points, req.Thresholds)

	response := map[string]interface{}{
		"assessment":     assessment,
		"flyableWindows": windows,
		"thresholds":     req.Thresholds,
	}

	server.RespondJSON(w, 200, response)
}

// WeatherFetcher runs background weather fetching
type WeatherFetcher struct {
	logger     *slog.Logger
	config     *config.Config
	weatherSvc *service.WeatherService
	stopChan   chan struct{}
}

// Start begins the background weather fetching loop
func (f *WeatherFetcher) Start(ctx context.Context) {
	f.logger.Info("starting weather fetcher", "interval", f.config.FetchInterval)

	ticker := time.NewTicker(f.config.FetchInterval)
	defer ticker.Stop()

	// Fetch immediately on startup
	f.fetch(ctx)

	for {
		select {
		case <-ticker.C:
			f.fetch(ctx)
		case <-f.stopChan:
			f.logger.Info("stopping weather fetcher")
			return
		case <-ctx.Done():
			f.logger.Info("weather fetcher context cancelled")
			return
		}
	}
}

// fetch performs the actual weather data fetching via the weather service
func (f *WeatherFetcher) fetch(ctx context.Context) {
	f.logger.Info("fetching weather data",
		"lat", f.config.LocationLat,
		"lon", f.config.LocationLon,
	)

	// Use weather service's FetchFromAPI (bypasses rate limiting for scheduled fetches)
	points, err := f.weatherSvc.FetchFromAPI(ctx)
	if err != nil {
		f.logger.Error("failed to fetch weather data", "error", err)
		return
	}

	f.logger.Info("weather data updated successfully", "points", len(points))
}

backend/go.mod · Normal file · 23 lines
@@ -0,0 +1,23 @@
module github.com/scottyah/paragliding

go 1.24.0

toolchain go1.24.11

require (
	github.com/go-chi/chi/v5 v5.2.3
	github.com/go-chi/cors v1.2.2
	github.com/jackc/pgx/v5 v5.8.0
	github.com/kelseyhightower/envconfig v1.4.0
)

require (
	github.com/golang-migrate/migrate/v4 v4.19.1 // indirect
	github.com/jackc/pgpassfile v1.0.0 // indirect
	github.com/jackc/pgservicefile v0.0.0-20240606120523-5a60cdf6a761 // indirect
	github.com/jackc/puddle/v2 v2.2.2 // indirect
	github.com/lib/pq v1.10.9 // indirect
	golang.org/x/sync v0.18.0 // indirect
	golang.org/x/text v0.31.0 // indirect
	golang.org/x/time v0.14.0 // indirect
)

backend/go.sum · Normal file · 44 lines
@@ -0,0 +1,44 @@
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/davecgh/go-spew v1.1.2-0.20180830191138-d8f796af33cc h1:U9qPSI2PIWSS1VwoXQT9A3Wy9MM3WgvqSxFWenqJduM=
github.com/go-chi/chi/v5 v5.2.3 h1:WQIt9uxdsAbgIYgid+BpYc+liqQZGMHRaUwp0JUcvdE=
github.com/go-chi/chi/v5 v5.2.3/go.mod h1:L2yAIGWB3H+phAw1NxKwWM+7eUH/lU8pOMm5hHcoops=
github.com/go-chi/cors v1.2.2 h1:Jmey33TE+b+rB7fT8MUy1u0I4L+NARQlK6LhzKPSyQE=
github.com/go-chi/cors v1.2.2/go.mod h1:sSbTewc+6wYHBBCW7ytsFSn836hqM7JxpglAy2Vzc58=
github.com/golang-migrate/migrate/v4 v4.19.1 h1:OCyb44lFuQfYXYLx1SCxPZQGU7mcaZ7gH9yH4jSFbBA=
github.com/golang-migrate/migrate/v4 v4.19.1/go.mod h1:CTcgfjxhaUtsLipnLoQRWCrjYXycRz/g5+RWDuYgPrE=
github.com/jackc/pgpassfile v1.0.0 h1:/6Hmqy13Ss2zCq62VdNG8tM1wchn8zjSGOBJ6icpsIM=
github.com/jackc/pgpassfile v1.0.0/go.mod h1:CEx0iS5ambNFdcRtxPj5JhEz+xB6uRky5eyVu/W2HEg=
github.com/jackc/pgservicefile v0.0.0-20240606120523-5a60cdf6a761 h1:iCEnooe7UlwOQYpKFhBabPMi4aNAfoODPEFNiAnClxo=
github.com/jackc/pgservicefile v0.0.0-20240606120523-5a60cdf6a761/go.mod h1:5TJZWKEWniPve33vlWYSoGYefn3gLQRzjfDlhSJ9ZKM=
github.com/jackc/pgx/v5 v5.8.0 h1:TYPDoleBBme0xGSAX3/+NujXXtpZn9HBONkQC7IEZSo=
github.com/jackc/pgx/v5 v5.8.0/go.mod h1:QVeDInX2m9VyzvNeiCJVjCkNFqzsNb43204HshNSZKw=
github.com/jackc/puddle/v2 v2.2.2 h1:PR8nw+E/1w0GLuRFSmiioY6UooMp6KJv0/61nB7icHo=
github.com/jackc/puddle/v2 v2.2.2/go.mod h1:vriiEXHvEE654aYKXXjOvZM39qJ0q+azkZFrfEOc3H4=
github.com/kelseyhightower/envconfig v1.4.0 h1:Im6hONhd3pLkfDFsbRgu68RDNkGF1r3dvMUtDTo2cv8=
github.com/kelseyhightower/envconfig v1.4.0/go.mod h1:cccZRl6mQpaq41TPp5QxidR+Sa3axMbJDNb//FQX6Gg=
github.com/lib/pq v1.10.9 h1:YXG7RB+JIjhP29X+OtkiDnYaXQwpS4JEWq7dtCCRUEw=
github.com/lib/pq v1.10.9/go.mod h1:AlVN5x4E4T544tWzH6hKfbfQvm3HdbOxrmggDNAPY9o=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/pmezard/go-difflib v1.0.1-0.20181226105442-5d4384ee4fb2 h1:Jamvg5psRIccs7FGNTlIRMkT8wgtp5eCXdBlqhYGL6U=
github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
github.com/stretchr/testify v1.3.0/go.mod h1:M5WIy9Dh21IEIfnGCwXGc5bZfKNJtfHm1UVUgZn+9EI=
github.com/stretchr/testify v1.7.0/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
github.com/stretchr/testify v1.11.1 h1:7s2iGBzp5EwR7/aIZr8ao5+dra3wiQyKjjFuvgVKu7U=
github.com/stretchr/testify v1.11.1/go.mod h1:wZwfW3scLgRK+23gO65QZefKpKQRnfz6sD981Nm4B6U=
golang.org/x/sync v0.17.0 h1:l60nONMj9l5drqw6jlhIELNv9I0A4OFgRsG9k2oT9Ug=
golang.org/x/sync v0.17.0/go.mod h1:9KTHXmSnoGruLpwFjVSX0lNNA75CykiMECbovNTZqGI=
golang.org/x/sync v0.18.0 h1:kr88TuHDroi+UVf+0hZnirlk8o8T+4MrK6mr60WkH/I=
golang.org/x/sync v0.18.0/go.mod h1:9KTHXmSnoGruLpwFjVSX0lNNA75CykiMECbovNTZqGI=
golang.org/x/text v0.29.0 h1:1neNs90w9YzJ9BocxfsQNHKuAT4pkghyXc4nhZ6sJvk=
golang.org/x/text v0.29.0/go.mod h1:7MhJOA9CD2qZyOKYazxdYMF85OwPdEr9jTtBpO7ydH4=
golang.org/x/text v0.31.0 h1:aC8ghyu4JhP8VojJ2lEHBnochRno1sgL6nEi9WGFGMM=
golang.org/x/text v0.31.0/go.mod h1:tKRAlv61yKIjGGHX/4tP1LTbc13YSec1pxVEWXzfoeM=
golang.org/x/time v0.14.0 h1:MRx4UaLrDotUKUdCIqzPC48t1Y9hANFKIRpNx+Te8PI=
golang.org/x/time v0.14.0/go.mod h1:eL/Oa2bBBK0TkX57Fyni+NgnyQQN4LitPmob2Hjnqw4=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=
gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
168
backend/internal/client/openmeteo.go
Normal file
168
backend/internal/client/openmeteo.go
Normal file
@@ -0,0 +1,168 @@
|
|||||||
|
package client
|
||||||
|
|
||||||
|
import (
|
||||||
|
"context"
|
||||||
|
"encoding/json"
|
||||||
|
"fmt"
|
||||||
|
"io"
|
||||||
|
"net/http"
|
||||||
|
"net/url"
|
||||||
|
"time"
|
||||||
|
|
||||||
|
"github.com/scottyah/paragliding/internal/model"
|
||||||
|
)
|
||||||
|
|
||||||
|
const (
|
||||||
|
openMeteoBaseURL = "https://api.open-meteo.com/v1/forecast"
|
||||||
|
)
|
||||||
|
|
||||||
|
// OpenMeteoClient is a client for the Open-Meteo weather API
|
||||||
|
type OpenMeteoClient struct {
|
||||||
|
httpClient *http.Client
|
||||||
|
latitude float64
|
||||||
|
longitude float64
|
||||||
|
timezone string
|
||||||
|
}
|
||||||
|
|
||||||
|
// OpenMeteoConfig contains configuration for the Open-Meteo client
|
||||||
|
type OpenMeteoConfig struct {
|
||||||
|
Latitude float64
|
||||||
|
Longitude float64
|
||||||
|
Timezone string // IANA timezone (e.g., "America/Los_Angeles")
|
||||||
|
}
|
||||||
|
|
||||||
|
// NewOpenMeteoClient creates a new Open-Meteo API client
|
||||||
|
func NewOpenMeteoClient(config OpenMeteoConfig) *OpenMeteoClient {
|
||||||
|
return &OpenMeteoClient{
|
||||||
|
httpClient: &http.Client{
|
||||||
|
Timeout: 10 * time.Second,
|
||||||
|
},
|
||||||
|
latitude: config.Latitude,
|
||||||
|
longitude: config.Longitude,
|
||||||
|
timezone: config.Timezone,
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// openMeteoResponse represents the JSON response from Open-Meteo API
|
||||||
|
type openMeteoResponse struct {
|
||||||
|
Latitude float64 `json:"latitude"`
|
||||||
|
Longitude float64 `json:"longitude"`
|
||||||
|
Timezone string `json:"timezone"`
|
||||||
|
Hourly struct {
|
||||||
|
Time []string `json:"time"`
|
||||||
|
WindSpeed10m []float64 `json:"wind_speed_10m"`
|
||||||
|
WindDir10m []int `json:"wind_direction_10m"`
|
||||||
|
WindGusts10m []float64 `json:"wind_gusts_10m"`
|
||||||
|
} `json:"hourly"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// GetWeatherForecast fetches weather data from Open-Meteo API
|
||||||
|
// It retrieves 1 day of past data and 2 days of forecast data
|
||||||
|
func (c *OpenMeteoClient) GetWeatherForecast(ctx context.Context) ([]model.WeatherPoint, error) {
|
||||||
|
// Build request URL with query parameters
|
||||||
|
reqURL, err := url.Parse(openMeteoBaseURL)
|
||||||
|
if err != nil {
|
||||||
|
return nil, fmt.Errorf("failed to parse base URL: %w", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
query := reqURL.Query()
|
||||||
|
query.Set("latitude", fmt.Sprintf("%.4f", c.latitude))
|
||||||
|
query.Set("longitude", fmt.Sprintf("%.4f", c.longitude))
|
||||||
|
query.Set("hourly", "wind_speed_10m,wind_direction_10m,wind_gusts_10m")
|
||||||
|
query.Set("wind_speed_unit", "mph")
|
||||||
|
query.Set("timezone", c.timezone)
|
||||||
|
query.Set("forecast_days", "2")
|
||||||
|
query.Set("past_days", "1")
|
||||||
|
reqURL.RawQuery = query.Encode()
|
||||||
|
|
||||||
|
// Create HTTP request with context
|
||||||
|
req, err := http.NewRequestWithContext(ctx, http.MethodGet, reqURL.String(), nil)
|
||||||
|
if err != nil {
|
||||||
|
return nil, fmt.Errorf("failed to create request: %w", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Execute request
|
||||||
|
resp, err := c.httpClient.Do(req)
|
||||||
|
if err != nil {
|
||||||
|
return nil, fmt.Errorf("failed to fetch weather data: %w", err)
|
||||||
|
}
|
||||||
|
defer resp.Body.Close()
|
||||||
|
|
||||||
|
// Check status code
|
||||||
|
if resp.StatusCode != http.StatusOK {
|
||||||
|
body, _ := io.ReadAll(resp.Body)
|
||||||
|
return nil, fmt.Errorf("API returned status %d: %s", resp.StatusCode, string(body))
|
||||||
|
}
|
||||||
|
|
||||||
|
// Parse JSON response
|
||||||
|
var apiResp openMeteoResponse
|
||||||
|
if err := json.NewDecoder(resp.Body).Decode(&apiResp); err != nil {
|
||||||
|
return nil, fmt.Errorf("failed to decode response: %w", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Convert to WeatherPoint slice
|
||||||
|
points, err := c.parseWeatherPoints(apiResp)
|
||||||
|
if err != nil {
|
||||||
|
return nil, fmt.Errorf("failed to parse weather points: %w", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
return points, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// parseWeatherPoints converts the API response into a slice of WeatherPoint
|
||||||
|
func (c *OpenMeteoClient) parseWeatherPoints(resp openMeteoResponse) ([]model.WeatherPoint, error) {
|
||||||
|
if len(resp.Hourly.Time) == 0 {
|
||||||
|
return nil, fmt.Errorf("no hourly data in response")
|
||||||
|
}
|
||||||
|
|
||||||
|
// Validate that all arrays have the same length
|
||||||
|
dataLen := len(resp.Hourly.Time)
|
||||||
|
if len(resp.Hourly.WindSpeed10m) != dataLen ||
|
||||||
|
len(resp.Hourly.WindDir10m) != dataLen ||
|
||||||
|
len(resp.Hourly.WindGusts10m) != dataLen {
|
||||||
|
return nil, fmt.Errorf("inconsistent data array lengths in response")
|
||||||
|
}
|
||||||
|
|
||||||
|
// Load timezone location
|
||||||
|
loc, err := time.LoadLocation(resp.Timezone)
|
||||||
|
if err != nil {
|
||||||
|
// Fallback to UTC if timezone can't be loaded
|
||||||
|
loc = time.UTC
|
||||||
|
}
|
||||||
|
|
||||||
|
points := make([]model.WeatherPoint, 0, dataLen)
|
||||||
|
|
||||||
|
for i := 0; i < dataLen; i++ {
|
||||||
|
// Parse timestamp - API returns ISO8601 format without timezone (e.g., "2026-01-01T00:00")
|
||||||
|
// Try RFC3339 first, then fall back to ISO8601 without timezone
|
||||||
|
var t time.Time
|
||||||
|
var err error
|
||||||
|
|
||||||
|
// Try RFC3339 format first (with timezone)
|
||||||
|
t, err = time.Parse(time.RFC3339, resp.Hourly.Time[i])
|
||||||
|
if err != nil {
|
||||||
|
// Try ISO8601 format without timezone (e.g., "2006-01-02T15:04")
|
||||||
|
t, err = time.Parse("2006-01-02T15:04", resp.Hourly.Time[i])
|
||||||
|
if err != nil {
|
||||||
|
// Try with seconds if present
|
||||||
|
t, err = time.Parse("2006-01-02T15:04:05", resp.Hourly.Time[i])
|
||||||
|
if err != nil {
|
||||||
|
return nil, fmt.Errorf("failed to parse time at index %d: %w", i, err)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
// Apply timezone to parsed time
|
||||||
|
t = time.Date(t.Year(), t.Month(), t.Day(), t.Hour(), t.Minute(), t.Second(), t.Nanosecond(), loc)
|
||||||
|
}
|
||||||
|
|
||||||
|
point := model.WeatherPoint{
|
||||||
|
Time: t,
|
||||||
|
WindSpeedMPH: resp.Hourly.WindSpeed10m[i],
|
||||||
|
WindDirection: resp.Hourly.WindDir10m[i],
|
||||||
|
WindGustMPH: resp.Hourly.WindGusts10m[i],
|
||||||
|
}
|
||||||
|
|
||||||
|
points = append(points, point)
|
||||||
|
}
|
||||||
|
|
||||||
|
return points, nil
|
||||||
|
}
|
||||||
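As a quick orientation for the client above, here is a minimal usage sketch. It assumes only what this commit shows: `NewOpenMeteoClient`/`OpenMeteoConfig` (see the tests below) and `GetWeatherForecast`. The coordinates are the sample values used in the tests, and the `main` wrapper itself is illustrative, not part of this commit.

```go
package main

import (
	"context"
	"fmt"
	"log"
	"time"

	"github.com/scottyah/paragliding/internal/client"
)

func main() {
	// Sample coordinates matching the values used in the tests below.
	c := client.NewOpenMeteoClient(client.OpenMeteoConfig{
		Latitude:  32.8893,
		Longitude: -117.2519,
		Timezone:  "America/Los_Angeles",
	})

	ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
	defer cancel()

	// 1 past day + 2 forecast days of hourly data -> 72 points.
	points, err := c.GetWeatherForecast(ctx)
	if err != nil {
		log.Fatalf("fetch forecast: %v", err)
	}
	fmt.Printf("got %d points, first: %+v\n", len(points), points[0])
}
```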
328
backend/internal/client/openmeteo_test.go
Normal file
@@ -0,0 +1,328 @@
|
|||||||
|
package client
|
||||||
|
|
||||||
|
import (
|
||||||
|
"context"
|
||||||
|
"encoding/json"
|
||||||
|
"net/http"
|
||||||
|
"net/http/httptest"
|
||||||
|
"os"
|
||||||
|
"path/filepath"
|
||||||
|
"testing"
|
||||||
|
"time"
|
||||||
|
)
|
||||||
|
|
||||||
|
func TestOpenMeteoClient_GetWeatherForecast(t *testing.T) {
|
||||||
|
// Load test data
|
||||||
|
testDataPath := filepath.Join("..", "..", "testdata", "openmeteo_response.json")
|
||||||
|
testData, err := os.ReadFile(testDataPath)
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("failed to read test data: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Create mock server
|
||||||
|
server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
|
||||||
|
// Verify query parameters
|
||||||
|
query := r.URL.Query()
|
||||||
|
if query.Get("latitude") != "32.8893" {
|
||||||
|
t.Errorf("expected latitude=32.8893, got %s", query.Get("latitude"))
|
||||||
|
}
|
||||||
|
if query.Get("longitude") != "-117.2519" {
|
||||||
|
t.Errorf("expected longitude=-117.2519, got %s", query.Get("longitude"))
|
||||||
|
}
|
||||||
|
if query.Get("hourly") != "wind_speed_10m,wind_direction_10m,wind_gusts_10m" {
|
||||||
|
t.Errorf("unexpected hourly params: %s", query.Get("hourly"))
|
||||||
|
}
|
||||||
|
if query.Get("wind_speed_unit") != "mph" {
|
||||||
|
t.Errorf("expected wind_speed_unit=mph, got %s", query.Get("wind_speed_unit"))
|
||||||
|
}
|
||||||
|
if query.Get("timezone") != "America/Los_Angeles" {
|
||||||
|
t.Errorf("expected timezone=America/Los_Angeles, got %s", query.Get("timezone"))
|
||||||
|
}
|
||||||
|
if query.Get("forecast_days") != "2" {
|
||||||
|
t.Errorf("expected forecast_days=2, got %s", query.Get("forecast_days"))
|
||||||
|
}
|
||||||
|
if query.Get("past_days") != "1" {
|
||||||
|
t.Errorf("expected past_days=1, got %s", query.Get("past_days"))
|
||||||
|
}
|
||||||
|
|
||||||
|
w.Header().Set("Content-Type", "application/json")
|
||||||
|
w.WriteHeader(http.StatusOK)
|
||||||
|
w.Write(testData)
|
||||||
|
}))
|
||||||
|
defer server.Close()
|
||||||
|
|
||||||
|
// Create client with mock server URL
|
||||||
|
client := NewOpenMeteoClient(OpenMeteoConfig{
|
||||||
|
Latitude: 32.8893,
|
||||||
|
Longitude: -117.2519,
|
||||||
|
Timezone: "America/Los_Angeles",
|
||||||
|
})
|
||||||
|
|
||||||
|
// Override base URL to use test server
|
||||||
|
// Note: In production, you'd want to make baseURL configurable
|
||||||
|
// For now, this test verifies the parsing logic
|
||||||
|
_ = openMeteoBaseURL // Acknowledge the constant exists
|
||||||
|
|
||||||
|
// Temporarily replace httpClient to use test server
|
||||||
|
client.httpClient = server.Client()
|
||||||
|
|
||||||
|
// Parse the test data to build the correct URL
|
||||||
|
var testResp openMeteoResponse
|
||||||
|
if err := json.Unmarshal(testData, &testResp); err != nil {
|
||||||
|
t.Fatalf("failed to unmarshal test data: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Create a custom client that points to our test server
|
||||||
|
testClient := &OpenMeteoClient{
|
||||||
|
httpClient: server.Client(),
|
||||||
|
latitude: 32.8893,
|
||||||
|
longitude: -117.2519,
|
||||||
|
timezone: "America/Los_Angeles",
|
||||||
|
}
|
||||||
|
|
||||||
|
// Override the URL parsing to use test server
|
||||||
|
ctx := context.Background()
|
||||||
|
|
||||||
|
// Make request directly to test server
|
||||||
|
req, err := http.NewRequestWithContext(ctx, http.MethodGet, server.URL+"?latitude=32.8893&longitude=-117.2519&hourly=wind_speed_10m,wind_direction_10m,wind_gusts_10m&wind_speed_unit=mph&timezone=America/Los_Angeles&forecast_days=2&past_days=1", nil)
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("failed to create request: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
resp, err := testClient.httpClient.Do(req)
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("failed to execute request: %v", err)
|
||||||
|
}
|
||||||
|
defer resp.Body.Close()
|
||||||
|
|
||||||
|
var apiResp openMeteoResponse
|
||||||
|
if err := json.NewDecoder(resp.Body).Decode(&apiResp); err != nil {
|
||||||
|
t.Fatalf("failed to decode response: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
points, err := testClient.parseWeatherPoints(apiResp)
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("failed to parse weather points: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Verify results
|
||||||
|
if len(points) != 72 {
|
||||||
|
t.Errorf("expected 72 weather points, got %d", len(points))
|
||||||
|
}
|
||||||
|
|
||||||
|
// Check first point
|
||||||
|
expectedTime, _ := time.Parse(time.RFC3339, "2026-01-01T00:00:00Z")
|
||||||
|
if !points[0].Time.Equal(expectedTime) {
|
||||||
|
t.Errorf("expected first time to be %v, got %v", expectedTime, points[0].Time)
|
||||||
|
}
|
||||||
|
if points[0].WindSpeedMPH != 5.2 {
|
||||||
|
t.Errorf("expected first wind speed to be 5.2, got %f", points[0].WindSpeedMPH)
|
||||||
|
}
|
||||||
|
if points[0].WindDirection != 280 {
|
||||||
|
t.Errorf("expected first wind direction to be 280, got %d", points[0].WindDirection)
|
||||||
|
}
|
||||||
|
if points[0].WindGustMPH != 8.5 {
|
||||||
|
t.Errorf("expected first wind gust to be 8.5, got %f", points[0].WindGustMPH)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Check a point with good flying conditions (around index 12-15)
|
||||||
|
// Expected to have wind speed ~10-12 mph, direction ~260 degrees
|
||||||
|
goodPoint := points[13]
|
||||||
|
if goodPoint.WindSpeedMPH < 7.0 || goodPoint.WindSpeedMPH > 14.0 {
|
||||||
|
t.Logf("point at index 13: speed=%.1f, dir=%d (outside the expected flyable range)",
|
||||||
|
goodPoint.WindSpeedMPH, goodPoint.WindDirection)
|
||||||
|
}
|
||||||
|
|
||||||
|
t.Logf("Successfully parsed %d weather points", len(points))
|
||||||
|
}
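The note in the test above about making the base URL configurable could be handled with a functional option. The sketch below is only one possible shape; `configurableClient`, `withBaseURL`, and the `baseURL` field are hypothetical and do not exist in this commit.

```go
// Hypothetical sketch: a wrapper that lets tests point the client at an
// httptest.Server without changing the exported constructor.
type clientOption func(*configurableClient)

type configurableClient struct {
	*OpenMeteoClient        // reuse the real client for parsing etc.
	baseURL          string // would replace the openMeteoBaseURL constant
}

func withBaseURL(u string) clientOption {
	return func(c *configurableClient) { c.baseURL = u }
}

func newConfigurableClient(cfg OpenMeteoConfig, opts ...clientOption) *configurableClient {
	c := &configurableClient{
		OpenMeteoClient: NewOpenMeteoClient(cfg),
		baseURL:         openMeteoBaseURL,
	}
	for _, opt := range opts {
		opt(c)
	}
	return c
}
```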
|
||||||
|
|
||||||
|
func TestOpenMeteoClient_GetWeatherForecast_ErrorHandling(t *testing.T) {
|
||||||
|
tests := []struct {
|
||||||
|
name string
|
||||||
|
statusCode int
|
||||||
|
responseBody string
|
||||||
|
expectedError string
|
||||||
|
}{
|
||||||
|
{
|
||||||
|
name: "API error",
|
||||||
|
statusCode: 500,
|
||||||
|
responseBody: `{"error": "Internal server error"}`,
|
||||||
|
expectedError: "API returned status 500",
|
||||||
|
},
|
||||||
|
{
|
||||||
|
name: "Invalid JSON",
|
||||||
|
statusCode: 200,
|
||||||
|
responseBody: `{invalid json}`,
|
||||||
|
expectedError: "failed to decode response",
|
||||||
|
},
|
||||||
|
{
|
||||||
|
name: "Not found",
|
||||||
|
statusCode: 404,
|
||||||
|
responseBody: `{"error": "Not found"}`,
|
||||||
|
expectedError: "API returned status 404",
|
||||||
|
},
|
||||||
|
}
|
||||||
|
|
||||||
|
for _, tt := range tests {
|
||||||
|
t.Run(tt.name, func(t *testing.T) {
|
||||||
|
server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
|
||||||
|
w.WriteHeader(tt.statusCode)
|
||||||
|
w.Write([]byte(tt.responseBody))
|
||||||
|
}))
|
||||||
|
defer server.Close()
|
||||||
|
|
||||||
|
client := &OpenMeteoClient{
|
||||||
|
httpClient: server.Client(),
|
||||||
|
latitude: 32.8893,
|
||||||
|
longitude: -117.2519,
|
||||||
|
timezone: "America/Los_Angeles",
|
||||||
|
}
|
||||||
|
|
||||||
|
// Make direct request to test server for error handling
|
||||||
|
ctx := context.Background()
|
||||||
|
req, err := http.NewRequestWithContext(ctx, http.MethodGet, server.URL, nil)
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("failed to create request: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
resp, err := client.httpClient.Do(req)
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("failed to execute request: %v", err)
|
||||||
|
}
|
||||||
|
defer resp.Body.Close()
|
||||||
|
|
||||||
|
if resp.StatusCode == 200 {
|
||||||
|
var apiResp openMeteoResponse
|
||||||
|
err = json.NewDecoder(resp.Body).Decode(&apiResp)
|
||||||
|
if err == nil {
|
||||||
|
t.Errorf("expected decode error for invalid JSON, got nil")
|
||||||
|
}
|
||||||
|
} else if resp.StatusCode != tt.statusCode {
|
||||||
|
t.Errorf("expected status code %d, got %d", tt.statusCode, resp.StatusCode)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestOpenMeteoClient_ContextCancellation(t *testing.T) {
|
||||||
|
// Create a server that delays response
|
||||||
|
server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
|
||||||
|
time.Sleep(2 * time.Second)
|
||||||
|
w.WriteHeader(http.StatusOK)
|
||||||
|
w.Write([]byte(`{}`))
|
||||||
|
}))
|
||||||
|
defer server.Close()
|
||||||
|
|
||||||
|
client := &OpenMeteoClient{
|
||||||
|
httpClient: server.Client(),
|
||||||
|
latitude: 32.8893,
|
||||||
|
longitude: -117.2519,
|
||||||
|
timezone: "America/Los_Angeles",
|
||||||
|
}
|
||||||
|
|
||||||
|
// Create context that cancels quickly
|
||||||
|
ctx, cancel := context.WithTimeout(context.Background(), 100*time.Millisecond)
|
||||||
|
defer cancel()
|
||||||
|
|
||||||
|
req, err := http.NewRequestWithContext(ctx, http.MethodGet, server.URL, nil)
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("failed to create request: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
_, err = client.httpClient.Do(req)
|
||||||
|
if err == nil {
|
||||||
|
t.Error("expected context cancellation error, got nil")
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestParseWeatherPoints_InconsistentData(t *testing.T) {
|
||||||
|
client := NewOpenMeteoClient(OpenMeteoConfig{
|
||||||
|
Latitude: 32.8893,
|
||||||
|
Longitude: -117.2519,
|
||||||
|
Timezone: "America/Los_Angeles",
|
||||||
|
})
|
||||||
|
|
||||||
|
// Test with inconsistent array lengths
|
||||||
|
resp := openMeteoResponse{
|
||||||
|
Latitude: 32.8893,
|
||||||
|
Longitude: -117.2519,
|
||||||
|
Timezone: "America/Los_Angeles",
|
||||||
|
}
|
||||||
|
resp.Hourly.Time = []string{"2026-01-01T00:00", "2026-01-01T01:00"}
|
||||||
|
resp.Hourly.WindSpeed10m = []float64{5.0} // Only 1 element
|
||||||
|
resp.Hourly.WindDir10m = []int{270, 280}
|
||||||
|
resp.Hourly.WindGusts10m = []float64{8.0, 9.0}
|
||||||
|
|
||||||
|
_, err := client.parseWeatherPoints(resp)
|
||||||
|
if err == nil {
|
||||||
|
t.Error("expected error for inconsistent data lengths, got nil")
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestParseWeatherPoints_EmptyData(t *testing.T) {
|
||||||
|
client := NewOpenMeteoClient(OpenMeteoConfig{
|
||||||
|
Latitude: 32.8893,
|
||||||
|
Longitude: -117.2519,
|
||||||
|
Timezone: "America/Los_Angeles",
|
||||||
|
})
|
||||||
|
|
||||||
|
resp := openMeteoResponse{
|
||||||
|
Latitude: 32.8893,
|
||||||
|
Longitude: -117.2519,
|
||||||
|
Timezone: "America/Los_Angeles",
|
||||||
|
}
|
||||||
|
|
||||||
|
_, err := client.parseWeatherPoints(resp)
|
||||||
|
if err == nil {
|
||||||
|
t.Error("expected error for empty data, got nil")
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestParseWeatherPoints_InvalidTime(t *testing.T) {
|
||||||
|
client := NewOpenMeteoClient(OpenMeteoConfig{
|
||||||
|
Latitude: 32.8893,
|
||||||
|
Longitude: -117.2519,
|
||||||
|
Timezone: "America/Los_Angeles",
|
||||||
|
})
|
||||||
|
|
||||||
|
resp := openMeteoResponse{
|
||||||
|
Latitude: 32.8893,
|
||||||
|
Longitude: -117.2519,
|
||||||
|
Timezone: "America/Los_Angeles",
|
||||||
|
}
|
||||||
|
resp.Hourly.Time = []string{"invalid-time"}
|
||||||
|
resp.Hourly.WindSpeed10m = []float64{5.0}
|
||||||
|
resp.Hourly.WindDir10m = []int{270}
|
||||||
|
resp.Hourly.WindGusts10m = []float64{8.0}
|
||||||
|
|
||||||
|
_, err := client.parseWeatherPoints(resp)
|
||||||
|
if err == nil {
|
||||||
|
t.Error("expected error for invalid time format, got nil")
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestNewOpenMeteoClient(t *testing.T) {
|
||||||
|
config := OpenMeteoConfig{
|
||||||
|
Latitude: 32.8893,
|
||||||
|
Longitude: -117.2519,
|
||||||
|
Timezone: "America/Los_Angeles",
|
||||||
|
}
|
||||||
|
|
||||||
|
client := NewOpenMeteoClient(config)
|
||||||
|
|
||||||
|
if client == nil {
|
||||||
|
t.Fatal("expected non-nil client")
|
||||||
|
}
|
||||||
|
if client.latitude != config.Latitude {
|
||||||
|
t.Errorf("expected latitude %f, got %f", config.Latitude, client.latitude)
|
||||||
|
}
|
||||||
|
if client.longitude != config.Longitude {
|
||||||
|
t.Errorf("expected longitude %f, got %f", config.Longitude, client.longitude)
|
||||||
|
}
|
||||||
|
if client.timezone != config.Timezone {
|
||||||
|
t.Errorf("expected timezone %s, got %s", config.Timezone, client.timezone)
|
||||||
|
}
|
||||||
|
if client.httpClient == nil {
|
||||||
|
t.Error("expected non-nil http client")
|
||||||
|
}
|
||||||
|
}
|
||||||
76
backend/internal/config/config.go
Normal file
@@ -0,0 +1,76 @@
|
|||||||
|
package config
|
||||||
|
|
||||||
|
import (
|
||||||
|
"fmt"
|
||||||
|
"time"
|
||||||
|
|
||||||
|
"github.com/kelseyhightower/envconfig"
|
||||||
|
)
|
||||||
|
|
||||||
|
// Config holds all application configuration
|
||||||
|
type Config struct {
|
||||||
|
// Database configuration
|
||||||
|
DatabaseURL string `envconfig:"DATABASE_URL" required:"true"`
|
||||||
|
|
||||||
|
// Server configuration
|
||||||
|
Port int `envconfig:"PORT" default:"8080"`
|
||||||
|
|
||||||
|
// Location configuration
|
||||||
|
LocationLat float64 `envconfig:"LOCATION_LAT" default:"37.7749"`
|
||||||
|
LocationLon float64 `envconfig:"LOCATION_LON" default:"-122.4194"`
|
||||||
|
LocationName string `envconfig:"LOCATION_NAME" default:"San Francisco"`
|
||||||
|
|
||||||
|
// Timezone configuration
|
||||||
|
Timezone string `envconfig:"TIMEZONE" default:"America/Los_Angeles"`
|
||||||
|
|
||||||
|
// Weather fetcher configuration
|
||||||
|
FetchInterval time.Duration `envconfig:"FETCH_INTERVAL" default:"15m"`
|
||||||
|
|
||||||
|
// Cache configuration
|
||||||
|
CacheTTL time.Duration `envconfig:"CACHE_TTL" default:"10m"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// Load reads configuration from environment variables
|
||||||
|
func Load() (*Config, error) {
|
||||||
|
var cfg Config
|
||||||
|
if err := envconfig.Process("", &cfg); err != nil {
|
||||||
|
return nil, fmt.Errorf("failed to process environment config: %w", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Validate configuration
|
||||||
|
if err := cfg.validate(); err != nil {
|
||||||
|
return nil, fmt.Errorf("invalid configuration: %w", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
return &cfg, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// validate checks that configuration values are valid
|
||||||
|
func (c *Config) validate() error {
|
||||||
|
if c.Port < 1 || c.Port > 65535 {
|
||||||
|
return fmt.Errorf("port must be between 1 and 65535, got %d", c.Port)
|
||||||
|
}
|
||||||
|
|
||||||
|
if c.LocationLat < -90 || c.LocationLat > 90 {
|
||||||
|
return fmt.Errorf("location latitude must be between -90 and 90, got %f", c.LocationLat)
|
||||||
|
}
|
||||||
|
|
||||||
|
if c.LocationLon < -180 || c.LocationLon > 180 {
|
||||||
|
return fmt.Errorf("location longitude must be between -180 and 180, got %f", c.LocationLon)
|
||||||
|
}
|
||||||
|
|
||||||
|
if c.FetchInterval < time.Minute {
|
||||||
|
return fmt.Errorf("fetch interval must be at least 1 minute, got %s", c.FetchInterval)
|
||||||
|
}
|
||||||
|
|
||||||
|
if c.CacheTTL < time.Second {
|
||||||
|
return fmt.Errorf("cache TTL must be at least 1 second, got %s", c.CacheTTL)
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// Addr returns the server address in host:port format
|
||||||
|
func (c *Config) Addr() string {
|
||||||
|
return fmt.Sprintf(":%d", c.Port)
|
||||||
|
}
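A short sketch of how this configuration is consumed at startup. The connection string is a placeholder; only `DATABASE_URL` is required, and every other variable falls back to the defaults in the struct tags above.

```go
package main

import (
	"log"
	"os"

	"github.com/scottyah/paragliding/internal/config"
)

func main() {
	// Placeholder DSN; in practice this comes from the environment or a k8s secret.
	os.Setenv("DATABASE_URL", "postgres://postgres:postgres@localhost:5432/paragliding?sslmode=disable")

	cfg, err := config.Load()
	if err != nil {
		log.Fatalf("load config: %v", err)
	}

	log.Printf("listening on %s for %s, fetching weather every %s",
		cfg.Addr(), cfg.LocationName, cfg.FetchInterval)
}
```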
|
||||||
52
backend/internal/database/migrate.go
Normal file
@@ -0,0 +1,52 @@
|
|||||||
|
package database
|
||||||
|
|
||||||
|
import (
|
||||||
|
"embed"
|
||||||
|
"errors"
|
||||||
|
"fmt"
|
||||||
|
"log/slog"
|
||||||
|
|
||||||
|
"github.com/golang-migrate/migrate/v4"
|
||||||
|
_ "github.com/golang-migrate/migrate/v4/database/postgres"
|
||||||
|
"github.com/golang-migrate/migrate/v4/source/iofs"
|
||||||
|
)
|
||||||
|
|
||||||
|
//go:embed migrations/*.sql
|
||||||
|
var migrationsFS embed.FS
|
||||||
|
|
||||||
|
// RunMigrations runs all pending database migrations
|
||||||
|
func RunMigrations(databaseURL string, logger *slog.Logger) error {
|
||||||
|
logger.Info("running database migrations")
|
||||||
|
|
||||||
|
// Create source driver from embedded files
|
||||||
|
source, err := iofs.New(migrationsFS, "migrations")
|
||||||
|
if err != nil {
|
||||||
|
return fmt.Errorf("failed to create migration source: %w", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Create migrate instance
|
||||||
|
m, err := migrate.NewWithSourceInstance("iofs", source, databaseURL)
|
||||||
|
if err != nil {
|
||||||
|
return fmt.Errorf("failed to create migrate instance: %w", err)
|
||||||
|
}
|
||||||
|
defer m.Close()
|
||||||
|
|
||||||
|
// Run migrations
|
||||||
|
if err := m.Up(); err != nil {
|
||||||
|
if errors.Is(err, migrate.ErrNoChange) {
|
||||||
|
logger.Info("no new migrations to apply")
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
return fmt.Errorf("failed to run migrations: %w", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Get current version
|
||||||
|
version, dirty, err := m.Version()
|
||||||
|
if err != nil && !errors.Is(err, migrate.ErrNilVersion) {
|
||||||
|
logger.Warn("failed to get migration version", "error", err)
|
||||||
|
} else {
|
||||||
|
logger.Info("migrations complete", "version", version, "dirty", dirty)
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil
|
||||||
|
}
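A sketch of running the embedded migrations before the server accepts traffic, using the `RunMigrations` signature above; the JSON logger choice is an assumption.

```go
package main

import (
	"log/slog"
	"os"

	"github.com/scottyah/paragliding/internal/database"
)

func main() {
	logger := slog.New(slog.NewJSONHandler(os.Stdout, nil))

	// Reuses the same DATABASE_URL the application connects with.
	if err := database.RunMigrations(os.Getenv("DATABASE_URL"), logger); err != nil {
		logger.Error("migrations failed", "error", err)
		os.Exit(1)
	}
	logger.Info("database ready")
}
```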
|
||||||
@@ -0,0 +1 @@
|
|||||||
|
DROP TABLE IF EXISTS weather_observations;
|
||||||
@@ -0,0 +1,12 @@
|
|||||||
|
CREATE TABLE weather_observations (
|
||||||
|
id BIGSERIAL PRIMARY KEY,
|
||||||
|
observed_at TIMESTAMPTZ NOT NULL,
|
||||||
|
wind_speed_mph DECIMAL(5,2) NOT NULL,
|
||||||
|
wind_direction INTEGER NOT NULL CHECK (wind_direction >= 0 AND wind_direction < 360),
|
||||||
|
wind_gust_mph DECIMAL(5,2),
|
||||||
|
source VARCHAR(50) DEFAULT 'open-meteo',
|
||||||
|
created_at TIMESTAMPTZ DEFAULT NOW(),
|
||||||
|
UNIQUE (observed_at, source)
|
||||||
|
);
|
||||||
|
|
||||||
|
CREATE INDEX idx_weather_time ON weather_observations (observed_at DESC);
|
||||||
45
backend/internal/model/weather.go
Normal file
@@ -0,0 +1,45 @@
|
|||||||
|
package model
|
||||||
|
|
||||||
|
import "time"
|
||||||
|
|
||||||
|
// WeatherPoint represents weather conditions at a specific point in time
|
||||||
|
type WeatherPoint struct {
|
||||||
|
Time time.Time
|
||||||
|
WindSpeedMPH float64
|
||||||
|
WindDirection int // 0-359 degrees
|
||||||
|
WindGustMPH float64
|
||||||
|
}
|
||||||
|
|
||||||
|
// Thresholds defines the criteria for flyable conditions
|
||||||
|
type Thresholds struct {
|
||||||
|
SpeedMin float64 `json:"speedMin"` // default 7 mph
|
||||||
|
SpeedMax float64 `json:"speedMax"` // default 14 mph
|
||||||
|
DirCenter int `json:"dirCenter"` // default 270 (West)
|
||||||
|
DirRange int `json:"dirRange"` // default 15 degrees
|
||||||
|
}
|
||||||
|
|
||||||
|
// DefaultThresholds returns the default threshold values
|
||||||
|
func DefaultThresholds() Thresholds {
|
||||||
|
return Thresholds{
|
||||||
|
SpeedMin: 7,
|
||||||
|
SpeedMax: 14,
|
||||||
|
DirCenter: 270,
|
||||||
|
DirRange: 15,
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Assessment contains the analysis of weather conditions for paragliding
|
||||||
|
type Assessment struct {
|
||||||
|
Status string // "GOOD" or "BAD"
|
||||||
|
Reason string // explanation of the status
|
||||||
|
FlyableNow bool // whether conditions are currently flyable
|
||||||
|
BestWindow *FlyableWindow // the best flyable window (if any)
|
||||||
|
AllWindows []FlyableWindow // all flyable windows found
|
||||||
|
}
|
||||||
|
|
||||||
|
// FlyableWindow represents a continuous period of flyable conditions
|
||||||
|
type FlyableWindow struct {
|
||||||
|
Start time.Time
|
||||||
|
End time.Time
|
||||||
|
Duration time.Duration
|
||||||
|
}
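The `json` tags on `Thresholds` suggest it travels over the API (there is a POST /assess route further below). A small sketch of decoding a custom threshold set on top of the defaults; the sample payload is illustrative.

```go
package main

import (
	"encoding/json"
	"fmt"
	"log"

	"github.com/scottyah/paragliding/internal/model"
)

func main() {
	// Same shape as DefaultThresholds, but with a wider direction window.
	raw := []byte(`{"speedMin": 7, "speedMax": 14, "dirCenter": 270, "dirRange": 30}`)

	t := model.DefaultThresholds()
	if err := json.Unmarshal(raw, &t); err != nil {
		log.Fatalf("decode thresholds: %v", err)
	}
	fmt.Printf("%+v\n", t) // {SpeedMin:7 SpeedMax:14 DirCenter:270 DirRange:30}
}
```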
|
||||||
453
backend/internal/repository/repository_test.go
Normal file
@@ -0,0 +1,453 @@
|
|||||||
|
package repository
|
||||||
|
|
||||||
|
import (
|
||||||
|
"context"
|
||||||
|
"fmt"
|
||||||
|
"os"
|
||||||
|
"testing"
|
||||||
|
"time"
|
||||||
|
|
||||||
|
"github.com/jackc/pgx/v5/pgxpool"
|
||||||
|
"github.com/scottyah/paragliding/internal/model"
|
||||||
|
)
|
||||||
|
|
||||||
|
var testPool *pgxpool.Pool
|
||||||
|
|
||||||
|
// TestMain sets up the test database connection
|
||||||
|
func TestMain(m *testing.M) {
|
||||||
|
ctx := context.Background()
|
||||||
|
|
||||||
|
// Get database URL from environment
|
||||||
|
dbURL := os.Getenv("DATABASE_URL")
|
||||||
|
if dbURL == "" {
|
||||||
|
dbURL = "postgres://postgres:postgres@localhost:5432/paragliding_test?sslmode=disable"
|
||||||
|
}
|
||||||
|
|
||||||
|
// Create connection pool
|
||||||
|
pool, err := pgxpool.New(ctx, dbURL)
|
||||||
|
if err != nil {
|
||||||
|
fmt.Fprintf(os.Stderr, "Unable to create connection pool: %v\n", err)
|
||||||
|
os.Exit(1)
|
||||||
|
}
|
||||||
|
|
||||||
|
testPool = pool
|
||||||
|
|
||||||
|
// Run migrations
|
||||||
|
if err := runMigrations(ctx, pool); err != nil {
|
||||||
|
fmt.Fprintf(os.Stderr, "Unable to run migrations: %v\n", err)
|
||||||
|
pool.Close()
|
||||||
|
os.Exit(1)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Run tests
|
||||||
|
code := m.Run()
|
||||||
|
|
||||||
|
// Cleanup
|
||||||
|
pool.Close()
|
||||||
|
os.Exit(code)
|
||||||
|
}
|
||||||
|
|
||||||
|
// runMigrations applies the database schema for testing
|
||||||
|
func runMigrations(ctx context.Context, pool *pgxpool.Pool) error {
|
||||||
|
// Drop existing table if it exists
|
||||||
|
_, err := pool.Exec(ctx, "DROP TABLE IF EXISTS weather_observations")
|
||||||
|
if err != nil {
|
||||||
|
return fmt.Errorf("failed to drop table: %w", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Create table
|
||||||
|
createTableSQL := `
|
||||||
|
CREATE TABLE weather_observations (
|
||||||
|
id BIGSERIAL PRIMARY KEY,
|
||||||
|
observed_at TIMESTAMPTZ NOT NULL,
|
||||||
|
wind_speed_mph DECIMAL(5,2) NOT NULL,
|
||||||
|
wind_direction INTEGER NOT NULL CHECK (wind_direction >= 0 AND wind_direction < 360),
|
||||||
|
wind_gust_mph DECIMAL(5,2),
|
||||||
|
source VARCHAR(50) DEFAULT 'open-meteo',
|
||||||
|
created_at TIMESTAMPTZ DEFAULT NOW(),
|
||||||
|
UNIQUE (observed_at, source)
|
||||||
|
);
|
||||||
|
|
||||||
|
CREATE INDEX idx_weather_time ON weather_observations (observed_at DESC);
|
||||||
|
`
|
||||||
|
|
||||||
|
_, err = pool.Exec(ctx, createTableSQL)
|
||||||
|
if err != nil {
|
||||||
|
return fmt.Errorf("failed to create table: %w", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// cleanupDatabase removes all data from the weather_observations table
|
||||||
|
func cleanupDatabase(ctx context.Context, pool *pgxpool.Pool) error {
|
||||||
|
_, err := pool.Exec(ctx, "DELETE FROM weather_observations")
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestWeatherRepository_SaveObservations(t *testing.T) {
|
||||||
|
ctx := context.Background()
|
||||||
|
repo := NewWeatherRepository(testPool)
|
||||||
|
|
||||||
|
// Cleanup before test
|
||||||
|
if err := cleanupDatabase(ctx, testPool); err != nil {
|
||||||
|
t.Fatalf("Failed to cleanup database: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
t.Run("save single observation", func(t *testing.T) {
|
||||||
|
defer cleanupDatabase(ctx, testPool)
|
||||||
|
|
||||||
|
now := time.Now().UTC().Truncate(time.Second)
|
||||||
|
observations := []model.WeatherPoint{
|
||||||
|
{
|
||||||
|
Time: now,
|
||||||
|
WindSpeedMPH: 10.5,
|
||||||
|
WindDirection: 270,
|
||||||
|
WindGustMPH: 15.2,
|
||||||
|
},
|
||||||
|
}
|
||||||
|
|
||||||
|
err := repo.SaveObservations(ctx, observations)
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("Failed to save observations: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Verify data was saved
|
||||||
|
var count int
|
||||||
|
err = testPool.QueryRow(ctx, "SELECT COUNT(*) FROM weather_observations").Scan(&count)
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("Failed to query count: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
if count != 1 {
|
||||||
|
t.Errorf("Expected 1 observation, got %d", count)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("save multiple observations", func(t *testing.T) {
|
||||||
|
defer cleanupDatabase(ctx, testPool)
|
||||||
|
|
||||||
|
now := time.Now().UTC().Truncate(time.Second)
|
||||||
|
observations := []model.WeatherPoint{
|
||||||
|
{
|
||||||
|
Time: now,
|
||||||
|
WindSpeedMPH: 10.5,
|
||||||
|
WindDirection: 270,
|
||||||
|
WindGustMPH: 15.2,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
Time: now.Add(time.Hour),
|
||||||
|
WindSpeedMPH: 12.0,
|
||||||
|
WindDirection: 280,
|
||||||
|
WindGustMPH: 16.5,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
Time: now.Add(2 * time.Hour),
|
||||||
|
WindSpeedMPH: 8.5,
|
||||||
|
WindDirection: 260,
|
||||||
|
WindGustMPH: 12.0,
|
||||||
|
},
|
||||||
|
}
|
||||||
|
|
||||||
|
err := repo.SaveObservations(ctx, observations)
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("Failed to save observations: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Verify count
|
||||||
|
var count int
|
||||||
|
err = testPool.QueryRow(ctx, "SELECT COUNT(*) FROM weather_observations").Scan(&count)
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("Failed to query count: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
if count != 3 {
|
||||||
|
t.Errorf("Expected 3 observations, got %d", count)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("upsert on conflict", func(t *testing.T) {
|
||||||
|
defer cleanupDatabase(ctx, testPool)
|
||||||
|
|
||||||
|
now := time.Now().UTC().Truncate(time.Second)
|
||||||
|
observations := []model.WeatherPoint{
|
||||||
|
{
|
||||||
|
Time: now,
|
||||||
|
WindSpeedMPH: 10.5,
|
||||||
|
WindDirection: 270,
|
||||||
|
WindGustMPH: 15.2,
|
||||||
|
},
|
||||||
|
}
|
||||||
|
|
||||||
|
// Insert first time
|
||||||
|
err := repo.SaveObservations(ctx, observations)
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("Failed to save observations: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Update with different values
|
||||||
|
observations[0].WindSpeedMPH = 11.0
|
||||||
|
observations[0].WindDirection = 275
|
||||||
|
observations[0].WindGustMPH = 16.0
|
||||||
|
|
||||||
|
err = repo.SaveObservations(ctx, observations)
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("Failed to update observations: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Verify still only one record
|
||||||
|
var count int
|
||||||
|
err = testPool.QueryRow(ctx, "SELECT COUNT(*) FROM weather_observations").Scan(&count)
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("Failed to query count: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
if count != 1 {
|
||||||
|
t.Errorf("Expected 1 observation after upsert, got %d", count)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Verify values were updated
|
||||||
|
var windSpeed float64
|
||||||
|
err = testPool.QueryRow(ctx, "SELECT wind_speed_mph FROM weather_observations WHERE observed_at = $1", now).Scan(&windSpeed)
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("Failed to query wind speed: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
if windSpeed != 11.0 {
|
||||||
|
t.Errorf("Expected wind speed 11.0, got %f", windSpeed)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("save empty slice", func(t *testing.T) {
|
||||||
|
defer cleanupDatabase(ctx, testPool)
|
||||||
|
|
||||||
|
err := repo.SaveObservations(ctx, []model.WeatherPoint{})
|
||||||
|
if err != nil {
|
||||||
|
t.Errorf("Expected no error for empty slice, got %v", err)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestWeatherRepository_GetForecast(t *testing.T) {
|
||||||
|
ctx := context.Background()
|
||||||
|
repo := NewWeatherRepository(testPool)
|
||||||
|
|
||||||
|
// Cleanup and setup test data
|
||||||
|
if err := cleanupDatabase(ctx, testPool); err != nil {
|
||||||
|
t.Fatalf("Failed to cleanup database: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
now := time.Now().UTC().Truncate(time.Second)
|
||||||
|
observations := []model.WeatherPoint{
|
||||||
|
{
|
||||||
|
Time: now.Add(-2 * time.Hour), // Before range
|
||||||
|
WindSpeedMPH: 8.0,
|
||||||
|
WindDirection: 250,
|
||||||
|
WindGustMPH: 12.0,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
Time: now,
|
||||||
|
WindSpeedMPH: 10.5,
|
||||||
|
WindDirection: 270,
|
||||||
|
WindGustMPH: 15.2,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
Time: now.Add(time.Hour),
|
||||||
|
WindSpeedMPH: 12.0,
|
||||||
|
WindDirection: 280,
|
||||||
|
WindGustMPH: 16.5,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
Time: now.Add(2 * time.Hour),
|
||||||
|
WindSpeedMPH: 8.5,
|
||||||
|
WindDirection: 260,
|
||||||
|
WindGustMPH: 12.0,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
Time: now.Add(4 * time.Hour), // After range
|
||||||
|
WindSpeedMPH: 9.0,
|
||||||
|
WindDirection: 255,
|
||||||
|
WindGustMPH: 13.0,
|
||||||
|
},
|
||||||
|
}
|
||||||
|
|
||||||
|
if err := repo.SaveObservations(ctx, observations); err != nil {
|
||||||
|
t.Fatalf("Failed to save observations: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
t.Run("get forecast in range", func(t *testing.T) {
|
||||||
|
start := now
|
||||||
|
end := now.Add(3 * time.Hour)
|
||||||
|
|
||||||
|
result, err := repo.GetForecast(ctx, start, end)
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("Failed to get forecast: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
if len(result) != 3 {
|
||||||
|
t.Errorf("Expected 3 observations in range, got %d", len(result))
|
||||||
|
}
|
||||||
|
|
||||||
|
// Verify ordering (ascending)
|
||||||
|
for i := 0; i < len(result)-1; i++ {
|
||||||
|
if result[i].Time.After(result[i+1].Time) {
|
||||||
|
t.Errorf("Results not ordered correctly: %v > %v", result[i].Time, result[i+1].Time)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Verify first result
|
||||||
|
if !result[0].Time.Equal(now) {
|
||||||
|
t.Errorf("Expected first result at %v, got %v", now, result[0].Time)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("get forecast with no results", func(t *testing.T) {
|
||||||
|
start := now.Add(10 * time.Hour)
|
||||||
|
end := now.Add(20 * time.Hour)
|
||||||
|
|
||||||
|
result, err := repo.GetForecast(ctx, start, end)
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("Failed to get forecast: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
if len(result) != 0 {
|
||||||
|
t.Errorf("Expected 0 observations, got %d", len(result))
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
// Cleanup
|
||||||
|
cleanupDatabase(ctx, testPool)
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestWeatherRepository_GetHistorical(t *testing.T) {
|
||||||
|
ctx := context.Background()
|
||||||
|
repo := NewWeatherRepository(testPool)
|
||||||
|
|
||||||
|
// Cleanup and setup test data
|
||||||
|
if err := cleanupDatabase(ctx, testPool); err != nil {
|
||||||
|
t.Fatalf("Failed to cleanup database: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Use a specific date for testing
|
||||||
|
loc := time.UTC
|
||||||
|
targetDate := time.Date(2024, 1, 15, 0, 0, 0, 0, loc)
|
||||||
|
|
||||||
|
observations := []model.WeatherPoint{
|
||||||
|
{
|
||||||
|
Time: time.Date(2024, 1, 14, 23, 0, 0, 0, loc), // Day before
|
||||||
|
WindSpeedMPH: 8.0,
|
||||||
|
WindDirection: 250,
|
||||||
|
WindGustMPH: 12.0,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
Time: time.Date(2024, 1, 15, 0, 0, 0, 0, loc), // Start of day
|
||||||
|
WindSpeedMPH: 10.5,
|
||||||
|
WindDirection: 270,
|
||||||
|
WindGustMPH: 15.2,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
Time: time.Date(2024, 1, 15, 12, 0, 0, 0, loc), // Middle of day
|
||||||
|
WindSpeedMPH: 12.0,
|
||||||
|
WindDirection: 280,
|
||||||
|
WindGustMPH: 16.5,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
Time: time.Date(2024, 1, 15, 23, 59, 59, 0, loc), // End of day
|
||||||
|
WindSpeedMPH: 8.5,
|
||||||
|
WindDirection: 260,
|
||||||
|
WindGustMPH: 12.0,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
Time: time.Date(2024, 1, 16, 0, 0, 0, 0, loc), // Next day
|
||||||
|
WindSpeedMPH: 9.0,
|
||||||
|
WindDirection: 255,
|
||||||
|
WindGustMPH: 13.0,
|
||||||
|
},
|
||||||
|
}
|
||||||
|
|
||||||
|
if err := repo.SaveObservations(ctx, observations); err != nil {
|
||||||
|
t.Fatalf("Failed to save observations: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
t.Run("get historical for specific day", func(t *testing.T) {
|
||||||
|
result, err := repo.GetHistorical(ctx, targetDate)
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("Failed to get historical data: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
if len(result) != 3 {
|
||||||
|
t.Errorf("Expected 3 observations for the day, got %d", len(result))
|
||||||
|
}
|
||||||
|
|
||||||
|
// Verify all results are from the target day
|
||||||
|
for _, obs := range result {
|
||||||
|
if obs.Time.Year() != targetDate.Year() ||
|
||||||
|
obs.Time.Month() != targetDate.Month() ||
|
||||||
|
obs.Time.Day() != targetDate.Day() {
|
||||||
|
t.Errorf("Observation %v is not from target date %v", obs.Time, targetDate)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Verify ordering (ascending)
|
||||||
|
for i := 0; i < len(result)-1; i++ {
|
||||||
|
if result[i].Time.After(result[i+1].Time) {
|
||||||
|
t.Errorf("Results not ordered correctly: %v > %v", result[i].Time, result[i+1].Time)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("get historical for day with no data", func(t *testing.T) {
|
||||||
|
emptyDate := time.Date(2024, 2, 1, 0, 0, 0, 0, loc)
|
||||||
|
result, err := repo.GetHistorical(ctx, emptyDate)
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("Failed to get historical data: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
if len(result) != 0 {
|
||||||
|
t.Errorf("Expected 0 observations, got %d", len(result))
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
// Cleanup
|
||||||
|
cleanupDatabase(ctx, testPool)
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestWeatherRepository_ContextCancellation(t *testing.T) {
|
||||||
|
repo := NewWeatherRepository(testPool)
|
||||||
|
|
||||||
|
// Cleanup
|
||||||
|
if err := cleanupDatabase(context.Background(), testPool); err != nil {
|
||||||
|
t.Fatalf("Failed to cleanup database: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
t.Run("save with cancelled context", func(t *testing.T) {
|
||||||
|
ctx, cancel := context.WithCancel(context.Background())
|
||||||
|
cancel() // Cancel immediately
|
||||||
|
|
||||||
|
now := time.Now().UTC().Truncate(time.Second)
|
||||||
|
observations := []model.WeatherPoint{
|
||||||
|
{
|
||||||
|
Time: now,
|
||||||
|
WindSpeedMPH: 10.5,
|
||||||
|
WindDirection: 270,
|
||||||
|
WindGustMPH: 15.2,
|
||||||
|
},
|
||||||
|
}
|
||||||
|
|
||||||
|
err := repo.SaveObservations(ctx, observations)
|
||||||
|
if err == nil {
|
||||||
|
t.Error("Expected error with cancelled context, got nil")
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("get forecast with cancelled context", func(t *testing.T) {
|
||||||
|
ctx, cancel := context.WithCancel(context.Background())
|
||||||
|
cancel() // Cancel immediately
|
||||||
|
|
||||||
|
now := time.Now().UTC()
|
||||||
|
_, err := repo.GetForecast(ctx, now, now.Add(time.Hour))
|
||||||
|
if err == nil {
|
||||||
|
t.Error("Expected error with cancelled context, got nil")
|
||||||
|
}
|
||||||
|
})
|
||||||
|
}
|
||||||
148
backend/internal/repository/weather.go
Normal file
@@ -0,0 +1,148 @@
|
|||||||
|
package repository
|
||||||
|
|
||||||
|
import (
|
||||||
|
"context"
|
||||||
|
"fmt"
|
||||||
|
"time"
|
||||||
|
|
||||||
|
"github.com/jackc/pgx/v5"
|
||||||
|
"github.com/jackc/pgx/v5/pgxpool"
|
||||||
|
"github.com/scottyah/paragliding/internal/model"
|
||||||
|
)
|
||||||
|
|
||||||
|
// WeatherRepository handles database operations for weather observations
|
||||||
|
type WeatherRepository struct {
|
||||||
|
pool *pgxpool.Pool
|
||||||
|
}
|
||||||
|
|
||||||
|
// NewWeatherRepository creates a new weather repository
|
||||||
|
func NewWeatherRepository(pool *pgxpool.Pool) *WeatherRepository {
|
||||||
|
return &WeatherRepository{
|
||||||
|
pool: pool,
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// SaveObservations performs a bulk upsert of weather observations
|
||||||
|
// Uses batch inserts for efficiency and ON CONFLICT to handle duplicates
|
||||||
|
func (r *WeatherRepository) SaveObservations(ctx context.Context, observations []model.WeatherPoint) error {
|
||||||
|
if len(observations) == 0 {
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// Use a batch for efficient bulk insert
|
||||||
|
batch := &pgx.Batch{}
|
||||||
|
|
||||||
|
query := `
|
||||||
|
INSERT INTO weather_observations (observed_at, wind_speed_mph, wind_direction, wind_gust_mph, source)
|
||||||
|
VALUES ($1, $2, $3, $4, $5)
|
||||||
|
ON CONFLICT (observed_at, source)
|
||||||
|
DO UPDATE SET
|
||||||
|
wind_speed_mph = EXCLUDED.wind_speed_mph,
|
||||||
|
wind_direction = EXCLUDED.wind_direction,
|
||||||
|
wind_gust_mph = EXCLUDED.wind_gust_mph,
|
||||||
|
created_at = NOW()
|
||||||
|
`
|
||||||
|
|
||||||
|
for _, obs := range observations {
|
||||||
|
// Normalize wind direction to 0-359 range
|
||||||
|
windDir := obs.WindDirection
|
||||||
|
for windDir < 0 {
|
||||||
|
windDir += 360
|
||||||
|
}
|
||||||
|
for windDir >= 360 {
|
||||||
|
windDir -= 360
|
||||||
|
}
|
||||||
|
batch.Queue(query, obs.Time, obs.WindSpeedMPH, windDir, obs.WindGustMPH, "open-meteo")
|
||||||
|
}
|
||||||
|
|
||||||
|
// Execute the batch
|
||||||
|
br := r.pool.SendBatch(ctx, batch)
|
||||||
|
defer br.Close()
|
||||||
|
|
||||||
|
// Process all batch results to ensure they complete
|
||||||
|
for i := 0; i < len(observations); i++ {
|
||||||
|
_, err := br.Exec()
|
||||||
|
if err != nil {
|
||||||
|
return fmt.Errorf("failed to save observation %d: %w", i, err)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil
|
||||||
|
}
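Usage sketch for the upsert above: saving a handful of points through the repository. The pool setup mirrors the test helper in this package, the connection string is a placeholder, and re-running the save with the same `observed_at` values updates rows instead of duplicating them.

```go
package main

import (
	"context"
	"log"
	"os"
	"time"

	"github.com/jackc/pgx/v5/pgxpool"
	"github.com/scottyah/paragliding/internal/model"
	"github.com/scottyah/paragliding/internal/repository"
)

func main() {
	ctx := context.Background()

	pool, err := pgxpool.New(ctx, os.Getenv("DATABASE_URL"))
	if err != nil {
		log.Fatalf("connect: %v", err)
	}
	defer pool.Close()

	repo := repository.NewWeatherRepository(pool)

	points := []model.WeatherPoint{
		{Time: time.Now().UTC().Truncate(time.Hour), WindSpeedMPH: 9.5, WindDirection: 270, WindGustMPH: 13.0},
	}
	// Safe to call repeatedly: the ON CONFLICT clause turns duplicates into updates.
	if err := repo.SaveObservations(ctx, points); err != nil {
		log.Fatalf("save observations: %v", err)
	}
}
```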
|
||||||
|
|
||||||
|
// GetForecast retrieves weather observations within a time range
|
||||||
|
// Results are ordered by time ascending for forecast display
|
||||||
|
func (r *WeatherRepository) GetForecast(ctx context.Context, start, end time.Time) ([]model.WeatherPoint, error) {
|
||||||
|
query := `
|
||||||
|
SELECT observed_at, wind_speed_mph, wind_direction, wind_gust_mph
|
||||||
|
FROM weather_observations
|
||||||
|
WHERE observed_at >= $1 AND observed_at <= $2
|
||||||
|
ORDER BY observed_at ASC
|
||||||
|
`
|
||||||
|
|
||||||
|
rows, err := r.pool.Query(ctx, query, start, end)
|
||||||
|
if err != nil {
|
||||||
|
return nil, fmt.Errorf("failed to query forecast: %w", err)
|
||||||
|
}
|
||||||
|
defer rows.Close()
|
||||||
|
|
||||||
|
var observations []model.WeatherPoint
|
||||||
|
|
||||||
|
for rows.Next() {
|
||||||
|
var obs model.WeatherPoint
|
||||||
|
err := rows.Scan(&obs.Time, &obs.WindSpeedMPH, &obs.WindDirection, &obs.WindGustMPH)
|
||||||
|
if err != nil {
|
||||||
|
return nil, fmt.Errorf("failed to scan observation: %w", err)
|
||||||
|
}
|
||||||
|
observations = append(observations, obs)
|
||||||
|
}
|
||||||
|
|
||||||
|
if err := rows.Err(); err != nil {
|
||||||
|
return nil, fmt.Errorf("error iterating rows: %w", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
return observations, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// GetHistorical retrieves all weather observations for a specific day
|
||||||
|
// Returns data for the entire day in the system's timezone
|
||||||
|
func (r *WeatherRepository) GetHistorical(ctx context.Context, date time.Time) ([]model.WeatherPoint, error) {
|
||||||
|
// Get start and end of the day
|
||||||
|
start := time.Date(date.Year(), date.Month(), date.Day(), 0, 0, 0, 0, date.Location())
|
||||||
|
end := start.Add(24 * time.Hour)
|
||||||
|
|
||||||
|
query := `
|
||||||
|
SELECT observed_at, wind_speed_mph, wind_direction, wind_gust_mph
|
||||||
|
FROM weather_observations
|
||||||
|
WHERE observed_at >= $1 AND observed_at < $2
|
||||||
|
ORDER BY observed_at ASC
|
||||||
|
`
|
||||||
|
|
||||||
|
rows, err := r.pool.Query(ctx, query, start, end)
|
||||||
|
if err != nil {
|
||||||
|
return nil, fmt.Errorf("failed to query historical data: %w", err)
|
||||||
|
}
|
||||||
|
defer rows.Close()
|
||||||
|
|
||||||
|
var observations []model.WeatherPoint
|
||||||
|
|
||||||
|
for rows.Next() {
|
||||||
|
var obs model.WeatherPoint
|
||||||
|
err := rows.Scan(&obs.Time, &obs.WindSpeedMPH, &obs.WindDirection, &obs.WindGustMPH)
|
||||||
|
if err != nil {
|
||||||
|
return nil, fmt.Errorf("failed to scan observation: %w", err)
|
||||||
|
}
|
||||||
|
observations = append(observations, obs)
|
||||||
|
}
|
||||||
|
|
||||||
|
if err := rows.Err(); err != nil {
|
||||||
|
return nil, fmt.Errorf("error iterating rows: %w", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
return observations, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// Close closes the database pool
|
||||||
|
func (r *WeatherRepository) Close() {
|
||||||
|
r.pool.Close()
|
||||||
|
}
|
||||||
84
backend/internal/server/ratelimit.go
Normal file
@@ -0,0 +1,84 @@
|
|||||||
|
package server
|
||||||
|
|
||||||
|
import (
|
||||||
|
"net/http"
|
||||||
|
"sync"
|
||||||
|
"time"
|
||||||
|
|
||||||
|
"golang.org/x/time/rate"
|
||||||
|
)
|
||||||
|
|
||||||
|
// RateLimiter provides per-IP rate limiting
|
||||||
|
type RateLimiter struct {
|
||||||
|
limiters map[string]*rate.Limiter
|
||||||
|
mu sync.RWMutex
|
||||||
|
rate rate.Limit
|
||||||
|
burst int
|
||||||
|
}
|
||||||
|
|
||||||
|
// NewRateLimiter creates a new rate limiter
|
||||||
|
// rate is requests per second, burst is max burst size
|
||||||
|
func NewRateLimiter(r float64, burst int) *RateLimiter {
|
||||||
|
return &RateLimiter{
|
||||||
|
limiters: make(map[string]*rate.Limiter),
|
||||||
|
rate: rate.Limit(r),
|
||||||
|
burst: burst,
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// getLimiter returns the rate limiter for a given IP, creating one if needed
|
||||||
|
func (rl *RateLimiter) getLimiter(ip string) *rate.Limiter {
|
||||||
|
rl.mu.RLock()
|
||||||
|
limiter, exists := rl.limiters[ip]
|
||||||
|
rl.mu.RUnlock()
|
||||||
|
|
||||||
|
if exists {
|
||||||
|
return limiter
|
||||||
|
}
|
||||||
|
|
||||||
|
rl.mu.Lock()
|
||||||
|
defer rl.mu.Unlock()
|
||||||
|
|
||||||
|
// Double-check after acquiring write lock
|
||||||
|
if limiter, exists = rl.limiters[ip]; exists {
|
||||||
|
return limiter
|
||||||
|
}
|
||||||
|
|
||||||
|
limiter = rate.NewLimiter(rl.rate, rl.burst)
|
||||||
|
rl.limiters[ip] = limiter
|
||||||
|
return limiter
|
||||||
|
}
|
||||||
|
|
||||||
|
// Middleware returns a middleware handler for rate limiting
|
||||||
|
func (rl *RateLimiter) Middleware(next http.Handler) http.Handler {
|
||||||
|
return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
|
||||||
|
// Get client IP (chi's RealIP middleware should have set this)
|
||||||
|
ip := r.RemoteAddr
|
||||||
|
|
||||||
|
limiter := rl.getLimiter(ip)
|
||||||
|
if !limiter.Allow() {
|
||||||
|
w.Header().Set("Retry-After", "1")
|
||||||
|
http.Error(w, "rate limit exceeded", http.StatusTooManyRequests)
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
next.ServeHTTP(w, r)
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
// CleanupOldEntries removes stale IP entries periodically
|
||||||
|
// Call this in a goroutine to prevent memory growth
|
||||||
|
func (rl *RateLimiter) CleanupOldEntries(interval time.Duration, maxAge time.Duration) {
|
||||||
|
ticker := time.NewTicker(interval)
|
||||||
|
defer ticker.Stop()
|
||||||
|
|
||||||
|
for range ticker.C {
|
||||||
|
rl.mu.Lock()
|
||||||
|
// Simple cleanup: just reset the map periodically
|
||||||
|
// In a more sophisticated implementation, you'd track last access time
|
||||||
|
if len(rl.limiters) > 10000 {
|
||||||
|
rl.limiters = make(map[string]*rate.Limiter)
|
||||||
|
}
|
||||||
|
rl.mu.Unlock()
|
||||||
|
}
|
||||||
|
}
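How this limiter is meant to be wired, sketched as a small helper in the same package. The 10 req/s and burst of 30 match what `server.go` below configures; note that the `maxAge` argument of `CleanupOldEntries` is accepted but not yet used by the simple reset strategy above.

```go
package server

import (
	"net/http"
	"time"

	"github.com/go-chi/chi/v5"
)

// exampleRateLimitedRouter is a sketch, not part of this commit: it attaches
// the per-IP limiter to a chi router and runs the cleanup loop in the background.
func exampleRateLimitedRouter() http.Handler {
	rl := NewRateLimiter(10, 30)
	go rl.CleanupOldEntries(10*time.Minute, time.Hour)

	r := chi.NewRouter()
	r.Use(rl.Middleware)
	r.Get("/api/health", func(w http.ResponseWriter, _ *http.Request) {
		w.WriteHeader(http.StatusOK)
	})
	return r
}
```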
|
||||||
73
backend/internal/server/routes.go
Normal file
@@ -0,0 +1,73 @@
|
|||||||
|
package server
|
||||||
|
|
||||||
|
import (
|
||||||
|
"encoding/json"
|
||||||
|
"net/http"
|
||||||
|
|
||||||
|
"github.com/go-chi/chi/v5"
|
||||||
|
)
|
||||||
|
|
||||||
|
// RouteHandler defines the interface for route handlers
|
||||||
|
type RouteHandler interface {
|
||||||
|
Health(w http.ResponseWriter, r *http.Request)
|
||||||
|
GetCurrentWeather(w http.ResponseWriter, r *http.Request)
|
||||||
|
GetForecast(w http.ResponseWriter, r *http.Request)
|
||||||
|
GetHistorical(w http.ResponseWriter, r *http.Request)
|
||||||
|
AssessConditions(w http.ResponseWriter, r *http.Request)
|
||||||
|
}
|
||||||
|
|
||||||
|
// SetupRoutes configures all API routes
|
||||||
|
func (s *Server) SetupRoutes(handler RouteHandler) {
|
||||||
|
// Health check endpoint
|
||||||
|
s.router.Get("/api/health", handler.Health)
|
||||||
|
|
||||||
|
// API v1 routes
|
||||||
|
s.router.Route("/api/v1", func(r chi.Router) {
|
||||||
|
// Weather routes
|
||||||
|
r.Route("/weather", func(r chi.Router) {
|
||||||
|
r.Get("/current", handler.GetCurrentWeather)
|
||||||
|
r.Get("/forecast", handler.GetForecast)
|
||||||
|
r.Get("/historical", handler.GetHistorical)
|
||||||
|
r.Post("/assess", handler.AssessConditions)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
// Response helpers
|
||||||
|
|
||||||
|
// JSONResponse is a generic JSON response structure
|
||||||
|
type JSONResponse struct {
|
||||||
|
Success bool `json:"success"`
|
||||||
|
Data interface{} `json:"data,omitempty"`
|
||||||
|
Error string `json:"error,omitempty"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// RespondJSON writes a JSON response
|
||||||
|
func RespondJSON(w http.ResponseWriter, statusCode int, data interface{}) {
|
||||||
|
w.Header().Set("Content-Type", "application/json")
|
||||||
|
w.WriteHeader(statusCode)
|
||||||
|
|
||||||
|
response := JSONResponse{
|
||||||
|
Success: statusCode >= 200 && statusCode < 300,
|
||||||
|
Data: data,
|
||||||
|
}
|
||||||
|
|
||||||
|
if err := json.NewEncoder(w).Encode(response); err != nil {
|
||||||
|
http.Error(w, "Failed to encode response", http.StatusInternalServerError)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// RespondError writes a JSON error response
|
||||||
|
func RespondError(w http.ResponseWriter, statusCode int, message string) {
|
||||||
|
w.Header().Set("Content-Type", "application/json")
|
||||||
|
w.WriteHeader(statusCode)
|
||||||
|
|
||||||
|
response := JSONResponse{
|
||||||
|
Success: false,
|
||||||
|
Error: message,
|
||||||
|
}
|
||||||
|
|
||||||
|
if err := json.NewEncoder(w).Encode(response); err != nil {
|
||||||
|
http.Error(w, "Failed to encode error response", http.StatusInternalServerError)
|
||||||
|
}
|
||||||
|
}
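A sketch of a handler written against these helpers (the concrete `RouteHandler` implementation lives elsewhere in the backend); it shows the `success`/`data`/`error` envelope the helpers produce.

```go
package server

import "net/http"

// exampleHealthHandler is illustrative only; real handlers implement RouteHandler.
func exampleHealthHandler(w http.ResponseWriter, r *http.Request) {
	if r.Method != http.MethodGet {
		// -> {"success":false,"error":"only GET is supported"}
		RespondError(w, http.StatusMethodNotAllowed, "only GET is supported")
		return
	}
	// -> {"success":true,"data":{"status":"ok"}}
	RespondJSON(w, http.StatusOK, map[string]string{"status": "ok"})
}
```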
|
||||||
132
backend/internal/server/server.go
Normal file
@@ -0,0 +1,132 @@
|
|||||||
|
package server
|
||||||
|
|
||||||
|
import (
|
||||||
|
"context"
|
||||||
|
"fmt"
|
||||||
|
"log/slog"
|
||||||
|
"net/http"
|
||||||
|
"time"
|
||||||
|
|
||||||
|
"github.com/go-chi/chi/v5"
|
||||||
|
"github.com/go-chi/chi/v5/middleware"
|
||||||
|
"github.com/go-chi/cors"
|
||||||
|
)
|
||||||
|
|
||||||
|
// Server represents the HTTP server
|
||||||
|
type Server struct {
|
||||||
|
router *chi.Mux
|
||||||
|
logger *slog.Logger
|
||||||
|
addr string
|
||||||
|
httpServer *http.Server
|
||||||
|
}
|
||||||
|
|
||||||
|
// New creates a new HTTP server with chi router and CORS enabled
|
||||||
|
func New(addr string, logger *slog.Logger) *Server {
|
||||||
|
s := &Server{
|
||||||
|
router: chi.NewRouter(),
|
||||||
|
logger: logger,
|
||||||
|
addr: addr,
|
||||||
|
}
|
||||||
|
|
||||||
|
s.setupMiddleware()
|
||||||
|
|
||||||
|
return s
|
||||||
|
}
|
||||||
|
|
||||||
|
// setupMiddleware configures all middleware for the router
|
||||||
|
func (s *Server) setupMiddleware() {
|
||||||
|
// Request ID middleware
|
||||||
|
s.router.Use(middleware.RequestID)
|
||||||
|
|
||||||
|
// Real IP middleware
|
||||||
|
s.router.Use(middleware.RealIP)
|
||||||
|
|
||||||
|
// Rate limiting: 10 requests/second with burst of 30
|
||||||
|
// Generous limits since Cloudflare handles most protection
|
||||||
|
rateLimiter := NewRateLimiter(10, 30)
|
||||||
|
s.router.Use(rateLimiter.Middleware)
|
||||||
|
|
||||||
|
// Structured logging middleware
|
||||||
|
s.router.Use(s.loggingMiddleware)
|
||||||
|
|
||||||
|
// Recover from panics
|
||||||
|
s.router.Use(middleware.Recoverer)
|
||||||
|
|
||||||
|
// Request timeout
|
||||||
|
s.router.Use(middleware.Timeout(60 * time.Second))
|
||||||
|
|
||||||
|
// CORS configuration
|
||||||
|
s.router.Use(cors.Handler(cors.Options{
|
||||||
|
AllowedOrigins: []string{
|
||||||
|
"https://paragliding.scottyah.com",
|
||||||
|
"http://localhost:3000",
|
||||||
|
"http://localhost:5173",
|
||||||
|
},
|
||||||
|
AllowedMethods: []string{"GET", "POST", "PUT", "DELETE", "OPTIONS"},
|
||||||
|
AllowedHeaders: []string{"Accept", "Authorization", "Content-Type", "X-CSRF-Token"},
|
||||||
|
ExposedHeaders: []string{"Link"},
|
||||||
|
AllowCredentials: true,
|
||||||
|
MaxAge: 300,
|
||||||
|
}))
|
||||||
|
|
||||||
|
// Compress responses
|
||||||
|
s.router.Use(middleware.Compress(5))
|
||||||
|
}
|
||||||
|
|
||||||
|
// loggingMiddleware logs HTTP requests with structured logging
|
||||||
|
func (s *Server) loggingMiddleware(next http.Handler) http.Handler {
|
||||||
|
return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
|
||||||
|
start := time.Now()
|
||||||
|
|
||||||
|
ww := middleware.NewWrapResponseWriter(w, r.ProtoMajor)
|
||||||
|
|
||||||
|
defer func() {
|
||||||
|
s.logger.Info("request",
|
||||||
|
"method", r.Method,
|
||||||
|
"path", r.URL.Path,
|
||||||
|
"status", ww.Status(),
|
||||||
|
"bytes", ww.BytesWritten(),
|
||||||
|
"duration", time.Since(start).String(),
|
||||||
|
"request_id", middleware.GetReqID(r.Context()),
|
||||||
|
)
|
||||||
|
}()
|
||||||
|
|
||||||
|
next.ServeHTTP(ww, r)
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
// Router returns the chi router
|
||||||
|
func (s *Server) Router() *chi.Mux {
|
||||||
|
return s.router
|
||||||
|
}
|
||||||
|
|
||||||
|
// Start starts the HTTP server
|
||||||
|
func (s *Server) Start() error {
|
||||||
|
s.logger.Info("starting HTTP server", "addr", s.addr)
|
||||||
|
|
||||||
|
s.httpServer = &http.Server{
|
||||||
|
Addr: s.addr,
|
||||||
|
Handler: s.router,
|
||||||
|
ReadHeaderTimeout: 10 * time.Second,
|
||||||
|
ReadTimeout: 30 * time.Second,
|
||||||
|
WriteTimeout: 60 * time.Second,
|
||||||
|
IdleTimeout: 120 * time.Second,
|
||||||
|
}
|
||||||
|
|
||||||
|
if err := s.httpServer.ListenAndServe(); err != nil && err != http.ErrServerClosed {
|
||||||
|
return fmt.Errorf("server failed: %w", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// Shutdown gracefully shuts down the server
|
||||||
|
func (s *Server) Shutdown(ctx context.Context) error {
|
||||||
|
s.logger.Info("shutting down HTTP server")
|
||||||
|
|
||||||
|
if s.httpServer == nil {
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
return s.httpServer.Shutdown(ctx)
|
||||||
|
}
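A sketch of the startup/shutdown wiring implied by `Start` and `Shutdown`: run the server in a goroutine, then drain it on SIGINT/SIGTERM. Route registration and the :8080 address are placeholders here.

```go
package main

import (
	"context"
	"log/slog"
	"os"
	"os/signal"
	"syscall"
	"time"

	"github.com/scottyah/paragliding/internal/server"
)

func main() {
	logger := slog.New(slog.NewJSONHandler(os.Stdout, nil))
	srv := server.New(":8080", logger)
	// srv.SetupRoutes(handler) would be called here with a real RouteHandler.

	go func() {
		if err := srv.Start(); err != nil {
			logger.Error("server error", "error", err)
			os.Exit(1)
		}
	}()

	// Block until interrupted, then give in-flight requests 10s to finish.
	stop := make(chan os.Signal, 1)
	signal.Notify(stop, os.Interrupt, syscall.SIGTERM)
	<-stop

	ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
	defer cancel()
	if err := srv.Shutdown(ctx); err != nil {
		logger.Error("shutdown error", "error", err)
	}
}
```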
|
||||||
285
backend/internal/service/assessment.go
Normal file
@@ -0,0 +1,285 @@
|
|||||||
|
package service
|
||||||
|
|
||||||
|
import (
|
||||||
|
"math"
|
||||||
|
"time"
|
||||||
|
|
||||||
|
"github.com/scottyah/paragliding/internal/model"
|
||||||
|
)
|
||||||
|
|
||||||
|
// AssessmentService evaluates weather conditions for paragliding
|
||||||
|
type AssessmentService struct{}
|
||||||
|
|
||||||
|
// NewAssessmentService creates a new assessment service
|
||||||
|
func NewAssessmentService() *AssessmentService {
|
||||||
|
return &AssessmentService{}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Evaluate analyzes weather points against thresholds and returns an assessment
|
||||||
|
func (s *AssessmentService) Evaluate(points []model.WeatherPoint, thresholds model.Thresholds) model.Assessment {
|
||||||
|
if len(points) == 0 {
|
||||||
|
return model.Assessment{
|
||||||
|
Status: "BAD",
|
||||||
|
Reason: "No weather data available",
|
||||||
|
FlyableNow: false,
|
||||||
|
BestWindow: nil,
|
||||||
|
AllWindows: []model.FlyableWindow{},
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Filter for daylight hours (8am-10pm)
|
||||||
|
daylightPoints := s.filterDaylightHours(points)
|
||||||
|
if len(daylightPoints) == 0 {
|
||||||
|
return model.Assessment{
|
||||||
|
Status: "BAD",
|
||||||
|
Reason: "No data available during daylight flying hours (8am-10pm)",
|
||||||
|
FlyableNow: false,
|
||||||
|
BestWindow: nil,
|
||||||
|
AllWindows: []model.FlyableWindow{},
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Find all flyable windows
|
||||||
|
windows := s.FindFlyableWindows(daylightPoints, thresholds)
|
||||||
|
|
||||||
|
// Check if current conditions are flyable
|
||||||
|
now := time.Now()
|
||||||
|
flyableNow := false
|
||||||
|
for _, point := range daylightPoints {
|
||||||
|
if point.Time.Before(now.Add(30*time.Minute)) && point.Time.After(now.Add(-30*time.Minute)) {
|
||||||
|
flyableNow = s.isPointFlyable(point, thresholds)
|
||||||
|
break
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Determine best window
|
||||||
|
var bestWindow *model.FlyableWindow
|
||||||
|
if len(windows) > 0 {
|
||||||
|
// Find the longest window
|
||||||
|
longest := windows[0]
|
||||||
|
for _, w := range windows[1:] {
|
||||||
|
if w.Duration > longest.Duration {
|
||||||
|
longest = w
|
||||||
|
}
|
||||||
|
}
|
||||||
|
bestWindow = &longest
|
||||||
|
}
|
||||||
|
|
||||||
|
// Build assessment
|
||||||
|
assessment := model.Assessment{
|
||||||
|
FlyableNow: flyableNow,
|
||||||
|
BestWindow: bestWindow,
|
||||||
|
AllWindows: windows,
|
||||||
|
}
|
||||||
|
|
||||||
|
if bestWindow != nil {
|
||||||
|
assessment.Status = "GOOD"
|
||||||
|
assessment.Reason = formatBestWindowReason(*bestWindow)
|
||||||
|
} else {
|
||||||
|
assessment.Status = "BAD"
|
||||||
|
assessment.Reason = s.determineWhyNotFlyable(daylightPoints, thresholds)
|
||||||
|
}
|
||||||
|
|
||||||
|
return assessment
|
||||||
|
}
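A self-contained sketch of calling `Evaluate` with hand-built sample data: two hours of westerly wind inside the 7-14 mph band during daylight hours, which yields a GOOD assessment with a single best window.

```go
package main

import (
	"fmt"
	"time"

	"github.com/scottyah/paragliding/internal/model"
	"github.com/scottyah/paragliding/internal/service"
)

func main() {
	svc := service.NewAssessmentService()

	// Noon-to-2pm points that satisfy the default thresholds (7-14 mph, 270 +/- 15 degrees).
	base := time.Date(2026, 1, 1, 12, 0, 0, 0, time.UTC)
	points := []model.WeatherPoint{
		{Time: base, WindSpeedMPH: 9, WindDirection: 270, WindGustMPH: 12},
		{Time: base.Add(time.Hour), WindSpeedMPH: 10, WindDirection: 275, WindGustMPH: 13},
		{Time: base.Add(2 * time.Hour), WindSpeedMPH: 11, WindDirection: 268, WindGustMPH: 14},
	}

	a := svc.Evaluate(points, model.DefaultThresholds())
	fmt.Println(a.Status, "-", a.Reason)
	if a.BestWindow != nil {
		fmt.Println("best window:", a.BestWindow.Start.Format("15:04"), "to", a.BestWindow.End.Format("15:04"))
	}
}
```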
|
||||||
|
|
||||||
|
// FindFlyableWindows identifies all continuous periods of flyable conditions
|
||||||
|
// A flyable window must have at least 1 hour of continuous flyable conditions
|
||||||
|
func (s *AssessmentService) FindFlyableWindows(points []model.WeatherPoint, thresholds model.Thresholds) []model.FlyableWindow {
|
||||||
|
if len(points) == 0 {
|
||||||
|
return []model.FlyableWindow{}
|
||||||
|
}
|
||||||
|
|
||||||
|
windows := []model.FlyableWindow{}
|
||||||
|
var windowStart *time.Time
|
||||||
|
var lastFlyableTime *time.Time
|
||||||
|
|
||||||
|
for i, point := range points {
|
||||||
|
isFlyable := s.isPointFlyable(point, thresholds)
|
||||||
|
|
||||||
|
if isFlyable {
|
||||||
|
// Start a new window if not already in one
|
||||||
|
if windowStart == nil {
|
||||||
|
windowStart = &point.Time
|
||||||
|
}
|
||||||
|
lastFlyableTime = &point.Time
|
||||||
|
} else {
|
||||||
|
// End current window if we were in one
|
||||||
|
if windowStart != nil && lastFlyableTime != nil {
|
||||||
|
duration := lastFlyableTime.Sub(*windowStart)
|
||||||
|
// Only include windows of at least 1 hour
|
||||||
|
if duration >= time.Hour {
|
||||||
|
windows = append(windows, model.FlyableWindow{
|
||||||
|
Start: *windowStart,
|
||||||
|
End: *lastFlyableTime,
|
||||||
|
Duration: duration,
|
||||||
|
})
|
||||||
|
}
|
||||||
|
windowStart = nil
|
||||||
|
lastFlyableTime = nil
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Handle last point
|
||||||
|
if i == len(points)-1 && windowStart != nil && lastFlyableTime != nil {
|
||||||
|
duration := lastFlyableTime.Sub(*windowStart)
|
||||||
|
if duration >= time.Hour {
|
||||||
|
windows = append(windows, model.FlyableWindow{
|
||||||
|
Start: *windowStart,
|
||||||
|
End: *lastFlyableTime,
|
||||||
|
Duration: duration,
|
||||||
|
})
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return windows
|
||||||
|
}
|
||||||
|
|
||||||
|
// isPointFlyable checks if a single weather point meets flyable conditions
|
||||||
|
func (s *AssessmentService) isPointFlyable(point model.WeatherPoint, thresholds model.Thresholds) bool {
|
||||||
|
// Check wind speed
|
||||||
|
if point.WindSpeedMPH < thresholds.SpeedMin || point.WindSpeedMPH > thresholds.SpeedMax {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
// Check wind direction with wraparound handling
|
||||||
|
if !s.isDirectionInRange(point.WindDirection, thresholds.DirCenter, thresholds.DirRange) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
|
||||||
|
// isDirectionInRange checks if a direction is within range of center, handling 0/360 wraparound
|
||||||
|
func (s *AssessmentService) isDirectionInRange(direction, center, rangeVal int) bool {
|
||||||
|
// Normalize all values to 0-359
|
||||||
|
direction = s.normalizeDegrees(direction)
|
||||||
|
center = s.normalizeDegrees(center)
|
||||||
|
|
||||||
|
// Calculate the absolute difference
|
||||||
|
diff := s.angleDifference(direction, center)
|
||||||
|
|
||||||
|
return diff <= float64(rangeVal)
|
||||||
|
}
|
||||||
|
|
||||||
|
// normalizeDegrees ensures degrees are in 0-359 range
|
||||||
|
func (s *AssessmentService) normalizeDegrees(degrees int) int {
|
||||||
|
normalized := degrees % 360
|
||||||
|
if normalized < 0 {
|
||||||
|
normalized += 360
|
||||||
|
}
|
||||||
|
return normalized
|
||||||
|
}
|
||||||
|
|
||||||
|
// angleDifference calculates the minimum difference between two angles
|
||||||
|
func (s *AssessmentService) angleDifference(angle1, angle2 int) float64 {
|
||||||
|
diff := math.Abs(float64(angle1 - angle2))
|
||||||
|
if diff > 180 {
|
||||||
|
diff = 360 - diff
|
||||||
|
}
|
||||||
|
return diff
|
||||||
|
}
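// Worked wraparound example (mirrors the direction test cases in
// assessment_test.go later in this commit): with center 350° and a ±20°
// range, a direction of 5° is accepted because |5-350| = 345 exceeds 180
// and folds to 360-345 = 15, inside the range; a direction of 30° folds
// to 40 and is rejected.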
|
||||||
|
|
||||||
|
// filterDaylightHours returns only points between 8am and 10pm local time
|
||||||
|
func (s *AssessmentService) filterDaylightHours(points []model.WeatherPoint) []model.WeatherPoint {
|
||||||
|
filtered := make([]model.WeatherPoint, 0, len(points))
|
||||||
|
for _, point := range points {
|
||||||
|
hour := point.Time.Hour()
|
||||||
|
if hour >= 8 && hour < 22 {
|
||||||
|
filtered = append(filtered, point)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return filtered
|
||||||
|
}
|
||||||
|
|
||||||
|
// determineWhyNotFlyable analyzes points to explain why conditions aren't flyable
|
||||||
|
func (s *AssessmentService) determineWhyNotFlyable(points []model.WeatherPoint, thresholds model.Thresholds) string {
|
||||||
|
if len(points) == 0 {
|
||||||
|
return "No data available during flying hours"
|
||||||
|
}
|
||||||
|
|
||||||
|
tooSlow := 0
|
||||||
|
tooFast := 0
|
||||||
|
wrongDirection := 0
|
||||||
|
|
||||||
|
for _, point := range points {
|
||||||
|
if point.WindSpeedMPH < thresholds.SpeedMin {
|
||||||
|
tooSlow++
|
||||||
|
} else if point.WindSpeedMPH > thresholds.SpeedMax {
|
||||||
|
tooFast++
|
||||||
|
}
|
||||||
|
if !s.isDirectionInRange(point.WindDirection, thresholds.DirCenter, thresholds.DirRange) {
|
||||||
|
wrongDirection++
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Determine primary reason
|
||||||
|
if tooSlow > len(points)/2 {
|
||||||
|
return "Wind speeds too low for safe flying"
|
||||||
|
}
|
||||||
|
if tooFast > len(points)/2 {
|
||||||
|
return "Wind speeds too high for safe flying"
|
||||||
|
}
|
||||||
|
if wrongDirection > len(points)/2 {
|
||||||
|
return "Wind direction not favorable"
|
||||||
|
}
|
||||||
|
|
||||||
|
return "No continuous flyable windows of at least 1 hour found"
|
||||||
|
}
|
||||||
|
|
||||||
|
// formatBestWindowReason creates a human-readable message about the best window
|
||||||
|
func formatBestWindowReason(window model.FlyableWindow) string {
|
||||||
|
hours := int(window.Duration.Hours())
|
||||||
|
minutes := int(window.Duration.Minutes()) % 60
|
||||||
|
|
||||||
|
timeStr := ""
|
||||||
|
if hours > 0 {
|
||||||
|
timeStr = formatHours(hours)
|
||||||
|
if minutes > 0 {
|
||||||
|
timeStr += " " + formatMinutes(minutes)
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
timeStr = formatMinutes(minutes)
|
||||||
|
}
|
||||||
|
|
||||||
|
return "Best flyable window: " + timeStr + " starting at " + window.Start.Format("3:04 PM")
|
||||||
|
}
|
||||||
|
|
||||||
|
func formatHours(hours int) string {
|
||||||
|
if hours == 1 {
|
||||||
|
return "1 hour"
|
||||||
|
}
|
||||||
|
return formatInt(hours) + " hours"
|
||||||
|
}
|
||||||
|
|
||||||
|
func formatMinutes(minutes int) string {
|
||||||
|
if minutes == 1 {
|
||||||
|
return "1 minute"
|
||||||
|
}
|
||||||
|
return formatInt(minutes) + " minutes"
|
||||||
|
}
|
||||||
|
|
||||||
|
func formatInt(n int) string {
|
||||||
|
// Simple int to string conversion
|
||||||
|
if n == 0 {
|
||||||
|
return "0"
|
||||||
|
}
|
||||||
|
|
||||||
|
negative := n < 0
|
||||||
|
if negative {
|
||||||
|
n = -n
|
||||||
|
}
|
||||||
|
|
||||||
|
digits := []byte{}
|
||||||
|
for n > 0 {
|
||||||
|
digits = append([]byte{byte('0' + n%10)}, digits...)
|
||||||
|
n /= 10
|
||||||
|
}
|
||||||
|
|
||||||
|
if negative {
|
||||||
|
digits = append([]byte{'-'}, digits...)
|
||||||
|
}
|
||||||
|
|
||||||
|
return string(digits)
|
||||||
|
}
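For orientation, here is a minimal usage sketch of the assessment service defined above. It assumes it is compiled inside the same module (the internal packages are not importable from outside it), and the hourly points and printed output are illustrative only:

package main

import (
	"fmt"
	"time"

	"github.com/scottyah/paragliding/internal/model"
	"github.com/scottyah/paragliding/internal/service"
)

func main() {
	svc := service.NewAssessmentService()

	// Three hourly points inside daylight hours, all within 7-14 mph and
	// 270°±15°, so they form a single two-hour flyable window.
	start := time.Date(2024, 1, 1, 10, 0, 0, 0, time.UTC)
	points := []model.WeatherPoint{
		{Time: start, WindSpeedMPH: 10, WindDirection: 270},
		{Time: start.Add(1 * time.Hour), WindSpeedMPH: 10, WindDirection: 270},
		{Time: start.Add(2 * time.Hour), WindSpeedMPH: 10, WindDirection: 270},
	}
	thresholds := model.Thresholds{SpeedMin: 7, SpeedMax: 14, DirCenter: 270, DirRange: 15}

	a := svc.Evaluate(points, thresholds)
	fmt.Println(a.Status) // GOOD
	fmt.Println(a.Reason) // Best flyable window: 2 hours starting at 10:00 AM
}

The Reason string comes from formatBestWindowReason above; with no point within 30 minutes of time.Now(), FlyableNow stays false.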
|
||||||
554
backend/internal/service/assessment_test.go
Normal file
@@ -0,0 +1,554 @@
|
|||||||
|
package service
|
||||||
|
|
||||||
|
import (
|
||||||
|
"testing"
|
||||||
|
"time"
|
||||||
|
|
||||||
|
"github.com/scottyah/paragliding/internal/model"
|
||||||
|
)
|
||||||
|
|
||||||
|
func TestAssessmentService_Evaluate(t *testing.T) {
|
||||||
|
service := NewAssessmentService()
|
||||||
|
thresholds := getDefaultThresholds()
|
||||||
|
|
||||||
|
t.Run("empty points", func(t *testing.T) {
|
||||||
|
result := service.Evaluate([]model.WeatherPoint{}, thresholds)
|
||||||
|
|
||||||
|
if result.Status != "BAD" {
|
||||||
|
t.Errorf("expected status BAD, got %s", result.Status)
|
||||||
|
}
|
||||||
|
if result.FlyableNow {
|
||||||
|
t.Error("expected FlyableNow to be false")
|
||||||
|
}
|
||||||
|
if result.BestWindow != nil {
|
||||||
|
t.Error("expected no best window")
|
||||||
|
}
|
||||||
|
if len(result.AllWindows) != 0 {
|
||||||
|
t.Errorf("expected 0 windows, got %d", len(result.AllWindows))
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("good conditions", func(t *testing.T) {
|
||||||
|
now := time.Now()
|
||||||
|
points := []model.WeatherPoint{
|
||||||
|
createPoint(now, 10, 270), // Good
|
||||||
|
createPoint(now.Add(1*time.Hour), 10, 270),
|
||||||
|
createPoint(now.Add(2*time.Hour), 10, 270),
|
||||||
|
}
|
||||||
|
|
||||||
|
result := service.Evaluate(points, thresholds)
|
||||||
|
|
||||||
|
if result.Status != "GOOD" {
|
||||||
|
t.Errorf("expected status GOOD, got %s", result.Status)
|
||||||
|
}
|
||||||
|
if result.BestWindow == nil {
|
||||||
|
t.Fatal("expected a best window")
|
||||||
|
}
|
||||||
|
if result.BestWindow.Duration < 2*time.Hour {
|
||||||
|
t.Errorf("expected window duration >= 2h, got %v", result.BestWindow.Duration)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("no daylight hours", func(t *testing.T) {
|
||||||
|
// Create points at 2am
|
||||||
|
midnight := time.Date(2024, 1, 1, 0, 0, 0, 0, time.UTC)
|
||||||
|
points := []model.WeatherPoint{
|
||||||
|
createPoint(midnight.Add(2*time.Hour), 10, 270),
|
||||||
|
createPoint(midnight.Add(3*time.Hour), 10, 270),
|
||||||
|
}
|
||||||
|
|
||||||
|
result := service.Evaluate(points, thresholds)
|
||||||
|
|
||||||
|
if result.Status != "BAD" {
|
||||||
|
t.Errorf("expected status BAD, got %s", result.Status)
|
||||||
|
}
|
||||||
|
if !contains(result.Reason, "daylight") {
|
||||||
|
t.Errorf("expected reason to mention daylight, got: %s", result.Reason)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestAssessmentService_FindFlyableWindows(t *testing.T) {
|
||||||
|
service := NewAssessmentService()
|
||||||
|
thresholds := getDefaultThresholds()
|
||||||
|
|
||||||
|
t.Run("single continuous window", func(t *testing.T) {
|
||||||
|
start := time.Date(2024, 1, 1, 10, 0, 0, 0, time.UTC)
|
||||||
|
points := []model.WeatherPoint{
|
||||||
|
createPoint(start, 10, 270),
|
||||||
|
createPoint(start.Add(1*time.Hour), 10, 270),
|
||||||
|
createPoint(start.Add(2*time.Hour), 10, 270),
|
||||||
|
}
|
||||||
|
|
||||||
|
windows := service.FindFlyableWindows(points, thresholds)
|
||||||
|
|
||||||
|
if len(windows) != 1 {
|
||||||
|
t.Fatalf("expected 1 window, got %d", len(windows))
|
||||||
|
}
|
||||||
|
|
||||||
|
if windows[0].Duration < 2*time.Hour {
|
||||||
|
t.Errorf("expected duration >= 2h, got %v", windows[0].Duration)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("multiple windows", func(t *testing.T) {
|
||||||
|
start := time.Date(2024, 1, 1, 10, 0, 0, 0, time.UTC)
|
||||||
|
points := []model.WeatherPoint{
|
||||||
|
// First window (2 hours)
|
||||||
|
createPoint(start, 10, 270),
|
||||||
|
createPoint(start.Add(1*time.Hour), 10, 270),
|
||||||
|
createPoint(start.Add(2*time.Hour), 10, 270),
|
||||||
|
// Break (bad wind)
|
||||||
|
createPoint(start.Add(3*time.Hour), 20, 270), // Too fast
|
||||||
|
createPoint(start.Add(4*time.Hour), 20, 270),
|
||||||
|
// Second window (3 hours)
|
||||||
|
createPoint(start.Add(5*time.Hour), 10, 270),
|
||||||
|
createPoint(start.Add(6*time.Hour), 10, 270),
|
||||||
|
createPoint(start.Add(7*time.Hour), 10, 270),
|
||||||
|
createPoint(start.Add(8*time.Hour), 10, 270),
|
||||||
|
}
|
||||||
|
|
||||||
|
windows := service.FindFlyableWindows(points, thresholds)
|
||||||
|
|
||||||
|
if len(windows) != 2 {
|
||||||
|
t.Fatalf("expected 2 windows, got %d", len(windows))
|
||||||
|
}
|
||||||
|
|
||||||
|
// First window should be ~2 hours
|
||||||
|
if windows[0].Duration < 2*time.Hour {
|
||||||
|
t.Errorf("first window duration should be >= 2h, got %v", windows[0].Duration)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Second window should be ~3 hours
|
||||||
|
if windows[1].Duration < 3*time.Hour {
|
||||||
|
t.Errorf("second window duration should be >= 3h, got %v", windows[1].Duration)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("window less than 1 hour excluded", func(t *testing.T) {
|
||||||
|
start := time.Date(2024, 1, 1, 10, 0, 0, 0, time.UTC)
|
||||||
|
points := []model.WeatherPoint{
|
||||||
|
createPoint(start, 10, 270),
|
||||||
|
createPoint(start.Add(30*time.Minute), 10, 270), // Only 30 minutes
|
||||||
|
createPoint(start.Add(1*time.Hour), 20, 270), // Bad wind
|
||||||
|
}
|
||||||
|
|
||||||
|
windows := service.FindFlyableWindows(points, thresholds)
|
||||||
|
|
||||||
|
if len(windows) != 0 {
|
||||||
|
t.Errorf("expected 0 windows (< 1h), got %d", len(windows))
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("exactly 1 hour window", func(t *testing.T) {
|
||||||
|
start := time.Date(2024, 1, 1, 10, 0, 0, 0, time.UTC)
|
||||||
|
points := []model.WeatherPoint{
|
||||||
|
createPoint(start, 10, 270),
|
||||||
|
createPoint(start.Add(1*time.Hour), 10, 270),
|
||||||
|
createPoint(start.Add(2*time.Hour), 20, 270), // Bad wind
|
||||||
|
}
|
||||||
|
|
||||||
|
windows := service.FindFlyableWindows(points, thresholds)
|
||||||
|
|
||||||
|
if len(windows) != 1 {
|
||||||
|
t.Fatalf("expected 1 window, got %d", len(windows))
|
||||||
|
}
|
||||||
|
|
||||||
|
if windows[0].Duration < time.Hour {
|
||||||
|
t.Errorf("window should be at least 1 hour, got %v", windows[0].Duration)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("no flyable conditions", func(t *testing.T) {
|
||||||
|
start := time.Date(2024, 1, 1, 10, 0, 0, 0, time.UTC)
|
||||||
|
points := []model.WeatherPoint{
|
||||||
|
createPoint(start, 20, 270), // Too fast
|
||||||
|
createPoint(start.Add(1*time.Hour), 5, 270), // Too slow
|
||||||
|
createPoint(start.Add(2*time.Hour), 10, 180), // Wrong direction
|
||||||
|
}
|
||||||
|
|
||||||
|
windows := service.FindFlyableWindows(points, thresholds)
|
||||||
|
|
||||||
|
if len(windows) != 0 {
|
||||||
|
t.Errorf("expected 0 windows, got %d", len(windows))
|
||||||
|
}
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestAssessmentService_isPointFlyable(t *testing.T) {
|
||||||
|
service := NewAssessmentService()
|
||||||
|
thresholds := getDefaultThresholds()
|
||||||
|
|
||||||
|
tests := []struct {
|
||||||
|
name string
|
||||||
|
point model.WeatherPoint
|
||||||
|
expected bool
|
||||||
|
}{
|
||||||
|
{
|
||||||
|
name: "perfect conditions",
|
||||||
|
point: createPoint(time.Now(), 10, 270),
|
||||||
|
expected: true,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
name: "wind too slow",
|
||||||
|
point: createPoint(time.Now(), 5, 270),
|
||||||
|
expected: false,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
name: "wind too fast",
|
||||||
|
point: createPoint(time.Now(), 20, 270),
|
||||||
|
expected: false,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
name: "wrong direction",
|
||||||
|
point: createPoint(time.Now(), 10, 180), // South, not west
|
||||||
|
expected: false,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
name: "edge of speed range - min",
|
||||||
|
point: createPoint(time.Now(), 7, 270),
|
||||||
|
expected: true,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
name: "edge of speed range - max",
|
||||||
|
point: createPoint(time.Now(), 14, 270),
|
||||||
|
expected: true,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
name: "edge of direction range - low",
|
||||||
|
point: createPoint(time.Now(), 10, 255), // 270 - 15
|
||||||
|
expected: true,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
name: "edge of direction range - high",
|
||||||
|
point: createPoint(time.Now(), 10, 285), // 270 + 15
|
||||||
|
expected: true,
|
||||||
|
},
|
||||||
|
}
|
||||||
|
|
||||||
|
for _, tt := range tests {
|
||||||
|
t.Run(tt.name, func(t *testing.T) {
|
||||||
|
result := service.isPointFlyable(tt.point, thresholds)
|
||||||
|
if result != tt.expected {
|
||||||
|
t.Errorf("expected %v, got %v", tt.expected, result)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestAssessmentService_DirectionWraparound(t *testing.T) {
|
||||||
|
service := NewAssessmentService()
|
||||||
|
|
||||||
|
tests := []struct {
|
||||||
|
name string
|
||||||
|
direction int
|
||||||
|
center int
|
||||||
|
rangeVal int
|
||||||
|
shouldMatch bool
|
||||||
|
}{
|
||||||
|
{
|
||||||
|
name: "wraparound - north center, matches 350",
|
||||||
|
direction: 350,
|
||||||
|
center: 0,
|
||||||
|
rangeVal: 15,
|
||||||
|
shouldMatch: true,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
name: "wraparound - north center, matches 10",
|
||||||
|
direction: 10,
|
||||||
|
center: 0,
|
||||||
|
rangeVal: 15,
|
||||||
|
shouldMatch: true,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
name: "wraparound - 350 center, matches 0",
|
||||||
|
direction: 0,
|
||||||
|
center: 350,
|
||||||
|
rangeVal: 20,
|
||||||
|
shouldMatch: true,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
name: "wraparound - 350 center, matches 5",
|
||||||
|
direction: 5,
|
||||||
|
center: 350,
|
||||||
|
rangeVal: 20,
|
||||||
|
shouldMatch: true,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
name: "wraparound - 350 center, matches 340",
|
||||||
|
direction: 340,
|
||||||
|
center: 350,
|
||||||
|
rangeVal: 20,
|
||||||
|
shouldMatch: true,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
name: "wraparound - 350 center, rejects 30",
|
||||||
|
direction: 30,
|
||||||
|
center: 350,
|
||||||
|
rangeVal: 20,
|
||||||
|
shouldMatch: false,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
name: "no wraparound - normal case",
|
||||||
|
direction: 180,
|
||||||
|
center: 180,
|
||||||
|
rangeVal: 15,
|
||||||
|
shouldMatch: true,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
name: "no wraparound - within range",
|
||||||
|
direction: 190,
|
||||||
|
center: 180,
|
||||||
|
rangeVal: 15,
|
||||||
|
shouldMatch: true,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
name: "no wraparound - out of range",
|
||||||
|
direction: 200,
|
||||||
|
center: 180,
|
||||||
|
rangeVal: 15,
|
||||||
|
shouldMatch: false,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
name: "negative direction normalized",
|
||||||
|
direction: -10,
|
||||||
|
center: 350,
|
||||||
|
rangeVal: 20,
|
||||||
|
shouldMatch: true,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
name: "direction over 360 normalized",
|
||||||
|
direction: 370,
|
||||||
|
center: 10,
|
||||||
|
rangeVal: 15,
|
||||||
|
shouldMatch: true,
|
||||||
|
},
|
||||||
|
}
|
||||||
|
|
||||||
|
for _, tt := range tests {
|
||||||
|
t.Run(tt.name, func(t *testing.T) {
|
||||||
|
result := service.isDirectionInRange(tt.direction, tt.center, tt.rangeVal)
|
||||||
|
if result != tt.shouldMatch {
|
||||||
|
t.Errorf("expected %v, got %v for direction=%d, center=%d, range=%d",
|
||||||
|
tt.shouldMatch, result, tt.direction, tt.center, tt.rangeVal)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestAssessmentService_DaylightFiltering(t *testing.T) {
|
||||||
|
service := NewAssessmentService()
|
||||||
|
|
||||||
|
baseDate := time.Date(2024, 1, 1, 0, 0, 0, 0, time.UTC)
|
||||||
|
|
||||||
|
tests := []struct {
|
||||||
|
name string
|
||||||
|
hour int
|
||||||
|
included bool
|
||||||
|
}{
|
||||||
|
{"7am - before daylight", 7, false},
|
||||||
|
{"8am - start of daylight", 8, true},
|
||||||
|
{"12pm - middle of day", 12, true},
|
||||||
|
{"6pm - evening", 18, true},
|
||||||
|
{"9pm - late evening", 21, true},
|
||||||
|
{"10pm - end of daylight", 22, false},
|
||||||
|
{"11pm - night", 23, false},
|
||||||
|
{"2am - night", 2, false},
|
||||||
|
}
|
||||||
|
|
||||||
|
for _, tt := range tests {
|
||||||
|
t.Run(tt.name, func(t *testing.T) {
|
||||||
|
points := []model.WeatherPoint{
|
||||||
|
createPoint(baseDate.Add(time.Duration(tt.hour)*time.Hour), 10, 270),
|
||||||
|
}
|
||||||
|
|
||||||
|
filtered := service.filterDaylightHours(points)
|
||||||
|
|
||||||
|
if tt.included && len(filtered) != 1 {
|
||||||
|
t.Errorf("expected point at %d:00 to be included", tt.hour)
|
||||||
|
}
|
||||||
|
if !tt.included && len(filtered) != 0 {
|
||||||
|
t.Errorf("expected point at %d:00 to be excluded", tt.hour)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestAssessmentService_BestWindowSelection(t *testing.T) {
|
||||||
|
service := NewAssessmentService()
|
||||||
|
thresholds := getDefaultThresholds()
|
||||||
|
|
||||||
|
start := time.Date(2024, 1, 1, 10, 0, 0, 0, time.UTC)
|
||||||
|
points := []model.WeatherPoint{
|
||||||
|
// First window (2 hours)
|
||||||
|
createPoint(start, 10, 270),
|
||||||
|
createPoint(start.Add(1*time.Hour), 10, 270),
|
||||||
|
createPoint(start.Add(2*time.Hour), 10, 270),
|
||||||
|
// Break
|
||||||
|
createPoint(start.Add(3*time.Hour), 20, 270),
|
||||||
|
// Second window (4 hours) - this should be the best
|
||||||
|
createPoint(start.Add(4*time.Hour), 10, 270),
|
||||||
|
createPoint(start.Add(5*time.Hour), 10, 270),
|
||||||
|
createPoint(start.Add(6*time.Hour), 10, 270),
|
||||||
|
createPoint(start.Add(7*time.Hour), 10, 270),
|
||||||
|
createPoint(start.Add(8*time.Hour), 10, 270),
|
||||||
|
// Break
|
||||||
|
createPoint(start.Add(9*time.Hour), 5, 270),
|
||||||
|
// Third window (1 hour)
|
||||||
|
createPoint(start.Add(10*time.Hour), 10, 270),
|
||||||
|
createPoint(start.Add(11*time.Hour), 10, 270),
|
||||||
|
}
|
||||||
|
|
||||||
|
result := service.Evaluate(points, thresholds)
|
||||||
|
|
||||||
|
if result.BestWindow == nil {
|
||||||
|
t.Fatal("expected a best window")
|
||||||
|
}
|
||||||
|
|
||||||
|
// Best window should be the 4-hour window
|
||||||
|
if result.BestWindow.Duration < 4*time.Hour {
|
||||||
|
t.Errorf("expected best window to be >= 4 hours, got %v", result.BestWindow.Duration)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Should have found 3 windows total
|
||||||
|
if len(result.AllWindows) != 3 {
|
||||||
|
t.Errorf("expected 3 windows, got %d", len(result.AllWindows))
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestAssessmentService_EdgeCases(t *testing.T) {
|
||||||
|
service := NewAssessmentService()
|
||||||
|
thresholds := getDefaultThresholds()
|
||||||
|
|
||||||
|
t.Run("single point - flyable", func(t *testing.T) {
|
||||||
|
points := []model.WeatherPoint{
|
||||||
|
createPoint(time.Date(2024, 1, 1, 10, 0, 0, 0, time.UTC), 10, 270),
|
||||||
|
}
|
||||||
|
|
||||||
|
result := service.Evaluate(points, thresholds)
|
||||||
|
|
||||||
|
// Single point can't form a 1-hour window
|
||||||
|
if result.BestWindow != nil {
|
||||||
|
t.Error("single point should not create a window")
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("all points same time", func(t *testing.T) {
|
||||||
|
sameTime := time.Date(2024, 1, 1, 10, 0, 0, 0, time.UTC)
|
||||||
|
points := []model.WeatherPoint{
|
||||||
|
createPoint(sameTime, 10, 270),
|
||||||
|
createPoint(sameTime, 10, 270),
|
||||||
|
createPoint(sameTime, 10, 270),
|
||||||
|
}
|
||||||
|
|
||||||
|
windows := service.FindFlyableWindows(points, thresholds)
|
||||||
|
|
||||||
|
// Should handle gracefully
|
||||||
|
if len(windows) > 1 {
|
||||||
|
t.Errorf("expected at most 1 window for same-time points, got %d", len(windows))
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("custom thresholds", func(t *testing.T) {
|
||||||
|
customThresholds := model.Thresholds{
|
||||||
|
SpeedMin: 5,
|
||||||
|
SpeedMax: 10,
|
||||||
|
DirCenter: 180,
|
||||||
|
DirRange: 30,
|
||||||
|
}
|
||||||
|
|
||||||
|
start := time.Date(2024, 1, 1, 10, 0, 0, 0, time.UTC)
|
||||||
|
points := []model.WeatherPoint{
|
||||||
|
createPoint(start, 7, 170), // Within custom range
|
||||||
|
createPoint(start.Add(1*time.Hour), 7, 170),
|
||||||
|
createPoint(start.Add(2*time.Hour), 7, 170),
|
||||||
|
}
|
||||||
|
|
||||||
|
result := service.Evaluate(points, customThresholds)
|
||||||
|
|
||||||
|
if result.Status != "GOOD" {
|
||||||
|
t.Errorf("expected GOOD status with custom thresholds, got %s", result.Status)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestAssessmentService_NormalizeDegrees(t *testing.T) {
|
||||||
|
service := NewAssessmentService()
|
||||||
|
|
||||||
|
tests := []struct {
|
||||||
|
input int
|
||||||
|
expected int
|
||||||
|
}{
|
||||||
|
{0, 0},
|
||||||
|
{180, 180},
|
||||||
|
{360, 0},
|
||||||
|
{361, 1},
|
||||||
|
{720, 0},
|
||||||
|
{-10, 350},
|
||||||
|
{-90, 270},
|
||||||
|
{-360, 0},
|
||||||
|
}
|
||||||
|
|
||||||
|
for _, tt := range tests {
|
||||||
|
result := service.normalizeDegrees(tt.input)
|
||||||
|
if result != tt.expected {
|
||||||
|
t.Errorf("normalizeDegrees(%d) = %d, expected %d", tt.input, result, tt.expected)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestAssessmentService_AngleDifference(t *testing.T) {
|
||||||
|
service := NewAssessmentService()
|
||||||
|
|
||||||
|
tests := []struct {
|
||||||
|
angle1 int
|
||||||
|
angle2 int
|
||||||
|
expected float64
|
||||||
|
}{
|
||||||
|
{0, 0, 0},
|
||||||
|
{0, 180, 180},
|
||||||
|
{0, 10, 10},
|
||||||
|
{10, 0, 10},
|
||||||
|
{350, 10, 20}, // Wraparound
|
||||||
|
{10, 350, 20}, // Wraparound
|
||||||
|
{270, 90, 180}, // Opposite
|
||||||
|
{359, 1, 2}, // Close wraparound
|
||||||
|
}
|
||||||
|
|
||||||
|
for _, tt := range tests {
|
||||||
|
result := service.angleDifference(tt.angle1, tt.angle2)
|
||||||
|
if result != tt.expected {
|
||||||
|
t.Errorf("angleDifference(%d, %d) = %.1f, expected %.1f",
|
||||||
|
tt.angle1, tt.angle2, result, tt.expected)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Helper functions
|
||||||
|
|
||||||
|
func getDefaultThresholds() model.Thresholds {
|
||||||
|
return model.Thresholds{
|
||||||
|
SpeedMin: 7,
|
||||||
|
SpeedMax: 14,
|
||||||
|
DirCenter: 270,
|
||||||
|
DirRange: 15,
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func createPoint(t time.Time, speed float64, direction int) model.WeatherPoint {
|
||||||
|
return model.WeatherPoint{
|
||||||
|
Time: t,
|
||||||
|
WindSpeedMPH: speed,
|
||||||
|
WindDirection: direction,
|
||||||
|
WindGustMPH: speed + 2, // Arbitrary gust value
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func contains(s, substr string) bool {
|
||||||
|
for i := 0; i <= len(s)-len(substr); i++ {
|
||||||
|
if s[i:i+len(substr)] == substr {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return false
|
||||||
|
}
|
||||||
415
backend/internal/service/weather.go
Normal file
@@ -0,0 +1,415 @@
|
|||||||
|
package service
|
||||||
|
|
||||||
|
import (
|
||||||
|
"context"
|
||||||
|
"fmt"
|
||||||
|
"log/slog"
|
||||||
|
"sync"
|
||||||
|
"time"
|
||||||
|
|
||||||
|
"github.com/scottyah/paragliding/internal/client"
|
||||||
|
"github.com/scottyah/paragliding/internal/model"
|
||||||
|
"github.com/scottyah/paragliding/internal/repository"
|
||||||
|
)
|
||||||
|
|
||||||
|
const (
|
||||||
|
// Cache keys
|
||||||
|
cacheKeyCurrentWeather = "weather:current"
|
||||||
|
cacheKeyForecast = "weather:forecast"
|
||||||
|
cacheKeyHistorical = "weather:historical:%s"
|
||||||
|
cacheKeyLastAPIFetch = "weather:last_api_fetch"
|
||||||
|
|
||||||
|
// Cache TTLs
|
||||||
|
cacheTTLCurrent = 5 * time.Minute
|
||||||
|
cacheTTLForecast = 10 * time.Minute
|
||||||
|
cacheTTLHistorical = 24 * time.Hour
|
||||||
|
|
||||||
|
// API rate limiting - minimum time between API calls
|
||||||
|
minAPIFetchInterval = 15 * time.Minute
|
||||||
|
|
||||||
|
// Data staleness threshold - if DB data is older than this, consider fetching fresh
|
||||||
|
dataStaleThreshold = 30 * time.Minute
|
||||||
|
)
|
||||||
|
|
||||||
|
// WeatherData holds weather data with metadata
|
||||||
|
type WeatherData struct {
|
||||||
|
Points []model.WeatherPoint
|
||||||
|
FetchedAt time.Time
|
||||||
|
Source string // "cache", "database", "api"
|
||||||
|
}
|
||||||
|
|
||||||
|
// WeatherService provides weather data with DB-first access and rate-limited API fallback
|
||||||
|
type WeatherService struct {
|
||||||
|
client *client.OpenMeteoClient
|
||||||
|
repo *repository.WeatherRepository
|
||||||
|
logger *slog.Logger
|
||||||
|
|
||||||
|
// In-memory cache
|
||||||
|
cache map[string]cacheEntry
|
||||||
|
cacheMu sync.RWMutex
|
||||||
|
|
||||||
|
// API rate limiting
|
||||||
|
lastAPIFetch time.Time
|
||||||
|
lastAPIFetchMu sync.RWMutex
|
||||||
|
}
|
||||||
|
|
||||||
|
type cacheEntry struct {
|
||||||
|
data interface{}
|
||||||
|
expiresAt time.Time
|
||||||
|
}
|
||||||
|
|
||||||
|
// WeatherServiceConfig contains configuration for the weather service
|
||||||
|
type WeatherServiceConfig struct {
|
||||||
|
Client *client.OpenMeteoClient
|
||||||
|
Repo *repository.WeatherRepository
|
||||||
|
Logger *slog.Logger
|
||||||
|
}
|
||||||
|
|
||||||
|
// NewWeatherService creates a new weather service
|
||||||
|
func NewWeatherService(config WeatherServiceConfig) *WeatherService {
|
||||||
|
return &WeatherService{
|
||||||
|
client: config.Client,
|
||||||
|
repo: config.Repo,
|
||||||
|
logger: config.Logger,
|
||||||
|
cache: make(map[string]cacheEntry),
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// GetCurrentWeather returns current weather conditions
|
||||||
|
// Priority: Cache → DB → API (rate-limited)
|
||||||
|
func (s *WeatherService) GetCurrentWeather(ctx context.Context) (*WeatherData, error) {
|
||||||
|
// 1. Try cache first
|
||||||
|
if cached := s.getFromCache(cacheKeyCurrentWeather); cached != nil {
|
||||||
|
if data, ok := cached.(*WeatherData); ok {
|
||||||
|
s.logger.Debug("current weather cache hit")
|
||||||
|
return data, nil
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// 2. Try database
|
||||||
|
now := time.Now()
|
||||||
|
start := now.Add(-1 * time.Hour)
|
||||||
|
end := now.Add(1 * time.Hour)
|
||||||
|
|
||||||
|
points, err := s.repo.GetForecast(ctx, start, end)
|
||||||
|
if err != nil {
|
||||||
|
s.logger.Warn("failed to get current weather from DB", "error", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Find closest point to now
|
||||||
|
if len(points) > 0 {
|
||||||
|
current := s.findClosestPoint(points, now)
|
||||||
|
|
||||||
|
// Check if data is fresh enough
|
||||||
|
if now.Sub(current.Time) < dataStaleThreshold {
|
||||||
|
data := &WeatherData{
|
||||||
|
Points: []model.WeatherPoint{current},
|
||||||
|
FetchedAt: now,
|
||||||
|
Source: "database",
|
||||||
|
}
|
||||||
|
s.setCache(cacheKeyCurrentWeather, data, cacheTTLCurrent)
|
||||||
|
s.logger.Debug("current weather from DB", "time", current.Time)
|
||||||
|
return data, nil
|
||||||
|
}
|
||||||
|
s.logger.Debug("DB data is stale", "data_time", current.Time, "threshold", dataStaleThreshold)
|
||||||
|
}
|
||||||
|
|
||||||
|
// 3. Try API (rate-limited)
|
||||||
|
if s.canFetchFromAPI() {
|
||||||
|
s.logger.Info("fetching current weather from API (DB data stale or missing)")
|
||||||
|
apiPoints, err := s.fetchAndStoreFromAPI(ctx)
|
||||||
|
if err != nil {
|
||||||
|
s.logger.Error("failed to fetch from API", "error", err)
|
||||||
|
// If we have stale DB data, return it
|
||||||
|
if len(points) > 0 {
|
||||||
|
current := s.findClosestPoint(points, now)
|
||||||
|
return &WeatherData{
|
||||||
|
Points: []model.WeatherPoint{current},
|
||||||
|
FetchedAt: now,
|
||||||
|
Source: "database (stale)",
|
||||||
|
}, nil
|
||||||
|
}
|
||||||
|
return nil, fmt.Errorf("no weather data available: %w", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
current := s.findClosestPoint(apiPoints, now)
|
||||||
|
data := &WeatherData{
|
||||||
|
Points: []model.WeatherPoint{current},
|
||||||
|
FetchedAt: now,
|
||||||
|
Source: "api",
|
||||||
|
}
|
||||||
|
s.setCache(cacheKeyCurrentWeather, data, cacheTTLCurrent)
|
||||||
|
return data, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// 4. Return stale data if available, or error
|
||||||
|
if len(points) > 0 {
|
||||||
|
current := s.findClosestPoint(points, now)
|
||||||
|
s.logger.Warn("returning stale data (API rate limited)", "data_time", current.Time)
|
||||||
|
return &WeatherData{
|
||||||
|
Points: []model.WeatherPoint{current},
|
||||||
|
FetchedAt: now,
|
||||||
|
Source: "database (stale, API rate limited)",
|
||||||
|
}, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil, fmt.Errorf("no weather data available and API rate limited")
|
||||||
|
}
|
||||||
|
|
||||||
|
// GetForecast returns weather forecast data
|
||||||
|
// Priority: Cache → DB → API (rate-limited)
|
||||||
|
func (s *WeatherService) GetForecast(ctx context.Context) (*WeatherData, error) {
|
||||||
|
// 1. Try cache first
|
||||||
|
if cached := s.getFromCache(cacheKeyForecast); cached != nil {
|
||||||
|
if data, ok := cached.(*WeatherData); ok {
|
||||||
|
s.logger.Debug("forecast cache hit")
|
||||||
|
return data, nil
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// 2. Try database
|
||||||
|
now := time.Now()
|
||||||
|
end := now.Add(7 * 24 * time.Hour) // 7 days ahead
|
||||||
|
|
||||||
|
points, err := s.repo.GetForecast(ctx, now, end)
|
||||||
|
if err != nil {
|
||||||
|
s.logger.Warn("failed to get forecast from DB", "error", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
if len(points) > 0 {
|
||||||
|
// Check if we have recent enough data (at least some points in the next few hours)
|
||||||
|
hasRecentData := false
|
||||||
|
for _, p := range points {
|
||||||
|
if p.Time.After(now) && p.Time.Before(now.Add(6*time.Hour)) {
|
||||||
|
hasRecentData = true
|
||||||
|
break
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if hasRecentData {
|
||||||
|
data := &WeatherData{
|
||||||
|
Points: points,
|
||||||
|
FetchedAt: now,
|
||||||
|
Source: "database",
|
||||||
|
}
|
||||||
|
s.setCache(cacheKeyForecast, data, cacheTTLForecast)
|
||||||
|
s.logger.Debug("forecast from DB", "points", len(points))
|
||||||
|
return data, nil
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// 3. Try API (rate-limited)
|
||||||
|
if s.canFetchFromAPI() {
|
||||||
|
s.logger.Info("fetching forecast from API (DB data stale or missing)")
|
||||||
|
apiPoints, err := s.fetchAndStoreFromAPI(ctx)
|
||||||
|
if err != nil {
|
||||||
|
s.logger.Error("failed to fetch from API", "error", err)
|
||||||
|
if len(points) > 0 {
|
||||||
|
return &WeatherData{
|
||||||
|
Points: points,
|
||||||
|
FetchedAt: now,
|
||||||
|
Source: "database (stale)",
|
||||||
|
}, nil
|
||||||
|
}
|
||||||
|
return nil, fmt.Errorf("no forecast data available: %w", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Filter for future points
|
||||||
|
forecast := make([]model.WeatherPoint, 0, len(apiPoints))
|
||||||
|
for _, p := range apiPoints {
|
||||||
|
if p.Time.After(now) {
|
||||||
|
forecast = append(forecast, p)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
data := &WeatherData{
|
||||||
|
Points: forecast,
|
||||||
|
FetchedAt: now,
|
||||||
|
Source: "api",
|
||||||
|
}
|
||||||
|
s.setCache(cacheKeyForecast, data, cacheTTLForecast)
|
||||||
|
return data, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// 4. Return stale data if available
|
||||||
|
if len(points) > 0 {
|
||||||
|
s.logger.Warn("returning stale forecast (API rate limited)", "points", len(points))
|
||||||
|
return &WeatherData{
|
||||||
|
Points: points,
|
||||||
|
FetchedAt: now,
|
||||||
|
Source: "database (stale, API rate limited)",
|
||||||
|
}, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil, fmt.Errorf("no forecast data available and API rate limited")
|
||||||
|
}
|
||||||
|
|
||||||
|
// GetHistorical returns historical weather data for a specific date
|
||||||
|
// Historical data is primarily from DB (populated by background fetcher)
|
||||||
|
func (s *WeatherService) GetHistorical(ctx context.Context, date time.Time) (*WeatherData, error) {
|
||||||
|
cacheKey := fmt.Sprintf(cacheKeyHistorical, date.Format("2006-01-02"))
|
||||||
|
|
||||||
|
// 1. Try cache first
|
||||||
|
if cached := s.getFromCache(cacheKey); cached != nil {
|
||||||
|
if data, ok := cached.(*WeatherData); ok {
|
||||||
|
s.logger.Debug("historical cache hit", "date", date.Format("2006-01-02"))
|
||||||
|
return data, nil
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// 2. Get from database (historical data should always be from DB)
|
||||||
|
points, err := s.repo.GetHistorical(ctx, date)
|
||||||
|
if err != nil {
|
||||||
|
s.logger.Warn("failed to get historical data from DB", "error", err, "date", date.Format("2006-01-02"))
|
||||||
|
}
|
||||||
|
|
||||||
|
if len(points) > 0 {
|
||||||
|
data := &WeatherData{
|
||||||
|
Points: points,
|
||||||
|
FetchedAt: time.Now(),
|
||||||
|
Source: "database",
|
||||||
|
}
|
||||||
|
s.setCache(cacheKey, data, cacheTTLHistorical)
|
||||||
|
return data, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// Historical data not available - don't try API for past dates
|
||||||
|
// The background fetcher should have this data
|
||||||
|
return &WeatherData{
|
||||||
|
Points: []model.WeatherPoint{},
|
||||||
|
FetchedAt: time.Now(),
|
||||||
|
Source: "none",
|
||||||
|
}, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// GetAllPoints returns all available weather points (for assessment)
|
||||||
|
// This reads from DB/cache only, never triggers API calls
|
||||||
|
func (s *WeatherService) GetAllPoints(ctx context.Context) ([]model.WeatherPoint, error) {
|
||||||
|
// Try to get forecast data which includes recent + future points
|
||||||
|
data, err := s.GetForecast(ctx)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
return data.Points, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// FetchFromAPI forces an API fetch (used by background fetcher)
|
||||||
|
// This bypasses rate limiting as it's called on a schedule
|
||||||
|
func (s *WeatherService) FetchFromAPI(ctx context.Context) ([]model.WeatherPoint, error) {
|
||||||
|
points, err := s.client.GetWeatherForecast(ctx)
|
||||||
|
if err != nil {
|
||||||
|
return nil, fmt.Errorf("failed to fetch from API: %w", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Store in database
|
||||||
|
if err := s.repo.SaveObservations(ctx, points); err != nil {
|
||||||
|
s.logger.Error("failed to save observations to DB", "error", err)
|
||||||
|
// Continue anyway - return the data
|
||||||
|
}
|
||||||
|
|
||||||
|
// Update last fetch time
|
||||||
|
s.lastAPIFetchMu.Lock()
|
||||||
|
s.lastAPIFetch = time.Now()
|
||||||
|
s.lastAPIFetchMu.Unlock()
|
||||||
|
|
||||||
|
// Clear caches so next request gets fresh data
|
||||||
|
s.clearCache()
|
||||||
|
|
||||||
|
s.logger.Info("API fetch complete", "points", len(points))
|
||||||
|
return points, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// canFetchFromAPI checks if enough time has passed since last API fetch
|
||||||
|
func (s *WeatherService) canFetchFromAPI() bool {
|
||||||
|
s.lastAPIFetchMu.RLock()
|
||||||
|
defer s.lastAPIFetchMu.RUnlock()
|
||||||
|
|
||||||
|
if s.lastAPIFetch.IsZero() {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
return time.Since(s.lastAPIFetch) >= minAPIFetchInterval
|
||||||
|
}
|
||||||
|
|
||||||
|
// fetchAndStoreFromAPI fetches from API and stores in DB
|
||||||
|
func (s *WeatherService) fetchAndStoreFromAPI(ctx context.Context) ([]model.WeatherPoint, error) {
|
||||||
|
points, err := s.client.GetWeatherForecast(ctx)
|
||||||
|
if err != nil {
|
||||||
|
return nil, fmt.Errorf("failed to fetch from API: %w", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Store in database
|
||||||
|
if err := s.repo.SaveObservations(ctx, points); err != nil {
|
||||||
|
s.logger.Error("failed to save observations to DB", "error", err)
|
||||||
|
// Continue anyway
|
||||||
|
}
|
||||||
|
|
||||||
|
// Update last fetch time
|
||||||
|
s.lastAPIFetchMu.Lock()
|
||||||
|
s.lastAPIFetch = time.Now()
|
||||||
|
s.lastAPIFetchMu.Unlock()
|
||||||
|
|
||||||
|
return points, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// findClosestPoint finds the weather point closest to the target time
|
||||||
|
func (s *WeatherService) findClosestPoint(points []model.WeatherPoint, target time.Time) model.WeatherPoint {
|
||||||
|
if len(points) == 0 {
|
||||||
|
return model.WeatherPoint{}
|
||||||
|
}
|
||||||
|
|
||||||
|
closest := points[0]
|
||||||
|
minDiff := absDuration(points[0].Time.Sub(target))
|
||||||
|
|
||||||
|
for _, point := range points[1:] {
|
||||||
|
diff := absDuration(point.Time.Sub(target))
|
||||||
|
if diff < minDiff {
|
||||||
|
minDiff = diff
|
||||||
|
closest = point
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return closest
|
||||||
|
}
|
||||||
|
|
||||||
|
// Cache helpers
|
||||||
|
|
||||||
|
func (s *WeatherService) getFromCache(key string) interface{} {
|
||||||
|
s.cacheMu.RLock()
|
||||||
|
defer s.cacheMu.RUnlock()
|
||||||
|
|
||||||
|
entry, exists := s.cache[key]
|
||||||
|
if !exists {
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
if time.Now().After(entry.expiresAt) {
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
return entry.data
|
||||||
|
}
|
||||||
|
|
||||||
|
func (s *WeatherService) setCache(key string, data interface{}, ttl time.Duration) {
|
||||||
|
s.cacheMu.Lock()
|
||||||
|
defer s.cacheMu.Unlock()
|
||||||
|
|
||||||
|
s.cache[key] = cacheEntry{
|
||||||
|
data: data,
|
||||||
|
expiresAt: time.Now().Add(ttl),
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func (s *WeatherService) clearCache() {
|
||||||
|
s.cacheMu.Lock()
|
||||||
|
defer s.cacheMu.Unlock()
|
||||||
|
|
||||||
|
s.cache = make(map[string]cacheEntry)
|
||||||
|
}
|
||||||
|
|
||||||
|
// absDuration returns the absolute value of a duration
|
||||||
|
func absDuration(d time.Duration) time.Duration {
|
||||||
|
if d < 0 {
|
||||||
|
return -d
|
||||||
|
}
|
||||||
|
return d
|
||||||
|
}
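As a rough wiring sketch (not part of the commit), the service above composes with the client and repository packages like this; the surrounding function signature is invented for illustration and the fragment only compiles inside the module:

package example

import (
	"context"
	"fmt"
	"log/slog"

	"github.com/scottyah/paragliding/internal/client"
	"github.com/scottyah/paragliding/internal/model"
	"github.com/scottyah/paragliding/internal/repository"
	"github.com/scottyah/paragliding/internal/service"
)

// printForecast exercises the read path: in-memory cache, then the database,
// then the rate-limited Open-Meteo API fallback.
func printForecast(ctx context.Context, c *client.OpenMeteoClient, r *repository.WeatherRepository) ([]model.WeatherPoint, error) {
	weather := service.NewWeatherService(service.WeatherServiceConfig{
		Client: c,
		Repo:   r,
		Logger: slog.Default(),
	})

	data, err := weather.GetForecast(ctx)
	if err != nil {
		return nil, err
	}
	fmt.Printf("source=%s points=%d\n", data.Source, len(data.Points))
	return data.Points, nil
}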
|
||||||
1
backend/migrations/000001_create_weather_observations.down.sql
Normal file
@@ -0,0 +1 @@
DROP TABLE IF EXISTS weather_observations;
12
backend/migrations/000001_create_weather_observations.up.sql
Normal file
@@ -0,0 +1,12 @@
CREATE TABLE weather_observations (
    id BIGSERIAL PRIMARY KEY,
    observed_at TIMESTAMPTZ NOT NULL,
    wind_speed_mph DECIMAL(5,2) NOT NULL,
    wind_direction INTEGER NOT NULL CHECK (wind_direction >= 0 AND wind_direction < 360),
    wind_gust_mph DECIMAL(5,2),
    source VARCHAR(50) DEFAULT 'open-meteo',
    created_at TIMESTAMPTZ DEFAULT NOW(),
    UNIQUE (observed_at, source)
);

CREATE INDEX idx_weather_time ON weather_observations (observed_at DESC);
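The UNIQUE (observed_at, source) constraint is what allows repeated fetches to be stored idempotently via a conflict-based upsert. A hedged illustration against this table follows; the commit's repository package is not shown in this excerpt, so the driver choice and function name below are assumptions:

package example

import (
	"context"
	"database/sql"
	"time"

	_ "github.com/jackc/pgx/v5/stdlib" // assumed Postgres driver
)

// upsertObservation inserts one reading, updating it if the same
// (observed_at, source) pair was already stored.
func upsertObservation(ctx context.Context, db *sql.DB, at time.Time, speed, gust float64, dir int) error {
	_, err := db.ExecContext(ctx, `
		INSERT INTO weather_observations
			(observed_at, wind_speed_mph, wind_direction, wind_gust_mph, source)
		VALUES ($1, $2, $3, $4, 'open-meteo')
		ON CONFLICT (observed_at, source) DO UPDATE SET
			wind_speed_mph = EXCLUDED.wind_speed_mph,
			wind_direction = EXCLUDED.wind_direction,
			wind_gust_mph  = EXCLUDED.wind_gust_mph`,
		at, speed, dir, gust)
	return err
}

EXCLUDED refers to the row proposed for insertion, so re-fetching the same hour simply refreshes the stored values.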
106
backend/testdata/openmeteo_response.json
vendored
Normal file
@@ -0,0 +1,106 @@
|
|||||||
|
{
|
||||||
|
"latitude": 32.8893,
|
||||||
|
"longitude": -117.2519,
|
||||||
|
"generationtime_ms": 0.123,
|
||||||
|
"utc_offset_seconds": -28800,
|
||||||
|
"timezone": "America/Los_Angeles",
|
||||||
|
"timezone_abbreviation": "PST",
|
||||||
|
"elevation": 122.0,
|
||||||
|
"hourly_units": {
|
||||||
|
"time": "iso8601",
|
||||||
|
"wind_speed_10m": "mph",
|
||||||
|
"wind_direction_10m": "°",
|
||||||
|
"wind_gusts_10m": "mph"
|
||||||
|
},
|
||||||
|
"hourly": {
|
||||||
|
"time": [
|
||||||
|
"2026-01-01T00:00",
|
||||||
|
"2026-01-01T01:00",
|
||||||
|
"2026-01-01T02:00",
|
||||||
|
"2026-01-01T03:00",
|
||||||
|
"2026-01-01T04:00",
|
||||||
|
"2026-01-01T05:00",
|
||||||
|
"2026-01-01T06:00",
|
||||||
|
"2026-01-01T07:00",
|
||||||
|
"2026-01-01T08:00",
|
||||||
|
"2026-01-01T09:00",
|
||||||
|
"2026-01-01T10:00",
|
||||||
|
"2026-01-01T11:00",
|
||||||
|
"2026-01-01T12:00",
|
||||||
|
"2026-01-01T13:00",
|
||||||
|
"2026-01-01T14:00",
|
||||||
|
"2026-01-01T15:00",
|
||||||
|
"2026-01-01T16:00",
|
||||||
|
"2026-01-01T17:00",
|
||||||
|
"2026-01-01T18:00",
|
||||||
|
"2026-01-01T19:00",
|
||||||
|
"2026-01-01T20:00",
|
||||||
|
"2026-01-01T21:00",
|
||||||
|
"2026-01-01T22:00",
|
||||||
|
"2026-01-01T23:00",
|
||||||
|
"2026-01-02T00:00",
|
||||||
|
"2026-01-02T01:00",
|
||||||
|
"2026-01-02T02:00",
|
||||||
|
"2026-01-02T03:00",
|
||||||
|
"2026-01-02T04:00",
|
||||||
|
"2026-01-02T05:00",
|
||||||
|
"2026-01-02T06:00",
|
||||||
|
"2026-01-02T07:00",
|
||||||
|
"2026-01-02T08:00",
|
||||||
|
"2026-01-02T09:00",
|
||||||
|
"2026-01-02T10:00",
|
||||||
|
"2026-01-02T11:00",
|
||||||
|
"2026-01-02T12:00",
|
||||||
|
"2026-01-02T13:00",
|
||||||
|
"2026-01-02T14:00",
|
||||||
|
"2026-01-02T15:00",
|
||||||
|
"2026-01-02T16:00",
|
||||||
|
"2026-01-02T17:00",
|
||||||
|
"2026-01-02T18:00",
|
||||||
|
"2026-01-02T19:00",
|
||||||
|
"2026-01-02T20:00",
|
||||||
|
"2026-01-02T21:00",
|
||||||
|
"2026-01-02T22:00",
|
||||||
|
"2026-01-02T23:00",
|
||||||
|
"2026-01-03T00:00",
|
||||||
|
"2026-01-03T01:00",
|
||||||
|
"2026-01-03T02:00",
|
||||||
|
"2026-01-03T03:00",
|
||||||
|
"2026-01-03T04:00",
|
||||||
|
"2026-01-03T05:00",
|
||||||
|
"2026-01-03T06:00",
|
||||||
|
"2026-01-03T07:00",
|
||||||
|
"2026-01-03T08:00",
|
||||||
|
"2026-01-03T09:00",
|
||||||
|
"2026-01-03T10:00",
|
||||||
|
"2026-01-03T11:00",
|
||||||
|
"2026-01-03T12:00",
|
||||||
|
"2026-01-03T13:00",
|
||||||
|
"2026-01-03T14:00",
|
||||||
|
"2026-01-03T15:00",
|
||||||
|
"2026-01-03T16:00",
|
||||||
|
"2026-01-03T17:00",
|
||||||
|
"2026-01-03T18:00",
|
||||||
|
"2026-01-03T19:00",
|
||||||
|
"2026-01-03T20:00",
|
||||||
|
"2026-01-03T21:00",
|
||||||
|
"2026-01-03T22:00",
|
||||||
|
"2026-01-03T23:00"
|
||||||
|
],
|
||||||
|
"wind_speed_10m": [
|
||||||
|
5.2, 4.8, 4.5, 4.2, 4.0, 3.8, 3.5, 3.2, 4.5, 6.8, 8.5, 9.2, 10.5, 11.2, 12.0, 11.8, 10.5, 9.2, 7.5, 6.2, 5.5, 5.0, 4.8, 4.5,
|
||||||
|
4.2, 4.0, 3.8, 3.5, 3.2, 3.0, 2.8, 3.5, 5.2, 7.8, 9.5, 10.8, 11.5, 12.2, 13.0, 12.8, 11.5, 10.2, 8.5, 7.0, 6.0, 5.5, 5.0, 4.8,
|
||||||
|
4.5, 4.2, 4.0, 3.8, 3.5, 3.2, 3.0, 2.8, 4.0, 6.5, 8.8, 10.5, 11.8, 12.5, 13.2, 13.8, 12.5, 11.0, 9.0, 7.5, 6.5, 6.0, 5.5, 5.2
|
||||||
|
],
|
||||||
|
"wind_direction_10m": [
|
||||||
|
280, 282, 285, 288, 290, 292, 295, 298, 275, 270, 268, 265, 262, 260, 258, 260, 265, 270, 275, 280, 285, 288, 290, 292,
|
||||||
|
295, 298, 300, 302, 305, 308, 310, 280, 275, 270, 268, 265, 262, 260, 258, 260, 265, 270, 275, 280, 285, 288, 290, 292,
|
||||||
|
295, 298, 300, 302, 305, 308, 310, 312, 285, 275, 270, 268, 265, 262, 260, 258, 262, 268, 275, 280, 285, 288, 290, 292
|
||||||
|
],
|
||||||
|
"wind_gusts_10m": [
|
||||||
|
8.5, 8.0, 7.5, 7.0, 6.8, 6.5, 6.0, 5.8, 7.5, 10.2, 12.5, 13.8, 15.2, 16.5, 17.8, 17.5, 15.8, 14.2, 12.0, 10.5, 9.5, 9.0, 8.5, 8.0,
|
||||||
|
7.5, 7.0, 6.8, 6.5, 6.0, 5.8, 5.5, 6.8, 9.2, 11.8, 14.0, 16.2, 17.5, 18.8, 19.5, 19.2, 17.5, 15.8, 13.5, 11.5, 10.5, 10.0, 9.5, 9.0,
|
||||||
|
8.5, 8.0, 7.5, 7.0, 6.8, 6.5, 6.0, 5.8, 7.8, 10.8, 13.5, 15.8, 17.5, 18.8, 19.8, 20.5, 18.8, 16.5, 14.0, 12.0, 11.0, 10.5, 10.0, 9.5
|
||||||
|
]
|
||||||
|
}
|
||||||
|
}
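To show the shape of this fixture, here is a small decoding sketch; the commit's client package defines its own response types, so the struct and helper below are illustrative only:

package example

import (
	"encoding/json"
	"os"
	"time"
)

// openMeteoResponse mirrors the hourly arrays in the fixture above.
type openMeteoResponse struct {
	Hourly struct {
		Time         []string  `json:"time"`
		WindSpeed10m []float64 `json:"wind_speed_10m"`
		WindDir10m   []int     `json:"wind_direction_10m"`
		WindGusts10m []float64 `json:"wind_gusts_10m"`
	} `json:"hourly"`
}

func loadFixture(path string) (*openMeteoResponse, []time.Time, error) {
	raw, err := os.ReadFile(path)
	if err != nil {
		return nil, nil, err
	}
	var resp openMeteoResponse
	if err := json.Unmarshal(raw, &resp); err != nil {
		return nil, nil, err
	}
	// Open-Meteo hourly timestamps omit seconds and zone, e.g. "2026-01-01T08:00".
	times := make([]time.Time, 0, len(resp.Hourly.Time))
	for _, s := range resp.Hourly.Time {
		t, err := time.Parse("2006-01-02T15:04", s)
		if err != nil {
			return nil, nil, err
		}
		times = append(times, t)
	}
	return &resp, times, nil
}

Given the path backend/testdata/openmeteo_response.json, the helper yields 72 hourly samples spanning 2026-01-01 through 2026-01-03.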
|
||||||
447
dev.sh
Executable file
@@ -0,0 +1,447 @@
|
|||||||
|
#!/usr/bin/env bash
|
||||||
|
|
||||||
|
# dev.sh - Idempotent local development setup script
|
||||||
|
# This script can be run multiple times safely
|
||||||
|
#
|
||||||
|
# Usage:
|
||||||
|
# ./dev.sh - Setup only (PostgreSQL + .env files + migrations)
|
||||||
|
# ./dev.sh --start - Setup and start backend + frontend
|
||||||
|
# ./dev.sh --stop - Stop running services
|
||||||
|
|
||||||
|
set -e # Exit on error
|
||||||
|
|
||||||
|
# Colors for output
|
||||||
|
RED='\033[0;31m'
|
||||||
|
GREEN='\033[0;32m'
|
||||||
|
YELLOW='\033[1;33m'
|
||||||
|
BLUE='\033[0;34m'
|
||||||
|
NC='\033[0m' # No Color
|
||||||
|
|
||||||
|
# Configuration
|
||||||
|
DB_URL="postgres://dev:devpass@localhost:5432/paragliding?sslmode=disable"
|
||||||
|
CONTAINER_NAME="paragliding-postgres"
|
||||||
|
BACKEND_DIR="backend"
|
||||||
|
FRONTEND_DIR="frontend"
|
||||||
|
BACKEND_PID_FILE=".backend.pid"
|
||||||
|
FRONTEND_PID_FILE=".frontend.pid"
|
||||||
|
BACKEND_LOG_FILE="backend.log"
|
||||||
|
FRONTEND_LOG_FILE="frontend.log"
|
||||||
|
|
||||||
|
# Helper functions
|
||||||
|
info() {
|
||||||
|
echo -e "${BLUE}ℹ${NC} $1"
|
||||||
|
}
|
||||||
|
|
||||||
|
success() {
|
||||||
|
echo -e "${GREEN}✓${NC} $1"
|
||||||
|
}
|
||||||
|
|
||||||
|
warning() {
|
||||||
|
echo -e "${YELLOW}⚠${NC} $1"
|
||||||
|
}
|
||||||
|
|
||||||
|
error() {
|
||||||
|
echo -e "${RED}✗${NC} $1"
|
||||||
|
}
|
||||||
|
|
||||||
|
# Check if a command exists
|
||||||
|
command_exists() {
|
||||||
|
command -v "$1" >/dev/null 2>&1
|
||||||
|
}
|
||||||
|
|
||||||
|
# Check if PostgreSQL container is running
|
||||||
|
is_postgres_running() {
|
||||||
|
if command_exists podman; then
|
||||||
|
podman ps --filter "name=$CONTAINER_NAME" --filter "status=running" --format "{{.Names}}" | grep -q "$CONTAINER_NAME"
|
||||||
|
elif command_exists docker; then
|
||||||
|
docker ps --filter "name=$CONTAINER_NAME" --filter "status=running" --format "{{.Names}}" | grep -q "$CONTAINER_NAME"
|
||||||
|
else
|
||||||
|
return 1
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
# Check if PostgreSQL is ready to accept connections
|
||||||
|
is_postgres_ready() {
|
||||||
|
if command_exists podman; then
|
||||||
|
podman exec "$CONTAINER_NAME" pg_isready -U dev -d paragliding >/dev/null 2>&1
|
||||||
|
elif command_exists docker; then
|
||||||
|
docker exec "$CONTAINER_NAME" pg_isready -U dev -d paragliding >/dev/null 2>&1
|
||||||
|
else
|
||||||
|
return 1
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
# Start PostgreSQL if not running
|
||||||
|
start_postgres() {
|
||||||
|
if is_postgres_running; then
|
||||||
|
success "PostgreSQL is already running"
|
||||||
|
return 0
|
||||||
|
fi
|
||||||
|
|
||||||
|
info "Starting PostgreSQL container..."
|
||||||
|
|
||||||
|
if command_exists podman; then
|
||||||
|
if podman ps -a --filter "name=$CONTAINER_NAME" --format "{{.Names}}" | grep -q "$CONTAINER_NAME"; then
|
||||||
|
# Container exists but is not running
|
||||||
|
info "Starting existing PostgreSQL container..."
|
||||||
|
podman start "$CONTAINER_NAME" >/dev/null
|
||||||
|
else
|
||||||
|
# Container doesn't exist, use docker-compose
|
||||||
|
if command_exists podman-compose; then
|
||||||
|
podman-compose -f docker-compose.dev.yml up -d
|
||||||
|
else
|
||||||
|
error "podman-compose not found. Please install it or use 'make dev-db'"
|
||||||
|
return 1
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
elif command_exists docker; then
|
||||||
|
if docker ps -a --filter "name=$CONTAINER_NAME" --format "{{.Names}}" | grep -q "$CONTAINER_NAME"; then
|
||||||
|
info "Starting existing PostgreSQL container..."
|
||||||
|
docker start "$CONTAINER_NAME" >/dev/null
|
||||||
|
else
|
||||||
|
if command_exists docker-compose; then
|
||||||
|
docker-compose -f docker-compose.dev.yml up -d
|
||||||
|
else
|
||||||
|
error "docker-compose not found. Please install it or use 'make dev-db'"
|
||||||
|
return 1
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
else
|
||||||
|
error "Neither podman nor docker found. Please install one of them."
|
||||||
|
return 1
|
||||||
|
fi
|
||||||
|
|
||||||
|
# Wait for PostgreSQL to be ready
|
||||||
|
info "Waiting for PostgreSQL to be ready..."
|
||||||
|
for i in {1..30}; do
|
||||||
|
if is_postgres_ready; then
|
||||||
|
success "PostgreSQL is ready!"
|
||||||
|
return 0
|
||||||
|
fi
|
||||||
|
sleep 1
|
||||||
|
done
|
||||||
|
|
||||||
|
error "PostgreSQL failed to become ready in time"
|
||||||
|
return 1
|
||||||
|
}
|
||||||
|
|
||||||
|
# Setup backend environment
|
||||||
|
setup_backend_env() {
|
||||||
|
if [ -f "$BACKEND_DIR/.env" ]; then
|
||||||
|
success "Backend .env file already exists"
|
||||||
|
else
|
||||||
|
if [ -f "$BACKEND_DIR/.env.example" ]; then
|
||||||
|
info "Creating backend/.env from .env.example..."
|
||||||
|
cp "$BACKEND_DIR/.env.example" "$BACKEND_DIR/.env"
|
||||||
|
success "Backend .env file created"
|
||||||
|
else
|
||||||
|
warning "Backend .env.example not found, creating default .env..."
|
||||||
|
cat > "$BACKEND_DIR/.env" <<EOF
|
||||||
|
DATABASE_URL=$DB_URL
|
||||||
|
PORT=8080
|
||||||
|
LOCATION_LAT=32.8893
|
||||||
|
LOCATION_LON=-117.2519
|
||||||
|
LOCATION_NAME=Torrey Pines Gliderport
|
||||||
|
TIMEZONE=America/Los_Angeles
|
||||||
|
FETCH_INTERVAL=15m
|
||||||
|
CACHE_TTL=10m
|
||||||
|
EOF
|
||||||
|
success "Backend .env file created with defaults"
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
# Setup frontend environment
|
||||||
|
setup_frontend_env() {
|
||||||
|
if [ -f "$FRONTEND_DIR/.env.local" ]; then
|
||||||
|
success "Frontend .env.local file already exists"
|
||||||
|
else
|
||||||
|
info "Creating frontend/.env.local..."
|
||||||
|
cat > "$FRONTEND_DIR/.env.local" <<EOF
|
||||||
|
NEXT_PUBLIC_API_URL=http://localhost:8080/api/v1
|
||||||
|
EOF
|
||||||
|
success "Frontend .env.local file created"
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
# Run database migrations
|
||||||
|
run_migrations() {
|
||||||
|
info "Checking database migrations..."
|
||||||
|
|
||||||
|
if [ ! -d "$BACKEND_DIR/migrations" ]; then
|
||||||
|
warning "No migrations directory found, skipping..."
|
||||||
|
return 0
|
||||||
|
fi
|
||||||
|
|
||||||
|
# Check if migrations have been run by trying to query a table
|
||||||
|
# This is a simple check - you might want to use a more robust migration tool
|
||||||
|
cd "$BACKEND_DIR"
|
||||||
|
|
||||||
|
# Try to run migrations
|
||||||
|
if command_exists go; then
|
||||||
|
info "Running database migrations..."
|
||||||
|
if DATABASE_URL="$DB_URL" go run -tags 'postgres' github.com/golang-migrate/migrate/v4/cmd/migrate@latest -path ./migrations -database "$DB_URL" up 2>&1 | grep -q "no change"; then
|
||||||
|
success "Migrations are up to date"
|
||||||
|
else
|
||||||
|
success "Migrations applied"
|
||||||
|
fi
|
||||||
|
else
|
||||||
|
warning "Go not found, skipping migrations. Install Go to run migrations."
|
||||||
|
fi
|
||||||
|
|
||||||
|
cd - >/dev/null
|
||||||
|
}
|
||||||
|
|
||||||
|
# Check dependencies
|
||||||
|
check_dependencies() {
|
||||||
|
info "Checking dependencies..."
|
||||||
|
|
||||||
|
local missing_deps=()
|
||||||
|
|
||||||
|
if ! command_exists go; then
|
||||||
|
missing_deps+=("go (for backend)")
|
||||||
|
fi
|
||||||
|
|
||||||
|
if ! command_exists node && ! command_exists bun; then
|
||||||
|
        missing_deps+=("node or bun (for frontend)")
    fi

    if ! command_exists podman && ! command_exists docker; then
        missing_deps+=("podman or docker (for PostgreSQL)")
    fi

    if [ ${#missing_deps[@]} -gt 0 ]; then
        warning "Missing dependencies:"
        for dep in "${missing_deps[@]}"; do
            echo " - $dep"
        done
        echo ""
    else
        success "All dependencies found"
    fi
}

# Start backend service in background
start_backend() {
    if [ -f "$BACKEND_PID_FILE" ] && kill -0 "$(cat $BACKEND_PID_FILE)" 2>/dev/null; then
        warning "Backend is already running (PID: $(cat $BACKEND_PID_FILE))"
        return 0
    fi

    if ! command_exists go; then
        error "Go is required to run the backend. Please install Go first."
        return 1
    fi

    if [ ! -f "$BACKEND_DIR/.env" ]; then
        error "Backend .env file not found. Run './dev.sh' first to set up."
        return 1
    fi

    info "Starting backend server..."

    # Start backend in subshell to isolate environment
    (
        cd "$BACKEND_DIR"
        # Export environment variables from .env file
        set -a
        source .env
        set +a
        exec go run ./cmd/api
    ) > "$BACKEND_LOG_FILE" 2>&1 &

    echo $! > "$BACKEND_PID_FILE"

    success "Backend started (PID: $(cat $BACKEND_PID_FILE), logs: $BACKEND_LOG_FILE)"
}

# Start frontend service in background
start_frontend() {
    if [ -f "$FRONTEND_PID_FILE" ] && kill -0 "$(cat $FRONTEND_PID_FILE)" 2>/dev/null; then
        warning "Frontend is already running (PID: $(cat $FRONTEND_PID_FILE))"
        return 0
    fi

    local runner="npm"
    if command_exists bun; then
        runner="bun"
    elif ! command_exists npm; then
        error "Neither npm nor bun found. Please install Node.js or Bun."
        return 1
    fi

    info "Starting frontend server with $runner..."

    # Start frontend in subshell to isolate environment
    (
        cd "$FRONTEND_DIR"
        exec $runner run dev
    ) > "$FRONTEND_LOG_FILE" 2>&1 &

    echo $! > "$FRONTEND_PID_FILE"

    success "Frontend started (PID: $(cat $FRONTEND_PID_FILE), logs: $FRONTEND_LOG_FILE)"
}

# Stop running services
stop_services() {
    echo ""
    info "Stopping services..."
    local stopped=0

    if [ -f "$BACKEND_PID_FILE" ]; then
        local pid=$(cat "$BACKEND_PID_FILE")
        if kill -0 "$pid" 2>/dev/null; then
            kill "$pid"
            success "Backend stopped (was PID: $pid)"
            stopped=1
        fi
        rm -f "$BACKEND_PID_FILE"
    fi

    if [ -f "$FRONTEND_PID_FILE" ]; then
        local pid=$(cat "$FRONTEND_PID_FILE")
        if kill -0 "$pid" 2>/dev/null; then
            kill "$pid"
            success "Frontend stopped (was PID: $pid)"
            stopped=1
        fi
        rm -f "$FRONTEND_PID_FILE"
    fi

    if [ $stopped -eq 0 ]; then
        info "No services were running"
    fi

    echo ""
}

# Show service status
show_status() {
    echo ""
    info "Service Status:"
    echo ""

    # PostgreSQL
    if is_postgres_running; then
        echo " PostgreSQL: ${GREEN}●${NC} Running"
    else
        echo " PostgreSQL: ${RED}●${NC} Stopped"
    fi

    # Backend
    if [ -f "$BACKEND_PID_FILE" ] && kill -0 "$(cat $BACKEND_PID_FILE)" 2>/dev/null; then
        echo " Backend: ${GREEN}●${NC} Running (PID: $(cat $BACKEND_PID_FILE))"
    else
        echo " Backend: ${RED}●${NC} Stopped"
        [ -f "$BACKEND_PID_FILE" ] && rm -f "$BACKEND_PID_FILE"
    fi

    # Frontend
    if [ -f "$FRONTEND_PID_FILE" ] && kill -0 "$(cat $FRONTEND_PID_FILE)" 2>/dev/null; then
        echo " Frontend: ${GREEN}●${NC} Running (PID: $(cat $FRONTEND_PID_FILE))"
    else
        echo " Frontend: ${RED}●${NC} Stopped"
        [ -f "$FRONTEND_PID_FILE" ] && rm -f "$FRONTEND_PID_FILE"
    fi

    echo ""
}

# Main setup flow
main() {
    echo ""
    echo "🚀 Paragliding Local Development Setup"
    echo "======================================"
    echo ""

    # Check dependencies
    check_dependencies
    echo ""

    # Step 1: Start PostgreSQL
    info "Step 1: PostgreSQL"
    start_postgres || exit 1
    echo ""

    # Step 2: Setup environment files
    info "Step 2: Environment Configuration"
    setup_backend_env
    setup_frontend_env
    echo ""

    # Step 3: Run migrations
    info "Step 3: Database Migrations"
    run_migrations
    echo ""

    # Final instructions
    success "Setup complete! 🎉"
    echo ""
}

# Main entry point
case "${1:-}" in
    --start)
        main
        info "Step 4: Starting Services"
        echo ""
        start_backend
        sleep 2 # Give backend a moment to start
        start_frontend
        echo ""
        success "All services started! 🚀"
        echo ""
        echo "Services are running in the background:"
        echo " - Frontend: ${BLUE}http://localhost:3000${NC}"
        echo " - Backend: ${BLUE}http://localhost:8080${NC}"
        echo " - API: ${BLUE}http://localhost:8080/api/v1${NC}"
        echo ""
        echo "View logs:"
        echo " ${YELLOW}tail -f $BACKEND_LOG_FILE${NC}"
        echo " ${YELLOW}tail -f $FRONTEND_LOG_FILE${NC}"
        echo ""
        echo "Stop services:"
        echo " ${YELLOW}./dev.sh --stop${NC}"
        echo ""
        ;;
    --stop)
        stop_services
        ;;
    --status)
        show_status
        ;;
    --help)
        echo "Usage: ./dev.sh [OPTION]"
        echo ""
        echo "Options:"
        echo " (none) Setup only (PostgreSQL + .env files + migrations)"
        echo " --start Setup and start backend + frontend services"
        echo " --stop Stop running backend + frontend services"
        echo " --status Show status of all services"
        echo " --help Show this help message"
        echo ""
        ;;
    "")
        main
        echo "Next steps:"
        echo ""
        echo " Option 1 - Start services in background:"
        echo " ${GREEN}./dev.sh --start${NC}"
        echo ""
        echo " Option 2 - Run in separate terminals:"
        echo " Terminal 1: ${GREEN}make run-backend${NC}"
        echo " Terminal 2: ${GREEN}make run-frontend${NC}"
        echo ""
        echo "Services will be available at:"
        echo " - Frontend: ${BLUE}http://localhost:3000${NC}"
        echo " - Backend: ${BLUE}http://localhost:8080${NC}"
        echo " - API: ${BLUE}http://localhost:8080/api/v1${NC}"
        echo ""
        ;;
    *)
        error "Unknown option: $1"
        echo "Run './dev.sh --help' for usage information"
        exit 1
        ;;
esac
24
docker-compose.dev.yml
Normal file
@@ -0,0 +1,24 @@
# Docker Compose for local development - only runs PostgreSQL
# Run apps locally with: npm/bun/go for faster iteration
version: '3.8'

services:
  postgres:
    image: postgres:16-alpine
    container_name: paragliding-postgres
    environment:
      POSTGRES_DB: paragliding
      POSTGRES_USER: dev
      POSTGRES_PASSWORD: devpass
    ports:
      - "5432:5432"
    volumes:
      - postgres_data:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U dev -d paragliding"]
      interval: 5s
      timeout: 5s
      retries: 5

volumes:
  postgres_data:
57
docker-compose.yml
Normal file
@@ -0,0 +1,57 @@
# Use with: podman-compose up -d
services:
  postgres:
    image: postgres:16-alpine
    environment:
      POSTGRES_DB: paragliding
      POSTGRES_USER: dev
      POSTGRES_PASSWORD: devpass
    ports:
      - "5432:5432"
    volumes:
      - postgres_data:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U dev -d paragliding"]
      interval: 5s
      timeout: 5s
      retries: 5

  backend:
    build:
      context: ./backend
      dockerfile: Dockerfile.dev
    ports:
      - "8080:8080"
    environment:
      DATABASE_URL: postgres://dev:devpass@postgres:5432/paragliding?sslmode=disable
      PORT: "8080"
      LOCATION_LAT: "32.8893"
      LOCATION_LON: "-117.2519"
      LOCATION_NAME: "Torrey Pines Gliderport"
      TIMEZONE: "America/Los_Angeles"
      FETCH_INTERVAL: "15m"
      CACHE_TTL: "10m"
    volumes:
      - ./backend:/app
    depends_on:
      postgres:
        condition: service_healthy

  frontend:
    build:
      context: ./frontend
      dockerfile: Dockerfile
      target: dev
    ports:
      - "3000:3000"
    environment:
      NEXT_PUBLIC_API_URL: http://localhost:8080
    volumes:
      - ./frontend:/app
      - /app/node_modules
      - /app/.next
    depends_on:
      - backend

volumes:
  postgres_data:
2
frontend/.env.local.example
Normal file
@@ -0,0 +1,2 @@
# Copy this file to .env.local and fill in values
NEXT_PUBLIC_API_URL=http://localhost:8080/api/v1
64
frontend/Dockerfile
Normal file
@@ -0,0 +1,64 @@
# Dependencies stage
FROM node:20-alpine AS deps
WORKDIR /app

COPY package.json package-lock.json* ./
RUN npm ci

# Build stage
FROM node:20-alpine AS builder
WORKDIR /app

COPY --from=deps /app/node_modules ./node_modules
COPY . .

# Set production environment
ENV NEXT_TELEMETRY_DISABLED 1
ENV NODE_ENV production

RUN npm run build

# Production stage
FROM node:20-alpine AS runner
WORKDIR /app

ENV NODE_ENV production
ENV NEXT_TELEMETRY_DISABLED 1

RUN addgroup --system --gid 1001 nodejs
RUN adduser --system --uid 1001 nextjs

COPY --from=builder /app/public ./public

# Set permissions for prerender cache
RUN mkdir .next
RUN chown nextjs:nodejs .next

# Copy standalone output
COPY --from=builder --chown=nextjs:nodejs /app/.next/standalone ./
COPY --from=builder --chown=nextjs:nodejs /app/.next/static ./.next/static

USER nextjs

EXPOSE 3000

ENV PORT 3000
ENV HOSTNAME "0.0.0.0"

CMD ["node", "server.js"]

# Development stage
FROM node:20-alpine AS dev
WORKDIR /app

COPY package.json package-lock.json* ./
RUN npm ci

COPY . .

ENV NODE_ENV development
ENV NEXT_TELEMETRY_DISABLED 1

EXPOSE 3000

CMD ["npm", "run", "dev"]
59
frontend/app/globals.css
Normal file
@@ -0,0 +1,59 @@
|
|||||||
|
@tailwind base;
|
||||||
|
@tailwind components;
|
||||||
|
@tailwind utilities;
|
||||||
|
|
||||||
|
@layer base {
|
||||||
|
:root {
|
||||||
|
--background: 0 0% 100%;
|
||||||
|
--foreground: 222.2 84% 4.9%;
|
||||||
|
--card: 0 0% 100%;
|
||||||
|
--card-foreground: 222.2 84% 4.9%;
|
||||||
|
--popover: 0 0% 100%;
|
||||||
|
--popover-foreground: 222.2 84% 4.9%;
|
||||||
|
--primary: 221.2 83.2% 53.3%;
|
||||||
|
--primary-foreground: 210 40% 98%;
|
||||||
|
--secondary: 210 40% 96.1%;
|
||||||
|
--secondary-foreground: 222.2 47.4% 11.2%;
|
||||||
|
--muted: 210 40% 96.1%;
|
||||||
|
--muted-foreground: 215.4 16.3% 46.9%;
|
||||||
|
--accent: 210 40% 96.1%;
|
||||||
|
--accent-foreground: 222.2 47.4% 11.2%;
|
||||||
|
--destructive: 0 84.2% 60.2%;
|
||||||
|
--destructive-foreground: 210 40% 98%;
|
||||||
|
--border: 214.3 31.8% 91.4%;
|
||||||
|
--input: 214.3 31.8% 91.4%;
|
||||||
|
--ring: 221.2 83.2% 53.3%;
|
||||||
|
--radius: 0.5rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.dark {
|
||||||
|
--background: 222.2 84% 4.9%;
|
||||||
|
--foreground: 210 40% 98%;
|
||||||
|
--card: 222.2 84% 4.9%;
|
||||||
|
--card-foreground: 210 40% 98%;
|
||||||
|
--popover: 222.2 84% 4.9%;
|
||||||
|
--popover-foreground: 210 40% 98%;
|
||||||
|
--primary: 217.2 91.2% 59.8%;
|
||||||
|
--primary-foreground: 222.2 47.4% 11.2%;
|
||||||
|
--secondary: 217.2 32.6% 17.5%;
|
||||||
|
--secondary-foreground: 210 40% 98%;
|
||||||
|
--muted: 217.2 32.6% 17.5%;
|
||||||
|
--muted-foreground: 215 20.2% 65.1%;
|
||||||
|
--accent: 217.2 32.6% 17.5%;
|
||||||
|
--accent-foreground: 210 40% 98%;
|
||||||
|
--destructive: 0 62.8% 30.6%;
|
||||||
|
--destructive-foreground: 210 40% 98%;
|
||||||
|
--border: 217.2 32.6% 17.5%;
|
||||||
|
--input: 217.2 32.6% 17.5%;
|
||||||
|
--ring: 224.3 76.3% 48%;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
@layer base {
|
||||||
|
* {
|
||||||
|
@apply border-border;
|
||||||
|
}
|
||||||
|
body {
|
||||||
|
@apply bg-background text-foreground;
|
||||||
|
}
|
||||||
|
}
|
||||||
25
frontend/app/layout.tsx
Normal file
@@ -0,0 +1,25 @@
import type { Metadata } from 'next'
import { Inter } from 'next/font/google'
import './globals.css'
import { Providers } from './providers'

const inter = Inter({ subsets: ['latin'] })

export const metadata: Metadata = {
  title: 'Paragliding Weather Forecaster',
  description: 'Real-time weather analysis for paragliding conditions',
}

export default function RootLayout({
  children,
}: {
  children: React.ReactNode
}) {
  return (
    <html lang="en" className="dark">
      <body className={inter.className}>
        <Providers>{children}</Providers>
      </body>
    </html>
  )
}
161
frontend/app/page.tsx
Normal file
@@ -0,0 +1,161 @@
|
|||||||
|
'use client'
|
||||||
|
|
||||||
|
import { useCurrentWeather, useForecast, useHistorical } from '@/hooks/use-weather'
|
||||||
|
import { AssessmentBadge } from '@/components/weather/assessment-badge'
|
||||||
|
import { WindSpeedChart } from '@/components/weather/wind-speed-chart'
|
||||||
|
import { WindDirectionChart } from '@/components/weather/wind-direction-chart'
|
||||||
|
import { ThresholdControls } from '@/components/weather/threshold-controls'
|
||||||
|
import { RefreshCountdown } from '@/components/weather/refresh-countdown'
|
||||||
|
import { StaleDataBanner } from '@/components/weather/stale-data-banner'
|
||||||
|
import { Collapsible } from '@/components/ui/collapsible'
|
||||||
|
import { useQueryClient } from '@tanstack/react-query'
|
||||||
|
import { Loader2, AlertCircle } from 'lucide-react'
|
||||||
|
import { subDays, format } from 'date-fns'
|
||||||
|
|
||||||
|
export default function DashboardPage() {
|
||||||
|
const queryClient = useQueryClient()
|
||||||
|
|
||||||
|
// Fetch current weather and forecast
|
||||||
|
const {
|
||||||
|
data: currentWeather,
|
||||||
|
isLoading: isLoadingCurrent,
|
||||||
|
error: currentError,
|
||||||
|
} = useCurrentWeather()
|
||||||
|
|
||||||
|
const {
|
||||||
|
data: forecast,
|
||||||
|
isLoading: isLoadingForecast,
|
||||||
|
error: forecastError,
|
||||||
|
} = useForecast()
|
||||||
|
|
||||||
|
// Fetch yesterday's data for comparison
|
||||||
|
const yesterday = format(subDays(new Date(), 1), 'yyyy-MM-dd')
|
||||||
|
const {
|
||||||
|
data: historicalData,
|
||||||
|
isLoading: isLoadingHistorical,
|
||||||
|
} = useHistorical(yesterday)
|
||||||
|
|
||||||
|
// Handle refresh
|
||||||
|
const handleRefresh = async () => {
|
||||||
|
await queryClient.invalidateQueries({ queryKey: ['weather'] })
|
||||||
|
}
|
||||||
|
|
||||||
|
// Loading state
|
||||||
|
if (isLoadingCurrent || isLoadingForecast) {
|
||||||
|
return (
|
||||||
|
<div className="flex items-center justify-center min-h-screen">
|
||||||
|
<div className="text-center space-y-4">
|
||||||
|
<Loader2 className="h-12 w-12 animate-spin mx-auto text-primary" />
|
||||||
|
<p className="text-muted-foreground">Loading weather data...</p>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Error state
|
||||||
|
if (currentError || forecastError) {
|
||||||
|
return (
|
||||||
|
<div className="flex items-center justify-center min-h-screen p-4">
|
||||||
|
<div className="max-w-md w-full">
|
||||||
|
<div className="flex items-center gap-3 rounded-lg border border-red-500 bg-red-50 dark:bg-red-950 p-6">
|
||||||
|
<AlertCircle className="h-8 w-8 text-red-600 dark:text-red-400 flex-shrink-0" />
|
||||||
|
<div>
|
||||||
|
<h2 className="text-lg font-semibold text-red-800 dark:text-red-200 mb-1">
|
||||||
|
Failed to load weather data
|
||||||
|
</h2>
|
||||||
|
<p className="text-sm text-red-700 dark:text-red-300">
|
||||||
|
{(currentError as Error)?.message || (forecastError as Error)?.message || 'An unexpected error occurred'}
|
||||||
|
</p>
|
||||||
|
<button
|
||||||
|
onClick={handleRefresh}
|
||||||
|
className="mt-3 text-sm font-medium text-red-600 dark:text-red-400 hover:underline"
|
||||||
|
>
|
||||||
|
Try again
|
||||||
|
</button>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
)
|
||||||
|
}
|
||||||
|
|
||||||
|
// No data state
|
||||||
|
if (!currentWeather || !forecast) {
|
||||||
|
return (
|
||||||
|
<div className="flex items-center justify-center min-h-screen">
|
||||||
|
<div className="text-center space-y-4">
|
||||||
|
<AlertCircle className="h-12 w-12 mx-auto text-muted-foreground" />
|
||||||
|
<p className="text-muted-foreground">No weather data available</p>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Get best flyable window
|
||||||
|
const bestWindow = forecast.flyable_windows && forecast.flyable_windows.length > 0
|
||||||
|
? forecast.flyable_windows[0]
|
||||||
|
: undefined
|
||||||
|
|
||||||
|
return (
|
||||||
|
<div className="min-h-screen bg-background">
|
||||||
|
<div className="container mx-auto px-4 py-6 md:py-8 space-y-6">
|
||||||
|
{/* Header */}
|
||||||
|
<div className="space-y-2">
|
||||||
|
<h1 className="text-3xl md:text-4xl font-bold">
|
||||||
|
#ShouldIFly TPG?
|
||||||
|
</h1>
|
||||||
|
{currentWeather.location.name && (
|
||||||
|
<p className="text-muted-foreground">
|
||||||
|
{currentWeather.location.name} ({currentWeather.location.lat.toFixed(2)}, {currentWeather.location.lon.toFixed(2)})
|
||||||
|
</p>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{/* Stale Data Warning */}
|
||||||
|
<StaleDataBanner lastUpdated={currentWeather.last_updated} />
|
||||||
|
|
||||||
|
{/* Assessment Badge */}
|
||||||
|
<div className="grid grid-cols-3 gap-6">
|
||||||
|
<div></div>
|
||||||
|
<AssessmentBadge
|
||||||
|
assessment={currentWeather.assessment}
|
||||||
|
currentWeather={currentWeather.current}
|
||||||
|
bestWindow={bestWindow}
|
||||||
|
/>
|
||||||
|
<div></div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{/* Charts */}
|
||||||
|
<div className="grid gap-6 lg:grid-cols-1">
|
||||||
|
<WindSpeedChart
|
||||||
|
data={forecast.forecast}
|
||||||
|
yesterdayData={historicalData?.data}
|
||||||
|
/>
|
||||||
|
<WindDirectionChart
|
||||||
|
data={forecast.forecast}
|
||||||
|
yesterdayData={historicalData?.data}
|
||||||
|
/>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{/* Threshold Controls - Collapsible */}
|
||||||
|
<Collapsible title="Threshold Controls" defaultOpen={false}>
|
||||||
|
<ThresholdControls />
|
||||||
|
</Collapsible>
|
||||||
|
|
||||||
|
{/* Refresh Countdown */}
|
||||||
|
<RefreshCountdown onRefresh={handleRefresh} />
|
||||||
|
|
||||||
|
{/* Footer Info */}
|
||||||
|
<div className="text-center text-xs text-muted-foreground pt-8 border-t">
|
||||||
|
<p>
|
||||||
|
Data updates every 5 minutes. Forecast generated at{' '}
|
||||||
|
{format(new Date(forecast.generated_at), 'PPpp')}
|
||||||
|
</p>
|
||||||
|
{isLoadingHistorical && (
|
||||||
|
<p className="mt-1">Loading historical comparison data...</p>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
)
|
||||||
|
}
|
||||||
22
frontend/app/providers.tsx
Normal file
@@ -0,0 +1,22 @@
'use client'

import { QueryClient, QueryClientProvider } from '@tanstack/react-query'
import { useState } from 'react'

export function Providers({ children }: { children: React.ReactNode }) {
  const [queryClient] = useState(
    () =>
      new QueryClient({
        defaultOptions: {
          queries: {
            staleTime: 60 * 1000, // 1 minute
            refetchOnWindowFocus: false,
          },
        },
      })
  )

  return (
    <QueryClientProvider client={queryClient}>{children}</QueryClientProvider>
  )
}
17
frontend/components.json
Normal file
@@ -0,0 +1,17 @@
{
  "$schema": "https://ui.shadcn.com/schema.json",
  "style": "default",
  "rsc": true,
  "tsx": true,
  "tailwind": {
    "config": "tailwind.config.js",
    "css": "app/globals.css",
    "baseColor": "slate",
    "cssVariables": true,
    "prefix": ""
  },
  "aliases": {
    "components": "@/components",
    "utils": "@/lib/utils"
  }
}
55
frontend/components/ui/button.tsx
Normal file
@@ -0,0 +1,55 @@
|
|||||||
|
import * as React from 'react'
|
||||||
|
import { Slot } from '@radix-ui/react-slot'
|
||||||
|
import { cva, type VariantProps } from 'class-variance-authority'
|
||||||
|
import { cn } from '@/lib/utils'
|
||||||
|
|
||||||
|
const buttonVariants = cva(
|
||||||
|
'inline-flex items-center justify-center whitespace-nowrap rounded-md text-sm font-medium ring-offset-background transition-colors focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-offset-2 disabled:pointer-events-none disabled:opacity-50',
|
||||||
|
{
|
||||||
|
variants: {
|
||||||
|
variant: {
|
||||||
|
default: 'bg-primary text-primary-foreground hover:bg-primary/90',
|
||||||
|
destructive:
|
||||||
|
'bg-destructive text-destructive-foreground hover:bg-destructive/90',
|
||||||
|
outline:
|
||||||
|
'border border-input bg-background hover:bg-accent hover:text-accent-foreground',
|
||||||
|
secondary:
|
||||||
|
'bg-secondary text-secondary-foreground hover:bg-secondary/80',
|
||||||
|
ghost: 'hover:bg-accent hover:text-accent-foreground',
|
||||||
|
link: 'text-primary underline-offset-4 hover:underline',
|
||||||
|
},
|
||||||
|
size: {
|
||||||
|
default: 'h-10 px-4 py-2',
|
||||||
|
sm: 'h-9 rounded-md px-3',
|
||||||
|
lg: 'h-11 rounded-md px-8',
|
||||||
|
icon: 'h-10 w-10',
|
||||||
|
},
|
||||||
|
},
|
||||||
|
defaultVariants: {
|
||||||
|
variant: 'default',
|
||||||
|
size: 'default',
|
||||||
|
},
|
||||||
|
}
|
||||||
|
)
|
||||||
|
|
||||||
|
export interface ButtonProps
|
||||||
|
extends React.ButtonHTMLAttributes<HTMLButtonElement>,
|
||||||
|
VariantProps<typeof buttonVariants> {
|
||||||
|
asChild?: boolean
|
||||||
|
}
|
||||||
|
|
||||||
|
const Button = React.forwardRef<HTMLButtonElement, ButtonProps>(
|
||||||
|
({ className, variant, size, asChild = false, ...props }, ref) => {
|
||||||
|
const Comp = asChild ? Slot : 'button'
|
||||||
|
return (
|
||||||
|
<Comp
|
||||||
|
className={cn(buttonVariants({ variant, size, className }))}
|
||||||
|
ref={ref}
|
||||||
|
{...props}
|
||||||
|
/>
|
||||||
|
)
|
||||||
|
}
|
||||||
|
)
|
||||||
|
Button.displayName = 'Button'
|
||||||
|
|
||||||
|
export { Button, buttonVariants }
|
||||||
78
frontend/components/ui/card.tsx
Normal file
@@ -0,0 +1,78 @@
|
|||||||
|
import * as React from 'react'
|
||||||
|
import { cn } from '@/lib/utils'
|
||||||
|
|
||||||
|
const Card = React.forwardRef<
|
||||||
|
HTMLDivElement,
|
||||||
|
React.HTMLAttributes<HTMLDivElement>
|
||||||
|
>(({ className, ...props }, ref) => (
|
||||||
|
<div
|
||||||
|
ref={ref}
|
||||||
|
className={cn(
|
||||||
|
'rounded-lg border bg-card text-card-foreground shadow-sm',
|
||||||
|
className
|
||||||
|
)}
|
||||||
|
{...props}
|
||||||
|
/>
|
||||||
|
))
|
||||||
|
Card.displayName = 'Card'
|
||||||
|
|
||||||
|
const CardHeader = React.forwardRef<
|
||||||
|
HTMLDivElement,
|
||||||
|
React.HTMLAttributes<HTMLDivElement>
|
||||||
|
>(({ className, ...props }, ref) => (
|
||||||
|
<div
|
||||||
|
ref={ref}
|
||||||
|
className={cn('flex flex-col space-y-1.5 p-6', className)}
|
||||||
|
{...props}
|
||||||
|
/>
|
||||||
|
))
|
||||||
|
CardHeader.displayName = 'CardHeader'
|
||||||
|
|
||||||
|
const CardTitle = React.forwardRef<
|
||||||
|
HTMLParagraphElement,
|
||||||
|
React.HTMLAttributes<HTMLHeadingElement>
|
||||||
|
>(({ className, ...props }, ref) => (
|
||||||
|
<h3
|
||||||
|
ref={ref}
|
||||||
|
className={cn(
|
||||||
|
'text-2xl font-semibold leading-none tracking-tight',
|
||||||
|
className
|
||||||
|
)}
|
||||||
|
{...props}
|
||||||
|
/>
|
||||||
|
))
|
||||||
|
CardTitle.displayName = 'CardTitle'
|
||||||
|
|
||||||
|
const CardDescription = React.forwardRef<
|
||||||
|
HTMLParagraphElement,
|
||||||
|
React.HTMLAttributes<HTMLParagraphElement>
|
||||||
|
>(({ className, ...props }, ref) => (
|
||||||
|
<p
|
||||||
|
ref={ref}
|
||||||
|
className={cn('text-sm text-muted-foreground', className)}
|
||||||
|
{...props}
|
||||||
|
/>
|
||||||
|
))
|
||||||
|
CardDescription.displayName = 'CardDescription'
|
||||||
|
|
||||||
|
const CardContent = React.forwardRef<
|
||||||
|
HTMLDivElement,
|
||||||
|
React.HTMLAttributes<HTMLDivElement>
|
||||||
|
>(({ className, ...props }, ref) => (
|
||||||
|
<div ref={ref} className={cn('p-6 pt-0', className)} {...props} />
|
||||||
|
))
|
||||||
|
CardContent.displayName = 'CardContent'
|
||||||
|
|
||||||
|
const CardFooter = React.forwardRef<
|
||||||
|
HTMLDivElement,
|
||||||
|
React.HTMLAttributes<HTMLDivElement>
|
||||||
|
>(({ className, ...props }, ref) => (
|
||||||
|
<div
|
||||||
|
ref={ref}
|
||||||
|
className={cn('flex items-center p-6 pt-0', className)}
|
||||||
|
{...props}
|
||||||
|
/>
|
||||||
|
))
|
||||||
|
CardFooter.displayName = 'CardFooter'
|
||||||
|
|
||||||
|
export { Card, CardHeader, CardFooter, CardTitle, CardDescription, CardContent }
|
||||||
41
frontend/components/ui/collapsible.tsx
Normal file
@@ -0,0 +1,41 @@
|
|||||||
|
'use client'
|
||||||
|
|
||||||
|
import { useState } from 'react'
|
||||||
|
import { ChevronDown } from 'lucide-react'
|
||||||
|
import { cn } from '@/lib/utils'
|
||||||
|
import { Button } from './button'
|
||||||
|
|
||||||
|
interface CollapsibleProps {
|
||||||
|
title: string
|
||||||
|
children: React.ReactNode
|
||||||
|
defaultOpen?: boolean
|
||||||
|
className?: string
|
||||||
|
}
|
||||||
|
|
||||||
|
export function Collapsible({
|
||||||
|
title,
|
||||||
|
children,
|
||||||
|
defaultOpen = false,
|
||||||
|
className,
|
||||||
|
}: CollapsibleProps) {
|
||||||
|
const [isOpen, setIsOpen] = useState(defaultOpen)
|
||||||
|
|
||||||
|
return (
|
||||||
|
<div className={cn('border rounded-lg', className)}>
|
||||||
|
<Button
|
||||||
|
onClick={() => setIsOpen(!isOpen)}
|
||||||
|
variant="ghost"
|
||||||
|
className="w-full justify-between p-4 h-auto font-semibold hover:bg-accent"
|
||||||
|
>
|
||||||
|
<span>{title}</span>
|
||||||
|
<ChevronDown
|
||||||
|
className={cn(
|
||||||
|
'h-5 w-5 transition-transform duration-200',
|
||||||
|
isOpen && 'transform rotate-180'
|
||||||
|
)}
|
||||||
|
/>
|
||||||
|
</Button>
|
||||||
|
{isOpen && <div className="p-4 pt-0 border-t">{children}</div>}
|
||||||
|
</div>
|
||||||
|
)
|
||||||
|
}
|
||||||
221
frontend/components/ui/compass-selector.tsx
Normal file
@@ -0,0 +1,221 @@
|
|||||||
|
'use client'
|
||||||
|
|
||||||
|
import { useEffect, useRef, useState } from 'react'
|
||||||
|
|
||||||
|
interface CompassSelectorProps {
|
||||||
|
value: number // 0-360 degrees
|
||||||
|
onChange: (value: number) => void
|
||||||
|
range?: number // Optional range to display as arc (±degrees)
|
||||||
|
size?: number
|
||||||
|
className?: string
|
||||||
|
}
|
||||||
|
|
||||||
|
export function CompassSelector({ value, onChange, range, size = 200, className = '' }: CompassSelectorProps) {
|
||||||
|
const [isDragging, setIsDragging] = useState(false)
|
||||||
|
const compassRef = useRef<SVGSVGElement>(null)
|
||||||
|
|
||||||
|
const handlePointerMove = (e: PointerEvent) => {
|
||||||
|
if (!isDragging || !compassRef.current) return
|
||||||
|
|
||||||
|
const rect = compassRef.current.getBoundingClientRect()
|
||||||
|
const centerX = rect.left + rect.width / 2
|
||||||
|
const centerY = rect.top + rect.height / 2
|
||||||
|
|
||||||
|
const dx = e.clientX - centerX
|
||||||
|
const dy = e.clientY - centerY
|
||||||
|
|
||||||
|
// Calculate angle in degrees (0° = North/top)
|
||||||
|
let angle = Math.atan2(dx, -dy) * (180 / Math.PI)
|
||||||
|
if (angle < 0) angle += 360
|
||||||
|
|
||||||
|
// Round to nearest 5 degrees for easier selection
|
||||||
|
angle = Math.round(angle / 5) * 5
|
||||||
|
|
||||||
|
onChange(angle % 360)
|
||||||
|
}
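  // Illustrative note (not from the original file): Math.atan2(dx, -dy) measures the
  // pointer angle clockwise from the top of the compass. A drag straight right of
  // center (dx > 0, dy = 0) gives atan2(dx, 0) = π/2 → 90° (East), straight down gives
  // 180° (South), and straight left gives -90°, which the `if (angle < 0) angle += 360`
  // step normalizes to 270° (West).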
|
||||||
|
|
||||||
|
const handlePointerDown = (e: React.PointerEvent) => {
|
||||||
|
setIsDragging(true)
|
||||||
|
// Trigger initial update
|
||||||
|
const rect = compassRef.current!.getBoundingClientRect()
|
||||||
|
const centerX = rect.left + rect.width / 2
|
||||||
|
const centerY = rect.top + rect.height / 2
|
||||||
|
|
||||||
|
const dx = e.clientX - centerX
|
||||||
|
const dy = e.clientY - centerY
|
||||||
|
|
||||||
|
let angle = Math.atan2(dx, -dy) * (180 / Math.PI)
|
||||||
|
if (angle < 0) angle += 360
|
||||||
|
angle = Math.round(angle / 5) * 5
|
||||||
|
|
||||||
|
onChange(angle % 360)
|
||||||
|
}
|
||||||
|
|
||||||
|
const handlePointerUp = () => {
|
||||||
|
setIsDragging(false)
|
||||||
|
}
|
||||||
|
|
||||||
|
useEffect(() => {
|
||||||
|
if (isDragging) {
|
||||||
|
window.addEventListener('pointermove', handlePointerMove)
|
||||||
|
window.addEventListener('pointerup', handlePointerUp)
|
||||||
|
|
||||||
|
return () => {
|
||||||
|
window.removeEventListener('pointermove', handlePointerMove)
|
||||||
|
window.removeEventListener('pointerup', handlePointerUp)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}, [isDragging])
|
||||||
|
|
||||||
|
// Convert degrees to radians for calculations
|
||||||
|
const angleRad = (value - 90) * (Math.PI / 180)
|
||||||
|
const radius = size / 2 - 10
|
||||||
|
const needleLength = radius * 0.7
|
||||||
|
|
||||||
|
// Calculate needle endpoint
|
||||||
|
const needleX = size / 2 + Math.cos(angleRad) * needleLength
|
||||||
|
const needleY = size / 2 + Math.sin(angleRad) * needleLength
|
||||||
|
|
||||||
|
// Helper to convert degrees to compass direction
|
||||||
|
const getCompassDirection = (degrees: number): string => {
|
||||||
|
const directions = ['N', 'NNE', 'NE', 'ENE', 'E', 'ESE', 'SE', 'SSE', 'S', 'SSW', 'SW', 'WSW', 'W', 'WNW', 'NW', 'NNW']
|
||||||
|
const index = Math.round(degrees / 22.5) % 16
|
||||||
|
return directions[index]
|
||||||
|
}
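  // Worked example (illustrative, not in the original source): with 16 sectors of
  // 22.5° each, getCompassDirection(270) → index 12 → 'W', getCompassDirection(292)
  // → round(12.98) = 13 → 'WNW', and getCompassDirection(360) → 16 % 16 = 0 → 'N'.
  // Values coming from the drag handler are always multiples of 5°, so results are stable.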
|
||||||
|
|
||||||
|
return (
|
||||||
|
<div className={`flex flex-col items-center ${className}`}>
|
||||||
|
<svg
|
||||||
|
ref={compassRef}
|
||||||
|
width={size}
|
||||||
|
height={size}
|
||||||
|
className="cursor-pointer select-none"
|
||||||
|
onPointerDown={handlePointerDown}
|
||||||
|
style={{ touchAction: 'none' }}
|
||||||
|
>
|
||||||
|
{/* Outer circle */}
|
||||||
|
<circle
|
||||||
|
cx={size / 2}
|
||||||
|
cy={size / 2}
|
||||||
|
r={radius}
|
||||||
|
fill="hsl(var(--secondary))"
|
||||||
|
stroke="hsl(var(--border))"
|
||||||
|
strokeWidth="2"
|
||||||
|
/>
|
||||||
|
|
||||||
|
{/* Acceptable range arc (green) */}
|
||||||
|
{range !== undefined && (() => {
|
||||||
|
const startAngle = value - range
|
||||||
|
const endAngle = value + range
|
||||||
|
|
||||||
|
// Convert to radians for calculation (adjusting for SVG coordinate system)
|
||||||
|
const startRad = (startAngle - 90) * (Math.PI / 180)
|
||||||
|
const endRad = (endAngle - 90) * (Math.PI / 180)
|
||||||
|
|
||||||
|
// Calculate start and end points on the circle
|
||||||
|
const x1 = size / 2 + Math.cos(startRad) * radius
|
||||||
|
const y1 = size / 2 + Math.sin(startRad) * radius
|
||||||
|
const x2 = size / 2 + Math.cos(endRad) * radius
|
||||||
|
const y2 = size / 2 + Math.sin(endRad) * radius
|
||||||
|
|
||||||
|
// Determine if the arc should be large (>180°) or small
|
||||||
|
const largeArcFlag = range * 2 > 180 ? 1 : 0
|
||||||
|
|
||||||
|
// Create SVG arc path
|
||||||
|
const arcPath = `M ${x1} ${y1} A ${radius} ${radius} 0 ${largeArcFlag} 1 ${x2} ${y2}`
|
||||||
|
|
||||||
|
return (
|
||||||
|
<path
|
||||||
|
d={arcPath}
|
||||||
|
fill="none"
|
||||||
|
stroke="#22c55e"
|
||||||
|
strokeWidth="6"
|
||||||
|
strokeLinecap="round"
|
||||||
|
/>
|
||||||
|
)
|
||||||
|
})()}
|
||||||
|
|
||||||
|
{/* Cardinal direction markers */}
|
||||||
|
{[
|
||||||
|
{ angle: 0, label: 'N' },
|
||||||
|
{ angle: 90, label: 'E' },
|
||||||
|
{ angle: 180, label: 'S' },
|
||||||
|
{ angle: 270, label: 'W' },
|
||||||
|
].map(({ angle, label }) => {
|
||||||
|
const rad = (angle - 90) * (Math.PI / 180)
|
||||||
|
const x = size / 2 + Math.cos(rad) * (radius - 20)
|
||||||
|
const y = size / 2 + Math.sin(rad) * (radius - 20)
|
||||||
|
|
||||||
|
return (
|
||||||
|
<text
|
||||||
|
key={angle}
|
||||||
|
x={x}
|
||||||
|
y={y}
|
||||||
|
textAnchor="middle"
|
||||||
|
dominantBaseline="middle"
|
||||||
|
className="text-sm font-bold fill-foreground"
|
||||||
|
>
|
||||||
|
{label}
|
||||||
|
</text>
|
||||||
|
)
|
||||||
|
})}
|
||||||
|
|
||||||
|
{/* Degree tick marks every 30° */}
|
||||||
|
{Array.from({ length: 12 }, (_, i) => {
|
||||||
|
const angle = i * 30
|
||||||
|
const rad = (angle - 90) * (Math.PI / 180)
|
||||||
|
const x1 = size / 2 + Math.cos(rad) * (radius - 5)
|
||||||
|
const y1 = size / 2 + Math.sin(rad) * (radius - 5)
|
||||||
|
const x2 = size / 2 + Math.cos(rad) * radius
|
||||||
|
const y2 = size / 2 + Math.sin(rad) * radius
|
||||||
|
|
||||||
|
return (
|
||||||
|
<line
|
||||||
|
key={angle}
|
||||||
|
x1={x1}
|
||||||
|
y1={y1}
|
||||||
|
x2={x2}
|
||||||
|
y2={y2}
|
||||||
|
stroke="hsl(var(--border))"
|
||||||
|
strokeWidth="2"
|
||||||
|
/>
|
||||||
|
)
|
||||||
|
})}
|
||||||
|
|
||||||
|
{/* Center dot */}
|
||||||
|
<circle
|
||||||
|
cx={size / 2}
|
||||||
|
cy={size / 2}
|
||||||
|
r="4"
|
||||||
|
fill="hsl(var(--primary))"
|
||||||
|
/>
|
||||||
|
|
||||||
|
{/* Direction needle */}
|
||||||
|
<line
|
||||||
|
x1={size / 2}
|
||||||
|
y1={size / 2}
|
||||||
|
x2={needleX}
|
||||||
|
y2={needleY}
|
||||||
|
stroke="hsl(var(--primary))"
|
||||||
|
strokeWidth="3"
|
||||||
|
strokeLinecap="round"
|
||||||
|
/>
|
||||||
|
|
||||||
|
{/* Needle tip */}
|
||||||
|
<circle
|
||||||
|
cx={needleX}
|
||||||
|
cy={needleY}
|
||||||
|
r="6"
|
||||||
|
fill="hsl(var(--primary))"
|
||||||
|
stroke="hsl(var(--background))"
|
||||||
|
strokeWidth="2"
|
||||||
|
/>
|
||||||
|
</svg>
|
||||||
|
|
||||||
|
{/* Value display */}
|
||||||
|
<div className="mt-2 text-center">
|
||||||
|
<div className="text-lg font-bold">{value}° ({getCompassDirection(value)})</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
)
|
||||||
|
}
|
||||||
36
frontend/components/ui/slider.tsx
Normal file
@@ -0,0 +1,36 @@
|
|||||||
|
'use client'
|
||||||
|
|
||||||
|
import * as React from 'react'
|
||||||
|
import * as SliderPrimitive from '@radix-ui/react-slider'
|
||||||
|
import { cn } from '@/lib/utils'
|
||||||
|
|
||||||
|
const Slider = React.forwardRef<
|
||||||
|
React.ElementRef<typeof SliderPrimitive.Root>,
|
||||||
|
React.ComponentPropsWithoutRef<typeof SliderPrimitive.Root>
|
||||||
|
>(({ className, ...props }, ref) => {
|
||||||
|
const values = props.value || props.defaultValue || [0]
|
||||||
|
|
||||||
|
return (
|
||||||
|
<SliderPrimitive.Root
|
||||||
|
ref={ref}
|
||||||
|
className={cn(
|
||||||
|
'relative flex w-full touch-none select-none items-center',
|
||||||
|
className
|
||||||
|
)}
|
||||||
|
{...props}
|
||||||
|
>
|
||||||
|
<SliderPrimitive.Track className="relative h-2 w-full grow overflow-hidden rounded-full bg-secondary">
|
||||||
|
<SliderPrimitive.Range className="absolute h-full bg-primary" />
|
||||||
|
</SliderPrimitive.Track>
|
||||||
|
{values.map((_, index) => (
|
||||||
|
<SliderPrimitive.Thumb
|
||||||
|
key={index}
|
||||||
|
className="block h-5 w-5 rounded-full border-2 border-primary bg-background ring-offset-background transition-colors focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-offset-2 disabled:pointer-events-none disabled:opacity-50"
|
||||||
|
/>
|
||||||
|
))}
|
||||||
|
</SliderPrimitive.Root>
|
||||||
|
)
|
||||||
|
})
|
||||||
|
Slider.displayName = SliderPrimitive.Root.displayName
|
||||||
|
|
||||||
|
export { Slider }
|
||||||
163
frontend/components/weather/assessment-badge.tsx
Normal file
@@ -0,0 +1,163 @@
|
|||||||
|
'use client'
|
||||||
|
|
||||||
|
import { Card, CardContent } from '@/components/ui/card'
|
||||||
|
import { Check, X } from 'lucide-react'
|
||||||
|
import { cn } from '@/lib/utils'
|
||||||
|
import type { Assessment, FlyableWindow, WeatherPoint } from '@/lib/types'
|
||||||
|
import { format, parseISO } from 'date-fns'
|
||||||
|
import { useThresholdStore } from '@/store/threshold-store'
|
||||||
|
|
||||||
|
interface AssessmentBadgeProps {
|
||||||
|
assessment: Assessment
|
||||||
|
currentWeather: WeatherPoint
|
||||||
|
bestWindow?: FlyableWindow
|
||||||
|
className?: string
|
||||||
|
}
|
||||||
|
|
||||||
|
// Transform direction to offset from West (270°)
|
||||||
|
function calculateOffset(direction: number): number {
|
||||||
|
return ((direction - 270 + 180) % 360) - 180
|
||||||
|
}
|
||||||
|
|
||||||
|
type Rating = 'Great' | 'Good' | 'Okay' | 'Bad'
|
||||||
|
|
||||||
|
interface RatingInfo {
|
||||||
|
text: Rating
|
||||||
|
color: string
|
||||||
|
}
|
||||||
|
|
||||||
|
export function AssessmentBadge({ assessment, currentWeather, bestWindow, className }: AssessmentBadgeProps) {
|
||||||
|
const { speedMin, speedMax, dirCenter, dirRange } = useThresholdStore()
|
||||||
|
|
||||||
|
// Evaluate wind speed
|
||||||
|
const evaluateWindSpeed = (speed: number): RatingInfo => {
|
||||||
|
if (speed < speedMin || speed > speedMax) {
|
||||||
|
return { text: 'Bad', color: 'text-red-600 dark:text-red-400' }
|
||||||
|
}
|
||||||
|
const range = speedMax - speedMin
|
||||||
|
const distanceFromMin = speed - speedMin
|
||||||
|
const distanceFromMax = speedMax - speed
|
||||||
|
const minDistance = Math.min(distanceFromMin, distanceFromMax)
|
||||||
|
|
||||||
|
if (minDistance < range * 0.15) {
|
||||||
|
return { text: 'Okay', color: 'text-yellow-600 dark:text-yellow-400' }
|
||||||
|
} else if (minDistance < range * 0.35) {
|
||||||
|
return { text: 'Good', color: 'text-green-600 dark:text-green-400' }
|
||||||
|
} else {
|
||||||
|
return { text: 'Great', color: 'text-green-700 dark:text-green-300' }
|
||||||
|
}
|
||||||
|
}
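  // Worked example (illustrative threshold values, not from the original source):
  // assuming speedMin = 8 and speedMax = 18 mph, range = 10, so the 'Okay' band is
  // within 1.5 mph of either limit and 'Good' within 3.5 mph. A 9 mph wind is 1 mph
  // from the lower limit → 'Okay'; 11 mph is 3 mph away → 'Good'; 13 mph is 5 mph
  // from both limits → 'Great'; 19 mph is outside the range → 'Bad'.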
|
||||||
|
|
||||||
|
// Evaluate wind direction
|
||||||
|
const evaluateWindDirection = (direction: number): RatingInfo => {
|
||||||
|
const offset = calculateOffset(direction)
|
||||||
|
const centerOffset = calculateOffset(dirCenter)
|
||||||
|
const minOffset = centerOffset - dirRange
|
||||||
|
const maxOffset = centerOffset + dirRange
|
||||||
|
|
||||||
|
if (offset < minOffset || offset > maxOffset) {
|
||||||
|
return { text: 'Bad', color: 'text-red-600 dark:text-red-400' }
|
||||||
|
}
|
||||||
|
|
||||||
|
const distanceFromCenter = Math.abs(offset - centerOffset)
|
||||||
|
|
||||||
|
if (distanceFromCenter > dirRange * 0.7) {
|
||||||
|
return { text: 'Okay', color: 'text-yellow-600 dark:text-yellow-400' }
|
||||||
|
} else if (distanceFromCenter > dirRange * 0.4) {
|
||||||
|
return { text: 'Good', color: 'text-green-600 dark:text-green-400' }
|
||||||
|
} else {
|
||||||
|
return { text: 'Great', color: 'text-green-700 dark:text-green-300' }
|
||||||
|
}
|
||||||
|
}
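  // Worked example (illustrative threshold values, not from the original source):
  // assuming dirCenter = 270° and dirRange = 30°, centerOffset is 0 and acceptable
  // offsets run from -30° to +30°. A 280° wind has offset +10° (inside 40% of the
  // range → 'Great'), 250° has offset -20° (→ 'Good'), 295° has offset +25° (beyond
  // 70% of the range → 'Okay'), and 330° has offset +60°, outside the range → 'Bad'.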
|
||||||
|
|
||||||
|
const speedRating = evaluateWindSpeed(currentWeather.wind_speed)
|
||||||
|
const directionRating = evaluateWindDirection(currentWeather.wind_direction)
|
||||||
|
|
||||||
|
// Overall assessment is based on the worse of the two metrics
|
||||||
|
const getOverallAssessment = (): boolean => {
|
||||||
|
const ratingValues: Record<Rating, number> = {
|
||||||
|
'Great': 4,
|
||||||
|
'Good': 3,
|
||||||
|
'Okay': 2,
|
||||||
|
'Bad': 1
|
||||||
|
}
|
||||||
|
const worstRating = Math.min(
|
||||||
|
ratingValues[speedRating.text],
|
||||||
|
ratingValues[directionRating.text]
|
||||||
|
)
|
||||||
|
// Only GOOD if both metrics are at least "Good"
|
||||||
|
return worstRating >= 3
|
||||||
|
}
|
||||||
|
|
||||||
|
const isGood = getOverallAssessment()
|
||||||
|
|
||||||
|
return (
|
||||||
|
<Card
|
||||||
|
className={cn(
|
||||||
|
'border-2',
|
||||||
|
isGood
|
||||||
|
? 'border-green-500 bg-green-50 dark:bg-green-950'
|
||||||
|
: 'border-red-500 bg-red-50 dark:bg-red-950',
|
||||||
|
className
|
||||||
|
)}
|
||||||
|
>
|
||||||
|
<CardContent className="p-6">
|
||||||
|
<div className="flex items-start justify-between gap-4">
|
||||||
|
<div className="flex items-center gap-3">
|
||||||
|
<div
|
||||||
|
className={cn(
|
||||||
|
'flex h-12 w-12 items-center justify-center rounded-full',
|
||||||
|
isGood ? 'bg-green-500' : 'bg-red-500'
|
||||||
|
)}
|
||||||
|
aria-hidden="true"
|
||||||
|
>
|
||||||
|
{isGood ? (
|
||||||
|
<Check className="h-8 w-8 text-white" strokeWidth={3} />
|
||||||
|
) : (
|
||||||
|
<X className="h-8 w-8 text-white" strokeWidth={3} />
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
<div>
|
||||||
|
<div
|
||||||
|
className={cn(
|
||||||
|
'text-3xl font-bold',
|
||||||
|
isGood ? 'text-green-700 dark:text-green-300' : 'text-red-700 dark:text-red-300'
|
||||||
|
)}
|
||||||
|
>
|
||||||
|
{isGood ? 'GOOD' : 'BAD'}
|
||||||
|
</div>
|
||||||
|
<div className="text-sm text-muted-foreground">
|
||||||
|
Flyability Assessment
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{bestWindow && isGood && (
|
||||||
|
<div className="text-right">
|
||||||
|
<div className="text-sm font-medium text-muted-foreground">
|
||||||
|
Best window
|
||||||
|
</div>
|
||||||
|
<div className="text-lg font-semibold">
|
||||||
|
{format(parseISO(bestWindow.start), 'ha')} -{' '}
|
||||||
|
{format(parseISO(bestWindow.end), 'ha')}
|
||||||
|
</div>
|
||||||
|
<div className="text-xs text-muted-foreground">
|
||||||
|
{bestWindow.duration_hours.toFixed(1)}h duration
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{/* Individual metric ratings */}
|
||||||
|
<div className="mt-4 pt-4 border-t space-y-2">
|
||||||
|
<div className="text-lg font-semibold">
|
||||||
|
Wind direction is <span className={directionRating.color}>{directionRating.text}</span>
|
||||||
|
</div>
|
||||||
|
<div className="text-lg font-semibold">
|
||||||
|
Wind speed is <span className={speedRating.color}>{speedRating.text}</span>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</CardContent>
|
||||||
|
</Card>
|
||||||
|
)
|
||||||
|
}
|
||||||
6
frontend/components/weather/index.ts
Normal file
@@ -0,0 +1,6 @@
export { AssessmentBadge } from './assessment-badge'
export { WindSpeedChart } from './wind-speed-chart'
export { WindDirectionChart } from './wind-direction-chart'
export { ThresholdControls } from './threshold-controls'
export { RefreshCountdown } from './refresh-countdown'
export { StaleDataBanner } from './stale-data-banner'
78
frontend/components/weather/refresh-countdown.tsx
Normal file
@@ -0,0 +1,78 @@
|
|||||||
|
'use client'
|
||||||
|
|
||||||
|
import { useEffect, useState } from 'react'
|
||||||
|
import { Button } from '@/components/ui/button'
|
||||||
|
import { RefreshCw } from 'lucide-react'
|
||||||
|
import { cn } from '@/lib/utils'
|
||||||
|
|
||||||
|
interface RefreshCountdownProps {
|
||||||
|
onRefresh: () => void
|
||||||
|
intervalMs?: number
|
||||||
|
className?: string
|
||||||
|
}
|
||||||
|
|
||||||
|
export function RefreshCountdown({
|
||||||
|
onRefresh,
|
||||||
|
intervalMs = 5 * 60 * 1000, // 5 minutes default
|
||||||
|
className,
|
||||||
|
}: RefreshCountdownProps) {
|
||||||
|
const [secondsRemaining, setSecondsRemaining] = useState(intervalMs / 1000)
|
||||||
|
const [isRefreshing, setIsRefreshing] = useState(false)
|
||||||
|
|
||||||
|
useEffect(() => {
|
||||||
|
const interval = setInterval(() => {
|
||||||
|
setSecondsRemaining((prev) => {
|
||||||
|
if (prev <= 1) {
|
||||||
|
// Auto refresh
|
||||||
|
handleRefresh()
|
||||||
|
return intervalMs / 1000
|
||||||
|
}
|
||||||
|
return prev - 1
|
||||||
|
})
|
||||||
|
}, 1000)
|
||||||
|
|
||||||
|
return () => clearInterval(interval)
|
||||||
|
}, [intervalMs])
|
||||||
|
|
||||||
|
const handleRefresh = async () => {
|
||||||
|
setIsRefreshing(true)
|
||||||
|
await onRefresh()
|
||||||
|
setSecondsRemaining(intervalMs / 1000)
|
||||||
|
setIsRefreshing(false)
|
||||||
|
}
|
||||||
|
|
||||||
|
const formatTime = (seconds: number): string => {
|
||||||
|
const mins = Math.floor(seconds / 60)
|
||||||
|
const secs = Math.floor(seconds % 60)
|
||||||
|
return `${mins}:${secs.toString().padStart(2, '0')}`
|
||||||
|
}
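  // Illustrative examples (not part of the original file): formatTime(300) → "5:00",
  // formatTime(65) → "1:05", formatTime(9) → "0:09". padStart keeps the seconds column
  // two digits wide so the countdown display does not jump between widths.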
|
||||||
|
|
||||||
|
const progressPercentage = ((intervalMs / 1000 - secondsRemaining) / (intervalMs / 1000)) * 100
|
||||||
|
|
||||||
|
return (
|
||||||
|
<div className={cn('flex items-center gap-4', className)}>
|
||||||
|
<div className="flex-1">
|
||||||
|
<div className="flex items-center justify-between mb-2">
|
||||||
|
<span className="text-sm font-medium">Next refresh</span>
|
||||||
|
<span className="text-sm text-muted-foreground">{formatTime(secondsRemaining)}</span>
|
||||||
|
</div>
|
||||||
|
<div className="h-2 w-full bg-secondary rounded-full overflow-hidden">
|
||||||
|
<div
|
||||||
|
className="h-full bg-primary transition-all duration-1000 ease-linear"
|
||||||
|
style={{ width: `${progressPercentage}%` }}
|
||||||
|
/>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
<Button
|
||||||
|
onClick={handleRefresh}
|
||||||
|
disabled={isRefreshing}
|
||||||
|
size="icon"
|
||||||
|
variant="outline"
|
||||||
|
className="min-h-[44px] min-w-[44px]"
|
||||||
|
aria-label="Refresh now"
|
||||||
|
>
|
||||||
|
<RefreshCw className={cn('h-4 w-4', isRefreshing && 'animate-spin')} />
|
||||||
|
</Button>
|
||||||
|
</div>
|
||||||
|
)
|
||||||
|
}
|
||||||
42
frontend/components/weather/stale-data-banner.tsx
Normal file
@@ -0,0 +1,42 @@
|
|||||||
|
'use client'
|
||||||
|
|
||||||
|
import { AlertTriangle } from 'lucide-react'
|
||||||
|
import { cn } from '@/lib/utils'
|
||||||
|
import { differenceInMinutes, parseISO } from 'date-fns'
|
||||||
|
|
||||||
|
interface StaleDataBannerProps {
|
||||||
|
lastUpdated: string
|
||||||
|
thresholdMinutes?: number
|
||||||
|
className?: string
|
||||||
|
}
|
||||||
|
|
||||||
|
export function StaleDataBanner({
|
||||||
|
lastUpdated,
|
||||||
|
thresholdMinutes = 10,
|
||||||
|
className,
|
||||||
|
}: StaleDataBannerProps) {
|
||||||
|
const minutesOld = differenceInMinutes(new Date(), parseISO(lastUpdated))
|
||||||
|
const isStale = minutesOld > thresholdMinutes
|
||||||
|
|
||||||
|
if (!isStale) return null
|
||||||
|
|
||||||
|
return (
|
||||||
|
<div
|
||||||
|
className={cn(
|
||||||
|
'flex items-center gap-3 rounded-lg border border-yellow-500 bg-yellow-50 dark:bg-yellow-950 p-4',
|
||||||
|
className
|
||||||
|
)}
|
||||||
|
role="alert"
|
||||||
|
>
|
||||||
|
<AlertTriangle className="h-5 w-5 text-yellow-600 dark:text-yellow-400 flex-shrink-0" />
|
||||||
|
<div className="flex-1">
|
||||||
|
<p className="text-sm font-medium text-yellow-800 dark:text-yellow-200">
|
||||||
|
Data may be outdated
|
||||||
|
</p>
|
||||||
|
<p className="text-xs text-yellow-700 dark:text-yellow-300">
|
||||||
|
Last updated {minutesOld} minutes ago. Live data may differ.
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
)
|
||||||
|
}
|
||||||
122
frontend/components/weather/threshold-controls.tsx
Normal file
@@ -0,0 +1,122 @@
|
|||||||
|
'use client'
|
||||||
|
|
||||||
|
import { Card, CardContent, CardHeader, CardTitle } from '@/components/ui/card'
|
||||||
|
import { Slider } from '@/components/ui/slider'
|
||||||
|
import { CompassSelector } from '@/components/ui/compass-selector'
|
||||||
|
import { useThresholdStore } from '@/store/threshold-store'
|
||||||
|
import { useEffect } from 'react'
|
||||||
|
|
||||||
|
interface ThresholdControlsProps {
|
||||||
|
className?: string
|
||||||
|
}
|
||||||
|
|
||||||
|
export function ThresholdControls({ className }: ThresholdControlsProps) {
|
||||||
|
const {
|
||||||
|
speedMin,
|
||||||
|
speedMax,
|
||||||
|
dirCenter,
|
||||||
|
dirRange,
|
||||||
|
setSpeedRange,
|
||||||
|
setDirCenter,
|
||||||
|
setDirRange,
|
||||||
|
initFromURL,
|
||||||
|
} = useThresholdStore()
|
||||||
|
|
||||||
|
// Initialize from URL on mount
|
||||||
|
useEffect(() => {
|
||||||
|
initFromURL()
|
||||||
|
// eslint-disable-next-line react-hooks/exhaustive-deps
|
||||||
|
}, [])
|
||||||
|
|
||||||
|
return (
|
||||||
|
<Card className={className}>
|
||||||
|
<CardHeader>
|
||||||
|
<CardTitle>Threshold Controls</CardTitle>
|
||||||
|
</CardHeader>
|
||||||
|
<CardContent className="space-y-8">
|
||||||
|
{/* Wind Speed Range */}
|
||||||
|
<div className="space-y-4">
|
||||||
|
<div>
|
||||||
|
<div className="mb-2">
|
||||||
|
<label className="text-sm font-medium">Wind Speed Range</label>
|
||||||
|
</div>
|
||||||
|
<div className="relative">
|
||||||
|
<Slider
|
||||||
|
value={[speedMin, speedMax]}
|
||||||
|
onValueChange={(values) => setSpeedRange(values[0], values[1])}
|
||||||
|
min={0}
|
||||||
|
max={30}
|
||||||
|
step={0.5}
|
||||||
|
minStepsBetweenThumbs={1}
|
||||||
|
className="min-h-[44px] py-4"
|
||||||
|
aria-label="Wind speed range threshold"
|
||||||
|
/>
|
||||||
|
<div className="flex justify-between text-xs font-medium mt-1">
|
||||||
|
<span style={{ position: 'absolute', left: `${(speedMin / 30) * 100}%`, transform: 'translateX(-50%)' }}>
|
||||||
|
{speedMin} mph
|
||||||
|
</span>
|
||||||
|
<span style={{ position: 'absolute', left: `${(speedMax / 30) * 100}%`, transform: 'translateX(-50%)' }}>
|
||||||
|
{speedMax} mph
|
||||||
|
</span>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{/* Wind Direction Center */}
|
||||||
|
<div className="space-y-4">
|
||||||
|
<div>
|
||||||
|
<div className="mb-4">
|
||||||
|
<label className="text-sm font-medium">Direction Center</label>
|
||||||
|
</div>
|
||||||
|
<div className="flex justify-center">
|
||||||
|
<CompassSelector
|
||||||
|
value={dirCenter}
|
||||||
|
onChange={setDirCenter}
|
||||||
|
range={dirRange}
|
||||||
|
size={220}
|
||||||
|
/>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{/* Wind Direction Range */}
|
||||||
|
<div className="space-y-4">
|
||||||
|
<div>
|
||||||
|
<div className="flex items-center justify-between mb-2">
|
||||||
|
<label className="text-sm font-medium">Direction Range</label>
|
||||||
|
<span className="text-sm text-muted-foreground">
|
||||||
|
±{dirRange}°
|
||||||
|
</span>
|
||||||
|
</div>
|
||||||
|
<Slider
|
||||||
|
value={[dirRange]}
|
||||||
|
onValueChange={(values) => setDirRange(values[0])}
|
||||||
|
min={5}
|
||||||
|
max={90}
|
||||||
|
step={5}
|
||||||
|
className="min-h-[44px] py-4"
|
||||||
|
aria-label="Wind direction range threshold"
|
||||||
|
/>
|
||||||
|
<div className="text-xs text-muted-foreground mt-1">
|
||||||
|
Acceptable range: {(dirCenter - dirRange + 360) % 360}° to {(dirCenter + dirRange) % 360}°
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div className="pt-4 border-t">
|
||||||
|
<p className="text-xs text-muted-foreground">
|
||||||
|
Thresholds are saved to URL and applied to charts automatically.
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
</CardContent>
|
||||||
|
</Card>
|
||||||
|
)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Helper function to convert degrees to compass direction
|
||||||
|
function getCompassDirection(degrees: number): string {
|
||||||
|
const directions = ['N', 'NNE', 'NE', 'ENE', 'E', 'ESE', 'SE', 'SSE', 'S', 'SSW', 'SW', 'WSW', 'W', 'WNW', 'NW', 'NNW']
|
||||||
|
const index = Math.round(degrees / 22.5) % 16
|
||||||
|
return directions[index]
|
||||||
|
}
|
||||||
243
frontend/components/weather/wind-direction-chart.tsx
Normal file
@@ -0,0 +1,243 @@
|
|||||||
|
'use client'
|
||||||
|
|
||||||
|
import { Card, CardContent, CardHeader, CardTitle } from '@/components/ui/card'
|
||||||
|
import {
|
||||||
|
LineChart,
|
||||||
|
Line,
|
||||||
|
XAxis,
|
||||||
|
YAxis,
|
||||||
|
CartesianGrid,
|
||||||
|
Tooltip,
|
||||||
|
ResponsiveContainer,
|
||||||
|
ReferenceLine,
|
||||||
|
Legend,
|
||||||
|
Area,
|
||||||
|
ComposedChart,
|
||||||
|
} from 'recharts'
|
||||||
|
import { format, parseISO } from 'date-fns'
|
||||||
|
import type { WeatherPoint } from '@/lib/types'
|
||||||
|
import { useThresholdStore } from '@/store/threshold-store'
|
||||||
|
|
||||||
|
interface WindDirectionChartProps {
|
||||||
|
data: WeatherPoint[]
|
||||||
|
yesterdayData?: WeatherPoint[]
|
||||||
|
className?: string
|
||||||
|
}
|
||||||
|
|
||||||
|
interface ChartDataPoint {
|
||||||
|
timestamp: string
|
||||||
|
time: string
|
||||||
|
offset?: number
|
||||||
|
yesterdayOffset?: number
|
||||||
|
direction?: number
|
||||||
|
isInRange: boolean
|
||||||
|
}
|
||||||
|
|
||||||
|
// Transform direction to offset from West (270°)
|
||||||
|
// offset = ((direction - 270 + 180) % 360) - 180
|
||||||
|
// This maps: 180° (S) = -90°, 270° (W) = 0°, 360° (N) = 90°
|
||||||
|
function calculateOffset(direction: number): number {
|
||||||
|
return ((direction - 270 + 180) % 360) - 180
|
||||||
|
}
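// Illustrative check of the mapping above (not part of the original file):
// calculateOffset(270) → 0, calculateOffset(180) → -90, calculateOffset(360) → 90,
// calculateOffset(300) → 30. Note that because JavaScript's % keeps the sign of the
// dividend, inputs below 90° (e.g. calculateOffset(45) → -225) fall outside the nominal
// -180..180 band; the chart later clamps values to ±60°, so they still render pinned
// to the edge of the plot.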
|
||||||
|
|
||||||
|
export function WindDirectionChart({ data, yesterdayData, className }: WindDirectionChartProps) {
|
||||||
|
const { dirCenter, dirRange } = useThresholdStore()
|
||||||
|
|
||||||
|
// Calculate acceptable bounds
|
||||||
|
const centerOffset = calculateOffset(dirCenter)
|
||||||
|
const minOffset = centerOffset - dirRange
|
||||||
|
const maxOffset = centerOffset + dirRange
|
||||||
|
|
||||||
|
// Filter data for TODAY's 8am-10pm window only
|
||||||
|
const filterTimeWindow = (points: WeatherPoint[]) => {
|
||||||
|
const now = new Date()
|
||||||
|
const todayStart = new Date(now.getFullYear(), now.getMonth(), now.getDate(), 8, 0, 0)
|
||||||
|
const todayEnd = new Date(now.getFullYear(), now.getMonth(), now.getDate(), 22, 0, 0)
|
||||||
|
|
||||||
|
return points.filter((point) => {
|
||||||
|
const timestamp = parseISO(point.timestamp)
|
||||||
|
return timestamp >= todayStart && timestamp < todayEnd
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
// Generate static time slots for 8am-10pm (14 hours)
|
||||||
|
const generateTimeSlots = (): { hour: number; label: string }[] => {
|
||||||
|
const slots = []
|
||||||
|
for (let hour = 8; hour < 22; hour++) {
|
||||||
|
slots.push({
|
||||||
|
hour,
|
||||||
|
label: format(new Date().setHours(hour, 0, 0, 0), 'ha')
|
||||||
|
})
|
||||||
|
}
|
||||||
|
return slots
|
||||||
|
}
|
||||||
|
|
||||||
|
const filteredData = filterTimeWindow(data)
|
||||||
|
// Don't filter yesterday's data - show all available historical data
|
||||||
|
const filteredYesterday = yesterdayData || []
|
||||||
|
|
||||||
|
// Helper to clamp values to chart display range
|
||||||
|
const clampToChartRange = (value: number | undefined): number | undefined => {
|
||||||
|
if (value === undefined) return undefined
|
||||||
|
return Math.max(-60, Math.min(60, value))
|
||||||
|
}
|
||||||
|
|
||||||
|
// Generate static time slots and map data to them
|
||||||
|
const timeSlots = generateTimeSlots()
|
||||||
|
const chartData: ChartDataPoint[] = timeSlots.map(slot => {
|
||||||
|
// Find forecast data for this hour
|
||||||
|
const forecastPoint = filteredData.find(point =>
|
||||||
|
parseISO(point.timestamp).getHours() === slot.hour
|
||||||
|
)
|
||||||
|
|
||||||
|
// Find yesterday's data for this hour
|
||||||
|
const yesterdayPoint = filteredYesterday.find(yp =>
|
||||||
|
parseISO(yp.timestamp).getHours() === slot.hour
|
||||||
|
)
|
||||||
|
|
||||||
|
const rawOffset = forecastPoint ? calculateOffset(forecastPoint.wind_direction) : undefined
|
||||||
|
const offset = clampToChartRange(rawOffset)
|
||||||
|
const isInRange = rawOffset !== undefined ? (rawOffset >= minOffset && rawOffset <= maxOffset) : false
|
||||||
|
|
||||||
|
return {
|
||||||
|
timestamp: forecastPoint?.timestamp || '',
|
||||||
|
time: slot.label,
|
||||||
|
offset,
|
||||||
|
yesterdayOffset: clampToChartRange(yesterdayPoint ? calculateOffset(yesterdayPoint.wind_direction) : undefined),
|
||||||
|
direction: forecastPoint?.wind_direction,
|
||||||
|
isInRange,
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
// Helper to convert offset back to compass direction for display
|
||||||
|
const offsetToCompass = (offset: number): string => {
|
||||||
|
const directions = ['N', 'NE', 'E', 'SE', 'S', 'SW', 'W', 'NW']
|
||||||
|
const deg = (((offset + 270) % 360) + 360) % 360
|
||||||
|
const index = Math.round(deg / 45) % 8
|
||||||
|
return directions[index]
|
||||||
|
}
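  // Illustrative check (not part of the original file): offsetToCompass(0) → 270° → 'W',
  // offsetToCompass(-45) → 225° → 'SW', offsetToCompass(30) → 300° → round(6.67) = 7 → 'NW'.
  // The double modulo keeps negative offsets inside 0–359 before indexing the array.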
|
||||||
|
|
||||||
|
// Custom tooltip
|
||||||
|
const CustomTooltip = ({ active, payload }: any) => {
|
||||||
|
if (!active || !payload || !payload.length) return null
|
||||||
|
|
||||||
|
const data = payload[0].payload
|
||||||
|
|
||||||
|
// Don't show tooltip if there's no forecast data for this time slot
|
||||||
|
if (data.offset === undefined || data.direction === undefined) return null
|
||||||
|
|
||||||
|
return (
|
||||||
|
<div className="rounded-lg border bg-background p-3 shadow-md">
|
||||||
|
<p className="font-medium">{format(parseISO(data.timestamp), 'EEE ha')}</p>
|
||||||
|
<p className="text-sm">
|
||||||
|
<span className="font-medium">Direction:</span> {data.direction.toFixed(0)}° ({offsetToCompass(data.offset)})
|
||||||
|
</p>
|
||||||
|
<p className="text-sm">
|
||||||
|
<span className="font-medium">Offset from West:</span> {data.offset.toFixed(1)}°
|
||||||
|
</p>
|
||||||
|
{data.yesterdayOffset !== undefined && (
|
||||||
|
<p className="text-sm text-muted-foreground">
|
||||||
|
<span className="font-medium">Yesterday:</span> {data.yesterdayOffset.toFixed(1)}°
|
||||||
|
</p>
|
||||||
|
)}
|
||||||
|
<p className="text-xs mt-1">
|
||||||
|
Range: {minOffset.toFixed(0)}° to {maxOffset.toFixed(0)}°
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
)
|
||||||
|
}
|
||||||
|
|
||||||
|
return (
|
||||||
|
<Card className={className}>
|
||||||
|
<CardHeader>
|
||||||
|
<CardTitle>Wind Direction (Offset from West)</CardTitle>
|
||||||
|
</CardHeader>
|
||||||
|
<CardContent>
|
||||||
|
<ResponsiveContainer width="100%" height={300}>
|
||||||
|
<ComposedChart data={chartData} margin={{ top: 10, right: 10, left: 0, bottom: 0 }}>
|
||||||
|
<defs>
|
||||||
|
<linearGradient id="directionRange" x1="0" y1="0" x2="0" y2="1">
|
||||||
|
<stop offset="5%" stopColor="#22c55e" stopOpacity={0.3} />
|
||||||
|
<stop offset="95%" stopColor="#22c55e" stopOpacity={0.1} />
|
||||||
|
</linearGradient>
|
||||||
|
</defs>
|
||||||
|
<CartesianGrid strokeDasharray="3 3" opacity={0.3} />
|
||||||
|
<XAxis
|
||||||
|
dataKey="time"
|
||||||
|
tick={{ fontSize: 12 }}
|
||||||
|
interval={0}
|
||||||
|
angle={0}
|
||||||
|
height={40}
|
||||||
|
/>
|
||||||
|
<YAxis
|
||||||
|
label={{ value: 'Offset (°)', angle: -90, position: 'insideLeft' }}
|
||||||
|
tick={{ fontSize: 12 }}
|
||||||
|
domain={[() => -60, () => 60]}
|
||||||
|
ticks={[-60, -30, 0, 30, 60]}
|
||||||
|
/>
|
||||||
|
<Tooltip content={<CustomTooltip />} />
|
||||||
|
<Legend />
|
||||||
|
|
||||||
|
{/* Reference area for acceptable range */}
|
||||||
|
<Area
|
||||||
|
type="monotone"
|
||||||
|
dataKey={() => maxOffset}
|
||||||
|
fill="url(#directionRange)"
|
||||||
|
stroke="none"
|
||||||
|
fillOpacity={0.3}
|
||||||
|
/>
|
||||||
|
|
||||||
|
{/* Threshold reference lines */}
|
||||||
|
<ReferenceLine
|
||||||
|
y={minOffset}
|
||||||
|
stroke="#22c55e"
|
||||||
|
strokeDasharray="3 3"
|
||||||
|
label={{ value: `Min: ${minOffset.toFixed(0)}°`, fontSize: 11, fill: '#22c55e' }}
|
||||||
|
/>
|
||||||
|
<ReferenceLine
|
||||||
|
y={maxOffset}
|
||||||
|
stroke="#22c55e"
|
||||||
|
strokeDasharray="3 3"
|
||||||
|
label={{ value: `Max: ${maxOffset.toFixed(0)}°`, fontSize: 11, fill: '#22c55e' }}
|
||||||
|
/>
|
||||||
|
|
||||||
|
{/* Perfect West reference */}
|
||||||
|
<ReferenceLine
|
||||||
|
y={0}
|
||||||
|
stroke="#6b7280"
|
||||||
|
strokeDasharray="1 3"
|
||||||
|
label={{ value: 'W (270°)', fontSize: 10, fill: '#6b7280' }}
|
||||||
|
/>
|
||||||
|
|
||||||
|
{/* Yesterday's data (faded) */}
|
||||||
|
{yesterdayData && (
|
||||||
|
<Line
|
||||||
|
type="monotone"
|
||||||
|
dataKey="yesterdayOffset"
|
||||||
|
stroke="#9ca3af"
|
||||||
|
strokeWidth={1}
|
||||||
|
dot={false}
|
||||||
|
name="Yesterday"
|
||||||
|
opacity={0.5}
|
||||||
|
/>
|
||||||
|
)}
|
||||||
|
|
||||||
|
{/* Today's data */}
|
||||||
|
<Line
|
||||||
|
type="monotone"
|
||||||
|
dataKey="offset"
|
||||||
|
stroke="#3b82f6"
|
||||||
|
strokeWidth={2}
|
||||||
|
dot={false}
|
||||||
|
name="Direction Offset"
|
||||||
|
/>
|
||||||
|
</ComposedChart>
|
||||||
|
</ResponsiveContainer>
|
||||||
|
|
||||||
|
<div className="mt-4 text-xs text-muted-foreground text-center">
|
||||||
|
0° = West (270°) | Negative = South | Positive = North
|
||||||
|
</div>
|
||||||
|
</CardContent>
|
||||||
|
</Card>
|
||||||
|
)
|
||||||
|
}
|
||||||
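The chart above plots each reading as a signed offset from due west (270°). The `calculateOffset` helper it relies on is defined earlier in this commit; a minimal sketch of the idea, assuming a compass bearing in degrees and a signed offset out (positive toward north, negative toward south), could look like:

```typescript
// Sketch only; the committed calculateOffset may differ in detail.
// Maps a compass bearing (0-360°) onto a signed offset from west (270°):
// 270° -> 0, 315° -> +45, 225° -> -45, wrapping cleanly across north.
export function offsetFromWest(direction: number): number {
  return ((direction - 270 + 540) % 360) - 180
}
```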
200
frontend/components/weather/wind-speed-chart.tsx
Normal file
@@ -0,0 +1,200 @@
|
|||||||
|
'use client'
|
||||||
|
|
||||||
|
import { Card, CardContent, CardHeader, CardTitle } from '@/components/ui/card'
|
||||||
|
import {
|
||||||
|
LineChart,
|
||||||
|
Line,
|
||||||
|
XAxis,
|
||||||
|
YAxis,
|
||||||
|
CartesianGrid,
|
||||||
|
ResponsiveContainer,
|
||||||
|
ReferenceLine,
|
||||||
|
Legend,
|
||||||
|
Area,
|
||||||
|
ComposedChart,
|
||||||
|
Tooltip,
|
||||||
|
} from 'recharts'
|
||||||
|
import { format, parseISO } from 'date-fns'
|
||||||
|
import type { WeatherPoint } from '@/lib/types'
|
||||||
|
import { useThresholdStore } from '@/store/threshold-store'
|
||||||
|
|
||||||
|
interface WindSpeedChartProps {
|
||||||
|
data: WeatherPoint[]
|
||||||
|
yesterdayData?: WeatherPoint[]
|
||||||
|
className?: string
|
||||||
|
}
|
||||||
|
|
||||||
|
interface ChartDataPoint {
|
||||||
|
timestamp: string
|
||||||
|
time: string
|
||||||
|
speed?: number
|
||||||
|
yesterdaySpeed?: number
|
||||||
|
isInRange: boolean
|
||||||
|
}
|
||||||
|
|
||||||
|
export function WindSpeedChart({ data, yesterdayData, className }: WindSpeedChartProps) {
|
||||||
|
const { speedMin, speedMax } = useThresholdStore()
|
||||||
|
|
||||||
|
// Filter data for TODAY's 8am-10pm window only
|
||||||
|
const filterTimeWindow = (points: WeatherPoint[]) => {
|
||||||
|
const now = new Date()
|
||||||
|
const todayStart = new Date(now.getFullYear(), now.getMonth(), now.getDate(), 8, 0, 0)
|
||||||
|
const todayEnd = new Date(now.getFullYear(), now.getMonth(), now.getDate(), 22, 0, 0)
|
||||||
|
|
||||||
|
return points.filter((point) => {
|
||||||
|
const timestamp = parseISO(point.timestamp)
|
||||||
|
return timestamp >= todayStart && timestamp < todayEnd
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
// Generate static time slots for 8am-10pm (14 hours)
|
||||||
|
const generateTimeSlots = (): { hour: number; label: string }[] => {
|
||||||
|
const slots = []
|
||||||
|
for (let hour = 8; hour < 22; hour++) {
|
||||||
|
slots.push({
|
||||||
|
hour,
|
||||||
|
label: format(new Date().setHours(hour, 0, 0, 0), 'ha')
|
||||||
|
})
|
||||||
|
}
|
||||||
|
return slots
|
||||||
|
}
|
||||||
|
|
||||||
|
const filteredData = filterTimeWindow(data)
|
||||||
|
// Don't filter yesterday's data - show all available historical data
|
||||||
|
const filteredYesterday = yesterdayData || []
|
||||||
|
|
||||||
|
// Generate static time slots and map data to them
|
||||||
|
const timeSlots = generateTimeSlots()
|
||||||
|
const chartData: ChartDataPoint[] = timeSlots.map(slot => {
|
||||||
|
// Find forecast data for this hour
|
||||||
|
const forecastPoint = filteredData.find(point =>
|
||||||
|
parseISO(point.timestamp).getHours() === slot.hour
|
||||||
|
)
|
||||||
|
|
||||||
|
// Find yesterday's data for this hour
|
||||||
|
const yesterdayPoint = filteredYesterday.find(yp =>
|
||||||
|
parseISO(yp.timestamp).getHours() === slot.hour
|
||||||
|
)
|
||||||
|
|
||||||
|
return {
|
||||||
|
timestamp: forecastPoint?.timestamp || '',
|
||||||
|
time: slot.label,
|
||||||
|
speed: forecastPoint?.wind_speed,
|
||||||
|
yesterdaySpeed: yesterdayPoint?.wind_speed,
|
||||||
|
isInRange: forecastPoint ?
|
||||||
|
(forecastPoint.wind_speed >= speedMin && forecastPoint.wind_speed <= speedMax) :
|
||||||
|
false
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
// Custom tooltip
|
||||||
|
const CustomTooltip = ({ active, payload }: any) => {
|
||||||
|
if (!active || !payload || !payload.length) return null
|
||||||
|
|
||||||
|
const data = payload[0].payload
|
||||||
|
|
||||||
|
// Don't show tooltip if there's no forecast data for this time slot
|
||||||
|
if (data.speed === undefined) return null
|
||||||
|
|
||||||
|
return (
|
||||||
|
<div className="rounded-lg border bg-background p-3 shadow-md">
|
||||||
|
<p className="font-medium">{data.time}</p>
|
||||||
|
<p className="text-sm">
|
||||||
|
<span className="font-medium">Forecast:</span> {data.speed.toFixed(1)} mph
|
||||||
|
</p>
|
||||||
|
{data.yesterdaySpeed !== undefined && (
|
||||||
|
<p className="text-sm text-muted-foreground">
|
||||||
|
<span className="font-medium">Yesterday:</span> {data.yesterdaySpeed.toFixed(1)} mph
|
||||||
|
</p>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
)
|
||||||
|
}
|
||||||
|
|
||||||
|
return (
|
||||||
|
<Card className={className}>
|
||||||
|
<CardHeader>
|
||||||
|
<CardTitle>Wind Speed</CardTitle>
|
||||||
|
</CardHeader>
|
||||||
|
<CardContent>
|
||||||
|
<ResponsiveContainer width="100%" height={300}>
|
||||||
|
<ComposedChart data={chartData} margin={{ top: 10, right: 10, left: 0, bottom: 0 }}>
|
||||||
|
<defs>
|
||||||
|
<linearGradient id="speedRangeGood" x1="0" y1="0" x2="0" y2="1">
|
||||||
|
<stop offset="5%" stopColor="#22c55e" stopOpacity={0.3} />
|
||||||
|
<stop offset="95%" stopColor="#22c55e" stopOpacity={0.1} />
|
||||||
|
</linearGradient>
|
||||||
|
<linearGradient id="speedRangeBad" x1="0" y1="0" x2="0" y2="1">
|
||||||
|
<stop offset="5%" stopColor="#ef4444" stopOpacity={0.3} />
|
||||||
|
<stop offset="95%" stopColor="#ef4444" stopOpacity={0.1} />
|
||||||
|
</linearGradient>
|
||||||
|
</defs>
|
||||||
|
<CartesianGrid strokeDasharray="3 3" opacity={0.3} />
|
||||||
|
<XAxis
|
||||||
|
dataKey="time"
|
||||||
|
tick={{ fontSize: 12 }}
|
||||||
|
interval={0}
|
||||||
|
angle={0}
|
||||||
|
height={40}
|
||||||
|
/>
|
||||||
|
<YAxis
|
||||||
|
label={{ value: 'Speed (mph)', angle: -90, position: 'insideLeft' }}
|
||||||
|
tick={{ fontSize: 12 }}
|
||||||
|
/>
|
||||||
|
<Tooltip content={<CustomTooltip />} />
|
||||||
|
<Legend />
|
||||||
|
|
||||||
|
{/* Shaded area for acceptable speed range (green) */}
|
||||||
|
<Area
|
||||||
|
type="monotone"
|
||||||
|
dataKey={() => speedMax}
|
||||||
|
fill="url(#speedRangeGood)"
|
||||||
|
stroke="none"
|
||||||
|
fillOpacity={0.3}
|
||||||
|
activeDot={false}
|
||||||
|
/>
|
||||||
|
|
||||||
|
{/* Threshold reference lines */}
|
||||||
|
<ReferenceLine
|
||||||
|
y={speedMin}
|
||||||
|
stroke="#22c55e"
|
||||||
|
strokeDasharray="3 3"
|
||||||
|
label={{ value: `Min: ${speedMin}`, fontSize: 11, fill: '#22c55e' }}
|
||||||
|
/>
|
||||||
|
<ReferenceLine
|
||||||
|
y={speedMax}
|
||||||
|
stroke="#22c55e"
|
||||||
|
strokeDasharray="3 3"
|
||||||
|
label={{ value: `Max: ${speedMax}`, fontSize: 11, fill: '#22c55e' }}
|
||||||
|
/>
|
||||||
|
|
||||||
|
{/* Yesterday's data (faded) */}
|
||||||
|
{yesterdayData && (
|
||||||
|
<Line
|
||||||
|
type="monotone"
|
||||||
|
dataKey="yesterdaySpeed"
|
||||||
|
stroke="#9ca3af"
|
||||||
|
strokeWidth={1}
|
||||||
|
name="Yesterday"
|
||||||
|
dot={false}
|
||||||
|
activeDot={true}
|
||||||
|
isAnimationActive={false}
|
||||||
|
/>
|
||||||
|
)}
|
||||||
|
|
||||||
|
{/* Today's forecast - single continuous line */}
|
||||||
|
<Line
|
||||||
|
type="monotone"
|
||||||
|
dataKey="speed"
|
||||||
|
stroke="#3b82f6"
|
||||||
|
strokeWidth={2}
|
||||||
|
name="Forecast"
|
||||||
|
dot={false}
|
||||||
|
activeDot={true}
|
||||||
|
/>
|
||||||
|
</ComposedChart>
|
||||||
|
</ResponsiveContainer>
|
||||||
|
</CardContent>
|
||||||
|
</Card>
|
||||||
|
)
|
||||||
|
}
|
||||||
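Both chart components above carry identical copies of `filterTimeWindow` and `generateTimeSlots`. A shared helper module would remove the duplication; the sketch below assumes a hypothetical `frontend/lib/time-slots.ts` that is not part of this commit.

```typescript
// Hypothetical shared module (not in this commit); both charts could import it.
import { format, parseISO } from 'date-fns'
import type { WeatherPoint } from '@/lib/types'

// Keep only points inside today's 8am-10pm window.
export function filterTimeWindow(points: WeatherPoint[]): WeatherPoint[] {
  const now = new Date()
  const start = new Date(now.getFullYear(), now.getMonth(), now.getDate(), 8, 0, 0)
  const end = new Date(now.getFullYear(), now.getMonth(), now.getDate(), 22, 0, 0)
  return points.filter((point) => {
    const t = parseISO(point.timestamp)
    return t >= start && t < end
  })
}

// Static hourly slots for 8am-10pm, labelled like "8AM", "1PM".
export function generateTimeSlots(): { hour: number; label: string }[] {
  const slots: { hour: number; label: string }[] = []
  for (let hour = 8; hour < 22; hour++) {
    slots.push({ hour, label: format(new Date().setHours(hour, 0, 0, 0), 'ha') })
  }
  return slots
}
```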
37
frontend/hooks/use-weather.ts
Normal file
@@ -0,0 +1,37 @@
|
|||||||
|
'use client'
|
||||||
|
|
||||||
|
import { useQuery } from '@tanstack/react-query'
|
||||||
|
import { getCurrentWeather, getForecast, getHistorical } from '@/lib/api'
|
||||||
|
import type { CurrentWeatherResponse, ForecastResponse, HistoricalResponse } from '@/lib/types'
|
||||||
|
|
||||||
|
const STALE_TIME = 5 * 60 * 1000 // 5 minutes
|
||||||
|
const REFETCH_INTERVAL = 5 * 60 * 1000 // 5 minutes
|
||||||
|
|
||||||
|
export function useCurrentWeather(lat?: number, lon?: number) {
|
||||||
|
return useQuery<CurrentWeatherResponse>({
|
||||||
|
queryKey: ['weather', 'current', lat, lon],
|
||||||
|
queryFn: () => getCurrentWeather(lat, lon),
|
||||||
|
staleTime: STALE_TIME,
|
||||||
|
refetchInterval: REFETCH_INTERVAL,
|
||||||
|
refetchOnWindowFocus: true,
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
export function useForecast(lat?: number, lon?: number) {
|
||||||
|
return useQuery<ForecastResponse>({
|
||||||
|
queryKey: ['weather', 'forecast', lat, lon],
|
||||||
|
queryFn: () => getForecast(lat, lon),
|
||||||
|
staleTime: STALE_TIME,
|
||||||
|
refetchInterval: REFETCH_INTERVAL,
|
||||||
|
refetchOnWindowFocus: true,
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
export function useHistorical(date: string, lat?: number, lon?: number) {
|
||||||
|
return useQuery<HistoricalResponse>({
|
||||||
|
queryKey: ['weather', 'historical', date, lat, lon],
|
||||||
|
queryFn: () => getHistorical(date, lat, lon),
|
||||||
|
staleTime: STALE_TIME,
|
||||||
|
enabled: !!date,
|
||||||
|
})
|
||||||
|
}
|
||||||
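These hooks wrap the API client with React Query caching and 5-minute refetching. A minimal consumption sketch (the component name and markup below are illustrative only, not part of this commit):

```tsx
// Illustrative usage only.
'use client'

import { useForecast } from '@/hooks/use-weather'

export function ForecastSummary() {
  const { data, isLoading, error } = useForecast()

  if (isLoading) return <p>Loading forecast…</p>
  if (error || !data) return <p>Could not load forecast.</p>

  return <p>{data.flyable_windows.length} flyable windows in the next 7 days</p>
}
```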
207
frontend/lib/api.ts
Normal file
@@ -0,0 +1,207 @@
|
|||||||
|
import {
|
||||||
|
CurrentWeatherResponse,
|
||||||
|
ForecastResponse,
|
||||||
|
HistoricalResponse,
|
||||||
|
AssessmentResponse,
|
||||||
|
Thresholds,
|
||||||
|
APIError,
|
||||||
|
} from './types'
|
||||||
|
|
||||||
|
const API_BASE_URL = process.env.NEXT_PUBLIC_API_URL || 'http://localhost:8080/api/v1'
|
||||||
|
|
||||||
|
class APIClient {
|
||||||
|
private baseURL: string
|
||||||
|
|
||||||
|
constructor(baseURL: string) {
|
||||||
|
this.baseURL = baseURL
|
||||||
|
}
|
||||||
|
|
||||||
|
private async request<T>(
|
||||||
|
endpoint: string,
|
||||||
|
options?: RequestInit
|
||||||
|
): Promise<T> {
|
||||||
|
const url = `${this.baseURL}${endpoint}`
|
||||||
|
|
||||||
|
try {
|
||||||
|
const response = await fetch(url, {
|
||||||
|
...options,
|
||||||
|
headers: {
|
||||||
|
'Content-Type': 'application/json',
|
||||||
|
...options?.headers,
|
||||||
|
},
|
||||||
|
})
|
||||||
|
|
||||||
|
if (!response.ok) {
|
||||||
|
const error: APIError = await response.json().catch(() => ({
|
||||||
|
error: 'An error occurred',
|
||||||
|
detail: response.statusText,
|
||||||
|
}))
|
||||||
|
throw new Error(error.error || `HTTP ${response.status}: ${response.statusText}`)
|
||||||
|
}
|
||||||
|
|
||||||
|
const json = await response.json()
|
||||||
|
// Unwrap {success, data} response if present
|
||||||
|
return (json.data || json) as T
|
||||||
|
} catch (error) {
|
||||||
|
if (error instanceof Error) {
|
||||||
|
throw error
|
||||||
|
}
|
||||||
|
throw new Error('An unexpected error occurred')
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Get current weather conditions and flyability assessment
|
||||||
|
*/
|
||||||
|
async getCurrentWeather(
|
||||||
|
lat?: number,
|
||||||
|
lon?: number
|
||||||
|
): Promise<CurrentWeatherResponse> {
|
||||||
|
const params = new URLSearchParams()
|
||||||
|
if (lat !== undefined) params.append('lat', lat.toString())
|
||||||
|
if (lon !== undefined) params.append('lon', lon.toString())
|
||||||
|
|
||||||
|
const query = params.toString() ? `?${params.toString()}` : ''
|
||||||
|
const data = await this.request<any>(`/weather/current${query}`)
|
||||||
|
|
||||||
|
// Transform backend response to frontend format
|
||||||
|
return {
|
||||||
|
location: data.location || { lat: 0, lon: 0 },
|
||||||
|
current: {
|
||||||
|
timestamp: data.current?.time || data.current?.timestamp,
|
||||||
|
wind_speed: data.current?.windSpeed || data.current?.wind_speed || 0,
|
||||||
|
wind_direction: data.current?.windDirection || data.current?.wind_direction || 0,
|
||||||
|
wind_gust: data.current?.windGust || data.current?.wind_gust || 0,
|
||||||
|
temperature: data.current?.temperature || 0,
|
||||||
|
cloud_cover: data.current?.cloudCover || data.current?.cloud_cover || 0,
|
||||||
|
precipitation: data.current?.precipitation || 0,
|
||||||
|
visibility: data.current?.visibility || 0,
|
||||||
|
pressure: data.current?.pressure || 0,
|
||||||
|
humidity: data.current?.humidity || 0,
|
||||||
|
},
|
||||||
|
assessment: {
|
||||||
|
is_flyable: data.assessment?.FlyableNow || data.assessment?.is_flyable || false,
|
||||||
|
reasons: data.assessment?.Reason ? [data.assessment.Reason] : data.assessment?.reasons || [],
|
||||||
|
score: data.assessment?.score || 0,
|
||||||
|
},
|
||||||
|
last_updated: data.timestamp || data.last_updated || new Date().toISOString(),
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Get weather forecast for the next 7 days
|
||||||
|
*/
|
||||||
|
async getForecast(
|
||||||
|
lat?: number,
|
||||||
|
lon?: number
|
||||||
|
): Promise<ForecastResponse> {
|
||||||
|
const params = new URLSearchParams()
|
||||||
|
if (lat !== undefined) params.append('lat', lat.toString())
|
||||||
|
if (lon !== undefined) params.append('lon', lon.toString())
|
||||||
|
|
||||||
|
const query = params.toString() ? `?${params.toString()}` : ''
|
||||||
|
const data = await this.request<any>(`/weather/forecast${query}`)
|
||||||
|
|
||||||
|
// Transform backend response to frontend format
|
||||||
|
return {
|
||||||
|
location: data.location || { lat: 0, lon: 0 },
|
||||||
|
forecast: (data.forecast || []).map((point: any) => ({
|
||||||
|
timestamp: point.Time || point.timestamp,
|
||||||
|
wind_speed: point.WindSpeedMPH || point.wind_speed || 0,
|
||||||
|
wind_direction: point.WindDirection || point.wind_direction || 0,
|
||||||
|
wind_gust: point.WindGustMPH || point.wind_gust || 0,
|
||||||
|
temperature: point.temperature || 0,
|
||||||
|
cloud_cover: point.cloud_cover || 0,
|
||||||
|
precipitation: point.precipitation || 0,
|
||||||
|
visibility: point.visibility || 0,
|
||||||
|
pressure: point.pressure || 0,
|
||||||
|
humidity: point.humidity || 0,
|
||||||
|
})),
|
||||||
|
flyable_windows: (data.flyableWindows || data.flyable_windows || []).map((win: any) => ({
|
||||||
|
start: win.start,
|
||||||
|
end: win.end,
|
||||||
|
duration_hours: win.durationHours || win.duration_hours || 0,
|
||||||
|
avg_conditions: {
|
||||||
|
wind_speed: win.avgConditions?.windSpeed || win.avg_conditions?.wind_speed || 0,
|
||||||
|
wind_gust: win.avgConditions?.windGust || win.avg_conditions?.wind_gust || 0,
|
||||||
|
temperature: win.avgConditions?.temperature || win.avg_conditions?.temperature || 0,
|
||||||
|
cloud_cover: win.avgConditions?.cloudCover || win.avg_conditions?.cloud_cover || 0,
|
||||||
|
},
|
||||||
|
})),
|
||||||
|
generated_at: data.generated || data.generated_at || new Date().toISOString(),
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Get historical weather data for a specific date
|
||||||
|
* @param date - Date in YYYY-MM-DD format
|
||||||
|
*/
|
||||||
|
async getHistorical(
|
||||||
|
date: string,
|
||||||
|
lat?: number,
|
||||||
|
lon?: number
|
||||||
|
): Promise<HistoricalResponse> {
|
||||||
|
const params = new URLSearchParams({ date })
|
||||||
|
if (lat !== undefined) params.append('lat', lat.toString())
|
||||||
|
if (lon !== undefined) params.append('lon', lon.toString())
|
||||||
|
|
||||||
|
const data = await this.request<any>(`/weather/historical?${params.toString()}`)
|
||||||
|
|
||||||
|
// Transform backend response to frontend format
|
||||||
|
return {
|
||||||
|
location: data.location || { lat: 0, lon: 0 },
|
||||||
|
date: data.date || date,
|
||||||
|
data: (data.data || []).map((point: any) => ({
|
||||||
|
timestamp: point.Time || point.timestamp,
|
||||||
|
wind_speed: point.WindSpeedMPH || point.wind_speed || 0,
|
||||||
|
wind_direction: point.WindDirection || point.wind_direction || 0,
|
||||||
|
wind_gust: point.WindGustMPH || point.wind_gust || 0,
|
||||||
|
temperature: point.temperature || 0,
|
||||||
|
cloud_cover: point.cloud_cover || 0,
|
||||||
|
precipitation: point.precipitation || 0,
|
||||||
|
visibility: point.visibility || 0,
|
||||||
|
pressure: point.pressure || 0,
|
||||||
|
humidity: point.humidity || 0,
|
||||||
|
})),
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Assess current conditions with custom thresholds
|
||||||
|
*/
|
||||||
|
async assessWithThresholds(
|
||||||
|
thresholds: Thresholds,
|
||||||
|
lat?: number,
|
||||||
|
lon?: number
|
||||||
|
): Promise<AssessmentResponse> {
|
||||||
|
const params = new URLSearchParams()
|
||||||
|
if (lat !== undefined) params.append('lat', lat.toString())
|
||||||
|
if (lon !== undefined) params.append('lon', lon.toString())
|
||||||
|
|
||||||
|
const query = params.toString() ? `?${params.toString()}` : ''
|
||||||
|
|
||||||
|
return this.request<AssessmentResponse>(`/weather/assess${query}`, {
|
||||||
|
method: 'POST',
|
||||||
|
body: JSON.stringify({ thresholds }),
|
||||||
|
})
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Export singleton instance
|
||||||
|
export const apiClient = new APIClient(API_BASE_URL)
|
||||||
|
|
||||||
|
// Export individual functions for convenience
|
||||||
|
export const getCurrentWeather = (lat?: number, lon?: number) =>
|
||||||
|
apiClient.getCurrentWeather(lat, lon)
|
||||||
|
|
||||||
|
export const getForecast = (lat?: number, lon?: number) =>
|
||||||
|
apiClient.getForecast(lat, lon)
|
||||||
|
|
||||||
|
export const getHistorical = (date: string, lat?: number, lon?: number) =>
|
||||||
|
apiClient.getHistorical(date, lat, lon)
|
||||||
|
|
||||||
|
export const assessWithThresholds = (
|
||||||
|
thresholds: Thresholds,
|
||||||
|
lat?: number,
|
||||||
|
lon?: number
|
||||||
|
) => apiClient.assessWithThresholds(thresholds, lat, lon)
|
||||||
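The client exposes plain async functions, so callers never need the class directly. An illustrative call to the assessment endpoint follows; the numbers are arbitrary examples, and the field names come from `Thresholds` in `frontend/lib/types.ts`:

```typescript
// Illustrative call only; the threshold values are arbitrary examples.
import { assessWithThresholds } from '@/lib/api'

async function checkConditions() {
  const result = await assessWithThresholds({
    max_wind_speed: 14,
    max_wind_gust: 18,
    max_precipitation: 0,
    min_visibility: 5,
    max_cloud_cover: 80,
  })
  console.log(result.assessment.is_flyable, result.assessment.reasons)
}
```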
92
frontend/lib/types.ts
Normal file
@@ -0,0 +1,92 @@
|
|||||||
|
// Core weather data types matching backend models
|
||||||
|
|
||||||
|
export interface WeatherPoint {
|
||||||
|
timestamp: string
|
||||||
|
temperature: number
|
||||||
|
wind_speed: number
|
||||||
|
wind_gust: number
|
||||||
|
wind_direction: number
|
||||||
|
cloud_cover: number
|
||||||
|
precipitation: number
|
||||||
|
visibility: number
|
||||||
|
pressure: number
|
||||||
|
humidity: number
|
||||||
|
}
|
||||||
|
|
||||||
|
export interface Thresholds {
|
||||||
|
max_wind_speed: number
|
||||||
|
max_wind_gust: number
|
||||||
|
max_precipitation: number
|
||||||
|
min_visibility: number
|
||||||
|
max_cloud_cover: number
|
||||||
|
min_temperature?: number
|
||||||
|
max_temperature?: number
|
||||||
|
}
|
||||||
|
|
||||||
|
export interface Assessment {
|
||||||
|
is_flyable: boolean
|
||||||
|
reasons: string[]
|
||||||
|
score: number
|
||||||
|
}
|
||||||
|
|
||||||
|
export interface FlyableWindow {
|
||||||
|
start: string
|
||||||
|
end: string
|
||||||
|
duration_hours: number
|
||||||
|
avg_conditions: {
|
||||||
|
wind_speed: number
|
||||||
|
wind_gust: number
|
||||||
|
temperature: number
|
||||||
|
cloud_cover: number
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// API Response types
|
||||||
|
|
||||||
|
export interface CurrentWeatherResponse {
|
||||||
|
location: {
|
||||||
|
lat: number
|
||||||
|
lon: number
|
||||||
|
name?: string
|
||||||
|
}
|
||||||
|
current: WeatherPoint
|
||||||
|
assessment: Assessment
|
||||||
|
last_updated: string
|
||||||
|
}
|
||||||
|
|
||||||
|
export interface ForecastResponse {
|
||||||
|
location: {
|
||||||
|
lat: number
|
||||||
|
lon: number
|
||||||
|
name?: string
|
||||||
|
}
|
||||||
|
forecast: WeatherPoint[]
|
||||||
|
flyable_windows: FlyableWindow[]
|
||||||
|
generated_at: string
|
||||||
|
}
|
||||||
|
|
||||||
|
export interface HistoricalResponse {
|
||||||
|
location: {
|
||||||
|
lat: number
|
||||||
|
lon: number
|
||||||
|
name?: string
|
||||||
|
}
|
||||||
|
date: string
|
||||||
|
data: WeatherPoint[]
|
||||||
|
}
|
||||||
|
|
||||||
|
export interface AssessmentRequest {
|
||||||
|
thresholds: Thresholds
|
||||||
|
}
|
||||||
|
|
||||||
|
export interface AssessmentResponse {
|
||||||
|
current: WeatherPoint
|
||||||
|
assessment: Assessment
|
||||||
|
thresholds_used: Thresholds
|
||||||
|
}
|
||||||
|
|
||||||
|
// Error response type
|
||||||
|
export interface APIError {
|
||||||
|
error: string
|
||||||
|
detail?: string
|
||||||
|
}
|
||||||
6
frontend/lib/utils.ts
Normal file
@@ -0,0 +1,6 @@
|
|||||||
|
import { type ClassValue, clsx } from 'clsx'
|
||||||
|
import { twMerge } from 'tailwind-merge'
|
||||||
|
|
||||||
|
export function cn(...inputs: ClassValue[]) {
|
||||||
|
return twMerge(clsx(inputs))
|
||||||
|
}
|
||||||
5
frontend/next-env.d.ts
vendored
Normal file
@@ -0,0 +1,5 @@
|
|||||||
|
/// <reference types="next" />
|
||||||
|
/// <reference types="next/image-types/global" />
|
||||||
|
|
||||||
|
// NOTE: This file should not be edited
|
||||||
|
// see https://nextjs.org/docs/basic-features/typescript for more information.
|
||||||
8
frontend/next.config.js
Normal file
@@ -0,0 +1,8 @@
|
|||||||
|
/** @type {import('next').NextConfig} */
|
||||||
|
const nextConfig = {
|
||||||
|
output: 'standalone',
|
||||||
|
reactStrictMode: true,
|
||||||
|
swcMinify: true,
|
||||||
|
}
|
||||||
|
|
||||||
|
module.exports = nextConfig
|
||||||
8520
frontend/package-lock.json
generated
Normal file
File diff suppressed because it is too large
41
frontend/package.json
Normal file
@@ -0,0 +1,41 @@
|
|||||||
|
{
|
||||||
|
"name": "paragliding-frontend",
|
||||||
|
"version": "0.1.0",
|
||||||
|
"private": true,
|
||||||
|
"scripts": {
|
||||||
|
"dev": "next dev",
|
||||||
|
"build": "next build",
|
||||||
|
"start": "next start",
|
||||||
|
"lint": "next lint",
|
||||||
|
"test": "vitest"
|
||||||
|
},
|
||||||
|
"dependencies": {
|
||||||
|
"@radix-ui/react-slider": "^1.1.2",
|
||||||
|
"@radix-ui/react-slot": "^1.0.2",
|
||||||
|
"@tanstack/react-query": "^5.45.0",
|
||||||
|
"class-variance-authority": "^0.7.0",
|
||||||
|
"clsx": "^2.1.1",
|
||||||
|
"date-fns": "^3.6.0",
|
||||||
|
"lucide-react": "^0.395.0",
|
||||||
|
"next": "14.2.5",
|
||||||
|
"react": "^18.3.1",
|
||||||
|
"react-dom": "^18.3.1",
|
||||||
|
"recharts": "^2.12.7",
|
||||||
|
"tailwind-merge": "^2.3.0",
|
||||||
|
"zustand": "^4.5.2"
|
||||||
|
},
|
||||||
|
"devDependencies": {
|
||||||
|
"@testing-library/react": "^16.0.0",
|
||||||
|
"@types/node": "^20.14.2",
|
||||||
|
"@types/react": "^18.3.3",
|
||||||
|
"@types/react-dom": "^18.3.0",
|
||||||
|
"autoprefixer": "^10.4.19",
|
||||||
|
"eslint": "^8.57.0",
|
||||||
|
"eslint-config-next": "14.2.5",
|
||||||
|
"postcss": "^8.4.38",
|
||||||
|
"tailwindcss": "^3.4.4",
|
||||||
|
"tailwindcss-animate": "^1.0.7",
|
||||||
|
"typescript": "^5.4.5",
|
||||||
|
"vitest": "^1.6.0"
|
||||||
|
}
|
||||||
|
}
|
||||||
6
frontend/postcss.config.js
Normal file
@@ -0,0 +1,6 @@
|
|||||||
|
module.exports = {
|
||||||
|
plugins: {
|
||||||
|
tailwindcss: {},
|
||||||
|
autoprefixer: {},
|
||||||
|
},
|
||||||
|
}
|
||||||
81
frontend/store/threshold-store.ts
Normal file
@@ -0,0 +1,81 @@
|
|||||||
|
'use client'
|
||||||
|
|
||||||
|
import { create } from 'zustand'
|
||||||
|
|
||||||
|
export interface ThresholdState {
|
||||||
|
speedMin: number
|
||||||
|
speedMax: number
|
||||||
|
dirCenter: number
|
||||||
|
dirRange: number
|
||||||
|
setSpeedMin: (value: number) => void
|
||||||
|
setSpeedMax: (value: number) => void
|
||||||
|
setDirCenter: (value: number) => void
|
||||||
|
setDirRange: (value: number) => void
|
||||||
|
setSpeedRange: (min: number, max: number) => void
|
||||||
|
initFromURL: () => void
|
||||||
|
}
|
||||||
|
|
||||||
|
// Helper to update URL params
|
||||||
|
const updateURLParams = (params: Record<string, string>) => {
|
||||||
|
if (typeof window === 'undefined') return
|
||||||
|
|
||||||
|
const url = new URL(window.location.href)
|
||||||
|
Object.entries(params).forEach(([key, value]) => {
|
||||||
|
url.searchParams.set(key, value)
|
||||||
|
})
|
||||||
|
window.history.replaceState({}, '', url.toString())
|
||||||
|
}
|
||||||
|
|
||||||
|
// Helper to read from URL params
|
||||||
|
const getURLParam = (key: string, defaultValue: number): number => {
|
||||||
|
if (typeof window === 'undefined') return defaultValue
|
||||||
|
|
||||||
|
const params = new URLSearchParams(window.location.search)
|
||||||
|
const value = params.get(key)
|
||||||
|
return value ? parseFloat(value) : defaultValue
|
||||||
|
}
|
||||||
|
|
||||||
|
export const useThresholdStore = create<ThresholdState>((set) => ({
|
||||||
|
// Default values
|
||||||
|
speedMin: 7,
|
||||||
|
speedMax: 14,
|
||||||
|
dirCenter: 270,
|
||||||
|
dirRange: 15,
|
||||||
|
|
||||||
|
setSpeedMin: (value) => {
|
||||||
|
set({ speedMin: value })
|
||||||
|
updateURLParams({ speedMin: value.toString() })
|
||||||
|
},
|
||||||
|
|
||||||
|
setSpeedMax: (value) => {
|
||||||
|
set({ speedMax: value })
|
||||||
|
updateURLParams({ speedMax: value.toString() })
|
||||||
|
},
|
||||||
|
|
||||||
|
setDirCenter: (value) => {
|
||||||
|
set({ dirCenter: value })
|
||||||
|
updateURLParams({ dirCenter: value.toString() })
|
||||||
|
},
|
||||||
|
|
||||||
|
setDirRange: (value) => {
|
||||||
|
set({ dirRange: value })
|
||||||
|
updateURLParams({ dirRange: value.toString() })
|
||||||
|
},
|
||||||
|
|
||||||
|
setSpeedRange: (min, max) => {
|
||||||
|
set({ speedMin: min, speedMax: max })
|
||||||
|
updateURLParams({
|
||||||
|
speedMin: min.toString(),
|
||||||
|
speedMax: max.toString()
|
||||||
|
})
|
||||||
|
},
|
||||||
|
|
||||||
|
initFromURL: () => {
|
||||||
|
set({
|
||||||
|
speedMin: getURLParam('speedMin', 7),
|
||||||
|
speedMax: getURLParam('speedMax', 14),
|
||||||
|
dirCenter: getURLParam('dirCenter', 270),
|
||||||
|
dirRange: getURLParam('dirRange', 15),
|
||||||
|
})
|
||||||
|
},
|
||||||
|
}))
|
||||||
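The store keeps thresholds in memory and mirrors them into the URL query string, so a shared link reproduces the same settings. A minimal consumer (illustrative only, not part of this commit) would call `initFromURL` once on mount:

```tsx
// Illustrative usage only.
'use client'

import { useEffect } from 'react'
import { useThresholdStore } from '@/store/threshold-store'

export function ThresholdSummary() {
  const { speedMin, speedMax, dirCenter, dirRange, initFromURL } = useThresholdStore()

  // Pick up any ?speedMin=...&dirCenter=... params once on mount.
  useEffect(() => {
    initFromURL()
  }, [initFromURL])

  return (
    <p>
      Flyable between {speedMin} and {speedMax} mph, centered on {dirCenter}° ± {dirRange}°
    </p>
  )
}
```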
75
frontend/tailwind.config.js
Normal file
@@ -0,0 +1,75 @@
|
|||||||
|
/** @type {import('tailwindcss').Config} */
|
||||||
|
module.exports = {
|
||||||
|
darkMode: ['class'],
|
||||||
|
content: [
|
||||||
|
'./pages/**/*.{js,ts,jsx,tsx,mdx}',
|
||||||
|
'./components/**/*.{js,ts,jsx,tsx,mdx}',
|
||||||
|
'./app/**/*.{js,ts,jsx,tsx,mdx}',
|
||||||
|
],
|
||||||
|
theme: {
|
||||||
|
container: {
|
||||||
|
center: true,
|
||||||
|
padding: '2rem',
|
||||||
|
screens: {
|
||||||
|
'2xl': '1400px',
|
||||||
|
},
|
||||||
|
},
|
||||||
|
extend: {
|
||||||
|
colors: {
|
||||||
|
border: 'hsl(var(--border))',
|
||||||
|
input: 'hsl(var(--input))',
|
||||||
|
ring: 'hsl(var(--ring))',
|
||||||
|
background: 'hsl(var(--background))',
|
||||||
|
foreground: 'hsl(var(--foreground))',
|
||||||
|
primary: {
|
||||||
|
DEFAULT: 'hsl(var(--primary))',
|
||||||
|
foreground: 'hsl(var(--primary-foreground))',
|
||||||
|
},
|
||||||
|
secondary: {
|
||||||
|
DEFAULT: 'hsl(var(--secondary))',
|
||||||
|
foreground: 'hsl(var(--secondary-foreground))',
|
||||||
|
},
|
||||||
|
destructive: {
|
||||||
|
DEFAULT: 'hsl(var(--destructive))',
|
||||||
|
foreground: 'hsl(var(--destructive-foreground))',
|
||||||
|
},
|
||||||
|
muted: {
|
||||||
|
DEFAULT: 'hsl(var(--muted))',
|
||||||
|
foreground: 'hsl(var(--muted-foreground))',
|
||||||
|
},
|
||||||
|
accent: {
|
||||||
|
DEFAULT: 'hsl(var(--accent))',
|
||||||
|
foreground: 'hsl(var(--accent-foreground))',
|
||||||
|
},
|
||||||
|
popover: {
|
||||||
|
DEFAULT: 'hsl(var(--popover))',
|
||||||
|
foreground: 'hsl(var(--popover-foreground))',
|
||||||
|
},
|
||||||
|
card: {
|
||||||
|
DEFAULT: 'hsl(var(--card))',
|
||||||
|
foreground: 'hsl(var(--card-foreground))',
|
||||||
|
},
|
||||||
|
},
|
||||||
|
borderRadius: {
|
||||||
|
lg: 'var(--radius)',
|
||||||
|
md: 'calc(var(--radius) - 2px)',
|
||||||
|
sm: 'calc(var(--radius) - 4px)',
|
||||||
|
},
|
||||||
|
keyframes: {
|
||||||
|
'accordion-down': {
|
||||||
|
from: { height: '0' },
|
||||||
|
to: { height: 'var(--radix-accordion-content-height)' },
|
||||||
|
},
|
||||||
|
'accordion-up': {
|
||||||
|
from: { height: 'var(--radix-accordion-content-height)' },
|
||||||
|
to: { height: '0' },
|
||||||
|
},
|
||||||
|
},
|
||||||
|
animation: {
|
||||||
|
'accordion-down': 'accordion-down 0.2s ease-out',
|
||||||
|
'accordion-up': 'accordion-up 0.2s ease-out',
|
||||||
|
},
|
||||||
|
},
|
||||||
|
},
|
||||||
|
plugins: [require('tailwindcss-animate')],
|
||||||
|
}
|
||||||
28
frontend/tsconfig.json
Normal file
@@ -0,0 +1,28 @@
|
|||||||
|
{
|
||||||
|
"compilerOptions": {
|
||||||
|
"target": "ES2017",
|
||||||
|
"lib": ["dom", "dom.iterable", "esnext"],
|
||||||
|
"allowJs": true,
|
||||||
|
"skipLibCheck": true,
|
||||||
|
"strict": true,
|
||||||
|
"forceConsistentCasingInFileNames": true,
|
||||||
|
"noEmit": true,
|
||||||
|
"esModuleInterop": true,
|
||||||
|
"module": "esnext",
|
||||||
|
"moduleResolution": "bundler",
|
||||||
|
"resolveJsonModule": true,
|
||||||
|
"isolatedModules": true,
|
||||||
|
"jsx": "preserve",
|
||||||
|
"incremental": true,
|
||||||
|
"plugins": [
|
||||||
|
{
|
||||||
|
"name": "next"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"paths": {
|
||||||
|
"@/*": ["./*"]
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"include": ["next-env.d.ts", "**/*.ts", "**/*.tsx", ".next/types/**/*.ts"],
|
||||||
|
"exclude": ["node_modules"]
|
||||||
|
}
|
||||||
182
k8s.yaml
Normal file
@@ -0,0 +1,182 @@
|
|||||||
|
---
|
||||||
|
apiVersion: v1
|
||||||
|
kind: Namespace
|
||||||
|
metadata:
|
||||||
|
name: paragliding
|
||||||
|
|
||||||
|
---
|
||||||
|
apiVersion: v1
|
||||||
|
kind: ConfigMap
|
||||||
|
metadata:
|
||||||
|
name: paragliding-config
|
||||||
|
namespace: paragliding
|
||||||
|
data:
|
||||||
|
PORT: "8080"
|
||||||
|
LOCATION_LAT: "32.8893"
|
||||||
|
LOCATION_LON: "-117.2519"
|
||||||
|
LOCATION_NAME: "Torrey Pines Gliderport"
|
||||||
|
TIMEZONE: "America/Los_Angeles"
|
||||||
|
FETCH_INTERVAL: "15m"
|
||||||
|
CACHE_TTL: "10m"
|
||||||
|
NEXT_PUBLIC_API_URL: "https://paragliding.scottyah.com/api/v1"
|
||||||
|
|
||||||
|
---
|
||||||
|
# Backend Deployment
|
||||||
|
apiVersion: apps/v1
|
||||||
|
kind: Deployment
|
||||||
|
metadata:
|
||||||
|
name: paragliding-api
|
||||||
|
namespace: paragliding
|
||||||
|
spec:
|
||||||
|
replicas: 1
|
||||||
|
selector:
|
||||||
|
matchLabels:
|
||||||
|
app: paragliding-api
|
||||||
|
template:
|
||||||
|
metadata:
|
||||||
|
labels:
|
||||||
|
app: paragliding-api
|
||||||
|
spec:
|
||||||
|
containers:
|
||||||
|
- name: api
|
||||||
|
image: harbor.scottyah.com/scottyah/paragliding-api:latest
|
||||||
|
imagePullPolicy: Always
|
||||||
|
ports:
|
||||||
|
- containerPort: 8080
|
||||||
|
envFrom:
|
||||||
|
- configMapRef:
|
||||||
|
name: paragliding-config
|
||||||
|
- secretRef:
|
||||||
|
name: paragliding-secrets
|
||||||
|
resources:
|
||||||
|
requests:
|
||||||
|
memory: "64Mi"
|
||||||
|
cpu: "100m"
|
||||||
|
limits:
|
||||||
|
memory: "256Mi"
|
||||||
|
cpu: "500m"
|
||||||
|
livenessProbe:
|
||||||
|
httpGet:
|
||||||
|
path: /api/health
|
||||||
|
port: 8080
|
||||||
|
initialDelaySeconds: 10
|
||||||
|
periodSeconds: 30
|
||||||
|
readinessProbe:
|
||||||
|
httpGet:
|
||||||
|
path: /api/health
|
||||||
|
port: 8080
|
||||||
|
initialDelaySeconds: 5
|
||||||
|
periodSeconds: 10
|
||||||
|
imagePullSecrets:
|
||||||
|
- name: harborcred
|
||||||
|
|
||||||
|
---
|
||||||
|
apiVersion: v1
|
||||||
|
kind: Service
|
||||||
|
metadata:
|
||||||
|
name: paragliding-api-svc
|
||||||
|
namespace: paragliding
|
||||||
|
spec:
|
||||||
|
ports:
|
||||||
|
- port: 8080
|
||||||
|
targetPort: 8080
|
||||||
|
protocol: TCP
|
||||||
|
selector:
|
||||||
|
app: paragliding-api
|
||||||
|
|
||||||
|
---
|
||||||
|
# Frontend Deployment
|
||||||
|
apiVersion: apps/v1
|
||||||
|
kind: Deployment
|
||||||
|
metadata:
|
||||||
|
name: paragliding-web
|
||||||
|
namespace: paragliding
|
||||||
|
spec:
|
||||||
|
replicas: 1
|
||||||
|
selector:
|
||||||
|
matchLabels:
|
||||||
|
app: paragliding-web
|
||||||
|
template:
|
||||||
|
metadata:
|
||||||
|
labels:
|
||||||
|
app: paragliding-web
|
||||||
|
spec:
|
||||||
|
containers:
|
||||||
|
- name: web
|
||||||
|
image: harbor.scottyah.com/scottyah/paragliding-web:latest
|
||||||
|
imagePullPolicy: Always
|
||||||
|
ports:
|
||||||
|
- containerPort: 3000
|
||||||
|
envFrom:
|
||||||
|
- configMapRef:
|
||||||
|
name: paragliding-config
|
||||||
|
resources:
|
||||||
|
requests:
|
||||||
|
memory: "128Mi"
|
||||||
|
cpu: "100m"
|
||||||
|
limits:
|
||||||
|
memory: "512Mi"
|
||||||
|
cpu: "500m"
|
||||||
|
livenessProbe:
|
||||||
|
httpGet:
|
||||||
|
path: /
|
||||||
|
port: 3000
|
||||||
|
initialDelaySeconds: 10
|
||||||
|
periodSeconds: 30
|
||||||
|
readinessProbe:
|
||||||
|
httpGet:
|
||||||
|
path: /
|
||||||
|
port: 3000
|
||||||
|
initialDelaySeconds: 5
|
||||||
|
periodSeconds: 10
|
||||||
|
imagePullSecrets:
|
||||||
|
- name: harborcred
|
||||||
|
|
||||||
|
---
|
||||||
|
apiVersion: v1
|
||||||
|
kind: Service
|
||||||
|
metadata:
|
||||||
|
name: paragliding-web-svc
|
||||||
|
namespace: paragliding
|
||||||
|
spec:
|
||||||
|
ports:
|
||||||
|
- port: 3000
|
||||||
|
targetPort: 3000
|
||||||
|
protocol: TCP
|
||||||
|
selector:
|
||||||
|
app: paragliding-web
|
||||||
|
|
||||||
|
---
|
||||||
|
apiVersion: networking.k8s.io/v1
|
||||||
|
kind: Ingress
|
||||||
|
metadata:
|
||||||
|
name: paragliding-ingress
|
||||||
|
namespace: paragliding
|
||||||
|
annotations:
|
||||||
|
cert-manager.io/cluster-issuer: letsencrypt-prod
|
||||||
|
traefik.ingress.kubernetes.io/router.entrypoints: websecure
|
||||||
|
traefik.ingress.kubernetes.io/router.tls: "true"
|
||||||
|
spec:
|
||||||
|
ingressClassName: traefik
|
||||||
|
tls:
|
||||||
|
- hosts:
|
||||||
|
- paragliding.scottyah.com
|
||||||
|
secretName: paragliding-tls
|
||||||
|
rules:
|
||||||
|
- host: paragliding.scottyah.com
|
||||||
|
http:
|
||||||
|
paths:
|
||||||
|
- path: /api
|
||||||
|
pathType: Prefix
|
||||||
|
backend:
|
||||||
|
service:
|
||||||
|
name: paragliding-api-svc
|
||||||
|
port:
|
||||||
|
number: 8080
|
||||||
|
- path: /
|
||||||
|
pathType: Prefix
|
||||||
|
backend:
|
||||||
|
service:
|
||||||
|
name: paragliding-web-svc
|
||||||
|
port:
|
||||||
|
number: 3000
|
||||||
13
package-lock.json
generated
Normal file
@@ -0,0 +1,13 @@
|
|||||||
|
{
|
||||||
|
"name": "paragliding",
|
||||||
|
"version": "1.0.0",
|
||||||
|
"lockfileVersion": 3,
|
||||||
|
"requires": true,
|
||||||
|
"packages": {
|
||||||
|
"": {
|
||||||
|
"name": "paragliding",
|
||||||
|
"version": "1.0.0",
|
||||||
|
"license": "ISC"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
13
package.json
Normal file
@@ -0,0 +1,13 @@
|
|||||||
|
{
|
||||||
|
"name": "paragliding",
|
||||||
|
"version": "1.0.0",
|
||||||
|
"description": "",
|
||||||
|
"main": "test-app.js",
|
||||||
|
"scripts": {
|
||||||
|
"test": "echo \"Error: no test specified\" && exit 1"
|
||||||
|
},
|
||||||
|
"keywords": [],
|
||||||
|
"author": "",
|
||||||
|
"license": "ISC",
|
||||||
|
"type": "commonjs"
|
||||||
|
}
|
||||||
173
ship.sh
Executable file
@@ -0,0 +1,173 @@
|
|||||||
|
#!/bin/bash
|
||||||
|
set -e
|
||||||
|
|
||||||
|
# Paragliding Build & Deploy Script
|
||||||
|
REGISTRY="harbor.scottyah.com"
|
||||||
|
NAMESPACE="scottyah"
|
||||||
|
K8S_NAMESPACE="paragliding"
|
||||||
|
|
||||||
|
API_IMAGE="${REGISTRY}/${NAMESPACE}/paragliding-api"
|
||||||
|
WEB_IMAGE="${REGISTRY}/${NAMESPACE}/paragliding-web"
|
||||||
|
|
||||||
|
# Parse flags
|
||||||
|
BUILD_ONLY=false
|
||||||
|
DEPLOY_ONLY=false
|
||||||
|
API_ONLY=false
|
||||||
|
WEB_ONLY=false
|
||||||
|
|
||||||
|
while [[ $# -gt 0 ]]; do
|
||||||
|
case $1 in
|
||||||
|
--build-only)
|
||||||
|
BUILD_ONLY=true
|
||||||
|
shift
|
||||||
|
;;
|
||||||
|
--deploy-only)
|
||||||
|
DEPLOY_ONLY=true
|
||||||
|
shift
|
||||||
|
;;
|
||||||
|
--api-only)
|
||||||
|
API_ONLY=true
|
||||||
|
shift
|
||||||
|
;;
|
||||||
|
--web-only)
|
||||||
|
WEB_ONLY=true
|
||||||
|
shift
|
||||||
|
;;
|
||||||
|
-h|--help)
|
||||||
|
echo "Usage: ./ship.sh [OPTIONS]"
|
||||||
|
echo ""
|
||||||
|
echo "Build and deploy Paragliding to Kubernetes"
|
||||||
|
echo ""
|
||||||
|
echo "Options:"
|
||||||
|
echo " --build-only Only build and push Docker images"
|
||||||
|
echo " --deploy-only Only deploy to Kubernetes (skip build)"
|
||||||
|
echo " --api-only Only build/deploy the API"
|
||||||
|
echo " --web-only Only build/deploy the web frontend"
|
||||||
|
echo " -h, --help Show this help message"
|
||||||
|
echo ""
|
||||||
|
echo "Examples:"
|
||||||
|
echo " ./ship.sh # Build and deploy everything"
|
||||||
|
echo " ./ship.sh --build-only # Only build images"
|
||||||
|
echo " ./ship.sh --api-only # Only build and deploy API"
|
||||||
|
exit 0
|
||||||
|
;;
|
||||||
|
*)
|
||||||
|
echo "Unknown option: $1"
|
||||||
|
echo "Use --help for usage information"
|
||||||
|
exit 1
|
||||||
|
;;
|
||||||
|
esac
|
||||||
|
done
|
||||||
|
|
||||||
|
TIMESTAMP=$(date +%Y%m%d-%H%M%S)
|
||||||
|
|
||||||
|
# Detect container runtime
|
||||||
|
if command -v docker &> /dev/null; then
|
||||||
|
CONTAINER_CMD="docker"
|
||||||
|
elif command -v podman &> /dev/null; then
|
||||||
|
CONTAINER_CMD="podman"
|
||||||
|
else
|
||||||
|
echo "❌ Error: Neither docker nor podman found"
|
||||||
|
exit 1
|
||||||
|
fi
|
||||||
|
|
||||||
|
# BUILD PHASE
|
||||||
|
if [ "$DEPLOY_ONLY" = false ]; then
|
||||||
|
echo "🪂 Building Paragliding Docker images..."
|
||||||
|
echo "Registry: ${REGISTRY}"
|
||||||
|
echo "Timestamp: ${TIMESTAMP}"
|
||||||
|
echo "Using: ${CONTAINER_CMD}"
|
||||||
|
echo ""
|
||||||
|
|
||||||
|
# Build API
|
||||||
|
if [ "$WEB_ONLY" = false ]; then
|
||||||
|
echo "📦 Building API image..."
|
||||||
|
${CONTAINER_CMD} build \
|
||||||
|
--network=host \
|
||||||
|
-t "${API_IMAGE}:${TIMESTAMP}" \
|
||||||
|
-t "${API_IMAGE}:latest" \
|
||||||
|
./backend
|
||||||
|
|
||||||
|
echo "🚀 Pushing API images..."
|
||||||
|
${CONTAINER_CMD} push "${API_IMAGE}:${TIMESTAMP}"
|
||||||
|
${CONTAINER_CMD} push "${API_IMAGE}:latest"
|
||||||
|
echo "✅ API image pushed!"
|
||||||
|
echo ""
|
||||||
|
fi
|
||||||
|
|
||||||
|
# Build Web
|
||||||
|
if [ "$API_ONLY" = false ]; then
|
||||||
|
echo "📦 Building Web image..."
|
||||||
|
${CONTAINER_CMD} build \
|
||||||
|
--network=host \
|
||||||
|
-t "${WEB_IMAGE}:${TIMESTAMP}" \
|
||||||
|
-t "${WEB_IMAGE}:latest" \
|
||||||
|
--target runner \
|
||||||
|
./frontend
|
||||||
|
|
||||||
|
echo "🚀 Pushing Web images..."
|
||||||
|
${CONTAINER_CMD} push "${WEB_IMAGE}:${TIMESTAMP}"
|
||||||
|
${CONTAINER_CMD} push "${WEB_IMAGE}:latest"
|
||||||
|
echo "✅ Web image pushed!"
|
||||||
|
echo ""
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
# DEPLOY PHASE
|
||||||
|
if [ "$BUILD_ONLY" = false ]; then
|
||||||
|
echo "🪂 Deploying Paragliding to Kubernetes..."
|
||||||
|
echo ""
|
||||||
|
|
||||||
|
# Create namespace if it doesn't exist
|
||||||
|
echo "Ensuring namespace exists..."
|
||||||
|
kubectl create namespace ${K8S_NAMESPACE} --dry-run=client -o yaml | kubectl apply -f -
|
||||||
|
|
||||||
|
# Apply Kubernetes configuration
|
||||||
|
echo "Applying Kubernetes configuration..."
|
||||||
|
kubectl apply -f k8s.yaml
|
||||||
|
|
||||||
|
echo ""
|
||||||
|
echo "Waiting for namespace to be ready..."
|
||||||
|
kubectl wait --for=condition=Ready --timeout=10s namespace/${K8S_NAMESPACE} 2>/dev/null || true
|
||||||
|
|
||||||
|
# Restart deployments
|
||||||
|
if [ "$WEB_ONLY" = false ]; then
|
||||||
|
echo "Restarting API deployment..."
|
||||||
|
kubectl rollout restart deployment/paragliding-api -n ${K8S_NAMESPACE}
|
||||||
|
fi
|
||||||
|
|
||||||
|
if [ "$API_ONLY" = false ]; then
|
||||||
|
echo "Restarting Web deployment..."
|
||||||
|
kubectl rollout restart deployment/paragliding-web -n ${K8S_NAMESPACE}
|
||||||
|
fi
|
||||||
|
|
||||||
|
echo ""
|
||||||
|
echo "Waiting for rollouts to complete..."
|
||||||
|
if [ "$WEB_ONLY" = false ]; then
|
||||||
|
kubectl rollout status deployment/paragliding-api -n ${K8S_NAMESPACE} --timeout=300s
|
||||||
|
fi
|
||||||
|
if [ "$API_ONLY" = false ]; then
|
||||||
|
kubectl rollout status deployment/paragliding-web -n ${K8S_NAMESPACE} --timeout=300s
|
||||||
|
fi
|
||||||
|
|
||||||
|
echo ""
|
||||||
|
echo "✅ Deployment complete!"
|
||||||
|
echo ""
|
||||||
|
echo "📊 Deployment status:"
|
||||||
|
kubectl get pods -n ${K8S_NAMESPACE}
|
||||||
|
echo ""
|
||||||
|
kubectl get svc -n ${K8S_NAMESPACE}
|
||||||
|
echo ""
|
||||||
|
kubectl get ingress -n ${K8S_NAMESPACE}
|
||||||
|
echo ""
|
||||||
|
echo "🌍 Your site should be available at: https://paragliding.scottyah.com"
|
||||||
|
echo ""
|
||||||
|
echo "To view logs:"
|
||||||
|
echo " kubectl logs -f deployment/paragliding-api -n ${K8S_NAMESPACE}"
|
||||||
|
echo " kubectl logs -f deployment/paragliding-web -n ${K8S_NAMESPACE}"
|
||||||
|
echo ""
|
||||||
|
fi
|
||||||
|
|
||||||
|
if [ "$BUILD_ONLY" = false ] && [ "$DEPLOY_ONLY" = false ]; then
|
||||||
|
echo "✨ Build and deployment complete!"
|
||||||
|
fi
|
||||||
137
test-app.js
Normal file
@@ -0,0 +1,137 @@
|
|||||||
|
const { chromium } = require('playwright');
|
||||||
|
|
||||||
|
(async () => {
|
||||||
|
const browser = await chromium.launch({ headless: false });
|
||||||
|
const page = await browser.newPage();
|
||||||
|
|
||||||
|
// Enable console logging
|
||||||
|
page.on('console', msg => console.log(`[CONSOLE] ${msg.type()}: ${msg.text()}`));
|
||||||
|
page.on('pageerror', error => console.error(`[PAGE ERROR] ${error.message}`));
|
||||||
|
page.on('requestfailed', request => console.error(`[REQUEST FAILED] ${request.url()} - ${request.failure().errorText}`));
|
||||||
|
|
||||||
|
try {
|
||||||
|
console.log('Navigating to http://localhost:3000...');
|
||||||
|
await page.goto('http://localhost:3000', { waitUntil: 'networkidle' });
|
||||||
|
|
||||||
|
console.log('Waiting for page to load...');
|
||||||
|
// Wait for the main content to load
|
||||||
|
await page.waitForSelector('h1:has-text("Paragliding Weather Dashboard")', { timeout: 10000 });
|
||||||
|
console.log('✓ Page loaded successfully');
|
||||||
|
|
||||||
|
// Wait a bit for data to load
|
||||||
|
await page.waitForTimeout(2000);
|
||||||
|
|
||||||
|
// Check if there are any error messages
|
||||||
|
const errorElements = await page.locator('text=/error|Error|failed|Failed/i').all();
|
||||||
|
if (errorElements.length > 0) {
|
||||||
|
console.log(`⚠ Found ${errorElements.length} potential error elements`);
|
||||||
|
for (const elem of errorElements) {
|
||||||
|
const text = await elem.textContent();
|
||||||
|
console.log(` - ${text}`);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Try to expand the threshold controls
|
||||||
|
console.log('\nLooking for Threshold Controls...');
|
||||||
|
const thresholdToggle = page.locator('text=Threshold Controls').first();
|
||||||
|
if (await thresholdToggle.isVisible()) {
|
||||||
|
console.log('✓ Found Threshold Controls, clicking to expand...');
|
||||||
|
await thresholdToggle.click();
|
||||||
|
await page.waitForTimeout(500);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Try to interact with sliders
|
||||||
|
console.log('\nLooking for sliders...');
|
||||||
|
const sliders = await page.locator('input[type="range"], [role="slider"]').all();
|
||||||
|
console.log(`Found ${sliders.length} sliders`);
|
||||||
|
|
||||||
|
if (sliders.length > 0) {
|
||||||
|
// Try to interact with the first slider (min speed)
|
||||||
|
console.log('Interacting with first slider (min speed)...');
|
||||||
|
const firstSlider = sliders[0];
|
||||||
|
|
||||||
|
// Get current value
|
||||||
|
const currentValue = await firstSlider.inputValue();
|
||||||
|
console.log(` Current value: ${currentValue}`);
|
||||||
|
|
||||||
|
// Try to set a new value
|
||||||
|
const newValue = parseFloat(currentValue) + 1;
|
||||||
|
console.log(` Setting to: ${newValue}`);
|
||||||
|
await firstSlider.fill(newValue.toString());
|
||||||
|
await page.waitForTimeout(1000);
|
||||||
|
|
||||||
|
// Check if value updated
|
||||||
|
const updatedValue = await firstSlider.inputValue();
|
||||||
|
console.log(` Updated value: ${updatedValue}`);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Try to interact with direction center slider
|
||||||
|
console.log('\nLooking for direction center slider...');
|
||||||
|
const dirSliders = await page.locator('[aria-label*="direction"], [aria-label*="Direction"]').all();
|
||||||
|
if (dirSliders.length > 0) {
|
||||||
|
console.log(`Found ${dirSliders.length} direction-related sliders`);
|
||||||
|
    // Array.prototype.find does not await an async callback (the returned
    // promise is always truthy), so resolve the labels with a loop instead.
    let dirSlider = null;
    for (const s of dirSliders) {
      const label = await s.getAttribute('aria-label');
      if (label && label.includes('center')) {
        dirSlider = s;
        break;
      }
    }
|
||||||
|
|
||||||
|
if (dirSlider) {
|
||||||
|
console.log('Interacting with direction center slider...');
|
||||||
|
const currentDir = await dirSlider.inputValue();
|
||||||
|
console.log(` Current direction: ${currentDir}°`);
|
||||||
|
|
||||||
|
// Try to change direction
|
||||||
|
const newDir = (parseInt(currentDir) + 45) % 360;
|
||||||
|
console.log(` Setting to: ${newDir}°`);
|
||||||
|
await dirSlider.fill(newDir.toString());
|
||||||
|
await page.waitForTimeout(1000);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Check for any API errors in network tab
|
||||||
|
console.log('\nChecking network requests...');
|
||||||
|
const responses = [];
|
||||||
|
page.on('response', response => {
|
||||||
|
if (response.url().includes('/api/')) {
|
||||||
|
responses.push({
|
||||||
|
url: response.url(),
|
||||||
|
status: response.status(),
|
||||||
|
statusText: response.statusText()
|
||||||
|
});
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
// Trigger a refresh by clicking refresh if available
|
||||||
|
const refreshButton = page.locator('button:has-text("Refresh"), button[aria-label*="refresh" i]').first();
|
||||||
|
if (await refreshButton.isVisible({ timeout: 2000 })) {
|
||||||
|
console.log('Clicking refresh button...');
|
||||||
|
await refreshButton.click();
|
||||||
|
await page.waitForTimeout(2000);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Wait a bit more to capture responses
|
||||||
|
await page.waitForTimeout(2000);
|
||||||
|
|
||||||
|
console.log('\nAPI Response Summary:');
|
||||||
|
responses.forEach(r => {
|
||||||
|
const status = r.status >= 200 && r.status < 300 ? '✓' : '✗';
|
||||||
|
console.log(` ${status} ${r.url} - ${r.status} ${r.statusText}`);
|
||||||
|
});
|
||||||
|
|
||||||
|
// Take a screenshot for debugging
|
||||||
|
await page.screenshot({ path: 'test-screenshot.png', fullPage: true });
|
||||||
|
console.log('\n✓ Screenshot saved to test-screenshot.png');
|
||||||
|
|
||||||
|
console.log('\n✓ Test completed successfully');
|
||||||
|
|
||||||
|
} catch (error) {
|
||||||
|
console.error('\n✗ Test failed:', error.message);
|
||||||
|
await page.screenshot({ path: 'test-error.png', fullPage: true });
|
||||||
|
console.log('Error screenshot saved to test-error.png');
|
||||||
|
} finally {
|
||||||
|
// Keep browser open for a bit to see results
|
||||||
|
await page.waitForTimeout(3000);
|
||||||
|
await browser.close();
|
||||||
|
}
|
||||||
|
})();
|
||||||
|
|
||||||
BIN
test-error.png
Normal file
Binary file not shown.
|
42
test-with-logs.sh
Executable file
@@ -0,0 +1,42 @@
|
|||||||
|
#!/bin/bash
|
||||||
|
|
||||||
|
# Script to watch logs and run Playwright tests
|
||||||
|
|
||||||
|
echo "=== Starting log watchers and Playwright tests ==="
|
||||||
|
echo ""
|
||||||
|
|
||||||
|
# Function to cleanup background processes
|
||||||
|
cleanup() {
|
||||||
|
echo ""
|
||||||
|
echo "=== Cleaning up ==="
|
||||||
|
kill $BACKEND_LOG_PID $FRONTEND_LOG_PID 2>/dev/null
|
||||||
|
exit
|
||||||
|
}
|
||||||
|
|
||||||
|
trap cleanup EXIT INT TERM
|
||||||
|
|
||||||
|
# Start watching backend logs in background
|
||||||
|
echo "Watching backend logs..."
|
||||||
|
podman logs -f paragliding-backend 2>&1 | while IFS= read -r line; do
|
||||||
|
echo "[BACKEND] $line"
|
||||||
|
done &
|
||||||
|
BACKEND_LOG_PID=$!
|
||||||
|
|
||||||
|
# Start watching frontend logs in background
|
||||||
|
echo "Watching frontend logs..."
|
||||||
|
podman logs -f paragliding-frontend 2>&1 | while IFS= read -r line; do
|
||||||
|
echo "[FRONTEND] $line"
|
||||||
|
done &
|
||||||
|
FRONTEND_LOG_PID=$!
|
||||||
|
|
||||||
|
# Wait a moment for log watchers to start
|
||||||
|
sleep 2
|
||||||
|
|
||||||
|
# Run Playwright tests
|
||||||
|
echo ""
|
||||||
|
echo "=== Running Playwright tests ==="
|
||||||
|
echo ""
|
||||||
|
cd frontend && npx playwright test tests/interactive.spec.ts --headed
|
||||||
|
|
||||||
|
# Tests will complete and cleanup will run
|
||||||
|
|
||||||