This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
This project uses uv for dependency management.
```bash
# Install all dependencies (core + dev + UI)
uv sync --extra ui
```

All commands use `uv run` to execute within the managed environment.
```bash
# Run all tests
uv run pytest tests/ -v

# Run a specific test file
uv run pytest tests/test_strategies.py -v

# Run tests matching a pattern
uv run pytest tests/ -k "butterfly"

# Check code formatting
uv run ruff format --check optopsy/ tests/

# Auto-format code
uv run ruff format optopsy/ tests/

# Lint code
uv run ruff check optopsy/ tests/

# Lint and auto-fix
uv run ruff check --fix optopsy/ tests/

# Type check
uv run ty check optopsy/

# Run tests with coverage
uv run pytest tests/ -v --cov=optopsy --cov-report=term-missing
```

Optopsy is a backtesting library for options strategies. It processes historical option chain data and generates performance statistics.
- Input: CSV with option chain data (`underlying_symbol`, `option_type`, `expiration`, `quote_date`, `strike`, `bid`, `ask`, `delta`; optional: `underlying_price`, `close`)
- Load: `datafeeds.csv_data()` normalizes and imports the data
- Process: Strategy functions in `strategies.py` call `core._process_strategy()`, which:
  - Filters options by DTE, OTM %, and bid-ask spread
  - Matches entry/exit prices across dates
  - Builds multi-leg positions via pandas merges
  - Applies strategy-specific rules (strike ordering, butterfly constraints)
  - Calculates P&L and percentage change
- Output: DataFrame with either raw combinations or aggregated statistics (grouped by DTE intervals and OTM ranges)
- `strategies.py` — Public API. Each strategy function (e.g., `long_calls`, `iron_condor`) wraps a helper that calls `_process_strategy()`
- `core.py` — Strategy execution engine. `_process_strategy()` orchestrates the pipeline; `_strategy_engine()` handles multi-leg joins
- `rules.py` — Strike validation rules (ascending order, butterfly equal-width wings, iron condor/butterfly constraints)
- `definitions.py` — Column definitions for 1/2/3/4-leg strategy outputs
- `checks.py` — Input validation for parameters and DataFrame dtypes
- `datafeeds.py` — CSV import with flexible column mapping
- Add a public function in `strategies.py` that calls a helper (or create a new helper)
- The helper should call `_process_strategy()` with the appropriate `leg_def`, `rules`, and column definitions
- Add a validation rule in `rules.py` if the strategy has strike constraints
- Update `definitions.py` if a new column structure is needed
- Export in `__init__.py`
- Add tests in `tests/test_strategies.py`
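The wrapper pattern from the steps above looks roughly like the following. Everything here is a self-contained stand-in: `_calls`, `_process_strategy`, and the helper name are hypothetical stubs mimicking the documented shape, not optopsy's real internals.

```python
from enum import Enum

# Hypothetical stand-ins for optopsy internals; signatures are illustrative.
class Side(Enum):
    long = 1
    short = -1

def _calls(data):
    # Leg filter stub: keep only call rows.
    return [row for row in data if row["option_type"] == "call"]

def _process_strategy(data, leg_def, rules):
    # Build one leg per (side, leg_filter, quantity) tuple, then apply rules.
    legs = [(side, leg_filter(data), qty) for side, leg_filter, qty in leg_def]
    return [leg for leg in legs if all(rule(leg) for rule in rules)]

# Step 1: thin public wrapper. Step 2: helper delegates to _process_strategy().
def long_calls(data):
    return _long_call_helper(data)

def _long_call_helper(data):
    leg_def = [(Side.long, _calls, 1)]
    rules = []  # single-leg strategy: no strike-ordering rules needed
    return _process_strategy(data, leg_def, rules)

data = [{"option_type": "call", "strike": 100.0},
        {"option_type": "put", "strike": 100.0}]
result = long_calls(data)
```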
```python
class Side(Enum):
    long = 1    # Buy (positive multiplier)
    short = -1  # Sell (negative multiplier)
```

Leg definitions use tuples: `(Side.long, _calls, quantity)`, where `quantity` defaults to 1.
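The enum value doubles as a signed multiplier. A minimal sketch of how that plays out for a two-leg vertical spread (the leg filter and prices here are placeholders, not real optopsy code):

```python
from enum import Enum

class Side(Enum):
    long = 1    # buy: pay the price
    short = -1  # sell: receive the price

def _calls(chain):
    return chain  # placeholder leg filter

# Leg tuples in the documented (side, leg_filter, quantity) shape.
leg_def = [(Side.long, _calls, 1), (Side.short, _calls, 1)]

# Side.value signs each leg's cash flow: buy at 5.20, sell at 2.10
# gives a net debit of 3.10 for the spread.
prices = [5.20, 2.10]
net_cost = sum(side.value * qty * price
               for (side, _leg, qty), price in zip(leg_def, prices))
```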
Standalone data management package — no Chainlit dependency. Provides CLI, providers, and caching.
```bash
# Install with data extras
uv sync --extra data

# Download historical options data (requires EODHD_API_KEY)
uv run optopsy-data download SPY        # download single symbol
uv run optopsy-data download SPY AAPL   # download multiple symbols
uv run optopsy-data download SPY -s     # download stock prices
uv run optopsy-data download SPY -v     # verbose/debug logging

# List available symbols
uv run optopsy-data symbols
uv run optopsy-data symbols -q SPY

# Cache management
uv run optopsy-data cache size       # show disk usage
uv run optopsy-data cache clear      # clear all cached data
uv run optopsy-data cache clear SPY  # clear specific symbol
```

- `cli.py` — CLI entry point (`optopsy-data`). Argparse with `download`, `symbols`, and `cache` subcommands.
- `paths.py` — Base data directory resolution (`~/.optopsy` or `OPTOPSY_DATA_DIR`).
- `_compat.py` — Compatibility utilities.
- `_dataframe_utils.py` — DataFrame helper functions.
- `_yf_helpers.py` — Yahoo Finance data helpers.
Pluggable provider system for fetching market data.
- `base.py` — Abstract `DataProvider` interface. Requires `name`, `env_key`, `get_tool_schemas()`, `execute(tool_name, arguments)`.
- `eodhd.py` — `EODHDProvider`. Fetches options chains and stock prices from the EODHD API. Smart caching with gap detection.
- `cache.py` — `ParquetCache`. File-based cache at `~/.optopsy/cache/{category}/{SYMBOL}.parquet`.
- `result_store.py` — Strategy result storage.
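The documented path scheme (base directory from `OPTOPSY_DATA_DIR` with a `~/.optopsy` fallback, cache files at `cache/{category}/{SYMBOL}.parquet`) can be sketched with the standard library. The helper name is hypothetical, not optopsy's actual API:

```python
import os
from pathlib import Path

def cache_path(category: str, symbol: str) -> Path:
    # Base dir: OPTOPSY_DATA_DIR if set, otherwise ~/.optopsy
    base = Path(os.environ.get("OPTOPSY_DATA_DIR",
                               str(Path.home() / ".optopsy")))
    # Cache layout: cache/{category}/{SYMBOL}.parquet
    return base / "cache" / category / f"{symbol.upper()}.parquet"

p = cache_path("options", "spy")
```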
An AI-powered chat interface for interactive options backtesting, built on Chainlit + LiteLLM.
```bash
# Install with UI extras
uv sync --extra ui

# Launch (opens browser)
uv run optopsy-chat

# With options
uv run optopsy-chat run --port 9000 --headless --debug
```

Environment variables (set in `.env` or shell):
| Variable | Purpose |
|---|---|
| `ANTHROPIC_API_KEY` | LLM provider API key (default provider) |
| `OPENAI_API_KEY` | Alternative LLM provider |
| `OPTOPSY_MODEL` | Override model (LiteLLM format, default: `anthropic/claude-haiku-4-5-20251001`) |
| `EODHD_API_KEY` | Enable EODHD data provider for live options/stock data |
- `cli.py` — CLI entry point (`optopsy-chat`). Argparse with a `run` subcommand. Lazy imports so non-`run` commands skip Chainlit startup.
- `app.py` — Chainlit web app. Handlers for `on_chat_start`, `on_chat_resume`, `on_message`. Delegates to `OptopsyAgent`.
- `agent.py` — `OptopsyAgent` class. Tool-calling loop over LiteLLM with streaming, message compaction (`_COMPACT_THRESHOLD = 300`), and a cap of `_MAX_TOOL_ITERATIONS = 15`.
- `tools.py` — Tool registry. Core tools: `load_csv_data`, `list_data_files`, `preview_data`, `run_strategy` (all 38 strategies). Provider tools registered dynamically.
- Subclass `DataProvider` in `optopsy/data/providers/`
- Implement `name`, `env_key`, `get_tool_schemas()`, `get_tool_names()`, `execute()`
- Register in `providers/__init__.py`
- The provider is auto-detected if its `env_key` is set
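A minimal sketch of the steps above. The base class here is a self-contained stand-in for the `DataProvider` interface (the real one lives in `optopsy/data/providers/base.py` and may differ), and the provider, tool, and env var names are all hypothetical:

```python
import os
from abc import ABC, abstractmethod

# Stand-in for the documented DataProvider interface.
class DataProvider(ABC):
    name: str
    env_key: str

    @abstractmethod
    def get_tool_schemas(self): ...
    @abstractmethod
    def get_tool_names(self): ...
    @abstractmethod
    def execute(self, tool_name, arguments): ...

    def is_available(self) -> bool:
        # Auto-detection rule from the docs: enabled when env_key is set.
        return bool(os.environ.get(self.env_key))

class FakeQuotesProvider(DataProvider):
    name = "fake_quotes"
    env_key = "FAKE_QUOTES_API_KEY"  # hypothetical variable

    def get_tool_schemas(self):
        return [{"name": "get_quote",
                 "description": "Fetch a stub quote",
                 "parameters": {"symbol": "string"}}]

    def get_tool_names(self):
        return [s["name"] for s in self.get_tool_schemas()]

    def execute(self, tool_name, arguments):
        if tool_name == "get_quote":
            return {"symbol": arguments["symbol"], "price": 123.45}
        raise ValueError(f"unknown tool: {tool_name}")

provider = FakeQuotesProvider()
result = provider.execute("get_quote", {"symbol": "SPY"})
```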
Branch names are enforced by a pre-push hook (via pre-commit). All branches must use one of these prefixes:
| Prefix | Use case |
|---|---|
| `feature/` | New features |
| `fix/` | Bug fixes |
| `bugfix/` | Bug fixes (alias) |
| `hotfix/` | Urgent production fixes |
| `release/` | Release preparation |
| `claude/` | Claude-generated branches |
| `copilot/` | Copilot-generated branches |
| `main` | Main branch (no prefix) |
Examples: `feature/add-iron-condor-strategy`, `fix/dte-filter-bug`, `claude/refactor-core`
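The naming rule the hook enforces can be sketched as a regular expression. The hook itself is configured via pre-commit; this regex is an illustration of the rule, not taken from the hook's source:

```python
import re

# main, or one of the allowed prefixes followed by a non-empty branch name.
BRANCH_RE = re.compile(
    r"^(?:main|(?:feature|fix|bugfix|hotfix|release|claude|copilot)/.+)$"
)

def branch_ok(name: str) -> bool:
    return BRANCH_RE.match(name) is not None
```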
After cloning, install the pre-push hook:
```bash
uv run pre-commit install --hook-type pre-push
```

Publishing to PyPI is automated via GitHub Actions (`.github/workflows/python-publish.yml`) using trusted publishing. To release:
- Update `version` in `pyproject.toml` (e.g., `"2.3.0"`, or `"2.3.0b1"` for pre-releases)
- Commit and push the version bump
- Create a GitHub release via `gh release create`:

```bash
# Stable release
gh release create v2.3.0 --title "v2.3.0" --notes "Release notes here"

# Pre-release / beta
gh release create v2.3.0b1 --title "v2.3.0b1" --notes "Release notes here" --prerelease
```
- The workflow automatically builds and publishes to PyPI — no API tokens needed locally