# Tax Practice AI - Next Steps

Last updated: 2024-12-23

## Current Status: Development Environment Setup (Complete)

### Completed
- Virtual environment created (.venv/)
- Python dependencies installed (requirements.txt, requirements-dev.txt)
- docker-compose.yml for local PostgreSQL (port 5433)
- src/ directory structure created
- tests/ directory structure created
- .env.example template
- .gitignore updated
- config.yaml with documented parameters
- .env configured with DB_* variables
- PostgreSQL container running (tax-practice-postgres)
- DATABASE_SCHEMA.sql applied (48 tables)
- Base service classes created:
  - src/config/settings.py (config loading with env var substitution; sketched below)
  - src/services/base_service.py (abstract base class)
  - src/services/aurora_service.py (async PostgreSQL access)
  - src/services/__init__.py (ServiceRegistry)
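
For reference, a minimal sketch of the kind of `${VAR}` substitution settings.py performs when loading config.yaml. The function name, placeholder syntax, and use of PyYAML are assumptions, not the actual implementation.

```python
# Hypothetical sketch only - the real src/config/settings.py may differ.
import os
import re
import yaml  # PyYAML, assumed to be among the installed dependencies

_ENV_VAR = re.compile(r"\$\{(\w+)\}")

def load_config(path: str = "config.yaml") -> dict:
    """Read config.yaml and replace ${VAR} placeholders with environment values."""
    with open(path) as fh:
        raw = fh.read()
    # Leave the placeholder untouched if the variable is not set.
    substituted = _ENV_VAR.sub(lambda m: os.environ.get(m.group(1), m.group(0)), raw)
    return yaml.safe_load(substituted)
```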
### Verified Working

```bash
# Test database connectivity
source .venv/bin/activate && source .env
python3 -c "
import asyncio
from src.config import get_config
from src.services import services

async def test():
    config = get_config()
    services.initialize(config)
    result = await services.aurora.health_check_async()
    print(f'Database connected: {result}')
    await services.close_all()

asyncio.run(test())
"
```
## Architectural Decision: Multi-Tenant SaaS

### Decision: Separate Databases per Tenant (within one Aurora cluster)
| Tenant | Database | Connection |
|---|---|---|
| Firm A | tax_practice_firma | aurora-cluster/tax_practice_firma |
| Firm B | tax_practice_firmb | aurora-cluster/tax_practice_firmb |
Rationale:

- Strongest data isolation (critical for tax/compliance)
- Single Aurora cluster = shared tiered pricing
- Independent backups per tenant
- Easy migration to a dedicated cluster if a tenant grows
- Meets IRS Circular 230 and WISP requirements
Implementation requirements (a routing sketch follows this list):

- [ ] Tenant registry table (master database or separate service)
- [ ] Tenant routing middleware (resolve tenant → connection)
- [ ] Per-tenant database provisioning workflow
- [ ] Tenant-aware ServiceRegistry (dynamic connections)
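
A minimal sketch of the tenant → connection resolution these requirements imply, assuming tenants are keyed by a short identifier and the registry ultimately lives in a master database. All class and function names here are illustrative, not existing project code.

```python
# Illustrative only: per-tenant database routing within one Aurora cluster.
from dataclasses import dataclass


@dataclass
class Tenant:
    tenant_id: str
    database: str  # e.g. "tax_practice_firma"


class TenantRegistry:
    """In-memory stand-in for the master tenant registry table."""

    def __init__(self, tenants: dict[str, Tenant]):
        self._tenants = tenants

    def resolve(self, tenant_key: str) -> Tenant:
        try:
            return self._tenants[tenant_key]
        except KeyError:
            raise ValueError(f"Unknown tenant: {tenant_key}")


def tenant_dsn(tenant: Tenant, cluster_host: str, user: str, password: str) -> str:
    """Build a per-tenant DSN against the shared Aurora cluster."""
    return f"postgresql://{user}:{password}@{cluster_host}:5432/{tenant.database}"


# Usage sketch: the routing middleware would call resolve() once per request,
# then hand the DSN to a tenant-aware ServiceRegistry.
registry = TenantRegistry({"firma": Tenant("firma", "tax_practice_firma")})
dsn = tenant_dsn(registry.resolve("firma"), "aurora-cluster", "app_user", "change-me")
```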
## Implementation Workflow

### Before Coding Any Feature

1. **Read requirements** - Review USER_STORIES.md for the specific story
2. **Read related specs** - Check API_SPECIFICATION.md, DATABASE_SCHEMA.sql, PROCESS_FLOWS.md
3. **Verify understanding** - Confirm scope with Don if unclear
4. **Plan implementation** - Identify files to create/modify
5. **Implement** - Write code following ARCHITECTURE.md patterns
6. **Test** - Verify against requirements
7. **Update docs** - Update ARCHITECTURE.md if patterns change
## Next Phase: Client Intake Implementation

### Sequence 1: Client Intake (INT-001 to INT-006)
| Story | Description | Priority |
|---|---|---|
| INT-001 | New client registration portal | P0 |
| INT-002 | Identity verification (Persona) | P0 |
| INT-003 | Conflict check automation | P1 |
| INT-004 | Engagement letter generation | P0 |
| INT-005 | E-signature integration (Google) | P0 |
| INT-006 | Initial document request | P1 |
### Prerequisites for Client Intake
- Review INT-001 through INT-006 in USER_STORIES.md
- Review client-related endpoints in API_SPECIFICATION.md
- Create domain models (src/domain/client.py, etc.)
- Create repository layer (src/repositories/client_repository.py)
- Set up FastAPI routes (src/api/routes/clients.py)
- Create basic API schemas (src/api/schemas/client_schemas.py); a layering sketch of these pieces follows this list
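
A hypothetical layering sketch showing how these pieces could fit together for INT-001. The endpoint path, table and field names, Pydantic v2-style schemas, and asyncpg-style parameter binding are assumptions; the real shapes should come from API_SPECIFICATION.md and DATABASE_SCHEMA.sql.

```python
# Illustrative layering for client intake: schema -> repository -> route.
from fastapi import APIRouter, Depends
from pydantic import BaseModel


class ClientCreate(BaseModel):          # src/api/schemas/client_schemas.py
    legal_name: str
    email: str


class ClientOut(ClientCreate):
    client_id: int


class ClientRepository:                 # src/repositories/client_repository.py
    def __init__(self, db):
        self.db = db  # expected: an async connection/pool (e.g. from aurora_service)

    async def create(self, data: ClientCreate) -> ClientOut:
        row = await self.db.fetchrow(
            "INSERT INTO clients (legal_name, email) VALUES ($1, $2) RETURNING client_id",
            data.legal_name,
            data.email,
        )
        return ClientOut(client_id=row["client_id"], **data.model_dump())


async def get_repository() -> ClientRepository:
    # Placeholder: wire this to the tenant-aware ServiceRegistry / aurora_service.
    raise NotImplementedError


router = APIRouter(prefix="/clients", tags=["clients"])   # src/api/routes/clients.py


@router.post("", response_model=ClientOut, status_code=201)
async def create_client(
    payload: ClientCreate,
    repo: ClientRepository = Depends(get_repository),
) -> ClientOut:
    return await repo.create(payload)
```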
## Implementation Order (From USER_STORIES.md)
| Sequence | Name | Stories | Focus |
|---|---|---|---|
| 1 | Client Intake | INT-001 to INT-006 | Registration, identity verification, engagement |
| 2 | Document Collection | INT-007 to INT-012 | Upload, classification, extraction |
| 3 | Return Preparation | INT-013 to INT-018 | AI review, calculations, forms |
| ... | ... | ... | ... |
## Cloud Migration Queue
Items to address when moving from local dev to AWS:
| Item | Description | Priority |
|---|---|---|
| Aurora setup | Create Aurora PostgreSQL cluster | P0 |
| S3 buckets | Create document storage buckets | P0 |
| Secrets Manager | Move credentials from .env (sketch below) | P0 |
| IAM roles | Service-specific roles | P0 |
| VPC configuration | Network isolation | P0 |
| CI/CD pipeline | GitHub Actions for deployment | P1 |
| Airflow EC2 | Set up orchestration server | P1 |
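
For the Secrets Manager item, a sketch of fetching the database credentials at runtime with boto3. The secret name, region, and JSON key names are placeholders, not agreed conventions.

```python
# Sketch: replace .env DB_* variables with a Secrets Manager lookup at startup.
import json

import boto3


def load_db_credentials(secret_name: str = "tax-practice/aurora",
                        region: str = "us-east-1") -> dict:
    """Fetch and parse a JSON secret holding the Aurora credentials."""
    client = boto3.client("secretsmanager", region_name=region)
    response = client.get_secret_value(SecretId=secret_name)
    return json.loads(response["SecretString"])


# Usage sketch (key names are assumptions):
# creds = load_db_credentials()
# dsn = f"postgresql://{creds['username']}:{creds['password']}@{creds['host']}/{creds['dbname']}"
```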
## Pending Client Confirmations
See docs/CLIENT_FOLLOWUP.md for items requiring client input:
- ASM-001: UltraTax handles all e-filing (assumed yes)
- VND-001: SurePrep sandbox access
- VND-002: SmartVault API credentials
## To Resume Development

1. Activate the virtual environment: `source .venv/bin/activate`
2. Start the database (e.g. `docker compose up -d`, using the existing docker-compose.yml)
3. Load the environment: `source .env`
4. Continue with client intake domain models or API routes
## Key Documents Reference
| Document | Purpose |
|---|---|
| ARCHITECTURE.md | System design, code patterns |
| USER_STORIES.md | 82 prioritized implementation stories |
| DATABASE_SCHEMA.sql | Complete PostgreSQL schema |
| CLIENT_FOLLOWUP.md | Pending client confirmations |
| backlog.md | Technical debt and priorities |
| config.yaml | Centralized configuration (Python + Java) |