Compare commits: a0b366e94a...fix/viewer (10 commits)

Commits: 9d137a40d3, 3bf6b8c6c9, 4759374883, cb6e34d5ce, 2b72951e66, 69dad7cc74, efa5aca35f, c429dcc033, 9146118df1, 07d15001ae
PLAN.md (new file, +136 lines)

@@ -0,0 +1,136 @@
# Phase 2 Bug Fix & Tweaks - Implementation Plan

## 1. Admin Panel: Tenant Creation, Contract/Plan Fields, Disable/Archive

### Database Changes

- Add `contract_number VARCHAR(100)` and `plan_level VARCHAR(50) DEFAULT 'standard'` to `shared.organizations` (live DB ALTER + init SQL)
- Add `archived` to the status CHECK constraint: `('active', 'suspended', 'trial', 'archived')`
- Add to the Organization entity: `contractNumber`, `planLevel` columns
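A sketch of the live-DB side of these changes. Column names and types come from the plan above; the constraint name `organizations_status_check` is an assumption — check the actual name before running.

```sql
-- Assumed constraint name; verify against pg_constraint first.
ALTER TABLE shared.organizations
  ADD COLUMN contract_number VARCHAR(100),
  ADD COLUMN plan_level VARCHAR(50) DEFAULT 'standard';

ALTER TABLE shared.organizations DROP CONSTRAINT IF EXISTS organizations_status_check;
ALTER TABLE shared.organizations
  ADD CONSTRAINT organizations_status_check
  CHECK (status IN ('active', 'suspended', 'trial', 'archived'));
```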
### Backend Changes

- **admin.controller.ts**: Add two new endpoints:
  - `POST /admin/tenants` — Creates org + first user + tenant schema in one call. Accepts: org name, email, address, contractNumber, planLevel, plus the first user's email/password/firstName/lastName. Calls `OrganizationsService.create()`, then sets up the user.
  - `PUT /admin/organizations/:id/status` — Sets status to 'active', 'suspended', or 'archived'
- **auth.module.ts**: Import OrganizationsModule so AdminController can inject OrganizationsService
- **auth.service.ts**: In `login()`, after loading the user with orgs, check whether the default org's status is 'suspended' or 'archived' → throw UnauthorizedException("Your organization has been suspended/archived")
- **users.service.ts**: Update the `findAllOrganizations()` query to include `contract_number, plan_level` in the SELECT

### Frontend Changes

- **AdminPage.tsx**:
  - Add a "Create Tenant" button → opens a modal with: org name, address, email, phone, contract number, plan level (select: standard/premium/enterprise), first admin email, first admin password, first/last name
  - Orgs table: add Contract # and Plan Level columns
  - Orgs table: add a Status dropdown/buttons (Active/Suspended/Archived) per row, with confirmation
  - Show status colors: active=green, trial=yellow, suspended=orange, archived=red
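The login-time status check can be sketched as a pure guard. This is an illustrative shape only — `assertOrgActive` is a hypothetical helper name, and the local `UnauthorizedException` class stands in for the real one from `@nestjs/common`:

```typescript
// Stand-in for NestJS's UnauthorizedException (illustration only).
class UnauthorizedException extends Error {}

// Reject login when the user's default org is suspended or archived,
// matching the auth.service.ts behavior described above.
function assertOrgActive(status: string): void {
  if (status === 'suspended' || status === 'archived') {
    throw new UnauthorizedException(`Your organization has been ${status}`);
  }
}
```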
## 2. Units/Homeowners: Delete + Assessment Group Binding

### Backend Changes

- **units.controller.ts**: Add a `@Delete(':id')` route
- **units.service.ts**:
  - Add a `delete(id)` method — checks for outstanding invoices first, then deletes
  - Add `assessment_group_id` to the `create()` INSERT and `update()` UPDATE queries
  - Update `findAll()` to JOIN assessment_groups and return `assessment_group_name`
### Frontend Changes

- **UnitsPage.tsx**:
  - Add a delete button (trash icon) per row, with a confirmation dialog
  - Add an Assessment Group dropdown (Select) in the create/edit modal, populated from the `/assessment-groups` query
  - Show the assessment group name in the table
  - When an assessment group is selected and no manual monthly_assessment is set, auto-fill from the group's regular_assessment
## 3. Assessment Groups: Frequency Field

### Database Changes

- Add `frequency VARCHAR(20) DEFAULT 'monthly'` to the `assessment_groups` table (live DB ALTER + tenant-schema DDL)
- CHECK constraint: `('monthly', 'quarterly', 'annual')`
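The live-DB side might look like the following, run once per tenant schema; `tenant_x` is a placeholder schema name:

```sql
-- Placeholder schema name; repeat for each tenant schema.
ALTER TABLE tenant_x.assessment_groups
  ADD COLUMN frequency VARCHAR(20) DEFAULT 'monthly'
  CHECK (frequency IN ('monthly', 'quarterly', 'annual'));
```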
### Backend Changes

- **assessment-groups.service.ts**:
  - Add `frequency` to the `create()` INSERT
  - Add `frequency` to the `update()` dynamic sets
  - Update the `findAll()` and `getSummary()` income calculations to adjust by frequency:
    - monthly → amounts are already per month (annualize ×12)
    - quarterly → amounts are per quarter, so monthly = amount/3
    - annual → amounts are per year, so monthly = amount/12
  - Summary labels should change to read "Monthly Equivalent" when frequencies are mixed
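The frequency adjustment above reduces to one small conversion; `monthlyEquivalent` is a hypothetical helper name, not a function in the codebase:

```typescript
type Frequency = 'monthly' | 'quarterly' | 'annual';

// Convert a per-period assessment amount to its monthly equivalent,
// following the rules listed above (quarterly/3, annual/12).
function monthlyEquivalent(amount: number, frequency: Frequency): number {
  switch (frequency) {
    case 'monthly': return amount;
    case 'quarterly': return amount / 3;
    case 'annual': return amount / 12;
  }
}
```

A $300/quarter group and a $1,200/year group both contribute $100/month to the summary totals.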
### Frontend Changes

- **AssessmentGroupsPage.tsx**:
  - Add a frequency Select in the create/edit modal: Monthly, Quarterly, Annual
  - Show a frequency badge in the table
  - Update summary cards: labels → "Monthly Equivalent Operating", etc.
  - Change the assessment amount label based on frequency ("Per Month" / "Per Quarter" / "Per Year")
## 4. UI Streamlining: Sidebar Grouping, Rename, Logo

### Sidebar Restructure

Group nav items into labeled sections:

```
Dashboard

─── FINANCIALS ───
Accounts (renamed from "Chart of Accounts")
Budgets
Investments

─── ASSESSMENTS ───
Units / Homeowners
Assessment Groups

─── TRANSACTIONS ───
Transactions
Invoices
Payments

─── PLANNING ───
Capital Projects
Reserves
Vendors

─── REPORTS ───
(collapsible with sub-items)

─── ADMIN ───
Year-End
Settings

─── PLATFORM ADMIN ─── (superadmin only)
Admin Panel
```
### Logo

- Copy the SVG to `frontend/src/assets/logo.svg`
- In AppLayout.tsx: replace `<Title order={3} c="blue">HOA LedgerIQ</Title>` with an `<img>` tag loading the SVG, sized to fit the 60px header (height ~40px with padding)
- The SVG is served directly (Vite handles SVG imports natively); no PNG conversion is needed since browsers render SVG natively and it's cleaner
## 5. Capital Projects: PDF Table Export, Kanban Default, Future Category

### Frontend Changes

- **CapitalProjectsPage.tsx**:
  - Change the default viewMode from `'table'` to `'kanban'`
  - PDF export: temporarily switch to table view for print, then restore. Use `@media print` CSS so the table layout is always shown regardless of the current view
  - Add a "Future" column in kanban: projects with `target_year = 9999` (sentinel value) display as "Future"
  - Update the form: the Target Year select should include a "Future (Beyond 5-Year)" option that maps to year 9999
  - Kanban year list: always include the current year through +5, plus "Future" if any projects exist there
  - Table view: group "Future" projects under a "Future" header
  - Title: "Capital Projects" (drop "(5-Year Plan)" now that there is a Future category)

### Backend

- No backend changes needed — target_year=9999 works with the existing schema (integer column, no constraint)
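The kanban column logic above can be sketched as a pure function. `FUTURE_YEAR` and `kanbanYears` are illustrative names under the plan's sentinel assumption, not identifiers from the codebase:

```typescript
const FUTURE_YEAR = 9999; // sentinel value for "Future" projects, per the plan

// Build the kanban column list: current year through +5, plus a trailing
// "Future" column only when some project uses the sentinel year.
function kanbanYears(currentYear: number, projectYears: number[]): (number | 'Future')[] {
  const cols: (number | 'Future')[] = [];
  for (let y = currentYear; y <= currentYear + 5; y++) cols.push(y);
  if (projectYears.includes(FUTURE_YEAR)) cols.push('Future');
  return cols;
}
```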
## File Change Summary

| File | Action |
|------|--------|
| `db/init/00-init.sql` | Add contract_number, plan_level, update status CHECK |
| `backend/src/modules/organizations/entities/organization.entity.ts` | Add contractNumber, planLevel columns |
| `backend/src/modules/organizations/dto/create-organization.dto.ts` | Add contractNumber, planLevel fields |
| `backend/src/modules/auth/admin.controller.ts` | Add POST /admin/tenants, PUT /admin/organizations/:id/status |
| `backend/src/modules/auth/auth.module.ts` | Import OrganizationsModule |
| `backend/src/modules/auth/auth.service.ts` | Add org status check on login |
| `backend/src/modules/users/users.service.ts` | Update findAllOrganizations query |
| `backend/src/modules/units/units.controller.ts` | Add DELETE route |
| `backend/src/modules/units/units.service.ts` | Add delete(), assessment_group_id support |
| `backend/src/modules/assessment-groups/assessment-groups.service.ts` | Add frequency support + adjust income calcs |
| `backend/src/database/tenant-schema.service.ts` | Add frequency to assessment_groups DDL |
| `frontend/src/assets/logo.svg` | New — copy from /Users/claw/Downloads/logo_house.svg |
| `frontend/src/components/layout/AppLayout.tsx` | Replace text with logo |
| `frontend/src/components/layout/Sidebar.tsx` | Restructure with grouped sections |
| `frontend/src/pages/admin/AdminPage.tsx` | Create tenant modal, status management, new columns |
| `frontend/src/pages/units/UnitsPage.tsx` | Delete, assessment group dropdown |
| `frontend/src/pages/assessment-groups/AssessmentGroupsPage.tsx` | Frequency field |
| `frontend/src/pages/capital-projects/CapitalProjectsPage.tsx` | Kanban default, table PDF, Future category |
| Live DB | ALTER TABLE commands for contract_number, plan_level, frequency, status CHECK |
backend/package-lock.json (generated)

@@ -1,12 +1,12 @@
 {
   "name": "hoa-ledgeriq-backend",
-  "version": "2026.3.2-beta",
+  "version": "2026.3.7-beta",
   "lockfileVersion": 3,
   "requires": true,
   "packages": {
     "": {
       "name": "hoa-ledgeriq-backend",
-      "version": "2026.3.2-beta",
+      "version": "2026.3.7-beta",
       "dependencies": {
         "@nestjs/common": "^10.4.15",
         "@nestjs/config": "^3.3.0",
backend/package.json

@@ -1,6 +1,6 @@
 {
   "name": "hoa-ledgeriq-backend",
-  "version": "2026.3.2-beta",
+  "version": "2026.3.7-beta",
   "description": "HOA LedgerIQ - Backend API",
   "private": true,
   "scripts": {
backend/src/database/tenant-schema.service.ts

@@ -112,6 +112,8 @@ export class TenantSchemaService {
         special_assessment DECIMAL(10,2) DEFAULT 0.00,
         unit_count INTEGER DEFAULT 0,
         frequency VARCHAR(20) DEFAULT 'monthly' CHECK (frequency IN ('monthly', 'quarterly', 'annual')),
+        due_months INTEGER[] DEFAULT '{1,2,3,4,5,6,7,8,9,10,11,12}',
+        due_day INTEGER DEFAULT 1,
         is_default BOOLEAN DEFAULT FALSE,
         is_active BOOLEAN DEFAULT TRUE,
         created_at TIMESTAMPTZ DEFAULT NOW(),
@@ -155,8 +157,11 @@ export class TenantSchemaService {
         amount DECIMAL(10,2) NOT NULL,
         amount_paid DECIMAL(10,2) DEFAULT 0.00,
         status VARCHAR(20) DEFAULT 'draft' CHECK (status IN (
-          'draft', 'sent', 'paid', 'partial', 'overdue', 'void', 'written_off'
+          'draft', 'pending', 'sent', 'paid', 'partial', 'overdue', 'void', 'written_off'
         )),
+        period_start DATE,
+        period_end DATE,
+        assessment_group_id UUID REFERENCES "${s}".assessment_groups(id),
         journal_entry_id UUID REFERENCES "${s}".journal_entries(id),
         sent_at TIMESTAMPTZ,
         paid_at TIMESTAMPTZ,
@@ -325,6 +330,8 @@ export class TenantSchemaService {
         risk_notes JSONB,
         requested_by UUID,
         response_time_ms INTEGER,
+        status VARCHAR(20) DEFAULT 'complete',
+        error_message TEXT,
         created_at TIMESTAMPTZ DEFAULT NOW()
       )`,
backend/src/main.ts

@@ -67,7 +67,7 @@ async function bootstrap() {
   const config = new DocumentBuilder()
     .setTitle('HOA LedgerIQ API')
     .setDescription('API for the HOA LedgerIQ')
-    .setVersion('2026.3.2')
+    .setVersion('2026.3.7')
     .addBearerAuth()
     .build();
   const document = SwaggerModule.createDocument(app, config);
backend/src/modules/assessment-groups/assessment-groups.service.ts

@@ -1,6 +1,12 @@
-import { Injectable, NotFoundException } from '@nestjs/common';
+import { Injectable, NotFoundException, BadRequestException } from '@nestjs/common';
 import { TenantService } from '../../database/tenant.service';
 
+const DEFAULT_DUE_MONTHS: Record<string, number[]> = {
+  monthly: [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12],
+  quarterly: [1, 4, 7, 10],
+  annual: [1],
+};
+
 @Injectable()
 export class AssessmentGroupsService {
   constructor(private tenant: TenantService) {}
@@ -42,6 +48,33 @@ export class AssessmentGroupsService {
     return rows.length ? rows[0] : null;
   }
 
+  private validateDueMonths(frequency: string, dueMonths: number[]) {
+    if (!dueMonths || !dueMonths.length) {
+      throw new BadRequestException('Due months are required');
+    }
+    // Validate all values are 1-12
+    if (dueMonths.some((m) => m < 1 || m > 12 || !Number.isInteger(m))) {
+      throw new BadRequestException('Due months must be integers between 1 and 12');
+    }
+    switch (frequency) {
+      case 'monthly':
+        if (dueMonths.length !== 12) {
+          throw new BadRequestException('Monthly frequency must include all 12 months');
+        }
+        break;
+      case 'quarterly':
+        if (dueMonths.length !== 4) {
+          throw new BadRequestException('Quarterly frequency must have exactly 4 due months');
+        }
+        break;
+      case 'annual':
+        if (dueMonths.length !== 1) {
+          throw new BadRequestException('Annual frequency must have exactly 1 due month');
+        }
+        break;
+    }
+  }
+
   async create(dto: any) {
     const existingGroups = await this.tenant.query('SELECT COUNT(*) as cnt FROM assessment_groups');
     const isFirstGroup = parseInt(existingGroups[0].cnt) === 0;
@@ -51,17 +84,23 @@ export class AssessmentGroupsService {
       await this.tenant.query('UPDATE assessment_groups SET is_default = false WHERE is_default = true');
     }
 
+    const frequency = dto.frequency || 'monthly';
+    const dueMonths = dto.dueMonths || DEFAULT_DUE_MONTHS[frequency] || DEFAULT_DUE_MONTHS.monthly;
+    const dueDay = Math.min(Math.max(dto.dueDay || 1, 1), 28);
+
+    this.validateDueMonths(frequency, dueMonths);
+
     const rows = await this.tenant.query(
-      `INSERT INTO assessment_groups (name, description, regular_assessment, special_assessment, unit_count, frequency, is_default)
-       VALUES ($1, $2, $3, $4, $5, $6, $7) RETURNING *`,
+      `INSERT INTO assessment_groups (name, description, regular_assessment, special_assessment, unit_count, frequency, due_months, due_day, is_default)
+       VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9) RETURNING *`,
       [dto.name, dto.description || null, dto.regularAssessment || 0, dto.specialAssessment || 0,
-       dto.unitCount || 0, dto.frequency || 'monthly', shouldBeDefault],
+       dto.unitCount || 0, frequency, dueMonths, dueDay, shouldBeDefault],
     );
     return rows[0];
   }
 
   async update(id: string, dto: any) {
-    await this.findOne(id);
+    const existing = await this.findOne(id);
 
     if (dto.isDefault === true) {
       await this.tenant.query('UPDATE assessment_groups SET is_default = false WHERE is_default = true');
@@ -80,6 +119,24 @@ export class AssessmentGroupsService {
     if (dto.frequency !== undefined) { sets.push(`frequency = $${idx++}`); params.push(dto.frequency); }
     if (dto.isDefault !== undefined) { sets.push(`is_default = $${idx++}`); params.push(dto.isDefault); }
 
+    // Handle due_months: if frequency changed and no explicit dueMonths, auto-populate defaults
+    const effectiveFrequency = dto.frequency || existing.frequency;
+    if (dto.dueMonths !== undefined) {
+      this.validateDueMonths(effectiveFrequency, dto.dueMonths);
+      sets.push(`due_months = $${idx++}`);
+      params.push(dto.dueMonths);
+    } else if (dto.frequency !== undefined && dto.frequency !== existing.frequency) {
+      // Frequency changed, auto-populate due_months
+      const newDueMonths = DEFAULT_DUE_MONTHS[dto.frequency] || DEFAULT_DUE_MONTHS.monthly;
+      sets.push(`due_months = $${idx++}`);
+      params.push(newDueMonths);
+    }
+
+    if (dto.dueDay !== undefined) {
+      sets.push(`due_day = $${idx++}`);
+      params.push(Math.min(Math.max(dto.dueDay, 1), 28));
+    }
+
     if (!sets.length) return this.findOne(id);
 
     sets.push('updated_at = NOW()');
backend/src/modules/health-scores/health-scores.controller.ts

@@ -1,4 +1,4 @@
-import { Controller, Get, Post, UseGuards, Req } from '@nestjs/common';
+import { Controller, Get, Post, UseGuards, Req, Logger } from '@nestjs/common';
 import { ApiTags, ApiBearerAuth, ApiOperation } from '@nestjs/swagger';
 import { JwtAuthGuard } from '../auth/guards/jwt-auth.guard';
 import { AllowViewer } from '../../common/decorators/allow-viewer.decorator';
@@ -9,6 +9,8 @@ import { HealthScoresService } from './health-scores.service';
 @ApiBearerAuth()
 @UseGuards(JwtAuthGuard)
 export class HealthScoresController {
+  private readonly logger = new Logger(HealthScoresController.name);
+
   constructor(private service: HealthScoresService) {}
 
   @Get('latest')
@@ -19,32 +21,56 @@ export class HealthScoresController {
   }
 
   @Post('calculate')
-  @ApiOperation({ summary: 'Trigger both health score recalculations (used by scheduler)' })
+  @ApiOperation({ summary: 'Trigger both health score recalculations (async — returns immediately)' })
   @AllowViewer()
   async calculate(@Req() req: any) {
     const schema = req.user?.orgSchema;
-    const [operating, reserve] = await Promise.all([
+    // Fire-and-forget — background processing saves results to DB
+    Promise.all([
       this.service.calculateScore(schema, 'operating'),
       this.service.calculateScore(schema, 'reserve'),
-    ]);
-    return { operating, reserve };
+    ]).catch((err) => {
+      this.logger.error(`Background health score calculation failed: ${err.message}`);
+    });
+
+    return {
+      status: 'processing',
+      message: 'Health score calculations started. Results will appear when ready.',
+    };
   }
 
   @Post('calculate/operating')
-  @ApiOperation({ summary: 'Recalculate operating fund health score only' })
+  @ApiOperation({ summary: 'Trigger operating fund health score recalculation (async)' })
   @AllowViewer()
   async calculateOperating(@Req() req: any) {
     const schema = req.user?.orgSchema;
-    const operating = await this.service.calculateScore(schema, 'operating');
-    return { operating };
+    // Fire-and-forget
+    this.service.calculateScore(schema, 'operating').catch((err) => {
+      this.logger.error(`Background operating score failed: ${err.message}`);
+    });
+
+    return {
+      status: 'processing',
+      message: 'Operating fund health score calculation started.',
+    };
   }
 
   @Post('calculate/reserve')
-  @ApiOperation({ summary: 'Recalculate reserve fund health score only' })
+  @ApiOperation({ summary: 'Trigger reserve fund health score recalculation (async)' })
   @AllowViewer()
   async calculateReserve(@Req() req: any) {
     const schema = req.user?.orgSchema;
-    const reserve = await this.service.calculateScore(schema, 'reserve');
-    return { reserve };
+    // Fire-and-forget
+    this.service.calculateScore(schema, 'reserve').catch((err) => {
+      this.logger.error(`Background reserve score failed: ${err.message}`);
+    });
+
+    return {
+      status: 'processing',
+      message: 'Reserve fund health score calculation started.',
+    };
   }
 }
backend/src/modules/health-scores/health-scores.service.ts

@@ -252,7 +252,7 @@ export class HealthScoresService {
   private async gatherOperatingData(qr: any) {
     const year = new Date().getFullYear();
 
-    const [accounts, budgets, assessments, cashFlow, recentTransactions] = await Promise.all([
+    const [accounts, budgets, assessments, cashFlow, recentTransactions, actualsMonths] = await Promise.all([
       // Operating accounts with balances
       qr.query(`
         SELECT a.name, a.account_number, a.account_type, a.fund_type,
@@ -311,21 +311,54 @@ export class HealthScoresService {
         FROM invoices
         WHERE status IN ('sent', 'overdue') AND due_date < CURRENT_DATE
       `),
+      // Detect which months have posted actuals (expense or income JEs)
+      qr.query(`
+        SELECT DISTINCT EXTRACT(MONTH FROM je.entry_date)::int as month_num
+        FROM journal_entries je
+        JOIN journal_entry_lines jel ON jel.journal_entry_id = je.id
+        JOIN accounts a ON a.id = jel.account_id
+        WHERE je.entry_date >= $1
+          AND je.entry_date < $2
+          AND je.is_posted = true AND je.is_void = false
+          AND a.fund_type = 'operating'
+          AND a.account_type IN ('income', 'expense')
+        ORDER BY month_num
+      `, [`${year}-01-01`, `${year + 1}-01-01`]),
     ]);
 
     // Calculate month-by-month budget actuals progress
     const currentMonth = new Date().getMonth(); // 0-indexed
+    const dayOfMonth = new Date().getDate();
     const monthNames = ['jan','feb','mar','apr','may','jun','jul','aug','sep','oct','nov','dec_amt'];
+    const monthLabelsForBudget = ['January','February','March','April','May','June','July','August','September','October','November','December'];
+
+    // Determine which months have posted actuals
+    const monthsWithActuals: number[] = actualsMonths.map((r: any) => parseInt(r.month_num)); // 1-indexed
+    const lastActualsMonth0 = monthsWithActuals.length > 0
+      ? Math.max(...monthsWithActuals) - 1 // convert to 0-indexed
+      : -1; // no actuals posted at all
+
+    // YTD budget = sum through last month with actuals only (NOT current incomplete month)
     let budgetedIncomeYTD = 0;
     let budgetedExpenseYTD = 0;
     for (const b of budgets) {
-      for (let m = 0; m <= currentMonth; m++) {
+      for (let m = 0; m <= lastActualsMonth0; m++) {
         const amt = parseFloat(b[monthNames[m]]) || 0;
         if (b.account_type === 'income') budgetedIncomeYTD += amt;
         else if (b.account_type === 'expense') budgetedExpenseYTD += amt;
       }
     }
+
+    // Current month budget (shown separately, not included in YTD comparison)
+    let currentMonthBudgetIncome = 0;
+    let currentMonthBudgetExpense = 0;
+    for (const b of budgets) {
+      const amt = parseFloat(b[monthNames[currentMonth]]) || 0;
+      if (b.account_type === 'income') currentMonthBudgetIncome += amt;
+      else if (b.account_type === 'expense') currentMonthBudgetExpense += amt;
+    }
+    const currentMonthHasActuals = monthsWithActuals.includes(currentMonth + 1);
+
     const operatingCash = accounts
       .filter((a: any) => a.account_type === 'asset')
       .reduce((s: number, a: any) => s + parseFloat(a.balance || '0'), 0);
@@ -459,11 +492,27 @@ export class HealthScoresService {
       ytdIncome,
       ytdExpense,
       monthlyAssessmentIncome,
+      totalAnnualAssessmentIncome: assessments.reduce((sum: number, ag: any) => {
+        const regular = parseFloat(ag.regular_assessment) || 0;
+        const units = parseInt(ag.unit_count) || 0;
+        const total = regular * units;
+        const freq = ag.frequency || 'monthly';
+        if (freq === 'monthly') return sum + total * 12;
+        if (freq === 'quarterly') return sum + total * 4;
+        return sum + total; // annual
+      }, 0),
       delinquentCount: parseInt(recentTransactions[0]?.count || '0'),
       delinquentAmount: parseFloat(recentTransactions[0]?.total_overdue || '0'),
       monthsOfExpenses: budgetedExpenseAnnual > 0 ? (operatingCash / (budgetedExpenseAnnual / 12)) : 0,
       year,
       currentMonth: currentMonth + 1,
+      dayOfMonth,
+      monthsWithActuals,
+      lastActualsMonthLabel: lastActualsMonth0 >= 0 ? monthLabelsForBudget[lastActualsMonth0] : null,
+      currentMonthLabel: monthLabelsForBudget[currentMonth],
+      currentMonthBudgetIncome,
+      currentMonthBudgetExpense,
+      currentMonthHasActuals,
       forecast,
       lowestCash: Math.round(lowestCash * 100) / 100,
       lowestCashMonth,
@@ -741,6 +790,14 @@ KEY FACTORS TO EVALUATE:
 4. Income-to-expense ratio
 5. Emergency buffer adequacy
 6. CRITICAL — Projected cash flow: Use the 12-MONTH CASH FLOW FORECAST to assess future liquidity. The forecast shows month-by-month projected income (from assessments and budgeted sources), expenses (from budget), and project costs. Check whether cash will go negative or dangerously low in any future month. If projected income arrives before projected expenses, the position may be adequate even if current cash seems low. Conversely, if a large expense precedes income in a given month, flag the timing risk.
+7. BUDGET TIMING: YTD budget comparisons only include months where actual accounting entries have been posted. Do NOT penalize the HOA for a budget variance in the current month if actuals have not yet been submitted — this is normal operational procedure. Actuals are posted at month-end. The current month's budget is shown separately for context only, not for variance analysis.
+
+CASH RUNWAY CLASSIFICATION (strict — use these rules for the Cash Reserves factor):
+- <2 months of expenses: impact = "negative"
+- 2-3 months of expenses: impact = "neutral"
+- 3-6 months of expenses: impact = "positive"
+- 6+ months of expenses: impact = "strongly positive" (contributes to Excellent score)
+Do NOT rate cash runway as positive based on projected future inflows — evaluate the CURRENT cash-on-hand position for this factor. Future inflows should be evaluated separately under the Projected Cash Flow factor.
 
 RESPONSE FORMAT:
 Respond with ONLY valid JSON (no markdown, no code fences):
@@ -768,14 +825,30 @@ Provide 3-5 factors and 1-3 actionable recommendations. Be specific with dollar
     .join('\n') || 'No budget line items.';
 
   const assessmentLines = data.assessments
-    .map((a: any) => `- ${a.name}: $${parseFloat(a.regular_assessment || '0').toFixed(2)}/unit × ${a.unit_count} units (${a.frequency})`)
+    .map((a: any) => {
+      const regular = parseFloat(a.regular_assessment || '0');
+      const units = parseInt(a.unit_count || '0');
+      const total = regular * units;
+      return `- ${a.name}: $${regular.toFixed(2)}/unit × ${units} units (${a.frequency}) = $${total.toFixed(2)} total/period`;
+    })
     .join('\n') || 'No assessment groups.';
 
+  const totalAnnualAssessmentIncome = data.assessments.reduce((sum: number, a: any) => {
+    const regular = parseFloat(a.regular_assessment || '0');
+    const units = parseInt(a.unit_count || '0');
+    const total = regular * units;
+    const freq = a.frequency || 'monthly';
+    if (freq === 'monthly') return sum + total * 12;
+    if (freq === 'quarterly') return sum + total * 4;
+    return sum + total; // annual
+  }, 0);
 
   const userPrompt = `Evaluate this HOA's operating fund health.
 
 TODAY: ${today}
 FISCAL YEAR: ${data.year}
-CURRENT MONTH: ${data.currentMonth} of 12
+CURRENT MONTH: ${data.currentMonthLabel} (day ${data.dayOfMonth}), month ${data.currentMonth} of 12
+Months with posted actuals: ${data.monthsWithActuals.length > 0 ? data.monthsWithActuals.map((m: number) => ['Jan','Feb','Mar','Apr','May','Jun','Jul','Aug','Sep','Oct','Nov','Dec'][m - 1]).join(', ') : 'None yet'}
 
 === OPERATING FUND ACCOUNTS ===
 ${accountLines}
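The new `totalAnnualAssessmentIncome` reducer annualizes each group by billing frequency (×12 monthly, ×4 quarterly, ×1 annual). The same arithmetic as a standalone sketch (field names follow the diff; the sample data is invented):

```typescript
interface AssessmentGroup {
  regular_assessment: string;
  unit_count: string;
  frequency: string;
}

// Annualize per-period assessment income by billing frequency,
// mirroring the reducer introduced in this hunk.
function annualAssessmentIncome(groups: AssessmentGroup[]): number {
  return groups.reduce((sum, a) => {
    const perPeriod = parseFloat(a.regular_assessment || '0') * parseInt(a.unit_count || '0', 10);
    const freq = a.frequency || 'monthly';
    if (freq === 'monthly') return sum + perPeriod * 12;
    if (freq === 'quarterly') return sum + perPeriod * 4;
    return sum + perPeriod; // annual
  }, 0);
}

// $250/unit monthly × 40 units = $120,000/yr; $900/unit quarterly × 10 units = $36,000/yr
console.log(annualAssessmentIncome([
  { regular_assessment: '250', unit_count: '40', frequency: 'monthly' },
  { regular_assessment: '900', unit_count: '10', frequency: 'quarterly' },
])); // 156000
```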
@@ -789,20 +862,28 @@ Budgeted Annual Income: $${data.budgetedIncomeAnnual.toFixed(2)}
 Budgeted Annual Expenses: $${data.budgetedExpenseAnnual.toFixed(2)}
 Monthly Expense Run Rate: $${(data.budgetedExpenseAnnual / 12).toFixed(2)}
 
-=== BUDGET VS ACTUAL (YTD through month ${data.currentMonth}) ===
+=== BUDGET VS ACTUAL (YTD through ${data.lastActualsMonthLabel || 'N/A — no actuals posted yet'}) ===
+Note: This comparison only covers months with posted accounting entries. ${data.lastActualsMonthLabel ? `Actuals have been posted through ${data.lastActualsMonthLabel}.` : 'No monthly actuals have been posted yet for this fiscal year.'} Budget figures are used for forecasting until actuals are submitted at month-end.
 
 Budgeted Income YTD: $${data.budgetedIncomeYTD.toFixed(2)}
 Actual Income YTD: $${data.ytdIncome.toFixed(2)}
-Income Variance: $${(data.ytdIncome - data.budgetedIncomeYTD).toFixed(2)} (${data.budgetedIncomeYTD > 0 ? ((data.ytdIncome / data.budgetedIncomeYTD) * 100).toFixed(1) : 0}% of budget)
+Income Variance: $${(data.ytdIncome - data.budgetedIncomeYTD).toFixed(2)}${data.budgetedIncomeYTD > 0 ? ` (${((data.ytdIncome / data.budgetedIncomeYTD) * 100).toFixed(1)}% of budget)` : ''}
 
 Budgeted Expenses YTD: $${data.budgetedExpenseYTD.toFixed(2)}
 Actual Expenses YTD: $${data.ytdExpense.toFixed(2)}
-Expense Variance: $${(data.ytdExpense - data.budgetedExpenseYTD).toFixed(2)} (${data.budgetedExpenseYTD > 0 ? ((data.ytdExpense / data.budgetedExpenseYTD) * 100).toFixed(1) : 0}% of budget)
+Expense Variance: $${(data.ytdExpense - data.budgetedExpenseYTD).toFixed(2)}${data.budgetedExpenseYTD > 0 ? ` (${((data.ytdExpense / data.budgetedExpenseYTD) * 100).toFixed(1)}% of budget)` : ''}
+
+=== CURRENT MONTH (${data.currentMonthLabel}, ${data.dayOfMonth} days elapsed) ===
+Budgeted Income this month: $${data.currentMonthBudgetIncome.toFixed(2)}
+Budgeted Expenses this month: $${data.currentMonthBudgetExpense.toFixed(2)}
+Actuals posted this month: ${data.currentMonthHasActuals ? 'Yes' : 'No — actuals are typically posted at month-end'}
 
 === CASH RUNWAY ===
 Months of Operating Expenses Covered: ${data.monthsOfExpenses.toFixed(1)} months
 
 === ASSESSMENT INCOME ===
 ${assessmentLines}
+Total Annual Assessment Income: $${data.totalAnnualAssessmentIncome.toFixed(2)}
 Monthly Assessment Income: $${data.monthlyAssessmentIncome.toFixed(2)}
 
 === DELINQUENCY ===
@@ -944,11 +1025,26 @@ ${budgetLines}
 
 === SPECIAL ASSESSMENT INCOME (Reserve Fund) ===
 ${data.assessments.length === 0 ? 'No special assessments configured.' :
-data.assessments.map((a: any) => {
-  const special = parseFloat(a.special_assessment || '0');
-  if (special === 0) return null;
-  return `- ${a.name}: $${special.toFixed(2)}/unit × ${a.unit_count} units (${a.frequency}) = $${(special * parseInt(a.unit_count || '0')).toFixed(2)}/period → Reserve Fund`;
-}).filter(Boolean).join('\n') || 'No special assessments currently being collected.'}
+(() => {
+  const lines = data.assessments.map((a: any) => {
+    const special = parseFloat(a.special_assessment || '0');
+    if (special === 0) return null;
+    const units = parseInt(a.unit_count || '0');
+    const totalPerPeriod = special * units;
+    return `- ${a.name}: $${special.toFixed(2)}/unit × ${units} units (${a.frequency}) = $${totalPerPeriod.toFixed(2)}/period → Reserve Fund`;
+  }).filter(Boolean);
+  if (lines.length === 0) return 'No special assessments currently being collected.';
+  const totalAnnual = data.assessments.reduce((sum: number, a: any) => {
+    const special = parseFloat(a.special_assessment || '0');
+    const units = parseInt(a.unit_count || '0');
+    const total = special * units;
+    const freq = a.frequency || 'monthly';
+    if (freq === 'monthly') return sum + total * 12;
+    if (freq === 'quarterly') return sum + total * 4;
+    return sum + total;
+  }, 0);
+  return lines.join('\n') + '\nTotal Annual Special Assessment Income to Reserves: $' + totalAnnual.toFixed(2);
+})()}
 
 === 12-MONTH PROJECTED CASH FLOW (Reserve Fund) ===
 Starting Reserve Cash: $${data.reserveCash.toFixed(2)}
@@ -993,7 +1089,7 @@ Projected Year-End Total (Cash + Investments): $${data.projectedYearEndTotal.toF
 const requestBody = {
   model,
   messages,
-  temperature: 0.3,
+  temperature: 0.1,
   max_tokens: 2048,
 };
 
@@ -1019,7 +1115,7 @@ Projected Year-End Total (Cash + Investments): $${data.projectedYearEndTotal.toF
     'Content-Type': 'application/json',
     'Content-Length': Buffer.byteLength(bodyString, 'utf-8'),
   },
-  timeout: 120000,
+  timeout: 600000, // 10 minute timeout
 };
 
 const req = https.request(options, (res) => {
@@ -1033,7 +1129,7 @@ Projected Year-End Total (Cash + Investments): $${data.projectedYearEndTotal.toF
 req.on('error', (err) => reject(err));
 req.on('timeout', () => {
   req.destroy();
-  reject(new Error('Request timed out after 120s'));
+  reject(new Error('Request timed out after 600s'));
 });
 
 req.write(bodyString);
@@ -36,9 +36,9 @@ export class InvestmentPlanningController {
   }
 
   @Post('recommendations')
-  @ApiOperation({ summary: 'Get AI-powered investment recommendations' })
+  @ApiOperation({ summary: 'Trigger AI-powered investment recommendations (async — returns immediately)' })
   @AllowViewer()
-  getRecommendations(@Req() req: any) {
-    return this.service.getAIRecommendations(req.user?.sub, req.user?.orgId);
+  triggerRecommendations(@Req() req: any) {
+    return this.service.triggerAIRecommendations(req.user?.sub, req.user?.orgId);
   }
 }
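Because the POST endpoint now returns as soon as the job is queued, a client has to poll until the saved recommendation's `status` leaves `'processing'`. A hypothetical polling helper (how the status is fetched, route, auth, is an assumption; the diff only defines the trigger endpoint and the `status` field on `SavedRecommendation`):

```typescript
// Polls a status-returning function until the background job leaves
// 'processing', then resolves with the final record ('complete' or 'error').
async function waitForRecommendation<T extends { status: string }>(
  fetchStatus: () => Promise<T>,
  intervalMs = 5000,
  maxAttempts = 120,
): Promise<T> {
  for (let i = 0; i < maxAttempts; i++) {
    const rec = await fetchStatus();
    if (rec.status !== 'processing') return rec;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error('Timed out waiting for AI recommendation');
}
```

In practice `fetchStatus` would GET whatever route serves `getSavedRecommendation()` for the tenant; that route is not part of this hunk.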
@@ -65,6 +65,9 @@ export interface SavedRecommendation {
   risk_notes: string[];
   response_time_ms: number;
   created_at: string;
+  status: 'processing' | 'complete' | 'error';
+  last_failed: boolean;
+  error_message?: string;
 }
 
 @Injectable()
@@ -196,14 +199,33 @@ export class InvestmentPlanningService {
     return rates.cd;
   }
 
+  /**
+   * Ensure the status/error_message columns exist (for tenants created before this migration).
+   */
+  private async ensureStatusColumn(): Promise<void> {
+    try {
+      await this.tenant.query(
+        `ALTER TABLE ai_recommendations ADD COLUMN IF NOT EXISTS status VARCHAR(20) DEFAULT 'complete'`,
+      );
+      await this.tenant.query(
+        `ALTER TABLE ai_recommendations ADD COLUMN IF NOT EXISTS error_message TEXT`,
+      );
+    } catch {
+      // Ignore — column may already exist or table may not exist
+    }
+  }
+
   /**
    * Get the latest saved AI recommendation for this tenant.
+   * Returns status and last_failed flag for UI state management.
    */
   async getSavedRecommendation(): Promise<SavedRecommendation | null> {
     try {
+      await this.ensureStatusColumn();
+
       const rows = await this.tenant.query(
         `SELECT id, recommendations_json, overall_assessment, risk_notes,
-                response_time_ms, created_at
+                response_time_ms, status, error_message, created_at
          FROM ai_recommendations
          ORDER BY created_at DESC
          LIMIT 1`,
@@ -212,6 +234,64 @@ export class InvestmentPlanningService {
       if (!rows || rows.length === 0) return null;
 
       const row = rows[0];
+      const status = row.status || 'complete';
+
+      // If still processing, return processing status
+      if (status === 'processing') {
+        return {
+          id: row.id,
+          recommendations: [],
+          overall_assessment: '',
+          risk_notes: [],
+          response_time_ms: 0,
+          created_at: row.created_at,
+          status: 'processing',
+          last_failed: false,
+        };
+      }
+
+      // If latest attempt failed, return the last successful result with last_failed flag
+      if (status === 'error') {
+        const lastGood = await this.tenant.query(
+          `SELECT id, recommendations_json, overall_assessment, risk_notes,
+                  response_time_ms, created_at
+           FROM ai_recommendations
+           WHERE status = 'complete'
+           ORDER BY created_at DESC
+           LIMIT 1`,
+        );
+
+        if (lastGood?.length) {
+          const goodRow = lastGood[0];
+          const recData = goodRow.recommendations_json || {};
+          return {
+            id: goodRow.id,
+            recommendations: recData.recommendations || [],
+            overall_assessment: goodRow.overall_assessment || recData.overall_assessment || '',
+            risk_notes: goodRow.risk_notes || recData.risk_notes || [],
+            response_time_ms: goodRow.response_time_ms || 0,
+            created_at: goodRow.created_at,
+            status: 'complete',
+            last_failed: true,
+            error_message: row.error_message,
+          };
+        }
+
+        // No previous good result — return error state
+        return {
+          id: row.id,
+          recommendations: [],
+          overall_assessment: row.error_message || 'AI analysis failed. Please try again.',
+          risk_notes: [],
+          response_time_ms: 0,
+          created_at: row.created_at,
+          status: 'error',
+          last_failed: true,
+          error_message: row.error_message,
+        };
+      }
+
+      // Complete — return the data normally
       const recData = row.recommendations_json || {};
       return {
         id: row.id,
@@ -220,6 +300,8 @@ export class InvestmentPlanningService {
         risk_notes: row.risk_notes || recData.risk_notes || [],
         response_time_ms: row.response_time_ms || 0,
         created_at: row.created_at,
+        status: 'complete',
+        last_failed: false,
       };
     } catch (err: any) {
       // Table might not exist yet (pre-migration tenants)
@@ -228,15 +310,153 @@ export class InvestmentPlanningService {
     }
   }
 
+  /**
+   * Save a 'processing' placeholder record and return its ID.
+   */
+  private async saveProcessingRecord(userId?: string): Promise<string> {
+    await this.ensureStatusColumn();
+    const rows = await this.tenant.query(
+      `INSERT INTO ai_recommendations
+         (recommendations_json, overall_assessment, risk_notes, requested_by, status)
+       VALUES ('{}', '', '[]', $1, 'processing')
+       RETURNING id`,
+      [userId || null],
+    );
+    return rows[0].id;
+  }
+
+  /**
+   * Update a processing record with completed results.
+   */
+  private async updateRecommendationComplete(
+    jobId: string,
+    aiResponse: AIResponse,
+    userId: string | undefined,
+    elapsed: number,
+  ): Promise<void> {
+    try {
+      await this.tenant.query(
+        `UPDATE ai_recommendations
+         SET recommendations_json = $1,
+             overall_assessment = $2,
+             risk_notes = $3,
+             response_time_ms = $4,
+             status = 'complete'
+         WHERE id = $5`,
+        [
+          JSON.stringify(aiResponse),
+          aiResponse.overall_assessment || '',
+          JSON.stringify(aiResponse.risk_notes || []),
+          elapsed,
+          jobId,
+        ],
+      );
+    } catch (err: any) {
+      this.logger.warn(`Could not update recommendation ${jobId}: ${err.message}`);
+    }
+  }
+
+  /**
+   * Update a processing record with error status.
+   */
+  private async updateRecommendationError(jobId: string, errorMessage: string): Promise<void> {
+    try {
+      await this.tenant.query(
+        `UPDATE ai_recommendations
+         SET status = 'error',
+             error_message = $1
+         WHERE id = $2`,
+        [errorMessage, jobId],
+      );
+    } catch (err: any) {
+      this.logger.warn(`Could not update recommendation error ${jobId}: ${err.message}`);
+    }
+  }
+
+  /**
+   * Trigger AI recommendations asynchronously.
+   * Saves a 'processing' record, starts the AI work in the background, and returns immediately.
+   * The TenantService instance remains alive via closure reference for the duration of the background work.
+   */
+  async triggerAIRecommendations(userId?: string, orgId?: string): Promise<{ status: string; message: string }> {
+    const jobId = await this.saveProcessingRecord(userId);
+    this.logger.log(`AI recommendation triggered (job ${jobId}), starting background processing...`);
+
+    // Fire-and-forget — the Promise keeps this service instance (and TenantService) alive
+    this.runBackgroundRecommendations(jobId, userId, orgId).catch((err) => {
+      this.logger.error(`Background AI recommendation failed (job ${jobId}): ${err.message}`);
+    });
+
+    return {
+      status: 'processing',
+      message: 'AI analysis has been started. You can navigate away safely — results will appear when ready.',
+    };
+  }
+
+  /**
+   * Run the full AI recommendation pipeline in the background.
+   */
+  private async runBackgroundRecommendations(jobId: string, userId?: string, orgId?: string): Promise<void> {
+    try {
+      const startTime = Date.now();
+
+      const [snapshot, allRates, monthlyForecast] = await Promise.all([
+        this.getFinancialSnapshot(),
+        this.getMarketRates(),
+        this.getMonthlyForecast(),
+      ]);
+
+      this.debug('background_snapshot_summary', {
+        job_id: jobId,
+        operating_cash: snapshot.summary.operating_cash,
+        reserve_cash: snapshot.summary.reserve_cash,
+        total_all: snapshot.summary.total_all,
+        investment_accounts: snapshot.investment_accounts.length,
+      });
+
+      const messages = this.buildPromptMessages(snapshot, allRates, monthlyForecast);
+      const aiResponse = await this.callAI(messages);
+      const elapsed = Date.now() - startTime;
+
+      this.debug('background_final_response', {
+        job_id: jobId,
+        recommendation_count: aiResponse.recommendations.length,
+        has_assessment: !!aiResponse.overall_assessment,
+        elapsed_ms: elapsed,
+      });
+
+      // Check if the AI returned a graceful error (empty recommendations with error message)
+      const isGracefulError = aiResponse.recommendations.length === 0 &&
+        (aiResponse.overall_assessment?.includes('Unable to generate') ||
+         aiResponse.overall_assessment?.includes('invalid response'));
+
+      if (isGracefulError) {
+        await this.updateRecommendationError(jobId, aiResponse.overall_assessment);
+      } else {
+        await this.updateRecommendationComplete(jobId, aiResponse, userId, elapsed);
+      }
+
+      // Log AI usage (fire-and-forget)
+      this.logAIUsage(userId, orgId, aiResponse, elapsed).catch(() => {});
+
+      this.logger.log(`Background AI recommendation completed (job ${jobId}) in ${elapsed}ms`);
+    } catch (err: any) {
+      this.logger.error(`Background AI recommendation error (job ${jobId}): ${err.message}`);
+      await this.updateRecommendationError(jobId, err.message);
+    }
+  }
+
   /**
    * Save AI recommendation result to tenant schema.
+   * @deprecated Use triggerAIRecommendations() for async flow instead
    */
   private async saveRecommendation(aiResponse: AIResponse, userId: string | undefined, elapsed: number): Promise<void> {
     try {
+      await this.ensureStatusColumn();
       await this.tenant.query(
         `INSERT INTO ai_recommendations
-           (recommendations_json, overall_assessment, risk_notes, requested_by, response_time_ms)
-         VALUES ($1, $2, $3, $4, $5)`,
+           (recommendations_json, overall_assessment, risk_notes, requested_by, response_time_ms, status)
+         VALUES ($1, $2, $3, $4, $5, 'complete')`,
         [
           JSON.stringify(aiResponse),
           aiResponse.overall_assessment || '',
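The trigger/background split above follows a common job pattern: persist a placeholder row, start the work without awaiting it, then record completion or failure on that row. Reduced to its skeleton (the names here are generic illustrations, not the service's own API):

```typescript
// Generic fire-and-forget job runner. The promise returned by run() is
// deliberately not awaited by the caller; the .catch() routes any failure
// to onError and prevents an unhandled rejection.
async function triggerJob(
  createRecord: () => Promise<string>,
  run: (jobId: string) => Promise<void>,
  onError: (jobId: string, msg: string) => Promise<void>,
): Promise<{ status: string; jobId: string }> {
  const jobId = await createRecord();
  run(jobId).catch((err) => onError(jobId, err.message));
  return { status: 'processing', jobId };
}
```

One design note visible in the diff: the background promise also keeps the per-tenant service instance reachable through its closure, which is what makes returning before completion safe here.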
@@ -873,7 +1093,7 @@ Based on this complete financial picture INCLUDING the 12-month cash flow foreca
     'Content-Type': 'application/json',
     'Content-Length': Buffer.byteLength(bodyString, 'utf-8'),
   },
-  timeout: 300000, // 5 minute timeout
+  timeout: 600000, // 10 minute timeout
 };
 
 const req = https.request(options, (res) => {
@@ -887,7 +1107,7 @@ Based on this complete financial picture INCLUDING the 12-month cash flow foreca
 req.on('error', (err) => reject(err));
 req.on('timeout', () => {
   req.destroy();
-  reject(new Error(`Request timed out after 300s`));
+  reject(new Error(`Request timed out after 600s`));
 });
 
 req.write(bodyString);
@@ -16,6 +16,11 @@ export class InvoicesController {
   @Get(':id')
   findOne(@Param('id') id: string) { return this.invoicesService.findOne(id); }
 
+  @Post('generate-preview')
+  generatePreview(@Body() dto: { month: number; year: number }) {
+    return this.invoicesService.generatePreview(dto);
+  }
+
   @Post('generate-bulk')
   generateBulk(@Body() dto: { month: number; year: number }, @Request() req: any) {
     return this.invoicesService.generateBulk(dto, req.user.sub);
@@ -1,33 +1,135 @@
 import { Injectable, NotFoundException, BadRequestException } from '@nestjs/common';
 import { TenantService } from '../../database/tenant.service';
 
+const MONTH_NAMES = [
+  '', 'January', 'February', 'March', 'April', 'May', 'June',
+  'July', 'August', 'September', 'October', 'November', 'December',
+];
+
+const MONTH_ABBREV = [
+  '', 'Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun',
+  'Jul', 'Aug', 'Sep', 'Oct', 'Nov', 'Dec',
+];
+
 @Injectable()
 export class InvoicesService {
   constructor(private tenant: TenantService) {}
 
   async findAll() {
     return this.tenant.query(`
-      SELECT i.*, u.unit_number,
+      SELECT i.*, u.unit_number, u.owner_name, ag.name as assessment_group_name, ag.frequency,
              (i.amount - i.amount_paid) as balance_due
       FROM invoices i
       JOIN units u ON u.id = i.unit_id
+      LEFT JOIN assessment_groups ag ON ag.id = i.assessment_group_id
       ORDER BY i.invoice_date DESC, i.invoice_number DESC
     `);
   }
 
   async findOne(id: string) {
     const rows = await this.tenant.query(`
-      SELECT i.*, u.unit_number FROM invoices i
+      SELECT i.*, u.unit_number, u.owner_name FROM invoices i
       JOIN units u ON u.id = i.unit_id WHERE i.id = $1`, [id]);
     if (!rows.length) throw new NotFoundException('Invoice not found');
     return rows[0];
   }
 
-  async generateBulk(dto: { month: number; year: number }, userId: string) {
-    const units = await this.tenant.query(
-      `SELECT * FROM units WHERE status = 'active' AND monthly_assessment > 0`,
+  /**
+   * Calculate billing period based on frequency and the billing month.
+   */
+  private calculatePeriod(frequency: string, month: number, year: number): { start: string; end: string; description: string } {
+    switch (frequency) {
+      case 'quarterly': {
+        // Period covers 3 months starting from the billing month
+        const startDate = new Date(year, month - 1, 1);
+        const endDate = new Date(year, month + 2, 0); // last day of month+2
+        const endMonth = month + 2 > 12 ? month + 2 - 12 : month + 2;
+        const quarter = Math.ceil(month / 3);
+        return {
+          start: startDate.toISOString().split('T')[0],
+          end: endDate.toISOString().split('T')[0],
+          description: `Q${quarter} ${year} Assessment (${MONTH_ABBREV[month]}-${MONTH_ABBREV[endMonth]})`,
+        };
+      }
+      case 'annual': {
+        const startDate = new Date(year, 0, 1);
+        const endDate = new Date(year, 11, 31);
+        return {
+          start: startDate.toISOString().split('T')[0],
+          end: endDate.toISOString().split('T')[0],
+          description: `Annual Assessment ${year}`,
+        };
+      }
+      default: { // monthly
+        const startDate = new Date(year, month - 1, 1);
+        const endDate = new Date(year, month, 0); // last day of month
+        return {
+          start: startDate.toISOString().split('T')[0],
+          end: endDate.toISOString().split('T')[0],
+          description: `Monthly Assessment - ${MONTH_NAMES[month]} ${year}`,
+        };
+      }
+    }
+  }
+
+  /**
+   * Preview which groups/units will be billed for a given month/year.
+   */
+  async generatePreview(dto: { month: number; year: number }) {
+    const allGroups = await this.tenant.query(
+      `SELECT ag.*, (SELECT COUNT(*) FROM units u WHERE u.assessment_group_id = ag.id AND u.status = 'active') as active_units
+       FROM assessment_groups ag WHERE ag.is_active = true ORDER BY ag.name`,
     );
-    if (!units.length) throw new BadRequestException('No active units with assessments found');
+
+    const groups = allGroups.map((g: any) => {
+      const dueMonths: number[] = g.due_months || [1,2,3,4,5,6,7,8,9,10,11,12];
+      const isBillingMonth = dueMonths.includes(dto.month);
+      const activeUnits = parseInt(g.active_units || '0');
+      const totalAmount = isBillingMonth
+        ? (parseFloat(g.regular_assessment) + parseFloat(g.special_assessment || '0')) * activeUnits
+        : 0;
+      const period = this.calculatePeriod(g.frequency || 'monthly', dto.month, dto.year);
+
+      return {
+        id: g.id,
+        name: g.name,
+        frequency: g.frequency || 'monthly',
+        due_months: dueMonths,
+        active_units: activeUnits,
+        regular_assessment: g.regular_assessment,
+        special_assessment: g.special_assessment,
+        is_billing_month: isBillingMonth,
+        total_amount: totalAmount,
+        period_description: period.description,
+      };
+    });
+
+    const billableGroups = groups.filter((g: any) => g.is_billing_month && g.active_units > 0);
+    const totalInvoices = billableGroups.reduce((sum: number, g: any) => sum + g.active_units, 0);
+    const totalAmount = billableGroups.reduce((sum: number, g: any) => sum + g.total_amount, 0);
+
+    return {
+      month: dto.month,
+      year: dto.year,
+      month_name: MONTH_NAMES[dto.month],
+      groups,
+      summary: { total_groups_billing: billableGroups.length, total_invoices: totalInvoices, total_amount: totalAmount },
+    };
+  }
+
+  /**
+   * Generate invoices for all assessment groups where the given month is a billing month.
+   */
+  async generateBulk(dto: { month: number; year: number }, userId: string) {
+    // Get assessment groups where this month is a billing month
+    const groups = await this.tenant.query(
+      `SELECT * FROM assessment_groups WHERE is_active = true AND $1 = ANY(due_months)`,
+      [dto.month],
+    );
+
+    if (!groups.length) {
+      throw new BadRequestException(`No assessment groups have billing scheduled for ${MONTH_NAMES[dto.month]}`);
+    }
 
     // Get or create fiscal period
     let fp = await this.tenant.query(
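`calculatePeriod` leans on the JavaScript `Date` convention that day 0 of month index m resolves to the last day of month m−1, so `new Date(year, month + 2, 0)` (with the service's 1-based `month`) lands on the last day of the quarter's third month, including year rollover. A standalone check of that convention (this sketch formats with local date parts rather than `toISOString()`, so it is a check of the Date arithmetic only, not of the exact strings the service emits):

```typescript
// Quarter bounds for a 1-based starting month, using the same Date
// arithmetic as calculatePeriod's 'quarterly' branch.
function quarterBounds(month: number, year: number): { start: string; end: string } {
  const fmt = (d: Date) =>
    `${d.getFullYear()}-${String(d.getMonth() + 1).padStart(2, '0')}-${String(d.getDate()).padStart(2, '0')}`;
  return {
    start: fmt(new Date(year, month - 1, 1)),      // first day of the billing month
    end: fmt(new Date(year, month + 2, 0)),        // day 0 => last day of month + 2
  };
}

console.log(quarterBounds(1, 2025));  // { start: '2025-01-01', end: '2025-03-31' }
console.log(quarterBounds(11, 2025)); // a quarter starting in Nov rolls into the next year
```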
@@ -41,50 +143,87 @@ export class InvoicesService {
     }
     const fiscalPeriodId = fp[0].id;
 
-    const invoiceDate = new Date(dto.year, dto.month - 1, 1);
-    const dueDate = new Date(dto.year, dto.month - 1, 15);
+    // Look up GL accounts once
+    const arAccount = await this.tenant.query(`SELECT id FROM accounts WHERE account_number = '1200'`);
+    const incomeAccount = await this.tenant.query(`SELECT id FROM accounts WHERE account_number = '4000'`);
 
     let created = 0;
+    const groupResults: any[] = [];
 
-    for (const unit of units) {
-      const invNum = `INV-${dto.year}${String(dto.month).padStart(2, '0')}-${unit.unit_number}`;
-
-      // Check if already generated
-      const existing = await this.tenant.query(
-        'SELECT id FROM invoices WHERE invoice_number = $1', [invNum],
-      );
-      if (existing.length) continue;
-
-      // Create the invoice
-      const inv = await this.tenant.query(
-        `INSERT INTO invoices (invoice_number, unit_id, invoice_date, due_date, invoice_type, description, amount, status)
-         VALUES ($1, $2, $3, $4, 'regular_assessment', $5, $6, 'sent') RETURNING id`,
-        [invNum, unit.id, invoiceDate.toISOString().split('T')[0], dueDate.toISOString().split('T')[0],
-         `Monthly assessment - ${new Date(dto.year, dto.month - 1).toLocaleString('default', { month: 'long', year: 'numeric' })}`,
-         unit.monthly_assessment],
+    for (const group of groups) {
+      // Get active units in this assessment group
+      const units = await this.tenant.query(
+        `SELECT * FROM units WHERE status = 'active' AND assessment_group_id = $1`,
+        [group.id],
       );
 
-      // Create journal entry: DR Accounts Receivable, CR Assessment Income
-      const arAccount = await this.tenant.query(`SELECT id FROM accounts WHERE account_number = '1200'`);
-      const incomeAccount = await this.tenant.query(`SELECT id FROM accounts WHERE account_number = '4000'`);
-
-      if (arAccount.length && incomeAccount.length) {
-        const je = await this.tenant.query(
-          `INSERT INTO journal_entries (entry_date, description, entry_type, fiscal_period_id, source_type, source_id, is_posted, posted_at, created_by)
-           VALUES ($1, $2, 'assessment', $3, 'invoice', $4, true, NOW(), $5) RETURNING id`,
-          [invoiceDate.toISOString().split('T')[0], `Assessment - Unit ${unit.unit_number}`, fiscalPeriodId, inv[0].id, userId],
+      if (!units.length) continue;
+
+      const frequency = group.frequency || 'monthly';
+      const period = this.calculatePeriod(frequency, dto.month, dto.year);
+      const dueDay = Math.min(group.due_day || 1, 28);
+      const invoiceDate = new Date(dto.year, dto.month - 1, 1);
+      const dueDate = new Date(dto.year, dto.month - 1, dueDay);
+
+      // Use the group's assessment amount (full period amount, not monthly equivalent)
+      const assessmentAmount = parseFloat(group.regular_assessment) + parseFloat(group.special_assessment || '0');
+
+      let groupCreated = 0;
|
for (const unit of units) {
|
||||||
|
const invNum = `INV-${dto.year}${String(dto.month).padStart(2, '0')}-${unit.unit_number}`;
|
||||||
|
|
||||||
|
// Check if already generated
|
||||||
|
const existing = await this.tenant.query(
|
||||||
|
'SELECT id FROM invoices WHERE invoice_number = $1', [invNum],
|
||||||
);
|
);
|
||||||
await this.tenant.query(
|
if (existing.length) continue;
|
||||||
`INSERT INTO journal_entry_lines (journal_entry_id, account_id, debit, credit) VALUES ($1, $2, $3, 0), ($1, $4, 0, $3)`,
|
|
||||||
[je[0].id, arAccount[0].id, unit.monthly_assessment, incomeAccount[0].id],
|
// Use unit-level override if set, otherwise use group amount
|
||||||
);
|
const unitAmount = unit.monthly_assessment && parseFloat(unit.monthly_assessment) > 0
|
||||||
await this.tenant.query(
|
? (frequency === 'monthly'
|
||||||
`UPDATE invoices SET journal_entry_id = $1 WHERE id = $2`, [je[0].id, inv[0].id],
|
? parseFloat(unit.monthly_assessment)
|
||||||
|
: frequency === 'quarterly'
|
||||||
|
? parseFloat(unit.monthly_assessment) * 3
|
||||||
|
: parseFloat(unit.monthly_assessment) * 12)
|
||||||
|
: assessmentAmount;
|
||||||
|
|
||||||
|
// Create the invoice with status 'pending' (no email sending capability)
|
||||||
|
const inv = await this.tenant.query(
|
||||||
|
`INSERT INTO invoices (invoice_number, unit_id, invoice_date, due_date, invoice_type, description, amount, status, period_start, period_end, assessment_group_id)
|
||||||
|
VALUES ($1, $2, $3, $4, 'regular_assessment', $5, $6, 'pending', $7, $8, $9) RETURNING id`,
|
||||||
|
[invNum, unit.id, invoiceDate.toISOString().split('T')[0], dueDate.toISOString().split('T')[0],
|
||||||
|
period.description, unitAmount, period.start, period.end, group.id],
|
||||||
);
|
);
|
||||||
|
|
||||||
|
// Create journal entry: DR Accounts Receivable, CR Assessment Income
|
||||||
|
if (arAccount.length && incomeAccount.length) {
|
||||||
|
const je = await this.tenant.query(
|
||||||
|
`INSERT INTO journal_entries (entry_date, description, entry_type, fiscal_period_id, source_type, source_id, is_posted, posted_at, created_by)
|
||||||
|
VALUES ($1, $2, 'assessment', $3, 'invoice', $4, true, NOW(), $5) RETURNING id`,
|
||||||
|
[invoiceDate.toISOString().split('T')[0], `Assessment - Unit ${unit.unit_number}`, fiscalPeriodId, inv[0].id, userId],
|
||||||
|
);
|
||||||
|
await this.tenant.query(
|
||||||
|
`INSERT INTO journal_entry_lines (journal_entry_id, account_id, debit, credit) VALUES ($1, $2, $3, 0), ($1, $4, 0, $3)`,
|
||||||
|
[je[0].id, arAccount[0].id, unitAmount, incomeAccount[0].id],
|
||||||
|
);
|
||||||
|
await this.tenant.query(
|
||||||
|
`UPDATE invoices SET journal_entry_id = $1 WHERE id = $2`, [je[0].id, inv[0].id],
|
||||||
|
);
|
||||||
|
}
|
||||||
|
created++;
|
||||||
|
groupCreated++;
|
||||||
}
|
}
|
||||||
created++;
|
|
||||||
|
groupResults.push({
|
||||||
|
group_name: group.name,
|
||||||
|
frequency,
|
||||||
|
period: period.description,
|
||||||
|
invoices_created: groupCreated,
|
||||||
|
});
|
||||||
}
|
}
|
||||||
|
|
||||||
return { created, month: dto.month, year: dto.year };
|
return { created, month: dto.month, year: dto.year, groups: groupResults };
|
||||||
}
|
}
|
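The diff calls `this.calculatePeriod(frequency, month, year)` but that helper lies outside the changed hunks. A minimal standalone sketch of what it plausibly computes, assuming monthly, quarterly, and annual spans starting at the billing month (the real implementation may differ; `calculatePeriod` here is reconstructed, not copied):

```typescript
// Hypothetical sketch of the calculatePeriod() helper referenced in generateBulk().
// Assumption: monthly = 1 month, quarterly = 3 months, annual = 12 months,
// with the billing month as the period start and the last day of the final
// month as the period end.
function calculatePeriod(frequency: string, month: number, year: number) {
  const span = frequency === 'annual' ? 12 : frequency === 'quarterly' ? 3 : 1;
  const start = new Date(Date.UTC(year, month - 1, 1));
  // Day 0 of the following month is the last day of the period's final month
  const end = new Date(Date.UTC(year, month - 1 + span, 0));
  const fmt = (d: Date) => d.toISOString().split('T')[0];
  return {
    start: fmt(start),
    end: fmt(end),
    description: `Assessment - ${fmt(start)} to ${fmt(end)}`,
  };
}
```

For example, `calculatePeriod('quarterly', 4, 2026)` spans April 1 through June 30, 2026.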
**InvoicesService — `applyLateFees()`: the overdue lookup and the late-fee insert now use `'pending'` where they previously used `'sent'`:**

```typescript
async applyLateFees(dto: { grace_period_days: number; late_fee_amount: number }, userId: string) {
  // ... @@ -95,7 +234,7 @@: unchanged lines elided ...
  const overdue = await this.tenant.query(`
    SELECT i.*, u.unit_number FROM invoices i
    JOIN units u ON u.id = i.unit_id
    WHERE i.status IN ('pending', 'partial') AND i.due_date < $1
    AND NOT EXISTS (
      SELECT 1 FROM invoices lf WHERE lf.unit_id = i.unit_id
      AND lf.invoice_type = 'late_fee' AND lf.description LIKE '%' || i.invoice_number || '%'
  // ... @@ -109,7 +248,7 @@: unchanged lines elided ...
    const lfNum = `LF-${inv.invoice_number}`;
    await this.tenant.query(
      `INSERT INTO invoices (invoice_number, unit_id, invoice_date, due_date, invoice_type, description, amount, status)
       VALUES ($1, $2, CURRENT_DATE, CURRENT_DATE + INTERVAL '15 days', 'late_fee', $3, $4, 'pending')`,
      [lfNum, inv.unit_id, `Late fee for invoice ${inv.invoice_number}`, dto.late_fee_amount],
    );
    applied++;
```
**OrganizationsService (@@ -153,6 +153,14 @@): when an existing user is added to a new org, an optional password update was added:**

```typescript
      existing.role = data.role;
      return this.userOrgRepository.save(existing);
    }

    // Update password for existing user being added to a new org
    if (data.password) {
      const passwordHash = await bcrypt.hash(data.password, 12);
      await dataSource.query(
        `UPDATE shared.users SET password_hash = $1 WHERE id = $2`,
        [passwordHash, userId],
      );
    }
  } else {
    // Create new user
    const passwordHash = await bcrypt.hash(data.password, 12);
```
**PaymentsController: new `PUT /payments/:id` and `DELETE /payments/:id` endpoints (imports gain `Put` and `Delete`):**

```typescript
import { Controller, Get, Post, Put, Delete, Body, Param, UseGuards, Request } from '@nestjs/common';
import { ApiTags, ApiBearerAuth } from '@nestjs/swagger';
import { JwtAuthGuard } from '../auth/guards/jwt-auth.guard';
import { PaymentsService } from './payments.service';

// ... @@ -18,4 +18,12 @@: unchanged lines elided ...

  @Post()
  create(@Body() dto: any, @Request() req: any) { return this.paymentsService.create(dto, req.user.sub); }

  @Put(':id')
  update(@Param('id') id: string, @Body() dto: any, @Request() req: any) {
    return this.paymentsService.update(id, dto, req.user.sub);
  }

  @Delete(':id')
  delete(@Param('id') id: string) { return this.paymentsService.delete(id); }
}
```
**PaymentsService (@@ -74,17 +74,95 @@): the invoice-update query now casts the status parameter explicitly (PostgreSQL could not infer the type of the bare parameter inside the `CASE` expression), and `update()`, `delete()`, and `recalculateInvoice()` were added:**

```typescript
  await this.tenant.query(`UPDATE payments SET journal_entry_id = $1 WHERE id = $2`, [je[0].id, payment[0].id]);
  }

  // Update invoice if linked — use explicit cast to avoid PostgreSQL type inference error
  if (invoice) {
    const newPaid = parseFloat(invoice.amount_paid) + parseFloat(dto.amount);
    const invoiceAmt = parseFloat(invoice.amount);
    const newStatus = newPaid >= invoiceAmt ? 'paid' : 'partial';
    await this.tenant.query(
      `UPDATE invoices SET amount_paid = $1, status = $2::VARCHAR, paid_at = CASE WHEN $3::VARCHAR = 'paid' THEN NOW() ELSE paid_at END, updated_at = NOW() WHERE id = $4`,
      [newPaid, newStatus, newStatus, invoice.id],
    );
  }

  return payment[0];
}

async update(id: string, dto: any, userId: string) {
  const existing = await this.findOne(id);

  const sets: string[] = [];
  const params: any[] = [];
  let idx = 1;

  if (dto.payment_date !== undefined) { sets.push(`payment_date = $${idx++}`); params.push(dto.payment_date); }
  if (dto.amount !== undefined) { sets.push(`amount = $${idx++}`); params.push(dto.amount); }
  if (dto.payment_method !== undefined) { sets.push(`payment_method = $${idx++}`); params.push(dto.payment_method); }
  if (dto.reference_number !== undefined) { sets.push(`reference_number = $${idx++}`); params.push(dto.reference_number); }
  if (dto.notes !== undefined) { sets.push(`notes = $${idx++}`); params.push(dto.notes); }

  if (!sets.length) return this.findOne(id);

  params.push(id);
  await this.tenant.query(
    `UPDATE payments SET ${sets.join(', ')} WHERE id = $${idx} RETURNING *`,
    params,
  );

  // If amount changed and payment is linked to an invoice, recalculate invoice totals
  if (dto.amount !== undefined && existing.invoice_id) {
    await this.recalculateInvoice(existing.invoice_id);
  }

  return this.findOne(id);
}

async delete(id: string) {
  const payment = await this.findOne(id);
  const invoiceId = payment.invoice_id;

  // Delete associated journal entry lines and journal entry
  if (payment.journal_entry_id) {
    await this.tenant.query('DELETE FROM journal_entry_lines WHERE journal_entry_id = $1', [payment.journal_entry_id]);
    await this.tenant.query('DELETE FROM journal_entries WHERE id = $1', [payment.journal_entry_id]);
  }

  // Delete the payment
  await this.tenant.query('DELETE FROM payments WHERE id = $1', [id]);

  // Recalculate invoice totals if payment was linked
  if (invoiceId) {
    await this.recalculateInvoice(invoiceId);
  }

  return { success: true };
}

private async recalculateInvoice(invoiceId: string) {
  // Sum all remaining payments for this invoice
  const result = await this.tenant.query(
    'SELECT COALESCE(SUM(amount), 0) as total_paid FROM payments WHERE invoice_id = $1',
    [invoiceId],
  );
  const totalPaid = parseFloat(result[0].total_paid);

  // Get the invoice amount
  const inv = await this.tenant.query('SELECT amount FROM invoices WHERE id = $1', [invoiceId]);
  if (!inv.length) return;

  const invoiceAmt = parseFloat(inv[0].amount);
  let newStatus: string;
  if (totalPaid >= invoiceAmt) {
    newStatus = 'paid';
  } else if (totalPaid > 0) {
    newStatus = 'partial';
  } else {
    newStatus = 'pending';
  }

  await this.tenant.query(
    `UPDATE invoices SET amount_paid = $1, status = $2::VARCHAR, paid_at = CASE WHEN $3::VARCHAR = 'paid' THEN NOW() ELSE NULL END, updated_at = NOW() WHERE id = $4`,
    [totalPaid, newStatus, newStatus, invoiceId],
  );
}
```
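The status derivation inside `recalculateInvoice()` can be isolated as a pure function, which makes the paid/partial/pending boundaries easy to unit-test. A sketch mirroring that logic (`deriveStatus` is a name invented here, not part of the codebase):

```typescript
// Mirrors the status logic in recalculateInvoice(): totals at or above the
// invoice amount are 'paid', anything positive below it is 'partial',
// and zero remaining payments fall back to 'pending'.
function deriveStatus(totalPaid: number, invoiceAmount: number): 'paid' | 'partial' | 'pending' {
  if (totalPaid >= invoiceAmount) return 'paid';
  if (totalPaid > 0) return 'partial';
  return 'pending';
}
```

Note that overpayment still maps to `'paid'`, matching the `>=` comparison in the service.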
**db/migrations/011-invoice-billing-frequency.sql** (new file, 57 lines). Note the backfill `WHERE` clause needs parentheses: without them, `AND` binds tighter than `OR` and rows with `due_months IS NULL` would be updated regardless of frequency.

```sql
-- Migration 011: Add billing frequency support to invoices
-- Adds due_months and due_day to assessment_groups
-- Adds period_start, period_end, assessment_group_id to invoices

DO $$
DECLARE
  v_schema TEXT;
BEGIN
  FOR v_schema IN
    SELECT schema_name FROM information_schema.schemata
    WHERE schema_name LIKE 'tenant_%'
  LOOP
    -- Add due_months and due_day to assessment_groups
    EXECUTE format('
      ALTER TABLE %I.assessment_groups
      ADD COLUMN IF NOT EXISTS due_months INTEGER[] DEFAULT ''{1,2,3,4,5,6,7,8,9,10,11,12}'',
      ADD COLUMN IF NOT EXISTS due_day INTEGER DEFAULT 1
    ', v_schema);

    -- Add period tracking and assessment group link to invoices
    EXECUTE format('
      ALTER TABLE %I.invoices
      ADD COLUMN IF NOT EXISTS period_start DATE,
      ADD COLUMN IF NOT EXISTS period_end DATE,
      ADD COLUMN IF NOT EXISTS assessment_group_id UUID
    ', v_schema);

    -- Backfill due_months based on existing frequency values
    EXECUTE format('
      UPDATE %I.assessment_groups
      SET due_months = CASE frequency
        WHEN ''quarterly'' THEN ''{1,4,7,10}''::INTEGER[]
        WHEN ''annual'' THEN ''{1}''::INTEGER[]
        ELSE ''{1,2,3,4,5,6,7,8,9,10,11,12}''::INTEGER[]
      END
      WHERE (due_months IS NULL OR due_months = ''{1,2,3,4,5,6,7,8,9,10,11,12}'')
      AND frequency != ''monthly''
    ', v_schema);

    -- Backfill period_start/period_end for existing invoices (all monthly)
    EXECUTE format('
      UPDATE %I.invoices
      SET period_start = invoice_date,
          period_end = (invoice_date + INTERVAL ''1 month'' - INTERVAL ''1 day'')::DATE
      WHERE period_start IS NULL AND invoice_type = ''regular_assessment''
    ', v_schema);

    -- Backfill assessment_group_id on existing invoices from units
    EXECUTE format('
      UPDATE %I.invoices i
      SET assessment_group_id = u.assessment_group_id
      FROM %I.units u
      WHERE i.unit_id = u.id AND i.assessment_group_id IS NULL
    ', v_schema, v_schema);

  END LOOP;
END $$;
```
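The backfill maps each legacy `frequency` value to a `due_months` array. The same mapping, sketched in TypeScript for use when creating new assessment groups on the application side (`defaultDueMonths` is an illustrative name, not from the codebase):

```typescript
// Default billing months per assessment frequency, matching the migration's
// CASE expression: quarterly bills in Jan/Apr/Jul/Oct, annual in January,
// and anything else every month.
function defaultDueMonths(frequency: string): number[] {
  switch (frequency) {
    case 'quarterly': return [1, 4, 7, 10];
    case 'annual': return [1];
    default: return [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12];
  }
}
```

Keeping this in one place avoids the app and the SQL backfill drifting apart.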
**db/migrations/012-invoice-status-pending.sql** (new file, 33 lines):

```sql
-- Migration 012: Replace 'sent' status with 'pending' for invoices
-- 'sent' implied email delivery which doesn't exist; 'pending' is more accurate

DO $$
DECLARE
  v_schema TEXT;
  v_constraint TEXT;
BEGIN
  FOR v_schema IN
    SELECT schema_name FROM information_schema.schemata
    WHERE schema_name LIKE 'tenant_%'
  LOOP
    -- Find and drop the existing status check constraint
    SELECT constraint_name INTO v_constraint
    FROM information_schema.table_constraints
    WHERE table_schema = v_schema
      AND table_name = 'invoices'
      AND constraint_type = 'CHECK'
      AND constraint_name LIKE '%status%';

    IF v_constraint IS NOT NULL THEN
      EXECUTE format('ALTER TABLE %I.invoices DROP CONSTRAINT %I', v_schema, v_constraint);
    END IF;

    -- Add new constraint that includes 'pending'
    EXECUTE format('ALTER TABLE %I.invoices ADD CONSTRAINT invoices_status_check CHECK (status IN (
      ''draft'', ''pending'', ''sent'', ''paid'', ''partial'', ''overdue'', ''void'', ''written_off''
    ))', v_schema);

    -- Convert existing 'sent' invoices to 'pending'
    EXECUTE format('UPDATE %I.invoices SET status = ''pending'' WHERE status = ''sent''', v_schema);
  END LOOP;
END $$;
```
**Tenant init SQL (@@ -204,7 +204,10 @@): `assessment_groups` gains `due_months`, `due_day`, and `is_default`:**

```sql
  special_assessment DECIMAL(10,2) DEFAULT 0.00,
  unit_count INTEGER DEFAULT 0,
  frequency VARCHAR(20) DEFAULT ''monthly'',
  due_months INTEGER[] DEFAULT ''{1,2,3,4,5,6,7,8,9,10,11,12}'',
  due_day INTEGER DEFAULT 1,
  is_active BOOLEAN DEFAULT TRUE,
  is_default BOOLEAN DEFAULT FALSE,
  created_at TIMESTAMPTZ DEFAULT NOW(),
  updated_at TIMESTAMPTZ DEFAULT NOW()
)', v_schema);
```
**Tenant init SQL (@@ -244,6 +247,9 @@): `invoices` gains `period_start`, `period_end`, and `assessment_group_id`:**

```sql
  amount DECIMAL(10,2) NOT NULL,
  amount_paid DECIMAL(10,2) DEFAULT 0.00,
  status VARCHAR(20) DEFAULT ''draft'',
  period_start DATE,
  period_end DATE,
  assessment_group_id UUID,
  journal_entry_id UUID,
  sent_at TIMESTAMPTZ,
  paid_at TIMESTAMPTZ,
```
**Seed data (@@ -443,10 +449,10 @@): assessment groups now seed `frequency`, `due_months`, and `due_day`; quarterly and annual amounts are full-period totals (previously the monthly equivalents 425.00 and 500.00/75.00):**

```sql
-- ============================================================
-- 4b. Seed Assessment Groups
-- ============================================================
EXECUTE format('INSERT INTO %I.assessment_groups (name, description, regular_assessment, special_assessment, unit_count, frequency, due_months, due_day) VALUES
  (''Single Family Homes'', ''Standard single family detached homes (Units 1-20)'', 350.00, 0.00, 20, ''monthly'', ''{1,2,3,4,5,6,7,8,9,10,11,12}'', 15),
  (''Patio Homes'', ''Medium-sized patio homes (Units 21-35)'', 1275.00, 0.00, 15, ''quarterly'', ''{1,4,7,10}'', 1),
  (''Estate Lots'', ''Large estate lots (Units 36-50)'', 6000.00, 900.00, 15, ''annual'', ''{3}'', 1)
', v_schema);
```
**docs/AI_FEATURE_AUDIT.md** (new file, 545 lines):
# AI Feature Audit Report

**Audit Date:** 2026-03-05
**Tenant Under Test:** Pine Creek HOA (`tenant_pine_creek_hoa_q33i`)
**AI Model:** Qwen 3.5-397B-A17B via NVIDIA NIM (Temperature: 0.3)
**Auditor:** Claude Opus 4.6 (automated)
**Data Snapshot Date:** 2026-03-04

---

## Executive Summary

Three AI-powered features were audited against ground-truth database records: **Operating Fund Health**, **Reserve Fund Health**, and **Investment Recommendations**. Overall, the AI demonstrates strong financial reasoning and produces actionable, fiduciary-appropriate recommendations. However, score consistency across runs is a concern (a 16-point spread on operating once the outlier first run is excluded, and a 23-point spread on reserve), and several specific data interpretation issues were identified.

| Feature | Latest Score/Grade | Concurrence | Verdict |
|---|---|---|---|
| Operating Fund Health | 88 / Good | **72%** | Score ~10-15 pts high; cash runway below its own "Good" threshold |
| Reserve Fund Health | 45 / Needs Attention | **85%** | Well-calibrated; minor data misquote on annual contributions |
| Investment Recommendations | 6 recommendations | **88%** | Excellent specificity; all market rates verified accurate |

---

## Data Foundation (Ground Truth)

### Financial Position

| Metric | Value | Source |
|---|---|---|
| Operating Cash (Checking) | $27,418.81 | GL balance |
| Reserve Cash (Savings) | $10,688.45 | GL balance |
| Reserve CD #1a (FCB) | $10,000 @ 3.67%, matures 6/19/26 | `investment_accounts` |
| Reserve CD #2a (FCB) | $8,000 @ 3.60%, matures 4/14/26 | `investment_accounts` |
| Reserve CD #3a (FCB) | $10,000 @ 3.67%, matures 8/18/26 | `investment_accounts` |
| Total Reserve Fund | $38,688.45 | Cash + Investments |
| Total Assets | $66,107.26 | Operating + Reserve |

### Budget (FY2026)

| Category | Annual Total |
|---|---|
| Operating Income | $184,207.40 |
| Operating Expense | $139,979.95 |
| **Net Operating Surplus** | **$44,227.45** |
| Monthly Expense Run Rate | $11,665.00 |
| Reserve Interest Income | $1,449.96 |
| Reserve Disbursements | $22,000.00 (Mar $13K, Apr $9K) |

### Assessment Structure

- **67 units** at $2,328.14/year regular + $300.00/year special (annual frequency)
- Total annual regular assessments: ~$155,985
- Total annual special assessments: ~$20,100
- Budget timing: assessments front-loaded in Mar-Jun

### Actuals (YTD through March 4, 2026)

| Metric | Value |
|---|---|
| YTD Income | $88.16 (ARC fees $100 - $50 adj + $38.16 interest) |
| YTD Expenses | $1,850.42 (January only) |
| Delinquent Invoices | 0 ($0.00) |
| Journal Entries Posted | 4 (Jan actuals + Feb adjusting + Feb opening balances) |

### Capital Projects (from `projects` table, 26 total)

| Project | Cost | Target | Funded % |
|---|---|---|---|
| Pond Spillway | $7,000 | Mar 2026 | 0% |
| Tuscany Drain Box | $5,500 | May 2026 | 0% |
| Front Entrance Power Washing | $1,500 | Mar 2027 | 0% |
| Irrigation Pump Replacement | $1,500 | Jun 2027 | 0% |
| **Road Sealing - All Roads** | **$80,000** | **Jun 2029** | **0%** |
| Asphalt Repair - Creek Stone Dr | $43,000 | TBD | 0% |
| Pavilion & Vineyard Structures | $7,000 | Jun 2035 | 0% |
| 16 placeholder items | $1.00 each | TBD | 0% |
| **Total Planned** | **$152,016** | | **0%** |

### Reserve Components

- **0 components tracked** (empty `reserve_components` table)

### Market Rates (fetched 2026-03-04)

| Type | Top Rate | Bank | Term |
|---|---|---|---|
| CD | 4.10% | E*TRADE / Synchrony | 12-14 mo |
| High-Yield Savings | 4.09% | Openbank | Liquid |
| Money Market | 4.03% | Vio Bank | Liquid |

---

## 1. Operating Fund Health Score

**Latest Score:** 88 (Good) — Generated 2026-03-04T19:24:36Z
**Score History:** 48 → 78 → 72 → 78 → 72 → **88** (6 runs, March 2-4)
**Overall Concurrence: 72%**

### Factor-by-Factor Analysis

#### Factor 1: "Projected Cash Flow" — Impact: Positive
> "12-month forecast shows consistent positive liquidity, with cash balances never dipping below the starting $27,419 and peaking at $142,788 in June."

| Check | Result |
|---|---|
| Budget surplus ($184K income vs $140K expense) | **Verified** ✅ |
| Assessments front-loaded Mar-Jun | **Verified** ✅ (budget shows $48K Mar, $64K Apr, $32K May, $16K Jun) |
| Peak of ~$142K in June | **Plausible** ✅ ($27K + cumulative income through June) |
| Cash never below starting $27K | **Plausible** ✅ (expenses < income by month) |

**Concurrence: 95%** — Forecast logic is sound. The only risk is the assumption that assessments are collected on the exact budget schedule.

---

#### Factor 2: "Delinquency Rate" — Impact: Positive
> "$0.00 in overdue invoices and a 0.0% delinquency rate."

**Concurrence: 100%** ✅ — Database confirms zero delinquent invoices.

---

#### Factor 3: "Budget Performance (Timing)" — Impact: Neutral
> "YTD income is 99.8% below budget ($55k variance) primarily due to the timing of the large Special Assessment ($20,700) and regular assessments appearing in future projected months."

| Check | Result |
|---|---|
| YTD income $88.16 | **Verified** ✅ |
| Budget includes March ($55K) in YTD calc | **Accurate** — AI uses month 3 of 12, includes full March budget |
| Timing explanation | **Reasonable** — we're only 4 days into March |
| Rating as "neutral" vs "negative" | **Appropriate** ✅ — correctly avoids penalizing for calendar timing |

**Concurrence: 80%** — The variance is accurately computed, but presenting a $55K "variance" when we are only 4 days into March could alarm a board member. The YTD window through month 3 includes all of March's budget despite only 4 days having elapsed. Consider computing the YTD budget pro-rata or through the prior complete month.

**🔧 Tuning Suggestion:** Add a note to the prompt about pro-rating the current month's budget, or instruct the AI to note "X days into the current month" when the variance is driven by incomplete-month timing.

---
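The pro-rating suggestion is straightforward to implement in code rather than in the prompt: weight the current month's budget by the fraction of the month elapsed. A sketch, assuming monthly budget figures are available as an array (the real budget data shape is an assumption; `proRatedYtdBudget` is a name invented here):

```typescript
// Pro-rated YTD budget: the full budget for each completed month plus a
// day-weighted share of the current month's budget.
// monthlyBudget[0] is January, monthlyBudget[1] February, etc.
function proRatedYtdBudget(monthlyBudget: number[], asOf: Date): number {
  const month = asOf.getMonth(); // 0-based index of the current month
  const daysInMonth = new Date(asOf.getFullYear(), month + 1, 0).getDate();
  const completed = monthlyBudget.slice(0, month).reduce((a, b) => a + b, 0);
  return completed + monthlyBudget[month] * (asOf.getDate() / daysInMonth);
}
```

On March 4, only 4/31 of the March budget would count toward the YTD comparison, shrinking the alarming $55K "variance" to something proportional to the days actually elapsed.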
#### Factor 4: "Cash Reserves" — Impact: Positive
> "Current operating cash of $27,419 provides 2.4 months of runway based on the annual expense run rate."

| Check | Result |
|---|---|
| $27,419 / ($139,980 / 12) = 2.35 months | **Math verified** ✅ |
| Rated as "positive" | **Questionable** ⚠️ |

**Concurrence: 60%** — The math is correct, but rating 2.4 months as "positive" contradicts the scoring guidelines, which state 2-3 months = "Fair" (60-74) and 3-6 months = "Good" (75-89). This factor should be "neutral" at best, and the overall score should reflect that the HOA is *below* the "Good" threshold for cash reserves.

**🔧 Tuning Suggestion:** Add explicit guidance in the prompt: "If cash runway is below 3 months, this factor MUST be neutral or negative, regardless of projected future inflows."

---
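This runway check is mechanical enough to enforce in application code rather than relying on the prompt. A sketch using the thresholds quoted from the scoring guidelines (function and band names are invented for illustration):

```typescript
// Cash runway in months: current cash divided by the monthly expense run rate.
function runwayMonths(cash: number, annualExpense: number): number {
  return cash / (annualExpense / 12);
}

// Guideline band per the report: 2-3 months = Fair, 3-6 months = Good.
// Bands outside the quoted ranges are assumptions for completeness.
function runwayBand(months: number): 'needs_attention' | 'fair' | 'good' | 'excellent' {
  if (months < 2) return 'needs_attention';
  if (months < 3) return 'fair';
  if (months <= 6) return 'good';
  return 'excellent';
}
```

With the audited figures, `runwayMonths(27418.81, 139979.95)` is about 2.35, which lands in the "fair" band and supports the objection to the AI's "positive" rating.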
|
||||||
|
|
||||||
|
#### Factor 5: "Expense Management" — Impact: Positive
|
||||||
|
> "YTD expenses are $36,313 under budget (4.8% of annual budget spent vs 25% of year elapsed)."
|
||||||
|
|
||||||
|
| Check | Result |
|
||||||
|
|---|---|
|
||||||
|
| YTD expenses $1,850.42 | **Verified** ✅ |
|
||||||
|
| Budget YTD (3 months): ~$38,164 | **Correct** ✅ |
|
||||||
|
| $1,850 / $38,164 = 4.85% | **Math verified** ✅ |
|
||||||
|
| "25% of year elapsed" | **Correct** (month 3 of 12) |
|
||||||
|
| Phrasing "of annual budget" | **Misleading** ⚠️ — it's actually 4.8% of YTD budget, not annual |
|
||||||
|
|
||||||
|
**Concurrence: 70%** — The percentage is correctly calculated against YTD budget, but the phrasing "of annual budget" is incorrect. Also, the low spend is not necessarily positive — only January actuals exist; February hasn't been posted yet, which the AI partially acknowledges with "or delayed billing cycles."
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### Recommendation Assessment
|
||||||
|
|
||||||
|
| # | Recommendation | Priority | Concurrence |
|
||||||
|
|---|---|---|---|
|
||||||
|
| 1 | "Verify the posting schedule for the $20,700 Special Assessment" | Low | **90%** ✅ Valid; assessments are annual, collection timing matters |
|
||||||
|
| 2 | "Investigate the low YTD expense recognition ($1,850 vs $38,164)" | Medium | **95%** ✅ Excellent catch; Feb expenses not posted yet |
|
||||||
|
| 3 | "Consider moving excess cash over $100K in Q2 to interest-bearing account" | Low | **85%** ✅ Sound advice; aligns with HY Savings at 4.09% |
|
||||||
|
|
||||||
|
**Recommendation Concurrence: 90%** — All three recommendations are actionable and data-backed.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### Score Assessment

**Is 88 (Good) the right score?**

| Scoring Criterion | Guidelines Say | Actual | Alignment |
|---|---|---|---|
| Cash reserves | 3-6 months for "Good" | 2.4 months | ❌ Below threshold |
| Income vs expenses | "Roughly matching" for Good | $184K vs $140K (surplus) | ✅ Exceeds |
| Delinquency | "Manageable" for Good | 0% | ✅ Excellent |
| Budget performance | No major overruns for Good | Under budget (timing) | ✅ Positive |
| Projected cash flow | Not explicitly in guidelines | Strong positive trajectory | ✅ Positive |

The cash runway of 2.4 months is below the stated "Good" (75-89) threshold of 3-6 months and technically falls in the "Fair" (60-74) range of 2-3 months. Earlier AI runs scored this 72-78, which better aligns with the guidelines. The 88 appears to overweight the projected future cash flow (which is speculative) against the current actual position.

**Suggested correct score: 74-80** (high end of Fair to low end of Good)

---

### Score Consistency Concern

| Run Date | Score | Label |
|---|---|---|
| Mar 2 15:07 | 48 | Needs Attention |
| Mar 2 15:12 | 78 | Good |
| Mar 2 15:36 | 72 | Fair |
| Mar 2 17:09 | 78 | Good |
| Mar 3 02:03 | 72 | Fair |
| Mar 4 19:24 | 88 | Good |

A **40-point spread** (48-88) across 6 runs with essentially the same data is concerning. Even excluding the outlier first run (which noted a data config issue with "1 units"), the remaining 5 runs span 72-88 (16 points). At temperature 0.3, this suggests the model is not deterministic enough for financial scoring.

**🔧 Tuning Suggestion:** Consider lowering temperature to 0.1 for health score calculations to improve consistency. Alternatively, implement a moving average of the last 3 scores to smooth volatility.

---

## 2. Reserve Fund Health Score

**Latest Score:** 45 (Needs Attention) — Generated 2026-03-04T19:24:50Z

**Score History:** 25 → 25 → 48 → 42 → 45 → 35 → **45** (7 runs, March 2-4)

**Overall Concurrence: 85%**

### Factor-by-Factor Analysis

#### Factor 1: "Funded Ratio" — Impact: Negative

> "Calculated at 0% because no reserve components have been inventoried or assigned replacement costs, making it impossible to measure true funding health against the $152,016 in planned projects."

| Check | Result |
|---|---|
| 0 reserve components in DB | **Verified** ✅ |
| $152,016 in planned projects | **Verified** ✅ (sum of all `projects` rows) |
| 0% funded ratio | **Technically accurate** ✅ (no denominator from components) |
| Distinction between components and projects | **Well articulated** ✅ |

**Concurrence: 95%** — The AI correctly identifies that the 0% is an artifact of missing reserve study data, not a literal lack of funds. It appropriately flags this as a governance failure.

---

#### Factor 2: "Projected Cash Flow" — Impact: Positive

> "Strong immediate liquidity; cash balance is projected to rise from $10,688 to over $49,000 by May 2026 due to special assessment income covering the $12,500 in urgent 2026 project costs."

| Check | Result |
|---|---|
| Starting reserve cash $10,688 | **Verified** ✅ |
| 2026 project costs: $7K (Mar) + $5.5K (May) = $12,500 | **Verified** ✅ |
| Special assessment: $300 × 67 = $20,100/year | **Verified** ✅ |
| CD maturities: $8K (Apr), $10K (Jun), $10K (Aug) | **Verified** ✅ |
| Projected rise to $49K by May | **Plausible** ✅ (income + maturities - project costs) |

**Concurrence: 85%** — The math is directionally correct. However, the assessment is billed annually, so the full $20,100 may arrive in a single payment rather than spread monthly. The timing assumption is critical.

---

#### Factor 3: "Component Tracking" — Impact: Negative

> "Critical failure in governance: 'No reserve components tracked' means the association is flying blind on the condition and remaining useful life of major assets like roads and irrigation."

**Concurrence: 100%** ✅ — Database confirms 0 rows in `reserve_components`. This is objectively a critical gap.

---

#### Factor 4: "Annual Contributions" — Impact: Negative

> "Recurring annual reserve income is only $300 (plus minimal interest), which is grossly insufficient to fund the $80,000 road sealing project due in 2029."

| Check | Result |
|---|---|
| Reserve budget income: $1,449.96/yr (interest only) | **Verified** ✅ |
| Special assessment: $300/unit × 67 = $20,100/yr | **Verified** ✅ |
| "$300" cited as annual reserve income | **Incorrect** ⚠️ |
| Road Sealing $80K in June 2029 | **Verified** ✅ |

**Concurrence: 65%** — The concern about insufficient contributions is valid, but the "$300" figure appears to confuse the per-unit special assessment amount ($300/unit) with the total annual reserve income. Actual annual reserve income = $1,450 (interest) + $20,100 (special assessments) = **$21,550/yr**. Even at $21,550/yr, the 3 years until Road Sealing would accumulate ~$64,650, still short of $80K. So the directional concern is correct, but the magnitude is significantly misstated.

**🔧 Tuning Suggestion:** The prompt should explicitly label the special assessment income total (not per-unit) in the data context. Currently the data says "$300.00/unit × 67 units (annual)" — the AI should compute $20,100 but sometimes fixates on the $300 per-unit figure. Consider pre-computing and passing the total.

---

### Recommendation Assessment

| # | Recommendation | Priority | Concurrence |
|---|---|---|---|
| 1 | "Commission a professional Reserve Study to inventory assets and establish funded ratio" | High | **100%** ✅ Critical and universally correct |
| 2 | "Develop a long-term funding plan for the $80,000 Road Sealing project (2029)" | High | **90%** ✅ Verified project exists; $80K with 0% funded |
| 3 | "Formalize collection of special assessments into the reserve fund vs operating" | Medium | **95%** ✅ Budget shows special assessments in operating income section |

**Recommendation Concurrence: 95%** — All recommendations are actionable, appropriately prioritized, and backed by database evidence.

---

### Score Assessment

**Is 45 (Needs Attention) the right score?**

| Scoring Criterion | Guidelines Say | Actual | Alignment |
|---|---|---|---|
| Percent funded | 20-30% for "Needs Attention" | 0% (no components) | ⬇️ Worse than threshold |
| Contributions | "Inadequate" for Needs Attention | $21,550/yr for $152K in projects | ⚠️ Borderline |
| Component tracking | "Multiple urgent unfunded" | 0 tracked, 2 due in 2026 | ❌ Critical gap |
| Investments | Not scored negatively | 3 CDs earning 3.6-3.67% | ✅ Positive |
| Capital readiness | Not specified | $12.5K due soon, only $10.7K cash | ⚠️ Tight |

A score of 45 is reasonable. The 0% funded ratio technically suggests "At Risk" (20-39), but the presence of real assets ($38.7K), active investments, and manageable near-term liquidity justifies bumping it into the "Needs Attention" band. The AI's balancing of the artificial 0% metric against actual fund health shows good judgment.

**Suggested correct score: 40-50** — the AI's 45 is well-calibrated.

---

### Score Consistency Concern

| Run Date | Score | Label |
|---|---|---|
| Mar 2 15:06 | 25 | At Risk |
| Mar 2 15:13 | 25 | At Risk |
| Mar 2 15:37 | 48 | Needs Attention |
| Mar 2 17:10 | 42 | Needs Attention |
| Mar 3 02:04 | 45 | Needs Attention |
| Mar 4 18:49 | 35 | At Risk |
| Mar 4 19:24 | 45 | Needs Attention |

A **23-point spread** (25-48) across 7 runs. The scores oscillate between "At Risk" and "Needs Attention" — the model cannot consistently decide which band this falls into. The most recent 3 runs (45, 35, 45) are more stable.

**🔧 Tuning Suggestion:** Add boundary guidance to the prompt: "When the score falls within ±5 points of a threshold (40, 60, 75, 90), explicitly justify which side of the boundary the HOA falls on."

---

## 3. AI Investment Recommendations

**Latest Run:** 2026-03-04T19:28:22Z (3 runs saved)

**Overall Concurrence: 88%**

### Overall Assessment

> "The HOA has a healthy long-term cash flow outlook with significant surpluses projected by mid-2026, but faces an immediate liquidity pinch in the Reserve Fund for March/April capital projects. The current investment strategy relies on older, lower-yielding CDs (3.60-3.67%) that are maturing soon."

**Concurrence: 92%** ✅ — Every claim verified:

- CDs are at 3.60-3.67% vs market 4.10% (verified)
- March project ($7K) vs reserve cash ($10.7K) is tight (verified)
- Long-term surplus projected from assessment income (verified from budget)

---

### Recommendation-by-Recommendation Analysis

#### Rec 1: "Critical Reserve Shortfall for March Project" — HIGH / Liquidity Warning

| Claim | Database Value | Match |
|---|---|---|
| Reserve cash = $10,688 | $10,688.45 | ✅ Exact |
| $7,000 Pond Spillway project due March | Projects table: $7,000, Mar 2026 | ✅ Exact |
| Shortfall risk | $10,688 - $7,000 = $3,688 remaining — tight but feasible | ✅ |
| Suggested action: expedite special assessment or transfer from operating | Sound advice | ✅ |

**Concurrence: 90%** — The liquidity concern is real. After paying the $7K project, only $3.7K would remain in reserve cash before the $5.5K May project. The AI correctly flags the timing risk even though the fund is technically solvent.

---

#### Rec 2: "Reinvest Maturing CD #2a at Higher Rate" — HIGH / Maturity Action

| Claim | Database Value | Match |
|---|---|---|
| CD #2a = $8,000 | $8,000.00 | ✅ Exact |
| Current rate = 3.60% | 3.60% | ✅ Exact |
| Maturity = April 14, 2026 | 2026-04-14 | ✅ Exact |
| Market rate = 4.10% (E*TRADE) | CD rates: E*TRADE 4.10%, 1 year, $0 min | ✅ Exact |
| Additional yield: ~$40/year per $8K | $8K × 0.50% = $40 | ✅ Math correct |

**Concurrence: 95%** ✅ — Textbook-correct recommendation. Every data point verified. The 50 bps improvement is risk-free income.

---

#### Rec 3: "Establish 12-Month CD Ladder for Reserves" — MEDIUM / CD Ladder

| Claim | Database Value | Match |
|---|---|---|
| ~$38K total reserve portfolio | $38,688.45 | ✅ Exact |
| Suggest 4-rung ladder (3/6/9/12 mo) | Standard strategy | ✅ |
| Rates up to 4.10% | Market data confirmed | ✅ |
| $9K matures every quarter | $38K / 4 = $9.5K per rung | ✅ Approximate |

**Concurrence: 75%** — Strategy is sound in principle, but the recommendation overlooks two constraints:

1. **Immediate project costs ($12.5K in 2026)** must be reserved first, leaving ~$26K for laddering
2. **Investing the entire $38K** is aggressive — some cash buffer should remain liquid
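
Under those constraints, the investable amount is smaller than the headline portfolio. A minimal sketch, using figures from this audit and an assumed $1K liquid buffer:

```bash
# Investable amount for the ladder after reserving near-term obligations.
# portfolio/projects come from this audit; the buffer is an assumed policy.
awk 'BEGIN {
  portfolio = 38688.45    # total reserve funds
  projects  = 12500.00    # 2026 project costs (Mar $7K + May $5.5K)
  buffer    =  1000.00    # assumed minimum liquid buffer
  printf "investable: $%.2f\n", portfolio - projects - buffer
}'
# investable: $25188.45
```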

**🔧 Tuning Suggestion:** Add a constraint to the prompt: "When recommending CD ladders, always subtract upcoming project costs (next 12 months) and a minimum emergency reserve (1 month of budgeted reserve expenses) before calculating the investable amount."

---

#### Rec 4: "Deploy Excess Operating Cash to High-Yield Savings" — MEDIUM / New Investment

| Claim | Database Value | Match |
|---|---|---|
| Operating cash = $27,418 | $27,418.81 | ✅ Exact |
| 3-month buffer = ~$35,000 | $11,665 × 3 = $34,995 | ✅ Math correct |
| Current cash below buffer | $27.4K < $35K | ✅ Correctly identified |
| Openbank 4.09% APY | Market data: Openbank 4.09%, $0.01 min | ✅ Exact |
| Trigger: "As soon as balance exceeds $35K" | Sound deferred recommendation | ✅ |

**Concurrence: 90%** ✅ — The AI correctly identifies the current shortfall and provides a forward-looking trigger. Well-structured advice that respects the liquidity constraint.

---

#### Rec 5: "Optimize Reserve Cash Yield Post-Project" — LOW / Reallocation

| Claim | Database Value | Match |
|---|---|---|
| Vio Bank Money Market at 4.03% | Market data: Vio Bank 4.03%, $0 min | ✅ Exact |
| Post-project reserve cash deployment | Appropriate timing | ✅ |
| T+1 liquidity for emergencies | Correct MM account characteristic | ✅ |

**Concurrence: 85%** ✅ — Reasonable low-priority optimization. Correctly uses market data.

---

#### Rec 6: "Formalize Special Assessment Collection for Reserves" — LOW / General

| Claim | Database Value | Match |
|---|---|---|
| $300/unit special assessment | Assessment groups: $300.00 special | ✅ Exact |
| Risk of commingling with operating | Budget shows special assessments in operating income | ✅ Identified |

**Concurrence: 90%** ✅ — Important governance recommendation. The budget structure does show special assessments as operating income, which could lead to improper fund commingling.

---

### Risk Notes Assessment

| Risk Note | Verified | Concurrence |
|---|---|---|
| "Reserve cash ($10.6K) barely sufficient for $7K + $5.5K projects" | ✅ $10,688 vs $12,500 in projects | **95%** |
| "Concentration risk: CDs maturing in 4-month window (Apr-Aug)" | ✅ All 3 CDs mature Apr-Aug 2026 | **100%** |
| "Operating cash ballooning to $140K+ without investment plan" | ✅ Budget shows large Q2 surplus | **85%** |
| "Road Sealing $80K in 2029 needs dedicated savings plan" | ✅ Project exists, 0% funded | **95%** |

**Risk Notes Concurrence: 94%** — All risk items are data-backed and appropriately flagged.

---

### Cross-Run Consistency (Investment Recommendations)

Three runs were compared. Key observations:

- **Core recommendations are highly consistent** across runs: CD reinvestment, HY savings for operating, CD ladder for reserves
- **Dollar amounts match exactly** across all runs (same data inputs)
- **Bank name recommendations vary slightly** (E*TRADE vs "Top CD Rate") — cosmetic, not substantive
- **Priority levels are stable** (HIGH for liquidity warnings, MEDIUM for optimization)

**Consistency Grade: A-** — Investment recommendations show much better consistency than health scores, likely because the structured data (specific CDs, specific rates) constrains the output more than the subjective health scoring.

---

## Cross-Cutting Issues

### Issue 1: Score Volatility (MEDIUM Priority)

Health scores vary significantly across runs despite identical input data:

- Operating: 40-point spread (48-88)
- Reserve: 23-point spread (25-48)

**Root Cause:** Temperature 0.3 allows too much variance for numerical scoring. The model interprets guidelines subjectively.

**Recommended Fix:**

1. Reduce temperature to **0.1** for health score calculations
2. Implement a **3-run moving average** to smooth individual run variance
3. Add explicit **boundary justification** requirements to prompts
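
As a sketch of fix 2, the displayed score could be the mean of the last three stored runs. The snippet below uses the Operating Fund run history (oldest to newest) from this report:

```bash
# Smooth the displayed health score with a 3-run moving average.
echo "48 78 72 78 72 88" | awk '{
  sum = 0
  for (i = NF - 2; i <= NF; i++) sum += $i   # last three runs
  printf "displayed score: %.0f\n", sum / 3
}'
# displayed score: 79  (vs. 88 for the raw latest run)
```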
### Issue 2: YTD Budget Calculation Includes Incomplete Month (LOW Priority)

The operating health score computes YTD budget through the current month (March), but actual data may only cover a few days. This creates alarming income variances (e.g., "$55K variance") that are pure timing artifacts.

**Recommended Fix:**

- Compute YTD budget through the **prior completed month** (February)
- OR pro-rate the current month's budget by days elapsed
- Add a note to the prompt: "If the variance is driven by the current incomplete month, flag it as 'timing' and weight it minimally."
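
For the pro-rating option, a worked sketch: assume a level monthly budget (~$38,164 / 3) and an audit run on March 4:

```bash
# YTD budget baseline = two full months + March pro-rated by days elapsed.
awk 'BEGIN {
  monthly = 38164 / 3          # assumed level monthly budget (~$12,721)
  ytd = 2 * monthly            # Jan + Feb, fully elapsed
  ytd += monthly * 4 / 31      # March, 4 of 31 days elapsed
  printf "pro-rated YTD budget: $%.0f\n", ytd
}'
# pro-rated YTD budget: $27084
```

Comparing actuals against ~$27K rather than the full $38K baseline removes most of the artificial variance.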
### Issue 3: Per-Unit vs Total Confusion on Special Assessments (LOW Priority)

The AI sometimes quotes "$300" as the annual reserve income instead of $300 × 67 = $20,100. The data passed says "$300.00/unit × 67 units (annual)", but the model occasionally fixates on the per-unit figure.

**Recommended Fix:**

- Pre-compute and include the total in the data: "Total Annual Special Assessment Income: $20,100.00"
- Keep the per-unit breakdown for context but lead with the total

### Issue 4: Cash Runway Classification Inconsistency (MEDIUM Priority)

The operating health score rates 2.4 months of cash runway as "positive" despite the scoring guidelines defining 2-3 months as "Fair" territory. This inflates the overall score.

**Recommended Fix:**

- Add explicit prompt guidance: "Cash runway categorization: <2 months = negative, 2-3 months = neutral, 3-6 months = positive, 6+ months = strongly positive. Do NOT rate below-threshold runway as positive based on projected future inflows."

### Issue 5: Dual Project Tables (INFORMATIONAL)

The schema contains both `capital_projects` (empty) and `projects` (26 rows). The health score service correctly queries `projects`, but auditors initially checked `capital_projects` and found no data. This dual-table pattern could confuse future developers.

**Recommended Fix:**

- Consolidate into a single table, OR
- Add a comment/documentation clarifying the canonical source

---

## Concurrence Summary by Recommendation

### Operating Fund Health — Recommendations

| Recommendation | Concurrence |
|---|---|
| Verify posting schedule for $20,700 Special Assessment | 90% |
| Investigate low YTD expense recognition | 95% |
| Move excess cash to interest-bearing account | 85% |
| **Average** | **90%** |

### Reserve Fund Health — Recommendations

| Recommendation | Concurrence |
|---|---|
| Commission professional Reserve Study | 100% |
| Develop funding plan for $80K Road Sealing | 90% |
| Formalize special assessment collection for reserves | 95% |
| **Average** | **95%** |

### Investment Planning — Recommendations

| Recommendation | Concurrence |
|---|---|
| Critical Reserve Shortfall for March Project | 90% |
| Reinvest Maturing CD #2a at Higher Rate | 95% |
| Establish 12-Month CD Ladder | 75% |
| Deploy Operating Cash to HY Savings | 90% |
| Optimize Reserve Cash Post-Project | 85% |
| Formalize Special Assessment Collection | 90% |
| **Average** | **88%** |

---

## Final Grades

| Feature | Score Accuracy | Recommendation Quality | Data Fidelity | Consistency | **Overall** |
|---|---|---|---|---|---|
| Operating Fund Health | C+ (score ~15 pts high) | A (90%) | B+ (minor math phrasing) | C (16-pt spread) | **72% — B-** |
| Reserve Fund Health | A- (well-calibrated) | A (95%) | B (per-unit confusion) | B- (23-pt spread) | **85% — B+** |
| Investment Recommendations | N/A (no single score) | A (88%) | A (exact data matches) | A- (stable across runs) | **88% — A-** |

---

## Priority Action Items for Tuning

1. **[HIGH]** Reduce AI temperature from 0.3 → 0.1 for health score calculations to reduce score volatility
2. **[MEDIUM]** Add explicit cash-runway-to-impact mapping in operating prompt to prevent misclassification
3. **[MEDIUM]** Pre-compute total special assessment income in data context (not just per-unit)
4. **[LOW]** Adjust YTD budget calculation to use prior completed month or pro-rate current month
5. **[LOW]** Add boundary justification requirement to scoring prompts
6. **[LOW]** Consider implementing 3-run moving average for displayed health scores

---

*Generated by Claude Opus 4.6 — Automated AI Feature Audit*
# HOA LedgerIQ — Deployment Guide

**Version:** 2026.3.2 (beta)
**Last updated:** 2026-03-02

---

## Table of Contents

1. [Prerequisites](#prerequisites)
2. [Deploy to a Fresh Docker Server](#deploy-to-a-fresh-docker-server)
3. [Production Deployment](#production-deployment)
4. [SSL with Certbot (Let's Encrypt)](#ssl-with-certbot-lets-encrypt)
5. [Backup the Local Test Database](#backup-the-local-test-database)
6. [Restore a Backup into the Staged Environment](#restore-a-backup-into-the-staged-environment)
7. [Running Migrations on the Staged Environment](#running-migrations-on-the-staged-environment)
8. [Verifying the Deployment](#verifying-the-deployment)
9. [Environment Variable Reference](#environment-variable-reference)

---

## Prerequisites

On the **target server**, ensure the following are installed:

| Tool | Minimum Version |
|-----------------|-----------------|
| Docker Engine | 24+ |
| Docker Compose | v2+ |
| Git | 2.x |
| `psql` (client) | 15+ *(optional, for manual DB work)* |
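
A quick way to confirm the tools are present (a sketch; `psql` is optional and may legitimately be absent):

```bash
# Report which prerequisite tools are on PATH.
for tool in docker git psql; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "ok:      $tool"
  else
    echo "missing: $tool"
  fi
done
```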

The app runs four containers in production — backend (NestJS), frontend (React/nginx), PostgreSQL 15, and Redis 7. A fifth nginx container is used in dev mode only. Total memory footprint is roughly **1–2 GB** idle.

For SSL, the server must also have:

- A **public hostname** with a DNS A record pointing to the server's IP (e.g., `staging.yourdomain.com → 203.0.113.10`)
- **Ports 80 and 443** open in any firewall / security group

---

## Deploy to a Fresh Docker Server

### 1. Clone the repository

```bash
ssh your-staging-server

git clone <repo-url> /opt/hoa-ledgeriq
cd /opt/hoa-ledgeriq
```

### 2. Create the environment file

Copy the example and fill in real values:

```bash
cp .env.example .env
nano .env   # or vi, your choice
```

**Required changes from defaults:**

```dotenv
# --- CHANGE THESE ---
POSTGRES_PASSWORD=<strong-random-password>
JWT_SECRET=<random-64-char-string>

# Database URL must match the password above
DATABASE_URL=postgresql://hoafinance:<same-password>@postgres:5432/hoafinance

# AI features (get a key from build.nvidia.com)
AI_API_KEY=nvapi-xxxxxxxxxxxx

# --- Usually fine as-is ---
POSTGRES_USER=hoafinance
POSTGRES_DB=hoafinance
REDIS_URL=redis://redis:6379
NODE_ENV=development   # keep as development for staging
AI_API_URL=https://integrate.api.nvidia.com/v1
AI_MODEL=qwen/qwen3.5-397b-a17b
AI_DEBUG=false
```

> **Tip:** Generate secrets quickly:
> ```bash
> openssl rand -hex 32      # good for JWT_SECRET
> openssl rand -base64 24   # good for POSTGRES_PASSWORD
> ```

### 3. Build and start the stack

```bash
docker compose up -d --build
```

This will:

- Build the backend and frontend images
- Pull `postgres:15-alpine`, `redis:7-alpine`, and `nginx:alpine`
- Initialize the PostgreSQL database with the shared schema (`db/init/00-init.sql`)
- Start all services on the `hoanet` bridge network

### 4. Wait for healthy services

```bash
docker compose ps
```

All containers should show `Up` (postgres and redis should also show `(healthy)`). If the backend is restarting, check logs:

```bash
docker compose logs backend --tail=50
```
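
For scripted deploys, a small retry helper avoids racing the health checks. This is a sketch; `/api/health` below is an assumed endpoint path, so substitute whatever your backend actually exposes:

```bash
# Run a command until it succeeds or the attempt budget is exhausted.
retry() {
  attempts=$1; shift
  for i in $(seq 1 "$attempts"); do
    "$@" && return 0
    sleep 5
  done
  return 1
}

# Example (on the server, once the stack is starting):
#   retry 24 curl -sf http://127.0.0.1:3000/api/health
```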
### 5. (Optional) Seed with demo data

If deploying a fresh environment for testing and you want the Sunrise Valley HOA demo tenant:

```bash
docker compose exec -T postgres psql -U hoafinance -d hoafinance < db/seed/seed.sql
```

This creates:

- Platform admin: `admin@hoaledgeriq.com` / `password123`
- Tenant admin: `admin@sunrisevalley.org` / `password123`
- Tenant viewer: `viewer@sunrisevalley.org` / `password123`
### 6. Access the application

| Service | URL |
|-----------|--------------------------------|
| App (UI) | `http://<server-ip>` |
| API | `http://<server-ip>/api` |
| Postgres | `<server-ip>:5432` (direct) |

> At this point the app is running over **plain HTTP** in development mode.
> For any environment that will serve real traffic, continue to the Production
> Deployment section.
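
A quick reachability check from your workstation (a sketch; `203.0.113.10` is a documentation placeholder, replace it with your server's IP):

```bash
SERVER_IP=203.0.113.10   # placeholder; replace with your server's IP
# -w prints the HTTP status code; 200 means the service answered.
curl -sS -m 10 -o /dev/null -w "UI:  %{http_code}\n" "http://$SERVER_IP/"    || true
curl -sS -m 10 -o /dev/null -w "API: %{http_code}\n" "http://$SERVER_IP/api" || true
```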

---

## Production Deployment

The base `docker-compose.yml` runs everything in **development mode** (Vite dev server, NestJS in watch mode, no connection pooling). This is fine for local development but will fail under even light production load.

`docker-compose.prod.yml` provides a production overlay that fixes this:

| Component | Dev mode | Production mode |
|-----------|----------|-----------------|
| Frontend | Vite dev server (single-threaded, HMR) | Static build served by nginx |
| Backend | `nest start --watch` (ts-node, file watcher) | Compiled JS, clustered across CPU cores |
| DB pooling | None (new connection per query) | Pool of 30 reusable connections |
| Postgres | Default config (100 connections) | Tuned: 200 connections, optimized buffers |
| Nginx | Docker nginx routes all traffic | Disabled — host nginx routes directly |
| Restart | None | `unless-stopped` on all services |

### Deploy for production

```bash
cd /opt/hoa-ledgeriq

# Ensure .env has NODE_ENV=production and strong secrets
nano .env

# Build and start with the production overlay
docker compose -f docker-compose.yml -f docker-compose.prod.yml up -d --build
```

The production overlay **disables the Docker nginx container** — request routing and SSL are handled by the host-level nginx. Backend and frontend are exposed on `127.0.0.1` only (loopback), so they aren't publicly accessible without the host nginx in front.

### Host nginx setup (required for production)

A ready-to-use host nginx config is included at `nginx/host-production.conf`. It handles SSL termination, request routing, rate limiting, proxy buffering, and extended timeouts for AI endpoints.

```bash
# Copy the reference config
sudo cp nginx/host-production.conf /etc/nginx/sites-available/app.yourdomain.com

# Edit the hostname (replace all instances of app.yourdomain.com)
sudo sed -i 's/app.yourdomain.com/YOUR_HOSTNAME/g' \
  /etc/nginx/sites-available/app.yourdomain.com

# Enable the site
sudo ln -s /etc/nginx/sites-available/app.yourdomain.com /etc/nginx/sites-enabled/

# Get an SSL certificate (certbot modifies the config automatically)
sudo certbot --nginx -d YOUR_HOSTNAME

# Test and reload
sudo nginx -t && sudo systemctl reload nginx
```

The host config routes traffic directly to the Docker services:

- `/api/*` → `http://127.0.0.1:3000` (NestJS backend)
- `/` → `http://127.0.0.1:3001` (React frontend served by nginx)

> See `nginx/host-production.conf` for the full config including rate limiting,
> proxy buffering, and extended AI endpoint timeouts.

> **Tip:** Create a shell alias to avoid typing the compose files every time:
> ```bash
> echo 'alias dc="docker compose -f docker-compose.yml -f docker-compose.prod.yml"' >> ~/.bashrc
> source ~/.bashrc
> dc up -d --build
> ```

### What the production overlay does

**Backend (`backend/Dockerfile`)**

- Multi-stage build: compiles TypeScript once, runs `node dist/main`
- No dev dependencies shipped (smaller image, faster startup)
- Node.js clustering: forks one worker per CPU core (up to 4)
- Connection pool: 30 reusable PostgreSQL connections shared across workers

**Frontend (`frontend/Dockerfile`)**

- Multi-stage build: `npm run build` produces optimized static assets
- Served by a lightweight nginx container (not Vite)
- Static assets cached with immutable headers (Vite filename hashing)

**Host Nginx (`nginx/host-production.conf`)**

- SSL termination + HTTP→HTTPS redirect (via certbot on host)
- Rate limiting on API routes (10 req/s per IP, burst 30)
- Proxy buffering to prevent 502s during slow responses
- Extended timeouts for AI endpoints (180s for investment/health-score calls)
- Routes `/api/*` → backend:3000, `/` → frontend:3001

**PostgreSQL**

- `max_connections=200` (up from default 100)
- `shared_buffers=256MB`, `effective_cache_size=512MB`
- Tuned checkpoint, WAL, and memory settings

### Capacity guidelines
|
|
||||||
|
|
||||||
With the production stack on a 2-core / 4GB server:
|
|
||||||
|
|
||||||
| Metric | Expected capacity |
|
|
||||||
|--------|-------------------|
|
|
||||||
| Concurrent users | 50–100 |
|
|
||||||
| API requests/sec | ~200 |
|
|
||||||
| DB connections | 30 per backend worker × workers |
|
|
||||||
| Frontend serving | Static files, effectively unlimited |
|
|
||||||
|
|
||||||
For higher loads, scale the backend horizontally with Docker Swarm or
|
|
||||||
Kubernetes replicas.

---

## SSL with Certbot (Let's Encrypt)

SSL is handled entirely at the host level using certbot with the host nginx. No Docker containers are involved in SSL termination.

### Prerequisites

- A public hostname with DNS pointing to this server
- Ports 80 and 443 open in the firewall
- Host nginx installed: `sudo apt install nginx` (Ubuntu/Debian)
- Certbot installed: `sudo apt install certbot python3-certbot-nginx`

### Obtain a certificate

If you followed the "Host nginx setup" section above, certbot was already run as part of that process. If not:

```bash
# Ensure the host nginx config is in place first
sudo certbot --nginx -d YOUR_HOSTNAME
```

Certbot will:

1. Verify domain ownership via an ACME challenge on port 80
2. Obtain the certificate from Let's Encrypt
3. Automatically modify the nginx config to enable SSL
4. Set up an HTTP → HTTPS redirect

### Verify HTTPS

```bash
# Should return 200 with SSL
curl -I https://YOUR_HOSTNAME

# Should return 301 redirect to HTTPS
curl -I http://YOUR_HOSTNAME
```

### Auto-renewal

Certbot installs a systemd timer (or cron job) that checks for renewal twice daily. Verify it's active:

```bash
sudo systemctl status certbot.timer
```

To test renewal without actually renewing:

```bash
sudo certbot renew --dry-run
```

Certbot automatically reloads nginx after a successful renewal.

---

## Backup the Local Test Database

### Full database dump (recommended)

From your **local development machine** where the app is currently running:

```bash
cd /path/to/HOA_Financial_Platform

# Dump the entire database (all schemas and data; roles are not included —
# use pg_dumpall for those)
docker compose exec -T postgres pg_dump \
  -U hoafinance \
  -d hoafinance \
  --no-owner \
  --no-privileges \
  --format=custom \
  -f /tmp/hoafinance_backup.dump

# Copy the dump file out of the container
docker compose cp postgres:/tmp/hoafinance_backup.dump ./hoafinance_backup.dump
```

The `--format=custom` flag produces a compressed binary format that supports selective restore. The file is typically 50–80% smaller than plain SQL.

### Alternative: Plain SQL dump

If you prefer a human-readable SQL file:

```bash
docker compose exec -T postgres pg_dump \
  -U hoafinance \
  -d hoafinance \
  --no-owner \
  --no-privileges \
  > hoafinance_backup.sql
```

### Backup a single tenant schema

To export just one tenant (e.g., Pine Creek HOA):

```bash
docker compose exec -T postgres pg_dump \
  -U hoafinance \
  -d hoafinance \
  --no-owner \
  --no-privileges \
  --schema=tenant_pine_creek_hoa_q33i \
  > pine_creek_backup.sql
```

> **Finding a tenant's schema name:**
> ```bash
> docker compose exec -T postgres psql -U hoafinance -d hoafinance \
>   -c "SELECT name, schema_name FROM shared.organizations WHERE status = 'active';"
> ```

---

## Restore a Backup into the Staged Environment

### 1. Transfer the backup to the staging server

```bash
scp hoafinance_backup.dump user@staging-server:/opt/hoa-ledgeriq/
```

### 2. Ensure the stack is running

```bash
cd /opt/hoa-ledgeriq
docker compose up -d
```

### 3. Drop and recreate the database (clean slate)

```bash
# Connect to postgres and reset the database
docker compose exec -T postgres psql -U hoafinance -d postgres -c "
  SELECT pg_terminate_backend(pid)
  FROM pg_stat_activity
  WHERE datname = 'hoafinance' AND pid <> pg_backend_pid();
"
docker compose exec -T postgres dropdb -U hoafinance hoafinance
docker compose exec -T postgres createdb -U hoafinance hoafinance
```

### 4a. Restore from custom-format dump

```bash
# Copy the dump into the container
docker compose cp hoafinance_backup.dump postgres:/tmp/hoafinance_backup.dump

# Restore
docker compose exec -T postgres pg_restore \
  -U hoafinance \
  -d hoafinance \
  --no-owner \
  --no-privileges \
  /tmp/hoafinance_backup.dump
```

### 4b. Restore from plain SQL dump

```bash
docker compose exec -T postgres psql \
  -U hoafinance \
  -d hoafinance \
  < hoafinance_backup.sql
```

### 5. Restart the backend

After restoring, restart the backend so NestJS re-establishes its connection pool and picks up the restored schemas:

```bash
docker compose restart backend
```

---

## Running Migrations on the Staged Environment

Migrations live in `db/migrations/` and are numbered sequentially. After restoring an older backup, you may need to apply newer migrations.

Check which migrations exist:

```bash
ls -la db/migrations/
```

Apply them in order:

```bash
# Run all migrations sequentially
for f in db/migrations/*.sql; do
  echo "Applying $f ..."
  docker compose exec -T postgres psql \
    -U hoafinance \
    -d hoafinance \
    < "$f"
done
```

Or apply a specific migration:

```bash
docker compose exec -T postgres psql \
  -U hoafinance \
  -d hoafinance \
  < db/migrations/010-health-scores.sql
```

> **Note:** Migrations are idempotent where possible (`IF NOT EXISTS`,
> `DO $$ ... $$` blocks), so re-running one that has already been applied
> is generally safe.
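
As an illustration of that idempotent style (hypothetical table and column names, not an actual migration from `db/migrations/`), a re-runnable migration typically looks like this:

```sql
-- Hypothetical example: safe to apply twice.
-- CREATE ... IF NOT EXISTS is a no-op on re-run, and the DO block
-- only adds the column when it is missing.
CREATE TABLE IF NOT EXISTS shared.example_settings (
    id SERIAL PRIMARY KEY,
    key VARCHAR(100) UNIQUE NOT NULL
);

DO $$
BEGIN
    IF NOT EXISTS (
        SELECT 1 FROM information_schema.columns
        WHERE table_schema = 'shared'
          AND table_name   = 'example_settings'
          AND column_name  = 'value'
    ) THEN
        ALTER TABLE shared.example_settings ADD COLUMN value TEXT;
    END IF;
END $$;
```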

---

## Verifying the Deployment

### Quick health checks

```bash
# Backend is responding
curl -s http://localhost:3000/api/auth/login | head -c 100

# Database is accessible
docker compose exec -T postgres psql -U hoafinance -d hoafinance \
  -c "SELECT count(*) AS tenants FROM shared.organizations WHERE status = 'active';"

# Redis is working
docker compose exec -T redis redis-cli ping
```

### Full smoke test

1. Open `https://YOUR_HOSTNAME` (or `http://<server-ip>`) in a browser
2. Log in with a known account
3. Navigate to Dashboard — verify health scores load
4. Navigate to Capital Planning — verify Kanban columns render
5. Navigate to Projects — verify project list loads
6. Check the Settings page — version should read **2026.3.2 (beta)**

### Verify SSL (if enabled)

```bash
# Check certificate details
echo | openssl s_client -connect YOUR_HOSTNAME:443 -servername YOUR_HOSTNAME 2>/dev/null \
  | openssl x509 -noout -subject -issuer -dates

# Check that HTTP redirects to HTTPS
curl -sI http://YOUR_HOSTNAME | grep -E 'HTTP|Location'
```

### View logs

```bash
docker compose logs -f                    # all services
docker compose logs -f backend            # backend only
docker compose logs -f postgres           # database only
docker compose logs -f frontend           # frontend nginx
sudo tail -f /var/log/nginx/access.log    # host nginx access log
sudo tail -f /var/log/nginx/error.log     # host nginx error log
```

---

## Environment Variable Reference

| Variable | Required | Description |
|-------------------|----------|----------------------------------------------------|
| `POSTGRES_USER` | Yes | PostgreSQL username |
| `POSTGRES_PASSWORD`| Yes | PostgreSQL password (**change from default**) |
| `POSTGRES_DB` | Yes | Database name |
| `DATABASE_URL` | Yes | Full connection string for the backend |
| `REDIS_URL` | Yes | Redis connection string |
| `JWT_SECRET` | Yes | Secret for signing JWT tokens (**change from default**) |
| `NODE_ENV` | Yes | `development` or `production` |
| `AI_API_URL` | Yes | OpenAI-compatible inference endpoint |
| `AI_API_KEY` | Yes | API key for AI provider (Nvidia) |
| `AI_MODEL` | Yes | Model identifier for AI calls |
| `AI_DEBUG` | No | Set `true` to log raw AI prompts/responses |
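
A minimal `.env` sketch tying the table together. Every value below is a placeholder or assumption — substitute your own credentials, endpoint, and model:

```bash
# Example .env — placeholders only, do not use these values in production.
POSTGRES_USER=hoafinance
POSTGRES_PASSWORD=change-me
POSTGRES_DB=hoafinance
DATABASE_URL=postgresql://hoafinance:change-me@postgres:5432/hoafinance
REDIS_URL=redis://redis:6379
JWT_SECRET=generate-a-long-random-string
NODE_ENV=production
AI_API_URL=https://your-inference-endpoint.example.com/v1
AI_API_KEY=your-api-key
AI_MODEL=your-model-id
AI_DEBUG=false
```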

---

## Architecture Overview

```
Development:
                   ┌──────────────────┐
Browser ─────────► │    nginx :80     │
                   └────────┬─────────┘
                 ┌──────────┴──────────┐
                 ▼                     ▼
          ┌──────────────┐     ┌──────────────┐
          │ backend :3000│     │frontend :5173│
          │   (NestJS)   │     │ (Vite/React) │
          └──────┬───────┘     └──────────────┘
             ┌───┴──────┐
             ▼          ▼
      ┌─────────────┐ ┌───────────┐
      │postgres:5432│ │redis :6379│
      │   (PG 15)   │ │ (Redis 7) │
      └─────────────┘ └───────────┘

Production (host nginx handles SSL + routing):
                   ┌────────────────────────────────┐
Browser ─────────► │  Host nginx :80/:443 (SSL)     │
                   │  /api/* → 127.0.0.1:3000       │
                   │  /*     → 127.0.0.1:3001       │
                   └────────┬───────────┬───────────┘
                            ▼           ▼
                   ┌──────────────┐ ┌──────────────┐
                   │ backend :3000│ │frontend :3001│
                   │  (compiled)  │ │(static nginx)│
                   └──────┬───────┘ └──────────────┘
                      ┌───┴──────┐
                      ▼          ▼
               ┌─────────────┐ ┌───────────┐
               │postgres:5432│ │redis :6379│
               │   (PG 15)   │ │ (Redis 7) │
               └─────────────┘ └───────────┘
```

**Multi-tenant isolation:** Each HOA organization gets its own PostgreSQL schema (e.g., `tenant_pine_creek_hoa_q33i`). The `shared` schema holds cross-tenant tables (users, organizations, market rates). Tenant context is resolved from the JWT token on every API request.
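
As a rough sketch of what schema-per-tenant isolation means at the SQL level — the actual resolution happens in the backend from the JWT, and the table name here is hypothetical:

```sql
-- Illustrative only: once the tenant schema is first on the search path,
-- unqualified table names resolve inside that tenant, while shared.*
-- remains visible for cross-tenant lookups.
SET search_path TO tenant_pine_creek_hoa_q33i, shared;

-- Resolves to tenant_pine_creek_hoa_q33i.accounts (hypothetical table)
SELECT * FROM accounts LIMIT 5;
```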

---
# HOA LedgerIQ — Scaling Guide

**Version:** 2026.3.2 (beta)
**Last updated:** 2026-03-03
**Current infrastructure:** 4 ARM cores, 24 GB RAM, single VM

---

## Table of Contents

1. [Current Architecture Baseline](#current-architecture-baseline)
2. [Resource Budget — Where Your 24 GB Goes](#resource-budget--where-your-24-gb-goes)
3. [Scaling Signals — When to Act](#scaling-signals--when-to-act)
4. [Phase 1: Vertical Tuning (Same VM)](#phase-1-vertical-tuning-same-vm)
5. [Phase 2: Offload Services (Managed DB + Cache)](#phase-2-offload-services-managed-db--cache)
6. [Phase 3: Horizontal Scaling (Multiple Backend Instances)](#phase-3-horizontal-scaling-multiple-backend-instances)
7. [Phase 4: Full Horizontal (Multi-Node)](#phase-4-full-horizontal-multi-node)
8. [Component-by-Component Scaling Reference](#component-by-component-scaling-reference)
9. [Docker Daemon Tuning](#docker-daemon-tuning)
10. [Monitoring with New Relic](#monitoring-with-new-relic)

---

## Current Architecture Baseline

```
Internet
   │
   ▼
┌─────────────────────────────────────────────────────────┐
│  Host VM (4 ARM cores, 24 GB RAM)                       │
│                                                         │
│   ┌──────────────────────────────────┐                  │
│   │  Host nginx :80/:443 (SSL)       │                  │
│   │  /api/* → 127.0.0.1:3000         │                  │
│   │  /*     → 127.0.0.1:3001         │                  │
│   └──────────┬───────────┬───────────┘                  │
│              ▼           ▼                              │
│   ┌──────────────┐ ┌──────────────┐   Docker (hoanet)   │
│   │ backend :3000│ │frontend :3001│                     │
│   │  4 workers   │ │ static nginx │                     │
│   │ 1024 MB cap  │ │  ~5 MB used  │                     │
│   └──────┬───────┘ └──────────────┘                     │
│     ┌────┴───────┐                                      │
│     ▼            ▼                                      │
│  ┌────────────┐ ┌───────────┐                           │
│  │ postgres   │ │ redis     │                           │
│  │ 1024 MB cap│ │ 256 MB cap│                           │
│  └────────────┘ └───────────┘                           │
└─────────────────────────────────────────────────────────┘
```

**How requests flow:**

1. Browser hits host nginx (SSL termination, rate limiting)
2. API requests proxy to the NestJS backend (4 clustered workers)
3. Static asset requests proxy to the frontend nginx container
4. Backend queries PostgreSQL and Redis over the Docker bridge network
5. All inter-container traffic stays on the `hoanet` bridge (kernel-routed, no userland proxy)

**Key configuration facts:**

| Component | Current config | Bottleneck at scale |
|-----------|---------------|---------------------|
| Backend | 4 Node.js workers (1 per core) | CPU-bound under heavy API load |
| PostgreSQL | 200 max connections, 256 MB shared_buffers | Connection count, then memory |
| Redis | 256 MB maxmemory, LRU eviction | Memory, then network |
| Frontend | Static nginx, ~5 MB memory | Effectively unlimited for static serving |
| Host nginx | Rate limit: 10 req/s per IP, burst 30 | File descriptors, worker connections |

---

## Resource Budget — Where Your 24 GB Goes

| Component | Memory limit | Typical usage | Notes |
|-----------|-------------|---------------|-------|
| Backend | 1024 MB | 250–400 MB | 4 workers share one container limit |
| PostgreSQL | 1024 MB | 50–300 MB | Grows with active queries and shared_buffers |
| Redis | 256 MB | 3–10 MB | Very low until caching is heavily used |
| Frontend | None set | ~5 MB | Static nginx, negligible |
| Host nginx | N/A (host) | ~10 MB | Runs on the host, not in Docker |
| New Relic agent | (inside backend) | ~30–50 MB | Included in backend memory |
| **Total reserved** | **~2.3 GB** | **~500 MB idle** | **~21.5 GB available for growth** |

You have significant headroom. The current configuration is conservative and can handle considerably more load before any changes are needed.

---

## Scaling Signals — When to Act

Use these thresholds from New Relic and system metrics to decide when to scale:

### Immediate action required

| Signal | Threshold | Likely bottleneck |
|--------|-----------|-------------------|
| API response time (p95) | > 2 seconds | Backend CPU or DB queries |
| Error rate | > 1% of requests | Backend memory, DB connections, or bugs |
| PostgreSQL connection wait time | > 100 ms | Connection pool exhaustion |
| Container OOM kills | Any occurrence | Memory limit too low |

### Plan scaling within 2–4 weeks

| Signal | Threshold | Likely bottleneck |
|--------|-----------|-------------------|
| API response time (p95) | > 500 ms sustained | Backend approaching CPU saturation |
| Backend CPU (container) | > 80% sustained | Need more workers or replicas |
| PostgreSQL CPU | > 70% sustained | Query optimization or read replicas |
| PostgreSQL connections | > 150 of 200 | Pool size or connection leaks |
| Redis memory | > 200 MB of 256 MB | Increase limit or review eviction |
| Host disk usage | > 80% | Postgres WAL or Docker image bloat |
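
The disk signal is the easiest to spot-check from a shell. A small sketch (adjust the mount point if Docker data lives on a separate filesystem):

```shell
#!/usr/bin/env sh
# Warn when root filesystem usage crosses the 80% scaling signal.
THRESHOLD=80
USAGE=$(df -P / | awk 'NR==2 { sub("%", "", $5); print $5 }')

echo "Root filesystem usage: ${USAGE}%"
if [ "$USAGE" -gt "$THRESHOLD" ]; then
  echo "WARNING: above ${THRESHOLD}% - check Postgres WAL and 'docker system df'"
fi
```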

### No action needed

| Signal | Range | Meaning |
|--------|-------|---------|
| Backend CPU | < 50% | Normal headroom |
| API response time (p95) | < 200 ms | Healthy |
| PostgreSQL connections | < 100 | Plenty of capacity |
| Memory usage (all containers) | < 60% of limits | Well-sized |

---

## Phase 1: Vertical Tuning (Same VM)

**When:** 50–200 concurrent users, response times starting to climb.
**Cost:** Free — just configuration changes.

### 1.1 Increase backend memory limit

The backend runs 4 workers in a 1024 MB container. Each Node.js worker uses 60–100 MB at baseline. Under load with New Relic active, they can reach 150 MB each (600 MB total). Raise the limit to give headroom:

```yaml
# docker-compose.prod.yml
backend:
  deploy:
    resources:
      limits:
        memory: 2048M        # was 1024M
      reservations:
        memory: 512M         # was 256M
```

### 1.2 Tune PostgreSQL for available RAM

With 24 GB on the host, PostgreSQL can use significantly more memory. These settings assume PostgreSQL is the only memory-heavy workload besides the backend. Note that comments cannot go inside the folded `command:` scalar: anything after `#` there would be passed to postgres as a literal argument.

```yaml
# docker-compose.prod.yml
postgres:
  command: >
    postgres
    -c max_connections=200
    -c shared_buffers=1GB
    -c effective_cache_size=4GB
    -c work_mem=16MB
    -c maintenance_work_mem=256MB
    -c checkpoint_completion_target=0.9
    -c wal_buffers=64MB
    -c random_page_cost=1.1
  deploy:
    resources:
      limits:
        memory: 4096M        # was 1024M
      reservations:
        memory: 1024M        # was 512M
```

Previous values for reference: `shared_buffers=256MB` (now 1GB, the 25%-of-container rule of thumb), `effective_cache_size=512MB` (now 4GB, an OS page-cache estimate), `work_mem=4MB` (now 16MB, per-sort memory), `maintenance_work_mem=64MB` (now 256MB, used by VACUUM and CREATE INDEX), `wal_buffers=16MB` (now 64MB).
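
The main numbers above follow simple rules of thumb, so they can be rederived if the container limit changes. A sketch, assuming the 25% shared_buffers guideline:

```shell
#!/usr/bin/env sh
# Derive Phase 1 PostgreSQL sizing from the container memory limit.
CONTAINER_MEM_MB=4096                                  # deploy.resources.limits.memory

SHARED_BUFFERS_MB=$(( CONTAINER_MEM_MB / 4 ))          # 25% rule of thumb
MAINTENANCE_WORK_MEM_MB=$(( CONTAINER_MEM_MB / 16 ))   # ~6% for VACUUM / CREATE INDEX

echo "shared_buffers=${SHARED_BUFFERS_MB}MB"
echo "maintenance_work_mem=${MAINTENANCE_WORK_MEM_MB}MB"
```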

### 1.3 Increase Redis memory

If you start using Redis for session storage or response caching:

```yaml
# docker-compose.prod.yml
redis:
  command: redis-server --appendonly yes --maxmemory 1gb --maxmemory-policy allkeys-lru
```

### 1.4 Tune host nginx worker connections

```nginx
# /etc/nginx/nginx.conf (host)
worker_processes auto;          # matches CPU cores (4)
events {
    worker_connections 2048;    # default is often 768
    multi_accept on;
}
```

### Phase 1 capacity estimate

| Metric | Estimate |
|--------|----------|
| Concurrent users | 200–500 |
| API requests/sec | 400–800 |
| Tenants | 50–100 |

---

## Phase 2: Offload Services (Managed DB + Cache)

**When:** 500+ concurrent users, or you need high availability / automated backups.
**Cost:** $50–200/month depending on provider and tier.

### 2.1 Move PostgreSQL to a managed service

Replace the Docker PostgreSQL container with a managed instance:

- **AWS:** RDS for PostgreSQL (db.t4g.medium — 2 vCPU, 4 GB, ~$70/mo)
- **GCP:** Cloud SQL for PostgreSQL (db-custom-2-4096, ~$65/mo)
- **DigitalOcean:** Managed Databases ($60/mo for 2 vCPU / 4 GB)

**Changes required:**

1. Update `.env` to point `DATABASE_URL` at the managed instance
2. In `docker-compose.prod.yml`, disable the postgres container:
   ```yaml
   postgres:
     deploy:
       replicas: 0
   ```
3. Remove the `depends_on: postgres` from the backend service
4. Ensure the managed DB allows connections from your VM's IP

**Benefits:** Automated backups, point-in-time recovery, read replicas, automatic failover, no memory/CPU contention with the application.

### 2.2 Move Redis to a managed service

Replace the Docker Redis container similarly:

- **AWS:** ElastiCache (cache.t4g.micro, ~$15/mo)
- **DigitalOcean:** Managed Redis ($15/mo)

Update `REDIS_URL` in `.env` and disable the container.

### Phase 2 resource reclaim

Offloading DB and cache frees ~5 GB of reserved memory on the VM, leaving the full 24 GB available for backend scaling (Phase 3).

---

## Phase 3: Horizontal Scaling (Multiple Backend Instances)

**When:** Single backend container hits CPU ceiling (4 workers maxed), or you need zero-downtime deployments.

### 3.1 Run multiple backend replicas with Docker Compose

```yaml
# docker-compose.prod.yml
backend:
  deploy:
    replicas: 2                # 2 containers × 4 workers = 8 workers
    resources:
      limits:
        memory: 2048M
      reservations:
        memory: 512M
```

**Important:** With replicas > 1 you cannot map a single fixed host port in `ports:`. Publish each replica on its own host port (via the port range below) and point a host nginx upstream block at them:

```nginx
# /etc/nginx/sites-available/your-site
upstream backend {
    server 127.0.0.1:3100;    # first backend replica
    server 127.0.0.1:3101;    # second backend replica
}
```

Publish the replicas with a Docker Compose port range. The range must avoid port 3001, which the frontend container already publishes:

```yaml
backend:
  ports:
    - "127.0.0.1:3100-3109:3000"
  deploy:
    replicas: 2
```

### 3.2 Connection pool considerations

Each backend container runs up to 4 workers, each with its own connection pool. With the default pool size of 30:

| Replicas | Workers | Max DB connections |
|----------|---------|-------------------|
| 1 | 4 | 120 |
| 2 | 8 | 240 |
| 3 | 12 | 360 |

If using managed PostgreSQL, ensure `max_connections` on the DB is high enough. For > 2 replicas, consider adding **PgBouncer** as a connection pooler (transaction-mode pooling) to multiplex connections:

```
Backend workers (12) → PgBouncer (50 server connections) → PostgreSQL
```
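
The multiplication in the table above is worth sanity-checking before raising replicas. A small sketch:

```shell
#!/usr/bin/env sh
# Check a replica plan against the database's max_connections.
REPLICAS=3
WORKERS_PER_REPLICA=4
POOL_SIZE=30
MAX_CONNECTIONS=200        # PostgreSQL setting

TOTAL=$(( REPLICAS * WORKERS_PER_REPLICA * POOL_SIZE ))
echo "Potential DB connections: $TOTAL (limit: $MAX_CONNECTIONS)"
if [ "$TOTAL" -gt "$MAX_CONNECTIONS" ]; then
  echo "Over budget: raise max_connections or put PgBouncer in front"
fi
```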

### 3.3 Session and state considerations

The application currently uses **stateless JWT authentication** — no server-side sessions. This means backend replicas can handle any request without sticky sessions. Redis is used for caching only. This architecture is already horizontal-ready.

### Phase 3 capacity estimate

| Replicas | Concurrent users | API req/sec |
|----------|-----------------|-------------|
| 2 | 500–1,000 | 800–1,500 |
| 3 | 1,000–2,000 | 1,500–2,500 |

---

## Phase 4: Full Horizontal (Multi-Node)

**When:** Single VM resources exhausted, or you need geographic distribution and high availability.

### 4.1 Docker Swarm (simplest multi-node)

Docker Swarm is the easiest migration from Docker Compose. The compose files are already compatible:

```bash
# On the manager node
docker swarm init

# On worker nodes
docker swarm join --token <token> <manager-ip>:2377

# Deploy the stack
docker stack deploy -c docker-compose.yml -c docker-compose.prod.yml hoaledgeriq
```

Scale the backend across nodes:

```bash
docker service scale hoaledgeriq_backend=4
```

Swarm handles load balancing across nodes via its built-in ingress network.

### 4.2 Kubernetes (full orchestration)

For larger deployments, migrate to Kubernetes:

- **Backend:** Deployment with HPA (Horizontal Pod Autoscaler) on CPU
- **Frontend:** Deployment with 2+ replicas behind a Service
- **PostgreSQL:** External managed service (not in the cluster)
- **Redis:** External managed service or StatefulSet
- **Ingress:** nginx-ingress or cloud load balancer

This is a significant migration but provides auto-scaling, self-healing, rolling deployments, and multi-region capability.

### 4.3 CDN for static assets

At any point in the scaling journey, a CDN provides the biggest return on investment for frontend performance:

- **Cloudflare** (free tier works): Proxy DNS, caches static assets at edge
- **AWS CloudFront** or **GCP Cloud CDN**: More control, ~$0.085/GB

This eliminates nearly all load on the frontend nginx container and reduces latency for geographically distributed users. Static assets (JS, CSS, images) are served from edge nodes instead of your VM.

---

## Component-by-Component Scaling Reference

### Backend (NestJS)

| Approach | When | How |
|----------|------|-----|
| Tune worker count | CPU underused | Set `WORKERS` env var or modify `main.ts` cap |
| Increase memory limit | OOM or >80% usage | Raise `deploy.resources.limits.memory` |
| Add replicas | CPU maxed at 4 workers | `deploy.replicas: N` in compose |
| Move to separate VM | VM resources exhausted | Run backend on dedicated compute |

**Current clustering logic** (from `backend/src/main.ts`):
- Production: `Math.min(os.cpus().length, 4)` workers
- Development: 1 worker
- To allow more than 4 workers, change the cap in `main.ts`
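
A shell equivalent of that cap, useful for predicting the worker count on a given host (an illustrative sketch, not the actual `main.ts` code):

```shell
#!/usr/bin/env sh
# Mirror the production worker count: min(cpu cores, 4).
CORES=$(nproc)
CAP=4
if [ "$CORES" -lt "$CAP" ]; then
  WORKERS=$CORES
else
  WORKERS=$CAP
fi
echo "Would fork $WORKERS workers on a ${CORES}-core host"
```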
|
|
||||||
|
|
||||||
### PostgreSQL
|
|
||||||
|
|
||||||
| Approach | When | How |
|
|
||||||
|----------|------|-----|
|
|
||||||
| Increase shared_buffers | Cache hit ratio < 99% | Tune postgres command args |
|
|
||||||
| Increase max_connections | Pool exhaustion errors | Increase in postgres command + add PgBouncer |
|
|
||||||
| Add read replica | Read-heavy workload | Managed DB feature or streaming replication |
|
|
||||||
| Vertical scale | Query latency high | Larger managed DB instance |
|
|
||||||
|
|
||||||
**Key queries to monitor:**
|
|
||||||
```sql
|
|
||||||
-- Connection usage
|
|
||||||
SELECT count(*) AS active, max_conn FROM pg_stat_activity,
|
|
||||||
(SELECT setting::int AS max_conn FROM pg_settings WHERE name='max_connections') s
|
|
||||||
GROUP BY max_conn;
|
|
||||||
|
|
||||||
-- Cache hit ratio (should be > 99%)
|
|
||||||
SELECT
|
|
||||||
sum(heap_blks_hit) / (sum(heap_blks_hit) + sum(heap_blks_read)) AS ratio
|
|
||||||
FROM pg_statio_user_tables;
|
|
||||||
|
|
||||||
-- Slow queries (if pg_stat_statements is enabled)
|
|
||||||
SELECT query, mean_exec_time, calls
|
|
||||||
FROM pg_stat_statements
|
|
||||||
ORDER BY mean_exec_time DESC
|
|
||||||
LIMIT 10;
|
|
||||||
```
|
|
||||||
|
|
||||||
### Redis
|
|
||||||
|
|
||||||
| Approach | When | How |
|
|
||||||
|----------|------|-----|
|
|
||||||
| Increase maxmemory | Evictions happening frequently | Change `--maxmemory` in compose command |
|
|
||||||
| Move to managed | Need persistence guarantees | AWS ElastiCache / DigitalOcean Managed Redis |
|
|
||||||
| Add replica | Read-heavy caching | Managed service with read replicas |
|
|
||||||
|
|
||||||
### Host Nginx
|
|
||||||
|
|
||||||
| Approach | When | How |
|
|
||||||
|----------|------|-----|
|
|
||||||
| Tune worker_connections | Connection refused errors | Increase in `/etc/nginx/nginx.conf` |
|
|
||||||
| Add upstream servers | Multiple backend replicas | upstream block with multiple servers |
|
|
||||||
| Move to load balancer | Multi-node deployment | Cloud LB (ALB, GCP LB) or HAProxy |
|
|
||||||
| Add CDN | Static asset latency | Cloudflare, CloudFront, etc. |
|
|
||||||
|
|
||||||
---
|
|
||||||
|
|
||||||
## Docker Daemon Tuning

These settings are applied on the host in `/etc/docker/daemon.json`:

```json
{
  "userland-proxy": false,
  "log-driver": "json-file",
  "log-opts": {
    "max-size": "50m",
    "max-file": "3"
  },
  "default-ulimits": {
    "nofile": {
      "Name": "nofile",
      "Hard": 65536,
      "Soft": 65536
    }
  }
}
```

| Setting | Purpose |
|---------|---------|
| `userland-proxy: false` | Kernel-level port forwarding instead of the userspace Go proxy (already applied) |
| `log-opts` | Prevents Docker container logs from filling the disk |
| `default-ulimits.nofile` | Raises the file descriptor limit for containers handling many connections |

After changing, restart Docker: `sudo systemctl restart docker`

---
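A malformed `daemon.json` will prevent the Docker daemon from starting, so it is worth sanity-checking the file before the restart. A small Python sketch of such a check; the keys it inspects mirror the example above, and the checks themselves are illustrative, not an official validator:

```python
import json

def check_daemon_config(text: str) -> list[str]:
    """Return a list of problems found in a daemon.json document (empty list = looks OK)."""
    problems = []
    try:
        cfg = json.loads(text)
    except json.JSONDecodeError as e:
        # Invalid JSON is fatal: dockerd refuses to start with a broken config file
        return [f"invalid JSON: {e}"]
    nofile = cfg.get("default-ulimits", {}).get("nofile", {})
    if nofile.get("Hard", 0) < nofile.get("Soft", 0):
        problems.append("nofile Hard limit is below the Soft limit")
    if cfg.get("log-opts", {}).get("max-size") is None:
        problems.append("log-opts.max-size missing: container logs can fill the disk")
    return problems

sample = '{"log-opts": {"max-size": "50m"}, "default-ulimits": {"nofile": {"Hard": 65536, "Soft": 65536}}}'
print(check_daemon_config(sample))  # []
```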
## Monitoring with New Relic

New Relic is deployed on the backend via the conditional preload (`NEW_RELIC_ENABLED=true` in `.env`). Key dashboards to set up:

### Alerts to configure

| Alert | Condition | Priority |
|-------|-----------|----------|
| High error rate | > 1% for 5 minutes | Critical |
| Slow transactions | p95 > 2s for 5 minutes | Critical |
| Apdex score drop | < 0.7 for 10 minutes | Warning |
| Memory usage | > 80% of container limit for 10 minutes | Warning |
| Transaction throughput drop | > 50% decrease vs. baseline | Warning |
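Each "X for N minutes" condition in the table is just a predicate over a sliding window of samples. A Python sketch of how such a condition might be evaluated; the sampling interval and values are illustrative assumptions, not New Relic's actual evaluation engine:

```python
def triggered(samples: list[float], threshold: float, duration_s: int, interval_s: int = 60) -> bool:
    """True if every sample in the last `duration_s` window exceeds the threshold
    (mirrors a condition like 'error rate > 1% for 5 minutes')."""
    needed = duration_s // interval_s
    window = samples[-needed:]
    # Require a full window: a half-filled window must not fire the alert
    return len(window) == needed and all(s > threshold for s in window)

# Error rate sampled once a minute (percent); the last five minutes are all above 1%
error_rate = [0.2, 0.4, 1.3, 1.6, 2.0, 1.8, 1.5]
print(triggered(error_rate, threshold=1.0, duration_s=300))  # True
```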
### Key transactions to monitor

| Endpoint | Why |
|----------|-----|
| `POST /api/auth/login` | Authentication performance; the first thing every user hits |
| `GET /api/journal-entries` | Heaviest read query (double-entry bookkeeping with lines) |
| `POST /api/investment-planning/recommendations` | AI endpoint, 30–180s response time, external dependency |
| `GET /api/reports/*` | Financial reports with aggregate queries |
| `GET /api/projects` | Includes real-time funding computation across all reserve projects |
### Infrastructure metrics to export

If you later add the New Relic Infrastructure agent to the host VM, you can correlate application performance with system metrics:

```bash
# Install on the host (not in Docker)
curl -Ls https://download.newrelic.com/install/newrelic-cli/scripts/install.sh | bash
sudo NEW_RELIC_API_KEY=<your-key> NEW_RELIC_ACCOUNT_ID=<your-id> \
  /usr/local/bin/newrelic install -n infrastructure-agent-installer
```

This provides host-level CPU, memory, disk, and network metrics alongside your application telemetry.

---
## Quick Reference — Scaling Decision Tree

```
Is API response time (p95) > 500ms?
├── Yes → Is backend CPU > 80%?
│   ├── Yes → Phase 1: Already at 4 workers?
│   │   ├── Yes → Phase 3: Add backend replicas
│   │   └── No → Raise worker cap in main.ts
│   └── No → Is PostgreSQL slow?
│       ├── Yes → Phase 1: Tune PG memory, or Phase 2: Managed DB
│       └── No → Profile the slow endpoints in New Relic
├── No → Is memory > 80% on any container?
│   ├── Yes → Phase 1: Raise memory limits (you have 21+ GB free)
│   └── No → Is disk > 80%?
│       ├── Yes → Clean Docker images, tune PG WAL retention, add log rotation
│       └── No → No scaling needed
```
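The decision tree above can be read as a chain of guards; a Python sketch encoding the same logic, with the metric inputs as illustrative parameters:

```python
def scaling_advice(p95_ms: float, backend_cpu: float, workers: int,
                   pg_slow: bool, mem_pct: float, disk_pct: float) -> str:
    """Walk the scaling decision tree and return the recommended action."""
    if p95_ms > 500:
        if backend_cpu > 80:
            # CPU-bound: scale workers first, then replicas
            return "Phase 3: add backend replicas" if workers >= 4 else "Raise worker cap in main.ts"
        if pg_slow:
            return "Phase 1: tune PG memory, or Phase 2: managed DB"
        return "Profile the slow endpoints in New Relic"
    if mem_pct > 80:
        return "Phase 1: raise memory limits"
    if disk_pct > 80:
        return "Clean Docker images, tune PG WAL retention, add log rotation"
    return "No scaling needed"

print(scaling_advice(p95_ms=650, backend_cpu=90, workers=4, pg_slow=False, mem_pct=40, disk_pct=30))
# Phase 3: add backend replicas
```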
frontend/package-lock.json (generated, 4 lines changed)

@@ -1,12 +1,12 @@
 {
   "name": "hoa-ledgeriq-frontend",
-  "version": "2026.3.2-beta",
+  "version": "2026.3.7-beta",
   "lockfileVersion": 3,
   "requires": true,
   "packages": {
     "": {
       "name": "hoa-ledgeriq-frontend",
-      "version": "2026.3.2-beta",
+      "version": "2026.3.7-beta",
       "dependencies": {
         "@mantine/core": "^7.15.3",
         "@mantine/dates": "^7.15.3",
frontend/package.json

@@ -1,6 +1,6 @@
 {
   "name": "hoa-ledgeriq-frontend",
-  "version": "2026.3.2-beta",
+  "version": "2026.3.7-beta",
   "private": true,
   "type": "module",
   "scripts": {
AppLayout.tsx

@@ -1,5 +1,5 @@
 import { useState, useEffect } from 'react';
-import { AppShell, Burger, Group, Text, Menu, UnstyledButton, Avatar, Alert, Button } from '@mantine/core';
+import { AppShell, Burger, Group, Text, Menu, UnstyledButton, Avatar, Alert, Button, ActionIcon, Tooltip } from '@mantine/core';
 import { useDisclosure } from '@mantine/hooks';
 import {
   IconLogout,
@@ -9,9 +9,12 @@ import {
   IconUserCog,
   IconUsersGroup,
   IconEyeOff,
+  IconSun,
+  IconMoon,
 } from '@tabler/icons-react';
 import { Outlet, useNavigate, useLocation } from 'react-router-dom';
 import { useAuthStore } from '../../stores/authStore';
+import { usePreferencesStore } from '../../stores/preferencesStore';
 import { Sidebar } from './Sidebar';
 import { AppTour } from '../onboarding/AppTour';
 import { OnboardingWizard } from '../onboarding/OnboardingWizard';
@@ -20,6 +23,7 @@ import logoSrc from '../../assets/logo.svg';
 export function AppLayout() {
   const [opened, { toggle, close }] = useDisclosure();
   const { user, currentOrg, logout, impersonationOriginal, stopImpersonation } = useAuthStore();
+  const { colorScheme, toggleColorScheme } = usePreferencesStore();
   const navigate = useNavigate();
   const location = useLocation();
   const isImpersonating = !!impersonationOriginal;
@@ -108,6 +112,16 @@ export function AppLayout() {
           {currentOrg && (
             <Text size="sm" c="dimmed">{currentOrg.name}</Text>
           )}
+          <Tooltip label={colorScheme === 'dark' ? 'Light mode' : 'Dark mode'}>
+            <ActionIcon
+              variant="default"
+              size="lg"
+              onClick={toggleColorScheme}
+              aria-label="Toggle color scheme"
+            >
+              {colorScheme === 'dark' ? <IconSun size={18} /> : <IconMoon size={18} />}
+            </ActionIcon>
+          </Tooltip>
           <Menu shadow="md" width={220}>
             <Menu.Target>
               <UnstyledButton>
main.tsx

@@ -10,6 +10,7 @@ import '@mantine/dates/styles.css';
 import '@mantine/notifications/styles.css';
 import { App } from './App';
 import { theme } from './theme/theme';
+import { usePreferencesStore } from './stores/preferencesStore';
 
 const queryClient = new QueryClient({
   defaultOptions: {
@@ -21,9 +22,11 @@ const queryClient = new QueryClient({
   },
 });
 
-ReactDOM.createRoot(document.getElementById('root')!).render(
-  <React.StrictMode>
-    <MantineProvider theme={theme}>
+function Root() {
+  const colorScheme = usePreferencesStore((s) => s.colorScheme);
+
+  return (
+    <MantineProvider theme={theme} forceColorScheme={colorScheme}>
       <Notifications position="top-right" />
       <ModalsProvider>
         <QueryClientProvider client={queryClient}>
@@ -33,5 +36,11 @@ ReactDOM.createRoot(document.getElementById('root')!).render(
       </QueryClientProvider>
     </ModalsProvider>
     </MantineProvider>
+  );
+}
+
+ReactDOM.createRoot(document.getElementById('root')!).render(
+  <React.StrictMode>
+    <Root />
   </React.StrictMode>,
 );
AccountsPage.tsx

@@ -587,7 +587,7 @@ export function AccountsPage() {
         {investments.filter(i => i.is_active).length > 0 && (
           <>
             <Divider label="Investment Accounts" labelPosition="center" my="xs" />
-            <InvestmentMiniTable investments={investments.filter(i => i.is_active)} onEdit={handleEditInvestment} />
+            <InvestmentMiniTable investments={investments.filter(i => i.is_active)} onEdit={handleEditInvestment} isReadOnly={isReadOnly} />
           </>
         )}
       </Stack>
@@ -605,7 +605,7 @@ export function AccountsPage() {
         {operatingInvestments.length > 0 && (
           <>
             <Divider label="Operating Investment Accounts" labelPosition="center" my="xs" />
-            <InvestmentMiniTable investments={operatingInvestments} onEdit={handleEditInvestment} />
+            <InvestmentMiniTable investments={operatingInvestments} onEdit={handleEditInvestment} isReadOnly={isReadOnly} />
           </>
         )}
       </Stack>
@@ -623,7 +623,7 @@ export function AccountsPage() {
         {reserveInvestments.length > 0 && (
           <>
             <Divider label="Reserve Investment Accounts" labelPosition="center" my="xs" />
-            <InvestmentMiniTable investments={reserveInvestments} onEdit={handleEditInvestment} />
+            <InvestmentMiniTable investments={reserveInvestments} onEdit={handleEditInvestment} isReadOnly={isReadOnly} />
           </>
         )}
       </Stack>
@@ -1087,9 +1087,11 @@ function AccountTable({
 function InvestmentMiniTable({
   investments,
   onEdit,
+  isReadOnly = false,
 }: {
   investments: Investment[];
   onEdit: (inv: Investment) => void;
+  isReadOnly?: boolean;
 }) {
   const totalPrincipal = investments.reduce((s, i) => s + parseFloat(i.principal || '0'), 0);
   const totalValue = investments.reduce(
@@ -1132,7 +1134,7 @@ function InvestmentMiniTable({
           <Table.Th ta="right">Maturity Value</Table.Th>
           <Table.Th>Maturity Date</Table.Th>
           <Table.Th ta="right">Days Remaining</Table.Th>
-          <Table.Th></Table.Th>
+          {!isReadOnly && <Table.Th></Table.Th>}
         </Table.Tr>
       </Table.Thead>
       <Table.Tbody>
@@ -1182,13 +1184,15 @@ function InvestmentMiniTable({
                 '-'
               )}
             </Table.Td>
-            <Table.Td>
-              <Tooltip label="Edit investment">
-                <ActionIcon variant="subtle" onClick={() => onEdit(inv)}>
-                  <IconEdit size={16} />
-                </ActionIcon>
-              </Tooltip>
-            </Table.Td>
+            {!isReadOnly && (
+              <Table.Td>
+                <Tooltip label="Edit investment">
+                  <ActionIcon variant="subtle" onClick={() => onEdit(inv)}>
+                    <IconEdit size={16} />
+                  </ActionIcon>
+                </Tooltip>
+              </Table.Td>
+            )}
           </Table.Tr>
         ))}
       </Table.Tbody>
AssessmentGroupsPage.tsx

@@ -2,6 +2,7 @@ import { useState } from 'react';
 import {
   Title, Text, Card, Table, SimpleGrid, Group, Stack, Badge, Loader, Center,
   ThemeIcon, Button, Modal, TextInput, NumberInput, Textarea, Select, ActionIcon, Tooltip,
+  MultiSelect,
 } from '@mantine/core';
 import { useForm } from '@mantine/form';
 import { useDisclosure } from '@mantine/hooks';
@@ -21,6 +22,8 @@ interface AssessmentGroup {
   special_assessment: string;
   unit_count: number;
   frequency: string;
+  due_months: number[];
+  due_day: number;
   actual_unit_count: string;
   monthly_operating_income: string;
   monthly_reserve_income: string;
@@ -49,6 +52,29 @@ const frequencyColors: Record<string, string> = {
   annual: 'violet',
 };
 
+const MONTH_OPTIONS = [
+  { value: '1', label: 'January' },
+  { value: '2', label: 'February' },
+  { value: '3', label: 'March' },
+  { value: '4', label: 'April' },
+  { value: '5', label: 'May' },
+  { value: '6', label: 'June' },
+  { value: '7', label: 'July' },
+  { value: '8', label: 'August' },
+  { value: '9', label: 'September' },
+  { value: '10', label: 'October' },
+  { value: '11', label: 'November' },
+  { value: '12', label: 'December' },
+];
+
+const MONTH_ABBREV = ['', 'Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun', 'Jul', 'Aug', 'Sep', 'Oct', 'Nov', 'Dec'];
+
+const DEFAULT_DUE_MONTHS: Record<string, string[]> = {
+  monthly: ['1','2','3','4','5','6','7','8','9','10','11','12'],
+  quarterly: ['1','4','7','10'],
+  annual: ['1'],
+};
+
 export function AssessmentGroupsPage() {
   const [opened, { open, close }] = useDisclosure(false);
   const [editing, setEditing] = useState<AssessmentGroup | null>(null);
@@ -73,18 +99,31 @@ export function AssessmentGroupsPage() {
       specialAssessment: 0,
       unitCount: 0,
       frequency: 'monthly',
+      dueMonths: DEFAULT_DUE_MONTHS.monthly,
+      dueDay: 1,
     },
     validate: {
       name: (v) => (v.length > 0 ? null : 'Required'),
       regularAssessment: (v) => (v >= 0 ? null : 'Must be >= 0'),
+      dueMonths: (v, values) => {
+        if (values.frequency === 'quarterly' && v.length !== 4) return 'Quarterly requires exactly 4 months';
+        if (values.frequency === 'annual' && v.length !== 1) return 'Annual requires exactly 1 month';
+        return null;
+      },
+      dueDay: (v) => (v >= 1 && v <= 28 ? null : 'Must be 1-28'),
     },
   });
 
   const saveMutation = useMutation({
-    mutationFn: (values: any) =>
-      editing
-        ? api.put(`/assessment-groups/${editing.id}`, values)
-        : api.post('/assessment-groups', values),
+    mutationFn: (values: any) => {
+      const payload = {
+        ...values,
+        dueMonths: values.dueMonths.map(Number),
+      };
+      return editing
+        ? api.put(`/assessment-groups/${editing.id}`, payload)
+        : api.post('/assessment-groups', payload);
+    },
     onSuccess: () => {
       queryClient.invalidateQueries({ queryKey: ['assessment-groups'] });
       queryClient.invalidateQueries({ queryKey: ['assessment-groups-summary'] });
@@ -121,6 +160,9 @@ export function AssessmentGroupsPage() {
 
   const handleEdit = (group: AssessmentGroup) => {
     setEditing(group);
+    const dueMonths = group.due_months
+      ? group.due_months.map(String)
+      : DEFAULT_DUE_MONTHS[group.frequency] || DEFAULT_DUE_MONTHS.monthly;
     form.setValues({
       name: group.name,
       description: group.description || '',
@@ -128,6 +170,8 @@ export function AssessmentGroupsPage() {
       specialAssessment: parseFloat(group.special_assessment || '0'),
       unitCount: group.unit_count || 0,
       frequency: group.frequency || 'monthly',
+      dueMonths,
+      dueDay: group.due_day || 1,
     });
     open();
   };
@@ -138,6 +182,12 @@ export function AssessmentGroupsPage() {
     open();
   };
 
+  const handleFrequencyChange = (value: string | null) => {
+    if (!value) return;
+    form.setFieldValue('frequency', value);
+    form.setFieldValue('dueMonths', DEFAULT_DUE_MONTHS[value] || DEFAULT_DUE_MONTHS.monthly);
+  };
+
   const fmt = (v: string | number) =>
     parseFloat(String(v || '0')).toLocaleString('en-US', { style: 'currency', currency: 'USD' });
 
@@ -149,6 +199,11 @@ export function AssessmentGroupsPage() {
     }
   };
 
+  const formatDueMonths = (months: number[], frequency: string) => {
+    if (!months || frequency === 'monthly') return 'Every month';
+    return months.map((m) => MONTH_ABBREV[m]).join(', ');
+  };
+
   if (isLoading) return <Center h={300}><Loader /></Center>;
 
   return (
@@ -219,6 +274,7 @@ export function AssessmentGroupsPage() {
           <Table.Th>Group Name</Table.Th>
           <Table.Th ta="center">Units</Table.Th>
           <Table.Th>Frequency</Table.Th>
+          <Table.Th>Due Months</Table.Th>
           <Table.Th ta="right">Regular Assessment</Table.Th>
           <Table.Th ta="right">Special Assessment</Table.Th>
           <Table.Th ta="right">Monthly Equiv.</Table.Th>
@@ -229,7 +285,7 @@ export function AssessmentGroupsPage() {
       <Table.Tbody>
         {groups.length === 0 && (
           <Table.Tr>
-            <Table.Td colSpan={8}>
+            <Table.Td colSpan={9}>
               <Text ta="center" c="dimmed" py="lg">
                 No assessment groups yet. Create groups like "Single Family Homes", "Condos", etc.
               </Text>
@@ -263,6 +319,9 @@ export function AssessmentGroupsPage() {
                 {frequencyLabels[g.frequency] || 'Monthly'}
               </Badge>
             </Table.Td>
+            <Table.Td>
+              <Text size="xs" c="dimmed">{formatDueMonths(g.due_months, g.frequency)}</Text>
+            </Table.Td>
             <Table.Td ta="right" ff="monospace">
               {fmt(g.regular_assessment)}{freqSuffix(g.frequency)}
             </Table.Td>
@@ -322,8 +381,22 @@ export function AssessmentGroupsPage() {
             { value: 'quarterly', label: 'Quarterly' },
             { value: 'annual', label: 'Annual' },
           ]}
-          {...form.getInputProps('frequency')}
+          value={form.values.frequency}
+          onChange={handleFrequencyChange}
         />
+        {form.values.frequency !== 'monthly' && (
+          <MultiSelect
+            label={form.values.frequency === 'quarterly' ? 'Billing Quarters (select 4 months)' : 'Due Month'}
+            description={form.values.frequency === 'quarterly'
+              ? 'Select the first month of each quarter when assessments are due'
+              : 'Select the month when the annual assessment is due'}
+            data={MONTH_OPTIONS}
+            value={form.values.dueMonths}
+            onChange={(v) => form.setFieldValue('dueMonths', v)}
+            error={form.errors.dueMonths}
+            maxValues={form.values.frequency === 'annual' ? 1 : 4}
+          />
+        )}
         <Group grow>
           <NumberInput
             label={`Regular Assessment (per unit${freqSuffix(form.values.frequency)})`}
@@ -340,7 +413,16 @@ export function AssessmentGroupsPage() {
             {...form.getInputProps('specialAssessment')}
           />
         </Group>
-        <NumberInput label="Expected Unit Count" min={0} {...form.getInputProps('unitCount')} />
+        <Group grow>
+          <NumberInput label="Expected Unit Count" min={0} {...form.getInputProps('unitCount')} />
+          <NumberInput
+            label="Due Day of Month"
+            description="Day invoices are due (1-28)"
+            min={1}
+            max={28}
+            {...form.getInputProps('dueDay')}
+          />
+        </Group>
         <Button type="submit" loading={saveMutation.isPending}>
           {editing ? 'Update' : 'Create'}
         </Button>
CapitalProjectsPage.tsx

@@ -72,9 +72,10 @@ interface KanbanCardProps {
   project: Project;
   onEdit: (p: Project) => void;
   onDragStart: (e: DragEvent<HTMLDivElement>, project: Project) => void;
+  isReadOnly?: boolean;
 }
 
-function KanbanCard({ project, onEdit, onDragStart }: KanbanCardProps) {
+function KanbanCard({ project, onEdit, onDragStart, isReadOnly }: KanbanCardProps) {
   const plannedLabel = formatPlannedDate(project.planned_date);
   // For projects in the Future bucket with a specific year, show the year
   const currentYear = new Date().getFullYear();
@@ -86,21 +87,23 @@ function KanbanCard({ project, onEdit, onDragStart, isReadOnly }: KanbanCardProps) {
       padding="sm"
       radius="md"
       withBorder
-      draggable
-      onDragStart={(e) => onDragStart(e, project)}
-      style={{ cursor: 'grab', userSelect: 'none' }}
+      draggable={!isReadOnly}
+      onDragStart={!isReadOnly ? (e) => onDragStart(e, project) : undefined}
+      style={{ cursor: isReadOnly ? 'default' : 'grab', userSelect: 'none' }}
       mb="xs"
     >
       <Group justify="space-between" wrap="nowrap" mb={4}>
         <Group gap={6} wrap="nowrap" style={{ overflow: 'hidden' }}>
-          <IconGripVertical size={14} style={{ flexShrink: 0, color: 'var(--mantine-color-dimmed)' }} />
+          {!isReadOnly && <IconGripVertical size={14} style={{ flexShrink: 0, color: 'var(--mantine-color-dimmed)' }} />}
           <Text fw={600} size="sm" truncate>
             {project.name}
           </Text>
         </Group>
-        <ActionIcon variant="subtle" size="sm" onClick={() => onEdit(project)}>
-          <IconEdit size={14} />
-        </ActionIcon>
+        {!isReadOnly && (
+          <ActionIcon variant="subtle" size="sm" onClick={() => onEdit(project)}>
+            <IconEdit size={14} />
+          </ActionIcon>
+        )}
       </Group>
 
       <Group gap={6} mb={6}>
@@ -148,11 +151,12 @@ interface KanbanColumnProps {
   isDragOver: boolean;
   onDragOverHandler: (e: DragEvent<HTMLDivElement>, year: number) => void;
   onDragLeave: () => void;
+  isReadOnly?: boolean;
 }
 
 function KanbanColumn({
   year, projects, onEdit, onDragStart, onDrop,
-  isDragOver, onDragOverHandler, onDragLeave,
+  isDragOver, onDragOverHandler, onDragLeave, isReadOnly,
 }: KanbanColumnProps) {
   const totalEst = projects.reduce((s, p) => s + parseFloat(p.estimated_cost || '0'), 0);
   const isFuture = year === FUTURE_YEAR;
@@ -178,9 +182,9 @@ function KanbanColumn({
         border: isDragOver ? '2px dashed var(--mantine-color-blue-4)' : undefined,
         transition: 'background-color 150ms ease, border 150ms ease',
       }}
-      onDragOver={(e) => onDragOverHandler(e, year)}
-      onDragLeave={onDragLeave}
-      onDrop={(e) => onDrop(e, year)}
+      onDragOver={!isReadOnly ? (e) => onDragOverHandler(e, year) : undefined}
+      onDragLeave={!isReadOnly ? onDragLeave : undefined}
+      onDrop={!isReadOnly ? (e) => onDrop(e, year) : undefined}
     >
       <Group justify="space-between" mb="sm">
         <Title order={5}>{yearLabel(year)}</Title>
@@ -199,7 +203,7 @@ function KanbanColumn({
       <Box style={{ flex: 1, minHeight: 60 }}>
         {projects.length === 0 ? (
           <Text size="xs" c="dimmed" ta="center" py="lg">
-            Drop projects here
+            {isReadOnly ? 'No projects' : 'Drop projects here'}
           </Text>
         ) : useWideLayout ? (
           <div style={{
@@ -208,12 +212,12 @@ function KanbanColumn({
             gap: 'var(--mantine-spacing-xs)',
           }}>
             {projects.map((p) => (
-              <KanbanCard key={p.id} project={p} onEdit={onEdit} onDragStart={onDragStart} />
+              <KanbanCard key={p.id} project={p} onEdit={onEdit} onDragStart={onDragStart} isReadOnly={isReadOnly} />
             ))}
           </div>
         ) : (
           projects.map((p) => (
-            <KanbanCard key={p.id} project={p} onEdit={onEdit} onDragStart={onDragStart} />
+            <KanbanCard key={p.id} project={p} onEdit={onEdit} onDragStart={onDragStart} isReadOnly={isReadOnly} />
          ))
        )}
      </Box>
@@ -595,6 +599,7 @@ export function CapitalProjectsPage() {
               isDragOver={dragOverYear === year}
               onDragOverHandler={handleDragOver}
               onDragLeave={handleDragLeave}
+              isReadOnly={isReadOnly}
             />
           );
         })}
@@ -16,9 +16,9 @@ import {
   IconRefresh,
   IconInfoCircle,
 } from '@tabler/icons-react';
-import { useState } from 'react';
+import { useState, useCallback } from 'react';
-import { useQuery, useMutation, useQueryClient } from '@tanstack/react-query';
+import { useQuery, useQueryClient } from '@tanstack/react-query';
-import { useAuthStore } from '../../stores/authStore';
+import { useAuthStore, useIsReadOnly } from '../../stores/authStore';
 import api from '../../services/api';
 
 interface HealthScore {
@@ -311,11 +311,12 @@ interface DashboardData {
 
 export function DashboardPage() {
   const currentOrg = useAuthStore((s) => s.currentOrg);
+  const isReadOnly = useIsReadOnly();
   const queryClient = useQueryClient();
 
-  // Track whether last refresh attempt failed (per score type)
+  // Track whether a refresh is in progress (per score type) for async polling
-  const [operatingFailed, setOperatingFailed] = useState(false);
+  const [operatingRefreshing, setOperatingRefreshing] = useState(false);
-  const [reserveFailed, setReserveFailed] = useState(false);
+  const [reserveRefreshing, setReserveRefreshing] = useState(false);
 
   const { data, isLoading } = useQuery<DashboardData>({
     queryKey: ['dashboard'],
@@ -327,33 +328,66 @@
     queryKey: ['health-scores'],
     queryFn: async () => { const { data } = await api.get('/health-scores/latest'); return data; },
     enabled: !!currentOrg,
+    // Poll every 3 seconds while a refresh is in progress
+    refetchInterval: (operatingRefreshing || reserveRefreshing) ? 3000 : false,
   });
 
-  // Separate mutations for each score type
+  // Async refresh handlers — trigger the backend and poll for results
-  const recalcOperatingMutation = useMutation({
+  const handleRefreshOperating = useCallback(async () => {
-    mutationFn: () => api.post('/health-scores/calculate/operating'),
+    const prevId = healthScores?.operating?.id;
-    onSuccess: () => {
+    setOperatingRefreshing(true);
-      setOperatingFailed(false);
+    try {
-      queryClient.invalidateQueries({ queryKey: ['health-scores'] });
+      await api.post('/health-scores/calculate/operating');
-    },
+    } catch {
-    onError: () => {
+      // Trigger failed at network level — polling will pick up any backend-saved error
-      setOperatingFailed(true);
+    }
-      // Still refresh to get whatever the backend saved (could be cached data)
+    // Start polling — watch for the health score to change (new id or updated timestamp)
-      queryClient.invalidateQueries({ queryKey: ['health-scores'] });
+    const pollUntilDone = () => {
-    },
+      const checkInterval = setInterval(async () => {
-  });
+        try {
+          const { data: latest } = await api.get('/health-scores/latest');
+          const newScore = latest?.operating;
+          if (newScore && newScore.id !== prevId) {
+            setOperatingRefreshing(false);
+            queryClient.setQueryData(['health-scores'], latest);
+            clearInterval(checkInterval);
+          }
+        } catch {
+          // Keep polling
+        }
+      }, 3000);
+      // Safety timeout — stop polling after 11 minutes
+      setTimeout(() => { clearInterval(checkInterval); setOperatingRefreshing(false); }, 660000);
+    };
+    pollUntilDone();
+  }, [healthScores?.operating?.id, queryClient]);
 
-  const recalcReserveMutation = useMutation({
+  const handleRefreshReserve = useCallback(async () => {
-    mutationFn: () => api.post('/health-scores/calculate/reserve'),
+    const prevId = healthScores?.reserve?.id;
-    onSuccess: () => {
+    setReserveRefreshing(true);
-      setReserveFailed(false);
+    try {
-      queryClient.invalidateQueries({ queryKey: ['health-scores'] });
+      await api.post('/health-scores/calculate/reserve');
-    },
+    } catch {
-    onError: () => {
+      // Trigger failed at network level
-      setReserveFailed(true);
+    }
-      queryClient.invalidateQueries({ queryKey: ['health-scores'] });
+    const pollUntilDone = () => {
-    },
+      const checkInterval = setInterval(async () => {
-  });
+        try {
+          const { data: latest } = await api.get('/health-scores/latest');
+          const newScore = latest?.reserve;
+          if (newScore && newScore.id !== prevId) {
+            setReserveRefreshing(false);
+            queryClient.setQueryData(['health-scores'], latest);
+            clearInterval(checkInterval);
+          }
+        } catch {
+          // Keep polling
+        }
+      }, 3000);
+      setTimeout(() => { clearInterval(checkInterval); setReserveRefreshing(false); }, 660000);
+    };
+    pollUntilDone();
+  }, [healthScores?.reserve?.id, queryClient]);
 
   const fmt = (v: string | number) =>
     parseFloat(String(v || '0')).toLocaleString('en-US', { style: 'currency', currency: 'USD' });
@@ -381,7 +415,6 @@ export function DashboardPage() {
             <Center h={200}><Loader /></Center>
           ) : (
             <>
-              <Text size="sm" fw={600} c="dimmed">AI Health Scores</Text>
               <SimpleGrid cols={{ base: 1, md: 2 }}>
                 <HealthScoreCard
                   score={healthScores?.operating || null}
@@ -391,9 +424,9 @@
                       <IconHeartbeat size={20} />
                     </ThemeIcon>
                   }
-                  isRefreshing={recalcOperatingMutation.isPending}
+                  isRefreshing={operatingRefreshing}
-                  onRefresh={() => recalcOperatingMutation.mutate()}
+                  onRefresh={!isReadOnly ? handleRefreshOperating : undefined}
-                  lastFailed={operatingFailed || !!healthScores?.operating_last_failed}
+                  lastFailed={!!healthScores?.operating_last_failed}
                 />
                 <HealthScoreCard
                   score={healthScores?.reserve || null}
@@ -403,9 +436,9 @@
                       <IconHeartbeat size={20} />
                     </ThemeIcon>
                   }
-                  isRefreshing={recalcReserveMutation.isPending}
+                  isRefreshing={reserveRefreshing}
-                  onRefresh={() => recalcReserveMutation.mutate()}
+                  onRefresh={!isReadOnly ? handleRefreshReserve : undefined}
-                  lastFailed={reserveFailed || !!healthScores?.reserve_last_failed}
+                  lastFailed={!!healthScores?.reserve_last_failed}
                 />
               </SimpleGrid>
 
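The refresh handlers above trigger the backend calculation and then poll `/health-scores/latest` until the score id changes, with a safety timeout. Stripped of React specifics, that pattern can be sketched generically; `pollUntil` is a hypothetical helper, not part of the codebase:

```typescript
// Poll `fetch` every `intervalMs` until `isDone` accepts the result or
// `timeoutMs` elapses; transient fetch errors are swallowed and polling
// continues, mirroring the "Keep polling" catch blocks above.
async function pollUntil<T>(
  fetch: () => Promise<T>,
  isDone: (value: T) => boolean,
  intervalMs: number,
  timeoutMs: number,
): Promise<T | null> {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    try {
      const value = await fetch();
      if (isDone(value)) return value;
    } catch {
      // transient error — keep polling
    }
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  return null; // safety timeout reached, like the 11-minute setTimeout above
}
```

Comparing the previous score id against the newly fetched one (rather than trusting a "done" flag) is what lets the dashboard detect completion even when the backend saves an error result.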
@@ -1,4 +1,4 @@
-import { useState, useEffect } from 'react';
+import { useState, useEffect, useCallback } from 'react';
 import {
   Title,
   Text,
@@ -33,9 +33,10 @@ import {
   IconChevronDown,
   IconChevronUp,
 } from '@tabler/icons-react';
-import { useQuery, useMutation } from '@tanstack/react-query';
+import { useQuery } from '@tanstack/react-query';
 import { notifications } from '@mantine/notifications';
 import api from '../../services/api';
+import { useIsReadOnly } from '../../stores/authStore';
 
 // ── Types ──
 
@@ -107,6 +108,9 @@ interface SavedRecommendation {
   risk_notes: string[];
   response_time_ms: number;
   created_at: string;
+  status: 'processing' | 'complete' | 'error';
+  last_failed: boolean;
+  error_message?: string;
 }
 
 // ── Helpers ──
@@ -181,14 +185,29 @@ function RateTable({ rates, showTerm }: { rates: MarketRate[]; showTerm: boolean
 
 // ── Recommendations Display Component ──
 
-function RecommendationsDisplay({ aiResult, lastUpdated }: { aiResult: AIResponse; lastUpdated?: string }) {
+function RecommendationsDisplay({
+  aiResult,
+  lastUpdated,
+  lastFailed,
+}: {
+  aiResult: AIResponse;
+  lastUpdated?: string;
+  lastFailed?: boolean;
+}) {
   return (
     <Stack>
-      {/* Last Updated timestamp */}
+      {/* Last Updated timestamp + failure message */}
       {lastUpdated && (
-        <Text size="xs" c="dimmed" ta="right">
+        <Stack gap={0} align="flex-end">
-          Last updated: {new Date(lastUpdated).toLocaleString()}
+          <Text size="xs" c="dimmed" ta="right">
-        </Text>
+            Last updated: {new Date(lastUpdated).toLocaleString()}
+          </Text>
+          {lastFailed && (
+            <Text size="10px" c="orange" fw={500} style={{ opacity: 0.85 }}>
+              last analysis failed — showing cached data
+            </Text>
+          )}
+        </Stack>
       )}
 
       {/* Overall Assessment */}
@@ -327,9 +346,9 @@ function RecommendationsDisplay({ aiResult, lastUpdated }: { aiResult: AIRespons
 // ── Main Component ──
 
 export function InvestmentPlanningPage() {
-  const [aiResult, setAiResult] = useState<AIResponse | null>(null);
-  const [lastUpdated, setLastUpdated] = useState<string | null>(null);
   const [ratesExpanded, setRatesExpanded] = useState(true);
+  const [isTriggering, setIsTriggering] = useState(false);
+  const isReadOnly = useIsReadOnly();
 
   // Load financial snapshot on mount
   const { data: snapshot, isLoading: snapshotLoading } = useQuery<FinancialSnapshot>({
@@ -349,50 +368,86 @@ export function InvestmentPlanningPage() {
     },
   });
 
-  // Load saved recommendation on mount
+  // Load saved recommendation — polls every 3s when processing
   const { data: savedRec } = useQuery<SavedRecommendation | null>({
     queryKey: ['investment-planning-saved-recommendation'],
     queryFn: async () => {
       const { data } = await api.get('/investment-planning/saved-recommendation');
       return data;
     },
+    refetchInterval: (query) => {
+      const rec = query.state.data;
+      // Poll every 3 seconds while processing
+      if (rec?.status === 'processing') return 3000;
+      // Also poll if we just triggered (status may not be 'processing' yet)
+      if (isTriggering) return 3000;
+      return false;
+    },
   });
 
-  // Populate AI results from saved recommendation on load
+  // Derive display state from saved recommendation
-  useEffect(() => {
+  const isProcessing = savedRec?.status === 'processing' || isTriggering;
-    if (savedRec && !aiResult) {
+  const lastFailed = savedRec?.last_failed || false;
-      setAiResult({
+  const hasResults = savedRec && savedRec.status === 'complete' && savedRec.recommendations.length > 0;
-        recommendations: savedRec.recommendations,
+  const hasError = savedRec?.status === 'error' && !savedRec?.recommendations?.length;
-        overall_assessment: savedRec.overall_assessment,
-        risk_notes: savedRec.risk_notes,
-      });
-      setLastUpdated(savedRec.created_at);
-    }
-  }, [savedRec]); // eslint-disable-line react-hooks/exhaustive-deps
 
-  // AI recommendation (on-demand)
+  // Clear triggering flag once backend confirms processing or completes
-  const aiMutation = useMutation({
+  useEffect(() => {
-    mutationFn: async () => {
+    if (isTriggering && savedRec?.status === 'processing') {
-      const { data } = await api.post('/investment-planning/recommendations', {}, { timeout: 300000 });
+      setIsTriggering(false);
-      return data as AIResponse;
+    }
-    },
+    if (isTriggering && savedRec?.status === 'complete') {
-    onSuccess: (data) => {
+      setIsTriggering(false);
-      setAiResult(data);
+    }
-      setLastUpdated(new Date().toISOString());
+  }, [savedRec?.status, isTriggering]);
-      if (data.recommendations.length > 0) {
 
-        notifications.show({
+  // Show notification when processing completes (transition from processing)
-          message: `Generated ${data.recommendations.length} investment recommendations`,
+  const prevStatusRef = useState<string | null>(null);
-          color: 'green',
+  useEffect(() => {
-        });
+    const [prevStatus, setPrevStatus] = prevStatusRef;
-      }
+    if (prevStatus === 'processing' && savedRec?.status === 'complete') {
-    },
-    onError: (err: any) => {
       notifications.show({
-        message: err.response?.data?.message || 'Failed to get AI recommendations',
+        message: `Generated ${savedRec.recommendations.length} investment recommendations`,
+        color: 'green',
+      });
+    }
+    if (prevStatus === 'processing' && savedRec?.status === 'error') {
+      notifications.show({
+        message: savedRec.error_message || 'AI recommendation analysis failed',
         color: 'red',
       });
-    },
+    }
-  });
+    setPrevStatus(savedRec?.status || null);
+  }, [savedRec?.status]); // eslint-disable-line react-hooks/exhaustive-deps
 
+  // Trigger AI recommendations (async — returns immediately)
+  const handleTriggerAI = useCallback(async () => {
+    setIsTriggering(true);
+    try {
+      await api.post('/investment-planning/recommendations');
+    } catch (err: any) {
+      setIsTriggering(false);
+      notifications.show({
+        message: err.response?.data?.message || 'Failed to start AI analysis',
+        color: 'red',
+      });
+    }
+  }, []);
 
+  // Build AI result from saved recommendation for display
+  const aiResult: AIResponse | null = hasResults
+    ? {
+        recommendations: savedRec!.recommendations,
+        overall_assessment: savedRec!.overall_assessment,
+        risk_notes: savedRec!.risk_notes,
+      }
+    : (lastFailed && savedRec?.recommendations?.length)
+      ? {
+          recommendations: savedRec!.recommendations,
+          overall_assessment: savedRec!.overall_assessment,
+          risk_notes: savedRec!.risk_notes,
+        }
+      : null;
 
   if (snapshotLoading) {
     return (
@@ -643,19 +698,21 @@ export function InvestmentPlanningPage() {
               </Text>
             </div>
           </Group>
-          <Button
+          {!isReadOnly && (
-            leftSection={<IconSparkles size={16} />}
+            <Button
-            onClick={() => aiMutation.mutate()}
+              leftSection={<IconSparkles size={16} />}
-            loading={aiMutation.isPending}
+              onClick={handleTriggerAI}
-            variant="gradient"
+              loading={isProcessing}
-            gradient={{ from: 'grape', to: 'violet' }}
+              variant="gradient"
-          >
+              gradient={{ from: 'grape', to: 'violet' }}
-            {aiResult ? 'Refresh Recommendations' : 'Get AI Recommendations'}
+            >
-          </Button>
+              {aiResult ? 'Refresh Recommendations' : 'Get AI Recommendations'}
+            </Button>
+          )}
        </Group>
 
-        {/* Loading State */}
+        {/* Processing State */}
-        {aiMutation.isPending && (
+        {isProcessing && (
          <Center py="xl">
            <Stack align="center" gap="sm">
              <Loader size="lg" type="dots" />
@@ -663,19 +720,32 @@
                Analyzing your financial data and market rates...
              </Text>
              <Text c="dimmed" size="xs">
-                This may take a few minutes for complex tenant data
+                You can navigate away — results will appear when ready
              </Text>
            </Stack>
          </Center>
        )}
 
-        {/* Results */}
+        {/* Error State (no cached data) */}
-        {aiResult && !aiMutation.isPending && (
+        {hasError && !isProcessing && (
-          <RecommendationsDisplay aiResult={aiResult} lastUpdated={lastUpdated || undefined} />
+          <Alert color="red" variant="light" title="Analysis Failed" mb="md">
+            <Text size="sm">
+              {savedRec?.error_message || 'The last AI analysis failed. Please try again.'}
+            </Text>
+          </Alert>
+        )}
 
+        {/* Results (with optional failure watermark) */}
+        {aiResult && !isProcessing && (
+          <RecommendationsDisplay
+            aiResult={aiResult}
+            lastUpdated={savedRec?.created_at || undefined}
+            lastFailed={lastFailed}
+          />
        )}
 
        {/* Empty State */}
-        {!aiResult && !aiMutation.isPending && (
+        {!aiResult && !isProcessing && !hasError && (
          <Paper p="xl" radius="sm" style={{ textAlign: 'center' }}>
            <ThemeIcon variant="light" color="grape" size={48} mx="auto" mb="md">
              <IconSparkles size={28} />
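The page above replaces local mutation state with state derived entirely from the saved recommendation's `status` field. A simplified, framework-free sketch of that derivation (the type and function names here are illustrative, not the component's actual code):

```typescript
type RecStatus = 'processing' | 'complete' | 'error';

interface Saved {
  status: RecStatus;
  last_failed: boolean;
  recommendations: unknown[];
}

type ViewState = 'processing' | 'results' | 'stale-results' | 'error' | 'empty';

// Map the backend record (plus the local "just triggered" flag) to exactly
// one of the mutually exclusive states the page renders.
function deriveViewState(rec: Saved | null, isTriggering: boolean): ViewState {
  if (isTriggering || rec?.status === 'processing') return 'processing';
  if (rec?.status === 'complete' && rec.recommendations.length > 0) return 'results';
  // A failed run keeps serving the cached recommendations, with a warning
  if (rec?.last_failed && rec !== null && rec.recommendations.length > 0) return 'stale-results';
  if (rec?.status === 'error') return 'error';
  return 'empty';
}
```

Deriving the view from server state on every render is what makes the flow survive navigation: there is nothing to lose when the component unmounts, since the next mount re-fetches and re-derives.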
@@ -1,30 +1,71 @@
-import { useState } from 'react';
+import { useState, useEffect } from 'react';
 import {
   Title, Table, Group, Button, Stack, Text, Badge, Modal,
-  NumberInput, Select, Loader, Center, Card,
+  NumberInput, Select, Loader, Center, Card, Alert,
 } from '@mantine/core';
-import { DateInput } from '@mantine/dates';
 import { useForm } from '@mantine/form';
 import { useDisclosure } from '@mantine/hooks';
 import { notifications } from '@mantine/notifications';
-import { IconFileInvoice, IconSend } from '@tabler/icons-react';
+import { IconSend, IconInfoCircle, IconCheck, IconX } from '@tabler/icons-react';
 import { useQuery, useMutation, useQueryClient } from '@tanstack/react-query';
 import api from '../../services/api';
+import { useIsReadOnly } from '../../stores/authStore';
 
 interface Invoice {
   id: string; invoice_number: string; unit_number: string; unit_id: string;
   invoice_date: string; due_date: string; invoice_type: string;
   description: string; amount: string; amount_paid: string; balance_due: string;
-  status: string;
+  status: string; period_start: string; period_end: string;
+  assessment_group_name: string; frequency: string; owner_name: string;
+}
 
+interface PreviewGroup {
+  id: string;
+  name: string;
+  frequency: string;
+  active_units: number;
+  regular_assessment: string;
+  special_assessment: string;
+  is_billing_month: boolean;
+  total_amount: number;
+  period_description: string;
+}
 
+interface Preview {
+  month: number;
+  year: number;
+  month_name: string;
+  groups: PreviewGroup[];
+  summary: {
+    total_groups_billing: number;
+    total_invoices: number;
+    total_amount: number;
+  };
 }
 
 const statusColors: Record<string, string> = {
-  draft: 'gray', sent: 'blue', paid: 'green', partial: 'yellow', overdue: 'red', void: 'dark',
+  draft: 'gray', pending: 'blue', paid: 'green', partial: 'yellow', overdue: 'red', void: 'dark',
+};
 
+const frequencyColors: Record<string, string> = {
+  monthly: 'blue', quarterly: 'teal', annual: 'violet',
+};
 
+const fmt = (v: string | number) => parseFloat(String(v || '0')).toLocaleString('en-US', { style: 'currency', currency: 'USD' });
 
+/** Extract last name from "First Last" format */
+const getLastName = (ownerName: string | null) => {
+  if (!ownerName) return '-';
+  const parts = ownerName.trim().split(/\s+/);
+  return parts.length > 1 ? parts[parts.length - 1] : ownerName;
 };
 
 export function InvoicesPage() {
   const [bulkOpened, { open: openBulk, close: closeBulk }] = useDisclosure(false);
+  const [preview, setPreview] = useState<Preview | null>(null);
+  const [previewLoading, setPreviewLoading] = useState(false);
   const queryClient = useQueryClient();
+  const isReadOnly = useIsReadOnly();
 
   const { data: invoices = [], isLoading } = useQuery<Invoice[]>({
     queryKey: ['invoices'],
@@ -35,13 +76,36 @@ export function InvoicesPage() {
     initialValues: { month: new Date().getMonth() + 1, year: new Date().getFullYear() },
   });
 
+  // Fetch preview when month/year changes
+  const fetchPreview = async (month: number, year: number) => {
+    setPreviewLoading(true);
+    try {
+      const { data } = await api.post('/invoices/generate-preview', { month, year });
+      setPreview(data);
+    } catch {
+      setPreview(null);
+    }
+    setPreviewLoading(false);
+  };
 
+  useEffect(() => {
+    if (bulkOpened) {
+      fetchPreview(bulkForm.values.month, bulkForm.values.year);
+    }
+  }, [bulkOpened, bulkForm.values.month, bulkForm.values.year]);
 
   const bulkMutation = useMutation({
     mutationFn: (values: any) => api.post('/invoices/generate-bulk', values),
     onSuccess: (res) => {
       queryClient.invalidateQueries({ queryKey: ['invoices'] });
       queryClient.invalidateQueries({ queryKey: ['journal-entries'] });
-      notifications.show({ message: `Generated ${res.data.created} invoices`, color: 'green' });
+      const groupInfo = res.data.groups?.map((g: any) => `${g.group_name}: ${g.invoices_created}`).join(', ') || '';
+      notifications.show({
+        message: `Generated ${res.data.created} invoices${groupInfo ? ` (${groupInfo})` : ''}`,
+        color: 'green',
+      });
       closeBulk();
+      setPreview(null);
     },
     onError: (err: any) => { notifications.show({ message: err.response?.data?.message || 'Error', color: 'red' }); },
   });
@@ -54,8 +118,6 @@ export function InvoicesPage() {
     },
   });
 
-  const fmt = (v: string) => parseFloat(v || '0').toLocaleString('en-US', { style: 'currency', currency: 'USD' });
-
   if (isLoading) return <Center h={300}><Loader /></Center>;
 
   const totalOutstanding = invoices.filter(i => i.status !== 'paid' && i.status !== 'void').reduce((s, i) => s + parseFloat(i.balance_due || '0'), 0);
@@ -64,20 +126,24 @@
     <Stack>
       <Group justify="space-between">
         <Title order={2}>Invoices</Title>
-        <Group>
+        {!isReadOnly && (
-          <Button variant="outline" onClick={() => lateFeesMutation.mutate()} loading={lateFeesMutation.isPending}>Apply Late Fees</Button>
+          <Group>
-          <Button leftSection={<IconSend size={16} />} onClick={openBulk}>Generate Monthly Invoices</Button>
+            <Button variant="outline" onClick={() => lateFeesMutation.mutate()} loading={lateFeesMutation.isPending}>Apply Late Fees</Button>
-        </Group>
+            <Button leftSection={<IconSend size={16} />} onClick={openBulk}>Generate Invoices</Button>
+          </Group>
+        )}
       </Group>
       <Group>
         <Card withBorder p="sm"><Text size="xs" c="dimmed">Total Invoices</Text><Text fw={700}>{invoices.length}</Text></Card>
-        <Card withBorder p="sm"><Text size="xs" c="dimmed">Outstanding</Text><Text fw={700} c="red">{fmt(String(totalOutstanding))}</Text></Card>
+        <Card withBorder p="sm"><Text size="xs" c="dimmed">Outstanding</Text><Text fw={700} c="red">{fmt(totalOutstanding)}</Text></Card>
       </Group>
       <Table striped highlightOnHover>
         <Table.Thead>
           <Table.Tr>
-            <Table.Th>Invoice #</Table.Th><Table.Th>Unit</Table.Th><Table.Th>Date</Table.Th>
+            <Table.Th>Invoice #</Table.Th><Table.Th>Unit</Table.Th><Table.Th>Owner</Table.Th>
-            <Table.Th>Due</Table.Th><Table.Th>Type</Table.Th><Table.Th ta="right">Amount</Table.Th>
+            <Table.Th>Group</Table.Th><Table.Th>Date</Table.Th>
+            <Table.Th>Due</Table.Th><Table.Th>Period</Table.Th>
+            <Table.Th ta="right">Amount</Table.Th>
             <Table.Th ta="right">Paid</Table.Th><Table.Th ta="right">Balance</Table.Th><Table.Th>Status</Table.Th>
           </Table.Tr>
         </Table.Thead>
@@ -86,27 +152,104 @@
           <Table.Tr key={i.id}>
             <Table.Td fw={500}>{i.invoice_number}</Table.Td>
             <Table.Td>{i.unit_number}</Table.Td>
+            <Table.Td>{getLastName(i.owner_name)}</Table.Td>
+            <Table.Td>
+              {i.assessment_group_name ? (
+                <Badge size="sm" variant="light" color={frequencyColors[i.frequency] || 'gray'}>
+                  {i.assessment_group_name}
+                </Badge>
+              ) : (
+                <Badge size="sm" variant="light">{i.invoice_type}</Badge>
+              )}
+            </Table.Td>
             <Table.Td>{new Date(i.invoice_date).toLocaleDateString()}</Table.Td>
             <Table.Td>{new Date(i.due_date).toLocaleDateString()}</Table.Td>
-            <Table.Td><Badge size="sm" variant="light">{i.invoice_type}</Badge></Table.Td>
+            <Table.Td>
+              {i.period_start && i.period_end ? (
+                <Text size="xs" c="dimmed">
+                  {new Date(i.period_start).toLocaleDateString(undefined, { month: 'short', year: 'numeric' })}
+                  {i.period_start !== i.period_end && (
+                    <> - {new Date(i.period_end).toLocaleDateString(undefined, { month: 'short', year: 'numeric' })}</>
+                  )}
+                </Text>
+              ) : (
+                <Text size="xs" c="dimmed">-</Text>
+              )}
+            </Table.Td>
             <Table.Td ta="right" ff="monospace">{fmt(i.amount)}</Table.Td>
             <Table.Td ta="right" ff="monospace">{fmt(i.amount_paid)}</Table.Td>
             <Table.Td ta="right" ff="monospace" fw={500}>{fmt(i.balance_due)}</Table.Td>
             <Table.Td><Badge color={statusColors[i.status] || 'gray'} size="sm">{i.status}</Badge></Table.Td>
           </Table.Tr>
         ))}
-        {invoices.length === 0 && <Table.Tr><Table.Td colSpan={9}><Text ta="center" c="dimmed" py="lg">No invoices yet</Text></Table.Td></Table.Tr>}
+        {invoices.length === 0 && <Table.Tr><Table.Td colSpan={11}><Text ta="center" c="dimmed" py="lg">No invoices yet</Text></Table.Td></Table.Tr>}
         </Table.Tbody>
       </Table>
-      <Modal opened={bulkOpened} onClose={closeBulk} title="Generate Monthly Assessments">
+      <Modal opened={bulkOpened} onClose={() => { closeBulk(); setPreview(null); }} title="Generate Assessments" size="lg">
         <form onSubmit={bulkForm.onSubmit((v) => bulkMutation.mutate(v))}>
           <Stack>
             <Group grow>
               <Select label="Month" data={Array.from({length:12},(_,i)=>({value:String(i+1),label:new Date(2026,i).toLocaleString('default',{month:'long'})}))} value={String(bulkForm.values.month)} onChange={(v)=>bulkForm.setFieldValue('month',Number(v))} />
               <NumberInput label="Year" {...bulkForm.getInputProps('year')} />
             </Group>
-            <Text size="sm" c="dimmed">This will generate invoices for all active units based on their monthly assessment amount.</Text>
-            <Button type="submit" loading={bulkMutation.isPending}>Generate Invoices</Button>
+            {previewLoading && <Center py="md"><Loader size="sm" /></Center>}
 
+            {preview && !previewLoading && (
+              <Stack gap="xs">
+                <Text size="sm" fw={600}>Billing Preview for {preview.month_name} {preview.year}</Text>
 
+                {preview.groups.map((g) => (
+                  <Card key={g.id} withBorder p="xs" style={{ opacity: g.is_billing_month ? 1 : 0.5 }}>
+                    <Group justify="space-between">
+                      <Group gap="xs">
+                        {g.is_billing_month && g.active_units > 0
+                          ? <IconCheck size={16} color="green" />
+                          : <IconX size={16} color="gray" />
+                        }
+                        <div>
+                          <Group gap={6}>
+                            <Text size="sm" fw={500}>{g.name}</Text>
+                            <Badge size="xs" color={frequencyColors[g.frequency]} variant="light">
+                              {g.frequency}
+                            </Badge>
+                          </Group>
+                          <Text size="xs" c="dimmed">
+                            {g.is_billing_month
+                              ? `${g.active_units} units - ${g.period_description}`
+                              : `Not a billing month for this group`
+                            }
+                          </Text>
+                        </div>
+                      </Group>
+                      {g.is_billing_month && (
+                        <Text size="sm" fw={500} ff="monospace">{fmt(g.total_amount)}</Text>
+                      )}
+                    </Group>
+                  </Card>
+                ))}
 
+                {preview.summary.total_invoices > 0 ? (
+                  <Alert icon={<IconInfoCircle size={16} />} color="blue" variant="light">
+                    Will generate {preview.summary.total_invoices} invoices across{' '}
+                    {preview.summary.total_groups_billing} group(s) totaling {fmt(preview.summary.total_amount)}
+                  </Alert>
+                ) : (
+                  <Alert icon={<IconInfoCircle size={16} />} color="yellow" variant="light">
+                    No assessment groups have billing scheduled for {preview.month_name}. No invoices will be generated.
+                  </Alert>
+                )}
+              </Stack>
+            )}
 
+            <Button
+              type="submit"
+              loading={bulkMutation.isPending}
+              disabled={!preview || preview.summary.total_invoices === 0}
+            >
+              Generate {preview?.summary.total_invoices || 0} Invoices
+            </Button>
           </Stack>
         </form>
       </Modal>
@@ -1,13 +1,13 @@
 import { useState } from 'react';
 import {
   Title, Table, Group, Button, Stack, Text, Badge, Modal,
-  NumberInput, Select, TextInput, Loader, Center,
+  NumberInput, Select, TextInput, Loader, Center, ActionIcon, Tooltip,
 } from '@mantine/core';
 import { DateInput } from '@mantine/dates';
 import { useForm } from '@mantine/form';
 import { useDisclosure } from '@mantine/hooks';
 import { notifications } from '@mantine/notifications';
-import { IconPlus } from '@tabler/icons-react';
+import { IconPlus, IconEdit, IconTrash } from '@tabler/icons-react';
 import { useQuery, useMutation, useQueryClient } from '@tanstack/react-query';
 import api from '../../services/api';
 import { useIsReadOnly } from '../../stores/authStore';
@@ -15,11 +15,13 @@ import { useIsReadOnly } from '../../stores/authStore';
 interface Payment {
   id: string; unit_id: string; unit_number: string; invoice_id: string;
   invoice_number: string; payment_date: string; amount: string;
-  payment_method: string; reference_number: string; status: string;
+  payment_method: string; reference_number: string; status: string; notes: string;
 }
 
 export function PaymentsPage() {
   const [opened, { open, close }] = useDisclosure(false);
+  const [editing, setEditing] = useState<Payment | null>(null);
+  const [deleteConfirm, setDeleteConfirm] = useState<Payment | null>(null);
   const queryClient = useQueryClient();
   const isReadOnly = useIsReadOnly();
 
@@ -39,10 +41,18 @@ export function PaymentsPage() {
   const form = useForm({
     initialValues: {
       invoice_id: '', amount: 0, payment_method: 'check',
-      reference_number: '', payment_date: new Date(),
+      reference_number: '', payment_date: new Date(), notes: '',
     },
   });
 
+  const invalidateAll = () => {
+    queryClient.invalidateQueries({ queryKey: ['payments'] });
+    queryClient.invalidateQueries({ queryKey: ['invoices'] });
+    queryClient.invalidateQueries({ queryKey: ['invoices-unpaid'] });
+    queryClient.invalidateQueries({ queryKey: ['accounts'] });
+    queryClient.invalidateQueries({ queryKey: ['journal-entries'] });
+  };
+
   const createMutation = useMutation({
     mutationFn: (values: any) => {
       const inv = invoices.find((i: any) => i.id === values.invoice_id);
@@ -53,22 +63,88 @@ export function PaymentsPage() {
       });
     },
     onSuccess: () => {
-      queryClient.invalidateQueries({ queryKey: ['payments'] });
-      queryClient.invalidateQueries({ queryKey: ['invoices'] });
-      queryClient.invalidateQueries({ queryKey: ['invoices-unpaid'] });
-      queryClient.invalidateQueries({ queryKey: ['accounts'] });
+      invalidateAll();
       notifications.show({ message: 'Payment recorded', color: 'green' });
-      close(); form.reset();
+      close(); setEditing(null); form.reset();
     },
     onError: (err: any) => { notifications.show({ message: err.response?.data?.message || 'Error', color: 'red' }); },
   });
 
+  const updateMutation = useMutation({
+    mutationFn: (values: any) => {
+      return api.put(`/payments/${editing!.id}`, {
+        payment_date: values.payment_date.toISOString().split('T')[0],
+        amount: values.amount,
+        payment_method: values.payment_method,
+        reference_number: values.reference_number,
+        notes: values.notes,
+      });
+    },
+    onSuccess: () => {
+      invalidateAll();
+      notifications.show({ message: 'Payment updated', color: 'green' });
+      close(); setEditing(null); form.reset();
+    },
+    onError: (err: any) => { notifications.show({ message: err.response?.data?.message || 'Error', color: 'red' }); },
+  });
+
+  const deleteMutation = useMutation({
+    mutationFn: (id: string) => api.delete(`/payments/${id}`),
+    onSuccess: () => {
+      invalidateAll();
+      notifications.show({ message: 'Payment deleted', color: 'orange' });
+      setDeleteConfirm(null);
+      close(); setEditing(null); form.reset();
+    },
+    onError: (err: any) => { notifications.show({ message: err.response?.data?.message || 'Error', color: 'red' }); },
+  });
+
+  const handleEdit = (payment: Payment) => {
+    setEditing(payment);
+    form.setValues({
+      invoice_id: payment.invoice_id || '',
+      amount: parseFloat(payment.amount || '0'),
+      payment_method: payment.payment_method || 'check',
+      reference_number: payment.reference_number || '',
+      payment_date: new Date(payment.payment_date),
+      notes: payment.notes || '',
+    });
+    open();
+  };
+
+  const handleNew = () => {
+    setEditing(null);
+    form.reset();
+    open();
+  };
+
+  const handleSubmit = (values: any) => {
+    if (editing) {
+      updateMutation.mutate(values);
+    } else {
+      createMutation.mutate(values);
+    }
+  };
+
   const fmt = (v: string) => parseFloat(v || '0').toLocaleString('en-US', { style: 'currency', currency: 'USD' });
 
-  const invoiceOptions = invoices.map((i: any) => ({
-    value: i.id,
-    label: `${i.invoice_number} - ${i.unit_number || 'Unit'} - Balance: $${parseFloat(i.balance_due || i.amount).toFixed(2)}`,
-  }));
+  const formatPeriod = (inv: any) => {
+    if (inv.period_start && inv.period_end) {
+      const start = new Date(inv.period_start).toLocaleDateString(undefined, { month: 'short' });
+      const end = new Date(inv.period_end).toLocaleDateString(undefined, { month: 'short', year: 'numeric' });
+      return inv.period_start === inv.period_end ? start : `${start}-${end}`;
+    }
+    return '';
+  };
+
+  const invoiceOptions = invoices.map((i: any) => {
+    const period = formatPeriod(i);
+    const periodStr = period ? ` - ${period}` : '';
+    return {
+      value: i.id,
+      label: `${i.invoice_number} - ${i.unit_number || 'Unit'}${periodStr} - Balance: $${parseFloat(i.balance_due || i.amount).toFixed(2)}`,
+    };
+  });
 
   if (isLoading) return <Center h={300}><Loader /></Center>;
 
@@ -76,7 +152,7 @@ export function PaymentsPage() {
     <Stack>
       <Group justify="space-between">
        <Title order={2}>Payments</Title>
-        {!isReadOnly && <Button leftSection={<IconPlus size={16} />} onClick={open}>Record Payment</Button>}
+        {!isReadOnly && <Button leftSection={<IconPlus size={16} />} onClick={handleNew}>Record Payment</Button>}
       </Group>
       <Table striped highlightOnHover>
         <Table.Thead>
@@ -84,6 +160,7 @@ export function PaymentsPage() {
             <Table.Th>Date</Table.Th><Table.Th>Unit</Table.Th><Table.Th>Invoice</Table.Th>
             <Table.Th ta="right">Amount</Table.Th><Table.Th>Method</Table.Th>
             <Table.Th>Reference</Table.Th><Table.Th>Status</Table.Th>
+            {!isReadOnly && <Table.Th></Table.Th>}
           </Table.Tr>
         </Table.Thead>
         <Table.Tbody>
@@ -96,18 +173,34 @@ export function PaymentsPage() {
              <Table.Td><Badge size="sm" variant="light">{p.payment_method}</Badge></Table.Td>
              <Table.Td>{p.reference_number}</Table.Td>
              <Table.Td><Badge color={p.status === 'completed' ? 'green' : 'yellow'} size="sm">{p.status}</Badge></Table.Td>
+              {!isReadOnly && (
+                <Table.Td>
+                  <Tooltip label="Edit payment">
+                    <ActionIcon variant="subtle" onClick={() => handleEdit(p)}>
+                      <IconEdit size={16} />
+                    </ActionIcon>
+                  </Tooltip>
+                </Table.Td>
+              )}
            </Table.Tr>
          ))}
          {payments.length === 0 && (
-            <Table.Tr><Table.Td colSpan={7}><Text ta="center" c="dimmed" py="lg">No payments recorded yet</Text></Table.Td></Table.Tr>
+            <Table.Tr><Table.Td colSpan={isReadOnly ? 7 : 8}><Text ta="center" c="dimmed" py="lg">No payments recorded yet</Text></Table.Td></Table.Tr>
          )}
        </Table.Tbody>
      </Table>
-      <Modal opened={opened} onClose={close} title="Record Payment">
-        <form onSubmit={form.onSubmit((v) => createMutation.mutate(v))}>
+      {/* Create / Edit Payment Modal */}
+      <Modal opened={opened} onClose={() => { close(); setEditing(null); form.reset(); }} title={editing ? 'Edit Payment' : 'Record Payment'}>
+        <form onSubmit={form.onSubmit(handleSubmit)}>
          <Stack>
-            <Select label="Invoice" required data={invoiceOptions} searchable
-              {...form.getInputProps('invoice_id')} />
+            {!editing && (
+              <Select label="Invoice" required data={invoiceOptions} searchable
+                {...form.getInputProps('invoice_id')} />
+            )}
+            {editing && (
+              <TextInput label="Invoice" value={editing.invoice_number || 'N/A'} disabled />
+            )}
            <DateInput label="Payment Date" required {...form.getInputProps('payment_date')} />
            <NumberInput label="Amount" required prefix="$" decimalScale={2} min={0.01}
              {...form.getInputProps('amount')} />
@@ -118,10 +211,60 @@ export function PaymentsPage() {
            ]} {...form.getInputProps('payment_method')} />
            <TextInput label="Reference Number" placeholder="Check # or transaction ID"
              {...form.getInputProps('reference_number')} />
-            <Button type="submit" loading={createMutation.isPending}>Record Payment</Button>
+            <TextInput label="Notes" placeholder="Optional notes"
+              {...form.getInputProps('notes')} />
+
+            <Group justify="space-between">
+              {editing ? (
+                <>
+                  <Button
+                    variant="outline"
+                    color="red"
+                    leftSection={<IconTrash size={16} />}
+                    onClick={() => setDeleteConfirm(editing)}
+                  >
+                    Delete Payment
+                  </Button>
+                  <Button type="submit" loading={updateMutation.isPending}>
+                    Update Payment
+                  </Button>
+                </>
+              ) : (
+                <Button type="submit" fullWidth loading={createMutation.isPending}>Record Payment</Button>
+              )}
+            </Group>
          </Stack>
        </form>
      </Modal>
+
+      {/* Delete Confirmation Modal */}
+      <Modal
+        opened={!!deleteConfirm}
+        onClose={() => setDeleteConfirm(null)}
+        title="Delete Payment"
+        size="sm"
+      >
+        <Stack>
+          <Text size="sm">
+            Are you sure you want to delete this payment of{' '}
+            <Text span fw={700}>{deleteConfirm ? fmt(deleteConfirm.amount) : ''}</Text>{' '}
+            for unit {deleteConfirm?.unit_number}?
+          </Text>
+          <Text size="xs" c="dimmed">
+            This will also remove the associated journal entry and recalculate the invoice balance.
+          </Text>
+          <Group justify="flex-end">
+            <Button variant="default" onClick={() => setDeleteConfirm(null)}>Cancel</Button>
+            <Button
+              color="red"
+              loading={deleteMutation.isPending}
+              onClick={() => deleteConfirm && deleteMutation.mutate(deleteConfirm.id)}
+            >
+              Delete Payment
+            </Button>
+          </Group>
+        </Stack>
+      </Modal>
    </Stack>
  );
 }
@@ -6,9 +6,11 @@ import {
   IconUser, IconPalette, IconClock, IconBell, IconEye,
 } from '@tabler/icons-react';
 import { useAuthStore } from '../../stores/authStore';
+import { usePreferencesStore } from '../../stores/preferencesStore';
 
 export function UserPreferencesPage() {
   const { user, currentOrg } = useAuthStore();
+  const { colorScheme, toggleColorScheme } = usePreferencesStore();
 
   return (
     <Stack>
@@ -66,7 +68,10 @@ export function UserPreferencesPage() {
                <Text size="sm">Dark Mode</Text>
                <Text size="xs" c="dimmed">Switch to dark color theme</Text>
              </div>
-              <Switch disabled />
+              <Switch
+                checked={colorScheme === 'dark'}
+                onChange={toggleColorScheme}
+              />
            </Group>
            <Group justify="space-between">
              <div>
@@ -76,7 +81,7 @@ export function UserPreferencesPage() {
              <Switch disabled />
            </Group>
            <Divider />
-            <Text size="xs" c="dimmed" ta="center">Display preferences coming in a future release</Text>
+            <Text size="xs" c="dimmed" ta="center">More display preferences coming in a future release</Text>
          </Stack>
        </Card>
 
@@ -117,7 +117,7 @@ export function SettingsPage() {
        </Group>
        <Group justify="space-between">
          <Text size="sm" c="dimmed">Version</Text>
-          <Badge variant="light">2026.3.2 (beta)</Badge>
+          <Badge variant="light">2026.3.7 (Beta)</Badge>
        </Group>
        <Group justify="space-between">
          <Text size="sm" c="dimmed">API</Text>
frontend/src/stores/preferencesStore.ts (new file, 26 lines)
@@ -0,0 +1,26 @@
+import { create } from 'zustand';
+import { persist } from 'zustand/middleware';
+
+type ColorScheme = 'light' | 'dark';
+
+interface PreferencesState {
+  colorScheme: ColorScheme;
+  toggleColorScheme: () => void;
+  setColorScheme: (scheme: ColorScheme) => void;
+}
+
+export const usePreferencesStore = create<PreferencesState>()(
+  persist(
+    (set) => ({
+      colorScheme: 'light',
+      toggleColorScheme: () =>
+        set((state) => ({
+          colorScheme: state.colorScheme === 'light' ? 'dark' : 'light',
+        })),
+      setColorScheme: (scheme) => set({ colorScheme: scheme }),
+    }),
+    {
+      name: 'ledgeriq-preferences',
+    },
+  ),
+);
@@ -23,21 +23,8 @@ server {
        proxy_cache_bypass $http_upgrade;
    }
 
-    # AI recommendation endpoint needs a longer timeout (up to 3 minutes)
-    location /api/investment-planning/recommendations {
-        proxy_pass http://backend;
-        proxy_http_version 1.1;
-        proxy_set_header Upgrade $http_upgrade;
-        proxy_set_header Connection 'upgrade';
-        proxy_set_header Host $host;
-        proxy_set_header X-Real-IP $remote_addr;
-        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
-        proxy_set_header X-Forwarded-Proto $scheme;
-        proxy_cache_bypass $http_upgrade;
-        proxy_read_timeout 180s;
-        proxy_connect_timeout 10s;
-        proxy_send_timeout 30s;
-    }
+    # AI endpoints now return immediately (async processing in background)
+    # No special timeout needed — kept for documentation purposes
 
    # Everything else -> Vite dev server (frontend)
    location / {
@@ -74,20 +74,8 @@ server {
        proxy_send_timeout 15s;
    }
 
-    # AI endpoints — longer timeouts (LLM calls can take minutes)
-    location /api/investment-planning/recommendations {
-        proxy_pass http://127.0.0.1:3000;
-        proxy_read_timeout 300s;
-        proxy_connect_timeout 10s;
-        proxy_send_timeout 30s;
-    }
-
-    location /api/health-scores/calculate {
-        proxy_pass http://127.0.0.1:3000;
-        proxy_read_timeout 180s;
-        proxy_connect_timeout 10s;
-        proxy_send_timeout 30s;
-    }
+    # AI endpoints now return immediately (async processing in background)
+    # No special timeout overrides needed
 
    # --- Frontend → React SPA served by nginx (port 3001) ---
    location / {
@@ -40,20 +40,8 @@ server {
        proxy_send_timeout 15s;
    }
 
-    # AI endpoints → longer timeouts
-    location /api/investment-planning/recommendations {
-        proxy_pass http://backend;
-        proxy_read_timeout 180s;
-        proxy_connect_timeout 10s;
-        proxy_send_timeout 30s;
-    }
-
-    location /api/health-scores/calculate {
-        proxy_pass http://backend;
-        proxy_read_timeout 180s;
-        proxy_connect_timeout 10s;
-        proxy_send_timeout 30s;
-    }
+    # AI endpoints now return immediately (async processing in background)
+    # No special timeout overrides needed
 
    # --- Static frontend → built React assets ---
    location / {
@@ -60,37 +60,8 @@ server {
        proxy_cache_bypass $http_upgrade;
    }
 
-    # AI recommendation endpoint needs a longer timeout (up to 3 minutes)
-    location /api/investment-planning/recommendations {
-        proxy_pass http://backend;
-        proxy_http_version 1.1;
-        proxy_set_header Upgrade $http_upgrade;
-        proxy_set_header Connection 'upgrade';
-        proxy_set_header Host $host;
-        proxy_set_header X-Real-IP $remote_addr;
-        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
-        proxy_set_header X-Forwarded-Proto $scheme;
-        proxy_cache_bypass $http_upgrade;
-        proxy_read_timeout 180s;
-        proxy_connect_timeout 10s;
-        proxy_send_timeout 30s;
-    }
-
-    # AI health-score endpoint also needs a longer timeout
-    location /api/health-scores/calculate {
-        proxy_pass http://backend;
-        proxy_http_version 1.1;
-        proxy_set_header Upgrade $http_upgrade;
-        proxy_set_header Connection 'upgrade';
-        proxy_set_header Host $host;
-        proxy_set_header X-Real-IP $remote_addr;
-        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
-        proxy_set_header X-Forwarded-Proto $scheme;
-        proxy_cache_bypass $http_upgrade;
-        proxy_read_timeout 180s;
-        proxy_connect_timeout 10s;
-        proxy_send_timeout 30s;
-    }
+    # AI endpoints now return immediately (async processing in background)
+    # No special timeout overrides needed
 
    # Everything else -> Vite dev server (frontend)
    location / {
scripts/reset-password.sh (new executable file, 150 lines)
@@ -0,0 +1,150 @@
+#!/usr/bin/env bash
+# ---------------------------------------------------------------------------
+# reset-password.sh — Reset a user's password in HOA LedgerIQ
+#
+# Usage:
+#   ./scripts/reset-password.sh <email> <new-password>
+#
+# Examples:
+#   ./scripts/reset-password.sh admin@hoaledgeriq.com MyNewPassword123
+#   ./scripts/reset-password.sh admin@sunrisevalley.org SecurePass!
+# ---------------------------------------------------------------------------
+
+set -euo pipefail
+
+# ---- Defaults ----
+SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
+PROJECT_DIR="$(cd "$SCRIPT_DIR/.." && pwd)"
+DB_USER="${POSTGRES_USER:-hoafinance}"
+DB_NAME="${POSTGRES_DB:-hoafinance}"
+COMPOSE_CMD="docker compose"
+
+# If running with the SSL override, detect it
+if [ -f "$PROJECT_DIR/docker-compose.ssl.yml" ] && \
+   docker compose -f "$PROJECT_DIR/docker-compose.yml" \
+     -f "$PROJECT_DIR/docker-compose.ssl.yml" ps --quiet 2>/dev/null | head -1 | grep -q .; then
+  COMPOSE_CMD="docker compose -f $PROJECT_DIR/docker-compose.yml -f $PROJECT_DIR/docker-compose.ssl.yml"
+fi
+
+# ---- Colors ----
+RED='\033[0;31m'; GREEN='\033[0;32m'; YELLOW='\033[1;33m'; CYAN='\033[0;36m'; NC='\033[0m'
+
+info() { echo -e "${CYAN}[INFO]${NC} $*"; }
+ok()   { echo -e "${GREEN}[OK]${NC} $*"; }
+warn() { echo -e "${YELLOW}[WARN]${NC} $*"; }
+err()  { echo -e "${RED}[ERROR]${NC} $*" >&2; }
+die()  { err "$@"; exit 1; }
+
+# ---- Helpers ----
+
+ensure_containers_running() {
+  if ! $COMPOSE_CMD ps postgres 2>/dev/null | grep -q "running\|Up"; then
+    die "PostgreSQL container is not running. Start it with: docker compose up -d postgres"
+  fi
+  if ! $COMPOSE_CMD ps backend 2>/dev/null | grep -q "running\|Up"; then
+    die "Backend container is not running. Start it with: docker compose up -d backend"
+  fi
+}
+
+# ---- CLI ----
+
+usage() {
+  cat <<EOF
+HOA LedgerIQ Password Reset
+
+Usage:
+  $(basename "$0") <email> <new-password>
+
+Examples:
+  $(basename "$0") admin@hoaledgeriq.com MyNewPassword123
+  $(basename "$0") admin@sunrisevalley.org SecurePass!
+
+This script:
+  1. Verifies the user exists in the database
+  2. Generates a bcrypt hash using bcryptjs (same library the app uses)
+  3. Updates the password in the database
+  4. Verifies the new hash works
+
+EOF
+  exit 0
+}
+
+# Parse args
+case "${1:-}" in
+  -h|--help|help|"") usage ;;
+esac
+
+[ $# -lt 2 ] && die "Usage: $(basename "$0") <email> <new-password>"
+
+EMAIL="$1"
+NEW_PASSWORD="$2"
+
+# Load .env if present
+if [ -f "$PROJECT_DIR/.env" ]; then
+  set -a
+  # shellcheck disable=SC1091
+  source "$PROJECT_DIR/.env"
+  set +a
+  DB_USER="${POSTGRES_USER:-hoafinance}"
+  DB_NAME="${POSTGRES_DB:-hoafinance}"
+fi
+
+# Ensure containers are running
+info "Checking containers ..."
+ensure_containers_running
+
+# Verify user exists
+info "Looking up user: ${EMAIL} ..."
+USER_RECORD=$($COMPOSE_CMD exec -T postgres psql -U "$DB_USER" -d "$DB_NAME" \
+  -t -A -c "SELECT id, email, first_name, last_name, is_superadmin FROM shared.users WHERE email = '${EMAIL}';" 2>/dev/null)
+
+if [ -z "$USER_RECORD" ]; then
+  die "No user found with email: ${EMAIL}"
+fi
+
+# Parse user info for display
+IFS='|' read -r USER_ID USER_EMAIL FIRST_NAME LAST_NAME IS_SUPER <<< "$USER_RECORD"
+info "Found user: ${FIRST_NAME} ${LAST_NAME} (${USER_EMAIL})"
+if [ "$IS_SUPER" = "t" ]; then
+  warn "This is a superadmin account"
+fi
+
+# Generate bcrypt hash using bcryptjs inside the backend container
+info "Generating bcrypt hash ..."
+HASH=$($COMPOSE_CMD exec -T backend node -e "
+const bcrypt = require('bcryptjs');
+bcrypt.hash(process.argv[1], 12).then(h => process.stdout.write(h));
+" "$NEW_PASSWORD" 2>/dev/null)
+
+if [ -z "$HASH" ] || [ ${#HASH} -lt 50 ]; then
+  die "Failed to generate bcrypt hash. Is the backend container running?"
+fi
+
+# Update the password using a heredoc to avoid shell escaping issues with $ in hashes
+info "Updating password ..."
+UPDATE_RESULT=$($COMPOSE_CMD exec -T postgres psql -U "$DB_USER" -d "$DB_NAME" -t -A <<EOSQL
+UPDATE shared.users SET password_hash = '${HASH}', updated_at = NOW() WHERE email = '${EMAIL}';
+EOSQL
+)
+
+if [[ "$UPDATE_RESULT" != *"UPDATE 1"* ]]; then
+  die "Password update failed. Result: ${UPDATE_RESULT}"
+fi
+
+# Verify the new hash works
+info "Verifying new password ..."
+VERIFY=$($COMPOSE_CMD exec -T backend node -e "
+const bcrypt = require('bcryptjs');
+bcrypt.compare(process.argv[1], process.argv[2]).then(r => process.stdout.write(String(r)));
+" "$NEW_PASSWORD" "$HASH" 2>/dev/null)
+
+if [ "$VERIFY" != "true" ]; then
+  die "Verification failed — the hash does not match the password. Something went wrong."
+fi
+
+echo ""
+ok "Password reset successful!"
+echo ""
+info "  User:  ${FIRST_NAME} ${LAST_NAME} (${USER_EMAIL})"
+info "  Login: ${EMAIL}"
+echo ""