
Feoda BI — Implementation Kickoff Guide

Overview

This guide walks through the foundational setup required before writing any application code. The goal is to establish the project management, source control, CI/CD, and development environment infrastructure.

Source Control Architecture

Azure Repos is the primary code repository. All developers push to Azure Repos using their individual Microsoft/Azure DevOps identity. An Azure Pipeline automatically mirrors code to GitHub (feodadev/feoda-bi) for Copilot licensing and AI workflows. Developers never interact with GitHub directly.

Prerequisites

  • Azure account created
  • Azure DevOps organisation created
  • Azure Repos repository created (primary)
  • GitHub mirror repository created
  • Mirror pipeline configured
  • Development environment configured
  • Project scaffolded

Phase Summary

  • Phase 0: Azure DevOps project (Feoda_BI) + Azure Repos + GitHub mirror (~1–2 hours)
  • Phase 1: Backlog creation (epics, features, stories) (~2–3 hours)
  • Phase 2: Development environment setup (~1–2 hours)
  • Phase 3: Project scaffolding (Fastify + TypeScript) (~2–3 hours)
  • Phase 4: CI/CD pipeline (~1–2 hours)
  • Phase 5: Azure infrastructure (database, cache) (~1–2 hours)

Phase 0: Azure DevOps + Azure Repos (Primary) + GitHub Mirror

Step 1: Create Azure DevOps Organisation

If you don't already have an Azure DevOps organisation:

  1. Go to dev.azure.com
  2. Sign in with your Azure account
  3. Click "New organization"
  4. Organisation name: feoda (or feoda-tech)
  5. Host your projects in: Australia East (closest to Pymble; UAE North is not available for Azure DevOps hosting, but this only affects where project metadata is stored — not where your app runs)

Step 2: Create Azure DevOps Project

  1. In your organisation, click "+ New project"
  2. Configure:
     • Project name: Feoda_BI
     • Description: ERP-agnostic Accounts Receivable Management platform for the education sector
     • Visibility: Private
  3. Click Advanced to expand the advanced options
  4. Configure:
     • Version control: Git
     • Work item process: Agile (recommended — simpler than CMMI/Scrum, suits Feoda's team size)

Work Item Process — Hidden Under Advanced

The version control and work item process options are not visible by default in the create project form. You must click the "Advanced" toggle (below the Visibility setting) to reveal them. If you don't set the process explicitly, Azure DevOps falls back to the organisation default (Basic for newer organisations), so set Agile explicitly rather than relying on the default.

The four available processes are: Basic (simplest — Issue/Task/Epic), Agile (recommended — User Story/Bug/Task/Feature/Epic), Scrum (Product Backlog Item/Bug/Task), and CMMI (most formal — Requirement/Change Request/Risk). You can change the process later via Organisation Settings → Process, but it's easier to pick the right one at creation time.

  5. Click Create

Step 3: Configure Azure Boards

Once the project is created:

3a. Set Up Area Paths

Go to Project Settings → Boards → Project configuration → Areas

Create the following area paths organised by domain (not technology layer):

Feoda_BI
├── Data and Import            ← Student/debtor data import, CSV/Excel processing
├── Billing                  ← Configurator, fee schedules, invoice generation
├── Payments                 ← Payment recording, gateways, refunds
├── Portal                   ← Parent-facing portal, account management
├── Integrations             ← ERP connectors (NetSuite), SIS sync
├── Infrastructure           ← CI/CD, Azure setup, monitoring, deployment
└── Documentation            ← Docs, ADRs, process guides

Why Domain-Based Areas (Not Layer-Based)

Each work item can only have one area path. Since stories in a vertical slice approach typically span backend API, database, and frontend together (e.g., "import debtors from CSV" touches the API route, DB migration, and UI upload screen), organising by domain keeps related work together.

If you need to filter by technology layer, use tags instead (e.g., api, database, ui, infra). Tags are multi-select, so a story can be tagged with both api and database.

3b. Set Up Iterations (Sprints)

Go to Project Settings → Boards → Project configuration → Iterations

Create a 2-week sprint cadence:

Feoda_BI
├── Sprint 0 — Foundation and Simple Full Billing Cycle (Mar 9 – Mar 20, 2026)
├── Sprint 1 — Parent Portal and Notifications (Mar 23 – Apr 3, 2026)
├── Sprint 2 — Integrations (Apr 6 – Apr 17, 2026)
├── Sprint 3 — Integration Adapters (Apr 20 – May 1, 2026)
├── Sprint 4 — Auth, Multi-Tenancy and Reporting (May 4 – May 15, 2026)
└── Sprint 5 — Hardening and Go-Live Prep (May 18 – May 29, 2026)

Sprint 0

Sprint 0 covers both the project infrastructure setup (repo, CI/CD, scaffolding) and the first simple billing cycle (schema, import, configurator, admin UI, manual payments). By the end of this sprint you have a working system — not just an empty project.

3c. Customise Board Columns

Go to Boards → Board settings (gear icon) → Columns

Recommended columns:

  • New: Newly created work items
  • Refined: Requirements are clear, acceptance criteria defined
  • In Progress: Developer is actively working
  • In Review: Pull request submitted, awaiting code review
  • Testing: QA / manual testing
  • Pull Request: PR approved, ready to merge
  • Done: Merged to main, deployed to dev

3d. Define Tag Convention

Tags complement area paths by allowing multi-select filtering across technology layers and cross-cutting concerns. Unlike area paths (one per work item), a story can have multiple tags.

Azure DevOps creates tags on first use — there is no central tag creation screen. Just type a tag name in the Tags field on any work item.

Standard tags:

  • api: Backend API routes, services, middleware
  • database: Migrations, schema changes, queries, seeds
  • ui: Frontend components, pages, styling
  • infra: CI/CD, deployment, Azure resources, Docker
  • auth: Authentication, authorization, JWT, RLS
  • testing: Test coverage, test infrastructure
  • doc: Documentation, ADRs, process guides

Example: A Sprint 1 story "Import debtors from CSV" would have:

  • Area path: Feoda_BI\Data and Import
  • Tags: api, database, ui

Tags Are Free-Form

You can create additional tags as needed (e.g., csv-import, notifications, reporting). The standard tags above are for technology layer filtering — use them consistently across all stories so you can query "show me all database work across all sprints" regardless of area path.

Step 4: Initialise Azure Repos Repository (Primary)

Azure Repos is automatically available inside your Azure DevOps project. To set it up:

  1. Rename the default repository (it is created automatically with the same name as the project, Feoda_BI):
     • Go to Project Settings → Repos → Repositories
     • Click the repository name → Rename → type feoda-bi → confirm
  2. Initialise the repository:
     • Go to Repos → Files
     • Because the repo is empty, Azure DevOps shows an initialisation screen (not a button — it's the main content of the empty repo page)
     • Scroll down to the "Initialize main branch with a README or gitignore" section
     • Select .gitignore: Node
     • Click Initialize

Can't See the Initialize Screen?

If you land on a page of clone instructions instead, the repo is still empty — scroll down past the clone options to find the "Initialize main branch" section. If the repo already has a main branch, it was initialised previously; skip this step.

4a. Configure Branch Policies

Go to Repos → Branches → click on main → Branch policies

  • Require a minimum number of reviewers: Yes — 1 reviewer
  • Check for linked work items: Required
  • Check for comment resolution: Required
  • Limit merge types: Squash merge only (keeps history clean)
  • Build validation: Add the CI pipeline (configured in Phase 4)
  • Automatically included reviewers: Optional — skip for now, add later as the team grows

Branch Policies vs GitHub Branch Protection

Azure Repos branch policies are more granular than GitHub's. They integrate natively with Azure Pipelines (build validation) and Azure Boards (work item linking) without any extra apps.

4b. Define Branch Strategy

main                    ← Production-ready code (protected)
├── develop             ← Integration branch for sprint work
│   ├── feature/BI-xx   ← Feature branches (linked to Azure Boards work item)
│   ├── bugfix/BI-xx    ← Bug fix branches
│   └── spike/BI-xx     ← Research/spike branches
├── release/v0.x        ← Release candidates
└── hotfix/BI-xx        ← Urgent production fixes

Branch naming uses the Azure Boards work item ID prefix (BI-xx) for traceability.

Policies and Procedures

All naming conventions — branch naming, commit message format (AB#<id>), tag convention, and code review expectations — will be formalised in a separate Developer Policies and Procedures document. This section is the quick reference; the full policy doc is the source of truth.
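As a lightweight illustration ahead of the formal policy doc, a CI step or git hook could validate branch names against the strategy above. The exact pattern — including whether a descriptive suffix after the work item ID is allowed — is an assumption, not policy:

```typescript
// Illustrative branch-name check for the strategy above. The suffix rule
// (-short-description after BI-<id>) is an assumption, not settled policy.
const BRANCH_PATTERN =
  /^(?:(?:feature|bugfix|spike|hotfix)\/BI-\d+(?:-[a-z0-9-]+)?|release\/v\d+\.\d+|main|develop)$/;

export function isValidBranchName(name: string): boolean {
  return BRANCH_PATTERN.test(name);
}
```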

4c. Create develop Branch

  1. In Repos → Branches, click "New branch"
  2. Name: develop, based on: main
  3. Click Create

4d. Add Developer Access

Go to Project Settings → Repos → Repositories → feoda-bi → Security

Add each developer with their individual Microsoft account (e.g., dev@feoda.ae). Recommended permissions:

  • Project Administrators: Full control
  • Contributors (developers): Contribute, Create branch, Create tag
  • Readers (stakeholders): Read only

No Shared Accounts

Every developer must use their own Microsoft/Azure DevOps identity. This ensures every commit, PR, and work item change is individually attributed. Never share the feodadev or feodasalesadmin accounts.

Step 5: Create GitHub Mirror Repository

The GitHub mirror serves two purposes:

  1. GitHub Copilot licensing — devs need a GitHub account for Copilot, but never push to GitHub
  2. AI workflows — Copilot agents and AI tools can access the codebase via GitHub

5a. Create the Mirror Repo on GitHub

Go to github.com/feodadev:

  1. Click "New repository"
  2. Configure:
     • Repository name: feoda-bi
     • Description: [MIRROR] Feoda BI — mirrored from Azure Repos
     • Visibility: Private
     • Initialize: do NOT initialise (leave the repo empty)
  3. Click Create repository

Do Not Initialise

The mirror repo must start empty. The Azure Pipeline will push all content from Azure Repos. Initialising it would cause conflicts.

5b. Create a GitHub Personal Access Token (PAT)

The mirror pipeline needs a GitHub PAT to push:

  1. Go to github.com/settings/tokens
  2. Click "Generate new token (classic)"
  3. Name: azure-repos-mirror
  4. Scopes: repo (full control of private repositories)
  5. Click Generate token — copy and save it securely

5c. Store the PAT in Azure DevOps

  1. In Azure DevOps, go to PipelinesLibrary"+ Variable group"
  2. Name: github-mirror-secrets
  3. Add variable:
  4. Name: GITHUB_PAT
  5. Value: (paste the PAT)
  6. Click the lock icon to make it secret
  7. Click Save

5d. Verify Azure Boards + Azure Repos Integration

Since code is now in Azure Repos (not GitHub), the Boards integration is automatic — no GitHub app needed:

  1. Create a test work item in Azure Boards (e.g., "Test connection")
  2. Note the work item ID (e.g., BI-1)
  3. In Azure Repos, create a branch named feature/BI-1-test-connection
  4. Make a commit with message: AB#1 Test Azure Boards connection
  5. The AB#1 syntax links the commit to work item #1 in Azure Boards
  6. Verify the commit appears in the work item's "Development" section

Commit Message Convention

Use AB#<work-item-id> in commit messages to auto-link to Azure Boards. Example:

AB#42 Add tenant authentication middleware

This will show the commit, PR, and branch in the work item's Development tab. Since both code and boards are in Azure DevOps, this works natively — no GitHub app required.
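To make the linking convention concrete, here is a small hypothetical helper — purely illustrative, since Azure Boards parses AB#<id> server-side — that extracts the work item ID a commit message references:

```typescript
// Hypothetical helper mirroring the AB#<id> convention above. Illustrative
// only: Azure Boards performs its own parsing when a commit is pushed.
const AB_LINK = /\bAB#(\d+)\b/;

export function linkedWorkItem(message: string): number | null {
  const match = AB_LINK.exec(message);
  return match ? Number(match[1]) : null;
}
```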


Phase 1: Initial Backlog

Epics

Create these Epics in Azure Boards (Boards → Backlogs → set level to Epics):

  1. Project Foundation — Infrastructure: Repository setup, CI/CD, dev environment, coding standards
  2. Core API & Database — Data and Import: Fastify server, PostgreSQL schema, base CRUD, migrations
  3. Authentication & Multi-Tenancy — Infrastructure: JWT auth, tenant isolation (RLS), API key management
  4. Billing Engine — Billing: Billing configuration, transaction generation, billing rules, items
  5. Payment Processing — Payments: Payment gateway integration (eway), RPS, direct debit, refunds
  6. ERP Integration — Integrations: NetSuite connector, journal posting, debtor sync
  7. SIS Integration — Integrations: Student data import, enrollment sync
  8. Admin UI — Billing: Staff-facing dashboard, configurator, billing management
  9. Parent Portal — Portal: Payment portal, invoice viewing, account management
  10. Reporting & Analytics — Billing: Aged balances, transaction reports, revenue dashboards
  11. DevOps & Infrastructure — Infrastructure: Azure setup, monitoring, logging, alerting, deployment

Epic Area Paths

Each epic gets one area path representing its primary domain. Child stories under an epic can have a different area path — e.g., a story under "Admin UI" (area: Billing) might be assigned to Payments if it's about the payment management screen.

Sprint 0 — User Stories (Foundation + Full Billing Cycle)

Create these user stories under Epic 1: Project Foundation and assign to Sprint 0.

Foundation stories (17 SP):

(Format: Title — Area Path · Tags · Story Points)

  1. Azure Repos Setup — Infrastructure · infra · 2 SP
     As a developer, I need Azure Repos set up with branch policies and .gitignore.
     Acceptance: repo exists with .gitignore; main branch protected with policies; develop branch created; GitHub mirror syncing.

  2. Fastify TypeScript Scaffold — Infrastructure · api, infra · 3 SP
     As a developer, I need the Fastify + TypeScript project scaffolded with a health check endpoint.
     Acceptance: GET /health returns 200 OK; TypeScript compiles without errors; ESLint passes with zero warnings.

  3. Dev Container Setup — Infrastructure · infra, database · 3 SP
     As a developer, I need a Dev Container so the entire team has an identical development environment.
     Acceptance: VS Code "Reopen in Container" builds and starts the dev environment; Node.js 22 available inside container; PostgreSQL and Redis accessible as sibling containers; VS Code extensions auto-installed (ESLint, Prettier, GitLens); .nvmrc and engines field provide fallback for non-Dev Container use; docker-compose.yml fallback starts services manually.

  4. Database Migrations Setup — Data and Import · database · 2 SP
     As a developer, I need database migrations configured.
     Acceptance: first migration creates tenants table; npm run migrate executes successfully; rollback works with npm run migrate:down.

  5. CI Pipeline — Infrastructure · infra, testing · 3 SP
     As a developer, I need a CI pipeline that runs lint, type-check, and tests on PR.
     Acceptance: Azure Pipeline triggers on PR to develop and main; runs ESLint, TypeScript type-check, and unit tests; PR blocked if pipeline fails.

  6. Environment Configuration — Infrastructure · infra · 1 SP
     As a developer, I need environment configuration management.
     Acceptance: .env.example file with all documented variables; dotenv integration loads config on startup; app fails gracefully with a clear error if required vars are missing.

  7. Logging and Error Handling — Infrastructure · api · 2 SP
     As a developer, I need logging and error handling set up.
     Acceptance: structured JSON logging via pino; global error handler catches unhandled exceptions; request ID tracking across log entries.

  8. Coding Standards Documentation — Documentation · doc · 1 SP
     As a developer, I need coding standards documented.
     Acceptance: ESLint config committed and enforced; Prettier config committed and enforced; commit message convention documented; PR template created in repo.

Full billing cycle stories (31 SP, also Sprint 0 — under Epics 2, 4, 5, 8):

  9. Core Database Schema — Data and Import · database · 5 SP
     As a developer, I need the core DB schema for tenants, debtors, students, items, invoices, and payments.
     Acceptance: migrations run without errors; tables created with correct foreign keys and constraints; seed data loads successfully.

  10. CSV Debtor and Student Import — Data and Import · api, database, ui · 5 SP
      As a staff user, I need to import debtors/students from CSV.
      Acceptance: upload CSV via admin UI; records created in database with correct relationships; validation errors displayed for malformed rows.

  11. Billing Configurator — Billing · api, database, ui · 8 SP
      As a staff user, I need a billing configurator to define fee schedules, rules, and items.
      Acceptance: create and edit fee schedules; assign billing items to schedules; set rules (discounts, payment plans); configuration persists and is usable by the billing engine.

  12. Billing Run Engine — Billing · api, database · 5 SP
      As a staff user, I need to run billing to generate invoices.
      Acceptance: billing run creates invoices based on configurator rules; line items are correct and match fee schedules; billing run is idempotent (no duplicates).

  13. Admin Dashboard — Billing · ui, api · 5 SP
      As a staff user, I need a basic admin dashboard to view debtors and invoices.
      Acceptance: list debtors with outstanding balances; view invoice details with line items; filter and search debtors/invoices.

  14. Manual Payment Recording — Payments · api, database, ui · 3 SP
      As a staff user, I need to record manual payments against invoices.
      Acceptance: record payment against a specific invoice; invoice balance updates automatically; payment history visible on debtor record.
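The idempotency criterion in story 12 can be sketched in miniature. This is a hedged illustration, not the engine design: it assumes a deterministic run key per (tenant, schedule, period) so a re-run cannot create duplicate invoices, and all names here are hypothetical:

```typescript
// Sketch of the idempotency idea only — derive a deterministic key per
// (tenant, schedule, period) and skip any run that has already been billed.
export function billingRunKey(tenantId: string, scheduleId: string, period: string): string {
  return `${tenantId}:${scheduleId}:${period}`;
}

export function runBilling(
  existingKeys: Set<string>, // in the real system this would be a DB unique constraint
  runs: Array<{ tenantId: string; scheduleId: string; period: string }>,
): string[] {
  const created: string[] = [];
  for (const r of runs) {
    const key = billingRunKey(r.tenantId, r.scheduleId, r.period);
    if (existingKeys.has(key)) continue; // already billed — no duplicate invoices
    existingKeys.add(key);
    created.push(key);
  }
  return created;
}
```

Running the same batch twice produces invoices only the first time, which is exactly the "no duplicates" acceptance criterion.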

Sprint 0 Is Ambitious — That's Intentional

Sprint 0 packs both infrastructure and the first billing cycle into 2 weeks. The foundation stories (1–8) are mostly setup tasks that take hours, not days. The billing cycle stories (9–14) are simplified — no auth, no external integrations, no parent portal. This is the core of the spiral method: get a working A-to-Z flow as fast as possible.

Sprint 1–5 Scope Preview (Spiral Method)

Each sprint adds a layer of sophistication around the working billing cycle from Sprint 0:

  • Sprint 1 — Parent Portal and Notifications: adds the parent-facing layer — parent portal (view invoices, account balance, payment history) and email notifications (invoice issued, payment received, overdue reminders)
  • Sprint 2 — Integrations: connects to external systems — ERP connector (NetSuite journal posting), SIS data sync
  • Sprint 3 — Integration Adapters: payment gateway adapters — adapter pattern implementation (EwayAdapter or partner), direct debit, refund workflows
  • Sprint 4 — Auth, Multi-Tenancy and Reporting: production-grade access control — JWT auth, tenant isolation (RLS), API key management, reporting dashboards (aged balances, revenue, transaction reports)
  • Sprint 5 — Hardening and Go-Live Prep: production readiness — load testing, monitoring and alerting, error handling polish, deployment pipeline, security audit, documentation

Spiral Method

By the end of Sprint 0, you have a fully working billing system: import debtors → configure fees → run billing → staff views invoices → records a manual payment. It has no auth, no external integrations, no parent portal — but the entire core business logic works end-to-end. Each subsequent sprint wraps another layer around this core.

Sprint 0: ████████████████████████████████████████ Foundation + simple billing cycle
Sprint 1: ████████████████████████████████████████████████ + Parent portal
Sprint 2: ████████████████████████████████████████████████████████ + Integrations
Sprint 3: ████████████████████████████████████████████████████████████████ + Payment adapters
Sprint 4: ████████████████████████████████████████████████████████████████████████ + Auth & reporting
Sprint 5: ████████████████████████████████████████████████████████████████████████████████ Production

Phase 2: Development Environment

Unified Dev Environment via Dev Containers

All developers use VS Code Dev Containers to guarantee an identical environment — same Node.js version, same tools, same extensions, same OS. No manual installation of Node, nvm, or service containers. Docker Desktop is the only prerequisite.

Step 1: Install Prerequisites

Every developer needs only two things installed on their host machine:

  1. Docker Desktop — docker.com/products/docker-desktop
  2. VS Code + Dev Containers extension — install from the VS Code Extensions marketplace (ms-vscode-remote.remote-containers)

# Verify Docker is installed
docker --version
docker compose version

No Node.js Installation Required

Do not install Node.js on your host machine. The Dev Container provides the exact Node.js version (22.x) automatically. This eliminates version drift between developers.

Step 2: Clone and Checkout

# Clone from Azure Repos (primary)
cd ~/Feoda
git clone https://dev.azure.com/feoda/Feoda_BI/_git/feoda-bi
cd feoda-bi

# Switch to develop branch
git checkout develop

# Open in VS Code
code .

Azure Repos Git Credentials

When cloning for the first time, Azure DevOps will prompt for authentication. Options:

  • HTTPS + Git Credential Manager (recommended on macOS): Install GCM — it caches credentials automatically
  • SSH: Add an SSH key in Azure DevOps → User Settings → SSH public keys
  • Personal Access Token: Generate at Azure DevOps → User Settings → Personal access tokens (scope: Code > Read & Write)

Phase 3: Project Scaffolding

Step 1: Configure Dev Container

Create the Dev Container configuration files first — this gives every developer Node.js 22, PostgreSQL, and Redis automatically with zero manual setup.

Create .devcontainer/devcontainer.json:

{
  "name": "Feoda BI",
  "dockerComposeFile": "docker-compose.yml",
  "service": "app",
  "workspaceFolder": "/workspace",
  "features": {
    "ghcr.io/devcontainers/features/node:1": {
      "version": "22"
    },
    "ghcr.io/devcontainers/features/git:1": {}
  },
  "customizations": {
    "vscode": {
      "extensions": [
        "dbaeumer.vscode-eslint",
        "esbenp.prettier-vscode",
        "eamodio.gitlens",
        "ms-azuretools.vscode-docker",
        "vitest.explorer",
        "bradlc.vscode-tailwindcss"
      ],
      "settings": {
        "editor.defaultFormatter": "esbenp.prettier-vscode",
        "editor.formatOnSave": true,
        "editor.codeActionsOnSave": {
          "source.fixAll.eslint": "explicit"
        },
        "typescript.preferences.importModuleSpecifier": "relative",
        "files.eol": "\n"
      }
    }
  },
  "postCreateCommand": "npm install",
  "forwardPorts": [3000, 5432, 6379],
  "portsAttributes": {
    "3000": { "label": "App", "onAutoForward": "notify" },
    "5432": { "label": "PostgreSQL", "onAutoForward": "silent" },
    "6379": { "label": "Redis", "onAutoForward": "silent" }
  }
}

Create .devcontainer/docker-compose.yml:

# .devcontainer/docker-compose.yml — Dev Container compose (app + services)
services:
  app:
    build:
      context: ..
      dockerfile: .devcontainer/Dockerfile
    volumes:
      - ..:/workspace:cached
    command: sleep infinity
    environment:
      DATABASE_URL: postgres://arm_dev:arm_dev_password@postgres:5432/arm_development
      REDIS_URL: redis://redis:6379
      NODE_ENV: development
      PORT: 3000
      LOG_LEVEL: debug
      JWT_SECRET: dev-only-secret-do-not-use-in-production
    depends_on:
      postgres:
        condition: service_healthy
      redis:
        condition: service_healthy

  postgres:
    image: postgres:16-alpine
    environment:
      POSTGRES_USER: arm_dev
      POSTGRES_PASSWORD: arm_dev_password
      POSTGRES_DB: arm_development
    ports:
      - "5432:5432"
    volumes:
      - arm_pgdata:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U arm_dev"]
      interval: 5s
      timeout: 5s
      retries: 5

  redis:
    image: redis:7-alpine
    ports:
      - "6379:6379"
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 5s
      timeout: 5s
      retries: 5

volumes:
  arm_pgdata:

Create .devcontainer/Dockerfile:

FROM mcr.microsoft.com/devcontainers/typescript-node:22

Create .nvmrc in the repo root:

22

Step 2: Open in Dev Container

In VS Code, with the feoda-bi folder open, it will detect the .devcontainer/ folder and prompt:

"Folder contains a Dev Container configuration file. Reopen folder to develop in a container?"

Click "Reopen in Container". VS Code will:

  1. Build the Dev Container image (first time only, ~2-3 minutes)
  2. Start PostgreSQL and Redis alongside the app container
  3. Install all VS Code extensions defined in devcontainer.json
  4. Apply workspace settings (formatter, linter, etc.)
  5. Open a terminal inside the container — ready to code

Git credentials from your host machine are automatically forwarded into the Dev Container — no extra setup needed.

Step 3: Verify Environment

Inside the Dev Container terminal:

# Node.js (exact version locked by Dev Container)
node --version   # Should show v22.x.x
npm --version    # Should show 10.x.x

# PostgreSQL (running as a sibling container)
PGPASSWORD=arm_dev_password psql -h postgres -U arm_dev -d arm_development -c "SELECT 1;"

# Redis (running as a sibling container)
redis-cli -h redis ping   # Should return PONG

Service Hostnames Inside Dev Container

Inside the Dev Container, services are accessed by their Docker Compose service name — postgres and redis — not localhost. The Dev Container compose file injects DATABASE_URL and REDIS_URL with these hostnames; the .env.example (used by the non-Dev Container fallback) points at localhost instead.

Fallback: Running Without Dev Containers

If a developer cannot use Dev Containers (e.g., JetBrains IDE), the repo includes fallback safeguards:

  • .nvmrc — run nvm use to switch to the correct Node.js version
  • engines in package.json — npm warns on install if the Node version is wrong (and fails outright if engine-strict=true is set in .npmrc)
  • docker-compose.yml — start PostgreSQL + Redis manually with docker compose up -d

# Fallback setup (without Dev Containers)
nvm use                    # Reads .nvmrc, switches to Node 22
npm install                # Warns (or fails, with engine-strict) on an engines mismatch
docker compose up -d       # Starts PostgreSQL + Redis
npm run dev                # Starts the app

Step 4: Initialise Node.js + TypeScript Project

All commands below run inside the Dev Container terminal (Node 22 is already available).

# Initialise package.json
npm init -y

# Install production dependencies
npm install fastify @fastify/cors @fastify/helmet @fastify/swagger @fastify/swagger-ui
npm install @fastify/jwt @fastify/rate-limit
npm install pg kysely              # PostgreSQL client + query builder
npm install ioredis                # Redis client
npm install bullmq                 # Job queue (backed by Redis)
npm install pino pino-pretty       # Structured logging
npm install dotenv                 # Environment variables
npm install zod                    # Schema validation

# Install dev dependencies
npm install -D typescript @types/node @types/pg
npm install -D tsx                 # TypeScript execution (dev server)
npm install -D eslint @typescript-eslint/eslint-plugin @typescript-eslint/parser
npm install -D prettier eslint-config-prettier
npm install -D vitest              # Test framework
npm install -D @faker-js/faker     # Test data generation

Add the engines field to package.json. Note that npm only warns on an engines mismatch by default; create a .npmrc containing engine-strict=true if you want npm install to fail instead:

{
  "engines": {
    "node": ">=22.0.0",
    "npm": ">=10.0.0"
  }
}

Why Both Dev Container and Fallback Files?

The Dev Container is the primary way to develop. The .nvmrc, engines, and root docker-compose.yml exist as fallbacks for developers who use JetBrains IDEs or can't run Dev Containers. The CI pipeline (Phase 4) also uses the root docker-compose.yml for service containers.

Step 5: Configure TypeScript

Create tsconfig.json:

{
  "compilerOptions": {
    "target": "ES2022",
    "module": "NodeNext",
    "moduleResolution": "NodeNext",
    "outDir": "dist",
    "rootDir": "src",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true,
    "resolveJsonModule": true,
    "declaration": true,
    "declarationMap": true,
    "sourceMap": true,
    "noUnusedLocals": true,
    "noUnusedParameters": true,
    "exactOptionalPropertyTypes": false,
    "noImplicitReturns": true,
    "noFallthroughCasesInSwitch": true
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules", "dist", "**/*.test.ts"]
}

Step 6: Create Project Structure

feoda-bi/
├── .devcontainer/
│   ├── devcontainer.json         ← Dev Container config (Node 22, extensions, settings)
│   └── docker-compose.yml        ← App + PostgreSQL + Redis (Dev Container compose)
├── src/
│   ├── app.ts                    ← Fastify app setup
│   ├── server.ts                 ← Entry point (starts server)
│   ├── config/
│   │   ├── index.ts              ← Environment config loader
│   │   └── database.ts           ← Database connection config
│   ├── modules/
│   │   ├── health/
│   │   │   ├── health.routes.ts  ← GET /health, GET /ready
│   │   │   └── health.service.ts
│   │   ├── tenants/
│   │   │   ├── tenant.routes.ts
│   │   │   ├── tenant.service.ts
│   │   │   ├── tenant.schema.ts  ← Zod/JSON schemas
│   │   │   └── tenant.model.ts   ← TypeScript types
│   │   ├── debtors/
│   │   ├── students/
│   │   ├── items/
│   │   ├── billing/
│   │   └── payments/
│   ├── database/
│   │   ├── connection.ts         ← Kysely/pg pool setup
│   │   ├── migrations/           ← Database migrations
│   │   │   └── 001_create_tenants.ts
│   │   └── seeds/                ← Dev seed data
│   ├── middleware/
│   │   ├── auth.ts               ← JWT verification
│   │   ├── tenant.ts             ← Tenant context extraction
│   │   └── error-handler.ts      ← Global error handler
│   ├── plugins/
│   │   ├── redis.ts              ← Redis connection plugin
│   │   └── swagger.ts            ← OpenAPI docs plugin
│   └── utils/
│       ├── logger.ts             ← Pino logger setup
│       ├── pagination.ts         ← Pagination helpers
│       └── errors.ts             ← Custom error classes
├── tests/
│   ├── setup.ts                  ← Test setup (test DB, cleanup)
│   ├── health.test.ts
│   └── tenants.test.ts
├── docker-compose.yml            ← Fallback compose (PostgreSQL + Redis only, for non-Dev Container use)
├── Dockerfile
├── .env.example
├── .eslintrc.json
├── .prettierrc
├── .gitignore
├── .nvmrc                        ← Node version lock (fallback for non-Dev Container use)
├── tsconfig.json
├── vitest.config.ts
└── package.json
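As a sketch of what one of the smaller utilities above might hold, here is a possible src/utils/pagination.ts — the parameter names, defaults, and the 100-row cap are assumptions, not decided API:

```typescript
// Hypothetical sketch of src/utils/pagination.ts: clamp page/pageSize query
// parameters to sane bounds and compute the SQL offset. Defaults (page 1,
// size 25, cap 100) are illustrative assumptions.
export interface PageParams {
  page: number;
  pageSize: number;
  offset: number;
}

export function parsePageParams(
  query: { page?: string; pageSize?: string },
  maxPageSize = 100,
): PageParams {
  const page = Math.max(1, Number(query.page ?? '1') || 1);
  const pageSize = Math.min(maxPageSize, Math.max(1, Number(query.pageSize ?? '25') || 25));
  return { page, pageSize, offset: (page - 1) * pageSize };
}
```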

Step 7: Create the Health Check Endpoint

This is the first code to write — it validates the entire chain works:

// src/server.ts
import { buildApp } from './app.js';
import { config } from './config/index.js';
import { logger } from './utils/logger.js';

const start = async () => {
  const app = await buildApp();

  try {
    await app.listen({ port: config.port, host: '0.0.0.0' });
    logger.info(`Server running on http://localhost:${config.port}`);
    logger.info(`API docs: http://localhost:${config.port}/docs`);
  } catch (err) {
    logger.error(err);
    process.exit(1);
  }
};

start();
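The error-handling pieces referenced in the project structure (src/utils/errors.ts feeding src/middleware/error-handler.ts) could start from a sketch like this — the class names and response shape are assumptions, not the final design:

```typescript
// Sketch of src/utils/errors.ts (assumed design): custom error classes the
// global error handler can map to HTTP status codes.
export class AppError extends Error {
  constructor(message: string, public readonly statusCode: number = 500) {
    super(message);
    this.name = new.target.name;
  }
}

export class NotFoundError extends AppError {
  constructor(resource: string) {
    super(`${resource} not found`, 404);
  }
}

export class ValidationError extends AppError {
  constructor(message: string) {
    super(message, 400);
  }
}

// The global error handler can then reduce any thrown value to a safe
// { statusCode, message } payload, hiding internals for unknown errors.
export function toErrorResponse(err: unknown): { statusCode: number; message: string } {
  if (err instanceof AppError) {
    return { statusCode: err.statusCode, message: err.message };
  }
  return { statusCode: 500, message: 'Internal Server Error' };
}
```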

Step 8: Add npm Scripts

{
  "scripts": {
    "dev": "tsx watch src/server.ts",
    "build": "tsc",
    "start": "node dist/server.js",
    "lint": "eslint src/ --ext .ts",
    "format": "prettier --write 'src/**/*.ts'",
    "type-check": "tsc --noEmit",
    "test": "vitest run",
    "test:watch": "vitest",
    "migrate": "tsx src/database/migrate.ts",
    "seed": "tsx src/database/seed.ts"
  }
}

Step 9: Create .env.example

# Server
PORT=3000
NODE_ENV=development
LOG_LEVEL=debug

# Database
DATABASE_URL=postgres://arm_dev:arm_dev_password@localhost:5432/arm_development

# Redis
REDIS_URL=redis://localhost:6379

# Auth
JWT_SECRET=change-this-in-production-use-256-bit-key

# API
API_PREFIX=/api/v1
CORS_ORIGIN=http://localhost:3001
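Story 6 requires the app to fail gracefully when required variables are missing. A stdlib-only sketch of the planned src/config/index.ts is below — the project may use zod for richer validation, and the exact field names are assumptions:

```typescript
// Sketch of src/config/index.ts (assumed shape): read the variables from
// .env.example above and fail fast with a clear error when required ones
// are missing. Stdlib-only; the real loader may validate with zod instead.
const REQUIRED = ['DATABASE_URL', 'REDIS_URL', 'JWT_SECRET'] as const;

export interface Config {
  port: number;
  databaseUrl: string;
  redisUrl: string;
  jwtSecret: string;
}

export function loadConfig(env: Record<string, string | undefined> = process.env): Config {
  const missing = REQUIRED.filter((name) => !env[name]);
  if (missing.length > 0) {
    // Clear startup error instead of a confusing failure deep in the app
    throw new Error(`Missing required environment variables: ${missing.join(', ')}`);
  }
  return {
    port: Number(env.PORT ?? 3000),
    databaseUrl: env.DATABASE_URL!,
    redisUrl: env.REDIS_URL!,
    jwtSecret: env.JWT_SECRET!,
  };
}
```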

Phase 4: CI/CD Pipeline (Azure Pipelines)

Pipeline 1: CI (Build Validation)

Create azure-pipelines.yml in the repo root:

# azure-pipelines.yml — CI pipeline
trigger:
  branches:
    include:
      - main
      - develop

pr:
  branches:
    include:
      - main
      - develop

pool:
  vmImage: 'ubuntu-latest'

services:
  postgres:
    image: postgres:16-alpine
    ports:
      - 5432:5432
    env:
      POSTGRES_USER: arm_test
      POSTGRES_PASSWORD: arm_test_password
      POSTGRES_DB: arm_test
  redis:
    image: redis:7-alpine
    ports:
      - 6379:6379

steps:
  - task: NodeTool@0
    inputs:
      versionSpec: '22.x'
    displayName: 'Install Node.js 22'

  - script: npm ci
    displayName: 'Install dependencies'

  - script: npm run lint
    displayName: 'Lint'

  - script: npm run type-check
    displayName: 'Type Check'

  - script: npm run migrate
    displayName: 'Run Migrations'
    env:
      DATABASE_URL: postgres://arm_test:arm_test_password@localhost:5432/arm_test
      REDIS_URL: redis://localhost:6379

  - script: npm test
    displayName: 'Run Tests'
    env:
      DATABASE_URL: postgres://arm_test:arm_test_password@localhost:5432/arm_test
      REDIS_URL: redis://localhost:6379
      JWT_SECRET: test-secret-key-do-not-use-in-production
      NODE_ENV: test

After creating this file, go to Pipelines → New pipeline → Azure Repos Git → select feoda-bi → Existing Azure Pipelines YAML file → select azure-pipelines.yml.

Then add this pipeline as a build validation policy on the main branch (see Step 4a above).

Pipeline 2: GitHub Mirror Sync

Create azure-pipelines-mirror.yml in the repo root:

# azure-pipelines-mirror.yml — Mirror to GitHub
trigger:
  branches:
    include:
      - main
      - develop

pool:
  vmImage: 'ubuntu-latest'

variables:
  - group: github-mirror-secrets

steps:
  - checkout: self
    fetchDepth: 0    # Full history for mirror

  - script: |
      # --mirror force-pushes every local ref and prunes any GitHub ref not present
      # locally, so GitHub must stay read-only (no direct pushes, PRs, or branches there)
      git remote add github https://$(GITHUB_PAT)@github.com/feodadev/feoda-bi.git
      git push github --mirror
    displayName: 'Mirror to GitHub'

Create a second pipeline in Azure DevOps:

  1. Go to Pipelines → New pipeline → Azure Repos Git → select feoda-bi
  2. Select Existing Azure Pipelines YAML file → select azure-pipelines-mirror.yml
  3. Name it: Mirror to GitHub
  4. Link the github-mirror-secrets variable group (grant access when prompted)

How the Mirror Works

Every push to main or develop in Azure Repos triggers the mirror pipeline, which does a git push --mirror to GitHub. This keeps the GitHub repo as an exact copy — all branches, tags, and history are synced. GitHub is read-only from the team's perspective.


Phase 5: Azure Infrastructure (Later — Not Sprint 0)

Not Yet

Do NOT set up Azure cloud infrastructure during Sprint 0. Use Docker Compose for local development. Azure resource provisioning happens before the first deployment (Sprint 2–3).

When ready, the Azure resources to create:

| Resource | Azure Service | Tier | Est. Cost |
| --- | --- | --- | --- |
| Resource Group | rg-arm-dev | | Free |
| PostgreSQL | Azure Database for PostgreSQL Flexible Server | Burstable B1ms | ~$13/mo |
| Redis | Azure Cache for Redis | Basic C0 | ~$16/mo |
| Container App | Azure Container Apps | Consumption | ~$0–20/mo |
| Blob Storage | Azure Storage Account | Hot LRS | ~$1/mo |
| Container Registry | Azure Container Registry | Basic | ~$5/mo |
| Total (dev) | | | ~$35–55/mo |

Checklist

Use this checklist to track your progress:

Phase 0 — Project Setup

  • Azure DevOps organisation created
  • Azure DevOps project "Feoda_BI" created
  • Area paths configured (7 domain-based paths)
  • Iterations (sprints) configured
  • Board columns customised
  • Tag convention agreed (api, database, ui, infra, auth, testing, doc)
  • Azure Repos repository feoda-bi created (primary)
  • Branch policies set on main
  • develop branch created
  • GitHub mirror repo feodadev/feoda-bi created (empty)
  • GitHub PAT created and stored in Azure DevOps variable group
  • Developer accounts added with individual Microsoft identities

Phase 1 — Backlog

  • Epics created in Azure Boards with area paths assigned
  • Sprint 0 user stories created with area paths and tags
  • Sprint 0 stories assigned and estimated

Phase 2 — Dev Environment

  • Docker Desktop installed
  • VS Code + Dev Containers extension installed
  • Repository cloned from Azure Repos
  • "Reopen in Container" opens successfully
  • Node.js 22 available inside container
  • PostgreSQL accessible inside container (psql -h postgres)
  • Redis accessible inside container (redis-cli -h redis ping)

Phase 3 — Scaffolding

  • .devcontainer/devcontainer.json configured
  • .devcontainer/docker-compose.yml configured (app + PostgreSQL + Redis)
  • .nvmrc committed (fallback)
  • package.json initialised with dependencies (including engines field)
  • tsconfig.json configured
  • Project folder structure created
  • Health check endpoint (GET /health) working
  • npm run dev starts server successfully
  • .env.example committed

Phase 4 — CI/CD

  • Azure Pipeline (CI) created (passing blocked — awaiting hosted parallelism grant)
  • Azure Pipeline (GitHub mirror) created (syncing blocked — awaiting hosted parallelism grant; manual sync in place)
  • Build validation policy added to main and develop branches (temporarily disabled until parallelism granted)
  • PR template created

What's Next

After completing this kickoff:

  1. Sprint 1 — Parent Portal and Notifications: parent-facing portal, email notifications, payment history
  2. Sprint 2 — Integrations: ERP connector (NetSuite), SIS data sync
  3. Sprint 3 — Integration Adapters: payment gateway adapters (Eway, partner), refund workflows
  4. Sprint 4 — Auth, Multi-Tenancy and Reporting: JWT auth, RLS, reporting dashboards

Each sprint adds a layer of sophistication around the working billing cycle from Sprint 0 — see the spiral method diagram above.

See the Technology Stack Comparison for the full rationale behind each technology choice, and the Cost Projections for infrastructure cost estimates at each growth phase.