diff --git a/.agents/skills/design-an-interface/SKILL.md b/.agents/skills/design-an-interface/SKILL.md new file mode 100644 index 0000000..d056bd1 --- /dev/null +++ b/.agents/skills/design-an-interface/SKILL.md @@ -0,0 +1,94 @@ +--- +name: design-an-interface +description: Generate multiple radically different interface designs for a module using parallel sub-agents. Use when user wants to design an API, explore interface options, compare module shapes, or mentions "design it twice". +--- + +# Design an Interface + +Based on "Design It Twice" from "A Philosophy of Software Design": your first idea is unlikely to be the best. Generate multiple radically different designs, then compare. + +## Workflow + +### 1. Gather Requirements + +Before designing, understand: + +- [ ] What problem does this module solve? +- [ ] Who are the callers? (other modules, external users, tests) +- [ ] What are the key operations? +- [ ] Any constraints? (performance, compatibility, existing patterns) +- [ ] What should be hidden inside vs exposed? + +Ask: "What does this module need to do? Who will use it?" + +### 2. Generate Designs (Parallel Sub-Agents) + +Spawn 3+ sub-agents simultaneously using Task tool. Each must produce a **radically different** approach. + +``` +Prompt template for each sub-agent: + +Design an interface for: [module description] + +Requirements: [gathered requirements] + +Constraints for this design: [assign a different constraint to each agent] +- Agent 1: "Minimize method count - aim for 1-3 methods max" +- Agent 2: "Maximize flexibility - support many use cases" +- Agent 3: "Optimize for the most common case" +- Agent 4: "Take inspiration from [specific paradigm/library]" + +Output format: +1. Interface signature (types/methods) +2. Usage example (how caller uses it) +3. What this design hides internally +4. Trade-offs of this approach +``` + +### 3. Present Designs + +Show each design with: + +1. **Interface signature** - types, methods, params +2. **Usage examples** - how callers actually use it in practice +3. **What it hides** - complexity kept internal + +Present designs sequentially so user can absorb each approach before comparison. + +### 4. Compare Designs + +After showing all designs, compare them on: + +- **Interface simplicity**: fewer methods, simpler params +- **General-purpose vs specialized**: flexibility vs focus +- **Implementation efficiency**: does shape allow efficient internals? +- **Depth**: small interface hiding significant complexity (good) vs large interface with thin implementation (bad) +- **Ease of correct use** vs **ease of misuse** + +Discuss trade-offs in prose, not tables. Highlight where designs diverge most. + +### 5. Synthesize + +Often the best design combines insights from multiple options. Ask: + +- "Which design best fits your primary use case?" +- "Any elements from other designs worth incorporating?" + +## Evaluation Criteria + +From "A Philosophy of Software Design": + +**Interface simplicity**: Fewer methods, simpler params = easier to learn and use correctly. + +**General-purpose**: Can handle future use cases without changes. But beware over-generalization. + +**Implementation efficiency**: Does interface shape allow efficient implementation? Or force awkward internals? + +**Depth**: Small interface hiding significant complexity = deep module (good). Large interface with thin implementation = shallow module (avoid). 
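+ +To make "depth" concrete, here is a minimal TypeScript sketch. The `FileStore` module and its method names are hypothetical, invented purely for illustration: + +```ts +// Deep: two methods hide caching, retries, and path resolution. +interface DeepFileStore { + read(path: string): Promise<Uint8Array>; + write(path: string, data: Uint8Array): Promise<void>; +} + +// Shallow: the same capability, but every caller must now +// orchestrate open/seek/read/close correctly itself. +interface ShallowFileStore { + open(path: string): Promise<number>; + seek(fd: number, offset: number): Promise<void>; + readChunk(fd: number, length: number): Promise<Uint8Array>; + close(fd: number): Promise<void>; +} +``` + +Both expose the same capability; the deep one keeps the complexity behind two calls.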
+ +## Anti-Patterns + +- Don't let sub-agents produce similar designs - enforce radical difference +- Don't skip comparison - the value is in contrast +- Don't implement - this is purely about interface shape +- Don't evaluate based on implementation effort diff --git a/.agents/skills/edit-article/SKILL.md b/.agents/skills/edit-article/SKILL.md new file mode 100644 index 0000000..b319b7c --- /dev/null +++ b/.agents/skills/edit-article/SKILL.md @@ -0,0 +1,14 @@ +--- +name: edit-article +description: Edit and improve articles by restructuring sections, improving clarity, and tightening prose. Use when user wants to edit, revise, or improve an article draft. +--- + +1. First, divide the article into sections based on its headings. Think about the main points you want to make in those sections. + +Consider that information is a directed acyclic graph, and that pieces of information can depend on other pieces of information. Make sure that the order of the sections and their contents respects these dependencies. + +Confirm the sections with the user. + +2. For each section: + +2a. Rewrite the section to improve clarity, coherence, and flow. Use a maximum of 240 characters per paragraph. diff --git a/.agents/skills/git-guardrails-claude-code/SKILL.md b/.agents/skills/git-guardrails-claude-code/SKILL.md new file mode 100644 index 0000000..d943c68 --- /dev/null +++ b/.agents/skills/git-guardrails-claude-code/SKILL.md @@ -0,0 +1,95 @@ +--- +name: git-guardrails-claude-code +description: Set up Claude Code hooks to block dangerous git commands (push, reset --hard, clean, branch -D, etc.) before they execute. Use when user wants to prevent destructive git operations, add git safety hooks, or block git push/reset in Claude Code. +--- + +# Setup Git Guardrails + +Sets up a PreToolUse hook that intercepts and blocks dangerous git commands before Claude executes them. + +## What Gets Blocked + +- `git push` (all variants including `--force`) +- `git reset --hard` +- `git clean -f` / `git clean -fd` +- `git branch -D` +- `git checkout .` / `git restore .` + +When blocked, Claude sees a message explaining that it does not have authority to run these commands. + +## Steps + +### 1. Ask scope + +Ask the user: install for **this project only** (`.claude/settings.json`) or **all projects** (`~/.claude/settings.json`)? + +### 2. Copy the hook script + +The bundled script is at: [scripts/block-dangerous-git.sh](scripts/block-dangerous-git.sh) + +Copy it to the target location based on scope: + +- **Project**: `.claude/hooks/block-dangerous-git.sh` +- **Global**: `~/.claude/hooks/block-dangerous-git.sh` + +Make it executable with `chmod +x`. + +### 3. Add hook to settings + +Add to the appropriate settings file: + +**Project** (`.claude/settings.json`): + +```json +{ + "hooks": { + "PreToolUse": [ + { + "matcher": "Bash", + "hooks": [ + { + "type": "command", + "command": "\"$CLAUDE_PROJECT_DIR\"/.claude/hooks/block-dangerous-git.sh" + } + ] + } + ] + } +} +``` + +**Global** (`~/.claude/settings.json`): + +```json +{ + "hooks": { + "PreToolUse": [ + { + "matcher": "Bash", + "hooks": [ + { + "type": "command", + "command": "~/.claude/hooks/block-dangerous-git.sh" + } + ] + } + ] + } +} +``` + +If the settings file already exists, merge the hook into the existing `hooks.PreToolUse` array — don't overwrite other settings. + +### 4. Ask about customization + +Ask if user wants to add or remove any patterns from the blocked list. Edit the copied script accordingly. + +### 5. 
Verify + +Run a quick test: + +```bash +echo '{"tool_input":{"command":"git push origin main"}}' | .claude/hooks/block-dangerous-git.sh +``` + +Should exit with code 2 and print a BLOCKED message to stderr. (For a global install, pipe into `~/.claude/hooks/block-dangerous-git.sh` instead.) diff --git a/.agents/skills/git-guardrails-claude-code/scripts/block-dangerous-git.sh b/.agents/skills/git-guardrails-claude-code/scripts/block-dangerous-git.sh new file mode 100755 index 0000000..c40b59c --- /dev/null +++ b/.agents/skills/git-guardrails-claude-code/scripts/block-dangerous-git.sh @@ -0,0 +1,25 @@ +#!/bin/bash + +INPUT=$(cat) +COMMAND=$(echo "$INPUT" | jq -r '.tool_input.command') + +DANGEROUS_PATTERNS=( + "git push" + "git reset --hard" + "git clean -fd" + "git clean -f" + "git branch -D" + "git checkout \." + "git restore \." + "push --force" + "reset --hard" +) + +for pattern in "${DANGEROUS_PATTERNS[@]}"; do + if echo "$COMMAND" | grep -qE "$pattern"; then + echo "BLOCKED: '$COMMAND' matches dangerous pattern '$pattern'. The user has prevented you from doing this." >&2 + exit 2 + fi +done + +exit 0 diff --git a/.agents/skills/grill-me/SKILL.md b/.agents/skills/grill-me/SKILL.md index a8ba574..f1543a9 100644 --- a/.agents/skills/grill-me/SKILL.md +++ b/.agents/skills/grill-me/SKILL.md @@ -3,6 +3,6 @@ name: grill-me description: Interview the user relentlessly about a plan or design until reaching shared understanding, resolving each branch of the decision tree. Use when user wants to stress-test a plan, get grilled on their design, or mentions "grill me". --- -Interview me relentlessly about every aspect of this plan until we reach a shared understanding. Walk down each branch of the design tree, resolving dependencies between decisions one-by-one. +Interview me relentlessly about every aspect of this plan until we reach a shared understanding. Walk down each branch of the design tree, resolving dependencies between decisions one-by-one. For each question, provide your recommended answer. If a question can be answered by exploring the codebase, explore the codebase instead. diff --git a/.agents/skills/migrate-to-shoehorn/SKILL.md b/.agents/skills/migrate-to-shoehorn/SKILL.md new file mode 100644 index 0000000..ae4f965 --- /dev/null +++ b/.agents/skills/migrate-to-shoehorn/SKILL.md @@ -0,0 +1,118 @@ +--- +name: migrate-to-shoehorn +description: Migrate test files from `as` type assertions to @total-typescript/shoehorn. Use when user mentions shoehorn, wants to replace `as` in tests, or needs partial test data. +--- + +# Migrate to Shoehorn + +## Why shoehorn? + +`shoehorn` lets you pass partial data in tests while keeping TypeScript happy. It replaces `as` assertions with type-safe alternatives. + +**Test code only.** Never use shoehorn in production code. 
+ +Problems with `as` in tests: + +- We're trained not to use it +- Must manually specify target type +- Double-as (`as unknown as Type`) for intentionally wrong data + +## Install + +```bash +npm i @total-typescript/shoehorn +``` + +## Migration patterns + +### Large objects with few needed properties + +Before: + +```ts +type Request = { + body: { id: string }; + headers: Record<string, string>; + cookies: Record<string, string>; + // ...20 more properties +}; + +it("gets user by id", () => { + // Only care about body.id but must fake entire Request + getUser({ + body: { id: "123" }, + headers: {}, + cookies: {}, + // ...fake all 20 properties + }); +}); +``` + +After: + +```ts +import { fromPartial } from "@total-typescript/shoehorn"; + +it("gets user by id", () => { + getUser( + fromPartial({ + body: { id: "123" }, + }), + ); +}); +``` + +### `as Type` → `fromPartial()` + +Before: + +```ts +getUser({ body: { id: "123" } } as Request); +``` + +After: + +```ts +import { fromPartial } from "@total-typescript/shoehorn"; + +getUser(fromPartial({ body: { id: "123" } })); +``` + +### `as unknown as Type` → `fromAny()` + +Before: + +```ts +getUser({ body: { id: 123 } } as unknown as Request); // wrong type on purpose +``` + +After: + +```ts +import { fromAny } from "@total-typescript/shoehorn"; + +getUser(fromAny({ body: { id: 123 } })); +``` + +## When to use each + +| Function | Use case | +| --------------- | -------------------------------------------------- | +| `fromPartial()` | Pass partial data that still type-checks | +| `fromAny()` | Pass intentionally wrong data (keeps autocomplete) | +| `fromExact()` | Force full object (swap with fromPartial later) | + +## Workflow + +1. **Gather requirements** - ask user: + - What test files have `as` assertions causing problems? + - Are they dealing with large objects where only some properties matter? + - Do they need to pass intentionally wrong data for error testing? + +2. **Install and migrate**: + - [ ] Install: `npm i @total-typescript/shoehorn` + - [ ] Find test files with `as` assertions: `grep -r " as [A-Z]" --include="*.test.ts" --include="*.spec.ts"` + - [ ] Replace `as Type` with `fromPartial()` + - [ ] Replace `as unknown as Type` with `fromAny()` + - [ ] Add imports from `@total-typescript/shoehorn` + - [ ] Run type check to verify diff --git a/.agents/skills/prd-to-plan/SKILL.md b/.agents/skills/prd-to-plan/SKILL.md new file mode 100644 index 0000000..a5da1c2 --- /dev/null +++ b/.agents/skills/prd-to-plan/SKILL.md @@ -0,0 +1,107 @@ +--- +name: prd-to-plan +description: Turn a PRD into a multi-phase implementation plan using tracer-bullet vertical slices, saved as a local Markdown file in ./plans/. Use when user wants to break down a PRD, create an implementation plan, plan phases from a PRD, or mentions "tracer bullets". +--- + +# PRD to Plan + +Break a PRD into a phased implementation plan using vertical slices (tracer bullets). Output is a Markdown file in `./plans/`. + +## Process + +### 1. Confirm the PRD is in context + +The PRD should already be in the conversation. If it isn't, ask the user to paste it or point you to the file. + +### 2. Explore the codebase + +If you have not already explored the codebase, do so to understand the current architecture, existing patterns, and integration layers. + +### 3. 
Identify durable architectural decisions + +Before slicing, identify high-level decisions that are unlikely to change throughout implementation: + +- Route structures / URL patterns +- Database schema shape +- Key data models +- Authentication / authorization approach +- Third-party service boundaries + +These go in the plan header so every phase can reference them. + +### 4. Draft vertical slices + +Break the PRD into **tracer bullet** phases. Each phase is a thin vertical slice that cuts through ALL integration layers end-to-end, NOT a horizontal slice of one layer. + +- Each slice delivers a narrow but COMPLETE path through every layer (schema, API, UI, tests) +- A completed slice is demoable or verifiable on its own +- Prefer many thin slices over few thick ones +- Do NOT include specific file names, function names, or implementation details that are likely to change as later phases are built +- DO include durable decisions: route paths, schema shapes, data model names + +### 5. Quiz the user + +Present the proposed breakdown as a numbered list. For each phase show: + +- **Title**: short descriptive name +- **User stories covered**: which user stories from the PRD this addresses + +Ask the user: + +- Does the granularity feel right? (too coarse / too fine) +- Should any phases be merged or split further? + +Iterate until the user approves the breakdown. + +### 6. Write the plan file + +Create `./plans/` if it doesn't exist. Write the plan as a Markdown file named after the feature (e.g. `./plans/user-onboarding.md`). Use the template below. + +<plan-template> + +# Plan: <Feature Name> + +> Source PRD: <link or issue> + +## Architectural decisions + +Durable decisions that apply across all phases: + +- **Routes**: ... +- **Schema**: ... +- **Key models**: ... +- (add/remove sections as appropriate) + +--- + +## Phase 1: <Title> + +**User stories**: <list from PRD> + +### What to build + +A concise description of this vertical slice. Describe the end-to-end behavior, not layer-by-layer implementation. + +### Acceptance criteria + +- [ ] Criterion 1 +- [ ] Criterion 2 +- [ ] Criterion 3 + +--- + +## Phase 2: <Title> + +**User stories**: <list from PRD> + +### What to build + +... + +### Acceptance criteria + +- [ ] ... + +<!-- Repeat for each phase --> +</plan-template> diff --git a/.agents/skills/request-refactor-plan/SKILL.md b/.agents/skills/request-refactor-plan/SKILL.md new file mode 100644 index 0000000..7e8b2e4 --- /dev/null +++ b/.agents/skills/request-refactor-plan/SKILL.md @@ -0,0 +1,68 @@ +--- +name: request-refactor-plan +description: Create a detailed refactor plan with tiny commits via user interview, then file it as a GitHub issue. Use when user wants to plan a refactor, create a refactoring RFC, or break a refactor into safe incremental steps. +--- + +This skill will be invoked when the user wants to create a refactor request. You should go through the steps below. You may skip steps if you don't consider them necessary. + +1. Ask the user for a long, detailed description of the problem they want to solve and any potential ideas for solutions. + +2. Explore the repo to verify their assertions and understand the current state of the codebase. + +3. Ask whether they have considered other options, and present other options to them. + +4. Interview the user about the implementation. Be extremely detailed and thorough. + +5. Hammer out the exact scope of the implementation. Work out what you plan to change and what you plan not to change. + +6. Look in the codebase to check for test coverage of this area. 
If there is insufficient test coverage, ask the user what their plans for testing are. + +7. Break the implementation into a plan of tiny commits. Remember Martin Fowler's advice to "make each refactoring step as small as possible, so that you can always see the program working." + +8. Create a GitHub issue with the refactor plan. Use the following template for the issue description: + +<refactor-plan-template> + +## Problem Statement + +The problem that the developer is facing, from the developer's perspective. + +## Solution + +The solution to the problem, from the developer's perspective. + +## Commits + +A LONG, detailed implementation plan. Write the plan in plain English, breaking down the implementation into the tiniest commits possible. Each commit should leave the codebase in a working state. + +## Decision Document + +A list of implementation decisions that were made. This can include: + +- The modules that will be built/modified +- The interfaces of those modules that will be modified +- Technical clarifications from the developer +- Architectural decisions +- Schema changes +- API contracts +- Specific interactions + +Do NOT include specific file paths or code snippets. They may end up being outdated very quickly. + +## Testing Decisions + +A list of testing decisions that were made. Include: + +- A description of what makes a good test (only test external behavior, not implementation details) +- Which modules will be tested +- Prior art for the tests (i.e. similar types of tests in the codebase) + +## Out of Scope + +A description of the things that are out of scope for this refactor. + +## Further Notes (optional) + +Any further notes about the refactor. + +</refactor-plan-template> diff --git a/.agents/skills/scaffold-exercises/SKILL.md b/.agents/skills/scaffold-exercises/SKILL.md new file mode 100644 index 0000000..d87df28 --- /dev/null +++ b/.agents/skills/scaffold-exercises/SKILL.md @@ -0,0 +1,106 @@ +--- +name: scaffold-exercises +description: Create exercise directory structures with sections, problems, solutions, and explainers that pass linting. Use when user wants to scaffold exercises, create exercise stubs, or set up a new course section. +--- + +# Scaffold Exercises + +Create exercise directory structures that pass `pnpm ai-hero-cli internal lint`, then commit with `git commit`. + +## Directory naming + +- **Sections**: `XX-section-name/` inside `exercises/` (e.g., `01-retrieval-skill-building`) +- **Exercises**: `XX.YY-exercise-name/` inside a section (e.g., `01.03-retrieval-with-bm25`) +- Section number = `XX`, exercise number = `XX.YY` +- Names are dash-case (lowercase, hyphens) + +## Exercise variants + +Each exercise needs at least one of these subfolders: + +- `problem/` - student workspace with TODOs +- `solution/` - reference implementation +- `explainer/` - conceptual material, no TODOs + +When stubbing, default to `explainer/` unless the plan specifies otherwise. + +## Required files + +Each subfolder (`problem/`, `solution/`, `explainer/`) needs a `readme.md` that: + +- Is **not empty** (must have real content, even a single title line works) +- Has no broken links + +When stubbing, create a minimal readme with a title and a description: + +```md +# Exercise Title + +Description here +``` + +If the subfolder has code, it also needs a `main.ts` (>1 line). But for stubs, a readme-only exercise is fine. + +## Workflow + +1. **Parse the plan** - extract section names, exercise names, and variant types +2. 
**Create directories** - `mkdir -p` for each path +3. **Create stub readmes** - one `readme.md` per variant folder with a title +4. **Run lint** - `pnpm ai-hero-cli internal lint` to validate +5. **Fix any errors** - iterate until lint passes + +## Lint rules summary + +The linter (`pnpm ai-hero-cli internal lint`) checks: + +- Each exercise has subfolders (`problem/`, `solution/`, `explainer/`) +- At least one of `problem/`, `explainer/`, or `explainer.1/` exists +- `readme.md` exists and is non-empty in the primary subfolder +- No `.gitkeep` files +- No `speaker-notes.md` files +- No broken links in readmes +- No `pnpm run exercise` commands in readmes +- `main.ts` required per subfolder unless it's readme-only + +## Moving/renaming exercises + +When renumbering or moving exercises: + +1. Use `git mv` (not `mv`) to rename directories - preserves git history +2. Update the numeric prefix to maintain order +3. Re-run lint after moves + +Example: + +```bash +git mv exercises/01-retrieval/01.03-embeddings exercises/01-retrieval/01.04-embeddings +``` + +## Example: stubbing from a plan + +Given a plan like: + +``` +Section 05: Memory Skill Building +- 05.01 Introduction to Memory +- 05.02 Short-term Memory (explainer + problem + solution) +- 05.03 Long-term Memory +``` + +Create: + +```bash +mkdir -p exercises/05-memory-skill-building/05.01-introduction-to-memory/explainer +mkdir -p exercises/05-memory-skill-building/05.02-short-term-memory/{explainer,problem,solution} +mkdir -p exercises/05-memory-skill-building/05.03-long-term-memory/explainer +``` + +Then create readme stubs: + +``` +exercises/05-memory-skill-building/05.01-introduction-to-memory/explainer/readme.md -> "# Introduction to Memory" +exercises/05-memory-skill-building/05.02-short-term-memory/explainer/readme.md -> "# Short-term Memory" +exercises/05-memory-skill-building/05.02-short-term-memory/problem/readme.md -> "# Short-term Memory" +exercises/05-memory-skill-building/05.02-short-term-memory/solution/readme.md -> "# Short-term Memory" +exercises/05-memory-skill-building/05.03-long-term-memory/explainer/readme.md -> "# Long-term Memory" +``` diff --git a/.agents/skills/setup-pre-commit/SKILL.md b/.agents/skills/setup-pre-commit/SKILL.md new file mode 100644 index 0000000..395a77b --- /dev/null +++ b/.agents/skills/setup-pre-commit/SKILL.md @@ -0,0 +1,91 @@ +--- +name: setup-pre-commit +description: Set up Husky pre-commit hooks with lint-staged (Prettier), type checking, and tests in the current repo. Use when user wants to add pre-commit hooks, set up Husky, configure lint-staged, or add commit-time formatting/typechecking/testing. +--- + +# Setup Pre-Commit Hooks + +## What This Sets Up + +- **Husky** pre-commit hook +- **lint-staged** running Prettier on all staged files +- **Prettier** config (if missing) +- **typecheck** and **test** scripts in the pre-commit hook + +## Steps + +### 1. Detect package manager + +Check for `package-lock.json` (npm), `pnpm-lock.yaml` (pnpm), `yarn.lock` (yarn), `bun.lockb` (bun). Use whichever is present. Default to npm if unclear. + +### 2. Install dependencies + +Install as devDependencies: + +``` +husky lint-staged prettier +``` + +### 3. Initialize Husky + +```bash +npx husky init +``` + +This creates `.husky/` dir and adds `prepare: "husky"` to package.json. + +### 4. Create `.husky/pre-commit` + +Write this file (no shebang needed for Husky v9+): + +``` +npx lint-staged +npm run typecheck +npm run test +``` + +**Adapt**: Replace `npm` with detected package manager. 
If repo has no `typecheck` or `test` script in package.json, omit those lines and tell the user. + +### 5. Create `.lintstagedrc` + +```json +{ + "*": "prettier --ignore-unknown --write" +} +``` + +### 6. Create `.prettierrc` (if missing) + +Only create if no Prettier config exists. Use these defaults: + +```json +{ + "useTabs": false, + "tabWidth": 2, + "printWidth": 80, + "singleQuote": false, + "trailingComma": "es5", + "semi": true, + "arrowParens": "always" +} +``` + +### 7. Verify + +- [ ] `.husky/pre-commit` exists and is executable +- [ ] `.lintstagedrc` exists +- [ ] `prepare` script in package.json is `"husky"` +- [ ] `prettier` config exists +- [ ] Run `npx lint-staged` to verify it works + +### 8. Commit + +Stage all changed/created files and commit with message: `Add pre-commit hooks (husky + lint-staged + prettier)` + +This will run through the new pre-commit hooks — a good smoke test that everything works. + +## Notes + +- Husky v9+ doesn't need shebangs in hook files +- `prettier --ignore-unknown` skips files Prettier can't parse (images, etc.) +- The pre-commit runs lint-staged first (fast, staged-only), then full typecheck and tests diff --git a/.agents/skills/triage-issue/SKILL.md b/.agents/skills/triage-issue/SKILL.md new file mode 100644 index 0000000..2b5b9fc --- /dev/null +++ b/.agents/skills/triage-issue/SKILL.md @@ -0,0 +1,102 @@ +--- +name: triage-issue +description: Triage a bug or issue by exploring the codebase to find root cause, then create a GitHub issue with a TDD-based fix plan. Use when user reports a bug, wants to file an issue, mentions "triage", or wants to investigate and plan a fix for a problem. +--- + +# Triage Issue + +Investigate a reported problem, find its root cause, and create a GitHub issue with a TDD fix plan. This is a mostly hands-off workflow - minimize questions to the user. + +## Process + +### 1. Capture the problem + +Get a brief description of the issue from the user. If they haven't provided one, ask ONE question: "What's the problem you're seeing?" + +Do NOT ask follow-up questions yet. Start investigating immediately. + +### 2. Explore and diagnose + +Use the Agent tool with subagent_type=Explore to deeply investigate the codebase. Your goal is to find: + +- **Where** the bug manifests (entry points, UI, API responses) +- **What** code path is involved (trace the flow) +- **Why** it fails (the root cause, not just the symptom) +- **What** related code exists (similar patterns, tests, adjacent modules) + +Look at: +- Related source files and their dependencies +- Existing tests (what's tested, what's missing) +- Recent changes to affected files (`git log` on relevant files) +- Error handling in the code path +- Similar patterns elsewhere in the codebase that work correctly + +### 3. Identify the fix approach + +Based on your investigation, determine: + +- The minimal change needed to fix the root cause +- Which modules/interfaces are affected +- What behaviors need to be verified via tests +- Whether this is a regression, missing feature, or design flaw + +### 4. Design TDD fix plan + +Create a concrete, ordered list of RED-GREEN cycles. 
Each cycle is one vertical slice: + +- **RED**: Describe a specific test that captures the broken/missing behavior +- **GREEN**: Describe the minimal code change to make that test pass + +Rules: +- Tests verify behavior through public interfaces, not implementation details +- One test at a time, vertical slices (NOT all tests first, then all code) +- Each test should survive internal refactors +- Include a final refactor step if needed +- **Durability**: Only suggest fixes that would survive radical codebase changes. Describe behaviors and contracts, not internal structure. Tests assert on observable outcomes (API responses, UI state, user-visible effects), not internal state. A good suggestion reads like a spec; a bad one reads like a diff. + +### 5. Create the GitHub issue + +Create a GitHub issue using `gh issue create` with the template below. Do NOT ask the user to review before creating - just create it and share the URL. + +<issue-template> + +## Problem + +A clear description of the bug or issue, including: +- What happens (actual behavior) +- What should happen (expected behavior) +- How to reproduce (if applicable) + +## Root Cause Analysis + +Describe what you found during investigation: +- The code path involved +- Why the current code fails +- Any contributing factors + +Do NOT include specific file paths, line numbers, or implementation details that couple to current code layout. Describe modules, behaviors, and contracts instead. The issue should remain useful even after major refactors. + +## TDD Fix Plan + +A numbered list of RED-GREEN cycles: + +1. **RED**: Write a test that [describes expected behavior] + **GREEN**: [Minimal change to make it pass] + +2. **RED**: Write a test that [describes next behavior] + **GREEN**: [Minimal change to make it pass] + +... + +**REFACTOR**: [Any cleanup needed after all tests pass] + +## Acceptance Criteria + +- [ ] Criterion 1 +- [ ] Criterion 2 +- [ ] All new tests pass +- [ ] Existing tests still pass + +</issue-template> + +After creating the issue, print the issue URL and a one-line summary of the root cause. diff --git a/.agents/skills/ubiquitous-language/SKILL.md b/.agents/skills/ubiquitous-language/SKILL.md new file mode 100644 index 0000000..db4fbd6 --- /dev/null +++ b/.agents/skills/ubiquitous-language/SKILL.md @@ -0,0 +1,84 @@ +--- +name: ubiquitous-language +description: Extract a DDD-style ubiquitous language glossary from the current conversation, flagging ambiguities and proposing canonical terms. Saves to UBIQUITOUS_LANGUAGE.md. Use when user wants to define domain terms, build a glossary, harden terminology, create a ubiquitous language, or mentions "domain model" or "DDD". +--- + +# Ubiquitous Language + +Extract and formalize domain terminology from the current conversation into a consistent glossary, saved to a local file. + +## Process + +1. **Scan the conversation** for domain-relevant nouns, verbs, and concepts +2. **Identify problems**: + - Same word used for different concepts (ambiguity) + - Different words used for the same concept (synonyms) + - Vague or overloaded terms +3. **Propose a canonical glossary** with opinionated term choices +4. **Write to `UBIQUITOUS_LANGUAGE.md`** in the working directory using the format below +5. 
**Output a summary** inline in the conversation + +## Output Format + +Write a `UBIQUITOUS_LANGUAGE.md` file with this structure: + +```md +# Ubiquitous Language + +## Order lifecycle + +| Term | Definition | Aliases to avoid | +|------|-----------|-----------------| +| **Order** | A customer's request to purchase one or more items | Purchase, transaction | +| **Invoice** | A request for payment sent to a customer after delivery | Bill, payment request | + +## People + +| Term | Definition | Aliases to avoid | +|------|-----------|-----------------| +| **Customer** | A person or organization that places orders | Client, buyer, account | +| **User** | An authentication identity in the system | Login, account | + +## Relationships + +- An **Invoice** belongs to exactly one **Customer** +- An **Order** produces one or more **Invoices** + +## Example dialogue + +> **Dev:** "When a **Customer** places an **Order**, do we create the **Invoice** immediately?" +> **Domain expert:** "No — an **Invoice** is only generated once a **Fulfillment** is confirmed. A single **Order** can produce multiple **Invoices** if items ship in separate **Shipments**." +> **Dev:** "So if a **Shipment** is cancelled before dispatch, no **Invoice** exists for it?" +> **Domain expert:** "Exactly. The **Invoice** lifecycle is tied to the **Fulfillment**, not the **Order**." + +## Flagged ambiguities + +- "account" was used to mean both **Customer** and **User** — these are distinct concepts: a **Customer** places orders, while a **User** is an authentication identity that may or may not represent a **Customer**. +``` + +## Rules + +- **Be opinionated.** When multiple words exist for the same concept, pick the best one and list the others as aliases to avoid. +- **Flag conflicts explicitly.** If a term is used ambiguously in the conversation, call it out in the "Flagged ambiguities" section with a clear recommendation. +- **Keep definitions tight.** One sentence max. Define what it IS, not what it does. +- **Show relationships.** Use bold term names and express cardinality where obvious. +- **Only include domain terms.** Skip generic programming concepts (array, function, endpoint) unless they have domain-specific meaning. +- **Group terms into multiple tables** when natural clusters emerge (e.g. by subdomain, lifecycle, or actor). Each group gets its own heading and table. If all terms belong to a single cohesive domain, one table is fine — don't force groupings. +- **Write an example dialogue.** A short conversation (3-5 exchanges) between a dev and a domain expert that demonstrates how the terms interact naturally. The dialogue should clarify boundaries between related concepts and show terms being used precisely. + +## Re-running + +When invoked again in the same conversation: + +1. Read the existing `UBIQUITOUS_LANGUAGE.md` +2. Incorporate any new terms from subsequent discussion +3. Update definitions if understanding has evolved +4. Mark changed entries with "(updated)" and new entries with "(new)" +5. Re-flag any new ambiguities +6. Rewrite the example dialogue to incorporate new terms + +## Post-output instruction + +After writing the file, state: + +> I've written/updated `UBIQUITOUS_LANGUAGE.md`. From this point forward I will use these terms consistently. If I drift from this language or you notice a term that should be added, let me know. 
diff --git a/.agents/skills/write-a-skill/SKILL.md b/.agents/skills/write-a-skill/SKILL.md new file mode 100644 index 0000000..7339c8a --- /dev/null +++ b/.agents/skills/write-a-skill/SKILL.md @@ -0,0 +1,117 @@ +--- +name: write-a-skill +description: Create new agent skills with proper structure, progressive disclosure, and bundled resources. Use when user wants to create, write, or build a new skill. +--- + +# Writing Skills + +## Process + +1. **Gather requirements** - ask user about: + - What task/domain does the skill cover? + - What specific use cases should it handle? + - Does it need executable scripts or just instructions? + - Any reference materials to include? + +2. **Draft the skill** - create: + - SKILL.md with concise instructions + - Additional reference files if content exceeds 500 lines + - Utility scripts if deterministic operations needed + +3. **Review with user** - present draft and ask: + - Does this cover your use cases? + - Anything missing or unclear? + - Should any section be more/less detailed? + +## Skill Structure + +``` +skill-name/ +├── SKILL.md # Main instructions (required) +├── REFERENCE.md # Detailed docs (if needed) +├── EXAMPLES.md # Usage examples (if needed) +└── scripts/ # Utility scripts (if needed) + └── helper.js +``` + +## SKILL.md Template + +```md +--- +name: skill-name +description: Brief description of capability. Use when [specific triggers]. +--- + +# Skill Name + +## Quick start + +[Minimal working example] + +## Workflows + +[Step-by-step processes with checklists for complex tasks] + +## Advanced features + +[Link to separate files: See [REFERENCE.md](REFERENCE.md)] +``` + +## Description Requirements + +The description is **the only thing your agent sees** when deciding which skill to load. It's surfaced in the system prompt alongside all other installed skills. Your agent reads these descriptions and picks the relevant skill based on the user's request. + +**Goal**: Give your agent just enough info to know: + +1. What capability this skill provides +2. When/why to trigger it (specific keywords, contexts, file types) + +**Format**: + +- Max 1024 chars +- Write in third person +- First sentence: what it does +- Second sentence: "Use when [specific triggers]" + +**Good example**: + +``` +Extract text and tables from PDF files, fill forms, merge documents. Use when working with PDF files or when user mentions PDFs, forms, or document extraction. +``` + +**Bad example**: + +``` +Helps with documents. +``` + +The bad example gives your agent no way to distinguish this from other document skills. + +## When to Add Scripts + +Add utility scripts when: + +- Operation is deterministic (validation, formatting) +- Same code would be generated repeatedly +- Errors need explicit handling + +Scripts save tokens and improve reliability vs generated code. 
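+ +A minimal sketch of such a script - the filename `scripts/check-skill.ts` and the exact checks are hypothetical, assuming the frontmatter format above: + +```ts +// scripts/check-skill.ts (hypothetical): deterministic SKILL.md frontmatter check. +import { readFileSync } from "node:fs"; + +const text = readFileSync(process.argv[2] ?? "SKILL.md", "utf8"); +const frontmatter = text.match(/^---\n([\s\S]*?)\n---/)?.[1]; +if (!frontmatter) { + console.error("Missing frontmatter block"); + process.exit(1); +} +const description = frontmatter.match(/^description:\s*(.+)$/m)?.[1] ?? ""; +// Enforce the description rules from this skill. +if (description.length === 0 || description.length > 1024) { + console.error("description must be 1-1024 chars"); + process.exit(1); +} +if (!/use when/i.test(description)) { + console.error('description should include "Use when ..." triggers'); + process.exit(1); +} +console.log("SKILL.md frontmatter OK"); +``` + +Run it with `npx tsx scripts/check-skill.ts path/to/SKILL.md` (assuming tsx is available).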
+ +## When to Split Files + +Split into separate files when: + +- SKILL.md exceeds 100 lines +- Content has distinct domains (finance vs sales schemas) +- Advanced features are rarely needed + +## Review Checklist + +After drafting, verify: + +- [ ] Description includes triggers ("Use when...") +- [ ] SKILL.md under 100 lines +- [ ] No time-sensitive info +- [ ] Consistent terminology +- [ ] Concrete examples included +- [ ] References one level deep diff --git a/.claude/settings.json b/.claude/settings.json new file mode 100644 index 0000000..2fb3cda --- /dev/null +++ b/.claude/settings.json @@ -0,0 +1,6 @@ +{ + "enabledPlugins": { + "ui-ux-pro-max@ui-ux-pro-max-skill": true, + "agent-browser@agent-browser": true + } +} diff --git a/.claude/skills/design-an-interface b/.claude/skills/design-an-interface new file mode 120000 index 0000000..9e0bdac --- /dev/null +++ b/.claude/skills/design-an-interface @@ -0,0 +1 @@ +../../.agents/skills/design-an-interface \ No newline at end of file diff --git a/.claude/skills/edit-article b/.claude/skills/edit-article new file mode 120000 index 0000000..6000a62 --- /dev/null +++ b/.claude/skills/edit-article @@ -0,0 +1 @@ +../../.agents/skills/edit-article \ No newline at end of file diff --git a/.claude/skills/git-guardrails-claude-code b/.claude/skills/git-guardrails-claude-code new file mode 120000 index 0000000..769e701 --- /dev/null +++ b/.claude/skills/git-guardrails-claude-code @@ -0,0 +1 @@ +../../.agents/skills/git-guardrails-claude-code \ No newline at end of file diff --git a/.claude/skills/grill-me b/.claude/skills/grill-me new file mode 120000 index 0000000..eea91a8 --- /dev/null +++ b/.claude/skills/grill-me @@ -0,0 +1 @@ +../../.agents/skills/grill-me \ No newline at end of file diff --git a/.claude/skills/improve-codebase-architecture b/.claude/skills/improve-codebase-architecture new file mode 120000 index 0000000..be3dac9 --- /dev/null +++ b/.claude/skills/improve-codebase-architecture @@ -0,0 +1 @@ +../../.agents/skills/improve-codebase-architecture \ No newline at end of file diff --git a/.claude/skills/migrate-to-shoehorn b/.claude/skills/migrate-to-shoehorn new file mode 120000 index 0000000..a2c8459 --- /dev/null +++ b/.claude/skills/migrate-to-shoehorn @@ -0,0 +1 @@ +../../.agents/skills/migrate-to-shoehorn \ No newline at end of file diff --git a/.claude/skills/prd-to-issues b/.claude/skills/prd-to-issues new file mode 120000 index 0000000..1a23837 --- /dev/null +++ b/.claude/skills/prd-to-issues @@ -0,0 +1 @@ +../../.agents/skills/prd-to-issues \ No newline at end of file diff --git a/.claude/skills/prd-to-plan b/.claude/skills/prd-to-plan new file mode 120000 index 0000000..83a4642 --- /dev/null +++ b/.claude/skills/prd-to-plan @@ -0,0 +1 @@ +../../.agents/skills/prd-to-plan \ No newline at end of file diff --git a/.claude/skills/request-refactor-plan b/.claude/skills/request-refactor-plan new file mode 120000 index 0000000..32190ba --- /dev/null +++ b/.claude/skills/request-refactor-plan @@ -0,0 +1 @@ +../../.agents/skills/request-refactor-plan \ No newline at end of file diff --git a/.claude/skills/scaffold-exercises b/.claude/skills/scaffold-exercises new file mode 120000 index 0000000..b4a6373 --- /dev/null +++ b/.claude/skills/scaffold-exercises @@ -0,0 +1 @@ +../../.agents/skills/scaffold-exercises \ No newline at end of file diff --git a/.claude/skills/setup-pre-commit b/.claude/skills/setup-pre-commit new file mode 120000 index 0000000..340f6b1 --- /dev/null +++ b/.claude/skills/setup-pre-commit @@ -0,0 +1 @@ 
+../../.agents/skills/setup-pre-commit \ No newline at end of file diff --git a/.claude/skills/tdd b/.claude/skills/tdd new file mode 120000 index 0000000..2178bb8 --- /dev/null +++ b/.claude/skills/tdd @@ -0,0 +1 @@ +../../.agents/skills/tdd \ No newline at end of file diff --git a/.claude/skills/triage-issue b/.claude/skills/triage-issue new file mode 120000 index 0000000..8a6c0b9 --- /dev/null +++ b/.claude/skills/triage-issue @@ -0,0 +1 @@ +../../.agents/skills/triage-issue \ No newline at end of file diff --git a/.claude/skills/ubiquitous-language b/.claude/skills/ubiquitous-language new file mode 120000 index 0000000..8f91da6 --- /dev/null +++ b/.claude/skills/ubiquitous-language @@ -0,0 +1 @@ +../../.agents/skills/ubiquitous-language \ No newline at end of file diff --git a/.claude/skills/write-a-prd b/.claude/skills/write-a-prd new file mode 120000 index 0000000..34cf7ba --- /dev/null +++ b/.claude/skills/write-a-prd @@ -0,0 +1 @@ +../../.agents/skills/write-a-prd \ No newline at end of file diff --git a/.claude/skills/write-a-skill b/.claude/skills/write-a-skill new file mode 120000 index 0000000..8e09e46 --- /dev/null +++ b/.claude/skills/write-a-skill @@ -0,0 +1 @@ +../../.agents/skills/write-a-skill \ No newline at end of file diff --git a/.env.dev b/.env.dev new file mode 100644 index 0000000..15a943b --- /dev/null +++ b/.env.dev @@ -0,0 +1,12 @@ +# PostgreSQL +POSTGRES_USER=cmdb +POSTGRES_PASSWORD=cmdb +POSTGRES_DB=cmdb +POSTGRES_HOST=postgres +POSTGRES_PORT=5432 + +# Kafka +KAFKA_BOOTSTRAP_SERVERS=kafka:9092 + +# Redis +REDIS_URL=redis://redis:6379 diff --git a/.env.example b/.env.example new file mode 100644 index 0000000..194e0a2 --- /dev/null +++ b/.env.example @@ -0,0 +1,12 @@ +# PostgreSQL +POSTGRES_USER= +POSTGRES_PASSWORD= +POSTGRES_DB= +POSTGRES_HOST= +POSTGRES_PORT= + +# Kafka +KAFKA_BOOTSTRAP_SERVERS= + +# Redis +REDIS_URL= diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml new file mode 100644 index 0000000..1e7694e --- /dev/null +++ b/.github/workflows/ci.yml @@ -0,0 +1,66 @@ +name: CI + +on: + push: + branches: [dev, main] + pull_request: + branches: [dev, main] + +jobs: + python-lint: + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@v4 + - uses: astral-sh/setup-uv@v5 + with: + version: "latest" + - uses: actions/setup-python@v5 + with: + python-version: "3.13" + - run: uv sync + - run: uv run ruff check . + - run: uv run ruff format --check . 
+ + python-test: + runs-on: ubuntu-latest + needs: python-lint + strategy: + matrix: + service: [ipam, auth, event, webhook] + steps: + - uses: actions/checkout@v4 + - uses: astral-sh/setup-uv@v5 + with: + version: "latest" + - uses: actions/setup-python@v5 + with: + python-version: "3.13" + - run: uv sync + - run: uv run --package cmdb-${{ matrix.service }} pytest services/${{ matrix.service }}/tests/ -m "not integration" -x -q + + frontend-lint: + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@v4 + - uses: actions/setup-node@v4 + with: + node-version: "22" + - uses: pnpm/action-setup@v4 + with: + version: latest + - run: cd frontend && pnpm install --frozen-lockfile + - run: cd frontend && pnpm lint + + frontend-build: + runs-on: ubuntu-latest + needs: frontend-lint + steps: + - uses: actions/checkout@v4 + - uses: actions/setup-node@v4 + with: + node-version: "22" + - uses: pnpm/action-setup@v4 + with: + version: latest + - run: cd frontend && pnpm install --frozen-lockfile + - run: cd frontend && pnpm --filter @cmdb/client build diff --git a/.gitignore b/.gitignore new file mode 100644 index 0000000..c7b7989 --- /dev/null +++ b/.gitignore @@ -0,0 +1,61 @@ +# Python +__pycache__/ +*.py[cod] +*$py.class +*.so +*.egg-info/ +*.egg +dist/ +build/ +.eggs/ +*.whl + +# Virtual environments +.venv/ +venv/ +ENV/ + +# uv +.python-versions + +# Testing +.pytest_cache/ +.coverage +htmlcov/ +.tox/ + +# IDE +.idea/ +.vscode/ +*.swp +*.swo +*~ +.DS_Store + +# Node.js +node_modules/ +.next/ +out/ +.turbo/ + +# Environment +.env +.env.local +.env.prod +.env.*.local + +# Docker +docker-compose.override.yml + +# RSA keys +keys/ + +# SSL certificates +infrastructure/nginx/ssl/ + +# Logs +*.log +npm-debug.log* + +# OS +Thumbs.db diff --git a/.node-version b/.node-version new file mode 100644 index 0000000..2bd5a0a --- /dev/null +++ b/.node-version @@ -0,0 +1 @@ +22 diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml new file mode 100644 index 0000000..d4de6b0 --- /dev/null +++ b/.pre-commit-config.yaml @@ -0,0 +1,7 @@ +repos: + - repo: https://github.com/astral-sh/ruff-pre-commit + rev: v0.9.10 + hooks: + - id: ruff + args: [--fix] + - id: ruff-format diff --git a/.python-version b/.python-version new file mode 100644 index 0000000..24ee5b1 --- /dev/null +++ b/.python-version @@ -0,0 +1 @@ +3.13 diff --git a/CLAUDE.ko.md b/CLAUDE.ko.md new file mode 100644 index 0000000..224131a --- /dev/null +++ b/CLAUDE.ko.md @@ -0,0 +1,116 @@ +# CLAUDE.ko.md + +이 파일은 CLAUDE.md의 한국어 번역본으로, 사람이 읽기 위한 용도입니다. +Claude Code는 이 파일이 아닌 CLAUDE.md를 참조합니다. + +## 빌드 & 개발 명령어 + +```bash +# 의존성 +uv sync # 워크스페이스 전체 의존성 설치 + +# 테스트 +uv run pytest # 전체 테스트 실행 +make test # 위와 동일 +uv run pytest services/ipam/tests/test_domain/test_prefix.py::TestPrefixCreate::test_method # 단일 테스트 + +# 린트 & 포맷 +uv run ruff check . # 린트 +make lint # 린트 (프론트엔드 포함) +uv run ruff format . # 포맷 +make format # 위와 동일 + +# Docker 개발 환경 +make dev-up # 개발 컨테이너 시작 +make dev-down # 개발 컨테이너 중지 +make dev-logs # 개발 컨테이너 로그 확인 +make dev-build # 개발 이미지 빌드 + +# 프론트엔드 +cd frontend && pnpm install && pnpm dev # 개발 서버 +cd frontend && pnpm lint # 프론트엔드 린트 +``` + +## 아키텍처 개요 + +**uv 모노레포 워크스페이스**: `services/*` 와 `shared` 패키지로 구성. 
+ +각 서비스는 **DDD + CQRS + Event Sourcing** 패턴을 따름: + +``` +services/<name>/src/<name>/ + domain/ # Aggregate, Entity, Value Object, Domain Event, Repository 인터페이스 + application/ # Command, Query, Handler, DTO + infrastructure/ # DB 모델, Repository 구현체, 설정 + interface/ # FastAPI 라우터, 스키마, 메인 앱 +``` + +**공유 라이브러리** (`shared/src/shared/`) 제공 모듈: +- `domain/` — 기반 클래스: Entity, ValueObject, AggregateRoot, Repository, DomainService, CustomField, Tag +- `cqrs/` — Command/Query 버스, Command/Query 기반 클래스 +- `event/` — Event Store, DomainEvent, AggregateRoot(ES), Snapshot 지원 +- `api/` — 페이지네이션, 필터링, 정렬, 에러 핸들링, OpenAPI 유틸, 미들웨어 +- `messaging/` — Kafka Producer/Consumer, 직렬화 +- `db/` — Tenant DB 매니저 (멀티테넌시) + +**기술 스택**: Python 3.13, FastAPI, PostgreSQL, Kafka, Redis, Next.js (프론트엔드) + +## 설계 & 구현 가이드라인 + +### DDD 패턴 +- **Entity**: ID 기반 도메인 객체, 생명주기를 가짐 +- **Value Object**: 불변, 값에 의한 동등성 +- **Aggregate**: 일관성 경계, AggregateRoot를 통해서만 접근, Repository로만 조회 +- **Repository**: 도메인 레이어에 인터페이스, 인프라 레이어에 구현체 +- **Domain Event**: 도메인에서 발생한 사건의 기록 +- **Domain Service**: 단일 Aggregate에 속하지 않는 로직 + +### CQRS + Event Sourcing 흐름 +``` +Command → CommandHandler → Aggregate (apply()로 상태 변경) → DomainEvent + → Event Store (추가) + Kafka (발행) + +Query → QueryHandler → Read Model (비정규화된 프로젝션) +``` + +### 멀티테넌시 +- `shared.db.tenant_db_manager`를 통한 테넌트 수준 데이터베이스 격리 +- 각 테넌트는 별도 스키마 또는 데이터베이스를 가짐 + +### 서비스 간 통신 +- 서비스 간 통신은 **오직** Kafka 비동기 이벤트만 사용 +- 동기 서비스 간 호출 금지 + +### 클린 아키텍처 레이어 +- **Domain**은 아무것도 의존하지 않음 +- **Application**은 Domain에 의존 +- **Infrastructure**는 Domain + Application에 의존 +- **Interface**는 Application에 의존 (Domain 내부에 직접 의존하지 않음) + +## 코드 스타일 + +- **Ruff** 설정: `line-length = 120`, `target-version = "py313"`, `quote-style = "double"` +- 린트 규칙: E, F, I, N (N802 제외), W, UP, B, A, SIM +- **Pre-commit 훅** 설정됨: `ruff --fix` + `ruff format` +- 커밋 전 `uv run ruff check . && uv run ruff format .` 실행 + +## 프로젝트 관리 + +- **GitHub 저장소**: `fray-cloud/cmdb` +- **이슈 트래킹**: GitHub Issues, 마일스톤 `P1` (Phase 1) +- **PRD**: Issue #1 +- **워크플로우**: 섹션별 커밋 → push → 이슈 태스크 체크박스 체크 + +## 사용 가능한 스킬 + +| 스킬 | 트리거 | 용도 | +|------|--------|------| +| `/grill-me` | "grill me", 설계 스트레스 테스트 | 공유된 이해에 도달할 때까지 계획에 대해 집요하게 질문 | +| `/design-an-interface` | API 설계, 인터페이스 탐색 | 다양한 인터페이스 설계안을 병렬로 생성 | +| `/prd-to-plan` | PRD 분해, 단계 계획 | PRD를 tracer-bullet 구현 계획으로 변환 | +| `/prd-to-issues` | PRD를 이슈로 변환 | PRD를 GitHub 이슈로 분해 | +| `/triage-issue` | 버그 보고, 이슈 조사 | 코드베이스 탐색 + TDD 수정 계획으로 버그 분류 | +| `/ubiquitous-language` | 도메인 용어 정의, 용어집 | 대화에서 DDD 유비쿼터스 언어 추출 | +| `/request-refactor-plan` | 리팩터링 계획, RFC | 작은 커밋 단위의 리팩터링 계획을 GitHub 이슈로 생성 | +| `/write-a-skill` | 새 스킬 생성 | 올바른 구조의 에이전트 스킬 생성 | +| `/tdd` | TDD, red-green-refactor | 테스트 주도 개발 루프 | diff --git a/CLAUDE.md b/CLAUDE.md new file mode 100644 index 0000000..519a426 --- /dev/null +++ b/CLAUDE.md @@ -0,0 +1,117 @@ +# CLAUDE.md + +This file provides guidance to Claude Code when working in this repository. + +> **CLAUDE.ko.md** is a Korean translation for human readers. When editing this file, also update CLAUDE.ko.md. Claude Code should only reference this file (CLAUDE.md). + +## Build & Development Commands + +```bash +# Dependencies +uv sync # Install all workspace dependencies + +# Testing +uv run pytest # Run all tests +make test # Same as above +uv run pytest services/ipam/tests/test_domain/test_prefix.py::TestPrefixCreate::test_method # Single test + +# Linting & Formatting +uv run ruff check . # Lint +make lint # Lint (includes frontend) +uv run ruff format . 
# Format +make format # Same as above + +# Docker Dev Environment +make dev-up # Start dev containers +make dev-down # Stop dev containers +make dev-logs # Tail dev container logs +make dev-build # Build dev images + +# Frontend +cd frontend && pnpm install && pnpm dev # Dev server +cd frontend && pnpm lint # Lint frontend +``` + +## Architecture Overview + +**uv monorepo workspace** with `services/*` and `shared` packages. + +Each service follows **DDD + CQRS + Event Sourcing**: + +``` +services/<name>/src/<name>/ + domain/ # Aggregates, Entities, Value Objects, Domain Events, Repository interfaces + application/ # Commands, Queries, Handlers, DTOs + infrastructure/ # DB models, Repository implementations, Config + interface/ # FastAPI routers, Schemas, Main app +``` + +**Shared library** (`shared/src/shared/`) provides: +- `domain/` — Base classes: Entity, ValueObject, AggregateRoot, Repository, DomainService, CustomField, Tag +- `cqrs/` — Command/Query bus, Command/Query base classes +- `event/` — Event Store, DomainEvent, AggregateRoot with ES, Snapshot support +- `api/` — Pagination, Filtering, Sorting, Error handling, OpenAPI utils, Middleware +- `messaging/` — Kafka Producer/Consumer, Serialization +- `db/` — Tenant DB manager (multi-tenancy) + +**Tech stack**: Python 3.13, FastAPI, PostgreSQL, Kafka, Redis, Next.js (frontend) + +## Design & Implementation Guidelines + +### DDD Patterns +- **Entity**: Identity-based domain object with lifecycle +- **Value Object**: Immutable, equality by value +- **Aggregate**: Consistency boundary with AggregateRoot, accessed only through Repository +- **Repository**: Interface in domain layer, implementation in infrastructure +- **Domain Event**: Record of something that happened in the domain +- **Domain Service**: Logic that doesn't belong to a single Aggregate + +### CQRS + Event Sourcing Flow +``` +Command → CommandHandler → Aggregate (mutate via apply()) → DomainEvent + → Event Store (append) + Kafka (publish) + +Query → QueryHandler → Read Model (denormalized projection) +``` + +### Multi-Tenancy +- Tenant-level database isolation via `shared.db.tenant_db_manager` +- Each tenant gets its own schema or database + +### Cross-Service Communication +- Services communicate **only** via async Kafka events +- No synchronous inter-service calls + +### Clean Architecture Layers +- **Domain** depends on nothing +- **Application** depends on Domain +- **Infrastructure** depends on Domain + Application +- **Interface** depends on Application (never directly on Domain internals) + +## Code Style + +- **Ruff** config: `line-length = 120`, `target-version = "py313"`, `quote-style = "double"` +- Lint rules: E, F, I, N (except N802), W, UP, B, A, SIM +- **Pre-commit hooks** configured: `ruff --fix` + `ruff format` +- Run `uv run ruff check . 
&& uv run ruff format .` before committing + +## Project Management + +- **GitHub repo**: `fray-cloud/cmdb` +- **Issue tracking**: GitHub Issues with milestone `P1` (Phase 1) +- **PRD**: Issue #1 +- **Workflow**: section-by-section commits → push → check issue task checkboxes + +## Available Skills + +| Skill | Trigger | Purpose | +|-------|---------|---------| +| `/grill-me` | "grill me", stress-test a design | Interview relentlessly about a plan until shared understanding | +| `/design-an-interface` | design API, explore interface options | Generate multiple interface designs in parallel | +| `/prd-to-plan` | break down PRD, plan phases | Turn PRD into tracer-bullet implementation plan | +| `/prd-to-issues` | convert PRD to issues | Break PRD into GitHub issues | +| `/triage-issue` | report bug, investigate issue | Triage bug with codebase exploration + TDD fix plan | +| `/ubiquitous-language` | define domain terms, glossary | Extract DDD ubiquitous language from conversation | +| `/request-refactor-plan` | plan refactor, refactoring RFC | Create refactor plan with tiny commits as GitHub issue | +| `/write-a-skill` | create new skill | Create new agent skills with proper structure | +| `/tdd` | TDD, red-green-refactor | Test-driven development loop | diff --git a/Makefile b/Makefile new file mode 100644 index 0000000..a51a2de --- /dev/null +++ b/Makefile @@ -0,0 +1,73 @@ +.PHONY: dev-up dev-down dev-logs dev-build dev-init prod-up prod-down prod-build \ + lint format test db-shell kafka-shell redis-shell clean dev-keygen dev-cert + +# Dev environment +dev-up: + docker compose -f docker-compose.dev.yml up -d + +dev-down: + docker compose -f docker-compose.dev.yml down + +dev-logs: + docker compose -f docker-compose.dev.yml logs -f + +dev-build: + docker build --target dev -t cmdb-base:dev -f infrastructure/docker/Dockerfile.base . + docker compose -f docker-compose.dev.yml build + +dev-init: dev-build dev-up + @echo "Running database migrations..." + docker compose -f docker-compose.dev.yml run --rm auth-init + docker compose -f docker-compose.dev.yml run --rm ipam-init + docker compose -f docker-compose.dev.yml run --rm event-init + docker compose -f docker-compose.dev.yml run --rm tenant-init + @echo "All migrations completed." + +# Prod environment +prod-up: + docker compose -f docker-compose.yml up -d + +prod-down: + docker compose -f docker-compose.yml down + +prod-build: + docker build --target prod -t cmdb-base:prod -f infrastructure/docker/Dockerfile.base . + docker compose -f docker-compose.yml build + +# Development utilities +lint: + uv run ruff check . + cd frontend && pnpm lint + +format: + uv run ruff format . 
+ +test: + uv run pytest + +db-shell: + docker compose -f docker-compose.dev.yml exec postgres psql -U cmdb + +kafka-shell: + docker compose -f docker-compose.dev.yml exec kafka bash + +redis-shell: + docker compose -f docker-compose.dev.yml exec redis redis-cli + +dev-cert: + @mkdir -p infrastructure/nginx/ssl + openssl req -x509 -nodes -days 365 \ + -newkey rsa:2048 \ + -keyout infrastructure/nginx/ssl/key.pem \ + -out infrastructure/nginx/ssl/cert.pem \ + -subj "/CN=localhost" + @echo "SSL certificates generated in infrastructure/nginx/ssl/" + +dev-keygen: + @mkdir -p keys + openssl genrsa -out keys/private.pem 2048 + openssl rsa -in keys/private.pem -pubout -out keys/public.pem + @echo "RSA keys generated in keys/" + +clean: + docker compose -f docker-compose.dev.yml down -v --remove-orphans diff --git a/docker-compose.dev.yml b/docker-compose.dev.yml new file mode 100644 index 0000000..197a4c7 --- /dev/null +++ b/docker-compose.dev.yml @@ -0,0 +1,269 @@ +services: + postgres: + image: postgres:16 + environment: + POSTGRES_USER: ${POSTGRES_USER:-cmdb} + POSTGRES_PASSWORD: ${POSTGRES_PASSWORD:-cmdb} + POSTGRES_DB: ${POSTGRES_DB:-cmdb} + ports: + - "5432:5432" + volumes: + - postgres-data:/var/lib/postgresql/data + - ./infrastructure/docker/init-databases.sh:/docker-entrypoint-initdb.d/init-databases.sh + healthcheck: + test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER:-cmdb}"] + interval: 10s + timeout: 5s + retries: 5 + networks: + - cmdb-network + + kafka: + image: apache/kafka:3.9.0 + environment: + KAFKA_NODE_ID: 1 + KAFKA_PROCESS_ROLES: broker,controller + KAFKA_CONTROLLER_QUORUM_VOTERS: 1@kafka:9093 + KAFKA_LISTENERS: PLAINTEXT://:9092,CONTROLLER://:9093 + KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092 + KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: CONTROLLER:PLAINTEXT,PLAINTEXT:PLAINTEXT + KAFKA_CONTROLLER_LISTENER_NAMES: CONTROLLER + KAFKA_AUTO_CREATE_TOPICS_ENABLE: "true" + KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1 + CLUSTER_ID: "cmdb-dev-cluster-001" + ports: + - "9092:9092" + volumes: + - kafka-data:/tmp/kraft-combined-logs + healthcheck: + test: ["CMD-SHELL", "/opt/kafka/bin/kafka-broker-api-versions.sh --bootstrap-server localhost:9092 || exit 1"] + interval: 10s + timeout: 10s + retries: 5 + networks: + - cmdb-network + + redis: + image: redis:7 + ports: + - "6379:6379" + volumes: + - redis-data:/data + healthcheck: + test: ["CMD", "redis-cli", "ping"] + interval: 10s + timeout: 5s + retries: 5 + networks: + - cmdb-network + + ipam: + build: + context: . + dockerfile: services/ipam/Dockerfile.dev + volumes: + - ./services/ipam/src:/app/services/ipam/src + ports: + - "8001:8000" + environment: + DATABASE_URL: postgresql+asyncpg://${POSTGRES_USER:-cmdb}:${POSTGRES_PASSWORD:-cmdb}@postgres:5432/cmdb_ipam + KAFKA_BOOTSTRAP_SERVERS: kafka:9092 + REDIS_URL: redis://redis:6379 + depends_on: + postgres: + condition: service_healthy + kafka: + condition: service_healthy + redis: + condition: service_healthy + networks: + - cmdb-network + + auth: + build: + context: . 
+ dockerfile: services/auth/Dockerfile.dev + volumes: + - ./services/auth/src:/app/services/auth/src + - ./keys:/app/keys:ro + ports: + - "8002:8000" + environment: + DATABASE_URL: postgresql+asyncpg://${POSTGRES_USER:-cmdb}:${POSTGRES_PASSWORD:-cmdb}@postgres:5432/cmdb_auth + KAFKA_BOOTSTRAP_SERVERS: kafka:9092 + REDIS_URL: redis://redis:6379 + RSA_PRIVATE_KEY_PATH: /app/keys/private.pem + RSA_PUBLIC_KEY_PATH: /app/keys/public.pem + depends_on: + postgres: + condition: service_healthy + kafka: + condition: service_healthy + redis: + condition: service_healthy + networks: + - cmdb-network + + tenant: + build: + context: . + dockerfile: services/tenant/Dockerfile.dev + volumes: + - ./services/tenant/src:/app/services/tenant/src + ports: + - "8003:8000" + environment: + DATABASE_URL: postgresql+asyncpg://${POSTGRES_USER:-cmdb}:${POSTGRES_PASSWORD:-cmdb}@postgres:5432/cmdb_tenant + KAFKA_BOOTSTRAP_SERVERS: kafka:9092 + REDIS_URL: redis://redis:6379 + depends_on: + postgres: + condition: service_healthy + kafka: + condition: service_healthy + redis: + condition: service_healthy + networks: + - cmdb-network + + event: + build: + context: . + dockerfile: services/event/Dockerfile.dev + volumes: + - ./services/event/src:/app/services/event/src + ports: + - "8004:8000" + environment: + DATABASE_URL: postgresql+asyncpg://${POSTGRES_USER:-cmdb}:${POSTGRES_PASSWORD:-cmdb}@postgres:5432/cmdb_event + KAFKA_BOOTSTRAP_SERVERS: kafka:9092 + REDIS_URL: redis://redis:6379 + depends_on: + postgres: + condition: service_healthy + kafka: + condition: service_healthy + redis: + condition: service_healthy + networks: + - cmdb-network + + webhook: + build: + context: . + dockerfile: services/webhook/Dockerfile.dev + volumes: + - ./services/webhook/src:/app/services/webhook/src + ports: + - "8005:8000" + environment: + DATABASE_URL: postgresql+asyncpg://${POSTGRES_USER:-cmdb}:${POSTGRES_PASSWORD:-cmdb}@postgres:5432/cmdb_webhook + KAFKA_BOOTSTRAP_SERVERS: kafka:9092 + REDIS_URL: redis://redis:6379 + depends_on: + postgres: + condition: service_healthy + kafka: + condition: service_healthy + redis: + condition: service_healthy + networks: + - cmdb-network + + frontend: + build: + context: ./frontend + dockerfile: apps/client/Dockerfile.dev + volumes: + - ./frontend/apps/client/src:/app/apps/client/src + - ./frontend/packages:/app/packages + ports: + - "3000:3000" + networks: + - cmdb-network + + nginx: + image: nginx:latest + volumes: + - ./infrastructure/nginx/nginx.conf:/etc/nginx/nginx.conf:ro + - ./infrastructure/nginx/ssl:/etc/nginx/ssl:ro + ports: + - "80:80" + - "443:443" + depends_on: + ipam: + condition: service_started + auth: + condition: service_started + tenant: + condition: service_started + event: + condition: service_started + webhook: + condition: service_started + frontend: + condition: service_started + networks: + - cmdb-network + + # ── Init containers (migration) ── + auth-init: + image: cmdb-auth:latest + command: uv run --package cmdb-auth alembic upgrade head + working_dir: /app/services/auth + environment: + DATABASE_URL: postgresql+asyncpg://${POSTGRES_USER:-cmdb}:${POSTGRES_PASSWORD:-cmdb}@postgres:5432/cmdb_auth + depends_on: + postgres: + condition: service_healthy + profiles: ["init"] + networks: + - cmdb-network + + ipam-init: + image: cmdb-ipam:latest + command: uv run --package cmdb-ipam alembic upgrade head + working_dir: /app/services/ipam + environment: + DATABASE_URL: postgresql+asyncpg://${POSTGRES_USER:-cmdb}:${POSTGRES_PASSWORD:-cmdb}@postgres:5432/cmdb_ipam + depends_on: + 
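+      # gate the migration on postgres being healthy; the bare `image:`
+      # reference assumes a cmdb-ipam:latest image has been built and tagged
+      # beforehand (the dev services above build from Dockerfile.dev instead)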
postgres: + condition: service_healthy + profiles: ["init"] + networks: + - cmdb-network + + event-init: + image: cmdb-event:latest + command: uv run --package cmdb-event alembic upgrade head + working_dir: /app/services/event + environment: + DATABASE_URL: postgresql+asyncpg://${POSTGRES_USER:-cmdb}:${POSTGRES_PASSWORD:-cmdb}@postgres:5432/cmdb_event + depends_on: + postgres: + condition: service_healthy + profiles: ["init"] + networks: + - cmdb-network + + tenant-init: + image: cmdb-tenant:latest + command: uv run --package cmdb-tenant alembic upgrade head + working_dir: /app/services/tenant + environment: + DATABASE_URL: postgresql+asyncpg://${POSTGRES_USER:-cmdb}:${POSTGRES_PASSWORD:-cmdb}@postgres:5432/cmdb_tenant + depends_on: + postgres: + condition: service_healthy + profiles: ["init"] + networks: + - cmdb-network + +volumes: + postgres-data: + kafka-data: + redis-data: + +networks: + cmdb-network: + driver: bridge diff --git a/docker-compose.yml b/docker-compose.yml new file mode 100644 index 0000000..06740ae --- /dev/null +++ b/docker-compose.yml @@ -0,0 +1,8 @@ +include: + - path: infrastructure/docker-compose.yml + - path: services/ipam/docker-compose.yml + - path: services/auth/docker-compose.yml + - path: services/tenant/docker-compose.yml + - path: services/event/docker-compose.yml + - path: services/webhook/docker-compose.yml + - path: frontend/docker-compose.yml diff --git a/frontend/.gitignore b/frontend/.gitignore new file mode 100644 index 0000000..5ef6a52 --- /dev/null +++ b/frontend/.gitignore @@ -0,0 +1,41 @@ +# See https://help.github.com/articles/ignoring-files/ for more about ignoring files. + +# dependencies +/node_modules +/.pnp +.pnp.* +.yarn/* +!.yarn/patches +!.yarn/plugins +!.yarn/releases +!.yarn/versions + +# testing +/coverage + +# next.js +/.next/ +/out/ + +# production +/build + +# misc +.DS_Store +*.pem + +# debug +npm-debug.log* +yarn-debug.log* +yarn-error.log* +.pnpm-debug.log* + +# env files (can opt-in for committing if needed) +.env* + +# vercel +.vercel + +# typescript +*.tsbuildinfo +next-env.d.ts diff --git a/frontend/README.md b/frontend/README.md new file mode 100644 index 0000000..e215bc4 --- /dev/null +++ b/frontend/README.md @@ -0,0 +1,36 @@ +This is a [Next.js](https://nextjs.org) project bootstrapped with [`create-next-app`](https://nextjs.org/docs/app/api-reference/cli/create-next-app). + +## Getting Started + +First, run the development server: + +```bash +npm run dev +# or +yarn dev +# or +pnpm dev +# or +bun dev +``` + +Open [http://localhost:3000](http://localhost:3000) with your browser to see the result. + +You can start editing the page by modifying `app/page.tsx`. The page auto-updates as you edit the file. + +This project uses [`next/font`](https://nextjs.org/docs/app/building-your-application/optimizing/fonts) to automatically optimize and load [Geist](https://vercel.com/font), a new font family for Vercel. + +## Learn More + +To learn more about Next.js, take a look at the following resources: + +- [Next.js Documentation](https://nextjs.org/docs) - learn about Next.js features and API. +- [Learn Next.js](https://nextjs.org/learn) - an interactive Next.js tutorial. + +You can check out [the Next.js GitHub repository](https://github.com/vercel/next.js) - your feedback and contributions are welcome! 
+ +## Deploy on Vercel + +The easiest way to deploy your Next.js app is to use the [Vercel Platform](https://vercel.com/new?utm_medium=default-template&filter=next.js&utm_source=create-next-app&utm_campaign=create-next-app-readme) from the creators of Next.js. + +Check out our [Next.js deployment documentation](https://nextjs.org/docs/app/building-your-application/deploying) for more details. diff --git a/frontend/apps/admin/eslint.config.mjs b/frontend/apps/admin/eslint.config.mjs new file mode 100644 index 0000000..05e726d --- /dev/null +++ b/frontend/apps/admin/eslint.config.mjs @@ -0,0 +1,18 @@ +import { defineConfig, globalIgnores } from "eslint/config"; +import nextVitals from "eslint-config-next/core-web-vitals"; +import nextTs from "eslint-config-next/typescript"; + +const eslintConfig = defineConfig([ + ...nextVitals, + ...nextTs, + // Override default ignores of eslint-config-next. + globalIgnores([ + // Default ignores of eslint-config-next: + ".next/**", + "out/**", + "build/**", + "next-env.d.ts", + ]), +]); + +export default eslintConfig; diff --git a/frontend/apps/admin/next.config.ts b/frontend/apps/admin/next.config.ts new file mode 100644 index 0000000..8c49ac8 --- /dev/null +++ b/frontend/apps/admin/next.config.ts @@ -0,0 +1,7 @@ +import type { NextConfig } from "next"; + +const nextConfig: NextConfig = { + transpilePackages: ["@cmdb/shared"], +}; + +export default nextConfig; diff --git a/frontend/apps/admin/package.json b/frontend/apps/admin/package.json new file mode 100644 index 0000000..51dbea8 --- /dev/null +++ b/frontend/apps/admin/package.json @@ -0,0 +1,27 @@ +{ + "name": "@cmdb/admin", + "version": "0.1.0", + "private": true, + "scripts": { + "dev": "next dev --port 3001", + "build": "next build", + "start": "next start", + "lint": "eslint" + }, + "dependencies": { + "@cmdb/shared": "workspace:*", + "next": "16.1.7", + "react": "19.2.3", + "react-dom": "19.2.3" + }, + "devDependencies": { + "@tailwindcss/postcss": "^4", + "@types/node": "^20", + "@types/react": "^19", + "@types/react-dom": "^19", + "eslint": "^9", + "eslint-config-next": "16.1.7", + "tailwindcss": "^4", + "typescript": "^5" + } +} diff --git a/frontend/apps/admin/src/app/layout.tsx b/frontend/apps/admin/src/app/layout.tsx new file mode 100644 index 0000000..6b267f5 --- /dev/null +++ b/frontend/apps/admin/src/app/layout.tsx @@ -0,0 +1,14 @@ +import type { Metadata } from "next"; + +export const metadata: Metadata = { + title: "CMDB Admin", + description: "CMDB Administration Panel", +}; + +export default function RootLayout({ children }: { children: React.ReactNode }) { + return ( + <html lang="en"> + <body>{children}</body> + </html> + ); +} diff --git a/frontend/apps/admin/src/app/page.tsx b/frontend/apps/admin/src/app/page.tsx new file mode 100644 index 0000000..379d409 --- /dev/null +++ b/frontend/apps/admin/src/app/page.tsx @@ -0,0 +1,10 @@ +export default function AdminHome() { + return ( + <main className="flex min-h-screen items-center justify-center"> + <div className="text-center"> + <h1 className="text-3xl font-bold">CMDB Admin</h1> + <p className="mt-2 text-gray-500">Administration panel — coming soon</p> + </div> + </main> + ); +} diff --git a/frontend/apps/admin/tsconfig.json b/frontend/apps/admin/tsconfig.json new file mode 100644 index 0000000..b8cdcc7 --- /dev/null +++ b/frontend/apps/admin/tsconfig.json @@ -0,0 +1,23 @@ +{ + "compilerOptions": { + "target": "ES2017", + "lib": ["dom", "dom.iterable", "esnext"], + "allowJs": true, + "skipLibCheck": true, + "strict": true, + 
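+    // strict type-checking on; the options below match the create-next-app defaults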
"noEmit": true, + "esModuleInterop": true, + "module": "esnext", + "moduleResolution": "bundler", + "resolveJsonModule": true, + "isolatedModules": true, + "jsx": "preserve", + "incremental": true, + "plugins": [{ "name": "next" }], + "paths": { + "@/*": ["./src/*"] + } + }, + "include": ["next-env.d.ts", "**/*.ts", "**/*.tsx", ".next/types/**/*.ts"], + "exclude": ["node_modules"] +} diff --git a/frontend/apps/client/.prettierrc b/frontend/apps/client/.prettierrc new file mode 100644 index 0000000..6120787 --- /dev/null +++ b/frontend/apps/client/.prettierrc @@ -0,0 +1,7 @@ +{ + "semi": true, + "singleQuote": false, + "tabWidth": 2, + "trailingComma": "all", + "printWidth": 100 +} diff --git a/frontend/apps/client/Dockerfile b/frontend/apps/client/Dockerfile new file mode 100644 index 0000000..cd37888 --- /dev/null +++ b/frontend/apps/client/Dockerfile @@ -0,0 +1,19 @@ +FROM node:22-slim AS deps +RUN corepack enable && corepack prepare pnpm@latest --activate +WORKDIR /app +COPY package.json pnpm-lock.yaml ./ +RUN pnpm install --frozen-lockfile + +FROM deps AS builder +COPY . . +RUN pnpm build + +FROM node:22-slim AS runner +RUN corepack enable && corepack prepare pnpm@latest --activate +WORKDIR /app +ENV NODE_ENV=production +COPY --from=builder /app/.next/standalone ./ +COPY --from=builder /app/.next/static ./.next/static +COPY --from=builder /app/public ./public +EXPOSE 3000 +CMD ["node", "server.js"] diff --git a/frontend/apps/client/Dockerfile.dev b/frontend/apps/client/Dockerfile.dev new file mode 100644 index 0000000..1fed2c8 --- /dev/null +++ b/frontend/apps/client/Dockerfile.dev @@ -0,0 +1,10 @@ +FROM node:22-slim +RUN corepack enable && corepack prepare pnpm@latest --activate +WORKDIR /app +COPY package.json pnpm-workspace.yaml ./ +COPY apps/client/package.json ./apps/client/ +COPY packages/shared/package.json ./packages/shared/ +RUN pnpm install +COPY . . +EXPOSE 3000 +CMD ["pnpm", "dev"] diff --git a/frontend/apps/client/components.json b/frontend/apps/client/components.json new file mode 100644 index 0000000..8d886db --- /dev/null +++ b/frontend/apps/client/components.json @@ -0,0 +1,25 @@ +{ + "$schema": "https://ui.shadcn.com/schema.json", + "style": "base-nova", + "rsc": true, + "tsx": true, + "tailwind": { + "config": "", + "css": "src/app/globals.css", + "baseColor": "neutral", + "cssVariables": true, + "prefix": "" + }, + "iconLibrary": "lucide", + "rtl": false, + "aliases": { + "components": "@/components", + "utils": "@/lib/utils", + "ui": "@/components/ui", + "lib": "@/lib", + "hooks": "@/hooks" + }, + "menuColor": "default", + "menuAccent": "subtle", + "registries": {} +} diff --git a/frontend/apps/client/docker-compose.dev.yml b/frontend/apps/client/docker-compose.dev.yml new file mode 100644 index 0000000..f8348b3 --- /dev/null +++ b/frontend/apps/client/docker-compose.dev.yml @@ -0,0 +1,11 @@ +services: + frontend: + build: + context: . + dockerfile: Dockerfile.dev + volumes: + - ./src:/app/src + ports: + - "3000:3000" + networks: + - cmdb-network diff --git a/frontend/apps/client/docker-compose.yml b/frontend/apps/client/docker-compose.yml new file mode 100644 index 0000000..5e460e4 --- /dev/null +++ b/frontend/apps/client/docker-compose.yml @@ -0,0 +1,7 @@ +services: + frontend: + build: + context: . 
+ dockerfile: Dockerfile + networks: + - cmdb-network diff --git a/frontend/apps/client/eslint.config.mjs b/frontend/apps/client/eslint.config.mjs new file mode 100644 index 0000000..05e726d --- /dev/null +++ b/frontend/apps/client/eslint.config.mjs @@ -0,0 +1,18 @@ +import { defineConfig, globalIgnores } from "eslint/config"; +import nextVitals from "eslint-config-next/core-web-vitals"; +import nextTs from "eslint-config-next/typescript"; + +const eslintConfig = defineConfig([ + ...nextVitals, + ...nextTs, + // Override default ignores of eslint-config-next. + globalIgnores([ + // Default ignores of eslint-config-next: + ".next/**", + "out/**", + "build/**", + "next-env.d.ts", + ]), +]); + +export default eslintConfig; diff --git a/frontend/apps/client/next.config.ts b/frontend/apps/client/next.config.ts new file mode 100644 index 0000000..8c49ac8 --- /dev/null +++ b/frontend/apps/client/next.config.ts @@ -0,0 +1,7 @@ +import type { NextConfig } from "next"; + +const nextConfig: NextConfig = { + transpilePackages: ["@cmdb/shared"], +}; + +export default nextConfig; diff --git a/frontend/apps/client/package.json b/frontend/apps/client/package.json new file mode 100644 index 0000000..3b2229c --- /dev/null +++ b/frontend/apps/client/package.json @@ -0,0 +1,37 @@ +{ + "name": "@cmdb/client", + "version": "0.1.0", + "private": true, + "scripts": { + "dev": "next dev --port 3000", + "build": "next build", + "start": "next start", + "lint": "eslint" + }, + "dependencies": { + "@base-ui/react": "^1.3.0", + "@cmdb/shared": "workspace:*", + "@tanstack/react-query": "^5", + "@tanstack/react-table": "^8", + "class-variance-authority": "^0.7.1", + "clsx": "^2.1.1", + "lucide-react": "^0.500", + "next": "16.1.7", + "next-themes": "^0.4", + "react": "19.2.3", + "react-dom": "19.2.3", + "shadcn": "^4.1.0", + "tailwind-merge": "^3.5.0", + "tw-animate-css": "^1.4.0" + }, + "devDependencies": { + "@tailwindcss/postcss": "^4", + "@types/node": "^20", + "@types/react": "^19", + "@types/react-dom": "^19", + "eslint": "^9", + "eslint-config-next": "16.1.7", + "tailwindcss": "^4", + "typescript": "^5" + } +} diff --git a/frontend/apps/client/postcss.config.mjs b/frontend/apps/client/postcss.config.mjs new file mode 100644 index 0000000..61e3684 --- /dev/null +++ b/frontend/apps/client/postcss.config.mjs @@ -0,0 +1,7 @@ +const config = { + plugins: { + "@tailwindcss/postcss": {}, + }, +}; + +export default config; diff --git a/frontend/apps/client/public/file.svg b/frontend/apps/client/public/file.svg new file mode 100644 index 0000000..004145c --- /dev/null +++ b/frontend/apps/client/public/file.svg @@ -0,0 +1 @@ +<svg fill="none" viewBox="0 0 16 16" xmlns="http://www.w3.org/2000/svg"><path d="M14.5 13.5V5.41a1 1 0 0 0-.3-.7L9.8.29A1 1 0 0 0 9.08 0H1.5v13.5A2.5 2.5 0 0 0 4 16h8a2.5 2.5 0 0 0 2.5-2.5m-1.5 0v-7H8v-5H3v12a1 1 0 0 0 1 1h8a1 1 0 0 0 1-1M9.5 5V2.12L12.38 5zM5.13 5h-.62v1.25h2.12V5zm-.62 3h7.12v1.25H4.5zm.62 3h-.62v1.25h7.12V11z" clip-rule="evenodd" fill="#666" fill-rule="evenodd"/></svg> \ No newline at end of file diff --git a/frontend/apps/client/public/globe.svg b/frontend/apps/client/public/globe.svg new file mode 100644 index 0000000..567f17b --- /dev/null +++ b/frontend/apps/client/public/globe.svg @@ -0,0 +1 @@ +<svg fill="none" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 16 16"><g clip-path="url(#a)"><path fill-rule="evenodd" clip-rule="evenodd" d="M10.27 14.1a6.5 6.5 0 0 0 3.67-3.45q-1.24.21-2.7.34-.31 1.83-.97 3.1M8 16A8 8 0 1 0 8 0a8 8 0 0 0 0 16m.48-1.52a7 7 0 0 1-.96 0H7.5a4 4 0 0 
1-.84-1.32q-.38-.89-.63-2.08a40 40 0 0 0 3.92 0q-.25 1.2-.63 2.08a4 4 0 0 1-.84 1.31zm2.94-4.76q1.66-.15 2.95-.43a7 7 0 0 0 0-2.58q-1.3-.27-2.95-.43a18 18 0 0 1 0 3.44m-1.27-3.54a17 17 0 0 1 0 3.64 39 39 0 0 1-4.3 0 17 17 0 0 1 0-3.64 39 39 0 0 1 4.3 0m1.1-1.17q1.45.13 2.69.34a6.5 6.5 0 0 0-3.67-3.44q.65 1.26.98 3.1M8.48 1.5l.01.02q.41.37.84 1.31.38.89.63 2.08a40 40 0 0 0-3.92 0q.25-1.2.63-2.08a4 4 0 0 1 .85-1.32 7 7 0 0 1 .96 0m-2.75.4a6.5 6.5 0 0 0-3.67 3.44 29 29 0 0 1 2.7-.34q.31-1.83.97-3.1M4.58 6.28q-1.66.16-2.95.43a7 7 0 0 0 0 2.58q1.3.27 2.95.43a18 18 0 0 1 0-3.44m.17 4.71q-1.45-.12-2.69-.34a6.5 6.5 0 0 0 3.67 3.44q-.65-1.27-.98-3.1" fill="#666"/></g><defs><clipPath id="a"><path fill="#fff" d="M0 0h16v16H0z"/></clipPath></defs></svg> \ No newline at end of file diff --git a/frontend/apps/client/public/next.svg b/frontend/apps/client/public/next.svg new file mode 100644 index 0000000..5174b28 --- /dev/null +++ b/frontend/apps/client/public/next.svg @@ -0,0 +1 @@ +<svg xmlns="http://www.w3.org/2000/svg" fill="none" viewBox="0 0 394 80"><path fill="#000" d="M262 0h68.5v12.7h-27.2v66.6h-13.6V12.7H262V0ZM149 0v12.7H94v20.4h44.3v12.6H94v21h55v12.6H80.5V0h68.7zm34.3 0h-17.8l63.8 79.4h17.9l-32-39.7 32-39.6h-17.9l-23 28.6-23-28.6zm18.3 56.7-9-11-27.1 33.7h17.8l18.3-22.7z"/><path fill="#000" d="M81 79.3 17 0H0v79.3h13.6V17l50.2 62.3H81Zm252.6-.4c-1 0-1.8-.4-2.5-1s-1.1-1.6-1.1-2.6.3-1.8 1-2.5 1.6-1 2.6-1 1.8.3 2.5 1a3.4 3.4 0 0 1 .6 4.3 3.7 3.7 0 0 1-3 1.8zm23.2-33.5h6v23.3c0 2.1-.4 4-1.3 5.5a9.1 9.1 0 0 1-3.8 3.5c-1.6.8-3.5 1.3-5.7 1.3-2 0-3.7-.4-5.3-1s-2.8-1.8-3.7-3.2c-.9-1.3-1.4-3-1.4-5h6c.1.8.3 1.6.7 2.2s1 1.2 1.6 1.5c.7.4 1.5.5 2.4.5 1 0 1.8-.2 2.4-.6a4 4 0 0 0 1.6-1.8c.3-.8.5-1.8.5-3V45.5zm30.9 9.1a4.4 4.4 0 0 0-2-3.3 7.5 7.5 0 0 0-4.3-1.1c-1.3 0-2.4.2-3.3.5-.9.4-1.6 1-2 1.6a3.5 3.5 0 0 0-.3 4c.3.5.7.9 1.3 1.2l1.8 1 2 .5 3.2.8c1.3.3 2.5.7 3.7 1.2a13 13 0 0 1 3.2 1.8 8.1 8.1 0 0 1 3 6.5c0 2-.5 3.7-1.5 5.1a10 10 0 0 1-4.4 3.5c-1.8.8-4.1 1.2-6.8 1.2-2.6 0-4.9-.4-6.8-1.2-2-.8-3.4-2-4.5-3.5a10 10 0 0 1-1.7-5.6h6a5 5 0 0 0 3.5 4.6c1 .4 2.2.6 3.4.6 1.3 0 2.5-.2 3.5-.6 1-.4 1.8-1 2.4-1.7a4 4 0 0 0 .8-2.4c0-.9-.2-1.6-.7-2.2a11 11 0 0 0-2.1-1.4l-3.2-1-3.8-1c-2.8-.7-5-1.7-6.6-3.2a7.2 7.2 0 0 1-2.4-5.7 8 8 0 0 1 1.7-5 10 10 0 0 1 4.3-3.5c2-.8 4-1.2 6.4-1.2 2.3 0 4.4.4 6.2 1.2 1.8.8 3.2 2 4.3 3.4 1 1.4 1.5 3 1.5 5h-5.8z"/></svg> \ No newline at end of file diff --git a/frontend/apps/client/public/vercel.svg b/frontend/apps/client/public/vercel.svg new file mode 100644 index 0000000..7705396 --- /dev/null +++ b/frontend/apps/client/public/vercel.svg @@ -0,0 +1 @@ +<svg fill="none" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 1155 1000"><path d="m577.3 0 577.4 1000H0z" fill="#fff"/></svg> \ No newline at end of file diff --git a/frontend/apps/client/public/window.svg b/frontend/apps/client/public/window.svg new file mode 100644 index 0000000..b2b2a44 --- /dev/null +++ b/frontend/apps/client/public/window.svg @@ -0,0 +1 @@ +<svg fill="none" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 16 16"><path fill-rule="evenodd" clip-rule="evenodd" d="M1.5 2.5h13v10a1 1 0 0 1-1 1h-11a1 1 0 0 1-1-1zM0 1h16v11.5a2.5 2.5 0 0 1-2.5 2.5h-11A2.5 2.5 0 0 1 0 12.5zm3.75 4.5a.75.75 0 1 0 0-1.5.75.75 0 0 0 0 1.5M7 4.75a.75.75 0 1 1-1.5 0 .75.75 0 0 1 1.5 0m1.75.75a.75.75 0 1 0 0-1.5.75.75 0 0 0 0 1.5" fill="#666"/></svg> \ No newline at end of file diff --git a/frontend/apps/client/src/app/(auth)/layout.tsx b/frontend/apps/client/src/app/(auth)/layout.tsx new file mode 100644 index 0000000..372f9b7 --- /dev/null +++ 
b/frontend/apps/client/src/app/(auth)/layout.tsx @@ -0,0 +1,11 @@ +export default function AuthLayout({ + children, +}: { + children: React.ReactNode; +}) { + return ( + <div className="flex min-h-screen items-center justify-center bg-background p-4"> + <div className="w-full max-w-md">{children}</div> + </div> + ); +} diff --git a/frontend/apps/client/src/app/(auth)/login/page.tsx b/frontend/apps/client/src/app/(auth)/login/page.tsx new file mode 100644 index 0000000..2c70318 --- /dev/null +++ b/frontend/apps/client/src/app/(auth)/login/page.tsx @@ -0,0 +1,163 @@ +"use client"; + +import { Suspense, useEffect, useState } from "react"; +import { useRouter, useSearchParams } from "next/navigation"; +import Link from "next/link"; +import { useAuth } from "@cmdb/shared"; +import { Button } from "@/components/ui/button"; +import { Input } from "@/components/ui/input"; +import { Label } from "@/components/ui/label"; +import { + Card, + CardContent, + CardDescription, + CardFooter, + CardHeader, + CardTitle, +} from "@/components/ui/card"; + +interface Tenant { + id: string; + name: string; + slug: string; +} + +function LoginForm() { + const router = useRouter(); + const searchParams = useSearchParams(); + const { login } = useAuth(); + + const [email, setEmail] = useState(""); + const [password, setPassword] = useState(""); + const [tenantId, setTenantId] = useState(""); + const [tenants, setTenants] = useState<Tenant[]>([]); + const [error, setError] = useState(""); + const [isSubmitting, setIsSubmitting] = useState(false); + + const successMessage = searchParams.get("message"); + + useEffect(() => { + fetch(`/api/v1/tenant/tenants?limit=100`) + .then((r) => r.json()) + .then((data) => { + const items = data.items || []; + setTenants(items); + if (items.length === 1) { + setTenantId(items[0].id); + } + const saved = localStorage.getItem("tenant_id"); + if (saved && items.some((t: Tenant) => t.id === saved)) { + setTenantId(saved); + } + }) + .catch(() => {}); + }, []); + + async function handleSubmit(e: React.FormEvent) { + e.preventDefault(); + setError(""); + + if (!tenantId) { + setError("Please select an organization."); + return; + } + + setIsSubmitting(true); + try { + await login({ email, password, tenant_id: tenantId }); + router.replace("/"); + } catch (err) { + setError( + err instanceof Error ? err.message : "Login failed. 
Please try again.", + ); + } finally { + setIsSubmitting(false); + } + } + + return ( + <Card> + <CardHeader className="text-center"> + <CardTitle className="text-2xl">Sign in to CMDB</CardTitle> + <CardDescription> + Enter your credentials to access your account + </CardDescription> + </CardHeader> + <CardContent> + <form onSubmit={handleSubmit} className="space-y-4"> + {successMessage && ( + <div className="rounded-md bg-green-50 p-3 text-sm text-green-800 dark:bg-green-950 dark:text-green-200"> + {successMessage} + </div> + )} + {error && ( + <div className="rounded-md bg-destructive/10 p-3 text-sm text-destructive"> + {error} + </div> + )} + {tenants.length > 0 && ( + <div className="space-y-2"> + <Label htmlFor="tenant">Organization</Label> + <select + id="tenant" + value={tenantId} + onChange={(e) => setTenantId(e.target.value)} + className="flex h-9 w-full rounded-md border border-input bg-transparent px-3 py-1 text-sm shadow-xs transition-colors placeholder:text-muted-foreground focus-visible:outline-none focus-visible:ring-1 focus-visible:ring-ring" + required + > + <option value="">Select organization...</option> + {tenants.map((t) => ( + <option key={t.id} value={t.id}> + {t.name} + </option> + ))} + </select> + </div> + )} + <div className="space-y-2"> + <Label htmlFor="email">Email</Label> + <Input + id="email" + type="email" + placeholder="you@example.com" + value={email} + onChange={(e) => setEmail(e.target.value)} + required + autoComplete="email" + /> + </div> + <div className="space-y-2"> + <Label htmlFor="password">Password</Label> + <Input + id="password" + type="password" + value={password} + onChange={(e) => setPassword(e.target.value)} + required + autoComplete="current-password" + /> + </div> + <Button type="submit" className="w-full" disabled={isSubmitting}> + {isSubmitting ? "Signing in..." 
: "Sign in"} + </Button> + </form> + </CardContent> + <CardFooter className="justify-center"> + <p className="text-sm text-muted-foreground"> + Don't have an account?{" "} + <Link href="/signup" className="text-primary underline-offset-4 hover:underline"> + Sign up + </Link> + </p> + </CardFooter> + </Card> + ); +} + +export default function LoginPage() { + return ( + <Suspense> + <LoginForm /> + </Suspense> + ); +} diff --git a/frontend/apps/client/src/app/(auth)/setup/page.tsx b/frontend/apps/client/src/app/(auth)/setup/page.tsx new file mode 100644 index 0000000..f6aadd6 --- /dev/null +++ b/frontend/apps/client/src/app/(auth)/setup/page.tsx @@ -0,0 +1,184 @@ +"use client"; + +import { useState } from "react"; +import { useRouter } from "next/navigation"; +import { Card, CardContent, CardDescription, CardHeader, CardTitle } from "@/components/ui/card"; +import { Input } from "@/components/ui/input"; +import { Label } from "@/components/ui/label"; +import { Button } from "@/components/ui/button"; +import { setupCreateTenant } from "@cmdb/shared/lib/setup"; + +export default function SetupPage() { + const router = useRouter(); + const [step, setStep] = useState(1); + const [error, setError] = useState(""); + const [loading, setLoading] = useState(false); + + // Step 1: Tenant + const [tenantName, setTenantName] = useState(""); + const [tenantSlug, setTenantSlug] = useState(""); + const [tenantId, setTenantId] = useState(""); + + // Step 2: Admin + const [email, setEmail] = useState(""); + const [username, setUsername] = useState(""); + const [password, setPassword] = useState(""); + const [confirmPassword, setConfirmPassword] = useState(""); + + const handleCreateTenant = async (e: React.FormEvent) => { + e.preventDefault(); + setError(""); + setLoading(true); + try { + const result = await setupCreateTenant({ name: tenantName, slug: tenantSlug }); + setTenantId(result.id); + localStorage.setItem("tenant_id", result.id); + setStep(2); + } catch (err) { + setError(err instanceof Error ? err.message : "Failed to create tenant"); + } finally { + setLoading(false); + } + }; + + const handleCreateAdmin = async (e: React.FormEvent) => { + e.preventDefault(); + setError(""); + if (password !== confirmPassword) { + setError("Passwords do not match"); + return; + } + setLoading(true); + try { + const res = await fetch(`/api/v1/auth/register`, { + method: "POST", + headers: { "Content-Type": "application/json" }, + body: JSON.stringify({ email, username, password, tenant_id: tenantId }), + }); + if (!res.ok) { + const err = await res.json().catch(() => ({ detail: res.statusText })); + const detail = Array.isArray(err.detail) ? err.detail.map((d: { msg?: string }) => d.msg).join(", ") : err.detail; + throw new Error(detail || "Registration failed"); + } + setStep(3); + } catch (err) { + setError(err instanceof Error ? err.message : "Failed to create admin"); + } finally { + setLoading(false); + } + }; + + if (step === 3) { + return ( + <Card className="w-full max-w-md"> + <CardHeader className="text-center"> + <CardTitle className="text-2xl">Setup Complete!</CardTitle> + <CardDescription>Your CMDB instance is ready to use.</CardDescription> + </CardHeader> + <CardContent> + <Button className="w-full" onClick={() => router.push("/login")}> + Go to Login + </Button> + </CardContent> + </Card> + ); + } + + return ( + <Card className="w-full max-w-md"> + <CardHeader className="text-center"> + <CardTitle className="text-2xl">CMDB Setup</CardTitle> + <CardDescription> + {step === 1 ? 
"Step 1: Create your organization" : "Step 2: Create admin account"} + </CardDescription> + </CardHeader> + <CardContent> + {error && ( + <div className="mb-4 rounded-md bg-red-50 p-3 text-sm text-red-600 dark:bg-red-950 dark:text-red-400"> + {error} + </div> + )} + + {step === 1 && ( + <form onSubmit={handleCreateTenant} className="space-y-4"> + <div className="space-y-2"> + <Label htmlFor="tenant-name">Organization Name</Label> + <Input + id="tenant-name" + value={tenantName} + onChange={(e) => { + setTenantName(e.target.value); + setTenantSlug(e.target.value.toLowerCase().replace(/[^a-z0-9-]/g, "-")); + }} + placeholder="My Company" + required + /> + </div> + <div className="space-y-2"> + <Label htmlFor="tenant-slug">Slug</Label> + <Input + id="tenant-slug" + value={tenantSlug} + onChange={(e) => setTenantSlug(e.target.value)} + placeholder="my-company" + required + /> + </div> + <Button type="submit" className="w-full" disabled={loading}> + {loading ? "Creating..." : "Next"} + </Button> + </form> + )} + + {step === 2 && ( + <form onSubmit={handleCreateAdmin} className="space-y-4"> + <div className="space-y-2"> + <Label htmlFor="email">Email</Label> + <Input + id="email" + type="email" + value={email} + onChange={(e) => setEmail(e.target.value)} + placeholder="admin@example.com" + required + /> + </div> + <div className="space-y-2"> + <Label htmlFor="username">Username</Label> + <Input + id="username" + value={username} + onChange={(e) => setUsername(e.target.value)} + placeholder="admin" + required + /> + </div> + <div className="space-y-2"> + <Label htmlFor="password">Password</Label> + <Input + id="password" + type="password" + value={password} + onChange={(e) => setPassword(e.target.value)} + required + /> + </div> + <div className="space-y-2"> + <Label htmlFor="confirm-password">Confirm Password</Label> + <Input + id="confirm-password" + type="password" + value={confirmPassword} + onChange={(e) => setConfirmPassword(e.target.value)} + required + /> + </div> + <Button type="submit" className="w-full" disabled={loading}> + {loading ? "Creating..." 
: "Complete Setup"} + </Button> + </form> + )} + </CardContent> + </Card> + ); +} diff --git a/frontend/apps/client/src/app/(auth)/signup/page.tsx b/frontend/apps/client/src/app/(auth)/signup/page.tsx new file mode 100644 index 0000000..c5fb567 --- /dev/null +++ b/frontend/apps/client/src/app/(auth)/signup/page.tsx @@ -0,0 +1,135 @@ +"use client"; + +import { useState } from "react"; +import { useRouter } from "next/navigation"; +import Link from "next/link"; +import { signup } from "@cmdb/shared"; +import { Button } from "@/components/ui/button"; +import { Input } from "@/components/ui/input"; +import { Label } from "@/components/ui/label"; +import { + Card, + CardContent, + CardDescription, + CardFooter, + CardHeader, + CardTitle, +} from "@/components/ui/card"; + +export default function SignupPage() { + const router = useRouter(); + + const [email, setEmail] = useState(""); + const [username, setUsername] = useState(""); + const [password, setPassword] = useState(""); + const [confirmPassword, setConfirmPassword] = useState(""); + const [error, setError] = useState(""); + const [isSubmitting, setIsSubmitting] = useState(false); + + async function handleSubmit(e: React.FormEvent) { + e.preventDefault(); + setError(""); + + if (password !== confirmPassword) { + setError("Passwords do not match."); + return; + } + + if (password.length < 8) { + setError("Password must be at least 8 characters."); + return; + } + + setIsSubmitting(true); + + try { + await signup({ email, username, password }); + router.push("/login?message=Account created successfully. Please sign in."); + } catch (err) { + setError( + err instanceof Error + ? err.message + : "Signup failed. Please try again.", + ); + } finally { + setIsSubmitting(false); + } + } + + return ( + <Card> + <CardHeader className="text-center"> + <CardTitle className="text-2xl">Create an account</CardTitle> + <CardDescription> + Enter your details to get started with CMDB + </CardDescription> + </CardHeader> + <CardContent> + <form onSubmit={handleSubmit} className="space-y-4"> + {error && ( + <div className="rounded-md bg-destructive/10 p-3 text-sm text-destructive"> + {error} + </div> + )} + <div className="space-y-2"> + <Label htmlFor="email">Email</Label> + <Input + id="email" + type="email" + placeholder="you@example.com" + value={email} + onChange={(e) => setEmail(e.target.value)} + required + autoComplete="email" + /> + </div> + <div className="space-y-2"> + <Label htmlFor="username">Username</Label> + <Input + id="username" + type="text" + placeholder="johndoe" + value={username} + onChange={(e) => setUsername(e.target.value)} + required + autoComplete="username" + /> + </div> + <div className="space-y-2"> + <Label htmlFor="password">Password</Label> + <Input + id="password" + type="password" + value={password} + onChange={(e) => setPassword(e.target.value)} + required + autoComplete="new-password" + /> + </div> + <div className="space-y-2"> + <Label htmlFor="confirm-password">Confirm Password</Label> + <Input + id="confirm-password" + type="password" + value={confirmPassword} + onChange={(e) => setConfirmPassword(e.target.value)} + required + autoComplete="new-password" + /> + </div> + <Button type="submit" className="w-full" disabled={isSubmitting}> + {isSubmitting ? "Creating account..." 
: "Create account"} + </Button> + </form> + </CardContent> + <CardFooter className="justify-center"> + <p className="text-sm text-muted-foreground"> + Already have an account?{" "} + <Link href="/login" className="text-primary underline-offset-4 hover:underline"> + Sign in + </Link> + </p> + </CardFooter> + </Card> + ); +} diff --git a/frontend/apps/client/src/app/(dashboard)/ipam/asns/page.tsx b/frontend/apps/client/src/app/(dashboard)/ipam/asns/page.tsx new file mode 100644 index 0000000..b8016de --- /dev/null +++ b/frontend/apps/client/src/app/(dashboard)/ipam/asns/page.tsx @@ -0,0 +1,40 @@ +"use client"; + +import { useState } from "react"; +import { useQuery } from "@tanstack/react-query"; +import type { ColumnDef } from "@tanstack/react-table"; +import type { ASN } from "@cmdb/shared"; +import { asnApi } from "@cmdb/shared"; +import { DataTable } from "@/components/data-table"; + +const columns: ColumnDef<ASN>[] = [ + { accessorKey: "asn", header: "ASN" }, + { accessorKey: "rir_id", header: "RIR" }, + { accessorKey: "tenant_id", header: "Tenant" }, + { accessorKey: "description", header: "Description" }, +]; + +export default function ASNsPage() { + const [offset, setOffset] = useState(0); + const limit = 25; + + const { data, isLoading } = useQuery({ + queryKey: ["asns", offset, limit], + queryFn: () => asnApi.list({ offset, limit }), + }); + + return ( + <div className="space-y-4"> + <h1 className="text-2xl font-bold tracking-tight">ASNs</h1> + <DataTable + columns={columns} + data={data?.items ?? []} + isLoading={isLoading} + pagination={ + data ? { offset: data.offset, limit: data.limit, total: data.total } : undefined + } + onPaginationChange={setOffset} + /> + </div> + ); +} diff --git a/frontend/apps/client/src/app/(dashboard)/ipam/fhrp-groups/page.tsx b/frontend/apps/client/src/app/(dashboard)/ipam/fhrp-groups/page.tsx new file mode 100644 index 0000000..b2d9673 --- /dev/null +++ b/frontend/apps/client/src/app/(dashboard)/ipam/fhrp-groups/page.tsx @@ -0,0 +1,40 @@ +"use client"; + +import { useState } from "react"; +import { useQuery } from "@tanstack/react-query"; +import type { ColumnDef } from "@tanstack/react-table"; +import type { FHRPGroup } from "@cmdb/shared"; +import { fhrpGroupApi } from "@cmdb/shared"; +import { DataTable } from "@/components/data-table"; + +const columns: ColumnDef<FHRPGroup>[] = [ + { accessorKey: "name", header: "Name" }, + { accessorKey: "protocol", header: "Protocol" }, + { accessorKey: "group_id_value", header: "Group ID" }, + { accessorKey: "auth_type", header: "Auth Type" }, +]; + +export default function FHRPGroupsPage() { + const [offset, setOffset] = useState(0); + const limit = 25; + + const { data, isLoading } = useQuery({ + queryKey: ["fhrp-groups", offset, limit], + queryFn: () => fhrpGroupApi.list({ offset, limit }), + }); + + return ( + <div className="space-y-4"> + <h1 className="text-2xl font-bold tracking-tight">FHRP Groups</h1> + <DataTable + columns={columns} + data={data?.items ?? []} + isLoading={isLoading} + pagination={ + data ? 
{ offset: data.offset, limit: data.limit, total: data.total } : undefined + } + onPaginationChange={setOffset} + /> + </div> + ); +} diff --git a/frontend/apps/client/src/app/(dashboard)/ipam/ip-addresses/[id]/page.tsx b/frontend/apps/client/src/app/(dashboard)/ipam/ip-addresses/[id]/page.tsx new file mode 100644 index 0000000..4c1e1cd --- /dev/null +++ b/frontend/apps/client/src/app/(dashboard)/ipam/ip-addresses/[id]/page.tsx @@ -0,0 +1,275 @@ +"use client"; + +import { useQuery, useMutation, useQueryClient } from "@tanstack/react-query"; +import { useRouter, useParams } from "next/navigation"; +import type { IPAddress } from "@cmdb/shared"; +import { ipAddressApi } from "@cmdb/shared"; +import { StatusBadge } from "@/components/status-badge"; +import { Button } from "@/components/ui/button"; +import { + Card, + CardContent, + CardHeader, + CardTitle, + CardAction, +} from "@/components/ui/card"; +import { + Dialog, + DialogContent, + DialogHeader, + DialogTitle, + DialogDescription, + DialogFooter, + DialogTrigger, + DialogClose, +} from "@/components/ui/dialog"; +import { Input } from "@/components/ui/input"; +import { Label } from "@/components/ui/label"; +import { Textarea } from "@/components/ui/textarea"; +import { + Select, + SelectContent, + SelectItem, + SelectTrigger, + SelectValue, +} from "@/components/ui/select"; +import { Skeleton } from "@/components/ui/skeleton"; +import { ArrowLeft, Pencil, Trash2 } from "lucide-react"; +import { useState } from "react"; + +export default function IPAddressDetailPage() { + const router = useRouter(); + const params = useParams<{ id: string }>(); + const queryClient = useQueryClient(); + const [editing, setEditing] = useState(false); + const [editData, setEditData] = useState<Partial<IPAddress>>({}); + + const { data: ipAddress, isLoading } = useQuery({ + queryKey: ["ip-address", params.id], + queryFn: () => ipAddressApi.get(params.id), + }); + + const updateMutation = useMutation({ + mutationFn: (data: Partial<IPAddress>) => ipAddressApi.update(params.id, data), + onSuccess: () => { + queryClient.invalidateQueries({ queryKey: ["ip-address", params.id] }); + queryClient.invalidateQueries({ queryKey: ["ip-addresses"] }); + setEditing(false); + }, + }); + + const deleteMutation = useMutation({ + mutationFn: () => ipAddressApi.delete(params.id), + onSuccess: () => { + queryClient.invalidateQueries({ queryKey: ["ip-addresses"] }); + router.push("/ipam/ip-addresses"); + }, + }); + + function startEditing() { + if (!ipAddress) return; + setEditData({ + address: ipAddress.address, + status: ipAddress.status, + vrf_id: ipAddress.vrf_id, + dns_name: ipAddress.dns_name, + tenant_id: ipAddress.tenant_id, + description: ipAddress.description, + }); + setEditing(true); + } + + if (isLoading) { + return ( + <div className="space-y-4"> + <Skeleton className="h-8 w-48" /> + <Skeleton className="h-64 w-full" /> + </div> + ); + } + + if (!ipAddress) { + return <p className="text-muted-foreground">IP Address not found.</p>; + } + + return ( + <div className="space-y-4"> + <div className="flex items-center gap-4"> + <Button variant="ghost" size="sm" onClick={() => router.push("/ipam/ip-addresses")}> + <ArrowLeft /> + Back + </Button> + <h1 className="text-2xl font-bold tracking-tight">{ipAddress.address}</h1> + <StatusBadge status={ipAddress.status} /> + </div> + + {editing ? 
( + <Card> + <CardHeader> + <CardTitle>Edit IP Address</CardTitle> + </CardHeader> + <CardContent> + <form + className="grid gap-4 sm:grid-cols-2" + onSubmit={(e) => { + e.preventDefault(); + updateMutation.mutate(editData); + }} + > + <div className="space-y-2"> + <Label htmlFor="address">Address</Label> + <Input + id="address" + value={editData.address ?? ""} + onChange={(e) => setEditData({ ...editData, address: e.target.value })} + required + /> + </div> + <div className="space-y-2"> + <Label>Status</Label> + <Select + value={editData.status} + onValueChange={(val) => val && setEditData({ ...editData, status: val })} + > + <SelectTrigger className="w-full"> + <SelectValue /> + </SelectTrigger> + <SelectContent> + <SelectItem value="active">Active</SelectItem> + <SelectItem value="reserved">Reserved</SelectItem> + <SelectItem value="deprecated">Deprecated</SelectItem> + </SelectContent> + </Select> + </div> + <div className="space-y-2"> + <Label htmlFor="vrf_id">VRF ID</Label> + <Input + id="vrf_id" + value={editData.vrf_id ?? ""} + onChange={(e) => + setEditData({ ...editData, vrf_id: e.target.value || null }) + } + /> + </div> + <div className="space-y-2"> + <Label htmlFor="dns_name">DNS Name</Label> + <Input + id="dns_name" + value={editData.dns_name ?? ""} + onChange={(e) => setEditData({ ...editData, dns_name: e.target.value })} + /> + </div> + <div className="space-y-2"> + <Label htmlFor="tenant_id">Tenant ID</Label> + <Input + id="tenant_id" + value={editData.tenant_id ?? ""} + onChange={(e) => + setEditData({ ...editData, tenant_id: e.target.value || null }) + } + /> + </div> + <div className="space-y-2 sm:col-span-2"> + <Label htmlFor="description">Description</Label> + <Textarea + id="description" + value={editData.description ?? ""} + onChange={(e) => setEditData({ ...editData, description: e.target.value })} + /> + </div> + <div className="flex gap-2 sm:col-span-2"> + <Button type="submit" disabled={updateMutation.isPending}> + {updateMutation.isPending ? "Saving..." : "Save"} + </Button> + <Button type="button" variant="outline" onClick={() => setEditing(false)}> + Cancel + </Button> + </div> + </form> + </CardContent> + </Card> + ) : ( + <Card> + <CardHeader> + <CardTitle>IP Address Details</CardTitle> + <CardAction> + <div className="flex gap-2"> + <Button variant="outline" size="sm" onClick={startEditing}> + <Pencil /> + Edit + </Button> + <Dialog> + <DialogTrigger + render={ + <Button variant="destructive" size="sm"> + <Trash2 /> + Delete + </Button> + } + /> + <DialogContent> + <DialogHeader> + <DialogTitle>Delete IP Address</DialogTitle> + <DialogDescription> + Are you sure you want to delete {ipAddress.address}? This action cannot + be undone. + </DialogDescription> + </DialogHeader> + <DialogFooter> + <DialogClose render={<Button variant="outline" />}>Cancel</DialogClose> + <Button + variant="destructive" + onClick={() => deleteMutation.mutate()} + disabled={deleteMutation.isPending} + > + {deleteMutation.isPending ? "Deleting..." 
: "Delete"} + </Button> + </DialogFooter> + </DialogContent> + </Dialog> + </div> + </CardAction> + </CardHeader> + <CardContent> + <dl className="grid gap-4 sm:grid-cols-2"> + <div> + <dt className="text-sm font-medium text-muted-foreground">Address</dt> + <dd className="mt-1">{ipAddress.address}</dd> + </div> + <div> + <dt className="text-sm font-medium text-muted-foreground">Status</dt> + <dd className="mt-1"> + <StatusBadge status={ipAddress.status} /> + </dd> + </div> + <div> + <dt className="text-sm font-medium text-muted-foreground">VRF</dt> + <dd className="mt-1">{ipAddress.vrf_id ?? "-"}</dd> + </div> + <div> + <dt className="text-sm font-medium text-muted-foreground">DNS Name</dt> + <dd className="mt-1">{ipAddress.dns_name || "-"}</dd> + </div> + <div> + <dt className="text-sm font-medium text-muted-foreground">Tenant</dt> + <dd className="mt-1">{ipAddress.tenant_id ?? "-"}</dd> + </div> + <div className="sm:col-span-2"> + <dt className="text-sm font-medium text-muted-foreground">Description</dt> + <dd className="mt-1">{ipAddress.description || "-"}</dd> + </div> + <div> + <dt className="text-sm font-medium text-muted-foreground">Created</dt> + <dd className="mt-1">{new Date(ipAddress.created_at).toLocaleString()}</dd> + </div> + <div> + <dt className="text-sm font-medium text-muted-foreground">Updated</dt> + <dd className="mt-1">{new Date(ipAddress.updated_at).toLocaleString()}</dd> + </div> + </dl> + </CardContent> + </Card> + )} + </div> + ); +} diff --git a/frontend/apps/client/src/app/(dashboard)/ipam/ip-addresses/new/page.tsx b/frontend/apps/client/src/app/(dashboard)/ipam/ip-addresses/new/page.tsx new file mode 100644 index 0000000..111476f --- /dev/null +++ b/frontend/apps/client/src/app/(dashboard)/ipam/ip-addresses/new/page.tsx @@ -0,0 +1,146 @@ +"use client"; + +import { useMutation, useQueryClient } from "@tanstack/react-query"; +import { useRouter } from "next/navigation"; +import { useState } from "react"; +import { ipAddressApi } from "@cmdb/shared"; +import { Button } from "@/components/ui/button"; +import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card"; +import { Input } from "@/components/ui/input"; +import { Label } from "@/components/ui/label"; +import { Textarea } from "@/components/ui/textarea"; +import { + Select, + SelectContent, + SelectItem, + SelectTrigger, + SelectValue, +} from "@/components/ui/select"; +import { ArrowLeft } from "lucide-react"; + +export default function NewIPAddressPage() { + const router = useRouter(); + const queryClient = useQueryClient(); + const [form, setForm] = useState({ + address: "", + status: "active", + vrf_id: "", + dns_name: "", + tenant_id: "", + description: "", + }); + + const mutation = useMutation({ + mutationFn: () => + ipAddressApi.create({ + address: form.address, + status: form.status, + vrf_id: form.vrf_id || null, + dns_name: form.dns_name, + tenant_id: form.tenant_id || null, + description: form.description, + }), + onSuccess: () => { + queryClient.invalidateQueries({ queryKey: ["ip-addresses"] }); + router.push("/ipam/ip-addresses"); + }, + }); + + return ( + <div className="space-y-4"> + <div className="flex items-center gap-4"> + <Button variant="ghost" size="sm" onClick={() => router.push("/ipam/ip-addresses")}> + <ArrowLeft /> + Back + </Button> + <h1 className="text-2xl font-bold tracking-tight">New IP Address</h1> + </div> + + <Card> + <CardHeader> + <CardTitle>Create IP Address</CardTitle> + </CardHeader> + <CardContent> + <form + className="grid gap-4 sm:grid-cols-2" + 
onSubmit={(e) => { + e.preventDefault(); + mutation.mutate(); + }} + > + <div className="space-y-2"> + <Label htmlFor="address">Address *</Label> + <Input + id="address" + placeholder="e.g. 10.0.0.1/32" + value={form.address} + onChange={(e) => setForm({ ...form, address: e.target.value })} + required + /> + </div> + <div className="space-y-2"> + <Label>Status</Label> + <Select + value={form.status} + onValueChange={(val) => val && setForm({ ...form, status: val })} + > + <SelectTrigger className="w-full"> + <SelectValue /> + </SelectTrigger> + <SelectContent> + <SelectItem value="active">Active</SelectItem> + <SelectItem value="reserved">Reserved</SelectItem> + <SelectItem value="deprecated">Deprecated</SelectItem> + </SelectContent> + </Select> + </div> + <div className="space-y-2"> + <Label htmlFor="vrf_id">VRF ID</Label> + <Input + id="vrf_id" + value={form.vrf_id} + onChange={(e) => setForm({ ...form, vrf_id: e.target.value })} + /> + </div> + <div className="space-y-2"> + <Label htmlFor="dns_name">DNS Name</Label> + <Input + id="dns_name" + value={form.dns_name} + onChange={(e) => setForm({ ...form, dns_name: e.target.value })} + /> + </div> + <div className="space-y-2"> + <Label htmlFor="tenant_id">Tenant ID</Label> + <Input + id="tenant_id" + value={form.tenant_id} + onChange={(e) => setForm({ ...form, tenant_id: e.target.value })} + /> + </div> + <div className="space-y-2 sm:col-span-2"> + <Label htmlFor="description">Description</Label> + <Textarea + id="description" + value={form.description} + onChange={(e) => setForm({ ...form, description: e.target.value })} + /> + </div> + <div className="flex gap-2 sm:col-span-2"> + <Button type="submit" disabled={mutation.isPending}> + {mutation.isPending ? "Creating..." : "Create IP Address"} + </Button> + <Button + type="button" + variant="outline" + onClick={() => router.push("/ipam/ip-addresses")} + > + Cancel + </Button> + </div> + </form> + </CardContent> + </Card> + </div> + ); +} diff --git a/frontend/apps/client/src/app/(dashboard)/ipam/ip-addresses/page.tsx b/frontend/apps/client/src/app/(dashboard)/ipam/ip-addresses/page.tsx new file mode 100644 index 0000000..a5ba0bd --- /dev/null +++ b/frontend/apps/client/src/app/(dashboard)/ipam/ip-addresses/page.tsx @@ -0,0 +1,59 @@ +"use client"; + +import { useState } from "react"; +import { useQuery } from "@tanstack/react-query"; +import { useRouter } from "next/navigation"; +import type { ColumnDef } from "@tanstack/react-table"; +import type { IPAddress } from "@cmdb/shared"; +import { ipAddressApi } from "@cmdb/shared"; +import { DataTable } from "@/components/data-table"; +import { StatusBadge } from "@/components/status-badge"; +import { Button } from "@/components/ui/button"; +import { Plus } from "lucide-react"; + +const columns: ColumnDef<IPAddress>[] = [ + { accessorKey: "address", header: "Address" }, + { + accessorKey: "status", + header: "Status", + cell: ({ row }) => <StatusBadge status={row.original.status} />, + }, + { accessorKey: "vrf_id", header: "VRF" }, + { accessorKey: "dns_name", header: "DNS Name" }, + { accessorKey: "tenant_id", header: "Tenant" }, + { accessorKey: "description", header: "Description" }, +]; + +export default function IPAddressesPage() { + const router = useRouter(); + const [offset, setOffset] = useState(0); + const limit = 25; + + const { data, isLoading } = useQuery({ + queryKey: ["ip-addresses", offset, limit], + queryFn: () => ipAddressApi.list({ offset, limit }), + }); + + return ( + <div className="space-y-4"> + <div className="flex 
items-center justify-between"> + <h1 className="text-2xl font-bold tracking-tight">IP Addresses</h1> + <Button onClick={() => router.push("/ipam/ip-addresses/new")}> + <Plus /> + New IP Address + </Button> + </div> + + <DataTable + columns={columns} + data={data?.items ?? []} + isLoading={isLoading} + pagination={ + data ? { offset: data.offset, limit: data.limit, total: data.total } : undefined + } + onPaginationChange={setOffset} + onRowClick={(row) => router.push(`/ipam/ip-addresses/${row.id}`)} + /> + </div> + ); +} diff --git a/frontend/apps/client/src/app/(dashboard)/ipam/ip-ranges/page.tsx b/frontend/apps/client/src/app/(dashboard)/ipam/ip-ranges/page.tsx new file mode 100644 index 0000000..88347c1 --- /dev/null +++ b/frontend/apps/client/src/app/(dashboard)/ipam/ip-ranges/page.tsx @@ -0,0 +1,46 @@ +"use client"; + +import { useState } from "react"; +import { useQuery } from "@tanstack/react-query"; +import type { ColumnDef } from "@tanstack/react-table"; +import type { IPRange } from "@cmdb/shared"; +import { ipRangeApi } from "@cmdb/shared"; +import { DataTable } from "@/components/data-table"; +import { StatusBadge } from "@/components/status-badge"; + +const columns: ColumnDef<IPRange>[] = [ + { accessorKey: "start_address", header: "Start" }, + { accessorKey: "end_address", header: "End" }, + { + accessorKey: "status", + header: "Status", + cell: ({ row }) => <StatusBadge status={row.original.status} />, + }, + { accessorKey: "vrf_id", header: "VRF" }, + { accessorKey: "description", header: "Description" }, +]; + +export default function IPRangesPage() { + const [offset, setOffset] = useState(0); + const limit = 25; + + const { data, isLoading } = useQuery({ + queryKey: ["ip-ranges", offset, limit], + queryFn: () => ipRangeApi.list({ offset, limit }), + }); + + return ( + <div className="space-y-4"> + <h1 className="text-2xl font-bold tracking-tight">IP Ranges</h1> + <DataTable + columns={columns} + data={data?.items ?? []} + isLoading={isLoading} + pagination={ + data ? 
{ offset: data.offset, limit: data.limit, total: data.total } : undefined + } + onPaginationChange={setOffset} + /> + </div> + ); +} diff --git a/frontend/apps/client/src/app/(dashboard)/ipam/prefixes/[id]/page.tsx b/frontend/apps/client/src/app/(dashboard)/ipam/prefixes/[id]/page.tsx new file mode 100644 index 0000000..18f2789 --- /dev/null +++ b/frontend/apps/client/src/app/(dashboard)/ipam/prefixes/[id]/page.tsx @@ -0,0 +1,328 @@ +"use client"; + +import { useQuery, useMutation, useQueryClient } from "@tanstack/react-query"; +import { useRouter, useParams } from "next/navigation"; +import type { Prefix } from "@cmdb/shared"; +import { prefixApi } from "@cmdb/shared"; +import { StatusBadge } from "@/components/status-badge"; +import { ChangelogTimeline } from "@/components/history/changelog-timeline"; +import { JournalEntries } from "@/components/journal/journal-entries"; +import { Button } from "@/components/ui/button"; +import { + Card, + CardContent, + CardHeader, + CardTitle, + CardAction, +} from "@/components/ui/card"; +import { + Dialog, + DialogContent, + DialogHeader, + DialogTitle, + DialogDescription, + DialogFooter, + DialogTrigger, + DialogClose, +} from "@/components/ui/dialog"; +import { Input } from "@/components/ui/input"; +import { Label } from "@/components/ui/label"; +import { Textarea } from "@/components/ui/textarea"; +import { + Select, + SelectContent, + SelectItem, + SelectTrigger, + SelectValue, +} from "@/components/ui/select"; +import { Skeleton } from "@/components/ui/skeleton"; +import { Tabs, TabsList, TabsTrigger, TabsContent } from "@/components/ui/tabs"; +import { ArrowLeft, Pencil, Trash2 } from "lucide-react"; +import { useState } from "react"; + +export default function PrefixDetailPage() { + const router = useRouter(); + const params = useParams<{ id: string }>(); + const queryClient = useQueryClient(); + const [editing, setEditing] = useState(false); + const [editData, setEditData] = useState<Partial<Prefix>>({}); + + const { data: prefix, isLoading } = useQuery({ + queryKey: ["prefix", params.id], + queryFn: () => prefixApi.get(params.id), + }); + + const updateMutation = useMutation({ + mutationFn: (data: Partial<Prefix>) => prefixApi.update(params.id, data), + onSuccess: () => { + queryClient.invalidateQueries({ queryKey: ["prefix", params.id] }); + queryClient.invalidateQueries({ queryKey: ["prefixes"] }); + setEditing(false); + }, + }); + + const deleteMutation = useMutation({ + mutationFn: () => prefixApi.delete(params.id), + onSuccess: () => { + queryClient.invalidateQueries({ queryKey: ["prefixes"] }); + router.push("/ipam/prefixes"); + }, + }); + + function startEditing() { + if (!prefix) return; + setEditData({ + network: prefix.network, + status: prefix.status, + vrf_id: prefix.vrf_id, + vlan_id: prefix.vlan_id, + role: prefix.role, + tenant_id: prefix.tenant_id, + description: prefix.description, + }); + setEditing(true); + } + + if (isLoading) { + return ( + <div className="space-y-4"> + <Skeleton className="h-8 w-48" /> + <Skeleton className="h-64 w-full" /> + </div> + ); + } + + if (!prefix) { + return <p className="text-muted-foreground">Prefix not found.</p>; + } + + return ( + <div className="space-y-4"> + <div className="flex items-center gap-4"> + <Button variant="ghost" size="sm" onClick={() => router.push("/ipam/prefixes")}> + <ArrowLeft /> + Back + </Button> + <h1 className="text-2xl font-bold tracking-tight">{prefix.network}</h1> + <StatusBadge status={prefix.status} /> + </div> + + <Tabs defaultValue="details"> + <TabsList> 
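+              {/* Details is editable in place; History and Journal are defined as separate panels below */}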
+ <TabsTrigger value="details">Details</TabsTrigger> + <TabsTrigger value="history">History</TabsTrigger> + <TabsTrigger value="journal">Journal</TabsTrigger> + </TabsList> + + <TabsContent value="details"> + {editing ? ( + <Card> + <CardHeader> + <CardTitle>Edit Prefix</CardTitle> + </CardHeader> + <CardContent> + <form + className="grid gap-4 sm:grid-cols-2" + onSubmit={(e) => { + e.preventDefault(); + updateMutation.mutate(editData); + }} + > + <div className="space-y-2"> + <Label htmlFor="network">Network</Label> + <Input + id="network" + value={editData.network ?? ""} + onChange={(e) => setEditData({ ...editData, network: e.target.value })} + required + /> + </div> + <div className="space-y-2"> + <Label>Status</Label> + <Select + value={editData.status} + onValueChange={(val) => val && setEditData({ ...editData, status: val })} + > + <SelectTrigger className="w-full"> + <SelectValue /> + </SelectTrigger> + <SelectContent> + <SelectItem value="active">Active</SelectItem> + <SelectItem value="reserved">Reserved</SelectItem> + <SelectItem value="deprecated">Deprecated</SelectItem> + <SelectItem value="container">Container</SelectItem> + </SelectContent> + </Select> + </div> + <div className="space-y-2"> + <Label htmlFor="vrf_id">VRF ID</Label> + <Input + id="vrf_id" + value={editData.vrf_id ?? ""} + onChange={(e) => + setEditData({ ...editData, vrf_id: e.target.value || null }) + } + /> + </div> + <div className="space-y-2"> + <Label htmlFor="vlan_id">VLAN ID</Label> + <Input + id="vlan_id" + value={editData.vlan_id ?? ""} + onChange={(e) => + setEditData({ ...editData, vlan_id: e.target.value || null }) + } + /> + </div> + <div className="space-y-2"> + <Label htmlFor="role">Role</Label> + <Input + id="role" + value={editData.role ?? ""} + onChange={(e) => + setEditData({ ...editData, role: e.target.value || null }) + } + /> + </div> + <div className="space-y-2"> + <Label htmlFor="tenant_id">Tenant ID</Label> + <Input + id="tenant_id" + value={editData.tenant_id ?? ""} + onChange={(e) => + setEditData({ ...editData, tenant_id: e.target.value || null }) + } + /> + </div> + <div className="space-y-2 sm:col-span-2"> + <Label htmlFor="description">Description</Label> + <Textarea + id="description" + value={editData.description ?? ""} + onChange={(e) => setEditData({ ...editData, description: e.target.value })} + /> + </div> + <div className="flex gap-2 sm:col-span-2"> + <Button type="submit" disabled={updateMutation.isPending}> + {updateMutation.isPending ? "Saving..." : "Save"} + </Button> + <Button type="button" variant="outline" onClick={() => setEditing(false)}> + Cancel + </Button> + </div> + </form> + </CardContent> + </Card> + ) : ( + <Card> + <CardHeader> + <CardTitle>Prefix Details</CardTitle> + <CardAction> + <div className="flex gap-2"> + <Button variant="outline" size="sm" onClick={startEditing}> + <Pencil /> + Edit + </Button> + <Dialog> + <DialogTrigger + render={ + <Button variant="destructive" size="sm"> + <Trash2 /> + Delete + </Button> + } + /> + <DialogContent> + <DialogHeader> + <DialogTitle>Delete Prefix</DialogTitle> + <DialogDescription> + Are you sure you want to delete {prefix.network}? This action cannot be + undone. + </DialogDescription> + </DialogHeader> + <DialogFooter> + <DialogClose render={<Button variant="outline" />}>Cancel</DialogClose> + <Button + variant="destructive" + onClick={() => deleteMutation.mutate()} + disabled={deleteMutation.isPending} + > + {deleteMutation.isPending ? "Deleting..." 
: "Delete"} + </Button> + </DialogFooter> + </DialogContent> + </Dialog> + </div> + </CardAction> + </CardHeader> + <CardContent> + <dl className="grid gap-4 sm:grid-cols-2"> + <div> + <dt className="text-sm font-medium text-muted-foreground">Network</dt> + <dd className="mt-1">{prefix.network}</dd> + </div> + <div> + <dt className="text-sm font-medium text-muted-foreground">Status</dt> + <dd className="mt-1"> + <StatusBadge status={prefix.status} /> + </dd> + </div> + <div> + <dt className="text-sm font-medium text-muted-foreground">VRF</dt> + <dd className="mt-1">{prefix.vrf_id ?? "-"}</dd> + </div> + <div> + <dt className="text-sm font-medium text-muted-foreground">VLAN</dt> + <dd className="mt-1">{prefix.vlan_id ?? "-"}</dd> + </div> + <div> + <dt className="text-sm font-medium text-muted-foreground">Role</dt> + <dd className="mt-1">{prefix.role ?? "-"}</dd> + </div> + <div> + <dt className="text-sm font-medium text-muted-foreground">Tenant</dt> + <dd className="mt-1">{prefix.tenant_id ?? "-"}</dd> + </div> + <div className="sm:col-span-2"> + <dt className="text-sm font-medium text-muted-foreground">Description</dt> + <dd className="mt-1">{prefix.description || "-"}</dd> + </div> + <div> + <dt className="text-sm font-medium text-muted-foreground">Created</dt> + <dd className="mt-1">{new Date(prefix.created_at).toLocaleString()}</dd> + </div> + <div> + <dt className="text-sm font-medium text-muted-foreground">Updated</dt> + <dd className="mt-1">{new Date(prefix.updated_at).toLocaleString()}</dd> + </div> + </dl> + </CardContent> + </Card> + )} + </TabsContent> + + <TabsContent value="history"> + <Card> + <CardHeader> + <CardTitle>Change History</CardTitle> + </CardHeader> + <CardContent> + <ChangelogTimeline objectId={params.id} /> + </CardContent> + </Card> + </TabsContent> + + <TabsContent value="journal"> + <Card> + <CardHeader> + <CardTitle>Journal</CardTitle> + </CardHeader> + <CardContent> + <JournalEntries objectType="prefix" objectId={params.id} /> + </CardContent> + </Card> + </TabsContent> + </Tabs> + </div> + ); +} diff --git a/frontend/apps/client/src/app/(dashboard)/ipam/prefixes/new/page.tsx b/frontend/apps/client/src/app/(dashboard)/ipam/prefixes/new/page.tsx new file mode 100644 index 0000000..d50ca72 --- /dev/null +++ b/frontend/apps/client/src/app/(dashboard)/ipam/prefixes/new/page.tsx @@ -0,0 +1,157 @@ +"use client"; + +import { useMutation, useQueryClient } from "@tanstack/react-query"; +import { useRouter } from "next/navigation"; +import { useState } from "react"; +import { prefixApi } from "@cmdb/shared"; +import { Button } from "@/components/ui/button"; +import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card"; +import { Input } from "@/components/ui/input"; +import { Label } from "@/components/ui/label"; +import { Textarea } from "@/components/ui/textarea"; +import { + Select, + SelectContent, + SelectItem, + SelectTrigger, + SelectValue, +} from "@/components/ui/select"; +import { ArrowLeft } from "lucide-react"; + +export default function NewPrefixPage() { + const router = useRouter(); + const queryClient = useQueryClient(); + const [form, setForm] = useState({ + network: "", + status: "active", + vrf_id: "", + vlan_id: "", + role: "", + tenant_id: "", + description: "", + }); + + const mutation = useMutation({ + mutationFn: () => + prefixApi.create({ + network: form.network, + status: form.status, + vrf_id: form.vrf_id || null, + vlan_id: form.vlan_id || null, + role: form.role || null, + tenant_id: form.tenant_id || null, + 
description: form.description, + }), + onSuccess: () => { + queryClient.invalidateQueries({ queryKey: ["prefixes"] }); + router.push("/ipam/prefixes"); + }, + }); + + return ( + <div className="space-y-4"> + <div className="flex items-center gap-4"> + <Button variant="ghost" size="sm" onClick={() => router.push("/ipam/prefixes")}> + <ArrowLeft /> + Back + </Button> + <h1 className="text-2xl font-bold tracking-tight">New Prefix</h1> + </div> + + <Card> + <CardHeader> + <CardTitle>Create Prefix</CardTitle> + </CardHeader> + <CardContent> + <form + className="grid gap-4 sm:grid-cols-2" + onSubmit={(e) => { + e.preventDefault(); + mutation.mutate(); + }} + > + <div className="space-y-2"> + <Label htmlFor="network">Network *</Label> + <Input + id="network" + placeholder="e.g. 10.0.0.0/24" + value={form.network} + onChange={(e) => setForm({ ...form, network: e.target.value })} + required + /> + </div> + <div className="space-y-2"> + <Label>Status</Label> + <Select + value={form.status} + onValueChange={(val) => val && setForm({ ...form, status: val })} + > + <SelectTrigger className="w-full"> + <SelectValue /> + </SelectTrigger> + <SelectContent> + <SelectItem value="active">Active</SelectItem> + <SelectItem value="reserved">Reserved</SelectItem> + <SelectItem value="deprecated">Deprecated</SelectItem> + <SelectItem value="container">Container</SelectItem> + </SelectContent> + </Select> + </div> + <div className="space-y-2"> + <Label htmlFor="vrf_id">VRF ID</Label> + <Input + id="vrf_id" + value={form.vrf_id} + onChange={(e) => setForm({ ...form, vrf_id: e.target.value })} + /> + </div> + <div className="space-y-2"> + <Label htmlFor="vlan_id">VLAN ID</Label> + <Input + id="vlan_id" + value={form.vlan_id} + onChange={(e) => setForm({ ...form, vlan_id: e.target.value })} + /> + </div> + <div className="space-y-2"> + <Label htmlFor="role">Role</Label> + <Input + id="role" + value={form.role} + onChange={(e) => setForm({ ...form, role: e.target.value })} + /> + </div> + <div className="space-y-2"> + <Label htmlFor="tenant_id">Tenant ID</Label> + <Input + id="tenant_id" + value={form.tenant_id} + onChange={(e) => setForm({ ...form, tenant_id: e.target.value })} + /> + </div> + <div className="space-y-2 sm:col-span-2"> + <Label htmlFor="description">Description</Label> + <Textarea + id="description" + value={form.description} + onChange={(e) => setForm({ ...form, description: e.target.value })} + /> + </div> + <div className="flex gap-2 sm:col-span-2"> + <Button type="submit" disabled={mutation.isPending}> + {mutation.isPending ? "Creating..." 
: "Create Prefix"} + </Button> + <Button + type="button" + variant="outline" + onClick={() => router.push("/ipam/prefixes")} + > + Cancel + </Button> + </div> + </form> + </CardContent> + </Card> + </div> + ); +} diff --git a/frontend/apps/client/src/app/(dashboard)/ipam/prefixes/page.tsx b/frontend/apps/client/src/app/(dashboard)/ipam/prefixes/page.tsx new file mode 100644 index 0000000..3fff49c --- /dev/null +++ b/frontend/apps/client/src/app/(dashboard)/ipam/prefixes/page.tsx @@ -0,0 +1,59 @@ +"use client"; + +import { useState } from "react"; +import { useQuery } from "@tanstack/react-query"; +import { useRouter } from "next/navigation"; +import type { ColumnDef } from "@tanstack/react-table"; +import type { Prefix } from "@cmdb/shared"; +import { prefixApi } from "@cmdb/shared"; +import { DataTable } from "@/components/data-table"; +import { StatusBadge } from "@/components/status-badge"; +import { Button } from "@/components/ui/button"; +import { Plus } from "lucide-react"; + +const columns: ColumnDef<Prefix>[] = [ + { accessorKey: "network", header: "Network" }, + { + accessorKey: "status", + header: "Status", + cell: ({ row }) => <StatusBadge status={row.original.status} />, + }, + { accessorKey: "vrf_id", header: "VRF" }, + { accessorKey: "role", header: "Role" }, + { accessorKey: "tenant_id", header: "Tenant" }, + { accessorKey: "description", header: "Description" }, +]; + +export default function PrefixesPage() { + const router = useRouter(); + const [offset, setOffset] = useState(0); + const limit = 25; + + const { data, isLoading } = useQuery({ + queryKey: ["prefixes", offset, limit], + queryFn: () => prefixApi.list({ offset, limit }), + }); + + return ( + <div className="space-y-4"> + <div className="flex items-center justify-between"> + <h1 className="text-2xl font-bold tracking-tight">Prefixes</h1> + <Button onClick={() => router.push("/ipam/prefixes/new")}> + <Plus /> + New Prefix + </Button> + </div> + + <DataTable + columns={columns} + data={data?.items ?? []} + isLoading={isLoading} + pagination={ + data ? { offset: data.offset, limit: data.limit, total: data.total } : undefined + } + onPaginationChange={setOffset} + onRowClick={(row) => router.push(`/ipam/prefixes/${row.id}`)} + /> + </div> + ); +} diff --git a/frontend/apps/client/src/app/(dashboard)/ipam/rirs/page.tsx b/frontend/apps/client/src/app/(dashboard)/ipam/rirs/page.tsx new file mode 100644 index 0000000..0fe8885 --- /dev/null +++ b/frontend/apps/client/src/app/(dashboard)/ipam/rirs/page.tsx @@ -0,0 +1,43 @@ +"use client"; + +import { useState } from "react"; +import { useQuery } from "@tanstack/react-query"; +import type { ColumnDef } from "@tanstack/react-table"; +import type { RIR } from "@cmdb/shared"; +import { rirApi } from "@cmdb/shared"; +import { DataTable } from "@/components/data-table"; + +const columns: ColumnDef<RIR>[] = [ + { accessorKey: "name", header: "Name" }, + { + accessorKey: "is_private", + header: "Private", + cell: ({ row }) => (row.original.is_private ? "Yes" : "No"), + }, + { accessorKey: "description", header: "Description" }, +]; + +export default function RIRsPage() { + const [offset, setOffset] = useState(0); + const limit = 25; + + const { data, isLoading } = useQuery({ + queryKey: ["rirs", offset, limit], + queryFn: () => rirApi.list({ offset, limit }), + }); + + return ( + <div className="space-y-4"> + <h1 className="text-2xl font-bold tracking-tight">RIRs</h1> + <DataTable + columns={columns} + data={data?.items ?? []} + isLoading={isLoading} + pagination={ + data ? 
{ offset: data.offset, limit: data.limit, total: data.total } : undefined + } + onPaginationChange={setOffset} + /> + </div> + ); +} diff --git a/frontend/apps/client/src/app/(dashboard)/ipam/route-targets/page.tsx b/frontend/apps/client/src/app/(dashboard)/ipam/route-targets/page.tsx new file mode 100644 index 0000000..ca8d762 --- /dev/null +++ b/frontend/apps/client/src/app/(dashboard)/ipam/route-targets/page.tsx @@ -0,0 +1,39 @@ +"use client"; + +import { useState } from "react"; +import { useQuery } from "@tanstack/react-query"; +import type { ColumnDef } from "@tanstack/react-table"; +import type { RouteTarget } from "@cmdb/shared"; +import { routeTargetApi } from "@cmdb/shared"; +import { DataTable } from "@/components/data-table"; + +const columns: ColumnDef<RouteTarget>[] = [ + { accessorKey: "name", header: "Name" }, + { accessorKey: "tenant_id", header: "Tenant" }, + { accessorKey: "description", header: "Description" }, +]; + +export default function RouteTargetsPage() { + const [offset, setOffset] = useState(0); + const limit = 25; + + const { data, isLoading } = useQuery({ + queryKey: ["route-targets", offset, limit], + queryFn: () => routeTargetApi.list({ offset, limit }), + }); + + return ( + <div className="space-y-4"> + <h1 className="text-2xl font-bold tracking-tight">Route Targets</h1> + <DataTable + columns={columns} + data={data?.items ?? []} + isLoading={isLoading} + pagination={ + data ? { offset: data.offset, limit: data.limit, total: data.total } : undefined + } + onPaginationChange={setOffset} + /> + </div> + ); +} diff --git a/frontend/apps/client/src/app/(dashboard)/ipam/services/page.tsx b/frontend/apps/client/src/app/(dashboard)/ipam/services/page.tsx new file mode 100644 index 0000000..b927007 --- /dev/null +++ b/frontend/apps/client/src/app/(dashboard)/ipam/services/page.tsx @@ -0,0 +1,44 @@ +"use client"; + +import { useState } from "react"; +import { useQuery } from "@tanstack/react-query"; +import type { ColumnDef } from "@tanstack/react-table"; +import type { Service } from "@cmdb/shared"; +import { serviceApi } from "@cmdb/shared"; +import { DataTable } from "@/components/data-table"; + +const columns: ColumnDef<Service>[] = [ + { accessorKey: "name", header: "Name" }, + { accessorKey: "protocol", header: "Protocol" }, + { + accessorKey: "ports", + header: "Ports", + cell: ({ row }) => row.original.ports.join(", "), + }, + { accessorKey: "description", header: "Description" }, +]; + +export default function ServicesPage() { + const [offset, setOffset] = useState(0); + const limit = 25; + + const { data, isLoading } = useQuery({ + queryKey: ["services", offset, limit], + queryFn: () => serviceApi.list({ offset, limit }), + }); + + return ( + <div className="space-y-4"> + <h1 className="text-2xl font-bold tracking-tight">Services</h1> + <DataTable + columns={columns} + data={data?.items ?? []} + isLoading={isLoading} + pagination={ + data ? 
{ offset: data.offset, limit: data.limit, total: data.total } : undefined + } + onPaginationChange={setOffset} + /> + </div> + ); +} diff --git a/frontend/apps/client/src/app/(dashboard)/ipam/vlan-groups/page.tsx b/frontend/apps/client/src/app/(dashboard)/ipam/vlan-groups/page.tsx new file mode 100644 index 0000000..3062a8b --- /dev/null +++ b/frontend/apps/client/src/app/(dashboard)/ipam/vlan-groups/page.tsx @@ -0,0 +1,44 @@ +"use client"; + +import { useState } from "react"; +import { useQuery } from "@tanstack/react-query"; +import type { ColumnDef } from "@tanstack/react-table"; +import type { VLANGroup } from "@cmdb/shared"; +import { vlanGroupApi } from "@cmdb/shared"; +import { DataTable } from "@/components/data-table"; + +const columns: ColumnDef<VLANGroup>[] = [ + { accessorKey: "name", header: "Name" }, + { accessorKey: "slug", header: "Slug" }, + { + id: "vid_range", + header: "VID Range", + cell: ({ row }) => `${row.original.min_vid}-${row.original.max_vid}`, + }, + { accessorKey: "tenant_id", header: "Tenant" }, +]; + +export default function VLANGroupsPage() { + const [offset, setOffset] = useState(0); + const limit = 25; + + const { data, isLoading } = useQuery({ + queryKey: ["vlan-groups", offset, limit], + queryFn: () => vlanGroupApi.list({ offset, limit }), + }); + + return ( + <div className="space-y-4"> + <h1 className="text-2xl font-bold tracking-tight">VLAN Groups</h1> + <DataTable + columns={columns} + data={data?.items ?? []} + isLoading={isLoading} + pagination={ + data ? { offset: data.offset, limit: data.limit, total: data.total } : undefined + } + onPaginationChange={setOffset} + /> + </div> + ); +} diff --git a/frontend/apps/client/src/app/(dashboard)/ipam/vlans/[id]/page.tsx b/frontend/apps/client/src/app/(dashboard)/ipam/vlans/[id]/page.tsx new file mode 100644 index 0000000..d58d426 --- /dev/null +++ b/frontend/apps/client/src/app/(dashboard)/ipam/vlans/[id]/page.tsx @@ -0,0 +1,296 @@ +"use client"; + +import { useQuery, useMutation, useQueryClient } from "@tanstack/react-query"; +import { useRouter, useParams } from "next/navigation"; +import type { VLAN } from "@cmdb/shared"; +import { vlanApi } from "@cmdb/shared"; +import { StatusBadge } from "@/components/status-badge"; +import { Button } from "@/components/ui/button"; +import { + Card, + CardContent, + CardHeader, + CardTitle, + CardAction, +} from "@/components/ui/card"; +import { + Dialog, + DialogContent, + DialogHeader, + DialogTitle, + DialogDescription, + DialogFooter, + DialogTrigger, + DialogClose, +} from "@/components/ui/dialog"; +import { Input } from "@/components/ui/input"; +import { Label } from "@/components/ui/label"; +import { Textarea } from "@/components/ui/textarea"; +import { + Select, + SelectContent, + SelectItem, + SelectTrigger, + SelectValue, +} from "@/components/ui/select"; +import { Skeleton } from "@/components/ui/skeleton"; +import { ArrowLeft, Pencil, Trash2 } from "lucide-react"; +import { useState } from "react"; + +export default function VLANDetailPage() { + const router = useRouter(); + const params = useParams<{ id: string }>(); + const queryClient = useQueryClient(); + const [editing, setEditing] = useState(false); + const [editData, setEditData] = useState<Partial<VLAN>>({}); + + const { data: vlan, isLoading } = useQuery({ + queryKey: ["vlan", params.id], + queryFn: () => vlanApi.get(params.id), + }); + + const updateMutation = useMutation({ + mutationFn: (data: Partial<VLAN>) => vlanApi.update(params.id, data), + onSuccess: () => { + 
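// Refresh the cached detail record and the list, then close the edit form. +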
queryClient.invalidateQueries({ queryKey: ["vlan", params.id] }); + queryClient.invalidateQueries({ queryKey: ["vlans"] }); + setEditing(false); + }, + }); + + const deleteMutation = useMutation({ + mutationFn: () => vlanApi.delete(params.id), + onSuccess: () => { + queryClient.invalidateQueries({ queryKey: ["vlans"] }); + router.push("/ipam/vlans"); + }, + }); + + function startEditing() { + if (!vlan) return; + setEditData({ + vid: vlan.vid, + name: vlan.name, + status: vlan.status, + group_id: vlan.group_id, + role: vlan.role, + tenant_id: vlan.tenant_id, + description: vlan.description, + }); + setEditing(true); + } + + if (isLoading) { + return ( + <div className="space-y-4"> + <Skeleton className="h-8 w-48" /> + <Skeleton className="h-64 w-full" /> + </div> + ); + } + + if (!vlan) { + return <p className="text-muted-foreground">VLAN not found.</p>; + } + + return ( + <div className="space-y-4"> + <div className="flex items-center gap-4"> + <Button variant="ghost" size="sm" onClick={() => router.push("/ipam/vlans")}> + <ArrowLeft /> + Back + </Button> + <h1 className="text-2xl font-bold tracking-tight"> + VLAN {vlan.vid} - {vlan.name} + </h1> + <StatusBadge status={vlan.status} /> + </div> + + {editing ? ( + <Card> + <CardHeader> + <CardTitle>Edit VLAN</CardTitle> + </CardHeader> + <CardContent> + <form + className="grid gap-4 sm:grid-cols-2" + onSubmit={(e) => { + e.preventDefault(); + updateMutation.mutate(editData); + }} + > + <div className="space-y-2"> + <Label htmlFor="vid">VID</Label> + <Input + id="vid" + type="number" + value={editData.vid ?? ""} + onChange={(e) => + setEditData({ ...editData, vid: parseInt(e.target.value) || 0 }) + } + required + /> + </div> + <div className="space-y-2"> + <Label htmlFor="name">Name</Label> + <Input + id="name" + value={editData.name ?? ""} + onChange={(e) => setEditData({ ...editData, name: e.target.value })} + required + /> + </div> + <div className="space-y-2"> + <Label>Status</Label> + <Select + value={editData.status} + onValueChange={(val) => val && setEditData({ ...editData, status: val })} + > + <SelectTrigger className="w-full"> + <SelectValue /> + </SelectTrigger> + <SelectContent> + <SelectItem value="active">Active</SelectItem> + <SelectItem value="reserved">Reserved</SelectItem> + <SelectItem value="deprecated">Deprecated</SelectItem> + </SelectContent> + </Select> + </div> + <div className="space-y-2"> + <Label htmlFor="group_id">Group ID</Label> + <Input + id="group_id" + value={editData.group_id ?? ""} + onChange={(e) => + setEditData({ ...editData, group_id: e.target.value || null }) + } + /> + </div> + <div className="space-y-2"> + <Label htmlFor="role">Role</Label> + <Input + id="role" + value={editData.role ?? ""} + onChange={(e) => + setEditData({ ...editData, role: e.target.value || null }) + } + /> + </div> + <div className="space-y-2"> + <Label htmlFor="tenant_id">Tenant ID</Label> + <Input + id="tenant_id" + value={editData.tenant_id ?? ""} + onChange={(e) => + setEditData({ ...editData, tenant_id: e.target.value || null }) + } + /> + </div> + <div className="space-y-2 sm:col-span-2"> + <Label htmlFor="description">Description</Label> + <Textarea + id="description" + value={editData.description ?? ""} + onChange={(e) => setEditData({ ...editData, description: e.target.value })} + /> + </div> + <div className="flex gap-2 sm:col-span-2"> + <Button type="submit" disabled={updateMutation.isPending}> + {updateMutation.isPending ? "Saving..." 
: "Save"} + </Button> + <Button type="button" variant="outline" onClick={() => setEditing(false)}> + Cancel + </Button> + </div> + </form> + </CardContent> + </Card> + ) : ( + <Card> + <CardHeader> + <CardTitle>VLAN Details</CardTitle> + <CardAction> + <div className="flex gap-2"> + <Button variant="outline" size="sm" onClick={startEditing}> + <Pencil /> + Edit + </Button> + <Dialog> + <DialogTrigger + render={ + <Button variant="destructive" size="sm"> + <Trash2 /> + Delete + </Button> + } + /> + <DialogContent> + <DialogHeader> + <DialogTitle>Delete VLAN</DialogTitle> + <DialogDescription> + Are you sure you want to delete VLAN {vlan.vid} ({vlan.name})? This + action cannot be undone. + </DialogDescription> + </DialogHeader> + <DialogFooter> + <DialogClose render={<Button variant="outline" />}>Cancel</DialogClose> + <Button + variant="destructive" + onClick={() => deleteMutation.mutate()} + disabled={deleteMutation.isPending} + > + {deleteMutation.isPending ? "Deleting..." : "Delete"} + </Button> + </DialogFooter> + </DialogContent> + </Dialog> + </div> + </CardAction> + </CardHeader> + <CardContent> + <dl className="grid gap-4 sm:grid-cols-2"> + <div> + <dt className="text-sm font-medium text-muted-foreground">VID</dt> + <dd className="mt-1">{vlan.vid}</dd> + </div> + <div> + <dt className="text-sm font-medium text-muted-foreground">Name</dt> + <dd className="mt-1">{vlan.name}</dd> + </div> + <div> + <dt className="text-sm font-medium text-muted-foreground">Status</dt> + <dd className="mt-1"> + <StatusBadge status={vlan.status} /> + </dd> + </div> + <div> + <dt className="text-sm font-medium text-muted-foreground">Group</dt> + <dd className="mt-1">{vlan.group_id ?? "-"}</dd> + </div> + <div> + <dt className="text-sm font-medium text-muted-foreground">Role</dt> + <dd className="mt-1">{vlan.role ?? "-"}</dd> + </div> + <div> + <dt className="text-sm font-medium text-muted-foreground">Tenant</dt> + <dd className="mt-1">{vlan.tenant_id ?? 
"-"}</dd> + </div> + <div className="sm:col-span-2"> + <dt className="text-sm font-medium text-muted-foreground">Description</dt> + <dd className="mt-1">{vlan.description || "-"}</dd> + </div> + <div> + <dt className="text-sm font-medium text-muted-foreground">Created</dt> + <dd className="mt-1">{new Date(vlan.created_at).toLocaleString()}</dd> + </div> + <div> + <dt className="text-sm font-medium text-muted-foreground">Updated</dt> + <dd className="mt-1">{new Date(vlan.updated_at).toLocaleString()}</dd> + </div> + </dl> + </CardContent> + </Card> + )} + </div> + ); +} diff --git a/frontend/apps/client/src/app/(dashboard)/ipam/vlans/new/page.tsx b/frontend/apps/client/src/app/(dashboard)/ipam/vlans/new/page.tsx new file mode 100644 index 0000000..0f906f0 --- /dev/null +++ b/frontend/apps/client/src/app/(dashboard)/ipam/vlans/new/page.tsx @@ -0,0 +1,158 @@ +"use client"; + +import { useMutation, useQueryClient } from "@tanstack/react-query"; +import { useRouter } from "next/navigation"; +import { useState } from "react"; +import { vlanApi } from "@cmdb/shared"; +import { Button } from "@/components/ui/button"; +import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card"; +import { Input } from "@/components/ui/input"; +import { Label } from "@/components/ui/label"; +import { Textarea } from "@/components/ui/textarea"; +import { + Select, + SelectContent, + SelectItem, + SelectTrigger, + SelectValue, +} from "@/components/ui/select"; +import { ArrowLeft } from "lucide-react"; + +export default function NewVLANPage() { + const router = useRouter(); + const queryClient = useQueryClient(); + const [form, setForm] = useState({ + vid: "", + name: "", + status: "active", + group_id: "", + role: "", + tenant_id: "", + description: "", + }); + + const mutation = useMutation({ + mutationFn: () => + vlanApi.create({ + vid: parseInt(form.vid) || 0, + name: form.name, + status: form.status, + group_id: form.group_id || null, + role: form.role || null, + tenant_id: form.tenant_id || null, + description: form.description, + }), + onSuccess: () => { + queryClient.invalidateQueries({ queryKey: ["vlans"] }); + router.push("/ipam/vlans"); + }, + }); + + return ( + <div className="space-y-4"> + <div className="flex items-center gap-4"> + <Button variant="ghost" size="sm" onClick={() => router.push("/ipam/vlans")}> + <ArrowLeft /> + Back + </Button> + <h1 className="text-2xl font-bold tracking-tight">New VLAN</h1> + </div> + + <Card> + <CardHeader> + <CardTitle>Create VLAN</CardTitle> + </CardHeader> + <CardContent> + <form + className="grid gap-4 sm:grid-cols-2" + onSubmit={(e) => { + e.preventDefault(); + mutation.mutate(); + }} + > + <div className="space-y-2"> + <Label htmlFor="vid">VID *</Label> + <Input + id="vid" + type="number" + placeholder="e.g. 
100" + value={form.vid} + onChange={(e) => setForm({ ...form, vid: e.target.value })} + required + /> + </div> + <div className="space-y-2"> + <Label htmlFor="name">Name *</Label> + <Input + id="name" + value={form.name} + onChange={(e) => setForm({ ...form, name: e.target.value })} + required + /> + </div> + <div className="space-y-2"> + <Label>Status</Label> + <Select + value={form.status} + onValueChange={(val) => val && setForm({ ...form, status: val })} + > + <SelectTrigger className="w-full"> + <SelectValue /> + </SelectTrigger> + <SelectContent> + <SelectItem value="active">Active</SelectItem> + <SelectItem value="reserved">Reserved</SelectItem> + <SelectItem value="deprecated">Deprecated</SelectItem> + </SelectContent> + </Select> + </div> + <div className="space-y-2"> + <Label htmlFor="group_id">Group ID</Label> + <Input + id="group_id" + value={form.group_id} + onChange={(e) => setForm({ ...form, group_id: e.target.value })} + /> + </div> + <div className="space-y-2"> + <Label htmlFor="role">Role</Label> + <Input + id="role" + value={form.role} + onChange={(e) => setForm({ ...form, role: e.target.value })} + /> + </div> + <div className="space-y-2"> + <Label htmlFor="tenant_id">Tenant ID</Label> + <Input + id="tenant_id" + value={form.tenant_id} + onChange={(e) => setForm({ ...form, tenant_id: e.target.value })} + /> + </div> + <div className="space-y-2 sm:col-span-2"> + <Label htmlFor="description">Description</Label> + <Textarea + id="description" + value={form.description} + onChange={(e) => setForm({ ...form, description: e.target.value })} + /> + </div> + <div className="flex gap-2 sm:col-span-2"> + <Button type="submit" disabled={mutation.isPending}> + {mutation.isPending ? "Creating..." : "Create VLAN"} + </Button> + <Button + type="button" + variant="outline" + onClick={() => router.push("/ipam/vlans")} + > + Cancel + </Button> + </div> + </form> + </CardContent> + </Card> + </div> + ); +} diff --git a/frontend/apps/client/src/app/(dashboard)/ipam/vlans/page.tsx b/frontend/apps/client/src/app/(dashboard)/ipam/vlans/page.tsx new file mode 100644 index 0000000..b4d3918 --- /dev/null +++ b/frontend/apps/client/src/app/(dashboard)/ipam/vlans/page.tsx @@ -0,0 +1,60 @@ +"use client"; + +import { useState } from "react"; +import { useQuery } from "@tanstack/react-query"; +import { useRouter } from "next/navigation"; +import type { ColumnDef } from "@tanstack/react-table"; +import type { VLAN } from "@cmdb/shared"; +import { vlanApi } from "@cmdb/shared"; +import { DataTable } from "@/components/data-table"; +import { StatusBadge } from "@/components/status-badge"; +import { Button } from "@/components/ui/button"; +import { Plus } from "lucide-react"; + +const columns: ColumnDef<VLAN>[] = [ + { accessorKey: "vid", header: "VID" }, + { accessorKey: "name", header: "Name" }, + { + accessorKey: "status", + header: "Status", + cell: ({ row }) => <StatusBadge status={row.original.status} />, + }, + { accessorKey: "group_id", header: "Group" }, + { accessorKey: "role", header: "Role" }, + { accessorKey: "tenant_id", header: "Tenant" }, + { accessorKey: "description", header: "Description" }, +]; + +export default function VLANsPage() { + const router = useRouter(); + const [offset, setOffset] = useState(0); + const limit = 25; + + const { data, isLoading } = useQuery({ + queryKey: ["vlans", offset, limit], + queryFn: () => vlanApi.list({ offset, limit }), + }); + + return ( + <div className="space-y-4"> + <div className="flex items-center justify-between"> + <h1 className="text-2xl 
font-bold tracking-tight">VLANs</h1> + <Button onClick={() => router.push("/ipam/vlans/new")}> + <Plus /> + New VLAN + </Button> + </div> + + <DataTable + columns={columns} + data={data?.items ?? []} + isLoading={isLoading} + pagination={ + data ? { offset: data.offset, limit: data.limit, total: data.total } : undefined + } + onPaginationChange={setOffset} + onRowClick={(row) => router.push(`/ipam/vlans/${row.id}`)} + /> + </div> + ); +} diff --git a/frontend/apps/client/src/app/(dashboard)/ipam/vrfs/[id]/page.tsx b/frontend/apps/client/src/app/(dashboard)/ipam/vrfs/[id]/page.tsx new file mode 100644 index 0000000..e914d4e --- /dev/null +++ b/frontend/apps/client/src/app/(dashboard)/ipam/vrfs/[id]/page.tsx @@ -0,0 +1,271 @@ +"use client"; + +import { useQuery, useMutation, useQueryClient } from "@tanstack/react-query"; +import { useRouter, useParams } from "next/navigation"; +import type { VRF } from "@cmdb/shared"; +import { vrfApi } from "@cmdb/shared"; +import { Button } from "@/components/ui/button"; +import { + Card, + CardContent, + CardHeader, + CardTitle, + CardAction, +} from "@/components/ui/card"; +import { + Dialog, + DialogContent, + DialogHeader, + DialogTitle, + DialogDescription, + DialogFooter, + DialogTrigger, + DialogClose, +} from "@/components/ui/dialog"; +import { Input } from "@/components/ui/input"; +import { Label } from "@/components/ui/label"; +import { Textarea } from "@/components/ui/textarea"; +import { Skeleton } from "@/components/ui/skeleton"; +import { ArrowLeft, Pencil, Trash2 } from "lucide-react"; +import { useState } from "react"; + +export default function VRFDetailPage() { + const router = useRouter(); + const params = useParams<{ id: string }>(); + const queryClient = useQueryClient(); + const [editing, setEditing] = useState(false); + const [editData, setEditData] = useState<Partial<VRF & { import_targets_str: string; export_targets_str: string }>>({}); + + const { data: vrf, isLoading } = useQuery({ + queryKey: ["vrf", params.id], + queryFn: () => vrfApi.get(params.id), + }); + + const updateMutation = useMutation({ + mutationFn: (data: Partial<VRF>) => vrfApi.update(params.id, data), + onSuccess: () => { + queryClient.invalidateQueries({ queryKey: ["vrf", params.id] }); + queryClient.invalidateQueries({ queryKey: ["vrfs"] }); + setEditing(false); + }, + }); + + const deleteMutation = useMutation({ + mutationFn: () => vrfApi.delete(params.id), + onSuccess: () => { + queryClient.invalidateQueries({ queryKey: ["vrfs"] }); + router.push("/ipam/vrfs"); + }, + }); + + function startEditing() { + if (!vrf) return; + setEditData({ + name: vrf.name, + rd: vrf.rd, + tenant_id: vrf.tenant_id, + description: vrf.description, + import_targets_str: vrf.import_targets.join(", "), + export_targets_str: vrf.export_targets.join(", "), + }); + setEditing(true); + } + + function handleSubmit() { + const importStr = (editData as { import_targets_str?: string }).import_targets_str ?? ""; + const exportStr = (editData as { export_targets_str?: string }).export_targets_str ?? ""; + updateMutation.mutate({ + name: editData.name, + rd: editData.rd, + import_targets: importStr ? importStr.split(",").map((s) => s.trim()) : [], + export_targets: exportStr ? 
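/* comma-separated input becomes a trimmed array; empty input yields no targets */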
exportStr.split(",").map((s) => s.trim()) : [], + tenant_id: editData.tenant_id, + description: editData.description, + }); + } + + if (isLoading) { + return ( + <div className="space-y-4"> + <Skeleton className="h-8 w-48" /> + <Skeleton className="h-64 w-full" /> + </div> + ); + } + + if (!vrf) { + return <p className="text-muted-foreground">VRF not found.</p>; + } + + return ( + <div className="space-y-4"> + <div className="flex items-center gap-4"> + <Button variant="ghost" size="sm" onClick={() => router.push("/ipam/vrfs")}> + <ArrowLeft /> + Back + </Button> + <h1 className="text-2xl font-bold tracking-tight">{vrf.name}</h1> + </div> + + {editing ? ( + <Card> + <CardHeader> + <CardTitle>Edit VRF</CardTitle> + </CardHeader> + <CardContent> + <form + className="grid gap-4 sm:grid-cols-2" + onSubmit={(e) => { + e.preventDefault(); + handleSubmit(); + }} + > + <div className="space-y-2"> + <Label htmlFor="name">Name</Label> + <Input + id="name" + value={editData.name ?? ""} + onChange={(e) => setEditData({ ...editData, name: e.target.value })} + required + /> + </div> + <div className="space-y-2"> + <Label htmlFor="rd">RD</Label> + <Input + id="rd" + value={editData.rd ?? ""} + onChange={(e) => setEditData({ ...editData, rd: e.target.value || null })} + /> + </div> + <div className="space-y-2"> + <Label htmlFor="import_targets">Import Targets (comma separated)</Label> + <Input + id="import_targets" + value={(editData as { import_targets_str?: string }).import_targets_str ?? ""} + onChange={(e) => + setEditData({ ...editData, import_targets_str: e.target.value }) + } + /> + </div> + <div className="space-y-2"> + <Label htmlFor="export_targets">Export Targets (comma separated)</Label> + <Input + id="export_targets" + value={(editData as { export_targets_str?: string }).export_targets_str ?? ""} + onChange={(e) => + setEditData({ ...editData, export_targets_str: e.target.value }) + } + /> + </div> + <div className="space-y-2"> + <Label htmlFor="tenant_id">Tenant ID</Label> + <Input + id="tenant_id" + value={editData.tenant_id ?? ""} + onChange={(e) => + setEditData({ ...editData, tenant_id: e.target.value || null }) + } + /> + </div> + <div className="space-y-2 sm:col-span-2"> + <Label htmlFor="description">Description</Label> + <Textarea + id="description" + value={editData.description ?? ""} + onChange={(e) => setEditData({ ...editData, description: e.target.value })} + /> + </div> + <div className="flex gap-2 sm:col-span-2"> + <Button type="submit" disabled={updateMutation.isPending}> + {updateMutation.isPending ? "Saving..." : "Save"} + </Button> + <Button type="button" variant="outline" onClick={() => setEditing(false)}> + Cancel + </Button> + </div> + </form> + </CardContent> + </Card> + ) : ( + <Card> + <CardHeader> + <CardTitle>VRF Details</CardTitle> + <CardAction> + <div className="flex gap-2"> + <Button variant="outline" size="sm" onClick={startEditing}> + <Pencil /> + Edit + </Button> + <Dialog> + <DialogTrigger + render={ + <Button variant="destructive" size="sm"> + <Trash2 /> + Delete + </Button> + } + /> + <DialogContent> + <DialogHeader> + <DialogTitle>Delete VRF</DialogTitle> + <DialogDescription> + Are you sure you want to delete {vrf.name}? This action cannot be + undone. + </DialogDescription> + </DialogHeader> + <DialogFooter> + <DialogClose render={<Button variant="outline" />}>Cancel</DialogClose> + <Button + variant="destructive" + onClick={() => deleteMutation.mutate()} + disabled={deleteMutation.isPending} + > + {deleteMutation.isPending ? "Deleting..." 
: "Delete"} + </Button> + </DialogFooter> + </DialogContent> + </Dialog> + </div> + </CardAction> + </CardHeader> + <CardContent> + <dl className="grid gap-4 sm:grid-cols-2"> + <div> + <dt className="text-sm font-medium text-muted-foreground">Name</dt> + <dd className="mt-1">{vrf.name}</dd> + </div> + <div> + <dt className="text-sm font-medium text-muted-foreground">RD</dt> + <dd className="mt-1">{vrf.rd ?? "-"}</dd> + </div> + <div> + <dt className="text-sm font-medium text-muted-foreground">Import Targets</dt> + <dd className="mt-1">{vrf.import_targets.join(", ") || "-"}</dd> + </div> + <div> + <dt className="text-sm font-medium text-muted-foreground">Export Targets</dt> + <dd className="mt-1">{vrf.export_targets.join(", ") || "-"}</dd> + </div> + <div> + <dt className="text-sm font-medium text-muted-foreground">Tenant</dt> + <dd className="mt-1">{vrf.tenant_id ?? "-"}</dd> + </div> + <div className="sm:col-span-2"> + <dt className="text-sm font-medium text-muted-foreground">Description</dt> + <dd className="mt-1">{vrf.description || "-"}</dd> + </div> + <div> + <dt className="text-sm font-medium text-muted-foreground">Created</dt> + <dd className="mt-1">{new Date(vrf.created_at).toLocaleString()}</dd> + </div> + <div> + <dt className="text-sm font-medium text-muted-foreground">Updated</dt> + <dd className="mt-1">{new Date(vrf.updated_at).toLocaleString()}</dd> + </div> + </dl> + </CardContent> + </Card> + )} + </div> + ); +} diff --git a/frontend/apps/client/src/app/(dashboard)/ipam/vrfs/new/page.tsx b/frontend/apps/client/src/app/(dashboard)/ipam/vrfs/new/page.tsx new file mode 100644 index 0000000..dd32434 --- /dev/null +++ b/frontend/apps/client/src/app/(dashboard)/ipam/vrfs/new/page.tsx @@ -0,0 +1,135 @@ +"use client"; + +import { useMutation, useQueryClient } from "@tanstack/react-query"; +import { useRouter } from "next/navigation"; +import { useState } from "react"; +import { vrfApi } from "@cmdb/shared"; +import { Button } from "@/components/ui/button"; +import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card"; +import { Input } from "@/components/ui/input"; +import { Label } from "@/components/ui/label"; +import { Textarea } from "@/components/ui/textarea"; +import { ArrowLeft } from "lucide-react"; + +export default function NewVRFPage() { + const router = useRouter(); + const queryClient = useQueryClient(); + const [form, setForm] = useState({ + name: "", + rd: "", + import_targets: "", + export_targets: "", + tenant_id: "", + description: "", + }); + + const mutation = useMutation({ + mutationFn: () => + vrfApi.create({ + name: form.name, + rd: form.rd || null, + import_targets: form.import_targets + ? form.import_targets.split(",").map((s) => s.trim()) + : [], + export_targets: form.export_targets + ? 
form.export_targets.split(",").map((s) => s.trim()) + : [], + tenant_id: form.tenant_id || null, + description: form.description, + }), + onSuccess: () => { + queryClient.invalidateQueries({ queryKey: ["vrfs"] }); + router.push("/ipam/vrfs"); + }, + }); + + return ( + <div className="space-y-4"> + <div className="flex items-center gap-4"> + <Button variant="ghost" size="sm" onClick={() => router.push("/ipam/vrfs")}> + <ArrowLeft /> + Back + </Button> + <h1 className="text-2xl font-bold tracking-tight">New VRF</h1> + </div> + + <Card> + <CardHeader> + <CardTitle>Create VRF</CardTitle> + </CardHeader> + <CardContent> + <form + className="grid gap-4 sm:grid-cols-2" + onSubmit={(e) => { + e.preventDefault(); + mutation.mutate(); + }} + > + <div className="space-y-2"> + <Label htmlFor="name">Name *</Label> + <Input + id="name" + value={form.name} + onChange={(e) => setForm({ ...form, name: e.target.value })} + required + /> + </div> + <div className="space-y-2"> + <Label htmlFor="rd">RD</Label> + <Input + id="rd" + placeholder="e.g. 65000:1" + value={form.rd} + onChange={(e) => setForm({ ...form, rd: e.target.value })} + /> + </div> + <div className="space-y-2"> + <Label htmlFor="import_targets">Import Targets (comma separated)</Label> + <Input + id="import_targets" + value={form.import_targets} + onChange={(e) => setForm({ ...form, import_targets: e.target.value })} + /> + </div> + <div className="space-y-2"> + <Label htmlFor="export_targets">Export Targets (comma separated)</Label> + <Input + id="export_targets" + value={form.export_targets} + onChange={(e) => setForm({ ...form, export_targets: e.target.value })} + /> + </div> + <div className="space-y-2"> + <Label htmlFor="tenant_id">Tenant ID</Label> + <Input + id="tenant_id" + value={form.tenant_id} + onChange={(e) => setForm({ ...form, tenant_id: e.target.value })} + /> + </div> + <div className="space-y-2 sm:col-span-2"> + <Label htmlFor="description">Description</Label> + <Textarea + id="description" + value={form.description} + onChange={(e) => setForm({ ...form, description: e.target.value })} + /> + </div> + <div className="flex gap-2 sm:col-span-2"> + <Button type="submit" disabled={mutation.isPending}> + {mutation.isPending ? "Creating..." 
: "Create VRF"} + </Button> + <Button + type="button" + variant="outline" + onClick={() => router.push("/ipam/vrfs")} + > + Cancel + </Button> + </div> + </form> + </CardContent> + </Card> + </div> + ); +} diff --git a/frontend/apps/client/src/app/(dashboard)/ipam/vrfs/page.tsx b/frontend/apps/client/src/app/(dashboard)/ipam/vrfs/page.tsx new file mode 100644 index 0000000..e40adea --- /dev/null +++ b/frontend/apps/client/src/app/(dashboard)/ipam/vrfs/page.tsx @@ -0,0 +1,62 @@ +"use client"; + +import { useState } from "react"; +import { useQuery } from "@tanstack/react-query"; +import { useRouter } from "next/navigation"; +import type { ColumnDef } from "@tanstack/react-table"; +import type { VRF } from "@cmdb/shared"; +import { vrfApi } from "@cmdb/shared"; +import { DataTable } from "@/components/data-table"; +import { Button } from "@/components/ui/button"; +import { Plus } from "lucide-react"; + +const columns: ColumnDef<VRF>[] = [ + { accessorKey: "name", header: "Name" }, + { accessorKey: "rd", header: "RD" }, + { + accessorKey: "import_targets", + header: "Import Targets", + cell: ({ row }) => row.original.import_targets.join(", ") || "-", + }, + { + accessorKey: "export_targets", + header: "Export Targets", + cell: ({ row }) => row.original.export_targets.join(", ") || "-", + }, + { accessorKey: "tenant_id", header: "Tenant" }, + { accessorKey: "description", header: "Description" }, +]; + +export default function VRFsPage() { + const router = useRouter(); + const [offset, setOffset] = useState(0); + const limit = 25; + + const { data, isLoading } = useQuery({ + queryKey: ["vrfs", offset, limit], + queryFn: () => vrfApi.list({ offset, limit }), + }); + + return ( + <div className="space-y-4"> + <div className="flex items-center justify-between"> + <h1 className="text-2xl font-bold tracking-tight">VRFs</h1> + <Button onClick={() => router.push("/ipam/vrfs/new")}> + <Plus /> + New VRF + </Button> + </div> + + <DataTable + columns={columns} + data={data?.items ?? []} + isLoading={isLoading} + pagination={ + data ? { offset: data.offset, limit: data.limit, total: data.total } : undefined + } + onPaginationChange={setOffset} + onRowClick={(row) => router.push(`/ipam/vrfs/${row.id}`)} + /> + </div> + ); +} diff --git a/frontend/apps/client/src/app/(dashboard)/layout.tsx b/frontend/apps/client/src/app/(dashboard)/layout.tsx new file mode 100644 index 0000000..eebb0eb --- /dev/null +++ b/frontend/apps/client/src/app/(dashboard)/layout.tsx @@ -0,0 +1,45 @@ +"use client"; + +import { useState } from "react"; +import { AuthGuard } from "@/components/auth-guard"; +import { Sidebar } from "@/components/layout/sidebar"; +import { Header } from "@/components/layout/header"; +import { cn } from "@/lib/utils"; + +export default function DashboardLayout({ + children, +}: { + children: React.ReactNode; +}) { + const [sidebarOpen, setSidebarOpen] = useState(false); + + return ( + <AuthGuard> + <div className="flex h-screen overflow-hidden"> + {/* Mobile overlay */} + {sidebarOpen && ( + <div + className="fixed inset-0 z-40 bg-black/50 lg:hidden" + onClick={() => setSidebarOpen(false)} + /> + )} + + {/* Sidebar */} + <div + className={cn( + "fixed inset-y-0 left-0 z-50 transition-transform lg:relative lg:z-0 lg:translate-x-0", + sidebarOpen ? 
"translate-x-0" : "-translate-x-full", + )} + > + <Sidebar /> + </div> + + {/* Main content */} + <div className="flex flex-1 flex-col overflow-hidden"> + <Header onMenuClick={() => setSidebarOpen(!sidebarOpen)} /> + <main className="flex-1 overflow-y-auto p-4 lg:p-6">{children}</main> + </div> + </div> + </AuthGuard> + ); +} diff --git a/frontend/apps/client/src/app/(dashboard)/page.tsx b/frontend/apps/client/src/app/(dashboard)/page.tsx new file mode 100644 index 0000000..3f383f0 --- /dev/null +++ b/frontend/apps/client/src/app/(dashboard)/page.tsx @@ -0,0 +1,163 @@ +"use client"; + +import { useQuery } from "@tanstack/react-query"; +import Link from "next/link"; +import { + prefixApi, + ipAddressApi, + vrfApi, + vlanApi, + changelogApi, +} from "@cmdb/shared"; +import { + Card, + CardContent, + CardHeader, + CardTitle, +} from "@/components/ui/card"; +import { Skeleton } from "@/components/ui/skeleton"; +import { cn } from "@/lib/utils"; +import { Globe, Hash, Layers, Network } from "lucide-react"; + +const ACTION_STYLES: Record<string, string> = { + created: "bg-green-100 text-green-800 dark:bg-green-900 dark:text-green-200", + updated: "bg-blue-100 text-blue-800 dark:bg-blue-900 dark:text-blue-200", + deleted: "bg-red-100 text-red-800 dark:bg-red-900 dark:text-red-200", +}; + +export default function DashboardPage() { + const { data: prefixes, isLoading: loadingPrefixes } = useQuery({ + queryKey: ["dashboard-prefixes"], + queryFn: () => prefixApi.list({ limit: 1, offset: 0 }), + }); + + const { data: ipAddresses, isLoading: loadingIPs } = useQuery({ + queryKey: ["dashboard-ip-addresses"], + queryFn: () => ipAddressApi.list({ limit: 1, offset: 0 }), + }); + + const { data: vrfs, isLoading: loadingVRFs } = useQuery({ + queryKey: ["dashboard-vrfs"], + queryFn: () => vrfApi.list({ limit: 1, offset: 0 }), + }); + + const { data: vlans, isLoading: loadingVLANs } = useQuery({ + queryKey: ["dashboard-vlans"], + queryFn: () => vlanApi.list({ limit: 1, offset: 0 }), + }); + + const stats = [ + { + title: "Prefixes", + value: prefixes?.total, + loading: loadingPrefixes, + icon: Network, + href: "/ipam/prefixes", + }, + { + title: "IP Addresses", + value: ipAddresses?.total, + loading: loadingIPs, + icon: Hash, + href: "/ipam/ip-addresses", + }, + { + title: "VRFs", + value: vrfs?.total, + loading: loadingVRFs, + icon: Layers, + href: "/ipam/vrfs", + }, + { + title: "VLANs", + value: vlans?.total, + loading: loadingVLANs, + icon: Globe, + href: "/ipam/vlans", + }, + ]; + + return ( + <div className="space-y-6"> + <h1 className="text-3xl font-bold tracking-tight">IPAM Dashboard</h1> + + <div className="grid gap-4 sm:grid-cols-2 lg:grid-cols-4"> + {stats.map((stat) => ( + <Link key={stat.title} href={stat.href}> + <Card className="transition-colors hover:bg-accent/50"> + <CardHeader className="flex flex-row items-center justify-between pb-2"> + <CardTitle className="text-sm font-medium">{stat.title}</CardTitle> + <stat.icon className="h-4 w-4 text-muted-foreground" /> + </CardHeader> + <CardContent> + {stat.loading ? ( + <Skeleton className="h-8 w-16" /> + ) : ( + <div className="text-2xl font-bold">{stat.value ?? 
0}</div> + )} + <p className="text-xs text-muted-foreground">Total count</p> + </CardContent> + </Card> + </Link> + ))} + </div> + + <RecentChanges /> + </div> + ); +} + +function RecentChanges() { + const { data, isLoading } = useQuery({ + queryKey: ["dashboard-recent-changes"], + queryFn: () => changelogApi.list({ limit: 10, offset: 0 }), + retry: false, + }); + + const entries = data?.items ?? []; + + return ( + <Card> + <CardHeader> + <CardTitle>Recent Changes</CardTitle> + </CardHeader> + <CardContent> + {isLoading ? ( + <div className="space-y-3"> + {Array.from({ length: 5 }).map((_, i) => ( + <div key={i} className="flex items-center gap-3"> + <Skeleton className="h-5 w-16" /> + <Skeleton className="h-4 w-48" /> + <Skeleton className="ml-auto h-4 w-24" /> + </div> + ))} + </div> + ) : entries.length === 0 ? ( + <p className="py-4 text-center text-sm text-muted-foreground">No recent changes.</p> + ) : ( + <div className="space-y-2"> + {entries.map((entry) => ( + <div key={entry.id} className="flex items-center gap-3 text-sm"> + <span + className={cn( + "inline-flex items-center rounded-md px-2 py-0.5 text-xs font-medium", + ACTION_STYLES[entry.action] ?? + "bg-gray-100 text-gray-800 dark:bg-gray-800 dark:text-gray-200", + )} + > + {entry.action} + </span> + <span className="truncate"> + {entry.aggregate_type} <span className="text-muted-foreground">{entry.event_type}</span> + </span> + <span className="ml-auto shrink-0 text-xs text-muted-foreground"> + {new Date(entry.timestamp).toLocaleString()} + </span> + </div> + ))} + </div> + )} + </CardContent> + </Card> + ); +} diff --git a/frontend/apps/client/src/app/favicon.ico b/frontend/apps/client/src/app/favicon.ico new file mode 100644 index 0000000..718d6fe Binary files /dev/null and b/frontend/apps/client/src/app/favicon.ico differ diff --git a/frontend/apps/client/src/app/globals.css b/frontend/apps/client/src/app/globals.css new file mode 100644 index 0000000..c56032b --- /dev/null +++ b/frontend/apps/client/src/app/globals.css @@ -0,0 +1,130 @@ +@import "tailwindcss"; +@import "tw-animate-css"; +@import "shadcn/tailwind.css"; + +@custom-variant dark (&:is(.dark *)); + +@theme inline { + --color-background: var(--background); + --color-foreground: var(--foreground); + --font-sans: var(--font-sans); + --font-mono: var(--font-geist-mono); + --font-heading: var(--font-sans); + --color-sidebar-ring: var(--sidebar-ring); + --color-sidebar-border: var(--sidebar-border); + --color-sidebar-accent-foreground: var(--sidebar-accent-foreground); + --color-sidebar-accent: var(--sidebar-accent); + --color-sidebar-primary-foreground: var(--sidebar-primary-foreground); + --color-sidebar-primary: var(--sidebar-primary); + --color-sidebar-foreground: var(--sidebar-foreground); + --color-sidebar: var(--sidebar); + --color-chart-5: var(--chart-5); + --color-chart-4: var(--chart-4); + --color-chart-3: var(--chart-3); + --color-chart-2: var(--chart-2); + --color-chart-1: var(--chart-1); + --color-ring: var(--ring); + --color-input: var(--input); + --color-border: var(--border); + --color-destructive: var(--destructive); + --color-accent-foreground: var(--accent-foreground); + --color-accent: var(--accent); + --color-muted-foreground: var(--muted-foreground); + --color-muted: var(--muted); + --color-secondary-foreground: var(--secondary-foreground); + --color-secondary: var(--secondary); + --color-primary-foreground: var(--primary-foreground); + --color-primary: var(--primary); + --color-popover-foreground: var(--popover-foreground); + --color-popover: 
var(--popover); + --color-card-foreground: var(--card-foreground); + --color-card: var(--card); + --radius-sm: calc(var(--radius) * 0.6); + --radius-md: calc(var(--radius) * 0.8); + --radius-lg: var(--radius); + --radius-xl: calc(var(--radius) * 1.4); + --radius-2xl: calc(var(--radius) * 1.8); + --radius-3xl: calc(var(--radius) * 2.2); + --radius-4xl: calc(var(--radius) * 2.6); +} + +:root { + --background: oklch(1 0 0); + --foreground: oklch(0.145 0 0); + --card: oklch(1 0 0); + --card-foreground: oklch(0.145 0 0); + --popover: oklch(1 0 0); + --popover-foreground: oklch(0.145 0 0); + --primary: oklch(0.205 0 0); + --primary-foreground: oklch(0.985 0 0); + --secondary: oklch(0.97 0 0); + --secondary-foreground: oklch(0.205 0 0); + --muted: oklch(0.97 0 0); + --muted-foreground: oklch(0.556 0 0); + --accent: oklch(0.97 0 0); + --accent-foreground: oklch(0.205 0 0); + --destructive: oklch(0.577 0.245 27.325); + --border: oklch(0.922 0 0); + --input: oklch(0.922 0 0); + --ring: oklch(0.708 0 0); + --chart-1: oklch(0.87 0 0); + --chart-2: oklch(0.556 0 0); + --chart-3: oklch(0.439 0 0); + --chart-4: oklch(0.371 0 0); + --chart-5: oklch(0.269 0 0); + --radius: 0.625rem; + --sidebar: oklch(0.985 0 0); + --sidebar-foreground: oklch(0.145 0 0); + --sidebar-primary: oklch(0.205 0 0); + --sidebar-primary-foreground: oklch(0.985 0 0); + --sidebar-accent: oklch(0.97 0 0); + --sidebar-accent-foreground: oklch(0.205 0 0); + --sidebar-border: oklch(0.922 0 0); + --sidebar-ring: oklch(0.708 0 0); +} + +.dark { + --background: oklch(0.145 0 0); + --foreground: oklch(0.985 0 0); + --card: oklch(0.205 0 0); + --card-foreground: oklch(0.985 0 0); + --popover: oklch(0.205 0 0); + --popover-foreground: oklch(0.985 0 0); + --primary: oklch(0.922 0 0); + --primary-foreground: oklch(0.205 0 0); + --secondary: oklch(0.269 0 0); + --secondary-foreground: oklch(0.985 0 0); + --muted: oklch(0.269 0 0); + --muted-foreground: oklch(0.708 0 0); + --accent: oklch(0.269 0 0); + --accent-foreground: oklch(0.985 0 0); + --destructive: oklch(0.704 0.191 22.216); + --border: oklch(1 0 0 / 10%); + --input: oklch(1 0 0 / 15%); + --ring: oklch(0.556 0 0); + --chart-1: oklch(0.87 0 0); + --chart-2: oklch(0.556 0 0); + --chart-3: oklch(0.439 0 0); + --chart-4: oklch(0.371 0 0); + --chart-5: oklch(0.269 0 0); + --sidebar: oklch(0.205 0 0); + --sidebar-foreground: oklch(0.985 0 0); + --sidebar-primary: oklch(0.488 0.243 264.376); + --sidebar-primary-foreground: oklch(0.985 0 0); + --sidebar-accent: oklch(0.269 0 0); + --sidebar-accent-foreground: oklch(0.985 0 0); + --sidebar-border: oklch(1 0 0 / 10%); + --sidebar-ring: oklch(0.556 0 0); +} + +@layer base { + * { + @apply border-border outline-ring/50; + } + body { + @apply bg-background text-foreground; + } + html { + @apply font-sans; + } +} \ No newline at end of file diff --git a/frontend/apps/client/src/app/layout.tsx b/frontend/apps/client/src/app/layout.tsx new file mode 100644 index 0000000..d2b4c78 --- /dev/null +++ b/frontend/apps/client/src/app/layout.tsx @@ -0,0 +1,35 @@ +import type { Metadata } from "next"; +import { Geist, Geist_Mono } from "next/font/google"; +import { Providers } from "@/components/providers"; +import "./globals.css"; + +const geistSans = Geist({ + variable: "--font-geist-sans", + subsets: ["latin"], +}); + +const geistMono = Geist_Mono({ + variable: "--font-geist-mono", + subsets: ["latin"], +}); + +export const metadata: Metadata = { + title: "CMDB", + description: "Configuration Management Database", +}; + +export default function RootLayout({ + 
children, +}: Readonly<{ + children: React.ReactNode; +}>) { + return ( + <html lang="en" suppressHydrationWarning> + <body + className={`${geistSans.variable} ${geistMono.variable} antialiased`} + > + <Providers>{children}</Providers> + </body> + </html> + ); +} diff --git a/frontend/apps/client/src/components/auth-guard.tsx b/frontend/apps/client/src/components/auth-guard.tsx new file mode 100644 index 0000000..72e66bb --- /dev/null +++ b/frontend/apps/client/src/components/auth-guard.tsx @@ -0,0 +1,50 @@ +"use client"; + +import { useAuth } from "@cmdb/shared"; +import { getSetupStatus } from "@cmdb/shared/lib/setup"; +import { useRouter } from "next/navigation"; +import { useEffect, useState } from "react"; +import type { ReactNode } from "react"; + +export function AuthGuard({ children }: { children: ReactNode }) { + const { isAuthenticated, isLoading } = useAuth(); + const router = useRouter(); + const [checkingSetup, setCheckingSetup] = useState(true); + + useEffect(() => { + async function checkSetup() { + try { + const status = await getSetupStatus(); + if (!status.initialized) { + router.replace("/setup"); + return; + } + } catch { + // Setup endpoint unavailable — assume initialized + } finally { + setCheckingSetup(false); + } + } + checkSetup(); + }, [router]); + + useEffect(() => { + if (!checkingSetup && !isLoading && !isAuthenticated) { + router.replace("/login"); + } + }, [isAuthenticated, isLoading, checkingSetup, router]); + + if (isLoading || checkingSetup) { + return ( + <div className="flex h-screen items-center justify-center"> + <div className="text-muted-foreground">Loading...</div> + </div> + ); + } + + if (!isAuthenticated) { + return null; + } + + return <>{children}</>; +} diff --git a/frontend/apps/client/src/components/data-table.tsx b/frontend/apps/client/src/components/data-table.tsx new file mode 100644 index 0000000..761cd25 --- /dev/null +++ b/frontend/apps/client/src/components/data-table.tsx @@ -0,0 +1,159 @@ +"use client"; + +import { + type ColumnDef, + flexRender, + getCoreRowModel, + useReactTable, +} from "@tanstack/react-table"; +import { + Table, + TableBody, + TableCell, + TableHead, + TableHeader, + TableRow, +} from "@/components/ui/table"; +import { Button } from "@/components/ui/button"; +import { Skeleton } from "@/components/ui/skeleton"; +import { ChevronLeft, ChevronRight } from "lucide-react"; + +interface DataTableProps<TData, TValue> { + columns: ColumnDef<TData, TValue>[]; + data: TData[]; + pagination?: { + offset: number; + limit: number; + total: number; + }; + onPaginationChange?: (offset: number) => void; + isLoading?: boolean; + onRowClick?: (row: TData) => void; +} + +export function DataTable<TData, TValue>({ + columns, + data, + pagination, + onPaginationChange, + isLoading, + onRowClick, +}: DataTableProps<TData, TValue>) { + const table = useReactTable({ + data, + columns, + getCoreRowModel: getCoreRowModel(), + }); + + const currentPage = pagination + ? Math.floor(pagination.offset / pagination.limit) + 1 + : 1; + const totalPages = pagination + ? Math.ceil(pagination.total / pagination.limit) + : 1; + + return ( + <div className="space-y-4"> + <div className="rounded-lg border"> + <Table> + <TableHeader> + {table.getHeaderGroups().map((headerGroup) => ( + <TableRow key={headerGroup.id}> + {headerGroup.headers.map((header) => ( + <TableHead key={header.id}> + {header.isPlaceholder + ? 
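/* placeholder headers from grouped columns render nothing */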
null + : flexRender( + header.column.columnDef.header, + header.getContext(), + )} + </TableHead> + ))} + </TableRow> + ))} + </TableHeader> + <TableBody> + {isLoading ? ( + Array.from({ length: pagination?.limit ?? 10 }).map((_, i) => ( + <TableRow key={i}> + {columns.map((_, j) => ( + <TableCell key={j}> + <Skeleton className="h-4 w-full" /> + </TableCell> + ))} + </TableRow> + )) + ) : table.getRowModel().rows.length ? ( + table.getRowModel().rows.map((row) => ( + <TableRow + key={row.id} + className={onRowClick ? "cursor-pointer" : ""} + onClick={() => onRowClick?.(row.original)} + > + {row.getVisibleCells().map((cell) => ( + <TableCell key={cell.id}> + {flexRender( + cell.column.columnDef.cell, + cell.getContext(), + )} + </TableCell> + ))} + </TableRow> + )) + ) : ( + <TableRow> + <TableCell + colSpan={columns.length} + className="h-24 text-center" + > + No results. + </TableCell> + </TableRow> + )} + </TableBody> + </Table> + </div> + + {pagination && totalPages > 1 && ( + <div className="flex items-center justify-between px-2"> + <p className="text-sm text-muted-foreground"> + Showing {pagination.offset + 1}- + {Math.min(pagination.offset + pagination.limit, pagination.total)}{" "} + of {pagination.total} + </p> + <div className="flex items-center gap-2"> + <Button + variant="outline" + size="sm" + onClick={() => + onPaginationChange?.( + Math.max(0, pagination.offset - pagination.limit), + ) + } + disabled={pagination.offset === 0} + > + <ChevronLeft /> + Previous + </Button> + <span className="text-sm text-muted-foreground"> + Page {currentPage} of {totalPages} + </span> + <Button + variant="outline" + size="sm" + onClick={() => + onPaginationChange?.(pagination.offset + pagination.limit) + } + disabled={ + pagination.offset + pagination.limit >= pagination.total + } + > + Next + <ChevronRight /> + </Button> + </div> + </div> + )} + </div> + ); +} diff --git a/frontend/apps/client/src/components/history/changelog-timeline.tsx b/frontend/apps/client/src/components/history/changelog-timeline.tsx new file mode 100644 index 0000000..1b33c12 --- /dev/null +++ b/frontend/apps/client/src/components/history/changelog-timeline.tsx @@ -0,0 +1,115 @@ +"use client"; + +import { useState } from "react"; +import { useQuery } from "@tanstack/react-query"; +import { changelogApi } from "@cmdb/shared"; +import { Button } from "@/components/ui/button"; +import { Skeleton } from "@/components/ui/skeleton"; +import { cn } from "@/lib/utils"; + +const ACTION_STYLES: Record<string, { label: string; className: string }> = { + created: { label: "Created", className: "bg-green-100 text-green-800 dark:bg-green-900 dark:text-green-200" }, + updated: { label: "Updated", className: "bg-blue-100 text-blue-800 dark:bg-blue-900 dark:text-blue-200" }, + deleted: { label: "Deleted", className: "bg-red-100 text-red-800 dark:bg-red-900 dark:text-red-200" }, +}; + +interface ChangelogTimelineProps { + objectId: string; +} + +export function ChangelogTimeline({ objectId }: ChangelogTimelineProps) { + const [limit, setLimit] = useState(10); + + const { data, isLoading } = useQuery({ + queryKey: ["changelog", objectId, limit], + queryFn: () => changelogApi.getByObject(objectId, { limit, offset: 0 }), + }); + + if (isLoading) { + return ( + <div className="space-y-4 py-4"> + {Array.from({ length: 3 }).map((_, i) => ( + <div key={i} className="flex gap-4"> + <Skeleton className="h-3 w-3 shrink-0 rounded-full" /> + <div className="flex-1 space-y-2"> + <Skeleton className="h-4 w-32" /> + <Skeleton className="h-3 w-48" /> + 
</div> + </div> + ))} + </div> + ); + } + + const entries = data?.items ?? []; + const total = data?.total ?? 0; + const hasMore = entries.length < total; + + if (entries.length === 0) { + return ( + <p className="py-8 text-center text-sm text-muted-foreground">No history entries found.</p> + ); + } + + return ( + <div className="py-4"> + <div className="relative"> + {/* Vertical timeline line */} + <div className="absolute left-[5px] top-2 bottom-2 w-px bg-border" /> + + <div className="space-y-6"> + {entries.map((entry) => { + const actionStyle = ACTION_STYLES[entry.action] ?? { + label: entry.action, + className: "bg-gray-100 text-gray-800 dark:bg-gray-800 dark:text-gray-200", + }; + + return ( + <div key={entry.id} className="relative flex gap-4 pl-0"> + {/* Timeline dot */} + <div + className={cn( + "relative z-10 mt-1.5 h-3 w-3 shrink-0 rounded-full border-2 border-background", + entry.action === "created" && "bg-green-500", + entry.action === "updated" && "bg-blue-500", + entry.action === "deleted" && "bg-red-500", + !["created", "updated", "deleted"].includes(entry.action) && "bg-gray-500", + )} + /> + + {/* Content */} + <div className="flex-1 min-w-0"> + <div className="flex items-center gap-2 flex-wrap"> + <span className={cn("inline-flex items-center rounded-md px-2 py-0.5 text-xs font-medium", actionStyle.className)}> + {actionStyle.label} + </span> + <span className="text-xs text-muted-foreground"> + {entry.event_type} + </span> + </div> + <div className="mt-1 flex items-center gap-2 text-xs text-muted-foreground"> + <time>{new Date(entry.timestamp).toLocaleString()}</time> + {entry.user_id && ( + <> + <span>by</span> + <span className="font-medium text-foreground">{entry.user_id}</span> + </> + )} + </div> + </div> + </div> + ); + })} + </div> + </div> + + {hasMore && ( + <div className="mt-4 text-center"> + <Button variant="outline" size="sm" onClick={() => setLimit((prev) => prev + 10)}> + Load more + </Button> + </div> + )} + </div> + ); +} diff --git a/frontend/apps/client/src/components/journal/journal-entries.tsx b/frontend/apps/client/src/components/journal/journal-entries.tsx new file mode 100644 index 0000000..fad84f1 --- /dev/null +++ b/frontend/apps/client/src/components/journal/journal-entries.tsx @@ -0,0 +1,163 @@ +"use client"; + +import { useState } from "react"; +import { useQuery, useMutation, useQueryClient } from "@tanstack/react-query"; +import { journalApi } from "@cmdb/shared"; +import type { JournalEntry } from "@cmdb/shared"; +import { Button } from "@/components/ui/button"; +import { Textarea } from "@/components/ui/textarea"; +import { + Select, + SelectContent, + SelectItem, + SelectTrigger, + SelectValue, +} from "@/components/ui/select"; +import { Skeleton } from "@/components/ui/skeleton"; +import { cn } from "@/lib/utils"; +import { Plus, X } from "lucide-react"; + +const ENTRY_TYPE_STYLES: Record<string, { label: string; className: string }> = { + info: { label: "Info", className: "bg-blue-100 text-blue-800 dark:bg-blue-900 dark:text-blue-200" }, + success: { label: "Success", className: "bg-green-100 text-green-800 dark:bg-green-900 dark:text-green-200" }, + warning: { label: "Warning", className: "bg-yellow-100 text-yellow-800 dark:bg-yellow-900 dark:text-yellow-200" }, + danger: { label: "Danger", className: "bg-red-100 text-red-800 dark:bg-red-900 dark:text-red-200" }, +}; + +interface JournalEntriesProps { + objectType: string; + objectId: string; +} + +export function JournalEntries({ objectType, objectId }: JournalEntriesProps) { + 
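// Notes widget for a single CMDB object: an "add note" form above the entry list.
+ // Illustrative usage from a detail page: <JournalEntries objectType="prefix" objectId={id} />. +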
const queryClient = useQueryClient(); + const [entryType, setEntryType] = useState<string>("info"); + const [comment, setComment] = useState(""); + + const { data, isLoading } = useQuery({ + queryKey: ["journal", objectType, objectId], + queryFn: () => journalApi.list({ object_type: objectType, object_id: objectId }), + }); + + const createMutation = useMutation({ + mutationFn: (data: { object_type: string; object_id: string; entry_type: string; comment: string }) => + journalApi.create(data), + onSuccess: () => { + queryClient.invalidateQueries({ queryKey: ["journal", objectType, objectId] }); + setComment(""); + setEntryType("info"); + }, + }); + + const deleteMutation = useMutation({ + mutationFn: (id: string) => journalApi.delete(id), + onSuccess: () => { + queryClient.invalidateQueries({ queryKey: ["journal", objectType, objectId] }); + }, + }); + + function handleSubmit(e: React.FormEvent) { + e.preventDefault(); + if (!comment.trim()) return; + createMutation.mutate({ + object_type: objectType, + object_id: objectId, + entry_type: entryType, + comment: comment.trim(), + }); + } + + if (isLoading) { + return ( + <div className="space-y-4 py-4"> + <Skeleton className="h-24 w-full" /> + <Skeleton className="h-16 w-full" /> + <Skeleton className="h-16 w-full" /> + </div> + ); + } + + const entries = data?.items ?? []; + + return ( + <div className="space-y-4 py-4"> + {/* Add Note Form */} + <form onSubmit={handleSubmit} className="space-y-3 rounded-lg border p-4"> + <p className="text-sm font-medium">Add Note</p> + <div className="flex gap-2"> + <Select value={entryType} onValueChange={(val) => val && setEntryType(val)}> + <SelectTrigger className="w-32"> + <SelectValue /> + </SelectTrigger> + <SelectContent> + <SelectItem value="info">Info</SelectItem> + <SelectItem value="success">Success</SelectItem> + <SelectItem value="warning">Warning</SelectItem> + <SelectItem value="danger">Danger</SelectItem> + </SelectContent> + </Select> + </div> + <Textarea + value={comment} + onChange={(e) => setComment(e.target.value)} + placeholder="Write a note..." + rows={3} + /> + <Button type="submit" size="sm" disabled={createMutation.isPending || !comment.trim()}> + <Plus className="h-4 w-4" /> + {createMutation.isPending ? "Adding..." : "Add Note"} + </Button> + </form> + + {/* Entries List */} + {entries.length === 0 ? ( + <p className="py-4 text-center text-sm text-muted-foreground">No journal entries yet.</p> + ) : ( + <div className="space-y-3"> + {entries.map((entry: JournalEntry) => { + const style = ENTRY_TYPE_STYLES[entry.entry_type] ?? 
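// unknown entry types fall back to the "info" styling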
ENTRY_TYPE_STYLES.info; + + return ( + <div + key={entry.id} + className="group relative flex gap-3 rounded-lg border p-3" + > + <div className="flex-1 min-w-0"> + <div className="flex items-center gap-2 mb-1"> + <span + className={cn( + "inline-flex items-center rounded-md px-2 py-0.5 text-xs font-medium", + style.className, + )} + > + {style.label} + </span> + <span className="text-xs text-muted-foreground"> + {new Date(entry.created_at).toLocaleString()} + </span> + {entry.user_id && ( + <span className="text-xs text-muted-foreground"> + by {entry.user_id} + </span> + )} + </div> + <p className="text-sm whitespace-pre-wrap">{entry.comment}</p> + </div> + <Button + variant="ghost" + size="icon-sm" + className="shrink-0 opacity-0 group-hover:opacity-100 transition-opacity" + onClick={() => deleteMutation.mutate(entry.id)} + disabled={deleteMutation.isPending} + > + <X className="h-3 w-3" /> + <span className="sr-only">Delete</span> + </Button> + </div> + ); + })} + </div> + )} + </div> + ); +} diff --git a/frontend/apps/client/src/components/layout/header.tsx b/frontend/apps/client/src/components/layout/header.tsx new file mode 100644 index 0000000..7c88b62 --- /dev/null +++ b/frontend/apps/client/src/components/layout/header.tsx @@ -0,0 +1,80 @@ +"use client"; + +import { useAuth } from "@cmdb/shared"; +import { useTheme } from "next-themes"; +import { LogOut, Menu, Moon, Search, Sun, User } from "lucide-react"; +import { Button } from "@/components/ui/button"; +import { + DropdownMenu, + DropdownMenuContent, + DropdownMenuItem, + DropdownMenuSeparator, + DropdownMenuTrigger, +} from "@/components/ui/dropdown-menu"; +import { GlobalSearch } from "@/components/search/global-search"; + +interface HeaderProps { + onMenuClick: () => void; +} + +export function Header({ onMenuClick }: HeaderProps) { + const { user, logout } = useAuth(); + const { theme, setTheme } = useTheme(); + + return ( + <header className="flex h-14 items-center gap-4 border-b bg-background px-4 lg:px-6"> + <Button + variant="ghost" + size="icon" + className="lg:hidden" + onClick={onMenuClick} + > + <Menu className="h-5 w-5" /> + <span className="sr-only">Toggle menu</span> + </Button> + + <div className="flex-1" /> + + <GlobalSearch /> + <Button variant="outline" size="sm" className="hidden gap-2 sm:flex"> + <Search className="h-4 w-4" /> + <span className="text-muted-foreground">Search...</span> + <kbd className="pointer-events-none ml-2 hidden select-none rounded border bg-muted px-1.5 py-0.5 font-mono text-xs text-muted-foreground sm:inline-block"> + ⌘K + </kbd> + </Button> + + <Button + variant="ghost" + size="icon" + onClick={() => setTheme(theme === "dark" ? 
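// two-state toggle: an initial "system" theme is not "dark", so it becomes "light"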
"light" : "dark")} + > + <Sun className="h-4 w-4 rotate-0 scale-100 transition-all dark:-rotate-90 dark:scale-0" /> + <Moon className="absolute h-4 w-4 rotate-90 scale-0 transition-all dark:rotate-0 dark:scale-100" /> + <span className="sr-only">Toggle theme</span> + </Button> + + <DropdownMenu> + <DropdownMenuTrigger className="inline-flex items-center justify-center gap-2 whitespace-nowrap rounded-md text-sm font-medium transition-colors focus-visible:outline-none focus-visible:ring-1 focus-visible:ring-ring disabled:pointer-events-none disabled:opacity-50 hover:bg-accent hover:text-accent-foreground h-9 w-9 cursor-pointer"> + <User className="h-4 w-4" /> + <span className="sr-only">User menu</span> + </DropdownMenuTrigger> + <DropdownMenuContent align="end"> + {user && ( + <> + <div className="px-2 py-1.5"> + <p className="text-sm font-medium">{user.username}</p> + <p className="text-xs text-muted-foreground">{user.email}</p> + </div> + <DropdownMenuSeparator /> + </> + )} + <DropdownMenuItem onClick={logout}> + <LogOut className="mr-2 h-4 w-4" /> + Sign out + </DropdownMenuItem> + </DropdownMenuContent> + </DropdownMenu> + </header> + ); +} diff --git a/frontend/apps/client/src/components/layout/sidebar.tsx b/frontend/apps/client/src/components/layout/sidebar.tsx new file mode 100644 index 0000000..864db3e --- /dev/null +++ b/frontend/apps/client/src/components/layout/sidebar.tsx @@ -0,0 +1,113 @@ +"use client"; + +import { useState } from "react"; +import Link from "next/link"; +import { usePathname } from "next/navigation"; +import { cn } from "@/lib/utils"; +import { + ChevronDown, + Globe, + Hash, + Home, + Layers, + Network, + Server, + Shield, + Target, + Cable, + FolderTree, +} from "lucide-react"; + +interface NavItem { + label: string; + href: string; + icon: React.ElementType; +} + +const ipamItems: NavItem[] = [ + { label: "Prefixes", href: "/ipam/prefixes", icon: Network }, + { label: "IP Addresses", href: "/ipam/ip-addresses", icon: Hash }, + { label: "IP Ranges", href: "/ipam/ip-ranges", icon: Cable }, + { label: "VRFs", href: "/ipam/vrfs", icon: Layers }, + { label: "VLANs", href: "/ipam/vlans", icon: Server }, + { label: "VLAN Groups", href: "/ipam/vlan-groups", icon: FolderTree }, + { label: "ASNs", href: "/ipam/asns", icon: Globe }, + { label: "RIRs", href: "/ipam/rirs", icon: Shield }, + { label: "Route Targets", href: "/ipam/route-targets", icon: Target }, +]; + +export function Sidebar({ className }: { className?: string }) { + const pathname = usePathname(); + const [ipamOpen, setIpamOpen] = useState( + pathname.startsWith("/ipam"), + ); + + return ( + <aside + className={cn( + "flex h-full w-64 flex-col border-r bg-sidebar text-sidebar-foreground", + className, + )} + > + <div className="flex h-14 items-center border-b px-4"> + <Link href="/" className="flex items-center gap-2 font-semibold"> + <Network className="h-5 w-5" /> + <span>CMDB</span> + </Link> + </div> + + <nav className="flex-1 overflow-y-auto p-3"> + <ul className="space-y-1"> + <li> + <Link + href="/" + className={cn( + "flex items-center gap-3 rounded-md px-3 py-2 text-sm font-medium transition-colors hover:bg-sidebar-accent hover:text-sidebar-accent-foreground", + pathname === "/" && + "bg-sidebar-accent text-sidebar-accent-foreground", + )} + > + <Home className="h-4 w-4" /> + Dashboard + </Link> + </li> + + <li> + <button + onClick={() => setIpamOpen(!ipamOpen)} + className="flex w-full items-center gap-3 rounded-md px-3 py-2 text-sm font-medium transition-colors hover:bg-sidebar-accent 
hover:text-sidebar-accent-foreground" + > + <Network className="h-4 w-4" /> + IPAM + <ChevronDown + className={cn( + "ml-auto h-4 w-4 transition-transform", + ipamOpen && "rotate-180", + )} + /> + </button> + {ipamOpen && ( + <ul className="ml-4 mt-1 space-y-1 border-l pl-3"> + {ipamItems.map((item) => ( + <li key={item.href}> + <Link + href={item.href} + className={cn( + "flex items-center gap-3 rounded-md px-3 py-1.5 text-sm transition-colors hover:bg-sidebar-accent hover:text-sidebar-accent-foreground", + pathname === item.href && + "bg-sidebar-accent font-medium text-sidebar-accent-foreground", + )} + > + <item.icon className="h-4 w-4" /> + {item.label} + </Link> + </li> + ))} + </ul> + )} + </li> + </ul> + </nav> + </aside> + ); +} diff --git a/frontend/apps/client/src/components/providers.tsx b/frontend/apps/client/src/components/providers.tsx new file mode 100644 index 0000000..f52a26b --- /dev/null +++ b/frontend/apps/client/src/components/providers.tsx @@ -0,0 +1,34 @@ +"use client"; + +import { QueryClient, QueryClientProvider } from "@tanstack/react-query"; +import { ThemeProvider } from "next-themes"; +import { AuthProvider } from "@cmdb/shared"; +import { useState } from "react"; +import type { ReactNode } from "react"; + +export function Providers({ children }: { children: ReactNode }) { + const [queryClient] = useState( + () => + new QueryClient({ + defaultOptions: { + queries: { + staleTime: 60 * 1000, + retry: 1, + }, + }, + }), + ); + + return ( + <QueryClientProvider client={queryClient}> + <ThemeProvider + attribute="class" + defaultTheme="system" + enableSystem + disableTransitionOnChange + > + <AuthProvider>{children}</AuthProvider> + </ThemeProvider> + </QueryClientProvider> + ); +} diff --git a/frontend/apps/client/src/components/search/global-search.tsx b/frontend/apps/client/src/components/search/global-search.tsx new file mode 100644 index 0000000..e8c649b --- /dev/null +++ b/frontend/apps/client/src/components/search/global-search.tsx @@ -0,0 +1,169 @@ +"use client"; + +import { useState, useEffect, useCallback, useRef } from "react"; +import { useRouter } from "next/navigation"; +import { useQuery } from "@tanstack/react-query"; +import type { SearchResult } from "@cmdb/shared"; +import { searchApi } from "@cmdb/shared"; +import { + Dialog, + DialogContent, + DialogTitle, +} from "@/components/ui/dialog"; +import { Input } from "@/components/ui/input"; +import { Badge } from "@/components/ui/badge"; +import { cn } from "@/lib/utils"; +import { Loader2, Search } from "lucide-react"; + +const ENTITY_ROUTES: Record<string, string> = { + prefix: "/ipam/prefixes", + ip_address: "/ipam/ip-addresses", + vrf: "/ipam/vrfs", + vlan: "/ipam/vlans", + ip_range: "/ipam/ip-ranges", + rir: "/ipam/rirs", + asn: "/ipam/asns", + fhrp_group: "/ipam/fhrp-groups", + route_target: "/ipam/route-targets", + vlan_group: "/ipam/vlan-groups", + service: "/ipam/services", +}; + +function getEntityRoute(entityType: string, entityId: string): string { + const base = ENTITY_ROUTES[entityType] ?? 
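// types without an explicit route fall back to a naive plural path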
`/ipam/${entityType}s`; + return `${base}/${entityId}`; +} + +function groupByEntityType(results: SearchResult[]): Record<string, SearchResult[]> { + const grouped: Record<string, SearchResult[]> = {}; + for (const result of results) { + const key = result.entity_type; + if (!grouped[key]) grouped[key] = []; + grouped[key].push(result); + } + return grouped; +} + +export function GlobalSearch() { + const [open, setOpen] = useState(false); + const [query, setQuery] = useState(""); + const [debouncedQuery, setDebouncedQuery] = useState(""); + const router = useRouter(); + const inputRef = useRef<HTMLInputElement>(null); + + // Debounce the query + useEffect(() => { + const timer = setTimeout(() => { + setDebouncedQuery(query); + }, 300); + return () => clearTimeout(timer); + }, [query]); + + // Cmd+K shortcut + useEffect(() => { + function onKeyDown(e: KeyboardEvent) { + if ((e.metaKey || e.ctrlKey) && e.key === "k") { + e.preventDefault(); + setOpen((prev) => !prev); + } + } + document.addEventListener("keydown", onKeyDown); + return () => document.removeEventListener("keydown", onKeyDown); + }, []); + + // Focus input when dialog opens + useEffect(() => { + if (open) { + setTimeout(() => inputRef.current?.focus(), 50); + } + }, [open]); + + const handleOpenChange = useCallback((nextOpen: boolean) => { + setOpen(nextOpen); + if (!nextOpen) { + setQuery(""); + setDebouncedQuery(""); + } + }, []); + + const { data, isLoading } = useQuery({ + queryKey: ["global-search", debouncedQuery], + queryFn: () => searchApi.search(debouncedQuery), + enabled: debouncedQuery.length >= 2, + }); + + const navigateToResult = useCallback( + (result: SearchResult) => { + setOpen(false); + router.push(getEntityRoute(result.entity_type, result.entity_id)); + }, + [router], + ); + + const grouped = data?.results ? groupByEntityType(data.results) : {}; + const hasResults = Object.keys(grouped).length > 0; + const showNoResults = debouncedQuery.length >= 2 && !isLoading && !hasResults; + + return ( + <Dialog open={open} onOpenChange={handleOpenChange}> + <DialogContent + className="sm:max-w-lg top-[20%] translate-y-0" + showCloseButton={false} + > + <DialogTitle className="sr-only">Global Search</DialogTitle> + <div className="flex items-center gap-2 border-b pb-3"> + <Search className="h-4 w-4 shrink-0 text-muted-foreground" /> + <Input + ref={inputRef} + value={query} + onChange={(e) => setQuery(e.target.value)} + placeholder="Search prefixes, IPs, VRFs..." + className="border-0 p-0 shadow-none focus-visible:ring-0" + /> + {isLoading && <Loader2 className="h-4 w-4 shrink-0 animate-spin text-muted-foreground" />} + </div> + + <div className="max-h-80 overflow-y-auto"> + {showNoResults && ( + <p className="py-6 text-center text-sm text-muted-foreground">No results found.</p> + )} + + {debouncedQuery.length < 2 && !hasResults && ( + <p className="py-6 text-center text-sm text-muted-foreground"> + Type at least 2 characters to search. 
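{/* threshold matches the enabled guard on the search query */}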
+ </p> + )} + + {Object.entries(grouped).map(([entityType, results]) => ( + <div key={entityType} className="mb-2"> + <p className="mb-1 px-2 text-xs font-medium uppercase tracking-wider text-muted-foreground"> + {entityType.replace(/_/g, " ")} + </p> + {results.map((result) => ( + <button + key={`${result.entity_type}-${result.entity_id}`} + onClick={() => navigateToResult(result)} + className={cn( + "flex w-full items-center gap-3 rounded-md px-2 py-2 text-left text-sm", + "hover:bg-accent hover:text-accent-foreground", + "focus:bg-accent focus:text-accent-foreground focus:outline-none", + )} + > + <div className="flex-1 min-w-0"> + <p className="truncate font-medium">{result.display_text}</p> + {result.description && ( + <p className="truncate text-xs text-muted-foreground"> + {result.description} + </p> + )} + </div> + <Badge variant="secondary">{result.entity_type.replace(/_/g, " ")}</Badge> + </button> + ))} + </div> + ))} + </div> + </DialogContent> + </Dialog> + ); +} diff --git a/frontend/apps/client/src/components/status-badge.tsx b/frontend/apps/client/src/components/status-badge.tsx new file mode 100644 index 0000000..409c9c3 --- /dev/null +++ b/frontend/apps/client/src/components/status-badge.tsx @@ -0,0 +1,25 @@ +"use client"; + +import { Badge } from "@/components/ui/badge"; +import { cn } from "@/lib/utils"; + +const statusColors: Record<string, string> = { + active: "bg-green-100 text-green-800 dark:bg-green-900/30 dark:text-green-400", + reserved: "bg-yellow-100 text-yellow-800 dark:bg-yellow-900/30 dark:text-yellow-400", + deprecated: "bg-red-100 text-red-800 dark:bg-red-900/30 dark:text-red-400", + container: "bg-blue-100 text-blue-800 dark:bg-blue-900/30 dark:text-blue-400", +}; + +export function StatusBadge({ status }: { status: string }) { + return ( + <Badge + variant="outline" + className={cn( + "border-transparent capitalize", + statusColors[status] ?? 
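// unmapped statuses get neutral secondary colors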
"bg-secondary text-secondary-foreground", + )} + > + {status} + </Badge> + ); +} diff --git a/frontend/apps/client/src/components/ui/badge.tsx b/frontend/apps/client/src/components/ui/badge.tsx new file mode 100644 index 0000000..b20959d --- /dev/null +++ b/frontend/apps/client/src/components/ui/badge.tsx @@ -0,0 +1,52 @@ +import { mergeProps } from "@base-ui/react/merge-props" +import { useRender } from "@base-ui/react/use-render" +import { cva, type VariantProps } from "class-variance-authority" + +import { cn } from "@/lib/utils" + +const badgeVariants = cva( + "group/badge inline-flex h-5 w-fit shrink-0 items-center justify-center gap-1 overflow-hidden rounded-4xl border border-transparent px-2 py-0.5 text-xs font-medium whitespace-nowrap transition-all focus-visible:border-ring focus-visible:ring-[3px] focus-visible:ring-ring/50 has-data-[icon=inline-end]:pr-1.5 has-data-[icon=inline-start]:pl-1.5 aria-invalid:border-destructive aria-invalid:ring-destructive/20 dark:aria-invalid:ring-destructive/40 [&>svg]:pointer-events-none [&>svg]:size-3!", + { + variants: { + variant: { + default: "bg-primary text-primary-foreground [a]:hover:bg-primary/80", + secondary: + "bg-secondary text-secondary-foreground [a]:hover:bg-secondary/80", + destructive: + "bg-destructive/10 text-destructive focus-visible:ring-destructive/20 dark:bg-destructive/20 dark:focus-visible:ring-destructive/40 [a]:hover:bg-destructive/20", + outline: + "border-border text-foreground [a]:hover:bg-muted [a]:hover:text-muted-foreground", + ghost: + "hover:bg-muted hover:text-muted-foreground dark:hover:bg-muted/50", + link: "text-primary underline-offset-4 hover:underline", + }, + }, + defaultVariants: { + variant: "default", + }, + } +) + +function Badge({ + className, + variant = "default", + render, + ...props +}: useRender.ComponentProps<"span"> & VariantProps<typeof badgeVariants>) { + return useRender({ + defaultTagName: "span", + props: mergeProps<"span">( + { + className: cn(badgeVariants({ variant }), className), + }, + props + ), + render, + state: { + slot: "badge", + variant, + }, + }) +} + +export { Badge, badgeVariants } diff --git a/frontend/apps/client/src/components/ui/button.tsx b/frontend/apps/client/src/components/ui/button.tsx new file mode 100644 index 0000000..ded01b2 --- /dev/null +++ b/frontend/apps/client/src/components/ui/button.tsx @@ -0,0 +1,60 @@ +"use client" + +import { Button as ButtonPrimitive } from "@base-ui/react/button" +import { cva, type VariantProps } from "class-variance-authority" + +import { cn } from "@/lib/utils" + +const buttonVariants = cva( + "group/button inline-flex shrink-0 items-center justify-center rounded-lg border border-transparent bg-clip-padding text-sm font-medium whitespace-nowrap transition-all outline-none select-none focus-visible:border-ring focus-visible:ring-3 focus-visible:ring-ring/50 active:translate-y-px disabled:pointer-events-none disabled:opacity-50 aria-invalid:border-destructive aria-invalid:ring-3 aria-invalid:ring-destructive/20 dark:aria-invalid:border-destructive/50 dark:aria-invalid:ring-destructive/40 [&_svg]:pointer-events-none [&_svg]:shrink-0 [&_svg:not([class*='size-'])]:size-4", + { + variants: { + variant: { + default: "bg-primary text-primary-foreground [a]:hover:bg-primary/80", + outline: + "border-border bg-background hover:bg-muted hover:text-foreground aria-expanded:bg-muted aria-expanded:text-foreground dark:border-input dark:bg-input/30 dark:hover:bg-input/50", + secondary: + "bg-secondary text-secondary-foreground 
hover:bg-secondary/80 aria-expanded:bg-secondary aria-expanded:text-secondary-foreground", + ghost: + "hover:bg-muted hover:text-foreground aria-expanded:bg-muted aria-expanded:text-foreground dark:hover:bg-muted/50", + destructive: + "bg-destructive/10 text-destructive hover:bg-destructive/20 focus-visible:border-destructive/40 focus-visible:ring-destructive/20 dark:bg-destructive/20 dark:hover:bg-destructive/30 dark:focus-visible:ring-destructive/40", + link: "text-primary underline-offset-4 hover:underline", + }, + size: { + default: + "h-8 gap-1.5 px-2.5 has-data-[icon=inline-end]:pr-2 has-data-[icon=inline-start]:pl-2", + xs: "h-6 gap-1 rounded-[min(var(--radius-md),10px)] px-2 text-xs in-data-[slot=button-group]:rounded-lg has-data-[icon=inline-end]:pr-1.5 has-data-[icon=inline-start]:pl-1.5 [&_svg:not([class*='size-'])]:size-3", + sm: "h-7 gap-1 rounded-[min(var(--radius-md),12px)] px-2.5 text-[0.8rem] in-data-[slot=button-group]:rounded-lg has-data-[icon=inline-end]:pr-1.5 has-data-[icon=inline-start]:pl-1.5 [&_svg:not([class*='size-'])]:size-3.5", + lg: "h-9 gap-1.5 px-2.5 has-data-[icon=inline-end]:pr-3 has-data-[icon=inline-start]:pl-3", + icon: "size-8", + "icon-xs": + "size-6 rounded-[min(var(--radius-md),10px)] in-data-[slot=button-group]:rounded-lg [&_svg:not([class*='size-'])]:size-3", + "icon-sm": + "size-7 rounded-[min(var(--radius-md),12px)] in-data-[slot=button-group]:rounded-lg", + "icon-lg": "size-9", + }, + }, + defaultVariants: { + variant: "default", + size: "default", + }, + } +) + +function Button({ + className, + variant = "default", + size = "default", + ...props +}: ButtonPrimitive.Props & VariantProps<typeof buttonVariants>) { + return ( + <ButtonPrimitive + data-slot="button" + className={cn(buttonVariants({ variant, size, className }))} + {...props} + /> + ) +} + +export { Button, buttonVariants } diff --git a/frontend/apps/client/src/components/ui/card.tsx b/frontend/apps/client/src/components/ui/card.tsx new file mode 100644 index 0000000..40cac5f --- /dev/null +++ b/frontend/apps/client/src/components/ui/card.tsx @@ -0,0 +1,103 @@ +import * as React from "react" + +import { cn } from "@/lib/utils" + +function Card({ + className, + size = "default", + ...props +}: React.ComponentProps<"div"> & { size?: "default" | "sm" }) { + return ( + <div + data-slot="card" + data-size={size} + className={cn( + "group/card flex flex-col gap-4 overflow-hidden rounded-xl bg-card py-4 text-sm text-card-foreground ring-1 ring-foreground/10 has-data-[slot=card-footer]:pb-0 has-[>img:first-child]:pt-0 data-[size=sm]:gap-3 data-[size=sm]:py-3 data-[size=sm]:has-data-[slot=card-footer]:pb-0 *:[img:first-child]:rounded-t-xl *:[img:last-child]:rounded-b-xl", + className + )} + {...props} + /> + ) +} + +function CardHeader({ className, ...props }: React.ComponentProps<"div">) { + return ( + <div + data-slot="card-header" + className={cn( + "group/card-header @container/card-header grid auto-rows-min items-start gap-1 rounded-t-xl px-4 group-data-[size=sm]/card:px-3 has-data-[slot=card-action]:grid-cols-[1fr_auto] has-data-[slot=card-description]:grid-rows-[auto_auto] [.border-b]:pb-4 group-data-[size=sm]/card:[.border-b]:pb-3", + className + )} + {...props} + /> + ) +} + +function CardTitle({ className, ...props }: React.ComponentProps<"div">) { + return ( + <div + data-slot="card-title" + className={cn( + "font-heading text-base leading-snug font-medium group-data-[size=sm]/card:text-sm", + className + )} + {...props} + /> + ) +} + +function CardDescription({ className, ...props 
}: React.ComponentProps<"div">) { + return ( + <div + data-slot="card-description" + className={cn("text-sm text-muted-foreground", className)} + {...props} + /> + ) +} + +function CardAction({ className, ...props }: React.ComponentProps<"div">) { + return ( + <div + data-slot="card-action" + className={cn( + "col-start-2 row-span-2 row-start-1 self-start justify-self-end", + className + )} + {...props} + /> + ) +} + +function CardContent({ className, ...props }: React.ComponentProps<"div">) { + return ( + <div + data-slot="card-content" + className={cn("px-4 group-data-[size=sm]/card:px-3", className)} + {...props} + /> + ) +} + +function CardFooter({ className, ...props }: React.ComponentProps<"div">) { + return ( + <div + data-slot="card-footer" + className={cn( + "flex items-center rounded-b-xl border-t bg-muted/50 p-4 group-data-[size=sm]/card:p-3", + className + )} + {...props} + /> + ) +} + +export { + Card, + CardHeader, + CardFooter, + CardTitle, + CardAction, + CardDescription, + CardContent, +} diff --git a/frontend/apps/client/src/components/ui/dialog.tsx b/frontend/apps/client/src/components/ui/dialog.tsx new file mode 100644 index 0000000..0e91f97 --- /dev/null +++ b/frontend/apps/client/src/components/ui/dialog.tsx @@ -0,0 +1,160 @@ +"use client" + +import * as React from "react" +import { Dialog as DialogPrimitive } from "@base-ui/react/dialog" + +import { cn } from "@/lib/utils" +import { Button } from "@/components/ui/button" +import { XIcon } from "lucide-react" + +function Dialog({ ...props }: DialogPrimitive.Root.Props) { + return <DialogPrimitive.Root data-slot="dialog" {...props} /> +} + +function DialogTrigger({ ...props }: DialogPrimitive.Trigger.Props) { + return <DialogPrimitive.Trigger data-slot="dialog-trigger" {...props} /> +} + +function DialogPortal({ ...props }: DialogPrimitive.Portal.Props) { + return <DialogPrimitive.Portal data-slot="dialog-portal" {...props} /> +} + +function DialogClose({ ...props }: DialogPrimitive.Close.Props) { + return <DialogPrimitive.Close data-slot="dialog-close" {...props} /> +} + +function DialogOverlay({ + className, + ...props +}: DialogPrimitive.Backdrop.Props) { + return ( + <DialogPrimitive.Backdrop + data-slot="dialog-overlay" + className={cn( + "fixed inset-0 isolate z-50 bg-black/10 duration-100 supports-backdrop-filter:backdrop-blur-xs data-open:animate-in data-open:fade-in-0 data-closed:animate-out data-closed:fade-out-0", + className + )} + {...props} + /> + ) +} + +function DialogContent({ + className, + children, + showCloseButton = true, + ...props +}: DialogPrimitive.Popup.Props & { + showCloseButton?: boolean +}) { + return ( + <DialogPortal> + <DialogOverlay /> + <DialogPrimitive.Popup + data-slot="dialog-content" + className={cn( + "fixed top-1/2 left-1/2 z-50 grid w-full max-w-[calc(100%-2rem)] -translate-x-1/2 -translate-y-1/2 gap-4 rounded-xl bg-background p-4 text-sm ring-1 ring-foreground/10 duration-100 outline-none sm:max-w-sm data-open:animate-in data-open:fade-in-0 data-open:zoom-in-95 data-closed:animate-out data-closed:fade-out-0 data-closed:zoom-out-95", + className + )} + {...props} + > + {children} + {showCloseButton && ( + <DialogPrimitive.Close + data-slot="dialog-close" + render={ + <Button + variant="ghost" + className="absolute top-2 right-2" + size="icon-sm" + /> + } + > + <XIcon + /> + <span className="sr-only">Close</span> + </DialogPrimitive.Close> + )} + </DialogPrimitive.Popup> + </DialogPortal> + ) +} + +function DialogHeader({ className, ...props }: React.ComponentProps<"div">) { + 
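// layout-only wrapper that stacks DialogTitle and DialogDescription +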
return ( + <div + data-slot="dialog-header" + className={cn("flex flex-col gap-2", className)} + {...props} + /> + ) +} + +function DialogFooter({ + className, + showCloseButton = false, + children, + ...props +}: React.ComponentProps<"div"> & { + showCloseButton?: boolean +}) { + return ( + <div + data-slot="dialog-footer" + className={cn( + "-mx-4 -mb-4 flex flex-col-reverse gap-2 rounded-b-xl border-t bg-muted/50 p-4 sm:flex-row sm:justify-end", + className + )} + {...props} + > + {children} + {showCloseButton && ( + <DialogPrimitive.Close render={<Button variant="outline" />}> + Close + </DialogPrimitive.Close> + )} + </div> + ) +} + +function DialogTitle({ className, ...props }: DialogPrimitive.Title.Props) { + return ( + <DialogPrimitive.Title + data-slot="dialog-title" + className={cn( + "font-heading text-base leading-none font-medium", + className + )} + {...props} + /> + ) +} + +function DialogDescription({ + className, + ...props +}: DialogPrimitive.Description.Props) { + return ( + <DialogPrimitive.Description + data-slot="dialog-description" + className={cn( + "text-sm text-muted-foreground *:[a]:underline *:[a]:underline-offset-3 *:[a]:hover:text-foreground", + className + )} + {...props} + /> + ) +} + +export { + Dialog, + DialogClose, + DialogContent, + DialogDescription, + DialogFooter, + DialogHeader, + DialogOverlay, + DialogPortal, + DialogTitle, + DialogTrigger, +} diff --git a/frontend/apps/client/src/components/ui/dropdown-menu.tsx b/frontend/apps/client/src/components/ui/dropdown-menu.tsx new file mode 100644 index 0000000..9d5ebbd --- /dev/null +++ b/frontend/apps/client/src/components/ui/dropdown-menu.tsx @@ -0,0 +1,268 @@ +"use client" + +import * as React from "react" +import { Menu as MenuPrimitive } from "@base-ui/react/menu" + +import { cn } from "@/lib/utils" +import { ChevronRightIcon, CheckIcon } from "lucide-react" + +function DropdownMenu({ ...props }: MenuPrimitive.Root.Props) { + return <MenuPrimitive.Root data-slot="dropdown-menu" {...props} /> +} + +function DropdownMenuPortal({ ...props }: MenuPrimitive.Portal.Props) { + return <MenuPrimitive.Portal data-slot="dropdown-menu-portal" {...props} /> +} + +function DropdownMenuTrigger({ ...props }: MenuPrimitive.Trigger.Props) { + return <MenuPrimitive.Trigger data-slot="dropdown-menu-trigger" {...props} /> +} + +function DropdownMenuContent({ + align = "start", + alignOffset = 0, + side = "bottom", + sideOffset = 4, + className, + ...props +}: MenuPrimitive.Popup.Props & + Pick< + MenuPrimitive.Positioner.Props, + "align" | "alignOffset" | "side" | "sideOffset" + >) { + return ( + <MenuPrimitive.Portal> + <MenuPrimitive.Positioner + className="isolate z-50 outline-none" + align={align} + alignOffset={alignOffset} + side={side} + sideOffset={sideOffset} + > + <MenuPrimitive.Popup + data-slot="dropdown-menu-content" + className={cn("z-50 max-h-(--available-height) w-(--anchor-width) min-w-32 origin-(--transform-origin) overflow-x-hidden overflow-y-auto rounded-lg bg-popover p-1 text-popover-foreground shadow-md ring-1 ring-foreground/10 duration-100 outline-none data-[side=bottom]:slide-in-from-top-2 data-[side=inline-end]:slide-in-from-left-2 data-[side=inline-start]:slide-in-from-right-2 data-[side=left]:slide-in-from-right-2 data-[side=right]:slide-in-from-left-2 data-[side=top]:slide-in-from-bottom-2 data-open:animate-in data-open:fade-in-0 data-open:zoom-in-95 data-closed:animate-out data-closed:overflow-hidden data-closed:fade-out-0 data-closed:zoom-out-95", className )} + {...props} + /> + 
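{/* Positioner handles placement (side/align); Popup carries styling and animation */} +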
</MenuPrimitive.Positioner> + </MenuPrimitive.Portal> + ) +} + +function DropdownMenuGroup({ ...props }: MenuPrimitive.Group.Props) { + return <MenuPrimitive.Group data-slot="dropdown-menu-group" {...props} /> +} + +function DropdownMenuLabel({ + className, + inset, + ...props +}: MenuPrimitive.GroupLabel.Props & { + inset?: boolean +}) { + return ( + <MenuPrimitive.GroupLabel + data-slot="dropdown-menu-label" + data-inset={inset} + className={cn( + "px-1.5 py-1 text-xs font-medium text-muted-foreground data-inset:pl-7", + className + )} + {...props} + /> + ) +} + +function DropdownMenuItem({ + className, + inset, + variant = "default", + ...props +}: MenuPrimitive.Item.Props & { + inset?: boolean + variant?: "default" | "destructive" +}) { + return ( + <MenuPrimitive.Item + data-slot="dropdown-menu-item" + data-inset={inset} + data-variant={variant} + className={cn( + "group/dropdown-menu-item relative flex cursor-default items-center gap-1.5 rounded-md px-1.5 py-1 text-sm outline-hidden select-none focus:bg-accent focus:text-accent-foreground not-data-[variant=destructive]:focus:**:text-accent-foreground data-inset:pl-7 data-[variant=destructive]:text-destructive data-[variant=destructive]:focus:bg-destructive/10 data-[variant=destructive]:focus:text-destructive dark:data-[variant=destructive]:focus:bg-destructive/20 data-disabled:pointer-events-none data-disabled:opacity-50 [&_svg]:pointer-events-none [&_svg]:shrink-0 [&_svg:not([class*='size-'])]:size-4 data-[variant=destructive]:*:[svg]:text-destructive", + className + )} + {...props} + /> + ) +} + +function DropdownMenuSub({ ...props }: MenuPrimitive.SubmenuRoot.Props) { + return <MenuPrimitive.SubmenuRoot data-slot="dropdown-menu-sub" {...props} /> +} + +function DropdownMenuSubTrigger({ + className, + inset, + children, + ...props +}: MenuPrimitive.SubmenuTrigger.Props & { + inset?: boolean +}) { + return ( + <MenuPrimitive.SubmenuTrigger + data-slot="dropdown-menu-sub-trigger" + data-inset={inset} + className={cn( + "flex cursor-default items-center gap-1.5 rounded-md px-1.5 py-1 text-sm outline-hidden select-none focus:bg-accent focus:text-accent-foreground not-data-[variant=destructive]:focus:**:text-accent-foreground data-inset:pl-7 data-popup-open:bg-accent data-popup-open:text-accent-foreground data-open:bg-accent data-open:text-accent-foreground [&_svg]:pointer-events-none [&_svg]:shrink-0 [&_svg:not([class*='size-'])]:size-4", + className + )} + {...props} + > + {children} + <ChevronRightIcon className="ml-auto" /> + </MenuPrimitive.SubmenuTrigger> + ) +} + +function DropdownMenuSubContent({ + align = "start", + alignOffset = -3, + side = "right", + sideOffset = 0, + className, + ...props +}: React.ComponentProps<typeof DropdownMenuContent>) { + return ( + <DropdownMenuContent + data-slot="dropdown-menu-sub-content" + className={cn("w-auto min-w-[96px] rounded-lg bg-popover p-1 text-popover-foreground shadow-lg ring-1 ring-foreground/10 duration-100 data-[side=bottom]:slide-in-from-top-2 data-[side=left]:slide-in-from-right-2 data-[side=right]:slide-in-from-left-2 data-[side=top]:slide-in-from-bottom-2 data-open:animate-in data-open:fade-in-0 data-open:zoom-in-95 data-closed:animate-out data-closed:fade-out-0 data-closed:zoom-out-95", className )} + align={align} + alignOffset={alignOffset} + side={side} + sideOffset={sideOffset} + {...props} + /> + ) +} + +function DropdownMenuCheckboxItem({ + className, + children, + checked, + inset, + ...props +}: MenuPrimitive.CheckboxItem.Props & { + inset?: boolean +}) { + return ( + 
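// pr-8 on the item reserves space for the absolutely positioned check indicator +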
<MenuPrimitive.CheckboxItem + data-slot="dropdown-menu-checkbox-item" + data-inset={inset} + className={cn( + "relative flex cursor-default items-center gap-1.5 rounded-md py-1 pr-8 pl-1.5 text-sm outline-hidden select-none focus:bg-accent focus:text-accent-foreground focus:**:text-accent-foreground data-inset:pl-7 data-disabled:pointer-events-none data-disabled:opacity-50 [&_svg]:pointer-events-none [&_svg]:shrink-0 [&_svg:not([class*='size-'])]:size-4", + className + )} + checked={checked} + {...props} + > + <span + className="pointer-events-none absolute right-2 flex items-center justify-center" + data-slot="dropdown-menu-checkbox-item-indicator" + > + <MenuPrimitive.CheckboxItemIndicator> + <CheckIcon + /> + </MenuPrimitive.CheckboxItemIndicator> + </span> + {children} + </MenuPrimitive.CheckboxItem> + ) +} + +function DropdownMenuRadioGroup({ ...props }: MenuPrimitive.RadioGroup.Props) { + return ( + <MenuPrimitive.RadioGroup + data-slot="dropdown-menu-radio-group" + {...props} + /> + ) +} + +function DropdownMenuRadioItem({ + className, + children, + inset, + ...props +}: MenuPrimitive.RadioItem.Props & { + inset?: boolean +}) { + return ( + <MenuPrimitive.RadioItem + data-slot="dropdown-menu-radio-item" + data-inset={inset} + className={cn( + "relative flex cursor-default items-center gap-1.5 rounded-md py-1 pr-8 pl-1.5 text-sm outline-hidden select-none focus:bg-accent focus:text-accent-foreground focus:**:text-accent-foreground data-inset:pl-7 data-disabled:pointer-events-none data-disabled:opacity-50 [&_svg]:pointer-events-none [&_svg]:shrink-0 [&_svg:not([class*='size-'])]:size-4", + className + )} + {...props} + > + <span + className="pointer-events-none absolute right-2 flex items-center justify-center" + data-slot="dropdown-menu-radio-item-indicator" + > + <MenuPrimitive.RadioItemIndicator> + <CheckIcon + /> + </MenuPrimitive.RadioItemIndicator> + </span> + {children} + </MenuPrimitive.RadioItem> + ) +} + +function DropdownMenuSeparator({ + className, + ...props +}: MenuPrimitive.Separator.Props) { + return ( + <MenuPrimitive.Separator + data-slot="dropdown-menu-separator" + className={cn("-mx-1 my-1 h-px bg-border", className)} + {...props} + /> + ) +} + +function DropdownMenuShortcut({ + className, + ...props +}: React.ComponentProps<"span">) { + return ( + <span + data-slot="dropdown-menu-shortcut" + className={cn( + "ml-auto text-xs tracking-widest text-muted-foreground group-focus/dropdown-menu-item:text-accent-foreground", + className + )} + {...props} + /> + ) +} + +export { + DropdownMenu, + DropdownMenuPortal, + DropdownMenuTrigger, + DropdownMenuContent, + DropdownMenuGroup, + DropdownMenuLabel, + DropdownMenuItem, + DropdownMenuCheckboxItem, + DropdownMenuRadioGroup, + DropdownMenuRadioItem, + DropdownMenuSeparator, + DropdownMenuShortcut, + DropdownMenuSub, + DropdownMenuSubTrigger, + DropdownMenuSubContent, +} diff --git a/frontend/apps/client/src/components/ui/input.tsx b/frontend/apps/client/src/components/ui/input.tsx new file mode 100644 index 0000000..7d21bab --- /dev/null +++ b/frontend/apps/client/src/components/ui/input.tsx @@ -0,0 +1,20 @@ +import * as React from "react" +import { Input as InputPrimitive } from "@base-ui/react/input" + +import { cn } from "@/lib/utils" + +function Input({ className, type, ...props }: React.ComponentProps<"input">) { + return ( + <InputPrimitive + type={type} + data-slot="input" + className={cn( + "h-8 w-full min-w-0 rounded-lg border border-input bg-transparent px-2.5 py-1 text-base transition-colors outline-none 
file:inline-flex file:h-6 file:border-0 file:bg-transparent file:text-sm file:font-medium file:text-foreground placeholder:text-muted-foreground focus-visible:border-ring focus-visible:ring-3 focus-visible:ring-ring/50 disabled:pointer-events-none disabled:cursor-not-allowed disabled:bg-input/50 disabled:opacity-50 aria-invalid:border-destructive aria-invalid:ring-3 aria-invalid:ring-destructive/20 md:text-sm dark:bg-input/30 dark:disabled:bg-input/80 dark:aria-invalid:border-destructive/50 dark:aria-invalid:ring-destructive/40", + className + )} + {...props} + /> + ) +} + +export { Input } diff --git a/frontend/apps/client/src/components/ui/label.tsx b/frontend/apps/client/src/components/ui/label.tsx new file mode 100644 index 0000000..74da65c --- /dev/null +++ b/frontend/apps/client/src/components/ui/label.tsx @@ -0,0 +1,20 @@ +"use client" + +import * as React from "react" + +import { cn } from "@/lib/utils" + +function Label({ className, ...props }: React.ComponentProps<"label">) { + return ( + <label + data-slot="label" + className={cn( + "flex items-center gap-2 text-sm leading-none font-medium select-none group-data-[disabled=true]:pointer-events-none group-data-[disabled=true]:opacity-50 peer-disabled:cursor-not-allowed peer-disabled:opacity-50", + className + )} + {...props} + /> + ) +} + +export { Label } diff --git a/frontend/apps/client/src/components/ui/select.tsx b/frontend/apps/client/src/components/ui/select.tsx new file mode 100644 index 0000000..e8021f5 --- /dev/null +++ b/frontend/apps/client/src/components/ui/select.tsx @@ -0,0 +1,201 @@ +"use client" + +import * as React from "react" +import { Select as SelectPrimitive } from "@base-ui/react/select" + +import { cn } from "@/lib/utils" +import { ChevronDownIcon, CheckIcon, ChevronUpIcon } from "lucide-react" + +const Select = SelectPrimitive.Root + +function SelectGroup({ className, ...props }: SelectPrimitive.Group.Props) { + return ( + <SelectPrimitive.Group + data-slot="select-group" + className={cn("scroll-my-1 p-1", className)} + {...props} + /> + ) +} + +function SelectValue({ className, ...props }: SelectPrimitive.Value.Props) { + return ( + <SelectPrimitive.Value + data-slot="select-value" + className={cn("flex flex-1 text-left", className)} + {...props} + /> + ) +} + +function SelectTrigger({ + className, + size = "default", + children, + ...props +}: SelectPrimitive.Trigger.Props & { + size?: "sm" | "default" +}) { + return ( + <SelectPrimitive.Trigger + data-slot="select-trigger" + data-size={size} + className={cn( + "flex w-fit items-center justify-between gap-1.5 rounded-lg border border-input bg-transparent py-2 pr-2 pl-2.5 text-sm whitespace-nowrap transition-colors outline-none select-none focus-visible:border-ring focus-visible:ring-3 focus-visible:ring-ring/50 disabled:cursor-not-allowed disabled:opacity-50 aria-invalid:border-destructive aria-invalid:ring-3 aria-invalid:ring-destructive/20 data-placeholder:text-muted-foreground data-[size=default]:h-8 data-[size=sm]:h-7 data-[size=sm]:rounded-[min(var(--radius-md),10px)] *:data-[slot=select-value]:line-clamp-1 *:data-[slot=select-value]:flex *:data-[slot=select-value]:items-center *:data-[slot=select-value]:gap-1.5 dark:bg-input/30 dark:hover:bg-input/50 dark:aria-invalid:border-destructive/50 dark:aria-invalid:ring-destructive/40 [&_svg]:pointer-events-none [&_svg]:shrink-0 [&_svg:not([class*='size-'])]:size-4", + className + )} + {...props} + > + {children} + <SelectPrimitive.Icon + render={ + <ChevronDownIcon className="pointer-events-none size-4 
text-muted-foreground" /> + } + /> + </SelectPrimitive.Trigger> + ) +} + +function SelectContent({ + className, + children, + side = "bottom", + sideOffset = 4, + align = "center", + alignOffset = 0, + alignItemWithTrigger = true, + ...props +}: SelectPrimitive.Popup.Props & + Pick< + SelectPrimitive.Positioner.Props, + "align" | "alignOffset" | "side" | "sideOffset" | "alignItemWithTrigger" + >) { + return ( + <SelectPrimitive.Portal> + <SelectPrimitive.Positioner + side={side} + sideOffset={sideOffset} + align={align} + alignOffset={alignOffset} + alignItemWithTrigger={alignItemWithTrigger} + className="isolate z-50" + > + <SelectPrimitive.Popup + data-slot="select-content" + data-align-trigger={alignItemWithTrigger} + className={cn("relative isolate z-50 max-h-(--available-height) w-(--anchor-width) min-w-36 origin-(--transform-origin) overflow-x-hidden overflow-y-auto rounded-lg bg-popover text-popover-foreground shadow-md ring-1 ring-foreground/10 duration-100 data-[align-trigger=true]:animate-none data-[side=bottom]:slide-in-from-top-2 data-[side=inline-end]:slide-in-from-left-2 data-[side=inline-start]:slide-in-from-right-2 data-[side=left]:slide-in-from-right-2 data-[side=right]:slide-in-from-left-2 data-[side=top]:slide-in-from-bottom-2 data-open:animate-in data-open:fade-in-0 data-open:zoom-in-95 data-closed:animate-out data-closed:fade-out-0 data-closed:zoom-out-95", className )} + {...props} + > + <SelectScrollUpButton /> + <SelectPrimitive.List>{children}</SelectPrimitive.List> + <SelectScrollDownButton /> + </SelectPrimitive.Popup> + </SelectPrimitive.Positioner> + </SelectPrimitive.Portal> + ) +} + +function SelectLabel({ + className, + ...props +}: SelectPrimitive.GroupLabel.Props) { + return ( + <SelectPrimitive.GroupLabel + data-slot="select-label" + className={cn("px-1.5 py-1 text-xs text-muted-foreground", className)} + {...props} + /> + ) +} + +function SelectItem({ + className, + children, + ...props +}: SelectPrimitive.Item.Props) { + return ( + <SelectPrimitive.Item + data-slot="select-item" + className={cn( + "relative flex w-full cursor-default items-center gap-1.5 rounded-md py-1 pr-8 pl-1.5 text-sm outline-hidden select-none focus:bg-accent focus:text-accent-foreground not-data-[variant=destructive]:focus:**:text-accent-foreground data-disabled:pointer-events-none data-disabled:opacity-50 [&_svg]:pointer-events-none [&_svg]:shrink-0 [&_svg:not([class*='size-'])]:size-4 *:[span]:last:flex *:[span]:last:items-center *:[span]:last:gap-2", + className + )} + {...props} + > + <SelectPrimitive.ItemText className="flex flex-1 shrink-0 gap-2 whitespace-nowrap"> + {children} + </SelectPrimitive.ItemText> + <SelectPrimitive.ItemIndicator + render={ + <span className="pointer-events-none absolute right-2 flex size-4 items-center justify-center" /> + } + > + <CheckIcon className="pointer-events-none" /> + </SelectPrimitive.ItemIndicator> + </SelectPrimitive.Item> + ) +} + +function SelectSeparator({ + className, + ...props +}: SelectPrimitive.Separator.Props) { + return ( + <SelectPrimitive.Separator + data-slot="select-separator" + className={cn("pointer-events-none -mx-1 my-1 h-px bg-border", className)} + {...props} + /> + ) +} + +function SelectScrollUpButton({ + className, + ...props +}: React.ComponentProps<typeof SelectPrimitive.ScrollUpArrow>) { + return ( + <SelectPrimitive.ScrollUpArrow + data-slot="select-scroll-up-button" + className={cn( + "top-0 z-10 flex w-full cursor-default items-center justify-center bg-popover py-1 [&_svg:not([class*='size-'])]:size-4", + 
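// caller-supplied classes win via cn()'s tailwind-merge +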
className + )} + {...props} + > + <ChevronUpIcon + /> + </SelectPrimitive.ScrollUpArrow> + ) +} + +function SelectScrollDownButton({ + className, + ...props +}: React.ComponentProps<typeof SelectPrimitive.ScrollDownArrow>) { + return ( + <SelectPrimitive.ScrollDownArrow + data-slot="select-scroll-down-button" + className={cn( + "bottom-0 z-10 flex w-full cursor-default items-center justify-center bg-popover py-1 [&_svg:not([class*='size-'])]:size-4", + className + )} + {...props} + > + <ChevronDownIcon + /> + </SelectPrimitive.ScrollDownArrow> + ) +} + +export { + Select, + SelectContent, + SelectGroup, + SelectItem, + SelectLabel, + SelectScrollDownButton, + SelectScrollUpButton, + SelectSeparator, + SelectTrigger, + SelectValue, +} diff --git a/frontend/apps/client/src/components/ui/separator.tsx b/frontend/apps/client/src/components/ui/separator.tsx new file mode 100644 index 0000000..6e1369e --- /dev/null +++ b/frontend/apps/client/src/components/ui/separator.tsx @@ -0,0 +1,25 @@ +"use client" + +import { Separator as SeparatorPrimitive } from "@base-ui/react/separator" + +import { cn } from "@/lib/utils" + +function Separator({ + className, + orientation = "horizontal", + ...props +}: SeparatorPrimitive.Props) { + return ( + <SeparatorPrimitive + data-slot="separator" + orientation={orientation} + className={cn( + "shrink-0 bg-border data-horizontal:h-px data-horizontal:w-full data-vertical:w-px data-vertical:self-stretch", + className + )} + {...props} + /> + ) +} + +export { Separator } diff --git a/frontend/apps/client/src/components/ui/sheet.tsx b/frontend/apps/client/src/components/ui/sheet.tsx new file mode 100644 index 0000000..428a091 --- /dev/null +++ b/frontend/apps/client/src/components/ui/sheet.tsx @@ -0,0 +1,138 @@ +"use client" + +import * as React from "react" +import { Dialog as SheetPrimitive } from "@base-ui/react/dialog" + +import { cn } from "@/lib/utils" +import { Button } from "@/components/ui/button" +import { XIcon } from "lucide-react" + +function Sheet({ ...props }: SheetPrimitive.Root.Props) { + return <SheetPrimitive.Root data-slot="sheet" {...props} /> +} + +function SheetTrigger({ ...props }: SheetPrimitive.Trigger.Props) { + return <SheetPrimitive.Trigger data-slot="sheet-trigger" {...props} /> +} + +function SheetClose({ ...props }: SheetPrimitive.Close.Props) { + return <SheetPrimitive.Close data-slot="sheet-close" {...props} /> +} + +function SheetPortal({ ...props }: SheetPrimitive.Portal.Props) { + return <SheetPrimitive.Portal data-slot="sheet-portal" {...props} /> +} + +function SheetOverlay({ className, ...props }: SheetPrimitive.Backdrop.Props) { + return ( + <SheetPrimitive.Backdrop + data-slot="sheet-overlay" + className={cn( + "fixed inset-0 z-50 bg-black/10 transition-opacity duration-150 data-ending-style:opacity-0 data-starting-style:opacity-0 supports-backdrop-filter:backdrop-blur-xs", + className + )} + {...props} + /> + ) +} + +function SheetContent({ + className, + children, + side = "right", + showCloseButton = true, + ...props +}: SheetPrimitive.Popup.Props & { + side?: "top" | "right" | "bottom" | "left" + showCloseButton?: boolean +}) { + return ( + <SheetPortal> + <SheetOverlay /> + <SheetPrimitive.Popup + data-slot="sheet-content" + data-side={side} + className={cn( + "fixed z-50 flex flex-col gap-4 bg-background bg-clip-padding text-sm shadow-lg transition duration-200 ease-in-out data-ending-style:opacity-0 data-starting-style:opacity-0 data-[side=bottom]:inset-x-0 data-[side=bottom]:bottom-0 data-[side=bottom]:h-auto 
data-[side=bottom]:border-t data-[side=bottom]:data-ending-style:translate-y-[2.5rem] data-[side=bottom]:data-starting-style:translate-y-[2.5rem] data-[side=left]:inset-y-0 data-[side=left]:left-0 data-[side=left]:h-full data-[side=left]:w-3/4 data-[side=left]:border-r data-[side=left]:data-ending-style:translate-x-[-2.5rem] data-[side=left]:data-starting-style:translate-x-[-2.5rem] data-[side=right]:inset-y-0 data-[side=right]:right-0 data-[side=right]:h-full data-[side=right]:w-3/4 data-[side=right]:border-l data-[side=right]:data-ending-style:translate-x-[2.5rem] data-[side=right]:data-starting-style:translate-x-[2.5rem] data-[side=top]:inset-x-0 data-[side=top]:top-0 data-[side=top]:h-auto data-[side=top]:border-b data-[side=top]:data-ending-style:translate-y-[-2.5rem] data-[side=top]:data-starting-style:translate-y-[-2.5rem] data-[side=left]:sm:max-w-sm data-[side=right]:sm:max-w-sm", + className + )} + {...props} + > + {children} + {showCloseButton && ( + <SheetPrimitive.Close + data-slot="sheet-close" + render={ + <Button + variant="ghost" + className="absolute top-3 right-3" + size="icon-sm" + /> + } + > + <XIcon + /> + <span className="sr-only">Close</span> + </SheetPrimitive.Close> + )} + </SheetPrimitive.Popup> + </SheetPortal> + ) +} + +function SheetHeader({ className, ...props }: React.ComponentProps<"div">) { + return ( + <div + data-slot="sheet-header" + className={cn("flex flex-col gap-0.5 p-4", className)} + {...props} + /> + ) +} + +function SheetFooter({ className, ...props }: React.ComponentProps<"div">) { + return ( + <div + data-slot="sheet-footer" + className={cn("mt-auto flex flex-col gap-2 p-4", className)} + {...props} + /> + ) +} + +function SheetTitle({ className, ...props }: SheetPrimitive.Title.Props) { + return ( + <SheetPrimitive.Title + data-slot="sheet-title" + className={cn( + "font-heading text-base font-medium text-foreground", + className + )} + {...props} + /> + ) +} + +function SheetDescription({ + className, + ...props +}: SheetPrimitive.Description.Props) { + return ( + <SheetPrimitive.Description + data-slot="sheet-description" + className={cn("text-sm text-muted-foreground", className)} + {...props} + /> + ) +} + +export { + Sheet, + SheetTrigger, + SheetClose, + SheetContent, + SheetHeader, + SheetFooter, + SheetTitle, + SheetDescription, +} diff --git a/frontend/apps/client/src/components/ui/skeleton.tsx b/frontend/apps/client/src/components/ui/skeleton.tsx new file mode 100644 index 0000000..0118624 --- /dev/null +++ b/frontend/apps/client/src/components/ui/skeleton.tsx @@ -0,0 +1,13 @@ +import { cn } from "@/lib/utils" + +function Skeleton({ className, ...props }: React.ComponentProps<"div">) { + return ( + <div + data-slot="skeleton" + className={cn("animate-pulse rounded-md bg-muted", className)} + {...props} + /> + ) +} + +export { Skeleton } diff --git a/frontend/apps/client/src/components/ui/table.tsx b/frontend/apps/client/src/components/ui/table.tsx new file mode 100644 index 0000000..8dc13ae --- /dev/null +++ b/frontend/apps/client/src/components/ui/table.tsx @@ -0,0 +1,116 @@ +"use client" + +import * as React from "react" + +import { cn } from "@/lib/utils" + +function Table({ className, ...props }: React.ComponentProps<"table">) { + return ( + <div + data-slot="table-container" + className="relative w-full overflow-x-auto" + > + <table + data-slot="table" + className={cn("w-full caption-bottom text-sm", className)} + {...props} + /> + </div> + ) +} + +function TableHeader({ className, ...props }: React.ComponentProps<"thead">) { + 
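// thin wrapper over <thead>; [&_tr]:border-b keeps header rows ruled +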
return ( + <thead + data-slot="table-header" + className={cn("[&_tr]:border-b", className)} + {...props} + /> + ) +} + +function TableBody({ className, ...props }: React.ComponentProps<"tbody">) { + return ( + <tbody + data-slot="table-body" + className={cn("[&_tr:last-child]:border-0", className)} + {...props} + /> + ) +} + +function TableFooter({ className, ...props }: React.ComponentProps<"tfoot">) { + return ( + <tfoot + data-slot="table-footer" + className={cn( + "border-t bg-muted/50 font-medium [&>tr]:last:border-b-0", + className + )} + {...props} + /> + ) +} + +function TableRow({ className, ...props }: React.ComponentProps<"tr">) { + return ( + <tr + data-slot="table-row" + className={cn( + "border-b transition-colors hover:bg-muted/50 data-[state=selected]:bg-muted", + className + )} + {...props} + /> + ) +} + +function TableHead({ className, ...props }: React.ComponentProps<"th">) { + return ( + <th + data-slot="table-head" + className={cn( + "h-10 px-2 text-left align-middle font-medium whitespace-nowrap text-foreground [&:has([role=checkbox])]:pr-0", + className + )} + {...props} + /> + ) +} + +function TableCell({ className, ...props }: React.ComponentProps<"td">) { + return ( + <td + data-slot="table-cell" + className={cn( + "p-2 align-middle whitespace-nowrap [&:has([role=checkbox])]:pr-0", + className + )} + {...props} + /> + ) +} + +function TableCaption({ + className, + ...props +}: React.ComponentProps<"caption">) { + return ( + <caption + data-slot="table-caption" + className={cn("mt-4 text-sm text-muted-foreground", className)} + {...props} + /> + ) +} + +export { + Table, + TableHeader, + TableBody, + TableFooter, + TableHead, + TableRow, + TableCell, + TableCaption, +} diff --git a/frontend/apps/client/src/components/ui/tabs.tsx b/frontend/apps/client/src/components/ui/tabs.tsx new file mode 100644 index 0000000..56c4288 --- /dev/null +++ b/frontend/apps/client/src/components/ui/tabs.tsx @@ -0,0 +1,82 @@ +"use client" + +import { Tabs as TabsPrimitive } from "@base-ui/react/tabs" +import { cva, type VariantProps } from "class-variance-authority" + +import { cn } from "@/lib/utils" + +function Tabs({ + className, + orientation = "horizontal", + ...props +}: TabsPrimitive.Root.Props) { + return ( + <TabsPrimitive.Root + data-slot="tabs" + data-orientation={orientation} + className={cn( + "group/tabs flex gap-2 data-horizontal:flex-col", + className + )} + {...props} + /> + ) +} + +const tabsListVariants = cva( + "group/tabs-list inline-flex w-fit items-center justify-center rounded-lg p-[3px] text-muted-foreground group-data-horizontal/tabs:h-8 group-data-vertical/tabs:h-fit group-data-vertical/tabs:flex-col data-[variant=line]:rounded-none", + { + variants: { + variant: { + default: "bg-muted", + line: "gap-1 bg-transparent", + }, + }, + defaultVariants: { + variant: "default", + }, + } +) + +function TabsList({ + className, + variant = "default", + ...props +}: TabsPrimitive.List.Props & VariantProps<typeof tabsListVariants>) { + return ( + <TabsPrimitive.List + data-slot="tabs-list" + data-variant={variant} + className={cn(tabsListVariants({ variant }), className)} + {...props} + /> + ) +} + +function TabsTrigger({ className, ...props }: TabsPrimitive.Tab.Props) { + return ( + <TabsPrimitive.Tab + data-slot="tabs-trigger" + className={cn( + "relative inline-flex h-[calc(100%-1px)] flex-1 items-center justify-center gap-1.5 rounded-md border border-transparent px-1.5 py-0.5 text-sm font-medium whitespace-nowrap text-foreground/60 transition-all 
group-data-vertical/tabs:w-full group-data-vertical/tabs:justify-start hover:text-foreground focus-visible:border-ring focus-visible:ring-[3px] focus-visible:ring-ring/50 focus-visible:outline-1 focus-visible:outline-ring disabled:pointer-events-none disabled:opacity-50 aria-disabled:pointer-events-none aria-disabled:opacity-50 dark:text-muted-foreground dark:hover:text-foreground group-data-[variant=default]/tabs-list:data-active:shadow-sm group-data-[variant=line]/tabs-list:data-active:shadow-none [&_svg]:pointer-events-none [&_svg]:shrink-0 [&_svg:not([class*='size-'])]:size-4", + "group-data-[variant=line]/tabs-list:bg-transparent group-data-[variant=line]/tabs-list:data-active:bg-transparent dark:group-data-[variant=line]/tabs-list:data-active:border-transparent dark:group-data-[variant=line]/tabs-list:data-active:bg-transparent", + "data-active:bg-background data-active:text-foreground dark:data-active:border-input dark:data-active:bg-input/30 dark:data-active:text-foreground", + "after:absolute after:bg-foreground after:opacity-0 after:transition-opacity group-data-horizontal/tabs:after:inset-x-0 group-data-horizontal/tabs:after:bottom-[-5px] group-data-horizontal/tabs:after:h-0.5 group-data-vertical/tabs:after:inset-y-0 group-data-vertical/tabs:after:-right-1 group-data-vertical/tabs:after:w-0.5 group-data-[variant=line]/tabs-list:data-active:after:opacity-100", + className + )} + {...props} + /> + ) +} + +function TabsContent({ className, ...props }: TabsPrimitive.Panel.Props) { + return ( + <TabsPrimitive.Panel + data-slot="tabs-content" + className={cn("flex-1 text-sm outline-none", className)} + {...props} + /> + ) +} + +export { Tabs, TabsList, TabsTrigger, TabsContent, tabsListVariants } diff --git a/frontend/apps/client/src/components/ui/textarea.tsx b/frontend/apps/client/src/components/ui/textarea.tsx new file mode 100644 index 0000000..04d27f7 --- /dev/null +++ b/frontend/apps/client/src/components/ui/textarea.tsx @@ -0,0 +1,18 @@ +import * as React from "react" + +import { cn } from "@/lib/utils" + +function Textarea({ className, ...props }: React.ComponentProps<"textarea">) { + return ( + <textarea + data-slot="textarea" + className={cn( + "flex field-sizing-content min-h-16 w-full rounded-lg border border-input bg-transparent px-2.5 py-2 text-base transition-colors outline-none placeholder:text-muted-foreground focus-visible:border-ring focus-visible:ring-3 focus-visible:ring-ring/50 disabled:cursor-not-allowed disabled:bg-input/50 disabled:opacity-50 aria-invalid:border-destructive aria-invalid:ring-3 aria-invalid:ring-destructive/20 md:text-sm dark:bg-input/30 dark:disabled:bg-input/80 dark:aria-invalid:border-destructive/50 dark:aria-invalid:ring-destructive/40", + className + )} + {...props} + /> + ) +} + +export { Textarea } diff --git a/frontend/apps/client/src/lib/utils.ts b/frontend/apps/client/src/lib/utils.ts new file mode 100644 index 0000000..bd0c391 --- /dev/null +++ b/frontend/apps/client/src/lib/utils.ts @@ -0,0 +1,6 @@ +import { clsx, type ClassValue } from "clsx" +import { twMerge } from "tailwind-merge" + +export function cn(...inputs: ClassValue[]) { + return twMerge(clsx(inputs)) +} diff --git a/frontend/apps/client/tsconfig.json b/frontend/apps/client/tsconfig.json new file mode 100644 index 0000000..cf9c65d --- /dev/null +++ b/frontend/apps/client/tsconfig.json @@ -0,0 +1,34 @@ +{ + "compilerOptions": { + "target": "ES2017", + "lib": ["dom", "dom.iterable", "esnext"], + "allowJs": true, + "skipLibCheck": true, + "strict": true, + "noEmit": true, + 
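// type-check only: Next.js emits the build output via SWC +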
"esModuleInterop": true, + "module": "esnext", + "moduleResolution": "bundler", + "resolveJsonModule": true, + "isolatedModules": true, + "jsx": "react-jsx", + "incremental": true, + "plugins": [ + { + "name": "next" + } + ], + "paths": { + "@/*": ["./src/*"] + } + }, + "include": [ + "next-env.d.ts", + "**/*.ts", + "**/*.tsx", + ".next/types/**/*.ts", + ".next/dev/types/**/*.ts", + "**/*.mts" + ], + "exclude": ["node_modules"] +} diff --git a/frontend/docker-compose.dev.yml b/frontend/docker-compose.dev.yml new file mode 100644 index 0000000..7d86ad8 --- /dev/null +++ b/frontend/docker-compose.dev.yml @@ -0,0 +1,16 @@ +services: + frontend: + build: + context: . + dockerfile: apps/client/Dockerfile.dev + volumes: + - ./apps/client/src:/app/src + - ./packages:/app/packages + ports: + - "3000:3000" + networks: + - cmdb-network + +networks: + cmdb-network: + external: true diff --git a/frontend/package.json b/frontend/package.json new file mode 100644 index 0000000..ef0a76f --- /dev/null +++ b/frontend/package.json @@ -0,0 +1,12 @@ +{ + "name": "cmdb-frontend", + "version": "0.1.0", + "private": true, + "scripts": { + "dev": "pnpm --filter @cmdb/client dev", + "dev:admin": "pnpm --filter @cmdb/admin dev", + "build": "pnpm -r build", + "lint": "pnpm -r lint", + "format": "prettier --write \"**/*.{ts,tsx,json,css,md}\"" + } +} diff --git a/frontend/packages/shared/package.json b/frontend/packages/shared/package.json new file mode 100644 index 0000000..1cc74af --- /dev/null +++ b/frontend/packages/shared/package.json @@ -0,0 +1,22 @@ +{ + "name": "@cmdb/shared", + "version": "0.1.0", + "private": true, + "main": "./src/index.ts", + "types": "./src/index.ts", + "exports": { + ".": "./src/index.ts", + "./lib/*": "./src/lib/*.ts", + "./hooks/*": "./src/hooks/*.tsx", + "./types/*": "./src/types/*.ts", + "./components/*": "./src/components/*.tsx" + }, + "peerDependencies": { + "react": "^19", + "react-dom": "^19" + }, + "devDependencies": { + "@types/react": "^19.2.14", + "@types/react-dom": "^19.2.3" + } +} diff --git a/frontend/packages/shared/src/hooks/use-auth.tsx b/frontend/packages/shared/src/hooks/use-auth.tsx new file mode 100644 index 0000000..a7b1c10 --- /dev/null +++ b/frontend/packages/shared/src/hooks/use-auth.tsx @@ -0,0 +1,62 @@ +"use client"; + +import { createContext, useCallback, useContext, useEffect, useMemo, useState } from "react"; +import type { ReactNode } from "react"; +import type { User } from "../types/auth"; +import { getCurrentUser, isAuthenticated as checkAuth, login as authLogin, logout as authLogout } from "../lib/auth"; +import type { LoginRequest } from "../types/auth"; + +interface AuthContextValue { + user: User | null; + isAuthenticated: boolean; + isLoading: boolean; + login: (credentials: LoginRequest) => Promise<void>; + logout: () => void; +} + +const AuthContext = createContext<AuthContextValue | null>(null); + +export function AuthProvider({ children }: { children: ReactNode }) { + const [user, setUser] = useState<User | null>(null); + const [isLoading, setIsLoading] = useState(true); + + useEffect(() => { + if (checkAuth()) { + const user = getCurrentUser(); + setUser(user); + } + setIsLoading(false); + }, []); + + const login = useCallback(async (credentials: LoginRequest) => { + await authLogin(credentials); + const user = getCurrentUser(); + setUser(user); + }, []); + + const logout = useCallback(() => { + setUser(null); + authLogout(); + }, []); + + const value = useMemo( + () => ({ + user, + isAuthenticated: !!user, + isLoading, + login, + logout, + 
}), + [user, isLoading, login, logout], + ); + + return <AuthContext value={value}>{children}</AuthContext>; +} + +export function useAuth(): AuthContextValue { + const context = useContext(AuthContext); + if (!context) { + throw new Error("useAuth must be used within an AuthProvider"); + } + return context; +} diff --git a/frontend/packages/shared/src/index.ts b/frontend/packages/shared/src/index.ts new file mode 100644 index 0000000..5b93608 --- /dev/null +++ b/frontend/packages/shared/src/index.ts @@ -0,0 +1,43 @@ +// Types +export type { PaginatedResponse, ApiError } from "./types/common"; +export type { LoginRequest, TokenResponse, User, SignupRequest } from "./types/auth"; +export type { + Prefix, + IPAddress, + VRF, + VLAN, + IPRange, + RIR, + ASN, + FHRPGroup, + RouteTarget, + VLANGroup, + Service, + SearchResult, + GlobalSearchResponse, + ChangeLogEntry, + JournalEntry, +} from "./types/ipam"; + +// API +export { api } from "./lib/api"; +export { + prefixApi, + ipAddressApi, + vrfApi, + vlanApi, + ipRangeApi, + rirApi, + asnApi, + fhrpGroupApi, + routeTargetApi, + vlanGroupApi, + serviceApi, + searchApi, + changelogApi, + journalApi, +} from "./lib/api-client"; + +// Auth +export { getAccessToken, isAuthenticated, login, signup, logout, getCurrentUser } from "./lib/auth"; +export { AuthProvider, useAuth } from "./hooks/use-auth"; diff --git a/frontend/packages/shared/src/lib/api-client.ts b/frontend/packages/shared/src/lib/api-client.ts new file mode 100644 index 0000000..e015276 --- /dev/null +++ b/frontend/packages/shared/src/lib/api-client.ts @@ -0,0 +1,70 @@ +import type { PaginatedResponse } from "../types/common"; +import type { + ASN, + ChangeLogEntry, + FHRPGroup, + GlobalSearchResponse, + IPAddress, + IPRange, + JournalEntry, + Prefix, + RIR, + RouteTarget, + Service, + VLAN, + VLANGroup, + VRF, +} from "../types/ipam"; +import { api } from "./api"; + +function buildQuery(params: Record<string, unknown>): string { + const entries = Object.entries(params).filter(([, v]) => v != null && v !== ""); + if (entries.length === 0) return ""; + return "?" 
+ entries.map(([k, v]) => `${k}=${encodeURIComponent(String(v))}`).join("&"); +} + +function createEntityApi<T>(basePath: string) { + return { + list: (params: Record<string, unknown> = {}) => + api.get<PaginatedResponse<T>>(`${basePath}${buildQuery(params)}`), + get: (id: string) => api.get<T>(`${basePath}/${id}`), + create: (data: Partial<T>) => api.post<T>(basePath, data), + update: (id: string, data: Partial<T>) => api.patch<T>(`${basePath}/${id}`, data), + delete: (id: string) => api.delete<void>(`${basePath}/${id}`), + }; +} + +export const prefixApi = createEntityApi<Prefix>("/api/v1/prefixes"); +export const ipAddressApi = createEntityApi<IPAddress>("/api/v1/ip-addresses"); +export const vrfApi = createEntityApi<VRF>("/api/v1/vrfs"); +export const vlanApi = createEntityApi<VLAN>("/api/v1/vlans"); +export const ipRangeApi = createEntityApi<IPRange>("/api/v1/ip-ranges"); +export const rirApi = createEntityApi<RIR>("/api/v1/rirs"); +export const asnApi = createEntityApi<ASN>("/api/v1/asns"); +export const fhrpGroupApi = createEntityApi<FHRPGroup>("/api/v1/fhrp-groups"); +export const routeTargetApi = createEntityApi<RouteTarget>("/api/v1/route-targets"); +export const vlanGroupApi = createEntityApi<VLANGroup>("/api/v1/vlan-groups"); +export const serviceApi = createEntityApi<Service>("/api/v1/services"); + +export const searchApi = { + search: (q: string, entityTypes?: string[], offset = 0, limit = 20) => { + const params: Record<string, unknown> = { q, offset, limit }; + if (entityTypes?.length) params.entity_types = entityTypes.join(","); + return api.get<GlobalSearchResponse>(`/api/v1/search${buildQuery(params)}`); + }, +}; + +export const changelogApi = { + list: (params: Record<string, unknown> = {}) => + api.get<PaginatedResponse<ChangeLogEntry>>(`/api/v1/event/changelog${buildQuery(params)}`), + getByObject: (aggregateId: string, params: Record<string, unknown> = {}) => + api.get<PaginatedResponse<ChangeLogEntry>>(`/api/v1/event/changelog/${aggregateId}${buildQuery(params)}`), +}; + +export const journalApi = { + list: (params: Record<string, unknown> = {}) => + api.get<PaginatedResponse<JournalEntry>>(`/api/v1/event/journal-entries${buildQuery(params)}`), + create: (data: { object_type: string; object_id: string; entry_type: string; comment: string }) => + api.post<JournalEntry>("/api/v1/event/journal-entries", data), + delete: (id: string) => api.delete<void>(`/api/v1/event/journal-entries/${id}`), +}; diff --git a/frontend/packages/shared/src/lib/api.ts b/frontend/packages/shared/src/lib/api.ts new file mode 100644 index 0000000..af0eb7a --- /dev/null +++ b/frontend/packages/shared/src/lib/api.ts @@ -0,0 +1,88 @@ +const API_BASE_URL = process.env.NEXT_PUBLIC_API_URL || ""; + +type RequestOptions = Omit<RequestInit, "body"> & { + body?: unknown; +}; + +async function refreshAccessToken(): Promise<string | null> { + const refreshToken = localStorage.getItem("refresh_token"); + if (!refreshToken) return null; + + try { + const res = await fetch(`${API_BASE_URL}/api/v1/auth/refresh`, { + method: "POST", + headers: { "Content-Type": "application/json" }, + body: JSON.stringify({ refresh_token: refreshToken }), + }); + if (!res.ok) return null; + const data = await res.json(); + localStorage.setItem("access_token", data.access_token); + if (data.refresh_token) { + localStorage.setItem("refresh_token", data.refresh_token); + } + return data.access_token; + } catch { + return null; + } +} + +async function request<T>(path: string, options: RequestOptions = {}): Promise<T> { + const 
accessToken = localStorage.getItem("access_token"); + const headers: Record<string, string> = { + "Content-Type": "application/json", + ...(options.headers as Record<string, string>), + }; + + if (accessToken) { + headers["Authorization"] = `Bearer ${accessToken}`; + } + + const tenantId = localStorage.getItem("tenant_id"); + if (tenantId) { + headers["X-Tenant-ID"] = tenantId; + } + + let res = await fetch(`${API_BASE_URL}${path}`, { + ...options, + headers, + body: options.body ? JSON.stringify(options.body) : undefined, + }); + + if (res.status === 401 && accessToken) { + const newToken = await refreshAccessToken(); + if (newToken) { + headers["Authorization"] = `Bearer ${newToken}`; + res = await fetch(`${API_BASE_URL}${path}`, { + ...options, + headers, + body: options.body ? JSON.stringify(options.body) : undefined, + }); + } else { + localStorage.removeItem("access_token"); + localStorage.removeItem("refresh_token"); + window.location.href = "/login"; + throw new Error("Authentication expired"); + } + } + + if (!res.ok) { + const error = await res.json().catch(() => ({ detail: res.statusText })); + const detail = Array.isArray(error.detail) + ? error.detail.map((d: { msg?: string }) => d.msg).join(", ") + : error.detail; + throw new Error(detail || error.title || `Request failed: ${res.status}`); + } + + if (res.status === 204) { + return undefined as T; + } + + return res.json(); +} + +export const api = { + get: <T>(path: string) => request<T>(path), + post: <T>(path: string, body?: unknown) => request<T>(path, { method: "POST", body }), + patch: <T>(path: string, body?: unknown) => request<T>(path, { method: "PATCH", body }), + delete: <T>(path: string) => request<T>(path, { method: "DELETE" }), +}; diff --git a/frontend/packages/shared/src/lib/auth.ts b/frontend/packages/shared/src/lib/auth.ts new file mode 100644 index 0000000..a627f48 --- /dev/null +++ b/frontend/packages/shared/src/lib/auth.ts @@ -0,0 +1,47 @@ +import type { LoginRequest, SignupRequest, TokenResponse, User } from "../types/auth"; +import { api } from "./api"; + +export function getAccessToken(): string | null { + if (typeof window === "undefined") return null; + return localStorage.getItem("access_token"); +} + +export function isAuthenticated(): boolean { + return !!getAccessToken(); +} + +export async function login(credentials: LoginRequest): Promise<TokenResponse> { + const data = await api.post<TokenResponse>("/api/v1/auth/login", credentials); + localStorage.setItem("access_token", data.access_token); + localStorage.setItem("refresh_token", data.refresh_token); + if (credentials.tenant_id) localStorage.setItem("tenant_id", credentials.tenant_id); + return data; +} + +export async function signup(data: SignupRequest): Promise<User> { + return api.post<User>("/api/v1/auth/register", data); +} + +export function logout(): void { + localStorage.removeItem("access_token"); + localStorage.removeItem("refresh_token"); + window.location.href = "/login"; +} + +export function getCurrentUser(): User | null { + const token = getAccessToken(); + if (!token) return null; + try { + // JWT payloads are base64url-encoded; map "-"/"_" back to "+"/"/" so atob can decode them + const payload = JSON.parse(atob(token.split(".")[1].replace(/-/g, "+").replace(/_/g, "/"))); + return { + id: payload.sub, + email: "", + username: "", + status: "active", + roles: payload.roles || [], + }; + } catch { + return null; + } +} diff --git a/frontend/packages/shared/src/lib/setup.ts b/frontend/packages/shared/src/lib/setup.ts new file mode 100644 index 0000000..e0a84cd --- /dev/null +++ b/frontend/packages/shared/src/lib/setup.ts @@ -0,0 +1,39 @@ +const
API_BASE_URL = process.env.NEXT_PUBLIC_API_URL || ""; + +export interface SetupStatus { + initialized: boolean; +} + +export interface CreateTenantRequest { + name: string; + slug: string; +} + +export interface TenantResponse { + id: string; + name: string; + slug: string; + status: string; + settings: Record<string, unknown>; + created_at: string; + updated_at: string; +} + +export async function getSetupStatus(): Promise<SetupStatus> { + const res = await fetch(`${API_BASE_URL}/api/v1/setup/status`); + if (!res.ok) throw new Error("Failed to check setup status"); + return res.json(); +} + +export async function setupCreateTenant(data: CreateTenantRequest): Promise<TenantResponse> { + const res = await fetch(`${API_BASE_URL}/api/v1/setup/create-tenant`, { + method: "POST", + headers: { "Content-Type": "application/json" }, + body: JSON.stringify(data), + }); + if (!res.ok) { + const error = await res.json().catch(() => ({ detail: res.statusText })); + throw new Error(error.detail || "Failed to create tenant"); + } + return res.json(); +} diff --git a/frontend/packages/shared/src/types/auth.ts b/frontend/packages/shared/src/types/auth.ts new file mode 100644 index 0000000..7da385a --- /dev/null +++ b/frontend/packages/shared/src/types/auth.ts @@ -0,0 +1,26 @@ +export interface LoginRequest { + email: string; + password: string; + tenant_id?: string; +} + +export interface TokenResponse { + access_token: string; + refresh_token: string; + token_type: string; +} + +export interface User { + id: string; + email: string; + username: string; + status: string; + roles: string[]; +} + +export interface SignupRequest { + email: string; + username: string; + password: string; + tenant_id?: string; +} diff --git a/frontend/packages/shared/src/types/common.ts b/frontend/packages/shared/src/types/common.ts new file mode 100644 index 0000000..12c42b9 --- /dev/null +++ b/frontend/packages/shared/src/types/common.ts @@ -0,0 +1,14 @@ +export interface PaginatedResponse<T> { + items: T[]; + total: number; + offset: number; + limit: number; +} + +export interface ApiError { + type: string; + title: string; + status: number; + detail: string; + instance?: string; +} diff --git a/frontend/packages/shared/src/types/ipam.ts b/frontend/packages/shared/src/types/ipam.ts new file mode 100644 index 0000000..c96a2d3 --- /dev/null +++ b/frontend/packages/shared/src/types/ipam.ts @@ -0,0 +1,181 @@ +export interface Prefix { + id: string; + network: string; + vrf_id: string | null; + vlan_id: string | null; + status: string; + role: string | null; + tenant_id: string | null; + description: string; + custom_fields: Record<string, unknown>; + tags: string[]; + created_at: string; + updated_at: string; +} + +export interface IPAddress { + id: string; + address: string; + vrf_id: string | null; + status: string; + dns_name: string; + tenant_id: string | null; + description: string; + custom_fields: Record<string, unknown>; + tags: string[]; + created_at: string; + updated_at: string; +} + +export interface VRF { + id: string; + name: string; + rd: string | null; + import_targets: string[]; + export_targets: string[]; + tenant_id: string | null; + description: string; + custom_fields: Record<string, unknown>; + tags: string[]; + created_at: string; + updated_at: string; +} + +export interface VLAN { + id: string; + vid: number; + name: string; + group_id: string | null; + status: string; + role: string | null; + tenant_id: string | null; + description: string; + custom_fields: Record<string, unknown>; + tags: string[]; + 
created_at: string; + updated_at: string; +} + +export interface IPRange { + id: string; + start_address: string; + end_address: string; + vrf_id: string | null; + status: string; + tenant_id: string | null; + description: string; + custom_fields: Record<string, unknown>; + tags: string[]; + created_at: string; + updated_at: string; +} + +export interface RIR { + id: string; + name: string; + is_private: boolean; + description: string; + custom_fields: Record<string, unknown>; + tags: string[]; + created_at: string; + updated_at: string; +} + +export interface ASN { + id: string; + asn: number; + rir_id: string | null; + tenant_id: string | null; + description: string; + custom_fields: Record<string, unknown>; + tags: string[]; + created_at: string; + updated_at: string; +} + +export interface FHRPGroup { + id: string; + protocol: string; + group_id_value: number; + auth_type: string; + name: string; + description: string; + custom_fields: Record<string, unknown>; + tags: string[]; + created_at: string; + updated_at: string; +} + +export interface RouteTarget { + id: string; + name: string; + tenant_id: string | null; + description: string; + custom_fields: Record<string, unknown>; + tags: string[]; + created_at: string; + updated_at: string; +} + +export interface VLANGroup { + id: string; + name: string; + slug: string; + min_vid: number; + max_vid: number; + tenant_id: string | null; + description: string; + custom_fields: Record<string, unknown>; + tags: string[]; + created_at: string; + updated_at: string; +} + +export interface Service { + id: string; + name: string; + protocol: string; + ports: number[]; + ip_addresses: string[]; + description: string; + custom_fields: Record<string, unknown>; + tags: string[]; + created_at: string; + updated_at: string; +} + +export interface SearchResult { + entity_type: string; + entity_id: string; + display_text: string; + description: string; + relevance: number; +} + +export interface GlobalSearchResponse { + results: SearchResult[]; + total: number; +} + +export interface ChangeLogEntry { + id: number; + aggregate_id: string; + aggregate_type: string; + action: string; + event_type: string; + user_id: string | null; + tenant_id: string | null; + correlation_id: string | null; + timestamp: string; +} + +export interface JournalEntry { + id: string; + object_type: string; + object_id: string; + entry_type: "info" | "success" | "warning" | "danger"; + comment: string; + user_id: string | null; + tenant_id: string | null; + created_at: string; +} diff --git a/frontend/packages/shared/tsconfig.json b/frontend/packages/shared/tsconfig.json new file mode 100644 index 0000000..c4673d0 --- /dev/null +++ b/frontend/packages/shared/tsconfig.json @@ -0,0 +1,22 @@ +{ + "compilerOptions": { + "target": "ES2022", + "lib": ["dom", "dom.iterable", "ES2022"], + "module": "ESNext", + "moduleResolution": "bundler", + "jsx": "react-jsx", + "strict": true, + "esModuleInterop": true, + "skipLibCheck": true, + "forceConsistentCasingInFileNames": true, + "resolveJsonModule": true, + "isolatedModules": true, + "declaration": true, + "declarationMap": true, + "paths": { + "@/*": ["./src/*"] + } + }, + "include": ["src/**/*"], + "exclude": ["node_modules"] +} diff --git a/frontend/pnpm-lock.yaml b/frontend/pnpm-lock.yaml new file mode 100644 index 0000000..7f16ae3 --- /dev/null +++ b/frontend/pnpm-lock.yaml @@ -0,0 +1,6303 @@ +lockfileVersion: '9.0' + +settings: + autoInstallPeers: true + excludeLinksFromLockfile: false + +importers: + + .: {} + + apps/admin: + 
dependencies: + '@cmdb/shared': + specifier: workspace:* + version: link:../../packages/shared + next: + specifier: 16.1.7 + version: 16.1.7(@babel/core@7.29.0)(react-dom@19.2.3(react@19.2.3))(react@19.2.3) + react: + specifier: 19.2.3 + version: 19.2.3 + react-dom: + specifier: 19.2.3 + version: 19.2.3(react@19.2.3) + devDependencies: + '@tailwindcss/postcss': + specifier: ^4 + version: 4.2.2 + '@types/node': + specifier: ^20 + version: 20.19.37 + '@types/react': + specifier: ^19 + version: 19.2.14 + '@types/react-dom': + specifier: ^19 + version: 19.2.3(@types/react@19.2.14) + eslint: + specifier: ^9 + version: 9.39.4(jiti@2.6.1) + eslint-config-next: + specifier: 16.1.7 + version: 16.1.7(@typescript-eslint/parser@8.57.1(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3))(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3) + tailwindcss: + specifier: ^4 + version: 4.2.2 + typescript: + specifier: ^5 + version: 5.9.3 + + apps/client: + dependencies: + '@base-ui/react': + specifier: ^1.3.0 + version: 1.3.0(@types/react@19.2.14)(react-dom@19.2.3(react@19.2.3))(react@19.2.3) + '@cmdb/shared': + specifier: workspace:* + version: link:../../packages/shared + '@tanstack/react-query': + specifier: ^5 + version: 5.94.5(react@19.2.3) + '@tanstack/react-table': + specifier: ^8 + version: 8.21.3(react-dom@19.2.3(react@19.2.3))(react@19.2.3) + class-variance-authority: + specifier: ^0.7.1 + version: 0.7.1 + clsx: + specifier: ^2.1.1 + version: 2.1.1 + lucide-react: + specifier: ^0.500 + version: 0.500.0(react@19.2.3) + next: + specifier: 16.1.7 + version: 16.1.7(@babel/core@7.29.0)(react-dom@19.2.3(react@19.2.3))(react@19.2.3) + next-themes: + specifier: ^0.4 + version: 0.4.6(react-dom@19.2.3(react@19.2.3))(react@19.2.3) + react: + specifier: 19.2.3 + version: 19.2.3 + react-dom: + specifier: 19.2.3 + version: 19.2.3(react@19.2.3) + shadcn: + specifier: ^4.1.0 + version: 4.1.0(@types/node@20.19.37)(typescript@5.9.3) + tailwind-merge: + specifier: ^3.5.0 + version: 3.5.0 + tw-animate-css: + specifier: ^1.4.0 + version: 1.4.0 + devDependencies: + '@tailwindcss/postcss': + specifier: ^4 + version: 4.2.2 + '@types/node': + specifier: ^20 + version: 20.19.37 + '@types/react': + specifier: ^19 + version: 19.2.14 + '@types/react-dom': + specifier: ^19 + version: 19.2.3(@types/react@19.2.14) + eslint: + specifier: ^9 + version: 9.39.4(jiti@2.6.1) + eslint-config-next: + specifier: 16.1.7 + version: 16.1.7(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3) + tailwindcss: + specifier: ^4 + version: 4.2.2 + typescript: + specifier: ^5 + version: 5.9.3 + + packages/shared: + dependencies: + react: + specifier: ^19 + version: 19.2.3 + react-dom: + specifier: ^19 + version: 19.2.3(react@19.2.3) + devDependencies: + '@types/react': + specifier: ^19.2.14 + version: 19.2.14 + '@types/react-dom': + specifier: ^19.2.3 + version: 19.2.3(@types/react@19.2.14) + +packages: + + '@alloc/quick-lru@5.2.0': + resolution: {integrity: sha512-UrcABB+4bUrFABwbluTIBErXwvbsU/V7TZWfmbgJfbkwiBuziS9gxdODUyuiecfdGQ85jglMW6juS3+z5TsKLw==} + engines: {node: '>=10'} + + '@babel/code-frame@7.29.0': + resolution: {integrity: sha512-9NhCeYjq9+3uxgdtp20LSiJXJvN0FeCtNGpJxuMFZ1Kv3cWUNb6DOhJwUvcVCzKGR66cw4njwM6hrJLqgOwbcw==} + engines: {node: '>=6.9.0'} + + '@babel/compat-data@7.29.0': + resolution: {integrity: sha512-T1NCJqT/j9+cn8fvkt7jtwbLBfLC/1y1c7NtCeXFRgzGTsafi68MRv8yzkYSapBnFA6L3U2VSc02ciDzoAJhJg==} + engines: {node: '>=6.9.0'} + + '@babel/core@7.29.0': + resolution: {integrity: 
sha512-CGOfOJqWjg2qW/Mb6zNsDm+u5vFQ8DxXfbM09z69p5Z6+mE1ikP2jUXw+j42Pf1XTYED2Rni5f95npYeuwMDQA==} + engines: {node: '>=6.9.0'} + + '@babel/generator@7.29.1': + resolution: {integrity: sha512-qsaF+9Qcm2Qv8SRIMMscAvG4O3lJ0F1GuMo5HR/Bp02LopNgnZBC/EkbevHFeGs4ls/oPz9v+Bsmzbkbe+0dUw==} + engines: {node: '>=6.9.0'} + + '@babel/helper-annotate-as-pure@7.27.3': + resolution: {integrity: sha512-fXSwMQqitTGeHLBC08Eq5yXz2m37E4pJX1qAU1+2cNedz/ifv/bVXft90VeSav5nFO61EcNgwr0aJxbyPaWBPg==} + engines: {node: '>=6.9.0'} + + '@babel/helper-compilation-targets@7.28.6': + resolution: {integrity: sha512-JYtls3hqi15fcx5GaSNL7SCTJ2MNmjrkHXg4FSpOA/grxK8KwyZ5bubHsCq8FXCkua6xhuaaBit+3b7+VZRfcA==} + engines: {node: '>=6.9.0'} + + '@babel/helper-create-class-features-plugin@7.28.6': + resolution: {integrity: sha512-dTOdvsjnG3xNT9Y0AUg1wAl38y+4Rl4sf9caSQZOXdNqVn+H+HbbJ4IyyHaIqNR6SW9oJpA/RuRjsjCw2IdIow==} + engines: {node: '>=6.9.0'} + peerDependencies: + '@babel/core': ^7.0.0 + + '@babel/helper-globals@7.28.0': + resolution: {integrity: sha512-+W6cISkXFa1jXsDEdYA8HeevQT/FULhxzR99pxphltZcVaugps53THCeiWA8SguxxpSp3gKPiuYfSWopkLQ4hw==} + engines: {node: '>=6.9.0'} + + '@babel/helper-member-expression-to-functions@7.28.5': + resolution: {integrity: sha512-cwM7SBRZcPCLgl8a7cY0soT1SptSzAlMH39vwiRpOQkJlh53r5hdHwLSCZpQdVLT39sZt+CRpNwYG4Y2v77atg==} + engines: {node: '>=6.9.0'} + + '@babel/helper-module-imports@7.28.6': + resolution: {integrity: sha512-l5XkZK7r7wa9LucGw9LwZyyCUscb4x37JWTPz7swwFE/0FMQAGpiWUZn8u9DzkSBWEcK25jmvubfpw2dnAMdbw==} + engines: {node: '>=6.9.0'} + + '@babel/helper-module-transforms@7.28.6': + resolution: {integrity: sha512-67oXFAYr2cDLDVGLXTEABjdBJZ6drElUSI7WKp70NrpyISso3plG9SAGEF6y7zbha/wOzUByWWTJvEDVNIUGcA==} + engines: {node: '>=6.9.0'} + peerDependencies: + '@babel/core': ^7.0.0 + + '@babel/helper-optimise-call-expression@7.27.1': + resolution: {integrity: sha512-URMGH08NzYFhubNSGJrpUEphGKQwMQYBySzat5cAByY1/YgIRkULnIy3tAMeszlL/so2HbeilYloUmSpd7GdVw==} + engines: {node: '>=6.9.0'} + + '@babel/helper-plugin-utils@7.28.6': + resolution: {integrity: sha512-S9gzZ/bz83GRysI7gAD4wPT/AI3uCnY+9xn+Mx/KPs2JwHJIz1W8PZkg2cqyt3RNOBM8ejcXhV6y8Og7ly/Dug==} + engines: {node: '>=6.9.0'} + + '@babel/helper-replace-supers@7.28.6': + resolution: {integrity: sha512-mq8e+laIk94/yFec3DxSjCRD2Z0TAjhVbEJY3UQrlwVo15Lmt7C2wAUbK4bjnTs4APkwsYLTahXRraQXhb1WCg==} + engines: {node: '>=6.9.0'} + peerDependencies: + '@babel/core': ^7.0.0 + + '@babel/helper-skip-transparent-expression-wrappers@7.27.1': + resolution: {integrity: sha512-Tub4ZKEXqbPjXgWLl2+3JpQAYBJ8+ikpQ2Ocj/q/r0LwE3UhENh7EUabyHjz2kCEsrRY83ew2DQdHluuiDQFzg==} + engines: {node: '>=6.9.0'} + + '@babel/helper-string-parser@7.27.1': + resolution: {integrity: sha512-qMlSxKbpRlAridDExk92nSobyDdpPijUq2DW6oDnUqd0iOGxmQjyqhMIihI9+zv4LPyZdRje2cavWPbCbWm3eA==} + engines: {node: '>=6.9.0'} + + '@babel/helper-validator-identifier@7.28.5': + resolution: {integrity: sha512-qSs4ifwzKJSV39ucNjsvc6WVHs6b7S03sOh2OcHF9UHfVPqWWALUsNUVzhSBiItjRZoLHx7nIarVjqKVusUZ1Q==} + engines: {node: '>=6.9.0'} + + '@babel/helper-validator-option@7.27.1': + resolution: {integrity: sha512-YvjJow9FxbhFFKDSuFnVCe2WxXk1zWc22fFePVNEaWJEu8IrZVlda6N0uHwzZrUM1il7NC9Mlp4MaJYbYd9JSg==} + engines: {node: '>=6.9.0'} + + '@babel/helpers@7.29.2': + resolution: {integrity: sha512-HoGuUs4sCZNezVEKdVcwqmZN8GoHirLUcLaYVNBK2J0DadGtdcqgr3BCbvH8+XUo4NGjNl3VOtSjEKNzqfFgKw==} + engines: {node: '>=6.9.0'} + + '@babel/parser@7.29.2': + resolution: {integrity: 
sha512-4GgRzy/+fsBa72/RZVJmGKPmZu9Byn8o4MoLpmNe1m8ZfYnz5emHLQz3U4gLud6Zwl0RZIcgiLD7Uq7ySFuDLA==} + engines: {node: '>=6.0.0'} + hasBin: true + + '@babel/plugin-syntax-jsx@7.28.6': + resolution: {integrity: sha512-wgEmr06G6sIpqr8YDwA2dSRTE3bJ+V0IfpzfSY3Lfgd7YWOaAdlykvJi13ZKBt8cZHfgH1IXN+CL656W3uUa4w==} + engines: {node: '>=6.9.0'} + peerDependencies: + '@babel/core': ^7.0.0-0 + + '@babel/plugin-syntax-typescript@7.28.6': + resolution: {integrity: sha512-+nDNmQye7nlnuuHDboPbGm00Vqg3oO8niRRL27/4LYHUsHYh0zJ1xWOz0uRwNFmM1Avzk8wZbc6rdiYhomzv/A==} + engines: {node: '>=6.9.0'} + peerDependencies: + '@babel/core': ^7.0.0-0 + + '@babel/plugin-transform-modules-commonjs@7.28.6': + resolution: {integrity: sha512-jppVbf8IV9iWWwWTQIxJMAJCWBuuKx71475wHwYytrRGQ2CWiDvYlADQno3tcYpS/T2UUWFQp3nVtYfK/YBQrA==} + engines: {node: '>=6.9.0'} + peerDependencies: + '@babel/core': ^7.0.0-0 + + '@babel/plugin-transform-typescript@7.28.6': + resolution: {integrity: sha512-0YWL2RFxOqEm9Efk5PvreamxPME8OyY0wM5wh5lHjF+VtVhdneCWGzZeSqzOfiobVqQaNCd2z0tQvnI9DaPWPw==} + engines: {node: '>=6.9.0'} + peerDependencies: + '@babel/core': ^7.0.0-0 + + '@babel/preset-typescript@7.28.5': + resolution: {integrity: sha512-+bQy5WOI2V6LJZpPVxY+yp66XdZ2yifu0Mc1aP5CQKgjn4QM5IN2i5fAZ4xKop47pr8rpVhiAeu+nDQa12C8+g==} + engines: {node: '>=6.9.0'} + peerDependencies: + '@babel/core': ^7.0.0-0 + + '@babel/runtime@7.29.2': + resolution: {integrity: sha512-JiDShH45zKHWyGe4ZNVRrCjBz8Nh9TMmZG1kh4QTK8hCBTWBi8Da+i7s1fJw7/lYpM4ccepSNfqzZ/QvABBi5g==} + engines: {node: '>=6.9.0'} + + '@babel/template@7.28.6': + resolution: {integrity: sha512-YA6Ma2KsCdGb+WC6UpBVFJGXL58MDA6oyONbjyF/+5sBgxY/dwkhLogbMT2GXXyU84/IhRw/2D1Os1B/giz+BQ==} + engines: {node: '>=6.9.0'} + + '@babel/traverse@7.29.0': + resolution: {integrity: sha512-4HPiQr0X7+waHfyXPZpWPfWL/J7dcN1mx9gL6WdQVMbPnF3+ZhSMs8tCxN7oHddJE9fhNE7+lxdnlyemKfJRuA==} + engines: {node: '>=6.9.0'} + + '@babel/types@7.29.0': + resolution: {integrity: sha512-LwdZHpScM4Qz8Xw2iKSzS+cfglZzJGvofQICy7W7v4caru4EaAmyUuO6BGrbyQ2mYV11W0U8j5mBhd14dd3B0A==} + engines: {node: '>=6.9.0'} + + '@base-ui/react@1.3.0': + resolution: {integrity: sha512-FwpKqZbPz14AITp1CVgf4AjhKPe1OeeVKSBMdgD10zbFlj3QSWelmtCMLi2+/PFZZcIm3l87G7rwtCZJwHyXWA==} + engines: {node: '>=14.0.0'} + peerDependencies: + '@types/react': ^17 || ^18 || ^19 + react: ^17 || ^18 || ^19 + react-dom: ^17 || ^18 || ^19 + peerDependenciesMeta: + '@types/react': + optional: true + + '@base-ui/utils@0.2.6': + resolution: {integrity: sha512-yQ+qeuqohwhsNpoYDqqXaLllYAkPCP4vYdDrVo8FQXaAPfHWm1pG/Vm+jmGTA5JFS0BAIjookyapuJFY8F9PIw==} + peerDependencies: + '@types/react': ^17 || ^18 || ^19 + react: ^17 || ^18 || ^19 + react-dom: ^17 || ^18 || ^19 + peerDependenciesMeta: + '@types/react': + optional: true + + '@dotenvx/dotenvx@1.57.1': + resolution: {integrity: sha512-iKXuo8Nes9Ft4zF3AZOT4FHkl6OV8bHqn61a67qHokkBzSEurnKZAlOkT0FYrRNVGvE6nCfZMtYswyjfXCR1MQ==} + hasBin: true + + '@ecies/ciphers@0.2.5': + resolution: {integrity: sha512-GalEZH4JgOMHYYcYmVqnFirFsjZHeoGMDt9IxEnM9F7GRUUyUksJ7Ou53L83WHJq3RWKD3AcBpo0iQh0oMpf8A==} + engines: {bun: '>=1', deno: '>=2', node: '>=16'} + peerDependencies: + '@noble/ciphers': ^1.0.0 + + '@emnapi/core@1.9.1': + resolution: {integrity: sha512-mukuNALVsoix/w1BJwFzwXBN/dHeejQtuVzcDsfOEsdpCumXb/E9j8w11h5S54tT1xhifGfbbSm/ICrObRb3KA==} + + '@emnapi/runtime@1.9.1': + resolution: {integrity: sha512-VYi5+ZVLhpgK4hQ0TAjiQiZ6ol0oe4mBx7mVv7IflsiEp0OWoVsp/+f9Vc1hOhE0TtkORVrI1GvzyreqpgWtkA==} + + '@emnapi/wasi-threads@1.2.0': + resolution: {integrity: 
sha512-N10dEJNSsUx41Z6pZsXU8FjPjpBEplgH24sfkmITrBED1/U2Esum9F3lfLrMjKHHjmi557zQn7kR9R+XWXu5Rg==} + + '@eslint-community/eslint-utils@4.9.1': + resolution: {integrity: sha512-phrYmNiYppR7znFEdqgfWHXR6NCkZEK7hwWDHZUjit/2/U0r6XvkDl0SYnoM51Hq7FhCGdLDT6zxCCOY1hexsQ==} + engines: {node: ^12.22.0 || ^14.17.0 || >=16.0.0} + peerDependencies: + eslint: ^6.0.0 || ^7.0.0 || >=8.0.0 + + '@eslint-community/regexpp@4.12.2': + resolution: {integrity: sha512-EriSTlt5OC9/7SXkRSCAhfSxxoSUgBm33OH+IkwbdpgoqsSsUg7y3uh+IICI/Qg4BBWr3U2i39RpmycbxMq4ew==} + engines: {node: ^12.0.0 || ^14.0.0 || >=16.0.0} + + '@eslint/config-array@0.21.2': + resolution: {integrity: sha512-nJl2KGTlrf9GjLimgIru+V/mzgSK0ABCDQRvxw5BjURL7WfH5uoWmizbH7QB6MmnMBd8cIC9uceWnezL1VZWWw==} + engines: {node: ^18.18.0 || ^20.9.0 || >=21.1.0} + + '@eslint/config-helpers@0.4.2': + resolution: {integrity: sha512-gBrxN88gOIf3R7ja5K9slwNayVcZgK6SOUORm2uBzTeIEfeVaIhOpCtTox3P6R7o2jLFwLFTLnC7kU/RGcYEgw==} + engines: {node: ^18.18.0 || ^20.9.0 || >=21.1.0} + + '@eslint/core@0.17.0': + resolution: {integrity: sha512-yL/sLrpmtDaFEiUj1osRP4TI2MDz1AddJL+jZ7KSqvBuliN4xqYY54IfdN8qD8Toa6g1iloph1fxQNkjOxrrpQ==} + engines: {node: ^18.18.0 || ^20.9.0 || >=21.1.0} + + '@eslint/eslintrc@3.3.5': + resolution: {integrity: sha512-4IlJx0X0qftVsN5E+/vGujTRIFtwuLbNsVUe7TO6zYPDR1O6nFwvwhIKEKSrl6dZchmYBITazxKoUYOjdtjlRg==} + engines: {node: ^18.18.0 || ^20.9.0 || >=21.1.0} + + '@eslint/js@9.39.4': + resolution: {integrity: sha512-nE7DEIchvtiFTwBw4Lfbu59PG+kCofhjsKaCWzxTpt4lfRjRMqG6uMBzKXuEcyXhOHoUp9riAm7/aWYGhXZ9cw==} + engines: {node: ^18.18.0 || ^20.9.0 || >=21.1.0} + + '@eslint/object-schema@2.1.7': + resolution: {integrity: sha512-VtAOaymWVfZcmZbp6E2mympDIHvyjXs/12LqWYjVw6qjrfF+VK+fyG33kChz3nnK+SU5/NeHOqrTEHS8sXO3OA==} + engines: {node: ^18.18.0 || ^20.9.0 || >=21.1.0} + + '@eslint/plugin-kit@0.4.1': + resolution: {integrity: sha512-43/qtrDUokr7LJqoF2c3+RInu/t4zfrpYdoSDfYyhg52rwLV6TnOvdG4fXm7IkSB3wErkcmJS9iEhjVtOSEjjA==} + engines: {node: ^18.18.0 || ^20.9.0 || >=21.1.0} + + '@floating-ui/core@1.7.5': + resolution: {integrity: sha512-1Ih4WTWyw0+lKyFMcBHGbb5U5FtuHJuujoyyr5zTaWS5EYMeT6Jb2AuDeftsCsEuchO+mM2ij5+q9crhydzLhQ==} + + '@floating-ui/dom@1.7.6': + resolution: {integrity: sha512-9gZSAI5XM36880PPMm//9dfiEngYoC6Am2izES1FF406YFsjvyBMmeJ2g4SAju3xWwtuynNRFL2s9hgxpLI5SQ==} + + '@floating-ui/react-dom@2.1.8': + resolution: {integrity: sha512-cC52bHwM/n/CxS87FH0yWdngEZrjdtLW/qVruo68qg+prK7ZQ4YGdut2GyDVpoGeAYe/h899rVeOVm6Oi40k2A==} + peerDependencies: + react: '>=16.8.0' + react-dom: '>=16.8.0' + + '@floating-ui/utils@0.2.11': + resolution: {integrity: sha512-RiB/yIh78pcIxl6lLMG0CgBXAZ2Y0eVHqMPYugu+9U0AeT6YBeiJpf7lbdJNIugFP5SIjwNRgo4DhR1Qxi26Gg==} + + '@hono/node-server@1.19.11': + resolution: {integrity: sha512-dr8/3zEaB+p0D2n/IUrlPF1HZm586qgJNXK1a9fhg/PzdtkK7Ksd5l312tJX2yBuALqDYBlG20QEbayqPyxn+g==} + engines: {node: '>=18.14.1'} + peerDependencies: + hono: ^4 + + '@humanfs/core@0.19.1': + resolution: {integrity: sha512-5DyQ4+1JEUzejeK1JGICcideyfUbGixgS9jNgex5nqkW+cY7WZhxBigmieN5Qnw9ZosSNVC9KQKyb+GUaGyKUA==} + engines: {node: '>=18.18.0'} + + '@humanfs/node@0.16.7': + resolution: {integrity: sha512-/zUx+yOsIrG4Y43Eh2peDeKCxlRt/gET6aHfaKpuq267qXdYDFViVHfMaLyygZOnl0kGWxFIgsBy8QFuTLUXEQ==} + engines: {node: '>=18.18.0'} + + '@humanwhocodes/module-importer@1.0.1': + resolution: {integrity: sha512-bxveV4V8v5Yb4ncFTT3rPSgZBOpCkjfK0y4oVVVJwIuDVBRMDXrPyXRL988i5ap9m9bnyEEjWfm5WkBmtffLfA==} + engines: {node: '>=12.22'} + + '@humanwhocodes/retry@0.4.3': + resolution: {integrity: 
sha512-bV0Tgo9K4hfPCek+aMAn81RppFKv2ySDQeMoSZuvTASywNTnVJCArCZE2FWqpvIatKu7VMRLWlR1EazvVhDyhQ==} + engines: {node: '>=18.18'} + + '@img/colour@1.1.0': + resolution: {integrity: sha512-Td76q7j57o/tLVdgS746cYARfSyxk8iEfRxewL9h4OMzYhbW4TAcppl0mT4eyqXddh6L/jwoM75mo7ixa/pCeQ==} + engines: {node: '>=18'} + + '@img/sharp-darwin-arm64@0.34.5': + resolution: {integrity: sha512-imtQ3WMJXbMY4fxb/Ndp6HBTNVtWCUI0WdobyheGf5+ad6xX8VIDO8u2xE4qc/fr08CKG/7dDseFtn6M6g/r3w==} + engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0} + cpu: [arm64] + os: [darwin] + + '@img/sharp-darwin-x64@0.34.5': + resolution: {integrity: sha512-YNEFAF/4KQ/PeW0N+r+aVVsoIY0/qxxikF2SWdp+NRkmMB7y9LBZAVqQ4yhGCm/H3H270OSykqmQMKLBhBJDEw==} + engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0} + cpu: [x64] + os: [darwin] + + '@img/sharp-libvips-darwin-arm64@1.2.4': + resolution: {integrity: sha512-zqjjo7RatFfFoP0MkQ51jfuFZBnVE2pRiaydKJ1G/rHZvnsrHAOcQALIi9sA5co5xenQdTugCvtb1cuf78Vf4g==} + cpu: [arm64] + os: [darwin] + + '@img/sharp-libvips-darwin-x64@1.2.4': + resolution: {integrity: sha512-1IOd5xfVhlGwX+zXv2N93k0yMONvUlANylbJw1eTah8K/Jtpi15KC+WSiaX/nBmbm2HxRM1gZ0nSdjSsrZbGKg==} + cpu: [x64] + os: [darwin] + + '@img/sharp-libvips-linux-arm64@1.2.4': + resolution: {integrity: sha512-excjX8DfsIcJ10x1Kzr4RcWe1edC9PquDRRPx3YVCvQv+U5p7Yin2s32ftzikXojb1PIFc/9Mt28/y+iRklkrw==} + cpu: [arm64] + os: [linux] + libc: [glibc] + + '@img/sharp-libvips-linux-arm@1.2.4': + resolution: {integrity: sha512-bFI7xcKFELdiNCVov8e44Ia4u2byA+l3XtsAj+Q8tfCwO6BQ8iDojYdvoPMqsKDkuoOo+X6HZA0s0q11ANMQ8A==} + cpu: [arm] + os: [linux] + libc: [glibc] + + '@img/sharp-libvips-linux-ppc64@1.2.4': + resolution: {integrity: sha512-FMuvGijLDYG6lW+b/UvyilUWu5Ayu+3r2d1S8notiGCIyYU/76eig1UfMmkZ7vwgOrzKzlQbFSuQfgm7GYUPpA==} + cpu: [ppc64] + os: [linux] + libc: [glibc] + + '@img/sharp-libvips-linux-riscv64@1.2.4': + resolution: {integrity: sha512-oVDbcR4zUC0ce82teubSm+x6ETixtKZBh/qbREIOcI3cULzDyb18Sr/Wcyx7NRQeQzOiHTNbZFF1UwPS2scyGA==} + cpu: [riscv64] + os: [linux] + libc: [glibc] + + '@img/sharp-libvips-linux-s390x@1.2.4': + resolution: {integrity: sha512-qmp9VrzgPgMoGZyPvrQHqk02uyjA0/QrTO26Tqk6l4ZV0MPWIW6LTkqOIov+J1yEu7MbFQaDpwdwJKhbJvuRxQ==} + cpu: [s390x] + os: [linux] + libc: [glibc] + + '@img/sharp-libvips-linux-x64@1.2.4': + resolution: {integrity: sha512-tJxiiLsmHc9Ax1bz3oaOYBURTXGIRDODBqhveVHonrHJ9/+k89qbLl0bcJns+e4t4rvaNBxaEZsFtSfAdquPrw==} + cpu: [x64] + os: [linux] + libc: [glibc] + + '@img/sharp-libvips-linuxmusl-arm64@1.2.4': + resolution: {integrity: sha512-FVQHuwx1IIuNow9QAbYUzJ+En8KcVm9Lk5+uGUQJHaZmMECZmOlix9HnH7n1TRkXMS0pGxIJokIVB9SuqZGGXw==} + cpu: [arm64] + os: [linux] + libc: [musl] + + '@img/sharp-libvips-linuxmusl-x64@1.2.4': + resolution: {integrity: sha512-+LpyBk7L44ZIXwz/VYfglaX/okxezESc6UxDSoyo2Ks6Jxc4Y7sGjpgU9s4PMgqgjj1gZCylTieNamqA1MF7Dg==} + cpu: [x64] + os: [linux] + libc: [musl] + + '@img/sharp-linux-arm64@0.34.5': + resolution: {integrity: sha512-bKQzaJRY/bkPOXyKx5EVup7qkaojECG6NLYswgktOZjaXecSAeCWiZwwiFf3/Y+O1HrauiE3FVsGxFg8c24rZg==} + engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0} + cpu: [arm64] + os: [linux] + libc: [glibc] + + '@img/sharp-linux-arm@0.34.5': + resolution: {integrity: sha512-9dLqsvwtg1uuXBGZKsxem9595+ujv0sJ6Vi8wcTANSFpwV/GONat5eCkzQo/1O6zRIkh0m/8+5BjrRr7jDUSZw==} + engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0} + cpu: [arm] + os: [linux] + libc: [glibc] + + '@img/sharp-linux-ppc64@0.34.5': + resolution: {integrity: sha512-7zznwNaqW6YtsfrGGDA6BRkISKAAE1Jo0QdpNYXNMHu2+0dTrPflTLNkpc8l7MUP5M16ZJcUvysVWWrMefZquA==} + engines: {node: 
^18.17.0 || ^20.3.0 || >=21.0.0} + cpu: [ppc64] + os: [linux] + libc: [glibc] + + '@img/sharp-linux-riscv64@0.34.5': + resolution: {integrity: sha512-51gJuLPTKa7piYPaVs8GmByo7/U7/7TZOq+cnXJIHZKavIRHAP77e3N2HEl3dgiqdD/w0yUfiJnII77PuDDFdw==} + engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0} + cpu: [riscv64] + os: [linux] + libc: [glibc] + + '@img/sharp-linux-s390x@0.34.5': + resolution: {integrity: sha512-nQtCk0PdKfho3eC5MrbQoigJ2gd1CgddUMkabUj+rBevs8tZ2cULOx46E7oyX+04WGfABgIwmMC0VqieTiR4jg==} + engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0} + cpu: [s390x] + os: [linux] + libc: [glibc] + + '@img/sharp-linux-x64@0.34.5': + resolution: {integrity: sha512-MEzd8HPKxVxVenwAa+JRPwEC7QFjoPWuS5NZnBt6B3pu7EG2Ge0id1oLHZpPJdn3OQK+BQDiw9zStiHBTJQQQQ==} + engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0} + cpu: [x64] + os: [linux] + libc: [glibc] + + '@img/sharp-linuxmusl-arm64@0.34.5': + resolution: {integrity: sha512-fprJR6GtRsMt6Kyfq44IsChVZeGN97gTD331weR1ex1c1rypDEABN6Tm2xa1wE6lYb5DdEnk03NZPqA7Id21yg==} + engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0} + cpu: [arm64] + os: [linux] + libc: [musl] + + '@img/sharp-linuxmusl-x64@0.34.5': + resolution: {integrity: sha512-Jg8wNT1MUzIvhBFxViqrEhWDGzqymo3sV7z7ZsaWbZNDLXRJZoRGrjulp60YYtV4wfY8VIKcWidjojlLcWrd8Q==} + engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0} + cpu: [x64] + os: [linux] + libc: [musl] + + '@img/sharp-wasm32@0.34.5': + resolution: {integrity: sha512-OdWTEiVkY2PHwqkbBI8frFxQQFekHaSSkUIJkwzclWZe64O1X4UlUjqqqLaPbUpMOQk6FBu/HtlGXNblIs0huw==} + engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0} + cpu: [wasm32] + + '@img/sharp-win32-arm64@0.34.5': + resolution: {integrity: sha512-WQ3AgWCWYSb2yt+IG8mnC6Jdk9Whs7O0gxphblsLvdhSpSTtmu69ZG1Gkb6NuvxsNACwiPV6cNSZNzt0KPsw7g==} + engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0} + cpu: [arm64] + os: [win32] + + '@img/sharp-win32-ia32@0.34.5': + resolution: {integrity: sha512-FV9m/7NmeCmSHDD5j4+4pNI8Cp3aW+JvLoXcTUo0IqyjSfAZJ8dIUmijx1qaJsIiU+Hosw6xM5KijAWRJCSgNg==} + engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0} + cpu: [ia32] + os: [win32] + + '@img/sharp-win32-x64@0.34.5': + resolution: {integrity: sha512-+29YMsqY2/9eFEiW93eqWnuLcWcufowXewwSNIT6UwZdUUCrM3oFjMWH/Z6/TMmb4hlFenmfAVbpWeup2jryCw==} + engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0} + cpu: [x64] + os: [win32] + + '@inquirer/ansi@1.0.2': + resolution: {integrity: sha512-S8qNSZiYzFd0wAcyG5AXCvUHC5Sr7xpZ9wZ2py9XR88jUz8wooStVx5M6dRzczbBWjic9NP7+rY0Xi7qqK/aMQ==} + engines: {node: '>=18'} + + '@inquirer/confirm@5.1.21': + resolution: {integrity: sha512-KR8edRkIsUayMXV+o3Gv+q4jlhENF9nMYUZs9PA2HzrXeHI8M5uDag70U7RJn9yyiMZSbtF5/UexBtAVtZGSbQ==} + engines: {node: '>=18'} + peerDependencies: + '@types/node': '>=18' + peerDependenciesMeta: + '@types/node': + optional: true + + '@inquirer/core@10.3.2': + resolution: {integrity: sha512-43RTuEbfP8MbKzedNqBrlhhNKVwoK//vUFNW3Q3vZ88BLcrs4kYpGg+B2mm5p2K/HfygoCxuKwJJiv8PbGmE0A==} + engines: {node: '>=18'} + peerDependencies: + '@types/node': '>=18' + peerDependenciesMeta: + '@types/node': + optional: true + + '@inquirer/figures@1.0.15': + resolution: {integrity: sha512-t2IEY+unGHOzAaVM5Xx6DEWKeXlDDcNPeDyUpsRc6CUhBfU3VQOEl+Vssh7VNp1dR8MdUJBWhuObjXCsVpjN5g==} + engines: {node: '>=18'} + + '@inquirer/type@3.0.10': + resolution: {integrity: sha512-BvziSRxfz5Ov8ch0z/n3oijRSEcEsHnhggm4xFZe93DHcUCTlutlq9Ox4SVENAfcRD22UQq7T/atg9Wr3k09eA==} + engines: {node: '>=18'} + peerDependencies: + '@types/node': '>=18' + peerDependenciesMeta: + '@types/node': + optional: true + + '@jridgewell/gen-mapping@0.3.13': + resolution: 
{integrity: sha512-2kkt/7niJ6MgEPxF0bYdQ6etZaA+fQvDcLKckhy1yIQOzaoKjBBjSj63/aLVjYE3qhRt5dvM+uUyfCg6UKCBbA==} + + '@jridgewell/remapping@2.3.5': + resolution: {integrity: sha512-LI9u/+laYG4Ds1TDKSJW2YPrIlcVYOwi2fUC6xB43lueCjgxV4lffOCZCtYFiH6TNOX+tQKXx97T4IKHbhyHEQ==} + + '@jridgewell/resolve-uri@3.1.2': + resolution: {integrity: sha512-bRISgCIjP20/tbWSPWMEi54QVPRZExkuD9lJL+UIxUKtwVJA8wW1Trb1jMs1RFXo1CBTNZ/5hpC9QvmKWdopKw==} + engines: {node: '>=6.0.0'} + + '@jridgewell/sourcemap-codec@1.5.5': + resolution: {integrity: sha512-cYQ9310grqxueWbl+WuIUIaiUaDcj7WOq5fVhEljNVgRfOUhY9fy2zTvfoqWsnebh8Sl70VScFbICvJnLKB0Og==} + + '@jridgewell/trace-mapping@0.3.31': + resolution: {integrity: sha512-zzNR+SdQSDJzc8joaeP8QQoCQr8NuYx2dIIytl1QeBEZHJ9uW6hebsrYgbz8hJwUQao3TWCMtmfV8Nu1twOLAw==} + + '@modelcontextprotocol/sdk@1.27.1': + resolution: {integrity: sha512-sr6GbP+4edBwFndLbM60gf07z0FQ79gaExpnsjMGePXqFcSSb7t6iscpjk9DhFhwd+mTEQrzNafGP8/iGGFYaA==} + engines: {node: '>=18'} + peerDependencies: + '@cfworker/json-schema': ^4.1.1 + zod: ^3.25 || ^4.0 + peerDependenciesMeta: + '@cfworker/json-schema': + optional: true + + '@mswjs/interceptors@0.41.3': + resolution: {integrity: sha512-cXu86tF4VQVfwz8W1SPbhoRyHJkti6mjH/XJIxp40jhO4j2k1m4KYrEykxqWPkFF3vrK4rgQppBh//AwyGSXPA==} + engines: {node: '>=18'} + + '@napi-rs/wasm-runtime@0.2.12': + resolution: {integrity: sha512-ZVWUcfwY4E/yPitQJl481FjFo3K22D6qF0DuFH6Y/nbnE11GY5uguDxZMGXPQ8WQ0128MXQD7TnfHyK4oWoIJQ==} + + '@next/env@16.1.7': + resolution: {integrity: sha512-rJJbIdJB/RQr2F1nylZr/PJzamvNNhfr3brdKP6s/GW850jbtR70QlSfFselvIBbcPUOlQwBakexjFzqLzF6pg==} + + '@next/eslint-plugin-next@16.1.7': + resolution: {integrity: sha512-v/bRGOJlfRCO+NDKt0bZlIIWjhMKU8xbgEQBo+rV9C8S6czZvs96LZ/v24/GvpEnovZlL4QDpku/RzWHVbmPpA==} + + '@next/swc-darwin-arm64@16.1.7': + resolution: {integrity: sha512-b2wWIE8sABdyafc4IM8r5Y/dS6kD80JRtOGrUiKTsACFQfWWgUQ2NwoUX1yjFMXVsAwcQeNpnucF2ZrujsBBPg==} + engines: {node: '>= 10'} + cpu: [arm64] + os: [darwin] + + '@next/swc-darwin-x64@16.1.7': + resolution: {integrity: sha512-zcnVaaZulS1WL0Ss38R5Q6D2gz7MtBu8GZLPfK+73D/hp4GFMrC2sudLky1QibfV7h6RJBJs/gOFvYP0X7UVlQ==} + engines: {node: '>= 10'} + cpu: [x64] + os: [darwin] + + '@next/swc-linux-arm64-gnu@16.1.7': + resolution: {integrity: sha512-2ant89Lux/Q3VyC8vNVg7uBaFVP9SwoK2jJOOR0L8TQnX8CAYnh4uctAScy2Hwj2dgjVHqHLORQZJ2wH6VxhSQ==} + engines: {node: '>= 10'} + cpu: [arm64] + os: [linux] + libc: [glibc] + + '@next/swc-linux-arm64-musl@16.1.7': + resolution: {integrity: sha512-uufcze7LYv0FQg9GnNeZ3/whYfo+1Q3HnQpm16o6Uyi0OVzLlk2ZWoY7j07KADZFY8qwDbsmFnMQP3p3+Ftprw==} + engines: {node: '>= 10'} + cpu: [arm64] + os: [linux] + libc: [musl] + + '@next/swc-linux-x64-gnu@16.1.7': + resolution: {integrity: sha512-KWVf2gxYvHtvuT+c4MBOGxuse5TD7DsMFYSxVxRBnOzok/xryNeQSjXgxSv9QpIVlaGzEn/pIuI6Koosx8CGWA==} + engines: {node: '>= 10'} + cpu: [x64] + os: [linux] + libc: [glibc] + + '@next/swc-linux-x64-musl@16.1.7': + resolution: {integrity: sha512-HguhaGwsGr1YAGs68uRKc4aGWxLET+NevJskOcCAwXbwj0fYX0RgZW2gsOCzr9S11CSQPIkxmoSbuVaBp4Z3dA==} + engines: {node: '>= 10'} + cpu: [x64] + os: [linux] + libc: [musl] + + '@next/swc-win32-arm64-msvc@16.1.7': + resolution: {integrity: sha512-S0n3KrDJokKTeFyM/vGGGR8+pCmXYrjNTk2ZozOL1C/JFdfUIL9O1ATaJOl5r2POe56iRChbsszrjMAdWSv7kQ==} + engines: {node: '>= 10'} + cpu: [arm64] + os: [win32] + + '@next/swc-win32-x64-msvc@16.1.7': + resolution: {integrity: sha512-mwgtg8CNZGYm06LeEd+bNnOUfwOyNem/rOiP14Lsz+AnUY92Zq/LXwtebtUiaeVkhbroRCQ0c8GlR4UT1U+0yg==} + engines: {node: '>= 10'} + cpu: [x64] + 
os: [win32] + + '@noble/ciphers@1.3.0': + resolution: {integrity: sha512-2I0gnIVPtfnMw9ee9h1dJG7tp81+8Ob3OJb3Mv37rx5L40/b0i7djjCVvGOVqc9AEIQyvyu1i6ypKdFw8R8gQw==} + engines: {node: ^14.21.3 || >=16} + + '@noble/curves@1.9.7': + resolution: {integrity: sha512-gbKGcRUYIjA3/zCCNaWDciTMFI0dCkvou3TL8Zmy5Nc7sJ47a0jtOeZoTaMxkuqRo9cRhjOdZJXegxYE5FN/xw==} + engines: {node: ^14.21.3 || >=16} + + '@noble/hashes@1.8.0': + resolution: {integrity: sha512-jCs9ldd7NwzpgXDIf6P3+NrHh9/sD6CQdxHyjQI+h/6rDNo88ypBxxz45UDuZHz9r3tNz7N/VInSVoVdtXEI4A==} + engines: {node: ^14.21.3 || >=16} + + '@nodelib/fs.scandir@2.1.5': + resolution: {integrity: sha512-vq24Bq3ym5HEQm2NKCr3yXDwjc7vTsEThRDnkp2DK9p1uqLR+DHurm/NOTo0KG7HYHU7eppKZj3MyqYuMBf62g==} + engines: {node: '>= 8'} + + '@nodelib/fs.stat@2.0.5': + resolution: {integrity: sha512-RkhPPp2zrqDAQA/2jNhnztcPAlv64XdhIp7a7454A5ovI7Bukxgt7MX7udwAu3zg1DcpPU0rz3VV1SeaqvY4+A==} + engines: {node: '>= 8'} + + '@nodelib/fs.walk@1.2.8': + resolution: {integrity: sha512-oGB+UxlgWcgQkgwo8GcEGwemoTFt3FIO9ababBmaGwXIoBKZ+GTy0pP185beGg7Llih/NSHSV2XAs1lnznocSg==} + engines: {node: '>= 8'} + + '@nolyfill/is-core-module@1.0.39': + resolution: {integrity: sha512-nn5ozdjYQpUCZlWGuxcJY/KpxkWQs4DcbMCmKojjyrYDEAGy4Ce19NN4v5MduafTwJlbKc99UA8YhSVqq9yPZA==} + engines: {node: '>=12.4.0'} + + '@open-draft/deferred-promise@2.2.0': + resolution: {integrity: sha512-CecwLWx3rhxVQF6V4bAgPS5t+So2sTbPgAzafKkVizyi7tlwpcFpdFqq+wqF2OwNBmqFuu6tOyouTuxgpMfzmA==} + + '@open-draft/logger@0.3.0': + resolution: {integrity: sha512-X2g45fzhxH238HKO4xbSr7+wBS8Fvw6ixhTDuvLd5mqh6bJJCFAPwU9mPDxbcrRtfxv4u5IHCEH77BmxvXmmxQ==} + + '@open-draft/until@2.1.0': + resolution: {integrity: sha512-U69T3ItWHvLwGg5eJ0n3I62nWuE6ilHlmz7zM0npLBRvPRd7e6NYmg54vvRtP5mZG7kZqZCFVdsTWo7BPtBujg==} + + '@rtsao/scc@1.1.0': + resolution: {integrity: sha512-zt6OdqaDoOnJ1ZYsCYGt9YmWzDXl4vQdKTyJev62gFhRGKdx7mcT54V9KIjg+d2wi9EXsPvAPKe7i7WjfVWB8g==} + + '@sec-ant/readable-stream@0.4.1': + resolution: {integrity: sha512-831qok9r2t8AlxLko40y2ebgSDhenenCatLVeW/uBtnHPyhHOvG0C7TvfgecV+wHzIm5KUICgzmVpWS+IMEAeg==} + + '@sindresorhus/merge-streams@4.0.0': + resolution: {integrity: sha512-tlqY9xq5ukxTUZBmoOp+m61cqwQD5pHJtFY3Mn8CA8ps6yghLH/Hw8UPdqg4OLmFW3IFlcXnQNmo/dh8HzXYIQ==} + engines: {node: '>=18'} + + '@swc/helpers@0.5.15': + resolution: {integrity: sha512-JQ5TuMi45Owi4/BIMAJBoSQoOJu12oOk/gADqlcUL9JEdHB8vyjUSsxqeNXnmXHjYKMi2WcYtezGEEhqUI/E2g==} + + '@tailwindcss/node@4.2.2': + resolution: {integrity: sha512-pXS+wJ2gZpVXqFaUEjojq7jzMpTGf8rU6ipJz5ovJV6PUGmlJ+jvIwGrzdHdQ80Sg+wmQxUFuoW1UAAwHNEdFA==} + + '@tailwindcss/oxide-android-arm64@4.2.2': + resolution: {integrity: sha512-dXGR1n+P3B6748jZO/SvHZq7qBOqqzQ+yFrXpoOWWALWndF9MoSKAT3Q0fYgAzYzGhxNYOoysRvYlpixRBBoDg==} + engines: {node: '>= 20'} + cpu: [arm64] + os: [android] + + '@tailwindcss/oxide-darwin-arm64@4.2.2': + resolution: {integrity: sha512-iq9Qjr6knfMpZHj55/37ouZeykwbDqF21gPFtfnhCCKGDcPI/21FKC9XdMO/XyBM7qKORx6UIhGgg6jLl7BZlg==} + engines: {node: '>= 20'} + cpu: [arm64] + os: [darwin] + + '@tailwindcss/oxide-darwin-x64@4.2.2': + resolution: {integrity: sha512-BlR+2c3nzc8f2G639LpL89YY4bdcIdUmiOOkv2GQv4/4M0vJlpXEa0JXNHhCHU7VWOKWT/CjqHdTP8aUuDJkuw==} + engines: {node: '>= 20'} + cpu: [x64] + os: [darwin] + + '@tailwindcss/oxide-freebsd-x64@4.2.2': + resolution: {integrity: sha512-YUqUgrGMSu2CDO82hzlQ5qSb5xmx3RUrke/QgnoEx7KvmRJHQuZHZmZTLSuuHwFf0DJPybFMXMYf+WJdxHy/nQ==} + engines: {node: '>= 20'} + cpu: [x64] + os: [freebsd] + + '@tailwindcss/oxide-linux-arm-gnueabihf@4.2.2': + resolution: {integrity: 
sha512-FPdhvsW6g06T9BWT0qTwiVZYE2WIFo2dY5aCSpjG/S/u1tby+wXoslXS0kl3/KXnULlLr1E3NPRRw0g7t2kgaQ==} + engines: {node: '>= 20'} + cpu: [arm] + os: [linux] + + '@tailwindcss/oxide-linux-arm64-gnu@4.2.2': + resolution: {integrity: sha512-4og1V+ftEPXGttOO7eCmW7VICmzzJWgMx+QXAJRAhjrSjumCwWqMfkDrNu1LXEQzNAwz28NCUpucgQPrR4S2yw==} + engines: {node: '>= 20'} + cpu: [arm64] + os: [linux] + libc: [glibc] + + '@tailwindcss/oxide-linux-arm64-musl@4.2.2': + resolution: {integrity: sha512-oCfG/mS+/+XRlwNjnsNLVwnMWYH7tn/kYPsNPh+JSOMlnt93mYNCKHYzylRhI51X+TbR+ufNhhKKzm6QkqX8ag==} + engines: {node: '>= 20'} + cpu: [arm64] + os: [linux] + libc: [musl] + + '@tailwindcss/oxide-linux-x64-gnu@4.2.2': + resolution: {integrity: sha512-rTAGAkDgqbXHNp/xW0iugLVmX62wOp2PoE39BTCGKjv3Iocf6AFbRP/wZT/kuCxC9QBh9Pu8XPkv/zCZB2mcMg==} + engines: {node: '>= 20'} + cpu: [x64] + os: [linux] + libc: [glibc] + + '@tailwindcss/oxide-linux-x64-musl@4.2.2': + resolution: {integrity: sha512-XW3t3qwbIwiSyRCggeO2zxe3KWaEbM0/kW9e8+0XpBgyKU4ATYzcVSMKteZJ1iukJ3HgHBjbg9P5YPRCVUxlnQ==} + engines: {node: '>= 20'} + cpu: [x64] + os: [linux] + libc: [musl] + + '@tailwindcss/oxide-wasm32-wasi@4.2.2': + resolution: {integrity: sha512-eKSztKsmEsn1O5lJ4ZAfyn41NfG7vzCg496YiGtMDV86jz1q/irhms5O0VrY6ZwTUkFy/EKG3RfWgxSI3VbZ8Q==} + engines: {node: '>=14.0.0'} + cpu: [wasm32] + bundledDependencies: + - '@napi-rs/wasm-runtime' + - '@emnapi/core' + - '@emnapi/runtime' + - '@tybys/wasm-util' + - '@emnapi/wasi-threads' + - tslib + + '@tailwindcss/oxide-win32-arm64-msvc@4.2.2': + resolution: {integrity: sha512-qPmaQM4iKu5mxpsrWZMOZRgZv1tOZpUm+zdhhQP0VhJfyGGO3aUKdbh3gDZc/dPLQwW4eSqWGrrcWNBZWUWaXQ==} + engines: {node: '>= 20'} + cpu: [arm64] + os: [win32] + + '@tailwindcss/oxide-win32-x64-msvc@4.2.2': + resolution: {integrity: sha512-1T/37VvI7WyH66b+vqHj/cLwnCxt7Qt3WFu5Q8hk65aOvlwAhs7rAp1VkulBJw/N4tMirXjVnylTR72uI0HGcA==} + engines: {node: '>= 20'} + cpu: [x64] + os: [win32] + + '@tailwindcss/oxide@4.2.2': + resolution: {integrity: sha512-qEUA07+E5kehxYp9BVMpq9E8vnJuBHfJEC0vPC5e7iL/hw7HR61aDKoVoKzrG+QKp56vhNZe4qwkRmMC0zDLvg==} + engines: {node: '>= 20'} + + '@tailwindcss/postcss@4.2.2': + resolution: {integrity: sha512-n4goKQbW8RVXIbNKRB/45LzyUqN451deQK0nzIeauVEqjlI49slUlgKYJM2QyUzap/PcpnS7kzSUmPb1sCRvYQ==} + + '@tanstack/query-core@5.94.5': + resolution: {integrity: sha512-Vx1JJiBURW/wdNGP45afjrqn0LfxYwL7K/bSrQvNRtyLGF1bxQPgUXCpzscG29e+UeFOh9hz1KOVala0N+bZiA==} + + '@tanstack/react-query@5.94.5': + resolution: {integrity: sha512-1wmrxKFkor+q8l+ygdHmv0Sq5g84Q3p4xvuJ7AdSIAhQQ7udOt+ZSZ19g1Jea3mHqtlTslLGJsmC4vHFgP0P3A==} + peerDependencies: + react: ^18 || ^19 + + '@tanstack/react-table@8.21.3': + resolution: {integrity: sha512-5nNMTSETP4ykGegmVkhjcS8tTLW6Vl4axfEGQN3v0zdHYbK4UfoqfPChclTrJ4EoK9QynqAu9oUf8VEmrpZ5Ww==} + engines: {node: '>=12'} + peerDependencies: + react: '>=16.8' + react-dom: '>=16.8' + + '@tanstack/table-core@8.21.3': + resolution: {integrity: sha512-ldZXEhOBb8Is7xLs01fR3YEc3DERiz5silj8tnGkFZytt1abEvl/GhUmCE0PMLaMPTa3Jk4HbKmRlHmu+gCftg==} + engines: {node: '>=12'} + + '@ts-morph/common@0.27.0': + resolution: {integrity: sha512-Wf29UqxWDpc+i61k3oIOzcUfQt79PIT9y/MWfAGlrkjg6lBC1hwDECLXPVJAhWjiGbfBCxZd65F/LIZF3+jeJQ==} + + '@tybys/wasm-util@0.10.1': + resolution: {integrity: sha512-9tTaPJLSiejZKx+Bmog4uSubteqTvFrVrURwkmHixBo0G4seD0zUxp98E1DzUBJxLQ3NPwXrGKDiVjwx/DpPsg==} + + '@types/estree@1.0.8': + resolution: {integrity: sha512-dWHzHa2WqEXI/O1E9OjrocMTKJl2mSrEolh1Iomrv6U+JuNwaHXsXx9bLu5gG7BUWFIN0skIQJQ/L1rIex4X6w==} + + '@types/json-schema@7.0.15': + 
resolution: {integrity: sha512-5+fP8P8MFNC+AyZCDxrB2pkZFPGzqQWUzpSeuuVLvm8VMcorNYavBqoFcxK8bQz4Qsbn4oUEEem4wDLfcysGHA==} + + '@types/json5@0.0.29': + resolution: {integrity: sha512-dRLjCWHYg4oaA77cxO64oO+7JwCwnIzkZPdrrC71jQmQtlhM556pwKo5bUzqvZndkVbeFLIIi+9TC40JNF5hNQ==} + + '@types/node@20.19.37': + resolution: {integrity: sha512-8kzdPJ3FsNsVIurqBs7oodNnCEVbni9yUEkaHbgptDACOPW04jimGagZ51E6+lXUwJjgnBw+hyko/lkFWCldqw==} + + '@types/react-dom@19.2.3': + resolution: {integrity: sha512-jp2L/eY6fn+KgVVQAOqYItbF0VY/YApe5Mz2F0aykSO8gx31bYCZyvSeYxCHKvzHG5eZjc+zyaS5BrBWya2+kQ==} + peerDependencies: + '@types/react': ^19.2.0 + + '@types/react@19.2.14': + resolution: {integrity: sha512-ilcTH/UniCkMdtexkoCN0bI7pMcJDvmQFPvuPvmEaYA/NSfFTAgdUSLAoVjaRJm7+6PvcM+q1zYOwS4wTYMF9w==} + + '@types/statuses@2.0.6': + resolution: {integrity: sha512-xMAgYwceFhRA2zY+XbEA7mxYbA093wdiW8Vu6gZPGWy9cmOyU9XesH1tNcEWsKFd5Vzrqx5T3D38PWx1FIIXkA==} + + '@types/validate-npm-package-name@4.0.2': + resolution: {integrity: sha512-lrpDziQipxCEeK5kWxvljWYhUvOiB2A9izZd9B2AFarYAkqZshb4lPbRs7zKEic6eGtH8V/2qJW+dPp9OtF6bw==} + + '@typescript-eslint/eslint-plugin@8.57.1': + resolution: {integrity: sha512-Gn3aqnvNl4NGc6x3/Bqk1AOn0thyTU9bqDRhiRnUWezgvr2OnhYCWCgC8zXXRVqBsIL1pSDt7T9nJUe0oM0kDQ==} + engines: {node: ^18.18.0 || ^20.9.0 || >=21.1.0} + peerDependencies: + '@typescript-eslint/parser': ^8.57.1 + eslint: ^8.57.0 || ^9.0.0 || ^10.0.0 + typescript: '>=4.8.4 <6.0.0' + + '@typescript-eslint/parser@8.57.1': + resolution: {integrity: sha512-k4eNDan0EIMTT/dUKc/g+rsJ6wcHYhNPdY19VoX/EOtaAG8DLtKCykhrUnuHPYvinn5jhAPgD2Qw9hXBwrahsw==} + engines: {node: ^18.18.0 || ^20.9.0 || >=21.1.0} + peerDependencies: + eslint: ^8.57.0 || ^9.0.0 || ^10.0.0 + typescript: '>=4.8.4 <6.0.0' + + '@typescript-eslint/project-service@8.57.1': + resolution: {integrity: sha512-vx1F37BRO1OftsYlmG9xay1TqnjNVlqALymwWVuYTdo18XuKxtBpCj1QlzNIEHlvlB27osvXFWptYiEWsVdYsg==} + engines: {node: ^18.18.0 || ^20.9.0 || >=21.1.0} + peerDependencies: + typescript: '>=4.8.4 <6.0.0' + + '@typescript-eslint/scope-manager@8.57.1': + resolution: {integrity: sha512-hs/QcpCwlwT2L5S+3fT6gp0PabyGk4Q0Rv2doJXA0435/OpnSR3VRgvrp8Xdoc3UAYSg9cyUjTeFXZEPg/3OKg==} + engines: {node: ^18.18.0 || ^20.9.0 || >=21.1.0} + + '@typescript-eslint/tsconfig-utils@8.57.1': + resolution: {integrity: sha512-0lgOZB8cl19fHO4eI46YUx2EceQqhgkPSuCGLlGi79L2jwYY1cxeYc1Nae8Aw1xjgW3PKVDLlr3YJ6Bxx8HkWg==} + engines: {node: ^18.18.0 || ^20.9.0 || >=21.1.0} + peerDependencies: + typescript: '>=4.8.4 <6.0.0' + + '@typescript-eslint/type-utils@8.57.1': + resolution: {integrity: sha512-+Bwwm0ScukFdyoJsh2u6pp4S9ktegF98pYUU0hkphOOqdMB+1sNQhIz8y5E9+4pOioZijrkfNO/HUJVAFFfPKA==} + engines: {node: ^18.18.0 || ^20.9.0 || >=21.1.0} + peerDependencies: + eslint: ^8.57.0 || ^9.0.0 || ^10.0.0 + typescript: '>=4.8.4 <6.0.0' + + '@typescript-eslint/types@8.57.1': + resolution: {integrity: sha512-S29BOBPJSFUiblEl6RzPPjJt6w25A6XsBqRVDt53tA/tlL8q7ceQNZHTjPeONt/3S7KRI4quk+yP9jK2WjBiPQ==} + engines: {node: ^18.18.0 || ^20.9.0 || >=21.1.0} + + '@typescript-eslint/typescript-estree@8.57.1': + resolution: {integrity: sha512-ybe2hS9G6pXpqGtPli9Gx9quNV0TWLOmh58ADlmZe9DguLq0tiAKVjirSbtM1szG6+QH6rVXyU6GTLQbWnMY+g==} + engines: {node: ^18.18.0 || ^20.9.0 || >=21.1.0} + peerDependencies: + typescript: '>=4.8.4 <6.0.0' + + '@typescript-eslint/utils@8.57.1': + resolution: {integrity: sha512-XUNSJ/lEVFttPMMoDVA2r2bwrl8/oPx8cURtczkSEswY5T3AeLmCy+EKWQNdL4u0MmAHOjcWrqJp2cdvgjn8dQ==} + engines: {node: ^18.18.0 || ^20.9.0 || >=21.1.0} + peerDependencies: + 
eslint: ^8.57.0 || ^9.0.0 || ^10.0.0 + typescript: '>=4.8.4 <6.0.0' + + '@typescript-eslint/visitor-keys@8.57.1': + resolution: {integrity: sha512-YWnmJkXbofiz9KbnbbwuA2rpGkFPLbAIetcCNO6mJ8gdhdZ/v7WDXsoGFAJuM6ikUFKTlSQnjWnVO4ux+UzS6A==} + engines: {node: ^18.18.0 || ^20.9.0 || >=21.1.0} + + '@unrs/resolver-binding-android-arm-eabi@1.11.1': + resolution: {integrity: sha512-ppLRUgHVaGRWUx0R0Ut06Mjo9gBaBkg3v/8AxusGLhsIotbBLuRk51rAzqLC8gq6NyyAojEXglNjzf6R948DNw==} + cpu: [arm] + os: [android] + + '@unrs/resolver-binding-android-arm64@1.11.1': + resolution: {integrity: sha512-lCxkVtb4wp1v+EoN+HjIG9cIIzPkX5OtM03pQYkG+U5O/wL53LC4QbIeazgiKqluGeVEeBlZahHalCaBvU1a2g==} + cpu: [arm64] + os: [android] + + '@unrs/resolver-binding-darwin-arm64@1.11.1': + resolution: {integrity: sha512-gPVA1UjRu1Y/IsB/dQEsp2V1pm44Of6+LWvbLc9SDk1c2KhhDRDBUkQCYVWe6f26uJb3fOK8saWMgtX8IrMk3g==} + cpu: [arm64] + os: [darwin] + + '@unrs/resolver-binding-darwin-x64@1.11.1': + resolution: {integrity: sha512-cFzP7rWKd3lZaCsDze07QX1SC24lO8mPty9vdP+YVa3MGdVgPmFc59317b2ioXtgCMKGiCLxJ4HQs62oz6GfRQ==} + cpu: [x64] + os: [darwin] + + '@unrs/resolver-binding-freebsd-x64@1.11.1': + resolution: {integrity: sha512-fqtGgak3zX4DCB6PFpsH5+Kmt/8CIi4Bry4rb1ho6Av2QHTREM+47y282Uqiu3ZRF5IQioJQ5qWRV6jduA+iGw==} + cpu: [x64] + os: [freebsd] + + '@unrs/resolver-binding-linux-arm-gnueabihf@1.11.1': + resolution: {integrity: sha512-u92mvlcYtp9MRKmP+ZvMmtPN34+/3lMHlyMj7wXJDeXxuM0Vgzz0+PPJNsro1m3IZPYChIkn944wW8TYgGKFHw==} + cpu: [arm] + os: [linux] + + '@unrs/resolver-binding-linux-arm-musleabihf@1.11.1': + resolution: {integrity: sha512-cINaoY2z7LVCrfHkIcmvj7osTOtm6VVT16b5oQdS4beibX2SYBwgYLmqhBjA1t51CarSaBuX5YNsWLjsqfW5Cw==} + cpu: [arm] + os: [linux] + + '@unrs/resolver-binding-linux-arm64-gnu@1.11.1': + resolution: {integrity: sha512-34gw7PjDGB9JgePJEmhEqBhWvCiiWCuXsL9hYphDF7crW7UgI05gyBAi6MF58uGcMOiOqSJ2ybEeCvHcq0BCmQ==} + cpu: [arm64] + os: [linux] + libc: [glibc] + + '@unrs/resolver-binding-linux-arm64-musl@1.11.1': + resolution: {integrity: sha512-RyMIx6Uf53hhOtJDIamSbTskA99sPHS96wxVE/bJtePJJtpdKGXO1wY90oRdXuYOGOTuqjT8ACccMc4K6QmT3w==} + cpu: [arm64] + os: [linux] + libc: [musl] + + '@unrs/resolver-binding-linux-ppc64-gnu@1.11.1': + resolution: {integrity: sha512-D8Vae74A4/a+mZH0FbOkFJL9DSK2R6TFPC9M+jCWYia/q2einCubX10pecpDiTmkJVUH+y8K3BZClycD8nCShA==} + cpu: [ppc64] + os: [linux] + libc: [glibc] + + '@unrs/resolver-binding-linux-riscv64-gnu@1.11.1': + resolution: {integrity: sha512-frxL4OrzOWVVsOc96+V3aqTIQl1O2TjgExV4EKgRY09AJ9leZpEg8Ak9phadbuX0BA4k8U5qtvMSQQGGmaJqcQ==} + cpu: [riscv64] + os: [linux] + libc: [glibc] + + '@unrs/resolver-binding-linux-riscv64-musl@1.11.1': + resolution: {integrity: sha512-mJ5vuDaIZ+l/acv01sHoXfpnyrNKOk/3aDoEdLO/Xtn9HuZlDD6jKxHlkN8ZhWyLJsRBxfv9GYM2utQ1SChKew==} + cpu: [riscv64] + os: [linux] + libc: [musl] + + '@unrs/resolver-binding-linux-s390x-gnu@1.11.1': + resolution: {integrity: sha512-kELo8ebBVtb9sA7rMe1Cph4QHreByhaZ2QEADd9NzIQsYNQpt9UkM9iqr2lhGr5afh885d/cB5QeTXSbZHTYPg==} + cpu: [s390x] + os: [linux] + libc: [glibc] + + '@unrs/resolver-binding-linux-x64-gnu@1.11.1': + resolution: {integrity: sha512-C3ZAHugKgovV5YvAMsxhq0gtXuwESUKc5MhEtjBpLoHPLYM+iuwSj3lflFwK3DPm68660rZ7G8BMcwSro7hD5w==} + cpu: [x64] + os: [linux] + libc: [glibc] + + '@unrs/resolver-binding-linux-x64-musl@1.11.1': + resolution: {integrity: sha512-rV0YSoyhK2nZ4vEswT/QwqzqQXw5I6CjoaYMOX0TqBlWhojUf8P94mvI7nuJTeaCkkds3QE4+zS8Ko+GdXuZtA==} + cpu: [x64] + os: [linux] + libc: [musl] + + '@unrs/resolver-binding-wasm32-wasi@1.11.1': + resolution: 
{integrity: sha512-5u4RkfxJm+Ng7IWgkzi3qrFOvLvQYnPBmjmZQ8+szTK/b31fQCnleNl1GgEt7nIsZRIf5PLhPwT0WM+q45x/UQ==} + engines: {node: '>=14.0.0'} + cpu: [wasm32] + + '@unrs/resolver-binding-win32-arm64-msvc@1.11.1': + resolution: {integrity: sha512-nRcz5Il4ln0kMhfL8S3hLkxI85BXs3o8EYoattsJNdsX4YUU89iOkVn7g0VHSRxFuVMdM4Q1jEpIId1Ihim/Uw==} + cpu: [arm64] + os: [win32] + + '@unrs/resolver-binding-win32-ia32-msvc@1.11.1': + resolution: {integrity: sha512-DCEI6t5i1NmAZp6pFonpD5m7i6aFrpofcp4LA2i8IIq60Jyo28hamKBxNrZcyOwVOZkgsRp9O2sXWBWP8MnvIQ==} + cpu: [ia32] + os: [win32] + + '@unrs/resolver-binding-win32-x64-msvc@1.11.1': + resolution: {integrity: sha512-lrW200hZdbfRtztbygyaq/6jP6AKE8qQN2KvPcJ+x7wiD038YtnYtZ82IMNJ69GJibV7bwL3y9FgK+5w/pYt6g==} + cpu: [x64] + os: [win32] + + accepts@2.0.0: + resolution: {integrity: sha512-5cvg6CtKwfgdmVqY1WIiXKc3Q1bkRqGLi+2W/6ao+6Y7gu/RCwRuAhGEzh5B4KlszSuTLgZYuqFqo5bImjNKng==} + engines: {node: '>= 0.6'} + + acorn-jsx@5.3.2: + resolution: {integrity: sha512-rq9s+JNhf0IChjtDXxllJ7g41oZk5SlXtp0LHwyA5cejwn7vKmKp4pPri6YEePv2PU65sAsegbXtIinmDFDXgQ==} + peerDependencies: + acorn: ^6.0.0 || ^7.0.0 || ^8.0.0 + + acorn@8.16.0: + resolution: {integrity: sha512-UVJyE9MttOsBQIDKw1skb9nAwQuR5wuGD3+82K6JgJlm/Y+KI92oNsMNGZCYdDsVtRHSak0pcV5Dno5+4jh9sw==} + engines: {node: '>=0.4.0'} + hasBin: true + + agent-base@7.1.4: + resolution: {integrity: sha512-MnA+YT8fwfJPgBx3m60MNqakm30XOkyIoH1y6huTQvC0PwZG7ki8NacLBcrPbNoo8vEZy7Jpuk7+jMO+CUovTQ==} + engines: {node: '>= 14'} + + ajv-formats@3.0.1: + resolution: {integrity: sha512-8iUql50EUR+uUcdRQ3HDqa6EVyo3docL8g5WJ3FNcWmu62IbkGUue/pEyLBW8VGKKucTPgqeks4fIU1DA4yowQ==} + peerDependencies: + ajv: ^8.0.0 + peerDependenciesMeta: + ajv: + optional: true + + ajv@6.14.0: + resolution: {integrity: sha512-IWrosm/yrn43eiKqkfkHis7QioDleaXQHdDVPKg0FSwwd/DuvyX79TZnFOnYpB7dcsFAMmtFztZuXPDvSePkFw==} + + ajv@8.18.0: + resolution: {integrity: sha512-PlXPeEWMXMZ7sPYOHqmDyCJzcfNrUr3fGNKtezX14ykXOEIvyK81d+qydx89KY5O71FKMPaQ2vBfBFI5NHR63A==} + + ansi-regex@5.0.1: + resolution: {integrity: sha512-quJQXlTSUGL2LH9SUXo8VwsY4soanhgo6LNSm84E1LBcE8s3O0wpdiRzyR9z/ZZJMlMWv37qOOb9pdJlMUEKFQ==} + engines: {node: '>=8'} + + ansi-regex@6.2.2: + resolution: {integrity: sha512-Bq3SmSpyFHaWjPk8If9yc6svM8c56dB5BAtW4Qbw5jHTwwXXcTLoRMkpDJp6VL0XzlWaCHTXrkFURMYmD0sLqg==} + engines: {node: '>=12'} + + ansi-styles@4.3.0: + resolution: {integrity: sha512-zbB9rCJAT1rbjiVDb2hqKFHNYLxgtk8NURxZ3IZwD3F6NtxbXZQCnnSi1Lkx+IDohdPlFp222wVALIheZJQSEg==} + engines: {node: '>=8'} + + argparse@2.0.1: + resolution: {integrity: sha512-8+9WqebbFzpX9OR+Wa6O29asIogeRMzcGtAINdpMHHyAg10f05aSFVBbcEqGf/PXw1EjAZ+q2/bEBg3DvurK3Q==} + + aria-query@5.3.2: + resolution: {integrity: sha512-COROpnaoap1E2F000S62r6A60uHZnmlvomhfyT2DlTcrY1OrBKn2UhH7qn5wTC9zMvD0AY7csdPSNwKP+7WiQw==} + engines: {node: '>= 0.4'} + + array-buffer-byte-length@1.0.2: + resolution: {integrity: sha512-LHE+8BuR7RYGDKvnrmcuSq3tDcKv9OFEXQt/HpbZhY7V6h0zlUXutnAD82GiFx9rdieCMjkvtcsPqBwgUl1Iiw==} + engines: {node: '>= 0.4'} + + array-includes@3.1.9: + resolution: {integrity: sha512-FmeCCAenzH0KH381SPT5FZmiA/TmpndpcaShhfgEN9eCVjnFBqq3l1xrI42y8+PPLI6hypzou4GXw00WHmPBLQ==} + engines: {node: '>= 0.4'} + + array.prototype.findlast@1.2.5: + resolution: {integrity: sha512-CVvd6FHg1Z3POpBLxO6E6zr+rSKEQ9L6rZHAaY7lLfhKsWYUBBOuMs0e9o24oopj6H+geRCX0YJ+TJLBK2eHyQ==} + engines: {node: '>= 0.4'} + + array.prototype.findlastindex@1.2.6: + resolution: {integrity: sha512-F/TKATkzseUExPlfvmwQKGITM3DGTK+vkAsCZoDc5daVygbJBnjEUCbgkAvVFsgfXfX4YIqZ/27G3k3tdXrTxQ==} + engines: 
{node: '>= 0.4'} + + array.prototype.flat@1.3.3: + resolution: {integrity: sha512-rwG/ja1neyLqCuGZ5YYrznA62D4mZXg0i1cIskIUKSiqF3Cje9/wXAls9B9s1Wa2fomMsIv8czB8jZcPmxCXFg==} + engines: {node: '>= 0.4'} + + array.prototype.flatmap@1.3.3: + resolution: {integrity: sha512-Y7Wt51eKJSyi80hFrJCePGGNo5ktJCslFuboqJsbf57CCPcm5zztluPlc4/aD8sWsKvlwatezpV4U1efk8kpjg==} + engines: {node: '>= 0.4'} + + array.prototype.tosorted@1.1.4: + resolution: {integrity: sha512-p6Fx8B7b7ZhL/gmUsAy0D15WhvDccw3mnGNbZpi3pmeJdxtWsj2jEaI4Y6oo3XiHfzuSgPwKc04MYt6KgvC/wA==} + engines: {node: '>= 0.4'} + + arraybuffer.prototype.slice@1.0.4: + resolution: {integrity: sha512-BNoCY6SXXPQ7gF2opIP4GBE+Xw7U+pHMYKuzjgCN3GwiaIR09UUeKfheyIry77QtrCBlC0KK0q5/TER/tYh3PQ==} + engines: {node: '>= 0.4'} + + ast-types-flow@0.0.8: + resolution: {integrity: sha512-OH/2E5Fg20h2aPrbe+QL8JZQFko0YZaF+j4mnQ7BGhfavO7OpSLa8a0y9sBwomHdSbkhTS8TQNayBfnW5DwbvQ==} + + ast-types@0.16.1: + resolution: {integrity: sha512-6t10qk83GOG8p0vKmaCr8eiilZwO171AvbROMtvvNiwrTly62t+7XkA8RdIIVbpMhCASAsxgAzdRSwh6nw/5Dg==} + engines: {node: '>=4'} + + async-function@1.0.0: + resolution: {integrity: sha512-hsU18Ae8CDTR6Kgu9DYf0EbCr/a5iGL0rytQDobUcdpYOKokk8LEjVphnXkDkgpi0wYVsqrXuP0bZxJaTqdgoA==} + engines: {node: '>= 0.4'} + + available-typed-arrays@1.0.7: + resolution: {integrity: sha512-wvUjBtSGN7+7SjNpq/9M2Tg350UZD3q62IFZLbRAR1bSMlCo1ZaeW+BJ+D090e4hIIZLBcTDWe4Mh4jvUDajzQ==} + engines: {node: '>= 0.4'} + + axe-core@4.11.1: + resolution: {integrity: sha512-BASOg+YwO2C+346x3LZOeoovTIoTrRqEsqMa6fmfAV0P+U9mFr9NsyOEpiYvFjbc64NMrSswhV50WdXzdb/Z5A==} + engines: {node: '>=4'} + + axobject-query@4.1.0: + resolution: {integrity: sha512-qIj0G9wZbMGNLjLmg1PT6v2mE9AH2zlnADJD/2tC6E00hgmhUOfEB6greHPAfLRSufHqROIUTkw6E+M3lH0PTQ==} + engines: {node: '>= 0.4'} + + balanced-match@1.0.2: + resolution: {integrity: sha512-3oSeUO0TMV67hN1AmbXsK4yaqU7tjiHlbxRDZOpH0KW9+CeX4bRAaX0Anxt0tx2MrpRpWwQaPwIlISEJhYU5Pw==} + + balanced-match@4.0.4: + resolution: {integrity: sha512-BLrgEcRTwX2o6gGxGOCNyMvGSp35YofuYzw9h1IMTRmKqttAZZVU67bdb9Pr2vUHA8+j3i2tJfjO6C6+4myGTA==} + engines: {node: 18 || 20 || >=22} + + baseline-browser-mapping@2.10.10: + resolution: {integrity: sha512-sUoJ3IMxx4AyRqO4MLeHlnGDkyXRoUG0/AI9fjK+vS72ekpV0yWVY7O0BVjmBcRtkNcsAO2QDZ4tdKKGoI6YaQ==} + engines: {node: '>=6.0.0'} + hasBin: true + + body-parser@2.2.2: + resolution: {integrity: sha512-oP5VkATKlNwcgvxi0vM0p/D3n2C3EReYVX+DNYs5TjZFn/oQt2j+4sVJtSMr18pdRr8wjTcBl6LoV+FUwzPmNA==} + engines: {node: '>=18'} + + brace-expansion@1.1.12: + resolution: {integrity: sha512-9T9UjW3r0UW5c1Q7GTwllptXwhvYmEzFhzMfZ9H7FQWt+uZePjZPjBP/W1ZEyZ1twGWom5/56TF4lPcqjnDHcg==} + + brace-expansion@5.0.4: + resolution: {integrity: sha512-h+DEnpVvxmfVefa4jFbCf5HdH5YMDXRsmKflpf1pILZWRFlTbJpxeU55nJl4Smt5HQaGzg1o6RHFPJaOqnmBDg==} + engines: {node: 18 || 20 || >=22} + + braces@3.0.3: + resolution: {integrity: sha512-yQbXgO/OSZVD2IsiLlro+7Hf6Q18EJrKSEsdoMzKePKXct3gvD8oLcOQdIzGupr5Fj+EDe8gO/lxc1BzfMpxvA==} + engines: {node: '>=8'} + + browserslist@4.28.1: + resolution: {integrity: sha512-ZC5Bd0LgJXgwGqUknZY/vkUQ04r8NXnJZ3yYi4vDmSiZmC/pdSN0NbNRPxZpbtO4uAfDUAFffO8IZoM3Gj8IkA==} + engines: {node: ^6 || ^7 || ^8 || ^9 || ^10 || ^11 || ^12 || >=13.7} + hasBin: true + + bundle-name@4.1.0: + resolution: {integrity: sha512-tjwM5exMg6BGRI+kNmTntNsvdZS1X8BFYS6tnJ2hdH0kVxM6/eVZ2xy+FqStSWvYmtfFMDLIxurorHwDKfDz5Q==} + engines: {node: '>=18'} + + bytes@3.1.2: + resolution: {integrity: sha512-/Nf7TyzTx6S3yRJObOAV7956r8cr2+Oj8AC5dt8wSP3BQAoeX58NoHyCU8P8zGkNXStjTSi6fzO6F0pBdcYbEg==} + 
engines: {node: '>= 0.8'} + + call-bind-apply-helpers@1.0.2: + resolution: {integrity: sha512-Sp1ablJ0ivDkSzjcaJdxEunN5/XvksFJ2sMBFfq6x0ryhQV/2b/KwFe21cMpmHtPOSij8K99/wSfoEuTObmuMQ==} + engines: {node: '>= 0.4'} + + call-bind@1.0.8: + resolution: {integrity: sha512-oKlSFMcMwpUg2ednkhQ454wfWiU/ul3CkJe/PEHcTKuiX6RpbehUiFMXu13HalGZxfUwCQzZG747YXBn1im9ww==} + engines: {node: '>= 0.4'} + + call-bound@1.0.4: + resolution: {integrity: sha512-+ys997U96po4Kx/ABpBCqhA9EuxJaQWDQg7295H4hBphv3IZg0boBKuwYpt4YXp6MZ5AmZQnU/tyMTlRpaSejg==} + engines: {node: '>= 0.4'} + + callsites@3.1.0: + resolution: {integrity: sha512-P8BjAsXvZS+VIDUI11hHCQEv74YT67YUi5JJFNWIqL235sBmjX4+qx9Muvls5ivyNENctx46xQLQ3aTuE7ssaQ==} + engines: {node: '>=6'} + + caniuse-lite@1.0.30001780: + resolution: {integrity: sha512-llngX0E7nQci5BPJDqoZSbuZ5Bcs9F5db7EtgfwBerX9XGtkkiO4NwfDDIRzHTTwcYC8vC7bmeUEPGrKlR/TkQ==} + + chalk@4.1.2: + resolution: {integrity: sha512-oKnbhFyRIXpUuez8iBMmyEa4nbj4IOQyuhc/wy9kY7/WVPcwIO9VA668Pu8RkO7+0G76SLROeyw9CpQ061i4mA==} + engines: {node: '>=10'} + + chalk@5.6.2: + resolution: {integrity: sha512-7NzBL0rN6fMUW+f7A6Io4h40qQlG+xGmtMxfbnH/K7TAtt8JQWVQK+6g0UXKMeVJoyV5EkkNsErQ8pVD3bLHbA==} + engines: {node: ^12.17.0 || ^14.13 || >=16.0.0} + + class-variance-authority@0.7.1: + resolution: {integrity: sha512-Ka+9Trutv7G8M6WT6SeiRWz792K5qEqIGEGzXKhAE6xOWAY6pPH8U+9IY3oCMv6kqTmLsv7Xh/2w2RigkePMsg==} + + cli-cursor@5.0.0: + resolution: {integrity: sha512-aCj4O5wKyszjMmDT4tZj93kxyydN/K5zPWSCe6/0AV/AA1pqe5ZBIw0a2ZfPQV7lL5/yb5HsUreJ6UFAF1tEQw==} + engines: {node: '>=18'} + + cli-spinners@2.9.2: + resolution: {integrity: sha512-ywqV+5MmyL4E7ybXgKys4DugZbX0FC6LnwrhjuykIjnK9k8OQacQ7axGKnjDXWNhns0xot3bZI5h55H8yo9cJg==} + engines: {node: '>=6'} + + cli-width@4.1.0: + resolution: {integrity: sha512-ouuZd4/dm2Sw5Gmqy6bGyNNNe1qt9RpmxveLSO7KcgsTnU7RXfsw+/bukWGo1abgBiMAic068rclZsO4IWmmxQ==} + engines: {node: '>= 12'} + + client-only@0.0.1: + resolution: {integrity: sha512-IV3Ou0jSMzZrd3pZ48nLkT9DA7Ag1pnPzaiQhpW7c3RbcqqzvzzVu+L8gfqMp/8IM2MQtSiqaCxrrcfu8I8rMA==} + + cliui@8.0.1: + resolution: {integrity: sha512-BSeNnyus75C4//NQ9gQt1/csTXyo/8Sb+afLAkzAptFuMsod9HFokGNudZpi/oQV73hnVK+sR+5PVRMd+Dr7YQ==} + engines: {node: '>=12'} + + clsx@2.1.1: + resolution: {integrity: sha512-eYm0QWBtUrBWZWG0d386OGAw16Z995PiOVo2B7bjWSbHedGl5e0ZWaq65kOGgUSNesEIDkB9ISbTg/JK9dhCZA==} + engines: {node: '>=6'} + + code-block-writer@13.0.3: + resolution: {integrity: sha512-Oofo0pq3IKnsFtuHqSF7TqBfr71aeyZDVJ0HpmqB7FBM2qEigL0iPONSCZSO9pE9dZTAxANe5XHG9Uy0YMv8cg==} + + color-convert@2.0.1: + resolution: {integrity: sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ==} + engines: {node: '>=7.0.0'} + + color-name@1.1.4: + resolution: {integrity: sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA==} + + commander@11.1.0: + resolution: {integrity: sha512-yPVavfyCcRhmorC7rWlkHn15b4wDVgVmBA7kV4QVBsF7kv/9TKJAbAXVTxvTnwP8HHKjRCJDClKbciiYS7p0DQ==} + engines: {node: '>=16'} + + commander@14.0.3: + resolution: {integrity: sha512-H+y0Jo/T1RZ9qPP4Eh1pkcQcLRglraJaSLoyOtHxu6AapkjWVCy2Sit1QQ4x3Dng8qDlSsZEet7g5Pq06MvTgw==} + engines: {node: '>=20'} + + concat-map@0.0.1: + resolution: {integrity: sha512-/Srv4dswyQNBfohGpz9o6Yb3Gz3SrUDqBH5rTuhGR7ahtlbYKnVxw2bCFMRljaA7EXHaXZ8wsHdodFvbkhKmqg==} + + content-disposition@1.0.1: + resolution: {integrity: sha512-oIXISMynqSqm241k6kcQ5UwttDILMK4BiurCfGEREw6+X9jkkpEe5T9FZaApyLGGOnFuyMWZpdolTXMtvEJ08Q==} + engines: {node: '>=18'} + + content-type@1.0.5: + 
resolution: {integrity: sha512-nTjqfcBFEipKdXCv4YDQWCfmcLZKm81ldF0pAopTvyrFGVbcR6P/VAAd5G7N+0tTr8QqiU0tFadD6FK4NtJwOA==} + engines: {node: '>= 0.6'} + + convert-source-map@2.0.0: + resolution: {integrity: sha512-Kvp459HrV2FEJ1CAsi1Ku+MY3kasH19TFykTz2xWmMeq6bk2NU3XXvfJ+Q61m0xktWwt+1HSYf3JZsTms3aRJg==} + + cookie-signature@1.2.2: + resolution: {integrity: sha512-D76uU73ulSXrD1UXF4KE2TMxVVwhsnCgfAyTg9k8P6KGZjlXKrOLe4dJQKI3Bxi5wjesZoFXJWElNWBjPZMbhg==} + engines: {node: '>=6.6.0'} + + cookie@0.7.2: + resolution: {integrity: sha512-yki5XnKuf750l50uGTllt6kKILY4nQ1eNIQatoXEByZ5dWgnKqbnqmTrBE5B4N7lrMJKQ2ytWMiTO2o0v6Ew/w==} + engines: {node: '>= 0.6'} + + cookie@1.1.1: + resolution: {integrity: sha512-ei8Aos7ja0weRpFzJnEA9UHJ/7XQmqglbRwnf2ATjcB9Wq874VKH9kfjjirM6UhU2/E5fFYadylyhFldcqSidQ==} + engines: {node: '>=18'} + + cors@2.8.6: + resolution: {integrity: sha512-tJtZBBHA6vjIAaF6EnIaq6laBBP9aq/Y3ouVJjEfoHbRBcHBAHYcMh/w8LDrk2PvIMMq8gmopa5D4V8RmbrxGw==} + engines: {node: '>= 0.10'} + + cosmiconfig@9.0.1: + resolution: {integrity: sha512-hr4ihw+DBqcvrsEDioRO31Z17x71pUYoNe/4h6Z0wB72p7MU7/9gH8Q3s12NFhHPfYBBOV3qyfUxmr/Yn3shnQ==} + engines: {node: '>=14'} + peerDependencies: + typescript: '>=4.9.5' + peerDependenciesMeta: + typescript: + optional: true + + cross-spawn@7.0.6: + resolution: {integrity: sha512-uV2QOWP2nWzsy2aMp8aRibhi9dlzF5Hgh5SHaB9OiTGEyDTiJJyx0uy51QXdyWbtAHNua4XJzUKca3OzKUd3vA==} + engines: {node: '>= 8'} + + cssesc@3.0.0: + resolution: {integrity: sha512-/Tb/JcjK111nNScGob5MNtsntNM1aCNUDipB/TkwZFhyDrrE47SOx/18wF2bbjgc3ZzCSKW1T5nt5EbFoAz/Vg==} + engines: {node: '>=4'} + hasBin: true + + csstype@3.2.3: + resolution: {integrity: sha512-z1HGKcYy2xA8AGQfwrn0PAy+PB7X/GSj3UVJW9qKyn43xWa+gl5nXmU4qqLMRzWVLFC8KusUX8T/0kCiOYpAIQ==} + + damerau-levenshtein@1.0.8: + resolution: {integrity: sha512-sdQSFB7+llfUcQHUQO3+B8ERRj0Oa4w9POWMI/puGtuf7gFywGmkaLCElnudfTiKZV+NvHqL0ifzdrI8Ro7ESA==} + + data-uri-to-buffer@4.0.1: + resolution: {integrity: sha512-0R9ikRb668HB7QDxT1vkpuUBtqc53YyAwMwGeUFKRojY/NWKvdZ+9UYtRfGmhqNbRkTSVpMbmyhXipFFv2cb/A==} + engines: {node: '>= 12'} + + data-view-buffer@1.0.2: + resolution: {integrity: sha512-EmKO5V3OLXh1rtK2wgXRansaK1/mtVdTUEiEI0W8RkvgT05kfxaH29PliLnpLP73yYO6142Q72QNa8Wx/A5CqQ==} + engines: {node: '>= 0.4'} + + data-view-byte-length@1.0.2: + resolution: {integrity: sha512-tuhGbE6CfTM9+5ANGf+oQb72Ky/0+s3xKUpHvShfiz2RxMFgFPjsXuRLBVMtvMs15awe45SRb83D6wH4ew6wlQ==} + engines: {node: '>= 0.4'} + + data-view-byte-offset@1.0.1: + resolution: {integrity: sha512-BS8PfmtDGnrgYdOonGZQdLZslWIeCGFP9tpan0hi1Co2Zr2NKADsvGYA8XxuG/4UWgJ6Cjtv+YJnB6MM69QGlQ==} + engines: {node: '>= 0.4'} + + debug@3.2.7: + resolution: {integrity: sha512-CFjzYYAi4ThfiQvizrFQevTTXHtnCqWfe7x1AhgEscTz6ZbLbfoLRLPugTQyBth6f8ZERVUSyWHFD/7Wu4t1XQ==} + peerDependencies: + supports-color: '*' + peerDependenciesMeta: + supports-color: + optional: true + + debug@4.4.3: + resolution: {integrity: sha512-RGwwWnwQvkVfavKVt22FGLw+xYSdzARwm0ru6DhTVA3umU5hZc28V3kO4stgYryrTlLpuvgI9GiijltAjNbcqA==} + engines: {node: '>=6.0'} + peerDependencies: + supports-color: '*' + peerDependenciesMeta: + supports-color: + optional: true + + dedent@1.7.2: + resolution: {integrity: sha512-WzMx3mW98SN+zn3hgemf4OzdmyNhhhKz5Ay0pUfQiMQ3e1g+xmTJWp/pKdwKVXhdSkAEGIIzqeuWrL3mV/AXbA==} + peerDependencies: + babel-plugin-macros: ^3.1.0 + peerDependenciesMeta: + babel-plugin-macros: + optional: true + + deep-is@0.1.4: + resolution: {integrity: sha512-oIPzksmTg4/MriiaYGO+okXDT7ztn/w3Eptv/+gSIdMdKsJo0u4CfYNFJPy+4SKMuCqGw2wxnA+URMg3t8a/bQ==} + + 
deepmerge@4.3.1: + resolution: {integrity: sha512-3sUqbMEc77XqpdNO7FRyRog+eW3ph+GYCbj+rK+uYyRMuwsVy0rMiVtPn+QJlKFvWP/1PYpapqYn0Me2knFn+A==} + engines: {node: '>=0.10.0'} + + default-browser-id@5.0.1: + resolution: {integrity: sha512-x1VCxdX4t+8wVfd1so/9w+vQ4vx7lKd2Qp5tDRutErwmR85OgmfX7RlLRMWafRMY7hbEiXIbudNrjOAPa/hL8Q==} + engines: {node: '>=18'} + + default-browser@5.5.0: + resolution: {integrity: sha512-H9LMLr5zwIbSxrmvikGuI/5KGhZ8E2zH3stkMgM5LpOWDutGM2JZaj460Udnf1a+946zc7YBgrqEWwbk7zHvGw==} + engines: {node: '>=18'} + + define-data-property@1.1.4: + resolution: {integrity: sha512-rBMvIzlpA8v6E+SJZoo++HAYqsLrkg7MSfIinMPFhmkorw7X+dOXVJQs+QT69zGkzMyfDnIMN2Wid1+NbL3T+A==} + engines: {node: '>= 0.4'} + + define-lazy-prop@3.0.0: + resolution: {integrity: sha512-N+MeXYoqr3pOgn8xfyRPREN7gHakLYjhsHhWGT3fWAiL4IkAt0iDw14QiiEm2bE30c5XX5q0FtAA3CK5f9/BUg==} + engines: {node: '>=12'} + + define-properties@1.2.1: + resolution: {integrity: sha512-8QmQKqEASLd5nx0U1B1okLElbUuuttJ/AnYmRXbbbGDWh6uS208EjD4Xqq/I9wK7u0v6O08XhTWnt5XtEbR6Dg==} + engines: {node: '>= 0.4'} + + depd@2.0.0: + resolution: {integrity: sha512-g7nH6P6dyDioJogAAGprGpCtVImJhpPk/roCzdb3fIh61/s/nPsfR6onyMwkCAR/OlC3yBC0lESvUoQEAssIrw==} + engines: {node: '>= 0.8'} + + detect-libc@2.1.2: + resolution: {integrity: sha512-Btj2BOOO83o3WyH59e8MgXsxEQVcarkUOpEYrubB0urwnN10yQ364rsiByU11nZlqWYZm05i/of7io4mzihBtQ==} + engines: {node: '>=8'} + + diff@8.0.3: + resolution: {integrity: sha512-qejHi7bcSD4hQAZE0tNAawRK1ZtafHDmMTMkrrIGgSLl7hTnQHmKCeB45xAcbfTqK2zowkM3j3bHt/4b/ARbYQ==} + engines: {node: '>=0.3.1'} + + doctrine@2.1.0: + resolution: {integrity: sha512-35mSku4ZXK0vfCuHEDAwt55dg2jNajHZ1odvF+8SSr82EsZY4QmXfuWso8oEd8zRhVObSN18aM0CjSdoBX7zIw==} + engines: {node: '>=0.10.0'} + + dotenv@17.3.1: + resolution: {integrity: sha512-IO8C/dzEb6O3F9/twg6ZLXz164a2fhTnEWb95H23Dm4OuN+92NmEAlTrupP9VW6Jm3sO26tQlqyvyi4CsnY9GA==} + engines: {node: '>=12'} + + dunder-proto@1.0.1: + resolution: {integrity: sha512-KIN/nDJBQRcXw0MLVhZE9iQHmG68qAVIBg9CqmUYjmQIhgij9U5MFvrqkUL5FbtyyzZuOeOt0zdeRe4UY7ct+A==} + engines: {node: '>= 0.4'} + + eciesjs@0.4.18: + resolution: {integrity: sha512-wG99Zcfcys9fZux7Cft8BAX/YrOJLJSZ3jyYPfhZHqN2E+Ffx+QXBDsv3gubEgPtV6dTzJMSQUwk1H98/t/0wQ==} + engines: {bun: '>=1', deno: '>=2', node: '>=16'} + + ee-first@1.1.1: + resolution: {integrity: sha512-WMwm9LhRUo+WUaRN+vRuETqG89IgZphVSNkdFgeb6sS/E4OrDIN7t48CAewSHXc6C8lefD8KKfr5vY61brQlow==} + + electron-to-chromium@1.5.321: + resolution: {integrity: sha512-L2C7Q279W2D/J4PLZLk7sebOILDSWos7bMsMNN06rK482umHUrh/3lM8G7IlHFOYip2oAg5nha1rCMxr/rs6ZQ==} + + emoji-regex@10.6.0: + resolution: {integrity: sha512-toUI84YS5YmxW219erniWD0CIVOo46xGKColeNQRgOzDorgBi1v4D71/OFzgD9GO2UGKIv1C3Sp8DAn0+j5w7A==} + + emoji-regex@8.0.0: + resolution: {integrity: sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A==} + + emoji-regex@9.2.2: + resolution: {integrity: sha512-L18DaJsXSUk2+42pv8mLs5jJT2hqFkFE4j21wOmgbUqsZ2hL72NsUU785g9RXgo3s0ZNgVl42TiHp3ZtOv/Vyg==} + + encodeurl@2.0.0: + resolution: {integrity: sha512-Q0n9HRi4m6JuGIV1eFlmvJB7ZEVxu93IrMyiMsGC0lrMJMWzRgx6WGquyfQgZVb31vhGgXnfmPNNXmxnOkRBrg==} + engines: {node: '>= 0.8'} + + enhanced-resolve@5.20.1: + resolution: {integrity: sha512-Qohcme7V1inbAfvjItgw0EaxVX5q2rdVEZHRBrEQdRZTssLDGsL8Lwrznl8oQ/6kuTJONLaDcGjkNP247XEhcA==} + engines: {node: '>=10.13.0'} + + env-paths@2.2.1: + resolution: {integrity: sha512-+h1lkLKhZMTYjog1VEpJNG7NZJWcuc2DDk/qsqSTRRCOXiLjeQ1d1/udrUGhqMxUgAlwKNZ0cf2uqan5GLuS2A==} + engines: {node: '>=6'} + + 
error-ex@1.3.4: + resolution: {integrity: sha512-sqQamAnR14VgCr1A618A3sGrygcpK+HEbenA/HiEAkkUwcZIIB/tgWqHFxWgOyDh4nB4JCRimh79dR5Ywc9MDQ==} + + es-abstract@1.24.1: + resolution: {integrity: sha512-zHXBLhP+QehSSbsS9Pt23Gg964240DPd6QCf8WpkqEXxQ7fhdZzYsocOr5u7apWonsS5EjZDmTF+/slGMyasvw==} + engines: {node: '>= 0.4'} + + es-define-property@1.0.1: + resolution: {integrity: sha512-e3nRfgfUZ4rNGL232gUgX06QNyyez04KdjFrF+LTRoOXmrOgFKDg4BCdsjW8EnT69eqdYGmRpJwiPVYNrCaW3g==} + engines: {node: '>= 0.4'} + + es-errors@1.3.0: + resolution: {integrity: sha512-Zf5H2Kxt2xjTvbJvP2ZWLEICxA6j+hAmMzIlypy4xcBg1vKVnx89Wy0GbS+kf5cwCVFFzdCFh2XSCFNULS6csw==} + engines: {node: '>= 0.4'} + + es-iterator-helpers@1.3.1: + resolution: {integrity: sha512-zWwRvqWiuBPr0muUG/78cW3aHROFCNIQ3zpmYDpwdbnt2m+xlNyRWpHBpa2lJjSBit7BQ+RXA1iwbSmu5yJ/EQ==} + engines: {node: '>= 0.4'} + + es-object-atoms@1.1.1: + resolution: {integrity: sha512-FGgH2h8zKNim9ljj7dankFPcICIK9Cp5bm+c2gQSYePhpaG5+esrLODihIorn+Pe6FGJzWhXQotPv73jTaldXA==} + engines: {node: '>= 0.4'} + + es-set-tostringtag@2.1.0: + resolution: {integrity: sha512-j6vWzfrGVfyXxge+O0x5sh6cvxAog0a/4Rdd2K36zCMV5eJ+/+tOAngRO8cODMNWbVRdVlmGZQL2YS3yR8bIUA==} + engines: {node: '>= 0.4'} + + es-shim-unscopables@1.1.0: + resolution: {integrity: sha512-d9T8ucsEhh8Bi1woXCf+TIKDIROLG5WCkxg8geBCbvk22kzwC5G2OnXVMO6FUsvQlgUUXQ2itephWDLqDzbeCw==} + engines: {node: '>= 0.4'} + + es-to-primitive@1.3.0: + resolution: {integrity: sha512-w+5mJ3GuFL+NjVtJlvydShqE1eN3h3PbI7/5LAsYJP/2qtuMXjfL2LpHSRqo4b4eSF5K/DH1JXKUAHSB2UW50g==} + engines: {node: '>= 0.4'} + + escalade@3.2.0: + resolution: {integrity: sha512-WUj2qlxaQtO4g6Pq5c29GTcWGDyd8itL8zTlipgECz3JesAiiOKotd8JU6otB3PACgG6xkJUyVhboMS+bje/jA==} + engines: {node: '>=6'} + + escape-html@1.0.3: + resolution: {integrity: sha512-NiSupZ4OeuGwr68lGIeym/ksIZMJodUGOSCZ/FSnTxcrekbvqrgdUxlJOMpijaKZVjAJrWrGs/6Jy8OMuyj9ow==} + + escape-string-regexp@4.0.0: + resolution: {integrity: sha512-TtpcNJ3XAzx3Gq8sWRzJaVajRs0uVxA2YAkdb1jm2YkPz4G6egUFAyA3n5vtEIZefPk5Wa4UXbKuS5fKkJWdgA==} + engines: {node: '>=10'} + + eslint-config-next@16.1.7: + resolution: {integrity: sha512-FTq1i/QDltzq+zf9aB/cKWAiZ77baG0V7h8dRQh3thVx7I4dwr6ZXQrWKAaTB7x5VwVXlzoUTyMLIVQPLj2gJg==} + peerDependencies: + eslint: '>=9.0.0' + typescript: '>=3.3.1' + peerDependenciesMeta: + typescript: + optional: true + + eslint-import-resolver-node@0.3.9: + resolution: {integrity: sha512-WFj2isz22JahUv+B788TlO3N6zL3nNJGU8CcZbPZvVEkBPaJdCV4vy5wyghty5ROFbCRnm132v8BScu5/1BQ8g==} + + eslint-import-resolver-typescript@3.10.1: + resolution: {integrity: sha512-A1rHYb06zjMGAxdLSkN2fXPBwuSaQ0iO5M/hdyS0Ajj1VBaRp0sPD3dn1FhME3c/JluGFbwSxyCfqdSbtQLAHQ==} + engines: {node: ^14.18.0 || >=16.0.0} + peerDependencies: + eslint: '*' + eslint-plugin-import: '*' + eslint-plugin-import-x: '*' + peerDependenciesMeta: + eslint-plugin-import: + optional: true + eslint-plugin-import-x: + optional: true + + eslint-module-utils@2.12.1: + resolution: {integrity: sha512-L8jSWTze7K2mTg0vos/RuLRS5soomksDPoJLXIslC7c8Wmut3bx7CPpJijDcBZtxQ5lrbUdM+s0OlNbz0DCDNw==} + engines: {node: '>=4'} + peerDependencies: + '@typescript-eslint/parser': '*' + eslint: '*' + eslint-import-resolver-node: '*' + eslint-import-resolver-typescript: '*' + eslint-import-resolver-webpack: '*' + peerDependenciesMeta: + '@typescript-eslint/parser': + optional: true + eslint: + optional: true + eslint-import-resolver-node: + optional: true + eslint-import-resolver-typescript: + optional: true + eslint-import-resolver-webpack: + optional: true + + eslint-plugin-import@2.32.0: 
+ resolution: {integrity: sha512-whOE1HFo/qJDyX4SnXzP4N6zOWn79WhnCUY/iDR0mPfQZO8wcYE4JClzI2oZrhBnnMUCBCHZhO6VQyoBU95mZA==} + engines: {node: '>=4'} + peerDependencies: + '@typescript-eslint/parser': '*' + eslint: ^2 || ^3 || ^4 || ^5 || ^6 || ^7.2.0 || ^8 || ^9 + peerDependenciesMeta: + '@typescript-eslint/parser': + optional: true + + eslint-plugin-jsx-a11y@6.10.2: + resolution: {integrity: sha512-scB3nz4WmG75pV8+3eRUQOHZlNSUhFNq37xnpgRkCCELU3XMvXAxLk1eqWWyE22Ki4Q01Fnsw9BA3cJHDPgn2Q==} + engines: {node: '>=4.0'} + peerDependencies: + eslint: ^3 || ^4 || ^5 || ^6 || ^7 || ^8 || ^9 + + eslint-plugin-react-hooks@7.0.1: + resolution: {integrity: sha512-O0d0m04evaNzEPoSW+59Mezf8Qt0InfgGIBJnpC0h3NH/WjUAR7BIKUfysC6todmtiZ/A0oUVS8Gce0WhBrHsA==} + engines: {node: '>=18'} + peerDependencies: + eslint: ^3.0.0 || ^4.0.0 || ^5.0.0 || ^6.0.0 || ^7.0.0 || ^8.0.0-0 || ^9.0.0 + + eslint-plugin-react@7.37.5: + resolution: {integrity: sha512-Qteup0SqU15kdocexFNAJMvCJEfa2xUKNV4CC1xsVMrIIqEy3SQ/rqyxCWNzfrd3/ldy6HMlD2e0JDVpDg2qIA==} + engines: {node: '>=4'} + peerDependencies: + eslint: ^3 || ^4 || ^5 || ^6 || ^7 || ^8 || ^9.7 + + eslint-scope@8.4.0: + resolution: {integrity: sha512-sNXOfKCn74rt8RICKMvJS7XKV/Xk9kA7DyJr8mJik3S7Cwgy3qlkkmyS2uQB3jiJg6VNdZd/pDBJu0nvG2NlTg==} + engines: {node: ^18.18.0 || ^20.9.0 || >=21.1.0} + + eslint-visitor-keys@3.4.3: + resolution: {integrity: sha512-wpc+LXeiyiisxPlEkUzU6svyS1frIO3Mgxj1fdy7Pm8Ygzguax2N3Fa/D/ag1WqbOprdI+uY6wMUl8/a2G+iag==} + engines: {node: ^12.22.0 || ^14.17.0 || >=16.0.0} + + eslint-visitor-keys@4.2.1: + resolution: {integrity: sha512-Uhdk5sfqcee/9H/rCOJikYz67o0a2Tw2hGRPOG2Y1R2dg7brRe1uG0yaNQDHu+TO/uQPF/5eCapvYSmHUjt7JQ==} + engines: {node: ^18.18.0 || ^20.9.0 || >=21.1.0} + + eslint-visitor-keys@5.0.1: + resolution: {integrity: sha512-tD40eHxA35h0PEIZNeIjkHoDR4YjjJp34biM0mDvplBe//mB+IHCqHDGV7pxF+7MklTvighcCPPZC7ynWyjdTA==} + engines: {node: ^20.19.0 || ^22.13.0 || >=24} + + eslint@9.39.4: + resolution: {integrity: sha512-XoMjdBOwe/esVgEvLmNsD3IRHkm7fbKIUGvrleloJXUZgDHig2IPWNniv+GwjyJXzuNqVjlr5+4yVUZjycJwfQ==} + engines: {node: ^18.18.0 || ^20.9.0 || >=21.1.0} + hasBin: true + peerDependencies: + jiti: '*' + peerDependenciesMeta: + jiti: + optional: true + + espree@10.4.0: + resolution: {integrity: sha512-j6PAQ2uUr79PZhBjP5C5fhl8e39FmRnOjsD5lGnWrFU8i2G776tBK7+nP8KuQUTTyAZUwfQqXAgrVH5MbH9CYQ==} + engines: {node: ^18.18.0 || ^20.9.0 || >=21.1.0} + + esprima@4.0.1: + resolution: {integrity: sha512-eGuFFw7Upda+g4p+QHvnW0RyTX/SVeJBDM/gCtMARO0cLuT2HcEKnTPvhjV6aGeqrCB/sbNop0Kszm0jsaWU4A==} + engines: {node: '>=4'} + hasBin: true + + esquery@1.7.0: + resolution: {integrity: sha512-Ap6G0WQwcU/LHsvLwON1fAQX9Zp0A2Y6Y/cJBl9r/JbW90Zyg4/zbG6zzKa2OTALELarYHmKu0GhpM5EO+7T0g==} + engines: {node: '>=0.10'} + + esrecurse@4.3.0: + resolution: {integrity: sha512-KmfKL3b6G+RXvP8N1vr3Tq1kL/oCFgn2NYXEtqP8/L3pKapUA4G8cFVaoF3SU323CD4XypR/ffioHmkti6/Tag==} + engines: {node: '>=4.0'} + + estraverse@5.3.0: + resolution: {integrity: sha512-MMdARuVEQziNTeJD8DgMqmhwR11BRQ/cBP+pLtYdSTnf3MIO8fFeiINEbX36ZdNlfU/7A9f3gUw49B3oQsvwBA==} + engines: {node: '>=4.0'} + + esutils@2.0.3: + resolution: {integrity: sha512-kVscqXk4OCp68SZ0dkgEKVi6/8ij300KBWTJq32P/dYeWTSwK41WyTxalN1eRmA5Z9UU/LX9D7FWSmV9SAYx6g==} + engines: {node: '>=0.10.0'} + + etag@1.8.1: + resolution: {integrity: sha512-aIL5Fx7mawVa300al2BnEE4iNvo1qETxLrPI/o05L7z6go7fCw1J6EQmbK4FmJ2AS7kgVF/KEZWufBfdClMcPg==} + engines: {node: '>= 0.6'} + + eventsource-parser@3.0.6: + resolution: {integrity: 
sha512-Vo1ab+QXPzZ4tCa8SwIHJFaSzy4R6SHf7BY79rFBDf0idraZWAkYrDjDj8uWaSm3S2TK+hJ7/t1CEmZ7jXw+pg==} + engines: {node: '>=18.0.0'} + + eventsource@3.0.7: + resolution: {integrity: sha512-CRT1WTyuQoD771GW56XEZFQ/ZoSfWid1alKGDYMmkt2yl8UXrVR4pspqWNEcqKvVIzg6PAltWjxcSSPrboA4iA==} + engines: {node: '>=18.0.0'} + + execa@5.1.1: + resolution: {integrity: sha512-8uSpZZocAZRBAPIEINJj3Lo9HyGitllczc27Eh5YYojjMFMn8yHMDMaUHE2Jqfq05D/wucwI4JGURyXt1vchyg==} + engines: {node: '>=10'} + + execa@9.6.1: + resolution: {integrity: sha512-9Be3ZoN4LmYR90tUoVu2te2BsbzHfhJyfEiAVfz7N5/zv+jduIfLrV2xdQXOHbaD6KgpGdO9PRPM1Y4Q9QkPkA==} + engines: {node: ^18.19.0 || >=20.5.0} + + express-rate-limit@8.3.1: + resolution: {integrity: sha512-D1dKN+cmyPWuvB+G2SREQDzPY1agpBIcTa9sJxOPMCNeH3gwzhqJRDWCXW3gg0y//+LQ/8j52JbMROWyrKdMdw==} + engines: {node: '>= 16'} + peerDependencies: + express: '>= 4.11' + + express@5.2.1: + resolution: {integrity: sha512-hIS4idWWai69NezIdRt2xFVofaF4j+6INOpJlVOLDO8zXGpUVEVzIYk12UUi2JzjEzWL3IOAxcTubgz9Po0yXw==} + engines: {node: '>= 18'} + + fast-deep-equal@3.1.3: + resolution: {integrity: sha512-f3qQ9oQy9j2AhBe/H9VC91wLmKBCCU/gDOnKNAYG5hswO7BLKj09Hc5HYNz9cGI++xlpDCIgDaitVs03ATR84Q==} + + fast-glob@3.3.1: + resolution: {integrity: sha512-kNFPyjhh5cKjrUltxs+wFx+ZkbRaxxmZ+X0ZU31SOsxCEtP9VPgtq2teZw1DebupL5GmDaNQ6yKMMVcM41iqDg==} + engines: {node: '>=8.6.0'} + + fast-glob@3.3.3: + resolution: {integrity: sha512-7MptL8U0cqcFdzIzwOTHoilX9x5BrNqye7Z/LuC7kCMRio1EMSyqRK3BEAUD7sXRq4iT4AzTVuZdhgQ2TCvYLg==} + engines: {node: '>=8.6.0'} + + fast-json-stable-stringify@2.1.0: + resolution: {integrity: sha512-lhd/wF+Lk98HZoTCtlVraHtfh5XYijIjalXck7saUtuanSDyLMxnHhSXEDJqHxD7msR8D0uCmqlkwjCV8xvwHw==} + + fast-levenshtein@2.0.6: + resolution: {integrity: sha512-DCXu6Ifhqcks7TZKY3Hxp3y6qphY5SJZmrWMDrKcERSOXWQdMhU9Ig/PYrzyw/ul9jOIyh0N4M0tbC5hodg8dw==} + + fast-uri@3.1.0: + resolution: {integrity: sha512-iPeeDKJSWf4IEOasVVrknXpaBV0IApz/gp7S2bb7Z4Lljbl2MGJRqInZiUrQwV16cpzw/D3S5j5Julj/gT52AA==} + + fastq@1.20.1: + resolution: {integrity: sha512-GGToxJ/w1x32s/D2EKND7kTil4n8OVk/9mycTc4VDza13lOvpUZTGX3mFSCtV9ksdGBVzvsyAVLM6mHFThxXxw==} + + fdir@6.5.0: + resolution: {integrity: sha512-tIbYtZbucOs0BRGqPJkshJUYdL+SDH7dVM8gjy+ERp3WAUjLEFJE+02kanyHtwjWOnwrKYBiwAmM0p4kLJAnXg==} + engines: {node: '>=12.0.0'} + peerDependencies: + picomatch: ^3 || ^4 + peerDependenciesMeta: + picomatch: + optional: true + + fetch-blob@3.2.0: + resolution: {integrity: sha512-7yAQpD2UMJzLi1Dqv7qFYnPbaPx7ZfFK6PiIxQ4PfkGPyNyl2Ugx+a/umUonmKqjhM4DnfbMvdX6otXq83soQQ==} + engines: {node: ^12.20 || >= 14.13} + + figures@6.1.0: + resolution: {integrity: sha512-d+l3qxjSesT4V7v2fh+QnmFnUWv9lSpjarhShNTgBOfA0ttejbQUAlHLitbjkoRiDulW0OPoQPYIGhIC8ohejg==} + engines: {node: '>=18'} + + file-entry-cache@8.0.0: + resolution: {integrity: sha512-XXTUwCvisa5oacNGRP9SfNtYBNAMi+RPwBFmblZEF7N7swHYQS6/Zfk7SRwx4D5j3CH211YNRco1DEMNVfZCnQ==} + engines: {node: '>=16.0.0'} + + fill-range@7.1.1: + resolution: {integrity: sha512-YsGpe3WHLK8ZYi4tWDg2Jy3ebRz2rXowDxnld4bkQB00cc/1Zw9AWnC0i9ztDJitivtQvaI9KaLyKrc+hBW0yg==} + engines: {node: '>=8'} + + finalhandler@2.1.1: + resolution: {integrity: sha512-S8KoZgRZN+a5rNwqTxlZZePjT/4cnm0ROV70LedRHZ0p8u9fRID0hJUZQpkKLzro8LfmC8sx23bY6tVNxv8pQA==} + engines: {node: '>= 18.0.0'} + + find-up@5.0.0: + resolution: {integrity: sha512-78/PXT1wlLLDgTzDs7sjq9hzz0vXD+zn+7wypEe4fXQxCmdmqfGsEPQxmiCSQI3ajFV91bVSsvNtrJRiW6nGng==} + engines: {node: '>=10'} + + flat-cache@4.0.1: + resolution: {integrity: 
sha512-f7ccFPK3SXFHpx15UIGyRJ/FJQctuKZ0zVuN3frBo4HnK3cay9VEW0R6yPYFHC0AgqhukPzKjq22t5DmAyqGyw==} + engines: {node: '>=16'} + + flatted@3.4.2: + resolution: {integrity: sha512-PjDse7RzhcPkIJwy5t7KPWQSZ9cAbzQXcafsetQoD7sOJRQlGikNbx7yZp2OotDnJyrDcbyRq3Ttb18iYOqkxA==} + + for-each@0.3.5: + resolution: {integrity: sha512-dKx12eRCVIzqCxFGplyFKJMPvLEWgmNtUrpTiJIR5u97zEhRG8ySrtboPHZXx7daLxQVrl643cTzbab2tkQjxg==} + engines: {node: '>= 0.4'} + + formdata-polyfill@4.0.10: + resolution: {integrity: sha512-buewHzMvYL29jdeQTVILecSaZKnt/RJWjoZCF5OW60Z67/GmSLBkOFM7qh1PI3zFNtJbaZL5eQu1vLfazOwj4g==} + engines: {node: '>=12.20.0'} + + forwarded@0.2.0: + resolution: {integrity: sha512-buRG0fpBtRHSTCOASe6hD258tEubFoRLb4ZNA6NxMVHNw2gOcwHo9wyablzMzOA5z9xA9L1KNjk/Nt6MT9aYow==} + engines: {node: '>= 0.6'} + + fresh@2.0.0: + resolution: {integrity: sha512-Rx/WycZ60HOaqLKAi6cHRKKI7zxWbJ31MhntmtwMoaTeF7XFH9hhBp8vITaMidfljRQ6eYWCKkaTK+ykVJHP2A==} + engines: {node: '>= 0.8'} + + fs-extra@11.3.4: + resolution: {integrity: sha512-CTXd6rk/M3/ULNQj8FBqBWHYBVYybQ3VPBw0xGKFe3tuH7ytT6ACnvzpIQ3UZtB8yvUKC2cXn1a+x+5EVQLovA==} + engines: {node: '>=14.14'} + + function-bind@1.1.2: + resolution: {integrity: sha512-7XHNxH7qX9xG5mIwxkhumTox/MIRNcOgDrxWsMt2pAr23WHp6MrRlN7FBSFpCpr+oVO0F744iUgR82nJMfG2SA==} + + function.prototype.name@1.1.8: + resolution: {integrity: sha512-e5iwyodOHhbMr/yNrc7fDYG4qlbIvI5gajyzPnb5TCwyhjApznQh1BMFou9b30SevY43gCJKXycoCBjMbsuW0Q==} + engines: {node: '>= 0.4'} + + functions-have-names@1.2.3: + resolution: {integrity: sha512-xckBUXyTIqT97tq2x2AMb+g163b5JFysYk0x4qxNFwbfQkmNZoiRHb6sPzI9/QV33WeuvVYBUIiD4NzNIyqaRQ==} + + fuzzysort@3.1.0: + resolution: {integrity: sha512-sR9BNCjBg6LNgwvxlBd0sBABvQitkLzoVY9MYYROQVX/FvfJ4Mai9LsGhDgd8qYdds0bY77VzYd5iuB+v5rwQQ==} + + generator-function@2.0.1: + resolution: {integrity: sha512-SFdFmIJi+ybC0vjlHN0ZGVGHc3lgE0DxPAT0djjVg+kjOnSqclqmj0KQ7ykTOLP6YxoqOvuAODGdcHJn+43q3g==} + engines: {node: '>= 0.4'} + + gensync@1.0.0-beta.2: + resolution: {integrity: sha512-3hN7NaskYvMDLQY55gnW3NQ+mesEAepTqlg+VEbj7zzqEMBVNhzcGYYeqFo/TlYz6eQiFcp1HcsCZO+nGgS8zg==} + engines: {node: '>=6.9.0'} + + get-caller-file@2.0.5: + resolution: {integrity: sha512-DyFP3BM/3YHTQOCUL/w0OZHR0lpKeGrxotcHWcqNEdnltqFwXVfhEBQ94eIo34AfQpo0rGki4cyIiftY06h2Fg==} + engines: {node: 6.* || 8.* || >= 10.*} + + get-east-asian-width@1.5.0: + resolution: {integrity: sha512-CQ+bEO+Tva/qlmw24dCejulK5pMzVnUOFOijVogd3KQs07HnRIgp8TGipvCCRT06xeYEbpbgwaCxglFyiuIcmA==} + engines: {node: '>=18'} + + get-intrinsic@1.3.0: + resolution: {integrity: sha512-9fSjSaos/fRIVIp+xSJlE6lfwhES7LNtKaCBIamHsjr2na1BiABJPo0mOjjz8GJDURarmCPGqaiVg5mfjb98CQ==} + engines: {node: '>= 0.4'} + + get-own-enumerable-keys@1.0.0: + resolution: {integrity: sha512-PKsK2FSrQCyxcGHsGrLDcK0lx+0Ke+6e8KFFozA9/fIQLhQzPaRvJFdcz7+Axg3jUH/Mq+NI4xa5u/UT2tQskA==} + engines: {node: '>=14.16'} + + get-proto@1.0.1: + resolution: {integrity: sha512-sTSfBjoXBp89JvIKIefqw7U2CCebsc74kiY6awiGogKtoSGbgjYE/G/+l9sF3MWFPNc9IcoOC4ODfKHfxFmp0g==} + engines: {node: '>= 0.4'} + + get-stream@6.0.1: + resolution: {integrity: sha512-ts6Wi+2j3jQjqi70w5AlN8DFnkSwC+MqmxEzdEALB2qXZYV3X/b1CTfgPLGJNMeAWxdPfU8FO1ms3NUfaHCPYg==} + engines: {node: '>=10'} + + get-stream@9.0.1: + resolution: {integrity: sha512-kVCxPF3vQM/N0B1PmoqVUqgHP+EeVjmZSQn+1oCRPxd2P21P2F19lIgbR3HBosbB1PUhOAoctJnfEn2GbN2eZA==} + engines: {node: '>=18'} + + get-symbol-description@1.1.0: + resolution: {integrity: sha512-w9UMqWwJxHNOvoNzSJ2oPF5wvYcvP7jUvYzhp67yEhTi17ZDBBC1z9pTdGuzjD+EFIqLSYRweZjqfiPzQ06Ebg==} + engines: {node: '>= 
0.4'} + + get-tsconfig@4.13.6: + resolution: {integrity: sha512-shZT/QMiSHc/YBLxxOkMtgSid5HFoauqCE3/exfsEcwg1WkeqjG+V40yBbBrsD+jW2HDXcs28xOfcbm2jI8Ddw==} + + glob-parent@5.1.2: + resolution: {integrity: sha512-AOIgSQCepiJYwP3ARnGx+5VnTu2HBYdzbGP45eLw1vr3zB3vZLeyed1sC9hnbcOc9/SrMyM5RPQrkGz4aS9Zow==} + engines: {node: '>= 6'} + + glob-parent@6.0.2: + resolution: {integrity: sha512-XxwI8EOhVQgWp6iDL+3b0r86f4d6AX6zSU55HfB4ydCEuXLXc5FcYeOu+nnGftS4TEju/11rt4KJPTMgbfmv4A==} + engines: {node: '>=10.13.0'} + + globals@14.0.0: + resolution: {integrity: sha512-oahGvuMGQlPw/ivIYBjVSrWAfWLBeku5tpPE2fOPLi+WHffIWbuh2tCjhyQhTBPMf5E9jDEH4FOmTYgYwbKwtQ==} + engines: {node: '>=18'} + + globals@16.4.0: + resolution: {integrity: sha512-ob/2LcVVaVGCYN+r14cnwnoDPUufjiYgSqRhiFD0Q1iI4Odora5RE8Iv1D24hAz5oMophRGkGz+yuvQmmUMnMw==} + engines: {node: '>=18'} + + globalthis@1.0.4: + resolution: {integrity: sha512-DpLKbNU4WylpxJykQujfCcwYWiV/Jhm50Goo0wrVILAv5jOr9d+H+UR3PhSCD2rCCEIg0uc+G+muBTwD54JhDQ==} + engines: {node: '>= 0.4'} + + gopd@1.2.0: + resolution: {integrity: sha512-ZUKRh6/kUFoAiTAtTYPZJ3hw9wNxx+BIBOijnlG9PnrJsCcSjs1wyyD6vJpaYtgnzDrKYRSqf3OO6Rfa93xsRg==} + engines: {node: '>= 0.4'} + + graceful-fs@4.2.11: + resolution: {integrity: sha512-RbJ5/jmFcNNCcDV5o9eTnBLJ/HszWV0P73bc+Ff4nS/rJj+YaS6IGyiOL0VoBYX+l1Wrl3k63h/KrH+nhJ0XvQ==} + + graphql@16.13.1: + resolution: {integrity: sha512-gGgrVCoDKlIZ8fIqXBBb0pPKqDgki0Z/FSKNiQzSGj2uEYHr1tq5wmBegGwJx6QB5S5cM0khSBpi/JFHMCvsmQ==} + engines: {node: ^12.22.0 || ^14.16.0 || ^16.0.0 || >=17.0.0} + + has-bigints@1.1.0: + resolution: {integrity: sha512-R3pbpkcIqv2Pm3dUwgjclDRVmWpTJW2DcMzcIhEXEx1oh/CEMObMm3KLmRJOdvhM7o4uQBnwr8pzRK2sJWIqfg==} + engines: {node: '>= 0.4'} + + has-flag@4.0.0: + resolution: {integrity: sha512-EykJT/Q1KjTWctppgIAgfSO0tKVuZUjhgMr17kqTumMl6Afv3EISleU7qZUzoXDFTAHTDC4NOoG/ZxU3EvlMPQ==} + engines: {node: '>=8'} + + has-property-descriptors@1.0.2: + resolution: {integrity: sha512-55JNKuIW+vq4Ke1BjOTjM2YctQIvCT7GFzHwmfZPGo5wnrgkid0YQtnAleFSqumZm4az3n2BS+erby5ipJdgrg==} + + has-proto@1.2.0: + resolution: {integrity: sha512-KIL7eQPfHQRC8+XluaIw7BHUwwqL19bQn4hzNgdr+1wXoU0KKj6rufu47lhY7KbJR2C6T6+PfyN0Ea7wkSS+qQ==} + engines: {node: '>= 0.4'} + + has-symbols@1.1.0: + resolution: {integrity: sha512-1cDNdwJ2Jaohmb3sg4OmKaMBwuC48sYni5HUw2DvsC8LjGTLK9h+eb1X6RyuOHe4hT0ULCW68iomhjUoKUqlPQ==} + engines: {node: '>= 0.4'} + + has-tostringtag@1.0.2: + resolution: {integrity: sha512-NqADB8VjPFLM2V0VvHUewwwsw0ZWBaIdgo+ieHtK3hasLz4qeCRjYcqfB6AQrBggRKppKF8L52/VqdVsO47Dlw==} + engines: {node: '>= 0.4'} + + hasown@2.0.2: + resolution: {integrity: sha512-0hJU9SCPvmMzIBdZFqNPXWa6dqh7WdH0cII9y+CyS8rG3nL48Bclra9HmKhVVUHyPWNH5Y7xDwAB7bfgSjkUMQ==} + engines: {node: '>= 0.4'} + + headers-polyfill@4.0.3: + resolution: {integrity: sha512-IScLbePpkvO846sIwOtOTDjutRMWdXdJmXdMvk6gCBHxFO8d+QKOQedyZSxFTTFYRSmlgSTDtXqqq4pcenBXLQ==} + + hermes-estree@0.25.1: + resolution: {integrity: sha512-0wUoCcLp+5Ev5pDW2OriHC2MJCbwLwuRx+gAqMTOkGKJJiBCLjtrvy4PWUGn6MIVefecRpzoOZ/UV6iGdOr+Cw==} + + hermes-parser@0.25.1: + resolution: {integrity: sha512-6pEjquH3rqaI6cYAXYPcz9MS4rY6R4ngRgrgfDshRptUZIc3lw0MCIJIGDj9++mfySOuPTHB4nrSW99BCvOPIA==} + + hono@4.12.8: + resolution: {integrity: sha512-VJCEvtrezO1IAR+kqEYnxUOoStaQPGrCmX3j4wDTNOcD1uRPFpGlwQUIW8niPuvHXaTUxeOUl5MMDGrl+tmO9A==} + engines: {node: '>=16.9.0'} + + http-errors@2.0.1: + resolution: {integrity: sha512-4FbRdAX+bSdmo4AUFuS0WNiPz8NgFt+r8ThgNWmlrjQjt1Q7ZR9+zTlce2859x4KSXrwIsaeTqDoKQmtP8pLmQ==} + engines: {node: '>= 0.8'} + + https-proxy-agent@7.0.6: + 
resolution: {integrity: sha512-vK9P5/iUfdl95AI+JVyUuIcVtd4ofvtrOr3HNtM2yxC9bnMbEdp3x01OhQNnjb8IJYi38VlTE3mBXwcfvywuSw==} + engines: {node: '>= 14'} + + human-signals@2.1.0: + resolution: {integrity: sha512-B4FFZ6q/T2jhhksgkbEW3HBvWIfDW85snkQgawt07S7J5QXTk6BkNV+0yAeZrM5QpMAdYlocGoljn0sJ/WQkFw==} + engines: {node: '>=10.17.0'} + + human-signals@8.0.1: + resolution: {integrity: sha512-eKCa6bwnJhvxj14kZk5NCPc6Hb6BdsU9DZcOnmQKSnO1VKrfV0zCvtttPZUsBvjmNDn8rpcJfpwSYnHBjc95MQ==} + engines: {node: '>=18.18.0'} + + iconv-lite@0.7.2: + resolution: {integrity: sha512-im9DjEDQ55s9fL4EYzOAv0yMqmMBSZp6G0VvFyTMPKWxiSBHUj9NW/qqLmXUwXrrM7AvqSlTCfvqRb0cM8yYqw==} + engines: {node: '>=0.10.0'} + + ignore@5.3.2: + resolution: {integrity: sha512-hsBTNUqQTDwkWtcdYI2i06Y/nUBEsNEDJKjWdigLvegy8kDuJAS8uRlpkkcQpyEXL0Z/pjDy5HBmMjRCJ2gq+g==} + engines: {node: '>= 4'} + + ignore@7.0.5: + resolution: {integrity: sha512-Hs59xBNfUIunMFgWAbGX5cq6893IbWg4KnrjbYwX3tx0ztorVgTDA6B2sxf8ejHJ4wz8BqGUMYlnzNBer5NvGg==} + engines: {node: '>= 4'} + + import-fresh@3.3.1: + resolution: {integrity: sha512-TR3KfrTZTYLPB6jUjfx6MF9WcWrHL9su5TObK4ZkYgBdWKPOFoSoQIdEuTuR82pmtxH2spWG9h6etwfr1pLBqQ==} + engines: {node: '>=6'} + + imurmurhash@0.1.4: + resolution: {integrity: sha512-JmXMZ6wuvDmLiHEml9ykzqO6lwFbof0GG4IkcGaENdCRDDmMVnny7s5HsIgHCbaq0w2MyPhDqkhTUgS2LU2PHA==} + engines: {node: '>=0.8.19'} + + inherits@2.0.4: + resolution: {integrity: sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ==} + + internal-slot@1.1.0: + resolution: {integrity: sha512-4gd7VpWNQNB4UKKCFFVcp1AVv+FMOgs9NKzjHKusc8jTMhd5eL1NqQqOpE0KzMds804/yHlglp3uxgluOqAPLw==} + engines: {node: '>= 0.4'} + + ip-address@10.1.0: + resolution: {integrity: sha512-XXADHxXmvT9+CRxhXg56LJovE+bmWnEWB78LB83VZTprKTmaC5QfruXocxzTZ2Kl0DNwKuBdlIhjL8LeY8Sf8Q==} + engines: {node: '>= 12'} + + ipaddr.js@1.9.1: + resolution: {integrity: sha512-0KI/607xoxSToH7GjN1FfSbLoU0+btTicjsQSWQlh/hZykN8KpmMf7uYwPW3R+akZ6R/w18ZlXSHBYXiYUPO3g==} + engines: {node: '>= 0.10'} + + is-array-buffer@3.0.5: + resolution: {integrity: sha512-DDfANUiiG2wC1qawP66qlTugJeL5HyzMpfr8lLK+jMQirGzNod0B12cFB/9q838Ru27sBwfw78/rdoU7RERz6A==} + engines: {node: '>= 0.4'} + + is-arrayish@0.2.1: + resolution: {integrity: sha512-zz06S8t0ozoDXMG+ube26zeCTNXcKIPJZJi8hBrF4idCLms4CG9QtK7qBl1boi5ODzFpjswb5JPmHCbMpjaYzg==} + + is-async-function@2.1.1: + resolution: {integrity: sha512-9dgM/cZBnNvjzaMYHVoxxfPj2QXt22Ev7SuuPrs+xav0ukGB0S6d4ydZdEiM48kLx5kDV+QBPrpVnFyefL8kkQ==} + engines: {node: '>= 0.4'} + + is-bigint@1.1.0: + resolution: {integrity: sha512-n4ZT37wG78iz03xPRKJrHTdZbe3IicyucEtdRsV5yglwc3GyUfbAfpSeD0FJ41NbUNSt5wbhqfp1fS+BgnvDFQ==} + engines: {node: '>= 0.4'} + + is-boolean-object@1.2.2: + resolution: {integrity: sha512-wa56o2/ElJMYqjCjGkXri7it5FbebW5usLw/nPmCMs5DeZ7eziSYZhSmPRn0txqeW4LnAmQQU7FgqLpsEFKM4A==} + engines: {node: '>= 0.4'} + + is-bun-module@2.0.0: + resolution: {integrity: sha512-gNCGbnnnnFAUGKeZ9PdbyeGYJqewpmc2aKHUEMO5nQPWU9lOmv7jcmQIv+qHD8fXW6W7qfuCwX4rY9LNRjXrkQ==} + + is-callable@1.2.7: + resolution: {integrity: sha512-1BC0BVFhS/p0qtw6enp8e+8OD0UrK0oFLztSjNzhcKA3WDuJxxAPXzPuPtKkjEY9UUoEWlX/8fgKeu2S8i9JTA==} + engines: {node: '>= 0.4'} + + is-core-module@2.16.1: + resolution: {integrity: sha512-UfoeMA6fIJ8wTYFEUjelnaGI67v6+N7qXJEvQuIGa99l4xsCruSYOVSQ0uPANn4dAzm8lkYPaKLrrijLq7x23w==} + engines: {node: '>= 0.4'} + + is-data-view@1.0.2: + resolution: {integrity: sha512-RKtWF8pGmS87i2D6gqQu/l7EYRlVdfzemCJN/P3UOs//x1QE7mfhvzHIApBTRf7axvT6DMGwSwBXYCT0nfB9xw==} + engines: {node: '>= 
0.4'} + + is-date-object@1.1.0: + resolution: {integrity: sha512-PwwhEakHVKTdRNVOw+/Gyh0+MzlCl4R6qKvkhuvLtPMggI1WAHt9sOwZxQLSGpUaDnrdyDsomoRgNnCfKNSXXg==} + engines: {node: '>= 0.4'} + + is-docker@3.0.0: + resolution: {integrity: sha512-eljcgEDlEns/7AXFosB5K/2nCM4P7FQPkGc/DWLy5rmFEWvZayGrik1d9/QIY5nJ4f9YsVvBkA6kJpHn9rISdQ==} + engines: {node: ^12.20.0 || ^14.13.1 || >=16.0.0} + hasBin: true + + is-extglob@2.1.1: + resolution: {integrity: sha512-SbKbANkN603Vi4jEZv49LeVJMn4yGwsbzZworEoyEiutsN3nJYdbO36zfhGJ6QEDpOZIFkDtnq5JRxmvl3jsoQ==} + engines: {node: '>=0.10.0'} + + is-finalizationregistry@1.1.1: + resolution: {integrity: sha512-1pC6N8qWJbWoPtEjgcL2xyhQOP491EQjeUo3qTKcmV8YSDDJrOepfG8pcC7h/QgnQHYSv0mJ3Z/ZWxmatVrysg==} + engines: {node: '>= 0.4'} + + is-fullwidth-code-point@3.0.0: + resolution: {integrity: sha512-zymm5+u+sCsSWyD9qNaejV3DFvhCKclKdizYaJUuHA83RLjb7nSuGnddCHGv0hk+KY7BMAlsWeK4Ueg6EV6XQg==} + engines: {node: '>=8'} + + is-generator-function@1.1.2: + resolution: {integrity: sha512-upqt1SkGkODW9tsGNG5mtXTXtECizwtS2kA161M+gJPc1xdb/Ax629af6YrTwcOeQHbewrPNlE5Dx7kzvXTizA==} + engines: {node: '>= 0.4'} + + is-glob@4.0.3: + resolution: {integrity: sha512-xelSayHH36ZgE7ZWhli7pW34hNbNl8Ojv5KVmkJD4hBdD3th8Tfk9vYasLM+mXWOZhFkgZfxhLSnrwRr4elSSg==} + engines: {node: '>=0.10.0'} + + is-in-ssh@1.0.0: + resolution: {integrity: sha512-jYa6Q9rH90kR1vKB6NM7qqd1mge3Fx4Dhw5TVlK1MUBqhEOuCagrEHMevNuCcbECmXZ0ThXkRm+Ymr51HwEPAw==} + engines: {node: '>=20'} + + is-inside-container@1.0.0: + resolution: {integrity: sha512-KIYLCCJghfHZxqjYBE7rEy0OBuTd5xCHS7tHVgvCLkx7StIoaxwNW3hCALgEUjFfeRk+MG/Qxmp/vtETEF3tRA==} + engines: {node: '>=14.16'} + hasBin: true + + is-interactive@2.0.0: + resolution: {integrity: sha512-qP1vozQRI+BMOPcjFzrjXuQvdak2pHNUMZoeG2eRbiSqyvbEf/wQtEOTOX1guk6E3t36RkaqiSt8A/6YElNxLQ==} + engines: {node: '>=12'} + + is-map@2.0.3: + resolution: {integrity: sha512-1Qed0/Hr2m+YqxnM09CjA2d/i6YZNfF6R2oRAOj36eUdS6qIV/huPJNSEpKbupewFs+ZsJlxsjjPbc0/afW6Lw==} + engines: {node: '>= 0.4'} + + is-negative-zero@2.0.3: + resolution: {integrity: sha512-5KoIu2Ngpyek75jXodFvnafB6DJgr3u8uuK0LEZJjrU19DrMD3EVERaR8sjz8CCGgpZvxPl9SuE1GMVPFHx1mw==} + engines: {node: '>= 0.4'} + + is-node-process@1.2.0: + resolution: {integrity: sha512-Vg4o6/fqPxIjtxgUH5QLJhwZ7gW5diGCVlXpuUfELC62CuxM1iHcRe51f2W1FDy04Ai4KJkagKjx3XaqyfRKXw==} + + is-number-object@1.1.1: + resolution: {integrity: sha512-lZhclumE1G6VYD8VHe35wFaIif+CTy5SJIi5+3y4psDgWu4wPDoBhF8NxUOinEc7pHgiTsT6MaBb92rKhhD+Xw==} + engines: {node: '>= 0.4'} + + is-number@7.0.0: + resolution: {integrity: sha512-41Cifkg6e8TylSpdtTpeLVMqvSBEVzTttHvERD741+pnZ8ANv0004MRL43QKPDlK9cGvNp6NZWZUBlbGXYxxng==} + engines: {node: '>=0.12.0'} + + is-obj@3.0.0: + resolution: {integrity: sha512-IlsXEHOjtKhpN8r/tRFj2nDyTmHvcfNeu/nrRIcXE17ROeatXchkojffa1SpdqW4cr/Fj6QkEf/Gn4zf6KKvEQ==} + engines: {node: '>=12'} + + is-plain-obj@4.1.0: + resolution: {integrity: sha512-+Pgi+vMuUNkJyExiMBt5IlFoMyKnr5zhJ4Uspz58WOhBF5QoIZkFyNHIbBAtHwzVAgk5RtndVNsDRN61/mmDqg==} + engines: {node: '>=12'} + + is-promise@4.0.0: + resolution: {integrity: sha512-hvpoI6korhJMnej285dSg6nu1+e6uxs7zG3BYAm5byqDsgJNWwxzM6z6iZiAgQR4TJ30JmBTOwqZUw3WlyH3AQ==} + + is-regex@1.2.1: + resolution: {integrity: sha512-MjYsKHO5O7mCsmRGxWcLWheFqN9DJ/2TmngvjKXihe6efViPqc274+Fx/4fYj/r03+ESvBdTXK0V6tA3rgez1g==} + engines: {node: '>= 0.4'} + + is-regexp@3.1.0: + resolution: {integrity: sha512-rbku49cWloU5bSMI+zaRaXdQHXnthP6DZ/vLnfdSKyL4zUzuWnomtOEiZZOd+ioQ+avFo/qau3KPTc7Fjy1uPA==} + engines: {node: '>=12'} + + is-set@2.0.3: + resolution: 
{integrity: sha512-iPAjerrse27/ygGLxw+EBR9agv9Y6uLeYVJMu+QNCoouJ1/1ri0mGrcWpfCqFZuzzx3WjtwxG098X+n4OuRkPg==} + engines: {node: '>= 0.4'} + + is-shared-array-buffer@1.0.4: + resolution: {integrity: sha512-ISWac8drv4ZGfwKl5slpHG9OwPNty4jOWPRIhBpxOoD+hqITiwuipOQ2bNthAzwA3B4fIjO4Nln74N0S9byq8A==} + engines: {node: '>= 0.4'} + + is-stream@2.0.1: + resolution: {integrity: sha512-hFoiJiTl63nn+kstHGBtewWSKnQLpyb155KHheA1l39uvtO9nWIop1p3udqPcUd/xbF1VLMO4n7OI6p7RbngDg==} + engines: {node: '>=8'} + + is-stream@4.0.1: + resolution: {integrity: sha512-Dnz92NInDqYckGEUJv689RbRiTSEHCQ7wOVeALbkOz999YpqT46yMRIGtSNl2iCL1waAZSx40+h59NV/EwzV/A==} + engines: {node: '>=18'} + + is-string@1.1.1: + resolution: {integrity: sha512-BtEeSsoaQjlSPBemMQIrY1MY0uM6vnS1g5fmufYOtnxLGUZM2178PKbhsk7Ffv58IX+ZtcvoGwccYsh0PglkAA==} + engines: {node: '>= 0.4'} + + is-symbol@1.1.1: + resolution: {integrity: sha512-9gGx6GTtCQM73BgmHQXfDmLtfjjTUDSyoxTCbp5WtoixAhfgsDirWIcVQ/IHpvI5Vgd5i/J5F7B9cN/WlVbC/w==} + engines: {node: '>= 0.4'} + + is-typed-array@1.1.15: + resolution: {integrity: sha512-p3EcsicXjit7SaskXHs1hA91QxgTw46Fv6EFKKGS5DRFLD8yKnohjF3hxoju94b/OcMZoQukzpPpBE9uLVKzgQ==} + engines: {node: '>= 0.4'} + + is-unicode-supported@1.3.0: + resolution: {integrity: sha512-43r2mRvz+8JRIKnWJ+3j8JtjRKZ6GmjzfaE/qiBJnikNnYv/6bagRJ1kUhNk8R5EX/GkobD+r+sfxCPJsiKBLQ==} + engines: {node: '>=12'} + + is-unicode-supported@2.1.0: + resolution: {integrity: sha512-mE00Gnza5EEB3Ds0HfMyllZzbBrmLOX3vfWoj9A9PEnTfratQ/BcaJOuMhnkhjXvb2+FkY3VuHqtAGpTPmglFQ==} + engines: {node: '>=18'} + + is-weakmap@2.0.2: + resolution: {integrity: sha512-K5pXYOm9wqY1RgjpL3YTkF39tni1XajUIkawTLUo9EZEVUFga5gSQJF8nNS7ZwJQ02y+1YCNYcMh+HIf1ZqE+w==} + engines: {node: '>= 0.4'} + + is-weakref@1.1.1: + resolution: {integrity: sha512-6i9mGWSlqzNMEqpCp93KwRS1uUOodk2OJ6b+sq7ZPDSy2WuI5NFIxp/254TytR8ftefexkWn5xNiHUNpPOfSew==} + engines: {node: '>= 0.4'} + + is-weakset@2.0.4: + resolution: {integrity: sha512-mfcwb6IzQyOKTs84CQMrOwW4gQcaTOAWJ0zzJCl2WSPDrWk/OzDaImWFH3djXhb24g4eudZfLRozAvPGw4d9hQ==} + engines: {node: '>= 0.4'} + + is-wsl@3.1.1: + resolution: {integrity: sha512-e6rvdUCiQCAuumZslxRJWR/Doq4VpPR82kqclvcS0efgt430SlGIk05vdCN58+VrzgtIcfNODjozVielycD4Sw==} + engines: {node: '>=16'} + + isarray@2.0.5: + resolution: {integrity: sha512-xHjhDr3cNBK0BzdUJSPXZntQUx/mwMS5Rw4A7lPJ90XGAO6ISP/ePDNuo0vhqOZU+UD5JoodwCAAoZQd3FeAKw==} + + isexe@2.0.0: + resolution: {integrity: sha512-RHxMLp9lnKHGHRng9QFhRCMbYAcVpn69smSGcq3f36xjgVVWThj4qqLbTLlq7Ssj8B+fIQ1EuCEGI2lKsyQeIw==} + + isexe@3.1.5: + resolution: {integrity: sha512-6B3tLtFqtQS4ekarvLVMZ+X+VlvQekbe4taUkf/rhVO3d/h0M2rfARm/pXLcPEsjjMsFgrFgSrhQIxcSVrBz8w==} + engines: {node: '>=18'} + + iterator.prototype@1.1.5: + resolution: {integrity: sha512-H0dkQoCa3b2VEeKQBOxFph+JAbcrQdE7KC0UkqwpLmv2EC4P41QXP+rqo9wYodACiG5/WM5s9oDApTU8utwj9g==} + engines: {node: '>= 0.4'} + + jiti@2.6.1: + resolution: {integrity: sha512-ekilCSN1jwRvIbgeg/57YFh8qQDNbwDb9xT/qu2DAHbFFZUicIl4ygVaAvzveMhMVr3LnpSKTNnwt8PoOfmKhQ==} + hasBin: true + + jose@6.2.2: + resolution: {integrity: sha512-d7kPDd34KO/YnzaDOlikGpOurfF0ByC2sEV4cANCtdqLlTfBlw2p14O/5d/zv40gJPbIQxfES3nSx1/oYNyuZQ==} + + js-tokens@4.0.0: + resolution: {integrity: sha512-RdJUflcE3cUzKiMqQgsCu06FPu9UdIJO0beYbPhHN4k6apgJtifcoCtT9bcxOpYBtpD2kCM6Sbzg4CausW/PKQ==} + + js-yaml@4.1.1: + resolution: {integrity: sha512-qQKT4zQxXl8lLwBtHMWwaTcGfFOZviOJet3Oy/xmGk2gZH677CJM9EvtfdSkgWcATZhj/55JZ0rmy3myCT5lsA==} + hasBin: true + + jsesc@3.1.0: + resolution: {integrity: 
sha512-/sM3dO2FOzXjKQhJuo0Q173wf2KOo8t4I8vHy6lF9poUp7bKT0/NHE8fPX23PwfhnykfqnC2xRxOnVw5XuGIaA==} + engines: {node: '>=6'} + hasBin: true + + json-buffer@3.0.1: + resolution: {integrity: sha512-4bV5BfR2mqfQTJm+V5tPPdf+ZpuhiIvTuAB5g8kcrXOZpTT/QwwVRWBywX1ozr6lEuPdbHxwaJlm9G6mI2sfSQ==} + + json-parse-even-better-errors@2.3.1: + resolution: {integrity: sha512-xyFwyhro/JEof6Ghe2iz2NcXoj2sloNsWr/XsERDK/oiPCfaNhl5ONfp+jQdAZRQQ0IJWNzH9zIZF7li91kh2w==} + + json-schema-traverse@0.4.1: + resolution: {integrity: sha512-xbbCH5dCYU5T8LcEhhuh7HJ88HXuW3qsI3Y0zOZFKfZEHcpWiHU/Jxzk629Brsab/mMiHQti9wMP+845RPe3Vg==} + + json-schema-traverse@1.0.0: + resolution: {integrity: sha512-NM8/P9n3XjXhIZn1lLhkFaACTOURQXjWhV4BA/RnOv8xvgqtqpAX9IO4mRQxSx1Rlo4tqzeqb0sOlruaOy3dug==} + + json-schema-typed@8.0.2: + resolution: {integrity: sha512-fQhoXdcvc3V28x7C7BMs4P5+kNlgUURe2jmUT1T//oBRMDrqy1QPelJimwZGo7Hg9VPV3EQV5Bnq4hbFy2vetA==} + + json-stable-stringify-without-jsonify@1.0.1: + resolution: {integrity: sha512-Bdboy+l7tA3OGW6FjyFHWkP5LuByj1Tk33Ljyq0axyzdk9//JSi2u3fP1QSmd1KNwq6VOKYGlAu87CisVir6Pw==} + + json5@1.0.2: + resolution: {integrity: sha512-g1MWMLBiz8FKi1e4w0UyVL3w+iJceWAFBAaBnnGKOpNa5f8TLktkbre1+s6oICydWAm+HRUGTmI+//xv2hvXYA==} + hasBin: true + + json5@2.2.3: + resolution: {integrity: sha512-XmOWe7eyHYH14cLdVPoyg+GOH3rYX++KpzrylJwSW98t3Nk+U8XOl8FWKOgwtzdb8lXGf6zYwDUzeHMWfxasyg==} + engines: {node: '>=6'} + hasBin: true + + jsonfile@6.2.0: + resolution: {integrity: sha512-FGuPw30AdOIUTRMC2OMRtQV+jkVj2cfPqSeWXv1NEAJ1qZ5zb1X6z1mFhbfOB/iy3ssJCD+3KuZ8r8C3uVFlAg==} + + jsx-ast-utils@3.3.5: + resolution: {integrity: sha512-ZZow9HBI5O6EPgSJLUb8n2NKgmVWTwCvHGwFuJlMjvLFqlGG6pjirPhtdsseaLZjSibD8eegzmYpUZwoIlj2cQ==} + engines: {node: '>=4.0'} + + keyv@4.5.4: + resolution: {integrity: sha512-oxVHkHR/EJf2CNXnWxRLW6mg7JyCCUcG0DtEGmL2ctUo1PNTin1PUil+r/+4r5MpVgC/fn1kjsx7mjSujKqIpw==} + + kleur@3.0.3: + resolution: {integrity: sha512-eTIzlVOSUR+JxdDFepEYcBMtZ9Qqdef+rnzWdRZuMbOywu5tO2w2N7rqjoANZ5k9vywhL6Br1VRjUIgTQx4E8w==} + engines: {node: '>=6'} + + kleur@4.1.5: + resolution: {integrity: sha512-o+NO+8WrRiQEE4/7nwRJhN1HWpVmJm511pBHUxPLtp0BUISzlBplORYSmTclCnJvQq2tKu/sgl3xVpkc7ZWuQQ==} + engines: {node: '>=6'} + + language-subtag-registry@0.3.23: + resolution: {integrity: sha512-0K65Lea881pHotoGEa5gDlMxt3pctLi2RplBb7Ezh4rRdLEOtgi7n4EwK9lamnUCkKBqaeKRVebTq6BAxSkpXQ==} + + language-tags@1.0.9: + resolution: {integrity: sha512-MbjN408fEndfiQXbFQ1vnd+1NoLDsnQW41410oQBXiyXDMYH5z505juWa4KUE1LqxRC7DgOgZDbKLxHIwm27hA==} + engines: {node: '>=0.10'} + + levn@0.4.1: + resolution: {integrity: sha512-+bT2uH4E5LGE7h/n3evcS/sQlJXCpIp6ym8OWJ5eV6+67Dsql/LaaT7qJBAt2rzfoa/5QBGBhxDix1dMt2kQKQ==} + engines: {node: '>= 0.8.0'} + + lightningcss-android-arm64@1.32.0: + resolution: {integrity: sha512-YK7/ClTt4kAK0vo6w3X+Pnm0D2cf2vPHbhOXdoNti1Ga0al1P4TBZhwjATvjNwLEBCnKvjJc2jQgHXH0NEwlAg==} + engines: {node: '>= 12.0.0'} + cpu: [arm64] + os: [android] + + lightningcss-darwin-arm64@1.32.0: + resolution: {integrity: sha512-RzeG9Ju5bag2Bv1/lwlVJvBE3q6TtXskdZLLCyfg5pt+HLz9BqlICO7LZM7VHNTTn/5PRhHFBSjk5lc4cmscPQ==} + engines: {node: '>= 12.0.0'} + cpu: [arm64] + os: [darwin] + + lightningcss-darwin-x64@1.32.0: + resolution: {integrity: sha512-U+QsBp2m/s2wqpUYT/6wnlagdZbtZdndSmut/NJqlCcMLTWp5muCrID+K5UJ6jqD2BFshejCYXniPDbNh73V8w==} + engines: {node: '>= 12.0.0'} + cpu: [x64] + os: [darwin] + + lightningcss-freebsd-x64@1.32.0: + resolution: {integrity: sha512-JCTigedEksZk3tHTTthnMdVfGf61Fky8Ji2E4YjUTEQX14xiy/lTzXnu1vwiZe3bYe0q+SpsSH/CTeDXK6WHig==} + engines: {node: '>= 
12.0.0'} + cpu: [x64] + os: [freebsd] + + lightningcss-linux-arm-gnueabihf@1.32.0: + resolution: {integrity: sha512-x6rnnpRa2GL0zQOkt6rts3YDPzduLpWvwAF6EMhXFVZXD4tPrBkEFqzGowzCsIWsPjqSK+tyNEODUBXeeVHSkw==} + engines: {node: '>= 12.0.0'} + cpu: [arm] + os: [linux] + + lightningcss-linux-arm64-gnu@1.32.0: + resolution: {integrity: sha512-0nnMyoyOLRJXfbMOilaSRcLH3Jw5z9HDNGfT/gwCPgaDjnx0i8w7vBzFLFR1f6CMLKF8gVbebmkUN3fa/kQJpQ==} + engines: {node: '>= 12.0.0'} + cpu: [arm64] + os: [linux] + libc: [glibc] + + lightningcss-linux-arm64-musl@1.32.0: + resolution: {integrity: sha512-UpQkoenr4UJEzgVIYpI80lDFvRmPVg6oqboNHfoH4CQIfNA+HOrZ7Mo7KZP02dC6LjghPQJeBsvXhJod/wnIBg==} + engines: {node: '>= 12.0.0'} + cpu: [arm64] + os: [linux] + libc: [musl] + + lightningcss-linux-x64-gnu@1.32.0: + resolution: {integrity: sha512-V7Qr52IhZmdKPVr+Vtw8o+WLsQJYCTd8loIfpDaMRWGUZfBOYEJeyJIkqGIDMZPwPx24pUMfwSxxI8phr/MbOA==} + engines: {node: '>= 12.0.0'} + cpu: [x64] + os: [linux] + libc: [glibc] + + lightningcss-linux-x64-musl@1.32.0: + resolution: {integrity: sha512-bYcLp+Vb0awsiXg/80uCRezCYHNg1/l3mt0gzHnWV9XP1W5sKa5/TCdGWaR/zBM2PeF/HbsQv/j2URNOiVuxWg==} + engines: {node: '>= 12.0.0'} + cpu: [x64] + os: [linux] + libc: [musl] + + lightningcss-win32-arm64-msvc@1.32.0: + resolution: {integrity: sha512-8SbC8BR40pS6baCM8sbtYDSwEVQd4JlFTOlaD3gWGHfThTcABnNDBda6eTZeqbofalIJhFx0qKzgHJmcPTnGdw==} + engines: {node: '>= 12.0.0'} + cpu: [arm64] + os: [win32] + + lightningcss-win32-x64-msvc@1.32.0: + resolution: {integrity: sha512-Amq9B/SoZYdDi1kFrojnoqPLxYhQ4Wo5XiL8EVJrVsB8ARoC1PWW6VGtT0WKCemjy8aC+louJnjS7U18x3b06Q==} + engines: {node: '>= 12.0.0'} + cpu: [x64] + os: [win32] + + lightningcss@1.32.0: + resolution: {integrity: sha512-NXYBzinNrblfraPGyrbPoD19C1h9lfI/1mzgWYvXUTe414Gz/X1FD2XBZSZM7rRTrMA8JL3OtAaGifrIKhQ5yQ==} + engines: {node: '>= 12.0.0'} + + lines-and-columns@1.2.4: + resolution: {integrity: sha512-7ylylesZQ/PV29jhEDl3Ufjo6ZX7gCqJr5F7PKrqc93v7fzSymt1BpwEU8nAUXs8qzzvqhbjhK5QZg6Mt/HkBg==} + + locate-path@6.0.0: + resolution: {integrity: sha512-iPZK6eYjbxRu3uB4/WZ3EsEIMJFMqAoopl3R+zuq0UjcAm/MO6KCweDgPfP3elTztoKP3KtnVHxTn2NHBSDVUw==} + engines: {node: '>=10'} + + lodash.merge@4.6.2: + resolution: {integrity: sha512-0KpjqXRVvrYyCsX1swR/XTK0va6VQkQM6MNo7PqW77ByjAhoARA8EfrP1N4+KlKj8YS0ZUCtRT/YUuhyYDujIQ==} + + log-symbols@6.0.0: + resolution: {integrity: sha512-i24m8rpwhmPIS4zscNzK6MSEhk0DUWa/8iYQWxhffV8jkI4Phvs3F+quL5xvS0gdQR0FyTCMMH33Y78dDTzzIw==} + engines: {node: '>=18'} + + loose-envify@1.4.0: + resolution: {integrity: sha512-lyuxPGr/Wfhrlem2CL/UcnUc1zcqKAImBDzukY7Y5F/yQiNdko6+fRLevlw1HgMySw7f611UIY408EtxRSoK3Q==} + hasBin: true + + lru-cache@5.1.1: + resolution: {integrity: sha512-KpNARQA3Iwv+jTA0utUVVbrh+Jlrr1Fv0e56GGzAFOXN7dk/FviaDW8LHmK52DlcH4WP2n6gI8vN1aesBFgo9w==} + + lucide-react@0.500.0: + resolution: {integrity: sha512-IyGvnYOSBKiF7rFI3p0tDw4ZI+TKnn1b5LPF2/q3NBmA7zYaDagevs1eMNyPRpKdFXxgma6rEu6+2ylJaaGnJA==} + peerDependencies: + react: ^16.5.1 || ^17.0.0 || ^18.0.0 || ^19.0.0 + + magic-string@0.30.21: + resolution: {integrity: sha512-vd2F4YUyEXKGcLHoq+TEyCjxueSeHnFxyyjNp80yg0XV4vUhnDer/lvvlqM/arB5bXQN5K2/3oinyCRyx8T2CQ==} + + math-intrinsics@1.1.0: + resolution: {integrity: sha512-/IXtbwEk5HTPyEwyKX6hGkYXxM9nbj64B+ilVJnC/R6B0pH5G4V3b0pVbL7DBj4tkhBAppbQUlf6F6Xl9LHu1g==} + engines: {node: '>= 0.4'} + + media-typer@1.1.0: + resolution: {integrity: sha512-aisnrDP4GNe06UcKFnV5bfMNPBUw4jsLGaWwWfnH3v02GnBuXX2MCVn5RbrWo0j3pczUilYblq7fQ7Nw2t5XKw==} + engines: {node: '>= 0.8'} + + merge-descriptors@2.0.0: + resolution: 
{integrity: sha512-Snk314V5ayFLhp3fkUREub6WtjBfPdCPY1Ln8/8munuLuiYhsABgBVWsozAG+MWMbVEvcdcpbi9R7ww22l9Q3g==} + engines: {node: '>=18'} + + merge-stream@2.0.0: + resolution: {integrity: sha512-abv/qOcuPfk3URPfDzmZU1LKmuw8kT+0nIHvKrKgFrwifol/doWcdA4ZqsWQ8ENrFKkd67Mfpo/LovbIUsbt3w==} + + merge2@1.4.1: + resolution: {integrity: sha512-8q7VEgMJW4J8tcfVPy8g09NcQwZdbwFEqhe/WZkoIzjn/3TGDwtOCYtXGxA3O8tPzpczCCDgv+P2P5y00ZJOOg==} + engines: {node: '>= 8'} + + micromatch@4.0.8: + resolution: {integrity: sha512-PXwfBhYu0hBCPw8Dn0E+WDYb7af3dSLVWKi3HGv84IdF4TyFoC0ysxFd0Goxw7nSv4T/PzEJQxsYsEiFCKo2BA==} + engines: {node: '>=8.6'} + + mime-db@1.54.0: + resolution: {integrity: sha512-aU5EJuIN2WDemCcAp2vFBfp/m4EAhWJnUNSSw0ixs7/kXbd6Pg64EmwJkNdFhB8aWt1sH2CTXrLxo/iAGV3oPQ==} + engines: {node: '>= 0.6'} + + mime-types@3.0.2: + resolution: {integrity: sha512-Lbgzdk0h4juoQ9fCKXW4by0UJqj+nOOrI9MJ1sSj4nI8aI2eo1qmvQEie4VD1glsS250n15LsWsYtCugiStS5A==} + engines: {node: '>=18'} + + mimic-fn@2.1.0: + resolution: {integrity: sha512-OqbOk5oEQeAZ8WXWydlu9HJjz9WVdEIvamMCcXmuqUYjTknH/sqsWvhQ3vgwKFRR1HpjvNBKQ37nbJgYzGqGcg==} + engines: {node: '>=6'} + + mimic-function@5.0.1: + resolution: {integrity: sha512-VP79XUPxV2CigYP3jWwAUFSku2aKqBH7uTAapFWCBqutsbmDo96KY5o8uh6U+/YSIn5OxJnXp73beVkpqMIGhA==} + engines: {node: '>=18'} + + minimatch@10.2.4: + resolution: {integrity: sha512-oRjTw/97aTBN0RHbYCdtF1MQfvusSIBQM0IZEgzl6426+8jSC0nF1a/GmnVLpfB9yyr6g6FTqWqiZVbxrtaCIg==} + engines: {node: 18 || 20 || >=22} + + minimatch@3.1.5: + resolution: {integrity: sha512-VgjWUsnnT6n+NUk6eZq77zeFdpW2LWDzP6zFGrCbHXiYNul5Dzqk2HHQ5uFH2DNW5Xbp8+jVzaeNt94ssEEl4w==} + + minimist@1.2.8: + resolution: {integrity: sha512-2yyAR8qBkN3YuheJanUpWC5U3bb5osDywNB8RzDVlDwDHbocAJveqqj1u8+SVD7jkWT4yvsHCpWqqWqAxb0zCA==} + + ms@2.1.3: + resolution: {integrity: sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA==} + + msw@2.12.14: + resolution: {integrity: sha512-4KXa4nVBIBjbDbd7vfQNuQ25eFxug0aropCQFoI0JdOBuJWamkT1yLVIWReFI8SiTRc+H1hKzaNk+cLk2N9rtQ==} + engines: {node: '>=18'} + hasBin: true + peerDependencies: + typescript: '>= 4.8.x' + peerDependenciesMeta: + typescript: + optional: true + + mute-stream@2.0.0: + resolution: {integrity: sha512-WWdIxpyjEn+FhQJQQv9aQAYlHoNVdzIzUySNV1gHUPDSdZJ3yZn7pAAbQcV7B56Mvu881q9FZV+0Vx2xC44VWA==} + engines: {node: ^18.17.0 || >=20.5.0} + + nanoid@3.3.11: + resolution: {integrity: sha512-N8SpfPUnUp1bK+PMYW8qSWdl9U+wwNWI4QKxOYDy9JAro3WMX7p2OeVRF9v+347pnakNevPmiHhNmZ2HbFA76w==} + engines: {node: ^10 || ^12 || ^13.7 || ^14 || >=15.0.1} + hasBin: true + + napi-postinstall@0.3.4: + resolution: {integrity: sha512-PHI5f1O0EP5xJ9gQmFGMS6IZcrVvTjpXjz7Na41gTE7eE2hK11lg04CECCYEEjdc17EV4DO+fkGEtt7TpTaTiQ==} + engines: {node: ^12.20.0 || ^14.18.0 || >=16.0.0} + hasBin: true + + natural-compare@1.4.0: + resolution: {integrity: sha512-OWND8ei3VtNC9h7V60qff3SVobHr996CTwgxubgyQYEpg290h9J0buyECNNJexkFm5sOajh5G116RYA1c8ZMSw==} + + negotiator@1.0.0: + resolution: {integrity: sha512-8Ofs/AUQh8MaEcrlq5xOX0CQ9ypTF5dl78mjlMNfOK08fzpgTHQRQPBxcPlEtIw0yRpws+Zo/3r+5WRby7u3Gg==} + engines: {node: '>= 0.6'} + + next-themes@0.4.6: + resolution: {integrity: sha512-pZvgD5L0IEvX5/9GWyHMf3m8BKiVQwsCMHfoFosXtXBMnaS0ZnIJ9ST4b4NqLVKDEm8QBxoNNGNaBv2JNF6XNA==} + peerDependencies: + react: ^16.8 || ^17 || ^18 || ^19 || ^19.0.0-rc + react-dom: ^16.8 || ^17 || ^18 || ^19 || ^19.0.0-rc + + next@16.1.7: + resolution: {integrity: sha512-WM0L7WrSvKwoLegLYr6V+mz+RIofqQgVAfHhMp9a88ms0cFX8iX9ew+snpWlSBwpkURJOUdvCEt3uLl3NNzvWg==} + engines: 
{node: '>=20.9.0'} + hasBin: true + peerDependencies: + '@opentelemetry/api': ^1.1.0 + '@playwright/test': ^1.51.1 + babel-plugin-react-compiler: '*' + react: ^18.2.0 || 19.0.0-rc-de68d2f4-20241204 || ^19.0.0 + react-dom: ^18.2.0 || 19.0.0-rc-de68d2f4-20241204 || ^19.0.0 + sass: ^1.3.0 + peerDependenciesMeta: + '@opentelemetry/api': + optional: true + '@playwright/test': + optional: true + babel-plugin-react-compiler: + optional: true + sass: + optional: true + + node-domexception@1.0.0: + resolution: {integrity: sha512-/jKZoMpw0F8GRwl4/eLROPA3cfcXtLApP0QzLmUT/HuPCZWyB7IY9ZrMeKw2O/nFIqPQB3PVM9aYm0F312AXDQ==} + engines: {node: '>=10.5.0'} + deprecated: Use your platform's native DOMException instead + + node-exports-info@1.6.0: + resolution: {integrity: sha512-pyFS63ptit/P5WqUkt+UUfe+4oevH+bFeIiPPdfb0pFeYEu/1ELnJu5l+5EcTKYL5M7zaAa7S8ddywgXypqKCw==} + engines: {node: '>= 0.4'} + + node-fetch@3.3.2: + resolution: {integrity: sha512-dRB78srN/l6gqWulah9SrxeYnxeddIG30+GOqK/9OlLVyLg3HPnr6SqOWTWOXKRwC2eGYCkZ59NNuSgvSrpgOA==} + engines: {node: ^12.20.0 || ^14.13.1 || >=16.0.0} + + node-releases@2.0.36: + resolution: {integrity: sha512-TdC8FSgHz8Mwtw9g5L4gR/Sh9XhSP/0DEkQxfEFXOpiul5IiHgHan2VhYYb6agDSfp4KuvltmGApc8HMgUrIkA==} + + npm-run-path@4.0.1: + resolution: {integrity: sha512-S48WzZW777zhNIrn7gxOlISNAqi9ZC/uQFnRdbeIHhZhCA6UqpkOT8T1G7BvfdgP4Er8gF4sUbaS0i7QvIfCWw==} + engines: {node: '>=8'} + + npm-run-path@6.0.0: + resolution: {integrity: sha512-9qny7Z9DsQU8Ou39ERsPU4OZQlSTP47ShQzuKZ6PRXpYLtIFgl/DEBYEXKlvcEa+9tHVcK8CF81Y2V72qaZhWA==} + engines: {node: '>=18'} + + object-assign@4.1.1: + resolution: {integrity: sha512-rJgTQnkUnH1sFw8yT6VSU3zD3sWmu6sZhIseY8VX+GRu3P6F7Fu+JNDoXfklElbLJSnc3FUQHVe4cU5hj+BcUg==} + engines: {node: '>=0.10.0'} + + object-inspect@1.13.4: + resolution: {integrity: sha512-W67iLl4J2EXEGTbfeHCffrjDfitvLANg0UlX3wFUUSTx92KXRFegMHUVgSqE+wvhAbi4WqjGg9czysTV2Epbew==} + engines: {node: '>= 0.4'} + + object-keys@1.1.1: + resolution: {integrity: sha512-NuAESUOUMrlIXOfHKzD6bpPu3tYt3xvjNdRIQ+FeT0lNb4K8WR70CaDxhuNguS2XG+GjkyMwOzsN5ZktImfhLA==} + engines: {node: '>= 0.4'} + + object-treeify@1.1.33: + resolution: {integrity: sha512-EFVjAYfzWqWsBMRHPMAXLCDIJnpMhdWAqR7xG6M6a2cs6PMFpl/+Z20w9zDW4vkxOFfddegBKq9Rehd0bxWE7A==} + engines: {node: '>= 10'} + + object.assign@4.1.7: + resolution: {integrity: sha512-nK28WOo+QIjBkDduTINE4JkF/UJJKyf2EJxvJKfblDpyg0Q+pkOHNTL0Qwy6NP6FhE/EnzV73BxxqcJaXY9anw==} + engines: {node: '>= 0.4'} + + object.entries@1.1.9: + resolution: {integrity: sha512-8u/hfXFRBD1O0hPUjioLhoWFHRmt6tKA4/vZPyckBr18l1KE9uHrFaFaUi8MDRTpi4uak2goyPTSNJLXX2k2Hw==} + engines: {node: '>= 0.4'} + + object.fromentries@2.0.8: + resolution: {integrity: sha512-k6E21FzySsSK5a21KRADBd/NGneRegFO5pLHfdQLpRDETUNJueLXs3WCzyQ3tFRDYgbq3KHGXfTbi2bs8WQ6rQ==} + engines: {node: '>= 0.4'} + + object.groupby@1.0.3: + resolution: {integrity: sha512-+Lhy3TQTuzXI5hevh8sBGqbmurHbbIjAi0Z4S63nthVLmLxfbj4T54a4CfZrXIrt9iP4mVAPYMo/v99taj3wjQ==} + engines: {node: '>= 0.4'} + + object.values@1.2.1: + resolution: {integrity: sha512-gXah6aZrcUxjWg2zR2MwouP2eHlCBzdV4pygudehaKXSGW4v2AsRQUK+lwwXhii6KFZcunEnmSUoYp5CXibxtA==} + engines: {node: '>= 0.4'} + + on-finished@2.4.1: + resolution: {integrity: sha512-oVlzkg3ENAhCk2zdv7IJwd/QUD4z2RxRwpkcGY8psCVcCYZNq4wYnVWALHM+brtuJjePWiYF/ClmuDr8Ch5+kg==} + engines: {node: '>= 0.8'} + + once@1.4.0: + resolution: {integrity: sha512-lNaJgI+2Q5URQBkccEKHTQOPaXdUxnZZElQTZY0MFUAuaEqe1E+Nyvgdz/aIyNi6Z9MzO5dv1H8n58/GELp3+w==} + + onetime@5.1.2: + resolution: {integrity: 
sha512-kbpaSSGJTWdAY5KPVeMOKXSrPtr8C8C7wodJbcsd51jRnmD+GZu8Y0VoU6Dm5Z4vWr0Ig/1NKuWRKf7j5aaYSg==} + engines: {node: '>=6'} + + onetime@7.0.0: + resolution: {integrity: sha512-VXJjc87FScF88uafS3JllDgvAm+c/Slfz06lorj2uAY34rlUu0Nt+v8wreiImcrgAjjIHp1rXpTDlLOGw29WwQ==} + engines: {node: '>=18'} + + open@11.0.0: + resolution: {integrity: sha512-smsWv2LzFjP03xmvFoJ331ss6h+jixfA4UUV/Bsiyuu4YJPfN+FIQGOIiv4w9/+MoHkfkJ22UIaQWRVFRfH6Vw==} + engines: {node: '>=20'} + + optionator@0.9.4: + resolution: {integrity: sha512-6IpQ7mKUxRcZNLIObR0hz7lxsapSSIYNZJwXPGeF0mTVqGKFIXj1DQcMoT22S3ROcLyY/rz0PWaWZ9ayWmad9g==} + engines: {node: '>= 0.8.0'} + + ora@8.2.0: + resolution: {integrity: sha512-weP+BZ8MVNnlCm8c0Qdc1WSWq4Qn7I+9CJGm7Qali6g44e/PUzbjNqJX5NJ9ljlNMosfJvg1fKEGILklK9cwnw==} + engines: {node: '>=18'} + + outvariant@1.4.3: + resolution: {integrity: sha512-+Sl2UErvtsoajRDKCE5/dBz4DIvHXQQnAxtQTF04OJxY0+DyZXSo5P5Bb7XYWOh81syohlYL24hbDwxedPUJCA==} + + own-keys@1.0.1: + resolution: {integrity: sha512-qFOyK5PjiWZd+QQIh+1jhdb9LpxTF0qs7Pm8o5QHYZ0M3vKqSqzsZaEB6oWlxZ+q2sJBMI/Ktgd2N5ZwQoRHfg==} + engines: {node: '>= 0.4'} + + p-limit@3.1.0: + resolution: {integrity: sha512-TYOanM3wGwNGsZN2cVTYPArw454xnXj5qmWF1bEoAc4+cU/ol7GVh7odevjp1FNHduHc3KZMcFduxU5Xc6uJRQ==} + engines: {node: '>=10'} + + p-locate@5.0.0: + resolution: {integrity: sha512-LaNjtRWUBY++zB5nE/NwcaoMylSPk+S+ZHNB1TzdbMJMny6dynpAGt7X/tl/QYq3TIeE6nxHppbo2LGymrG5Pw==} + engines: {node: '>=10'} + + parent-module@1.0.1: + resolution: {integrity: sha512-GQ2EWRpQV8/o+Aw8YqtfZZPfNRWZYkbidE9k5rpl/hC3vtHHBfGm2Ifi6qWV+coDGkrUKZAxE3Lot5kcsRlh+g==} + engines: {node: '>=6'} + + parse-json@5.2.0: + resolution: {integrity: sha512-ayCKvm/phCGxOkYRSCM82iDwct8/EonSEgCSxWxD7ve6jHggsFl4fZVQBPRNgQoKiuV/odhFrGzQXZwbifC8Rg==} + engines: {node: '>=8'} + + parse-ms@4.0.0: + resolution: {integrity: sha512-TXfryirbmq34y8QBwgqCVLi+8oA3oWx2eAnSn62ITyEhEYaWRlVZ2DvMM9eZbMs/RfxPu/PK/aBLyGj4IrqMHw==} + engines: {node: '>=18'} + + parseurl@1.3.3: + resolution: {integrity: sha512-CiyeOxFT/JZyN5m0z9PfXw4SCBJ6Sygz1Dpl0wqjlhDEGGBP1GnsUVEL0p63hoG1fcj3fHynXi9NYO4nWOL+qQ==} + engines: {node: '>= 0.8'} + + path-browserify@1.0.1: + resolution: {integrity: sha512-b7uo2UCUOYZcnF/3ID0lulOJi/bafxa1xPe7ZPsammBSpjSWQkjNxlt635YGS2MiR9GjvuXCtz2emr3jbsz98g==} + + path-exists@4.0.0: + resolution: {integrity: sha512-ak9Qy5Q7jYb2Wwcey5Fpvg2KoAc/ZIhLSLOSBmRmygPsGwkVVt0fZa0qrtMz+m6tJTAHfZQ8FnmB4MG4LWy7/w==} + engines: {node: '>=8'} + + path-key@3.1.1: + resolution: {integrity: sha512-ojmeN0qd+y0jszEtoY48r0Peq5dwMEkIlCOu6Q5f41lfkswXuKtYrhgoTpLnyIcHm24Uhqx+5Tqm2InSwLhE6Q==} + engines: {node: '>=8'} + + path-key@4.0.0: + resolution: {integrity: sha512-haREypq7xkM7ErfgIyA0z+Bj4AGKlMSdlQE2jvJo6huWD1EdkKYV+G/T4nq0YEF2vgTT8kqMFKo1uHn950r4SQ==} + engines: {node: '>=12'} + + path-parse@1.0.7: + resolution: {integrity: sha512-LDJzPVEEEPR+y48z93A0Ed0yXb8pAByGWo/k5YYdYgpY2/2EsOsksJrq7lOHxryrVOn1ejG6oAp8ahvOIQD8sw==} + + path-to-regexp@6.3.0: + resolution: {integrity: sha512-Yhpw4T9C6hPpgPeA28us07OJeqZ5EzQTkbfwuhsUg0c237RomFoETJgmp2sa3F/41gfLE6G5cqcYwznmeEeOlQ==} + + path-to-regexp@8.3.0: + resolution: {integrity: sha512-7jdwVIRtsP8MYpdXSwOS0YdD0Du+qOoF/AEPIt88PcCFrZCzx41oxku1jD88hZBwbNUIEfpqvuhjFaMAqMTWnA==} + + picocolors@1.1.1: + resolution: {integrity: sha512-xceH2snhtb5M9liqDsmEw56le376mTZkEX/jEb/RxNFyegNul7eNslCXP9FDj/Lcu0X8KEyMceP2ntpaHrDEVA==} + + picomatch@2.3.1: + resolution: {integrity: sha512-JU3teHTNjmE2VCGFzuY8EXzCDVwEqB2a8fsIvwaStHhAWJEeVd1o1QD80CU6+ZdEXXSLbSsuLwJjkCBWqRQUVA==} + engines: {node: '>=8.6'} + + 
picomatch@4.0.3: + resolution: {integrity: sha512-5gTmgEY/sqK6gFXLIsQNH19lWb4ebPDLA4SdLP7dsWkIXHWlG66oPuVvXSGFPppYZz8ZDZq0dYYrbHfBCVUb1Q==} + engines: {node: '>=12'} + + pkce-challenge@5.0.1: + resolution: {integrity: sha512-wQ0b/W4Fr01qtpHlqSqspcj3EhBvimsdh0KlHhH8HRZnMsEa0ea2fTULOXOS9ccQr3om+GcGRk4e+isrZWV8qQ==} + engines: {node: '>=16.20.0'} + + possible-typed-array-names@1.1.0: + resolution: {integrity: sha512-/+5VFTchJDoVj3bhoqi6UeymcD00DAwb1nJwamzPvHEszJ4FpF6SNNbUbOS8yI56qHzdV8eK0qEfOSiodkTdxg==} + engines: {node: '>= 0.4'} + + postcss-selector-parser@7.1.1: + resolution: {integrity: sha512-orRsuYpJVw8LdAwqqLykBj9ecS5/cRHlI5+nvTo8LcCKmzDmqVORXtOIYEEQuL9D4BxtA1lm5isAqzQZCoQ6Eg==} + engines: {node: '>=4'} + + postcss@8.4.31: + resolution: {integrity: sha512-PS08Iboia9mts/2ygV3eLpY5ghnUcfLV/EXTOW1E2qYxJKGGBUtNjN76FYHnMs36RmARn41bC0AZmn+rR0OVpQ==} + engines: {node: ^10 || ^12 || >=14} + + postcss@8.5.8: + resolution: {integrity: sha512-OW/rX8O/jXnm82Ey1k44pObPtdblfiuWnrd8X7GJ7emImCOstunGbXUpp7HdBrFQX6rJzn3sPT397Wp5aCwCHg==} + engines: {node: ^10 || ^12 || >=14} + + powershell-utils@0.1.0: + resolution: {integrity: sha512-dM0jVuXJPsDN6DvRpea484tCUaMiXWjuCn++HGTqUWzGDjv5tZkEZldAJ/UMlqRYGFrD/etByo4/xOuC/snX2A==} + engines: {node: '>=20'} + + prelude-ls@1.2.1: + resolution: {integrity: sha512-vkcDPrRZo1QZLbn5RLGPpg/WmIQ65qoWWhcGKf/b5eplkkarX0m9z8ppCat4mlOqUsWpyNuYgO3VRyrYHSzX5g==} + engines: {node: '>= 0.8.0'} + + pretty-ms@9.3.0: + resolution: {integrity: sha512-gjVS5hOP+M3wMm5nmNOucbIrqudzs9v/57bWRHQWLYklXqoXKrVfYW2W9+glfGsqtPgpiz5WwyEEB+ksXIx3gQ==} + engines: {node: '>=18'} + + prompts@2.4.2: + resolution: {integrity: sha512-NxNv/kLguCA7p3jE8oL2aEBsrJWgAakBpgmgK6lpPWV+WuOmY6r2/zbAVnP+T8bQlA0nzHXSJSJW0Hq7ylaD2Q==} + engines: {node: '>= 6'} + + prop-types@15.8.1: + resolution: {integrity: sha512-oj87CgZICdulUohogVAR7AjlC0327U4el4L6eAvOqCeudMDVU0NThNaV+b9Df4dXgSP1gXMTnPdhfe/2qDH5cg==} + + proxy-addr@2.0.7: + resolution: {integrity: sha512-llQsMLSUDUPT44jdrU/O37qlnifitDP+ZwrmmZcoSKyLKvtZxpyV0n2/bD/N4tBAAZ/gJEdZU7KMraoK1+XYAg==} + engines: {node: '>= 0.10'} + + punycode@2.3.1: + resolution: {integrity: sha512-vYt7UD1U9Wg6138shLtLOvdAu+8DsC/ilFtEVHcH+wydcSpNE20AfSOduf6MkRFahL5FY7X1oU7nKVZFtfq8Fg==} + engines: {node: '>=6'} + + qs@6.15.0: + resolution: {integrity: sha512-mAZTtNCeetKMH+pSjrb76NAM8V9a05I9aBZOHztWy/UqcJdQYNsf59vrRKWnojAT9Y+GbIvoTBC++CPHqpDBhQ==} + engines: {node: '>=0.6'} + + queue-microtask@1.2.3: + resolution: {integrity: sha512-NuaNSa6flKT5JaSYQzJok04JzTL1CA6aGhv5rfLW3PgqA+M2ChpZQnAC8h8i4ZFkBS8X5RqkDBHA7r4hej3K9A==} + + range-parser@1.2.1: + resolution: {integrity: sha512-Hrgsx+orqoygnmhFbKaHE6c296J+HTAQXoxEF6gNupROmmGJRoyzfG3ccAveqCBrwr/2yxQ5BVd/GTl5agOwSg==} + engines: {node: '>= 0.6'} + + raw-body@3.0.2: + resolution: {integrity: sha512-K5zQjDllxWkf7Z5xJdV0/B0WTNqx6vxG70zJE4N0kBs4LovmEYWJzQGxC9bS9RAKu3bgM40lrd5zoLJ12MQ5BA==} + engines: {node: '>= 0.10'} + + react-dom@19.2.3: + resolution: {integrity: sha512-yELu4WmLPw5Mr/lmeEpox5rw3RETacE++JgHqQzd2dg+YbJuat3jH4ingc+WPZhxaoFzdv9y33G+F7Nl5O0GBg==} + peerDependencies: + react: ^19.2.3 + + react-is@16.13.1: + resolution: {integrity: sha512-24e6ynE2H+OKt4kqsOvNd8kBpV65zoxbA4BVsEOB3ARVWQki/DHzaUoC5KuON/BiccDaCCTZBuOcfZs70kR8bQ==} + + react@19.2.3: + resolution: {integrity: sha512-Ku/hhYbVjOQnXDZFv2+RibmLFGwFdeeKHFcOTlrt7xplBnya5OGn/hIRDsqDiSUcfORsDC7MPxwork8jBwsIWA==} + engines: {node: '>=0.10.0'} + + recast@0.23.11: + resolution: {integrity: 
sha512-YTUo+Flmw4ZXiWfQKGcwwc11KnoRAYgzAE2E7mXKCjSviTKShtxBsN6YUUBB2gtaBzKzeKunxhUwNHQuRryhWA==} + engines: {node: '>= 4'} + + reflect.getprototypeof@1.0.10: + resolution: {integrity: sha512-00o4I+DVrefhv+nX0ulyi3biSHCPDe+yLv5o/p6d/UVlirijB8E16FtfwSAi4g3tcqrQ4lRAqQSoFEZJehYEcw==} + engines: {node: '>= 0.4'} + + regexp.prototype.flags@1.5.4: + resolution: {integrity: sha512-dYqgNSZbDwkaJ2ceRd9ojCGjBq+mOm9LmtXnAnEGyHhN/5R7iDW2TRw3h+o/jCFxus3P2LfWIIiwowAjANm7IA==} + engines: {node: '>= 0.4'} + + require-directory@2.1.1: + resolution: {integrity: sha512-fGxEI7+wsG9xrvdjsrlmL22OMTTiHRwAMroiEeMgq8gzoLC/PQr7RsRDSTLUg/bZAZtF+TVIkHc6/4RIKrui+Q==} + engines: {node: '>=0.10.0'} + + require-from-string@2.0.2: + resolution: {integrity: sha512-Xf0nWe6RseziFMu+Ap9biiUbmplq6S9/p+7w7YXP/JBHhrUDDUhwa+vANyubuqfZWTveU//DYVGsDG7RKL/vEw==} + engines: {node: '>=0.10.0'} + + reselect@5.1.1: + resolution: {integrity: sha512-K/BG6eIky/SBpzfHZv/dd+9JBFiS4SWV7FIujVyJRux6e45+73RaUHXLmIR1f7WOMaQ0U1km6qwklRQxpJJY0w==} + + resolve-from@4.0.0: + resolution: {integrity: sha512-pb/MYmXstAkysRFx8piNI1tGFNQIFA3vkE3Gq4EuA1dF6gHp/+vgZqsCGJapvy8N3Q+4o7FwvquPJcnZ7RYy4g==} + engines: {node: '>=4'} + + resolve-pkg-maps@1.0.0: + resolution: {integrity: sha512-seS2Tj26TBVOC2NIc2rOe2y2ZO7efxITtLZcGSOnHHNOQ7CkiUBfw0Iw2ck6xkIhPwLhKNLS8BO+hEpngQlqzw==} + + resolve@1.22.11: + resolution: {integrity: sha512-RfqAvLnMl313r7c9oclB1HhUEAezcpLjz95wFH4LVuhk9JF/r22qmVP9AMmOU4vMX7Q8pN8jwNg/CSpdFnMjTQ==} + engines: {node: '>= 0.4'} + hasBin: true + + resolve@2.0.0-next.6: + resolution: {integrity: sha512-3JmVl5hMGtJ3kMmB3zi3DL25KfkCEyy3Tw7Gmw7z5w8M9WlwoPFnIvwChzu1+cF3iaK3sp18hhPz8ANeimdJfA==} + engines: {node: '>= 0.4'} + hasBin: true + + restore-cursor@5.1.0: + resolution: {integrity: sha512-oMA2dcrw6u0YfxJQXm342bFKX/E4sG9rbTzO9ptUcR/e8A33cHuvStiYOwH7fszkZlZ1z/ta9AAoPk2F4qIOHA==} + engines: {node: '>=18'} + + rettime@0.10.1: + resolution: {integrity: sha512-uyDrIlUEH37cinabq0AX4QbgV4HbFZ/gqoiunWQ1UqBtRvTTytwhNYjE++pO/MjPTZL5KQCf2bEoJ/BJNVQ5Kw==} + + reusify@1.1.0: + resolution: {integrity: sha512-g6QUff04oZpHs0eG5p83rFLhHeV00ug/Yf9nZM6fLeUrPguBTkTQOdpAWWspMh55TZfVQDPaN3NQJfbVRAxdIw==} + engines: {iojs: '>=1.0.0', node: '>=0.10.0'} + + router@2.2.0: + resolution: {integrity: sha512-nLTrUKm2UyiL7rlhapu/Zl45FwNgkZGaCpZbIHajDYgwlJCOzLSk+cIPAnsEqV955GjILJnKbdQC1nVPz+gAYQ==} + engines: {node: '>= 18'} + + run-applescript@7.1.0: + resolution: {integrity: sha512-DPe5pVFaAsinSaV6QjQ6gdiedWDcRCbUuiQfQa2wmWV7+xC9bGulGI8+TdRmoFkAPaBXk8CrAbnlY2ISniJ47Q==} + engines: {node: '>=18'} + + run-parallel@1.2.0: + resolution: {integrity: sha512-5l4VyZR86LZ/lDxZTR6jqL8AFE2S0IFLMP26AbjsLVADxHdhB/c0GUsH+y39UfCi3dzz8OlQuPmnaJOMoDHQBA==} + + safe-array-concat@1.1.3: + resolution: {integrity: sha512-AURm5f0jYEOydBj7VQlVvDrjeFgthDdEF5H1dP+6mNpoXOMo1quQqJ4wvJDyRZ9+pO3kGWoOdmV08cSv2aJV6Q==} + engines: {node: '>=0.4'} + + safe-push-apply@1.0.0: + resolution: {integrity: sha512-iKE9w/Z7xCzUMIZqdBsp6pEQvwuEebH4vdpjcDWnyzaI6yl6O9FHvVpmGelvEHNsoY6wGblkxR6Zty/h00WiSA==} + engines: {node: '>= 0.4'} + + safe-regex-test@1.1.0: + resolution: {integrity: sha512-x/+Cz4YrimQxQccJf5mKEbIa1NzeCRNI5Ecl/ekmlYaampdNLPalVyIcCZNNH3MvmqBugV5TMYZXv0ljslUlaw==} + engines: {node: '>= 0.4'} + + safer-buffer@2.1.2: + resolution: {integrity: sha512-YZo3K82SD7Riyi0E1EQPojLz7kpepnSQI9IyPbHHg1XXXevb5dJI7tpyN2ADxGcQbHG7vcyRHk0cbwqcQriUtg==} + + scheduler@0.27.0: + resolution: {integrity: sha512-eNv+WrVbKu1f3vbYJT/xtiF5syA5HPIMtf9IgY/nKg0sWqzAUEvqY/xm7OcZc/qafLx/iO9FgOmeSAp4v5ti/Q==} + + semver@6.3.1: + 
resolution: {integrity: sha512-BR7VvDCVHO+q2xBEWskxS6DJE1qRnb7DxzUrogb71CWoSficBxYsiAGd+Kl0mmq/MprG9yArRkyrQxTO6XjMzA==} + hasBin: true + + semver@7.7.4: + resolution: {integrity: sha512-vFKC2IEtQnVhpT78h1Yp8wzwrf8CM+MzKMHGJZfBtzhZNycRFnXsHk6E5TxIkkMsgNS7mdX3AGB7x2QM2di4lA==} + engines: {node: '>=10'} + hasBin: true + + send@1.2.1: + resolution: {integrity: sha512-1gnZf7DFcoIcajTjTwjwuDjzuz4PPcY2StKPlsGAQ1+YH20IRVrBaXSWmdjowTJ6u8Rc01PoYOGHXfP1mYcZNQ==} + engines: {node: '>= 18'} + + serve-static@2.2.1: + resolution: {integrity: sha512-xRXBn0pPqQTVQiC8wyQrKs2MOlX24zQ0POGaj0kultvoOCstBQM5yvOhAVSUwOMjQtTvsPWoNCHfPGwaaQJhTw==} + engines: {node: '>= 18'} + + set-function-length@1.2.2: + resolution: {integrity: sha512-pgRc4hJ4/sNjWCSS9AmnS40x3bNMDTknHgL5UaMBTMyJnU90EgWh1Rz+MC9eFu4BuN/UwZjKQuY/1v3rM7HMfg==} + engines: {node: '>= 0.4'} + + set-function-name@2.0.2: + resolution: {integrity: sha512-7PGFlmtwsEADb0WYyvCMa1t+yke6daIG4Wirafur5kcf+MhUnPms1UeR0CKQdTZD81yESwMHbtn+TR+dMviakQ==} + engines: {node: '>= 0.4'} + + set-proto@1.0.0: + resolution: {integrity: sha512-RJRdvCo6IAnPdsvP/7m6bsQqNnn1FCBX5ZNtFL98MmFF/4xAIJTIg1YbHW5DC2W5SKZanrC6i4HsJqlajw/dZw==} + engines: {node: '>= 0.4'} + + setprototypeof@1.2.0: + resolution: {integrity: sha512-E5LDX7Wrp85Kil5bhZv46j8jOeboKq5JMmYM3gVGdGH8xFpPWXUMsNrlODCrkoxMEeNi/XZIwuRvY4XNwYMJpw==} + + shadcn@4.1.0: + resolution: {integrity: sha512-3zETJ+0Ezj69FS6RL0HOkLKKAR5yXisXx1iISJdfLQfrUqj/VIQlanQi1Ukk+9OE+XHZVj4FQNTBSfbr2CyCYg==} + hasBin: true + + sharp@0.34.5: + resolution: {integrity: sha512-Ou9I5Ft9WNcCbXrU9cMgPBcCK8LiwLqcbywW3t4oDV37n1pzpuNLsYiAV8eODnjbtQlSDwZ2cUEeQz4E54Hltg==} + engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0} + + shebang-command@2.0.0: + resolution: {integrity: sha512-kHxr2zZpYtdmrN1qDjrrX/Z1rR1kG8Dx+gkpK1G4eXmvXswmcE1hTWBWYUzlraYw1/yZp6YuDY77YtvbN0dmDA==} + engines: {node: '>=8'} + + shebang-regex@3.0.0: + resolution: {integrity: sha512-7++dFhtcx3353uBaq8DDR4NuxBetBzC7ZQOhmTQInHEd6bSrXdiEyzCvG07Z44UYdLShWUyXt5M/yhz8ekcb1A==} + engines: {node: '>=8'} + + side-channel-list@1.0.0: + resolution: {integrity: sha512-FCLHtRD/gnpCiCHEiJLOwdmFP+wzCmDEkc9y7NsYxeF4u7Btsn1ZuwgwJGxImImHicJArLP4R0yX4c2KCrMrTA==} + engines: {node: '>= 0.4'} + + side-channel-map@1.0.1: + resolution: {integrity: sha512-VCjCNfgMsby3tTdo02nbjtM/ewra6jPHmpThenkTYh8pG9ucZ/1P8So4u4FGBek/BjpOVsDCMoLA/iuBKIFXRA==} + engines: {node: '>= 0.4'} + + side-channel-weakmap@1.0.2: + resolution: {integrity: sha512-WPS/HvHQTYnHisLo9McqBHOJk2FkHO/tlpvldyrnem4aeQp4hai3gythswg6p01oSoTl58rcpiFAjF2br2Ak2A==} + engines: {node: '>= 0.4'} + + side-channel@1.1.0: + resolution: {integrity: sha512-ZX99e6tRweoUXqR+VBrslhda51Nh5MTQwou5tnUDgbtyM0dBgmhEDtWGP/xbKn6hqfPRHujUNwz5fy/wbbhnpw==} + engines: {node: '>= 0.4'} + + signal-exit@3.0.7: + resolution: {integrity: sha512-wnD2ZE+l+SPC/uoS0vXeE9L1+0wuaMqKlfz9AMUo38JsyLSBWSFcHR1Rri62LZc12vLr1gb3jl7iwQhgwpAbGQ==} + + signal-exit@4.1.0: + resolution: {integrity: sha512-bzyZ1e88w9O1iNJbKnOlvYTrWPDl46O1bG0D3XInv+9tkPrxrN8jUUTiFlDkkmKWgn1M6CfIA13SuGqOa9Korw==} + engines: {node: '>=14'} + + sisteransi@1.0.5: + resolution: {integrity: sha512-bLGGlR1QxBcynn2d5YmDX4MGjlZvy2MRBDRNHLJ8VI6l6+9FUiyTFNJ0IveOSP0bcXgVDPRcfGqA0pjaqUpfVg==} + + source-map-js@1.2.1: + resolution: {integrity: sha512-UXWMKhLOwVKb728IUtQPXxfYU+usdybtUrK/8uGE8CQMvrhOpwvzDBwj0QhSL7MQc7vIsISBG8VQ8+IDQxpfQA==} + engines: {node: '>=0.10.0'} + + source-map@0.6.1: + resolution: {integrity: sha512-UjgapumWlbMhkBgzT7Ykc5YXUT46F0iKu8SGXq0bcwP5dz/h0Plj6enJqjz1Zbq2l5WaqYnrVbwWOWMyF3F47g==} + engines: 
{node: '>=0.10.0'} + + stable-hash@0.0.5: + resolution: {integrity: sha512-+L3ccpzibovGXFK+Ap/f8LOS0ahMrHTf3xu7mMLSpEGU0EO9ucaysSylKo9eRDFNhWve/y275iPmIZ4z39a9iA==} + + statuses@2.0.2: + resolution: {integrity: sha512-DvEy55V3DB7uknRo+4iOGT5fP1slR8wQohVdknigZPMpMstaKJQWhwiYBACJE3Ul2pTnATihhBYnRhZQHGBiRw==} + engines: {node: '>= 0.8'} + + stdin-discarder@0.2.2: + resolution: {integrity: sha512-UhDfHmA92YAlNnCfhmq0VeNL5bDbiZGg7sZ2IvPsXubGkiNa9EC+tUTsjBRsYUAz87btI6/1wf4XoVvQ3uRnmQ==} + engines: {node: '>=18'} + + stop-iteration-iterator@1.1.0: + resolution: {integrity: sha512-eLoXW/DHyl62zxY4SCaIgnRhuMr6ri4juEYARS8E6sCEqzKpOiE521Ucofdx+KnDZl5xmvGYaaKCk5FEOxJCoQ==} + engines: {node: '>= 0.4'} + + strict-event-emitter@0.5.1: + resolution: {integrity: sha512-vMgjE/GGEPEFnhFub6pa4FmJBRBVOLpIII2hvCZ8Kzb7K0hlHo7mQv6xYrBvCL2LtAIBwFUK8wvuJgTVSQ5MFQ==} + + string-width@4.2.3: + resolution: {integrity: sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==} + engines: {node: '>=8'} + + string-width@7.2.0: + resolution: {integrity: sha512-tsaTIkKW9b4N+AEj+SVA+WhJzV7/zMhcSu78mLKWSk7cXMOSHsBKFWUs0fWwq8QyK3MgJBQRX6Gbi4kYbdvGkQ==} + engines: {node: '>=18'} + + string.prototype.includes@2.0.1: + resolution: {integrity: sha512-o7+c9bW6zpAdJHTtujeePODAhkuicdAryFsfVKwA+wGw89wJ4GTY484WTucM9hLtDEOpOvI+aHnzqnC5lHp4Rg==} + engines: {node: '>= 0.4'} + + string.prototype.matchall@4.0.12: + resolution: {integrity: sha512-6CC9uyBL+/48dYizRf7H7VAYCMCNTBeM78x/VTUe9bFEaxBepPJDa1Ow99LqI/1yF7kuy7Q3cQsYMrcjGUcskA==} + engines: {node: '>= 0.4'} + + string.prototype.repeat@1.0.0: + resolution: {integrity: sha512-0u/TldDbKD8bFCQ/4f5+mNRrXwZ8hg2w7ZR8wa16e8z9XpePWl3eGEcUD0OXpEH/VJH/2G3gjUtR3ZOiBe2S/w==} + + string.prototype.trim@1.2.10: + resolution: {integrity: sha512-Rs66F0P/1kedk5lyYyH9uBzuiI/kNRmwJAR9quK6VOtIpZ2G+hMZd+HQbbv25MgCA6gEffoMZYxlTod4WcdrKA==} + engines: {node: '>= 0.4'} + + string.prototype.trimend@1.0.9: + resolution: {integrity: sha512-G7Ok5C6E/j4SGfyLCloXTrngQIQU3PWtXGst3yM7Bea9FRURf1S42ZHlZZtsNque2FN2PoUhfZXYLNWwEr4dLQ==} + engines: {node: '>= 0.4'} + + string.prototype.trimstart@1.0.8: + resolution: {integrity: sha512-UXSH262CSZY1tfu3G3Secr6uGLCFVPMhIqHjlgCUtCCcgihYc/xKs9djMTMUOb2j1mVSeU8EU6NWc/iQKU6Gfg==} + engines: {node: '>= 0.4'} + + stringify-object@5.0.0: + resolution: {integrity: sha512-zaJYxz2FtcMb4f+g60KsRNFOpVMUyuJgA51Zi5Z1DOTC3S59+OQiVOzE9GZt0x72uBGWKsQIuBKeF9iusmKFsg==} + engines: {node: '>=14.16'} + + strip-ansi@6.0.1: + resolution: {integrity: sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A==} + engines: {node: '>=8'} + + strip-ansi@7.2.0: + resolution: {integrity: sha512-yDPMNjp4WyfYBkHnjIRLfca1i6KMyGCtsVgoKe/z1+6vukgaENdgGBZt+ZmKPc4gavvEZ5OgHfHdrazhgNyG7w==} + engines: {node: '>=12'} + + strip-bom@3.0.0: + resolution: {integrity: sha512-vavAMRXOgBVNF6nyEEmL3DBK19iRpDcoIwW+swQ+CbGiu7lju6t+JklA1MHweoWtadgt4ISVUsXLyDq34ddcwA==} + engines: {node: '>=4'} + + strip-final-newline@2.0.0: + resolution: {integrity: sha512-BrpvfNAE3dcvq7ll3xVumzjKjZQ5tI1sEUIKr3Uoks0XUl45St3FlatVqef9prk4jRDzhW6WZg+3bk93y6pLjA==} + engines: {node: '>=6'} + + strip-final-newline@4.0.0: + resolution: {integrity: sha512-aulFJcD6YK8V1G7iRB5tigAP4TsHBZZrOV8pjV++zdUwmeV8uzbY7yn6h9MswN62adStNZFuCIx4haBnRuMDaw==} + engines: {node: '>=18'} + + strip-json-comments@3.1.1: + resolution: {integrity: sha512-6fPc+R4ihwqP6N/aIv2f1gMH8lOVtWQHoqC4yK6oSDVVocumAsfCqjkXnqiYMhmMwS/mEHLp7Vehlt3ql6lEig==} + engines: {node: '>=8'} + + styled-jsx@5.1.6: + 
resolution: {integrity: sha512-qSVyDTeMotdvQYoHWLNGwRFJHC+i+ZvdBRYosOFgC+Wg1vx4frN2/RG/NA7SYqqvKNLf39P2LSRA2pu6n0XYZA==} + engines: {node: '>= 12.0.0'} + peerDependencies: + '@babel/core': '*' + babel-plugin-macros: '*' + react: '>= 16.8.0 || 17.x.x || ^18.0.0-0 || ^19.0.0-0' + peerDependenciesMeta: + '@babel/core': + optional: true + babel-plugin-macros: + optional: true + + supports-color@7.2.0: + resolution: {integrity: sha512-qpCAvRl9stuOHveKsn7HncJRvv501qIacKzQlO/+Lwxc9+0q2wLyv4Dfvt80/DPn2pqOBsJdDiogXGR9+OvwRw==} + engines: {node: '>=8'} + + supports-preserve-symlinks-flag@1.0.0: + resolution: {integrity: sha512-ot0WnXS9fgdkgIcePe6RHNk1WA8+muPa6cSjeR3V8K27q9BB1rTE3R1p7Hv0z1ZyAc8s6Vvv8DIyWf681MAt0w==} + engines: {node: '>= 0.4'} + + tabbable@6.4.0: + resolution: {integrity: sha512-05PUHKSNE8ou2dwIxTngl4EzcnsCDZGJ/iCLtDflR/SHB/ny14rXc+qU5P4mG9JkusiV7EivzY9Mhm55AzAvCg==} + + tagged-tag@1.0.0: + resolution: {integrity: sha512-yEFYrVhod+hdNyx7g5Bnkkb0G6si8HJurOoOEgC8B/O0uXLHlaey/65KRv6cuWBNhBgHKAROVpc7QyYqE5gFng==} + engines: {node: '>=20'} + + tailwind-merge@3.5.0: + resolution: {integrity: sha512-I8K9wewnVDkL1NTGoqWmVEIlUcB9gFriAEkXkfCjX5ib8ezGxtR3xD7iZIxrfArjEsH7F1CHD4RFUtxefdqV/A==} + + tailwindcss@4.2.2: + resolution: {integrity: sha512-KWBIxs1Xb6NoLdMVqhbhgwZf2PGBpPEiwOqgI4pFIYbNTfBXiKYyWoTsXgBQ9WFg/OlhnvHaY+AEpW7wSmFo2Q==} + + tapable@2.3.0: + resolution: {integrity: sha512-g9ljZiwki/LfxmQADO3dEY1CbpmXT5Hm2fJ+QaGKwSXUylMybePR7/67YW7jOrrvjEgL1Fmz5kzyAjWVWLlucg==} + engines: {node: '>=6'} + + tiny-invariant@1.3.3: + resolution: {integrity: sha512-+FbBPE1o9QAYvviau/qC5SE3caw21q3xkvWKBtja5vgqOWIHHJ3ioaq1VPfn/Szqctz2bU/oYeKd9/z5BL+PVg==} + + tinyglobby@0.2.15: + resolution: {integrity: sha512-j2Zq4NyQYG5XMST4cbs02Ak8iJUdxRM0XI5QyxXuZOzKOINmWurp3smXu3y5wDcJrptwpSjgXHzIQxR0omXljQ==} + engines: {node: '>=12.0.0'} + + tldts-core@7.0.27: + resolution: {integrity: sha512-YQ7uPjgWUibIK6DW5lrKujGwUKhLevU4hcGbP5O6TcIUb+oTjJYJVWPS4nZsIHrEEEG6myk/oqAJUEQmpZrHsg==} + + tldts@7.0.27: + resolution: {integrity: sha512-I4FZcVFcqCRuT0ph6dCDpPuO4Xgzvh+spkcTr1gK7peIvxWauoloVO0vuy1FQnijT63ss6AsHB6+OIM4aXHbPg==} + hasBin: true + + to-regex-range@5.0.1: + resolution: {integrity: sha512-65P7iz6X5yEr1cwcgvQxbbIw7Uk3gOy5dIdtZ4rDveLqhrdJP+Li/Hx6tyK0NEb+2GCyneCMJiGqrADCSNk8sQ==} + engines: {node: '>=8.0'} + + toidentifier@1.0.1: + resolution: {integrity: sha512-o5sSPKEkg/DIQNmH43V0/uerLrpzVedkUh8tGNvaeXpfpuwjKenlSox/2O/BTlZUtEe+JG7s5YhEz608PlAHRA==} + engines: {node: '>=0.6'} + + tough-cookie@6.0.1: + resolution: {integrity: sha512-LktZQb3IeoUWB9lqR5EWTHgW/VTITCXg4D21M+lvybRVdylLrRMnqaIONLVb5mav8vM19m44HIcGq4qASeu2Qw==} + engines: {node: '>=16'} + + ts-api-utils@2.5.0: + resolution: {integrity: sha512-OJ/ibxhPlqrMM0UiNHJ/0CKQkoKF243/AEmplt3qpRgkW8VG7IfOS41h7V8TjITqdByHzrjcS/2si+y4lIh8NA==} + engines: {node: '>=18.12'} + peerDependencies: + typescript: '>=4.8.4' + + ts-morph@26.0.0: + resolution: {integrity: sha512-ztMO++owQnz8c/gIENcM9XfCEzgoGphTv+nKpYNM1bgsdOVC/jRZuEBf6N+mLLDNg68Kl+GgUZfOySaRiG1/Ug==} + + tsconfig-paths@3.15.0: + resolution: {integrity: sha512-2Ac2RgzDe/cn48GvOe3M+o82pEFewD3UPbyoUHHdKasHwJKjds4fLXWf/Ux5kATBKN20oaFGu+jbElp1pos0mg==} + + tsconfig-paths@4.2.0: + resolution: {integrity: sha512-NoZ4roiN7LnbKn9QqE1amc9DJfzvZXxF4xDavcOWt1BPkdx+m+0gJuPM+S0vCe7zTJMYUP0R8pO2XMr+Y8oLIg==} + engines: {node: '>=6'} + + tslib@2.8.1: + resolution: {integrity: sha512-oJFu94HQb+KVduSUQL7wnpmqnfmLsOA/nAh6b6EH0wCEoK0/mPeXU6c3wKDV83MkOuHPRHtSXKKU99IBazS/2w==} + + tw-animate-css@1.4.0: + resolution: {integrity: 
sha512-7bziOlRqH0hJx80h/3mbicLW7o8qLsH5+RaLR2t+OHM3D0JlWGODQKQ4cxbK7WlvmUxpcj6Kgu6EKqjrGFe3QQ==} + + type-check@0.4.0: + resolution: {integrity: sha512-XleUoc9uwGXqjWwXaUTZAmzMcFZ5858QA2vvx1Ur5xIcixXIP+8LnFDgRplU30us6teqdlskFfu+ae4K79Ooew==} + engines: {node: '>= 0.8.0'} + + type-fest@5.5.0: + resolution: {integrity: sha512-PlBfpQwiUvGViBNX84Yxwjsdhd1TUlXr6zjX7eoirtCPIr08NAmxwa+fcYBTeRQxHo9YC9wwF3m9i700sHma8g==} + engines: {node: '>=20'} + + type-is@2.0.1: + resolution: {integrity: sha512-OZs6gsjF4vMp32qrCbiVSkrFmXtG/AZhY3t0iAMrMBiAZyV9oALtXO8hsrHbMXF9x6L3grlFuwW2oAz7cav+Gw==} + engines: {node: '>= 0.6'} + + typed-array-buffer@1.0.3: + resolution: {integrity: sha512-nAYYwfY3qnzX30IkA6AQZjVbtK6duGontcQm1WSG1MD94YLqK0515GNApXkoxKOWMusVssAHWLh9SeaoefYFGw==} + engines: {node: '>= 0.4'} + + typed-array-byte-length@1.0.3: + resolution: {integrity: sha512-BaXgOuIxz8n8pIq3e7Atg/7s+DpiYrxn4vdot3w9KbnBhcRQq6o3xemQdIfynqSeXeDrF32x+WvfzmOjPiY9lg==} + engines: {node: '>= 0.4'} + + typed-array-byte-offset@1.0.4: + resolution: {integrity: sha512-bTlAFB/FBYMcuX81gbL4OcpH5PmlFHqlCCpAl8AlEzMz5k53oNDvN8p1PNOWLEmI2x4orp3raOFB51tv9X+MFQ==} + engines: {node: '>= 0.4'} + + typed-array-length@1.0.7: + resolution: {integrity: sha512-3KS2b+kL7fsuk/eJZ7EQdnEmQoaho/r6KUef7hxvltNA5DR8NAUM+8wJMbJyZ4G9/7i3v5zPBIMN5aybAh2/Jg==} + engines: {node: '>= 0.4'} + + typescript-eslint@8.57.1: + resolution: {integrity: sha512-fLvZWf+cAGw3tqMCYzGIU6yR8K+Y9NT2z23RwOjlNFF2HwSB3KhdEFI5lSBv8tNmFkkBShSjsCjzx1vahZfISA==} + engines: {node: ^18.18.0 || ^20.9.0 || >=21.1.0} + peerDependencies: + eslint: ^8.57.0 || ^9.0.0 || ^10.0.0 + typescript: '>=4.8.4 <6.0.0' + + typescript@5.9.3: + resolution: {integrity: sha512-jl1vZzPDinLr9eUt3J/t7V6FgNEw9QjvBPdysz9KfQDD41fQrC2Y4vKQdiaUpFT4bXlb1RHhLpp8wtm6M5TgSw==} + engines: {node: '>=14.17'} + hasBin: true + + unbox-primitive@1.1.0: + resolution: {integrity: sha512-nWJ91DjeOkej/TA8pXQ3myruKpKEYgqvpw9lz4OPHj/NWFNluYrjbz9j01CJ8yKQd2g4jFoOkINCTW2I5LEEyw==} + engines: {node: '>= 0.4'} + + undici-types@6.21.0: + resolution: {integrity: sha512-iwDZqg0QAGrg9Rav5H4n0M64c3mkR59cJ6wQp+7C4nI0gsmExaedaYLNO44eT4AtBBwjbTiGPMlt2Md0T9H9JQ==} + + unicorn-magic@0.3.0: + resolution: {integrity: sha512-+QBBXBCvifc56fsbuxZQ6Sic3wqqc3WWaqxs58gvJrcOuN83HGTCwz3oS5phzU9LthRNE9VrJCFCLUgHeeFnfA==} + engines: {node: '>=18'} + + universalify@2.0.1: + resolution: {integrity: sha512-gptHNQghINnc/vTGIk0SOFGFNXw7JVrlRUtConJRlvaw6DuX0wO5Jeko9sWrMBhh+PsYAZ7oXAiOnf/UKogyiw==} + engines: {node: '>= 10.0.0'} + + unpipe@1.0.0: + resolution: {integrity: sha512-pjy2bYhSsufwWlKwPc+l3cN7+wuJlK6uz0YdJEOlQDbl6jo/YlPi4mb8agUkVC8BF7V8NuzeyPNqRksA3hztKQ==} + engines: {node: '>= 0.8'} + + unrs-resolver@1.11.1: + resolution: {integrity: sha512-bSjt9pjaEBnNiGgc9rUiHGKv5l4/TGzDmYw3RhnkJGtLhbnnA/5qJj7x3dNDCRx/PJxu774LlH8lCOlB4hEfKg==} + + until-async@3.0.2: + resolution: {integrity: sha512-IiSk4HlzAMqTUseHHe3VhIGyuFmN90zMTpD3Z3y8jeQbzLIq500MVM7Jq2vUAnTKAFPJrqwkzr6PoTcPhGcOiw==} + + update-browserslist-db@1.2.3: + resolution: {integrity: sha512-Js0m9cx+qOgDxo0eMiFGEueWztz+d4+M3rGlmKPT+T4IS/jP4ylw3Nwpu6cpTTP8R1MAC1kF4VbdLt3ARf209w==} + hasBin: true + peerDependencies: + browserslist: '>= 4.21.0' + + uri-js@4.4.1: + resolution: {integrity: sha512-7rKUyy33Q1yc98pQ1DAmLtwX109F7TIfWlW1Ydo8Wl1ii1SeHieeh0HHfPeL2fMXK6z0s8ecKs9frCuLJvndBg==} + + use-sync-external-store@1.6.0: + resolution: {integrity: sha512-Pp6GSwGP/NrPIrxVFAIkOQeyw8lFenOHijQWkUTrDvrF4ALqylP2C/KCkeS9dpUM3KvYRQhna5vt7IL95+ZQ9w==} + peerDependencies: + react: ^16.8.0 || ^17.0.0 || ^18.0.0 || 
^19.0.0 + + util-deprecate@1.0.2: + resolution: {integrity: sha512-EPD5q1uXyFxJpCrLnCc1nHnq3gOa6DZBocAIiI2TaSCA7VCJ1UJDMagCzIkXNsUYfD1daK//LTEQ8xiIbrHtcw==} + + validate-npm-package-name@7.0.2: + resolution: {integrity: sha512-hVDIBwsRruT73PbK7uP5ebUt+ezEtCmzZz3F59BSr2F6OVFnJ/6h8liuvdLrQ88Xmnk6/+xGGuq+pG9WwTuy3A==} + engines: {node: ^20.17.0 || >=22.9.0} + + vary@1.1.2: + resolution: {integrity: sha512-BNGbWLfd0eUPabhkXUVm0j8uuvREyTh5ovRa/dyow/BqAbZJyC+5fU+IzQOzmAKzYqYRAISoRhdQr3eIZ/PXqg==} + engines: {node: '>= 0.8'} + + web-streams-polyfill@3.3.3: + resolution: {integrity: sha512-d2JWLCivmZYTSIoge9MsgFCZrt571BikcWGYkjC1khllbTeDlGqZ2D8vD8E/lJa8WGWbb7Plm8/XJYV7IJHZZw==} + engines: {node: '>= 8'} + + which-boxed-primitive@1.1.1: + resolution: {integrity: sha512-TbX3mj8n0odCBFVlY8AxkqcHASw3L60jIuF8jFP78az3C2YhmGvqbHBpAjTRH2/xqYunrJ9g1jSyjCjpoWzIAA==} + engines: {node: '>= 0.4'} + + which-builtin-type@1.2.1: + resolution: {integrity: sha512-6iBczoX+kDQ7a3+YJBnh3T+KZRxM/iYNPXicqk66/Qfm1b93iu+yOImkg0zHbj5LNOcNv1TEADiZ0xa34B4q6Q==} + engines: {node: '>= 0.4'} + + which-collection@1.0.2: + resolution: {integrity: sha512-K4jVyjnBdgvc86Y6BkaLZEN933SwYOuBFkdmBu9ZfkcAbdVbpITnDmjvZ/aQjRXQrv5EPkTnD1s39GiiqbngCw==} + engines: {node: '>= 0.4'} + + which-typed-array@1.1.20: + resolution: {integrity: sha512-LYfpUkmqwl0h9A2HL09Mms427Q1RZWuOHsukfVcKRq9q95iQxdw0ix1JQrqbcDR9PH1QDwf5Qo8OZb5lksZ8Xg==} + engines: {node: '>= 0.4'} + + which@2.0.2: + resolution: {integrity: sha512-BLI3Tl1TW3Pvl70l3yq3Y64i+awpwXqsGBYWkkqMtnbXgrMD+yj7rhW0kuEDxzJaYXGjEW5ogapKNMEKNMjibA==} + engines: {node: '>= 8'} + hasBin: true + + which@4.0.0: + resolution: {integrity: sha512-GlaYyEb07DPxYCKhKzplCWBJtvxZcZMrL+4UkrTSJHHPyZU4mYYTv3qaOe77H7EODLSSopAUFAc6W8U4yqvscg==} + engines: {node: ^16.13.0 || >=18.0.0} + hasBin: true + + word-wrap@1.2.5: + resolution: {integrity: sha512-BN22B5eaMMI9UMtjrGd5g5eCYPpCPDUy0FJXbYsaT5zYxjFOckS53SQDE3pWkVoWpHXVb3BrYcEN4Twa55B5cA==} + engines: {node: '>=0.10.0'} + + wrap-ansi@6.2.0: + resolution: {integrity: sha512-r6lPcBGxZXlIcymEu7InxDMhdW0KDxpLgoFLcguasxCaJ/SOIZwINatK9KY/tf+ZrlywOKU0UDj3ATXUBfxJXA==} + engines: {node: '>=8'} + + wrap-ansi@7.0.0: + resolution: {integrity: sha512-YVGIj2kamLSTxw6NsZjoBxfSwsn0ycdesmc4p+Q21c5zPuZ1pl+NfxVdxPtdHvmNVOQ6XSYG4AUtyt/Fi7D16Q==} + engines: {node: '>=10'} + + wrappy@1.0.2: + resolution: {integrity: sha512-l4Sp/DRseor9wL6EvV2+TuQn63dMkPjZ/sp9XkghTEbV9KlPS1xUsZ3u7/IQO4wxtcFB4bgpQPRcR3QCvezPcQ==} + + wsl-utils@0.3.1: + resolution: {integrity: sha512-g/eziiSUNBSsdDJtCLB8bdYEUMj4jR7AGeUo96p/3dTafgjHhpF4RiCFPiRILwjQoDXx5MqkBr4fwWtR3Ky4Wg==} + engines: {node: '>=20'} + + y18n@5.0.8: + resolution: {integrity: sha512-0pfFzegeDWJHJIAmTLRP2DwHjdF5s7jo9tuztdQxAhINCdvS+3nGINqPd00AphqJR/0LhANUS6/+7SCb98YOfA==} + engines: {node: '>=10'} + + yallist@3.1.1: + resolution: {integrity: sha512-a4UGQaWPH59mOXUYnAG2ewncQS4i4F43Tv3JoAM+s2VDAmS9NsK8GpDMLrCHPksFT7h3K6TOoUNn2pb7RoXx4g==} + + yargs-parser@21.1.1: + resolution: {integrity: sha512-tVpsJW7DdjecAiFpbIB1e3qxIQsE6NoPc5/eTdrbbIC4h0LVsWhnoa3g+m2HclBIujHzsxZ4VJVA+GUuc2/LBw==} + engines: {node: '>=12'} + + yargs@17.7.2: + resolution: {integrity: sha512-7dSzzRQ++CKnNI/krKnYRV7JKKPUXMEh61soaHKg9mrWEhzFWhFnxPxGl+69cD1Ou63C13NUPCnmIcrvqCuM6w==} + engines: {node: '>=12'} + + yocto-queue@0.1.0: + resolution: {integrity: sha512-rVksvsnNCdJ/ohGc6xgPwyN8eheCxsiLM8mxuE/t/mOVqJewPuO1miLpTHQiRgTKCLexL4MeAFVagts7HmNZ2Q==} + engines: {node: '>=10'} + + yoctocolors-cjs@2.1.3: + resolution: {integrity: 
sha512-U/PBtDf35ff0D8X8D0jfdzHYEPFxAI7jJlxZXwCSez5M3190m+QobIfh+sWDWSHMCWWJN2AWamkegn6vr6YBTw==} + engines: {node: '>=18'} + + yoctocolors@2.1.2: + resolution: {integrity: sha512-CzhO+pFNo8ajLM2d2IW/R93ipy99LWjtwblvC1RsoSUMZgyLbYFr221TnSNT7GjGdYui6P459mw9JH/g/zW2ug==} + engines: {node: '>=18'} + + zod-to-json-schema@3.25.1: + resolution: {integrity: sha512-pM/SU9d3YAggzi6MtR4h7ruuQlqKtad8e9S0fmxcMi+ueAK5Korys/aWcV9LIIHTVbj01NdzxcnXSN+O74ZIVA==} + peerDependencies: + zod: ^3.25 || ^4 + + zod-validation-error@4.0.2: + resolution: {integrity: sha512-Q6/nZLe6jxuU80qb/4uJ4t5v2VEZ44lzQjPDhYJNztRQ4wyWc6VF3D3Kb/fAuPetZQnhS3hnajCf9CsWesghLQ==} + engines: {node: '>=18.0.0'} + peerDependencies: + zod: ^3.25.0 || ^4.0.0 + + zod@3.25.76: + resolution: {integrity: sha512-gzUt/qt81nXsFGKIFcC3YnfEAx5NkunCfnDlvuBSSFS02bcXu4Lmea0AFIUwbLWxWPx3d9p8S5QoaujKcNQxcQ==} + + zod@4.3.6: + resolution: {integrity: sha512-rftlrkhHZOcjDwkGlnUtZZkvaPHCsDATp4pGpuOOMDaTdDDXF91wuVDJoWoPsKX/3YPQ5fHuF3STjcYyKr+Qhg==} + +snapshots: + + '@alloc/quick-lru@5.2.0': {} + + '@babel/code-frame@7.29.0': + dependencies: + '@babel/helper-validator-identifier': 7.28.5 + js-tokens: 4.0.0 + picocolors: 1.1.1 + + '@babel/compat-data@7.29.0': {} + + '@babel/core@7.29.0': + dependencies: + '@babel/code-frame': 7.29.0 + '@babel/generator': 7.29.1 + '@babel/helper-compilation-targets': 7.28.6 + '@babel/helper-module-transforms': 7.28.6(@babel/core@7.29.0) + '@babel/helpers': 7.29.2 + '@babel/parser': 7.29.2 + '@babel/template': 7.28.6 + '@babel/traverse': 7.29.0 + '@babel/types': 7.29.0 + '@jridgewell/remapping': 2.3.5 + convert-source-map: 2.0.0 + debug: 4.4.3 + gensync: 1.0.0-beta.2 + json5: 2.2.3 + semver: 6.3.1 + transitivePeerDependencies: + - supports-color + + '@babel/generator@7.29.1': + dependencies: + '@babel/parser': 7.29.2 + '@babel/types': 7.29.0 + '@jridgewell/gen-mapping': 0.3.13 + '@jridgewell/trace-mapping': 0.3.31 + jsesc: 3.1.0 + + '@babel/helper-annotate-as-pure@7.27.3': + dependencies: + '@babel/types': 7.29.0 + + '@babel/helper-compilation-targets@7.28.6': + dependencies: + '@babel/compat-data': 7.29.0 + '@babel/helper-validator-option': 7.27.1 + browserslist: 4.28.1 + lru-cache: 5.1.1 + semver: 6.3.1 + + '@babel/helper-create-class-features-plugin@7.28.6(@babel/core@7.29.0)': + dependencies: + '@babel/core': 7.29.0 + '@babel/helper-annotate-as-pure': 7.27.3 + '@babel/helper-member-expression-to-functions': 7.28.5 + '@babel/helper-optimise-call-expression': 7.27.1 + '@babel/helper-replace-supers': 7.28.6(@babel/core@7.29.0) + '@babel/helper-skip-transparent-expression-wrappers': 7.27.1 + '@babel/traverse': 7.29.0 + semver: 6.3.1 + transitivePeerDependencies: + - supports-color + + '@babel/helper-globals@7.28.0': {} + + '@babel/helper-member-expression-to-functions@7.28.5': + dependencies: + '@babel/traverse': 7.29.0 + '@babel/types': 7.29.0 + transitivePeerDependencies: + - supports-color + + '@babel/helper-module-imports@7.28.6': + dependencies: + '@babel/traverse': 7.29.0 + '@babel/types': 7.29.0 + transitivePeerDependencies: + - supports-color + + '@babel/helper-module-transforms@7.28.6(@babel/core@7.29.0)': + dependencies: + '@babel/core': 7.29.0 + '@babel/helper-module-imports': 7.28.6 + '@babel/helper-validator-identifier': 7.28.5 + '@babel/traverse': 7.29.0 + transitivePeerDependencies: + - supports-color + + '@babel/helper-optimise-call-expression@7.27.1': + dependencies: + '@babel/types': 7.29.0 + + '@babel/helper-plugin-utils@7.28.6': {} + + '@babel/helper-replace-supers@7.28.6(@babel/core@7.29.0)': + 
dependencies: + '@babel/core': 7.29.0 + '@babel/helper-member-expression-to-functions': 7.28.5 + '@babel/helper-optimise-call-expression': 7.27.1 + '@babel/traverse': 7.29.0 + transitivePeerDependencies: + - supports-color + + '@babel/helper-skip-transparent-expression-wrappers@7.27.1': + dependencies: + '@babel/traverse': 7.29.0 + '@babel/types': 7.29.0 + transitivePeerDependencies: + - supports-color + + '@babel/helper-string-parser@7.27.1': {} + + '@babel/helper-validator-identifier@7.28.5': {} + + '@babel/helper-validator-option@7.27.1': {} + + '@babel/helpers@7.29.2': + dependencies: + '@babel/template': 7.28.6 + '@babel/types': 7.29.0 + + '@babel/parser@7.29.2': + dependencies: + '@babel/types': 7.29.0 + + '@babel/plugin-syntax-jsx@7.28.6(@babel/core@7.29.0)': + dependencies: + '@babel/core': 7.29.0 + '@babel/helper-plugin-utils': 7.28.6 + + '@babel/plugin-syntax-typescript@7.28.6(@babel/core@7.29.0)': + dependencies: + '@babel/core': 7.29.0 + '@babel/helper-plugin-utils': 7.28.6 + + '@babel/plugin-transform-modules-commonjs@7.28.6(@babel/core@7.29.0)': + dependencies: + '@babel/core': 7.29.0 + '@babel/helper-module-transforms': 7.28.6(@babel/core@7.29.0) + '@babel/helper-plugin-utils': 7.28.6 + transitivePeerDependencies: + - supports-color + + '@babel/plugin-transform-typescript@7.28.6(@babel/core@7.29.0)': + dependencies: + '@babel/core': 7.29.0 + '@babel/helper-annotate-as-pure': 7.27.3 + '@babel/helper-create-class-features-plugin': 7.28.6(@babel/core@7.29.0) + '@babel/helper-plugin-utils': 7.28.6 + '@babel/helper-skip-transparent-expression-wrappers': 7.27.1 + '@babel/plugin-syntax-typescript': 7.28.6(@babel/core@7.29.0) + transitivePeerDependencies: + - supports-color + + '@babel/preset-typescript@7.28.5(@babel/core@7.29.0)': + dependencies: + '@babel/core': 7.29.0 + '@babel/helper-plugin-utils': 7.28.6 + '@babel/helper-validator-option': 7.27.1 + '@babel/plugin-syntax-jsx': 7.28.6(@babel/core@7.29.0) + '@babel/plugin-transform-modules-commonjs': 7.28.6(@babel/core@7.29.0) + '@babel/plugin-transform-typescript': 7.28.6(@babel/core@7.29.0) + transitivePeerDependencies: + - supports-color + + '@babel/runtime@7.29.2': {} + + '@babel/template@7.28.6': + dependencies: + '@babel/code-frame': 7.29.0 + '@babel/parser': 7.29.2 + '@babel/types': 7.29.0 + + '@babel/traverse@7.29.0': + dependencies: + '@babel/code-frame': 7.29.0 + '@babel/generator': 7.29.1 + '@babel/helper-globals': 7.28.0 + '@babel/parser': 7.29.2 + '@babel/template': 7.28.6 + '@babel/types': 7.29.0 + debug: 4.4.3 + transitivePeerDependencies: + - supports-color + + '@babel/types@7.29.0': + dependencies: + '@babel/helper-string-parser': 7.27.1 + '@babel/helper-validator-identifier': 7.28.5 + + '@base-ui/react@1.3.0(@types/react@19.2.14)(react-dom@19.2.3(react@19.2.3))(react@19.2.3)': + dependencies: + '@babel/runtime': 7.29.2 + '@base-ui/utils': 0.2.6(@types/react@19.2.14)(react-dom@19.2.3(react@19.2.3))(react@19.2.3) + '@floating-ui/react-dom': 2.1.8(react-dom@19.2.3(react@19.2.3))(react@19.2.3) + '@floating-ui/utils': 0.2.11 + react: 19.2.3 + react-dom: 19.2.3(react@19.2.3) + tabbable: 6.4.0 + use-sync-external-store: 1.6.0(react@19.2.3) + optionalDependencies: + '@types/react': 19.2.14 + + '@base-ui/utils@0.2.6(@types/react@19.2.14)(react-dom@19.2.3(react@19.2.3))(react@19.2.3)': + dependencies: + '@babel/runtime': 7.29.2 + '@floating-ui/utils': 0.2.11 + react: 19.2.3 + react-dom: 19.2.3(react@19.2.3) + reselect: 5.1.1 + use-sync-external-store: 1.6.0(react@19.2.3) + optionalDependencies: + '@types/react': 19.2.14 + 
+ '@dotenvx/dotenvx@1.57.1': + dependencies: + commander: 11.1.0 + dotenv: 17.3.1 + eciesjs: 0.4.18 + execa: 5.1.1 + fdir: 6.5.0(picomatch@4.0.3) + ignore: 5.3.2 + object-treeify: 1.1.33 + picomatch: 4.0.3 + which: 4.0.0 + + '@ecies/ciphers@0.2.5(@noble/ciphers@1.3.0)': + dependencies: + '@noble/ciphers': 1.3.0 + + '@emnapi/core@1.9.1': + dependencies: + '@emnapi/wasi-threads': 1.2.0 + tslib: 2.8.1 + optional: true + + '@emnapi/runtime@1.9.1': + dependencies: + tslib: 2.8.1 + optional: true + + '@emnapi/wasi-threads@1.2.0': + dependencies: + tslib: 2.8.1 + optional: true + + '@eslint-community/eslint-utils@4.9.1(eslint@9.39.4(jiti@2.6.1))': + dependencies: + eslint: 9.39.4(jiti@2.6.1) + eslint-visitor-keys: 3.4.3 + + '@eslint-community/regexpp@4.12.2': {} + + '@eslint/config-array@0.21.2': + dependencies: + '@eslint/object-schema': 2.1.7 + debug: 4.4.3 + minimatch: 3.1.5 + transitivePeerDependencies: + - supports-color + + '@eslint/config-helpers@0.4.2': + dependencies: + '@eslint/core': 0.17.0 + + '@eslint/core@0.17.0': + dependencies: + '@types/json-schema': 7.0.15 + + '@eslint/eslintrc@3.3.5': + dependencies: + ajv: 6.14.0 + debug: 4.4.3 + espree: 10.4.0 + globals: 14.0.0 + ignore: 5.3.2 + import-fresh: 3.3.1 + js-yaml: 4.1.1 + minimatch: 3.1.5 + strip-json-comments: 3.1.1 + transitivePeerDependencies: + - supports-color + + '@eslint/js@9.39.4': {} + + '@eslint/object-schema@2.1.7': {} + + '@eslint/plugin-kit@0.4.1': + dependencies: + '@eslint/core': 0.17.0 + levn: 0.4.1 + + '@floating-ui/core@1.7.5': + dependencies: + '@floating-ui/utils': 0.2.11 + + '@floating-ui/dom@1.7.6': + dependencies: + '@floating-ui/core': 1.7.5 + '@floating-ui/utils': 0.2.11 + + '@floating-ui/react-dom@2.1.8(react-dom@19.2.3(react@19.2.3))(react@19.2.3)': + dependencies: + '@floating-ui/dom': 1.7.6 + react: 19.2.3 + react-dom: 19.2.3(react@19.2.3) + + '@floating-ui/utils@0.2.11': {} + + '@hono/node-server@1.19.11(hono@4.12.8)': + dependencies: + hono: 4.12.8 + + '@humanfs/core@0.19.1': {} + + '@humanfs/node@0.16.7': + dependencies: + '@humanfs/core': 0.19.1 + '@humanwhocodes/retry': 0.4.3 + + '@humanwhocodes/module-importer@1.0.1': {} + + '@humanwhocodes/retry@0.4.3': {} + + '@img/colour@1.1.0': + optional: true + + '@img/sharp-darwin-arm64@0.34.5': + optionalDependencies: + '@img/sharp-libvips-darwin-arm64': 1.2.4 + optional: true + + '@img/sharp-darwin-x64@0.34.5': + optionalDependencies: + '@img/sharp-libvips-darwin-x64': 1.2.4 + optional: true + + '@img/sharp-libvips-darwin-arm64@1.2.4': + optional: true + + '@img/sharp-libvips-darwin-x64@1.2.4': + optional: true + + '@img/sharp-libvips-linux-arm64@1.2.4': + optional: true + + '@img/sharp-libvips-linux-arm@1.2.4': + optional: true + + '@img/sharp-libvips-linux-ppc64@1.2.4': + optional: true + + '@img/sharp-libvips-linux-riscv64@1.2.4': + optional: true + + '@img/sharp-libvips-linux-s390x@1.2.4': + optional: true + + '@img/sharp-libvips-linux-x64@1.2.4': + optional: true + + '@img/sharp-libvips-linuxmusl-arm64@1.2.4': + optional: true + + '@img/sharp-libvips-linuxmusl-x64@1.2.4': + optional: true + + '@img/sharp-linux-arm64@0.34.5': + optionalDependencies: + '@img/sharp-libvips-linux-arm64': 1.2.4 + optional: true + + '@img/sharp-linux-arm@0.34.5': + optionalDependencies: + '@img/sharp-libvips-linux-arm': 1.2.4 + optional: true + + '@img/sharp-linux-ppc64@0.34.5': + optionalDependencies: + '@img/sharp-libvips-linux-ppc64': 1.2.4 + optional: true + + '@img/sharp-linux-riscv64@0.34.5': + optionalDependencies: + '@img/sharp-libvips-linux-riscv64': 1.2.4 + 
optional: true + + '@img/sharp-linux-s390x@0.34.5': + optionalDependencies: + '@img/sharp-libvips-linux-s390x': 1.2.4 + optional: true + + '@img/sharp-linux-x64@0.34.5': + optionalDependencies: + '@img/sharp-libvips-linux-x64': 1.2.4 + optional: true + + '@img/sharp-linuxmusl-arm64@0.34.5': + optionalDependencies: + '@img/sharp-libvips-linuxmusl-arm64': 1.2.4 + optional: true + + '@img/sharp-linuxmusl-x64@0.34.5': + optionalDependencies: + '@img/sharp-libvips-linuxmusl-x64': 1.2.4 + optional: true + + '@img/sharp-wasm32@0.34.5': + dependencies: + '@emnapi/runtime': 1.9.1 + optional: true + + '@img/sharp-win32-arm64@0.34.5': + optional: true + + '@img/sharp-win32-ia32@0.34.5': + optional: true + + '@img/sharp-win32-x64@0.34.5': + optional: true + + '@inquirer/ansi@1.0.2': {} + + '@inquirer/confirm@5.1.21(@types/node@20.19.37)': + dependencies: + '@inquirer/core': 10.3.2(@types/node@20.19.37) + '@inquirer/type': 3.0.10(@types/node@20.19.37) + optionalDependencies: + '@types/node': 20.19.37 + + '@inquirer/core@10.3.2(@types/node@20.19.37)': + dependencies: + '@inquirer/ansi': 1.0.2 + '@inquirer/figures': 1.0.15 + '@inquirer/type': 3.0.10(@types/node@20.19.37) + cli-width: 4.1.0 + mute-stream: 2.0.0 + signal-exit: 4.1.0 + wrap-ansi: 6.2.0 + yoctocolors-cjs: 2.1.3 + optionalDependencies: + '@types/node': 20.19.37 + + '@inquirer/figures@1.0.15': {} + + '@inquirer/type@3.0.10(@types/node@20.19.37)': + optionalDependencies: + '@types/node': 20.19.37 + + '@jridgewell/gen-mapping@0.3.13': + dependencies: + '@jridgewell/sourcemap-codec': 1.5.5 + '@jridgewell/trace-mapping': 0.3.31 + + '@jridgewell/remapping@2.3.5': + dependencies: + '@jridgewell/gen-mapping': 0.3.13 + '@jridgewell/trace-mapping': 0.3.31 + + '@jridgewell/resolve-uri@3.1.2': {} + + '@jridgewell/sourcemap-codec@1.5.5': {} + + '@jridgewell/trace-mapping@0.3.31': + dependencies: + '@jridgewell/resolve-uri': 3.1.2 + '@jridgewell/sourcemap-codec': 1.5.5 + + '@modelcontextprotocol/sdk@1.27.1(zod@3.25.76)': + dependencies: + '@hono/node-server': 1.19.11(hono@4.12.8) + ajv: 8.18.0 + ajv-formats: 3.0.1(ajv@8.18.0) + content-type: 1.0.5 + cors: 2.8.6 + cross-spawn: 7.0.6 + eventsource: 3.0.7 + eventsource-parser: 3.0.6 + express: 5.2.1 + express-rate-limit: 8.3.1(express@5.2.1) + hono: 4.12.8 + jose: 6.2.2 + json-schema-typed: 8.0.2 + pkce-challenge: 5.0.1 + raw-body: 3.0.2 + zod: 3.25.76 + zod-to-json-schema: 3.25.1(zod@3.25.76) + transitivePeerDependencies: + - supports-color + + '@mswjs/interceptors@0.41.3': + dependencies: + '@open-draft/deferred-promise': 2.2.0 + '@open-draft/logger': 0.3.0 + '@open-draft/until': 2.1.0 + is-node-process: 1.2.0 + outvariant: 1.4.3 + strict-event-emitter: 0.5.1 + + '@napi-rs/wasm-runtime@0.2.12': + dependencies: + '@emnapi/core': 1.9.1 + '@emnapi/runtime': 1.9.1 + '@tybys/wasm-util': 0.10.1 + optional: true + + '@next/env@16.1.7': {} + + '@next/eslint-plugin-next@16.1.7': + dependencies: + fast-glob: 3.3.1 + + '@next/swc-darwin-arm64@16.1.7': + optional: true + + '@next/swc-darwin-x64@16.1.7': + optional: true + + '@next/swc-linux-arm64-gnu@16.1.7': + optional: true + + '@next/swc-linux-arm64-musl@16.1.7': + optional: true + + '@next/swc-linux-x64-gnu@16.1.7': + optional: true + + '@next/swc-linux-x64-musl@16.1.7': + optional: true + + '@next/swc-win32-arm64-msvc@16.1.7': + optional: true + + '@next/swc-win32-x64-msvc@16.1.7': + optional: true + + '@noble/ciphers@1.3.0': {} + + '@noble/curves@1.9.7': + dependencies: + '@noble/hashes': 1.8.0 + + '@noble/hashes@1.8.0': {} + + '@nodelib/fs.scandir@2.1.5': + 
dependencies: + '@nodelib/fs.stat': 2.0.5 + run-parallel: 1.2.0 + + '@nodelib/fs.stat@2.0.5': {} + + '@nodelib/fs.walk@1.2.8': + dependencies: + '@nodelib/fs.scandir': 2.1.5 + fastq: 1.20.1 + + '@nolyfill/is-core-module@1.0.39': {} + + '@open-draft/deferred-promise@2.2.0': {} + + '@open-draft/logger@0.3.0': + dependencies: + is-node-process: 1.2.0 + outvariant: 1.4.3 + + '@open-draft/until@2.1.0': {} + + '@rtsao/scc@1.1.0': {} + + '@sec-ant/readable-stream@0.4.1': {} + + '@sindresorhus/merge-streams@4.0.0': {} + + '@swc/helpers@0.5.15': + dependencies: + tslib: 2.8.1 + + '@tailwindcss/node@4.2.2': + dependencies: + '@jridgewell/remapping': 2.3.5 + enhanced-resolve: 5.20.1 + jiti: 2.6.1 + lightningcss: 1.32.0 + magic-string: 0.30.21 + source-map-js: 1.2.1 + tailwindcss: 4.2.2 + + '@tailwindcss/oxide-android-arm64@4.2.2': + optional: true + + '@tailwindcss/oxide-darwin-arm64@4.2.2': + optional: true + + '@tailwindcss/oxide-darwin-x64@4.2.2': + optional: true + + '@tailwindcss/oxide-freebsd-x64@4.2.2': + optional: true + + '@tailwindcss/oxide-linux-arm-gnueabihf@4.2.2': + optional: true + + '@tailwindcss/oxide-linux-arm64-gnu@4.2.2': + optional: true + + '@tailwindcss/oxide-linux-arm64-musl@4.2.2': + optional: true + + '@tailwindcss/oxide-linux-x64-gnu@4.2.2': + optional: true + + '@tailwindcss/oxide-linux-x64-musl@4.2.2': + optional: true + + '@tailwindcss/oxide-wasm32-wasi@4.2.2': + optional: true + + '@tailwindcss/oxide-win32-arm64-msvc@4.2.2': + optional: true + + '@tailwindcss/oxide-win32-x64-msvc@4.2.2': + optional: true + + '@tailwindcss/oxide@4.2.2': + optionalDependencies: + '@tailwindcss/oxide-android-arm64': 4.2.2 + '@tailwindcss/oxide-darwin-arm64': 4.2.2 + '@tailwindcss/oxide-darwin-x64': 4.2.2 + '@tailwindcss/oxide-freebsd-x64': 4.2.2 + '@tailwindcss/oxide-linux-arm-gnueabihf': 4.2.2 + '@tailwindcss/oxide-linux-arm64-gnu': 4.2.2 + '@tailwindcss/oxide-linux-arm64-musl': 4.2.2 + '@tailwindcss/oxide-linux-x64-gnu': 4.2.2 + '@tailwindcss/oxide-linux-x64-musl': 4.2.2 + '@tailwindcss/oxide-wasm32-wasi': 4.2.2 + '@tailwindcss/oxide-win32-arm64-msvc': 4.2.2 + '@tailwindcss/oxide-win32-x64-msvc': 4.2.2 + + '@tailwindcss/postcss@4.2.2': + dependencies: + '@alloc/quick-lru': 5.2.0 + '@tailwindcss/node': 4.2.2 + '@tailwindcss/oxide': 4.2.2 + postcss: 8.5.8 + tailwindcss: 4.2.2 + + '@tanstack/query-core@5.94.5': {} + + '@tanstack/react-query@5.94.5(react@19.2.3)': + dependencies: + '@tanstack/query-core': 5.94.5 + react: 19.2.3 + + '@tanstack/react-table@8.21.3(react-dom@19.2.3(react@19.2.3))(react@19.2.3)': + dependencies: + '@tanstack/table-core': 8.21.3 + react: 19.2.3 + react-dom: 19.2.3(react@19.2.3) + + '@tanstack/table-core@8.21.3': {} + + '@ts-morph/common@0.27.0': + dependencies: + fast-glob: 3.3.3 + minimatch: 10.2.4 + path-browserify: 1.0.1 + + '@tybys/wasm-util@0.10.1': + dependencies: + tslib: 2.8.1 + optional: true + + '@types/estree@1.0.8': {} + + '@types/json-schema@7.0.15': {} + + '@types/json5@0.0.29': {} + + '@types/node@20.19.37': + dependencies: + undici-types: 6.21.0 + + '@types/react-dom@19.2.3(@types/react@19.2.14)': + dependencies: + '@types/react': 19.2.14 + + '@types/react@19.2.14': + dependencies: + csstype: 3.2.3 + + '@types/statuses@2.0.6': {} + + '@types/validate-npm-package-name@4.0.2': {} + + '@typescript-eslint/eslint-plugin@8.57.1(@typescript-eslint/parser@8.57.1(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3))(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3)': + dependencies: + '@eslint-community/regexpp': 4.12.2 + '@typescript-eslint/parser': 
8.57.1(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3) + '@typescript-eslint/scope-manager': 8.57.1 + '@typescript-eslint/type-utils': 8.57.1(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3) + '@typescript-eslint/utils': 8.57.1(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3) + '@typescript-eslint/visitor-keys': 8.57.1 + eslint: 9.39.4(jiti@2.6.1) + ignore: 7.0.5 + natural-compare: 1.4.0 + ts-api-utils: 2.5.0(typescript@5.9.3) + typescript: 5.9.3 + transitivePeerDependencies: + - supports-color + + '@typescript-eslint/parser@8.57.1(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3)': + dependencies: + '@typescript-eslint/scope-manager': 8.57.1 + '@typescript-eslint/types': 8.57.1 + '@typescript-eslint/typescript-estree': 8.57.1(typescript@5.9.3) + '@typescript-eslint/visitor-keys': 8.57.1 + debug: 4.4.3 + eslint: 9.39.4(jiti@2.6.1) + typescript: 5.9.3 + transitivePeerDependencies: + - supports-color + + '@typescript-eslint/project-service@8.57.1(typescript@5.9.3)': + dependencies: + '@typescript-eslint/tsconfig-utils': 8.57.1(typescript@5.9.3) + '@typescript-eslint/types': 8.57.1 + debug: 4.4.3 + typescript: 5.9.3 + transitivePeerDependencies: + - supports-color + + '@typescript-eslint/scope-manager@8.57.1': + dependencies: + '@typescript-eslint/types': 8.57.1 + '@typescript-eslint/visitor-keys': 8.57.1 + + '@typescript-eslint/tsconfig-utils@8.57.1(typescript@5.9.3)': + dependencies: + typescript: 5.9.3 + + '@typescript-eslint/type-utils@8.57.1(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3)': + dependencies: + '@typescript-eslint/types': 8.57.1 + '@typescript-eslint/typescript-estree': 8.57.1(typescript@5.9.3) + '@typescript-eslint/utils': 8.57.1(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3) + debug: 4.4.3 + eslint: 9.39.4(jiti@2.6.1) + ts-api-utils: 2.5.0(typescript@5.9.3) + typescript: 5.9.3 + transitivePeerDependencies: + - supports-color + + '@typescript-eslint/types@8.57.1': {} + + '@typescript-eslint/typescript-estree@8.57.1(typescript@5.9.3)': + dependencies: + '@typescript-eslint/project-service': 8.57.1(typescript@5.9.3) + '@typescript-eslint/tsconfig-utils': 8.57.1(typescript@5.9.3) + '@typescript-eslint/types': 8.57.1 + '@typescript-eslint/visitor-keys': 8.57.1 + debug: 4.4.3 + minimatch: 10.2.4 + semver: 7.7.4 + tinyglobby: 0.2.15 + ts-api-utils: 2.5.0(typescript@5.9.3) + typescript: 5.9.3 + transitivePeerDependencies: + - supports-color + + '@typescript-eslint/utils@8.57.1(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3)': + dependencies: + '@eslint-community/eslint-utils': 4.9.1(eslint@9.39.4(jiti@2.6.1)) + '@typescript-eslint/scope-manager': 8.57.1 + '@typescript-eslint/types': 8.57.1 + '@typescript-eslint/typescript-estree': 8.57.1(typescript@5.9.3) + eslint: 9.39.4(jiti@2.6.1) + typescript: 5.9.3 + transitivePeerDependencies: + - supports-color + + '@typescript-eslint/visitor-keys@8.57.1': + dependencies: + '@typescript-eslint/types': 8.57.1 + eslint-visitor-keys: 5.0.1 + + '@unrs/resolver-binding-android-arm-eabi@1.11.1': + optional: true + + '@unrs/resolver-binding-android-arm64@1.11.1': + optional: true + + '@unrs/resolver-binding-darwin-arm64@1.11.1': + optional: true + + '@unrs/resolver-binding-darwin-x64@1.11.1': + optional: true + + '@unrs/resolver-binding-freebsd-x64@1.11.1': + optional: true + + '@unrs/resolver-binding-linux-arm-gnueabihf@1.11.1': + optional: true + + '@unrs/resolver-binding-linux-arm-musleabihf@1.11.1': + optional: true + + '@unrs/resolver-binding-linux-arm64-gnu@1.11.1': + optional: true + + '@unrs/resolver-binding-linux-arm64-musl@1.11.1': + optional: true + + 
'@unrs/resolver-binding-linux-ppc64-gnu@1.11.1': + optional: true + + '@unrs/resolver-binding-linux-riscv64-gnu@1.11.1': + optional: true + + '@unrs/resolver-binding-linux-riscv64-musl@1.11.1': + optional: true + + '@unrs/resolver-binding-linux-s390x-gnu@1.11.1': + optional: true + + '@unrs/resolver-binding-linux-x64-gnu@1.11.1': + optional: true + + '@unrs/resolver-binding-linux-x64-musl@1.11.1': + optional: true + + '@unrs/resolver-binding-wasm32-wasi@1.11.1': + dependencies: + '@napi-rs/wasm-runtime': 0.2.12 + optional: true + + '@unrs/resolver-binding-win32-arm64-msvc@1.11.1': + optional: true + + '@unrs/resolver-binding-win32-ia32-msvc@1.11.1': + optional: true + + '@unrs/resolver-binding-win32-x64-msvc@1.11.1': + optional: true + + accepts@2.0.0: + dependencies: + mime-types: 3.0.2 + negotiator: 1.0.0 + + acorn-jsx@5.3.2(acorn@8.16.0): + dependencies: + acorn: 8.16.0 + + acorn@8.16.0: {} + + agent-base@7.1.4: {} + + ajv-formats@3.0.1(ajv@8.18.0): + optionalDependencies: + ajv: 8.18.0 + + ajv@6.14.0: + dependencies: + fast-deep-equal: 3.1.3 + fast-json-stable-stringify: 2.1.0 + json-schema-traverse: 0.4.1 + uri-js: 4.4.1 + + ajv@8.18.0: + dependencies: + fast-deep-equal: 3.1.3 + fast-uri: 3.1.0 + json-schema-traverse: 1.0.0 + require-from-string: 2.0.2 + + ansi-regex@5.0.1: {} + + ansi-regex@6.2.2: {} + + ansi-styles@4.3.0: + dependencies: + color-convert: 2.0.1 + + argparse@2.0.1: {} + + aria-query@5.3.2: {} + + array-buffer-byte-length@1.0.2: + dependencies: + call-bound: 1.0.4 + is-array-buffer: 3.0.5 + + array-includes@3.1.9: + dependencies: + call-bind: 1.0.8 + call-bound: 1.0.4 + define-properties: 1.2.1 + es-abstract: 1.24.1 + es-object-atoms: 1.1.1 + get-intrinsic: 1.3.0 + is-string: 1.1.1 + math-intrinsics: 1.1.0 + + array.prototype.findlast@1.2.5: + dependencies: + call-bind: 1.0.8 + define-properties: 1.2.1 + es-abstract: 1.24.1 + es-errors: 1.3.0 + es-object-atoms: 1.1.1 + es-shim-unscopables: 1.1.0 + + array.prototype.findlastindex@1.2.6: + dependencies: + call-bind: 1.0.8 + call-bound: 1.0.4 + define-properties: 1.2.1 + es-abstract: 1.24.1 + es-errors: 1.3.0 + es-object-atoms: 1.1.1 + es-shim-unscopables: 1.1.0 + + array.prototype.flat@1.3.3: + dependencies: + call-bind: 1.0.8 + define-properties: 1.2.1 + es-abstract: 1.24.1 + es-shim-unscopables: 1.1.0 + + array.prototype.flatmap@1.3.3: + dependencies: + call-bind: 1.0.8 + define-properties: 1.2.1 + es-abstract: 1.24.1 + es-shim-unscopables: 1.1.0 + + array.prototype.tosorted@1.1.4: + dependencies: + call-bind: 1.0.8 + define-properties: 1.2.1 + es-abstract: 1.24.1 + es-errors: 1.3.0 + es-shim-unscopables: 1.1.0 + + arraybuffer.prototype.slice@1.0.4: + dependencies: + array-buffer-byte-length: 1.0.2 + call-bind: 1.0.8 + define-properties: 1.2.1 + es-abstract: 1.24.1 + es-errors: 1.3.0 + get-intrinsic: 1.3.0 + is-array-buffer: 3.0.5 + + ast-types-flow@0.0.8: {} + + ast-types@0.16.1: + dependencies: + tslib: 2.8.1 + + async-function@1.0.0: {} + + available-typed-arrays@1.0.7: + dependencies: + possible-typed-array-names: 1.1.0 + + axe-core@4.11.1: {} + + axobject-query@4.1.0: {} + + balanced-match@1.0.2: {} + + balanced-match@4.0.4: {} + + baseline-browser-mapping@2.10.10: {} + + body-parser@2.2.2: + dependencies: + bytes: 3.1.2 + content-type: 1.0.5 + debug: 4.4.3 + http-errors: 2.0.1 + iconv-lite: 0.7.2 + on-finished: 2.4.1 + qs: 6.15.0 + raw-body: 3.0.2 + type-is: 2.0.1 + transitivePeerDependencies: + - supports-color + + brace-expansion@1.1.12: + dependencies: + balanced-match: 1.0.2 + concat-map: 0.0.1 + + 
brace-expansion@5.0.4: + dependencies: + balanced-match: 4.0.4 + + braces@3.0.3: + dependencies: + fill-range: 7.1.1 + + browserslist@4.28.1: + dependencies: + baseline-browser-mapping: 2.10.10 + caniuse-lite: 1.0.30001780 + electron-to-chromium: 1.5.321 + node-releases: 2.0.36 + update-browserslist-db: 1.2.3(browserslist@4.28.1) + + bundle-name@4.1.0: + dependencies: + run-applescript: 7.1.0 + + bytes@3.1.2: {} + + call-bind-apply-helpers@1.0.2: + dependencies: + es-errors: 1.3.0 + function-bind: 1.1.2 + + call-bind@1.0.8: + dependencies: + call-bind-apply-helpers: 1.0.2 + es-define-property: 1.0.1 + get-intrinsic: 1.3.0 + set-function-length: 1.2.2 + + call-bound@1.0.4: + dependencies: + call-bind-apply-helpers: 1.0.2 + get-intrinsic: 1.3.0 + + callsites@3.1.0: {} + + caniuse-lite@1.0.30001780: {} + + chalk@4.1.2: + dependencies: + ansi-styles: 4.3.0 + supports-color: 7.2.0 + + chalk@5.6.2: {} + + class-variance-authority@0.7.1: + dependencies: + clsx: 2.1.1 + + cli-cursor@5.0.0: + dependencies: + restore-cursor: 5.1.0 + + cli-spinners@2.9.2: {} + + cli-width@4.1.0: {} + + client-only@0.0.1: {} + + cliui@8.0.1: + dependencies: + string-width: 4.2.3 + strip-ansi: 6.0.1 + wrap-ansi: 7.0.0 + + clsx@2.1.1: {} + + code-block-writer@13.0.3: {} + + color-convert@2.0.1: + dependencies: + color-name: 1.1.4 + + color-name@1.1.4: {} + + commander@11.1.0: {} + + commander@14.0.3: {} + + concat-map@0.0.1: {} + + content-disposition@1.0.1: {} + + content-type@1.0.5: {} + + convert-source-map@2.0.0: {} + + cookie-signature@1.2.2: {} + + cookie@0.7.2: {} + + cookie@1.1.1: {} + + cors@2.8.6: + dependencies: + object-assign: 4.1.1 + vary: 1.1.2 + + cosmiconfig@9.0.1(typescript@5.9.3): + dependencies: + env-paths: 2.2.1 + import-fresh: 3.3.1 + js-yaml: 4.1.1 + parse-json: 5.2.0 + optionalDependencies: + typescript: 5.9.3 + + cross-spawn@7.0.6: + dependencies: + path-key: 3.1.1 + shebang-command: 2.0.0 + which: 2.0.2 + + cssesc@3.0.0: {} + + csstype@3.2.3: {} + + damerau-levenshtein@1.0.8: {} + + data-uri-to-buffer@4.0.1: {} + + data-view-buffer@1.0.2: + dependencies: + call-bound: 1.0.4 + es-errors: 1.3.0 + is-data-view: 1.0.2 + + data-view-byte-length@1.0.2: + dependencies: + call-bound: 1.0.4 + es-errors: 1.3.0 + is-data-view: 1.0.2 + + data-view-byte-offset@1.0.1: + dependencies: + call-bound: 1.0.4 + es-errors: 1.3.0 + is-data-view: 1.0.2 + + debug@3.2.7: + dependencies: + ms: 2.1.3 + + debug@4.4.3: + dependencies: + ms: 2.1.3 + + dedent@1.7.2: {} + + deep-is@0.1.4: {} + + deepmerge@4.3.1: {} + + default-browser-id@5.0.1: {} + + default-browser@5.5.0: + dependencies: + bundle-name: 4.1.0 + default-browser-id: 5.0.1 + + define-data-property@1.1.4: + dependencies: + es-define-property: 1.0.1 + es-errors: 1.3.0 + gopd: 1.2.0 + + define-lazy-prop@3.0.0: {} + + define-properties@1.2.1: + dependencies: + define-data-property: 1.1.4 + has-property-descriptors: 1.0.2 + object-keys: 1.1.1 + + depd@2.0.0: {} + + detect-libc@2.1.2: {} + + diff@8.0.3: {} + + doctrine@2.1.0: + dependencies: + esutils: 2.0.3 + + dotenv@17.3.1: {} + + dunder-proto@1.0.1: + dependencies: + call-bind-apply-helpers: 1.0.2 + es-errors: 1.3.0 + gopd: 1.2.0 + + eciesjs@0.4.18: + dependencies: + '@ecies/ciphers': 0.2.5(@noble/ciphers@1.3.0) + '@noble/ciphers': 1.3.0 + '@noble/curves': 1.9.7 + '@noble/hashes': 1.8.0 + + ee-first@1.1.1: {} + + electron-to-chromium@1.5.321: {} + + emoji-regex@10.6.0: {} + + emoji-regex@8.0.0: {} + + emoji-regex@9.2.2: {} + + encodeurl@2.0.0: {} + + enhanced-resolve@5.20.1: + dependencies: + graceful-fs: 4.2.11 
+ tapable: 2.3.0 + + env-paths@2.2.1: {} + + error-ex@1.3.4: + dependencies: + is-arrayish: 0.2.1 + + es-abstract@1.24.1: + dependencies: + array-buffer-byte-length: 1.0.2 + arraybuffer.prototype.slice: 1.0.4 + available-typed-arrays: 1.0.7 + call-bind: 1.0.8 + call-bound: 1.0.4 + data-view-buffer: 1.0.2 + data-view-byte-length: 1.0.2 + data-view-byte-offset: 1.0.1 + es-define-property: 1.0.1 + es-errors: 1.3.0 + es-object-atoms: 1.1.1 + es-set-tostringtag: 2.1.0 + es-to-primitive: 1.3.0 + function.prototype.name: 1.1.8 + get-intrinsic: 1.3.0 + get-proto: 1.0.1 + get-symbol-description: 1.1.0 + globalthis: 1.0.4 + gopd: 1.2.0 + has-property-descriptors: 1.0.2 + has-proto: 1.2.0 + has-symbols: 1.1.0 + hasown: 2.0.2 + internal-slot: 1.1.0 + is-array-buffer: 3.0.5 + is-callable: 1.2.7 + is-data-view: 1.0.2 + is-negative-zero: 2.0.3 + is-regex: 1.2.1 + is-set: 2.0.3 + is-shared-array-buffer: 1.0.4 + is-string: 1.1.1 + is-typed-array: 1.1.15 + is-weakref: 1.1.1 + math-intrinsics: 1.1.0 + object-inspect: 1.13.4 + object-keys: 1.1.1 + object.assign: 4.1.7 + own-keys: 1.0.1 + regexp.prototype.flags: 1.5.4 + safe-array-concat: 1.1.3 + safe-push-apply: 1.0.0 + safe-regex-test: 1.1.0 + set-proto: 1.0.0 + stop-iteration-iterator: 1.1.0 + string.prototype.trim: 1.2.10 + string.prototype.trimend: 1.0.9 + string.prototype.trimstart: 1.0.8 + typed-array-buffer: 1.0.3 + typed-array-byte-length: 1.0.3 + typed-array-byte-offset: 1.0.4 + typed-array-length: 1.0.7 + unbox-primitive: 1.1.0 + which-typed-array: 1.1.20 + + es-define-property@1.0.1: {} + + es-errors@1.3.0: {} + + es-iterator-helpers@1.3.1: + dependencies: + call-bind: 1.0.8 + call-bound: 1.0.4 + define-properties: 1.2.1 + es-abstract: 1.24.1 + es-errors: 1.3.0 + es-set-tostringtag: 2.1.0 + function-bind: 1.1.2 + get-intrinsic: 1.3.0 + globalthis: 1.0.4 + gopd: 1.2.0 + has-property-descriptors: 1.0.2 + has-proto: 1.2.0 + has-symbols: 1.1.0 + internal-slot: 1.1.0 + iterator.prototype: 1.1.5 + math-intrinsics: 1.1.0 + safe-array-concat: 1.1.3 + + es-object-atoms@1.1.1: + dependencies: + es-errors: 1.3.0 + + es-set-tostringtag@2.1.0: + dependencies: + es-errors: 1.3.0 + get-intrinsic: 1.3.0 + has-tostringtag: 1.0.2 + hasown: 2.0.2 + + es-shim-unscopables@1.1.0: + dependencies: + hasown: 2.0.2 + + es-to-primitive@1.3.0: + dependencies: + is-callable: 1.2.7 + is-date-object: 1.1.0 + is-symbol: 1.1.1 + + escalade@3.2.0: {} + + escape-html@1.0.3: {} + + escape-string-regexp@4.0.0: {} + + eslint-config-next@16.1.7(@typescript-eslint/parser@8.57.1(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3))(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3): + dependencies: + '@next/eslint-plugin-next': 16.1.7 + eslint: 9.39.4(jiti@2.6.1) + eslint-import-resolver-node: 0.3.9 + eslint-import-resolver-typescript: 3.10.1(eslint-plugin-import@2.32.0)(eslint@9.39.4(jiti@2.6.1)) + eslint-plugin-import: 2.32.0(@typescript-eslint/parser@8.57.1(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3))(eslint-import-resolver-typescript@3.10.1)(eslint@9.39.4(jiti@2.6.1)) + eslint-plugin-jsx-a11y: 6.10.2(eslint@9.39.4(jiti@2.6.1)) + eslint-plugin-react: 7.37.5(eslint@9.39.4(jiti@2.6.1)) + eslint-plugin-react-hooks: 7.0.1(eslint@9.39.4(jiti@2.6.1)) + globals: 16.4.0 + typescript-eslint: 8.57.1(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3) + optionalDependencies: + typescript: 5.9.3 + transitivePeerDependencies: + - '@typescript-eslint/parser' + - eslint-import-resolver-webpack + - eslint-plugin-import-x + - supports-color + + eslint-config-next@16.1.7(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3): + 
dependencies: + '@next/eslint-plugin-next': 16.1.7 + eslint: 9.39.4(jiti@2.6.1) + eslint-import-resolver-node: 0.3.9 + eslint-import-resolver-typescript: 3.10.1(eslint-plugin-import@2.32.0)(eslint@9.39.4(jiti@2.6.1)) + eslint-plugin-import: 2.32.0(eslint-import-resolver-typescript@3.10.1)(eslint@9.39.4(jiti@2.6.1)) + eslint-plugin-jsx-a11y: 6.10.2(eslint@9.39.4(jiti@2.6.1)) + eslint-plugin-react: 7.37.5(eslint@9.39.4(jiti@2.6.1)) + eslint-plugin-react-hooks: 7.0.1(eslint@9.39.4(jiti@2.6.1)) + globals: 16.4.0 + typescript-eslint: 8.57.1(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3) + optionalDependencies: + typescript: 5.9.3 + transitivePeerDependencies: + - '@typescript-eslint/parser' + - eslint-import-resolver-webpack + - eslint-plugin-import-x + - supports-color + + eslint-import-resolver-node@0.3.9: + dependencies: + debug: 3.2.7 + is-core-module: 2.16.1 + resolve: 1.22.11 + transitivePeerDependencies: + - supports-color + + eslint-import-resolver-typescript@3.10.1(eslint-plugin-import@2.32.0)(eslint@9.39.4(jiti@2.6.1)): + dependencies: + '@nolyfill/is-core-module': 1.0.39 + debug: 4.4.3 + eslint: 9.39.4(jiti@2.6.1) + get-tsconfig: 4.13.6 + is-bun-module: 2.0.0 + stable-hash: 0.0.5 + tinyglobby: 0.2.15 + unrs-resolver: 1.11.1 + optionalDependencies: + eslint-plugin-import: 2.32.0(@typescript-eslint/parser@8.57.1(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3))(eslint-import-resolver-typescript@3.10.1)(eslint@9.39.4(jiti@2.6.1)) + transitivePeerDependencies: + - supports-color + + eslint-module-utils@2.12.1(@typescript-eslint/parser@8.57.1(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3))(eslint-import-resolver-node@0.3.9)(eslint-import-resolver-typescript@3.10.1)(eslint@9.39.4(jiti@2.6.1)): + dependencies: + debug: 3.2.7 + optionalDependencies: + '@typescript-eslint/parser': 8.57.1(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3) + eslint: 9.39.4(jiti@2.6.1) + eslint-import-resolver-node: 0.3.9 + eslint-import-resolver-typescript: 3.10.1(eslint-plugin-import@2.32.0)(eslint@9.39.4(jiti@2.6.1)) + transitivePeerDependencies: + - supports-color + + eslint-plugin-import@2.32.0(@typescript-eslint/parser@8.57.1(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3))(eslint-import-resolver-typescript@3.10.1)(eslint@9.39.4(jiti@2.6.1)): + dependencies: + '@rtsao/scc': 1.1.0 + array-includes: 3.1.9 + array.prototype.findlastindex: 1.2.6 + array.prototype.flat: 1.3.3 + array.prototype.flatmap: 1.3.3 + debug: 3.2.7 + doctrine: 2.1.0 + eslint: 9.39.4(jiti@2.6.1) + eslint-import-resolver-node: 0.3.9 + eslint-module-utils: 2.12.1(@typescript-eslint/parser@8.57.1(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3))(eslint-import-resolver-node@0.3.9)(eslint-import-resolver-typescript@3.10.1)(eslint@9.39.4(jiti@2.6.1)) + hasown: 2.0.2 + is-core-module: 2.16.1 + is-glob: 4.0.3 + minimatch: 3.1.5 + object.fromentries: 2.0.8 + object.groupby: 1.0.3 + object.values: 1.2.1 + semver: 6.3.1 + string.prototype.trimend: 1.0.9 + tsconfig-paths: 3.15.0 + optionalDependencies: + '@typescript-eslint/parser': 8.57.1(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3) + transitivePeerDependencies: + - eslint-import-resolver-typescript + - eslint-import-resolver-webpack + - supports-color + + eslint-plugin-import@2.32.0(eslint-import-resolver-typescript@3.10.1)(eslint@9.39.4(jiti@2.6.1)): + dependencies: + '@rtsao/scc': 1.1.0 + array-includes: 3.1.9 + array.prototype.findlastindex: 1.2.6 + array.prototype.flat: 1.3.3 + array.prototype.flatmap: 1.3.3 + debug: 3.2.7 + doctrine: 2.1.0 + eslint: 9.39.4(jiti@2.6.1) + eslint-import-resolver-node: 0.3.9 + 
eslint-module-utils: 2.12.1(@typescript-eslint/parser@8.57.1(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3))(eslint-import-resolver-node@0.3.9)(eslint-import-resolver-typescript@3.10.1)(eslint@9.39.4(jiti@2.6.1)) + hasown: 2.0.2 + is-core-module: 2.16.1 + is-glob: 4.0.3 + minimatch: 3.1.5 + object.fromentries: 2.0.8 + object.groupby: 1.0.3 + object.values: 1.2.1 + semver: 6.3.1 + string.prototype.trimend: 1.0.9 + tsconfig-paths: 3.15.0 + transitivePeerDependencies: + - eslint-import-resolver-typescript + - eslint-import-resolver-webpack + - supports-color + + eslint-plugin-jsx-a11y@6.10.2(eslint@9.39.4(jiti@2.6.1)): + dependencies: + aria-query: 5.3.2 + array-includes: 3.1.9 + array.prototype.flatmap: 1.3.3 + ast-types-flow: 0.0.8 + axe-core: 4.11.1 + axobject-query: 4.1.0 + damerau-levenshtein: 1.0.8 + emoji-regex: 9.2.2 + eslint: 9.39.4(jiti@2.6.1) + hasown: 2.0.2 + jsx-ast-utils: 3.3.5 + language-tags: 1.0.9 + minimatch: 3.1.5 + object.fromentries: 2.0.8 + safe-regex-test: 1.1.0 + string.prototype.includes: 2.0.1 + + eslint-plugin-react-hooks@7.0.1(eslint@9.39.4(jiti@2.6.1)): + dependencies: + '@babel/core': 7.29.0 + '@babel/parser': 7.29.2 + eslint: 9.39.4(jiti@2.6.1) + hermes-parser: 0.25.1 + zod: 4.3.6 + zod-validation-error: 4.0.2(zod@4.3.6) + transitivePeerDependencies: + - supports-color + + eslint-plugin-react@7.37.5(eslint@9.39.4(jiti@2.6.1)): + dependencies: + array-includes: 3.1.9 + array.prototype.findlast: 1.2.5 + array.prototype.flatmap: 1.3.3 + array.prototype.tosorted: 1.1.4 + doctrine: 2.1.0 + es-iterator-helpers: 1.3.1 + eslint: 9.39.4(jiti@2.6.1) + estraverse: 5.3.0 + hasown: 2.0.2 + jsx-ast-utils: 3.3.5 + minimatch: 3.1.5 + object.entries: 1.1.9 + object.fromentries: 2.0.8 + object.values: 1.2.1 + prop-types: 15.8.1 + resolve: 2.0.0-next.6 + semver: 6.3.1 + string.prototype.matchall: 4.0.12 + string.prototype.repeat: 1.0.0 + + eslint-scope@8.4.0: + dependencies: + esrecurse: 4.3.0 + estraverse: 5.3.0 + + eslint-visitor-keys@3.4.3: {} + + eslint-visitor-keys@4.2.1: {} + + eslint-visitor-keys@5.0.1: {} + + eslint@9.39.4(jiti@2.6.1): + dependencies: + '@eslint-community/eslint-utils': 4.9.1(eslint@9.39.4(jiti@2.6.1)) + '@eslint-community/regexpp': 4.12.2 + '@eslint/config-array': 0.21.2 + '@eslint/config-helpers': 0.4.2 + '@eslint/core': 0.17.0 + '@eslint/eslintrc': 3.3.5 + '@eslint/js': 9.39.4 + '@eslint/plugin-kit': 0.4.1 + '@humanfs/node': 0.16.7 + '@humanwhocodes/module-importer': 1.0.1 + '@humanwhocodes/retry': 0.4.3 + '@types/estree': 1.0.8 + ajv: 6.14.0 + chalk: 4.1.2 + cross-spawn: 7.0.6 + debug: 4.4.3 + escape-string-regexp: 4.0.0 + eslint-scope: 8.4.0 + eslint-visitor-keys: 4.2.1 + espree: 10.4.0 + esquery: 1.7.0 + esutils: 2.0.3 + fast-deep-equal: 3.1.3 + file-entry-cache: 8.0.0 + find-up: 5.0.0 + glob-parent: 6.0.2 + ignore: 5.3.2 + imurmurhash: 0.1.4 + is-glob: 4.0.3 + json-stable-stringify-without-jsonify: 1.0.1 + lodash.merge: 4.6.2 + minimatch: 3.1.5 + natural-compare: 1.4.0 + optionator: 0.9.4 + optionalDependencies: + jiti: 2.6.1 + transitivePeerDependencies: + - supports-color + + espree@10.4.0: + dependencies: + acorn: 8.16.0 + acorn-jsx: 5.3.2(acorn@8.16.0) + eslint-visitor-keys: 4.2.1 + + esprima@4.0.1: {} + + esquery@1.7.0: + dependencies: + estraverse: 5.3.0 + + esrecurse@4.3.0: + dependencies: + estraverse: 5.3.0 + + estraverse@5.3.0: {} + + esutils@2.0.3: {} + + etag@1.8.1: {} + + eventsource-parser@3.0.6: {} + + eventsource@3.0.7: + dependencies: + eventsource-parser: 3.0.6 + + execa@5.1.1: + dependencies: + cross-spawn: 7.0.6 + get-stream: 6.0.1 
+ human-signals: 2.1.0 + is-stream: 2.0.1 + merge-stream: 2.0.0 + npm-run-path: 4.0.1 + onetime: 5.1.2 + signal-exit: 3.0.7 + strip-final-newline: 2.0.0 + + execa@9.6.1: + dependencies: + '@sindresorhus/merge-streams': 4.0.0 + cross-spawn: 7.0.6 + figures: 6.1.0 + get-stream: 9.0.1 + human-signals: 8.0.1 + is-plain-obj: 4.1.0 + is-stream: 4.0.1 + npm-run-path: 6.0.0 + pretty-ms: 9.3.0 + signal-exit: 4.1.0 + strip-final-newline: 4.0.0 + yoctocolors: 2.1.2 + + express-rate-limit@8.3.1(express@5.2.1): + dependencies: + express: 5.2.1 + ip-address: 10.1.0 + + express@5.2.1: + dependencies: + accepts: 2.0.0 + body-parser: 2.2.2 + content-disposition: 1.0.1 + content-type: 1.0.5 + cookie: 0.7.2 + cookie-signature: 1.2.2 + debug: 4.4.3 + depd: 2.0.0 + encodeurl: 2.0.0 + escape-html: 1.0.3 + etag: 1.8.1 + finalhandler: 2.1.1 + fresh: 2.0.0 + http-errors: 2.0.1 + merge-descriptors: 2.0.0 + mime-types: 3.0.2 + on-finished: 2.4.1 + once: 1.4.0 + parseurl: 1.3.3 + proxy-addr: 2.0.7 + qs: 6.15.0 + range-parser: 1.2.1 + router: 2.2.0 + send: 1.2.1 + serve-static: 2.2.1 + statuses: 2.0.2 + type-is: 2.0.1 + vary: 1.1.2 + transitivePeerDependencies: + - supports-color + + fast-deep-equal@3.1.3: {} + + fast-glob@3.3.1: + dependencies: + '@nodelib/fs.stat': 2.0.5 + '@nodelib/fs.walk': 1.2.8 + glob-parent: 5.1.2 + merge2: 1.4.1 + micromatch: 4.0.8 + + fast-glob@3.3.3: + dependencies: + '@nodelib/fs.stat': 2.0.5 + '@nodelib/fs.walk': 1.2.8 + glob-parent: 5.1.2 + merge2: 1.4.1 + micromatch: 4.0.8 + + fast-json-stable-stringify@2.1.0: {} + + fast-levenshtein@2.0.6: {} + + fast-uri@3.1.0: {} + + fastq@1.20.1: + dependencies: + reusify: 1.1.0 + + fdir@6.5.0(picomatch@4.0.3): + optionalDependencies: + picomatch: 4.0.3 + + fetch-blob@3.2.0: + dependencies: + node-domexception: 1.0.0 + web-streams-polyfill: 3.3.3 + + figures@6.1.0: + dependencies: + is-unicode-supported: 2.1.0 + + file-entry-cache@8.0.0: + dependencies: + flat-cache: 4.0.1 + + fill-range@7.1.1: + dependencies: + to-regex-range: 5.0.1 + + finalhandler@2.1.1: + dependencies: + debug: 4.4.3 + encodeurl: 2.0.0 + escape-html: 1.0.3 + on-finished: 2.4.1 + parseurl: 1.3.3 + statuses: 2.0.2 + transitivePeerDependencies: + - supports-color + + find-up@5.0.0: + dependencies: + locate-path: 6.0.0 + path-exists: 4.0.0 + + flat-cache@4.0.1: + dependencies: + flatted: 3.4.2 + keyv: 4.5.4 + + flatted@3.4.2: {} + + for-each@0.3.5: + dependencies: + is-callable: 1.2.7 + + formdata-polyfill@4.0.10: + dependencies: + fetch-blob: 3.2.0 + + forwarded@0.2.0: {} + + fresh@2.0.0: {} + + fs-extra@11.3.4: + dependencies: + graceful-fs: 4.2.11 + jsonfile: 6.2.0 + universalify: 2.0.1 + + function-bind@1.1.2: {} + + function.prototype.name@1.1.8: + dependencies: + call-bind: 1.0.8 + call-bound: 1.0.4 + define-properties: 1.2.1 + functions-have-names: 1.2.3 + hasown: 2.0.2 + is-callable: 1.2.7 + + functions-have-names@1.2.3: {} + + fuzzysort@3.1.0: {} + + generator-function@2.0.1: {} + + gensync@1.0.0-beta.2: {} + + get-caller-file@2.0.5: {} + + get-east-asian-width@1.5.0: {} + + get-intrinsic@1.3.0: + dependencies: + call-bind-apply-helpers: 1.0.2 + es-define-property: 1.0.1 + es-errors: 1.3.0 + es-object-atoms: 1.1.1 + function-bind: 1.1.2 + get-proto: 1.0.1 + gopd: 1.2.0 + has-symbols: 1.1.0 + hasown: 2.0.2 + math-intrinsics: 1.1.0 + + get-own-enumerable-keys@1.0.0: {} + + get-proto@1.0.1: + dependencies: + dunder-proto: 1.0.1 + es-object-atoms: 1.1.1 + + get-stream@6.0.1: {} + + get-stream@9.0.1: + dependencies: + '@sec-ant/readable-stream': 0.4.1 + is-stream: 4.0.1 + + 
get-symbol-description@1.1.0: + dependencies: + call-bound: 1.0.4 + es-errors: 1.3.0 + get-intrinsic: 1.3.0 + + get-tsconfig@4.13.6: + dependencies: + resolve-pkg-maps: 1.0.0 + + glob-parent@5.1.2: + dependencies: + is-glob: 4.0.3 + + glob-parent@6.0.2: + dependencies: + is-glob: 4.0.3 + + globals@14.0.0: {} + + globals@16.4.0: {} + + globalthis@1.0.4: + dependencies: + define-properties: 1.2.1 + gopd: 1.2.0 + + gopd@1.2.0: {} + + graceful-fs@4.2.11: {} + + graphql@16.13.1: {} + + has-bigints@1.1.0: {} + + has-flag@4.0.0: {} + + has-property-descriptors@1.0.2: + dependencies: + es-define-property: 1.0.1 + + has-proto@1.2.0: + dependencies: + dunder-proto: 1.0.1 + + has-symbols@1.1.0: {} + + has-tostringtag@1.0.2: + dependencies: + has-symbols: 1.1.0 + + hasown@2.0.2: + dependencies: + function-bind: 1.1.2 + + headers-polyfill@4.0.3: {} + + hermes-estree@0.25.1: {} + + hermes-parser@0.25.1: + dependencies: + hermes-estree: 0.25.1 + + hono@4.12.8: {} + + http-errors@2.0.1: + dependencies: + depd: 2.0.0 + inherits: 2.0.4 + setprototypeof: 1.2.0 + statuses: 2.0.2 + toidentifier: 1.0.1 + + https-proxy-agent@7.0.6: + dependencies: + agent-base: 7.1.4 + debug: 4.4.3 + transitivePeerDependencies: + - supports-color + + human-signals@2.1.0: {} + + human-signals@8.0.1: {} + + iconv-lite@0.7.2: + dependencies: + safer-buffer: 2.1.2 + + ignore@5.3.2: {} + + ignore@7.0.5: {} + + import-fresh@3.3.1: + dependencies: + parent-module: 1.0.1 + resolve-from: 4.0.0 + + imurmurhash@0.1.4: {} + + inherits@2.0.4: {} + + internal-slot@1.1.0: + dependencies: + es-errors: 1.3.0 + hasown: 2.0.2 + side-channel: 1.1.0 + + ip-address@10.1.0: {} + + ipaddr.js@1.9.1: {} + + is-array-buffer@3.0.5: + dependencies: + call-bind: 1.0.8 + call-bound: 1.0.4 + get-intrinsic: 1.3.0 + + is-arrayish@0.2.1: {} + + is-async-function@2.1.1: + dependencies: + async-function: 1.0.0 + call-bound: 1.0.4 + get-proto: 1.0.1 + has-tostringtag: 1.0.2 + safe-regex-test: 1.1.0 + + is-bigint@1.1.0: + dependencies: + has-bigints: 1.1.0 + + is-boolean-object@1.2.2: + dependencies: + call-bound: 1.0.4 + has-tostringtag: 1.0.2 + + is-bun-module@2.0.0: + dependencies: + semver: 7.7.4 + + is-callable@1.2.7: {} + + is-core-module@2.16.1: + dependencies: + hasown: 2.0.2 + + is-data-view@1.0.2: + dependencies: + call-bound: 1.0.4 + get-intrinsic: 1.3.0 + is-typed-array: 1.1.15 + + is-date-object@1.1.0: + dependencies: + call-bound: 1.0.4 + has-tostringtag: 1.0.2 + + is-docker@3.0.0: {} + + is-extglob@2.1.1: {} + + is-finalizationregistry@1.1.1: + dependencies: + call-bound: 1.0.4 + + is-fullwidth-code-point@3.0.0: {} + + is-generator-function@1.1.2: + dependencies: + call-bound: 1.0.4 + generator-function: 2.0.1 + get-proto: 1.0.1 + has-tostringtag: 1.0.2 + safe-regex-test: 1.1.0 + + is-glob@4.0.3: + dependencies: + is-extglob: 2.1.1 + + is-in-ssh@1.0.0: {} + + is-inside-container@1.0.0: + dependencies: + is-docker: 3.0.0 + + is-interactive@2.0.0: {} + + is-map@2.0.3: {} + + is-negative-zero@2.0.3: {} + + is-node-process@1.2.0: {} + + is-number-object@1.1.1: + dependencies: + call-bound: 1.0.4 + has-tostringtag: 1.0.2 + + is-number@7.0.0: {} + + is-obj@3.0.0: {} + + is-plain-obj@4.1.0: {} + + is-promise@4.0.0: {} + + is-regex@1.2.1: + dependencies: + call-bound: 1.0.4 + gopd: 1.2.0 + has-tostringtag: 1.0.2 + hasown: 2.0.2 + + is-regexp@3.1.0: {} + + is-set@2.0.3: {} + + is-shared-array-buffer@1.0.4: + dependencies: + call-bound: 1.0.4 + + is-stream@2.0.1: {} + + is-stream@4.0.1: {} + + is-string@1.1.1: + dependencies: + call-bound: 1.0.4 + 
has-tostringtag: 1.0.2 + + is-symbol@1.1.1: + dependencies: + call-bound: 1.0.4 + has-symbols: 1.1.0 + safe-regex-test: 1.1.0 + + is-typed-array@1.1.15: + dependencies: + which-typed-array: 1.1.20 + + is-unicode-supported@1.3.0: {} + + is-unicode-supported@2.1.0: {} + + is-weakmap@2.0.2: {} + + is-weakref@1.1.1: + dependencies: + call-bound: 1.0.4 + + is-weakset@2.0.4: + dependencies: + call-bound: 1.0.4 + get-intrinsic: 1.3.0 + + is-wsl@3.1.1: + dependencies: + is-inside-container: 1.0.0 + + isarray@2.0.5: {} + + isexe@2.0.0: {} + + isexe@3.1.5: {} + + iterator.prototype@1.1.5: + dependencies: + define-data-property: 1.1.4 + es-object-atoms: 1.1.1 + get-intrinsic: 1.3.0 + get-proto: 1.0.1 + has-symbols: 1.1.0 + set-function-name: 2.0.2 + + jiti@2.6.1: {} + + jose@6.2.2: {} + + js-tokens@4.0.0: {} + + js-yaml@4.1.1: + dependencies: + argparse: 2.0.1 + + jsesc@3.1.0: {} + + json-buffer@3.0.1: {} + + json-parse-even-better-errors@2.3.1: {} + + json-schema-traverse@0.4.1: {} + + json-schema-traverse@1.0.0: {} + + json-schema-typed@8.0.2: {} + + json-stable-stringify-without-jsonify@1.0.1: {} + + json5@1.0.2: + dependencies: + minimist: 1.2.8 + + json5@2.2.3: {} + + jsonfile@6.2.0: + dependencies: + universalify: 2.0.1 + optionalDependencies: + graceful-fs: 4.2.11 + + jsx-ast-utils@3.3.5: + dependencies: + array-includes: 3.1.9 + array.prototype.flat: 1.3.3 + object.assign: 4.1.7 + object.values: 1.2.1 + + keyv@4.5.4: + dependencies: + json-buffer: 3.0.1 + + kleur@3.0.3: {} + + kleur@4.1.5: {} + + language-subtag-registry@0.3.23: {} + + language-tags@1.0.9: + dependencies: + language-subtag-registry: 0.3.23 + + levn@0.4.1: + dependencies: + prelude-ls: 1.2.1 + type-check: 0.4.0 + + lightningcss-android-arm64@1.32.0: + optional: true + + lightningcss-darwin-arm64@1.32.0: + optional: true + + lightningcss-darwin-x64@1.32.0: + optional: true + + lightningcss-freebsd-x64@1.32.0: + optional: true + + lightningcss-linux-arm-gnueabihf@1.32.0: + optional: true + + lightningcss-linux-arm64-gnu@1.32.0: + optional: true + + lightningcss-linux-arm64-musl@1.32.0: + optional: true + + lightningcss-linux-x64-gnu@1.32.0: + optional: true + + lightningcss-linux-x64-musl@1.32.0: + optional: true + + lightningcss-win32-arm64-msvc@1.32.0: + optional: true + + lightningcss-win32-x64-msvc@1.32.0: + optional: true + + lightningcss@1.32.0: + dependencies: + detect-libc: 2.1.2 + optionalDependencies: + lightningcss-android-arm64: 1.32.0 + lightningcss-darwin-arm64: 1.32.0 + lightningcss-darwin-x64: 1.32.0 + lightningcss-freebsd-x64: 1.32.0 + lightningcss-linux-arm-gnueabihf: 1.32.0 + lightningcss-linux-arm64-gnu: 1.32.0 + lightningcss-linux-arm64-musl: 1.32.0 + lightningcss-linux-x64-gnu: 1.32.0 + lightningcss-linux-x64-musl: 1.32.0 + lightningcss-win32-arm64-msvc: 1.32.0 + lightningcss-win32-x64-msvc: 1.32.0 + + lines-and-columns@1.2.4: {} + + locate-path@6.0.0: + dependencies: + p-locate: 5.0.0 + + lodash.merge@4.6.2: {} + + log-symbols@6.0.0: + dependencies: + chalk: 5.6.2 + is-unicode-supported: 1.3.0 + + loose-envify@1.4.0: + dependencies: + js-tokens: 4.0.0 + + lru-cache@5.1.1: + dependencies: + yallist: 3.1.1 + + lucide-react@0.500.0(react@19.2.3): + dependencies: + react: 19.2.3 + + magic-string@0.30.21: + dependencies: + '@jridgewell/sourcemap-codec': 1.5.5 + + math-intrinsics@1.1.0: {} + + media-typer@1.1.0: {} + + merge-descriptors@2.0.0: {} + + merge-stream@2.0.0: {} + + merge2@1.4.1: {} + + micromatch@4.0.8: + dependencies: + braces: 3.0.3 + picomatch: 2.3.1 + + mime-db@1.54.0: {} + + mime-types@3.0.2: + 
dependencies: + mime-db: 1.54.0 + + mimic-fn@2.1.0: {} + + mimic-function@5.0.1: {} + + minimatch@10.2.4: + dependencies: + brace-expansion: 5.0.4 + + minimatch@3.1.5: + dependencies: + brace-expansion: 1.1.12 + + minimist@1.2.8: {} + + ms@2.1.3: {} + + msw@2.12.14(@types/node@20.19.37)(typescript@5.9.3): + dependencies: + '@inquirer/confirm': 5.1.21(@types/node@20.19.37) + '@mswjs/interceptors': 0.41.3 + '@open-draft/deferred-promise': 2.2.0 + '@types/statuses': 2.0.6 + cookie: 1.1.1 + graphql: 16.13.1 + headers-polyfill: 4.0.3 + is-node-process: 1.2.0 + outvariant: 1.4.3 + path-to-regexp: 6.3.0 + picocolors: 1.1.1 + rettime: 0.10.1 + statuses: 2.0.2 + strict-event-emitter: 0.5.1 + tough-cookie: 6.0.1 + type-fest: 5.5.0 + until-async: 3.0.2 + yargs: 17.7.2 + optionalDependencies: + typescript: 5.9.3 + transitivePeerDependencies: + - '@types/node' + + mute-stream@2.0.0: {} + + nanoid@3.3.11: {} + + napi-postinstall@0.3.4: {} + + natural-compare@1.4.0: {} + + negotiator@1.0.0: {} + + next-themes@0.4.6(react-dom@19.2.3(react@19.2.3))(react@19.2.3): + dependencies: + react: 19.2.3 + react-dom: 19.2.3(react@19.2.3) + + next@16.1.7(@babel/core@7.29.0)(react-dom@19.2.3(react@19.2.3))(react@19.2.3): + dependencies: + '@next/env': 16.1.7 + '@swc/helpers': 0.5.15 + baseline-browser-mapping: 2.10.10 + caniuse-lite: 1.0.30001780 + postcss: 8.4.31 + react: 19.2.3 + react-dom: 19.2.3(react@19.2.3) + styled-jsx: 5.1.6(@babel/core@7.29.0)(react@19.2.3) + optionalDependencies: + '@next/swc-darwin-arm64': 16.1.7 + '@next/swc-darwin-x64': 16.1.7 + '@next/swc-linux-arm64-gnu': 16.1.7 + '@next/swc-linux-arm64-musl': 16.1.7 + '@next/swc-linux-x64-gnu': 16.1.7 + '@next/swc-linux-x64-musl': 16.1.7 + '@next/swc-win32-arm64-msvc': 16.1.7 + '@next/swc-win32-x64-msvc': 16.1.7 + sharp: 0.34.5 + transitivePeerDependencies: + - '@babel/core' + - babel-plugin-macros + + node-domexception@1.0.0: {} + + node-exports-info@1.6.0: + dependencies: + array.prototype.flatmap: 1.3.3 + es-errors: 1.3.0 + object.entries: 1.1.9 + semver: 6.3.1 + + node-fetch@3.3.2: + dependencies: + data-uri-to-buffer: 4.0.1 + fetch-blob: 3.2.0 + formdata-polyfill: 4.0.10 + + node-releases@2.0.36: {} + + npm-run-path@4.0.1: + dependencies: + path-key: 3.1.1 + + npm-run-path@6.0.0: + dependencies: + path-key: 4.0.0 + unicorn-magic: 0.3.0 + + object-assign@4.1.1: {} + + object-inspect@1.13.4: {} + + object-keys@1.1.1: {} + + object-treeify@1.1.33: {} + + object.assign@4.1.7: + dependencies: + call-bind: 1.0.8 + call-bound: 1.0.4 + define-properties: 1.2.1 + es-object-atoms: 1.1.1 + has-symbols: 1.1.0 + object-keys: 1.1.1 + + object.entries@1.1.9: + dependencies: + call-bind: 1.0.8 + call-bound: 1.0.4 + define-properties: 1.2.1 + es-object-atoms: 1.1.1 + + object.fromentries@2.0.8: + dependencies: + call-bind: 1.0.8 + define-properties: 1.2.1 + es-abstract: 1.24.1 + es-object-atoms: 1.1.1 + + object.groupby@1.0.3: + dependencies: + call-bind: 1.0.8 + define-properties: 1.2.1 + es-abstract: 1.24.1 + + object.values@1.2.1: + dependencies: + call-bind: 1.0.8 + call-bound: 1.0.4 + define-properties: 1.2.1 + es-object-atoms: 1.1.1 + + on-finished@2.4.1: + dependencies: + ee-first: 1.1.1 + + once@1.4.0: + dependencies: + wrappy: 1.0.2 + + onetime@5.1.2: + dependencies: + mimic-fn: 2.1.0 + + onetime@7.0.0: + dependencies: + mimic-function: 5.0.1 + + open@11.0.0: + dependencies: + default-browser: 5.5.0 + define-lazy-prop: 3.0.0 + is-in-ssh: 1.0.0 + is-inside-container: 1.0.0 + powershell-utils: 0.1.0 + wsl-utils: 0.3.1 + + optionator@0.9.4: + dependencies: + 
deep-is: 0.1.4 + fast-levenshtein: 2.0.6 + levn: 0.4.1 + prelude-ls: 1.2.1 + type-check: 0.4.0 + word-wrap: 1.2.5 + + ora@8.2.0: + dependencies: + chalk: 5.6.2 + cli-cursor: 5.0.0 + cli-spinners: 2.9.2 + is-interactive: 2.0.0 + is-unicode-supported: 2.1.0 + log-symbols: 6.0.0 + stdin-discarder: 0.2.2 + string-width: 7.2.0 + strip-ansi: 7.2.0 + + outvariant@1.4.3: {} + + own-keys@1.0.1: + dependencies: + get-intrinsic: 1.3.0 + object-keys: 1.1.1 + safe-push-apply: 1.0.0 + + p-limit@3.1.0: + dependencies: + yocto-queue: 0.1.0 + + p-locate@5.0.0: + dependencies: + p-limit: 3.1.0 + + parent-module@1.0.1: + dependencies: + callsites: 3.1.0 + + parse-json@5.2.0: + dependencies: + '@babel/code-frame': 7.29.0 + error-ex: 1.3.4 + json-parse-even-better-errors: 2.3.1 + lines-and-columns: 1.2.4 + + parse-ms@4.0.0: {} + + parseurl@1.3.3: {} + + path-browserify@1.0.1: {} + + path-exists@4.0.0: {} + + path-key@3.1.1: {} + + path-key@4.0.0: {} + + path-parse@1.0.7: {} + + path-to-regexp@6.3.0: {} + + path-to-regexp@8.3.0: {} + + picocolors@1.1.1: {} + + picomatch@2.3.1: {} + + picomatch@4.0.3: {} + + pkce-challenge@5.0.1: {} + + possible-typed-array-names@1.1.0: {} + + postcss-selector-parser@7.1.1: + dependencies: + cssesc: 3.0.0 + util-deprecate: 1.0.2 + + postcss@8.4.31: + dependencies: + nanoid: 3.3.11 + picocolors: 1.1.1 + source-map-js: 1.2.1 + + postcss@8.5.8: + dependencies: + nanoid: 3.3.11 + picocolors: 1.1.1 + source-map-js: 1.2.1 + + powershell-utils@0.1.0: {} + + prelude-ls@1.2.1: {} + + pretty-ms@9.3.0: + dependencies: + parse-ms: 4.0.0 + + prompts@2.4.2: + dependencies: + kleur: 3.0.3 + sisteransi: 1.0.5 + + prop-types@15.8.1: + dependencies: + loose-envify: 1.4.0 + object-assign: 4.1.1 + react-is: 16.13.1 + + proxy-addr@2.0.7: + dependencies: + forwarded: 0.2.0 + ipaddr.js: 1.9.1 + + punycode@2.3.1: {} + + qs@6.15.0: + dependencies: + side-channel: 1.1.0 + + queue-microtask@1.2.3: {} + + range-parser@1.2.1: {} + + raw-body@3.0.2: + dependencies: + bytes: 3.1.2 + http-errors: 2.0.1 + iconv-lite: 0.7.2 + unpipe: 1.0.0 + + react-dom@19.2.3(react@19.2.3): + dependencies: + react: 19.2.3 + scheduler: 0.27.0 + + react-is@16.13.1: {} + + react@19.2.3: {} + + recast@0.23.11: + dependencies: + ast-types: 0.16.1 + esprima: 4.0.1 + source-map: 0.6.1 + tiny-invariant: 1.3.3 + tslib: 2.8.1 + + reflect.getprototypeof@1.0.10: + dependencies: + call-bind: 1.0.8 + define-properties: 1.2.1 + es-abstract: 1.24.1 + es-errors: 1.3.0 + es-object-atoms: 1.1.1 + get-intrinsic: 1.3.0 + get-proto: 1.0.1 + which-builtin-type: 1.2.1 + + regexp.prototype.flags@1.5.4: + dependencies: + call-bind: 1.0.8 + define-properties: 1.2.1 + es-errors: 1.3.0 + get-proto: 1.0.1 + gopd: 1.2.0 + set-function-name: 2.0.2 + + require-directory@2.1.1: {} + + require-from-string@2.0.2: {} + + reselect@5.1.1: {} + + resolve-from@4.0.0: {} + + resolve-pkg-maps@1.0.0: {} + + resolve@1.22.11: + dependencies: + is-core-module: 2.16.1 + path-parse: 1.0.7 + supports-preserve-symlinks-flag: 1.0.0 + + resolve@2.0.0-next.6: + dependencies: + es-errors: 1.3.0 + is-core-module: 2.16.1 + node-exports-info: 1.6.0 + object-keys: 1.1.1 + path-parse: 1.0.7 + supports-preserve-symlinks-flag: 1.0.0 + + restore-cursor@5.1.0: + dependencies: + onetime: 7.0.0 + signal-exit: 4.1.0 + + rettime@0.10.1: {} + + reusify@1.1.0: {} + + router@2.2.0: + dependencies: + debug: 4.4.3 + depd: 2.0.0 + is-promise: 4.0.0 + parseurl: 1.3.3 + path-to-regexp: 8.3.0 + transitivePeerDependencies: + - supports-color + + run-applescript@7.1.0: {} + + run-parallel@1.2.0: + 
dependencies: + queue-microtask: 1.2.3 + + safe-array-concat@1.1.3: + dependencies: + call-bind: 1.0.8 + call-bound: 1.0.4 + get-intrinsic: 1.3.0 + has-symbols: 1.1.0 + isarray: 2.0.5 + + safe-push-apply@1.0.0: + dependencies: + es-errors: 1.3.0 + isarray: 2.0.5 + + safe-regex-test@1.1.0: + dependencies: + call-bound: 1.0.4 + es-errors: 1.3.0 + is-regex: 1.2.1 + + safer-buffer@2.1.2: {} + + scheduler@0.27.0: {} + + semver@6.3.1: {} + + semver@7.7.4: {} + + send@1.2.1: + dependencies: + debug: 4.4.3 + encodeurl: 2.0.0 + escape-html: 1.0.3 + etag: 1.8.1 + fresh: 2.0.0 + http-errors: 2.0.1 + mime-types: 3.0.2 + ms: 2.1.3 + on-finished: 2.4.1 + range-parser: 1.2.1 + statuses: 2.0.2 + transitivePeerDependencies: + - supports-color + + serve-static@2.2.1: + dependencies: + encodeurl: 2.0.0 + escape-html: 1.0.3 + parseurl: 1.3.3 + send: 1.2.1 + transitivePeerDependencies: + - supports-color + + set-function-length@1.2.2: + dependencies: + define-data-property: 1.1.4 + es-errors: 1.3.0 + function-bind: 1.1.2 + get-intrinsic: 1.3.0 + gopd: 1.2.0 + has-property-descriptors: 1.0.2 + + set-function-name@2.0.2: + dependencies: + define-data-property: 1.1.4 + es-errors: 1.3.0 + functions-have-names: 1.2.3 + has-property-descriptors: 1.0.2 + + set-proto@1.0.0: + dependencies: + dunder-proto: 1.0.1 + es-errors: 1.3.0 + es-object-atoms: 1.1.1 + + setprototypeof@1.2.0: {} + + shadcn@4.1.0(@types/node@20.19.37)(typescript@5.9.3): + dependencies: + '@babel/core': 7.29.0 + '@babel/parser': 7.29.2 + '@babel/plugin-transform-typescript': 7.28.6(@babel/core@7.29.0) + '@babel/preset-typescript': 7.28.5(@babel/core@7.29.0) + '@dotenvx/dotenvx': 1.57.1 + '@modelcontextprotocol/sdk': 1.27.1(zod@3.25.76) + '@types/validate-npm-package-name': 4.0.2 + browserslist: 4.28.1 + commander: 14.0.3 + cosmiconfig: 9.0.1(typescript@5.9.3) + dedent: 1.7.2 + deepmerge: 4.3.1 + diff: 8.0.3 + execa: 9.6.1 + fast-glob: 3.3.3 + fs-extra: 11.3.4 + fuzzysort: 3.1.0 + https-proxy-agent: 7.0.6 + kleur: 4.1.5 + msw: 2.12.14(@types/node@20.19.37)(typescript@5.9.3) + node-fetch: 3.3.2 + open: 11.0.0 + ora: 8.2.0 + postcss: 8.5.8 + postcss-selector-parser: 7.1.1 + prompts: 2.4.2 + recast: 0.23.11 + stringify-object: 5.0.0 + tailwind-merge: 3.5.0 + ts-morph: 26.0.0 + tsconfig-paths: 4.2.0 + validate-npm-package-name: 7.0.2 + zod: 3.25.76 + zod-to-json-schema: 3.25.1(zod@3.25.76) + transitivePeerDependencies: + - '@cfworker/json-schema' + - '@types/node' + - babel-plugin-macros + - supports-color + - typescript + + sharp@0.34.5: + dependencies: + '@img/colour': 1.1.0 + detect-libc: 2.1.2 + semver: 7.7.4 + optionalDependencies: + '@img/sharp-darwin-arm64': 0.34.5 + '@img/sharp-darwin-x64': 0.34.5 + '@img/sharp-libvips-darwin-arm64': 1.2.4 + '@img/sharp-libvips-darwin-x64': 1.2.4 + '@img/sharp-libvips-linux-arm': 1.2.4 + '@img/sharp-libvips-linux-arm64': 1.2.4 + '@img/sharp-libvips-linux-ppc64': 1.2.4 + '@img/sharp-libvips-linux-riscv64': 1.2.4 + '@img/sharp-libvips-linux-s390x': 1.2.4 + '@img/sharp-libvips-linux-x64': 1.2.4 + '@img/sharp-libvips-linuxmusl-arm64': 1.2.4 + '@img/sharp-libvips-linuxmusl-x64': 1.2.4 + '@img/sharp-linux-arm': 0.34.5 + '@img/sharp-linux-arm64': 0.34.5 + '@img/sharp-linux-ppc64': 0.34.5 + '@img/sharp-linux-riscv64': 0.34.5 + '@img/sharp-linux-s390x': 0.34.5 + '@img/sharp-linux-x64': 0.34.5 + '@img/sharp-linuxmusl-arm64': 0.34.5 + '@img/sharp-linuxmusl-x64': 0.34.5 + '@img/sharp-wasm32': 0.34.5 + '@img/sharp-win32-arm64': 0.34.5 + '@img/sharp-win32-ia32': 0.34.5 + '@img/sharp-win32-x64': 0.34.5 + optional: true + + 
shebang-command@2.0.0: + dependencies: + shebang-regex: 3.0.0 + + shebang-regex@3.0.0: {} + + side-channel-list@1.0.0: + dependencies: + es-errors: 1.3.0 + object-inspect: 1.13.4 + + side-channel-map@1.0.1: + dependencies: + call-bound: 1.0.4 + es-errors: 1.3.0 + get-intrinsic: 1.3.0 + object-inspect: 1.13.4 + + side-channel-weakmap@1.0.2: + dependencies: + call-bound: 1.0.4 + es-errors: 1.3.0 + get-intrinsic: 1.3.0 + object-inspect: 1.13.4 + side-channel-map: 1.0.1 + + side-channel@1.1.0: + dependencies: + es-errors: 1.3.0 + object-inspect: 1.13.4 + side-channel-list: 1.0.0 + side-channel-map: 1.0.1 + side-channel-weakmap: 1.0.2 + + signal-exit@3.0.7: {} + + signal-exit@4.1.0: {} + + sisteransi@1.0.5: {} + + source-map-js@1.2.1: {} + + source-map@0.6.1: {} + + stable-hash@0.0.5: {} + + statuses@2.0.2: {} + + stdin-discarder@0.2.2: {} + + stop-iteration-iterator@1.1.0: + dependencies: + es-errors: 1.3.0 + internal-slot: 1.1.0 + + strict-event-emitter@0.5.1: {} + + string-width@4.2.3: + dependencies: + emoji-regex: 8.0.0 + is-fullwidth-code-point: 3.0.0 + strip-ansi: 6.0.1 + + string-width@7.2.0: + dependencies: + emoji-regex: 10.6.0 + get-east-asian-width: 1.5.0 + strip-ansi: 7.2.0 + + string.prototype.includes@2.0.1: + dependencies: + call-bind: 1.0.8 + define-properties: 1.2.1 + es-abstract: 1.24.1 + + string.prototype.matchall@4.0.12: + dependencies: + call-bind: 1.0.8 + call-bound: 1.0.4 + define-properties: 1.2.1 + es-abstract: 1.24.1 + es-errors: 1.3.0 + es-object-atoms: 1.1.1 + get-intrinsic: 1.3.0 + gopd: 1.2.0 + has-symbols: 1.1.0 + internal-slot: 1.1.0 + regexp.prototype.flags: 1.5.4 + set-function-name: 2.0.2 + side-channel: 1.1.0 + + string.prototype.repeat@1.0.0: + dependencies: + define-properties: 1.2.1 + es-abstract: 1.24.1 + + string.prototype.trim@1.2.10: + dependencies: + call-bind: 1.0.8 + call-bound: 1.0.4 + define-data-property: 1.1.4 + define-properties: 1.2.1 + es-abstract: 1.24.1 + es-object-atoms: 1.1.1 + has-property-descriptors: 1.0.2 + + string.prototype.trimend@1.0.9: + dependencies: + call-bind: 1.0.8 + call-bound: 1.0.4 + define-properties: 1.2.1 + es-object-atoms: 1.1.1 + + string.prototype.trimstart@1.0.8: + dependencies: + call-bind: 1.0.8 + define-properties: 1.2.1 + es-object-atoms: 1.1.1 + + stringify-object@5.0.0: + dependencies: + get-own-enumerable-keys: 1.0.0 + is-obj: 3.0.0 + is-regexp: 3.1.0 + + strip-ansi@6.0.1: + dependencies: + ansi-regex: 5.0.1 + + strip-ansi@7.2.0: + dependencies: + ansi-regex: 6.2.2 + + strip-bom@3.0.0: {} + + strip-final-newline@2.0.0: {} + + strip-final-newline@4.0.0: {} + + strip-json-comments@3.1.1: {} + + styled-jsx@5.1.6(@babel/core@7.29.0)(react@19.2.3): + dependencies: + client-only: 0.0.1 + react: 19.2.3 + optionalDependencies: + '@babel/core': 7.29.0 + + supports-color@7.2.0: + dependencies: + has-flag: 4.0.0 + + supports-preserve-symlinks-flag@1.0.0: {} + + tabbable@6.4.0: {} + + tagged-tag@1.0.0: {} + + tailwind-merge@3.5.0: {} + + tailwindcss@4.2.2: {} + + tapable@2.3.0: {} + + tiny-invariant@1.3.3: {} + + tinyglobby@0.2.15: + dependencies: + fdir: 6.5.0(picomatch@4.0.3) + picomatch: 4.0.3 + + tldts-core@7.0.27: {} + + tldts@7.0.27: + dependencies: + tldts-core: 7.0.27 + + to-regex-range@5.0.1: + dependencies: + is-number: 7.0.0 + + toidentifier@1.0.1: {} + + tough-cookie@6.0.1: + dependencies: + tldts: 7.0.27 + + ts-api-utils@2.5.0(typescript@5.9.3): + dependencies: + typescript: 5.9.3 + + ts-morph@26.0.0: + dependencies: + '@ts-morph/common': 0.27.0 + code-block-writer: 13.0.3 + + tsconfig-paths@3.15.0: + 
dependencies: + '@types/json5': 0.0.29 + json5: 1.0.2 + minimist: 1.2.8 + strip-bom: 3.0.0 + + tsconfig-paths@4.2.0: + dependencies: + json5: 2.2.3 + minimist: 1.2.8 + strip-bom: 3.0.0 + + tslib@2.8.1: {} + + tw-animate-css@1.4.0: {} + + type-check@0.4.0: + dependencies: + prelude-ls: 1.2.1 + + type-fest@5.5.0: + dependencies: + tagged-tag: 1.0.0 + + type-is@2.0.1: + dependencies: + content-type: 1.0.5 + media-typer: 1.1.0 + mime-types: 3.0.2 + + typed-array-buffer@1.0.3: + dependencies: + call-bound: 1.0.4 + es-errors: 1.3.0 + is-typed-array: 1.1.15 + + typed-array-byte-length@1.0.3: + dependencies: + call-bind: 1.0.8 + for-each: 0.3.5 + gopd: 1.2.0 + has-proto: 1.2.0 + is-typed-array: 1.1.15 + + typed-array-byte-offset@1.0.4: + dependencies: + available-typed-arrays: 1.0.7 + call-bind: 1.0.8 + for-each: 0.3.5 + gopd: 1.2.0 + has-proto: 1.2.0 + is-typed-array: 1.1.15 + reflect.getprototypeof: 1.0.10 + + typed-array-length@1.0.7: + dependencies: + call-bind: 1.0.8 + for-each: 0.3.5 + gopd: 1.2.0 + is-typed-array: 1.1.15 + possible-typed-array-names: 1.1.0 + reflect.getprototypeof: 1.0.10 + + typescript-eslint@8.57.1(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3): + dependencies: + '@typescript-eslint/eslint-plugin': 8.57.1(@typescript-eslint/parser@8.57.1(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3))(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3) + '@typescript-eslint/parser': 8.57.1(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3) + '@typescript-eslint/typescript-estree': 8.57.1(typescript@5.9.3) + '@typescript-eslint/utils': 8.57.1(eslint@9.39.4(jiti@2.6.1))(typescript@5.9.3) + eslint: 9.39.4(jiti@2.6.1) + typescript: 5.9.3 + transitivePeerDependencies: + - supports-color + + typescript@5.9.3: {} + + unbox-primitive@1.1.0: + dependencies: + call-bound: 1.0.4 + has-bigints: 1.1.0 + has-symbols: 1.1.0 + which-boxed-primitive: 1.1.1 + + undici-types@6.21.0: {} + + unicorn-magic@0.3.0: {} + + universalify@2.0.1: {} + + unpipe@1.0.0: {} + + unrs-resolver@1.11.1: + dependencies: + napi-postinstall: 0.3.4 + optionalDependencies: + '@unrs/resolver-binding-android-arm-eabi': 1.11.1 + '@unrs/resolver-binding-android-arm64': 1.11.1 + '@unrs/resolver-binding-darwin-arm64': 1.11.1 + '@unrs/resolver-binding-darwin-x64': 1.11.1 + '@unrs/resolver-binding-freebsd-x64': 1.11.1 + '@unrs/resolver-binding-linux-arm-gnueabihf': 1.11.1 + '@unrs/resolver-binding-linux-arm-musleabihf': 1.11.1 + '@unrs/resolver-binding-linux-arm64-gnu': 1.11.1 + '@unrs/resolver-binding-linux-arm64-musl': 1.11.1 + '@unrs/resolver-binding-linux-ppc64-gnu': 1.11.1 + '@unrs/resolver-binding-linux-riscv64-gnu': 1.11.1 + '@unrs/resolver-binding-linux-riscv64-musl': 1.11.1 + '@unrs/resolver-binding-linux-s390x-gnu': 1.11.1 + '@unrs/resolver-binding-linux-x64-gnu': 1.11.1 + '@unrs/resolver-binding-linux-x64-musl': 1.11.1 + '@unrs/resolver-binding-wasm32-wasi': 1.11.1 + '@unrs/resolver-binding-win32-arm64-msvc': 1.11.1 + '@unrs/resolver-binding-win32-ia32-msvc': 1.11.1 + '@unrs/resolver-binding-win32-x64-msvc': 1.11.1 + + until-async@3.0.2: {} + + update-browserslist-db@1.2.3(browserslist@4.28.1): + dependencies: + browserslist: 4.28.1 + escalade: 3.2.0 + picocolors: 1.1.1 + + uri-js@4.4.1: + dependencies: + punycode: 2.3.1 + + use-sync-external-store@1.6.0(react@19.2.3): + dependencies: + react: 19.2.3 + + util-deprecate@1.0.2: {} + + validate-npm-package-name@7.0.2: {} + + vary@1.1.2: {} + + web-streams-polyfill@3.3.3: {} + + which-boxed-primitive@1.1.1: + dependencies: + is-bigint: 1.1.0 + is-boolean-object: 1.2.2 + is-number-object: 1.1.1 + 
      is-string: 1.1.1
+      is-symbol: 1.1.1
+
+  which-builtin-type@1.2.1:
+    dependencies:
+      call-bound: 1.0.4
+      function.prototype.name: 1.1.8
+      has-tostringtag: 1.0.2
+      is-async-function: 2.1.1
+      is-date-object: 1.1.0
+      is-finalizationregistry: 1.1.1
+      is-generator-function: 1.1.2
+      is-regex: 1.2.1
+      is-weakref: 1.1.1
+      isarray: 2.0.5
+      which-boxed-primitive: 1.1.1
+      which-collection: 1.0.2
+      which-typed-array: 1.1.20
+
+  which-collection@1.0.2:
+    dependencies:
+      is-map: 2.0.3
+      is-set: 2.0.3
+      is-weakmap: 2.0.2
+      is-weakset: 2.0.4
+
+  which-typed-array@1.1.20:
+    dependencies:
+      available-typed-arrays: 1.0.7
+      call-bind: 1.0.8
+      call-bound: 1.0.4
+      for-each: 0.3.5
+      get-proto: 1.0.1
+      gopd: 1.2.0
+      has-tostringtag: 1.0.2
+
+  which@2.0.2:
+    dependencies:
+      isexe: 2.0.0
+
+  which@4.0.0:
+    dependencies:
+      isexe: 3.1.5
+
+  word-wrap@1.2.5: {}
+
+  wrap-ansi@6.2.0:
+    dependencies:
+      ansi-styles: 4.3.0
+      string-width: 4.2.3
+      strip-ansi: 6.0.1
+
+  wrap-ansi@7.0.0:
+    dependencies:
+      ansi-styles: 4.3.0
+      string-width: 4.2.3
+      strip-ansi: 6.0.1
+
+  wrappy@1.0.2: {}
+
+  wsl-utils@0.3.1:
+    dependencies:
+      is-wsl: 3.1.1
+      powershell-utils: 0.1.0
+
+  y18n@5.0.8: {}
+
+  yallist@3.1.1: {}
+
+  yargs-parser@21.1.1: {}
+
+  yargs@17.7.2:
+    dependencies:
+      cliui: 8.0.1
+      escalade: 3.2.0
+      get-caller-file: 2.0.5
+      require-directory: 2.1.1
+      string-width: 4.2.3
+      y18n: 5.0.8
+      yargs-parser: 21.1.1
+
+  yocto-queue@0.1.0: {}
+
+  yoctocolors-cjs@2.1.3: {}
+
+  yoctocolors@2.1.2: {}
+
+  zod-to-json-schema@3.25.1(zod@3.25.76):
+    dependencies:
+      zod: 3.25.76
+
+  zod-validation-error@4.0.2(zod@4.3.6):
+    dependencies:
+      zod: 4.3.6
+
+  zod@3.25.76: {}
+
+  zod@4.3.6: {}
diff --git a/frontend/pnpm-workspace.yaml b/frontend/pnpm-workspace.yaml
new file mode 100644
index 0000000..6dd36f6
--- /dev/null
+++ b/frontend/pnpm-workspace.yaml
@@ -0,0 +1,7 @@
+packages:
+  - "apps/*"
+  - "packages/*"
+
+ignoredBuiltDependencies:
+  - sharp
+  - unrs-resolver
diff --git a/infrastructure/docker-compose.dev.yml b/infrastructure/docker-compose.dev.yml
new file mode 100644
index 0000000..07d6bda
--- /dev/null
+++ b/infrastructure/docker-compose.dev.yml
@@ -0,0 +1,86 @@
+services:
+  postgres:
+    image: postgres:16
+    environment:
+      POSTGRES_USER: ${POSTGRES_USER:-cmdb}
+      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD:-cmdb}
+      POSTGRES_DB: ${POSTGRES_DB:-cmdb}
+      POSTGRES_LOG_STATEMENT: all
+    ports:
+      - "5432:5432"
+    volumes:
+      - postgres-data:/var/lib/postgresql/data
+      - ./docker/init-databases.sh:/docker-entrypoint-initdb.d/init-databases.sh
+    healthcheck:
+      test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER:-cmdb}"]
+      interval: 10s
+      timeout: 5s
+      retries: 5
+    networks:
+      - cmdb-network
+
+  kafka:
+    image: apache/kafka:3.9.0
+    environment:
+      KAFKA_NODE_ID: 1
+      KAFKA_PROCESS_ROLES: broker,controller
+      KAFKA_CONTROLLER_QUORUM_VOTERS: 1@kafka:9093
+      KAFKA_LISTENERS: PLAINTEXT://:9092,CONTROLLER://:9093
+      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092
+      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: CONTROLLER:PLAINTEXT,PLAINTEXT:PLAINTEXT
+      KAFKA_CONTROLLER_LISTENER_NAMES: CONTROLLER
+      KAFKA_AUTO_CREATE_TOPICS_ENABLE: "true"
+      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
+      CLUSTER_ID: "cmdb-dev-cluster-001"
+    ports:
+      - "9092:9092"
+    volumes:
+      - kafka-data:/tmp/kraft-combined-logs
+    healthcheck:
+      test: ["CMD-SHELL", "/opt/kafka/bin/kafka-broker-api-versions.sh --bootstrap-server localhost:9092 || exit 1"]
+      interval: 10s
+      timeout: 10s
+      retries: 5
+    networks:
+      - cmdb-network
+
+  redis:
+    image: redis:7
+    ports:
+      - "6379:6379"
+    volumes:
+      - redis-data:/data
+    healthcheck:
+      test: ["CMD", "redis-cli", "ping"]
+      interval: 10s
+      timeout: 5s
+      retries: 5
+    networks:
+      - cmdb-network
+
+  nginx:
+    image: nginx:latest
+    volumes:
+      - ./nginx/nginx.conf:/etc/nginx/nginx.conf:ro
+      - ./nginx/ssl:/etc/nginx/ssl:ro
+    ports:
+      - "80:80"
+      - "443:443"
+    depends_on:
+      postgres:
+        condition: service_healthy
+      kafka:
+        condition: service_healthy
+      redis:
+        condition: service_healthy
+    networks:
+      - cmdb-network
+
+volumes:
+  postgres-data:
+  kafka-data:
+  redis-data:
+
+networks:
+  cmdb-network:
+    external: true
diff --git a/infrastructure/docker-compose.yml b/infrastructure/docker-compose.yml
new file mode 100644
index 0000000..2515112
--- /dev/null
+++ b/infrastructure/docker-compose.yml
@@ -0,0 +1,74 @@
+services:
+  postgres:
+    image: postgres:16
+    environment:
+      POSTGRES_USER: ${POSTGRES_USER:-cmdb}
+      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD:-cmdb}
+      POSTGRES_DB: ${POSTGRES_DB:-cmdb}
+    volumes:
+      - postgres-data:/var/lib/postgresql/data
+      - ./docker/init-databases.sh:/docker-entrypoint-initdb.d/init-databases.sh
+    healthcheck:
+      test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER:-cmdb}"]
+      interval: 10s
+      timeout: 5s
+      retries: 5
+    networks:
+      - cmdb-network
+
+  kafka:
+    image: bitnami/kafka:latest
+    environment:
+      KAFKA_CFG_NODE_ID: 1
+      KAFKA_CFG_PROCESS_ROLES: broker,controller
+      KAFKA_CFG_CONTROLLER_QUORUM_VOTERS: 1@kafka:9093
+      KAFKA_CFG_LISTENERS: PLAINTEXT://:9092,CONTROLLER://:9093
+      KAFKA_CFG_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092
+      KAFKA_CFG_LISTENER_SECURITY_PROTOCOL_MAP: CONTROLLER:PLAINTEXT,PLAINTEXT:PLAINTEXT
+      KAFKA_CFG_CONTROLLER_LISTENER_NAMES: CONTROLLER
+      KAFKA_CFG_AUTO_CREATE_TOPICS_ENABLE: "true"
+    volumes:
+      - kafka-data:/bitnami/kafka
+    healthcheck:
+      test: ["CMD-SHELL", "kafka-broker-api-versions.sh --bootstrap-server localhost:9092"]
+      interval: 10s
+      timeout: 10s
+      retries: 5
+    networks:
+      - cmdb-network
+
+  redis:
+    image: redis:7
+    volumes:
+      - redis-data:/data
+    healthcheck:
+      test: ["CMD", "redis-cli", "ping"]
+      interval: 10s
+      timeout: 5s
+      retries: 5
+    networks:
+      - cmdb-network
+
+  nginx:
+    image: nginx:latest
+    volumes:
+      - ./nginx/nginx.conf:/etc/nginx/nginx.conf:ro
+      - ./nginx/ssl:/etc/nginx/ssl:ro
+    ports:
+      - "80:80"
+      - "443:443"
+    depends_on:
+      - postgres
+      - kafka
+      - redis
+    networks:
+      - cmdb-network
+
+volumes:
+  postgres-data:
+  kafka-data:
+  redis-data:
+
+networks:
+  cmdb-network:
+    driver: bridge
diff --git a/infrastructure/docker/Dockerfile.base b/infrastructure/docker/Dockerfile.base
new file mode 100644
index 0000000..e490fce
--- /dev/null
+++ b/infrastructure/docker/Dockerfile.base
@@ -0,0 +1,11 @@
+FROM python:3.13-slim AS base
+COPY --from=ghcr.io/astral-sh/uv:latest /uv /usr/local/bin/uv
+WORKDIR /app
+COPY pyproject.toml uv.lock ./
+COPY shared/ ./shared/
+
+FROM base AS prod
+RUN uv sync --frozen --no-dev --package cmdb-shared
+
+FROM base AS dev
+RUN uv sync --frozen --package cmdb-shared
diff --git a/infrastructure/docker/init-databases.sh b/infrastructure/docker/init-databases.sh
new file mode 100755
index 0000000..8c93351
--- /dev/null
+++ b/infrastructure/docker/init-databases.sh
@@ -0,0 +1,14 @@
+#!/bin/bash
+set -e
+
+databases=("cmdb_ipam" "cmdb_auth" "cmdb_tenant" "cmdb_event" "cmdb_webhook")
+
+for db in "${databases[@]}"; do
+  echo "Creating database: $db"
+  psql -v ON_ERROR_STOP=1 --username "$POSTGRES_USER" --dbname "$POSTGRES_DB" <<-EOSQL
+    SELECT 'CREATE DATABASE $db' WHERE NOT EXISTS (SELECT FROM pg_database WHERE datname = '$db')\gexec
+    GRANT ALL PRIVILEGES ON DATABASE $db TO $POSTGRES_USER;
+EOSQL
+done
+
+echo "All databases created successfully."
diff --git a/infrastructure/nginx/nginx.conf b/infrastructure/nginx/nginx.conf
new file mode 100644
index 0000000..9e07258
--- /dev/null
+++ b/infrastructure/nginx/nginx.conf
@@ -0,0 +1,333 @@
+worker_processes auto;
+
+events {
+    worker_connections 1024;
+}
+
+http {
+    # --- Logging ---
+    log_format json_log escape=json
+        '{"time":"$time_iso8601",'
+        '"remote_addr":"$remote_addr",'
+        '"method":"$request_method",'
+        '"uri":"$request_uri",'
+        '"status":$status,'
+        '"body_bytes_sent":$body_bytes_sent,'
+        '"request_time":$request_time,'
+        '"upstream_response_time":"$upstream_response_time",'
+        '"correlation_id":"$http_x_correlation_id",'
+        '"user_agent":"$http_user_agent",'
+        '"referer":"$http_referer"}';
+
+    access_log /var/log/nginx/access.log json_log;
+    error_log /var/log/nginx/error.log warn;
+
+    # --- Rate Limiting ---
+    limit_req_zone $binary_remote_addr zone=api_limit:10m rate=30r/s;
+    limit_req_status 429;
+
+    # --- CORS ---
+    map $http_origin $cors_origin {
+        default "";
+        ~^https?://localhost(:\d+)?$ $http_origin;
+        ~^https://.*\.cmdb\.io$ $http_origin;
+    }
+
+    # --- DNS resolver for dynamic upstreams ---
+    resolver 127.0.0.11 valid=10s ipv6=off;
+
+    # --- Upstreams ---
+    upstream frontend {
+        server frontend:3000;
+    }
+
+    upstream auth {
+        server auth:8000;
+    }
+
+    upstream tenant {
+        server tenant:8000;
+    }
+
+    upstream ipam {
+        server ipam:8000;
+    }
+
+    upstream event {
+        server event:8000;
+    }
+
+    upstream webhook {
+        server webhook:8000;
+    }
+
+    # --- HTTP server (dev) ---
+    server {
+        listen 80;
+
+        # Timeouts
+        proxy_connect_timeout 10s;
+        proxy_read_timeout 30s;
+        proxy_send_timeout 30s;
+
+        # Common proxy headers
+        proxy_set_header Host $host;
+        proxy_set_header X-Real-IP $remote_addr;
+        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
+        proxy_set_header X-Forwarded-Proto $scheme;
+
+        # CORS preflight for /api/ paths not claimed by a location below.
+        # NOTE: nginx prefers regex locations over plain prefix matches, so
+        # the prefix API locations below use ^~ to avoid being shadowed by
+        # this block's "return 404"; their OPTIONS preflights are passed to
+        # the upstream services, which are assumed to handle CORS themselves.
+        location ~* ^/api/.*$ {
+            if ($request_method = OPTIONS) {
+                add_header Access-Control-Allow-Origin $cors_origin always;
+                add_header Access-Control-Allow-Methods "GET, POST, PUT, PATCH, DELETE, OPTIONS" always;
+                add_header Access-Control-Allow-Headers "Authorization, Content-Type, X-Correlation-ID, X-Tenant-ID" always;
+                add_header Access-Control-Max-Age 86400;
+                add_header Content-Length 0;
+                return 204;
+            }
+            return 404;
+        }
+
+        # Public endpoints
+        location = /api/v1/auth/login {
+            limit_req zone=api_limit burst=10 nodelay;
+            add_header Access-Control-Allow-Origin $cors_origin always;
+            proxy_pass http://auth/auth/login;
+        }
+        location = /api/v1/auth/register {
+            limit_req zone=api_limit burst=10 nodelay;
+            add_header Access-Control-Allow-Origin $cors_origin always;
+            proxy_pass http://auth/auth/register;
+        }
+        location = /api/v1/auth/refresh {
+            limit_req zone=api_limit burst=10 nodelay;
+            add_header Access-Control-Allow-Origin $cors_origin always;
+            proxy_pass http://auth/auth/refresh;
+        }
+        location ^~ /api/v1/setup/ {
+            add_header Access-Control-Allow-Origin $cors_origin always;
+            proxy_pass http://tenant/setup/;
+        }
+        location ^~ /api/v1/tenant/ {
+            add_header Access-Control-Allow-Origin $cors_origin always;
+            proxy_pass http://tenant/;
+        }
+
+        # IPAM
+        location ^~ /api/v1/ {
+            add_header Access-Control-Allow-Origin $cors_origin always;
+            proxy_pass http://ipam/api/v1/;
+        }
+
+        # Event
+        location ^~ /api/v1/event/ {
+            add_header Access-Control-Allow-Origin $cors_origin always;
+            proxy_pass http://event/;
+        }
+
+        # Frontend
+        location / {
+            proxy_pass http://frontend;
+            proxy_set_header Upgrade $http_upgrade;
+            proxy_set_header Connection "upgrade";
+        }
+    }
+
+    # --- Main HTTPS server ---
+    server {
+        listen 443 ssl;
+
+        ssl_certificate /etc/nginx/ssl/cert.pem;
+        ssl_certificate_key /etc/nginx/ssl/key.pem;
+        ssl_protocols TLSv1.2 TLSv1.3;
+        ssl_ciphers HIGH:!aNULL:!MD5;
+
+        # Timeouts
+        proxy_connect_timeout 10s;
+        proxy_read_timeout 30s;
+        proxy_send_timeout 30s;
+
+        # Common proxy headers
+        proxy_set_header Host $host;
+        proxy_set_header X-Real-IP $remote_addr;
+        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
+        proxy_set_header X-Forwarded-Proto $scheme;
+
+        # =====================================================================
+        # Internal: auth_request target
+        # =====================================================================
+        location = /internal/auth/validate {
+            internal;
+            proxy_pass http://auth/auth/validate;
+            proxy_pass_request_body off;
+            proxy_set_header Content-Length "";
+            proxy_set_header X-Original-URI $request_uri;
+            proxy_set_header X-Original-Method $request_method;
+        }
+
+        # =====================================================================
+        # Public endpoints (no JWT required)
+        # =====================================================================
+        location = /api/v1/auth/login {
+            limit_req zone=api_limit burst=10 nodelay;
+            add_header Access-Control-Allow-Origin $cors_origin always;
+            proxy_pass http://auth/auth/login;
+        }
+
+        location = /api/v1/auth/register {
+            limit_req zone=api_limit burst=10 nodelay;
+            add_header Access-Control-Allow-Origin $cors_origin always;
+            proxy_pass http://auth/auth/register;
+        }
+
+        location = /api/v1/auth/refresh {
+            limit_req zone=api_limit burst=10 nodelay;
+            add_header Access-Control-Allow-Origin $cors_origin always;
+            proxy_pass http://auth/auth/refresh;
+        }
+
+        location /api/v1/auth/.well-known/ {
+            add_header Access-Control-Allow-Origin $cors_origin always;
+            proxy_pass http://auth/auth/.well-known/;
+        }
+
+        # Setup endpoints (public, no auth)
+        location /api/v1/setup/ {
+            add_header Access-Control-Allow-Origin $cors_origin always;
+            proxy_pass http://tenant/setup/;
+        }
+
+        # Tenant list (public, needed for login tenant selector)
+        location = /api/v1/tenant/tenants {
+            add_header Access-Control-Allow-Origin $cors_origin always;
+            proxy_pass http://tenant/tenants;
+        }
+
+        # =====================================================================
+        # Protected endpoints (JWT required via auth_request)
+        # =====================================================================
+
+        # Auth service (logout, users, roles, api-tokens, permissions)
+        location /api/v1/auth/ {
+            limit_req zone=api_limit burst=50 nodelay;
+            auth_request /internal/auth/validate;
+            auth_request_set $auth_user_id $upstream_http_x_user_id;
+            auth_request_set $auth_tenant_id $upstream_http_x_tenant_id;
+
+            proxy_set_header X-User-ID $auth_user_id;
+            proxy_set_header X-Tenant-ID $auth_tenant_id;
+            add_header Access-Control-Allow-Origin $cors_origin always;
+            proxy_pass http://auth/;
+        }
+
+        location /api/v1/users/ {
+            limit_req zone=api_limit burst=50 nodelay;
+            auth_request /internal/auth/validate;
+            auth_request_set $auth_user_id $upstream_http_x_user_id;
+            auth_request_set $auth_tenant_id $upstream_http_x_tenant_id;
+
+            proxy_set_header X-User-ID $auth_user_id;
+            proxy_set_header X-Tenant-ID $auth_tenant_id;
+            add_header Access-Control-Allow-Origin $cors_origin always;
+            proxy_pass http://auth/users/;
+        }
+
+        location /api/v1/roles/ {
+            limit_req zone=api_limit burst=50 nodelay;
+            auth_request /internal/auth/validate;
+            auth_request_set $auth_user_id $upstream_http_x_user_id;
+            auth_request_set $auth_tenant_id $upstream_http_x_tenant_id;
+
+            proxy_set_header X-User-ID $auth_user_id;
+            proxy_set_header X-Tenant-ID $auth_tenant_id;
+            add_header Access-Control-Allow-Origin $cors_origin always;
+            proxy_pass http://auth/roles/;
+        }
+
+        location /api/v1/api-tokens/ {
+            limit_req zone=api_limit burst=50 nodelay;
+            auth_request /internal/auth/validate;
+            auth_request_set $auth_user_id $upstream_http_x_user_id;
+            auth_request_set $auth_tenant_id $upstream_http_x_tenant_id;
+
+            proxy_set_header X-User-ID $auth_user_id;
+            proxy_set_header X-Tenant-ID $auth_tenant_id;
+            add_header Access-Control-Allow-Origin $cors_origin always;
+            proxy_pass http://auth/api-tokens/;
+        }
+
+        location /api/v1/permissions/ {
+            limit_req zone=api_limit burst=50 nodelay;
+            auth_request /internal/auth/validate;
+            auth_request_set $auth_user_id $upstream_http_x_user_id;
+            auth_request_set $auth_tenant_id $upstream_http_x_tenant_id;
+
+            proxy_set_header X-User-ID $auth_user_id;
+            proxy_set_header X-Tenant-ID $auth_tenant_id;
+            add_header Access-Control-Allow-Origin $cors_origin always;
+            proxy_pass http://auth/permissions/;
+        }
+
+        # Tenant service
+        location /api/v1/tenant/ {
+            limit_req zone=api_limit burst=50 nodelay;
+            auth_request /internal/auth/validate;
+            auth_request_set $auth_user_id $upstream_http_x_user_id;
+            auth_request_set $auth_tenant_id $upstream_http_x_tenant_id;
+
+            proxy_set_header X-User-ID $auth_user_id;
+            proxy_set_header X-Tenant-ID $auth_tenant_id;
+            add_header Access-Control-Allow-Origin $cors_origin always;
+            proxy_pass http://tenant/;
+        }
+
+        # IPAM service (catch-all for /api/v1/ not matched above)
+        location /api/v1/ {
+            limit_req zone=api_limit burst=50 nodelay;
+            auth_request /internal/auth/validate;
+            auth_request_set $auth_user_id $upstream_http_x_user_id;
+            auth_request_set $auth_tenant_id $upstream_http_x_tenant_id;
+
+            proxy_set_header X-User-ID $auth_user_id;
+            proxy_set_header X-Tenant-ID $auth_tenant_id;
+            add_header Access-Control-Allow-Origin $cors_origin always;
+            proxy_pass http://ipam/api/v1/;
+        }
+
+        # Event service
+        location /api/v1/event/ {
+            limit_req zone=api_limit burst=50 nodelay;
+            auth_request /internal/auth/validate;
+            auth_request_set $auth_user_id $upstream_http_x_user_id;
+            auth_request_set $auth_tenant_id $upstream_http_x_tenant_id;
+
+            proxy_set_header X-User-ID $auth_user_id;
+            proxy_set_header X-Tenant-ID $auth_tenant_id;
+            add_header Access-Control-Allow-Origin $cors_origin always;
+            proxy_pass http://event/;
+        }
+
+        # Webhook service
+        location /api/v1/webhook/ {
+            limit_req zone=api_limit burst=50 nodelay;
+            auth_request /internal/auth/validate;
+            auth_request_set $auth_user_id $upstream_http_x_user_id;
+            auth_request_set $auth_tenant_id $upstream_http_x_tenant_id;
+
+            proxy_set_header X-User-ID $auth_user_id;
+            proxy_set_header X-Tenant-ID $auth_tenant_id;
+            add_header Access-Control-Allow-Origin $cors_origin always;
+            proxy_pass http://webhook/;
+        }
+
+        # =====================================================================
+        # Frontend (Next.js)
+        # =====================================================================
+        location / {
+            proxy_pass http://frontend;
+            proxy_set_header Upgrade $http_upgrade;
+            proxy_set_header Connection "upgrade";
+        }
+    }
+}
diff --git a/pyproject.toml b/pyproject.toml
new file mode 100644
index 0000000..5161e99
--- /dev/null
+++ b/pyproject.toml
@@ -0,0 +1,23 @@
+[project]
+name = "cmdb"
+version = "0.1.0"
+description = "CMDB Platform"
+requires-python = ">=3.13"
+
+[tool.uv.workspace]
+members = ["services/*", "shared"]
+
+[dependency-groups]
+dev = [
+    "ruff",
+    "pytest",
+    "pytest-asyncio",
+    "pre-commit",
+    "pytest-httpx",
+    "fakeredis",
+    "testcontainers[postgresql,kafka,redis]",
+]
diff --git a/ruff.toml b/ruff.toml
new file mode 100644
index 0000000..fc60aeb
--- /dev/null
+++ b/ruff.toml
@@ -0,0 +1,8 @@
+target-version = "py313"
+line-length = 120
+
+[lint]
+select = ["E", "F", "I", "N", "W", "UP", "B", "A", "SIM"]
+# N802 lives here rather than in pyproject.toml: a root ruff.toml takes
+# precedence, so a [tool.ruff.lint] section in pyproject.toml would be ignored.
+ignore = ["N802"]
+
+[format]
+quote-style = "double"
diff --git a/services/auth/Dockerfile b/services/auth/Dockerfile
new file mode 100644
index 0000000..b79d7cf
--- /dev/null
+++ b/services/auth/Dockerfile
@@ -0,0 +1,5 @@
+FROM cmdb-base:prod
+COPY services/auth/ ./services/auth/
+RUN uv sync --frozen --no-dev --package cmdb-auth
+EXPOSE 8000
+CMD ["uv", "run", "--package", "cmdb-auth", "uvicorn", "auth.interface.main:app", "--host", "0.0.0.0", "--port", "8000"]
diff --git a/services/auth/Dockerfile.dev b/services/auth/Dockerfile.dev
new file mode 100644
index 0000000..cea5b57
--- /dev/null
+++ b/services/auth/Dockerfile.dev
@@ -0,0 +1,8 @@
+FROM cmdb-base:dev
+COPY services/auth/pyproject.toml ./services/auth/pyproject.toml
+COPY services/auth/alembic.ini ./services/auth/alembic.ini
+COPY services/auth/alembic ./services/auth/alembic
+COPY services/auth/src ./services/auth/src
+RUN uv sync --frozen --package cmdb-auth
+EXPOSE 8000
+CMD ["uv", "run", "--package", "cmdb-auth", "uvicorn", "auth.interface.main:app", "--host", "0.0.0.0", "--port", "8000", "--reload"]
diff --git a/services/auth/alembic.ini b/services/auth/alembic.ini
new file mode 100644
index 0000000..b5713f0
--- /dev/null
+++ b/services/auth/alembic.ini
@@ -0,0 +1,35 @@
+[alembic]
+script_location = alembic
+sqlalchemy.url = postgresql+asyncpg://cmdb:cmdb@postgres:5432/cmdb_auth
+
+[loggers]
+keys = root,sqlalchemy,alembic
+
+[handlers]
+keys = console
+
+[formatters]
+keys = generic
+
+[logger_root]
+level = WARN
+handlers = console
+
+[logger_sqlalchemy]
+level = WARN
+handlers =
+qualname = sqlalchemy.engine
+
+[logger_alembic]
+level = INFO
+handlers =
+qualname = alembic
+
+[handler_console]
+class = StreamHandler
+args = (sys.stderr,)
+level = NOTSET
+formatter = generic
+
+[formatter_generic]
+format = %(levelname)-5.5s [%(name)s] %(message)s
diff --git a/services/auth/alembic/env.py b/services/auth/alembic/env.py
new file mode 100644
index 0000000..6c008d8
--- /dev/null
+++ b/services/auth/alembic/env.py
@@ -0,0 +1,61 @@
+import asyncio
+import os
+from logging.config import fileConfig
+
+from alembic import context
+from auth.infrastructure.models import AuthBase
+from sqlalchemy import pool
+from sqlalchemy.ext.asyncio import async_engine_from_config
+
+config = context.config
+
+# Override DB URL from environment variable if set
+db_url = os.environ.get("DATABASE_URL")
+if db_url:
+    config.set_main_option("sqlalchemy.url", db_url)
+
+if config.config_file_name is not None:
+    fileConfig(config.config_file_name)
+
+target_metadata = AuthBase.metadata
+
+
+def run_migrations_offline() -> None:
+    url = config.get_main_option("sqlalchemy.url")
+    context.configure(
+        url=url,
+        target_metadata=target_metadata,
+        literal_binds=True,
+    )
+    with context.begin_transaction():
+        context.run_migrations()
+
+
+def do_run_migrations(connection) -> None:
+    context.configure(
+        connection=connection,
+        target_metadata=target_metadata,
+    )
+    with context.begin_transaction():
+        context.run_migrations()
+
+
+async def run_async_migrations() -> None:
run_async_migrations() -> None: + connectable = async_engine_from_config( + config.get_section(config.config_ini_section, {}), + prefix="sqlalchemy.", + poolclass=pool.NullPool, + ) + async with connectable.connect() as connection: + await connection.run_sync(do_run_migrations) + await connectable.dispose() + + +def run_migrations_online() -> None: + asyncio.run(run_async_migrations()) + + +if context.is_offline_mode(): + run_migrations_offline() +else: + run_migrations_online() diff --git a/services/auth/alembic/versions/001_create_auth_tables.py b/services/auth/alembic/versions/001_create_auth_tables.py new file mode 100644 index 0000000..8cec8fe --- /dev/null +++ b/services/auth/alembic/versions/001_create_auth_tables.py @@ -0,0 +1,116 @@ +"""create auth tables + +Revision ID: 001 +Revises: +Create Date: 2026-03-19 +""" + +import sqlalchemy as sa +from alembic import op +from sqlalchemy.dialects import postgresql + +revision = "001" +down_revision = None +branch_labels = None +depends_on = None + + +def upgrade() -> None: + # Users + op.create_table( + "users", + sa.Column("id", sa.Uuid(), nullable=False), + sa.Column("email", sa.String(255), nullable=False), + sa.Column("password_hash", sa.String(255), nullable=False), + sa.Column("tenant_id", sa.Uuid(), nullable=False), + sa.Column("status", sa.String(20), server_default="active", nullable=False), + sa.Column("display_name", sa.String(255), nullable=True), + sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.Column("updated_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.PrimaryKeyConstraint("id"), + sa.UniqueConstraint("email", "tenant_id", name="uq_user_email_tenant"), + ) + op.create_index("ix_users_tenant_id", "users", ["tenant_id"]) + + # Roles + op.create_table( + "roles", + sa.Column("id", sa.Uuid(), nullable=False), + sa.Column("name", sa.String(255), nullable=False), + sa.Column("tenant_id", sa.Uuid(), nullable=False), + sa.Column("description", sa.String(1024), nullable=True), + sa.Column("permissions", postgresql.JSONB(), server_default="[]", nullable=False), + sa.Column("is_system", sa.Boolean(), server_default="false", nullable=False), + sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.Column("updated_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.PrimaryKeyConstraint("id"), + sa.UniqueConstraint("name", "tenant_id", name="uq_role_name_tenant"), + ) + op.create_index("ix_roles_tenant_id", "roles", ["tenant_id"]) + + # Groups + op.create_table( + "groups", + sa.Column("id", sa.Uuid(), nullable=False), + sa.Column("name", sa.String(255), nullable=False), + sa.Column("tenant_id", sa.Uuid(), nullable=False), + sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.Column("updated_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.PrimaryKeyConstraint("id"), + sa.UniqueConstraint("name", "tenant_id", name="uq_group_name_tenant"), + ) + op.create_index("ix_groups_tenant_id", "groups", ["tenant_id"]) + + # User-Role association + op.create_table( + "user_roles", + sa.Column("user_id", sa.Uuid(), sa.ForeignKey("users.id", ondelete="CASCADE"), nullable=False), + sa.Column("role_id", sa.Uuid(), sa.ForeignKey("roles.id", ondelete="CASCADE"), nullable=False), + sa.PrimaryKeyConstraint("user_id", "role_id"), + ) + + # User-Group association + op.create_table( + 
"user_groups", + sa.Column("user_id", sa.Uuid(), sa.ForeignKey("users.id", ondelete="CASCADE"), nullable=False), + sa.Column("group_id", sa.Uuid(), sa.ForeignKey("groups.id", ondelete="CASCADE"), nullable=False), + sa.PrimaryKeyConstraint("user_id", "group_id"), + ) + + # Group-Role association + op.create_table( + "group_roles", + sa.Column("group_id", sa.Uuid(), sa.ForeignKey("groups.id", ondelete="CASCADE"), nullable=False), + sa.Column("role_id", sa.Uuid(), sa.ForeignKey("roles.id", ondelete="CASCADE"), nullable=False), + sa.PrimaryKeyConstraint("group_id", "role_id"), + ) + + # API Tokens + op.create_table( + "api_tokens", + sa.Column("id", sa.Uuid(), nullable=False), + sa.Column("user_id", sa.Uuid(), sa.ForeignKey("users.id", ondelete="CASCADE"), nullable=False), + sa.Column("tenant_id", sa.Uuid(), nullable=False), + sa.Column("key_hash", sa.String(255), nullable=False), + sa.Column("description", sa.String(1024), nullable=True), + sa.Column("scopes", postgresql.JSONB(), server_default="[]", nullable=False), + sa.Column("expires_at", sa.DateTime(timezone=True), nullable=True), + sa.Column("allowed_ips", postgresql.JSONB(), server_default="[]", nullable=False), + sa.Column("last_used_at", sa.DateTime(timezone=True), nullable=True), + sa.Column("is_revoked", sa.Boolean(), server_default="false", nullable=False), + sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.PrimaryKeyConstraint("id"), + ) + op.create_index("ix_api_tokens_key_hash", "api_tokens", ["key_hash"], unique=True) + op.create_index("ix_api_tokens_user_id", "api_tokens", ["user_id"]) + op.create_index("ix_api_tokens_tenant_id", "api_tokens", ["tenant_id"]) + + +def downgrade() -> None: + op.drop_table("api_tokens") + op.drop_table("group_roles") + op.drop_table("user_groups") + op.drop_table("user_roles") + op.drop_table("groups") + op.drop_table("roles") + op.drop_table("users") diff --git a/services/auth/docker-compose.dev.yml b/services/auth/docker-compose.dev.yml new file mode 100644 index 0000000..66616ff --- /dev/null +++ b/services/auth/docker-compose.dev.yml @@ -0,0 +1,28 @@ +services: + auth: + build: + context: ../../ + dockerfile: services/auth/Dockerfile.dev + volumes: + - ../../services/auth/src:/app/services/auth/src + ports: + - "8002:8000" + environment: + DATABASE_URL: postgresql+asyncpg://${POSTGRES_USER:-cmdb}:${POSTGRES_PASSWORD:-cmdb}@postgres:5432/cmdb_auth + KAFKA_BOOTSTRAP_SERVERS: kafka:9092 + REDIS_URL: redis://redis:6379 + RSA_PRIVATE_KEY: ${RSA_PRIVATE_KEY:-} + RSA_PUBLIC_KEY: ${RSA_PUBLIC_KEY:-} + depends_on: + postgres: + condition: service_healthy + kafka: + condition: service_healthy + redis: + condition: service_healthy + networks: + - cmdb-network + +networks: + cmdb-network: + external: true diff --git a/services/auth/docker-compose.yml b/services/auth/docker-compose.yml new file mode 100644 index 0000000..87c9f7c --- /dev/null +++ b/services/auth/docker-compose.yml @@ -0,0 +1,20 @@ +services: + auth: + build: + context: ../../ + dockerfile: services/auth/Dockerfile + environment: + DATABASE_URL: postgresql+asyncpg://${POSTGRES_USER:-cmdb}:${POSTGRES_PASSWORD:-cmdb}@postgres:5432/cmdb_auth + KAFKA_BOOTSTRAP_SERVERS: kafka:9092 + REDIS_URL: redis://redis:6379 + RSA_PRIVATE_KEY: ${RSA_PRIVATE_KEY:-} + RSA_PUBLIC_KEY: ${RSA_PUBLIC_KEY:-} + depends_on: + postgres: + condition: service_healthy + kafka: + condition: service_healthy + redis: + condition: service_healthy + networks: + - cmdb-network diff --git a/services/auth/pyproject.toml 
b/services/auth/pyproject.toml new file mode 100644 index 0000000..3b18c2b --- /dev/null +++ b/services/auth/pyproject.toml @@ -0,0 +1,34 @@ +[project] +name = "cmdb-auth" +version = "0.1.0" +description = "CMDB auth Service" +requires-python = ">=3.13" +dependencies = [ + "cmdb-shared", + "fastapi>=0.115", + "uvicorn", + "sqlalchemy[asyncio]>=2.0", + "asyncpg", + "alembic", + "pydantic-settings>=2.0", + "bcrypt>=4.0", + "PyJWT>=2.0", + "cryptography>=42.0", + "redis>=5.0", +] + +[build-system] +requires = ["hatchling"] +build-backend = "hatchling.build" + +[tool.hatch.build.targets.wheel] +packages = ["src/auth"] + +[tool.uv.sources] +cmdb-shared = { workspace = true } + +[tool.pytest.ini_options] +testpaths = ["tests"] +pythonpath = ["src"] +asyncio_mode = "auto" +markers = ["integration: requires Docker infrastructure (PostgreSQL, Kafka, Redis)"] diff --git a/services/auth/src/auth/__init__.py b/services/auth/src/auth/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/services/auth/src/auth/application/__init__.py b/services/auth/src/auth/application/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/services/auth/src/auth/application/command_handlers.py b/services/auth/src/auth/application/command_handlers.py new file mode 100644 index 0000000..1f191f9 --- /dev/null +++ b/services/auth/src/auth/application/command_handlers.py @@ -0,0 +1,388 @@ +import hashlib +import secrets +from uuid import UUID + +from shared.cqrs.command import Command, CommandHandler +from shared.domain.exceptions import ( + AuthorizationError, + BusinessRuleViolationError, + ConflictError, + EntityNotFoundError, +) +from shared.messaging.producer import KafkaEventProducer + +from auth.application.dto import APITokenDTO, AuthTokenDTO +from auth.domain.api_token import APIToken +from auth.domain.permission import Permission +from auth.domain.repository import APITokenRepository, RoleRepository, UserRepository +from auth.domain.role import Role +from auth.domain.user import User +from auth.infrastructure.login_rate_limiter import LoginRateLimiter +from auth.infrastructure.security import BcryptPasswordService, JWTService +from auth.infrastructure.token_blacklist import RedisTokenBlacklist + + +class RegisterUserHandler(CommandHandler[UUID]): + def __init__( + self, + repository: UserRepository, + password_service: BcryptPasswordService, + event_producer: KafkaEventProducer, + ) -> None: + self._repository = repository + self._password_service = password_service + self._event_producer = event_producer + + async def handle(self, command: Command) -> UUID: + existing = await self._repository.find_by_email(command.email, command.tenant_id) + if existing is not None: + raise ConflictError(f"User with email '{command.email}' already exists") + + password_hash = await self._password_service.hash_async(command.password) + + user = User.create( + email=command.email, + password_hash=password_hash, + tenant_id=command.tenant_id, + display_name=command.display_name, + ) + + await self._repository.save(user) + + for event in user.collect_events(): + await self._event_producer.publish("auth.events", event) + + return user.id + + +class LoginHandler(CommandHandler[AuthTokenDTO]): + def __init__( + self, + repository: UserRepository, + role_repository: RoleRepository, + password_service: BcryptPasswordService, + jwt_service: JWTService, + rate_limiter: LoginRateLimiter, + ) -> None: + self._repository = repository + self._role_repository = role_repository + self._password_service = 
password_service + self._jwt_service = jwt_service + self._rate_limiter = rate_limiter + + async def handle(self, command: Command) -> AuthTokenDTO: + if await self._rate_limiter.is_locked(command.email, command.client_ip): + raise AuthorizationError("Too many login attempts. Please try again later.") + + user = await self._repository.find_by_email(command.email, command.tenant_id) + if user is None: + await self._rate_limiter.record_failure(command.email, command.client_ip) + raise AuthorizationError("Invalid email or password") + + if not await self._password_service.verify_async(command.password, user.password_hash): + await self._rate_limiter.record_failure(command.email, command.client_ip) + raise AuthorizationError("Invalid email or password") + + if user.status != "active": + raise AuthorizationError(f"User account is {user.status}") + + await self._rate_limiter.reset(command.email, command.client_ip) + + roles = await self._role_repository.find_by_ids(user.role_ids) + role_names = [r.name for r in roles] + + access_token = self._jwt_service.create_access_token( + user_id=user.id, + tenant_id=user.tenant_id, + roles=role_names, + ) + refresh_token = self._jwt_service.create_refresh_token( + user_id=user.id, + tenant_id=user.tenant_id, + ) + + return AuthTokenDTO( + access_token=access_token, + refresh_token=refresh_token, + expires_in=self._jwt_service.access_expire_minutes * 60, + ) + + +class RefreshTokenHandler(CommandHandler[AuthTokenDTO]): + def __init__( + self, + repository: UserRepository, + role_repository: RoleRepository, + jwt_service: JWTService, + token_blacklist: RedisTokenBlacklist, + ) -> None: + self._repository = repository + self._role_repository = role_repository + self._jwt_service = jwt_service + self._token_blacklist = token_blacklist + + async def handle(self, command: Command) -> AuthTokenDTO: + try: + payload = self._jwt_service.decode_token(command.refresh_token) + except Exception as exc: + raise AuthorizationError("Invalid refresh token") from exc + + if payload.get("type") != "refresh": + raise AuthorizationError("Invalid token type") + + jti = payload.get("jti") + if jti and await self._token_blacklist.is_blacklisted(jti): + raise AuthorizationError("Token has been revoked") + + user_id = UUID(payload["sub"]) + tenant_id = UUID(payload["tenant_id"]) + + user = await self._repository.find_by_id(user_id) + if user is None or user.status != "active": + raise AuthorizationError("User not found or inactive") + + roles = await self._role_repository.find_by_ids(user.role_ids) + role_names = [r.name for r in roles] + + access_token = self._jwt_service.create_access_token( + user_id=user_id, + tenant_id=tenant_id, + roles=role_names, + ) + + return AuthTokenDTO( + access_token=access_token, + refresh_token=command.refresh_token, + expires_in=self._jwt_service.access_expire_minutes * 60, + ) + + +class LogoutHandler(CommandHandler[None]): + def __init__( + self, + jwt_service: JWTService, + token_blacklist: RedisTokenBlacklist, + ) -> None: + self._jwt_service = jwt_service + self._token_blacklist = token_blacklist + + async def handle(self, command: Command) -> None: + try: + payload = self._jwt_service.decode_token(command.refresh_token) + except Exception: + return # Already invalid, nothing to do + + jti = payload.get("jti") + if jti: + import time + + exp = payload.get("exp", 0) + remaining = max(int(exp - time.time()), 0) + if remaining > 0: + await self._token_blacklist.blacklist(jti, remaining) + + +class ChangePasswordHandler(CommandHandler[None]): + 
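"""Verify the user's current password, then hash and store the new one.""" +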
def __init__( + self, + repository: UserRepository, + password_service: BcryptPasswordService, + ) -> None: + self._repository = repository + self._password_service = password_service + + async def handle(self, command: Command) -> None: + user = await self._repository.find_by_id(command.user_id) + if user is None: + raise EntityNotFoundError(f"User {command.user_id} not found") + + if not await self._password_service.verify_async(command.old_password, user.password_hash): + raise AuthorizationError("Current password is incorrect") + + new_hash = await self._password_service.hash_async(command.new_password) + user.change_password(new_hash) + await self._repository.save(user) + + +class AssignRoleHandler(CommandHandler[None]): + def __init__( + self, + repository: UserRepository, + role_repository: RoleRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._repository = repository + self._role_repository = role_repository + self._event_producer = event_producer + + async def handle(self, command: Command) -> None: + user = await self._repository.find_by_id(command.user_id) + if user is None: + raise EntityNotFoundError(f"User {command.user_id} not found") + + role = await self._role_repository.find_by_id(command.role_id) + if role is None: + raise EntityNotFoundError(f"Role {command.role_id} not found") + + user.assign_role(command.role_id) + await self._repository.save(user) + + for event in user.collect_events(): + await self._event_producer.publish("auth.events", event) + + +class RemoveRoleHandler(CommandHandler[None]): + def __init__( + self, + repository: UserRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._repository = repository + self._event_producer = event_producer + + async def handle(self, command: Command) -> None: + user = await self._repository.find_by_id(command.user_id) + if user is None: + raise EntityNotFoundError(f"User {command.user_id} not found") + + user.remove_role(command.role_id) + await self._repository.save(user) + + for event in user.collect_events(): + await self._event_producer.publish("auth.events", event) + + +class CreateRoleHandler(CommandHandler[UUID]): + def __init__( + self, + repository: RoleRepository, + ) -> None: + self._repository = repository + + async def handle(self, command: Command) -> UUID: + existing = await self._repository.find_by_name(command.name, command.tenant_id) + if existing is not None: + raise ConflictError(f"Role '{command.name}' already exists") + + permissions = [] + if command.permissions: + permissions = [Permission(**p) for p in command.permissions] + + role = Role.create( + name=command.name, + tenant_id=command.tenant_id, + description=command.description, + permissions=permissions, + ) + + await self._repository.save(role) + return role.id + + +class UpdateRoleHandler(CommandHandler[None]): + def __init__( + self, + repository: RoleRepository, + ) -> None: + self._repository = repository + + async def handle(self, command: Command) -> None: + role = await self._repository.find_by_id(command.role_id) + if role is None: + raise EntityNotFoundError(f"Role {command.role_id} not found") + + if role.is_system: + raise BusinessRuleViolationError("Cannot modify system role") + + if command.name is not None: + role.name = command.name + if command.description is not None: + role.description = command.description + if command.permissions is not None: + role.permissions = [Permission(**p) for p in command.permissions] + + from datetime import UTC, datetime + + role.updated_at = datetime.now(UTC) + 
await self._repository.save(role) + + +class DeleteRoleHandler(CommandHandler[None]): + def __init__( + self, + repository: RoleRepository, + ) -> None: + self._repository = repository + + async def handle(self, command: Command) -> None: + role = await self._repository.find_by_id(command.role_id) + if role is None: + raise EntityNotFoundError(f"Role {command.role_id} not found") + + if role.is_system: + raise BusinessRuleViolationError("Cannot delete system role") + + await self._repository.delete(command.role_id) + + +class CreateAPITokenHandler(CommandHandler[APITokenDTO]): + def __init__( + self, + repository: APITokenRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._repository = repository + self._event_producer = event_producer + + async def handle(self, command: Command) -> APITokenDTO: + raw_key = secrets.token_urlsafe(48) + key_hash = hashlib.sha256(raw_key.encode()).hexdigest() + + token = APIToken.create( + user_id=command.user_id, + tenant_id=command.tenant_id, + key_hash=key_hash, + description=command.description, + scopes=command.scopes, + expires_at=command.expires_at, + allowed_ips=command.allowed_ips, + ) + + await self._repository.save(token) + + for event in token.collect_events(): + await self._event_producer.publish("auth.events", event) + + return APITokenDTO( + id=token.id, + user_id=token.user_id, + tenant_id=token.tenant_id, + description=token.description, + scopes=token.scopes, + expires_at=token.expires_at, + allowed_ips=token.allowed_ips, + is_revoked=token.is_revoked, + created_at=token.created_at, + key=raw_key, + ) + + +class RevokeAPITokenHandler(CommandHandler[None]): + def __init__( + self, + repository: APITokenRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._repository = repository + self._event_producer = event_producer + + async def handle(self, command: Command) -> None: + token = await self._repository.find_by_id(command.token_id) + if token is None: + raise EntityNotFoundError(f"API Token {command.token_id} not found") + + token.revoke() + await self._repository.save(token) + + for event in token.collect_events(): + await self._event_producer.publish("auth.events", event) diff --git a/services/auth/src/auth/application/commands.py b/services/auth/src/auth/application/commands.py new file mode 100644 index 0000000..77eebca --- /dev/null +++ b/services/auth/src/auth/application/commands.py @@ -0,0 +1,73 @@ +from datetime import datetime +from uuid import UUID + +from shared.cqrs.command import Command + + +class RegisterUserCommand(Command): + email: str + password: str + tenant_id: UUID + display_name: str | None = None + + +class LoginCommand(Command): + email: str + password: str + tenant_id: UUID + client_ip: str = "0.0.0.0" + + +class RefreshTokenCommand(Command): + refresh_token: str + + +class LogoutCommand(Command): + refresh_token: str + + +class ChangePasswordCommand(Command): + user_id: UUID + old_password: str + new_password: str + + +class AssignRoleCommand(Command): + user_id: UUID + role_id: UUID + + +class RemoveRoleCommand(Command): + user_id: UUID + role_id: UUID + + +class CreateRoleCommand(Command): + name: str + tenant_id: UUID + description: str | None = None + permissions: list[dict] | None = None + + +class UpdateRoleCommand(Command): + role_id: UUID + name: str | None = None + description: str | None = None + permissions: list[dict] | None = None + + +class DeleteRoleCommand(Command): + role_id: UUID + + +class CreateAPITokenCommand(Command): + user_id: UUID + tenant_id: UUID + 
description: str | None = None + scopes: list[str] | None = None + expires_at: datetime | None = None + allowed_ips: list[str] | None = None + + +class RevokeAPITokenCommand(Command): + token_id: UUID diff --git a/services/auth/src/auth/application/dto.py b/services/auth/src/auth/application/dto.py new file mode 100644 index 0000000..f7ee3e1 --- /dev/null +++ b/services/auth/src/auth/application/dto.py @@ -0,0 +1,62 @@ +from datetime import datetime +from uuid import UUID + +from pydantic import BaseModel + +from auth.domain.user import UserStatus + + +class UserDTO(BaseModel): + id: UUID + email: str + tenant_id: UUID + status: UserStatus + display_name: str | None + role_ids: list[UUID] + group_ids: list[UUID] + created_at: datetime + updated_at: datetime + + +class AuthTokenDTO(BaseModel): + access_token: str + refresh_token: str + token_type: str = "bearer" + expires_in: int + + +class RoleDTO(BaseModel): + id: UUID + name: str + tenant_id: UUID + description: str | None + permissions: list[dict] + is_system: bool + created_at: datetime + updated_at: datetime + + +class GroupDTO(BaseModel): + id: UUID + name: str + tenant_id: UUID + role_ids: list[UUID] + created_at: datetime + updated_at: datetime + + +class APITokenDTO(BaseModel): + id: UUID + user_id: UUID + tenant_id: UUID + description: str | None + scopes: list[str] + expires_at: datetime | None + allowed_ips: list[str] + is_revoked: bool + created_at: datetime + key: str | None = None + + +class PermissionCheckDTO(BaseModel): + allowed: bool diff --git a/services/auth/src/auth/application/queries.py b/services/auth/src/auth/application/queries.py new file mode 100644 index 0000000..2407489 --- /dev/null +++ b/services/auth/src/auth/application/queries.py @@ -0,0 +1,39 @@ +from uuid import UUID + +from shared.cqrs.query import Query + + +class GetUserQuery(Query): + user_id: UUID + + +class ListUsersQuery(Query): + tenant_id: UUID + offset: int = 0 + limit: int = 50 + + +class GetRoleQuery(Query): + role_id: UUID + + +class ListRolesQuery(Query): + tenant_id: UUID + offset: int = 0 + limit: int = 50 + + +class CheckPermissionQuery(Query): + user_id: UUID + object_type: str + action: str + + +class ListAPITokensQuery(Query): + user_id: UUID + offset: int = 0 + limit: int = 50 + + +class ValidateTokenQuery(Query): + token: str diff --git a/services/auth/src/auth/application/query_handlers.py b/services/auth/src/auth/application/query_handlers.py new file mode 100644 index 0000000..6e787b0 --- /dev/null +++ b/services/auth/src/auth/application/query_handlers.py @@ -0,0 +1,191 @@ +from shared.cqrs.query import Query, QueryHandler +from shared.domain.exceptions import AuthorizationError, EntityNotFoundError + +from auth.application.dto import ( + APITokenDTO, + PermissionCheckDTO, + RoleDTO, + UserDTO, +) +from auth.domain.repository import ( + APITokenRepository, + GroupRepository, + RoleRepository, + UserRepository, +) +from auth.domain.services import PermissionChecker +from auth.infrastructure.security import JWTService +from auth.infrastructure.token_blacklist import RedisTokenBlacklist + + +class GetUserHandler(QueryHandler[UserDTO]): + def __init__(self, repository: UserRepository) -> None: + self._repository = repository + + async def handle(self, query: Query) -> UserDTO: + user = await self._repository.find_by_id(query.user_id) + if user is None: + raise EntityNotFoundError(f"User {query.user_id} not found") + return UserDTO( + id=user.id, + email=user.email, + tenant_id=user.tenant_id, + status=user.status, + 
display_name=user.display_name, + role_ids=user.role_ids, + group_ids=user.group_ids, + created_at=user.created_at, + updated_at=user.updated_at, + ) + + +class ListUsersHandler(QueryHandler[tuple[list[UserDTO], int]]): + def __init__(self, repository: UserRepository) -> None: + self._repository = repository + + async def handle(self, query: Query) -> tuple[list[UserDTO], int]: + users, total = await self._repository.find_all( + query.tenant_id, + offset=query.offset, + limit=query.limit, + ) + items = [ + UserDTO( + id=u.id, + email=u.email, + tenant_id=u.tenant_id, + status=u.status, + display_name=u.display_name, + role_ids=u.role_ids, + group_ids=u.group_ids, + created_at=u.created_at, + updated_at=u.updated_at, + ) + for u in users + ] + return items, total + + +class GetRoleHandler(QueryHandler[RoleDTO]): + def __init__(self, repository: RoleRepository) -> None: + self._repository = repository + + async def handle(self, query: Query) -> RoleDTO: + role = await self._repository.find_by_id(query.role_id) + if role is None: + raise EntityNotFoundError(f"Role {query.role_id} not found") + return RoleDTO( + id=role.id, + name=role.name, + tenant_id=role.tenant_id, + description=role.description, + permissions=[p.model_dump() for p in role.permissions], + is_system=role.is_system, + created_at=role.created_at, + updated_at=role.updated_at, + ) + + +class ListRolesHandler(QueryHandler[tuple[list[RoleDTO], int]]): + def __init__(self, repository: RoleRepository) -> None: + self._repository = repository + + async def handle(self, query: Query) -> tuple[list[RoleDTO], int]: + roles, total = await self._repository.find_all( + query.tenant_id, + offset=query.offset, + limit=query.limit, + ) + items = [ + RoleDTO( + id=r.id, + name=r.name, + tenant_id=r.tenant_id, + description=r.description, + permissions=[p.model_dump() for p in r.permissions], + is_system=r.is_system, + created_at=r.created_at, + updated_at=r.updated_at, + ) + for r in roles + ] + return items, total + + +class CheckPermissionHandler(QueryHandler[PermissionCheckDTO]): + def __init__( + self, + user_repository: UserRepository, + role_repository: RoleRepository, + group_repository: GroupRepository, + ) -> None: + self._user_repository = user_repository + self._role_repository = role_repository + self._group_repository = group_repository + self._checker = PermissionChecker() + + async def handle(self, query: Query) -> PermissionCheckDTO: + user = await self._user_repository.find_by_id(query.user_id) + if user is None: + return PermissionCheckDTO(allowed=False) + + roles = await self._role_repository.find_by_ids(user.role_ids) + groups = await self._group_repository.find_by_ids(user.group_ids) + + allowed = self._checker.has_permission( + user=user, + roles=roles, + groups=groups, + object_type=query.object_type, + action=query.action, + ) + return PermissionCheckDTO(allowed=allowed) + + +class ListAPITokensHandler(QueryHandler[tuple[list[APITokenDTO], int]]): + def __init__(self, repository: APITokenRepository) -> None: + self._repository = repository + + async def handle(self, query: Query) -> tuple[list[APITokenDTO], int]: + tokens, total = await self._repository.find_all_by_user( + query.user_id, + offset=query.offset, + limit=query.limit, + ) + items = [ + APITokenDTO( + id=t.id, + user_id=t.user_id, + tenant_id=t.tenant_id, + description=t.description, + scopes=t.scopes, + expires_at=t.expires_at, + allowed_ips=t.allowed_ips, + is_revoked=t.is_revoked, + created_at=t.created_at, + ) + for t in tokens + ] + return items, 
total + + +class ValidateTokenHandler(QueryHandler[dict]): + def __init__( + self, + jwt_service: JWTService, + token_blacklist: RedisTokenBlacklist, + ) -> None: + self._jwt_service = jwt_service + self._token_blacklist = token_blacklist + + async def handle(self, query: Query) -> dict: + try: + payload = self._jwt_service.decode_token(query.token) + except Exception as exc: + raise AuthorizationError("Invalid token") from exc + + jti = payload.get("jti") + if jti and await self._token_blacklist.is_blacklisted(jti): + raise AuthorizationError("Token has been revoked") + + return payload diff --git a/services/auth/src/auth/domain/__init__.py b/services/auth/src/auth/domain/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/services/auth/src/auth/domain/api_token.py b/services/auth/src/auth/domain/api_token.py new file mode 100644 index 0000000..9dc5c89 --- /dev/null +++ b/services/auth/src/auth/domain/api_token.py @@ -0,0 +1,79 @@ +from datetime import UTC, datetime +from typing import Any +from uuid import UUID + +from pydantic import Field +from shared.domain.entity import Entity +from shared.domain.exceptions import BusinessRuleViolationError +from shared.event.domain_event import DomainEvent + +from auth.domain.events import TokenGenerated, TokenRevoked + + +class APIToken(Entity): + user_id: UUID + tenant_id: UUID + key_hash: str + description: str | None = None + scopes: list[str] = Field(default_factory=list) + expires_at: datetime | None = None + allowed_ips: list[str] = Field(default_factory=list) + last_used_at: datetime | None = None + is_revoked: bool = False + + def model_post_init(self, __context: Any) -> None: + object.__setattr__(self, "_pending_events", []) + + def collect_events(self) -> list[DomainEvent]: + events: list[DomainEvent] = list(self._pending_events) + self._pending_events.clear() + return events + + @classmethod + def create( + cls, + *, + user_id: UUID, + tenant_id: UUID, + key_hash: str, + description: str | None = None, + scopes: list[str] | None = None, + expires_at: datetime | None = None, + allowed_ips: list[str] | None = None, + ) -> "APIToken": + token = cls( + user_id=user_id, + tenant_id=tenant_id, + key_hash=key_hash, + description=description, + scopes=scopes or [], + expires_at=expires_at, + allowed_ips=allowed_ips or [], + ) + token._pending_events.append( + TokenGenerated( + aggregate_id=token.id, + version=1, + user_id=user_id, + token_type="api_token", + ) + ) + return token + + def revoke(self) -> None: + if self.is_revoked: + raise BusinessRuleViolationError("Token is already revoked") + self.is_revoked = True + self.updated_at = datetime.now(UTC) + self._pending_events.append(TokenRevoked(aggregate_id=self.id, version=1)) + + def is_expired(self) -> bool: + if self.expires_at is None: + return False + return datetime.now(UTC) >= self.expires_at + + def has_scope(self, scope: str) -> bool: + # An empty scope list means the token is unrestricted. + if not self.scopes: + return True + return scope in self.scopes diff --git a/services/auth/src/auth/domain/events.py b/services/auth/src/auth/domain/events.py new file mode 100644 index 0000000..ef52810 --- /dev/null +++ b/services/auth/src/auth/domain/events.py @@ -0,0 +1,32 @@ +from uuid import UUID + +from shared.event.domain_event import DomainEvent + + +class UserCreated(DomainEvent): + email: str + tenant_id: UUID + + +class UserLocked(DomainEvent): + pass + + +class RoleAssigned(DomainEvent): + user_id: UUID + role_id: UUID + + +class RoleRemoved(DomainEvent): + user_id: UUID + role_id: UUID + + +class TokenGenerated(DomainEvent): + 
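"""Emitted when a credential is issued (e.g. token_type="api_token").""" +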
user_id: UUID + token_type: str + + +class TokenRevoked(DomainEvent): + pass diff --git a/services/auth/src/auth/domain/group.py b/services/auth/src/auth/domain/group.py new file mode 100644 index 0000000..ad7c48a --- /dev/null +++ b/services/auth/src/auth/domain/group.py @@ -0,0 +1,30 @@ +from typing import Any +from uuid import UUID + +from pydantic import Field +from shared.domain.entity import Entity +from shared.event.domain_event import DomainEvent + + +class Group(Entity): + name: str + tenant_id: UUID + role_ids: list[UUID] = Field(default_factory=list) + + def model_post_init(self, __context: Any) -> None: + object.__setattr__(self, "_pending_events", []) + + def collect_events(self) -> list[DomainEvent]: + events: list[DomainEvent] = list(self._pending_events) + self._pending_events.clear() + return events + + @classmethod + def create( + cls, + *, + name: str, + tenant_id: UUID, + role_ids: list[UUID] | None = None, + ) -> "Group": + return cls(name=name, tenant_id=tenant_id, role_ids=role_ids or []) diff --git a/services/auth/src/auth/domain/permission.py b/services/auth/src/auth/domain/permission.py new file mode 100644 index 0000000..94327c9 --- /dev/null +++ b/services/auth/src/auth/domain/permission.py @@ -0,0 +1,15 @@ +from enum import StrEnum + +from shared.domain.value_object import ValueObject + + +class Action(StrEnum): + VIEW = "view" + ADD = "add" + CHANGE = "change" + DELETE = "delete" + + +class Permission(ValueObject): + object_type: str + actions: list[str] diff --git a/services/auth/src/auth/domain/repository.py b/services/auth/src/auth/domain/repository.py new file mode 100644 index 0000000..68c184c --- /dev/null +++ b/services/auth/src/auth/domain/repository.py @@ -0,0 +1,71 @@ +from abc import abstractmethod +from uuid import UUID + +from shared.domain.repository import Repository + +from auth.domain.api_token import APIToken +from auth.domain.group import Group +from auth.domain.role import Role +from auth.domain.user import User + + +class UserRepository(Repository[User]): + @abstractmethod + async def find_by_email(self, email: str, tenant_id: UUID) -> User | None: ... + + @abstractmethod + async def find_all( + self, + tenant_id: UUID, + *, + offset: int = 0, + limit: int = 50, + ) -> tuple[list[User], int]: ... + + +class RoleRepository(Repository[Role]): + @abstractmethod + async def find_by_name(self, name: str, tenant_id: UUID) -> Role | None: ... + + @abstractmethod + async def find_by_ids(self, role_ids: list[UUID]) -> list[Role]: ... + + @abstractmethod + async def find_all( + self, + tenant_id: UUID, + *, + offset: int = 0, + limit: int = 50, + ) -> tuple[list[Role], int]: ... + + +class GroupRepository(Repository[Group]): + @abstractmethod + async def find_by_name(self, name: str, tenant_id: UUID) -> Group | None: ... + + @abstractmethod + async def find_by_ids(self, group_ids: list[UUID]) -> list[Group]: ... + + @abstractmethod + async def find_all( + self, + tenant_id: UUID, + *, + offset: int = 0, + limit: int = 50, + ) -> tuple[list[Group], int]: ... + + +class APITokenRepository(Repository[APIToken]): + @abstractmethod + async def find_by_key_hash(self, key_hash: str) -> APIToken | None: ... + + @abstractmethod + async def find_all_by_user( + self, + user_id: UUID, + *, + offset: int = 0, + limit: int = 50, + ) -> tuple[list[APIToken], int]: ... 
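+# Concrete implementations live in auth.infrastructure (e.g. PostgresUserRepository, PostgresAPITokenRepository).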
diff --git a/services/auth/src/auth/domain/role.py b/services/auth/src/auth/domain/role.py new file mode 100644 index 0000000..9b7fd5a --- /dev/null +++ b/services/auth/src/auth/domain/role.py @@ -0,0 +1,58 @@ +from datetime import UTC, datetime +from typing import Any +from uuid import UUID + +from pydantic import Field +from shared.domain.entity import Entity +from shared.domain.exceptions import BusinessRuleViolationError +from shared.event.domain_event import DomainEvent + +from auth.domain.permission import Permission + + +class Role(Entity): + name: str + tenant_id: UUID + description: str | None = None + permissions: list[Permission] = Field(default_factory=list) + is_system: bool = False + + def model_post_init(self, __context: Any) -> None: + object.__setattr__(self, "_pending_events", []) + + def collect_events(self) -> list[DomainEvent]: + events: list[DomainEvent] = list(self._pending_events) + self._pending_events.clear() + return events + + @classmethod + def create( + cls, + *, + name: str, + tenant_id: UUID, + description: str | None = None, + permissions: list[Permission] | None = None, + ) -> "Role": + return cls( + name=name, + tenant_id=tenant_id, + description=description, + permissions=permissions or [], + ) + + def add_permission(self, permission: Permission) -> None: + for p in self.permissions: + if p.object_type == permission.object_type: + raise BusinessRuleViolationError( + f"Permission for object_type '{permission.object_type}' already exists" + ) + self.permissions.append(permission) + self.updated_at = datetime.now(UTC) + + def remove_permission(self, object_type: str) -> None: + original_len = len(self.permissions) + self.permissions = [p for p in self.permissions if p.object_type != object_type] + if len(self.permissions) == original_len: + raise BusinessRuleViolationError(f"No permission found for object_type '{object_type}'") + self.updated_at = datetime.now(UTC) diff --git a/services/auth/src/auth/domain/services.py b/services/auth/src/auth/domain/services.py new file mode 100644 index 0000000..97adabc --- /dev/null +++ b/services/auth/src/auth/domain/services.py @@ -0,0 +1,38 @@ +from abc import ABC, abstractmethod + +from auth.domain.group import Group +from auth.domain.role import Role +from auth.domain.user import User + + +class PasswordService(ABC): + @abstractmethod + def hash(self, password: str) -> str: ... + + @abstractmethod + def verify(self, password: str, hashed: str) -> bool: ...
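+ +# The production implementation is BcryptPasswordService in auth.infrastructure.security.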
+ + +class PermissionChecker: + def has_permission( + self, + user: User, + roles: list[Role], + groups: list[Group], + object_type: str, + action: str, + ) -> bool: + all_role_ids = set(user.role_ids) + for group in groups: + if group.id in user.group_ids: + all_role_ids.update(group.role_ids) + + for role in roles: + if role.id not in all_role_ids: + continue + for perm in role.permissions: + if perm.object_type == object_type and action in perm.actions: + return True + return False diff --git a/services/auth/src/auth/domain/user.py b/services/auth/src/auth/domain/user.py new file mode 100644 index 0000000..7853b2e --- /dev/null +++ b/services/auth/src/auth/domain/user.py @@ -0,0 +1,93 @@ +from datetime import UTC, datetime +from enum import StrEnum +from typing import Any +from uuid import UUID + +from pydantic import Field +from shared.domain.entity import Entity +from shared.domain.exceptions import BusinessRuleViolationError +from shared.event.domain_event import DomainEvent + +from auth.domain.events import RoleAssigned, RoleRemoved, UserCreated, UserLocked + + +class UserStatus(StrEnum): + ACTIVE = "active" + INACTIVE = "inactive" + LOCKED = "locked" + + +class User(Entity): + email: str + password_hash: str + tenant_id: UUID + status: UserStatus = UserStatus.ACTIVE + display_name: str | None = None + role_ids: list[UUID] = Field(default_factory=list) + group_ids: list[UUID] = Field(default_factory=list) + + def model_post_init(self, __context: Any) -> None: + object.__setattr__(self, "_pending_events", []) + + def collect_events(self) -> list[DomainEvent]: + events: list[DomainEvent] = list(self._pending_events) + self._pending_events.clear() + return events + + @classmethod + def create( + cls, + *, + email: str, + password_hash: str, + tenant_id: UUID, + display_name: str | None = None, + ) -> "User": + user = cls( + email=email, + password_hash=password_hash, + tenant_id=tenant_id, + display_name=display_name, + ) + user._pending_events.append( + UserCreated( + aggregate_id=user.id, + version=1, + email=email, + tenant_id=tenant_id, + ) + ) + return user + + def change_password(self, new_hash: str) -> None: + if self.status == UserStatus.LOCKED: + raise BusinessRuleViolationError("Cannot change password of a locked user") + self.password_hash = new_hash + self.updated_at = datetime.now(UTC) + + def assign_role(self, role_id: UUID) -> None: + if role_id in self.role_ids: + raise BusinessRuleViolationError(f"Role {role_id} is already assigned") + self.role_ids.append(role_id) + self.updated_at = datetime.now(UTC) + self._pending_events.append(RoleAssigned(aggregate_id=self.id, version=1, user_id=self.id, role_id=role_id)) + + def remove_role(self, role_id: UUID) -> None: + if role_id not in self.role_ids: + raise BusinessRuleViolationError(f"Role {role_id} is not assigned") + self.role_ids.remove(role_id) + self.updated_at = datetime.now(UTC) + self._pending_events.append(RoleRemoved(aggregate_id=self.id, version=1, user_id=self.id, role_id=role_id)) + + def lock(self) -> None: + if self.status == UserStatus.LOCKED: + raise BusinessRuleViolationError("User is already locked") + self.status = UserStatus.LOCKED + self.updated_at = datetime.now(UTC) + self._pending_events.append(UserLocked(aggregate_id=self.id, version=1)) + + def activate(self) -> None: + if self.status == UserStatus.ACTIVE: + raise BusinessRuleViolationError("User is already active") + self.status = UserStatus.ACTIVE + self.updated_at = datetime.now(UTC) diff --git a/services/auth/src/auth/infrastructure/__init__.py 
b/services/auth/src/auth/infrastructure/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/services/auth/src/auth/infrastructure/api_token_repository.py b/services/auth/src/auth/infrastructure/api_token_repository.py new file mode 100644 index 0000000..1a45d79 --- /dev/null +++ b/services/auth/src/auth/infrastructure/api_token_repository.py @@ -0,0 +1,88 @@ +from uuid import UUID + +from sqlalchemy import func as sa_func +from sqlalchemy import select +from sqlalchemy.ext.asyncio import AsyncSession + +from auth.domain.api_token import APIToken +from auth.domain.repository import APITokenRepository +from auth.infrastructure.models import APITokenModel + + +class PostgresAPITokenRepository(APITokenRepository): + def __init__(self, session: AsyncSession) -> None: + self._session = session + + async def find_by_id(self, entity_id: UUID) -> APIToken | None: + result = await self._session.get(APITokenModel, entity_id) + return self._to_entity(result) if result else None + + async def find_by_key_hash(self, key_hash: str) -> APIToken | None: + stmt = select(APITokenModel).where(APITokenModel.key_hash == key_hash) + result = await self._session.execute(stmt) + row = result.scalar_one_or_none() + return self._to_entity(row) if row else None + + async def find_all_by_user( + self, + user_id: UUID, + *, + offset: int = 0, + limit: int = 50, + ) -> tuple[list[APIToken], int]: + count_stmt = select(sa_func.count()).select_from(APITokenModel).where(APITokenModel.user_id == user_id) + total = (await self._session.execute(count_stmt)).scalar_one() + + stmt = ( + select(APITokenModel) + .where(APITokenModel.user_id == user_id) + .order_by(APITokenModel.created_at.desc()) + .offset(offset) + .limit(limit) + ) + result = await self._session.execute(stmt) + return [self._to_entity(r) for r in result.scalars().all()], total + + async def save(self, entity: APIToken) -> APIToken: + model = self._to_model(entity) + merged = await self._session.merge(model) + await self._session.commit() + return self._to_entity(merged) + + async def delete(self, entity_id: UUID) -> None: + model = await self._session.get(APITokenModel, entity_id) + if model: + await self._session.delete(model) + await self._session.commit() + + @staticmethod + def _to_entity(model: APITokenModel) -> APIToken: + return APIToken( + id=model.id, + user_id=model.user_id, + tenant_id=model.tenant_id, + key_hash=model.key_hash, + description=model.description, + scopes=model.scopes or [], + expires_at=model.expires_at, + allowed_ips=model.allowed_ips or [], + last_used_at=model.last_used_at, + is_revoked=model.is_revoked, + created_at=model.created_at, + ) + + @staticmethod + def _to_model(entity: APIToken) -> APITokenModel: + return APITokenModel( + id=entity.id, + user_id=entity.user_id, + tenant_id=entity.tenant_id, + key_hash=entity.key_hash, + description=entity.description, + scopes=entity.scopes, + expires_at=entity.expires_at, + allowed_ips=entity.allowed_ips, + last_used_at=entity.last_used_at, + is_revoked=entity.is_revoked, + created_at=entity.created_at, + ) diff --git a/services/auth/src/auth/infrastructure/config.py b/services/auth/src/auth/infrastructure/config.py new file mode 100644 index 0000000..01fe9c5 --- /dev/null +++ b/services/auth/src/auth/infrastructure/config.py @@ -0,0 +1,26 @@ +from pathlib import Path + +from pydantic_settings import BaseSettings + + +class Settings(BaseSettings): + database_url: str = "postgresql+asyncpg://cmdb:cmdb@postgres:5432/cmdb_auth" + + kafka_bootstrap_servers: str = 
"kafka:9092" + redis_url: str = "redis://redis:6379" + + rsa_private_key: str = "" + rsa_public_key: str = "" + rsa_private_key_path: str = "" + rsa_public_key_path: str = "" + jwt_algorithm: str = "RS256" + jwt_access_token_expire_minutes: int = 30 + jwt_refresh_token_expire_days: int = 7 + + bcrypt_rounds: int = 12 + + def model_post_init(self, __context: object) -> None: + if not self.rsa_private_key and self.rsa_private_key_path: + self.rsa_private_key = Path(self.rsa_private_key_path).read_text() + if not self.rsa_public_key and self.rsa_public_key_path: + self.rsa_public_key = Path(self.rsa_public_key_path).read_text() diff --git a/services/auth/src/auth/infrastructure/database.py b/services/auth/src/auth/infrastructure/database.py new file mode 100644 index 0000000..26744e3 --- /dev/null +++ b/services/auth/src/auth/infrastructure/database.py @@ -0,0 +1,26 @@ +from sqlalchemy.ext.asyncio import ( + AsyncEngine, + AsyncSession, + async_sessionmaker, + create_async_engine, +) + + +class Database: + def __init__(self, url: str) -> None: + self._engine: AsyncEngine = create_async_engine(url, echo=False, pool_size=5) + self._session_factory = async_sessionmaker( + self._engine, + class_=AsyncSession, + expire_on_commit=False, + ) + + @property + def engine(self) -> AsyncEngine: + return self._engine + + def session(self) -> AsyncSession: + return self._session_factory() + + async def close(self) -> None: + await self._engine.dispose() diff --git a/services/auth/src/auth/infrastructure/group_repository.py b/services/auth/src/auth/infrastructure/group_repository.py new file mode 100644 index 0000000..2d143f6 --- /dev/null +++ b/services/auth/src/auth/infrastructure/group_repository.py @@ -0,0 +1,94 @@ +from uuid import UUID + +from sqlalchemy import func as sa_func +from sqlalchemy import select +from sqlalchemy.ext.asyncio import AsyncSession + +from auth.domain.group import Group +from auth.domain.repository import GroupRepository +from auth.infrastructure.models import GroupModel, GroupRoleModel + + +class PostgresGroupRepository(GroupRepository): + def __init__(self, session: AsyncSession) -> None: + self._session = session + + async def find_by_id(self, entity_id: UUID) -> Group | None: + result = await self._session.get(GroupModel, entity_id) + return self._to_entity(result) if result else None + + async def find_by_name(self, name: str, tenant_id: UUID) -> Group | None: + stmt = select(GroupModel).where( + GroupModel.name == name, + GroupModel.tenant_id == tenant_id, + ) + result = await self._session.execute(stmt) + row = result.scalar_one_or_none() + return self._to_entity(row) if row else None + + async def find_by_ids(self, group_ids: list[UUID]) -> list[Group]: + if not group_ids: + return [] + stmt = select(GroupModel).where(GroupModel.id.in_(group_ids)) + result = await self._session.execute(stmt) + return [self._to_entity(r) for r in result.scalars().unique().all()] + + async def find_all( + self, + tenant_id: UUID, + *, + offset: int = 0, + limit: int = 50, + ) -> tuple[list[Group], int]: + count_stmt = select(sa_func.count()).select_from(GroupModel).where(GroupModel.tenant_id == tenant_id) + total = (await self._session.execute(count_stmt)).scalar_one() + + stmt = ( + select(GroupModel) + .where(GroupModel.tenant_id == tenant_id) + .order_by(GroupModel.created_at.desc()) + .offset(offset) + .limit(limit) + ) + result = await self._session.execute(stmt) + return [self._to_entity(r) for r in result.scalars().unique().all()], total + + async def save(self, entity: Group) -> 
Group: + model = self._to_model(entity) + merged = await self._session.merge(model) + await self._session.flush() + + # Sync role associations + await self._session.execute(GroupRoleModel.__table__.delete().where(GroupRoleModel.group_id == entity.id)) + for role_id in entity.role_ids: + await self._session.execute(GroupRoleModel.__table__.insert().values(group_id=entity.id, role_id=role_id)) + + await self._session.commit() + return self._to_entity(merged) + + async def delete(self, entity_id: UUID) -> None: + model = await self._session.get(GroupModel, entity_id) + if model: + await self._session.delete(model) + await self._session.commit() + + @staticmethod + def _to_entity(model: GroupModel) -> Group: + return Group( + id=model.id, + name=model.name, + tenant_id=model.tenant_id, + role_ids=[r.id for r in model.roles], + created_at=model.created_at, + updated_at=model.updated_at, + ) + + @staticmethod + def _to_model(entity: Group) -> GroupModel: + return GroupModel( + id=entity.id, + name=entity.name, + tenant_id=entity.tenant_id, + created_at=entity.created_at, + updated_at=entity.updated_at, + ) diff --git a/services/auth/src/auth/infrastructure/login_rate_limiter.py b/services/auth/src/auth/infrastructure/login_rate_limiter.py new file mode 100644 index 0000000..28e1d74 --- /dev/null +++ b/services/auth/src/auth/infrastructure/login_rate_limiter.py @@ -0,0 +1,38 @@ +import redis.asyncio as redis + + +class LoginRateLimiter: + THRESHOLDS = [ + (5, 300), # 5 failures → 5 min lockout + (10, 1800), # 10 failures → 30 min lockout + ] + + def __init__(self, redis_url: str) -> None: + self._redis = redis.from_url(redis_url, decode_responses=True) + + def _key(self, email: str, ip: str) -> str: + return f"login_attempt:{email}:{ip}" + + async def is_locked(self, email: str, ip: str) -> bool: + lock_key = f"login_lock:{email}:{ip}" + return await self._redis.exists(lock_key) > 0 + + async def record_failure(self, email: str, ip: str) -> None: + key = self._key(email, ip) + count = await self._redis.incr(key) + await self._redis.expire(key, 3600) + + # Check the highest threshold first so repeat failures escalate the lockout even if an exact count is skipped. + for threshold, lockout_seconds in reversed(self.THRESHOLDS): + if count >= threshold: + lock_key = f"login_lock:{email}:{ip}" + await self._redis.setex(lock_key, lockout_seconds, "1") + break + + async def reset(self, email: str, ip: str) -> None: + key = self._key(email, ip) + lock_key = f"login_lock:{email}:{ip}" + await self._redis.delete(key, lock_key) + + async def close(self) -> None: + await self._redis.aclose() diff --git a/services/auth/src/auth/infrastructure/models.py b/services/auth/src/auth/infrastructure/models.py new file mode 100644 index 0000000..6b4cf81 --- /dev/null +++ b/services/auth/src/auth/infrastructure/models.py @@ -0,0 +1,99 @@ +from datetime import datetime +from uuid import UUID + +from sqlalchemy import DateTime as SADateTime +from sqlalchemy import ForeignKey, String, UniqueConstraint +from sqlalchemy.dialects.postgresql import JSONB +from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column, relationship +from sqlalchemy.sql import func + + +class AuthBase(DeclarativeBase): + pass + + +class UserRoleModel(AuthBase): + __tablename__ = "user_roles" + + user_id: Mapped[UUID] = mapped_column(ForeignKey("users.id", ondelete="CASCADE"), primary_key=True) + role_id: Mapped[UUID] = mapped_column(ForeignKey("roles.id", ondelete="CASCADE"), primary_key=True) + + +class UserGroupModel(AuthBase): + __tablename__ = "user_groups" + + user_id: Mapped[UUID] = mapped_column(ForeignKey("users.id", ondelete="CASCADE"), primary_key=True)
+ group_id: Mapped[UUID] = mapped_column(ForeignKey("groups.id", ondelete="CASCADE"), primary_key=True) + + +class GroupRoleModel(AuthBase): + __tablename__ = "group_roles" + + group_id: Mapped[UUID] = mapped_column(ForeignKey("groups.id", ondelete="CASCADE"), primary_key=True) + role_id: Mapped[UUID] = mapped_column(ForeignKey("roles.id", ondelete="CASCADE"), primary_key=True) + + +class UserModel(AuthBase): + __tablename__ = "users" + __table_args__ = (UniqueConstraint("email", "tenant_id", name="uq_user_email_tenant"),) + + id: Mapped[UUID] = mapped_column(primary_key=True) + email: Mapped[str] = mapped_column(String(255)) + password_hash: Mapped[str] = mapped_column(String(255)) + tenant_id: Mapped[UUID] = mapped_column(index=True) + status: Mapped[str] = mapped_column(String(20), default="active") + display_name: Mapped[str | None] = mapped_column(String(255), nullable=True) + created_at: Mapped[datetime] = mapped_column(SADateTime(timezone=True), server_default=func.now()) + updated_at: Mapped[datetime] = mapped_column( + SADateTime(timezone=True), server_default=func.now(), onupdate=func.now() + ) + + roles: Mapped[list["RoleModel"]] = relationship(secondary="user_roles", lazy="selectin", viewonly=True) + groups: Mapped[list["GroupModel"]] = relationship(secondary="user_groups", lazy="selectin", viewonly=True) + + +class RoleModel(AuthBase): + __tablename__ = "roles" + __table_args__ = (UniqueConstraint("name", "tenant_id", name="uq_role_name_tenant"),) + + id: Mapped[UUID] = mapped_column(primary_key=True) + name: Mapped[str] = mapped_column(String(255)) + tenant_id: Mapped[UUID] = mapped_column(index=True) + description: Mapped[str | None] = mapped_column(String(1024), nullable=True) + permissions: Mapped[list] = mapped_column(JSONB, default=list) + is_system: Mapped[bool] = mapped_column(default=False) + created_at: Mapped[datetime] = mapped_column(SADateTime(timezone=True), server_default=func.now()) + updated_at: Mapped[datetime] = mapped_column( + SADateTime(timezone=True), server_default=func.now(), onupdate=func.now() + ) + + +class GroupModel(AuthBase): + __tablename__ = "groups" + __table_args__ = (UniqueConstraint("name", "tenant_id", name="uq_group_name_tenant"),) + + id: Mapped[UUID] = mapped_column(primary_key=True) + name: Mapped[str] = mapped_column(String(255)) + tenant_id: Mapped[UUID] = mapped_column(index=True) + created_at: Mapped[datetime] = mapped_column(SADateTime(timezone=True), server_default=func.now()) + updated_at: Mapped[datetime] = mapped_column( + SADateTime(timezone=True), server_default=func.now(), onupdate=func.now() + ) + + roles: Mapped[list["RoleModel"]] = relationship(secondary="group_roles", lazy="selectin", viewonly=True) + + +class APITokenModel(AuthBase): + __tablename__ = "api_tokens" + + id: Mapped[UUID] = mapped_column(primary_key=True) + user_id: Mapped[UUID] = mapped_column(ForeignKey("users.id", ondelete="CASCADE"), index=True) + tenant_id: Mapped[UUID] = mapped_column(index=True) + key_hash: Mapped[str] = mapped_column(String(255), unique=True, index=True) + description: Mapped[str | None] = mapped_column(String(1024), nullable=True) + scopes: Mapped[list] = mapped_column(JSONB, default=list) + expires_at: Mapped[datetime | None] = mapped_column(SADateTime(timezone=True), nullable=True) + allowed_ips: Mapped[list] = mapped_column(JSONB, default=list) + last_used_at: Mapped[datetime | None] = mapped_column(SADateTime(timezone=True), nullable=True) + is_revoked: Mapped[bool] = mapped_column(default=False) + created_at: Mapped[datetime] 
= mapped_column(SADateTime(timezone=True), server_default=func.now()) diff --git a/services/auth/src/auth/infrastructure/role_repository.py b/services/auth/src/auth/infrastructure/role_repository.py new file mode 100644 index 0000000..322b728 --- /dev/null +++ b/services/auth/src/auth/infrastructure/role_repository.py @@ -0,0 +1,94 @@ +from uuid import UUID + +from sqlalchemy import func as sa_func +from sqlalchemy import select +from sqlalchemy.ext.asyncio import AsyncSession + +from auth.domain.permission import Permission +from auth.domain.repository import RoleRepository +from auth.domain.role import Role +from auth.infrastructure.models import RoleModel + + +class PostgresRoleRepository(RoleRepository): + def __init__(self, session: AsyncSession) -> None: + self._session = session + + async def find_by_id(self, entity_id: UUID) -> Role | None: + result = await self._session.get(RoleModel, entity_id) + return self._to_entity(result) if result else None + + async def find_by_name(self, name: str, tenant_id: UUID) -> Role | None: + stmt = select(RoleModel).where( + RoleModel.name == name, + RoleModel.tenant_id == tenant_id, + ) + result = await self._session.execute(stmt) + row = result.scalar_one_or_none() + return self._to_entity(row) if row else None + + async def find_by_ids(self, role_ids: list[UUID]) -> list[Role]: + if not role_ids: + return [] + stmt = select(RoleModel).where(RoleModel.id.in_(role_ids)) + result = await self._session.execute(stmt) + return [self._to_entity(r) for r in result.scalars().all()] + + async def find_all( + self, + tenant_id: UUID, + *, + offset: int = 0, + limit: int = 50, + ) -> tuple[list[Role], int]: + count_stmt = select(sa_func.count()).select_from(RoleModel).where(RoleModel.tenant_id == tenant_id) + total = (await self._session.execute(count_stmt)).scalar_one() + + stmt = ( + select(RoleModel) + .where(RoleModel.tenant_id == tenant_id) + .order_by(RoleModel.created_at.desc()) + .offset(offset) + .limit(limit) + ) + result = await self._session.execute(stmt) + return [self._to_entity(r) for r in result.scalars().all()], total + + async def save(self, entity: Role) -> Role: + model = self._to_model(entity) + merged = await self._session.merge(model) + await self._session.commit() + return self._to_entity(merged) + + async def delete(self, entity_id: UUID) -> None: + model = await self._session.get(RoleModel, entity_id) + if model: + await self._session.delete(model) + await self._session.commit() + + @staticmethod + def _to_entity(model: RoleModel) -> Role: + permissions = [Permission(**p) for p in (model.permissions or [])] + return Role( + id=model.id, + name=model.name, + tenant_id=model.tenant_id, + description=model.description, + permissions=permissions, + is_system=model.is_system, + created_at=model.created_at, + updated_at=model.updated_at, + ) + + @staticmethod + def _to_model(entity: Role) -> RoleModel: + return RoleModel( + id=entity.id, + name=entity.name, + tenant_id=entity.tenant_id, + description=entity.description, + permissions=[p.model_dump() for p in entity.permissions], + is_system=entity.is_system, + created_at=entity.created_at, + updated_at=entity.updated_at, + ) diff --git a/services/auth/src/auth/infrastructure/security.py b/services/auth/src/auth/infrastructure/security.py new file mode 100644 index 0000000..6201a90 --- /dev/null +++ b/services/auth/src/auth/infrastructure/security.py @@ -0,0 +1,110 @@ +import asyncio +import hashlib +from datetime import UTC, datetime, timedelta +from uuid import UUID, uuid4 + +import 
bcrypt +import jwt + +from auth.domain.services import PasswordService +from auth.infrastructure.config import Settings + + +class BcryptPasswordService(PasswordService): + def __init__(self, rounds: int = 12) -> None: + self._rounds = rounds + + def hash(self, password: str) -> str: + salt = bcrypt.gensalt(rounds=self._rounds) + return bcrypt.hashpw(password.encode("utf-8"), salt).decode("utf-8") + + def verify(self, password: str, hashed: str) -> bool: + return bcrypt.checkpw(password.encode("utf-8"), hashed.encode("utf-8")) + + async def hash_async(self, password: str) -> str: + return await asyncio.to_thread(self.hash, password) + + async def verify_async(self, password: str, hashed: str) -> bool: + return await asyncio.to_thread(self.verify, password, hashed) + + +class JWTService: + def __init__(self, settings: Settings) -> None: + self._private_key = settings.rsa_private_key + self._public_key = settings.rsa_public_key + self._algorithm = settings.jwt_algorithm + self._access_expire_minutes = settings.jwt_access_token_expire_minutes + self._refresh_expire_days = settings.jwt_refresh_token_expire_days + + def create_access_token( + self, + user_id: UUID, + tenant_id: UUID, + roles: list[str], + ) -> str: + now = datetime.now(UTC) + payload = { + "sub": str(user_id), + "tenant_id": str(tenant_id), + "roles": roles, + "type": "access", + "exp": now + timedelta(minutes=self._access_expire_minutes), + "iat": now, + "jti": str(uuid4()), + } + return jwt.encode(payload, self._private_key, algorithm=self._algorithm) + + def create_refresh_token( + self, + user_id: UUID, + tenant_id: UUID, + ) -> str: + now = datetime.now(UTC) + payload = { + "sub": str(user_id), + "tenant_id": str(tenant_id), + "type": "refresh", + "exp": now + timedelta(days=self._refresh_expire_days), + "iat": now, + "jti": str(uuid4()), + } + return jwt.encode(payload, self._private_key, algorithm=self._algorithm) + + def decode_token(self, token: str) -> dict: + return jwt.decode(token, self._public_key, algorithms=[self._algorithm]) + + @property + def access_expire_minutes(self) -> int: + return self._access_expire_minutes + + @property + def public_key_pem(self) -> str: + return self._public_key + + def get_jwks(self) -> dict: + from cryptography.hazmat.primitives.serialization import load_pem_public_key + + public_key = load_pem_public_key(self._public_key.encode()) + numbers = public_key.public_numbers() + + def _int_to_base64url(value: int) -> str: + import base64 + + byte_length = (value.bit_length() + 7) // 8 + value_bytes = value.to_bytes(byte_length, byteorder="big") + return base64.urlsafe_b64encode(value_bytes).rstrip(b"=").decode("ascii") + + kid = hashlib.sha256(self._public_key.encode()).hexdigest()[:16] + + return { + "keys": [ + { + "kty": "RSA", + "use": "sig", + "alg": "RS256", + "kid": kid, + "n": _int_to_base64url(numbers.n), + "e": _int_to_base64url(numbers.e), + } + ] + } diff --git a/services/auth/src/auth/infrastructure/seed.py b/services/auth/src/auth/infrastructure/seed.py new file mode 100644 index 0000000..5cc9daa --- /dev/null +++ b/services/auth/src/auth/infrastructure/seed.py @@ -0,0 +1,131 @@ +"""Seed default roles and superadmin user. 
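+
+Idempotent: roles and the admin user that already exist are skipped, so the
+script is safe to re-run.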
+
+Usage:
+    uv run --package cmdb-auth python -m auth.infrastructure.seed
+"""
+
+import asyncio
+import os
+from uuid import UUID
+
+from sqlalchemy.ext.asyncio import AsyncSession, async_sessionmaker, create_async_engine
+
+from auth.domain.permission import Permission
+from auth.domain.role import Role
+from auth.domain.user import User
+from auth.infrastructure.role_repository import PostgresRoleRepository
+from auth.infrastructure.security import BcryptPasswordService
+from auth.infrastructure.user_repository import PostgresUserRepository
+
+DEFAULT_ROLES = [
+    {
+        "name": "superadmin",
+        "description": "Full system access",
+        "is_system": True,
+        "permissions": [
+            {"object_type": "ipam", "actions": ["view", "add", "change", "delete"]},
+            {"object_type": "dcim", "actions": ["view", "add", "change", "delete"]},
+            {"object_type": "circuit", "actions": ["view", "add", "change", "delete"]},
+            {"object_type": "virtualization", "actions": ["view", "add", "change", "delete"]},
+            {"object_type": "tenant", "actions": ["view", "add", "change", "delete"]},
+            {"object_type": "auth", "actions": ["view", "add", "change", "delete"]},
+        ],
+    },
+    {
+        "name": "admin",
+        "description": "Administrative access",
+        "is_system": True,
+        "permissions": [
+            {"object_type": "ipam", "actions": ["view", "add", "change", "delete"]},
+            {"object_type": "dcim", "actions": ["view", "add", "change", "delete"]},
+            {"object_type": "auth", "actions": ["view", "add", "change"]},
+        ],
+    },
+    {
+        "name": "operator",
+        "description": "Operational access",
+        "is_system": True,
+        "permissions": [
+            {"object_type": "ipam", "actions": ["view", "add", "change"]},
+            {"object_type": "dcim", "actions": ["view", "add", "change"]},
+        ],
+    },
+    {
+        "name": "viewer",
+        "description": "Read-only access",
+        "is_system": True,
+        "permissions": [
+            {"object_type": "ipam", "actions": ["view"]},
+            {"object_type": "dcim", "actions": ["view"]},
+            {"object_type": "circuit", "actions": ["view"]},
+            {"object_type": "virtualization", "actions": ["view"]},
+        ],
+    },
+]
+
+
+async def seed(database_url: str, tenant_id: str) -> None:
+    engine = create_async_engine(database_url)
+    session_factory = async_sessionmaker(engine, class_=AsyncSession, expire_on_commit=False)
+
+    async with session_factory() as session:
+        role_repo = PostgresRoleRepository(session)
+        user_repo = PostgresUserRepository(session)
+        password_service = BcryptPasswordService()
+
+        tenant_uuid = UUID(tenant_id)
+        superadmin_role_id = None
+
+        for role_def in DEFAULT_ROLES:
+            existing = await role_repo.find_by_name(role_def["name"], tenant_uuid)
+            if existing:
+                print(f"  Role '{role_def['name']}' already exists, skipping")
+                if role_def["name"] == "superadmin":
+                    superadmin_role_id = existing.id
+                continue
+
+            permissions = [Permission(**p) for p in role_def["permissions"]]
+            role = Role(
+                name=role_def["name"],
+                tenant_id=tenant_uuid,
+                description=role_def["description"],
+                permissions=permissions,
+                is_system=role_def["is_system"],
+            )
+            await role_repo.save(role)
+            print(f"  Created role: {role_def['name']}")
+            if role_def["name"] == "superadmin":
+                superadmin_role_id = role.id
+
+        # Create superadmin user
+        admin_email = os.getenv("ADMIN_EMAIL", "admin@cmdb.local")
+        admin_password = os.getenv("ADMIN_PASSWORD", "changeme123")
+
+        existing_user = await user_repo.find_by_email(admin_email, tenant_uuid)
+        if existing_user:
+            print(f"  Superadmin user '{admin_email}' already exists, skipping")
+        else:
+            password_hash = password_service.hash(admin_password)
+            user = User.create(
email=admin_email, + password_hash=password_hash, + tenant_id=tenant_uuid, + display_name="System Admin", + ) + if superadmin_role_id: + user.role_ids.append(superadmin_role_id) + await user_repo.save(user) + print(f" Created superadmin user: {admin_email}") + + await engine.dispose() + print("Seed completed!") + + +def main() -> None: + database_url = os.getenv("DATABASE_URL", "postgresql+asyncpg://cmdb:cmdb@localhost:5432/cmdb_auth") + tenant_id = os.getenv("SEED_TENANT_ID", "00000000-0000-0000-0000-000000000001") + print(f"Seeding auth database: {database_url}") + asyncio.run(seed(database_url, tenant_id)) + + +if __name__ == "__main__": + main() diff --git a/services/auth/src/auth/infrastructure/token_blacklist.py b/services/auth/src/auth/infrastructure/token_blacklist.py new file mode 100644 index 0000000..b821b30 --- /dev/null +++ b/services/auth/src/auth/infrastructure/token_blacklist.py @@ -0,0 +1,16 @@ +import redis.asyncio as redis + + +class RedisTokenBlacklist: + def __init__(self, redis_url: str) -> None: + self._redis = redis.from_url(redis_url, decode_responses=True) + + async def blacklist(self, jti: str, expires_in: int) -> None: + await self._redis.setex(f"token_blacklist:{jti}", expires_in, "1") + + async def is_blacklisted(self, jti: str) -> bool: + result = await self._redis.get(f"token_blacklist:{jti}") + return result is not None + + async def close(self) -> None: + await self._redis.aclose() diff --git a/services/auth/src/auth/infrastructure/user_repository.py b/services/auth/src/auth/infrastructure/user_repository.py new file mode 100644 index 0000000..143c63f --- /dev/null +++ b/services/auth/src/auth/infrastructure/user_repository.py @@ -0,0 +1,100 @@ +from uuid import UUID + +from sqlalchemy import func as sa_func +from sqlalchemy import select +from sqlalchemy.ext.asyncio import AsyncSession + +from auth.domain.repository import UserRepository +from auth.domain.user import User, UserStatus +from auth.infrastructure.models import UserGroupModel, UserModel, UserRoleModel + + +class PostgresUserRepository(UserRepository): + def __init__(self, session: AsyncSession) -> None: + self._session = session + + async def find_by_id(self, entity_id: UUID) -> User | None: + result = await self._session.get(UserModel, entity_id) + return self._to_entity(result) if result else None + + async def find_by_email(self, email: str, tenant_id: UUID) -> User | None: + stmt = select(UserModel).where( + UserModel.email == email, + UserModel.tenant_id == tenant_id, + ) + result = await self._session.execute(stmt) + row = result.scalar_one_or_none() + return self._to_entity(row) if row else None + + async def find_all( + self, + tenant_id: UUID, + *, + offset: int = 0, + limit: int = 50, + ) -> tuple[list[User], int]: + count_stmt = select(sa_func.count()).select_from(UserModel).where(UserModel.tenant_id == tenant_id) + total = (await self._session.execute(count_stmt)).scalar_one() + + stmt = ( + select(UserModel) + .where(UserModel.tenant_id == tenant_id) + .order_by(UserModel.created_at.desc()) + .offset(offset) + .limit(limit) + ) + result = await self._session.execute(stmt) + rows = result.scalars().unique().all() + return [self._to_entity(r) for r in rows], total + + async def save(self, entity: User) -> User: + model = self._to_model(entity) + merged = await self._session.merge(model) + await self._session.flush() + + # Sync role associations + await self._session.execute(UserRoleModel.__table__.delete().where(UserRoleModel.user_id == entity.id)) + for role_id in entity.role_ids: 
+            await self._session.execute(UserRoleModel.__table__.insert().values(user_id=entity.id, role_id=role_id))
+
+        # Sync group associations
+        await self._session.execute(UserGroupModel.__table__.delete().where(UserGroupModel.user_id == entity.id))
+        for group_id in entity.group_ids:
+            await self._session.execute(UserGroupModel.__table__.insert().values(user_id=entity.id, group_id=group_id))
+
+        await self._session.commit()
+        return self._to_entity(merged)
+
+    async def delete(self, entity_id: UUID) -> None:
+        model = await self._session.get(UserModel, entity_id)
+        if model:
+            await self._session.delete(model)
+            await self._session.commit()
+
+    @staticmethod
+    def _to_entity(model: UserModel) -> User:
+        return User(
+            id=model.id,
+            email=model.email,
+            password_hash=model.password_hash,
+            tenant_id=model.tenant_id,
+            status=UserStatus(model.status),
+            display_name=model.display_name,
+            role_ids=[r.id for r in model.roles],
+            group_ids=[g.id for g in model.groups],
+            created_at=model.created_at,
+            updated_at=model.updated_at,
+        )
+
+    @staticmethod
+    def _to_model(entity: User) -> UserModel:
+        return UserModel(
+            id=entity.id,
+            email=entity.email,
+            password_hash=entity.password_hash,
+            tenant_id=entity.tenant_id,
+            status=entity.status.value,
+            display_name=entity.display_name,
+            created_at=entity.created_at,
+            updated_at=entity.updated_at,
+        )
diff --git a/services/auth/src/auth/interface/__init__.py b/services/auth/src/auth/interface/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/services/auth/src/auth/interface/dependencies.py b/services/auth/src/auth/interface/dependencies.py
new file mode 100644
index 0000000..36ff609
--- /dev/null
+++ b/services/auth/src/auth/interface/dependencies.py
@@ -0,0 +1,36 @@
+from uuid import UUID
+
+import jwt
+from fastapi import Request
+from shared.domain.exceptions import AuthorizationError
+
+from auth.infrastructure.security import JWTService
+
+
+async def get_current_user(request: Request) -> dict:
+    auth_header = request.headers.get("Authorization")
+    if not auth_header or not auth_header.startswith("Bearer "):
+        raise AuthorizationError("Missing or invalid Authorization header")
+
+    token = auth_header.removeprefix("Bearer ").strip()
+
+    jwt_service: JWTService = request.app.state.jwt_service
+    token_blacklist = request.app.state.token_blacklist
+
+    try:
+        payload = jwt_service.decode_token(token)
+    except jwt.PyJWTError as exc:
+        raise AuthorizationError("Invalid or expired token") from exc
+
+    if payload.get("type") != "access":
+        raise AuthorizationError("Invalid token type")
+
+    jti = payload.get("jti")
+    if jti and await token_blacklist.is_blacklisted(jti):
+        raise AuthorizationError("Token has been revoked")
+
+    return {
+        "user_id": UUID(payload["sub"]),
+        "tenant_id": UUID(payload["tenant_id"]),
+        "roles": payload.get("roles", []),
+    }
diff --git a/services/auth/src/auth/interface/main.py b/services/auth/src/auth/interface/main.py
new file mode 100644
index 0000000..de19292
--- /dev/null
+++ b/services/auth/src/auth/interface/main.py
@@ -0,0 +1,103 @@
+from collections.abc import AsyncGenerator
+from contextlib import asynccontextmanager
+
+from fastapi import FastAPI
+from fastapi.middleware.cors import CORSMiddleware
+from shared.api.errors import domain_exception_handler
+from shared.api.middleware import CorrelationIdMiddleware
+from shared.domain.exceptions import DomainError
+from shared.messaging.producer import KafkaEventProducer
+from shared.messaging.serialization import EventSerializer
+
+from auth.domain.events import (
+
RoleAssigned, + RoleRemoved, + TokenGenerated, + TokenRevoked, + UserCreated, + UserLocked, +) +from auth.infrastructure.config import Settings +from auth.infrastructure.database import Database +from auth.infrastructure.login_rate_limiter import LoginRateLimiter +from auth.infrastructure.security import BcryptPasswordService, JWTService +from auth.infrastructure.token_blacklist import RedisTokenBlacklist +from auth.interface.router import ( + api_token_router, + auth_router, + permission_router, + role_router, + user_router, +) + + +@asynccontextmanager +async def lifespan(app: FastAPI) -> AsyncGenerator[None]: + settings = Settings() + + database = Database(settings.database_url) + + serializer = EventSerializer() + serializer.register(UserCreated) + serializer.register(UserLocked) + serializer.register(RoleAssigned) + serializer.register(RoleRemoved) + serializer.register(TokenGenerated) + serializer.register(TokenRevoked) + event_producer = KafkaEventProducer( + settings.kafka_bootstrap_servers, + serializer, + ) + await event_producer.start() + + password_service = BcryptPasswordService(settings.bcrypt_rounds) + jwt_service = JWTService(settings) + token_blacklist = RedisTokenBlacklist(settings.redis_url) + rate_limiter = LoginRateLimiter(settings.redis_url) + + app.state.database = database + app.state.settings = settings + app.state.event_producer = event_producer + app.state.password_service = password_service + app.state.jwt_service = jwt_service + app.state.token_blacklist = token_blacklist + app.state.rate_limiter = rate_limiter + + yield + + await event_producer.stop() + await token_blacklist.close() + await rate_limiter.close() + await database.close() + + +def create_app() -> FastAPI: + app = FastAPI(title="CMDB Auth Service", lifespan=lifespan) + app.add_middleware( + CORSMiddleware, + allow_origins=["http://localhost:3000"], + allow_methods=["*"], + allow_headers=["*"], + ) + app.add_middleware(CorrelationIdMiddleware) + app.add_exception_handler(DomainError, domain_exception_handler) + app.include_router(auth_router) + app.include_router(user_router) + app.include_router(role_router) + app.include_router(api_token_router) + app.include_router(permission_router) + + # JWKS endpoint + @app.get("/auth/.well-known/jwks.json", tags=["auth"]) + async def jwks() -> dict: + return app.state.jwt_service.get_jwks() + + # Health check + @app.get("/health", include_in_schema=False) + async def health() -> dict: + return {"status": "ok"} + + return app + + +app = create_app() diff --git a/services/auth/src/auth/interface/router.py b/services/auth/src/auth/interface/router.py new file mode 100644 index 0000000..f497479 --- /dev/null +++ b/services/auth/src/auth/interface/router.py @@ -0,0 +1,479 @@ +from uuid import UUID + +from fastapi import APIRouter, Depends, Request, status +from shared.api.pagination import OffsetParams +from shared.cqrs.bus import CommandBus, QueryBus +from sqlalchemy.ext.asyncio import AsyncSession + +from auth.application.command_handlers import ( + AssignRoleHandler, + ChangePasswordHandler, + CreateAPITokenHandler, + CreateRoleHandler, + DeleteRoleHandler, + LoginHandler, + LogoutHandler, + RefreshTokenHandler, + RegisterUserHandler, + RemoveRoleHandler, + RevokeAPITokenHandler, + UpdateRoleHandler, +) +from auth.application.commands import ( + AssignRoleCommand, + ChangePasswordCommand, + CreateAPITokenCommand, + CreateRoleCommand, + DeleteRoleCommand, + LoginCommand, + LogoutCommand, + RefreshTokenCommand, + RegisterUserCommand, + RemoveRoleCommand, + 
RevokeAPITokenCommand, + UpdateRoleCommand, +) +from auth.application.queries import ( + CheckPermissionQuery, + GetRoleQuery, + GetUserQuery, + ListAPITokensQuery, + ListRolesQuery, + ListUsersQuery, +) +from auth.application.query_handlers import ( + CheckPermissionHandler, + GetRoleHandler, + GetUserHandler, + ListAPITokensHandler, + ListRolesHandler, + ListUsersHandler, +) +from auth.infrastructure.api_token_repository import PostgresAPITokenRepository +from auth.infrastructure.group_repository import PostgresGroupRepository +from auth.infrastructure.role_repository import PostgresRoleRepository +from auth.infrastructure.user_repository import PostgresUserRepository +from auth.interface.dependencies import get_current_user +from auth.interface.schemas import ( + APITokenListResponse, + APITokenResponse, + AssignRoleRequest, + AuthTokenResponse, + CreateAPITokenRequest, + CreateRoleRequest, + LoginRequest, + PermissionCheckResponse, + RefreshTokenRequest, + RegisterRequest, + RoleListResponse, + RoleResponse, + UpdateRoleRequest, + UserListResponse, + UserResponse, +) + +# --- Helpers --- + + +def _get_session(request: Request) -> AsyncSession: + return request.app.state.database.session() + + +def _get_auth_command_bus(request: Request) -> CommandBus: + session = _get_session(request) + user_repo = PostgresUserRepository(session) + role_repo = PostgresRoleRepository(session) + + bus = CommandBus() + bus.register( + RegisterUserCommand, + RegisterUserHandler( + user_repo, + request.app.state.password_service, + request.app.state.event_producer, + ), + ) + bus.register( + LoginCommand, + LoginHandler( + user_repo, + role_repo, + request.app.state.password_service, + request.app.state.jwt_service, + request.app.state.rate_limiter, + ), + ) + bus.register( + RefreshTokenCommand, + RefreshTokenHandler( + user_repo, + role_repo, + request.app.state.jwt_service, + request.app.state.token_blacklist, + ), + ) + bus.register( + LogoutCommand, + LogoutHandler( + request.app.state.jwt_service, + request.app.state.token_blacklist, + ), + ) + bus.register( + ChangePasswordCommand, + ChangePasswordHandler( + user_repo, + request.app.state.password_service, + ), + ) + bus.register( + AssignRoleCommand, + AssignRoleHandler( + user_repo, + role_repo, + request.app.state.event_producer, + ), + ) + bus.register( + RemoveRoleCommand, + RemoveRoleHandler( + user_repo, + request.app.state.event_producer, + ), + ) + bus.register(CreateRoleCommand, CreateRoleHandler(role_repo)) + bus.register(UpdateRoleCommand, UpdateRoleHandler(role_repo)) + bus.register(DeleteRoleCommand, DeleteRoleHandler(role_repo)) + + token_repo = PostgresAPITokenRepository(session) + bus.register( + CreateAPITokenCommand, + CreateAPITokenHandler(token_repo, request.app.state.event_producer), + ) + bus.register( + RevokeAPITokenCommand, + RevokeAPITokenHandler(token_repo, request.app.state.event_producer), + ) + + return bus + + +def _get_query_bus(request: Request) -> QueryBus: + session = _get_session(request) + user_repo = PostgresUserRepository(session) + role_repo = PostgresRoleRepository(session) + group_repo = PostgresGroupRepository(session) + token_repo = PostgresAPITokenRepository(session) + + bus = QueryBus() + bus.register(GetUserQuery, GetUserHandler(user_repo)) + bus.register(ListUsersQuery, ListUsersHandler(user_repo)) + bus.register(GetRoleQuery, GetRoleHandler(role_repo)) + bus.register(ListRolesQuery, ListRolesHandler(role_repo)) + bus.register( + CheckPermissionQuery, + CheckPermissionHandler(user_repo, role_repo, 
group_repo), + ) + bus.register(ListAPITokensQuery, ListAPITokensHandler(token_repo)) + return bus + + +# ============================================================================= +# Auth Router (public endpoints) +# ============================================================================= + +auth_router = APIRouter(prefix="/auth", tags=["auth"]) + + +@auth_router.post( + "/register", + status_code=status.HTTP_201_CREATED, + response_model=UserResponse, +) +async def register( + body: RegisterRequest, + command_bus: CommandBus = Depends(_get_auth_command_bus), # noqa: B008 + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> UserResponse: + user_id = await command_bus.dispatch(RegisterUserCommand(**body.model_dump())) + result = await query_bus.dispatch(GetUserQuery(user_id=user_id)) + return UserResponse(**result.model_dump()) + + +@auth_router.post("/login", response_model=AuthTokenResponse) +async def login( + body: LoginRequest, + request: Request, + command_bus: CommandBus = Depends(_get_auth_command_bus), # noqa: B008 +) -> AuthTokenResponse: + client_ip = request.client.host if request.client else "0.0.0.0" + result = await command_bus.dispatch(LoginCommand(**body.model_dump(), client_ip=client_ip)) + return AuthTokenResponse(**result.model_dump()) + + +@auth_router.post("/refresh", response_model=AuthTokenResponse) +async def refresh_token( + body: RefreshTokenRequest, + command_bus: CommandBus = Depends(_get_auth_command_bus), # noqa: B008 +) -> AuthTokenResponse: + result = await command_bus.dispatch(RefreshTokenCommand(refresh_token=body.refresh_token)) + return AuthTokenResponse(**result.model_dump()) + + +@auth_router.post("/logout", status_code=status.HTTP_204_NO_CONTENT) +async def logout( + body: RefreshTokenRequest, + command_bus: CommandBus = Depends(_get_auth_command_bus), # noqa: B008 + _current_user: dict = Depends(get_current_user), # noqa: B008 +) -> None: + await command_bus.dispatch(LogoutCommand(refresh_token=body.refresh_token)) + + +@auth_router.get("/validate", status_code=status.HTTP_200_OK, include_in_schema=False) +async def validate( + current_user: dict = Depends(get_current_user), # noqa: B008 +) -> None: + from fastapi.responses import Response + + response = Response(status_code=200) + response.headers["X-User-ID"] = str(current_user["user_id"]) + response.headers["X-Tenant-ID"] = str(current_user["tenant_id"]) + return response + + +# ============================================================================= +# User Router (authenticated endpoints) +# ============================================================================= + +user_router = APIRouter(prefix="/users", tags=["users"]) + + +@user_router.get("", response_model=UserListResponse) +async def list_users( + current_user: dict = Depends(get_current_user), # noqa: B008 + params: OffsetParams = Depends(), # noqa: B008 + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> UserListResponse: + items, total = await query_bus.dispatch( + ListUsersQuery( + tenant_id=current_user["tenant_id"], + offset=params.offset, + limit=params.limit, + ) + ) + return UserListResponse( + items=[UserResponse(**i.model_dump()) for i in items], + total=total, + offset=params.offset, + limit=params.limit, + ) + + +@user_router.get("/{user_id}", response_model=UserResponse) +async def get_user( + user_id: UUID, + _current_user: dict = Depends(get_current_user), # noqa: B008 + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> UserResponse: + result = await 
query_bus.dispatch(GetUserQuery(user_id=user_id)) + return UserResponse(**result.model_dump()) + + +@user_router.post( + "/{user_id}/roles", + status_code=status.HTTP_204_NO_CONTENT, +) +async def assign_role( + user_id: UUID, + body: AssignRoleRequest, + _current_user: dict = Depends(get_current_user), # noqa: B008 + command_bus: CommandBus = Depends(_get_auth_command_bus), # noqa: B008 +) -> None: + await command_bus.dispatch(AssignRoleCommand(user_id=user_id, role_id=body.role_id)) + + +@user_router.delete( + "/{user_id}/roles/{role_id}", + status_code=status.HTTP_204_NO_CONTENT, +) +async def remove_role( + user_id: UUID, + role_id: UUID, + _current_user: dict = Depends(get_current_user), # noqa: B008 + command_bus: CommandBus = Depends(_get_auth_command_bus), # noqa: B008 +) -> None: + await command_bus.dispatch(RemoveRoleCommand(user_id=user_id, role_id=role_id)) + + +# ============================================================================= +# Role Router (authenticated endpoints) +# ============================================================================= + +role_router = APIRouter(prefix="/roles", tags=["roles"]) + + +@role_router.post( + "", + status_code=status.HTTP_201_CREATED, + response_model=RoleResponse, +) +async def create_role( + body: CreateRoleRequest, + _current_user: dict = Depends(get_current_user), # noqa: B008 + command_bus: CommandBus = Depends(_get_auth_command_bus), # noqa: B008 + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> RoleResponse: + role_id = await command_bus.dispatch( + CreateRoleCommand( + name=body.name, + tenant_id=body.tenant_id, + description=body.description, + permissions=[p.model_dump() for p in body.permissions] if body.permissions else None, + ) + ) + result = await query_bus.dispatch(GetRoleQuery(role_id=role_id)) + return RoleResponse(**result.model_dump()) + + +@role_router.get("", response_model=RoleListResponse) +async def list_roles( + current_user: dict = Depends(get_current_user), # noqa: B008 + params: OffsetParams = Depends(), # noqa: B008 + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> RoleListResponse: + items, total = await query_bus.dispatch( + ListRolesQuery( + tenant_id=current_user["tenant_id"], + offset=params.offset, + limit=params.limit, + ) + ) + return RoleListResponse( + items=[RoleResponse(**i.model_dump()) for i in items], + total=total, + offset=params.offset, + limit=params.limit, + ) + + +@role_router.get("/{role_id}", response_model=RoleResponse) +async def get_role( + role_id: UUID, + _current_user: dict = Depends(get_current_user), # noqa: B008 + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> RoleResponse: + result = await query_bus.dispatch(GetRoleQuery(role_id=role_id)) + return RoleResponse(**result.model_dump()) + + +@role_router.patch("/{role_id}", response_model=RoleResponse) +async def update_role( + role_id: UUID, + body: UpdateRoleRequest, + _current_user: dict = Depends(get_current_user), # noqa: B008 + command_bus: CommandBus = Depends(_get_auth_command_bus), # noqa: B008 + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> RoleResponse: + await command_bus.dispatch( + UpdateRoleCommand( + role_id=role_id, + name=body.name, + description=body.description, + permissions=[p.model_dump() for p in body.permissions] if body.permissions else None, + ) + ) + result = await query_bus.dispatch(GetRoleQuery(role_id=role_id)) + return RoleResponse(**result.model_dump()) + + +@role_router.delete( + "/{role_id}", + 
status_code=status.HTTP_204_NO_CONTENT, +) +async def delete_role( + role_id: UUID, + _current_user: dict = Depends(get_current_user), # noqa: B008 + command_bus: CommandBus = Depends(_get_auth_command_bus), # noqa: B008 +) -> None: + await command_bus.dispatch(DeleteRoleCommand(role_id=role_id)) + + +# ============================================================================= +# API Token Router (authenticated endpoints) +# ============================================================================= + +api_token_router = APIRouter(prefix="/api-tokens", tags=["api-tokens"]) + + +@api_token_router.post( + "", + status_code=status.HTTP_201_CREATED, + response_model=APITokenResponse, +) +async def create_api_token( + body: CreateAPITokenRequest, + current_user: dict = Depends(get_current_user), # noqa: B008 + command_bus: CommandBus = Depends(_get_auth_command_bus), # noqa: B008 +) -> APITokenResponse: + result = await command_bus.dispatch( + CreateAPITokenCommand( + user_id=current_user["user_id"], + tenant_id=current_user["tenant_id"], + description=body.description, + scopes=body.scopes, + expires_at=body.expires_at, + allowed_ips=body.allowed_ips, + ) + ) + return APITokenResponse(**result.model_dump()) + + +@api_token_router.get("", response_model=APITokenListResponse) +async def list_api_tokens( + current_user: dict = Depends(get_current_user), # noqa: B008 + params: OffsetParams = Depends(), # noqa: B008 + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> APITokenListResponse: + items, total = await query_bus.dispatch( + ListAPITokensQuery( + user_id=current_user["user_id"], + offset=params.offset, + limit=params.limit, + ) + ) + return APITokenListResponse( + items=[APITokenResponse(**i.model_dump()) for i in items], + total=total, + offset=params.offset, + limit=params.limit, + ) + + +@api_token_router.delete( + "/{token_id}", + status_code=status.HTTP_204_NO_CONTENT, +) +async def revoke_api_token( + token_id: UUID, + _current_user: dict = Depends(get_current_user), # noqa: B008 + command_bus: CommandBus = Depends(_get_auth_command_bus), # noqa: B008 +) -> None: + await command_bus.dispatch(RevokeAPITokenCommand(token_id=token_id)) + + +# ============================================================================= +# Permission Check Router (internal) +# ============================================================================= + +permission_router = APIRouter(prefix="/permissions", tags=["permissions"]) + + +@permission_router.get("/check", response_model=PermissionCheckResponse) +async def check_permission( + user_id: UUID, + object_type: str, + action: str, + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> PermissionCheckResponse: + result = await query_bus.dispatch(CheckPermissionQuery(user_id=user_id, object_type=object_type, action=action)) + return PermissionCheckResponse(**result.model_dump()) diff --git a/services/auth/src/auth/interface/schemas.py b/services/auth/src/auth/interface/schemas.py new file mode 100644 index 0000000..bf0aa1c --- /dev/null +++ b/services/auth/src/auth/interface/schemas.py @@ -0,0 +1,134 @@ +from datetime import datetime +from uuid import UUID + +from pydantic import BaseModel, Field + +from auth.domain.user import UserStatus + +# --- Auth --- + + +class RegisterRequest(BaseModel): + email: str = Field(..., max_length=255) + password: str = Field(..., min_length=8, max_length=128) + tenant_id: UUID + display_name: str | None = Field(None, max_length=255) + + +class LoginRequest(BaseModel): + email: str + 
password: str + tenant_id: UUID + + +class RefreshTokenRequest(BaseModel): + refresh_token: str + + +class AuthTokenResponse(BaseModel): + access_token: str + refresh_token: str + token_type: str = "bearer" + expires_in: int + + +# --- Users --- + + +class UserResponse(BaseModel): + id: UUID + email: str + tenant_id: UUID + status: UserStatus + display_name: str | None + role_ids: list[UUID] + group_ids: list[UUID] + created_at: datetime + updated_at: datetime + + +class UserListResponse(BaseModel): + items: list[UserResponse] + total: int + offset: int + limit: int + + +class AssignRoleRequest(BaseModel): + role_id: UUID + + +# --- Roles --- + + +class PermissionSchema(BaseModel): + object_type: str + actions: list[str] + + +class CreateRoleRequest(BaseModel): + name: str = Field(..., min_length=1, max_length=255) + tenant_id: UUID + description: str | None = Field(None, max_length=1024) + permissions: list[PermissionSchema] | None = None + + +class UpdateRoleRequest(BaseModel): + name: str | None = Field(None, min_length=1, max_length=255) + description: str | None = Field(None, max_length=1024) + permissions: list[PermissionSchema] | None = None + + +class RoleResponse(BaseModel): + id: UUID + name: str + tenant_id: UUID + description: str | None + permissions: list[PermissionSchema] + is_system: bool + created_at: datetime + updated_at: datetime + + +class RoleListResponse(BaseModel): + items: list[RoleResponse] + total: int + offset: int + limit: int + + +# --- API Tokens --- + + +class CreateAPITokenRequest(BaseModel): + description: str | None = Field(None, max_length=1024) + scopes: list[str] | None = None + expires_at: datetime | None = None + allowed_ips: list[str] | None = None + + +class APITokenResponse(BaseModel): + id: UUID + user_id: UUID + tenant_id: UUID + description: str | None + scopes: list[str] + expires_at: datetime | None + allowed_ips: list[str] + is_revoked: bool + created_at: datetime + key: str | None = None + + +class APITokenListResponse(BaseModel): + items: list[APITokenResponse] + total: int + offset: int + limit: int + + +# --- Permissions --- + + +class PermissionCheckResponse(BaseModel): + allowed: bool diff --git a/services/auth/tests/__init__.py b/services/auth/tests/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/services/auth/tests/conftest.py b/services/auth/tests/conftest.py new file mode 100644 index 0000000..74765d0 --- /dev/null +++ b/services/auth/tests/conftest.py @@ -0,0 +1,273 @@ +from uuid import UUID + +import pytest +from auth.domain.api_token import APIToken +from auth.domain.group import Group +from auth.domain.repository import APITokenRepository, GroupRepository, RoleRepository, UserRepository +from auth.domain.role import Role +from auth.domain.user import User +from auth.infrastructure.config import Settings +from auth.infrastructure.security import BcryptPasswordService, JWTService +from cryptography.hazmat.primitives import serialization +from cryptography.hazmat.primitives.asymmetric import rsa +from shared.event.domain_event import DomainEvent +from shared.messaging.producer import KafkaEventProducer + +# --------------------------------------------------------------------------- +# In-Memory Repository Implementations +# --------------------------------------------------------------------------- + + +class InMemoryUserRepository(UserRepository): + def __init__(self) -> None: + self._store: dict[UUID, User] = {} + + async def find_by_id(self, entity_id: UUID) -> User | None: + return 
self._store.get(entity_id) + + async def save(self, entity: User) -> User: + self._store[entity.id] = entity + return entity + + async def delete(self, entity_id: UUID) -> None: + self._store.pop(entity_id, None) + + async def find_by_email(self, email: str, tenant_id: UUID) -> User | None: + for user in self._store.values(): + if user.email == email and user.tenant_id == tenant_id: + return user + return None + + async def find_all( + self, + tenant_id: UUID, + *, + offset: int = 0, + limit: int = 50, + ) -> tuple[list[User], int]: + users = [u for u in self._store.values() if u.tenant_id == tenant_id] + return users[offset : offset + limit], len(users) + + +class InMemoryRoleRepository(RoleRepository): + def __init__(self) -> None: + self._store: dict[UUID, Role] = {} + + async def find_by_id(self, entity_id: UUID) -> Role | None: + return self._store.get(entity_id) + + async def save(self, entity: Role) -> Role: + self._store[entity.id] = entity + return entity + + async def delete(self, entity_id: UUID) -> None: + self._store.pop(entity_id, None) + + async def find_by_name(self, name: str, tenant_id: UUID) -> Role | None: + for role in self._store.values(): + if role.name == name and role.tenant_id == tenant_id: + return role + return None + + async def find_by_ids(self, role_ids: list[UUID]) -> list[Role]: + return [r for r in self._store.values() if r.id in role_ids] + + async def find_all( + self, + tenant_id: UUID, + *, + offset: int = 0, + limit: int = 50, + ) -> tuple[list[Role], int]: + roles = [r for r in self._store.values() if r.tenant_id == tenant_id] + return roles[offset : offset + limit], len(roles) + + +class InMemoryGroupRepository(GroupRepository): + def __init__(self) -> None: + self._store: dict[UUID, Group] = {} + + async def find_by_id(self, entity_id: UUID) -> Group | None: + return self._store.get(entity_id) + + async def save(self, entity: Group) -> Group: + self._store[entity.id] = entity + return entity + + async def delete(self, entity_id: UUID) -> None: + self._store.pop(entity_id, None) + + async def find_by_name(self, name: str, tenant_id: UUID) -> Group | None: + for group in self._store.values(): + if group.name == name and group.tenant_id == tenant_id: + return group + return None + + async def find_by_ids(self, group_ids: list[UUID]) -> list[Group]: + return [g for g in self._store.values() if g.id in group_ids] + + async def find_all( + self, + tenant_id: UUID, + *, + offset: int = 0, + limit: int = 50, + ) -> tuple[list[Group], int]: + groups = [g for g in self._store.values() if g.tenant_id == tenant_id] + return groups[offset : offset + limit], len(groups) + + +class InMemoryAPITokenRepository(APITokenRepository): + def __init__(self) -> None: + self._store: dict[UUID, APIToken] = {} + + async def find_by_id(self, entity_id: UUID) -> APIToken | None: + return self._store.get(entity_id) + + async def save(self, entity: APIToken) -> APIToken: + self._store[entity.id] = entity + return entity + + async def delete(self, entity_id: UUID) -> None: + self._store.pop(entity_id, None) + + async def find_by_key_hash(self, key_hash: str) -> APIToken | None: + for token in self._store.values(): + if token.key_hash == key_hash: + return token + return None + + async def find_all_by_user( + self, + user_id: UUID, + *, + offset: int = 0, + limit: int = 50, + ) -> tuple[list[APIToken], int]: + tokens = [t for t in self._store.values() if t.user_id == user_id] + return tokens[offset : offset + limit], len(tokens) + + +# 
--------------------------------------------------------------------------- +# Fake Kafka Producer +# --------------------------------------------------------------------------- + + +class FakeKafkaProducer(KafkaEventProducer): + """A no-op Kafka producer that records published events for assertions.""" + + def __init__(self) -> None: + self.published: list[tuple[str, DomainEvent]] = [] + + async def start(self) -> None: + pass + + async def stop(self) -> None: + pass + + async def publish(self, topic: str, event: DomainEvent) -> None: + self.published.append((topic, event)) + + async def publish_many(self, topic: str, events: list[DomainEvent]) -> None: + for event in events: + self.published.append((topic, event)) + + +# --------------------------------------------------------------------------- +# Fake Login Rate Limiter +# --------------------------------------------------------------------------- + + +class FakeLoginRateLimiter: + """A no-op rate limiter that never locks.""" + + async def is_locked(self, email: str, ip: str) -> bool: + return False + + async def record_failure(self, email: str, ip: str) -> None: + pass + + async def reset(self, email: str, ip: str) -> None: + pass + + +# --------------------------------------------------------------------------- +# RSA Key Pair for JWT +# --------------------------------------------------------------------------- + + +def _generate_rsa_keys() -> tuple[str, str]: + private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048) + private_pem = private_key.private_bytes( + encoding=serialization.Encoding.PEM, + format=serialization.PrivateFormat.PKCS8, + encryption_algorithm=serialization.NoEncryption(), + ).decode("utf-8") + public_pem = ( + private_key.public_key() + .public_bytes( + encoding=serialization.Encoding.PEM, + format=serialization.PublicFormat.SubjectPublicKeyInfo, + ) + .decode("utf-8") + ) + return private_pem, public_pem + + +_PRIVATE_PEM, _PUBLIC_PEM = _generate_rsa_keys() + + +# --------------------------------------------------------------------------- +# Fixtures +# --------------------------------------------------------------------------- + + +@pytest.fixture +def user_repository() -> InMemoryUserRepository: + return InMemoryUserRepository() + + +@pytest.fixture +def role_repository() -> InMemoryRoleRepository: + return InMemoryRoleRepository() + + +@pytest.fixture +def group_repository() -> InMemoryGroupRepository: + return InMemoryGroupRepository() + + +@pytest.fixture +def api_token_repository() -> InMemoryAPITokenRepository: + return InMemoryAPITokenRepository() + + +@pytest.fixture +def password_service() -> BcryptPasswordService: + return BcryptPasswordService(rounds=4) + + +@pytest.fixture +def jwt_settings() -> Settings: + return Settings( + rsa_private_key=_PRIVATE_PEM, + rsa_public_key=_PUBLIC_PEM, + jwt_algorithm="RS256", + jwt_access_token_expire_minutes=30, + jwt_refresh_token_expire_days=7, + ) + + +@pytest.fixture +def jwt_service(jwt_settings: Settings) -> JWTService: + return JWTService(jwt_settings) + + +@pytest.fixture +def event_producer() -> FakeKafkaProducer: + return FakeKafkaProducer() + + +@pytest.fixture +def rate_limiter() -> FakeLoginRateLimiter: + return FakeLoginRateLimiter() diff --git a/services/auth/tests/test_auth_db.py b/services/auth/tests/test_auth_db.py new file mode 100644 index 0000000..6bb261b --- /dev/null +++ b/services/auth/tests/test_auth_db.py @@ -0,0 +1,127 @@ +"""Auth Docker integration tests: real PostgreSQL via testcontainers. 
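+
+Each test runs against a fresh session; all tables are truncated afterwards so
+tests stay independent.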
+ +Verifies User CRUD with real database persistence. +Marked with @pytest.mark.integration — requires Docker. +""" + +from __future__ import annotations + +from uuid import uuid4 + +import pytest +from auth.domain.user import User +from auth.infrastructure.models import AuthBase +from auth.infrastructure.user_repository import PostgresUserRepository +from sqlalchemy import text +from sqlalchemy.ext.asyncio import AsyncSession, async_sessionmaker, create_async_engine +from testcontainers.postgres import PostgresContainer + +TENANT_ID = uuid4() + + +# --------------------------------------------------------------------------- +# Fixtures +# --------------------------------------------------------------------------- + + +@pytest.fixture(scope="session") +def postgres_container(): + with PostgresContainer("postgres:16") as pg: + yield pg + + +@pytest.fixture(scope="session") +async def engine(postgres_container): + url = postgres_container.get_connection_url().replace("psycopg2", "asyncpg") + eng = create_async_engine(url) + async with eng.begin() as conn: + await conn.run_sync(AuthBase.metadata.create_all) + yield eng + await eng.dispose() + + +@pytest.fixture +async def session(engine): + factory = async_sessionmaker(engine, class_=AsyncSession, expire_on_commit=False) + async with factory() as session: + yield session + for table in reversed(AuthBase.metadata.sorted_tables): + await session.execute(text(f'TRUNCATE TABLE "{table.name}" CASCADE')) + await session.commit() + + +# --------------------------------------------------------------------------- +# TestAuthDB +# --------------------------------------------------------------------------- + + +@pytest.mark.integration +class TestAuthDB: + """Create user -> persist -> retrieve by email -> verify fields.""" + + async def test_create_user_persists_to_db(self, session: AsyncSession) -> None: + repo = PostgresUserRepository(session) + + user = User.create( + email="alice@example.com", + password_hash="hashed_password_123", + tenant_id=TENANT_ID, + display_name="Alice", + ) + saved = await repo.save(user) + + assert saved.id == user.id + assert saved.email == "alice@example.com" + + async def test_retrieve_user_by_email(self, session: AsyncSession) -> None: + repo = PostgresUserRepository(session) + + user = User.create( + email="bob@example.com", + password_hash="hashed_password_456", + tenant_id=TENANT_ID, + display_name="Bob", + ) + await repo.save(user) + + found = await repo.find_by_email("bob@example.com", TENANT_ID) + assert found is not None + assert found.id == user.id + assert found.email == "bob@example.com" + assert found.display_name == "Bob" + assert found.tenant_id == TENANT_ID + + async def test_retrieve_non_existent_email_returns_none(self, session: AsyncSession) -> None: + repo = PostgresUserRepository(session) + + found = await repo.find_by_email("nobody@example.com", TENANT_ID) + assert found is None + + async def test_find_by_id_returns_user(self, session: AsyncSession) -> None: + repo = PostgresUserRepository(session) + + user = User.create( + email="charlie@example.com", + password_hash="hashed_pw", + tenant_id=TENANT_ID, + ) + await repo.save(user) + + found = await repo.find_by_id(user.id) + assert found is not None + assert found.email == "charlie@example.com" + + async def test_delete_user_removes_from_db(self, session: AsyncSession) -> None: + repo = PostgresUserRepository(session) + + user = User.create( + email="dave@example.com", + password_hash="hashed_pw", + tenant_id=TENANT_ID, + ) + await repo.save(user) + 
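+        # FK ondelete="CASCADE" on the association tables also clears any
+        # user_roles / user_groups rows for this user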
+ await repo.delete(user.id) + + found = await repo.find_by_id(user.id) + assert found is None diff --git a/services/auth/tests/test_auth_e2e.py b/services/auth/tests/test_auth_e2e.py new file mode 100644 index 0000000..7f22ce9 --- /dev/null +++ b/services/auth/tests/test_auth_e2e.py @@ -0,0 +1,297 @@ +from uuid import uuid4 + +import pytest +from auth.application.command_handlers import ( + AssignRoleHandler, + CreateRoleHandler, + LoginHandler, + RegisterUserHandler, +) +from auth.application.commands import ( + AssignRoleCommand, + CreateRoleCommand, + LoginCommand, + RegisterUserCommand, +) +from auth.domain.services import PermissionChecker +from auth.infrastructure.security import BcryptPasswordService, JWTService +from shared.domain.exceptions import AuthorizationError, ConflictError + +from tests.conftest import ( + FakeKafkaProducer, + FakeLoginRateLimiter, + InMemoryRoleRepository, + InMemoryUserRepository, +) + +TENANT_ID = uuid4() + + +class TestUserRegistration: + async def test_register_user_stores_email_and_display_name( + self, + user_repository: InMemoryUserRepository, + password_service: BcryptPasswordService, + event_producer: FakeKafkaProducer, + ) -> None: + handler = RegisterUserHandler(user_repository, password_service, event_producer) + command = RegisterUserCommand( + email="alice@example.com", + password="StrongP@ss1", + tenant_id=TENANT_ID, + display_name="Alice", + ) + + user_id = await handler.handle(command) + + user = await user_repository.find_by_id(user_id) + assert user is not None + assert user.email == "alice@example.com" + assert user.display_name == "Alice" + assert user.tenant_id == TENANT_ID + + async def test_duplicate_email_raises_conflict_error( + self, + user_repository: InMemoryUserRepository, + password_service: BcryptPasswordService, + event_producer: FakeKafkaProducer, + ) -> None: + handler = RegisterUserHandler(user_repository, password_service, event_producer) + command = RegisterUserCommand( + email="bob@example.com", + password="StrongP@ss1", + tenant_id=TENANT_ID, + ) + + await handler.handle(command) + + with pytest.raises(ConflictError, match="already exists"): + await handler.handle(command) + + +class TestAuthentication: + async def test_login_with_correct_password_returns_jwt( + self, + user_repository: InMemoryUserRepository, + role_repository: InMemoryRoleRepository, + password_service: BcryptPasswordService, + jwt_service: JWTService, + event_producer: FakeKafkaProducer, + rate_limiter: FakeLoginRateLimiter, + ) -> None: + # Register user first + register_handler = RegisterUserHandler(user_repository, password_service, event_producer) + await register_handler.handle( + RegisterUserCommand( + email="charlie@example.com", + password="Secret123!", + tenant_id=TENANT_ID, + ) + ) + + # Login + login_handler = LoginHandler(user_repository, role_repository, password_service, jwt_service, rate_limiter) + result = await login_handler.handle( + LoginCommand( + email="charlie@example.com", + password="Secret123!", + tenant_id=TENANT_ID, + ) + ) + + assert result.access_token + assert result.refresh_token + assert result.token_type == "bearer" + assert result.expires_in > 0 + + async def test_login_with_wrong_password_raises_error( + self, + user_repository: InMemoryUserRepository, + role_repository: InMemoryRoleRepository, + password_service: BcryptPasswordService, + jwt_service: JWTService, + event_producer: FakeKafkaProducer, + rate_limiter: FakeLoginRateLimiter, + ) -> None: + # Register user + register_handler = 
RegisterUserHandler(user_repository, password_service, event_producer) + await register_handler.handle( + RegisterUserCommand( + email="dave@example.com", + password="Correct123!", + tenant_id=TENANT_ID, + ) + ) + + # Attempt login with wrong password + login_handler = LoginHandler(user_repository, role_repository, password_service, jwt_service, rate_limiter) + with pytest.raises(AuthorizationError, match="Invalid email or password"): + await login_handler.handle( + LoginCommand( + email="dave@example.com", + password="Wrong123!", + tenant_id=TENANT_ID, + ) + ) + + async def test_jwt_decode_returns_matching_user_id( + self, + user_repository: InMemoryUserRepository, + role_repository: InMemoryRoleRepository, + password_service: BcryptPasswordService, + jwt_service: JWTService, + event_producer: FakeKafkaProducer, + rate_limiter: FakeLoginRateLimiter, + ) -> None: + # Register user + register_handler = RegisterUserHandler(user_repository, password_service, event_producer) + user_id = await register_handler.handle( + RegisterUserCommand( + email="eve@example.com", + password="Token123!", + tenant_id=TENANT_ID, + ) + ) + + # Login + login_handler = LoginHandler(user_repository, role_repository, password_service, jwt_service, rate_limiter) + result = await login_handler.handle( + LoginCommand( + email="eve@example.com", + password="Token123!", + tenant_id=TENANT_ID, + ) + ) + + # Decode access token and verify user_id + payload = jwt_service.decode_token(result.access_token) + assert payload["sub"] == str(user_id) + assert payload["tenant_id"] == str(TENANT_ID) + assert payload["type"] == "access" + + +class TestRolePermissions: + async def test_assign_role_with_permissions_grants_access( + self, + user_repository: InMemoryUserRepository, + role_repository: InMemoryRoleRepository, + password_service: BcryptPasswordService, + event_producer: FakeKafkaProducer, + ) -> None: + # Register user + register_handler = RegisterUserHandler(user_repository, password_service, event_producer) + user_id = await register_handler.handle( + RegisterUserCommand( + email="frank@example.com", + password="Role123!", + tenant_id=TENANT_ID, + ) + ) + + # Create role with permission + create_role_handler = CreateRoleHandler(role_repository) + role_id = await create_role_handler.handle( + CreateRoleCommand( + name="editor", + tenant_id=TENANT_ID, + permissions=[{"object_type": "prefix", "actions": ["view", "add", "change"]}], + ) + ) + + # Assign role to user + assign_handler = AssignRoleHandler(user_repository, role_repository, event_producer) + await assign_handler.handle(AssignRoleCommand(user_id=user_id, role_id=role_id)) + + # Check permission + user = await user_repository.find_by_id(user_id) + roles = await role_repository.find_by_ids(user.role_ids) + + checker = PermissionChecker() + assert checker.has_permission(user, roles, [], "prefix", "view") is True + assert checker.has_permission(user, roles, [], "prefix", "add") is True + assert checker.has_permission(user, roles, [], "prefix", "change") is True + + async def test_unassigned_permission_is_denied( + self, + user_repository: InMemoryUserRepository, + role_repository: InMemoryRoleRepository, + password_service: BcryptPasswordService, + event_producer: FakeKafkaProducer, + ) -> None: + # Register user + register_handler = RegisterUserHandler(user_repository, password_service, event_producer) + user_id = await register_handler.handle( + RegisterUserCommand( + email="grace@example.com", + password="Perm123!", + tenant_id=TENANT_ID, + ) + ) + + # Create role 
with limited permissions + create_role_handler = CreateRoleHandler(role_repository) + role_id = await create_role_handler.handle( + CreateRoleCommand( + name="viewer", + tenant_id=TENANT_ID, + permissions=[{"object_type": "prefix", "actions": ["view"]}], + ) + ) + + # Assign role + assign_handler = AssignRoleHandler(user_repository, role_repository, event_producer) + await assign_handler.handle(AssignRoleCommand(user_id=user_id, role_id=role_id)) + + # Check that "delete" action is denied + user = await user_repository.find_by_id(user_id) + roles = await role_repository.find_by_ids(user.role_ids) + + checker = PermissionChecker() + assert checker.has_permission(user, roles, [], "prefix", "delete") is False + # Also check a completely unrelated object_type + assert checker.has_permission(user, roles, [], "vlan", "view") is False + + +class TestTokenLifecycle: + async def test_access_token_has_correct_claims( + self, + jwt_service: JWTService, + ) -> None: + user_id = uuid4() + tenant_id = uuid4() + + token = jwt_service.create_access_token( + user_id=user_id, + tenant_id=tenant_id, + roles=["admin"], + ) + + payload = jwt_service.decode_token(token) + assert payload["sub"] == str(user_id) + assert payload["tenant_id"] == str(tenant_id) + assert payload["type"] == "access" + assert payload["roles"] == ["admin"] + assert "exp" in payload + assert "iat" in payload + assert "jti" in payload + + async def test_refresh_token_has_refresh_type( + self, + jwt_service: JWTService, + ) -> None: + user_id = uuid4() + tenant_id = uuid4() + + token = jwt_service.create_refresh_token( + user_id=user_id, + tenant_id=tenant_id, + ) + + payload = jwt_service.decode_token(token) + assert payload["sub"] == str(user_id) + assert payload["tenant_id"] == str(tenant_id) + assert payload["type"] == "refresh" + assert "roles" not in payload + assert "exp" in payload + assert "iat" in payload + assert "jti" in payload diff --git a/services/event/Dockerfile b/services/event/Dockerfile new file mode 100644 index 0000000..4c2fe70 --- /dev/null +++ b/services/event/Dockerfile @@ -0,0 +1,5 @@ +FROM cmdb-base:prod +COPY services/event/ ./services/event/ +RUN uv sync --frozen --no-dev --package cmdb-event +EXPOSE 8000 +CMD ["uv", "run", "--package", "cmdb-event", "uvicorn", "event.interface.main:app", "--host", "0.0.0.0", "--port", "8000"] diff --git a/services/event/Dockerfile.dev b/services/event/Dockerfile.dev new file mode 100644 index 0000000..0222562 --- /dev/null +++ b/services/event/Dockerfile.dev @@ -0,0 +1,8 @@ +FROM cmdb-base:dev +COPY services/event/pyproject.toml ./services/event/pyproject.toml +COPY services/event/alembic.ini ./services/event/alembic.ini +COPY services/event/alembic ./services/event/alembic +COPY services/event/src ./services/event/src +RUN uv sync --frozen --package cmdb-event +EXPOSE 8000 +CMD ["uv", "run", "--package", "cmdb-event", "uvicorn", "event.interface.main:app", "--host", "0.0.0.0", "--port", "8000", "--reload"] diff --git a/services/event/alembic.ini b/services/event/alembic.ini new file mode 100644 index 0000000..c64669b --- /dev/null +++ b/services/event/alembic.ini @@ -0,0 +1,35 @@ +[alembic] +script_location = alembic +sqlalchemy.url = postgresql+asyncpg://cmdb:cmdb@postgres:5432/cmdb_event + +[loggers] +keys = root,sqlalchemy,alembic + +[handlers] +keys = console + +[formatters] +keys = generic + +[logger_root] +level = WARN +handlers = console + +[logger_sqlalchemy] +level = WARN +handlers = +qualname = sqlalchemy.engine + +[logger_alembic] +level = INFO +handlers = 
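+# handlers left empty on purpose: alembic log records propagate to the root logger's console handler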
+qualname = alembic + +[handler_console] +class = StreamHandler +args = (sys.stderr,) +level = NOTSET +formatter = generic + +[formatter_generic] +format = %(levelname)-5.5s [%(name)s] %(message)s diff --git a/services/event/alembic/env.py b/services/event/alembic/env.py new file mode 100644 index 0000000..d4f5e88 --- /dev/null +++ b/services/event/alembic/env.py @@ -0,0 +1,61 @@ +import asyncio +import os +from logging.config import fileConfig + +from alembic import context +from event.infrastructure.models import EventBase +from sqlalchemy import pool +from sqlalchemy.ext.asyncio import async_engine_from_config + +config = context.config + +# Override DB URL from environment variable if set +db_url = os.environ.get("DATABASE_URL") +if db_url: + config.set_main_option("sqlalchemy.url", db_url) + +if config.config_file_name is not None: + fileConfig(config.config_file_name) + +target_metadata = EventBase.metadata + + +def run_migrations_offline() -> None: + url = config.get_main_option("sqlalchemy.url") + context.configure( + url=url, + target_metadata=target_metadata, + literal_binds=True, + ) + with context.begin_transaction(): + context.run_migrations() + + +def do_run_migrations(connection) -> None: + context.configure( + connection=connection, + target_metadata=target_metadata, + ) + with context.begin_transaction(): + context.run_migrations() + + +async def run_async_migrations() -> None: + connectable = async_engine_from_config( + config.get_section(config.config_ini_section, {}), + prefix="sqlalchemy.", + poolclass=pool.NullPool, + ) + async with connectable.connect() as connection: + await connection.run_sync(do_run_migrations) + await connectable.dispose() + + +def run_migrations_online() -> None: + asyncio.run(run_async_migrations()) + + +if context.is_offline_mode(): + run_migrations_offline() +else: + run_migrations_online() diff --git a/services/event/alembic/versions/001_create_event_tables.py b/services/event/alembic/versions/001_create_event_tables.py new file mode 100644 index 0000000..594e81e --- /dev/null +++ b/services/event/alembic/versions/001_create_event_tables.py @@ -0,0 +1,59 @@ +"""create event tables + +Revision ID: 001 +Revises: +Create Date: 2026-03-19 +""" + +import sqlalchemy as sa +from alembic import op +from sqlalchemy.dialects import postgresql + +revision = "001" +down_revision = None +branch_labels = None +depends_on = None + + +def upgrade() -> None: + # Stored Events + op.create_table( + "stored_events", + sa.Column("id", sa.Integer(), autoincrement=True, nullable=False), + sa.Column("aggregate_id", sa.Uuid(), nullable=False), + sa.Column("aggregate_type", sa.String(255), nullable=False), + sa.Column("event_type", sa.Text(), nullable=False), + sa.Column("version", sa.Integer(), nullable=False), + sa.Column("payload", postgresql.JSONB(), server_default="{}", nullable=False), + sa.Column("timestamp", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.PrimaryKeyConstraint("id"), + sa.UniqueConstraint("aggregate_id", "version", name="uq_event_aggregate_version"), + ) + op.create_index("ix_stored_events_aggregate_id", "stored_events", ["aggregate_id"]) + op.create_index("ix_stored_events_aggregate_type", "stored_events", ["aggregate_type"]) + op.create_index("ix_stored_events_timestamp", "stored_events", ["timestamp"]) + + # Change Logs + op.create_table( + "change_logs", + sa.Column("id", sa.Integer(), autoincrement=True, nullable=False), + sa.Column("aggregate_id", sa.Uuid(), nullable=False), + sa.Column("aggregate_type", 
sa.String(255), nullable=False), + sa.Column("action", sa.String(50), nullable=False), + sa.Column("event_type", sa.Text(), nullable=False), + sa.Column("user_id", sa.Uuid(), nullable=True), + sa.Column("tenant_id", sa.Uuid(), nullable=True), + sa.Column("correlation_id", sa.String(255), nullable=True), + sa.Column("timestamp", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.PrimaryKeyConstraint("id"), + ) + op.create_index("ix_change_logs_aggregate_id", "change_logs", ["aggregate_id"]) + op.create_index("ix_change_logs_aggregate_type", "change_logs", ["aggregate_type"]) + op.create_index("ix_change_logs_user_id", "change_logs", ["user_id"]) + op.create_index("ix_change_logs_tenant_id", "change_logs", ["tenant_id"]) + op.create_index("ix_change_logs_timestamp", "change_logs", ["timestamp"]) + + +def downgrade() -> None: + op.drop_table("change_logs") + op.drop_table("stored_events") diff --git a/services/event/alembic/versions/002_add_journal_entries.py b/services/event/alembic/versions/002_add_journal_entries.py new file mode 100644 index 0000000..5639e63 --- /dev/null +++ b/services/event/alembic/versions/002_add_journal_entries.py @@ -0,0 +1,35 @@ +"""add journal entries + +Revision ID: 002 +Revises: 001 +Create Date: 2026-03-22 +""" + +import sqlalchemy as sa +from alembic import op + +revision = "002" +down_revision = "001" +branch_labels = None +depends_on = None + + +def upgrade() -> None: + op.create_table( + "journal_entries", + sa.Column("id", sa.Uuid(), nullable=False), + sa.Column("object_type", sa.String(100), nullable=False), + sa.Column("object_id", sa.Uuid(), nullable=False), + sa.Column("entry_type", sa.String(20), nullable=False), + sa.Column("comment", sa.Text(), nullable=False), + sa.Column("user_id", sa.Uuid(), nullable=True), + sa.Column("tenant_id", sa.Uuid(), nullable=True), + sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.PrimaryKeyConstraint("id"), + ) + op.create_index("ix_journal_object", "journal_entries", ["object_type", "object_id"]) + op.create_index("ix_journal_tenant", "journal_entries", ["tenant_id"]) + + +def downgrade() -> None: + op.drop_table("journal_entries") diff --git a/services/event/docker-compose.dev.yml b/services/event/docker-compose.dev.yml new file mode 100644 index 0000000..47006c5 --- /dev/null +++ b/services/event/docker-compose.dev.yml @@ -0,0 +1,26 @@ +services: + event: + build: + context: ../../ + dockerfile: services/event/Dockerfile.dev + volumes: + - ../../services/event/src:/app/services/event/src + ports: + - "8004:8000" + environment: + DATABASE_URL: postgresql+asyncpg://${POSTGRES_USER:-cmdb}:${POSTGRES_PASSWORD:-cmdb}@postgres:5432/cmdb_event + KAFKA_BOOTSTRAP_SERVERS: kafka:9092 + REDIS_URL: redis://redis:6379 + depends_on: + postgres: + condition: service_healthy + kafka: + condition: service_healthy + redis: + condition: service_healthy + networks: + - cmdb-network + +networks: + cmdb-network: + external: true diff --git a/services/event/docker-compose.yml b/services/event/docker-compose.yml new file mode 100644 index 0000000..c4bf3f1 --- /dev/null +++ b/services/event/docker-compose.yml @@ -0,0 +1,18 @@ +services: + event: + build: + context: ../../ + dockerfile: services/event/Dockerfile + environment: + DATABASE_URL: postgresql+asyncpg://${POSTGRES_USER:-cmdb}:${POSTGRES_PASSWORD:-cmdb}@postgres:5432/cmdb_event + KAFKA_BOOTSTRAP_SERVERS: kafka:9092 + REDIS_URL: redis://redis:6379 + depends_on: + postgres: + condition: 
service_healthy
+      kafka:
+        condition: service_healthy
+      redis:
+        condition: service_healthy
+    networks:
+      - cmdb-network
diff --git a/services/event/pyproject.toml b/services/event/pyproject.toml
new file mode 100644
index 0000000..ea1a3be
--- /dev/null
+++ b/services/event/pyproject.toml
@@ -0,0 +1,25 @@
+[project]
+name = "cmdb-event"
+version = "0.1.0"
+description = "CMDB Event Service"
+requires-python = ">=3.13"
+dependencies = [
+    "cmdb-shared",
+    "fastapi>=0.115",
+    "uvicorn",
+    "sqlalchemy[asyncio]>=2.0",
+    "asyncpg",
+    "alembic",
+    "pydantic-settings>=2.0",
+    "aiokafka",
+]
+
+[build-system]
+requires = ["hatchling"]
+build-backend = "hatchling.build"
+
+[tool.hatch.build.targets.wheel]
+packages = ["src/event"]
+
+[tool.uv.sources]
+cmdb-shared = { workspace = true }
diff --git a/services/event/src/event/__init__.py b/services/event/src/event/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/services/event/src/event/application/__init__.py b/services/event/src/event/application/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/services/event/src/event/domain/__init__.py b/services/event/src/event/domain/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/services/event/src/event/domain/models.py b/services/event/src/event/domain/models.py
new file mode 100644
index 0000000..3eb79be
--- /dev/null
+++ b/services/event/src/event/domain/models.py
@@ -0,0 +1,26 @@
+from datetime import datetime
+from uuid import UUID
+
+from pydantic import BaseModel
+
+
+class StoredEvent(BaseModel):
+    id: int | None = None
+    aggregate_id: UUID
+    aggregate_type: str
+    event_type: str
+    version: int
+    payload: dict
+    timestamp: datetime
+
+
+class ChangeLogEntry(BaseModel):
+    id: int | None = None
+    aggregate_id: UUID
+    aggregate_type: str
+    action: str
+    event_type: str
+    user_id: UUID | None = None
+    tenant_id: UUID | None = None
+    correlation_id: str | None = None
+    timestamp: datetime
diff --git a/services/event/src/event/infrastructure/__init__.py b/services/event/src/event/infrastructure/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/services/event/src/event/infrastructure/changelog_repository.py b/services/event/src/event/infrastructure/changelog_repository.py
new file mode 100644
index 0000000..2639c9a
--- /dev/null
+++ b/services/event/src/event/infrastructure/changelog_repository.py
@@ -0,0 +1,84 @@
+from datetime import UTC, datetime
+from uuid import UUID
+
+from sqlalchemy import func as sa_func
+from sqlalchemy import select
+from sqlalchemy.ext.asyncio import AsyncSession
+
+from event.infrastructure.models import ChangeLogModel
+
+
+class ChangeLogRepository:
+    def __init__(self, session: AsyncSession) -> None:
+        self._session = session
+
+    async def create(self, entry_data: dict) -> ChangeLogModel:
+        model = ChangeLogModel(
+            aggregate_id=entry_data["aggregate_id"],
+            aggregate_type=entry_data["aggregate_type"],
+            action=entry_data["action"],
+            event_type=entry_data["event_type"],
+            user_id=entry_data.get("user_id"),
+            tenant_id=entry_data.get("tenant_id"),
+            correlation_id=entry_data.get("correlation_id"),
+            # Keep the fallback timezone-aware to match the timestamptz
+            # column and the UTC timestamps produced by the consumer.
+            timestamp=entry_data.get("timestamp", datetime.now(UTC)),
+        )
+        self._session.add(model)
+        await self._session.commit()
+        return model
+
+    async def find_by_aggregate(
+        self,
+        aggregate_id: UUID,
+        *,
+        offset: int = 0,
+        limit: int = 50,
+    ) -> tuple[list[ChangeLogModel], int]:
+        count_stmt = (
+            select(sa_func.count()).select_from(ChangeLogModel).where(ChangeLogModel.aggregate_id == aggregate_id)
+        )
+
total = (await self._session.execute(count_stmt)).scalar_one() + + stmt = ( + select(ChangeLogModel) + .where(ChangeLogModel.aggregate_id == aggregate_id) + .order_by(ChangeLogModel.timestamp.desc()) + .offset(offset) + .limit(limit) + ) + result = await self._session.execute(stmt) + return list(result.scalars().all()), total + + async def find_all( + self, + *, + aggregate_type: str | None = None, + tenant_id: UUID | None = None, + user_id: UUID | None = None, + from_timestamp: datetime | None = None, + offset: int = 0, + limit: int = 50, + ) -> tuple[list[ChangeLogModel], int]: + base = select(ChangeLogModel) + count_base = select(sa_func.count()).select_from(ChangeLogModel) + + if aggregate_type: + base = base.where(ChangeLogModel.aggregate_type == aggregate_type) + count_base = count_base.where(ChangeLogModel.aggregate_type == aggregate_type) + if tenant_id: + base = base.where(ChangeLogModel.tenant_id == tenant_id) + count_base = count_base.where(ChangeLogModel.tenant_id == tenant_id) + if user_id: + base = base.where(ChangeLogModel.user_id == user_id) + count_base = count_base.where(ChangeLogModel.user_id == user_id) + if from_timestamp: + base = base.where(ChangeLogModel.timestamp >= from_timestamp) + count_base = count_base.where(ChangeLogModel.timestamp >= from_timestamp) + + total = (await self._session.execute(count_base)).scalar_one() + + stmt = base.order_by(ChangeLogModel.timestamp.desc()).offset(offset).limit(limit) + result = await self._session.execute(stmt) + return list(result.scalars().all()), total diff --git a/services/event/src/event/infrastructure/config.py b/services/event/src/event/infrastructure/config.py new file mode 100644 index 0000000..bb8336d --- /dev/null +++ b/services/event/src/event/infrastructure/config.py @@ -0,0 +1,8 @@ +from pydantic_settings import BaseSettings + + +class Settings(BaseSettings): + database_url: str = "postgresql+asyncpg://cmdb:cmdb@postgres:5432/cmdb_event" + kafka_bootstrap_servers: str = "kafka:9092" + kafka_group_id: str = "event-service" + kafka_dlq_topic: str = "events.dlq" diff --git a/services/event/src/event/infrastructure/database.py b/services/event/src/event/infrastructure/database.py new file mode 100644 index 0000000..26744e3 --- /dev/null +++ b/services/event/src/event/infrastructure/database.py @@ -0,0 +1,26 @@ +from sqlalchemy.ext.asyncio import ( + AsyncEngine, + AsyncSession, + async_sessionmaker, + create_async_engine, +) + + +class Database: + def __init__(self, url: str) -> None: + self._engine: AsyncEngine = create_async_engine(url, echo=False, pool_size=5) + self._session_factory = async_sessionmaker( + self._engine, + class_=AsyncSession, + expire_on_commit=False, + ) + + @property + def engine(self) -> AsyncEngine: + return self._engine + + def session(self) -> AsyncSession: + return self._session_factory() + + async def close(self) -> None: + await self._engine.dispose() diff --git a/services/event/src/event/infrastructure/event_consumer.py b/services/event/src/event/infrastructure/event_consumer.py new file mode 100644 index 0000000..b6ec69a --- /dev/null +++ b/services/event/src/event/infrastructure/event_consumer.py @@ -0,0 +1,177 @@ +import json +import logging +import re +from datetime import UTC, datetime +from uuid import UUID + +from aiokafka import AIOKafkaConsumer, AIOKafkaProducer +from sqlalchemy.ext.asyncio import AsyncSession, async_sessionmaker + +from event.infrastructure.changelog_repository import ChangeLogRepository +from event.infrastructure.event_repository import EventRepository + 
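+# Worker design: subscribe to every "*.events" topic, persist each message to
+# the event store, and derive a change-log entry from the event type. Offsets
+# are committed manually, so a message is acknowledged only after it has been
+# stored or parked on the DLQ.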
+logger = logging.getLogger(__name__) + +# Map event name suffix to action +ACTION_PATTERNS = [ + (re.compile(r"Created$"), "created"), + (re.compile(r"Updated$"), "updated"), + (re.compile(r"Changed$"), "updated"), + (re.compile(r"Deleted$"), "deleted"), + (re.compile(r"Suspended$"), "suspended"), + (re.compile(r"Locked$"), "locked"), + (re.compile(r"Assigned$"), "assigned"), + (re.compile(r"Removed$"), "removed"), + (re.compile(r"Revoked$"), "revoked"), + (re.compile(r"Generated$"), "created"), +] + + +def _extract_aggregate_type(event_type: str) -> str: + """Extract aggregate type from event_type string. + + Example: 'auth.domain.events.UserCreated' → 'user' + """ + parts = event_type.rsplit(".", 1) + event_name = parts[-1] if parts else event_type + + # Remove action suffix to get the entity name + for pattern, _ in ACTION_PATTERNS: + match = pattern.search(event_name) + if match: + entity = event_name[: match.start()] + return entity.lower() if entity else "unknown" + + return event_name.lower() + + +def _extract_action(event_type: str) -> str: + """Extract action from event_type string. + + Example: 'auth.domain.events.UserCreated' → 'created' + """ + parts = event_type.rsplit(".", 1) + event_name = parts[-1] if parts else event_type + + for pattern, action in ACTION_PATTERNS: + if pattern.search(event_name): + return action + + return "changed" + + +class EventConsumerWorker: + def __init__( + self, + bootstrap_servers: str, + group_id: str, + session_factory: async_sessionmaker[AsyncSession], + dlq_topic: str = "events.dlq", + ) -> None: + self._bootstrap_servers = bootstrap_servers + self._group_id = group_id + self._session_factory = session_factory + self._dlq_topic = dlq_topic + self._consumer: AIOKafkaConsumer | None = None + self._dlq_producer: AIOKafkaProducer | None = None + self._running = False + + async def start(self) -> None: + self._consumer = AIOKafkaConsumer( + bootstrap_servers=self._bootstrap_servers, + group_id=self._group_id, + enable_auto_commit=False, + ) + self._consumer.subscribe(pattern=r".*\.events") + await self._consumer.start() + + self._dlq_producer = AIOKafkaProducer(bootstrap_servers=self._bootstrap_servers) + await self._dlq_producer.start() + + self._running = True + logger.info("Event consumer started (pattern: *.events)") + + async def stop(self) -> None: + self._running = False + if self._consumer: + await self._consumer.stop() + if self._dlq_producer: + await self._dlq_producer.stop() + logger.info("Event consumer stopped") + + async def consume(self) -> None: + if self._consumer is None: + raise RuntimeError("Consumer not started") + + async for msg in self._consumer: + if not self._running: + break + try: + await self._process_message(msg) + await self._consumer.commit() + except Exception: + logger.exception("Failed to process message from %s", msg.topic) + await self._send_to_dlq(msg) + await self._consumer.commit() + + async def _process_message(self, msg: object) -> None: + raw = json.loads(msg.value) # type: ignore[attr-defined] + + aggregate_id = raw.get("aggregate_id") + event_type = raw.get("event_type", "") + version = raw.get("version", 0) + timestamp_str = raw.get("timestamp") + + if not aggregate_id or not event_type: + logger.warning("Skipping message with missing aggregate_id or event_type") + return + + timestamp = datetime.fromisoformat(timestamp_str) if timestamp_str else datetime.now(UTC) + + aggregate_type = _extract_aggregate_type(event_type) + action = _extract_action(event_type) + + async with self._session_factory() as 
session:
+            event_repo = EventRepository(session)
+            changelog_repo = ChangeLogRepository(session)
+
+            await event_repo.append(
+                {
+                    "aggregate_id": UUID(aggregate_id),
+                    "aggregate_type": aggregate_type,
+                    "event_type": event_type,
+                    "version": version,
+                    "payload": raw,
+                    "timestamp": timestamp,
+                }
+            )
+
+            await changelog_repo.create(
+                {
+                    "aggregate_id": UUID(aggregate_id),
+                    "aggregate_type": aggregate_type,
+                    "action": action,
+                    "event_type": event_type,
+                    "user_id": _try_uuid(raw.get("user_id")),
+                    "tenant_id": _try_uuid(raw.get("tenant_id")),
+                    "correlation_id": raw.get("correlation_id"),
+                    "timestamp": timestamp,
+                }
+            )
+
+    async def _send_to_dlq(self, msg: object) -> None:
+        if self._dlq_producer:
+            await self._dlq_producer.send_and_wait(
+                self._dlq_topic,
+                value=msg.value,  # type: ignore[attr-defined]
+                key=msg.key,  # type: ignore[attr-defined]
+            )
+
+
+def _try_uuid(value: str | None) -> UUID | None:
+    if not value:
+        return None
+    try:
+        return UUID(value)
+    except (ValueError, AttributeError):
+        return None
diff --git a/services/event/src/event/infrastructure/event_repository.py b/services/event/src/event/infrastructure/event_repository.py
new file mode 100644
index 0000000..204be20
--- /dev/null
+++ b/services/event/src/event/infrastructure/event_repository.py
@@ -0,0 +1,69 @@
+from datetime import UTC, datetime
+from uuid import UUID
+
+from sqlalchemy import func as sa_func
+from sqlalchemy import select
+from sqlalchemy.ext.asyncio import AsyncSession
+
+from event.infrastructure.models import StoredEventModel
+
+
+class EventRepository:
+    def __init__(self, session: AsyncSession) -> None:
+        self._session = session
+
+    async def append(self, event_data: dict) -> StoredEventModel:
+        model = StoredEventModel(
+            aggregate_id=event_data["aggregate_id"],
+            aggregate_type=event_data["aggregate_type"],
+            event_type=event_data["event_type"],
+            version=event_data["version"],
+            payload=event_data["payload"],
+            # Keep the fallback timezone-aware to match the timestamptz
+            # column and the UTC timestamps produced by the consumer.
+            timestamp=event_data.get("timestamp", datetime.now(UTC)),
+        )
+        self._session.add(model)
+        await self._session.commit()
+        return model
+
+    async def find_by_aggregate(
+        self,
+        aggregate_id: UUID,
+        *,
+        from_version: int = 0,
+    ) -> list[StoredEventModel]:
+        stmt = (
+            select(StoredEventModel)
+            .where(
+                StoredEventModel.aggregate_id == aggregate_id,
+                StoredEventModel.version >= from_version,
+            )
+            .order_by(StoredEventModel.version.asc())
+        )
+        result = await self._session.execute(stmt)
+        return list(result.scalars().all())
+
+    async def find_all(
+        self,
+        *,
+        aggregate_type: str | None = None,
+        from_timestamp: datetime | None = None,
+        offset: int = 0,
+        limit: int = 50,
+    ) -> tuple[list[StoredEventModel], int]:
+        base = select(StoredEventModel)
+        count_base = select(sa_func.count()).select_from(StoredEventModel)
+
+        if aggregate_type:
+            base = base.where(StoredEventModel.aggregate_type == aggregate_type)
+            count_base = count_base.where(StoredEventModel.aggregate_type == aggregate_type)
+        if from_timestamp:
+            base = base.where(StoredEventModel.timestamp >= from_timestamp)
+            count_base = count_base.where(StoredEventModel.timestamp >= from_timestamp)
+
+        total = (await self._session.execute(count_base)).scalar_one()
+
+        stmt = base.order_by(StoredEventModel.timestamp.desc()).offset(offset).limit(limit)
+        result = await self._session.execute(stmt)
+        return list(result.scalars().all()), total
diff --git a/services/event/src/event/infrastructure/journal_repository.py b/services/event/src/event/infrastructure/journal_repository.py
new file mode 100644
index 0000000..0c47806
--- /dev/null +++ b/services/event/src/event/infrastructure/journal_repository.py @@ -0,0 +1,63 @@ +from uuid import UUID + +from sqlalchemy import func as sa_func +from sqlalchemy import select +from sqlalchemy.ext.asyncio import AsyncSession + +from event.infrastructure.models import JournalEntryModel + + +class JournalRepository: + def __init__(self, session: AsyncSession) -> None: + self._session = session + + async def create(self, data: dict) -> JournalEntryModel: + model = JournalEntryModel(**data) + self._session.add(model) + await self._session.commit() + return model + + async def find_by_id(self, entry_id: UUID) -> JournalEntryModel | None: + return await self._session.get(JournalEntryModel, entry_id) + + async def find_all( + self, + *, + object_type: str | None = None, + object_id: UUID | None = None, + tenant_id: UUID | None = None, + user_id: UUID | None = None, + entry_type: str | None = None, + offset: int = 0, + limit: int = 50, + ) -> tuple[list[JournalEntryModel], int]: + base = select(JournalEntryModel) + count_base = select(sa_func.count()).select_from(JournalEntryModel) + + if object_type: + base = base.where(JournalEntryModel.object_type == object_type) + count_base = count_base.where(JournalEntryModel.object_type == object_type) + if object_id: + base = base.where(JournalEntryModel.object_id == object_id) + count_base = count_base.where(JournalEntryModel.object_id == object_id) + if tenant_id: + base = base.where(JournalEntryModel.tenant_id == tenant_id) + count_base = count_base.where(JournalEntryModel.tenant_id == tenant_id) + if user_id: + base = base.where(JournalEntryModel.user_id == user_id) + count_base = count_base.where(JournalEntryModel.user_id == user_id) + if entry_type: + base = base.where(JournalEntryModel.entry_type == entry_type) + count_base = count_base.where(JournalEntryModel.entry_type == entry_type) + + total = (await self._session.execute(count_base)).scalar_one() + + stmt = base.order_by(JournalEntryModel.created_at.desc()).offset(offset).limit(limit) + result = await self._session.execute(stmt) + return list(result.scalars().all()), total + + async def delete(self, entry_id: UUID) -> None: + model = await self._session.get(JournalEntryModel, entry_id) + if model is not None: + await self._session.delete(model) + await self._session.commit() diff --git a/services/event/src/event/infrastructure/models.py b/services/event/src/event/infrastructure/models.py new file mode 100644 index 0000000..afc5b3e --- /dev/null +++ b/services/event/src/event/infrastructure/models.py @@ -0,0 +1,57 @@ +from datetime import datetime +from uuid import UUID + +from sqlalchemy import DateTime as SADateTime +from sqlalchemy import Index, Integer, String, Text, UniqueConstraint +from sqlalchemy.dialects.postgresql import JSONB +from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column +from sqlalchemy.sql import func + + +class EventBase(DeclarativeBase): + pass + + +class StoredEventModel(EventBase): + __tablename__ = "stored_events" + __table_args__ = (UniqueConstraint("aggregate_id", "version", name="uq_event_aggregate_version"),) + + id: Mapped[int] = mapped_column(Integer, primary_key=True, autoincrement=True) + aggregate_id: Mapped[UUID] = mapped_column(index=True) + aggregate_type: Mapped[str] = mapped_column(String(255), index=True) + event_type: Mapped[str] = mapped_column(Text) + version: Mapped[int] = mapped_column(Integer) + payload: Mapped[dict] = mapped_column(JSONB, default=dict) + timestamp: Mapped[datetime] = 
mapped_column(SADateTime(timezone=True), server_default=func.now(), index=True) + + +class ChangeLogModel(EventBase): + __tablename__ = "change_logs" + + id: Mapped[int] = mapped_column(Integer, primary_key=True, autoincrement=True) + aggregate_id: Mapped[UUID] = mapped_column(index=True) + aggregate_type: Mapped[str] = mapped_column(String(255), index=True) + action: Mapped[str] = mapped_column(String(50)) + event_type: Mapped[str] = mapped_column(Text) + user_id: Mapped[UUID | None] = mapped_column(nullable=True, index=True) + tenant_id: Mapped[UUID | None] = mapped_column(nullable=True, index=True) + correlation_id: Mapped[str | None] = mapped_column(String(255), nullable=True) + timestamp: Mapped[datetime] = mapped_column(SADateTime(timezone=True), server_default=func.now(), index=True) + + +class JournalEntryModel(EventBase): + __tablename__ = "journal_entries" + + id: Mapped[UUID] = mapped_column(primary_key=True) + object_type: Mapped[str] = mapped_column(String(100)) + object_id: Mapped[UUID] + entry_type: Mapped[str] = mapped_column(String(20)) + comment: Mapped[str] = mapped_column(Text) + user_id: Mapped[UUID | None] = mapped_column(nullable=True) + tenant_id: Mapped[UUID | None] = mapped_column(nullable=True) + created_at: Mapped[datetime] = mapped_column(SADateTime(timezone=True), server_default=func.now()) + + __table_args__ = ( + Index("ix_journal_object", "object_type", "object_id"), + Index("ix_journal_tenant", "tenant_id"), + ) diff --git a/services/event/src/event/interface/__init__.py b/services/event/src/event/interface/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/services/event/src/event/interface/main.py b/services/event/src/event/interface/main.py new file mode 100644 index 0000000..732bf59 --- /dev/null +++ b/services/event/src/event/interface/main.py @@ -0,0 +1,77 @@ +import asyncio +import contextlib +import logging +from collections.abc import AsyncGenerator +from contextlib import asynccontextmanager + +from fastapi import FastAPI +from fastapi.middleware.cors import CORSMiddleware +from shared.api.errors import domain_exception_handler +from shared.api.middleware import CorrelationIdMiddleware, UserMiddleware +from shared.domain.exceptions import DomainError +from sqlalchemy.ext.asyncio import AsyncSession, async_sessionmaker + +from event.infrastructure.config import Settings +from event.infrastructure.database import Database +from event.infrastructure.event_consumer import EventConsumerWorker +from event.interface.router import router + +logger = logging.getLogger(__name__) + + +@asynccontextmanager +async def lifespan(app: FastAPI) -> AsyncGenerator[None]: + settings = Settings() + + database = Database(settings.database_url) + + session_factory = async_sessionmaker( + database.engine, + class_=AsyncSession, + expire_on_commit=False, + ) + + consumer = EventConsumerWorker( + bootstrap_servers=settings.kafka_bootstrap_servers, + group_id=settings.kafka_group_id, + session_factory=session_factory, + dlq_topic=settings.kafka_dlq_topic, + ) + await consumer.start() + + consumer_task = asyncio.create_task(consumer.consume()) + + app.state.database = database + app.state.settings = settings + app.state.consumer = consumer + + yield + + await consumer.stop() + consumer_task.cancel() + with contextlib.suppress(asyncio.CancelledError): + await consumer_task + await database.close() + + +def create_app() -> FastAPI: + app = FastAPI(title="CMDB Event Service", lifespan=lifespan) + app.add_middleware( + CORSMiddleware, + 
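+        # CORS is pinned to the local dev frontend origin.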
allow_origins=["http://localhost:3000"], + allow_methods=["*"], + allow_headers=["*"], + ) + app.add_middleware(UserMiddleware) + app.add_middleware(CorrelationIdMiddleware) + app.add_exception_handler(DomainError, domain_exception_handler) + app.include_router(router) + + @app.get("/health", include_in_schema=False) + async def health() -> dict: + return {"status": "ok"} + + return app + + +app = create_app() diff --git a/services/event/src/event/interface/router.py b/services/event/src/event/interface/router.py new file mode 100644 index 0000000..79f5d7f --- /dev/null +++ b/services/event/src/event/interface/router.py @@ -0,0 +1,248 @@ +from datetime import datetime +from uuid import UUID, uuid4 + +from fastapi import APIRouter, Depends, HTTPException, Query, Request, status +from shared.api.pagination import OffsetParams +from sqlalchemy.ext.asyncio import AsyncSession + +from event.infrastructure.changelog_repository import ChangeLogRepository +from event.infrastructure.event_repository import EventRepository +from event.infrastructure.journal_repository import JournalRepository +from event.interface.schemas import ( + ChangeLogListResponse, + ChangeLogResponse, + CreateJournalEntryRequest, + EventListResponse, + JournalEntryListResponse, + JournalEntryResponse, + StoredEventResponse, +) + +router = APIRouter(tags=["events"]) + + +def _get_session(request: Request) -> AsyncSession: + return request.app.state.database.session() + + +# ============================================================================= +# Events (Event Stream) +# ============================================================================= + + +@router.get("/events/stream/{aggregate_id}", response_model=list[StoredEventResponse]) +async def get_event_stream( + aggregate_id: UUID, + from_version: int = Query(0, ge=0), + session: AsyncSession = Depends(_get_session), # noqa: B008 +) -> list[StoredEventResponse]: + repo = EventRepository(session) + events = await repo.find_by_aggregate(aggregate_id, from_version=from_version) + return [ + StoredEventResponse( + id=e.id, + aggregate_id=e.aggregate_id, + aggregate_type=e.aggregate_type, + event_type=e.event_type, + version=e.version, + payload=e.payload, + timestamp=e.timestamp, + ) + for e in events + ] + + +@router.get("/events", response_model=EventListResponse) +async def list_events( + aggregate_type: str | None = None, + from_timestamp: datetime | None = None, + params: OffsetParams = Depends(), # noqa: B008 + session: AsyncSession = Depends(_get_session), # noqa: B008 +) -> EventListResponse: + repo = EventRepository(session) + events, total = await repo.find_all( + aggregate_type=aggregate_type, + from_timestamp=from_timestamp, + offset=params.offset, + limit=params.limit, + ) + return EventListResponse( + items=[ + StoredEventResponse( + id=e.id, + aggregate_id=e.aggregate_id, + aggregate_type=e.aggregate_type, + event_type=e.event_type, + version=e.version, + payload=e.payload, + timestamp=e.timestamp, + ) + for e in events + ], + total=total, + offset=params.offset, + limit=params.limit, + ) + + +# ============================================================================= +# Change Log +# ============================================================================= + + +@router.get("/changelog/{aggregate_id}", response_model=ChangeLogListResponse) +async def get_changelog_by_aggregate( + aggregate_id: UUID, + params: OffsetParams = Depends(), # noqa: B008 + session: AsyncSession = Depends(_get_session), # noqa: B008 +) -> ChangeLogListResponse: + repo 
= ChangeLogRepository(session) + entries, total = await repo.find_by_aggregate(aggregate_id, offset=params.offset, limit=params.limit) + return ChangeLogListResponse( + items=[ + ChangeLogResponse( + id=e.id, + aggregate_id=e.aggregate_id, + aggregate_type=e.aggregate_type, + action=e.action, + event_type=e.event_type, + user_id=e.user_id, + tenant_id=e.tenant_id, + correlation_id=e.correlation_id, + timestamp=e.timestamp, + ) + for e in entries + ], + total=total, + offset=params.offset, + limit=params.limit, + ) + + +@router.get("/changelog", response_model=ChangeLogListResponse) +async def list_changelog( + aggregate_type: str | None = None, + tenant_id: UUID | None = None, + user_id: UUID | None = None, + from_timestamp: datetime | None = None, + params: OffsetParams = Depends(), # noqa: B008 + session: AsyncSession = Depends(_get_session), # noqa: B008 +) -> ChangeLogListResponse: + repo = ChangeLogRepository(session) + entries, total = await repo.find_all( + aggregate_type=aggregate_type, + tenant_id=tenant_id, + user_id=user_id, + from_timestamp=from_timestamp, + offset=params.offset, + limit=params.limit, + ) + return ChangeLogListResponse( + items=[ + ChangeLogResponse( + id=e.id, + aggregate_id=e.aggregate_id, + aggregate_type=e.aggregate_type, + action=e.action, + event_type=e.event_type, + user_id=e.user_id, + tenant_id=e.tenant_id, + correlation_id=e.correlation_id, + timestamp=e.timestamp, + ) + for e in entries + ], + total=total, + offset=params.offset, + limit=params.limit, + ) + + +# ============================================================================= +# Journal Entries +# ============================================================================= + + +def _journal_response(e) -> JournalEntryResponse: # type: ignore[no-untyped-def] + return JournalEntryResponse( + id=e.id, + object_type=e.object_type, + object_id=e.object_id, + entry_type=e.entry_type, + comment=e.comment, + user_id=e.user_id, + tenant_id=e.tenant_id, + created_at=e.created_at, + ) + + +@router.post("/journal-entries", response_model=JournalEntryResponse, status_code=status.HTTP_201_CREATED) +async def create_journal_entry( + body: CreateJournalEntryRequest, + request: Request, + session: AsyncSession = Depends(_get_session), # noqa: B008 +) -> JournalEntryResponse: + user_id_str = getattr(request.state, "user_id", None) + user_id = UUID(user_id_str) if user_id_str else None + tenant_id_str = request.headers.get("X-Tenant-ID") + tenant_id = UUID(tenant_id_str) if tenant_id_str else None + + repo = JournalRepository(session) + entry = await repo.create( + { + "id": uuid4(), + "object_type": body.object_type, + "object_id": body.object_id, + "entry_type": body.entry_type, + "comment": body.comment, + "user_id": user_id, + "tenant_id": tenant_id, + } + ) + return _journal_response(entry) + + +@router.get("/journal-entries", response_model=JournalEntryListResponse) +async def list_journal_entries( + object_type: str | None = None, + object_id: UUID | None = None, + tenant_id: UUID | None = None, + user_id: UUID | None = None, + entry_type: str | None = None, + params: OffsetParams = Depends(), # noqa: B008 + session: AsyncSession = Depends(_get_session), # noqa: B008 +) -> JournalEntryListResponse: + repo = JournalRepository(session) + entries, total = await repo.find_all( + object_type=object_type, + object_id=object_id, + tenant_id=tenant_id, + user_id=user_id, + entry_type=entry_type, + offset=params.offset, + limit=params.limit, + ) + return JournalEntryListResponse( + 
items=[_journal_response(e) for e in entries], + total=total, + offset=params.offset, + limit=params.limit, + ) + + +@router.delete("/journal-entries/{entry_id}", status_code=status.HTTP_204_NO_CONTENT) +async def delete_journal_entry( + entry_id: UUID, + request: Request, + session: AsyncSession = Depends(_get_session), # noqa: B008 +) -> None: + repo = JournalRepository(session) + entry = await repo.find_by_id(entry_id) + if entry is None: + raise HTTPException(status_code=404, detail=f"JournalEntry {entry_id} not found") + + current_user = getattr(request.state, "user_id", None) + if entry.user_id is not None and current_user is not None and str(entry.user_id) != current_user: + raise HTTPException(status_code=403, detail="Only the author can delete this journal entry") + + await repo.delete(entry_id) diff --git a/services/event/src/event/interface/schemas.py b/services/event/src/event/interface/schemas.py new file mode 100644 index 0000000..9d2f692 --- /dev/null +++ b/services/event/src/event/interface/schemas.py @@ -0,0 +1,69 @@ +from datetime import datetime +from typing import Literal +from uuid import UUID + +from pydantic import BaseModel + + +class StoredEventResponse(BaseModel): + id: int + aggregate_id: UUID + aggregate_type: str + event_type: str + version: int + payload: dict + timestamp: datetime + + +class EventListResponse(BaseModel): + items: list[StoredEventResponse] + total: int + offset: int + limit: int + + +class ChangeLogResponse(BaseModel): + id: int + aggregate_id: UUID + aggregate_type: str + action: str + event_type: str + user_id: UUID | None + tenant_id: UUID | None + correlation_id: str | None + timestamp: datetime + + +class ChangeLogListResponse(BaseModel): + items: list[ChangeLogResponse] + total: int + offset: int + limit: int + + +# --- Journal Entry --- + + +class CreateJournalEntryRequest(BaseModel): + object_type: str + object_id: UUID + entry_type: Literal["info", "success", "warning", "danger"] + comment: str + + +class JournalEntryResponse(BaseModel): + id: UUID + object_type: str + object_id: UUID + entry_type: str + comment: str + user_id: UUID | None + tenant_id: UUID | None + created_at: datetime + + +class JournalEntryListResponse(BaseModel): + items: list[JournalEntryResponse] + total: int + offset: int + limit: int diff --git a/services/event/tests/__init__.py b/services/event/tests/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/services/event/tests/test_journal.py b/services/event/tests/test_journal.py new file mode 100644 index 0000000..2f6e97f --- /dev/null +++ b/services/event/tests/test_journal.py @@ -0,0 +1,115 @@ +"""Tests for journal entry schemas and model.""" + +from datetime import UTC, datetime +from uuid import uuid4 + +from event.infrastructure.models import JournalEntryModel +from event.interface.schemas import ( + CreateJournalEntryRequest, + JournalEntryListResponse, + JournalEntryResponse, +) + + +class TestCreateJournalEntryRequest: + def test_valid_entry_types(self) -> None: + for entry_type in ("info", "success", "warning", "danger"): + req = CreateJournalEntryRequest( + object_type="prefix", + object_id=uuid4(), + entry_type=entry_type, + comment="Test note", + ) + assert req.entry_type == entry_type + + def test_invalid_entry_type_rejected(self) -> None: + import pytest + + with pytest.raises(Exception): # noqa: B017 + CreateJournalEntryRequest( + object_type="prefix", + object_id=uuid4(), + entry_type="invalid", + comment="Test", + ) + + def test_fields(self) -> None: + oid = uuid4() + req = 
CreateJournalEntryRequest( + object_type="vrf", + object_id=oid, + entry_type="warning", + comment="Check this VRF config", + ) + assert req.object_type == "vrf" + assert req.object_id == oid + assert req.comment == "Check this VRF config" + + +class TestJournalEntryResponse: + def test_from_dict(self) -> None: + now = datetime.now(UTC) + uid = uuid4() + tid = uuid4() + resp = JournalEntryResponse( + id=uuid4(), + object_type="prefix", + object_id=uuid4(), + entry_type="info", + comment="Allocated for DC-1", + user_id=uid, + tenant_id=tid, + created_at=now, + ) + assert resp.entry_type == "info" + assert resp.user_id == uid + assert resp.tenant_id == tid + + def test_nullable_user_and_tenant(self) -> None: + resp = JournalEntryResponse( + id=uuid4(), + object_type="ip_address", + object_id=uuid4(), + entry_type="danger", + comment="Conflicting assignment", + user_id=None, + tenant_id=None, + created_at=datetime.now(UTC), + ) + assert resp.user_id is None + assert resp.tenant_id is None + + +class TestJournalEntryListResponse: + def test_empty(self) -> None: + resp = JournalEntryListResponse(items=[], total=0, offset=0, limit=50) + assert resp.items == [] + assert resp.total == 0 + + def test_with_items(self) -> None: + items = [ + JournalEntryResponse( + id=uuid4(), + object_type="prefix", + object_id=uuid4(), + entry_type="info", + comment=f"Note {i}", + user_id=uuid4(), + tenant_id=uuid4(), + created_at=datetime.now(UTC), + ) + for i in range(3) + ] + resp = JournalEntryListResponse(items=items, total=3, offset=0, limit=50) + assert len(resp.items) == 3 + assert resp.total == 3 + + +class TestJournalEntryModel: + def test_tablename(self) -> None: + assert JournalEntryModel.__tablename__ == "journal_entries" + + def test_table_args_has_indexes(self) -> None: + index_names = [idx.name for idx in JournalEntryModel.__table_args__ if hasattr(idx, "name")] + assert "ix_journal_object" in index_names + assert "ix_journal_tenant" in index_names diff --git a/services/ipam/Dockerfile b/services/ipam/Dockerfile new file mode 100644 index 0000000..fa11a68 --- /dev/null +++ b/services/ipam/Dockerfile @@ -0,0 +1,5 @@ +FROM cmdb-base:prod +COPY services/ipam/ ./services/ipam/ +RUN uv sync --frozen --no-dev --package cmdb-ipam +EXPOSE 8000 +CMD ["uv", "run", "--package", "cmdb-ipam", "uvicorn", "ipam.interface.main:app", "--host", "0.0.0.0", "--port", "8000"] diff --git a/services/ipam/Dockerfile.dev b/services/ipam/Dockerfile.dev new file mode 100644 index 0000000..3b3babe --- /dev/null +++ b/services/ipam/Dockerfile.dev @@ -0,0 +1,8 @@ +FROM cmdb-base:dev +COPY services/ipam/pyproject.toml ./services/ipam/pyproject.toml +COPY services/ipam/alembic.ini ./services/ipam/alembic.ini +COPY services/ipam/alembic ./services/ipam/alembic +COPY services/ipam/src ./services/ipam/src +RUN uv sync --frozen --package cmdb-ipam +EXPOSE 8000 +CMD ["uv", "run", "--package", "cmdb-ipam", "uvicorn", "ipam.interface.main:app", "--host", "0.0.0.0", "--port", "8000", "--reload"] diff --git a/services/ipam/alembic.ini b/services/ipam/alembic.ini new file mode 100644 index 0000000..9c62402 --- /dev/null +++ b/services/ipam/alembic.ini @@ -0,0 +1,35 @@ +[alembic] +script_location = alembic +sqlalchemy.url = postgresql+asyncpg://cmdb:cmdb@postgres:5432/cmdb_ipam + +[loggers] +keys = root,sqlalchemy,alembic + +[handlers] +keys = console + +[formatters] +keys = generic + +[logger_root] +level = WARN +handlers = console + +[logger_sqlalchemy] +level = WARN +handlers = +qualname = sqlalchemy.engine + +[logger_alembic] +level = 
INFO
+handlers =
+qualname = alembic
+
+[handler_console]
+class = StreamHandler
+args = (sys.stderr,)
+level = NOTSET
+formatter = generic
+
+[formatter_generic]
+format = %(levelname)-5.5s [%(name)s] %(message)s
diff --git a/services/ipam/alembic/env.py b/services/ipam/alembic/env.py
new file mode 100644
index 0000000..f9fc567
--- /dev/null
+++ b/services/ipam/alembic/env.py
@@ -0,0 +1,70 @@
+import asyncio
+import os
+from logging.config import fileConfig
+
+from alembic import context
+from ipam.infrastructure.models import IPAMBase
+from shared.event.models import EventStoreBase
+from sqlalchemy import MetaData, pool
+from sqlalchemy.ext.asyncio import async_engine_from_config
+
+config = context.config
+
+# Override DB URL from environment variable if set
+db_url = os.environ.get("DATABASE_URL")
+if db_url:
+    config.set_main_option("sqlalchemy.url", db_url)
+
+if config.config_file_name is not None:
+    fileConfig(config.config_file_name)
+
+# Merge metadata from both IPAM read models and event store tables
+combined_metadata = MetaData()
+for metadata in (IPAMBase.metadata, EventStoreBase.metadata):
+    for table in metadata.tables.values():
+        # Table.to_metadata() is the SQLAlchemy 1.4+ name; the old
+        # tometadata() spelling has been deprecated since 1.4.
+        table.to_metadata(combined_metadata)
+
+target_metadata = combined_metadata
+
+
+def run_migrations_offline() -> None:
+    url = config.get_main_option("sqlalchemy.url")
+    context.configure(
+        url=url,
+        target_metadata=target_metadata,
+        literal_binds=True,
+    )
+    with context.begin_transaction():
+        context.run_migrations()
+
+
+def do_run_migrations(connection) -> None:
+    context.configure(
+        connection=connection,
+        target_metadata=target_metadata,
+    )
+    with context.begin_transaction():
+        context.run_migrations()
+
+
+async def run_async_migrations() -> None:
+    connectable = async_engine_from_config(
+        config.get_section(config.config_ini_section, {}),
+        prefix="sqlalchemy.",
+        poolclass=pool.NullPool,
+    )
+    async with connectable.connect() as connection:
+        await connection.run_sync(do_run_migrations)
+    await connectable.dispose()
+
+
+def run_migrations_online() -> None:
+    asyncio.run(run_async_migrations())
+
+
+if context.is_offline_mode():
+    run_migrations_offline()
+else:
+    run_migrations_online()
diff --git a/services/ipam/alembic/versions/001_create_ipam_tables.py b/services/ipam/alembic/versions/001_create_ipam_tables.py
new file mode 100644
index 0000000..3062bc7
--- /dev/null
+++ b/services/ipam/alembic/versions/001_create_ipam_tables.py
@@ -0,0 +1,218 @@
+"""create ipam tables
+
+Revision ID: 001
+Revises:
+Create Date: 2026-03-19
+"""
+
+import sqlalchemy as sa
+from alembic import op
+from sqlalchemy.dialects import postgresql
+
+revision = "001"
+down_revision = None
+branch_labels = None
+depends_on = None
+
+
+def upgrade() -> None:
+    # --- Event Store Tables ---
+
+    # domain_events
+    op.create_table(
+        "domain_events",
+        sa.Column("id", sa.Integer(), autoincrement=True, nullable=False),
+        sa.Column("aggregate_id", sa.Uuid(), nullable=False),
+        sa.Column("event_type", sa.Text(), nullable=False),
+        sa.Column("version", sa.Integer(), nullable=False),
+        sa.Column("payload", postgresql.JSONB(), nullable=False),
+        sa.Column("timestamp", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False),
+        sa.PrimaryKeyConstraint("id"),
+    )
+    op.create_index("ix_domain_events_aggregate_id", "domain_events", ["aggregate_id"])
+    op.create_index(
+        "ix_domain_events_agg_version",
+        "domain_events",
+        ["aggregate_id", "version"],
+        unique=True,
+    )
+
+    # aggregate_snapshots
+    op.create_table(
+        "aggregate_snapshots",
+        sa.Column("id",
sa.Integer(), autoincrement=True, nullable=False), + sa.Column("aggregate_id", sa.Uuid(), nullable=False), + sa.Column("aggregate_type", sa.Text(), nullable=False), + sa.Column("version", sa.Integer(), nullable=False), + sa.Column("state", postgresql.JSONB(), nullable=False), + sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.PrimaryKeyConstraint("id"), + ) + op.create_index("ix_aggregate_snapshots_aggregate_id", "aggregate_snapshots", ["aggregate_id"]) + op.create_index( + "ix_snapshots_agg_version", + "aggregate_snapshots", + ["aggregate_id", "version"], + unique=True, + ) + + # --- Read Model Tables --- + + # prefixes_read + op.create_table( + "prefixes_read", + sa.Column("id", sa.Uuid(), nullable=False), + sa.Column("network", sa.String(50), nullable=False), + sa.Column("vrf_id", sa.Uuid(), nullable=True), + sa.Column("vlan_id", sa.Uuid(), nullable=True), + sa.Column("status", sa.String(20), nullable=False), + sa.Column("role", sa.String(100), nullable=True), + sa.Column("tenant_id", sa.Uuid(), nullable=True), + sa.Column("description", sa.Text(), server_default="", nullable=False), + sa.Column("custom_fields", postgresql.JSONB(), server_default="{}", nullable=False), + sa.Column("tags", postgresql.JSONB(), server_default="[]", nullable=False), + sa.Column("is_deleted", sa.Boolean(), server_default="false", nullable=False), + sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.Column("updated_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.PrimaryKeyConstraint("id"), + ) + op.create_index("ix_prefixes_read_network", "prefixes_read", ["network"]) + + # ip_addresses_read + op.create_table( + "ip_addresses_read", + sa.Column("id", sa.Uuid(), nullable=False), + sa.Column("address", sa.String(50), nullable=False), + sa.Column("vrf_id", sa.Uuid(), nullable=True), + sa.Column("status", sa.String(20), nullable=False), + sa.Column("dns_name", sa.String(255), server_default="", nullable=False), + sa.Column("tenant_id", sa.Uuid(), nullable=True), + sa.Column("description", sa.Text(), server_default="", nullable=False), + sa.Column("custom_fields", postgresql.JSONB(), server_default="{}", nullable=False), + sa.Column("tags", postgresql.JSONB(), server_default="[]", nullable=False), + sa.Column("is_deleted", sa.Boolean(), server_default="false", nullable=False), + sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.Column("updated_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.PrimaryKeyConstraint("id"), + ) + op.create_index("ix_ip_addresses_read_address", "ip_addresses_read", ["address"]) + + # vrfs_read + op.create_table( + "vrfs_read", + sa.Column("id", sa.Uuid(), nullable=False), + sa.Column("name", sa.String(255), nullable=False), + sa.Column("rd", sa.String(50), nullable=True), + sa.Column("tenant_id", sa.Uuid(), nullable=True), + sa.Column("description", sa.Text(), server_default="", nullable=False), + sa.Column("custom_fields", postgresql.JSONB(), server_default="{}", nullable=False), + sa.Column("tags", postgresql.JSONB(), server_default="[]", nullable=False), + sa.Column("is_deleted", sa.Boolean(), server_default="false", nullable=False), + sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.Column("updated_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + 
sa.PrimaryKeyConstraint("id"), + ) + op.create_index("ix_vrfs_read_name", "vrfs_read", ["name"]) + + # vlans_read + op.create_table( + "vlans_read", + sa.Column("id", sa.Uuid(), nullable=False), + sa.Column("vid", sa.Integer(), nullable=False), + sa.Column("name", sa.String(255), nullable=False), + sa.Column("group_id", sa.Uuid(), nullable=True), + sa.Column("status", sa.String(20), nullable=False), + sa.Column("role", sa.String(100), nullable=True), + sa.Column("tenant_id", sa.Uuid(), nullable=True), + sa.Column("description", sa.Text(), server_default="", nullable=False), + sa.Column("custom_fields", postgresql.JSONB(), server_default="{}", nullable=False), + sa.Column("tags", postgresql.JSONB(), server_default="[]", nullable=False), + sa.Column("is_deleted", sa.Boolean(), server_default="false", nullable=False), + sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.Column("updated_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.PrimaryKeyConstraint("id"), + ) + op.create_index("ix_vlans_read_vid", "vlans_read", ["vid"]) + + # ip_ranges_read + op.create_table( + "ip_ranges_read", + sa.Column("id", sa.Uuid(), nullable=False), + sa.Column("start_address", sa.String(50), nullable=False), + sa.Column("end_address", sa.String(50), nullable=False), + sa.Column("vrf_id", sa.Uuid(), nullable=True), + sa.Column("status", sa.String(20), nullable=False), + sa.Column("tenant_id", sa.Uuid(), nullable=True), + sa.Column("description", sa.Text(), server_default="", nullable=False), + sa.Column("custom_fields", postgresql.JSONB(), server_default="{}", nullable=False), + sa.Column("tags", postgresql.JSONB(), server_default="[]", nullable=False), + sa.Column("is_deleted", sa.Boolean(), server_default="false", nullable=False), + sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.Column("updated_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.PrimaryKeyConstraint("id"), + ) + op.create_index("ix_ip_ranges_read_start_address", "ip_ranges_read", ["start_address"]) + + # rirs_read + op.create_table( + "rirs_read", + sa.Column("id", sa.Uuid(), nullable=False), + sa.Column("name", sa.String(255), nullable=False), + sa.Column("is_private", sa.Boolean(), server_default="false", nullable=False), + sa.Column("description", sa.Text(), server_default="", nullable=False), + sa.Column("custom_fields", postgresql.JSONB(), server_default="{}", nullable=False), + sa.Column("tags", postgresql.JSONB(), server_default="[]", nullable=False), + sa.Column("is_deleted", sa.Boolean(), server_default="false", nullable=False), + sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.Column("updated_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.PrimaryKeyConstraint("id"), + ) + op.create_index("ix_rirs_read_name", "rirs_read", ["name"]) + + # asns_read + op.create_table( + "asns_read", + sa.Column("id", sa.Uuid(), nullable=False), + sa.Column("asn", sa.Integer(), nullable=False), + sa.Column("rir_id", sa.Uuid(), nullable=True), + sa.Column("tenant_id", sa.Uuid(), nullable=True), + sa.Column("description", sa.Text(), server_default="", nullable=False), + sa.Column("custom_fields", postgresql.JSONB(), server_default="{}", nullable=False), + sa.Column("tags", postgresql.JSONB(), server_default="[]", nullable=False), + sa.Column("is_deleted", sa.Boolean(), 
server_default="false", nullable=False), + sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.Column("updated_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.PrimaryKeyConstraint("id"), + ) + op.create_index("ix_asns_read_asn", "asns_read", ["asn"]) + + # fhrp_groups_read + op.create_table( + "fhrp_groups_read", + sa.Column("id", sa.Uuid(), nullable=False), + sa.Column("protocol", sa.String(20), nullable=False), + sa.Column("group_id_value", sa.Integer(), nullable=False), + sa.Column("auth_type", sa.String(20), nullable=False), + sa.Column("auth_key", sa.String(255), server_default="", nullable=False), + sa.Column("name", sa.String(255), server_default="", nullable=False), + sa.Column("description", sa.Text(), server_default="", nullable=False), + sa.Column("custom_fields", postgresql.JSONB(), server_default="{}", nullable=False), + sa.Column("tags", postgresql.JSONB(), server_default="[]", nullable=False), + sa.Column("is_deleted", sa.Boolean(), server_default="false", nullable=False), + sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.Column("updated_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.PrimaryKeyConstraint("id"), + ) + + +def downgrade() -> None: + op.drop_table("fhrp_groups_read") + op.drop_table("asns_read") + op.drop_table("rirs_read") + op.drop_table("ip_ranges_read") + op.drop_table("vlans_read") + op.drop_table("vrfs_read") + op.drop_table("ip_addresses_read") + op.drop_table("prefixes_read") + op.drop_table("aggregate_snapshots") + op.drop_table("domain_events") diff --git a/services/ipam/alembic/versions/002_add_new_aggregates.py b/services/ipam/alembic/versions/002_add_new_aggregates.py new file mode 100644 index 0000000..f70d31d --- /dev/null +++ b/services/ipam/alembic/versions/002_add_new_aggregates.py @@ -0,0 +1,83 @@ +"""add new aggregates + +Revision ID: 002 +Revises: 001 +Create Date: 2026-03-21 +""" + +import sqlalchemy as sa +from alembic import op +from sqlalchemy.dialects import postgresql + +revision = "002" +down_revision = "001" +branch_labels = None +depends_on = None + + +def upgrade() -> None: + # --- route_targets_read --- + op.create_table( + "route_targets_read", + sa.Column("id", sa.Uuid(), nullable=False), + sa.Column("name", sa.String(100), nullable=False), + sa.Column("tenant_id", sa.Uuid(), nullable=True), + sa.Column("description", sa.Text(), server_default="", nullable=False), + sa.Column("custom_fields", postgresql.JSONB(), server_default="{}", nullable=False), + sa.Column("tags", postgresql.JSONB(), server_default="[]", nullable=False), + sa.Column("is_deleted", sa.Boolean(), server_default="false", nullable=False), + sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.Column("updated_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.PrimaryKeyConstraint("id"), + ) + op.create_index("ix_route_targets_read_name", "route_targets_read", ["name"]) + + # --- vlan_groups_read --- + op.create_table( + "vlan_groups_read", + sa.Column("id", sa.Uuid(), nullable=False), + sa.Column("name", sa.String(255), nullable=False), + sa.Column("slug", sa.String(255), nullable=False), + sa.Column("min_vid", sa.Integer(), nullable=False), + sa.Column("max_vid", sa.Integer(), nullable=False), + sa.Column("tenant_id", sa.Uuid(), nullable=True), + sa.Column("description", sa.Text(), 
server_default="", nullable=False), + sa.Column("custom_fields", postgresql.JSONB(), server_default="{}", nullable=False), + sa.Column("tags", postgresql.JSONB(), server_default="[]", nullable=False), + sa.Column("is_deleted", sa.Boolean(), server_default="false", nullable=False), + sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.Column("updated_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.PrimaryKeyConstraint("id"), + ) + op.create_index("ix_vlan_groups_read_name", "vlan_groups_read", ["name"]) + op.create_index("ix_vlan_groups_read_slug", "vlan_groups_read", ["slug"], unique=True) + + # --- services_read --- + op.create_table( + "services_read", + sa.Column("id", sa.Uuid(), nullable=False), + sa.Column("name", sa.String(255), nullable=False), + sa.Column("protocol", sa.String(10), nullable=False), + sa.Column("ports", postgresql.JSONB(), server_default="[]", nullable=False), + sa.Column("ip_addresses", postgresql.JSONB(), server_default="[]", nullable=False), + sa.Column("description", sa.Text(), server_default="", nullable=False), + sa.Column("custom_fields", postgresql.JSONB(), server_default="{}", nullable=False), + sa.Column("tags", postgresql.JSONB(), server_default="[]", nullable=False), + sa.Column("is_deleted", sa.Boolean(), server_default="false", nullable=False), + sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.Column("updated_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.PrimaryKeyConstraint("id"), + ) + op.create_index("ix_services_read_name", "services_read", ["name"]) + + # --- Add import_targets / export_targets to vrfs_read --- + op.add_column("vrfs_read", sa.Column("import_targets", postgresql.JSONB(), server_default="[]", nullable=False)) + op.add_column("vrfs_read", sa.Column("export_targets", postgresql.JSONB(), server_default="[]", nullable=False)) + + +def downgrade() -> None: + op.drop_column("vrfs_read", "export_targets") + op.drop_column("vrfs_read", "import_targets") + op.drop_table("services_read") + op.drop_table("vlan_groups_read") + op.drop_table("route_targets_read") diff --git a/services/ipam/alembic/versions/003_add_search_and_saved_filters.py b/services/ipam/alembic/versions/003_add_search_and_saved_filters.py new file mode 100644 index 0000000..ff9f427 --- /dev/null +++ b/services/ipam/alembic/versions/003_add_search_and_saved_filters.py @@ -0,0 +1,109 @@ +"""add search vectors and saved filters + +Revision ID: 003 +Revises: 002 +Create Date: 2026-03-22 +""" + +import sqlalchemy as sa +from alembic import op +from sqlalchemy.dialects import postgresql + +revision = "003" +down_revision = "002" +branch_labels = None +depends_on = None + +SEARCH_VECTOR_CONFIGS = [ + ( + "prefixes_read", + "to_tsvector('simple', coalesce(network, '') || ' ' || coalesce(description, '') || ' ' || coalesce(role, ''))", + ), + ( + "ip_addresses_read", + "to_tsvector('simple', coalesce(address, '') || ' ' || coalesce(dns_name, '')" + " || ' ' || coalesce(description, ''))", + ), + ( + "vrfs_read", + "to_tsvector('simple', coalesce(name, '') || ' ' || coalesce(rd, '') || ' ' || coalesce(description, ''))", + ), + ( + "vlans_read", + "to_tsvector('simple', coalesce(name, '') || ' ' || vid::text || ' ' || coalesce(description, ''))", + ), + ( + "ip_ranges_read", + "to_tsvector('simple', coalesce(start_address, '') || ' ' || coalesce(end_address, '')" + " || ' ' || coalesce(description, 
''))", + ), + ( + "rirs_read", + "to_tsvector('simple', coalesce(name, '') || ' ' || coalesce(description, ''))", + ), + ( + "asns_read", + "to_tsvector('simple', asn::text || ' ' || coalesce(description, ''))", + ), + ( + "fhrp_groups_read", + "to_tsvector('simple', coalesce(name, '') || ' ' || coalesce(protocol, '')" + " || ' ' || coalesce(description, ''))", + ), + ( + "route_targets_read", + "to_tsvector('simple', coalesce(name, '') || ' ' || coalesce(description, ''))", + ), + ( + "vlan_groups_read", + "to_tsvector('simple', coalesce(name, '') || ' ' || coalesce(slug, '') || ' ' || coalesce(description, ''))", + ), + ( + "services_read", + "to_tsvector('simple', coalesce(name, '') || ' ' || coalesce(protocol, '')" + " || ' ' || coalesce(description, ''))", + ), +] + + +def upgrade() -> None: + # Add search_vector generated columns and GIN indexes + for table_name, expression in SEARCH_VECTOR_CONFIGS: + op.add_column( + table_name, + sa.Column( + "search_vector", + postgresql.TSVECTOR(), + sa.Computed(expression, persisted=True), + nullable=True, + ), + ) + op.create_index( + f"ix_{table_name}_search", + table_name, + ["search_vector"], + postgresql_using="gin", + ) + + # Create saved_filters table + op.create_table( + "saved_filters", + sa.Column("id", sa.Uuid(), nullable=False), + sa.Column("user_id", sa.Uuid(), nullable=False), + sa.Column("name", sa.String(255), nullable=False), + sa.Column("entity_type", sa.String(50), nullable=False), + sa.Column("filter_config", postgresql.JSONB(), server_default="{}", nullable=False), + sa.Column("is_default", sa.Boolean(), server_default="false", nullable=False), + sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.Column("updated_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.PrimaryKeyConstraint("id"), + ) + op.create_index("ix_saved_filters_user_id", "saved_filters", ["user_id"]) + op.create_index("ix_saved_filters_user_entity", "saved_filters", ["user_id", "entity_type"]) + + +def downgrade() -> None: + op.drop_table("saved_filters") + for table_name, _ in SEARCH_VECTOR_CONFIGS: + op.drop_index(f"ix_{table_name}_search", table_name) + op.drop_column(table_name, "search_vector") diff --git a/services/ipam/alembic/versions/004_add_export_templates.py b/services/ipam/alembic/versions/004_add_export_templates.py new file mode 100644 index 0000000..4f8f81a --- /dev/null +++ b/services/ipam/alembic/versions/004_add_export_templates.py @@ -0,0 +1,34 @@ +"""add export templates + +Revision ID: 004 +Revises: 003 +Create Date: 2026-03-22 +""" + +import sqlalchemy as sa +from alembic import op + +revision = "004" +down_revision = "003" +branch_labels = None +depends_on = None + + +def upgrade() -> None: + op.create_table( + "export_templates", + sa.Column("id", sa.Uuid(), nullable=False), + sa.Column("name", sa.String(255), nullable=False), + sa.Column("entity_type", sa.String(50), nullable=False), + sa.Column("template_content", sa.Text(), nullable=False), + sa.Column("output_format", sa.String(20), server_default="text", nullable=False), + sa.Column("description", sa.Text(), server_default="", nullable=False), + sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.Column("updated_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.PrimaryKeyConstraint("id"), + sa.UniqueConstraint("name"), + ) + + +def downgrade() -> None: + op.drop_table("export_templates") diff --git 
a/services/ipam/docker-compose.dev.yml b/services/ipam/docker-compose.dev.yml new file mode 100644 index 0000000..1d09e22 --- /dev/null +++ b/services/ipam/docker-compose.dev.yml @@ -0,0 +1,26 @@ +services: + ipam: + build: + context: ../../ + dockerfile: services/ipam/Dockerfile.dev + volumes: + - ../../services/ipam/src:/app/services/ipam/src + ports: + - "8001:8000" + environment: + DATABASE_URL: postgresql+asyncpg://${POSTGRES_USER:-cmdb}:${POSTGRES_PASSWORD:-cmdb}@postgres:5432/cmdb_ipam + KAFKA_BOOTSTRAP_SERVERS: kafka:9092 + REDIS_URL: redis://redis:6379 + depends_on: + postgres: + condition: service_healthy + kafka: + condition: service_healthy + redis: + condition: service_healthy + networks: + - cmdb-network + +networks: + cmdb-network: + external: true diff --git a/services/ipam/docker-compose.yml b/services/ipam/docker-compose.yml new file mode 100644 index 0000000..c23b4d8 --- /dev/null +++ b/services/ipam/docker-compose.yml @@ -0,0 +1,22 @@ +services: + ipam: + build: + context: ../../ + dockerfile: services/ipam/Dockerfile + environment: + DATABASE_URL: postgresql+asyncpg://${POSTGRES_USER:-cmdb}:${POSTGRES_PASSWORD:-cmdb}@postgres:5432/cmdb_ipam + KAFKA_BOOTSTRAP_SERVERS: kafka:9092 + REDIS_URL: redis://redis:6379 + depends_on: + postgres: + condition: service_healthy + kafka: + condition: service_healthy + redis: + condition: service_healthy + networks: + - cmdb-network + +networks: + cmdb-network: + external: true diff --git a/services/ipam/pyproject.toml b/services/ipam/pyproject.toml new file mode 100644 index 0000000..68ee615 --- /dev/null +++ b/services/ipam/pyproject.toml @@ -0,0 +1,36 @@ +[project] +name = "cmdb-ipam" +version = "0.1.0" +description = "CMDB IPAM Service" +requires-python = ">=3.13" +dependencies = [ + "cmdb-shared", + "fastapi>=0.115", + "uvicorn", + "sqlalchemy[asyncio]>=2.0", + "asyncpg", + "alembic", + "pydantic-settings>=2.0", + "aiokafka", + "redis>=5.0", + "jinja2>=3.1", + "python-multipart", + "pyyaml>=6.0", + "strawberry-graphql[fastapi]", +] + +[build-system] +requires = ["hatchling"] +build-backend = "hatchling.build" + +[tool.hatch.build.targets.wheel] +packages = ["src/ipam"] + +[tool.uv.sources] +cmdb-shared = { workspace = true } + +[tool.pytest.ini_options] +testpaths = ["tests"] +pythonpath = ["src"] +asyncio_mode = "auto" +markers = ["integration: requires Docker infrastructure (PostgreSQL, Kafka, Redis)"] diff --git a/services/ipam/src/ipam/__init__.py b/services/ipam/src/ipam/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/services/ipam/src/ipam/application/__init__.py b/services/ipam/src/ipam/application/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/services/ipam/src/ipam/application/command_handlers.py b/services/ipam/src/ipam/application/command_handlers.py new file mode 100644 index 0000000..30bd1f2 --- /dev/null +++ b/services/ipam/src/ipam/application/command_handlers.py @@ -0,0 +1,2341 @@ +from __future__ import annotations + +from uuid import UUID, uuid4 + +from shared.cqrs.command import Command, CommandHandler +from shared.domain.exceptions import ConflictError, EntityNotFoundError +from shared.event.pg_store import PostgresEventStore +from shared.messaging.producer import KafkaEventProducer + +from ipam.application.commands import ( + BulkCreateASNsCommand, + BulkCreateFHRPGroupsCommand, + BulkCreateIPAddressesCommand, + BulkCreateIPRangesCommand, + BulkCreatePrefixesCommand, + BulkCreateRIRsCommand, + BulkCreateRouteTargetsCommand, + BulkCreateServicesCommand, + BulkCreateVLANGroupsCommand, + BulkCreateVLANsCommand, +
BulkCreateVRFsCommand, + BulkDeleteASNsCommand, + BulkDeleteFHRPGroupsCommand, + BulkDeleteIPAddressesCommand, + BulkDeleteIPRangesCommand, + BulkDeletePrefixesCommand, + BulkDeleteRIRsCommand, + BulkDeleteRouteTargetsCommand, + BulkDeleteServicesCommand, + BulkDeleteVLANGroupsCommand, + BulkDeleteVLANsCommand, + BulkDeleteVRFsCommand, + BulkUpdateASNsCommand, + BulkUpdateFHRPGroupsCommand, + BulkUpdateIPAddressesCommand, + BulkUpdateIPRangesCommand, + BulkUpdatePrefixesCommand, + BulkUpdateRIRsCommand, + BulkUpdateRouteTargetsCommand, + BulkUpdateServicesCommand, + BulkUpdateVLANGroupsCommand, + BulkUpdateVLANsCommand, + BulkUpdateVRFsCommand, + ChangeIPAddressStatusCommand, + ChangeIPRangeStatusCommand, + ChangePrefixStatusCommand, + ChangeVLANStatusCommand, + CreateASNCommand, + CreateFHRPGroupCommand, + CreateIPAddressCommand, + CreateIPRangeCommand, + CreatePrefixCommand, + CreateRIRCommand, + CreateRouteTargetCommand, + CreateServiceCommand, + CreateVLANCommand, + CreateVLANGroupCommand, + CreateVRFCommand, + DeleteASNCommand, + DeleteFHRPGroupCommand, + DeleteIPAddressCommand, + DeleteIPRangeCommand, + DeletePrefixCommand, + DeleteRIRCommand, + DeleteRouteTargetCommand, + DeleteServiceCommand, + DeleteVLANCommand, + DeleteVLANGroupCommand, + DeleteVRFCommand, + UpdateASNCommand, + UpdateFHRPGroupCommand, + UpdateIPAddressCommand, + UpdateIPRangeCommand, + UpdatePrefixCommand, + UpdateRIRCommand, + UpdateRouteTargetCommand, + UpdateServiceCommand, + UpdateVLANCommand, + UpdateVLANGroupCommand, + UpdateVRFCommand, +) +from ipam.application.read_model import ( + ASNReadModelRepository, + FHRPGroupReadModelRepository, + IPAddressReadModelRepository, + IPRangeReadModelRepository, + PrefixReadModelRepository, + RIRReadModelRepository, + RouteTargetReadModelRepository, + SavedFilterRepository, + ServiceReadModelRepository, + VLANGroupReadModelRepository, + VLANReadModelRepository, + VRFReadModelRepository, +) +from ipam.domain.asn import ASN +from ipam.domain.fhrp_group import FHRPGroup +from ipam.domain.ip_address import IPAddress +from ipam.domain.ip_range import IPRange +from ipam.domain.prefix import Prefix +from ipam.domain.rir import RIR +from ipam.domain.route_target import RouteTarget +from ipam.domain.service import Service +from ipam.domain.value_objects import ( + FHRPAuthType, + FHRPProtocol, + IPAddressStatus, + IPRangeStatus, + PrefixStatus, + ServiceProtocol, + VLANStatus, +) +from ipam.domain.vlan import VLAN +from ipam.domain.vlan_group import VLANGroup +from ipam.domain.vrf import VRF + +# --------------------------------------------------------------------------- +# Prefix +# --------------------------------------------------------------------------- + + +class CreatePrefixHandler(CommandHandler[UUID]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: PrefixReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: CreatePrefixCommand) -> UUID: + prefix = Prefix.create( + network=command.network, + vrf_id=command.vrf_id, + vlan_id=command.vlan_id, + status=PrefixStatus(command.status), + role=command.role, + tenant_id=command.tenant_id, + description=command.description, + custom_fields=command.custom_fields or {}, + tags=command.tags or [], + ) + events = prefix.collect_uncommitted_events() + await self._event_store.append(prefix.id, events, expected_version=0) + await 
self._read_model_repo.upsert_from_aggregate(prefix) + await self._event_producer.publish_many("ipam.events", events) + return prefix.id + + +class UpdatePrefixHandler(CommandHandler[None]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: PrefixReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: UpdatePrefixCommand) -> None: + prefix = await self._event_store.load_aggregate(Prefix, command.prefix_id) + if prefix is None: + raise EntityNotFoundError(f"Prefix {command.prefix_id} not found") + + prefix.update( + description=command.description, + role=command.role, + tenant_id=command.tenant_id, + vlan_id=command.vlan_id, + custom_fields=command.custom_fields or {}, + tags=command.tags or [], + ) + + new_events = prefix.collect_uncommitted_events() + await self._event_store.append( + prefix.id, new_events, expected_version=prefix.version - len(new_events), aggregate=prefix + ) + await self._read_model_repo.upsert_from_aggregate(prefix) + await self._event_producer.publish_many("ipam.events", new_events) + + +class ChangePrefixStatusHandler(CommandHandler[None]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: PrefixReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: ChangePrefixStatusCommand) -> None: + prefix = await self._event_store.load_aggregate(Prefix, command.prefix_id) + if prefix is None: + raise EntityNotFoundError(f"Prefix {command.prefix_id} not found") + + prefix.change_status(PrefixStatus(command.status)) + + new_events = prefix.collect_uncommitted_events() + await self._event_store.append( + prefix.id, new_events, expected_version=prefix.version - len(new_events), aggregate=prefix + ) + await self._read_model_repo.upsert_from_aggregate(prefix) + await self._event_producer.publish_many("ipam.events", new_events) + + +class DeletePrefixHandler(CommandHandler[None]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: PrefixReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: DeletePrefixCommand) -> None: + prefix = await self._event_store.load_aggregate(Prefix, command.prefix_id) + if prefix is None: + raise EntityNotFoundError(f"Prefix {command.prefix_id} not found") + + prefix.delete() + + new_events = prefix.collect_uncommitted_events() + await self._event_store.append( + prefix.id, new_events, expected_version=prefix.version - len(new_events), aggregate=prefix + ) + await self._read_model_repo.mark_deleted(prefix.id) + await self._event_producer.publish_many("ipam.events", new_events) + + +# --------------------------------------------------------------------------- +# IPAddress +# --------------------------------------------------------------------------- + + +class CreateIPAddressHandler(CommandHandler[UUID]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: IPAddressReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + 
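+ # Uniqueness guard (see handle below): the read model is consulted via exists_in_vrf before any events are appended. Assumption: this check-then-append is not atomic, so a concurrent create in the same VRF scope could race past it unless the ip_addresses_read table also enforces uniqueness.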
async def handle(self, command: CreateIPAddressCommand) -> UUID: + if await self._read_model_repo.exists_in_vrf(command.address, command.vrf_id): + raise ConflictError(f"IP address {command.address} already exists in this VRF scope") + + ip = IPAddress.create( + address=command.address, + vrf_id=command.vrf_id, + status=IPAddressStatus(command.status), + dns_name=command.dns_name, + tenant_id=command.tenant_id, + description=command.description, + custom_fields=command.custom_fields or {}, + tags=command.tags or [], + ) + events = ip.collect_uncommitted_events() + await self._event_store.append(ip.id, events, expected_version=0) + await self._read_model_repo.upsert_from_aggregate(ip) + await self._event_producer.publish_many("ipam.events", events) + return ip.id + + +class UpdateIPAddressHandler(CommandHandler[None]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: IPAddressReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: UpdateIPAddressCommand) -> None: + ip = await self._event_store.load_aggregate(IPAddress, command.ip_id) + if ip is None: + raise EntityNotFoundError(f"IP address {command.ip_id} not found") + + ip.update( + dns_name=command.dns_name, + description=command.description, + custom_fields=command.custom_fields or {}, + tags=command.tags or [], + ) + + new_events = ip.collect_uncommitted_events() + await self._event_store.append(ip.id, new_events, expected_version=ip.version - len(new_events), aggregate=ip) + await self._read_model_repo.upsert_from_aggregate(ip) + await self._event_producer.publish_many("ipam.events", new_events) + + +class ChangeIPAddressStatusHandler(CommandHandler[None]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: IPAddressReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: ChangeIPAddressStatusCommand) -> None: + ip = await self._event_store.load_aggregate(IPAddress, command.ip_id) + if ip is None: + raise EntityNotFoundError(f"IP address {command.ip_id} not found") + + ip.change_status(IPAddressStatus(command.status)) + + new_events = ip.collect_uncommitted_events() + await self._event_store.append(ip.id, new_events, expected_version=ip.version - len(new_events), aggregate=ip) + await self._read_model_repo.upsert_from_aggregate(ip) + await self._event_producer.publish_many("ipam.events", new_events) + + +class DeleteIPAddressHandler(CommandHandler[None]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: IPAddressReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: DeleteIPAddressCommand) -> None: + ip = await self._event_store.load_aggregate(IPAddress, command.ip_id) + if ip is None: + raise EntityNotFoundError(f"IP address {command.ip_id} not found") + + ip.delete() + + new_events = ip.collect_uncommitted_events() + await self._event_store.append(ip.id, new_events, expected_version=ip.version - len(new_events), aggregate=ip) + await self._read_model_repo.mark_deleted(ip.id) + await self._event_producer.publish_many("ipam.events", new_events) + + +# 
--------------------------------------------------------------------------- +# VRF +# --------------------------------------------------------------------------- + + +class CreateVRFHandler(CommandHandler[UUID]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: VRFReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: CreateVRFCommand) -> UUID: + vrf = VRF.create( + name=command.name, + rd=command.rd, + import_targets=command.import_targets or [], + export_targets=command.export_targets or [], + tenant_id=command.tenant_id, + description=command.description, + custom_fields=command.custom_fields or {}, + tags=command.tags or [], + ) + events = vrf.collect_uncommitted_events() + await self._event_store.append(vrf.id, events, expected_version=0) + await self._read_model_repo.upsert_from_aggregate(vrf) + await self._event_producer.publish_many("ipam.events", events) + return vrf.id + + +class UpdateVRFHandler(CommandHandler[None]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: VRFReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: UpdateVRFCommand) -> None: + vrf = await self._event_store.load_aggregate(VRF, command.vrf_id) + if vrf is None: + raise EntityNotFoundError(f"VRF {command.vrf_id} not found") + + vrf.update( + name=command.name, + import_targets=command.import_targets, + export_targets=command.export_targets, + description=command.description, + custom_fields=command.custom_fields or {}, + tags=command.tags or [], + ) + + new_events = vrf.collect_uncommitted_events() + await self._event_store.append( + vrf.id, new_events, expected_version=vrf.version - len(new_events), aggregate=vrf + ) + await self._read_model_repo.upsert_from_aggregate(vrf) + await self._event_producer.publish_many("ipam.events", new_events) + + +class DeleteVRFHandler(CommandHandler[None]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: VRFReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: DeleteVRFCommand) -> None: + vrf = await self._event_store.load_aggregate(VRF, command.vrf_id) + if vrf is None: + raise EntityNotFoundError(f"VRF {command.vrf_id} not found") + + vrf.delete() + + new_events = vrf.collect_uncommitted_events() + await self._event_store.append( + vrf.id, new_events, expected_version=vrf.version - len(new_events), aggregate=vrf + ) + await self._read_model_repo.mark_deleted(vrf.id) + await self._event_producer.publish_many("ipam.events", new_events) + + +# --------------------------------------------------------------------------- +# VLAN +# --------------------------------------------------------------------------- + + +class CreateVLANHandler(CommandHandler[UUID]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: VLANReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: CreateVLANCommand) -> UUID: + vlan 
= VLAN.create( + vid=command.vid, + name=command.name, + group_id=command.group_id, + status=VLANStatus(command.status), + role=command.role, + tenant_id=command.tenant_id, + description=command.description, + custom_fields=command.custom_fields or {}, + tags=command.tags or [], + ) + events = vlan.collect_uncommitted_events() + await self._event_store.append(vlan.id, events, expected_version=0) + await self._read_model_repo.upsert_from_aggregate(vlan) + await self._event_producer.publish_many("ipam.events", events) + return vlan.id + + +class UpdateVLANHandler(CommandHandler[None]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: VLANReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: UpdateVLANCommand) -> None: + vlan = await self._event_store.load_aggregate(VLAN, command.vlan_id) + if vlan is None: + raise EntityNotFoundError(f"VLAN {command.vlan_id} not found") + + vlan.update( + name=command.name, + role=command.role, + description=command.description, + custom_fields=command.custom_fields or {}, + tags=command.tags or [], + ) + + new_events = vlan.collect_uncommitted_events() + await self._event_store.append( + vlan.id, new_events, expected_version=vlan.version - len(new_events), aggregate=vlan + ) + await self._read_model_repo.upsert_from_aggregate(vlan) + await self._event_producer.publish_many("ipam.events", new_events) + + +class ChangeVLANStatusHandler(CommandHandler[None]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: VLANReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: ChangeVLANStatusCommand) -> None: + vlan = await self._event_store.load_aggregate(VLAN, command.vlan_id) + if vlan is None: + raise EntityNotFoundError(f"VLAN {command.vlan_id} not found") + + vlan.change_status(VLANStatus(command.status)) + + new_events = vlan.collect_uncommitted_events() + await self._event_store.append( + vlan.id, new_events, expected_version=vlan.version - len(new_events), aggregate=vlan + ) + await self._read_model_repo.upsert_from_aggregate(vlan) + await self._event_producer.publish_many("ipam.events", new_events) + + +class DeleteVLANHandler(CommandHandler[None]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: VLANReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: DeleteVLANCommand) -> None: + vlan = await self._event_store.load_aggregate(VLAN, command.vlan_id) + if vlan is None: + raise EntityNotFoundError(f"VLAN {command.vlan_id} not found") + + vlan.delete() + + new_events = vlan.collect_uncommitted_events() + await self._event_store.append( + vlan.id, new_events, expected_version=vlan.version - len(new_events), aggregate=vlan + ) + await self._read_model_repo.mark_deleted(vlan.id) + await self._event_producer.publish_many("ipam.events", new_events) + + +# --------------------------------------------------------------------------- +# IPRange +# --------------------------------------------------------------------------- + + +class CreateIPRangeHandler(CommandHandler[UUID]): + def 
__init__( + self, + event_store: PostgresEventStore, + read_model_repo: IPRangeReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: CreateIPRangeCommand) -> UUID: + ip_range = IPRange.create( + start_address=command.start_address, + end_address=command.end_address, + vrf_id=command.vrf_id, + status=IPRangeStatus(command.status), + tenant_id=command.tenant_id, + description=command.description, + custom_fields=command.custom_fields or {}, + tags=command.tags or [], + ) + events = ip_range.collect_uncommitted_events() + await self._event_store.append(ip_range.id, events, expected_version=0) + await self._read_model_repo.upsert_from_aggregate(ip_range) + await self._event_producer.publish_many("ipam.events", events) + return ip_range.id + + +class UpdateIPRangeHandler(CommandHandler[None]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: IPRangeReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: UpdateIPRangeCommand) -> None: + ip_range = await self._event_store.load_aggregate(IPRange, command.range_id) + if ip_range is None: + raise EntityNotFoundError(f"IP range {command.range_id} not found") + + ip_range.update( + description=command.description, + tenant_id=command.tenant_id, + custom_fields=command.custom_fields or {}, + tags=command.tags or [], + ) + + new_events = ip_range.collect_uncommitted_events() + await self._event_store.append( + ip_range.id, new_events, expected_version=ip_range.version - len(new_events), aggregate=ip_range + ) + await self._read_model_repo.upsert_from_aggregate(ip_range) + await self._event_producer.publish_many("ipam.events", new_events) + + +class ChangeIPRangeStatusHandler(CommandHandler[None]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: IPRangeReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: ChangeIPRangeStatusCommand) -> None: + ip_range = await self._event_store.load_aggregate(IPRange, command.range_id) + if ip_range is None: + raise EntityNotFoundError(f"IP range {command.range_id} not found") + + ip_range.change_status(IPRangeStatus(command.status)) + + new_events = ip_range.collect_uncommitted_events() + await self._event_store.append( + ip_range.id, new_events, expected_version=ip_range.version - len(new_events), aggregate=ip_range + ) + await self._read_model_repo.upsert_from_aggregate(ip_range) + await self._event_producer.publish_many("ipam.events", new_events) + + +class DeleteIPRangeHandler(CommandHandler[None]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: IPRangeReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: DeleteIPRangeCommand) -> None: + ip_range = await self._event_store.load_aggregate(IPRange, command.range_id) + if ip_range is None: + raise EntityNotFoundError(f"IP range {command.range_id} not found") + + ip_range.delete() + + new_events = 
ip_range.collect_uncommitted_events() + await self._event_store.append( + ip_range.id, new_events, expected_version=ip_range.version - len(new_events), aggregate=ip_range + ) + await self._read_model_repo.mark_deleted(ip_range.id) + await self._event_producer.publish_many("ipam.events", new_events) + + +# --------------------------------------------------------------------------- +# RIR +# --------------------------------------------------------------------------- + + +class CreateRIRHandler(CommandHandler[UUID]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: RIRReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: CreateRIRCommand) -> UUID: + rir = RIR.create( + name=command.name, + is_private=command.is_private, + description=command.description, + custom_fields=command.custom_fields or {}, + tags=command.tags or [], + ) + events = rir.collect_uncommitted_events() + await self._event_store.append(rir.id, events, expected_version=0) + await self._read_model_repo.upsert_from_aggregate(rir) + await self._event_producer.publish_many("ipam.events", events) + return rir.id + + +class UpdateRIRHandler(CommandHandler[None]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: RIRReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: UpdateRIRCommand) -> None: + rir = await self._event_store.load_aggregate(RIR, command.rir_id) + if rir is None: + raise EntityNotFoundError(f"RIR {command.rir_id} not found") + + rir.update( + description=command.description, + is_private=command.is_private, + custom_fields=command.custom_fields or {}, + tags=command.tags or [], + ) + + new_events = rir.collect_uncommitted_events() + await self._event_store.append( + rir.id, new_events, expected_version=rir.version - len(new_events), aggregate=rir + ) + await self._read_model_repo.upsert_from_aggregate(rir) + await self._event_producer.publish_many("ipam.events", new_events) + + +class DeleteRIRHandler(CommandHandler[None]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: RIRReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: DeleteRIRCommand) -> None: + rir = await self._event_store.load_aggregate(RIR, command.rir_id) + if rir is None: + raise EntityNotFoundError(f"RIR {command.rir_id} not found") + + rir.delete() + + new_events = rir.collect_uncommitted_events() + await self._event_store.append( + rir.id, new_events, expected_version=rir.version - len(new_events), aggregate=rir + ) + await self._read_model_repo.mark_deleted(rir.id) + await self._event_producer.publish_many("ipam.events", new_events) + + +# --------------------------------------------------------------------------- +# ASN +# --------------------------------------------------------------------------- + + +class CreateASNHandler(CommandHandler[UUID]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: ASNReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = 
read_model_repo + self._event_producer = event_producer + + async def handle(self, command: CreateASNCommand) -> UUID: + asn = ASN.create( + asn=command.asn, + rir_id=command.rir_id, + tenant_id=command.tenant_id, + description=command.description, + custom_fields=command.custom_fields or {}, + tags=command.tags or [], + ) + events = asn.collect_uncommitted_events() + await self._event_store.append(asn.id, events, expected_version=0) + await self._read_model_repo.upsert_from_aggregate(asn) + await self._event_producer.publish_many("ipam.events", events) + return asn.id + + +class UpdateASNHandler(CommandHandler[None]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: ASNReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: UpdateASNCommand) -> None: + asn = await self._event_store.load_aggregate(ASN, command.asn_id) + if asn is None: + raise EntityNotFoundError(f"ASN {command.asn_id} not found") + + asn.update( + description=command.description, + tenant_id=command.tenant_id, + custom_fields=command.custom_fields or {}, + tags=command.tags or [], + ) + + new_events = asn.collect_uncommitted_events() + await self._event_store.append( + asn.id, new_events, expected_version=asn.version - len(new_events), aggregate=asn + ) + await self._read_model_repo.upsert_from_aggregate(asn) + await self._event_producer.publish_many("ipam.events", new_events) + + +class DeleteASNHandler(CommandHandler[None]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: ASNReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: DeleteASNCommand) -> None: + asn = await self._event_store.load_aggregate(ASN, command.asn_id) + if asn is None: + raise EntityNotFoundError(f"ASN {command.asn_id} not found") + + asn.delete() + + new_events = asn.collect_uncommitted_events() + await self._event_store.append( + asn.id, new_events, expected_version=asn.version - len(new_events), aggregate=asn + ) + await self._read_model_repo.mark_deleted(asn.id) + await self._event_producer.publish_many("ipam.events", new_events) + + +# --------------------------------------------------------------------------- +# FHRPGroup +# --------------------------------------------------------------------------- + + +class CreateFHRPGroupHandler(CommandHandler[UUID]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: FHRPGroupReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: CreateFHRPGroupCommand) -> UUID: + group = FHRPGroup.create( + protocol=FHRPProtocol(command.protocol), + group_id_value=command.group_id_value, + auth_type=FHRPAuthType(command.auth_type), + auth_key=command.auth_key, + name=command.name, + description=command.description, + custom_fields=command.custom_fields or {}, + tags=command.tags or [], + ) + events = group.collect_uncommitted_events() + await self._event_store.append(group.id, events, expected_version=0) + await self._read_model_repo.upsert_from_aggregate(group) + await self._event_producer.publish_many("ipam.events", events) + return 
group.id + + +class UpdateFHRPGroupHandler(CommandHandler[None]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: FHRPGroupReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: UpdateFHRPGroupCommand) -> None: + group = await self._event_store.load_aggregate(FHRPGroup, command.fhrp_group_id) + if group is None: + raise EntityNotFoundError(f"FHRP group {command.fhrp_group_id} not found") + + group.update( + name=command.name, + auth_type=command.auth_type, + auth_key=command.auth_key, + description=command.description, + custom_fields=command.custom_fields or {}, + tags=command.tags or [], + ) + + new_events = group.collect_uncommitted_events() + await self._event_store.append( + group.id, new_events, expected_version=group.version - len(new_events), aggregate=group + ) + await self._read_model_repo.upsert_from_aggregate(group) + await self._event_producer.publish_many("ipam.events", new_events) + + +class DeleteFHRPGroupHandler(CommandHandler[None]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: FHRPGroupReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: DeleteFHRPGroupCommand) -> None: + group = await self._event_store.load_aggregate(FHRPGroup, command.fhrp_group_id) + if group is None: + raise EntityNotFoundError(f"FHRP group {command.fhrp_group_id} not found") + + group.delete() + + new_events = group.collect_uncommitted_events() + await self._event_store.append( + group.id, new_events, expected_version=group.version - len(new_events), aggregate=group + ) + await self._read_model_repo.mark_deleted(group.id) + await self._event_producer.publish_many("ipam.events", new_events) + + +# --------------------------------------------------------------------------- +# Bulk Operations +# --------------------------------------------------------------------------- + + +class BulkCreatePrefixesHandler(CommandHandler[list[UUID]]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: PrefixReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkCreatePrefixesCommand) -> list[UUID]: + results: list[UUID] = [] + all_events: list = [] + for item in command.items: + prefix = Prefix.create( + network=item.network, + vrf_id=item.vrf_id, + vlan_id=item.vlan_id, + status=PrefixStatus(item.status), + role=item.role, + tenant_id=item.tenant_id, + description=item.description, + custom_fields=item.custom_fields, + tags=item.tags, + ) + events = prefix.collect_uncommitted_events() + await self._event_store.append(prefix.id, events, expected_version=0) + await self._read_model_repo.upsert_from_aggregate(prefix) + all_events.extend(events) + results.append(prefix.id) + await self._event_producer.publish_many("ipam.events", all_events) + return results + + +class BulkCreateIPAddressesHandler(CommandHandler[list[UUID]]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: IPAddressReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + 
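+ # The read-model repo serves double duty in this handler: the per-item exists_in_vrf duplicate check and the projection upsert after each append.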
self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkCreateIPAddressesCommand) -> list[UUID]: + results: list[UUID] = [] + all_events: list = [] + for item in command.items: + if await self._read_model_repo.exists_in_vrf(item.address, item.vrf_id): + raise ConflictError(f"IP address {item.address} already exists in this VRF scope") + + ip = IPAddress.create( + address=item.address, + vrf_id=item.vrf_id, + status=IPAddressStatus(item.status), + dns_name=item.dns_name, + tenant_id=item.tenant_id, + description=item.description, + custom_fields=item.custom_fields, + tags=item.tags, + ) + events = ip.collect_uncommitted_events() + await self._event_store.append(ip.id, events, expected_version=0) + await self._read_model_repo.upsert_from_aggregate(ip) + all_events.extend(events) + results.append(ip.id) + await self._event_producer.publish_many("ipam.events", all_events) + return results + + +class BulkCreateVRFsHandler(CommandHandler[list[UUID]]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: VRFReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkCreateVRFsCommand) -> list[UUID]: + results: list[UUID] = [] + all_events: list = [] + for item in command.items: + vrf = VRF.create( + name=item.name, + rd=item.rd, + import_targets=item.import_targets, + export_targets=item.export_targets, + tenant_id=item.tenant_id, + description=item.description, + custom_fields=item.custom_fields, + tags=item.tags, + ) + events = vrf.collect_uncommitted_events() + await self._event_store.append(vrf.id, events, expected_version=0) + await self._read_model_repo.upsert_from_aggregate(vrf) + all_events.extend(events) + results.append(vrf.id) + await self._event_producer.publish_many("ipam.events", all_events) + return results + + +class BulkCreateVLANsHandler(CommandHandler[list[UUID]]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: VLANReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkCreateVLANsCommand) -> list[UUID]: + results: list[UUID] = [] + all_events: list = [] + for item in command.items: + vlan = VLAN.create( + vid=item.vid, + name=item.name, + group_id=item.group_id, + status=VLANStatus(item.status), + role=item.role, + tenant_id=item.tenant_id, + description=item.description, + custom_fields=item.custom_fields, + tags=item.tags, + ) + events = vlan.collect_uncommitted_events() + await self._event_store.append(vlan.id, events, expected_version=0) + await self._read_model_repo.upsert_from_aggregate(vlan) + all_events.extend(events) + results.append(vlan.id) + await self._event_producer.publish_many("ipam.events", all_events) + return results + + +class BulkCreateIPRangesHandler(CommandHandler[list[UUID]]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: IPRangeReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkCreateIPRangesCommand) -> list[UUID]: + results: list[UUID] = [] + all_events: list = [] + for item in command.items: + 
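+ # Each item becomes a brand-new aggregate, hence expected_version=0 on append; events are appended and projected per item, then published to Kafka as a single batch after the loop.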
ip_range = IPRange.create( + start_address=item.start_address, + end_address=item.end_address, + vrf_id=item.vrf_id, + status=IPRangeStatus(item.status), + tenant_id=item.tenant_id, + description=item.description, + custom_fields=item.custom_fields, + tags=item.tags, + ) + events = ip_range.collect_uncommitted_events() + await self._event_store.append(ip_range.id, events, expected_version=0) + await self._read_model_repo.upsert_from_aggregate(ip_range) + all_events.extend(events) + results.append(ip_range.id) + await self._event_producer.publish_many("ipam.events", all_events) + return results + + +class BulkCreateRIRsHandler(CommandHandler[list[UUID]]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: RIRReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkCreateRIRsCommand) -> list[UUID]: + results: list[UUID] = [] + all_events: list = [] + for item in command.items: + rir = RIR.create( + name=item.name, + is_private=item.is_private, + description=item.description, + custom_fields=item.custom_fields, + tags=item.tags, + ) + events = rir.collect_uncommitted_events() + await self._event_store.append(rir.id, events, expected_version=0) + await self._read_model_repo.upsert_from_aggregate(rir) + all_events.extend(events) + results.append(rir.id) + await self._event_producer.publish_many("ipam.events", all_events) + return results + + +class BulkCreateASNsHandler(CommandHandler[list[UUID]]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: ASNReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkCreateASNsCommand) -> list[UUID]: + results: list[UUID] = [] + all_events: list = [] + for item in command.items: + asn = ASN.create( + asn=item.asn, + rir_id=item.rir_id, + tenant_id=item.tenant_id, + description=item.description, + custom_fields=item.custom_fields, + tags=item.tags, + ) + events = asn.collect_uncommitted_events() + await self._event_store.append(asn.id, events, expected_version=0) + await self._read_model_repo.upsert_from_aggregate(asn) + all_events.extend(events) + results.append(asn.id) + await self._event_producer.publish_many("ipam.events", all_events) + return results + + +class BulkCreateFHRPGroupsHandler(CommandHandler[list[UUID]]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: FHRPGroupReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkCreateFHRPGroupsCommand) -> list[UUID]: + results: list[UUID] = [] + all_events: list = [] + for item in command.items: + group = FHRPGroup.create( + protocol=FHRPProtocol(item.protocol), + group_id_value=item.group_id_value, + auth_type=FHRPAuthType(item.auth_type), + auth_key=item.auth_key, + name=item.name, + description=item.description, + custom_fields=item.custom_fields, + tags=item.tags, + ) + events = group.collect_uncommitted_events() + await self._event_store.append(group.id, events, expected_version=0) + await self._read_model_repo.upsert_from_aggregate(group) + all_events.extend(events) + results.append(group.id) + await 
self._event_producer.publish_many("ipam.events", all_events) + return results + + +# --------------------------------------------------------------------------- +# Bulk Update / Delete +# --------------------------------------------------------------------------- + + +class BulkUpdatePrefixesHandler(CommandHandler[int]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: PrefixReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkUpdatePrefixesCommand) -> int: + all_events: list = [] + for item in command.items: + prefix = await self._event_store.load_aggregate(Prefix, item.prefix_id) + if prefix is None: + raise EntityNotFoundError(f"Prefix {item.prefix_id} not found") + prefix.update( + description=item.description, + role=item.role, + tenant_id=item.tenant_id, + vlan_id=item.vlan_id, + custom_fields=item.custom_fields, + tags=item.tags, + ) + new_events = prefix.collect_uncommitted_events() + await self._event_store.append( + prefix.id, new_events, expected_version=prefix.version - len(new_events), aggregate=prefix + ) + await self._read_model_repo.upsert_from_aggregate(prefix) + all_events.extend(new_events) + await self._event_producer.publish_many("ipam.events", all_events) + return len(command.items) + + +class BulkDeletePrefixesHandler(CommandHandler[int]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: PrefixReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkDeletePrefixesCommand) -> int: + all_events: list = [] + for agg_id in command.ids: + prefix = await self._event_store.load_aggregate(Prefix, agg_id) + if prefix is None: + raise EntityNotFoundError(f"Prefix {agg_id} not found") + prefix.delete() + new_events = prefix.collect_uncommitted_events() + await self._event_store.append( + prefix.id, new_events, expected_version=prefix.version - len(new_events), aggregate=prefix + ) + await self._read_model_repo.mark_deleted(prefix.id) + all_events.extend(new_events) + await self._event_producer.publish_many("ipam.events", all_events) + return len(command.ids) + + +class BulkUpdateIPAddressesHandler(CommandHandler[int]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: IPAddressReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkUpdateIPAddressesCommand) -> int: + all_events: list = [] + for item in command.items: + ip = await self._event_store.load_aggregate(IPAddress, item.ip_id) + if ip is None: + raise EntityNotFoundError(f"IP address {item.ip_id} not found") + ip.update( + dns_name=item.dns_name, + description=item.description, + custom_fields=item.custom_fields, + tags=item.tags, + ) + new_events = ip.collect_uncommitted_events() + await self._event_store.append( + ip.id, new_events, expected_version=ip.version - len(new_events), aggregate=ip + ) + await self._read_model_repo.upsert_from_aggregate(ip) + all_events.extend(new_events) + await self._event_producer.publish_many("ipam.events", all_events) + return len(command.items) + + +class 
BulkDeleteIPAddressesHandler(CommandHandler[int]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: IPAddressReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkDeleteIPAddressesCommand) -> int: + all_events: list = [] + for agg_id in command.ids: + ip = await self._event_store.load_aggregate(IPAddress, agg_id) + if ip is None: + raise EntityNotFoundError(f"IP address {agg_id} not found") + ip.delete() + new_events = ip.collect_uncommitted_events() + await self._event_store.append( + ip.id, new_events, expected_version=ip.version - len(new_events), aggregate=ip + ) + await self._read_model_repo.mark_deleted(ip.id) + all_events.extend(new_events) + await self._event_producer.publish_many("ipam.events", all_events) + return len(command.ids) + + +class BulkUpdateVRFsHandler(CommandHandler[int]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: VRFReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkUpdateVRFsCommand) -> int: + all_events: list = [] + for item in command.items: + vrf = await self._event_store.load_aggregate(VRF, item.vrf_id) + if vrf is None: + raise EntityNotFoundError(f"VRF {item.vrf_id} not found") + vrf.update( + name=item.name, + import_targets=item.import_targets, + export_targets=item.export_targets, + description=item.description, + custom_fields=item.custom_fields, + tags=item.tags, + ) + new_events = vrf.collect_uncommitted_events() + await self._event_store.append( + vrf.id, new_events, expected_version=vrf.version - len(new_events), aggregate=vrf + ) + await self._read_model_repo.upsert_from_aggregate(vrf) + all_events.extend(new_events) + await self._event_producer.publish_many("ipam.events", all_events) + return len(command.items) + + +class BulkDeleteVRFsHandler(CommandHandler[int]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: VRFReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkDeleteVRFsCommand) -> int: + all_events: list = [] + for agg_id in command.ids: + vrf = await self._event_store.load_aggregate(VRF, agg_id) + if vrf is None: + raise EntityNotFoundError(f"VRF {agg_id} not found") + vrf.delete() + new_events = vrf.collect_uncommitted_events() + await self._event_store.append( + vrf.id, new_events, expected_version=vrf.version - len(new_events), aggregate=vrf + ) + await self._read_model_repo.mark_deleted(vrf.id) + all_events.extend(new_events) + await self._event_producer.publish_many("ipam.events", all_events) + return len(command.ids) + + +class BulkUpdateVLANsHandler(CommandHandler[int]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: VLANReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkUpdateVLANsCommand) -> int: + all_events: list = [] + for item in command.items: + vlan = await self._event_store.load_aggregate(VLAN, item.vlan_id) + if 
vlan is None: + raise EntityNotFoundError(f"VLAN {item.vlan_id} not found") + vlan.update( + name=item.name, + role=item.role, + description=item.description, + custom_fields=item.custom_fields, + tags=item.tags, + ) + new_events = vlan.collect_uncommitted_events() + await self._event_store.append( + vlan.id, new_events, expected_version=vlan.version - len(new_events), aggregate=vlan + ) + await self._read_model_repo.upsert_from_aggregate(vlan) + all_events.extend(new_events) + await self._event_producer.publish_many("ipam.events", all_events) + return len(command.items) + + +class BulkDeleteVLANsHandler(CommandHandler[int]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: VLANReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkDeleteVLANsCommand) -> int: + all_events: list = [] + for agg_id in command.ids: + vlan = await self._event_store.load_aggregate(VLAN, agg_id) + if vlan is None: + raise EntityNotFoundError(f"VLAN {agg_id} not found") + vlan.delete() + new_events = vlan.collect_uncommitted_events() + await self._event_store.append( + vlan.id, new_events, expected_version=vlan.version - len(new_events), aggregate=vlan + ) + await self._read_model_repo.mark_deleted(vlan.id) + all_events.extend(new_events) + await self._event_producer.publish_many("ipam.events", all_events) + return len(command.ids) + + +class BulkUpdateIPRangesHandler(CommandHandler[int]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: IPRangeReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkUpdateIPRangesCommand) -> int: + all_events: list = [] + for item in command.items: + ip_range = await self._event_store.load_aggregate(IPRange, item.range_id) + if ip_range is None: + raise EntityNotFoundError(f"IP range {item.range_id} not found") + ip_range.update( + description=item.description, + tenant_id=item.tenant_id, + custom_fields=item.custom_fields, + tags=item.tags, + ) + new_events = ip_range.collect_uncommitted_events() + await self._event_store.append( + ip_range.id, new_events, expected_version=ip_range.version - len(new_events), aggregate=ip_range + ) + await self._read_model_repo.upsert_from_aggregate(ip_range) + all_events.extend(new_events) + await self._event_producer.publish_many("ipam.events", all_events) + return len(command.items) + + +class BulkDeleteIPRangesHandler(CommandHandler[int]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: IPRangeReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkDeleteIPRangesCommand) -> int: + all_events: list = [] + for agg_id in command.ids: + ip_range = await self._event_store.load_aggregate(IPRange, agg_id) + if ip_range is None: + raise EntityNotFoundError(f"IP range {agg_id} not found") + ip_range.delete() + new_events = ip_range.collect_uncommitted_events() + await self._event_store.append( + ip_range.id, new_events, expected_version=ip_range.version - len(new_events), aggregate=ip_range + ) + await self._read_model_repo.mark_deleted(ip_range.id) + 
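+ # Soft delete: mark_deleted flips the projection row's is_deleted flag (see the *_read table schemas) rather than dropping it; the delete events still go out in the batched publish below.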
all_events.extend(new_events) + await self._event_producer.publish_many("ipam.events", all_events) + return len(command.ids) + + +class BulkUpdateRIRsHandler(CommandHandler[int]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: RIRReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkUpdateRIRsCommand) -> int: + all_events: list = [] + for item in command.items: + rir = await self._event_store.load_aggregate(RIR, item.rir_id) + if rir is None: + raise EntityNotFoundError(f"RIR {item.rir_id} not found") + rir.update( + description=item.description, + is_private=item.is_private, + custom_fields=item.custom_fields, + tags=item.tags, + ) + new_events = rir.collect_uncommitted_events() + await self._event_store.append( + rir.id, new_events, expected_version=rir.version - len(new_events), aggregate=rir + ) + await self._read_model_repo.upsert_from_aggregate(rir) + all_events.extend(new_events) + await self._event_producer.publish_many("ipam.events", all_events) + return len(command.items) + + +class BulkDeleteRIRsHandler(CommandHandler[int]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: RIRReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkDeleteRIRsCommand) -> int: + all_events: list = [] + for agg_id in command.ids: + rir = await self._event_store.load_aggregate(RIR, agg_id) + if rir is None: + raise EntityNotFoundError(f"RIR {agg_id} not found") + rir.delete() + new_events = rir.collect_uncommitted_events() + await self._event_store.append( + rir.id, new_events, expected_version=rir.version - len(new_events), aggregate=rir + ) + await self._read_model_repo.mark_deleted(rir.id) + all_events.extend(new_events) + await self._event_producer.publish_many("ipam.events", all_events) + return len(command.ids) + + +class BulkUpdateASNsHandler(CommandHandler[int]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: ASNReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkUpdateASNsCommand) -> int: + all_events: list = [] + for item in command.items: + asn = await self._event_store.load_aggregate(ASN, item.asn_id) + if asn is None: + raise EntityNotFoundError(f"ASN {item.asn_id} not found") + asn.update( + description=item.description, + tenant_id=item.tenant_id, + custom_fields=item.custom_fields, + tags=item.tags, + ) + new_events = asn.collect_uncommitted_events() + await self._event_store.append( + asn.id, new_events, expected_version=asn.version - len(new_events), aggregate=asn + ) + await self._read_model_repo.upsert_from_aggregate(asn) + all_events.extend(new_events) + await self._event_producer.publish_many("ipam.events", all_events) + return len(command.items) + + +class BulkDeleteASNsHandler(CommandHandler[int]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: ASNReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def 
handle(self, command: BulkDeleteASNsCommand) -> int: + all_events: list = [] + for agg_id in command.ids: + asn = await self._event_store.load_aggregate(ASN, agg_id) + if asn is None: + raise EntityNotFoundError(f"ASN {agg_id} not found") + asn.delete() + new_events = asn.collect_uncommitted_events() + await self._event_store.append( + asn.id, new_events, expected_version=asn.version - len(new_events), aggregate=asn + ) + await self._read_model_repo.mark_deleted(asn.id) + all_events.extend(new_events) + await self._event_producer.publish_many("ipam.events", all_events) + return len(command.ids) + + +class BulkUpdateFHRPGroupsHandler(CommandHandler[int]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: FHRPGroupReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkUpdateFHRPGroupsCommand) -> int: + all_events: list = [] + for item in command.items: + group = await self._event_store.load_aggregate(FHRPGroup, item.fhrp_group_id) + if group is None: + raise EntityNotFoundError(f"FHRP group {item.fhrp_group_id} not found") + group.update( + name=item.name, + auth_type=item.auth_type, + auth_key=item.auth_key, + description=item.description, + custom_fields=item.custom_fields, + tags=item.tags, + ) + new_events = group.collect_uncommitted_events() + await self._event_store.append( + group.id, new_events, expected_version=group.version - len(new_events), aggregate=group + ) + await self._read_model_repo.upsert_from_aggregate(group) + all_events.extend(new_events) + await self._event_producer.publish_many("ipam.events", all_events) + return len(command.items) + + +class BulkDeleteFHRPGroupsHandler(CommandHandler[int]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: FHRPGroupReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkDeleteFHRPGroupsCommand) -> int: + all_events: list = [] + for agg_id in command.ids: + group = await self._event_store.load_aggregate(FHRPGroup, agg_id) + if group is None: + raise EntityNotFoundError(f"FHRP group {agg_id} not found") + group.delete() + new_events = group.collect_uncommitted_events() + await self._event_store.append( + group.id, new_events, expected_version=group.version - len(new_events), aggregate=group + ) + await self._read_model_repo.mark_deleted(group.id) + all_events.extend(new_events) + await self._event_producer.publish_many("ipam.events", all_events) + return len(command.ids) + + +# --------------------------------------------------------------------------- +# RouteTarget +# --------------------------------------------------------------------------- + + +class CreateRouteTargetHandler(CommandHandler[UUID]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: RouteTargetReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: CreateRouteTargetCommand) -> UUID: + rt = RouteTarget.create( + name=command.name, + tenant_id=command.tenant_id, + description=command.description, + custom_fields=command.custom_fields or {}, + tags=command.tags or [], + ) + events = 
rt.collect_uncommitted_events() + await self._event_store.append(rt.id, events, expected_version=0) + await self._read_model_repo.upsert_from_aggregate(rt) + await self._event_producer.publish_many("ipam.events", events) + return rt.id + + +class UpdateRouteTargetHandler(CommandHandler[None]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: RouteTargetReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: UpdateRouteTargetCommand) -> None: + rt = await self._event_store.load_aggregate(RouteTarget, command.route_target_id) + if rt is None: + raise EntityNotFoundError(f"RouteTarget {command.route_target_id} not found") + + rt.update( + description=command.description, + tenant_id=command.tenant_id, + custom_fields=command.custom_fields or {}, + tags=command.tags or [], + ) + + new_events = rt.collect_uncommitted_events() + await self._event_store.append(rt.id, new_events, expected_version=rt.version - len(new_events), aggregate=rt) + await self._read_model_repo.upsert_from_aggregate(rt) + await self._event_producer.publish_many("ipam.events", new_events) + + +class DeleteRouteTargetHandler(CommandHandler[None]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: RouteTargetReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: DeleteRouteTargetCommand) -> None: + rt = await self._event_store.load_aggregate(RouteTarget, command.route_target_id) + if rt is None: + raise EntityNotFoundError(f"RouteTarget {command.route_target_id} not found") + + rt.delete() + + new_events = rt.collect_uncommitted_events() + await self._event_store.append(rt.id, new_events, expected_version=rt.version - len(new_events), aggregate=rt) + await self._read_model_repo.mark_deleted(rt.id) + await self._event_producer.publish_many("ipam.events", new_events) + + +# --------------------------------------------------------------------------- +# VLANGroup +# --------------------------------------------------------------------------- + + +class CreateVLANGroupHandler(CommandHandler[UUID]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: VLANGroupReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: CreateVLANGroupCommand) -> UUID: + group = VLANGroup.create( + name=command.name, + slug=command.slug, + min_vid=command.min_vid, + max_vid=command.max_vid, + tenant_id=command.tenant_id, + description=command.description, + custom_fields=command.custom_fields or {}, + tags=command.tags or [], + ) + events = group.collect_uncommitted_events() + await self._event_store.append(group.id, events, expected_version=0) + await self._read_model_repo.upsert_from_aggregate(group) + await self._event_producer.publish_many("ipam.events", events) + return group.id + + +class UpdateVLANGroupHandler(CommandHandler[None]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: VLANGroupReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + 
self._event_producer = event_producer + + async def handle(self, command: UpdateVLANGroupCommand) -> None: + group = await self._event_store.load_aggregate(VLANGroup, command.vlan_group_id) + if group is None: + raise EntityNotFoundError(f"VLANGroup {command.vlan_group_id} not found") + + group.update( + name=command.name, + description=command.description, + min_vid=command.min_vid, + max_vid=command.max_vid, + custom_fields=command.custom_fields or {}, + tags=command.tags or [], + ) + + new_events = group.collect_uncommitted_events() + await self._event_store.append( + group.id, new_events, expected_version=group.version - len(new_events), aggregate=group + ) + await self._read_model_repo.upsert_from_aggregate(group) + await self._event_producer.publish_many("ipam.events", new_events) + + +class DeleteVLANGroupHandler(CommandHandler[None]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: VLANGroupReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: DeleteVLANGroupCommand) -> None: + group = await self._event_store.load_aggregate(VLANGroup, command.vlan_group_id) + if group is None: + raise EntityNotFoundError(f"VLANGroup {command.vlan_group_id} not found") + + group.delete() + + new_events = group.collect_uncommitted_events() + await self._event_store.append( + group.id, new_events, expected_version=group.version - len(new_events), aggregate=group + ) + await self._read_model_repo.mark_deleted(group.id) + await self._event_producer.publish_many("ipam.events", new_events) + + +# --------------------------------------------------------------------------- +# Service +# --------------------------------------------------------------------------- + + +class CreateServiceHandler(CommandHandler[UUID]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: ServiceReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: CreateServiceCommand) -> UUID: + svc = Service.create( + name=command.name, + protocol=ServiceProtocol(command.protocol), + ports=command.ports or [], + ip_addresses=command.ip_addresses or [], + description=command.description, + custom_fields=command.custom_fields or {}, + tags=command.tags or [], + ) + events = svc.collect_uncommitted_events() + await self._event_store.append(svc.id, events, expected_version=0) + await self._read_model_repo.upsert_from_aggregate(svc) + await self._event_producer.publish_many("ipam.events", events) + return svc.id + + +class UpdateServiceHandler(CommandHandler[None]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: ServiceReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: UpdateServiceCommand) -> None: + svc = await self._event_store.load_aggregate(Service, command.service_id) + if svc is None: + raise EntityNotFoundError(f"Service {command.service_id} not found") + + svc.update( + name=command.name, + protocol=command.protocol, + ports=command.ports or [], + ip_addresses=command.ip_addresses or [], + description=command.description, + 
custom_fields=command.custom_fields or {}, + tags=command.tags or [], + ) + + new_events = svc.collect_uncommitted_events() + await self._event_store.append( + svc.id, new_events, expected_version=svc.version - len(new_events), aggregate=svc + ) + await self._read_model_repo.upsert_from_aggregate(svc) + await self._event_producer.publish_many("ipam.events", new_events) + + +class DeleteServiceHandler(CommandHandler[None]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: ServiceReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: DeleteServiceCommand) -> None: + svc = await self._event_store.load_aggregate(Service, command.service_id) + if svc is None: + raise EntityNotFoundError(f"Service {command.service_id} not found") + + svc.delete() + + new_events = svc.collect_uncommitted_events() + await self._event_store.append( + svc.id, new_events, expected_version=svc.version - len(new_events), aggregate=svc + ) + await self._read_model_repo.mark_deleted(svc.id) + await self._event_producer.publish_many("ipam.events", new_events) + + +class BulkUpdateRouteTargetsHandler(CommandHandler[int]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: RouteTargetReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkUpdateRouteTargetsCommand) -> int: + all_events: list = [] + for item in command.items: + rt = await self._event_store.load_aggregate(RouteTarget, item.route_target_id) + if rt is None: + raise EntityNotFoundError(f"RouteTarget {item.route_target_id} not found") + rt.update( + description=item.description, + tenant_id=item.tenant_id, + custom_fields=item.custom_fields, + tags=item.tags, + ) + new_events = rt.collect_uncommitted_events() + await self._event_store.append( + rt.id, new_events, expected_version=rt.version - len(new_events), aggregate=rt + ) + await self._read_model_repo.upsert_from_aggregate(rt) + all_events.extend(new_events) + await self._event_producer.publish_many("ipam.events", all_events) + return len(command.items) + + +class BulkDeleteRouteTargetsHandler(CommandHandler[int]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: RouteTargetReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkDeleteRouteTargetsCommand) -> int: + all_events: list = [] + for agg_id in command.ids: + rt = await self._event_store.load_aggregate(RouteTarget, agg_id) + if rt is None: + raise EntityNotFoundError(f"RouteTarget {agg_id} not found") + rt.delete() + new_events = rt.collect_uncommitted_events() + await self._event_store.append( + rt.id, new_events, expected_version=rt.version - len(new_events), aggregate=rt + ) + await self._read_model_repo.mark_deleted(rt.id) + all_events.extend(new_events) + await self._event_producer.publish_many("ipam.events", all_events) + return len(command.ids) + + +class BulkUpdateVLANGroupsHandler(CommandHandler[int]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: VLANGroupReadModelRepository, + event_producer: KafkaEventProducer, + ) -> 
None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkUpdateVLANGroupsCommand) -> int: + all_events: list = [] + for item in command.items: + group = await self._event_store.load_aggregate(VLANGroup, item.vlan_group_id) + if group is None: + raise EntityNotFoundError(f"VLANGroup {item.vlan_group_id} not found") + group.update( + name=item.name, + description=item.description, + min_vid=item.min_vid, + max_vid=item.max_vid, + custom_fields=item.custom_fields, + tags=item.tags, + ) + new_events = group.collect_uncommitted_events() + await self._event_store.append( + group.id, new_events, expected_version=group.version - len(new_events), aggregate=group + ) + await self._read_model_repo.upsert_from_aggregate(group) + all_events.extend(new_events) + await self._event_producer.publish_many("ipam.events", all_events) + return len(command.items) + + +class BulkDeleteVLANGroupsHandler(CommandHandler[int]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: VLANGroupReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkDeleteVLANGroupsCommand) -> int: + all_events: list = [] + for agg_id in command.ids: + group = await self._event_store.load_aggregate(VLANGroup, agg_id) + if group is None: + raise EntityNotFoundError(f"VLANGroup {agg_id} not found") + group.delete() + new_events = group.collect_uncommitted_events() + await self._event_store.append( + group.id, new_events, expected_version=group.version - len(new_events), aggregate=group + ) + await self._read_model_repo.mark_deleted(group.id) + all_events.extend(new_events) + await self._event_producer.publish_many("ipam.events", all_events) + return len(command.ids) + + +class BulkUpdateServicesHandler(CommandHandler[int]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: ServiceReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkUpdateServicesCommand) -> int: + all_events: list = [] + for item in command.items: + svc = await self._event_store.load_aggregate(Service, item.service_id) + if svc is None: + raise EntityNotFoundError(f"Service {item.service_id} not found") + svc.update( + name=item.name, + protocol=item.protocol, + ports=item.ports, + ip_addresses=item.ip_addresses, + description=item.description, + custom_fields=item.custom_fields, + tags=item.tags, + ) + new_events = svc.collect_uncommitted_events() + await self._event_store.append( + svc.id, new_events, expected_version=svc.version - len(new_events), aggregate=svc + ) + await self._read_model_repo.upsert_from_aggregate(svc) + all_events.extend(new_events) + await self._event_producer.publish_many("ipam.events", all_events) + return len(command.items) + + +class BulkDeleteServicesHandler(CommandHandler[int]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: ServiceReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkDeleteServicesCommand) -> int: + all_events: list = [] + for 
agg_id in command.ids: + svc = await self._event_store.load_aggregate(Service, agg_id) + if svc is None: + raise EntityNotFoundError(f"Service {agg_id} not found") + svc.delete() + new_events = svc.collect_uncommitted_events() + await self._event_store.append( + svc.id, new_events, expected_version=svc.version - len(new_events), aggregate=svc + ) + await self._read_model_repo.mark_deleted(svc.id) + all_events.extend(new_events) + await self._event_producer.publish_many("ipam.events", all_events) + return len(command.ids) + + +# --------------------------------------------------------------------------- +# Bulk Operations (new aggregates) +# --------------------------------------------------------------------------- + + +class BulkCreateRouteTargetsHandler(CommandHandler[list[UUID]]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: RouteTargetReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkCreateRouteTargetsCommand) -> list[UUID]: + results: list[UUID] = [] + all_events: list = [] + for item in command.items: + rt = RouteTarget.create( + name=item.name, + tenant_id=item.tenant_id, + description=item.description, + custom_fields=item.custom_fields, + tags=item.tags, + ) + events = rt.collect_uncommitted_events() + await self._event_store.append(rt.id, events, expected_version=0) + await self._read_model_repo.upsert_from_aggregate(rt) + all_events.extend(events) + results.append(rt.id) + await self._event_producer.publish_many("ipam.events", all_events) + return results + + +class BulkCreateVLANGroupsHandler(CommandHandler[list[UUID]]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: VLANGroupReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkCreateVLANGroupsCommand) -> list[UUID]: + results: list[UUID] = [] + all_events: list = [] + for item in command.items: + group = VLANGroup.create( + name=item.name, + slug=item.slug, + min_vid=item.min_vid, + max_vid=item.max_vid, + tenant_id=item.tenant_id, + description=item.description, + custom_fields=item.custom_fields, + tags=item.tags, + ) + events = group.collect_uncommitted_events() + await self._event_store.append(group.id, events, expected_version=0) + await self._read_model_repo.upsert_from_aggregate(group) + all_events.extend(events) + results.append(group.id) + await self._event_producer.publish_many("ipam.events", all_events) + return results + + +class BulkCreateServicesHandler(CommandHandler[list[UUID]]): + def __init__( + self, + event_store: PostgresEventStore, + read_model_repo: ServiceReadModelRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._event_store = event_store + self._read_model_repo = read_model_repo + self._event_producer = event_producer + + async def handle(self, command: BulkCreateServicesCommand) -> list[UUID]: + results: list[UUID] = [] + all_events: list = [] + for item in command.items: + svc = Service.create( + name=item.name, + protocol=ServiceProtocol(item.protocol), + ports=item.ports, + ip_addresses=item.ip_addresses, + description=item.description, + custom_fields=item.custom_fields, + tags=item.tags, + ) + events = svc.collect_uncommitted_events() + await 
self._event_store.append(svc.id, events, expected_version=0) + await self._read_model_repo.upsert_from_aggregate(svc) + all_events.extend(events) + results.append(svc.id) + await self._event_producer.publish_many("ipam.events", all_events) + return results + + +# --------------------------------------------------------------------------- +# Saved Filter +# --------------------------------------------------------------------------- + + +class CreateSavedFilterHandler(CommandHandler[UUID]): + def __init__(self, repo: SavedFilterRepository) -> None: + self._repo = repo + + async def handle(self, command: Command) -> UUID: + if command.is_default: + await self._repo.clear_default(command.user_id, command.entity_type) + return await self._repo.create( + { + "id": uuid4(), + "user_id": command.user_id, + "name": command.name, + "entity_type": command.entity_type, + "filter_config": command.filter_config, + "is_default": command.is_default, + } + ) + + +class UpdateSavedFilterHandler(CommandHandler[None]): + def __init__(self, repo: SavedFilterRepository) -> None: + self._repo = repo + + async def handle(self, command: Command) -> None: + existing = await self._repo.find_by_id(command.filter_id) + if existing is None: + raise EntityNotFoundError(f"SavedFilter {command.filter_id} not found") + update_data: dict = {} + if command.name is not None: + update_data["name"] = command.name + if command.filter_config is not None: + update_data["filter_config"] = command.filter_config + if command.is_default is not None: + update_data["is_default"] = command.is_default + if command.is_default: + await self._repo.clear_default(existing["user_id"], existing["entity_type"]) + if update_data: + await self._repo.update(command.filter_id, update_data) + + +class DeleteSavedFilterHandler(CommandHandler[None]): + def __init__(self, repo: SavedFilterRepository) -> None: + self._repo = repo + + async def handle(self, command: Command) -> None: + existing = await self._repo.find_by_id(command.filter_id) + if existing is None: + raise EntityNotFoundError(f"SavedFilter {command.filter_id} not found") + await self._repo.delete(command.filter_id) diff --git a/services/ipam/src/ipam/application/commands.py b/services/ipam/src/ipam/application/commands.py new file mode 100644 index 0000000..59771ab --- /dev/null +++ b/services/ipam/src/ipam/application/commands.py @@ -0,0 +1,573 @@ +from uuid import UUID + +from shared.cqrs.command import Command + +# --- Prefix --- + + +class CreatePrefixCommand(Command): + network: str + vrf_id: UUID | None = None + vlan_id: UUID | None = None + status: str = "active" + role: str | None = None + tenant_id: UUID | None = None + description: str = "" + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class UpdatePrefixCommand(Command): + prefix_id: UUID + description: str | None = None + role: str | None = None + tenant_id: UUID | None = None + vlan_id: UUID | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class ChangePrefixStatusCommand(Command): + prefix_id: UUID + status: str + + +class DeletePrefixCommand(Command): + prefix_id: UUID + + +# --- IPAddress --- + + +class CreateIPAddressCommand(Command): + address: str + vrf_id: UUID | None = None + status: str = "active" + dns_name: str = "" + tenant_id: UUID | None = None + description: str = "" + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class UpdateIPAddressCommand(Command): + ip_id: UUID + dns_name: str | None = None + description: str | None = 
None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class ChangeIPAddressStatusCommand(Command): + ip_id: UUID + status: str + + +class DeleteIPAddressCommand(Command): + ip_id: UUID + + +# --- VRF --- + + +class CreateVRFCommand(Command): + name: str + rd: str | None = None + import_targets: list[UUID] | None = None + export_targets: list[UUID] | None = None + tenant_id: UUID | None = None + description: str = "" + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class UpdateVRFCommand(Command): + vrf_id: UUID + name: str | None = None + import_targets: list[UUID] | None = None + export_targets: list[UUID] | None = None + description: str | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class DeleteVRFCommand(Command): + vrf_id: UUID + + +# --- VLAN --- + + +class CreateVLANCommand(Command): + vid: int + name: str + group_id: UUID | None = None + status: str = "active" + role: str | None = None + tenant_id: UUID | None = None + description: str = "" + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class UpdateVLANCommand(Command): + vlan_id: UUID + name: str | None = None + role: str | None = None + description: str | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class ChangeVLANStatusCommand(Command): + vlan_id: UUID + status: str + + +class DeleteVLANCommand(Command): + vlan_id: UUID + + +# --- IPRange --- + + +class CreateIPRangeCommand(Command): + start_address: str + end_address: str + vrf_id: UUID | None = None + status: str = "active" + tenant_id: UUID | None = None + description: str = "" + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class UpdateIPRangeCommand(Command): + range_id: UUID + description: str | None = None + tenant_id: UUID | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class ChangeIPRangeStatusCommand(Command): + range_id: UUID + status: str + + +class DeleteIPRangeCommand(Command): + range_id: UUID + + +# --- RIR --- + + +class CreateRIRCommand(Command): + name: str + is_private: bool = False + description: str = "" + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class UpdateRIRCommand(Command): + rir_id: UUID + description: str | None = None + is_private: bool | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class DeleteRIRCommand(Command): + rir_id: UUID + + +# --- ASN --- + + +class CreateASNCommand(Command): + asn: int + rir_id: UUID | None = None + tenant_id: UUID | None = None + description: str = "" + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class UpdateASNCommand(Command): + asn_id: UUID + description: str | None = None + tenant_id: UUID | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class DeleteASNCommand(Command): + asn_id: UUID + + +# --- FHRPGroup --- + + +class CreateFHRPGroupCommand(Command): + protocol: str + group_id_value: int + auth_type: str = "plaintext" + auth_key: str = "" + name: str = "" + description: str = "" + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class UpdateFHRPGroupCommand(Command): + fhrp_group_id: UUID + name: str | None = None + auth_type: str | None = None + auth_key: str | None = None + description: str | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class DeleteFHRPGroupCommand(Command): + fhrp_group_id: 
UUID + + +# --- Bulk Operations --- + + +class BulkCreatePrefixesCommand(Command): + items: list[CreatePrefixCommand] + + +class BulkUpdatePrefixItem(Command): + prefix_id: UUID + description: str | None = None + role: str | None = None + tenant_id: UUID | None = None + vlan_id: UUID | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class BulkUpdatePrefixesCommand(Command): + items: list[BulkUpdatePrefixItem] + + +class BulkDeletePrefixesCommand(Command): + ids: list[UUID] + + +class BulkCreateIPAddressesCommand(Command): + items: list[CreateIPAddressCommand] + + +class BulkUpdateIPAddressItem(Command): + ip_id: UUID + dns_name: str | None = None + description: str | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class BulkUpdateIPAddressesCommand(Command): + items: list[BulkUpdateIPAddressItem] + + +class BulkDeleteIPAddressesCommand(Command): + ids: list[UUID] + + +class BulkCreateVRFsCommand(Command): + items: list[CreateVRFCommand] + + +class BulkUpdateVRFItem(Command): + vrf_id: UUID + name: str | None = None + import_targets: list[UUID] | None = None + export_targets: list[UUID] | None = None + description: str | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class BulkUpdateVRFsCommand(Command): + items: list[BulkUpdateVRFItem] + + +class BulkDeleteVRFsCommand(Command): + ids: list[UUID] + + +class BulkCreateVLANsCommand(Command): + items: list[CreateVLANCommand] + + +class BulkUpdateVLANItem(Command): + vlan_id: UUID + name: str | None = None + role: str | None = None + description: str | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class BulkUpdateVLANsCommand(Command): + items: list[BulkUpdateVLANItem] + + +class BulkDeleteVLANsCommand(Command): + ids: list[UUID] + + +class BulkCreateIPRangesCommand(Command): + items: list[CreateIPRangeCommand] + + +class BulkUpdateIPRangeItem(Command): + range_id: UUID + description: str | None = None + tenant_id: UUID | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class BulkUpdateIPRangesCommand(Command): + items: list[BulkUpdateIPRangeItem] + + +class BulkDeleteIPRangesCommand(Command): + ids: list[UUID] + + +class BulkCreateRIRsCommand(Command): + items: list[CreateRIRCommand] + + +class BulkUpdateRIRItem(Command): + rir_id: UUID + description: str | None = None + is_private: bool | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class BulkUpdateRIRsCommand(Command): + items: list[BulkUpdateRIRItem] + + +class BulkDeleteRIRsCommand(Command): + ids: list[UUID] + + +class BulkCreateASNsCommand(Command): + items: list[CreateASNCommand] + + +class BulkUpdateASNItem(Command): + asn_id: UUID + description: str | None = None + tenant_id: UUID | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class BulkUpdateASNsCommand(Command): + items: list[BulkUpdateASNItem] + + +class BulkDeleteASNsCommand(Command): + ids: list[UUID] + + +class BulkCreateFHRPGroupsCommand(Command): + items: list[CreateFHRPGroupCommand] + + +class BulkUpdateFHRPGroupItem(Command): + fhrp_group_id: UUID + name: str | None = None + auth_type: str | None = None + auth_key: str | None = None + description: str | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class BulkUpdateFHRPGroupsCommand(Command): + items: list[BulkUpdateFHRPGroupItem] + + +class 
BulkDeleteFHRPGroupsCommand(Command): + ids: list[UUID] + + +# --- RouteTarget --- + + +class CreateRouteTargetCommand(Command): + name: str + tenant_id: UUID | None = None + description: str = "" + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class UpdateRouteTargetCommand(Command): + route_target_id: UUID + description: str | None = None + tenant_id: UUID | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class DeleteRouteTargetCommand(Command): + route_target_id: UUID + + +# --- VLANGroup --- + + +class CreateVLANGroupCommand(Command): + name: str + slug: str + min_vid: int = 1 + max_vid: int = 4094 + tenant_id: UUID | None = None + description: str = "" + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class UpdateVLANGroupCommand(Command): + vlan_group_id: UUID + name: str | None = None + description: str | None = None + min_vid: int | None = None + max_vid: int | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class DeleteVLANGroupCommand(Command): + vlan_group_id: UUID + + +# --- Service --- + + +class CreateServiceCommand(Command): + name: str + protocol: str = "tcp" + ports: list[int] | None = None + ip_addresses: list[UUID] | None = None + description: str = "" + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class UpdateServiceCommand(Command): + service_id: UUID + name: str | None = None + protocol: str | None = None + ports: list[int] | None = None + ip_addresses: list[UUID] | None = None + description: str | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class DeleteServiceCommand(Command): + service_id: UUID + + +# --- Bulk Operations (new aggregates) --- + + +class BulkCreateRouteTargetsCommand(Command): + items: list[CreateRouteTargetCommand] + + +class BulkUpdateRouteTargetItem(Command): + route_target_id: UUID + description: str | None = None + tenant_id: UUID | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class BulkUpdateRouteTargetsCommand(Command): + items: list[BulkUpdateRouteTargetItem] + + +class BulkDeleteRouteTargetsCommand(Command): + ids: list[UUID] + + +class BulkCreateVLANGroupsCommand(Command): + items: list[CreateVLANGroupCommand] + + +class BulkUpdateVLANGroupItem(Command): + vlan_group_id: UUID + name: str | None = None + description: str | None = None + min_vid: int | None = None + max_vid: int | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class BulkUpdateVLANGroupsCommand(Command): + items: list[BulkUpdateVLANGroupItem] + + +class BulkDeleteVLANGroupsCommand(Command): + ids: list[UUID] + + +class BulkCreateServicesCommand(Command): + items: list[CreateServiceCommand] + + +class BulkUpdateServiceItem(Command): + service_id: UUID + name: str | None = None + protocol: str | None = None + ports: list[int] | None = None + ip_addresses: list[UUID] | None = None + description: str | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class BulkUpdateServicesCommand(Command): + items: list[BulkUpdateServiceItem] + + +class BulkDeleteServicesCommand(Command): + ids: list[UUID] + + +# --- Saved Filter --- + + +class CreateSavedFilterCommand(Command): + user_id: UUID + name: str + entity_type: str + filter_config: dict = {} + is_default: bool = False + + +class UpdateSavedFilterCommand(Command): + filter_id: UUID + name: str | None = None + filter_config: dict | 
None = None + is_default: bool | None = None + + +class DeleteSavedFilterCommand(Command): + filter_id: UUID diff --git a/services/ipam/src/ipam/application/dto.py b/services/ipam/src/ipam/application/dto.py new file mode 100644 index 0000000..fecb922 --- /dev/null +++ b/services/ipam/src/ipam/application/dto.py @@ -0,0 +1,174 @@ +from datetime import datetime +from uuid import UUID + +from pydantic import BaseModel + + +class PrefixDTO(BaseModel): + id: UUID + network: str + vrf_id: UUID | None + vlan_id: UUID | None + status: str + role: str | None + tenant_id: UUID | None + description: str + custom_fields: dict + tags: list[UUID] + created_at: datetime + updated_at: datetime + + +class IPAddressDTO(BaseModel): + id: UUID + address: str + vrf_id: UUID | None + status: str + dns_name: str + tenant_id: UUID | None + description: str + custom_fields: dict + tags: list[UUID] + created_at: datetime + updated_at: datetime + + +class VRFDTO(BaseModel): + id: UUID + name: str + rd: str | None + import_targets: list[UUID] + export_targets: list[UUID] + tenant_id: UUID | None + description: str + custom_fields: dict + tags: list[UUID] + created_at: datetime + updated_at: datetime + + +class VLANDTO(BaseModel): + id: UUID + vid: int + name: str + group_id: UUID | None + status: str + role: str | None + tenant_id: UUID | None + description: str + custom_fields: dict + tags: list[UUID] + created_at: datetime + updated_at: datetime + + +class IPRangeDTO(BaseModel): + id: UUID + start_address: str + end_address: str + vrf_id: UUID | None + status: str + tenant_id: UUID | None + description: str + custom_fields: dict + tags: list[UUID] + created_at: datetime + updated_at: datetime + + +class RIRDTO(BaseModel): + id: UUID + name: str + is_private: bool + description: str + custom_fields: dict + tags: list[UUID] + created_at: datetime + updated_at: datetime + + +class ASNDTO(BaseModel): + id: UUID + asn: int + rir_id: UUID | None + tenant_id: UUID | None + description: str + custom_fields: dict + tags: list[UUID] + created_at: datetime + updated_at: datetime + + +class FHRPGroupDTO(BaseModel): + id: UUID + protocol: str + group_id_value: int + auth_type: str + name: str + description: str + custom_fields: dict + tags: list[UUID] + created_at: datetime + updated_at: datetime + + +class RouteTargetDTO(BaseModel): + id: UUID + name: str + tenant_id: UUID | None + description: str + custom_fields: dict + tags: list[UUID] + created_at: datetime + updated_at: datetime + + +class VLANGroupDTO(BaseModel): + id: UUID + name: str + slug: str + min_vid: int + max_vid: int + tenant_id: UUID | None + description: str + custom_fields: dict + tags: list[UUID] + created_at: datetime + updated_at: datetime + + +class ServiceDTO(BaseModel): + id: UUID + name: str + protocol: str + ports: list[int] + ip_addresses: list[UUID] + description: str + custom_fields: dict + tags: list[UUID] + created_at: datetime + updated_at: datetime + + +class SavedFilterDTO(BaseModel): + id: UUID + user_id: UUID + name: str + entity_type: str + filter_config: dict + is_default: bool + created_at: datetime + updated_at: datetime + + +class SearchResultDTO(BaseModel): + entity_type: str + entity_id: UUID + display_text: str + description: str + relevance: float + + +class GlobalSearchResultDTO(BaseModel): + results: list[SearchResultDTO] + total: int diff --git a/services/ipam/src/ipam/application/export_service.py b/services/ipam/src/ipam/application/export_service.py new file mode 100644 index 0000000..2901f9f --- /dev/null +++ 
b/services/ipam/src/ipam/application/export_service.py
@@ -0,0 +1,53 @@
+"""Export service — converts DTO dicts to CSV, JSON, YAML formats."""
+
+from __future__ import annotations
+
+import csv
+import io
+import json
+from datetime import datetime
+from typing import Any
+from uuid import UUID
+
+import yaml
+
+
+def _serialize_value(value: Any) -> Any:
+    """Recursively convert non-serializable types (UUID, datetime) to strings."""
+    if isinstance(value, UUID):
+        return str(value)
+    if isinstance(value, datetime):
+        return value.isoformat()
+    if isinstance(value, dict):
+        return {k: _serialize_value(v) for k, v in value.items()}
+    if isinstance(value, list):
+        return [_serialize_value(v) for v in value]
+    return value
+
+
+def _serialize_row(item: dict[str, Any], *, flatten: bool = False) -> dict[str, Any]:
+    """Serialize a single row for export.
+
+    With flatten=True (used for CSV), nested dicts/lists are rendered as JSON
+    strings so they fit in a single cell. Without it, their structure is kept,
+    so JSON/YAML exports emit real objects and arrays instead of
+    double-encoded strings.
+    """
+    serialized = {k: _serialize_value(v) for k, v in item.items()}
+    if flatten:
+        return {k: json.dumps(v) if isinstance(v, dict | list) else v for k, v in serialized.items()}
+    return serialized
+
+
+def export_csv(items: list[dict[str, Any]]) -> str:
+    """Export items as CSV string."""
+    if not items:
+        return ""
+    fields = list(items[0].keys())
+    output = io.StringIO()
+    writer = csv.DictWriter(output, fieldnames=fields)
+    writer.writeheader()
+    for item in items:
+        writer.writerow(_serialize_row(item, flatten=True))
+    return output.getvalue()
+
+
+def export_json(items: list[dict[str, Any]]) -> str:
+    """Export items as JSON string."""
+    serialized = [_serialize_row(item) for item in items]
+    return json.dumps(serialized, indent=2, ensure_ascii=False)
+
+
+def export_yaml(items: list[dict[str, Any]]) -> str:
+    """Export items as YAML string."""
+    serialized = [_serialize_row(item) for item in items]
+    return yaml.dump(serialized, allow_unicode=True, default_flow_style=False, sort_keys=False)
diff --git a/services/ipam/src/ipam/application/import_service.py b/services/ipam/src/ipam/application/import_service.py
new file mode 100644
index 0000000..0a2eaf7
--- /dev/null
+++ b/services/ipam/src/ipam/application/import_service.py
@@ -0,0 +1,100 @@
+"""CSV import service — parses CSV content into command-ready dicts."""
+
+from __future__ import annotations
+
+import csv
+import io
+import json
+from typing import Any
+from uuid import UUID
+
+from pydantic import BaseModel
+
+
+class ImportRowError(BaseModel):
+    row: int
+    field: str
+    error: str
+
+
+# Fields that should be parsed as UUID
+_UUID_FIELDS = {"vrf_id", "vlan_id", "tenant_id", "rir_id", "group_id"}
+
+# Fields that should be parsed as int
+_INT_FIELDS = {"vid", "asn", "group_id_value", "min_vid", "max_vid"}
+
+# Fields that should be parsed as bool
+_BOOL_FIELDS = {"is_private"}
+
+# Fields that should be parsed as JSON
+_JSON_FIELDS = {"custom_fields", "tags", "import_targets", "export_targets", "ports", "ip_addresses"}
+
+
+def _convert_value(field: str, value: str) -> Any:
+    """Convert a CSV string value to the appropriate Python type."""
+    stripped = value.strip()
+    if stripped == "":
+        return None
+
+    if field in _UUID_FIELDS:
+        return UUID(stripped)
+    if field in _INT_FIELDS:
+        return int(stripped)
+    if field in _BOOL_FIELDS:
+        return stripped.lower() in ("true", "1", "yes")
+    if field in _JSON_FIELDS:
+        return json.loads(stripped)
+    return stripped
+
+
+def parse_csv(content: str) -> tuple[list[dict[str, Any]], list[ImportRowError]]:
+    """Parse CSV content into a list of dicts ready for BulkCreate commands.
+
+    Returns:
+        Tuple of (parsed_items, errors). Each parsed item is a dict with
+        field names matching CreateCommand fields.
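+
+    Example (illustrative; the column names below are hypothetical and
+    depend on the target entity type):
+
+        content = "vid,name\n100,lab-vlan\n"
+        items, errors = parse_csv(content)
+        # items == [{"vid": 100, "name": "lab-vlan"}]  (vid coerced to int)
+        # errors == []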
+ """ + reader = csv.DictReader(io.StringIO(content)) + items: list[dict[str, Any]] = [] + errors: list[ImportRowError] = [] + + for row_num, row in enumerate(reader, start=2): # row 1 is header + item: dict[str, Any] = {} + row_has_error = False + + for field, raw_value in row.items(): + if field is None or raw_value is None: + continue + field = field.strip() + if not field: + continue + try: + converted = _convert_value(field, raw_value) + if converted is not None: + item[field] = converted + except (ValueError, json.JSONDecodeError) as e: + errors.append(ImportRowError(row=row_num, field=field, error=str(e))) + row_has_error = True + + if not row_has_error and item: + items.append(item) + + return items, errors + + +# Mapping from entity_type to BulkCreate command class name +ENTITY_COMMAND_MAP: dict[str, str] = { + "prefix": "BulkCreatePrefixesCommand", + "ip_address": "BulkCreateIPAddressesCommand", + "vrf": "BulkCreateVRFsCommand", + "vlan": "BulkCreateVLANsCommand", + "ip_range": "BulkCreateIPRangesCommand", + "rir": "BulkCreateRIRsCommand", + "asn": "BulkCreateASNsCommand", + "fhrp_group": "BulkCreateFHRPGroupsCommand", + "route_target": "BulkCreateRouteTargetsCommand", + "vlan_group": "BulkCreateVLANGroupsCommand", + "service": "BulkCreateServicesCommand", +} + +VALID_ENTITY_TYPES = set(ENTITY_COMMAND_MAP.keys()) diff --git a/services/ipam/src/ipam/application/queries.py b/services/ipam/src/ipam/application/queries.py new file mode 100644 index 0000000..955248e --- /dev/null +++ b/services/ipam/src/ipam/application/queries.py @@ -0,0 +1,196 @@ +from datetime import datetime +from typing import Any +from uuid import UUID + +from shared.cqrs.query import Query + +# --- Base --- + + +class BaseListQuery(Query): + offset: int = 0 + limit: int = 50 + description_contains: str | None = None + tag_slugs: list[str] | None = None + custom_field_filters: dict[str, Any] | None = None + created_after: datetime | None = None + created_before: datetime | None = None + updated_after: datetime | None = None + updated_before: datetime | None = None + sort_by: str | None = None + sort_dir: str = "asc" + + +# --- Prefix --- + + +class GetPrefixQuery(Query): + prefix_id: UUID + + +class ListPrefixesQuery(BaseListQuery): + vrf_id: UUID | None = None + status: str | None = None + tenant_id: UUID | None = None + role: str | None = None + + +class GetPrefixChildrenQuery(Query): + prefix_id: UUID + + +class GetPrefixUtilizationQuery(Query): + prefix_id: UUID + + +class GetAvailablePrefixesQuery(Query): + prefix_id: UUID + desired_prefix_length: int + + +class GetAvailableIPsQuery(Query): + prefix_id: UUID + count: int = 1 + + +# --- IPAddress --- + + +class GetIPAddressQuery(Query): + ip_id: UUID + + +class ListIPAddressesQuery(BaseListQuery): + vrf_id: UUID | None = None + status: str | None = None + tenant_id: UUID | None = None + + +# --- VRF --- + + +class GetVRFQuery(Query): + vrf_id: UUID + + +class ListVRFsQuery(BaseListQuery): + tenant_id: UUID | None = None + + +# --- VLAN --- + + +class GetVLANQuery(Query): + vlan_id: UUID + + +class ListVLANsQuery(BaseListQuery): + group_id: UUID | None = None + status: str | None = None + tenant_id: UUID | None = None + + +# --- IPRange --- + + +class GetIPRangeQuery(Query): + range_id: UUID + + +class ListIPRangesQuery(BaseListQuery): + vrf_id: UUID | None = None + status: str | None = None + tenant_id: UUID | None = None + + +class GetIPRangeUtilizationQuery(Query): + range_id: UUID + + +# --- RIR --- + + +class GetRIRQuery(Query): + rir_id: UUID + + +class 
ListRIRsQuery(BaseListQuery):
+    pass
+
+
+# --- ASN ---
+
+
+class GetASNQuery(Query):
+    asn_id: UUID
+
+
+class ListASNsQuery(BaseListQuery):
+    rir_id: UUID | None = None
+    tenant_id: UUID | None = None
+
+
+# --- FHRPGroup ---
+
+
+class GetFHRPGroupQuery(Query):
+    fhrp_group_id: UUID
+
+
+class ListFHRPGroupsQuery(BaseListQuery):
+    pass
+
+
+# --- RouteTarget ---
+
+
+class GetRouteTargetQuery(Query):
+    route_target_id: UUID
+
+
+class ListRouteTargetsQuery(BaseListQuery):
+    tenant_id: UUID | None = None
+
+
+# --- VLANGroup ---
+
+
+class GetVLANGroupQuery(Query):
+    vlan_group_id: UUID
+
+
+class ListVLANGroupsQuery(BaseListQuery):
+    tenant_id: UUID | None = None
+
+
+# --- Service ---
+
+
+class GetServiceQuery(Query):
+    service_id: UUID
+
+
+class ListServicesQuery(BaseListQuery):
+    pass
+
+
+# --- Saved Filter ---
+
+
+class GetSavedFilterQuery(Query):
+    filter_id: UUID
+
+
+class ListSavedFiltersQuery(Query):
+    user_id: UUID
+    entity_type: str | None = None
+
+
+# --- Global Search ---
+
+
+class GlobalSearchQuery(Query):
+    q: str
+    entity_types: list[str] | None = None
+    offset: int = 0
+    limit: int = 20
diff --git a/services/ipam/src/ipam/application/query_handlers.py b/services/ipam/src/ipam/application/query_handlers.py
new file mode 100644
index 0000000..ce5cda6
--- /dev/null
+++ b/services/ipam/src/ipam/application/query_handlers.py
@@ -0,0 +1,687 @@
+from typing import Any
+
+from shared.api.filtering import FilterOperator, FilterParam
+from shared.api.sorting import SortParam
+from shared.cqrs.query import Query, QueryHandler
+from shared.domain.exceptions import EntityNotFoundError
+
+from ipam.application.dto import (
+    ASNDTO,
+    RIRDTO,
+    VLANDTO,
+    VRFDTO,
+    FHRPGroupDTO,
+    GlobalSearchResultDTO,
+    IPAddressDTO,
+    IPRangeDTO,
+    PrefixDTO,
+    RouteTargetDTO,
+    SavedFilterDTO,
+    SearchResultDTO,
+    ServiceDTO,
+    VLANGroupDTO,
+)
+from ipam.application.queries import BaseListQuery
+from ipam.application.read_model import (
+    ASNReadModelRepository,
+    FHRPGroupReadModelRepository,
+    GlobalSearchRepository,
+    IPAddressReadModelRepository,
+    IPRangeReadModelRepository,
+    PrefixReadModelRepository,
+    RIRReadModelRepository,
+    RouteTargetReadModelRepository,
+    SavedFilterRepository,
+    ServiceReadModelRepository,
+    VLANGroupReadModelRepository,
+    VLANReadModelRepository,
+    VRFReadModelRepository,
+)
+from ipam.domain.ip_address import IPAddress
+from ipam.domain.ip_range import IPRange
+from ipam.domain.prefix import Prefix
+from ipam.domain.services import (
+    AvailablePrefixService,
+    IPAvailabilityService,
+    IPRangeUtilizationService,
+    PrefixUtilizationService,
+)
+
+# ---------------------------------------------------------------------------
+# Common filter builder
+# ---------------------------------------------------------------------------
+
+
+def _build_common_filters(
+    query: BaseListQuery,
+) -> tuple[list[FilterParam], list[SortParam] | None, list[str] | None, dict[str, Any] | None]:
+    """Build common filters, sort params, tag_slugs, and custom_field_filters from BaseListQuery."""
+    filters: list[FilterParam] = []
+
+    if query.description_contains is not None:
+        filters.append(
+            FilterParam(field="description", operator=FilterOperator.ILIKE, value=query.description_contains)
+        )
+    if query.created_after is not None:
+        filters.append(
+            FilterParam(field="created_at", operator=FilterOperator.GTE, value=query.created_after.isoformat())
+        )
+    if query.created_before is not None:
+        filters.append(
+            FilterParam(field="created_at", operator=FilterOperator.LTE,
value=query.created_before.isoformat()) + ) + if query.updated_after is not None: + filters.append( + FilterParam(field="updated_at", operator=FilterOperator.GTE, value=query.updated_after.isoformat()) + ) + if query.updated_before is not None: + filters.append( + FilterParam(field="updated_at", operator=FilterOperator.LTE, value=query.updated_before.isoformat()) + ) + + sort_params: list[SortParam] | None = None + if query.sort_by is not None: + sort_params = [SortParam(field=query.sort_by, direction=query.sort_dir)] + + return filters, sort_params, query.tag_slugs, query.custom_field_filters + + +# --------------------------------------------------------------------------- +# Prefix +# --------------------------------------------------------------------------- + + +class GetPrefixHandler(QueryHandler[PrefixDTO]): + def __init__(self, read_model_repo: PrefixReadModelRepository) -> None: + self._repo = read_model_repo + + async def handle(self, query: Query) -> PrefixDTO: + data = await self._repo.find_by_id(query.prefix_id) + if data is None: + raise EntityNotFoundError(f"Prefix {query.prefix_id} not found") + return PrefixDTO(**data) + + +class ListPrefixesHandler(QueryHandler[tuple[list[PrefixDTO], int]]): + def __init__(self, read_model_repo: PrefixReadModelRepository) -> None: + self._repo = read_model_repo + + async def handle(self, query: Query) -> tuple[list[PrefixDTO], int]: + filters, sort_params, tag_slugs, custom_field_filters = _build_common_filters(query) + if query.vrf_id is not None: + filters.append(FilterParam(field="vrf_id", operator=FilterOperator.EQ, value=str(query.vrf_id))) + if query.status is not None: + filters.append(FilterParam(field="status", operator=FilterOperator.EQ, value=query.status)) + if query.tenant_id is not None: + filters.append(FilterParam(field="tenant_id", operator=FilterOperator.EQ, value=str(query.tenant_id))) + if query.role is not None: + filters.append(FilterParam(field="role", operator=FilterOperator.EQ, value=query.role)) + items, total = await self._repo.find_all( + offset=query.offset, + limit=query.limit, + filters=filters or None, + sort_params=sort_params, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + ) + return [PrefixDTO(**item) for item in items], total + + +class GetPrefixChildrenHandler(QueryHandler[list[PrefixDTO]]): + def __init__(self, read_model_repo: PrefixReadModelRepository) -> None: + self._repo = read_model_repo + + async def handle(self, query: Query) -> list[PrefixDTO]: + parent = await self._repo.find_by_id(query.prefix_id) + if parent is None: + raise EntityNotFoundError(f"Prefix {query.prefix_id} not found") + children = await self._repo.find_children(parent["network"], parent.get("vrf_id")) + return [PrefixDTO(**child) for child in children] + + +class GetPrefixUtilizationHandler(QueryHandler[float]): + def __init__( + self, + prefix_repo: PrefixReadModelRepository, + ip_repo: IPAddressReadModelRepository, + cache: object | None = None, + ) -> None: + self._prefix_repo = prefix_repo + self._ip_repo = ip_repo + self._service = PrefixUtilizationService() + self._cache = cache + + async def handle(self, query: Query) -> float: + if self._cache is not None: + cache_key = f"prefix_utilization:{query.prefix_id}" + cached = await self._cache.get_json(cache_key) + if cached is not None: + return cached + + data = await self._prefix_repo.find_by_id(query.prefix_id) + if data is None: + raise EntityNotFoundError(f"Prefix {query.prefix_id} not found") + prefix = _reconstruct_prefix(data) + 
children_data = await self._prefix_repo.find_children(data["network"], data.get("vrf_id")) + child_prefixes = [_reconstruct_prefix(c) for c in children_data] + ips_data = await self._ip_repo.find_by_prefix(data["network"], data.get("vrf_id")) + ip_addresses = [_reconstruct_ip(ip) for ip in ips_data] + result = self._service.calculate(prefix, child_prefixes, ip_addresses) + + if self._cache is not None: + await self._cache.set_json(cache_key, result) + + return result + + +class GetAvailablePrefixesHandler(QueryHandler[list[str]]): + def __init__(self, read_model_repo: PrefixReadModelRepository) -> None: + self._repo = read_model_repo + self._service = AvailablePrefixService() + + async def handle(self, query: Query) -> list[str]: + data = await self._repo.find_by_id(query.prefix_id) + if data is None: + raise EntityNotFoundError(f"Prefix {query.prefix_id} not found") + parent = _reconstruct_prefix(data) + children_data = await self._repo.find_children(data["network"], data.get("vrf_id")) + child_prefixes = [_reconstruct_prefix(c) for c in children_data] + return self._service.find_available(parent, child_prefixes, query.desired_prefix_length) + + +class GetAvailableIPsHandler(QueryHandler[list[str]]): + def __init__( + self, + prefix_repo: PrefixReadModelRepository, + ip_repo: IPAddressReadModelRepository, + ) -> None: + self._prefix_repo = prefix_repo + self._ip_repo = ip_repo + self._service = IPAvailabilityService() + + async def handle(self, query: Query) -> list[str]: + data = await self._prefix_repo.find_by_id(query.prefix_id) + if data is None: + raise EntityNotFoundError(f"Prefix {query.prefix_id} not found") + prefix = _reconstruct_prefix(data) + ips_data = await self._ip_repo.find_by_prefix(data["network"], data.get("vrf_id")) + used_addresses = [_reconstruct_ip(ip) for ip in ips_data] + return self._service.find_available(prefix, used_addresses, query.count) + + +# --------------------------------------------------------------------------- +# IPAddress +# --------------------------------------------------------------------------- + + +class GetIPAddressHandler(QueryHandler[IPAddressDTO]): + def __init__(self, read_model_repo: IPAddressReadModelRepository) -> None: + self._repo = read_model_repo + + async def handle(self, query: Query) -> IPAddressDTO: + data = await self._repo.find_by_id(query.ip_id) + if data is None: + raise EntityNotFoundError(f"IPAddress {query.ip_id} not found") + return IPAddressDTO(**data) + + +class ListIPAddressesHandler(QueryHandler[tuple[list[IPAddressDTO], int]]): + def __init__(self, read_model_repo: IPAddressReadModelRepository) -> None: + self._repo = read_model_repo + + async def handle(self, query: Query) -> tuple[list[IPAddressDTO], int]: + filters, sort_params, tag_slugs, custom_field_filters = _build_common_filters(query) + if query.vrf_id is not None: + filters.append(FilterParam(field="vrf_id", operator=FilterOperator.EQ, value=str(query.vrf_id))) + if query.status is not None: + filters.append(FilterParam(field="status", operator=FilterOperator.EQ, value=query.status)) + if query.tenant_id is not None: + filters.append(FilterParam(field="tenant_id", operator=FilterOperator.EQ, value=str(query.tenant_id))) + items, total = await self._repo.find_all( + offset=query.offset, + limit=query.limit, + filters=filters or None, + sort_params=sort_params, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + ) + return [IPAddressDTO(**item) for item in items], total + + +# 
--------------------------------------------------------------------------- +# VRF +# --------------------------------------------------------------------------- + + +class GetVRFHandler(QueryHandler[VRFDTO]): + def __init__(self, read_model_repo: VRFReadModelRepository) -> None: + self._repo = read_model_repo + + async def handle(self, query: Query) -> VRFDTO: + data = await self._repo.find_by_id(query.vrf_id) + if data is None: + raise EntityNotFoundError(f"VRF {query.vrf_id} not found") + return VRFDTO(**data) + + +class ListVRFsHandler(QueryHandler[tuple[list[VRFDTO], int]]): + def __init__(self, read_model_repo: VRFReadModelRepository) -> None: + self._repo = read_model_repo + + async def handle(self, query: Query) -> tuple[list[VRFDTO], int]: + filters, sort_params, tag_slugs, custom_field_filters = _build_common_filters(query) + if query.tenant_id is not None: + filters.append(FilterParam(field="tenant_id", operator=FilterOperator.EQ, value=str(query.tenant_id))) + items, total = await self._repo.find_all( + offset=query.offset, + limit=query.limit, + filters=filters or None, + sort_params=sort_params, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + ) + return [VRFDTO(**item) for item in items], total + + +# --------------------------------------------------------------------------- +# VLAN +# --------------------------------------------------------------------------- + + +class GetVLANHandler(QueryHandler[VLANDTO]): + def __init__(self, read_model_repo: VLANReadModelRepository) -> None: + self._repo = read_model_repo + + async def handle(self, query: Query) -> VLANDTO: + data = await self._repo.find_by_id(query.vlan_id) + if data is None: + raise EntityNotFoundError(f"VLAN {query.vlan_id} not found") + return VLANDTO(**data) + + +class ListVLANsHandler(QueryHandler[tuple[list[VLANDTO], int]]): + def __init__(self, read_model_repo: VLANReadModelRepository) -> None: + self._repo = read_model_repo + + async def handle(self, query: Query) -> tuple[list[VLANDTO], int]: + filters, sort_params, tag_slugs, custom_field_filters = _build_common_filters(query) + if query.group_id is not None: + filters.append(FilterParam(field="group_id", operator=FilterOperator.EQ, value=str(query.group_id))) + if query.status is not None: + filters.append(FilterParam(field="status", operator=FilterOperator.EQ, value=query.status)) + if query.tenant_id is not None: + filters.append(FilterParam(field="tenant_id", operator=FilterOperator.EQ, value=str(query.tenant_id))) + items, total = await self._repo.find_all( + offset=query.offset, + limit=query.limit, + filters=filters or None, + sort_params=sort_params, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + ) + return [VLANDTO(**item) for item in items], total + + +# --------------------------------------------------------------------------- +# IPRange +# --------------------------------------------------------------------------- + + +class GetIPRangeHandler(QueryHandler[IPRangeDTO]): + def __init__(self, read_model_repo: IPRangeReadModelRepository) -> None: + self._repo = read_model_repo + + async def handle(self, query: Query) -> IPRangeDTO: + data = await self._repo.find_by_id(query.range_id) + if data is None: + raise EntityNotFoundError(f"IPRange {query.range_id} not found") + return IPRangeDTO(**data) + + +class ListIPRangesHandler(QueryHandler[tuple[list[IPRangeDTO], int]]): + def __init__(self, read_model_repo: IPRangeReadModelRepository) -> None: + self._repo = read_model_repo + + async def handle(self, 
query: Query) -> tuple[list[IPRangeDTO], int]: + filters, sort_params, tag_slugs, custom_field_filters = _build_common_filters(query) + if query.vrf_id is not None: + filters.append(FilterParam(field="vrf_id", operator=FilterOperator.EQ, value=str(query.vrf_id))) + if query.status is not None: + filters.append(FilterParam(field="status", operator=FilterOperator.EQ, value=query.status)) + if query.tenant_id is not None: + filters.append(FilterParam(field="tenant_id", operator=FilterOperator.EQ, value=str(query.tenant_id))) + items, total = await self._repo.find_all( + offset=query.offset, + limit=query.limit, + filters=filters or None, + sort_params=sort_params, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + ) + return [IPRangeDTO(**item) for item in items], total + + +class GetIPRangeUtilizationHandler(QueryHandler[float]): + def __init__( + self, + range_repo: IPRangeReadModelRepository, + ip_repo: IPAddressReadModelRepository, + ) -> None: + self._range_repo = range_repo + self._ip_repo = ip_repo + self._service = IPRangeUtilizationService() + + async def handle(self, query: Query) -> float: + data = await self._range_repo.find_by_id(query.range_id) + if data is None: + raise EntityNotFoundError(f"IPRange {query.range_id} not found") + ip_range = _reconstruct_ip_range(data) + ips_data = await self._ip_repo.find_ips_in_range(data["start_address"], data["end_address"], data.get("vrf_id")) + used_addresses = [_reconstruct_ip(ip) for ip in ips_data] + return self._service.calculate(ip_range, used_addresses) + + +# --------------------------------------------------------------------------- +# RIR +# --------------------------------------------------------------------------- + + +class GetRIRHandler(QueryHandler[RIRDTO]): + def __init__(self, read_model_repo: RIRReadModelRepository) -> None: + self._repo = read_model_repo + + async def handle(self, query: Query) -> RIRDTO: + data = await self._repo.find_by_id(query.rir_id) + if data is None: + raise EntityNotFoundError(f"RIR {query.rir_id} not found") + return RIRDTO(**data) + + +class ListRIRsHandler(QueryHandler[tuple[list[RIRDTO], int]]): + def __init__(self, read_model_repo: RIRReadModelRepository) -> None: + self._repo = read_model_repo + + async def handle(self, query: Query) -> tuple[list[RIRDTO], int]: + filters, sort_params, tag_slugs, custom_field_filters = _build_common_filters(query) + items, total = await self._repo.find_all( + offset=query.offset, + limit=query.limit, + filters=filters or None, + sort_params=sort_params, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + ) + return [RIRDTO(**item) for item in items], total + + +# --------------------------------------------------------------------------- +# ASN +# --------------------------------------------------------------------------- + + +class GetASNHandler(QueryHandler[ASNDTO]): + def __init__(self, read_model_repo: ASNReadModelRepository) -> None: + self._repo = read_model_repo + + async def handle(self, query: Query) -> ASNDTO: + data = await self._repo.find_by_id(query.asn_id) + if data is None: + raise EntityNotFoundError(f"ASN {query.asn_id} not found") + return ASNDTO(**data) + + +class ListASNsHandler(QueryHandler[tuple[list[ASNDTO], int]]): + def __init__(self, read_model_repo: ASNReadModelRepository) -> None: + self._repo = read_model_repo + + async def handle(self, query: Query) -> tuple[list[ASNDTO], int]: + filters, sort_params, tag_slugs, custom_field_filters = _build_common_filters(query) + if query.rir_id is not 
None: + filters.append(FilterParam(field="rir_id", operator=FilterOperator.EQ, value=str(query.rir_id))) + if query.tenant_id is not None: + filters.append(FilterParam(field="tenant_id", operator=FilterOperator.EQ, value=str(query.tenant_id))) + items, total = await self._repo.find_all( + offset=query.offset, + limit=query.limit, + filters=filters or None, + sort_params=sort_params, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + ) + return [ASNDTO(**item) for item in items], total + + +# --------------------------------------------------------------------------- +# FHRPGroup +# --------------------------------------------------------------------------- + + +class GetFHRPGroupHandler(QueryHandler[FHRPGroupDTO]): + def __init__(self, read_model_repo: FHRPGroupReadModelRepository) -> None: + self._repo = read_model_repo + + async def handle(self, query: Query) -> FHRPGroupDTO: + data = await self._repo.find_by_id(query.fhrp_group_id) + if data is None: + raise EntityNotFoundError(f"FHRPGroup {query.fhrp_group_id} not found") + return FHRPGroupDTO(**data) + + +class ListFHRPGroupsHandler(QueryHandler[tuple[list[FHRPGroupDTO], int]]): + def __init__(self, read_model_repo: FHRPGroupReadModelRepository) -> None: + self._repo = read_model_repo + + async def handle(self, query: Query) -> tuple[list[FHRPGroupDTO], int]: + filters, sort_params, tag_slugs, custom_field_filters = _build_common_filters(query) + items, total = await self._repo.find_all( + offset=query.offset, + limit=query.limit, + filters=filters or None, + sort_params=sort_params, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + ) + return [FHRPGroupDTO(**item) for item in items], total + + +# --------------------------------------------------------------------------- +# RouteTarget +# --------------------------------------------------------------------------- + + +class GetRouteTargetHandler(QueryHandler[RouteTargetDTO]): + def __init__(self, read_model_repo: RouteTargetReadModelRepository) -> None: + self._repo = read_model_repo + + async def handle(self, query: Query) -> RouteTargetDTO: + data = await self._repo.find_by_id(query.route_target_id) + if data is None: + raise EntityNotFoundError(f"RouteTarget {query.route_target_id} not found") + return RouteTargetDTO(**data) + + +class ListRouteTargetsHandler(QueryHandler[tuple[list[RouteTargetDTO], int]]): + def __init__(self, read_model_repo: RouteTargetReadModelRepository) -> None: + self._repo = read_model_repo + + async def handle(self, query: Query) -> tuple[list[RouteTargetDTO], int]: + filters, sort_params, tag_slugs, custom_field_filters = _build_common_filters(query) + if query.tenant_id is not None: + filters.append(FilterParam(field="tenant_id", operator=FilterOperator.EQ, value=str(query.tenant_id))) + items, total = await self._repo.find_all( + offset=query.offset, + limit=query.limit, + filters=filters or None, + sort_params=sort_params, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + ) + return [RouteTargetDTO(**item) for item in items], total + + +# --------------------------------------------------------------------------- +# VLANGroup +# --------------------------------------------------------------------------- + + +class GetVLANGroupHandler(QueryHandler[VLANGroupDTO]): + def __init__(self, read_model_repo: VLANGroupReadModelRepository) -> None: + self._repo = read_model_repo + + async def handle(self, query: Query) -> VLANGroupDTO: + data = await self._repo.find_by_id(query.vlan_group_id) + if data 
is None: + raise EntityNotFoundError(f"VLANGroup {query.vlan_group_id} not found") + return VLANGroupDTO(**data) + + +class ListVLANGroupsHandler(QueryHandler[tuple[list[VLANGroupDTO], int]]): + def __init__(self, read_model_repo: VLANGroupReadModelRepository) -> None: + self._repo = read_model_repo + + async def handle(self, query: Query) -> tuple[list[VLANGroupDTO], int]: + filters, sort_params, tag_slugs, custom_field_filters = _build_common_filters(query) + if query.tenant_id is not None: + filters.append(FilterParam(field="tenant_id", operator=FilterOperator.EQ, value=str(query.tenant_id))) + items, total = await self._repo.find_all( + offset=query.offset, + limit=query.limit, + filters=filters or None, + sort_params=sort_params, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + ) + return [VLANGroupDTO(**item) for item in items], total + + +# --------------------------------------------------------------------------- +# Service +# --------------------------------------------------------------------------- + + +class GetServiceHandler(QueryHandler[ServiceDTO]): + def __init__(self, read_model_repo: ServiceReadModelRepository) -> None: + self._repo = read_model_repo + + async def handle(self, query: Query) -> ServiceDTO: + data = await self._repo.find_by_id(query.service_id) + if data is None: + raise EntityNotFoundError(f"Service {query.service_id} not found") + return ServiceDTO(**data) + + +class ListServicesHandler(QueryHandler[tuple[list[ServiceDTO], int]]): + def __init__(self, read_model_repo: ServiceReadModelRepository) -> None: + self._repo = read_model_repo + + async def handle(self, query: Query) -> tuple[list[ServiceDTO], int]: + filters, sort_params, tag_slugs, custom_field_filters = _build_common_filters(query) + items, total = await self._repo.find_all( + offset=query.offset, + limit=query.limit, + filters=filters or None, + sort_params=sort_params, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + ) + return [ServiceDTO(**item) for item in items], total + + +# --------------------------------------------------------------------------- +# Saved Filter +# --------------------------------------------------------------------------- + + +class GetSavedFilterHandler(QueryHandler[SavedFilterDTO]): + def __init__(self, repo: SavedFilterRepository) -> None: + self._repo = repo + + async def handle(self, query: Query) -> SavedFilterDTO: + data = await self._repo.find_by_id(query.filter_id) + if data is None: + raise EntityNotFoundError(f"SavedFilter {query.filter_id} not found") + return SavedFilterDTO(**data) + + +class ListSavedFiltersHandler(QueryHandler[list[SavedFilterDTO]]): + def __init__(self, repo: SavedFilterRepository) -> None: + self._repo = repo + + async def handle(self, query: Query) -> list[SavedFilterDTO]: + items = await self._repo.find_by_user(query.user_id, query.entity_type) + return [SavedFilterDTO(**item) for item in items] + + +# --------------------------------------------------------------------------- +# Global Search +# --------------------------------------------------------------------------- + + +class GlobalSearchHandler(QueryHandler[GlobalSearchResultDTO]): + def __init__(self, search_repo: GlobalSearchRepository) -> None: + self._repo = search_repo + + async def handle(self, query: Query) -> GlobalSearchResultDTO: + results, total = await self._repo.search(query.q, query.entity_types, query.offset, query.limit) + return GlobalSearchResultDTO( + results=[SearchResultDTO(**r) for r in results], + total=total, 
+ ) + + +# --------------------------------------------------------------------------- +# Helpers +# --------------------------------------------------------------------------- + + +def _reconstruct_prefix(data: dict) -> Prefix: + """Reconstruct a Prefix domain object from read model data for domain service use.""" + from uuid import UUID + + from ipam.domain.value_objects import PrefixNetwork, PrefixStatus + + prefix = Prefix(aggregate_id=UUID(str(data["id"]))) + prefix.network = PrefixNetwork(network=data["network"]) if data.get("network") else None + prefix.vrf_id = UUID(str(data["vrf_id"])) if data.get("vrf_id") else None + prefix.vlan_id = UUID(str(data["vlan_id"])) if data.get("vlan_id") else None + prefix.status = PrefixStatus(data["status"]) + prefix.role = data.get("role") + prefix.tenant_id = UUID(str(data["tenant_id"])) if data.get("tenant_id") else None + prefix.description = data.get("description", "") + prefix.custom_fields = data.get("custom_fields", {}) + prefix.tags = [UUID(str(t)) for t in data.get("tags", [])] + return prefix + + +def _reconstruct_ip_range(data: dict) -> IPRange: + """Reconstruct an IPRange domain object from read model data for domain service use.""" + from uuid import UUID + + from ipam.domain.value_objects import IPAddressValue, IPRangeStatus + + ip_range = IPRange(aggregate_id=UUID(str(data["id"]))) + ip_range.start_address = IPAddressValue(address=data["start_address"]) if data.get("start_address") else None + ip_range.end_address = IPAddressValue(address=data["end_address"]) if data.get("end_address") else None + ip_range.vrf_id = UUID(str(data["vrf_id"])) if data.get("vrf_id") else None + ip_range.status = IPRangeStatus(data["status"]) + ip_range.tenant_id = UUID(str(data["tenant_id"])) if data.get("tenant_id") else None + ip_range.description = data.get("description", "") + ip_range.custom_fields = data.get("custom_fields", {}) + ip_range.tags = [UUID(str(t)) for t in data.get("tags", [])] + return ip_range + + +def _reconstruct_ip(data: dict) -> IPAddress: + """Reconstruct an IPAddress domain object from read model data for domain service use.""" + from uuid import UUID + + from ipam.domain.value_objects import IPAddressStatus, IPAddressValue + + ip = IPAddress(aggregate_id=UUID(str(data["id"]))) + ip.address = IPAddressValue(address=data["address"]) if data.get("address") else None + ip.vrf_id = UUID(str(data["vrf_id"])) if data.get("vrf_id") else None + ip.status = IPAddressStatus(data["status"]) + ip.dns_name = data.get("dns_name", "") + ip.tenant_id = UUID(str(data["tenant_id"])) if data.get("tenant_id") else None + ip.description = data.get("description", "") + ip.custom_fields = data.get("custom_fields", {}) + ip.tags = [UUID(str(t)) for t in data.get("tags", [])] + return ip diff --git a/services/ipam/src/ipam/application/read_model.py b/services/ipam/src/ipam/application/read_model.py new file mode 100644 index 0000000..bb46df4 --- /dev/null +++ b/services/ipam/src/ipam/application/read_model.py @@ -0,0 +1,121 @@ +from abc import ABC, abstractmethod +from typing import Any +from uuid import UUID + +from shared.api.filtering import FilterParam +from shared.api.sorting import SortParam + + +class ReadModelRepository(ABC): + @abstractmethod + async def upsert_from_aggregate(self, aggregate: Any) -> None: ... + + @abstractmethod + async def find_by_id(self, entity_id: UUID) -> dict | None: ...
+ + @abstractmethod + async def find_all( + self, + *, + offset: int = 0, + limit: int = 50, + filters: list[FilterParam] | None = None, + sort_params: list[SortParam] | None = None, + tag_slugs: list[str] | None = None, + custom_field_filters: dict[str, str] | None = None, + ) -> tuple[list[dict], int]: ... + + @abstractmethod + async def mark_deleted(self, entity_id: UUID) -> None: ... + + +class PrefixReadModelRepository(ReadModelRepository): + @abstractmethod + async def find_children(self, parent_network: str, vrf_id: UUID | None) -> list[dict]: ... + + @abstractmethod + async def find_by_vrf(self, vrf_id: UUID, *, offset: int = 0, limit: int = 50) -> tuple[list[dict], int]: ... + + +class IPAddressReadModelRepository(ReadModelRepository): + @abstractmethod + async def exists_in_vrf(self, address: str, vrf_id: UUID | None) -> bool: ... + + @abstractmethod + async def find_by_prefix(self, network: str, vrf_id: UUID | None) -> list[dict]: ... + + @abstractmethod + async def find_ips_in_range(self, start_address: str, end_address: str, vrf_id: UUID | None) -> list[dict]: ... + + +class VRFReadModelRepository(ReadModelRepository): + @abstractmethod + async def find_by_name(self, name: str) -> dict | None: ... + + +class VLANReadModelRepository(ReadModelRepository): + @abstractmethod + async def find_by_vid(self, vid: int, group_id: UUID | None) -> dict | None: ... + + +class IPRangeReadModelRepository(ReadModelRepository): + pass + + +class RIRReadModelRepository(ReadModelRepository): + @abstractmethod + async def find_by_name(self, name: str) -> dict | None: ... + + +class ASNReadModelRepository(ReadModelRepository): + @abstractmethod + async def find_by_asn(self, asn: int) -> dict | None: ... + + +class FHRPGroupReadModelRepository(ReadModelRepository): + pass + + +class RouteTargetReadModelRepository(ReadModelRepository): + @abstractmethod + async def find_by_name(self, name: str) -> dict | None: ... + + +class VLANGroupReadModelRepository(ReadModelRepository): + @abstractmethod + async def find_by_slug(self, slug: str) -> dict | None: ... + + +class ServiceReadModelRepository(ReadModelRepository): + pass + + +class SavedFilterRepository(ABC): + @abstractmethod + async def find_by_id(self, filter_id: UUID) -> dict | None: ... + + @abstractmethod + async def find_by_user(self, user_id: UUID, entity_type: str | None = None) -> list[dict]: ... + + @abstractmethod + async def create(self, data: dict) -> UUID: ... + + @abstractmethod + async def update(self, filter_id: UUID, data: dict) -> None: ... + + @abstractmethod + async def delete(self, filter_id: UUID) -> None: ... + + @abstractmethod + async def clear_default(self, user_id: UUID, entity_type: str) -> None: ... + + +class GlobalSearchRepository(ABC): + @abstractmethod + async def search( + self, + query: str, + entity_types: list[str] | None = None, + offset: int = 0, + limit: int = 20, + ) -> tuple[list[dict], int]: ... 
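These read-model ABCs are the only seam the query handlers earlier in this patch depend on, so a dict-backed fake is enough to unit-test a handler without a database. A minimal sketch under stated assumptions — the in-memory class, its equality-only filtering, and the skipped sort/tag/custom-field handling are illustrative and not part of this patch:

```python
from typing import Any
from uuid import UUID

from shared.api.filtering import FilterParam
from shared.api.sorting import SortParam

from ipam.application.read_model import VRFReadModelRepository


class InMemoryVRFReadModelRepository(VRFReadModelRepository):
    """Dict-backed stand-in for handler unit tests (assumed helper, not in this patch)."""

    def __init__(self) -> None:
        self._rows: dict[UUID, dict] = {}

    async def upsert_from_aggregate(self, aggregate: Any) -> None:
        # Project the aggregate's snapshot into the flat dict shape handlers expect.
        self._rows[aggregate.id] = {"id": str(aggregate.id), **aggregate.to_snapshot()}

    async def find_by_id(self, entity_id: UUID) -> dict | None:
        return self._rows.get(entity_id)

    async def find_all(
        self,
        *,
        offset: int = 0,
        limit: int = 50,
        filters: list[FilterParam] | None = None,
        sort_params: list[SortParam] | None = None,
        tag_slugs: list[str] | None = None,
        custom_field_filters: dict[str, str] | None = None,
    ) -> tuple[list[dict], int]:
        rows = list(self._rows.values())
        for f in filters or []:
            # Equality only; a real backend would translate FilterOperator to SQL.
            rows = [r for r in rows if str(r.get(f.field)) == str(f.value)]
        return rows[offset : offset + limit], len(rows)

    async def mark_deleted(self, entity_id: UUID) -> None:
        self._rows.pop(entity_id, None)

    async def find_by_name(self, name: str) -> dict | None:
        return next((r for r in self._rows.values() if r.get("name") == name), None)
```

With this in place, `GetVRFHandler(InMemoryVRFReadModelRepository())` exercises the full handler path, including the `EntityNotFoundError` branch, with no infrastructure.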
diff --git a/services/ipam/src/ipam/domain/__init__.py b/services/ipam/src/ipam/domain/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/services/ipam/src/ipam/domain/asn.py b/services/ipam/src/ipam/domain/asn.py new file mode 100644 index 0000000..98c1b38 --- /dev/null +++ b/services/ipam/src/ipam/domain/asn.py @@ -0,0 +1,129 @@ +from __future__ import annotations + +from typing import Any, Self +from uuid import UUID + +from shared.domain.exceptions import BusinessRuleViolationError +from shared.event.aggregate import AggregateRoot + +from ipam.domain.events import ASNCreated, ASNDeleted, ASNUpdated +from ipam.domain.value_objects import ASNumber + + +class ASN(AggregateRoot): + def __init__(self, aggregate_id: UUID | None = None) -> None: + super().__init__(aggregate_id) + self.asn: ASNumber | None = None + self.rir_id: UUID | None = None + self.tenant_id: UUID | None = None + self.description: str = "" + self.custom_fields: dict = {} + self.tags: list[UUID] = [] + self._deleted: bool = False + + @classmethod + def create( + cls, + *, + asn: int, + rir_id: UUID | None = None, + tenant_id: UUID | None = None, + description: str = "", + custom_fields: dict | None = None, + tags: list[UUID] | None = None, + ) -> ASN: + asn_vo = ASNumber(asn=asn) + aggregate = cls() + aggregate.apply_event( + ASNCreated( + aggregate_id=aggregate.id, + version=aggregate._next_version(), + asn=asn_vo.asn, + rir_id=rir_id, + tenant_id=tenant_id, + description=description, + custom_fields=custom_fields or {}, + tags=tags or [], + ) + ) + return aggregate + + def update( + self, + *, + description: str | None = None, + tenant_id: UUID | None = None, + custom_fields: dict | None = None, + tags: list[UUID] | None = None, + ) -> None: + if self._deleted: + raise BusinessRuleViolationError("Cannot update a deleted ASN") + self.apply_event( + ASNUpdated( + aggregate_id=self.id, + version=self._next_version(), + description=description, + tenant_id=tenant_id, + custom_fields=custom_fields, + tags=tags, + ) + ) + + def delete(self) -> None: + if self._deleted: + raise BusinessRuleViolationError("ASN is already deleted") + self.apply_event( + ASNDeleted( + aggregate_id=self.id, + version=self._next_version(), + ) + ) + + # --- Event Handlers --- + + def _apply_ASNCreated(self, event: ASNCreated) -> None: # noqa: N802 + self.asn = ASNumber(asn=event.asn) + self.rir_id = event.rir_id + self.tenant_id = event.tenant_id + self.description = event.description + self.custom_fields = event.custom_fields + self.tags = list(event.tags) + + def _apply_ASNUpdated(self, event: ASNUpdated) -> None: # noqa: N802 + if event.description is not None: + self.description = event.description + if event.tenant_id is not None: + self.tenant_id = event.tenant_id + if event.custom_fields is not None: + self.custom_fields = event.custom_fields + if event.tags is not None: + self.tags = list(event.tags) + + def _apply_ASNDeleted(self, event: ASNDeleted) -> None: # noqa: N802 + self._deleted = True + + # --- Snapshot --- + + def to_snapshot(self) -> dict[str, Any]: + return { + "asn": self.asn.asn if self.asn else None, + "rir_id": str(self.rir_id) if self.rir_id else None, + "tenant_id": str(self.tenant_id) if self.tenant_id else None, + "description": self.description, + "custom_fields": self.custom_fields, + "tags": [str(t) for t in self.tags], + "deleted": self._deleted, + } + + @classmethod + def from_snapshot(cls, aggregate_id: UUID, state: dict[str, Any], version: int) -> Self: + aggregate = 
cls(aggregate_id=aggregate_id) + aggregate.version = version + aggregate.asn = ASNumber(asn=state["asn"]) if state.get("asn") is not None else None + aggregate.rir_id = UUID(state["rir_id"]) if state.get("rir_id") else None + aggregate.tenant_id = UUID(state["tenant_id"]) if state.get("tenant_id") else None + aggregate.description = state.get("description", "") + aggregate.custom_fields = state.get("custom_fields", {}) + aggregate.tags = [UUID(t) for t in state.get("tags", [])] + aggregate._deleted = state.get("deleted", False) + return aggregate diff --git a/services/ipam/src/ipam/domain/events.py b/services/ipam/src/ipam/domain/events.py new file mode 100644 index 0000000..a8762ea --- /dev/null +++ b/services/ipam/src/ipam/domain/events.py @@ -0,0 +1,302 @@ +from uuid import UUID + +from shared.event.domain_event import DomainEvent + +# Prefix Events + + +class PrefixCreated(DomainEvent): + network: str + vrf_id: UUID | None = None + vlan_id: UUID | None = None + status: str = "active" + role: str | None = None + tenant_id: UUID | None = None + description: str = "" + custom_fields: dict = {} + tags: list[UUID] = [] + + +class PrefixUpdated(DomainEvent): + description: str | None = None + role: str | None = None + tenant_id: UUID | None = None + vlan_id: UUID | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class PrefixDeleted(DomainEvent): + pass + + +class PrefixStatusChanged(DomainEvent): + old_status: str + new_status: str + + +# IPAddress Events + + +class IPAddressCreated(DomainEvent): + address: str + vrf_id: UUID | None = None + status: str = "active" + dns_name: str = "" + tenant_id: UUID | None = None + description: str = "" + custom_fields: dict = {} + tags: list[UUID] = [] + + +class IPAddressUpdated(DomainEvent): + dns_name: str | None = None + description: str | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class IPAddressDeleted(DomainEvent): + pass + + +class IPAddressStatusChanged(DomainEvent): + old_status: str + new_status: str + + +# VRF Events + + +class VRFCreated(DomainEvent): + name: str + rd: str | None = None + import_targets: list[UUID] = [] + export_targets: list[UUID] = [] + tenant_id: UUID | None = None + description: str = "" + custom_fields: dict = {} + tags: list[UUID] = [] + + +class VRFUpdated(DomainEvent): + name: str | None = None + import_targets: list[UUID] | None = None + export_targets: list[UUID] | None = None + description: str | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class VRFDeleted(DomainEvent): + pass + + +# VLAN Events + + +class VLANCreated(DomainEvent): + vid: int + name: str + group_id: UUID | None = None + status: str = "active" + role: str | None = None + tenant_id: UUID | None = None + description: str = "" + custom_fields: dict = {} + tags: list[UUID] = [] + + +class VLANUpdated(DomainEvent): + name: str | None = None + role: str | None = None + description: str | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class VLANDeleted(DomainEvent): + pass + + +class VLANStatusChanged(DomainEvent): + old_status: str + new_status: str + + +# IPRange Events + + +class IPRangeCreated(DomainEvent): + start_address: str + end_address: str + vrf_id: UUID | None = None + status: str = "active" + tenant_id: UUID | None = None + description: str = "" + custom_fields: dict = {} + tags: list[UUID] = [] + + +class IPRangeUpdated(DomainEvent): + description: str | None = None + tenant_id: 
UUID | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class IPRangeDeleted(DomainEvent): + pass + + +class IPRangeStatusChanged(DomainEvent): + old_status: str + new_status: str + + +# RIR Events + + +class RIRCreated(DomainEvent): + name: str + is_private: bool = False + description: str = "" + custom_fields: dict = {} + tags: list[UUID] = [] + + +class RIRUpdated(DomainEvent): + description: str | None = None + is_private: bool | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class RIRDeleted(DomainEvent): + pass + + +# ASN Events + + +class ASNCreated(DomainEvent): + asn: int + rir_id: UUID | None = None + tenant_id: UUID | None = None + description: str = "" + custom_fields: dict = {} + tags: list[UUID] = [] + + +class ASNUpdated(DomainEvent): + description: str | None = None + tenant_id: UUID | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class ASNDeleted(DomainEvent): + pass + + +# FHRPGroup Events + + +class FHRPGroupCreated(DomainEvent): + protocol: str + group_id_value: int + auth_type: str = "plaintext" + auth_key: str = "" + name: str = "" + description: str = "" + custom_fields: dict = {} + tags: list[UUID] = [] + + +class FHRPGroupUpdated(DomainEvent): + name: str | None = None + auth_type: str | None = None + auth_key: str | None = None + description: str | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class FHRPGroupDeleted(DomainEvent): + pass + + +# RouteTarget Events + + +class RouteTargetCreated(DomainEvent): + name: str + tenant_id: UUID | None = None + description: str = "" + custom_fields: dict = {} + tags: list[UUID] = [] + + +class RouteTargetUpdated(DomainEvent): + description: str | None = None + tenant_id: UUID | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class RouteTargetDeleted(DomainEvent): + pass + + +# VLANGroup Events + + +class VLANGroupCreated(DomainEvent): + name: str + slug: str + min_vid: int = 1 + max_vid: int = 4094 + tenant_id: UUID | None = None + description: str = "" + custom_fields: dict = {} + tags: list[UUID] = [] + + +class VLANGroupUpdated(DomainEvent): + name: str | None = None + description: str | None = None + min_vid: int | None = None + max_vid: int | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class VLANGroupDeleted(DomainEvent): + pass + + +# Service Events + + +class ServiceCreated(DomainEvent): + name: str + protocol: str = "tcp" + ports: list[int] = [] + ip_addresses: list[UUID] = [] + description: str = "" + custom_fields: dict = {} + tags: list[UUID] = [] + + +class ServiceUpdated(DomainEvent): + name: str | None = None + protocol: str | None = None + ports: list[int] | None = None + ip_addresses: list[UUID] | None = None + description: str | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class ServiceDeleted(DomainEvent): + pass diff --git a/services/ipam/src/ipam/domain/fhrp_group.py b/services/ipam/src/ipam/domain/fhrp_group.py new file mode 100644 index 0000000..047c777 --- /dev/null +++ b/services/ipam/src/ipam/domain/fhrp_group.py @@ -0,0 +1,148 @@ +from __future__ import annotations + +from typing import Any, Self +from uuid import UUID + +from shared.domain.exceptions import BusinessRuleViolationError +from shared.event.aggregate import AggregateRoot + +from ipam.domain.events import FHRPGroupCreated, FHRPGroupDeleted, FHRPGroupUpdated +from 
ipam.domain.value_objects import FHRPAuthType, FHRPProtocol + + +class FHRPGroup(AggregateRoot): + def __init__(self, aggregate_id: UUID | None = None) -> None: + super().__init__(aggregate_id) + self.protocol: FHRPProtocol | None = None + self.group_id_value: int = 0 + self.auth_type: FHRPAuthType = FHRPAuthType.PLAINTEXT + self.auth_key: str = "" + self.name: str = "" + self.description: str = "" + self.custom_fields: dict = {} + self.tags: list[UUID] = [] + self._deleted: bool = False + + @classmethod + def create( + cls, + *, + protocol: FHRPProtocol, + group_id_value: int, + auth_type: FHRPAuthType = FHRPAuthType.PLAINTEXT, + auth_key: str = "", + name: str = "", + description: str = "", + custom_fields: dict | None = None, + tags: list[UUID] | None = None, + ) -> FHRPGroup: + group = cls() + group.apply_event( + FHRPGroupCreated( + aggregate_id=group.id, + version=group._next_version(), + protocol=protocol.value, + group_id_value=group_id_value, + auth_type=auth_type.value, + auth_key=auth_key, + name=name, + description=description, + custom_fields=custom_fields or {}, + tags=tags or [], + ) + ) + return group + + def update( + self, + *, + name: str | None = None, + auth_type: str | None = None, + auth_key: str | None = None, + description: str | None = None, + custom_fields: dict | None = None, + tags: list[UUID] | None = None, + ) -> None: + if self._deleted: + raise BusinessRuleViolationError("Cannot update a deleted FHRP group") + self.apply_event( + FHRPGroupUpdated( + aggregate_id=self.id, + version=self._next_version(), + name=name, + auth_type=auth_type, + auth_key=auth_key, + description=description, + custom_fields=custom_fields, + tags=tags, + ) + ) + + def delete(self) -> None: + if self._deleted: + raise BusinessRuleViolationError("FHRP group is already deleted") + self.apply_event( + FHRPGroupDeleted( + aggregate_id=self.id, + version=self._next_version(), + ) + ) + + # --- Event Handlers --- + + def _apply_FHRPGroupCreated(self, event: FHRPGroupCreated) -> None: # noqa: N802 + self.protocol = FHRPProtocol(event.protocol) + self.group_id_value = event.group_id_value + self.auth_type = FHRPAuthType(event.auth_type) + self.auth_key = event.auth_key + self.name = event.name + self.description = event.description + self.custom_fields = event.custom_fields + self.tags = list(event.tags) + + def _apply_FHRPGroupUpdated(self, event: FHRPGroupUpdated) -> None: # noqa: N802 + if event.name is not None: + self.name = event.name + if event.auth_type is not None: + self.auth_type = FHRPAuthType(event.auth_type) + if event.auth_key is not None: + self.auth_key = event.auth_key + if event.description is not None: + self.description = event.description + if event.custom_fields is not None: + self.custom_fields = event.custom_fields + if event.tags is not None: + self.tags = list(event.tags) + + def _apply_FHRPGroupDeleted(self, event: FHRPGroupDeleted) -> None: # noqa: N802 + self._deleted = True + + # --- Snapshot --- + + def to_snapshot(self) -> dict[str, Any]: + return { + "protocol": self.protocol.value if self.protocol else None, + "group_id_value": self.group_id_value, + "auth_type": self.auth_type.value, + "auth_key": self.auth_key, + "name": self.name, + "description": self.description, + "custom_fields": self.custom_fields, + "tags": [str(t) for t in self.tags], + "deleted": self._deleted, + } + + @classmethod + def from_snapshot(cls, aggregate_id: UUID, state: dict[str, Any], version: int) -> Self: + group = cls(aggregate_id=aggregate_id) + group.version = version + 
group.protocol = FHRPProtocol(state["protocol"]) if state.get("protocol") else None + group.group_id_value = state.get("group_id_value", 0) + group.auth_type = FHRPAuthType(state["auth_type"]) if state.get("auth_type") else FHRPAuthType.PLAINTEXT + group.auth_key = state.get("auth_key", "") + group.name = state.get("name", "") + group.description = state.get("description", "") + group.custom_fields = state.get("custom_fields", {}) + group.tags = [UUID(t) for t in state.get("tags", [])] + group._deleted = state.get("deleted", False) + return group diff --git a/services/ipam/src/ipam/domain/ip_address.py b/services/ipam/src/ipam/domain/ip_address.py new file mode 100644 index 0000000..7d95fef --- /dev/null +++ b/services/ipam/src/ipam/domain/ip_address.py @@ -0,0 +1,162 @@ +from __future__ import annotations + +from typing import Any, Self +from uuid import UUID + +from shared.domain.exceptions import BusinessRuleViolationError +from shared.event.aggregate import AggregateRoot + +from ipam.domain.events import ( + IPAddressCreated, + IPAddressDeleted, + IPAddressStatusChanged, + IPAddressUpdated, +) +from ipam.domain.value_objects import IPAddressStatus, IPAddressValue + + +class IPAddress(AggregateRoot): + def __init__(self, aggregate_id: UUID | None = None) -> None: + super().__init__(aggregate_id) + self.address: IPAddressValue | None = None + self.vrf_id: UUID | None = None + self.status: IPAddressStatus = IPAddressStatus.ACTIVE + self.dns_name: str = "" + self.tenant_id: UUID | None = None + self.description: str = "" + self.custom_fields: dict = {} + self.tags: list[UUID] = [] + self._deleted: bool = False + + @classmethod + def create( + cls, + *, + address: str, + vrf_id: UUID | None = None, + status: IPAddressStatus = IPAddressStatus.ACTIVE, + dns_name: str = "", + tenant_id: UUID | None = None, + description: str = "", + custom_fields: dict | None = None, + tags: list[UUID] | None = None, + ) -> IPAddress: + ip = cls() + ip.apply_event( + IPAddressCreated( + aggregate_id=ip.id, + version=ip._next_version(), + address=str(IPAddressValue(address=address).address), + vrf_id=vrf_id, + status=status.value, + dns_name=dns_name, + tenant_id=tenant_id, + description=description, + custom_fields=custom_fields or {}, + tags=tags or [], + ) + ) + return ip + + def update( + self, + *, + dns_name: str | None = None, + description: str | None = None, + custom_fields: dict | None = None, + tags: list[UUID] | None = None, + ) -> None: + if self._deleted: + raise BusinessRuleViolationError("Cannot update a deleted IP address") + self.apply_event( + IPAddressUpdated( + aggregate_id=self.id, + version=self._next_version(), + dns_name=dns_name, + description=description, + custom_fields=custom_fields, + tags=tags, + ) + ) + + def change_status(self, new_status: IPAddressStatus) -> None: + if self._deleted: + raise BusinessRuleViolationError("Cannot change status of a deleted IP address") + if self.status == new_status: + raise BusinessRuleViolationError(f"IP address is already {new_status.value}") + self.apply_event( + IPAddressStatusChanged( + aggregate_id=self.id, + version=self._next_version(), + old_status=self.status.value, + new_status=new_status.value, + ) + ) + + def delete(self) -> None: + if self._deleted: + raise BusinessRuleViolationError("IP address is already deleted") + self.apply_event( + IPAddressDeleted( + aggregate_id=self.id, + version=self._next_version(), + ) + ) + + # --- Event Handlers --- + + def _apply_IPAddressCreated(self, event: IPAddressCreated) -> None: # noqa: N802 + 
self.address = IPAddressValue(address=event.address) + self.vrf_id = event.vrf_id + self.status = IPAddressStatus(event.status) + self.dns_name = event.dns_name + self.tenant_id = event.tenant_id + self.description = event.description + self.custom_fields = event.custom_fields + self.tags = list(event.tags) + + def _apply_IPAddressUpdated(self, event: IPAddressUpdated) -> None: # noqa: N802 + if event.dns_name is not None: + self.dns_name = event.dns_name + if event.description is not None: + self.description = event.description + if event.custom_fields is not None: + self.custom_fields = event.custom_fields + if event.tags is not None: + self.tags = list(event.tags) + + def _apply_IPAddressStatusChanged(self, event: IPAddressStatusChanged) -> None: # noqa: N802 + self.status = IPAddressStatus(event.new_status) + + def _apply_IPAddressDeleted(self, event: IPAddressDeleted) -> None: # noqa: N802 + self._deleted = True + + # --- Snapshot --- + + def to_snapshot(self) -> dict[str, Any]: + return { + "address": self.address.address if self.address else None, + "vrf_id": str(self.vrf_id) if self.vrf_id else None, + "status": self.status.value, + "dns_name": self.dns_name, + "tenant_id": str(self.tenant_id) if self.tenant_id else None, + "description": self.description, + "custom_fields": self.custom_fields, + "tags": [str(t) for t in self.tags], + "deleted": self._deleted, + } + + @classmethod + def from_snapshot(cls, aggregate_id: UUID, state: dict[str, Any], version: int) -> Self: + ip = cls(aggregate_id=aggregate_id) + ip.version = version + ip.address = IPAddressValue(address=state["address"]) if state.get("address") else None + ip.vrf_id = UUID(state["vrf_id"]) if state.get("vrf_id") else None + ip.status = IPAddressStatus(state["status"]) + ip.dns_name = state.get("dns_name", "") + ip.tenant_id = UUID(state["tenant_id"]) if state.get("tenant_id") else None + ip.description = state.get("description", "") + ip.custom_fields = state.get("custom_fields", {}) + ip.tags = [UUID(t) for t in state.get("tags", [])] + ip._deleted = state.get("deleted", False) + return ip diff --git a/services/ipam/src/ipam/domain/ip_range.py b/services/ipam/src/ipam/domain/ip_range.py new file mode 100644 index 0000000..0f8d2cd --- /dev/null +++ b/services/ipam/src/ipam/domain/ip_range.py @@ -0,0 +1,163 @@ +from __future__ import annotations + +from typing import Any, Self +from uuid import UUID + +from shared.domain.exceptions import BusinessRuleViolationError +from shared.event.aggregate import AggregateRoot + +from ipam.domain.events import IPRangeCreated, IPRangeDeleted, IPRangeStatusChanged, IPRangeUpdated +from ipam.domain.value_objects import IPAddressValue, IPRangeStatus + + +class IPRange(AggregateRoot): + def __init__(self, aggregate_id: UUID | None = None) -> None: + super().__init__(aggregate_id) + self.start_address: IPAddressValue | None = None + self.end_address: IPAddressValue | None = None + self.vrf_id: UUID | None = None + self.status: IPRangeStatus = IPRangeStatus.ACTIVE + self.tenant_id: UUID | None = None + self.description: str = "" + self.custom_fields: dict = {} + self.tags: list[UUID] = [] + self._deleted: bool = False + + @classmethod + def create( + cls, + *, + start_address: str, + end_address: str, + vrf_id: UUID | None = None, + status: IPRangeStatus = IPRangeStatus.ACTIVE, + tenant_id: UUID | None = None, + description: str = "", + custom_fields: dict | None = None, + tags: list[UUID] | None = None, + ) -> IPRange: + start = IPAddressValue(address=start_address) + end = 
IPAddressValue(address=end_address) + if start.version != end.version: + raise BusinessRuleViolationError("Start and end addresses must be the same IP version") + if start.ip_address >= end.ip_address: + raise BusinessRuleViolationError("Start address must be less than end address") + ip_range = cls() + ip_range.apply_event( + IPRangeCreated( + aggregate_id=ip_range.id, + version=ip_range._next_version(), + start_address=start.address, + end_address=end.address, + vrf_id=vrf_id, + status=status.value, + tenant_id=tenant_id, + description=description, + custom_fields=custom_fields or {}, + tags=tags or [], + ) + ) + return ip_range + + def update( + self, + *, + description: str | None = None, + tenant_id: UUID | None = None, + custom_fields: dict | None = None, + tags: list[UUID] | None = None, + ) -> None: + if self._deleted: + raise BusinessRuleViolationError("Cannot update a deleted IP range") + self.apply_event( + IPRangeUpdated( + aggregate_id=self.id, + version=self._next_version(), + description=description, + tenant_id=tenant_id, + custom_fields=custom_fields, + tags=tags, + ) + ) + + def change_status(self, new_status: IPRangeStatus) -> None: + if self._deleted: + raise BusinessRuleViolationError("Cannot change status of a deleted IP range") + if self.status == new_status: + raise BusinessRuleViolationError(f"IP range is already {new_status.value}") + self.apply_event( + IPRangeStatusChanged( + aggregate_id=self.id, + version=self._next_version(), + old_status=self.status.value, + new_status=new_status.value, + ) + ) + + def delete(self) -> None: + if self._deleted: + raise BusinessRuleViolationError("IP range is already deleted") + self.apply_event( + IPRangeDeleted( + aggregate_id=self.id, + version=self._next_version(), + ) + ) + + # --- Event Handlers --- + + def _apply_IPRangeCreated(self, event: IPRangeCreated) -> None: # noqa: N802 + self.start_address = IPAddressValue(address=event.start_address) + self.end_address = IPAddressValue(address=event.end_address) + self.vrf_id = event.vrf_id + self.status = IPRangeStatus(event.status) + self.tenant_id = event.tenant_id + self.description = event.description + self.custom_fields = event.custom_fields + self.tags = list(event.tags) + + def _apply_IPRangeUpdated(self, event: IPRangeUpdated) -> None: # noqa: N802 + if event.description is not None: + self.description = event.description + if event.tenant_id is not None: + self.tenant_id = event.tenant_id + if event.custom_fields is not None: + self.custom_fields = event.custom_fields + if event.tags is not None: + self.tags = list(event.tags) + + def _apply_IPRangeStatusChanged(self, event: IPRangeStatusChanged) -> None: # noqa: N802 + self.status = IPRangeStatus(event.new_status) + + def _apply_IPRangeDeleted(self, event: IPRangeDeleted) -> None: # noqa: N802 + self._deleted = True + + # --- Snapshot --- + + def to_snapshot(self) -> dict[str, Any]: + return { + "start_address": self.start_address.address if self.start_address else None, + "end_address": self.end_address.address if self.end_address else None, + "vrf_id": str(self.vrf_id) if self.vrf_id else None, + "status": self.status.value, + "tenant_id": str(self.tenant_id) if self.tenant_id else None, + "description": self.description, + "custom_fields": self.custom_fields, + "tags": [str(t) for t in self.tags], + "deleted": self._deleted, + } + + @classmethod + def from_snapshot(cls, aggregate_id: UUID, state: dict[str, Any], version: int) -> Self: + ip_range = cls(aggregate_id=aggregate_id) + ip_range.version = version + 
ip_range.start_address = IPAddressValue(address=state["start_address"]) if state.get("start_address") else None + ip_range.end_address = IPAddressValue(address=state["end_address"]) if state.get("end_address") else None + ip_range.vrf_id = UUID(state["vrf_id"]) if state.get("vrf_id") else None + ip_range.status = IPRangeStatus(state["status"]) + ip_range.tenant_id = UUID(state["tenant_id"]) if state.get("tenant_id") else None + ip_range.description = state.get("description", "") + ip_range.custom_fields = state.get("custom_fields", {}) + ip_range.tags = [UUID(t) for t in state.get("tags", [])] + ip_range._deleted = state.get("deleted", False) + return ip_range diff --git a/services/ipam/src/ipam/domain/prefix.py b/services/ipam/src/ipam/domain/prefix.py new file mode 100644 index 0000000..bdbffb5 --- /dev/null +++ b/services/ipam/src/ipam/domain/prefix.py @@ -0,0 +1,171 @@ +from __future__ import annotations + +from typing import Any, Self +from uuid import UUID + +from shared.domain.exceptions import BusinessRuleViolationError +from shared.event.aggregate import AggregateRoot + +from ipam.domain.events import PrefixCreated, PrefixDeleted, PrefixStatusChanged, PrefixUpdated +from ipam.domain.value_objects import PrefixNetwork, PrefixStatus + + +class Prefix(AggregateRoot): + def __init__(self, aggregate_id: UUID | None = None) -> None: + super().__init__(aggregate_id) + self.network: PrefixNetwork | None = None + self.vrf_id: UUID | None = None + self.vlan_id: UUID | None = None + self.status: PrefixStatus = PrefixStatus.ACTIVE + self.role: str | None = None + self.tenant_id: UUID | None = None + self.description: str = "" + self.custom_fields: dict = {} + self.tags: list[UUID] = [] + self._deleted: bool = False + + @classmethod + def create( + cls, + *, + network: str, + vrf_id: UUID | None = None, + vlan_id: UUID | None = None, + status: PrefixStatus = PrefixStatus.ACTIVE, + role: str | None = None, + tenant_id: UUID | None = None, + description: str = "", + custom_fields: dict | None = None, + tags: list[UUID] | None = None, + ) -> Prefix: + prefix = cls() + prefix.apply_event( + PrefixCreated( + aggregate_id=prefix.id, + version=prefix._next_version(), + network=str(PrefixNetwork(network=network).network), + vrf_id=vrf_id, + vlan_id=vlan_id, + status=status.value, + role=role, + tenant_id=tenant_id, + description=description, + custom_fields=custom_fields or {}, + tags=tags or [], + ) + ) + return prefix + + def update( + self, + *, + description: str | None = None, + role: str | None = None, + tenant_id: UUID | None = None, + vlan_id: UUID | None = None, + custom_fields: dict | None = None, + tags: list[UUID] | None = None, + ) -> None: + if self._deleted: + raise BusinessRuleViolationError("Cannot update a deleted prefix") + self.apply_event( + PrefixUpdated( + aggregate_id=self.id, + version=self._next_version(), + description=description, + role=role, + tenant_id=tenant_id, + vlan_id=vlan_id, + custom_fields=custom_fields, + tags=tags, + ) + ) + + def change_status(self, new_status: PrefixStatus) -> None: + if self._deleted: + raise BusinessRuleViolationError("Cannot change status of a deleted prefix") + if self.status == new_status: + raise BusinessRuleViolationError(f"Prefix is already {new_status.value}") + self.apply_event( + PrefixStatusChanged( + aggregate_id=self.id, + version=self._next_version(), + old_status=self.status.value, + new_status=new_status.value, + ) + ) + + def delete(self) -> None: + if self._deleted: + raise BusinessRuleViolationError("Prefix is already 
deleted") + self.apply_event( + PrefixDeleted( + aggregate_id=self.id, + version=self._next_version(), + ) + ) + + # --- Event Handlers --- + + def _apply_PrefixCreated(self, event: PrefixCreated) -> None: # noqa: N802 + self.network = PrefixNetwork(network=event.network) + self.vrf_id = event.vrf_id + self.vlan_id = event.vlan_id + self.status = PrefixStatus(event.status) + self.role = event.role + self.tenant_id = event.tenant_id + self.description = event.description + self.custom_fields = event.custom_fields + self.tags = list(event.tags) + + def _apply_PrefixUpdated(self, event: PrefixUpdated) -> None: # noqa: N802 + if event.description is not None: + self.description = event.description + if event.role is not None: + self.role = event.role + if event.tenant_id is not None: + self.tenant_id = event.tenant_id + if event.vlan_id is not None: + self.vlan_id = event.vlan_id + if event.custom_fields is not None: + self.custom_fields = event.custom_fields + if event.tags is not None: + self.tags = list(event.tags) + + def _apply_PrefixStatusChanged(self, event: PrefixStatusChanged) -> None: # noqa: N802 + self.status = PrefixStatus(event.new_status) + + def _apply_PrefixDeleted(self, event: PrefixDeleted) -> None: # noqa: N802 + self._deleted = True + + # --- Snapshot --- + + def to_snapshot(self) -> dict[str, Any]: + return { + "network": self.network.network if self.network else None, + "vrf_id": str(self.vrf_id) if self.vrf_id else None, + "vlan_id": str(self.vlan_id) if self.vlan_id else None, + "status": self.status.value, + "role": self.role, + "tenant_id": str(self.tenant_id) if self.tenant_id else None, + "description": self.description, + "custom_fields": self.custom_fields, + "tags": [str(t) for t in self.tags], + "deleted": self._deleted, + } + + @classmethod + def from_snapshot(cls, aggregate_id: UUID, state: dict[str, Any], version: int) -> Self: + prefix = cls(aggregate_id=aggregate_id) + prefix.version = version + prefix.network = PrefixNetwork(network=state["network"]) if state.get("network") else None + prefix.vrf_id = UUID(state["vrf_id"]) if state.get("vrf_id") else None + prefix.vlan_id = UUID(state["vlan_id"]) if state.get("vlan_id") else None + prefix.status = PrefixStatus(state["status"]) + prefix.role = state.get("role") + prefix.tenant_id = UUID(state["tenant_id"]) if state.get("tenant_id") else None + prefix.description = state.get("description", "") + prefix.custom_fields = state.get("custom_fields", {}) + prefix.tags = [UUID(t) for t in state.get("tags", [])] + prefix._deleted = state.get("deleted", False) + return prefix diff --git a/services/ipam/src/ipam/domain/repository.py b/services/ipam/src/ipam/domain/repository.py new file mode 100644 index 0000000..05843fc --- /dev/null +++ b/services/ipam/src/ipam/domain/repository.py @@ -0,0 +1,123 @@ +from abc import ABC, abstractmethod +from uuid import UUID + +from ipam.domain.asn import ASN +from ipam.domain.fhrp_group import FHRPGroup +from ipam.domain.ip_address import IPAddress +from ipam.domain.ip_range import IPRange +from ipam.domain.prefix import Prefix +from ipam.domain.rir import RIR +from ipam.domain.vlan import VLAN +from ipam.domain.vrf import VRF + + +class PrefixRepository(ABC): + @abstractmethod + async def find_by_id(self, prefix_id: UUID) -> Prefix | None: ... + + @abstractmethod + async def save(self, prefix: Prefix) -> None: ... + + @abstractmethod + async def delete(self, prefix_id: UUID) -> None: ... 
+ + @abstractmethod + async def find_children(self, parent_network: str, vrf_id: UUID | None) -> list[Prefix]: ... + + @abstractmethod + async def find_by_vrf(self, vrf_id: UUID, *, offset: int = 0, limit: int = 50) -> tuple[list[Prefix], int]: ... + + +class IPAddressRepository(ABC): + @abstractmethod + async def find_by_id(self, ip_id: UUID) -> IPAddress | None: ... + + @abstractmethod + async def save(self, ip_address: IPAddress) -> None: ... + + @abstractmethod + async def delete(self, ip_id: UUID) -> None: ... + + @abstractmethod + async def find_by_prefix(self, network: str, vrf_id: UUID | None) -> list[IPAddress]: ... + + @abstractmethod + async def exists_in_vrf(self, address: str, vrf_id: UUID | None) -> bool: ... + + +class VRFRepository(ABC): + @abstractmethod + async def find_by_id(self, vrf_id: UUID) -> VRF | None: ... + + @abstractmethod + async def save(self, vrf: VRF) -> None: ... + + @abstractmethod + async def delete(self, vrf_id: UUID) -> None: ... + + @abstractmethod + async def find_by_name(self, name: str) -> VRF | None: ... + + +class VLANRepository(ABC): + @abstractmethod + async def find_by_id(self, vlan_id: UUID) -> VLAN | None: ... + + @abstractmethod + async def save(self, vlan: VLAN) -> None: ... + + @abstractmethod + async def delete(self, vlan_id: UUID) -> None: ... + + @abstractmethod + async def find_by_vid(self, vid: int, group_id: UUID | None) -> VLAN | None: ... + + +class IPRangeRepository(ABC): + @abstractmethod + async def find_by_id(self, range_id: UUID) -> IPRange | None: ... + + @abstractmethod + async def save(self, ip_range: IPRange) -> None: ... + + @abstractmethod + async def delete(self, range_id: UUID) -> None: ... + + +class RIRRepository(ABC): + @abstractmethod + async def find_by_id(self, rir_id: UUID) -> RIR | None: ... + + @abstractmethod + async def save(self, rir: RIR) -> None: ... + + @abstractmethod + async def delete(self, rir_id: UUID) -> None: ... + + @abstractmethod + async def find_by_name(self, name: str) -> RIR | None: ... + + +class ASNRepository(ABC): + @abstractmethod + async def find_by_id(self, asn_id: UUID) -> ASN | None: ... + + @abstractmethod + async def save(self, asn: ASN) -> None: ... + + @abstractmethod + async def delete(self, asn_id: UUID) -> None: ... + + @abstractmethod + async def find_by_asn(self, asn: int) -> ASN | None: ... + + +class FHRPGroupRepository(ABC): + @abstractmethod + async def find_by_id(self, group_id: UUID) -> FHRPGroup | None: ... + + @abstractmethod + async def save(self, group: FHRPGroup) -> None: ... + + @abstractmethod + async def delete(self, group_id: UUID) -> None: ... 
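Where the read-model repositories return plain dicts, these write-side ports return rehydrated aggregates. A rough in-memory stand-in, assuming snapshot persistence is acceptable for tests — an event-sourced adapter would append the aggregate's pending events to the event store instead, and `InMemoryASNRepository` is an assumed helper, not part of this patch:

```python
from uuid import UUID

from ipam.domain.asn import ASN
from ipam.domain.repository import ASNRepository


class InMemoryASNRepository(ASNRepository):
    """Snapshot-backed fake; real adapters persist events, not state."""

    def __init__(self) -> None:
        # aggregate_id -> (snapshot state, version)
        self._snapshots: dict[UUID, tuple[dict, int]] = {}

    async def find_by_id(self, asn_id: UUID) -> ASN | None:
        entry = self._snapshots.get(asn_id)
        if entry is None:
            return None
        state, version = entry
        return ASN.from_snapshot(asn_id, state, version)

    async def save(self, asn: ASN) -> None:
        self._snapshots[asn.id] = (asn.to_snapshot(), asn.version)

    async def delete(self, asn_id: UUID) -> None:
        self._snapshots.pop(asn_id, None)

    async def find_by_asn(self, asn: int) -> ASN | None:
        # Linear scan is fine for a test fake; a real adapter would index this.
        for asn_id, (state, version) in self._snapshots.items():
            if state.get("asn") == asn:
                return ASN.from_snapshot(asn_id, state, version)
        return None
```

Round-tripping `ASN.create(asn=65001)` through `save` and `find_by_id` then yields an aggregate with the same state and version, which is the only contract callers should rely on.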
diff --git a/services/ipam/src/ipam/domain/rir.py b/services/ipam/src/ipam/domain/rir.py new file mode 100644 index 0000000..9cea0d0 --- /dev/null +++ b/services/ipam/src/ipam/domain/rir.py @@ -0,0 +1,121 @@ +from __future__ import annotations + +from typing import Any, Self +from uuid import UUID + +from shared.domain.exceptions import BusinessRuleViolationError +from shared.event.aggregate import AggregateRoot + +from ipam.domain.events import RIRCreated, RIRDeleted, RIRUpdated + + +class RIR(AggregateRoot): + def __init__(self, aggregate_id: UUID | None = None) -> None: + super().__init__(aggregate_id) + self.name: str = "" + self.is_private: bool = False + self.description: str = "" + self.custom_fields: dict = {} + self.tags: list[UUID] = [] + self._deleted: bool = False + + @classmethod + def create( + cls, + *, + name: str, + is_private: bool = False, + description: str = "", + custom_fields: dict | None = None, + tags: list[UUID] | None = None, + ) -> RIR: + rir = cls() + rir.apply_event( + RIRCreated( + aggregate_id=rir.id, + version=rir._next_version(), + name=name, + is_private=is_private, + description=description, + custom_fields=custom_fields or {}, + tags=tags or [], + ) + ) + return rir + + def update( + self, + *, + description: str | None = None, + is_private: bool | None = None, + custom_fields: dict | None = None, + tags: list[UUID] | None = None, + ) -> None: + if self._deleted: + raise BusinessRuleViolationError("Cannot update a deleted RIR") + self.apply_event( + RIRUpdated( + aggregate_id=self.id, + version=self._next_version(), + description=description, + is_private=is_private, + custom_fields=custom_fields, + tags=tags, + ) + ) + + def delete(self) -> None: + if self._deleted: + raise BusinessRuleViolationError("RIR is already deleted") + self.apply_event( + RIRDeleted( + aggregate_id=self.id, + version=self._next_version(), + ) + ) + + # --- Event Handlers --- + + def _apply_RIRCreated(self, event: RIRCreated) -> None: # noqa: N802 + self.name = event.name + self.is_private = event.is_private + self.description = event.description + self.custom_fields = event.custom_fields + self.tags = list(event.tags) + + def _apply_RIRUpdated(self, event: RIRUpdated) -> None: # noqa: N802 + if event.description is not None: + self.description = event.description + if event.is_private is not None: + self.is_private = event.is_private + if event.custom_fields is not None: + self.custom_fields = event.custom_fields + if event.tags is not None: + self.tags = list(event.tags) + + def _apply_RIRDeleted(self, event: RIRDeleted) -> None: # noqa: N802 + self._deleted = True + + # --- Snapshot --- + + def to_snapshot(self) -> dict[str, Any]: + return { + "name": self.name, + "is_private": self.is_private, + "description": self.description, + "custom_fields": self.custom_fields, + "tags": [str(t) for t in self.tags], + "deleted": self._deleted, + } + + @classmethod + def from_snapshot(cls, aggregate_id: UUID, state: dict[str, Any], version: int) -> Self: + rir = cls(aggregate_id=aggregate_id) + rir.version = version + rir.name = state["name"] + rir.is_private = state.get("is_private", False) + rir.description = state.get("description", "") + rir.custom_fields = state.get("custom_fields", {}) + rir.tags = [UUID(t) for t in state.get("tags", [])] + rir._deleted = state.get("deleted", False) + return rir diff --git a/services/ipam/src/ipam/domain/route_target.py b/services/ipam/src/ipam/domain/route_target.py new file mode 100644 index 0000000..9a85406 --- /dev/null +++ 
b/services/ipam/src/ipam/domain/route_target.py @@ -0,0 +1,123 @@ +from __future__ import annotations + +from typing import Any, Self +from uuid import UUID + +from shared.domain.exceptions import BusinessRuleViolationError +from shared.event.aggregate import AggregateRoot + +from ipam.domain.events import RouteTargetCreated, RouteTargetDeleted, RouteTargetUpdated +from ipam.domain.value_objects import RouteDistinguisher + + +class RouteTarget(AggregateRoot): + def __init__(self, aggregate_id: UUID | None = None) -> None: + super().__init__(aggregate_id) + self.name: RouteDistinguisher | None = None + self.tenant_id: UUID | None = None + self.description: str = "" + self.custom_fields: dict = {} + self.tags: list[UUID] = [] + self._deleted: bool = False + + @classmethod + def create( + cls, + *, + name: str, + tenant_id: UUID | None = None, + description: str = "", + custom_fields: dict | None = None, + tags: list[UUID] | None = None, + ) -> RouteTarget: + name_vo = RouteDistinguisher(rd=name) + aggregate = cls() + aggregate.apply_event( + RouteTargetCreated( + aggregate_id=aggregate.id, + version=aggregate._next_version(), + name=name_vo.rd, + tenant_id=tenant_id, + description=description, + custom_fields=custom_fields or {}, + tags=tags or [], + ) + ) + return aggregate + + def update( + self, + *, + description: str | None = None, + tenant_id: UUID | None = None, + custom_fields: dict | None = None, + tags: list[UUID] | None = None, + ) -> None: + if self._deleted: + raise BusinessRuleViolationError("Cannot update a deleted RouteTarget") + self.apply_event( + RouteTargetUpdated( + aggregate_id=self.id, + version=self._next_version(), + description=description, + tenant_id=tenant_id, + custom_fields=custom_fields, + tags=tags, + ) + ) + + def delete(self) -> None: + if self._deleted: + raise BusinessRuleViolationError("RouteTarget is already deleted") + self.apply_event( + RouteTargetDeleted( + aggregate_id=self.id, + version=self._next_version(), + ) + ) + + # --- Event Handlers --- + + def _apply_RouteTargetCreated(self, event: RouteTargetCreated) -> None: # noqa: N802 + self.name = RouteDistinguisher(rd=event.name) + self.tenant_id = event.tenant_id + self.description = event.description + self.custom_fields = event.custom_fields + self.tags = list(event.tags) + + def _apply_RouteTargetUpdated(self, event: RouteTargetUpdated) -> None: # noqa: N802 + if event.description is not None: + self.description = event.description + if event.tenant_id is not None: + self.tenant_id = event.tenant_id + if event.custom_fields is not None: + self.custom_fields = event.custom_fields + if event.tags is not None: + self.tags = list(event.tags) + + def _apply_RouteTargetDeleted(self, event: RouteTargetDeleted) -> None: # noqa: N802 + self._deleted = True + + # --- Snapshot --- + + def to_snapshot(self) -> dict[str, Any]: + return { + "name": self.name.rd if self.name else None, + "tenant_id": str(self.tenant_id) if self.tenant_id else None, + "description": self.description, + "custom_fields": self.custom_fields, + "tags": [str(t) for t in self.tags], + "deleted": self._deleted, + } + + @classmethod + def from_snapshot(cls, aggregate_id: UUID, state: dict[str, Any], version: int) -> Self: + aggregate = cls(aggregate_id=aggregate_id) + aggregate.version = version + aggregate.name = RouteDistinguisher(rd=state["name"]) if state.get("name") else None + aggregate.tenant_id = UUID(state["tenant_id"]) if state.get("tenant_id") else None + aggregate.description = state.get("description", "") + 
aggregate.custom_fields = state.get("custom_fields", {}) + aggregate.tags = [UUID(t) for t in state.get("tags", [])] + aggregate._deleted = state.get("deleted", False) + return aggregate diff --git a/services/ipam/src/ipam/domain/service.py b/services/ipam/src/ipam/domain/service.py new file mode 100644 index 0000000..8d4ecc4 --- /dev/null +++ b/services/ipam/src/ipam/domain/service.py @@ -0,0 +1,157 @@ +from __future__ import annotations + +from typing import Any, Self +from uuid import UUID + +from shared.domain.exceptions import BusinessRuleViolationError +from shared.event.aggregate import AggregateRoot + +from ipam.domain.events import ServiceCreated, ServiceDeleted, ServiceUpdated +from ipam.domain.value_objects import ServiceProtocol + + +class Service(AggregateRoot): + def __init__(self, aggregate_id: UUID | None = None) -> None: + super().__init__(aggregate_id) + self.name: str = "" + self.protocol: ServiceProtocol = ServiceProtocol.TCP + self.ports: list[int] = [] + self.ip_addresses: list[UUID] = [] + self.description: str = "" + self.custom_fields: dict = {} + self.tags: list[UUID] = [] + self._deleted: bool = False + + @classmethod + def create( + cls, + *, + name: str, + protocol: ServiceProtocol, + ports: list[int], + ip_addresses: list[UUID] | None = None, + description: str = "", + custom_fields: dict | None = None, + tags: list[UUID] | None = None, + ) -> Service: + cls._validate_ports(ports) + aggregate = cls() + aggregate.apply_event( + ServiceCreated( + aggregate_id=aggregate.id, + version=aggregate._next_version(), + name=name, + protocol=protocol.value, + ports=ports, + ip_addresses=ip_addresses or [], + description=description, + custom_fields=custom_fields or {}, + tags=tags or [], + ) + ) + return aggregate + + def update( + self, + *, + name: str | None = None, + protocol: str | None = None, + ports: list[int] | None = None, + ip_addresses: list[UUID] | None = None, + description: str | None = None, + custom_fields: dict | None = None, + tags: list[UUID] | None = None, + ) -> None: + if self._deleted: + raise BusinessRuleViolationError("Cannot update a deleted Service") + if ports is not None: + self._validate_ports(ports) + self.apply_event( + ServiceUpdated( + aggregate_id=self.id, + version=self._next_version(), + name=name, + protocol=protocol, + ports=ports, + ip_addresses=ip_addresses, + description=description, + custom_fields=custom_fields, + tags=tags, + ) + ) + + def delete(self) -> None: + if self._deleted: + raise BusinessRuleViolationError("Service is already deleted") + self.apply_event( + ServiceDeleted( + aggregate_id=self.id, + version=self._next_version(), + ) + ) + + @staticmethod + def _validate_ports(ports: list[int]) -> None: + if not ports: + raise BusinessRuleViolationError("Service must have at least one port") + for port in ports: + if not 1 <= port <= 65535: + raise BusinessRuleViolationError(f"Port must be between 1 and 65535, got {port}") + + # --- Event Handlers --- + + def _apply_ServiceCreated(self, event: ServiceCreated) -> None: # noqa: N802 + self.name = event.name + self.protocol = ServiceProtocol(event.protocol) + self.ports = list(event.ports) + self.ip_addresses = list(event.ip_addresses) + self.description = event.description + self.custom_fields = event.custom_fields + self.tags = list(event.tags) + + def _apply_ServiceUpdated(self, event: ServiceUpdated) -> None: # noqa: N802 + if event.name is not None: + self.name = event.name + if event.protocol is not None: + self.protocol = ServiceProtocol(event.protocol) + if 
event.ports is not None: + self.ports = list(event.ports) + if event.ip_addresses is not None: + self.ip_addresses = list(event.ip_addresses) + if event.description is not None: + self.description = event.description + if event.custom_fields is not None: + self.custom_fields = event.custom_fields + if event.tags is not None: + self.tags = list(event.tags) + + def _apply_ServiceDeleted(self, event: ServiceDeleted) -> None: # noqa: N802 + self._deleted = True + + # --- Snapshot --- + + def to_snapshot(self) -> dict[str, Any]: + return { + "name": self.name, + "protocol": self.protocol.value, + "ports": self.ports, + "ip_addresses": [str(ip) for ip in self.ip_addresses], + "description": self.description, + "custom_fields": self.custom_fields, + "tags": [str(t) for t in self.tags], + "deleted": self._deleted, + } + + @classmethod + def from_snapshot(cls, aggregate_id: UUID, state: dict[str, Any], version: int) -> Self: + aggregate = cls(aggregate_id=aggregate_id) + aggregate.version = version + aggregate.name = state.get("name", "") + aggregate.protocol = ServiceProtocol(state.get("protocol", "tcp")) + aggregate.ports = state.get("ports", []) + aggregate.ip_addresses = [UUID(ip) for ip in state.get("ip_addresses", [])] + aggregate.description = state.get("description", "") + aggregate.custom_fields = state.get("custom_fields", {}) + aggregate.tags = [UUID(t) for t in state.get("tags", [])] + aggregate._deleted = state.get("deleted", False) + return aggregate diff --git a/services/ipam/src/ipam/domain/services.py b/services/ipam/src/ipam/domain/services.py new file mode 100644 index 0000000..8974541 --- /dev/null +++ b/services/ipam/src/ipam/domain/services.py @@ -0,0 +1,88 @@ +from ipam.domain.ip_address import IPAddress +from ipam.domain.ip_range import IPRange +from ipam.domain.prefix import Prefix + + +class PrefixUtilizationService: + def calculate( + self, + prefix: Prefix, + child_prefixes: list[Prefix], + ip_addresses: list[IPAddress], + ) -> float: + if prefix.network is None: + return 0.0 + total = prefix.network.num_addresses + if total == 0: + return 0.0 + used = sum(cp.network.num_addresses for cp in child_prefixes if cp.network) + used += len(ip_addresses) + return min(used / total, 1.0) + + +class AvailablePrefixService: + def find_available( + self, + parent: Prefix, + child_prefixes: list[Prefix], + desired_prefix_length: int, + ) -> list[str]: + if parent.network is None: + return [] + parent_net = parent.network.ip_network + used_nets = sorted( + [cp.network.ip_network for cp in child_prefixes if cp.network], + key=lambda n: n.network_address, + ) + available = [] + candidates = list(parent_net.subnets(new_prefix=desired_prefix_length)) + for candidate in candidates: + overlaps = False + for used in used_nets: + if candidate.overlaps(used): + overlaps = True + break + if not overlaps: + available.append(str(candidate)) + return available + + +class IPRangeUtilizationService: + def calculate(self, ip_range: IPRange, used_addresses: list[IPAddress]) -> float: + if ip_range.start_address is None or ip_range.end_address is None: + return 0.0 + start = int(ip_range.start_address.ip_address) + end = int(ip_range.end_address.ip_address) + total = end - start + 1 + if total <= 0: + return 0.0 + in_range = 0 + for addr in used_addresses: + if addr.address: + ip_int = int(addr.address.ip_address) + if start <= ip_int <= end: + in_range += 1 + return min(in_range / total, 1.0) + + +class IPAvailabilityService: + def find_available( + self, + prefix: Prefix, + used_addresses: 
list[IPAddress], + count: int = 1, + ) -> list[str]: + if prefix.network is None: + return [] + net = prefix.network.ip_network + used_set = set() + for addr in used_addresses: + if addr.address: + used_set.add(addr.address.ip_address) + available = [] + for host in net.hosts(): + if host not in used_set: + available.append(str(host)) + if len(available) >= count: + break + return available diff --git a/services/ipam/src/ipam/domain/value_objects.py b/services/ipam/src/ipam/domain/value_objects.py new file mode 100644 index 0000000..ece21b6 --- /dev/null +++ b/services/ipam/src/ipam/domain/value_objects.py @@ -0,0 +1,132 @@ +import ipaddress +from enum import StrEnum + +from pydantic import field_validator +from shared.domain.value_object import ValueObject + + +class PrefixStatus(StrEnum): + ACTIVE = "active" + RESERVED = "reserved" + DEPRECATED = "deprecated" + CONTAINER = "container" + + +class IPAddressStatus(StrEnum): + ACTIVE = "active" + RESERVED = "reserved" + DEPRECATED = "deprecated" + DHCP = "dhcp" + SLAAC = "slaac" + + +class VLANStatus(StrEnum): + ACTIVE = "active" + RESERVED = "reserved" + DEPRECATED = "deprecated" + + +class PrefixNetwork(ValueObject): + network: str + + @field_validator("network") + @classmethod + def validate_network(cls, v: str) -> str: + ipaddress.ip_network(v, strict=False) + return str(ipaddress.ip_network(v, strict=False)) + + @property + def ip_network(self) -> ipaddress.IPv4Network | ipaddress.IPv6Network: + return ipaddress.ip_network(self.network, strict=False) + + @property + def version(self) -> int: + return self.ip_network.version + + @property + def num_addresses(self) -> int: + return self.ip_network.num_addresses + + @property + def prefix_length(self) -> int: + return self.ip_network.prefixlen + + def contains(self, other: "PrefixNetwork") -> bool: + return other.ip_network.subnet_of(self.ip_network) + + +class IPAddressValue(ValueObject): + address: str + + @field_validator("address") + @classmethod + def validate_address(cls, v: str) -> str: + ipaddress.ip_address(v) + return str(ipaddress.ip_address(v)) + + @property + def ip_address(self) -> ipaddress.IPv4Address | ipaddress.IPv6Address: + return ipaddress.ip_address(self.address) + + @property + def version(self) -> int: + return self.ip_address.version + + +class VLANId(ValueObject): + vid: int + + @field_validator("vid") + @classmethod + def validate_vid(cls, v: int) -> int: + if not 1 <= v <= 4094: + raise ValueError(f"VLAN ID must be between 1 and 4094, got {v}") + return v + + +class IPRangeStatus(StrEnum): + ACTIVE = "active" + RESERVED = "reserved" + DEPRECATED = "deprecated" + + +class FHRPProtocol(StrEnum): + VRRP = "vrrp" + HSRP = "hsrp" + GLBP = "glbp" + CARP = "carp" + OTHER = "other" + + +class FHRPAuthType(StrEnum): + PLAINTEXT = "plaintext" + MD5 = "md5" + + +class RouteDistinguisher(ValueObject): + rd: str + + @field_validator("rd") + @classmethod + def validate_rd(cls, v: str) -> str: + parts = v.split(":") + if len(parts) != 2: + raise ValueError(f"Route Distinguisher must be in format 'ASN:NN' or 'IP:NN', got '{v}'") + return v + + +class ServiceProtocol(StrEnum): + TCP = "tcp" + UDP = "udp" + SCTP = "sctp" + + +class ASNumber(ValueObject): + asn: int + + @field_validator("asn") + @classmethod + def validate_asn(cls, v: int) -> int: + if not 1 <= v <= 4294967295: + raise ValueError(f"ASN must be between 1 and 4294967295, got {v}") + return v diff --git a/services/ipam/src/ipam/domain/vlan.py b/services/ipam/src/ipam/domain/vlan.py new file mode 100644 index 
0000000..b899f3e --- /dev/null +++ b/services/ipam/src/ipam/domain/vlan.py @@ -0,0 +1,167 @@ +from __future__ import annotations + +from typing import Any, Self +from uuid import UUID + +from shared.domain.exceptions import BusinessRuleViolationError +from shared.event.aggregate import AggregateRoot + +from ipam.domain.events import VLANCreated, VLANDeleted, VLANStatusChanged, VLANUpdated +from ipam.domain.value_objects import VLANId, VLANStatus + + +class VLAN(AggregateRoot): + def __init__(self, aggregate_id: UUID | None = None) -> None: + super().__init__(aggregate_id) + self.vid: VLANId | None = None + self.name: str = "" + self.group_id: UUID | None = None + self.status: VLANStatus = VLANStatus.ACTIVE + self.role: str | None = None + self.tenant_id: UUID | None = None + self.description: str = "" + self.custom_fields: dict = {} + self.tags: list[UUID] = [] + self._deleted: bool = False + + @classmethod + def create( + cls, + *, + vid: int, + name: str, + group_id: UUID | None = None, + status: VLANStatus = VLANStatus.ACTIVE, + role: str | None = None, + tenant_id: UUID | None = None, + description: str = "", + custom_fields: dict | None = None, + tags: list[UUID] | None = None, + ) -> VLAN: + vlan = cls() + vlan.apply_event( + VLANCreated( + aggregate_id=vlan.id, + version=vlan._next_version(), + vid=VLANId(vid=vid).vid, + name=name, + group_id=group_id, + status=status.value, + role=role, + tenant_id=tenant_id, + description=description, + custom_fields=custom_fields or {}, + tags=tags or [], + ) + ) + return vlan + + def update( + self, + *, + name: str | None = None, + role: str | None = None, + description: str | None = None, + custom_fields: dict | None = None, + tags: list[UUID] | None = None, + ) -> None: + if self._deleted: + raise BusinessRuleViolationError("Cannot update a deleted VLAN") + self.apply_event( + VLANUpdated( + aggregate_id=self.id, + version=self._next_version(), + name=name, + role=role, + description=description, + custom_fields=custom_fields, + tags=tags, + ) + ) + + def change_status(self, new_status: VLANStatus) -> None: + if self._deleted: + raise BusinessRuleViolationError("Cannot change status of a deleted VLAN") + if self.status == new_status: + raise BusinessRuleViolationError(f"VLAN is already {new_status.value}") + self.apply_event( + VLANStatusChanged( + aggregate_id=self.id, + version=self._next_version(), + old_status=self.status.value, + new_status=new_status.value, + ) + ) + + def delete(self) -> None: + if self._deleted: + raise BusinessRuleViolationError("VLAN is already deleted") + self.apply_event( + VLANDeleted( + aggregate_id=self.id, + version=self._next_version(), + ) + ) + + # --- Event Handlers --- + + def _apply_VLANCreated(self, event: VLANCreated) -> None: # noqa: N802 + self.vid = VLANId(vid=event.vid) + self.name = event.name + self.group_id = event.group_id + self.status = VLANStatus(event.status) + self.role = event.role + self.tenant_id = event.tenant_id + self.description = event.description + self.custom_fields = event.custom_fields + self.tags = list(event.tags) + + def _apply_VLANUpdated(self, event: VLANUpdated) -> None: # noqa: N802 + if event.name is not None: + self.name = event.name + if event.role is not None: + self.role = event.role + if event.description is not None: + self.description = event.description + if event.custom_fields is not None: + self.custom_fields = event.custom_fields + if event.tags is not None: + self.tags = list(event.tags) + + def _apply_VLANStatusChanged(self, event: VLANStatusChanged) -> 
None: # noqa: N802 + self.status = VLANStatus(event.new_status) + + def _apply_VLANDeleted(self, event: VLANDeleted) -> None: # noqa: N802 + self._deleted = True + + # --- Snapshot --- + + def to_snapshot(self) -> dict[str, Any]: + return { + "vid": self.vid.vid if self.vid else None, + "name": self.name, + "group_id": str(self.group_id) if self.group_id else None, + "status": self.status.value, + "role": self.role, + "tenant_id": str(self.tenant_id) if self.tenant_id else None, + "description": self.description, + "custom_fields": self.custom_fields, + "tags": [str(t) for t in self.tags], + "deleted": self._deleted, + } + + @classmethod + def from_snapshot(cls, aggregate_id: UUID, state: dict[str, Any], version: int) -> Self: + vlan = cls(aggregate_id=aggregate_id) + vlan.version = version + vlan.vid = VLANId(vid=state["vid"]) if state.get("vid") is not None else None + vlan.name = state.get("name", "") + vlan.group_id = UUID(state["group_id"]) if state.get("group_id") else None + vlan.status = VLANStatus(state["status"]) + vlan.role = state.get("role") + vlan.tenant_id = UUID(state["tenant_id"]) if state.get("tenant_id") else None + vlan.description = state.get("description", "") + vlan.custom_fields = state.get("custom_fields", {}) + vlan.tags = [UUID(t) for t in state.get("tags", [])] + vlan._deleted = state.get("deleted", False) + return vlan diff --git a/services/ipam/src/ipam/domain/vlan_group.py b/services/ipam/src/ipam/domain/vlan_group.py new file mode 100644 index 0000000..921dece --- /dev/null +++ b/services/ipam/src/ipam/domain/vlan_group.py @@ -0,0 +1,161 @@ +from __future__ import annotations + +from typing import Any, Self +from uuid import UUID + +from shared.domain.exceptions import BusinessRuleViolationError +from shared.event.aggregate import AggregateRoot + +from ipam.domain.events import VLANGroupCreated, VLANGroupDeleted, VLANGroupUpdated + + +class VLANGroup(AggregateRoot): + def __init__(self, aggregate_id: UUID | None = None) -> None: + super().__init__(aggregate_id) + self.name: str = "" + self.slug: str = "" + self.min_vid: int = 1 + self.max_vid: int = 4094 + self.tenant_id: UUID | None = None + self.description: str = "" + self.custom_fields: dict = {} + self.tags: list[UUID] = [] + self._deleted: bool = False + + @classmethod + def create( + cls, + *, + name: str, + slug: str, + min_vid: int = 1, + max_vid: int = 4094, + tenant_id: UUID | None = None, + description: str = "", + custom_fields: dict | None = None, + tags: list[UUID] | None = None, + ) -> VLANGroup: + cls._validate_vid_range(min_vid, max_vid) + aggregate = cls() + aggregate.apply_event( + VLANGroupCreated( + aggregate_id=aggregate.id, + version=aggregate._next_version(), + name=name, + slug=slug, + min_vid=min_vid, + max_vid=max_vid, + tenant_id=tenant_id, + description=description, + custom_fields=custom_fields or {}, + tags=tags or [], + ) + ) + return aggregate + + def update( + self, + *, + name: str | None = None, + description: str | None = None, + min_vid: int | None = None, + max_vid: int | None = None, + custom_fields: dict | None = None, + tags: list[UUID] | None = None, + ) -> None: + if self._deleted: + raise BusinessRuleViolationError("Cannot update a deleted VLANGroup") + new_min = min_vid if min_vid is not None else self.min_vid + new_max = max_vid if max_vid is not None else self.max_vid + if min_vid is not None or max_vid is not None: + self._validate_vid_range(new_min, new_max) + self.apply_event( + VLANGroupUpdated( + aggregate_id=self.id, + version=self._next_version(), + 
name=name, + description=description, + min_vid=min_vid, + max_vid=max_vid, + custom_fields=custom_fields, + tags=tags, + ) + ) + + def delete(self) -> None: + if self._deleted: + raise BusinessRuleViolationError("VLANGroup is already deleted") + self.apply_event( + VLANGroupDeleted( + aggregate_id=self.id, + version=self._next_version(), + ) + ) + + @staticmethod + def _validate_vid_range(min_vid: int, max_vid: int) -> None: + if not 1 <= min_vid <= 4094: + raise BusinessRuleViolationError(f"min_vid must be between 1 and 4094, got {min_vid}") + if not 1 <= max_vid <= 4094: + raise BusinessRuleViolationError(f"max_vid must be between 1 and 4094, got {max_vid}") + if min_vid > max_vid: + raise BusinessRuleViolationError(f"min_vid ({min_vid}) must be <= max_vid ({max_vid})") + + # --- Event Handlers --- + + def _apply_VLANGroupCreated(self, event: VLANGroupCreated) -> None: # noqa: N802 + self.name = event.name + self.slug = event.slug + self.min_vid = event.min_vid + self.max_vid = event.max_vid + self.tenant_id = event.tenant_id + self.description = event.description + self.custom_fields = event.custom_fields + self.tags = list(event.tags) + + def _apply_VLANGroupUpdated(self, event: VLANGroupUpdated) -> None: # noqa: N802 + if event.name is not None: + self.name = event.name + if event.description is not None: + self.description = event.description + if event.min_vid is not None: + self.min_vid = event.min_vid + if event.max_vid is not None: + self.max_vid = event.max_vid + if event.custom_fields is not None: + self.custom_fields = event.custom_fields + if event.tags is not None: + self.tags = list(event.tags) + + def _apply_VLANGroupDeleted(self, event: VLANGroupDeleted) -> None: # noqa: N802 + self._deleted = True + + # --- Snapshot --- + + def to_snapshot(self) -> dict[str, Any]: + return { + "name": self.name, + "slug": self.slug, + "min_vid": self.min_vid, + "max_vid": self.max_vid, + "tenant_id": str(self.tenant_id) if self.tenant_id else None, + "description": self.description, + "custom_fields": self.custom_fields, + "tags": [str(t) for t in self.tags], + "deleted": self._deleted, + } + + @classmethod + def from_snapshot(cls, aggregate_id: UUID, state: dict[str, Any], version: int) -> Self: + aggregate = cls(aggregate_id=aggregate_id) + aggregate.version = version + aggregate.name = state.get("name", "") + aggregate.slug = state.get("slug", "") + aggregate.min_vid = state.get("min_vid", 1) + aggregate.max_vid = state.get("max_vid", 4094) + aggregate.tenant_id = UUID(state["tenant_id"]) if state.get("tenant_id") else None + aggregate.description = state.get("description", "") + aggregate.custom_fields = state.get("custom_fields", {}) + aggregate.tags = [UUID(t) for t in state.get("tags", [])] + aggregate._deleted = state.get("deleted", False) + return aggregate diff --git a/services/ipam/src/ipam/domain/vrf.py b/services/ipam/src/ipam/domain/vrf.py new file mode 100644 index 0000000..19cafa1 --- /dev/null +++ b/services/ipam/src/ipam/domain/vrf.py @@ -0,0 +1,148 @@ +from __future__ import annotations + +from typing import Any, Self +from uuid import UUID + +from shared.domain.exceptions import BusinessRuleViolationError +from shared.event.aggregate import AggregateRoot + +from ipam.domain.events import VRFCreated, VRFDeleted, VRFUpdated +from ipam.domain.value_objects import RouteDistinguisher + + +class VRF(AggregateRoot): + def __init__(self, aggregate_id: UUID | None = None) -> None: + super().__init__(aggregate_id) + self.name: str = "" + self.rd: RouteDistinguisher | None = 
None + self.import_targets: list[UUID] = [] + self.export_targets: list[UUID] = [] + self.tenant_id: UUID | None = None + self.description: str = "" + self.custom_fields: dict = {} + self.tags: list[UUID] = [] + self._deleted: bool = False + + @classmethod + def create( + cls, + *, + name: str, + rd: str | None = None, + import_targets: list[UUID] | None = None, + export_targets: list[UUID] | None = None, + tenant_id: UUID | None = None, + description: str = "", + custom_fields: dict | None = None, + tags: list[UUID] | None = None, + ) -> VRF: + vrf = cls() + vrf.apply_event( + VRFCreated( + aggregate_id=vrf.id, + version=vrf._next_version(), + name=name, + rd=RouteDistinguisher(rd=rd).rd if rd else None, + import_targets=import_targets or [], + export_targets=export_targets or [], + tenant_id=tenant_id, + description=description, + custom_fields=custom_fields or {}, + tags=tags or [], + ) + ) + return vrf + + def update( + self, + *, + name: str | None = None, + import_targets: list[UUID] | None = None, + export_targets: list[UUID] | None = None, + description: str | None = None, + custom_fields: dict | None = None, + tags: list[UUID] | None = None, + ) -> None: + if self._deleted: + raise BusinessRuleViolationError("Cannot update a deleted VRF") + self.apply_event( + VRFUpdated( + aggregate_id=self.id, + version=self._next_version(), + name=name, + import_targets=import_targets, + export_targets=export_targets, + description=description, + custom_fields=custom_fields, + tags=tags, + ) + ) + + def delete(self) -> None: + if self._deleted: + raise BusinessRuleViolationError("VRF is already deleted") + self.apply_event( + VRFDeleted( + aggregate_id=self.id, + version=self._next_version(), + ) + ) + + # --- Event Handlers --- + + def _apply_VRFCreated(self, event: VRFCreated) -> None: # noqa: N802 + self.name = event.name + self.rd = RouteDistinguisher(rd=event.rd) if event.rd else None + self.import_targets = list(event.import_targets) + self.export_targets = list(event.export_targets) + self.tenant_id = event.tenant_id + self.description = event.description + self.custom_fields = event.custom_fields + self.tags = list(event.tags) + + def _apply_VRFUpdated(self, event: VRFUpdated) -> None: # noqa: N802 + if event.name is not None: + self.name = event.name + if event.import_targets is not None: + self.import_targets = list(event.import_targets) + if event.export_targets is not None: + self.export_targets = list(event.export_targets) + if event.description is not None: + self.description = event.description + if event.custom_fields is not None: + self.custom_fields = event.custom_fields + if event.tags is not None: + self.tags = list(event.tags) + + def _apply_VRFDeleted(self, event: VRFDeleted) -> None: # noqa: N802 + self._deleted = True + + # --- Snapshot --- + + def to_snapshot(self) -> dict[str, Any]: + return { + "name": self.name, + "rd": self.rd.rd if self.rd else None, + "import_targets": [str(t) for t in self.import_targets], + "export_targets": [str(t) for t in self.export_targets], + "tenant_id": str(self.tenant_id) if self.tenant_id else None, + "description": self.description, + "custom_fields": self.custom_fields, + "tags": [str(t) for t in self.tags], + "deleted": self._deleted, + } + + @classmethod + def from_snapshot(cls, aggregate_id: UUID, state: dict[str, Any], version: int) -> Self: + vrf = cls(aggregate_id=aggregate_id) + vrf.version = version + vrf.name = state.get("name", "") + vrf.rd = RouteDistinguisher(rd=state["rd"]) if state.get("rd") else None + 
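A quick sketch of the `RouteDistinguisher` guard on `VRF.create` — assuming `ValueObject` is a pydantic model (the `field_validator` usage suggests it), a malformed distinguisher surfaces as a pydantic `ValidationError` at creation time:

```python
# Illustrative sketch: the rd string is validated by RouteDistinguisher.
# Assumes ValueObject is pydantic-based, so ValueError -> ValidationError.
from pydantic import ValidationError

from ipam.domain.vrf import VRF

vrf = VRF.create(name="CUST-A", rd="65000:1")  # "ASN:NN" form is accepted
assert vrf.rd is not None and vrf.rd.rd == "65000:1"

try:
    VRF.create(name="BAD", rd="not-a-distinguisher")  # missing ":" separator
except ValidationError as exc:
    print(exc.errors()[0]["msg"])
```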
vrf.import_targets = [UUID(t) for t in state.get("import_targets", [])] + vrf.export_targets = [UUID(t) for t in state.get("export_targets", [])] + vrf.tenant_id = UUID(state["tenant_id"]) if state.get("tenant_id") else None + vrf.description = state.get("description", "") + vrf.custom_fields = state.get("custom_fields", {}) + vrf.tags = [UUID(t) for t in state.get("tags", [])] + vrf._deleted = state.get("deleted", False) + return vrf diff --git a/services/ipam/src/ipam/infrastructure/__init__.py b/services/ipam/src/ipam/infrastructure/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/services/ipam/src/ipam/infrastructure/cache.py b/services/ipam/src/ipam/infrastructure/cache.py new file mode 100644 index 0000000..48f6858 --- /dev/null +++ b/services/ipam/src/ipam/infrastructure/cache.py @@ -0,0 +1,40 @@ +import json +import logging +from typing import Any +from uuid import UUID + +import redis.asyncio as redis + +logger = logging.getLogger(__name__) + + +class RedisCache: + def __init__(self, redis_url: str) -> None: + self._redis_url = redis_url + self._redis: redis.Redis | None = None + + async def connect(self) -> None: + self._redis = redis.from_url(self._redis_url, decode_responses=True) + + async def close(self) -> None: + if self._redis: + await self._redis.aclose() + + async def get_json(self, key: str) -> Any | None: + if self._redis is None: + return None + raw = await self._redis.get(key) + if raw is None: + return None + return json.loads(raw) + + async def set_json(self, key: str, value: Any, ttl: int = 300) -> None: + if self._redis is None: + return + await self._redis.set(key, json.dumps(value), ex=ttl) + + async def invalidate_prefix_utilization(self, prefix_id: UUID) -> None: + if self._redis is None: + return + key = f"prefix_utilization:{prefix_id}" + await self._redis.delete(key) diff --git a/services/ipam/src/ipam/infrastructure/config.py b/services/ipam/src/ipam/infrastructure/config.py new file mode 100644 index 0000000..8a6bb6e --- /dev/null +++ b/services/ipam/src/ipam/infrastructure/config.py @@ -0,0 +1,16 @@ +from pydantic_settings import BaseSettings + + +class Settings(BaseSettings): + database_url: str = "postgresql+asyncpg://cmdb:cmdb@postgres:5432/cmdb_ipam" + db_host: str = "postgres" + db_port: int = 5432 + db_user: str = "cmdb" + db_password: str = "cmdb" + kafka_bootstrap_servers: str = "kafka:9092" + redis_url: str = "redis://redis:6379" + + def tenant_db_url(self, tenant_slug: str) -> str: + return f"postgresql+asyncpg://{self.db_user}:{self.db_password}@{self.db_host}:{self.db_port}/cmdb_tenant_{tenant_slug}" + + model_config = {"env_prefix": "IPAM_"} diff --git a/services/ipam/src/ipam/infrastructure/cross_service_consumer.py b/services/ipam/src/ipam/infrastructure/cross_service_consumer.py new file mode 100644 index 0000000..31aa048 --- /dev/null +++ b/services/ipam/src/ipam/infrastructure/cross_service_consumer.py @@ -0,0 +1,37 @@ +"""Phase 2 groundwork: consumer for events from other services. + +Lifespan wiring will be added in Phase 2, once the actual topics exist.
+""" + +import logging + +from shared.event.domain_event import DomainEvent +from shared.messaging.consumer import KafkaEventConsumer +from shared.messaging.serialization import EventSerializer + +logger = logging.getLogger(__name__) + + +class CrossServiceConsumer: + def __init__(self, bootstrap_servers: str, serializer: EventSerializer) -> None: + self._consumer = KafkaEventConsumer( + bootstrap_servers=bootstrap_servers, + group_id="ipam-cross-service", + topics=["dcim.events", "tenancy.events"], + serializer=serializer, + ) + + async def handle_dcim_event(self, event: DomainEvent) -> None: + logger.info("Received DCIM event: %s (Phase 2)", event.event_type) + + async def handle_tenancy_event(self, event: DomainEvent) -> None: + logger.info("Received Tenancy event: %s (Phase 2)", event.event_type) + + async def start(self) -> None: + await self._consumer.start() + + async def stop(self) -> None: + await self._consumer.stop() + + async def consume(self) -> None: + await self._consumer.consume() diff --git a/services/ipam/src/ipam/infrastructure/database.py b/services/ipam/src/ipam/infrastructure/database.py new file mode 100644 index 0000000..712fa3b --- /dev/null +++ b/services/ipam/src/ipam/infrastructure/database.py @@ -0,0 +1,28 @@ +from sqlalchemy.ext.asyncio import ( + AsyncEngine, + AsyncSession, + async_sessionmaker, + create_async_engine, +) + + +class Database: + def __init__(self, url: str) -> None: + self._engine: AsyncEngine = create_async_engine( + url, echo=False, pool_size=20, max_overflow=30, pool_pre_ping=True, pool_recycle=300 + ) + self._session_factory = async_sessionmaker( + self._engine, + class_=AsyncSession, + expire_on_commit=False, + ) + + @property + def engine(self) -> AsyncEngine: + return self._engine + + def session(self) -> AsyncSession: + return self._session_factory() + + async def close(self) -> None: + await self._engine.dispose() diff --git a/services/ipam/src/ipam/infrastructure/event_projector.py b/services/ipam/src/ipam/infrastructure/event_projector.py new file mode 100644 index 0000000..9ae3c0c --- /dev/null +++ b/services/ipam/src/ipam/infrastructure/event_projector.py @@ -0,0 +1,566 @@ +import logging +from uuid import UUID + +from shared.event.domain_event import DomainEvent +from shared.messaging.consumer import KafkaEventConsumer +from sqlalchemy import update +from sqlalchemy.dialects.postgresql import insert + +from ipam.domain.events import ( + ASNCreated, + ASNDeleted, + ASNUpdated, + FHRPGroupCreated, + FHRPGroupDeleted, + FHRPGroupUpdated, + IPAddressCreated, + IPAddressDeleted, + IPAddressStatusChanged, + IPAddressUpdated, + IPRangeCreated, + IPRangeDeleted, + IPRangeStatusChanged, + IPRangeUpdated, + PrefixCreated, + PrefixDeleted, + PrefixStatusChanged, + PrefixUpdated, + RIRCreated, + RIRDeleted, + RIRUpdated, + RouteTargetCreated, + RouteTargetDeleted, + RouteTargetUpdated, + ServiceCreated, + ServiceDeleted, + ServiceUpdated, + VLANCreated, + VLANDeleted, + VLANGroupCreated, + VLANGroupDeleted, + VLANGroupUpdated, + VLANStatusChanged, + VLANUpdated, + VRFCreated, + VRFDeleted, + VRFUpdated, +) +from ipam.infrastructure.models import ( + ASNReadModel, + FHRPGroupReadModel, + IPAddressReadModel, + IPRangeReadModel, + PrefixReadModel, + RIRReadModel, + RouteTargetReadModel, + ServiceReadModel, + VLANGroupReadModel, + VLANReadModel, + VRFReadModel, +) + +logger = logging.getLogger(__name__) + + +class IPAMEventProjector: + def __init__(self, session_factory: object, cache: object | None = None) -> None: + self._session_factory = 
session_factory + self._cache = cache + + async def _invalidate_cache(self, prefix_id: UUID) -> None: + if self._cache is not None: + await self._cache.invalidate_prefix_utilization(prefix_id) + + async def _handle_prefix_created(self, event: DomainEvent) -> None: + assert isinstance(event, PrefixCreated) + async with self._session_factory() as session: + stmt = insert(PrefixReadModel).values( + id=event.aggregate_id, + network=event.network, + vrf_id=event.vrf_id, + vlan_id=event.vlan_id, + status=event.status, + role=event.role, + tenant_id=event.tenant_id, + description=event.description, + custom_fields=event.custom_fields, + tags=[str(t) for t in event.tags], + is_deleted=False, + ) + stmt = stmt.on_conflict_do_update(index_elements=["id"], set_=dict(stmt.excluded)) + await session.execute(stmt) + await session.commit() + await self._invalidate_cache(event.aggregate_id) + + async def _handle_prefix_updated(self, event: DomainEvent) -> None: + assert isinstance(event, PrefixUpdated) + values: dict = {} + if event.description is not None: + values["description"] = event.description + if event.role is not None: + values["role"] = event.role + if event.tenant_id is not None: + values["tenant_id"] = event.tenant_id + if event.vlan_id is not None: + values["vlan_id"] = event.vlan_id + if event.custom_fields is not None: + values["custom_fields"] = event.custom_fields + if event.tags is not None: + values["tags"] = [str(t) for t in event.tags] + if values: + await self._update_model(PrefixReadModel, event.aggregate_id, values) + await self._invalidate_cache(event.aggregate_id) + + async def _handle_prefix_status_changed(self, event: DomainEvent) -> None: + assert isinstance(event, PrefixStatusChanged) + await self._update_model(PrefixReadModel, event.aggregate_id, {"status": event.new_status}) + await self._invalidate_cache(event.aggregate_id) + + async def _handle_prefix_deleted(self, event: DomainEvent) -> None: + await self._update_model(PrefixReadModel, event.aggregate_id, {"is_deleted": True}) + await self._invalidate_cache(event.aggregate_id) + + async def _handle_ip_address_created(self, event: DomainEvent) -> None: + assert isinstance(event, IPAddressCreated) + async with self._session_factory() as session: + stmt = insert(IPAddressReadModel).values( + id=event.aggregate_id, + address=event.address, + vrf_id=event.vrf_id, + status=event.status, + dns_name=event.dns_name, + tenant_id=event.tenant_id, + description=event.description, + custom_fields=event.custom_fields, + tags=[str(t) for t in event.tags], + is_deleted=False, + ) + stmt = stmt.on_conflict_do_update(index_elements=["id"], set_=dict(stmt.excluded)) + await session.execute(stmt) + await session.commit() + + async def _handle_ip_address_updated(self, event: DomainEvent) -> None: + assert isinstance(event, IPAddressUpdated) + values: dict = {} + if event.dns_name is not None: + values["dns_name"] = event.dns_name + if event.description is not None: + values["description"] = event.description + if event.custom_fields is not None: + values["custom_fields"] = event.custom_fields + if event.tags is not None: + values["tags"] = [str(t) for t in event.tags] + if values: + await self._update_model(IPAddressReadModel, event.aggregate_id, values) + + async def _handle_ip_address_status_changed(self, event: DomainEvent) -> None: + assert isinstance(event, IPAddressStatusChanged) + await self._update_model(IPAddressReadModel, event.aggregate_id, {"status": event.new_status}) + + async def _handle_ip_address_deleted(self, event: 
DomainEvent) -> None: + await self._update_model(IPAddressReadModel, event.aggregate_id, {"is_deleted": True}) + + async def _handle_vrf_created(self, event: DomainEvent) -> None: + assert isinstance(event, VRFCreated) + async with self._session_factory() as session: + stmt = insert(VRFReadModel).values( + id=event.aggregate_id, + name=event.name, + rd=event.rd, + import_targets=[str(t) for t in event.import_targets], + export_targets=[str(t) for t in event.export_targets], + tenant_id=event.tenant_id, + description=event.description, + custom_fields=event.custom_fields, + tags=[str(t) for t in event.tags], + is_deleted=False, + ) + stmt = stmt.on_conflict_do_update(index_elements=["id"], set_=dict(stmt.excluded)) + await session.execute(stmt) + await session.commit() + + async def _handle_vrf_updated(self, event: DomainEvent) -> None: + assert isinstance(event, VRFUpdated) + values: dict = {} + if event.name is not None: + values["name"] = event.name + if event.import_targets is not None: + values["import_targets"] = [str(t) for t in event.import_targets] + if event.export_targets is not None: + values["export_targets"] = [str(t) for t in event.export_targets] + if event.description is not None: + values["description"] = event.description + if event.custom_fields is not None: + values["custom_fields"] = event.custom_fields + if event.tags is not None: + values["tags"] = [str(t) for t in event.tags] + if values: + await self._update_model(VRFReadModel, event.aggregate_id, values) + + async def _handle_vrf_deleted(self, event: DomainEvent) -> None: + await self._update_model(VRFReadModel, event.aggregate_id, {"is_deleted": True}) + + async def _handle_vlan_created(self, event: DomainEvent) -> None: + assert isinstance(event, VLANCreated) + async with self._session_factory() as session: + stmt = insert(VLANReadModel).values( + id=event.aggregate_id, + vid=event.vid, + name=event.name, + group_id=event.group_id, + status=event.status, + role=event.role, + tenant_id=event.tenant_id, + description=event.description, + custom_fields=event.custom_fields, + tags=[str(t) for t in event.tags], + is_deleted=False, + ) + stmt = stmt.on_conflict_do_update(index_elements=["id"], set_=dict(stmt.excluded)) + await session.execute(stmt) + await session.commit() + + async def _handle_vlan_updated(self, event: DomainEvent) -> None: + assert isinstance(event, VLANUpdated) + values: dict = {} + if event.name is not None: + values["name"] = event.name + if event.role is not None: + values["role"] = event.role + if event.description is not None: + values["description"] = event.description + if event.custom_fields is not None: + values["custom_fields"] = event.custom_fields + if event.tags is not None: + values["tags"] = [str(t) for t in event.tags] + if values: + await self._update_model(VLANReadModel, event.aggregate_id, values) + + async def _handle_vlan_status_changed(self, event: DomainEvent) -> None: + assert isinstance(event, VLANStatusChanged) + await self._update_model(VLANReadModel, event.aggregate_id, {"status": event.new_status}) + + async def _handle_vlan_deleted(self, event: DomainEvent) -> None: + await self._update_model(VLANReadModel, event.aggregate_id, {"is_deleted": True}) + + async def _handle_ip_range_created(self, event: DomainEvent) -> None: + assert isinstance(event, IPRangeCreated) + async with self._session_factory() as session: + stmt = insert(IPRangeReadModel).values( + id=event.aggregate_id, + start_address=event.start_address, + end_address=event.end_address, + vrf_id=event.vrf_id, 
+ status=event.status, + tenant_id=event.tenant_id, + description=event.description, + custom_fields=event.custom_fields, + tags=[str(t) for t in event.tags], + is_deleted=False, + ) + stmt = stmt.on_conflict_do_update(index_elements=["id"], set_=dict(stmt.excluded)) + await session.execute(stmt) + await session.commit() + + async def _handle_ip_range_updated(self, event: DomainEvent) -> None: + assert isinstance(event, IPRangeUpdated) + values: dict = {} + if event.description is not None: + values["description"] = event.description + if event.tenant_id is not None: + values["tenant_id"] = event.tenant_id + if event.custom_fields is not None: + values["custom_fields"] = event.custom_fields + if event.tags is not None: + values["tags"] = [str(t) for t in event.tags] + if values: + await self._update_model(IPRangeReadModel, event.aggregate_id, values) + + async def _handle_ip_range_status_changed(self, event: DomainEvent) -> None: + assert isinstance(event, IPRangeStatusChanged) + await self._update_model(IPRangeReadModel, event.aggregate_id, {"status": event.new_status}) + + async def _handle_ip_range_deleted(self, event: DomainEvent) -> None: + await self._update_model(IPRangeReadModel, event.aggregate_id, {"is_deleted": True}) + + async def _handle_rir_created(self, event: DomainEvent) -> None: + assert isinstance(event, RIRCreated) + async with self._session_factory() as session: + stmt = insert(RIRReadModel).values( + id=event.aggregate_id, + name=event.name, + is_private=event.is_private, + description=event.description, + custom_fields=event.custom_fields, + tags=[str(t) for t in event.tags], + is_deleted=False, + ) + stmt = stmt.on_conflict_do_update(index_elements=["id"], set_=dict(stmt.excluded)) + await session.execute(stmt) + await session.commit() + + async def _handle_rir_updated(self, event: DomainEvent) -> None: + assert isinstance(event, RIRUpdated) + values: dict = {} + if event.description is not None: + values["description"] = event.description + if event.is_private is not None: + values["is_private"] = event.is_private + if event.custom_fields is not None: + values["custom_fields"] = event.custom_fields + if event.tags is not None: + values["tags"] = [str(t) for t in event.tags] + if values: + await self._update_model(RIRReadModel, event.aggregate_id, values) + + async def _handle_rir_deleted(self, event: DomainEvent) -> None: + await self._update_model(RIRReadModel, event.aggregate_id, {"is_deleted": True}) + + async def _handle_asn_created(self, event: DomainEvent) -> None: + assert isinstance(event, ASNCreated) + async with self._session_factory() as session: + stmt = insert(ASNReadModel).values( + id=event.aggregate_id, + asn=event.asn, + rir_id=event.rir_id, + tenant_id=event.tenant_id, + description=event.description, + custom_fields=event.custom_fields, + tags=[str(t) for t in event.tags], + is_deleted=False, + ) + stmt = stmt.on_conflict_do_update(index_elements=["id"], set_=dict(stmt.excluded)) + await session.execute(stmt) + await session.commit() + + async def _handle_asn_updated(self, event: DomainEvent) -> None: + assert isinstance(event, ASNUpdated) + values: dict = {} + if event.description is not None: + values["description"] = event.description + if event.tenant_id is not None: + values["tenant_id"] = event.tenant_id + if event.custom_fields is not None: + values["custom_fields"] = event.custom_fields + if event.tags is not None: + values["tags"] = [str(t) for t in event.tags] + if values: + await self._update_model(ASNReadModel, event.aggregate_id, 
values) + + async def _handle_asn_deleted(self, event: DomainEvent) -> None: + await self._update_model(ASNReadModel, event.aggregate_id, {"is_deleted": True}) + + async def _handle_fhrp_group_created(self, event: DomainEvent) -> None: + assert isinstance(event, FHRPGroupCreated) + async with self._session_factory() as session: + stmt = insert(FHRPGroupReadModel).values( + id=event.aggregate_id, + protocol=event.protocol, + group_id_value=event.group_id_value, + auth_type=event.auth_type, + auth_key=event.auth_key, + name=event.name, + description=event.description, + custom_fields=event.custom_fields, + tags=[str(t) for t in event.tags], + is_deleted=False, + ) + stmt = stmt.on_conflict_do_update(index_elements=["id"], set_=dict(stmt.excluded)) + await session.execute(stmt) + await session.commit() + + async def _handle_fhrp_group_updated(self, event: DomainEvent) -> None: + assert isinstance(event, FHRPGroupUpdated) + values: dict = {} + if event.name is not None: + values["name"] = event.name + if event.auth_type is not None: + values["auth_type"] = event.auth_type + if event.auth_key is not None: + values["auth_key"] = event.auth_key + if event.description is not None: + values["description"] = event.description + if event.custom_fields is not None: + values["custom_fields"] = event.custom_fields + if event.tags is not None: + values["tags"] = [str(t) for t in event.tags] + if values: + await self._update_model(FHRPGroupReadModel, event.aggregate_id, values) + + async def _handle_fhrp_group_deleted(self, event: DomainEvent) -> None: + await self._update_model(FHRPGroupReadModel, event.aggregate_id, {"is_deleted": True}) + + # --- RouteTarget --- + + async def _handle_route_target_created(self, event: DomainEvent) -> None: + assert isinstance(event, RouteTargetCreated) + async with self._session_factory() as session: + stmt = insert(RouteTargetReadModel).values( + id=event.aggregate_id, + name=event.name, + tenant_id=event.tenant_id, + description=event.description, + custom_fields=event.custom_fields, + tags=[str(t) for t in event.tags], + is_deleted=False, + ) + stmt = stmt.on_conflict_do_update(index_elements=["id"], set_=dict(stmt.excluded)) + await session.execute(stmt) + await session.commit() + + async def _handle_route_target_updated(self, event: DomainEvent) -> None: + assert isinstance(event, RouteTargetUpdated) + values: dict = {} + if event.description is not None: + values["description"] = event.description + if event.tenant_id is not None: + values["tenant_id"] = event.tenant_id + if event.custom_fields is not None: + values["custom_fields"] = event.custom_fields + if event.tags is not None: + values["tags"] = [str(t) for t in event.tags] + if values: + await self._update_model(RouteTargetReadModel, event.aggregate_id, values) + + async def _handle_route_target_deleted(self, event: DomainEvent) -> None: + await self._update_model(RouteTargetReadModel, event.aggregate_id, {"is_deleted": True}) + + # --- VLANGroup --- + + async def _handle_vlan_group_created(self, event: DomainEvent) -> None: + assert isinstance(event, VLANGroupCreated) + async with self._session_factory() as session: + stmt = insert(VLANGroupReadModel).values( + id=event.aggregate_id, + name=event.name, + slug=event.slug, + min_vid=event.min_vid, + max_vid=event.max_vid, + tenant_id=event.tenant_id, + description=event.description, + custom_fields=event.custom_fields, + tags=[str(t) for t in event.tags], + is_deleted=False, + ) + stmt = stmt.on_conflict_do_update(index_elements=["id"], 
set_=dict(stmt.excluded)) + await session.execute(stmt) + await session.commit() + + async def _handle_vlan_group_updated(self, event: DomainEvent) -> None: + assert isinstance(event, VLANGroupUpdated) + values: dict = {} + if event.name is not None: + values["name"] = event.name + if event.description is not None: + values["description"] = event.description + if event.min_vid is not None: + values["min_vid"] = event.min_vid + if event.max_vid is not None: + values["max_vid"] = event.max_vid + if event.custom_fields is not None: + values["custom_fields"] = event.custom_fields + if event.tags is not None: + values["tags"] = [str(t) for t in event.tags] + if values: + await self._update_model(VLANGroupReadModel, event.aggregate_id, values) + + async def _handle_vlan_group_deleted(self, event: DomainEvent) -> None: + await self._update_model(VLANGroupReadModel, event.aggregate_id, {"is_deleted": True}) + + # --- Service --- + + async def _handle_service_created(self, event: DomainEvent) -> None: + assert isinstance(event, ServiceCreated) + async with self._session_factory() as session: + stmt = insert(ServiceReadModel).values( + id=event.aggregate_id, + name=event.name, + protocol=event.protocol, + ports=event.ports, + ip_addresses=[str(ip) for ip in event.ip_addresses], + description=event.description, + custom_fields=event.custom_fields, + tags=[str(t) for t in event.tags], + is_deleted=False, + ) + stmt = stmt.on_conflict_do_update(index_elements=["id"], set_=dict(stmt.excluded)) + await session.execute(stmt) + await session.commit() + + async def _handle_service_updated(self, event: DomainEvent) -> None: + assert isinstance(event, ServiceUpdated) + values: dict = {} + if event.name is not None: + values["name"] = event.name + if event.protocol is not None: + values["protocol"] = event.protocol + if event.ports is not None: + values["ports"] = event.ports + if event.ip_addresses is not None: + values["ip_addresses"] = [str(ip) for ip in event.ip_addresses] + if event.description is not None: + values["description"] = event.description + if event.custom_fields is not None: + values["custom_fields"] = event.custom_fields + if event.tags is not None: + values["tags"] = [str(t) for t in event.tags] + if values: + await self._update_model(ServiceReadModel, event.aggregate_id, values) + + async def _handle_service_deleted(self, event: DomainEvent) -> None: + await self._update_model(ServiceReadModel, event.aggregate_id, {"is_deleted": True}) + + async def _update_model(self, model_cls: type, aggregate_id: UUID, values: dict) -> None: + async with self._session_factory() as session: + stmt = update(model_cls).where(model_cls.id == aggregate_id).values(**values) + await session.execute(stmt) + await session.commit() + + def register_all(self, consumer: KafkaEventConsumer) -> None: + consumer.subscribe(PrefixCreated, self._handle_prefix_created) + consumer.subscribe(PrefixUpdated, self._handle_prefix_updated) + consumer.subscribe(PrefixStatusChanged, self._handle_prefix_status_changed) + consumer.subscribe(PrefixDeleted, self._handle_prefix_deleted) + + consumer.subscribe(IPAddressCreated, self._handle_ip_address_created) + consumer.subscribe(IPAddressUpdated, self._handle_ip_address_updated) + consumer.subscribe(IPAddressStatusChanged, self._handle_ip_address_status_changed) + consumer.subscribe(IPAddressDeleted, self._handle_ip_address_deleted) + + consumer.subscribe(VRFCreated, self._handle_vrf_created) + consumer.subscribe(VRFUpdated, self._handle_vrf_updated) + consumer.subscribe(VRFDeleted, 
self._handle_vrf_deleted) + + consumer.subscribe(VLANCreated, self._handle_vlan_created) + consumer.subscribe(VLANUpdated, self._handle_vlan_updated) + consumer.subscribe(VLANStatusChanged, self._handle_vlan_status_changed) + consumer.subscribe(VLANDeleted, self._handle_vlan_deleted) + + consumer.subscribe(IPRangeCreated, self._handle_ip_range_created) + consumer.subscribe(IPRangeUpdated, self._handle_ip_range_updated) + consumer.subscribe(IPRangeStatusChanged, self._handle_ip_range_status_changed) + consumer.subscribe(IPRangeDeleted, self._handle_ip_range_deleted) + + consumer.subscribe(RIRCreated, self._handle_rir_created) + consumer.subscribe(RIRUpdated, self._handle_rir_updated) + consumer.subscribe(RIRDeleted, self._handle_rir_deleted) + + consumer.subscribe(ASNCreated, self._handle_asn_created) + consumer.subscribe(ASNUpdated, self._handle_asn_updated) + consumer.subscribe(ASNDeleted, self._handle_asn_deleted) + + consumer.subscribe(FHRPGroupCreated, self._handle_fhrp_group_created) + consumer.subscribe(FHRPGroupUpdated, self._handle_fhrp_group_updated) + consumer.subscribe(FHRPGroupDeleted, self._handle_fhrp_group_deleted) + + consumer.subscribe(RouteTargetCreated, self._handle_route_target_created) + consumer.subscribe(RouteTargetUpdated, self._handle_route_target_updated) + consumer.subscribe(RouteTargetDeleted, self._handle_route_target_deleted) + + consumer.subscribe(VLANGroupCreated, self._handle_vlan_group_created) + consumer.subscribe(VLANGroupUpdated, self._handle_vlan_group_updated) + consumer.subscribe(VLANGroupDeleted, self._handle_vlan_group_deleted) + + consumer.subscribe(ServiceCreated, self._handle_service_created) + consumer.subscribe(ServiceUpdated, self._handle_service_updated) + consumer.subscribe(ServiceDeleted, self._handle_service_deleted) diff --git a/services/ipam/src/ipam/infrastructure/models.py b/services/ipam/src/ipam/infrastructure/models.py new file mode 100644 index 0000000..ee3cc41 --- /dev/null +++ b/services/ipam/src/ipam/infrastructure/models.py @@ -0,0 +1,346 @@ +from datetime import datetime +from uuid import UUID + +from sqlalchemy import Boolean, Computed, DateTime, Index, Integer, String, Text, func +from sqlalchemy.dialects.postgresql import JSONB, TSVECTOR +from sqlalchemy.dialects.postgresql import UUID as SAUUID +from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column + + +class IPAMBase(DeclarativeBase): + pass + + +class PrefixReadModel(IPAMBase): + __tablename__ = "prefixes_read" + + id: Mapped[UUID] = mapped_column(SAUUID(as_uuid=True), primary_key=True) + network: Mapped[str] = mapped_column(String(50), index=True) + vrf_id: Mapped[UUID | None] = mapped_column(SAUUID(as_uuid=True), nullable=True) + vlan_id: Mapped[UUID | None] = mapped_column(SAUUID(as_uuid=True), nullable=True) + status: Mapped[str] = mapped_column(String(20)) + role: Mapped[str | None] = mapped_column(String(100), nullable=True) + tenant_id: Mapped[UUID | None] = mapped_column(SAUUID(as_uuid=True), nullable=True) + description: Mapped[str] = mapped_column(Text, default="") + custom_fields: Mapped[dict] = mapped_column(JSONB, default=dict) + tags: Mapped[list] = mapped_column(JSONB, default=list) + search_vector: Mapped[str | None] = mapped_column( + TSVECTOR, + Computed( + "to_tsvector('simple', coalesce(network, '') || ' ' || coalesce(description, '') || ' ' || coalesce(role, ''))", # noqa: E501 + persisted=True, + ), + nullable=True, + ) + is_deleted: Mapped[bool] = mapped_column(Boolean, default=False) + created_at: Mapped[datetime] = 
mapped_column(DateTime(timezone=True), server_default=func.now()) + updated_at: Mapped[datetime] = mapped_column( + DateTime(timezone=True), server_default=func.now(), onupdate=func.now() + ) + + __table_args__ = (Index("ix_prefixes_read_search", "search_vector", postgresql_using="gin"),) + + +class IPAddressReadModel(IPAMBase): + __tablename__ = "ip_addresses_read" + + id: Mapped[UUID] = mapped_column(SAUUID(as_uuid=True), primary_key=True) + address: Mapped[str] = mapped_column(String(50), index=True) + vrf_id: Mapped[UUID | None] = mapped_column(SAUUID(as_uuid=True), nullable=True) + status: Mapped[str] = mapped_column(String(20)) + dns_name: Mapped[str] = mapped_column(String(255), default="") + tenant_id: Mapped[UUID | None] = mapped_column(SAUUID(as_uuid=True), nullable=True) + description: Mapped[str] = mapped_column(Text, default="") + custom_fields: Mapped[dict] = mapped_column(JSONB, default=dict) + tags: Mapped[list] = mapped_column(JSONB, default=list) + search_vector: Mapped[str | None] = mapped_column( + TSVECTOR, + Computed( + "to_tsvector('simple', coalesce(address, '') || ' ' || coalesce(dns_name, '') || ' ' || coalesce(description, ''))", # noqa: E501 + persisted=True, + ), + nullable=True, + ) + is_deleted: Mapped[bool] = mapped_column(Boolean, default=False) + created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), server_default=func.now()) + updated_at: Mapped[datetime] = mapped_column( + DateTime(timezone=True), server_default=func.now(), onupdate=func.now() + ) + + __table_args__ = (Index("ix_ip_addresses_read_search", "search_vector", postgresql_using="gin"),) + + +class VRFReadModel(IPAMBase): + __tablename__ = "vrfs_read" + + id: Mapped[UUID] = mapped_column(SAUUID(as_uuid=True), primary_key=True) + name: Mapped[str] = mapped_column(String(255), index=True) + rd: Mapped[str | None] = mapped_column(String(50), nullable=True) + import_targets: Mapped[list] = mapped_column(JSONB, default=list) + export_targets: Mapped[list] = mapped_column(JSONB, default=list) + tenant_id: Mapped[UUID | None] = mapped_column(SAUUID(as_uuid=True), nullable=True) + description: Mapped[str] = mapped_column(Text, default="") + custom_fields: Mapped[dict] = mapped_column(JSONB, default=dict) + tags: Mapped[list] = mapped_column(JSONB, default=list) + search_vector: Mapped[str | None] = mapped_column( + TSVECTOR, + Computed( + "to_tsvector('simple', coalesce(name, '') || ' ' || coalesce(rd, '') || ' ' || coalesce(description, ''))", + persisted=True, + ), + nullable=True, + ) + is_deleted: Mapped[bool] = mapped_column(Boolean, default=False) + created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), server_default=func.now()) + updated_at: Mapped[datetime] = mapped_column( + DateTime(timezone=True), server_default=func.now(), onupdate=func.now() + ) + + __table_args__ = (Index("ix_vrfs_read_search", "search_vector", postgresql_using="gin"),) + + +class VLANReadModel(IPAMBase): + __tablename__ = "vlans_read" + + id: Mapped[UUID] = mapped_column(SAUUID(as_uuid=True), primary_key=True) + vid: Mapped[int] = mapped_column(Integer, index=True) + name: Mapped[str] = mapped_column(String(255)) + group_id: Mapped[UUID | None] = mapped_column(SAUUID(as_uuid=True), nullable=True) + status: Mapped[str] = mapped_column(String(20)) + role: Mapped[str | None] = mapped_column(String(100), nullable=True) + tenant_id: Mapped[UUID | None] = mapped_column(SAUUID(as_uuid=True), nullable=True) + description: Mapped[str] = mapped_column(Text, default="") + custom_fields: Mapped[dict] 
= mapped_column(JSONB, default=dict) + tags: Mapped[list] = mapped_column(JSONB, default=list) + search_vector: Mapped[str | None] = mapped_column( + TSVECTOR, + Computed( + "to_tsvector('simple', coalesce(name, '') || ' ' || vid::text || ' ' || coalesce(description, ''))", + persisted=True, + ), + nullable=True, + ) + is_deleted: Mapped[bool] = mapped_column(Boolean, default=False) + created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), server_default=func.now()) + updated_at: Mapped[datetime] = mapped_column( + DateTime(timezone=True), server_default=func.now(), onupdate=func.now() + ) + + __table_args__ = (Index("ix_vlans_read_search", "search_vector", postgresql_using="gin"),) + + +class IPRangeReadModel(IPAMBase): + __tablename__ = "ip_ranges_read" + + id: Mapped[UUID] = mapped_column(SAUUID(as_uuid=True), primary_key=True) + start_address: Mapped[str] = mapped_column(String(50), index=True) + end_address: Mapped[str] = mapped_column(String(50)) + vrf_id: Mapped[UUID | None] = mapped_column(SAUUID(as_uuid=True), nullable=True) + status: Mapped[str] = mapped_column(String(20)) + tenant_id: Mapped[UUID | None] = mapped_column(SAUUID(as_uuid=True), nullable=True) + description: Mapped[str] = mapped_column(Text, default="") + custom_fields: Mapped[dict] = mapped_column(JSONB, default=dict) + tags: Mapped[list] = mapped_column(JSONB, default=list) + search_vector: Mapped[str | None] = mapped_column( + TSVECTOR, + Computed( + "to_tsvector('simple', coalesce(start_address, '') || ' ' || coalesce(end_address, '') || ' ' || coalesce(description, ''))", # noqa: E501 + persisted=True, + ), + nullable=True, + ) + is_deleted: Mapped[bool] = mapped_column(Boolean, default=False) + created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), server_default=func.now()) + updated_at: Mapped[datetime] = mapped_column( + DateTime(timezone=True), server_default=func.now(), onupdate=func.now() + ) + + __table_args__ = (Index("ix_ip_ranges_read_search", "search_vector", postgresql_using="gin"),) + + +class RIRReadModel(IPAMBase): + __tablename__ = "rirs_read" + + id: Mapped[UUID] = mapped_column(SAUUID(as_uuid=True), primary_key=True) + name: Mapped[str] = mapped_column(String(255), index=True) + is_private: Mapped[bool] = mapped_column(Boolean, default=False) + description: Mapped[str] = mapped_column(Text, default="") + custom_fields: Mapped[dict] = mapped_column(JSONB, default=dict) + tags: Mapped[list] = mapped_column(JSONB, default=list) + search_vector: Mapped[str | None] = mapped_column( + TSVECTOR, + Computed("to_tsvector('simple', coalesce(name, '') || ' ' || coalesce(description, ''))", persisted=True), + nullable=True, + ) + is_deleted: Mapped[bool] = mapped_column(Boolean, default=False) + created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), server_default=func.now()) + updated_at: Mapped[datetime] = mapped_column( + DateTime(timezone=True), server_default=func.now(), onupdate=func.now() + ) + + __table_args__ = (Index("ix_rirs_read_search", "search_vector", postgresql_using="gin"),) + + +class ASNReadModel(IPAMBase): + __tablename__ = "asns_read" + + id: Mapped[UUID] = mapped_column(SAUUID(as_uuid=True), primary_key=True) + asn: Mapped[int] = mapped_column(Integer, index=True) + rir_id: Mapped[UUID | None] = mapped_column(SAUUID(as_uuid=True), nullable=True) + tenant_id: Mapped[UUID | None] = mapped_column(SAUUID(as_uuid=True), nullable=True) + description: Mapped[str] = mapped_column(Text, default="") + custom_fields: Mapped[dict] = 
mapped_column(JSONB, default=dict) + tags: Mapped[list] = mapped_column(JSONB, default=list) + search_vector: Mapped[str | None] = mapped_column( + TSVECTOR, + Computed("to_tsvector('simple', asn::text || ' ' || coalesce(description, ''))", persisted=True), + nullable=True, + ) + is_deleted: Mapped[bool] = mapped_column(Boolean, default=False) + created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), server_default=func.now()) + updated_at: Mapped[datetime] = mapped_column( + DateTime(timezone=True), server_default=func.now(), onupdate=func.now() + ) + + __table_args__ = (Index("ix_asns_read_search", "search_vector", postgresql_using="gin"),) + + +class FHRPGroupReadModel(IPAMBase): + __tablename__ = "fhrp_groups_read" + + id: Mapped[UUID] = mapped_column(SAUUID(as_uuid=True), primary_key=True) + protocol: Mapped[str] = mapped_column(String(20)) + group_id_value: Mapped[int] = mapped_column(Integer) + auth_type: Mapped[str] = mapped_column(String(20)) + auth_key: Mapped[str] = mapped_column(String(255), default="") + name: Mapped[str] = mapped_column(String(255), default="") + description: Mapped[str] = mapped_column(Text, default="") + custom_fields: Mapped[dict] = mapped_column(JSONB, default=dict) + tags: Mapped[list] = mapped_column(JSONB, default=list) + search_vector: Mapped[str | None] = mapped_column( + TSVECTOR, + Computed( + "to_tsvector('simple', coalesce(name, '') || ' ' || coalesce(protocol, '') || ' ' || coalesce(description, ''))", # noqa: E501 + persisted=True, + ), + nullable=True, + ) + is_deleted: Mapped[bool] = mapped_column(Boolean, default=False) + created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), server_default=func.now()) + updated_at: Mapped[datetime] = mapped_column( + DateTime(timezone=True), server_default=func.now(), onupdate=func.now() + ) + + __table_args__ = (Index("ix_fhrp_groups_read_search", "search_vector", postgresql_using="gin"),) + + +class RouteTargetReadModel(IPAMBase): + __tablename__ = "route_targets_read" + + id: Mapped[UUID] = mapped_column(SAUUID(as_uuid=True), primary_key=True) + name: Mapped[str] = mapped_column(String(100), index=True) + tenant_id: Mapped[UUID | None] = mapped_column(SAUUID(as_uuid=True), nullable=True) + description: Mapped[str] = mapped_column(Text, default="") + custom_fields: Mapped[dict] = mapped_column(JSONB, default=dict) + tags: Mapped[list] = mapped_column(JSONB, default=list) + search_vector: Mapped[str | None] = mapped_column( + TSVECTOR, + Computed("to_tsvector('simple', coalesce(name, '') || ' ' || coalesce(description, ''))", persisted=True), + nullable=True, + ) + is_deleted: Mapped[bool] = mapped_column(Boolean, default=False) + created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), server_default=func.now()) + updated_at: Mapped[datetime] = mapped_column( + DateTime(timezone=True), server_default=func.now(), onupdate=func.now() + ) + + __table_args__ = (Index("ix_route_targets_read_search", "search_vector", postgresql_using="gin"),) + + +class VLANGroupReadModel(IPAMBase): + __tablename__ = "vlan_groups_read" + + id: Mapped[UUID] = mapped_column(SAUUID(as_uuid=True), primary_key=True) + name: Mapped[str] = mapped_column(String(255), index=True) + slug: Mapped[str] = mapped_column(String(255), unique=True, index=True) + min_vid: Mapped[int] = mapped_column(Integer) + max_vid: Mapped[int] = mapped_column(Integer) + tenant_id: Mapped[UUID | None] = mapped_column(SAUUID(as_uuid=True), nullable=True) + description: Mapped[str] = mapped_column(Text, default="") + 
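Every read model here declares a stored, generated `search_vector` column plus a GIN index, so full-text lookups stay inside PostgreSQL. A sketch of a query a read-side repository might build against it — the `.op("@@")` form is plain SQLAlchemy; session handling is assumed:

```python
# Illustrative sketch: full-text search over the generated search_vector.
from sqlalchemy import func, select

from ipam.infrastructure.models import PrefixReadModel

def search_prefixes_stmt(term: str):
    query = func.plainto_tsquery("simple", term)  # matches the 'simple' config above
    return (
        select(PrefixReadModel)
        .where(PrefixReadModel.search_vector.op("@@")(query))
        .where(PrefixReadModel.is_deleted.is_(False))
    )
```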
custom_fields: Mapped[dict] = mapped_column(JSONB, default=dict) + tags: Mapped[list] = mapped_column(JSONB, default=list) + search_vector: Mapped[str | None] = mapped_column( + TSVECTOR, + Computed( + "to_tsvector('simple', coalesce(name, '') || ' ' || coalesce(slug, '') || ' ' || coalesce(description, ''))", # noqa: E501 + persisted=True, + ), + nullable=True, + ) + is_deleted: Mapped[bool] = mapped_column(Boolean, default=False) + created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), server_default=func.now()) + updated_at: Mapped[datetime] = mapped_column( + DateTime(timezone=True), server_default=func.now(), onupdate=func.now() + ) + + __table_args__ = (Index("ix_vlan_groups_read_search", "search_vector", postgresql_using="gin"),) + + +class ServiceReadModel(IPAMBase): + __tablename__ = "services_read" + + id: Mapped[UUID] = mapped_column(SAUUID(as_uuid=True), primary_key=True) + name: Mapped[str] = mapped_column(String(255), index=True) + protocol: Mapped[str] = mapped_column(String(10)) + ports: Mapped[list] = mapped_column(JSONB, default=list) + ip_addresses: Mapped[list] = mapped_column(JSONB, default=list) + description: Mapped[str] = mapped_column(Text, default="") + custom_fields: Mapped[dict] = mapped_column(JSONB, default=dict) + tags: Mapped[list] = mapped_column(JSONB, default=list) + search_vector: Mapped[str | None] = mapped_column( + TSVECTOR, + Computed( + "to_tsvector('simple', coalesce(name, '') || ' ' || coalesce(protocol, '') || ' ' || coalesce(description, ''))", # noqa: E501 + persisted=True, + ), + nullable=True, + ) + is_deleted: Mapped[bool] = mapped_column(Boolean, default=False) + created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), server_default=func.now()) + updated_at: Mapped[datetime] = mapped_column( + DateTime(timezone=True), server_default=func.now(), onupdate=func.now() + ) + + __table_args__ = (Index("ix_services_read_search", "search_vector", postgresql_using="gin"),) + + +class SavedFilterModel(IPAMBase): + __tablename__ = "saved_filters" + + id: Mapped[UUID] = mapped_column(SAUUID(as_uuid=True), primary_key=True) + user_id: Mapped[UUID] = mapped_column(SAUUID(as_uuid=True), index=True) + name: Mapped[str] = mapped_column(String(255)) + entity_type: Mapped[str] = mapped_column(String(50)) + filter_config: Mapped[dict] = mapped_column(JSONB, default=dict) + is_default: Mapped[bool] = mapped_column(Boolean, default=False) + created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), server_default=func.now()) + updated_at: Mapped[datetime] = mapped_column( + DateTime(timezone=True), server_default=func.now(), onupdate=func.now() + ) + + __table_args__ = (Index("ix_saved_filters_user_entity", "user_id", "entity_type"),) + + +class ExportTemplateModel(IPAMBase): + __tablename__ = "export_templates" + + id: Mapped[UUID] = mapped_column(SAUUID(as_uuid=True), primary_key=True) + name: Mapped[str] = mapped_column(String(255), unique=True) + entity_type: Mapped[str] = mapped_column(String(50)) + template_content: Mapped[str] = mapped_column(Text) + output_format: Mapped[str] = mapped_column(String(20), default="text") + description: Mapped[str] = mapped_column(Text, default="") + created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), server_default=func.now()) + updated_at: Mapped[datetime] = mapped_column( + DateTime(timezone=True), server_default=func.now(), onupdate=func.now() + ) diff --git a/services/ipam/src/ipam/infrastructure/read_model_repository.py 
b/services/ipam/src/ipam/infrastructure/read_model_repository.py new file mode 100644 index 0000000..35b2756 --- /dev/null +++ b/services/ipam/src/ipam/infrastructure/read_model_repository.py @@ -0,0 +1,1113 @@ +from __future__ import annotations + +import ipaddress +from typing import Any +from uuid import UUID + +import sqlalchemy as sa +from shared.api.filtering import FilterParam, apply_filters +from shared.api.sorting import SortParam, apply_sorting +from shared.domain.filters import filter_by_custom_field +from sqlalchemy import Select, func, select +from sqlalchemy.ext.asyncio import AsyncSession + +from ipam.application.read_model import ( + ASNReadModelRepository, + FHRPGroupReadModelRepository, + IPAddressReadModelRepository, + IPRangeReadModelRepository, + PrefixReadModelRepository, + RIRReadModelRepository, + RouteTargetReadModelRepository, + ServiceReadModelRepository, + VLANGroupReadModelRepository, + VLANReadModelRepository, + VRFReadModelRepository, +) +from ipam.infrastructure.models import ( + ASNReadModel, + FHRPGroupReadModel, + IPAddressReadModel, + IPRangeReadModel, + PrefixReadModel, + RIRReadModel, + RouteTargetReadModel, + ServiceReadModel, + VLANGroupReadModel, + VLANReadModel, + VRFReadModel, +) + +# --------------------------------------------------------------------------- +# Common helpers +# --------------------------------------------------------------------------- + + +def _apply_advanced_filters( + stmt: Select, # type: ignore[type-arg] + model: Any, + *, + filters: list[FilterParam] | None = None, + sort_params: list[SortParam] | None = None, + tag_slugs: list[str] | None = None, + custom_field_filters: dict[str, str] | None = None, +) -> Select: # type: ignore[type-arg] + """Apply standard filters, sorting, tag slug filtering, and custom field filtering.""" + if filters: + stmt = apply_filters(stmt, model, filters) + if tag_slugs: + tag_uuids = [UUID(s) if len(s) == 36 else s for s in tag_slugs] + for tag_val in tag_uuids: + stmt = stmt.where(model.tags.contains([str(tag_val)])) + if custom_field_filters: + for field_name, value in custom_field_filters.items(): + stmt = filter_by_custom_field(stmt, model.custom_fields, field_name, value) + if sort_params: + stmt = apply_sorting(stmt, model, sort_params) + return stmt + + +def _find_all_common( + stmt: Select, # type: ignore[type-arg] + model: Any, + *, + offset: int, + limit: int, + filters: list[FilterParam] | None, + sort_params: list[SortParam] | None, + tag_slugs: list[str] | None, + custom_field_filters: dict[str, str] | None, + default_order: Any, +) -> Select: # type: ignore[type-arg] + """Build a paginated, filtered, sorted query.""" + stmt = _apply_advanced_filters( + stmt, + model, + filters=filters, + sort_params=sort_params, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + ) + if not sort_params: + stmt = stmt.order_by(default_order) + return stmt.offset(offset).limit(limit) + + +# --------------------------------------------------------------------------- +# Prefix +# --------------------------------------------------------------------------- + + +class PostgresPrefixReadModelRepository(PrefixReadModelRepository): + def __init__(self, session: AsyncSession) -> None: + self._session = session + + async def upsert_from_aggregate(self, aggregate: Any) -> None: + model = PrefixReadModel( + id=aggregate.id, + network=str(aggregate.network.network) if aggregate.network else "", + vrf_id=aggregate.vrf_id, + vlan_id=aggregate.vlan_id, + status=aggregate.status.value, + 
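+ # Enum-valued fields (status above) are flattened to plain strings for the read model.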
role=aggregate.role, + tenant_id=aggregate.tenant_id, + description=aggregate.description, + custom_fields=aggregate.custom_fields, + tags=[str(t) for t in aggregate.tags], + is_deleted=aggregate._deleted, + ) + await self._session.merge(model) + await self._session.flush() + + async def find_by_id(self, entity_id: UUID) -> dict | None: + model = await self._session.get(PrefixReadModel, entity_id) + if model is None or model.is_deleted: + return None + return self._to_dict(model) + + async def find_all( + self, + *, + offset: int = 0, + limit: int = 50, + filters: list[FilterParam] | None = None, + sort_params: list[SortParam] | None = None, + tag_slugs: list[str] | None = None, + custom_field_filters: dict[str, str] | None = None, + ) -> tuple[list[dict], int]: + base = select(PrefixReadModel).where(PrefixReadModel.is_deleted == sa.false()) + filtered = _apply_advanced_filters( + base, + PrefixReadModel, + filters=filters, + sort_params=sort_params, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + ) + count_stmt = select(func.count()).select_from(filtered.subquery()) + total = (await self._session.execute(count_stmt)).scalar_one() + stmt = filtered.offset(offset).limit(limit) + if not sort_params: + stmt = stmt.order_by(PrefixReadModel.created_at.desc()) + result = await self._session.execute(stmt) + return [self._to_dict(r) for r in result.scalars().all()], total + + async def mark_deleted(self, entity_id: UUID) -> None: + model = await self._session.get(PrefixReadModel, entity_id) + if model: + model.is_deleted = True + await self._session.flush() + + async def find_children(self, parent_network: str, vrf_id: UUID | None) -> list[dict]: + stmt = select(PrefixReadModel).where( + PrefixReadModel.is_deleted == sa.false(), + PrefixReadModel.network != parent_network, + ) + if vrf_id is not None: + stmt = stmt.where(PrefixReadModel.vrf_id == vrf_id) + else: + stmt = stmt.where(PrefixReadModel.vrf_id.is_(None)) + result = await self._session.execute(stmt) + parent_net = ipaddress.ip_network(parent_network, strict=False) + children = [] + for row in result.scalars().all(): + try: + child_net = ipaddress.ip_network(row.network, strict=False) + except ValueError: + continue + if child_net.subnet_of(parent_net): + children.append(self._to_dict(row)) + return children + + async def find_by_vrf(self, vrf_id: UUID, *, offset: int = 0, limit: int = 50) -> tuple[list[dict], int]: + stmt = select(PrefixReadModel).where( + PrefixReadModel.vrf_id == vrf_id, + PrefixReadModel.is_deleted == sa.false(), + ) + count_stmt = select(func.count()).select_from(stmt.subquery()) + total = (await self._session.execute(count_stmt)).scalar_one() + stmt = stmt.offset(offset).limit(limit).order_by(PrefixReadModel.created_at.desc()) + result = await self._session.execute(stmt) + return [self._to_dict(r) for r in result.scalars().all()], total + + @staticmethod + def _to_dict(model: PrefixReadModel) -> dict: + return { + "id": model.id, + "network": model.network, + "vrf_id": model.vrf_id, + "vlan_id": model.vlan_id, + "status": model.status, + "role": model.role, + "tenant_id": model.tenant_id, + "description": model.description, + "custom_fields": model.custom_fields, + "tags": [UUID(t) if isinstance(t, str) else t for t in (model.tags or [])], + "created_at": model.created_at, + "updated_at": model.updated_at, + } + + +# --------------------------------------------------------------------------- +# IP Address +# --------------------------------------------------------------------------- + + +class 
PostgresIPAddressReadModelRepository(IPAddressReadModelRepository): + def __init__(self, session: AsyncSession) -> None: + self._session = session + + async def upsert_from_aggregate(self, aggregate: Any) -> None: + model = IPAddressReadModel( + id=aggregate.id, + address=str(aggregate.address.address) if aggregate.address else "", + vrf_id=aggregate.vrf_id, + status=aggregate.status.value, + dns_name=aggregate.dns_name, + tenant_id=aggregate.tenant_id, + description=aggregate.description, + custom_fields=aggregate.custom_fields, + tags=[str(t) for t in aggregate.tags], + is_deleted=aggregate._deleted, + ) + await self._session.merge(model) + await self._session.flush() + + async def find_by_id(self, entity_id: UUID) -> dict | None: + model = await self._session.get(IPAddressReadModel, entity_id) + if model is None or model.is_deleted: + return None + return self._to_dict(model) + + async def find_all( + self, + *, + offset: int = 0, + limit: int = 50, + filters: list[FilterParam] | None = None, + sort_params: list[SortParam] | None = None, + tag_slugs: list[str] | None = None, + custom_field_filters: dict[str, str] | None = None, + ) -> tuple[list[dict], int]: + base = select(IPAddressReadModel).where(IPAddressReadModel.is_deleted == sa.false()) + filtered = _apply_advanced_filters( + base, + IPAddressReadModel, + filters=filters, + sort_params=sort_params, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + ) + count_stmt = select(func.count()).select_from(filtered.subquery()) + total = (await self._session.execute(count_stmt)).scalar_one() + stmt = filtered.offset(offset).limit(limit) + if not sort_params: + stmt = stmt.order_by(IPAddressReadModel.created_at.desc()) + result = await self._session.execute(stmt) + return [self._to_dict(r) for r in result.scalars().all()], total + + async def mark_deleted(self, entity_id: UUID) -> None: + model = await self._session.get(IPAddressReadModel, entity_id) + if model: + model.is_deleted = True + await self._session.flush() + + async def exists_in_vrf(self, address: str, vrf_id: UUID | None) -> bool: + stmt = select(func.count()).where( + IPAddressReadModel.address == address, + IPAddressReadModel.is_deleted == sa.false(), + ) + if vrf_id is not None: + stmt = stmt.where(IPAddressReadModel.vrf_id == vrf_id) + else: + stmt = stmt.where(IPAddressReadModel.vrf_id.is_(None)) + result = await self._session.execute(stmt) + return result.scalar_one() > 0 + + async def find_by_prefix(self, network: str, vrf_id: UUID | None) -> list[dict]: + stmt = select(IPAddressReadModel).where( + IPAddressReadModel.is_deleted == sa.false(), + ) + if vrf_id is not None: + stmt = stmt.where(IPAddressReadModel.vrf_id == vrf_id) + else: + stmt = stmt.where(IPAddressReadModel.vrf_id.is_(None)) + result = await self._session.execute(stmt) + prefix_net = ipaddress.ip_network(network, strict=False) + matched = [] + for row in result.scalars().all(): + try: + addr_str = row.address.split("/")[0] + addr = ipaddress.ip_address(addr_str) + except ValueError: + continue + if addr in prefix_net: + matched.append(self._to_dict(row)) + return matched + + async def find_ips_in_range(self, start_address: str, end_address: str, vrf_id: UUID | None) -> list[dict]: + stmt = select(IPAddressReadModel).where(IPAddressReadModel.is_deleted == sa.false()) + if vrf_id is not None: + stmt = stmt.where(IPAddressReadModel.vrf_id == vrf_id) + else: + stmt = stmt.where(IPAddressReadModel.vrf_id.is_(None)) + result = await self._session.execute(stmt) + start_ip = 
ipaddress.ip_address(start_address) + end_ip = ipaddress.ip_address(end_address) + matched = [] + for row in result.scalars().all(): + try: + addr_str = row.address.split("/")[0] + addr = ipaddress.ip_address(addr_str) + except ValueError: + continue + if start_ip <= addr <= end_ip: + matched.append(self._to_dict(row)) + return matched + + @staticmethod + def _to_dict(model: IPAddressReadModel) -> dict: + return { + "id": model.id, + "address": model.address, + "vrf_id": model.vrf_id, + "status": model.status, + "dns_name": model.dns_name, + "tenant_id": model.tenant_id, + "description": model.description, + "custom_fields": model.custom_fields, + "tags": [UUID(t) if isinstance(t, str) else t for t in (model.tags or [])], + "created_at": model.created_at, + "updated_at": model.updated_at, + } + + +# --------------------------------------------------------------------------- +# VRF +# --------------------------------------------------------------------------- + + +class PostgresVRFReadModelRepository(VRFReadModelRepository): + def __init__(self, session: AsyncSession) -> None: + self._session = session + + async def upsert_from_aggregate(self, aggregate: Any) -> None: + model = VRFReadModel( + id=aggregate.id, + name=aggregate.name, + rd=aggregate.rd.rd if aggregate.rd else None, + import_targets=[str(t) for t in aggregate.import_targets], + export_targets=[str(t) for t in aggregate.export_targets], + tenant_id=aggregate.tenant_id, + description=aggregate.description, + custom_fields=aggregate.custom_fields, + tags=[str(t) for t in aggregate.tags], + is_deleted=aggregate._deleted, + ) + await self._session.merge(model) + await self._session.flush() + + async def find_by_id(self, entity_id: UUID) -> dict | None: + model = await self._session.get(VRFReadModel, entity_id) + if model is None or model.is_deleted: + return None + return self._to_dict(model) + + async def find_all( + self, + *, + offset: int = 0, + limit: int = 50, + filters: list[FilterParam] | None = None, + sort_params: list[SortParam] | None = None, + tag_slugs: list[str] | None = None, + custom_field_filters: dict[str, str] | None = None, + ) -> tuple[list[dict], int]: + base = select(VRFReadModel).where(VRFReadModel.is_deleted == sa.false()) + filtered = _apply_advanced_filters( + base, + VRFReadModel, + filters=filters, + sort_params=sort_params, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + ) + count_stmt = select(func.count()).select_from(filtered.subquery()) + total = (await self._session.execute(count_stmt)).scalar_one() + stmt = filtered.offset(offset).limit(limit) + if not sort_params: + stmt = stmt.order_by(VRFReadModel.created_at.desc()) + result = await self._session.execute(stmt) + return [self._to_dict(r) for r in result.scalars().all()], total + + async def mark_deleted(self, entity_id: UUID) -> None: + model = await self._session.get(VRFReadModel, entity_id) + if model: + model.is_deleted = True + await self._session.flush() + + async def find_by_name(self, name: str) -> dict | None: + stmt = select(VRFReadModel).where( + VRFReadModel.name == name, + VRFReadModel.is_deleted == sa.false(), + ) + result = await self._session.execute(stmt) + model = result.scalar_one_or_none() + return self._to_dict(model) if model else None + + @staticmethod + def _to_dict(model: VRFReadModel) -> dict: + return { + "id": model.id, + "name": model.name, + "rd": model.rd, + "import_targets": [UUID(t) if isinstance(t, str) else t for t in (model.import_targets or [])], + "export_targets": [UUID(t) if 
isinstance(t, str) else t for t in (model.export_targets or [])], + "tenant_id": model.tenant_id, + "description": model.description, + "custom_fields": model.custom_fields, + "tags": [UUID(t) if isinstance(t, str) else t for t in (model.tags or [])], + "created_at": model.created_at, + "updated_at": model.updated_at, + } + + +# --------------------------------------------------------------------------- +# VLAN +# --------------------------------------------------------------------------- + + +class PostgresVLANReadModelRepository(VLANReadModelRepository): + def __init__(self, session: AsyncSession) -> None: + self._session = session + + async def upsert_from_aggregate(self, aggregate: Any) -> None: + model = VLANReadModel( + id=aggregate.id, + vid=aggregate.vid.vid if aggregate.vid else 0, + name=aggregate.name, + group_id=aggregate.group_id, + status=aggregate.status.value, + role=aggregate.role, + tenant_id=aggregate.tenant_id, + description=aggregate.description, + custom_fields=aggregate.custom_fields, + tags=[str(t) for t in aggregate.tags], + is_deleted=aggregate._deleted, + ) + await self._session.merge(model) + await self._session.flush() + + async def find_by_id(self, entity_id: UUID) -> dict | None: + model = await self._session.get(VLANReadModel, entity_id) + if model is None or model.is_deleted: + return None + return self._to_dict(model) + + async def find_all( + self, + *, + offset: int = 0, + limit: int = 50, + filters: list[FilterParam] | None = None, + sort_params: list[SortParam] | None = None, + tag_slugs: list[str] | None = None, + custom_field_filters: dict[str, str] | None = None, + ) -> tuple[list[dict], int]: + base = select(VLANReadModel).where(VLANReadModel.is_deleted == sa.false()) + filtered = _apply_advanced_filters( + base, + VLANReadModel, + filters=filters, + sort_params=sort_params, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + ) + count_stmt = select(func.count()).select_from(filtered.subquery()) + total = (await self._session.execute(count_stmt)).scalar_one() + stmt = filtered.offset(offset).limit(limit) + if not sort_params: + stmt = stmt.order_by(VLANReadModel.created_at.desc()) + result = await self._session.execute(stmt) + return [self._to_dict(r) for r in result.scalars().all()], total + + async def mark_deleted(self, entity_id: UUID) -> None: + model = await self._session.get(VLANReadModel, entity_id) + if model: + model.is_deleted = True + await self._session.flush() + + async def find_by_vid(self, vid: int, group_id: UUID | None) -> dict | None: + stmt = select(VLANReadModel).where( + VLANReadModel.vid == vid, + VLANReadModel.is_deleted == sa.false(), + ) + if group_id is not None: + stmt = stmt.where(VLANReadModel.group_id == group_id) + else: + stmt = stmt.where(VLANReadModel.group_id.is_(None)) + result = await self._session.execute(stmt) + model = result.scalar_one_or_none() + return self._to_dict(model) if model else None + + @staticmethod + def _to_dict(model: VLANReadModel) -> dict: + return { + "id": model.id, + "vid": model.vid, + "name": model.name, + "group_id": model.group_id, + "status": model.status, + "role": model.role, + "tenant_id": model.tenant_id, + "description": model.description, + "custom_fields": model.custom_fields, + "tags": [UUID(t) if isinstance(t, str) else t for t in (model.tags or [])], + "created_at": model.created_at, + "updated_at": model.updated_at, + } + + +# --------------------------------------------------------------------------- +# IP Range +# 
--------------------------------------------------------------------------- + + +class PostgresIPRangeReadModelRepository(IPRangeReadModelRepository): + def __init__(self, session: AsyncSession) -> None: + self._session = session + + async def upsert_from_aggregate(self, aggregate: Any) -> None: + model = IPRangeReadModel( + id=aggregate.id, + start_address=aggregate.start_address.address if aggregate.start_address else "", + end_address=aggregate.end_address.address if aggregate.end_address else "", + vrf_id=aggregate.vrf_id, + status=aggregate.status.value, + tenant_id=aggregate.tenant_id, + description=aggregate.description, + custom_fields=aggregate.custom_fields, + tags=[str(t) for t in aggregate.tags], + is_deleted=aggregate._deleted, + ) + await self._session.merge(model) + await self._session.flush() + + async def find_by_id(self, entity_id: UUID) -> dict | None: + model = await self._session.get(IPRangeReadModel, entity_id) + if model is None or model.is_deleted: + return None + return self._to_dict(model) + + async def find_all( + self, + *, + offset: int = 0, + limit: int = 50, + filters: list[FilterParam] | None = None, + sort_params: list[SortParam] | None = None, + tag_slugs: list[str] | None = None, + custom_field_filters: dict[str, str] | None = None, + ) -> tuple[list[dict], int]: + base = select(IPRangeReadModel).where(IPRangeReadModel.is_deleted == sa.false()) + filtered = _apply_advanced_filters( + base, + IPRangeReadModel, + filters=filters, + sort_params=sort_params, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + ) + count_stmt = select(func.count()).select_from(filtered.subquery()) + total = (await self._session.execute(count_stmt)).scalar_one() + stmt = filtered.offset(offset).limit(limit) + if not sort_params: + stmt = stmt.order_by(IPRangeReadModel.created_at.desc()) + result = await self._session.execute(stmt) + return [self._to_dict(r) for r in result.scalars().all()], total + + async def mark_deleted(self, entity_id: UUID) -> None: + model = await self._session.get(IPRangeReadModel, entity_id) + if model: + model.is_deleted = True + await self._session.flush() + + @staticmethod + def _to_dict(model: IPRangeReadModel) -> dict: + return { + "id": model.id, + "start_address": model.start_address, + "end_address": model.end_address, + "vrf_id": model.vrf_id, + "status": model.status, + "tenant_id": model.tenant_id, + "description": model.description, + "custom_fields": model.custom_fields, + "tags": [UUID(t) if isinstance(t, str) else t for t in (model.tags or [])], + "created_at": model.created_at, + "updated_at": model.updated_at, + } + + +# --------------------------------------------------------------------------- +# RIR +# --------------------------------------------------------------------------- + + +class PostgresRIRReadModelRepository(RIRReadModelRepository): + def __init__(self, session: AsyncSession) -> None: + self._session = session + + async def upsert_from_aggregate(self, aggregate: Any) -> None: + model = RIRReadModel( + id=aggregate.id, + name=aggregate.name, + is_private=aggregate.is_private, + description=aggregate.description, + custom_fields=aggregate.custom_fields, + tags=[str(t) for t in aggregate.tags], + is_deleted=aggregate._deleted, + ) + await self._session.merge(model) + await self._session.flush() + + async def find_by_id(self, entity_id: UUID) -> dict | None: + model = await self._session.get(RIRReadModel, entity_id) + if model is None or model.is_deleted: + return None + return self._to_dict(model) + + async 
def find_all( + self, + *, + offset: int = 0, + limit: int = 50, + filters: list[FilterParam] | None = None, + sort_params: list[SortParam] | None = None, + tag_slugs: list[str] | None = None, + custom_field_filters: dict[str, str] | None = None, + ) -> tuple[list[dict], int]: + base = select(RIRReadModel).where(RIRReadModel.is_deleted == sa.false()) + filtered = _apply_advanced_filters( + base, + RIRReadModel, + filters=filters, + sort_params=sort_params, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + ) + count_stmt = select(func.count()).select_from(filtered.subquery()) + total = (await self._session.execute(count_stmt)).scalar_one() + stmt = filtered.offset(offset).limit(limit) + if not sort_params: + stmt = stmt.order_by(RIRReadModel.created_at.desc()) + result = await self._session.execute(stmt) + return [self._to_dict(r) for r in result.scalars().all()], total + + async def mark_deleted(self, entity_id: UUID) -> None: + model = await self._session.get(RIRReadModel, entity_id) + if model: + model.is_deleted = True + await self._session.flush() + + async def find_by_name(self, name: str) -> dict | None: + stmt = select(RIRReadModel).where( + RIRReadModel.name == name, + RIRReadModel.is_deleted == sa.false(), + ) + result = await self._session.execute(stmt) + model = result.scalar_one_or_none() + return self._to_dict(model) if model else None + + @staticmethod + def _to_dict(model: RIRReadModel) -> dict: + return { + "id": model.id, + "name": model.name, + "is_private": model.is_private, + "description": model.description, + "custom_fields": model.custom_fields, + "tags": [UUID(t) if isinstance(t, str) else t for t in (model.tags or [])], + "created_at": model.created_at, + "updated_at": model.updated_at, + } + + +# --------------------------------------------------------------------------- +# ASN +# --------------------------------------------------------------------------- + + +class PostgresASNReadModelRepository(ASNReadModelRepository): + def __init__(self, session: AsyncSession) -> None: + self._session = session + + async def upsert_from_aggregate(self, aggregate: Any) -> None: + model = ASNReadModel( + id=aggregate.id, + asn=aggregate.asn.asn if aggregate.asn else 0, + rir_id=aggregate.rir_id, + tenant_id=aggregate.tenant_id, + description=aggregate.description, + custom_fields=aggregate.custom_fields, + tags=[str(t) for t in aggregate.tags], + is_deleted=aggregate._deleted, + ) + await self._session.merge(model) + await self._session.flush() + + async def find_by_id(self, entity_id: UUID) -> dict | None: + model = await self._session.get(ASNReadModel, entity_id) + if model is None or model.is_deleted: + return None + return self._to_dict(model) + + async def find_all( + self, + *, + offset: int = 0, + limit: int = 50, + filters: list[FilterParam] | None = None, + sort_params: list[SortParam] | None = None, + tag_slugs: list[str] | None = None, + custom_field_filters: dict[str, str] | None = None, + ) -> tuple[list[dict], int]: + base = select(ASNReadModel).where(ASNReadModel.is_deleted == sa.false()) + filtered = _apply_advanced_filters( + base, + ASNReadModel, + filters=filters, + sort_params=sort_params, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + ) + count_stmt = select(func.count()).select_from(filtered.subquery()) + total = (await self._session.execute(count_stmt)).scalar_one() + stmt = filtered.offset(offset).limit(limit) + if not sort_params: + stmt = stmt.order_by(ASNReadModel.created_at.desc()) + result = await 
self._session.execute(stmt) + return [self._to_dict(r) for r in result.scalars().all()], total + + async def mark_deleted(self, entity_id: UUID) -> None: + model = await self._session.get(ASNReadModel, entity_id) + if model: + model.is_deleted = True + await self._session.flush() + + async def find_by_asn(self, asn: int) -> dict | None: + stmt = select(ASNReadModel).where( + ASNReadModel.asn == asn, + ASNReadModel.is_deleted == sa.false(), + ) + result = await self._session.execute(stmt) + model = result.scalar_one_or_none() + return self._to_dict(model) if model else None + + @staticmethod + def _to_dict(model: ASNReadModel) -> dict: + return { + "id": model.id, + "asn": model.asn, + "rir_id": model.rir_id, + "tenant_id": model.tenant_id, + "description": model.description, + "custom_fields": model.custom_fields, + "tags": [UUID(t) if isinstance(t, str) else t for t in (model.tags or [])], + "created_at": model.created_at, + "updated_at": model.updated_at, + } + + +# --------------------------------------------------------------------------- +# FHRP Group +# --------------------------------------------------------------------------- + + +class PostgresFHRPGroupReadModelRepository(FHRPGroupReadModelRepository): + def __init__(self, session: AsyncSession) -> None: + self._session = session + + async def upsert_from_aggregate(self, aggregate: Any) -> None: + model = FHRPGroupReadModel( + id=aggregate.id, + protocol=aggregate.protocol.value if aggregate.protocol else "", + group_id_value=aggregate.group_id_value, + auth_type=aggregate.auth_type.value, + auth_key=aggregate.auth_key, + name=aggregate.name, + description=aggregate.description, + custom_fields=aggregate.custom_fields, + tags=[str(t) for t in aggregate.tags], + is_deleted=aggregate._deleted, + ) + await self._session.merge(model) + await self._session.flush() + + async def find_by_id(self, entity_id: UUID) -> dict | None: + model = await self._session.get(FHRPGroupReadModel, entity_id) + if model is None or model.is_deleted: + return None + return self._to_dict(model) + + async def find_all( + self, + *, + offset: int = 0, + limit: int = 50, + filters: list[FilterParam] | None = None, + sort_params: list[SortParam] | None = None, + tag_slugs: list[str] | None = None, + custom_field_filters: dict[str, str] | None = None, + ) -> tuple[list[dict], int]: + base = select(FHRPGroupReadModel).where(FHRPGroupReadModel.is_deleted == sa.false()) + filtered = _apply_advanced_filters( + base, + FHRPGroupReadModel, + filters=filters, + sort_params=sort_params, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + ) + count_stmt = select(func.count()).select_from(filtered.subquery()) + total = (await self._session.execute(count_stmt)).scalar_one() + stmt = filtered.offset(offset).limit(limit) + if not sort_params: + stmt = stmt.order_by(FHRPGroupReadModel.created_at.desc()) + result = await self._session.execute(stmt) + return [self._to_dict(r) for r in result.scalars().all()], total + + async def mark_deleted(self, entity_id: UUID) -> None: + model = await self._session.get(FHRPGroupReadModel, entity_id) + if model: + model.is_deleted = True + await self._session.flush() + + @staticmethod + def _to_dict(model: FHRPGroupReadModel) -> dict: + return { + "id": model.id, + "protocol": model.protocol, + "group_id_value": model.group_id_value, + "auth_type": model.auth_type, + "auth_key": model.auth_key, + "name": model.name, + "description": model.description, + "custom_fields": model.custom_fields, + "tags": [UUID(t) if 
isinstance(t, str) else t for t in (model.tags or [])], + "created_at": model.created_at, + "updated_at": model.updated_at, + } + + +# --------------------------------------------------------------------------- +# RouteTarget +# --------------------------------------------------------------------------- + + +class PostgresRouteTargetReadModelRepository(RouteTargetReadModelRepository): + def __init__(self, session: AsyncSession) -> None: + self._session = session + + async def upsert_from_aggregate(self, aggregate: Any) -> None: + model = RouteTargetReadModel( + id=aggregate.id, + name=aggregate.name.rd if aggregate.name else "", + tenant_id=aggregate.tenant_id, + description=aggregate.description, + custom_fields=aggregate.custom_fields, + tags=[str(t) for t in aggregate.tags], + is_deleted=aggregate._deleted, + ) + await self._session.merge(model) + await self._session.flush() + + async def find_by_id(self, entity_id: UUID) -> dict | None: + model = await self._session.get(RouteTargetReadModel, entity_id) + if model is None or model.is_deleted: + return None + return self._to_dict(model) + + async def find_all( + self, + *, + offset: int = 0, + limit: int = 50, + filters: list[FilterParam] | None = None, + sort_params: list[SortParam] | None = None, + tag_slugs: list[str] | None = None, + custom_field_filters: dict[str, str] | None = None, + ) -> tuple[list[dict], int]: + base = select(RouteTargetReadModel).where(RouteTargetReadModel.is_deleted == sa.false()) + filtered = _apply_advanced_filters( + base, + RouteTargetReadModel, + filters=filters, + sort_params=sort_params, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + ) + count_stmt = select(func.count()).select_from(filtered.subquery()) + total = (await self._session.execute(count_stmt)).scalar_one() + stmt = filtered.offset(offset).limit(limit) + if not sort_params: + stmt = stmt.order_by(RouteTargetReadModel.created_at.desc()) + result = await self._session.execute(stmt) + return [self._to_dict(r) for r in result.scalars().all()], total + + async def mark_deleted(self, entity_id: UUID) -> None: + model = await self._session.get(RouteTargetReadModel, entity_id) + if model: + model.is_deleted = True + await self._session.flush() + + async def find_by_name(self, name: str) -> dict | None: + stmt = select(RouteTargetReadModel).where( + RouteTargetReadModel.name == name, + RouteTargetReadModel.is_deleted == sa.false(), + ) + result = await self._session.execute(stmt) + model = result.scalar_one_or_none() + return self._to_dict(model) if model else None + + @staticmethod + def _to_dict(model: RouteTargetReadModel) -> dict: + return { + "id": model.id, + "name": model.name, + "tenant_id": model.tenant_id, + "description": model.description, + "custom_fields": model.custom_fields, + "tags": [UUID(t) if isinstance(t, str) else t for t in (model.tags or [])], + "created_at": model.created_at, + "updated_at": model.updated_at, + } + + +# --------------------------------------------------------------------------- +# VLANGroup +# --------------------------------------------------------------------------- + + +class PostgresVLANGroupReadModelRepository(VLANGroupReadModelRepository): + def __init__(self, session: AsyncSession) -> None: + self._session = session + + async def upsert_from_aggregate(self, aggregate: Any) -> None: + model = VLANGroupReadModel( + id=aggregate.id, + name=aggregate.name, + slug=aggregate.slug, + min_vid=aggregate.min_vid, + max_vid=aggregate.max_vid, + tenant_id=aggregate.tenant_id, + 
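+ # is_deleted below mirrors the aggregate's private _deleted flag, matching the other repositories.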
description=aggregate.description, + custom_fields=aggregate.custom_fields, + tags=[str(t) for t in aggregate.tags], + is_deleted=aggregate._deleted, + ) + await self._session.merge(model) + await self._session.flush() + + async def find_by_id(self, entity_id: UUID) -> dict | None: + model = await self._session.get(VLANGroupReadModel, entity_id) + if model is None or model.is_deleted: + return None + return self._to_dict(model) + + async def find_all( + self, + *, + offset: int = 0, + limit: int = 50, + filters: list[FilterParam] | None = None, + sort_params: list[SortParam] | None = None, + tag_slugs: list[str] | None = None, + custom_field_filters: dict[str, str] | None = None, + ) -> tuple[list[dict], int]: + base = select(VLANGroupReadModel).where(VLANGroupReadModel.is_deleted == sa.false()) + filtered = _apply_advanced_filters( + base, + VLANGroupReadModel, + filters=filters, + sort_params=sort_params, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + ) + count_stmt = select(func.count()).select_from(filtered.subquery()) + total = (await self._session.execute(count_stmt)).scalar_one() + stmt = filtered.offset(offset).limit(limit) + if not sort_params: + stmt = stmt.order_by(VLANGroupReadModel.created_at.desc()) + result = await self._session.execute(stmt) + return [self._to_dict(r) for r in result.scalars().all()], total + + async def mark_deleted(self, entity_id: UUID) -> None: + model = await self._session.get(VLANGroupReadModel, entity_id) + if model: + model.is_deleted = True + await self._session.flush() + + async def find_by_slug(self, slug: str) -> dict | None: + stmt = select(VLANGroupReadModel).where( + VLANGroupReadModel.slug == slug, + VLANGroupReadModel.is_deleted == sa.false(), + ) + result = await self._session.execute(stmt) + model = result.scalar_one_or_none() + return self._to_dict(model) if model else None + + @staticmethod + def _to_dict(model: VLANGroupReadModel) -> dict: + return { + "id": model.id, + "name": model.name, + "slug": model.slug, + "min_vid": model.min_vid, + "max_vid": model.max_vid, + "tenant_id": model.tenant_id, + "description": model.description, + "custom_fields": model.custom_fields, + "tags": [UUID(t) if isinstance(t, str) else t for t in (model.tags or [])], + "created_at": model.created_at, + "updated_at": model.updated_at, + } + + +# --------------------------------------------------------------------------- +# Service +# --------------------------------------------------------------------------- + + +class PostgresServiceReadModelRepository(ServiceReadModelRepository): + def __init__(self, session: AsyncSession) -> None: + self._session = session + + async def upsert_from_aggregate(self, aggregate: Any) -> None: + model = ServiceReadModel( + id=aggregate.id, + name=aggregate.name, + protocol=aggregate.protocol.value if aggregate.protocol else "tcp", + ports=aggregate.ports, + ip_addresses=[str(ip) for ip in aggregate.ip_addresses], + description=aggregate.description, + custom_fields=aggregate.custom_fields, + tags=[str(t) for t in aggregate.tags], + is_deleted=aggregate._deleted, + ) + await self._session.merge(model) + await self._session.flush() + + async def find_by_id(self, entity_id: UUID) -> dict | None: + model = await self._session.get(ServiceReadModel, entity_id) + if model is None or model.is_deleted: + return None + return self._to_dict(model) + + async def find_all( + self, + *, + offset: int = 0, + limit: int = 50, + filters: list[FilterParam] | None = None, + sort_params: list[SortParam] | None = None, + 
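+ # Tag and custom-field filters below are applied through the shared _apply_advanced_filters helper.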
tag_slugs: list[str] | None = None, + custom_field_filters: dict[str, str] | None = None, + ) -> tuple[list[dict], int]: + base = select(ServiceReadModel).where(ServiceReadModel.is_deleted == sa.false()) + filtered = _apply_advanced_filters( + base, + ServiceReadModel, + filters=filters, + sort_params=sort_params, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + ) + count_stmt = select(func.count()).select_from(filtered.subquery()) + total = (await self._session.execute(count_stmt)).scalar_one() + stmt = filtered.offset(offset).limit(limit) + if not sort_params: + stmt = stmt.order_by(ServiceReadModel.created_at.desc()) + result = await self._session.execute(stmt) + return [self._to_dict(r) for r in result.scalars().all()], total + + async def mark_deleted(self, entity_id: UUID) -> None: + model = await self._session.get(ServiceReadModel, entity_id) + if model: + model.is_deleted = True + await self._session.flush() + + @staticmethod + def _to_dict(model: ServiceReadModel) -> dict: + return { + "id": model.id, + "name": model.name, + "protocol": model.protocol, + "ports": model.ports or [], + "ip_addresses": [UUID(ip) if isinstance(ip, str) else ip for ip in (model.ip_addresses or [])], + "description": model.description, + "custom_fields": model.custom_fields, + "tags": [UUID(t) if isinstance(t, str) else t for t in (model.tags or [])], + "created_at": model.created_at, + "updated_at": model.updated_at, + } diff --git a/services/ipam/src/ipam/infrastructure/saved_filter_repository.py b/services/ipam/src/ipam/infrastructure/saved_filter_repository.py new file mode 100644 index 0000000..86ff9f2 --- /dev/null +++ b/services/ipam/src/ipam/infrastructure/saved_filter_repository.py @@ -0,0 +1,75 @@ +from __future__ import annotations + +from uuid import UUID + +import sqlalchemy as sa +from sqlalchemy import select, update +from sqlalchemy.ext.asyncio import AsyncSession + +from ipam.application.read_model import SavedFilterRepository +from ipam.infrastructure.models import SavedFilterModel + + +class PostgresSavedFilterRepository(SavedFilterRepository): + def __init__(self, session: AsyncSession) -> None: + self._session = session + + async def find_by_id(self, filter_id: UUID) -> dict | None: + model = await self._session.get(SavedFilterModel, filter_id) + if model is None: + return None + return self._to_dict(model) + + async def find_by_user(self, user_id: UUID, entity_type: str | None = None) -> list[dict]: + stmt = select(SavedFilterModel).where(SavedFilterModel.user_id == user_id) + if entity_type is not None: + stmt = stmt.where(SavedFilterModel.entity_type == entity_type) + stmt = stmt.order_by(SavedFilterModel.created_at.desc()) + result = await self._session.execute(stmt) + return [self._to_dict(r) for r in result.scalars().all()] + + async def create(self, data: dict) -> UUID: + model = SavedFilterModel(**data) + self._session.add(model) + await self._session.flush() + return model.id + + async def update(self, filter_id: UUID, data: dict) -> None: + model = await self._session.get(SavedFilterModel, filter_id) + if model is None: + return + for key, value in data.items(): + setattr(model, key, value) + await self._session.flush() + + async def delete(self, filter_id: UUID) -> None: + model = await self._session.get(SavedFilterModel, filter_id) + if model is not None: + await self._session.delete(model) + await self._session.flush() + + async def clear_default(self, user_id: UUID, entity_type: str) -> None: + stmt = ( + update(SavedFilterModel) + .where( + 
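+ # Only rows currently flagged as default are touched, so at most one saved filter per user/entity pair stays default.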
SavedFilterModel.user_id == user_id, + SavedFilterModel.entity_type == entity_type, + SavedFilterModel.is_default == sa.true(), + ) + .values(is_default=False) + ) + await self._session.execute(stmt) + await self._session.flush() + + @staticmethod + def _to_dict(model: SavedFilterModel) -> dict: + return { + "id": model.id, + "user_id": model.user_id, + "name": model.name, + "entity_type": model.entity_type, + "filter_config": model.filter_config, + "is_default": model.is_default, + "created_at": model.created_at, + "updated_at": model.updated_at, + } diff --git a/services/ipam/src/ipam/infrastructure/search_repository.py b/services/ipam/src/ipam/infrastructure/search_repository.py new file mode 100644 index 0000000..f9ace05 --- /dev/null +++ b/services/ipam/src/ipam/infrastructure/search_repository.py @@ -0,0 +1,89 @@ +from __future__ import annotations + +from typing import Any + +import sqlalchemy as sa +from sqlalchemy import func, literal_column, select, union_all +from sqlalchemy.ext.asyncio import AsyncSession + +from ipam.application.read_model import GlobalSearchRepository +from ipam.infrastructure.models import ( + ASNReadModel, + FHRPGroupReadModel, + IPAddressReadModel, + IPRangeReadModel, + PrefixReadModel, + RIRReadModel, + RouteTargetReadModel, + ServiceReadModel, + VLANGroupReadModel, + VLANReadModel, + VRFReadModel, +) + +SEARCHABLE_MODELS: list[tuple[str, Any, Any]] = [ + ("prefix", PrefixReadModel, PrefixReadModel.network), + ("ip_address", IPAddressReadModel, IPAddressReadModel.address), + ("vrf", VRFReadModel, VRFReadModel.name), + ("vlan", VLANReadModel, VLANReadModel.name), + ("ip_range", IPRangeReadModel, IPRangeReadModel.start_address), + ("rir", RIRReadModel, RIRReadModel.name), + ("asn", ASNReadModel, func.cast(ASNReadModel.asn, sa.Text)), + ("fhrp_group", FHRPGroupReadModel, FHRPGroupReadModel.name), + ("route_target", RouteTargetReadModel, RouteTargetReadModel.name), + ("vlan_group", VLANGroupReadModel, VLANGroupReadModel.name), + ("service", ServiceReadModel, ServiceReadModel.name), +] + + +class PostgresGlobalSearchRepository(GlobalSearchRepository): + def __init__(self, session: AsyncSession) -> None: + self._session = session + + async def search( + self, + query: str, + entity_types: list[str] | None = None, + offset: int = 0, + limit: int = 20, + ) -> tuple[list[dict], int]: + tsquery = func.plainto_tsquery("simple", query) + + subqueries = [] + for entity_type, model_cls, display_col in SEARCHABLE_MODELS: + if entity_types and entity_type not in entity_types: + continue + stmt = select( + literal_column(f"'{entity_type}'").label("entity_type"), + model_cls.id.label("entity_id"), + display_col.label("display_text"), + model_cls.description.label("description"), + func.ts_rank(model_cls.search_vector, tsquery).label("relevance"), + ).where( + model_cls.search_vector.op("@@")(tsquery), + model_cls.is_deleted == sa.false(), + ) + subqueries.append(stmt) + + if not subqueries: + return [], 0 + + union_stmt = union_all(*subqueries) + union_sub = union_stmt.subquery() + + count_stmt = select(func.count()).select_from(union_sub) + total = (await self._session.execute(count_stmt)).scalar_one() + + result_stmt = select(union_sub).order_by(union_sub.c.relevance.desc()).offset(offset).limit(limit) + result = await self._session.execute(result_stmt) + rows = [ + { + "entity_type": row.entity_type, + "entity_id": row.entity_id, + "display_text": str(row.display_text), + "description": row.description or "", + "relevance": float(row.relevance), + } + for row in result 
+ ] + return rows, total diff --git a/services/ipam/src/ipam/infrastructure/template_repository.py b/services/ipam/src/ipam/infrastructure/template_repository.py new file mode 100644 index 0000000..a7f10d3 --- /dev/null +++ b/services/ipam/src/ipam/infrastructure/template_repository.py @@ -0,0 +1,36 @@ +from __future__ import annotations + +from uuid import UUID + +from sqlalchemy import select +from sqlalchemy.ext.asyncio import AsyncSession + +from ipam.infrastructure.models import ExportTemplateModel + + +class TemplateRepository: + def __init__(self, session: AsyncSession) -> None: + self._session = session + + async def create(self, data: dict) -> ExportTemplateModel: + model = ExportTemplateModel(**data) + self._session.add(model) + await self._session.flush() + return model + + async def find_by_id(self, template_id: UUID) -> ExportTemplateModel | None: + return await self._session.get(ExportTemplateModel, template_id) + + async def find_all(self, entity_type: str | None = None) -> list[ExportTemplateModel]: + stmt = select(ExportTemplateModel) + if entity_type is not None: + stmt = stmt.where(ExportTemplateModel.entity_type == entity_type) + stmt = stmt.order_by(ExportTemplateModel.created_at.desc()) + result = await self._session.execute(stmt) + return list(result.scalars().all()) + + async def delete(self, template_id: UUID) -> None: + model = await self._session.get(ExportTemplateModel, template_id) + if model is not None: + await self._session.delete(model) + await self._session.flush() diff --git a/services/ipam/src/ipam/interface/__init__.py b/services/ipam/src/ipam/interface/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/services/ipam/src/ipam/interface/graphql/__init__.py b/services/ipam/src/ipam/interface/graphql/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/services/ipam/src/ipam/interface/graphql/context.py b/services/ipam/src/ipam/interface/graphql/context.py new file mode 100644 index 0000000..f904e0c --- /dev/null +++ b/services/ipam/src/ipam/interface/graphql/context.py @@ -0,0 +1,109 @@ +from shared.cqrs.bus import QueryBus +from starlette.requests import Request + +from ipam.application.queries import ( + GetASNQuery, + GetFHRPGroupQuery, + GetIPAddressQuery, + GetIPRangeQuery, + GetPrefixQuery, + GetRIRQuery, + GetRouteTargetQuery, + GetServiceQuery, + GetVLANGroupQuery, + GetVLANQuery, + GetVRFQuery, + ListASNsQuery, + ListFHRPGroupsQuery, + ListIPAddressesQuery, + ListIPRangesQuery, + ListPrefixesQuery, + ListRIRsQuery, + ListRouteTargetsQuery, + ListServicesQuery, + ListVLANGroupsQuery, + ListVLANsQuery, + ListVRFsQuery, +) +from ipam.application.query_handlers import ( + GetASNHandler, + GetFHRPGroupHandler, + GetIPAddressHandler, + GetIPRangeHandler, + GetPrefixHandler, + GetRIRHandler, + GetRouteTargetHandler, + GetServiceHandler, + GetVLANGroupHandler, + GetVLANHandler, + GetVRFHandler, + ListASNsHandler, + ListFHRPGroupsHandler, + ListIPAddressesHandler, + ListIPRangesHandler, + ListPrefixesHandler, + ListRIRsHandler, + ListRouteTargetsHandler, + ListServicesHandler, + ListVLANGroupsHandler, + ListVLANsHandler, + ListVRFsHandler, +) +from ipam.infrastructure.read_model_repository import ( + PostgresASNReadModelRepository, + PostgresFHRPGroupReadModelRepository, + PostgresIPAddressReadModelRepository, + PostgresIPRangeReadModelRepository, + PostgresPrefixReadModelRepository, + PostgresRIRReadModelRepository, + PostgresRouteTargetReadModelRepository, + PostgresServiceReadModelRepository, + 
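+ # One Postgres-backed read repository import per aggregate type; all are wired into the per-request QueryBus below.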
PostgresVLANGroupReadModelRepository, + PostgresVLANReadModelRepository, + PostgresVRFReadModelRepository, +) + + +async def get_graphql_context(request: Request) -> dict: + database = request.app.state.database + session = database.session() + + # Build query bus with all handlers + query_bus = QueryBus() + + prefix_repo = PostgresPrefixReadModelRepository(session) + ip_repo = PostgresIPAddressReadModelRepository(session) + vrf_repo = PostgresVRFReadModelRepository(session) + vlan_repo = PostgresVLANReadModelRepository(session) + ip_range_repo = PostgresIPRangeReadModelRepository(session) + rir_repo = PostgresRIRReadModelRepository(session) + asn_repo = PostgresASNReadModelRepository(session) + fhrp_group_repo = PostgresFHRPGroupReadModelRepository(session) + route_target_repo = PostgresRouteTargetReadModelRepository(session) + vlan_group_repo = PostgresVLANGroupReadModelRepository(session) + service_repo = PostgresServiceReadModelRepository(session) + + query_bus.register(GetPrefixQuery, GetPrefixHandler(prefix_repo)) + query_bus.register(ListPrefixesQuery, ListPrefixesHandler(prefix_repo)) + query_bus.register(GetIPAddressQuery, GetIPAddressHandler(ip_repo)) + query_bus.register(ListIPAddressesQuery, ListIPAddressesHandler(ip_repo)) + query_bus.register(GetVRFQuery, GetVRFHandler(vrf_repo)) + query_bus.register(ListVRFsQuery, ListVRFsHandler(vrf_repo)) + query_bus.register(GetVLANQuery, GetVLANHandler(vlan_repo)) + query_bus.register(ListVLANsQuery, ListVLANsHandler(vlan_repo)) + query_bus.register(GetIPRangeQuery, GetIPRangeHandler(ip_range_repo)) + query_bus.register(ListIPRangesQuery, ListIPRangesHandler(ip_range_repo)) + query_bus.register(GetRIRQuery, GetRIRHandler(rir_repo)) + query_bus.register(ListRIRsQuery, ListRIRsHandler(rir_repo)) + query_bus.register(GetASNQuery, GetASNHandler(asn_repo)) + query_bus.register(ListASNsQuery, ListASNsHandler(asn_repo)) + query_bus.register(GetFHRPGroupQuery, GetFHRPGroupHandler(fhrp_group_repo)) + query_bus.register(ListFHRPGroupsQuery, ListFHRPGroupsHandler(fhrp_group_repo)) + query_bus.register(GetRouteTargetQuery, GetRouteTargetHandler(route_target_repo)) + query_bus.register(ListRouteTargetsQuery, ListRouteTargetsHandler(route_target_repo)) + query_bus.register(GetVLANGroupQuery, GetVLANGroupHandler(vlan_group_repo)) + query_bus.register(ListVLANGroupsQuery, ListVLANGroupsHandler(vlan_group_repo)) + query_bus.register(GetServiceQuery, GetServiceHandler(service_repo)) + query_bus.register(ListServicesQuery, ListServicesHandler(service_repo)) + + return {"query_bus": query_bus} diff --git a/services/ipam/src/ipam/interface/graphql/gql_types.py b/services/ipam/src/ipam/interface/graphql/gql_types.py new file mode 100644 index 0000000..359fe86 --- /dev/null +++ b/services/ipam/src/ipam/interface/graphql/gql_types.py @@ -0,0 +1,161 @@ +import uuid +from datetime import datetime + +import strawberry + + +@strawberry.type +class PrefixType: + id: uuid.UUID + network: str + vrf_id: uuid.UUID | None + vlan_id: uuid.UUID | None + status: str + role: str | None + tenant_id: uuid.UUID | None + description: str + custom_fields: strawberry.scalars.JSON + tags: list[uuid.UUID] + created_at: datetime + updated_at: datetime + + +@strawberry.type +class IPAddressType: + id: uuid.UUID + address: str + vrf_id: uuid.UUID | None + status: str + dns_name: str + tenant_id: uuid.UUID | None + description: str + custom_fields: strawberry.scalars.JSON + tags: list[uuid.UUID] + created_at: datetime + updated_at: datetime + + +@strawberry.type +class VRFType: + id: 
uuid.UUID + name: str + rd: str | None + import_targets: list[uuid.UUID] + export_targets: list[uuid.UUID] + tenant_id: uuid.UUID | None + description: str + custom_fields: strawberry.scalars.JSON + tags: list[uuid.UUID] + created_at: datetime + updated_at: datetime + + +@strawberry.type +class VLANType: + id: uuid.UUID + vid: int + name: str + group_id: uuid.UUID | None + status: str + role: str | None + tenant_id: uuid.UUID | None + description: str + custom_fields: strawberry.scalars.JSON + tags: list[uuid.UUID] + created_at: datetime + updated_at: datetime + + +@strawberry.type +class IPRangeType: + id: uuid.UUID + start_address: str + end_address: str + vrf_id: uuid.UUID | None + status: str + tenant_id: uuid.UUID | None + description: str + custom_fields: strawberry.scalars.JSON + tags: list[uuid.UUID] + created_at: datetime + updated_at: datetime + + +@strawberry.type +class RIRType: + id: uuid.UUID + name: str + is_private: bool + description: str + custom_fields: strawberry.scalars.JSON + tags: list[uuid.UUID] + created_at: datetime + updated_at: datetime + + +@strawberry.type +class ASNType: + id: uuid.UUID + asn: int + rir_id: uuid.UUID | None + tenant_id: uuid.UUID | None + description: str + custom_fields: strawberry.scalars.JSON + tags: list[uuid.UUID] + created_at: datetime + updated_at: datetime + + +@strawberry.type +class FHRPGroupType: + id: uuid.UUID + protocol: str + group_id_value: int + auth_type: str + name: str + description: str + custom_fields: strawberry.scalars.JSON + tags: list[uuid.UUID] + created_at: datetime + updated_at: datetime + + +@strawberry.type +class RouteTargetType: + id: uuid.UUID + name: str + tenant_id: uuid.UUID | None + description: str + custom_fields: strawberry.scalars.JSON + tags: list[uuid.UUID] + created_at: datetime + updated_at: datetime + + +@strawberry.type +class VLANGroupType: + id: uuid.UUID + name: str + slug: str + min_vid: int + max_vid: int + tenant_id: uuid.UUID | None + description: str + custom_fields: strawberry.scalars.JSON + tags: list[uuid.UUID] + created_at: datetime + updated_at: datetime + + +@strawberry.type +class ServiceType: + id: uuid.UUID + name: str + protocol: str + ports: list[int] + ip_addresses: list[uuid.UUID] + description: str + custom_fields: strawberry.scalars.JSON + tags: list[uuid.UUID] + created_at: datetime + updated_at: datetime diff --git a/services/ipam/src/ipam/interface/graphql/schema.py b/services/ipam/src/ipam/interface/graphql/schema.py new file mode 100644 index 0000000..0c9505f --- /dev/null +++ b/services/ipam/src/ipam/interface/graphql/schema.py @@ -0,0 +1,331 @@ +import uuid + +import strawberry +from strawberry.types import Info + +from ipam.interface.graphql.gql_types import ( + ASNType, + FHRPGroupType, + IPAddressType, + IPRangeType, + PrefixType, + RIRType, + RouteTargetType, + ServiceType, + VLANGroupType, + VLANType, + VRFType, +) + + +def _dto_to_type(dto, type_cls): + """Convert a Pydantic DTO to a Strawberry type.""" + return type_cls(**dto.model_dump()) + + +@strawberry.type +class Query: + # --------------------------------------------------------------------------- + # Prefix + # --------------------------------------------------------------------------- + + @strawberry.field + async def prefix(self, info: Info, id: uuid.UUID) -> PrefixType: # noqa: A002 + query_bus = info.context["query_bus"] + from ipam.application.queries import GetPrefixQuery + + dto = await query_bus.dispatch(GetPrefixQuery(prefix_id=id)) + return _dto_to_type(dto, PrefixType) + + @strawberry.field 
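+ # List resolvers unpack (items, total) from the query bus and expose only the page items; the total count is discarded.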
+ async def prefixes( + self, + info: Info, + offset: int = 0, + limit: int = 50, + vrf_id: uuid.UUID | None = None, + status: str | None = None, + tenant_id: uuid.UUID | None = None, + ) -> list[PrefixType]: + query_bus = info.context["query_bus"] + from ipam.application.queries import ListPrefixesQuery + + items, _ = await query_bus.dispatch( + ListPrefixesQuery(offset=offset, limit=limit, vrf_id=vrf_id, status=status, tenant_id=tenant_id) + ) + return [_dto_to_type(dto, PrefixType) for dto in items] + + # --------------------------------------------------------------------------- + # IPAddress + # --------------------------------------------------------------------------- + + @strawberry.field + async def ip_address(self, info: Info, id: uuid.UUID) -> IPAddressType: # noqa: A002 + query_bus = info.context["query_bus"] + from ipam.application.queries import GetIPAddressQuery + + dto = await query_bus.dispatch(GetIPAddressQuery(ip_id=id)) + return _dto_to_type(dto, IPAddressType) + + @strawberry.field + async def ip_addresses( + self, + info: Info, + offset: int = 0, + limit: int = 50, + vrf_id: uuid.UUID | None = None, + status: str | None = None, + tenant_id: uuid.UUID | None = None, + ) -> list[IPAddressType]: + query_bus = info.context["query_bus"] + from ipam.application.queries import ListIPAddressesQuery + + items, _ = await query_bus.dispatch( + ListIPAddressesQuery(offset=offset, limit=limit, vrf_id=vrf_id, status=status, tenant_id=tenant_id) + ) + return [_dto_to_type(dto, IPAddressType) for dto in items] + + # --------------------------------------------------------------------------- + # VRF + # --------------------------------------------------------------------------- + + @strawberry.field + async def vrf(self, info: Info, id: uuid.UUID) -> VRFType: # noqa: A002 + query_bus = info.context["query_bus"] + from ipam.application.queries import GetVRFQuery + + dto = await query_bus.dispatch(GetVRFQuery(vrf_id=id)) + return _dto_to_type(dto, VRFType) + + @strawberry.field + async def vrfs( + self, + info: Info, + offset: int = 0, + limit: int = 50, + tenant_id: uuid.UUID | None = None, + ) -> list[VRFType]: + query_bus = info.context["query_bus"] + from ipam.application.queries import ListVRFsQuery + + items, _ = await query_bus.dispatch(ListVRFsQuery(offset=offset, limit=limit, tenant_id=tenant_id)) + return [_dto_to_type(dto, VRFType) for dto in items] + + # --------------------------------------------------------------------------- + # VLAN + # --------------------------------------------------------------------------- + + @strawberry.field + async def vlan(self, info: Info, id: uuid.UUID) -> VLANType: # noqa: A002 + query_bus = info.context["query_bus"] + from ipam.application.queries import GetVLANQuery + + dto = await query_bus.dispatch(GetVLANQuery(vlan_id=id)) + return _dto_to_type(dto, VLANType) + + @strawberry.field + async def vlans( + self, + info: Info, + offset: int = 0, + limit: int = 50, + group_id: uuid.UUID | None = None, + status: str | None = None, + tenant_id: uuid.UUID | None = None, + ) -> list[VLANType]: + query_bus = info.context["query_bus"] + from ipam.application.queries import ListVLANsQuery + + items, _ = await query_bus.dispatch( + ListVLANsQuery(offset=offset, limit=limit, group_id=group_id, status=status, tenant_id=tenant_id) + ) + return [_dto_to_type(dto, VLANType) for dto in items] + + # --------------------------------------------------------------------------- + # IPRange + # 
--------------------------------------------------------------------------- + + @strawberry.field + async def ip_range(self, info: Info, id: uuid.UUID) -> IPRangeType: # noqa: A002 + query_bus = info.context["query_bus"] + from ipam.application.queries import GetIPRangeQuery + + dto = await query_bus.dispatch(GetIPRangeQuery(range_id=id)) + return _dto_to_type(dto, IPRangeType) + + @strawberry.field + async def ip_ranges( + self, + info: Info, + offset: int = 0, + limit: int = 50, + vrf_id: uuid.UUID | None = None, + status: str | None = None, + tenant_id: uuid.UUID | None = None, + ) -> list[IPRangeType]: + query_bus = info.context["query_bus"] + from ipam.application.queries import ListIPRangesQuery + + items, _ = await query_bus.dispatch( + ListIPRangesQuery(offset=offset, limit=limit, vrf_id=vrf_id, status=status, tenant_id=tenant_id) + ) + return [_dto_to_type(dto, IPRangeType) for dto in items] + + # --------------------------------------------------------------------------- + # RIR + # --------------------------------------------------------------------------- + + @strawberry.field + async def rir(self, info: Info, id: uuid.UUID) -> RIRType: # noqa: A002 + query_bus = info.context["query_bus"] + from ipam.application.queries import GetRIRQuery + + dto = await query_bus.dispatch(GetRIRQuery(rir_id=id)) + return _dto_to_type(dto, RIRType) + + @strawberry.field + async def rirs( + self, + info: Info, + offset: int = 0, + limit: int = 50, + ) -> list[RIRType]: + query_bus = info.context["query_bus"] + from ipam.application.queries import ListRIRsQuery + + items, _ = await query_bus.dispatch(ListRIRsQuery(offset=offset, limit=limit)) + return [_dto_to_type(dto, RIRType) for dto in items] + + # --------------------------------------------------------------------------- + # ASN + # --------------------------------------------------------------------------- + + @strawberry.field + async def asn(self, info: Info, id: uuid.UUID) -> ASNType: # noqa: A002 + query_bus = info.context["query_bus"] + from ipam.application.queries import GetASNQuery + + dto = await query_bus.dispatch(GetASNQuery(asn_id=id)) + return _dto_to_type(dto, ASNType) + + @strawberry.field + async def asns( + self, + info: Info, + offset: int = 0, + limit: int = 50, + rir_id: uuid.UUID | None = None, + tenant_id: uuid.UUID | None = None, + ) -> list[ASNType]: + query_bus = info.context["query_bus"] + from ipam.application.queries import ListASNsQuery + + items, _ = await query_bus.dispatch( + ListASNsQuery(offset=offset, limit=limit, rir_id=rir_id, tenant_id=tenant_id) + ) + return [_dto_to_type(dto, ASNType) for dto in items] + + # --------------------------------------------------------------------------- + # FHRPGroup + # --------------------------------------------------------------------------- + + @strawberry.field + async def fhrp_group(self, info: Info, id: uuid.UUID) -> FHRPGroupType: # noqa: A002 + query_bus = info.context["query_bus"] + from ipam.application.queries import GetFHRPGroupQuery + + dto = await query_bus.dispatch(GetFHRPGroupQuery(fhrp_group_id=id)) + return _dto_to_type(dto, FHRPGroupType) + + @strawberry.field + async def fhrp_groups( + self, + info: Info, + offset: int = 0, + limit: int = 50, + ) -> list[FHRPGroupType]: + query_bus = info.context["query_bus"] + from ipam.application.queries import ListFHRPGroupsQuery + + items, _ = await query_bus.dispatch(ListFHRPGroupsQuery(offset=offset, limit=limit)) + return [_dto_to_type(dto, FHRPGroupType) for dto in items] + + # 
--------------------------------------------------------------------------- + # RouteTarget + # --------------------------------------------------------------------------- + + @strawberry.field + async def route_target(self, info: Info, id: uuid.UUID) -> RouteTargetType: # noqa: A002 + query_bus = info.context["query_bus"] + from ipam.application.queries import GetRouteTargetQuery + + dto = await query_bus.dispatch(GetRouteTargetQuery(route_target_id=id)) + return _dto_to_type(dto, RouteTargetType) + + @strawberry.field + async def route_targets( + self, + info: Info, + offset: int = 0, + limit: int = 50, + tenant_id: uuid.UUID | None = None, + ) -> list[RouteTargetType]: + query_bus = info.context["query_bus"] + from ipam.application.queries import ListRouteTargetsQuery + + items, _ = await query_bus.dispatch(ListRouteTargetsQuery(offset=offset, limit=limit, tenant_id=tenant_id)) + return [_dto_to_type(dto, RouteTargetType) for dto in items] + + # --------------------------------------------------------------------------- + # VLANGroup + # --------------------------------------------------------------------------- + + @strawberry.field + async def vlan_group(self, info: Info, id: uuid.UUID) -> VLANGroupType: # noqa: A002 + query_bus = info.context["query_bus"] + from ipam.application.queries import GetVLANGroupQuery + + dto = await query_bus.dispatch(GetVLANGroupQuery(vlan_group_id=id)) + return _dto_to_type(dto, VLANGroupType) + + @strawberry.field + async def vlan_groups( + self, + info: Info, + offset: int = 0, + limit: int = 50, + tenant_id: uuid.UUID | None = None, + ) -> list[VLANGroupType]: + query_bus = info.context["query_bus"] + from ipam.application.queries import ListVLANGroupsQuery + + items, _ = await query_bus.dispatch(ListVLANGroupsQuery(offset=offset, limit=limit, tenant_id=tenant_id)) + return [_dto_to_type(dto, VLANGroupType) for dto in items] + + # --------------------------------------------------------------------------- + # Service + # --------------------------------------------------------------------------- + + @strawberry.field + async def service(self, info: Info, id: uuid.UUID) -> ServiceType: # noqa: A002 + query_bus = info.context["query_bus"] + from ipam.application.queries import GetServiceQuery + + dto = await query_bus.dispatch(GetServiceQuery(service_id=id)) + return _dto_to_type(dto, ServiceType) + + @strawberry.field + async def services( + self, + info: Info, + offset: int = 0, + limit: int = 50, + ) -> list[ServiceType]: + query_bus = info.context["query_bus"] + from ipam.application.queries import ListServicesQuery + + items, _ = await query_bus.dispatch(ListServicesQuery(offset=offset, limit=limit)) + return [_dto_to_type(dto, ServiceType) for dto in items] + + +schema = strawberry.Schema(query=Query) diff --git a/services/ipam/src/ipam/interface/main.py b/services/ipam/src/ipam/interface/main.py new file mode 100644 index 0000000..985ac9e --- /dev/null +++ b/services/ipam/src/ipam/interface/main.py @@ -0,0 +1,232 @@ +import asyncio +import contextlib +import logging +from collections.abc import AsyncGenerator +from contextlib import asynccontextmanager + +from fastapi import FastAPI +from fastapi.middleware.cors import CORSMiddleware +from shared.api.errors import domain_exception_handler +from shared.api.middleware import CorrelationIdMiddleware, UserMiddleware +from shared.domain.exceptions import DomainError +from shared.event.pg_store import PostgresEventStore +from shared.messaging.consumer import KafkaEventConsumer +from 
shared.messaging.producer import KafkaEventProducer +from shared.messaging.serialization import EventSerializer +from strawberry.fastapi import GraphQLRouter + +from ipam.domain.events import ( + ASNCreated, + ASNDeleted, + ASNUpdated, + FHRPGroupCreated, + FHRPGroupDeleted, + FHRPGroupUpdated, + IPAddressCreated, + IPAddressDeleted, + IPAddressStatusChanged, + IPAddressUpdated, + IPRangeCreated, + IPRangeDeleted, + IPRangeStatusChanged, + IPRangeUpdated, + PrefixCreated, + PrefixDeleted, + PrefixStatusChanged, + PrefixUpdated, + RIRCreated, + RIRDeleted, + RIRUpdated, + RouteTargetCreated, + RouteTargetDeleted, + RouteTargetUpdated, + ServiceCreated, + ServiceDeleted, + ServiceUpdated, + VLANCreated, + VLANDeleted, + VLANGroupCreated, + VLANGroupDeleted, + VLANGroupUpdated, + VLANStatusChanged, + VLANUpdated, + VRFCreated, + VRFDeleted, + VRFUpdated, +) +from ipam.infrastructure.cache import RedisCache +from ipam.infrastructure.config import Settings +from ipam.infrastructure.database import Database +from ipam.infrastructure.event_projector import IPAMEventProjector +from ipam.interface.graphql.context import get_graphql_context +from ipam.interface.graphql.schema import schema +from ipam.interface.routers.asn_router import router as asn_router +from ipam.interface.routers.fhrp_group_router import router as fhrp_group_router +from ipam.interface.routers.import_export_router import router as import_export_router +from ipam.interface.routers.ip_address_router import router as ip_address_router +from ipam.interface.routers.ip_range_router import router as ip_range_router +from ipam.interface.routers.prefix_router import router as prefix_router +from ipam.interface.routers.rir_router import router as rir_router +from ipam.interface.routers.route_target_router import router as route_target_router +from ipam.interface.routers.saved_filter_router import router as saved_filter_router +from ipam.interface.routers.search_router import router as search_router +from ipam.interface.routers.service_router import router as service_router +from ipam.interface.routers.vlan_group_router import router as vlan_group_router +from ipam.interface.routers.vlan_router import router as vlan_router +from ipam.interface.routers.vrf_router import router as vrf_router + +logger = logging.getLogger(__name__) + +ALL_EVENTS = [ + # Prefix + PrefixCreated, + PrefixUpdated, + PrefixStatusChanged, + PrefixDeleted, + # IPAddress + IPAddressCreated, + IPAddressUpdated, + IPAddressStatusChanged, + IPAddressDeleted, + # VRF + VRFCreated, + VRFUpdated, + VRFDeleted, + # VLAN + VLANCreated, + VLANUpdated, + VLANStatusChanged, + VLANDeleted, + # IPRange + IPRangeCreated, + IPRangeUpdated, + IPRangeStatusChanged, + IPRangeDeleted, + # RIR + RIRCreated, + RIRUpdated, + RIRDeleted, + # ASN + ASNCreated, + ASNUpdated, + ASNDeleted, + # FHRPGroup + FHRPGroupCreated, + FHRPGroupUpdated, + FHRPGroupDeleted, + # RouteTarget + RouteTargetCreated, + RouteTargetUpdated, + RouteTargetDeleted, + # VLANGroup + VLANGroupCreated, + VLANGroupUpdated, + VLANGroupDeleted, + # Service + ServiceCreated, + ServiceUpdated, + ServiceDeleted, +] + + +@asynccontextmanager +async def lifespan(app: FastAPI) -> AsyncGenerator[None]: + settings = Settings() + database = Database(settings.database_url) + + event_store = PostgresEventStore(database.session) + for event_cls in ALL_EVENTS: + event_store.register_event_type(event_cls) + + serializer = EventSerializer() + for event_cls in ALL_EVENTS: + serializer.register(event_cls) + event_producer = 
KafkaEventProducer(settings.kafka_bootstrap_servers, serializer) + await event_producer.start() + + projector_consumer = KafkaEventConsumer( + bootstrap_servers=settings.kafka_bootstrap_servers, + group_id="ipam-projector", + topics=["ipam.events"], + serializer=serializer, + ) + cache = RedisCache(settings.redis_url) + await cache.connect() + + projector = IPAMEventProjector(database.session, cache=cache) + projector.register_all(projector_consumer) + await projector_consumer.start() + consumer_task = asyncio.create_task(projector_consumer.consume()) + + app.state.settings = settings + app.state.database = database + app.state.event_store = event_store + app.state.event_producer = event_producer + app.state.cache = cache + + yield + + consumer_task.cancel() + with contextlib.suppress(asyncio.CancelledError): + await consumer_task + await projector_consumer.stop() + await event_producer.stop() + await cache.close() + await database.close() + + +OPENAPI_TAGS = [ + {"name": "prefixes", "description": "IP prefix (subnet) management"}, + {"name": "ip-addresses", "description": "IP address management"}, + {"name": "vrfs", "description": "Virtual Routing and Forwarding instances"}, + {"name": "vlans", "description": "VLAN management"}, + {"name": "ip-ranges", "description": "IP address range management"}, + {"name": "rirs", "description": "Regional Internet Registries"}, + {"name": "asns", "description": "Autonomous System Numbers"}, + {"name": "fhrp-groups", "description": "First Hop Redundancy Protocol groups"}, + {"name": "route-targets", "description": "BGP route targets for VRF import/export"}, + {"name": "vlan-groups", "description": "VLAN group management"}, + {"name": "services", "description": "Network service (TCP/UDP/SCTP) management"}, + {"name": "saved-filters", "description": "User-specific saved filter presets"}, + {"name": "search", "description": "Global full-text search across IPAM entities"}, + {"name": "import-export", "description": "CSV import, CSV/JSON/YAML export, Jinja2 templates"}, +] + + +def create_app() -> FastAPI: + app = FastAPI( + title="CMDB IPAM Service", + version="1.0.0", + description="IP Address Management service — prefixes, addresses, VRFs, VLANs, and more.", + openapi_tags=OPENAPI_TAGS, + lifespan=lifespan, + ) + app.add_middleware( + CORSMiddleware, + allow_origins=["http://localhost:3000"], + allow_methods=["*"], + allow_headers=["*"], + ) + app.add_middleware(UserMiddleware) + app.add_middleware(CorrelationIdMiddleware) + app.add_exception_handler(DomainError, domain_exception_handler) + app.include_router(prefix_router, prefix="/api/v1") + app.include_router(ip_address_router, prefix="/api/v1") + app.include_router(vrf_router, prefix="/api/v1") + app.include_router(vlan_router, prefix="/api/v1") + app.include_router(ip_range_router, prefix="/api/v1") + app.include_router(rir_router, prefix="/api/v1") + app.include_router(asn_router, prefix="/api/v1") + app.include_router(fhrp_group_router, prefix="/api/v1") + app.include_router(route_target_router, prefix="/api/v1") + app.include_router(vlan_group_router, prefix="/api/v1") + app.include_router(service_router, prefix="/api/v1") + app.include_router(saved_filter_router, prefix="/api/v1") + app.include_router(search_router, prefix="/api/v1") + app.include_router(import_export_router, prefix="/api/v1") + graphql_app = GraphQLRouter(schema, context_getter=get_graphql_context) + app.include_router(graphql_app, prefix="/graphql") + return app + + +app = create_app() diff --git 
a/services/ipam/src/ipam/interface/routers/__init__.py b/services/ipam/src/ipam/interface/routers/__init__.py new file mode 100644 index 0000000..cf87da8 --- /dev/null +++ b/services/ipam/src/ipam/interface/routers/__init__.py @@ -0,0 +1,19 @@ +from ipam.interface.routers.asn_router import router as asn_router +from ipam.interface.routers.fhrp_group_router import router as fhrp_group_router +from ipam.interface.routers.ip_address_router import router as ip_address_router +from ipam.interface.routers.ip_range_router import router as ip_range_router +from ipam.interface.routers.prefix_router import router as prefix_router +from ipam.interface.routers.rir_router import router as rir_router +from ipam.interface.routers.vlan_router import router as vlan_router +from ipam.interface.routers.vrf_router import router as vrf_router + +__all__ = [ + "asn_router", + "fhrp_group_router", + "ip_address_router", + "ip_range_router", + "prefix_router", + "rir_router", + "vlan_router", + "vrf_router", +] diff --git a/services/ipam/src/ipam/interface/routers/asn_router.py b/services/ipam/src/ipam/interface/routers/asn_router.py new file mode 100644 index 0000000..fb8691f --- /dev/null +++ b/services/ipam/src/ipam/interface/routers/asn_router.py @@ -0,0 +1,224 @@ +import json +from datetime import datetime +from uuid import UUID + +from fastapi import APIRouter, Depends, Request, status +from fastapi import Query as QueryParam +from shared.api.pagination import OffsetParams +from shared.cqrs.bus import CommandBus, QueryBus + +from ipam.application.command_handlers import ( + BulkCreateASNsHandler, + BulkDeleteASNsHandler, + BulkUpdateASNsHandler, + CreateASNHandler, + DeleteASNHandler, + UpdateASNHandler, +) +from ipam.application.commands import ( + BulkCreateASNsCommand, + BulkDeleteASNsCommand, + BulkUpdateASNItem, + BulkUpdateASNsCommand, + CreateASNCommand, + DeleteASNCommand, + UpdateASNCommand, +) +from ipam.application.queries import GetASNQuery, ListASNsQuery +from ipam.application.query_handlers import GetASNHandler, ListASNsHandler +from ipam.infrastructure.read_model_repository import PostgresASNReadModelRepository +from ipam.interface.schemas import ( + ASNListResponse, + ASNResponse, + BulkCreateResponse, + BulkDeleteRequest, + BulkDeleteResponse, + BulkUpdateResponse, + CreateASNRequest, + UpdateASNRequest, +) +from ipam.interface.schemas import ( + BulkUpdateASNItem as BulkUpdateASNItemSchema, +) + +router = APIRouter(prefix="/asns", tags=["asns"]) + + +def _get_session(request: Request): + return request.app.state.database.session() + + +def _get_command_bus(request: Request, session=None) -> CommandBus: + if session is None: + session = _get_session(request) + read_model_repo = PostgresASNReadModelRepository(session) + event_store = request.app.state.event_store + event_producer = request.app.state.event_producer + + bus = CommandBus() + bus.register(CreateASNCommand, CreateASNHandler(event_store, read_model_repo, event_producer)) + bus.register(UpdateASNCommand, UpdateASNHandler(event_store, read_model_repo, event_producer)) + bus.register(DeleteASNCommand, DeleteASNHandler(event_store, read_model_repo, event_producer)) + bus.register( + BulkCreateASNsCommand, + BulkCreateASNsHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + BulkUpdateASNsCommand, + BulkUpdateASNsHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + BulkDeleteASNsCommand, + BulkDeleteASNsHandler(event_store, read_model_repo, event_producer), + ) + return bus + + +def 
_get_query_bus(request: Request, session=None) -> QueryBus: + if session is None: + session = _get_session(request) + read_model_repo = PostgresASNReadModelRepository(session) + + bus = QueryBus() + bus.register(GetASNQuery, GetASNHandler(read_model_repo)) + bus.register(ListASNsQuery, ListASNsHandler(read_model_repo)) + return bus + + +@router.post( + "", + status_code=status.HTTP_201_CREATED, + response_model=ASNResponse, +) +async def create_asn( + body: CreateASNRequest, + request: Request, +) -> ASNResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + query_bus = _get_query_bus(request, session) + asn_id = await command_bus.dispatch(CreateASNCommand(**body.model_dump())) + await session.commit() + result = await query_bus.dispatch(GetASNQuery(asn_id=asn_id)) + return ASNResponse(**result.model_dump()) + + +@router.get("", response_model=ASNListResponse) +async def list_asns( + params: OffsetParams = Depends(), # noqa: B008 + rir_id: UUID | None = None, + tenant_id: UUID | None = None, + description_contains: str | None = None, + tag_slugs: list[str] | None = QueryParam(None), # noqa: B008 + custom_fields: str | None = None, + created_after: datetime | None = None, + created_before: datetime | None = None, + updated_after: datetime | None = None, + updated_before: datetime | None = None, + sort_by: str | None = None, + sort_dir: str = "asc", + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> ASNListResponse: + custom_field_filters = json.loads(custom_fields) if custom_fields else None + items, total = await query_bus.dispatch( + ListASNsQuery( + offset=params.offset, + limit=params.limit, + rir_id=rir_id, + tenant_id=tenant_id, + description_contains=description_contains, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + created_after=created_after, + created_before=created_before, + updated_after=updated_after, + updated_before=updated_before, + sort_by=sort_by, + sort_dir=sort_dir, + ) + ) + return ASNListResponse( + items=[ASNResponse(**i.model_dump()) for i in items], + total=total, + offset=params.offset, + limit=params.limit, + ) + + +@router.patch("/bulk", response_model=BulkUpdateResponse) +async def bulk_update_asns( + body: list[BulkUpdateASNItemSchema], + request: Request, +) -> BulkUpdateResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + updated = await command_bus.dispatch( + BulkUpdateASNsCommand( + items=[BulkUpdateASNItem(asn_id=i.id, **i.model_dump(exclude={"id"}, exclude_unset=True)) for i in body] + ) + ) + await session.commit() + return BulkUpdateResponse(updated=updated) + + +@router.delete("/bulk", response_model=BulkDeleteResponse) +async def bulk_delete_asns( + body: BulkDeleteRequest, + request: Request, +) -> BulkDeleteResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + deleted = await command_bus.dispatch(BulkDeleteASNsCommand(ids=body.ids)) + await session.commit() + return BulkDeleteResponse(deleted=deleted) + + +@router.get("/{asn_id}", response_model=ASNResponse) +async def get_asn( + asn_id: UUID, + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> ASNResponse: + result = await query_bus.dispatch(GetASNQuery(asn_id=asn_id)) + return ASNResponse(**result.model_dump()) + + +@router.patch("/{asn_id}", response_model=ASNResponse) +async def update_asn( + asn_id: UUID, + body: UpdateASNRequest, + request: Request, +) -> ASNResponse: + session = _get_session(request) + 
command_bus = _get_command_bus(request, session) + query_bus = _get_query_bus(request, session) + await command_bus.dispatch(UpdateASNCommand(asn_id=asn_id, **body.model_dump(exclude_unset=True))) + await session.commit() + result = await query_bus.dispatch(GetASNQuery(asn_id=asn_id)) + return ASNResponse(**result.model_dump()) + + +@router.delete("/{asn_id}", status_code=status.HTTP_204_NO_CONTENT) +async def delete_asn( + asn_id: UUID, + request: Request, +) -> None: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + await command_bus.dispatch(DeleteASNCommand(asn_id=asn_id)) + await session.commit() + + +@router.post( + "/bulk", + status_code=status.HTTP_201_CREATED, + response_model=BulkCreateResponse, +) +async def bulk_create_asns( + body: list[CreateASNRequest], + request: Request, +) -> BulkCreateResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + ids = await command_bus.dispatch(BulkCreateASNsCommand(items=[CreateASNCommand(**i.model_dump()) for i in body])) + await session.commit() + return BulkCreateResponse(ids=ids, count=len(ids)) diff --git a/services/ipam/src/ipam/interface/routers/fhrp_group_router.py b/services/ipam/src/ipam/interface/routers/fhrp_group_router.py new file mode 100644 index 0000000..28cf61f --- /dev/null +++ b/services/ipam/src/ipam/interface/routers/fhrp_group_router.py @@ -0,0 +1,239 @@ +import json +from datetime import datetime +from uuid import UUID + +from fastapi import APIRouter, Depends, Request, status +from fastapi import Query as QueryParam +from shared.api.pagination import OffsetParams +from shared.cqrs.bus import CommandBus, QueryBus + +from ipam.application.command_handlers import ( + BulkCreateFHRPGroupsHandler, + BulkDeleteFHRPGroupsHandler, + BulkUpdateFHRPGroupsHandler, + CreateFHRPGroupHandler, + DeleteFHRPGroupHandler, + UpdateFHRPGroupHandler, +) +from ipam.application.commands import ( + BulkCreateFHRPGroupsCommand, + BulkDeleteFHRPGroupsCommand, + BulkUpdateFHRPGroupItem, + BulkUpdateFHRPGroupsCommand, + CreateFHRPGroupCommand, + DeleteFHRPGroupCommand, + UpdateFHRPGroupCommand, +) +from ipam.application.queries import GetFHRPGroupQuery, ListFHRPGroupsQuery +from ipam.application.query_handlers import GetFHRPGroupHandler, ListFHRPGroupsHandler +from ipam.infrastructure.read_model_repository import PostgresFHRPGroupReadModelRepository +from ipam.interface.schemas import ( + BulkCreateResponse, + BulkDeleteRequest, + BulkDeleteResponse, + BulkUpdateResponse, + CreateFHRPGroupRequest, + FHRPGroupListResponse, + FHRPGroupResponse, + UpdateFHRPGroupRequest, +) +from ipam.interface.schemas import ( + BulkUpdateFHRPGroupItem as BulkUpdateFHRPGroupItemSchema, +) + +router = APIRouter(prefix="/fhrp-groups", tags=["fhrp-groups"]) + + +def _get_session(request: Request): + return request.app.state.database.session() + + +def _get_command_bus(request: Request, session=None) -> CommandBus: + if session is None: + session = _get_session(request) + read_model_repo = PostgresFHRPGroupReadModelRepository(session) + event_store = request.app.state.event_store + event_producer = request.app.state.event_producer + + bus = CommandBus() + bus.register( + CreateFHRPGroupCommand, + CreateFHRPGroupHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + UpdateFHRPGroupCommand, + UpdateFHRPGroupHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + DeleteFHRPGroupCommand, + DeleteFHRPGroupHandler(event_store, read_model_repo, 
event_producer), + ) + bus.register( + BulkCreateFHRPGroupsCommand, + BulkCreateFHRPGroupsHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + BulkUpdateFHRPGroupsCommand, + BulkUpdateFHRPGroupsHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + BulkDeleteFHRPGroupsCommand, + BulkDeleteFHRPGroupsHandler(event_store, read_model_repo, event_producer), + ) + return bus + + +def _get_query_bus(request: Request, session=None) -> QueryBus: + if session is None: + session = _get_session(request) + read_model_repo = PostgresFHRPGroupReadModelRepository(session) + + bus = QueryBus() + bus.register(GetFHRPGroupQuery, GetFHRPGroupHandler(read_model_repo)) + bus.register(ListFHRPGroupsQuery, ListFHRPGroupsHandler(read_model_repo)) + return bus + + +@router.post( + "", + status_code=status.HTTP_201_CREATED, + response_model=FHRPGroupResponse, +) +async def create_fhrp_group( + body: CreateFHRPGroupRequest, + request: Request, +) -> FHRPGroupResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + query_bus = _get_query_bus(request, session) + group_id = await command_bus.dispatch(CreateFHRPGroupCommand(**body.model_dump())) + await session.commit() + result = await query_bus.dispatch(GetFHRPGroupQuery(fhrp_group_id=group_id)) + return FHRPGroupResponse(**result.model_dump()) + + +@router.get("", response_model=FHRPGroupListResponse) +async def list_fhrp_groups( + params: OffsetParams = Depends(), # noqa: B008 + description_contains: str | None = None, + tag_slugs: list[str] | None = QueryParam(None), # noqa: B008 + custom_fields: str | None = None, + created_after: datetime | None = None, + created_before: datetime | None = None, + updated_after: datetime | None = None, + updated_before: datetime | None = None, + sort_by: str | None = None, + sort_dir: str = "asc", + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> FHRPGroupListResponse: + custom_field_filters = json.loads(custom_fields) if custom_fields else None + items, total = await query_bus.dispatch( + ListFHRPGroupsQuery( + offset=params.offset, + limit=params.limit, + description_contains=description_contains, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + created_after=created_after, + created_before=created_before, + updated_after=updated_after, + updated_before=updated_before, + sort_by=sort_by, + sort_dir=sort_dir, + ) + ) + return FHRPGroupListResponse( + items=[FHRPGroupResponse(**i.model_dump()) for i in items], + total=total, + offset=params.offset, + limit=params.limit, + ) + + +@router.patch("/bulk", response_model=BulkUpdateResponse) +async def bulk_update_fhrp_groups( + body: list[BulkUpdateFHRPGroupItemSchema], + request: Request, +) -> BulkUpdateResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + updated = await command_bus.dispatch( + BulkUpdateFHRPGroupsCommand( + items=[ + BulkUpdateFHRPGroupItem(fhrp_group_id=i.id, **i.model_dump(exclude={"id"}, exclude_unset=True)) + for i in body + ] + ) + ) + await session.commit() + return BulkUpdateResponse(updated=updated) + + +@router.delete("/bulk", response_model=BulkDeleteResponse) +async def bulk_delete_fhrp_groups( + body: BulkDeleteRequest, + request: Request, +) -> BulkDeleteResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + deleted = await command_bus.dispatch(BulkDeleteFHRPGroupsCommand(ids=body.ids)) + await session.commit() + return 
BulkDeleteResponse(deleted=deleted) + + +@router.get("/{fhrp_group_id}", response_model=FHRPGroupResponse) +async def get_fhrp_group( + fhrp_group_id: UUID, + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> FHRPGroupResponse: + result = await query_bus.dispatch(GetFHRPGroupQuery(fhrp_group_id=fhrp_group_id)) + return FHRPGroupResponse(**result.model_dump()) + + +@router.patch("/{fhrp_group_id}", response_model=FHRPGroupResponse) +async def update_fhrp_group( + fhrp_group_id: UUID, + body: UpdateFHRPGroupRequest, + request: Request, +) -> FHRPGroupResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + query_bus = _get_query_bus(request, session) + await command_bus.dispatch( + UpdateFHRPGroupCommand( + fhrp_group_id=fhrp_group_id, + **body.model_dump(exclude_unset=True), + ) + ) + await session.commit() + result = await query_bus.dispatch(GetFHRPGroupQuery(fhrp_group_id=fhrp_group_id)) + return FHRPGroupResponse(**result.model_dump()) + + +@router.delete("/{fhrp_group_id}", status_code=status.HTTP_204_NO_CONTENT) +async def delete_fhrp_group( + fhrp_group_id: UUID, + request: Request, +) -> None: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + await command_bus.dispatch(DeleteFHRPGroupCommand(fhrp_group_id=fhrp_group_id)) + await session.commit() + + +@router.post( + "/bulk", + status_code=status.HTTP_201_CREATED, + response_model=BulkCreateResponse, +) +async def bulk_create_fhrp_groups( + body: list[CreateFHRPGroupRequest], + request: Request, +) -> BulkCreateResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + ids = await command_bus.dispatch( + BulkCreateFHRPGroupsCommand(items=[CreateFHRPGroupCommand(**i.model_dump()) for i in body]) + ) + await session.commit() + return BulkCreateResponse(ids=ids, count=len(ids)) diff --git a/services/ipam/src/ipam/interface/routers/import_export_router.py b/services/ipam/src/ipam/interface/routers/import_export_router.py new file mode 100644 index 0000000..4fae69e --- /dev/null +++ b/services/ipam/src/ipam/interface/routers/import_export_router.py @@ -0,0 +1,447 @@ +"""Import/Export router — CSV import, CSV/JSON/YAML export, Jinja2 template rendering.""" + +from __future__ import annotations + +from typing import Literal +from uuid import UUID + +from fastapi import APIRouter, HTTPException, Request, UploadFile, status +from fastapi.responses import StreamingResponse +from shared.cqrs.bus import CommandBus, QueryBus + +from ipam.application.command_handlers import ( + BulkCreateASNsHandler, + BulkCreateFHRPGroupsHandler, + BulkCreateIPAddressesHandler, + BulkCreateIPRangesHandler, + BulkCreatePrefixesHandler, + BulkCreateRIRsHandler, + BulkCreateRouteTargetsHandler, + BulkCreateServicesHandler, + BulkCreateVLANGroupsHandler, + BulkCreateVLANsHandler, + BulkCreateVRFsHandler, +) +from ipam.application.commands import ( + BulkCreateASNsCommand, + BulkCreateFHRPGroupsCommand, + BulkCreateIPAddressesCommand, + BulkCreateIPRangesCommand, + BulkCreatePrefixesCommand, + BulkCreateRIRsCommand, + BulkCreateRouteTargetsCommand, + BulkCreateServicesCommand, + BulkCreateVLANGroupsCommand, + BulkCreateVLANsCommand, + BulkCreateVRFsCommand, + CreateASNCommand, + CreateFHRPGroupCommand, + CreateIPAddressCommand, + CreateIPRangeCommand, + CreatePrefixCommand, + CreateRIRCommand, + CreateRouteTargetCommand, + CreateServiceCommand, + CreateVLANCommand, + CreateVLANGroupCommand, + CreateVRFCommand, +) +from 
ipam.application.export_service import export_csv, export_json, export_yaml +from ipam.application.import_service import VALID_ENTITY_TYPES, parse_csv +from ipam.application.queries import ( + ListASNsQuery, + ListFHRPGroupsQuery, + ListIPAddressesQuery, + ListIPRangesQuery, + ListPrefixesQuery, + ListRIRsQuery, + ListRouteTargetsQuery, + ListServicesQuery, + ListVLANGroupsQuery, + ListVLANsQuery, + ListVRFsQuery, +) +from ipam.application.query_handlers import ( + ListASNsHandler, + ListFHRPGroupsHandler, + ListIPAddressesHandler, + ListIPRangesHandler, + ListPrefixesHandler, + ListRIRsHandler, + ListRouteTargetsHandler, + ListServicesHandler, + ListVLANGroupsHandler, + ListVLANsHandler, + ListVRFsHandler, +) +from ipam.infrastructure.read_model_repository import ( + PostgresASNReadModelRepository, + PostgresFHRPGroupReadModelRepository, + PostgresIPAddressReadModelRepository, + PostgresIPRangeReadModelRepository, + PostgresPrefixReadModelRepository, + PostgresRIRReadModelRepository, + PostgresRouteTargetReadModelRepository, + PostgresServiceReadModelRepository, + PostgresVLANGroupReadModelRepository, + PostgresVLANReadModelRepository, + PostgresVRFReadModelRepository, +) +from ipam.interface.schemas import ImportResponse, ImportRowErrorSchema + +router = APIRouter(tags=["import-export"]) + +# Mapping entity_type -> (BulkCreateCommand, CreateCommand, BulkCreateHandler, ReadModelRepo) +_BULK_CREATE_MAP: dict[str, tuple] = { + "prefix": ( + BulkCreatePrefixesCommand, + CreatePrefixCommand, + BulkCreatePrefixesHandler, + PostgresPrefixReadModelRepository, + ), + "ip_address": ( + BulkCreateIPAddressesCommand, + CreateIPAddressCommand, + BulkCreateIPAddressesHandler, + PostgresIPAddressReadModelRepository, + ), + "vrf": ( + BulkCreateVRFsCommand, + CreateVRFCommand, + BulkCreateVRFsHandler, + PostgresVRFReadModelRepository, + ), + "vlan": ( + BulkCreateVLANsCommand, + CreateVLANCommand, + BulkCreateVLANsHandler, + PostgresVLANReadModelRepository, + ), + "ip_range": ( + BulkCreateIPRangesCommand, + CreateIPRangeCommand, + BulkCreateIPRangesHandler, + PostgresIPRangeReadModelRepository, + ), + "rir": ( + BulkCreateRIRsCommand, + CreateRIRCommand, + BulkCreateRIRsHandler, + PostgresRIRReadModelRepository, + ), + "asn": ( + BulkCreateASNsCommand, + CreateASNCommand, + BulkCreateASNsHandler, + PostgresASNReadModelRepository, + ), + "fhrp_group": ( + BulkCreateFHRPGroupsCommand, + CreateFHRPGroupCommand, + BulkCreateFHRPGroupsHandler, + PostgresFHRPGroupReadModelRepository, + ), + "route_target": ( + BulkCreateRouteTargetsCommand, + CreateRouteTargetCommand, + BulkCreateRouteTargetsHandler, + PostgresRouteTargetReadModelRepository, + ), + "vlan_group": ( + BulkCreateVLANGroupsCommand, + CreateVLANGroupCommand, + BulkCreateVLANGroupsHandler, + PostgresVLANGroupReadModelRepository, + ), + "service": ( + BulkCreateServicesCommand, + CreateServiceCommand, + BulkCreateServicesHandler, + PostgresServiceReadModelRepository, + ), +} + +# Mapping entity_type -> (ListQuery, ListHandler, ReadModelRepo) +_LIST_QUERY_MAP = { + "prefix": (ListPrefixesQuery, ListPrefixesHandler, PostgresPrefixReadModelRepository), + "ip_address": (ListIPAddressesQuery, ListIPAddressesHandler, PostgresIPAddressReadModelRepository), + "vrf": (ListVRFsQuery, ListVRFsHandler, PostgresVRFReadModelRepository), + "vlan": (ListVLANsQuery, ListVLANsHandler, PostgresVLANReadModelRepository), + "ip_range": (ListIPRangesQuery, ListIPRangesHandler, PostgresIPRangeReadModelRepository), + "rir": (ListRIRsQuery, ListRIRsHandler, 
PostgresRIRReadModelRepository), + "asn": (ListASNsQuery, ListASNsHandler, PostgresASNReadModelRepository), + "fhrp_group": (ListFHRPGroupsQuery, ListFHRPGroupsHandler, PostgresFHRPGroupReadModelRepository), + "route_target": (ListRouteTargetsQuery, ListRouteTargetsHandler, PostgresRouteTargetReadModelRepository), + "vlan_group": (ListVLANGroupsQuery, ListVLANGroupsHandler, PostgresVLANGroupReadModelRepository), + "service": (ListServicesQuery, ListServicesHandler, PostgresServiceReadModelRepository), +} + +_FORMAT_CONTENT_TYPES = { + "csv": "text/csv", + "json": "application/json", + "yaml": "application/x-yaml", +} + +_FORMAT_EXPORTERS = { + "csv": export_csv, + "json": export_json, + "yaml": export_yaml, +} + + +def _validate_entity_type(entity_type: str) -> None: + if entity_type not in VALID_ENTITY_TYPES: + raise HTTPException( + status_code=400, + detail=f"Invalid entity_type '{entity_type}'. Valid types: {sorted(VALID_ENTITY_TYPES)}", + ) + + +# ============================================================================= + # CSV Import + # ============================================================================= + + +@router.post("/import/{entity_type}", response_model=ImportResponse, status_code=status.HTTP_200_OK) +async def import_csv( + entity_type: str, + file: UploadFile, + request: Request, +) -> ImportResponse: + _validate_entity_type(entity_type) + + content = (await file.read()).decode("utf-8-sig") + items, parse_errors = parse_csv(content) + schema_errors = [ImportRowErrorSchema(row=e.row, field=e.field, error=e.error) for e in parse_errors] + + if not items: + return ImportResponse(imported=0, failed=len(schema_errors), errors=schema_errors) + + bulk_cmd_cls, create_cmd_cls, handler_cls, repo_cls = _BULK_CREATE_MAP[entity_type] + + session = request.app.state.database.session() + repo = repo_cls(session) + event_store = request.app.state.event_store + event_producer = request.app.state.event_producer + + bus = CommandBus() + bus.register(bulk_cmd_cls, handler_cls(event_store, repo, event_producer)) + + create_commands = [] + import_errors = list(schema_errors) + for i, item in enumerate(items): + try: + cmd = create_cmd_cls(**item) + create_commands.append(cmd) + except Exception as e: + import_errors.append(ImportRowErrorSchema(row=i + 2, field="", error=str(e))) + + if not create_commands: + return ImportResponse(imported=0, failed=len(import_errors), errors=import_errors) + + try: + result_ids = await bus.dispatch(bulk_cmd_cls(items=create_commands)) + # Persist the read-model rows written by the bulk handler; every other + # write path in these routers commits explicitly after dispatch. + await session.commit() + return ImportResponse(imported=len(result_ids), failed=len(import_errors), errors=import_errors) + except Exception as e: + import_errors.append(ImportRowErrorSchema(row=0, field="", error=f"Bulk create failed: {e}")) + return ImportResponse(imported=0, failed=len(import_errors), errors=import_errors) + + +# ============================================================================= +# Export +# ============================================================================= + + +@router.get("/export/{entity_type}") +async def export_data( + entity_type: str, + request: Request, + export_format: Literal["csv", "json", "yaml"] = "csv", + limit: int = 10000, +) -> StreamingResponse: + _validate_entity_type(entity_type) + + if limit > 10000: + limit = 10000 + + query_cls, handler_cls, repo_cls = _LIST_QUERY_MAP[entity_type] + + session = request.app.state.database.session() + repo = repo_cls(session) + + bus = QueryBus() + bus.register(query_cls, handler_cls(repo)) + + query = query_cls(offset=0, limit=limit) + items, 
_total = await bus.dispatch(query) + + item_dicts = [item.model_dump() for item in items] + exporter = _FORMAT_EXPORTERS[export_format] + output = exporter(item_dicts) + + content_type = _FORMAT_CONTENT_TYPES[export_format] + filename = f"{entity_type}_export.{export_format}" + + return StreamingResponse( + iter([output]), + media_type=content_type, + headers={"Content-Disposition": f'attachment; filename="{filename}"'}, + ) + + +# ============================================================================= +# Jinja2 Template Rendering +# ============================================================================= + + +@router.post("/export/{entity_type}/render") +async def render_template( + entity_type: str, + request: Request, + template_id: UUID | None = None, + limit: int = 10000, +) -> StreamingResponse: + _validate_entity_type(entity_type) + + if template_id is None: + raise HTTPException(status_code=400, detail="template_id is required") + + if limit > 10000: + limit = 10000 + + from ipam.infrastructure.template_repository import TemplateRepository + + session = request.app.state.database.session() + template_repo = TemplateRepository(session) + template = await template_repo.find_by_id(template_id) + if template is None: + raise HTTPException(status_code=404, detail=f"ExportTemplate {template_id} not found") + + query_cls, handler_cls, repo_cls = _LIST_QUERY_MAP[entity_type] + repo = repo_cls(session) + bus = QueryBus() + bus.register(query_cls, handler_cls(repo)) + + query = query_cls(offset=0, limit=limit) + items, total = await bus.dispatch(query) + item_dicts = [item.model_dump() for item in items] + + from jinja2.sandbox import SandboxedEnvironment + + env = SandboxedEnvironment() + jinja_template = env.from_string(template.template_content) + rendered = jinja_template.render(items=item_dicts, total=total, entity_type=entity_type) + + content_type_map = { + "csv": "text/csv", + "json": "application/json", + "yaml": "application/x-yaml", + "html": "text/html", + "xml": "application/xml", + "text": "text/plain", + } + content_type = content_type_map.get(template.output_format, "text/plain") + filename = f"{entity_type}_export.{template.output_format}" + + return StreamingResponse( + iter([rendered]), + media_type=content_type, + headers={"Content-Disposition": f'attachment; filename="{filename}"'}, + ) + + +# ============================================================================= +# Export Templates CRUD +# ============================================================================= + + +@router.post("/export-templates", status_code=status.HTTP_201_CREATED) +async def create_export_template( + request: Request, + body: dict, +) -> dict: + from uuid import uuid4 + + from ipam.infrastructure.template_repository import TemplateRepository + + session = request.app.state.database.session() + repo = TemplateRepository(session) + template = await repo.create( + { + "id": uuid4(), + "name": body["name"], + "entity_type": body["entity_type"], + "template_content": body["template_content"], + "output_format": body.get("output_format", "text"), + "description": body.get("description", ""), + } + ) + return { + "id": str(template.id), + "name": template.name, + "entity_type": template.entity_type, + "output_format": template.output_format, + "description": template.description, + "created_at": template.created_at.isoformat() if template.created_at else None, + } + + +@router.get("/export-templates") +async def list_export_templates( + request: Request, + entity_type: str | None = 
None, +) -> list[dict]: + from ipam.infrastructure.template_repository import TemplateRepository + + session = request.app.state.database.session() + repo = TemplateRepository(session) + templates = await repo.find_all(entity_type=entity_type) + return [ + { + "id": str(t.id), + "name": t.name, + "entity_type": t.entity_type, + "output_format": t.output_format, + "description": t.description, + "created_at": t.created_at.isoformat() if t.created_at else None, + } + for t in templates + ] + + +@router.get("/export-templates/{template_id}") +async def get_export_template( + template_id: UUID, + request: Request, +) -> dict: + from ipam.infrastructure.template_repository import TemplateRepository + + session = request.app.state.database.session() + repo = TemplateRepository(session) + template = await repo.find_by_id(template_id) + if template is None: + raise HTTPException(status_code=404, detail=f"ExportTemplate {template_id} not found") + return { + "id": str(template.id), + "name": template.name, + "entity_type": template.entity_type, + "template_content": template.template_content, + "output_format": template.output_format, + "description": template.description, + "created_at": template.created_at.isoformat() if template.created_at else None, + } + + +@router.delete("/export-templates/{template_id}", status_code=status.HTTP_204_NO_CONTENT) +async def delete_export_template( + template_id: UUID, + request: Request, +) -> None: + from ipam.infrastructure.template_repository import TemplateRepository + + session = request.app.state.database.session() + repo = TemplateRepository(session) + template = await repo.find_by_id(template_id) + if template is None: + raise HTTPException(status_code=404, detail=f"ExportTemplate {template_id} not found") + await repo.delete(template_id) diff --git a/services/ipam/src/ipam/interface/routers/ip_address_router.py b/services/ipam/src/ipam/interface/routers/ip_address_router.py new file mode 100644 index 0000000..915429d --- /dev/null +++ b/services/ipam/src/ipam/interface/routers/ip_address_router.py @@ -0,0 +1,261 @@ +import json +from datetime import datetime +from uuid import UUID + +from fastapi import APIRouter, Depends, Request, status +from fastapi import Query as QueryParam +from shared.api.pagination import OffsetParams +from shared.cqrs.bus import CommandBus, QueryBus + +from ipam.application.command_handlers import ( + BulkCreateIPAddressesHandler, + BulkDeleteIPAddressesHandler, + BulkUpdateIPAddressesHandler, + ChangeIPAddressStatusHandler, + CreateIPAddressHandler, + DeleteIPAddressHandler, + UpdateIPAddressHandler, +) +from ipam.application.commands import ( + BulkCreateIPAddressesCommand, + BulkDeleteIPAddressesCommand, + BulkUpdateIPAddressesCommand, + BulkUpdateIPAddressItem, + ChangeIPAddressStatusCommand, + CreateIPAddressCommand, + DeleteIPAddressCommand, + UpdateIPAddressCommand, +) +from ipam.application.queries import GetIPAddressQuery, ListIPAddressesQuery +from ipam.application.query_handlers import GetIPAddressHandler, ListIPAddressesHandler +from ipam.infrastructure.read_model_repository import PostgresIPAddressReadModelRepository +from ipam.interface.schemas import ( + BulkCreateResponse, + BulkDeleteRequest, + BulkDeleteResponse, + BulkUpdateResponse, + ChangeStatusRequest, + CreateIPAddressRequest, + IPAddressListResponse, + IPAddressResponse, + UpdateIPAddressRequest, +) +from ipam.interface.schemas import ( + BulkUpdateIPAddressItem as BulkUpdateIPAddressItemSchema, +) + +router = APIRouter(prefix="/ip-addresses", 
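# "ip-addresses" matches the tag described in OPENAPI_TAGS in main.py. + 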
tags=["ip-addresses"]) + + +def _get_session(request: Request): + return request.app.state.database.session() + + +def _get_command_bus(request: Request, session=None) -> CommandBus: + if session is None: + session = _get_session(request) + read_model_repo = PostgresIPAddressReadModelRepository(session) + event_store = request.app.state.event_store + event_producer = request.app.state.event_producer + + bus = CommandBus() + bus.register( + CreateIPAddressCommand, + CreateIPAddressHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + UpdateIPAddressCommand, + UpdateIPAddressHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + ChangeIPAddressStatusCommand, + ChangeIPAddressStatusHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + DeleteIPAddressCommand, + DeleteIPAddressHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + BulkCreateIPAddressesCommand, + BulkCreateIPAddressesHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + BulkUpdateIPAddressesCommand, + BulkUpdateIPAddressesHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + BulkDeleteIPAddressesCommand, + BulkDeleteIPAddressesHandler(event_store, read_model_repo, event_producer), + ) + return bus + + +def _get_query_bus(request: Request, session=None) -> QueryBus: + if session is None: + session = _get_session(request) + read_model_repo = PostgresIPAddressReadModelRepository(session) + + bus = QueryBus() + bus.register(GetIPAddressQuery, GetIPAddressHandler(read_model_repo)) + bus.register(ListIPAddressesQuery, ListIPAddressesHandler(read_model_repo)) + return bus + + +@router.post( + "", + status_code=status.HTTP_201_CREATED, + response_model=IPAddressResponse, +) +async def create_ip_address( + body: CreateIPAddressRequest, + request: Request, +) -> IPAddressResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + query_bus = _get_query_bus(request, session) + ip_id = await command_bus.dispatch(CreateIPAddressCommand(**body.model_dump())) + await session.commit() + result = await query_bus.dispatch(GetIPAddressQuery(ip_id=ip_id)) + return IPAddressResponse(**result.model_dump()) + + +@router.get("", response_model=IPAddressListResponse) +async def list_ip_addresses( + params: OffsetParams = Depends(), # noqa: B008 + vrf_id: UUID | None = None, + status_filter: str | None = None, + tenant_id: UUID | None = None, + description_contains: str | None = None, + tag_slugs: list[str] | None = QueryParam(None), # noqa: B008 + custom_fields: str | None = None, + created_after: datetime | None = None, + created_before: datetime | None = None, + updated_after: datetime | None = None, + updated_before: datetime | None = None, + sort_by: str | None = None, + sort_dir: str = "asc", + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> IPAddressListResponse: + custom_field_filters = json.loads(custom_fields) if custom_fields else None + items, total = await query_bus.dispatch( + ListIPAddressesQuery( + offset=params.offset, + limit=params.limit, + vrf_id=vrf_id, + status=status_filter, + tenant_id=tenant_id, + description_contains=description_contains, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + created_after=created_after, + created_before=created_before, + updated_after=updated_after, + updated_before=updated_before, + sort_by=sort_by, + sort_dir=sort_dir, + ) + ) + return IPAddressListResponse( + 
items=[IPAddressResponse(**i.model_dump()) for i in items], + total=total, + offset=params.offset, + limit=params.limit, + ) + + +@router.patch("/bulk", response_model=BulkUpdateResponse) +async def bulk_update_ip_addresses( + body: list[BulkUpdateIPAddressItemSchema], + request: Request, +) -> BulkUpdateResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + updated = await command_bus.dispatch( + BulkUpdateIPAddressesCommand( + items=[ + BulkUpdateIPAddressItem(ip_id=i.id, **i.model_dump(exclude={"id"}, exclude_unset=True)) for i in body + ] + ) + ) + await session.commit() + return BulkUpdateResponse(updated=updated) + + +@router.delete("/bulk", response_model=BulkDeleteResponse) +async def bulk_delete_ip_addresses( + body: BulkDeleteRequest, + request: Request, +) -> BulkDeleteResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + deleted = await command_bus.dispatch(BulkDeleteIPAddressesCommand(ids=body.ids)) + await session.commit() + return BulkDeleteResponse(deleted=deleted) + + +@router.get("/{ip_id}", response_model=IPAddressResponse) +async def get_ip_address( + ip_id: UUID, + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> IPAddressResponse: + result = await query_bus.dispatch(GetIPAddressQuery(ip_id=ip_id)) + return IPAddressResponse(**result.model_dump()) + + +@router.patch("/{ip_id}", response_model=IPAddressResponse) +async def update_ip_address( + ip_id: UUID, + body: UpdateIPAddressRequest, + request: Request, +) -> IPAddressResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + query_bus = _get_query_bus(request, session) + await command_bus.dispatch(UpdateIPAddressCommand(ip_id=ip_id, **body.model_dump(exclude_unset=True))) + await session.commit() + result = await query_bus.dispatch(GetIPAddressQuery(ip_id=ip_id)) + return IPAddressResponse(**result.model_dump()) + + +@router.post("/{ip_id}/status", response_model=IPAddressResponse) +async def change_ip_address_status( + ip_id: UUID, + body: ChangeStatusRequest, + request: Request, +) -> IPAddressResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + query_bus = _get_query_bus(request, session) + await command_bus.dispatch(ChangeIPAddressStatusCommand(ip_id=ip_id, status=body.status)) + await session.commit() + result = await query_bus.dispatch(GetIPAddressQuery(ip_id=ip_id)) + return IPAddressResponse(**result.model_dump()) + + +@router.delete("/{ip_id}", status_code=status.HTTP_204_NO_CONTENT) +async def delete_ip_address( + ip_id: UUID, + request: Request, +) -> None: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + await command_bus.dispatch(DeleteIPAddressCommand(ip_id=ip_id)) + await session.commit() + + +@router.post( + "/bulk", + status_code=status.HTTP_201_CREATED, + response_model=BulkCreateResponse, +) +async def bulk_create_ip_addresses( + body: list[CreateIPAddressRequest], + request: Request, +) -> BulkCreateResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + ids = await command_bus.dispatch( + BulkCreateIPAddressesCommand(items=[CreateIPAddressCommand(**i.model_dump()) for i in body]) + ) + await session.commit() + return BulkCreateResponse(ids=ids, count=len(ids)) diff --git a/services/ipam/src/ipam/interface/routers/ip_range_router.py b/services/ipam/src/ipam/interface/routers/ip_range_router.py new file mode 100644 index 0000000..705d067 
--- /dev/null +++ b/services/ipam/src/ipam/interface/routers/ip_range_router.py @@ -0,0 +1,275 @@ +import json +from datetime import datetime +from uuid import UUID + +from fastapi import APIRouter, Depends, Request, status +from fastapi import Query as QueryParam +from shared.api.pagination import OffsetParams +from shared.cqrs.bus import CommandBus, QueryBus + +from ipam.application.command_handlers import ( + BulkCreateIPRangesHandler, + BulkDeleteIPRangesHandler, + BulkUpdateIPRangesHandler, + ChangeIPRangeStatusHandler, + CreateIPRangeHandler, + DeleteIPRangeHandler, + UpdateIPRangeHandler, +) +from ipam.application.commands import ( + BulkCreateIPRangesCommand, + BulkDeleteIPRangesCommand, + BulkUpdateIPRangeItem, + BulkUpdateIPRangesCommand, + ChangeIPRangeStatusCommand, + CreateIPRangeCommand, + DeleteIPRangeCommand, + UpdateIPRangeCommand, +) +from ipam.application.queries import GetIPRangeQuery, GetIPRangeUtilizationQuery, ListIPRangesQuery +from ipam.application.query_handlers import GetIPRangeHandler, GetIPRangeUtilizationHandler, ListIPRangesHandler +from ipam.infrastructure.read_model_repository import ( + PostgresIPAddressReadModelRepository, + PostgresIPRangeReadModelRepository, +) +from ipam.interface.schemas import ( + BulkCreateResponse, + BulkDeleteRequest, + BulkDeleteResponse, + BulkUpdateResponse, + ChangeStatusRequest, + CreateIPRangeRequest, + IPRangeListResponse, + IPRangeResponse, + UpdateIPRangeRequest, +) +from ipam.interface.schemas import ( + BulkUpdateIPRangeItem as BulkUpdateIPRangeItemSchema, +) + +router = APIRouter(prefix="/ip-ranges", tags=["ip-ranges"]) + + +def _get_session(request: Request): + return request.app.state.database.session() + + +def _get_command_bus(request: Request, session=None) -> CommandBus: + if session is None: + session = _get_session(request) + read_model_repo = PostgresIPRangeReadModelRepository(session) + event_store = request.app.state.event_store + event_producer = request.app.state.event_producer + + bus = CommandBus() + bus.register( + CreateIPRangeCommand, + CreateIPRangeHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + UpdateIPRangeCommand, + UpdateIPRangeHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + ChangeIPRangeStatusCommand, + ChangeIPRangeStatusHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + DeleteIPRangeCommand, + DeleteIPRangeHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + BulkCreateIPRangesCommand, + BulkCreateIPRangesHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + BulkUpdateIPRangesCommand, + BulkUpdateIPRangesHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + BulkDeleteIPRangesCommand, + BulkDeleteIPRangesHandler(event_store, read_model_repo, event_producer), + ) + return bus + + +def _get_query_bus(request: Request, session=None) -> QueryBus: + if session is None: + session = _get_session(request) + read_model_repo = PostgresIPRangeReadModelRepository(session) + ip_repo = PostgresIPAddressReadModelRepository(session) + + bus = QueryBus() + bus.register(GetIPRangeQuery, GetIPRangeHandler(read_model_repo)) + bus.register(ListIPRangesQuery, ListIPRangesHandler(read_model_repo)) + bus.register(GetIPRangeUtilizationQuery, GetIPRangeUtilizationHandler(read_model_repo, ip_repo)) + return bus + + +@router.post( + "", + status_code=status.HTTP_201_CREATED, + response_model=IPRangeResponse, +) +async def create_ip_range( + body: CreateIPRangeRequest, + 
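# The raw Request is needed to reach app.state (database, event store, producer). + 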
request: Request, +) -> IPRangeResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + query_bus = _get_query_bus(request, session) + range_id = await command_bus.dispatch(CreateIPRangeCommand(**body.model_dump())) + await session.commit() + result = await query_bus.dispatch(GetIPRangeQuery(range_id=range_id)) + return IPRangeResponse(**result.model_dump()) + + +@router.get("", response_model=IPRangeListResponse) +async def list_ip_ranges( + params: OffsetParams = Depends(), # noqa: B008 + vrf_id: UUID | None = None, + status_filter: str | None = None, + tenant_id: UUID | None = None, + description_contains: str | None = None, + tag_slugs: list[str] | None = QueryParam(None), # noqa: B008 + custom_fields: str | None = None, + created_after: datetime | None = None, + created_before: datetime | None = None, + updated_after: datetime | None = None, + updated_before: datetime | None = None, + sort_by: str | None = None, + sort_dir: str = "asc", + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> IPRangeListResponse: + custom_field_filters = json.loads(custom_fields) if custom_fields else None + items, total = await query_bus.dispatch( + ListIPRangesQuery( + offset=params.offset, + limit=params.limit, + vrf_id=vrf_id, + status=status_filter, + tenant_id=tenant_id, + description_contains=description_contains, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + created_after=created_after, + created_before=created_before, + updated_after=updated_after, + updated_before=updated_before, + sort_by=sort_by, + sort_dir=sort_dir, + ) + ) + return IPRangeListResponse( + items=[IPRangeResponse(**i.model_dump()) for i in items], + total=total, + offset=params.offset, + limit=params.limit, + ) + + +@router.patch("/bulk", response_model=BulkUpdateResponse) +async def bulk_update_ip_ranges( + body: list[BulkUpdateIPRangeItemSchema], + request: Request, +) -> BulkUpdateResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + updated = await command_bus.dispatch( + BulkUpdateIPRangesCommand( + items=[ + BulkUpdateIPRangeItem(range_id=i.id, **i.model_dump(exclude={"id"}, exclude_unset=True)) for i in body + ] + ) + ) + await session.commit() + return BulkUpdateResponse(updated=updated) + + +@router.delete("/bulk", response_model=BulkDeleteResponse) +async def bulk_delete_ip_ranges( + body: BulkDeleteRequest, + request: Request, +) -> BulkDeleteResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + deleted = await command_bus.dispatch(BulkDeleteIPRangesCommand(ids=body.ids)) + await session.commit() + return BulkDeleteResponse(deleted=deleted) + + +@router.get("/{range_id}", response_model=IPRangeResponse) +async def get_ip_range( + range_id: UUID, + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> IPRangeResponse: + result = await query_bus.dispatch(GetIPRangeQuery(range_id=range_id)) + return IPRangeResponse(**result.model_dump()) + + +@router.patch("/{range_id}", response_model=IPRangeResponse) +async def update_ip_range( + range_id: UUID, + body: UpdateIPRangeRequest, + request: Request, +) -> IPRangeResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + query_bus = _get_query_bus(request, session) + await command_bus.dispatch(UpdateIPRangeCommand(range_id=range_id, **body.model_dump(exclude_unset=True))) + await session.commit() + result = await 
query_bus.dispatch(GetIPRangeQuery(range_id=range_id)) + return IPRangeResponse(**result.model_dump()) + + +@router.post("/{range_id}/status", response_model=IPRangeResponse) +async def change_ip_range_status( + range_id: UUID, + body: ChangeStatusRequest, + request: Request, +) -> IPRangeResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + query_bus = _get_query_bus(request, session) + await command_bus.dispatch(ChangeIPRangeStatusCommand(range_id=range_id, status=body.status)) + await session.commit() + result = await query_bus.dispatch(GetIPRangeQuery(range_id=range_id)) + return IPRangeResponse(**result.model_dump()) + + +@router.delete("/{range_id}", status_code=status.HTTP_204_NO_CONTENT) +async def delete_ip_range( + range_id: UUID, + request: Request, +) -> None: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + await command_bus.dispatch(DeleteIPRangeCommand(range_id=range_id)) + await session.commit() + + +@router.post( + "/bulk", + status_code=status.HTTP_201_CREATED, + response_model=BulkCreateResponse, +) +async def bulk_create_ip_ranges( + body: list[CreateIPRangeRequest], + request: Request, +) -> BulkCreateResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + ids = await command_bus.dispatch( + BulkCreateIPRangesCommand(items=[CreateIPRangeCommand(**i.model_dump()) for i in body]) + ) + await session.commit() + return BulkCreateResponse(ids=ids, count=len(ids)) + + +@router.get("/{range_id}/utilization") +async def get_ip_range_utilization( + range_id: UUID, + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> dict: + utilization = await query_bus.dispatch(GetIPRangeUtilizationQuery(range_id=range_id)) + return {"range_id": range_id, "utilization": utilization} diff --git a/services/ipam/src/ipam/interface/routers/prefix_router.py b/services/ipam/src/ipam/interface/routers/prefix_router.py new file mode 100644 index 0000000..9c7ce78 --- /dev/null +++ b/services/ipam/src/ipam/interface/routers/prefix_router.py @@ -0,0 +1,312 @@ +import json +from datetime import datetime +from uuid import UUID + +from fastapi import APIRouter, Depends, Request, status +from fastapi import Query as QueryParam +from shared.api.pagination import OffsetParams +from shared.cqrs.bus import CommandBus, QueryBus + +from ipam.application.command_handlers import ( + BulkCreatePrefixesHandler, + BulkDeletePrefixesHandler, + BulkUpdatePrefixesHandler, + ChangePrefixStatusHandler, + CreatePrefixHandler, + DeletePrefixHandler, + UpdatePrefixHandler, +) +from ipam.application.commands import ( + BulkCreatePrefixesCommand, + BulkDeletePrefixesCommand, + BulkUpdatePrefixesCommand, + BulkUpdatePrefixItem, + ChangePrefixStatusCommand, + CreatePrefixCommand, + DeletePrefixCommand, + UpdatePrefixCommand, +) +from ipam.application.queries import ( + GetAvailableIPsQuery, + GetAvailablePrefixesQuery, + GetPrefixChildrenQuery, + GetPrefixQuery, + GetPrefixUtilizationQuery, + ListPrefixesQuery, +) +from ipam.application.query_handlers import ( + GetAvailableIPsHandler, + GetAvailablePrefixesHandler, + GetPrefixChildrenHandler, + GetPrefixHandler, + GetPrefixUtilizationHandler, + ListPrefixesHandler, +) +from ipam.infrastructure.read_model_repository import ( + PostgresIPAddressReadModelRepository, + PostgresPrefixReadModelRepository, +) +from ipam.interface.schemas import ( + BulkCreateResponse, + BulkDeleteRequest, + BulkDeleteResponse, + BulkUpdateResponse, + 
ChangeStatusRequest, + CreatePrefixRequest, + PrefixListResponse, + PrefixResponse, + UpdatePrefixRequest, +) +from ipam.interface.schemas import ( + BulkUpdatePrefixItem as BulkUpdatePrefixSchema, +) + +router = APIRouter(prefix="/prefixes", tags=["prefixes"]) + + +def _get_session(request: Request): + return request.app.state.database.session() + + +def _get_command_bus(request: Request, session=None) -> CommandBus: + if session is None: + session = _get_session(request) + read_model_repo = PostgresPrefixReadModelRepository(session) + event_store = request.app.state.event_store + event_producer = request.app.state.event_producer + + bus = CommandBus() + bus.register(CreatePrefixCommand, CreatePrefixHandler(event_store, read_model_repo, event_producer)) + bus.register(UpdatePrefixCommand, UpdatePrefixHandler(event_store, read_model_repo, event_producer)) + bus.register( + ChangePrefixStatusCommand, + ChangePrefixStatusHandler(event_store, read_model_repo, event_producer), + ) + bus.register(DeletePrefixCommand, DeletePrefixHandler(event_store, read_model_repo, event_producer)) + bus.register( + BulkCreatePrefixesCommand, + BulkCreatePrefixesHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + BulkUpdatePrefixesCommand, + BulkUpdatePrefixesHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + BulkDeletePrefixesCommand, + BulkDeletePrefixesHandler(event_store, read_model_repo, event_producer), + ) + return bus + + +def _get_query_bus(request: Request, session=None) -> QueryBus: + if session is None: + session = _get_session(request) + prefix_repo = PostgresPrefixReadModelRepository(session) + ip_repo = PostgresIPAddressReadModelRepository(session) + + bus = QueryBus() + bus.register(GetPrefixQuery, GetPrefixHandler(prefix_repo)) + bus.register(ListPrefixesQuery, ListPrefixesHandler(prefix_repo)) + bus.register(GetPrefixChildrenQuery, GetPrefixChildrenHandler(prefix_repo)) + cache = getattr(request.app.state, "cache", None) + bus.register(GetPrefixUtilizationQuery, GetPrefixUtilizationHandler(prefix_repo, ip_repo, cache=cache)) + bus.register(GetAvailablePrefixesQuery, GetAvailablePrefixesHandler(prefix_repo)) + bus.register(GetAvailableIPsQuery, GetAvailableIPsHandler(prefix_repo, ip_repo)) + return bus + + +@router.post( + "", + status_code=status.HTTP_201_CREATED, + response_model=PrefixResponse, +) +async def create_prefix( + body: CreatePrefixRequest, + request: Request, +) -> PrefixResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + query_bus = _get_query_bus(request, session) + prefix_id = await command_bus.dispatch(CreatePrefixCommand(**body.model_dump())) + await session.commit() + result = await query_bus.dispatch(GetPrefixQuery(prefix_id=prefix_id)) + return PrefixResponse(**result.model_dump()) + + +@router.get("/{prefix_id}", response_model=PrefixResponse) +async def get_prefix( + prefix_id: UUID, + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> PrefixResponse: + result = await query_bus.dispatch(GetPrefixQuery(prefix_id=prefix_id)) + return PrefixResponse(**result.model_dump()) + + +@router.get("", response_model=PrefixListResponse) +async def list_prefixes( + params: OffsetParams = Depends(), # noqa: B008 + vrf_id: UUID | None = None, + status_filter: str | None = None, + tenant_id: UUID | None = None, + role: str | None = None, + description_contains: str | None = None, + tag_slugs: list[str] | None = QueryParam(None), # noqa: B008 + custom_fields: str | None = None, 
+    created_after: datetime | None = None,
+    created_before: datetime | None = None,
+    updated_after: datetime | None = None,
+    updated_before: datetime | None = None,
+    sort_by: str | None = None,
+    sort_dir: str = "asc",
+    query_bus: QueryBus = Depends(_get_query_bus),  # noqa: B008
+) -> PrefixListResponse:
+    custom_field_filters = json.loads(custom_fields) if custom_fields else None
+    items, total = await query_bus.dispatch(
+        ListPrefixesQuery(
+            offset=params.offset,
+            limit=params.limit,
+            vrf_id=vrf_id,
+            status=status_filter,
+            tenant_id=tenant_id,
+            role=role,
+            description_contains=description_contains,
+            tag_slugs=tag_slugs,
+            custom_field_filters=custom_field_filters,
+            created_after=created_after,
+            created_before=created_before,
+            updated_after=updated_after,
+            updated_before=updated_before,
+            sort_by=sort_by,
+            sort_dir=sort_dir,
+        )
+    )
+    return PrefixListResponse(
+        items=[PrefixResponse(**i.model_dump()) for i in items],
+        total=total,
+        offset=params.offset,
+        limit=params.limit,
+    )
+
+
+# NOTE: the /bulk routes must be registered before the /{prefix_id} routes,
+# otherwise FastAPI matches "bulk" against /{prefix_id} first and fails UUID
+# validation with a 422. The other routers in this service already do this.
+@router.patch("/bulk", response_model=BulkUpdateResponse)
+async def bulk_update_prefixes(
+    body: list[BulkUpdatePrefixSchema],
+    request: Request,
+) -> BulkUpdateResponse:
+    session = _get_session(request)
+    command_bus = _get_command_bus(request, session)
+    updated = await command_bus.dispatch(
+        BulkUpdatePrefixesCommand(
+            items=[
+                BulkUpdatePrefixItem(prefix_id=i.id, **i.model_dump(exclude={"id"}, exclude_unset=True)) for i in body
+            ]
+        )
+    )
+    await session.commit()
+    return BulkUpdateResponse(updated=updated)
+
+
+@router.delete("/bulk", response_model=BulkDeleteResponse)
+async def bulk_delete_prefixes(
+    body: BulkDeleteRequest,
+    request: Request,
+) -> BulkDeleteResponse:
+    session = _get_session(request)
+    command_bus = _get_command_bus(request, session)
+    deleted = await command_bus.dispatch(BulkDeletePrefixesCommand(ids=body.ids))
+    await session.commit()
+    return BulkDeleteResponse(deleted=deleted)
+
+
+@router.patch("/{prefix_id}", response_model=PrefixResponse)
+async def update_prefix(
+    prefix_id: UUID,
+    body: UpdatePrefixRequest,
+    request: Request,
+) -> PrefixResponse:
+    session = _get_session(request)
+    command_bus = _get_command_bus(request, session)
+    query_bus = _get_query_bus(request, session)
+    await command_bus.dispatch(UpdatePrefixCommand(prefix_id=prefix_id, **body.model_dump(exclude_unset=True)))
+    await session.commit()
+    result = await query_bus.dispatch(GetPrefixQuery(prefix_id=prefix_id))
+    return PrefixResponse(**result.model_dump())
+
+
+@router.post("/{prefix_id}/status", response_model=PrefixResponse)
+async def change_prefix_status(
+    prefix_id: UUID,
+    body: ChangeStatusRequest,
+    request: Request,
+) -> PrefixResponse:
+    session = _get_session(request)
+    command_bus = _get_command_bus(request, session)
+    query_bus = _get_query_bus(request, session)
+    await command_bus.dispatch(ChangePrefixStatusCommand(prefix_id=prefix_id, new_status=body.status))
+    await session.commit()
+    result = await query_bus.dispatch(GetPrefixQuery(prefix_id=prefix_id))
+    return PrefixResponse(**result.model_dump())
+
+
+@router.delete("/{prefix_id}", status_code=status.HTTP_204_NO_CONTENT)
+async def delete_prefix(
+    prefix_id: UUID,
+    request: Request,
+) -> None:
+    session = _get_session(request)
+    command_bus = _get_command_bus(request, session)
+    await command_bus.dispatch(DeletePrefixCommand(prefix_id=prefix_id))
+    await session.commit()
+
+
+@router.post("/bulk", response_model=BulkCreateResponse, status_code=status.HTTP_201_CREATED)
+async def 
bulk_create_prefixes( + body: list[CreatePrefixRequest], + request: Request, +) -> BulkCreateResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + commands = [CreatePrefixCommand(**b.model_dump()) for b in body] + ids = await command_bus.dispatch(BulkCreatePrefixesCommand(items=commands)) + await session.commit() + return BulkCreateResponse(ids=ids, count=len(ids)) + + +@router.get("/{prefix_id}/children", response_model=list[PrefixResponse]) +async def get_prefix_children( + prefix_id: UUID, + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> list[PrefixResponse]: + children = await query_bus.dispatch(GetPrefixChildrenQuery(prefix_id=prefix_id)) + return [PrefixResponse(**c.model_dump()) for c in children] + + +@router.get("/{prefix_id}/utilization") +async def get_prefix_utilization( + prefix_id: UUID, + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> dict: + utilization = await query_bus.dispatch(GetPrefixUtilizationQuery(prefix_id=prefix_id)) + return {"utilization": utilization} + + +@router.get("/{prefix_id}/available-prefixes") +async def get_available_prefixes( + prefix_id: UUID, + desired_prefix_length: int = 24, + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> dict: + available = await query_bus.dispatch( + GetAvailablePrefixesQuery(prefix_id=prefix_id, desired_prefix_length=desired_prefix_length) + ) + return {"available_prefixes": available} + + +@router.get("/{prefix_id}/available-ips") +async def get_available_ips( + prefix_id: UUID, + count: int = 1, + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> dict: + available = await query_bus.dispatch(GetAvailableIPsQuery(prefix_id=prefix_id, count=count)) + return {"available_ips": available} diff --git a/services/ipam/src/ipam/interface/routers/rir_router.py b/services/ipam/src/ipam/interface/routers/rir_router.py new file mode 100644 index 0000000..7d41a70 --- /dev/null +++ b/services/ipam/src/ipam/interface/routers/rir_router.py @@ -0,0 +1,220 @@ +import json +from datetime import datetime +from uuid import UUID + +from fastapi import APIRouter, Depends, Request, status +from fastapi import Query as QueryParam +from shared.api.pagination import OffsetParams +from shared.cqrs.bus import CommandBus, QueryBus + +from ipam.application.command_handlers import ( + BulkCreateRIRsHandler, + BulkDeleteRIRsHandler, + BulkUpdateRIRsHandler, + CreateRIRHandler, + DeleteRIRHandler, + UpdateRIRHandler, +) +from ipam.application.commands import ( + BulkCreateRIRsCommand, + BulkDeleteRIRsCommand, + BulkUpdateRIRItem, + BulkUpdateRIRsCommand, + CreateRIRCommand, + DeleteRIRCommand, + UpdateRIRCommand, +) +from ipam.application.queries import GetRIRQuery, ListRIRsQuery +from ipam.application.query_handlers import GetRIRHandler, ListRIRsHandler +from ipam.infrastructure.read_model_repository import PostgresRIRReadModelRepository +from ipam.interface.schemas import ( + BulkCreateResponse, + BulkDeleteRequest, + BulkDeleteResponse, + BulkUpdateResponse, + CreateRIRRequest, + RIRListResponse, + RIRResponse, + UpdateRIRRequest, +) +from ipam.interface.schemas import ( + BulkUpdateRIRItem as BulkUpdateRIRItemSchema, +) + +router = APIRouter(prefix="/rirs", tags=["rirs"]) + + +def _get_session(request: Request): + return request.app.state.database.session() + + +def _get_command_bus(request: Request, session=None) -> CommandBus: + if session is None: + session = _get_session(request) + read_model_repo = 
PostgresRIRReadModelRepository(session) + event_store = request.app.state.event_store + event_producer = request.app.state.event_producer + + bus = CommandBus() + bus.register(CreateRIRCommand, CreateRIRHandler(event_store, read_model_repo, event_producer)) + bus.register(UpdateRIRCommand, UpdateRIRHandler(event_store, read_model_repo, event_producer)) + bus.register(DeleteRIRCommand, DeleteRIRHandler(event_store, read_model_repo, event_producer)) + bus.register( + BulkCreateRIRsCommand, + BulkCreateRIRsHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + BulkUpdateRIRsCommand, + BulkUpdateRIRsHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + BulkDeleteRIRsCommand, + BulkDeleteRIRsHandler(event_store, read_model_repo, event_producer), + ) + return bus + + +def _get_query_bus(request: Request, session=None) -> QueryBus: + if session is None: + session = _get_session(request) + read_model_repo = PostgresRIRReadModelRepository(session) + + bus = QueryBus() + bus.register(GetRIRQuery, GetRIRHandler(read_model_repo)) + bus.register(ListRIRsQuery, ListRIRsHandler(read_model_repo)) + return bus + + +@router.post( + "", + status_code=status.HTTP_201_CREATED, + response_model=RIRResponse, +) +async def create_rir( + body: CreateRIRRequest, + request: Request, +) -> RIRResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + query_bus = _get_query_bus(request, session) + rir_id = await command_bus.dispatch(CreateRIRCommand(**body.model_dump())) + await session.commit() + result = await query_bus.dispatch(GetRIRQuery(rir_id=rir_id)) + return RIRResponse(**result.model_dump()) + + +@router.get("", response_model=RIRListResponse) +async def list_rirs( + params: OffsetParams = Depends(), # noqa: B008 + description_contains: str | None = None, + tag_slugs: list[str] | None = QueryParam(None), # noqa: B008 + custom_fields: str | None = None, + created_after: datetime | None = None, + created_before: datetime | None = None, + updated_after: datetime | None = None, + updated_before: datetime | None = None, + sort_by: str | None = None, + sort_dir: str = "asc", + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> RIRListResponse: + custom_field_filters = json.loads(custom_fields) if custom_fields else None + items, total = await query_bus.dispatch( + ListRIRsQuery( + offset=params.offset, + limit=params.limit, + description_contains=description_contains, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + created_after=created_after, + created_before=created_before, + updated_after=updated_after, + updated_before=updated_before, + sort_by=sort_by, + sort_dir=sort_dir, + ) + ) + return RIRListResponse( + items=[RIRResponse(**i.model_dump()) for i in items], + total=total, + offset=params.offset, + limit=params.limit, + ) + + +@router.patch("/bulk", response_model=BulkUpdateResponse) +async def bulk_update_rirs( + body: list[BulkUpdateRIRItemSchema], + request: Request, +) -> BulkUpdateResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + updated = await command_bus.dispatch( + BulkUpdateRIRsCommand( + items=[BulkUpdateRIRItem(rir_id=i.id, **i.model_dump(exclude={"id"}, exclude_unset=True)) for i in body] + ) + ) + await session.commit() + return BulkUpdateResponse(updated=updated) + + +@router.delete("/bulk", response_model=BulkDeleteResponse) +async def bulk_delete_rirs( + body: BulkDeleteRequest, + request: Request, +) -> BulkDeleteResponse: + 
session = _get_session(request) + command_bus = _get_command_bus(request, session) + deleted = await command_bus.dispatch(BulkDeleteRIRsCommand(ids=body.ids)) + await session.commit() + return BulkDeleteResponse(deleted=deleted) + + +@router.get("/{rir_id}", response_model=RIRResponse) +async def get_rir( + rir_id: UUID, + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> RIRResponse: + result = await query_bus.dispatch(GetRIRQuery(rir_id=rir_id)) + return RIRResponse(**result.model_dump()) + + +@router.patch("/{rir_id}", response_model=RIRResponse) +async def update_rir( + rir_id: UUID, + body: UpdateRIRRequest, + request: Request, +) -> RIRResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + query_bus = _get_query_bus(request, session) + await command_bus.dispatch(UpdateRIRCommand(rir_id=rir_id, **body.model_dump(exclude_unset=True))) + await session.commit() + result = await query_bus.dispatch(GetRIRQuery(rir_id=rir_id)) + return RIRResponse(**result.model_dump()) + + +@router.delete("/{rir_id}", status_code=status.HTTP_204_NO_CONTENT) +async def delete_rir( + rir_id: UUID, + request: Request, +) -> None: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + await command_bus.dispatch(DeleteRIRCommand(rir_id=rir_id)) + await session.commit() + + +@router.post( + "/bulk", + status_code=status.HTTP_201_CREATED, + response_model=BulkCreateResponse, +) +async def bulk_create_rirs( + body: list[CreateRIRRequest], + request: Request, +) -> BulkCreateResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + ids = await command_bus.dispatch(BulkCreateRIRsCommand(items=[CreateRIRCommand(**i.model_dump()) for i in body])) + await session.commit() + return BulkCreateResponse(ids=ids, count=len(ids)) diff --git a/services/ipam/src/ipam/interface/routers/route_target_router.py b/services/ipam/src/ipam/interface/routers/route_target_router.py new file mode 100644 index 0000000..c06dae1 --- /dev/null +++ b/services/ipam/src/ipam/interface/routers/route_target_router.py @@ -0,0 +1,229 @@ +import json +from datetime import datetime +from uuid import UUID + +from fastapi import APIRouter, Depends, Request, status +from fastapi import Query as QueryParam +from shared.api.pagination import OffsetParams +from shared.cqrs.bus import CommandBus, QueryBus + +from ipam.application.command_handlers import ( + BulkCreateRouteTargetsHandler, + BulkDeleteRouteTargetsHandler, + BulkUpdateRouteTargetsHandler, + CreateRouteTargetHandler, + DeleteRouteTargetHandler, + UpdateRouteTargetHandler, +) +from ipam.application.commands import ( + BulkCreateRouteTargetsCommand, + BulkDeleteRouteTargetsCommand, + BulkUpdateRouteTargetItem, + BulkUpdateRouteTargetsCommand, + CreateRouteTargetCommand, + DeleteRouteTargetCommand, + UpdateRouteTargetCommand, +) +from ipam.application.queries import GetRouteTargetQuery, ListRouteTargetsQuery +from ipam.application.query_handlers import GetRouteTargetHandler, ListRouteTargetsHandler +from ipam.infrastructure.read_model_repository import PostgresRouteTargetReadModelRepository +from ipam.interface.schemas import ( + BulkCreateResponse, + BulkDeleteRequest, + BulkDeleteResponse, + BulkUpdateResponse, + CreateRouteTargetRequest, + RouteTargetListResponse, + RouteTargetResponse, + UpdateRouteTargetRequest, +) +from ipam.interface.schemas import ( + BulkUpdateRouteTargetItem as BulkUpdateRouteTargetItemSchema, +) + +router = APIRouter(prefix="/route-targets", 
tags=["route-targets"]) + + +def _get_session(request: Request): + return request.app.state.database.session() + + +def _get_command_bus(request: Request, session=None) -> CommandBus: + if session is None: + session = _get_session(request) + read_model_repo = PostgresRouteTargetReadModelRepository(session) + event_store = request.app.state.event_store + event_producer = request.app.state.event_producer + + bus = CommandBus() + bus.register(CreateRouteTargetCommand, CreateRouteTargetHandler(event_store, read_model_repo, event_producer)) + bus.register(UpdateRouteTargetCommand, UpdateRouteTargetHandler(event_store, read_model_repo, event_producer)) + bus.register(DeleteRouteTargetCommand, DeleteRouteTargetHandler(event_store, read_model_repo, event_producer)) + bus.register( + BulkCreateRouteTargetsCommand, + BulkCreateRouteTargetsHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + BulkUpdateRouteTargetsCommand, + BulkUpdateRouteTargetsHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + BulkDeleteRouteTargetsCommand, + BulkDeleteRouteTargetsHandler(event_store, read_model_repo, event_producer), + ) + return bus + + +def _get_query_bus(request: Request, session=None) -> QueryBus: + if session is None: + session = _get_session(request) + read_model_repo = PostgresRouteTargetReadModelRepository(session) + + bus = QueryBus() + bus.register(GetRouteTargetQuery, GetRouteTargetHandler(read_model_repo)) + bus.register(ListRouteTargetsQuery, ListRouteTargetsHandler(read_model_repo)) + return bus + + +@router.post( + "", + status_code=status.HTTP_201_CREATED, + response_model=RouteTargetResponse, +) +async def create_route_target( + body: CreateRouteTargetRequest, + request: Request, +) -> RouteTargetResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + query_bus = _get_query_bus(request, session) + route_target_id = await command_bus.dispatch(CreateRouteTargetCommand(**body.model_dump())) + await session.commit() + result = await query_bus.dispatch(GetRouteTargetQuery(route_target_id=route_target_id)) + return RouteTargetResponse(**result.model_dump()) + + +@router.get("", response_model=RouteTargetListResponse) +async def list_route_targets( + params: OffsetParams = Depends(), # noqa: B008 + tenant_id: UUID | None = None, + description_contains: str | None = None, + tag_slugs: list[str] | None = QueryParam(None), # noqa: B008 + custom_fields: str | None = None, + created_after: datetime | None = None, + created_before: datetime | None = None, + updated_after: datetime | None = None, + updated_before: datetime | None = None, + sort_by: str | None = None, + sort_dir: str = "asc", + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> RouteTargetListResponse: + custom_field_filters = json.loads(custom_fields) if custom_fields else None + items, total = await query_bus.dispatch( + ListRouteTargetsQuery( + offset=params.offset, + limit=params.limit, + tenant_id=tenant_id, + description_contains=description_contains, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + created_after=created_after, + created_before=created_before, + updated_after=updated_after, + updated_before=updated_before, + sort_by=sort_by, + sort_dir=sort_dir, + ) + ) + return RouteTargetListResponse( + items=[RouteTargetResponse(**i.model_dump()) for i in items], + total=total, + offset=params.offset, + limit=params.limit, + ) + + +@router.patch("/bulk", response_model=BulkUpdateResponse) +async def 
bulk_update_route_targets( + body: list[BulkUpdateRouteTargetItemSchema], + request: Request, +) -> BulkUpdateResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + updated = await command_bus.dispatch( + BulkUpdateRouteTargetsCommand( + items=[ + BulkUpdateRouteTargetItem(route_target_id=i.id, **i.model_dump(exclude={"id"}, exclude_unset=True)) + for i in body + ] + ) + ) + await session.commit() + return BulkUpdateResponse(updated=updated) + + +@router.delete("/bulk", response_model=BulkDeleteResponse) +async def bulk_delete_route_targets( + body: BulkDeleteRequest, + request: Request, +) -> BulkDeleteResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + deleted = await command_bus.dispatch(BulkDeleteRouteTargetsCommand(ids=body.ids)) + await session.commit() + return BulkDeleteResponse(deleted=deleted) + + +@router.get("/{route_target_id}", response_model=RouteTargetResponse) +async def get_route_target( + route_target_id: UUID, + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> RouteTargetResponse: + result = await query_bus.dispatch(GetRouteTargetQuery(route_target_id=route_target_id)) + return RouteTargetResponse(**result.model_dump()) + + +@router.patch("/{route_target_id}", response_model=RouteTargetResponse) +async def update_route_target( + route_target_id: UUID, + body: UpdateRouteTargetRequest, + request: Request, +) -> RouteTargetResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + query_bus = _get_query_bus(request, session) + await command_bus.dispatch( + UpdateRouteTargetCommand(route_target_id=route_target_id, **body.model_dump(exclude_unset=True)) + ) + await session.commit() + result = await query_bus.dispatch(GetRouteTargetQuery(route_target_id=route_target_id)) + return RouteTargetResponse(**result.model_dump()) + + +@router.delete("/{route_target_id}", status_code=status.HTTP_204_NO_CONTENT) +async def delete_route_target( + route_target_id: UUID, + request: Request, +) -> None: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + await command_bus.dispatch(DeleteRouteTargetCommand(route_target_id=route_target_id)) + await session.commit() + + +@router.post( + "/bulk", + status_code=status.HTTP_201_CREATED, + response_model=BulkCreateResponse, +) +async def bulk_create_route_targets( + body: list[CreateRouteTargetRequest], + request: Request, +) -> BulkCreateResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + ids = await command_bus.dispatch( + BulkCreateRouteTargetsCommand(items=[CreateRouteTargetCommand(**i.model_dump()) for i in body]) + ) + await session.commit() + return BulkCreateResponse(ids=ids, count=len(ids)) diff --git a/services/ipam/src/ipam/interface/routers/saved_filter_router.py b/services/ipam/src/ipam/interface/routers/saved_filter_router.py new file mode 100644 index 0000000..1472d3c --- /dev/null +++ b/services/ipam/src/ipam/interface/routers/saved_filter_router.py @@ -0,0 +1,121 @@ +from uuid import UUID + +from fastapi import APIRouter, Depends, Request, status +from shared.cqrs.bus import CommandBus, QueryBus + +from ipam.application.command_handlers import ( + CreateSavedFilterHandler, + DeleteSavedFilterHandler, + UpdateSavedFilterHandler, +) +from ipam.application.commands import ( + CreateSavedFilterCommand, + DeleteSavedFilterCommand, + UpdateSavedFilterCommand, +) +from ipam.application.queries import 
GetSavedFilterQuery, ListSavedFiltersQuery
+from ipam.application.query_handlers import GetSavedFilterHandler, ListSavedFiltersHandler
+from ipam.infrastructure.saved_filter_repository import PostgresSavedFilterRepository
+from ipam.interface.schemas import (
+    CreateSavedFilterRequest,
+    SavedFilterListResponse,
+    SavedFilterResponse,
+    UpdateSavedFilterRequest,
+)
+
+router = APIRouter(prefix="/saved-filters", tags=["saved-filters"])
+
+
+def _require_user_id(request: Request) -> UUID:
+    # request.state.user_id is populated upstream from the X-User-ID header;
+    # endpoints that scope filters to a user reject requests without it.
+    user_id = getattr(request.state, "user_id", None)
+    if user_id is None:
+        from fastapi import HTTPException
+
+        raise HTTPException(status_code=400, detail="X-User-ID header is required")
+    return UUID(user_id)
+
+
+def _get_command_bus(request: Request) -> CommandBus:
+    session = request.app.state.database.session()
+    repo = PostgresSavedFilterRepository(session)
+    bus = CommandBus()
+    bus.register(CreateSavedFilterCommand, CreateSavedFilterHandler(repo))
+    bus.register(UpdateSavedFilterCommand, UpdateSavedFilterHandler(repo))
+    bus.register(DeleteSavedFilterCommand, DeleteSavedFilterHandler(repo))
+    return bus
+
+
+def _get_query_bus(request: Request) -> QueryBus:
+    session = request.app.state.database.session()
+    repo = PostgresSavedFilterRepository(session)
+    bus = QueryBus()
+    bus.register(GetSavedFilterQuery, GetSavedFilterHandler(repo))
+    bus.register(ListSavedFiltersQuery, ListSavedFiltersHandler(repo))
+    return bus
+
+
+@router.post("", response_model=SavedFilterResponse, status_code=status.HTTP_201_CREATED)
+async def create_saved_filter(
+    body: CreateSavedFilterRequest,
+    request: Request,
+    command_bus: CommandBus = Depends(_get_command_bus),  # noqa: B008
+    query_bus: QueryBus = Depends(_get_query_bus),  # noqa: B008
+) -> SavedFilterResponse:
+    user_id = _require_user_id(request)
+    filter_id = await command_bus.dispatch(
+        CreateSavedFilterCommand(
+            user_id=user_id,
+            name=body.name,
+            entity_type=body.entity_type,
+            filter_config=body.filter_config,
+            is_default=body.is_default,
+        )
+    )
+    result = await query_bus.dispatch(GetSavedFilterQuery(filter_id=filter_id))
+    return SavedFilterResponse(**result.model_dump())
+
+
+@router.get("", response_model=SavedFilterListResponse)
+async def list_saved_filters(
+    request: Request,
+    entity_type: str | None = None,
+    query_bus: QueryBus = Depends(_get_query_bus),  # noqa: B008
+) -> SavedFilterListResponse:
+    user_id = _require_user_id(request)
+    items = await query_bus.dispatch(ListSavedFiltersQuery(user_id=user_id, entity_type=entity_type))
+    return SavedFilterListResponse(items=[SavedFilterResponse(**i.model_dump()) for i in items])
+
+
+@router.get("/{filter_id}", response_model=SavedFilterResponse)
+async def get_saved_filter(
+    filter_id: UUID,
+    query_bus: QueryBus = Depends(_get_query_bus),  # noqa: B008
+) -> SavedFilterResponse:
+    result = await query_bus.dispatch(GetSavedFilterQuery(filter_id=filter_id))
+    return SavedFilterResponse(**result.model_dump())
+
+
+@router.patch("/{filter_id}", response_model=SavedFilterResponse)
+async def update_saved_filter(
+    filter_id: UUID,
+    body: UpdateSavedFilterRequest,
+    command_bus: CommandBus = Depends(_get_command_bus),  # noqa: B008
+    query_bus: QueryBus = Depends(_get_query_bus),  # noqa: B008
+) -> SavedFilterResponse:
+    await command_bus.dispatch(
+        UpdateSavedFilterCommand(
+            filter_id=filter_id,
+            name=body.name,
+            filter_config=body.filter_config,
+            is_default=body.is_default,
+        )
+    )
+    result = await 
query_bus.dispatch(GetSavedFilterQuery(filter_id=filter_id)) + return SavedFilterResponse(**result.model_dump()) + + +@router.delete("/{filter_id}", status_code=status.HTTP_204_NO_CONTENT) +async def delete_saved_filter( + filter_id: UUID, + command_bus: CommandBus = Depends(_get_command_bus), # noqa: B008 +) -> None: + await command_bus.dispatch(DeleteSavedFilterCommand(filter_id=filter_id)) diff --git a/services/ipam/src/ipam/interface/routers/search_router.py b/services/ipam/src/ipam/interface/routers/search_router.py new file mode 100644 index 0000000..bcfbf96 --- /dev/null +++ b/services/ipam/src/ipam/interface/routers/search_router.py @@ -0,0 +1,33 @@ +from fastapi import APIRouter, Depends, Request +from fastapi import Query as QueryParam +from shared.cqrs.bus import QueryBus + +from ipam.application.queries import GlobalSearchQuery +from ipam.application.query_handlers import GlobalSearchHandler +from ipam.infrastructure.search_repository import PostgresGlobalSearchRepository +from ipam.interface.schemas import GlobalSearchResponse, SearchResultResponse + +router = APIRouter(prefix="/search", tags=["search"]) + + +def _get_query_bus(request: Request) -> QueryBus: + session = request.app.state.database.session() + search_repo = PostgresGlobalSearchRepository(session) + bus = QueryBus() + bus.register(GlobalSearchQuery, GlobalSearchHandler(search_repo)) + return bus + + +@router.get("", response_model=GlobalSearchResponse) +async def global_search( + q: str, + entity_types: list[str] | None = QueryParam(None), # noqa: B008 + offset: int = 0, + limit: int = 20, + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> GlobalSearchResponse: + result = await query_bus.dispatch(GlobalSearchQuery(q=q, entity_types=entity_types, offset=offset, limit=limit)) + return GlobalSearchResponse( + results=[SearchResultResponse(**r.model_dump()) for r in result.results], + total=result.total, + ) diff --git a/services/ipam/src/ipam/interface/routers/service_router.py b/services/ipam/src/ipam/interface/routers/service_router.py new file mode 100644 index 0000000..7167cfa --- /dev/null +++ b/services/ipam/src/ipam/interface/routers/service_router.py @@ -0,0 +1,224 @@ +import json +from datetime import datetime +from uuid import UUID + +from fastapi import APIRouter, Depends, Request, status +from fastapi import Query as QueryParam +from shared.api.pagination import OffsetParams +from shared.cqrs.bus import CommandBus, QueryBus + +from ipam.application.command_handlers import ( + BulkCreateServicesHandler, + BulkDeleteServicesHandler, + BulkUpdateServicesHandler, + CreateServiceHandler, + DeleteServiceHandler, + UpdateServiceHandler, +) +from ipam.application.commands import ( + BulkCreateServicesCommand, + BulkDeleteServicesCommand, + BulkUpdateServiceItem, + BulkUpdateServicesCommand, + CreateServiceCommand, + DeleteServiceCommand, + UpdateServiceCommand, +) +from ipam.application.queries import GetServiceQuery, ListServicesQuery +from ipam.application.query_handlers import GetServiceHandler, ListServicesHandler +from ipam.infrastructure.read_model_repository import PostgresServiceReadModelRepository +from ipam.interface.schemas import ( + BulkCreateResponse, + BulkDeleteRequest, + BulkDeleteResponse, + BulkUpdateResponse, + CreateServiceRequest, + ServiceListResponse, + ServiceResponse, + UpdateServiceRequest, +) +from ipam.interface.schemas import ( + BulkUpdateServiceItem as BulkUpdateServiceItemSchema, +) + +router = APIRouter(prefix="/services", tags=["services"]) + + +def 
_get_session(request: Request): + return request.app.state.database.session() + + +def _get_command_bus(request: Request, session=None) -> CommandBus: + if session is None: + session = _get_session(request) + read_model_repo = PostgresServiceReadModelRepository(session) + event_store = request.app.state.event_store + event_producer = request.app.state.event_producer + + bus = CommandBus() + bus.register(CreateServiceCommand, CreateServiceHandler(event_store, read_model_repo, event_producer)) + bus.register(UpdateServiceCommand, UpdateServiceHandler(event_store, read_model_repo, event_producer)) + bus.register(DeleteServiceCommand, DeleteServiceHandler(event_store, read_model_repo, event_producer)) + bus.register( + BulkCreateServicesCommand, + BulkCreateServicesHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + BulkUpdateServicesCommand, + BulkUpdateServicesHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + BulkDeleteServicesCommand, + BulkDeleteServicesHandler(event_store, read_model_repo, event_producer), + ) + return bus + + +def _get_query_bus(request: Request, session=None) -> QueryBus: + if session is None: + session = _get_session(request) + read_model_repo = PostgresServiceReadModelRepository(session) + + bus = QueryBus() + bus.register(GetServiceQuery, GetServiceHandler(read_model_repo)) + bus.register(ListServicesQuery, ListServicesHandler(read_model_repo)) + return bus + + +@router.post( + "", + status_code=status.HTTP_201_CREATED, + response_model=ServiceResponse, +) +async def create_service( + body: CreateServiceRequest, + request: Request, +) -> ServiceResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + query_bus = _get_query_bus(request, session) + service_id = await command_bus.dispatch(CreateServiceCommand(**body.model_dump())) + await session.commit() + result = await query_bus.dispatch(GetServiceQuery(service_id=service_id)) + return ServiceResponse(**result.model_dump()) + + +@router.get("", response_model=ServiceListResponse) +async def list_services( + params: OffsetParams = Depends(), # noqa: B008 + description_contains: str | None = None, + tag_slugs: list[str] | None = QueryParam(None), # noqa: B008 + custom_fields: str | None = None, + created_after: datetime | None = None, + created_before: datetime | None = None, + updated_after: datetime | None = None, + updated_before: datetime | None = None, + sort_by: str | None = None, + sort_dir: str = "asc", + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> ServiceListResponse: + custom_field_filters = json.loads(custom_fields) if custom_fields else None + items, total = await query_bus.dispatch( + ListServicesQuery( + offset=params.offset, + limit=params.limit, + description_contains=description_contains, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + created_after=created_after, + created_before=created_before, + updated_after=updated_after, + updated_before=updated_before, + sort_by=sort_by, + sort_dir=sort_dir, + ) + ) + return ServiceListResponse( + items=[ServiceResponse(**i.model_dump()) for i in items], + total=total, + offset=params.offset, + limit=params.limit, + ) + + +@router.patch("/bulk", response_model=BulkUpdateResponse) +async def bulk_update_services( + body: list[BulkUpdateServiceItemSchema], + request: Request, +) -> BulkUpdateResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + updated = await command_bus.dispatch( + 
BulkUpdateServicesCommand( + items=[ + BulkUpdateServiceItem(service_id=i.id, **i.model_dump(exclude={"id"}, exclude_unset=True)) for i in body + ] + ) + ) + await session.commit() + return BulkUpdateResponse(updated=updated) + + +@router.delete("/bulk", response_model=BulkDeleteResponse) +async def bulk_delete_services( + body: BulkDeleteRequest, + request: Request, +) -> BulkDeleteResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + deleted = await command_bus.dispatch(BulkDeleteServicesCommand(ids=body.ids)) + await session.commit() + return BulkDeleteResponse(deleted=deleted) + + +@router.get("/{service_id}", response_model=ServiceResponse) +async def get_service( + service_id: UUID, + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> ServiceResponse: + result = await query_bus.dispatch(GetServiceQuery(service_id=service_id)) + return ServiceResponse(**result.model_dump()) + + +@router.patch("/{service_id}", response_model=ServiceResponse) +async def update_service( + service_id: UUID, + body: UpdateServiceRequest, + request: Request, +) -> ServiceResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + query_bus = _get_query_bus(request, session) + await command_bus.dispatch(UpdateServiceCommand(service_id=service_id, **body.model_dump(exclude_unset=True))) + await session.commit() + result = await query_bus.dispatch(GetServiceQuery(service_id=service_id)) + return ServiceResponse(**result.model_dump()) + + +@router.delete("/{service_id}", status_code=status.HTTP_204_NO_CONTENT) +async def delete_service( + service_id: UUID, + request: Request, +) -> None: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + await command_bus.dispatch(DeleteServiceCommand(service_id=service_id)) + await session.commit() + + +@router.post( + "/bulk", + status_code=status.HTTP_201_CREATED, + response_model=BulkCreateResponse, +) +async def bulk_create_services( + body: list[CreateServiceRequest], + request: Request, +) -> BulkCreateResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + ids = await command_bus.dispatch( + BulkCreateServicesCommand(items=[CreateServiceCommand(**i.model_dump()) for i in body]) + ) + await session.commit() + return BulkCreateResponse(ids=ids, count=len(ids)) diff --git a/services/ipam/src/ipam/interface/routers/vlan_group_router.py b/services/ipam/src/ipam/interface/routers/vlan_group_router.py new file mode 100644 index 0000000..1d1c722 --- /dev/null +++ b/services/ipam/src/ipam/interface/routers/vlan_group_router.py @@ -0,0 +1,229 @@ +import json +from datetime import datetime +from uuid import UUID + +from fastapi import APIRouter, Depends, Request, status +from fastapi import Query as QueryParam +from shared.api.pagination import OffsetParams +from shared.cqrs.bus import CommandBus, QueryBus + +from ipam.application.command_handlers import ( + BulkCreateVLANGroupsHandler, + BulkDeleteVLANGroupsHandler, + BulkUpdateVLANGroupsHandler, + CreateVLANGroupHandler, + DeleteVLANGroupHandler, + UpdateVLANGroupHandler, +) +from ipam.application.commands import ( + BulkCreateVLANGroupsCommand, + BulkDeleteVLANGroupsCommand, + BulkUpdateVLANGroupItem, + BulkUpdateVLANGroupsCommand, + CreateVLANGroupCommand, + DeleteVLANGroupCommand, + UpdateVLANGroupCommand, +) +from ipam.application.queries import GetVLANGroupQuery, ListVLANGroupsQuery +from ipam.application.query_handlers import GetVLANGroupHandler, 
ListVLANGroupsHandler +from ipam.infrastructure.read_model_repository import PostgresVLANGroupReadModelRepository +from ipam.interface.schemas import ( + BulkCreateResponse, + BulkDeleteRequest, + BulkDeleteResponse, + BulkUpdateResponse, + CreateVLANGroupRequest, + UpdateVLANGroupRequest, + VLANGroupListResponse, + VLANGroupResponse, +) +from ipam.interface.schemas import ( + BulkUpdateVLANGroupItem as BulkUpdateVLANGroupItemSchema, +) + +router = APIRouter(prefix="/vlan-groups", tags=["vlan-groups"]) + + +def _get_session(request: Request): + return request.app.state.database.session() + + +def _get_command_bus(request: Request, session=None) -> CommandBus: + if session is None: + session = _get_session(request) + read_model_repo = PostgresVLANGroupReadModelRepository(session) + event_store = request.app.state.event_store + event_producer = request.app.state.event_producer + + bus = CommandBus() + bus.register(CreateVLANGroupCommand, CreateVLANGroupHandler(event_store, read_model_repo, event_producer)) + bus.register(UpdateVLANGroupCommand, UpdateVLANGroupHandler(event_store, read_model_repo, event_producer)) + bus.register(DeleteVLANGroupCommand, DeleteVLANGroupHandler(event_store, read_model_repo, event_producer)) + bus.register( + BulkCreateVLANGroupsCommand, + BulkCreateVLANGroupsHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + BulkUpdateVLANGroupsCommand, + BulkUpdateVLANGroupsHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + BulkDeleteVLANGroupsCommand, + BulkDeleteVLANGroupsHandler(event_store, read_model_repo, event_producer), + ) + return bus + + +def _get_query_bus(request: Request, session=None) -> QueryBus: + if session is None: + session = _get_session(request) + read_model_repo = PostgresVLANGroupReadModelRepository(session) + + bus = QueryBus() + bus.register(GetVLANGroupQuery, GetVLANGroupHandler(read_model_repo)) + bus.register(ListVLANGroupsQuery, ListVLANGroupsHandler(read_model_repo)) + return bus + + +@router.post( + "", + status_code=status.HTTP_201_CREATED, + response_model=VLANGroupResponse, +) +async def create_vlan_group( + body: CreateVLANGroupRequest, + request: Request, +) -> VLANGroupResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + query_bus = _get_query_bus(request, session) + vlan_group_id = await command_bus.dispatch(CreateVLANGroupCommand(**body.model_dump())) + await session.commit() + result = await query_bus.dispatch(GetVLANGroupQuery(vlan_group_id=vlan_group_id)) + return VLANGroupResponse(**result.model_dump()) + + +@router.get("", response_model=VLANGroupListResponse) +async def list_vlan_groups( + params: OffsetParams = Depends(), # noqa: B008 + tenant_id: UUID | None = None, + description_contains: str | None = None, + tag_slugs: list[str] | None = QueryParam(None), # noqa: B008 + custom_fields: str | None = None, + created_after: datetime | None = None, + created_before: datetime | None = None, + updated_after: datetime | None = None, + updated_before: datetime | None = None, + sort_by: str | None = None, + sort_dir: str = "asc", + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> VLANGroupListResponse: + custom_field_filters = json.loads(custom_fields) if custom_fields else None + items, total = await query_bus.dispatch( + ListVLANGroupsQuery( + offset=params.offset, + limit=params.limit, + tenant_id=tenant_id, + description_contains=description_contains, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + 
created_after=created_after, + created_before=created_before, + updated_after=updated_after, + updated_before=updated_before, + sort_by=sort_by, + sort_dir=sort_dir, + ) + ) + return VLANGroupListResponse( + items=[VLANGroupResponse(**i.model_dump()) for i in items], + total=total, + offset=params.offset, + limit=params.limit, + ) + + +@router.patch("/bulk", response_model=BulkUpdateResponse) +async def bulk_update_vlan_groups( + body: list[BulkUpdateVLANGroupItemSchema], + request: Request, +) -> BulkUpdateResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + updated = await command_bus.dispatch( + BulkUpdateVLANGroupsCommand( + items=[ + BulkUpdateVLANGroupItem(vlan_group_id=i.id, **i.model_dump(exclude={"id"}, exclude_unset=True)) + for i in body + ] + ) + ) + await session.commit() + return BulkUpdateResponse(updated=updated) + + +@router.delete("/bulk", response_model=BulkDeleteResponse) +async def bulk_delete_vlan_groups( + body: BulkDeleteRequest, + request: Request, +) -> BulkDeleteResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + deleted = await command_bus.dispatch(BulkDeleteVLANGroupsCommand(ids=body.ids)) + await session.commit() + return BulkDeleteResponse(deleted=deleted) + + +@router.get("/{vlan_group_id}", response_model=VLANGroupResponse) +async def get_vlan_group( + vlan_group_id: UUID, + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> VLANGroupResponse: + result = await query_bus.dispatch(GetVLANGroupQuery(vlan_group_id=vlan_group_id)) + return VLANGroupResponse(**result.model_dump()) + + +@router.patch("/{vlan_group_id}", response_model=VLANGroupResponse) +async def update_vlan_group( + vlan_group_id: UUID, + body: UpdateVLANGroupRequest, + request: Request, +) -> VLANGroupResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + query_bus = _get_query_bus(request, session) + await command_bus.dispatch( + UpdateVLANGroupCommand(vlan_group_id=vlan_group_id, **body.model_dump(exclude_unset=True)) + ) + await session.commit() + result = await query_bus.dispatch(GetVLANGroupQuery(vlan_group_id=vlan_group_id)) + return VLANGroupResponse(**result.model_dump()) + + +@router.delete("/{vlan_group_id}", status_code=status.HTTP_204_NO_CONTENT) +async def delete_vlan_group( + vlan_group_id: UUID, + request: Request, +) -> None: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + await command_bus.dispatch(DeleteVLANGroupCommand(vlan_group_id=vlan_group_id)) + await session.commit() + + +@router.post( + "/bulk", + status_code=status.HTTP_201_CREATED, + response_model=BulkCreateResponse, +) +async def bulk_create_vlan_groups( + body: list[CreateVLANGroupRequest], + request: Request, +) -> BulkCreateResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + ids = await command_bus.dispatch( + BulkCreateVLANGroupsCommand(items=[CreateVLANGroupCommand(**i.model_dump()) for i in body]) + ) + await session.commit() + return BulkCreateResponse(ids=ids, count=len(ids)) diff --git a/services/ipam/src/ipam/interface/routers/vlan_router.py b/services/ipam/src/ipam/interface/routers/vlan_router.py new file mode 100644 index 0000000..69edd4f --- /dev/null +++ b/services/ipam/src/ipam/interface/routers/vlan_router.py @@ -0,0 +1,259 @@ +import json +from datetime import datetime +from uuid import UUID + +from fastapi import APIRouter, Depends, Request, status +from fastapi import 
Query as QueryParam +from shared.api.pagination import OffsetParams +from shared.cqrs.bus import CommandBus, QueryBus + +from ipam.application.command_handlers import ( + BulkCreateVLANsHandler, + BulkDeleteVLANsHandler, + BulkUpdateVLANsHandler, + ChangeVLANStatusHandler, + CreateVLANHandler, + DeleteVLANHandler, + UpdateVLANHandler, +) +from ipam.application.commands import ( + BulkCreateVLANsCommand, + BulkDeleteVLANsCommand, + BulkUpdateVLANItem, + BulkUpdateVLANsCommand, + ChangeVLANStatusCommand, + CreateVLANCommand, + DeleteVLANCommand, + UpdateVLANCommand, +) +from ipam.application.queries import GetVLANQuery, ListVLANsQuery +from ipam.application.query_handlers import GetVLANHandler, ListVLANsHandler +from ipam.infrastructure.read_model_repository import PostgresVLANReadModelRepository +from ipam.interface.schemas import ( + BulkCreateResponse, + BulkDeleteRequest, + BulkDeleteResponse, + BulkUpdateResponse, + ChangeStatusRequest, + CreateVLANRequest, + UpdateVLANRequest, + VLANListResponse, + VLANResponse, +) +from ipam.interface.schemas import ( + BulkUpdateVLANItem as BulkUpdateVLANItemSchema, +) + +router = APIRouter(prefix="/vlans", tags=["vlans"]) + + +def _get_session(request: Request): + return request.app.state.database.session() + + +def _get_command_bus(request: Request, session=None) -> CommandBus: + if session is None: + session = _get_session(request) + read_model_repo = PostgresVLANReadModelRepository(session) + event_store = request.app.state.event_store + event_producer = request.app.state.event_producer + + bus = CommandBus() + bus.register( + CreateVLANCommand, + CreateVLANHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + UpdateVLANCommand, + UpdateVLANHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + ChangeVLANStatusCommand, + ChangeVLANStatusHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + DeleteVLANCommand, + DeleteVLANHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + BulkCreateVLANsCommand, + BulkCreateVLANsHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + BulkUpdateVLANsCommand, + BulkUpdateVLANsHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + BulkDeleteVLANsCommand, + BulkDeleteVLANsHandler(event_store, read_model_repo, event_producer), + ) + return bus + + +def _get_query_bus(request: Request, session=None) -> QueryBus: + if session is None: + session = _get_session(request) + read_model_repo = PostgresVLANReadModelRepository(session) + + bus = QueryBus() + bus.register(GetVLANQuery, GetVLANHandler(read_model_repo)) + bus.register(ListVLANsQuery, ListVLANsHandler(read_model_repo)) + return bus + + +@router.post( + "", + status_code=status.HTTP_201_CREATED, + response_model=VLANResponse, +) +async def create_vlan( + body: CreateVLANRequest, + request: Request, +) -> VLANResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + query_bus = _get_query_bus(request, session) + vlan_id = await command_bus.dispatch(CreateVLANCommand(**body.model_dump())) + await session.commit() + result = await query_bus.dispatch(GetVLANQuery(vlan_id=vlan_id)) + return VLANResponse(**result.model_dump()) + + +@router.get("", response_model=VLANListResponse) +async def list_vlans( + params: OffsetParams = Depends(), # noqa: B008 + group_id: UUID | None = None, + status_filter: str | None = None, + tenant_id: UUID | None = None, + role: str | None = None, + description_contains: 
str | None = None, + tag_slugs: list[str] | None = QueryParam(None), # noqa: B008 + custom_fields: str | None = None, + created_after: datetime | None = None, + created_before: datetime | None = None, + updated_after: datetime | None = None, + updated_before: datetime | None = None, + sort_by: str | None = None, + sort_dir: str = "asc", + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> VLANListResponse: + custom_field_filters = json.loads(custom_fields) if custom_fields else None + items, total = await query_bus.dispatch( + ListVLANsQuery( + offset=params.offset, + limit=params.limit, + group_id=group_id, + status=status_filter, + tenant_id=tenant_id, + role=role, + description_contains=description_contains, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + created_after=created_after, + created_before=created_before, + updated_after=updated_after, + updated_before=updated_before, + sort_by=sort_by, + sort_dir=sort_dir, + ) + ) + return VLANListResponse( + items=[VLANResponse(**i.model_dump()) for i in items], + total=total, + offset=params.offset, + limit=params.limit, + ) + + +@router.patch("/bulk", response_model=BulkUpdateResponse) +async def bulk_update_vlans( + body: list[BulkUpdateVLANItemSchema], + request: Request, +) -> BulkUpdateResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + updated = await command_bus.dispatch( + BulkUpdateVLANsCommand( + items=[BulkUpdateVLANItem(vlan_id=i.id, **i.model_dump(exclude={"id"}, exclude_unset=True)) for i in body] + ) + ) + await session.commit() + return BulkUpdateResponse(updated=updated) + + +@router.delete("/bulk", response_model=BulkDeleteResponse) +async def bulk_delete_vlans( + body: BulkDeleteRequest, + request: Request, +) -> BulkDeleteResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + deleted = await command_bus.dispatch(BulkDeleteVLANsCommand(ids=body.ids)) + await session.commit() + return BulkDeleteResponse(deleted=deleted) + + +@router.get("/{vlan_id}", response_model=VLANResponse) +async def get_vlan( + vlan_id: UUID, + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> VLANResponse: + result = await query_bus.dispatch(GetVLANQuery(vlan_id=vlan_id)) + return VLANResponse(**result.model_dump()) + + +@router.patch("/{vlan_id}", response_model=VLANResponse) +async def update_vlan( + vlan_id: UUID, + body: UpdateVLANRequest, + request: Request, +) -> VLANResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + query_bus = _get_query_bus(request, session) + await command_bus.dispatch(UpdateVLANCommand(vlan_id=vlan_id, **body.model_dump(exclude_unset=True))) + await session.commit() + result = await query_bus.dispatch(GetVLANQuery(vlan_id=vlan_id)) + return VLANResponse(**result.model_dump()) + + +@router.post("/{vlan_id}/status", response_model=VLANResponse) +async def change_vlan_status( + vlan_id: UUID, + body: ChangeStatusRequest, + request: Request, +) -> VLANResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + query_bus = _get_query_bus(request, session) + await command_bus.dispatch(ChangeVLANStatusCommand(vlan_id=vlan_id, status=body.status)) + await session.commit() + result = await query_bus.dispatch(GetVLANQuery(vlan_id=vlan_id)) + return VLANResponse(**result.model_dump()) + + +@router.delete("/{vlan_id}", status_code=status.HTTP_204_NO_CONTENT) +async def delete_vlan( + vlan_id: UUID, + request: 
Request, +) -> None: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + await command_bus.dispatch(DeleteVLANCommand(vlan_id=vlan_id)) + await session.commit() + + +@router.post( + "/bulk", + status_code=status.HTTP_201_CREATED, + response_model=BulkCreateResponse, +) +async def bulk_create_vlans( + body: list[CreateVLANRequest], + request: Request, +) -> BulkCreateResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + ids = await command_bus.dispatch(BulkCreateVLANsCommand(items=[CreateVLANCommand(**i.model_dump()) for i in body])) + await session.commit() + return BulkCreateResponse(ids=ids, count=len(ids)) diff --git a/services/ipam/src/ipam/interface/routers/vrf_router.py b/services/ipam/src/ipam/interface/routers/vrf_router.py new file mode 100644 index 0000000..cf74334 --- /dev/null +++ b/services/ipam/src/ipam/interface/routers/vrf_router.py @@ -0,0 +1,222 @@ +import json +from datetime import datetime +from uuid import UUID + +from fastapi import APIRouter, Depends, Request, status +from fastapi import Query as QueryParam +from shared.api.pagination import OffsetParams +from shared.cqrs.bus import CommandBus, QueryBus + +from ipam.application.command_handlers import ( + BulkCreateVRFsHandler, + BulkDeleteVRFsHandler, + BulkUpdateVRFsHandler, + CreateVRFHandler, + DeleteVRFHandler, + UpdateVRFHandler, +) +from ipam.application.commands import ( + BulkCreateVRFsCommand, + BulkDeleteVRFsCommand, + BulkUpdateVRFItem, + BulkUpdateVRFsCommand, + CreateVRFCommand, + DeleteVRFCommand, + UpdateVRFCommand, +) +from ipam.application.queries import GetVRFQuery, ListVRFsQuery +from ipam.application.query_handlers import GetVRFHandler, ListVRFsHandler +from ipam.infrastructure.read_model_repository import PostgresVRFReadModelRepository +from ipam.interface.schemas import ( + BulkCreateResponse, + BulkDeleteRequest, + BulkDeleteResponse, + BulkUpdateResponse, + CreateVRFRequest, + UpdateVRFRequest, + VRFListResponse, + VRFResponse, +) +from ipam.interface.schemas import ( + BulkUpdateVRFItem as BulkUpdateVRFItemSchema, +) + +router = APIRouter(prefix="/vrfs", tags=["vrfs"]) + + +def _get_session(request: Request): + return request.app.state.database.session() + + +def _get_command_bus(request: Request, session=None) -> CommandBus: + if session is None: + session = _get_session(request) + read_model_repo = PostgresVRFReadModelRepository(session) + event_store = request.app.state.event_store + event_producer = request.app.state.event_producer + + bus = CommandBus() + bus.register(CreateVRFCommand, CreateVRFHandler(event_store, read_model_repo, event_producer)) + bus.register(UpdateVRFCommand, UpdateVRFHandler(event_store, read_model_repo, event_producer)) + bus.register(DeleteVRFCommand, DeleteVRFHandler(event_store, read_model_repo, event_producer)) + bus.register( + BulkCreateVRFsCommand, + BulkCreateVRFsHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + BulkUpdateVRFsCommand, + BulkUpdateVRFsHandler(event_store, read_model_repo, event_producer), + ) + bus.register( + BulkDeleteVRFsCommand, + BulkDeleteVRFsHandler(event_store, read_model_repo, event_producer), + ) + return bus + + +def _get_query_bus(request: Request, session=None) -> QueryBus: + if session is None: + session = _get_session(request) + read_model_repo = PostgresVRFReadModelRepository(session) + + bus = QueryBus() + bus.register(GetVRFQuery, GetVRFHandler(read_model_repo)) + bus.register(ListVRFsQuery, 
ListVRFsHandler(read_model_repo)) + return bus + + +@router.post( + "", + status_code=status.HTTP_201_CREATED, + response_model=VRFResponse, +) +async def create_vrf( + body: CreateVRFRequest, + request: Request, +) -> VRFResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + query_bus = _get_query_bus(request, session) + vrf_id = await command_bus.dispatch(CreateVRFCommand(**body.model_dump())) + await session.commit() + result = await query_bus.dispatch(GetVRFQuery(vrf_id=vrf_id)) + return VRFResponse(**result.model_dump()) + + +@router.get("", response_model=VRFListResponse) +async def list_vrfs( + params: OffsetParams = Depends(), # noqa: B008 + tenant_id: UUID | None = None, + description_contains: str | None = None, + tag_slugs: list[str] | None = QueryParam(None), # noqa: B008 + custom_fields: str | None = None, + created_after: datetime | None = None, + created_before: datetime | None = None, + updated_after: datetime | None = None, + updated_before: datetime | None = None, + sort_by: str | None = None, + sort_dir: str = "asc", + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> VRFListResponse: + custom_field_filters = json.loads(custom_fields) if custom_fields else None + items, total = await query_bus.dispatch( + ListVRFsQuery( + offset=params.offset, + limit=params.limit, + tenant_id=tenant_id, + description_contains=description_contains, + tag_slugs=tag_slugs, + custom_field_filters=custom_field_filters, + created_after=created_after, + created_before=created_before, + updated_after=updated_after, + updated_before=updated_before, + sort_by=sort_by, + sort_dir=sort_dir, + ) + ) + return VRFListResponse( + items=[VRFResponse(**i.model_dump()) for i in items], + total=total, + offset=params.offset, + limit=params.limit, + ) + + +@router.patch("/bulk", response_model=BulkUpdateResponse) +async def bulk_update_vrfs( + body: list[BulkUpdateVRFItemSchema], + request: Request, +) -> BulkUpdateResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + updated = await command_bus.dispatch( + BulkUpdateVRFsCommand( + items=[BulkUpdateVRFItem(vrf_id=i.id, **i.model_dump(exclude={"id"}, exclude_unset=True)) for i in body] + ) + ) + await session.commit() + return BulkUpdateResponse(updated=updated) + + +@router.delete("/bulk", response_model=BulkDeleteResponse) +async def bulk_delete_vrfs( + body: BulkDeleteRequest, + request: Request, +) -> BulkDeleteResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + deleted = await command_bus.dispatch(BulkDeleteVRFsCommand(ids=body.ids)) + await session.commit() + return BulkDeleteResponse(deleted=deleted) + + +@router.get("/{vrf_id}", response_model=VRFResponse) +async def get_vrf( + vrf_id: UUID, + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> VRFResponse: + result = await query_bus.dispatch(GetVRFQuery(vrf_id=vrf_id)) + return VRFResponse(**result.model_dump()) + + +@router.patch("/{vrf_id}", response_model=VRFResponse) +async def update_vrf( + vrf_id: UUID, + body: UpdateVRFRequest, + request: Request, +) -> VRFResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + query_bus = _get_query_bus(request, session) + await command_bus.dispatch(UpdateVRFCommand(vrf_id=vrf_id, **body.model_dump(exclude_unset=True))) + await session.commit() + result = await query_bus.dispatch(GetVRFQuery(vrf_id=vrf_id)) + return VRFResponse(**result.model_dump()) + + 
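+# Note: the mutating endpoints in this router share one flow: build the buses +# on a per-request session, dispatch a command, commit, then (where a response +# body is returned) re-read through the query bus so the result reflects the +# committed read model.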
+@router.delete("/{vrf_id}", status_code=status.HTTP_204_NO_CONTENT) +async def delete_vrf( + vrf_id: UUID, + request: Request, +) -> None: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + await command_bus.dispatch(DeleteVRFCommand(vrf_id=vrf_id)) + await session.commit() + + +@router.post( + "/bulk", + status_code=status.HTTP_201_CREATED, + response_model=BulkCreateResponse, +) +async def bulk_create_vrfs( + body: list[CreateVRFRequest], + request: Request, +) -> BulkCreateResponse: + session = _get_session(request) + command_bus = _get_command_bus(request, session) + ids = await command_bus.dispatch(BulkCreateVRFsCommand(items=[CreateVRFCommand(**i.model_dump()) for i in body])) + await session.commit() + return BulkCreateResponse(ids=ids, count=len(ids)) diff --git a/services/ipam/src/ipam/interface/schemas.py b/services/ipam/src/ipam/interface/schemas.py new file mode 100644 index 0000000..180a5e9 --- /dev/null +++ b/services/ipam/src/ipam/interface/schemas.py @@ -0,0 +1,648 @@ +from datetime import datetime +from uuid import UUID + +from pydantic import BaseModel + +# --- Shared --- + + +class ChangeStatusRequest(BaseModel): + status: str + + +class BulkCreateResponse(BaseModel): + ids: list[UUID] + count: int + + +class BulkDeleteRequest(BaseModel): + ids: list[UUID] + + +class BulkUpdateResponse(BaseModel): + updated: int + + +class BulkDeleteResponse(BaseModel): + deleted: int + + +# --- Prefix --- + + +class CreatePrefixRequest(BaseModel): + network: str + vrf_id: UUID | None = None + vlan_id: UUID | None = None + status: str = "active" + role: str | None = None + tenant_id: UUID | None = None + description: str = "" + custom_fields: dict = {} + tags: list[UUID] = [] + + +class UpdatePrefixRequest(BaseModel): + description: str | None = None + role: str | None = None + tenant_id: UUID | None = None + vlan_id: UUID | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class BulkUpdatePrefixItem(BaseModel): + id: UUID + description: str | None = None + role: str | None = None + tenant_id: UUID | None = None + vlan_id: UUID | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class PrefixResponse(BaseModel): + id: UUID + network: str + vrf_id: UUID | None + vlan_id: UUID | None + status: str + role: str | None + tenant_id: UUID | None + description: str + custom_fields: dict + tags: list[UUID] + created_at: datetime + updated_at: datetime + + +class PrefixListResponse(BaseModel): + items: list[PrefixResponse] + total: int + offset: int + limit: int + + +# --- IPAddress --- + + +class CreateIPAddressRequest(BaseModel): + address: str + vrf_id: UUID | None = None + status: str = "active" + dns_name: str = "" + tenant_id: UUID | None = None + description: str = "" + custom_fields: dict = {} + tags: list[UUID] = [] + + +class UpdateIPAddressRequest(BaseModel): + dns_name: str | None = None + description: str | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class BulkUpdateIPAddressItem(BaseModel): + id: UUID + dns_name: str | None = None + description: str | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class IPAddressResponse(BaseModel): + id: UUID + address: str + vrf_id: UUID | None + status: str + dns_name: str + tenant_id: UUID | None + description: str + custom_fields: dict + tags: list[UUID] + created_at: datetime + updated_at: datetime + + +class IPAddressListResponse(BaseModel): + items: 
list[IPAddressResponse] + total: int + offset: int + limit: int + + +# --- VRF --- + + +class CreateVRFRequest(BaseModel): + name: str + rd: str | None = None + tenant_id: UUID | None = None + description: str = "" + import_targets: list[UUID] | None = None + export_targets: list[UUID] | None = None + custom_fields: dict = {} + tags: list[UUID] = [] + + +class UpdateVRFRequest(BaseModel): + name: str | None = None + description: str | None = None + import_targets: list[UUID] | None = None + export_targets: list[UUID] | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class BulkUpdateVRFItem(BaseModel): + id: UUID + name: str | None = None + import_targets: list[UUID] | None = None + export_targets: list[UUID] | None = None + description: str | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class VRFResponse(BaseModel): + id: UUID + name: str + rd: str | None + tenant_id: UUID | None + description: str + import_targets: list[UUID] + export_targets: list[UUID] + custom_fields: dict + tags: list[UUID] + created_at: datetime + updated_at: datetime + + +class VRFListResponse(BaseModel): + items: list[VRFResponse] + total: int + offset: int + limit: int + + +# --- VLAN --- + + +class CreateVLANRequest(BaseModel): + vid: int + name: str + group_id: UUID | None = None + status: str = "active" + role: str | None = None + tenant_id: UUID | None = None + description: str = "" + custom_fields: dict = {} + tags: list[UUID] = [] + + +class UpdateVLANRequest(BaseModel): + name: str | None = None + role: str | None = None + description: str | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class BulkUpdateVLANItem(BaseModel): + id: UUID + name: str | None = None + role: str | None = None + description: str | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class VLANResponse(BaseModel): + id: UUID + vid: int + name: str + group_id: UUID | None + status: str + role: str | None + tenant_id: UUID | None + description: str + custom_fields: dict + tags: list[UUID] + created_at: datetime + updated_at: datetime + + +class VLANListResponse(BaseModel): + items: list[VLANResponse] + total: int + offset: int + limit: int + + +# --- IPRange --- + + +class CreateIPRangeRequest(BaseModel): + start_address: str + end_address: str + vrf_id: UUID | None = None + status: str = "active" + tenant_id: UUID | None = None + description: str = "" + custom_fields: dict = {} + tags: list[UUID] = [] + + +class UpdateIPRangeRequest(BaseModel): + description: str | None = None + tenant_id: UUID | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class BulkUpdateIPRangeItem(BaseModel): + id: UUID + description: str | None = None + tenant_id: UUID | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class IPRangeResponse(BaseModel): + id: UUID + start_address: str + end_address: str + vrf_id: UUID | None + status: str + tenant_id: UUID | None + description: str + custom_fields: dict + tags: list[UUID] + created_at: datetime + updated_at: datetime + + +class IPRangeListResponse(BaseModel): + items: list[IPRangeResponse] + total: int + offset: int + limit: int + + +# --- RIR --- + + +class CreateRIRRequest(BaseModel): + name: str + is_private: bool = False + description: str = "" + custom_fields: dict = {} + tags: list[UUID] = [] + + +class UpdateRIRRequest(BaseModel): + description: str | None = None + is_private: bool 
| None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class BulkUpdateRIRItem(BaseModel): + id: UUID + description: str | None = None + is_private: bool | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class RIRResponse(BaseModel): + id: UUID + name: str + is_private: bool + description: str + custom_fields: dict + tags: list[UUID] + created_at: datetime + updated_at: datetime + + +class RIRListResponse(BaseModel): + items: list[RIRResponse] + total: int + offset: int + limit: int + + +# --- ASN --- + + +class CreateASNRequest(BaseModel): + asn: int + rir_id: UUID | None = None + tenant_id: UUID | None = None + description: str = "" + custom_fields: dict = {} + tags: list[UUID] = [] + + +class UpdateASNRequest(BaseModel): + description: str | None = None + tenant_id: UUID | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class BulkUpdateASNItem(BaseModel): + id: UUID + description: str | None = None + tenant_id: UUID | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class ASNResponse(BaseModel): + id: UUID + asn: int + rir_id: UUID | None + tenant_id: UUID | None + description: str + custom_fields: dict + tags: list[UUID] + created_at: datetime + updated_at: datetime + + +class ASNListResponse(BaseModel): + items: list[ASNResponse] + total: int + offset: int + limit: int + + +# --- FHRPGroup --- + + +class CreateFHRPGroupRequest(BaseModel): + protocol: str + group_id_value: int + auth_type: str = "plaintext" + auth_key: str = "" + name: str = "" + description: str = "" + custom_fields: dict = {} + tags: list[UUID] = [] + + +class UpdateFHRPGroupRequest(BaseModel): + name: str | None = None + auth_type: str | None = None + auth_key: str | None = None + description: str | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class BulkUpdateFHRPGroupItem(BaseModel): + id: UUID + name: str | None = None + auth_type: str | None = None + auth_key: str | None = None + description: str | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class FHRPGroupResponse(BaseModel): + id: UUID + protocol: str + group_id_value: int + auth_type: str + name: str + description: str + custom_fields: dict + tags: list[UUID] + created_at: datetime + updated_at: datetime + + +class FHRPGroupListResponse(BaseModel): + items: list[FHRPGroupResponse] + total: int + offset: int + limit: int + + +# --- RouteTarget --- + + +class CreateRouteTargetRequest(BaseModel): + name: str + tenant_id: UUID | None = None + description: str = "" + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class UpdateRouteTargetRequest(BaseModel): + description: str | None = None + tenant_id: UUID | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class BulkUpdateRouteTargetItem(BaseModel): + id: UUID + description: str | None = None + tenant_id: UUID | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class RouteTargetResponse(BaseModel): + id: UUID + name: str + tenant_id: UUID | None + description: str + custom_fields: dict + tags: list[UUID] + created_at: datetime + updated_at: datetime + + +class RouteTargetListResponse(BaseModel): + items: list[RouteTargetResponse] + total: int + offset: int + limit: int + + +# --- VLANGroup --- + + +class CreateVLANGroupRequest(BaseModel): + name: str + slug: str + min_vid: int = 1 + max_vid: int 
= 4094 + tenant_id: UUID | None = None + description: str = "" + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class UpdateVLANGroupRequest(BaseModel): + name: str | None = None + description: str | None = None + min_vid: int | None = None + max_vid: int | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class BulkUpdateVLANGroupItem(BaseModel): + id: UUID + name: str | None = None + description: str | None = None + min_vid: int | None = None + max_vid: int | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class VLANGroupResponse(BaseModel): + id: UUID + name: str + slug: str + min_vid: int + max_vid: int + tenant_id: UUID | None + description: str + custom_fields: dict + tags: list[UUID] + created_at: datetime + updated_at: datetime + + +class VLANGroupListResponse(BaseModel): + items: list[VLANGroupResponse] + total: int + offset: int + limit: int + + +# --- Service --- + + +class CreateServiceRequest(BaseModel): + name: str + protocol: str = "tcp" + ports: list[int] = [] + ip_addresses: list[UUID] = [] + description: str = "" + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class UpdateServiceRequest(BaseModel): + name: str | None = None + protocol: str | None = None + ports: list[int] | None = None + ip_addresses: list[UUID] | None = None + description: str | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class BulkUpdateServiceItem(BaseModel): + id: UUID + name: str | None = None + protocol: str | None = None + ports: list[int] | None = None + ip_addresses: list[UUID] | None = None + description: str | None = None + custom_fields: dict | None = None + tags: list[UUID] | None = None + + +class ServiceResponse(BaseModel): + id: UUID + name: str + protocol: str + ports: list[int] + ip_addresses: list[UUID] + description: str + custom_fields: dict + tags: list[UUID] + created_at: datetime + updated_at: datetime + + +class ServiceListResponse(BaseModel): + items: list[ServiceResponse] + total: int + offset: int + limit: int + + +# --- Saved Filter --- + + +class CreateSavedFilterRequest(BaseModel): + name: str + entity_type: str + filter_config: dict = {} + is_default: bool = False + + +class UpdateSavedFilterRequest(BaseModel): + name: str | None = None + filter_config: dict | None = None + is_default: bool | None = None + + +class SavedFilterResponse(BaseModel): + id: UUID + user_id: UUID + name: str + entity_type: str + filter_config: dict + is_default: bool + created_at: datetime + updated_at: datetime + + +class SavedFilterListResponse(BaseModel): + items: list[SavedFilterResponse] + + +# --- Global Search --- + + +class SearchResultResponse(BaseModel): + entity_type: str + entity_id: UUID + display_text: str + description: str + relevance: float + + +class GlobalSearchResponse(BaseModel): + results: list[SearchResultResponse] + total: int + + +# --- Import/Export --- + + +class ImportRowErrorSchema(BaseModel): + row: int + field: str + error: str + + +class ImportResponse(BaseModel): + imported: int + failed: int + errors: list[ImportRowErrorSchema] diff --git a/services/ipam/tests/__init__.py b/services/ipam/tests/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/services/ipam/tests/test_application/__init__.py b/services/ipam/tests/test_application/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/services/ipam/tests/test_application/test_import_export.py 
b/services/ipam/tests/test_application/test_import_export.py new file mode 100644 index 0000000..7b645c5 --- /dev/null +++ b/services/ipam/tests/test_application/test_import_export.py @@ -0,0 +1,174 @@ +"""Tests for import/export services.""" + +import json +from datetime import UTC, datetime +from uuid import uuid4 + +from ipam.application.export_service import export_csv, export_json, export_yaml +from ipam.application.import_service import VALID_ENTITY_TYPES, parse_csv + + +class TestParseCSV: + def test_basic_prefix_csv(self) -> None: + csv_content = "network,status,description\n10.0.0.0/24,active,Test network\n192.168.1.0/24,reserved,Lab\n" + items, errors = parse_csv(csv_content) + assert len(items) == 2 + assert len(errors) == 0 + assert items[0]["network"] == "10.0.0.0/24" + assert items[0]["status"] == "active" + assert items[1]["description"] == "Lab" + + def test_uuid_field_conversion(self) -> None: + uid = str(uuid4()) + csv_content = f"network,vrf_id\n10.0.0.0/24,{uid}\n" + items, errors = parse_csv(csv_content) + assert len(items) == 1 + assert str(items[0]["vrf_id"]) == uid + + def test_int_field_conversion(self) -> None: + csv_content = "vid,name\n100,VLAN100\n" + items, errors = parse_csv(csv_content) + assert len(items) == 1 + assert items[0]["vid"] == 100 + + def test_bool_field_conversion(self) -> None: + csv_content = "name,is_private\nRIPE,true\nARIN,false\n" + items, errors = parse_csv(csv_content) + assert len(items) == 2 + assert items[0]["is_private"] is True + assert items[1]["is_private"] is False + + def test_json_field_conversion(self) -> None: + import csv as csv_mod + import io + + tag_id = str(uuid4()) + rows = [ + {"network": "10.0.0.0/24", "custom_fields": '{"env": "prod"}', "tags": f'["{tag_id}"]'}, + ] + output = io.StringIO() + writer = csv_mod.DictWriter(output, fieldnames=["network", "custom_fields", "tags"]) + writer.writeheader() + writer.writerows(rows) + csv_content = output.getvalue() + + items, errors = parse_csv(csv_content) + assert len(errors) == 0 + assert len(items) == 1 + assert items[0]["custom_fields"] == {"env": "prod"} + assert isinstance(items[0]["tags"], list) + + def test_empty_values_skipped(self) -> None: + csv_content = "network,vrf_id,description\n10.0.0.0/24,,\n" + items, errors = parse_csv(csv_content) + assert len(items) == 1 + assert "vrf_id" not in items[0] + assert "description" not in items[0] + + def test_invalid_uuid_produces_error(self) -> None: + csv_content = "network,vrf_id\n10.0.0.0/24,not-a-uuid\n" + items, errors = parse_csv(csv_content) + assert len(items) == 0 + assert len(errors) == 1 + assert errors[0].row == 2 + assert errors[0].field == "vrf_id" + + def test_invalid_int_produces_error(self) -> None: + csv_content = "vid,name\nabc,VLAN\n" + items, errors = parse_csv(csv_content) + assert len(items) == 0 + assert len(errors) == 1 + assert errors[0].field == "vid" + + def test_empty_csv(self) -> None: + csv_content = "network,status\n" + items, errors = parse_csv(csv_content) + assert len(items) == 0 + assert len(errors) == 0 + + def test_partial_errors(self) -> None: + csv_content = "network,vrf_id\n10.0.0.0/24,\n192.168.0.0/16,bad-uuid\n172.16.0.0/12,\n" + items, errors = parse_csv(csv_content) + assert len(items) == 2 # data rows 1 and 3 succeed (empty vrf_id values are skipped) + assert len(errors) == 1 # the bad-uuid row fails; parse_csv reports it as row 3, counting the header line + + +class TestValidEntityTypes: + def test_all_entity_types(self) -> None: + expected = { + "prefix", + "ip_address", + "vrf", + "vlan", + "ip_range", + "rir", + "asn", + "fhrp_group", + "route_target", + "vlan_group", 
"service", + } + assert expected == VALID_ENTITY_TYPES + + +class TestExportCSV: + def test_basic_export(self) -> None: + items = [ + {"network": "10.0.0.0/24", "status": "active"}, + {"network": "192.168.0.0/16", "status": "reserved"}, + ] + result = export_csv(items) + lines = result.strip().split("\n") + assert len(lines) == 3 # header + 2 rows + assert "network" in lines[0] + assert "10.0.0.0/24" in lines[1] + + def test_empty_export(self) -> None: + assert export_csv([]) == "" + + def test_uuid_serialization(self) -> None: + uid = uuid4() + items = [{"id": uid, "name": "test"}] + result = export_csv(items) + assert str(uid) in result + + def test_datetime_serialization(self) -> None: + now = datetime.now(UTC) + items = [{"name": "test", "created_at": now}] + result = export_csv(items) + assert now.isoformat() in result + + def test_dict_serialization(self) -> None: + items = [{"name": "test", "custom_fields": {"env": "prod"}}] + result = export_csv(items) + assert '"env"' in result + + +class TestExportJSON: + def test_basic_export(self) -> None: + items = [{"network": "10.0.0.0/24", "status": "active"}] + result = export_json(items) + parsed = json.loads(result) + assert len(parsed) == 1 + assert parsed[0]["network"] == "10.0.0.0/24" + + def test_uuid_serialization(self) -> None: + uid = uuid4() + items = [{"id": uid}] + result = export_json(items) + parsed = json.loads(result) + assert parsed[0]["id"] == str(uid) + + +class TestExportYAML: + def test_basic_export(self) -> None: + items = [{"network": "10.0.0.0/24", "status": "active"}] + result = export_yaml(items) + assert "network:" in result or "network: " in result + assert "10.0.0.0/24" in result + + def test_uuid_serialization(self) -> None: + uid = uuid4() + items = [{"id": uid}] + result = export_yaml(items) + assert str(uid) in result diff --git a/services/ipam/tests/test_application/test_query_handlers.py b/services/ipam/tests/test_application/test_query_handlers.py new file mode 100644 index 0000000..f96dd5c --- /dev/null +++ b/services/ipam/tests/test_application/test_query_handlers.py @@ -0,0 +1,113 @@ +"""Tests for common filter builder and query handler helpers.""" + +from datetime import UTC, datetime +from uuid import uuid4 + +from ipam.application.queries import BaseListQuery, ListPrefixesQuery +from ipam.application.query_handlers import _build_common_filters +from shared.api.filtering import FilterOperator + + +class TestBuildCommonFilters: + def test_empty_query_returns_no_filters(self) -> None: + query = BaseListQuery() + filters, sort_params, tag_slugs, custom_field_filters = _build_common_filters(query) + assert filters == [] + assert sort_params is None + assert tag_slugs is None + assert custom_field_filters is None + + def test_description_contains_produces_ilike_filter(self) -> None: + query = BaseListQuery(description_contains="test") + filters, _, _, _ = _build_common_filters(query) + assert len(filters) == 1 + assert filters[0].field == "description" + assert filters[0].operator == FilterOperator.ILIKE + assert filters[0].value == "test" + + def test_created_after_produces_gte_filter(self) -> None: + dt = datetime(2024, 1, 1, tzinfo=UTC) + query = BaseListQuery(created_after=dt) + filters, _, _, _ = _build_common_filters(query) + assert len(filters) == 1 + assert filters[0].field == "created_at" + assert filters[0].operator == FilterOperator.GTE + + def test_created_before_produces_lte_filter(self) -> None: + dt = datetime(2024, 12, 31, tzinfo=UTC) + query = BaseListQuery(created_before=dt) + filters, 
_, _, _ = _build_common_filters(query) + assert len(filters) == 1 + assert filters[0].field == "created_at" + assert filters[0].operator == FilterOperator.LTE + + def test_updated_after_and_before(self) -> None: + query = BaseListQuery( + updated_after=datetime(2024, 1, 1, tzinfo=UTC), + updated_before=datetime(2024, 12, 31, tzinfo=UTC), + ) + filters, _, _, _ = _build_common_filters(query) + assert len(filters) == 2 + fields = [f.field for f in filters] + assert "updated_at" in fields + + def test_sort_by_produces_sort_params(self) -> None: + query = BaseListQuery(sort_by="name", sort_dir="desc") + _, sort_params, _, _ = _build_common_filters(query) + assert sort_params is not None + assert len(sort_params) == 1 + assert sort_params[0].field == "name" + assert sort_params[0].direction == "desc" + + def test_tag_slugs_passed_through(self) -> None: + query = BaseListQuery(tag_slugs=["production", "staging"]) + _, _, tag_slugs, _ = _build_common_filters(query) + assert tag_slugs == ["production", "staging"] + + def test_custom_field_filters_passed_through(self) -> None: + query = BaseListQuery(custom_field_filters={"env": "prod", "region": "kr"}) + _, _, _, custom_field_filters = _build_common_filters(query) + assert custom_field_filters == {"env": "prod", "region": "kr"} + + def test_all_filters_combined(self) -> None: + query = BaseListQuery( + description_contains="network", + created_after=datetime(2024, 1, 1, tzinfo=UTC), + sort_by="created_at", + tag_slugs=["prod"], + custom_field_filters={"env": "prod"}, + ) + filters, sort_params, tag_slugs, cf = _build_common_filters(query) + assert len(filters) == 2 # description + created_after + assert sort_params is not None + assert tag_slugs == ["prod"] + assert cf == {"env": "prod"} + + +class TestListPrefixesQueryInheritance: + def test_inherits_base_list_query_fields(self) -> None: + query = ListPrefixesQuery( + offset=10, + limit=25, + vrf_id=uuid4(), + status="active", + tenant_id=uuid4(), + role="management", + description_contains="test", + sort_by="network", + ) + assert query.offset == 10 + assert query.limit == 25 + assert query.role == "management" + assert query.description_contains == "test" + assert query.sort_by == "network" + + def test_defaults(self) -> None: + query = ListPrefixesQuery() + assert query.offset == 0 + assert query.limit == 50 + assert query.vrf_id is None + assert query.status is None + assert query.role is None + assert query.tag_slugs is None + assert query.sort_dir == "asc" diff --git a/services/ipam/tests/test_application/test_saved_filters.py b/services/ipam/tests/test_application/test_saved_filters.py new file mode 100644 index 0000000..48727e4 --- /dev/null +++ b/services/ipam/tests/test_application/test_saved_filters.py @@ -0,0 +1,83 @@ +"""Tests for SavedFilter commands, queries, and DTOs.""" + +from datetime import UTC +from uuid import uuid4 + +from ipam.application.commands import ( + CreateSavedFilterCommand, + DeleteSavedFilterCommand, + UpdateSavedFilterCommand, +) +from ipam.application.dto import SavedFilterDTO +from ipam.application.queries import GetSavedFilterQuery, ListSavedFiltersQuery + + +class TestSavedFilterCommands: + def test_create_command_defaults(self) -> None: + cmd = CreateSavedFilterCommand( + user_id=uuid4(), + name="My Filter", + entity_type="prefix", + ) + assert cmd.filter_config == {} + assert cmd.is_default is False + + def test_create_command_with_config(self) -> None: + config = {"status": "active", "vrf_id": str(uuid4())} + cmd = CreateSavedFilterCommand( + 
user_id=uuid4(), + name="Active Prefixes", + entity_type="prefix", + filter_config=config, + is_default=True, + ) + assert cmd.filter_config == config + assert cmd.is_default is True + + def test_update_command_partial(self) -> None: + cmd = UpdateSavedFilterCommand(filter_id=uuid4(), name="New Name") + assert cmd.name == "New Name" + assert cmd.filter_config is None + assert cmd.is_default is None + + def test_delete_command(self) -> None: + fid = uuid4() + cmd = DeleteSavedFilterCommand(filter_id=fid) + assert cmd.filter_id == fid + + +class TestSavedFilterQueries: + def test_get_query(self) -> None: + fid = uuid4() + q = GetSavedFilterQuery(filter_id=fid) + assert q.filter_id == fid + + def test_list_query_defaults(self) -> None: + uid = uuid4() + q = ListSavedFiltersQuery(user_id=uid) + assert q.user_id == uid + assert q.entity_type is None + + def test_list_query_with_entity_type(self) -> None: + q = ListSavedFiltersQuery(user_id=uuid4(), entity_type="vrf") + assert q.entity_type == "vrf" + + +class TestSavedFilterDTO: + def test_dto_from_dict(self) -> None: + from datetime import datetime + + now = datetime.now(UTC) + data = { + "id": uuid4(), + "user_id": uuid4(), + "name": "Test Filter", + "entity_type": "prefix", + "filter_config": {"status": "active"}, + "is_default": False, + "created_at": now, + "updated_at": now, + } + dto = SavedFilterDTO(**data) + assert dto.name == "Test Filter" + assert dto.filter_config == {"status": "active"} diff --git a/services/ipam/tests/test_application/test_search.py b/services/ipam/tests/test_application/test_search.py new file mode 100644 index 0000000..91daa5c --- /dev/null +++ b/services/ipam/tests/test_application/test_search.py @@ -0,0 +1,62 @@ +"""Tests for global search DTOs and queries.""" + +from uuid import uuid4 + +from ipam.application.dto import GlobalSearchResultDTO, SearchResultDTO +from ipam.application.queries import GlobalSearchQuery + + +class TestGlobalSearchQuery: + def test_defaults(self) -> None: + q = GlobalSearchQuery(q="test") + assert q.q == "test" + assert q.entity_types is None + assert q.offset == 0 + assert q.limit == 20 + + def test_with_entity_types(self) -> None: + q = GlobalSearchQuery(q="prod", entity_types=["prefix", "vrf"], offset=10, limit=5) + assert q.entity_types == ["prefix", "vrf"] + assert q.offset == 10 + assert q.limit == 5 + + +class TestSearchResultDTO: + def test_from_dict(self) -> None: + dto = SearchResultDTO( + entity_type="prefix", + entity_id=uuid4(), + display_text="10.0.0.0/24", + description="Management network", + relevance=0.85, + ) + assert dto.entity_type == "prefix" + assert dto.display_text == "10.0.0.0/24" + + +class TestGlobalSearchResultDTO: + def test_empty_results(self) -> None: + result = GlobalSearchResultDTO(results=[], total=0) + assert result.results == [] + assert result.total == 0 + + def test_with_results(self) -> None: + items = [ + SearchResultDTO( + entity_type="prefix", + entity_id=uuid4(), + display_text="10.0.0.0/24", + description="test", + relevance=0.9, + ), + SearchResultDTO( + entity_type="vrf", + entity_id=uuid4(), + display_text="production", + description="prod vrf", + relevance=0.7, + ), + ] + result = GlobalSearchResultDTO(results=items, total=2) + assert len(result.results) == 2 + assert result.total == 2 diff --git a/services/ipam/tests/test_domain/__init__.py b/services/ipam/tests/test_domain/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/services/ipam/tests/test_domain/test_asn.py b/services/ipam/tests/test_domain/test_asn.py new 
file mode 100644 index 0000000..e395f42 --- /dev/null +++ b/services/ipam/tests/test_domain/test_asn.py @@ -0,0 +1,246 @@ +"""Unit tests for the ASN aggregate root.""" + +from uuid import uuid4 + +import pytest +from ipam.domain.asn import ASN +from ipam.domain.events import ( + ASNCreated, + ASNDeleted, + ASNUpdated, +) +from ipam.domain.value_objects import ASNumber +from pydantic import ValidationError +from shared.domain.exceptions import BusinessRuleViolationError + +# --------------------------------------------------------------------------- +# Helpers +# --------------------------------------------------------------------------- + + +def make_asn( + asn: int = 65001, + rir_id=None, + tenant_id=None, + description: str = "", + custom_fields: dict | None = None, + tags: list | None = None, +) -> ASN: + return ASN.create( + asn=asn, + rir_id=rir_id, + tenant_id=tenant_id, + description=description, + custom_fields=custom_fields, + tags=tags, + ) + + +# --------------------------------------------------------------------------- +# create() +# --------------------------------------------------------------------------- + + +class TestASNCreate: + def test_create_returns_asn_instance(self): + aggregate = make_asn() + assert isinstance(aggregate, ASN) + + def test_create_sets_asn(self): + aggregate = make_asn(asn=65000) + assert isinstance(aggregate.asn, ASNumber) + assert aggregate.asn.asn == 65000 + + def test_create_with_rir_id(self): + rir_id = uuid4() + aggregate = make_asn(rir_id=rir_id) + assert aggregate.rir_id == rir_id + + def test_create_with_custom_fields_and_tags(self): + tag_id = uuid4() + aggregate = make_asn(custom_fields={"provider": "aws"}, tags=[tag_id]) + assert aggregate.custom_fields == {"provider": "aws"} + assert aggregate.tags == [tag_id] + + def test_create_emits_asn_created_event(self): + aggregate = make_asn() + events = aggregate.collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], ASNCreated) + + def test_create_version_is_1(self): + aggregate = make_asn() + assert aggregate.version == 1 + + def test_create_is_not_deleted(self): + aggregate = make_asn() + assert aggregate._deleted is False + + def test_create_invalid_asn_raises_error(self): + with pytest.raises((ValueError, ValidationError)): + make_asn(asn=0) + + def test_create_asn_too_large_raises_error(self): + with pytest.raises((ValueError, ValidationError)): + make_asn(asn=4294967296) + + def test_create_event_has_correct_aggregate_id(self): + aggregate = make_asn() + events = aggregate.collect_uncommitted_events() + assert events[0].aggregate_id == aggregate.id + + +# --------------------------------------------------------------------------- +# update() +# --------------------------------------------------------------------------- + + +class TestASNUpdate: + def test_update_description(self): + aggregate = make_asn(description="old") + aggregate.collect_uncommitted_events() + aggregate.update(description="new description") + assert aggregate.description == "new description" + + def test_update_custom_fields(self): + aggregate = make_asn() + aggregate.collect_uncommitted_events() + aggregate.update(custom_fields={"provider": "gcp"}) + assert aggregate.custom_fields == {"provider": "gcp"} + + def test_update_tags(self): + tag_id = uuid4() + aggregate = make_asn() + aggregate.collect_uncommitted_events() + aggregate.update(tags=[tag_id]) + assert aggregate.tags == [tag_id] + + def test_update_produces_asn_updated_event(self): + aggregate = make_asn() + 
aggregate.collect_uncommitted_events() + aggregate.update(description="updated") + events = aggregate.collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], ASNUpdated) + + def test_update_increments_version(self): + aggregate = make_asn() + aggregate.collect_uncommitted_events() + aggregate.update(description="v2") + assert aggregate.version == 2 + + def test_update_deleted_raises_error(self): + aggregate = make_asn() + aggregate.collect_uncommitted_events() + aggregate.delete() + with pytest.raises(BusinessRuleViolationError, match="deleted"): + aggregate.update(description="should fail") + + +# --------------------------------------------------------------------------- +# delete() +# --------------------------------------------------------------------------- + + +class TestASNDelete: + def test_delete_marks_as_deleted(self): + aggregate = make_asn() + aggregate.collect_uncommitted_events() + aggregate.delete() + assert aggregate._deleted is True + + def test_delete_produces_asn_deleted_event(self): + aggregate = make_asn() + aggregate.collect_uncommitted_events() + aggregate.delete() + events = aggregate.collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], ASNDeleted) + + def test_delete_increments_version(self): + aggregate = make_asn() + aggregate.collect_uncommitted_events() + aggregate.delete() + assert aggregate.version == 2 + + def test_double_delete_raises_error(self): + aggregate = make_asn() + aggregate.collect_uncommitted_events() + aggregate.delete() + with pytest.raises(BusinessRuleViolationError, match="already deleted"): + aggregate.delete() + + +# --------------------------------------------------------------------------- +# load_from_history() +# --------------------------------------------------------------------------- + + +class TestASNLoadFromHistory: + def test_load_from_history_restores_state(self): + rir_id = uuid4() + original = make_asn(asn=65100, rir_id=rir_id, description="original") + original.update(description="updated") + original.delete() + events = original.collect_uncommitted_events() + + restored = ASN() + restored.load_from_history(events) + + assert restored.asn.asn == 65100 + assert restored.rir_id == rir_id + assert restored.description == "updated" + assert restored._deleted is True + assert restored.version == 3 + + def test_load_from_history_does_not_add_uncommitted_events(self): + aggregate = make_asn() + events = aggregate.collect_uncommitted_events() + + restored = ASN() + restored.load_from_history(events) + + assert restored.collect_uncommitted_events() == [] + + +# --------------------------------------------------------------------------- +# Snapshot round-trip +# --------------------------------------------------------------------------- + + +class TestASNSnapshot: + def test_to_snapshot_returns_dict(self): + aggregate = make_asn() + snap = aggregate.to_snapshot() + assert isinstance(snap, dict) + + def test_snapshot_keys(self): + aggregate = make_asn() + snap = aggregate.to_snapshot() + expected_keys = {"asn", "rir_id", "tenant_id", "description", "custom_fields", "tags", "deleted"} + assert expected_keys == snap.keys() + + def test_snapshot_roundtrip_preserves_state(self): + tag_id = uuid4() + rir_id = uuid4() + tenant_id = uuid4() + aggregate = make_asn( + asn=65200, + rir_id=rir_id, + tenant_id=tenant_id, + description="test", + custom_fields={"provider": "aws"}, + tags=[tag_id], + ) + snap = aggregate.to_snapshot() + restored = ASN.from_snapshot(aggregate.id, snap, 
aggregate.version) + + assert restored.asn.asn == 65200 + assert restored.rir_id == rir_id + assert restored.tenant_id == tenant_id + assert restored.description == "test" + assert restored.custom_fields == {"provider": "aws"} + assert restored.tags == [tag_id] + assert restored.id == aggregate.id + assert restored.version == aggregate.version + assert restored.collect_uncommitted_events() == [] diff --git a/services/ipam/tests/test_domain/test_fhrp_group.py b/services/ipam/tests/test_domain/test_fhrp_group.py new file mode 100644 index 0000000..4dcc2bb --- /dev/null +++ b/services/ipam/tests/test_domain/test_fhrp_group.py @@ -0,0 +1,269 @@ +"""Unit tests for the FHRPGroup aggregate root.""" + +from uuid import uuid4 + +import pytest +from ipam.domain.events import ( + FHRPGroupCreated, + FHRPGroupDeleted, + FHRPGroupUpdated, +) +from ipam.domain.fhrp_group import FHRPGroup +from ipam.domain.value_objects import FHRPAuthType, FHRPProtocol +from shared.domain.exceptions import BusinessRuleViolationError + +# --------------------------------------------------------------------------- +# Helpers +# --------------------------------------------------------------------------- + + +def make_fhrp_group( + protocol: FHRPProtocol = FHRPProtocol.VRRP, + group_id_value: int = 1, + auth_type: FHRPAuthType = FHRPAuthType.PLAINTEXT, + auth_key: str = "", + name: str = "", + description: str = "", + custom_fields: dict | None = None, + tags: list | None = None, +) -> FHRPGroup: + return FHRPGroup.create( + protocol=protocol, + group_id_value=group_id_value, + auth_type=auth_type, + auth_key=auth_key, + name=name, + description=description, + custom_fields=custom_fields, + tags=tags, + ) + + +# --------------------------------------------------------------------------- +# create() +# --------------------------------------------------------------------------- + + +class TestFHRPGroupCreate: + def test_create_returns_fhrp_group_instance(self): + group = make_fhrp_group() + assert isinstance(group, FHRPGroup) + + def test_create_sets_protocol(self): + group = make_fhrp_group(protocol=FHRPProtocol.HSRP) + assert group.protocol == FHRPProtocol.HSRP + + def test_create_sets_group_id_value(self): + group = make_fhrp_group(group_id_value=42) + assert group.group_id_value == 42 + + def test_create_with_custom_fields_and_tags(self): + tag_id = uuid4() + group = make_fhrp_group(custom_fields={"env": "prod"}, tags=[tag_id]) + assert group.custom_fields == {"env": "prod"} + assert group.tags == [tag_id] + + def test_create_emits_fhrp_group_created_event(self): + group = make_fhrp_group() + events = group.collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], FHRPGroupCreated) + + def test_create_version_is_1(self): + group = make_fhrp_group() + assert group.version == 1 + + def test_create_is_not_deleted(self): + group = make_fhrp_group() + assert group._deleted is False + + def test_create_sets_auth_type(self): + group = make_fhrp_group(auth_type=FHRPAuthType.MD5, auth_key="secret") + assert group.auth_type == FHRPAuthType.MD5 + assert group.auth_key == "secret" + + def test_create_event_has_correct_aggregate_id(self): + group = make_fhrp_group() + events = group.collect_uncommitted_events() + assert events[0].aggregate_id == group.id + + +# --------------------------------------------------------------------------- +# update() +# --------------------------------------------------------------------------- + + +class TestFHRPGroupUpdate: + def test_update_name(self): + group = 
make_fhrp_group(name="old") + group.collect_uncommitted_events() + group.update(name="new name") + assert group.name == "new name" + + def test_update_auth_type_and_auth_key(self): + group = make_fhrp_group() + group.collect_uncommitted_events() + group.update(auth_type="md5", auth_key="newsecret") + assert group.auth_type == FHRPAuthType.MD5 + assert group.auth_key == "newsecret" + + def test_update_custom_fields(self): + group = make_fhrp_group() + group.collect_uncommitted_events() + group.update(custom_fields={"priority": "high"}) + assert group.custom_fields == {"priority": "high"} + + def test_update_tags(self): + tag_id = uuid4() + group = make_fhrp_group() + group.collect_uncommitted_events() + group.update(tags=[tag_id]) + assert group.tags == [tag_id] + + def test_update_produces_fhrp_group_updated_event(self): + group = make_fhrp_group() + group.collect_uncommitted_events() + group.update(name="updated") + events = group.collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], FHRPGroupUpdated) + + def test_update_increments_version(self): + group = make_fhrp_group() + group.collect_uncommitted_events() + group.update(name="v2") + assert group.version == 2 + + def test_update_deleted_raises_error(self): + group = make_fhrp_group() + group.collect_uncommitted_events() + group.delete() + with pytest.raises(BusinessRuleViolationError, match="deleted"): + group.update(name="should fail") + + +# --------------------------------------------------------------------------- +# delete() +# --------------------------------------------------------------------------- + + +class TestFHRPGroupDelete: + def test_delete_marks_as_deleted(self): + group = make_fhrp_group() + group.collect_uncommitted_events() + group.delete() + assert group._deleted is True + + def test_delete_produces_fhrp_group_deleted_event(self): + group = make_fhrp_group() + group.collect_uncommitted_events() + group.delete() + events = group.collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], FHRPGroupDeleted) + + def test_delete_increments_version(self): + group = make_fhrp_group() + group.collect_uncommitted_events() + group.delete() + assert group.version == 2 + + def test_double_delete_raises_error(self): + group = make_fhrp_group() + group.collect_uncommitted_events() + group.delete() + with pytest.raises(BusinessRuleViolationError, match="already deleted"): + group.delete() + + +# --------------------------------------------------------------------------- +# load_from_history() +# --------------------------------------------------------------------------- + + +class TestFHRPGroupLoadFromHistory: + def test_load_from_history_restores_state(self): + original = make_fhrp_group( + protocol=FHRPProtocol.HSRP, + group_id_value=10, + name="original", + auth_type=FHRPAuthType.PLAINTEXT, + ) + original.update(name="updated", auth_type="md5", auth_key="secret") + original.delete() + events = original.collect_uncommitted_events() + + restored = FHRPGroup() + restored.load_from_history(events) + + assert restored.protocol == FHRPProtocol.HSRP + assert restored.group_id_value == 10 + assert restored.name == "updated" + assert restored.auth_type == FHRPAuthType.MD5 + assert restored.auth_key == "secret" + assert restored._deleted is True + assert restored.version == 3 + + def test_load_from_history_does_not_add_uncommitted_events(self): + group = make_fhrp_group() + events = group.collect_uncommitted_events() + + restored = FHRPGroup() + 
restored.load_from_history(events) + + assert restored.collect_uncommitted_events() == [] + + +# --------------------------------------------------------------------------- +# Snapshot round-trip +# --------------------------------------------------------------------------- + + +class TestFHRPGroupSnapshot: + def test_to_snapshot_returns_dict(self): + group = make_fhrp_group() + snap = group.to_snapshot() + assert isinstance(snap, dict) + + def test_snapshot_keys(self): + group = make_fhrp_group() + snap = group.to_snapshot() + expected_keys = { + "protocol", + "group_id_value", + "auth_type", + "auth_key", + "name", + "description", + "custom_fields", + "tags", + "deleted", + } + assert expected_keys == snap.keys() + + def test_snapshot_roundtrip_preserves_state(self): + tag_id = uuid4() + group = make_fhrp_group( + protocol=FHRPProtocol.GLBP, + group_id_value=5, + auth_type=FHRPAuthType.MD5, + auth_key="mykey", + name="test-group", + description="test", + custom_fields={"priority": "high"}, + tags=[tag_id], + ) + snap = group.to_snapshot() + restored = FHRPGroup.from_snapshot(group.id, snap, group.version) + + assert restored.protocol == FHRPProtocol.GLBP + assert restored.group_id_value == 5 + assert restored.auth_type == FHRPAuthType.MD5 + assert restored.auth_key == "mykey" + assert restored.name == "test-group" + assert restored.description == "test" + assert restored.custom_fields == {"priority": "high"} + assert restored.tags == [tag_id] + assert restored.id == group.id + assert restored.version == group.version + assert restored.collect_uncommitted_events() == [] diff --git a/services/ipam/tests/test_domain/test_ip_address.py b/services/ipam/tests/test_domain/test_ip_address.py new file mode 100644 index 0000000..d442267 --- /dev/null +++ b/services/ipam/tests/test_domain/test_ip_address.py @@ -0,0 +1,482 @@ +"""Unit tests for the IPAddress aggregate root.""" + +from uuid import UUID, uuid4 + +import pytest +from ipam.domain.events import ( + IPAddressCreated, + IPAddressDeleted, + IPAddressStatusChanged, + IPAddressUpdated, +) +from ipam.domain.ip_address import IPAddress +from ipam.domain.value_objects import IPAddressStatus, IPAddressValue +from pydantic import ValidationError +from shared.domain.exceptions import BusinessRuleViolationError + +# --------------------------------------------------------------------------- +# Helpers +# --------------------------------------------------------------------------- + + +def make_ip( + address: str = "192.168.1.1", + vrf_id: UUID | None = None, + status: IPAddressStatus = IPAddressStatus.ACTIVE, + dns_name: str = "", + tenant_id: UUID | None = None, + description: str = "", +) -> IPAddress: + return IPAddress.create( + address=address, + vrf_id=vrf_id, + status=status, + dns_name=dns_name, + tenant_id=tenant_id, + description=description, + ) + + +# --------------------------------------------------------------------------- +# create() +# --------------------------------------------------------------------------- + + +class TestIPAddressCreate: + def test_create_returns_ip_address_instance(self): + ip = make_ip() + assert isinstance(ip, IPAddress) + + def test_create_sets_address(self): + ip = make_ip(address="10.0.0.1") + assert isinstance(ip.address, IPAddressValue) + assert ip.address.address == "10.0.0.1" + + def test_create_ipv4_address(self): + ip = make_ip(address="192.168.100.200") + assert ip.address.version == 4 + + def test_create_ipv6_address(self): + ip = make_ip(address="2001:db8::1") + assert ip.address.version == 6 + 
assert ip.address.address == "2001:db8::1" + + def test_create_ipv6_loopback(self): + ip = make_ip(address="::1") + assert ip.address.address == "::1" + + def test_create_sets_default_status_active(self): + ip = make_ip() + assert ip.status == IPAddressStatus.ACTIVE + + def test_create_sets_explicit_status(self): + ip = make_ip(status=IPAddressStatus.DHCP) + assert ip.status == IPAddressStatus.DHCP + + def test_create_sets_vrf_id(self): + vrf_id = uuid4() + ip = make_ip(vrf_id=vrf_id) + assert ip.vrf_id == vrf_id + + def test_create_sets_vrf_id_none_by_default(self): + ip = make_ip() + assert ip.vrf_id is None + + def test_create_sets_dns_name(self): + ip = make_ip(dns_name="server.example.com") + assert ip.dns_name == "server.example.com" + + def test_create_sets_empty_dns_name_by_default(self): + ip = make_ip() + assert ip.dns_name == "" + + def test_create_sets_tenant_id(self): + tenant_id = uuid4() + ip = make_ip(tenant_id=tenant_id) + assert ip.tenant_id == tenant_id + + def test_create_sets_description(self): + ip = make_ip(description="Gateway address") + assert ip.description == "Gateway address" + + def test_create_version_is_1(self): + ip = make_ip() + assert ip.version == 1 + + def test_create_is_not_deleted(self): + ip = make_ip() + assert ip._deleted is False + + def test_create_produces_one_event(self): + ip = make_ip() + events = ip.collect_uncommitted_events() + assert len(events) == 1 + + def test_create_event_type_is_ip_address_created(self): + ip = make_ip() + events = ip.collect_uncommitted_events() + assert isinstance(events[0], IPAddressCreated) + + def test_create_event_has_correct_aggregate_id(self): + ip = make_ip() + events = ip.collect_uncommitted_events() + assert events[0].aggregate_id == ip.id + + def test_create_event_has_version_1(self): + ip = make_ip() + events = ip.collect_uncommitted_events() + assert events[0].version == 1 + + def test_create_event_address_matches(self): + ip = make_ip(address="172.16.0.1") + events = ip.collect_uncommitted_events() + assert events[0].address == "172.16.0.1" + + def test_create_with_invalid_address_raises(self): + with pytest.raises((ValueError, ValidationError)): + make_ip(address="not-an-ip") + + def test_create_assigns_unique_ids(self): + ip1 = make_ip() + ip2 = make_ip() + assert ip1.id != ip2.id + + def test_collect_uncommitted_events_clears_queue(self): + ip = make_ip() + ip.collect_uncommitted_events() + assert ip.collect_uncommitted_events() == [] + + def test_create_slaac_status(self): + ip = make_ip(status=IPAddressStatus.SLAAC) + assert ip.status == IPAddressStatus.SLAAC + + +# --------------------------------------------------------------------------- +# update() +# --------------------------------------------------------------------------- + + +class TestIPAddressUpdate: + def test_update_dns_name_changes_dns_name(self): + ip = make_ip(dns_name="old.example.com") + ip.collect_uncommitted_events() + ip.update(dns_name="new.example.com") + assert ip.dns_name == "new.example.com" + + def test_update_description_changes_description(self): + ip = make_ip(description="old") + ip.collect_uncommitted_events() + ip.update(description="new description") + assert ip.description == "new description" + + def test_update_produces_ip_address_updated_event(self): + ip = make_ip() + ip.collect_uncommitted_events() + ip.update(dns_name="host.local") + events = ip.collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], IPAddressUpdated) + + def test_update_increments_version(self): + ip = 
make_ip() + ip.collect_uncommitted_events() + ip.update(description="v2") + assert ip.version == 2 + + def test_update_event_has_correct_aggregate_id(self): + ip = make_ip() + ip.collect_uncommitted_events() + ip.update(description="x") + events = ip.collect_uncommitted_events() + assert events[0].aggregate_id == ip.id + + def test_update_with_none_args_does_not_change_existing_values(self): + ip = make_ip(dns_name="keep.example.com", description="keep this") + ip.collect_uncommitted_events() + ip.update() # all None + assert ip.dns_name == "keep.example.com" + assert ip.description == "keep this" + + def test_update_after_delete_raises_business_rule_violation(self): + ip = make_ip() + ip.collect_uncommitted_events() + ip.delete() + with pytest.raises(BusinessRuleViolationError, match="deleted"): + ip.update(dns_name="should.fail") + + def test_multiple_updates_accumulate_version(self): + ip = make_ip() + ip.collect_uncommitted_events() + ip.update(description="v2") + ip.collect_uncommitted_events() + ip.update(dns_name="v3.example.com") + assert ip.version == 3 + + +# --------------------------------------------------------------------------- +# change_status() +# --------------------------------------------------------------------------- + + +class TestIPAddressChangeStatus: + def test_change_status_updates_status(self): + ip = make_ip(status=IPAddressStatus.ACTIVE) + ip.collect_uncommitted_events() + ip.change_status(IPAddressStatus.RESERVED) + assert ip.status == IPAddressStatus.RESERVED + + def test_change_status_to_dhcp(self): + ip = make_ip(status=IPAddressStatus.ACTIVE) + ip.collect_uncommitted_events() + ip.change_status(IPAddressStatus.DHCP) + assert ip.status == IPAddressStatus.DHCP + + def test_change_status_to_slaac(self): + ip = make_ip(status=IPAddressStatus.ACTIVE) + ip.collect_uncommitted_events() + ip.change_status(IPAddressStatus.SLAAC) + assert ip.status == IPAddressStatus.SLAAC + + def test_change_status_produces_ip_address_status_changed_event(self): + ip = make_ip(status=IPAddressStatus.ACTIVE) + ip.collect_uncommitted_events() + ip.change_status(IPAddressStatus.DEPRECATED) + events = ip.collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], IPAddressStatusChanged) + + def test_change_status_event_contains_old_and_new_status(self): + ip = make_ip(status=IPAddressStatus.ACTIVE) + ip.collect_uncommitted_events() + ip.change_status(IPAddressStatus.RESERVED) + events = ip.collect_uncommitted_events() + assert events[0].old_status == "active" + assert events[0].new_status == "reserved" + + def test_change_status_increments_version(self): + ip = make_ip() + ip.collect_uncommitted_events() + ip.change_status(IPAddressStatus.DEPRECATED) + assert ip.version == 2 + + def test_change_status_to_same_status_raises_business_rule_violation(self): + ip = make_ip(status=IPAddressStatus.ACTIVE) + ip.collect_uncommitted_events() + with pytest.raises(BusinessRuleViolationError, match="already"): + ip.change_status(IPAddressStatus.ACTIVE) + + def test_change_status_after_delete_raises_business_rule_violation(self): + ip = make_ip() + ip.collect_uncommitted_events() + ip.delete() + with pytest.raises(BusinessRuleViolationError, match="deleted"): + ip.change_status(IPAddressStatus.RESERVED) + + +# --------------------------------------------------------------------------- +# delete() +# --------------------------------------------------------------------------- + + +class TestIPAddressDelete: + def test_delete_marks_ip_address_as_deleted(self): + ip = 
make_ip() + ip.collect_uncommitted_events() + ip.delete() + assert ip._deleted is True + + def test_delete_produces_ip_address_deleted_event(self): + ip = make_ip() + ip.collect_uncommitted_events() + ip.delete() + events = ip.collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], IPAddressDeleted) + + def test_delete_increments_version(self): + ip = make_ip() + ip.collect_uncommitted_events() + ip.delete() + assert ip.version == 2 + + def test_delete_twice_raises_business_rule_violation(self): + ip = make_ip() + ip.collect_uncommitted_events() + ip.delete() + with pytest.raises(BusinessRuleViolationError, match="already deleted"): + ip.delete() + + def test_update_after_delete_is_blocked(self): + ip = make_ip() + ip.collect_uncommitted_events() + ip.delete() + with pytest.raises(BusinessRuleViolationError): + ip.update(dns_name="blocked.example.com") + + def test_change_status_after_delete_is_blocked(self): + ip = make_ip() + ip.collect_uncommitted_events() + ip.delete() + with pytest.raises(BusinessRuleViolationError): + ip.change_status(IPAddressStatus.DEPRECATED) + + +# --------------------------------------------------------------------------- +# load_from_history() +# --------------------------------------------------------------------------- + + +class TestIPAddressLoadFromHistory: + def test_load_from_history_restores_address(self): + original = make_ip(address="10.10.10.10") + events = original.collect_uncommitted_events() + + restored = IPAddress() + restored.load_from_history(events) + + assert restored.address.address == "10.10.10.10" + + def test_load_from_history_restores_status(self): + original = make_ip(status=IPAddressStatus.DHCP) + events = original.collect_uncommitted_events() + + restored = IPAddress() + restored.load_from_history(events) + + assert restored.status == IPAddressStatus.DHCP + + def test_load_from_history_restores_after_update(self): + ip = make_ip(dns_name="original.example.com") + ip.update(dns_name="updated.example.com", description="changed") + events = ip.collect_uncommitted_events() + + restored = IPAddress() + restored.load_from_history(events) + + assert restored.dns_name == "updated.example.com" + assert restored.description == "changed" + assert restored.version == 2 + + def test_load_from_history_restores_status_change(self): + ip = make_ip(status=IPAddressStatus.ACTIVE) + ip.change_status(IPAddressStatus.RESERVED) + events = ip.collect_uncommitted_events() + + restored = IPAddress() + restored.load_from_history(events) + + assert restored.status == IPAddressStatus.RESERVED + + def test_load_from_history_restores_deleted_state(self): + ip = make_ip() + ip.delete() + events = ip.collect_uncommitted_events() + + restored = IPAddress() + restored.load_from_history(events) + + assert restored._deleted is True + assert restored.version == 2 + + def test_load_from_history_does_not_add_uncommitted_events(self): + ip = make_ip() + ip.update(description="v2") + events = ip.collect_uncommitted_events() + + restored = IPAddress() + restored.load_from_history(events) + + assert restored.collect_uncommitted_events() == [] + + def test_load_from_history_restores_vrf_id(self): + vrf_id = uuid4() + ip = make_ip(vrf_id=vrf_id) + events = ip.collect_uncommitted_events() + + restored = IPAddress() + restored.load_from_history(events) + + assert restored.vrf_id == vrf_id + + def test_load_from_history_restores_ipv6_address(self): + ip = make_ip(address="fd00::1") + events = ip.collect_uncommitted_events() + + restored = IPAddress() + 
restored.load_from_history(events) + + assert restored.address.address == "fd00::1" + assert restored.address.version == 6 + + +# --------------------------------------------------------------------------- +# Snapshot round-trip +# --------------------------------------------------------------------------- + + +class TestIPAddressSnapshot: + def test_to_snapshot_returns_dict(self): + ip = make_ip() + snap = ip.to_snapshot() + assert isinstance(snap, dict) + + def test_to_snapshot_contains_expected_keys(self): + ip = make_ip() + snap = ip.to_snapshot() + expected = { + "address", + "vrf_id", + "status", + "dns_name", + "tenant_id", + "description", + "custom_fields", + "tags", + "deleted", + } + assert expected == snap.keys() + + def test_snapshot_roundtrip_preserves_address(self): + ip = make_ip(address="10.20.30.40") + snap = ip.to_snapshot() + restored = IPAddress.from_snapshot(ip.id, snap, ip.version) + assert restored.address.address == "10.20.30.40" + + def test_snapshot_roundtrip_preserves_status(self): + ip = make_ip(status=IPAddressStatus.SLAAC) + snap = ip.to_snapshot() + restored = IPAddress.from_snapshot(ip.id, snap, ip.version) + assert restored.status == IPAddressStatus.SLAAC + + def test_snapshot_roundtrip_preserves_dns_name(self): + ip = make_ip(dns_name="host.example.com") + snap = ip.to_snapshot() + restored = IPAddress.from_snapshot(ip.id, snap, ip.version) + assert restored.dns_name == "host.example.com" + + def test_snapshot_roundtrip_preserves_vrf_id(self): + vrf_id = uuid4() + ip = make_ip(vrf_id=vrf_id) + snap = ip.to_snapshot() + restored = IPAddress.from_snapshot(ip.id, snap, ip.version) + assert restored.vrf_id == vrf_id + + def test_snapshot_roundtrip_preserves_aggregate_id(self): + ip = make_ip() + snap = ip.to_snapshot() + restored = IPAddress.from_snapshot(ip.id, snap, ip.version) + assert restored.id == ip.id + + def test_snapshot_roundtrip_preserves_deleted_state(self): + ip = make_ip() + ip.collect_uncommitted_events() + ip.delete() + snap = ip.to_snapshot() + restored = IPAddress.from_snapshot(ip.id, snap, ip.version) + assert restored._deleted is True + + def test_from_snapshot_does_not_produce_uncommitted_events(self): + ip = make_ip() + snap = ip.to_snapshot() + restored = IPAddress.from_snapshot(ip.id, snap, ip.version) + assert restored.collect_uncommitted_events() == [] diff --git a/services/ipam/tests/test_domain/test_ip_range.py b/services/ipam/tests/test_domain/test_ip_range.py new file mode 100644 index 0000000..4516448 --- /dev/null +++ b/services/ipam/tests/test_domain/test_ip_range.py @@ -0,0 +1,314 @@ +"""Unit tests for the IPRange aggregate root.""" + +from uuid import uuid4 + +import pytest +from ipam.domain.events import ( + IPRangeCreated, + IPRangeDeleted, + IPRangeStatusChanged, + IPRangeUpdated, +) +from ipam.domain.ip_range import IPRange +from ipam.domain.value_objects import IPAddressValue, IPRangeStatus +from shared.domain.exceptions import BusinessRuleViolationError + +# --------------------------------------------------------------------------- +# Helpers +# --------------------------------------------------------------------------- + + +def make_ip_range( + start_address: str = "192.168.1.1", + end_address: str = "192.168.1.254", + vrf_id=None, + status: IPRangeStatus = IPRangeStatus.ACTIVE, + tenant_id=None, + description: str = "", + custom_fields: dict | None = None, + tags: list | None = None, +) -> IPRange: + return IPRange.create( + start_address=start_address, + end_address=end_address, + vrf_id=vrf_id, + status=status, + 
tenant_id=tenant_id, + description=description, + custom_fields=custom_fields, + tags=tags, + ) + + +# --------------------------------------------------------------------------- +# create() +# --------------------------------------------------------------------------- + + +class TestIPRangeCreate: + def test_create_returns_ip_range_instance(self): + ip_range = make_ip_range() + assert isinstance(ip_range, IPRange) + + def test_create_sets_start_and_end_address(self): + ip_range = make_ip_range(start_address="10.0.0.1", end_address="10.0.0.100") + assert isinstance(ip_range.start_address, IPAddressValue) + assert ip_range.start_address.address == "10.0.0.1" + assert ip_range.end_address.address == "10.0.0.100" + + def test_create_with_custom_fields_and_tags(self): + tag_id = uuid4() + ip_range = make_ip_range( + custom_fields={"env": "prod"}, + tags=[tag_id], + ) + assert ip_range.custom_fields == {"env": "prod"} + assert ip_range.tags == [tag_id] + + def test_create_start_gte_end_raises_error(self): + with pytest.raises(BusinessRuleViolationError, match="less than"): + make_ip_range(start_address="192.168.1.100", end_address="192.168.1.1") + + def test_create_start_equal_end_raises_error(self): + with pytest.raises(BusinessRuleViolationError, match="less than"): + make_ip_range(start_address="192.168.1.1", end_address="192.168.1.1") + + def test_create_different_ip_versions_raises_error(self): + with pytest.raises(BusinessRuleViolationError, match="same IP version"): + make_ip_range(start_address="192.168.1.1", end_address="2001:db8::1") + + def test_create_emits_ip_range_created_event(self): + ip_range = make_ip_range() + events = ip_range.collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], IPRangeCreated) + + def test_create_version_is_1(self): + ip_range = make_ip_range() + assert ip_range.version == 1 + + def test_create_is_not_deleted(self): + ip_range = make_ip_range() + assert ip_range._deleted is False + + def test_create_event_has_correct_aggregate_id(self): + ip_range = make_ip_range() + events = ip_range.collect_uncommitted_events() + assert events[0].aggregate_id == ip_range.id + + def test_create_assigns_unique_ids(self): + r1 = make_ip_range() + r2 = make_ip_range() + assert r1.id != r2.id + + +# --------------------------------------------------------------------------- +# update() +# --------------------------------------------------------------------------- + + +class TestIPRangeUpdate: + def test_update_description(self): + ip_range = make_ip_range(description="old") + ip_range.collect_uncommitted_events() + ip_range.update(description="new description") + assert ip_range.description == "new description" + + def test_update_custom_fields(self): + ip_range = make_ip_range() + ip_range.collect_uncommitted_events() + ip_range.update(custom_fields={"region": "us-east"}) + assert ip_range.custom_fields == {"region": "us-east"} + + def test_update_tags(self): + tag_id = uuid4() + ip_range = make_ip_range() + ip_range.collect_uncommitted_events() + ip_range.update(tags=[tag_id]) + assert ip_range.tags == [tag_id] + + def test_update_produces_ip_range_updated_event(self): + ip_range = make_ip_range() + ip_range.collect_uncommitted_events() + ip_range.update(description="updated") + events = ip_range.collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], IPRangeUpdated) + + def test_update_increments_version(self): + ip_range = make_ip_range() + ip_range.collect_uncommitted_events() + 
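+        # Flush the IPRangeCreated event first, so the assertion below
+        # observes only the effect of update(): version 1 -> 2.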
ip_range.update(description="v2") + assert ip_range.version == 2 + + def test_update_deleted_raises_error(self): + ip_range = make_ip_range() + ip_range.collect_uncommitted_events() + ip_range.delete() + with pytest.raises(BusinessRuleViolationError, match="deleted"): + ip_range.update(description="should fail") + + +# --------------------------------------------------------------------------- +# change_status() +# --------------------------------------------------------------------------- + + +class TestIPRangeChangeStatus: + def test_change_status(self): + ip_range = make_ip_range(status=IPRangeStatus.ACTIVE) + ip_range.collect_uncommitted_events() + ip_range.change_status(IPRangeStatus.RESERVED) + assert ip_range.status == IPRangeStatus.RESERVED + + def test_change_status_produces_event(self): + ip_range = make_ip_range(status=IPRangeStatus.ACTIVE) + ip_range.collect_uncommitted_events() + ip_range.change_status(IPRangeStatus.DEPRECATED) + events = ip_range.collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], IPRangeStatusChanged) + + def test_change_status_event_contains_old_and_new(self): + ip_range = make_ip_range(status=IPRangeStatus.ACTIVE) + ip_range.collect_uncommitted_events() + ip_range.change_status(IPRangeStatus.RESERVED) + events = ip_range.collect_uncommitted_events() + assert events[0].old_status == "active" + assert events[0].new_status == "reserved" + + def test_same_status_raises_error(self): + ip_range = make_ip_range(status=IPRangeStatus.ACTIVE) + ip_range.collect_uncommitted_events() + with pytest.raises(BusinessRuleViolationError, match="already"): + ip_range.change_status(IPRangeStatus.ACTIVE) + + def test_deleted_raises_error(self): + ip_range = make_ip_range() + ip_range.collect_uncommitted_events() + ip_range.delete() + with pytest.raises(BusinessRuleViolationError, match="deleted"): + ip_range.change_status(IPRangeStatus.RESERVED) + + +# --------------------------------------------------------------------------- +# delete() +# --------------------------------------------------------------------------- + + +class TestIPRangeDelete: + def test_delete_marks_as_deleted(self): + ip_range = make_ip_range() + ip_range.collect_uncommitted_events() + ip_range.delete() + assert ip_range._deleted is True + + def test_delete_produces_ip_range_deleted_event(self): + ip_range = make_ip_range() + ip_range.collect_uncommitted_events() + ip_range.delete() + events = ip_range.collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], IPRangeDeleted) + + def test_delete_increments_version(self): + ip_range = make_ip_range() + ip_range.collect_uncommitted_events() + ip_range.delete() + assert ip_range.version == 2 + + def test_double_delete_raises_error(self): + ip_range = make_ip_range() + ip_range.collect_uncommitted_events() + ip_range.delete() + with pytest.raises(BusinessRuleViolationError, match="already deleted"): + ip_range.delete() + + +# --------------------------------------------------------------------------- +# load_from_history() +# --------------------------------------------------------------------------- + + +class TestIPRangeLoadFromHistory: + def test_load_from_history_restores_state(self): + original = make_ip_range( + start_address="10.0.0.1", + end_address="10.0.0.100", + description="original", + ) + original.update(description="updated") + original.change_status(IPRangeStatus.RESERVED) + original.delete() + events = original.collect_uncommitted_events() + + restored = IPRange() + 
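+        # Replaying all four recorded events (created, updated, status
+        # changed, deleted) should rebuild the full state and leave the
+        # aggregate at version 4.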
restored.load_from_history(events) + + assert restored.start_address.address == "10.0.0.1" + assert restored.end_address.address == "10.0.0.100" + assert restored.description == "updated" + assert restored.status == IPRangeStatus.RESERVED + assert restored._deleted is True + assert restored.version == 4 + + def test_load_from_history_does_not_add_uncommitted_events(self): + ip_range = make_ip_range() + events = ip_range.collect_uncommitted_events() + + restored = IPRange() + restored.load_from_history(events) + + assert restored.collect_uncommitted_events() == [] + + +# --------------------------------------------------------------------------- +# Snapshot round-trip +# --------------------------------------------------------------------------- + + +class TestIPRangeSnapshot: + def test_to_snapshot_returns_dict(self): + ip_range = make_ip_range() + snap = ip_range.to_snapshot() + assert isinstance(snap, dict) + + def test_snapshot_keys(self): + ip_range = make_ip_range() + snap = ip_range.to_snapshot() + expected_keys = { + "start_address", + "end_address", + "vrf_id", + "status", + "tenant_id", + "description", + "custom_fields", + "tags", + "deleted", + } + assert expected_keys == snap.keys() + + def test_snapshot_roundtrip_preserves_state(self): + tag_id = uuid4() + tenant_id = uuid4() + ip_range = make_ip_range( + start_address="10.0.0.1", + end_address="10.0.0.100", + tenant_id=tenant_id, + description="test", + custom_fields={"env": "prod"}, + tags=[tag_id], + ) + snap = ip_range.to_snapshot() + restored = IPRange.from_snapshot(ip_range.id, snap, ip_range.version) + + assert restored.start_address.address == "10.0.0.1" + assert restored.end_address.address == "10.0.0.100" + assert restored.tenant_id == tenant_id + assert restored.description == "test" + assert restored.custom_fields == {"env": "prod"} + assert restored.tags == [tag_id] + assert restored.id == ip_range.id + assert restored.version == ip_range.version + assert restored.collect_uncommitted_events() == [] diff --git a/services/ipam/tests/test_domain/test_prefix.py b/services/ipam/tests/test_domain/test_prefix.py new file mode 100644 index 0000000..e719112 --- /dev/null +++ b/services/ipam/tests/test_domain/test_prefix.py @@ -0,0 +1,481 @@ +"""Unit tests for the Prefix aggregate root.""" + +from uuid import UUID, uuid4 + +import pytest +from ipam.domain.events import ( + PrefixCreated, + PrefixDeleted, + PrefixStatusChanged, + PrefixUpdated, +) +from ipam.domain.prefix import Prefix +from ipam.domain.value_objects import PrefixNetwork, PrefixStatus +from pydantic import ValidationError +from shared.domain.exceptions import BusinessRuleViolationError + +# --------------------------------------------------------------------------- +# Helpers +# --------------------------------------------------------------------------- + + +def make_prefix( + network: str = "192.168.0.0/24", + vrf_id: UUID | None = None, + status: PrefixStatus = PrefixStatus.ACTIVE, + role: str | None = None, + tenant_id: UUID | None = None, + description: str = "", +) -> Prefix: + return Prefix.create( + network=network, + vrf_id=vrf_id, + status=status, + role=role, + tenant_id=tenant_id, + description=description, + ) + + +# --------------------------------------------------------------------------- +# create() +# --------------------------------------------------------------------------- + + +class TestPrefixCreate: + def test_create_returns_prefix_instance(self): + prefix = make_prefix() + assert isinstance(prefix, Prefix) + + def 
test_create_sets_network(self): + prefix = make_prefix(network="10.0.0.0/8") + assert isinstance(prefix.network, PrefixNetwork) + assert prefix.network.network == "10.0.0.0/8" + + def test_create_normalises_host_bits_in_network(self): + prefix = make_prefix(network="10.0.0.5/8") + assert prefix.network.network == "10.0.0.0/8" + + def test_create_sets_default_status_active(self): + prefix = make_prefix() + assert prefix.status == PrefixStatus.ACTIVE + + def test_create_sets_explicit_status(self): + prefix = make_prefix(status=PrefixStatus.RESERVED) + assert prefix.status == PrefixStatus.RESERVED + + def test_create_sets_vrf_id(self): + vrf_id = uuid4() + prefix = make_prefix(vrf_id=vrf_id) + assert prefix.vrf_id == vrf_id + + def test_create_sets_vrf_id_none_by_default(self): + prefix = make_prefix() + assert prefix.vrf_id is None + + def test_create_sets_role(self): + prefix = make_prefix(role="loopback") + assert prefix.role == "loopback" + + def test_create_sets_tenant_id(self): + tenant_id = uuid4() + prefix = make_prefix(tenant_id=tenant_id) + assert prefix.tenant_id == tenant_id + + def test_create_sets_description(self): + prefix = make_prefix(description="Management network") + assert prefix.description == "Management network" + + def test_create_version_is_1(self): + prefix = make_prefix() + assert prefix.version == 1 + + def test_create_is_not_deleted(self): + prefix = make_prefix() + assert prefix._deleted is False + + def test_create_produces_one_event(self): + prefix = make_prefix() + events = prefix.collect_uncommitted_events() + assert len(events) == 1 + + def test_create_event_type_is_prefix_created(self): + prefix = make_prefix() + events = prefix.collect_uncommitted_events() + assert isinstance(events[0], PrefixCreated) + + def test_create_event_has_correct_aggregate_id(self): + prefix = make_prefix() + events = prefix.collect_uncommitted_events() + assert events[0].aggregate_id == prefix.id + + def test_create_event_has_version_1(self): + prefix = make_prefix() + events = prefix.collect_uncommitted_events() + assert events[0].version == 1 + + def test_create_event_network_matches(self): + prefix = make_prefix(network="172.16.0.0/12") + events = prefix.collect_uncommitted_events() + assert events[0].network == "172.16.0.0/12" + + def test_create_ipv6_prefix(self): + prefix = make_prefix(network="2001:db8::/32") + assert prefix.network.version == 6 + assert prefix.network.network == "2001:db8::/32" + + def test_create_ipv6_produces_prefix_created_event(self): + prefix = make_prefix(network="fd00::/8") + events = prefix.collect_uncommitted_events() + assert isinstance(events[0], PrefixCreated) + assert events[0].network == "fd00::/8" + + def test_create_with_invalid_network_raises(self): + with pytest.raises((ValueError, ValidationError)): + make_prefix(network="not-a-network") + + def test_create_assigns_unique_ids(self): + p1 = make_prefix() + p2 = make_prefix() + assert p1.id != p2.id + + def test_collect_uncommitted_events_clears_queue(self): + prefix = make_prefix() + prefix.collect_uncommitted_events() + assert prefix.collect_uncommitted_events() == [] + + +# --------------------------------------------------------------------------- +# update() +# --------------------------------------------------------------------------- + + +class TestPrefixUpdate: + def test_update_description_changes_description(self): + prefix = make_prefix(description="old") + prefix.collect_uncommitted_events() # flush create event + prefix.update(description="new description") + assert 
prefix.description == "new description" + + def test_update_role_changes_role(self): + prefix = make_prefix(role="old-role") + prefix.collect_uncommitted_events() + prefix.update(role="loopback") + assert prefix.role == "loopback" + + def test_update_tenant_id_changes_tenant_id(self): + prefix = make_prefix() + prefix.collect_uncommitted_events() + new_tenant = uuid4() + prefix.update(tenant_id=new_tenant) + assert prefix.tenant_id == new_tenant + + def test_update_produces_prefix_updated_event(self): + prefix = make_prefix() + prefix.collect_uncommitted_events() + prefix.update(description="updated") + events = prefix.collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], PrefixUpdated) + + def test_update_increments_version(self): + prefix = make_prefix() + prefix.collect_uncommitted_events() + prefix.update(description="v2") + assert prefix.version == 2 + + def test_update_event_has_correct_aggregate_id(self): + prefix = make_prefix() + prefix.collect_uncommitted_events() + prefix.update(description="x") + events = prefix.collect_uncommitted_events() + assert events[0].aggregate_id == prefix.id + + def test_update_with_none_args_does_not_change_existing_values(self): + prefix = make_prefix(description="keep", role="keep-role") + prefix.collect_uncommitted_events() + prefix.update() # all None + assert prefix.description == "keep" + assert prefix.role == "keep-role" + + def test_update_after_delete_raises_business_rule_violation(self): + prefix = make_prefix() + prefix.collect_uncommitted_events() + prefix.delete() + with pytest.raises(BusinessRuleViolationError, match="deleted"): + prefix.update(description="should fail") + + +# --------------------------------------------------------------------------- +# change_status() +# --------------------------------------------------------------------------- + + +class TestPrefixChangeStatus: + def test_change_status_updates_status(self): + prefix = make_prefix(status=PrefixStatus.ACTIVE) + prefix.collect_uncommitted_events() + prefix.change_status(PrefixStatus.RESERVED) + assert prefix.status == PrefixStatus.RESERVED + + def test_change_status_produces_prefix_status_changed_event(self): + prefix = make_prefix(status=PrefixStatus.ACTIVE) + prefix.collect_uncommitted_events() + prefix.change_status(PrefixStatus.DEPRECATED) + events = prefix.collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], PrefixStatusChanged) + + def test_change_status_event_contains_old_and_new_status(self): + prefix = make_prefix(status=PrefixStatus.ACTIVE) + prefix.collect_uncommitted_events() + prefix.change_status(PrefixStatus.RESERVED) + events = prefix.collect_uncommitted_events() + assert events[0].old_status == "active" + assert events[0].new_status == "reserved" + + def test_change_status_increments_version(self): + prefix = make_prefix() + prefix.collect_uncommitted_events() + prefix.change_status(PrefixStatus.DEPRECATED) + assert prefix.version == 2 + + def test_change_status_to_same_status_raises_business_rule_violation(self): + prefix = make_prefix(status=PrefixStatus.ACTIVE) + prefix.collect_uncommitted_events() + with pytest.raises(BusinessRuleViolationError, match="already"): + prefix.change_status(PrefixStatus.ACTIVE) + + def test_change_status_after_delete_raises_business_rule_violation(self): + prefix = make_prefix() + prefix.collect_uncommitted_events() + prefix.delete() + with pytest.raises(BusinessRuleViolationError, match="deleted"): + prefix.change_status(PrefixStatus.RESERVED) + + def 
test_change_status_to_container(self): + prefix = make_prefix(status=PrefixStatus.ACTIVE) + prefix.collect_uncommitted_events() + prefix.change_status(PrefixStatus.CONTAINER) + assert prefix.status == PrefixStatus.CONTAINER + + +# --------------------------------------------------------------------------- +# delete() +# --------------------------------------------------------------------------- + + +class TestPrefixDelete: + def test_delete_marks_prefix_as_deleted(self): + prefix = make_prefix() + prefix.collect_uncommitted_events() + prefix.delete() + assert prefix._deleted is True + + def test_delete_produces_prefix_deleted_event(self): + prefix = make_prefix() + prefix.collect_uncommitted_events() + prefix.delete() + events = prefix.collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], PrefixDeleted) + + def test_delete_increments_version(self): + prefix = make_prefix() + prefix.collect_uncommitted_events() + prefix.delete() + assert prefix.version == 2 + + def test_delete_twice_raises_business_rule_violation(self): + prefix = make_prefix() + prefix.collect_uncommitted_events() + prefix.delete() + with pytest.raises(BusinessRuleViolationError, match="already deleted"): + prefix.delete() + + +# --------------------------------------------------------------------------- +# load_from_history() +# --------------------------------------------------------------------------- + + +class TestPrefixLoadFromHistory: + def test_load_from_history_restores_state_after_create(self): + original = make_prefix( + network="10.1.0.0/16", + description="original", + role="transit", + ) + events = original.collect_uncommitted_events() + + restored = Prefix() + restored.load_from_history(events) + + assert restored.network.network == "10.1.0.0/16" + assert restored.description == "original" + assert restored.role == "transit" + assert restored.version == 1 + + def test_load_from_history_restores_state_after_update(self): + prefix = make_prefix(description="old") + prefix.update(description="new", role="loopback") + events = prefix.collect_uncommitted_events() + + restored = Prefix() + restored.load_from_history(events) + + assert restored.description == "new" + assert restored.role == "loopback" + assert restored.version == 2 + + def test_load_from_history_restores_status_change(self): + prefix = make_prefix(status=PrefixStatus.ACTIVE) + prefix.change_status(PrefixStatus.DEPRECATED) + events = prefix.collect_uncommitted_events() + + restored = Prefix() + restored.load_from_history(events) + + assert restored.status == PrefixStatus.DEPRECATED + + def test_load_from_history_restores_deleted_state(self): + prefix = make_prefix() + prefix.delete() + events = prefix.collect_uncommitted_events() + + restored = Prefix() + restored.load_from_history(events) + + assert restored._deleted is True + assert restored.version == 2 + + def test_load_from_history_does_not_add_uncommitted_events(self): + prefix = make_prefix() + prefix.update(description="v2") + events = prefix.collect_uncommitted_events() + + restored = Prefix() + restored.load_from_history(events) + + assert restored.collect_uncommitted_events() == [] + + def test_load_from_history_restores_vrf_id(self): + vrf_id = uuid4() + prefix = make_prefix(vrf_id=vrf_id) + events = prefix.collect_uncommitted_events() + + restored = Prefix() + restored.load_from_history(events) + + assert restored.vrf_id == vrf_id + + def test_load_from_history_restores_tenant_id(self): + tenant_id = uuid4() + prefix = make_prefix(tenant_id=tenant_id) + 
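+        # Capture the creation event so it can be replayed into a fresh
+        # aggregate below.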
events = prefix.collect_uncommitted_events() + + restored = Prefix() + restored.load_from_history(events) + + assert restored.tenant_id == tenant_id + + +# --------------------------------------------------------------------------- +# Snapshot round-trip +# --------------------------------------------------------------------------- + + +class TestPrefixSnapshot: + def test_to_snapshot_returns_dict(self): + prefix = make_prefix() + snap = prefix.to_snapshot() + assert isinstance(snap, dict) + + def test_to_snapshot_contains_expected_keys(self): + prefix = make_prefix() + snap = prefix.to_snapshot() + expected_keys = { + "network", + "vrf_id", + "vlan_id", + "status", + "role", + "tenant_id", + "description", + "custom_fields", + "tags", + "deleted", + } + assert expected_keys == snap.keys() + + def test_snapshot_roundtrip_preserves_network(self): + prefix = make_prefix(network="172.20.0.0/14") + snap = prefix.to_snapshot() + restored = Prefix.from_snapshot(prefix.id, snap, prefix.version) + assert restored.network.network == "172.20.0.0/14" + + def test_snapshot_roundtrip_preserves_status(self): + prefix = make_prefix(status=PrefixStatus.RESERVED) + snap = prefix.to_snapshot() + restored = Prefix.from_snapshot(prefix.id, snap, prefix.version) + assert restored.status == PrefixStatus.RESERVED + + def test_snapshot_roundtrip_preserves_vrf_id(self): + vrf_id = uuid4() + prefix = make_prefix(vrf_id=vrf_id) + snap = prefix.to_snapshot() + restored = Prefix.from_snapshot(prefix.id, snap, prefix.version) + assert restored.vrf_id == vrf_id + + def test_snapshot_roundtrip_preserves_tenant_id(self): + tenant_id = uuid4() + prefix = make_prefix(tenant_id=tenant_id) + snap = prefix.to_snapshot() + restored = Prefix.from_snapshot(prefix.id, snap, prefix.version) + assert restored.tenant_id == tenant_id + + def test_snapshot_roundtrip_preserves_role(self): + prefix = make_prefix(role="infrastructure") + snap = prefix.to_snapshot() + restored = Prefix.from_snapshot(prefix.id, snap, prefix.version) + assert restored.role == "infrastructure" + + def test_snapshot_roundtrip_preserves_description(self): + prefix = make_prefix(description="test description") + snap = prefix.to_snapshot() + restored = Prefix.from_snapshot(prefix.id, snap, prefix.version) + assert restored.description == "test description" + + def test_snapshot_roundtrip_preserves_aggregate_id(self): + prefix = make_prefix() + snap = prefix.to_snapshot() + restored = Prefix.from_snapshot(prefix.id, snap, prefix.version) + assert restored.id == prefix.id + + def test_snapshot_roundtrip_preserves_version(self): + prefix = make_prefix() + prefix.collect_uncommitted_events() + prefix.update(description="v2") + snap = prefix.to_snapshot() + restored = Prefix.from_snapshot(prefix.id, snap, prefix.version) + assert restored.version == 2 + + def test_snapshot_roundtrip_preserves_deleted_state(self): + prefix = make_prefix() + prefix.collect_uncommitted_events() + prefix.delete() + snap = prefix.to_snapshot() + restored = Prefix.from_snapshot(prefix.id, snap, prefix.version) + assert restored._deleted is True + + def test_snapshot_roundtrip_with_none_vrf_id(self): + prefix = make_prefix(vrf_id=None) + snap = prefix.to_snapshot() + restored = Prefix.from_snapshot(prefix.id, snap, prefix.version) + assert restored.vrf_id is None + + def test_from_snapshot_does_not_produce_uncommitted_events(self): + prefix = make_prefix() + snap = prefix.to_snapshot() + restored = Prefix.from_snapshot(prefix.id, snap, prefix.version) + assert 
restored.collect_uncommitted_events() == [] diff --git a/services/ipam/tests/test_domain/test_rir.py b/services/ipam/tests/test_domain/test_rir.py new file mode 100644 index 0000000..9d921b6 --- /dev/null +++ b/services/ipam/tests/test_domain/test_rir.py @@ -0,0 +1,233 @@ +"""Unit tests for the RIR aggregate root.""" + +from uuid import uuid4 + +import pytest +from ipam.domain.events import ( + RIRCreated, + RIRDeleted, + RIRUpdated, +) +from ipam.domain.rir import RIR +from shared.domain.exceptions import BusinessRuleViolationError + +# --------------------------------------------------------------------------- +# Helpers +# --------------------------------------------------------------------------- + + +def make_rir( + name: str = "ARIN", + is_private: bool = False, + description: str = "", + custom_fields: dict | None = None, + tags: list | None = None, +) -> RIR: + return RIR.create( + name=name, + is_private=is_private, + description=description, + custom_fields=custom_fields, + tags=tags, + ) + + +# --------------------------------------------------------------------------- +# create() +# --------------------------------------------------------------------------- + + +class TestRIRCreate: + def test_create_returns_rir_instance(self): + rir = make_rir() + assert isinstance(rir, RIR) + + def test_create_sets_name(self): + rir = make_rir(name="RIPE NCC") + assert rir.name == "RIPE NCC" + + def test_create_with_custom_fields_and_tags(self): + tag_id = uuid4() + rir = make_rir(custom_fields={"region": "NA"}, tags=[tag_id]) + assert rir.custom_fields == {"region": "NA"} + assert rir.tags == [tag_id] + + def test_create_emits_rir_created_event(self): + rir = make_rir() + events = rir.collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], RIRCreated) + + def test_create_version_is_1(self): + rir = make_rir() + assert rir.version == 1 + + def test_create_is_not_deleted(self): + rir = make_rir() + assert rir._deleted is False + + def test_create_sets_is_private(self): + rir = make_rir(is_private=True) + assert rir.is_private is True + + def test_create_event_has_correct_aggregate_id(self): + rir = make_rir() + events = rir.collect_uncommitted_events() + assert events[0].aggregate_id == rir.id + + +# --------------------------------------------------------------------------- +# update() +# --------------------------------------------------------------------------- + + +class TestRIRUpdate: + def test_update_description(self): + rir = make_rir(description="old") + rir.collect_uncommitted_events() + rir.update(description="new description") + assert rir.description == "new description" + + def test_update_is_private(self): + rir = make_rir(is_private=False) + rir.collect_uncommitted_events() + rir.update(is_private=True) + assert rir.is_private is True + + def test_update_custom_fields(self): + rir = make_rir() + rir.collect_uncommitted_events() + rir.update(custom_fields={"region": "EU"}) + assert rir.custom_fields == {"region": "EU"} + + def test_update_tags(self): + tag_id = uuid4() + rir = make_rir() + rir.collect_uncommitted_events() + rir.update(tags=[tag_id]) + assert rir.tags == [tag_id] + + def test_update_produces_rir_updated_event(self): + rir = make_rir() + rir.collect_uncommitted_events() + rir.update(description="updated") + events = rir.collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], RIRUpdated) + + def test_update_increments_version(self): + rir = make_rir() + rir.collect_uncommitted_events() + 
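+        # Discard the RIRCreated event; only the update() call should
+        # bump the version from 1 to 2.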
rir.update(description="v2") + assert rir.version == 2 + + def test_update_deleted_raises_error(self): + rir = make_rir() + rir.collect_uncommitted_events() + rir.delete() + with pytest.raises(BusinessRuleViolationError, match="deleted"): + rir.update(description="should fail") + + +# --------------------------------------------------------------------------- +# delete() +# --------------------------------------------------------------------------- + + +class TestRIRDelete: + def test_delete_marks_as_deleted(self): + rir = make_rir() + rir.collect_uncommitted_events() + rir.delete() + assert rir._deleted is True + + def test_delete_produces_rir_deleted_event(self): + rir = make_rir() + rir.collect_uncommitted_events() + rir.delete() + events = rir.collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], RIRDeleted) + + def test_delete_increments_version(self): + rir = make_rir() + rir.collect_uncommitted_events() + rir.delete() + assert rir.version == 2 + + def test_double_delete_raises_error(self): + rir = make_rir() + rir.collect_uncommitted_events() + rir.delete() + with pytest.raises(BusinessRuleViolationError, match="already deleted"): + rir.delete() + + +# --------------------------------------------------------------------------- +# load_from_history() +# --------------------------------------------------------------------------- + + +class TestRIRLoadFromHistory: + def test_load_from_history_restores_state(self): + original = make_rir(name="APNIC", is_private=False, description="original") + original.update(description="updated", is_private=True) + original.delete() + events = original.collect_uncommitted_events() + + restored = RIR() + restored.load_from_history(events) + + assert restored.name == "APNIC" + assert restored.description == "updated" + assert restored.is_private is True + assert restored._deleted is True + assert restored.version == 3 + + def test_load_from_history_does_not_add_uncommitted_events(self): + rir = make_rir() + events = rir.collect_uncommitted_events() + + restored = RIR() + restored.load_from_history(events) + + assert restored.collect_uncommitted_events() == [] + + +# --------------------------------------------------------------------------- +# Snapshot round-trip +# --------------------------------------------------------------------------- + + +class TestRIRSnapshot: + def test_to_snapshot_returns_dict(self): + rir = make_rir() + snap = rir.to_snapshot() + assert isinstance(snap, dict) + + def test_snapshot_keys(self): + rir = make_rir() + snap = rir.to_snapshot() + expected_keys = {"name", "is_private", "description", "custom_fields", "tags", "deleted"} + assert expected_keys == snap.keys() + + def test_snapshot_roundtrip_preserves_state(self): + tag_id = uuid4() + rir = make_rir( + name="LACNIC", + is_private=True, + description="test", + custom_fields={"region": "SA"}, + tags=[tag_id], + ) + snap = rir.to_snapshot() + restored = RIR.from_snapshot(rir.id, snap, rir.version) + + assert restored.name == "LACNIC" + assert restored.is_private is True + assert restored.description == "test" + assert restored.custom_fields == {"region": "SA"} + assert restored.tags == [tag_id] + assert restored.id == rir.id + assert restored.version == rir.version + assert restored.collect_uncommitted_events() == [] diff --git a/services/ipam/tests/test_domain/test_route_target.py b/services/ipam/tests/test_domain/test_route_target.py new file mode 100644 index 0000000..c771f65 --- /dev/null +++ 
b/services/ipam/tests/test_domain/test_route_target.py @@ -0,0 +1,132 @@ +"""Unit tests for the RouteTarget aggregate root.""" + +from uuid import uuid4 + +import pytest +from ipam.domain.events import RouteTargetCreated, RouteTargetDeleted, RouteTargetUpdated +from ipam.domain.route_target import RouteTarget +from pydantic import ValidationError +from shared.domain.exceptions import BusinessRuleViolationError + + +def make_rt( + name: str = "65000:100", + tenant_id=None, + description: str = "", + custom_fields: dict | None = None, + tags: list | None = None, +) -> RouteTarget: + return RouteTarget.create( + name=name, + tenant_id=tenant_id, + description=description, + custom_fields=custom_fields, + tags=tags, + ) + + +class TestRouteTargetCreate: + def test_create_returns_instance(self): + assert isinstance(make_rt(), RouteTarget) + + def test_create_sets_name(self): + rt = make_rt(name="65001:200") + assert rt.name.rd == "65001:200" + + def test_create_with_tenant(self): + tid = uuid4() + assert make_rt(tenant_id=tid).tenant_id == tid + + def test_create_with_custom_fields_and_tags(self): + tag_id = uuid4() + rt = make_rt(custom_fields={"note": "test"}, tags=[tag_id]) + assert rt.custom_fields == {"note": "test"} + assert rt.tags == [tag_id] + + def test_create_emits_event(self): + events = make_rt().collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], RouteTargetCreated) + + def test_create_version_is_1(self): + assert make_rt().version == 1 + + def test_create_invalid_name_raises_error(self): + with pytest.raises((ValueError, ValidationError)): + make_rt(name="invalid") + + +class TestRouteTargetUpdate: + def test_update_description(self): + rt = make_rt() + rt.collect_uncommitted_events() + rt.update(description="updated") + assert rt.description == "updated" + + def test_update_produces_event(self): + rt = make_rt() + rt.collect_uncommitted_events() + rt.update(description="new") + events = rt.collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], RouteTargetUpdated) + + def test_update_deleted_raises_error(self): + rt = make_rt() + rt.collect_uncommitted_events() + rt.delete() + with pytest.raises(BusinessRuleViolationError, match="deleted"): + rt.update(description="fail") + + +class TestRouteTargetDelete: + def test_delete_marks_deleted(self): + rt = make_rt() + rt.collect_uncommitted_events() + rt.delete() + assert rt._deleted is True + + def test_delete_produces_event(self): + rt = make_rt() + rt.collect_uncommitted_events() + rt.delete() + events = rt.collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], RouteTargetDeleted) + + def test_double_delete_raises_error(self): + rt = make_rt() + rt.collect_uncommitted_events() + rt.delete() + with pytest.raises(BusinessRuleViolationError, match="already deleted"): + rt.delete() + + +class TestRouteTargetLoadFromHistory: + def test_load_from_history_restores_state(self): + original = make_rt(name="65000:1", description="orig") + original.update(description="updated") + original.delete() + events = original.collect_uncommitted_events() + + restored = RouteTarget() + restored.load_from_history(events) + assert restored.name.rd == "65000:1" + assert restored.description == "updated" + assert restored._deleted is True + assert restored.version == 3 + + +class TestRouteTargetSnapshot: + def test_snapshot_roundtrip(self): + tag_id = uuid4() + tid = uuid4() + rt = make_rt(name="65000:50", tenant_id=tid, description="test", custom_fields={"k": 
"v"}, tags=[tag_id]) + snap = rt.to_snapshot() + restored = RouteTarget.from_snapshot(rt.id, snap, rt.version) + assert restored.name.rd == "65000:50" + assert restored.tenant_id == tid + assert restored.description == "test" + assert restored.custom_fields == {"k": "v"} + assert restored.tags == [tag_id] + assert restored.id == rt.id diff --git a/services/ipam/tests/test_domain/test_service.py b/services/ipam/tests/test_domain/test_service.py new file mode 100644 index 0000000..e6d1a6d --- /dev/null +++ b/services/ipam/tests/test_domain/test_service.py @@ -0,0 +1,175 @@ +"""Unit tests for the Service aggregate root.""" + +from uuid import uuid4 + +import pytest +from ipam.domain.events import ServiceCreated, ServiceDeleted, ServiceUpdated +from ipam.domain.service import Service +from ipam.domain.value_objects import ServiceProtocol +from shared.domain.exceptions import BusinessRuleViolationError + + +def make_service( + name: str = "HTTP", + protocol: ServiceProtocol = ServiceProtocol.TCP, + ports: list[int] | None = None, + ip_addresses: list | None = None, + description: str = "", + custom_fields: dict | None = None, + tags: list | None = None, +) -> Service: + return Service.create( + name=name, + protocol=protocol, + ports=ports or [80], + ip_addresses=ip_addresses, + description=description, + custom_fields=custom_fields, + tags=tags, + ) + + +class TestServiceCreate: + def test_create_returns_instance(self): + assert isinstance(make_service(), Service) + + def test_create_sets_fields(self): + svc = make_service(name="SSH", protocol=ServiceProtocol.TCP, ports=[22]) + assert svc.name == "SSH" + assert svc.protocol == ServiceProtocol.TCP + assert svc.ports == [22] + + def test_create_with_ip_addresses(self): + ip_id = uuid4() + svc = make_service(ip_addresses=[ip_id]) + assert svc.ip_addresses == [ip_id] + + def test_create_emits_event(self): + events = make_service().collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], ServiceCreated) + + def test_create_version_is_1(self): + assert make_service().version == 1 + + def test_create_empty_ports_raises_error(self): + with pytest.raises(BusinessRuleViolationError, match="at least one port"): + Service.create(name="Test", protocol=ServiceProtocol.TCP, ports=[]) + + def test_create_invalid_port_raises_error(self): + with pytest.raises(BusinessRuleViolationError): + make_service(ports=[0]) + + def test_create_port_too_large_raises_error(self): + with pytest.raises(BusinessRuleViolationError): + make_service(ports=[70000]) + + def test_create_multiple_ports(self): + svc = make_service(ports=[80, 443, 8080]) + assert svc.ports == [80, 443, 8080] + + +class TestServiceUpdate: + def test_update_name(self): + svc = make_service() + svc.collect_uncommitted_events() + svc.update(name="HTTPS") + assert svc.name == "HTTPS" + + def test_update_ports(self): + svc = make_service() + svc.collect_uncommitted_events() + svc.update(ports=[443]) + assert svc.ports == [443] + + def test_update_protocol(self): + svc = make_service() + svc.collect_uncommitted_events() + svc.update(protocol="udp") + assert svc.protocol == ServiceProtocol.UDP + + def test_update_invalid_ports(self): + svc = make_service() + svc.collect_uncommitted_events() + with pytest.raises(BusinessRuleViolationError): + svc.update(ports=[0]) + + def test_update_produces_event(self): + svc = make_service() + svc.collect_uncommitted_events() + svc.update(name="new") + events = svc.collect_uncommitted_events() + assert len(events) == 1 + assert 
isinstance(events[0], ServiceUpdated) + + def test_update_deleted_raises_error(self): + svc = make_service() + svc.collect_uncommitted_events() + svc.delete() + with pytest.raises(BusinessRuleViolationError, match="deleted"): + svc.update(name="fail") + + +class TestServiceDelete: + def test_delete_marks_deleted(self): + svc = make_service() + svc.collect_uncommitted_events() + svc.delete() + assert svc._deleted is True + + def test_delete_produces_event(self): + svc = make_service() + svc.collect_uncommitted_events() + svc.delete() + events = svc.collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], ServiceDeleted) + + def test_double_delete_raises_error(self): + svc = make_service() + svc.collect_uncommitted_events() + svc.delete() + with pytest.raises(BusinessRuleViolationError, match="already deleted"): + svc.delete() + + +class TestServiceLoadFromHistory: + def test_load_from_history_restores_state(self): + ip_id = uuid4() + original = make_service(name="HTTP", ports=[80, 443], ip_addresses=[ip_id]) + original.update(name="HTTPS", ports=[443]) + original.delete() + events = original.collect_uncommitted_events() + + restored = Service() + restored.load_from_history(events) + assert restored.name == "HTTPS" + assert restored.ports == [443] + assert restored.ip_addresses == [ip_id] + assert restored._deleted is True + assert restored.version == 3 + + +class TestServiceSnapshot: + def test_snapshot_roundtrip(self): + tag_id = uuid4() + ip_id = uuid4() + svc = make_service( + name="DNS", + protocol=ServiceProtocol.UDP, + ports=[53], + ip_addresses=[ip_id], + description="DNS server", + custom_fields={"zone": "internal"}, + tags=[tag_id], + ) + snap = svc.to_snapshot() + restored = Service.from_snapshot(svc.id, snap, svc.version) + assert restored.name == "DNS" + assert restored.protocol == ServiceProtocol.UDP + assert restored.ports == [53] + assert restored.ip_addresses == [ip_id] + assert restored.description == "DNS server" + assert restored.custom_fields == {"zone": "internal"} + assert restored.tags == [tag_id] + assert restored.id == svc.id diff --git a/services/ipam/tests/test_domain/test_services.py b/services/ipam/tests/test_domain/test_services.py new file mode 100644 index 0000000..11dfcb2 --- /dev/null +++ b/services/ipam/tests/test_domain/test_services.py @@ -0,0 +1,361 @@ +"""Unit tests for IPAM domain services.""" + +import pytest +from ipam.domain.ip_address import IPAddress +from ipam.domain.ip_range import IPRange +from ipam.domain.prefix import Prefix +from ipam.domain.services import ( + AvailablePrefixService, + IPAvailabilityService, + IPRangeUtilizationService, + PrefixUtilizationService, +) +from ipam.domain.value_objects import PrefixStatus + +# --------------------------------------------------------------------------- +# Helpers +# --------------------------------------------------------------------------- + + +def make_prefix(network: str, status: PrefixStatus = PrefixStatus.ACTIVE) -> Prefix: + p = Prefix.create(network=network, status=status) + p.collect_uncommitted_events() + return p + + +def make_ip(address: str) -> IPAddress: + ip = IPAddress.create(address=address) + ip.collect_uncommitted_events() + return ip + + +# --------------------------------------------------------------------------- +# PrefixUtilizationService +# --------------------------------------------------------------------------- + + +class TestPrefixUtilizationService: + def setup_method(self): + self.service = PrefixUtilizationService() + + def 
test_empty_prefix_with_no_children_and_no_ips_returns_zero(self): + prefix = make_prefix("192.168.0.0/24") + result = self.service.calculate(prefix, [], []) + assert result == 0.0 + + def test_prefix_with_no_network_returns_zero(self): + prefix = Prefix() # no create() called, network is None + result = self.service.calculate(prefix, [], []) + assert result == 0.0 + + def test_single_ip_in_slash24_gives_correct_ratio(self): + prefix = make_prefix("192.168.0.0/24") + ip = make_ip("192.168.0.1") + result = self.service.calculate(prefix, [], [ip]) + # 1 IP out of 256 addresses + assert result == pytest.approx(1 / 256) + + def test_multiple_ips_give_correct_ratio(self): + prefix = make_prefix("10.0.0.0/24") + ips = [make_ip(f"10.0.0.{i}") for i in range(1, 11)] + result = self.service.calculate(prefix, [], ips) + assert result == pytest.approx(10 / 256) + + def test_child_prefix_contributes_its_size(self): + parent = make_prefix("10.0.0.0/24") + child = make_prefix("10.0.0.0/28") # 16 addresses + result = self.service.calculate(parent, [child], []) + assert result == pytest.approx(16 / 256) + + def test_multiple_child_prefixes_sum_correctly(self): + parent = make_prefix("10.0.0.0/24") + child1 = make_prefix("10.0.0.0/28") # 16 addresses + child2 = make_prefix("10.0.0.16/28") # 16 addresses + result = self.service.calculate(parent, [child1, child2], []) + assert result == pytest.approx(32 / 256) + + def test_combined_child_prefixes_and_ips(self): + parent = make_prefix("10.0.0.0/24") + child = make_prefix("10.0.0.0/28") # 16 addresses + ips = [make_ip(f"10.0.0.{i}") for i in range(100, 104)] # 4 IPs + result = self.service.calculate(parent, [child], ips) + assert result == pytest.approx(20 / 256) + + def test_fully_used_prefix_returns_1_0(self): + # Use a /30 (4 addresses) filled with a /30 child + parent = make_prefix("10.0.0.0/30") + child = make_prefix("10.0.0.0/30") + result = self.service.calculate(parent, [child], []) + assert result == 1.0 + + def test_result_is_capped_at_1_0_when_oversubscribed(self): + # Oversubscribe: child is larger than parent due to host count + parent = make_prefix("10.0.0.0/30") # 4 addresses + child1 = make_prefix("10.0.0.0/29") # 8 addresses (larger) + result = self.service.calculate(parent, [child1], []) + assert result == 1.0 + + def test_child_prefix_with_no_network_is_skipped(self): + parent = make_prefix("10.0.0.0/24") + child_no_network = Prefix() # network is None + result = self.service.calculate(parent, [child_no_network], []) + assert result == 0.0 + + def test_slash32_prefix_full_with_one_ip(self): + prefix = make_prefix("10.0.0.1/32") + ip = make_ip("10.0.0.1") + result = self.service.calculate(prefix, [], [ip]) + assert result == 1.0 + + def test_ipv6_prefix_utilization(self): + prefix = make_prefix("2001:db8::/126") # 4 addresses + ip = make_ip("2001:db8::1") + result = self.service.calculate(prefix, [], [ip]) + assert result == pytest.approx(1 / 4) + + +# --------------------------------------------------------------------------- +# AvailablePrefixService +# --------------------------------------------------------------------------- + + +class TestAvailablePrefixService: + def setup_method(self): + self.service = AvailablePrefixService() + + def test_parent_with_no_network_returns_empty_list(self): + parent = Prefix() # no network + result = self.service.find_available(parent, [], 28) + assert result == [] + + def test_find_all_slash28_in_slash24_with_no_children(self): + parent = make_prefix("192.168.0.0/24") + result = 
self.service.find_available(parent, [], 28) + # /24 contains 16 non-overlapping /28 subnets + assert len(result) == 16 + + def test_all_found_prefixes_are_valid_cidr_strings(self): + parent = make_prefix("10.0.0.0/24") + result = self.service.find_available(parent, [], 28) + import ipaddress + + for subnet_str in result: + net = ipaddress.ip_network(subnet_str, strict=True) + assert net.prefixlen == 28 + + def test_all_found_prefixes_are_subnets_of_parent(self): + parent = make_prefix("10.0.0.0/24") + result = self.service.find_available(parent, [], 28) + import ipaddress + + parent_net = ipaddress.ip_network("10.0.0.0/24") + for subnet_str in result: + subnet = ipaddress.ip_network(subnet_str) + assert subnet.subnet_of(parent_net) + + def test_used_child_prefix_is_excluded(self): + parent = make_prefix("192.168.0.0/24") + used = make_prefix("192.168.0.0/28") + result = self.service.find_available(parent, [used], 28) + assert "192.168.0.0/28" not in result + assert len(result) == 15 # 16 - 1 used + + def test_multiple_used_child_prefixes_excluded(self): + parent = make_prefix("192.168.0.0/24") + used1 = make_prefix("192.168.0.0/28") + used2 = make_prefix("192.168.0.16/28") + result = self.service.find_available(parent, [used1, used2], 28) + assert "192.168.0.0/28" not in result + assert "192.168.0.16/28" not in result + assert len(result) == 14 + + def test_all_subnets_used_returns_empty_list(self): + parent = make_prefix("10.0.0.0/30") # 4 addresses + used = make_prefix("10.0.0.0/30") # same subnet fills it + result = self.service.find_available(parent, [used], 30) + assert result == [] + + def test_desired_prefix_length_same_as_parent(self): + parent = make_prefix("10.0.0.0/28") + result = self.service.find_available(parent, [], 28) + assert result == ["10.0.0.0/28"] + + def test_child_prefix_with_no_network_is_ignored(self): + parent = make_prefix("10.0.0.0/24") + child_no_network = Prefix() # network is None + result = self.service.find_available(parent, [child_no_network], 28) + # All /28 subnets should still be available + assert len(result) == 16 + + def test_overlapping_child_prefix_excludes_candidate(self): + parent = make_prefix("10.0.0.0/24") + # Use a /29 that overlaps with the first /28 + used = make_prefix("10.0.0.0/29") + result = self.service.find_available(parent, [used], 28) + # 10.0.0.0/28 overlaps with 10.0.0.0/29 + assert "10.0.0.0/28" not in result + + def test_ipv6_parent_finds_available_subnets(self): + parent = make_prefix("2001:db8::/48") + result = self.service.find_available(parent, [], 56) + # /48 contains 256 non-overlapping /56 subnets + assert len(result) == 256 + + def test_result_contains_no_duplicates(self): + parent = make_prefix("10.0.0.0/24") + result = self.service.find_available(parent, [], 28) + assert len(result) == len(set(result)) + + +# --------------------------------------------------------------------------- +# IPAvailabilityService +# --------------------------------------------------------------------------- + + +class TestIPAvailabilityService: + def setup_method(self): + self.service = IPAvailabilityService() + + def test_prefix_with_no_network_returns_empty_list(self): + prefix = Prefix() # no network + result = self.service.find_available(prefix, [], count=5) + assert result == [] + + def test_find_one_available_ip_in_empty_prefix(self): + prefix = make_prefix("10.0.0.0/30") + result = self.service.find_available(prefix, [], count=1) + assert len(result) == 1 + assert result[0] == "10.0.0.1" # first host in /30 + + def 
test_find_multiple_available_ips(self): + prefix = make_prefix("10.0.0.0/29") # hosts: .1 through .6 + result = self.service.find_available(prefix, [], count=3) + assert len(result) == 3 + + def test_found_ips_are_within_prefix(self): + import ipaddress + + prefix = make_prefix("192.168.10.0/28") + result = self.service.find_available(prefix, [], count=5) + net = ipaddress.ip_network("192.168.10.0/28") + for ip_str in result: + assert ipaddress.ip_address(ip_str) in net + + def test_used_addresses_are_excluded(self): + prefix = make_prefix("10.0.0.0/30") + # In a /30: hosts are .1 and .2 + used = [make_ip("10.0.0.1")] + result = self.service.find_available(prefix, used, count=1) + assert "10.0.0.1" not in result + assert result == ["10.0.0.2"] + + def test_multiple_used_addresses_are_excluded(self): + prefix = make_prefix("10.0.0.0/29") # hosts: .1-.6 + used = [make_ip("10.0.0.1"), make_ip("10.0.0.2"), make_ip("10.0.0.3")] + result = self.service.find_available(prefix, used, count=3) + for ip_str in result: + assert ip_str not in {"10.0.0.1", "10.0.0.2", "10.0.0.3"} + + def test_count_limits_number_of_results(self): + prefix = make_prefix("10.0.0.0/24") + result = self.service.find_available(prefix, [], count=5) + assert len(result) == 5 + + def test_count_larger_than_available_returns_all_hosts(self): + prefix = make_prefix("10.0.0.0/30") # only 2 host addresses + result = self.service.find_available(prefix, [], count=100) + assert len(result) == 2 # only 2 actual hosts in /30 + + def test_all_addresses_used_returns_empty_list(self): + prefix = make_prefix("10.0.0.0/30") # hosts: .1 and .2 + used = [make_ip("10.0.0.1"), make_ip("10.0.0.2")] + result = self.service.find_available(prefix, used, count=1) + assert result == [] + + def test_result_contains_no_duplicates(self): + prefix = make_prefix("10.0.0.0/24") + result = self.service.find_available(prefix, [], count=10) + assert len(result) == len(set(result)) + + def test_default_count_is_1(self): + prefix = make_prefix("10.0.0.0/24") + result = self.service.find_available(prefix, []) + assert len(result) == 1 + + def test_ipv6_prefix_returns_available_addresses(self): + prefix = make_prefix("2001:db8::/126") # hosts: ::1, ::2 + result = self.service.find_available(prefix, [], count=2) + assert len(result) == 2 + assert "2001:db8::1" in result + assert "2001:db8::2" in result + + def test_ipv6_used_address_is_excluded(self): + prefix = make_prefix("2001:db8::/126") + used = [make_ip("2001:db8::1")] + result = self.service.find_available(prefix, used, count=1) + assert "2001:db8::1" not in result + assert result == ["2001:db8::2"] + + def test_network_and_broadcast_addresses_excluded_for_ipv4(self): + # In a /30: network=.0, hosts=.1,.2, broadcast=.3 + prefix = make_prefix("10.0.0.0/30") + result = self.service.find_available(prefix, [], count=10) + assert "10.0.0.0" not in result # network address + assert "10.0.0.3" not in result # broadcast address + + def test_ip_address_with_no_address_is_skipped(self): + prefix = make_prefix("10.0.0.0/30") + ip_no_address = IPAddress() # address is None + # Should not crash; ip with no address is simply not in used_set + result = self.service.find_available(prefix, [ip_no_address], count=2) + assert len(result) == 2 + + +# --------------------------------------------------------------------------- +# IPRangeUtilizationService +# --------------------------------------------------------------------------- + + +def make_ip_range(start: str, end: str) -> IPRange: + r = IPRange.create(start_address=start, 
end_address=end) + r.collect_uncommitted_events() + return r + + +class TestIPRangeUtilizationService: + def setup_method(self): + self.service = IPRangeUtilizationService() + + def test_empty_range_no_addresses(self): + ip_range = make_ip_range("10.0.0.1", "10.0.0.10") + result = self.service.calculate(ip_range, []) + assert result == 0.0 + + def test_one_address_in_range_of_ten(self): + ip_range = make_ip_range("10.0.0.1", "10.0.0.10") + used = [make_ip("10.0.0.5")] + result = self.service.calculate(ip_range, used) + assert result == pytest.approx(1 / 10) + + def test_all_addresses_used(self): + ip_range = make_ip_range("10.0.0.1", "10.0.0.3") + used = [make_ip("10.0.0.1"), make_ip("10.0.0.2"), make_ip("10.0.0.3")] + result = self.service.calculate(ip_range, used) + assert result == 1.0 + + def test_address_outside_range_not_counted(self): + ip_range = make_ip_range("10.0.0.1", "10.0.0.5") + used = [make_ip("10.0.0.100")] + result = self.service.calculate(ip_range, used) + assert result == 0.0 + + def test_range_with_no_addresses_returns_zero(self): + ip_range = IPRange() + result = self.service.calculate(ip_range, []) + assert result == 0.0 + + def test_ipv6_range_utilization(self): + ip_range = make_ip_range("2001:db8::1", "2001:db8::4") + used = [make_ip("2001:db8::2")] + result = self.service.calculate(ip_range, used) + assert result == pytest.approx(1 / 4) diff --git a/services/ipam/tests/test_domain/test_value_objects.py b/services/ipam/tests/test_domain/test_value_objects.py new file mode 100644 index 0000000..2e242cc --- /dev/null +++ b/services/ipam/tests/test_domain/test_value_objects.py @@ -0,0 +1,320 @@ +"""Unit tests for IPAM domain value objects.""" + +import pytest +from ipam.domain.value_objects import ( + ASNumber, + FHRPAuthType, + FHRPProtocol, + IPAddressStatus, + IPAddressValue, + IPRangeStatus, + PrefixNetwork, + PrefixStatus, + RouteDistinguisher, + VLANId, + VLANStatus, +) +from pydantic import ValidationError + + +class TestPrefixNetwork: + def test_valid_ipv4_cidr_is_accepted(self): + pn = PrefixNetwork(network="192.168.1.0/24") + assert pn.network == "192.168.1.0/24" + + def test_valid_ipv4_host_bits_normalised(self): + # 192.168.1.5/24 → network is 192.168.1.0/24 (strict=False) + pn = PrefixNetwork(network="192.168.1.5/24") + assert pn.network == "192.168.1.0/24" + + def test_valid_ipv6_cidr_is_accepted(self): + pn = PrefixNetwork(network="2001:db8::/32") + assert pn.network == "2001:db8::/32" + + def test_valid_ipv6_host_bits_normalised(self): + pn = PrefixNetwork(network="2001:db8::1/32") + assert pn.network == "2001:db8::/32" + + def test_slash_32_host_prefix_accepted(self): + pn = PrefixNetwork(network="10.0.0.1/32") + assert pn.network == "10.0.0.1/32" + + def test_invalid_cidr_raises_validation_error(self): + with pytest.raises(ValidationError): + PrefixNetwork(network="not-a-network") + + def test_missing_prefix_length_is_treated_as_host_route(self): + # ipaddress.ip_network("192.168.1.0", strict=False) is valid — it + # normalises to a /32 host route. The validator does NOT raise here. 
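+        # (Rejecting bare addresses would instead require an explicit
+        # check for a "/" in the input; this test pins the current
+        # permissive behaviour.)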
+ pn = PrefixNetwork(network="192.168.1.0") + assert pn.network == "192.168.1.0/32" + + def test_invalid_octet_raises_validation_error(self): + with pytest.raises(ValidationError): + PrefixNetwork(network="999.168.1.0/24") + + def test_version_is_4_for_ipv4(self): + pn = PrefixNetwork(network="10.0.0.0/8") + assert pn.version == 4 + + def test_version_is_6_for_ipv6(self): + pn = PrefixNetwork(network="::1/128") + assert pn.version == 6 + + def test_num_addresses_slash24(self): + pn = PrefixNetwork(network="192.168.0.0/24") + assert pn.num_addresses == 256 + + def test_num_addresses_slash32(self): + pn = PrefixNetwork(network="10.0.0.1/32") + assert pn.num_addresses == 1 + + def test_num_addresses_slash16(self): + pn = PrefixNetwork(network="172.16.0.0/16") + assert pn.num_addresses == 65536 + + def test_prefix_length_property(self): + pn = PrefixNetwork(network="10.0.0.0/8") + assert pn.prefix_length == 8 + + def test_ip_network_property_returns_correct_type(self): + import ipaddress + + pn = PrefixNetwork(network="192.168.0.0/24") + assert isinstance(pn.ip_network, ipaddress.IPv4Network) + + def test_contains_child_prefix(self): + parent = PrefixNetwork(network="192.168.0.0/24") + child = PrefixNetwork(network="192.168.0.0/28") + assert parent.contains(child) is True + + def test_contains_returns_false_for_non_child(self): + parent = PrefixNetwork(network="192.168.0.0/24") + other = PrefixNetwork(network="10.0.0.0/8") + assert parent.contains(other) is False + + def test_contains_returns_false_for_parent_network(self): + child = PrefixNetwork(network="192.168.0.0/28") + parent = PrefixNetwork(network="192.168.0.0/24") + assert child.contains(parent) is False + + def test_contains_self(self): + pn = PrefixNetwork(network="192.168.0.0/24") + assert pn.contains(pn) is True + + def test_value_object_is_immutable(self): + pn = PrefixNetwork(network="10.0.0.0/8") + with pytest.raises((ValueError, ValidationError)): + pn.network = "10.1.0.0/16" + + +class TestIPAddressValue: + def test_valid_ipv4_is_accepted(self): + ip = IPAddressValue(address="192.168.1.1") + assert ip.address == "192.168.1.1" + + def test_valid_ipv6_is_accepted(self): + ip = IPAddressValue(address="2001:db8::1") + assert ip.address == "2001:db8::1" + + def test_loopback_ipv4_is_accepted(self): + ip = IPAddressValue(address="127.0.0.1") + assert ip.address == "127.0.0.1" + + def test_loopback_ipv6_is_accepted(self): + ip = IPAddressValue(address="::1") + assert ip.address == "::1" + + def test_invalid_address_raises_validation_error(self): + with pytest.raises(ValidationError): + IPAddressValue(address="not-an-ip") + + def test_address_with_cidr_raises_validation_error(self): + with pytest.raises(ValidationError): + IPAddressValue(address="192.168.1.1/24") + + def test_invalid_octet_raises_validation_error(self): + with pytest.raises(ValidationError): + IPAddressValue(address="256.0.0.1") + + def test_version_is_4_for_ipv4(self): + ip = IPAddressValue(address="10.0.0.1") + assert ip.version == 4 + + def test_version_is_6_for_ipv6(self): + ip = IPAddressValue(address="fe80::1") + assert ip.version == 6 + + def test_ip_address_property_returns_correct_object(self): + import ipaddress + + ip = IPAddressValue(address="192.168.1.100") + assert isinstance(ip.ip_address, ipaddress.IPv4Address) + assert ip.ip_address == ipaddress.IPv4Address("192.168.1.100") + + def test_value_object_is_immutable(self): + ip = IPAddressValue(address="10.0.0.1") + with pytest.raises((ValueError, ValidationError)): + ip.address = "10.0.0.2" + + +class 
TestVLANId: +    def test_minimum_valid_vid(self): +        vlan_id = VLANId(vid=1) +        assert vlan_id.vid == 1 + +    def test_maximum_valid_vid(self): +        vlan_id = VLANId(vid=4094) +        assert vlan_id.vid == 4094 + +    def test_midrange_valid_vid(self): +        vlan_id = VLANId(vid=100) +        assert vlan_id.vid == 100 + +    def test_vid_zero_raises_validation_error(self): +        with pytest.raises(ValidationError) as exc_info: +            VLANId(vid=0) +        assert "1 and 4094" in str(exc_info.value) + +    def test_vid_4095_raises_validation_error(self): +        with pytest.raises(ValidationError) as exc_info: +            VLANId(vid=4095) +        assert "1 and 4094" in str(exc_info.value) + +    def test_negative_vid_raises_validation_error(self): +        with pytest.raises(ValidationError): +            VLANId(vid=-1) + +    def test_value_object_is_immutable(self): +        vlan_id = VLANId(vid=100) +        with pytest.raises((ValueError, ValidationError)): +            vlan_id.vid = 200 + + +class TestRouteDistinguisher: +    def test_asn_colon_nn_format_accepted(self): +        rd = RouteDistinguisher(rd="65000:100") +        assert rd.rd == "65000:100" + +    def test_ip_colon_nn_format_accepted(self): +        rd = RouteDistinguisher(rd="192.168.1.1:100") +        assert rd.rd == "192.168.1.1:100" + +    def test_zero_values_accepted(self): +        rd = RouteDistinguisher(rd="0:0") +        assert rd.rd == "0:0" + +    def test_large_asn_accepted(self): +        rd = RouteDistinguisher(rd="4294967295:65535") +        assert rd.rd == "4294967295:65535" + +    def test_missing_colon_raises_validation_error(self): +        with pytest.raises(ValidationError) as exc_info: +            RouteDistinguisher(rd="65000100") +        assert "ASN:NN" in str(exc_info.value) or "IP:NN" in str(exc_info.value) + +    def test_multiple_colons_raises_validation_error(self): +        with pytest.raises(ValidationError): +            RouteDistinguisher(rd="65000:100:200") + +    def test_empty_string_raises_validation_error(self): +        with pytest.raises(ValidationError): +            RouteDistinguisher(rd="") + +    def test_colon_only_is_accepted_as_current_behaviour(self): +        # A single colon splits into two empty strings. The validator only checks +        # that the value splits into exactly two parts, so ":" passes the format +        # check and is accepted. This edge case documents current behaviour. 
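+        # Concretely (plain Python, for illustration): ":".split(":") == ["", ""], +        # which has length 2, so a len(parts) == 2 format check is satisfied.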
+ rd = RouteDistinguisher(rd=":") + assert rd.rd == ":" + + def test_value_object_is_immutable(self): + rd = RouteDistinguisher(rd="65000:100") + with pytest.raises((ValueError, ValidationError)): + rd.rd = "65001:100" + + +class TestStatusEnums: + def test_prefix_status_values(self): + assert PrefixStatus.ACTIVE == "active" + assert PrefixStatus.RESERVED == "reserved" + assert PrefixStatus.DEPRECATED == "deprecated" + assert PrefixStatus.CONTAINER == "container" + + def test_ip_address_status_values(self): + assert IPAddressStatus.ACTIVE == "active" + assert IPAddressStatus.RESERVED == "reserved" + assert IPAddressStatus.DEPRECATED == "deprecated" + assert IPAddressStatus.DHCP == "dhcp" + assert IPAddressStatus.SLAAC == "slaac" + + def test_vlan_status_values(self): + assert VLANStatus.ACTIVE == "active" + assert VLANStatus.RESERVED == "reserved" + assert VLANStatus.DEPRECATED == "deprecated" + + def test_prefix_status_from_string(self): + assert PrefixStatus("active") is PrefixStatus.ACTIVE + + def test_ip_address_status_from_string(self): + assert IPAddressStatus("dhcp") is IPAddressStatus.DHCP + + def test_vlan_status_from_string(self): + assert VLANStatus("deprecated") is VLANStatus.DEPRECATED + + def test_ip_range_status_values(self): + assert IPRangeStatus.ACTIVE == "active" + assert IPRangeStatus.RESERVED == "reserved" + assert IPRangeStatus.DEPRECATED == "deprecated" + + def test_ip_range_status_from_string(self): + assert IPRangeStatus("active") is IPRangeStatus.ACTIVE + + def test_fhrp_protocol_values(self): + assert FHRPProtocol.VRRP == "vrrp" + assert FHRPProtocol.HSRP == "hsrp" + assert FHRPProtocol.GLBP == "glbp" + assert FHRPProtocol.CARP == "carp" + assert FHRPProtocol.OTHER == "other" + + def test_fhrp_protocol_from_string(self): + assert FHRPProtocol("vrrp") is FHRPProtocol.VRRP + + def test_fhrp_auth_type_values(self): + assert FHRPAuthType.PLAINTEXT == "plaintext" + assert FHRPAuthType.MD5 == "md5" + + def test_fhrp_auth_type_from_string(self): + assert FHRPAuthType("md5") is FHRPAuthType.MD5 + + +class TestASNumber: + def test_valid_asn_minimum(self): + asn = ASNumber(asn=1) + assert asn.asn == 1 + + def test_valid_asn_maximum(self): + asn = ASNumber(asn=4294967295) + assert asn.asn == 4294967295 + + def test_valid_asn_private_range(self): + asn = ASNumber(asn=65001) + assert asn.asn == 65001 + + def test_asn_zero_raises_validation_error(self): + with pytest.raises(ValidationError): + ASNumber(asn=0) + + def test_asn_negative_raises_validation_error(self): + with pytest.raises(ValidationError): + ASNumber(asn=-1) + + def test_asn_too_large_raises_validation_error(self): + with pytest.raises(ValidationError): + ASNumber(asn=4294967296) + + def test_value_object_is_immutable(self): + asn = ASNumber(asn=65000) + with pytest.raises((ValueError, ValidationError)): + asn.asn = 65001 diff --git a/services/ipam/tests/test_domain/test_vlan.py b/services/ipam/tests/test_domain/test_vlan.py new file mode 100644 index 0000000..b8b15f2 --- /dev/null +++ b/services/ipam/tests/test_domain/test_vlan.py @@ -0,0 +1,515 @@ +"""Unit tests for the VLAN aggregate root.""" + +from uuid import UUID, uuid4 + +import pytest +from ipam.domain.events import VLANCreated, VLANDeleted, VLANStatusChanged, VLANUpdated +from ipam.domain.value_objects import VLANId, VLANStatus +from ipam.domain.vlan import VLAN +from pydantic import ValidationError +from shared.domain.exceptions import BusinessRuleViolationError + +# --------------------------------------------------------------------------- +# Helpers 
+# --------------------------------------------------------------------------- + + +def make_vlan( + vid: int = 100, + name: str = "test-vlan", + group_id: UUID | None = None, + status: VLANStatus = VLANStatus.ACTIVE, + role: str | None = None, + tenant_id: UUID | None = None, + description: str = "", +) -> VLAN: + return VLAN.create( + vid=vid, + name=name, + group_id=group_id, + status=status, + role=role, + tenant_id=tenant_id, + description=description, + ) + + +# --------------------------------------------------------------------------- +# create() +# --------------------------------------------------------------------------- + + +class TestVLANCreate: + def test_create_returns_vlan_instance(self): + vlan = make_vlan() + assert isinstance(vlan, VLAN) + + def test_create_sets_vid_as_vlan_id(self): + vlan = make_vlan(vid=200) + assert isinstance(vlan.vid, VLANId) + assert vlan.vid.vid == 200 + + def test_create_sets_minimum_valid_vid(self): + vlan = make_vlan(vid=1) + assert vlan.vid.vid == 1 + + def test_create_sets_maximum_valid_vid(self): + vlan = make_vlan(vid=4094) + assert vlan.vid.vid == 4094 + + def test_create_with_vid_zero_raises(self): + with pytest.raises((ValidationError, ValueError)): + make_vlan(vid=0) + + def test_create_with_vid_4095_raises(self): + with pytest.raises((ValidationError, ValueError)): + make_vlan(vid=4095) + + def test_create_with_negative_vid_raises(self): + with pytest.raises((ValidationError, ValueError)): + make_vlan(vid=-1) + + def test_create_sets_name(self): + vlan = make_vlan(name="production") + assert vlan.name == "production" + + def test_create_sets_group_id(self): + group_id = uuid4() + vlan = make_vlan(group_id=group_id) + assert vlan.group_id == group_id + + def test_create_sets_group_id_none_by_default(self): + vlan = make_vlan() + assert vlan.group_id is None + + def test_create_sets_default_status_active(self): + vlan = make_vlan() + assert vlan.status == VLANStatus.ACTIVE + + def test_create_sets_explicit_status(self): + vlan = make_vlan(status=VLANStatus.RESERVED) + assert vlan.status == VLANStatus.RESERVED + + def test_create_sets_role(self): + vlan = make_vlan(role="access") + assert vlan.role == "access" + + def test_create_sets_role_none_by_default(self): + vlan = make_vlan() + assert vlan.role is None + + def test_create_sets_tenant_id(self): + tenant_id = uuid4() + vlan = make_vlan(tenant_id=tenant_id) + assert vlan.tenant_id == tenant_id + + def test_create_sets_description(self): + vlan = make_vlan(description="Production VLAN") + assert vlan.description == "Production VLAN" + + def test_create_version_is_1(self): + vlan = make_vlan() + assert vlan.version == 1 + + def test_create_is_not_deleted(self): + vlan = make_vlan() + assert vlan._deleted is False + + def test_create_produces_one_event(self): + vlan = make_vlan() + events = vlan.collect_uncommitted_events() + assert len(events) == 1 + + def test_create_event_type_is_vlan_created(self): + vlan = make_vlan() + events = vlan.collect_uncommitted_events() + assert isinstance(events[0], VLANCreated) + + def test_create_event_has_correct_aggregate_id(self): + vlan = make_vlan() + events = vlan.collect_uncommitted_events() + assert events[0].aggregate_id == vlan.id + + def test_create_event_has_version_1(self): + vlan = make_vlan() + events = vlan.collect_uncommitted_events() + assert events[0].version == 1 + + def test_create_event_vid_matches(self): + vlan = make_vlan(vid=500) + events = vlan.collect_uncommitted_events() + assert events[0].vid == 500 + + def 
test_create_event_name_matches(self): + vlan = make_vlan(name="dmz") + events = vlan.collect_uncommitted_events() + assert events[0].name == "dmz" + + def test_create_assigns_unique_ids(self): + v1 = make_vlan(vid=10) + v2 = make_vlan(vid=20) + assert v1.id != v2.id + + def test_collect_uncommitted_events_clears_queue(self): + vlan = make_vlan() + vlan.collect_uncommitted_events() + assert vlan.collect_uncommitted_events() == [] + + +# --------------------------------------------------------------------------- +# update() +# --------------------------------------------------------------------------- + + +class TestVLANUpdate: + def test_update_name_changes_name(self): + vlan = make_vlan(name="old") + vlan.collect_uncommitted_events() + vlan.update(name="new-name") + assert vlan.name == "new-name" + + def test_update_role_changes_role(self): + vlan = make_vlan(role="old-role") + vlan.collect_uncommitted_events() + vlan.update(role="trunk") + assert vlan.role == "trunk" + + def test_update_description_changes_description(self): + vlan = make_vlan(description="old") + vlan.collect_uncommitted_events() + vlan.update(description="new description") + assert vlan.description == "new description" + + def test_update_produces_vlan_updated_event(self): + vlan = make_vlan() + vlan.collect_uncommitted_events() + vlan.update(name="updated") + events = vlan.collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], VLANUpdated) + + def test_update_increments_version(self): + vlan = make_vlan() + vlan.collect_uncommitted_events() + vlan.update(name="v2") + assert vlan.version == 2 + + def test_update_event_has_correct_aggregate_id(self): + vlan = make_vlan() + vlan.collect_uncommitted_events() + vlan.update(name="x") + events = vlan.collect_uncommitted_events() + assert events[0].aggregate_id == vlan.id + + def test_update_with_none_args_does_not_change_existing_values(self): + vlan = make_vlan(name="keep", role="access", description="keep-desc") + vlan.collect_uncommitted_events() + vlan.update() # all None + assert vlan.name == "keep" + assert vlan.role == "access" + assert vlan.description == "keep-desc" + + def test_update_after_delete_raises_business_rule_violation(self): + vlan = make_vlan() + vlan.collect_uncommitted_events() + vlan.delete() + with pytest.raises(BusinessRuleViolationError, match="deleted"): + vlan.update(name="should fail") + + def test_multiple_updates_accumulate_version(self): + vlan = make_vlan() + vlan.collect_uncommitted_events() + vlan.update(name="v2") + vlan.collect_uncommitted_events() + vlan.update(description="v3-desc") + assert vlan.version == 3 + + +# --------------------------------------------------------------------------- +# change_status() +# --------------------------------------------------------------------------- + + +class TestVLANChangeStatus: + def test_change_status_updates_status(self): + vlan = make_vlan(status=VLANStatus.ACTIVE) + vlan.collect_uncommitted_events() + vlan.change_status(VLANStatus.RESERVED) + assert vlan.status == VLANStatus.RESERVED + + def test_change_status_to_deprecated(self): + vlan = make_vlan(status=VLANStatus.ACTIVE) + vlan.collect_uncommitted_events() + vlan.change_status(VLANStatus.DEPRECATED) + assert vlan.status == VLANStatus.DEPRECATED + + def test_change_status_produces_vlan_status_changed_event(self): + vlan = make_vlan(status=VLANStatus.ACTIVE) + vlan.collect_uncommitted_events() + vlan.change_status(VLANStatus.DEPRECATED) + events = vlan.collect_uncommitted_events() + assert len(events) == 1 + 
assert isinstance(events[0], VLANStatusChanged) + + def test_change_status_event_contains_old_and_new_status(self): + vlan = make_vlan(status=VLANStatus.ACTIVE) + vlan.collect_uncommitted_events() + vlan.change_status(VLANStatus.RESERVED) + events = vlan.collect_uncommitted_events() + assert events[0].old_status == "active" + assert events[0].new_status == "reserved" + + def test_change_status_increments_version(self): + vlan = make_vlan() + vlan.collect_uncommitted_events() + vlan.change_status(VLANStatus.DEPRECATED) + assert vlan.version == 2 + + def test_change_status_to_same_status_raises_business_rule_violation(self): + vlan = make_vlan(status=VLANStatus.ACTIVE) + vlan.collect_uncommitted_events() + with pytest.raises(BusinessRuleViolationError, match="already"): + vlan.change_status(VLANStatus.ACTIVE) + + def test_change_status_after_delete_raises_business_rule_violation(self): + vlan = make_vlan() + vlan.collect_uncommitted_events() + vlan.delete() + with pytest.raises(BusinessRuleViolationError, match="deleted"): + vlan.change_status(VLANStatus.RESERVED) + + +# --------------------------------------------------------------------------- +# delete() +# --------------------------------------------------------------------------- + + +class TestVLANDelete: + def test_delete_marks_vlan_as_deleted(self): + vlan = make_vlan() + vlan.collect_uncommitted_events() + vlan.delete() + assert vlan._deleted is True + + def test_delete_produces_vlan_deleted_event(self): + vlan = make_vlan() + vlan.collect_uncommitted_events() + vlan.delete() + events = vlan.collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], VLANDeleted) + + def test_delete_increments_version(self): + vlan = make_vlan() + vlan.collect_uncommitted_events() + vlan.delete() + assert vlan.version == 2 + + def test_delete_twice_raises_business_rule_violation(self): + vlan = make_vlan() + vlan.collect_uncommitted_events() + vlan.delete() + with pytest.raises(BusinessRuleViolationError, match="already deleted"): + vlan.delete() + + def test_update_after_delete_is_blocked(self): + vlan = make_vlan() + vlan.collect_uncommitted_events() + vlan.delete() + with pytest.raises(BusinessRuleViolationError): + vlan.update(name="blocked") + + def test_change_status_after_delete_is_blocked(self): + vlan = make_vlan() + vlan.collect_uncommitted_events() + vlan.delete() + with pytest.raises(BusinessRuleViolationError): + vlan.change_status(VLANStatus.DEPRECATED) + + +# --------------------------------------------------------------------------- +# load_from_history() +# --------------------------------------------------------------------------- + + +class TestVLANLoadFromHistory: + def test_load_from_history_restores_vid(self): + original = make_vlan(vid=300) + events = original.collect_uncommitted_events() + + restored = VLAN() + restored.load_from_history(events) + + assert restored.vid.vid == 300 + + def test_load_from_history_restores_name(self): + original = make_vlan(name="corp-lan") + events = original.collect_uncommitted_events() + + restored = VLAN() + restored.load_from_history(events) + + assert restored.name == "corp-lan" + + def test_load_from_history_restores_group_id(self): + group_id = uuid4() + original = make_vlan(group_id=group_id) + events = original.collect_uncommitted_events() + + restored = VLAN() + restored.load_from_history(events) + + assert restored.group_id == group_id + + def test_load_from_history_restores_status(self): + original = make_vlan(status=VLANStatus.RESERVED) + events = 
original.collect_uncommitted_events() + + restored = VLAN() + restored.load_from_history(events) + + assert restored.status == VLANStatus.RESERVED + + def test_load_from_history_restores_after_update(self): + vlan = make_vlan(name="original") + vlan.update(name="updated", role="access", description="new-desc") + events = vlan.collect_uncommitted_events() + + restored = VLAN() + restored.load_from_history(events) + + assert restored.name == "updated" + assert restored.role == "access" + assert restored.description == "new-desc" + assert restored.version == 2 + + def test_load_from_history_restores_status_change(self): + vlan = make_vlan(status=VLANStatus.ACTIVE) + vlan.change_status(VLANStatus.DEPRECATED) + events = vlan.collect_uncommitted_events() + + restored = VLAN() + restored.load_from_history(events) + + assert restored.status == VLANStatus.DEPRECATED + + def test_load_from_history_restores_deleted_state(self): + vlan = make_vlan() + vlan.delete() + events = vlan.collect_uncommitted_events() + + restored = VLAN() + restored.load_from_history(events) + + assert restored._deleted is True + assert restored.version == 2 + + def test_load_from_history_does_not_add_uncommitted_events(self): + vlan = make_vlan() + vlan.update(name="v2") + events = vlan.collect_uncommitted_events() + + restored = VLAN() + restored.load_from_history(events) + + assert restored.collect_uncommitted_events() == [] + + +# --------------------------------------------------------------------------- +# Snapshot round-trip +# --------------------------------------------------------------------------- + + +class TestVLANSnapshot: + def test_to_snapshot_returns_dict(self): + vlan = make_vlan() + snap = vlan.to_snapshot() + assert isinstance(snap, dict) + + def test_to_snapshot_contains_expected_keys(self): + vlan = make_vlan() + snap = vlan.to_snapshot() + expected = { + "vid", + "name", + "group_id", + "status", + "role", + "tenant_id", + "description", + "custom_fields", + "tags", + "deleted", + } + assert expected == snap.keys() + + def test_snapshot_roundtrip_preserves_vid(self): + vlan = make_vlan(vid=777) + snap = vlan.to_snapshot() + restored = VLAN.from_snapshot(vlan.id, snap, vlan.version) + assert restored.vid.vid == 777 + + def test_snapshot_roundtrip_preserves_name(self): + vlan = make_vlan(name="data-center") + snap = vlan.to_snapshot() + restored = VLAN.from_snapshot(vlan.id, snap, vlan.version) + assert restored.name == "data-center" + + def test_snapshot_roundtrip_preserves_status(self): + vlan = make_vlan(status=VLANStatus.DEPRECATED) + snap = vlan.to_snapshot() + restored = VLAN.from_snapshot(vlan.id, snap, vlan.version) + assert restored.status == VLANStatus.DEPRECATED + + def test_snapshot_roundtrip_preserves_group_id(self): + group_id = uuid4() + vlan = make_vlan(group_id=group_id) + snap = vlan.to_snapshot() + restored = VLAN.from_snapshot(vlan.id, snap, vlan.version) + assert restored.group_id == group_id + + def test_snapshot_roundtrip_preserves_none_group_id(self): + vlan = make_vlan(group_id=None) + snap = vlan.to_snapshot() + restored = VLAN.from_snapshot(vlan.id, snap, vlan.version) + assert restored.group_id is None + + def test_snapshot_roundtrip_preserves_role(self): + vlan = make_vlan(role="access") + snap = vlan.to_snapshot() + restored = VLAN.from_snapshot(vlan.id, snap, vlan.version) + assert restored.role == "access" + + def test_snapshot_roundtrip_preserves_description(self): + vlan = make_vlan(description="my vlan") + snap = vlan.to_snapshot() + restored = 
VLAN.from_snapshot(vlan.id, snap, vlan.version) + assert restored.description == "my vlan" + + def test_snapshot_roundtrip_preserves_aggregate_id(self): + vlan = make_vlan() + snap = vlan.to_snapshot() + restored = VLAN.from_snapshot(vlan.id, snap, vlan.version) + assert restored.id == vlan.id + + def test_snapshot_roundtrip_preserves_version(self): + vlan = make_vlan() + vlan.collect_uncommitted_events() + vlan.update(name="v2") + snap = vlan.to_snapshot() + restored = VLAN.from_snapshot(vlan.id, snap, vlan.version) + assert restored.version == 2 + + def test_snapshot_roundtrip_preserves_deleted_state(self): + vlan = make_vlan() + vlan.collect_uncommitted_events() + vlan.delete() + snap = vlan.to_snapshot() + restored = VLAN.from_snapshot(vlan.id, snap, vlan.version) + assert restored._deleted is True + + def test_from_snapshot_does_not_produce_uncommitted_events(self): + vlan = make_vlan() + snap = vlan.to_snapshot() + restored = VLAN.from_snapshot(vlan.id, snap, vlan.version) + assert restored.collect_uncommitted_events() == [] diff --git a/services/ipam/tests/test_domain/test_vlan_group.py b/services/ipam/tests/test_domain/test_vlan_group.py new file mode 100644 index 0000000..3cf9f44 --- /dev/null +++ b/services/ipam/tests/test_domain/test_vlan_group.py @@ -0,0 +1,163 @@ +"""Unit tests for the VLANGroup aggregate root.""" + +from uuid import uuid4 + +import pytest +from ipam.domain.events import VLANGroupCreated, VLANGroupDeleted, VLANGroupUpdated +from ipam.domain.vlan_group import VLANGroup +from shared.domain.exceptions import BusinessRuleViolationError + + +def make_vlan_group( + name: str = "Default", + slug: str = "default", + min_vid: int = 1, + max_vid: int = 4094, + tenant_id=None, + description: str = "", + custom_fields: dict | None = None, + tags: list | None = None, +) -> VLANGroup: + return VLANGroup.create( + name=name, + slug=slug, + min_vid=min_vid, + max_vid=max_vid, + tenant_id=tenant_id, + description=description, + custom_fields=custom_fields, + tags=tags, + ) + + +class TestVLANGroupCreate: + def test_create_returns_instance(self): + assert isinstance(make_vlan_group(), VLANGroup) + + def test_create_sets_fields(self): + vg = make_vlan_group(name="Test", slug="test", min_vid=100, max_vid=200) + assert vg.name == "Test" + assert vg.slug == "test" + assert vg.min_vid == 100 + assert vg.max_vid == 200 + + def test_create_emits_event(self): + events = make_vlan_group().collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], VLANGroupCreated) + + def test_create_version_is_1(self): + assert make_vlan_group().version == 1 + + def test_create_invalid_min_vid(self): + with pytest.raises(BusinessRuleViolationError): + make_vlan_group(min_vid=0) + + def test_create_invalid_max_vid(self): + with pytest.raises(BusinessRuleViolationError): + make_vlan_group(max_vid=5000) + + def test_create_min_greater_than_max(self): + with pytest.raises(BusinessRuleViolationError): + make_vlan_group(min_vid=200, max_vid=100) + + +class TestVLANGroupUpdate: + def test_update_name(self): + vg = make_vlan_group() + vg.collect_uncommitted_events() + vg.update(name="Updated") + assert vg.name == "Updated" + + def test_update_vid_range(self): + vg = make_vlan_group() + vg.collect_uncommitted_events() + vg.update(min_vid=10, max_vid=100) + assert vg.min_vid == 10 + assert vg.max_vid == 100 + + def test_update_invalid_vid_range(self): + vg = make_vlan_group() + vg.collect_uncommitted_events() + with pytest.raises(BusinessRuleViolationError): + vg.update(min_vid=500, 
max_vid=100) + + def test_update_produces_event(self): + vg = make_vlan_group() + vg.collect_uncommitted_events() + vg.update(name="new") + events = vg.collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], VLANGroupUpdated) + + def test_update_deleted_raises_error(self): + vg = make_vlan_group() + vg.collect_uncommitted_events() + vg.delete() + with pytest.raises(BusinessRuleViolationError, match="deleted"): + vg.update(name="fail") + + +class TestVLANGroupDelete: + def test_delete_marks_deleted(self): + vg = make_vlan_group() + vg.collect_uncommitted_events() + vg.delete() + assert vg._deleted is True + + def test_delete_produces_event(self): + vg = make_vlan_group() + vg.collect_uncommitted_events() + vg.delete() + events = vg.collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], VLANGroupDeleted) + + def test_double_delete_raises_error(self): + vg = make_vlan_group() + vg.collect_uncommitted_events() + vg.delete() + with pytest.raises(BusinessRuleViolationError, match="already deleted"): + vg.delete() + + +class TestVLANGroupLoadFromHistory: + def test_load_from_history_restores_state(self): + original = make_vlan_group(name="Orig", slug="orig", min_vid=10, max_vid=500) + original.update(name="Updated") + original.delete() + events = original.collect_uncommitted_events() + + restored = VLANGroup() + restored.load_from_history(events) + assert restored.name == "Updated" + assert restored.slug == "orig" + assert restored.min_vid == 10 + assert restored.max_vid == 500 + assert restored._deleted is True + assert restored.version == 3 + + +class TestVLANGroupSnapshot: + def test_snapshot_roundtrip(self): + tag_id = uuid4() + tid = uuid4() + vg = make_vlan_group( + name="Test", + slug="test", + min_vid=10, + max_vid=200, + tenant_id=tid, + description="desc", + custom_fields={"k": "v"}, + tags=[tag_id], + ) + snap = vg.to_snapshot() + restored = VLANGroup.from_snapshot(vg.id, snap, vg.version) + assert restored.name == "Test" + assert restored.slug == "test" + assert restored.min_vid == 10 + assert restored.max_vid == 200 + assert restored.tenant_id == tid + assert restored.tags == [tag_id] + assert restored.id == vg.id diff --git a/services/ipam/tests/test_domain/test_vrf.py b/services/ipam/tests/test_domain/test_vrf.py new file mode 100644 index 0000000..4715eb2 --- /dev/null +++ b/services/ipam/tests/test_domain/test_vrf.py @@ -0,0 +1,477 @@ +"""Unit tests for the VRF aggregate root.""" + +from uuid import UUID, uuid4 + +import pytest +from ipam.domain.events import VRFCreated, VRFDeleted, VRFUpdated +from ipam.domain.value_objects import RouteDistinguisher +from ipam.domain.vrf import VRF +from pydantic import ValidationError +from shared.domain.exceptions import BusinessRuleViolationError + +# --------------------------------------------------------------------------- +# Helpers +# --------------------------------------------------------------------------- + + +def make_vrf( + name: str = "default", + rd: str | None = None, + tenant_id: UUID | None = None, + description: str = "", +) -> VRF: + return VRF.create( + name=name, + rd=rd, + tenant_id=tenant_id, + description=description, + ) + + +# --------------------------------------------------------------------------- +# create() +# --------------------------------------------------------------------------- + + +class TestVRFCreate: + def test_create_returns_vrf_instance(self): + vrf = make_vrf() + assert isinstance(vrf, VRF) + + def test_create_sets_name(self): + vrf = 
make_vrf(name="management") + assert vrf.name == "management" + + def test_create_sets_rd_as_route_distinguisher(self): + vrf = make_vrf(rd="65000:100") + assert isinstance(vrf.rd, RouteDistinguisher) + assert vrf.rd.rd == "65000:100" + + def test_create_sets_rd_none_by_default(self): + vrf = make_vrf() + assert vrf.rd is None + + def test_create_with_ip_based_rd(self): + vrf = make_vrf(rd="192.168.1.1:100") + assert vrf.rd.rd == "192.168.1.1:100" + + def test_create_with_invalid_rd_raises(self): + with pytest.raises((ValueError, ValidationError)): + make_vrf(rd="invalid-rd-format") + + def test_create_sets_tenant_id(self): + tenant_id = uuid4() + vrf = make_vrf(tenant_id=tenant_id) + assert vrf.tenant_id == tenant_id + + def test_create_sets_tenant_id_none_by_default(self): + vrf = make_vrf() + assert vrf.tenant_id is None + + def test_create_sets_description(self): + vrf = make_vrf(description="Customer A VRF") + assert vrf.description == "Customer A VRF" + + def test_create_sets_empty_description_by_default(self): + vrf = make_vrf() + assert vrf.description == "" + + def test_create_version_is_1(self): + vrf = make_vrf() + assert vrf.version == 1 + + def test_create_is_not_deleted(self): + vrf = make_vrf() + assert vrf._deleted is False + + def test_create_produces_one_event(self): + vrf = make_vrf() + events = vrf.collect_uncommitted_events() + assert len(events) == 1 + + def test_create_event_type_is_vrf_created(self): + vrf = make_vrf() + events = vrf.collect_uncommitted_events() + assert isinstance(events[0], VRFCreated) + + def test_create_event_has_correct_aggregate_id(self): + vrf = make_vrf() + events = vrf.collect_uncommitted_events() + assert events[0].aggregate_id == vrf.id + + def test_create_event_has_version_1(self): + vrf = make_vrf() + events = vrf.collect_uncommitted_events() + assert events[0].version == 1 + + def test_create_event_name_matches(self): + vrf = make_vrf(name="transit") + events = vrf.collect_uncommitted_events() + assert events[0].name == "transit" + + def test_create_event_rd_matches(self): + vrf = make_vrf(rd="65001:200") + events = vrf.collect_uncommitted_events() + assert events[0].rd == "65001:200" + + def test_create_assigns_unique_ids(self): + v1 = make_vrf() + v2 = make_vrf() + assert v1.id != v2.id + + def test_collect_uncommitted_events_clears_queue(self): + vrf = make_vrf() + vrf.collect_uncommitted_events() + assert vrf.collect_uncommitted_events() == [] + + +# --------------------------------------------------------------------------- +# update() +# --------------------------------------------------------------------------- + + +class TestVRFUpdate: + def test_update_name_changes_name(self): + vrf = make_vrf(name="old-name") + vrf.collect_uncommitted_events() + vrf.update(name="new-name") + assert vrf.name == "new-name" + + def test_update_description_changes_description(self): + vrf = make_vrf(description="old") + vrf.collect_uncommitted_events() + vrf.update(description="new description") + assert vrf.description == "new description" + + def test_update_produces_vrf_updated_event(self): + vrf = make_vrf() + vrf.collect_uncommitted_events() + vrf.update(name="updated") + events = vrf.collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], VRFUpdated) + + def test_update_increments_version(self): + vrf = make_vrf() + vrf.collect_uncommitted_events() + vrf.update(name="v2") + assert vrf.version == 2 + + def test_update_event_has_correct_aggregate_id(self): + vrf = make_vrf() + vrf.collect_uncommitted_events() + 
vrf.update(name="x") + events = vrf.collect_uncommitted_events() + assert events[0].aggregate_id == vrf.id + + def test_update_with_none_args_does_not_change_existing_values(self): + vrf = make_vrf(name="keep", description="keep-desc") + vrf.collect_uncommitted_events() + vrf.update() # all None + assert vrf.name == "keep" + assert vrf.description == "keep-desc" + + def test_update_after_delete_raises_business_rule_violation(self): + vrf = make_vrf() + vrf.collect_uncommitted_events() + vrf.delete() + with pytest.raises(BusinessRuleViolationError, match="deleted"): + vrf.update(name="should fail") + + def test_multiple_updates_accumulate_version(self): + vrf = make_vrf() + vrf.collect_uncommitted_events() + vrf.update(name="v2") + vrf.collect_uncommitted_events() + vrf.update(description="v3-desc") + assert vrf.version == 3 + + +# --------------------------------------------------------------------------- +# delete() +# --------------------------------------------------------------------------- + + +class TestVRFDelete: + def test_delete_marks_vrf_as_deleted(self): + vrf = make_vrf() + vrf.collect_uncommitted_events() + vrf.delete() + assert vrf._deleted is True + + def test_delete_produces_vrf_deleted_event(self): + vrf = make_vrf() + vrf.collect_uncommitted_events() + vrf.delete() + events = vrf.collect_uncommitted_events() + assert len(events) == 1 + assert isinstance(events[0], VRFDeleted) + + def test_delete_increments_version(self): + vrf = make_vrf() + vrf.collect_uncommitted_events() + vrf.delete() + assert vrf.version == 2 + + def test_delete_twice_raises_business_rule_violation(self): + vrf = make_vrf() + vrf.collect_uncommitted_events() + vrf.delete() + with pytest.raises(BusinessRuleViolationError, match="already deleted"): + vrf.delete() + + def test_update_after_delete_is_blocked(self): + vrf = make_vrf() + vrf.collect_uncommitted_events() + vrf.delete() + with pytest.raises(BusinessRuleViolationError): + vrf.update(name="blocked") + + +# --------------------------------------------------------------------------- +# load_from_history() +# --------------------------------------------------------------------------- + + +class TestVRFLoadFromHistory: + def test_load_from_history_restores_name(self): + original = make_vrf(name="customer-a") + events = original.collect_uncommitted_events() + + restored = VRF() + restored.load_from_history(events) + + assert restored.name == "customer-a" + + def test_load_from_history_restores_rd(self): + original = make_vrf(rd="65000:999") + events = original.collect_uncommitted_events() + + restored = VRF() + restored.load_from_history(events) + + assert restored.rd is not None + assert restored.rd.rd == "65000:999" + + def test_load_from_history_restores_tenant_id(self): + tenant_id = uuid4() + original = make_vrf(tenant_id=tenant_id) + events = original.collect_uncommitted_events() + + restored = VRF() + restored.load_from_history(events) + + assert restored.tenant_id == tenant_id + + def test_load_from_history_restores_description(self): + original = make_vrf(description="testing") + events = original.collect_uncommitted_events() + + restored = VRF() + restored.load_from_history(events) + + assert restored.description == "testing" + + def test_load_from_history_restores_version(self): + original = make_vrf() + events = original.collect_uncommitted_events() + + restored = VRF() + restored.load_from_history(events) + + assert restored.version == 1 + + def test_load_from_history_restores_state_after_update(self): + vrf = make_vrf(name="original") 
+        vrf.update(name="updated", description="new-desc") +        events = vrf.collect_uncommitted_events() + +        restored = VRF() +        restored.load_from_history(events) + +        assert restored.name == "updated" +        assert restored.description == "new-desc" +        assert restored.version == 2 + +    def test_load_from_history_restores_deleted_state(self): +        vrf = make_vrf() +        vrf.delete() +        events = vrf.collect_uncommitted_events() + +        restored = VRF() +        restored.load_from_history(events) + +        assert restored._deleted is True +        assert restored.version == 2 + +    def test_load_from_history_does_not_add_uncommitted_events(self): +        vrf = make_vrf() +        vrf.update(name="v2") +        events = vrf.collect_uncommitted_events() + +        restored = VRF() +        restored.load_from_history(events) + +        assert restored.collect_uncommitted_events() == [] + +    def test_load_from_history_does_not_restore_aggregate_id(self): +        vrf = make_vrf() +        original_id = vrf.id +        events = vrf.collect_uncommitted_events() + +        restored = VRF() +        restored.load_from_history(events) + +        # load_from_history does not copy the aggregate ID onto the shell object +        assert restored.id != original_id  # shell object keeps its own UUID +        # The original ID survives only on the events themselves +        assert events[0].aggregate_id == original_id + + +# --------------------------------------------------------------------------- +# Snapshot round-trip +# --------------------------------------------------------------------------- + + +class TestVRFSnapshot: +    def test_to_snapshot_returns_dict(self): +        vrf = make_vrf() +        snap = vrf.to_snapshot() +        assert isinstance(snap, dict) + +    def test_to_snapshot_contains_expected_keys(self): +        vrf = make_vrf() +        snap = vrf.to_snapshot() +        expected = { +            "name", +            "rd", +            "import_targets", +            "export_targets", +            "tenant_id", +            "description", +            "custom_fields", +            "tags", +            "deleted", +        } +        assert expected == snap.keys() + +    def test_snapshot_roundtrip_preserves_name(self): +        vrf = make_vrf(name="backbone") +        snap = vrf.to_snapshot() +        restored = VRF.from_snapshot(vrf.id, snap, vrf.version) +        assert restored.name == "backbone" + +    def test_snapshot_roundtrip_preserves_rd(self): +        vrf = make_vrf(rd="65500:1") +        snap = vrf.to_snapshot() +        restored = VRF.from_snapshot(vrf.id, snap, vrf.version) +        assert restored.rd is not None +        assert restored.rd.rd == "65500:1" + +    def test_snapshot_roundtrip_preserves_none_rd(self): +        vrf = make_vrf(rd=None) +        snap = vrf.to_snapshot() +        restored = VRF.from_snapshot(vrf.id, snap, vrf.version) +        assert restored.rd is None + +    def test_snapshot_roundtrip_preserves_tenant_id(self): +        tenant_id = uuid4() +        vrf = make_vrf(tenant_id=tenant_id) +        snap = vrf.to_snapshot() +        restored = VRF.from_snapshot(vrf.id, snap, vrf.version) +        assert restored.tenant_id == tenant_id + +    def test_snapshot_roundtrip_preserves_description(self): +        vrf = make_vrf(description="my vrf") +        snap = vrf.to_snapshot() +        restored = VRF.from_snapshot(vrf.id, snap, vrf.version) +        assert restored.description == "my vrf" + +    def test_snapshot_roundtrip_preserves_aggregate_id(self): +        vrf = make_vrf() +        snap = vrf.to_snapshot() +        restored = VRF.from_snapshot(vrf.id, snap, vrf.version) +        assert restored.id == vrf.id + +    def test_snapshot_roundtrip_preserves_version(self): +        vrf = make_vrf() +        vrf.collect_uncommitted_events() +        vrf.update(name="v2") +        snap = vrf.to_snapshot() +        restored = VRF.from_snapshot(vrf.id, snap, vrf.version) +        assert restored.version == 2 + +    def test_snapshot_roundtrip_preserves_deleted_state(self): +        vrf = make_vrf() +        vrf.collect_uncommitted_events() + 
vrf.delete() + snap = vrf.to_snapshot() + restored = VRF.from_snapshot(vrf.id, snap, vrf.version) + assert restored._deleted is True + + def test_from_snapshot_does_not_produce_uncommitted_events(self): + vrf = make_vrf() + snap = vrf.to_snapshot() + restored = VRF.from_snapshot(vrf.id, snap, vrf.version) + assert restored.collect_uncommitted_events() == [] + + +# --------------------------------------------------------------------------- +# Route Targets (import/export) +# --------------------------------------------------------------------------- + + +class TestVRFRouteTargets: + def test_create_with_import_targets(self): + rt_id = uuid4() + vrf = VRF.create(name="test", import_targets=[rt_id]) + assert vrf.import_targets == [rt_id] + + def test_create_with_export_targets(self): + rt_id = uuid4() + vrf = VRF.create(name="test", export_targets=[rt_id]) + assert vrf.export_targets == [rt_id] + + def test_create_default_empty_targets(self): + vrf = make_vrf() + assert vrf.import_targets == [] + assert vrf.export_targets == [] + + def test_update_import_targets(self): + vrf = make_vrf() + vrf.collect_uncommitted_events() + rt_id = uuid4() + vrf.update(import_targets=[rt_id]) + assert vrf.import_targets == [rt_id] + + def test_update_export_targets(self): + vrf = make_vrf() + vrf.collect_uncommitted_events() + rt_id = uuid4() + vrf.update(export_targets=[rt_id]) + assert vrf.export_targets == [rt_id] + + def test_event_has_import_export_targets(self): + rt1 = uuid4() + rt2 = uuid4() + vrf = VRF.create(name="test", import_targets=[rt1], export_targets=[rt2]) + events = vrf.collect_uncommitted_events() + assert events[0].import_targets == [rt1] + assert events[0].export_targets == [rt2] + + def test_snapshot_preserves_targets(self): + rt1, rt2 = uuid4(), uuid4() + vrf = VRF.create(name="test", import_targets=[rt1], export_targets=[rt2]) + snap = vrf.to_snapshot() + restored = VRF.from_snapshot(vrf.id, snap, vrf.version) + assert restored.import_targets == [rt1] + assert restored.export_targets == [rt2] + + def test_load_from_history_restores_targets(self): + rt1, rt2 = uuid4(), uuid4() + vrf = VRF.create(name="test", import_targets=[rt1], export_targets=[rt2]) + events = vrf.collect_uncommitted_events() + restored = VRF() + restored.load_from_history(events) + assert restored.import_targets == [rt1] + assert restored.export_targets == [rt2] diff --git a/services/ipam/tests/test_integration/__init__.py b/services/ipam/tests/test_integration/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/services/ipam/tests/test_integration/conftest.py b/services/ipam/tests/test_integration/conftest.py new file mode 100644 index 0000000..7f05265 --- /dev/null +++ b/services/ipam/tests/test_integration/conftest.py @@ -0,0 +1,284 @@ +"""Fixtures for IPAM integration tests (in-memory, no Docker required).""" + +from __future__ import annotations + +from datetime import datetime +from typing import Any +from uuid import UUID + +import pytest +from ipam.application.command_handlers import ( + BulkCreatePrefixesHandler, + CreatePrefixHandler, + DeletePrefixHandler, + UpdatePrefixHandler, +) +from ipam.application.query_handlers import GetPrefixHandler, ListPrefixesHandler +from ipam.application.read_model import PrefixReadModelRepository +from shared.domain.exceptions import ConflictError +from shared.event.domain_event import DomainEvent + +# --------------------------------------------------------------------------- +# InMemoryEventStore +# 
--------------------------------------------------------------------------- + + +class InMemoryEventStore: + """In-memory event store for mock integration tests. + + Matches the PostgresEventStore interface used by command handlers: + - append(aggregate_id, events, expected_version, *, aggregate=None, session=None) + - load_aggregate(aggregate_cls, aggregate_id) + - load_stream(aggregate_id, after_version=0) + - load_snapshot(aggregate_id) + - save_snapshot(aggregate_id, state, version) + """ + + def __init__(self) -> None: + self._events: dict[UUID, list[DomainEvent]] = {} + self._versions: dict[UUID, int] = {} + self._snapshots: dict[UUID, tuple[dict[str, Any], int]] = {} + + async def append( + self, + aggregate_id: UUID, + events: list[DomainEvent], + expected_version: int, + *, + aggregate: Any = None, + session: Any = None, + ) -> None: + current_version = self._versions.get(aggregate_id, 0) + if current_version != expected_version: + raise ConflictError( + f"Expected version {expected_version}, but current version is {current_version}", + details={ + "aggregate_id": str(aggregate_id), + "expected_version": expected_version, + "current_version": current_version, + }, + ) + if aggregate_id not in self._events: + self._events[aggregate_id] = [] + self._events[aggregate_id].extend(events) + self._versions[aggregate_id] = current_version + len(events) + + async def load_aggregate(self, aggregate_cls: type, aggregate_id: UUID) -> Any | None: + snapshot_data = await self.load_snapshot(aggregate_id) + if snapshot_data is not None: + state, snapshot_version = snapshot_data + aggregate = aggregate_cls.from_snapshot(aggregate_id, state, snapshot_version) + events = await self.load_stream(aggregate_id, after_version=snapshot_version) + else: + events = await self.load_stream(aggregate_id) + if not events: + return None + aggregate = aggregate_cls(aggregate_id=aggregate_id) + snapshot_version = 0 + + aggregate.load_from_history(events) + return aggregate + + async def load_stream( + self, aggregate_id: UUID, after_version: int = 0, *, session: Any = None + ) -> list[DomainEvent]: + all_events = self._events.get(aggregate_id, []) + return [e for e in all_events if e.version > after_version] + + async def load_snapshot(self, aggregate_id: UUID) -> tuple[dict[str, Any], int] | None: + return self._snapshots.get(aggregate_id) + + async def save_snapshot(self, aggregate_id: UUID, state: dict[str, Any], version: int) -> None: + self._snapshots[aggregate_id] = (state, version) + + def get_events(self, aggregate_id: UUID) -> list[DomainEvent]: + """Helper: return all stored events for an aggregate (for test assertions).""" + return list(self._events.get(aggregate_id, [])) + + def all_events(self) -> list[DomainEvent]: + """Helper: return all stored events across all aggregates.""" + result: list[DomainEvent] = [] + for events in self._events.values(): + result.extend(events) + return result + + +# --------------------------------------------------------------------------- +# FakeKafkaProducer +# --------------------------------------------------------------------------- + + +class FakeKafkaProducer: + """Collects published events for assertion instead of sending to Kafka.""" + + def __init__(self) -> None: + self.published: list[tuple[str, DomainEvent]] = [] + + async def publish(self, topic: str, event: DomainEvent) -> None: + self.published.append((topic, event)) + + async def publish_many(self, topic: str, events: list[DomainEvent]) -> None: + for event in events: + self.published.append((topic, 
event)) + + async def start(self) -> None: + pass + + async def stop(self) -> None: + pass + + def get_events(self, topic: str | None = None) -> list[DomainEvent]: + """Helper: return published events, optionally filtered by topic.""" + if topic is None: + return [e for _, e in self.published] + return [e for t, e in self.published if t == topic] + + +# --------------------------------------------------------------------------- +# InMemoryPrefixReadModelRepository +# --------------------------------------------------------------------------- + + +class InMemoryPrefixReadModelRepository(PrefixReadModelRepository): + """In-memory read model repository for Prefix aggregates.""" + + def __init__(self) -> None: + self._data: dict[UUID, dict] = {} + + async def upsert_from_aggregate(self, aggregate: Any) -> None: + now = datetime.now() + existing = self._data.get(aggregate.id) + created_at = existing["created_at"] if existing else now + self._data[aggregate.id] = { + "id": aggregate.id, + "network": str(aggregate.network.network) if aggregate.network else None, + "vrf_id": aggregate.vrf_id, + "vlan_id": aggregate.vlan_id, + "status": aggregate.status.value, + "role": aggregate.role, + "tenant_id": aggregate.tenant_id, + "description": aggregate.description, + "custom_fields": aggregate.custom_fields, + "tags": list(aggregate.tags), + "created_at": created_at, + "updated_at": now, + } + + async def find_by_id(self, entity_id: UUID) -> dict | None: + data = self._data.get(entity_id) + if data is None: + return None + if data.get("_deleted"): + return None + return dict(data) + + async def find_all( + self, + *, + offset: int = 0, + limit: int = 50, + filters: Any = None, + sort_params: Any = None, + tag_slugs: Any = None, + custom_field_filters: Any = None, + ) -> tuple[list[dict], int]: + items = [d for d in self._data.values() if not d.get("_deleted")] + total = len(items) + return items[offset : offset + limit], total + + async def mark_deleted(self, entity_id: UUID) -> None: + if entity_id in self._data: + self._data[entity_id]["_deleted"] = True + + async def find_children(self, parent_network: str, vrf_id: UUID | None) -> list[dict]: + import ipaddress + + parent_net = ipaddress.ip_network(parent_network, strict=False) + children: list[dict] = [] + for data in self._data.values(): + if data.get("_deleted"): + continue + if data.get("vrf_id") != vrf_id: + continue + try: + child_net = ipaddress.ip_network(data["network"], strict=False) + except (ValueError, TypeError): + continue + if child_net != parent_net and child_net.subnet_of(parent_net): + children.append(dict(data)) + return children + + async def find_by_vrf(self, vrf_id: UUID, *, offset: int = 0, limit: int = 50) -> tuple[list[dict], int]: + items = [d for d in self._data.values() if not d.get("_deleted") and d.get("vrf_id") == vrf_id] + total = len(items) + return items[offset : offset + limit], total + + +# --------------------------------------------------------------------------- +# Fixtures +# --------------------------------------------------------------------------- + + +@pytest.fixture +def event_store() -> InMemoryEventStore: + return InMemoryEventStore() + + +@pytest.fixture +def kafka_producer() -> FakeKafkaProducer: + return FakeKafkaProducer() + + +@pytest.fixture +def prefix_read_repo() -> InMemoryPrefixReadModelRepository: + return InMemoryPrefixReadModelRepository() + + +@pytest.fixture +def create_prefix_handler( + event_store: InMemoryEventStore, + prefix_read_repo: InMemoryPrefixReadModelRepository, + 
kafka_producer: FakeKafkaProducer, +) -> CreatePrefixHandler: + return CreatePrefixHandler(event_store, prefix_read_repo, kafka_producer) + + +@pytest.fixture +def update_prefix_handler( + event_store: InMemoryEventStore, + prefix_read_repo: InMemoryPrefixReadModelRepository, + kafka_producer: FakeKafkaProducer, +) -> UpdatePrefixHandler: + return UpdatePrefixHandler(event_store, prefix_read_repo, kafka_producer) + + +@pytest.fixture +def delete_prefix_handler( + event_store: InMemoryEventStore, + prefix_read_repo: InMemoryPrefixReadModelRepository, + kafka_producer: FakeKafkaProducer, +) -> DeletePrefixHandler: + return DeletePrefixHandler(event_store, prefix_read_repo, kafka_producer) + + +@pytest.fixture +def bulk_create_prefix_handler( + event_store: InMemoryEventStore, + prefix_read_repo: InMemoryPrefixReadModelRepository, + kafka_producer: FakeKafkaProducer, +) -> BulkCreatePrefixesHandler: + return BulkCreatePrefixesHandler(event_store, prefix_read_repo, kafka_producer) + + +@pytest.fixture +def get_prefix_handler( + prefix_read_repo: InMemoryPrefixReadModelRepository, +) -> GetPrefixHandler: + return GetPrefixHandler(prefix_read_repo) + + +@pytest.fixture +def list_prefixes_handler( + prefix_read_repo: InMemoryPrefixReadModelRepository, +) -> ListPrefixesHandler: + return ListPrefixesHandler(prefix_read_repo) diff --git a/services/ipam/tests/test_integration/test_event_flow.py b/services/ipam/tests/test_integration/test_event_flow.py new file mode 100644 index 0000000..d846136 --- /dev/null +++ b/services/ipam/tests/test_integration/test_event_flow.py @@ -0,0 +1,430 @@ +"""Event flow tests: EventSerializer serialization/deserialization roundtrip for all IPAM events. + +No Docker required — these run as regular (non-integration) tests. +""" + +from __future__ import annotations + +from uuid import uuid4 + +import pytest +from ipam.domain.events import ( + ASNCreated, + ASNDeleted, + ASNUpdated, + FHRPGroupCreated, + FHRPGroupDeleted, + FHRPGroupUpdated, + IPAddressCreated, + IPAddressDeleted, + IPAddressStatusChanged, + IPAddressUpdated, + IPRangeCreated, + IPRangeDeleted, + IPRangeStatusChanged, + IPRangeUpdated, + PrefixCreated, + PrefixDeleted, + PrefixStatusChanged, + PrefixUpdated, + RIRCreated, + RIRDeleted, + RIRUpdated, + RouteTargetCreated, + RouteTargetDeleted, + RouteTargetUpdated, + ServiceCreated, + ServiceDeleted, + ServiceUpdated, + VLANCreated, + VLANDeleted, + VLANGroupCreated, + VLANGroupDeleted, + VLANGroupUpdated, + VLANStatusChanged, + VLANUpdated, + VRFCreated, + VRFDeleted, + VRFUpdated, +) +from shared.messaging.serialization import EventSerializer + + +def _make_serializer() -> EventSerializer: + """Create an EventSerializer with all IPAM event types registered.""" + serializer = EventSerializer() + all_event_types = [ + PrefixCreated, + PrefixUpdated, + PrefixDeleted, + PrefixStatusChanged, + IPAddressCreated, + IPAddressUpdated, + IPAddressDeleted, + IPAddressStatusChanged, + IPRangeCreated, + IPRangeUpdated, + IPRangeDeleted, + IPRangeStatusChanged, + VRFCreated, + VRFUpdated, + VRFDeleted, + VLANCreated, + VLANUpdated, + VLANDeleted, + VLANStatusChanged, + VLANGroupCreated, + VLANGroupUpdated, + VLANGroupDeleted, + RIRCreated, + RIRUpdated, + RIRDeleted, + ASNCreated, + ASNUpdated, + ASNDeleted, + FHRPGroupCreated, + FHRPGroupUpdated, + FHRPGroupDeleted, + RouteTargetCreated, + RouteTargetUpdated, + RouteTargetDeleted, + ServiceCreated, + ServiceUpdated, + ServiceDeleted, + ] + for event_cls in all_event_types: + serializer.register(event_cls) + return 
serializer + + +def _make_sample_events() -> list: + """Create one sample instance of each IPAM event type.""" + agg_id = uuid4() + vrf_id = uuid4() + vlan_id = uuid4() + tenant_id = uuid4() + rir_id = uuid4() + group_id = uuid4() + tag_id = uuid4() + ip_addr_id = uuid4() + + return [ + # Prefix events + PrefixCreated( + aggregate_id=agg_id, + version=1, + network="10.0.0.0/8", + vrf_id=vrf_id, + vlan_id=vlan_id, + status="active", + role="infrastructure", + tenant_id=tenant_id, + description="Test prefix", + custom_fields={"site": "dc1"}, + tags=[tag_id], + ), + PrefixUpdated( + aggregate_id=agg_id, + version=2, + description="Updated description", + role="production", + ), + PrefixDeleted( + aggregate_id=agg_id, + version=3, + ), + PrefixStatusChanged( + aggregate_id=agg_id, + version=2, + old_status="active", + new_status="reserved", + ), + # IPAddress events + IPAddressCreated( + aggregate_id=uuid4(), + version=1, + address="10.0.0.1/32", + vrf_id=vrf_id, + status="active", + dns_name="host1.example.com", + tenant_id=tenant_id, + description="Host 1", + custom_fields={"env": "prod"}, + tags=[tag_id], + ), + IPAddressUpdated( + aggregate_id=uuid4(), + version=2, + dns_name="host1-updated.example.com", + description="Updated host", + ), + IPAddressDeleted( + aggregate_id=uuid4(), + version=3, + ), + IPAddressStatusChanged( + aggregate_id=uuid4(), + version=2, + old_status="active", + new_status="deprecated", + ), + # IPRange events + IPRangeCreated( + aggregate_id=uuid4(), + version=1, + start_address="10.0.0.1", + end_address="10.0.0.255", + vrf_id=vrf_id, + status="active", + tenant_id=tenant_id, + description="Range 1", + ), + IPRangeUpdated( + aggregate_id=uuid4(), + version=2, + description="Updated range", + ), + IPRangeDeleted( + aggregate_id=uuid4(), + version=3, + ), + IPRangeStatusChanged( + aggregate_id=uuid4(), + version=2, + old_status="active", + new_status="reserved", + ), + # VRF events + VRFCreated( + aggregate_id=uuid4(), + version=1, + name="VRF-1", + rd="65000:1", + import_targets=[uuid4()], + export_targets=[uuid4()], + tenant_id=tenant_id, + description="Test VRF", + ), + VRFUpdated( + aggregate_id=uuid4(), + version=2, + name="VRF-1-updated", + description="Updated VRF", + ), + VRFDeleted( + aggregate_id=uuid4(), + version=3, + ), + # VLAN events + VLANCreated( + aggregate_id=uuid4(), + version=1, + vid=100, + name="VLAN-100", + group_id=group_id, + status="active", + role="production", + tenant_id=tenant_id, + description="Production VLAN", + ), + VLANUpdated( + aggregate_id=uuid4(), + version=2, + name="VLAN-100-updated", + ), + VLANDeleted( + aggregate_id=uuid4(), + version=3, + ), + VLANStatusChanged( + aggregate_id=uuid4(), + version=2, + old_status="active", + new_status="deprecated", + ), + # VLANGroup events + VLANGroupCreated( + aggregate_id=uuid4(), + version=1, + name="Group-1", + slug="group-1", + min_vid=1, + max_vid=4094, + tenant_id=tenant_id, + description="Test group", + ), + VLANGroupUpdated( + aggregate_id=uuid4(), + version=2, + name="Group-1-updated", + ), + VLANGroupDeleted( + aggregate_id=uuid4(), + version=3, + ), + # RIR events + RIRCreated( + aggregate_id=uuid4(), + version=1, + name="ARIN", + is_private=False, + description="American Registry", + ), + RIRUpdated( + aggregate_id=uuid4(), + version=2, + description="Updated RIR", + is_private=True, + ), + RIRDeleted( + aggregate_id=uuid4(), + version=3, + ), + # ASN events + ASNCreated( + aggregate_id=uuid4(), + version=1, + asn=65000, + rir_id=rir_id, + tenant_id=tenant_id, + 
description="Test ASN", + ), + ASNUpdated( + aggregate_id=uuid4(), + version=2, + description="Updated ASN", + ), + ASNDeleted( + aggregate_id=uuid4(), + version=3, + ), + # FHRPGroup events + FHRPGroupCreated( + aggregate_id=uuid4(), + version=1, + protocol="vrrp", + group_id_value=1, + auth_type="plaintext", + auth_key="secret", + name="FHRP-1", + description="Test FHRP", + ), + FHRPGroupUpdated( + aggregate_id=uuid4(), + version=2, + name="FHRP-1-updated", + ), + FHRPGroupDeleted( + aggregate_id=uuid4(), + version=3, + ), + # RouteTarget events + RouteTargetCreated( + aggregate_id=uuid4(), + version=1, + name="65000:100", + tenant_id=tenant_id, + description="Test RT", + ), + RouteTargetUpdated( + aggregate_id=uuid4(), + version=2, + description="Updated RT", + ), + RouteTargetDeleted( + aggregate_id=uuid4(), + version=3, + ), + # Service events + ServiceCreated( + aggregate_id=uuid4(), + version=1, + name="HTTP", + protocol="tcp", + ports=[80, 443], + ip_addresses=[ip_addr_id], + description="Web service", + ), + ServiceUpdated( + aggregate_id=uuid4(), + version=2, + name="HTTPS", + ports=[443], + ), + ServiceDeleted( + aggregate_id=uuid4(), + version=3, + ), + ] + + +class TestEventSerializerRoundtrip: + """Verify serialize → deserialize roundtrip for every IPAM event type.""" + + @pytest.fixture + def serializer(self) -> EventSerializer: + return _make_serializer() + + @pytest.fixture + def sample_events(self) -> list: + return _make_sample_events() + + async def test_all_events_roundtrip(self, serializer: EventSerializer, sample_events: list) -> None: + """Every IPAM event should survive a serialize/deserialize roundtrip.""" + for original in sample_events: + serialized = serializer.serialize(original) + assert isinstance(serialized, bytes) + + deserialized = serializer.deserialize(serialized) + assert type(deserialized) is type(original), ( + f"Type mismatch: expected {type(original).__name__}, got {type(deserialized).__name__}" + ) + assert deserialized.aggregate_id == original.aggregate_id + assert deserialized.version == original.version + assert deserialized.event_type == original.event_type + + @pytest.mark.parametrize( + "event", + _make_sample_events(), + ids=lambda e: type(e).__name__, + ) + async def test_individual_event_roundtrip(self, serializer: EventSerializer, event) -> None: + """Parameterized test: each event type individually.""" + serialized = serializer.serialize(event) + deserialized = serializer.deserialize(serialized) + + assert type(deserialized) is type(event) + assert deserialized.aggregate_id == event.aggregate_id + assert deserialized.version == event.version + + # Compare all non-meta fields + original_data = event.model_dump(exclude={"event_id", "timestamp"}) + roundtrip_data = deserialized.model_dump(exclude={"event_id", "timestamp"}) + assert roundtrip_data == original_data, ( + f"Field mismatch for {type(event).__name__}: {original_data} != {roundtrip_data}" + ) + + async def test_serialized_format_is_json_bytes(self, serializer: EventSerializer) -> None: + """Serialized output should be valid JSON bytes.""" + import json + + event = PrefixCreated( + aggregate_id=uuid4(), + version=1, + network="10.0.0.0/8", + ) + serialized = serializer.serialize(event) + parsed = json.loads(serialized) + assert isinstance(parsed, dict) + assert parsed["network"] == "10.0.0.0/8" + assert "event_type" in parsed + + async def test_event_type_field_contains_module_path(self, serializer: EventSerializer) -> None: + """The event_type field should contain the full 
module-qualified name.""" + event = PrefixCreated( + aggregate_id=uuid4(), + version=1, + network="10.0.0.0/8", + ) + assert "PrefixCreated" in event.event_type + assert "ipam.domain.events" in event.event_type diff --git a/services/ipam/tests/test_integration/test_ipam_db.py b/services/ipam/tests/test_integration/test_ipam_db.py new file mode 100644 index 0000000..730f4e9 --- /dev/null +++ b/services/ipam/tests/test_integration/test_ipam_db.py @@ -0,0 +1,274 @@ +"""IPAM Docker integration tests: real PostgreSQL via testcontainers. + +These tests require Docker and are marked with @pytest.mark.integration. +Run with: uv run --package cmdb-ipam pytest services/ipam/tests/ -m integration +""" + +from __future__ import annotations + +import pytest +from ipam.application.command_handlers import ( + CreatePrefixHandler, + DeletePrefixHandler, + UpdatePrefixHandler, +) +from ipam.application.commands import ( + CreatePrefixCommand, + DeletePrefixCommand, + UpdatePrefixCommand, +) +from ipam.application.queries import ListPrefixesQuery +from ipam.application.query_handlers import GetPrefixHandler, ListPrefixesHandler +from ipam.domain.events import PrefixCreated, PrefixDeleted, PrefixUpdated +from ipam.infrastructure.models import IPAMBase +from ipam.infrastructure.read_model_repository import PostgresPrefixReadModelRepository +from shared.event.models import EventStoreBase +from shared.event.pg_store import PostgresEventStore +from sqlalchemy import text +from sqlalchemy.ext.asyncio import AsyncSession, async_sessionmaker, create_async_engine +from testcontainers.postgres import PostgresContainer + +from .conftest import FakeKafkaProducer + +# --------------------------------------------------------------------------- +# Fixtures +# --------------------------------------------------------------------------- + + +@pytest.fixture(scope="session") +def postgres_container(): + with PostgresContainer("postgres:16") as pg: + yield pg + + +@pytest.fixture(scope="session") +async def engine(postgres_container): + url = postgres_container.get_connection_url().replace("psycopg2", "asyncpg") + eng = create_async_engine(url) + async with eng.begin() as conn: + await conn.run_sync(IPAMBase.metadata.create_all) + await conn.run_sync(EventStoreBase.metadata.create_all) + yield eng + await eng.dispose() + + +@pytest.fixture +async def session(engine): + factory = async_sessionmaker(engine, class_=AsyncSession, expire_on_commit=False) + async with factory() as session: + yield session + # TRUNCATE all tables after each test + for table in reversed(IPAMBase.metadata.sorted_tables): + await session.execute(text(f'TRUNCATE TABLE "{table.name}" CASCADE')) + for table in reversed(EventStoreBase.metadata.sorted_tables): + await session.execute(text(f'TRUNCATE TABLE "{table.name}" CASCADE')) + await session.commit() + + +@pytest.fixture +def session_factory(session): + """Return an async context manager that yields the existing session.""" + + from contextlib import asynccontextmanager + + @asynccontextmanager + async def _factory(): + yield session + + return _factory + + +@pytest.fixture +def event_store(session_factory): + store = PostgresEventStore(session_factory) + store.register_event_type(PrefixCreated) + store.register_event_type(PrefixUpdated) + store.register_event_type(PrefixDeleted) + return store + + +@pytest.fixture +def prefix_read_repo(session): + return PostgresPrefixReadModelRepository(session) + + +@pytest.fixture +def kafka_producer(): + return FakeKafkaProducer() + + +@pytest.fixture +def 
create_handler(event_store, prefix_read_repo, kafka_producer): + return CreatePrefixHandler(event_store, prefix_read_repo, kafka_producer) + + +@pytest.fixture +def update_handler(event_store, prefix_read_repo, kafka_producer): + return UpdatePrefixHandler(event_store, prefix_read_repo, kafka_producer) + + +@pytest.fixture +def delete_handler(event_store, prefix_read_repo, kafka_producer): + return DeletePrefixHandler(event_store, prefix_read_repo, kafka_producer) + + +@pytest.fixture +def get_handler(prefix_read_repo): + return GetPrefixHandler(prefix_read_repo) + + +@pytest.fixture +def list_handler(prefix_read_repo): + return ListPrefixesHandler(prefix_read_repo) + + +# --------------------------------------------------------------------------- +# TestPrefixCRUDWithDB +# --------------------------------------------------------------------------- + + +@pytest.mark.integration +class TestPrefixCRUDWithDB: + """Full CRUD cycle with real PostgreSQL: create, read, update, delete.""" + + async def test_create_prefix_persists_event( + self, + create_handler: CreatePrefixHandler, + session: AsyncSession, + ) -> None: + prefix_id = await create_handler.handle( + CreatePrefixCommand(network="10.0.0.0/8", description="DB test"), + ) + + # Verify event persisted in domain_events table + result = await session.execute( + text("SELECT * FROM domain_events WHERE aggregate_id = :agg_id"), + {"agg_id": str(prefix_id)}, + ) + rows = result.fetchall() + assert len(rows) == 1 + assert rows[0].event_type.endswith("PrefixCreated") + + async def test_create_prefix_persists_read_model( + self, + create_handler: CreatePrefixHandler, + session: AsyncSession, + ) -> None: + prefix_id = await create_handler.handle( + CreatePrefixCommand(network="192.168.1.0/24", description="Read model test"), + ) + + # Verify read model persisted in prefixes_read table + result = await session.execute( + text("SELECT * FROM prefixes_read WHERE id = :id"), + {"id": str(prefix_id)}, + ) + row = result.fetchone() + assert row is not None + assert row.network == "192.168.1.0/24" + assert row.description == "Read model test" + + async def test_query_returns_created_prefix( + self, + create_handler: CreatePrefixHandler, + list_handler: ListPrefixesHandler, + ) -> None: + prefix_id = await create_handler.handle( + CreatePrefixCommand(network="172.16.0.0/12", description="Queryable"), + ) + + items, total = await list_handler.handle(ListPrefixesQuery()) + assert total == 1 + assert items[0].id == prefix_id + assert items[0].network == "172.16.0.0/12" + + async def test_update_persists_in_db( + self, + create_handler: CreatePrefixHandler, + update_handler: UpdatePrefixHandler, + get_handler: GetPrefixHandler, + session: AsyncSession, + ) -> None: + from ipam.application.queries import GetPrefixQuery + + prefix_id = await create_handler.handle( + CreatePrefixCommand(network="10.1.0.0/16", description="Original"), + ) + + await update_handler.handle( + UpdatePrefixCommand(prefix_id=prefix_id, description="Updated via DB"), + ) + + dto = await get_handler.handle(GetPrefixQuery(prefix_id=prefix_id)) + assert dto.description == "Updated via DB" + + # Verify two events in store + result = await session.execute( + text("SELECT COUNT(*) FROM domain_events WHERE aggregate_id = :agg_id"), + {"agg_id": str(prefix_id)}, + ) + assert result.scalar_one() == 2 + + async def test_delete_marks_as_deleted( + self, + create_handler: CreatePrefixHandler, + delete_handler: DeletePrefixHandler, + list_handler: ListPrefixesHandler, + session: AsyncSession, + ) 
-> None: + prefix_id = await create_handler.handle( + CreatePrefixCommand(network="10.2.0.0/16"), + ) + + await delete_handler.handle(DeletePrefixCommand(prefix_id=prefix_id)) + + # List should return nothing (deleted prefix excluded) + items, total = await list_handler.handle(ListPrefixesQuery()) + assert total == 0 + + # But the read model row still exists with is_deleted=True + result = await session.execute( + text("SELECT is_deleted FROM prefixes_read WHERE id = :id"), + {"id": str(prefix_id)}, + ) + row = result.fetchone() + assert row is not None + assert row.is_deleted is True + + +# --------------------------------------------------------------------------- +# TestFilteringWithDB +# --------------------------------------------------------------------------- + + +@pytest.mark.integration +class TestFilteringWithDB: + """Filtering with real PostgreSQL: status filter + description ILIKE.""" + + async def test_filter_by_status( + self, + create_handler: CreatePrefixHandler, + list_handler: ListPrefixesHandler, + ) -> None: + await create_handler.handle(CreatePrefixCommand(network="10.0.0.0/8", status="active")) + await create_handler.handle(CreatePrefixCommand(network="172.16.0.0/12", status="reserved")) + await create_handler.handle(CreatePrefixCommand(network="192.168.0.0/16", status="active")) + + items, total = await list_handler.handle(ListPrefixesQuery(status="active")) + assert total == 2 + assert all(item.status == "active" for item in items) + + async def test_filter_by_description_contains( + self, + create_handler: CreatePrefixHandler, + list_handler: ListPrefixesHandler, + ) -> None: + await create_handler.handle(CreatePrefixCommand(network="10.0.0.0/8", description="Production network")) + await create_handler.handle(CreatePrefixCommand(network="172.16.0.0/12", description="Development lab")) + await create_handler.handle(CreatePrefixCommand(network="192.168.0.0/16", description="Production servers")) + + items, total = await list_handler.handle( + ListPrefixesQuery(description_contains="production"), + ) + assert total == 2 + assert all("roduction" in item.description for item in items) diff --git a/services/ipam/tests/test_integration/test_ipam_e2e.py b/services/ipam/tests/test_integration/test_ipam_e2e.py new file mode 100644 index 0000000..cf917cd --- /dev/null +++ b/services/ipam/tests/test_integration/test_ipam_e2e.py @@ -0,0 +1,315 @@ +"""IPAM E2E mock tests: full command → event → read model → query flow. + +These tests use InMemoryEventStore + InMemoryPrefixReadModelRepository + FakeKafkaProducer. +No Docker required — they run as regular (non-integration) tests. 
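+Handler, event-store, read-model, and Kafka-producer fixtures come from .conftest.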
+""" + +from __future__ import annotations + +from uuid import UUID + +from ipam.application.command_handlers import ( + BulkCreatePrefixesHandler, + CreatePrefixHandler, + DeletePrefixHandler, + UpdatePrefixHandler, +) +from ipam.application.commands import ( + BulkCreatePrefixesCommand, + CreatePrefixCommand, + DeletePrefixCommand, + UpdatePrefixCommand, +) +from ipam.application.queries import GetPrefixQuery, ListPrefixesQuery +from ipam.application.query_handlers import GetPrefixHandler, ListPrefixesHandler +from ipam.domain.events import PrefixCreated, PrefixDeleted, PrefixUpdated + +from .conftest import FakeKafkaProducer, InMemoryEventStore, InMemoryPrefixReadModelRepository + + +class TestCreatePrefixE2E: + """Create Prefix → verify event stored → verify read model → query returns it.""" + + async def test_create_prefix_stores_event( + self, + create_prefix_handler: CreatePrefixHandler, + event_store: InMemoryEventStore, + ) -> None: + command = CreatePrefixCommand(network="10.0.0.0/8", description="Test prefix") + prefix_id = await create_prefix_handler.handle(command) + + events = event_store.get_events(prefix_id) + assert len(events) == 1 + assert isinstance(events[0], PrefixCreated) + assert events[0].network == "10.0.0.0/8" + assert events[0].aggregate_id == prefix_id + + async def test_create_prefix_populates_read_model( + self, + create_prefix_handler: CreatePrefixHandler, + prefix_read_repo: InMemoryPrefixReadModelRepository, + ) -> None: + command = CreatePrefixCommand(network="192.168.1.0/24", description="LAN") + prefix_id = await create_prefix_handler.handle(command) + + data = await prefix_read_repo.find_by_id(prefix_id) + assert data is not None + assert data["network"] == "192.168.1.0/24" + assert data["description"] == "LAN" + assert data["status"] == "active" + + async def test_create_prefix_publishes_kafka_event( + self, + create_prefix_handler: CreatePrefixHandler, + kafka_producer: FakeKafkaProducer, + ) -> None: + command = CreatePrefixCommand(network="172.16.0.0/12") + await create_prefix_handler.handle(command) + + kafka_events = kafka_producer.get_events("ipam.events") + assert len(kafka_events) == 1 + assert isinstance(kafka_events[0], PrefixCreated) + + async def test_create_prefix_queryable( + self, + create_prefix_handler: CreatePrefixHandler, + get_prefix_handler: GetPrefixHandler, + list_prefixes_handler: ListPrefixesHandler, + ) -> None: + command = CreatePrefixCommand(network="10.1.0.0/16", description="Queryable") + prefix_id = await create_prefix_handler.handle(command) + + # GetPrefixQuery + dto = await get_prefix_handler.handle(GetPrefixQuery(prefix_id=prefix_id)) + assert dto.network == "10.1.0.0/16" + assert dto.description == "Queryable" + + # ListPrefixesQuery + items, total = await list_prefixes_handler.handle(ListPrefixesQuery()) + assert total == 1 + assert items[0].id == prefix_id + + async def test_create_prefix_with_all_fields( + self, + create_prefix_handler: CreatePrefixHandler, + get_prefix_handler: GetPrefixHandler, + ) -> None: + command = CreatePrefixCommand( + network="10.2.0.0/16", + status="reserved", + role="infrastructure", + description="Full fields", + custom_fields={"site": "dc1"}, + ) + prefix_id = await create_prefix_handler.handle(command) + + dto = await get_prefix_handler.handle(GetPrefixQuery(prefix_id=prefix_id)) + assert dto.status == "reserved" + assert dto.role == "infrastructure" + assert dto.custom_fields == {"site": "dc1"} + + +class TestUpdatePrefixE2E: + """Update Prefix → verify update event → verify read 
model updated.""" + + async def test_update_prefix_stores_event( + self, + create_prefix_handler: CreatePrefixHandler, + update_prefix_handler: UpdatePrefixHandler, + event_store: InMemoryEventStore, + ) -> None: + prefix_id = await create_prefix_handler.handle( + CreatePrefixCommand(network="10.0.0.0/8"), + ) + + await update_prefix_handler.handle( + UpdatePrefixCommand(prefix_id=prefix_id, description="Updated"), + ) + + events = event_store.get_events(prefix_id) + assert len(events) == 2 + assert isinstance(events[0], PrefixCreated) + assert isinstance(events[1], PrefixUpdated) + assert events[1].description == "Updated" + + async def test_update_prefix_updates_read_model( + self, + create_prefix_handler: CreatePrefixHandler, + update_prefix_handler: UpdatePrefixHandler, + get_prefix_handler: GetPrefixHandler, + ) -> None: + prefix_id = await create_prefix_handler.handle( + CreatePrefixCommand(network="10.0.0.0/8", description="Original"), + ) + + await update_prefix_handler.handle( + UpdatePrefixCommand(prefix_id=prefix_id, description="Updated", role="infra"), + ) + + dto = await get_prefix_handler.handle(GetPrefixQuery(prefix_id=prefix_id)) + assert dto.description == "Updated" + assert dto.role == "infra" + assert dto.network == "10.0.0.0/8" # Unchanged + + async def test_update_prefix_publishes_kafka_event( + self, + create_prefix_handler: CreatePrefixHandler, + update_prefix_handler: UpdatePrefixHandler, + kafka_producer: FakeKafkaProducer, + ) -> None: + prefix_id = await create_prefix_handler.handle( + CreatePrefixCommand(network="10.0.0.0/8"), + ) + + await update_prefix_handler.handle( + UpdatePrefixCommand(prefix_id=prefix_id, description="Updated"), + ) + + kafka_events = kafka_producer.get_events("ipam.events") + assert len(kafka_events) == 2 + assert isinstance(kafka_events[1], PrefixUpdated) + + +class TestDeletePrefixE2E: + """Delete Prefix → verify delete event → verify read model marked deleted.""" + + async def test_delete_prefix_stores_event( + self, + create_prefix_handler: CreatePrefixHandler, + delete_prefix_handler: DeletePrefixHandler, + event_store: InMemoryEventStore, + ) -> None: + prefix_id = await create_prefix_handler.handle( + CreatePrefixCommand(network="10.0.0.0/8"), + ) + + await delete_prefix_handler.handle(DeletePrefixCommand(prefix_id=prefix_id)) + + events = event_store.get_events(prefix_id) + assert len(events) == 2 + assert isinstance(events[1], PrefixDeleted) + + async def test_delete_prefix_marks_read_model_deleted( + self, + create_prefix_handler: CreatePrefixHandler, + delete_prefix_handler: DeletePrefixHandler, + prefix_read_repo: InMemoryPrefixReadModelRepository, + ) -> None: + prefix_id = await create_prefix_handler.handle( + CreatePrefixCommand(network="10.0.0.0/8"), + ) + + # Exists before delete + assert await prefix_read_repo.find_by_id(prefix_id) is not None + + await delete_prefix_handler.handle(DeletePrefixCommand(prefix_id=prefix_id)) + + # Gone after delete + assert await prefix_read_repo.find_by_id(prefix_id) is None + + async def test_delete_prefix_excluded_from_list( + self, + create_prefix_handler: CreatePrefixHandler, + delete_prefix_handler: DeletePrefixHandler, + list_prefixes_handler: ListPrefixesHandler, + ) -> None: + prefix_id = await create_prefix_handler.handle( + CreatePrefixCommand(network="10.0.0.0/8"), + ) + await create_prefix_handler.handle( + CreatePrefixCommand(network="192.168.0.0/16"), + ) + + await delete_prefix_handler.handle(DeletePrefixCommand(prefix_id=prefix_id)) + + items, total = await 
list_prefixes_handler.handle(ListPrefixesQuery()) + assert total == 1 + assert items[0].network == "192.168.0.0/16" + + async def test_delete_prefix_publishes_kafka_event( + self, + create_prefix_handler: CreatePrefixHandler, + delete_prefix_handler: DeletePrefixHandler, + kafka_producer: FakeKafkaProducer, + ) -> None: + prefix_id = await create_prefix_handler.handle( + CreatePrefixCommand(network="10.0.0.0/8"), + ) + + await delete_prefix_handler.handle(DeletePrefixCommand(prefix_id=prefix_id)) + + kafka_events = kafka_producer.get_events("ipam.events") + assert len(kafka_events) == 2 + assert isinstance(kafka_events[1], PrefixDeleted) + + +class TestBulkCreatePrefixesE2E: + """BulkCreate multiple prefixes → verify all created.""" + + async def test_bulk_create_stores_all_events( + self, + bulk_create_prefix_handler: BulkCreatePrefixesHandler, + event_store: InMemoryEventStore, + ) -> None: + command = BulkCreatePrefixesCommand( + items=[ + CreatePrefixCommand(network="10.0.0.0/8", description="Net A"), + CreatePrefixCommand(network="172.16.0.0/12", description="Net B"), + CreatePrefixCommand(network="192.168.0.0/16", description="Net C"), + ] + ) + ids = await bulk_create_prefix_handler.handle(command) + + assert len(ids) == 3 + all_events = event_store.all_events() + assert len(all_events) == 3 + assert all(isinstance(e, PrefixCreated) for e in all_events) + + async def test_bulk_create_populates_read_model( + self, + bulk_create_prefix_handler: BulkCreatePrefixesHandler, + list_prefixes_handler: ListPrefixesHandler, + ) -> None: + command = BulkCreatePrefixesCommand( + items=[ + CreatePrefixCommand(network="10.0.0.0/8"), + CreatePrefixCommand(network="172.16.0.0/12"), + ] + ) + await bulk_create_prefix_handler.handle(command) + + items, total = await list_prefixes_handler.handle(ListPrefixesQuery()) + assert total == 2 + + async def test_bulk_create_publishes_all_kafka_events( + self, + bulk_create_prefix_handler: BulkCreatePrefixesHandler, + kafka_producer: FakeKafkaProducer, + ) -> None: + command = BulkCreatePrefixesCommand( + items=[ + CreatePrefixCommand(network="10.0.0.0/8"), + CreatePrefixCommand(network="172.16.0.0/12"), + CreatePrefixCommand(network="192.168.0.0/16"), + ] + ) + await bulk_create_prefix_handler.handle(command) + + kafka_events = kafka_producer.get_events("ipam.events") + assert len(kafka_events) == 3 + + async def test_bulk_create_returns_unique_ids( + self, + bulk_create_prefix_handler: BulkCreatePrefixesHandler, + ) -> None: + command = BulkCreatePrefixesCommand( + items=[ + CreatePrefixCommand(network="10.0.0.0/8"), + CreatePrefixCommand(network="172.16.0.0/12"), + ] + ) + ids = await bulk_create_prefix_handler.handle(command) + + assert len(ids) == 2 + assert len(set(ids)) == 2 # All unique + assert all(isinstance(id_, UUID) for id_ in ids) diff --git a/services/ipam/tests/test_integration/test_kafka_flow.py b/services/ipam/tests/test_integration/test_kafka_flow.py new file mode 100644 index 0000000..523db0e --- /dev/null +++ b/services/ipam/tests/test_integration/test_kafka_flow.py @@ -0,0 +1,149 @@ +"""Kafka integration tests: real Kafka via testcontainers. + +Verifies KafkaEventProducer.publish() → KafkaEventConsumer receives → event data matches. +Marked with @pytest.mark.integration — requires Docker. 
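+Run with: uv run --package cmdb-ipam pytest services/ipam/tests/ -m integration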
+""" + +from __future__ import annotations + +import asyncio +from uuid import uuid4 + +import pytest +from ipam.domain.events import PrefixCreated +from shared.messaging.consumer import KafkaEventConsumer +from shared.messaging.producer import KafkaEventProducer +from shared.messaging.serialization import EventSerializer +from testcontainers.kafka import KafkaContainer + +# --------------------------------------------------------------------------- +# Fixtures +# --------------------------------------------------------------------------- + +TOPIC = "test.ipam.events" + + +@pytest.fixture(scope="session") +def kafka_container(): + with KafkaContainer("confluentinc/cp-kafka:7.6.0") as kafka: + yield kafka + + +@pytest.fixture(scope="session") +def bootstrap_servers(kafka_container): + return kafka_container.get_bootstrap_server() + + +@pytest.fixture +def serializer(): + s = EventSerializer() + s.register(PrefixCreated) + return s + + +@pytest.fixture +async def producer(bootstrap_servers): + p = KafkaEventProducer(bootstrap_servers=bootstrap_servers) + await p.start() + yield p + await p.stop() + + +# --------------------------------------------------------------------------- +# TestKafkaFlow +# --------------------------------------------------------------------------- + + +@pytest.mark.integration +class TestKafkaFlow: + """Publish event via KafkaEventProducer → consume via KafkaEventConsumer → verify.""" + + async def test_publish_and_consume_event( + self, + bootstrap_servers: str, + producer: KafkaEventProducer, + serializer: EventSerializer, + ) -> None: + # Create a test event + agg_id = uuid4() + event = PrefixCreated( + aggregate_id=agg_id, + version=1, + network="10.0.0.0/8", + description="Kafka test", + status="active", + ) + + # Publish event + await producer.publish(TOPIC, event) + + # Set up consumer + received_events: list = [] + + async def handler(evt): + received_events.append(evt) + + consumer = KafkaEventConsumer( + bootstrap_servers=bootstrap_servers, + group_id=f"test-group-{uuid4()}", + topics=[TOPIC], + serializer=serializer, + ) + consumer.subscribe(PrefixCreated, handler) + await consumer.start() + + # Consume with timeout + try: + await asyncio.wait_for(consumer.consume(), timeout=10.0) + except TimeoutError: + pass + finally: + await consumer.stop() + + # Verify received + assert len(received_events) >= 1 + received = received_events[0] + assert isinstance(received, PrefixCreated) + assert received.aggregate_id == agg_id + assert received.network == "10.0.0.0/8" + assert received.description == "Kafka test" + + async def test_publish_many_events( + self, + bootstrap_servers: str, + producer: KafkaEventProducer, + serializer: EventSerializer, + ) -> None: + events = [ + PrefixCreated(aggregate_id=uuid4(), version=1, network="10.0.0.0/8"), + PrefixCreated(aggregate_id=uuid4(), version=1, network="172.16.0.0/12"), + PrefixCreated(aggregate_id=uuid4(), version=1, network="192.168.0.0/16"), + ] + + topic = f"test.bulk.{uuid4()}" + await producer.publish_many(topic, events) + + received_events: list = [] + + async def handler(evt): + received_events.append(evt) + + consumer = KafkaEventConsumer( + bootstrap_servers=bootstrap_servers, + group_id=f"test-group-{uuid4()}", + topics=[topic], + serializer=serializer, + ) + consumer.subscribe(PrefixCreated, handler) + await consumer.start() + + try: + await asyncio.wait_for(consumer.consume(), timeout=10.0) + except TimeoutError: + pass + finally: + await consumer.stop() + + assert len(received_events) == 3 + networks = 
{e.network for e in received_events} + assert networks == {"10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16"} diff --git a/services/ipam/tests/test_integration/test_tenant_isolation.py b/services/ipam/tests/test_integration/test_tenant_isolation.py new file mode 100644 index 0000000..b34d9cb --- /dev/null +++ b/services/ipam/tests/test_integration/test_tenant_isolation.py @@ -0,0 +1,172 @@ +"""Tenant isolation integration tests: real PostgreSQL via testcontainers. + +Verifies that queries filtered by tenant_id return only that tenant's data. +Marked with @pytest.mark.integration — requires Docker. +""" + +from __future__ import annotations + +from uuid import uuid4 + +import pytest +from ipam.application.command_handlers import CreatePrefixHandler +from ipam.application.commands import CreatePrefixCommand +from ipam.application.queries import ListPrefixesQuery +from ipam.application.query_handlers import ListPrefixesHandler +from ipam.domain.events import PrefixCreated +from ipam.infrastructure.models import IPAMBase +from ipam.infrastructure.read_model_repository import PostgresPrefixReadModelRepository +from shared.event.models import EventStoreBase +from shared.event.pg_store import PostgresEventStore +from sqlalchemy import text +from sqlalchemy.ext.asyncio import AsyncSession, async_sessionmaker, create_async_engine +from testcontainers.postgres import PostgresContainer + +from .conftest import FakeKafkaProducer + +TENANT_A = uuid4() +TENANT_B = uuid4() + + +# --------------------------------------------------------------------------- +# Fixtures +# --------------------------------------------------------------------------- + + +@pytest.fixture(scope="session") +def postgres_container(): + with PostgresContainer("postgres:16") as pg: + yield pg + + +@pytest.fixture(scope="session") +async def engine(postgres_container): + url = postgres_container.get_connection_url().replace("psycopg2", "asyncpg") + eng = create_async_engine(url) + async with eng.begin() as conn: + await conn.run_sync(IPAMBase.metadata.create_all) + await conn.run_sync(EventStoreBase.metadata.create_all) + yield eng + await eng.dispose() + + +@pytest.fixture +async def session(engine): + factory = async_sessionmaker(engine, class_=AsyncSession, expire_on_commit=False) + async with factory() as session: + yield session + for table in reversed(IPAMBase.metadata.sorted_tables): + await session.execute(text(f'TRUNCATE TABLE "{table.name}" CASCADE')) + for table in reversed(EventStoreBase.metadata.sorted_tables): + await session.execute(text(f'TRUNCATE TABLE "{table.name}" CASCADE')) + await session.commit() + + +@pytest.fixture +def session_factory(session): + from contextlib import asynccontextmanager + + @asynccontextmanager + async def _factory(): + yield session + + return _factory + + +@pytest.fixture +def event_store(session_factory): + store = PostgresEventStore(session_factory) + store.register_event_type(PrefixCreated) + return store + + +@pytest.fixture +def prefix_read_repo(session): + return PostgresPrefixReadModelRepository(session) + + +@pytest.fixture +def kafka_producer(): + return FakeKafkaProducer() + + +@pytest.fixture +def create_handler(event_store, prefix_read_repo, kafka_producer): + return CreatePrefixHandler(event_store, prefix_read_repo, kafka_producer) + + +@pytest.fixture +def list_handler(prefix_read_repo): + return ListPrefixesHandler(prefix_read_repo) + + +# --------------------------------------------------------------------------- +# TestTenantIsolation +# 
--------------------------------------------------------------------------- + + +@pytest.mark.integration +class TestTenantIsolation: + """Verify that listing prefixes with a tenant_id filter returns only that tenant's data.""" + + async def test_tenant_a_sees_only_own_prefixes( + self, + create_handler: CreatePrefixHandler, + list_handler: ListPrefixesHandler, + ) -> None: + # Create prefixes for tenant A + await create_handler.handle( + CreatePrefixCommand(network="10.0.0.0/8", tenant_id=TENANT_A, description="Tenant A net"), + ) + await create_handler.handle( + CreatePrefixCommand(network="10.1.0.0/16", tenant_id=TENANT_A, description="Tenant A subnet"), + ) + + # Create prefix for tenant B + await create_handler.handle( + CreatePrefixCommand(network="172.16.0.0/12", tenant_id=TENANT_B, description="Tenant B net"), + ) + + # Query with tenant A filter + items, total = await list_handler.handle(ListPrefixesQuery(tenant_id=TENANT_A)) + assert total == 2 + assert all(item.tenant_id == TENANT_A for item in items) + + async def test_tenant_b_sees_only_own_prefixes( + self, + create_handler: CreatePrefixHandler, + list_handler: ListPrefixesHandler, + ) -> None: + # Create prefixes for tenant A + await create_handler.handle( + CreatePrefixCommand(network="10.0.0.0/8", tenant_id=TENANT_A), + ) + + # Create prefixes for tenant B + await create_handler.handle( + CreatePrefixCommand(network="172.16.0.0/12", tenant_id=TENANT_B), + ) + await create_handler.handle( + CreatePrefixCommand(network="192.168.0.0/16", tenant_id=TENANT_B), + ) + + # Query with tenant B filter + items, total = await list_handler.handle(ListPrefixesQuery(tenant_id=TENANT_B)) + assert total == 2 + assert all(item.tenant_id == TENANT_B for item in items) + + async def test_no_filter_returns_all_tenants( + self, + create_handler: CreatePrefixHandler, + list_handler: ListPrefixesHandler, + ) -> None: + await create_handler.handle( + CreatePrefixCommand(network="10.0.0.0/8", tenant_id=TENANT_A), + ) + await create_handler.handle( + CreatePrefixCommand(network="172.16.0.0/12", tenant_id=TENANT_B), + ) + + # Query without tenant filter + items, total = await list_handler.handle(ListPrefixesQuery()) + assert total == 2 diff --git a/services/tenant/Dockerfile b/services/tenant/Dockerfile new file mode 100644 index 0000000..025ad55 --- /dev/null +++ b/services/tenant/Dockerfile @@ -0,0 +1,5 @@ +FROM cmdb-base:prod +COPY services/tenant/ ./services/tenant/ +RUN uv sync --frozen --no-dev --package cmdb-tenant +EXPOSE 8000 +CMD ["uv", "run", "--package", "cmdb-tenant", "uvicorn", "tenant.interface.main:app", "--host", "0.0.0.0", "--port", "8000"] diff --git a/services/tenant/Dockerfile.dev b/services/tenant/Dockerfile.dev new file mode 100644 index 0000000..6e488d3 --- /dev/null +++ b/services/tenant/Dockerfile.dev @@ -0,0 +1,9 @@ +FROM cmdb-base:dev +COPY services/tenant/pyproject.toml ./services/tenant/pyproject.toml +COPY services/tenant/alembic.ini ./services/tenant/alembic.ini +COPY services/tenant/alembic ./services/tenant/alembic +COPY services/tenant/alembic_tenant_db ./services/tenant/alembic_tenant_db +COPY services/tenant/src ./services/tenant/src +RUN uv sync --frozen --package cmdb-tenant +EXPOSE 8000 +CMD ["uv", "run", "--package", "cmdb-tenant", "uvicorn", "tenant.interface.main:app", "--host", "0.0.0.0", "--port", "8000", "--reload"] diff --git a/services/tenant/alembic.ini b/services/tenant/alembic.ini new file mode 100644 index 0000000..cb070d4 --- /dev/null +++ b/services/tenant/alembic.ini @@ -0,0 +1,35 @@ +[alembic] 
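+# Note: alembic/env.py overrides sqlalchemy.url below with the DATABASE_URL
+# environment variable when it is set.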
+script_location = alembic +sqlalchemy.url = postgresql+asyncpg://cmdb:cmdb@postgres:5432/cmdb_tenant + +[loggers] +keys = root,sqlalchemy,alembic + +[handlers] +keys = console + +[formatters] +keys = generic + +[logger_root] +level = WARN +handlers = console + +[logger_sqlalchemy] +level = WARN +handlers = +qualname = sqlalchemy.engine + +[logger_alembic] +level = INFO +handlers = +qualname = alembic + +[handler_console] +class = StreamHandler +args = (sys.stderr,) +level = NOTSET +formatter = generic + +[formatter_generic] +format = %(levelname)-5.5s [%(name)s] %(message)s diff --git a/services/tenant/alembic/env.py b/services/tenant/alembic/env.py new file mode 100644 index 0000000..aa91c42 --- /dev/null +++ b/services/tenant/alembic/env.py @@ -0,0 +1,61 @@ +import asyncio +import os +from logging.config import fileConfig + +from alembic import context +from sqlalchemy import pool +from sqlalchemy.ext.asyncio import async_engine_from_config +from tenant.infrastructure.models import TenantBase + +config = context.config + +# Override DB URL from environment variable if set +db_url = os.environ.get("DATABASE_URL") +if db_url: + config.set_main_option("sqlalchemy.url", db_url) + +if config.config_file_name is not None: + fileConfig(config.config_file_name) + +target_metadata = TenantBase.metadata + + +def run_migrations_offline() -> None: + url = config.get_main_option("sqlalchemy.url") + context.configure( + url=url, + target_metadata=target_metadata, + literal_binds=True, + ) + with context.begin_transaction(): + context.run_migrations() + + +def do_run_migrations(connection) -> None: + context.configure( + connection=connection, + target_metadata=target_metadata, + ) + with context.begin_transaction(): + context.run_migrations() + + +async def run_async_migrations() -> None: + connectable = async_engine_from_config( + config.get_section(config.config_ini_section, {}), + prefix="sqlalchemy.", + poolclass=pool.NullPool, + ) + async with connectable.connect() as connection: + await connection.run_sync(do_run_migrations) + await connectable.dispose() + + +def run_migrations_online() -> None: + asyncio.run(run_async_migrations()) + + +if context.is_offline_mode(): + run_migrations_offline() +else: + run_migrations_online() diff --git a/services/tenant/alembic/versions/001_create_tenants_table.py b/services/tenant/alembic/versions/001_create_tenants_table.py new file mode 100644 index 0000000..01de3d5 --- /dev/null +++ b/services/tenant/alembic/versions/001_create_tenants_table.py @@ -0,0 +1,40 @@ +"""create tenants table + +Revision ID: 001 +Create Date: 2026-03-19 +""" + +import sqlalchemy as sa +from alembic import op +from sqlalchemy.dialects.postgresql import JSONB + +revision = "001" +down_revision = None +branch_labels = None +depends_on = None + + +def upgrade() -> None: + op.create_table( + "tenants", + sa.Column("id", sa.Uuid(), primary_key=True), + sa.Column("name", sa.String(255), nullable=False), + sa.Column("slug", sa.String(255), nullable=False, unique=True, index=True), + sa.Column("status", sa.String(20), nullable=False, server_default="active"), + sa.Column("settings", JSONB, nullable=False, server_default="{}"), + sa.Column("db_config", JSONB, nullable=True), + sa.Column( + "created_at", + sa.DateTime(timezone=True), + server_default=sa.func.now(), + ), + sa.Column( + "updated_at", + sa.DateTime(timezone=True), + server_default=sa.func.now(), + ), + ) + + +def downgrade() -> None: + op.drop_table("tenants") diff --git a/services/tenant/alembic_tenant_db/env.py 
b/services/tenant/alembic_tenant_db/env.py new file mode 100644 index 0000000..5b5b0cf --- /dev/null +++ b/services/tenant/alembic_tenant_db/env.py @@ -0,0 +1,43 @@ +from logging.config import fileConfig + +from alembic import context +from sqlalchemy import engine_from_config, pool + +config = context.config + +if config.config_file_name is not None: + fileConfig(config.config_file_name) + +target_metadata = None + + +def run_migrations_offline() -> None: + url = config.get_main_option("sqlalchemy.url") + context.configure( + url=url, + target_metadata=target_metadata, + literal_binds=True, + ) + with context.begin_transaction(): + context.run_migrations() + + +def run_migrations_online() -> None: + connectable = engine_from_config( + config.get_section(config.config_ini_section, {}), + prefix="sqlalchemy.", + poolclass=pool.NullPool, + ) + with connectable.connect() as connection: + context.configure( + connection=connection, + target_metadata=target_metadata, + ) + with context.begin_transaction(): + context.run_migrations() + + +if context.is_offline_mode(): + run_migrations_offline() +else: + run_migrations_online() diff --git a/services/tenant/alembic_tenant_db/script.py.mako b/services/tenant/alembic_tenant_db/script.py.mako new file mode 100644 index 0000000..590f5b3 --- /dev/null +++ b/services/tenant/alembic_tenant_db/script.py.mako @@ -0,0 +1,24 @@ +"""${message} + +Revision ID: ${up_revision} +Revises: ${down_revision | comma,n} +Create Date: ${create_date} +""" +from typing import Sequence, Union + +from alembic import op +import sqlalchemy as sa +${imports if imports else ""} + +revision: str = ${repr(up_revision)} +down_revision: Union[str, None] = ${repr(down_revision)} +branch_labels: Union[str, Sequence[str], None] = ${repr(branch_labels)} +depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)} + + +def upgrade() -> None: + ${upgrades if upgrades else "pass"} + + +def downgrade() -> None: + ${downgrades if downgrades else "pass"} diff --git a/services/tenant/alembic_tenant_db/versions/.gitkeep b/services/tenant/alembic_tenant_db/versions/.gitkeep new file mode 100644 index 0000000..e69de29 diff --git a/services/tenant/docker-compose.dev.yml b/services/tenant/docker-compose.dev.yml new file mode 100644 index 0000000..d2cee0b --- /dev/null +++ b/services/tenant/docker-compose.dev.yml @@ -0,0 +1,26 @@ +services: + tenant: + build: + context: ../../ + dockerfile: services/tenant/Dockerfile.dev + volumes: + - ../../services/tenant/src:/app/services/tenant/src + ports: + - "8003:8000" + environment: + DATABASE_URL: postgresql+asyncpg://${POSTGRES_USER:-cmdb}:${POSTGRES_PASSWORD:-cmdb}@postgres:5432/cmdb_tenant + KAFKA_BOOTSTRAP_SERVERS: kafka:9092 + REDIS_URL: redis://redis:6379 + depends_on: + postgres: + condition: service_healthy + kafka: + condition: service_healthy + redis: + condition: service_healthy + networks: + - cmdb-network + +networks: + cmdb-network: + external: true diff --git a/services/tenant/docker-compose.prod.yml b/services/tenant/docker-compose.prod.yml new file mode 100644 index 0000000..20fd6dc --- /dev/null +++ b/services/tenant/docker-compose.prod.yml @@ -0,0 +1,18 @@ +services: + tenant: + build: + context: ../../ + dockerfile: services/tenant/Dockerfile + environment: + DATABASE_URL: postgresql://${POSTGRES_USER:-cmdb}:${POSTGRES_PASSWORD:-cmdb}@postgres:5432/cmdb_tenant + KAFKA_BOOTSTRAP_SERVERS: kafka:9092 + REDIS_URL: redis://redis:6379 + depends_on: + postgres: + condition: service_healthy + kafka: + condition: service_healthy + redis: 
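+        # gate startup on Redis health, same as docker-compose.dev.yml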
+ condition: service_healthy + networks: + - cmdb-network diff --git a/services/tenant/pyproject.toml b/services/tenant/pyproject.toml new file mode 100644 index 0000000..f4a50b0 --- /dev/null +++ b/services/tenant/pyproject.toml @@ -0,0 +1,27 @@ +[project] +name = "cmdb-tenant" +version = "0.1.0" +description = "CMDB tenant Service" +requires-python = ">=3.13" +dependencies = [ + "cmdb-shared", + "fastapi>=0.115", + "uvicorn", + "sqlalchemy[asyncio]>=2.0", + "asyncpg", + "alembic", + "pydantic-settings>=2.0", + "aiokafka", + "redis>=5.0", + "psycopg2-binary>=2.9", +] + +[build-system] +requires = ["hatchling"] +build-backend = "hatchling.build" + +[tool.hatch.build.targets.wheel] +packages = ["src/tenant"] + +[tool.uv.sources] +cmdb-shared = { workspace = true } diff --git a/services/tenant/src/tenant/__init__.py b/services/tenant/src/tenant/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/services/tenant/src/tenant/application/__init__.py b/services/tenant/src/tenant/application/__init__.py new file mode 100644 index 0000000..8b13789 --- /dev/null +++ b/services/tenant/src/tenant/application/__init__.py @@ -0,0 +1 @@ + diff --git a/services/tenant/src/tenant/application/command_handlers.py b/services/tenant/src/tenant/application/command_handlers.py new file mode 100644 index 0000000..585fffa --- /dev/null +++ b/services/tenant/src/tenant/application/command_handlers.py @@ -0,0 +1,102 @@ +from uuid import UUID + +from shared.cqrs.command import Command, CommandHandler +from shared.domain.exceptions import ConflictError, EntityNotFoundError +from shared.messaging.producer import KafkaEventProducer + +from tenant.domain.repository import TenantRepository +from tenant.domain.tenant import Tenant, TenantSettings +from tenant.infrastructure.db_provisioning import TenantDbProvisioner + + +class CreateTenantHandler(CommandHandler[UUID]): + def __init__( + self, + repository: TenantRepository, + provisioner: TenantDbProvisioner, + event_producer: KafkaEventProducer, + ) -> None: + self._repository = repository + self._provisioner = provisioner + self._event_producer = event_producer + + async def handle(self, command: Command) -> UUID: + existing = await self._repository.find_by_slug(command.slug) + if existing is not None: + raise ConflictError(f"Tenant with slug '{command.slug}' already exists") + + db_config = await self._provisioner.provision(command.slug) + + tenant = Tenant.create( + name=command.name, + slug=command.slug, + settings=TenantSettings( + custom_domain=command.custom_domain, + logo_url=command.logo_url, + theme=command.theme, + ), + db_config=db_config, + ) + + await self._repository.save(tenant) + + for event in tenant.collect_events(): + await self._event_producer.publish("tenant.events", event) + + return tenant.id + + +class SuspendTenantHandler(CommandHandler[None]): + def __init__( + self, + repository: TenantRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._repository = repository + self._event_producer = event_producer + + async def handle(self, command: Command) -> None: + tenant = await self._repository.find_by_id(command.tenant_id) + if tenant is None: + raise EntityNotFoundError(f"Tenant {command.tenant_id} not found") + tenant.suspend() + await self._repository.save(tenant) + for event in tenant.collect_events(): + await self._event_producer.publish("tenant.events", event) + + +class UpdateTenantSettingsHandler(CommandHandler[None]): + def __init__(self, repository: TenantRepository) -> None: + self._repository = repository 
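+
+    # Note: unlike the create/suspend/delete handlers, this one takes no event
+    # producer and publishes no domain event when settings change.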
+ + async def handle(self, command: Command) -> None: + tenant = await self._repository.find_by_id(command.tenant_id) + if tenant is None: + raise EntityNotFoundError(f"Tenant {command.tenant_id} not found") + tenant.update_settings( + TenantSettings( + custom_domain=command.custom_domain, + logo_url=command.logo_url, + theme=command.theme, + ) + ) + await self._repository.save(tenant) + + +class DeleteTenantHandler(CommandHandler[None]): + def __init__( + self, + repository: TenantRepository, + event_producer: KafkaEventProducer, + ) -> None: + self._repository = repository + self._event_producer = event_producer + + async def handle(self, command: Command) -> None: + tenant = await self._repository.find_by_id(command.tenant_id) + if tenant is None: + raise EntityNotFoundError(f"Tenant {command.tenant_id} not found") + tenant.mark_deleted() + await self._repository.save(tenant) + for event in tenant.collect_events(): + await self._event_producer.publish("tenant.events", event) diff --git a/services/tenant/src/tenant/application/commands.py b/services/tenant/src/tenant/application/commands.py new file mode 100644 index 0000000..9113bad --- /dev/null +++ b/services/tenant/src/tenant/application/commands.py @@ -0,0 +1,26 @@ +from uuid import UUID + +from shared.cqrs.command import Command + + +class CreateTenantCommand(Command): + name: str + slug: str + custom_domain: str | None = None + logo_url: str | None = None + theme: str | None = None + + +class SuspendTenantCommand(Command): + tenant_id: UUID + + +class UpdateTenantSettingsCommand(Command): + tenant_id: UUID + custom_domain: str | None = None + logo_url: str | None = None + theme: str | None = None + + +class DeleteTenantCommand(Command): + tenant_id: UUID diff --git a/services/tenant/src/tenant/application/dto.py b/services/tenant/src/tenant/application/dto.py new file mode 100644 index 0000000..ea55abf --- /dev/null +++ b/services/tenant/src/tenant/application/dto.py @@ -0,0 +1,17 @@ +from datetime import datetime +from uuid import UUID + +from pydantic import BaseModel + +from tenant.domain.tenant import TenantStatus + + +class TenantDTO(BaseModel): + id: UUID + name: str + slug: str + status: TenantStatus + settings: dict + db_name: str | None + created_at: datetime + updated_at: datetime diff --git a/services/tenant/src/tenant/application/queries.py b/services/tenant/src/tenant/application/queries.py new file mode 100644 index 0000000..0fc81e0 --- /dev/null +++ b/services/tenant/src/tenant/application/queries.py @@ -0,0 +1,12 @@ +from uuid import UUID + +from shared.cqrs.query import Query + + +class GetTenantQuery(Query): + tenant_id: UUID + + +class ListTenantsQuery(Query): + offset: int = 0 + limit: int = 50 diff --git a/services/tenant/src/tenant/application/query_handlers.py b/services/tenant/src/tenant/application/query_handlers.py new file mode 100644 index 0000000..1f5f656 --- /dev/null +++ b/services/tenant/src/tenant/application/query_handlers.py @@ -0,0 +1,50 @@ +from shared.cqrs.query import Query, QueryHandler +from shared.domain.exceptions import EntityNotFoundError + +from tenant.application.dto import TenantDTO +from tenant.domain.repository import TenantRepository + + +class GetTenantHandler(QueryHandler[TenantDTO]): + def __init__(self, repository: TenantRepository) -> None: + self._repository = repository + + async def handle(self, query: Query) -> TenantDTO: + tenant = await self._repository.find_by_id(query.tenant_id) + if tenant is None: + raise EntityNotFoundError(f"Tenant {query.tenant_id} not found") + 
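+        # Map the aggregate onto the transport DTO: settings become a plain
+        # dict, and db_config is exposed only as its db_name.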
return TenantDTO( + id=tenant.id, + name=tenant.name, + slug=tenant.slug, + status=tenant.status, + settings=tenant.settings.model_dump(), + db_name=tenant.db_config.db_name if tenant.db_config else None, + created_at=tenant.created_at, + updated_at=tenant.updated_at, + ) + + +class ListTenantsHandler(QueryHandler[tuple[list[TenantDTO], int]]): + def __init__(self, repository: TenantRepository) -> None: + self._repository = repository + + async def handle(self, query: Query) -> tuple[list[TenantDTO], int]: + tenants, total = await self._repository.find_all( + offset=query.offset, + limit=query.limit, + ) + items = [ + TenantDTO( + id=t.id, + name=t.name, + slug=t.slug, + status=t.status, + settings=t.settings.model_dump(), + db_name=t.db_config.db_name if t.db_config else None, + created_at=t.created_at, + updated_at=t.updated_at, + ) + for t in tenants + ] + return items, total diff --git a/services/tenant/src/tenant/domain/__init__.py b/services/tenant/src/tenant/domain/__init__.py new file mode 100644 index 0000000..cf0f0cc --- /dev/null +++ b/services/tenant/src/tenant/domain/__init__.py @@ -0,0 +1,19 @@ +from tenant.domain.events import TenantCreated, TenantDeleted, TenantSuspended +from tenant.domain.repository import TenantRepository +from tenant.domain.tenant import ( + Tenant, + TenantDbConfig, + TenantSettings, + TenantStatus, +) + +__all__ = [ + "Tenant", + "TenantCreated", + "TenantDbConfig", + "TenantDeleted", + "TenantRepository", + "TenantSettings", + "TenantStatus", + "TenantSuspended", +] diff --git a/services/tenant/src/tenant/domain/events.py b/services/tenant/src/tenant/domain/events.py new file mode 100644 index 0000000..b64c4a9 --- /dev/null +++ b/services/tenant/src/tenant/domain/events.py @@ -0,0 +1,14 @@ +from shared.event.domain_event import DomainEvent + + +class TenantCreated(DomainEvent): + tenant_name: str + slug: str + + +class TenantSuspended(DomainEvent): + pass + + +class TenantDeleted(DomainEvent): + pass diff --git a/services/tenant/src/tenant/domain/repository.py b/services/tenant/src/tenant/domain/repository.py new file mode 100644 index 0000000..9ad4dde --- /dev/null +++ b/services/tenant/src/tenant/domain/repository.py @@ -0,0 +1,18 @@ +from abc import abstractmethod + +from shared.domain.repository import Repository + +from tenant.domain.tenant import Tenant + + +class TenantRepository(Repository[Tenant]): + @abstractmethod + async def find_by_slug(self, slug: str) -> Tenant | None: ... + + @abstractmethod + async def find_all( + self, + *, + offset: int = 0, + limit: int = 50, + ) -> tuple[list[Tenant], int]: ... 
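+
+
+# Illustrative sketch only, not part of the service wiring: a minimal
+# in-memory TenantRepository of the kind a test suite might use. The class
+# name, placement, and lack of ordering in find_all are assumptions made for
+# illustration, not something this change introduces elsewhere.
+from uuid import UUID  # local import, needed only by the sketch below
+
+
+class InMemoryTenantRepository(TenantRepository):
+    """Dict-backed implementation of the five repository operations."""
+
+    def __init__(self) -> None:
+        self._tenants: dict[UUID, Tenant] = {}
+
+    async def find_by_id(self, entity_id: UUID) -> Tenant | None:
+        return self._tenants.get(entity_id)
+
+    async def find_by_slug(self, slug: str) -> Tenant | None:
+        return next((t for t in self._tenants.values() if t.slug == slug), None)
+
+    async def find_all(
+        self,
+        *,
+        offset: int = 0,
+        limit: int = 50,
+    ) -> tuple[list[Tenant], int]:
+        # Unordered, unlike the Postgres implementation, which sorts by
+        # created_at descending.
+        items = list(self._tenants.values())
+        return items[offset : offset + limit], len(items)
+
+    async def save(self, entity: Tenant) -> Tenant:
+        self._tenants[entity.id] = entity
+        return entity
+
+    async def delete(self, entity_id: UUID) -> None:
+        self._tenants.pop(entity_id, None)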
diff --git a/services/tenant/src/tenant/domain/tenant.py b/services/tenant/src/tenant/domain/tenant.py new file mode 100644 index 0000000..6c4986d --- /dev/null +++ b/services/tenant/src/tenant/domain/tenant.py @@ -0,0 +1,102 @@ +from datetime import datetime +from enum import StrEnum +from typing import Any + +from pydantic import Field +from shared.domain.entity import Entity +from shared.domain.exceptions import BusinessRuleViolationError +from shared.domain.value_object import ValueObject +from shared.event.domain_event import DomainEvent + +from tenant.domain.events import TenantCreated, TenantDeleted, TenantSuspended + + +class TenantStatus(StrEnum): + ACTIVE = "active" + SUSPENDED = "suspended" + DELETED = "deleted" + + +class TenantSettings(ValueObject): + custom_domain: str | None = None + logo_url: str | None = None + theme: str | None = None + + +class TenantDbConfig(ValueObject): + db_host: str + db_port: int = 5432 + db_name: str + db_user: str + db_password: str + + @property + def url(self) -> str: + return f"postgresql+asyncpg://{self.db_user}:{self.db_password}@{self.db_host}:{self.db_port}/{self.db_name}" + + @property + def sync_url(self) -> str: + return f"postgresql://{self.db_user}:{self.db_password}@{self.db_host}:{self.db_port}/{self.db_name}" + + +class Tenant(Entity): + name: str + slug: str + status: TenantStatus = TenantStatus.ACTIVE + settings: TenantSettings = Field(default_factory=TenantSettings) + db_config: TenantDbConfig | None = None + + def model_post_init(self, __context: Any) -> None: + object.__setattr__(self, "_pending_events", []) + + def collect_events(self) -> list[DomainEvent]: + events: list[DomainEvent] = list(self._pending_events) + self._pending_events.clear() + return events + + @classmethod + def create( + cls, + *, + name: str, + slug: str, + settings: TenantSettings | None = None, + db_config: TenantDbConfig | None = None, + ) -> "Tenant": + tenant = cls( + name=name, + slug=slug, + settings=settings or TenantSettings(), + db_config=db_config, + ) + tenant._pending_events.append( + TenantCreated( + aggregate_id=tenant.id, + version=1, + tenant_name=name, + slug=slug, + ) + ) + return tenant + + def suspend(self) -> None: + if self.status == TenantStatus.DELETED: + raise BusinessRuleViolationError("Cannot suspend a deleted tenant") + if self.status == TenantStatus.SUSPENDED: + raise BusinessRuleViolationError("Tenant is already suspended") + self.status = TenantStatus.SUSPENDED + self.updated_at = datetime.now() + self._pending_events.append(TenantSuspended(aggregate_id=self.id, version=1)) + + def mark_deleted(self) -> None: + if self.status == TenantStatus.DELETED: + raise BusinessRuleViolationError("Tenant is already deleted") + self.status = TenantStatus.DELETED + self.updated_at = datetime.now() + self._pending_events.append(TenantDeleted(aggregate_id=self.id, version=1)) + + def update_settings(self, settings: TenantSettings) -> None: + if self.status == TenantStatus.DELETED: + raise BusinessRuleViolationError("Cannot update a deleted tenant") + self.settings = settings + self.updated_at = datetime.now() diff --git a/services/tenant/src/tenant/infrastructure/__init__.py b/services/tenant/src/tenant/infrastructure/__init__.py new file mode 100644 index 0000000..8b13789 --- /dev/null +++ b/services/tenant/src/tenant/infrastructure/__init__.py @@ -0,0 +1 @@ + diff --git a/services/tenant/src/tenant/infrastructure/config.py b/services/tenant/src/tenant/infrastructure/config.py new file mode 100644 index 0000000..8ccbcd9 --- /dev/null +++ 
b/services/tenant/src/tenant/infrastructure/config.py @@ -0,0 +1,13 @@ +from pydantic_settings import BaseSettings + + +class Settings(BaseSettings): + database_url: str = "postgresql+asyncpg://cmdb:cmdb@postgres:5432/cmdb_tenant" + + postgres_host: str = "postgres" + postgres_port: int = 5432 + postgres_user: str = "cmdb" + postgres_password: str = "cmdb" + + kafka_bootstrap_servers: str = "kafka:9092" + redis_url: str = "redis://redis:6379" diff --git a/services/tenant/src/tenant/infrastructure/database.py b/services/tenant/src/tenant/infrastructure/database.py new file mode 100644 index 0000000..26744e3 --- /dev/null +++ b/services/tenant/src/tenant/infrastructure/database.py @@ -0,0 +1,26 @@ +from sqlalchemy.ext.asyncio import ( + AsyncEngine, + AsyncSession, + async_sessionmaker, + create_async_engine, +) + + +class Database: + def __init__(self, url: str) -> None: + self._engine: AsyncEngine = create_async_engine(url, echo=False, pool_size=5) + self._session_factory = async_sessionmaker( + self._engine, + class_=AsyncSession, + expire_on_commit=False, + ) + + @property + def engine(self) -> AsyncEngine: + return self._engine + + def session(self) -> AsyncSession: + return self._session_factory() + + async def close(self) -> None: + await self._engine.dispose() diff --git a/services/tenant/src/tenant/infrastructure/db_provisioning.py b/services/tenant/src/tenant/infrastructure/db_provisioning.py new file mode 100644 index 0000000..0ea0f69 --- /dev/null +++ b/services/tenant/src/tenant/infrastructure/db_provisioning.py @@ -0,0 +1,58 @@ +import asyncio +from pathlib import Path + +from alembic import command as alembic_command +from alembic.config import Config as AlembicConfig +from sqlalchemy import text +from sqlalchemy.ext.asyncio import create_async_engine + +from tenant.domain.tenant import TenantDbConfig +from tenant.infrastructure.config import Settings + + +class TenantDbProvisioner: + def __init__(self, settings: Settings) -> None: + self._settings = settings + + async def provision(self, slug: str) -> TenantDbConfig: + db_name = f"cmdb_tenant_{slug}" + db_config = TenantDbConfig( + db_host=self._settings.postgres_host, + db_port=self._settings.postgres_port, + db_name=db_name, + db_user=self._settings.postgres_user, + db_password=self._settings.postgres_password, + ) + + await self._create_database(db_name) + await self._run_migrations(db_config) + + return db_config + + async def _create_database(self, db_name: str) -> None: + s = self._settings + admin_url = ( + f"postgresql+asyncpg://{s.postgres_user}:{s.postgres_password}@{s.postgres_host}:{s.postgres_port}/postgres" + ) + engine = create_async_engine(admin_url, isolation_level="AUTOCOMMIT") + try: + async with engine.connect() as conn: + result = await conn.execute( + text("SELECT 1 FROM pg_database WHERE datname = :name"), + {"name": db_name}, + ) + if result.scalar() is None: + await conn.execute(text(f'CREATE DATABASE "{db_name}"')) + finally: + await engine.dispose() + + async def _run_migrations(self, db_config: TenantDbConfig) -> None: + def _run() -> None: + alembic_cfg = AlembicConfig() + script_dir = str(Path(__file__).resolve().parent.parent.parent.parent / "alembic_tenant_db") + alembic_cfg.set_main_option("script_location", script_dir) + alembic_cfg.set_main_option("sqlalchemy.url", db_config.sync_url) + alembic_command.upgrade(alembic_cfg, "head") + + loop = asyncio.get_running_loop() + await loop.run_in_executor(None, _run) diff --git a/services/tenant/src/tenant/infrastructure/models.py 
b/services/tenant/src/tenant/infrastructure/models.py new file mode 100644 index 0000000..9216998 --- /dev/null +++ b/services/tenant/src/tenant/infrastructure/models.py @@ -0,0 +1,27 @@ +from datetime import datetime +from uuid import UUID + +from sqlalchemy import DateTime as SADateTime +from sqlalchemy import String +from sqlalchemy.dialects.postgresql import JSONB +from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column +from sqlalchemy.sql import func + + +class TenantBase(DeclarativeBase): + pass + + +class TenantModel(TenantBase): + __tablename__ = "tenants" + + id: Mapped[UUID] = mapped_column(primary_key=True) + name: Mapped[str] = mapped_column(String(255)) + slug: Mapped[str] = mapped_column(String(255), unique=True, index=True) + status: Mapped[str] = mapped_column(String(20), default="active") + settings: Mapped[dict] = mapped_column(JSONB, default=dict) + db_config: Mapped[dict | None] = mapped_column(JSONB, nullable=True) + created_at: Mapped[datetime] = mapped_column(SADateTime(timezone=True), server_default=func.now()) + updated_at: Mapped[datetime] = mapped_column( + SADateTime(timezone=True), server_default=func.now(), onupdate=func.now() + ) diff --git a/services/tenant/src/tenant/infrastructure/tenant_db_manager.py b/services/tenant/src/tenant/infrastructure/tenant_db_manager.py new file mode 100644 index 0000000..1f83756 --- /dev/null +++ b/services/tenant/src/tenant/infrastructure/tenant_db_manager.py @@ -0,0 +1,3 @@ +from shared.db.tenant_db_manager import TenantDbManager + +__all__ = ["TenantDbManager"] diff --git a/services/tenant/src/tenant/infrastructure/tenant_repository.py b/services/tenant/src/tenant/infrastructure/tenant_repository.py new file mode 100644 index 0000000..b608539 --- /dev/null +++ b/services/tenant/src/tenant/infrastructure/tenant_repository.py @@ -0,0 +1,79 @@ +from uuid import UUID + +from sqlalchemy import func as sa_func +from sqlalchemy import select +from sqlalchemy.ext.asyncio import AsyncSession + +from tenant.domain.repository import TenantRepository +from tenant.domain.tenant import Tenant, TenantDbConfig, TenantSettings, TenantStatus +from tenant.infrastructure.models import TenantModel + + +class PostgresTenantRepository(TenantRepository): + def __init__(self, session: AsyncSession) -> None: + self._session = session + + async def find_by_id(self, entity_id: UUID) -> Tenant | None: + result = await self._session.get(TenantModel, entity_id) + return self._to_entity(result) if result else None + + async def find_by_slug(self, slug: str) -> Tenant | None: + stmt = select(TenantModel).where(TenantModel.slug == slug) + result = await self._session.execute(stmt) + row = result.scalar_one_or_none() + return self._to_entity(row) if row else None + + async def find_all( + self, + *, + offset: int = 0, + limit: int = 50, + ) -> tuple[list[Tenant], int]: + count_stmt = select(sa_func.count()).select_from(TenantModel) + total = (await self._session.execute(count_stmt)).scalar_one() + + stmt = select(TenantModel).order_by(TenantModel.created_at.desc()).offset(offset).limit(limit) + result = await self._session.execute(stmt) + rows = result.scalars().all() + return [self._to_entity(r) for r in rows], total + + async def save(self, entity: Tenant) -> Tenant: + model = self._to_model(entity) + merged = await self._session.merge(model) + await self._session.commit() + return self._to_entity(merged) + + async def delete(self, entity_id: UUID) -> None: + model = await self._session.get(TenantModel, entity_id) + if model: + await 
self._session.delete(model) + await self._session.commit() + + @staticmethod + def _to_entity(model: TenantModel) -> Tenant: + db_config = None + if model.db_config: + db_config = TenantDbConfig(**model.db_config) + return Tenant( + id=model.id, + name=model.name, + slug=model.slug, + status=TenantStatus(model.status), + settings=TenantSettings(**(model.settings or {})), + db_config=db_config, + created_at=model.created_at, + updated_at=model.updated_at, + ) + + @staticmethod + def _to_model(entity: Tenant) -> TenantModel: + return TenantModel( + id=entity.id, + name=entity.name, + slug=entity.slug, + status=entity.status.value, + settings=entity.settings.model_dump(), + db_config=entity.db_config.model_dump() if entity.db_config else None, + created_at=entity.created_at, + updated_at=entity.updated_at, + ) diff --git a/services/tenant/src/tenant/interface/__init__.py b/services/tenant/src/tenant/interface/__init__.py new file mode 100644 index 0000000..8b13789 --- /dev/null +++ b/services/tenant/src/tenant/interface/__init__.py @@ -0,0 +1 @@ + diff --git a/services/tenant/src/tenant/interface/main.py b/services/tenant/src/tenant/interface/main.py new file mode 100644 index 0000000..1597066 --- /dev/null +++ b/services/tenant/src/tenant/interface/main.py @@ -0,0 +1,68 @@ +from collections.abc import AsyncGenerator +from contextlib import asynccontextmanager + +from fastapi import FastAPI +from fastapi.middleware.cors import CORSMiddleware +from shared.api.errors import domain_exception_handler +from shared.api.middleware import CorrelationIdMiddleware +from shared.domain.exceptions import DomainError +from shared.messaging.producer import KafkaEventProducer +from shared.messaging.serialization import EventSerializer + +from tenant.domain.events import TenantCreated, TenantDeleted, TenantSuspended +from tenant.infrastructure.config import Settings +from tenant.infrastructure.database import Database +from tenant.infrastructure.db_provisioning import TenantDbProvisioner +from tenant.infrastructure.tenant_db_manager import TenantDbManager +from tenant.interface.router import router +from tenant.interface.setup_router import setup_router + + +@asynccontextmanager +async def lifespan(app: FastAPI) -> AsyncGenerator[None]: + settings = Settings() + + database = Database(settings.database_url) + + serializer = EventSerializer() + serializer.register(TenantCreated) + serializer.register(TenantSuspended) + serializer.register(TenantDeleted) + event_producer = KafkaEventProducer( + settings.kafka_bootstrap_servers, + serializer, + ) + await event_producer.start() + + provisioner = TenantDbProvisioner(settings) + tenant_db_manager = TenantDbManager() + + app.state.database = database + app.state.settings = settings + app.state.event_producer = event_producer + app.state.provisioner = provisioner + app.state.tenant_db_manager = tenant_db_manager + + yield + + await event_producer.stop() + await database.close() + await tenant_db_manager.close_all() + + +def create_app() -> FastAPI: + app = FastAPI(title="CMDB Tenant Service", lifespan=lifespan) + app.add_middleware( + CORSMiddleware, + allow_origins=["http://localhost:3000"], + allow_methods=["*"], + allow_headers=["*"], + ) + app.add_middleware(CorrelationIdMiddleware) + app.add_exception_handler(DomainError, domain_exception_handler) + app.include_router(router) + app.include_router(setup_router) + return app + + +app = create_app() diff --git a/services/tenant/src/tenant/interface/router.py b/services/tenant/src/tenant/interface/router.py new file 
mode 100644 index 0000000..62b9bea --- /dev/null +++ b/services/tenant/src/tenant/interface/router.py @@ -0,0 +1,152 @@ +from collections.abc import AsyncGenerator +from uuid import UUID + +from fastapi import APIRouter, Depends, Request, status +from shared.api.pagination import OffsetParams +from shared.cqrs.bus import CommandBus, QueryBus +from sqlalchemy.ext.asyncio import AsyncSession + +from tenant.application.command_handlers import ( + CreateTenantHandler, + DeleteTenantHandler, + SuspendTenantHandler, + UpdateTenantSettingsHandler, +) +from tenant.application.commands import ( + CreateTenantCommand, + DeleteTenantCommand, + SuspendTenantCommand, + UpdateTenantSettingsCommand, +) +from tenant.application.queries import GetTenantQuery, ListTenantsQuery +from tenant.application.query_handlers import GetTenantHandler, ListTenantsHandler +from tenant.infrastructure.tenant_repository import PostgresTenantRepository +from tenant.interface.schemas import ( + CreateTenantRequest, + TenantListResponse, + TenantResponse, + UpdateTenantSettingsRequest, +) + +router = APIRouter(prefix="/tenants", tags=["tenants"]) + + +def _get_session(request: Request) -> AsyncSession: + return request.app.state.database.session() + + +async def _get_command_bus(request: Request) -> AsyncGenerator[CommandBus]: + session = _get_session(request) + repo = PostgresTenantRepository(session) + + bus = CommandBus() + bus.register( + CreateTenantCommand, + CreateTenantHandler( + repo, + request.app.state.provisioner, + request.app.state.event_producer, + ), + ) + bus.register( + SuspendTenantCommand, + SuspendTenantHandler(repo, request.app.state.event_producer), + ) + bus.register( + UpdateTenantSettingsCommand, + UpdateTenantSettingsHandler(repo), + ) + bus.register( + DeleteTenantCommand, + DeleteTenantHandler(repo, request.app.state.event_producer), + ) + # Yield so FastAPI closes the session after the response; otherwise read-only sessions hold pool connections. + try: + yield bus + finally: + await session.close() + + +async def _get_query_bus(request: Request) -> AsyncGenerator[QueryBus]: + session = _get_session(request) + repo = PostgresTenantRepository(session) + + bus = QueryBus() + bus.register(GetTenantQuery, GetTenantHandler(repo)) + bus.register(ListTenantsQuery, ListTenantsHandler(repo)) + try: + yield bus + finally: + await session.close() + + +@router.post( + "", + status_code=status.HTTP_201_CREATED, + response_model=TenantResponse, +) +async def create_tenant( + body: CreateTenantRequest, + command_bus: CommandBus = Depends(_get_command_bus), # noqa: B008 + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> TenantResponse: + tenant_id = await command_bus.dispatch(CreateTenantCommand(**body.model_dump())) + result = await query_bus.dispatch(GetTenantQuery(tenant_id=tenant_id)) + return TenantResponse(**result.model_dump()) + + +@router.get("", response_model=TenantListResponse) +async def list_tenants( + params: OffsetParams = Depends(), # noqa: B008 + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> TenantListResponse: + items, total = await query_bus.dispatch(ListTenantsQuery(offset=params.offset, limit=params.limit)) + return TenantListResponse( + items=[TenantResponse(**i.model_dump()) for i in items], + total=total, + offset=params.offset, + limit=params.limit, + ) + + +@router.get("/{tenant_id}", response_model=TenantResponse) +async def get_tenant( + tenant_id: UUID, + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> TenantResponse: + result = await query_bus.dispatch(GetTenantQuery(tenant_id=tenant_id)) + return TenantResponse(**result.model_dump()) + + +@router.patch("/{tenant_id}", response_model=TenantResponse) +async def update_tenant_settings( + tenant_id: UUID, + body: UpdateTenantSettingsRequest, + command_bus:
CommandBus = Depends(_get_command_bus), # noqa: B008 + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> TenantResponse: + await command_bus.dispatch(UpdateTenantSettingsCommand(tenant_id=tenant_id, **body.model_dump())) + result = await query_bus.dispatch(GetTenantQuery(tenant_id=tenant_id)) + return TenantResponse(**result.model_dump()) + + +@router.post( + "/{tenant_id}/suspend", + status_code=status.HTTP_204_NO_CONTENT, +) +async def suspend_tenant( + tenant_id: UUID, + command_bus: CommandBus = Depends(_get_command_bus), # noqa: B008 +) -> None: + await command_bus.dispatch(SuspendTenantCommand(tenant_id=tenant_id)) + + +@router.delete( + "/{tenant_id}", + status_code=status.HTTP_204_NO_CONTENT, +) +async def delete_tenant( + tenant_id: UUID, + command_bus: CommandBus = Depends(_get_command_bus), # noqa: B008 +) -> None: + await command_bus.dispatch(DeleteTenantCommand(tenant_id=tenant_id)) diff --git a/services/tenant/src/tenant/interface/schemas.py b/services/tenant/src/tenant/interface/schemas.py new file mode 100644 index 0000000..90c4422 --- /dev/null +++ b/services/tenant/src/tenant/interface/schemas.py @@ -0,0 +1,38 @@ +from datetime import datetime +from uuid import UUID + +from pydantic import BaseModel, Field + +from tenant.domain.tenant import TenantStatus + + +class CreateTenantRequest(BaseModel): + name: str = Field(..., min_length=1, max_length=255) + slug: str = Field(..., min_length=1, max_length=255, pattern=r"^[a-z0-9-]+$") + custom_domain: str | None = None + logo_url: str | None = None + theme: str | None = None + + +class UpdateTenantSettingsRequest(BaseModel): + custom_domain: str | None = None + logo_url: str | None = None + theme: str | None = None + + +class TenantResponse(BaseModel): + id: UUID + name: str + slug: str + status: TenantStatus + settings: dict + db_name: str | None + created_at: datetime + updated_at: datetime + + +class TenantListResponse(BaseModel): + items: list[TenantResponse] + total: int + offset: int + limit: int diff --git a/services/tenant/src/tenant/interface/setup_router.py b/services/tenant/src/tenant/interface/setup_router.py new file mode 100644 index 0000000..4e80a38 --- /dev/null +++ b/services/tenant/src/tenant/interface/setup_router.py @@ -0,0 +1,53 @@ +from fastapi import APIRouter, HTTPException, Request +from pydantic import BaseModel +from shared.cqrs.bus import CommandBus, QueryBus + +from tenant.application.command_handlers import CreateTenantHandler +from tenant.application.commands import CreateTenantCommand +from tenant.application.queries import GetTenantQuery, ListTenantsQuery +from tenant.application.query_handlers import GetTenantHandler, ListTenantsHandler +from tenant.infrastructure.tenant_repository import PostgresTenantRepository +from tenant.interface.schemas import CreateTenantRequest, TenantResponse + +setup_router = APIRouter(prefix="/setup", tags=["setup"]) + + +class SetupStatusResponse(BaseModel): + initialized: bool + + +@setup_router.get("/status", response_model=SetupStatusResponse) +async def get_setup_status(request: Request) -> SetupStatusResponse: + async with request.app.state.database.session() as session: + repo = PostgresTenantRepository(session) + bus = QueryBus() + bus.register(ListTenantsQuery, ListTenantsHandler(repo)) + _, total = await bus.dispatch(ListTenantsQuery(offset=0, limit=1)) + return SetupStatusResponse(initialized=total > 0) + + +@setup_router.post("/create-tenant", response_model=TenantResponse) +async def setup_create_tenant( + body: CreateTenantRequest, + request:
Request, +) -> TenantResponse: + # Only allow if not yet initialized + async with request.app.state.database.session() as session: + repo = PostgresTenantRepository(session) + + query_bus = QueryBus() + query_bus.register(ListTenantsQuery, ListTenantsHandler(repo)) + query_bus.register(GetTenantQuery, GetTenantHandler(repo)) + + _, total = await query_bus.dispatch(ListTenantsQuery(offset=0, limit=1)) + if total > 0: + raise HTTPException(status_code=403, detail="System already initialized") + + command_bus = CommandBus() + command_bus.register( + CreateTenantCommand, + CreateTenantHandler(repo, request.app.state.provisioner, request.app.state.event_producer), + ) + tenant_id = await command_bus.dispatch(CreateTenantCommand(**body.model_dump())) + result = await query_bus.dispatch(GetTenantQuery(tenant_id=tenant_id)) + return TenantResponse(**result.model_dump()) diff --git a/services/webhook/Dockerfile b/services/webhook/Dockerfile new file mode 100644 index 0000000..54fa581 --- /dev/null +++ b/services/webhook/Dockerfile @@ -0,0 +1,5 @@ +FROM cmdb-base:prod +COPY services/webhook/ ./services/webhook/ +RUN uv sync --frozen --no-dev --package cmdb-webhook +EXPOSE 8000 +CMD ["uv", "run", "--package", "cmdb-webhook", "uvicorn", "webhook.interface.main:app", "--host", "0.0.0.0", "--port", "8000"] diff --git a/services/webhook/Dockerfile.dev b/services/webhook/Dockerfile.dev new file mode 100644 index 0000000..d89b80f --- /dev/null +++ b/services/webhook/Dockerfile.dev @@ -0,0 +1,5 @@ +FROM cmdb-base:dev +COPY services/webhook/pyproject.toml ./services/webhook/pyproject.toml +RUN uv sync --frozen --package cmdb-webhook +EXPOSE 8000 +CMD ["uv", "run", "--package", "cmdb-webhook", "uvicorn", "webhook.interface.main:app", "--host", "0.0.0.0", "--port", "8000", "--reload"] diff --git a/services/webhook/alembic.ini b/services/webhook/alembic.ini new file mode 100644 index 0000000..6e5d25c --- /dev/null +++ b/services/webhook/alembic.ini @@ -0,0 +1,26 @@ +[alembic] +script_location = alembic +sqlalchemy.url = postgresql+asyncpg://cmdb:cmdb@postgres:5432/cmdb_webhook + +[loggers] +keys = root + +[handlers] +keys = console + +[formatters] +keys = generic + +[logger_root] +level = WARN +handlers = console + +[handler_console] +class = StreamHandler +args = (sys.stderr,) +level = NOTSET +formatter = generic + +[formatter_generic] +format = %(levelname)-5.5s [%(name)s] %(message)s +datefmt = %H:%M:%S diff --git a/services/webhook/alembic/env.py b/services/webhook/alembic/env.py new file mode 100644 index 0000000..692571a --- /dev/null +++ b/services/webhook/alembic/env.py @@ -0,0 +1,48 @@ +import asyncio +from logging.config import fileConfig + +from alembic import context +from sqlalchemy import pool +from sqlalchemy.ext.asyncio import async_engine_from_config +from webhook.infrastructure.models import WebhookBase + +config = context.config + +if config.config_file_name is not None: + fileConfig(config.config_file_name) + +target_metadata = WebhookBase.metadata + + +def run_migrations_offline() -> None: + url = config.get_main_option("sqlalchemy.url") + context.configure(url=url, target_metadata=target_metadata, literal_binds=True) + with context.begin_transaction(): + context.run_migrations() + + +def do_run_migrations(connection) -> None: + context.configure(connection=connection, target_metadata=target_metadata) + with context.begin_transaction(): + context.run_migrations() + + +async def run_async_migrations() -> None: + connectable = async_engine_from_config( + config.get_section(config.config_ini_section, {}),
+ prefix="sqlalchemy.", + poolclass=pool.NullPool, + ) + async with connectable.connect() as connection: + await connection.run_sync(do_run_migrations) + await connectable.dispose() + + +def run_migrations_online() -> None: + asyncio.run(run_async_migrations()) + + +if context.is_offline_mode(): + run_migrations_offline() +else: + run_migrations_online() diff --git a/services/webhook/alembic/versions/001_create_webhook_tables.py b/services/webhook/alembic/versions/001_create_webhook_tables.py new file mode 100644 index 0000000..b64d9b0 --- /dev/null +++ b/services/webhook/alembic/versions/001_create_webhook_tables.py @@ -0,0 +1,62 @@ +"""create webhook tables + +Revision ID: 001 +Revises: +Create Date: 2026-03-22 +""" + +import sqlalchemy as sa +from alembic import op +from sqlalchemy.dialects import postgresql + +revision = "001" +down_revision = None +branch_labels = None +depends_on = None + + +def upgrade() -> None: + op.create_table( + "webhooks", + sa.Column("id", sa.Uuid(), nullable=False), + sa.Column("name", sa.String(255), nullable=False), + sa.Column("url", sa.Text(), nullable=False), + sa.Column("secret", sa.String(255), nullable=False), + sa.Column("event_types", postgresql.JSONB(), server_default="[]", nullable=False), + sa.Column("is_active", sa.Boolean(), server_default=sa.text("true"), nullable=False), + sa.Column("tenant_id", sa.Uuid(), nullable=True), + sa.Column("description", sa.Text(), server_default="", nullable=False), + sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.Column("updated_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.PrimaryKeyConstraint("id"), + ) + op.create_index("ix_webhooks_name", "webhooks", ["name"]) + op.create_index("ix_webhooks_is_active", "webhooks", ["is_active"]) + op.create_index("ix_webhooks_tenant_id", "webhooks", ["tenant_id"]) + + op.create_table( + "webhook_event_logs", + sa.Column("id", sa.Uuid(), nullable=False), + sa.Column("webhook_id", sa.Uuid(), nullable=False), + sa.Column("event_type", sa.String(255), nullable=False), + sa.Column("event_id", sa.String(255), nullable=False), + sa.Column("request_url", sa.Text(), nullable=False), + sa.Column("request_body", sa.Text(), nullable=False), + sa.Column("response_status", sa.Integer(), nullable=True), + sa.Column("response_body", sa.Text(), nullable=True), + sa.Column("error_message", sa.Text(), nullable=True), + sa.Column("attempt", sa.Integer(), server_default="1", nullable=False), + sa.Column("duration_ms", sa.Integer(), nullable=True), + sa.Column("success", sa.Boolean(), server_default=sa.text("false"), nullable=False), + sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False), + sa.PrimaryKeyConstraint("id"), + ) + op.create_index("ix_webhook_event_logs_webhook_id", "webhook_event_logs", ["webhook_id"]) + op.create_index("ix_webhook_event_logs_event_type", "webhook_event_logs", ["event_type"]) + op.create_index("ix_webhook_event_logs_event_id", "webhook_event_logs", ["event_id"]) + op.create_index("ix_webhook_event_logs_created_at", "webhook_event_logs", ["created_at"]) + + +def downgrade() -> None: + op.drop_table("webhook_event_logs") + op.drop_table("webhooks") diff --git a/services/webhook/alembic/versions/__init__.py b/services/webhook/alembic/versions/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/services/webhook/docker-compose.dev.yml b/services/webhook/docker-compose.dev.yml new file mode 100644 index 
0000000..9a01a41 --- /dev/null +++ b/services/webhook/docker-compose.dev.yml @@ -0,0 +1,26 @@ +services: + webhook: + build: + context: ../../ + dockerfile: services/webhook/Dockerfile.dev + volumes: + - ../../services/webhook/src:/app/services/webhook/src + ports: + - "8005:8000" + environment: + DATABASE_URL: postgresql+asyncpg://${POSTGRES_USER:-cmdb}:${POSTGRES_PASSWORD:-cmdb}@postgres:5432/cmdb_webhook + KAFKA_BOOTSTRAP_SERVERS: kafka:9092 + REDIS_URL: redis://redis:6379 + depends_on: + postgres: + condition: service_healthy + kafka: + condition: service_healthy + redis: + condition: service_healthy + networks: + - cmdb-network + +networks: + cmdb-network: + external: true diff --git a/services/webhook/docker-compose.yml b/services/webhook/docker-compose.yml new file mode 100644 index 0000000..f4a7dce --- /dev/null +++ b/services/webhook/docker-compose.yml @@ -0,0 +1,18 @@ +services: + webhook: + build: + context: ../../ + dockerfile: services/webhook/Dockerfile + environment: + DATABASE_URL: postgresql+asyncpg://${POSTGRES_USER:-cmdb}:${POSTGRES_PASSWORD:-cmdb}@postgres:5432/cmdb_webhook + KAFKA_BOOTSTRAP_SERVERS: kafka:9092 + REDIS_URL: redis://redis:6379 + depends_on: + postgres: + condition: service_healthy + kafka: + condition: service_healthy + redis: + condition: service_healthy + networks: + - cmdb-network diff --git a/services/webhook/pyproject.toml b/services/webhook/pyproject.toml new file mode 100644 index 0000000..e950fa9 --- /dev/null +++ b/services/webhook/pyproject.toml @@ -0,0 +1,32 @@ +[project] +name = "cmdb-webhook" +version = "0.1.0" +description = "CMDB Webhook Service" +requires-python = ">=3.13" +dependencies = [ + "cmdb-shared", + "fastapi>=0.115", + "uvicorn", + "httpx", + "pydantic-settings", + "asyncpg", + "aiokafka", + "redis", + "sqlalchemy", + "alembic", +] + +[build-system] +requires = ["hatchling"] +build-backend = "hatchling.build" + +[tool.hatch.build.targets.wheel] +packages = ["src/webhook"] + +[tool.pytest.ini_options] +testpaths = ["tests"] +pythonpath = ["src"] +asyncio_mode = "auto" + +[tool.uv.sources] +cmdb-shared = { workspace = true } diff --git a/services/webhook/src/webhook/__init__.py b/services/webhook/src/webhook/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/services/webhook/src/webhook/application/__init__.py b/services/webhook/src/webhook/application/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/services/webhook/src/webhook/application/command_handlers.py b/services/webhook/src/webhook/application/command_handlers.py new file mode 100644 index 0000000..30e483c --- /dev/null +++ b/services/webhook/src/webhook/application/command_handlers.py @@ -0,0 +1,77 @@ +from __future__ import annotations + +from datetime import datetime +from uuid import UUID + +from shared.cqrs.command import CommandHandler +from shared.domain.exceptions import EntityNotFoundError + +from webhook.application.commands import CreateWebhookCommand, DeleteWebhookCommand, UpdateWebhookCommand +from webhook.domain.repository import WebhookRepository +from webhook.domain.webhook import Webhook +from webhook.infrastructure.webhook_cache import WebhookCache + + +class CreateWebhookHandler(CommandHandler[UUID]): + def __init__(self, repo: WebhookRepository, cache: WebhookCache | None = None) -> None: + self._repo = repo + self._cache = cache + + async def handle(self, command: CreateWebhookCommand) -> UUID: + webhook = Webhook( + name=command.name, + url=command.url, + secret=command.secret, + event_types=command.event_types, + tenant_id=command.tenant_id, +
description=command.description, + ) + await self._repo.save(webhook) + if self._cache: + await self._cache.invalidate(command.tenant_id) + return webhook.id + + +class UpdateWebhookHandler(CommandHandler[UUID]): + def __init__(self, repo: WebhookRepository, cache: WebhookCache | None = None) -> None: + self._repo = repo + self._cache = cache + + async def handle(self, command: UpdateWebhookCommand) -> UUID: + webhook = await self._repo.find_by_id(command.webhook_id) + if webhook is None: + raise EntityNotFoundError(f"Webhook {command.webhook_id} not found") + + if command.name is not None: + webhook.name = command.name + if command.url is not None: + webhook.url = command.url + if command.secret is not None: + webhook.secret = command.secret + if command.event_types is not None: + webhook.event_types = command.event_types + if command.is_active is not None: + webhook.is_active = command.is_active + if command.description is not None: + webhook.description = command.description + webhook.updated_at = datetime.now() + + await self._repo.save(webhook) + if self._cache: + await self._cache.invalidate(webhook.tenant_id) + return webhook.id + + +class DeleteWebhookHandler(CommandHandler[None]): + def __init__(self, repo: WebhookRepository, cache: WebhookCache | None = None) -> None: + self._repo = repo + self._cache = cache + + async def handle(self, command: DeleteWebhookCommand) -> None: + webhook = await self._repo.find_by_id(command.webhook_id) + if webhook is None: + raise EntityNotFoundError(f"Webhook {command.webhook_id} not found") + + await self._repo.delete(command.webhook_id) + if self._cache: + await self._cache.invalidate(webhook.tenant_id) diff --git a/services/webhook/src/webhook/application/commands.py b/services/webhook/src/webhook/application/commands.py new file mode 100644 index 0000000..84c6f52 --- /dev/null +++ b/services/webhook/src/webhook/application/commands.py @@ -0,0 +1,26 @@ +from uuid import UUID + +from shared.cqrs.command import Command + + +class CreateWebhookCommand(Command): + name: str + url: str + secret: str + event_types: list[str] + tenant_id: UUID | None = None + description: str = "" + + +class UpdateWebhookCommand(Command): + webhook_id: UUID + name: str | None = None + url: str | None = None + secret: str | None = None + event_types: list[str] | None = None + is_active: bool | None = None + description: str | None = None + + +class DeleteWebhookCommand(Command): + webhook_id: UUID diff --git a/services/webhook/src/webhook/application/dto.py b/services/webhook/src/webhook/application/dto.py new file mode 100644 index 0000000..744d987 --- /dev/null +++ b/services/webhook/src/webhook/application/dto.py @@ -0,0 +1,31 @@ +from datetime import datetime +from uuid import UUID + +from pydantic import BaseModel + + +class WebhookDTO(BaseModel): + id: UUID + name: str + url: str + secret: str + event_types: list[str] + is_active: bool + tenant_id: UUID | None + description: str + created_at: datetime + updated_at: datetime + + +class WebhookLogDTO(BaseModel): + id: UUID + webhook_id: UUID + event_type: str + event_id: str + request_url: str + response_status: int | None + error_message: str | None + attempt: int + duration_ms: int | None + success: bool + created_at: datetime diff --git a/services/webhook/src/webhook/application/queries.py b/services/webhook/src/webhook/application/queries.py new file mode 100644 index 0000000..f3b18c7 --- /dev/null +++ b/services/webhook/src/webhook/application/queries.py @@ -0,0 +1,20 @@ +from uuid import UUID + +from 
shared.cqrs.query import Query + + +class GetWebhookQuery(Query): + webhook_id: UUID + + +class ListWebhooksQuery(Query): + offset: int = 0 + limit: int = 50 + is_active: bool | None = None + tenant_id: UUID | None = None + + +class ListWebhookLogsQuery(Query): + webhook_id: UUID + offset: int = 0 + limit: int = 50 diff --git a/services/webhook/src/webhook/application/query_handlers.py b/services/webhook/src/webhook/application/query_handlers.py new file mode 100644 index 0000000..7b298ac --- /dev/null +++ b/services/webhook/src/webhook/application/query_handlers.py @@ -0,0 +1,46 @@ +from __future__ import annotations + +from shared.cqrs.query import QueryHandler +from shared.domain.exceptions import EntityNotFoundError + +from webhook.application.dto import WebhookDTO, WebhookLogDTO +from webhook.application.queries import GetWebhookQuery, ListWebhookLogsQuery, ListWebhooksQuery +from webhook.domain.repository import WebhookLogRepository, WebhookRepository + + +class GetWebhookHandler(QueryHandler[WebhookDTO]): + def __init__(self, repo: WebhookRepository) -> None: + self._repo = repo + + async def handle(self, query: GetWebhookQuery) -> WebhookDTO: + webhook = await self._repo.find_by_id(query.webhook_id) + if webhook is None: + raise EntityNotFoundError(f"Webhook {query.webhook_id} not found") + return WebhookDTO.model_validate(webhook.model_dump()) + + +class ListWebhooksHandler(QueryHandler[tuple[list[WebhookDTO], int]]): + def __init__(self, repo: WebhookRepository) -> None: + self._repo = repo + + async def handle(self, query: ListWebhooksQuery) -> tuple[list[WebhookDTO], int]: + webhooks, total = await self._repo.find_all( + offset=query.offset, + limit=query.limit, + is_active=query.is_active, + tenant_id=query.tenant_id, + ) + return [WebhookDTO.model_validate(w.model_dump()) for w in webhooks], total + + +class ListWebhookLogsHandler(QueryHandler[tuple[list[WebhookLogDTO], int]]): + def __init__(self, log_repo: WebhookLogRepository) -> None: + self._log_repo = log_repo + + async def handle(self, query: ListWebhookLogsQuery) -> tuple[list[WebhookLogDTO], int]: + logs, total = await self._log_repo.find_by_webhook( + query.webhook_id, + offset=query.offset, + limit=query.limit, + ) + return [WebhookLogDTO.model_validate(log.model_dump()) for log in logs], total diff --git a/services/webhook/src/webhook/domain/__init__.py b/services/webhook/src/webhook/domain/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/services/webhook/src/webhook/domain/repository.py b/services/webhook/src/webhook/domain/repository.py new file mode 100644 index 0000000..370c75e --- /dev/null +++ b/services/webhook/src/webhook/domain/repository.py @@ -0,0 +1,34 @@ +from abc import ABC, abstractmethod +from uuid import UUID + +from webhook.domain.webhook import Webhook +from webhook.domain.webhook_log import WebhookEventLog + + +class WebhookRepository(ABC): + @abstractmethod + async def find_by_id(self, webhook_id: UUID) -> Webhook | None: ... + + @abstractmethod + async def find_all( + self, *, offset: int = 0, limit: int = 50, is_active: bool | None = None, tenant_id: UUID | None = None + ) -> tuple[list[Webhook], int]: ... + + @abstractmethod + async def find_active_for_tenant(self, tenant_id: UUID | None) -> list[Webhook]: ... + + @abstractmethod + async def save(self, webhook: Webhook) -> None: ... + + @abstractmethod + async def delete(self, webhook_id: UUID) -> None: ... + + +class WebhookLogRepository(ABC): + @abstractmethod + async def save(self, log: WebhookEventLog) -> None: ... 
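+ # Append-only: logs are only ever inserted; find_by_webhook below returns one page plus the total count.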
+ + @abstractmethod + async def find_by_webhook( + self, webhook_id: UUID, *, offset: int = 0, limit: int = 50 + ) -> tuple[list[WebhookEventLog], int]: ... diff --git a/services/webhook/src/webhook/domain/webhook.py b/services/webhook/src/webhook/domain/webhook.py new file mode 100644 index 0000000..df9c52d --- /dev/null +++ b/services/webhook/src/webhook/domain/webhook.py @@ -0,0 +1,30 @@ +from datetime import datetime +from uuid import UUID, uuid4 + +from pydantic import BaseModel, Field + + +class Webhook(BaseModel): + id: UUID = Field(default_factory=uuid4) + name: str + url: str + secret: str + event_types: list[str] # ["*"] or ["ipam.domain.events.PrefixCreated", ...] + is_active: bool = True + tenant_id: UUID | None = None + description: str = "" + created_at: datetime = Field(default_factory=datetime.now) + updated_at: datetime = Field(default_factory=datetime.now) + + def matches_event(self, event_type: str) -> bool: + if not self.is_active: + return False + return "*" in self.event_types or event_type in self.event_types + + def deactivate(self) -> None: + self.is_active = False + self.updated_at = datetime.now() + + def activate(self) -> None: + self.is_active = True + self.updated_at = datetime.now() diff --git a/services/webhook/src/webhook/domain/webhook_log.py b/services/webhook/src/webhook/domain/webhook_log.py new file mode 100644 index 0000000..3bca708 --- /dev/null +++ b/services/webhook/src/webhook/domain/webhook_log.py @@ -0,0 +1,20 @@ +from datetime import datetime +from uuid import UUID, uuid4 + +from pydantic import BaseModel, Field + + +class WebhookEventLog(BaseModel): + id: UUID = Field(default_factory=uuid4) + webhook_id: UUID + event_type: str + event_id: str + request_url: str + request_body: str + response_status: int | None = None + response_body: str | None = None + error_message: str | None = None + attempt: int = 1 + duration_ms: int | None = None + success: bool = False + created_at: datetime = Field(default_factory=datetime.now) diff --git a/services/webhook/src/webhook/infrastructure/__init__.py b/services/webhook/src/webhook/infrastructure/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/services/webhook/src/webhook/infrastructure/config.py b/services/webhook/src/webhook/infrastructure/config.py new file mode 100644 index 0000000..c94fd63 --- /dev/null +++ b/services/webhook/src/webhook/infrastructure/config.py @@ -0,0 +1,13 @@ +from pydantic_settings import BaseSettings + + +# Environment variable names match the field names (DATABASE_URL, KAFKA_BOOTSTRAP_SERVERS, REDIS_URL, WEBHOOK_MAX_RETRIES, ...), as set in the compose files; no env_prefix is applied. +class Settings(BaseSettings): + database_url: str = "postgresql+asyncpg://cmdb:cmdb@postgres:5432/cmdb_webhook" + kafka_bootstrap_servers: str = "kafka:9092" + kafka_group_id: str = "webhook-service" + kafka_dlq_topic: str = "webhook.dlq" + redis_url: str = "redis://redis:6379" + webhook_max_retries: int = 5 + webhook_retry_backoffs: list[int] = [10, 30, 120, 600, 3600] + webhook_delivery_timeout: float = 10.0 diff --git a/services/webhook/src/webhook/infrastructure/database.py b/services/webhook/src/webhook/infrastructure/database.py new file mode 100644 index 0000000..86631bd --- /dev/null +++ b/services/webhook/src/webhook/infrastructure/database.py @@ -0,0 +1,17 @@ +from sqlalchemy.ext.asyncio import AsyncEngine, AsyncSession, async_sessionmaker, create_async_engine + + +class Database: + def __init__(self, url: str) -> None: + self._engine: AsyncEngine = create_async_engine(url, echo=False, pool_size=5) + self._session_factory = async_sessionmaker(self._engine, class_=AsyncSession, expire_on_commit=False) + +
@property + def engine(self) -> AsyncEngine: + return self._engine + + def session(self) -> AsyncSession: + return self._session_factory() + + async def close(self) -> None: + await self._engine.dispose() diff --git a/services/webhook/src/webhook/infrastructure/models.py b/services/webhook/src/webhook/infrastructure/models.py new file mode 100644 index 0000000..c55273d --- /dev/null +++ b/services/webhook/src/webhook/infrastructure/models.py @@ -0,0 +1,46 @@ +from datetime import datetime +from uuid import UUID + +from sqlalchemy import Boolean, DateTime, Integer, String, Text, func +from sqlalchemy.dialects.postgresql import JSONB +from sqlalchemy.dialects.postgresql import UUID as SAUUID +from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column + + +class WebhookBase(DeclarativeBase): + pass + + +class WebhookModel(WebhookBase): + __tablename__ = "webhooks" + + id: Mapped[UUID] = mapped_column(SAUUID(as_uuid=True), primary_key=True) + name: Mapped[str] = mapped_column(String(255), index=True) + url: Mapped[str] = mapped_column(Text) + secret: Mapped[str] = mapped_column(String(255)) + event_types: Mapped[list] = mapped_column(JSONB, default=list) + is_active: Mapped[bool] = mapped_column(Boolean, default=True, index=True) + tenant_id: Mapped[UUID | None] = mapped_column(SAUUID(as_uuid=True), nullable=True, index=True) + description: Mapped[str] = mapped_column(Text, default="") + created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), server_default=func.now()) + updated_at: Mapped[datetime] = mapped_column( + DateTime(timezone=True), server_default=func.now(), onupdate=func.now() + ) + + +class WebhookEventLogModel(WebhookBase): + __tablename__ = "webhook_event_logs" + + id: Mapped[UUID] = mapped_column(SAUUID(as_uuid=True), primary_key=True) + webhook_id: Mapped[UUID] = mapped_column(SAUUID(as_uuid=True), index=True) + event_type: Mapped[str] = mapped_column(String(255), index=True) + event_id: Mapped[str] = mapped_column(String(255), index=True) + request_url: Mapped[str] = mapped_column(Text) + request_body: Mapped[str] = mapped_column(Text) + response_status: Mapped[int | None] = mapped_column(Integer, nullable=True) + response_body: Mapped[str | None] = mapped_column(Text, nullable=True) + error_message: Mapped[str | None] = mapped_column(Text, nullable=True) + attempt: Mapped[int] = mapped_column(Integer, default=1) + duration_ms: Mapped[int | None] = mapped_column(Integer, nullable=True) + success: Mapped[bool] = mapped_column(Boolean, default=False) + created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), server_default=func.now(), index=True) diff --git a/services/webhook/src/webhook/infrastructure/repository.py b/services/webhook/src/webhook/infrastructure/repository.py new file mode 100644 index 0000000..fad3550 --- /dev/null +++ b/services/webhook/src/webhook/infrastructure/repository.py @@ -0,0 +1,166 @@ +from __future__ import annotations + +from uuid import UUID + +from sqlalchemy import func, select + +from webhook.domain.repository import WebhookLogRepository, WebhookRepository +from webhook.domain.webhook import Webhook +from webhook.domain.webhook_log import WebhookEventLog +from webhook.infrastructure.database import Database +from webhook.infrastructure.models import WebhookEventLogModel, WebhookModel + + +class PostgresWebhookRepository(WebhookRepository): + def __init__(self, database: Database) -> None: + self._db = database + + async def find_by_id(self, webhook_id: UUID) -> Webhook | None: + async with self._db.session() as 
session: + model = await session.get(WebhookModel, webhook_id) + if model is None: + return None + return self._to_domain(model) + + async def find_all( + self, *, offset: int = 0, limit: int = 50, is_active: bool | None = None, tenant_id: UUID | None = None + ) -> tuple[list[Webhook], int]: + async with self._db.session() as session: + stmt = select(WebhookModel) + count_stmt = select(func.count()).select_from(WebhookModel) + + if is_active is not None: + stmt = stmt.where(WebhookModel.is_active == is_active) + count_stmt = count_stmt.where(WebhookModel.is_active == is_active) + if tenant_id is not None: + stmt = stmt.where(WebhookModel.tenant_id == tenant_id) + count_stmt = count_stmt.where(WebhookModel.tenant_id == tenant_id) + + stmt = stmt.order_by(WebhookModel.created_at.desc()).offset(offset).limit(limit) + + result = await session.execute(stmt) + models = result.scalars().all() + + count_result = await session.execute(count_stmt) + total = count_result.scalar_one() + + return [self._to_domain(m) for m in models], total + + async def find_active_for_tenant(self, tenant_id: UUID | None) -> list[Webhook]: + async with self._db.session() as session: + stmt = select(WebhookModel).where(WebhookModel.is_active.is_(True)) + + if tenant_id is not None: + stmt = stmt.where((WebhookModel.tenant_id == tenant_id) | (WebhookModel.tenant_id.is_(None))) + + result = await session.execute(stmt) + models = result.scalars().all() + return [self._to_domain(m) for m in models] + + async def save(self, webhook: Webhook) -> None: + async with self._db.session() as session: + model = self._to_model(webhook) + await session.merge(model) + await session.commit() + + async def delete(self, webhook_id: UUID) -> None: + async with self._db.session() as session: + model = await session.get(WebhookModel, webhook_id) + if model: + await session.delete(model) + await session.commit() + + def _to_domain(self, model: WebhookModel) -> Webhook: + return Webhook( + id=model.id, + name=model.name, + url=model.url, + secret=model.secret, + event_types=model.event_types, + is_active=model.is_active, + tenant_id=model.tenant_id, + description=model.description, + created_at=model.created_at, + updated_at=model.updated_at, + ) + + def _to_model(self, webhook: Webhook) -> WebhookModel: + return WebhookModel( + id=webhook.id, + name=webhook.name, + url=webhook.url, + secret=webhook.secret, + event_types=webhook.event_types, + is_active=webhook.is_active, + tenant_id=webhook.tenant_id, + description=webhook.description, + created_at=webhook.created_at, + updated_at=webhook.updated_at, + ) + + +class PostgresWebhookLogRepository(WebhookLogRepository): + def __init__(self, database: Database) -> None: + self._db = database + + async def save(self, log: WebhookEventLog) -> None: + async with self._db.session() as session: + model = WebhookEventLogModel( + id=log.id, + webhook_id=log.webhook_id, + event_type=log.event_type, + event_id=log.event_id, + request_url=log.request_url, + request_body=log.request_body, + response_status=log.response_status, + response_body=log.response_body, + error_message=log.error_message, + attempt=log.attempt, + duration_ms=log.duration_ms, + success=log.success, + created_at=log.created_at, + ) + session.add(model) + await session.commit() + + async def find_by_webhook( + self, webhook_id: UUID, *, offset: int = 0, limit: int = 50 + ) -> tuple[list[WebhookEventLog], int]: + async with self._db.session() as session: + stmt = ( + select(WebhookEventLogModel) + .where(WebhookEventLogModel.webhook_id == 
webhook_id) + .order_by(WebhookEventLogModel.created_at.desc()) + .offset(offset) + .limit(limit) + ) + count_stmt = ( + select(func.count()) + .select_from(WebhookEventLogModel) + .where(WebhookEventLogModel.webhook_id == webhook_id) + ) + + result = await session.execute(stmt) + models = result.scalars().all() + + count_result = await session.execute(count_stmt) + total = count_result.scalar_one() + + return [self._to_domain(m) for m in models], total + + def _to_domain(self, model: WebhookEventLogModel) -> WebhookEventLog: + return WebhookEventLog( + id=model.id, + webhook_id=model.webhook_id, + event_type=model.event_type, + event_id=model.event_id, + request_url=model.request_url, + request_body=model.request_body, + response_status=model.response_status, + response_body=model.response_body, + error_message=model.error_message, + attempt=model.attempt, + duration_ms=model.duration_ms, + success=model.success, + created_at=model.created_at, + ) diff --git a/services/webhook/src/webhook/infrastructure/retry_manager.py b/services/webhook/src/webhook/infrastructure/retry_manager.py new file mode 100644 index 0000000..caf804e --- /dev/null +++ b/services/webhook/src/webhook/infrastructure/retry_manager.py @@ -0,0 +1,85 @@ +import json +import logging +import time +from dataclasses import dataclass + +import redis.asyncio as redis + +logger = logging.getLogger(__name__) + + +@dataclass +class RetryItem: + webhook_id: str + event_payload: dict + event_type: str + attempt: int + + +class RetryManager: + RETRY_KEY = "webhook:retries" + + def __init__(self, redis_url: str, max_retries: int = 5, backoffs: list[int] | None = None) -> None: + self._redis_url = redis_url + self._redis: redis.Redis | None = None + self._max_retries = max_retries + self._backoffs = backoffs or [10, 30, 120, 600, 3600] + + async def connect(self) -> None: + self._redis = redis.from_url(self._redis_url) + + async def close(self) -> None: + if self._redis: + await self._redis.aclose() + + async def schedule_retry(self, webhook_id: str, event_payload: dict, event_type: str, attempt: int) -> bool: + """Schedule retry. 
Returns False if max retries exceeded.""" + if attempt > self._max_retries: + return False + if self._redis is None: + return False + backoff_idx = min(attempt - 1, len(self._backoffs) - 1) + delay = self._backoffs[backoff_idx] + next_time = time.time() + delay + item = json.dumps( + { + "webhook_id": webhook_id, + "event_payload": event_payload, + "event_type": event_type, + "attempt": attempt, + }, + default=str, + ) + await self._redis.zadd(self.RETRY_KEY, {item: next_time}) + logger.info("Scheduled retry %d for webhook %s in %ds", attempt, webhook_id, delay) + return True + + async def get_due_retries(self) -> list[RetryItem]: + """Fetch and remove all items whose scheduled time has passed.""" + if self._redis is None: + return [] + now = time.time() + items = await self._redis.zrangebyscore(self.RETRY_KEY, "-inf", now) + if not items: + return [] + pipe = self._redis.pipeline() + for item in items: + pipe.zrem(self.RETRY_KEY, item) + await pipe.execute() + result = [] + for raw in items: + data = json.loads(raw) + result.append( + RetryItem( + webhook_id=data["webhook_id"], + event_payload=data["event_payload"], + event_type=data["event_type"], + attempt=data["attempt"], + ) + ) + return result + + async def pending_count(self) -> int: + if self._redis is None: + return 0 + return await self._redis.zcard(self.RETRY_KEY) diff --git a/services/webhook/src/webhook/infrastructure/webhook_cache.py b/services/webhook/src/webhook/infrastructure/webhook_cache.py new file mode 100644 index 0000000..f972e91 --- /dev/null +++ b/services/webhook/src/webhook/infrastructure/webhook_cache.py @@ -0,0 +1,46 @@ +import json +import logging +from uuid import UUID + +import redis.asyncio as redis + +from webhook.domain.webhook import Webhook + +logger = logging.getLogger(__name__) + + +class WebhookCache: + def __init__(self, redis_url: str, ttl: int = 30) -> None: + self._redis_url = redis_url + self._ttl = ttl + self._redis: redis.Redis | None = None + + async def connect(self) -> None: + self._redis = redis.from_url(self._redis_url, decode_responses=True) + + async def close(self) -> None: + if self._redis: + await self._redis.aclose() + + def _key(self, tenant_id: UUID | None) -> str: + return f"webhooks:active:{tenant_id}" if tenant_id else "webhooks:active:_global" + + async def get_active_webhooks(self, tenant_id: UUID | None) -> list[Webhook] | None: + if self._redis is None: + return None + raw = await self._redis.get(self._key(tenant_id)) + if raw is None: + return None + data = json.loads(raw) + return [Webhook(**item) for item in data] + + async def set_active_webhooks(self, tenant_id: UUID | None, webhooks: list[Webhook]) -> None: + if self._redis is None: + return + data = json.dumps([w.model_dump(mode="json") for w in webhooks]) + await self._redis.set(self._key(tenant_id), data, ex=self._ttl) + + async def invalidate(self, tenant_id: UUID | None) -> None: + if self._redis is None: + return + await self._redis.delete(self._key(tenant_id)) diff --git a/services/webhook/src/webhook/infrastructure/webhook_consumer.py b/services/webhook/src/webhook/infrastructure/webhook_consumer.py new file mode 100644 index 0000000..e2f1117 --- /dev/null +++ b/services/webhook/src/webhook/infrastructure/webhook_consumer.py @@ -0,0 +1,156 @@ +import asyncio +import json +import logging +from uuid import UUID + +from aiokafka import AIOKafkaConsumer, AIOKafkaProducer + +from webhook.infrastructure.database import Database +from webhook.infrastructure.repository import PostgresWebhookRepository +from 
webhook.infrastructure.webhook_cache import WebhookCache +from webhook.infrastructure.webhook_dispatcher import WebhookDispatcher + +logger = logging.getLogger(__name__) + + +class WebhookConsumerWorker: + def __init__( + self, + bootstrap_servers: str, + group_id: str, + database: Database, + dispatcher: WebhookDispatcher, + cache: WebhookCache, + dlq_topic: str = "webhook.dlq", + ) -> None: + self._bootstrap_servers = bootstrap_servers + self._group_id = group_id + self._database = database + self._dispatcher = dispatcher + self._cache = cache + self._dlq_topic = dlq_topic + self._consumer: AIOKafkaConsumer | None = None + self._dlq_producer: AIOKafkaProducer | None = None + self._running = False + + async def start(self) -> None: + self._consumer = AIOKafkaConsumer( + bootstrap_servers=self._bootstrap_servers, + group_id=self._group_id, + enable_auto_commit=False, + ) + self._consumer.subscribe(pattern=r".*\.events") + await self._consumer.start() + self._dlq_producer = AIOKafkaProducer(bootstrap_servers=self._bootstrap_servers) + await self._dlq_producer.start() + self._running = True + logger.info("Webhook consumer started (pattern: *.events)") + + async def stop(self) -> None: + self._running = False + if self._consumer: + await self._consumer.stop() + if self._dlq_producer: + await self._dlq_producer.stop() + logger.info("Webhook consumer stopped") + + async def consume(self) -> None: + if self._consumer is None: + raise RuntimeError("Consumer not started") + async for msg in self._consumer: + if not self._running: + break + try: + await self._process_message(msg) + await self._consumer.commit() + except Exception: + logger.exception("Failed to process webhook message from %s", msg.topic) + await self._send_to_dlq(msg) + await self._consumer.commit() + + async def _process_message(self, msg: object) -> None: + raw = json.loads(msg.value) # type: ignore[attr-defined] + event_type = raw.get("event_type", "") + if not event_type: + return + + event_tenant_id = raw.get("tenant_id") + event_tenant_uuid = UUID(event_tenant_id) if event_tenant_id else None + + # Get matching webhooks + if event_tenant_uuid is not None: + # Tenant-scoped event -> only webhooks for this tenant + webhooks = await self._get_active_webhooks(event_tenant_uuid) + else: + # Global event (no tenant_id) -> all active webhooks + webhooks = await self._get_all_active_webhooks() + + for webhook in webhooks: + if webhook.matches_event(event_type): + await self._dispatcher.dispatch(webhook, raw, event_type) + + async def _get_active_webhooks(self, tenant_id: UUID) -> list: + # Try cache first + cached = await self._cache.get_active_webhooks(tenant_id) + if cached is not None: + return cached + # Miss -> query DB + repo = PostgresWebhookRepository(self._database) + webhooks = await repo.find_active_for_tenant(tenant_id) + await self._cache.set_active_webhooks(tenant_id, webhooks) + return webhooks + + async def _get_all_active_webhooks(self) -> list: + # For global events, get all active webhooks (cache key: _global) + cached = await self._cache.get_active_webhooks(None) + if cached is not None: + return cached + repo = PostgresWebhookRepository(self._database) + webhooks_list, _ = await repo.find_all(is_active=True, limit=1000) + await self._cache.set_active_webhooks(None, webhooks_list) + return webhooks_list + + async def _send_to_dlq(self, msg: object) -> None: + if self._dlq_producer: + await self._dlq_producer.send_and_wait( + self._dlq_topic, + value=msg.value, # type: ignore[attr-defined] + key=msg.key, # type: 
ignore[attr-defined] + ) + + +class RetryWorker: + def __init__( + self, + retry_manager: "RetryManager", # noqa: F821 + database: Database, + dispatcher: WebhookDispatcher, + cache: WebhookCache, + poll_interval: float = 5.0, + ) -> None: + from webhook.infrastructure.retry_manager import RetryManager # noqa: F811 + + self._retry_manager: RetryManager = retry_manager + self._database = database + self._dispatcher = dispatcher + self._cache = cache + self._poll_interval = poll_interval + self._running = True + + async def run(self) -> None: + while self._running: + try: + items = await self._retry_manager.get_due_retries() + for item in items: + repo = PostgresWebhookRepository(self._database) + webhook = await repo.find_by_id(UUID(item.webhook_id)) + if webhook and webhook.is_active: + await self._dispatcher.dispatch(webhook, item.event_payload, item.event_type, item.attempt) + else: + logger.info("Skipping retry for deleted/inactive webhook %s", item.webhook_id) + except Exception: + logger.exception("Error processing retries") + await asyncio.sleep(self._poll_interval) + + async def stop(self) -> None: + self._running = False diff --git a/services/webhook/src/webhook/infrastructure/webhook_delivery.py b/services/webhook/src/webhook/infrastructure/webhook_delivery.py new file mode 100644 index 0000000..b06751d --- /dev/null +++ b/services/webhook/src/webhook/infrastructure/webhook_delivery.py @@ -0,0 +1,62 @@ +import hashlib +import hmac +import json +import logging +import time +from dataclasses import dataclass + +import httpx + +logger = logging.getLogger(__name__) + + +@dataclass +class DeliveryResult: + success: bool + status_code: int | None + response_body: str | None + error_message: str | None + duration_ms: int + + +class WebhookDeliveryService: + def __init__(self, timeout: float = 10.0) -> None: + self._timeout = timeout + self._client: httpx.AsyncClient | None = None + + async def start(self) -> None: + self._client = httpx.AsyncClient(timeout=self._timeout) + + async def stop(self) -> None: + if self._client: + await self._client.aclose() + + async def deliver(self, url: str, payload: dict, secret: str, event_type: str, webhook_id: str) -> DeliveryResult: + body = json.dumps(payload, default=str) + signature = hmac.new(secret.encode(), body.encode(), hashlib.sha256).hexdigest() + headers = { + "Content-Type": "application/json", + "X-Webhook-Event": event_type, + "X-Webhook-Signature": f"sha256={signature}", + "X-Webhook-ID": webhook_id, + } + start_time = time.monotonic() + try: + resp = await self._client.post(url, content=body, headers=headers) + duration = int((time.monotonic() - start_time) * 1000) + return DeliveryResult( + success=200 <= resp.status_code < 300, + status_code=resp.status_code, + response_body=resp.text[:4096] if resp.text else None, + error_message=None if resp.status_code < 300 else f"HTTP {resp.status_code}", + duration_ms=duration, + ) + except Exception as e: + duration = int((time.monotonic() - start_time) * 1000) + return DeliveryResult( + success=False, + status_code=None, + response_body=None, + error_message=str(e), + duration_ms=duration, + ) diff --git a/services/webhook/src/webhook/infrastructure/webhook_dispatcher.py b/services/webhook/src/webhook/infrastructure/webhook_dispatcher.py new file mode 100644 index 0000000..0286ae2 --- /dev/null +++ b/services/webhook/src/webhook/infrastructure/webhook_dispatcher.py @@ -0,0 +1,61 @@ +import json +import logging + +from webhook.domain.webhook import Webhook +from webhook.domain.webhook_log 
import WebhookEventLog +from webhook.infrastructure.database import Database +from webhook.infrastructure.repository import PostgresWebhookLogRepository +from webhook.infrastructure.retry_manager import RetryManager +from webhook.infrastructure.webhook_delivery import WebhookDeliveryService + +logger = logging.getLogger(__name__) + + +class WebhookDispatcher: + def __init__( + self, + database: Database, + delivery_service: WebhookDeliveryService, + retry_manager: RetryManager, + ) -> None: + self._database = database + self._delivery_service = delivery_service + self._retry_manager = retry_manager + + async def dispatch(self, webhook: Webhook, payload: dict, event_type: str, attempt: int = 1) -> None: + result = await self._delivery_service.deliver( + url=webhook.url, + payload=payload, + secret=webhook.secret, + event_type=event_type, + webhook_id=str(webhook.id), + ) + + log = WebhookEventLog( + webhook_id=webhook.id, + event_type=event_type, + event_id=payload.get("aggregate_id", ""), + request_url=webhook.url, + request_body=json.dumps(payload, default=str)[:8192], + response_status=result.status_code, + response_body=result.response_body, + error_message=result.error_message, + attempt=attempt, + duration_ms=result.duration_ms, + success=result.success, + ) + await self._save_log(log) + + if not result.success: + scheduled = await self._retry_manager.schedule_retry( + webhook_id=str(webhook.id), + event_payload=payload, + event_type=event_type, + attempt=attempt + 1, + ) + if not scheduled: + logger.warning("Max retries exceeded for webhook %s, event %s", webhook.id, event_type) + + async def _save_log(self, log: WebhookEventLog) -> None: + repo = PostgresWebhookLogRepository(self._database) + await repo.save(log) diff --git a/services/webhook/src/webhook/interface/__init__.py b/services/webhook/src/webhook/interface/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/services/webhook/src/webhook/interface/main.py b/services/webhook/src/webhook/interface/main.py new file mode 100644 index 0000000..10710aa --- /dev/null +++ b/services/webhook/src/webhook/interface/main.py @@ -0,0 +1,105 @@ +import asyncio +import contextlib +import logging +from collections.abc import AsyncGenerator +from contextlib import asynccontextmanager + +from fastapi import FastAPI +from shared.api.errors import domain_exception_handler +from shared.api.middleware import CorrelationIdMiddleware +from shared.domain.exceptions import DomainError + +from webhook.infrastructure.config import Settings +from webhook.infrastructure.database import Database +from webhook.infrastructure.retry_manager import RetryManager +from webhook.infrastructure.webhook_cache import WebhookCache +from webhook.infrastructure.webhook_consumer import RetryWorker, WebhookConsumerWorker +from webhook.infrastructure.webhook_delivery import WebhookDeliveryService +from webhook.infrastructure.webhook_dispatcher import WebhookDispatcher +from webhook.interface.routers.webhook_router import router as webhook_router + +logger = logging.getLogger(__name__) + + +@asynccontextmanager +async def lifespan(app: FastAPI) -> AsyncGenerator[None]: + settings = Settings() + database = Database(settings.database_url) + + delivery_service = WebhookDeliveryService(timeout=settings.webhook_delivery_timeout) + await delivery_service.start() + + retry_manager = RetryManager( + settings.redis_url, + max_retries=settings.webhook_max_retries, + backoffs=settings.webhook_retry_backoffs, + ) + await retry_manager.connect() + + webhook_cache = 
WebhookCache(settings.redis_url) + await webhook_cache.connect() + + dispatcher = WebhookDispatcher(database, delivery_service, retry_manager) + + consumer = WebhookConsumerWorker( + bootstrap_servers=settings.kafka_bootstrap_servers, + group_id=settings.kafka_group_id, + database=database, + dispatcher=dispatcher, + cache=webhook_cache, + dlq_topic=settings.kafka_dlq_topic, + ) + await consumer.start() + consumer_task = asyncio.create_task(consumer.consume()) + + retry_worker = RetryWorker( + retry_manager=retry_manager, + database=database, + dispatcher=dispatcher, + cache=webhook_cache, + ) + retry_task = asyncio.create_task(retry_worker.run()) + + app.state.database = database + app.state.settings = settings + app.state.delivery_service = delivery_service + app.state.webhook_cache = webhook_cache + + yield + + consumer_task.cancel() + with contextlib.suppress(asyncio.CancelledError): + await consumer_task + await consumer.stop() + + await retry_worker.stop() + retry_task.cancel() + with contextlib.suppress(asyncio.CancelledError): + await retry_task + + await delivery_service.stop() + await retry_manager.close() + await webhook_cache.close() + await database.close() + + +def create_app() -> FastAPI: + app = FastAPI( + title="CMDB Webhook Service", + version="1.0.0", + description="Webhook delivery service for CMDB domain events.", + openapi_tags=[{"name": "webhooks", "description": "Webhook registration and delivery logs"}], + lifespan=lifespan, + ) + app.add_middleware(CorrelationIdMiddleware) + app.add_exception_handler(DomainError, domain_exception_handler) + app.include_router(webhook_router, prefix="/api/v1") + + @app.get("/health", include_in_schema=False) + async def health() -> dict: + return {"status": "ok"} + + return app + + +app = create_app() diff --git a/services/webhook/src/webhook/interface/routers/__init__.py b/services/webhook/src/webhook/interface/routers/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/services/webhook/src/webhook/interface/routers/webhook_router.py b/services/webhook/src/webhook/interface/routers/webhook_router.py new file mode 100644 index 0000000..c5956e3 --- /dev/null +++ b/services/webhook/src/webhook/interface/routers/webhook_router.py @@ -0,0 +1,191 @@ +import json +from datetime import datetime +from uuid import UUID + +from fastapi import APIRouter, Depends, Request, status +from shared.api.pagination import OffsetParams +from shared.cqrs.bus import CommandBus, QueryBus + +from webhook.application.command_handlers import CreateWebhookHandler, DeleteWebhookHandler, UpdateWebhookHandler +from webhook.application.commands import CreateWebhookCommand, DeleteWebhookCommand, UpdateWebhookCommand +from webhook.application.queries import GetWebhookQuery, ListWebhookLogsQuery, ListWebhooksQuery +from webhook.application.query_handlers import GetWebhookHandler, ListWebhookLogsHandler, ListWebhooksHandler +from webhook.domain.webhook_log import WebhookEventLog +from webhook.infrastructure.repository import PostgresWebhookLogRepository, PostgresWebhookRepository +from webhook.interface.schemas import ( + CreateWebhookRequest, + UpdateWebhookRequest, + WebhookListResponse, + WebhookLogListResponse, + WebhookLogResponse, + WebhookResponse, +) + +router = APIRouter(prefix="/webhooks", tags=["webhooks"]) + + +def _get_command_bus(request: Request) -> CommandBus: + database = request.app.state.database + repo = PostgresWebhookRepository(database) + cache = getattr(request.app.state, "webhook_cache", None) + bus = CommandBus() + 
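+    # Buses are rebuilt on every request; handlers capture only the repo and
+    # cache references, so this wiring is cheap and holds no cross-request state.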
bus.register(CreateWebhookCommand, CreateWebhookHandler(repo, cache)) + bus.register(UpdateWebhookCommand, UpdateWebhookHandler(repo, cache)) + bus.register(DeleteWebhookCommand, DeleteWebhookHandler(repo, cache)) + return bus + + +def _get_query_bus(request: Request) -> QueryBus: + database = request.app.state.database + repo = PostgresWebhookRepository(database) + log_repo = PostgresWebhookLogRepository(database) + bus = QueryBus() + bus.register(GetWebhookQuery, GetWebhookHandler(repo)) + bus.register(ListWebhooksQuery, ListWebhooksHandler(repo)) + bus.register(ListWebhookLogsQuery, ListWebhookLogsHandler(log_repo)) + return bus + + +@router.post( + "", + status_code=status.HTTP_201_CREATED, + response_model=WebhookResponse, +) +async def create_webhook( + body: CreateWebhookRequest, + command_bus: CommandBus = Depends(_get_command_bus), # noqa: B008 + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> WebhookResponse: + webhook_id = await command_bus.dispatch(CreateWebhookCommand(**body.model_dump())) + dto = await query_bus.dispatch(GetWebhookQuery(webhook_id=webhook_id)) + return WebhookResponse(**dto.model_dump(exclude={"secret"})) + + +@router.get("", response_model=WebhookListResponse) +async def list_webhooks( + params: OffsetParams = Depends(), # noqa: B008 + is_active: bool | None = None, + tenant_id: UUID | None = None, + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> WebhookListResponse: + items, total = await query_bus.dispatch( + ListWebhooksQuery( + offset=params.offset, + limit=params.limit, + is_active=is_active, + tenant_id=tenant_id, + ) + ) + return WebhookListResponse( + items=[WebhookResponse(**i.model_dump(exclude={"secret"})) for i in items], + total=total, + offset=params.offset, + limit=params.limit, + ) + + +@router.get("/{webhook_id}", response_model=WebhookResponse) +async def get_webhook( + webhook_id: UUID, + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> WebhookResponse: + dto = await query_bus.dispatch(GetWebhookQuery(webhook_id=webhook_id)) + return WebhookResponse(**dto.model_dump(exclude={"secret"})) + + +@router.patch("/{webhook_id}", response_model=WebhookResponse) +async def update_webhook( + webhook_id: UUID, + body: UpdateWebhookRequest, + command_bus: CommandBus = Depends(_get_command_bus), # noqa: B008 + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> WebhookResponse: + await command_bus.dispatch(UpdateWebhookCommand(webhook_id=webhook_id, **body.model_dump(exclude_unset=True))) + dto = await query_bus.dispatch(GetWebhookQuery(webhook_id=webhook_id)) + return WebhookResponse(**dto.model_dump(exclude={"secret"})) + + +@router.delete("/{webhook_id}", status_code=status.HTTP_204_NO_CONTENT) +async def delete_webhook( + webhook_id: UUID, + command_bus: CommandBus = Depends(_get_command_bus), # noqa: B008 +) -> None: + await command_bus.dispatch(DeleteWebhookCommand(webhook_id=webhook_id)) + + +@router.get("/{webhook_id}/logs", response_model=WebhookLogListResponse) +async def list_webhook_logs( + webhook_id: UUID, + params: OffsetParams = Depends(), # noqa: B008 + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> WebhookLogListResponse: + items, total = await query_bus.dispatch( + ListWebhookLogsQuery( + webhook_id=webhook_id, + offset=params.offset, + limit=params.limit, + ) + ) + return WebhookLogListResponse( + items=[WebhookLogResponse(**i.model_dump()) for i in items], + total=total, + offset=params.offset, + limit=params.limit, + ) + + 
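+# Illustrative sketch (not part of this router): how a receiver could verify
+# the X-Webhook-Signature header that WebhookDeliveryService attaches. Only the
+# "sha256=<hexdigest>" format and HMAC-SHA256 over the raw body are taken from
+# webhook_delivery.py; the helper name is hypothetical.
+#
+#     import hashlib, hmac
+#
+#     def verify_webhook_signature(raw_body: bytes, secret: str, header: str) -> bool:
+#         expected = hmac.new(secret.encode(), raw_body, hashlib.sha256).hexdigest()
+#         # compare_digest avoids leaking timing information during comparison
+#         return hmac.compare_digest(header, f"sha256={expected}")
+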
+@router.post("/{webhook_id}/test", response_model=WebhookLogResponse) +async def test_webhook( + webhook_id: UUID, + request: Request, + query_bus: QueryBus = Depends(_get_query_bus), # noqa: B008 +) -> WebhookLogResponse: + dto = await query_bus.dispatch(GetWebhookQuery(webhook_id=webhook_id)) + payload = { + "event_type": "webhook.test", + "webhook_id": str(webhook_id), + "timestamp": datetime.now().isoformat(), + "message": "Test event from CMDB Webhook Service", + } + + delivery_service = request.app.state.delivery_service + result = await delivery_service.deliver( + url=dto.url, + payload=payload, + secret=dto.secret, + event_type="webhook.test", + webhook_id=str(webhook_id), + ) + + # Log the test delivery + log = WebhookEventLog( + webhook_id=webhook_id, + event_type="webhook.test", + event_id="test", + request_url=dto.url, + request_body=json.dumps(payload), + response_status=result.status_code, + response_body=result.response_body, + error_message=result.error_message, + attempt=1, + duration_ms=result.duration_ms, + success=result.success, + ) + + database = request.app.state.database + log_repo = PostgresWebhookLogRepository(database) + await log_repo.save(log) + + return WebhookLogResponse( + id=log.id, + webhook_id=log.webhook_id, + event_type=log.event_type, + event_id=log.event_id, + request_url=log.request_url, + response_status=log.response_status, + error_message=log.error_message, + attempt=log.attempt, + duration_ms=log.duration_ms, + success=log.success, + created_at=log.created_at, + ) diff --git a/services/webhook/src/webhook/interface/schemas.py b/services/webhook/src/webhook/interface/schemas.py new file mode 100644 index 0000000..ce35c2e --- /dev/null +++ b/services/webhook/src/webhook/interface/schemas.py @@ -0,0 +1,63 @@ +from datetime import datetime +from uuid import UUID + +from pydantic import BaseModel + + +class CreateWebhookRequest(BaseModel): + name: str + url: str + secret: str + event_types: list[str] + tenant_id: UUID | None = None + description: str = "" + + +class UpdateWebhookRequest(BaseModel): + name: str | None = None + url: str | None = None + secret: str | None = None + event_types: list[str] | None = None + is_active: bool | None = None + description: str | None = None + + +class WebhookResponse(BaseModel): + id: UUID + name: str + url: str + event_types: list[str] + is_active: bool + tenant_id: UUID | None + description: str + created_at: datetime + updated_at: datetime + # Note: secret intentionally excluded from response + + +class WebhookListResponse(BaseModel): + items: list[WebhookResponse] + total: int + offset: int + limit: int + + +class WebhookLogResponse(BaseModel): + id: UUID + webhook_id: UUID + event_type: str + event_id: str + request_url: str + response_status: int | None + error_message: str | None + attempt: int + duration_ms: int | None + success: bool + created_at: datetime + + +class WebhookLogListResponse(BaseModel): + items: list[WebhookLogResponse] + total: int + offset: int + limit: int diff --git a/services/webhook/tests/__init__.py b/services/webhook/tests/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/services/webhook/tests/conftest.py b/services/webhook/tests/conftest.py new file mode 100644 index 0000000..e69de29 diff --git a/services/webhook/tests/test_domain/__init__.py b/services/webhook/tests/test_domain/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/services/webhook/tests/test_domain/test_webhook.py b/services/webhook/tests/test_domain/test_webhook.py new file 
mode 100644 index 0000000..e50666d --- /dev/null +++ b/services/webhook/tests/test_domain/test_webhook.py @@ -0,0 +1,67 @@ +from uuid import uuid4 + +from webhook.domain.webhook import Webhook + + +class TestWebhookMatchesEvent: + def test_exact_match(self): + wh = Webhook( + name="test", url="http://example.com", secret="s", event_types=["ipam.domain.events.PrefixCreated"] + ) + assert wh.matches_event("ipam.domain.events.PrefixCreated") is True + + def test_no_match(self): + wh = Webhook( + name="test", url="http://example.com", secret="s", event_types=["ipam.domain.events.PrefixCreated"] + ) + assert wh.matches_event("ipam.domain.events.PrefixDeleted") is False + + def test_wildcard_matches_all(self): + wh = Webhook(name="test", url="http://example.com", secret="s", event_types=["*"]) + assert wh.matches_event("anything.here") is True + + def test_inactive_does_not_match(self): + wh = Webhook(name="test", url="http://example.com", secret="s", event_types=["*"], is_active=False) + assert wh.matches_event("anything") is False + + def test_multiple_event_types(self): + wh = Webhook(name="test", url="http://example.com", secret="s", event_types=["a.Created", "a.Updated"]) + assert wh.matches_event("a.Created") is True + assert wh.matches_event("a.Updated") is True + assert wh.matches_event("a.Deleted") is False + + def test_empty_event_types_matches_nothing(self): + wh = Webhook(name="test", url="http://example.com", secret="s", event_types=[]) + assert wh.matches_event("anything") is False + + +class TestWebhookActivation: + def test_deactivate(self): + wh = Webhook(name="test", url="http://example.com", secret="s", event_types=["*"]) + wh.deactivate() + assert wh.is_active is False + + def test_activate(self): + wh = Webhook(name="test", url="http://example.com", secret="s", event_types=["*"], is_active=False) + wh.activate() + assert wh.is_active is True + + def test_deactivate_updates_timestamp(self): + wh = Webhook(name="test", url="http://example.com", secret="s", event_types=["*"]) + old_ts = wh.updated_at + wh.deactivate() + assert wh.updated_at >= old_ts + + +class TestWebhookCreation: + def test_default_values(self): + wh = Webhook(name="test", url="http://example.com", secret="s", event_types=["*"]) + assert wh.is_active is True + assert wh.tenant_id is None + assert wh.description == "" + assert wh.id is not None + + def test_with_tenant(self): + tid = uuid4() + wh = Webhook(name="test", url="http://example.com", secret="s", event_types=["*"], tenant_id=tid) + assert wh.tenant_id == tid diff --git a/services/webhook/tests/test_domain/test_webhook_log.py b/services/webhook/tests/test_domain/test_webhook_log.py new file mode 100644 index 0000000..a62c984 --- /dev/null +++ b/services/webhook/tests/test_domain/test_webhook_log.py @@ -0,0 +1,31 @@ +from uuid import uuid4 + +from webhook.domain.webhook_log import WebhookEventLog + + +class TestWebhookEventLog: + def test_create_log(self): + log = WebhookEventLog( + webhook_id=uuid4(), + event_type="test", + event_id="123", + request_url="http://example.com", + request_body="{}", + ) + assert log.success is False + assert log.attempt == 1 + assert log.id is not None + + def test_create_successful_log(self): + log = WebhookEventLog( + webhook_id=uuid4(), + event_type="test", + event_id="123", + request_url="http://example.com", + request_body="{}", + response_status=200, + success=True, + duration_ms=50, + ) + assert log.success is True + assert log.duration_ms == 50 diff --git a/services/webhook/tests/test_infrastructure/__init__.py 
b/services/webhook/tests/test_infrastructure/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/services/webhook/tests/test_infrastructure/test_event_matching.py b/services/webhook/tests/test_infrastructure/test_event_matching.py new file mode 100644 index 0000000..434d335 --- /dev/null +++ b/services/webhook/tests/test_infrastructure/test_event_matching.py @@ -0,0 +1,41 @@ +from uuid import uuid4 + +from webhook.domain.webhook import Webhook + + +class TestTenantEventMatching: + def _make_webhook(self, tenant_id=None, event_types=None, is_active=True): + return Webhook( + name="test", + url="http://example.com", + secret="s", + event_types=event_types or ["*"], + tenant_id=tenant_id, + is_active=is_active, + ) + + def test_tenant_webhook_matches_same_tenant_event(self): + tid = uuid4() + wh = self._make_webhook(tenant_id=tid) + # Simulating: event has same tenant_id -> webhook for this tenant is selected + assert wh.matches_event("any.event") is True + + def test_inactive_webhook_never_matches(self): + wh = self._make_webhook(is_active=False) + assert wh.matches_event("any.event") is False + + def test_event_type_filter_with_tenant(self): + tid = uuid4() + wh = self._make_webhook(tenant_id=tid, event_types=["ipam.domain.events.PrefixCreated"]) + assert wh.matches_event("ipam.domain.events.PrefixCreated") is True + assert wh.matches_event("ipam.domain.events.PrefixDeleted") is False + + def test_wildcard_with_tenant(self): + tid = uuid4() + wh = self._make_webhook(tenant_id=tid, event_types=["*"]) + assert wh.matches_event("any.event.type") is True + + def test_global_webhook_no_tenant(self): + wh = self._make_webhook(tenant_id=None, event_types=["*"]) + assert wh.matches_event("any.event") is True + assert wh.tenant_id is None diff --git a/services/webhook/tests/test_infrastructure/test_retry_manager.py b/services/webhook/tests/test_infrastructure/test_retry_manager.py new file mode 100644 index 0000000..f5f5743 --- /dev/null +++ b/services/webhook/tests/test_infrastructure/test_retry_manager.py @@ -0,0 +1,49 @@ +import time + +import fakeredis.aioredis +import pytest +import pytest_asyncio +from webhook.infrastructure.retry_manager import RetryManager + + +@pytest_asyncio.fixture +async def retry_manager(): + mgr = RetryManager("redis://localhost", max_retries=3, backoffs=[1, 2, 5]) + # Override Redis with fakeredis + mgr._redis = fakeredis.aioredis.FakeRedis() + yield mgr + await mgr._redis.aclose() + + +class TestRetryManager: + @pytest.mark.asyncio + async def test_schedule_and_retrieve(self, retry_manager): + scheduled = await retry_manager.schedule_retry("wh-1", {"test": True}, "ev.type", 1) + assert scheduled is True + # Set score to past to make it immediately due + items = await retry_manager._redis.zrangebyscore(RetryManager.RETRY_KEY, "-inf", "+inf") + assert len(items) == 1 + # Move the score to the past + await retry_manager._redis.zadd(RetryManager.RETRY_KEY, {items[0]: time.time() - 10}) + due = await retry_manager.get_due_retries() + assert len(due) == 1 + assert due[0].webhook_id == "wh-1" + assert due[0].attempt == 1 + + @pytest.mark.asyncio + async def test_max_retries_exceeded(self, retry_manager): + # max_retries=3, so attempt 4 should fail + result = await retry_manager.schedule_retry("wh-1", {}, "ev", 4) + assert result is False + + @pytest.mark.asyncio + async def test_empty_queue(self, retry_manager): + due = await retry_manager.get_due_retries() + assert due == [] + + @pytest.mark.asyncio + async def test_pending_count(self, retry_manager): + 
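+        # Both entries below get future due times from the backoff schedule;
+        # pending_count() appears to count all queued retries, due or not.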
await retry_manager.schedule_retry("wh-1", {}, "ev", 1) + await retry_manager.schedule_retry("wh-2", {}, "ev", 2) + count = await retry_manager.pending_count() + assert count == 2 diff --git a/services/webhook/tests/test_infrastructure/test_webhook_delivery.py b/services/webhook/tests/test_infrastructure/test_webhook_delivery.py new file mode 100644 index 0000000..4af6a23 --- /dev/null +++ b/services/webhook/tests/test_infrastructure/test_webhook_delivery.py @@ -0,0 +1,57 @@ +import hashlib +import hmac +import json + +import httpx +import pytest +import pytest_asyncio +from webhook.infrastructure.webhook_delivery import WebhookDeliveryService + + +@pytest_asyncio.fixture +async def delivery_service(): + svc = WebhookDeliveryService(timeout=5.0) + await svc.start() + yield svc + await svc.stop() + + +class TestWebhookDeliveryService: + @pytest.mark.asyncio + async def test_successful_delivery(self, delivery_service, httpx_mock): + httpx_mock.add_response(status_code=200, text="OK") + result = await delivery_service.deliver( + "http://example.com/hook", {"key": "value"}, "secret123", "test.event", "wh-1" + ) + assert result.success is True + assert result.status_code == 200 + assert result.duration_ms >= 0 + + @pytest.mark.asyncio + async def test_failed_delivery(self, delivery_service, httpx_mock): + httpx_mock.add_response(status_code=500, text="Error") + result = await delivery_service.deliver( + "http://example.com/hook", {"key": "value"}, "secret123", "test.event", "wh-1" + ) + assert result.success is False + assert result.status_code == 500 + + @pytest.mark.asyncio + async def test_timeout_delivery(self, delivery_service, httpx_mock): + httpx_mock.add_exception(httpx.ReadTimeout("timeout")) + result = await delivery_service.deliver("http://example.com/hook", {}, "secret", "test", "wh-1") + assert result.success is False + assert result.error_message is not None + + @pytest.mark.asyncio + async def test_hmac_signature(self, delivery_service, httpx_mock): + httpx_mock.add_response(status_code=200) + payload = {"test": "data"} + secret = "my-secret" + await delivery_service.deliver("http://example.com/hook", payload, secret, "test.event", "wh-1") + request = httpx_mock.get_request() + body = json.dumps(payload, default=str) + expected_sig = hmac.new(secret.encode(), body.encode(), hashlib.sha256).hexdigest() + assert request.headers["X-Webhook-Signature"] == f"sha256={expected_sig}" + assert request.headers["X-Webhook-Event"] == "test.event" + assert request.headers["X-Webhook-ID"] == "wh-1" diff --git a/services/webhook/tests/test_integration/__init__.py b/services/webhook/tests/test_integration/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/services/webhook/tests/test_integration/test_webhook_e2e.py b/services/webhook/tests/test_integration/test_webhook_e2e.py new file mode 100644 index 0000000..d174024 --- /dev/null +++ b/services/webhook/tests/test_integration/test_webhook_e2e.py @@ -0,0 +1,209 @@ +"""Webhook E2E tests: event matching + delivery via httpx_mock. + +No Docker required — these run as regular (non-integration) tests. 
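+
+Assumes pytest-asyncio's auto mode (the async tests below carry no explicit
+@pytest.mark.asyncio marker) and the httpx_mock fixture from pytest-httpx.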
+""" + +from __future__ import annotations + +import hashlib +import hmac +import json +from uuid import uuid4 + +import pytest_asyncio +from webhook.domain.webhook import Webhook +from webhook.infrastructure.webhook_delivery import WebhookDeliveryService + +# --------------------------------------------------------------------------- +# TestWebhookEventMatching +# --------------------------------------------------------------------------- + + +class TestWebhookEventMatching: + """Webhook.matches_event() returns correct results based on event_types and is_active.""" + + def test_matching_event_returns_true(self) -> None: + webhook = Webhook( + name="test-hook", + url="http://example.com/hook", + secret="s3cret", + event_types=["PrefixCreated"], + ) + assert webhook.matches_event("PrefixCreated") is True + + def test_non_matching_event_returns_false(self) -> None: + webhook = Webhook( + name="test-hook", + url="http://example.com/hook", + secret="s3cret", + event_types=["PrefixCreated"], + ) + assert webhook.matches_event("VLANCreated") is False + + def test_inactive_webhook_does_not_match(self) -> None: + webhook = Webhook( + name="test-hook", + url="http://example.com/hook", + secret="s3cret", + event_types=["PrefixCreated"], + is_active=False, + ) + assert webhook.matches_event("PrefixCreated") is False + + def test_wildcard_matches_any_event(self) -> None: + webhook = Webhook( + name="catch-all", + url="http://example.com/hook", + secret="s3cret", + event_types=["*"], + ) + assert webhook.matches_event("PrefixCreated") is True + assert webhook.matches_event("VLANDeleted") is True + + def test_deactivated_webhook_does_not_match(self) -> None: + webhook = Webhook( + name="test-hook", + url="http://example.com/hook", + secret="s3cret", + event_types=["PrefixCreated"], + ) + webhook.deactivate() + assert webhook.matches_event("PrefixCreated") is False + + def test_reactivated_webhook_matches_again(self) -> None: + webhook = Webhook( + name="test-hook", + url="http://example.com/hook", + secret="s3cret", + event_types=["PrefixCreated"], + ) + webhook.deactivate() + webhook.activate() + assert webhook.matches_event("PrefixCreated") is True + + def test_multiple_event_types(self) -> None: + webhook = Webhook( + name="multi-hook", + url="http://example.com/hook", + secret="s3cret", + event_types=["PrefixCreated", "PrefixDeleted"], + ) + assert webhook.matches_event("PrefixCreated") is True + assert webhook.matches_event("PrefixDeleted") is True + assert webhook.matches_event("PrefixUpdated") is False + + +# --------------------------------------------------------------------------- +# TestWebhookDeliveryE2E +# --------------------------------------------------------------------------- + + +@pytest_asyncio.fixture +async def delivery_service(): + svc = WebhookDeliveryService(timeout=5.0) + await svc.start() + yield svc + await svc.stop() + + +class TestWebhookDeliveryE2E: + """Create webhook -> deliver event -> verify HTTP POST sent with correct headers.""" + + async def test_deliver_event_sends_http_post(self, delivery_service, httpx_mock) -> None: + httpx_mock.add_response(status_code=200, text="OK") + + webhook = Webhook( + name="delivery-test", + url="http://example.com/hook", + secret="my-secret", + event_types=["PrefixCreated"], + ) + + payload = { + "event_type": "PrefixCreated", + "aggregate_id": str(uuid4()), + "network": "10.0.0.0/8", + } + + result = await delivery_service.deliver( + url=webhook.url, + payload=payload, + secret=webhook.secret, + event_type="PrefixCreated", + 
webhook_id=str(webhook.id), + ) + + assert result.success is True + assert result.status_code == 200 + + # Verify that an HTTP request was actually sent + request = httpx_mock.get_request() + assert request is not None + assert request.method == "POST" + + async def test_deliver_event_has_hmac_signature(self, delivery_service, httpx_mock) -> None: + httpx_mock.add_response(status_code=200) + + webhook = Webhook( + name="sig-test", + url="http://example.com/hook", + secret="hmac-secret-key", + event_types=["PrefixCreated"], + ) + + payload = {"event_type": "PrefixCreated", "network": "192.168.1.0/24"} + + await delivery_service.deliver( + url=webhook.url, + payload=payload, + secret=webhook.secret, + event_type="PrefixCreated", + webhook_id=str(webhook.id), + ) + + request = httpx_mock.get_request() + assert "X-Webhook-Signature" in request.headers + + # Verify the HMAC signature is correct + body = json.dumps(payload, default=str) + expected_sig = hmac.new(webhook.secret.encode(), body.encode(), hashlib.sha256).hexdigest() + assert request.headers["X-Webhook-Signature"] == f"sha256={expected_sig}" + + async def test_deliver_event_contains_event_data(self, delivery_service, httpx_mock) -> None: + httpx_mock.add_response(status_code=200) + + webhook = Webhook( + name="payload-test", + url="http://example.com/hook", + secret="test-secret", + event_types=["PrefixCreated"], + ) + + agg_id = str(uuid4()) + payload = { + "event_type": "PrefixCreated", + "aggregate_id": agg_id, + "network": "172.16.0.0/12", + "status": "active", + } + + await delivery_service.deliver( + url=webhook.url, + payload=payload, + secret=webhook.secret, + event_type="PrefixCreated", + webhook_id=str(webhook.id), + ) + + request = httpx_mock.get_request() + + # Verify payload was sent as JSON body + sent_body = json.loads(request.content.decode()) + assert sent_body["event_type"] == "PrefixCreated" + assert sent_body["aggregate_id"] == agg_id + assert sent_body["network"] == "172.16.0.0/12" + + # Verify headers contain event metadata + assert request.headers["X-Webhook-Event"] == "PrefixCreated" + assert request.headers["X-Webhook-ID"] == str(webhook.id) + assert request.headers["Content-Type"] == "application/json" diff --git a/shared/pyproject.toml b/shared/pyproject.toml new file mode 100644 index 0000000..6ac22ae --- /dev/null +++ b/shared/pyproject.toml @@ -0,0 +1,20 @@ +[project] +name = "cmdb-shared" +version = "0.1.0" +description = "CMDB Shared Library" +requires-python = ">=3.13" +dependencies = [ + "pydantic>=2.0", + "sqlalchemy>=2.0", + "alembic", + "aiokafka", + "redis", + "fastapi>=0.115", +] + +[build-system] +requires = ["hatchling"] +build-backend = "hatchling.build" + +[tool.hatch.build.targets.wheel] +packages = ["src/shared"] diff --git a/shared/src/shared/__init__.py b/shared/src/shared/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/shared/src/shared/api/__init__.py b/shared/src/shared/api/__init__.py new file mode 100644 index 0000000..6190266 --- /dev/null +++ b/shared/src/shared/api/__init__.py @@ -0,0 +1,33 @@ +from shared.api.errors import ProblemDetail, domain_exception_handler +from shared.api.filtering import FilterOperator, FilterParam, apply_filters +from shared.api.middleware import CorrelationIdMiddleware, TenantMiddleware, UserMiddleware +from shared.api.openapi import customize_openapi +from shared.api.pagination import ( + CursorPage, + CursorParams, + OffsetPage, + OffsetParams, + decode_cursor, + encode_cursor, +) +from shared.api.sorting import SortParam, 
apply_sorting + +__all__ = [ + "CorrelationIdMiddleware", + "CursorPage", + "CursorParams", + "FilterOperator", + "FilterParam", + "OffsetPage", + "OffsetParams", + "ProblemDetail", + "SortParam", + "TenantMiddleware", + "UserMiddleware", + "apply_filters", + "apply_sorting", + "customize_openapi", + "decode_cursor", + "domain_exception_handler", + "encode_cursor", +] diff --git a/shared/src/shared/api/errors.py b/shared/src/shared/api/errors.py new file mode 100644 index 0000000..fbc9311 --- /dev/null +++ b/shared/src/shared/api/errors.py @@ -0,0 +1,50 @@ +from typing import Any + +from fastapi import Request +from fastapi.responses import JSONResponse +from pydantic import BaseModel + +from shared.domain.exceptions import ( + AuthorizationError, + BusinessRuleViolationError, + ConflictError, + DomainError, + EntityNotFoundError, + InfrastructureError, + ValidationError, +) + + +class ProblemDetail(BaseModel): + type: str = "about:blank" + title: str + status: int + detail: str + instance: str | None = None + extensions: dict[str, Any] = {} + + +EXCEPTION_STATUS_MAP: dict[type[DomainError], int] = { + EntityNotFoundError: 404, + BusinessRuleViolationError: 422, + AuthorizationError: 403, + ConflictError: 409, + ValidationError: 400, + InfrastructureError: 503, +} + + +def domain_exception_handler(request: Request, exc: DomainError) -> JSONResponse: + status = EXCEPTION_STATUS_MAP.get(type(exc), 500) + problem = ProblemDetail( + type=f"urn:cmdb:error:{exc.code}", + title=type(exc).__name__, + status=status, + detail=exc.message, + instance=str(request.url), + extensions=exc.details, + ) + return JSONResponse( + status_code=status, + content=problem.model_dump(exclude_none=True), + ) diff --git a/shared/src/shared/api/filtering.py b/shared/src/shared/api/filtering.py new file mode 100644 index 0000000..b1fc978 --- /dev/null +++ b/shared/src/shared/api/filtering.py @@ -0,0 +1,53 @@ +from enum import StrEnum +from typing import Any + +from pydantic import BaseModel +from sqlalchemy import Select + + +class FilterOperator(StrEnum): + EQ = "eq" + NEQ = "neq" + GT = "gt" + GTE = "gte" + LT = "lt" + LTE = "lte" + IN = "in" + CONTAINS = "contains" + STARTSWITH = "startswith" + ILIKE = "ilike" + + +class FilterParam(BaseModel): + field: str + operator: FilterOperator = FilterOperator.EQ + value: Any + + +_OPERATOR_MAP = { + FilterOperator.EQ: lambda col, val: col == val, + FilterOperator.NEQ: lambda col, val: col != val, + FilterOperator.GT: lambda col, val: col > val, + FilterOperator.GTE: lambda col, val: col >= val, + FilterOperator.LT: lambda col, val: col < val, + FilterOperator.LTE: lambda col, val: col <= val, + FilterOperator.IN: lambda col, val: col.in_(val), + FilterOperator.CONTAINS: lambda col, val: col.contains(val), + FilterOperator.STARTSWITH: lambda col, val: col.startswith(val), + FilterOperator.ILIKE: lambda col, val: col.ilike(f"%{val}%"), +} + + +def apply_filters( + query: Select, # type: ignore[type-arg] + model: Any, + filters: list[FilterParam], +) -> Select: # type: ignore[type-arg] + for f in filters: + column = getattr(model, f.field, None) + if column is None: + continue + op_fn = _OPERATOR_MAP.get(f.operator) + if op_fn: + query = query.where(op_fn(column, f.value)) + return query diff --git a/shared/src/shared/api/middleware.py b/shared/src/shared/api/middleware.py new file mode 100644 index 0000000..a6082c1 --- /dev/null +++ b/shared/src/shared/api/middleware.py @@ -0,0 +1,43 @@ +from uuid import uuid4 + +from fastapi import Request, Response +from fastapi.responses 
import JSONResponse +from starlette.middleware.base import BaseHTTPMiddleware, RequestResponseEndpoint + + +class TenantMiddleware(BaseHTTPMiddleware): + HEADER = "X-Tenant-ID" + + async def dispatch(self, request: Request, call_next: RequestResponseEndpoint) -> Response: + tenant_id = request.headers.get(self.HEADER) + if tenant_id is None: + return JSONResponse( + status_code=400, + content={ + "type": "urn:cmdb:error:MissingTenant", + "title": "Missing Tenant", + "status": 400, + "detail": f"Header '{self.HEADER}' is required", + }, + ) + request.state.tenant_id = tenant_id + return await call_next(request) + + +class UserMiddleware(BaseHTTPMiddleware): + HEADER = "X-User-ID" + + async def dispatch(self, request: Request, call_next: RequestResponseEndpoint) -> Response: + request.state.user_id = request.headers.get(self.HEADER) + return await call_next(request) + + +class CorrelationIdMiddleware(BaseHTTPMiddleware): + HEADER = "X-Correlation-ID" + + async def dispatch(self, request: Request, call_next: RequestResponseEndpoint) -> Response: + correlation_id = request.headers.get(self.HEADER) or str(uuid4()) + request.state.correlation_id = correlation_id + response = await call_next(request) + response.headers[self.HEADER] = correlation_id + return response diff --git a/shared/src/shared/api/openapi.py b/shared/src/shared/api/openapi.py new file mode 100644 index 0000000..9e58689 --- /dev/null +++ b/shared/src/shared/api/openapi.py @@ -0,0 +1,23 @@ +from fastapi import FastAPI +from fastapi.openapi.utils import get_openapi + + +def customize_openapi( + app: FastAPI, + title: str, + version: str, + description: str = "", +) -> None: + def custom_schema() -> dict: + if app.openapi_schema: + return app.openapi_schema + schema = get_openapi( + title=title, + version=version, + description=description, + routes=app.routes, + ) + app.openapi_schema = schema + return schema + + app.openapi = custom_schema # type: ignore[method-assign] diff --git a/shared/src/shared/api/pagination.py b/shared/src/shared/api/pagination.py new file mode 100644 index 0000000..06391b1 --- /dev/null +++ b/shared/src/shared/api/pagination.py @@ -0,0 +1,41 @@ +import base64 +import json +from typing import Any, Generic, TypeVar + +from pydantic import BaseModel, Field + +T = TypeVar("T") + + +class OffsetParams(BaseModel): + offset: int = Field(0, ge=0) + limit: int = Field(50, ge=1, le=200) + + +class CursorParams(BaseModel): + cursor: str | None = None + limit: int = Field(50, ge=1, le=200) + + +class OffsetPage(BaseModel, Generic[T]): # noqa: UP046 + items: list[T] + total: int + offset: int + limit: int + + +class CursorPage(BaseModel, Generic[T]): # noqa: UP046 + items: list[T] + next_cursor: str | None = None + previous_cursor: str | None = None + limit: int + + +def encode_cursor(values: dict[str, Any]) -> str: + payload = json.dumps(values, default=str).encode("utf-8") + return base64.urlsafe_b64encode(payload).decode("ascii") + + +def decode_cursor(cursor: str) -> dict[str, Any]: + payload = base64.urlsafe_b64decode(cursor.encode("ascii")) + return json.loads(payload) diff --git a/shared/src/shared/api/sorting.py b/shared/src/shared/api/sorting.py new file mode 100644 index 0000000..2ef21a3 --- /dev/null +++ b/shared/src/shared/api/sorting.py @@ -0,0 +1,23 @@ +from typing import Any, Literal + +from pydantic import BaseModel +from sqlalchemy import Select, asc, desc + + +class SortParam(BaseModel): + field: str + direction: Literal["asc", "desc"] = "asc" + + +def apply_sorting( + query: Select, # type: 
ignore[type-arg] + model: Any, + sort_params: list[SortParam], +) -> Select: # type: ignore[type-arg] + for param in sort_params: + column = getattr(model, param.field, None) + if column is None: + continue + order_fn = asc if param.direction == "asc" else desc + query = query.order_by(order_fn(column)) + return query diff --git a/shared/src/shared/cqrs/__init__.py b/shared/src/shared/cqrs/__init__.py new file mode 100644 index 0000000..b02d57a --- /dev/null +++ b/shared/src/shared/cqrs/__init__.py @@ -0,0 +1,12 @@ +from shared.cqrs.bus import CommandBus, QueryBus +from shared.cqrs.command import Command, CommandHandler +from shared.cqrs.query import Query, QueryHandler + +__all__ = [ + "Command", + "CommandBus", + "CommandHandler", + "Query", + "QueryBus", + "QueryHandler", +] diff --git a/shared/src/shared/cqrs/bus.py b/shared/src/shared/cqrs/bus.py new file mode 100644 index 0000000..6a17256 --- /dev/null +++ b/shared/src/shared/cqrs/bus.py @@ -0,0 +1,36 @@ +from typing import Any + +from shared.cqrs.command import Command, CommandHandler +from shared.cqrs.query import Query, QueryHandler + + +class CommandBus: + def __init__(self) -> None: + self._handlers: dict[type[Command], CommandHandler[Any]] = {} + + def register(self, command_type: type[Command], handler: CommandHandler[Any]) -> None: + if command_type in self._handlers: + raise ValueError(f"Handler already registered for {command_type.__name__}") + self._handlers[command_type] = handler + + async def dispatch(self, command: Command) -> Any: + handler = self._handlers.get(type(command)) + if handler is None: + raise ValueError(f"No handler registered for {type(command).__name__}") + return await handler.handle(command) + + +class QueryBus: + def __init__(self) -> None: + self._handlers: dict[type[Query], QueryHandler[Any]] = {} + + def register(self, query_type: type[Query], handler: QueryHandler[Any]) -> None: + if query_type in self._handlers: + raise ValueError(f"Handler already registered for {query_type.__name__}") + self._handlers[query_type] = handler + + async def dispatch(self, query: Query) -> Any: + handler = self._handlers.get(type(query)) + if handler is None: + raise ValueError(f"No handler registered for {type(query).__name__}") + return await handler.handle(query) diff --git a/shared/src/shared/cqrs/command.py b/shared/src/shared/cqrs/command.py new file mode 100644 index 0000000..30a738f --- /dev/null +++ b/shared/src/shared/cqrs/command.py @@ -0,0 +1,12 @@ +from abc import ABC, abstractmethod + +from pydantic import BaseModel, ConfigDict + + +class Command(BaseModel): + model_config = ConfigDict(frozen=True) + + +class CommandHandler[R](ABC): + @abstractmethod + async def handle(self, command: Command) -> R: ... diff --git a/shared/src/shared/cqrs/query.py b/shared/src/shared/cqrs/query.py new file mode 100644 index 0000000..c74a904 --- /dev/null +++ b/shared/src/shared/cqrs/query.py @@ -0,0 +1,12 @@ +from abc import ABC, abstractmethod + +from pydantic import BaseModel, ConfigDict + + +class Query(BaseModel): + model_config = ConfigDict(frozen=True) + + +class QueryHandler[R](ABC): + @abstractmethod + async def handle(self, query: Query) -> R: ... 
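A minimal sketch of how the CQRS pieces above compose. It uses only the Command,
CommandHandler, and CommandBus API shown in this diff; the EchoCommand and
EchoHandler names are invented for illustration.

```python
import asyncio

from shared.cqrs.bus import CommandBus
from shared.cqrs.command import Command, CommandHandler


class EchoCommand(Command):
    text: str


class EchoHandler(CommandHandler[str]):
    async def handle(self, command: Command) -> str:
        assert isinstance(command, EchoCommand)
        return command.text


async def main() -> None:
    bus = CommandBus()
    bus.register(EchoCommand, EchoHandler())
    # dispatch() looks up the handler by the command's concrete type
    print(await bus.dispatch(EchoCommand(text="hello")))


asyncio.run(main())
```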
diff --git a/shared/src/shared/db/__init__.py b/shared/src/shared/db/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/shared/src/shared/db/tenant_db_manager.py b/shared/src/shared/db/tenant_db_manager.py new file mode 100644 index 0000000..190bb7e --- /dev/null +++ b/shared/src/shared/db/tenant_db_manager.py @@ -0,0 +1,45 @@ +from sqlalchemy.ext.asyncio import ( + AsyncEngine, + AsyncSession, + async_sessionmaker, + create_async_engine, +) + + +class TenantDbManager: + def __init__(self) -> None: + self._engines: dict[str, AsyncEngine] = {} + self._session_factories: dict[str, async_sessionmaker[AsyncSession]] = {} + + def register(self, tenant_id: str, db_url: str) -> None: + if tenant_id not in self._engines: + engine = create_async_engine( + db_url, + echo=False, + pool_size=5, + max_overflow=10, + ) + self._engines[tenant_id] = engine + self._session_factories[tenant_id] = async_sessionmaker( + engine, + class_=AsyncSession, + expire_on_commit=False, + ) + + def get_session(self, tenant_id: str) -> AsyncSession: + factory = self._session_factories.get(tenant_id) + if factory is None: + raise KeyError(f"No database registered for tenant {tenant_id}") + return factory() + + def get_session_factory(self, tenant_id: str) -> async_sessionmaker[AsyncSession]: + factory = self._session_factories.get(tenant_id) + if factory is None: + raise KeyError(f"No database registered for tenant {tenant_id}") + return factory + + async def close_all(self) -> None: + for engine in self._engines.values(): + await engine.dispose() + self._engines.clear() + self._session_factories.clear() diff --git a/shared/src/shared/domain/__init__.py b/shared/src/shared/domain/__init__.py new file mode 100644 index 0000000..a8612a6 --- /dev/null +++ b/shared/src/shared/domain/__init__.py @@ -0,0 +1,52 @@ +from shared.domain.custom_field import ( + CustomFieldDefinition, + CustomFieldValidator, + FieldType, +) +from shared.domain.custom_field_mixin import CustomFieldMixin +from shared.domain.entity import Entity +from shared.domain.exceptions import ( + AuthorizationError, + BusinessRuleViolationError, + ConflictError, + DomainError, + EntityNotFoundError, + InfrastructureError, + ValidationError, +) +from shared.domain.filters import filter_by_custom_field, filter_by_tag_slugs +from shared.domain.models import ( + CustomFieldDefinitionModel, + SharedBase, + TagAssignmentModel, + TagModel, +) +from shared.domain.repository import Repository +from shared.domain.service import DomainService +from shared.domain.tag import Tag +from shared.domain.value_object import ValueObject + +__all__ = [ + "AuthorizationError", + "BusinessRuleViolationError", + "ConflictError", + "CustomFieldDefinition", + "CustomFieldDefinitionModel", + "CustomFieldMixin", + "CustomFieldValidator", + "DomainError", + "DomainService", + "Entity", + "EntityNotFoundError", + "FieldType", + "InfrastructureError", + "Repository", + "SharedBase", + "Tag", + "TagAssignmentModel", + "TagModel", + "ValidationError", + "ValueObject", + "filter_by_custom_field", + "filter_by_tag_slugs", +] diff --git a/shared/src/shared/domain/custom_field.py b/shared/src/shared/domain/custom_field.py new file mode 100644 index 0000000..943097d --- /dev/null +++ b/shared/src/shared/domain/custom_field.py @@ -0,0 +1,108 @@ +from enum import StrEnum +from typing import Any + +from pydantic import BaseModel, ConfigDict + +from shared.domain.exceptions import ValidationError + + +class FieldType(StrEnum): + TEXT = "text" + INTEGER = "integer" + FLOAT = "float" + 
BOOLEAN = "boolean" + DATE = "date" + DATETIME = "datetime" + SELECT = "select" + MULTISELECT = "multiselect" + URL = "url" + + +class CustomFieldDefinition(BaseModel): + model_config = ConfigDict(frozen=True) + + name: str + field_type: FieldType + required: bool = False + default: Any = None + choices: list[str] | None = None + validation_regex: str | None = None + description: str = "" + + +_TYPE_VALIDATORS: dict[FieldType, type] = { + FieldType.TEXT: str, + FieldType.INTEGER: int, + FieldType.FLOAT: float, + FieldType.BOOLEAN: bool, + FieldType.DATE: str, + FieldType.DATETIME: str, + FieldType.SELECT: str, + FieldType.MULTISELECT: list, + FieldType.URL: str, +} + + +class CustomFieldValidator: + @staticmethod + def validate(definition: CustomFieldDefinition, value: Any) -> Any: + if value is None: + if definition.required: + raise ValidationError( + f"Field '{definition.name}' is required", + details={"field": definition.name}, + ) + return definition.default + + expected_type = _TYPE_VALIDATORS.get(definition.field_type) + if expected_type and not isinstance(value, expected_type): + raise ValidationError( + f"Field '{definition.name}' expected {definition.field_type}, got {type(value).__name__}", + details={"field": definition.name, "expected": definition.field_type}, + ) + + if definition.choices is not None: + if definition.field_type == FieldType.MULTISELECT: + invalid = [v for v in value if v not in definition.choices] + if invalid: + raise ValidationError( + f"Field '{definition.name}' invalid choices: {invalid}", + details={"field": definition.name, "invalid": invalid}, + ) + elif value not in definition.choices: + raise ValidationError( + f"Field '{definition.name}' value '{value}' not in choices", + details={"field": definition.name, "value": value}, + ) + + if definition.validation_regex is not None: + import re + + if not re.match(definition.validation_regex, str(value)): + raise ValidationError( + f"Field '{definition.name}' failed regex validation", + details={"field": definition.name, "pattern": definition.validation_regex}, + ) + + return value + + @staticmethod + def validate_all( + definitions: list[CustomFieldDefinition], + values: dict[str, Any], + ) -> dict[str, Any]: + defs_by_name = {d.name: d for d in definitions} + result: dict[str, Any] = {} + + for name, definition in defs_by_name.items(): + value = values.get(name) + result[name] = CustomFieldValidator.validate(definition, value) + + unknown = set(values) - set(defs_by_name) + if unknown: + raise ValidationError( + f"Unknown custom fields: {unknown}", + details={"unknown_fields": list(unknown)}, + ) + + return result diff --git a/shared/src/shared/domain/custom_field_mixin.py b/shared/src/shared/domain/custom_field_mixin.py new file mode 100644 index 0000000..9473e5d --- /dev/null +++ b/shared/src/shared/domain/custom_field_mixin.py @@ -0,0 +1,18 @@ +from sqlalchemy import Index +from sqlalchemy.dialects.postgresql import JSONB +from sqlalchemy.orm import Mapped, declared_attr, mapped_column + + +class CustomFieldMixin: + custom_field_data: Mapped[dict] = mapped_column(JSONB, default=dict) + + @declared_attr + @classmethod + def __table_args__(cls) -> tuple: + return ( + Index( + f"ix_{cls.__tablename__}_custom_fields", # type: ignore[attr-defined] + "custom_field_data", + postgresql_using="gin", + ), + ) diff --git a/shared/src/shared/domain/entity.py b/shared/src/shared/domain/entity.py new file mode 100644 index 0000000..635b082 --- /dev/null +++ b/shared/src/shared/domain/entity.py @@ -0,0 +1,12 @@ +from 
datetime import datetime +from uuid import UUID, uuid4 + +from pydantic import BaseModel, ConfigDict, Field + + +class Entity(BaseModel): + model_config = ConfigDict(frozen=False) + + id: UUID = Field(default_factory=uuid4) + created_at: datetime = Field(default_factory=datetime.now) + updated_at: datetime = Field(default_factory=datetime.now) diff --git a/shared/src/shared/domain/exceptions.py b/shared/src/shared/domain/exceptions.py new file mode 100644 index 0000000..0421906 --- /dev/null +++ b/shared/src/shared/domain/exceptions.py @@ -0,0 +1,29 @@ +class DomainError(Exception): + def __init__( + self, + message: str, + code: str | None = None, + details: dict | None = None, + ) -> None: + self.message = message + self.code = code or self.__class__.__name__ + self.details = details or {} + super().__init__(message) + + +class EntityNotFoundError(DomainError): ... + + +class BusinessRuleViolationError(DomainError): ... + + +class AuthorizationError(DomainError): ... + + +class ConflictError(DomainError): ... + + +class ValidationError(DomainError): ... + + +class InfrastructureError(DomainError): ... diff --git a/shared/src/shared/domain/filters.py b/shared/src/shared/domain/filters.py new file mode 100644 index 0000000..51c54c9 --- /dev/null +++ b/shared/src/shared/domain/filters.py @@ -0,0 +1,31 @@ +from typing import Any + +from sqlalchemy import Select, select + +from shared.domain.models import TagAssignmentModel, TagModel + + +def filter_by_custom_field( + query: Select, # type: ignore[type-arg] + column: Any, + field_name: str, + value: Any, +) -> Select: # type: ignore[type-arg] + return query.where(column[field_name].astext == str(value)) + + +def filter_by_tag_slugs( + query: Select, # type: ignore[type-arg] + content_type: str, + object_id_column: Any, + tag_slugs: list[str], +) -> Select: # type: ignore[type-arg] + subquery = ( + select(TagAssignmentModel.object_id) + .join(TagModel, TagAssignmentModel.tag_id == TagModel.id) + .where( + TagAssignmentModel.content_type == content_type, + TagModel.slug.in_(tag_slugs), + ) + ) + return query.where(object_id_column.in_(subquery)) diff --git a/shared/src/shared/domain/models.py b/shared/src/shared/domain/models.py new file mode 100644 index 0000000..3b63403 --- /dev/null +++ b/shared/src/shared/domain/models.py @@ -0,0 +1,56 @@ +from datetime import datetime +from uuid import UUID + +from sqlalchemy import Boolean, ForeignKey, Index, String, Text +from sqlalchemy import DateTime as SADateTime +from sqlalchemy.dialects.postgresql import JSONB +from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column +from sqlalchemy.sql import func + + +class SharedBase(DeclarativeBase): + pass + + +class CustomFieldDefinitionModel(SharedBase): + __tablename__ = "custom_field_definitions" + + id: Mapped[UUID] = mapped_column(primary_key=True) + content_type: Mapped[str] = mapped_column(String(100), index=True) + name: Mapped[str] = mapped_column(String(100)) + field_type: Mapped[str] = mapped_column(String(20)) + required: Mapped[bool] = mapped_column(Boolean, default=False) + default_value: Mapped[dict | None] = mapped_column(JSONB, nullable=True) + choices: Mapped[list | None] = mapped_column(JSONB, nullable=True) + validation_regex: Mapped[str | None] = mapped_column(Text, nullable=True) + created_at: Mapped[datetime] = mapped_column(SADateTime(timezone=True), server_default=func.now()) + + +class TagModel(SharedBase): + __tablename__ = "tags" + + id: Mapped[UUID] = mapped_column(primary_key=True) + name: Mapped[str] = 
mapped_column(String(100), unique=True) + slug: Mapped[str] = mapped_column(String(100), unique=True) + color: Mapped[str] = mapped_column(String(7), default="#9e9e9e") + created_at: Mapped[datetime] = mapped_column(SADateTime(timezone=True), server_default=func.now()) + + +class TagAssignmentModel(SharedBase): + __tablename__ = "tag_assignments" + + id: Mapped[int] = mapped_column(primary_key=True, autoincrement=True) + tag_id: Mapped[UUID] = mapped_column(ForeignKey("tags.id")) + content_type: Mapped[str] = mapped_column(String(100)) + object_id: Mapped[UUID] + + __table_args__ = ( + Index("ix_tag_assignments_content_object", "content_type", "object_id"), + Index( + "ix_tag_assignments_unique", + "tag_id", + "content_type", + "object_id", + unique=True, + ), + ) diff --git a/shared/src/shared/domain/repository.py b/shared/src/shared/domain/repository.py new file mode 100644 index 0000000..9c49616 --- /dev/null +++ b/shared/src/shared/domain/repository.py @@ -0,0 +1,15 @@ +from abc import ABC, abstractmethod +from uuid import UUID + +from shared.domain.entity import Entity + + +class Repository[T: Entity](ABC): + @abstractmethod + async def find_by_id(self, entity_id: UUID) -> T | None: ... + + @abstractmethod + async def save(self, entity: T) -> T: ... + + @abstractmethod + async def delete(self, entity_id: UUID) -> None: ... diff --git a/shared/src/shared/domain/service.py b/shared/src/shared/domain/service.py new file mode 100644 index 0000000..3f4e3f9 --- /dev/null +++ b/shared/src/shared/domain/service.py @@ -0,0 +1,2 @@ +class DomainService: + pass diff --git a/shared/src/shared/domain/tag.py b/shared/src/shared/domain/tag.py new file mode 100644 index 0000000..3f1961f --- /dev/null +++ b/shared/src/shared/domain/tag.py @@ -0,0 +1,10 @@ +from uuid import UUID, uuid4 + +from pydantic import BaseModel, Field + + +class Tag(BaseModel): + id: UUID = Field(default_factory=uuid4) + name: str + slug: str + color: str = "#9e9e9e" diff --git a/shared/src/shared/domain/value_object.py b/shared/src/shared/domain/value_object.py new file mode 100644 index 0000000..705b930 --- /dev/null +++ b/shared/src/shared/domain/value_object.py @@ -0,0 +1,5 @@ +from pydantic import BaseModel, ConfigDict + + +class ValueObject(BaseModel): + model_config = ConfigDict(frozen=True) diff --git a/shared/src/shared/event/__init__.py b/shared/src/shared/event/__init__.py new file mode 100644 index 0000000..e4e5400 --- /dev/null +++ b/shared/src/shared/event/__init__.py @@ -0,0 +1,17 @@ +from shared.event.aggregate import AggregateRoot +from shared.event.domain_event import DomainEvent +from shared.event.models import EventStoreBase, StoredEvent, StoredSnapshot +from shared.event.pg_store import PostgresEventStore +from shared.event.snapshot import SnapshotStrategy +from shared.event.store import EventStore + +__all__ = [ + "AggregateRoot", + "DomainEvent", + "EventStore", + "EventStoreBase", + "PostgresEventStore", + "SnapshotStrategy", + "StoredEvent", + "StoredSnapshot", +] diff --git a/shared/src/shared/event/aggregate.py b/shared/src/shared/event/aggregate.py new file mode 100644 index 0000000..71e4aca --- /dev/null +++ b/shared/src/shared/event/aggregate.py @@ -0,0 +1,43 @@ +from typing import Any, Self +from uuid import UUID, uuid4 + +from shared.event.domain_event import DomainEvent + + +class AggregateRoot: + def __init__(self, aggregate_id: UUID | None = None) -> None: + self.id: UUID = aggregate_id or uuid4() + self.version: int = 0 + self._uncommitted_events: list[DomainEvent] = [] + + def 
apply_event(self, event: DomainEvent, *, is_new: bool = True) -> None: + self._apply(event) + self.version = event.version + if is_new: + self._uncommitted_events.append(event) + + def _apply(self, event: DomainEvent) -> None: + method_name = f"_apply_{type(event).__name__}" + method = getattr(self, method_name, None) + if method is None: + raise NotImplementedError(f"{type(self).__name__} missing {method_name}") + method(event) + + def load_from_history(self, events: list[DomainEvent]) -> None: + for event in events: + self.apply_event(event, is_new=False) + + def collect_uncommitted_events(self) -> list[DomainEvent]: + events = list(self._uncommitted_events) + self._uncommitted_events.clear() + return events + + def _next_version(self) -> int: + return self.version + 1 + + def to_snapshot(self) -> dict[str, Any]: + raise NotImplementedError + + @classmethod + def from_snapshot(cls, aggregate_id: UUID, state: dict[str, Any], version: int) -> Self: + raise NotImplementedError diff --git a/shared/src/shared/event/domain_event.py b/shared/src/shared/event/domain_event.py new file mode 100644 index 0000000..089b717 --- /dev/null +++ b/shared/src/shared/event/domain_event.py @@ -0,0 +1,22 @@ +from datetime import datetime +from typing import Any +from uuid import UUID, uuid4 + +from pydantic import BaseModel, ConfigDict, Field + + +class DomainEvent(BaseModel): + model_config = ConfigDict(frozen=True) + + event_id: UUID = Field(default_factory=uuid4) + aggregate_id: UUID + timestamp: datetime = Field(default_factory=datetime.now) + version: int + event_type: str = "" + + @classmethod + def __pydantic_init_subclass__(cls, **kwargs: Any) -> None: + super().__pydantic_init_subclass__(**kwargs) + if "event_type" in cls.model_fields: + cls.model_fields["event_type"].default = f"{cls.__module__}.{cls.__qualname__}" + cls.model_rebuild(force=True) diff --git a/shared/src/shared/event/models.py b/shared/src/shared/event/models.py new file mode 100644 index 0000000..8413245 --- /dev/null +++ b/shared/src/shared/event/models.py @@ -0,0 +1,52 @@ +from datetime import datetime +from uuid import UUID + +from sqlalchemy import DateTime as SADateTime +from sqlalchemy import Index, Integer, Text +from sqlalchemy.dialects.postgresql import JSONB +from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column +from sqlalchemy.sql import func + + +class EventStoreBase(DeclarativeBase): + pass + + +class StoredEvent(EventStoreBase): + __tablename__ = "domain_events" + + id: Mapped[int] = mapped_column(primary_key=True, autoincrement=True) + aggregate_id: Mapped[UUID] = mapped_column(index=True) + event_type: Mapped[str] = mapped_column(Text) + version: Mapped[int] = mapped_column(Integer) + payload: Mapped[dict] = mapped_column(JSONB) + timestamp: Mapped[datetime] = mapped_column(SADateTime(timezone=True), server_default=func.now()) + + __table_args__ = ( + Index( + "ix_domain_events_agg_version", + "aggregate_id", + "version", + unique=True, + ), + ) + + +class StoredSnapshot(EventStoreBase): + __tablename__ = "aggregate_snapshots" + + id: Mapped[int] = mapped_column(primary_key=True, autoincrement=True) + aggregate_id: Mapped[UUID] = mapped_column(index=True) + aggregate_type: Mapped[str] = mapped_column(Text) + version: Mapped[int] = mapped_column(Integer) + state: Mapped[dict] = mapped_column(JSONB) + created_at: Mapped[datetime] = mapped_column(SADateTime(timezone=True), server_default=func.now()) + + __table_args__ = ( + Index( + "ix_snapshots_agg_version", + "aggregate_id", + "version", + unique=True, + 
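+            # at most one snapshot row per (aggregate_id, version); re-saving
+            # the same version would violate this unique index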
), + ) diff --git a/shared/src/shared/event/pg_store.py b/shared/src/shared/event/pg_store.py new file mode 100644 index 0000000..8b754cb --- /dev/null +++ b/shared/src/shared/event/pg_store.py @@ -0,0 +1,171 @@ +import json +import logging +from collections.abc import AsyncGenerator, Callable +from contextlib import asynccontextmanager +from typing import Any +from uuid import UUID + +from sqlalchemy import select +from sqlalchemy.ext.asyncio import AsyncSession + +from shared.domain.exceptions import ConflictError +from shared.event.aggregate import AggregateRoot +from shared.event.domain_event import DomainEvent +from shared.event.models import StoredEvent, StoredSnapshot +from shared.event.snapshot import SnapshotStrategy +from shared.event.store import EventStore + +logger = logging.getLogger(__name__) + + +class PostgresEventStore(EventStore): + def __init__( + self, + session_factory: Callable[..., AsyncSession], + snapshot_strategy: SnapshotStrategy | None = None, + ) -> None: + self._session_factory = session_factory + self._snapshot_strategy = snapshot_strategy or SnapshotStrategy() + self._event_registry: dict[str, type[DomainEvent]] = {} + + def register_event_type(self, event_cls: type[DomainEvent]) -> None: + key = f"{event_cls.__module__}.{event_cls.__qualname__}" + self._event_registry[key] = event_cls + + @asynccontextmanager + async def _get_session(self, session: AsyncSession | None = None) -> AsyncGenerator[tuple[AsyncSession, bool]]: + if session is not None: + yield session, False + else: + async with self._session_factory() as new_session: + yield new_session, True + + async def load_aggregate[T: AggregateRoot]( + self, + aggregate_cls: type[T], + aggregate_id: UUID, + ) -> T | None: + snapshot_data = await self.load_snapshot(aggregate_id) + if snapshot_data is not None: + state, snapshot_version = snapshot_data + aggregate = aggregate_cls.from_snapshot(aggregate_id, state, snapshot_version) + events = await self.load_stream(aggregate_id, after_version=snapshot_version) + else: + events = await self.load_stream(aggregate_id) + if not events: + return None + aggregate = aggregate_cls(aggregate_id=aggregate_id) + snapshot_version = 0 + + aggregate.load_from_history(events) + return aggregate + + async def append( + self, + aggregate_id: UUID, + events: list[DomainEvent], + expected_version: int, + *, + aggregate: AggregateRoot | None = None, + session: AsyncSession | None = None, + ) -> None: + async with self._get_session(session) as (sess, owns_session): + result = await sess.execute( + select(StoredEvent.version) + .where(StoredEvent.aggregate_id == aggregate_id) + .order_by(StoredEvent.version.desc()) + .limit(1) + ) + current_version = result.scalar() or 0 + + if current_version != expected_version: + raise ConflictError( + f"Expected version {expected_version}, but current version is {current_version}", + details={ + "aggregate_id": str(aggregate_id), + "expected_version": expected_version, + "current_version": current_version, + }, + ) + + for event in events: + stored = StoredEvent( + aggregate_id=event.aggregate_id, + event_type=event.event_type, + version=event.version, + payload=json.loads(event.model_dump_json()), + timestamp=event.timestamp, + ) + sess.add(stored) + + if owns_session: + await sess.commit() + + if aggregate is not None: + await self._try_snapshot(aggregate, expected_version) + + async def _try_snapshot(self, aggregate: AggregateRoot, last_snapshot_version: int) -> None: + if self._snapshot_strategy.should_snapshot(aggregate.version, 
last_snapshot_version): + try: + await self.save_snapshot(aggregate.id, aggregate.to_snapshot(), aggregate.version) + logger.debug("Saved snapshot for aggregate %s at version %d", aggregate.id, aggregate.version) + except Exception: + logger.warning("Failed to save snapshot for aggregate %s", aggregate.id, exc_info=True) + + async def load_stream( + self, + aggregate_id: UUID, + after_version: int = 0, + *, + session: AsyncSession | None = None, + ) -> list[DomainEvent]: + async with self._get_session(session) as (sess, _): + result = await sess.execute( + select(StoredEvent) + .where( + StoredEvent.aggregate_id == aggregate_id, + StoredEvent.version > after_version, + ) + .order_by(StoredEvent.version) + ) + rows = result.scalars().all() + + events: list[DomainEvent] = [] + for row in rows: + event_cls = self._event_registry.get(row.event_type) + if event_cls is None: + event_cls = DomainEvent + events.append(event_cls.model_validate(row.payload)) + return events + + async def load_snapshot( + self, + aggregate_id: UUID, + ) -> tuple[dict[str, Any], int] | None: + async with self._session_factory() as session: + result = await session.execute( + select(StoredSnapshot) + .where(StoredSnapshot.aggregate_id == aggregate_id) + .order_by(StoredSnapshot.version.desc()) + .limit(1) + ) + row = result.scalar_one_or_none() + if row is None: + return None + return row.state, row.version + + async def save_snapshot( + self, + aggregate_id: UUID, + state: dict[str, Any], + version: int, + ) -> None: + async with self._session_factory() as session: + snapshot = StoredSnapshot( + aggregate_id=aggregate_id, + aggregate_type="", + version=version, + state=state, + ) + session.add(snapshot) + await session.commit() diff --git a/shared/src/shared/event/snapshot.py b/shared/src/shared/event/snapshot.py new file mode 100644 index 0000000..8a09718 --- /dev/null +++ b/shared/src/shared/event/snapshot.py @@ -0,0 +1,6 @@ +class SnapshotStrategy: + def __init__(self, every_n_events: int = 100) -> None: + self.every_n_events = every_n_events + + def should_snapshot(self, current_version: int, last_snapshot_version: int) -> bool: + return (current_version - last_snapshot_version) >= self.every_n_events diff --git a/shared/src/shared/event/store.py b/shared/src/shared/event/store.py new file mode 100644 index 0000000..89ebc98 --- /dev/null +++ b/shared/src/shared/event/store.py @@ -0,0 +1,36 @@ +from abc import ABC, abstractmethod +from typing import Any +from uuid import UUID + +from shared.event.domain_event import DomainEvent + + +class EventStore(ABC): + @abstractmethod + async def append( + self, + aggregate_id: UUID, + events: list[DomainEvent], + expected_version: int, + ) -> None: ... + + @abstractmethod + async def load_stream( + self, + aggregate_id: UUID, + after_version: int = 0, + ) -> list[DomainEvent]: ... + + @abstractmethod + async def load_snapshot( + self, + aggregate_id: UUID, + ) -> tuple[dict[str, Any], int] | None: ... + + @abstractmethod + async def save_snapshot( + self, + aggregate_id: UUID, + state: dict[str, Any], + version: int, + ) -> None: ... 
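Before the messaging layer, here is how the event-sourcing pieces above fit together. This is a minimal usage sketch, not part of the diff: `Account`, `AccountOpened`, and the connection string are hypothetical, and it assumes `AggregateRoot` (defined earlier in this diff) accepts `aggregate_id=` in its constructor, exposes `id`, and permits plain attribute assignment, as `PostgresEventStore.load_aggregate` and `_try_snapshot` imply.

```python
import asyncio
from uuid import uuid4

from sqlalchemy.ext.asyncio import async_sessionmaker, create_async_engine

from shared.event.aggregate import AggregateRoot
from shared.event.domain_event import DomainEvent
from shared.event.models import EventStoreBase
from shared.event.pg_store import PostgresEventStore


class AccountOpened(DomainEvent):  # hypothetical event; event_type is filled in by __pydantic_init_subclass__
    owner: str


class Account(AggregateRoot):  # hypothetical aggregate
    owner: str = ""

    def open(self, owner: str) -> None:
        # New events carry the next version; apply_event() mutates state and queues the event.
        self.apply_event(AccountOpened(aggregate_id=self.id, version=self._next_version(), owner=owner))

    def _apply_AccountOpened(self, event: AccountOpened) -> None:
        # Invoked via the _apply_<EventClass> dispatch in AggregateRoot._apply.
        self.owner = event.owner


async def main() -> None:
    engine = create_async_engine("postgresql+asyncpg://cmdb:cmdb@localhost/cmdb")  # hypothetical DSN
    async with engine.begin() as conn:
        await conn.run_sync(EventStoreBase.metadata.create_all)  # or use the project's migrations

    store = PostgresEventStore(async_sessionmaker(engine, expire_on_commit=False))
    store.register_event_type(AccountOpened)  # without this, load_stream falls back to bare DomainEvent

    account = Account(aggregate_id=uuid4())
    account.open("alice")
    # expected_version=0: the stream must still be empty, otherwise ConflictError is raised.
    await store.append(account.id, account.collect_uncommitted_events(), expected_version=0, aggregate=account)

    loaded = await store.load_aggregate(Account, account.id)
    assert loaded is not None and loaded.owner == "alice"


if __name__ == "__main__":
    asyncio.run(main())
```

Note that `append`'s optimistic check is read-then-write, so two racing writers can both pass it; the unique `(aggregate_id, version)` index in `models.py` is what actually rejects the second writer, surfacing as an integrity error rather than a `ConflictError`.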
diff --git a/shared/src/shared/messaging/__init__.py b/shared/src/shared/messaging/__init__.py new file mode 100644 index 0000000..186ea87 --- /dev/null +++ b/shared/src/shared/messaging/__init__.py @@ -0,0 +1,9 @@ +from shared.messaging.consumer import KafkaEventConsumer +from shared.messaging.producer import KafkaEventProducer +from shared.messaging.serialization import EventSerializer + +__all__ = [ + "EventSerializer", + "KafkaEventConsumer", + "KafkaEventProducer", +] diff --git a/shared/src/shared/messaging/consumer.py b/shared/src/shared/messaging/consumer.py new file mode 100644 index 0000000..37a8153 --- /dev/null +++ b/shared/src/shared/messaging/consumer.py @@ -0,0 +1,73 @@ +import logging +from collections.abc import Awaitable, Callable + +from aiokafka import AIOKafkaConsumer + +from shared.event.domain_event import DomainEvent +from shared.messaging.producer import KafkaEventProducer +from shared.messaging.serialization import EventSerializer + +logger = logging.getLogger(__name__) + +EventHandler = Callable[[DomainEvent], Awaitable[None]] + + +class KafkaEventConsumer: + def __init__( + self, + bootstrap_servers: str, + group_id: str, + topics: list[str], + serializer: EventSerializer | None = None, + dlq_topic: str | None = None, + dlq_producer: KafkaEventProducer | None = None, + ) -> None: + self._bootstrap_servers = bootstrap_servers + self._group_id = group_id + self._topics = topics + self._serializer = serializer or EventSerializer() + self._handlers: dict[type[DomainEvent], list[EventHandler]] = {} + self._dlq_topic = dlq_topic + self._dlq_producer = dlq_producer + self._consumer: AIOKafkaConsumer | None = None + + def subscribe(self, event_type: type[DomainEvent], handler: EventHandler) -> None: + self._handlers.setdefault(event_type, []).append(handler) + + async def start(self) -> None: + self._consumer = AIOKafkaConsumer( + *self._topics, + bootstrap_servers=self._bootstrap_servers, + group_id=self._group_id, + enable_auto_commit=False, + ) + await self._consumer.start() + + async def stop(self) -> None: + if self._consumer: + await self._consumer.stop() + + async def consume(self) -> None: + if self._consumer is None: + raise RuntimeError("Consumer not started") + async for msg in self._consumer: + try: + event = self._serializer.deserialize(msg.value) + handlers = self._handlers.get(type(event), []) + for handler in handlers: + await handler(event) + await self._consumer.commit() + except Exception: + logger.exception("Failed to process message from %s", msg.topic) + await self._send_to_dlq(msg) + await self._consumer.commit() + + async def _send_to_dlq(self, msg: object) -> None: + if self._dlq_producer and self._dlq_topic: + if self._dlq_producer._producer is None: + return + await self._dlq_producer._producer.send_and_wait( + self._dlq_topic, + value=msg.value, # type: ignore[attr-defined] + key=msg.key, # type: ignore[attr-defined] + ) diff --git a/shared/src/shared/messaging/producer.py b/shared/src/shared/messaging/producer.py new file mode 100644 index 0000000..ad0916d --- /dev/null +++ b/shared/src/shared/messaging/producer.py @@ -0,0 +1,48 @@ +from types import TracebackType + +from aiokafka import AIOKafkaProducer + +from shared.event.domain_event import DomainEvent +from shared.messaging.serialization import EventSerializer + + +class KafkaEventProducer: + def __init__( + self, + bootstrap_servers: str, + serializer: EventSerializer | None = None, + ) -> None: + self._bootstrap_servers = bootstrap_servers + self._serializer = serializer or 
EventSerializer() + self._producer: AIOKafkaProducer | None = None + + async def start(self) -> None: + self._producer = AIOKafkaProducer(bootstrap_servers=self._bootstrap_servers) + await self._producer.start() + + async def stop(self) -> None: + if self._producer: + await self._producer.stop() + + async def publish(self, topic: str, event: DomainEvent) -> None: + if self._producer is None: + raise RuntimeError("Producer not started") + key = str(event.aggregate_id).encode("utf-8") + value = self._serializer.serialize(event) + await self._producer.send_and_wait(topic, value=value, key=key) + + async def publish_many(self, topic: str, events: list[DomainEvent]) -> None: + for event in events: + await self.publish(topic, event) + + async def __aenter__(self) -> "KafkaEventProducer": + await self.start() + return self + + async def __aexit__( + self, + exc_type: type[BaseException] | None, + exc_val: BaseException | None, + exc_tb: TracebackType | None, + ) -> None: + await self.stop() diff --git a/shared/src/shared/messaging/serialization.py b/shared/src/shared/messaging/serialization.py new file mode 100644 index 0000000..bf7572c --- /dev/null +++ b/shared/src/shared/messaging/serialization.py @@ -0,0 +1,21 @@ +import json + +from shared.event.domain_event import DomainEvent + + +class EventSerializer: + def __init__(self) -> None: + self._registry: dict[str, type[DomainEvent]] = {} + + def register(self, event_cls: type[DomainEvent]) -> None: + key = f"{event_cls.__module__}.{event_cls.__qualname__}" + self._registry[key] = event_cls + + def serialize(self, event: DomainEvent) -> bytes: + return event.model_dump_json().encode("utf-8") + + def deserialize(self, data: bytes) -> DomainEvent: + raw = json.loads(data) + event_type = raw.get("event_type", "") + cls = self._registry.get(event_type, DomainEvent) + return cls.model_validate(raw) diff --git a/skills-lock.json b/skills-lock.json index e2ad060..79fcc86 100644 --- a/skills-lock.json +++ b/skills-lock.json @@ -1,30 +1,90 @@ { "version": 1, "skills": { + "design-an-interface": { + "source": "mattpocock/skills", + "sourceType": "github", + "computedHash": "885b05367fb1cf61a1e27e0d50f14d2502a05c1fa276c6e9ea050ce6a90e7095" + }, + "edit-article": { + "source": "mattpocock/skills", + "sourceType": "github", + "computedHash": "6d29872a4015fcbb4879fe1bddf1919995c477dcf7248885efb85722f50e6c4d" + }, + "git-guardrails-claude-code": { + "source": "mattpocock/skills", + "sourceType": "github", + "computedHash": "53a1de29d2b76481202608376fa4a20a8bd7e65b0b8a907a8c9578e629cf313f" + }, "grill-me": { "source": "mattpocock/skills", "sourceType": "github", - "computedHash": "e02446c0912decc73472ab3392af040fac223fb93b46fd101bce27686d717136" + "computedHash": "f343ef02aa0ada0f8b13a77b6db7fb6a11f88d8067128215585500ef4fd3b5e0" }, "improve-codebase-architecture": { "source": "mattpocock/skills", "sourceType": "github", "computedHash": "76d07c4c0bebc162cc76ca7494bfe7a90e279f35832f7ce6dfc4ce16518fd560" }, + "migrate-to-shoehorn": { + "source": "mattpocock/skills", + "sourceType": "github", + "computedHash": "8f1a623a019abca0ab5e22e8deaae375829368cc1344016010201cf650331d33" + }, + "obsidian-vault": { + "source": "mattpocock/skills", + "sourceType": "github", + "computedHash": "960437c7c2febc15cf9d49964f0c8bd889d90a56e81b6eaa33f3ac5b594291c3" + }, "prd-to-issues": { "source": "mattpocock/skills", "sourceType": "github", "computedHash": "32192340c1a14317b25741f5a0a38d162b387b6d4bfd09e0ed3859367321e98f" }, + "prd-to-plan": { + "source": "mattpocock/skills", + 
"sourceType": "github", + "computedHash": "b0fad4a6e8097a5095fe00bfc2ebd98827c2d599e8b0a974c9d3c50b59295ac8" + }, + "request-refactor-plan": { + "source": "mattpocock/skills", + "sourceType": "github", + "computedHash": "176cf1d0c4dc4fc4d958c4bc4ed67f222f2656bda2f3f15e3a3e98b2e5edaf2f" + }, + "scaffold-exercises": { + "source": "mattpocock/skills", + "sourceType": "github", + "computedHash": "486d60a7e984b76850bd22b3b58b98d0063a3b9b3acf319e4d5a37bf1283ea02" + }, + "setup-pre-commit": { + "source": "mattpocock/skills", + "sourceType": "github", + "computedHash": "f9aeedf0a1f560ec465c3939c54434c2d949ac8b958a0493db52216f688a0366" + }, "tdd": { "source": "mattpocock/skills", "sourceType": "github", "computedHash": "ad055544599481182d0079c4198e85da26677cb408a677be2da04915375f0427" }, + "triage-issue": { + "source": "mattpocock/skills", + "sourceType": "github", + "computedHash": "6834b87393bff8f529b3126b9f761c1c2b9706af8a784ad1061961c34bbdb4a9" + }, + "ubiquitous-language": { + "source": "mattpocock/skills", + "sourceType": "github", + "computedHash": "5f0985cb5bfc8dad3664882a4b384fb7d0c625ef454ef420c5c4e6b9cb580b51" + }, "write-a-prd": { "source": "mattpocock/skills", "sourceType": "github", "computedHash": "d03d51db8cfd3dd5c27f9b8e1ffe61686391fdbd62bf4cb2f47db7e62ee68cd2" + }, + "write-a-skill": { + "source": "mattpocock/skills", + "sourceType": "github", + "computedHash": "b44d8aab2ead83c716e01af4c9a24ccc4575ce70ad58ec4f1749fb88c9cc82ba" } } } diff --git a/uv.lock b/uv.lock new file mode 100644 index 0000000..919acd1 --- /dev/null +++ b/uv.lock @@ -0,0 +1,1444 @@ +version = 1 +revision = 3 +requires-python = ">=3.13" + +[manifest] +members = [ + "cmdb", + "cmdb-auth", + "cmdb-event", + "cmdb-ipam", + "cmdb-shared", + "cmdb-tenant", + "cmdb-webhook", +] + +[[package]] +name = "aiokafka" +version = "0.13.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "async-timeout" }, + { name = "packaging" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/87/18/d3a4f8f9ad099fc59217b8cdf66eeecde3a9ef3bb31fe676e431a3b0010f/aiokafka-0.13.0.tar.gz", hash = "sha256:7d634af3c8d694a37a6c8535c54f01a740e74cccf7cc189ecc4a3d64e31ce122", size = 598580, upload-time = "2026-01-02T13:55:18.911Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e3/f6/a74c49759233e98b61182ba3d49d5ac9c8de0643651892acba2704fba1cc/aiokafka-0.13.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:39d71c40cff733221a6b2afff4beeac5dacbd119fb99eec5198af59115264a1a", size = 343733, upload-time = "2026-01-02T13:54:58.536Z" }, + { url = "https://files.pythonhosted.org/packages/cf/52/4f7e80eee2c69cd8b047c18145469bf0dc27542a5dca3f96ff81ade575b0/aiokafka-0.13.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:faa2f5f3d0d2283a0c1a149748cc7e3a3862ef327fa5762e2461088eedde230a", size = 346258, upload-time = "2026-01-02T13:55:00.947Z" }, + { url = "https://files.pythonhosted.org/packages/81/9b/d2766bb3b0bad53eb25a88e51a884be4b77a1706053ad717b893b4daea4b/aiokafka-0.13.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b890d535e55f5073f939585bef5301634df669e97832fda77aa743498f008662", size = 1114744, upload-time = "2026-01-02T13:55:02.475Z" }, + { url = "https://files.pythonhosted.org/packages/8f/00/12e0a39cd4809149a09b4a52b629abc9bf80e7b8bad9950040b1adae99fc/aiokafka-0.13.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = 
"sha256:e22eb8a1475b9c0f45b553b6e2dcaf4ec3c0014bf4e389e00a0a0ec85d0e3bdc", size = 1105676, upload-time = "2026-01-02T13:55:04.036Z" }, + { url = "https://files.pythonhosted.org/packages/38/4a/0bc91e90faf55533fe6468461c2dd31c22b0e1d274b9386f341cca3f7eb7/aiokafka-0.13.0-cp313-cp313-win32.whl", hash = "sha256:ae507c7b09e882484f709f2e7172b3a4f75afffcd896d00517feb35c619495bb", size = 308257, upload-time = "2026-01-02T13:55:05.873Z" }, + { url = "https://files.pythonhosted.org/packages/23/63/5433d1aa10c4fb4cf85bd73013263c36d7da4604b0c77ed4d1ad42fae70c/aiokafka-0.13.0-cp313-cp313-win_amd64.whl", hash = "sha256:fec1a7e3458365a72809edaa2b990f65ca39b01a2a579f879ac4da6c9b2dbc5c", size = 326968, upload-time = "2026-01-02T13:55:07.351Z" }, + { url = "https://files.pythonhosted.org/packages/3c/cc/45b04c3a5fd3d2d5f444889ecceb80b2f78d6d66aa45e3042767e55579e2/aiokafka-0.13.0-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:9a403785f7092c72906c37f7618f7b16a4219eba8ed0bdda90fba410a7dd50b5", size = 344503, upload-time = "2026-01-02T13:55:08.723Z" }, + { url = "https://files.pythonhosted.org/packages/76/df/0b76fe3b93558ae71b856940e384909c4c2c7a1c330423003191e4ba7782/aiokafka-0.13.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:256807326831b7eee253ea1017bd2b19ab1c2298ce6b20a87fde97c253c572bc", size = 347621, upload-time = "2026-01-02T13:55:10.147Z" }, + { url = "https://files.pythonhosted.org/packages/34/1a/d59932f98fd3c106e2a7c8d4d5ebd8df25403436dfc27b3031918a37385e/aiokafka-0.13.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:64d90f91291da265d7f25296ba68fc6275684eebd6d1cf05a1b2abe6c2ba3543", size = 1111410, upload-time = "2026-01-02T13:55:11.763Z" }, + { url = "https://files.pythonhosted.org/packages/7e/04/fbf3e34ab3bc21e6e760c3fcd089375052fccc04eb8745459a82a58a647b/aiokafka-0.13.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b5a33cc043c8d199bcf101359d86f2d31fd54f4b157ac12028bdc34e3e1cf74a", size = 1094799, upload-time = "2026-01-02T13:55:13.795Z" }, + { url = "https://files.pythonhosted.org/packages/85/10/509f709fd3b7c3e568a5b8044be0e80a1504f8da6ddc72c128b21e270913/aiokafka-0.13.0-cp314-cp314-win32.whl", hash = "sha256:538950384b539ba2333d35a853f09214c0409e818e5d5f366ef759eea50bae9c", size = 311553, upload-time = "2026-01-02T13:55:15.928Z" }, + { url = "https://files.pythonhosted.org/packages/2b/18/424d6a4eb6f4835a371c1e2cfafce800540b33d957c6638795d911f98973/aiokafka-0.13.0-cp314-cp314-win_amd64.whl", hash = "sha256:c906dd42daadd14b4506a2e6c62dfef3d4919b5953d32ae5e5f0d99efd103c89", size = 330648, upload-time = "2026-01-02T13:55:17.421Z" }, +] + +[[package]] +name = "alembic" +version = "1.18.4" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "mako" }, + { name = "sqlalchemy" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/94/13/8b084e0f2efb0275a1d534838844926f798bd766566b1375174e2448cd31/alembic-1.18.4.tar.gz", hash = "sha256:cb6e1fd84b6174ab8dbb2329f86d631ba9559dd78df550b57804d607672cedbc", size = 2056725, upload-time = "2026-02-10T16:00:47.195Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d2/29/6533c317b74f707ea28f8d633734dbda2119bbadfc61b2f3640ba835d0f7/alembic-1.18.4-py3-none-any.whl", hash = "sha256:a5ed4adcf6d8a4cb575f3d759f071b03cd6e5c7618eb796cb52497be25bfe19a", size = 263893, upload-time = "2026-02-10T16:00:49.997Z" }, +] + +[[package]] +name = "annotated-doc" +version = "0.0.4" 
+source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/57/ba/046ceea27344560984e26a590f90bc7f4a75b06701f653222458922b558c/annotated_doc-0.0.4.tar.gz", hash = "sha256:fbcda96e87e9c92ad167c2e53839e57503ecfda18804ea28102353485033faa4", size = 7288, upload-time = "2025-11-10T22:07:42.062Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/1e/d3/26bf1008eb3d2daa8ef4cacc7f3bfdc11818d111f7e2d0201bc6e3b49d45/annotated_doc-0.0.4-py3-none-any.whl", hash = "sha256:571ac1dc6991c450b25a9c2d84a3705e2ae7a53467b5d111c24fa8baabbed320", size = 5303, upload-time = "2025-11-10T22:07:40.673Z" }, +] + +[[package]] +name = "annotated-types" +version = "0.7.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/ee/67/531ea369ba64dcff5ec9c3402f9f51bf748cec26dde048a2f973a4eea7f5/annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89", size = 16081, upload-time = "2024-05-20T21:33:25.928Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/78/b6/6307fbef88d9b5ee7421e68d78a9f162e0da4900bc5f5793f6d3d0e34fb8/annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53", size = 13643, upload-time = "2024-05-20T21:33:24.1Z" }, +] + +[[package]] +name = "anyio" +version = "4.12.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "idna" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/96/f0/5eb65b2bb0d09ac6776f2eb54adee6abe8228ea05b20a5ad0e4945de8aac/anyio-4.12.1.tar.gz", hash = "sha256:41cfcc3a4c85d3f05c932da7c26d0201ac36f72abd4435ba90d0464a3ffed703", size = 228685, upload-time = "2026-01-06T11:45:21.246Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/38/0e/27be9fdef66e72d64c0cdc3cc2823101b80585f8119b5c112c2e8f5f7dab/anyio-4.12.1-py3-none-any.whl", hash = "sha256:d405828884fc140aa80a3c667b8beed277f1dfedec42ba031bd6ac3db606ab6c", size = 113592, upload-time = "2026-01-06T11:45:19.497Z" }, +] + +[[package]] +name = "async-timeout" +version = "5.0.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/a5/ae/136395dfbfe00dfc94da3f3e136d0b13f394cba8f4841120e34226265780/async_timeout-5.0.1.tar.gz", hash = "sha256:d9321a7a3d5a6a5e187e824d2fa0793ce379a202935782d555d6e9d2735677d3", size = 9274, upload-time = "2024-11-06T16:41:39.6Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/fe/ba/e2081de779ca30d473f21f5b30e0e737c438205440784c7dfc81efc2b029/async_timeout-5.0.1-py3-none-any.whl", hash = "sha256:39e3809566ff85354557ec2398b55e096c8364bacac9405a7a1fa429e77fe76c", size = 6233, upload-time = "2024-11-06T16:41:37.9Z" }, +] + +[[package]] +name = "asyncpg" +version = "0.31.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/fe/cc/d18065ce2380d80b1bcce927c24a2642efd38918e33fd724bc4bca904877/asyncpg-0.31.0.tar.gz", hash = "sha256:c989386c83940bfbd787180f2b1519415e2d3d6277a70d9d0f0145ac73500735", size = 993667, upload-time = "2025-11-24T23:27:00.812Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/95/11/97b5c2af72a5d0b9bc3fa30cd4b9ce22284a9a943a150fdc768763caf035/asyncpg-0.31.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:c204fab1b91e08b0f47e90a75d1b3c62174dab21f670ad6c5d0f243a228f015b", size = 661111, upload-time = "2025-11-24T23:26:04.467Z" }, + { url = 
"https://files.pythonhosted.org/packages/1b/71/157d611c791a5e2d0423f09f027bd499935f0906e0c2a416ce712ba51ef3/asyncpg-0.31.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:54a64f91839ba59008eccf7aad2e93d6e3de688d796f35803235ea1c4898ae1e", size = 636928, upload-time = "2025-11-24T23:26:05.944Z" }, + { url = "https://files.pythonhosted.org/packages/2e/fc/9e3486fb2bbe69d4a867c0b76d68542650a7ff1574ca40e84c3111bb0c6e/asyncpg-0.31.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c0e0822b1038dc7253b337b0f3f676cadc4ac31b126c5d42691c39691962e403", size = 3424067, upload-time = "2025-11-24T23:26:07.957Z" }, + { url = "https://files.pythonhosted.org/packages/12/c6/8c9d076f73f07f995013c791e018a1cd5f31823c2a3187fc8581706aa00f/asyncpg-0.31.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bef056aa502ee34204c161c72ca1f3c274917596877f825968368b2c33f585f4", size = 3518156, upload-time = "2025-11-24T23:26:09.591Z" }, + { url = "https://files.pythonhosted.org/packages/ae/3b/60683a0baf50fbc546499cfb53132cb6835b92b529a05f6a81471ab60d0c/asyncpg-0.31.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:0bfbcc5b7ffcd9b75ab1558f00db2ae07db9c80637ad1b2469c43df79d7a5ae2", size = 3319636, upload-time = "2025-11-24T23:26:11.168Z" }, + { url = "https://files.pythonhosted.org/packages/50/dc/8487df0f69bd398a61e1792b3cba0e47477f214eff085ba0efa7eac9ce87/asyncpg-0.31.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:22bc525ebbdc24d1261ecbf6f504998244d4e3be1721784b5f64664d61fbe602", size = 3472079, upload-time = "2025-11-24T23:26:13.164Z" }, + { url = "https://files.pythonhosted.org/packages/13/a1/c5bbeeb8531c05c89135cb8b28575ac2fac618bcb60119ee9696c3faf71c/asyncpg-0.31.0-cp313-cp313-win32.whl", hash = "sha256:f890de5e1e4f7e14023619399a471ce4b71f5418cd67a51853b9910fdfa73696", size = 527606, upload-time = "2025-11-24T23:26:14.78Z" }, + { url = "https://files.pythonhosted.org/packages/91/66/b25ccb84a246b470eb943b0107c07edcae51804912b824054b3413995a10/asyncpg-0.31.0-cp313-cp313-win_amd64.whl", hash = "sha256:dc5f2fa9916f292e5c5c8b2ac2813763bcd7f58e130055b4ad8a0531314201ab", size = 596569, upload-time = "2025-11-24T23:26:16.189Z" }, + { url = "https://files.pythonhosted.org/packages/3c/36/e9450d62e84a13aea6580c83a47a437f26c7ca6fa0f0fd40b6670793ea30/asyncpg-0.31.0-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:f6b56b91bb0ffc328c4e3ed113136cddd9deefdf5f79ab448598b9772831df44", size = 660867, upload-time = "2025-11-24T23:26:17.631Z" }, + { url = "https://files.pythonhosted.org/packages/82/4b/1d0a2b33b3102d210439338e1beea616a6122267c0df459ff0265cd5807a/asyncpg-0.31.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:334dec28cf20d7f5bb9e45b39546ddf247f8042a690bff9b9573d00086e69cb5", size = 638349, upload-time = "2025-11-24T23:26:19.689Z" }, + { url = "https://files.pythonhosted.org/packages/41/aa/e7f7ac9a7974f08eff9183e392b2d62516f90412686532d27e196c0f0eeb/asyncpg-0.31.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:98cc158c53f46de7bb677fd20c417e264fc02b36d901cc2a43bd6cb0dc6dbfd2", size = 3410428, upload-time = "2025-11-24T23:26:21.275Z" }, + { url = "https://files.pythonhosted.org/packages/6f/de/bf1b60de3dede5c2731e6788617a512bc0ebd9693eac297ee74086f101d7/asyncpg-0.31.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9322b563e2661a52e3cdbc93eed3be7748b289f792e0011cb2720d278b366ce2", size = 3471678, 
upload-time = "2025-11-24T23:26:23.627Z" }, + { url = "https://files.pythonhosted.org/packages/46/78/fc3ade003e22d8bd53aaf8f75f4be48f0b460fa73738f0391b9c856a9147/asyncpg-0.31.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:19857a358fc811d82227449b7ca40afb46e75b33eb8897240c3839dd8b744218", size = 3313505, upload-time = "2025-11-24T23:26:25.235Z" }, + { url = "https://files.pythonhosted.org/packages/bf/e9/73eb8a6789e927816f4705291be21f2225687bfa97321e40cd23055e903a/asyncpg-0.31.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:ba5f8886e850882ff2c2ace5732300e99193823e8107e2c53ef01c1ebfa1e85d", size = 3434744, upload-time = "2025-11-24T23:26:26.944Z" }, + { url = "https://files.pythonhosted.org/packages/08/4b/f10b880534413c65c5b5862f79b8e81553a8f364e5238832ad4c0af71b7f/asyncpg-0.31.0-cp314-cp314-win32.whl", hash = "sha256:cea3a0b2a14f95834cee29432e4ddc399b95700eb1d51bbc5bfee8f31fa07b2b", size = 532251, upload-time = "2025-11-24T23:26:28.404Z" }, + { url = "https://files.pythonhosted.org/packages/d3/2d/7aa40750b7a19efa5d66e67fc06008ca0f27ba1bd082e457ad82f59aba49/asyncpg-0.31.0-cp314-cp314-win_amd64.whl", hash = "sha256:04d19392716af6b029411a0264d92093b6e5e8285ae97a39957b9a9c14ea72be", size = 604901, upload-time = "2025-11-24T23:26:30.34Z" }, + { url = "https://files.pythonhosted.org/packages/ce/fe/b9dfe349b83b9dee28cc42360d2c86b2cdce4cb551a2c2d27e156bcac84d/asyncpg-0.31.0-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:bdb957706da132e982cc6856bb2f7b740603472b54c3ebc77fe60ea3e57e1bd2", size = 702280, upload-time = "2025-11-24T23:26:32Z" }, + { url = "https://files.pythonhosted.org/packages/6a/81/e6be6e37e560bd91e6c23ea8a6138a04fd057b08cf63d3c5055c98e81c1d/asyncpg-0.31.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:6d11b198111a72f47154fa03b85799f9be63701e068b43f84ac25da0bda9cb31", size = 682931, upload-time = "2025-11-24T23:26:33.572Z" }, + { url = "https://files.pythonhosted.org/packages/a6/45/6009040da85a1648dd5bc75b3b0a062081c483e75a1a29041ae63a0bf0dc/asyncpg-0.31.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:18c83b03bc0d1b23e6230f5bf8d4f217dc9bc08644ce0502a9d91dc9e634a9c7", size = 3581608, upload-time = "2025-11-24T23:26:35.638Z" }, + { url = "https://files.pythonhosted.org/packages/7e/06/2e3d4d7608b0b2b3adbee0d0bd6a2d29ca0fc4d8a78f8277df04e2d1fd7b/asyncpg-0.31.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e009abc333464ff18b8f6fd146addffd9aaf63e79aa3bb40ab7a4c332d0c5e9e", size = 3498738, upload-time = "2025-11-24T23:26:37.275Z" }, + { url = "https://files.pythonhosted.org/packages/7d/aa/7d75ede780033141c51d83577ea23236ba7d3a23593929b32b49db8ed36e/asyncpg-0.31.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:3b1fbcb0e396a5ca435a8826a87e5c2c2cc0c8c68eb6fadf82168056b0e53a8c", size = 3401026, upload-time = "2025-11-24T23:26:39.423Z" }, + { url = "https://files.pythonhosted.org/packages/ba/7a/15e37d45e7f7c94facc1e9148c0e455e8f33c08f0b8a0b1deb2c5171771b/asyncpg-0.31.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:8df714dba348efcc162d2adf02d213e5fab1bd9f557e1305633e851a61814a7a", size = 3429426, upload-time = "2025-11-24T23:26:41.032Z" }, + { url = "https://files.pythonhosted.org/packages/13/d5/71437c5f6ae5f307828710efbe62163974e71237d5d46ebd2869ea052d10/asyncpg-0.31.0-cp314-cp314t-win32.whl", hash = "sha256:1b41f1afb1033f2b44f3234993b15096ddc9cd71b21a42dbd87fc6a57b43d65d", size = 614495, upload-time = "2025-11-24T23:26:42.659Z" }, + { url = 
"https://files.pythonhosted.org/packages/3c/d7/8fb3044eaef08a310acfe23dae9a8e2e07d305edc29a53497e52bc76eca7/asyncpg-0.31.0-cp314-cp314t-win_amd64.whl", hash = "sha256:bd4107bb7cdd0e9e65fae66a62afd3a249663b844fa34d479f6d5b3bef9c04c3", size = 706062, upload-time = "2025-11-24T23:26:44.086Z" }, +] + +[[package]] +name = "bcrypt" +version = "5.0.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/d4/36/3329e2518d70ad8e2e5817d5a4cac6bba05a47767ec416c7d020a965f408/bcrypt-5.0.0.tar.gz", hash = "sha256:f748f7c2d6fd375cc93d3fba7ef4a9e3a092421b8dbf34d8d4dc06be9492dfdd", size = 25386, upload-time = "2025-09-25T19:50:47.829Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/13/85/3e65e01985fddf25b64ca67275bb5bdb4040bd1a53b66d355c6c37c8a680/bcrypt-5.0.0-cp313-cp313t-macosx_10_12_universal2.whl", hash = "sha256:f3c08197f3039bec79cee59a606d62b96b16669cff3949f21e74796b6e3cd2be", size = 481806, upload-time = "2025-09-25T19:49:05.102Z" }, + { url = "https://files.pythonhosted.org/packages/44/dc/01eb79f12b177017a726cbf78330eb0eb442fae0e7b3dfd84ea2849552f3/bcrypt-5.0.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:200af71bc25f22006f4069060c88ed36f8aa4ff7f53e67ff04d2ab3f1e79a5b2", size = 268626, upload-time = "2025-09-25T19:49:06.723Z" }, + { url = "https://files.pythonhosted.org/packages/8c/cf/e82388ad5959c40d6afd94fb4743cc077129d45b952d46bdc3180310e2df/bcrypt-5.0.0-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:baade0a5657654c2984468efb7d6c110db87ea63ef5a4b54732e7e337253e44f", size = 271853, upload-time = "2025-09-25T19:49:08.028Z" }, + { url = "https://files.pythonhosted.org/packages/ec/86/7134b9dae7cf0efa85671651341f6afa695857fae172615e960fb6a466fa/bcrypt-5.0.0-cp313-cp313t-manylinux_2_28_aarch64.whl", hash = "sha256:c58b56cdfb03202b3bcc9fd8daee8e8e9b6d7e3163aa97c631dfcfcc24d36c86", size = 269793, upload-time = "2025-09-25T19:49:09.727Z" }, + { url = "https://files.pythonhosted.org/packages/cc/82/6296688ac1b9e503d034e7d0614d56e80c5d1a08402ff856a4549cb59207/bcrypt-5.0.0-cp313-cp313t-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:4bfd2a34de661f34d0bda43c3e4e79df586e4716ef401fe31ea39d69d581ef23", size = 289930, upload-time = "2025-09-25T19:49:11.204Z" }, + { url = "https://files.pythonhosted.org/packages/d1/18/884a44aa47f2a3b88dd09bc05a1e40b57878ecd111d17e5bba6f09f8bb77/bcrypt-5.0.0-cp313-cp313t-manylinux_2_28_x86_64.whl", hash = "sha256:ed2e1365e31fc73f1825fa830f1c8f8917ca1b3ca6185773b349c20fd606cec2", size = 272194, upload-time = "2025-09-25T19:49:12.524Z" }, + { url = "https://files.pythonhosted.org/packages/0e/8f/371a3ab33c6982070b674f1788e05b656cfbf5685894acbfef0c65483a59/bcrypt-5.0.0-cp313-cp313t-manylinux_2_34_aarch64.whl", hash = "sha256:83e787d7a84dbbfba6f250dd7a5efd689e935f03dd83b0f919d39349e1f23f83", size = 269381, upload-time = "2025-09-25T19:49:14.308Z" }, + { url = "https://files.pythonhosted.org/packages/b1/34/7e4e6abb7a8778db6422e88b1f06eb07c47682313997ee8a8f9352e5a6f1/bcrypt-5.0.0-cp313-cp313t-manylinux_2_34_x86_64.whl", hash = "sha256:137c5156524328a24b9fac1cb5db0ba618bc97d11970b39184c1d87dc4bf1746", size = 271750, upload-time = "2025-09-25T19:49:15.584Z" }, + { url = "https://files.pythonhosted.org/packages/c0/1b/54f416be2499bd72123c70d98d36c6cd61a4e33d9b89562c22481c81bb30/bcrypt-5.0.0-cp313-cp313t-musllinux_1_1_aarch64.whl", hash = "sha256:38cac74101777a6a7d3b3e3cfefa57089b5ada650dce2baf0cbdd9d65db22a9e", size = 303757, upload-time = 
"2025-09-25T19:49:17.244Z" }, + { url = "https://files.pythonhosted.org/packages/13/62/062c24c7bcf9d2826a1a843d0d605c65a755bc98002923d01fd61270705a/bcrypt-5.0.0-cp313-cp313t-musllinux_1_1_x86_64.whl", hash = "sha256:d8d65b564ec849643d9f7ea05c6d9f0cd7ca23bdd4ac0c2dbef1104ab504543d", size = 306740, upload-time = "2025-09-25T19:49:18.693Z" }, + { url = "https://files.pythonhosted.org/packages/d5/c8/1fdbfc8c0f20875b6b4020f3c7dc447b8de60aa0be5faaf009d24242aec9/bcrypt-5.0.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:741449132f64b3524e95cd30e5cd3343006ce146088f074f31ab26b94e6c75ba", size = 334197, upload-time = "2025-09-25T19:49:20.523Z" }, + { url = "https://files.pythonhosted.org/packages/a6/c1/8b84545382d75bef226fbc6588af0f7b7d095f7cd6a670b42a86243183cd/bcrypt-5.0.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:212139484ab3207b1f0c00633d3be92fef3c5f0af17cad155679d03ff2ee1e41", size = 352974, upload-time = "2025-09-25T19:49:22.254Z" }, + { url = "https://files.pythonhosted.org/packages/10/a6/ffb49d4254ed085e62e3e5dd05982b4393e32fe1e49bb1130186617c29cd/bcrypt-5.0.0-cp313-cp313t-win32.whl", hash = "sha256:9d52ed507c2488eddd6a95bccee4e808d3234fa78dd370e24bac65a21212b861", size = 148498, upload-time = "2025-09-25T19:49:24.134Z" }, + { url = "https://files.pythonhosted.org/packages/48/a9/259559edc85258b6d5fc5471a62a3299a6aa37a6611a169756bf4689323c/bcrypt-5.0.0-cp313-cp313t-win_amd64.whl", hash = "sha256:f6984a24db30548fd39a44360532898c33528b74aedf81c26cf29c51ee47057e", size = 145853, upload-time = "2025-09-25T19:49:25.702Z" }, + { url = "https://files.pythonhosted.org/packages/2d/df/9714173403c7e8b245acf8e4be8876aac64a209d1b392af457c79e60492e/bcrypt-5.0.0-cp313-cp313t-win_arm64.whl", hash = "sha256:9fffdb387abe6aa775af36ef16f55e318dcda4194ddbf82007a6f21da29de8f5", size = 139626, upload-time = "2025-09-25T19:49:26.928Z" }, + { url = "https://files.pythonhosted.org/packages/f8/14/c18006f91816606a4abe294ccc5d1e6f0e42304df5a33710e9e8e95416e1/bcrypt-5.0.0-cp314-cp314t-macosx_10_12_universal2.whl", hash = "sha256:4870a52610537037adb382444fefd3706d96d663ac44cbb2f37e3919dca3d7ef", size = 481862, upload-time = "2025-09-25T19:49:28.365Z" }, + { url = "https://files.pythonhosted.org/packages/67/49/dd074d831f00e589537e07a0725cf0e220d1f0d5d8e85ad5bbff251c45aa/bcrypt-5.0.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:48f753100931605686f74e27a7b49238122aa761a9aefe9373265b8b7aa43ea4", size = 268544, upload-time = "2025-09-25T19:49:30.39Z" }, + { url = "https://files.pythonhosted.org/packages/f5/91/50ccba088b8c474545b034a1424d05195d9fcbaaf802ab8bfe2be5a4e0d7/bcrypt-5.0.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:f70aadb7a809305226daedf75d90379c397b094755a710d7014b8b117df1ebbf", size = 271787, upload-time = "2025-09-25T19:49:32.144Z" }, + { url = "https://files.pythonhosted.org/packages/aa/e7/d7dba133e02abcda3b52087a7eea8c0d4f64d3e593b4fffc10c31b7061f3/bcrypt-5.0.0-cp314-cp314t-manylinux_2_28_aarch64.whl", hash = "sha256:744d3c6b164caa658adcb72cb8cc9ad9b4b75c7db507ab4bc2480474a51989da", size = 269753, upload-time = "2025-09-25T19:49:33.885Z" }, + { url = "https://files.pythonhosted.org/packages/33/fc/5b145673c4b8d01018307b5c2c1fc87a6f5a436f0ad56607aee389de8ee3/bcrypt-5.0.0-cp314-cp314t-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:a28bc05039bdf3289d757f49d616ab3efe8cf40d8e8001ccdd621cd4f98f4fc9", size = 289587, upload-time = "2025-09-25T19:49:35.144Z" }, + { url = 
"https://files.pythonhosted.org/packages/27/d7/1ff22703ec6d4f90e62f1a5654b8867ef96bafb8e8102c2288333e1a6ca6/bcrypt-5.0.0-cp314-cp314t-manylinux_2_28_x86_64.whl", hash = "sha256:7f277a4b3390ab4bebe597800a90da0edae882c6196d3038a73adf446c4f969f", size = 272178, upload-time = "2025-09-25T19:49:36.793Z" }, + { url = "https://files.pythonhosted.org/packages/c8/88/815b6d558a1e4d40ece04a2f84865b0fef233513bd85fd0e40c294272d62/bcrypt-5.0.0-cp314-cp314t-manylinux_2_34_aarch64.whl", hash = "sha256:79cfa161eda8d2ddf29acad370356b47f02387153b11d46042e93a0a95127493", size = 269295, upload-time = "2025-09-25T19:49:38.164Z" }, + { url = "https://files.pythonhosted.org/packages/51/8c/e0db387c79ab4931fc89827d37608c31cc57b6edc08ccd2386139028dc0d/bcrypt-5.0.0-cp314-cp314t-manylinux_2_34_x86_64.whl", hash = "sha256:a5393eae5722bcef046a990b84dff02b954904c36a194f6cfc817d7dca6c6f0b", size = 271700, upload-time = "2025-09-25T19:49:39.917Z" }, + { url = "https://files.pythonhosted.org/packages/06/83/1570edddd150f572dbe9fc00f6203a89fc7d4226821f67328a85c330f239/bcrypt-5.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:7f4c94dec1b5ab5d522750cb059bb9409ea8872d4494fd152b53cca99f1ddd8c", size = 334034, upload-time = "2025-09-25T19:49:41.227Z" }, + { url = "https://files.pythonhosted.org/packages/c9/f2/ea64e51a65e56ae7a8a4ec236c2bfbdd4b23008abd50ac33fbb2d1d15424/bcrypt-5.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:0cae4cb350934dfd74c020525eeae0a5f79257e8a201c0c176f4b84fdbf2a4b4", size = 352766, upload-time = "2025-09-25T19:49:43.08Z" }, + { url = "https://files.pythonhosted.org/packages/d7/d4/1a388d21ee66876f27d1a1f41287897d0c0f1712ef97d395d708ba93004c/bcrypt-5.0.0-cp314-cp314t-win32.whl", hash = "sha256:b17366316c654e1ad0306a6858e189fc835eca39f7eb2cafd6aaca8ce0c40a2e", size = 152449, upload-time = "2025-09-25T19:49:44.971Z" }, + { url = "https://files.pythonhosted.org/packages/3f/61/3291c2243ae0229e5bca5d19f4032cecad5dfb05a2557169d3a69dc0ba91/bcrypt-5.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:92864f54fb48b4c718fc92a32825d0e42265a627f956bc0361fe869f1adc3e7d", size = 149310, upload-time = "2025-09-25T19:49:46.162Z" }, + { url = "https://files.pythonhosted.org/packages/3e/89/4b01c52ae0c1a681d4021e5dd3e45b111a8fb47254a274fa9a378d8d834b/bcrypt-5.0.0-cp314-cp314t-win_arm64.whl", hash = "sha256:dd19cf5184a90c873009244586396a6a884d591a5323f0e8a5922560718d4993", size = 143761, upload-time = "2025-09-25T19:49:47.345Z" }, + { url = "https://files.pythonhosted.org/packages/84/29/6237f151fbfe295fe3e074ecc6d44228faa1e842a81f6d34a02937ee1736/bcrypt-5.0.0-cp38-abi3-macosx_10_12_universal2.whl", hash = "sha256:fc746432b951e92b58317af8e0ca746efe93e66555f1b40888865ef5bf56446b", size = 494553, upload-time = "2025-09-25T19:49:49.006Z" }, + { url = "https://files.pythonhosted.org/packages/45/b6/4c1205dde5e464ea3bd88e8742e19f899c16fa8916fb8510a851fae985b5/bcrypt-5.0.0-cp38-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:c2388ca94ffee269b6038d48747f4ce8df0ffbea43f31abfa18ac72f0218effb", size = 275009, upload-time = "2025-09-25T19:49:50.581Z" }, + { url = "https://files.pythonhosted.org/packages/3b/71/427945e6ead72ccffe77894b2655b695ccf14ae1866cd977e185d606dd2f/bcrypt-5.0.0-cp38-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:560ddb6ec730386e7b3b26b8b4c88197aaed924430e7b74666a586ac997249ef", size = 278029, upload-time = "2025-09-25T19:49:52.533Z" }, + { url = 
"https://files.pythonhosted.org/packages/17/72/c344825e3b83c5389a369c8a8e58ffe1480b8a699f46c127c34580c4666b/bcrypt-5.0.0-cp38-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:d79e5c65dcc9af213594d6f7f1fa2c98ad3fc10431e7aa53c176b441943efbdd", size = 275907, upload-time = "2025-09-25T19:49:54.709Z" }, + { url = "https://files.pythonhosted.org/packages/0b/7e/d4e47d2df1641a36d1212e5c0514f5291e1a956a7749f1e595c07a972038/bcrypt-5.0.0-cp38-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:2b732e7d388fa22d48920baa267ba5d97cca38070b69c0e2d37087b381c681fd", size = 296500, upload-time = "2025-09-25T19:49:56.013Z" }, + { url = "https://files.pythonhosted.org/packages/0f/c3/0ae57a68be2039287ec28bc463b82e4b8dc23f9d12c0be331f4782e19108/bcrypt-5.0.0-cp38-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:0c8e093ea2532601a6f686edbc2c6b2ec24131ff5c52f7610dd64fa4553b5464", size = 278412, upload-time = "2025-09-25T19:49:57.356Z" }, + { url = "https://files.pythonhosted.org/packages/45/2b/77424511adb11e6a99e3a00dcc7745034bee89036ad7d7e255a7e47be7d8/bcrypt-5.0.0-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:5b1589f4839a0899c146e8892efe320c0fa096568abd9b95593efac50a87cb75", size = 275486, upload-time = "2025-09-25T19:49:59.116Z" }, + { url = "https://files.pythonhosted.org/packages/43/0a/405c753f6158e0f3f14b00b462d8bca31296f7ecfc8fc8bc7919c0c7d73a/bcrypt-5.0.0-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:89042e61b5e808b67daf24a434d89bab164d4de1746b37a8d173b6b14f3db9ff", size = 277940, upload-time = "2025-09-25T19:50:00.869Z" }, + { url = "https://files.pythonhosted.org/packages/62/83/b3efc285d4aadc1fa83db385ec64dcfa1707e890eb42f03b127d66ac1b7b/bcrypt-5.0.0-cp38-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:e3cf5b2560c7b5a142286f69bde914494b6d8f901aaa71e453078388a50881c4", size = 310776, upload-time = "2025-09-25T19:50:02.393Z" }, + { url = "https://files.pythonhosted.org/packages/95/7d/47ee337dacecde6d234890fe929936cb03ebc4c3a7460854bbd9c97780b8/bcrypt-5.0.0-cp38-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:f632fd56fc4e61564f78b46a2269153122db34988e78b6be8b32d28507b7eaeb", size = 312922, upload-time = "2025-09-25T19:50:04.232Z" }, + { url = "https://files.pythonhosted.org/packages/d6/3a/43d494dfb728f55f4e1cf8fd435d50c16a2d75493225b54c8d06122523c6/bcrypt-5.0.0-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:801cad5ccb6b87d1b430f183269b94c24f248dddbbc5c1f78b6ed231743e001c", size = 341367, upload-time = "2025-09-25T19:50:05.559Z" }, + { url = "https://files.pythonhosted.org/packages/55/ab/a0727a4547e383e2e22a630e0f908113db37904f58719dc48d4622139b5c/bcrypt-5.0.0-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:3cf67a804fc66fc217e6914a5635000259fbbbb12e78a99488e4d5ba445a71eb", size = 359187, upload-time = "2025-09-25T19:50:06.916Z" }, + { url = "https://files.pythonhosted.org/packages/1b/bb/461f352fdca663524b4643d8b09e8435b4990f17fbf4fea6bc2a90aa0cc7/bcrypt-5.0.0-cp38-abi3-win32.whl", hash = "sha256:3abeb543874b2c0524ff40c57a4e14e5d3a66ff33fb423529c88f180fd756538", size = 153752, upload-time = "2025-09-25T19:50:08.515Z" }, + { url = "https://files.pythonhosted.org/packages/41/aa/4190e60921927b7056820291f56fc57d00d04757c8b316b2d3c0d1d6da2c/bcrypt-5.0.0-cp38-abi3-win_amd64.whl", hash = "sha256:35a77ec55b541e5e583eb3436ffbbf53b0ffa1fa16ca6782279daf95d146dcd9", size = 150881, upload-time = "2025-09-25T19:50:09.742Z" }, + { url = "https://files.pythonhosted.org/packages/54/12/cd77221719d0b39ac0b55dbd39358db1cd1246e0282e104366ebbfb8266a/bcrypt-5.0.0-cp38-abi3-win_arm64.whl", 
hash = "sha256:cde08734f12c6a4e28dc6755cd11d3bdfea608d93d958fffbe95a7026ebe4980", size = 144931, upload-time = "2025-09-25T19:50:11.016Z" }, + { url = "https://files.pythonhosted.org/packages/5d/ba/2af136406e1c3839aea9ecadc2f6be2bcd1eff255bd451dd39bcf302c47a/bcrypt-5.0.0-cp39-abi3-macosx_10_12_universal2.whl", hash = "sha256:0c418ca99fd47e9c59a301744d63328f17798b5947b0f791e9af3c1c499c2d0a", size = 495313, upload-time = "2025-09-25T19:50:12.309Z" }, + { url = "https://files.pythonhosted.org/packages/ac/ee/2f4985dbad090ace5ad1f7dd8ff94477fe089b5fab2040bd784a3d5f187b/bcrypt-5.0.0-cp39-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:ddb4e1500f6efdd402218ffe34d040a1196c072e07929b9820f363a1fd1f4191", size = 275290, upload-time = "2025-09-25T19:50:13.673Z" }, + { url = "https://files.pythonhosted.org/packages/e4/6e/b77ade812672d15cf50842e167eead80ac3514f3beacac8902915417f8b7/bcrypt-5.0.0-cp39-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:7aeef54b60ceddb6f30ee3db090351ecf0d40ec6e2abf41430997407a46d2254", size = 278253, upload-time = "2025-09-25T19:50:15.089Z" }, + { url = "https://files.pythonhosted.org/packages/36/c4/ed00ed32f1040f7990dac7115f82273e3c03da1e1a1587a778d8cea496d8/bcrypt-5.0.0-cp39-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:f0ce778135f60799d89c9693b9b398819d15f1921ba15fe719acb3178215a7db", size = 276084, upload-time = "2025-09-25T19:50:16.699Z" }, + { url = "https://files.pythonhosted.org/packages/e7/c4/fa6e16145e145e87f1fa351bbd54b429354fd72145cd3d4e0c5157cf4c70/bcrypt-5.0.0-cp39-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:a71f70ee269671460b37a449f5ff26982a6f2ba493b3eabdd687b4bf35f875ac", size = 297185, upload-time = "2025-09-25T19:50:18.525Z" }, + { url = "https://files.pythonhosted.org/packages/24/b4/11f8a31d8b67cca3371e046db49baa7c0594d71eb40ac8121e2fc0888db0/bcrypt-5.0.0-cp39-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:f8429e1c410b4073944f03bd778a9e066e7fad723564a52ff91841d278dfc822", size = 278656, upload-time = "2025-09-25T19:50:19.809Z" }, + { url = "https://files.pythonhosted.org/packages/ac/31/79f11865f8078e192847d2cb526e3fa27c200933c982c5b2869720fa5fce/bcrypt-5.0.0-cp39-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:edfcdcedd0d0f05850c52ba3127b1fce70b9f89e0fe5ff16517df7e81fa3cbb8", size = 275662, upload-time = "2025-09-25T19:50:21.567Z" }, + { url = "https://files.pythonhosted.org/packages/d4/8d/5e43d9584b3b3591a6f9b68f755a4da879a59712981ef5ad2a0ac1379f7a/bcrypt-5.0.0-cp39-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:611f0a17aa4a25a69362dcc299fda5c8a3d4f160e2abb3831041feb77393a14a", size = 278240, upload-time = "2025-09-25T19:50:23.305Z" }, + { url = "https://files.pythonhosted.org/packages/89/48/44590e3fc158620f680a978aafe8f87a4c4320da81ed11552f0323aa9a57/bcrypt-5.0.0-cp39-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:db99dca3b1fdc3db87d7c57eac0c82281242d1eabf19dcb8a6b10eb29a2e72d1", size = 311152, upload-time = "2025-09-25T19:50:24.597Z" }, + { url = "https://files.pythonhosted.org/packages/5f/85/e4fbfc46f14f47b0d20493669a625da5827d07e8a88ee460af6cd9768b44/bcrypt-5.0.0-cp39-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:5feebf85a9cefda32966d8171f5db7e3ba964b77fdfe31919622256f80f9cf42", size = 313284, upload-time = "2025-09-25T19:50:26.268Z" }, + { url = "https://files.pythonhosted.org/packages/25/ae/479f81d3f4594456a01ea2f05b132a519eff9ab5768a70430fa1132384b1/bcrypt-5.0.0-cp39-abi3-musllinux_1_2_aarch64.whl", hash = 
"sha256:3ca8a166b1140436e058298a34d88032ab62f15aae1c598580333dc21d27ef10", size = 341643, upload-time = "2025-09-25T19:50:28.02Z" }, + { url = "https://files.pythonhosted.org/packages/df/d2/36a086dee1473b14276cd6ea7f61aef3b2648710b5d7f1c9e032c29b859f/bcrypt-5.0.0-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:61afc381250c3182d9078551e3ac3a41da14154fbff647ddf52a769f588c4172", size = 359698, upload-time = "2025-09-25T19:50:31.347Z" }, + { url = "https://files.pythonhosted.org/packages/c0/f6/688d2cd64bfd0b14d805ddb8a565e11ca1fb0fd6817175d58b10052b6d88/bcrypt-5.0.0-cp39-abi3-win32.whl", hash = "sha256:64d7ce196203e468c457c37ec22390f1a61c85c6f0b8160fd752940ccfb3a683", size = 153725, upload-time = "2025-09-25T19:50:34.384Z" }, + { url = "https://files.pythonhosted.org/packages/9f/b9/9d9a641194a730bda138b3dfe53f584d61c58cd5230e37566e83ec2ffa0d/bcrypt-5.0.0-cp39-abi3-win_amd64.whl", hash = "sha256:64ee8434b0da054d830fa8e89e1c8bf30061d539044a39524ff7dec90481e5c2", size = 150912, upload-time = "2025-09-25T19:50:35.69Z" }, + { url = "https://files.pythonhosted.org/packages/27/44/d2ef5e87509158ad2187f4dd0852df80695bb1ee0cfe0a684727b01a69e0/bcrypt-5.0.0-cp39-abi3-win_arm64.whl", hash = "sha256:f2347d3534e76bf50bca5500989d6c1d05ed64b440408057a37673282c654927", size = 144953, upload-time = "2025-09-25T19:50:37.32Z" }, +] + +[[package]] +name = "certifi" +version = "2026.2.25" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/af/2d/7bf41579a8986e348fa033a31cdd0e4121114f6bce2457e8876010b092dd/certifi-2026.2.25.tar.gz", hash = "sha256:e887ab5cee78ea814d3472169153c2d12cd43b14bd03329a39a9c6e2e80bfba7", size = 155029, upload-time = "2026-02-25T02:54:17.342Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/9a/3c/c17fb3ca2d9c3acff52e30b309f538586f9f5b9c9cf454f3845fc9af4881/certifi-2026.2.25-py3-none-any.whl", hash = "sha256:027692e4402ad994f1c42e52a4997a9763c646b73e4096e4d5d6db8af1d6f0fa", size = 153684, upload-time = "2026-02-25T02:54:15.766Z" }, +] + +[[package]] +name = "cffi" +version = "2.0.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pycparser", marker = "implementation_name != 'PyPy'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/eb/56/b1ba7935a17738ae8453301356628e8147c79dbb825bcbc73dc7401f9846/cffi-2.0.0.tar.gz", hash = "sha256:44d1b5909021139fe36001ae048dbdde8214afa20200eda0f64c068cac5d5529", size = 523588, upload-time = "2025-09-08T23:24:04.541Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/4b/8d/a0a47a0c9e413a658623d014e91e74a50cdd2c423f7ccfd44086ef767f90/cffi-2.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:00bdf7acc5f795150faa6957054fbbca2439db2f775ce831222b66f192f03beb", size = 185230, upload-time = "2025-09-08T23:23:00.879Z" }, + { url = "https://files.pythonhosted.org/packages/4a/d2/a6c0296814556c68ee32009d9c2ad4f85f2707cdecfd7727951ec228005d/cffi-2.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:45d5e886156860dc35862657e1494b9bae8dfa63bf56796f2fb56e1679fc0bca", size = 181043, upload-time = "2025-09-08T23:23:02.231Z" }, + { url = "https://files.pythonhosted.org/packages/b0/1e/d22cc63332bd59b06481ceaac49d6c507598642e2230f201649058a7e704/cffi-2.0.0-cp313-cp313-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:07b271772c100085dd28b74fa0cd81c8fb1a3ba18b21e03d7c27f3436a10606b", size = 212446, upload-time = "2025-09-08T23:23:03.472Z" }, + { url = 
"https://files.pythonhosted.org/packages/a9/f5/a2c23eb03b61a0b8747f211eb716446c826ad66818ddc7810cc2cc19b3f2/cffi-2.0.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:d48a880098c96020b02d5a1f7d9251308510ce8858940e6fa99ece33f610838b", size = 220101, upload-time = "2025-09-08T23:23:04.792Z" }, + { url = "https://files.pythonhosted.org/packages/f2/7f/e6647792fc5850d634695bc0e6ab4111ae88e89981d35ac269956605feba/cffi-2.0.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:f93fd8e5c8c0a4aa1f424d6173f14a892044054871c771f8566e4008eaa359d2", size = 207948, upload-time = "2025-09-08T23:23:06.127Z" }, + { url = "https://files.pythonhosted.org/packages/cb/1e/a5a1bd6f1fb30f22573f76533de12a00bf274abcdc55c8edab639078abb6/cffi-2.0.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:dd4f05f54a52fb558f1ba9f528228066954fee3ebe629fc1660d874d040ae5a3", size = 206422, upload-time = "2025-09-08T23:23:07.753Z" }, + { url = "https://files.pythonhosted.org/packages/98/df/0a1755e750013a2081e863e7cd37e0cdd02664372c754e5560099eb7aa44/cffi-2.0.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c8d3b5532fc71b7a77c09192b4a5a200ea992702734a2e9279a37f2478236f26", size = 219499, upload-time = "2025-09-08T23:23:09.648Z" }, + { url = "https://files.pythonhosted.org/packages/50/e1/a969e687fcf9ea58e6e2a928ad5e2dd88cc12f6f0ab477e9971f2309b57c/cffi-2.0.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:d9b29c1f0ae438d5ee9acb31cadee00a58c46cc9c0b2f9038c6b0b3470877a8c", size = 222928, upload-time = "2025-09-08T23:23:10.928Z" }, + { url = "https://files.pythonhosted.org/packages/36/54/0362578dd2c9e557a28ac77698ed67323ed5b9775ca9d3fe73fe191bb5d8/cffi-2.0.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6d50360be4546678fc1b79ffe7a66265e28667840010348dd69a314145807a1b", size = 221302, upload-time = "2025-09-08T23:23:12.42Z" }, + { url = "https://files.pythonhosted.org/packages/eb/6d/bf9bda840d5f1dfdbf0feca87fbdb64a918a69bca42cfa0ba7b137c48cb8/cffi-2.0.0-cp313-cp313-win32.whl", hash = "sha256:74a03b9698e198d47562765773b4a8309919089150a0bb17d829ad7b44b60d27", size = 172909, upload-time = "2025-09-08T23:23:14.32Z" }, + { url = "https://files.pythonhosted.org/packages/37/18/6519e1ee6f5a1e579e04b9ddb6f1676c17368a7aba48299c3759bbc3c8b3/cffi-2.0.0-cp313-cp313-win_amd64.whl", hash = "sha256:19f705ada2530c1167abacb171925dd886168931e0a7b78f5bffcae5c6b5be75", size = 183402, upload-time = "2025-09-08T23:23:15.535Z" }, + { url = "https://files.pythonhosted.org/packages/cb/0e/02ceeec9a7d6ee63bb596121c2c8e9b3a9e150936f4fbef6ca1943e6137c/cffi-2.0.0-cp313-cp313-win_arm64.whl", hash = "sha256:256f80b80ca3853f90c21b23ee78cd008713787b1b1e93eae9f3d6a7134abd91", size = 177780, upload-time = "2025-09-08T23:23:16.761Z" }, + { url = "https://files.pythonhosted.org/packages/92/c4/3ce07396253a83250ee98564f8d7e9789fab8e58858f35d07a9a2c78de9f/cffi-2.0.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:fc33c5141b55ed366cfaad382df24fe7dcbc686de5be719b207bb248e3053dc5", size = 185320, upload-time = "2025-09-08T23:23:18.087Z" }, + { url = "https://files.pythonhosted.org/packages/59/dd/27e9fa567a23931c838c6b02d0764611c62290062a6d4e8ff7863daf9730/cffi-2.0.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c654de545946e0db659b3400168c9ad31b5d29593291482c43e3564effbcee13", size = 181487, upload-time = "2025-09-08T23:23:19.622Z" }, + { url = 
"https://files.pythonhosted.org/packages/d6/43/0e822876f87ea8a4ef95442c3d766a06a51fc5298823f884ef87aaad168c/cffi-2.0.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:24b6f81f1983e6df8db3adc38562c83f7d4a0c36162885ec7f7b77c7dcbec97b", size = 220049, upload-time = "2025-09-08T23:23:20.853Z" }, + { url = "https://files.pythonhosted.org/packages/b4/89/76799151d9c2d2d1ead63c2429da9ea9d7aac304603de0c6e8764e6e8e70/cffi-2.0.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:12873ca6cb9b0f0d3a0da705d6086fe911591737a59f28b7936bdfed27c0d47c", size = 207793, upload-time = "2025-09-08T23:23:22.08Z" }, + { url = "https://files.pythonhosted.org/packages/bb/dd/3465b14bb9e24ee24cb88c9e3730f6de63111fffe513492bf8c808a3547e/cffi-2.0.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:d9b97165e8aed9272a6bb17c01e3cc5871a594a446ebedc996e2397a1c1ea8ef", size = 206300, upload-time = "2025-09-08T23:23:23.314Z" }, + { url = "https://files.pythonhosted.org/packages/47/d9/d83e293854571c877a92da46fdec39158f8d7e68da75bf73581225d28e90/cffi-2.0.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:afb8db5439b81cf9c9d0c80404b60c3cc9c3add93e114dcae767f1477cb53775", size = 219244, upload-time = "2025-09-08T23:23:24.541Z" }, + { url = "https://files.pythonhosted.org/packages/2b/0f/1f177e3683aead2bb00f7679a16451d302c436b5cbf2505f0ea8146ef59e/cffi-2.0.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:737fe7d37e1a1bffe70bd5754ea763a62a066dc5913ca57e957824b72a85e205", size = 222828, upload-time = "2025-09-08T23:23:26.143Z" }, + { url = "https://files.pythonhosted.org/packages/c6/0f/cafacebd4b040e3119dcb32fed8bdef8dfe94da653155f9d0b9dc660166e/cffi-2.0.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:38100abb9d1b1435bc4cc340bb4489635dc2f0da7456590877030c9b3d40b0c1", size = 220926, upload-time = "2025-09-08T23:23:27.873Z" }, + { url = "https://files.pythonhosted.org/packages/3e/aa/df335faa45b395396fcbc03de2dfcab242cd61a9900e914fe682a59170b1/cffi-2.0.0-cp314-cp314-win32.whl", hash = "sha256:087067fa8953339c723661eda6b54bc98c5625757ea62e95eb4898ad5e776e9f", size = 175328, upload-time = "2025-09-08T23:23:44.61Z" }, + { url = "https://files.pythonhosted.org/packages/bb/92/882c2d30831744296ce713f0feb4c1cd30f346ef747b530b5318715cc367/cffi-2.0.0-cp314-cp314-win_amd64.whl", hash = "sha256:203a48d1fb583fc7d78a4c6655692963b860a417c0528492a6bc21f1aaefab25", size = 185650, upload-time = "2025-09-08T23:23:45.848Z" }, + { url = "https://files.pythonhosted.org/packages/9f/2c/98ece204b9d35a7366b5b2c6539c350313ca13932143e79dc133ba757104/cffi-2.0.0-cp314-cp314-win_arm64.whl", hash = "sha256:dbd5c7a25a7cb98f5ca55d258b103a2054f859a46ae11aaf23134f9cc0d356ad", size = 180687, upload-time = "2025-09-08T23:23:47.105Z" }, + { url = "https://files.pythonhosted.org/packages/3e/61/c768e4d548bfa607abcda77423448df8c471f25dbe64fb2ef6d555eae006/cffi-2.0.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:9a67fc9e8eb39039280526379fb3a70023d77caec1852002b4da7e8b270c4dd9", size = 188773, upload-time = "2025-09-08T23:23:29.347Z" }, + { url = "https://files.pythonhosted.org/packages/2c/ea/5f76bce7cf6fcd0ab1a1058b5af899bfbef198bea4d5686da88471ea0336/cffi-2.0.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:7a66c7204d8869299919db4d5069a82f1561581af12b11b3c9f48c584eb8743d", size = 185013, upload-time = "2025-09-08T23:23:30.63Z" }, + { url = 
"https://files.pythonhosted.org/packages/be/b4/c56878d0d1755cf9caa54ba71e5d049479c52f9e4afc230f06822162ab2f/cffi-2.0.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:7cc09976e8b56f8cebd752f7113ad07752461f48a58cbba644139015ac24954c", size = 221593, upload-time = "2025-09-08T23:23:31.91Z" }, + { url = "https://files.pythonhosted.org/packages/e0/0d/eb704606dfe8033e7128df5e90fee946bbcb64a04fcdaa97321309004000/cffi-2.0.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:92b68146a71df78564e4ef48af17551a5ddd142e5190cdf2c5624d0c3ff5b2e8", size = 209354, upload-time = "2025-09-08T23:23:33.214Z" }, + { url = "https://files.pythonhosted.org/packages/d8/19/3c435d727b368ca475fb8742ab97c9cb13a0de600ce86f62eab7fa3eea60/cffi-2.0.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:b1e74d11748e7e98e2f426ab176d4ed720a64412b6a15054378afdb71e0f37dc", size = 208480, upload-time = "2025-09-08T23:23:34.495Z" }, + { url = "https://files.pythonhosted.org/packages/d0/44/681604464ed9541673e486521497406fadcc15b5217c3e326b061696899a/cffi-2.0.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:28a3a209b96630bca57cce802da70c266eb08c6e97e5afd61a75611ee6c64592", size = 221584, upload-time = "2025-09-08T23:23:36.096Z" }, + { url = "https://files.pythonhosted.org/packages/25/8e/342a504ff018a2825d395d44d63a767dd8ebc927ebda557fecdaca3ac33a/cffi-2.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:7553fb2090d71822f02c629afe6042c299edf91ba1bf94951165613553984512", size = 224443, upload-time = "2025-09-08T23:23:37.328Z" }, + { url = "https://files.pythonhosted.org/packages/e1/5e/b666bacbbc60fbf415ba9988324a132c9a7a0448a9a8f125074671c0f2c3/cffi-2.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:6c6c373cfc5c83a975506110d17457138c8c63016b563cc9ed6e056a82f13ce4", size = 223437, upload-time = "2025-09-08T23:23:38.945Z" }, + { url = "https://files.pythonhosted.org/packages/a0/1d/ec1a60bd1a10daa292d3cd6bb0b359a81607154fb8165f3ec95fe003b85c/cffi-2.0.0-cp314-cp314t-win32.whl", hash = "sha256:1fc9ea04857caf665289b7a75923f2c6ed559b8298a1b8c49e59f7dd95c8481e", size = 180487, upload-time = "2025-09-08T23:23:40.423Z" }, + { url = "https://files.pythonhosted.org/packages/bf/41/4c1168c74fac325c0c8156f04b6749c8b6a8f405bbf91413ba088359f60d/cffi-2.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:d68b6cef7827e8641e8ef16f4494edda8b36104d79773a334beaa1e3521430f6", size = 191726, upload-time = "2025-09-08T23:23:41.742Z" }, + { url = "https://files.pythonhosted.org/packages/ae/3a/dbeec9d1ee0844c679f6bb5d6ad4e9f198b1224f4e7a32825f47f6192b0c/cffi-2.0.0-cp314-cp314t-win_arm64.whl", hash = "sha256:0a1527a803f0a659de1af2e1fd700213caba79377e27e4693648c2923da066f9", size = 184195, upload-time = "2025-09-08T23:23:43.004Z" }, +] + +[[package]] +name = "cfgv" +version = "3.5.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/4e/b5/721b8799b04bf9afe054a3899c6cf4e880fcf8563cc71c15610242490a0c/cfgv-3.5.0.tar.gz", hash = "sha256:d5b1034354820651caa73ede66a6294d6e95c1b00acc5e9b098e917404669132", size = 7334, upload-time = "2025-11-19T20:55:51.612Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/db/3c/33bac158f8ab7f89b2e59426d5fe2e4f63f7ed25df84c036890172b412b5/cfgv-3.5.0-py2.py3-none-any.whl", hash = "sha256:a8dc6b26ad22ff227d2634a65cb388215ce6cc96bbcc5cfde7641ae87e8dacc0", size = 7445, upload-time = "2025-11-19T20:55:50.744Z" }, +] + +[[package]] +name = 
"charset-normalizer" +version = "3.4.6" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/7b/60/e3bec1881450851b087e301bedc3daa9377a4d45f1c26aa90b0b235e38aa/charset_normalizer-3.4.6.tar.gz", hash = "sha256:1ae6b62897110aa7c79ea2f5dd38d1abca6db663687c0b1ad9aed6f6bae3d9d6", size = 143363, upload-time = "2026-03-15T18:53:25.478Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/1e/1d/4fdabeef4e231153b6ed7567602f3b68265ec4e5b76d6024cf647d43d981/charset_normalizer-3.4.6-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:11afb56037cbc4b1555a34dd69151e8e069bee82e613a73bef6e714ce733585f", size = 294823, upload-time = "2026-03-15T18:51:15.755Z" }, + { url = "https://files.pythonhosted.org/packages/47/7b/20e809b89c69d37be748d98e84dce6820bf663cf19cf6b942c951a3e8f41/charset_normalizer-3.4.6-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:423fb7e748a08f854a08a222b983f4df1912b1daedce51a72bd24fe8f26a1843", size = 198527, upload-time = "2026-03-15T18:51:17.177Z" }, + { url = "https://files.pythonhosted.org/packages/37/a6/4f8d27527d59c039dce6f7622593cdcd3d70a8504d87d09eb11e9fdc6062/charset_normalizer-3.4.6-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:d73beaac5e90173ac3deb9928a74763a6d230f494e4bfb422c217a0ad8e629bf", size = 218388, upload-time = "2026-03-15T18:51:18.934Z" }, + { url = "https://files.pythonhosted.org/packages/f6/9b/4770ccb3e491a9bacf1c46cc8b812214fe367c86a96353ccc6daf87b01ec/charset_normalizer-3.4.6-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:d60377dce4511655582e300dc1e5a5f24ba0cb229005a1d5c8d0cb72bb758ab8", size = 214563, upload-time = "2026-03-15T18:51:20.374Z" }, + { url = "https://files.pythonhosted.org/packages/2b/58/a199d245894b12db0b957d627516c78e055adc3a0d978bc7f65ddaf7c399/charset_normalizer-3.4.6-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:530e8cebeea0d76bdcf93357aa5e41336f48c3dc709ac52da2bb167c5b8271d9", size = 206587, upload-time = "2026-03-15T18:51:21.807Z" }, + { url = "https://files.pythonhosted.org/packages/7e/70/3def227f1ec56f5c69dfc8392b8bd63b11a18ca8178d9211d7cc5e5e4f27/charset_normalizer-3.4.6-cp313-cp313-manylinux_2_31_armv7l.whl", hash = "sha256:a26611d9987b230566f24a0a125f17fe0de6a6aff9f25c9f564aaa2721a5fb88", size = 194724, upload-time = "2026-03-15T18:51:23.508Z" }, + { url = "https://files.pythonhosted.org/packages/58/ab/9318352e220c05efd31c2779a23b50969dc94b985a2efa643ed9077bfca5/charset_normalizer-3.4.6-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:34315ff4fc374b285ad7f4a0bf7dcbfe769e1b104230d40f49f700d4ab6bbd84", size = 202956, upload-time = "2026-03-15T18:51:25.239Z" }, + { url = "https://files.pythonhosted.org/packages/75/13/f3550a3ac25b70f87ac98c40d3199a8503676c2f1620efbf8d42095cfc40/charset_normalizer-3.4.6-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:5f8ddd609f9e1af8c7bd6e2aca279c931aefecd148a14402d4e368f3171769fd", size = 201923, upload-time = "2026-03-15T18:51:26.682Z" }, + { url = "https://files.pythonhosted.org/packages/1b/db/c5c643b912740b45e8eec21de1bbab8e7fc085944d37e1e709d3dcd9d72f/charset_normalizer-3.4.6-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:80d0a5615143c0b3225e5e3ef22c8d5d51f3f72ce0ea6fb84c943546c7b25b6c", size = 195366, upload-time = "2026-03-15T18:51:28.129Z" }, + { url = 
"https://files.pythonhosted.org/packages/5a/67/3b1c62744f9b2448443e0eb160d8b001c849ec3fef591e012eda6484787c/charset_normalizer-3.4.6-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:92734d4d8d187a354a556626c221cd1a892a4e0802ccb2af432a1d85ec012194", size = 219752, upload-time = "2026-03-15T18:51:29.556Z" }, + { url = "https://files.pythonhosted.org/packages/f6/98/32ffbaf7f0366ffb0445930b87d103f6b406bc2c271563644bde8a2b1093/charset_normalizer-3.4.6-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:613f19aa6e082cf96e17e3ffd89383343d0d589abda756b7764cf78361fd41dc", size = 203296, upload-time = "2026-03-15T18:51:30.921Z" }, + { url = "https://files.pythonhosted.org/packages/41/12/5d308c1bbe60cabb0c5ef511574a647067e2a1f631bc8634fcafaccd8293/charset_normalizer-3.4.6-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:2b1a63e8224e401cafe7739f77efd3f9e7f5f2026bda4aead8e59afab537784f", size = 215956, upload-time = "2026-03-15T18:51:32.399Z" }, + { url = "https://files.pythonhosted.org/packages/53/e9/5f85f6c5e20669dbe56b165c67b0260547dea97dba7e187938833d791687/charset_normalizer-3.4.6-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6cceb5473417d28edd20c6c984ab6fee6c6267d38d906823ebfe20b03d607dc2", size = 208652, upload-time = "2026-03-15T18:51:34.214Z" }, + { url = "https://files.pythonhosted.org/packages/f1/11/897052ea6af56df3eef3ca94edafee410ca699ca0c7b87960ad19932c55e/charset_normalizer-3.4.6-cp313-cp313-win32.whl", hash = "sha256:d7de2637729c67d67cf87614b566626057e95c303bc0a55ffe391f5205e7003d", size = 143940, upload-time = "2026-03-15T18:51:36.15Z" }, + { url = "https://files.pythonhosted.org/packages/a1/5c/724b6b363603e419829f561c854b87ed7c7e31231a7908708ac086cdf3e2/charset_normalizer-3.4.6-cp313-cp313-win_amd64.whl", hash = "sha256:572d7c822caf521f0525ba1bce1a622a0b85cf47ffbdae6c9c19e3b5ac3c4389", size = 154101, upload-time = "2026-03-15T18:51:37.876Z" }, + { url = "https://files.pythonhosted.org/packages/01/a5/7abf15b4c0968e47020f9ca0935fb3274deb87cb288cd187cad92e8cdffd/charset_normalizer-3.4.6-cp313-cp313-win_arm64.whl", hash = "sha256:a4474d924a47185a06411e0064b803c68be044be2d60e50e8bddcc2649957c1f", size = 143109, upload-time = "2026-03-15T18:51:39.565Z" }, + { url = "https://files.pythonhosted.org/packages/25/6f/ffe1e1259f384594063ea1869bfb6be5cdb8bc81020fc36c3636bc8302a1/charset_normalizer-3.4.6-cp314-cp314-macosx_10_15_universal2.whl", hash = "sha256:9cc6e6d9e571d2f863fa77700701dae73ed5f78881efc8b3f9a4398772ff53e8", size = 294458, upload-time = "2026-03-15T18:51:41.134Z" }, + { url = "https://files.pythonhosted.org/packages/56/60/09bb6c13a8c1016c2ed5c6a6488e4ffef506461aa5161662bd7636936fb1/charset_normalizer-3.4.6-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ef5960d965e67165d75b7c7ffc60a83ec5abfc5c11b764ec13ea54fbef8b4421", size = 199277, upload-time = "2026-03-15T18:51:42.953Z" }, + { url = "https://files.pythonhosted.org/packages/00/50/dcfbb72a5138bbefdc3332e8d81a23494bf67998b4b100703fd15fa52d81/charset_normalizer-3.4.6-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:b3694e3f87f8ac7ce279d4355645b3c878d24d1424581b46282f24b92f5a4ae2", size = 218758, upload-time = "2026-03-15T18:51:44.339Z" }, + { url = "https://files.pythonhosted.org/packages/03/b3/d79a9a191bb75f5aa81f3aaaa387ef29ce7cb7a9e5074ba8ea095cc073c2/charset_normalizer-3.4.6-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = 
"sha256:5d11595abf8dd942a77883a39d81433739b287b6aa71620f15164f8096221b30", size = 215299, upload-time = "2026-03-15T18:51:45.871Z" }, + { url = "https://files.pythonhosted.org/packages/76/7e/bc8911719f7084f72fd545f647601ea3532363927f807d296a8c88a62c0d/charset_normalizer-3.4.6-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:7bda6eebafd42133efdca535b04ccb338ab29467b3f7bf79569883676fc628db", size = 206811, upload-time = "2026-03-15T18:51:47.308Z" }, + { url = "https://files.pythonhosted.org/packages/e2/40/c430b969d41dda0c465aa36cc7c2c068afb67177bef50905ac371b28ccc7/charset_normalizer-3.4.6-cp314-cp314-manylinux_2_31_armv7l.whl", hash = "sha256:bbc8c8650c6e51041ad1be191742b8b421d05bbd3410f43fa2a00c8db87678e8", size = 193706, upload-time = "2026-03-15T18:51:48.849Z" }, + { url = "https://files.pythonhosted.org/packages/48/15/e35e0590af254f7df984de1323640ef375df5761f615b6225ba8deb9799a/charset_normalizer-3.4.6-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:22c6f0c2fbc31e76c3b8a86fba1a56eda6166e238c29cdd3d14befdb4a4e4815", size = 202706, upload-time = "2026-03-15T18:51:50.257Z" }, + { url = "https://files.pythonhosted.org/packages/5e/bd/f736f7b9cc5e93a18b794a50346bb16fbfd6b37f99e8f306f7951d27c17c/charset_normalizer-3.4.6-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:7edbed096e4a4798710ed6bc75dcaa2a21b68b6c356553ac4823c3658d53743a", size = 202497, upload-time = "2026-03-15T18:51:52.012Z" }, + { url = "https://files.pythonhosted.org/packages/9d/ba/2cc9e3e7dfdf7760a6ed8da7446d22536f3d0ce114ac63dee2a5a3599e62/charset_normalizer-3.4.6-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:7f9019c9cb613f084481bd6a100b12e1547cf2efe362d873c2e31e4035a6fa43", size = 193511, upload-time = "2026-03-15T18:51:53.723Z" }, + { url = "https://files.pythonhosted.org/packages/9e/cb/5be49b5f776e5613be07298c80e1b02a2d900f7a7de807230595c85a8b2e/charset_normalizer-3.4.6-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:58c948d0d086229efc484fe2f30c2d382c86720f55cd9bc33591774348ad44e0", size = 220133, upload-time = "2026-03-15T18:51:55.333Z" }, + { url = "https://files.pythonhosted.org/packages/83/43/99f1b5dad345accb322c80c7821071554f791a95ee50c1c90041c157ae99/charset_normalizer-3.4.6-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:419a9d91bd238052642a51938af8ac05da5b3343becde08d5cdeab9046df9ee1", size = 203035, upload-time = "2026-03-15T18:51:56.736Z" }, + { url = "https://files.pythonhosted.org/packages/87/9a/62c2cb6a531483b55dddff1a68b3d891a8b498f3ca555fbcf2978e804d9d/charset_normalizer-3.4.6-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:5273b9f0b5835ff0350c0828faea623c68bfa65b792720c453e22b25cc72930f", size = 216321, upload-time = "2026-03-15T18:51:58.17Z" }, + { url = "https://files.pythonhosted.org/packages/6e/79/94a010ff81e3aec7c293eb82c28f930918e517bc144c9906a060844462eb/charset_normalizer-3.4.6-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:0e901eb1049fdb80f5bd11ed5ea1e498ec423102f7a9b9e4645d5b8204ff2815", size = 208973, upload-time = "2026-03-15T18:51:59.998Z" }, + { url = "https://files.pythonhosted.org/packages/2a/57/4ecff6d4ec8585342f0c71bc03efaa99cb7468f7c91a57b105bcd561cea8/charset_normalizer-3.4.6-cp314-cp314-win32.whl", hash = "sha256:b4ff1d35e8c5bd078be89349b6f3a845128e685e751b6ea1169cf2160b344c4d", size = 144610, upload-time = "2026-03-15T18:52:02.213Z" }, + { url = 
"https://files.pythonhosted.org/packages/80/94/8434a02d9d7f168c25767c64671fead8d599744a05d6a6c877144c754246/charset_normalizer-3.4.6-cp314-cp314-win_amd64.whl", hash = "sha256:74119174722c4349af9708993118581686f343adc1c8c9c007d59be90d077f3f", size = 154962, upload-time = "2026-03-15T18:52:03.658Z" }, + { url = "https://files.pythonhosted.org/packages/46/4c/48f2cdbfd923026503dfd67ccea45c94fd8fe988d9056b468579c66ed62b/charset_normalizer-3.4.6-cp314-cp314-win_arm64.whl", hash = "sha256:e5bcc1a1ae744e0bb59641171ae53743760130600da8db48cbb6e4918e186e4e", size = 143595, upload-time = "2026-03-15T18:52:05.123Z" }, + { url = "https://files.pythonhosted.org/packages/31/93/8878be7569f87b14f1d52032946131bcb6ebbd8af3e20446bc04053dc3f1/charset_normalizer-3.4.6-cp314-cp314t-macosx_10_15_universal2.whl", hash = "sha256:ad8faf8df23f0378c6d527d8b0b15ea4a2e23c89376877c598c4870d1b2c7866", size = 314828, upload-time = "2026-03-15T18:52:06.831Z" }, + { url = "https://files.pythonhosted.org/packages/06/b6/fae511ca98aac69ecc35cde828b0a3d146325dd03d99655ad38fc2cc3293/charset_normalizer-3.4.6-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f5ea69428fa1b49573eef0cc44a1d43bebd45ad0c611eb7d7eac760c7ae771bc", size = 208138, upload-time = "2026-03-15T18:52:08.239Z" }, + { url = "https://files.pythonhosted.org/packages/54/57/64caf6e1bf07274a1e0b7c160a55ee9e8c9ec32c46846ce59b9c333f7008/charset_normalizer-3.4.6-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:06a7e86163334edfc5d20fe104db92fcd666e5a5df0977cb5680a506fe26cc8e", size = 224679, upload-time = "2026-03-15T18:52:10.043Z" }, + { url = "https://files.pythonhosted.org/packages/aa/cb/9ff5a25b9273ef160861b41f6937f86fae18b0792fe0a8e75e06acb08f1d/charset_normalizer-3.4.6-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:e1f6e2f00a6b8edb562826e4632e26d063ac10307e80f7461f7de3ad8ef3f077", size = 223475, upload-time = "2026-03-15T18:52:11.854Z" }, + { url = "https://files.pythonhosted.org/packages/fc/97/440635fc093b8d7347502a377031f9605a1039c958f3cd18dcacffb37743/charset_normalizer-3.4.6-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:95b52c68d64c1878818687a473a10547b3292e82b6f6fe483808fb1468e2f52f", size = 215230, upload-time = "2026-03-15T18:52:13.325Z" }, + { url = "https://files.pythonhosted.org/packages/cd/24/afff630feb571a13f07c8539fbb502d2ab494019492aaffc78ef41f1d1d0/charset_normalizer-3.4.6-cp314-cp314t-manylinux_2_31_armv7l.whl", hash = "sha256:7504e9b7dc05f99a9bbb4525c67a2c155073b44d720470a148b34166a69c054e", size = 199045, upload-time = "2026-03-15T18:52:14.752Z" }, + { url = "https://files.pythonhosted.org/packages/e5/17/d1399ecdaf7e0498c327433e7eefdd862b41236a7e484355b8e0e5ebd64b/charset_normalizer-3.4.6-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:172985e4ff804a7ad08eebec0a1640ece87ba5041d565fff23c8f99c1f389484", size = 211658, upload-time = "2026-03-15T18:52:16.278Z" }, + { url = "https://files.pythonhosted.org/packages/b5/38/16baa0affb957b3d880e5ac2144caf3f9d7de7bc4a91842e447fbb5e8b67/charset_normalizer-3.4.6-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:4be9f4830ba8741527693848403e2c457c16e499100963ec711b1c6f2049b7c7", size = 210769, upload-time = "2026-03-15T18:52:17.782Z" }, + { url = 
"https://files.pythonhosted.org/packages/05/34/c531bc6ac4c21da9ddfddb3107be2287188b3ea4b53b70fc58f2a77ac8d8/charset_normalizer-3.4.6-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:79090741d842f564b1b2827c0b82d846405b744d31e84f18d7a7b41c20e473ff", size = 201328, upload-time = "2026-03-15T18:52:19.553Z" }, + { url = "https://files.pythonhosted.org/packages/fa/73/a5a1e9ca5f234519c1953608a03fe109c306b97fdfb25f09182babad51a7/charset_normalizer-3.4.6-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:87725cfb1a4f1f8c2fc9890ae2f42094120f4b44db9360be5d99a4c6b0e03a9e", size = 225302, upload-time = "2026-03-15T18:52:21.043Z" }, + { url = "https://files.pythonhosted.org/packages/ba/f6/cd782923d112d296294dea4bcc7af5a7ae0f86ab79f8fefbda5526b6cfc0/charset_normalizer-3.4.6-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:fcce033e4021347d80ed9c66dcf1e7b1546319834b74445f561d2e2221de5659", size = 211127, upload-time = "2026-03-15T18:52:22.491Z" }, + { url = "https://files.pythonhosted.org/packages/0e/c5/0b6898950627af7d6103a449b22320372c24c6feda91aa24e201a478d161/charset_normalizer-3.4.6-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:ca0276464d148c72defa8bb4390cce01b4a0e425f3b50d1435aa6d7a18107602", size = 222840, upload-time = "2026-03-15T18:52:24.113Z" }, + { url = "https://files.pythonhosted.org/packages/7d/25/c4bba773bef442cbdc06111d40daa3de5050a676fa26e85090fc54dd12f0/charset_normalizer-3.4.6-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:197c1a244a274bb016dd8b79204850144ef77fe81c5b797dc389327adb552407", size = 216890, upload-time = "2026-03-15T18:52:25.541Z" }, + { url = "https://files.pythonhosted.org/packages/35/1a/05dacadb0978da72ee287b0143097db12f2e7e8d3ffc4647da07a383b0b7/charset_normalizer-3.4.6-cp314-cp314t-win32.whl", hash = "sha256:2a24157fa36980478dd1770b585c0f30d19e18f4fb0c47c13aa568f871718579", size = 155379, upload-time = "2026-03-15T18:52:27.05Z" }, + { url = "https://files.pythonhosted.org/packages/5d/7a/d269d834cb3a76291651256f3b9a5945e81d0a49ab9f4a498964e83c0416/charset_normalizer-3.4.6-cp314-cp314t-win_amd64.whl", hash = "sha256:cd5e2801c89992ed8c0a3f0293ae83c159a60d9a5d685005383ef4caca77f2c4", size = 169043, upload-time = "2026-03-15T18:52:28.502Z" }, + { url = "https://files.pythonhosted.org/packages/23/06/28b29fba521a37a8932c6a84192175c34d49f84a6d4773fa63d05f9aff22/charset_normalizer-3.4.6-cp314-cp314t-win_arm64.whl", hash = "sha256:47955475ac79cc504ef2704b192364e51d0d473ad452caedd0002605f780101c", size = 148523, upload-time = "2026-03-15T18:52:29.956Z" }, + { url = "https://files.pythonhosted.org/packages/2a/68/687187c7e26cb24ccbd88e5069f5ef00eba804d36dde11d99aad0838ab45/charset_normalizer-3.4.6-py3-none-any.whl", hash = "sha256:947cf925bc916d90adba35a64c82aace04fa39b46b52d4630ece166655905a69", size = 61455, upload-time = "2026-03-15T18:53:23.833Z" }, +] + +[[package]] +name = "click" +version = "8.3.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "colorama", marker = "sys_platform == 'win32'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/3d/fa/656b739db8587d7b5dfa22e22ed02566950fbfbcdc20311993483657a5c0/click-8.3.1.tar.gz", hash = "sha256:12ff4785d337a1bb490bb7e9c2b1ee5da3112e94a8622f26a6c77f5d2fc6842a", size = 295065, upload-time = "2025-11-15T20:45:42.706Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/98/78/01c019cdb5d6498122777c1a43056ebb3ebfeef2076d9d026bfe15583b2b/click-8.3.1-py3-none-any.whl", hash = 
"sha256:981153a64e25f12d547d3426c367a4857371575ee7ad18df2a6183ab0545b2a6", size = 108274, upload-time = "2025-11-15T20:45:41.139Z" }, +] + +[[package]] +name = "cmdb" +version = "0.1.0" +source = { virtual = "." } + +[package.dev-dependencies] +dev = [ + { name = "fakeredis" }, + { name = "pre-commit" }, + { name = "pytest" }, + { name = "pytest-asyncio" }, + { name = "pytest-httpx" }, + { name = "ruff" }, + { name = "testcontainers", extra = ["redis"] }, +] + +[package.metadata] + +[package.metadata.requires-dev] +dev = [ + { name = "fakeredis" }, + { name = "pre-commit" }, + { name = "pytest" }, + { name = "pytest-asyncio" }, + { name = "pytest-httpx" }, + { name = "ruff" }, + { name = "testcontainers", extras = ["postgresql", "kafka", "redis"] }, +] + +[[package]] +name = "cmdb-auth" +version = "0.1.0" +source = { editable = "services/auth" } +dependencies = [ + { name = "alembic" }, + { name = "asyncpg" }, + { name = "bcrypt" }, + { name = "cmdb-shared" }, + { name = "cryptography" }, + { name = "fastapi" }, + { name = "pydantic-settings" }, + { name = "pyjwt" }, + { name = "redis" }, + { name = "sqlalchemy", extra = ["asyncio"] }, + { name = "uvicorn" }, +] + +[package.metadata] +requires-dist = [ + { name = "alembic" }, + { name = "asyncpg" }, + { name = "bcrypt", specifier = ">=4.0" }, + { name = "cmdb-shared", editable = "shared" }, + { name = "cryptography", specifier = ">=42.0" }, + { name = "fastapi", specifier = ">=0.115" }, + { name = "pydantic-settings", specifier = ">=2.0" }, + { name = "pyjwt", specifier = ">=2.0" }, + { name = "redis", specifier = ">=5.0" }, + { name = "sqlalchemy", extras = ["asyncio"], specifier = ">=2.0" }, + { name = "uvicorn" }, +] + +[[package]] +name = "cmdb-event" +version = "0.1.0" +source = { editable = "services/event" } +dependencies = [ + { name = "aiokafka" }, + { name = "alembic" }, + { name = "asyncpg" }, + { name = "cmdb-shared" }, + { name = "fastapi" }, + { name = "pydantic-settings" }, + { name = "sqlalchemy", extra = ["asyncio"] }, + { name = "uvicorn" }, +] + +[package.metadata] +requires-dist = [ + { name = "aiokafka" }, + { name = "alembic" }, + { name = "asyncpg" }, + { name = "cmdb-shared", editable = "shared" }, + { name = "fastapi", specifier = ">=0.115" }, + { name = "pydantic-settings", specifier = ">=2.0" }, + { name = "sqlalchemy", extras = ["asyncio"], specifier = ">=2.0" }, + { name = "uvicorn" }, +] + +[[package]] +name = "cmdb-ipam" +version = "0.1.0" +source = { editable = "services/ipam" } +dependencies = [ + { name = "aiokafka" }, + { name = "alembic" }, + { name = "asyncpg" }, + { name = "cmdb-shared" }, + { name = "fastapi" }, + { name = "jinja2" }, + { name = "pydantic-settings" }, + { name = "python-multipart" }, + { name = "pyyaml" }, + { name = "redis" }, + { name = "sqlalchemy", extra = ["asyncio"] }, + { name = "strawberry-graphql", extra = ["fastapi"] }, + { name = "uvicorn" }, +] + +[package.metadata] +requires-dist = [ + { name = "aiokafka" }, + { name = "alembic" }, + { name = "asyncpg" }, + { name = "cmdb-shared", editable = "shared" }, + { name = "fastapi", specifier = ">=0.115" }, + { name = "jinja2", specifier = ">=3.1" }, + { name = "pydantic-settings", specifier = ">=2.0" }, + { name = "python-multipart" }, + { name = "pyyaml", specifier = ">=6.0" }, + { name = "redis", specifier = ">=5.0" }, + { name = "sqlalchemy", extras = ["asyncio"], specifier = ">=2.0" }, + { name = "strawberry-graphql", extras = ["fastapi"] }, + { name = "uvicorn" }, +] + +[[package]] +name = "cmdb-shared" +version = "0.1.0" 
+source = { editable = "shared" } +dependencies = [ + { name = "aiokafka" }, + { name = "alembic" }, + { name = "fastapi" }, + { name = "pydantic" }, + { name = "redis" }, + { name = "sqlalchemy" }, +] + +[package.metadata] +requires-dist = [ + { name = "aiokafka" }, + { name = "alembic" }, + { name = "fastapi", specifier = ">=0.115" }, + { name = "pydantic", specifier = ">=2.0" }, + { name = "redis" }, + { name = "sqlalchemy", specifier = ">=2.0" }, +] + +[[package]] +name = "cmdb-tenant" +version = "0.1.0" +source = { editable = "services/tenant" } +dependencies = [ + { name = "aiokafka" }, + { name = "alembic" }, + { name = "asyncpg" }, + { name = "cmdb-shared" }, + { name = "fastapi" }, + { name = "psycopg2-binary" }, + { name = "pydantic-settings" }, + { name = "redis" }, + { name = "sqlalchemy", extra = ["asyncio"] }, + { name = "uvicorn" }, +] + +[package.metadata] +requires-dist = [ + { name = "aiokafka" }, + { name = "alembic" }, + { name = "asyncpg" }, + { name = "cmdb-shared", editable = "shared" }, + { name = "fastapi", specifier = ">=0.115" }, + { name = "psycopg2-binary", specifier = ">=2.9" }, + { name = "pydantic-settings", specifier = ">=2.0" }, + { name = "redis", specifier = ">=5.0" }, + { name = "sqlalchemy", extras = ["asyncio"], specifier = ">=2.0" }, + { name = "uvicorn" }, +] + +[[package]] +name = "cmdb-webhook" +version = "0.1.0" +source = { editable = "services/webhook" } +dependencies = [ + { name = "asyncpg" }, + { name = "cmdb-shared" }, + { name = "fastapi" }, + { name = "httpx" }, + { name = "pydantic-settings" }, + { name = "uvicorn" }, +] + +[package.metadata] +requires-dist = [ + { name = "asyncpg" }, + { name = "cmdb-shared", editable = "shared" }, + { name = "fastapi", specifier = ">=0.115" }, + { name = "httpx" }, + { name = "pydantic-settings" }, + { name = "uvicorn" }, +] + +[[package]] +name = "colorama" +version = "0.4.6" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697, upload-time = "2022-10-25T02:36:22.414Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335, upload-time = "2022-10-25T02:36:20.889Z" }, +] + +[[package]] +name = "cross-web" +version = "0.4.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/a4/58/e688e99d1493c565d1587e64b499268d0a3129ae59f4efe440aac395f803/cross_web-0.4.1.tar.gz", hash = "sha256:0466295028dcae98c9ab3d18757f90b0e74fac2ff90efbe87e74657546d9993d", size = 157385, upload-time = "2026-01-09T18:17:41.534Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/67/49/92b46b6e65f09b717a66c4e5a9bc47a45ebc83dd0e0ed126f8258363479d/cross_web-0.4.1-py3-none-any.whl", hash = "sha256:41b07c3a38253c517ec0603c1a366353aff77538946092b0f9a2235033f192c2", size = 14320, upload-time = "2026-01-09T18:17:40.325Z" }, +] + +[[package]] +name = "cryptography" +version = "46.0.5" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "cffi", marker = "platform_python_implementation != 'PyPy'" }, +] +sdist = { url = 
"https://files.pythonhosted.org/packages/60/04/ee2a9e8542e4fa2773b81771ff8349ff19cdd56b7258a0cc442639052edb/cryptography-46.0.5.tar.gz", hash = "sha256:abace499247268e3757271b2f1e244b36b06f8515cf27c4d49468fc9eb16e93d", size = 750064, upload-time = "2026-02-10T19:18:38.255Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/f7/81/b0bb27f2ba931a65409c6b8a8b358a7f03c0e46eceacddff55f7c84b1f3b/cryptography-46.0.5-cp311-abi3-macosx_10_9_universal2.whl", hash = "sha256:351695ada9ea9618b3500b490ad54c739860883df6c1f555e088eaf25b1bbaad", size = 7176289, upload-time = "2026-02-10T19:17:08.274Z" }, + { url = "https://files.pythonhosted.org/packages/ff/9e/6b4397a3e3d15123de3b1806ef342522393d50736c13b20ec4c9ea6693a6/cryptography-46.0.5-cp311-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:c18ff11e86df2e28854939acde2d003f7984f721eba450b56a200ad90eeb0e6b", size = 4275637, upload-time = "2026-02-10T19:17:10.53Z" }, + { url = "https://files.pythonhosted.org/packages/63/e7/471ab61099a3920b0c77852ea3f0ea611c9702f651600397ac567848b897/cryptography-46.0.5-cp311-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:4d7e3d356b8cd4ea5aff04f129d5f66ebdc7b6f8eae802b93739ed520c47c79b", size = 4424742, upload-time = "2026-02-10T19:17:12.388Z" }, + { url = "https://files.pythonhosted.org/packages/37/53/a18500f270342d66bf7e4d9f091114e31e5ee9e7375a5aba2e85a91e0044/cryptography-46.0.5-cp311-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:50bfb6925eff619c9c023b967d5b77a54e04256c4281b0e21336a130cd7fc263", size = 4277528, upload-time = "2026-02-10T19:17:13.853Z" }, + { url = "https://files.pythonhosted.org/packages/22/29/c2e812ebc38c57b40e7c583895e73c8c5adb4d1e4a0cc4c5a4fdab2b1acc/cryptography-46.0.5-cp311-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:803812e111e75d1aa73690d2facc295eaefd4439be1023fefc4995eaea2af90d", size = 4947993, upload-time = "2026-02-10T19:17:15.618Z" }, + { url = "https://files.pythonhosted.org/packages/6b/e7/237155ae19a9023de7e30ec64e5d99a9431a567407ac21170a046d22a5a3/cryptography-46.0.5-cp311-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:3ee190460e2fbe447175cda91b88b84ae8322a104fc27766ad09428754a618ed", size = 4456855, upload-time = "2026-02-10T19:17:17.221Z" }, + { url = "https://files.pythonhosted.org/packages/2d/87/fc628a7ad85b81206738abbd213b07702bcbdada1dd43f72236ef3cffbb5/cryptography-46.0.5-cp311-abi3-manylinux_2_31_armv7l.whl", hash = "sha256:f145bba11b878005c496e93e257c1e88f154d278d2638e6450d17e0f31e558d2", size = 3984635, upload-time = "2026-02-10T19:17:18.792Z" }, + { url = "https://files.pythonhosted.org/packages/84/29/65b55622bde135aedf4565dc509d99b560ee4095e56989e815f8fd2aa910/cryptography-46.0.5-cp311-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:e9251e3be159d1020c4030bd2e5f84d6a43fe54b6c19c12f51cde9542a2817b2", size = 4277038, upload-time = "2026-02-10T19:17:20.256Z" }, + { url = "https://files.pythonhosted.org/packages/bc/36/45e76c68d7311432741faf1fbf7fac8a196a0a735ca21f504c75d37e2558/cryptography-46.0.5-cp311-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:47fb8a66058b80e509c47118ef8a75d14c455e81ac369050f20ba0d23e77fee0", size = 4912181, upload-time = "2026-02-10T19:17:21.825Z" }, + { url = "https://files.pythonhosted.org/packages/6d/1a/c1ba8fead184d6e3d5afcf03d569acac5ad063f3ac9fb7258af158f7e378/cryptography-46.0.5-cp311-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:4c3341037c136030cb46e4b1e17b7418ea4cbd9dd207e4a6f3b2b24e0d4ac731", size = 4456482, upload-time = "2026-02-10T19:17:25.133Z" }, + { url = 
"https://files.pythonhosted.org/packages/f9/e5/3fb22e37f66827ced3b902cf895e6a6bc1d095b5b26be26bd13c441fdf19/cryptography-46.0.5-cp311-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:890bcb4abd5a2d3f852196437129eb3667d62630333aacc13dfd470fad3aaa82", size = 4405497, upload-time = "2026-02-10T19:17:26.66Z" }, + { url = "https://files.pythonhosted.org/packages/1a/df/9d58bb32b1121a8a2f27383fabae4d63080c7ca60b9b5c88be742be04ee7/cryptography-46.0.5-cp311-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:80a8d7bfdf38f87ca30a5391c0c9ce4ed2926918e017c29ddf643d0ed2778ea1", size = 4667819, upload-time = "2026-02-10T19:17:28.569Z" }, + { url = "https://files.pythonhosted.org/packages/ea/ed/325d2a490c5e94038cdb0117da9397ece1f11201f425c4e9c57fe5b9f08b/cryptography-46.0.5-cp311-abi3-win32.whl", hash = "sha256:60ee7e19e95104d4c03871d7d7dfb3d22ef8a9b9c6778c94e1c8fcc8365afd48", size = 3028230, upload-time = "2026-02-10T19:17:30.518Z" }, + { url = "https://files.pythonhosted.org/packages/e9/5a/ac0f49e48063ab4255d9e3b79f5def51697fce1a95ea1370f03dc9db76f6/cryptography-46.0.5-cp311-abi3-win_amd64.whl", hash = "sha256:38946c54b16c885c72c4f59846be9743d699eee2b69b6988e0a00a01f46a61a4", size = 3480909, upload-time = "2026-02-10T19:17:32.083Z" }, + { url = "https://files.pythonhosted.org/packages/00/13/3d278bfa7a15a96b9dc22db5a12ad1e48a9eb3d40e1827ef66a5df75d0d0/cryptography-46.0.5-cp314-cp314t-macosx_10_9_universal2.whl", hash = "sha256:94a76daa32eb78d61339aff7952ea819b1734b46f73646a07decb40e5b3448e2", size = 7119287, upload-time = "2026-02-10T19:17:33.801Z" }, + { url = "https://files.pythonhosted.org/packages/67/c8/581a6702e14f0898a0848105cbefd20c058099e2c2d22ef4e476dfec75d7/cryptography-46.0.5-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:5be7bf2fb40769e05739dd0046e7b26f9d4670badc7b032d6ce4db64dddc0678", size = 4265728, upload-time = "2026-02-10T19:17:35.569Z" }, + { url = "https://files.pythonhosted.org/packages/dd/4a/ba1a65ce8fc65435e5a849558379896c957870dd64fecea97b1ad5f46a37/cryptography-46.0.5-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:fe346b143ff9685e40192a4960938545c699054ba11d4f9029f94751e3f71d87", size = 4408287, upload-time = "2026-02-10T19:17:36.938Z" }, + { url = "https://files.pythonhosted.org/packages/f8/67/8ffdbf7b65ed1ac224d1c2df3943553766914a8ca718747ee3871da6107e/cryptography-46.0.5-cp314-cp314t-manylinux_2_28_aarch64.whl", hash = "sha256:c69fd885df7d089548a42d5ec05be26050ebcd2283d89b3d30676eb32ff87dee", size = 4270291, upload-time = "2026-02-10T19:17:38.748Z" }, + { url = "https://files.pythonhosted.org/packages/f8/e5/f52377ee93bc2f2bba55a41a886fd208c15276ffbd2569f2ddc89d50e2c5/cryptography-46.0.5-cp314-cp314t-manylinux_2_28_ppc64le.whl", hash = "sha256:8293f3dea7fc929ef7240796ba231413afa7b68ce38fd21da2995549f5961981", size = 4927539, upload-time = "2026-02-10T19:17:40.241Z" }, + { url = "https://files.pythonhosted.org/packages/3b/02/cfe39181b02419bbbbcf3abdd16c1c5c8541f03ca8bda240debc467d5a12/cryptography-46.0.5-cp314-cp314t-manylinux_2_28_x86_64.whl", hash = "sha256:1abfdb89b41c3be0365328a410baa9df3ff8a9110fb75e7b52e66803ddabc9a9", size = 4442199, upload-time = "2026-02-10T19:17:41.789Z" }, + { url = "https://files.pythonhosted.org/packages/c0/96/2fcaeb4873e536cf71421a388a6c11b5bc846e986b2b069c79363dc1648e/cryptography-46.0.5-cp314-cp314t-manylinux_2_31_armv7l.whl", hash = "sha256:d66e421495fdb797610a08f43b05269e0a5ea7f5e652a89bfd5a7d3c1dee3648", size = 3960131, upload-time = "2026-02-10T19:17:43.379Z" }, + { url = 
"https://files.pythonhosted.org/packages/d8/d2/b27631f401ddd644e94c5cf33c9a4069f72011821cf3dc7309546b0642a0/cryptography-46.0.5-cp314-cp314t-manylinux_2_34_aarch64.whl", hash = "sha256:4e817a8920bfbcff8940ecfd60f23d01836408242b30f1a708d93198393a80b4", size = 4270072, upload-time = "2026-02-10T19:17:45.481Z" }, + { url = "https://files.pythonhosted.org/packages/f4/a7/60d32b0370dae0b4ebe55ffa10e8599a2a59935b5ece1b9f06edb73abdeb/cryptography-46.0.5-cp314-cp314t-manylinux_2_34_ppc64le.whl", hash = "sha256:68f68d13f2e1cb95163fa3b4db4bf9a159a418f5f6e7242564fc75fcae667fd0", size = 4892170, upload-time = "2026-02-10T19:17:46.997Z" }, + { url = "https://files.pythonhosted.org/packages/d2/b9/cf73ddf8ef1164330eb0b199a589103c363afa0cf794218c24d524a58eab/cryptography-46.0.5-cp314-cp314t-manylinux_2_34_x86_64.whl", hash = "sha256:a3d1fae9863299076f05cb8a778c467578262fae09f9dc0ee9b12eb4268ce663", size = 4441741, upload-time = "2026-02-10T19:17:48.661Z" }, + { url = "https://files.pythonhosted.org/packages/5f/eb/eee00b28c84c726fe8fa0158c65afe312d9c3b78d9d01daf700f1f6e37ff/cryptography-46.0.5-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:c4143987a42a2397f2fc3b4d7e3a7d313fbe684f67ff443999e803dd75a76826", size = 4396728, upload-time = "2026-02-10T19:17:50.058Z" }, + { url = "https://files.pythonhosted.org/packages/65/f4/6bc1a9ed5aef7145045114b75b77c2a8261b4d38717bd8dea111a63c3442/cryptography-46.0.5-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:7d731d4b107030987fd61a7f8ab512b25b53cef8f233a97379ede116f30eb67d", size = 4652001, upload-time = "2026-02-10T19:17:51.54Z" }, + { url = "https://files.pythonhosted.org/packages/86/ef/5d00ef966ddd71ac2e6951d278884a84a40ffbd88948ef0e294b214ae9e4/cryptography-46.0.5-cp314-cp314t-win32.whl", hash = "sha256:c3bcce8521d785d510b2aad26ae2c966092b7daa8f45dd8f44734a104dc0bc1a", size = 3003637, upload-time = "2026-02-10T19:17:52.997Z" }, + { url = "https://files.pythonhosted.org/packages/b7/57/f3f4160123da6d098db78350fdfd9705057aad21de7388eacb2401dceab9/cryptography-46.0.5-cp314-cp314t-win_amd64.whl", hash = "sha256:4d8ae8659ab18c65ced284993c2265910f6c9e650189d4e3f68445ef82a810e4", size = 3469487, upload-time = "2026-02-10T19:17:54.549Z" }, + { url = "https://files.pythonhosted.org/packages/e2/fa/a66aa722105ad6a458bebd64086ca2b72cdd361fed31763d20390f6f1389/cryptography-46.0.5-cp38-abi3-macosx_10_9_universal2.whl", hash = "sha256:4108d4c09fbbf2789d0c926eb4152ae1760d5a2d97612b92d508d96c861e4d31", size = 7170514, upload-time = "2026-02-10T19:17:56.267Z" }, + { url = "https://files.pythonhosted.org/packages/0f/04/c85bdeab78c8bc77b701bf0d9bdcf514c044e18a46dcff330df5448631b0/cryptography-46.0.5-cp38-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:7d1f30a86d2757199cb2d56e48cce14deddf1f9c95f1ef1b64ee91ea43fe2e18", size = 4275349, upload-time = "2026-02-10T19:17:58.419Z" }, + { url = "https://files.pythonhosted.org/packages/5c/32/9b87132a2f91ee7f5223b091dc963055503e9b442c98fc0b8a5ca765fab0/cryptography-46.0.5-cp38-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:039917b0dc418bb9f6edce8a906572d69e74bd330b0b3fea4f79dab7f8ddd235", size = 4420667, upload-time = "2026-02-10T19:18:00.619Z" }, + { url = "https://files.pythonhosted.org/packages/a1/a6/a7cb7010bec4b7c5692ca6f024150371b295ee1c108bdc1c400e4c44562b/cryptography-46.0.5-cp38-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:ba2a27ff02f48193fc4daeadf8ad2590516fa3d0adeeb34336b96f7fa64c1e3a", size = 4276980, upload-time = "2026-02-10T19:18:02.379Z" }, + { url = 
"https://files.pythonhosted.org/packages/8e/7c/c4f45e0eeff9b91e3f12dbd0e165fcf2a38847288fcfd889deea99fb7b6d/cryptography-46.0.5-cp38-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:61aa400dce22cb001a98014f647dc21cda08f7915ceb95df0c9eaf84b4b6af76", size = 4939143, upload-time = "2026-02-10T19:18:03.964Z" }, + { url = "https://files.pythonhosted.org/packages/37/19/e1b8f964a834eddb44fa1b9a9976f4e414cbb7aa62809b6760c8803d22d1/cryptography-46.0.5-cp38-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:3ce58ba46e1bc2aac4f7d9290223cead56743fa6ab94a5d53292ffaac6a91614", size = 4453674, upload-time = "2026-02-10T19:18:05.588Z" }, + { url = "https://files.pythonhosted.org/packages/db/ed/db15d3956f65264ca204625597c410d420e26530c4e2943e05a0d2f24d51/cryptography-46.0.5-cp38-abi3-manylinux_2_31_armv7l.whl", hash = "sha256:420d0e909050490d04359e7fdb5ed7e667ca5c3c402b809ae2563d7e66a92229", size = 3978801, upload-time = "2026-02-10T19:18:07.167Z" }, + { url = "https://files.pythonhosted.org/packages/41/e2/df40a31d82df0a70a0daf69791f91dbb70e47644c58581d654879b382d11/cryptography-46.0.5-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:582f5fcd2afa31622f317f80426a027f30dc792e9c80ffee87b993200ea115f1", size = 4276755, upload-time = "2026-02-10T19:18:09.813Z" }, + { url = "https://files.pythonhosted.org/packages/33/45/726809d1176959f4a896b86907b98ff4391a8aa29c0aaaf9450a8a10630e/cryptography-46.0.5-cp38-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:bfd56bb4b37ed4f330b82402f6f435845a5f5648edf1ad497da51a8452d5d62d", size = 4901539, upload-time = "2026-02-10T19:18:11.263Z" }, + { url = "https://files.pythonhosted.org/packages/99/0f/a3076874e9c88ecb2ecc31382f6e7c21b428ede6f55aafa1aa272613e3cd/cryptography-46.0.5-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:a3d507bb6a513ca96ba84443226af944b0f7f47dcc9a399d110cd6146481d24c", size = 4452794, upload-time = "2026-02-10T19:18:12.914Z" }, + { url = "https://files.pythonhosted.org/packages/02/ef/ffeb542d3683d24194a38f66ca17c0a4b8bf10631feef44a7ef64e631b1a/cryptography-46.0.5-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:9f16fbdf4da055efb21c22d81b89f155f02ba420558db21288b3d0035bafd5f4", size = 4404160, upload-time = "2026-02-10T19:18:14.375Z" }, + { url = "https://files.pythonhosted.org/packages/96/93/682d2b43c1d5f1406ed048f377c0fc9fc8f7b0447a478d5c65ab3d3a66eb/cryptography-46.0.5-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:ced80795227d70549a411a4ab66e8ce307899fad2220ce5ab2f296e687eacde9", size = 4667123, upload-time = "2026-02-10T19:18:15.886Z" }, + { url = "https://files.pythonhosted.org/packages/45/2d/9c5f2926cb5300a8eefc3f4f0b3f3df39db7f7ce40c8365444c49363cbda/cryptography-46.0.5-cp38-abi3-win32.whl", hash = "sha256:02f547fce831f5096c9a567fd41bc12ca8f11df260959ecc7c3202555cc47a72", size = 3010220, upload-time = "2026-02-10T19:18:17.361Z" }, + { url = "https://files.pythonhosted.org/packages/48/ef/0c2f4a8e31018a986949d34a01115dd057bf536905dca38897bacd21fac3/cryptography-46.0.5-cp38-abi3-win_amd64.whl", hash = "sha256:556e106ee01aa13484ce9b0239bca667be5004efb0aabbed28d353df86445595", size = 3467050, upload-time = "2026-02-10T19:18:18.899Z" }, +] + +[[package]] +name = "distlib" +version = "0.4.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/96/8e/709914eb2b5749865801041647dc7f4e6d00b549cfe88b65ca192995f07c/distlib-0.4.0.tar.gz", hash = "sha256:feec40075be03a04501a973d81f633735b4b69f98b05450592310c0f401a4e0d", size = 614605, upload-time = "2025-07-17T16:52:00.465Z" } +wheels = [ + { 
url = "https://files.pythonhosted.org/packages/33/6b/e0547afaf41bf2c42e52430072fa5658766e3d65bd4b03a563d1b6336f57/distlib-0.4.0-py2.py3-none-any.whl", hash = "sha256:9659f7d87e46584a30b5780e43ac7a2143098441670ff0a49d5f9034c54a6c16", size = 469047, upload-time = "2025-07-17T16:51:58.613Z" }, +] + +[[package]] +name = "docker" +version = "7.1.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pywin32", marker = "sys_platform == 'win32'" }, + { name = "requests" }, + { name = "urllib3" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/91/9b/4a2ea29aeba62471211598dac5d96825bb49348fa07e906ea930394a83ce/docker-7.1.0.tar.gz", hash = "sha256:ad8c70e6e3f8926cb8a92619b832b4ea5299e2831c14284663184e200546fa6c", size = 117834, upload-time = "2024-05-23T11:13:57.216Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e3/26/57c6fb270950d476074c087527a558ccb6f4436657314bfb6cdf484114c4/docker-7.1.0-py3-none-any.whl", hash = "sha256:c96b93b7f0a746f9e77d325bcfb87422a3d8bd4f03136ae8a85b37f1898d5fc0", size = 147774, upload-time = "2024-05-23T11:13:55.01Z" }, +] + +[[package]] +name = "fakeredis" +version = "2.34.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "redis" }, + { name = "sortedcontainers" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/11/40/fd09efa66205eb32253d2b2ebc63537281384d2040f0a88bcd2289e120e4/fakeredis-2.34.1.tar.gz", hash = "sha256:4ff55606982972eecce3ab410e03d746c11fe5deda6381d913641fbd8865ea9b", size = 177315, upload-time = "2026-02-25T13:17:51.315Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/49/b5/82f89307d0d769cd9bf46a54fb9136be08e4e57c5570ae421db4c9a2ba62/fakeredis-2.34.1-py3-none-any.whl", hash = "sha256:0107ec99d48913e7eec2a5e3e2403d1bd5f8aa6489d1a634571b975289c48f12", size = 122160, upload-time = "2026-02-25T13:17:49.701Z" }, +] + +[[package]] +name = "fastapi" +version = "0.135.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "annotated-doc" }, + { name = "pydantic" }, + { name = "starlette" }, + { name = "typing-extensions" }, + { name = "typing-inspection" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/e7/7b/f8e0211e9380f7195ba3f3d40c292594fd81ba8ec4629e3854c353aaca45/fastapi-0.135.1.tar.gz", hash = "sha256:d04115b508d936d254cea545b7312ecaa58a7b3a0f84952535b4c9afae7668cd", size = 394962, upload-time = "2026-03-01T18:18:29.369Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e4/72/42e900510195b23a56bde950d26a51f8b723846bfcaa0286e90287f0422b/fastapi-0.135.1-py3-none-any.whl", hash = "sha256:46e2fc5745924b7c840f71ddd277382af29ce1cdb7d5eab5bf697e3fb9999c9e", size = 116999, upload-time = "2026-03-01T18:18:30.831Z" }, +] + +[[package]] +name = "filelock" +version = "3.25.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/94/b8/00651a0f559862f3bb7d6f7477b192afe3f583cc5e26403b44e59a55ab34/filelock-3.25.2.tar.gz", hash = "sha256:b64ece2b38f4ca29dd3e810287aa8c48182bbecd1ae6e9ae126c9b35f1382694", size = 40480, upload-time = "2026-03-11T20:45:38.487Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a4/a5/842ae8f0c08b61d6484b52f99a03510a3a72d23141942d216ebe81fefbce/filelock-3.25.2-py3-none-any.whl", hash = "sha256:ca8afb0da15f229774c9ad1b455ed96e85a81373065fb10446672f64444ddf70", size = 26759, upload-time = "2026-03-11T20:45:37.437Z" }, +] + +[[package]] +name = "graphql-core" +version = "3.2.8" +source = { 
registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/68/c5/36aa96205c3ecbb3d34c7c24189e4553c7ca2ebc7e1dd07432339b980272/graphql_core-3.2.8.tar.gz", hash = "sha256:015457da5d996c924ddf57a43f4e959b0b94fb695b85ed4c29446e508ed65cf3", size = 513181, upload-time = "2026-03-05T19:55:37.332Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/86/41/cb887d9afc5dabd78feefe6ccbaf83ff423c206a7a1b7aeeac05120b2125/graphql_core-3.2.8-py3-none-any.whl", hash = "sha256:cbee07bee1b3ed5e531723685369039f32ff815ef60166686e0162f540f1520c", size = 207349, upload-time = "2026-03-05T19:55:35.911Z" }, +] + +[[package]] +name = "greenlet" +version = "3.3.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/a3/51/1664f6b78fc6ebbd98019a1fd730e83fa78f2db7058f72b1463d3612b8db/greenlet-3.3.2.tar.gz", hash = "sha256:2eaf067fc6d886931c7962e8c6bede15d2f01965560f3359b27c80bde2d151f2", size = 188267, upload-time = "2026-02-20T20:54:15.531Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ac/48/f8b875fa7dea7dd9b33245e37f065af59df6a25af2f9561efa8d822fde51/greenlet-3.3.2-cp313-cp313-macosx_11_0_universal2.whl", hash = "sha256:aa6ac98bdfd716a749b84d4034486863fd81c3abde9aa3cf8eff9127981a4ae4", size = 279120, upload-time = "2026-02-20T20:19:01.9Z" }, + { url = "https://files.pythonhosted.org/packages/49/8d/9771d03e7a8b1ee456511961e1b97a6d77ae1dea4a34a5b98eee706689d3/greenlet-3.3.2-cp313-cp313-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ab0c7e7901a00bc0a7284907273dc165b32e0d109a6713babd04471327ff7986", size = 603238, upload-time = "2026-02-20T20:47:32.873Z" }, + { url = "https://files.pythonhosted.org/packages/59/0e/4223c2bbb63cd5c97f28ffb2a8aee71bdfb30b323c35d409450f51b91e3e/greenlet-3.3.2-cp313-cp313-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:d248d8c23c67d2291ffd47af766e2a3aa9fa1c6703155c099feb11f526c63a92", size = 614219, upload-time = "2026-02-20T20:55:59.817Z" }, + { url = "https://files.pythonhosted.org/packages/94/2b/4d012a69759ac9d77210b8bfb128bc621125f5b20fc398bce3940d036b1c/greenlet-3.3.2-cp313-cp313-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:ccd21bb86944ca9be6d967cf7691e658e43417782bce90b5d2faeda0ff78a7dd", size = 628268, upload-time = "2026-02-20T21:02:48.024Z" }, + { url = "https://files.pythonhosted.org/packages/7a/34/259b28ea7a2a0c904b11cd36c79b8cef8019b26ee5dbe24e73b469dea347/greenlet-3.3.2-cp313-cp313-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b6997d360a4e6a4e936c0f9625b1c20416b8a0ea18a8e19cabbefc712e7397ab", size = 616774, upload-time = "2026-02-20T20:21:02.454Z" }, + { url = "https://files.pythonhosted.org/packages/0a/03/996c2d1689d486a6e199cb0f1cf9e4aa940c500e01bdf201299d7d61fa69/greenlet-3.3.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:64970c33a50551c7c50491671265d8954046cb6e8e2999aacdd60e439b70418a", size = 1571277, upload-time = "2026-02-20T20:49:34.795Z" }, + { url = "https://files.pythonhosted.org/packages/d9/c4/2570fc07f34a39f2caf0bf9f24b0a1a0a47bc2e8e465b2c2424821389dfc/greenlet-3.3.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:1a9172f5bf6bd88e6ba5a84e0a68afeac9dc7b6b412b245dd64f52d83c81e55b", size = 1640455, upload-time = "2026-02-20T20:21:10.261Z" }, + { url = "https://files.pythonhosted.org/packages/91/39/5ef5aa23bc545aa0d31e1b9b55822b32c8da93ba657295840b6b34124009/greenlet-3.3.2-cp313-cp313-win_amd64.whl", hash = 
"sha256:a7945dd0eab63ded0a48e4dcade82939783c172290a7903ebde9e184333ca124", size = 230961, upload-time = "2026-02-20T20:16:58.461Z" }, + { url = "https://files.pythonhosted.org/packages/62/6b/a89f8456dcb06becff288f563618e9f20deed8dd29beea14f9a168aef64b/greenlet-3.3.2-cp313-cp313-win_arm64.whl", hash = "sha256:394ead29063ee3515b4e775216cb756b2e3b4a7e55ae8fd884f17fa579e6b327", size = 230221, upload-time = "2026-02-20T20:17:37.152Z" }, + { url = "https://files.pythonhosted.org/packages/3f/ae/8bffcbd373b57a5992cd077cbe8858fff39110480a9d50697091faea6f39/greenlet-3.3.2-cp314-cp314-macosx_11_0_universal2.whl", hash = "sha256:8d1658d7291f9859beed69a776c10822a0a799bc4bfe1bd4272bb60e62507dab", size = 279650, upload-time = "2026-02-20T20:18:00.783Z" }, + { url = "https://files.pythonhosted.org/packages/d1/c0/45f93f348fa49abf32ac8439938726c480bd96b2a3c6f4d949ec0124b69f/greenlet-3.3.2-cp314-cp314-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:18cb1b7337bca281915b3c5d5ae19f4e76d35e1df80f4ad3c1a7be91fadf1082", size = 650295, upload-time = "2026-02-20T20:47:34.036Z" }, + { url = "https://files.pythonhosted.org/packages/b3/de/dd7589b3f2b8372069ab3e4763ea5329940fc7ad9dcd3e272a37516d7c9b/greenlet-3.3.2-cp314-cp314-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c2e47408e8ce1c6f1ceea0dffcdf6ebb85cc09e55c7af407c99f1112016e45e9", size = 662163, upload-time = "2026-02-20T20:56:01.295Z" }, + { url = "https://files.pythonhosted.org/packages/cd/ac/85804f74f1ccea31ba518dcc8ee6f14c79f73fe36fa1beba38930806df09/greenlet-3.3.2-cp314-cp314-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:e3cb43ce200f59483eb82949bf1835a99cf43d7571e900d7c8d5c62cdf25d2f9", size = 675371, upload-time = "2026-02-20T21:02:49.664Z" }, + { url = "https://files.pythonhosted.org/packages/d2/d8/09bfa816572a4d83bccd6750df1926f79158b1c36c5f73786e26dbe4ee38/greenlet-3.3.2-cp314-cp314-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:63d10328839d1973e5ba35e98cccbca71b232b14051fd957b6f8b6e8e80d0506", size = 664160, upload-time = "2026-02-20T20:21:04.015Z" }, + { url = "https://files.pythonhosted.org/packages/48/cf/56832f0c8255d27f6c35d41b5ec91168d74ec721d85f01a12131eec6b93c/greenlet-3.3.2-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:8e4ab3cfb02993c8cc248ea73d7dae6cec0253e9afa311c9b37e603ca9fad2ce", size = 1619181, upload-time = "2026-02-20T20:49:36.052Z" }, + { url = "https://files.pythonhosted.org/packages/0a/23/b90b60a4aabb4cec0796e55f25ffbfb579a907c3898cd2905c8918acaa16/greenlet-3.3.2-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:94ad81f0fd3c0c0681a018a976e5c2bd2ca2d9d94895f23e7bb1af4e8af4e2d5", size = 1687713, upload-time = "2026-02-20T20:21:11.684Z" }, + { url = "https://files.pythonhosted.org/packages/f3/ca/2101ca3d9223a1dc125140dbc063644dca76df6ff356531eb27bc267b446/greenlet-3.3.2-cp314-cp314-win_amd64.whl", hash = "sha256:8c4dd0f3997cf2512f7601563cc90dfb8957c0cff1e3a1b23991d4ea1776c492", size = 232034, upload-time = "2026-02-20T20:20:08.186Z" }, + { url = "https://files.pythonhosted.org/packages/f6/4a/ecf894e962a59dea60f04877eea0fd5724618da89f1867b28ee8b91e811f/greenlet-3.3.2-cp314-cp314-win_arm64.whl", hash = "sha256:cd6f9e2bbd46321ba3bbb4c8a15794d32960e3b0ae2cc4d49a1a53d314805d71", size = 231437, upload-time = "2026-02-20T20:18:59.722Z" }, + { url = "https://files.pythonhosted.org/packages/98/6d/8f2ef704e614bcf58ed43cfb8d87afa1c285e98194ab2cfad351bf04f81e/greenlet-3.3.2-cp314-cp314t-macosx_11_0_universal2.whl", hash = 
"sha256:e26e72bec7ab387ac80caa7496e0f908ff954f31065b0ffc1f8ecb1338b11b54", size = 286617, upload-time = "2026-02-20T20:19:29.856Z" }, + { url = "https://files.pythonhosted.org/packages/5e/0d/93894161d307c6ea237a43988f27eba0947b360b99ac5239ad3fe09f0b47/greenlet-3.3.2-cp314-cp314t-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8b466dff7a4ffda6ca975979bab80bdadde979e29fc947ac3be4451428d8b0e4", size = 655189, upload-time = "2026-02-20T20:47:35.742Z" }, + { url = "https://files.pythonhosted.org/packages/f5/2c/d2d506ebd8abcb57386ec4f7ba20f4030cbe56eae541bc6fd6ef399c0b41/greenlet-3.3.2-cp314-cp314t-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:b8bddc5b73c9720bea487b3bffdb1840fe4e3656fba3bd40aa1489e9f37877ff", size = 658225, upload-time = "2026-02-20T20:56:02.527Z" }, + { url = "https://files.pythonhosted.org/packages/d1/67/8197b7e7e602150938049d8e7f30de1660cfb87e4c8ee349b42b67bdb2e1/greenlet-3.3.2-cp314-cp314t-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:59b3e2c40f6706b05a9cd299c836c6aa2378cabe25d021acd80f13abf81181cf", size = 666581, upload-time = "2026-02-20T21:02:51.526Z" }, + { url = "https://files.pythonhosted.org/packages/8e/30/3a09155fbf728673a1dea713572d2d31159f824a37c22da82127056c44e4/greenlet-3.3.2-cp314-cp314t-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b26b0f4428b871a751968285a1ac9648944cea09807177ac639b030bddebcea4", size = 657907, upload-time = "2026-02-20T20:21:05.259Z" }, + { url = "https://files.pythonhosted.org/packages/f3/fd/d05a4b7acd0154ed758797f0a43b4c0962a843bedfe980115e842c5b2d08/greenlet-3.3.2-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:1fb39a11ee2e4d94be9a76671482be9398560955c9e568550de0224e41104727", size = 1618857, upload-time = "2026-02-20T20:49:37.309Z" }, + { url = "https://files.pythonhosted.org/packages/6f/e1/50ee92a5db521de8f35075b5eff060dd43d39ebd46c2181a2042f7070385/greenlet-3.3.2-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:20154044d9085151bc309e7689d6f7ba10027f8f5a8c0676ad398b951913d89e", size = 1680010, upload-time = "2026-02-20T20:21:13.427Z" }, + { url = "https://files.pythonhosted.org/packages/29/4b/45d90626aef8e65336bed690106d1382f7a43665e2249017e9527df8823b/greenlet-3.3.2-cp314-cp314t-win_amd64.whl", hash = "sha256:c04c5e06ec3e022cbfe2cd4a846e1d4e50087444f875ff6d2c2ad8445495cf1a", size = 237086, upload-time = "2026-02-20T20:20:45.786Z" }, +] + +[[package]] +name = "h11" +version = "0.16.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/01/ee/02a2c011bdab74c6fb3c75474d40b3052059d95df7e73351460c8588d963/h11-0.16.0.tar.gz", hash = "sha256:4e35b956cf45792e4caa5885e69fba00bdbc6ffafbfa020300e549b208ee5ff1", size = 101250, upload-time = "2025-04-24T03:35:25.427Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/04/4b/29cac41a4d98d144bf5f6d33995617b185d14b22401f75ca86f384e87ff1/h11-0.16.0-py3-none-any.whl", hash = "sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86", size = 37515, upload-time = "2025-04-24T03:35:24.344Z" }, +] + +[[package]] +name = "httpcore" +version = "1.0.9" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "certifi" }, + { name = "h11" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/06/94/82699a10bca87a5556c9c59b5963f2d039dbd239f25bc2a63907a05a14cb/httpcore-1.0.9.tar.gz", hash = "sha256:6e34463af53fd2ab5d807f399a9b45ea31c3dfa2276f15a2c3f00afff6e176e8", size = 85484, upload-time = 
"2025-04-24T22:06:22.219Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/7e/f5/f66802a942d491edb555dd61e3a9961140fd64c90bce1eafd741609d334d/httpcore-1.0.9-py3-none-any.whl", hash = "sha256:2d400746a40668fc9dec9810239072b40b4484b640a8c38fd654a024c7a1bf55", size = 78784, upload-time = "2025-04-24T22:06:20.566Z" }, +] + +[[package]] +name = "httpx" +version = "0.28.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "anyio" }, + { name = "certifi" }, + { name = "httpcore" }, + { name = "idna" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/b1/df/48c586a5fe32a0f01324ee087459e112ebb7224f646c0b5023f5e79e9956/httpx-0.28.1.tar.gz", hash = "sha256:75e98c5f16b0f35b567856f597f06ff2270a374470a5c2392242528e3e3e42fc", size = 141406, upload-time = "2024-12-06T15:37:23.222Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/2a/39/e50c7c3a983047577ee07d2a9e53faf5a69493943ec3f6a384bdc792deb2/httpx-0.28.1-py3-none-any.whl", hash = "sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad", size = 73517, upload-time = "2024-12-06T15:37:21.509Z" }, +] + +[[package]] +name = "identify" +version = "2.6.18" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/46/c4/7fb4db12296cdb11893d61c92048fe617ee853f8523b9b296ac03b43757e/identify-2.6.18.tar.gz", hash = "sha256:873ac56a5e3fd63e7438a7ecbc4d91aca692eb3fefa4534db2b7913f3fc352fd", size = 99580, upload-time = "2026-03-15T18:39:50.319Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/46/33/92ef41c6fad0233e41d3d84ba8e8ad18d1780f1e5d99b3c683e6d7f98b63/identify-2.6.18-py2.py3-none-any.whl", hash = "sha256:8db9d3c8ea9079db92cafb0ebf97abdc09d52e97f4dcf773a2e694048b7cd737", size = 99394, upload-time = "2026-03-15T18:39:48.915Z" }, +] + +[[package]] +name = "idna" +version = "3.11" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/6f/6d/0703ccc57f3a7233505399edb88de3cbd678da106337b9fcde432b65ed60/idna-3.11.tar.gz", hash = "sha256:795dafcc9c04ed0c1fb032c2aa73654d8e8c5023a7df64a53f39190ada629902", size = 194582, upload-time = "2025-10-12T14:55:20.501Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/0e/61/66938bbb5fc52dbdf84594873d5b51fb1f7c7794e9c0f5bd885f30bc507b/idna-3.11-py3-none-any.whl", hash = "sha256:771a87f49d9defaf64091e6e6fe9c18d4833f140bd19464795bc32d966ca37ea", size = 71008, upload-time = "2025-10-12T14:55:18.883Z" }, +] + +[[package]] +name = "iniconfig" +version = "2.3.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/72/34/14ca021ce8e5dfedc35312d08ba8bf51fdd999c576889fc2c24cb97f4f10/iniconfig-2.3.0.tar.gz", hash = "sha256:c76315c77db068650d49c5b56314774a7804df16fee4402c1f19d6d15d8c4730", size = 20503, upload-time = "2025-10-18T21:55:43.219Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/cb/b1/3846dd7f199d53cb17f49cba7e651e9ce294d8497c8c150530ed11865bb8/iniconfig-2.3.0-py3-none-any.whl", hash = "sha256:f631c04d2c48c52b84d0d0549c99ff3859c98df65b3101406327ecc7d53fbf12", size = 7484, upload-time = "2025-10-18T21:55:41.639Z" }, +] + +[[package]] +name = "jinja2" +version = "3.1.6" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "markupsafe" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/df/bf/f7da0350254c0ed7c72f3e33cef02e048281fec7ecec5f032d4aac52226b/jinja2-3.1.6.tar.gz", hash = 
"sha256:0137fb05990d35f1275a587e9aee6d56da821fc83491a0fb838183be43f66d6d", size = 245115, upload-time = "2025-03-05T20:05:02.478Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/62/a1/3d680cbfd5f4b8f15abc1d571870c5fc3e594bb582bc3b64ea099db13e56/jinja2-3.1.6-py3-none-any.whl", hash = "sha256:85ece4451f492d0c13c5dd7c13a64681a86afae63a5f347908daf103ce6d2f67", size = 134899, upload-time = "2025-03-05T20:05:00.369Z" }, +] + +[[package]] +name = "mako" +version = "1.3.10" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "markupsafe" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/9e/38/bd5b78a920a64d708fe6bc8e0a2c075e1389d53bef8413725c63ba041535/mako-1.3.10.tar.gz", hash = "sha256:99579a6f39583fa7e5630a28c3c1f440e4e97a414b80372649c0ce338da2ea28", size = 392474, upload-time = "2025-04-10T12:44:31.16Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/87/fb/99f81ac72ae23375f22b7afdb7642aba97c00a713c217124420147681a2f/mako-1.3.10-py3-none-any.whl", hash = "sha256:baef24a52fc4fc514a0887ac600f9f1cff3d82c61d4d700a1fa84d597b88db59", size = 78509, upload-time = "2025-04-10T12:50:53.297Z" }, +] + +[[package]] +name = "markupsafe" +version = "3.0.3" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/7e/99/7690b6d4034fffd95959cbe0c02de8deb3098cc577c67bb6a24fe5d7caa7/markupsafe-3.0.3.tar.gz", hash = "sha256:722695808f4b6457b320fdc131280796bdceb04ab50fe1795cd540799ebe1698", size = 80313, upload-time = "2025-09-27T18:37:40.426Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/38/2f/907b9c7bbba283e68f20259574b13d005c121a0fa4c175f9bed27c4597ff/markupsafe-3.0.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:e1cf1972137e83c5d4c136c43ced9ac51d0e124706ee1c8aa8532c1287fa8795", size = 11622, upload-time = "2025-09-27T18:36:41.777Z" }, + { url = "https://files.pythonhosted.org/packages/9c/d9/5f7756922cdd676869eca1c4e3c0cd0df60ed30199ffd775e319089cb3ed/markupsafe-3.0.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:116bb52f642a37c115f517494ea5feb03889e04df47eeff5b130b1808ce7c219", size = 12029, upload-time = "2025-09-27T18:36:43.257Z" }, + { url = "https://files.pythonhosted.org/packages/00/07/575a68c754943058c78f30db02ee03a64b3c638586fba6a6dd56830b30a3/markupsafe-3.0.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:133a43e73a802c5562be9bbcd03d090aa5a1fe899db609c29e8c8d815c5f6de6", size = 24374, upload-time = "2025-09-27T18:36:44.508Z" }, + { url = "https://files.pythonhosted.org/packages/a9/21/9b05698b46f218fc0e118e1f8168395c65c8a2c750ae2bab54fc4bd4e0e8/markupsafe-3.0.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ccfcd093f13f0f0b7fdd0f198b90053bf7b2f02a3927a30e63f3ccc9df56b676", size = 22980, upload-time = "2025-09-27T18:36:45.385Z" }, + { url = "https://files.pythonhosted.org/packages/7f/71/544260864f893f18b6827315b988c146b559391e6e7e8f7252839b1b846a/markupsafe-3.0.3-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:509fa21c6deb7a7a273d629cf5ec029bc209d1a51178615ddf718f5918992ab9", size = 21990, upload-time = "2025-09-27T18:36:46.916Z" }, + { url = "https://files.pythonhosted.org/packages/c2/28/b50fc2f74d1ad761af2f5dcce7492648b983d00a65b8c0e0cb457c82ebbe/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a4afe79fb3de0b7097d81da19090f4df4f8d3a2b3adaa8764138aac2e44f3af1", size = 23784, upload-time = 
"2025-09-27T18:36:47.884Z" }, + { url = "https://files.pythonhosted.org/packages/ed/76/104b2aa106a208da8b17a2fb72e033a5a9d7073c68f7e508b94916ed47a9/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:795e7751525cae078558e679d646ae45574b47ed6e7771863fcc079a6171a0fc", size = 21588, upload-time = "2025-09-27T18:36:48.82Z" }, + { url = "https://files.pythonhosted.org/packages/b5/99/16a5eb2d140087ebd97180d95249b00a03aa87e29cc224056274f2e45fd6/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:8485f406a96febb5140bfeca44a73e3ce5116b2501ac54fe953e488fb1d03b12", size = 23041, upload-time = "2025-09-27T18:36:49.797Z" }, + { url = "https://files.pythonhosted.org/packages/19/bc/e7140ed90c5d61d77cea142eed9f9c303f4c4806f60a1044c13e3f1471d0/markupsafe-3.0.3-cp313-cp313-win32.whl", hash = "sha256:bdd37121970bfd8be76c5fb069c7751683bdf373db1ed6c010162b2a130248ed", size = 14543, upload-time = "2025-09-27T18:36:51.584Z" }, + { url = "https://files.pythonhosted.org/packages/05/73/c4abe620b841b6b791f2edc248f556900667a5a1cf023a6646967ae98335/markupsafe-3.0.3-cp313-cp313-win_amd64.whl", hash = "sha256:9a1abfdc021a164803f4d485104931fb8f8c1efd55bc6b748d2f5774e78b62c5", size = 15113, upload-time = "2025-09-27T18:36:52.537Z" }, + { url = "https://files.pythonhosted.org/packages/f0/3a/fa34a0f7cfef23cf9500d68cb7c32dd64ffd58a12b09225fb03dd37d5b80/markupsafe-3.0.3-cp313-cp313-win_arm64.whl", hash = "sha256:7e68f88e5b8799aa49c85cd116c932a1ac15caaa3f5db09087854d218359e485", size = 13911, upload-time = "2025-09-27T18:36:53.513Z" }, + { url = "https://files.pythonhosted.org/packages/e4/d7/e05cd7efe43a88a17a37b3ae96e79a19e846f3f456fe79c57ca61356ef01/markupsafe-3.0.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:218551f6df4868a8d527e3062d0fb968682fe92054e89978594c28e642c43a73", size = 11658, upload-time = "2025-09-27T18:36:54.819Z" }, + { url = "https://files.pythonhosted.org/packages/99/9e/e412117548182ce2148bdeacdda3bb494260c0b0184360fe0d56389b523b/markupsafe-3.0.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:3524b778fe5cfb3452a09d31e7b5adefeea8c5be1d43c4f810ba09f2ceb29d37", size = 12066, upload-time = "2025-09-27T18:36:55.714Z" }, + { url = "https://files.pythonhosted.org/packages/bc/e6/fa0ffcda717ef64a5108eaa7b4f5ed28d56122c9a6d70ab8b72f9f715c80/markupsafe-3.0.3-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4e885a3d1efa2eadc93c894a21770e4bc67899e3543680313b09f139e149ab19", size = 25639, upload-time = "2025-09-27T18:36:56.908Z" }, + { url = "https://files.pythonhosted.org/packages/96/ec/2102e881fe9d25fc16cb4b25d5f5cde50970967ffa5dddafdb771237062d/markupsafe-3.0.3-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8709b08f4a89aa7586de0aadc8da56180242ee0ada3999749b183aa23df95025", size = 23569, upload-time = "2025-09-27T18:36:57.913Z" }, + { url = "https://files.pythonhosted.org/packages/4b/30/6f2fce1f1f205fc9323255b216ca8a235b15860c34b6798f810f05828e32/markupsafe-3.0.3-cp313-cp313t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:b8512a91625c9b3da6f127803b166b629725e68af71f8184ae7e7d54686a56d6", size = 23284, upload-time = "2025-09-27T18:36:58.833Z" }, + { url = "https://files.pythonhosted.org/packages/58/47/4a0ccea4ab9f5dcb6f79c0236d954acb382202721e704223a8aafa38b5c8/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:9b79b7a16f7fedff2495d684f2b59b0457c3b493778c9eed31111be64d58279f", size = 24801, upload-time = 
"2025-09-27T18:36:59.739Z" }, + { url = "https://files.pythonhosted.org/packages/6a/70/3780e9b72180b6fecb83a4814d84c3bf4b4ae4bf0b19c27196104149734c/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:12c63dfb4a98206f045aa9563db46507995f7ef6d83b2f68eda65c307c6829eb", size = 22769, upload-time = "2025-09-27T18:37:00.719Z" }, + { url = "https://files.pythonhosted.org/packages/98/c5/c03c7f4125180fc215220c035beac6b9cb684bc7a067c84fc69414d315f5/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:8f71bc33915be5186016f675cd83a1e08523649b0e33efdb898db577ef5bb009", size = 23642, upload-time = "2025-09-27T18:37:01.673Z" }, + { url = "https://files.pythonhosted.org/packages/80/d6/2d1b89f6ca4bff1036499b1e29a1d02d282259f3681540e16563f27ebc23/markupsafe-3.0.3-cp313-cp313t-win32.whl", hash = "sha256:69c0b73548bc525c8cb9a251cddf1931d1db4d2258e9599c28c07ef3580ef354", size = 14612, upload-time = "2025-09-27T18:37:02.639Z" }, + { url = "https://files.pythonhosted.org/packages/2b/98/e48a4bfba0a0ffcf9925fe2d69240bfaa19c6f7507b8cd09c70684a53c1e/markupsafe-3.0.3-cp313-cp313t-win_amd64.whl", hash = "sha256:1b4b79e8ebf6b55351f0d91fe80f893b4743f104bff22e90697db1590e47a218", size = 15200, upload-time = "2025-09-27T18:37:03.582Z" }, + { url = "https://files.pythonhosted.org/packages/0e/72/e3cc540f351f316e9ed0f092757459afbc595824ca724cbc5a5d4263713f/markupsafe-3.0.3-cp313-cp313t-win_arm64.whl", hash = "sha256:ad2cf8aa28b8c020ab2fc8287b0f823d0a7d8630784c31e9ee5edea20f406287", size = 13973, upload-time = "2025-09-27T18:37:04.929Z" }, + { url = "https://files.pythonhosted.org/packages/33/8a/8e42d4838cd89b7dde187011e97fe6c3af66d8c044997d2183fbd6d31352/markupsafe-3.0.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:eaa9599de571d72e2daf60164784109f19978b327a3910d3e9de8c97b5b70cfe", size = 11619, upload-time = "2025-09-27T18:37:06.342Z" }, + { url = "https://files.pythonhosted.org/packages/b5/64/7660f8a4a8e53c924d0fa05dc3a55c9cee10bbd82b11c5afb27d44b096ce/markupsafe-3.0.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c47a551199eb8eb2121d4f0f15ae0f923d31350ab9280078d1e5f12b249e0026", size = 12029, upload-time = "2025-09-27T18:37:07.213Z" }, + { url = "https://files.pythonhosted.org/packages/da/ef/e648bfd021127bef5fa12e1720ffed0c6cbb8310c8d9bea7266337ff06de/markupsafe-3.0.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f34c41761022dd093b4b6896d4810782ffbabe30f2d443ff5f083e0cbbb8c737", size = 24408, upload-time = "2025-09-27T18:37:09.572Z" }, + { url = "https://files.pythonhosted.org/packages/41/3c/a36c2450754618e62008bf7435ccb0f88053e07592e6028a34776213d877/markupsafe-3.0.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:457a69a9577064c05a97c41f4e65148652db078a3a509039e64d3467b9e7ef97", size = 23005, upload-time = "2025-09-27T18:37:10.58Z" }, + { url = "https://files.pythonhosted.org/packages/bc/20/b7fdf89a8456b099837cd1dc21974632a02a999ec9bf7ca3e490aacd98e7/markupsafe-3.0.3-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:e8afc3f2ccfa24215f8cb28dcf43f0113ac3c37c2f0f0806d8c70e4228c5cf4d", size = 22048, upload-time = "2025-09-27T18:37:11.547Z" }, + { url = "https://files.pythonhosted.org/packages/9a/a7/591f592afdc734f47db08a75793a55d7fbcc6902a723ae4cfbab61010cc5/markupsafe-3.0.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:ec15a59cf5af7be74194f7ab02d0f59a62bdcf1a537677ce67a2537c9b87fcda", size = 23821, upload-time = 
"2025-09-27T18:37:12.48Z" }, + { url = "https://files.pythonhosted.org/packages/7d/33/45b24e4f44195b26521bc6f1a82197118f74df348556594bd2262bda1038/markupsafe-3.0.3-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:0eb9ff8191e8498cca014656ae6b8d61f39da5f95b488805da4bb029cccbfbaf", size = 21606, upload-time = "2025-09-27T18:37:13.485Z" }, + { url = "https://files.pythonhosted.org/packages/ff/0e/53dfaca23a69fbfbbf17a4b64072090e70717344c52eaaaa9c5ddff1e5f0/markupsafe-3.0.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:2713baf880df847f2bece4230d4d094280f4e67b1e813eec43b4c0e144a34ffe", size = 23043, upload-time = "2025-09-27T18:37:14.408Z" }, + { url = "https://files.pythonhosted.org/packages/46/11/f333a06fc16236d5238bfe74daccbca41459dcd8d1fa952e8fbd5dccfb70/markupsafe-3.0.3-cp314-cp314-win32.whl", hash = "sha256:729586769a26dbceff69f7a7dbbf59ab6572b99d94576a5592625d5b411576b9", size = 14747, upload-time = "2025-09-27T18:37:15.36Z" }, + { url = "https://files.pythonhosted.org/packages/28/52/182836104b33b444e400b14f797212f720cbc9ed6ba34c800639d154e821/markupsafe-3.0.3-cp314-cp314-win_amd64.whl", hash = "sha256:bdc919ead48f234740ad807933cdf545180bfbe9342c2bb451556db2ed958581", size = 15341, upload-time = "2025-09-27T18:37:16.496Z" }, + { url = "https://files.pythonhosted.org/packages/6f/18/acf23e91bd94fd7b3031558b1f013adfa21a8e407a3fdb32745538730382/markupsafe-3.0.3-cp314-cp314-win_arm64.whl", hash = "sha256:5a7d5dc5140555cf21a6fefbdbf8723f06fcd2f63ef108f2854de715e4422cb4", size = 14073, upload-time = "2025-09-27T18:37:17.476Z" }, + { url = "https://files.pythonhosted.org/packages/3c/f0/57689aa4076e1b43b15fdfa646b04653969d50cf30c32a102762be2485da/markupsafe-3.0.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:1353ef0c1b138e1907ae78e2f6c63ff67501122006b0f9abad68fda5f4ffc6ab", size = 11661, upload-time = "2025-09-27T18:37:18.453Z" }, + { url = "https://files.pythonhosted.org/packages/89/c3/2e67a7ca217c6912985ec766c6393b636fb0c2344443ff9d91404dc4c79f/markupsafe-3.0.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:1085e7fbddd3be5f89cc898938f42c0b3c711fdcb37d75221de2666af647c175", size = 12069, upload-time = "2025-09-27T18:37:19.332Z" }, + { url = "https://files.pythonhosted.org/packages/f0/00/be561dce4e6ca66b15276e184ce4b8aec61fe83662cce2f7d72bd3249d28/markupsafe-3.0.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1b52b4fb9df4eb9ae465f8d0c228a00624de2334f216f178a995ccdcf82c4634", size = 25670, upload-time = "2025-09-27T18:37:20.245Z" }, + { url = "https://files.pythonhosted.org/packages/50/09/c419f6f5a92e5fadde27efd190eca90f05e1261b10dbd8cbcb39cd8ea1dc/markupsafe-3.0.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:fed51ac40f757d41b7c48425901843666a6677e3e8eb0abcff09e4ba6e664f50", size = 23598, upload-time = "2025-09-27T18:37:21.177Z" }, + { url = "https://files.pythonhosted.org/packages/22/44/a0681611106e0b2921b3033fc19bc53323e0b50bc70cffdd19f7d679bb66/markupsafe-3.0.3-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:f190daf01f13c72eac4efd5c430a8de82489d9cff23c364c3ea822545032993e", size = 23261, upload-time = "2025-09-27T18:37:22.167Z" }, + { url = "https://files.pythonhosted.org/packages/5f/57/1b0b3f100259dc9fffe780cfb60d4be71375510e435efec3d116b6436d43/markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:e56b7d45a839a697b5eb268c82a71bd8c7f6c94d6fd50c3d577fa39a9f1409f5", size = 24835, upload-time = 
"2025-09-27T18:37:23.296Z" }, + { url = "https://files.pythonhosted.org/packages/26/6a/4bf6d0c97c4920f1597cc14dd720705eca0bf7c787aebc6bb4d1bead5388/markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:f3e98bb3798ead92273dc0e5fd0f31ade220f59a266ffd8a4f6065e0a3ce0523", size = 22733, upload-time = "2025-09-27T18:37:24.237Z" }, + { url = "https://files.pythonhosted.org/packages/14/c7/ca723101509b518797fedc2fdf79ba57f886b4aca8a7d31857ba3ee8281f/markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:5678211cb9333a6468fb8d8be0305520aa073f50d17f089b5b4b477ea6e67fdc", size = 23672, upload-time = "2025-09-27T18:37:25.271Z" }, + { url = "https://files.pythonhosted.org/packages/fb/df/5bd7a48c256faecd1d36edc13133e51397e41b73bb77e1a69deab746ebac/markupsafe-3.0.3-cp314-cp314t-win32.whl", hash = "sha256:915c04ba3851909ce68ccc2b8e2cd691618c4dc4c4232fb7982bca3f41fd8c3d", size = 14819, upload-time = "2025-09-27T18:37:26.285Z" }, + { url = "https://files.pythonhosted.org/packages/1a/8a/0402ba61a2f16038b48b39bccca271134be00c5c9f0f623208399333c448/markupsafe-3.0.3-cp314-cp314t-win_amd64.whl", hash = "sha256:4faffd047e07c38848ce017e8725090413cd80cbc23d86e55c587bf979e579c9", size = 15426, upload-time = "2025-09-27T18:37:27.316Z" }, + { url = "https://files.pythonhosted.org/packages/70/bc/6f1c2f612465f5fa89b95bead1f44dcb607670fd42891d8fdcd5d039f4f4/markupsafe-3.0.3-cp314-cp314t-win_arm64.whl", hash = "sha256:32001d6a8fc98c8cb5c947787c5d08b0a50663d139f1305bac5885d98d9b40fa", size = 14146, upload-time = "2025-09-27T18:37:28.327Z" }, +] + +[[package]] +name = "nodeenv" +version = "1.10.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/24/bf/d1bda4f6168e0b2e9e5958945e01910052158313224ada5ce1fb2e1113b8/nodeenv-1.10.0.tar.gz", hash = "sha256:996c191ad80897d076bdfba80a41994c2b47c68e224c542b48feba42ba00f8bb", size = 55611, upload-time = "2025-12-20T14:08:54.006Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/88/b2/d0896bdcdc8d28a7fc5717c305f1a861c26e18c05047949fb371034d98bd/nodeenv-1.10.0-py2.py3-none-any.whl", hash = "sha256:5bb13e3eed2923615535339b3c620e76779af4cb4c6a90deccc9e36b274d3827", size = 23438, upload-time = "2025-12-20T14:08:52.782Z" }, +] + +[[package]] +name = "packaging" +version = "26.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/65/ee/299d360cdc32edc7d2cf530f3accf79c4fca01e96ffc950d8a52213bd8e4/packaging-26.0.tar.gz", hash = "sha256:00243ae351a257117b6a241061796684b084ed1c516a08c48a3f7e147a9d80b4", size = 143416, upload-time = "2026-01-21T20:50:39.064Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b7/b9/c538f279a4e237a006a2c98387d081e9eb060d203d8ed34467cc0f0b9b53/packaging-26.0-py3-none-any.whl", hash = "sha256:b36f1fef9334a5588b4166f8bcd26a14e521f2b55e6b9de3aaa80d3ff7a37529", size = 74366, upload-time = "2026-01-21T20:50:37.788Z" }, +] + +[[package]] +name = "platformdirs" +version = "4.9.4" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/19/56/8d4c30c8a1d07013911a8fdbd8f89440ef9f08d07a1b50ab8ca8be5a20f9/platformdirs-4.9.4.tar.gz", hash = "sha256:1ec356301b7dc906d83f371c8f487070e99d3ccf9e501686456394622a01a934", size = 28737, upload-time = "2026-03-05T18:34:13.271Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/63/d7/97f7e3a6abb67d8080dd406fd4df842c2be0efaf712d1c899c32a075027c/platformdirs-4.9.4-py3-none-any.whl", hash = 
"sha256:68a9a4619a666ea6439f2ff250c12a853cd1cbd5158d258bd824a7df6be2f868", size = 21216, upload-time = "2026-03-05T18:34:12.172Z" }, +] + +[[package]] +name = "pluggy" +version = "1.6.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/f9/e2/3e91f31a7d2b083fe6ef3fa267035b518369d9511ffab804f839851d2779/pluggy-1.6.0.tar.gz", hash = "sha256:7dcc130b76258d33b90f61b658791dede3486c3e6bfb003ee5c9bfb396dd22f3", size = 69412, upload-time = "2025-05-15T12:30:07.975Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/54/20/4d324d65cc6d9205fabedc306948156824eb9f0ee1633355a8f7ec5c66bf/pluggy-1.6.0-py3-none-any.whl", hash = "sha256:e920276dd6813095e9377c0bc5566d94c932c33b27a3e3945d8389c374dd4746", size = 20538, upload-time = "2025-05-15T12:30:06.134Z" }, +] + +[[package]] +name = "pre-commit" +version = "4.5.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "cfgv" }, + { name = "identify" }, + { name = "nodeenv" }, + { name = "pyyaml" }, + { name = "virtualenv" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/40/f1/6d86a29246dfd2e9b6237f0b5823717f60cad94d47ddc26afa916d21f525/pre_commit-4.5.1.tar.gz", hash = "sha256:eb545fcff725875197837263e977ea257a402056661f09dae08e4b149b030a61", size = 198232, upload-time = "2025-12-16T21:14:33.552Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/5d/19/fd3ef348460c80af7bb4669ea7926651d1f95c23ff2df18b9d24bab4f3fa/pre_commit-4.5.1-py2.py3-none-any.whl", hash = "sha256:3b3afd891e97337708c1674210f8eba659b52a38ea5f822ff142d10786221f77", size = 226437, upload-time = "2025-12-16T21:14:32.409Z" }, +] + +[[package]] +name = "psycopg2-binary" +version = "2.9.11" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/ac/6c/8767aaa597ba424643dc87348c6f1754dd9f48e80fdc1b9f7ca5c3a7c213/psycopg2-binary-2.9.11.tar.gz", hash = "sha256:b6aed9e096bf63f9e75edf2581aa9a7e7186d97ab5c177aa6c87797cd591236c", size = 379620, upload-time = "2025-10-10T11:14:48.041Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ff/a8/a2709681b3ac11b0b1786def10006b8995125ba268c9a54bea6f5ae8bd3e/psycopg2_binary-2.9.11-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:b8fb3db325435d34235b044b199e56cdf9ff41223a4b9752e8576465170bb38c", size = 3756572, upload-time = "2025-10-10T11:12:32.873Z" }, + { url = "https://files.pythonhosted.org/packages/62/e1/c2b38d256d0dafd32713e9f31982a5b028f4a3651f446be70785f484f472/psycopg2_binary-2.9.11-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:366df99e710a2acd90efed3764bb1e28df6c675d33a7fb40df9b7281694432ee", size = 3864529, upload-time = "2025-10-10T11:12:36.791Z" }, + { url = "https://files.pythonhosted.org/packages/11/32/b2ffe8f3853c181e88f0a157c5fb4e383102238d73c52ac6d93a5c8bffe6/psycopg2_binary-2.9.11-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:8c55b385daa2f92cb64b12ec4536c66954ac53654c7f15a203578da4e78105c0", size = 4411242, upload-time = "2025-10-10T11:12:42.388Z" }, + { url = "https://files.pythonhosted.org/packages/10/04/6ca7477e6160ae258dc96f67c371157776564679aefd247b66f4661501a2/psycopg2_binary-2.9.11-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:c0377174bf1dd416993d16edc15357f6eb17ac998244cca19bc67cdc0e2e5766", size = 4468258, upload-time = "2025-10-10T11:12:48.654Z" }, + { url = 
"https://files.pythonhosted.org/packages/3c/7e/6a1a38f86412df101435809f225d57c1a021307dd0689f7a5e7fe83588b1/psycopg2_binary-2.9.11-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:5c6ff3335ce08c75afaed19e08699e8aacf95d4a260b495a4a8545244fe2ceb3", size = 4166295, upload-time = "2025-10-10T11:12:52.525Z" }, + { url = "https://files.pythonhosted.org/packages/f2/7d/c07374c501b45f3579a9eb761cbf2604ddef3d96ad48679112c2c5aa9c25/psycopg2_binary-2.9.11-cp313-cp313-manylinux_2_38_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:84011ba3109e06ac412f95399b704d3d6950e386b7994475b231cf61eec2fc1f", size = 3983133, upload-time = "2025-10-30T02:55:24.329Z" }, + { url = "https://files.pythonhosted.org/packages/82/56/993b7104cb8345ad7d4516538ccf8f0d0ac640b1ebd8c754a7b024e76878/psycopg2_binary-2.9.11-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:ba34475ceb08cccbdd98f6b46916917ae6eeb92b5ae111df10b544c3a4621dc4", size = 3652383, upload-time = "2025-10-10T11:12:56.387Z" }, + { url = "https://files.pythonhosted.org/packages/2d/ac/eaeb6029362fd8d454a27374d84c6866c82c33bfc24587b4face5a8e43ef/psycopg2_binary-2.9.11-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:b31e90fdd0f968c2de3b26ab014314fe814225b6c324f770952f7d38abf17e3c", size = 3298168, upload-time = "2025-10-10T11:13:00.403Z" }, + { url = "https://files.pythonhosted.org/packages/2b/39/50c3facc66bded9ada5cbc0de867499a703dc6bca6be03070b4e3b65da6c/psycopg2_binary-2.9.11-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:d526864e0f67f74937a8fce859bd56c979f5e2ec57ca7c627f5f1071ef7fee60", size = 3044712, upload-time = "2025-10-30T02:55:27.975Z" }, + { url = "https://files.pythonhosted.org/packages/9c/8e/b7de019a1f562f72ada81081a12823d3c1590bedc48d7d2559410a2763fe/psycopg2_binary-2.9.11-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:04195548662fa544626c8ea0f06561eb6203f1984ba5b4562764fbeb4c3d14b1", size = 3347549, upload-time = "2025-10-10T11:13:03.971Z" }, + { url = "https://files.pythonhosted.org/packages/80/2d/1bb683f64737bbb1f86c82b7359db1eb2be4e2c0c13b947f80efefa7d3e5/psycopg2_binary-2.9.11-cp313-cp313-win_amd64.whl", hash = "sha256:efff12b432179443f54e230fdf60de1f6cc726b6c832db8701227d089310e8aa", size = 2714215, upload-time = "2025-10-10T11:13:07.14Z" }, + { url = "https://files.pythonhosted.org/packages/64/12/93ef0098590cf51d9732b4f139533732565704f45bdc1ffa741b7c95fb54/psycopg2_binary-2.9.11-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:92e3b669236327083a2e33ccfa0d320dd01b9803b3e14dd986a4fc54aa00f4e1", size = 3756567, upload-time = "2025-10-10T11:13:11.885Z" }, + { url = "https://files.pythonhosted.org/packages/7c/a9/9d55c614a891288f15ca4b5209b09f0f01e3124056924e17b81b9fa054cc/psycopg2_binary-2.9.11-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:e0deeb03da539fa3577fcb0b3f2554a97f7e5477c246098dbb18091a4a01c16f", size = 3864755, upload-time = "2025-10-10T11:13:17.727Z" }, + { url = "https://files.pythonhosted.org/packages/13/1e/98874ce72fd29cbde93209977b196a2edae03f8490d1bd8158e7f1daf3a0/psycopg2_binary-2.9.11-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:9b52a3f9bb540a3e4ec0f6ba6d31339727b2950c9772850d6545b7eae0b9d7c5", size = 4411646, upload-time = "2025-10-10T11:13:24.432Z" }, + { url = "https://files.pythonhosted.org/packages/5a/bd/a335ce6645334fb8d758cc358810defca14a1d19ffbc8a10bd38a2328565/psycopg2_binary-2.9.11-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:db4fd476874ccfdbb630a54426964959e58da4c61c9feba73e6094d51303d7d8", 
size = 4468701, upload-time = "2025-10-10T11:13:29.266Z" }, + { url = "https://files.pythonhosted.org/packages/44/d6/c8b4f53f34e295e45709b7568bf9b9407a612ea30387d35eb9fa84f269b4/psycopg2_binary-2.9.11-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:47f212c1d3be608a12937cc131bd85502954398aaa1320cb4c14421a0ffccf4c", size = 4166293, upload-time = "2025-10-10T11:13:33.336Z" }, + { url = "https://files.pythonhosted.org/packages/4b/e0/f8cc36eadd1b716ab36bb290618a3292e009867e5c97ce4aba908cb99644/psycopg2_binary-2.9.11-cp314-cp314-manylinux_2_38_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:e35b7abae2b0adab776add56111df1735ccc71406e56203515e228a8dc07089f", size = 3983184, upload-time = "2025-10-30T02:55:32.483Z" }, + { url = "https://files.pythonhosted.org/packages/53/3e/2a8fe18a4e61cfb3417da67b6318e12691772c0696d79434184a511906dc/psycopg2_binary-2.9.11-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:fcf21be3ce5f5659daefd2b3b3b6e4727b028221ddc94e6c1523425579664747", size = 3652650, upload-time = "2025-10-10T11:13:38.181Z" }, + { url = "https://files.pythonhosted.org/packages/76/36/03801461b31b29fe58d228c24388f999fe814dfc302856e0d17f97d7c54d/psycopg2_binary-2.9.11-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:9bd81e64e8de111237737b29d68039b9c813bdf520156af36d26819c9a979e5f", size = 3298663, upload-time = "2025-10-10T11:13:44.878Z" }, + { url = "https://files.pythonhosted.org/packages/97/77/21b0ea2e1a73aa5fa9222b2a6b8ba325c43c3a8d54272839c991f2345656/psycopg2_binary-2.9.11-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:32770a4d666fbdafab017086655bcddab791d7cb260a16679cc5a7338b64343b", size = 3044737, upload-time = "2025-10-30T02:55:35.69Z" }, + { url = "https://files.pythonhosted.org/packages/67/69/f36abe5f118c1dca6d3726ceae164b9356985805480731ac6712a63f24f0/psycopg2_binary-2.9.11-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:c3cb3a676873d7506825221045bd70e0427c905b9c8ee8d6acd70cfcbd6e576d", size = 3347643, upload-time = "2025-10-10T11:13:53.499Z" }, + { url = "https://files.pythonhosted.org/packages/e1/36/9c0c326fe3a4227953dfb29f5d0c8ae3b8eb8c1cd2967aa569f50cb3c61f/psycopg2_binary-2.9.11-cp314-cp314-win_amd64.whl", hash = "sha256:4012c9c954dfaccd28f94e84ab9f94e12df76b4afb22331b1f0d3154893a6316", size = 2803913, upload-time = "2025-10-10T11:13:57.058Z" }, +] + +[[package]] +name = "pycparser" +version = "3.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/1b/7d/92392ff7815c21062bea51aa7b87d45576f649f16458d78b7cf94b9ab2e6/pycparser-3.0.tar.gz", hash = "sha256:600f49d217304a5902ac3c37e1281c9fe94e4d0489de643a9504c5cdfdfc6b29", size = 103492, upload-time = "2026-01-21T14:26:51.89Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/0c/c3/44f3fbbfa403ea2a7c779186dc20772604442dde72947e7d01069cbe98e3/pycparser-3.0-py3-none-any.whl", hash = "sha256:b727414169a36b7d524c1c3e31839a521725078d7b2ff038656844266160a992", size = 48172, upload-time = "2026-01-21T14:26:50.693Z" }, +] + +[[package]] +name = "pydantic" +version = "2.12.5" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "annotated-types" }, + { name = "pydantic-core" }, + { name = "typing-extensions" }, + { name = "typing-inspection" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/69/44/36f1a6e523abc58ae5f928898e4aca2e0ea509b5aa6f6f392a5d882be928/pydantic-2.12.5.tar.gz", hash = "sha256:4d351024c75c0f085a9febbb665ce8c0c6ec5d30e903bdb6394b7ede26aebb49", size = 821591, 
upload-time = "2025-11-26T15:11:46.471Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/5a/87/b70ad306ebb6f9b585f114d0ac2137d792b48be34d732d60e597c2f8465a/pydantic-2.12.5-py3-none-any.whl", hash = "sha256:e561593fccf61e8a20fc46dfc2dfe075b8be7d0188df33f221ad1f0139180f9d", size = 463580, upload-time = "2025-11-26T15:11:44.605Z" }, +] + +[[package]] +name = "pydantic-core" +version = "2.41.5" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/71/70/23b021c950c2addd24ec408e9ab05d59b035b39d97cdc1130e1bce647bb6/pydantic_core-2.41.5.tar.gz", hash = "sha256:08daa51ea16ad373ffd5e7606252cc32f07bc72b28284b6bc9c6df804816476e", size = 460952, upload-time = "2025-11-04T13:43:49.098Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/87/06/8806241ff1f70d9939f9af039c6c35f2360cf16e93c2ca76f184e76b1564/pydantic_core-2.41.5-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:941103c9be18ac8daf7b7adca8228f8ed6bb7a1849020f643b3a14d15b1924d9", size = 2120403, upload-time = "2025-11-04T13:40:25.248Z" }, + { url = "https://files.pythonhosted.org/packages/94/02/abfa0e0bda67faa65fef1c84971c7e45928e108fe24333c81f3bfe35d5f5/pydantic_core-2.41.5-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:112e305c3314f40c93998e567879e887a3160bb8689ef3d2c04b6cc62c33ac34", size = 1896206, upload-time = "2025-11-04T13:40:27.099Z" }, + { url = "https://files.pythonhosted.org/packages/15/df/a4c740c0943e93e6500f9eb23f4ca7ec9bf71b19e608ae5b579678c8d02f/pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0cbaad15cb0c90aa221d43c00e77bb33c93e8d36e0bf74760cd00e732d10a6a0", size = 1919307, upload-time = "2025-11-04T13:40:29.806Z" }, + { url = "https://files.pythonhosted.org/packages/9a/e3/6324802931ae1d123528988e0e86587c2072ac2e5394b4bc2bc34b61ff6e/pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:03ca43e12fab6023fc79d28ca6b39b05f794ad08ec2feccc59a339b02f2b3d33", size = 2063258, upload-time = "2025-11-04T13:40:33.544Z" }, + { url = "https://files.pythonhosted.org/packages/c9/d4/2230d7151d4957dd79c3044ea26346c148c98fbf0ee6ebd41056f2d62ab5/pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:dc799088c08fa04e43144b164feb0c13f9a0bc40503f8df3e9fde58a3c0c101e", size = 2214917, upload-time = "2025-11-04T13:40:35.479Z" }, + { url = "https://files.pythonhosted.org/packages/e6/9f/eaac5df17a3672fef0081b6c1bb0b82b33ee89aa5cec0d7b05f52fd4a1fa/pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:97aeba56665b4c3235a0e52b2c2f5ae9cd071b8a8310ad27bddb3f7fb30e9aa2", size = 2332186, upload-time = "2025-11-04T13:40:37.436Z" }, + { url = "https://files.pythonhosted.org/packages/cf/4e/35a80cae583a37cf15604b44240e45c05e04e86f9cfd766623149297e971/pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:406bf18d345822d6c21366031003612b9c77b3e29ffdb0f612367352aab7d586", size = 2073164, upload-time = "2025-11-04T13:40:40.289Z" }, + { url = "https://files.pythonhosted.org/packages/bf/e3/f6e262673c6140dd3305d144d032f7bd5f7497d3871c1428521f19f9efa2/pydantic_core-2.41.5-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b93590ae81f7010dbe380cdeab6f515902ebcbefe0b9327cc4804d74e93ae69d", size = 2179146, upload-time = "2025-11-04T13:40:42.809Z" }, + { url = 
"https://files.pythonhosted.org/packages/75/c7/20bd7fc05f0c6ea2056a4565c6f36f8968c0924f19b7d97bbfea55780e73/pydantic_core-2.41.5-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:01a3d0ab748ee531f4ea6c3e48ad9dac84ddba4b0d82291f87248f2f9de8d740", size = 2137788, upload-time = "2025-11-04T13:40:44.752Z" }, + { url = "https://files.pythonhosted.org/packages/3a/8d/34318ef985c45196e004bc46c6eab2eda437e744c124ef0dbe1ff2c9d06b/pydantic_core-2.41.5-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:6561e94ba9dacc9c61bce40e2d6bdc3bfaa0259d3ff36ace3b1e6901936d2e3e", size = 2340133, upload-time = "2025-11-04T13:40:46.66Z" }, + { url = "https://files.pythonhosted.org/packages/9c/59/013626bf8c78a5a5d9350d12e7697d3d4de951a75565496abd40ccd46bee/pydantic_core-2.41.5-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:915c3d10f81bec3a74fbd4faebe8391013ba61e5a1a8d48c4455b923bdda7858", size = 2324852, upload-time = "2025-11-04T13:40:48.575Z" }, + { url = "https://files.pythonhosted.org/packages/1a/d9/c248c103856f807ef70c18a4f986693a46a8ffe1602e5d361485da502d20/pydantic_core-2.41.5-cp313-cp313-win32.whl", hash = "sha256:650ae77860b45cfa6e2cdafc42618ceafab3a2d9a3811fcfbd3bbf8ac3c40d36", size = 1994679, upload-time = "2025-11-04T13:40:50.619Z" }, + { url = "https://files.pythonhosted.org/packages/9e/8b/341991b158ddab181cff136acd2552c9f35bd30380422a639c0671e99a91/pydantic_core-2.41.5-cp313-cp313-win_amd64.whl", hash = "sha256:79ec52ec461e99e13791ec6508c722742ad745571f234ea6255bed38c6480f11", size = 2019766, upload-time = "2025-11-04T13:40:52.631Z" }, + { url = "https://files.pythonhosted.org/packages/73/7d/f2f9db34af103bea3e09735bb40b021788a5e834c81eedb541991badf8f5/pydantic_core-2.41.5-cp313-cp313-win_arm64.whl", hash = "sha256:3f84d5c1b4ab906093bdc1ff10484838aca54ef08de4afa9de0f5f14d69639cd", size = 1981005, upload-time = "2025-11-04T13:40:54.734Z" }, + { url = "https://files.pythonhosted.org/packages/ea/28/46b7c5c9635ae96ea0fbb779e271a38129df2550f763937659ee6c5dbc65/pydantic_core-2.41.5-cp314-cp314-macosx_10_12_x86_64.whl", hash = "sha256:3f37a19d7ebcdd20b96485056ba9e8b304e27d9904d233d7b1015db320e51f0a", size = 2119622, upload-time = "2025-11-04T13:40:56.68Z" }, + { url = "https://files.pythonhosted.org/packages/74/1a/145646e5687e8d9a1e8d09acb278c8535ebe9e972e1f162ed338a622f193/pydantic_core-2.41.5-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:1d1d9764366c73f996edd17abb6d9d7649a7eb690006ab6adbda117717099b14", size = 1891725, upload-time = "2025-11-04T13:40:58.807Z" }, + { url = "https://files.pythonhosted.org/packages/23/04/e89c29e267b8060b40dca97bfc64a19b2a3cf99018167ea1677d96368273/pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:25e1c2af0fce638d5f1988b686f3b3ea8cd7de5f244ca147c777769e798a9cd1", size = 1915040, upload-time = "2025-11-04T13:41:00.853Z" }, + { url = "https://files.pythonhosted.org/packages/84/a3/15a82ac7bd97992a82257f777b3583d3e84bdb06ba6858f745daa2ec8a85/pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:506d766a8727beef16b7adaeb8ee6217c64fc813646b424d0804d67c16eddb66", size = 2063691, upload-time = "2025-11-04T13:41:03.504Z" }, + { url = "https://files.pythonhosted.org/packages/74/9b/0046701313c6ef08c0c1cf0e028c67c770a4e1275ca73131563c5f2a310a/pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4819fa52133c9aa3c387b3328f25c1facc356491e6135b459f1de698ff64d869", size = 2213897, upload-time = "2025-11-04T13:41:05.804Z" }, + { url = 
"https://files.pythonhosted.org/packages/8a/cd/6bac76ecd1b27e75a95ca3a9a559c643b3afcd2dd62086d4b7a32a18b169/pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2b761d210c9ea91feda40d25b4efe82a1707da2ef62901466a42492c028553a2", size = 2333302, upload-time = "2025-11-04T13:41:07.809Z" }, + { url = "https://files.pythonhosted.org/packages/4c/d2/ef2074dc020dd6e109611a8be4449b98cd25e1b9b8a303c2f0fca2f2bcf7/pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:22f0fb8c1c583a3b6f24df2470833b40207e907b90c928cc8d3594b76f874375", size = 2064877, upload-time = "2025-11-04T13:41:09.827Z" }, + { url = "https://files.pythonhosted.org/packages/18/66/e9db17a9a763d72f03de903883c057b2592c09509ccfe468187f2a2eef29/pydantic_core-2.41.5-cp314-cp314-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2782c870e99878c634505236d81e5443092fba820f0373997ff75f90f68cd553", size = 2180680, upload-time = "2025-11-04T13:41:12.379Z" }, + { url = "https://files.pythonhosted.org/packages/d3/9e/3ce66cebb929f3ced22be85d4c2399b8e85b622db77dad36b73c5387f8f8/pydantic_core-2.41.5-cp314-cp314-musllinux_1_1_aarch64.whl", hash = "sha256:0177272f88ab8312479336e1d777f6b124537d47f2123f89cb37e0accea97f90", size = 2138960, upload-time = "2025-11-04T13:41:14.627Z" }, + { url = "https://files.pythonhosted.org/packages/a6/62/205a998f4327d2079326b01abee48e502ea739d174f0a89295c481a2272e/pydantic_core-2.41.5-cp314-cp314-musllinux_1_1_armv7l.whl", hash = "sha256:63510af5e38f8955b8ee5687740d6ebf7c2a0886d15a6d65c32814613681bc07", size = 2339102, upload-time = "2025-11-04T13:41:16.868Z" }, + { url = "https://files.pythonhosted.org/packages/3c/0d/f05e79471e889d74d3d88f5bd20d0ed189ad94c2423d81ff8d0000aab4ff/pydantic_core-2.41.5-cp314-cp314-musllinux_1_1_x86_64.whl", hash = "sha256:e56ba91f47764cc14f1daacd723e3e82d1a89d783f0f5afe9c364b8bb491ccdb", size = 2326039, upload-time = "2025-11-04T13:41:18.934Z" }, + { url = "https://files.pythonhosted.org/packages/ec/e1/e08a6208bb100da7e0c4b288eed624a703f4d129bde2da475721a80cab32/pydantic_core-2.41.5-cp314-cp314-win32.whl", hash = "sha256:aec5cf2fd867b4ff45b9959f8b20ea3993fc93e63c7363fe6851424c8a7e7c23", size = 1995126, upload-time = "2025-11-04T13:41:21.418Z" }, + { url = "https://files.pythonhosted.org/packages/48/5d/56ba7b24e9557f99c9237e29f5c09913c81eeb2f3217e40e922353668092/pydantic_core-2.41.5-cp314-cp314-win_amd64.whl", hash = "sha256:8e7c86f27c585ef37c35e56a96363ab8de4e549a95512445b85c96d3e2f7c1bf", size = 2015489, upload-time = "2025-11-04T13:41:24.076Z" }, + { url = "https://files.pythonhosted.org/packages/4e/bb/f7a190991ec9e3e0ba22e4993d8755bbc4a32925c0b5b42775c03e8148f9/pydantic_core-2.41.5-cp314-cp314-win_arm64.whl", hash = "sha256:e672ba74fbc2dc8eea59fb6d4aed6845e6905fc2a8afe93175d94a83ba2a01a0", size = 1977288, upload-time = "2025-11-04T13:41:26.33Z" }, + { url = "https://files.pythonhosted.org/packages/92/ed/77542d0c51538e32e15afe7899d79efce4b81eee631d99850edc2f5e9349/pydantic_core-2.41.5-cp314-cp314t-macosx_10_12_x86_64.whl", hash = "sha256:8566def80554c3faa0e65ac30ab0932b9e3a5cd7f8323764303d468e5c37595a", size = 2120255, upload-time = "2025-11-04T13:41:28.569Z" }, + { url = "https://files.pythonhosted.org/packages/bb/3d/6913dde84d5be21e284439676168b28d8bbba5600d838b9dca99de0fad71/pydantic_core-2.41.5-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:b80aa5095cd3109962a298ce14110ae16b8c1aece8b72f9dafe81cf597ad80b3", size = 1863760, upload-time = "2025-11-04T13:41:31.055Z" }, + { url = 
"https://files.pythonhosted.org/packages/5a/f0/e5e6b99d4191da102f2b0eb9687aaa7f5bea5d9964071a84effc3e40f997/pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3006c3dd9ba34b0c094c544c6006cc79e87d8612999f1a5d43b769b89181f23c", size = 1878092, upload-time = "2025-11-04T13:41:33.21Z" }, + { url = "https://files.pythonhosted.org/packages/71/48/36fb760642d568925953bcc8116455513d6e34c4beaa37544118c36aba6d/pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:72f6c8b11857a856bcfa48c86f5368439f74453563f951e473514579d44aa612", size = 2053385, upload-time = "2025-11-04T13:41:35.508Z" }, + { url = "https://files.pythonhosted.org/packages/20/25/92dc684dd8eb75a234bc1c764b4210cf2646479d54b47bf46061657292a8/pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5cb1b2f9742240e4bb26b652a5aeb840aa4b417c7748b6f8387927bc6e45e40d", size = 2218832, upload-time = "2025-11-04T13:41:37.732Z" }, + { url = "https://files.pythonhosted.org/packages/e2/09/f53e0b05023d3e30357d82eb35835d0f6340ca344720a4599cd663dca599/pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:bd3d54f38609ff308209bd43acea66061494157703364ae40c951f83ba99a1a9", size = 2327585, upload-time = "2025-11-04T13:41:40Z" }, + { url = "https://files.pythonhosted.org/packages/aa/4e/2ae1aa85d6af35a39b236b1b1641de73f5a6ac4d5a7509f77b814885760c/pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2ff4321e56e879ee8d2a879501c8e469414d948f4aba74a2d4593184eb326660", size = 2041078, upload-time = "2025-11-04T13:41:42.323Z" }, + { url = "https://files.pythonhosted.org/packages/cd/13/2e215f17f0ef326fc72afe94776edb77525142c693767fc347ed6288728d/pydantic_core-2.41.5-cp314-cp314t-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d0d2568a8c11bf8225044aa94409e21da0cb09dcdafe9ecd10250b2baad531a9", size = 2173914, upload-time = "2025-11-04T13:41:45.221Z" }, + { url = "https://files.pythonhosted.org/packages/02/7a/f999a6dcbcd0e5660bc348a3991c8915ce6599f4f2c6ac22f01d7a10816c/pydantic_core-2.41.5-cp314-cp314t-musllinux_1_1_aarch64.whl", hash = "sha256:a39455728aabd58ceabb03c90e12f71fd30fa69615760a075b9fec596456ccc3", size = 2129560, upload-time = "2025-11-04T13:41:47.474Z" }, + { url = "https://files.pythonhosted.org/packages/3a/b1/6c990ac65e3b4c079a4fb9f5b05f5b013afa0f4ed6780a3dd236d2cbdc64/pydantic_core-2.41.5-cp314-cp314t-musllinux_1_1_armv7l.whl", hash = "sha256:239edca560d05757817c13dc17c50766136d21f7cd0fac50295499ae24f90fdf", size = 2329244, upload-time = "2025-11-04T13:41:49.992Z" }, + { url = "https://files.pythonhosted.org/packages/d9/02/3c562f3a51afd4d88fff8dffb1771b30cfdfd79befd9883ee094f5b6c0d8/pydantic_core-2.41.5-cp314-cp314t-musllinux_1_1_x86_64.whl", hash = "sha256:2a5e06546e19f24c6a96a129142a75cee553cc018ffee48a460059b1185f4470", size = 2331955, upload-time = "2025-11-04T13:41:54.079Z" }, + { url = "https://files.pythonhosted.org/packages/5c/96/5fb7d8c3c17bc8c62fdb031c47d77a1af698f1d7a406b0f79aaa1338f9ad/pydantic_core-2.41.5-cp314-cp314t-win32.whl", hash = "sha256:b4ececa40ac28afa90871c2cc2b9ffd2ff0bf749380fbdf57d165fd23da353aa", size = 1988906, upload-time = "2025-11-04T13:41:56.606Z" }, + { url = "https://files.pythonhosted.org/packages/22/ed/182129d83032702912c2e2d8bbe33c036f342cc735737064668585dac28f/pydantic_core-2.41.5-cp314-cp314t-win_amd64.whl", hash = 
"sha256:80aa89cad80b32a912a65332f64a4450ed00966111b6615ca6816153d3585a8c", size = 1981607, upload-time = "2025-11-04T13:41:58.889Z" }, + { url = "https://files.pythonhosted.org/packages/9f/ed/068e41660b832bb0b1aa5b58011dea2a3fe0ba7861ff38c4d4904c1c1a99/pydantic_core-2.41.5-cp314-cp314t-win_arm64.whl", hash = "sha256:35b44f37a3199f771c3eaa53051bc8a70cd7b54f333531c59e29fd4db5d15008", size = 1974769, upload-time = "2025-11-04T13:42:01.186Z" }, +] + +[[package]] +name = "pydantic-settings" +version = "2.13.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pydantic" }, + { name = "python-dotenv" }, + { name = "typing-inspection" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/52/6d/fffca34caecc4a3f97bda81b2098da5e8ab7efc9a66e819074a11955d87e/pydantic_settings-2.13.1.tar.gz", hash = "sha256:b4c11847b15237fb0171e1462bf540e294affb9b86db4d9aa5c01730bdbe4025", size = 223826, upload-time = "2026-02-19T13:45:08.055Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/00/4b/ccc026168948fec4f7555b9164c724cf4125eac006e176541483d2c959be/pydantic_settings-2.13.1-py3-none-any.whl", hash = "sha256:d56fd801823dbeae7f0975e1f8c8e25c258eb75d278ea7abb5d9cebb01b56237", size = 58929, upload-time = "2026-02-19T13:45:06.034Z" }, +] + +[[package]] +name = "pygments" +version = "2.19.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/b0/77/a5b8c569bf593b0140bde72ea885a803b82086995367bf2037de0159d924/pygments-2.19.2.tar.gz", hash = "sha256:636cb2477cec7f8952536970bc533bc43743542f70392ae026374600add5b887", size = 4968631, upload-time = "2025-06-21T13:39:12.283Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c7/21/705964c7812476f378728bdf590ca4b771ec72385c533964653c68e86bdc/pygments-2.19.2-py3-none-any.whl", hash = "sha256:86540386c03d588bb81d44bc3928634ff26449851e99741617ecb9037ee5ec0b", size = 1225217, upload-time = "2025-06-21T13:39:07.939Z" }, +] + +[[package]] +name = "pyjwt" +version = "2.12.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/c2/27/a3b6e5bf6ff856d2509292e95c8f57f0df7017cf5394921fc4e4ef40308a/pyjwt-2.12.1.tar.gz", hash = "sha256:c74a7a2adf861c04d002db713dd85f84beb242228e671280bf709d765b03672b", size = 102564, upload-time = "2026-03-13T19:27:37.25Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e5/7a/8dd906bd22e79e47397a61742927f6747fe93242ef86645ee9092e610244/pyjwt-2.12.1-py3-none-any.whl", hash = "sha256:28ca37c070cad8ba8cd9790cd940535d40274d22f80ab87f3ac6a713e6e8454c", size = 29726, upload-time = "2026-03-13T19:27:35.677Z" }, +] + +[[package]] +name = "pytest" +version = "9.0.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "colorama", marker = "sys_platform == 'win32'" }, + { name = "iniconfig" }, + { name = "packaging" }, + { name = "pluggy" }, + { name = "pygments" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/d1/db/7ef3487e0fb0049ddb5ce41d3a49c235bf9ad299b6a25d5780a89f19230f/pytest-9.0.2.tar.gz", hash = "sha256:75186651a92bd89611d1d9fc20f0b4345fd827c41ccd5c299a868a05d70edf11", size = 1568901, upload-time = "2025-12-06T21:30:51.014Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/3b/ab/b3226f0bd7cdcf710fbede2b3548584366da3b19b5021e74f5bde2a8fa3f/pytest-9.0.2-py3-none-any.whl", hash = "sha256:711ffd45bf766d5264d487b917733b453d917afd2b0ad65223959f59089f875b", size = 374801, upload-time = 
"2025-12-06T21:30:49.154Z" }, +] + +[[package]] +name = "pytest-asyncio" +version = "1.3.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pytest" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/90/2c/8af215c0f776415f3590cac4f9086ccefd6fd463befeae41cd4d3f193e5a/pytest_asyncio-1.3.0.tar.gz", hash = "sha256:d7f52f36d231b80ee124cd216ffb19369aa168fc10095013c6b014a34d3ee9e5", size = 50087, upload-time = "2025-11-10T16:07:47.256Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e5/35/f8b19922b6a25bc0880171a2f1a003eaeb93657475193ab516fd87cac9da/pytest_asyncio-1.3.0-py3-none-any.whl", hash = "sha256:611e26147c7f77640e6d0a92a38ed17c3e9848063698d5c93d5aa7aa11cebff5", size = 15075, upload-time = "2025-11-10T16:07:45.537Z" }, +] + +[[package]] +name = "pytest-httpx" +version = "0.36.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "httpx" }, + { name = "pytest" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/ca/bc/5574834da9499066fa1a5ea9c336f94dba2eae02298d36dab192fcf95c86/pytest_httpx-0.36.0.tar.gz", hash = "sha256:9edb66a5fd4388ce3c343189bc67e7e1cb50b07c2e3fc83b97d511975e8a831b", size = 56793, upload-time = "2025-12-02T16:34:57.414Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e2/d2/1eb1ea9c84f0d2033eb0b49675afdc71aa4ea801b74615f00f3c33b725e3/pytest_httpx-0.36.0-py3-none-any.whl", hash = "sha256:bd4c120bb80e142df856e825ec9f17981effb84d159f9fa29ed97e2357c3a9c8", size = 20229, upload-time = "2025-12-02T16:34:56.45Z" }, +] + +[[package]] +name = "python-dateutil" +version = "2.9.0.post0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "six" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/66/c0/0c8b6ad9f17a802ee498c46e004a0eb49bc148f2fd230864601a86dcf6db/python-dateutil-2.9.0.post0.tar.gz", hash = "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3", size = 342432, upload-time = "2024-03-01T18:36:20.211Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ec/57/56b9bcc3c9c6a792fcbaf139543cee77261f3651ca9da0c93f5c1221264b/python_dateutil-2.9.0.post0-py2.py3-none-any.whl", hash = "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427", size = 229892, upload-time = "2024-03-01T18:36:18.57Z" }, +] + +[[package]] +name = "python-discovery" +version = "1.1.3" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "filelock" }, + { name = "platformdirs" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/d7/7e/9f3b0dd3a074a6c3e1e79f35e465b1f2ee4b262d619de00cfce523cc9b24/python_discovery-1.1.3.tar.gz", hash = "sha256:7acca36e818cd88e9b2ba03e045ad7e93e1713e29c6bbfba5d90202310b7baa5", size = 56945, upload-time = "2026-03-10T15:08:15.038Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e7/80/73211fc5bfbfc562369b4aa61dc1e4bf07dc7b34df7b317e4539316b809c/python_discovery-1.1.3-py3-none-any.whl", hash = "sha256:90e795f0121bc84572e737c9aa9966311b9fde44ffb88a5953b3ec9b31c6945e", size = 31485, upload-time = "2026-03-10T15:08:13.06Z" }, +] + +[[package]] +name = "python-dotenv" +version = "1.2.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/82/ed/0301aeeac3e5353ef3d94b6ec08bbcabd04a72018415dcb29e588514bba8/python_dotenv-1.2.2.tar.gz", hash = "sha256:2c371a91fbd7ba082c2c1dc1f8bf89ca22564a087c2c287cd9b662adde799cf3", size = 50135, upload-time = "2026-03-01T16:00:26.196Z" 
} +wheels = [ + { url = "https://files.pythonhosted.org/packages/0b/d7/1959b9648791274998a9c3526f6d0ec8fd2233e4d4acce81bbae76b44b2a/python_dotenv-1.2.2-py3-none-any.whl", hash = "sha256:1d8214789a24de455a8b8bd8ae6fe3c6b69a5e3d64aa8a8e5d68e694bbcb285a", size = 22101, upload-time = "2026-03-01T16:00:25.09Z" }, +] + +[[package]] +name = "python-multipart" +version = "0.0.22" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/94/01/979e98d542a70714b0cb2b6728ed0b7c46792b695e3eaec3e20711271ca3/python_multipart-0.0.22.tar.gz", hash = "sha256:7340bef99a7e0032613f56dc36027b959fd3b30a787ed62d310e951f7c3a3a58", size = 37612, upload-time = "2026-01-25T10:15:56.219Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/1b/d0/397f9626e711ff749a95d96b7af99b9c566a9bb5129b8e4c10fc4d100304/python_multipart-0.0.22-py3-none-any.whl", hash = "sha256:2b2cd894c83d21bf49d702499531c7bafd057d730c201782048f7945d82de155", size = 24579, upload-time = "2026-01-25T10:15:54.811Z" }, +] + +[[package]] +name = "pywin32" +version = "311" +source = { registry = "https://pypi.org/simple" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a5/be/3fd5de0979fcb3994bfee0d65ed8ca9506a8a1260651b86174f6a86f52b3/pywin32-311-cp313-cp313-win32.whl", hash = "sha256:f95ba5a847cba10dd8c4d8fefa9f2a6cf283b8b88ed6178fa8a6c1ab16054d0d", size = 8705700, upload-time = "2025-07-14T20:13:26.471Z" }, + { url = "https://files.pythonhosted.org/packages/e3/28/e0a1909523c6890208295a29e05c2adb2126364e289826c0a8bc7297bd5c/pywin32-311-cp313-cp313-win_amd64.whl", hash = "sha256:718a38f7e5b058e76aee1c56ddd06908116d35147e133427e59a3983f703a20d", size = 9494700, upload-time = "2025-07-14T20:13:28.243Z" }, + { url = "https://files.pythonhosted.org/packages/04/bf/90339ac0f55726dce7d794e6d79a18a91265bdf3aa70b6b9ca52f35e022a/pywin32-311-cp313-cp313-win_arm64.whl", hash = "sha256:7b4075d959648406202d92a2310cb990fea19b535c7f4a78d3f5e10b926eeb8a", size = 8709318, upload-time = "2025-07-14T20:13:30.348Z" }, + { url = "https://files.pythonhosted.org/packages/c9/31/097f2e132c4f16d99a22bfb777e0fd88bd8e1c634304e102f313af69ace5/pywin32-311-cp314-cp314-win32.whl", hash = "sha256:b7a2c10b93f8986666d0c803ee19b5990885872a7de910fc460f9b0c2fbf92ee", size = 8840714, upload-time = "2025-07-14T20:13:32.449Z" }, + { url = "https://files.pythonhosted.org/packages/90/4b/07c77d8ba0e01349358082713400435347df8426208171ce297da32c313d/pywin32-311-cp314-cp314-win_amd64.whl", hash = "sha256:3aca44c046bd2ed8c90de9cb8427f581c479e594e99b5c0bb19b29c10fd6cb87", size = 9656800, upload-time = "2025-07-14T20:13:34.312Z" }, + { url = "https://files.pythonhosted.org/packages/c0/d2/21af5c535501a7233e734b8af901574572da66fcc254cb35d0609c9080dd/pywin32-311-cp314-cp314-win_arm64.whl", hash = "sha256:a508e2d9025764a8270f93111a970e1d0fbfc33f4153b388bb649b7eec4f9b42", size = 8932540, upload-time = "2025-07-14T20:13:36.379Z" }, +] + +[[package]] +name = "pyyaml" +version = "6.0.3" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/05/8e/961c0007c59b8dd7729d542c61a4d537767a59645b82a0b521206e1e25c2/pyyaml-6.0.3.tar.gz", hash = "sha256:d76623373421df22fb4cf8817020cbb7ef15c725b9d5e45f17e189bfc384190f", size = 130960, upload-time = "2025-09-25T21:33:16.546Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d1/11/0fd08f8192109f7169db964b5707a2f1e8b745d4e239b784a5a1dd80d1db/pyyaml-6.0.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = 
"sha256:8da9669d359f02c0b91ccc01cac4a67f16afec0dac22c2ad09f46bee0697eba8", size = 181669, upload-time = "2025-09-25T21:32:23.673Z" }, + { url = "https://files.pythonhosted.org/packages/b1/16/95309993f1d3748cd644e02e38b75d50cbc0d9561d21f390a76242ce073f/pyyaml-6.0.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:2283a07e2c21a2aa78d9c4442724ec1eb15f5e42a723b99cb3d822d48f5f7ad1", size = 173252, upload-time = "2025-09-25T21:32:25.149Z" }, + { url = "https://files.pythonhosted.org/packages/50/31/b20f376d3f810b9b2371e72ef5adb33879b25edb7a6d072cb7ca0c486398/pyyaml-6.0.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ee2922902c45ae8ccada2c5b501ab86c36525b883eff4255313a253a3160861c", size = 767081, upload-time = "2025-09-25T21:32:26.575Z" }, + { url = "https://files.pythonhosted.org/packages/49/1e/a55ca81e949270d5d4432fbbd19dfea5321eda7c41a849d443dc92fd1ff7/pyyaml-6.0.3-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:a33284e20b78bd4a18c8c2282d549d10bc8408a2a7ff57653c0cf0b9be0afce5", size = 841159, upload-time = "2025-09-25T21:32:27.727Z" }, + { url = "https://files.pythonhosted.org/packages/74/27/e5b8f34d02d9995b80abcef563ea1f8b56d20134d8f4e5e81733b1feceb2/pyyaml-6.0.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0f29edc409a6392443abf94b9cf89ce99889a1dd5376d94316ae5145dfedd5d6", size = 801626, upload-time = "2025-09-25T21:32:28.878Z" }, + { url = "https://files.pythonhosted.org/packages/f9/11/ba845c23988798f40e52ba45f34849aa8a1f2d4af4b798588010792ebad6/pyyaml-6.0.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:f7057c9a337546edc7973c0d3ba84ddcdf0daa14533c2065749c9075001090e6", size = 753613, upload-time = "2025-09-25T21:32:30.178Z" }, + { url = "https://files.pythonhosted.org/packages/3d/e0/7966e1a7bfc0a45bf0a7fb6b98ea03fc9b8d84fa7f2229e9659680b69ee3/pyyaml-6.0.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:eda16858a3cab07b80edaf74336ece1f986ba330fdb8ee0d6c0d68fe82bc96be", size = 794115, upload-time = "2025-09-25T21:32:31.353Z" }, + { url = "https://files.pythonhosted.org/packages/de/94/980b50a6531b3019e45ddeada0626d45fa85cbe22300844a7983285bed3b/pyyaml-6.0.3-cp313-cp313-win32.whl", hash = "sha256:d0eae10f8159e8fdad514efdc92d74fd8d682c933a6dd088030f3834bc8e6b26", size = 137427, upload-time = "2025-09-25T21:32:32.58Z" }, + { url = "https://files.pythonhosted.org/packages/97/c9/39d5b874e8b28845e4ec2202b5da735d0199dbe5b8fb85f91398814a9a46/pyyaml-6.0.3-cp313-cp313-win_amd64.whl", hash = "sha256:79005a0d97d5ddabfeeea4cf676af11e647e41d81c9a7722a193022accdb6b7c", size = 154090, upload-time = "2025-09-25T21:32:33.659Z" }, + { url = "https://files.pythonhosted.org/packages/73/e8/2bdf3ca2090f68bb3d75b44da7bbc71843b19c9f2b9cb9b0f4ab7a5a4329/pyyaml-6.0.3-cp313-cp313-win_arm64.whl", hash = "sha256:5498cd1645aa724a7c71c8f378eb29ebe23da2fc0d7a08071d89469bf1d2defb", size = 140246, upload-time = "2025-09-25T21:32:34.663Z" }, + { url = "https://files.pythonhosted.org/packages/9d/8c/f4bd7f6465179953d3ac9bc44ac1a8a3e6122cf8ada906b4f96c60172d43/pyyaml-6.0.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:8d1fab6bb153a416f9aeb4b8763bc0f22a5586065f86f7664fc23339fc1c1fac", size = 181814, upload-time = "2025-09-25T21:32:35.712Z" }, + { url = "https://files.pythonhosted.org/packages/bd/9c/4d95bb87eb2063d20db7b60faa3840c1b18025517ae857371c4dd55a6b3a/pyyaml-6.0.3-cp314-cp314-macosx_11_0_arm64.whl", hash = 
"sha256:34d5fcd24b8445fadc33f9cf348c1047101756fd760b4dacb5c3e99755703310", size = 173809, upload-time = "2025-09-25T21:32:36.789Z" }, + { url = "https://files.pythonhosted.org/packages/92/b5/47e807c2623074914e29dabd16cbbdd4bf5e9b2db9f8090fa64411fc5382/pyyaml-6.0.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:501a031947e3a9025ed4405a168e6ef5ae3126c59f90ce0cd6f2bfc477be31b7", size = 766454, upload-time = "2025-09-25T21:32:37.966Z" }, + { url = "https://files.pythonhosted.org/packages/02/9e/e5e9b168be58564121efb3de6859c452fccde0ab093d8438905899a3a483/pyyaml-6.0.3-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:b3bc83488de33889877a0f2543ade9f70c67d66d9ebb4ac959502e12de895788", size = 836355, upload-time = "2025-09-25T21:32:39.178Z" }, + { url = "https://files.pythonhosted.org/packages/88/f9/16491d7ed2a919954993e48aa941b200f38040928474c9e85ea9e64222c3/pyyaml-6.0.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c458b6d084f9b935061bc36216e8a69a7e293a2f1e68bf956dcd9e6cbcd143f5", size = 794175, upload-time = "2025-09-25T21:32:40.865Z" }, + { url = "https://files.pythonhosted.org/packages/dd/3f/5989debef34dc6397317802b527dbbafb2b4760878a53d4166579111411e/pyyaml-6.0.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:7c6610def4f163542a622a73fb39f534f8c101d690126992300bf3207eab9764", size = 755228, upload-time = "2025-09-25T21:32:42.084Z" }, + { url = "https://files.pythonhosted.org/packages/d7/ce/af88a49043cd2e265be63d083fc75b27b6ed062f5f9fd6cdc223ad62f03e/pyyaml-6.0.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:5190d403f121660ce8d1d2c1bb2ef1bd05b5f68533fc5c2ea899bd15f4399b35", size = 789194, upload-time = "2025-09-25T21:32:43.362Z" }, + { url = "https://files.pythonhosted.org/packages/23/20/bb6982b26a40bb43951265ba29d4c246ef0ff59c9fdcdf0ed04e0687de4d/pyyaml-6.0.3-cp314-cp314-win_amd64.whl", hash = "sha256:4a2e8cebe2ff6ab7d1050ecd59c25d4c8bd7e6f400f5f82b96557ac0abafd0ac", size = 156429, upload-time = "2025-09-25T21:32:57.844Z" }, + { url = "https://files.pythonhosted.org/packages/f4/f4/a4541072bb9422c8a883ab55255f918fa378ecf083f5b85e87fc2b4eda1b/pyyaml-6.0.3-cp314-cp314-win_arm64.whl", hash = "sha256:93dda82c9c22deb0a405ea4dc5f2d0cda384168e466364dec6255b293923b2f3", size = 143912, upload-time = "2025-09-25T21:32:59.247Z" }, + { url = "https://files.pythonhosted.org/packages/7c/f9/07dd09ae774e4616edf6cda684ee78f97777bdd15847253637a6f052a62f/pyyaml-6.0.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:02893d100e99e03eda1c8fd5c441d8c60103fd175728e23e431db1b589cf5ab3", size = 189108, upload-time = "2025-09-25T21:32:44.377Z" }, + { url = "https://files.pythonhosted.org/packages/4e/78/8d08c9fb7ce09ad8c38ad533c1191cf27f7ae1effe5bb9400a46d9437fcf/pyyaml-6.0.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:c1ff362665ae507275af2853520967820d9124984e0f7466736aea23d8611fba", size = 183641, upload-time = "2025-09-25T21:32:45.407Z" }, + { url = "https://files.pythonhosted.org/packages/7b/5b/3babb19104a46945cf816d047db2788bcaf8c94527a805610b0289a01c6b/pyyaml-6.0.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6adc77889b628398debc7b65c073bcb99c4a0237b248cacaf3fe8a557563ef6c", size = 831901, upload-time = "2025-09-25T21:32:48.83Z" }, + { url = 
"https://files.pythonhosted.org/packages/8b/cc/dff0684d8dc44da4d22a13f35f073d558c268780ce3c6ba1b87055bb0b87/pyyaml-6.0.3-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:a80cb027f6b349846a3bf6d73b5e95e782175e52f22108cfa17876aaeff93702", size = 861132, upload-time = "2025-09-25T21:32:50.149Z" }, + { url = "https://files.pythonhosted.org/packages/b1/5e/f77dc6b9036943e285ba76b49e118d9ea929885becb0a29ba8a7c75e29fe/pyyaml-6.0.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:00c4bdeba853cc34e7dd471f16b4114f4162dc03e6b7afcc2128711f0eca823c", size = 839261, upload-time = "2025-09-25T21:32:51.808Z" }, + { url = "https://files.pythonhosted.org/packages/ce/88/a9db1376aa2a228197c58b37302f284b5617f56a5d959fd1763fb1675ce6/pyyaml-6.0.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:66e1674c3ef6f541c35191caae2d429b967b99e02040f5ba928632d9a7f0f065", size = 805272, upload-time = "2025-09-25T21:32:52.941Z" }, + { url = "https://files.pythonhosted.org/packages/da/92/1446574745d74df0c92e6aa4a7b0b3130706a4142b2d1a5869f2eaa423c6/pyyaml-6.0.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:16249ee61e95f858e83976573de0f5b2893b3677ba71c9dd36b9cf8be9ac6d65", size = 829923, upload-time = "2025-09-25T21:32:54.537Z" }, + { url = "https://files.pythonhosted.org/packages/f0/7a/1c7270340330e575b92f397352af856a8c06f230aa3e76f86b39d01b416a/pyyaml-6.0.3-cp314-cp314t-win_amd64.whl", hash = "sha256:4ad1906908f2f5ae4e5a8ddfce73c320c2a1429ec52eafd27138b7f1cbe341c9", size = 174062, upload-time = "2025-09-25T21:32:55.767Z" }, + { url = "https://files.pythonhosted.org/packages/f1/12/de94a39c2ef588c7e6455cfbe7343d3b2dc9d6b6b2f40c4c6565744c873d/pyyaml-6.0.3-cp314-cp314t-win_arm64.whl", hash = "sha256:ebc55a14a21cb14062aa4162f906cd962b28e2e9ea38f9b4391244cd8de4ae0b", size = 149341, upload-time = "2025-09-25T21:32:56.828Z" }, +] + +[[package]] +name = "redis" +version = "7.3.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/da/82/4d1a5279f6c1251d3d2a603a798a1137c657de9b12cfc1fba4858232c4d2/redis-7.3.0.tar.gz", hash = "sha256:4d1b768aafcf41b01022410b3cc4f15a07d9b3d6fe0c66fc967da2c88e551034", size = 4928081, upload-time = "2026-03-06T18:18:16.287Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/f0/28/84e57fce7819e81ec5aa1bd31c42b89607241f4fb1a3ea5b0d2dbeaea26c/redis-7.3.0-py3-none-any.whl", hash = "sha256:9d4fcb002a12a5e3c3fbe005d59c48a2cc231f87fbb2f6b70c2d89bb64fec364", size = 404379, upload-time = "2026-03-06T18:18:14.583Z" }, +] + +[[package]] +name = "requests" +version = "2.32.5" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "certifi" }, + { name = "charset-normalizer" }, + { name = "idna" }, + { name = "urllib3" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/c9/74/b3ff8e6c8446842c3f5c837e9c3dfcfe2018ea6ecef224c710c85ef728f4/requests-2.32.5.tar.gz", hash = "sha256:dbba0bac56e100853db0ea71b82b4dfd5fe2bf6d3754a8893c3af500cec7d7cf", size = 134517, upload-time = "2025-08-18T20:46:02.573Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/1e/db/4254e3eabe8020b458f1a747140d32277ec7a271daf1d235b70dc0b4e6e3/requests-2.32.5-py3-none-any.whl", hash = "sha256:2462f94637a34fd532264295e186976db0f5d453d1cdd31473c85a6a161affb6", size = 64738, upload-time = "2025-08-18T20:46:00.542Z" }, +] + +[[package]] +name = "ruff" +version = "0.15.6" +source = { registry = "https://pypi.org/simple" } 
+sdist = { url = "https://files.pythonhosted.org/packages/51/df/f8629c19c5318601d3121e230f74cbee7a3732339c52b21daa2b82ef9c7d/ruff-0.15.6.tar.gz", hash = "sha256:8394c7bb153a4e3811a4ecdacd4a8e6a4fa8097028119160dffecdcdf9b56ae4", size = 4597916, upload-time = "2026-03-12T23:05:47.51Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/9e/2f/4e03a7e5ce99b517e98d3b4951f411de2b0fa8348d39cf446671adcce9a2/ruff-0.15.6-py3-none-linux_armv6l.whl", hash = "sha256:7c98c3b16407b2cf3d0f2b80c80187384bc92c6774d85fefa913ecd941256fff", size = 10508953, upload-time = "2026-03-12T23:05:17.246Z" }, + { url = "https://files.pythonhosted.org/packages/70/60/55bcdc3e9f80bcf39edf0cd272da6fa511a3d94d5a0dd9e0adf76ceebdb4/ruff-0.15.6-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:ee7dcfaad8b282a284df4aa6ddc2741b3f4a18b0555d626805555a820ea181c3", size = 10942257, upload-time = "2026-03-12T23:05:23.076Z" }, + { url = "https://files.pythonhosted.org/packages/e7/f9/005c29bd1726c0f492bfa215e95154cf480574140cb5f867c797c18c790b/ruff-0.15.6-py3-none-macosx_11_0_arm64.whl", hash = "sha256:3bd9967851a25f038fc8b9ae88a7fbd1b609f30349231dffaa37b6804923c4bb", size = 10322683, upload-time = "2026-03-12T23:05:33.738Z" }, + { url = "https://files.pythonhosted.org/packages/5f/74/2f861f5fd7cbb2146bddb5501450300ce41562da36d21868c69b7a828169/ruff-0.15.6-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:13f4594b04e42cd24a41da653886b04d2ff87adbf57497ed4f728b0e8a4866f8", size = 10660986, upload-time = "2026-03-12T23:05:53.245Z" }, + { url = "https://files.pythonhosted.org/packages/c1/a1/309f2364a424eccb763cdafc49df843c282609f47fe53aa83f38272389e0/ruff-0.15.6-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:e2ed8aea2f3fe57886d3f00ea5b8aae5bf68d5e195f487f037a955ff9fbaac9e", size = 10332177, upload-time = "2026-03-12T23:05:56.145Z" }, + { url = "https://files.pythonhosted.org/packages/30/41/7ebf1d32658b4bab20f8ac80972fb19cd4e2c6b78552be263a680edc55ac/ruff-0.15.6-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:70789d3e7830b848b548aae96766431c0dc01a6c78c13381f423bf7076c66d15", size = 11170783, upload-time = "2026-03-12T23:06:01.742Z" }, + { url = "https://files.pythonhosted.org/packages/76/be/6d488f6adca047df82cd62c304638bcb00821c36bd4881cfca221561fdfc/ruff-0.15.6-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:542aaf1de3154cea088ced5a819ce872611256ffe2498e750bbae5247a8114e9", size = 12044201, upload-time = "2026-03-12T23:05:28.697Z" }, + { url = "https://files.pythonhosted.org/packages/71/68/e6f125df4af7e6d0b498f8d373274794bc5156b324e8ab4bf5c1b4fc0ec7/ruff-0.15.6-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1c22e6f02c16cfac3888aa636e9eba857254d15bbacc9906c9689fdecb1953ab", size = 11421561, upload-time = "2026-03-12T23:05:31.236Z" }, + { url = "https://files.pythonhosted.org/packages/f1/9f/f85ef5fd01a52e0b472b26dc1b4bd228b8f6f0435975442ffa4741278703/ruff-0.15.6-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:98893c4c0aadc8e448cfa315bd0cc343a5323d740fe5f28ef8a3f9e21b381f7e", size = 11310928, upload-time = "2026-03-12T23:05:45.288Z" }, + { url = "https://files.pythonhosted.org/packages/8c/26/b75f8c421f5654304b89471ed384ae8c7f42b4dff58fa6ce1626d7f2b59a/ruff-0.15.6-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:70d263770d234912374493e8cc1e7385c5d49376e41dfa51c5c3453169dc581c", size = 11235186, upload-time = "2026-03-12T23:05:50.677Z" }, + { url = 
"https://files.pythonhosted.org/packages/fc/d4/d5a6d065962ff7a68a86c9b4f5500f7d101a0792078de636526c0edd40da/ruff-0.15.6-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:55a1ad63c5a6e54b1f21b7514dfadc0c7fb40093fa22e95143cf3f64ebdcd512", size = 10635231, upload-time = "2026-03-12T23:05:37.044Z" }, + { url = "https://files.pythonhosted.org/packages/d6/56/7c3acf3d50910375349016cf33de24be021532042afbed87942858992491/ruff-0.15.6-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:8dc473ba093c5ec238bb1e7429ee676dca24643c471e11fbaa8a857925b061c0", size = 10340357, upload-time = "2026-03-12T23:06:04.748Z" }, + { url = "https://files.pythonhosted.org/packages/06/54/6faa39e9c1033ff6a3b6e76b5df536931cd30caf64988e112bbf91ef5ce5/ruff-0.15.6-py3-none-musllinux_1_2_i686.whl", hash = "sha256:85b042377c2a5561131767974617006f99f7e13c63c111b998f29fc1e58a4cfb", size = 10860583, upload-time = "2026-03-12T23:05:58.978Z" }, + { url = "https://files.pythonhosted.org/packages/cb/1e/509a201b843b4dfb0b32acdedf68d951d3377988cae43949ba4c4133a96a/ruff-0.15.6-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:cef49e30bc5a86a6a92098a7fbf6e467a234d90b63305d6f3ec01225a9d092e0", size = 11410976, upload-time = "2026-03-12T23:05:39.955Z" }, + { url = "https://files.pythonhosted.org/packages/6c/25/3fc9114abf979a41673ce877c08016f8e660ad6cf508c3957f537d2e9fa9/ruff-0.15.6-py3-none-win32.whl", hash = "sha256:bbf67d39832404812a2d23020dda68fee7f18ce15654e96fb1d3ad21a5fe436c", size = 10616872, upload-time = "2026-03-12T23:05:42.451Z" }, + { url = "https://files.pythonhosted.org/packages/89/7a/09ece68445ceac348df06e08bf75db72d0e8427765b96c9c0ffabc1be1d9/ruff-0.15.6-py3-none-win_amd64.whl", hash = "sha256:aee25bc84c2f1007ecb5037dff75cef00414fdf17c23f07dc13e577883dca406", size = 11787271, upload-time = "2026-03-12T23:05:20.168Z" }, + { url = "https://files.pythonhosted.org/packages/7f/d0/578c47dd68152ddddddf31cd7fc67dc30b7cdf639a86275fda821b0d9d98/ruff-0.15.6-py3-none-win_arm64.whl", hash = "sha256:c34de3dd0b0ba203be50ae70f5910b17188556630e2178fd7d79fc030eb0d837", size = 11060497, upload-time = "2026-03-12T23:05:25.968Z" }, +] + +[[package]] +name = "six" +version = "1.17.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/94/e7/b2c673351809dca68a0e064b6af791aa332cf192da575fd474ed7d6f16a2/six-1.17.0.tar.gz", hash = "sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81", size = 34031, upload-time = "2024-12-04T17:35:28.174Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b7/ce/149a00dd41f10bc29e5921b496af8b574d8413afcd5e30dfa0ed46c2cc5e/six-1.17.0-py2.py3-none-any.whl", hash = "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274", size = 11050, upload-time = "2024-12-04T17:35:26.475Z" }, +] + +[[package]] +name = "sortedcontainers" +version = "2.4.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/e8/c4/ba2f8066cceb6f23394729afe52f3bf7adec04bf9ed2c820b39e19299111/sortedcontainers-2.4.0.tar.gz", hash = "sha256:25caa5a06cc30b6b83d11423433f65d1f9d76c4c6a0c90e3379eaa43b9bfdb88", size = 30594, upload-time = "2021-05-16T22:03:42.897Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/32/46/9cb0e58b2deb7f82b84065f37f3bffeb12413f947f9388e4cac22c4621ce/sortedcontainers-2.4.0-py2.py3-none-any.whl", hash = "sha256:a163dcaede0f1c021485e957a39245190e74249897e2ae4b2aa38595db237ee0", size = 29575, upload-time = "2021-05-16T22:03:41.177Z" }, +] + +[[package]] 
+name = "sqlalchemy" +version = "2.0.48" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "greenlet", marker = "platform_machine == 'AMD64' or platform_machine == 'WIN32' or platform_machine == 'aarch64' or platform_machine == 'amd64' or platform_machine == 'ppc64le' or platform_machine == 'win32' or platform_machine == 'x86_64'" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/1f/73/b4a9737255583b5fa858e0bb8e116eb94b88c910164ed2ed719147bde3de/sqlalchemy-2.0.48.tar.gz", hash = "sha256:5ca74f37f3369b45e1f6b7b06afb182af1fd5dde009e4ffd831830d98cbe5fe7", size = 9886075, upload-time = "2026-03-02T15:28:51.474Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d1/c6/569dc8bf3cd375abc5907e82235923e986799f301cd79a903f784b996fca/sqlalchemy-2.0.48-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:e3070c03701037aa418b55d36532ecb8f8446ed0135acb71c678dbdf12f5b6e4", size = 2152599, upload-time = "2026-03-02T15:49:14.41Z" }, + { url = "https://files.pythonhosted.org/packages/6d/ff/f4e04a4bd5a24304f38cb0d4aa2ad4c0fb34999f8b884c656535e1b2b74c/sqlalchemy-2.0.48-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:2645b7d8a738763b664a12a1542c89c940daa55196e8d73e55b169cc5c99f65f", size = 3278825, upload-time = "2026-03-02T15:50:38.269Z" }, + { url = "https://files.pythonhosted.org/packages/fe/88/cb59509e4668d8001818d7355d9995be90c321313078c912420603a7cb95/sqlalchemy-2.0.48-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b19151e76620a412c2ac1c6f977ab1b9fa7ad43140178345136456d5265b32ed", size = 3295200, upload-time = "2026-03-02T15:53:29.366Z" }, + { url = "https://files.pythonhosted.org/packages/87/dc/1609a4442aefd750ea2f32629559394ec92e89ac1d621a7f462b70f736ff/sqlalchemy-2.0.48-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:5b193a7e29fd9fa56e502920dca47dffe60f97c863494946bd698c6058a55658", size = 3226876, upload-time = "2026-03-02T15:50:39.802Z" }, + { url = "https://files.pythonhosted.org/packages/37/c3/6ae2ab5ea2fa989fbac4e674de01224b7a9d744becaf59bb967d62e99bed/sqlalchemy-2.0.48-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:36ac4ddc3d33e852da9cb00ffb08cea62ca05c39711dc67062ca2bb1fae35fd8", size = 3265045, upload-time = "2026-03-02T15:53:31.421Z" }, + { url = "https://files.pythonhosted.org/packages/6f/82/ea4665d1bb98c50c19666e672f21b81356bd6077c4574e3d2bbb84541f53/sqlalchemy-2.0.48-cp313-cp313-win32.whl", hash = "sha256:389b984139278f97757ea9b08993e7b9d1142912e046ab7d82b3fbaeb0209131", size = 2113700, upload-time = "2026-03-02T15:54:35.825Z" }, + { url = "https://files.pythonhosted.org/packages/b7/2b/b9040bec58c58225f073f5b0c1870defe1940835549dafec680cbd58c3c3/sqlalchemy-2.0.48-cp313-cp313-win_amd64.whl", hash = "sha256:d612c976cbc2d17edfcc4c006874b764e85e990c29ce9bd411f926bbfb02b9a2", size = 2139487, upload-time = "2026-03-02T15:54:37.079Z" }, + { url = "https://files.pythonhosted.org/packages/f4/f4/7b17bd50244b78a49d22cc63c969d71dc4de54567dc152a9b46f6fae40ce/sqlalchemy-2.0.48-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:69f5bc24904d3bc3640961cddd2523e361257ef68585d6e364166dfbe8c78fae", size = 3558851, upload-time = "2026-03-02T15:57:48.607Z" }, + { url = 
"https://files.pythonhosted.org/packages/20/0d/213668e9aca61d370f7d2a6449ea4ec699747fac67d4bda1bb3d129025be/sqlalchemy-2.0.48-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:fd08b90d211c086181caed76931ecfa2bdfc83eea3cfccdb0f82abc6c4b876cb", size = 3525525, upload-time = "2026-03-02T16:04:38.058Z" }, + { url = "https://files.pythonhosted.org/packages/85/d7/a84edf412979e7d59c69b89a5871f90a49228360594680e667cb2c46a828/sqlalchemy-2.0.48-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:1ccd42229aaac2df431562117ac7e667d702e8e44afdb6cf0e50fa3f18160f0b", size = 3466611, upload-time = "2026-03-02T15:57:50.759Z" }, + { url = "https://files.pythonhosted.org/packages/86/55/42404ce5770f6be26a2b0607e7866c31b9a4176c819e9a7a5e0a055770be/sqlalchemy-2.0.48-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:f0dcbc588cd5b725162c076eb9119342f6579c7f7f55057bb7e3c6ff27e13121", size = 3475812, upload-time = "2026-03-02T16:04:40.092Z" }, + { url = "https://files.pythonhosted.org/packages/ae/ae/29b87775fadc43e627cf582fe3bda4d02e300f6b8f2747c764950d13784c/sqlalchemy-2.0.48-cp313-cp313t-win32.whl", hash = "sha256:9764014ef5e58aab76220c5664abb5d47d5bc858d9debf821e55cfdd0f128485", size = 2141335, upload-time = "2026-03-02T15:52:51.518Z" }, + { url = "https://files.pythonhosted.org/packages/91/44/f39d063c90f2443e5b46ec4819abd3d8de653893aae92df42a5c4f5843de/sqlalchemy-2.0.48-cp313-cp313t-win_amd64.whl", hash = "sha256:e2f35b4cccd9ed286ad62e0a3c3ac21e06c02abc60e20aa51a3e305a30f5fa79", size = 2173095, upload-time = "2026-03-02T15:52:52.79Z" }, + { url = "https://files.pythonhosted.org/packages/f7/b3/f437eaa1cf028bb3c927172c7272366393e73ccd104dcf5b6963f4ab5318/sqlalchemy-2.0.48-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:e2d0d88686e3d35a76f3e15a34e8c12d73fc94c1dea1cd55782e695cc14086dd", size = 2154401, upload-time = "2026-03-02T15:49:17.24Z" }, + { url = "https://files.pythonhosted.org/packages/6c/1c/b3abdf0f402aa3f60f0df6ea53d92a162b458fca2321d8f1f00278506402/sqlalchemy-2.0.48-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:49b7bddc1eebf011ea5ab722fdbe67a401caa34a350d278cc7733c0e88fecb1f", size = 3274528, upload-time = "2026-03-02T15:50:41.489Z" }, + { url = "https://files.pythonhosted.org/packages/f2/5e/327428a034407651a048f5e624361adf3f9fbac9d0fa98e981e9c6ff2f5e/sqlalchemy-2.0.48-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:426c5ca86415d9b8945c7073597e10de9644802e2ff502b8e1f11a7a2642856b", size = 3279523, upload-time = "2026-03-02T15:53:32.962Z" }, + { url = "https://files.pythonhosted.org/packages/2a/ca/ece73c81a918add0965b76b868b7b5359e068380b90ef1656ee995940c02/sqlalchemy-2.0.48-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:288937433bd44e3990e7da2402fabc44a3c6c25d3704da066b85b89a85474ae0", size = 3224312, upload-time = "2026-03-02T15:50:42.996Z" }, + { url = "https://files.pythonhosted.org/packages/88/11/fbaf1ae91fa4ee43f4fe79661cead6358644824419c26adb004941bdce7c/sqlalchemy-2.0.48-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:8183dc57ae7d9edc1346e007e840a9f3d6aa7b7f165203a99e16f447150140d2", size = 3246304, upload-time = "2026-03-02T15:53:34.937Z" }, + { url = "https://files.pythonhosted.org/packages/fa/a8/5fb0deb13930b4f2f698c5541ae076c18981173e27dd00376dbaea7a9c82/sqlalchemy-2.0.48-cp314-cp314-win32.whl", hash = "sha256:1182437cb2d97988cfea04cf6cdc0b0bb9c74f4d56ec3d08b81e23d621a28cc6", size = 2116565, upload-time = 
"2026-03-02T15:54:38.321Z" }, + { url = "https://files.pythonhosted.org/packages/95/7e/e83615cb63f80047f18e61e31e8e32257d39458426c23006deeaf48f463b/sqlalchemy-2.0.48-cp314-cp314-win_amd64.whl", hash = "sha256:144921da96c08feb9e2b052c5c5c1d0d151a292c6135623c6b2c041f2a45f9e0", size = 2142205, upload-time = "2026-03-02T15:54:39.831Z" }, + { url = "https://files.pythonhosted.org/packages/83/e3/69d8711b3f2c5135e9cde5f063bc1605860f0b2c53086d40c04017eb1f77/sqlalchemy-2.0.48-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5aee45fd2c6c0f2b9cdddf48c48535e7471e42d6fb81adfde801da0bd5b93241", size = 3563519, upload-time = "2026-03-02T15:57:52.387Z" }, + { url = "https://files.pythonhosted.org/packages/f8/4f/a7cce98facca73c149ea4578981594aaa5fd841e956834931de503359336/sqlalchemy-2.0.48-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:7cddca31edf8b0653090cbb54562ca027c421c58ddde2c0685f49ff56a1690e0", size = 3528611, upload-time = "2026-03-02T16:04:42.097Z" }, + { url = "https://files.pythonhosted.org/packages/cd/7d/5936c7a03a0b0cb0fa0cc425998821c6029756b0855a8f7ee70fba1de955/sqlalchemy-2.0.48-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:7a936f1bb23d370b7c8cc079d5fce4c7d18da87a33c6744e51a93b0f9e97e9b3", size = 3472326, upload-time = "2026-03-02T15:57:54.423Z" }, + { url = "https://files.pythonhosted.org/packages/f4/33/cea7dfc31b52904efe3dcdc169eb4514078887dff1f5ae28a7f4c5d54b3c/sqlalchemy-2.0.48-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:e004aa9248e8cb0a5f9b96d003ca7c1c0a5da8decd1066e7b53f59eb8ce7c62b", size = 3478453, upload-time = "2026-03-02T16:04:44.584Z" }, + { url = "https://files.pythonhosted.org/packages/c8/95/32107c4d13be077a9cae61e9ae49966a35dc4bf442a8852dd871db31f62e/sqlalchemy-2.0.48-cp314-cp314t-win32.whl", hash = "sha256:b8438ec5594980d405251451c5b7ea9aa58dda38eb7ac35fb7e4c696712ee24f", size = 2147209, upload-time = "2026-03-02T15:52:54.274Z" }, + { url = "https://files.pythonhosted.org/packages/d2/d7/1e073da7a4bc645eb83c76067284a0374e643bc4be57f14cc6414656f92c/sqlalchemy-2.0.48-cp314-cp314t-win_amd64.whl", hash = "sha256:d854b3970067297f3a7fbd7a4683587134aa9b3877ee15aa29eea478dc68f933", size = 2182198, upload-time = "2026-03-02T15:52:55.606Z" }, + { url = "https://files.pythonhosted.org/packages/46/2c/9664130905f03db57961b8980b05cab624afd114bf2be2576628a9f22da4/sqlalchemy-2.0.48-py3-none-any.whl", hash = "sha256:a66fe406437dd65cacd96a72689a3aaaecaebbcd62d81c5ac1c0fdbeac835096", size = 1940202, upload-time = "2026-03-02T15:52:43.285Z" }, +] + +[package.optional-dependencies] +asyncio = [ + { name = "greenlet" }, +] + +[[package]] +name = "starlette" +version = "0.52.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "anyio" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/c4/68/79977123bb7be889ad680d79a40f339082c1978b5cfcf62c2d8d196873ac/starlette-0.52.1.tar.gz", hash = "sha256:834edd1b0a23167694292e94f597773bc3f89f362be6effee198165a35d62933", size = 2653702, upload-time = "2026-01-18T13:34:11.062Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/81/0d/13d1d239a25cbfb19e740db83143e95c772a1fe10202dda4b76792b114dd/starlette-0.52.1-py3-none-any.whl", hash = "sha256:0029d43eb3d273bc4f83a08720b4912ea4b071087a3b48db01b7c839f7954d74", size = 74272, upload-time = "2026-01-18T13:34:09.188Z" }, +] + +[[package]] +name = "strawberry-graphql" +version = "0.311.3" +source = { registry = "https://pypi.org/simple" } 
+dependencies = [ + { name = "cross-web" }, + { name = "graphql-core" }, + { name = "packaging" }, + { name = "python-dateutil" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/c2/c2/de04037d5bd2b2ee47c6325b14d3c53b23c690dcb3761a00ff24e37756c2/strawberry_graphql-0.311.3.tar.gz", hash = "sha256:04801aa9540812c3b0ad629de777346908d8c486b0e21c6342b4bafa64c42daa", size = 214131, upload-time = "2026-03-16T19:13:08.573Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/6b/7b/4ffdda96126397b31318ae429b9db5790f47c7f17568f98ca61778929494/strawberry_graphql-0.311.3-py3-none-any.whl", hash = "sha256:c63e4a98190f898ef6c321bb6628bcbfeab98cc7521779b222bfad889d8c17b6", size = 311631, upload-time = "2026-03-16T19:13:10.398Z" }, +] + +[package.optional-dependencies] +fastapi = [ + { name = "fastapi" }, + { name = "python-multipart" }, +] + +[[package]] +name = "testcontainers" +version = "4.14.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "docker" }, + { name = "python-dotenv" }, + { name = "typing-extensions" }, + { name = "urllib3" }, + { name = "wrapt" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/ca/ac/a597c3a0e02b26cbed6dd07df68be1e57684766fd1c381dee9b170a99690/testcontainers-4.14.2.tar.gz", hash = "sha256:1340ccf16fe3acd9389a6c9e1d9ab21d9fe99a8afdf8165f89c3e69c1967d239", size = 166841, upload-time = "2026-03-18T05:19:16.696Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/13/2d/26b8b30067d94339afee62c3edc9b803a6eb9332f521ba77d8aaab5de873/testcontainers-4.14.2-py3-none-any.whl", hash = "sha256:0d0522c3cd8f8d9627cda41f7a6b51b639fa57bdc492923c045117933c668d68", size = 125712, upload-time = "2026-03-18T05:19:15.29Z" }, +] + +[package.optional-dependencies] +redis = [ + { name = "redis" }, +] + +[[package]] +name = "typing-extensions" +version = "4.15.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/72/94/1a15dd82efb362ac84269196e94cf00f187f7ed21c242792a923cdb1c61f/typing_extensions-4.15.0.tar.gz", hash = "sha256:0cea48d173cc12fa28ecabc3b837ea3cf6f38c6d1136f85cbaaf598984861466", size = 109391, upload-time = "2025-08-25T13:49:26.313Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/18/67/36e9267722cc04a6b9f15c7f3441c2363321a3ea07da7ae0c0707beb2a9c/typing_extensions-4.15.0-py3-none-any.whl", hash = "sha256:f0fa19c6845758ab08074a0cfa8b7aecb71c999ca73d62883bc25cc018c4e548", size = 44614, upload-time = "2025-08-25T13:49:24.86Z" }, +] + +[[package]] +name = "typing-inspection" +version = "0.4.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/55/e3/70399cb7dd41c10ac53367ae42139cf4b1ca5f36bb3dc6c9d33acdb43655/typing_inspection-0.4.2.tar.gz", hash = "sha256:ba561c48a67c5958007083d386c3295464928b01faa735ab8547c5692e87f464", size = 75949, upload-time = "2025-10-01T02:14:41.687Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/dc/9b/47798a6c91d8bdb567fe2698fe81e0c6b7cb7ef4d13da4114b41d239f65d/typing_inspection-0.4.2-py3-none-any.whl", hash = "sha256:4ed1cacbdc298c220f1bd249ed5287caa16f34d44ef4e9c3d0cbad5b521545e7", size = 14611, upload-time = "2025-10-01T02:14:40.154Z" }, +] + +[[package]] +name = "urllib3" +version = "2.6.3" +source = { registry = "https://pypi.org/simple" } +sdist = { url = 
"https://files.pythonhosted.org/packages/c7/24/5f1b3bdffd70275f6661c76461e25f024d5a38a46f04aaca912426a2b1d3/urllib3-2.6.3.tar.gz", hash = "sha256:1b62b6884944a57dbe321509ab94fd4d3b307075e0c2eae991ac71ee15ad38ed", size = 435556, upload-time = "2026-01-07T16:24:43.925Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/39/08/aaaad47bc4e9dc8c725e68f9d04865dbcb2052843ff09c97b08904852d84/urllib3-2.6.3-py3-none-any.whl", hash = "sha256:bf272323e553dfb2e87d9bfd225ca7b0f467b919d7bbd355436d3fd37cb0acd4", size = 131584, upload-time = "2026-01-07T16:24:42.685Z" }, +] + +[[package]] +name = "uvicorn" +version = "0.42.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "click" }, + { name = "h11" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/e3/ad/4a96c425be6fb67e0621e62d86c402b4a17ab2be7f7c055d9bd2f638b9e2/uvicorn-0.42.0.tar.gz", hash = "sha256:9b1f190ce15a2dd22e7758651d9b6d12df09a13d51ba5bf4fc33c383a48e1775", size = 85393, upload-time = "2026-03-16T06:19:50.077Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/0a/89/f8827ccff89c1586027a105e5630ff6139a64da2515e24dafe860bd9ae4d/uvicorn-0.42.0-py3-none-any.whl", hash = "sha256:96c30f5c7abe6f74ae8900a70e92b85ad6613b745d4879eb9b16ccad15645359", size = 68830, upload-time = "2026-03-16T06:19:48.325Z" }, +] + +[[package]] +name = "virtualenv" +version = "21.2.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "distlib" }, + { name = "filelock" }, + { name = "platformdirs" }, + { name = "python-discovery" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/aa/92/58199fe10049f9703c2666e809c4f686c54ef0a68b0f6afccf518c0b1eb9/virtualenv-21.2.0.tar.gz", hash = "sha256:1720dc3a62ef5b443092e3f499228599045d7fea4c79199770499df8becf9098", size = 5840618, upload-time = "2026-03-09T17:24:38.013Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c6/59/7d02447a55b2e55755011a647479041bc92a82e143f96a8195cb33bd0a1c/virtualenv-21.2.0-py3-none-any.whl", hash = "sha256:1bd755b504931164a5a496d217c014d098426cddc79363ad66ac78125f9d908f", size = 5825084, upload-time = "2026-03-09T17:24:35.378Z" }, +] + +[[package]] +name = "wrapt" +version = "2.1.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/2e/64/925f213fdcbb9baeb1530449ac71a4d57fc361c053d06bf78d0c5c7cd80c/wrapt-2.1.2.tar.gz", hash = "sha256:3996a67eecc2c68fd47b4e3c564405a5777367adfd9b8abb58387b63ee83b21e", size = 81678, upload-time = "2026-03-06T02:53:25.134Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/4c/7a/d936840735c828b38d26a854e85d5338894cda544cb7a85a9d5b8b9c4df7/wrapt-2.1.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:787fd6f4d67befa6fe2abdffcbd3de2d82dfc6fb8a6d850407c53332709d030b", size = 61259, upload-time = "2026-03-06T02:53:41.922Z" }, + { url = "https://files.pythonhosted.org/packages/5e/88/9a9b9a90ac8ca11c2fdb6a286cb3a1fc7dd774c00ed70929a6434f6bc634/wrapt-2.1.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:4bdf26e03e6d0da3f0e9422fd36bcebf7bc0eeb55fdf9c727a09abc6b9fe472e", size = 61851, upload-time = "2026-03-06T02:52:48.672Z" }, + { url = "https://files.pythonhosted.org/packages/03/a9/5b7d6a16fd6533fed2756900fc8fc923f678179aea62ada6d65c92718c00/wrapt-2.1.2-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:bbac24d879aa22998e87f6b3f481a5216311e7d53c7db87f189a7a0266dafffb", size = 121446, upload-time = "2026-03-06T02:54:14.013Z" }, + { url 
= "https://files.pythonhosted.org/packages/45/bb/34c443690c847835cfe9f892be78c533d4f32366ad2888972c094a897e39/wrapt-2.1.2-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:16997dfb9d67addc2e3f41b62a104341e80cac52f91110dece393923c0ebd5ca", size = 123056, upload-time = "2026-03-06T02:54:10.829Z" }, + { url = "https://files.pythonhosted.org/packages/93/b9/ff205f391cb708f67f41ea148545f2b53ff543a7ac293b30d178af4d2271/wrapt-2.1.2-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:162e4e2ba7542da9027821cb6e7c5e068d64f9a10b5f15512ea28e954893a267", size = 117359, upload-time = "2026-03-06T02:53:03.623Z" }, + { url = "https://files.pythonhosted.org/packages/1f/3d/1ea04d7747825119c3c9a5e0874a40b33594ada92e5649347c457d982805/wrapt-2.1.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:f29c827a8d9936ac320746747a016c4bc66ef639f5cd0d32df24f5eacbf9c69f", size = 121479, upload-time = "2026-03-06T02:53:45.844Z" }, + { url = "https://files.pythonhosted.org/packages/78/cc/ee3a011920c7a023b25e8df26f306b2484a531ab84ca5c96260a73de76c0/wrapt-2.1.2-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:a9dd9813825f7ecb018c17fd147a01845eb330254dff86d3b5816f20f4d6aaf8", size = 116271, upload-time = "2026-03-06T02:54:46.356Z" }, + { url = "https://files.pythonhosted.org/packages/98/fd/e5ff7ded41b76d802cf1191288473e850d24ba2e39a6ec540f21ae3b57cb/wrapt-2.1.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6f8dbdd3719e534860d6a78526aafc220e0241f981367018c2875178cf83a413", size = 120573, upload-time = "2026-03-06T02:52:50.163Z" }, + { url = "https://files.pythonhosted.org/packages/47/c5/242cae3b5b080cd09bacef0591691ba1879739050cc7c801ff35c8886b66/wrapt-2.1.2-cp313-cp313-win32.whl", hash = "sha256:5c35b5d82b16a3bc6e0a04349b606a0582bc29f573786aebe98e0c159bc48db6", size = 58205, upload-time = "2026-03-06T02:53:47.494Z" }, + { url = "https://files.pythonhosted.org/packages/12/69/c358c61e7a50f290958809b3c61ebe8b3838ea3e070d7aac9814f95a0528/wrapt-2.1.2-cp313-cp313-win_amd64.whl", hash = "sha256:f8bc1c264d8d1cf5b3560a87bbdd31131573eb25f9f9447bb6252b8d4c44a3a1", size = 60452, upload-time = "2026-03-06T02:53:30.038Z" }, + { url = "https://files.pythonhosted.org/packages/8e/66/c8a6fcfe321295fd8c0ab1bd685b5a01462a9b3aa2f597254462fc2bc975/wrapt-2.1.2-cp313-cp313-win_arm64.whl", hash = "sha256:3beb22f674550d5634642c645aba4c72a2c66fb185ae1aebe1e955fae5a13baf", size = 58842, upload-time = "2026-03-06T02:52:52.114Z" }, + { url = "https://files.pythonhosted.org/packages/da/55/9c7052c349106e0b3f17ae8db4b23a691a963c334de7f9dbd60f8f74a831/wrapt-2.1.2-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:0fc04bc8664a8bc4c8e00b37b5355cffca2535209fba1abb09ae2b7c76ddf82b", size = 63075, upload-time = "2026-03-06T02:53:19.108Z" }, + { url = "https://files.pythonhosted.org/packages/09/a8/ce7b4006f7218248dd71b7b2b732d0710845a0e49213b18faef64811ffef/wrapt-2.1.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:a9b9d50c9af998875a1482a038eb05755dfd6fe303a313f6a940bb53a83c3f18", size = 63719, upload-time = "2026-03-06T02:54:33.452Z" }, + { url = "https://files.pythonhosted.org/packages/e4/e5/2ca472e80b9e2b7a17f106bb8f9df1db11e62101652ce210f66935c6af67/wrapt-2.1.2-cp313-cp313t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:2d3ff4f0024dd224290c0eabf0240f1bfc1f26363431505fb1b0283d3b08f11d", size = 152643, upload-time = "2026-03-06T02:52:42.721Z" }, + { url = 
"https://files.pythonhosted.org/packages/36/42/30f0f2cefca9d9cbf6835f544d825064570203c3e70aa873d8ae12e23791/wrapt-2.1.2-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3278c471f4468ad544a691b31bb856374fbdefb7fee1a152153e64019379f015", size = 158805, upload-time = "2026-03-06T02:54:25.441Z" }, + { url = "https://files.pythonhosted.org/packages/bb/67/d08672f801f604889dcf58f1a0b424fe3808860ede9e03affc1876b295af/wrapt-2.1.2-cp313-cp313t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:a8914c754d3134a3032601c6984db1c576e6abaf3fc68094bb8ab1379d75ff92", size = 145990, upload-time = "2026-03-06T02:53:57.456Z" }, + { url = "https://files.pythonhosted.org/packages/68/a7/fd371b02e73babec1de6ade596e8cd9691051058cfdadbfd62a5898f3295/wrapt-2.1.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:ff95d4264e55839be37bafe1536db2ab2de19da6b65f9244f01f332b5286cfbf", size = 155670, upload-time = "2026-03-06T02:54:55.309Z" }, + { url = "https://files.pythonhosted.org/packages/86/2d/9fe0095dfdb621009f40117dcebf41d7396c2c22dca6eac779f4c007b86c/wrapt-2.1.2-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:76405518ca4e1b76fbb1b9f686cff93aebae03920cc55ceeec48ff9f719c5f67", size = 144357, upload-time = "2026-03-06T02:54:24.092Z" }, + { url = "https://files.pythonhosted.org/packages/0e/b6/ec7b4a254abbe4cde9fa15c5d2cca4518f6b07d0f1b77d4ee9655e30280e/wrapt-2.1.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:c0be8b5a74c5824e9359b53e7e58bef71a729bacc82e16587db1c4ebc91f7c5a", size = 150269, upload-time = "2026-03-06T02:53:31.268Z" }, + { url = "https://files.pythonhosted.org/packages/6e/6b/2fabe8ebf148f4ee3c782aae86a795cc68ffe7d432ef550f234025ce0cfa/wrapt-2.1.2-cp313-cp313t-win32.whl", hash = "sha256:f01277d9a5fc1862f26f7626da9cf443bebc0abd2f303f41c5e995b15887dabd", size = 59894, upload-time = "2026-03-06T02:54:15.391Z" }, + { url = "https://files.pythonhosted.org/packages/ca/fb/9ba66fc2dedc936de5f8073c0217b5d4484e966d87723415cc8262c5d9c2/wrapt-2.1.2-cp313-cp313t-win_amd64.whl", hash = "sha256:84ce8f1c2104d2f6daa912b1b5b039f331febfeee74f8042ad4e04992bd95c8f", size = 63197, upload-time = "2026-03-06T02:54:41.943Z" }, + { url = "https://files.pythonhosted.org/packages/c0/1c/012d7423c95d0e337117723eb8ecf73c622ce15a97847e84cf3f8f26cd7e/wrapt-2.1.2-cp313-cp313t-win_arm64.whl", hash = "sha256:a93cd767e37faeddbe07d8fc4212d5cba660af59bdb0f6372c93faaa13e6e679", size = 60363, upload-time = "2026-03-06T02:54:48.093Z" }, + { url = "https://files.pythonhosted.org/packages/39/25/e7ea0b417db02bb796182a5316398a75792cd9a22528783d868755e1f669/wrapt-2.1.2-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:1370e516598854e5b4366e09ce81e08bfe94d42b0fd569b88ec46cc56d9164a9", size = 61418, upload-time = "2026-03-06T02:53:55.706Z" }, + { url = "https://files.pythonhosted.org/packages/ec/0f/fa539e2f6a770249907757eaeb9a5ff4deb41c026f8466c1c6d799088a9b/wrapt-2.1.2-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:6de1a3851c27e0bd6a04ca993ea6f80fc53e6c742ee1601f486c08e9f9b900a9", size = 61914, upload-time = "2026-03-06T02:52:53.37Z" }, + { url = "https://files.pythonhosted.org/packages/53/37/02af1867f5b1441aaeda9c82deed061b7cd1372572ddcd717f6df90b5e93/wrapt-2.1.2-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:de9f1a2bbc5ac7f6012ec24525bdd444765a2ff64b5985ac6e0692144838542e", size = 120417, upload-time = "2026-03-06T02:54:30.74Z" }, + { url = 
"https://files.pythonhosted.org/packages/c3/b7/0138a6238c8ba7476c77cf786a807f871672b37f37a422970342308276e7/wrapt-2.1.2-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:970d57ed83fa040d8b20c52fe74a6ae7e3775ae8cff5efd6a81e06b19078484c", size = 122797, upload-time = "2026-03-06T02:54:51.539Z" }, + { url = "https://files.pythonhosted.org/packages/e1/ad/819ae558036d6a15b7ed290d5b14e209ca795dd4da9c58e50c067d5927b0/wrapt-2.1.2-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:3969c56e4563c375861c8df14fa55146e81ac11c8db49ea6fb7f2ba58bc1ff9a", size = 117350, upload-time = "2026-03-06T02:54:37.651Z" }, + { url = "https://files.pythonhosted.org/packages/8b/2d/afc18dc57a4600a6e594f77a9ae09db54f55ba455440a54886694a84c71b/wrapt-2.1.2-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:57d7c0c980abdc5f1d98b11a2aa3bb159790add80258c717fa49a99921456d90", size = 121223, upload-time = "2026-03-06T02:54:35.221Z" }, + { url = "https://files.pythonhosted.org/packages/b9/5b/5ec189b22205697bc56eb3b62aed87a1e0423e9c8285d0781c7a83170d15/wrapt-2.1.2-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:776867878e83130c7a04237010463372e877c1c994d449ca6aaafeab6aab2586", size = 116287, upload-time = "2026-03-06T02:54:19.654Z" }, + { url = "https://files.pythonhosted.org/packages/f7/2d/f84939a7c9b5e6cdd8a8d0f6a26cabf36a0f7e468b967720e8b0cd2bdf69/wrapt-2.1.2-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:fab036efe5464ec3291411fabb80a7a39e2dd80bae9bcbeeca5087fdfa891e19", size = 119593, upload-time = "2026-03-06T02:54:16.697Z" }, + { url = "https://files.pythonhosted.org/packages/0b/fe/ccd22a1263159c4ac811ab9374c061bcb4a702773f6e06e38de5f81a1bdc/wrapt-2.1.2-cp314-cp314-win32.whl", hash = "sha256:e6ed62c82ddf58d001096ae84ce7f833db97ae2263bff31c9b336ba8cfe3f508", size = 58631, upload-time = "2026-03-06T02:53:06.498Z" }, + { url = "https://files.pythonhosted.org/packages/65/0a/6bd83be7bff2e7efaac7b4ac9748da9d75a34634bbbbc8ad077d527146df/wrapt-2.1.2-cp314-cp314-win_amd64.whl", hash = "sha256:467e7c76315390331c67073073d00662015bb730c566820c9ca9b54e4d67fd04", size = 60875, upload-time = "2026-03-06T02:53:50.252Z" }, + { url = "https://files.pythonhosted.org/packages/6c/c0/0b3056397fe02ff80e5a5d72d627c11eb885d1ca78e71b1a5c1e8c7d45de/wrapt-2.1.2-cp314-cp314-win_arm64.whl", hash = "sha256:da1f00a557c66225d53b095a97eace0fc5349e3bfda28fa34ffae238978ee575", size = 59164, upload-time = "2026-03-06T02:53:59.128Z" }, + { url = "https://files.pythonhosted.org/packages/71/ed/5d89c798741993b2371396eb9d4634f009ff1ad8a6c78d366fe2883ea7a6/wrapt-2.1.2-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:62503ffbc2d3a69891cf29beeaccdb4d5e0a126e2b6a851688d4777e01428dbb", size = 63163, upload-time = "2026-03-06T02:52:54.873Z" }, + { url = "https://files.pythonhosted.org/packages/c6/8c/05d277d182bf36b0a13d6bd393ed1dec3468a25b59d01fba2dd70fe4d6ae/wrapt-2.1.2-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:c7e6cd120ef837d5b6f860a6ea3745f8763805c418bb2f12eeb1fa6e25f22d22", size = 63723, upload-time = "2026-03-06T02:52:56.374Z" }, + { url = "https://files.pythonhosted.org/packages/f4/27/6c51ec1eff4413c57e72d6106bb8dec6f0c7cdba6503d78f0fa98767bcc9/wrapt-2.1.2-cp314-cp314t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:3769a77df8e756d65fbc050333f423c01ae012b4f6731aaf70cf2bef61b34596", size = 152652, upload-time = "2026-03-06T02:53:23.79Z" }, + { url = 
"https://files.pythonhosted.org/packages/db/4c/d7dd662d6963fc7335bfe29d512b02b71cdfa23eeca7ab3ac74a67505deb/wrapt-2.1.2-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a76d61a2e851996150ba0f80582dd92a870643fa481f3b3846f229de88caf044", size = 158807, upload-time = "2026-03-06T02:53:35.742Z" }, + { url = "https://files.pythonhosted.org/packages/b4/4d/1e5eea1a78d539d346765727422976676615814029522c76b87a95f6bcdd/wrapt-2.1.2-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:6f97edc9842cf215312b75fe737ee7c8adda75a89979f8e11558dfff6343cc4b", size = 146061, upload-time = "2026-03-06T02:52:57.574Z" }, + { url = "https://files.pythonhosted.org/packages/89/bc/62cabea7695cd12a288023251eeefdcb8465056ddaab6227cb78a2de005b/wrapt-2.1.2-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:4006c351de6d5007aa33a551f600404ba44228a89e833d2fadc5caa5de8edfbf", size = 155667, upload-time = "2026-03-06T02:53:39.422Z" }, + { url = "https://files.pythonhosted.org/packages/e9/99/6f2888cd68588f24df3a76572c69c2de28287acb9e1972bf0c83ce97dbc1/wrapt-2.1.2-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:a9372fc3639a878c8e7d87e1556fa209091b0a66e912c611e3f833e2c4202be2", size = 144392, upload-time = "2026-03-06T02:54:22.41Z" }, + { url = "https://files.pythonhosted.org/packages/40/51/1dfc783a6c57971614c48e361a82ca3b6da9055879952587bc99fe1a7171/wrapt-2.1.2-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:3144b027ff30cbd2fca07c0a87e67011adb717eb5f5bd8496325c17e454257a3", size = 150296, upload-time = "2026-03-06T02:54:07.848Z" }, + { url = "https://files.pythonhosted.org/packages/6c/38/cbb8b933a0201076c1f64fc42883b0023002bdc14a4964219154e6ff3350/wrapt-2.1.2-cp314-cp314t-win32.whl", hash = "sha256:3b8d15e52e195813efe5db8cec156eebe339aaf84222f4f4f051a6c01f237ed7", size = 60539, upload-time = "2026-03-06T02:54:00.594Z" }, + { url = "https://files.pythonhosted.org/packages/82/dd/e5176e4b241c9f528402cebb238a36785a628179d7d8b71091154b3e4c9e/wrapt-2.1.2-cp314-cp314t-win_amd64.whl", hash = "sha256:08ffa54146a7559f5b8df4b289b46d963a8e74ed16ba3687f99896101a3990c5", size = 63969, upload-time = "2026-03-06T02:54:39Z" }, + { url = "https://files.pythonhosted.org/packages/5c/99/79f17046cf67e4a95b9987ea129632ba8bcec0bc81f3fb3d19bdb0bd60cd/wrapt-2.1.2-cp314-cp314t-win_arm64.whl", hash = "sha256:72aaa9d0d8e4ed0e2e98019cea47a21f823c9dd4b43c7b77bba6679ffcca6a00", size = 60554, upload-time = "2026-03-06T02:53:14.132Z" }, + { url = "https://files.pythonhosted.org/packages/1a/c7/8528ac2dfa2c1e6708f647df7ae144ead13f0a31146f43c7264b4942bf12/wrapt-2.1.2-py3-none-any.whl", hash = "sha256:b8fd6fa2b2c4e7621808f8c62e8317f4aae56e59721ad933bac5239d913cf0e8", size = 43993, upload-time = "2026-03-06T02:53:12.905Z" }, +]