- Core application lives under `app/`, with App Router layouts in `app/(auth)/`, `app/(dashboard)/`.
- Supporting UI atoms live in `components/`; shared hooks in `hooks/`, shared contexts in `contexts/`.
- Configuration lives in `next.config.ts`, `app.config.ts` (if present), and `config/`.
- Shared utilities and lib code are in `lib/`; type definitions in `types/`.
- i18n locale files live under `i18n/locales/` (structure must match the old project).
- Static assets belong in `public/` or `assets/`.
- Tests belong in `tests/` (mirror source structure when tests exist).
- UI vs feedback: `components/ui/` holds presentational, declarative UI primitives (e.g. Button, Dialog). `lib/feedback/` holds global imperative APIs for toast and confirm dialogs (MessageProvider/useMessage, DialogProvider/useDialog). Use `@/lib/feedback/message` and `@/lib/feedback/dialog` for imperative feedback; use `@/components/ui/*` for declarative UI.
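The declarative/imperative split can be sketched with a minimal subscription-based message API. This is a hypothetical illustration only: the names mirror `lib/feedback/message`, but the actual implementation in that module may differ.

```typescript
// Hypothetical sketch of an imperative feedback API in the spirit of
// lib/feedback/message: event handlers call message.success(...) directly,
// while a provider (like MessageProvider) subscribes a toast renderer.
type MessageKind = "success" | "error" | "info";
type Listener = (kind: MessageKind, text: string) => void;

const listeners = new Set<Listener>();

// A provider component would register its renderer here on mount.
export function subscribe(listener: Listener): () => void {
  listeners.add(listener);
  return () => {
    listeners.delete(listener);
  };
}

// The imperative surface: no JSX at the call site.
export const message = {
  success: (text: string) => listeners.forEach((l) => l("success", text)),
  error: (text: string) => listeners.forEach((l) => l("error", text)),
  info: (text: string) => listeners.forEach((l) => l("info", text)),
};
```

Declarative primitives such as Button or Dialog stay in `components/ui/`; only cross-cutting, fire-and-forget feedback goes through an imperative API like this.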
- `pnpm dev` – start the Next.js development server with hot reload.
- `pnpm build` – create a production build.
- `pnpm start` – run the production bundle locally.
- `pnpm lint` – run ESLint.
- `pnpm test:run` – run the test suite (when configured).
- `pnpm tsc --noEmit` – perform a strict TypeScript type check (or rely on `next build` for type-checking).
Before committing any code changes, you MUST run and pass:
- **Lockfile Sync Check**: `pnpm install --frozen-lockfile`
  - Ensures `pnpm-lock.yaml` is in sync with `package.json`.
  - MUST run `pnpm install` after modifying `package.json` and commit the updated `pnpm-lock.yaml`.
  - CI will fail if the lockfile is out of sync.
- **TypeScript Type Check**: `pnpm tsc --noEmit` (or `pnpm build`)
  - Ensures all TypeScript types are correct.
  - Must have zero errors before committing.
- **Lint Check**: `pnpm lint`
  - Ensures code follows ESLint rules.
  - Fix issues before committing.
- **Format Check** (if Prettier is configured): `pnpm prettier --check .`
  - Ensures consistent formatting.
  - If it fails, run `pnpm lint:fix` or `pnpm format` (when available) to auto-fix.
- **Test Coverage Check** (when tests exist): Review and update tests for code changes.
  - MUST review test cases when modifying code: add tests for new features, update tests for changed behavior, remove tests for removed features.
  - Run `pnpm test:run` to ensure all tests pass.
  - Ensure test cases accurately reflect the current implementation.
**Automated Enforcement**: If a pre-commit hook exists, it will run these checks. If any check fails, the commit will be blocked.

**Quick Fix**: If checks fail:
- Run `pnpm install` to sync the lockfile (if `package.json` changed).
- Fix ESLint/Prettier issues.
- Address TypeScript errors manually.
- Review and update test cases as needed, then run `pnpm test:run` to verify.
- Use Prettier defaults when configured; run `pnpm lint:fix` or `pnpm format` after making changes.
- React components use functional components with TypeScript; prefer hooks and custom hooks for shared logic.
- Component files use kebab-case (e.g. `bucket-selector.tsx`); reference them with PascalCase in JSX (e.g. `<BucketSelector />`).
- Override shadcn primitives outside `components/ui/`; never edit files in that directory directly.
- Render tabular data with the shared `DataTable` + `useDataTable` utilities unless a specific requirement makes them unsuitable.
- Language pack files must follow the structure used in the old project; do not alter the i18n layout or keys arbitrarily.
- Directories: Group by domain/feature; use plural for domain folders (e.g. `buckets/`, `user/`, `object/`).
- File names: kebab-case; do not repeat the directory name in the filename (e.g. under `buckets/` use `info.tsx`, `new-form.tsx`, `selector.tsx` instead of `bucket-info.tsx`, `bucket-new-form.tsx`). The path already provides context.
- Component names: PascalCase, aligned with the domain and purpose (e.g. `BucketInfo`, `UserDropdown`); component names may still include the domain when used in JSX for clarity.
- Forms: Use consistent patterns per domain: `XxxNewForm`/`XxxEditForm` or `XxxForm`; files can be `new-form.tsx`, `edit-form.tsx`, `form.tsx` under the domain folder.
- Placement: Components used only by one domain live in that domain folder; components reused by 3+ different domain pages may stay at root or under `components/shared/` (document if so).
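The file-to-component mapping above is mechanical: a kebab-case file name corresponds to a PascalCase component name. The helper below is a hypothetical illustration of that rule, not project code.

```typescript
// Illustrates the naming convention: kebab-case file name -> PascalCase
// component name (e.g. bucket-selector.tsx -> BucketSelector).
export function toPascalCase(fileName: string): string {
  return fileName
    .replace(/\.tsx?$/, "") // drop the .ts/.tsx extension
    .split("-")
    .map((part) => part.charAt(0).toUpperCase() + part.slice(1))
    .join("");
}

// toPascalCase("bucket-selector.tsx") → "BucketSelector"
// toPascalCase("new-form.tsx")        → "NewForm"
```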
- When tests are configured, add new suites under `tests/`, mirroring the source structure.
- Name files `*.spec.ts` or `*.test.ts`.
- Keep tests deterministic; mock network calls through provided hooks or context.
⚠️ CRITICAL: Every code change MUST include corresponding test updates when tests exist:
- New features: Add comprehensive test cases covering happy paths and edge cases.
- Modified behavior: Update existing tests to reflect the new implementation.
- Removed features: Remove or update tests for deprecated/removed functionality.
- Bug fixes: Add regression tests to prevent future occurrences.
- Run `pnpm test:run` before submitting any changes.
- Follow conventional, action-oriented commit subjects (e.g. `feat: add bucket selector`, `fix: correct object list pagination`).
- Each pull request should include: a concise summary, a linked issue or task, screenshots for UI work, and testing notes.
- Keep PRs scoped; large refactors should be coordinated in advance.
- Commit messages and PR titles must be in English.
- When a PR template exists (e.g. `.github/pull_request_template.md`), follow it strictly.
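A conventional commit subject has the shape `type(optional-scope): imperative description`. The check below is a hypothetical sketch of that shape; the list of allowed types is an assumption, and the real rule may instead be enforced by a tool such as commitlint or a git hook.

```typescript
// Hypothetical validator for conventional commit subjects, e.g.
//   "feat: add bucket selector"
//   "fix(buckets): correct object list pagination"
// The allowed type list here is an assumption, not the project's config.
const SUBJECT_RE = /^(feat|fix|docs|style|refactor|test|chore)(\([\w-]+\))?: .+/;

export function isValidSubject(subject: string): boolean {
  return SUBJECT_RE.test(subject);
}
```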
- Apply visual tweaks (e.g. removing shadows, altering colors) at usage sites via classes such as `class="shadow-none"`.
- When extending shadcn components, create wrapper components (e.g. a `BucketSelector` wrapper in `bucket-selector.tsx`) instead of forking primitives.
- Do not change base colors or theme variables defined in `console-new` unless explicitly required by the migration plan.
- Incremental progress over big bangs – Small changes that compile and pass tests.
- Learning from existing code – Study and plan before implementing.
- Pragmatic over dogmatic – Adapt to project reality.
- Clear intent over clever code – Be boring and obvious.
- Single responsibility per function/class.
- Avoid premature abstractions.
- No clever tricks – choose the boring solution.
- If you need to explain it, it’s too complex.
Break complex work into 3–5 stages. Document in `IMPLEMENTATION_PLAN.md` only when explicitly requested (see Documentation Restriction):

```markdown
## Stage N: [Name]
**Goal**: [Specific deliverable]
**Success Criteria**: [Testable outcomes]
**Tests**: [Specific test cases]
**Status**: [Not Started|In Progress|Complete]
```

- Update status as you progress.
- Remove the file when all stages are done.
1. **Understand** – Study existing patterns in the codebase.
2. **Test** – Write tests first (red).
3. **Implement** – Minimal code to pass (green).
4. **Refactor** – Clean up with tests passing.
5. **Commit** – With a clear message linking to the plan.
CRITICAL: Maximum 3 attempts per issue, then STOP.
1. **Document what failed**:
   - What you tried.
   - Specific error messages.
   - Why you think it failed.
2. **Research alternatives**:
   - Find 2–3 similar implementations.
   - Note different approaches used.
3. **Question fundamentals**:
   - Is this the right abstraction level?
   - Can this be split into smaller problems?
   - Is there a simpler approach entirely?
4. **Try a different angle**:
   - Different library/framework feature?
   - Different architectural pattern?
   - Remove abstraction instead of adding?
- Composition over inheritance – Use dependency injection.
- Interfaces over singletons – Enable testing and flexibility.
- Explicit over implicit – Clear data flow and dependencies.
- Test-driven when possible – Never disable tests; fix them.
- **Every commit must**:
  - Compile successfully.
  - Pass all existing tests.
  - Include tests for new functionality (when tests exist).
  - Follow project formatting/linting.
- **Before committing**:
  - Run formatters/linters.
  - Self-review changes.
  - Ensure the commit message explains "why".
- Fail fast with descriptive messages.
- Include context for debugging.
- Handle errors at the appropriate level.
- Never silently swallow exceptions.
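The four rules above can be sketched in one function: fail fast with a descriptive message, attach the context a debugger needs, and re-throw rather than swallow. `parseBucketResponse` and its shapes are hypothetical, not project code.

```typescript
// Carries debugging context alongside the message instead of losing it.
class AppError extends Error {
  context: Record<string, unknown>;
  constructor(message: string, context: Record<string, unknown>) {
    super(message);
    this.name = "AppError";
    this.context = context;
  }
}

export function parseBucketResponse(body: string, bucketId: string): { name: string } {
  let parsed: unknown;
  try {
    parsed = JSON.parse(body);
  } catch (err) {
    // Never silently swallow: re-throw with context for debugging.
    throw new AppError("Failed to parse bucket response", {
      bucketId,
      cause: String(err),
    });
  }
  const name = (parsed as { name?: unknown } | null)?.name;
  if (typeof name !== "string") {
    // Fail fast with a descriptive message instead of returning a default.
    throw new AppError("Bucket response has no string `name`", { bucketId });
  }
  return { name };
}
```

The caller one level up decides whether to surface the error as user feedback or let it propagate; the parsing layer only attaches what it knows.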
When multiple valid approaches exist, choose based on:
- Testability – Can I easily test this?
- Readability – Will someone understand this in 6 months?
- Consistency – Does this match project patterns?
- Simplicity – Is this the simplest solution that works?
- Reversibility – How hard is it to change later?
- Find 3 similar features/components.
- Identify common patterns and conventions.
- Use the same libraries/utilities when possible.
- Follow existing test patterns.
- Use the project’s existing build system.
- Use the project’s test framework.
- Use the project’s formatter/linter settings.
- Don’t introduce new tools without strong justification.
- Tests written and passing (when applicable).
- Code follows project conventions.
- No linter/formatter warnings.
- Commit messages are clear.
- Implementation matches plan.
- No TODOs without issue numbers.
- Test behavior, not implementation.
- One assertion per test when possible.
- Clear test names describing the scenario.
- Use existing test utilities/helpers.
- Tests should be deterministic.
Unless explicitly requested, do not produce any summary-type, plan-type, analysis-type, or similar documentation in the project. This includes but is not limited to:
- `IMPLEMENTATION_PLAN.md`, `SUMMARY.md`, `PLAN.md`, `CHANGELOG.md`
- Migration summaries, progress reports, or task completion reports
- Analysis documents (e.g. refactor analysis, page/code analysis, architecture analysis, `*_ANALYSIS*.md`)
- Any document created proactively to describe or track work

Create such documents only when the user explicitly asks for them.
NEVER:
- Use `--no-verify` to bypass commit hooks.
- Disable tests instead of fixing them.
- Commit code that doesn't compile.
- Make assumptions – verify with existing code.
- During migration: modify page text, add UI components, or change component positions without plan approval.

ALWAYS:
- Commit working code incrementally.
- Update plan documentation as you go.
- Learn from existing implementations when they exist (especially `console-old`).
- Stop after 3 failed attempts and reassess.