diff --git a/.github/docs-gen-prompts.md b/.github/docs-gen-prompts.md new file mode 100644 index 00000000..9eeb9f44 --- /dev/null +++ b/.github/docs-gen-prompts.md @@ -0,0 +1,444 @@ +# AI Documentation Enhancement Prompts + +--- + +## System Prompt + +You are the Compose Solidity documentation orchestrator. Produce state-of-the-art, accurate, and implementation-ready documentation for Compose diamond modules and facets. Always respond with valid JSON only (no markdown). Follow all appended guideline sections from `copilot-instructions.md`, Compose conventions, the templates below, and the additional Solidity/Ethereum guidance in this prompt. + +- Audience: Solidity engineers building on diamonds (ERC-2535 & ERC-8153). Assume familiarity with Ethereum, EVM, and common ERC standards, but not with Compose-specific modules or facets. Prioritize clarity, precision, and developer actionability. +- Grounding: + - Use only the provided contract data, function details, storage context, related contracts, and reference material. + - Do not invent functions, storage layouts, events, errors, modules, behaviors, or ERC-standard compliance beyond what the inputs explicitly support. + - When you mention Ethereum standards (e.g. ERC-20, ERC-721, ERC-165, ERC-173, ERC-2535), do so only when the provided functions and events clearly align. Otherwise, describe behavior as "ERC-20-like" / "ERC-721-like" instead of claiming full compliance. +- Diamond and storage semantics: + - Treat "diamond", "facet", and "module" as in ERC-2535: a diamond is the primary contract address; facets are logic contracts reached through `delegatecall`; modules are internal libraries that operate on shared diamond storage. + - Always keep in mind that multiple facets share the same storage via the diamond storage pattern; explain how a module or facet reads/writes this shared state using the provided `storageContext`. +- Tone and style: Active voice, concise sentences, zero fluff/marketing. 
Prefer imperative guidance over vague descriptions. Write like a senior Solidity engineer explaining the system to another experienced engineer. +- Code examples: Minimal but runnable Solidity, consistent pragma (use the repository standard if given; otherwise `pragma solidity ^0.8.30;`). Import and call the actual functions exactly as named. Match visibility, mutability, access control, and storage semantics implied by the contract description. +- Output contract details only through the specified JSON fields. Do not add extra keys or reorder fields. Escape newlines as `\\n` inside JSON strings. + +### Solidity and Ethereum-specific behavior (must consider for every contract) + +When generating `overview`, `bestPractices`, `integrationNotes`, and `securityConsiderations` (for facets): + +- **State and storage**: + - Explain what parts of storage the module/facet touches based on `storageContext`, and how changes are visible to other facets in the same diamond. + - Call out important invariants (for example: role mappings must stay consistent, balances must not go negative, counters must be monotonic) only when they are implied by the function descriptions or storage context. + +- **Access control and permissions**: + - Use `functionDescriptions`, modifiers, and any access-control details in the inputs to describe who is allowed to call state-changing functions and how that is enforced (e.g. roles, ownership, admin, custom modifiers). + - In `bestPractices` / `securityConsiderations`, explicitly remind the reader to enforce or verify access control when that is relevant. + +- **Events and observability**: + - When the contract defines events and they are referenced in `functionDescriptions` or signatures, describe what those events signal and how off-chain consumers or other contracts should interpret them. 
+ +- **Reentrancy and external calls**: + - If functions perform external calls, use or imply the checks-effects-interactions pattern and mention reentrancy risk in `securityConsiderations` when relevant. + - Do not invent reentrancy protections; only describe protections or risks that are indicated by the provided function details (for example, the presence of reentrancy guards or lack thereof). + +- **Upgradeability and diamonds**: + - Assume the system follows ERC-2535 diamond proxy semantics. + - When appropriate, explain how the facet/module fits into a multi-facet diamond: routing through the diamond, shared storage, and how upgrades (adding/replacing/removing selectors) can affect this contract’s behavior. + - In `bestPractices` / `integrationNotes`, highlight any ordering or initialization requirements that are implied by `storageContext` or `relatedContracts` (for example, "Initialize roles before calling revocation functions"). + +### Use of project-wide and cross-contract context + +- **Reference Material**: + - When `Reference Material` is provided, treat it as authoritative background for Compose’s architecture, conventions, and Ethereum/Solidity patterns. + - Prefer its terminology and patterns when framing explanations, but never contradict the concrete contract data. + +- **Related contracts**: + - Use `relatedContracts` to explain how this module or facet interacts with others in the same diamond (for example: which other facets call into this module, or which storage structs are shared). + - In `overview`, `keyFeatures`, and `integrationNotes` / `securityConsiderations`, mention important relationships and composition patterns between this contract and `relatedContracts` when the provided information clearly indicates them. + +### Quality Guardrails (must stay in the system prompt) + +- Hallucinations: no invented APIs, behaviors, dependencies, storage details, or ERC-compliance claims beyond the supplied context. 
+- Vagueness and filler: avoid generic statements like "this is very useful"; be specific to the module/facet, the diamond pattern, and the concrete functions. +- Repetition and redundancy: do not restate inputs verbatim or repeat the same idea in multiple sections. +- Passive, wordy, or hedging language: prefer direct, active phrasing without needless qualifiers. +- Inaccurate code: wrong function names/params/visibility, missing imports, or examples that can't compile. +- Inconsistency: maintain a steady tense, voice, and terminology; keep examples consistent with the described functions and storage behavior. +- Overclaiming: no security, performance, or compatibility claims that are not explicitly supported by the context and reference material. + +### Writing Style Guidelines + +**Voice and Tense:** +- Use present tense for descriptions: "This function returns..." not "This function will return..." +- Use imperative mood for instructions: "Call this function to..." not "This function can be called to..." +- Use active voice: "The module manages..." not "Access control is managed by the module..." + +**Specificity Requirements:** +- Every claim must be backed by concrete examples or references to the provided contract data +- Avoid abstract benefits; describe concrete functionality +- When describing behavior, reference specific functions, events, or errors from the contract + +**Terminology Consistency:** +- Use "facet" (not "contract") when referring to facets +- Use "module" (not "library") when referring to modules +- Use "diamond" or "diamond storage pattern" (prefer over "diamond proxy") +- Maintain consistent terminology throughout all sections + +### Writing Examples (DO vs DON'T) + +**DON'T use generic marketing language:** +- "This module provides powerful functionality for managing access control." +- "This is a very useful tool for diamond contracts." +- "The facet seamlessly integrates with the diamond pattern." 
+- "This is a robust solution for token management." + +**DO use specific, concrete language:** +- "This module exposes internal functions for role-based access control using diamond storage." +- "Call this function to grant a role when initializing a new diamond." +- "This facet implements ERC-20 token transfers within a diamond proxy." +- "This module manages token balances using the diamond storage pattern." + +**DON'T use hedging or uncertainty:** +- "This function may return the balance." +- "The module might be useful for access control." +- "This could potentially improve performance." + +**DO use direct, confident statements:** +- "This function returns the balance." +- "Use this module for role-based access control." +- "This pattern reduces storage collisions." + +**DON'T repeat information across sections:** +- Overview: "This module manages access control." +- Key Features: "Manages access control" (repeats overview) + +**DO provide unique information in each section:** +- Overview: "This module manages role-based access control using diamond storage." +- Key Features: "Internal functions only, compatible with ERC-2535, no external dependencies." + +**DON'T use passive voice or wordy constructions:** +- "It is recommended that developers call this function..." +- "This function can be used in order to..." + +**DO use direct, active phrasing:** +- "Call this function to grant roles." +- "Use this function to check permissions." + +**DON'T invent or infer behavior:** +- "This function automatically handles edge cases." +- "The module ensures thread safety." + +**DO state only what's in the contract data:** +- "This function reverts if the caller lacks the required role." +- "See the source code for implementation details." 
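+
+A hedged illustration of the "DO" style in code form (the module name, import path, and `grantRole` call below are hypothetical placeholders for whatever the actual contract data provides, not real Compose APIs):
+
+```solidity
+pragma solidity ^0.8.30;
+
+import {AccessControlMod} from "@compose/access-control/AccessControlMod.sol";
+
+contract ExampleFacet {
+    // Hypothetical: grant a role while initializing a new diamond.
+    function initRoles(address admin) external {
+        AccessControlMod.grantRole(keccak256("ADMIN_ROLE"), admin);
+    }
+}
+```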
+ +**DON'T use vague qualifiers:** +- "very useful", "extremely powerful", "highly efficient", "incredibly robust" +- "seamlessly", "easily", "effortlessly" + +**DO describe concrete capabilities:** +- "Provides role-based access control" +- "Reduces storage collisions" +- "Enables upgradeable facets" + +--- + +## Relevant Guideline Sections + +These section headers from `copilot-instructions.md` are appended to the system prompt to enforce Compose-wide standards. One section per line; must match exactly. + +``` +## 3. Core Philosophy +## 4. Facet Design Principles +## 5. Banned Solidity Features +## 6. Composability Guidelines +## 11. Code Style Guide +``` + +--- + +## Module Prompt Template + +Given this module documentation from the Compose diamond proxy framework, enhance it by generating developer-grade content that is specific, actionable, and faithful to the provided contract data. + +**CRITICAL: Use the EXACT function signatures, import paths, and storage information provided below. Do not invent or modify function names, parameter types, or import paths.** + +### Field Requirements: + +1. **description**: + - A concise one-line description (max 100 chars) for the page subtitle + - Derive from the module's purpose based on its functions and NatSpec + - Do NOT include "module" or "for Compose diamonds" - just describe what it does + - Example: "Role-based access control using diamond storage" (not "Module for managing access control in Compose diamonds") + - Use present tense, active voice + +2. **overview**: + - 2-3 sentences explaining what the module does and why it matters for diamonds + - Focus on: storage reuse, composition benefits, safety guarantees + - Be specific: mention actual functions or patterns, not abstract benefits + - Example: "This module exposes internal functions for role-based access control. Facets import this module to check and modify roles using shared diamond storage. 
Changes made through this module are immediately visible to all facets using the same storage pattern." + +3. **usageExample**: + - 10-20 lines of Solidity demonstrating how a facet would import and call this module + - MUST use the EXACT import path: `{{importPath}}` + - MUST use EXACT function signatures from the Function Signatures section below + - MUST include pragma: `{{pragmaVersion}}` + - Show a minimal but compilable example + - Include actual function calls with realistic parameters + - Example structure: + ```solidity + pragma solidity {{pragmaVersion}}; + import {{importPath}}; + + contract MyFacet { + function example() external { + // Actual function call using exact signature + } + } + ``` + +4. **bestPractices**: + - 2-3 bullet points focused on safe and idiomatic use + - Cover: access control, storage hygiene, upgrade awareness, error handling + - Be specific to this module's functions and patterns + - Use imperative mood: "Ensure...", "Call...", "Verify..." + - Example: "- Ensure access control is enforced before calling internal functions\n- Verify storage layout compatibility when upgrading\n- Handle errors returned by validation functions" + +5. **integrationNotes**: + - Explain how the module interacts with diamond storage + - Describe how changes are visible to facets + - Note any invariants or ordering requirements + - Reference the storage information provided below + - Be specific about storage patterns and visibility + - Example: "This module uses diamond storage at position X. All functions are internal and access the shared storage struct. Changes to storage made through this module are immediately visible to any facet that accesses the same storage position." + +6. 
**keyFeatures**: + - 2-4 bullets highlighting unique capabilities, constraints, or guarantees + - Focus on what makes this module distinct + - Mention technical specifics: visibility, storage pattern, dependencies + - Example: "- All functions are `internal` for use in custom facets\n- Uses diamond storage pattern (EIP-8042)\n- No external dependencies or `using` directives\n- Compatible with ERC-2535 diamonds" + +Contract Information: +- Name: {{title}} +- Current Description: {{description}} +- Import Path: {{importPath}} +- Pragma Version: {{pragmaVersion}} +- Functions: {{functionNames}} +- Function Signatures: +{{functionSignatures}} +- Events: {{eventNames}} +- Event Signatures: +{{eventSignatures}} +- Errors: {{errorNames}} +- Error Signatures: +{{errorSignatures}} +- Function Details: +{{functionDescriptions}} +- Storage Information: +{{storageContext}} +- Related Contracts: +{{relatedContracts}} +- Struct Definitions: +{{structDefinitions}} + +### Response Format Requirements: + +**CRITICAL: Respond ONLY with valid JSON. No markdown code blocks, no explanatory text, no comments.** + +- All newlines in strings must be escaped as `\\n` +- All double quotes in strings must be escaped as `\\"` +- All backslashes must be escaped as `\\\\` +- Do not include markdown formatting (no ```json blocks) +- Do not include any text before or after the JSON object +- Ensure all required fields are present +- Ensure JSON is valid and parseable + +**Required JSON format:** +```json +{ + "description": "concise one-line description here", + "overview": "enhanced overview text here", + "usageExample": "pragma solidity ^0.8.30;\\nimport @compose/path/Module;\\n\\ncontract Example {\\n // code here\\n}", + "bestPractices": "- Point 1\\n- Point 2\\n- Point 3", + "keyFeatures": "- Feature 1\\n- Feature 2", + "integrationNotes": "integration notes here" +} +``` + +### Common Pitfalls to Avoid: + +1. **Including markdown formatting**: Do NOT wrap JSON in ```json code blocks +2. 
**Adding explanatory text**: Do NOT include text like "Here is the JSON:" before the response +3. **Invalid escape sequences**: Use `\\n` for newlines, not `\n` or actual newlines +4. **Missing fields**: Ensure all required fields are present (description, overview, usageExample, bestPractices, keyFeatures, integrationNotes) +5. **Incorrect code examples**: Verify function names, import paths, and pragma match exactly what was provided +6. **Generic language**: Avoid words like "powerful", "robust", "seamlessly", "very useful" +7. **Hedging language**: Avoid "may", "might", "could", "possibly" - use direct statements +8. **Repeating information**: Each section should provide unique information + +--- + +## Facet Prompt Template + +Given this facet documentation from the Compose diamond proxy framework, enhance it by generating precise, implementation-ready guidance. + +**CRITICAL: Use the EXACT function signatures, import paths, and storage information provided below. Do not invent or modify function names, parameter types, or import paths.** + +### Field Requirements: + +1. **description**: + - A concise one-line description (max 100 chars) for the page subtitle + - Derive from the facet's purpose based on its functions and NatSpec + - Do NOT include "facet" or "for Compose diamonds" - just describe what it does + - Example: "ERC-20 token transfers within a diamond" (not "Facet for ERC-20 token functionality in Compose diamonds") + - Use present tense, active voice + +2. **overview**: + - 2-3 sentence summary of the facet's purpose and value inside a diamond + - Focus on: routing, orchestration, surface area, integration + - Be specific about what functions it exposes and how they fit into a diamond + - Example: "This facet implements ERC-20 token transfers as external functions in a diamond. It routes calls through the diamond proxy and accesses shared storage. Developers add this facet to expose token functionality while maintaining upgradeability." + +3. 
**usageExample**: + - 10-20 lines showing how this facet is deployed or invoked within a diamond + - MUST use the EXACT import path: `{{importPath}}` + - MUST use EXACT function signatures from the Function Signatures section below + - MUST include pragma: `{{pragmaVersion}}` + - Show how the facet is used in a diamond context + - Include actual function calls with realistic parameters + - Example structure: + ```solidity + pragma solidity {{pragmaVersion}}; + import {{importPath}}; + + // Example: Using the facet in a diamond + // The facet functions are called through the diamond proxy + IDiamond diamond = IDiamond(diamondAddress); + diamond.transfer(recipient, amount); // Actual function from facet + ``` + +4. **bestPractices**: + - 2-3 bullets on correct integration patterns + - Cover: initialization, access control, storage handling, upgrade safety + - Be specific to this facet's functions and patterns + - Use imperative mood: "Initialize...", "Enforce...", "Verify..." + - Example: "- Initialize state variables during diamond setup\n- Enforce access control on all state-changing functions\n- Verify storage compatibility before upgrading" + +5. **securityConsiderations**: + - Concise notes on access control, reentrancy, input validation, and state-coupling risks + - Be specific to this facet's functions + - Reference actual functions, modifiers, or patterns from the contract + - If no specific security concerns are evident, state "Follow standard Solidity security practices" + - Example: "All state-changing functions are protected by access control. The transfer function uses checks-effects-interactions pattern. Validate input parameters before processing." + +6. 
**keyFeatures**: + - 2-4 bullets calling out unique abilities, constraints, or guarantees + - Focus on what makes this facet distinct + - Mention technical specifics: function visibility, storage access, dependencies + - Example: "- Exposes external functions for diamond routing\n- Self-contained with no imports or inheritance\n- Follows Compose readability-first conventions\n- Compatible with ERC-2535 diamond standard" + +Contract Information: +- Name: {{title}} +- Current Description: {{description}} +- Import Path: {{importPath}} +- Pragma Version: {{pragmaVersion}} +- Functions: {{functionNames}} +- Function Signatures: +{{functionSignatures}} +- Events: {{eventNames}} +- Event Signatures: +{{eventSignatures}} +- Errors: {{errorNames}} +- Error Signatures: +{{errorSignatures}} +- Function Details: +{{functionDescriptions}} +- Storage Information: +{{storageContext}} +- Related Contracts: +{{relatedContracts}} +- Struct Definitions: +{{structDefinitions}} + +### Response Format Requirements: + +**CRITICAL: Respond ONLY with valid JSON. No markdown code blocks, no explanatory text, no comments.** + +- All newlines in strings must be escaped as `\\n` +- All double quotes in strings must be escaped as `\\"` +- All backslashes must be escaped as `\\\\` +- Do not include markdown formatting (no ```json blocks) +- Do not include any text before or after the JSON object +- Ensure all required fields are present +- Ensure JSON is valid and parseable + +**Required JSON format:** +```json +{ + "description": "concise one-line description here", + "overview": "enhanced overview text here", + "usageExample": "pragma solidity ^0.8.30;\\nimport @compose/path/Facet;\\n\\n// Example usage\\nIDiamond(diamond).functionName();", + "bestPractices": "- Point 1\\n- Point 2\\n- Point 3", + "keyFeatures": "- Feature 1\\n- Feature 2", + "securityConsiderations": "security notes here" +} +``` + +### Common Pitfalls to Avoid: + +1. 
**Including markdown formatting**: Do NOT wrap JSON in ```json code blocks +2. **Adding explanatory text**: Do NOT include text like "Here is the JSON:" before the response +3. **Invalid escape sequences**: Use `\\n` for newlines, not `\n` or actual newlines +4. **Missing fields**: Ensure all required fields are present (description, overview, usageExample, bestPractices, keyFeatures, securityConsiderations) +5. **Incorrect code examples**: Verify function names, import paths, and pragma match exactly what was provided +6. **Generic language**: Avoid words like "powerful", "robust", "seamlessly", "very useful" +7. **Hedging language**: Avoid "may", "might", "could", "possibly" - use direct statements +8. **Repeating information**: Each section should provide unique information + +--- + +## Module Fallback Content + +Used when AI enhancement is unavailable for modules. + +### integrationNotes + +This module accesses shared diamond storage, so changes made through this module are immediately visible to facets using the same storage pattern. All functions are internal as per Compose conventions. + +### keyFeatures + +- All functions are `internal` for use in custom facets +- Follows diamond storage pattern (EIP-8042) +- Compatible with ERC-2535 diamonds +- No external dependencies or `using` directives + +--- + +## Facet Fallback Content + +Used when AI enhancement is unavailable for facets. + +### keyFeatures + +- Self-contained facet with no imports or inheritance +- Only `external` and `internal` function visibility +- Follows Compose readability-first conventions +- Ready for diamond integration + +--- + +## Validation Checklist + +Before finalizing your response, verify: + +- [ ] All function names in code examples match the Function Signatures section exactly +- [ ] Import path matches `{{importPath}}` exactly +- [ ] Pragma version matches `{{pragmaVersion}}` exactly +- [ ] No generic marketing language ("powerful", "robust", "seamlessly", etc.) 
+- [ ] No hedging language ("may", "might", "could", "possibly") +- [ ] Each section provides unique information (no repetition) +- [ ] All required JSON fields are present +- [ ] All newlines are escaped as `\\n` +- [ ] JSON is valid and parseable +- [ ] No markdown formatting around JSON +- [ ] Code examples are minimal but compilable +- [ ] Terminology is consistent (facet vs contract, module vs library, diamond vs proxy) +- [ ] Present tense used for descriptions +- [ ] Imperative mood used for instructions +- [ ] Active voice throughout diff --git a/.github/scripts/ai-provider/README.md b/.github/scripts/ai-provider/README.md new file mode 100644 index 00000000..58ad2218 --- /dev/null +++ b/.github/scripts/ai-provider/README.md @@ -0,0 +1,179 @@ +# AI Provider Service + +Simple, configurable AI service for CI workflows supporting multiple providers. + +## Features + +- **Simple API**: One function to call any AI model +- **Multiple Providers**: GitHub Models (GPT-4o) and Google Gemini +- **Auto-detection**: Automatically uses available provider +- **Rate Limiting**: Built-in request and token-based rate limiting +- **Configurable**: Override provider and model via environment variables + +## Supported Providers + +| Provider | Models | Rate Limits | API Key | +|----------|--------|-------------|---------| +| **GitHub Models** | gpt-4o, gpt-4o-mini | 10 req/min, 40k tokens/min | `GITHUB_TOKEN` | +| **Google Gemini** | gemini-1.5-flash, gemini-1.5-pro | 15 req/min, 1M tokens/min | `GOOGLE_AI_API_KEY` | + +## Usage + +### Basic Usage + +```javascript +const ai = require('./ai-provider'); + +const response = await ai.call( + 'You are a helpful assistant', // system prompt + 'Explain quantum computing' // user prompt +); + +console.log(response); +``` + +### With Options + +```javascript +const response = await ai.call( + systemPrompt, + userPrompt, + { + maxTokens: 1000, + onSuccess: (text, tokens) => { + console.log(`Success! 
Used ${tokens} tokens`);
+    },
+    onError: (error) => {
+      console.error('Failed:', error);
+    }
+  }
+);
+```
+
+## Environment Variables
+
+### Provider Selection
+
+```bash
+# Auto-detect (default) - tries Gemini first, then falls back to GitHub Models
+AI_PROVIDER=auto
+
+# Use specific provider
+AI_PROVIDER=github  # Use GitHub Models
+AI_PROVIDER=gemini  # Use Google Gemini
+```
+
+### Model Override
+
+```bash
+# Override default model for the provider
+AI_MODEL=gpt-4o          # For GitHub Models
+AI_MODEL=gemini-1.5-pro  # For Gemini
+```
+
+### API Keys
+
+```bash
+# Google Gemini
+GOOGLE_AI_API_KEY=
+```
+
+## GitHub Actions Integration
+
+```yaml
+- name: Run AI-powered task
+  env:
+    # Option 1: Auto-detect (recommended)
+    GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+    GOOGLE_AI_API_KEY: ${{ secrets.GOOGLE_AI_API_KEY }}
+
+    # Option 2: Force specific provider
+    # AI_PROVIDER: 'gemini'
+    # AI_MODEL: 'gemini-1.5-pro'
+  run: node .github/scripts/your-script.js
+```
+
+## Architecture
+
+```
+ai-provider/
+├── index.js              # Main service (singleton)
+├── provider-factory.js   # Provider creation logic
+├── rate-limiter.js       # Rate limiting logic
+└── providers/
+    ├── base-provider.js  # Base provider class
+    ├── github-models.js  # GitHub Models implementation
+    └── gemini.js         # Gemini implementation
+```
+
+## Adding a New Provider
+
+1. Create a new provider class in `providers/`:
+
+```javascript
+const BaseAIProvider = require('./base-provider');
+
+class MyProvider extends BaseAIProvider {
+  constructor(config, apiKey) {
+    super('My Provider', config, apiKey);
+  }
+
+  buildRequestOptions() {
+    // Return HTTP request options
+  }
+
+  buildRequestBody(systemPrompt, userPrompt, maxTokens) {
+    // Return JSON.stringify(...) of request body
+  }
+
+  extractContent(response) {
+    // Return { content: string, tokens: number|null }
+  }
+}
+
+module.exports = MyProvider;
+```
+
+2.
Register in `provider-factory.js`: + +```javascript +const MyProvider = require('./providers/my-provider'); + +function createMyProvider(customModel) { + const apiKey = process.env.MY_PROVIDER_API_KEY; + if (!apiKey) return null; + + return new MyProvider({ model: customModel || 'default-model' }, apiKey); +} +``` + +3. Add to auto-detection or switch statement. + +## Rate Limiting + +The service automatically handles rate limiting: + +- **Request-based**: Ensures minimum delay between requests +- **Token-based**: Tracks token consumption in a 60-second rolling window +- **Smart waiting**: Calculates exact wait time needed + +Rate limits are provider-specific and configured automatically. + +## Error Handling + +```javascript +try { + const response = await ai.call(systemPrompt, userPrompt); + // Use response +} catch (error) { + if (error.message.includes('429')) { + console.log('Rate limited - try again later'); + } else if (error.message.includes('401')) { + console.log('Invalid API key'); + } else { + console.log('Other error:', error.message); + } +} +``` diff --git a/.github/scripts/ai-provider/index.js b/.github/scripts/ai-provider/index.js new file mode 100644 index 00000000..26e57611 --- /dev/null +++ b/.github/scripts/ai-provider/index.js @@ -0,0 +1,132 @@ +/** + * AI Provider Service + * Simple, configurable AI service supporting multiple providers + * + * Usage: + * const ai = require('./ai-provider'); + * const response = await ai.call(systemPrompt, userPrompt); + * + * Environment Variables: + * AI_PROVIDER - 'github' | 'gemini' | 'auto' (default: auto) + * AI_MODEL - Override default model + * GITHUB_TOKEN - For GitHub Models + * GOOGLE_AI_API_KEY - For Gemini + */ + +const { getProvider } = require('./provider-factory'); +const RateLimiter = require('./rate-limiter'); + +class AIProvider { + constructor() { + this.provider = null; + this.rateLimiter = new RateLimiter(); + this.initialized = false; + } + + /** + * Initialize the provider (lazy 
loading) + */ + _init() { + if (this.initialized) { + return; + } + + this.provider = getProvider(); + if (!this.provider) { + throw new Error( + 'No AI provider available. Set AI_PROVIDER or corresponding API key.' + ); + } + + this.rateLimiter.setProvider(this.provider); + this.initialized = true; + } + + /** + * Make an AI call + * + * @param {string} systemPrompt - System prompt + * @param {string} userPrompt - User prompt + * @param {object} options - Optional settings + * @param {number} options.maxTokens - Override max tokens + * @param {function} options.onSuccess - Success callback + * @param {function} options.onError - Error callback + * @returns {Promise} Response text + */ + async call(systemPrompt, userPrompt, options = {}) { + this._init(); + + const { + maxTokens = null, + onSuccess = null, + onError = null, + } = options; + + if (!systemPrompt || !userPrompt) { + throw new Error('systemPrompt and userPrompt are required'); + } + + try { + // Estimate tokens and wait for rate limits + const tokensToUse = maxTokens || this.provider.getMaxTokens(); + const estimatedTokens = this.rateLimiter.estimateTokenUsage( + systemPrompt, + userPrompt, + tokensToUse + ); + + await this.rateLimiter.waitForRateLimit(estimatedTokens); + + // Build and send request + const requestBody = this.provider.buildRequestBody(systemPrompt, userPrompt, tokensToUse); + const requestOptions = this.provider.buildRequestOptions(); + + const response = await this._makeRequest(requestOptions, requestBody); + + // Extract content + const extracted = this.provider.extractContent(response); + if (!extracted) { + throw new Error('Invalid response format from API'); + } + + // Record actual token usage + const actualTokens = extracted.tokens || estimatedTokens; + this.rateLimiter.recordTokenConsumption(actualTokens); + + if (onSuccess) { + onSuccess(extracted.content, actualTokens); + } + + return extracted.content; + + } catch (error) { + if (onError) { + onError(error); + } + + throw 
error; + } + } + + /** + * Make HTTPS request + */ + async _makeRequest(options, body) { + const { makeHttpsRequest } = require('../workflow-utils'); + return await makeHttpsRequest(options, body); + } + + /** + * Get provider info + */ + getProviderInfo() { + this._init(); + return { + name: this.provider.name, + limits: this.provider.getRateLimits(), + maxTokens: this.provider.getMaxTokens(), + }; + } +} + +module.exports = new AIProvider(); \ No newline at end of file diff --git a/.github/scripts/ai-provider/provider-factory.js b/.github/scripts/ai-provider/provider-factory.js new file mode 100644 index 00000000..12c47497 --- /dev/null +++ b/.github/scripts/ai-provider/provider-factory.js @@ -0,0 +1,65 @@ +/** + * Provider Factory + * Creates the appropriate AI provider based on environment variables + */ + +const { createGitHubProvider } = require('./providers/github-models'); +const { createGeminiProvider } = require('./providers/gemini'); + +/** + * Get the active AI provider based on environment configuration + * + * Environment variables: + * - AI_PROVIDER: 'github' | 'gemini' | 'auto' (default: 'auto') + * - AI_MODEL: Override default model for the provider + * - GITHUB_TOKEN: API key for GitHub Models + * - GOOGLE_AI_API_KEY: API key for Gemini + * + * @returns {BaseAIProvider|null} Provider instance or null if none available + */ +function getProvider() { + const providerName = (process.env.AI_PROVIDER || 'auto').toLowerCase(); + const customModel = process.env.AI_MODEL; + + if (providerName === 'auto') { + return autoDetectProvider(customModel); + } + + switch (providerName) { + case 'github': + case 'github-models': + return createGitHubProvider(customModel); + + case 'gemini': + case 'google': + return createGeminiProvider(customModel); + + default: + console.warn(`⚠️ Unknown provider: ${providerName}. 
Falling back to auto-detect.`); + return autoDetectProvider(customModel); + } +} + +/** + * Auto-detect provider based on available API keys + */ +function autoDetectProvider(customModel) { + // Try Gemini + const geminiProvider = createGeminiProvider(customModel); + if (geminiProvider) { + return geminiProvider; + } + + // Fallback to GitHub Models (free in GitHub Actions) + const githubProvider = createGitHubProvider(customModel); + if (githubProvider) { + return githubProvider; + } + + return null; +} + +module.exports = { + getProvider, +}; + diff --git a/.github/scripts/ai-provider/providers/base-provider.js b/.github/scripts/ai-provider/providers/base-provider.js new file mode 100644 index 00000000..fe23fb1c --- /dev/null +++ b/.github/scripts/ai-provider/providers/base-provider.js @@ -0,0 +1,68 @@ +/** + * Base AI Provider class + * All provider implementations should extend this class + */ +class BaseAIProvider { + constructor(name, config, apiKey) { + if (!apiKey) { + throw new Error('API key is required'); + } + + this.name = name; + this.config = config; + this.apiKey = apiKey; + } + + /** + * Get maximum output tokens for this provider + */ + getMaxTokens() { + return this.config.maxTokens || 2500; + } + + /** + * Get rate limits for this provider + */ + getRateLimits() { + return { + maxRequestsPerMinute: this.config.maxRequestsPerMinute || 10, + maxTokensPerMinute: this.config.maxTokensPerMinute || 40000, + }; + } + + /** + * Build HTTP request options + * Must be implemented by subclass + */ + buildRequestOptions() { + throw new Error('buildRequestOptions must be implemented by subclass'); + } + + /** + * Build request body with prompts + * Must be implemented by subclass + */ + buildRequestBody(systemPrompt, userPrompt, maxTokens) { + throw new Error('buildRequestBody must be implemented by subclass'); + } + + /** + * Extract content and token usage from API response + * Must be implemented by subclass + * @returns {{content: string, tokens: 
number|null}|null}
+   */
+  extractContent(response) {
+    throw new Error('extractContent must be implemented by subclass');
+  }
+
+  /**
+   * Check if error is a rate limit error
+   */
+  isRateLimitError(error) {
+    const msg = error?.message || '';
+    return msg.includes('429') || msg.toLowerCase().includes('rate limit');
+  }
+}
+
+module.exports = BaseAIProvider;
+
diff --git a/.github/scripts/ai-provider/providers/gemini.js b/.github/scripts/ai-provider/providers/gemini.js
new file mode 100644
index 00000000..9af4f159
--- /dev/null
+++ b/.github/scripts/ai-provider/providers/gemini.js
@@ -0,0 +1,107 @@
+/**
+ * Google AI (Gemini) Provider
+ * Uses Google AI API key for authentication
+ */
+const BaseAIProvider = require('./base-provider');
+
+/**
+ * Gemini Provider Class
+ * Default model: gemini-2.5-flash-lite
+ * A lightweight model designed for speed and efficiency.
+ * Refer to https://ai.google.dev/gemini-api/docs for the list of models.
+ */
+class GeminiProvider extends BaseAIProvider {
+  /**
+   * Constructor
+   * @param {object} config - Configuration object
+   * @param {string} config.model - Model to use
+   * @param {number} config.maxTokens - Maximum number of tokens to generate
+   * @param {number} config.maxRequestsPerMinute - Maximum number of requests per minute
+   * @param {number} config.maxTokensPerMinute - Maximum number of tokens per minute
+   * @param {string} apiKey - Google AI API key (required)
+   */
+  constructor(config, apiKey) {
+    const model = config.model || 'gemini-2.5-flash-lite';
+    super(`Google AI (${model})`, config, apiKey);
+    this.model = model;
+  }
+
+  buildRequestOptions() {
+    return {
+      hostname: 'generativelanguage.googleapis.com',
+      port: 443,
+      path: `/v1beta/models/${this.model}:generateContent?key=${this.apiKey}`,
+      method: 'POST',
+      headers: {
+        'Content-Type': 'application/json',
+        'User-Agent': 'Compose-CI/1.0',
+      },
+    };
+  }
+
+  buildRequestBody(systemPrompt, userPrompt, maxTokens) {
+    // Gemini combines 
system and user prompts into a single text part
+    const combinedPrompt = `${systemPrompt}\n\n${userPrompt}`;
+
+    return JSON.stringify({
+      contents: [{
+        parts: [{ text: combinedPrompt }]
+      }],
+      generationConfig: {
+        maxOutputTokens: maxTokens || this.getMaxTokens(),
+        temperature: 0.7,
+        topP: 0.95,
+        topK: 40,
+      },
+      safetySettings: [
+        { category: "HARM_CATEGORY_HARASSMENT", threshold: "BLOCK_NONE" },
+        { category: "HARM_CATEGORY_HATE_SPEECH", threshold: "BLOCK_NONE" },
+        { category: "HARM_CATEGORY_SEXUALLY_EXPLICIT", threshold: "BLOCK_NONE" },
+        { category: "HARM_CATEGORY_DANGEROUS_CONTENT", threshold: "BLOCK_NONE" }
+      ]
+    });
+  }
+
+  extractContent(response) {
+    const text = response.candidates?.[0]?.content?.parts?.[0]?.text;
+    if (text) {
+      return {
+        content: text,
+        tokens: response.usageMetadata?.totalTokenCount || null,
+      };
+    }
+    return null;
+  }
+
+  getRateLimits() {
+    return {
+      maxRequestsPerMinute: 15,
+      maxTokensPerMinute: 1000000, // 1M tokens per minute
+    };
+  }
+}
+
+/**
+ * Create Gemini provider
+ * Returns null when GOOGLE_AI_API_KEY is not set so callers can fall back.
+ */
+function createGeminiProvider(customModel) {
+  const apiKey = process.env.GOOGLE_AI_API_KEY;
+  if (!apiKey) {
+    return null;
+  }
+
+  const config = {
+    model: customModel,
+    maxTokens: 2500,
+    maxRequestsPerMinute: 15,
+    maxTokensPerMinute: 1000000,
+  };
+
+  return new GeminiProvider(config, apiKey);
+}
+
+module.exports = {
+  GeminiProvider,
+  createGeminiProvider,
+};
+
diff --git a/.github/scripts/ai-provider/providers/github-models.js b/.github/scripts/ai-provider/providers/github-models.js
new file mode 100644
index 00000000..6d9426cc
--- /dev/null
+++ b/.github/scripts/ai-provider/providers/github-models.js
@@ -0,0 +1,79 @@
+/**
+ * GitHub Models (Azure OpenAI) Provider
+ * Uses GitHub token for authentication in GitHub Actions
+ */
+const BaseAIProvider = require('./base-provider');
+
+class GitHubModelsProvider extends BaseAIProvider {
+  constructor(config, apiKey) {
+    const model = config.model || 'gpt-4o';
+    super(`GitHub Models (${model})`, config, apiKey);
+    this.model = model;
+  }
+
+  
buildRequestOptions() {
+    return {
+      hostname: 'models.inference.ai.azure.com',
+      port: 443,
+      path: '/chat/completions',
+      method: 'POST',
+      headers: {
+        'Authorization': `Bearer ${this.apiKey}`,
+        'Content-Type': 'application/json',
+        'Accept': 'application/json',
+        'User-Agent': 'Compose-CI/1.0',
+      },
+    };
+  }
+
+  buildRequestBody(systemPrompt, userPrompt, maxTokens) {
+    return JSON.stringify({
+      messages: [
+        { role: 'system', content: systemPrompt },
+        { role: 'user', content: userPrompt },
+      ],
+      model: this.model,
+      max_tokens: maxTokens || this.getMaxTokens(),
+      temperature: 0.7,
+    });
+  }
+
+  extractContent(response) {
+    if (response.choices?.[0]?.message?.content) {
+      return {
+        content: response.choices[0].message.content,
+        tokens: response.usage?.total_tokens || null,
+      };
+    }
+    return null;
+  }
+
+  getRateLimits() {
+    return {
+      maxRequestsPerMinute: 10,
+      maxTokensPerMinute: 40000,
+    };
+  }
+}
+
+/**
+ * Create GitHub Models provider
+ * Returns null when GITHUB_TOKEN is not set so callers can fall back.
+ */
+function createGitHubProvider(customModel) {
+  const apiKey = process.env.GITHUB_TOKEN;
+  if (!apiKey) {
+    return null;
+  }
+
+  const config = {
+    model: customModel,
+    maxTokens: 2500,
+    maxRequestsPerMinute: 10,
+    maxTokensPerMinute: 40000,
+  };
+
+  return new GitHubModelsProvider(config, apiKey);
+}
+
+module.exports = {
+  GitHubModelsProvider,
+  createGitHubProvider,
+};
+
diff --git a/.github/scripts/ai-provider/rate-limiter.js b/.github/scripts/ai-provider/rate-limiter.js
new file mode 100644
index 00000000..aa59b28f
--- /dev/null
+++ b/.github/scripts/ai-provider/rate-limiter.js
@@ -0,0 +1,137 @@
+/**
+ * Rate Limiter
+ * Handles request-based and token-based rate limiting
+ */
+
+class RateLimiter {
+  constructor() {
+    this.provider = null;
+    this.lastCallTime = 0;
+    this.tokenHistory = [];
+    this.limits = {
+      maxRequestsPerMinute: 10,
+      maxTokensPerMinute: 40000,
+    };
+    this.tokenWindowMs = 60000; // 60 seconds
+    this.safetyMargin = 0.85; // Use 85% of token budget
+  }
+
+  /**
+   * Set the active provider and update rate limits
+   */
+  
setProvider(provider) { + this.provider = provider; + this.limits = provider.getRateLimits(); + } + + /** + * Estimate token usage for a request + * Uses rough heuristic: ~4 characters per token + */ + estimateTokenUsage(systemPrompt, userPrompt, maxTokens) { + const inputText = (systemPrompt || '') + (userPrompt || ''); + const estimatedInputTokens = Math.ceil(inputText.length / 4); + return estimatedInputTokens + (maxTokens || 0); + } + + /** + * Wait for rate limits before making a request + */ + async waitForRateLimit(estimatedTokens) { + const now = Date.now(); + + // 1. Request-based rate limit (requests per minute) + const minDelayMs = Math.ceil(60000 / this.limits.maxRequestsPerMinute); + const elapsed = now - this.lastCallTime; + + if (this.lastCallTime > 0 && elapsed < minDelayMs) { + const waitTime = minDelayMs - elapsed; + await this._sleep(waitTime); + } + + // 2. Token-based rate limit + this._cleanTokenHistory(); + const currentConsumption = this._getCurrentTokenConsumption(); + const effectiveBudget = this.limits.maxTokensPerMinute * this.safetyMargin; + const availableTokens = effectiveBudget - currentConsumption; + + if (estimatedTokens > availableTokens) { + const waitTime = this._calculateTokenWaitTime(estimatedTokens, currentConsumption); + if (waitTime > 0) { + await this._sleep(waitTime); + this._cleanTokenHistory(); + } + } + + this.lastCallTime = Date.now(); + } + + /** + * Record actual token consumption after a request + */ + recordTokenConsumption(tokens) { + this.tokenHistory.push({ + timestamp: Date.now(), + tokens: tokens, + }); + this._cleanTokenHistory(); + } + + /** + * Clean expired entries from token history + */ + _cleanTokenHistory() { + const now = Date.now(); + this.tokenHistory = this.tokenHistory.filter( + entry => (now - entry.timestamp) < this.tokenWindowMs + ); + } + + /** + * Get current token consumption in the rolling window + */ + _getCurrentTokenConsumption() { + return this.tokenHistory.reduce((sum, entry) => sum + 
entry.tokens, 0); + } + + /** + * Calculate how long to wait for token budget to free up + */ + _calculateTokenWaitTime(tokensNeeded, currentConsumption) { + const effectiveBudget = this.limits.maxTokensPerMinute * this.safetyMargin; + const availableTokens = effectiveBudget - currentConsumption; + + if (tokensNeeded <= availableTokens) { + return 0; + } + + if (this.tokenHistory.length === 0) { + return 0; + } + + // Find how many tokens need to expire + const tokensToFree = tokensNeeded - availableTokens; + let freedTokens = 0; + let oldestTimestamp = Date.now(); + + for (const entry of this.tokenHistory) { + freedTokens += entry.tokens; + oldestTimestamp = entry.timestamp; + + if (freedTokens >= tokensToFree) { + break; + } + } + + // Calculate wait time until that entry expires + const timeUntilExpiry = this.tokenWindowMs - (Date.now() - oldestTimestamp); + return Math.max(0, timeUntilExpiry + 2000); // Add 2s buffer + } + + async _sleep(ms) { + return new Promise(resolve => setTimeout(resolve, ms)); + } +} + +module.exports = RateLimiter; + diff --git a/.github/scripts/check-solidity-comments.sh b/.github/scripts/check-solidity-comments.sh old mode 100755 new mode 100644 diff --git a/.github/scripts/generate-docs-utils/README.md b/.github/scripts/generate-docs-utils/README.md new file mode 100644 index 00000000..b55b485a --- /dev/null +++ b/.github/scripts/generate-docs-utils/README.md @@ -0,0 +1,96 @@ +## Documentation Generator + +This directory contains the utilities used by the GitHub Actions workflow to generate the Markdown/MDX documentation for the Solidity contracts and the docs site. + +- **Entry script**: `../../generate-docs.js` +- **Templates**: `templates/` +- **Core logic**: `core/`, `parsing/`, `category/`, `utils/`, `ai/` + +Use this README as a reference for running the same process **locally**. 
+
+---
+
+## Prerequisites
+
+- Node.js 20.x
+- `forge` (Foundry) installed and available on your `PATH`
+- From the repo root, run once (or whenever dependencies change):
+
+```bash
+cd .github/scripts/generate-docs-utils/templates
+npm install
+cd ../../../..
+```
+
+---
+
+## Basic local workflow
+
+All commands below are run from the **repo root**.
+
+### 1. Generate base forge docs
+
+```bash
+forge doc
+```
+
+This produces the raw contract documentation that the generator consumes.
+
+### 2. Run the docs generator
+
+#### Process all Solidity files
+
+```bash
+node .github/scripts/generate-docs.js --all
+```
+
+#### Process specific Solidity files
+
+Create a text file with one Solidity path per line (relative to the repo root, usually under `src/`), for example:
+
+```bash
+printf "src/access/AccessControl/Batch/Revoke/AccessControlRevokeBatchFacet.sol\n" > /tmp/changed_sol_files.txt
+```
+
+Then run:
+
+```bash
+node .github/scripts/generate-docs.js /tmp/changed_sol_files.txt
+```
+
+---
+
+## AI enhancement controls
+
+By default, the generator can call an AI provider to enhance descriptions. In CI, this is controlled by the `SKIP_ENHANCEMENT` input on the workflow. Locally, you can control this via an environment variable:
+
+- **Disable AI enhancement**:
+
+```bash
+SKIP_ENHANCEMENT=true node .github/scripts/generate-docs.js --all
+```
+
+- **Enable AI enhancement** (requires a valid provider key, see `.github/scripts/ai-provider/README.md`). Only one key is needed; if both are set, auto-detection prefers Gemini:
+
+```bash
+GITHUB_TOKEN=<your-github-token> \
+GOOGLE_AI_API_KEY=<your-google-ai-api-key> \
+node .github/scripts/generate-docs.js --all
+```
+
+If no valid provider key is available, the generator falls back to non-AI content.
+
+---
+
+## Verifying the docs site build
+
+After generating docs, you can ensure the documentation site still builds:
+
+```bash
+cd website
+npm ci
+npm run build
+```
+
+New or updated pages should appear under `website/docs/library`.
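
---

## Provider auto-detection order

When `AI_PROVIDER` is unset (or set to `auto`), the generator picks whichever provider has an API key available, trying Gemini before falling back to GitHub Models. A minimal sketch of that selection order — `pickProvider` is a hypothetical illustration, not the real factory API:

```javascript
// Hypothetical illustration of the ai-provider auto-detection logic:
// Gemini is tried first, then GitHub Models, then no provider
// (which makes the generator fall back to non-AI content).
function pickProvider(env) {
  if (env.GOOGLE_AI_API_KEY) return 'gemini';
  if (env.GITHUB_TOKEN) return 'github-models';
  return null;
}

console.log(pickProvider({ GOOGLE_AI_API_KEY: 'key', GITHUB_TOKEN: 'token' })); // 'gemini'
console.log(pickProvider({ GITHUB_TOKEN: 'token' })); // 'github-models'
console.log(pickProvider({})); // null
```

To force a specific provider regardless of which keys are set, export `AI_PROVIDER=gemini` or `AI_PROVIDER=github` instead.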
+ diff --git a/.github/scripts/generate-docs-utils/ai/ai-enhancement.js b/.github/scripts/generate-docs-utils/ai/ai-enhancement.js new file mode 100644 index 00000000..4edf760c --- /dev/null +++ b/.github/scripts/generate-docs-utils/ai/ai-enhancement.js @@ -0,0 +1,76 @@ +/** + * AI Enhancement + * + * Orchestrates AI-powered documentation enhancement. + */ + +const ai = require('../../ai-provider'); +const { buildSystemPrompt, buildPrompt } = require('./prompt-builder'); +const { extractJSON, convertEnhancedFields } = require('./response-parser'); +const { addFallbackContent } = require('./fallback-content-provider'); + +/** + * Check if enhancement should be skipped for a file + * @param {object} data - Documentation data + * @returns {boolean} True if should skip + */ +function shouldSkipEnhancement(data) { + if (!data.functions || data.functions.length === 0) { + return true; + } + + if (data.title.startsWith('I') && data.title.length > 1 && + data.title[1] === data.title[1].toUpperCase()) { + return true; + } + + return false; +} + +/** + * Enhance documentation data using AI + * @param {object} data - Parsed documentation data + * @param {'module' | 'facet'} contractType - Type of contract + * @param {string} token - Legacy token parameter (deprecated, uses env vars now) + * @returns {Promise<{data: object, usedFallback: boolean, error?: string}>} Enhanced data with fallback status + */ +async function enhanceWithAI(data, contractType, token) { + try { + const systemPrompt = buildSystemPrompt(); + const userPrompt = buildPrompt(data, contractType); + + // Call AI provider + const responseText = await ai.call(systemPrompt, userPrompt, { + onSuccess: () => { + // Silent success - no logging + }, + onError: () => { + // Silent error - will be caught below + } + }); + + // Parse JSON response + let enhanced; + try { + enhanced = JSON.parse(responseText); + } catch (directParseError) { + const cleanedContent = extractJSON(responseText); + enhanced = 
JSON.parse(cleanedContent); + } + + return { data: convertEnhancedFields(enhanced, data), usedFallback: false }; + + } catch (error) { + return { + data: addFallbackContent(data, contractType), + usedFallback: true, + error: error.message + }; + } +} + +module.exports = { + enhanceWithAI, + shouldSkipEnhancement, +}; + diff --git a/.github/scripts/generate-docs-utils/ai/context-extractor.js b/.github/scripts/generate-docs-utils/ai/context-extractor.js new file mode 100644 index 00000000..abf3b490 --- /dev/null +++ b/.github/scripts/generate-docs-utils/ai/context-extractor.js @@ -0,0 +1,250 @@ +/** + * Context Extractor for AI Documentation Enhancement + * + * Extracts and formats additional context from source files and parsed data + * to provide richer information to the AI for more accurate documentation generation. + */ + +const fs = require('fs'); +const path = require('path'); +const { readFileSafe } = require('../../workflow-utils'); +const { findRelatedContracts } = require('../core/relationship-detector'); +const { getContractRegistry } = require('../core/contract-registry'); + +/** + * Extract context from source file (pragma, imports, etc.) + * @param {string} sourceFilePath - Path to the Solidity source file + * @returns {object} Extracted source context + */ +function extractSourceContext(sourceFilePath) { + if (!sourceFilePath) { + return { + pragmaVersion: null, + imports: [], + }; + } + + const sourceContent = readFileSafe(sourceFilePath); + if (!sourceContent) { + return { + pragmaVersion: null, + imports: [], + }; + } + + // Extract pragma version + const pragmaMatch = sourceContent.match(/pragma\s+solidity\s+([^;]+);/); + const pragmaVersion = pragmaMatch ? 
pragmaMatch[1].trim() : null; + + // Extract imports + const importMatches = sourceContent.matchAll(/import\s+["']([^"']+)["']/g); + const imports = Array.from(importMatches, m => m[1]); + + return { + pragmaVersion, + imports, + }; +} + +/** + * Compute import path from source file path + * Converts: src/access/AccessControl/AccessControlFacet.sol + * To: @compose/access/AccessControl/AccessControlFacet + * @param {string} sourceFilePath - Path to the Solidity source file + * @returns {string} Import path + */ +function computeImportPath(sourceFilePath) { + if (!sourceFilePath) { + return null; + } + + // Remove src/ prefix and .sol extension + let importPath = sourceFilePath + .replace(/^src\//, '') + .replace(/\.sol$/, ''); + + // Convert to @compose/ format + return `@compose/${importPath}`; +} + +/** + * Format complete function signatures with parameter types and return types + * @param {Array} functions - Array of function objects + * @returns {string} Formatted function signatures + */ +function formatFunctionSignatures(functions) { + if (!functions || functions.length === 0) { + return 'None'; + } + + return functions.map(fn => { + // Format parameters + const params = (fn.params || []).map(p => { + const type = p.type || ''; + const name = p.name || ''; + if (!type && !name) return ''; + return name ? `${type} ${name}` : type; + }).filter(Boolean).join(', '); + + // Format return types + const returns = (fn.returns || []).map(r => r.type || '').filter(Boolean); + const returnStr = returns.length > 0 ? ` returns (${returns.join(', ')})` : ''; + + // Include visibility and mutability if available in signature + const signature = fn.signature || ''; + const visibility = signature.match(/\b(public|external|internal|private)\b/)?.[0] || ''; + const mutability = signature.match(/\b(view|pure|payable)\b/)?.[0] || ''; + + const modifiers = [visibility, mutability].filter(Boolean).join(' '); + + return `function ${fn.name}(${params})${modifiers ? 
' ' + modifiers : ''}${returnStr}`; + }).join('\n'); +} + +/** + * Format storage context information + * @param {object} storageInfo - Storage info object + * @param {Array} structs - Array of struct definitions + * @param {Array} stateVariables - Array of state variables + * @returns {string} Formatted storage context + */ +function formatStorageContext(storageInfo, structs, stateVariables) { + const parts = []; + + // Extract storage position from state variables + const storagePositionVar = (stateVariables || []).find(v => + v.name && (v.name.includes('STORAGE_POSITION') || v.name.includes('STORAGE') || v.name.includes('_POSITION')) + ); + + if (storagePositionVar) { + parts.push(`Storage Position: ${storagePositionVar.name}`); + if (storagePositionVar.value) { + parts.push(`Value: ${storagePositionVar.value}`); + } + if (storagePositionVar.description) { + parts.push(`Description: ${storagePositionVar.description}`); + } + } + + // Extract storage struct + const storageStruct = (structs || []).find(s => + s.name && s.name.includes('Storage') + ); + + if (storageStruct) { + parts.push(`Storage Struct: ${storageStruct.name}`); + if (storageStruct.definition) { + // Extract key fields from struct definition + const fieldMatches = storageStruct.definition.matchAll(/(\w+)\s+(\w+)(?:\[.*?\])?;/g); + const fields = Array.from(fieldMatches, m => `${m[1]} ${m[2]}`); + if (fields.length > 0) { + parts.push(`Key Fields: ${fields.slice(0, 5).join(', ')}${fields.length > 5 ? '...' : ''}`); + } + } + } + + // Add storage info if available + if (storageInfo) { + if (typeof storageInfo === 'string') { + parts.push(storageInfo); + } else if (storageInfo.storagePosition) { + parts.push(`Storage Position: ${storageInfo.storagePosition}`); + } + } + + return parts.length > 0 ? 
parts.join('\n') : 'None'; +} + +/** + * Format related contracts context + * @param {string} contractName - Name of the contract + * @param {string} contractType - Type of contract ('module' or 'facet') + * @param {string} category - Category of the contract + * @param {object} registry - Contract registry (optional) + * @returns {string} Formatted related contracts context + */ +function formatRelatedContracts(contractName, contractType, category, registry = null) { + const related = findRelatedContracts(contractName, contractType, category, registry); + + if (related.length === 0) { + return 'None'; + } + + return related.map(r => `- ${r.title}: ${r.description}`).join('\n'); +} + +/** + * Format struct definitions with field types + * @param {Array} structs - Array of struct objects + * @returns {string} Formatted struct definitions + */ +function formatStructDefinitions(structs) { + if (!structs || structs.length === 0) { + return 'None'; + } + + return structs.map(s => { + const fields = (s.fields || []).map(f => { + const type = f.type || ''; + const name = f.name || ''; + return name ? `${type} ${name}` : type; + }).join(', '); + + return `struct ${s.name} { ${fields} }`; + }).join('\n'); +} + +/** + * Format event signatures with parameters + * @param {Array} events - Array of event objects + * @returns {string} Formatted event signatures + */ +function formatEventSignatures(events) { + if (!events || events.length === 0) { + return 'None'; + } + + return events.map(e => { + const params = (e.params || []).map(p => { + const indexed = p.indexed ? 'indexed ' : ''; + const type = p.type || ''; + const name = p.name || ''; + return name ? 
`${indexed}${type} ${name}` : `${indexed}${type}`; + }).join(', '); + + return `event ${e.name}(${params})`; + }).join('\n'); +} + +/** + * Format error signatures with parameters + * @param {Array} errors - Array of error objects + * @returns {string} Formatted error signatures + */ +function formatErrorSignatures(errors) { + if (!errors || errors.length === 0) { + return 'None'; + } + + return errors.map(e => { + const params = (e.params || []).map(p => { + const type = p.type || ''; + const name = p.name || ''; + return name ? `${type} ${name}` : type; + }).join(', '); + + return `error ${e.name}(${params})`; + }).join('\n'); +} + +module.exports = { + extractSourceContext, + computeImportPath, + formatFunctionSignatures, + formatStorageContext, + formatRelatedContracts, + formatStructDefinitions, + formatEventSignatures, + formatErrorSignatures, +}; + diff --git a/.github/scripts/generate-docs-utils/ai/fallback-content-provider.js b/.github/scripts/generate-docs-utils/ai/fallback-content-provider.js new file mode 100644 index 00000000..1afabac0 --- /dev/null +++ b/.github/scripts/generate-docs-utils/ai/fallback-content-provider.js @@ -0,0 +1,36 @@ +/** + * Fallback Content Provider + * + * Provides fallback content when AI enhancement is unavailable. + * Centralizes fallback content logic to avoid duplication. + */ + +const { loadPrompts } = require('./prompt-loader'); + +/** + * Add fallback content when AI is unavailable + * @param {object} data - Documentation data + * @param {'module' | 'facet'} contractType - Type of contract + * @returns {object} Data with fallback content + */ +function addFallbackContent(data, contractType) { + const prompts = loadPrompts(); + const enhanced = { ...data }; + + if (contractType === 'module') { + enhanced.integrationNotes = prompts.moduleFallback.integrationNotes || + `This module accesses shared diamond storage, so changes made through this module are immediately visible to facets using the same storage pattern. 
All functions are internal as per Compose conventions.`; + enhanced.keyFeatures = prompts.moduleFallback.keyFeatures || + `- All functions are \`internal\` for use in custom facets\n- Follows diamond storage pattern (EIP-8042)\n- Compatible with ERC-2535 diamonds\n- No external dependencies or \`using\` directives`; + } else { + enhanced.keyFeatures = prompts.facetFallback.keyFeatures || + `- Self-contained facet with no imports or inheritance\n- Only \`external\` and \`internal\` function visibility\n- Follows Compose readability-first conventions\n- Ready for diamond integration`; + } + + return enhanced; +} + +module.exports = { + addFallbackContent, +}; + diff --git a/.github/scripts/generate-docs-utils/ai/prompt-builder.js b/.github/scripts/generate-docs-utils/ai/prompt-builder.js new file mode 100644 index 00000000..362ecf02 --- /dev/null +++ b/.github/scripts/generate-docs-utils/ai/prompt-builder.js @@ -0,0 +1,186 @@ +/** + * Prompt Builder + * + * Builds system and user prompts for AI enhancement. + */ + +const { + extractSourceContext, + computeImportPath, + formatFunctionSignatures, + formatStorageContext, + formatRelatedContracts, + formatStructDefinitions, + formatEventSignatures, + formatErrorSignatures, +} = require('./context-extractor'); +const { getContractRegistry } = require('../core/contract-registry'); +const { loadPrompts, loadRepoInstructions } = require('./prompt-loader'); + +/** + * Build the system prompt with repository context + * Uses the system prompt from the prompts file, or a fallback if not found + * @returns {string} System prompt for AI + */ +function buildSystemPrompt() { + const prompts = loadPrompts(); + const repoInstructions = loadRepoInstructions(); + + let systemPrompt = prompts.systemPrompt || `You are a Solidity smart contract documentation expert for the Compose framework. +Always respond with valid JSON only, no markdown formatting. 
+Follow the project conventions and style guidelines strictly.`; + + if (repoInstructions) { + const relevantSections = prompts.relevantSections.length > 0 + ? prompts.relevantSections + : [ + '## 3. Core Philosophy', + '## 4. Facet Design Principles', + '## 5. Banned Solidity Features', + '## 6. Composability Guidelines', + '## 11. Code Style Guide', + ]; + + let contextSnippets = []; + for (const section of relevantSections) { + const startIdx = repoInstructions.indexOf(section); + if (startIdx !== -1) { + // Extract section content (up to next ## or 2000 chars max) + const nextSection = repoInstructions.indexOf('\n## ', startIdx + section.length); + const endIdx = nextSection !== -1 ? nextSection : startIdx + 2000; + const snippet = repoInstructions.slice(startIdx, Math.min(endIdx, startIdx + 2000)); + contextSnippets.push(snippet.trim()); + } + } + + if (contextSnippets.length > 0) { + systemPrompt += `\n\n--- PROJECT GUIDELINES ---\n${contextSnippets.join('\n\n')}`; + } + } + + return systemPrompt; +} + +/** + * Build the prompt for AI based on contract type + * @param {object} data - Parsed documentation data + * @param {'module' | 'facet'} contractType - Type of contract + * @returns {string} Prompt for AI + */ +function buildPrompt(data, contractType) { + const prompts = loadPrompts(); + + const functionNames = data.functions.map(f => f.name).join(', '); + const functionDescriptions = data.functions + .map(f => `- ${f.name}: ${f.description || 'No description'}`) + .join('\n'); + + // Include events and errors for richer context + const eventNames = (data.events || []).map(e => e.name).join(', '); + const errorNames = (data.errors || []).map(e => e.name).join(', '); + + // Extract additional context + const sourceContext = extractSourceContext(data.sourceFilePath); + const importPath = computeImportPath(data.sourceFilePath); + const functionSignatures = formatFunctionSignatures(data.functions); + const eventSignatures = formatEventSignatures(data.events); + 
const errorSignatures = formatErrorSignatures(data.errors); + const structDefinitions = formatStructDefinitions(data.structs); + + // Get storage context + const storageContext = formatStorageContext( + data.storageInfo, + data.structs, + data.stateVariables + ); + + // Get related contracts context + const registry = getContractRegistry(); + // Try to get category from registry entry, or use empty string + const registryEntry = registry.byName.get(data.title); + const category = data.category || (registryEntry ? registryEntry.category : ''); + const relatedContracts = formatRelatedContracts( + data.title, + contractType, + category, + registry + ); + + const promptTemplate = contractType === 'module' + ? prompts.modulePrompt + : prompts.facetPrompt; + + // If we have a template from the file, use it with variable substitution + if (promptTemplate) { + return promptTemplate + .replace(/\{\{title\}\}/g, data.title) + .replace(/\{\{description\}\}/g, data.description || 'No description provided') + .replace(/\{\{functionNames\}\}/g, functionNames || 'None') + .replace(/\{\{functionDescriptions\}\}/g, functionDescriptions || ' None') + .replace(/\{\{eventNames\}\}/g, eventNames || 'None') + .replace(/\{\{errorNames\}\}/g, errorNames || 'None') + .replace(/\{\{functionSignatures\}\}/g, functionSignatures || 'None') + .replace(/\{\{eventSignatures\}\}/g, eventSignatures || 'None') + .replace(/\{\{errorSignatures\}\}/g, errorSignatures || 'None') + .replace(/\{\{importPath\}\}/g, importPath || 'N/A') + .replace(/\{\{pragmaVersion\}\}/g, sourceContext.pragmaVersion || '^0.8.30') + .replace(/\{\{storageContext\}\}/g, storageContext || 'None') + .replace(/\{\{relatedContracts\}\}/g, relatedContracts || 'None') + .replace(/\{\{structDefinitions\}\}/g, structDefinitions || 'None'); + } + + // Fallback to hardcoded prompt if template not loaded + return `Given this ${contractType} documentation from the Compose diamond proxy framework, enhance it by generating: + +1. 
**description**: A concise one-line description (max 100 chars) for the page subtitle. Derive this from the contract's purpose based on its functions, events, and errors. + +2. **overview**: A clear, concise overview (2-3 sentences) explaining what this ${contractType} does and why it's useful in the context of diamond contracts. + +3. **usageExample**: A practical Solidity code example (10-20 lines) showing how to use this ${contractType}. For modules, show importing and calling functions. For facets, show how it would be used in a diamond. Use the EXACT import path and function signatures provided below. + +4. **bestPractices**: 2-3 bullet points of best practices for using this ${contractType}. + +${contractType === 'module' ? '5. **integrationNotes**: A note about how this module works with diamond storage pattern and how changes made through it are visible to facets.' : ''} + +${contractType === 'facet' ? '5. **securityConsiderations**: Important security considerations when using this facet (access control, reentrancy, etc.).' : ''} + +6. **keyFeatures**: A brief bullet list of key features. + +Contract Information: +- Name: ${data.title} +- Current Description: ${data.description || 'No description provided'} +- Import Path: ${importPath || 'N/A'} +- Pragma Version: ${sourceContext.pragmaVersion || '^0.8.30'} +- Functions: ${functionNames || 'None'} +- Function Signatures: +${functionSignatures || ' None'} +- Events: ${eventNames || 'None'} +- Event Signatures: +${eventSignatures || ' None'} +- Errors: ${errorNames || 'None'} +- Error Signatures: +${errorSignatures || ' None'} +- Function Details: +${functionDescriptions || ' None'} +${storageContext && storageContext !== 'None' ? `\n- Storage Information:\n${storageContext}` : ''} +${relatedContracts && relatedContracts !== 'None' ? `\n- Related Contracts:\n${relatedContracts}` : ''} +${structDefinitions && structDefinitions !== 'None' ? 
`\n- Struct Definitions:\n${structDefinitions}` : ''} + +IMPORTANT: Use the EXACT function signatures, import paths, and storage information provided above. Do not invent or modify function names, parameter types, or import paths. + +Respond ONLY with valid JSON in this exact format (no markdown code blocks, no extra text): +{ + "description": "concise one-line description here", + "overview": "enhanced overview text here", + "usageExample": "solidity code here (use \\n for newlines)", + "bestPractices": "- Point 1\\n- Point 2\\n- Point 3", + "keyFeatures": "- Feature 1\\n- Feature 2", + ${contractType === 'module' ? '"integrationNotes": "integration notes here"' : '"securityConsiderations": "security notes here"'} +}`; +} + +module.exports = { + buildSystemPrompt, + buildPrompt, +}; + diff --git a/.github/scripts/generate-docs-utils/ai/prompt-loader.js b/.github/scripts/generate-docs-utils/ai/prompt-loader.js new file mode 100644 index 00000000..f4f8cd1f --- /dev/null +++ b/.github/scripts/generate-docs-utils/ai/prompt-loader.js @@ -0,0 +1,132 @@ +/** + * Prompt Loader + * + * Loads and parses AI prompts from markdown files. 
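The templates this loader reads are consumed by `buildPrompt`'s `{{var}}` substitution shown above; a minimal sketch of that mechanism (the `OwnerMod` values here are hypothetical placeholders, not real contract data):

```javascript
// Global replace of {{var}} markers, as done in buildPrompt's template path.
const template = 'Name: {{title}}\nFunctions: {{functionNames}}';
const filled = template
  .replace(/\{\{title\}\}/g, 'OwnerMod')
  .replace(/\{\{functionNames\}\}/g, 'owner, transferOwnership');
console.log(filled);
// Name: OwnerMod
// Functions: owner, transferOwnership
```

Each marker is replaced globally, so a variable that appears several times in a template is filled everywhere.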
+ */ + +const fs = require('fs'); +const path = require('path'); + +const AI_PROMPT_PATH = path.join(__dirname, '../../../docs-gen-prompts.md'); +const REPO_INSTRUCTIONS_PATH = path.join(__dirname, '../../../copilot-instructions.md'); + +// Cache loaded prompts +let cachedPrompts = null; +let cachedRepoInstructions = null; + +/** + * Load repository instructions for context + * @returns {string} Repository instructions content + */ +function loadRepoInstructions() { + if (cachedRepoInstructions !== null) { + return cachedRepoInstructions; + } + + try { + cachedRepoInstructions = fs.readFileSync(REPO_INSTRUCTIONS_PATH, 'utf8'); + } catch (e) { + console.warn('Could not load copilot-instructions.md:', e.message); + cachedRepoInstructions = ''; + } + + return cachedRepoInstructions; +} + +/** + * Parse the prompts markdown file to extract individual prompts + * @param {string} content - Raw markdown content + * @returns {object} Parsed prompts and configurations + */ +function parsePromptsFile(content) { + const sections = content.split(/^---$/m).map(s => s.trim()).filter(Boolean); + + const prompts = { + systemPrompt: '', + modulePrompt: '', + facetPrompt: '', + relevantSections: [], + moduleFallback: { integrationNotes: '', keyFeatures: '' }, + facetFallback: { keyFeatures: '' }, + }; + + for (const section of sections) { + if (section.includes('## System Prompt')) { + const match = section.match(/## System Prompt\s*\n([\s\S]*)/); + if (match) { + prompts.systemPrompt = match[1].trim(); + } + } else if (section.includes('## Relevant Guideline Sections')) { + // Extract sections from the code block + const codeMatch = section.match(/```\n([\s\S]*?)```/); + if (codeMatch) { + prompts.relevantSections = codeMatch[1] + .split('\n') + .map(s => s.trim()) + .filter(s => s.startsWith('## ')); + } + } else if (section.includes('## Module Prompt Template')) { + const match = section.match(/## Module Prompt Template\s*\n([\s\S]*)/); + if (match) { + prompts.modulePrompt = 
match[1].trim(); + } + } else if (section.includes('## Facet Prompt Template')) { + const match = section.match(/## Facet Prompt Template\s*\n([\s\S]*)/); + if (match) { + prompts.facetPrompt = match[1].trim(); + } + } else if (section.includes('## Module Fallback Content')) { + // Parse subsections for integrationNotes and keyFeatures + const integrationMatch = section.match(/### integrationNotes\s*\n([\s\S]*?)(?=###|$)/); + if (integrationMatch) { + prompts.moduleFallback.integrationNotes = integrationMatch[1].trim(); + } + const keyFeaturesMatch = section.match(/### keyFeatures\s*\n([\s\S]*?)(?=###|$)/); + if (keyFeaturesMatch) { + prompts.moduleFallback.keyFeatures = keyFeaturesMatch[1].trim(); + } + } else if (section.includes('## Facet Fallback Content')) { + const keyFeaturesMatch = section.match(/### keyFeatures\s*\n([\s\S]*?)(?=###|$)/); + if (keyFeaturesMatch) { + prompts.facetFallback.keyFeatures = keyFeaturesMatch[1].trim(); + } + } + } + + return prompts; +} + +/** + * Load AI prompts from markdown file + * @returns {object} Parsed prompts object + */ +function loadPrompts() { + if (cachedPrompts !== null) { + return cachedPrompts; + } + + const defaultPrompts = { + systemPrompt: '', + modulePrompt: '', + facetPrompt: '', + relevantSections: [], + moduleFallback: { integrationNotes: '', keyFeatures: '' }, + facetFallback: { keyFeatures: '' }, + }; + + try { + const promptsContent = fs.readFileSync(AI_PROMPT_PATH, 'utf8'); + cachedPrompts = parsePromptsFile(promptsContent); + } catch (e) { + console.warn('Could not load docs-gen-prompts.md:', e.message); + cachedPrompts = defaultPrompts; + } + + return cachedPrompts; +} + +module.exports = { + loadPrompts, + loadRepoInstructions, +}; + diff --git a/.github/scripts/generate-docs-utils/ai/response-parser.js b/.github/scripts/generate-docs-utils/ai/response-parser.js new file mode 100644 index 00000000..a7e7204e --- /dev/null +++ b/.github/scripts/generate-docs-utils/ai/response-parser.js @@ -0,0 +1,137 @@ +/** 
+ * Response Parser + * + * Parses and cleans AI response content. + */ + +/** + * Extract and clean JSON from API response + * Handles markdown code blocks, wrapped text, and attempts to fix truncated JSON + * Also removes control characters that break JSON parsing + * @param {string} content - Raw API response content + * @returns {string} Cleaned JSON string ready for parsing + */ +function extractJSON(content) { + if (!content || typeof content !== 'string') { + return content; + } + + let cleaned = content.trim(); + + // Remove markdown code blocks (```json ... ``` or ``` ... ```) + // Handle both at start and anywhere in the string + cleaned = cleaned.replace(/^```(?:json)?\s*\n?/gm, ''); + cleaned = cleaned.replace(/\n?```\s*$/gm, ''); + cleaned = cleaned.trim(); + + // Remove control characters (0x00-0x1F except newline, tab, carriage return) + // These are illegal in JSON strings and cause "Bad control character" parsing errors + cleaned = cleaned.replace(/[\x00-\x08\x0B\x0C\x0E-\x1F]/g, ''); + + // Find the first { and last } to extract JSON object + const firstBrace = cleaned.indexOf('{'); + const lastBrace = cleaned.lastIndexOf('}'); + + if (firstBrace !== -1 && lastBrace !== -1 && lastBrace > firstBrace) { + cleaned = cleaned.substring(firstBrace, lastBrace + 1); + } else if (firstBrace !== -1) { + // We have a { but no closing }, JSON might be truncated + cleaned = cleaned.substring(firstBrace); + } + + // Try to fix common truncation issues + const openBraces = (cleaned.match(/\{/g) || []).length; + const closeBraces = (cleaned.match(/\}/g) || []).length; + + if (openBraces > closeBraces) { + // JSON might be truncated - try to close incomplete strings and objects + // Check if we're in the middle of a string (simple heuristic) + const lastChar = cleaned[cleaned.length - 1]; + const lastQuote = cleaned.lastIndexOf('"'); + const lastBraceInCleaned = cleaned.lastIndexOf('}'); + + // If last quote is after last brace and not escaped, we might be in a 
string + if (lastQuote > lastBraceInCleaned && lastChar !== '"') { + // Check if the quote before last is escaped + let isEscaped = false; + for (let i = lastQuote - 1; i >= 0 && cleaned[i] === '\\'; i--) { + isEscaped = !isEscaped; + } + + if (!isEscaped) { + // We're likely in an incomplete string, close it + cleaned = cleaned + '"'; + } + } + + // Close any incomplete objects/arrays + const missingBraces = openBraces - closeBraces; + // Try to intelligently close - if we're in the middle of a property, add a value first + const trimmed = cleaned.trim(); + if (trimmed.endsWith(',') || trimmed.endsWith(':')) { + // We're in the middle of a property, add null and close + cleaned = cleaned.replace(/[,:]\s*$/, ': null'); + } + cleaned = cleaned + '\n' + '}'.repeat(missingBraces); + } + + return cleaned.trim(); +} + +/** + * Convert literal \n strings to actual newlines + * @param {string} str - String with escaped newlines + * @returns {string} String with actual newlines + */ +function convertNewlines(str) { + if (!str || typeof str !== 'string') return str; + return str.replace(/\\n/g, '\n'); +} + +/** + * Decode HTML entities (for code blocks) + * @param {string} str - String with HTML entities + * @returns {string} Decoded string + */ +function decodeHtmlEntities(str) { + if (!str || typeof str !== 'string') return str; + return str + .replace(/&quot;/g, '"') + // Decode the combined arrow entity before the lone '=' entity so it can still match + .replace(/&#x3D;&gt;/g, '=>') + .replace(/&#x3D;/g, '=') + .replace(/&lt;/g, '<') + .replace(/&gt;/g, '>') + .replace(/&#39;/g, "'") + .replace(/&amp;/g, '&'); +} + +/** + * Convert enhanced data fields (newlines, HTML entities) + * @param {object} enhanced - Parsed JSON from API + * @param {object} data - Original documentation data + * @returns {object} Enhanced data with converted fields + */ +function convertEnhancedFields(enhanced, data) { + // Use AI-generated description if provided, otherwise keep original + const aiDescription = enhanced.description?.trim(); + const finalDescription = aiDescription || data.description; + + 
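The parsing helpers above chain together when handling a model response; a minimal runnable sketch (simplified re-implementations for illustration only — the real `extractJSON` also strips control characters and repairs truncated JSON):

```javascript
// Simplified fence-stripping + brace extraction, mirroring extractJSON's happy path.
function extractJsonSketch(content) {
  let cleaned = content.trim()
    .replace(/^```(?:json)?\s*\n?/gm, '') // drop markdown code fences
    .replace(/\n?```\s*$/gm, '')
    .trim();
  const first = cleaned.indexOf('{');
  const last = cleaned.lastIndexOf('}');
  return first !== -1 && last > first ? cleaned.substring(first, last + 1) : cleaned;
}

// Literal backslash-n sequences left in parsed string fields become real newlines.
const convertNewlinesSketch = (str) => str.replace(/\\n/g, '\n');

// A model reply wrapped in a fence, with "\n" escaped per the system prompt.
const raw = '```json\n{"bestPractices": "- Point 1\\\\n- Point 2"}\n```';
const parsed = JSON.parse(extractJsonSketch(raw));
console.log(convertNewlinesSketch(parsed.bestPractices)); // two bullet lines
```

The fences are stripped before `JSON.parse`, and the newline conversion runs after parsing, which is why `convertEnhancedFields` applies `convertNewlines` to the already-parsed fields.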
return { + ...data, + // Description is used for page subtitle - AI improves it from NatSpec + description: finalDescription, + subtitle: finalDescription, + overview: convertNewlines(enhanced.overview) || data.overview, + usageExample: decodeHtmlEntities(convertNewlines(enhanced.usageExample)) || null, + bestPractices: convertNewlines(enhanced.bestPractices) || null, + keyFeatures: convertNewlines(enhanced.keyFeatures) || null, + integrationNotes: convertNewlines(enhanced.integrationNotes) || null, + securityConsiderations: convertNewlines(enhanced.securityConsiderations) || null, + }; +} + +module.exports = { + extractJSON, + convertEnhancedFields, +}; + diff --git a/.github/scripts/generate-docs-utils/category/category-generator.js b/.github/scripts/generate-docs-utils/category/category-generator.js new file mode 100644 index 00000000..7068eca3 --- /dev/null +++ b/.github/scripts/generate-docs-utils/category/category-generator.js @@ -0,0 +1,628 @@ +/** + * Category Generator + * + * Automatically generates _category_.json files to mirror + * the src/ folder structure in the documentation. 
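The labeling rules this module applies can be sketched standalone (the explicit map below is a trimmed excerpt of `CATEGORY_LABELS`; `PausableToken` is a hypothetical name used only to show the CamelCase fallback):

```javascript
// Trimmed sketch of generateLabel: explicit map, ERC-standard pattern, CamelCase fallback.
function labelSketch(name) {
  const explicit = { access: 'Access Control', diamond: 'Diamond Core' }; // excerpt
  if (explicit[name]) return explicit[name];
  const m = name.match(/^(ERC)(\d+)(.*)$/); // e.g. ERC20Permit -> ERC-20 Permit
  if (m) {
    const variant = m[3] ? ' ' + m[3].replace(/([A-Z])/g, ' $1').trim() : '';
    return `ERC-${m[2]}${variant}`;
  }
  // CamelCase -> Title Case with spaces
  return name.replace(/([A-Z])/g, ' $1').replace(/^ /, '').trim();
}

console.log(labelSketch('ERC721Enumerable')); // ERC-721 Enumerable
console.log(labelSketch('PausableToken'));    // Pausable Token
console.log(labelSketch('access'));           // Access Control
```

Directory names covered by the explicit map win; everything else is derived mechanically, so new source folders get a reasonable sidebar label without code changes.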
+ * + * This module provides: + * - Source structure scanning + * - Category file generation + * - Path computation for doc output + * - Structure synchronization + */ + +const fs = require('fs'); +const path = require('path'); +const CONFIG = require('../config'); +const { + getCategoryItems, + createCategoryIndexFile: createIndexFile, +} = require('./index-page-generator'); + +// ============================================================================ +// Constants +// ============================================================================ + +/** + * Human-readable labels for directory names + * Add new entries here when adding new top-level categories + */ +const CATEGORY_LABELS = { + // Top-level categories + access: 'Access Control', + token: 'Token Standards', + diamond: 'Diamond Core', + libraries: 'Utilities', + utils: 'Utilities', + interfaceDetection: 'Interface Detection', + + // Token subcategories + ERC20: 'ERC-20', + ERC721: 'ERC-721', + ERC1155: 'ERC-1155', + ERC6909: 'ERC-6909', + Royalty: 'Royalty', + + // Access subcategories + AccessControl: 'Access Control', + AccessControlPausable: 'Pausable Access Control', + AccessControlTemporal: 'Temporal Access Control', + Owner: 'Owner', + OwnerTwoSteps: 'Two-Step Owner', +}; + +/** + * Descriptions for categories + * Add new entries here for custom descriptions + */ +const CATEGORY_DESCRIPTIONS = { + // Top-level categories + access: 'Access control patterns for permission management in Compose diamonds.', + token: 'Token standard implementations for Compose diamonds.', + diamond: 'Core diamond proxy functionality for ERC-2535 diamonds.', + libraries: 'Utility libraries and helpers for diamond development.', + utils: 'Utility libraries and helpers for diamond development.', + interfaceDetection: 'ERC-165 interface detection support.', + + // Token subcategories + ERC20: 'ERC-20 fungible token implementations.', + ERC721: 'ERC-721 non-fungible token implementations.', + ERC1155: 'ERC-1155 
multi-token implementations.', + ERC6909: 'ERC-6909 minimal multi-token implementations.', + Royalty: 'ERC-2981 royalty standard implementations.', + + // Access subcategories + AccessControl: 'Role-based access control (RBAC) pattern.', + AccessControlPausable: 'RBAC with pause functionality.', + AccessControlTemporal: 'Time-limited role-based access control.', + Owner: 'Single-owner access control pattern.', + OwnerTwoSteps: 'Two-step ownership transfer pattern.', +}; + +/** + * Sidebar positions for categories + * Lower numbers appear first in the sidebar + */ +const CATEGORY_POSITIONS = { + // Top-level (lower = higher priority) + diamond: 1, + access: 2, + token: 3, + libraries: 4, + utils: 4, + interfaceDetection: 5, + + // Token subcategories + ERC20: 1, + ERC721: 2, + ERC1155: 3, + ERC6909: 4, + Royalty: 5, + + // Access subcategories + Owner: 1, + OwnerTwoSteps: 2, + AccessControl: 3, + AccessControlPausable: 4, + AccessControlTemporal: 5, + + // Leaf directories (ERC20/ERC20, etc.) - alphabetical + ERC20Bridgeable: 2, + ERC20Permit: 3, + ERC721Enumerable: 2, +}; + +// ============================================================================ +// Label & Description Generation +// ============================================================================ + +/** + * Generate a human-readable label from a directory name + * @param {string} name - Directory name (e.g., 'AccessControlPausable', 'ERC20') + * @returns {string} Human-readable label + */ +function generateLabel(name) { + // Check explicit mapping first + if (CATEGORY_LABELS[name]) { + return CATEGORY_LABELS[name]; + } + + // Handle ERC standards specially + if (/^ERC\d+/.test(name)) { + const match = name.match(/^(ERC)(\d+)(.*)$/); + if (match) { + const variant = match[3] + ? 
' ' + match[3].replace(/([A-Z])/g, ' $1').trim() + : ''; + return `ERC-${match[2]}${variant}`; + } + return name; + } + + // CamelCase to Title Case with spaces + return name.replace(/([A-Z])/g, ' $1').replace(/^ /, '').trim(); +} + +/** + * Generate description for a category based on its path + * @param {string} name - Directory name + * @param {string[]} parentPath - Parent path segments + * @returns {string} Category description + */ +function generateDescription(name, parentPath = []) { + // Check explicit mapping first + if (CATEGORY_DESCRIPTIONS[name]) { + return CATEGORY_DESCRIPTIONS[name]; + } + + // Generate from context + const label = generateLabel(name); + const parent = parentPath[parentPath.length - 1]; + + if (parent === 'token') { + return `${label} token implementations with modules and facets.`; + } + if (parent === 'access') { + return `${label} access control pattern for Compose diamonds.`; + } + if (parent === 'ERC20' || parent === 'ERC721') { + return `${label} extension for ${generateLabel(parent)} tokens.`; + } + + return `${label} components for Compose diamonds.`; +} + +/** + * Get sidebar position for a category + * @param {string} name - Directory name + * @param {number} depth - Nesting depth + * @returns {number} Sidebar position + */ +function getCategoryPosition(name, depth) { + if (CATEGORY_POSITIONS[name] !== undefined) { + return CATEGORY_POSITIONS[name]; + } + return 99; // Default to end +} + +// ============================================================================ +// Source Structure Scanning +// ============================================================================ + +/** + * Check if a directory contains .sol files (directly or in subdirectories) + * @param {string} dirPath - Directory path to check + * @returns {boolean} True if contains .sol files + */ +function containsSolFiles(dirPath) { + try { + const entries = fs.readdirSync(dirPath, { withFileTypes: true }); + + for (const entry of entries) { + if 
(entry.isFile() && entry.name.endsWith('.sol')) { + return true; + } + if (entry.isDirectory() && !entry.name.startsWith('.')) { + if (containsSolFiles(path.join(dirPath, entry.name))) { + return true; + } + } + } + } catch (error) { + console.warn(`Warning: Could not read directory ${dirPath}: ${error.message}`); + } + + return false; +} + +/** + * Scan the src/ directory and build structure map + * @returns {Map} Map of relative paths to category info + */ +function scanSourceStructure() { + const srcDir = CONFIG.srcDir || 'src'; + const structure = new Map(); + + function scanDir(dirPath, relativePath = '') { + let entries; + try { + entries = fs.readdirSync(dirPath, { withFileTypes: true }); + } catch (error) { + console.error(`Error reading directory ${dirPath}: ${error.message}`); + return; + } + + for (const entry of entries) { + if (!entry.isDirectory()) continue; + + // Skip hidden directories and interfaces + if (entry.name.startsWith('.') || entry.name === 'interfaces') { + continue; + } + + const fullPath = path.join(dirPath, entry.name); + const relPath = relativePath ? `${relativePath}/${entry.name}` : entry.name; + + // Only include directories that contain .sol files + if (containsSolFiles(fullPath)) { + const parts = relPath.split('/'); + structure.set(relPath, { + name: entry.name, + path: relPath, + depth: parts.length, + parent: relativePath || null, + parentParts: relativePath ? 
relativePath.split('/') : [], + }); + + // Recurse into subdirectories + scanDir(fullPath, relPath); + } + } + } + + if (fs.existsSync(srcDir)) { + scanDir(srcDir); + } else { + console.warn(`Warning: Source directory ${srcDir} does not exist`); + } + + return structure; +} + +// ============================================================================ +// Category File Generation +// ============================================================================ + +/** + * Map source directory name to docs directory name + * @param {string} srcName - Source directory name + * @returns {string} Documentation directory name + */ +function mapDirectoryName(srcName) { + // Map libraries -> utils for URL consistency + if (srcName === 'libraries') { + return 'utils'; + } + return srcName; +} + +/** + * Compute slug from output directory path + * @param {string} outputDir - Full output directory path + * @param {string} libraryDir - Base library directory + * @returns {string} Slug path (e.g., '/docs/library/access') + */ +function computeSlug(outputDir, libraryDir) { + const relativePath = path.relative(libraryDir, outputDir); + + if (!relativePath || relativePath.startsWith('..')) { + // Root library directory + return '/docs/library'; + } + + // Convert path separators and create slug + const normalizedPath = relativePath.replace(/\\/g, '/'); + return `/docs/library/${normalizedPath}`; +} + +/** + * Wrapper function to create category index file using the index-page-generator utility + * @param {string} outputDir - Directory to create index file in + * @param {string} relativePath - Relative path from library dir + * @param {string} label - Category label + * @param {string} description - Category description + * @param {boolean} overwrite - Whether to overwrite existing files (default: false) + * @param {boolean} hideFromSidebar - Whether to hide the index page from sidebar (default: false) + * @returns {boolean} True if file was created/updated, false if skipped + 
*/ +function createCategoryIndexFile(outputDir, relativePath, label, description, overwrite = false, hideFromSidebar = false) { + return createIndexFile( + outputDir, + relativePath, + label, + description, + generateLabel, + generateDescription, + overwrite, + hideFromSidebar + ); +} + +/** + * Create a _category_.json file for a directory + * @param {string} outputDir - Directory to create category file in + * @param {string} name - Directory name + * @param {string} relativePath - Relative path from library dir + * @param {number} depth - Nesting depth + * @returns {boolean} True if file was created, false if it already existed + */ +function createCategoryFile(outputDir, name, relativePath, depth) { + const categoryFile = path.join(outputDir, '_category_.json'); + const libraryDir = CONFIG.libraryOutputDir || 'website/docs/library'; + + // Don't overwrite existing category files (allows manual customization) + if (fs.existsSync(categoryFile)) { + return false; + } + + // Get the actual directory name from the output path (may be mapped, e.g., utils instead of libraries) + const actualDirName = path.basename(outputDir); + const parentParts = relativePath.split('/').slice(0, -1); + // Use actual directory name for label generation (supports both original and mapped names) + const label = generateLabel(actualDirName); + const position = getCategoryPosition(actualDirName, depth); + const description = generateDescription(actualDirName, parentParts); + + // Create index.mdx file first + createCategoryIndexFile(outputDir, relativePath, label, description); + + // Create category file pointing to index.mdx + const docId = relativePath ? 
`library/${relativePath}/index` : 'library/index'; + + const category = { + label, + position, + collapsible: true, + collapsed: true, // Collapse all categories by default + link: { + type: 'doc', + id: docId, + }, + }; + + // Ensure directory exists + fs.mkdirSync(outputDir, { recursive: true }); + fs.writeFileSync(categoryFile, JSON.stringify(category, null, 2) + '\n'); + + return true; +} + +/** + * Ensure the base library category file exists + * @param {string} libraryDir - Path to library directory + * @returns {boolean} True if created, false if existed + */ +function ensureBaseCategory(libraryDir) { + const categoryFile = path.join(libraryDir, '_category_.json'); + + if (fs.existsSync(categoryFile)) { + return false; + } + + const label = 'Library'; + const description = 'API reference for all Compose modules and facets.'; + + // Create index.mdx for base library category + // Hide from sidebar (sidebar_class_name: "hidden") so it doesn't appear as a page in the sidebar + createIndexFile(libraryDir, '', label, description, generateLabel, generateDescription, false, true); + + const baseCategory = { + label, + position: 4, + collapsible: true, + collapsed: true, // Collapse base Library category by default + link: { + type: 'doc', + id: 'library/index', + }, + }; + + fs.mkdirSync(libraryDir, { recursive: true }); + fs.writeFileSync(categoryFile, JSON.stringify(baseCategory, null, 2) + '\n'); + + return true; +} + +// ============================================================================ +// Path Computation +// ============================================================================ + +/** + * Compute output path for a source file + * Mirrors the src/ structure in website/docs/library/ + * Applies directory name mapping (e.g., libraries -> utils) + * + * @param {string} solFilePath - Path to .sol file (e.g., 'src/access/AccessControl/AccessControlMod.sol') + * @returns {object} Output path information + */ +function computeOutputPath(solFilePath) 
{ + const libraryDir = CONFIG.libraryOutputDir || 'website/docs/library'; + + // Normalize path separators + const normalizedPath = solFilePath.replace(/\\/g, '/'); + + // Remove 'src/' prefix and '.sol' extension + const relativePath = normalizedPath.replace(/^src\//, '').replace(/\.sol$/, ''); + + const parts = relativePath.split('/'); + const fileName = parts.pop(); + + // Map directory names (e.g., libraries -> utils) + const mappedParts = parts.map(part => mapDirectoryName(part)); + + const outputDir = path.join(libraryDir, ...mappedParts); + const outputFile = path.join(outputDir, `${fileName}.mdx`); + + return { + outputDir, + outputFile, + relativePath: mappedParts.join('/'), + fileName, + category: mappedParts[0] || '', + subcategory: mappedParts[1] || '', + fullRelativePath: mappedParts.join('/'), + depth: mappedParts.length, + }; +} + +/** + * Ensure all parent category files exist for a given output path + * Creates _category_.json files for each directory level + * + * @param {string} outputDir - Full output directory path + */ +function ensureCategoryFiles(outputDir) { + const libraryDir = CONFIG.libraryOutputDir || 'website/docs/library'; + + // Get relative path from library base + const relativePath = path.relative(libraryDir, outputDir); + + if (!relativePath || relativePath.startsWith('..')) { + return; // outputDir is not under libraryDir + } + + // Ensure base category exists + ensureBaseCategory(libraryDir); + + // Walk up the directory tree, creating category files + const parts = relativePath.split(path.sep); + let currentPath = libraryDir; + + for (let i = 0; i < parts.length; i++) { + currentPath = path.join(currentPath, parts[i]); + const segment = parts[i]; + // Use the mapped path for the relative path (already mapped in computeOutputPath) + const relPath = parts.slice(0, i + 1).join('/'); + + createCategoryFile(currentPath, segment, relPath, i + 1); + } +} + +// 
============================================================================ +// Structure Synchronization +// ============================================================================ + +/** + * Regenerate index.mdx files for all categories + * @param {boolean} overwrite - Whether to overwrite existing files (default: true) + * @returns {object} Summary of regenerated categories + */ +function regenerateAllIndexFiles(overwrite = true) { + const structure = scanSourceStructure(); + const libraryDir = CONFIG.libraryOutputDir || 'website/docs/library'; + + const regenerated = []; + const skipped = []; + + // Regenerate base library index + // Always hide from sidebar (sidebar_class_name: "hidden") + const label = 'Library'; + const description = 'API reference for all Compose modules and facets.'; + if (createCategoryIndexFile(libraryDir, '', label, description, overwrite, true)) { + regenerated.push('library'); + } else { + skipped.push('library'); + } + + // Regenerate index for each category + const sortedPaths = Array.from(structure.entries()).sort((a, b) => + a[0].localeCompare(b[0]) + ); + + for (const [relativePath, info] of sortedPaths) { + const pathParts = relativePath.split('/'); + const mappedPathParts = pathParts.map(part => mapDirectoryName(part)); + const mappedRelativePath = mappedPathParts.join('/'); + const outputDir = path.join(libraryDir, ...mappedPathParts); + + const actualDirName = path.basename(outputDir); + const parentParts = mappedRelativePath.split('/').slice(0, -1); + const label = generateLabel(actualDirName); + const description = generateDescription(actualDirName, parentParts); + + if (createCategoryIndexFile(outputDir, mappedRelativePath, label, description, overwrite)) { + regenerated.push(mappedRelativePath); + } else { + skipped.push(mappedRelativePath); + } + } + + return { + regenerated, + skipped, + total: structure.size + 1, // +1 for base library + }; +} + +/** + * Synchronize docs structure with src structure + * Creates 
any missing category directories and _category_.json files + * + * @returns {object} Summary of created categories + */ +function syncDocsStructure() { + const structure = scanSourceStructure(); + const libraryDir = CONFIG.libraryOutputDir || 'website/docs/library'; + + const created = []; + const existing = []; + + // Ensure base library directory exists with category + if (ensureBaseCategory(libraryDir)) { + created.push('library'); + } else { + existing.push('library'); + } + + // Create category for each directory in the structure + // Sort by path to ensure parents are created before children + const sortedPaths = Array.from(structure.entries()).sort((a, b) => + a[0].localeCompare(b[0]) + ); + + for (const [relativePath, info] of sortedPaths) { + // Map directory names in the path (e.g., libraries -> utils) + const pathParts = relativePath.split('/'); + const mappedPathParts = pathParts.map(part => mapDirectoryName(part)); + const mappedRelativePath = mappedPathParts.join('/'); + const outputDir = path.join(libraryDir, ...mappedPathParts); + + const wasCreated = createCategoryFile( + outputDir, + info.name, + mappedRelativePath, + info.depth + ); + + if (wasCreated) { + created.push(mappedRelativePath); + } else { + existing.push(mappedRelativePath); + } + } + + return { + created, + existing, + total: structure.size, + structure, + }; +} + +// ============================================================================ +// Exports +// ============================================================================ + +module.exports = { + // Core functions + scanSourceStructure, + syncDocsStructure, + computeOutputPath, + ensureCategoryFiles, + createCategoryIndexFile, + regenerateAllIndexFiles, + + // Utilities + generateLabel, + generateDescription, + getCategoryPosition, + containsSolFiles, + mapDirectoryName, + computeSlug, + + // For extending/customizing + CATEGORY_LABELS, + CATEGORY_DESCRIPTIONS, + CATEGORY_POSITIONS, +}; + diff --git 
a/.github/scripts/generate-docs-utils/category/index-page-generator.js b/.github/scripts/generate-docs-utils/category/index-page-generator.js new file mode 100644 index 00000000..b999b350 --- /dev/null +++ b/.github/scripts/generate-docs-utils/category/index-page-generator.js @@ -0,0 +1,253 @@ +/** + * Index Page Generator + * + * Generates index.mdx files for category directories with custom DocCard components. + * This module provides utilities for creating styled category index pages. + */ + +const fs = require('fs'); +const path = require('path'); +const CONFIG = require('../config'); + +// ============================================================================ +// Category Items Discovery +// ============================================================================ + +/** + * Get all items (documents and subcategories) in a directory + * @param {string} outputDir - Directory to scan + * @param {string} relativePath - Relative path from library dir + * @param {Function} generateLabel - Function to generate labels from names + * @param {Function} generateDescription - Function to generate descriptions + * @returns {Array} Array of items with type, name, label, href, description + */ +function getCategoryItems(outputDir, relativePath, generateLabel, generateDescription) { + const items = []; + + if (!fs.existsSync(outputDir)) { + return items; + } + + const entries = fs.readdirSync(outputDir, { withFileTypes: true }); + + for (const entry of entries) { + // Skip hidden files, category files, and index files + if (entry.name.startsWith('.') || + entry.name === '_category_.json' || + entry.name === 'index.mdx') { + continue; + } + + if (entry.isFile() && entry.name.endsWith('.mdx')) { + // It's a document + const docName = entry.name.replace('.mdx', ''); + const docPath = path.join(outputDir, entry.name); + + // Try to read frontmatter for title and description + let title = generateLabel(docName); + let description = ''; + + try { + const content = 
fs.readFileSync(docPath, 'utf8'); + const frontmatterMatch = content.match(/^---\n([\s\S]*?)\n---/); + if (frontmatterMatch) { + const frontmatter = frontmatterMatch[1]; + const titleMatch = frontmatter.match(/^title:\s*["']?(.*?)["']?$/m); + const descMatch = frontmatter.match(/^description:\s*["']?(.*?)["']?$/m); + if (titleMatch) title = titleMatch[1].trim(); + if (descMatch) description = descMatch[1].trim(); + } + } catch (error) { + // If reading fails, use defaults + } + + const docRelativePath = relativePath ? `${relativePath}/${docName}` : docName; + items.push({ + type: 'doc', + name: docName, + label: title, + description: description, + href: `/docs/library/${docRelativePath}`, + }); + } else if (entry.isDirectory()) { + // It's a subcategory + const subcategoryName = entry.name; + const subcategoryLabel = generateLabel(subcategoryName); + const subcategoryRelativePath = relativePath ? `${relativePath}/${subcategoryName}` : subcategoryName; + const subcategoryDescription = generateDescription(subcategoryName, relativePath.split('/')); + + items.push({ + type: 'category', + name: subcategoryName, + label: subcategoryLabel, + description: subcategoryDescription, + href: `/docs/library/${subcategoryRelativePath}`, + }); + } + } + + // Sort items + // + // Default: categories first, then docs, both alphabetically. + // Special case: Diamond Core (`library/diamond`) to match sidebar order: + // Module, Inspect Facet, Upgrade Facet, Upgrade Module, Examples. + if (relativePath === 'diamond') { + const preferredOrder = [ + 'DiamondMod', + 'DiamondInspectFacet', + 'DiamondUpgradeFacet', + 'DiamondUpgradeMod', + 'example', + ]; + + const getIndex = (item) => { + const idx = preferredOrder.indexOf(item.name); + return idx === -1 ? 
Number.MAX_SAFE_INTEGER : idx;
+    };
+
+    items.sort((a, b) => {
+      const aIdx = getIndex(a);
+      const bIdx = getIndex(b);
+      if (aIdx !== bIdx) {
+        return aIdx - bIdx;
+      }
+      // Fallback deterministic ordering
+      return a.label.localeCompare(b.label);
+    });
+  } else {
+    items.sort((a, b) => {
+      if (a.type !== b.type) {
+        return a.type === 'category' ? -1 : 1;
+      }
+      return a.label.localeCompare(b.label);
+    });
+  }
+
+  return items;
+}
+
+// ============================================================================
+// MDX Content Generation
+// ============================================================================
+
+/**
+ * Generate MDX content for a category index page
+ * @param {string} label - Category label
+ * @param {string} description - Category description
+ * @param {Array} items - Array of items to display
+ * @param {boolean} hideFromSidebar - Whether to hide the page from the sidebar (default: false)
+ * @returns {string} Generated MDX content
+ */
+function generateIndexMdxContent(label, description, items, hideFromSidebar = false) {
+  // Escape quotes in label and description for frontmatter
+  const escapedLabel = label.replace(/"/g, '\\"');
+  const escapedDescription = description.replace(/"/g, '\\"');
+
+  // Add sidebar_class_name: "hidden" to hide from sidebar if requested
+  const sidebarClass = hideFromSidebar ?
'\nsidebar_class_name: "hidden"' : '';
+
+  let mdxContent = `---
+title: "${escapedLabel}"
+description: "${escapedDescription}"${sidebarClass}
+---
+
+import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard';
+import DocSubtitle from '@site/src/components/docs/DocSubtitle';
+import Icon from '@site/src/components/ui/Icon';
+
+<DocSubtitle>
+  ${escapedDescription}
+</DocSubtitle>
+
+`;
+
+  if (items.length > 0) {
+    mdxContent += `<DocCardGrid>\n`;
+
+    for (const item of items) {
+      // Icon mapping:
+      // - Categories (higher-level groupings): package
+      // - Facets (contract names ending with "Facet"): showcase-facet
+      // - Modules (contract names ending with "Mod"): box-detailed
+      // - Everything else: package
+      let iconName = 'package';
+      if (item.type === 'category') {
+        iconName = 'package';
+      } else if (item.name.endsWith('Facet')) {
+        iconName = 'showcase-facet';
+      } else if (item.name.endsWith('Mod')) {
+        iconName = 'box-detailed';
+      }
+      const itemDescription = item.description ? `"${item.description.replace(/"/g, '\\"')}"` : '""';
+
+      mdxContent += `  <DocCard
+    title="${item.label}"
+    href="${item.href}"
+    description=${itemDescription}
+    icon={<Icon name="${iconName}" />}
+    size="medium"
+  />\n`;
+    }
+
+    mdxContent += `</DocCardGrid>\n`;
+  } else {
+    mdxContent += `_No items in this category yet._\n`;
+  }
+
+  return mdxContent;
+}
+
+// ============================================================================
+// Index File Creation
+// ============================================================================
+
+/**
+ * Generate index.mdx file for a category
+ * @param {string} outputDir - Directory to create index file in
+ * @param {string} relativePath - Relative path from library dir
+ * @param {string} label - Category label
+ * @param {string} description - Category description
+ * @param {Function} generateLabel - Function to generate labels from names
+ * @param {Function} generateDescription - Function to generate descriptions
+ * @param {boolean} overwrite - Whether to overwrite existing files (default: false)
+ * @param {boolean} hideFromSidebar - Whether to hide the page from the sidebar (default: false)
+ * @returns {boolean} True if file was created/updated, false if skipped
+ */
+function
createCategoryIndexFile( + outputDir, + relativePath, + label, + description, + generateLabel, + generateDescription, + overwrite = false, + hideFromSidebar = false +) { + const indexFile = path.join(outputDir, 'index.mdx'); + + // Don't overwrite existing index files unless explicitly requested (allows manual customization) + if (!overwrite && fs.existsSync(indexFile)) { + return false; + } + + // Get items in this category + const items = getCategoryItems(outputDir, relativePath, generateLabel, generateDescription); + + // Generate MDX content + const mdxContent = generateIndexMdxContent(label, description, items, hideFromSidebar); + + // Ensure directory exists + fs.mkdirSync(outputDir, { recursive: true }); + fs.writeFileSync(indexFile, mdxContent); + + return true; +} + +// ============================================================================ +// Exports +// ============================================================================ + +module.exports = { + getCategoryItems, + generateIndexMdxContent, + createCategoryIndexFile, +}; + diff --git a/.github/scripts/generate-docs-utils/config.js b/.github/scripts/generate-docs-utils/config.js new file mode 100644 index 00000000..de1c60d7 --- /dev/null +++ b/.github/scripts/generate-docs-utils/config.js @@ -0,0 +1,168 @@ +/** + * Configuration for documentation generation + * + * Centralized configuration for paths, settings, and defaults. + * Modify this file to change documentation output paths or behavior. 
+ */ + +module.exports = { + // ============================================================================ + // Input Paths + // ============================================================================ + + /** Directory containing forge doc output */ + forgeDocsDir: 'docs/src/src', + + /** Source code directory to mirror */ + srcDir: 'src', + + // ============================================================================ + // Output Paths + // ============================================================================ + + /** + * Base output directory for contract documentation + * Structure mirrors src/ automatically + */ + contractsOutputDir: 'website/docs/contracts', + + // ============================================================================ + // Sidebar Positions + // ============================================================================ + + /** Default sidebar position for contracts without explicit mapping */ + defaultSidebarPosition: 50, + + /** + * Contract-specific sidebar positions + * Maps contract name to position number (lower = higher in sidebar) + * + * Convention: + * - Facets come before their corresponding modules + * - Core/base contracts come before extensions + * - Burn facets come after main facets + */ + contractPositions: { + // Diamond core – order: DiamondMod, DiamondInspectFacet, DiamondUpgradeFacet, DiamondUpgradeMod, then Examples + DiamondMod: 1, + DiamondInspectFacet: 2, + DiamondUpgradeFacet: 3, + DiamondUpgradeMod: 4, + DiamondCutMod: 3, + DiamondCutFacet: 1, + DiamondLoupeFacet: 4, + + // Access - Owner pattern + OwnerMod: 2, + OwnerFacet: 1, + + // Access - Two-step owner + OwnerTwoStepsMod: 2, + OwnerTwoStepsFacet: 1, + + // Access - AccessControl pattern + AccessControlMod: 2, + AccessControlFacet: 1, + + // Access - AccessControlPausable + AccessControlPausableMod: 2, + AccessControlPausableFacet: 1, + + // Access - AccessControlTemporal + AccessControlTemporalMod: 2, + AccessControlTemporalFacet: 1, + + 
// ERC-20 base + ERC20Mod: 2, + ERC20Facet: 1, + ERC20BurnFacet: 3, + + // ERC-20 Bridgeable + ERC20BridgeableMod: 2, + ERC20BridgeableFacet: 1, + + // ERC-20 Permit + ERC20PermitMod: 2, + ERC20PermitFacet: 1, + + // ERC-721 base + ERC721Mod: 2, + ERC721Facet: 1, + ERC721BurnFacet: 3, + + // ERC-721 Enumerable + ERC721EnumerableMod: 2, + ERC721EnumerableFacet: 1, + ERC721EnumerableBurnFacet: 3, + + // ERC-1155 + ERC1155Mod: 2, + ERC1155Facet: 1, + + // ERC-6909 + ERC6909Mod: 2, + ERC6909Facet: 1, + + // Royalty + RoyaltyMod: 2, + RoyaltyFacet: 1, + + // Libraries + NonReentrancyMod: 1, + ERC165Mod: 2, + ERC165Facet: 1, + }, + + /** + * Diamond docs: sidebar labels for the sidebar nav (e.g. "Module", "Inspect Facet"). + * File names stay as contract names (e.g. DiamondMod.mdx, DiamondInspectFacet.mdx). + */ + diamondSidebarLabels: { + DiamondMod: 'Module', + DiamondInspectFacet: 'Inspect Facet', + DiamondUpgradeFacet: 'Upgrade Facet', + DiamondUpgradeMod: 'Upgrade Module', + }, + + // ============================================================================ + // Repository Configuration + // ============================================================================ + + /** Main repository URL - always use this for source links */ + mainRepoUrl: 'https://github.com/Perfect-Abstractions/Compose', + + /** + * Normalize gitSource URL to always point to the main repository's main branch + * Replaces any fork or incorrect repository URLs with the main repo URL + * Converts blob URLs to tree URLs pointing to main branch + * @param {string} gitSource - Original gitSource URL from forge doc + * @returns {string} Normalized gitSource URL + */ + normalizeGitSource(gitSource) { + if (!gitSource) return gitSource; + + // Pattern: https://github.com/USER/Compose/blob/COMMIT/src/path/to/file.sol + // Convert to: https://github.com/Perfect-Abstractions/Compose/tree/main/src/path/to/file.sol + const githubUrlPattern = 
/https:\/\/github\.com\/[^\/]+\/Compose\/(?:blob|tree)\/[^\/]+\/(.+)/;
+    const match = gitSource.match(githubUrlPattern);
+
+    if (match) {
+      // Extract the path after the repo name (should start with src/)
+      const pathPart = match[1];
+      // Ensure it starts with src/ (prepend src/ when it is missing)
+      const normalizedPath = pathPart.startsWith('src/') ? pathPart : `src/${pathPart}`;
+      return `${this.mainRepoUrl}/tree/main/${normalizedPath}`;
+    }
+
+    // If it doesn't match the pattern, try to construct from the main repo
+    // Extract just the file path if it's a relative path or partial URL
+    if (gitSource.includes('/src/')) {
+      const srcIndex = gitSource.indexOf('/src/');
+      const pathAfterSrc = gitSource.substring(srcIndex + 1);
+      return `${this.mainRepoUrl}/tree/main/${pathAfterSrc}`;
+    }
+
+    // If it doesn't match any pattern, return as-is (might be a different format)
+    return gitSource;
+  },
+};
diff --git a/.github/scripts/generate-docs-utils/core/contract-processor.js b/.github/scripts/generate-docs-utils/core/contract-processor.js
new file mode 100644
index 00000000..e9e10bcf
--- /dev/null
+++ b/.github/scripts/generate-docs-utils/core/contract-processor.js
@@ -0,0 +1,102 @@
+/**
+ * Contract Processing Pipeline
+ *
+ * Shared processing logic for both regular and aggregated contract files.
+ * Handles the complete pipeline from parsed data to written MDX file.
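For reference, the `normalizeGitSource` helper in config.js above can be exercised on its own. The sketch below copies its logic into a standalone function — `mainRepoUrl` is inlined from `CONFIG.mainRepoUrl`, and the fork user/commit in the example URL are made up:

```javascript
// Standalone copy of config.normalizeGitSource, with mainRepoUrl inlined.
const mainRepoUrl = 'https://github.com/Perfect-Abstractions/Compose';

function normalizeGitSource(gitSource) {
  if (!gitSource) return gitSource;

  // Fork or commit-pinned blob/tree URLs are rewritten onto the main branch.
  const githubUrlPattern =
    /https:\/\/github\.com\/[^\/]+\/Compose\/(?:blob|tree)\/[^\/]+\/(.+)/;
  const match = gitSource.match(githubUrlPattern);

  if (match) {
    const pathPart = match[1];
    const normalizedPath = pathPart.startsWith('src/') ? pathPart : `src/${pathPart}`;
    return `${mainRepoUrl}/tree/main/${normalizedPath}`;
  }

  // Partial URLs that at least contain /src/ are re-rooted onto main.
  if (gitSource.includes('/src/')) {
    const srcIndex = gitSource.indexOf('/src/');
    return `${mainRepoUrl}/tree/main/${gitSource.substring(srcIndex + 1)}`;
  }

  // Anything else passes through unchanged.
  return gitSource;
}

// A fork's commit-pinned blob URL (hypothetical user and commit):
console.log(normalizeGitSource(
  'https://github.com/SomeUser/Compose/blob/abc1234/src/diamond/DiamondMod.sol'
));
// -> https://github.com/Perfect-Abstractions/Compose/tree/main/src/diamond/DiamondMod.sol
```

Non-GitHub URLs without a `/src/` segment come back unchanged, so a caller never loses a link it cannot interpret.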
+ */ + +const fs = require('fs'); +const { extractStorageInfo } = require('../parsing/storage-extractor'); +const { getOutputPath } = require('../utils/path-computer'); +const { getSidebarPosition } = require('../utils/sidebar-position-calculator'); +const { registerContract, getContractRegistry } = require('./contract-registry'); +const { generateFacetDoc, generateModuleDoc } = require('../templates/templates'); +const { enhanceWithAI, shouldSkipEnhancement } = require('../ai/ai-enhancement'); +const { addFallbackContent } = require('../ai/fallback-content-provider'); +const { applyDescriptionFallback } = require('./description-manager'); +const { writeFileSafe } = require('../../workflow-utils'); + +/** + * Process contract data through the complete pipeline + * @param {object} data - Parsed documentation data + * @param {string} solFilePath - Path to source Solidity file + * @param {'module' | 'facet'} contractType - Type of contract + * @param {object} tracker - Tracker object for recording results (temporary, will be replaced with SummaryTracker) + * @returns {Promise<{success: boolean, error?: string}>} Processing result + */ +async function processContractData(data, solFilePath, contractType, tracker) { + // 1. Extract storage info for modules + if (contractType === 'module') { + data.storageInfo = extractStorageInfo(data); + } + + // 2. Apply description fallback + data = applyDescriptionFallback(data, contractType, solFilePath); + + // 3. Compute output path (mirrors src/ structure) + const pathInfo = getOutputPath(solFilePath, contractType); + + // 4. Get registry for relationship detection + const registry = getContractRegistry(); + + // 5. Get smart sidebar position (uses registry if available) + data.position = getSidebarPosition(data.title, contractType, pathInfo.category, registry); + + // 6. Set contract type for registry (before registering) + data.contractType = contractType; + + // 7. 
Register contract in registry (before AI enhancement so it's available for relationship detection) + registerContract(data, pathInfo); + + // 8. Enhance with AI if not skipped + const skipAIEnhancement = shouldSkipEnhancement(data) || process.env.SKIP_ENHANCEMENT === 'true'; + let enhancedData = data; + let usedFallback = false; + let enhancementError = null; + + if (!skipAIEnhancement) { + const token = process.env.GITHUB_TOKEN; + const result = await enhanceWithAI(data, contractType, token); + enhancedData = result.data; + usedFallback = result.usedFallback; + enhancementError = result.error; + + // Track fallback usage + if (usedFallback) { + tracker.recordFallback(data.title, pathInfo.outputFile, enhancementError || 'Unknown error'); + } + } else { + enhancedData = addFallbackContent(data, contractType); + } + + // Ensure contractType is preserved after AI enhancement + enhancedData.contractType = contractType; + + // 9. Generate MDX content with registry for relationship detection + const mdxContent = contractType === 'module' + ? generateModuleDoc(enhancedData, enhancedData.position, pathInfo, registry) + : generateFacetDoc(enhancedData, enhancedData.position, pathInfo, registry); + + // 10. Ensure output directory exists + fs.mkdirSync(pathInfo.outputDir, { recursive: true }); + + // 11. 
Write the file + if (writeFileSafe(pathInfo.outputFile, mdxContent)) { + // Track success + if (contractType === 'module') { + tracker.recordModule(data.title, pathInfo.outputFile); + } else { + tracker.recordFacet(data.title, pathInfo.outputFile); + } + return { success: true }; + } + + // Track write error + tracker.recordError(pathInfo.outputFile, 'Could not write file'); + return { success: false, error: 'Could not write file' }; +} + +module.exports = { + processContractData, +}; + diff --git a/.github/scripts/generate-docs-utils/core/contract-registry.js b/.github/scripts/generate-docs-utils/core/contract-registry.js new file mode 100644 index 00000000..a8981752 --- /dev/null +++ b/.github/scripts/generate-docs-utils/core/contract-registry.js @@ -0,0 +1,97 @@ +/** + * Contract Registry System + * + * Tracks all contracts (modules and facets) for relationship detection + * and cross-reference generation in documentation. + * + * Features: + * - Register contracts with metadata (name, type, category, path) + * - Provide registry access for relationship detection and other operations + */ + +// ============================================================================ +// Registry State +// ============================================================================ + +/** + * Global registry to track all contracts for relationship detection + * This allows us to find related contracts and generate cross-references + */ +const contractRegistry = { + byName: new Map(), + byCategory: new Map(), + byType: { modules: [], facets: [] } +}; + +// ============================================================================ +// Registry Management +// ============================================================================ + +/** + * Register a contract in the global registry + * @param {object} contractData - Contract documentation data + * @param {object} outputPath - Output path information from getOutputPath + * @returns {object} Registered contract entry + */ 
+function registerContract(contractData, outputPath) { + // Construct full path including filename (without .mdx extension) + // This ensures RelatedDocs links point to the actual page, not the category index + const fullPath = outputPath.relativePath + ? `${outputPath.relativePath}/${outputPath.fileName}` + : outputPath.fileName; + + const entry = { + name: contractData.title, + type: contractData.contractType, // 'module' or 'facet' + category: outputPath.category, + path: fullPath, + sourcePath: contractData.sourceFilePath, + functions: contractData.functions || [], + storagePosition: contractData.storageInfo?.storagePosition + }; + + contractRegistry.byName.set(contractData.title, entry); + + if (!contractRegistry.byCategory.has(outputPath.category)) { + contractRegistry.byCategory.set(outputPath.category, []); + } + contractRegistry.byCategory.get(outputPath.category).push(entry); + + if (contractData.contractType === 'module') { + contractRegistry.byType.modules.push(entry); + } else { + contractRegistry.byType.facets.push(entry); + } + + return entry; +} + +/** + * Get the contract registry + * @returns {object} The contract registry + */ +function getContractRegistry() { + return contractRegistry; +} + +/** + * Clear the contract registry (useful for testing or reset) + */ +function clearContractRegistry() { + contractRegistry.byName.clear(); + contractRegistry.byCategory.clear(); + contractRegistry.byType.modules = []; + contractRegistry.byType.facets = []; +} + +// ============================================================================ +// Exports +// ============================================================================ + +module.exports = { + // Registry management + registerContract, + getContractRegistry, + clearContractRegistry, +}; + diff --git a/.github/scripts/generate-docs-utils/core/description-generator.js b/.github/scripts/generate-docs-utils/core/description-generator.js new file mode 100644 index 00000000..34cd18b1 --- /dev/null 
+++ b/.github/scripts/generate-docs-utils/core/description-generator.js
@@ -0,0 +1,109 @@
+/**
+ * Description Generator
+ *
+ * Generates fallback descriptions from contract names.
+ */
+
+const CONFIG = require('../config');
+
+/**
+ * Generate a fallback description from contract name
+ *
+ * This is a minimal, generic fallback used only when no NatSpec @title/@notice
+ * exists in the source. The AI enhancement step later receives this as input
+ * and generates a richer, context-aware description from the actual code.
+ *
+ * @param {string} contractName - Name of the contract
+ * @returns {string} Generic description (will be enhanced by AI)
+ */
+function generateDescriptionFromName(contractName) {
+  if (!contractName) return '';
+
+  // Detect library type from naming convention
+  const isModule = contractName.endsWith('Mod') || contractName.endsWith('Module');
+  const isFacet = contractName.endsWith('Facet');
+  const typeLabel = isModule ? 'module' : isFacet ? 'facet' : 'library';
+
+  // Remove suffix and convert CamelCase to readable text
+  const baseName = contractName
+    .replace(/Mod$/, '')
+    .replace(/Module$/, '')
+    .replace(/Facet$/, '');
+
+  // Convert CamelCase to readable format
+  // Handles: ERC20 -> ERC-20, AccessControl -> Access Control
+  const readable = baseName
+    .replace(/([a-z])([A-Z])/g, '$1 $2') // camelCase splits
+    .replace(/(\d)([A-Z])/g, '$1 $2') // digit-to-uppercase splits (e.g. ERC20Permit), as in formatDisplayTitle
+    .replace(/([A-Z]+)([A-Z][a-z])/g, '$1 $2') // acronym handling
+    .replace(/^ERC(\d+)/, 'ERC-$1') // ERC20 -> ERC-20
+    .trim();
+
+  return `${readable} ${typeLabel} for Compose diamonds`;
+}
+
+/**
+ * Compute an optional sidebar label for a contract.
+ *
+ * For the diamond category, uses config diamondSidebarLabels (e.g. "Module", "Inspect Facet").
+ * For other library categories, returns a minimal "Facet" or "Module" label. Utilities keep null.
+ *
+ * @param {'facet' | 'module' | string} contractType
+ * @param {string} category
+ * @param {string} [contractName] - Contract name (e.g.
DiamondMod), used for diamond-specific labels + * @returns {string|null} Sidebar label or null when no override should be used + */ +function getSidebarLabel(contractType, category, contractName) { + if (!contractType) return null; + + const normalizedCategory = (category || '').toLowerCase(); + + // Diamond: use short labels from config (e.g. "Module", "Inspect Facet", "Upgrade Module") + if (normalizedCategory === 'diamond' && contractName && CONFIG.diamondSidebarLabels && CONFIG.diamondSidebarLabels[contractName]) { + return CONFIG.diamondSidebarLabels[contractName]; + } + + if (normalizedCategory === 'utils') { + return null; + } + + if (contractType === 'facet') { + return 'Facet'; + } + + if (contractType === 'module') { + return 'Module'; + } + + return null; +} + +/** + * Format contract name as display title for page headers and frontmatter. + * Adds spacing (splits camelCase) and expands "Mod" to "Module". + * + * @param {string} contractName - Raw contract name (e.g. OwnerDataMod, ERC20TransferFacet) + * @returns {string} Display title (e.g. 
"Owner Data Module", "ERC-20 Transfer Facet") + */ +function formatDisplayTitle(contractName) { + if (!contractName || typeof contractName !== 'string') return ''; + + const withSpaces = contractName + .replace(/^ERC(\d+)/, 'ERC-$1') + .replace(/([a-z])([A-Z])/g, '$1 $2') + .replace(/(\d)([A-Z])/g, '$1 $2') + .replace(/([A-Z]+)([A-Z][a-z])/g, '$1 $2') + .trim(); + + const withModule = withSpaces.replace(/\s+Mod$/, ' Module'); + return withModule; +} + +module.exports = { + generateDescriptionFromName, + getSidebarLabel, + formatDisplayTitle, +}; + diff --git a/.github/scripts/generate-docs-utils/core/description-manager.js b/.github/scripts/generate-docs-utils/core/description-manager.js new file mode 100644 index 00000000..0b2c14ab --- /dev/null +++ b/.github/scripts/generate-docs-utils/core/description-manager.js @@ -0,0 +1,84 @@ +/** + * Description Manager + * + * Handles description generation and fallback logic for contracts. + * Consolidates description generation from multiple sources. 
+ */ + +const { extractModuleDescriptionFromSource } = require('../utils/source-parser'); +const { generateDescriptionFromName } = require('./description-generator'); + +/** + * Apply description fallback logic to contract data + * @param {object} data - Contract documentation data + * @param {'module' | 'facet'} contractType - Type of contract + * @param {string} solFilePath - Path to source Solidity file + * @returns {object} Data with description applied + */ +function applyDescriptionFallback(data, contractType, solFilePath) { + // For modules, try to get description from source file first + if (contractType === 'module' && solFilePath) { + const sourceDescription = extractModuleDescriptionFromSource(solFilePath); + if (sourceDescription) { + data.description = sourceDescription; + data.subtitle = sourceDescription; + data.overview = sourceDescription; + return data; + } + } + + // For facets, check if description is generic and needs replacement + if (contractType === 'facet') { + const looksLikeEnum = + data.description && + /\w+\s*=\s*\d+/.test(data.description) && + (data.description.match(/\w+\s*=\s*\d+/g) || []).length >= 2; + + const isGenericDescription = + !data.description || + data.description.startsWith('Contract documentation for') || + looksLikeEnum || + data.description.length < 20; + + if (isGenericDescription) { + const generatedDescription = generateDescriptionFromName(data.title); + if (generatedDescription) { + data.description = generatedDescription; + data.subtitle = generatedDescription; + data.overview = generatedDescription; + return data; + } + } + } + + // For modules, try generating from name + if (contractType === 'module') { + const generatedDescription = generateDescriptionFromName(data.title); + if (generatedDescription) { + data.description = generatedDescription; + data.subtitle = generatedDescription; + data.overview = generatedDescription; + return data; + } + + // Last resort fallback for modules + const genericDescription = 
`Module providing internal functions for ${data.title}`; + if ( + !data.description || + data.description.includes('Event emitted') || + data.description.includes('Thrown when') || + data.description.includes('function to') + ) { + data.description = genericDescription; + data.subtitle = genericDescription; + data.overview = genericDescription; + } + } + + return data; +} + +module.exports = { + applyDescriptionFallback, +}; + diff --git a/.github/scripts/generate-docs-utils/core/file-processor.js b/.github/scripts/generate-docs-utils/core/file-processor.js new file mode 100644 index 00000000..4825ac19 --- /dev/null +++ b/.github/scripts/generate-docs-utils/core/file-processor.js @@ -0,0 +1,151 @@ +/** + * File Processor + * + * Handles processing of Solidity source files and their forge doc outputs. + */ + +const { findForgeDocFiles } = require('../utils/file-finder'); +const { isInterface, getContractType } = require('../utils/contract-classifier'); +const { extractModuleNameFromPath } = require('../utils/source-parser'); +const { readFileSafe } = require('../../workflow-utils'); +const { parseForgeDocMarkdown } = require('../parsing/markdown-parser'); +const { + parseIndividualItemFile, + aggregateParsedItems, + detectItemTypeFromFilename, +} = require('../parsing/item-parser'); +const { processContractData } = require('./contract-processor'); + +/** + * Process a single forge doc markdown file + * @param {string} forgeDocFile - Path to forge doc markdown file + * @param {string} solFilePath - Original .sol file path + * @param {object} tracker - Tracker instance + * @returns {Promise} True if processed successfully + */ +async function processForgeDocFile(forgeDocFile, solFilePath, tracker) { + const content = readFileSafe(forgeDocFile); + if (!content) { + tracker.recordError(forgeDocFile, 'Could not read file'); + return false; + } + + // Parse the forge doc markdown + const data = parseForgeDocMarkdown(content, forgeDocFile); + + // Add source file path for 
parameter extraction + if (solFilePath) { + data.sourceFilePath = solFilePath; + } + + if (!data.title) { + tracker.recordSkipped(forgeDocFile, 'No title found'); + return false; + } + + // Skip interfaces + if (isInterface(data.title, content)) { + tracker.recordSkipped(forgeDocFile, 'Interface (filtered)'); + return false; + } + + // Determine contract type + const contractType = getContractType(forgeDocFile, content); + + // Process through shared pipeline (includes description fallback) + const result = await processContractData(data, solFilePath, contractType, tracker); + return result.success; +} + +/** + * Check if files need aggregation (individual item files vs contract-level files) + * @param {string[]} forgeDocFiles - Array of forge doc file paths + * @returns {boolean} True if files are individual items that need aggregation + */ +function needsAggregation(forgeDocFiles) { + for (const file of forgeDocFiles) { + const itemType = detectItemTypeFromFilename(file); + if (itemType) { + return true; + } + } + return false; +} + +/** + * Process aggregated files (for free function modules) + * @param {string[]} forgeDocFiles - Array of forge doc file paths + * @param {string} solFilePath - Original .sol file path + * @param {object} tracker - Tracker instance + * @returns {Promise} True if processed successfully + */ +async function processAggregatedFiles(forgeDocFiles, solFilePath, tracker) { + const parsedItems = []; + let gitSource = ''; + + for (const forgeDocFile of forgeDocFiles) { + const content = readFileSafe(forgeDocFile); + if (!content) { + continue; + } + + const parsed = parseIndividualItemFile(content, forgeDocFile); + if (parsed) { + parsedItems.push(parsed); + if (parsed.gitSource && !gitSource) { + gitSource = parsed.gitSource; + } + } + } + + if (parsedItems.length === 0) { + tracker.recordError(solFilePath, 'No valid items parsed'); + return false; + } + + const data = aggregateParsedItems(parsedItems, solFilePath); + + data.sourceFilePath 
= solFilePath; + + if (!data.title) { + data.title = extractModuleNameFromPath(solFilePath); + } + + if (gitSource) { + data.gitSource = gitSource; + } + + const contractType = getContractType(solFilePath, ''); + + // Process through shared pipeline (includes description fallback) + const result = await processContractData(data, solFilePath, contractType, tracker); + return result.success; +} + +/** + * Process a Solidity source file + * @param {string} solFilePath - Path to .sol file + * @param {object} tracker - Tracker instance + * @returns {Promise} + */ +async function processSolFile(solFilePath, tracker) { + const forgeDocFiles = findForgeDocFiles(solFilePath); + + if (forgeDocFiles.length === 0) { + tracker.recordSkipped(solFilePath, 'No forge doc output'); + return; + } + + if (needsAggregation(forgeDocFiles)) { + await processAggregatedFiles(forgeDocFiles, solFilePath, tracker); + } else { + for (const forgeDocFile of forgeDocFiles) { + await processForgeDocFile(forgeDocFile, solFilePath, tracker); + } + } +} + +module.exports = { + processSolFile, +}; + diff --git a/.github/scripts/generate-docs-utils/core/file-selector.js b/.github/scripts/generate-docs-utils/core/file-selector.js new file mode 100644 index 00000000..0b9bf34a --- /dev/null +++ b/.github/scripts/generate-docs-utils/core/file-selector.js @@ -0,0 +1,40 @@ +/** + * File Selector + * + * Determines which Solidity files to process based on command line arguments. 
+ */ + +const { getAllSolFiles, readChangedFilesFromFile, getChangedSolFiles } = require('../utils/git-utils'); + +/** + * Get files to process based on command line arguments + * @param {string[]} args - Command line arguments + * @returns {string[]} Array of Solidity file paths to process + */ +function getFilesToProcess(args) { + if (args.includes('--all')) { + console.log('Processing all Solidity files...'); + return getAllSolFiles(); + } + + if (args.length > 0 && !args[0].startsWith('--')) { + const changedFilesPath = args[0]; + console.log(`Reading changed files from: ${changedFilesPath}`); + const solFiles = readChangedFilesFromFile(changedFilesPath); + + if (solFiles.length === 0) { + console.log('No files in list, checking git diff...'); + return getChangedSolFiles(); + } + + return solFiles; + } + + console.log('Getting changed Solidity files from git...'); + return getChangedSolFiles(); +} + +module.exports = { + getFilesToProcess, +}; + diff --git a/.github/scripts/generate-docs-utils/core/relationship-detector.js b/.github/scripts/generate-docs-utils/core/relationship-detector.js new file mode 100644 index 00000000..2e926353 --- /dev/null +++ b/.github/scripts/generate-docs-utils/core/relationship-detector.js @@ -0,0 +1,123 @@ +/** + * Relationship Detector + * + * Detects relationships between contracts (modules and facets) for + * cross-reference generation in documentation. 
+ * + * Features: + * - Find related contracts (module/facet pairs, same category, extensions) + * - Enrich documentation data with relationship information + */ + +const { getContractRegistry } = require('./contract-registry'); + +/** + * Find related contracts for a given contract + * @param {string} contractName - Name of the contract + * @param {string} contractType - Type of contract ('module' or 'facet') + * @param {string} category - Category of the contract + * @param {object} registry - Contract registry (optional, uses global if not provided) + * @returns {Array} Array of related contract objects with title, href, description, icon + */ +function findRelatedContracts(contractName, contractType, category, registry = null) { + const reg = registry || getContractRegistry(); + const related = []; + const contract = reg.byName.get(contractName); + if (!contract) return related; + + // 1. Find corresponding module/facet pair + if (contractType === 'facet') { + const moduleName = contractName.replace('Facet', 'Mod'); + const module = reg.byName.get(moduleName); + if (module) { + related.push({ + title: moduleName, + href: `/docs/library/${module.path}`, + description: `Module used by ${contractName}`, + icon: '📦' + }); + } + } else if (contractType === 'module') { + const facetName = contractName.replace('Mod', 'Facet'); + const facet = reg.byName.get(facetName); + if (facet) { + related.push({ + title: facetName, + href: `/docs/library/${facet.path}`, + description: `Facet using ${contractName}`, + icon: '💎' + }); + } + } + + // 2. Find related contracts in same category (excluding self) + const sameCategory = reg.byCategory.get(category) || []; + sameCategory.forEach(c => { + if (c.name !== contractName && c.type === contractType) { + related.push({ + title: c.name, + href: `/docs/library/${c.path}`, + description: `Related ${contractType} in ${category}`, + icon: contractType === 'module' ? '📦' : '💎' + }); + } + }); + + // 3. 
Find extension contracts (e.g., ERC20Facet → ERC20BurnFacet) + if (contractType === 'facet') { + const baseName = contractName.replace(/BurnFacet$|PermitFacet$|BridgeableFacet$|EnumerableFacet$/, 'Facet'); + if (baseName !== contractName) { + const base = reg.byName.get(baseName); + if (base) { + related.push({ + title: baseName, + href: `/docs/library/${base.path}`, + description: `Base facet for ${contractName}`, + icon: '💎' + }); + } + } + } + + // 4. Find core dependencies (e.g., all facets depend on DiamondCutFacet) + if (contractType === 'facet' && contractName !== 'DiamondCutFacet') { + const diamondCut = reg.byName.get('DiamondCutFacet'); + if (diamondCut) { + related.push({ + title: 'DiamondCutFacet', + href: `/docs/library/${diamondCut.path}`, + description: 'Required for adding facets to diamonds', + icon: '🔧' + }); + } + } + + return related.slice(0, 4); // Limit to 4 related items +} + +/** + * Enrich contract data with relationship information + * @param {object} data - Contract documentation data + * @param {object} pathInfo - Output path information + * @param {object} registry - Contract registry (optional, uses global if not provided) + * @returns {object} Enriched data with relatedDocs property + */ +function enrichWithRelationships(data, pathInfo, registry = null) { + const relatedDocs = findRelatedContracts( + data.title, + data.contractType, + pathInfo.category, + registry + ); + + return { + ...data, + relatedDocs: relatedDocs.length > 0 ? relatedDocs : null + }; +} + +module.exports = { + findRelatedContracts, + enrichWithRelationships, +}; + diff --git a/.github/scripts/generate-docs-utils/parsing/item-builder.js b/.github/scripts/generate-docs-utils/parsing/item-builder.js new file mode 100644 index 00000000..8956b1c7 --- /dev/null +++ b/.github/scripts/generate-docs-utils/parsing/item-builder.js @@ -0,0 +1,90 @@ +/** + * Item Builder + * + * Functions for creating and saving parsed items. 
+ */ + +/** + * Create a new item object based on section type + * @param {string} name - Item name + * @param {string} section - Section type + * @returns {object} New item object + */ +function createNewItem(name, section) { + const base = { + name, + description: '', + notice: '', + }; + + switch (section) { + case 'functions': + return { + ...base, + signature: '', + params: [], + returns: [], + mutability: 'nonpayable', + }; + case 'events': + return { + ...base, + signature: '', + params: [], + }; + case 'errors': + return { + ...base, + signature: '', + params: [], + }; + case 'structs': + return { + ...base, + definition: '', + fields: [], + }; + case 'stateVariables': + return { + ...base, + type: '', + value: '', + }; + default: + return base; + } +} + +/** + * Save current item to data object + * @param {object} data - Data object to save to + * @param {object} item - Item to save + * @param {string} type - Item type + */ +function saveCurrentItem(data, item, type) { + if (!type || !item) return; + + switch (type) { + case 'functions': + data.functions.push(item); + break; + case 'events': + data.events.push(item); + break; + case 'errors': + data.errors.push(item); + break; + case 'structs': + data.structs.push(item); + break; + case 'stateVariables': + data.stateVariables.push(item); + break; + } +} + +module.exports = { + createNewItem, + saveCurrentItem, +}; + diff --git a/.github/scripts/generate-docs-utils/parsing/item-parser.js b/.github/scripts/generate-docs-utils/parsing/item-parser.js new file mode 100644 index 00000000..3da73237 --- /dev/null +++ b/.github/scripts/generate-docs-utils/parsing/item-parser.js @@ -0,0 +1,366 @@ +/** + * Item Parser + * + * Functions for parsing individual item files and aggregating them. 
+ */ + +const path = require('path'); +const config = require('../config'); +const { sanitizeBrokenLinks, cleanDescription } = require('./text-sanitizer'); + +/** + * Detect item type from filename + * @param {string} filePath - Path to the markdown file + * @returns {string | null} Item type ('function', 'error', 'struct', 'event', 'enum', 'constants', or null) + */ +function detectItemTypeFromFilename(filePath) { + const basename = path.basename(filePath); + + if (basename.startsWith('function.')) return 'function'; + if (basename.startsWith('error.')) return 'error'; + if (basename.startsWith('struct.')) return 'struct'; + if (basename.startsWith('event.')) return 'event'; + if (basename.startsWith('enum.')) return 'enum'; + if (basename.startsWith('constants.')) return 'constants'; + + return null; +} + +/** + * Parse an individual item file (function, error, constant, etc.) + * @param {string} content - Markdown content from forge doc + * @param {string} filePath - Path to the markdown file + * @returns {object | null} Parsed item object or null if parsing fails + */ +function parseIndividualItemFile(content, filePath) { + const itemType = detectItemTypeFromFilename(filePath); + if (!itemType) { + return null; + } + + const lines = content.split('\n'); + let itemName = ''; + let gitSource = ''; + let description = ''; + let signature = ''; + let definition = ''; + let descriptionBuffer = []; + let inCodeBlock = false; + let codeBlockLines = []; + let params = []; + let returns = []; + let constants = []; + + for (let i = 0; i < lines.length; i++) { + const line = lines[i]; + const trimmedLine = line.trim(); + + // Parse title (# heading) + if (line.startsWith('# ') && !itemName) { + itemName = line.replace('# ', '').trim(); + continue; + } + + // Parse git source link + if (trimmedLine.startsWith('[Git Source]')) { + const match = trimmedLine.match(/\[Git Source\]\((.*?)\)/); + if (match) { + gitSource = config.normalizeGitSource(match[1]); + } + continue; + } 
+ + // Parse code block + if (line.startsWith('```solidity')) { + inCodeBlock = true; + codeBlockLines = []; + i++; + while (i < lines.length && !lines[i].startsWith('```')) { + codeBlockLines.push(lines[i]); + i++; + } + const codeContent = codeBlockLines.join('\n').trim(); + + if (itemType === 'constants') { + // For constants, parse multiple constant definitions + // Format: "bytes32 constant NON_REENTRANT_SLOT = keccak256(...)" + // Handle both single and multiple constants in one code block + const constantMatches = codeContent.match(/(\w+(?:\s*\d+)?)\s+constant\s+(\w+)\s*=\s*(.+?)(?:\s*;)?/g); + if (constantMatches) { + for (const match of constantMatches) { + const parts = match.match(/(\w+(?:\s*\d+)?)\s+constant\s+(\w+)\s*=\s*(.+?)(?:\s*;)?$/); + if (parts) { + constants.push({ + name: parts[2], + type: parts[1], + value: parts[3].trim(), + description: descriptionBuffer.join(' ').trim(), + }); + } + } + } else { + // Single constant definition (more flexible regex) + const singleMatch = codeContent.match(/(\w+(?:\s*\d+)?)\s+constant\s+(\w+)\s*=\s*(.+?)(?:\s*;)?$/); + if (singleMatch) { + constants.push({ + name: singleMatch[2], + type: singleMatch[1], + value: singleMatch[3].trim(), + description: descriptionBuffer.join(' ').trim(), + }); + } + } + // Clear description buffer after processing constants + descriptionBuffer = []; + } else { + signature = codeContent; + } + inCodeBlock = false; + continue; + } + + // Parse constants with ### heading format + if (itemType === 'constants' && line.startsWith('### ')) { + const constantName = line.replace('### ', '').trim(); + // Clear description buffer for this constant (only text before this heading) + // Filter out code block delimiters and empty lines + const currentConstantDesc = descriptionBuffer + .filter(l => l && !l.trim().startsWith('```') && l.trim() !== '') + .join(' ') + .trim(); + descriptionBuffer = []; + + // Look ahead for code block (within next 15 lines) + let foundCodeBlock = false; + let 
codeBlockEndIndex = i; + for (let j = i + 1; j < lines.length && j < i + 15; j++) { + if (lines[j].startsWith('```solidity')) { + foundCodeBlock = true; + const constCodeLines = []; + j++; + while (j < lines.length && !lines[j].startsWith('```')) { + constCodeLines.push(lines[j]); + j++; + } + codeBlockEndIndex = j; // j now points to the line after closing ``` + const constCode = constCodeLines.join('\n').trim(); + // Match: type constant name = value + // Handle complex types like "bytes32", "uint256", etc. + const constMatch = constCode.match(/(\w+(?:\s*\d+)?)\s+constant\s+(\w+)\s*=\s*(.+?)(?:\s*;)?$/); + if (constMatch) { + constants.push({ + name: constantName, + type: constMatch[1], + value: constMatch[3].trim(), + description: currentConstantDesc, + }); + } else { + // Fallback: if no match, still add constant with name from heading + constants.push({ + name: constantName, + type: '', + value: constCode, + description: currentConstantDesc, + }); + } + break; + } + } + if (!foundCodeBlock) { + // No code block found, but we have a heading - might be a constant without definition + // This shouldn't happen in forge doc output, but handle it gracefully + constants.push({ + name: constantName, + type: '', + value: '', + description: currentConstantDesc, + }); + } else { + // Skip to the end of the code block (the loop will increment i, so we set it to one before) + i = codeBlockEndIndex - 1; + } + continue; + } + + // Collect description (text before code block or after title) + // Skip code block delimiters, empty lines, and markdown table separators + if (!inCodeBlock && trimmedLine && + !trimmedLine.startsWith('#') && + !trimmedLine.startsWith('[') && + !trimmedLine.startsWith('|') && + !trimmedLine.startsWith('```') && + trimmedLine !== '') { + if (itemType !== 'constants' || !line.startsWith('###')) { + descriptionBuffer.push(trimmedLine); + } + continue; + } + + // Parse table rows (Parameters or Returns) + if (trimmedLine.startsWith('|') && 
!trimmedLine.includes('----')) { + const cells = trimmedLine.split('|').map(c => c.trim()).filter(c => c); + + if (cells.length >= 3 && cells[0] !== 'Name' && cells[0] !== 'Parameter') { + const paramName = cells[0].replace(/`/g, '').trim(); + const paramType = cells[1].replace(/`/g, '').trim(); + const paramDesc = sanitizeBrokenLinks(cells[2] || ''); + + // Determine if Parameters or Returns based on preceding lines + const precedingLines = lines.slice(Math.max(0, i - 10), i).join('\n'); + + if (precedingLines.includes('**Returns**')) { + returns.push({ + name: paramName === '' ? '' : paramName, + type: paramType, + description: paramDesc, + }); + } else if (precedingLines.includes('**Parameters**')) { + if (paramType || paramName.startsWith('_')) { + params.push({ + name: paramName, + type: paramType, + description: paramDesc, + }); + } + } + } + } + } + + // Combine description buffer and clean it + if (descriptionBuffer.length > 0) { + description = cleanDescription(sanitizeBrokenLinks(descriptionBuffer.join(' ').trim())); + } + + // For constants, return array of constant objects + if (itemType === 'constants') { + return { + type: 'constants', + constants: constants.length > 0 ? 
constants : [{ + name: itemName || 'Constants', + type: '', + value: '', + description: description, + }], + gitSource: gitSource, + }; + } + + // For structs, use definition instead of signature + if (itemType === 'struct') { + definition = signature; + signature = ''; + } + + // Create item object based on type + const item = { + name: itemName, + description: description, + notice: description, + signature: signature, + definition: definition, + params: params, + returns: returns, + gitSource: gitSource, + }; + + // Add mutability for functions + if (itemType === 'function' && signature) { + if (signature.includes(' view ')) { + item.mutability = 'view'; + } else if (signature.includes(' pure ')) { + item.mutability = 'pure'; + } else if (signature.includes(' payable ')) { + item.mutability = 'payable'; + } else { + item.mutability = 'nonpayable'; + } + } + + return { + type: itemType, + item: item, + }; +} + +/** + * Aggregate multiple parsed items into a single data structure + * @param {Array} parsedItems - Array of parsed item objects from parseIndividualItemFile + * @param {string} sourceFilePath - Path to the source Solidity file + * @returns {object} Aggregated documentation data + */ +function aggregateParsedItems(parsedItems, sourceFilePath) { + const data = { + title: '', + description: '', + subtitle: '', + overview: '', + gitSource: '', + functions: [], + events: [], + errors: [], + structs: [], + stateVariables: [], + }; + + // Extract module name from source file path + const basename = path.basename(sourceFilePath, '.sol'); + data.title = basename; + + // Extract git source from first item + for (const parsed of parsedItems) { + if (parsed && parsed.gitSource) { + data.gitSource = config.normalizeGitSource(parsed.gitSource); + break; + } + } + + // Group items by type + for (const parsed of parsedItems) { + if (!parsed) continue; + + if (parsed.type === 'function' && parsed.item) { + data.functions.push(parsed.item); + } else if (parsed.type === 
'error' && parsed.item) { + data.errors.push(parsed.item); + } else if (parsed.type === 'event' && parsed.item) { + data.events.push(parsed.item); + } else if (parsed.type === 'struct' && parsed.item) { + data.structs.push(parsed.item); + } else if (parsed.type === 'enum' && parsed.item) { + // Enums can be treated as structs for display purposes + data.structs.push(parsed.item); + } else if (parsed.type === 'constants' && parsed.constants) { + // Add constants as state variables + for (const constant of parsed.constants) { + data.stateVariables.push({ + name: constant.name, + type: constant.type, + value: constant.value, + description: constant.description, + }); + } + } + } + + // Set default description if not provided + // Don't use item descriptions as module description - they'll be overridden by source file parsing + if (!data.description || + data.description.includes('Event emitted') || + data.description.includes('Thrown when') || + data.description.includes('function to') || + data.description.length < 20) { + data.description = `Documentation for ${data.title}`; + data.subtitle = data.description; + data.overview = data.description; + } + + return data; +} + +module.exports = { + detectItemTypeFromFilename, + parseIndividualItemFile, + aggregateParsedItems, +}; + diff --git a/.github/scripts/generate-docs-utils/parsing/markdown-parser.js b/.github/scripts/generate-docs-utils/parsing/markdown-parser.js new file mode 100644 index 00000000..f16ade89 --- /dev/null +++ b/.github/scripts/generate-docs-utils/parsing/markdown-parser.js @@ -0,0 +1,236 @@ +/** + * Markdown Parser + * + * Main parser for forge doc markdown output. 
+ */ + +const config = require('../config'); +const { createNewItem, saveCurrentItem } = require('./item-builder'); +const { sanitizeBrokenLinks, cleanDescription } = require('./text-sanitizer'); + +/** + * Parse forge doc markdown output into structured data + * @param {string} content - Markdown content from forge doc + * @param {string} filePath - Path to the markdown file + * @returns {object} Parsed documentation data + */ +function parseForgeDocMarkdown(content, filePath) { + const data = { + title: '', + description: '', + subtitle: '', + overview: '', + gitSource: '', + functions: [], + events: [], + errors: [], + structs: [], + stateVariables: [], + }; + + const lines = content.split('\n'); + let currentSection = null; + let currentItem = null; + let itemType = null; + let collectingDescription = false; + let descriptionBuffer = []; + + for (let i = 0; i < lines.length; i++) { + const line = lines[i]; + const trimmedLine = line.trim(); + + // Parse title (# heading) + if (line.startsWith('# ') && !data.title) { + data.title = line.replace('# ', '').trim(); + continue; + } + + // Parse git source link + if (trimmedLine.startsWith('[Git Source]')) { + const match = trimmedLine.match(/\[Git Source\]\((.*?)\)/); + if (match) { + data.gitSource = config.normalizeGitSource(match[1]); + } + continue; + } + + // Parse description (first non-empty lines after title, before sections) + if (data.title && !currentSection && trimmedLine && !line.startsWith('#') && !line.startsWith('[')) { + const sanitizedLine = cleanDescription(sanitizeBrokenLinks(trimmedLine)); + if (!data.description) { + data.description = sanitizedLine; + data.subtitle = sanitizedLine; + } else if (!data.overview) { + // Capture additional lines as overview + data.overview = data.description + '\n\n' + sanitizedLine; + } + continue; + } + + // Parse main sections + if (line.startsWith('## ')) { + const sectionName = line.replace('## ', '').trim().toLowerCase(); + + // Save current item before 
switching sections + if (currentItem) { + saveCurrentItem(data, currentItem, itemType); + currentItem = null; + itemType = null; + } + + if (sectionName === 'functions') { + currentSection = 'functions'; + } else if (sectionName === 'events') { + currentSection = 'events'; + } else if (sectionName === 'errors') { + currentSection = 'errors'; + } else if (sectionName === 'structs') { + currentSection = 'structs'; + } else if (sectionName === 'state variables') { + currentSection = 'stateVariables'; + } else { + currentSection = null; + } + continue; + } + + // Parse item definitions (### heading) + if (line.startsWith('### ') && currentSection) { + // Save previous item + if (currentItem) { + saveCurrentItem(data, currentItem, itemType); + } + + const name = line.replace('### ', '').trim(); + itemType = currentSection; + currentItem = createNewItem(name, currentSection); + collectingDescription = true; + descriptionBuffer = []; + continue; + } + + // Process content within current item + if (currentItem) { + // Code block with signature + if (line.startsWith('```solidity')) { + const codeLines = []; + i++; + while (i < lines.length && !lines[i].startsWith('```')) { + codeLines.push(lines[i]); + i++; + } + const codeContent = codeLines.join('\n').trim(); + + if (currentSection === 'functions' || currentSection === 'events' || currentSection === 'errors') { + currentItem.signature = codeContent; + + // Extract mutability from signature + if (codeContent.includes(' view ')) { + currentItem.mutability = 'view'; + } else if (codeContent.includes(' pure ')) { + currentItem.mutability = 'pure'; + } else if (codeContent.includes(' payable ')) { + currentItem.mutability = 'payable'; + } + } else if (currentSection === 'structs') { + currentItem.definition = codeContent; + } else if (currentSection === 'stateVariables') { + // Extract type and value from constant definition + // Format: "bytes32 constant NAME = value;" or "bytes32 NAME = value;" + // Handle both with and 
without "constant" keyword + // Note: name is already known from the ### heading, so we just need type and value + const constantMatch = codeContent.match(/(\w+(?:\s*\d+)?)\s+(?:constant\s+)?\w+\s*=\s*(.+?)(?:\s*;)?$/); + if (constantMatch) { + currentItem.type = constantMatch[1]; + currentItem.value = constantMatch[2].trim(); + } else { + // Fallback: try to extract just the value part if it's a simple assignment + const simpleMatch = codeContent.match(/=\s*(.+?)(?:\s*;)?$/); + if (simpleMatch) { + currentItem.value = simpleMatch[1].trim(); + } + // Try to extract type from the beginning + const typeMatch = codeContent.match(/^(\w+(?:\s*\d+)?)\s+/); + if (typeMatch) { + currentItem.type = typeMatch[1]; + } + } + } + continue; + } + + // Description text (before **Parameters** or **Returns**) + if (collectingDescription && trimmedLine && !trimmedLine.startsWith('**') && !trimmedLine.startsWith('|')) { + descriptionBuffer.push(trimmedLine); + continue; + } + + // End description collection on special markers + if (trimmedLine.startsWith('**Parameters**') || trimmedLine.startsWith('**Returns**')) { + if (descriptionBuffer.length > 0) { + const description = cleanDescription(sanitizeBrokenLinks(descriptionBuffer.join(' ').trim())); + currentItem.description = description; + currentItem.notice = description; + descriptionBuffer = []; + } + collectingDescription = false; + } + + // Parse table rows (Parameters or Returns) + if (trimmedLine.startsWith('|') && !trimmedLine.includes('----')) { + const cells = trimmedLine.split('|').map(c => c.trim()).filter(c => c); + + // Skip header row + if (cells.length >= 3 && cells[0] !== 'Name' && cells[0] !== 'Parameter') { + const paramName = cells[0].replace(/`/g, '').trim(); + const paramType = cells[1].replace(/`/g, '').trim(); + const paramDesc = sanitizeBrokenLinks(cells[2] || ''); + + // Skip if parameter name matches the function name (parsing error) + if (currentItem && paramName === currentItem.name) { + continue; + } + + 
// Determine if Parameters or Returns based on preceding lines + const precedingLines = lines.slice(Math.max(0, i - 10), i).join('\n'); + + if (precedingLines.includes('**Returns**')) { + currentItem.returns = currentItem.returns || []; + currentItem.returns.push({ + name: paramName === '' ? '' : paramName, + type: paramType, + description: paramDesc, + }); + } else if (precedingLines.includes('**Parameters**')) { + // Only add if it looks like a valid parameter (has a type or starts with underscore) + if (paramType || paramName.startsWith('_')) { + currentItem.params = currentItem.params || []; + currentItem.params.push({ + name: paramName, + type: paramType, + description: paramDesc, + }); + } + } + } + } + } + } + + // Save last item + if (currentItem) { + saveCurrentItem(data, currentItem, itemType); + } + + // Ensure overview is set + if (!data.overview) { + data.overview = data.description || `Documentation for ${data.title}.`; + } + + return data; +} + +module.exports = { + parseForgeDocMarkdown, +}; + diff --git a/.github/scripts/generate-docs-utils/parsing/storage-extractor.js b/.github/scripts/generate-docs-utils/parsing/storage-extractor.js new file mode 100644 index 00000000..e9f9181a --- /dev/null +++ b/.github/scripts/generate-docs-utils/parsing/storage-extractor.js @@ -0,0 +1,37 @@ +/** + * Storage Extractor + * + * Functions for extracting storage information from parsed data. 
+ */ + +/** + * Extract storage information from parsed data + * @param {object} data - Parsed documentation data + * @returns {string | null} Storage information or null + */ +function extractStorageInfo(data) { + // Look for STORAGE_POSITION in state variables + const storageVar = data.stateVariables.find(v => + v.name.includes('STORAGE') || v.name.includes('storage') + ); + + if (storageVar) { + return `Storage position: \`${storageVar.name}\` - ${storageVar.description || 'Used for diamond storage pattern.'}`; + } + + // Look for storage struct + const storageStruct = data.structs.find(s => + s.name.includes('Storage') + ); + + if (storageStruct) { + return `Uses the \`${storageStruct.name}\` struct following the ERC-8042 diamond storage pattern.`; + } + + return null; +} + +module.exports = { + extractStorageInfo, +}; + diff --git a/.github/scripts/generate-docs-utils/parsing/text-sanitizer.js b/.github/scripts/generate-docs-utils/parsing/text-sanitizer.js new file mode 100644 index 00000000..a4220c80 --- /dev/null +++ b/.github/scripts/generate-docs-utils/parsing/text-sanitizer.js @@ -0,0 +1,66 @@ +/** + * Text Sanitizer + * + * Functions for cleaning and sanitizing text content. + */ + +/** + * Sanitize markdown links that point to non-existent files + * Removes or converts broken links to plain text + * @param {string} text - Text that may contain markdown links + * @returns {string} Sanitized text + */ +function sanitizeBrokenLinks(text) { + if (!text) return text; + + // Remove markdown links that point to /src/ paths (forge doc links) + // Pattern: [text](/src/...) 
+ return text.replace(/\[([^\]]+)\]\(\/src\/[^\)]+\)/g, '$1'); +} + +/** + * Clean description text by removing markdown artifacts + * Strips **Parameters**, **Returns**, **Note:** and other section markers + * that get incorrectly included in descriptions from forge doc output + * @param {string} text - Description text that may contain markdown artifacts + * @returns {string} Cleaned description text + */ +function cleanDescription(text) { + if (!text) return text; + + let cleaned = text; + + // Remove markdown section headers that shouldn't be in descriptions + // These patterns appear when forge doc parsing doesn't stop at section boundaries + const artifactPatterns = [ + /\s*\*\*Parameters\*\*\s*/g, + /\s*\*\*Returns\*\*\s*/g, + /\s*\*\*Note:\*\*\s*/g, + /\s*\*\*Events\*\*\s*/g, + /\s*\*\*Errors\*\*\s*/g, + /\s*\*\*See Also\*\*\s*/g, + /\s*\*\*Example\*\*\s*/g, + ]; + + for (const pattern of artifactPatterns) { + cleaned = cleaned.replace(pattern, ' '); + } + + // Remove @custom: tags that may leak through (e.g., "@custom:error AccessControlUnauthorizedAccount") + cleaned = cleaned.replace(/@custom:\w+\s+/g, ''); + + // Clean up "error: ErrorName" patterns that appear inline + // Keep the error name but format it better: "error: ErrorName If..." -> "Reverts with ErrorName if..." 
+ cleaned = cleaned.replace(/\berror:\s+(\w+)\s+/gi, 'Reverts with $1 '); + + // Normalize whitespace: collapse multiple spaces, trim + cleaned = cleaned.replace(/\s+/g, ' ').trim(); + + return cleaned; +} + +module.exports = { + sanitizeBrokenLinks, + cleanDescription, +}; + diff --git a/.github/scripts/generate-docs-utils/pr-body-generator.js b/.github/scripts/generate-docs-utils/pr-body-generator.js new file mode 100644 index 00000000..c37678cb --- /dev/null +++ b/.github/scripts/generate-docs-utils/pr-body-generator.js @@ -0,0 +1,104 @@ +/** + * PR Body Generator + * + * Generates a PR body from the docgen-summary.json file + * + * Usage: + * node pr-body-generator.js [summary-file-path] + * + * Outputs the PR body in GitHub Actions format to stdout + */ + +const fs = require('fs'); +const path = require('path'); + +/** + * Generate PR body from summary data + * @param {Object} summary - Summary data from docgen-summary.json + * @returns {string} PR body markdown + */ +function generatePRBody(summary) { + const facets = summary.facets || []; + const modules = summary.modules || []; + const total = summary.totalGenerated || 0; + + let body = '## Auto-Generated Docs Pages\n\n'; + body += 'This PR contains auto-generated documentation from contract comments using `forge doc`. '; + body += 'The output is passed through AI to enhance the documentation content and add additional information.\n\n'; + body += '**Please ALWAYS review the generated content and ensure it is accurate and complete to Compose Standards.**\n'; + + + body += '### Summary\n'; + body += `- **Total generated:** ${total} files\n\n`; + + if (facets.length > 0) { + body += '### Facets\n'; + facets.forEach(facet => { + body += `- ${facet.title}\n`; + }); + body += '\n'; + } + + if (modules.length > 0) { + body += '### Modules\n'; + modules.forEach(module => { + body += `- ${module.title}\n`; + }); + body += '\n'; + } + + body += '### What was done\n'; + body += '1. 
Extracted NatSpec using `forge doc`\n';
+  body += '2. Converted to Docusaurus MDX format\n';
+  body += '3. Enhanced content with GitHub Copilot (optional)\n';
+  body += '4. Verified documentation build\n\n';
+
+  body += '### Review Checklist\n';
+  body += '- [ ] Review generated content for accuracy\n';
+  body += '- [ ] Verify code examples are correct\n';
+  body += '- [ ] Check for any missing documentation\n';
+  body += '- [ ] Ensure consistency with existing docs\n\n';
+
+  body += '---\n';
+  body += '🚨 **This PR was automatically generated. Please ALWAYS review before merging.**\n';
+  body += `Generated on: ${new Date().toISOString()}\n`;
+
+  return body;
+}
+
+/**
+ * Main function
+ */
+function main() {
+  const summaryPath = process.argv[2] || 'docgen-summary.json';
+
+  if (!fs.existsSync(summaryPath)) {
+    console.error(`Error: Summary file not found: ${summaryPath}`);
+    process.exit(1);
+  }
+
+  try {
+    const summaryContent = fs.readFileSync(summaryPath, 'utf8');
+    const summary = JSON.parse(summaryContent);
+
+    const prBody = generatePRBody(summary);
+
+    // Output in GitHub Actions multiline format (body<<EOF ... EOF)
+    console.log('body<<EOF');
+    console.log(prBody);
+    console.log('EOF');
+  } catch (error) {
+    console.error(`Error generating PR body: ${error.message}`);
+    process.exit(1);
+  }
+}
+
+main();

[diff header truncated — the following MDX escaping utilities belong to a separate module]

+/**
+ * Sanitize text for MDX output by escaping characters that MDX parses as JSX
+ * @param {string} str - String to sanitize
+ * @returns {string} Sanitized string safe for MDX
+ */
+function sanitizeForMdx(str) {
+  if (!str) return '';
+  return String(str)
+    .replace(/</g, '&lt;')
+    .replace(/>/g, '&gt;')
+    .replace(/\{/g, '&#123;')
+    .replace(/\}/g, '&#125;');
+}
+
+/**
+ * Convert object/array to a safe JavaScript expression for JSX attributes
+ * Returns the value wrapped in curly braces for direct use in JSX: {value}
+ * @param {*} obj - Value to convert
+ * @returns {string} JSX expression with curly braces: {JSON}
+ */
+function toJsxExpression(obj) {
+  if (obj == null) return '{null}';
+
+  try {
+    let jsonStr = JSON.stringify(obj);
+    // Ensure single line
+    jsonStr = jsonStr.replace(/[\n\r]/g, ' ').replace(/\s+/g, ' ').trim();
+    // Verify it's valid JSON
+    JSON.parse(jsonStr);
+    // Return with JSX curly braces included
+    return `{${jsonStr}}`;
+  } catch (e) {
+    console.warn('Invalid JSON generated:', e.message);
+    return Array.isArray(obj) ?
'{[]}' : '{{}}';
+  }
+}
+
+/**
+ * Escape special characters for JSX string attributes
+ * @param {string} str - String to escape
+ * @returns {string} Escaped string safe for JSX attributes
+ */
+function escapeJsx(str) {
+  if (!str) return '';
+
+  return sanitizeForMdx(str)
+    .replace(/\\/g, '\\\\')
+    .replace(/"/g, '\\"')
+    .replace(/'/g, "\\'")
+    .replace(/\n/g, ' ')
+    .replace(/\{/g, '&#123;')
+    .replace(/\}/g, '&#125;')
+    // Don't escape backticks - they should be preserved for code formatting
+    .trim();
+}
+
+/**
+ * Escape markdown table special characters
+ * @param {string} str - String to escape
+ * @returns {string} Escaped string safe for markdown tables
+ */
+function escapeMarkdownTable(str) {
+  if (!str) return '';
+  return str
+    .replace(/\|/g, '\\|')
+    .replace(/\n/g, ' ')
+    .replace(/\{/g, '&#123;')
+    .replace(/\}/g, '&#125;');
+}
+
+/**
+ * Escape HTML entities for safe display
+ * @param {string} str - String to escape
+ * @returns {string} HTML-escaped string
+ */
+function escapeHtml(str) {
+  if (!str) return '';
+  return String(str)
+    .replace(/&/g, '&amp;')
+    .replace(/</g, '&lt;')
+    .replace(/>/g, '&gt;')
+    .replace(/"/g, '&quot;')
+    .replace(/'/g, '&#39;');
+}
+
+/**
+ * Escape string for use in JavaScript/JSX object literal values
+ * Escapes quotes and backslashes for JavaScript strings (not HTML entities)
+ * Preserves backticks for code formatting
+ * @param {string} str - String to escape
+ * @returns {string} Escaped string safe for JavaScript string literals
+ */
+function escapeJsString(str) {
+  if (!str) return '';
+  return String(str)
+    .replace(/\\/g, '\\\\') // Escape backslashes first
+    .replace(/"/g, '\\"') // Escape double quotes
+    .replace(/'/g, "\\'") // Escape single quotes
+    .replace(/\n/g, '\\n') // Escape newlines
+    .replace(/\r/g, '\\r') // Escape carriage returns
+    .replace(/\t/g, '\\t'); // Escape tabs
+  // Note: Backticks are preserved for code formatting in descriptions
+}
+
+/**
+ * Escape string for JSX string attributes, preserving backticks for code formatting
+ * This
is specifically for descriptions that may contain code with backticks
+ * @param {string} str - String to escape
+ * @returns {string} Escaped string safe for JSX string attributes with preserved backticks
+ */
+function escapeJsxPreserveBackticks(str) {
+  if (!str) return '';
+
+  // Don't use sanitizeForMdx as it might HTML-escape things
+  // Just escape what's needed for JSX string attributes
+  return String(str)
+    .replace(/\\/g, '\\\\') // Escape backslashes first
+    .replace(/"/g, '\\"') // Escape double quotes for JSX strings
+    .replace(/'/g, "\\'") // Escape single quotes
+    .replace(/\n/g, ' ') // Replace newlines with spaces
+    .replace(/\{/g, '&#123;') // Escape curly braces for JSX
+    .replace(/\}/g, '&#125;') // Escape curly braces for JSX
+    // Preserve backticks - don't escape them, they're needed for code formatting
+    .trim();
+}
+
+module.exports = {
+  escapeYaml,
+  escapeJsx,
+  escapeJsxPreserveBackticks,
+  sanitizeForMdx,
+  sanitizeMdx: sanitizeForMdx, // Alias for template usage
+  toJsxExpression,
+  escapeMarkdownTable,
+  escapeHtml,
+  escapeJsString,
+};

diff --git a/.github/scripts/generate-docs-utils/templates/package-lock.json b/.github/scripts/generate-docs-utils/templates/package-lock.json
new file mode 100644
index 00000000..b1877ecc
--- /dev/null
+++ b/.github/scripts/generate-docs-utils/templates/package-lock.json
@@ -0,0 +1,79 @@
+{
+  "name": "compose-doc-templates",
+  "version": "1.0.0",
+  "lockfileVersion": 3,
+  "requires": true,
+  "packages": {
+    "": {
+      "name": "compose-doc-templates",
+      "version": "1.0.0",
+      "dependencies": {
+        "handlebars": "^4.7.8"
+      }
+    },
+    "node_modules/handlebars": {
+      "version": "4.7.8",
+      "resolved": "https://registry.npmjs.org/handlebars/-/handlebars-4.7.8.tgz",
+      "integrity": "sha512-vafaFqs8MZkRrSX7sFVUdo3ap/eNiLnb4IakshzvP56X5Nr1iGKAIqdX6tMlm6HcNRIkr6AxO5jFEoJzzpT8aQ==",
+      "license": "MIT",
+      "dependencies": {
+        "minimist": "^1.2.5",
+        "neo-async": "^2.6.2",
+        "source-map": "^0.6.1",
+        "wordwrap": "^1.0.0"
+      },
"bin": { + "handlebars": "bin/handlebars" + }, + "engines": { + "node": ">=0.4.7" + }, + "optionalDependencies": { + "uglify-js": "^3.1.4" + } + }, + "node_modules/minimist": { + "version": "1.2.8", + "resolved": "https://registry.npmjs.org/minimist/-/minimist-1.2.8.tgz", + "integrity": "sha512-2yyAR8qBkN3YuheJanUpWC5U3bb5osDywNB8RzDVlDwDHbocAJveqqj1u8+SVD7jkWT4yvsHCpWqqWqAxb0zCA==", + "license": "MIT", + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/neo-async": { + "version": "2.6.2", + "resolved": "https://registry.npmjs.org/neo-async/-/neo-async-2.6.2.tgz", + "integrity": "sha512-Yd3UES5mWCSqR+qNT93S3UoYUkqAZ9lLg8a7g9rimsWmYGK8cVToA4/sF3RrshdyV3sAGMXVUmpMYOw+dLpOuw==", + "license": "MIT" + }, + "node_modules/source-map": { + "version": "0.6.1", + "resolved": "https://registry.npmjs.org/source-map/-/source-map-0.6.1.tgz", + "integrity": "sha512-UjgapumWlbMhkBgzT7Ykc5YXUT46F0iKu8SGXq0bcwP5dz/h0Plj6enJqjz1Zbq2l5WaqYnrVbwWOWMyF3F47g==", + "license": "BSD-3-Clause", + "engines": { + "node": ">=0.10.0" + } + }, + "node_modules/uglify-js": { + "version": "3.19.3", + "resolved": "https://registry.npmjs.org/uglify-js/-/uglify-js-3.19.3.tgz", + "integrity": "sha512-v3Xu+yuwBXisp6QYTcH4UbH+xYJXqnq2m/LtQVWKWzYc1iehYnLixoQDN9FH6/j9/oybfd6W9Ghwkl8+UMKTKQ==", + "license": "BSD-2-Clause", + "optional": true, + "bin": { + "uglifyjs": "bin/uglifyjs" + }, + "engines": { + "node": ">=0.8.0" + } + }, + "node_modules/wordwrap": { + "version": "1.0.0", + "resolved": "https://registry.npmjs.org/wordwrap/-/wordwrap-1.0.0.tgz", + "integrity": "sha512-gvVzJFlPycKc5dZN4yPkP8w7Dc37BtP1yczEneOb4uq34pXZcvrtRTmWV8W+Ume+XCxKgbjM+nevkyFPMybd4Q==", + "license": "MIT" + } + } +} diff --git a/.github/scripts/generate-docs-utils/templates/package.json b/.github/scripts/generate-docs-utils/templates/package.json new file mode 100644 index 00000000..d5425ad4 --- /dev/null +++ b/.github/scripts/generate-docs-utils/templates/package.json @@ -0,0 +1,10 @@ +{ + 
"name": "compose-doc-templates", + "version": "1.0.0", + "private": true, + "description": "Template engine for generating MDX documentation", + "dependencies": { + "handlebars": "^4.7.8" + } +} + diff --git a/.github/scripts/generate-docs-utils/templates/pages/contract.mdx.template b/.github/scripts/generate-docs-utils/templates/pages/contract.mdx.template new file mode 100644 index 00000000..6e408dd5 --- /dev/null +++ b/.github/scripts/generate-docs-utils/templates/pages/contract.mdx.template @@ -0,0 +1,275 @@ +--- +sidebar_position: {{position}} +title: "{{escapeYaml title}}" +description: "{{escapeYaml description}}" +{{#if sidebarLabel}} +sidebar_label: "{{sidebarLabel}}" +{{/if}} +{{#if gitSource}} +gitSource: "{{gitSource}}" +{{/if}} +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + + +{{escapeYaml description}} + + +{{#if keyFeatures}} + +{{{sanitizeMdx keyFeatures}}} + +{{/if}} + +{{#if isModule}} + +This module provides internal functions for use in your custom facets. Import it to access shared logic and storage. 
+ +{{/if}} + +## Overview + +{{{sanitizeMdx overview}}} + +## Storage + +{{#if hasStructs}} +{{#each structs}} +### {{name}} + +{{#if description}} +{{{sanitizeMdx description}}} +{{/if}} + +{{#if definition}} + +{{{codeContent definition}}} + +{{/if}} + +{{#unless @last}} +--- +{{/unless}} +{{/each}} +{{/if}} + +{{#if hasStorage}} + +{{#if hasStateVariables}} +### State Variables + + +{{/if}} +{{/if}} + +{{#if hasFunctions}} +## Functions + +{{#each functions}} +### {{name}} + +{{#if description}} +{{{sanitizeMdx description}}} +{{/if}} + +{{#if signature}} + +{{{codeContent signature}}} + +{{/if}} + +{{#if hasParams}} +**Parameters:** + + +{{/if}} + +{{#if hasReturns}} +**Returns:** + + +{{/if}} + +{{#unless @last}} +--- +{{/unless}} +{{/each}} +{{/if}} + +{{#if hasEvents}} +## Events + + +{{#each events}} + + {{#if description}} +
+ {{{sanitizeMdx description}}} +
+ {{/if}} + + {{#if signature}} +
+ Signature: + +{{{codeContent signature}}} + +
+ {{/if}} + + {{#if hasParams}} +
+ Parameters: + +
+ {{/if}} +
+{{/each}} +
+{{/if}} + +{{#if hasErrors}} +## Errors + + +{{#each errors}} + + {{#if description}} +
+ {{{sanitizeMdx description}}} +
+ {{/if}} + + {{#if signature}} +
+ Signature: + +{{signature}} + +
+ {{/if}} +
+{{/each}} +
+{{/if}} + +{{#if usageExample}} + + +{{/if}} + +{{#if bestPractices}} +## Best Practices + + +{{{sanitizeMdx bestPractices}}} + +{{/if}} + +{{#if isFacet}} +{{#if securityConsiderations}} +## Security Considerations + + +{{{sanitizeMdx securityConsiderations}}} + +{{/if}} +{{/if}} + +{{#if isModule}} +{{#if integrationNotes}} +## Integration Notes + + +{{{sanitizeMdx integrationNotes}}} + +{{/if}} +{{/if}} + +{{#if relatedDocs}} +
+ +
+{{/if}} + +
+ +
+ + diff --git a/.github/scripts/generate-docs-utils/templates/template-engine-handlebars.js b/.github/scripts/generate-docs-utils/templates/template-engine-handlebars.js new file mode 100644 index 00000000..18fb38d0 --- /dev/null +++ b/.github/scripts/generate-docs-utils/templates/template-engine-handlebars.js @@ -0,0 +1,262 @@ +/** + * Handlebars Template Engine for MDX Documentation Generation + * + * Replaces the custom template engine with Handlebars for better reliability + * and proper MDX formatting. + */ + +const Handlebars = require('handlebars'); +const fs = require('fs'); +const path = require('path'); +const helpers = require('./helpers'); + +// Track if helpers have been registered (only register once) +let helpersRegistered = false; + +/** + * Register custom helpers for Handlebars + * All helpers from helpers.js are registered for use in templates + */ +function registerHelpers() { + if (helpersRegistered) return; + + // Register escape helpers + Handlebars.registerHelper('escapeYaml', helpers.escapeYaml); + Handlebars.registerHelper('escapeJsx', helpers.escapeJsx); + // Helper to escape JSX strings while preserving backticks for code formatting + Handlebars.registerHelper('escapeJsxPreserveBackticks', function(value) { + if (!value) return ''; + const escaped = helpers.escapeJsxPreserveBackticks(value); + // Return as SafeString to prevent Handlebars from HTML-escaping backticks + return new Handlebars.SafeString(escaped); + }); + Handlebars.registerHelper('sanitizeMdx', helpers.sanitizeMdx); + Handlebars.registerHelper('escapeMarkdownTable', helpers.escapeMarkdownTable); + // Helper to escape value for JavaScript strings in JSX object literals + Handlebars.registerHelper('escapeJsString', function(value) { + if (!value) return ''; + const escaped = helpers.escapeJsString(value); + return new Handlebars.SafeString(escaped); + }); + + // Helper to emit a JSX style literal: returns a string like {{display: "flex", gap: "1rem"}} + 
Handlebars.registerHelper('styleLiteral', function(styles) { + if (!styles || typeof styles !== 'string') return '{{}}'; + + const styleObj = {}; + const pairs = styles.split(';').filter(pair => pair.trim()); + + pairs.forEach(pair => { + const [key, value] = pair.split(':').map(s => s.trim()); + if (key && value !== undefined) { + const camelKey = key.includes('-') + ? key.replace(/-([a-z])/g, (_, c) => c.toUpperCase()) + : key; + const cleanValue = value.replace(/^["']|["']$/g, ''); + styleObj[camelKey] = cleanValue; + } + }); + + const entries = Object.entries(styleObj).map(([k, v]) => { + const isPureNumber = /^-?\d+\.?\d*$/.test(v.trim()); + // Quote everything except pure numbers + const valueLiteral = isPureNumber ? v : JSON.stringify(v); + return `${k}: ${valueLiteral}`; + }).join(', '); + + // Wrap with double braces so MDX sees style={{...}} + return `{{${entries}}}`; + }); + + // Helper to wrap code content in template literal for MDX + // This ensures MDX treats the content as a string, not JSX to parse + Handlebars.registerHelper('codeContent', function(content) { + if (!content) return '{``}'; + // Escape backticks in the content + const escaped = String(content).replace(/`/g, '\\`').replace(/\$/g, '\\$'); + return `{\`${escaped}\`}`; + }); + + // Helper to generate JSX style object syntax + // Accepts CSS string and converts to JSX object format + // Handles both kebab-case (margin-bottom) and camelCase (marginBottom) + Handlebars.registerHelper('jsxStyle', function(styles) { + if (!styles || typeof styles !== 'string') return '{}'; + + // Parse CSS string like "display: flex; margin-bottom: 1rem;" or "marginBottom: 1rem" + const styleObj = {}; + const pairs = styles.split(';').filter(pair => pair.trim()); + + pairs.forEach(pair => { + const [key, value] = pair.split(':').map(s => s.trim()); + if (key && value) { + // Convert kebab-case to camelCase if needed + const camelKey = key.includes('-') + ? 
key.replace(/-([a-z])/g, (g) => g[1].toUpperCase())
+          : key;
+        // Remove quotes from value if present
+        const cleanValue = value.replace(/^["']|["']$/g, '');
+        styleObj[camelKey] = cleanValue;
+      }
+    });
+
+    // Convert to JSX object string with proper quoting
+    // All CSS values should be quoted as strings unless they're pure numbers
+    const entries = Object.entries(styleObj)
+      .map(([k, v]) => {
+        // Check if it's a pure number (integer or decimal without units)
+        if (/^-?\d+\.?\d*$/.test(v.trim())) {
+          return `${k}: ${v}`;
+        }
+        // Everything else (CSS units like "0.75rem", CSS vars, etc.) should be quoted
+        return `${k}: ${JSON.stringify(v)}`;
+      })
+      .join(', ');
+
+    // Return the object wrapped in double braces. Templates must emit the
+    // result unescaped (triple-stash: style={{{jsxStyle styles}}}), which
+    // renders as style={{...}} without Handlebars HTML-escaping the quotes.
+    return `{{${entries}}}`;
+  });
+
+  // Custom helper for explicit null/empty handling. Handlebars' default #if
+  // already treats '' and [] as falsy; this also treats whitespace-only
+  // strings and empty objects as falsy.
+  Handlebars.registerHelper('ifTruthy', function(value, options) {
+    if (value != null &&
+        !(Array.isArray(value) && value.length === 0) &&
+        !(typeof value === 'string' && value.trim().length === 0) &&
+        !(typeof value === 'object' && Object.keys(value).length === 0)) {
+      return options.fn(this);
+    }
+    return options.inverse(this);
+  });
+
+  helpersRegistered = true;
+}
+
+/**
+ * Normalize MDX formatting to ensure proper blank lines
+ * MDX requires blank lines between:
+ * - Import statements and JSX
+ * - JSX components and markdown
+ * - JSX components and other JSX
+ *
+ * @param {string} content - MDX content to normalize
+ * @returns {string} Properly formatted MDX
+ */
+function normalizeMdxFormatting(content) {
+  if (!content) return '';
+
+  let normalized = content;
+
+  // 1. Ensure blank line after import statements (before JSX/markdown)
+  normalized = normalized.replace(/^(import .+;)\n(?!(import |\n))/gm, '$1\n\n');
+
+  // 2. Ensure blank line after self-closing JSX tags (before headings)
+  normalized = normalized.replace(/(\/>)\n(##)/g, '$1\n\n$2');
+
+  // 3. Ensure blank line after self-closing JSX tags (before other JSX)
+  normalized = normalized.replace(/(\/>)\n(<[A-Z])/g, '$1\n\n$2');
+
+  // 4. Ensure blank line after JSX closing tags (before markdown or JSX)
+  normalized = normalized.replace(/(<\/[A-Z][a-zA-Z]+>)\n(##|[A-Z])/g, '$1\n\n$2');
+
+  // 5. Strip trailing whitespace from each line
+  normalized = normalized.split('\n').map(line => line.trimEnd()).join('\n');
+
+  // 6. 
Ensure file ends with single newline + normalized = normalized.trimEnd() + '\n'; + + return normalized; +} + +/** + * List available template files + * @returns {string[]} Array of template names (without extension) + */ +function listAvailableTemplates() { + const templatesDir = path.join(__dirname, 'pages'); + try { + return fs.readdirSync(templatesDir) + .filter(f => f.endsWith('.mdx.template')) + .map(f => f.replace('.mdx.template', '')); + } catch (e) { + return []; + } +} + +/** + * Load and render a template file with Handlebars + * @param {string} templateName - Name of template (without extension) + * @param {object} data - Data to render + * @returns {string} Rendered template with proper MDX formatting + * @throws {Error} If template cannot be loaded + */ +function loadAndRenderTemplate(templateName, data) { + const templatePath = path.join(__dirname, 'pages', `${templateName}.mdx.template`); + + if (!fs.existsSync(templatePath)) { + const available = listAvailableTemplates(); + throw new Error( + `Template '${templateName}' not found at: ${templatePath}\n` + + `Available templates: ${available.length > 0 ? 
available.join(', ') : 'none'}` + ); + } + + // Register helpers (only once, but safe to call multiple times) + registerHelpers(); + + try { + // Load template + const templateContent = fs.readFileSync(templatePath, 'utf8'); + + // Compile template with Handlebars + const template = Handlebars.compile(templateContent); + + // Render with data + let rendered = template(data); + + // Post-process: normalize MDX formatting + rendered = normalizeMdxFormatting(rendered); + + return rendered; + } catch (error) { + if (error.message.includes('Parse error')) { + throw new Error( + `Template parsing error in ${templateName}: ${error.message}\n` + + `Template path: ${templatePath}` + ); + } + throw error; + } +} + +module.exports = { + loadAndRenderTemplate, + registerHelpers, + listAvailableTemplates, +}; + diff --git a/.github/scripts/generate-docs-utils/templates/template-engine.js b/.github/scripts/generate-docs-utils/templates/template-engine.js new file mode 100644 index 00000000..b8e2ed22 --- /dev/null +++ b/.github/scripts/generate-docs-utils/templates/template-engine.js @@ -0,0 +1,366 @@ +/** + * Simple Template Engine + * + * A lightweight template engine with no external dependencies. 
+ * + * Supports: + * - Variable substitution: {{variable}} (HTML escaped) + * - Unescaped output: {{{variable}}} (raw output) + * - Conditionals: {{#if variable}}...{{/if}} + * - Loops: {{#each array}}...{{/each}} + * - Helper functions: {{helperName variable}} or {{helperName(arg1, arg2)}} + * - Dot notation: {{object.property.nested}} + */ + +const fs = require('fs'); +const path = require('path'); + +// Import helpers from separate module +const helpers = require('./helpers'); +const { escapeHtml } = helpers; + +/** + * Get value from object using dot notation path + * @param {object} obj - Object to get value from + * @param {string} dotPath - Dot notation path (e.g., "user.name") + * @returns {*} Value at path or undefined + */ +function getValue(obj, dotPath) { + if (!dotPath || !obj) return undefined; + + const parts = dotPath.split('.'); + let value = obj; + + for (const part of parts) { + if (value == null) return undefined; + value = value[part]; + } + + return value; +} + +/** + * Check if a value is truthy for template conditionals + * - null/undefined → false + * - empty array → false + * - empty object → false + * - empty string → false + * - false → false + * - everything else → true + * + * @param {*} value - Value to check + * @returns {boolean} Whether value is truthy + */ +function isTruthy(value) { + if (value == null) return false; + if (Array.isArray(value)) return value.length > 0; + if (typeof value === 'object') return Object.keys(value).length > 0; + if (typeof value === 'string') return value.trim().length > 0; + return Boolean(value); +} + +/** + * Process a helper function call + * @param {string} helperName - Name of the helper + * @param {string[]} args - Argument strings (variable paths or literals) + * @param {object} context - Current template context + * @param {object} helperRegistry - Registry of helper functions + * @returns {string} Result of helper function + */ +function processHelper(helperName, args, context, 
helperRegistry) { + const helper = helperRegistry[helperName]; + if (!helper) { + console.warn(`Unknown template helper: ${helperName}`); + return ''; + } + + // Process arguments - can be variable paths or quoted literals + const processedArgs = args.map(arg => { + arg = arg.trim(); + // Check for quoted literal strings + if ((arg.startsWith('"') && arg.endsWith('"')) || + (arg.startsWith("'") && arg.endsWith("'"))) { + return arg.slice(1, -1); + } + // Otherwise treat as variable path + return getValue(context, arg); + }); + + return helper(...processedArgs); +} + +/** + * Process a variable expression (helper or simple variable) + * @param {string} expression - The expression inside {{ }} + * @param {object} context - Current template context + * @param {boolean} escapeOutput - Whether to HTML-escape the output + * @param {object} helperRegistry - Registry of helper functions + * @returns {string} Processed value + */ +function processExpression(expression, context, escapeOutput, helperRegistry) { + const expr = expression.trim(); + + // Check for helper with parentheses: helperName(arg1, arg2) + const parenMatch = expr.match(/^(\w+)\((.*)\)$/); + if (parenMatch) { + const [, helperName, argsStr] = parenMatch; + const args = argsStr ? argsStr.split(',').map(a => a.trim()) : []; + return processHelper(helperName, args, context, helperRegistry); + } + + // Check for helper with space: helperName variable + const spaceMatch = expr.match(/^(\w+)\s+(.+)$/); + if (spaceMatch && helperRegistry[spaceMatch[1]]) { + const [, helperName, arg] = spaceMatch; + return processHelper(helperName, [arg], context, helperRegistry); + } + + // Regular variable lookup + const value = getValue(context, expr); + if (value == null) return ''; + + const str = String(value); + return escapeOutput ? 
escapeHtml(str) : str; +} + +/** + * Find the matching closing tag for a block, handling nesting + * @param {string} content - Content to search + * @param {string} openTag - Opening tag pattern (e.g., '#if', '#each') + * @param {string} closeTag - Closing tag (e.g., '/if', '/each') + * @param {number} startPos - Position after the opening tag + * @returns {number} Position of the matching closing tag, or -1 if not found + */ +function findMatchingClose(content, openTag, closeTag, startPos) { + let depth = 1; + let pos = startPos; + + const openPattern = new RegExp(`\\{\\{${openTag}\\s+[^}]+\\}\\}`, 'g'); + const closePattern = new RegExp(`\\{\\{${closeTag}\\}\\}`, 'g'); + + while (depth > 0 && pos < content.length) { + // Find next open and close tags + openPattern.lastIndex = pos; + closePattern.lastIndex = pos; + + const openMatch = openPattern.exec(content); + const closeMatch = closePattern.exec(content); + + if (!closeMatch) { + return -1; // No matching close found + } + + // If open comes before close, increase depth + if (openMatch && openMatch.index < closeMatch.index) { + depth++; + pos = openMatch.index + openMatch[0].length; + } else { + depth--; + if (depth === 0) { + return closeMatch.index; + } + pos = closeMatch.index + closeMatch[0].length; + } + } + + return -1; +} + +/** + * Process nested conditionals: {{#if variable}}...{{/if}} + * @param {string} content - Template content + * @param {object} context - Data context + * @param {object} helperRegistry - Registry of helper functions + * @returns {string} Processed content + */ +function processConditionals(content, context, helperRegistry) { + let result = content; + const openPattern = /\{\{#if\s+([^}]+)\}\}/g; + + let match; + while ((match = openPattern.exec(result)) !== null) { + const condition = match[1].trim(); + const startPos = match.index; + const afterOpen = startPos + match[0].length; + + const closePos = findMatchingClose(result, '#if', '/if', afterOpen); + if (closePos === -1) { + 
console.warn(`Unmatched {{#if ${condition}}} at position ${startPos}`); + break; + } + + const ifContent = result.substring(afterOpen, closePos); + const closeEndPos = closePos + '{{/if}}'.length; + + // Evaluate condition and get replacement + const value = getValue(context, condition); + const replacement = isTruthy(value) + ? processContent(ifContent, context, helperRegistry) + : ''; + + // Replace in result + result = result.substring(0, startPos) + replacement + result.substring(closeEndPos); + + // Reset pattern to start from beginning since we modified the string + openPattern.lastIndex = 0; + } + + return result; +} + +/** + * Process nested loops: {{#each array}}...{{/each}} + * @param {string} content - Template content + * @param {object} context - Data context + * @param {object} helperRegistry - Registry of helper functions + * @returns {string} Processed content + */ +function processLoops(content, context, helperRegistry) { + let result = content; + const openPattern = /\{\{#each\s+([^}]+)\}\}/g; + + let match; + while ((match = openPattern.exec(result)) !== null) { + const arrayPath = match[1].trim(); + const startPos = match.index; + const afterOpen = startPos + match[0].length; + + const closePos = findMatchingClose(result, '#each', '/each', afterOpen); + if (closePos === -1) { + console.warn(`Unmatched {{#each ${arrayPath}}} at position ${startPos}`); + break; + } + + const loopContent = result.substring(afterOpen, closePos); + const closeEndPos = closePos + '{{/each}}'.length; + + // Get array and process each item + const array = getValue(context, arrayPath); + let replacement = ''; + + if (Array.isArray(array) && array.length > 0) { + replacement = array.map((item, index) => { + const itemContext = { ...context, ...item, index }; + return processContent(loopContent, itemContext, helperRegistry); + }).join(''); + } + + // Replace in result + result = result.substring(0, startPos) + replacement + result.substring(closeEndPos); + + // Reset 
pattern to start from beginning since we modified the string + openPattern.lastIndex = 0; + } + + return result; +} + +/** + * Process template content with the given context + * Handles all variable substitutions, helpers, conditionals, and loops + * + * IMPORTANT: Processing order matters! + * 1. Loops first - so nested conditionals are evaluated with correct item context + * 2. Conditionals second - after loops have expanded their content + * 3. Variables last - after all control structures are resolved + * + * @param {string} content - Template content to process + * @param {object} context - Data context + * @param {object} helperRegistry - Registry of helper functions + * @returns {string} Processed content + */ +function processContent(content, context, helperRegistry) { + let result = content; + + // 1. Process loops FIRST (handles nesting properly) + result = processLoops(result, context, helperRegistry); + + // 2. Process conditionals SECOND (handles nesting properly) + result = processConditionals(result, context, helperRegistry); + + // 3. Process triple braces for unescaped output: {{{variable}}} + const tripleBracePattern = /\{\{\{([^}]+)\}\}\}/g; + result = result.replace(tripleBracePattern, (match, expr) => { + return processExpression(expr, context, false, helperRegistry); + }); + + // 4. 
Process double braces for escaped output: {{variable}} + const doubleBracePattern = /\{\{([^}]+)\}\}/g; + result = result.replace(doubleBracePattern, (match, expr) => { + return processExpression(expr, context, true, helperRegistry); + }); + + return result; +} + +/** + * Render a template string with data + * @param {string} template - Template string + * @param {object} data - Data to render + * @returns {string} Rendered template + */ +function renderTemplate(template, data) { + if (!template) return ''; + if (!data) data = {}; + + return processContent(template, { ...data }, helpers); +} + +/** + * List available template files + * @returns {string[]} Array of template names (without extension) + */ +function listAvailableTemplates() { + const templatesDir = path.join(__dirname, 'pages'); + try { + return fs.readdirSync(templatesDir) + .filter(f => f.endsWith('.mdx.template')) + .map(f => f.replace('.mdx.template', '')); + } catch (e) { + return []; + } +} + +/** + * Load and render a template file + * @param {string} templateName - Name of template (without extension) + * @param {object} data - Data to render + * @returns {string} Rendered template + * @throws {Error} If template cannot be loaded + */ +function loadAndRenderTemplate(templateName, data) { + console.log('Loading template:', templateName); + console.log('Data:', data); + + const templatePath = path.join(__dirname, 'pages', `${templateName}.mdx.template`); + + try { + if (!fs.existsSync(templatePath)) { + const available = listAvailableTemplates(); + throw new Error( + `Template '${templateName}' not found at: ${templatePath}\n` + + `Available templates: ${available.length > 0 ? 
available.join(', ') : 'none'}` + ); + } + + const template = fs.readFileSync(templatePath, 'utf8'); + return renderTemplate(template, data); + } catch (error) { + if (error.code === 'ENOENT') { + const available = listAvailableTemplates(); + throw new Error( + `Template file not found: ${templatePath}\n` + + `Available templates: ${available.length > 0 ? available.join(', ') : 'none'}` + ); + } + throw error; + } +} + +module.exports = { + renderTemplate, + loadAndRenderTemplate, + getValue, + isTruthy, + listAvailableTemplates, +}; diff --git a/.github/scripts/generate-docs-utils/templates/templates.js b/.github/scripts/generate-docs-utils/templates/templates.js new file mode 100644 index 00000000..4bc33658 --- /dev/null +++ b/.github/scripts/generate-docs-utils/templates/templates.js @@ -0,0 +1,720 @@ +/** + * MDX Templates for Docusaurus documentation + * Uses Handlebars template engine for reliable MDX generation + */ + +const { loadAndRenderTemplate } = require('./template-engine-handlebars'); +const { sanitizeForMdx } = require('./helpers'); +const { readFileSafe } = require('../../workflow-utils'); +const { enrichWithRelationships } = require('../core/relationship-detector'); +const { getSidebarLabel, formatDisplayTitle } = require('../core/description-generator'); + +/** + * Extract function parameters directly from Solidity source file + * @param {string} sourceFilePath - Path to the Solidity source file + * @param {string} functionName - Name of the function to extract parameters from + * @returns {Array} Array of parameter objects with name and type + */ +function extractParamsFromSource(sourceFilePath, functionName) { + if (!sourceFilePath || !functionName) return []; + + const sourceContent = readFileSafe(sourceFilePath); + if (!sourceContent) { + if (process.env.DEBUG_PARAMS) { + console.log(`[DEBUG] Could not read source file: ${sourceFilePath}`); + } + return []; + } + + // Remove comments to avoid parsing issues + const withoutComments = 
sourceContent + .replace(/\/\*[\s\S]*?\*\//g, '') // Remove block comments + .replace(/\/\/.*$/gm, ''); // Remove line comments + + // Find function definition - match function name followed by opening parenthesis + // Handle both regular functions and free functions + const functionPattern = new RegExp( + `function\\s+${functionName.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}\\s*\\(([^)]*)\\)`, + 's' + ); + + const match = withoutComments.match(functionPattern); + if (!match || !match[1]) { + if (process.env.DEBUG_PARAMS) { + console.log(`[DEBUG] Function ${functionName} not found in source file`); + } + return []; + } + + const paramsStr = match[1].trim(); + if (!paramsStr) { + return []; // Function has no parameters + } + + // Parse parameters - handle complex types like mappings, arrays, structs + const params = []; + let currentParam = ''; + let depth = 0; + let inString = false; + let stringChar = ''; + + for (let i = 0; i < paramsStr.length; i++) { + const char = paramsStr[i]; + + // Handle string literals + if ((char === '"' || char === "'") && (i === 0 || paramsStr[i - 1] !== '\\')) { + if (!inString) { + inString = true; + stringChar = char; + } else if (char === stringChar) { + inString = false; + } + currentParam += char; + continue; + } + + if (inString) { + currentParam += char; + continue; + } + + // Track nesting depth for generics, arrays, mappings + if (char === '<' || char === '[' || char === '(') { + depth++; + currentParam += char; + } else if (char === '>' || char === ']' || char === ')') { + depth--; + currentParam += char; + } else if (char === ',' && depth === 0) { + // Found a parameter boundary + const trimmed = currentParam.trim(); + if (trimmed) { + const parsed = parseParameter(trimmed); + if (parsed) { + params.push(parsed); + } + } + currentParam = ''; + } else { + currentParam += char; + } + } + + // Handle last parameter + const trimmed = currentParam.trim(); + if (trimmed) { + const parsed = parseParameter(trimmed); + if (parsed) { + 
params.push(parsed); + } + } + + if (process.env.DEBUG_PARAMS) { + console.log(`[DEBUG] Extracted ${params.length} params from source for ${functionName}:`, JSON.stringify(params, null, 2)); + } + + return params; +} + +/** + * Parse a single parameter string into name and type + * @param {string} paramStr - Parameter string (e.g., "uint256 amount" or "address") + * @returns {object|null} Object with name and type, or null if invalid + */ +function parseParameter(paramStr) { + if (!paramStr || !paramStr.trim()) return null; + + // Remove storage location keywords + const cleaned = paramStr + .replace(/\b(memory|storage|calldata)\b/g, '') + .replace(/\s+/g, ' ') + .trim(); + + // Split by whitespace - last token is usually the name, rest is type + const parts = cleaned.split(/\s+/); + + if (parts.length === 0) return null; + + // If only one part, it's just a type (unnamed parameter) + if (parts.length === 1) { + return { name: '', type: parts[0], description: '' }; + } + + // Last part is the name, everything before is the type + const name = parts[parts.length - 1]; + const type = parts.slice(0, -1).join(' '); + + // Validate: name should be a valid identifier + if (!/^[a-zA-Z_$][a-zA-Z0-9_$]*$/.test(name)) { + // If name doesn't look valid, treat the whole thing as a type + return { name: '', type: cleaned, description: '' }; + } + + return { name, type, description: '' }; +} + +/** + * Extract parameters from function signature string + * @param {string} signature - Function signature string + * @returns {Array} Array of parameter objects with name and type + */ +function extractParamsFromSignature(signature) { + if (!signature || typeof signature !== 'string') return []; + + // Match function parameters: function name(params) or just (params) + const paramMatch = signature.match(/\(([^)]*)\)/); + if (!paramMatch || !paramMatch[1]) return []; + + const paramsStr = paramMatch[1].trim(); + if (!paramsStr) return []; + + // Split by comma, but be careful with 
nested generics + const params = []; + let currentParam = ''; + let depth = 0; + + for (let i = 0; i < paramsStr.length; i++) { + const char = paramsStr[i]; + if (char === '<') depth++; + else if (char === '>') depth--; + else if (char === ',' && depth === 0) { + const trimmed = currentParam.trim(); + if (trimmed) { + // Parse "type name" or just "type" + const parts = trimmed.split(/\s+/); + if (parts.length >= 2) { + // Has both type and name + const type = parts.slice(0, -1).join(' '); + const name = parts[parts.length - 1]; + params.push({ name, type, description: '' }); + } else if (parts.length === 1) { + // Just type, no name + params.push({ name: '', type: parts[0], description: '' }); + } + } + currentParam = ''; + continue; + } + currentParam += char; + } + + // Handle last parameter + const trimmed = currentParam.trim(); + if (trimmed) { + const parts = trimmed.split(/\s+/); + if (parts.length >= 2) { + const type = parts.slice(0, -1).join(' '); + const name = parts[parts.length - 1]; + params.push({ name, type, description: '' }); + } else if (parts.length === 1) { + params.push({ name: '', type: parts[0], description: '' }); + } + } + + return params; +} + +/** + * Filter function parameters, removing invalid entries + * Invalid parameters include: empty names or names matching the function name (parsing error) + * @param {Array} params - Raw parameters array + * @param {string} functionName - Name of the function (to detect parsing errors) + * @returns {Array} Filtered and normalized parameters + */ +function filterAndNormalizeParams(params, functionName) { + return (params || []) + .filter(p => { + // Handle different possible data structures + const paramName = (p && (p.name || p.param || p.parameter || '')).trim(); + const paramType = (p && (p.type || p.paramType || '')).trim(); + + // Filter out parameters with empty or missing names + if (!paramName) return false; + // Filter out parameters where name matches function name (indicates parsing 
error) + if (paramName === functionName) { + if (process.env.DEBUG_PARAMS) { + console.log(`[DEBUG] Filtered out invalid param: name="${paramName}" matches function name`); + } + return false; + } + // Filter out if type is empty AND name looks like it might be a function name (starts with lowercase, no underscore) + if (!paramType && /^[a-z]/.test(paramName) && !paramName.includes('_')) { + if (process.env.DEBUG_PARAMS) { + console.log(`[DEBUG] Filtered out suspicious param: name="${paramName}" has no type`); + } + return false; + } + return true; + }) + .map(p => ({ + name: (p.name || p.param || p.parameter || '').trim(), + type: (p.type || p.paramType || '').trim(), + description: (p.description || p.desc || '').trim(), + })); +} + +/** + * Check if a function is internal by examining its signature + * @param {object} fn - Function data with signature property + * @returns {boolean} True if function is internal + */ +function isInternalFunction(fn) { + if (!fn || !fn.signature) return false; + + // Check if signature contains "internal" as a whole word + // Use word boundary regex to avoid matching "internalTransferFrom" etc. 
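+ // Illustrative examples (hypothetical signatures, not from this repo): + // "function _grantRole(bytes32 role) internal" -> true + // "function internalize() external" -> false (no whole-word "internal")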
+ const internalPattern = /\binternal\b/; + return internalPattern.test(fn.signature); +} + +/** + * Prepare function data for template rendering (shared between facet and module) + * @param {object} fn - Function data + * @param {string} sourceFilePath - Path to the Solidity source file + * @param {boolean} useSourceExtraction - Whether to try extracting params from source file (for modules) + * @returns {object} Prepared function data + */ +function prepareFunctionData(fn, sourceFilePath, useSourceExtraction = false) { + // Debug: log the raw function data + if (process.env.DEBUG_PARAMS) { + console.log(`\n[DEBUG] Function: ${fn.name}`); + console.log(`[DEBUG] Raw params:`, JSON.stringify(fn.params, null, 2)); + console.log(`[DEBUG] Signature:`, fn.signature); + } + + // Build parameters array, filtering out invalid parameters + let paramsArray = filterAndNormalizeParams(fn.params, fn.name); + + // If no valid parameters found, try extracting from source file (for modules) or signature + if (paramsArray.length === 0) { + // Try source file extraction for modules + if (useSourceExtraction && sourceFilePath) { + if (process.env.DEBUG_PARAMS) { + console.log(`[DEBUG] No valid params found, extracting from source file: ${sourceFilePath}`); + } + const extractedParams = extractParamsFromSource(sourceFilePath, fn.name); + if (extractedParams.length > 0) { + paramsArray = extractedParams; + } + } + + // Fallback to signature extraction if still no params + if (paramsArray.length === 0 && fn.signature) { + if (process.env.DEBUG_PARAMS) { + console.log(`[DEBUG] No valid params found, extracting from signature`); + } + const extractedParams = extractParamsFromSignature(fn.signature); + paramsArray = filterAndNormalizeParams(extractedParams, fn.name); + if (process.env.DEBUG_PARAMS) { + console.log(`[DEBUG] Extracted params from signature:`, JSON.stringify(paramsArray, null, 2)); + } + } + } + + if (process.env.DEBUG_PARAMS) { + console.log(`[DEBUG] Final paramsArray:`, 
JSON.stringify(paramsArray, null, 2)); + } + + // Build returns array for table rendering + const returnsArray = (fn.returns || []).map(r => ({ + name: r.name || '-', + type: r.type, + description: r.description || '', + })); + + return { + name: fn.name, + signature: fn.signature, + description: fn.notice || fn.description || '', + params: paramsArray, + returns: returnsArray, + hasReturns: returnsArray.length > 0, + hasParams: paramsArray.length > 0, + }; +} + +/** + * Prepare event data for template rendering + * @param {object} event - Event data + * @returns {object} Prepared event data + */ +function prepareEventData(event) { + return { + name: event.name, + description: event.description || '', + signature: event.signature, + params: (event.params || []).map(p => ({ + name: p.name, + type: p.type, + description: p.description || '', + })), + hasParams: (event.params || []).length > 0, + }; +} + +/** + * Prepare error data for template rendering + * @param {object} error - Error data + * @returns {object} Prepared error data + */ +function prepareErrorData(error) { + return { + name: error.name, + description: error.description || '', + signature: error.signature, + }; +} + +/** + * Normalize struct definition indentation + * Ensures consistent 4-space indentation for struct body content + * @param {string} definition - Struct definition code + * @returns {string} Normalized struct definition with proper indentation + */ +function normalizeStructIndentation(definition) { + if (!definition) return definition; + + const lines = definition.split('\n'); + if (lines.length === 0) return definition; + + // Find the struct opening line (contains "struct" keyword) + let structStartIndex = -1; + let openingBraceOnSameLine = false; + + for (let i = 0; i < lines.length; i++) { + if (lines[i].includes('struct')) { + structStartIndex = i; + openingBraceOnSameLine = lines[i].includes('{'); + break; + } + } + + if (structStartIndex === -1) return definition; + + // Get the 
indentation of the struct declaration line + const structLine = lines[structStartIndex]; + const structIndentMatch = structLine.match(/^(\s*)/); + const structIndent = structIndentMatch ? structIndentMatch[1] : ''; + + // Normalize all lines + const normalized = []; + let inStructBody = openingBraceOnSameLine; + + for (let i = 0; i < lines.length; i++) { + const line = lines[i]; + const trimmed = line.trim(); + + if (i === structStartIndex) { + // Keep struct declaration line as-is + normalized.push(line); + if (openingBraceOnSameLine) { + inStructBody = true; + } + continue; + } + + // Handle opening brace on separate line + if (!openingBraceOnSameLine && trimmed === '{') { + normalized.push(structIndent + '{'); + inStructBody = true; + continue; + } + + // Handle closing brace + if (trimmed === '}') { + normalized.push(structIndent + '}'); + inStructBody = false; + continue; + } + + // Skip empty lines + if (trimmed === '') { + normalized.push(''); + continue; + } + + // For struct body content, ensure 4-space indentation relative to struct declaration + if (inStructBody) { + // Remove any existing indentation and add proper indentation + const bodyIndent = structIndent + ' '; // 4 spaces + normalized.push(bodyIndent + trimmed); + } else { + // Keep lines outside struct body as-is + normalized.push(line); + } + } + + return normalized.join('\n'); +} + +/** + * Prepare struct data for template rendering + * @param {object} struct - Struct data + * @returns {object} Prepared struct data + */ +function prepareStructData(struct) { + return { + name: struct.name, + description: struct.description || '', + definition: normalizeStructIndentation(struct.definition), + }; +} + +/** + * Validate documentation data + * @param {object} data - Documentation data to validate + * @throws {Error} If data is invalid + */ +function validateData(data) { + if (!data || typeof data !== 'object') { + throw new Error('Invalid data: expected an object'); + } + if (!data.title || typeof 
data.title !== 'string') { + throw new Error('Invalid data: missing or invalid title'); + } +} + +/** + * Generate fallback description for state variables/constants based on naming patterns + * @param {string} name - Variable name (e.g., "STORAGE_POSITION", "DEFAULT_ADMIN_ROLE") + * @param {string} moduleName - Name of the module/contract for context + * @returns {string} Generated description or empty string + */ +function generateStateVariableDescription(name, moduleName) { + if (!name) return ''; + + const upperName = name.toUpperCase(); + + // Common patterns for diamond/ERC contracts + const patterns = { + // Storage position patterns + 'STORAGE_POSITION': 'Diamond storage slot position for this module', + 'STORAGE_SLOT': 'Diamond storage slot identifier', + '_STORAGE_POSITION': 'Diamond storage slot position', + '_STORAGE_SLOT': 'Diamond storage slot identifier', + + // Role patterns + 'DEFAULT_ADMIN_ROLE': 'Default administrative role identifier (bytes32(0))', + 'ADMIN_ROLE': 'Administrative role identifier', + 'MINTER_ROLE': 'Minter role identifier', + 'PAUSER_ROLE': 'Pauser role identifier', + 'BURNER_ROLE': 'Burner role identifier', + + // ERC patterns + 'INTERFACE_ID': 'ERC-165 interface identifier', + 'EIP712_DOMAIN': 'EIP-712 domain separator', + 'PERMIT_TYPEHASH': 'EIP-2612 permit type hash', + + // Reentrancy patterns + 'NON_REENTRANT_SLOT': 'Reentrancy guard storage slot', + '_NOT_ENTERED': 'Reentrancy status: not entered', + '_ENTERED': 'Reentrancy status: entered', + }; + + // Check exact matches first + if (patterns[upperName]) { + return patterns[upperName]; + } + + // Check partial matches + if (upperName.includes('STORAGE') && (upperName.includes('POSITION') || upperName.includes('SLOT'))) { + return 'Diamond storage slot position for this module'; + } + if (upperName.includes('_ROLE')) { + const roleName = name.replace(/_ROLE$/i, '').replace(/_/g, ' ').toLowerCase(); + return `${roleName.charAt(0).toUpperCase() + roleName.slice(1)} role 
identifier`; + } + if (upperName.includes('TYPEHASH')) { + return 'Type hash for EIP-712 structured data'; + } + if (upperName.includes('INTERFACE')) { + return 'ERC-165 interface identifier'; + } + + // Generic fallback + return ''; +} + +/** + * Prepare base data common to both facet and module templates + * @param {object} data - Documentation data + * @param {number} position - Sidebar position + * @param {object} [options] - Additional context (contract type, category, etc.) + * @returns {object} Base prepared data + */ +function prepareBaseData(data, position = 99, options = {}) { + validateData(data); + const { contractType, category } = options || {}; + + const description = data.description || `Contract documentation for ${data.title}`; + const subtitle = data.subtitle || data.description || `Contract documentation for ${data.title}`; + const overview = data.overview || data.description || `Documentation for ${data.title}.`; + + // Optional sidebar label override (Facet / Module, or diamond-specific e.g. "Inspect Facet") + const sidebarLabel = getSidebarLabel(contractType, category, data.title); + + return { + position, + title: formatDisplayTitle(data.title), + sidebarLabel: sidebarLabel || null, + description, + subtitle, + overview, + generatedDate: data.generatedDate || new Date().toISOString(), + gitSource: data.gitSource || '', + keyFeatures: data.keyFeatures || '', + usageExample: data.usageExample || '', + bestPractices: (data.bestPractices && data.bestPractices.trim()) ? data.bestPractices : null, + securityConsiderations: (data.securityConsiderations && data.securityConsiderations.trim()) ? data.securityConsiderations : null, + integrationNotes: (data.integrationNotes && data.integrationNotes.trim()) ? 
data.integrationNotes : null, + storageInfo: data.storageInfo || '', + + // Events + events: (data.events || []).map(prepareEventData), + hasEvents: (data.events || []).length > 0, + + // Errors + errors: (data.errors || []).map(prepareErrorData), + hasErrors: (data.errors || []).length > 0, + + // Structs + structs: (data.structs || []).map(prepareStructData), + hasStructs: (data.structs || []).length > 0, + + // State variables (for modules) - with fallback description generation + stateVariables: (data.stateVariables || []).map(v => { + const baseDescription = v.description || generateStateVariableDescription(v.name, data.title); + let description = baseDescription; + + // Append value to description if it exists and isn't already included + if (v.value && v.value.trim()) { + const valueStr = v.value.trim(); + // Check if value is already in description (case-insensitive) + // Escape special regex characters in valueStr + const escapedValue = valueStr.replace(/[.*+?^${}()|[\]\\]/g, '\\$&'); + // Pattern matches "(Value: `...`)" or "(Value: ...)" format + const valuePattern = new RegExp('\\(Value:\\s*[`]?[^`)]*' + escapedValue + '[^`)]*[`]?\\)', 'i'); + if (!valuePattern.test(description)) { + // Format the value for display with backticks + // Use string concatenation to avoid template literal backtick issues + const valuePart = '(Value: `' + valueStr + '`)'; + description = baseDescription ? 
baseDescription + ' ' + valuePart : valuePart; + } + } + + return { + name: v.name, + type: v.type || '', + value: v.value || '', + description: description, + }; + }), + hasStateVariables: (data.stateVariables || []).length > 0, + hasStorage: Boolean(data.storageInfo || (data.stateVariables && data.stateVariables.length > 0)), + }; +} + +/** + * Prepare data for facet template rendering + * @param {object} data - Documentation data + * @param {number} position - Sidebar position + * @param {object} pathInfo - Output path information (optional) + * @param {object} registry - Contract registry (optional) + * @returns {object} Prepared data for facet template + */ +function prepareFacetData(data, position = 99, pathInfo = null, registry = null) { + const baseData = prepareBaseData(data, position, { + contractType: 'facet', + category: pathInfo && pathInfo.category, + }); + const sourceFilePath = data.sourceFilePath; + + // Filter out internal functions for facets (they act as pre-deploy logic blocks) + const publicFunctions = (data.functions || []).filter(fn => !isInternalFunction(fn)); + + const preparedData = { + ...baseData, + // Contract type flags for unified template + isFacet: true, + isModule: false, + contractType: 'facet', + // Functions with APIReference-compatible format (no source extraction for facets) + // Only include non-internal functions since facets are pre-deploy logic blocks + functions: publicFunctions.map(fn => prepareFunctionData(fn, sourceFilePath, false)), + hasFunctions: publicFunctions.length > 0, + }; + + // Enrich with relationships if registry and pathInfo provided + if (registry && pathInfo) { + return enrichWithRelationships(preparedData, pathInfo, registry); + } + + return preparedData; +} + +/** + * Prepare data for module template rendering + * @param {object} data - Documentation data + * @param {number} position - Sidebar position + * @param {object} pathInfo - Output path information (optional) + * @param {object} registry - 
Contract registry (optional) + * @returns {object} Prepared data for module template + */ +function prepareModuleData(data, position = 99, pathInfo = null, registry = null) { + const baseData = prepareBaseData(data, position, { + contractType: 'module', + category: pathInfo && pathInfo.category, + }); + const sourceFilePath = data.sourceFilePath; + + const preparedData = { + ...baseData, + // Contract type flags for unified template + isFacet: false, + isModule: true, + contractType: 'module', + // Functions with table-compatible format (with source extraction for modules) + functions: (data.functions || []).map(fn => prepareFunctionData(fn, sourceFilePath, true)), + hasFunctions: (data.functions || []).length > 0, + }; + + // Enrich with relationships if registry and pathInfo provided + if (registry && pathInfo) { + return enrichWithRelationships(preparedData, pathInfo, registry); + } + + return preparedData; +} + +/** + * Generate complete facet documentation + * Uses the unified contract template with isFacet=true + * @param {object} data - Documentation data + * @param {number} position - Sidebar position + * @param {object} pathInfo - Output path information (optional) + * @param {object} registry - Contract registry (optional) + * @returns {string} Complete MDX document + */ +function generateFacetDoc(data, position = 99, pathInfo = null, registry = null) { + const preparedData = prepareFacetData(data, position, pathInfo, registry); + return loadAndRenderTemplate('contract', preparedData); +} + +/** + * Generate complete module documentation + * Uses the unified contract template with isModule=true + * @param {object} data - Documentation data + * @param {number} position - Sidebar position + * @param {object} pathInfo - Output path information (optional) + * @param {object} registry - Contract registry (optional) + * @returns {string} Complete MDX document + */ +function generateModuleDoc(data, position = 99, pathInfo = null, registry = null) { + const 
preparedData = prepareModuleData(data, position, pathInfo, registry); + return loadAndRenderTemplate('contract', preparedData); +} + +module.exports = { + generateFacetDoc, + generateModuleDoc, +}; diff --git a/.github/scripts/generate-docs-utils/tracking/summary-tracker.js b/.github/scripts/generate-docs-utils/tracking/summary-tracker.js new file mode 100644 index 00000000..c1d34ee0 --- /dev/null +++ b/.github/scripts/generate-docs-utils/tracking/summary-tracker.js @@ -0,0 +1,134 @@ +/** + * Summary Tracker + * + * Tracks processing results and generates summary reports. + * Replaces global processedFiles object with a class-based approach. + */ + +const path = require('path'); +const { writeFileSafe } = require('../../workflow-utils'); + +/** + * Tracks processed files and generates summary reports + */ +class SummaryTracker { + constructor() { + this.facets = []; + this.modules = []; + this.skipped = []; + this.errors = []; + this.fallbackFiles = []; + } + + /** + * Record a successfully processed facet + * @param {string} title - Facet title + * @param {string} file - Output file path + */ + recordFacet(title, file) { + this.facets.push({ title, file }); + } + + /** + * Record a successfully processed module + * @param {string} title - Module title + * @param {string} file - Output file path + */ + recordModule(title, file) { + this.modules.push({ title, file }); + } + + /** + * Record a skipped file + * @param {string} file - File path + * @param {string} reason - Reason for skipping + */ + recordSkipped(file, reason) { + this.skipped.push({ file, reason }); + } + + /** + * Record an error + * @param {string} file - File path + * @param {string} error - Error message + */ + recordError(file, error) { + this.errors.push({ file, error }); + } + + /** + * Record a file that used fallback content + * @param {string} title - Contract title + * @param {string} file - Output file path + * @param {string} error - Error message + */ + recordFallback(title, file, error) 
{ + this.fallbackFiles.push({ title, file, error }); + } + + /** + * Print processing summary to console + */ + printSummary() { + console.log('\n' + '='.repeat(50)); + console.log('Documentation Generation Summary'); + console.log('='.repeat(50)); + + console.log(`\nFacets generated: ${this.facets.length}`); + for (const f of this.facets) { + console.log(` - ${f.title}`); + } + + console.log(`\nModules generated: ${this.modules.length}`); + for (const m of this.modules) { + console.log(` - ${m.title}`); + } + + if (this.skipped.length > 0) { + console.log(`\nSkipped: ${this.skipped.length}`); + for (const s of this.skipped) { + console.log(` - ${path.basename(s.file)}: ${s.reason}`); + } + } + + if (this.errors.length > 0) { + console.log(`\nErrors: ${this.errors.length}`); + for (const e of this.errors) { + console.log(` - ${path.basename(e.file)}: ${e.error}`); + } + } + + if (this.fallbackFiles.length > 0) { + console.log(`\n⚠️ Files using fallback due to AI errors: ${this.fallbackFiles.length}`); + for (const f of this.fallbackFiles) { + console.log(` - ${f.title}: ${f.error}`); + } + } + + const total = this.facets.length + this.modules.length; + console.log(`\nTotal generated: ${total} documentation files`); + console.log('='.repeat(50) + '\n'); + } + + /** + * Write summary to file for GitHub Action + */ + writeSummaryFile() { + const summary = { + timestamp: new Date().toISOString(), + facets: this.facets, + modules: this.modules, + skipped: this.skipped, + errors: this.errors, + fallbackFiles: this.fallbackFiles, + totalGenerated: this.facets.length + this.modules.length, + }; + + writeFileSafe('docgen-summary.json', JSON.stringify(summary, null, 2)); + } +} + +module.exports = { + SummaryTracker, +}; + diff --git a/.github/scripts/generate-docs-utils/utils/contract-classifier.js b/.github/scripts/generate-docs-utils/utils/contract-classifier.js new file mode 100644 index 00000000..698df50a --- /dev/null +++ 
b/.github/scripts/generate-docs-utils/utils/contract-classifier.js @@ -0,0 +1,69 @@ +/** + * Contract Classifier + * + * Functions for detecting contract types (interface, module, facet). + */ + +const path = require('path'); + +/** + * Determine if a contract is an interface + * Interfaces should be skipped from documentation generation + * Only checks the naming pattern (I[A-Z]) to avoid false positives + * @param {string} title - Contract title/name + * @param {string} content - File content (forge doc markdown) - unused but kept for API compatibility + * @returns {boolean} True if this is an interface + */ +function isInterface(title, content) { + // Only check if title follows interface naming convention: starts with "I" followed by uppercase + // This is the most reliable indicator and avoids false positives from content that mentions "interface" + if (title && /^I[A-Z]/.test(title)) { + return true; + } + + // Removed content-based check to avoid false positives + // Facets and contracts often mention "interface" in their descriptions + // (e.g., "ERC-165 Standard Interface Detection Facet") which would incorrectly filter them + + return false; +} + +/** + * Determine if a contract is a module or facet + * @param {string} filePath - Path to the file + * @param {string} content - File content + * @returns {'module' | 'facet'} Contract type + */ +function getContractType(filePath, content) { + const lowerPath = filePath.toLowerCase(); + const normalizedPath = lowerPath.replace(/\\/g, '/'); + const baseName = path.basename(filePath, path.extname(filePath)).toLowerCase(); + + // Explicit modules folder + if (normalizedPath.includes('/modules/')) { + return 'module'; + } + + // File naming conventions (e.g., AccessControlMod.sol, NonReentrancyModule.sol) + if (baseName.endsWith('mod') || baseName.endsWith('module')) { + return 'module'; + } + + if (lowerPath.includes('facet')) { + return 'facet'; + } + + // Libraries folder typically contains modules + if 
(normalizedPath.includes('/libraries/')) { + return 'module'; + } + + // Default to facet for contracts + return 'facet'; +} + +module.exports = { + isInterface, + getContractType, +}; + diff --git a/.github/scripts/generate-docs-utils/utils/file-finder.js b/.github/scripts/generate-docs-utils/utils/file-finder.js new file mode 100644 index 00000000..6ee50dd2 --- /dev/null +++ b/.github/scripts/generate-docs-utils/utils/file-finder.js @@ -0,0 +1,38 @@ +/** + * File Finder + * + * Functions for finding forge doc output files. + */ + +const fs = require('fs'); +const path = require('path'); +const CONFIG = require('../config'); + +/** + * Find forge doc output files for a given source file + * @param {string} solFilePath - Path to .sol file (e.g., 'src/access/AccessControl/AccessControlMod.sol') + * @returns {string[]} Array of markdown file paths from forge doc output + */ +function findForgeDocFiles(solFilePath) { + // Transform: src/access/AccessControl/AccessControlMod.sol + // To: docs/src/src/access/AccessControl/AccessControlMod.sol/ + const relativePath = solFilePath.replace(/^src\//, ''); + const docsDir = path.join(CONFIG.forgeDocsDir, relativePath); + + if (!fs.existsSync(docsDir)) { + return []; + } + + try { + const files = fs.readdirSync(docsDir); + return files.filter((f) => f.endsWith('.md')).map((f) => path.join(docsDir, f)); + } catch (error) { + console.error(`Error reading docs dir ${docsDir}:`, error.message); + return []; + } +} + +module.exports = { + findForgeDocFiles, +}; + diff --git a/.github/scripts/generate-docs-utils/utils/git-utils.js b/.github/scripts/generate-docs-utils/utils/git-utils.js new file mode 100644 index 00000000..7f905f0a --- /dev/null +++ b/.github/scripts/generate-docs-utils/utils/git-utils.js @@ -0,0 +1,70 @@ +/** + * Git Utilities + * + * Functions for interacting with git to find changed files. 
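 + * + * Illustrative usage (the resulting path is hypothetical): + * getChangedSolFiles('origin/main') + * // -> ['src/access/AccessControl/AccessControlMod.sol']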
+ */ + +const { execSync } = require('child_process'); +const { readFileSafe } = require('../../workflow-utils'); + +/** + * Get list of changed Solidity files from git diff + * @param {string} baseBranch - Base branch to compare against + * @returns {string[]} Array of changed .sol file paths + */ +function getChangedSolFiles(baseBranch = 'HEAD~1') { + try { + const output = execSync(`git diff --name-only ${baseBranch} HEAD -- 'src/**/*.sol'`, { + encoding: 'utf8', + }); + return output + .trim() + .split('\n') + .filter((f) => f.endsWith('.sol')); + } catch (error) { + console.error('Error getting changed files:', error.message); + return []; + } +} + +/** + * Get all Solidity files in src directory + * @returns {string[]} Array of .sol file paths + */ +function getAllSolFiles() { + try { + const output = execSync('find src -name "*.sol" -type f', { + encoding: 'utf8', + }); + return output + .trim() + .split('\n') + .filter((f) => f); + } catch (error) { + console.error('Error getting all sol files:', error.message); + return []; + } +} + +/** + * Read changed files from a file (used in CI) + * @param {string} filePath - Path to file containing list of changed files + * @returns {string[]} Array of file paths + */ +function readChangedFilesFromFile(filePath) { + const content = readFileSafe(filePath); + if (!content) { + return []; + } + return content + .trim() + .split('\n') + .filter((f) => f.endsWith('.sol')); +} + +module.exports = { + getChangedSolFiles, + getAllSolFiles, + readChangedFilesFromFile, +}; + diff --git a/.github/scripts/generate-docs-utils/utils/path-computer.js b/.github/scripts/generate-docs-utils/utils/path-computer.js new file mode 100644 index 00000000..cdb6ad4c --- /dev/null +++ b/.github/scripts/generate-docs-utils/utils/path-computer.js @@ -0,0 +1,33 @@ +/** + * Path Computer + * + * Functions for computing output paths for documentation files. 
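 + * + * The generated docs mirror the src/ tree under website/docs/contracts/; + * illustratively, src/access/AccessControl/AccessControlFacet.sol maps to + * a page under website/docs/contracts/access/AccessControl/ (exact names + * come from computeOutputPath in category-generator).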
+ */ + +const { + computeOutputPath, + ensureCategoryFiles, +} = require('../category/category-generator'); + +/** + * Get output directory and file path based on source file path + * Mirrors the src/ structure in website/docs/contracts/ + * + * @param {string} solFilePath - Path to the source .sol file + * @param {'module' | 'facet'} contractType - Type of contract (for logging) + * @returns {object} { outputDir, outputFile, relativePath, fileName, category } + */ +function getOutputPath(solFilePath, contractType) { + // Compute path using the new structure-mirroring logic + const pathInfo = computeOutputPath(solFilePath); + + // Ensure all parent category files exist + ensureCategoryFiles(pathInfo.outputDir); + + return pathInfo; +} + +module.exports = { + getOutputPath, +}; + diff --git a/.github/scripts/generate-docs-utils/utils/sidebar-position-calculator.js b/.github/scripts/generate-docs-utils/utils/sidebar-position-calculator.js new file mode 100644 index 00000000..d14d9a19 --- /dev/null +++ b/.github/scripts/generate-docs-utils/utils/sidebar-position-calculator.js @@ -0,0 +1,66 @@ +/** + * Sidebar Position Calculator + * + * Calculates sidebar positions for contracts in documentation. 
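 + * + * A position combines a category base offset, a contract-type offset + * (facets before modules), and an alphabetical index within the category; + * e.g. the second 'access' facet alphabetically resolves to 100 + 0 + 1.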
+ */ + +const CONFIG = require('../config'); +const { getContractRegistry } = require('../core/contract-registry'); + +/** + * Get sidebar position for a contract + * @param {string} contractName - Name of the contract + * @param {string} contractType - Type of contract ('module' or 'facet') + * @param {string} category - Category of the contract + * @param {object} registry - Contract registry (optional, uses global if not provided) + * @returns {number} Sidebar position + */ +function getSidebarPosition(contractName, contractType = null, category = null, registry = null) { + // First check explicit config + if (CONFIG.contractPositions && CONFIG.contractPositions[contractName] !== undefined) { + return CONFIG.contractPositions[contractName]; + } + + // If we don't have enough info, use default + if (!contractType || !category) { + return CONFIG.defaultSidebarPosition || 50; + } + + // Calculate smart position based on: + // 1. Category base offset + const categoryOffsets = { + diamond: 0, + access: 100, + token: 200, + utils: 300, + interfaceDetection: 400 + }; + + let basePosition = categoryOffsets[category] || 500; + + // 2. Contract type offset (facets before modules in sidebar) + const typeOffset = contractType === 'facet' ? 0 : 10; + basePosition += typeOffset; + + // 3. 
Position within the category, ordered alphabetically for stable output + const reg = registry || getContractRegistry(); + if (reg && reg.byCategory.has(category)) { + const categoryContracts = reg.byCategory.get(category) || []; + const sameTypeContracts = categoryContracts.filter(c => c.type === contractType); + + // Sort by name for consistent ordering + sameTypeContracts.sort((a, b) => a.name.localeCompare(b.name)); + + const index = sameTypeContracts.findIndex(c => c.name === contractName); + if (index !== -1) { + basePosition += index; + } + } + + return basePosition; +} + +module.exports = { + getSidebarPosition, +}; + diff --git a/.github/scripts/generate-docs-utils/utils/source-parser.js b/.github/scripts/generate-docs-utils/utils/source-parser.js new file mode 100644 index 00000000..89b588f8 --- /dev/null +++ b/.github/scripts/generate-docs-utils/utils/source-parser.js @@ -0,0 +1,174 @@ +/** + * Source Parser + * + * Functions for parsing Solidity source files to extract information. + */ + +const path = require('path'); +const { readFileSafe } = require('../../workflow-utils'); + +/** + * Extract module name from file path + * @param {string} filePath - Path to the file + * @returns {string} Module name + */ +function extractModuleNameFromPath(filePath) { + // If it's a constants file, extract from filename + const basename = path.basename(filePath); + if (basename.startsWith('constants.')) { + const match = basename.match(/^constants\.(.+)\.md$/); + if (match) { + return match[1]; + } + } + + // Extract from .sol file path + if (filePath.endsWith('.sol')) { + return path.basename(filePath, '.sol'); + } + + // Extract from directory structure + const parts = filePath.split(path.sep); + for (let i = parts.length - 1; i >= 0; i--) { + if (parts[i].endsWith('.sol')) { + return path.basename(parts[i], '.sol'); + } + } + + // Fallback: use basename without extension + return path.basename(filePath, path.extname(filePath)); +} + +/** + * Check if a line is a code element declaration
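 + * + * Illustrative examples: "function transfer(" and "bytes32 constant SLOT = 0x0;" + * are declarations; "// a comment" and "import './X.sol';" are not.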
* @param {string} line - Trimmed line to check + * @returns {boolean} True if line is a code element declaration + */ +function isCodeElementDeclaration(line) { + if (!line) return false; + return ( + line.startsWith('function ') || + line.startsWith('error ') || + line.startsWith('event ') || + line.startsWith('struct ') || + line.startsWith('enum ') || + line.startsWith('contract ') || + line.startsWith('library ') || + line.startsWith('interface ') || + line.startsWith('modifier ') || + /^\w+\s+(constant|immutable)\s/.test(line) || + /^(bytes32|uint\d*|int\d*|address|bool|string)\s+constant\s/.test(line) + ); +} + +/** + * Extract module description from source file NatSpec comments + * @param {string} solFilePath - Path to the Solidity source file + * @returns {string} Description extracted from @title and @notice tags + */ +function extractModuleDescriptionFromSource(solFilePath) { + const content = readFileSafe(solFilePath); + if (!content) { + return ''; + } + + const lines = content.split('\n'); + let inComment = false; + let commentBuffer = []; + let title = ''; + let notice = ''; + + for (let i = 0; i < lines.length; i++) { + const line = lines[i]; + const trimmed = line.trim(); + + // Skip SPDX and pragma lines + if (trimmed.startsWith('// SPDX') || trimmed.startsWith('pragma ')) { + continue; + } + + // Check if we've reached a code element without finding a file-level comment + if (!inComment && isCodeElementDeclaration(trimmed)) { + break; + } + + // Start of block comment + if (trimmed.startsWith('/**') || trimmed.startsWith('/*')) { + inComment = true; + commentBuffer = []; + continue; + } + + // End of block comment + if (inComment && trimmed.includes('*/')) { + inComment = false; + const commentText = commentBuffer.join(' '); + + // Look ahead to see if next non-empty line is a code element + let nextCodeLine = ''; + for (let j = i + 1; j < lines.length && j < i + 5; j++) { + const nextTrimmed = lines[j].trim(); + if (nextTrimmed && 
!nextTrimmed.startsWith('//') && !nextTrimmed.startsWith('/*')) { + nextCodeLine = nextTrimmed; + break; + } + } + + // If the comment has @title, it's a file-level comment + const titleMatch = commentText.match(/@title\s+(.+?)(?:\s+@|\s*$)/); + if (titleMatch) { + title = titleMatch[1].trim(); + const noticeMatch = commentText.match(/@notice\s+(.+?)(?:\s+@|\s*$)/); + if (noticeMatch) { + notice = noticeMatch[1].trim(); + } + break; + } + + // If next line is a code element, this comment belongs to that element + if (isCodeElementDeclaration(nextCodeLine)) { + commentBuffer = []; + continue; + } + + // Standalone comment with @notice + const standaloneNotice = commentText.match(/@notice\s+(.+?)(?:\s+@|\s*$)/); + if (standaloneNotice && !isCodeElementDeclaration(nextCodeLine)) { + notice = standaloneNotice[1].trim(); + break; + } + + commentBuffer = []; + continue; + } + + // Collect comment lines + if (inComment) { + let cleanLine = trimmed + .replace(/^\*\s*/, '') + .replace(/^\s*\*/, '') + .trim(); + if (cleanLine && !cleanLine.startsWith('*/')) { + commentBuffer.push(cleanLine); + } + } + } + + // Combine title and notice + if (title && notice) { + return `${title} - ${notice}`; + } else if (notice) { + return notice; + } else if (title) { + return title; + } + + return ''; +} + +module.exports = { + extractModuleNameFromPath, + extractModuleDescriptionFromSource, + isCodeElementDeclaration, +}; + diff --git a/.github/scripts/generate-docs.js b/.github/scripts/generate-docs.js new file mode 100644 index 00000000..50245f4f --- /dev/null +++ b/.github/scripts/generate-docs.js @@ -0,0 +1,82 @@ +/** + * Docusaurus Documentation Generator + * + * Converts forge doc output to Docusaurus MDX format + * with optional AI enhancement. 
+ * + * Features: + * - Mirrors src/ folder structure in documentation + * - Auto-generates category navigation files + * - AI-enhanced content generation + * + * Environment variables: + * GITHUB_TOKEN - GitHub token for AI API (optional) + * SKIP_ENHANCEMENT - Set to 'true' to skip AI enhancement + */ + +const { clearContractRegistry } = require('./generate-docs-utils/core/contract-registry'); +const { syncDocsStructure, regenerateAllIndexFiles } = require('./generate-docs-utils/category/category-generator'); +const { processSolFile } = require('./generate-docs-utils/core/file-processor'); +const { getFilesToProcess } = require('./generate-docs-utils/core/file-selector'); +const { SummaryTracker } = require('./generate-docs-utils/tracking/summary-tracker'); + + +// ============================================================================ +// Main Entry Point +// ============================================================================ + +/** + * Main entry point + */ +async function main() { + console.log('Compose Documentation Generator\n'); + + // Initialize tracker + const tracker = new SummaryTracker(); + + // Step 0: Clear contract registry + clearContractRegistry(); + + // Step 1: Sync docs structure with src structure + console.log('📁 Syncing documentation structure with source...'); + const syncResult = syncDocsStructure(); + + if (syncResult.created.length > 0) { + console.log(` Created ${syncResult.created.length} new categories:`); + syncResult.created.forEach((c) => console.log(` ✅ ${c}`)); + } + console.log(` Total categories: ${syncResult.total}\n`); + + // Step 2: Determine which files to process + const args = process.argv.slice(2); + const solFiles = getFilesToProcess(args); + + if (solFiles.length === 0) { + console.log('No Solidity files to process'); + return; + } + + console.log(`Found ${solFiles.length} Solidity file(s) to process\n`); + + // Step 3: Process each file + for (const solFile of solFiles) { + await processSolFile(solFile, 
tracker); + } + + // Step 4: Regenerate all index pages now that docs are created + console.log('📄 Regenerating category index pages...'); + const indexResult = regenerateAllIndexFiles(true); + if (indexResult.regenerated.length > 0) { + console.log(` Regenerated ${indexResult.regenerated.length} index pages`); + } + console.log(''); + + // Step 5: Print summary + tracker.printSummary(); + tracker.writeSummaryFile(); +} + +main().catch((error) => { + console.error(`Fatal error: ${error}`); + process.exit(1); +}); diff --git a/.github/scripts/sync-docs-structure.js b/.github/scripts/sync-docs-structure.js new file mode 100644 index 00000000..4b4f833d --- /dev/null +++ b/.github/scripts/sync-docs-structure.js @@ -0,0 +1,210 @@ +#!/usr/bin/env node +/** + * Sync Documentation Structure + * + * Standalone script to mirror the src/ folder structure in website/docs/library/ + * Creates _category_.json files for Docusaurus navigation. + * + * Usage: + * node .github/scripts/sync-docs-structure.js [options] + * + * Options: + * --dry-run Show what would be created without making changes + * --verbose Show detailed output + * --help Show this help message + * + * Examples: + * node .github/scripts/sync-docs-structure.js + * node .github/scripts/sync-docs-structure.js --dry-run + */ + +const fs = require('fs'); +const path = require('path'); + +// Handle running from different directories +const scriptDir = __dirname; +process.chdir(path.join(scriptDir, '../..')); + +const { syncDocsStructure, scanSourceStructure } = require('./generate-docs-utils/category/category-generator'); + +// ============================================================================ +// CLI Parsing +// ============================================================================ + +const args = process.argv.slice(2); +const options = { + dryRun: args.includes('--dry-run'), + verbose: args.includes('--verbose'), + help: args.includes('--help') || args.includes('-h'), +}; + +// 
============================================================================ +// Help +// ============================================================================ + +function showHelp() { + console.log(` +Sync Documentation Structure + +Mirrors the src/ folder structure in website/docs/library/ +Creates _category_.json files for Docusaurus navigation. + +Usage: + node .github/scripts/sync-docs-structure.js [options] + +Options: + --dry-run Show what would be created without making changes + --verbose Show detailed output + --help, -h Show this help message + +Examples: + node .github/scripts/sync-docs-structure.js + node .github/scripts/sync-docs-structure.js --dry-run +`); +} + +// ============================================================================ +// Tree Display +// ============================================================================ + +/** + * Display the source structure as a tree + * @param {Map} structure - Structure map from scanSourceStructure + */ +function displayTree(structure) { + console.log('\n📂 Source Structure (src/)\n'); + + // Sort by path for consistent display + const sorted = Array.from(structure.entries()).sort((a, b) => a[0].localeCompare(b[0])); + + // Build tree visualization + const tree = new Map(); + for (const [pathStr] of sorted) { + const parts = pathStr.split('/'); + let current = tree; + for (const part of parts) { + if (!current.has(part)) { + current.set(part, new Map()); + } + current = current.get(part); + } + } + + // Print tree + function printTree(node, prefix = '', isLast = true) { + const entries = Array.from(node.entries()); + entries.forEach(([name, children], index) => { + const isLastItem = index === entries.length - 1; + const connector = isLastItem ? '└── ' : '├── '; + const icon = children.size > 0 ? '📁' : '📄'; + console.log(`${prefix}${connector}${icon} ${name}`); + + if (children.size > 0) { + const newPrefix = prefix + (isLastItem ? 
' ' : '│ '); + printTree(children, newPrefix, isLastItem); + } + }); + } + + printTree(tree); + console.log(''); +} + +// ============================================================================ +// Dry Run Mode +// ============================================================================ + +/** + * Simulate sync without making changes + * @param {Map} structure - Structure map + */ +function dryRun(structure) { + console.log('\n🔍 Dry Run Mode - No changes will be made\n'); + + const libraryDir = 'website/docs/library'; + let wouldCreate = 0; + let alreadyExists = 0; + + // Check base category + const baseCategoryFile = path.join(libraryDir, '_category_.json'); + if (fs.existsSync(baseCategoryFile)) { + console.log(` ✓ ${baseCategoryFile} (exists)`); + alreadyExists++; + } else { + console.log(` + ${baseCategoryFile} (would create)`); + wouldCreate++; + } + + // Check each category + for (const [relativePath] of structure) { + const categoryFile = path.join(libraryDir, relativePath, '_category_.json'); + if (fs.existsSync(categoryFile)) { + if (options.verbose) { + console.log(` ✓ ${categoryFile} (exists)`); + } + alreadyExists++; + } else { + console.log(` + ${categoryFile} (would create)`); + wouldCreate++; + } + } + + console.log(`\nSummary:`); + console.log(` Would create: ${wouldCreate} category files`); + console.log(` Already exist: ${alreadyExists} category files`); + console.log(`\nRun without --dry-run to apply changes.\n`); +} + +// ============================================================================ +// Main +// ============================================================================ + +function main() { + if (options.help) { + showHelp(); + return; + } + + console.log('📚 Sync Documentation Structure\n'); + console.log('Scanning src/ directory...'); + + const structure = scanSourceStructure(); + console.log(`Found ${structure.size} directories with Solidity files`); + + if (options.verbose || structure.size <= 20) { + 
displayTree(structure); + } + + if (options.dryRun) { + dryRun(structure); + return; + } + + console.log('Creating documentation structure...\n'); + const result = syncDocsStructure(); + + // Display results + console.log('='.repeat(50)); + console.log('Summary'); + console.log('='.repeat(50)); + console.log(`Created: ${result.created.length} categories`); + console.log(`Existing: ${result.existing.length} categories`); + console.log(`Total: ${result.total} categories`); + + if (result.created.length > 0) { + console.log('\nNewly created:'); + result.created.forEach((c) => console.log(` ✅ ${c}`)); + } + + console.log('\n✨ Done!\n'); + + // Show next steps + console.log('Next steps:'); + console.log(' 1. Run documentation generator to populate content:'); + console.log(' node .github/scripts/generate-docs.js --all\n'); + console.log(' 2. Or generate docs for specific files:'); + console.log(' node .github/scripts/generate-docs.js path/to/changed-files.txt\n'); +} + +main(); + diff --git a/.github/scripts/workflow-utils.js b/.github/scripts/workflow-utils.js index 0a254309..d95956d7 100644 --- a/.github/scripts/workflow-utils.js +++ b/.github/scripts/workflow-utils.js @@ -1,4 +1,5 @@ const fs = require('fs'); +const https = require('https'); const path = require('path'); const { execSync } = require('child_process'); @@ -63,18 +64,69 @@ function parsePRNumber(dataFileName) { } /** - * Read report file + * Read file content safely + * @param {string} filePath - Path to file (absolute or relative to workspace) + * @returns {string|null} File content or null if error + */ +function readFileSafe(filePath) { + try { + // If relative path, join with workspace if available + const fullPath = process.env.GITHUB_WORKSPACE && !path.isAbsolute(filePath) + ? 
path.join(process.env.GITHUB_WORKSPACE, filePath) + : filePath; + + if (!fs.existsSync(fullPath)) { + return null; + } + + return fs.readFileSync(fullPath, 'utf8'); + } catch (error) { + console.error(`Error reading file ${filePath}:`, error.message); + return null; + } +} + +/** + * Read report file (legacy - use readFileSafe for new code) * @param {string} reportFileName - Name of the report file * @returns {string|null} Report content or null if not found */ function readReport(reportFileName) { const reportPath = path.join(process.env.GITHUB_WORKSPACE, reportFileName); + return readFileSafe(reportPath); +} - if (!fs.existsSync(reportPath)) { - return null; +/** + * Ensure directory exists, create if not + * @param {string} dirPath - Directory path + */ +function ensureDir(dirPath) { + if (!fs.existsSync(dirPath)) { + fs.mkdirSync(dirPath, { recursive: true }); } +} - return fs.readFileSync(reportPath, 'utf8'); +/** + * Write file safely + * @param {string} filePath - Path to file (absolute or relative to workspace) + * @param {string} content - Content to write + * @returns {boolean} True if successful + */ +function writeFileSafe(filePath, content) { + try { + // If relative path, join with workspace if available + const fullPath = process.env.GITHUB_WORKSPACE && !path.isAbsolute(filePath) + ? 
path.join(process.env.GITHUB_WORKSPACE, filePath) + : filePath; + + const dir = path.dirname(fullPath); + ensureDir(dir); + fs.writeFileSync(fullPath, content); + return true; + } catch (error) { + console.error(`Error writing file ${filePath}:`, error.message); + return false; + } } /** @@ -129,9 +181,61 @@ async function postOrUpdateComment(github, context, prNumber, body, commentMarke } } +/** + * Sleep for specified milliseconds + * @param {number} ms - Milliseconds to sleep + * @returns {Promise} + */ +function sleep(ms) { + return new Promise(resolve => setTimeout(resolve, ms)); +} + +/** + * Make HTTPS request (promisified) + * @param {object} options - Request options + * @param {string} body - Request body + * @returns {Promise} Response data + */ +function makeHttpsRequest(options, body) { + return new Promise((resolve, reject) => { + const req = https.request(options, (res) => { + let data = ''; + + res.on('data', (chunk) => { + data += chunk; + }); + + res.on('end', () => { + if (res.statusCode >= 200 && res.statusCode < 300) { + try { + resolve(JSON.parse(data)); + } catch (e) { + resolve({ raw: data }); + } + } else { + reject(new Error(`HTTP ${res.statusCode}: ${data}`)); + } + }); + }); + + req.on('error', reject); + + if (body) { + req.write(body); + } + + req.end(); + }); +} + module.exports = { downloadArtifact, parsePRNumber, readReport, - postOrUpdateComment + readFileSafe, + writeFileSafe, + ensureDir, + postOrUpdateComment, + sleep, + makeHttpsRequest, }; \ No newline at end of file diff --git a/.github/workflows/docs.yml b/.github/workflows/docs-build.yml similarity index 98% rename from .github/workflows/docs.yml rename to .github/workflows/docs-build.yml index 96ed2116..541d9366 100644 --- a/.github/workflows/docs.yml +++ b/.github/workflows/docs-build.yml @@ -1,4 +1,4 @@ -name: Documentation +name: Build Docs on: pull_request: diff --git a/.github/workflows/docs-generate.yml b/.github/workflows/docs-generate.yml new file mode 100644 index 
00000000..b4357737 --- /dev/null +++ b/.github/workflows/docs-generate.yml @@ -0,0 +1,183 @@ +name: Generate Docs + +on: + workflow_dispatch: + inputs: + target_file: + description: 'Process ONLY the specified Solidity file(s) (relative path, e.g. src/contracts/MyFacet.sol or src/facets/A.sol,src/facets/B.sol)' + required: false + type: string + process_all: + description: 'Process ALL Solidity files' + required: false + default: false + type: boolean + skip_enhancement: + description: 'Skip AI Documentation Enhancement' + required: false + default: false + type: boolean + +permissions: + contents: write + pull-requests: write + models: read # Required for GitHub Models API (AI enhancement) + +jobs: + generate-docs: + name: Generate Pages + runs-on: ubuntu-latest + + steps: + - name: Checkout code + uses: actions/checkout@v4 + with: + fetch-depth: 0 + submodules: recursive + + - name: Get changed Solidity files + id: changed-files + run: | + # Prefer explicit target_file when provided via manual dispatch. 
+ # You can pass a single file or a comma/space-separated list, e.g.: + # src/facets/A.sol,src/facets/B.sol + # src/facets/A.sol src/facets/B.sol + if [ -n "${{ github.event.inputs.target_file }}" ]; then + echo "Processing Solidity file(s) from input:" + echo "${{ github.event.inputs.target_file }}" + echo "has_changes=true" >> $GITHUB_OUTPUT + echo "process_all=false" >> $GITHUB_OUTPUT + # Normalize comma/space-separated list into one file path per line + echo "${{ github.event.inputs.target_file }}" \ + | tr ',' '\n' \ + | tr ' ' '\n' \ + | sed '/^$/d' \ + > /tmp/changed_sol_files.txt + elif [ "${{ github.event.inputs.process_all }}" == "true" ]; then + echo "Processing all Solidity files (manual trigger)" + echo "has_changes=true" >> $GITHUB_OUTPUT + echo "process_all=true" >> $GITHUB_OUTPUT + else + # Get list of changed .sol files compared to previous commit + CHANGED_FILES=$(git diff --name-only HEAD~1 HEAD -- 'src/**/*.sol' 2>/dev/null || echo "") + + if [ -z "$CHANGED_FILES" ]; then + echo "No Solidity files changed" + echo "has_changes=false" >> $GITHUB_OUTPUT + else + echo "Changed files:" + echo "$CHANGED_FILES" + echo "has_changes=true" >> $GITHUB_OUTPUT + echo "process_all=false" >> $GITHUB_OUTPUT + + # Save to file for script + echo "$CHANGED_FILES" > /tmp/changed_sol_files.txt + fi + fi + + - name: Setup Node.js + if: steps.changed-files.outputs.has_changes == 'true' + uses: actions/setup-node@v4 + with: + node-version: '20' + + - name: Install Foundry + if: steps.changed-files.outputs.has_changes == 'true' + uses: foundry-rs/foundry-toolchain@v1 + + - name: Generate forge documentation + if: steps.changed-files.outputs.has_changes == 'true' + run: forge doc + + - name: Install template dependencies + if: steps.changed-files.outputs.has_changes == 'true' + working-directory: .github/scripts/generate-docs-utils/templates + run: npm install + + - name: Run documentation generator + if: steps.changed-files.outputs.has_changes == 'true' + env: + # AI 
Provider Configuration + GOOGLE_AI_API_KEY: ${{ secrets.GOOGLE_AI_API_KEY }} + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + SKIP_ENHANCEMENT: ${{ github.event.inputs.skip_enhancement || 'false' }} + run: | + if [ "${{ steps.changed-files.outputs.process_all }}" == "true" ]; then + node .github/scripts/generate-docs.js --all + else + node .github/scripts/generate-docs.js /tmp/changed_sol_files.txt + fi + + - name: Check for generated files + if: steps.changed-files.outputs.has_changes == 'true' + id: check-generated + run: | + # Check if any files were generated + if [ -f "docgen-summary.json" ]; then + TOTAL=$(cat docgen-summary.json | jq -r '.totalGenerated // 0' 2>/dev/null || echo "0") + if [ -n "$TOTAL" ] && [ "$TOTAL" -gt "0" ]; then + echo "has_generated=true" >> $GITHUB_OUTPUT + echo "Generated $TOTAL documentation files" + else + echo "has_generated=false" >> $GITHUB_OUTPUT + echo "No documentation files generated" + fi + else + echo "has_generated=false" >> $GITHUB_OUTPUT + fi + + - name: Verify documentation site build + if: steps.check-generated.outputs.has_generated == 'true' + working-directory: website + run: | + npm ci + npm run build + env: + ALGOLIA_APP_ID: 'dummy' + ALGOLIA_API_KEY: 'dummy' + ALGOLIA_INDEX_NAME: 'dummy' + POSTHOG_API_KEY: 'dummy' + continue-on-error: false + + - name: Generate PR body + if: steps.check-generated.outputs.has_generated == 'true' + id: pr-body + run: | + node .github/scripts/generate-docs-utils/pr-body-generator.js docgen-summary.json >> $GITHUB_OUTPUT + + - name: Clean up tmp files and stage website pages + if: steps.check-generated.outputs.has_generated == 'true' + run: | + # Remove forge docs folder (if it exists) + if [ -d "docs" ]; then + rm -rf docs + fi + + # Remove summary file (if it exists) + if [ -f "docgen-summary.json" ]; then + rm -f docgen-summary.json + fi + + # Reset any staged changes + git reset + + # Only stage website documentation files (force add in case they're ignored) + # Use library 
directory (the actual output directory) instead of contracts + if [ -d "website/docs/library" ]; then + git add -f website/docs/library/ + fi + + - name: Create Pull Request + if: steps.check-generated.outputs.has_generated == 'true' + uses: peter-evans/create-pull-request@v5 + with: + token: ${{ secrets.GITHUB_TOKEN }} + title: '[DOCS] Auto-generated Docs Pages' + commit-message: 'docs: auto-generate docs pages from NatSpec' + branch: docs/auto-generated-${{ github.run_number }} + body: ${{ steps.pr-body.outputs.body }} + labels: | + documentation + auto-generated + delete-branch: true + draft: true diff --git a/.gitignore b/.gitignore index 659adb25..5f906ef7 100644 --- a/.gitignore +++ b/.gitignore @@ -16,3 +16,13 @@ node_modules/ # Mirror of root CHANGELOG.md for Changesets src/CHANGELOG.md + +# Docusaurus +# Dependencies +website/node_modules +.github/scripts/generate-docs-utils/templates/node_modules + +# Ignore forge docs output (root level only) +/docs/ +# Ignore Docs generation summary file +docgen-summary.json \ No newline at end of file diff --git a/package-lock.json b/package-lock.json index 8f0fb3f0..a84f60a2 100644 --- a/package-lock.json +++ b/package-lock.json @@ -19,7 +19,7 @@ }, "cli": { "name": "@perfect-abstractions/compose-cli", - "version": "0.0.1", + "version": "0.0.5", "license": "MIT", "dependencies": { "fs-extra": "^11.3.3", @@ -5714,7 +5714,7 @@ }, "src": { "name": "@perfect-abstractions/compose", - "version": "0.0.1", + "version": "0.0.3", "license": "MIT" }, "website": { diff --git a/src/access/Owner/Data/OwnerDataFacet.sol b/src/access/Owner/Data/OwnerDataFacet.sol index fcd140c4..fa87b366 100644 --- a/src/access/Owner/Data/OwnerDataFacet.sol +++ b/src/access/Owner/Data/OwnerDataFacet.sol @@ -42,7 +42,7 @@ contract OwnerDataFacet { /** * @notice Exports the function selectors of the OwnerDataFacet - * @dev This function is use as a selector discovery mechanism for diamonds + * @dev Used as a selector discovery mechanism for diamonds. 
* @return selectors The exported function selectors of the OwnerDataFacet */ function exportSelectors() external pure returns (bytes memory) { diff --git a/src/access/Owner/Data/OwnerDataMod.sol b/src/access/Owner/Data/OwnerDataMod.sol index 8aa58648..12d7b395 100644 --- a/src/access/Owner/Data/OwnerDataMod.sol +++ b/src/access/Owner/Data/OwnerDataMod.sol @@ -45,6 +45,12 @@ function getStorage() pure returns (OwnerStorage storage s) { } } +/** + * @notice Sets the stored owner and emits `OwnershipTransferred` with `previousOwner == address(0)`. + * @dev Does not enforce access control. Use from trusted init paths (for example the diamond constructor); + * for guarded changes, use a transfer facet with `OwnerTransferMod` or similar. + * @param _initialOwner Address written to `OwnerStorage.owner`. + */ function setContractOwner(address _initialOwner) { OwnerStorage storage s = getStorage(); s.owner = _initialOwner; diff --git a/src/access/Owner/Renounce/OwnerRenounceFacet.sol b/src/access/Owner/Renounce/OwnerRenounceFacet.sol index 78bf3d54..6f9efcf6 100644 --- a/src/access/Owner/Renounce/OwnerRenounceFacet.sol +++ b/src/access/Owner/Renounce/OwnerRenounceFacet.sol @@ -56,7 +56,7 @@ contract OwnerRenounceFacet { /** * @notice Exports the function selectors of the OwnerRenounceFacet - * @dev This function is use as a selector discovery mechanism for diamonds + * @dev This function is used as a selector discovery mechanism for diamonds. 
 * @return selectors The exported function selectors of the OwnerRenounceFacet
 */
 function exportSelectors() external pure returns (bytes memory) {
diff --git a/src/access/Owner/Renounce/OwnerRenounceMod.sol b/src/access/Owner/Renounce/OwnerRenounceMod.sol
index 1a85f8b9..9efbb822 100644
--- a/src/access/Owner/Renounce/OwnerRenounceMod.sol
+++ b/src/access/Owner/Renounce/OwnerRenounceMod.sol
@@ -5,7 +5,7 @@ pragma solidity >=0.8.30;
  * https://compose.diamonds
  */
-/*
+/**
  * @title ERC-173 Renounce Ownership Module
  * @notice Provides logic to renounce ownership.
  */
@@ -15,7 +15,7 @@ pragma solidity >=0.8.30;
  */
 event OwnershipTransferred(address indexed previousOwner, address indexed newOwner);
-/*
+/**
  * @notice Thrown when a non-owner attempts an action restricted to owner.
  */
 error OwnerUnauthorizedAccount();
diff --git a/src/access/Owner/Transfer/OwnerTransferFacet.sol b/src/access/Owner/Transfer/OwnerTransferFacet.sol
index 5939de47..7e835a58 100644
--- a/src/access/Owner/Transfer/OwnerTransferFacet.sol
+++ b/src/access/Owner/Transfer/OwnerTransferFacet.sol
@@ -57,7 +57,7 @@ contract OwnerTransferFacet {
     /**
      * @notice Exports the function selectors of the OwnerTransferFacet
-     * @dev This function is use as a selector discovery mechanism for diamonds
+     * @dev Used as a selector discovery mechanism for diamonds.
      * @return selectors The exported function selectors of the OwnerTransferFacet
      */
     function exportSelectors() external pure returns (bytes memory) {
diff --git a/src/access/Owner/TwoSteps/Data/OwnerTwoStepDataFacet.sol b/src/access/Owner/TwoSteps/Data/OwnerTwoStepDataFacet.sol
index 445fbbac..4f07573e 100644
--- a/src/access/Owner/TwoSteps/Data/OwnerTwoStepDataFacet.sol
+++ b/src/access/Owner/TwoSteps/Data/OwnerTwoStepDataFacet.sol
@@ -40,7 +40,7 @@ contract OwnerTwoStepDataFacet {
     /**
      * @notice Exports the function selectors of the OwnerTwoStepDataFacet
-     * @dev This function is use as a selector discovery mechanism for diamonds
+     * @dev This function is used
as a selector discovery mechanism for diamonds.
 * @return selectors The exported function selectors of the OwnerTwoStepDataFacet
 */
 function exportSelectors() external pure returns (bytes memory) {
diff --git a/website/README.md b/website/README.md
index 23d9d30c..eff415d0 100644
--- a/website/README.md
+++ b/website/README.md
@@ -28,3 +28,9 @@ npm run build
 ```
 
 This command generates static content into the `build` directory and can be served using any static content hosting service.
+
+## Generate Facets & Modules Documentation
+
+```bash
+npm run generate-docs
+```
\ No newline at end of file
diff --git a/website/docs/_category_.json b/website/docs/_category_.json
index 8226f67a..e945adbe 100644
--- a/website/docs/_category_.json
+++ b/website/docs/_category_.json
@@ -3,8 +3,9 @@
   "position": 1,
   "link": {
     "type": "generated-index",
-    "description": "Learn how to contribute to Compose"
+    "description": "Learn how to contribute to Compose",
+    "slug": "/docs"
   },
   "collapsible": true,
   "collapsed": true
-}
\ No newline at end of file
+}
diff --git a/website/docs/contribution/_category_.json b/website/docs/contribution/_category_.json
index 42c2e348..03a61040 100644
--- a/website/docs/contribution/_category_.json
+++ b/website/docs/contribution/_category_.json
@@ -3,8 +3,9 @@
   "position": 5,
   "link": {
     "type": "generated-index",
-    "description": "Learn how to contribute to Compose"
+    "description": "Learn how to contribute to Compose",
+    "slug": "/docs/contribution"
   },
   "collapsible": true,
   "collapsed": true
-}
\ No newline at end of file
+}
diff --git a/website/docs/design/banned-solidity-features.mdx b/website/docs/design/banned-solidity-features.mdx
index 9824c26e..d48a1a5c 100644
--- a/website/docs/design/banned-solidity-features.mdx
+++ b/website/docs/design/banned-solidity-features.mdx
@@ -4,17 +4,19 @@ title: Banned Solidity Features
 description: Solidity language features that are banned from Compose facets and modules.
--- +import Callout from '@site/src/components/ui/Callout'; + The following Solidity language features are **banned** from Compose facets and modules. Compose restricts certain Solidity features to keep facet and library code **simpler**, **more consistent**, and **easier to reason about**. Because of Compose's architecture, many of these features are either unnecessary or less helpful. -:::note + These restrictions **do not** apply to tests. These restrictions **do not** apply to developers using Compose in their own projects. -::: + #### Banned Solidity Features @@ -36,9 +38,9 @@ contract MyContract is IMyInterface { } ``` -:::tip + If you want inheritance, your facet is probably too large. Split it into smaller facets. Compose replaces inheritance with **on-chain facet composition**. -::: + diff --git a/website/docs/design/design-for-composition.mdx b/website/docs/design/design-for-composition.mdx index 6c1ed3c7..03ec0e48 100644 --- a/website/docs/design/design-for-composition.mdx +++ b/website/docs/design/design-for-composition.mdx @@ -4,6 +4,9 @@ title: Design for Composition description: How to design Compose facets and modules for composition. --- +import Callout from '@site/src/components/ui/Callout'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; + Here are the guidelines and rules for creating composable facets. Compose replaces source-code inheritance with on-chain composition. Facets are the building blocks; diamonds wire them together. @@ -46,9 +49,9 @@ We focus on building **small, independent, and easy-to-read facets**. Each facet 8. A facet that adds new storage variables must define its own diamond storage struct. 9. Never add new variables to an existing struct. -:::info Important + Maintain the same order of variables in structs when reusing them across facets or modules. Unused variables may only be removed from the end of a struct. 
-::: + ### Exceptions @@ -98,14 +101,14 @@ Only unused variables at the **end** of a struct may be safely removed. In this Here is the final struct storage code for `ERC20PermitFacet`: -```solidity -/** + +{`/** * @notice Storage slot identifier for ERC20 (reused to access token data). */ bytes32 constant ERC20_STORAGE_POSITION = keccak256("compose.erc20"); /** - * @notice Storage struct for ERC20 but with `symbol` removed. + * @notice Storage struct for ERC20 but with \`symbol\` removed. * @dev Reused struct definition with unused variables at the end removed * @custom:storage-location erc8042:compose.erc20 */ @@ -150,8 +153,8 @@ function getStorage() internal pure returns (ERC20PermitStorage storage s) { assembly { s.slot := position } -} -``` +}`} + #### Summary: How This Example Follows the Guide - **Reusing storage struct**: The `ERC20Storage` struct is copied from `ERC20Facet` and reused at the same location in storage `keccak256("compose.erc20")`, ensuring both facets access the same ERC20 token data. This demonstrates how facets can share storage. @@ -168,8 +171,8 @@ function getStorage() internal pure returns (ERC20PermitStorage storage s) { Here's a complete example showing how to correctly extend `ERC20Facet` by creating a new `ERC20StakingFacet` that adds staking functionality: -```solidity -/** + +{`/** * SPDX-License-Identifier: MIT */ pragma solidity >=0.8.30; @@ -218,7 +221,7 @@ contract ERC20StakingFacet { /** * @notice Storage struct for ERC20 * @dev This struct is from ERC20Facet. - * `balanceOf` is the only variable used in this struct. + * \`balanceOf\` is the only variable used in this struct. * All variables after it are removed. 
* @custom:storage-location erc8042:compose.erc20 */ @@ -347,8 +350,8 @@ contract ERC20StakingFacet { function getStakingStartTime(address _account) external view returns (uint256) { return getStorage().stakingStartTimes[_account]; } -} -``` +}`} + #### Summary: How This Example Follows the Guide @@ -370,7 +373,6 @@ This example demonstrates proper facet extension by: *** -:::info Conclusion - + This level of composability strikes the right balance: it enables organized, modular, and understandable on-chain smart contract systems. -::: + diff --git a/website/docs/design/index.mdx b/website/docs/design/index.mdx index 437a6904..4cd415a7 100644 --- a/website/docs/design/index.mdx +++ b/website/docs/design/index.mdx @@ -7,6 +7,7 @@ sidebar_class_name: hidden import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Callout from '@site/src/components/ui/Callout'; This section contains the guidelines and rules for developing new facets and Solidity libraries in **Compose**. We focus on building small, independent, and easy-to-understand facets. Each facet is designed to be deployed once, then reused and composed seamlessly with others to form complete smart contract systems. @@ -46,7 +47,7 @@ This section contains the guidelines and rules for developing new facets and Sol /> -:::warning[Early Development] + Compose is still in early development and currently available only to contributors. It is not **production-ready** — use it in test or development environments only. -::: \ No newline at end of file + \ No newline at end of file diff --git a/website/docs/design/repeat-yourself.mdx b/website/docs/design/repeat-yourself.mdx index 62b50097..b7ab7676 100644 --- a/website/docs/design/repeat-yourself.mdx +++ b/website/docs/design/repeat-yourself.mdx @@ -4,6 +4,8 @@ title: Repeat Yourself description: Repeat yourself when it makes your code easier to read and understand. 
--- +import Callout from '@site/src/components/ui/Callout'; + The DRY principle — *Don't Repeat Yourself* — is a well-known rule in software development. We **intentionally** break that rule. @@ -15,4 +17,6 @@ Repetition can make smart contracts easier to read and reason about. Instead of However, DRY still has its place. For example, when a large block of code performs a complete, self-contained action and is used identically in multiple locations, moving it into an internal function can improve readability. For example, Compose's ERC-721 implementation uses an `internalTransferFrom` function to eliminate duplication while keeping the code easy to read and understand. -**Guideline:** Repeat yourself when it makes your code easier to read and understand. Use DRY sparingly and only to make code more readable. \ No newline at end of file + +Repeat yourself when it makes your code easier to read and understand. Use DRY sparingly and only to make code more readable. + \ No newline at end of file diff --git a/website/docs/foundations/composable-facets.mdx b/website/docs/foundations/composable-facets.mdx index f0282259..823693e6 100644 --- a/website/docs/foundations/composable-facets.mdx +++ b/website/docs/foundations/composable-facets.mdx @@ -4,6 +4,8 @@ title: Composable Facets description: Mix and match facets to build complex systems from simple, interoperable building blocks. --- +import Callout from '@site/src/components/ui/Callout'; + The word **"composable"** means *able to be combined with other parts to form a whole*. In **Compose**, facets are designed to be **composable**. They're built to interoperate seamlessly with other facets inside the same diamond. @@ -71,9 +73,9 @@ Diamond ArtCollection { } ``` */} -:::tip[Key Insight] + On-chain facets are the **building blocks** of Compose. Like LEGO bricks, they're designed to snap together in different configurations to build exactly what you need. 
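From a caller's point of view, the split between diamond and facets is invisible. A minimal sketch (the `IERC20Like` interface and `diamond` address below are illustrative, not Compose APIs): the function's logic lives in a facet, but callers only ever talk to the diamond address.

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.30;

/// Hypothetical minimal interface for illustration only.
interface IERC20Like {
    function balanceOf(address account) external view returns (uint256);
}

contract DiamondCaller {
    /// Callers target the diamond address; the diamond internally
    /// routes the call to whichever facet implements `balanceOf`.
    function readBalance(address diamond, address account) external view returns (uint256) {
        return IERC20Like(diamond).balanceOf(account);
    }
}
```

Nothing in the caller changes if the facet behind `balanceOf` is later replaced: the diamond address and storage stay the same.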
-::: + ## Composability Benefits diff --git a/website/docs/foundations/custom-facets.mdx b/website/docs/foundations/custom-facets.mdx index 0aa65696..42421c31 100644 --- a/website/docs/foundations/custom-facets.mdx +++ b/website/docs/foundations/custom-facets.mdx @@ -4,6 +4,8 @@ title: "Custom Functionality: Compose Your Own Facets" description: "Build your own facets that work seamlessly with existing Compose Functionality." --- +import Callout from '@site/src/components/ui/Callout'; + Many projects need custom functionality beyond the standard facets. Compose is designed for this — you can build and integrate your own facets that work seamlessly alongside existing Compose facets. @@ -41,12 +43,12 @@ contract GameNFTFacet { } } ``` -:::tip[Key Insight] + Your custom `GameNFTFacet` and the standard `ERC721Facet` both operate on the **same storage** within your diamond. This shared-storage architecture is what makes composition possible. -::: + -:::warning[Early State Development] + Compose is still in early development and currently available only to contributors. It is not **production-ready** — use it in test or development environments only. -::: + diff --git a/website/docs/foundations/diamond-contracts.mdx b/website/docs/foundations/diamond-contracts.mdx index fecef74c..fa112367 100644 --- a/website/docs/foundations/diamond-contracts.mdx +++ b/website/docs/foundations/diamond-contracts.mdx @@ -5,6 +5,7 @@ description: "Understand Diamonds from the ground up—facets, storage, delegati --- import SvgThemeRenderer from '@site/src/components/theme/SvgThemeRenderer'; +import Callout from '@site/src/components/ui/Callout'; A **diamond contract** is a smart contract that is made up of multiple parts instead of one large block of code. The diamond exists at **one address** and holds **all of the contract's storage**, but it uses separate smart contracts called **facets** to provide its functionality. 
@@ -12,9 +13,9 @@ Users interact only with the **diamond**, but the diamond's features come from i Because facets can be added, replaced, or removed, a diamond can grow and evolve over time **without changing its address** and without redeploying the entire system. -:::note[In Simple Terms] + A diamond contract is a smart contract made from multiple small building blocks (facets), allowing it to be flexible, organized, and able to grow over time. -::: + A diamond has: - One address diff --git a/website/docs/foundations/index.mdx b/website/docs/foundations/index.mdx index f4a41da1..45a8b489 100644 --- a/website/docs/foundations/index.mdx +++ b/website/docs/foundations/index.mdx @@ -8,6 +8,7 @@ import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; import DocSubtitle from '@site/src/components/docs/DocSubtitle'; import Icon from '@site/src/components/ui/Icon'; import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Callout from '@site/src/components/ui/Callout'; Compose is a new approach to smart contract development that changes how developers build and deploy smart contract systems. This section introduces the core concepts that make Compose unique. @@ -54,7 +55,12 @@ import CalloutBox from '@site/src/components/ui/CalloutBox'; /> -:::warning[Early Development] + + +Don't rush through these concepts. Taking time to understand the foundations will make everything else much easier. + + + Compose is still in early development and currently available only to contributors. It is not **production-ready** — use it in test or development environments only. 
-::: \ No newline at end of file + \ No newline at end of file diff --git a/website/docs/foundations/onchain-contract-library.mdx b/website/docs/foundations/onchain-contract-library.mdx index 9379b0a4..53a6a971 100644 --- a/website/docs/foundations/onchain-contract-library.mdx +++ b/website/docs/foundations/onchain-contract-library.mdx @@ -5,6 +5,7 @@ description: Compose provides a set of reusable on-chain contracts that already --- import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Callout from '@site/src/components/ui/Callout'; **Compose takes a different approach.** @@ -31,11 +32,11 @@ This reduces duplication, improves upgradeability, and makes smart contract syst For your next project, instead of deploying new contracts, simply **use the existing on-chain contracts** provided by Compose. -:::tip[Key Insight] + Compose is a general purpose **on-chain** smart contract library. -::: + -:::info[In Development] + Compose is still in early development, and its smart contracts haven't been deployed yet. We're actively building—and if this vision excites you, we'd love for you to join us. -::: + diff --git a/website/docs/foundations/overview.mdx b/website/docs/foundations/overview.mdx deleted file mode 100644 index 733e1038..00000000 --- a/website/docs/foundations/overview.mdx +++ /dev/null @@ -1,85 +0,0 @@ ---- -sidebar_position: 10 -title: Overview -description: Overview of Compose foundations—core concepts, authentication, facets and modules, diamond standard, and storage patterns for diamond-based smart contract development. -draft: true ---- - -import DocHero from '@site/src/components/docs/DocHero'; -import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; -import Callout from '@site/src/components/ui/Callout'; - - - -## Core Concepts - -

- Understanding these fundamental concepts will help you build robust, scalable smart contract systems with Compose. -

- - - } - href="/docs/foundations/authentication" - /> - } - href="/docs/foundations/facets-and-modules" - /> - } - href="/docs/" - /> - } - href="/docs/" - /> - - -## Advanced Topics - - - } - href="/docs/" - /> - } - href="/docs/" - /> - - - -We recommend starting with **Facets & Modules** to understand the core architecture, then moving to **Storage Patterns** to see how it all works together. - - -## Why These Matter - -The concepts in this section form the foundation of everything you'll build with Compose: - -- **Authentication** ensures your contracts have proper access control -- **Facets & Modules** explain how to structure your code -- **Diamond Standard** provides the underlying architecture -- **Storage Patterns** enable the shared state that makes it all work - - -Don't rush through these concepts. Taking time to understand the foundations will make everything else much easier and prevent common mistakes. - - diff --git a/website/docs/foundations/reusable-facet-logic.mdx b/website/docs/foundations/reusable-facet-logic.mdx index 10fd0fdf..fc511550 100644 --- a/website/docs/foundations/reusable-facet-logic.mdx +++ b/website/docs/foundations/reusable-facet-logic.mdx @@ -5,6 +5,7 @@ description: Deploy once, reuse everywhere. Compose facets are shared across tho --- import DiamondFacetsSVG from '@site/static/img/svg/compose_diamond_facets.svg' +import Callout from '@site/src/components/ui/Callout'; You might be wondering: **How can I create a new project without deploying new smart contracts?** @@ -53,13 +54,13 @@ If 1,000 projects use the same `ERC20Facet`: - **Millions in gas costs avoided** - **1,000 projects** benefit from the same audited, battle-tested code -:::tip[Key Insight] + Many diamond contracts can be deployed that **reuse the same on-chain facets**. -::: + -:::tip[Key Insight] + Each diamond manages **its own storage data** by using the code from facets. 
-::: + diff --git a/website/docs/foundations/solidity-modules.mdx b/website/docs/foundations/solidity-modules.mdx index d8d40e16..4a6a18d4 100644 --- a/website/docs/foundations/solidity-modules.mdx +++ b/website/docs/foundations/solidity-modules.mdx @@ -4,6 +4,8 @@ title: Solidity Modules description: What Solidity modules are, how they differ from contracts and libraries, and how Compose uses them for reusable facet logic and shared storage. --- +import ExpandableCode from '@site/src/components/code/ExpandableCode'; + Solidity **modules** are Solidity files whose top-level code lives *outside* of contracts and Solidity libraries. They contain reusable logic that gets pulled into other contracts at compile time. @@ -33,8 +35,8 @@ Compose uses clear naming patterns to distinguish Solidity file types: Here is an example of a Solidity module that implements contract ownership functionality: -```solidity -// SPDX-License-Identifier: MIT + +{`// SPDX-License-Identifier: MIT pragma solidity >=0.8.30; /* @@ -87,13 +89,13 @@ function requireOwner() view { if (getStorage().owner != msg.sender) { revert OwnerUnauthorizedAccount(); } -} -``` +}`} + Here is an example of a diamond contract that uses Solidity modules to implement ERC-2535 Diamonds: -```solidity -// SPDX-License-Identifier: MIT + +{`// SPDX-License-Identifier: MIT pragma solidity >=0.8.30; import "../DiamondMod.sol" as DiamondMod; @@ -146,7 +148,7 @@ contract ExampleDiamond { } receive() external payable {} -} -``` +}`} + diff --git a/website/docs/getting-started/_category_.json b/website/docs/getting-started/_category_.json index 74b10c34..f17bd463 100644 --- a/website/docs/getting-started/_category_.json +++ b/website/docs/getting-started/_category_.json @@ -3,9 +3,9 @@ "position": 3, "link": { "type": "generated-index", - "description": "Learn how to install and configure Compose for your smart contract projects." 
+ "description": "Learn how to install and configure Compose for your smart contract projects.", + "slug": "/docs/getting-started" }, "collapsible": true, "collapsed": true } - diff --git a/website/docs/getting-started/installation.md b/website/docs/getting-started/installation.md index c4f9dc3d..526b0c1f 100644 --- a/website/docs/getting-started/installation.md +++ b/website/docs/getting-started/installation.md @@ -3,6 +3,7 @@ sidebar_position: 1 --- import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Callout from '@site/src/components/ui/Callout'; # Installation @@ -69,7 +70,7 @@ import {DiamondMod} from "@perfect-abstractions/compose/diamond/DiamondMod.sol"; Now that you have Compose installed, let's understand the core concepts: - **[Core Concepts](/docs/foundations)** - Learn about facets, libraries, and shared storage -- **[Explore Available Contracts](https://github.com/Perfect-Abstractions/Compose/tree/main/src)** - See what else you can add +- **[Explore Available Contracts](/docs/library)** - See what else you can add ## Getting Help @@ -79,7 +80,7 @@ Having trouble with installation? - Ask in **[Discord](https://discord.gg/compose)** - Open an **[issue on GitHub](https://github.com/Perfect-Abstractions/Compose/issues)** -:::tip Development Environment -We recommend using VSCode with the [**Solidity** extension](https://github.com/juanfranblanco/vscode-solidity) by Juan Blanco for the best development experience. -::: + +We recommend using VSCode with the **Solidity** extension by Juan Blanco for the best development experience. + diff --git a/website/docs/getting-started/quick-start.md b/website/docs/getting-started/quick-start.md deleted file mode 100644 index 318fd0e6..00000000 --- a/website/docs/getting-started/quick-start.md +++ /dev/null @@ -1,264 +0,0 @@ ---- -sidebar_position: 2 -draft: true ---- - -# Quick Start - -Let's build your first diamond using Compose facets in under 5 minutes! 
🚀 - -## What We'll Build - -We'll create a simple ERC-20 token diamond that demonstrates: -- How to use Compose facets -- How shared storage works -- How to deploy and interact with your diamond - -## Step 1: Set Up Your Project - -```bash -# Create a new Foundry project -forge init my-diamond -cd my-diamond - -# Install Compose (when available as dependency) -# For now, clone the repository -git clone https://github.com/Perfect-Abstractions/Compose.git lib/Compose -``` - -## Step 2: Create Your Diamond Contract - -Create `src/MyTokenDiamond.sol`: - -```solidity -// SPDX-License-Identifier: MIT -pragma solidity ^0.8.24; - -import {Diamond} from "compose/Diamond.sol"; - -/// @title MyTokenDiamond -/// @notice A diamond that implements ERC-20 functionality -contract MyTokenDiamond is Diamond { - constructor( - address owner, - address diamondCutFacet - ) Diamond(owner, diamondCutFacet) { - // Diamond is initialized and ready to receive facets - } -} -``` - -## Step 3: Deploy Script - -Create `script/DeployMyDiamond.s.sol`: - -```solidity -// SPDX-License-Identifier: MIT -pragma solidity ^0.8.24; - -import {Script} from "forge-std/Script.sol"; -import {MyTokenDiamond} from "../src/MyTokenDiamond.sol"; -import {DiamondCutFacet} from "compose/facets/DiamondCutFacet.sol"; -import {ERC20Facet} from "compose/facets/ERC20Facet.sol"; -import {IDiamondCut} from "compose/interfaces/IDiamondCut.sol"; - -contract DeployMyDiamond is Script { - function run() external { - uint256 deployerPrivateKey = vm.envUint("PRIVATE_KEY"); - address deployer = vm.addr(deployerPrivateKey); - - vm.startBroadcast(deployerPrivateKey); - - // 1. Deploy DiamondCutFacet - DiamondCutFacet diamondCutFacet = new DiamondCutFacet(); - - // 2. Deploy Diamond - MyTokenDiamond diamond = new MyTokenDiamond( - deployer, - address(diamondCutFacet) - ); - - // 3. Deploy ERC20Facet - ERC20Facet erc20Facet = new ERC20Facet(); - - // 4. 
Add ERC20Facet to diamond - IDiamondCut.FacetCut[] memory cuts = new IDiamondCut.FacetCut[](1); - - bytes4[] memory erc20Selectors = new bytes4[](9); - erc20Selectors[0] = ERC20Facet.name.selector; - erc20Selectors[1] = ERC20Facet.symbol.selector; - erc20Selectors[2] = ERC20Facet.decimals.selector; - erc20Selectors[3] = ERC20Facet.totalSupply.selector; - erc20Selectors[4] = ERC20Facet.balanceOf.selector; - erc20Selectors[5] = ERC20Facet.transfer.selector; - erc20Selectors[6] = ERC20Facet.allowance.selector; - erc20Selectors[7] = ERC20Facet.approve.selector; - erc20Selectors[8] = ERC20Facet.transferFrom.selector; - - cuts[0] = IDiamondCut.FacetCut({ - facetAddress: address(erc20Facet), - action: IDiamondCut.FacetCutAction.Add, - functionSelectors: erc20Selectors - }); - - IDiamondCut(address(diamond)).diamondCut(cuts, address(0), ""); - - vm.stopBroadcast(); - - console.log("Diamond deployed at:", address(diamond)); - console.log("ERC20Facet deployed at:", address(erc20Facet)); - } -} -``` - -## Step 4: Create Initialization Facet - -For initializing your token with name, symbol, and initial supply: - -```solidity -// src/facets/TokenInitFacet.sol -// SPDX-License-Identifier: MIT -pragma solidity ^0.8.24; - -import {LibERC20} from "compose/libraries/LibERC20.sol"; - -contract TokenInitFacet { - function init( - string memory name, - string memory symbol, - uint8 decimals, - uint256 initialSupply, - address recipient - ) external { - LibERC20.ERC20Storage storage s = LibERC20.getStorage(); - - require(bytes(s.name).length == 0, "Already initialized"); - - s.name = name; - s.symbol = symbol; - s.decimals = decimals; - - if (initialSupply > 0) { - s.totalSupply = initialSupply; - s.balances[recipient] = initialSupply; - } - } -} -``` - -## Step 5: Test Your Diamond - -Create `test/MyTokenDiamond.t.sol`: - -```solidity -// SPDX-License-Identifier: MIT -pragma solidity ^0.8.24; - -import {Test} from "forge-std/Test.sol"; -import {MyTokenDiamond} from 
"../src/MyTokenDiamond.sol"; -import {ERC20Facet} from "compose/facets/ERC20Facet.sol"; -import {IERC20} from "compose/interfaces/IERC20.sol"; - -contract MyTokenDiamondTest is Test { - MyTokenDiamond diamond; - IERC20 token; - - address owner = address(1); - address user1 = address(2); - address user2 = address(3); - - function setUp() public { - // Deploy and configure diamond - // (Diamond cut logic here - see deploy script) - - // Cast diamond to IERC20 interface - token = IERC20(address(diamond)); - } - - function test_Transfer() public { - vm.prank(owner); - token.transfer(user1, 100 ether); - - assertEq(token.balanceOf(user1), 100 ether); - } - - function test_Approve() public { - vm.prank(user1); - token.approve(user2, 50 ether); - - assertEq(token.allowance(user1, user2), 50 ether); - } - - function test_TransferFrom() public { - // Setup: owner has tokens, user1 is approved - vm.prank(owner); - token.approve(user1, 100 ether); - - // Act: user1 transfers from owner to user2 - vm.prank(user1); - token.transferFrom(owner, user2, 50 ether); - - // Assert - assertEq(token.balanceOf(user2), 50 ether); - } -} -``` - -## Step 6: Run and Deploy - -```bash -# Run tests -forge test - -# Deploy to local network -anvil # In another terminal - -# Deploy -forge script script/DeployMyDiamond.s.sol:DeployMyDiamond --rpc-url http://localhost:8545 --broadcast -``` - -## Understanding What Happened - -Let's break down what you just built: - -1. **Diamond Base**: Your `MyTokenDiamond` inherits from `Diamond`, which provides the core diamond functionality -2. **Facet Integration**: You added `ERC20Facet`, a complete ERC-20 implementation -3. **Shared Storage**: The facet and your initialization logic both use `LibERC20` to access the same storage -4. 
**Upgradeable**: You can add more facets anytime using `diamondCut()` - -### The Power of Composition - -Your diamond now has: -- ✅ Full ERC-20 functionality -- ✅ Upgradeability via diamond cuts -- ✅ Ability to add more facets (ERC-721, access control, etc.) -- ✅ Separation of concerns (each facet handles one responsibility) - -## Next Steps - -Congratulations! 🎉 You've created your first Compose diamond. Now: - -- **[Learn Core Concepts](/)** - Understand the architecture deeply -- **[Explore Available Facets](/)** - See what else you can add -- **[Custom Facets](/)** - Build your own facets -- **[Best Practices](/)** - Write production-ready code - -## Common Issues - -### "Cannot find Diamond.sol" - -Make sure Compose is properly installed in `lib/Compose/` and your remappings are configured. - -### "DiamondCut failed" - -Check that: -- Function selectors are correct -- Facet address is valid -- You have the right permissions - -### Need Help? - -- 💬 **[Join Discord](https://discord.gg/compose)** - Get immediate help -- 📚 **[Read FAQ](/)** - Common questions answered -- 🐛 **[Report Issues](https://github.com/Perfect-Abstractions/Compose/issues)** - Found a bug? - diff --git a/website/docs/getting-started/your-first-diamond.md b/website/docs/getting-started/your-first-diamond.md deleted file mode 100644 index 73b91dbb..00000000 --- a/website/docs/getting-started/your-first-diamond.md +++ /dev/null @@ -1,371 +0,0 @@ ---- -sidebar_position: 3 -draft: true ---- - -# Your First Diamond - -In this guide, you'll learn how to create a diamond from scratch and understand every piece of the architecture. - -## What is a Diamond? - -A **diamond** is a smart contract that follows the **ERC-2535 Diamond Standard**. 
Think of it as a modular smart contract system where you can: - -- **Add, replace, or remove functionality** after deployment -- **Combine multiple facets** (modules) into one address -- **Share storage** across all facets using a unified storage layout -- **Exceed the 24KB contract size limit** by splitting code across facets - -### The Diamond Architecture - -``` -┌─────────────────────────────────────┐ -│ Diamond Proxy │ -│ (Single Address, Delegatecalls) │ -└─────────────┬───────────────────────┘ - │ - ┌───────┴────────┬──────────────┐ - │ │ │ -┌─────▼─────┐ ┌──────▼──────┐ ┌───▼────┐ -│ Facet A │ │ Facet B │ │ Facet C│ -│ (ERC-20) │ │ (ERC-721) │ │ (Owner)│ -└───────────┘ └─────────────┘ └────────┘ - │ │ │ - └────────────────┴──────────────┘ - │ - ┌───────────▼────────────┐ - │ Shared Storage │ - │ (Diamond Storage) │ - └───────────────────────┘ -``` - -## Step-by-Step Guide - -### 1. Understanding the Components - -A complete diamond consists of: - -1. **Diamond Contract** - The main contract that users interact with -2. **DiamondCutFacet** - Manages adding/removing/replacing facets -3. **Functional Facets** - Your actual business logic (ERC-20, etc.) -4. **Libraries** - Helper functions for custom facets - -### 2. Create the Diamond Base - -```solidity -// SPDX-License-Identifier: MIT -pragma solidity ^0.8.24; - -import {Diamond} from "compose/Diamond.sol"; - -/// @title MyDiamond -/// @notice A customizable diamond contract -/// @dev Inherits core diamond functionality from Compose -contract MyDiamond is Diamond { - /// @notice Creates a new diamond - /// @param _contractOwner The address that will own this diamond - /// @param _diamondCutFacet The address of the DiamondCutFacet - constructor( - address _contractOwner, - address _diamondCutFacet - ) Diamond(_contractOwner, _diamondCutFacet) { - // The diamond is now ready to receive facets - } -} -``` - -### 3. 
Deploy DiamondCutFacet - -The DiamondCutFacet is special—it's the only facet that must be added during construction: - -```solidity -import {DiamondCutFacet} from "compose/facets/DiamondCutFacet.sol"; - -// Deploy it once -DiamondCutFacet diamondCutFacet = new DiamondCutFacet(); - -// Pass its address to your diamond -MyDiamond diamond = new MyDiamond( - msg.sender, // owner - address(diamondCutFacet) -); -``` - -### 4. Add Your First Facet - -Let's add ERC-20 functionality: - -```solidity -import {ERC20Facet} from "compose/facets/ERC20Facet.sol"; -import {IDiamondCut} from "compose/interfaces/IDiamondCut.sol"; - -// 1. Deploy the facet -ERC20Facet erc20Facet = new ERC20Facet(); - -// 2. Prepare the function selectors -bytes4[] memory selectors = new bytes4[](9); -selectors[0] = ERC20Facet.name.selector; -selectors[1] = ERC20Facet.symbol.selector; -selectors[2] = ERC20Facet.decimals.selector; -selectors[3] = ERC20Facet.totalSupply.selector; -selectors[4] = ERC20Facet.balanceOf.selector; -selectors[5] = ERC20Facet.transfer.selector; -selectors[6] = ERC20Facet.allowance.selector; -selectors[7] = ERC20Facet.approve.selector; -selectors[8] = ERC20Facet.transferFrom.selector; - -// 3. Create the facet cut -IDiamondCut.FacetCut[] memory cut = new IDiamondCut.FacetCut[](1); -cut[0] = IDiamondCut.FacetCut({ - facetAddress: address(erc20Facet), - action: IDiamondCut.FacetCutAction.Add, - functionSelectors: selectors -}); - -// 4. Execute the diamond cut -IDiamondCut(address(diamond)).diamondCut(cut, address(0), ""); -``` - -### 5. 
Initialize Your Token - -Create an init facet for one-time setup: - -```solidity -// SPDX-License-Identifier: MIT -pragma solidity ^0.8.24; - -import {LibERC20} from "compose/libraries/LibERC20.sol"; - -contract ERC20InitFacet { - /// @notice Initialize the ERC-20 token - /// @dev Can only be called once - function initERC20( - string calldata _name, - string calldata _symbol, - uint8 _decimals, - uint256 _initialSupply - ) external { - LibERC20.ERC20Storage storage s = LibERC20.getStorage(); - - // Ensure we haven't initialized yet - require(bytes(s.name).length == 0, "Already initialized"); - - // Set token details - s.name = _name; - s.symbol = _symbol; - s.decimals = _decimals; - - // Mint initial supply to caller - if (_initialSupply > 0) { - s.totalSupply = _initialSupply; - s.balances[msg.sender] = _initialSupply; - - emit Transfer(address(0), msg.sender, _initialSupply); - } - } - - event Transfer(address indexed from, address indexed to, uint256 value); -} -``` - -Call it via `diamondCut` with initialization data: - -```solidity -// Deploy init facet -ERC20InitFacet initFacet = new ERC20InitFacet(); - -// Prepare init data -bytes memory initData = abi.encodeWithSelector( - ERC20InitFacet.initERC20.selector, - "My Token", - "MTK", - 18, - 1_000_000 ether -); - -// Execute diamond cut with initialization -IDiamondCut(address(diamond)).diamondCut( - cut, // empty or with more facets - address(initFacet), - initData -); -``` - -### 6. Interact with Your Diamond - -Now you can use your diamond like any ERC-20 token: - -```solidity -import {IERC20} from "compose/interfaces/IERC20.sol"; - -// Cast diamond to ERC-20 interface -IERC20 token = IERC20(address(diamond)); - -// Use standard ERC-20 functions -uint256 balance = token.balanceOf(msg.sender); -token.transfer(recipient, 100 ether); -token.approve(spender, 1000 ether); -``` - -## Key Concepts Explained - -### Delegatecall Magic - -When you call a function on the diamond: -1. Diamond receives the call -2. 
Diamond looks up which facet implements that function -3. Diamond `delegatecalls` to that facet -4. Facet executes using **diamond's storage** -5. Result is returned to caller - -``` -User → Diamond.transfer() - ↓ (delegatecall) - ERC20Facet.transfer() - ↓ (reads/writes) - Diamond's Storage -``` - -### Shared Storage - -All facets share the same storage in the diamond. This is why libraries like `LibERC20` are crucial—they ensure everyone accesses storage at the same location: - -```solidity -// Both the facet and your custom code access the same storage -LibERC20.ERC20Storage storage s = LibERC20.getStorage(); -// Always returns storage at keccak256("compose.erc20") -``` - -### Function Selectors - -Each function has a unique 4-byte signature (selector): - -```solidity -bytes4 selector = ERC20Facet.transfer.selector; -// selector = 0xa9059cbb (first 4 bytes of keccak256("transfer(address,uint256)")) -``` - -The diamond uses these selectors to route calls to the correct facet. - -## Complete Example - -Here's a complete deployment script: - -```solidity -// SPDX-License-Identifier: MIT -pragma solidity ^0.8.24; - -import "forge-std/Script.sol"; -import {MyDiamond} from "../src/MyDiamond.sol"; -import {DiamondCutFacet} from "compose/facets/DiamondCutFacet.sol"; -import {ERC20Facet} from "compose/facets/ERC20Facet.sol"; -import {ERC20InitFacet} from "../src/facets/ERC20InitFacet.sol"; -import {IDiamondCut} from "compose/interfaces/IDiamondCut.sol"; - -contract DeployDiamond is Script { - function run() external { - vm.startBroadcast(); - - // Step 1: Deploy DiamondCutFacet - DiamondCutFacet cutFacet = new DiamondCutFacet(); - - // Step 2: Deploy Diamond - MyDiamond diamond = new MyDiamond(msg.sender, address(cutFacet)); - - // Step 3: Deploy ERC20Facet - ERC20Facet erc20 = new ERC20Facet(); - - // Step 4: Deploy InitFacet - ERC20InitFacet initFacet = new ERC20InitFacet(); - - // Step 5: Prepare facet cuts - bytes4[] memory selectors = getERC20Selectors(); - 
IDiamondCut.FacetCut[] memory cuts = new IDiamondCut.FacetCut[](1); - cuts[0] = IDiamondCut.FacetCut({ - facetAddress: address(erc20), - action: IDiamondCut.FacetCutAction.Add, - functionSelectors: selectors - }); - - // Step 6: Prepare init data - bytes memory initData = abi.encodeWithSelector( - ERC20InitFacet.initERC20.selector, - "Compose Token", - "COMP", - 18, - 1_000_000 ether - ); - - // Step 7: Execute diamond cut with initialization - IDiamondCut(address(diamond)).diamondCut( - cuts, - address(initFacet), - initData - ); - - vm.stopBroadcast(); - - console.log("Diamond deployed at:", address(diamond)); - } - - function getERC20Selectors() internal pure returns (bytes4[] memory) { - bytes4[] memory selectors = new bytes4[](9); - selectors[0] = ERC20Facet.name.selector; - selectors[1] = ERC20Facet.symbol.selector; - selectors[2] = ERC20Facet.decimals.selector; - selectors[3] = ERC20Facet.totalSupply.selector; - selectors[4] = ERC20Facet.balanceOf.selector; - selectors[5] = ERC20Facet.transfer.selector; - selectors[6] = ERC20Facet.allowance.selector; - selectors[7] = ERC20Facet.approve.selector; - selectors[8] = ERC20Facet.transferFrom.selector; - return selectors; - } -} -``` - -## Testing Your Diamond - -```solidity -// test/MyDiamond.t.sol -import {Test} from "forge-std/Test.sol"; -import {IERC20} from "compose/interfaces/IERC20.sol"; - -contract MyDiamondTest is Test { - address diamond; - IERC20 token; - - function setUp() public { - // Deploy your diamond (using the script above) - // ... - - token = IERC20(diamond); - } - - function testTokenName() public { - assertEq(token.name(), "Compose Token"); - } - - function testTransfer() public { - address recipient = address(0x123); - uint256 amount = 100 ether; - - token.transfer(recipient, amount); - assertEq(token.balanceOf(recipient), amount); - } -} -``` - -## Next Steps - -You now understand how to build a diamond from scratch! 
Continue learning: - -- **[Core Concepts: Facets and Libraries](/)** - Deep dive into the architecture -- **[Available Facets](/)** - Explore what Compose provides -- **[Creating Custom Facets](/)** - Build your own facets -- **[Upgrading Diamonds](/)** - Learn about diamond cuts - -:::tip Pro Tip -In production, consider using a multi-sig wallet or DAO for the diamond owner to ensure secure upgrades. -::: - diff --git a/website/docs/library/_category_.json b/website/docs/library/_category_.json new file mode 100644 index 00000000..04125e1e --- /dev/null +++ b/website/docs/library/_category_.json @@ -0,0 +1,10 @@ +{ + "label": "Library", + "position": 4, + "collapsible": true, + "collapsed": true, + "link": { + "type": "doc", + "id": "library/index" + } +} diff --git a/website/docs/library/diamond/DiamondInspectFacet.mdx b/website/docs/library/diamond/DiamondInspectFacet.mdx new file mode 100644 index 00000000..716c2006 --- /dev/null +++ b/website/docs/library/diamond/DiamondInspectFacet.mdx @@ -0,0 +1,250 @@ +--- +sidebar_position: 2 +title: "Diamond Inspect Facet" +description: "Inspect diamond facets and selectors" +sidebar_label: "Inspect Facet" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/diamond/DiamondInspectFacet.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Callout from '@site/src/components/ui/Callout'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; + + +It lets you inspect the diamond’s routing table and retrieve the registered facet list. + + + +- Query which facet a function selector is routed to. +- Enumerate registered facets (and their exported selectors). +- Read the diamond’s composition from shared diamond storage (`DIAMOND_STORAGE_POSITION`). 
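The queries above can be sketched from another contract. The `IDiamondInspect` interface below is hand-written from the function signatures documented on this page; `diamond` is assumed to be any diamond with this facet installed.

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.30;

/// Hand-written interface matching the view functions documented below.
interface IDiamondInspect {
    function facetAddress(bytes4 _functionSelector) external view returns (address);
    function facetAddresses() external view returns (address[] memory);
}

contract InspectExample {
    /// True if `diamond` routes `selector` to some facet.
    /// Relies on `facetAddress` returning address(0) for unrouted selectors.
    function isRouted(address diamond, bytes4 selector) external view returns (bool) {
        return IDiamondInspect(diamond).facetAddress(selector) != address(0);
    }

    /// Number of facets the diamond currently registers.
    function facetCount(address diamond) external view returns (uint256) {
        return IDiamondInspect(diamond).facetAddresses().length;
    }
}
```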
+ +## Storage + +### State Variables + + + +### Diamond Storage + + + +{`/** storage-location: erc8042:erc8153.diamond */ +struct DiamondStorage { + mapping(bytes4 functionSelector => FacetNode) facetNodes; + FacetList facetList; +} + +struct FacetList { + bytes4 headFacetNodeId; + bytes4 tailFacetNodeId; + uint32 facetCount; + uint32 selectorCount; +} + +struct FacetNode { + address facet; + bytes4 prevFacetNodeId; + bytes4 nextFacetNodeId; +} +`} + + +--- +### Facet + +Used as a return type for the [`facets`](#facets) function. + + +{`struct Facet { + address facet; + bytes4[] functionSelectors; +}`} + + +## Functions + +### facetAddress + +Gets the facet address that handles the given selector. If facet is not found, return `address(0)`. + + +{`function facetAddress(bytes4 _functionSelector) external view returns (address facet);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### facetFunctionSelectors + +Gets the function selectors exported by the given facet and returns them as `bytes4[]`. + +- If the diamond is not routing the facet's first exported selector, this function returns an empty array. +- If `_facet` does not implement `exportSelectors()` with the expected packing, the call will revert. + + +{`function facetFunctionSelectors(address _facet) external view returns (bytes4[] memory facetSelectors);`} + + +**Parameters:** + + (packed `bytes4` selectors)." + } + ]} + showRequired={false} +/> + +**Returns:** + + + +--- +### facetAddresses + +Gets the facet addresses used by the diamond. + +If no facets are registered, this returns an empty array. + + +{`function facetAddresses() external view returns (address[] memory allFacets);`} + + +**Returns:** + + + +--- +### facets + +Returns the facet address and function selectors of all facets in the diamond. + + +{`function facets() external view returns (Facet[] memory facetsAndSelectors);`} + + +**Returns:** + + + An array of Facet structs containing each facet address and its function selectors. 
+
+        ) } ]} showRequired={false} />
+
+## Selector Packing Notes (`exportSelectors()`)
+
+`exportSelectors()` returns a packed `bytes` blob where each `bytes4` selector is encoded into 4 consecutive bytes (a `bytes.concat(sel1, sel2, ...)` style packing).
+
+`DiamondInspectFacet` follows that convention:
+
+- `facetFunctionSelectors(facet)` calls `facet.exportSelectors()` and unpacks the packed bytes into a `bytes4[]`.
+- `facets()` does the same for every facet discovered in the diamond.
+
+
+
+## Best Practices
+
+- Integrate this facet to enable external inspection of diamond facet mappings (useful for tooling and indexing).
+- Use `facetAddress` to determine which facet handles a specific function selector.
+- Use `facets` and `facetFunctionSelectors` for comprehensive diamond structure analysis.
+
+## Security Considerations
+
+Most functions are `view`/`pure` and do not mutate state.
+
+However, `facetFunctionSelectors()` and `facets()` perform external calls to `IFacet(facet).exportSelectors()`.
+
+If the provided address is not a valid `IFacet` implementation (or if `exportSelectors()` reverts / returns unexpected data), these calls can revert.
+
+Therefore, always provide verified and trusted facet addresses.
+
+
diff --git a/website/docs/library/diamond/DiamondMod.mdx b/website/docs/library/diamond/DiamondMod.mdx
new file mode 100644
index 00000000..91c69ee3
--- /dev/null
+++ b/website/docs/library/diamond/DiamondMod.mdx
@@ -0,0 +1,472 @@
+---
+sidebar_position: 1
+title: "Diamond Module"
+description: "Internal functions and storage for diamond proxy functionality."
+sidebar_label: "Diamond Module" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/diamond/DiamondMod.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; + + +This module provides core internal functions and storage management for diamond proxies. + + + +- Provides functions for diamond proxy operations. +- Manages diamond storage using a dedicated storage slot (`"erc8153.diamond"`). +- Supports facet registration and retrieval through internal mechanisms. + + + +## Storage + +### State Variables + + + +### Diamond Storage + + + +{`/** storage-location: erc8042:erc8153.diamond */ +struct DiamondStorage { + mapping(bytes4 functionSelector => FacetNode) facetNodes; + FacetList facetList; +} + +struct FacetList { + bytes4 headFacetNodeId; + bytes4 tailFacetNodeId; + uint32 facetCount; + uint32 selectorCount; +} + +struct FacetNode { + address facet; + bytes4 prevFacetNodeId; + bytes4 nextFacetNodeId; +} +`} + + +## Functions + +### getDiamondStorage + + +{`function getDiamondStorage() pure returns (DiamondStorage storage s);`} + + +**Returns:** + + + +--- +### diamondFallback + +This is the core dispatch path used by the diamond proxy pattern. 
The fallback function is used to route unmatched calls to the facet registered for the called selector (`msg.sig`) in diamond storage. + + +{`function diamondFallback() ;`} + + +**Execution Flow:** + +- Loads `DiamondStorage` and resolves the facet address for the called selector. +- Reverts if no facet is registered for the selector (see [`FunctionNotFound`](#errors) error). +- Copies calldata to memory (`calldatacopy`) and executes `delegatecall` on the resolved facet. +- Copies returndata (`returndatacopy`) and bubbles the exact result back to the caller. + + +`delegatecall` executes facet code in the diamond's context. + +State writes happen inside the diamond storage, and `msg.sender`/`msg.value` are preserved from the original external call. + + +--- + +### addFacets + +Registers one or more facets to a diamond. For each facet, selectors are discovered by calling `exportSelectors()`. Each selector is then mapped to the facet in diamond storage. + +Reverts if any selector is already registered on the diamond. + + +{`function addFacets(address[] memory _facets) ;`} + + +**Parameters:** + + + The facet addresses to add. Each must implement{" "} + IFacet + {" "}and return selectors from exportSelectors(). + + ) + } + ]} + showRequired={false} +/> + +--- +### importSelectors + +Retrieves the function selectors exposed by a facet by calling its `exportSelectors()`. Validates the returned ABI-encoded `bytes` (offset, length, and that the payload length is a multiple of 4) and returns the packed selectors without copying (zero-copy decode). + +Used internally by [`addFacets`](#addfacets). + + +{`function importSelectors(address _facet) view returns (bytes memory selectors);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### at + +Returns the 4-byte function selector at the given index in a packed `bytes` array of selectors (e.g. from [`importSelectors`](#importselectors) or other selector packing). 
+ + +{`function at(bytes memory selectors, uint256 index) pure returns (bytes4 selector);`} + + +**Parameters:** + + + +**Returns:** + + + +## Events + + + +
+ Emitted when a facet is added to a diamond. + + The function selectors this facet handles can be retrieved by calling `IFacet(_facet).exportSelectors()` +
+ +
+ Signature: + +{`event FacetAdded(address indexed _facet);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when a facet is removed from a diamond. + + The function selectors this facet handles can be retrieved by calling `IFacet(_facet).exportSelectors()` +
+ +
+ Signature: + +{`event FacetRemoved(address indexed _facet);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when an existing facet is replaced with a new facet. + + - Selectors that are present in the new facet but not in the old facet are added to the diamond. + - Selectors that are present in both the new and old facet are updated to use the new facet. + - Selectors that are not present in the new facet but are present in the old facet are removed from the diamond. + + The function selectors handled by these facets can be retrieved by calling: + - `IFacet(_oldFacet).exportSelectors()` + - `IFacet(_newFacet).exportSelectors()` +
+ +
+ Signature: + +{`event FacetReplaced(address indexed _oldFacet, address indexed _newFacet);`} + +
+ +
+ Parameters: + +
+
+ +
+    Emitted when a diamond's constructor function or a function from a facet makes a `delegatecall`.
+
+ +
+ Signature: + +{`event DiamondDelegateCall(address indexed _delegate, bytes _delegateCalldata);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted to record information about a diamond. This event records any arbitrary metadata. + + The format of `_tag` and `_data` are not specified by the standard. +
+ +
+ Signature: + +{`event DiamondMetadata(bytes32 indexed _tag, bytes _data);`} + +
+ +
+ Parameters: + +
+
+
+ +## Errors + + + +
+ Thrown by [addFacets](#addfacets) when a selector from one of the facets is already registered in the diamond via another facet. +
+
+ Signature: + +error CannotAddFunctionToDiamondThatAlreadyExists(bytes4 _selector); + +
+
+ +
+ Thrown when a selector cannot be found inside the diamond during the fallback function (see [`diamondFallback`](#diamondfallback)). +
+
+ Signature: + +error FunctionNotFound(bytes4 _selector); + +
+
+ +
+ Thrown by [importSelectors](#importselectors) when the staticcall to _facet.exportSelectors() fails. +
+
+ Signature: + +error FunctionSelectorsCallFailed(address _facet); + +
+
+ +
+ Thrown by [importSelectors](#importselectors) when the facet returns data that is not valid ABI-encoded bytes (e.g. wrong offset, length not a multiple of 4, or length exceeds payload). +
+
+ Signature: + +error IncorrectSelectorsEncoding(address _facet); + +
+
+ +
+ Thrown by [importSelectors](#importselectors) when the facet address has no code (e.g. EOA or uninitialized contract). +
+
+ Signature: + +error NoBytecodeAtAddress(address _contractAddress); + +
+
+ +
+ Thrown by [importSelectors](#importselectors) when the facet returns fewer than 4 bytes of selectors (i.e. no valid selector). +
+
+ Signature: + +error NoSelectorsForFacet(address _facet); + +
+
+
+ +## Best Practices + +- Ensure that facet registration functions (like `addFacets` and `importSelectors`) are called only during diamond initialization or controlled upgrade processes. +- Verify that the `DiamondStorage` struct is correctly defined and that any new fields are added at the end to maintain storage layout compatibility. +- Handle custom errors such as `CannotAddFunctionToDiamondThatAlreadyExists` and `NoSelectorsForFacet` to ensure robust error management. + +## Integration Notes + +This module interacts directly with the diamond's shared storage at the `DIAMOND_STORAGE_POSITION`, which is identified by `keccak256("erc8153.diamond")`. All functions within this module read from and write to this shared storage. + +Changes to the `facetList` or other storage elements are immediately visible to any facet that accesses the same storage slot. + + diff --git a/website/docs/library/diamond/DiamondUpgradeFacet.mdx b/website/docs/library/diamond/DiamondUpgradeFacet.mdx new file mode 100644 index 00000000..8fdae2b5 --- /dev/null +++ b/website/docs/library/diamond/DiamondUpgradeFacet.mdx @@ -0,0 +1,489 @@ +--- +sidebar_position: 3 +title: "Diamond Upgrade Facet" +description: "Diamond upgrade and management facet" +sidebar_label: "Upgrade Facet" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/diamond/DiamondUpgradeFacet.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Callout from '@site/src/components/ui/Callout'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import StepIndicator from '@site/src/components/docs/StepIndicator'; + + +Orchestrates upgrade functionality, enabling the addition, replacement, and removal of facets. 
+ + + +- Owner-gated upgrade entrypoint (ERC-173 `owner`). +- Optional `delegatecall` for post-upgrade initialization/state migration. +- Updates selector routing so subsequent calls dispatch to the new facet. + + +## Storage + +### State Variables + + + +### Diamond Storage + + + +{`/** storage-location: erc8042:erc8153.diamond */ +struct DiamondStorage { + mapping(bytes4 functionSelector => FacetNode) facetNodes; + FacetList facetList; +} + +struct FacetList { + bytes4 headFacetNodeId; + bytes4 tailFacetNodeId; + uint32 facetCount; + uint32 selectorCount; +} + +struct FacetNode { + address facet; + bytes4 prevFacetNodeId; + bytes4 nextFacetNodeId; +} +`} + + +### Owner Storage + + +{`/** storage-location: erc8042:erc173.owner */ +struct OwnerStorage { + address owner; +}`} + + +--- +### FacetReplacement + + +{`struct FacetReplacement { + address oldFacet; + address newFacet; +}`} + + +## Functions + +### upgradeDiamond + +Upgrade the diamond by adding, replacing, and/or removing facets. + +Execution order: + + + +Then, if `_delegate != address(0)`, the diamond performs a `delegatecall` with `_delegateCalldata` and emits `DiamondDelegateCall`. + + +{`function upgradeDiamond( + address[] calldata _addFacets, + FacetReplacement[] calldata _replaceFacets, + address[] calldata _removeFacets, + address _delegate, + bytes calldata _delegateCalldata, + bytes32 _tag, + bytes calldata _metadata +) external;`} + + + +**Parameters:** + + + + +Facets must implement `exportSelectors()` in order to make their selectors discoverable by diamonds. +Only the exported selectors will be added to the diamond. + +```solidity +interface IFacet { + function exportSelectors() external pure returns (bytes memory); +} +``` + +See [Facet-Based Diamond `EIP-8153`](https://eips.ethereum.org/EIPS/eip-8153) for more details. + + +## Events + + + +
+ Emitted when a facet is added to a diamond. + + The function selectors this facet handles can be retrieved by calling `IFacet(_facet).exportSelectors()` +
+ +
+ Signature: + +{`event FacetAdded(address indexed _facet);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when a facet is removed from a diamond. + + The function selectors this facet handles can be retrieved by calling `IFacet(_facet).exportSelectors()` +
+ +
+ Signature: + +{`event FacetRemoved(address indexed _facet);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when an existing facet is replaced with a new facet. + + - Selectors that are present in the new facet but not in the old facet are added to the diamond. + - Selectors that are present in both the new and old facet are updated to use the new facet. + - Selectors that are not present in the new facet but are present in the old facet are removed from the diamond. + + The function selectors handled by these facets can be retrieved by calling: + - `IFacet(_oldFacet).exportSelectors()` + - `IFacet(_newFacet).exportSelectors()` +
+ +
+ Signature: + +{`event FacetReplaced(address indexed _oldFacet, address indexed _newFacet);`} + +
+ +
+ Parameters: + +
+
+ +
+    Emitted when a diamond's constructor function or a function from a facet makes a `delegatecall`.
+
+ +
+ Signature: + +{`event DiamondDelegateCall(address indexed _delegate, bytes _delegateCalldata);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted to record information about a diamond. This event records any arbitrary metadata. + + The format of `_tag` and `_data` are not specified by the standard. +
+ +
+ Signature: + +{`event DiamondMetadata(bytes32 indexed _tag, bytes _data);`} + +
+ +
+ Parameters: + +
+
+
+ +## Errors + + + +
+ Thrown by [upgradeDiamond](#upgradediamond) when msg.sender is not the current diamond owner. +
+ +
+ Signature: + +error OwnerUnauthorizedAccount(); + +
+
+ +
+ Thrown by [addFacets](#upgradediamond) when a selector exported by a facet already exists in the diamond. +
+ +
+ Signature: + +error CannotAddFunctionToDiamondThatAlreadyExists(bytes4 _selector); + +
+
+ +
+ Thrown by [removeFacets](#upgradediamond) when the facet being removed is not currently part of the diamond. +
+ +
+ Signature: + +error CannotRemoveFacetThatDoesNotExist(address _facet); + +
+
+ +
+ Thrown by [replaceFacets](#upgradediamond) when a replacement pair provides the same address for both oldFacet and newFacet. +
+ +
+ Signature: + +error CannotReplaceFacetWithSameFacet(address _facet); + +
+
+ +
+ Thrown by [replaceFacets](#upgradediamond) when a selector in newFacet is already owned by a facet other than the specified oldFacet. +
+
+ Signature: + +error CannotReplaceFunctionFromNonReplacementFacet(bytes4 _selector); + +
+
+ +
+ Thrown by [replaceFacets](#upgradediamond) when the provided oldFacet does not match the facet currently registered in the diamond. +
+
+ Signature: + +error FacetToReplaceDoesNotExist(address _oldFacet); + +
+
+ +
+ Thrown by [upgradeDiamond](#upgradediamond) when the optional post-upgrade delegatecall fails without returning revert data. +
+ +
+ Signature: + +error DelegateCallReverted(address _delegate, bytes _delegateCalldata); + +
+
+ +
+ Thrown by [importSelectors](#importselectors) when the staticcall to _facet.exportSelectors() fails. +
+
+ Signature: + +error ExportSelectorsCallFailed(address _facet); + +
+
+ +
+ Thrown by [importSelectors](#importselectors) when the facet returns data that is not valid ABI-encoded bytes (e.g. wrong offset, length not a multiple of 4, or length exceeds payload). +
+
+ Signature: + +error IncorrectSelectorsEncoding(address _facet); + +
+
+ +
+ Thrown by [importSelectors](#importselectors) when the facet address has no code (e.g. EOA or uninitialized contract). +
+
+ Signature: + +error NoBytecodeAtAddress(address _contractAddress); + +
+
+ +
+ Thrown by [importSelectors](#importselectors) when the facet returns fewer than 4 bytes of selectors (i.e. no valid selector). +
+
+ Signature: + +error NoSelectorsForFacet(address _facet); + +
+
+
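+
+A minimal sketch of triggering an upgrade from another contract (the `IDiamondUpgrade` interface and `addOneFacet` helper are illustrative; the `upgradeDiamond` signature matches the one documented above, and the caller must be the diamond owner or the call reverts with `OwnerUnauthorizedAccount`):
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity ^0.8.30;
+
+// Illustrative interface; call the diamond address, which routes to this facet.
+interface IDiamondUpgrade {
+    struct FacetReplacement {
+        address oldFacet;
+        address newFacet;
+    }
+
+    function upgradeDiamond(
+        address[] calldata _addFacets,
+        FacetReplacement[] calldata _replaceFacets,
+        address[] calldata _removeFacets,
+        address _delegate,
+        bytes calldata _delegateCalldata,
+        bytes32 _tag,
+        bytes calldata _metadata
+    ) external;
+}
+
+contract UpgradeExample {
+    // Adds a single facet with no replacements, removals, or post-upgrade delegatecall.
+    // This contract must itself be the diamond owner for the call to succeed.
+    function addOneFacet(address diamond, address newFacet) external {
+        address[] memory toAdd = new address[](1);
+        toAdd[0] = newFacet;
+
+        IDiamondUpgrade(diamond).upgradeDiamond(
+            toAdd,
+            new IDiamondUpgrade.FacetReplacement[](0),
+            new address[](0),
+            address(0), // no post-upgrade delegatecall
+            "",         // empty delegate calldata
+            bytes32(0), // no metadata tag
+            ""          // no metadata
+        );
+    }
+}
+```
+
+In practice the owner account (ideally a multisig or DAO) calls the diamond directly; routing through a helper contract like this only works if that contract is the owner.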
+
+## Best Practices
+
+- Ensure all diamond upgrade operations are performed by an authorized wallet. This facet is configured to use the single diamond owner; use the [Upgrade Module](/docs/library/diamond/DiamondUpgradeMod) to wrap your own access control logic instead.
+- Always verify that facet logic contracts are immutable and trusted before using them inside a diamond.
+- Ensure each facet’s `exportSelectors()` returns a valid packed selector list in deterministic order.
+- Carefully audit any optional post-upgrade `delegatecall` (delegate contract + calldata).
+
+## Security Considerations
+
+The `upgradeDiamond` function is critical: it mutates the diamond’s selector routing and facet list.
+

+Although this facet is protected by [Owner](https://github.com/Perfect-Abstractions/Compose/tree/main/src/access/Owner) checks, it can optionally execute an unrestricted `delegatecall` after facet updates, allowing arbitrary changes to diamond storage.
+
+Therefore, make sure to provide verified and trusted contract addresses to avoid unwanted changes or vulnerabilities.
+
diff --git a/website/docs/library/diamond/DiamondUpgradeMod.mdx b/website/docs/library/diamond/DiamondUpgradeMod.mdx
new file mode 100644
index 00000000..2c6b6fe2
--- /dev/null
+++ b/website/docs/library/diamond/DiamondUpgradeMod.mdx
@@ -0,0 +1,729 @@
+---
+sidebar_position: 4
+title: "Diamond Upgrade Module"
+description: "Upgrade a diamond by adding, replacing, or removing facets"
+sidebar_label: "Upgrade Module"
+gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/diamond/DiamondUpgradeMod.sol"
+---
+
+import DocSubtitle from '@site/src/components/docs/DocSubtitle';
+
+import Callout from '@site/src/components/ui/Callout';
+import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion';
+import PropertyTable from '@site/src/components/api/PropertyTable';
+import ExpandableCode from '@site/src/components/code/ExpandableCode';
+import WasThisHelpful from '@site/src/components/docs/WasThisHelpful';
+import LastUpdated from '@site/src/components/docs/LastUpdated';
+import StepIndicator from '@site/src/components/docs/StepIndicator';
+
+
+Module providing the internal functions to extend the diamond's upgrade functionality by adding, replacing, or removing facets.
+
+
+
+- Manages facet lifecycle (add, replace, remove) within a diamond.
+- Updates selector routing so subsequent calls dispatch to the new facet.
+
+
+
+You can use modules to wrap your own project logic around the default Compose building blocks.
Modules provides all the necessary internal helpers for maximum integration + +Storage follows the diamond slot layout in this file; any code using the same `STORAGE_POSITION` or related positions reads and writes shared state. + +See Facets & Modules for more information. + + +## Storage + +### State Variables + + +--- +### Diamond Storage + + +{`/** storage-location: erc8042:erc8153.diamond */ +struct DiamondStorage { + mapping(bytes4 functionSelector => FacetNode) facetNodes; + FacetList facetList; +} + +struct FacetList { + bytes4 headFacetNodeId; + bytes4 tailFacetNodeId; + uint32 facetCount; + uint32 selectorCount; +} + +struct FacetNode { + address facet; + bytes4 prevFacetNodeId; + bytes4 nextFacetNodeId; +}`} + + +--- +### FacetReplacement + +This struct is used to replace old facets with new facets. + + +{`struct FacetReplacement { + address oldFacet; + address newFacet; +}`} + + +## Functions + +### getDiamondStorage + + +{`function getDiamondStorage() pure returns (DiamondStorage storage s);`} + + +**Returns:** + + + +--- +### upgradeDiamond + +Upgrade the diamond by adding, replacing, and/or removing facets. + +Execution order: + + + +Then, if `_delegate != address(0)`, the diamond performs a `delegatecall` with `_delegateCalldata` and emits `DiamondDelegateCall`. + + +{`function upgradeDiamond( + address[] calldata _addFacets, + FacetReplacement[] calldata _replaceFacets, + address[] calldata _removeFacets, + address _delegate, + bytes calldata _delegateCalldata, + bytes32 _tag, + bytes calldata _metadata +) external;`} + + + +**Parameters:** + + + + +Facets must implement `exportSelectors()` in order to make their selectors discoverable by diamonds. +Only the exported selectors will be added to the diamond. + +```solidity +interface IFacet { + function exportSelectors() external pure returns (bytes memory); +} +``` + +See [Facet-Based Diamond `EIP-8153`](https://eips.ethereum.org/EIPS/eip-8153) for more details. 
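+
+A minimal sketch of wrapping the module with custom access control in your own facet (the `GuardedUpgradeFacet` name, the `guardian` scheme, and the import path are assumptions; it assumes `DiamondUpgradeMod` exposes `upgradeDiamond` and the `FacetReplacement` struct as documented above):
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity ^0.8.30;
+
+// Import path is an assumption; adjust it to your project layout.
+import {DiamondUpgradeMod} from "./DiamondUpgradeMod.sol";
+
+// Illustrative facet: wraps the module's upgrade logic with custom access control.
+// `guardian` is an immutable baked into the facet's bytecode, so it is readable
+// through `delegatecall` without touching diamond storage.
+contract GuardedUpgradeFacet {
+    error NotGuardian();
+
+    address public immutable guardian;
+
+    constructor(address _guardian) {
+        guardian = _guardian;
+    }
+
+    function guardedUpgrade(
+        address[] calldata _addFacets,
+        DiamondUpgradeMod.FacetReplacement[] calldata _replaceFacets,
+        address[] calldata _removeFacets,
+        address _delegate,
+        bytes calldata _delegateCalldata,
+        bytes32 _tag,
+        bytes calldata _metadata
+    ) external {
+        if (msg.sender != guardian) revert NotGuardian();
+        DiamondUpgradeMod.upgradeDiamond(
+            _addFacets,
+            _replaceFacets,
+            _removeFacets,
+            _delegate,
+            _delegateCalldata,
+            _tag,
+            _metadata
+        );
+    }
+}
+```
+
+A real facet must also implement `exportSelectors()` so the diamond can discover `guardedUpgrade`'s selector when the facet is added.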
+ + +--- +### addFacets + +Adds new facets to a diamond and maps their exported selectors to the facet address. + +Behavior: + +Each facet in _facets is queried via exportSelectors(). }, + { title: 'Validate uniqueness', description: 'All exported selectors must be new to the diamond.' }, + { title: 'Link facet node', description: 'The first selector of each facet becomes the facet linked-list node id.' }, + { title: 'Emit event', description: <>Emits FacetAdded once per successfully added facet. }, + ]} +/> + + +{`function addFacets(address[] calldata _facets);`} + + +**Parameters:** + + + +--- +### removeFacets + +Removes facets from a diamond and unregisters the selectors they previously exported. + +Behavior: + +Emits FacetRemoved once per removed facet. }, + ]} +/> + + +{`function removeFacets(address[] calldata _facets);`} + + +**Parameters:** + + + +--- +### replaceFacets + +Replaces facets in a diamond using `(oldFacet, newFacet)` pairs. + +Behavior: + +Selectors present in both facets are re-routed to newFacet. }, + { title: 'Add new selectors', description: <>Selectors present only in newFacet are added. }, + { title: 'Remove stale selectors', description: <>Selectors present only in oldFacet are removed. }, + { title: 'Emit event', description: <>Emits FacetReplaced(oldFacet, newFacet) for each replacement pair. }, + ]} +/> + + +{`function replaceFacets(FacetReplacement[] calldata _replaceFacets);`} + + +**Parameters:** + + + +--- +### importSelectors + +Retrieves the function selectors exposed by a facet by calling its `exportSelectors()`. Validates the returned ABI-encoded `bytes` (offset, length, and that the payload length is a multiple of 4) and returns the packed selectors without copying (zero-copy decode). + +Used internally by [`addFacets`](#addfacets), [`replaceFacets`](#replacefacets) and [`removeFacets`](#removefacets). 
+ + +{`function importSelectors(address _facet) view returns (bytes memory selectors);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### at + +Returns the 4-byte function selector at the given index in a packed `bytes` array of selectors (e.g. from [`importSelectors`](#importselectors) or other selector packing). + + +{`function at(bytes memory selectors, uint256 index) pure returns (bytes4 selector);`} + + +**Parameters:** + + + +**Returns:** + + + +## Events + + + +
+ Emitted when a facet is added to a diamond. + + The function selectors this facet handles can be retrieved by calling `IFacet(_facet).exportSelectors()` +
+ +
+ Signature: + +{`event FacetAdded(address indexed _facet);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when a facet is removed from a diamond. + + The function selectors this facet handles can be retrieved by calling `IFacet(_facet).exportSelectors()` +
+ +
+ Signature: + +{`event FacetRemoved(address indexed _facet);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when an existing facet is replaced with a new facet. + + - Selectors that are present in the new facet but not in the old facet are added to the diamond. + - Selectors that are present in both the new and old facet are updated to use the new facet. + - Selectors that are not present in the new facet but are present in the old facet are removed from the diamond. + + The function selectors handled by these facets can be retrieved by calling: + - `IFacet(_oldFacet).exportSelectors()` + - `IFacet(_newFacet).exportSelectors()` +
+ +
+ Signature: + +{`event FacetReplaced(address indexed _oldFacet, address indexed _newFacet);`} + +
+ +
+ Parameters: + +
+
+ +
+    Emitted when a diamond's constructor function or a function from a facet makes a `delegatecall`.
+
+ +
+ Signature: + +{`event DiamondDelegateCall(address indexed _delegate, bytes _delegateCalldata);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted to record information about a diamond. This event records any arbitrary metadata. + + The format of `_tag` and `_data` are not specified by the standard. +
+ +
+ Signature: + +{`event DiamondMetadata(bytes32 indexed _tag, bytes _data);`} + +
+
+ Parameters: + +
+
+
+ +## Errors + + + +
+ Thrown by [addFacets](#addfacets) when a selector exported by a facet already exists in the diamond. +
+ +
+ Signature: + +error CannotAddFunctionToDiamondThatAlreadyExists(bytes4 _selector); + +
+
+ +
+ Thrown by [removeFacets](#removefacets) when the facet being removed is not currently part of the diamond. +
+ +
+ Signature: + +error CannotRemoveFacetThatDoesNotExist(address _facet); + +
+
+ +
+ Thrown by [replaceFacets](#replacefacets) when a replacement pair provides the same address for both oldFacet and newFacet. +
+ +
+ Signature: + +error CannotReplaceFacetWithSameFacet(address _facet); + +
+
+ +
+ Thrown by [replaceFacets](#replacefacets) when a selector in newFacet is already owned by a facet other than the specified oldFacet. +
+
+ Signature: + +error CannotReplaceFunctionFromNonReplacementFacet(bytes4 _selector); + +
+
+ +
+ Thrown by [replaceFacets](#replacefacets) when the provided oldFacet does not match the facet currently registered in the diamond. +
+
+ Signature: + +error FacetToReplaceDoesNotExist(address _oldFacet); + +
+
+ +
+ Thrown by [upgradeDiamond](#upgradediamond) when the optional post-upgrade delegatecall fails without returning revert data. +
+ +
+ Signature: + +error DelegateCallReverted(address _delegate, bytes _delegateCalldata); + +
+
+ +
+ Thrown by [importSelectors](#importselectors) when the staticcall to _facet.exportSelectors() fails. +
+
+ Signature: + +error ExportSelectorsCallFailed(address _facet); + +
+
+ +
+ Thrown by [importSelectors](#importselectors) when the facet returns data that is not valid ABI-encoded bytes (e.g. wrong offset, length not a multiple of 4, or length exceeds payload). +
+
+ Signature: + +error IncorrectSelectorsEncoding(address _facet); + +
+
+ +
+ Thrown by [importSelectors](#importselectors) when the facet address has no code (e.g. EOA or uninitialized contract). +
+
+ Signature: + +error NoBytecodeAtAddress(address _contractAddress); + +
+
+ +
+ Thrown by [importSelectors](#importselectors) when the facet returns fewer than 4 bytes of selectors (i.e. no valid selector). +
+
+ Signature: + +error NoSelectorsForFacet(address _facet); + +
+
+
+
+
+
+
+## Best Practices
+
+- Ensure all diamond upgrade operations are performed by an authorized wallet.
+- Always verify that facet logic contracts are immutable and trusted before using them inside a diamond.
+- Ensure each facet’s `exportSelectors()` returns a valid packed selector list in deterministic order.
+- Carefully audit any optional post-upgrade `delegatecall` (contract + calldata).
+
+
+## Security Considerations
+
+This module does not provide any access control logic. When using it in your own facet, it is your responsibility to implement proper checks.
+
+The `upgradeDiamond` function is critical: it mutates the diamond’s selector routing and facet list.
+
+Therefore, make sure to provide verified and trusted contract addresses to avoid unwanted changes or vulnerabilities.
+
+## Integration Notes
+
+This module interacts directly with the diamond's shared storage at the `DIAMOND_STORAGE_POSITION`, which is identified by `keccak256("erc8153.diamond")`. All functions within this module read from and write to this shared storage.
+
+Changes made through `upgradeDiamond`, `addFacets`, `replaceFacets`, and `removeFacets` are immediately visible to the diamond and all facets that access the shared storage.
+
+
+
diff --git a/website/docs/library/diamond/_category_.json b/website/docs/library/diamond/_category_.json
new file mode 100644
index 00000000..26c8cc37
--- /dev/null
+++ b/website/docs/library/diamond/_category_.json
@@ -0,0 +1,10 @@
+{
+  "label": "Diamond Core",
+  "position": 1,
+  "collapsible": true,
+  "collapsed": true,
+  "link": {
+    "type": "doc",
+    "id": "library/diamond/index"
+  }
+}
diff --git a/website/docs/library/diamond/index.mdx b/website/docs/library/diamond/index.mdx
new file mode 100644
index 00000000..65758b08
--- /dev/null
+++ b/website/docs/library/diamond/index.mdx
@@ -0,0 +1,43 @@
+---
+title: "Diamond Core"
+description: "Core diamond proxy functionality for ERC-2535 diamonds."
+--- + +import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Icon from '@site/src/components/ui/Icon'; + + + Core diamond proxy functionality for ERC-8153 diamonds. + + + + } + size="medium" + /> + } + size="medium" + /> + } + size="medium" + /> + } + size="medium" + /> + diff --git a/website/docs/library/index.mdx b/website/docs/library/index.mdx new file mode 100644 index 00000000..221a4441 --- /dev/null +++ b/website/docs/library/index.mdx @@ -0,0 +1,30 @@ +--- +title: "Library" +description: "Contract references for Compose modules and facets—interfaces, storage layout, and upgrade wiring." +sidebar_class_name: "hidden" +--- + +import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Icon from '@site/src/components/ui/Icon'; +import Callout from '@site/src/components/ui/Callout'; + + + Contract-level docs for each **Compose** module and facet. What to call on-chain, how state is organized, and how upgrades work. + + + + } + size="medium" + /> + + + +This library documentation is still in progress. Some facets and modules are not documented yet while we actively expand coverage. Until a specific page is available, [browse the contract source](https://github.com/Perfect-Abstractions/Compose/tree/main/src). + +We are open to contributions to help grow these docs. We still have documentation gaps today, and we welcome any related issues in the [repository](https://github.com/Perfect-Abstractions/Compose). 
+ diff --git a/website/docusaurus.config.js b/website/docusaurus.config.js index e2fc632c..abab2a75 100644 --- a/website/docusaurus.config.js +++ b/website/docusaurus.config.js @@ -326,18 +326,22 @@ const config = { }, }, ], - [ - "posthog-docusaurus", - { - apiKey: process.env.POSTHOG_API_KEY, - appUrl: 'https://compose.diamonds/54Q17895d65', - uiHost: 'https://us.posthog.com', - enableInDevelopment: false, - capturePageLeave: true, - defaults: '2026-01-30', - cookieless_mode: 'on_reject', - }, - ], + ...(process.env.POSTHOG_API_KEY + ? [ + [ + 'posthog-docusaurus', + { + apiKey: process.env.POSTHOG_API_KEY, + appUrl: 'https://compose.diamonds/54Q17895d65', + uiHost: 'https://us.posthog.com', + enableInDevelopment: false, + capturePageLeave: true, + defaults: '2026-01-30', + cookieless_mode: 'on_reject', + }, + ], + ] + : []), ].filter(Boolean), }; diff --git a/website/package.json b/website/package.json index 2421ec0b..287a281d 100644 --- a/website/package.json +++ b/website/package.json @@ -11,7 +11,8 @@ "clear": "docusaurus clear", "serve": "docusaurus serve", "write-translations": "docusaurus write-translations", - "write-heading-ids": "docusaurus write-heading-ids" + "write-heading-ids": "docusaurus write-heading-ids", + "generate-docs": "cd .. 
&& forge doc && SKIP_ENHANCEMENT=true node .github/scripts/generate-docs.js --all" }, "dependencies": { "@acid-info/docusaurus-og": "^1.0.3-beta.2", diff --git a/website/sidebars.js b/website/sidebars.js index a6326118..f86efb16 100644 --- a/website/sidebars.js +++ b/website/sidebars.js @@ -58,6 +58,21 @@ const sidebars = { }, ], }, + { + type: 'category', + label: 'Library', + collapsed: true, + link: { + type: 'doc', + id: 'library/index', + }, + items: [ + { + type: 'autogenerated', + dirName: 'library', + }, + ], + }, { type: 'category', label: 'Contribution', diff --git a/website/src/components/api/PropertyTable/index.js b/website/src/components/api/PropertyTable/index.js index 496f2fc3..22fd68e0 100644 --- a/website/src/components/api/PropertyTable/index.js +++ b/website/src/components/api/PropertyTable/index.js @@ -1,6 +1,32 @@ import React from 'react'; import styles from './styles.module.css'; +/** + * Parse a description string and convert markdown-style inline code (backticks) to JSX code elements + * @param {string|React.ReactNode} description - Description string or React element + * @returns {React.ReactNode} Description with code elements rendered + */ +function parseDescription(description) { + if (!description || typeof description !== 'string') { + return description; + } + + // Split by backticks and alternate between text and code + const parts = description.split(/(`[^`]+`)/g); + return parts.map((part, index) => { + if (part.startsWith('`') && part.endsWith('`')) { + // This part is an inline code span + const codeContent = part.slice(1, -1); // Remove backticks + return ( + <code key={index} className={styles.inlineCode}> + {codeContent} + </code> + ); + } + return <span key={index}>{part}</span>; + }); +} + /** * PropertyTable Component - Modern API property documentation table * Inspired by Shadcn UI design patterns @@ -51,7 +77,7 @@ export default function PropertyTable({ )} - {prop.description || prop.desc || '-'} + {prop.descriptionElement || parseDescription(prop.description || prop.desc) || '-'} {prop.default !== undefined && (
Default: {String(prop.default)} diff --git a/website/src/components/api/PropertyTable/styles.module.css b/website/src/components/api/PropertyTable/styles.module.css index d6a75d41..c50a2be5 100644 --- a/website/src/components/api/PropertyTable/styles.module.css +++ b/website/src/components/api/PropertyTable/styles.module.css @@ -20,6 +20,7 @@ .tableWrapper { position: relative; width: 100%; + max-width: 100%; border: 1px solid var(--ifm-color-emphasis-200); border-radius: 0.5rem; background: var(--ifm-background-surface-color); @@ -38,6 +39,7 @@ -webkit-overflow-scrolling: touch; scrollbar-width: thin; scrollbar-color: var(--ifm-color-emphasis-300) transparent; + max-width: 100%; } /* Custom Scrollbar */ @@ -69,9 +71,10 @@ /* Table */ .table { width: 100%; + max-width: 100%; border-collapse: separate; border-spacing: 0; - min-width: 640px; + table-layout: auto; } /* Table Header */ @@ -162,22 +165,26 @@ /* Column Styles */ .nameColumn { - width: 20%; - min-width: 180px; + width: auto; + min-width: 120px; + max-width: 25%; } .typeColumn { - width: 15%; - min-width: 140px; + width: auto; + min-width: 100px; + max-width: 20%; } .requiredColumn { - width: 12%; - min-width: 100px; + width: auto; + min-width: 80px; + max-width: 15%; } .descriptionColumn { - width: auto; + width: 1%; /* Small width forces expansion to fill remaining space in auto layout */ + min-width: 200px; } /* Name Cell */ @@ -272,6 +279,8 @@ .descriptionCell { line-height: 1.6; color: var(--ifm-color-emphasis-700); + width: 100%; /* Ensure cell expands to fill column width */ + min-width: 0; /* Allow shrinking if needed, but column width will enforce expansion */ } [data-theme='dark'] .descriptionCell { @@ -310,6 +319,27 @@ color: #93c5fd; } +/* Inline code in descriptions */ +.descriptionCell .inlineCode { + font-family: var(--ifm-font-family-monospace); + font-size: 0.8125rem; + font-weight: 500; + background: var(--ifm-color-emphasis-100); + padding: 0.25rem 0.5rem; + border-radius: 0.375rem; + 
color: var(--ifm-color-primary); + border: 1px solid var(--ifm-color-emphasis-200); + display: inline-block; + line-height: 1.4; + margin: 0 0.125rem; +} + +[data-theme='dark'] .descriptionCell .inlineCode { + background: rgba(59, 130, 246, 0.1); + border-color: rgba(59, 130, 246, 0.2); + color: #93c5fd; +} + /* Responsive Design */ @media (max-width: 996px) { .propertyTable { diff --git a/website/src/components/code/ExpandableCode/index.js b/website/src/components/code/ExpandableCode/index.js index fa868a9b..30602b05 100644 --- a/website/src/components/code/ExpandableCode/index.js +++ b/website/src/components/code/ExpandableCode/index.js @@ -1,4 +1,5 @@ -import React, { useState } from 'react'; +import React, { useState, useMemo, useEffect } from 'react'; +import CodeBlock from '@theme/CodeBlock'; import Icon from '../../ui/Icon'; import clsx from 'clsx'; import styles from './styles.module.css'; @@ -18,34 +19,29 @@ export default function ExpandableCode({ children }) { const [isExpanded, setIsExpanded] = useState(false); - const codeRef = React.useRef(null); - const [needsExpansion, setNeedsExpansion] = React.useState(false); - - React.useEffect(() => { - if (codeRef.current) { - const lines = codeRef.current.textContent.split('\n').length; - setNeedsExpansion(lines > maxLines); - } - }, [children, maxLines]); + const [needsExpansion, setNeedsExpansion] = useState(false); const codeContent = typeof children === 'string' ? children : children?.props?.children || ''; + const lineCount = useMemo(() => codeContent.split('\n').length, [codeContent]); + + useEffect(() => { + setNeedsExpansion(lineCount > maxLines); + }, [lineCount, maxLines]); return (
{title &&
{title}
}
-
-          {codeContent}
-        
+ {codeContent} + {needsExpansion && ( diff --git a/website/src/components/docs/DocCard/index.js b/website/src/components/docs/DocCard/index.js index 3c52d4a0..a9993f67 100644 --- a/website/src/components/docs/DocCard/index.js +++ b/website/src/components/docs/DocCard/index.js @@ -4,6 +4,21 @@ import clsx from 'clsx'; import Icon from '../../ui/Icon'; import styles from './styles.module.css'; +/** Inline arrow so it inherits color (Icon uses , so currentColor doesn't work in dark mode) */ +function DocCardArrow() { + return ( + + + + ); +} + /** * DocCard * @@ -40,7 +55,7 @@ export default function DocCard({ {children}
- +
); diff --git a/website/src/components/docs/DocCard/styles.module.css b/website/src/components/docs/DocCard/styles.module.css index 355a40c8..b9e8693c 100644 --- a/website/src/components/docs/DocCard/styles.module.css +++ b/website/src/components/docs/DocCard/styles.module.css @@ -194,6 +194,10 @@ opacity: 0.5; } +[data-theme='dark'] .docCardArrow { + color: #ffffff; +} + .docCard:hover .docCardArrow { color: #1d4ed8; transform: translateX(5px); diff --git a/website/src/components/docs/WasThisHelpful/index.js b/website/src/components/docs/WasThisHelpful/index.js index c8ae4655..3f3a0f0b 100644 --- a/website/src/components/docs/WasThisHelpful/index.js +++ b/website/src/components/docs/WasThisHelpful/index.js @@ -102,6 +102,7 @@ export default function WasThisHelpful({ if (onSubmit) { onSubmit({ pageId, feedback: 'no', comment: comment.trim() }); } + setSubmitted(true); }; diff --git a/website/src/components/docs/WasThisHelpful/styles.module.css b/website/src/components/docs/WasThisHelpful/styles.module.css index e2ec4c99..84c55f0c 100644 --- a/website/src/components/docs/WasThisHelpful/styles.module.css +++ b/website/src/components/docs/WasThisHelpful/styles.module.css @@ -353,6 +353,12 @@ flex-shrink: 0; } +.feedbackIcon { + display: inline-block; + vertical-align: middle; + flex-shrink: 0; +} + .feedbackForm { margin-top: 1rem; padding-top: 1rem; diff --git a/website/src/components/ui/Badge/styles.module.css b/website/src/components/ui/Badge/styles.module.css index eee3e93d..855d0556 100644 --- a/website/src/components/ui/Badge/styles.module.css +++ b/website/src/components/ui/Badge/styles.module.css @@ -8,6 +8,11 @@ transition: all 0.2s ease; } +/* Prevent Markdown-wrapped children from adding extra space */ +.badge p { + margin: 0; +} + /* Sizes */ .badge.small { padding: 0.25rem 0.625rem; diff --git a/website/src/components/ui/GradientButton/styles.module.css b/website/src/components/ui/GradientButton/styles.module.css index 5bead8be..9aeb3753 100644 --- 
a/website/src/components/ui/GradientButton/styles.module.css +++ b/website/src/components/ui/GradientButton/styles.module.css @@ -1,8 +1,12 @@ .gradientButton { position: relative; + padding-left: 0px; display: inline-flex; align-items: center; justify-content: center; + box-sizing: border-box; + line-height: 1; + vertical-align: middle; font-weight: 600; text-decoration: none; border: none; @@ -14,10 +18,21 @@ } .buttonContent { + display: inline-flex; + align-items: center; + gap: 0.35rem; + line-height: 1; position: relative; z-index: 2; } +/* Prevent Markdown-wrapped children from adding extra space */ +.gradientButton p, +.buttonContent p { + margin: 0; + color: white; +} + .buttonGlow { position: absolute; top: 50%; @@ -46,18 +61,18 @@ /* Sizes */ .gradientButton.small { - padding: 0.5rem 1rem; - font-size: 0.875rem; + padding: 0.55rem 1rem; + font-size: 0.9rem; } .gradientButton.medium { - padding: 0.65rem 1.25rem; + padding: 0.7rem 1.3rem; font-size: 1rem; } .gradientButton.large { - padding: 0.875rem 1.75rem; - font-size: 1.125rem; + padding: 0.9rem 1.75rem; + font-size: 1.05rem; } /* Variants */ @@ -66,6 +81,11 @@ color: white; } +.gradientButton:visited, +.gradientButton * { + color: inherit; +} + .gradientButton.secondary { background: linear-gradient(135deg, #60a5fa 0%, #2563eb 100%); color: white; diff --git a/website/src/hooks/useDocumentationFeedback.js b/website/src/hooks/useDocumentationFeedback.js new file mode 100644 index 00000000..73ffbf8d --- /dev/null +++ b/website/src/hooks/useDocumentationFeedback.js @@ -0,0 +1,35 @@ +import { useCallback } from 'react'; + +/** + * Custom hook for tracking documentation feedback with PostHog + * + * @returns {{submitFeedback: Function}} Object containing the submitFeedback function + */ +export function useDocumentationFeedback() { + /** + * Submit feedback to PostHog analytics + * + * @param {string} pageId - Unique identifier for the page + * @param {string} feedback - 'yes' or 'no' + * @param {string} comment - Optional
comment text + */ + const submitFeedback = useCallback((pageId, feedback, comment = null) => { + const posthog = typeof window !== 'undefined' ? window.posthog : null; + + if (!posthog) { + console.log('Feedback submitted:', { pageId, feedback, comment: comment || null }); + return; + } + + posthog.capture('documentation_feedback', { + page_id: pageId, + feedback: feedback, + comment: comment || null, + timestamp: new Date().toISOString(), + url: typeof window !== 'undefined' ? window.location.href : null, + }); + }, []); + + return { submitFeedback }; +} + diff --git a/website/src/pages/index.js b/website/src/pages/index.js index 88a0a440..a9a71566 100644 --- a/website/src/pages/index.js +++ b/website/src/pages/index.js @@ -7,10 +7,9 @@ import StatsSection from '../components/home/StatsSection'; import CtaSection from '../components/home/CtaSection'; export default function Home() { - const {siteConfig} = useDocusaurusContext(); return (
diff --git a/website/src/theme/EditThisPage/DocsEditThisPage.js b/website/src/theme/EditThisPage/DocsEditThisPage.js new file mode 100644 index 00000000..3ed1ec80 --- /dev/null +++ b/website/src/theme/EditThisPage/DocsEditThisPage.js @@ -0,0 +1,48 @@ +import React from 'react'; +import Link from '@docusaurus/Link'; +import {useDoc} from '@docusaurus/plugin-content-docs/client'; +import styles from './styles.module.css'; + +/** + * DocsEditThisPage component for documentation pages + * Uses useDoc hook to access frontMatter for "View Source" link + * + * WARNING: This component should ONLY be rendered when we're certain + * we're in a docs page context. It will throw an error if used outside + * the DocProvider context. + * + * @param {string} editUrl - URL to edit the page + */ +export default function DocsEditThisPage({editUrl}) { + const {frontMatter} = useDoc(); + const viewSource = frontMatter?.gitSource; + + // Nothing to show + if (!editUrl && !viewSource) { + return null; + } + + return ( +
+ {viewSource && ( + <> + + View Source + + | + + )} + {editUrl && ( + + Edit this page + + )} +
+ ); +} + diff --git a/website/src/theme/EditThisPage/SafeDocsEditThisPage.js b/website/src/theme/EditThisPage/SafeDocsEditThisPage.js new file mode 100644 index 00000000..7da71c5d --- /dev/null +++ b/website/src/theme/EditThisPage/SafeDocsEditThisPage.js @@ -0,0 +1,35 @@ +import React from 'react'; +import DocsEditThisPage from './DocsEditThisPage'; +import SimpleEditThisPage from './SimpleEditThisPage'; + +/** + * Error boundary wrapper for DocsEditThisPage + * Catches errors if useDoc hook is called outside DocProvider context + * Falls back to SimpleEditThisPage if an error occurs + * + * @param {string} editUrl - URL to edit the page + */ +export default class SafeDocsEditThisPage extends React.Component { + constructor(props) { + super(props); + this.state = { hasError: false }; + } + + static getDerivedStateFromError(error) { + // If useDoc fails, fall back to simple version + return { hasError: true }; + } + + componentDidCatch(error, errorInfo) { + // Error caught, will render fallback + // Could log to error reporting service here if needed + } + + render() { + if (this.state.hasError) { + return <SimpleEditThisPage editUrl={this.props.editUrl} />; + } + + return <DocsEditThisPage editUrl={this.props.editUrl} />; + } +} diff --git a/website/src/theme/EditThisPage/SimpleEditThisPage.js b/website/src/theme/EditThisPage/SimpleEditThisPage.js new file mode 100644 index 00000000..eb7d676c --- /dev/null +++ b/website/src/theme/EditThisPage/SimpleEditThisPage.js @@ -0,0 +1,24 @@ +import React from 'react'; +import Link from '@docusaurus/Link'; +import styles from './styles.module.css'; + +/** + * Simple EditThisPage component for non-docs contexts (blog, etc.) + * Safe to use anywhere - doesn't require any special context + * + * @param {string} editUrl - URL to edit the page + */ +export default function SimpleEditThisPage({editUrl}) { + if (!editUrl) { + return null; + } + + return ( +
+ + Edit this page + +
+ ); +} + diff --git a/website/src/theme/EditThisPage/index.js b/website/src/theme/EditThisPage/index.js new file mode 100644 index 00000000..db62da16 --- /dev/null +++ b/website/src/theme/EditThisPage/index.js @@ -0,0 +1,34 @@ +import React from 'react'; +import {useLocation} from '@docusaurus/router'; +import SimpleEditThisPage from './SimpleEditThisPage'; +import SafeDocsEditThisPage from './SafeDocsEditThisPage'; + +/** + * Main EditThisPage component + * + * Intelligently renders the appropriate EditThisPage variant based on the current route: + * - Docs pages (/docs/*): Uses SafeDocsEditThisPage (with useDoc hook, wrapped in error boundary) + * - Other pages: Uses SimpleEditThisPage + * + * Route checking is necessary because error boundaries don't work reliably during SSR/build. + * + * @param {string} editUrl - URL to edit the page + */ +export default function EditThisPage({editUrl}) { + let isDocsPage = false; + + try { + const location = useLocation(); + const pathname = location?.pathname || ''; + + isDocsPage = pathname.startsWith('/docs/'); + } catch (error) { + isDocsPage = false; + } + + if (isDocsPage) { + return ; + } + + return ; +} diff --git a/website/src/theme/EditThisPage/styles.module.css b/website/src/theme/EditThisPage/styles.module.css new file mode 100644 index 00000000..fc7a21e0 --- /dev/null +++ b/website/src/theme/EditThisPage/styles.module.css @@ -0,0 +1,26 @@ +.wrapper { + display: inline-flex; + align-items: center; + gap: 0.75rem; + flex-wrap: wrap; +} + +.link { + display: inline-flex; + align-items: center; + gap: 0.35rem; + font-weight: 600; + color: var(--ifm-link-color); + text-decoration: none; +} + +.link:hover { + text-decoration: underline; + color: var(--ifm-link-hover-color, var(--ifm-link-color)); +} + +.link:visited { + color: var(--ifm-link-color); +} + + diff --git a/website/static/icons/light-bulb-round.svg b/website/static/icons/light-bulb-round.svg new file mode 100644 index 00000000..d08ab7ef --- /dev/null +++ 
b/website/static/icons/light-bulb-round.svg @@ -0,0 +1,2 @@ + + \ No newline at end of file diff --git a/website/static/icons/light-bulb-svgrepo-com.svg b/website/static/icons/light-bulb-svgrepo-com.svg new file mode 100644 index 00000000..9f8940a6 --- /dev/null +++ b/website/static/icons/light-bulb-svgrepo-com.svg @@ -0,0 +1,9 @@ + + + + + + + + + \ No newline at end of file diff --git a/website/static/icons/lightbulb.svg b/website/static/icons/lightbulb.svg index 2181d952..5861a5ec 100644 --- a/website/static/icons/lightbulb.svg +++ b/website/static/icons/lightbulb.svg @@ -1,20 +1,9 @@ - + - - - - - - - - - - - - - + + \ No newline at end of file diff --git a/website/static/icons/thumbs-down-white.svg b/website/static/icons/thumbs-down-white.svg new file mode 100644 index 00000000..98c4bc4d --- /dev/null +++ b/website/static/icons/thumbs-down-white.svg @@ -0,0 +1,4 @@ + + + + diff --git a/website/static/icons/thumbs-down.svg b/website/static/icons/thumbs-down.svg index 7e04b100..75d91722 100644 --- a/website/static/icons/thumbs-down.svg +++ b/website/static/icons/thumbs-down.svg @@ -7,3 +7,4 @@ d="M.5,12H5.28l6.11,7.06A2,2,0,0,1,12,20.49a2,2,0,0,0,2,2,2.74,2.74,0,0,0,2-.8,2.79,2.79,0,0,0,.8-1.95c0-2-2.87-5.86-2.87-5.86h6A2.61,2.61,0,0,0,22.5,11.3a2.94,2.94,0,0,0-.05-.51L20.89,3A1.91,1.91,0,0,0,19,1.48H11.25a9.13,9.13,0,0,0-4,1h0a9.08,9.08,0,0,1-4.06,1H.5" /> + diff --git a/website/static/icons/thumbs-up-white.svg b/website/static/icons/thumbs-up-white.svg new file mode 100644 index 00000000..9e9b1c0a --- /dev/null +++ b/website/static/icons/thumbs-up-white.svg @@ -0,0 +1,4 @@ + + + +
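The `parseDescription` helper added to `PropertyTable` above hinges on one trick: `String.prototype.split` with a capturing group in the separator keeps the matched delimiters in the result, so the array alternates plain text and backticked segments. A framework-free sketch of that tokenization step (the `tokenize` name and the token-object shape are ours for illustration, not part of the component):

```javascript
// Split a description on `inline code` spans using the same
// /(`[^`]+`)/g pattern as parseDescription. Because the pattern is a
// capturing group, split() keeps the matched `...` chunks in the output.
function tokenize(description) {
  if (typeof description !== 'string') {
    return []; // parseDescription similarly passes non-strings through untouched
  }
  return description
    .split(/(`[^`]+`)/g)
    .filter(Boolean) // drop empty strings produced at the edges
    .map((part) =>
      part.startsWith('`') && part.endsWith('`')
        ? { type: 'code', value: part.slice(1, -1) } // strip the backticks
        : { type: 'text', value: part }
    );
}
```

In the React component each `code` token becomes a styled `<code>` element and each `text` token plain text. Note the pattern makes no attempt to handle escaped or unbalanced backticks; those simply fall through as ordinary text.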