Compare commits

6 Commits

provider-e...43683-inco

| Author | SHA1 | Date |
| --- | --- | --- |
|  | ed132332a0 |  |
|  | 1b032fdaab |  |
|  | 1a803a2f90 |  |
|  | 9fb12ee990 |  |
|  | 2a6e009a7d |  |
|  | 83c871210f |  |
@@ -1,55 +0,0 @@

# Phase 2: Explore Repository

You are analyzing a codebase to understand its structure before reviewing documentation impact.

## Objective

Produce a structured overview of the repository to inform subsequent documentation analysis.

## Instructions

1. **Identify Primary Languages and Frameworks**

   - Scan for Cargo.toml, package.json, or other manifest files
   - Note the primary language(s) and key dependencies

2. **Map Documentation Structure**

   - This project uses **mdBook** (https://rust-lang.github.io/mdBook/)
   - Documentation is in `docs/src/`
   - Table of contents: `docs/src/SUMMARY.md` (mdBook format: https://rust-lang.github.io/mdBook/format/summary.html)
   - Style guide: `docs/.rules`
   - Agent guidelines: `docs/AGENTS.md`
   - Formatting: Prettier (config in `docs/.prettierrc`)

3. **Identify Build and Tooling**

   - Note build systems (cargo, npm, etc.)
   - Identify documentation tooling (mdbook, etc.)

4. **Output Format**

   Produce a JSON summary:

   ```json
   {
     "primary_language": "Rust",
     "frameworks": ["GPUI"],
     "documentation": {
       "system": "mdBook",
       "location": "docs/src/",
       "toc_file": "docs/src/SUMMARY.md",
       "toc_format": "https://rust-lang.github.io/mdBook/format/summary.html",
       "style_guide": "docs/.rules",
       "agent_guidelines": "docs/AGENTS.md",
       "formatter": "prettier",
       "formatter_config": "docs/.prettierrc",
       "custom_preprocessor": "docs_preprocessor (handles {#kb action::Name} syntax)"
     },
     "key_directories": {
       "source": "crates/",
       "docs": "docs/src/",
       "extensions": "extensions/"
     }
   }
   ```

## Constraints

- Read-only: Do not modify any files
- Focus on structure, not content details
- Complete within 2 minutes
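The JSON summary that Phase 2 emits feeds later phases, so it can be worth sanity-checking before use. A minimal sketch of such a check (an illustrative helper, not part of the repository; the required key names are taken from the example above):

```python
import json

# Keys the Phase 2 prompt's example summary contains.
REQUIRED_TOP_LEVEL = {"primary_language", "frameworks", "documentation", "key_directories"}
REQUIRED_DOC_KEYS = {"system", "location", "toc_file"}


def validate_phase2_summary(raw: str) -> list[str]:
    """Return a list of problems found in a Phase 2 JSON summary (empty if OK)."""
    try:
        data = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    problems = [f"missing key: {k}" for k in sorted(REQUIRED_TOP_LEVEL - data.keys())]
    doc = data.get("documentation", {})
    problems += [f"missing documentation.{k}" for k in sorted(REQUIRED_DOC_KEYS - doc.keys())]
    return problems


example = (
    '{"primary_language": "Rust", "frameworks": ["GPUI"],'
    ' "documentation": {"system": "mdBook", "location": "docs/src/",'
    ' "toc_file": "docs/src/SUMMARY.md"}, "key_directories": {"docs": "docs/src/"}}'
)
print(validate_phase2_summary(example))  # → []
```

A summary missing keys returns one message per absent key, which a CI step could surface before continuing to Phase 3.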
@@ -1,57 +0,0 @@

# Phase 3: Analyze Changes

You are analyzing code changes to understand their nature and scope.

## Objective

Produce a clear, neutral summary of what changed in the codebase.

## Input

You will receive:

- List of changed files from the triggering commit/PR
- Repository structure from Phase 2

## Instructions

1. **Categorize Changed Files**

   - Source code (which crates/modules)
   - Configuration
   - Tests
   - Documentation (already existing)
   - Other

2. **Analyze Each Change**

   - Review diffs for files likely to impact documentation
   - Focus on: public APIs, settings, keybindings, commands, user-visible behavior

3. **Identify What Did NOT Change**

   - Note stable interfaces or behaviors
   - Important for avoiding unnecessary documentation updates

4. **Output Format**

   Produce a markdown summary:

   ```markdown
   ## Change Analysis

   ### Changed Files Summary

   | Category | Files | Impact Level |
   | --- | --- | --- |
   | Source - [crate] | file1.rs, file2.rs | High/Medium/Low |
   | Settings | settings.json | Medium |
   | Tests | test_*.rs | None |

   ### Behavioral Changes

   - **[Feature/Area]**: Description of what changed from user perspective
   - **[Feature/Area]**: Description...

   ### Unchanged Areas

   - [Area]: Confirmed no changes to [specific behavior]

   ### Files Requiring Deeper Review

   - `path/to/file.rs`: Reason for deeper review
   ```

## Constraints

- Read-only: Do not modify any files
- Neutral tone: Describe what changed, not whether it's good/bad
- Do not propose documentation changes yet
@@ -1,76 +0,0 @@

# Phase 4: Plan Documentation Impact

You are determining whether and how documentation should be updated based on code changes.

## Objective

Produce a structured documentation plan that will guide Phase 5 execution.

## Documentation System

This is an **mdBook** site (https://rust-lang.github.io/mdBook/):

- `docs/src/SUMMARY.md` defines book structure per https://rust-lang.github.io/mdBook/format/summary.html
- If adding new pages, they MUST be added to SUMMARY.md
- Use `{#kb action::ActionName}` syntax for keybindings (custom preprocessor expands these)
- Prettier formatting (80 char width) will be applied automatically

## Input

You will receive:

- Change analysis from Phase 3
- Repository structure from Phase 2
- Documentation guidelines from `docs/AGENTS.md`

## Instructions

1. **Review AGENTS.md**

   - Load and apply all rules from `docs/AGENTS.md`
   - Respect scope boundaries (in-scope vs out-of-scope)

2. **Evaluate Documentation Impact**

   For each behavioral change from Phase 3:

   - Does existing documentation cover this area?
   - Is the documentation now inaccurate or incomplete?
   - Classify per AGENTS.md "Change Classification" section

3. **Identify Specific Updates**

   For each required update:

   - Exact file path
   - Specific section or heading
   - Type of change (update existing, add new, deprecate)
   - Description of the change

4. **Flag Uncertainty**

   Explicitly mark:

   - Assumptions you're making
   - Areas where human confirmation is needed
   - Ambiguous requirements

5. **Output Format**

   Use the exact format specified in `docs/AGENTS.md` Phase 4 section:

   ```markdown
   ## Documentation Impact Assessment

   ### Summary

   Brief description of code changes analyzed.

   ### Documentation Updates Required: [Yes/No]

   ### Planned Changes

   #### 1. [File Path]

   - **Section**: [Section name or "New section"]
   - **Change Type**: [Update/Add/Deprecate]
   - **Reason**: Why this change is needed
   - **Description**: What will be added/modified

   ### Uncertainty Flags

   - [ ] [Description of any assumptions or areas needing confirmation]

   ### No Changes Needed

   - [List files reviewed but not requiring updates, with brief reason]
   ```

## Constraints

- Read-only: Do not modify any files
- Conservative: When uncertain, flag for human review rather than planning changes
- Scoped: Only plan changes that trace directly to code changes from Phase 3
- No scope expansion: Do not plan "improvements" unrelated to triggering changes
@@ -1,67 +0,0 @@

# Phase 5: Apply Documentation Plan

You are executing a pre-approved documentation plan for an **mdBook** documentation site.

## Objective

Implement exactly the changes specified in the documentation plan from Phase 4.

## Documentation System

- **mdBook**: https://rust-lang.github.io/mdBook/
- **SUMMARY.md**: Follows mdBook format (https://rust-lang.github.io/mdBook/format/summary.html)
- **Prettier**: Will be run automatically after this phase (80 char line width)
- **Custom preprocessor**: Use `{#kb action::ActionName}` for keybindings instead of hardcoding

## Input

You will receive:

- Documentation plan from Phase 4
- Documentation guidelines from `docs/AGENTS.md`
- Style rules from `docs/.rules`

## Instructions

1. **Validate Plan**

   - Confirm all planned files are within scope per AGENTS.md
   - Verify no out-of-scope files are targeted

2. **Execute Each Planned Change**

   For each item in "Planned Changes":

   - Navigate to the specified file
   - Locate the specified section
   - Apply the described change
   - Follow style rules from `docs/.rules`

3. **Style Compliance**

   Every edit must follow `docs/.rules`:

   - Second person, present tense
   - No hedging words ("simply", "just", "easily")
   - Proper keybinding format (`Cmd+Shift+P`)
   - Settings Editor first, JSON second
   - Correct terminology (folder not directory, etc.)

4. **Preserve Context**

   - Maintain surrounding content structure
   - Keep consistent heading levels
   - Preserve existing cross-references

## Constraints

- Execute ONLY changes listed in the plan
- Do not discover new documentation targets
- Do not make stylistic improvements outside planned sections
- Do not expand scope beyond what Phase 4 specified
- If a planned change cannot be applied (file missing, section not found), skip and note it

## Output

After applying changes, output a summary:

```markdown
## Applied Changes

### Successfully Applied

- `path/to/file.md`: [Brief description of change]

### Skipped (Could Not Apply)

- `path/to/file.md`: [Reason - e.g., "Section not found"]

### Warnings

- [Any issues encountered during application]
```
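The `{#kb action::ActionName}` instruction above relies on the repository's custom docs_preprocessor to replace placeholders with real keystrokes at build time. A minimal sketch of that expansion (illustrative only; the actual preprocessor is an mdBook preprocessor that resolves actions against Zed's keymaps, and the lookup table here is hypothetical):

```python
import re

# Hypothetical action-to-keystroke table; the real preprocessor looks
# bindings up in Zed's keymap files rather than a hardcoded dict.
KEYBINDINGS = {
    "zed::OpenSettings": "Cmd+,",
    "command_palette::Toggle": "Cmd+Shift+P",
}

# Matches placeholders like {#kb command_palette::Toggle}.
KB_PATTERN = re.compile(r"\{#kb\s+([A-Za-z_:]+)\}")


def expand_keybindings(markdown: str) -> str:
    """Replace {#kb action::Name} placeholders with the bound keystroke.

    Unknown actions are left untouched so the build output makes the
    missing binding visible.
    """
    return KB_PATTERN.sub(lambda m: KEYBINDINGS.get(m.group(1), m.group(0)), markdown)


print(expand_keybindings("Open the command palette with {#kb command_palette::Toggle}."))
# → Open the command palette with Cmd+Shift+P.
```

Writing docs against the placeholder rather than a literal keystroke keeps pages correct when default keybindings change.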
@@ -1,54 +0,0 @@

# Phase 6: Summarize Changes

You are generating a summary of documentation updates for PR review.

## Objective

Create a clear, reviewable summary of all documentation changes made.

## Input

You will receive:

- Applied changes report from Phase 5
- Original change analysis from Phase 3
- Git diff of documentation changes

## Instructions

1. **Gather Change Information**

   - List all modified documentation files
   - Identify the corresponding code changes that triggered each update

2. **Generate Summary**

   Use the format specified in `docs/AGENTS.md` Phase 6 section:

   ```markdown
   ## Documentation Update Summary

   ### Changes Made

   | File | Change | Related Code |
   | --- | --- | --- |
   | docs/src/path.md | Brief description | PR #123 or commit SHA |

   ### Rationale

   Brief explanation of why these updates were made, linking back to the triggering code changes.

   ### Review Notes

   - Items reviewers should pay special attention to
   - Any uncertainty flags from Phase 4 that were addressed
   - Assumptions made during documentation
   ```

3. **Add Context for Reviewers**

   - Highlight any changes that might be controversial
   - Note if any planned changes were skipped and why
   - Flag areas where reviewer expertise is especially needed

## Output Format

The summary should be suitable for:

- PR description body
- Commit message (condensed version)
- Team communication

## Constraints

- Read-only (documentation changes already applied in Phase 5)
- Factual: Describe what was done, not justify why it's good
- Complete: Account for all changes, including skipped items
@@ -1,67 +0,0 @@

# Phase 7: Commit and Open PR

You are creating a git branch, committing documentation changes, and opening a PR.

## Objective

Package documentation updates into a reviewable pull request.

## Input

You will receive:

- Summary from Phase 6
- List of modified files

## Instructions

1. **Create Branch**

   ```sh
   git checkout -b docs/auto-update-{date}
   ```

   Use format: `docs/auto-update-YYYY-MM-DD` or `docs/auto-update-{short-sha}`

2. **Stage and Commit**

   - Stage only documentation files in `docs/src/`
   - Do not stage any other files

   Commit message format:

   ```
   docs: auto-update documentation for [brief description]

   [Summary from Phase 6, condensed]

   Triggered by: [commit SHA or PR reference]

   Co-authored-by: factory-droid[bot] <138933559+factory-droid[bot]@users.noreply.github.com>
   ```

3. **Push Branch**

   ```sh
   git push -u origin docs/auto-update-{date}
   ```

4. **Create Pull Request**

   Use the Phase 6 summary as the PR body.

   PR Title: `docs: [Brief description of documentation updates]`

   Labels (if available): `documentation`, `automated`

   Base branch: `main`

## Constraints

- Do NOT auto-merge
- Do NOT request specific reviewers (let CODEOWNERS handle it)
- Do NOT modify files outside `docs/src/`
- If no changes to commit, exit gracefully with message "No documentation changes to commit"

## Output

```markdown
## PR Created

- **Branch**: docs/auto-update-{date}
- **PR URL**: https://github.com/zed-industries/zed/pull/XXXX
- **Status**: Ready for review

### Commit

- SHA: {commit-sha}
- Files: {count} documentation files modified
```
264 .github/workflows/docs_automation.yml vendored
@@ -1,264 +0,0 @@
name: Documentation Automation

on:
  push:
    branches: [main]
    paths:
      - 'crates/**'
      - 'extensions/**'
  workflow_dispatch:
    inputs:
      pr_number:
        description: 'PR number to analyze (gets full PR diff)'
        required: false
        type: string
      trigger_sha:
        description: 'Commit SHA to analyze (ignored if pr_number is set)'
        required: false
        type: string

permissions:
  contents: write
  pull-requests: write

env:
  FACTORY_API_KEY: ${{ secrets.FACTORY_API_KEY }}
  DROID_MODEL: claude-opus-4-5-20251101

jobs:
  docs-automation:
    runs-on: ubuntu-latest
    timeout-minutes: 30

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - name: Install Droid CLI
        id: install-droid
        run: |
          curl -fsSL https://app.factory.ai/cli | sh
          echo "${HOME}/.local/bin" >> "$GITHUB_PATH"
          echo "DROID_BIN=${HOME}/.local/bin/droid" >> "$GITHUB_ENV"
          # Verify installation
          "${HOME}/.local/bin/droid" --version

      - name: Setup Node.js (for Prettier)
        uses: actions/setup-node@v4
        with:
          node-version: '20'

      - name: Install Prettier
        run: npm install -g prettier

      - name: Get changed files
        id: changed
        run: |
          if [ -n "${{ inputs.pr_number }}" ]; then
            # Get full PR diff
            echo "Analyzing PR #${{ inputs.pr_number }}"
            echo "source=pr" >> "$GITHUB_OUTPUT"
            echo "ref=${{ inputs.pr_number }}" >> "$GITHUB_OUTPUT"
            gh pr diff "${{ inputs.pr_number }}" --name-only > /tmp/changed_files.txt
          elif [ -n "${{ inputs.trigger_sha }}" ]; then
            # Get single commit diff
            SHA="${{ inputs.trigger_sha }}"
            echo "Analyzing commit $SHA"
            echo "source=commit" >> "$GITHUB_OUTPUT"
            echo "ref=$SHA" >> "$GITHUB_OUTPUT"
            git diff --name-only "${SHA}^" "$SHA" > /tmp/changed_files.txt
          else
            # Default to current commit
            SHA="${{ github.sha }}"
            echo "Analyzing commit $SHA"
            echo "source=commit" >> "$GITHUB_OUTPUT"
            echo "ref=$SHA" >> "$GITHUB_OUTPUT"
            git diff --name-only "${SHA}^" "$SHA" > /tmp/changed_files.txt || git diff --name-only HEAD~1 HEAD > /tmp/changed_files.txt
          fi

          echo "Changed files:"
          cat /tmp/changed_files.txt
        env:
          GH_TOKEN: ${{ github.token }}

      # Phase 0: Guardrails are loaded via AGENTS.md in each phase

      # Phase 2: Explore Repository (Read-Only - default)
      - name: "Phase 2: Explore Repository"
        id: phase2
        run: |
          "$DROID_BIN" exec \
            -m "$DROID_MODEL" \
            -f .factory/prompts/docs-automation/phase2-explore.md \
            > /tmp/phase2-output.txt 2>&1 || true
          echo "Repository exploration complete"
          cat /tmp/phase2-output.txt

      # Phase 3: Analyze Changes (Read-Only - default)
      - name: "Phase 3: Analyze Changes"
        id: phase3
        run: |
          CHANGED_FILES=$(tr '\n' ' ' < /tmp/changed_files.txt)
          echo "Analyzing changes in: $CHANGED_FILES"

          # Build prompt with context. The heredoc delimiter is unquoted so
          # the command substitutions and $CHANGED_FILES actually expand.
          cat > /tmp/phase3-prompt.md << EOF
          $(cat .factory/prompts/docs-automation/phase3-analyze.md)

          ## Context

          ### Changed Files
          $CHANGED_FILES

          ### Phase 2 Output
          $(cat /tmp/phase2-output.txt)
          EOF

          "$DROID_BIN" exec \
            -m "$DROID_MODEL" \
            -f /tmp/phase3-prompt.md \
            > /tmp/phase3-output.md 2>&1 || true
          echo "Change analysis complete"
          cat /tmp/phase3-output.md

      # Phase 4: Plan Documentation Impact (Read-Only - default)
      - name: "Phase 4: Plan Documentation Impact"
        id: phase4
        run: |
          "$DROID_BIN" exec \
            -m "$DROID_MODEL" \
            -f .factory/prompts/docs-automation/phase4-plan.md \
            > /tmp/phase4-plan.md 2>&1 || true
          echo "Documentation plan complete"
          cat /tmp/phase4-plan.md

          # Check if updates are required. The Phase 4 prompt reports
          # "Documentation Updates Required: [Yes/No]", so match that form
          # as well as the bare NO_UPDATES_REQUIRED token.
          if grep -qE "NO_UPDATES_REQUIRED|Documentation Updates Required: *No" /tmp/phase4-plan.md; then
            echo "updates_required=false" >> "$GITHUB_OUTPUT"
          else
            echo "updates_required=true" >> "$GITHUB_OUTPUT"
          fi

      # Phase 5: Apply Plan (Write-Enabled with --auto medium)
      - name: "Phase 5: Apply Documentation Plan"
        id: phase5
        if: steps.phase4.outputs.updates_required == 'true'
        run: |
          "$DROID_BIN" exec \
            -m "$DROID_MODEL" \
            --auto medium \
            -f .factory/prompts/docs-automation/phase5-apply.md \
            > /tmp/phase5-report.md 2>&1 || true
          echo "Documentation updates applied"
          cat /tmp/phase5-report.md

      # Phase 5b: Format with Prettier
      - name: "Phase 5b: Format with Prettier"
        id: phase5b
        if: steps.phase4.outputs.updates_required == 'true'
        run: |
          echo "Formatting documentation with Prettier..."
          (cd docs && prettier --write src/)

          echo "Verifying Prettier formatting passes..."
          (cd docs && prettier --check src/)

          echo "Prettier formatting complete"

      # Phase 6: Summarize Changes (Read-Only - default)
      - name: "Phase 6: Summarize Changes"
        id: phase6
        if: steps.phase4.outputs.updates_required == 'true'
        run: |
          # Get git diff of docs
          git diff docs/src/ > /tmp/docs-diff.txt || true

          "$DROID_BIN" exec \
            -m "$DROID_MODEL" \
            -f .factory/prompts/docs-automation/phase6-summarize.md \
            > /tmp/phase6-summary.md 2>&1 || true
          echo "Summary generated"
          cat /tmp/phase6-summary.md

      # Phase 7: Commit and Open PR
      - name: "Phase 7: Create PR"
        id: phase7
        if: steps.phase4.outputs.updates_required == 'true'
        run: |
          # Check if there are actual changes
          if git diff --quiet docs/src/; then
            echo "No documentation changes detected"
            exit 0
          fi

          # Configure git
          git config user.name "factory-droid[bot]"
          git config user.email "138933559+factory-droid[bot]@users.noreply.github.com"

          # Daily batch branch - one branch per day, multiple commits accumulate
          BRANCH_NAME="docs/auto-update-$(date +%Y-%m-%d)"

          # Stash local changes from phase 5
          git stash push -m "docs-automation-changes" -- docs/src/

          # Check if branch already exists on remote
          if git ls-remote --exit-code --heads origin "$BRANCH_NAME" > /dev/null 2>&1; then
            echo "Branch $BRANCH_NAME exists, checking out and updating..."
            git fetch origin "$BRANCH_NAME"
            git checkout -B "$BRANCH_NAME" "origin/$BRANCH_NAME"
          else
            echo "Creating new branch $BRANCH_NAME..."
            git checkout -b "$BRANCH_NAME"
          fi

          # Apply stashed changes
          git stash pop || true

          # Stage and commit
          git add docs/src/
          SUMMARY=$(head -50 < /tmp/phase6-summary.md)
          git commit -m "docs: auto-update documentation

          ${SUMMARY}

          Triggered by: ${{ steps.changed.outputs.source }} ${{ steps.changed.outputs.ref }}

          Co-authored-by: factory-droid[bot] <138933559+factory-droid[bot]@users.noreply.github.com>"

          # Push
          git push -u origin "$BRANCH_NAME"

          # Check if a PR already exists for this branch. '// empty' makes a
          # missing PR yield an empty string instead of the literal "null".
          EXISTING_PR=$(gh pr list --head "$BRANCH_NAME" --json number --jq '.[0].number // empty' || echo "")

          if [ -n "$EXISTING_PR" ]; then
            echo "PR #$EXISTING_PR already exists for branch $BRANCH_NAME, updated with new commit"
          else
            # Create new PR
            gh pr create \
              --title "docs: automated documentation update ($(date +%Y-%m-%d))" \
              --body-file /tmp/phase6-summary.md \
              --base main || true
            echo "PR created on branch: $BRANCH_NAME"
          fi
        env:
          GH_TOKEN: ${{ github.token }}

      # Summary output
      - name: "Summary"
        if: always()
        run: |
          echo "## Documentation Automation Summary" >> "$GITHUB_STEP_SUMMARY"
          echo "" >> "$GITHUB_STEP_SUMMARY"

          if [ "${{ steps.phase4.outputs.updates_required }}" == "false" ]; then
            echo "No documentation updates required for this change." >> "$GITHUB_STEP_SUMMARY"
          elif [ -f /tmp/phase6-summary.md ]; then
            cat /tmp/phase6-summary.md >> "$GITHUB_STEP_SUMMARY"
          else
            echo "Workflow completed. Check individual phase outputs for details." >> "$GITHUB_STEP_SUMMARY"
          fi
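The daily-batch design above hinges on the branch name being a pure function of the date, so every run on the same day lands on the same branch. A small sketch reproducing the `docs/auto-update-$(date +%Y-%m-%d)` pattern outside the workflow (illustrative; the workflow itself derives the name in shell):

```python
import re
from datetime import date
from typing import Optional


def daily_branch_name(today: Optional[date] = None) -> str:
    """Build the docs/auto-update-YYYY-MM-DD branch name the workflow uses."""
    today = today or date.today()
    return f"docs/auto-update-{today.isoformat()}"


name = daily_branch_name(date(2024, 6, 1))
print(name)  # → docs/auto-update-2024-06-01
assert re.fullmatch(r"docs/auto-update-\d{4}-\d{2}-\d{2}", name)
```

Because the name is deterministic, the "branch exists on remote" check decides whether a run creates a new branch or accumulates a commit onto today's.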
11 Cargo.lock generated
@@ -5924,11 +5924,9 @@ dependencies = [
 "async-trait",
 "client",
 "collections",
 "credentials_provider",
 "criterion",
 "ctor",
 "dap",
 "dirs 4.0.0",
 "extension",
 "fs",
 "futures 0.3.31",
@@ -5937,11 +5935,8 @@ dependencies = [
 "http_client",
 "language",
 "language_extension",
 "language_model",
 "log",
 "lsp",
 "markdown",
 "menu",
 "moka",
 "node_runtime",
 "parking_lot",
@@ -5956,21 +5951,17 @@ dependencies = [
 "serde_json",
 "serde_json_lenient",
 "settings",
 "smol",
 "task",
 "telemetry",
 "tempfile",
 "theme",
 "theme_extension",
 "toml 0.8.23",
 "ui",
 "ui_input",
 "url",
 "util",
 "wasmparser 0.221.3",
 "wasmtime",
 "wasmtime-wasi",
 "workspace",
 "zlog",
]

@@ -14882,7 +14873,6 @@ dependencies = [
 "copilot",
 "edit_prediction",
 "editor",
 "extension_host",
 "feature_flags",
 "fs",
 "futures 0.3.31",
@@ -14890,7 +14880,6 @@ dependencies = [
 "gpui",
 "heck 0.5.0",
 "language",
 "language_model",
 "language_models",
 "log",
 "menu",
@@ -3,11 +3,11 @@ use agent_client_protocol::{self as acp};
 use anyhow::Result;
 use collections::IndexMap;
 use gpui::{Entity, SharedString, Task};
-use language_model::{IconOrSvg, LanguageModelProviderId};
+use language_model::LanguageModelProviderId;
 use project::Project;
 use serde::{Deserialize, Serialize};
 use std::{any::Any, error::Error, fmt, path::Path, rc::Rc, sync::Arc};
-use ui::App;
+use ui::{App, IconName};
 use uuid::Uuid;

 #[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize, Hash)]
@@ -210,12 +210,21 @@ pub trait AgentModelSelector: 'static {
     }
 }

+/// Icon for a model in the model selector.
+#[derive(Debug, Clone, PartialEq, Eq)]
+pub enum AgentModelIcon {
+    /// A built-in icon from Zed's icon set.
+    Named(IconName),
+    /// Path to a custom SVG icon file.
+    Path(SharedString),
+}
+
 #[derive(Debug, Clone, PartialEq, Eq)]
 pub struct AgentModelInfo {
     pub id: acp::ModelId,
     pub name: SharedString,
     pub description: Option<SharedString>,
-    pub icon: Option<IconOrSvg>,
+    pub icon: Option<AgentModelIcon>,
 }

 impl From<acp::ModelInfo> for AgentModelInfo {
@@ -739,7 +739,7 @@ impl ActivityIndicator {
             extension_store.outstanding_operations().iter().next()
         {
             let (message, icon, rotate) = match operation {
-                ExtensionOperation::Install | ExtensionOperation::AutoInstall => (
+                ExtensionOperation::Install => (
                     format!("Installing {extension_id} extension…"),
                     IconName::LoadCircle,
                     true,
@@ -30,7 +30,7 @@ use futures::{StreamExt, future};
 use gpui::{
     App, AppContext, AsyncApp, Context, Entity, SharedString, Subscription, Task, WeakEntity,
 };
-use language_model::{LanguageModel, LanguageModelProvider, LanguageModelRegistry};
+use language_model::{IconOrSvg, LanguageModel, LanguageModelProvider, LanguageModelRegistry};
 use project::{Project, ProjectItem, ProjectPath, Worktree};
 use prompt_store::{
     ProjectContext, PromptStore, RULES_FILE_NAMES, RulesFileContext, UserRulesContext,
@@ -153,7 +153,10 @@ impl LanguageModels {
                 id: Self::model_id(model),
                 name: model.name().0,
                 description: None,
-                icon: Some(provider.icon()),
+                icon: Some(match provider.icon() {
+                    IconOrSvg::Svg(path) => acp_thread::AgentModelIcon::Path(path),
+                    IconOrSvg::Icon(name) => acp_thread::AgentModelIcon::Named(name),
+                }),
             }
         }
@@ -1630,7 +1633,9 @@ mod internal_tests {
                 id: acp::ModelId::new("fake/fake"),
                 name: "Fake".into(),
                 description: None,
-                icon: Some(language_model::IconOrSvg::Icon(ui::IconName::ZedAssistant)),
+                icon: Some(acp_thread::AgentModelIcon::Named(
+                    ui::IconName::ZedAssistant
+                )),
             }]
         )])
     );
@@ -5,7 +5,7 @@ use crate::{AgentServer, AgentServerDelegate, load_proxy_env};
 use acp_thread::AgentConnection;
 use anyhow::{Context as _, Result};
 use gpui::{App, SharedString, Task};
-use language_models::api_key_for_gemini_cli;
+use language_models::provider::google::GoogleLanguageModelProvider;
 use project::agent_server_store::GEMINI_NAME;

 #[derive(Clone)]
@@ -37,7 +37,11 @@ impl AgentServer for Gemini {
         cx.spawn(async move |cx| {
             extra_env.insert("SURFACE".to_owned(), "zed".to_owned());

-            if let Some(api_key) = cx.update(api_key_for_gemini_cli)?.await.ok() {
+            if let Some(api_key) = cx
+                .update(GoogleLanguageModelProvider::api_key_for_gemini_cli)?
+                .await
+                .ok()
+            {
                 extra_env.insert("GEMINI_API_KEY".into(), api_key);
             }
             let (command, root_dir, login) = store
@@ -1,6 +1,6 @@
 use std::{cmp::Reverse, rc::Rc, sync::Arc};

-use acp_thread::{AgentModelInfo, AgentModelList, AgentModelSelector};
+use acp_thread::{AgentModelIcon, AgentModelInfo, AgentModelList, AgentModelSelector};
 use agent_client_protocol::ModelId;
 use agent_servers::AgentServer;
 use agent_settings::AgentSettings;
@@ -13,7 +13,6 @@ use gpui::{
     Action, AsyncWindowContext, BackgroundExecutor, DismissEvent, FocusHandle, Task, WeakEntity,
 };
 use itertools::Itertools;
-use language_model::IconOrSvg;
 use ordered_float::OrderedFloat;
 use picker::{Picker, PickerDelegate};
 use settings::Settings;
@@ -352,8 +351,8 @@ impl PickerDelegate for AcpModelPickerDelegate {
         .child(
             ModelSelectorListItem::new(ix, model_info.name.clone())
                 .map(|this| match &model_info.icon {
-                    Some(IconOrSvg::Svg(path)) => this.icon_path(path.clone()),
-                    Some(IconOrSvg::Icon(icon)) => this.icon(*icon),
+                    Some(AgentModelIcon::Path(path)) => this.icon_path(path.clone()),
+                    Some(AgentModelIcon::Named(icon)) => this.icon(*icon),
                     None => this,
                 })
                 .is_selected(is_selected)
@@ -1,12 +1,11 @@
 use std::rc::Rc;
 use std::sync::Arc;

-use acp_thread::{AgentModelInfo, AgentModelSelector};
+use acp_thread::{AgentModelIcon, AgentModelInfo, AgentModelSelector};
 use agent_servers::AgentServer;
 use agent_settings::AgentSettings;
 use fs::Fs;
 use gpui::{Entity, FocusHandle};
-use language_model::IconOrSvg;
 use picker::popover_menu::PickerPopoverMenu;
 use settings::Settings as _;
 use ui::{ButtonLike, KeyBinding, PopoverMenuHandle, TintColor, Tooltip, prelude::*};
@@ -128,8 +127,8 @@ impl Render for AcpModelSelectorPopover {
     .when_some(model_icon, |this, icon| {
         this.child(
             match icon {
-                IconOrSvg::Svg(path) => Icon::from_external_svg(path),
-                IconOrSvg::Icon(icon_name) => Icon::new(icon_name),
+                AgentModelIcon::Path(path) => Icon::from_external_svg(path),
+                AgentModelIcon::Named(icon_name) => Icon::new(icon_name),
             }
             .color(color)
             .size(IconSize::XSmall),
@@ -368,49 +368,26 @@ fn update_active_language_model_from_settings(cx: &mut App) {
        }
    }

    // Filter out models from providers that are not authenticated
    fn is_provider_authenticated(
        selection: &LanguageModelSelection,
        registry: &LanguageModelRegistry,
        cx: &App,
    ) -> bool {
        let provider_id = LanguageModelProviderId::from(selection.provider.0.clone());
        registry
            .provider(&provider_id)
            .map_or(false, |provider| provider.is_authenticated(cx))
    }

    let registry = LanguageModelRegistry::global(cx);
    let registry_ref = registry.read(cx);

    let default = settings
        .default_model
        .as_ref()
        .filter(|s| is_provider_authenticated(s, registry_ref, cx))
        .map(to_selected_model);
    let default = settings.default_model.as_ref().map(to_selected_model);
    let inline_assistant = settings
        .inline_assistant_model
        .as_ref()
        .filter(|s| is_provider_authenticated(s, registry_ref, cx))
        .map(to_selected_model);
    let commit_message = settings
        .commit_message_model
        .as_ref()
        .filter(|s| is_provider_authenticated(s, registry_ref, cx))
        .map(to_selected_model);
    let thread_summary = settings
        .thread_summary_model
        .as_ref()
        .filter(|s| is_provider_authenticated(s, registry_ref, cx))
        .map(to_selected_model);
    let inline_alternatives = settings
        .inline_alternatives
        .iter()
        .filter(|s| is_provider_authenticated(s, registry_ref, cx))
        .map(to_selected_model)
        .collect::<Vec<_>>();

    registry.update(cx, |registry, cx| {
    LanguageModelRegistry::global(cx).update(cx, |registry, cx| {
        registry.select_default_model(default.as_ref(), cx);
        registry.select_inline_assistant_model(inline_assistant.as_ref(), cx);
        registry.select_commit_message_model(commit_message.as_ref(), cx);

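The `is_provider_authenticated` helper above uses the `Option::map_or` idiom: a selection survives only if its provider both exists in the registry and is authenticated. A minimal, self-contained sketch of that filtering pattern (the `Provider`/`Registry` types here are illustrative stand-ins, not Zed's actual API):

```rust
use std::collections::HashMap;

// Hypothetical stand-ins for the registry types, just to show the pattern.
struct Provider {
    authenticated: bool,
}

struct Registry {
    providers: HashMap<String, Provider>,
}

impl Registry {
    fn provider(&self, id: &str) -> Option<&Provider> {
        self.providers.get(id)
    }
}

// Unknown providers fall through to `false` via `map_or`, so a stale
// selection pointing at a removed provider is filtered out too.
fn is_provider_authenticated(registry: &Registry, provider_id: &str) -> bool {
    registry
        .provider(provider_id)
        .map_or(false, |p| p.authenticated)
}

fn main() {
    let mut providers = HashMap::new();
    providers.insert("anthropic".to_string(), Provider { authenticated: true });
    providers.insert("openai".to_string(), Provider { authenticated: false });
    let registry = Registry { providers };

    let selections = ["anthropic", "openai", "missing"];
    let kept: Vec<&str> = selections
        .iter()
        .copied()
        .filter(|id| is_provider_authenticated(&registry, id))
        .collect();
    assert_eq!(kept, vec!["anthropic"]);
}
```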
@@ -1259,26 +1259,28 @@ impl InlineAssistant {
                let bottom = top + 1.0;
                (top, bottom)
            });
            let height_in_lines = editor.visible_line_count().unwrap_or(0.);
            let vertical_scroll_margin = editor.vertical_scroll_margin() as ScrollOffset;
            let scroll_target_top = (scroll_target_range.0 - vertical_scroll_margin)
                // Don't scroll up too far in the case of a large vertical_scroll_margin.
                .max(scroll_target_range.0 - height_in_lines / 2.0);
            let scroll_target_bottom = (scroll_target_range.1 + vertical_scroll_margin)
                // Don't scroll down past where the top would still be visible.
                .min(scroll_target_top + height_in_lines);
            let mut scroll_target_top = scroll_target_range.0;
            let mut scroll_target_bottom = scroll_target_range.1;

            scroll_target_top -= editor.vertical_scroll_margin() as ScrollOffset;
            scroll_target_bottom += editor.vertical_scroll_margin() as ScrollOffset;

            let height_in_lines = editor.visible_line_count().unwrap_or(0.);
            let scroll_top = editor.scroll_position(cx).y;
            let scroll_bottom = scroll_top + height_in_lines;

            if scroll_target_top < scroll_top {
                editor.set_scroll_position(point(0., scroll_target_top), window, cx);
            } else if scroll_target_bottom > scroll_bottom {
                editor.set_scroll_position(
                    point(0., scroll_target_bottom - height_in_lines),
                    window,
                    cx,
                );
                if (scroll_target_bottom - scroll_target_top) <= height_in_lines {
                    editor.set_scroll_position(
                        point(0., scroll_target_bottom - height_in_lines),
                        window,
                        cx,
                    );
                } else {
                    editor.set_scroll_position(point(0., scroll_target_top), window, cx);
                }
            }
        });
    }

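The scrolling decision above can be sketched as a pure function over plain `f32` line coordinates (the function name and signature are illustrative assumptions, not Zed's API): expand the target range by the scroll margin, then scroll only if the expanded range is not already visible, preferring to keep the range's top on screen when it is taller than the viewport.

```rust
// `target` is the (top, bottom) line range to reveal; returns the new scroll top.
fn clamp_scroll(target: (f32, f32), margin: f32, scroll_top: f32, height_in_lines: f32) -> f32 {
    let target_top = target.0 - margin;
    let target_bottom = target.1 + margin;
    let scroll_bottom = scroll_top + height_in_lines;

    if target_top < scroll_top {
        // Range starts above the viewport: scroll up to its top.
        target_top
    } else if target_bottom > scroll_bottom {
        if (target_bottom - target_top) <= height_in_lines {
            // The whole range fits: align its bottom with the viewport bottom.
            target_bottom - height_in_lines
        } else {
            // Range is taller than the viewport: show its top.
            target_top
        }
    } else {
        // Already visible: don't move.
        scroll_top
    }
}

fn main() {
    // Target above the viewport scrolls up to the expanded top.
    assert_eq!(clamp_scroll((50.0, 51.0), 3.0, 60.0, 20.0), 47.0);
    // Target below the viewport (and small enough to fit) aligns its bottom.
    assert_eq!(clamp_scroll((90.0, 91.0), 3.0, 60.0, 20.0), 74.0);
    // Target already visible leaves the scroll position unchanged.
    assert_eq!(clamp_scroll((65.0, 66.0), 3.0, 60.0, 20.0), 60.0);
}
```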
@@ -2,9 +2,10 @@ use std::{cmp::Reverse, sync::Arc};

use agent_settings::AgentSettings;
use collections::{HashMap, HashSet, IndexMap};
use futures::{StreamExt, channel::mpsc};
use fuzzy::{StringMatch, StringMatchCandidate, match_strings};
use gpui::{Action, AnyElement, App, BackgroundExecutor, DismissEvent, FocusHandle, Task};
use gpui::{
    Action, AnyElement, App, BackgroundExecutor, DismissEvent, FocusHandle, Subscription, Task,
};
use language_model::{
    AuthenticateError, ConfiguredModel, IconOrSvg, LanguageModel, LanguageModelId,
    LanguageModelProvider, LanguageModelProviderId, LanguageModelRegistry,
@@ -75,7 +76,7 @@ fn all_models(cx: &App) -> GroupedModels {
        })
        .collect();

    let all: Vec<ModelInfo> = providers
    let all = providers
        .iter()
        .flat_map(|provider| {
            provider
@@ -123,7 +124,7 @@ pub struct LanguageModelPickerDelegate {
    filtered_entries: Vec<LanguageModelPickerEntry>,
    selected_index: usize,
    _authenticate_all_providers_task: Task<()>,
    _refresh_models_task: Task<()>,
    _subscriptions: Vec<Subscription>,
    popover_styles: bool,
    focus_handle: FocusHandle,
}
@@ -150,42 +151,24 @@ impl LanguageModelPickerDelegate {
            get_active_model: Arc::new(get_active_model),
            on_toggle_favorite: Arc::new(on_toggle_favorite),
            _authenticate_all_providers_task: Self::authenticate_all_providers(cx),
            _refresh_models_task: {
                // Create a channel to signal when models need refreshing
                let (refresh_tx, mut refresh_rx) = mpsc::unbounded::<()>();

                // Subscribe to registry events and send refresh signals through the channel
                let registry = LanguageModelRegistry::global(cx);
                cx.subscribe(&registry, move |_picker, _, event, _cx| match event {
                    language_model::Event::ProviderStateChanged(_)
                    | language_model::Event::AddedProvider(_)
                    | language_model::Event::RemovedProvider(_)
                    | language_model::Event::ProvidersChanged => {
                        refresh_tx.unbounded_send(()).ok();
                    }
                    language_model::Event::DefaultModelChanged
                    | language_model::Event::InlineAssistantModelChanged
                    | language_model::Event::CommitMessageModelChanged
                    | language_model::Event::ThreadSummaryModelChanged => {}
                })
                .detach();

                // Spawn a task that listens for refresh signals and updates the picker
                cx.spawn_in(window, async move |this, cx| {
                    while let Some(()) = refresh_rx.next().await {
                        if this
                            .update_in(cx, |picker, window, cx| {
                                picker.delegate.all_models = Arc::new(all_models(cx));
                                picker.refresh(window, cx);
                            })
                            .is_err()
                        {
                            // Picker was dropped, exit the loop
                            break;
            _subscriptions: vec![cx.subscribe_in(
                &LanguageModelRegistry::global(cx),
                window,
                |picker, _, event, window, cx| {
                    match event {
                        language_model::Event::ProviderStateChanged(_)
                        | language_model::Event::AddedProvider(_)
                        | language_model::Event::RemovedProvider(_) => {
                            let query = picker.query(cx);
                            picker.delegate.all_models = Arc::new(all_models(cx));
                            // Update matches will automatically drop the previous task
                            // if we get a provider event again
                            picker.update_matches(query, window, cx)
                        }
                        _ => {}
                    }
                })
                },
            },
            )],
            popover_styles,
            focus_handle,
        }

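The `_refresh_models_task` being removed above used a classic channel-based refresh pattern: registry events push a unit signal into an unbounded channel, and a single worker drains it and rebuilds the model list, exiting when the sender side is dropped. A minimal sketch of that pattern with `std::sync::mpsc` and a thread standing in for the gpui task (names here are illustrative):

```rust
use std::sync::mpsc;
use std::thread;

// Sends `signals` refresh signals, then closes the channel; the worker
// counts how many refreshes it performed before the channel closed.
fn run_refresh_loop(signals: usize) -> usize {
    let (refresh_tx, refresh_rx) = mpsc::channel::<()>();

    let worker = thread::spawn(move || {
        let mut refreshes = 0;
        // `recv` returns Err once every sender is dropped, ending the loop --
        // the analogue of the picker being dropped in the removed code.
        while refresh_rx.recv().is_ok() {
            refreshes += 1; // stand-in for "rebuild the model list"
        }
        refreshes
    });

    for _ in 0..signals {
        refresh_tx.send(()).unwrap(); // e.g. on ProviderStateChanged
    }
    drop(refresh_tx); // picker dropped: worker loop exits

    worker.join().unwrap()
}

fn main() {
    assert_eq!(run_refresh_loop(3), 3);
}
```

The replacement code drops the channel entirely and refreshes synchronously inside the subscription callback, which is simpler when no debouncing is needed.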
@@ -1,6 +1,11 @@
use gpui::{Action, FocusHandle, prelude::*};
use ui::{ElevationIndex, KeyBinding, ListItem, ListItemSpacing, Tooltip, prelude::*};

enum ModelIcon {
    Name(IconName),
    Path(SharedString),
}

#[derive(IntoElement)]
pub struct ModelSelectorHeader {
    title: SharedString,
@@ -35,11 +40,6 @@ impl RenderOnce for ModelSelectorHeader {
    }
}

enum ModelIcon {
    Name(IconName),
    Path(SharedString),
}

#[derive(IntoElement)]
pub struct ModelSelectorListItem {
    index: usize,

@@ -8,7 +8,7 @@ use futures::{AsyncBufReadExt, AsyncReadExt, StreamExt, io::BufReader, stream::B
use http_client::http::{self, HeaderMap, HeaderValue};
use http_client::{AsyncBody, HttpClient, Method, Request as HttpRequest, StatusCode};
use serde::{Deserialize, Serialize};
pub use settings::ModelMode;
pub use settings::{AnthropicAvailableModel as AvailableModel, ModelMode};
use strum::{EnumIter, EnumString};
use thiserror::Error;

@@ -1159,34 +1159,6 @@ impl BufferDiff {
        new_index_text
    }

    pub fn stage_or_unstage_all_hunks(
        &mut self,
        stage: bool,
        buffer: &text::BufferSnapshot,
        file_exists: bool,
        cx: &mut Context<Self>,
    ) {
        let hunks = self
            .hunks_intersecting_range(Anchor::MIN..Anchor::MAX, buffer, cx)
            .collect::<Vec<_>>();
        let Some(secondary) = self.secondary_diff.as_ref() else {
            return;
        };
        self.inner.stage_or_unstage_hunks_impl(
            &secondary.read(cx).inner,
            stage,
            &hunks,
            buffer,
            file_exists,
        );
        if let Some((first, last)) = hunks.first().zip(hunks.last()) {
            let changed_range = first.buffer_range.start..last.buffer_range.end;
            cx.emit(BufferDiffEvent::DiffChanged {
                changed_range: Some(changed_range),
            });
        }
    }

    pub fn range_to_hunk_range(
        &self,
        range: Range<Anchor>,

@@ -20475,7 +20475,7 @@ impl Editor {
        EditorSettings::get_global(cx).gutter.line_numbers
    }

    pub fn relative_line_numbers(&self, cx: &App) -> RelativeLineNumbers {
    pub fn relative_line_numbers(&self, cx: &mut App) -> RelativeLineNumbers {
        match (
            self.use_relative_line_numbers,
            EditorSettings::get_global(cx).relative_line_numbers,
@@ -25294,70 +25294,6 @@ impl EditorSnapshot {
        let digit_count = self.widest_line_number().ilog10() + 1;
        column_pixels(style, digit_count as usize, window)
    }

    /// Returns the line delta from `base` to `line` in the multibuffer, ignoring wrapped lines.
    ///
    /// This is positive if `base` is before `line`.
    fn relative_line_delta(&self, base: DisplayRow, line: DisplayRow) -> i64 {
        let point = DisplayPoint::new(line, 0).to_point(self);
        self.relative_line_delta_to_point(base, point)
    }

    /// Returns the line delta from `base` to `point` in the multibuffer, ignoring wrapped lines.
    ///
    /// This is positive if `base` is before `point`.
    pub fn relative_line_delta_to_point(&self, base: DisplayRow, point: Point) -> i64 {
        let base_point = DisplayPoint::new(base, 0).to_point(self);
        point.row as i64 - base_point.row as i64
    }

    /// Returns the line delta from `base` to `line` in the multibuffer, counting wrapped lines.
    ///
    /// This is positive if `base` is before `line`.
    fn relative_wrapped_line_delta(&self, base: DisplayRow, line: DisplayRow) -> i64 {
        let point = DisplayPoint::new(line, 0).to_point(self);
        self.relative_wrapped_line_delta_to_point(base, point)
    }

    /// Returns the line delta from `base` to `point` in the multibuffer, counting wrapped lines.
    ///
    /// This is positive if `base` is before `point`.
    pub fn relative_wrapped_line_delta_to_point(&self, base: DisplayRow, point: Point) -> i64 {
        let base_point = DisplayPoint::new(base, 0).to_point(self);
        let wrap_snapshot = self.wrap_snapshot();
        let base_wrap_row = wrap_snapshot.make_wrap_point(base_point, Bias::Left).row();
        let wrap_row = wrap_snapshot.make_wrap_point(point, Bias::Left).row();
        wrap_row.0 as i64 - base_wrap_row.0 as i64
    }

    /// Returns the unsigned relative line number to display for each row in `rows`.
    ///
    /// Wrapped rows are excluded from the hashmap if `count_wrapped_lines` is `false`.
    pub fn calculate_relative_line_numbers(
        &self,
        rows: &Range<DisplayRow>,
        relative_to: DisplayRow,
        count_wrapped_lines: bool,
    ) -> HashMap<DisplayRow, u32> {
        let initial_offset = if count_wrapped_lines {
            self.relative_wrapped_line_delta(relative_to, rows.start)
        } else {
            self.relative_line_delta(relative_to, rows.start)
        };
        let display_row_infos = self
            .row_infos(rows.start)
            .take(rows.len())
            .enumerate()
            .map(|(i, row_info)| (DisplayRow(rows.start.0 + i as u32), row_info));
        display_row_infos
            .filter(|(_row, row_info)| {
                row_info.buffer_row.is_some()
                    || (count_wrapped_lines && row_info.wrapped_buffer_row.is_some())
            })
            .enumerate()
            .map(|(i, (row, _row_info))| (row, (initial_offset + i as i64).unsigned_abs() as u32))
            .collect()
    }
}

pub fn column_pixels(style: &EditorStyle, column: usize, window: &Window) -> Pixels {

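The numbering rule implemented by `calculate_relative_line_numbers` above can be modeled on a flat list of rows: each countable row (one that maps to a buffer line) is numbered by the absolute signed distance, in countable rows, from the cursor's row. This is a simplified sketch under the assumption that every display row maps to at most one buffer line; the real code additionally handles multibuffer excerpts and soft-wrap via `row_infos`.

```rust
// `countable[i]` is true when display row `i` has a buffer line (the analogue
// of `row_info.buffer_row.is_some()`); non-countable rows get no number.
fn relative_numbers(countable: &[bool], relative_to: usize) -> Vec<Option<u32>> {
    let mut out = vec![None; countable.len()];
    // Signed delta from `relative_to` back to row 0, counting countable rows,
    // mirroring `initial_offset` in the snapshot code.
    let initial_offset = -(countable[..relative_to].iter().filter(|&&c| c).count() as i64);
    let mut i: i64 = 0;
    for (row, &c) in countable.iter().enumerate() {
        if c {
            // `unsigned_abs()` turns the signed delta into a display number.
            out[row] = Some((initial_offset + i).unsigned_abs() as u32);
            i += 1;
        }
    }
    out
}

fn main() {
    // Row 2 is an excerpt header (not countable); cursor is on row 3.
    let numbers = relative_numbers(&[true, true, false, true, true], 3);
    assert_eq!(numbers, vec![Some(2), Some(1), None, Some(0), Some(1)]);
}
```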
@@ -36,8 +36,7 @@ use languages::markdown_lang;
use languages::rust_lang;
use lsp::CompletionParams;
use multi_buffer::{
    ExcerptRange, IndentGuide, MultiBuffer, MultiBufferFilterMode, MultiBufferOffset,
    MultiBufferOffsetUtf16, PathKey,
    IndentGuide, MultiBufferFilterMode, MultiBufferOffset, MultiBufferOffsetUtf16, PathKey,
};
use parking_lot::Mutex;
use pretty_assertions::{assert_eq, assert_ne};
@@ -28634,130 +28633,6 @@ async fn test_sticky_scroll(cx: &mut TestAppContext) {
    assert_eq!(sticky_headers(10.0), vec![]);
}

#[gpui::test]
fn test_relative_line_numbers(cx: &mut TestAppContext) {
    init_test(cx, |_| {});

    let buffer_1 = cx.new(|cx| Buffer::local("aaaaaaaaaa\nbbb\n", cx));
    let buffer_2 = cx.new(|cx| Buffer::local("cccccccccc\nddd\n", cx));
    let buffer_3 = cx.new(|cx| Buffer::local("eee\nffffffffff\n", cx));

    let multibuffer = cx.new(|cx| {
        let mut multibuffer = MultiBuffer::new(ReadWrite);
        multibuffer.push_excerpts(
            buffer_1.clone(),
            [ExcerptRange::new(Point::new(0, 0)..Point::new(2, 0))],
            cx,
        );
        multibuffer.push_excerpts(
            buffer_2.clone(),
            [ExcerptRange::new(Point::new(0, 0)..Point::new(2, 0))],
            cx,
        );
        multibuffer.push_excerpts(
            buffer_3.clone(),
            [ExcerptRange::new(Point::new(0, 0)..Point::new(2, 0))],
            cx,
        );
        multibuffer
    });

    // wrapped contents of multibuffer:
    // aaa
    // aaa
    // aaa
    // a
    // bbb
    //
    // ccc
    // ccc
    // ccc
    // c
    // ddd
    //
    // eee
    // fff
    // fff
    // fff
    // f

    let (editor, cx) = cx.add_window_view(|window, cx| build_editor(multibuffer, window, cx));
    editor.update_in(cx, |editor, window, cx| {
        editor.set_wrap_width(Some(30.0.into()), cx); // every 3 characters

        // includes trailing newlines.
        let expected_line_numbers = [2, 6, 7, 10, 14, 15, 18, 19, 23];
        let expected_wrapped_line_numbers = [
            2, 3, 4, 5, 6, 7, 10, 11, 12, 13, 14, 15, 18, 19, 20, 21, 22, 23,
        ];

        editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
            s.select_ranges([
                Point::new(7, 0)..Point::new(7, 1), // second row of `ccc`
            ]);
        });

        let snapshot = editor.snapshot(window, cx);

        // these are all 0-indexed
        let base_display_row = DisplayRow(11);
        let base_row = 3;
        let wrapped_base_row = 7;

        // test not counting wrapped lines
        let expected_relative_numbers = expected_line_numbers
            .into_iter()
            .enumerate()
            .map(|(i, row)| (DisplayRow(row), i.abs_diff(base_row) as u32))
            .collect_vec();
        let actual_relative_numbers = snapshot
            .calculate_relative_line_numbers(
                &(DisplayRow(0)..DisplayRow(24)),
                base_display_row,
                false,
            )
            .into_iter()
            .sorted()
            .collect_vec();
        assert_eq!(expected_relative_numbers, actual_relative_numbers);
        // check `calculate_relative_line_numbers()` against `relative_line_delta()` for each line
        for (display_row, relative_number) in expected_relative_numbers {
            assert_eq!(
                relative_number,
                snapshot
                    .relative_line_delta(display_row, base_display_row)
                    .unsigned_abs() as u32,
            );
        }

        // test counting wrapped lines
        let expected_wrapped_relative_numbers = expected_wrapped_line_numbers
            .into_iter()
            .enumerate()
            .map(|(i, row)| (DisplayRow(row), i.abs_diff(wrapped_base_row) as u32))
            .collect_vec();
        let actual_relative_numbers = snapshot
            .calculate_relative_line_numbers(
                &(DisplayRow(0)..DisplayRow(24)),
                base_display_row,
                true,
            )
            .into_iter()
            .sorted()
            .collect_vec();
        assert_eq!(expected_wrapped_relative_numbers, actual_relative_numbers);
        // check `calculate_relative_line_numbers()` against `relative_wrapped_line_delta()` for each line
        for (display_row, relative_number) in expected_wrapped_relative_numbers {
            assert_eq!(
                relative_number,
                snapshot
                    .relative_wrapped_line_delta(display_row, base_display_row)
                    .unsigned_abs() as u32,
            );
        }
    });
}

#[gpui::test]
async fn test_scroll_by_clicking_sticky_header(cx: &mut TestAppContext) {
    init_test(cx, |_| {});

@@ -66,7 +66,7 @@ use project::{
};
use settings::{
    GitGutterSetting, GitHunkStyleSetting, IndentGuideBackgroundColoring, IndentGuideColoring,
    RelativeLineNumbers, Settings,
    Settings,
};
use smallvec::{SmallVec, smallvec};
use std::{
@@ -194,6 +194,8 @@ pub struct EditorElement {
    style: EditorStyle,
}

type DisplayRowDelta = u32;

impl EditorElement {
    pub(crate) const SCROLLBAR_WIDTH: Pixels = px(15.);
@@ -3223,6 +3225,64 @@ impl EditorElement {
            .collect()
    }

    fn calculate_relative_line_numbers(
        &self,
        snapshot: &EditorSnapshot,
        rows: &Range<DisplayRow>,
        relative_to: Option<DisplayRow>,
        count_wrapped_lines: bool,
    ) -> HashMap<DisplayRow, DisplayRowDelta> {
        let mut relative_rows: HashMap<DisplayRow, DisplayRowDelta> = Default::default();
        let Some(relative_to) = relative_to else {
            return relative_rows;
        };

        let start = rows.start.min(relative_to);
        let end = rows.end.max(relative_to);

        let buffer_rows = snapshot
            .row_infos(start)
            .take(1 + end.minus(start) as usize)
            .collect::<Vec<_>>();

        let head_idx = relative_to.minus(start);
        let mut delta = 1;
        let mut i = head_idx + 1;
        let should_count_line = |row_info: &RowInfo| {
            if count_wrapped_lines {
                row_info.buffer_row.is_some() || row_info.wrapped_buffer_row.is_some()
            } else {
                row_info.buffer_row.is_some()
            }
        };
        while i < buffer_rows.len() as u32 {
            if should_count_line(&buffer_rows[i as usize]) {
                if rows.contains(&DisplayRow(i + start.0)) {
                    relative_rows.insert(DisplayRow(i + start.0), delta);
                }
                delta += 1;
            }
            i += 1;
        }
        delta = 1;
        i = head_idx.min(buffer_rows.len().saturating_sub(1) as u32);
        while i > 0 && buffer_rows[i as usize].buffer_row.is_none() && !count_wrapped_lines {
            i -= 1;
        }

        while i > 0 {
            i -= 1;
            if should_count_line(&buffer_rows[i as usize]) {
                if rows.contains(&DisplayRow(i + start.0)) {
                    relative_rows.insert(DisplayRow(i + start.0), delta);
                }
                delta += 1;
            }
        }

        relative_rows
    }

    fn layout_line_numbers(
        &self,
        gutter_hitbox: Option<&Hitbox>,
@@ -3232,7 +3292,7 @@ impl EditorElement {
        rows: Range<DisplayRow>,
        buffer_rows: &[RowInfo],
        active_rows: &BTreeMap<DisplayRow, LineHighlightSpec>,
        relative_line_base: Option<DisplayRow>,
        newest_selection_head: Option<DisplayPoint>,
        snapshot: &EditorSnapshot,
        window: &mut Window,
        cx: &mut App,
@@ -3244,16 +3304,32 @@ impl EditorElement {
            return Arc::default();
        }

        let relative = self.editor.read(cx).relative_line_numbers(cx);
        let (newest_selection_head, relative) = self.editor.update(cx, |editor, cx| {
            let newest_selection_head = newest_selection_head.unwrap_or_else(|| {
                let newest = editor
                    .selections
                    .newest::<Point>(&editor.display_snapshot(cx));
                SelectionLayout::new(
                    newest,
                    editor.selections.line_mode(),
                    editor.cursor_offset_on_selection,
                    editor.cursor_shape,
                    &snapshot.display_snapshot,
                    true,
                    true,
                    None,
                )
                .head
            });
            let relative = editor.relative_line_numbers(cx);
            (newest_selection_head, relative)
        });

        let relative_line_numbers_enabled = relative.enabled();
        let relative_rows = if relative_line_numbers_enabled && let Some(base) = relative_line_base
        {
            snapshot.calculate_relative_line_numbers(&rows, base, relative.wrapped())
        } else {
            Default::default()
        };
        let relative_to = relative_line_numbers_enabled.then(|| newest_selection_head.row());

        let relative_rows =
            self.calculate_relative_line_numbers(snapshot, &rows, relative_to, relative.wrapped());
        let mut line_number = String::new();
        let segments = buffer_rows.iter().enumerate().flat_map(|(ix, row_info)| {
            let display_row = DisplayRow(rows.start.0 + ix as u32);
@@ -4576,8 +4652,6 @@ impl EditorElement {
        gutter_hitbox: &Hitbox,
        text_hitbox: &Hitbox,
        style: &EditorStyle,
        relative_line_numbers: RelativeLineNumbers,
        relative_to: Option<DisplayRow>,
        window: &mut Window,
        cx: &mut App,
    ) -> Option<StickyHeaders> {
@@ -4607,21 +4681,9 @@ impl EditorElement {
            );

            let line_number = show_line_numbers.then(|| {
                let relative_number = relative_to.and_then(|base| match relative_line_numbers {
                    RelativeLineNumbers::Disabled => None,
                    RelativeLineNumbers::Enabled => {
                        Some(snapshot.relative_line_delta_to_point(base, start_point))
                    }
                    RelativeLineNumbers::Wrapped => {
                        Some(snapshot.relative_wrapped_line_delta_to_point(base, start_point))
                    }
                });
                let number = relative_number
                    .filter(|&delta| delta != 0)
                    .map(|delta| delta.unsigned_abs() as u32)
                    .unwrap_or(start_point.row + 1);
                let number = (start_point.row + 1).to_string();
                let color = cx.theme().colors().editor_line_number;
                self.shape_line_number(SharedString::from(number.to_string()), color, window)
                self.shape_line_number(SharedString::from(number), color, window)
            });

            lines.push(StickyHeaderLine::new(
@@ -9374,28 +9436,6 @@ impl Element for EditorElement {
                window,
                cx,
            );

            // relative rows are based on newest selection, even outside the visible area
            let relative_row_base = self.editor.update(cx, |editor, cx| {
                if editor.selections.count() == 0 {
                    return None;
                }
                let newest = editor
                    .selections
                    .newest::<Point>(&editor.display_snapshot(cx));
                Some(SelectionLayout::new(
                    newest,
                    editor.selections.line_mode(),
                    editor.cursor_offset_on_selection,
                    editor.cursor_shape,
                    &snapshot.display_snapshot,
                    true,
                    true,
                    None,
                )
                .head
                .row())
            });

            let mut breakpoint_rows = self.editor.update(cx, |editor, cx| {
                editor.active_breakpoints(start_row..end_row, window, cx)
            });
@@ -9413,7 +9453,7 @@ impl Element for EditorElement {
                start_row..end_row,
                &row_infos,
                &active_rows,
                relative_row_base,
                newest_selection_head,
                &snapshot,
                window,
                cx,
@@ -9733,7 +9773,6 @@ impl Element for EditorElement {
                && is_singleton
                && EditorSettings::get_global(cx).sticky_scroll.enabled
            {
                let relative = self.editor.read(cx).relative_line_numbers(cx);
                self.layout_sticky_headers(
                    &snapshot,
                    editor_width,
@@ -9745,8 +9784,6 @@ impl Element for EditorElement {
                    &gutter_hitbox,
                    &text_hitbox,
                    &style,
                    relative,
                    relative_row_base,
                    window,
                    cx,
                )

@@ -11594,7 +11631,7 @@ mod tests {
    }

    #[gpui::test]
    fn test_layout_line_numbers(cx: &mut TestAppContext) {
    fn test_shape_line_numbers(cx: &mut TestAppContext) {
        init_test(cx, |_| {});
        let window = cx.add_window(|window, cx| {
            let buffer = MultiBuffer::build_simple(&sample_text(6, 6, 'a'), cx);
@@ -11634,7 +11671,7 @@ mod tests {
                })
                .collect::<Vec<_>>(),
                &BTreeMap::default(),
                Some(DisplayRow(0)),
                Some(DisplayPoint::new(DisplayRow(0), 0)),
                &snapshot,
                window,
                cx,
@@ -11646,9 +11683,10 @@ mod tests {
        let relative_rows = window
            .update(cx, |editor, window, cx| {
                let snapshot = editor.snapshot(window, cx);
                snapshot.calculate_relative_line_numbers(
                element.calculate_relative_line_numbers(
                    &snapshot,
                    &(DisplayRow(0)..DisplayRow(6)),
                    DisplayRow(3),
                    Some(DisplayRow(3)),
                    false,
                )
            })
@@ -11664,9 +11702,10 @@ mod tests {
        let relative_rows = window
            .update(cx, |editor, window, cx| {
                let snapshot = editor.snapshot(window, cx);
                snapshot.calculate_relative_line_numbers(
                element.calculate_relative_line_numbers(
                    &snapshot,
                    &(DisplayRow(3)..DisplayRow(6)),
                    DisplayRow(1),
                    Some(DisplayRow(1)),
                    false,
                )
            })
@@ -11680,9 +11719,10 @@ mod tests {
        let relative_rows = window
            .update(cx, |editor, window, cx| {
                let snapshot = editor.snapshot(window, cx);
                snapshot.calculate_relative_line_numbers(
                element.calculate_relative_line_numbers(
                    &snapshot,
                    &(DisplayRow(0)..DisplayRow(3)),
                    DisplayRow(6),
                    Some(DisplayRow(6)),
                    false,
                )
            })
@@ -11719,7 +11759,7 @@ mod tests {
                })
                .collect::<Vec<_>>(),
                &BTreeMap::default(),
                Some(DisplayRow(0)),
                Some(DisplayPoint::new(DisplayRow(0), 0)),
                &snapshot,
                window,
                cx,
@@ -11734,7 +11774,7 @@ mod tests {
    }

    #[gpui::test]
    fn test_layout_line_numbers_wrapping(cx: &mut TestAppContext) {
    fn test_shape_line_numbers_wrapping(cx: &mut TestAppContext) {
        init_test(cx, |_| {});
        let window = cx.add_window(|window, cx| {
            let buffer = MultiBuffer::build_simple(&sample_text(6, 6, 'a'), cx);
@@ -11779,7 +11819,7 @@ mod tests {
                })
                .collect::<Vec<_>>(),
                &BTreeMap::default(),
                Some(DisplayRow(0)),
                Some(DisplayPoint::new(DisplayRow(0), 0)),
                &snapshot,
                window,
                cx,
@@ -11791,9 +11831,10 @@ mod tests {
        let relative_rows = window
            .update(cx, |editor, window, cx| {
                let snapshot = editor.snapshot(window, cx);
                snapshot.calculate_relative_line_numbers(
                element.calculate_relative_line_numbers(
                    &snapshot,
                    &(DisplayRow(0)..DisplayRow(6)),
                    DisplayRow(3),
                    Some(DisplayRow(3)),
                    true,
                )
            })
@@ -11830,7 +11871,7 @@ mod tests {
                })
                .collect::<Vec<_>>(),
                &BTreeMap::from_iter([(DisplayRow(0), LineHighlightSpec::default())]),
                Some(DisplayRow(0)),
                Some(DisplayPoint::new(DisplayRow(0), 0)),
                &snapshot,
                window,
                cx,
@@ -11845,9 +11886,10 @@ mod tests {
        let relative_rows = window
            .update(cx, |editor, window, cx| {
                let snapshot = editor.snapshot(window, cx);
                snapshot.calculate_relative_line_numbers(
                element.calculate_relative_line_numbers(
                    &snapshot,
                    &(DisplayRow(0)..DisplayRow(6)),
                    DisplayRow(3),
                    Some(DisplayRow(3)),
                    true,
                )
            })

@@ -389,49 +389,6 @@ pub trait ExtensionContextServerProxy: Send + Sync + 'static {
    fn unregister_context_server(&self, server_id: Arc<str>, cx: &mut App);
}

/// A function that registers a language model provider with the registry.
/// This allows extension_host to create the provider (which requires WasmExtension)
/// and pass a registration closure to the language_models crate.
pub type LanguageModelProviderRegistration = Box<dyn FnOnce(&mut App) + Send + Sync + 'static>;

pub trait ExtensionLanguageModelProviderProxy: Send + Sync + 'static {
    /// Register an LLM provider from an extension.
    /// The `register_fn` closure will be called with the App context and should
    /// register the provider with the LanguageModelRegistry.
    fn register_language_model_provider(
        &self,
        provider_id: Arc<str>,
        register_fn: LanguageModelProviderRegistration,
        cx: &mut App,
    );

    /// Unregister an LLM provider when an extension is unloaded.
    fn unregister_language_model_provider(&self, provider_id: Arc<str>, cx: &mut App);
}

impl ExtensionLanguageModelProviderProxy for ExtensionHostProxy {
    fn register_language_model_provider(
        &self,
        provider_id: Arc<str>,
        register_fn: LanguageModelProviderRegistration,
        cx: &mut App,
    ) {
        let Some(proxy) = self.language_model_provider_proxy.read().clone() else {
            return;
        };

        proxy.register_language_model_provider(provider_id, register_fn, cx)
    }

    fn unregister_language_model_provider(&self, provider_id: Arc<str>, cx: &mut App) {
        let Some(proxy) = self.language_model_provider_proxy.read().clone() else {
            return;
        };

        proxy.unregister_language_model_provider(provider_id, cx)
    }
}

impl ExtensionContextServerProxy for ExtensionHostProxy {
    fn register_context_server(
        &self,

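The `LanguageModelProviderRegistration` alias above is a type-erased registration closure: the host crate builds a boxed `FnOnce` that captures whatever it needs (here, the `WasmExtension`), and hands it to another crate to invoke later with the app context, so the two crates never need to share the provider's concrete type. A minimal sketch of the pattern, with a toy `App` standing in for gpui's:

```rust
// Toy stand-in for the gpui App context.
struct App {
    registered: Vec<String>,
}

// Type-erased registration closure, mirroring the alias in the diff.
type ProviderRegistration = Box<dyn FnOnce(&mut App) + Send + Sync + 'static>;

// The "host" side builds the closure, capturing its private state by move.
fn make_registration(provider_id: String) -> ProviderRegistration {
    Box::new(move |app: &mut App| {
        app.registered.push(provider_id);
    })
}

fn main() {
    let mut app = App { registered: Vec::new() };

    // The "consumer" side calls the closure later, with the App context,
    // without knowing anything about the captured state.
    let register = make_registration("my-extension/provider".to_string());
    register(&mut app);

    assert_eq!(app.registered, vec!["my-extension/provider".to_string()]);
}
```

`FnOnce` (rather than `Fn`) is the natural choice here: registration happens exactly once, and the closure may move captured resources into the registry.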
@@ -298,58 +298,6 @@ pub struct LanguageModelProviderManifestEntry {
    /// Path to an SVG icon file relative to the extension root (e.g., "icons/provider.svg").
    #[serde(default)]
    pub icon: Option<String>,
    /// Hardcoded models to always show (as opposed to a model list loaded over the network).
    #[serde(default)]
    pub models: Vec<LanguageModelManifestEntry>,
    /// Authentication configuration.
    #[serde(default)]
    pub auth: Option<LanguageModelAuthConfig>,
}

/// Manifest entry for a language model.
#[derive(Clone, PartialEq, Eq, Debug, Deserialize, Serialize)]
pub struct LanguageModelManifestEntry {
    /// Unique identifier for the model.
    pub id: String,
    /// Display name for the model.
    pub name: String,
    /// Maximum input token count.
    pub max_token_count: u64,
    /// Maximum output tokens (optional).
    pub max_output_tokens: Option<u64>,
    /// Whether the model supports image inputs.
    pub supports_images: bool,
    /// Whether the model supports tool/function calling.
    pub supports_tools: bool,
    /// Whether the model supports extended thinking/reasoning.
    pub supports_thinking: bool,
}

/// Authentication configuration for a language model provider.
#[derive(Clone, PartialEq, Eq, Debug, Deserialize, Serialize)]
pub struct LanguageModelAuthConfig {
    /// Human-readable name for the credential shown in the UI input field (e.g. "API Key", "Access Token").
    pub credential_label: Option<String>,
    /// Environment variable names for the API key (if env var auth supported).
    /// Multiple env vars can be specified; they will be checked in order.
    #[serde(default)]
    pub env_vars: Option<Vec<String>>,
    /// OAuth configuration for web-based authentication flows.
    #[serde(default)]
    pub oauth: Option<OAuthConfig>,
}

/// OAuth configuration for web-based authentication.
#[derive(Clone, PartialEq, Eq, Debug, Deserialize, Serialize)]
pub struct OAuthConfig {
    /// The text to display on the sign-in button (e.g. "Sign in with GitHub").
    pub sign_in_button_label: Option<String>,
    /// The Zed icon path to display on the sign-in button (e.g. "github").
    #[serde(default)]
    pub sign_in_button_icon: Option<String>,
    /// The description text shown next to the sign-in button in edit prediction settings.
    #[serde(default)]
    pub sign_in_description: Option<String>,
}

impl ExtensionManifest {

@@ -29,26 +29,6 @@ pub use wit::{
        GithubRelease, GithubReleaseAsset, GithubReleaseOptions, github_release_by_tag_name,
        latest_github_release,
    },
    zed::extension::llm_provider::{
        CacheConfiguration as LlmCacheConfiguration, CompletionEvent as LlmCompletionEvent,
        CompletionRequest as LlmCompletionRequest, CustomModelConfig as LlmCustomModelConfig,
        DeviceFlowPromptInfo as LlmDeviceFlowPromptInfo, ImageData as LlmImageData,
        MessageContent as LlmMessageContent, MessageRole as LlmMessageRole,
        ModelCapabilities as LlmModelCapabilities, ModelInfo as LlmModelInfo,
        OauthWebAuthConfig as LlmOauthWebAuthConfig, OauthWebAuthResult as LlmOauthWebAuthResult,
        ProviderInfo as LlmProviderInfo, ProviderSettings as LlmProviderSettings,
        RequestMessage as LlmRequestMessage, StopReason as LlmStopReason,
        ThinkingContent as LlmThinkingContent, TokenUsage as LlmTokenUsage,
        ToolChoice as LlmToolChoice, ToolDefinition as LlmToolDefinition,
        ToolInputFormat as LlmToolInputFormat, ToolResult as LlmToolResult,
        ToolResultContent as LlmToolResultContent, ToolUse as LlmToolUse,
        ToolUseJsonParseError as LlmToolUseJsonParseError,
        delete_credential as llm_delete_credential, get_credential as llm_get_credential,
        get_env_var as llm_get_env_var, get_provider_settings as llm_get_provider_settings,
        oauth_open_browser as llm_oauth_open_browser,
        oauth_send_http_request as llm_oauth_send_http_request,
        oauth_start_web_auth as llm_oauth_start_web_auth, store_credential as llm_store_credential,
    },
    zed::extension::nodejs::{
        node_binary_path, npm_install_package, npm_package_installed_version,
        npm_package_latest_version,
@@ -279,93 +259,6 @@ pub trait Extension: Send + Sync {
    ) -> Result<DebugRequest, String> {
        Err("`run_dap_locator` not implemented".to_string())
    }

    /// Returns information about language model providers offered by this extension.
    fn llm_providers(&self) -> Vec<LlmProviderInfo> {
        Vec::new()
    }

    /// Returns the models available for a provider.
    fn llm_provider_models(&self, _provider_id: &str) -> Result<Vec<LlmModelInfo>, String> {
        Ok(Vec::new())
    }

    /// Returns markdown content to display in the provider's settings UI.
    /// This can include setup instructions, links to documentation, etc.
    fn llm_provider_settings_markdown(&self, _provider_id: &str) -> Option<String> {
        None
    }

    /// Check if the provider is authenticated.
    fn llm_provider_is_authenticated(&self, _provider_id: &str) -> bool {
        false
    }

    /// Start an OAuth device flow sign-in.
    /// This is called when the user explicitly clicks "Sign in with GitHub" or similar.
    /// Returns information needed to display the device flow prompt modal to the user.
    fn llm_provider_start_device_flow_sign_in(
        &mut self,
        _provider_id: &str,
    ) -> Result<LlmDeviceFlowPromptInfo, String> {
        Err("`llm_provider_start_device_flow_sign_in` not implemented".to_string())
    }

    /// Poll for device flow sign-in completion.
    /// This is called after `llm_provider_start_device_flow_sign_in` returns the user code.
    /// The extension should poll the OAuth provider until the user authorizes or the flow times out.
    fn llm_provider_poll_device_flow_sign_in(&mut self, _provider_id: &str) -> Result<(), String> {
        Err("`llm_provider_poll_device_flow_sign_in` not implemented".to_string())
    }

    /// Reset credentials for the provider.
    fn llm_provider_reset_credentials(&mut self, _provider_id: &str) -> Result<(), String> {
        Err("`llm_provider_reset_credentials` not implemented".to_string())
    }

    /// Count tokens for a request.
    fn llm_count_tokens(
        &self,
        _provider_id: &str,
        _model_id: &str,
        _request: &LlmCompletionRequest,
    ) -> Result<u64, String> {
        Err("`llm_count_tokens` not implemented".to_string())
    }

    /// Start streaming a completion from the model.
    /// Returns a stream ID that can be used with `llm_stream_completion_next` and `llm_stream_completion_close`.
    fn llm_stream_completion_start(
        &mut self,
        _provider_id: &str,
        _model_id: &str,
        _request: &LlmCompletionRequest,
    ) -> Result<String, String> {
        Err("`llm_stream_completion_start` not implemented".to_string())
    }

    /// Get the next event from a completion stream.
    /// Returns `Ok(None)` when the stream is complete.
    fn llm_stream_completion_next(
        &mut self,
        _stream_id: &str,
    ) -> Result<Option<LlmCompletionEvent>, String> {
        Err("`llm_stream_completion_next` not implemented".to_string())
    }

    /// Close a completion stream and release its resources.
    fn llm_stream_completion_close(&mut self, _stream_id: &str) {
        // Default implementation does nothing
    }

    /// Get cache configuration for a model (if prompt caching is supported).
    fn llm_cache_configuration(
        &self,
        _provider_id: &str,
        _model_id: &str,
    ) -> Option<LlmCacheConfiguration> {
        None
    }
}

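Every `llm_*` method above ships a default body, so extensions that predate the LLM provider API keep compiling while opted-in extensions override only what they need. A self-contained sketch of that pattern; `LlmExtension`, `ProviderInfo`, `MyExtension`, and `PlainExtension` are local stand-ins for illustration, not the real `zed_extension_api` types:

```rust
/// Local stand-in for the provider-info record, for illustration only.
#[derive(Debug, Clone)]
struct ProviderInfo {
    id: String,
    name: String,
}

/// Mirrors the default-method pattern of the `Extension` trait: every
/// method has a conservative fallback until an extension overrides it.
trait LlmExtension {
    fn llm_providers(&self) -> Vec<ProviderInfo> {
        Vec::new()
    }

    fn llm_provider_is_authenticated(&self, _provider_id: &str) -> bool {
        false
    }
}

/// An extension that opts in to the LLM provider API.
struct MyExtension;

impl LlmExtension for MyExtension {
    fn llm_providers(&self) -> Vec<ProviderInfo> {
        vec![ProviderInfo {
            id: "my-extension.my-provider".into(),
            name: "My Provider".into(),
        }]
    }
}

/// An extension that does not opt in; it inherits the empty defaults.
struct PlainExtension;
impl LlmExtension for PlainExtension {}
```

The same shape means the host can call every `llm_*` entry point unconditionally and treat the defaults as "feature not provided".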
/// Registers the provided type as a Zed extension.
@@ -624,67 +517,6 @@ impl wit::Guest for Component {
    ) -> Result<DebugRequest, String> {
        extension().run_dap_locator(locator_name, build_task)
    }

    fn llm_providers() -> Vec<LlmProviderInfo> {
        extension().llm_providers()
    }

    fn llm_provider_models(provider_id: String) -> Result<Vec<LlmModelInfo>, String> {
        extension().llm_provider_models(&provider_id)
    }

    fn llm_provider_settings_markdown(provider_id: String) -> Option<String> {
        extension().llm_provider_settings_markdown(&provider_id)
    }

    fn llm_provider_is_authenticated(provider_id: String) -> bool {
        extension().llm_provider_is_authenticated(&provider_id)
    }

    fn llm_provider_start_device_flow_sign_in(
        provider_id: String,
    ) -> Result<LlmDeviceFlowPromptInfo, String> {
        extension().llm_provider_start_device_flow_sign_in(&provider_id)
    }

    fn llm_provider_poll_device_flow_sign_in(provider_id: String) -> Result<(), String> {
        extension().llm_provider_poll_device_flow_sign_in(&provider_id)
    }

    fn llm_provider_reset_credentials(provider_id: String) -> Result<(), String> {
        extension().llm_provider_reset_credentials(&provider_id)
    }

    fn llm_count_tokens(
        provider_id: String,
        model_id: String,
        request: LlmCompletionRequest,
    ) -> Result<u64, String> {
        extension().llm_count_tokens(&provider_id, &model_id, &request)
    }

    fn llm_stream_completion_start(
        provider_id: String,
        model_id: String,
        request: LlmCompletionRequest,
    ) -> Result<String, String> {
        extension().llm_stream_completion_start(&provider_id, &model_id, &request)
    }

    fn llm_stream_completion_next(stream_id: String) -> Result<Option<LlmCompletionEvent>, String> {
        extension().llm_stream_completion_next(&stream_id)
    }

    fn llm_stream_completion_close(stream_id: String) {
        extension().llm_stream_completion_close(&stream_id)
    }

    fn llm_cache_configuration(
        provider_id: String,
        model_id: String,
    ) -> Option<LlmCacheConfiguration> {
        extension().llm_cache_configuration(&provider_id, &model_id)
    }
}

/// The ID of a language server.

@@ -1,8 +1,7 @@
//! An HTTP client.

pub use crate::wit::zed::extension::http_client::{
    HttpMethod, HttpRequest, HttpResponse, HttpResponseStream, HttpResponseWithStatus,
    RedirectPolicy, fetch, fetch_fallible, fetch_stream,
    HttpMethod, HttpRequest, HttpResponse, HttpResponseStream, RedirectPolicy, fetch, fetch_stream,
};

impl HttpRequest {
@@ -16,11 +15,6 @@ impl HttpRequest {
        fetch(self)
    }

    /// Like [`fetch`], except it doesn't treat any status codes as errors.
    pub fn fetch_fallible(&self) -> Result<HttpResponseWithStatus, String> {
        fetch_fallible(self)
    }

    /// Executes the [`HttpRequest`] with [`fetch_stream`].
    pub fn fetch_stream(&self) -> Result<HttpResponseStream, String> {
        fetch_stream(self)

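`fetch_fallible` matters for OAuth because token endpoints put useful error payloads (e.g. `authorization_pending`) in non-2xx responses, which `fetch` turns into a plain error string. A minimal sketch of the two behaviors with a locally stubbed response type; `ResponseWithStatus`, `strict`, and `fallible` are illustrative names, not the real bindings:

```rust
/// Local stand-in for http-response-with-status.
struct ResponseWithStatus {
    status: u16,
    body: Vec<u8>,
}

/// `fetch`-style semantics: 4xx/5xx become an error, and the body is lost.
fn strict(resp: ResponseWithStatus) -> Result<Vec<u8>, String> {
    if resp.status >= 400 {
        Err(format!("HTTP {}", resp.status))
    } else {
        Ok(resp.body)
    }
}

/// `fetch-fallible`-style semantics: every status comes back with its body,
/// so an OAuth error payload such as `{"error":"authorization_pending"}`
/// can still be inspected by the caller.
fn fallible(resp: ResponseWithStatus) -> Result<ResponseWithStatus, String> {
    Ok(resp)
}
```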
@@ -8,7 +8,6 @@ world extension {
    import platform;
    import process;
    import nodejs;
    import llm-provider;

    use common.{env-vars, range};
    use context-server.{context-server-configuration};
@@ -16,11 +15,6 @@ world extension {
    use lsp.{completion, symbol};
    use process.{command};
    use slash-command.{slash-command, slash-command-argument-completion, slash-command-output};
    use llm-provider.{
        provider-info, model-info, completion-request,
        cache-configuration, completion-event, token-usage,
        device-flow-prompt-info
    };

    /// Initializes the extension.
    export init-extension: func();
@@ -170,73 +164,4 @@ world extension {
    export dap-config-to-scenario: func(config: debug-config) -> result<debug-scenario, string>;
    export dap-locator-create-scenario: func(locator-name: string, build-config-template: build-task-template, resolved-label: string, debug-adapter-name: string) -> option<debug-scenario>;
    export run-dap-locator: func(locator-name: string, config: resolved-task) -> result<debug-request, string>;

    /// Returns information about language model providers offered by this extension.
    export llm-providers: func() -> list<provider-info>;

    /// Returns the models available for a provider.
    export llm-provider-models: func(provider-id: string) -> result<list<model-info>, string>;

    /// Returns markdown content to display in the provider's settings UI.
    /// This can include setup instructions, links to documentation, etc.
    export llm-provider-settings-markdown: func(provider-id: string) -> option<string>;

    /// Check if the provider is authenticated.
    export llm-provider-is-authenticated: func(provider-id: string) -> bool;

    /// Start an OAuth device flow sign-in.
    /// This is called when the user explicitly clicks "Sign in with GitHub" or similar.
    ///
    /// The device flow works as follows:
    /// 1. The extension requests a device code from the OAuth provider
    /// 2. The extension returns prompt info including the user code and verification URL
    /// 3. The host displays a modal with the prompt info
    /// 4. The host calls llm-provider-poll-device-flow-sign-in
    /// 5. The extension polls for the access token while the user authorizes in the browser
    /// 6. Once authorized, the extension stores the credential and returns success
    ///
    /// Returns the information needed to display the device flow prompt modal.
    export llm-provider-start-device-flow-sign-in: func(provider-id: string) -> result<device-flow-prompt-info, string>;

    /// Poll for device flow sign-in completion.
    /// This is called after llm-provider-start-device-flow-sign-in returns the user code.
    /// The extension should poll the OAuth provider until the user authorizes or the flow times out.
    /// Returns success on authentication, or an error message on failure.
    export llm-provider-poll-device-flow-sign-in: func(provider-id: string) -> result<_, string>;

    /// Reset credentials for the provider.
    export llm-provider-reset-credentials: func(provider-id: string) -> result<_, string>;

    /// Count tokens for a request.
    export llm-count-tokens: func(
        provider-id: string,
        model-id: string,
        request: completion-request
    ) -> result<u64, string>;

    /// Start streaming a completion from the model.
    /// Returns a stream ID that can be used with llm-stream-completion-next and llm-stream-completion-close.
    export llm-stream-completion-start: func(
        provider-id: string,
        model-id: string,
        request: completion-request
    ) -> result<string, string>;

    /// Get the next event from a completion stream.
    /// Returns none when the stream is complete.
    export llm-stream-completion-next: func(
        stream-id: string
    ) -> result<option<completion-event>, string>;

    /// Close a completion stream and release its resources.
    export llm-stream-completion-close: func(
        stream-id: string
    );

    /// Get cache configuration for a model (if prompt caching is supported).
    export llm-cache-configuration: func(
        provider-id: string,
        model-id: string
    ) -> option<cache-configuration>;
}
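The polling half of the device flow (step 5 above) boils down to retrying the token endpoint until the user authorizes or a deadline passes. A hedged, self-contained sketch; `Poll` and `poll_device_flow` are hypothetical names, and the stubbed `poll_once` closure stands in for the HTTP call a real extension would make via `oauth-send-http-request`:

```rust
/// Outcome of one poll of the token endpoint, stubbed for illustration.
/// `Pending` maps to OAuth's `authorization_pending` response; a real
/// extension would issue an HTTP request here and parse the JSON body.
enum Poll {
    Pending,
    Token(String),
}

/// Poll until the user authorizes or `max_polls` attempts are exhausted,
/// which models the flow timing out.
fn poll_device_flow(
    mut poll_once: impl FnMut() -> Poll,
    max_polls: u32,
) -> Result<String, String> {
    for _ in 0..max_polls {
        match poll_once() {
            Poll::Token(token) => return Ok(token),
            Poll::Pending => {
                // A real implementation sleeps for the interval the
                // provider returned alongside the device code before
                // trying again.
            }
        }
    }
    Err("device flow timed out".to_string())
}
```

On success, the extension would store the token with `store-credential` and return, letting the host dismiss the prompt modal.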
@@ -51,26 +51,9 @@ interface http-client {
        body: list<u8>,
    }

    /// An HTTP response that includes the status code.
    ///
    /// Used by `fetch-fallible`, which returns responses for all status codes
    /// rather than treating some status codes as errors.
    record http-response-with-status {
        /// The HTTP status code.
        status: u16,
        /// The response headers.
        headers: list<tuple<string, string>>,
        /// The response body.
        body: list<u8>,
    }

    /// Performs an HTTP request and returns the response.
    /// Returns an error if the response status is 4xx or 5xx.
    fetch: func(req: http-request) -> result<http-response, string>;

    /// Performs an HTTP request and returns the response regardless of its status code.
    fetch-fallible: func(req: http-request) -> result<http-response-with-status, string>;

    /// An HTTP response stream.
    resource http-response-stream {
        /// Retrieves the next chunk of data from the response stream.

@@ -1,362 +0,0 @@
interface llm-provider {
    use http-client.{http-request, http-response-with-status};

    /// Information about a language model provider.
    record provider-info {
        /// Unique identifier for the provider (e.g. "my-extension.my-provider").
        id: string,
        /// Display name for the provider.
        name: string,
        /// Path to an SVG icon file relative to the extension root (e.g. "icons/provider.svg").
        icon: option<string>,
    }

    /// Capabilities of a language model.
    record model-capabilities {
        /// Whether the model supports image inputs.
        supports-images: bool,
        /// Whether the model supports tool/function calling.
        supports-tools: bool,
        /// Whether the model supports the "auto" tool choice.
        supports-tool-choice-auto: bool,
        /// Whether the model supports the "any" tool choice.
        supports-tool-choice-any: bool,
        /// Whether the model supports the "none" tool choice.
        supports-tool-choice-none: bool,
        /// Whether the model supports extended thinking/reasoning.
        supports-thinking: bool,
        /// The format for tool input schemas.
        tool-input-format: tool-input-format,
    }

    /// Format for tool input schemas.
    enum tool-input-format {
        /// Standard JSON Schema format.
        json-schema,
        /// A subset of JSON Schema supported by Google AI.
        /// See https://ai.google.dev/api/caching#Schema
        json-schema-subset,
        /// Simplified schema format for certain providers.
        simplified,
    }

|
||||
record model-info {
|
||||
/// Unique identifier for the model.
|
||||
id: string,
|
||||
/// Display name for the model.
|
||||
name: string,
|
||||
/// Maximum input token count.
|
||||
max-token-count: u64,
|
||||
/// Maximum output tokens (optional).
|
||||
max-output-tokens: option<u64>,
|
||||
/// Model capabilities.
|
||||
capabilities: model-capabilities,
|
||||
/// Whether this is the default model for the provider.
|
||||
is-default: bool,
|
||||
/// Whether this is the default fast model.
|
||||
is-default-fast: bool,
|
||||
}
|
||||
|
||||
/// The role of a message participant.
|
||||
enum message-role {
|
||||
/// User message.
|
||||
user,
|
||||
/// Assistant message.
|
||||
assistant,
|
||||
/// System message.
|
||||
system,
|
||||
}
|
||||
|
||||
/// A message in a completion request.
|
||||
record request-message {
|
||||
/// The role of the message sender.
|
||||
role: message-role,
|
||||
/// The content of the message.
|
||||
content: list<message-content>,
|
||||
/// Whether to cache this message for prompt caching.
|
||||
cache: bool,
|
||||
}
|
||||
|
||||
/// Content within a message.
|
||||
variant message-content {
|
||||
/// Plain text content.
|
||||
text(string),
|
||||
/// Image content.
|
||||
image(image-data),
|
||||
/// A tool use request from the assistant.
|
||||
tool-use(tool-use),
|
||||
/// A tool result from the user.
|
||||
tool-result(tool-result),
|
||||
/// Thinking/reasoning content.
|
||||
thinking(thinking-content),
|
||||
/// Redacted/encrypted thinking content.
|
||||
redacted-thinking(string),
|
||||
}
|
||||
|
||||
/// Image data for vision models.
|
||||
record image-data {
|
||||
/// Base64-encoded image data.
|
||||
source: string,
|
||||
/// Image width in pixels (optional).
|
||||
width: option<u32>,
|
||||
/// Image height in pixels (optional).
|
||||
height: option<u32>,
|
||||
}
|
||||
|
||||
/// A tool use request from the model.
|
||||
record tool-use {
|
||||
/// Unique identifier for this tool use.
|
||||
id: string,
|
||||
/// The name of the tool being used.
|
||||
name: string,
|
||||
/// JSON string of the tool input arguments.
|
||||
input: string,
|
||||
/// Whether the input JSON is complete (false while streaming, true when done).
|
||||
is-input-complete: bool,
|
||||
/// Thought signature for providers that support it (e.g., Anthropic).
|
||||
thought-signature: option<string>,
|
||||
}
|
||||
|
||||
/// A tool result to send back to the model.
|
||||
record tool-result {
|
||||
/// The ID of the tool use this is a result for.
|
||||
tool-use-id: string,
|
||||
/// The name of the tool.
|
||||
tool-name: string,
|
||||
/// Whether this result represents an error.
|
||||
is-error: bool,
|
||||
/// The content of the result.
|
||||
content: tool-result-content,
|
||||
}
|
||||
|
||||
/// Content of a tool result.
|
||||
variant tool-result-content {
|
||||
/// Text result.
|
||||
text(string),
|
||||
/// Image result.
|
||||
image(image-data),
|
||||
}
|
||||
|
||||
/// Thinking/reasoning content from models that support extended thinking.
|
||||
record thinking-content {
|
||||
/// The thinking text.
|
||||
text: string,
|
||||
/// Signature for the thinking block (provider-specific).
|
||||
signature: option<string>,
|
||||
}
|
||||
|
||||
/// A tool definition for function calling.
|
||||
record tool-definition {
|
||||
/// The name of the tool.
|
||||
name: string,
|
||||
/// Description of what the tool does.
|
||||
description: string,
|
||||
/// JSON Schema for input parameters.
|
||||
input-schema: string,
|
||||
}
|
||||
|
||||
/// Tool choice preference for the model.
|
||||
enum tool-choice {
|
||||
/// Let the model decide whether to use tools.
|
||||
auto,
|
||||
/// Force the model to use at least one tool.
|
||||
any,
|
||||
/// Prevent the model from using tools.
|
||||
none,
|
||||
}
|
||||
|
||||
/// A completion request to send to the model.
|
||||
record completion-request {
|
||||
/// The messages in the conversation.
|
||||
messages: list<request-message>,
|
||||
/// Available tools for the model to use.
|
||||
tools: list<tool-definition>,
|
||||
/// Tool choice preference.
|
||||
tool-choice: option<tool-choice>,
|
||||
/// Stop sequences to end generation.
|
||||
stop-sequences: list<string>,
|
||||
/// Temperature for sampling (0.0-1.0).
|
||||
temperature: option<f32>,
|
||||
/// Whether thinking/reasoning is allowed.
|
||||
thinking-allowed: bool,
|
||||
/// Maximum tokens to generate.
|
||||
max-tokens: option<u64>,
|
||||
}
|
||||
|
||||
/// Events emitted during completion streaming.
|
||||
variant completion-event {
|
||||
/// Completion has started.
|
||||
started,
|
||||
/// Text content chunk.
|
||||
text(string),
|
||||
/// Thinking/reasoning content chunk.
|
||||
thinking(thinking-content),
|
||||
/// Redacted thinking (encrypted) chunk.
|
||||
redacted-thinking(string),
|
||||
/// Tool use request from the model.
|
||||
tool-use(tool-use),
|
||||
/// JSON parse error when parsing tool input.
|
||||
tool-use-json-parse-error(tool-use-json-parse-error),
|
||||
/// Completion stopped.
|
||||
stop(stop-reason),
|
||||
/// Token usage update.
|
||||
usage(token-usage),
|
||||
/// Reasoning details (provider-specific JSON).
|
||||
reasoning-details(string),
|
||||
}
|
||||
|
||||
/// Error information when tool use JSON parsing fails.
|
||||
record tool-use-json-parse-error {
|
||||
/// The tool use ID.
|
||||
id: string,
|
||||
/// The tool name.
|
||||
tool-name: string,
|
||||
/// The raw input that failed to parse.
|
||||
raw-input: string,
|
||||
/// The parse error message.
|
||||
error: string,
|
||||
}
|
||||
|
||||
/// Reason the completion stopped.
|
||||
enum stop-reason {
|
||||
/// The model finished generating.
|
||||
end-turn,
|
||||
/// Maximum tokens reached.
|
||||
max-tokens,
|
||||
/// The model wants to use a tool.
|
||||
tool-use,
|
||||
/// The model refused to respond.
|
||||
refusal,
|
||||
}
|
||||
|
||||
/// Token usage statistics.
|
||||
record token-usage {
|
||||
/// Number of input tokens used.
|
||||
input-tokens: u64,
|
||||
/// Number of output tokens generated.
|
||||
output-tokens: u64,
|
||||
/// Tokens used for cache creation (if supported).
|
||||
cache-creation-input-tokens: option<u64>,
|
||||
/// Tokens read from cache (if supported).
|
||||
cache-read-input-tokens: option<u64>,
|
||||
}
|
||||
|
||||
/// Cache configuration for prompt caching.
|
||||
record cache-configuration {
|
||||
/// Maximum number of cache anchors.
|
||||
max-cache-anchors: u32,
|
||||
/// Whether caching should be applied to tool definitions.
|
||||
should-cache-tool-definitions: bool,
|
||||
/// Minimum token count for a message to be cached.
|
||||
min-total-token-count: u64,
|
||||
}
|
||||
|
||||
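Since `token-usage` splits cache traffic out of the plain input count, a consumer that wants a single total has to fold the optional cache fields back in. A small sketch with the record mirrored as a local Rust struct; `total` is an illustrative helper, not part of the API:

```rust
/// Local mirror of the token-usage record, for illustration.
struct TokenUsage {
    input_tokens: u64,
    output_tokens: u64,
    cache_creation_input_tokens: Option<u64>,
    cache_read_input_tokens: Option<u64>,
}

impl TokenUsage {
    /// Total tokens processed: regular input and output, plus any cache
    /// creation and cache read traffic when the provider reports it.
    fn total(&self) -> u64 {
        self.input_tokens
            + self.output_tokens
            + self.cache_creation_input_tokens.unwrap_or(0)
            + self.cache_read_input_tokens.unwrap_or(0)
    }
}
```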
    /// Configuration for starting an OAuth web authentication flow.
    record oauth-web-auth-config {
        /// The URL to open in the user's browser to start authentication.
        /// This should include client_id, redirect_uri, scope, state, etc.
        /// Use `{port}` as a placeholder in the URL - it will be replaced with
        /// the actual localhost port before opening the browser.
        /// Example: "https://example.com/oauth?redirect_uri=http://127.0.0.1:{port}/callback"
        auth-url: string,
        /// The path to listen on for the OAuth callback (e.g., "/callback").
        /// A localhost server will be started to receive the redirect.
        callback-path: string,
        /// Timeout in seconds to wait for the callback (default: 300 = 5 minutes).
        timeout-secs: option<u32>,
    }

    /// Result of an OAuth web authentication flow.
    record oauth-web-auth-result {
        /// The full callback URL that was received, including query parameters.
        /// The extension is responsible for parsing the code, state, etc.
        callback-url: string,
        /// The port that was used for the localhost callback server.
        port: u32,
    }

    /// Get a stored credential for this provider.
    get-credential: func(provider-id: string) -> option<string>;

    /// Store a credential for this provider.
    store-credential: func(provider-id: string, value: string) -> result<_, string>;

    /// Delete a stored credential for this provider.
    delete-credential: func(provider-id: string) -> result<_, string>;

    /// Read an environment variable.
    get-env-var: func(name: string) -> option<string>;

    /// Start an OAuth web authentication flow.
    ///
    /// This will:
    /// 1. Start a localhost server to receive the OAuth callback
    /// 2. Open the auth URL in the user's default browser
    /// 3. Wait for the callback (up to the timeout)
    /// 4. Return the callback URL with query parameters
    ///
    /// The extension is responsible for:
    /// - Constructing the auth URL with client_id, redirect_uri, scope, state, etc.
    /// - Parsing the callback URL to extract the authorization code
    /// - Exchanging the code for tokens using fetch-fallible from http-client
    oauth-start-web-auth: func(config: oauth-web-auth-config) -> result<oauth-web-auth-result, string>;

    /// Make an HTTP request for OAuth token exchange.
    ///
    /// This is a convenience wrapper around http-client's fetch-fallible for OAuth flows.
    /// Unlike the standard fetch, this does not treat non-2xx responses as errors,
    /// allowing proper handling of OAuth error responses.
    oauth-send-http-request: func(request: http-request) -> result<http-response-with-status, string>;

    /// Open a URL in the user's default browser.
    ///
    /// Useful for OAuth flows that need to open a browser but handle the
    /// callback differently (e.g., polling-based flows).
    oauth-open-browser: func(url: string) -> result<_, string>;

    /// Provider settings from user configuration.
    /// Extensions can use this to allow custom API URLs, custom models, etc.
    record provider-settings {
        /// Custom API URL override (if configured by the user).
        api-url: option<string>,
        /// Custom models configured by the user.
        available-models: list<custom-model-config>,
    }

    /// Configuration for a custom model defined by the user.
    record custom-model-config {
        /// The model's API identifier.
        name: string,
        /// Display name for the UI.
        display-name: option<string>,
        /// Maximum input token count.
        max-tokens: u64,
        /// Maximum output tokens (optional).
        max-output-tokens: option<u64>,
        /// Thinking budget for models that support extended thinking (none = auto).
        thinking-budget: option<u32>,
    }

    /// Get provider-specific settings configured by the user.
    /// Returns settings like custom API URLs and custom model configurations.
    get-provider-settings: func(provider-id: string) -> option<provider-settings>;

    /// Information needed to display the device flow prompt modal to the user.
    record device-flow-prompt-info {
        /// The user code to display (e.g., "ABC-123").
        user-code: string,
        /// The URL the user needs to visit to authorize (for the "Connect" button).
        verification-url: string,
        /// The headline text for the modal (e.g., "Use GitHub Copilot in Zed.").
        headline: string,
        /// A description to show below the headline (e.g., "Using Copilot requires an active subscription on GitHub.").
        description: string,
        /// Label for the connect button (e.g., "Connect to GitHub").
        connect-button-label: string,
        /// Success headline shown when authorization completes.
        success-headline: string,
        /// Success message shown when authorization completes.
        success-message: string,
    }
}
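Per the docs above, the host substitutes `{port}` into `auth-url` before opening the browser, and the extension parses `code` and `state` out of the raw callback URL itself. A std-only sketch of both steps; `fill_port` and `query_param` are hypothetical helper names, and the parser skips URL-decoding for brevity:

```rust
/// Substitute the `{port}` placeholder the way the host does before
/// opening the browser.
fn fill_port(auth_url: &str, port: u32) -> String {
    auth_url.replace("{port}", &port.to_string())
}

/// Extract a query parameter (e.g. `code` or `state`) from the callback URL.
/// Minimal parsing for illustration: split off the query string, then scan
/// the `key=value` pairs.
fn query_param(callback_url: &str, key: &str) -> Option<String> {
    let query = callback_url.split_once('?')?.1;
    query.split('&').find_map(|pair| {
        let (k, v) = pair.split_once('=')?;
        (k == key).then(|| v.to_string())
    })
}
```

With the `code` in hand, the extension would exchange it for tokens via `oauth-send-http-request` and persist the result with `store-credential`.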
@@ -255,21 +255,6 @@ async fn copy_extension_resources(
        }
    }

    for (_, provider_entry) in &manifest.language_model_providers {
        if let Some(icon_path) = &provider_entry.icon {
            let source_icon = extension_path.join(icon_path);
            let dest_icon = output_dir.join(icon_path);

            // Create the parent directory if needed.
            if let Some(parent) = dest_icon.parent() {
                fs::create_dir_all(parent)?;
            }

            fs::copy(&source_icon, &dest_icon)
                .with_context(|| format!("failed to copy LLM provider icon '{}'", icon_path))?;
        }
    }

    if !manifest.languages.is_empty() {
        let output_languages_dir = output_dir.join("languages");
        fs::create_dir_all(&output_languages_dir)?;

@@ -22,9 +22,7 @@ async-tar.workspace = true
async-trait.workspace = true
client.workspace = true
collections.workspace = true
credentials_provider.workspace = true
dap.workspace = true
dirs.workspace = true
extension.workspace = true
fs.workspace = true
futures.workspace = true
@@ -32,11 +30,8 @@ gpui.workspace = true
gpui_tokio.workspace = true
http_client.workspace = true
language.workspace = true
language_model.workspace = true
log.workspace = true
markdown.workspace = true
lsp.workspace = true
menu.workspace = true
moka.workspace = true
node_runtime.workspace = true
paths.workspace = true
@@ -48,16 +43,11 @@ serde.workspace = true
serde_json.workspace = true
serde_json_lenient.workspace = true
settings.workspace = true
smol.workspace = true
task.workspace = true
telemetry.workspace = true
tempfile.workspace = true
theme.workspace = true
toml.workspace = true
ui.workspace = true
ui_input.workspace = true
url.workspace = true
workspace.workspace = true
util.workspace = true
wasmparser.workspace = true
wasmtime-wasi.workspace = true

@@ -1,124 +0,0 @@
use credentials_provider::CredentialsProvider;
use gpui::App;

const ANTHROPIC_EXTENSION_ID: &str = "anthropic";
const ANTHROPIC_PROVIDER_ID: &str = "anthropic";
const ANTHROPIC_DEFAULT_API_URL: &str = "https://api.anthropic.com";

/// Migrates Anthropic API credentials from the old built-in provider location
/// to the new extension-based location.
///
/// This should only be called during auto-install of the extension.
pub fn migrate_anthropic_credentials_if_needed(extension_id: &str, cx: &mut App) {
    if extension_id != ANTHROPIC_EXTENSION_ID {
        return;
    }

    let extension_credential_key = format!(
        "extension-llm-{}:{}",
        ANTHROPIC_EXTENSION_ID, ANTHROPIC_PROVIDER_ID
    );

    let credentials_provider = <dyn CredentialsProvider>::global(cx);

    cx.spawn(async move |cx| {
        // Read the credential from the old location.
        let old_credential = credentials_provider
            .read_credentials(ANTHROPIC_DEFAULT_API_URL, &cx)
            .await
            .ok()
            .flatten();

        let api_key = match old_credential {
            Some((_, key_bytes)) => match String::from_utf8(key_bytes) {
                Ok(key) if !key.is_empty() => key,
                Ok(_) => {
                    log::debug!("Existing Anthropic API key is empty, nothing to migrate");
                    return;
                }
                Err(_) => {
                    log::error!("Failed to decode Anthropic API key as UTF-8");
                    return;
                }
            },
            None => {
                log::debug!("No existing Anthropic API key found to migrate");
                return;
            }
        };

        log::info!("Migrating existing Anthropic API key to Anthropic extension");

        match credentials_provider
            .write_credentials(&extension_credential_key, "Bearer", api_key.as_bytes(), &cx)
            .await
        {
            Ok(()) => {
                log::info!("Successfully migrated Anthropic API key to extension");
            }
            Err(err) => {
                log::error!("Failed to migrate Anthropic API key: {}", err);
            }
        }
    })
    .detach();
}

#[cfg(test)]
mod tests {
    use super::*;
    use gpui::TestAppContext;

    #[gpui::test]
    async fn test_migrates_credentials_from_old_location(cx: &mut TestAppContext) {
        let api_key = "sk-ant-test-key-12345";

        cx.write_credentials(ANTHROPIC_DEFAULT_API_URL, "Bearer", api_key.as_bytes());

        cx.update(|cx| {
            migrate_anthropic_credentials_if_needed(ANTHROPIC_EXTENSION_ID, cx);
        });

        cx.run_until_parked();

        let migrated = cx.read_credentials("extension-llm-anthropic:anthropic");
        assert!(migrated.is_some(), "Credentials should have been migrated");
        let (username, password) = migrated.unwrap();
        assert_eq!(username, "Bearer");
        assert_eq!(String::from_utf8(password).unwrap(), api_key);
    }

    #[gpui::test]
    async fn test_no_migration_if_no_old_credentials(cx: &mut TestAppContext) {
        cx.update(|cx| {
            migrate_anthropic_credentials_if_needed(ANTHROPIC_EXTENSION_ID, cx);
        });

        cx.run_until_parked();

        let credentials = cx.read_credentials("extension-llm-anthropic:anthropic");
        assert!(
            credentials.is_none(),
            "Should not create credentials if none existed"
        );
    }

    #[gpui::test]
    async fn test_skips_migration_for_other_extensions(cx: &mut TestAppContext) {
        let api_key = "sk-ant-test-key";

        cx.write_credentials(ANTHROPIC_DEFAULT_API_URL, "Bearer", api_key.as_bytes());

        cx.update(|cx| {
            migrate_anthropic_credentials_if_needed("some-other-extension", cx);
        });

        cx.run_until_parked();

        let credentials = cx.read_credentials("extension-llm-anthropic:anthropic");
        assert!(
            credentials.is_none(),
||||
"Should not migrate for other extensions"
|
||||
);
|
||||
}
|
||||
}
|
||||
@@ -1,216 +0,0 @@
use credentials_provider::CredentialsProvider;
use gpui::App;
use std::path::PathBuf;

const COPILOT_CHAT_EXTENSION_ID: &str = "copilot-chat";
const COPILOT_CHAT_PROVIDER_ID: &str = "copilot-chat";

/// Migrates Copilot OAuth credentials from the GitHub Copilot config files
/// to the new extension-based credential location.
///
/// This should only be called during auto-install of the extension.
pub fn migrate_copilot_credentials_if_needed(extension_id: &str, cx: &mut App) {
    if extension_id != COPILOT_CHAT_EXTENSION_ID {
        return;
    }

    let credential_key = format!(
        "extension-llm-{}:{}",
        COPILOT_CHAT_EXTENSION_ID, COPILOT_CHAT_PROVIDER_ID
    );

    let credentials_provider = <dyn CredentialsProvider>::global(cx);

    cx.spawn(async move |_cx| {
        // Read from copilot config files
        let oauth_token = match read_copilot_oauth_token().await {
            Some(token) if !token.is_empty() => token,
            _ => {
                log::debug!("No existing Copilot OAuth token found to migrate");
                return;
            }
        };

        log::info!("Migrating existing Copilot OAuth token to Copilot Chat extension");

        match credentials_provider
            .write_credentials(&credential_key, "api_key", oauth_token.as_bytes(), &_cx)
            .await
        {
            Ok(()) => {
                log::info!("Successfully migrated Copilot OAuth token to Copilot Chat extension");
            }
            Err(err) => {
                log::error!("Failed to migrate Copilot OAuth token: {}", err);
            }
        }
    })
    .detach();
}

async fn read_copilot_oauth_token() -> Option<String> {
    let config_paths = copilot_config_paths();

    for path in config_paths {
        if let Some(token) = read_oauth_token_from_file(&path).await {
            return Some(token);
        }
    }

    None
}

fn copilot_config_paths() -> Vec<PathBuf> {
    let config_dir = if cfg!(target_os = "windows") {
        dirs::data_local_dir()
    } else {
        std::env::var("XDG_CONFIG_HOME")
            .map(PathBuf::from)
            .ok()
            .or_else(|| dirs::home_dir().map(|h| h.join(".config")))
    };

    let Some(config_dir) = config_dir else {
        return Vec::new();
    };

    let copilot_dir = config_dir.join("github-copilot");

    vec![
        copilot_dir.join("hosts.json"),
        copilot_dir.join("apps.json"),
    ]
}

async fn read_oauth_token_from_file(path: &PathBuf) -> Option<String> {
    let contents = match smol::fs::read_to_string(path).await {
        Ok(contents) => contents,
        Err(_) => return None,
    };

    extract_oauth_token(&contents, "github.com")
}

fn extract_oauth_token(contents: &str, domain: &str) -> Option<String> {
    let value: serde_json::Value = serde_json::from_str(contents).ok()?;
    let obj = value.as_object()?;

    for (key, value) in obj.iter() {
        if key.starts_with(domain) {
            if let Some(token) = value.get("oauth_token").and_then(|v| v.as_str()) {
                return Some(token.to_string());
            }
        }
    }

    None
}

#[cfg(test)]
mod tests {
    use super::*;
    use gpui::TestAppContext;

    #[test]
    fn test_extract_oauth_token_from_hosts_json() {
        let contents = r#"{
            "github.com": {
                "oauth_token": "ghu_test_token_12345"
            }
        }"#;

        let token = extract_oauth_token(contents, "github.com");
        assert_eq!(token, Some("ghu_test_token_12345".to_string()));
    }

    #[test]
    fn test_extract_oauth_token_with_user_suffix() {
        let contents = r#"{
            "github.com:user": {
                "oauth_token": "ghu_another_token"
            }
        }"#;

        let token = extract_oauth_token(contents, "github.com");
        assert_eq!(token, Some("ghu_another_token".to_string()));
    }

    #[test]
    fn test_extract_oauth_token_wrong_domain() {
        let contents = r#"{
            "gitlab.com": {
                "oauth_token": "some_token"
            }
        }"#;

        let token = extract_oauth_token(contents, "github.com");
        assert_eq!(token, None);
    }

    #[test]
    fn test_extract_oauth_token_invalid_json() {
        let contents = "not valid json";
        let token = extract_oauth_token(contents, "github.com");
        assert_eq!(token, None);
    }

    #[test]
    fn test_extract_oauth_token_missing_oauth_token_field() {
        let contents = r#"{
            "github.com": {
                "user": "testuser"
            }
        }"#;

        let token = extract_oauth_token(contents, "github.com");
        assert_eq!(token, None);
    }

    #[test]
    fn test_extract_oauth_token_multiple_entries_picks_first_match() {
        let contents = r#"{
            "gitlab.com": {
                "oauth_token": "gitlab_token"
            },
            "github.com": {
                "oauth_token": "github_token"
            }
        }"#;

        let token = extract_oauth_token(contents, "github.com");
        assert_eq!(token, Some("github_token".to_string()));
    }

    #[gpui::test]
    async fn test_skips_migration_for_other_extensions(cx: &mut TestAppContext) {
        cx.update(|cx| {
            migrate_copilot_credentials_if_needed("some-other-extension", cx);
        });

        cx.run_until_parked();

        let credentials = cx.read_credentials("extension-llm-copilot-chat:copilot-chat");
        assert!(
            credentials.is_none(),
            "Should not create credentials for other extensions"
        );
    }

    // Note: Unlike the other migrations, copilot migration reads from the filesystem
    // (copilot config files), not from the credentials provider. In tests, these files
    // don't exist, so no migration occurs.
    #[gpui::test]
    async fn test_no_credentials_when_no_copilot_config_exists(cx: &mut TestAppContext) {
        cx.update(|cx| {
            migrate_copilot_credentials_if_needed(COPILOT_CHAT_EXTENSION_ID, cx);
        });

        cx.run_until_parked();

        let credentials = cx.read_credentials("extension-llm-copilot-chat:copilot-chat");
        assert!(
            credentials.is_none(),
            "No credentials should be written when copilot config doesn't exist"
        );
    }
}
@@ -1,11 +1,6 @@
mod anthropic_migration;
mod capability_granter;
mod copilot_migration;
pub mod extension_settings;
mod google_ai_migration;
pub mod headless_host;
mod open_router_migration;
mod openai_migration;
pub mod wasm_host;

#[cfg(test)]
@@ -17,14 +12,13 @@ use async_tar::Archive;
use client::ExtensionProvides;
use client::{Client, ExtensionMetadata, GetExtensionsResponse, proto, telemetry::Telemetry};
use collections::{BTreeMap, BTreeSet, HashSet, btree_map};

pub use extension::ExtensionManifest;
use extension::extension_builder::{CompileExtensionOptions, ExtensionBuilder};
use extension::{
    ExtensionContextServerProxy, ExtensionDebugAdapterProviderProxy, ExtensionEvents,
    ExtensionGrammarProxy, ExtensionHostProxy, ExtensionLanguageModelProviderProxy,
    ExtensionLanguageProxy, ExtensionLanguageServerProxy, ExtensionSlashCommandProxy,
    ExtensionSnippetProxy, ExtensionThemeProxy,
    ExtensionGrammarProxy, ExtensionHostProxy, ExtensionLanguageProxy,
    ExtensionLanguageServerProxy, ExtensionSlashCommandProxy, ExtensionSnippetProxy,
    ExtensionThemeProxy,
};
use fs::{Fs, RemoveOptions};
use futures::future::join_all;
@@ -38,8 +32,8 @@ use futures::{
    select_biased,
};
use gpui::{
    App, AppContext as _, AsyncApp, Context, Entity, EventEmitter, Global, SharedString, Task,
    WeakEntity, actions,
    App, AppContext as _, AsyncApp, Context, Entity, EventEmitter, Global, Task, WeakEntity,
    actions,
};
use http_client::{AsyncBody, HttpClient, HttpClientWithUrl};
use language::{
@@ -59,28 +53,15 @@ use std::{
    cmp::Ordering,
    path::{self, Path, PathBuf},
    sync::Arc,
    time::Duration,
    time::{Duration, Instant},
};
use url::Url;
use util::{ResultExt, paths::RemotePathBuf};
use wasm_host::llm_provider::ExtensionLanguageModelProvider;
use wasm_host::{
    WasmExtension, WasmHost,
    wit::{
        LlmCacheConfiguration, LlmModelInfo, LlmProviderInfo, is_supported_wasm_api_version,
        wasm_api_version_range,
    },
    wit::{is_supported_wasm_api_version, wasm_api_version_range},
};

struct LlmProviderWithModels {
    provider_info: LlmProviderInfo,
    models: Vec<LlmModelInfo>,
    cache_configs: collections::HashMap<String, LlmCacheConfiguration>,
    is_authenticated: bool,
    icon_path: Option<SharedString>,
    auth_config: Option<extension::LanguageModelAuthConfig>,
}

pub use extension::{
    ExtensionLibraryKind, GrammarManifestEntry, OldExtensionManifest, SchemaVersion,
};
@@ -89,82 +70,6 @@ pub use extension_settings::ExtensionSettings;
pub const RELOAD_DEBOUNCE_DURATION: Duration = Duration::from_millis(200);
const FS_WATCH_LATENCY: Duration = Duration::from_millis(100);

/// Extension IDs that are being migrated from hardcoded LLM providers.
/// For backwards compatibility, if the user has the corresponding env var set,
/// we automatically enable env var reading for these extensions on first install.
pub const LEGACY_LLM_EXTENSION_IDS: &[&str] = &[
    "anthropic",
    "copilot-chat",
    "google-ai",
    "openrouter",
    "openai",
];

/// Migrates legacy LLM provider extensions by auto-enabling env var reading
/// if the env var is currently present in the environment.
///
/// This is idempotent: if the env var is already in `allowed_env_vars`,
/// we skip. This means if a user explicitly removes it, it will be re-added on
/// next launch if the env var is still set - but that's predictable behavior.
fn migrate_legacy_llm_provider_env_var(manifest: &ExtensionManifest, cx: &mut App) {
    // Only apply migration to known legacy LLM extensions
    if !LEGACY_LLM_EXTENSION_IDS.contains(&manifest.id.as_ref()) {
        return;
    }

    // Check each provider in the manifest
    for (provider_id, provider_entry) in &manifest.language_model_providers {
        let Some(auth_config) = &provider_entry.auth else {
            continue;
        };
        let Some(env_vars) = &auth_config.env_vars else {
            continue;
        };

        let full_provider_id = format!("{}:{}", manifest.id, provider_id);

        // For each env var, check if it's set and enable it if so
        for env_var_name in env_vars {
            let env_var_is_set = std::env::var(env_var_name)
                .map(|v| !v.is_empty())
                .unwrap_or(false);

            if !env_var_is_set {
                continue;
            }

            let settings_key: Arc<str> = format!("{}:{}", full_provider_id, env_var_name).into();

            // Check if already enabled in settings
            let already_enabled = ExtensionSettings::get_global(cx)
                .allowed_env_var_providers
                .contains(settings_key.as_ref());

            if already_enabled {
                continue;
            }

            // Enable env var reading since the env var is set
            settings::update_settings_file(<dyn fs::Fs>::global(cx), cx, {
                let settings_key = settings_key.clone();
                move |settings, _| {
                    let allowed = settings
                        .extension
                        .allowed_env_var_providers
                        .get_or_insert_with(Vec::new);

                    if !allowed
                        .iter()
                        .any(|id| id.as_ref() == settings_key.as_ref())
                    {
                        allowed.push(settings_key);
                    }
                }
            });
        }
    }
}

/// The current extension [`SchemaVersion`] supported by Zed.
const CURRENT_SCHEMA_VERSION: SchemaVersion = SchemaVersion(1);

@@ -226,8 +131,6 @@ pub struct ExtensionStore {
pub enum ExtensionOperation {
    Upgrade,
    Install,
    /// Auto-install from settings - triggers legacy LLM provider migrations
    AutoInstall,
    Remove,
}

@@ -710,60 +613,8 @@ impl ExtensionStore {

cx.spawn(async move |this, cx| {
    for extension_id in extensions_to_install {
        // When enabled, this checks if an extension exists locally in the repo's extensions/
        // directory and installs it as a dev extension instead of fetching from the registry.
        // This is useful for testing auto-installed extensions before they've been published.
        // Set to `true` only during local development/testing of new auto-install extensions.
        #[cfg(debug_assertions)]
        const DEBUG_ALLOW_UNPUBLISHED_AUTO_EXTENSIONS: bool = false;

        #[cfg(debug_assertions)]
        if DEBUG_ALLOW_UNPUBLISHED_AUTO_EXTENSIONS {
            let local_extension_path = std::path::PathBuf::from(env!("CARGO_MANIFEST_DIR"))
                .parent()
                .unwrap()
                .parent()
                .unwrap()
                .join("extensions")
                .join(extension_id.as_ref());

            if local_extension_path.exists() {
                // Force-remove existing extension directory if it exists and isn't a symlink
                // This handles the case where the extension was previously installed from the registry
                if let Some(installed_dir) = this
                    .update(cx, |this, _cx| this.installed_dir.clone())
                    .ok()
                {
                    let existing_path = installed_dir.join(extension_id.as_ref());
                    if existing_path.exists() {
                        let metadata = std::fs::symlink_metadata(&existing_path);
                        let is_symlink = metadata.map(|m| m.is_symlink()).unwrap_or(false);
                        if !is_symlink {
                            if let Err(e) = std::fs::remove_dir_all(&existing_path) {
                                log::error!(
                                    "Failed to remove existing extension directory {:?}: {}",
                                    existing_path,
                                    e
                                );
                            }
                        }
                    }
                }

                if let Some(task) = this
                    .update(cx, |this, cx| {
                        this.install_dev_extension(local_extension_path, cx)
                    })
                    .ok()
                {
                    task.await.log_err();
                }
                continue;
            }
        }

        this.update(cx, |this, cx| {
            this.auto_install_latest_extension(extension_id.clone(), cx);
            this.install_latest_extension(extension_id.clone(), cx);
        })
        .ok();
    }
@@ -918,10 +769,7 @@ impl ExtensionStore {
this.update(cx, |this, cx| this.reload(Some(extension_id.clone()), cx))?
    .await;

if matches!(
    operation,
    ExtensionOperation::Install | ExtensionOperation::AutoInstall
) {
if let ExtensionOperation::Install = operation {
    this.update(cx, |this, cx| {
        cx.emit(Event::ExtensionInstalled(extension_id.clone()));
        if let Some(events) = ExtensionEvents::try_global(cx)
@@ -931,27 +779,6 @@ impl ExtensionStore {
        this.emit(extension::Event::ExtensionInstalled(manifest.clone()), cx)
    });
}

// Run legacy LLM provider migrations only for auto-installed extensions
if matches!(operation, ExtensionOperation::AutoInstall) {
    if let Some(manifest) = this.extension_manifest_for_id(&extension_id) {
        migrate_legacy_llm_provider_env_var(&manifest, cx);
    }
    copilot_migration::migrate_copilot_credentials_if_needed(&extension_id, cx);
    anthropic_migration::migrate_anthropic_credentials_if_needed(
        &extension_id,
        cx,
    );
    google_ai_migration::migrate_google_ai_credentials_if_needed(
        &extension_id,
        cx,
    );
    openai_migration::migrate_openai_credentials_if_needed(&extension_id, cx);
    open_router_migration::migrate_open_router_credentials_if_needed(
        &extension_id,
        cx,
    );
}
})
.ok();
}
@@ -961,24 +788,8 @@ impl ExtensionStore {
}

pub fn install_latest_extension(&mut self, extension_id: Arc<str>, cx: &mut Context<Self>) {
    self.install_latest_extension_with_operation(extension_id, ExtensionOperation::Install, cx);
}
    log::info!("installing extension {extension_id} latest version");

/// Auto-install an extension, triggering legacy LLM provider migrations.
fn auto_install_latest_extension(&mut self, extension_id: Arc<str>, cx: &mut Context<Self>) {
    self.install_latest_extension_with_operation(
        extension_id,
        ExtensionOperation::AutoInstall,
        cx,
    );
}

fn install_latest_extension_with_operation(
    &mut self,
    extension_id: Arc<str>,
    operation: ExtensionOperation,
    cx: &mut Context<Self>,
) {
    let schema_versions = schema_version_range();
    let wasm_api_versions = wasm_api_version_range(ReleaseChannel::global(cx));

@@ -1001,8 +812,13 @@ impl ExtensionStore {
    return;
};

self.install_or_upgrade_extension_at_endpoint(extension_id, url, operation, cx)
    .detach_and_log_err(cx);
self.install_or_upgrade_extension_at_endpoint(
    extension_id,
    url,
    ExtensionOperation::Install,
    cx,
)
.detach_and_log_err(cx);
}

pub fn upgrade_extension(
@@ -1021,6 +837,7 @@ impl ExtensionStore {
    operation: ExtensionOperation,
    cx: &mut Context<Self>,
) -> Task<Result<()>> {
    log::info!("installing extension {extension_id} {version}");
    let Some(url) = self
        .http_client
        .build_zed_api_url(
@@ -1196,37 +1013,9 @@ impl ExtensionStore {
    }
}

fs.create_symlink(output_path, extension_source_path.clone())
fs.create_symlink(output_path, extension_source_path)
    .await?;

// Re-load manifest and run migrations before reload so settings are updated before providers are registered
let manifest_for_migration =
    ExtensionManifest::load(fs.clone(), &extension_source_path).await?;
this.update(cx, |_this, cx| {
    migrate_legacy_llm_provider_env_var(&manifest_for_migration, cx);
    // Also run credential migrations for dev extensions
    copilot_migration::migrate_copilot_credentials_if_needed(
        manifest_for_migration.id.as_ref(),
        cx,
    );
    anthropic_migration::migrate_anthropic_credentials_if_needed(
        manifest_for_migration.id.as_ref(),
        cx,
    );
    google_ai_migration::migrate_google_ai_credentials_if_needed(
        manifest_for_migration.id.as_ref(),
        cx,
    );
    openai_migration::migrate_openai_credentials_if_needed(
        manifest_for_migration.id.as_ref(),
        cx,
    );
    open_router_migration::migrate_open_router_credentials_if_needed(
        manifest_for_migration.id.as_ref(),
        cx,
    );
})?;

this.update(cx, |this, cx| this.reload(None, cx))?.await;
this.update(cx, |this, cx| {
    cx.emit(Event::ExtensionInstalled(extension_id.clone()));
@@ -1345,6 +1134,18 @@ impl ExtensionStore {
    return Task::ready(());
}

let reload_count = extensions_to_unload
    .iter()
    .filter(|id| extensions_to_load.contains(id))
    .count();

log::info!(
    "extensions updated. loading {}, reloading {}, unloading {}",
    extensions_to_load.len() - reload_count,
    reload_count,
    extensions_to_unload.len() - reload_count
);

let extension_ids = extensions_to_load
    .iter()
    .filter_map(|id| {
@@ -1419,11 +1220,6 @@ impl ExtensionStore {
for command_name in extension.manifest.slash_commands.keys() {
    self.proxy.unregister_slash_command(command_name.clone());
}
for provider_id in extension.manifest.language_model_providers.keys() {
    let full_provider_id: Arc<str> = format!("{}:{}", extension_id, provider_id).into();
    self.proxy
        .unregister_language_model_provider(full_provider_id, cx);
}
}

self.wasm_extensions
@@ -1562,11 +1358,7 @@ impl ExtensionStore {
})
.await;

let mut wasm_extensions: Vec<(
    Arc<ExtensionManifest>,
    WasmExtension,
    Vec<LlmProviderWithModels>,
)> = Vec::new();
let mut wasm_extensions = Vec::new();
for extension in extension_entries {
    if extension.manifest.lib.kind.is_none() {
        continue;
@@ -1584,149 +1376,7 @@ impl ExtensionStore {

match wasm_extension {
    Ok(wasm_extension) => {
        // Query for LLM providers if the manifest declares any
        let mut llm_providers_with_models = Vec::new();
        if !extension.manifest.language_model_providers.is_empty() {
            let providers_result = wasm_extension
                .call(|ext, store| {
                    async move { ext.call_llm_providers(store).await }.boxed()
                })
                .await;

            if let Ok(Ok(providers)) = providers_result {
                for provider_info in providers {
                    let models_result = wasm_extension
                        .call({
                            let provider_id = provider_info.id.clone();
                            |ext, store| {
                                async move {
                                    ext.call_llm_provider_models(store, &provider_id)
                                        .await
                                }
                                .boxed()
                            }
                        })
                        .await;

                    let models: Vec<LlmModelInfo> = match models_result {
                        Ok(Ok(Ok(models))) => models,
                        Ok(Ok(Err(e))) => {
                            log::error!(
                                "Failed to get models for LLM provider {} in extension {}: {}",
                                provider_info.id,
                                extension.manifest.id,
                                e
                            );
                            Vec::new()
                        }
                        Ok(Err(e)) => {
                            log::error!(
                                "Wasm error calling llm_provider_models for {} in extension {}: {:?}",
                                provider_info.id,
                                extension.manifest.id,
                                e
                            );
                            Vec::new()
                        }
                        Err(e) => {
                            log::error!(
                                "Extension call failed for llm_provider_models {} in extension {}: {:?}",
                                provider_info.id,
                                extension.manifest.id,
                                e
                            );
                            Vec::new()
                        }
                    };

                    // Query cache configurations for each model
                    let mut cache_configs = collections::HashMap::default();
                    for model in &models {
                        let cache_config_result = wasm_extension
                            .call({
                                let provider_id = provider_info.id.clone();
                                let model_id = model.id.clone();
                                |ext, store| {
                                    async move {
                                        ext.call_llm_cache_configuration(
                                            store,
                                            &provider_id,
                                            &model_id,
                                        )
                                        .await
                                    }
                                    .boxed()
                                }
                            })
                            .await;

                        if let Ok(Ok(Some(config))) = cache_config_result {
                            cache_configs.insert(model.id.clone(), config);
                        }
                    }

                    // Query initial authentication state
                    let is_authenticated = wasm_extension
                        .call({
                            let provider_id = provider_info.id.clone();
                            |ext, store| {
                                async move {
                                    ext.call_llm_provider_is_authenticated(
                                        store,
                                        &provider_id,
                                    )
                                    .await
                                }
                                .boxed()
                            }
                        })
                        .await
                        .unwrap_or(Ok(false))
                        .unwrap_or(false);

                    // Resolve icon path if provided
                    let icon_path = provider_info.icon.as_ref().map(|icon| {
                        let icon_file_path = extension_path.join(icon);
                        // Canonicalize to resolve symlinks (dev extensions are symlinked)
                        let absolute_icon_path = icon_file_path
                            .canonicalize()
                            .unwrap_or(icon_file_path)
                            .to_string_lossy()
                            .to_string();
                        SharedString::from(absolute_icon_path)
                    });

                    let provider_id_arc: Arc<str> =
                        provider_info.id.as_str().into();
                    let auth_config = extension
                        .manifest
                        .language_model_providers
                        .get(&provider_id_arc)
                        .and_then(|entry| entry.auth.clone());

                    llm_providers_with_models.push(LlmProviderWithModels {
                        provider_info,
                        models,
                        cache_configs,
                        is_authenticated,
                        icon_path,
                        auth_config,
                    });
                }
            } else {
                log::error!(
                    "Failed to get LLM providers from extension {}: {:?}",
                    extension.manifest.id,
                    providers_result
                );
            }
        }

        wasm_extensions.push((
            extension.manifest.clone(),
            wasm_extension,
            llm_providers_with_models,
        ))
        wasm_extensions.push((extension.manifest.clone(), wasm_extension))
    }
    Err(e) => {
        log::error!(
@@ -1745,7 +1395,7 @@ impl ExtensionStore {
this.update(cx, |this, cx| {
    this.reload_complete_senders.clear();

    for (manifest, wasm_extension, llm_providers_with_models) in &wasm_extensions {
    for (manifest, wasm_extension) in &wasm_extensions {
        let extension = Arc::new(wasm_extension.clone());

        for (language_server_id, language_server_config) in &manifest.language_servers {
@@ -1799,42 +1449,9 @@ impl ExtensionStore {
this.proxy
    .register_debug_locator(extension.clone(), debug_adapter.clone());
}

// Register LLM providers
for llm_provider in llm_providers_with_models {
    let provider_id: Arc<str> =
        format!("{}:{}", manifest.id, llm_provider.provider_info.id).into();
    let wasm_ext = extension.as_ref().clone();
    let pinfo = llm_provider.provider_info.clone();
    let mods = llm_provider.models.clone();
    let cache_cfgs = llm_provider.cache_configs.clone();
    let auth = llm_provider.is_authenticated;
    let icon = llm_provider.icon_path.clone();
    let auth_config = llm_provider.auth_config.clone();

    this.proxy.register_language_model_provider(
        provider_id.clone(),
        Box::new(move |cx: &mut App| {
            let provider = Arc::new(ExtensionLanguageModelProvider::new(
                wasm_ext, pinfo, mods, cache_cfgs, auth, icon, auth_config, cx,
            ));
            language_model::LanguageModelRegistry::global(cx).update(
                cx,
                |registry, cx| {
                    registry.register_provider(provider, cx);
                },
            );
        }),
        cx,
    );
}
}

let wasm_extensions_without_llm: Vec<_> = wasm_extensions
    .into_iter()
    .map(|(manifest, ext, _)| (manifest, ext))
    .collect();
this.wasm_extensions.extend(wasm_extensions_without_llm);
this.wasm_extensions.extend(wasm_extensions);
this.proxy.set_extensions_loaded();
this.proxy.reload_current_theme(cx);
this.proxy.reload_current_icon_theme(cx);
@@ -1856,6 +1473,7 @@ impl ExtensionStore {
let index_path = self.index_path.clone();
let proxy = self.proxy.clone();
cx.background_spawn(async move {
    let start_time = Instant::now();
    let mut index = ExtensionIndex::default();

    fs.create_dir(&work_dir).await.log_err();
@@ -1893,6 +1511,7 @@ impl ExtensionStore {
    .log_err();
}

log::info!("rebuilt extension index in {:?}", start_time.elapsed());
index
})
}
@@ -2166,6 +1785,11 @@ impl ExtensionStore {
    })?,
    path_style,
);
log::info!(
    "Uploading extension {} to {:?}",
    missing_extension.clone().id,
    dest_dir
);

client
    .update(cx, |client, cx| {
@@ -2173,6 +1797,11 @@ impl ExtensionStore {
    })?
    .await?;

log::info!(
    "Finished uploading extension {}",
    missing_extension.clone().id
);

let result = client
    .update(cx, |client, _cx| {
        client.proto_client().request(proto::InstallExtension {
@@ -1,4 +1,4 @@
use collections::{HashMap, HashSet};
use collections::HashMap;
use extension::{
    DownloadFileCapability, ExtensionCapability, NpmInstallPackageCapability, ProcessExecCapability,
};
@@ -16,10 +12,6 @@ pub struct ExtensionSettings {
    pub auto_install_extensions: HashMap<Arc<str>, bool>,
    pub auto_update_extensions: HashMap<Arc<str>, bool>,
    pub granted_capabilities: Vec<ExtensionCapability>,
    /// The extension language model providers that are allowed to read API keys
    /// from environment variables. Each entry is in the format
    /// "extension_id:provider_id:ENV_VAR_NAME".
    pub allowed_env_var_providers: HashSet<Arc<str>>,
}

impl ExtensionSettings {
@@ -64,13 +60,6 @@ impl Settings for ExtensionSettings {
                }
            })
            .collect(),
            allowed_env_var_providers: content
                .extension
                .allowed_env_var_providers
                .clone()
                .unwrap_or_default()
                .into_iter()
                .collect(),
        }
    }
}
@@ -1,124 +0,0 @@
-use credentials_provider::CredentialsProvider;
-use gpui::App;
-
-const GOOGLE_AI_EXTENSION_ID: &str = "google-ai";
-const GOOGLE_AI_PROVIDER_ID: &str = "google-ai";
-const GOOGLE_AI_DEFAULT_API_URL: &str = "https://generativelanguage.googleapis.com";
-
-/// Migrates Google AI API credentials from the old built-in provider location
-/// to the new extension-based location.
-///
-/// This should only be called during auto-install of the extension.
-pub fn migrate_google_ai_credentials_if_needed(extension_id: &str, cx: &mut App) {
-    if extension_id != GOOGLE_AI_EXTENSION_ID {
-        return;
-    }
-
-    let extension_credential_key = format!(
-        "extension-llm-{}:{}",
-        GOOGLE_AI_EXTENSION_ID, GOOGLE_AI_PROVIDER_ID
-    );
-
-    let credentials_provider = <dyn CredentialsProvider>::global(cx);
-
-    cx.spawn(async move |cx| {
-        // Read from old location
-        let old_credential = credentials_provider
-            .read_credentials(GOOGLE_AI_DEFAULT_API_URL, &cx)
-            .await
-            .ok()
-            .flatten();
-
-        let api_key = match old_credential {
-            Some((_, key_bytes)) => match String::from_utf8(key_bytes) {
-                Ok(key) if !key.is_empty() => key,
-                Ok(_) => {
-                    log::debug!("Existing Google AI API key is empty, nothing to migrate");
-                    return;
-                }
-                Err(_) => {
-                    log::error!("Failed to decode Google AI API key as UTF-8");
-                    return;
-                }
-            },
-            None => {
-                log::debug!("No existing Google AI API key found to migrate");
-                return;
-            }
-        };
-
-        log::info!("Migrating existing Google AI API key to Google AI extension");
-
-        match credentials_provider
-            .write_credentials(&extension_credential_key, "Bearer", api_key.as_bytes(), &cx)
-            .await
-        {
-            Ok(()) => {
-                log::info!("Successfully migrated Google AI API key to extension");
-            }
-            Err(err) => {
-                log::error!("Failed to migrate Google AI API key: {}", err);
-            }
-        }
-    })
-    .detach();
-}
-
-#[cfg(test)]
-mod tests {
-    use super::*;
-    use gpui::TestAppContext;
-
-    #[gpui::test]
-    async fn test_migrates_credentials_from_old_location(cx: &mut TestAppContext) {
-        let api_key = "AIzaSy-test-key-12345";
-
-        cx.write_credentials(GOOGLE_AI_DEFAULT_API_URL, "Bearer", api_key.as_bytes());
-
-        cx.update(|cx| {
-            migrate_google_ai_credentials_if_needed(GOOGLE_AI_EXTENSION_ID, cx);
-        });
-
-        cx.run_until_parked();
-
-        let migrated = cx.read_credentials("extension-llm-google-ai:google-ai");
-        assert!(migrated.is_some(), "Credentials should have been migrated");
-        let (username, password) = migrated.unwrap();
-        assert_eq!(username, "Bearer");
-        assert_eq!(String::from_utf8(password).unwrap(), api_key);
-    }
-
-    #[gpui::test]
-    async fn test_no_migration_if_no_old_credentials(cx: &mut TestAppContext) {
-        cx.update(|cx| {
-            migrate_google_ai_credentials_if_needed(GOOGLE_AI_EXTENSION_ID, cx);
-        });
-
-        cx.run_until_parked();
-
-        let credentials = cx.read_credentials("extension-llm-google-ai:google-ai");
-        assert!(
-            credentials.is_none(),
-            "Should not create credentials if none existed"
-        );
-    }
-
-    #[gpui::test]
-    async fn test_skips_migration_for_other_extensions(cx: &mut TestAppContext) {
-        let api_key = "AIzaSy-test-key";
-
-        cx.write_credentials(GOOGLE_AI_DEFAULT_API_URL, "Bearer", api_key.as_bytes());
-
-        cx.update(|cx| {
-            migrate_google_ai_credentials_if_needed("some-other-extension", cx);
-        });
-
-        cx.run_until_parked();
-
-        let credentials = cx.read_credentials("extension-llm-google-ai:google-ai");
-        assert!(
-            credentials.is_none(),
-            "Should not migrate for other extensions"
-        );
-    }
-}
@@ -1,124 +0,0 @@
-use credentials_provider::CredentialsProvider;
-use gpui::App;
-
-const OPEN_ROUTER_EXTENSION_ID: &str = "openrouter";
-const OPEN_ROUTER_PROVIDER_ID: &str = "openrouter";
-const OPEN_ROUTER_DEFAULT_API_URL: &str = "https://openrouter.ai/api/v1";
-
-/// Migrates OpenRouter API credentials from the old built-in provider location
-/// to the new extension-based location.
-///
-/// This should only be called during auto-install of the extension.
-pub fn migrate_open_router_credentials_if_needed(extension_id: &str, cx: &mut App) {
-    if extension_id != OPEN_ROUTER_EXTENSION_ID {
-        return;
-    }
-
-    let extension_credential_key = format!(
-        "extension-llm-{}:{}",
-        OPEN_ROUTER_EXTENSION_ID, OPEN_ROUTER_PROVIDER_ID
-    );
-
-    let credentials_provider = <dyn CredentialsProvider>::global(cx);
-
-    cx.spawn(async move |cx| {
-        // Read from old location
-        let old_credential = credentials_provider
-            .read_credentials(OPEN_ROUTER_DEFAULT_API_URL, &cx)
-            .await
-            .ok()
-            .flatten();
-
-        let api_key = match old_credential {
-            Some((_, key_bytes)) => match String::from_utf8(key_bytes) {
-                Ok(key) if !key.is_empty() => key,
-                Ok(_) => {
-                    log::debug!("Existing OpenRouter API key is empty, nothing to migrate");
-                    return;
-                }
-                Err(_) => {
-                    log::error!("Failed to decode OpenRouter API key as UTF-8");
-                    return;
-                }
-            },
-            None => {
-                log::debug!("No existing OpenRouter API key found to migrate");
-                return;
-            }
-        };
-
-        log::info!("Migrating existing OpenRouter API key to OpenRouter extension");
-
-        match credentials_provider
-            .write_credentials(&extension_credential_key, "Bearer", api_key.as_bytes(), &cx)
-            .await
-        {
-            Ok(()) => {
-                log::info!("Successfully migrated OpenRouter API key to extension");
-            }
-            Err(err) => {
-                log::error!("Failed to migrate OpenRouter API key: {}", err);
-            }
-        }
-    })
-    .detach();
-}
-
-#[cfg(test)]
-mod tests {
-    use super::*;
-    use gpui::TestAppContext;
-
-    #[gpui::test]
-    async fn test_migrates_credentials_from_old_location(cx: &mut TestAppContext) {
-        let api_key = "sk-or-test-key-12345";
-
-        cx.write_credentials(OPEN_ROUTER_DEFAULT_API_URL, "Bearer", api_key.as_bytes());
-
-        cx.update(|cx| {
-            migrate_open_router_credentials_if_needed(OPEN_ROUTER_EXTENSION_ID, cx);
-        });
-
-        cx.run_until_parked();
-
-        let migrated = cx.read_credentials("extension-llm-openrouter:openrouter");
-        assert!(migrated.is_some(), "Credentials should have been migrated");
-        let (username, password) = migrated.unwrap();
-        assert_eq!(username, "Bearer");
-        assert_eq!(String::from_utf8(password).unwrap(), api_key);
-    }
-
-    #[gpui::test]
-    async fn test_no_migration_if_no_old_credentials(cx: &mut TestAppContext) {
-        cx.update(|cx| {
-            migrate_open_router_credentials_if_needed(OPEN_ROUTER_EXTENSION_ID, cx);
-        });
-
-        cx.run_until_parked();
-
-        let credentials = cx.read_credentials("extension-llm-openrouter:openrouter");
-        assert!(
-            credentials.is_none(),
-            "Should not create credentials if none existed"
-        );
-    }
-
-    #[gpui::test]
-    async fn test_skips_migration_for_other_extensions(cx: &mut TestAppContext) {
-        let api_key = "sk-or-test-key";
-
-        cx.write_credentials(OPEN_ROUTER_DEFAULT_API_URL, "Bearer", api_key.as_bytes());
-
-        cx.update(|cx| {
-            migrate_open_router_credentials_if_needed("some-other-extension", cx);
-        });
-
-        cx.run_until_parked();
-
-        let credentials = cx.read_credentials("extension-llm-openrouter:openrouter");
-        assert!(
-            credentials.is_none(),
-            "Should not migrate for other extensions"
-        );
-    }
-}
@@ -1,124 +0,0 @@
-use credentials_provider::CredentialsProvider;
-use gpui::App;
-
-const OPENAI_EXTENSION_ID: &str = "openai";
-const OPENAI_PROVIDER_ID: &str = "openai";
-const OPENAI_DEFAULT_API_URL: &str = "https://api.openai.com/v1";
-
-/// Migrates OpenAI API credentials from the old built-in provider location
-/// to the new extension-based location.
-///
-/// This should only be called during auto-install of the extension.
-pub fn migrate_openai_credentials_if_needed(extension_id: &str, cx: &mut App) {
-    if extension_id != OPENAI_EXTENSION_ID {
-        return;
-    }
-
-    let extension_credential_key = format!(
-        "extension-llm-{}:{}",
-        OPENAI_EXTENSION_ID, OPENAI_PROVIDER_ID
-    );
-
-    let credentials_provider = <dyn CredentialsProvider>::global(cx);
-
-    cx.spawn(async move |cx| {
-        // Read from old location
-        let old_credential = credentials_provider
-            .read_credentials(OPENAI_DEFAULT_API_URL, &cx)
-            .await
-            .ok()
-            .flatten();
-
-        let api_key = match old_credential {
-            Some((_, key_bytes)) => match String::from_utf8(key_bytes) {
-                Ok(key) if !key.is_empty() => key,
-                Ok(_) => {
-                    log::debug!("Existing OpenAI API key is empty, nothing to migrate");
-                    return;
-                }
-                Err(_) => {
-                    log::error!("Failed to decode OpenAI API key as UTF-8");
-                    return;
-                }
-            },
-            None => {
-                log::debug!("No existing OpenAI API key found to migrate");
-                return;
-            }
-        };
-
-        log::info!("Migrating existing OpenAI API key to OpenAI extension");
-
-        match credentials_provider
-            .write_credentials(&extension_credential_key, "Bearer", api_key.as_bytes(), &cx)
-            .await
-        {
-            Ok(()) => {
-                log::info!("Successfully migrated OpenAI API key to extension");
-            }
-            Err(err) => {
-                log::error!("Failed to migrate OpenAI API key: {}", err);
-            }
-        }
-    })
-    .detach();
-}
-
-#[cfg(test)]
-mod tests {
-    use super::*;
-    use gpui::TestAppContext;
-
-    #[gpui::test]
-    async fn test_migrates_credentials_from_old_location(cx: &mut TestAppContext) {
-        let api_key = "sk-test-key-12345";
-
-        cx.write_credentials(OPENAI_DEFAULT_API_URL, "Bearer", api_key.as_bytes());
-
-        cx.update(|cx| {
-            migrate_openai_credentials_if_needed(OPENAI_EXTENSION_ID, cx);
-        });
-
-        cx.run_until_parked();
-
-        let migrated = cx.read_credentials("extension-llm-openai:openai");
-        assert!(migrated.is_some(), "Credentials should have been migrated");
-        let (username, password) = migrated.unwrap();
-        assert_eq!(username, "Bearer");
-        assert_eq!(String::from_utf8(password).unwrap(), api_key);
-    }
-
-    #[gpui::test]
-    async fn test_no_migration_if_no_old_credentials(cx: &mut TestAppContext) {
-        cx.update(|cx| {
-            migrate_openai_credentials_if_needed(OPENAI_EXTENSION_ID, cx);
-        });
-
-        cx.run_until_parked();
-
-        let credentials = cx.read_credentials("extension-llm-openai:openai");
-        assert!(
-            credentials.is_none(),
-            "Should not create credentials if none existed"
-        );
-    }
-
-    #[gpui::test]
-    async fn test_skips_migration_for_other_extensions(cx: &mut TestAppContext) {
-        let api_key = "sk-test-key";
-
-        cx.write_credentials(OPENAI_DEFAULT_API_URL, "Bearer", api_key.as_bytes());
-
-        cx.update(|cx| {
-            migrate_openai_credentials_if_needed("some-other-extension", cx);
-        });
-
-        cx.run_until_parked();
-
-        let credentials = cx.read_credentials("extension-llm-openai:openai");
-        assert!(
-            credentials.is_none(),
-            "Should not migrate for other extensions"
-        );
-    }
-}
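The three deleted migration modules above differ only in their constants; they all map an API-URL-keyed credential onto an `extension-llm-{extension_id}:{provider_id}` key. A self-contained sketch of that shared key scheme (the helper name is hypothetical; the format string and IDs are taken from the diff):

```rust
// Build the extension-scoped credential key used by the deleted migration modules.
fn extension_credential_key(extension_id: &str, provider_id: &str) -> String {
    format!("extension-llm-{}:{}", extension_id, provider_id)
}

fn main() {
    // The keys each migration module wrote (and each test read back):
    assert_eq!(
        extension_credential_key("google-ai", "google-ai"),
        "extension-llm-google-ai:google-ai"
    );
    assert_eq!(
        extension_credential_key("openrouter", "openrouter"),
        "extension-llm-openrouter:openrouter"
    );
    println!("{}", extension_credential_key("openai", "openai"));
}
```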
@@ -1,11 +1,9 @@
-pub mod llm_provider;
 pub mod wit;

 use crate::capability_granter::CapabilityGranter;
 use crate::{ExtensionManifest, ExtensionSettings};
 use anyhow::{Context as _, Result, anyhow, bail};
 use async_trait::async_trait;
-
 use dap::{DebugRequest, StartDebuggingRequestArgumentsRequest};
 use extension::{
     CodeLabel, Command, Completion, ContextServerConfiguration, DebugAdapterBinary,
@@ -66,7 +64,7 @@ pub struct WasmHost {

 #[derive(Clone, Debug)]
 pub struct WasmExtension {
-    tx: Arc<UnboundedSender<ExtensionCall>>,
+    tx: UnboundedSender<ExtensionCall>,
     pub manifest: Arc<ExtensionManifest>,
     pub work_dir: Arc<Path>,
     #[allow(unused)]
@@ -76,10 +74,7 @@ pub struct WasmExtension {

 impl Drop for WasmExtension {
     fn drop(&mut self) {
-        // Only close the channel when this is the last clone holding the sender
-        if Arc::strong_count(&self.tx) == 1 {
-            self.tx.close_channel();
-        }
+        self.tx.close_channel();
    }
 }
@@ -676,7 +671,7 @@ impl WasmHost {
         Ok(WasmExtension {
             manifest,
             work_dir,
-            tx: Arc::new(tx),
+            tx,
             zed_api_version,
             _task: task,
         })
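The reverted `Drop` impl above closed the channel only from the last surviving clone, via `Arc::strong_count`. A std-only sketch of that pattern (the `Channel`/`Handle` types are stand-ins for `UnboundedSender<ExtensionCall>`/`WasmExtension`):

```rust
use std::sync::Arc;
use std::sync::atomic::{AtomicUsize, Ordering};

// Counts how many times the stand-in channel is closed.
static CLOSE_COUNT: AtomicUsize = AtomicUsize::new(0);

// Stand-in for the extension-call sender in the diff.
struct Channel;

impl Channel {
    fn close_channel(&self) {
        CLOSE_COUNT.fetch_add(1, Ordering::SeqCst);
    }
}

struct Handle {
    tx: Arc<Channel>,
}

impl Drop for Handle {
    fn drop(&mut self) {
        // Only close the channel when this is the last clone holding the sender.
        if Arc::strong_count(&self.tx) == 1 {
            self.tx.close_channel();
        }
    }
}

// Drops two clones and reports how many times the channel was closed.
fn close_count_after_drops() -> usize {
    let a = Handle { tx: Arc::new(Channel) };
    let b = Handle { tx: a.tx.clone() };
    drop(b); // two strong refs still exist here, so no close
    drop(a); // last clone: closes exactly once
    CLOSE_COUNT.load(Ordering::SeqCst)
}

fn main() {
    assert_eq!(close_count_after_drops(), 1);
    println!("channel closed exactly once");
}
```

The revert trades this guard for a plain `close_channel()` on every drop, which is only safe once `tx` is no longer shared behind an `Arc`.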
File diff suppressed because it is too large
@@ -1,12 +0,0 @@
-<!DOCTYPE html>
-<html>
-<head>
-    <title>Authentication Complete</title>
-</head>
-<body style="font-family: system-ui, sans-serif; display: flex; justify-content: center; align-items: center; height: 100vh; margin: 0;">
-    <div style="text-align: center;">
-        <h1>Authentication Complete</h1>
-        <p>You can close this window and return to Zed.</p>
-    </div>
-</body>
-</html>
@@ -16,7 +16,7 @@ use lsp::LanguageServerName;
 use release_channel::ReleaseChannel;
 use task::{DebugScenario, SpawnInTerminal, TaskTemplate, ZedDebugConfig};

-use crate::wasm_host::wit::since_v0_8_0::dap::StartDebuggingRequestArgumentsRequest;
+use crate::wasm_host::wit::since_v0_6_0::dap::StartDebuggingRequestArgumentsRequest;

 use super::{WasmState, wasm_engine};
 use anyhow::{Context as _, Result, anyhow};
@@ -33,19 +33,6 @@ pub use latest::CodeLabelSpanLiteral;
 pub use latest::{
     CodeLabel, CodeLabelSpan, Command, DebugAdapterBinary, ExtensionProject, Range, SlashCommand,
     zed::extension::context_server::ContextServerConfiguration,
-    zed::extension::llm_provider::{
-        CacheConfiguration as LlmCacheConfiguration, CompletionEvent as LlmCompletionEvent,
-        CompletionRequest as LlmCompletionRequest, DeviceFlowPromptInfo as LlmDeviceFlowPromptInfo,
-        ImageData as LlmImageData, MessageContent as LlmMessageContent,
-        MessageRole as LlmMessageRole, ModelCapabilities as LlmModelCapabilities,
-        ModelInfo as LlmModelInfo, ProviderInfo as LlmProviderInfo,
-        RequestMessage as LlmRequestMessage, StopReason as LlmStopReason,
-        ThinkingContent as LlmThinkingContent, TokenUsage as LlmTokenUsage,
-        ToolChoice as LlmToolChoice, ToolDefinition as LlmToolDefinition,
-        ToolInputFormat as LlmToolInputFormat, ToolResult as LlmToolResult,
-        ToolResultContent as LlmToolResultContent, ToolUse as LlmToolUse,
-        ToolUseJsonParseError as LlmToolUseJsonParseError,
-    },
     zed::extension::lsp::{
         Completion, CompletionKind, CompletionLabelDetails, InsertTextFormat, Symbol, SymbolKind,
     },
@@ -1020,20 +1007,6 @@ impl Extension {
         resource: Resource<Arc<dyn WorktreeDelegate>>,
     ) -> Result<Result<DebugAdapterBinary, String>> {
         match self {
-            Extension::V0_8_0(ext) => {
-                let dap_binary = ext
-                    .call_get_dap_binary(
-                        store,
-                        &adapter_name,
-                        &task.try_into()?,
-                        user_installed_path.as_ref().and_then(|p| p.to_str()),
-                        resource,
-                    )
-                    .await?
-                    .map_err(|e| anyhow!("{e:?}"))?;
-
-                Ok(Ok(dap_binary))
-            }
             Extension::V0_6_0(ext) => {
                 let dap_binary = ext
                     .call_get_dap_binary(
@@ -1059,16 +1032,6 @@
         config: serde_json::Value,
     ) -> Result<Result<StartDebuggingRequestArgumentsRequest, String>> {
         match self {
-            Extension::V0_8_0(ext) => {
-                let config =
-                    serde_json::to_string(&config).context("Adapter config is not a valid JSON")?;
-                let result = ext
-                    .call_dap_request_kind(store, &adapter_name, &config)
-                    .await?
-                    .map_err(|e| anyhow!("{e:?}"))?;
-
-                Ok(Ok(result))
-            }
             Extension::V0_6_0(ext) => {
                 let config =
                     serde_json::to_string(&config).context("Adapter config is not a valid JSON")?;
@@ -1089,15 +1052,6 @@
         config: ZedDebugConfig,
     ) -> Result<Result<DebugScenario, String>> {
         match self {
-            Extension::V0_8_0(ext) => {
-                let config = config.into();
-                let result = ext
-                    .call_dap_config_to_scenario(store, &config)
-                    .await?
-                    .map_err(|e| anyhow!("{e:?}"))?;
-
-                Ok(Ok(result.try_into()?))
-            }
             Extension::V0_6_0(ext) => {
                 let config = config.into();
                 let dap_binary = ext
@@ -1120,20 +1074,6 @@
         debug_adapter_name: String,
     ) -> Result<Option<DebugScenario>> {
         match self {
-            Extension::V0_8_0(ext) => {
-                let build_config_template = build_config_template.into();
-                let result = ext
-                    .call_dap_locator_create_scenario(
-                        store,
-                        &locator_name,
-                        &build_config_template,
-                        &resolved_label,
-                        &debug_adapter_name,
-                    )
-                    .await?;
-
-                Ok(result.map(TryInto::try_into).transpose()?)
-            }
             Extension::V0_6_0(ext) => {
                 let build_config_template = build_config_template.into();
                 let dap_binary = ext
@@ -1159,15 +1099,6 @@
         resolved_build_task: SpawnInTerminal,
     ) -> Result<Result<DebugRequest, String>> {
         match self {
-            Extension::V0_8_0(ext) => {
-                let build_config_template = resolved_build_task.try_into()?;
-                let dap_request = ext
-                    .call_run_dap_locator(store, &locator_name, &build_config_template)
-                    .await?
-                    .map_err(|e| anyhow!("{e:?}"))?;
-
-                Ok(Ok(dap_request.into()))
-            }
             Extension::V0_6_0(ext) => {
                 let build_config_template = resolved_build_task.try_into()?;
                 let dap_request = ext
@@ -1180,174 +1111,6 @@ impl Extension {
             _ => anyhow::bail!("`dap_locator_create_scenario` not available prior to v0.6.0"),
         }
     }
-
-    pub async fn call_llm_providers(
-        &self,
-        store: &mut Store<WasmState>,
-    ) -> Result<Vec<latest::llm_provider::ProviderInfo>> {
-        match self {
-            Extension::V0_8_0(ext) => ext.call_llm_providers(store).await,
-            _ => Ok(Vec::new()),
-        }
-    }
-
-    pub async fn call_llm_provider_models(
-        &self,
-        store: &mut Store<WasmState>,
-        provider_id: &str,
-    ) -> Result<Result<Vec<latest::llm_provider::ModelInfo>, String>> {
-        match self {
-            Extension::V0_8_0(ext) => ext.call_llm_provider_models(store, provider_id).await,
-            _ => anyhow::bail!("`llm_provider_models` not available prior to v0.8.0"),
-        }
-    }
-
-    pub async fn call_llm_provider_settings_markdown(
-        &self,
-        store: &mut Store<WasmState>,
-        provider_id: &str,
-    ) -> Result<Option<String>> {
-        match self {
-            Extension::V0_8_0(ext) => {
-                ext.call_llm_provider_settings_markdown(store, provider_id)
-                    .await
-            }
-            _ => Ok(None),
-        }
-    }
-
-    pub async fn call_llm_provider_is_authenticated(
-        &self,
-        store: &mut Store<WasmState>,
-        provider_id: &str,
-    ) -> Result<bool> {
-        match self {
-            Extension::V0_8_0(ext) => {
-                ext.call_llm_provider_is_authenticated(store, provider_id)
-                    .await
-            }
-            _ => Ok(false),
-        }
-    }
-
-    pub async fn call_llm_provider_start_device_flow_sign_in(
-        &self,
-        store: &mut Store<WasmState>,
-        provider_id: &str,
-    ) -> Result<Result<LlmDeviceFlowPromptInfo, String>> {
-        match self {
-            Extension::V0_8_0(ext) => {
-                ext.call_llm_provider_start_device_flow_sign_in(store, provider_id)
-                    .await
-            }
-            _ => {
-                anyhow::bail!(
-                    "`llm_provider_start_device_flow_sign_in` not available prior to v0.8.0"
-                )
-            }
-        }
-    }
-
-    pub async fn call_llm_provider_poll_device_flow_sign_in(
-        &self,
-        store: &mut Store<WasmState>,
-        provider_id: &str,
-    ) -> Result<Result<(), String>> {
-        match self {
-            Extension::V0_8_0(ext) => {
-                ext.call_llm_provider_poll_device_flow_sign_in(store, provider_id)
-                    .await
-            }
-            _ => {
-                anyhow::bail!(
-                    "`llm_provider_poll_device_flow_sign_in` not available prior to v0.8.0"
-                )
-            }
-        }
-    }
-
-    pub async fn call_llm_provider_reset_credentials(
-        &self,
-        store: &mut Store<WasmState>,
-        provider_id: &str,
-    ) -> Result<Result<(), String>> {
-        match self {
-            Extension::V0_8_0(ext) => {
-                ext.call_llm_provider_reset_credentials(store, provider_id)
-                    .await
-            }
-            _ => anyhow::bail!("`llm_provider_reset_credentials` not available prior to v0.8.0"),
-        }
-    }
-
-    pub async fn call_llm_count_tokens(
-        &self,
-        store: &mut Store<WasmState>,
-        provider_id: &str,
-        model_id: &str,
-        request: &latest::llm_provider::CompletionRequest,
-    ) -> Result<Result<u64, String>> {
-        match self {
-            Extension::V0_8_0(ext) => {
-                ext.call_llm_count_tokens(store, provider_id, model_id, request)
-                    .await
-            }
-            _ => anyhow::bail!("`llm_count_tokens` not available prior to v0.8.0"),
-        }
-    }
-
-    pub async fn call_llm_stream_completion_start(
-        &self,
-        store: &mut Store<WasmState>,
-        provider_id: &str,
-        model_id: &str,
-        request: &latest::llm_provider::CompletionRequest,
-    ) -> Result<Result<String, String>> {
-        match self {
-            Extension::V0_8_0(ext) => {
-                ext.call_llm_stream_completion_start(store, provider_id, model_id, request)
-                    .await
-            }
-            _ => anyhow::bail!("`llm_stream_completion_start` not available prior to v0.8.0"),
-        }
-    }
-
-    pub async fn call_llm_stream_completion_next(
-        &self,
-        store: &mut Store<WasmState>,
-        stream_id: &str,
-    ) -> Result<Result<Option<latest::llm_provider::CompletionEvent>, String>> {
-        match self {
-            Extension::V0_8_0(ext) => ext.call_llm_stream_completion_next(store, stream_id).await,
-            _ => anyhow::bail!("`llm_stream_completion_next` not available prior to v0.8.0"),
-        }
-    }
-
-    pub async fn call_llm_stream_completion_close(
-        &self,
-        store: &mut Store<WasmState>,
-        stream_id: &str,
-    ) -> Result<()> {
-        match self {
-            Extension::V0_8_0(ext) => ext.call_llm_stream_completion_close(store, stream_id).await,
-            _ => anyhow::bail!("`llm_stream_completion_close` not available prior to v0.8.0"),
-        }
-    }
-
-    pub async fn call_llm_cache_configuration(
-        &self,
-        store: &mut Store<WasmState>,
-        provider_id: &str,
-        model_id: &str,
-    ) -> Result<Option<latest::llm_provider::CacheConfiguration>> {
-        match self {
-            Extension::V0_8_0(ext) => {
-                ext.call_llm_cache_configuration(store, provider_id, model_id)
-                    .await
-            }
-            _ => Ok(None),
-        }
-    }
 }

 trait ToWasmtimeResult<T> {
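The removed `call_llm_*` methods above all follow one version-gating shape: capability queries degrade gracefully on older extensions (empty list, `None`, `false`), while calls that cannot be faked bail with a "not available prior to v0.8.0" error. A compilable sketch of that dispatch (the enum and return types are simplified stand-ins):

```rust
// Simplified stand-in for the versioned extension enum in the diff.
#[derive(Debug)]
enum Extension {
    V0_8_0,
    V0_6_0,
}

impl Extension {
    // A query that degrades gracefully: older versions just report nothing.
    fn llm_providers(&self) -> Vec<String> {
        match self {
            Extension::V0_8_0 => vec!["example-provider".to_string()],
            _ => Vec::new(),
        }
    }

    // A call that cannot be faked: older versions return an error.
    fn llm_provider_models(&self) -> Result<Vec<String>, String> {
        match self {
            Extension::V0_8_0 => Ok(vec!["example-model".to_string()]),
            _ => Err("`llm_provider_models` not available prior to v0.8.0".to_string()),
        }
    }
}

fn main() {
    assert!(Extension::V0_6_0.llm_providers().is_empty());
    assert!(Extension::V0_8_0.llm_provider_models().is_ok());
    assert!(Extension::V0_6_0.llm_provider_models().is_err());
    println!("version gating behaves as in the removed methods");
}
```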
@@ -32,6 +32,8 @@ wasmtime::component::bindgen!({
     },
 });

+pub use self::zed::extension::*;
+
 mod settings {
     #![allow(dead_code)]
     include!(concat!(env!("OUT_DIR"), "/since_v0.6.0/settings.rs"));
@@ -1,19 +1,18 @@
-use crate::wasm_host::wit::since_v0_8_0::{
+use crate::wasm_host::wit::since_v0_6_0::{
     dap::{
         BuildTaskDefinition, BuildTaskDefinitionTemplatePayload, StartDebuggingRequestArguments,
         TcpArguments, TcpArgumentsTemplate,
     },
-    lsp::{CompletionKind, CompletionLabelDetails, InsertTextFormat, SymbolKind},
     slash_command::SlashCommandOutputSection,
 };
+use crate::wasm_host::wit::{CompletionKind, CompletionLabelDetails, InsertTextFormat, SymbolKind};
 use crate::wasm_host::{WasmState, wit::ToWasmtimeResult};
 use ::http_client::{AsyncBody, HttpRequestExt};
-use ::settings::{ModelMode, Settings, SettingsStore, WorktreeId};
+use ::settings::{Settings, WorktreeId};
 use anyhow::{Context as _, Result, bail};
 use async_compression::futures::bufread::GzipDecoder;
 use async_tar::Archive;
 use async_trait::async_trait;
-use credentials_provider::CredentialsProvider;
 use extension::{
     ExtensionLanguageServerProxy, KeyValueStoreDelegate, ProjectDelegate, WorktreeDelegate,
 };
@@ -23,14 +22,12 @@ use gpui::{BackgroundExecutor, SharedString};
 use language::{BinaryStatus, LanguageName, language_settings::AllLanguageSettings};
 use project::project_settings::ProjectSettings;
 use semver::Version;
-use smol::net::TcpListener;
 use std::{
-    env,
     net::Ipv4Addr,
     path::{Path, PathBuf},
     str::FromStr,
     sync::{Arc, OnceLock},
     time::Duration,
 };
 use task::{SpawnInTerminal, ZedDebugConfig};
 use url::Url;
@@ -618,19 +615,6 @@ impl http_client::Host for WasmState {
         .to_wasmtime_result()
     }

-    async fn fetch_fallible(
-        &mut self,
-        request: http_client::HttpRequest,
-    ) -> wasmtime::Result<Result<http_client::HttpResponseWithStatus, String>> {
-        maybe!(async {
-            let request = convert_request(&request)?;
-            let mut response = self.host.http_client.send(request).await?;
-            convert_response_with_status(&mut response).await
-        })
-        .await
-        .to_wasmtime_result()
-    }
-
     async fn fetch_stream(
         &mut self,
         request: http_client::HttpRequest,
@@ -734,26 +718,6 @@ async fn convert_response(
     Ok(extension_response)
 }

-async fn convert_response_with_status(
-    response: &mut ::http_client::Response<AsyncBody>,
-) -> anyhow::Result<http_client::HttpResponseWithStatus> {
-    let status = response.status().as_u16();
-    let headers: Vec<(String, String)> = response
-        .headers()
-        .iter()
-        .map(|(k, v)| (k.to_string(), v.to_str().unwrap_or("").to_string()))
-        .collect();
-
-    let mut body = Vec::new();
-    response.body_mut().read_to_end(&mut body).await?;
-
-    Ok(http_client::HttpResponseWithStatus {
-        status,
-        headers,
-        body,
-    })
-}
-
 impl nodejs::Host for WasmState {
     async fn node_binary_path(&mut self) -> wasmtime::Result<Result<String, String>> {
         self.host
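The `llm_provider::Host` implementation removed in the final hunk gates env-var reads on a per-key allow-list, with a legacy fallback: a variable is readable when the user allow-listed it, or when the extension is a legacy LLM extension and the variable is already set. The core predicate, extracted as a self-contained sketch (the function name is hypothetical):

```rust
// Env-var gate from the removed `get_credential`/`get_env_var` host functions:
// user opt-in wins; legacy extensions are auto-allowed only for variables that
// are actually set.
fn may_read_env_var(is_allowed: bool, is_legacy_extension: bool, env_var_is_set: bool) -> bool {
    is_allowed || (is_legacy_extension && env_var_is_set)
}

fn main() {
    assert!(may_read_env_var(true, false, false)); // explicit user opt-in
    assert!(may_read_env_var(false, true, true)); // legacy auto-allow
    assert!(!may_read_env_var(false, true, false)); // legacy but unset: denied
    assert!(!may_read_env_var(false, false, true)); // set but never allowed: denied
    println!("gate behaves as expected");
}
```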
@@ -1145,369 +1109,3 @@ impl ExtensionImports for WasmState {
         .to_wasmtime_result()
     }
 }
-
-impl llm_provider::Host for WasmState {
-    async fn get_credential(&mut self, provider_id: String) -> wasmtime::Result<Option<String>> {
-        let extension_id = self.manifest.id.clone();
-        let is_legacy_extension = crate::LEGACY_LLM_EXTENSION_IDS.contains(&extension_id.as_ref());
-
-        // Check if this provider has env vars configured and if the user has allowed any of them
-        let env_vars = self
-            .manifest
-            .language_model_providers
-            .get(&Arc::<str>::from(provider_id.as_str()))
-            .and_then(|entry| entry.auth.as_ref())
-            .and_then(|auth| auth.env_vars.clone());
-
-        if let Some(env_vars) = env_vars {
-            let full_provider_id = format!("{}:{}", extension_id, provider_id);
-
-            // Check each env var to see if it's allowed and set
-            for env_var_name in &env_vars {
-                let settings_key: Arc<str> =
-                    format!("{}:{}", full_provider_id, env_var_name).into();
-
-                // For legacy extensions, auto-allow if env var is set
-                let env_var_is_set = env::var(env_var_name)
-                    .map(|v| !v.is_empty())
-                    .unwrap_or(false);
-
-                let is_allowed = self
-                    .on_main_thread({
-                        let settings_key = settings_key.clone();
-                        move |cx| {
-                            async move {
-                                cx.update(|cx| {
-                                    crate::extension_settings::ExtensionSettings::get_global(cx)
-                                        .allowed_env_var_providers
-                                        .contains(&settings_key)
-                                })
-                            }
-                            .boxed_local()
-                        }
-                    })
-                    .await
-                    .unwrap_or(false);
-
-                if is_allowed || (is_legacy_extension && env_var_is_set) {
-                    if let Ok(value) = env::var(env_var_name) {
-                        if !value.is_empty() {
-                            return Ok(Some(value));
-                        }
-                    }
-                }
-            }
-        }
-
-        // Fall back to credential store
-        let credential_key = format!("extension-llm-{}:{}", extension_id, provider_id);
-
-        self.on_main_thread(move |cx| {
-            async move {
-                let credentials_provider = cx.update(|cx| <dyn CredentialsProvider>::global(cx))?;
-                let result = credentials_provider
-                    .read_credentials(&credential_key, cx)
-                    .await
-                    .ok()
-                    .flatten();
-                Ok(result.map(|(_, password)| String::from_utf8_lossy(&password).to_string()))
-            }
-            .boxed_local()
-        })
-        .await
-    }
-
-    async fn store_credential(
-        &mut self,
-        provider_id: String,
-        value: String,
-    ) -> wasmtime::Result<Result<(), String>> {
-        let extension_id = self.manifest.id.clone();
-        let credential_key = format!("extension-llm-{}:{}", extension_id, provider_id);
-
-        self.on_main_thread(move |cx| {
-            async move {
-                let credentials_provider = cx.update(|cx| <dyn CredentialsProvider>::global(cx))?;
-                credentials_provider
-                    .write_credentials(&credential_key, "api_key", value.as_bytes(), cx)
-                    .await
-                    .map_err(|e| anyhow::anyhow!("{}", e))
-            }
-            .boxed_local()
-        })
-        .await
-        .to_wasmtime_result()
-    }
-
-    async fn delete_credential(
-        &mut self,
-        provider_id: String,
-    ) -> wasmtime::Result<Result<(), String>> {
-        let extension_id = self.manifest.id.clone();
-        let credential_key = format!("extension-llm-{}:{}", extension_id, provider_id);
-
-        self.on_main_thread(move |cx| {
-            async move {
-                let credentials_provider = cx.update(|cx| <dyn CredentialsProvider>::global(cx))?;
-                credentials_provider
-                    .delete_credentials(&credential_key, cx)
-                    .await
-                    .map_err(|e| anyhow::anyhow!("{}", e))
-            }
-            .boxed_local()
-        })
-        .await
-        .to_wasmtime_result()
-    }
-
-    async fn get_env_var(&mut self, name: String) -> wasmtime::Result<Option<String>> {
-        let extension_id = self.manifest.id.clone();
-
-        // Find which provider (if any) declares this env var in its auth config
-        let mut allowed_provider_id: Option<Arc<str>> = None;
-        for (provider_id, provider_entry) in &self.manifest.language_model_providers {
-            if let Some(auth_config) = &provider_entry.auth {
-                if let Some(env_vars) = &auth_config.env_vars {
-                    if env_vars.iter().any(|v| v == &name) {
-                        allowed_provider_id = Some(provider_id.clone());
-                        break;
-                    }
-                }
-            }
-        }
-
-        // If no provider declares this env var, deny access
-        let Some(provider_id) = allowed_provider_id else {
-            log::warn!(
-                "Extension {} attempted to read env var {} which is not declared in any provider auth config",
-                extension_id,
-                name
-            );
-            return Ok(None);
-        };
-
-        // Check if the user has allowed this specific env var
-        let settings_key: Arc<str> = format!("{}:{}:{}", extension_id, provider_id, name).into();
-        let is_legacy_extension = crate::LEGACY_LLM_EXTENSION_IDS.contains(&extension_id.as_ref());
-
-        // For legacy extensions, auto-allow if env var is set
-        let env_var_is_set = env::var(&name).map(|v| !v.is_empty()).unwrap_or(false);
-
-        let is_allowed = self
-            .on_main_thread({
-                let settings_key = settings_key.clone();
-                move |cx| {
-                    async move {
-                        cx.update(|cx| {
-                            crate::extension_settings::ExtensionSettings::get_global(cx)
-                                .allowed_env_var_providers
-                                .contains(&settings_key)
-                        })
-                    }
-                    .boxed_local()
-                }
-            })
-            .await
-            .unwrap_or(false);
-
-        if !is_allowed && !(is_legacy_extension && env_var_is_set) {
-            log::debug!(
-                "Extension {} provider {} is not allowed to read env var {}",
-                extension_id,
-                provider_id,
-                name
-            );
-            return Ok(None);
-        }
-
-        Ok(env::var(&name).ok())
-    }
-
-    async fn oauth_start_web_auth(
-        &mut self,
-        config: llm_provider::OauthWebAuthConfig,
-    ) -> wasmtime::Result<Result<llm_provider::OauthWebAuthResult, String>> {
-        let auth_url = config.auth_url;
-        let callback_path = config.callback_path;
-        let timeout_secs = config.timeout_secs.unwrap_or(300);
-
-        self.on_main_thread(move |cx| {
-            async move {
-                // Bind to port 0 to let the OS assign an available port, then substitute
-                // it into the auth URL's {port} placeholder for the OAuth callback.
-                let listener = TcpListener::bind("127.0.0.1:0")
-                    .await
-                    .map_err(|e| anyhow::anyhow!("Failed to bind localhost server: {}", e))?;
-                let port = listener
-                    .local_addr()
-                    .map_err(|e| anyhow::anyhow!("Failed to get local address: {}", e))?
-                    .port();
-
-                let auth_url_with_port = auth_url.replace("{port}", &port.to_string());
-                cx.update(|cx| {
-                    cx.open_url(&auth_url_with_port);
-                })?;
let accept_future = async {
|
||||
let (mut stream, _) = listener
|
||||
.accept()
|
||||
.await
|
||||
.map_err(|e| anyhow::anyhow!("Failed to accept connection: {}", e))?;
|
||||
|
||||
let mut request_line = String::new();
|
||||
{
|
||||
let mut reader = smol::io::BufReader::new(&mut stream);
|
||||
smol::io::AsyncBufReadExt::read_line(&mut reader, &mut request_line)
|
||||
.await
|
||||
.map_err(|e| anyhow::anyhow!("Failed to read request: {}", e))?;
|
||||
}
|
||||
|
||||
let path = request_line
|
||||
.split_whitespace()
|
||||
.nth(1)
|
||||
.ok_or_else(|| anyhow::anyhow!("Malformed HTTP request"))?;
|
||||
|
||||
let callback_url = if path.starts_with(&callback_path)
|
||||
|| path.starts_with(&format!("/{}", callback_path.trim_start_matches('/')))
|
||||
{
|
||||
format!("http://localhost:{}{}", port, path)
|
||||
} else {
|
||||
return Err(anyhow::anyhow!("Unexpected callback path: {}", path));
|
||||
};
|
||||
|
||||
let response = format!(
|
||||
"HTTP/1.1 200 OK\r\nContent-Type: text/html\r\nConnection: close\r\n\r\n{}",
|
||||
include_str!("../oauth_callback_response.html")
|
||||
);
|
||||
|
||||
smol::io::AsyncWriteExt::write_all(&mut stream, response.as_bytes())
|
||||
.await
|
||||
.ok();
|
||||
smol::io::AsyncWriteExt::flush(&mut stream).await.ok();
|
||||
|
||||
Ok(callback_url)
|
||||
};
|
||||
|
||||
let timeout_duration = Duration::from_secs(timeout_secs as u64);
|
||||
let callback_url = smol::future::or(accept_future, async {
|
||||
smol::Timer::after(timeout_duration).await;
|
||||
Err(anyhow::anyhow!(
|
||||
"OAuth callback timed out after {} seconds",
|
||||
timeout_secs
|
||||
))
|
||||
})
|
||||
.await?;
|
||||
|
||||
Ok(llm_provider::OauthWebAuthResult {
|
||||
callback_url,
|
||||
port: port as u32,
|
||||
})
|
||||
}
|
||||
.boxed_local()
|
||||
})
|
||||
.await
|
||||
.to_wasmtime_result()
|
||||
}
|
||||
|
||||
async fn oauth_send_http_request(
|
||||
&mut self,
|
||||
request: http_client::HttpRequest,
|
||||
) -> wasmtime::Result<Result<http_client::HttpResponseWithStatus, String>> {
|
||||
maybe!(async {
|
||||
let request = convert_request(&request)?;
|
||||
let mut response = self.host.http_client.send(request).await?;
|
||||
convert_response_with_status(&mut response).await
|
||||
})
|
||||
.await
|
||||
.to_wasmtime_result()
|
||||
}
|
||||
|
||||
async fn oauth_open_browser(&mut self, url: String) -> wasmtime::Result<Result<(), String>> {
|
||||
self.on_main_thread(move |cx| {
|
||||
async move {
|
||||
cx.update(|cx| {
|
||||
cx.open_url(&url);
|
||||
})?;
|
||||
Ok(())
|
||||
}
|
||||
.boxed_local()
|
||||
})
|
||||
.await
|
||||
.to_wasmtime_result()
|
||||
}
|
||||
|
||||
async fn get_provider_settings(
|
||||
&mut self,
|
||||
provider_id: String,
|
||||
) -> wasmtime::Result<Option<llm_provider::ProviderSettings>> {
|
||||
let extension_id = self.manifest.id.clone();
|
||||
|
||||
let result = self
|
||||
.on_main_thread(move |cx| {
|
||||
async move {
|
||||
cx.update(|cx| {
|
||||
let settings_store = cx.global::<SettingsStore>();
|
||||
let user_settings = settings_store.raw_user_settings();
|
||||
let language_models =
|
||||
user_settings.and_then(|s| s.content.language_models.as_ref());
|
||||
|
||||
// Map provider IDs to their settings
|
||||
// The provider_id from the extension is just the provider part (e.g., "google-ai")
|
||||
// We need to match this to the appropriate settings
|
||||
match provider_id.as_str() {
|
||||
"google-ai" => {
|
||||
let google = language_models.and_then(|lm| lm.google.as_ref());
|
||||
let google = google?;
|
||||
|
||||
let api_url = google.api_url.clone().filter(|s| !s.is_empty());
|
||||
|
||||
let available_models = google
|
||||
.available_models
|
||||
.as_ref()
|
||||
.map(|models| {
|
||||
models
|
||||
.iter()
|
||||
.map(|m| {
|
||||
let thinking_budget = match &m.mode {
|
||||
Some(ModelMode::Thinking { budget_tokens }) => {
|
||||
*budget_tokens
|
||||
}
|
||||
_ => None,
|
||||
};
|
||||
llm_provider::CustomModelConfig {
|
||||
name: m.name.clone(),
|
||||
display_name: m.display_name.clone(),
|
||||
max_tokens: m.max_tokens,
|
||||
max_output_tokens: None,
|
||||
thinking_budget,
|
||||
}
|
||||
})
|
||||
.collect()
|
||||
})
|
||||
.unwrap_or_default();
|
||||
|
||||
Some(llm_provider::ProviderSettings {
|
||||
api_url,
|
||||
available_models,
|
||||
})
|
||||
}
|
||||
_ => {
|
||||
log::debug!(
|
||||
"Extension {} requested settings for unknown provider: {}",
|
||||
extension_id,
|
||||
provider_id
|
||||
);
|
||||
None
|
||||
}
|
||||
}
|
||||
})
|
||||
.ok()
|
||||
.flatten()
|
||||
}
|
||||
.boxed_local()
|
||||
})
|
||||
.await;
|
||||
|
||||
Ok(result)
|
||||
}
|
||||
}
|
||||
|
||||
@@ -442,9 +442,7 @@ impl ExtensionsPage {
         let extension_store = ExtensionStore::global(cx).read(cx);

         match extension_store.outstanding_operations().get(extension_id) {
-            Some(ExtensionOperation::Install) | Some(ExtensionOperation::AutoInstall) => {
-                ExtensionStatus::Installing
-            }
+            Some(ExtensionOperation::Install) => ExtensionStatus::Installing,
             Some(ExtensionOperation::Remove) => ExtensionStatus::Removing,
             Some(ExtensionOperation::Upgrade) => ExtensionStatus::Upgrading,
             None => match extension_store.installed_extensions().get(extension_id) {
@@ -296,20 +296,6 @@ impl TestAppContext {
         &self.text_system
     }

-    /// Simulates writing credentials to the platform keychain.
-    pub fn write_credentials(&self, url: &str, username: &str, password: &[u8]) {
-        let _ = self
-            .test_platform
-            .write_credentials(url, username, password);
-    }
-
-    /// Simulates reading credentials from the platform keychain.
-    pub fn read_credentials(&self, url: &str) -> Option<(String, Vec<u8>)> {
-        smol::block_on(self.test_platform.read_credentials(url))
-            .ok()
-            .flatten()
-    }
-
     /// Simulates writing to the platform clipboard
     pub fn write_to_clipboard(&self, item: ClipboardItem) {
         self.test_platform.write_to_clipboard(item)
@@ -71,6 +71,16 @@ struct StateInner {
     scroll_handler: Option<Box<dyn FnMut(&ListScrollEvent, &mut Window, &mut App)>>,
     scrollbar_drag_start_height: Option<Pixels>,
     measuring_behavior: ListMeasuringBehavior,
+    pending_scroll: Option<PendingScrollFraction>,
 }

+/// Keeps track of a fractional scroll position within an item for restoration
+/// after remeasurement.
+struct PendingScrollFraction {
+    /// The index of the item to scroll within.
+    item_ix: usize,
+    /// Fractional offset (0.0 to 1.0) within the item's height.
+    fraction: f32,
+}
+
 /// Whether the list is scrolling from top to bottom or bottom to top.
@@ -225,6 +235,7 @@ impl ListState {
             reset: false,
             scrollbar_drag_start_height: None,
             measuring_behavior: ListMeasuringBehavior::default(),
+            pending_scroll: None,
         })));
         this.splice(0..0, item_count);
         this
@@ -254,6 +265,45 @@ impl ListState {
         self.splice(0..old_count, element_count);
     }

+    /// Remeasure all items while preserving proportional scroll position.
+    ///
+    /// Use this when item heights may have changed (e.g., font size changes)
+    /// but the number and identity of items remains the same.
+    pub fn remeasure(&self) {
+        let state = &mut *self.0.borrow_mut();
+
+        let new_items = state.items.iter().map(|item| ListItem::Unmeasured {
+            focus_handle: item.focus_handle(),
+        });
+
+        // If there's a `logical_scroll_top`, we need to keep track of it as a
+        // `PendingScrollFraction`, so we can later preserve that scroll
+        // position proportionally to the item, in case the item's height
+        // changes.
+        if let Some(scroll_top) = state.logical_scroll_top {
+            let mut cursor = state.items.cursor::<Count>(());
+            cursor.seek(&Count(scroll_top.item_ix), Bias::Right);
+
+            if let Some(item) = cursor.item() {
+                if let Some(size) = item.size() {
+                    let fraction = if size.height.0 > 0.0 {
+                        (scroll_top.offset_in_item.0 / size.height.0).clamp(0.0, 1.0)
+                    } else {
+                        0.0
+                    };
+
+                    state.pending_scroll = Some(PendingScrollFraction {
+                        item_ix: scroll_top.item_ix,
+                        fraction,
+                    });
+                }
+            }
+        }
+
+        state.items = SumTree::from_iter(new_items, ());
+        state.measuring_behavior.reset();
+    }
+
     /// The number of items in this list.
     pub fn item_count(&self) -> usize {
         self.0.borrow().items.summary().count
@@ -644,6 +694,20 @@ impl StateInner {
             let mut element = render_item(item_index, window, cx);
             let element_size = element.layout_as_root(available_item_space, window, cx);
             size = Some(element_size);
+
+            // If there's a pending scroll adjustment for the scroll-top
+            // item, apply it, ensuring proportional scroll position is
+            // maintained after re-measuring.
+            if ix == 0 {
+                if let Some(pending_scroll) = self.pending_scroll.take() {
+                    if pending_scroll.item_ix == scroll_top.item_ix {
+                        scroll_top.offset_in_item =
+                            Pixels(pending_scroll.fraction * element_size.height.0);
+                        self.logical_scroll_top = Some(scroll_top);
+                    }
+                }
+            }

             if visible_height < available_height {
                 item_layouts.push_back(ItemLayout {
                     index: item_index,
@@ -1184,16 +1248,16 @@ impl sum_tree::SeekTarget<'_, ListItemSummary, ListItemSummary> for Height {
 mod test {

     use gpui::{ScrollDelta, ScrollWheelEvent};
+    use std::cell::Cell;
+    use std::rc::Rc;

-    use crate::{self as gpui, TestAppContext};
+    use crate::{
+        self as gpui, AppContext, Context, Element, IntoElement, ListState, Render, Styled,
+        TestAppContext, Window, div, list, point, px, size,
+    };

     #[gpui::test]
     fn test_reset_after_paint_before_scroll(cx: &mut TestAppContext) {
-        use crate::{
-            AppContext, Context, Element, IntoElement, ListState, Render, Styled, Window, div,
-            list, point, px, size,
-        };
-
         let cx = cx.add_empty_window();

         let state = ListState::new(5, crate::ListAlignment::Top, px(10.));
@@ -1237,11 +1301,6 @@ mod test {

     #[gpui::test]
     fn test_scroll_by_positive_and_negative_distance(cx: &mut TestAppContext) {
-        use crate::{
-            AppContext, Context, Element, IntoElement, ListState, Render, Styled, Window, div,
-            list, point, px, size,
-        };
-
         let cx = cx.add_empty_window();

         let state = ListState::new(5, crate::ListAlignment::Top, px(10.));
@@ -1284,4 +1343,68 @@ mod test {
         assert_eq!(offset.item_ix, 0);
         assert_eq!(offset.offset_in_item, px(0.));
     }
+
+    #[gpui::test]
+    fn test_remeasure(cx: &mut TestAppContext) {
+        let cx = cx.add_empty_window();
+
+        // Create a list with 10 items, each 100px tall. We'll keep a reference
+        // to the item height so we can later change the height and assert how
+        // `ListState` handles it.
+        let item_height = Rc::new(Cell::new(100usize));
+        let state = ListState::new(10, crate::ListAlignment::Top, px(10.));
+
+        struct TestView {
+            state: ListState,
+            item_height: Rc<Cell<usize>>,
+        }
+
+        impl Render for TestView {
+            fn render(&mut self, _: &mut Window, _: &mut Context<Self>) -> impl IntoElement {
+                let height = self.item_height.get();
+                list(self.state.clone(), move |_, _, _| {
+                    div().h(px(height as f32)).w_full().into_any()
+                })
+                .w_full()
+                .h_full()
+            }
+        }
+
+        let state_clone = state.clone();
+        let item_height_clone = item_height.clone();
+        let view = cx.update(|_, cx| {
+            cx.new(|_| TestView {
+                state: state_clone,
+                item_height: item_height_clone,
+            })
+        });
+
+        // Simulate scrolling 40px inside the element with index 2. Since the
+        // original item height is 100px, this equates to 40% inside the item.
+        state.scroll_to(gpui::ListOffset {
+            item_ix: 2,
+            offset_in_item: px(40.),
+        });
+
+        cx.draw(point(px(0.), px(0.)), size(px(100.), px(200.)), |_, _| {
+            view.clone()
+        });
+
+        let offset = state.logical_scroll_top();
+        assert_eq!(offset.item_ix, 2);
+        assert_eq!(offset.offset_in_item, px(40.));
+
+        // Update the `item_height` to be 50px instead of 100px so we can assert
+        // that the scroll position is proportionally preserved, that is,
+        // instead of 40px from the top of item 2, it should be 20px, since the
+        // item's height has been halved.
+        item_height.set(50);
+        state.remeasure();
+
+        cx.draw(point(px(0.), px(0.)), size(px(100.), px(200.)), |_, _| view);
+
+        let offset = state.logical_scroll_top();
+        assert_eq!(offset.item_ix, 2);
+        assert_eq!(offset.offset_in_item, px(20.));
+    }
 }
@@ -6,7 +6,7 @@ use crate::{
     TestDisplay, TestWindow, WindowAppearance, WindowParams, size,
 };
 use anyhow::Result;
-use collections::{HashMap, VecDeque};
+use collections::VecDeque;
 use futures::channel::oneshot;
 use parking_lot::Mutex;
 use std::{
@@ -32,7 +32,6 @@ pub(crate) struct TestPlatform {
     current_clipboard_item: Mutex<Option<ClipboardItem>>,
     #[cfg(any(target_os = "linux", target_os = "freebsd"))]
     current_primary_item: Mutex<Option<ClipboardItem>>,
-    credentials: Mutex<HashMap<String, (String, Vec<u8>)>>,
     #[cfg(target_os = "macos")]
     current_find_pasteboard_item: Mutex<Option<ClipboardItem>>,
     pub(crate) prompts: RefCell<TestPrompts>,
@@ -120,7 +119,6 @@ impl TestPlatform {
             current_clipboard_item: Mutex::new(None),
             #[cfg(any(target_os = "linux", target_os = "freebsd"))]
             current_primary_item: Mutex::new(None),
-            credentials: Mutex::new(HashMap::default()),
             #[cfg(target_os = "macos")]
             current_find_pasteboard_item: Mutex::new(None),
             weak: weak.clone(),
@@ -432,20 +430,15 @@ impl Platform for TestPlatform {
         *self.current_find_pasteboard_item.lock() = Some(item);
     }

-    fn write_credentials(&self, url: &str, username: &str, password: &[u8]) -> Task<Result<()>> {
-        self.credentials
-            .lock()
-            .insert(url.to_string(), (username.to_string(), password.to_vec()));
+    fn write_credentials(&self, _url: &str, _username: &str, _password: &[u8]) -> Task<Result<()>> {
         Task::ready(Ok(()))
     }

-    fn read_credentials(&self, url: &str) -> Task<Result<Option<(String, Vec<u8>)>>> {
-        let result = self.credentials.lock().get(url).cloned();
-        Task::ready(Ok(result))
+    fn read_credentials(&self, _url: &str) -> Task<Result<Option<(String, Vec<u8>)>>> {
+        Task::ready(Ok(None))
     }

-    fn delete_credentials(&self, url: &str) -> Task<Result<()>> {
-        self.credentials.lock().remove(url);
+    fn delete_credentials(&self, _url: &str) -> Task<Result<()>> {
         Task::ready(Ok(()))
     }
@@ -1490,23 +1490,19 @@ impl Buffer {
         let (tx, rx) = futures::channel::oneshot::channel();
         let prev_version = self.text.version();
         self.reload_task = Some(cx.spawn(async move |this, cx| {
-            let Some((new_mtime, load_bytes_task, encoding)) = this.update(cx, |this, cx| {
+            let Some((new_mtime, new_text)) = this.update(cx, |this, cx| {
                 let file = this.file.as_ref()?.as_local()?;
-                Some((
-                    file.disk_state().mtime(),
-                    file.load_bytes(cx),
-                    this.encoding,
-                ))
+                Some((file.disk_state().mtime(), file.load(cx)))
             })?
             else {
                 return Ok(());
             };

-            let bytes = load_bytes_task.await?;
-            let (cow, _encoding_used, _has_errors) = encoding.decode(&bytes);
-            let new_text = cow.into_owned();
-
-            let diff = this.update(cx, |this, cx| this.diff(new_text, cx))?.await;
+            let new_text = new_text.await?;
+            let diff = this
+                .update(cx, |this, cx| this.diff(new_text.clone(), cx))?
+                .await;
             this.update(cx, |this, cx| {
                 if this.version() == diff.base_version {
                     this.finalize_last_transaction();
@@ -228,6 +228,10 @@ impl ApiKeyState {
 }

 impl ApiKey {
+    pub fn key(&self) -> &str {
+        &self.key
+    }
+
     pub fn from_env(env_var_name: SharedString, key: &str) -> Self {
         Self {
             source: ApiKeySource::EnvVar(env_var_name),
@@ -235,6 +239,16 @@ impl ApiKey {
         }
     }

+    pub async fn load_from_system_keychain(
+        url: &str,
+        credentials_provider: &dyn CredentialsProvider,
+        cx: &AsyncApp,
+    ) -> Result<Self, AuthenticateError> {
+        Self::load_from_system_keychain_impl(url, credentials_provider, cx)
+            .await
+            .into_authenticate_result()
+    }
+
     async fn load_from_system_keychain_impl(
         url: &str,
         credentials_provider: &dyn CredentialsProvider,
@@ -818,11 +818,6 @@ pub trait LanguageModelProvider: 'static {
     fn icon(&self) -> IconOrSvg {
         IconOrSvg::default()
     }
-    /// Returns the path to an external SVG icon for this provider, if any.
-    /// When present, this takes precedence over `icon()`.
-    fn icon_path(&self) -> Option<SharedString> {
-        None
-    }
     fn default_model(&self, cx: &App) -> Option<Arc<dyn LanguageModel>>;
     fn default_fast_model(&self, cx: &App) -> Option<Arc<dyn LanguageModel>>;
     fn provided_models(&self, cx: &App) -> Vec<Arc<dyn LanguageModel>>;
@@ -844,7 +839,6 @@ pub trait LanguageModelProvider: 'static {
 pub enum ConfigurationViewTargetAgent {
     #[default]
     ZedAgent,
-    EditPrediction,
     Other(SharedString),
 }
@@ -492,7 +492,6 @@ mod tests {
         registry.update(cx, |registry, cx| {
             registry.register_provider(provider.clone(), cx);

-            // Set up a hiding function that hides the fake provider when "fake-extension" is installed
             registry.set_builtin_provider_hiding_fn(Box::new(|id| {
                 if id == "fake" {
                     Some("fake-extension")
@@ -502,21 +501,17 @@ mod tests {
             }));
         });

-        // Provider should be visible initially
         let visible = registry.read(cx).visible_providers();
         assert_eq!(visible.len(), 1);
         assert_eq!(visible[0].id(), provider_id);

-        // Install the extension
         registry.update(cx, |registry, cx| {
             registry.extension_installed("fake-extension".into(), cx);
         });

-        // Provider should now be hidden
         let visible = registry.read(cx).visible_providers();
         assert!(visible.is_empty());

-        // But still in providers()
         let all = registry.read(cx).providers();
         assert_eq!(all.len(), 1);
     }
@@ -531,7 +526,6 @@ mod tests {
         registry.update(cx, |registry, cx| {
             registry.register_provider(provider.clone(), cx);

-            // Set up hiding function
             registry.set_builtin_provider_hiding_fn(Box::new(|id| {
                 if id == "fake" {
                     Some("fake-extension")
@@ -540,20 +534,16 @@ mod tests {
             }
         }));

-            // Start with extension installed
             registry.extension_installed("fake-extension".into(), cx);
         });

-        // Provider should be hidden
         let visible = registry.read(cx).visible_providers();
         assert!(visible.is_empty());

-        // Uninstall the extension
         registry.update(cx, |registry, cx| {
             registry.extension_uninstalled("fake-extension", cx);
         });

-        // Provider should now be visible again
         let visible = registry.read(cx).visible_providers();
         assert_eq!(visible.len(), 1);
         assert_eq!(visible[0].id(), provider_id);
@@ -564,7 +554,6 @@ mod tests {
         let registry = cx.new(|_| LanguageModelRegistry::default());

         registry.update(cx, |registry, cx| {
-            // Set up hiding function
             registry.set_builtin_provider_hiding_fn(Box::new(|id| {
                 if id == "anthropic" {
                     Some("anthropic")
@@ -575,19 +564,15 @@ mod tests {
             }
         }));

-            // Install only anthropic extension
             registry.extension_installed("anthropic".into(), cx);
         });

         let registry_read = registry.read(cx);

-        // Anthropic should be hidden
         assert!(registry_read.should_hide_provider(&LanguageModelProviderId("anthropic".into())));

-        // OpenAI should not be hidden (extension not installed)
         assert!(!registry_read.should_hide_provider(&LanguageModelProviderId("openai".into())));

-        // Unknown provider should not be hidden
         assert!(!registry_read.should_hide_provider(&LanguageModelProviderId("unknown".into())));
     }
@@ -609,7 +594,6 @@ mod tests {
             }));
         });

-        // Sync with a set containing the extension
         let mut extension_ids = HashSet::default();
         extension_ids.insert(Arc::from("fake-extension"));

@@ -617,15 +601,12 @@ mod tests {
             registry.sync_installed_llm_extensions(extension_ids, cx);
         });

-        // Provider should be hidden
         assert!(registry.read(cx).visible_providers().is_empty());

-        // Sync with empty set
         registry.update(cx, |registry, cx| {
             registry.sync_installed_llm_extensions(HashSet::default(), cx);
         });

-        // Provider should be visible again
         assert_eq!(registry.read(cx).visible_providers().len(), 1);
     }
 }
@@ -8,7 +8,6 @@ use gpui::{
     App, AppContext as _, DevicePixels, Image, ImageFormat, ObjectFit, SharedString, Size, Task,
     point, px, size,
 };
-use image::GenericImageView as _;
 use image::codecs::png::PngEncoder;
 use serde::{Deserialize, Serialize};
 use util::ResultExt;
@@ -81,16 +80,6 @@ impl std::fmt::Debug for LanguageModelImage {
 /// Anthropic wants uploaded images to be smaller than this in both dimensions.
 const ANTHROPIC_SIZE_LIMIT: f32 = 1568.;

-/// Default per-image hard limit (in bytes) for the encoded image payload we send upstream.
-///
-/// NOTE: `LanguageModelImage.source` is base64-encoded PNG bytes (without the `data:` prefix).
-/// This limit is enforced on the encoded PNG bytes *before* base64 encoding.
-const DEFAULT_IMAGE_MAX_BYTES: usize = 5 * 1024 * 1024;
-
-/// Conservative cap on how many times we'll attempt to shrink/re-encode an image to fit
-/// `DEFAULT_IMAGE_MAX_BYTES`.
-const MAX_IMAGE_DOWNSCALE_PASSES: usize = 8;
-
 impl LanguageModelImage {
     pub fn empty() -> Self {
         Self {
@@ -123,62 +112,29 @@ impl LanguageModelImage {
         let height = dynamic_image.height();
         let image_size = size(DevicePixels(width as i32), DevicePixels(height as i32));

-        // First apply any provider-specific dimension constraints we know about (Anthropic).
-        let mut processed_image = if image_size.width.0 > ANTHROPIC_SIZE_LIMIT as i32
-            || image_size.height.0 > ANTHROPIC_SIZE_LIMIT as i32
-        {
-            let new_bounds = ObjectFit::ScaleDown.get_bounds(
-                gpui::Bounds {
-                    origin: point(px(0.0), px(0.0)),
-                    size: size(px(ANTHROPIC_SIZE_LIMIT), px(ANTHROPIC_SIZE_LIMIT)),
-                },
-                image_size,
-            );
-            dynamic_image.resize(
-                new_bounds.size.width.into(),
-                new_bounds.size.height.into(),
-                image::imageops::FilterType::Triangle,
-            )
-        } else {
-            dynamic_image
-        };
-
-        // Then enforce a default per-image size cap on the encoded PNG bytes.
-        //
-        // We always send PNG bytes (either original PNG bytes, or re-encoded PNG) base64'd.
-        // The upstream provider limit we want to respect is effectively on the binary image
-        // payload size, so we enforce against the encoded PNG bytes before base64 encoding.
-        let mut encoded_png = encode_png_bytes(&processed_image).log_err()?;
-        for _pass in 0..MAX_IMAGE_DOWNSCALE_PASSES {
-            if encoded_png.len() <= DEFAULT_IMAGE_MAX_BYTES {
-                break;
-            }
-
-            // Scale down geometrically to converge quickly. We don't know the final PNG size
-            // as a function of pixels, so we iteratively shrink.
-            let (w, h) = processed_image.dimensions();
-            if w <= 1 || h <= 1 {
-                break;
-            }
-
-            // Shrink by ~15% each pass (0.85). This is a compromise between speed and
-            // preserving image detail.
-            let new_w = ((w as f32) * 0.85).round().max(1.0) as u32;
-            let new_h = ((h as f32) * 0.85).round().max(1.0) as u32;
-
-            processed_image =
-                processed_image.resize(new_w, new_h, image::imageops::FilterType::Triangle);
-            encoded_png = encode_png_bytes(&processed_image).log_err()?;
-        }
-
-        if encoded_png.len() > DEFAULT_IMAGE_MAX_BYTES {
-            // Still too large after multiple passes; treat as non-convertible for now.
-            // (Provider-specific handling can be introduced later.)
-            return None;
-        }
-
-        // Now base64 encode the PNG bytes.
-        let base64_image = encode_bytes_as_base64(encoded_png.as_slice()).log_err()?;
+        let base64_image = {
+            if image_size.width.0 > ANTHROPIC_SIZE_LIMIT as i32
+                || image_size.height.0 > ANTHROPIC_SIZE_LIMIT as i32
+            {
+                let new_bounds = ObjectFit::ScaleDown.get_bounds(
+                    gpui::Bounds {
+                        origin: point(px(0.0), px(0.0)),
+                        size: size(px(ANTHROPIC_SIZE_LIMIT), px(ANTHROPIC_SIZE_LIMIT)),
+                    },
+                    image_size,
+                );
+                let resized_image = dynamic_image.resize(
+                    new_bounds.size.width.into(),
+                    new_bounds.size.height.into(),
+                    image::imageops::FilterType::Triangle,
+                );
+
+                encode_as_base64(data, resized_image)
+            } else {
+                encode_as_base64(data, dynamic_image)
+            }
+        }
+        .log_err()?;

         // SAFETY: The base64 encoder should not produce non-UTF8.
         let source = unsafe { String::from_utf8_unchecked(base64_image) };
@@ -208,20 +164,21 @@ impl LanguageModelImage {
     }
 }

-fn encode_png_bytes(image: &image::DynamicImage) -> Result<Vec<u8>> {
-    let mut png = Vec::new();
-    image.write_with_encoder(PngEncoder::new(&mut png))?;
-    Ok(png)
-}
-
-fn encode_bytes_as_base64(bytes: &[u8]) -> Result<Vec<u8>> {
+fn encode_as_base64(data: Arc<Image>, image: image::DynamicImage) -> Result<Vec<u8>> {
     let mut base64_image = Vec::new();
     {
         let mut base64_encoder = EncoderWriter::new(
             Cursor::new(&mut base64_image),
             &base64::engine::general_purpose::STANDARD,
         );
-        base64_encoder.write_all(bytes)?;
+        if data.format() == ImageFormat::Png {
+            base64_encoder.write_all(data.bytes())?;
+        } else {
+            let mut png = Vec::new();
+            image.write_with_encoder(PngEncoder::new(&mut png))?;
+
+            base64_encoder.write_all(png.as_slice())?;
+        }
     }
     Ok(base64_image)
 }
@@ -460,71 +417,6 @@ pub struct LanguageModelResponseMessage {
 #[cfg(test)]
 mod tests {
     use super::*;
-    use base64::Engine as _;
-    use gpui::TestAppContext;
-    use image::ImageDecoder as _;
-
-    fn base64_to_png_bytes(base64_png: &str) -> Vec<u8> {
-        base64::engine::general_purpose::STANDARD
-            .decode(base64_png.as_bytes())
-            .expect("base64 should decode")
-    }
-
-    fn png_dimensions(png_bytes: &[u8]) -> (u32, u32) {
-        let decoder =
-            image::codecs::png::PngDecoder::new(Cursor::new(png_bytes)).expect("png should decode");
-        decoder.dimensions()
-    }
-
-    fn make_noisy_png_bytes(width: u32, height: u32) -> Vec<u8> {
-        // Create an RGBA image with per-pixel variance to avoid PNG compressing too well.
-        let mut img = image::RgbaImage::new(width, height);
-        for y in 0..height {
-            for x in 0..width {
-                let r = ((x ^ y) & 0xFF) as u8;
-                let g = ((x.wrapping_mul(31) ^ y.wrapping_mul(17)) & 0xFF) as u8;
-                let b = ((x.wrapping_mul(131) ^ y.wrapping_mul(7)) & 0xFF) as u8;
-                img.put_pixel(x, y, image::Rgba([r, g, b, 0xFF]));
-            }
-        }
-
-        let mut out = Vec::new();
-        image::DynamicImage::ImageRgba8(img)
-            .write_with_encoder(PngEncoder::new(&mut out))
-            .expect("png encoding should succeed");
-        out
-    }
-
-    #[gpui::test]
-    async fn test_from_image_downscales_to_default_5mb_limit(cx: &mut TestAppContext) {
-        // Pick a size that reliably produces a PNG > 5MB when filled with noise.
-        // If this fails (image is too small), bump dimensions.
-        let original_png = make_noisy_png_bytes(4096, 4096);
-        assert!(
-            original_png.len() > DEFAULT_IMAGE_MAX_BYTES,
-            "precondition failed: noisy PNG must exceed DEFAULT_IMAGE_MAX_BYTES"
-        );
-
-        let image = gpui::Image::from_bytes(ImageFormat::Png, original_png);
-        let lm_image = cx
-            .update(|cx| LanguageModelImage::from_image(Arc::new(image), cx))
-            .await
-            .expect("image conversion should succeed");
-
-        let encoded_png = base64_to_png_bytes(lm_image.source.as_ref());
-        assert!(
-            encoded_png.len() <= DEFAULT_IMAGE_MAX_BYTES,
-            "expected encoded PNG <= DEFAULT_IMAGE_MAX_BYTES, got {} bytes",
-            encoded_png.len()
-        );
-
-        // Ensure we actually downscaled in pixels (not just re-encoded).
-        let (w, h) = png_dimensions(&encoded_png);
-        assert!(
-            w < 4096 || h < 4096,
-            "expected image to be downscaled in at least one dimension; got {w}x{h}"
-        );
-    }
-
     #[test]
     fn test_language_model_tool_result_content_deserialization() {
@@ -1,7 +1,7 @@
use ::extension::{
use collections::HashMap;
use extension::{
    ExtensionHostProxy, ExtensionLanguageModelProviderProxy, LanguageModelProviderRegistration,
};
use collections::HashMap;
use gpui::{App, Entity};
use language_model::{LanguageModelProviderId, LanguageModelRegistry};
use std::sync::{Arc, LazyLock};
@@ -59,7 +59,6 @@ pub fn init_proxy(cx: &mut App) {
    let proxy = ExtensionHostProxy::default_global(cx);
    let registry = LanguageModelRegistry::global(cx);

    // Set the function that determines which built-in providers should be hidden
    registry.update(cx, |registry, _cx| {
        registry.set_builtin_provider_hiding_fn(Box::new(extension_for_builtin_provider));
    });

@@ -1,43 +0,0 @@
use anyhow::Result;
use credentials_provider::CredentialsProvider;
use gpui::{App, Task};

const GEMINI_API_KEY_VAR_NAME: &str = "GEMINI_API_KEY";
const GOOGLE_AI_API_KEY_VAR_NAME: &str = "GOOGLE_AI_API_KEY";
const GOOGLE_AI_EXTENSION_CREDENTIAL_KEY: &str = "extension-llm-google-ai:google-ai";

/// Returns the Google AI API key for use by the Gemini CLI.
///
/// This function checks the following sources in order:
/// 1. `GEMINI_API_KEY` environment variable
/// 2. `GOOGLE_AI_API_KEY` environment variable
/// 3. Extension credential store (`extension-llm-google-ai:google-ai`)
pub fn api_key_for_gemini_cli(cx: &mut App) -> Task<Result<String>> {
    if let Ok(key) = std::env::var(GEMINI_API_KEY_VAR_NAME) {
        if !key.is_empty() {
            return Task::ready(Ok(key));
        }
    }

    if let Ok(key) = std::env::var(GOOGLE_AI_API_KEY_VAR_NAME) {
        if !key.is_empty() {
            return Task::ready(Ok(key));
        }
    }

    let credentials_provider = <dyn CredentialsProvider>::global(cx);

    cx.spawn(async move |cx| {
        let credential = credentials_provider
            .read_credentials(GOOGLE_AI_EXTENSION_CREDENTIAL_KEY, &cx)
            .await?;

        match credential {
            Some((_, key_bytes)) => {
                let key = String::from_utf8(key_bytes)?;
                Ok(key)
            }
            None => Err(anyhow::anyhow!("No Google AI API key found")),
        }
    })
}
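The deleted helper above resolves a key through an ordered fallback: two environment variables, then a credential store. That order can be sketched in isolation; `resolve_api_key` and `credential_store_lookup` are hypothetical stand-ins for illustration, not the Zed API (the real store lookup is asynchronous via `CredentialsProvider`):

```rust
use std::env;

// Hypothetical stand-in for the extension credential store lookup.
fn credential_store_lookup() -> Option<String> {
    None
}

// Check each environment variable in priority order; a set-but-empty variable
// is skipped, mirroring the `!key.is_empty()` guards in the diff above.
// Fall back to the credential store when no variable matches.
fn resolve_api_key(vars: &[&str]) -> Option<String> {
    for var in vars {
        if let Ok(key) = env::var(var) {
            if !key.is_empty() {
                return Some(key);
            }
        }
    }
    credential_store_lookup()
}

fn main() {
    // With neither demo variable set, the lookup falls through to the store.
    let key = resolve_api_key(&["GEMINI_API_KEY_DEMO", "GOOGLE_AI_API_KEY_DEMO"]);
    assert_eq!(key, None);
}
```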
@@ -8,12 +8,11 @@ use language_model::{LanguageModelProviderId, LanguageModelRegistry};
use provider::deepseek::DeepSeekLanguageModelProvider;

pub mod extension;
mod google_ai_api_key;
pub mod provider;
mod settings;

pub use crate::extension::init_proxy as init_extension_proxy;
pub use crate::google_ai_api_key::api_key_for_gemini_cli;

use crate::provider::anthropic::AnthropicLanguageModelProvider;
use crate::provider::bedrock::BedrockLanguageModelProvider;
use crate::provider::cloud::CloudLanguageModelProvider;
@@ -39,41 +38,36 @@ pub fn init(user_store: Entity<UserStore>, client: Arc<Client>, cx: &mut App) {
    if let Some(extension_store) = extension_host::ExtensionStore::try_global(cx) {
        cx.subscribe(&extension_store, {
            let registry = registry.clone();
            move |extension_store, event, cx| {
                match event {
                    extension_host::Event::ExtensionInstalled(extension_id) => {
                        // Check if this extension has language_model_providers
                        if let Some(manifest) = extension_store
                            .read(cx)
                            .extension_manifest_for_id(extension_id)
                        {
                            if !manifest.language_model_providers.is_empty() {
                                registry.update(cx, |registry, cx| {
                                    registry.extension_installed(extension_id.clone(), cx);
                                });
                            }
            move |extension_store, event, cx| match event {
                extension_host::Event::ExtensionInstalled(extension_id) => {
                    if let Some(manifest) = extension_store
                        .read(cx)
                        .extension_manifest_for_id(extension_id)
                    {
                        if !manifest.language_model_providers.is_empty() {
                            registry.update(cx, |registry, cx| {
                                registry.extension_installed(extension_id.clone(), cx);
                            });
                        }
                    }
                }
                extension_host::Event::ExtensionUninstalled(extension_id) => {
                    registry.update(cx, |registry, cx| {
                        registry.extension_uninstalled(extension_id, cx);
                    });
                }
                extension_host::Event::ExtensionsUpdated => {
                    // Re-sync installed extensions on bulk updates
                    let mut new_ids = HashSet::default();
                    for (extension_id, entry) in extension_store.read(cx).installed_extensions()
                    {
                        if !entry.manifest.language_model_providers.is_empty() {
                            new_ids.insert(extension_id.clone());
                        }
                    }
                    registry.update(cx, |registry, cx| {
                        registry.sync_installed_llm_extensions(new_ids, cx);
                    });
                }
                _ => {}
            }
                extension_host::Event::ExtensionUninstalled(extension_id) => {
                    registry.update(cx, |registry, cx| {
                        registry.extension_uninstalled(extension_id, cx);
                    });
                }
                extension_host::Event::ExtensionsUpdated => {
                    let mut new_ids = HashSet::default();
                    for (extension_id, entry) in extension_store.read(cx).installed_extensions() {
                        if !entry.manifest.language_model_providers.is_empty() {
                            new_ids.insert(extension_id.clone());
                        }
                    }
                    registry.update(cx, |registry, cx| {
                        registry.sync_installed_llm_extensions(new_ids, cx);
                    });
                }
                _ => {}
            }
        })
        .detach();

@@ -1104,7 +1104,6 @@ impl Render for ConfigurationView {
            .on_action(cx.listener(Self::save_api_key))
            .child(Label::new(format!("To use {}, you need to add an API key. Follow these steps:", match &self.target_agent {
                ConfigurationViewTargetAgent::ZedAgent => "Zed's agent with Anthropic".into(),
                ConfigurationViewTargetAgent::EditPrediction => "Anthropic for edit predictions".into(),
                ConfigurationViewTargetAgent::Other(agent) => agent.clone(),
            })))
            .child(

@@ -1,5 +1,6 @@
use anyhow::{Context as _, Result, anyhow};
use collections::BTreeMap;
use credentials_provider::CredentialsProvider;
use futures::{FutureExt, Stream, StreamExt, future, future::BoxFuture};
use google_ai::{
    FunctionDeclaration, GenerateContentResponse, GoogleModelMode, Part, SystemInstruction,
@@ -31,7 +32,7 @@ use ui::{ButtonLink, ConfiguredApiCard, List, ListBulletItem, prelude::*};
use ui_input::InputField;
use util::ResultExt;

use language_model::ApiKeyState;
use language_model::{ApiKey, ApiKeyState};

const PROVIDER_ID: LanguageModelProviderId = language_model::GOOGLE_PROVIDER_ID;
const PROVIDER_NAME: LanguageModelProviderName = language_model::GOOGLE_PROVIDER_NAME;
@@ -116,6 +117,22 @@ impl GoogleLanguageModelProvider {
        })
    }

    pub fn api_key_for_gemini_cli(cx: &mut App) -> Task<Result<String>> {
        if let Some(key) = API_KEY_ENV_VAR.value.clone() {
            return Task::ready(Ok(key));
        }
        let credentials_provider = <dyn CredentialsProvider>::global(cx);
        let api_url = Self::api_url(cx).to_string();
        cx.spawn(async move |cx| {
            Ok(
                ApiKey::load_from_system_keychain(&api_url, credentials_provider.as_ref(), cx)
                    .await?
                    .key()
                    .to_string(),
            )
        })
    }

    fn settings(cx: &App) -> &GoogleSettings {
        &crate::AllLanguageModelSettings::get_global(cx).google
    }
@@ -690,7 +707,7 @@ pub fn count_google_tokens(
            })
            .collect::<Vec<_>>();

        // Tiktoken doesn't support these models, so we manually use the
        // Tiktoken doesn't yet support these models, so we manually use the
        // same tokenizer as GPT-4.
        tiktoken_rs::num_tokens_from_messages("gpt-4", &messages).map(|tokens| tokens as u64)
    })
@@ -841,7 +858,6 @@ impl Render for ConfigurationView {
            .on_action(cx.listener(Self::save_api_key))
            .child(Label::new(format!("To use {}, you need to add an API key. Follow these steps:", match &self.target_agent {
                ConfigurationViewTargetAgent::ZedAgent => "Zed's agent with Google AI".into(),
                ConfigurationViewTargetAgent::EditPrediction => "Google AI for edit predictions".into(),
                ConfigurationViewTargetAgent::Other(agent) => agent.clone(),
            })))
            .child(

@@ -4204,30 +4204,10 @@ impl Repository {
        &mut self,
        entries: Vec<RepoPath>,
        cx: &mut Context<Self>,
    ) -> Task<anyhow::Result<()>> {
        self.stage_or_unstage_entries(true, entries, cx)
    }

    pub fn unstage_entries(
        &mut self,
        entries: Vec<RepoPath>,
        cx: &mut Context<Self>,
    ) -> Task<anyhow::Result<()>> {
        self.stage_or_unstage_entries(false, entries, cx)
    }

    fn stage_or_unstage_entries(
        &mut self,
        stage: bool,
        entries: Vec<RepoPath>,
        cx: &mut Context<Self>,
    ) -> Task<anyhow::Result<()>> {
        if entries.is_empty() {
            return Task::ready(Ok(()));
        }
        let Some(git_store) = self.git_store.upgrade() else {
            return Task::ready(Ok(()));
        };
        let id = self.id;
        let save_tasks = self.save_buffers(&entries, cx);
        let paths = entries
@@ -4235,164 +4215,113 @@ impl Repository {
            .map(|p| p.as_unix_str())
            .collect::<Vec<_>>()
            .join(" ");
        let status = if stage {
            format!("git add {paths}")
        } else {
            format!("git reset {paths}")
        };
        let status = format!("git add {paths}");
        let job_key = GitJobKey::WriteIndex(entries.clone());

        self.spawn_job_with_tracking(
            entries.clone(),
            if stage {
                pending_op::GitStatus::Staged
            } else {
                pending_op::GitStatus::Unstaged
            },
            pending_op::GitStatus::Staged,
            cx,
            async move |this, cx| {
                for save_task in save_tasks {
                    save_task.await?;
                }

                this.update(cx, |this, cx| {
                    let weak_this = cx.weak_entity();
                this.update(cx, |this, _| {
                    this.send_keyed_job(
                        Some(job_key),
                        Some(status.into()),
                        move |git_repo, mut cx| async move {
                            let hunk_staging_operation_counts = weak_this
                                .update(&mut cx, |this, cx| {
                                    let mut hunk_staging_operation_counts = HashMap::default();
                                    for path in &entries {
                                        let Some(project_path) =
                                            this.repo_path_to_project_path(path, cx)
                                        else {
                                            continue;
                                        };
                                        let Some(buffer) = git_store
                                            .read(cx)
                                            .buffer_store
                                            .read(cx)
                                            .get_by_path(&project_path)
                                        else {
                                            continue;
                                        };
                                        let Some(diff_state) = git_store
                                            .read(cx)
                                            .diffs
                                            .get(&buffer.read(cx).remote_id())
                                            .cloned()
                                        else {
                                            continue;
                                        };
                                        let Some(uncommitted_diff) =
                                            diff_state.read(cx).uncommitted_diff.as_ref().and_then(
                                                |uncommitted_diff| uncommitted_diff.upgrade(),
                                            )
                                        else {
                                            continue;
                                        };
                                        let buffer_snapshot = buffer.read(cx).text_snapshot();
                                        let file_exists = buffer
                                            .read(cx)
                                            .file()
                                            .is_some_and(|file| file.disk_state().exists());
                                        let hunk_staging_operation_count =
                                            diff_state.update(cx, |diff_state, cx| {
                                                uncommitted_diff.update(
                                                    cx,
                                                    |uncommitted_diff, cx| {
                                                        uncommitted_diff
                                                            .stage_or_unstage_all_hunks(
                                                                stage,
                                                                &buffer_snapshot,
                                                                file_exists,
                                                                cx,
                                                            );
                                                    },
                                                );

                                                diff_state.hunk_staging_operation_count += 1;
                                                diff_state.hunk_staging_operation_count
                                            });
                                        hunk_staging_operation_counts.insert(
                                            diff_state.downgrade(),
                                            hunk_staging_operation_count,
                                        );
                                    }
                                    hunk_staging_operation_counts
                                })
                                .unwrap_or_default();

                            let result = match git_repo {
                        move |git_repo, _cx| async move {
                            match git_repo {
                                RepositoryState::Local(LocalRepositoryState {
                                    backend,
                                    environment,
                                    ..
                                }) => {
                                    if stage {
                                        backend.stage_paths(entries, environment.clone()).await
                                    } else {
                                        backend.unstage_paths(entries, environment.clone()).await
                                    }
                                }
                                }) => backend.stage_paths(entries, environment.clone()).await,
                                RepositoryState::Remote(RemoteRepositoryState {
                                    project_id,
                                    client,
                                }) => {
                                    if stage {
                                        client
                                            .request(proto::Stage {
                                                project_id: project_id.0,
                                                repository_id: id.to_proto(),
                                                paths: entries
                                                    .into_iter()
                                                    .map(|repo_path| repo_path.to_proto())
                                                    .collect(),
                                            })
                                            .await
                                            .context("sending stage request")
                                            .map(|_| ())
                                    } else {
                                        client
                                            .request(proto::Unstage {
                                                project_id: project_id.0,
                                                repository_id: id.to_proto(),
                                                paths: entries
                                                    .into_iter()
                                                    .map(|repo_path| repo_path.to_proto())
                                                    .collect(),
                                            })
                                            .await
                                            .context("sending unstage request")
                                            .map(|_| ())
                                    }
                                    client
                                        .request(proto::Stage {
                                            project_id: project_id.0,
                                            repository_id: id.to_proto(),
                                            paths: entries
                                                .into_iter()
                                                .map(|repo_path| repo_path.to_proto())
                                                .collect(),
                                        })
                                        .await
                                        .context("sending stage request")?;

                                    Ok(())
                                }
                            };

                            for (diff_state, hunk_staging_operation_count) in
                                hunk_staging_operation_counts
                            {
                                diff_state
                                    .update(&mut cx, |diff_state, cx| {
                                        if result.is_ok() {
                                            diff_state.hunk_staging_operation_count_as_of_write =
                                                hunk_staging_operation_count;
                                        } else if let Some(uncommitted_diff) =
                                            &diff_state.uncommitted_diff
                                        {
                                            uncommitted_diff
                                                .update(cx, |uncommitted_diff, cx| {
                                                    uncommitted_diff.clear_pending_hunks(cx);
                                                })
                                                .ok();
                                        }
                                    })
                                    .ok();
                            }
                        },
                    )
                })?
                .await?
            },
        )
    }

            result
    pub fn unstage_entries(
        &mut self,
        entries: Vec<RepoPath>,
        cx: &mut Context<Self>,
    ) -> Task<anyhow::Result<()>> {
        if entries.is_empty() {
            return Task::ready(Ok(()));
        }
        let id = self.id;
        let save_tasks = self.save_buffers(&entries, cx);
        let paths = entries
            .iter()
            .map(|p| p.as_unix_str())
            .collect::<Vec<_>>()
            .join(" ");
        let status = format!("git reset {paths}");
        let job_key = GitJobKey::WriteIndex(entries.clone());

        self.spawn_job_with_tracking(
            entries.clone(),
            pending_op::GitStatus::Unstaged,
            cx,
            async move |this, cx| {
                for save_task in save_tasks {
                    save_task.await?;
                }

                this.update(cx, |this, _| {
                    this.send_keyed_job(
                        Some(job_key),
                        Some(status.into()),
                        move |git_repo, _cx| async move {
                            match git_repo {
                                RepositoryState::Local(LocalRepositoryState {
                                    backend,
                                    environment,
                                    ..
                                }) => backend.unstage_paths(entries, environment).await,
                                RepositoryState::Remote(RemoteRepositoryState {
                                    project_id,
                                    client,
                                }) => {
                                    client
                                        .request(proto::Unstage {
                                            project_id: project_id.0,
                                            repository_id: id.to_proto(),
                                            paths: entries
                                                .into_iter()
                                                .map(|repo_path| repo_path.to_proto())
                                                .collect(),
                                        })
                                        .await
                                        .context("sending unstage request")?;

                                    Ok(())
                                }
                            }
                        },
                    )
                })?
@@ -4418,7 +4347,7 @@ impl Repository {
                }
            })
            .collect();
        self.stage_or_unstage_entries(true, to_stage, cx)
        self.stage_entries(to_stage, cx)
    }

    pub fn unstage_all(&mut self, cx: &mut Context<Self>) -> Task<anyhow::Result<()>> {
@@ -4438,7 +4367,7 @@ impl Repository {
                }
            })
            .collect();
        self.stage_or_unstage_entries(false, to_unstage, cx)
        self.unstage_entries(to_unstage, cx)
    }

    pub fn stash_all(&mut self, cx: &mut Context<Self>) -> Task<anyhow::Result<()>> {

@@ -10922,146 +10922,3 @@ async fn test_git_worktree_remove(cx: &mut gpui::TestAppContext) {
    });
    assert!(active_repo_path.is_none());
}

#[gpui::test]
async fn test_optimistic_hunks_in_staged_files(cx: &mut gpui::TestAppContext) {
    use DiffHunkSecondaryStatus::*;
    init_test(cx);

    let committed_contents = r#"
        one
        two
        three
    "#
    .unindent();
    let file_contents = r#"
        one
        TWO
        three
    "#
    .unindent();

    let fs = FakeFs::new(cx.background_executor.clone());
    fs.insert_tree(
        path!("/dir"),
        json!({
            ".git": {},
            "file.txt": file_contents.clone()
        }),
    )
    .await;

    fs.set_head_and_index_for_repo(
        path!("/dir/.git").as_ref(),
        &[("file.txt", committed_contents.clone())],
    );

    let project = Project::test(fs.clone(), [path!("/dir").as_ref()], cx).await;

    let buffer = project
        .update(cx, |project, cx| {
            project.open_local_buffer(path!("/dir/file.txt"), cx)
        })
        .await
        .unwrap();
    let snapshot = buffer.read_with(cx, |buffer, _| buffer.snapshot());
    let uncommitted_diff = project
        .update(cx, |project, cx| {
            project.open_uncommitted_diff(buffer.clone(), cx)
        })
        .await
        .unwrap();

    // The hunk is initially unstaged.
    uncommitted_diff.read_with(cx, |diff, cx| {
        assert_hunks(
            diff.hunks(&snapshot, cx),
            &snapshot,
            &diff.base_text_string().unwrap(),
            &[(
                1..2,
                "two\n",
                "TWO\n",
                DiffHunkStatus::modified(HasSecondaryHunk),
            )],
        );
    });

    // Get the repository handle.
    let repo = project.read_with(cx, |project, cx| {
        project.repositories(cx).values().next().unwrap().clone()
    });

    // Stage the file.
    let stage_task = repo.update(cx, |repo, cx| {
        repo.stage_entries(vec![repo_path("file.txt")], cx)
    });

    // Run a few ticks to let the job start and mark hunks as pending,
    // but don't run_until_parked which would complete the entire operation.
    for _ in 0..10 {
        cx.executor().tick();
        let [hunk]: [_; 1] = uncommitted_diff
            .read_with(cx, |diff, cx| diff.hunks(&snapshot, cx).collect::<Vec<_>>())
            .try_into()
            .unwrap();
        match hunk.secondary_status {
            HasSecondaryHunk => {}
            SecondaryHunkRemovalPending => break,
            NoSecondaryHunk => panic!("hunk was not optimistically staged"),
            _ => panic!("unexpected hunk state"),
        }
    }
    uncommitted_diff.read_with(cx, |diff, cx| {
        assert_hunks(
            diff.hunks(&snapshot, cx),
            &snapshot,
            &diff.base_text_string().unwrap(),
            &[(
                1..2,
                "two\n",
                "TWO\n",
                DiffHunkStatus::modified(SecondaryHunkRemovalPending),
            )],
        );
    });

    // Let the staging complete.
    stage_task.await.unwrap();
    cx.run_until_parked();

    // The hunk is now fully staged.
    uncommitted_diff.read_with(cx, |diff, cx| {
        assert_hunks(
            diff.hunks(&snapshot, cx),
            &snapshot,
            &diff.base_text_string().unwrap(),
            &[(
                1..2,
                "two\n",
                "TWO\n",
                DiffHunkStatus::modified(NoSecondaryHunk),
            )],
        );
    });

    // Simulate a commit by updating HEAD to match the current file contents.
    // The FakeGitRepository's commit method is a no-op, so we need to manually
    // update HEAD to simulate the commit completing.
    fs.set_head_for_repo(
        path!("/dir/.git").as_ref(),
        &[("file.txt", file_contents.clone())],
        "newhead",
    );
    cx.run_until_parked();

    // After committing, there are no more hunks.
    uncommitted_diff.read_with(cx, |diff, cx| {
        assert_hunks(
            diff.hunks(&snapshot, cx),
            &snapshot,
            &diff.base_text_string().unwrap(),
            &[] as &[(Range<u32>, &str, &str, DiffHunkStatus)],
        );
    });
}

@@ -193,15 +193,7 @@ impl MetadataCache {
    ) -> Result<Self> {
        let mut cache = MetadataCache::default();
        for result in db.iter(txn)? {
            // Fail-open: skip records that can't be decoded (e.g. from a different branch)
            // rather than failing the entire prompt store initialization.
            let Ok((prompt_id, metadata)) = result else {
                log::warn!(
                    "Skipping unreadable prompt record in database: {:?}",
                    result.err()
                );
                continue;
            };
            let (prompt_id, metadata) = result?;
            cache.metadata.push(metadata.clone());
            cache.metadata_by_id.insert(prompt_id, metadata);
        }
@@ -685,86 +677,7 @@ mod tests {
        assert_eq!(
            loaded_after_reset.trim(),
            expected_content_after_reset.trim(),
            "Content should be back to default after saving default content"
        );
    }

    /// Test that the prompt store initializes successfully even when the database
    /// contains records with incompatible/undecodable PromptId keys (e.g., from
    /// a different branch that used a different serialization format).
    ///
    /// This is a regression test for the "fail-open" behavior: we should skip
    /// bad records rather than failing the entire store initialization.
    #[gpui::test]
    async fn test_prompt_store_handles_incompatible_db_records(cx: &mut TestAppContext) {
        cx.executor().allow_parking();

        let temp_dir = tempfile::tempdir().unwrap();
        let db_path = temp_dir.path().join("prompts-db-with-bad-records");
        std::fs::create_dir_all(&db_path).unwrap();

        // First, create the DB and write an incompatible record directly.
        // We simulate a record written by a different branch that used
        // `{"kind":"CommitMessage"}` instead of `{"kind":"BuiltIn", ...}`.
        {
            let db_env = unsafe {
                heed::EnvOpenOptions::new()
                    .map_size(1024 * 1024 * 1024)
                    .max_dbs(4)
                    .open(&db_path)
                    .unwrap()
            };

            let mut txn = db_env.write_txn().unwrap();
            // Create the metadata.v2 database with raw bytes so we can write
            // an incompatible key format.
            let metadata_db: Database<heed::types::Bytes, heed::types::Bytes> = db_env
                .create_database(&mut txn, Some("metadata.v2"))
                .unwrap();

            // Write an incompatible PromptId key: `{"kind":"CommitMessage"}`
            // This is the old/branch format that current code can't decode.
            let bad_key = br#"{"kind":"CommitMessage"}"#;
            let dummy_metadata = br#"{"id":{"kind":"CommitMessage"},"title":"Bad Record","default":false,"saved_at":"2024-01-01T00:00:00Z"}"#;
            metadata_db.put(&mut txn, bad_key, dummy_metadata).unwrap();

            // Also write a valid record to ensure we can still read good data.
            let good_key = br#"{"kind":"User","uuid":"550e8400-e29b-41d4-a716-446655440000"}"#;
            let good_metadata = br#"{"id":{"kind":"User","uuid":"550e8400-e29b-41d4-a716-446655440000"},"title":"Good Record","default":false,"saved_at":"2024-01-01T00:00:00Z"}"#;
            metadata_db.put(&mut txn, good_key, good_metadata).unwrap();

            txn.commit().unwrap();
        }

        // Now try to create a PromptStore from this DB.
        // With fail-open behavior, this should succeed and skip the bad record.
        // Without fail-open, this would return an error.
        let store_result = cx.update(|cx| PromptStore::new(db_path, cx)).await;

        assert!(
            store_result.is_ok(),
            "PromptStore should initialize successfully even with incompatible DB records. \
            Got error: {:?}",
            store_result.err()
        );

        let store = cx.new(|_cx| store_result.unwrap());

        // Verify the good record was loaded.
        let good_id = PromptId::User {
            uuid: UserPromptId("550e8400-e29b-41d4-a716-446655440000".parse().unwrap()),
        };
        let metadata = store.read_with(cx, |store, _| store.metadata(good_id));
        assert!(
            metadata.is_some(),
            "Valid records should still be loaded after skipping bad ones"
        );
        assert_eq!(
            metadata
                .as_ref()
                .and_then(|m| m.title.as_ref().map(|t| t.as_ref())),
            Some("Good Record"),
            "Valid record should have correct title"
            "After saving default content, load should return default"
        );
    }
}

@@ -281,6 +281,7 @@ impl JsonSchema for LanguageModelProviderSetting {
            "type": "string",
            "enum": [
                "amazon-bedrock",
                "anthropic",
                "copilot_chat",
                "deepseek",
                "google",

@@ -20,12 +20,6 @@ pub struct ExtensionSettingsContent {
    pub auto_update_extensions: HashMap<Arc<str>, bool>,
    /// The capabilities granted to extensions.
    pub granted_extension_capabilities: Option<Vec<ExtensionCapabilityContent>>,
    /// Extension language model providers that are allowed to read API keys from
    /// environment variables. Each entry is in the format
    /// "extension_id:provider_id:ENV_VAR_NAME" (e.g., "google-ai:google-ai:GEMINI_API_KEY").
    ///
    /// Default: []
    pub allowed_env_var_providers: Option<Vec<Arc<str>>>,
}

/// A capability for an extension.

@@ -20,8 +20,6 @@ anyhow.workspace = true
bm25 = "2.3.2"
copilot.workspace = true
edit_prediction.workspace = true
extension_host.workspace = true
language_model.workspace = true
language_models.workspace = true
editor.workspace = true
feature_flags.workspace = true

@@ -7659,8 +7659,8 @@ fn edit_prediction_language_settings_section() -> Vec<SettingsPageItem> {
        files: USER,
        render: Arc::new(|_, window, cx| {
            let settings_window = cx.entity();
            let page = window.use_state(cx, |window, cx| {
                crate::pages::EditPredictionSetupPage::new(settings_window, window, cx)
            let page = window.use_state(cx, |_, _| {
                crate::pages::EditPredictionSetupPage::new(settings_window)
            });
            page.into_any_element()
        }),

@@ -3,15 +3,10 @@ use edit_prediction::{
|
||||
mercury::{MERCURY_CREDENTIALS_URL, mercury_api_token},
|
||||
sweep_ai::{SWEEP_CREDENTIALS_URL, sweep_api_token},
|
||||
};
|
||||
use extension_host::ExtensionStore;
|
||||
use feature_flags::FeatureFlagAppExt as _;
|
||||
use gpui::{AnyView, Entity, ScrollHandle, Subscription, prelude::*};
|
||||
use language_model::{
|
||||
ConfigurationViewTargetAgent, LanguageModelProviderId, LanguageModelRegistry,
|
||||
};
|
||||
use gpui::{Entity, ScrollHandle, prelude::*};
|
||||
use language_models::provider::mistral::{CODESTRAL_API_URL, codestral_api_key};
|
||||
use std::collections::HashMap;
|
||||
use ui::{ButtonLink, ConfiguredApiCard, Icon, WithScrollbar, prelude::*};
|
||||
use ui::{ButtonLink, ConfiguredApiCard, WithScrollbar, prelude::*};
|
||||
|
||||
use crate::{
|
||||
SettingField, SettingItem, SettingsFieldMetadata, SettingsPageItem, SettingsWindow, USER,
|
||||
@@ -21,133 +16,24 @@ use crate::{
|
||||
pub struct EditPredictionSetupPage {
|
||||
settings_window: Entity<SettingsWindow>,
|
||||
scroll_handle: ScrollHandle,
|
||||
extension_oauth_views: HashMap<LanguageModelProviderId, ExtensionOAuthProviderView>,
|
||||
_registry_subscription: Subscription,
|
||||
}
|
||||
|
||||
struct ExtensionOAuthProviderView {
|
||||
provider_name: SharedString,
|
||||
provider_icon: IconName,
|
||||
provider_icon_path: Option<SharedString>,
|
||||
configuration_view: AnyView,
|
||||
}
|
||||
|
||||
impl EditPredictionSetupPage {
|
||||
pub fn new(
|
||||
settings_window: Entity<SettingsWindow>,
|
||||
window: &mut Window,
|
||||
cx: &mut Context<Self>,
|
||||
) -> Self {
|
||||
let registry_subscription = cx.subscribe_in(
|
||||
&LanguageModelRegistry::global(cx),
|
||||
window,
|
||||
|this, _, event: &language_model::Event, window, cx| match event {
|
||||
language_model::Event::AddedProvider(provider_id) => {
|
||||
this.maybe_add_extension_oauth_view(provider_id, window, cx);
|
||||
}
|
||||
language_model::Event::RemovedProvider(provider_id) => {
|
||||
this.extension_oauth_views.remove(provider_id);
|
||||
}
|
||||
_ => {}
|
||||
},
|
||||
);
|
||||
|
||||
let mut this = Self {
|
||||
pub fn new(settings_window: Entity<SettingsWindow>) -> Self {
|
||||
Self {
|
||||
settings_window,
|
||||
scroll_handle: ScrollHandle::new(),
|
||||
extension_oauth_views: HashMap::default(),
|
||||
_registry_subscription: registry_subscription,
|
||||
};
|
||||
this.build_extension_oauth_views(window, cx);
|
||||
this
|
||||
}
|
||||
|
||||
fn build_extension_oauth_views(&mut self, window: &mut Window, cx: &mut Context<Self>) {
|
||||
let oauth_provider_ids = get_extension_oauth_provider_ids(cx);
|
||||
        for provider_id in oauth_provider_ids {
            self.maybe_add_extension_oauth_view(&provider_id, window, cx);
        }
    }

    fn maybe_add_extension_oauth_view(
        &mut self,
        provider_id: &LanguageModelProviderId,
        window: &mut Window,
        cx: &mut Context<Self>,
    ) {
        // Check if this provider has OAuth configured in the extension manifest
        if !is_extension_oauth_provider(provider_id, cx) {
            return;
        }

        let registry = LanguageModelRegistry::global(cx).read(cx);
        let Some(provider) = registry.provider(provider_id) else {
            return;
        };

        let provider_name = provider.name().0;
        let provider_icon = provider.icon();
        let provider_icon_path = provider.icon_path();
        let configuration_view =
            provider.configuration_view(ConfigurationViewTargetAgent::EditPrediction, window, cx);

        self.extension_oauth_views.insert(
            provider_id.clone(),
            ExtensionOAuthProviderView {
                provider_name,
                provider_icon,
                provider_icon_path,
                configuration_view,
            },
        );
    }
}

impl Render for EditPredictionSetupPage {
    fn render(&mut self, window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
        let settings_window = self.settings_window.clone();

        let copilot_extension_installed = ExtensionStore::global(cx)
            .read(cx)
            .installed_extensions()
            .contains_key("copilot-chat");

        let mut providers: Vec<AnyElement> = Vec::new();

        // Built-in Copilot (hidden if the copilot-chat extension is installed)
        if !copilot_extension_installed {
            providers.push(render_github_copilot_provider(window, cx).into_any_element());
        }

        // Extension providers with OAuth support
        for (provider_id, view) in &self.extension_oauth_views {
            let icon_element: AnyElement = if let Some(icon_path) = &view.provider_icon_path {
                Icon::from_external_svg(icon_path.clone())
                    .size(ui::IconSize::Medium)
                    .into_any_element()
            } else {
                Icon::new(view.provider_icon)
                    .size(ui::IconSize::Medium)
                    .into_any_element()
            };

            providers.push(
                v_flex()
                    .id(SharedString::from(provider_id.0.to_string()))
                    .min_w_0()
                    .gap_1p5()
                    .child(
                        h_flex().gap_2().items_center().child(icon_element).child(
                            Headline::new(view.provider_name.clone()).size(HeadlineSize::Small),
                        ),
                    )
                    .child(view.configuration_view.clone())
                    .into_any_element(),
            );
        }

        if cx.has_flag::<Zeta2FeatureFlag>() {
            providers.push(
        let providers = [
            Some(render_github_copilot_provider(window, cx).into_any_element()),
            cx.has_flag::<Zeta2FeatureFlag>().then(|| {
                render_api_key_provider(
                    IconName::Inception,
                    "Mercury",
@@ -158,12 +44,9 @@ impl Render for EditPredictionSetupPage {
                    window,
                    cx,
                )
                .into_any_element(),
            );
        }

        if cx.has_flag::<Zeta2FeatureFlag>() {
            providers.push(
                .into_any_element()
            }),
            cx.has_flag::<Zeta2FeatureFlag>().then(|| {
                render_api_key_provider(
                    IconName::SweepAi,
                    "Sweep",
@@ -174,33 +57,32 @@ impl Render for EditPredictionSetupPage {
                    window,
                    cx,
                )
                .into_any_element()
            }),
            Some(
                render_api_key_provider(
                    IconName::AiMistral,
                    "Codestral",
                    "https://console.mistral.ai/codestral".into(),
                    codestral_api_key(cx),
                    |cx| language_models::MistralLanguageModelProvider::api_url(cx),
                    Some(settings_window.update(cx, |settings_window, cx| {
                        let codestral_settings = codestral_settings();
                        settings_window
                            .render_sub_page_items_section(
                                codestral_settings.iter().enumerate(),
                                None,
                                window,
                                cx,
                            )
                            .into_any_element()
                    })),
                    window,
                    cx,
                )
                .into_any_element(),
            );
        }

        providers.push(
            render_api_key_provider(
                IconName::AiMistral,
                "Codestral",
                "https://console.mistral.ai/codestral".into(),
                codestral_api_key(cx),
                |cx| language_models::MistralLanguageModelProvider::api_url(cx),
                Some(settings_window.update(cx, |settings_window, cx| {
                    let codestral_settings = codestral_settings();
                    settings_window
                        .render_sub_page_items_section(
                            codestral_settings.iter().enumerate(),
                            None,
                            window,
                            cx,
                        )
                        .into_any_element()
                })),
                window,
                cx,
            )
            .into_any_element(),
        );
            ),
        ];

        div()
            .size_full()
@@ -214,60 +96,11 @@ impl Render for EditPredictionSetupPage {
                    .pb_16()
                    .overflow_y_scroll()
                    .track_scroll(&self.scroll_handle)
                    .children(providers),
                    .children(providers.into_iter().flatten()),
            )
    }
}

/// Get extension provider IDs that have OAuth configured.
fn get_extension_oauth_provider_ids(cx: &App) -> Vec<LanguageModelProviderId> {
    let extension_store = ExtensionStore::global(cx).read(cx);

    extension_store
        .installed_extensions()
        .iter()
        .flat_map(|(extension_id, entry)| {
            entry.manifest.language_model_providers.iter().filter_map(
                move |(provider_id, provider_entry)| {
                    // Check if this provider has OAuth configured
                    let has_oauth = provider_entry
                        .auth
                        .as_ref()
                        .is_some_and(|auth| auth.oauth.is_some());

                    if has_oauth {
                        Some(LanguageModelProviderId(
                            format!("{}:{}", extension_id, provider_id).into(),
                        ))
                    } else {
                        None
                    }
                },
            )
        })
        .collect()
}

/// Check if a provider ID corresponds to an extension with OAuth configured.
fn is_extension_oauth_provider(provider_id: &LanguageModelProviderId, cx: &App) -> bool {
    // Extension provider IDs are in the format "extension_id:provider_id"
    let Some((extension_id, local_provider_id)) = provider_id.0.split_once(':') else {
        return false;
    };

    let extension_store = ExtensionStore::global(cx).read(cx);
    let Some(entry) = extension_store.installed_extensions().get(extension_id) else {
        return false;
    };

    entry
        .manifest
        .language_model_providers
        .get(local_provider_id)
        .and_then(|p| p.auth.as_ref())
        .is_some_and(|auth| auth.oauth.is_some())
}
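Both helpers above hinge on the `extension_id:provider_id` naming convention for extension-backed providers. A minimal stdlib-only sketch of that convention (the IDs here are made up for illustration):

```rust
fn main() {
    // Extension-backed provider IDs are namespaced as "extension_id:provider_id",
    // so split_once(':') recovers the extension half and the local provider half.
    let provider_id = "my-extension:my-provider";
    assert_eq!(
        provider_id.split_once(':'),
        Some(("my-extension", "my-provider"))
    );

    // A built-in provider ID has no ':' separator, so the early return in
    // is_extension_oauth_provider rejects it.
    assert!("copilot".split_once(':').is_none());
    println!("ok");
}
```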

fn render_api_key_provider(
    icon: IconName,
    title: &'static str,

@@ -26,6 +26,7 @@ use std::{
    sync::{Arc, LazyLock, RwLock},
    time::Duration,
};
use theme::ThemeSettings;
use title_bar::platform_title_bar::PlatformTitleBar;
use ui::{
    Banner, ContextMenu, Divider, DropdownMenu, DropdownStyle, IconButtonShape, KeyBinding,
@@ -1376,8 +1377,22 @@ impl SettingsWindow {
        })
        .detach();

        let mut ui_font_size = ThemeSettings::get_global(cx).ui_font_size(cx);
        cx.observe_global_in::<SettingsStore>(window, move |this, window, cx| {
            this.fetch_files(window, cx);

            // Whenever settings are changed, it's possible that the changed
            // settings affect the rendering of the `SettingsWindow`, as is
            // the case with `ui_font_size`. When that happens, we need to
            // instruct the `ListState` to re-measure the list items, as the
            // list item heights may have changed depending on the new font
            // size.
            let new_ui_font_size = ThemeSettings::get_global(cx).ui_font_size(cx);
            if new_ui_font_size != ui_font_size {
                this.list_state.remeasure();
                ui_font_size = new_ui_font_size;
            }

            cx.notify();
        })
        .detach();
@@ -1487,7 +1502,6 @@ impl SettingsWindow {
            None
        };

        // high overdraw value so the list scrollbar len doesn't change too much
        let list_state = gpui::ListState::new(0, gpui::ListAlignment::Top, px(0.0)).measure_all();
        list_state.set_scroll_handler(|_, _, _| {});

@@ -1980,7 +1994,6 @@ impl SettingsWindow {
    }

    fn reset_list_state(&mut self) {
        // plus one for the title
        let mut visible_items_count = self.visible_page_items().count();

        if visible_items_count > 0 {

@@ -344,7 +344,6 @@ pub struct Switch {
    label: Option<SharedString>,
    label_position: Option<SwitchLabelPosition>,
    label_size: LabelSize,
    label_color: Color,
    full_width: bool,
    key_binding: Option<KeyBinding>,
    color: SwitchColor,
@@ -362,7 +361,6 @@ impl Switch {
            label: None,
            label_position: None,
            label_size: LabelSize::Small,
            label_color: Color::Default,
            full_width: false,
            key_binding: None,
            color: SwitchColor::default(),
@@ -410,11 +408,6 @@ impl Switch {
        self
    }

    pub fn label_color(mut self, color: Color) -> Self {
        self.label_color = color;
        self
    }

    pub fn full_width(mut self, full_width: bool) -> Self {
        self.full_width = full_width;
        self
@@ -514,11 +507,7 @@ impl RenderOnce for Switch {
                self.label_position == Some(SwitchLabelPosition::Start),
                |this| {
                    this.when_some(label.clone(), |this, label| {
                        this.child(
                            Label::new(label)
                                .color(self.label_color)
                                .size(self.label_size),
                        )
                        this.child(Label::new(label).size(self.label_size))
                    })
                },
            )
@@ -527,11 +516,7 @@ impl RenderOnce for Switch {
                self.label_position == Some(SwitchLabelPosition::End),
                |this| {
                    this.when_some(label, |this, label| {
                        this.child(
                            Label::new(label)
                                .color(self.label_color)
                                .size(self.label_size),
                        )
                        this.child(Label::new(label).size(self.label_size))
                    })
                },
            )

@@ -1,294 +0,0 @@
use gpui::{
    Animation, AnimationExt, App, ClipboardItem, Context, DismissEvent, Element, Entity,
    EventEmitter, FocusHandle, Focusable, InteractiveElement, IntoElement, MouseDownEvent,
    ParentElement, Render, SharedString, Styled, Subscription, Transformation, Window, div,
    percentage, rems, svg,
};
use menu;
use std::time::Duration;
use ui::{Button, Icon, IconName, Label, Vector, VectorName, prelude::*};

use crate::ModalView;

/// Configuration for the OAuth device flow modal.
/// This allows extensions to specify the text and appearance of the modal.
#[derive(Clone)]
pub struct OAuthDeviceFlowModalConfig {
    /// The user code to display (e.g., "ABC-123").
    pub user_code: String,
    /// The URL the user needs to visit to authorize (for the "Connect" button).
    pub verification_url: String,
    /// The headline text for the modal (e.g., "Use GitHub Copilot in Zed.").
    pub headline: String,
    /// A description to show below the headline.
    pub description: String,
    /// Label for the connect button (e.g., "Connect to GitHub").
    pub connect_button_label: String,
    /// Success headline shown when authorization completes.
    pub success_headline: String,
    /// Success message shown when authorization completes.
    pub success_message: String,
    /// Optional path to an SVG icon file (absolute path on disk).
    pub icon_path: Option<SharedString>,
}

/// The current status of the OAuth device flow.
#[derive(Clone, Debug)]
pub enum OAuthDeviceFlowStatus {
    /// Waiting for user to click connect and authorize.
    Prompting,
    /// User clicked connect, waiting for authorization.
    WaitingForAuthorization,
    /// Successfully authorized.
    Authorized,
    /// Authorization failed with an error message.
    Failed(String),
}

/// Shared state for the OAuth device flow that can be observed by the modal.
pub struct OAuthDeviceFlowState {
    pub config: OAuthDeviceFlowModalConfig,
    pub status: OAuthDeviceFlowStatus,
}

impl EventEmitter<()> for OAuthDeviceFlowState {}

impl OAuthDeviceFlowState {
    pub fn new(config: OAuthDeviceFlowModalConfig) -> Self {
        Self {
            config,
            status: OAuthDeviceFlowStatus::Prompting,
        }
    }

    /// Update the status of the OAuth flow.
    pub fn set_status(&mut self, status: OAuthDeviceFlowStatus, cx: &mut Context<Self>) {
        self.status = status;
        cx.emit(());
        cx.notify();
    }
}

/// A generic OAuth device flow modal that can be used by extensions.
pub struct OAuthDeviceFlowModal {
    state: Entity<OAuthDeviceFlowState>,
    connect_clicked: bool,
    focus_handle: FocusHandle,
    _subscription: Subscription,
}

impl Focusable for OAuthDeviceFlowModal {
    fn focus_handle(&self, _: &App) -> FocusHandle {
        self.focus_handle.clone()
    }
}

impl EventEmitter<DismissEvent> for OAuthDeviceFlowModal {}

impl ModalView for OAuthDeviceFlowModal {}

impl OAuthDeviceFlowModal {
    pub fn new(state: Entity<OAuthDeviceFlowState>, cx: &mut Context<Self>) -> Self {
        let subscription = cx.observe(&state, |_, _, cx| {
            cx.notify();
        });

        Self {
            state,
            connect_clicked: false,
            focus_handle: cx.focus_handle(),
            _subscription: subscription,
        }
    }

    fn render_icon(&self, cx: &mut Context<Self>) -> impl IntoElement {
        let state = self.state.read(cx);
        let icon_color = Color::Custom(cx.theme().colors().icon);
        // Match ZedXCopilot visual appearance
        let icon_size = rems(2.5);
        let plus_size = rems(0.875);
        // The "+" in ZedXCopilot SVG has fill-opacity="0.5"
        let plus_color = cx.theme().colors().icon.opacity(0.5);

        if let Some(icon_path) = &state.config.icon_path {
            // Show "[Provider Icon] + [Zed Logo]" format to match built-in Copilot modal
            h_flex()
                .gap_2()
                .items_center()
                .child(
                    Icon::from_external_svg(icon_path.clone())
                        .size(ui::IconSize::Custom(icon_size))
                        .color(icon_color),
                )
                .child(
                    svg()
                        .size(plus_size)
                        .path("icons/plus.svg")
                        .text_color(plus_color),
                )
                .child(Vector::new(VectorName::ZedLogo, icon_size, icon_size).color(icon_color))
                .into_any_element()
        } else {
            // Fall back to just the Zed logo if there is no provider icon
            Vector::new(VectorName::ZedLogo, icon_size, icon_size)
                .color(icon_color)
                .into_any_element()
        }
    }

    fn render_device_code(&self, cx: &mut Context<Self>) -> impl IntoElement {
        let state = self.state.read(cx);
        let user_code = state.config.user_code.clone();
        let copied = cx
            .read_from_clipboard()
            .map(|item| item.text().as_ref() == Some(&user_code))
            .unwrap_or(false);
        let user_code_for_click = user_code.clone();

        h_flex()
            .w_full()
            .p_1()
            .border_1()
            .border_muted(cx)
            .rounded_sm()
            .cursor_pointer()
            .justify_between()
            .on_mouse_down(gpui::MouseButton::Left, move |_, window, cx| {
                cx.write_to_clipboard(ClipboardItem::new_string(user_code_for_click.clone()));
                window.refresh();
            })
            .child(div().flex_1().child(Label::new(user_code)))
            .child(div().flex_none().px_1().child(Label::new(if copied {
                "Copied!"
            } else {
                "Copy"
            })))
    }

    fn render_prompting_modal(&self, cx: &mut Context<Self>) -> impl Element {
        let (connect_button_label, verification_url, headline, description) = {
            let state = self.state.read(cx);
            let label = if self.connect_clicked {
                "Waiting for connection...".to_string()
            } else {
                state.config.connect_button_label.clone()
            };
            (
                label,
                state.config.verification_url.clone(),
                state.config.headline.clone(),
                state.config.description.clone(),
            )
        };

        v_flex()
            .flex_1()
            .gap_2()
            .items_center()
            .child(Headline::new(headline).size(HeadlineSize::Large))
            .child(Label::new(description).color(Color::Muted))
            .child(self.render_device_code(cx))
            .child(
                Label::new("Paste this code into GitHub after clicking the button below.")
                    .size(ui::LabelSize::Small),
            )
            .child(
                Button::new("connect-button", connect_button_label)
                    .on_click(cx.listener(move |this, _, _window, cx| {
                        cx.open_url(&verification_url);
                        this.connect_clicked = true;
                    }))
                    .full_width()
                    .style(ButtonStyle::Filled),
            )
            .child(
                Button::new("cancel-button", "Cancel")
                    .full_width()
                    .on_click(cx.listener(|_, _, _, cx| {
                        cx.emit(DismissEvent);
                    })),
            )
    }

    fn render_authorized_modal(&self, cx: &mut Context<Self>) -> impl Element {
        let state = self.state.read(cx);
        let success_headline = state.config.success_headline.clone();
        let success_message = state.config.success_message.clone();

        v_flex()
            .gap_2()
            .child(Headline::new(success_headline).size(HeadlineSize::Large))
            .child(Label::new(success_message))
            .child(
                Button::new("done-button", "Done")
                    .full_width()
                    .on_click(cx.listener(|_, _, _, cx| cx.emit(DismissEvent))),
            )
    }

    fn render_failed_modal(&self, error: &str, cx: &mut Context<Self>) -> impl Element {
        v_flex()
            .gap_2()
            .child(Headline::new("Authorization Failed").size(HeadlineSize::Large))
            .child(Label::new(error.to_string()).color(Color::Error))
            .child(
                Button::new("close-button", "Close")
                    .full_width()
                    .on_click(cx.listener(|_, _, _, cx| cx.emit(DismissEvent))),
            )
    }

    fn render_loading(window: &mut Window, _cx: &mut Context<Self>) -> impl Element {
        let loading_icon = svg()
            .size_8()
            .path(IconName::ArrowCircle.path())
            .text_color(window.text_style().color)
            .with_animation(
                "icon_circle_arrow",
                Animation::new(Duration::from_secs(2)).repeat(),
                |svg, delta| svg.with_transformation(Transformation::rotate(percentage(delta))),
            );

        h_flex().justify_center().child(loading_icon)
    }
}

impl Render for OAuthDeviceFlowModal {
    fn render(&mut self, window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
        let status = self.state.read(cx).status.clone();

        let prompt = match &status {
            OAuthDeviceFlowStatus::Prompting => self.render_prompting_modal(cx).into_any_element(),
            OAuthDeviceFlowStatus::WaitingForAuthorization => {
                if self.connect_clicked {
                    self.render_prompting_modal(cx).into_any_element()
                } else {
                    Self::render_loading(window, cx).into_any_element()
                }
            }
            OAuthDeviceFlowStatus::Authorized => {
                self.render_authorized_modal(cx).into_any_element()
            }
            OAuthDeviceFlowStatus::Failed(error) => {
                self.render_failed_modal(error, cx).into_any_element()
            }
        };

        v_flex()
            .id("oauth-device-flow-modal")
            .track_focus(&self.focus_handle(cx))
            .elevation_3(cx)
            .w_96()
            .items_center()
            .p_4()
            .gap_2()
            .on_action(cx.listener(|_, _: &menu::Cancel, _, cx| {
                cx.emit(DismissEvent);
            }))
            .on_any_mouse_down(cx.listener(|this, _: &MouseDownEvent, window, cx| {
                window.focus(&this.focus_handle, cx);
            }))
            .child(self.render_icon(cx))
            .child(prompt)
    }
}

@@ -4,7 +4,6 @@ pub mod invalid_item_view;
pub mod item;
mod modal_layer;
pub mod notifications;
pub mod oauth_device_flow_modal;
pub mod pane;
pub mod pane_group;
mod path_list;

@@ -1361,7 +1361,7 @@ impl LocalWorktree {
        }

        let content = fs.load_bytes(&abs_path).await?;
        let (text, encoding, has_bom) = decode_byte(content)?;
        let (text, encoding, has_bom) = decode_byte(content);

        let worktree = this.upgrade().context("worktree was dropped")?;
        let file = match entry.await? {
@@ -1489,12 +1489,25 @@ impl LocalWorktree {
        let fs = fs.clone();
        let abs_path = abs_path.clone();
        async move {
            let bom_bytes = if has_bom {
                if encoding == encoding_rs::UTF_16LE {
                    vec![0xFF, 0xFE]
                } else if encoding == encoding_rs::UTF_16BE {
                    vec![0xFE, 0xFF]
                } else if encoding == encoding_rs::UTF_8 {
                    vec![0xEF, 0xBB, 0xBF]
                } else {
                    vec![]
                }
            } else {
                vec![]
            };

            // For UTF-8, use the optimized `fs.save` which writes Rope chunks directly to disk
            // without allocating a contiguous string.
            if encoding == encoding_rs::UTF_8 && !has_bom {
                return fs.save(&abs_path, &text, line_ending).await;
            }

            // For legacy encodings (e.g. Shift-JIS), we fall back to converting the entire Rope
            // to a String/Bytes in memory before writing.
            //
@@ -1507,45 +1520,13 @@ impl LocalWorktree {
                LineEnding::Windows => text_string.replace('\n', "\r\n"),
            };

            // Create the byte vector manually for UTF-16 encodings because encoding_rs encodes to UTF-8 by default (per WHATWG standards),
            // which is not what we want for saving files.
            let bytes = if encoding == encoding_rs::UTF_16BE {
                let mut data = Vec::with_capacity(normalized_text.len() * 2 + 2);
                if has_bom {
                    data.extend_from_slice(&[0xFE, 0xFF]); // BOM
                }
                let utf16be_bytes =
                    normalized_text.encode_utf16().flat_map(|u| u.to_be_bytes());
                data.extend(utf16be_bytes);
                data.into()
            } else if encoding == encoding_rs::UTF_16LE {
                let mut data = Vec::with_capacity(normalized_text.len() * 2 + 2);
                if has_bom {
                    data.extend_from_slice(&[0xFF, 0xFE]); // BOM
                }
                let utf16le_bytes =
                    normalized_text.encode_utf16().flat_map(|u| u.to_le_bytes());
                data.extend(utf16le_bytes);
                data.into()
            let (cow, _, _) = encoding.encode(&normalized_text);
            let bytes = if !bom_bytes.is_empty() {
                let mut bytes = bom_bytes;
                bytes.extend_from_slice(&cow);
                bytes.into()
            } else {
                // For other encodings (Shift-JIS, UTF-8 with BOM, etc.), delegate to encoding_rs.
                let bom_bytes = if has_bom {
                    if encoding == encoding_rs::UTF_8 {
                        vec![0xEF, 0xBB, 0xBF]
                    } else {
                        vec![]
                    }
                } else {
                    vec![]
                };
                let (cow, _, _) = encoding.encode(&normalized_text);
                if !bom_bytes.is_empty() {
                    let mut bytes = bom_bytes;
                    bytes.extend_from_slice(&cow);
                    bytes.into()
                } else {
                    cow
                }
                cow
            };

            fs.write(&abs_path, &bytes).await
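The UTF-16 branch above builds the byte vector by hand, expanding each `u16` code unit into two bytes with the appropriate endianness. A self-contained sketch of that byte layout, using the same `encode_utf16` + `to_le_bytes` pattern and the test string from this PR:

```rust
fn main() {
    // Mirror of the UTF-16LE branch: each u16 code unit becomes two
    // little-endian bytes, optionally preceded by the FF FE BOM.
    let text = "こんにちは";
    let mut data: Vec<u8> = vec![0xFF, 0xFE]; // BOM
    data.extend(text.encode_utf16().flat_map(|u| u.to_le_bytes()));

    // "こんにちは" is U+3053 U+3093 U+306B U+3061 U+306F.
    assert_eq!(
        data,
        vec![
            0xFF, 0xFE, // BOM
            0x53, 0x30, 0x93, 0x30, 0x6B, 0x30, 0x61, 0x30, 0x6F, 0x30,
        ]
    );
    println!("ok");
}
```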

@@ -5861,28 +5842,11 @@ impl fs::Watcher for NullWatcher {
    }
}

fn decode_byte(bytes: Vec<u8>) -> anyhow::Result<(String, &'static Encoding, bool)> {
fn decode_byte(bytes: Vec<u8>) -> (String, &'static Encoding, bool) {
    // check BOM
    if let Some((encoding, _bom_len)) = Encoding::for_bom(&bytes) {
        let (cow, _) = encoding.decode_with_bom_removal(&bytes);
        return Ok((cow.into_owned(), encoding, true));
    }

    match analyze_byte_content(&bytes) {
        ByteContent::Utf16Le => {
            let encoding = encoding_rs::UTF_16LE;
            let (cow, _, _) = encoding.decode(&bytes);
            return Ok((cow.into_owned(), encoding, false));
        }
        ByteContent::Utf16Be => {
            let encoding = encoding_rs::UTF_16BE;
            let (cow, _, _) = encoding.decode(&bytes);
            return Ok((cow.into_owned(), encoding, false));
        }
        ByteContent::Binary => {
            anyhow::bail!("Binary files are not supported");
        }
        ByteContent::Unknown => {}
        return (cow.into_owned(), encoding, true);
    }

fn detect_encoding(bytes: Vec<u8>) -> (String, &'static Encoding) {
@@ -5903,66 +5867,14 @@ fn decode_byte(bytes: Vec<u8>) -> anyhow::Result<(String, &'static Encoding, boo
            // displaying raw escape sequences instead of the correct characters.
            if text.contains('\x1b') {
                let (s, enc) = detect_encoding(text.into_bytes());
                Ok((s, enc, false))
                (s, enc, false)
            } else {
                Ok((text, encoding_rs::UTF_8, false))
                (text, encoding_rs::UTF_8, false)
            }
        }
        Err(e) => {
            let (s, enc) = detect_encoding(e.into_bytes());
            Ok((s, enc, false))
            (s, enc, false)
        }
    }
}

#[derive(PartialEq)]
enum ByteContent {
    Utf16Le,
    Utf16Be,
    Binary,
    Unknown,
}

// Heuristic check using null byte distribution.
// NOTE: This relies on the presence of ASCII characters (which become `0x00` in UTF-16).
// Files consisting purely of non-ASCII characters (like Japanese) may not be detected here
// and will result in `Unknown`.
fn analyze_byte_content(bytes: &[u8]) -> ByteContent {
    if bytes.len() < 2 {
        return ByteContent::Unknown;
    }

    let check_len = bytes.len().min(1024);
    let sample = &bytes[..check_len];

    if !sample.contains(&0) {
        return ByteContent::Unknown;
    }

    let mut even_nulls = 0;
    let mut odd_nulls = 0;

    for (i, &byte) in sample.iter().enumerate() {
        if byte == 0 {
            if i % 2 == 0 {
                even_nulls += 1;
            } else {
                odd_nulls += 1;
            }
        }
    }

    let total_nulls = even_nulls + odd_nulls;
    if total_nulls < check_len / 10 {
        return ByteContent::Unknown;
    }

    if even_nulls > odd_nulls * 4 {
        return ByteContent::Utf16Be;
    }

    if odd_nulls > even_nulls * 4 {
        return ByteContent::Utf16Le;
    }

    ByteContent::Binary
}
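The null-byte heuristic above can be exercised in isolation on the same byte patterns the tests in this PR use. A standalone copy of that logic, runnable without the rest of the crate:

```rust
#[derive(Debug, PartialEq)]
enum ByteContent {
    Utf16Le,
    Utf16Be,
    Binary,
    Unknown,
}

// Same heuristic as analyze_byte_content: classify by where the null bytes
// fall. ASCII in UTF-16LE puts nulls at odd indices, UTF-16BE at even
// indices; scattered nulls suggest a binary file.
fn analyze(bytes: &[u8]) -> ByteContent {
    if bytes.len() < 2 {
        return ByteContent::Unknown;
    }
    let sample = &bytes[..bytes.len().min(1024)];
    if !sample.contains(&0) {
        return ByteContent::Unknown;
    }
    let (mut even_nulls, mut odd_nulls) = (0usize, 0usize);
    for (i, &b) in sample.iter().enumerate() {
        if b == 0 {
            if i % 2 == 0 {
                even_nulls += 1;
            } else {
                odd_nulls += 1;
            }
        }
    }
    if even_nulls + odd_nulls < sample.len() / 10 {
        return ByteContent::Unknown;
    }
    if even_nulls > odd_nulls * 4 {
        return ByteContent::Utf16Be;
    }
    if odd_nulls > even_nulls * 4 {
        return ByteContent::Utf16Le;
    }
    ByteContent::Binary
}

fn main() {
    // "ABC" as UTF-16LE without a BOM: nulls land on odd indices.
    assert_eq!(
        analyze(&[0x41, 0x00, 0x42, 0x00, 0x43, 0x00]),
        ByteContent::Utf16Le
    );
    // Mixed nulls that fit neither pattern read as binary.
    assert_eq!(
        analyze(&[0x00, 0xFF, 0x12, 0x00, 0x99, 0x88, 0x77, 0x66, 0x00]),
        ByteContent::Binary
    );
    // Plain ASCII contains no nulls at all.
    assert_eq!(analyze(b"hello"), ByteContent::Unknown);
    println!("ok");
}
```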
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
use crate::{Entry, EntryKind, Event, PathChange, Worktree, WorktreeModelHandle};
|
||||
use anyhow::Result;
|
||||
use anyhow::{Context as _, Result};
|
||||
use encoding_rs;
|
||||
use fs::{FakeFs, Fs, RealFs, RemoveOptions};
|
||||
use git::{DOT_GIT, GITIGNORE, REPO_EXCLUDE};
|
||||
@@ -2568,87 +2568,71 @@ fn init_test(cx: &mut gpui::TestAppContext) {
|
||||
#[gpui::test]
|
||||
async fn test_load_file_encoding(cx: &mut TestAppContext) {
|
||||
init_test(cx);
|
||||
|
||||
struct TestCase {
|
||||
name: &'static str,
|
||||
bytes: Vec<u8>,
|
||||
expected_text: &'static str,
|
||||
}
|
||||
|
||||
// --- Success Cases ---
|
||||
let success_cases = vec![
|
||||
TestCase {
|
||||
name: "utf8.txt",
|
||||
bytes: "こんにちは".as_bytes().to_vec(),
|
||||
expected_text: "こんにちは",
|
||||
},
|
||||
TestCase {
|
||||
name: "sjis.txt",
|
||||
bytes: vec![0x82, 0xb1, 0x82, 0xf1, 0x82, 0xc9, 0x82, 0xbf, 0x82, 0xcd],
|
||||
expected_text: "こんにちは",
|
||||
},
|
||||
TestCase {
|
||||
name: "eucjp.txt",
|
||||
bytes: vec![0xa4, 0xb3, 0xa4, 0xf3, 0xa4, 0xcb, 0xa4, 0xc1, 0xa4, 0xcf],
|
||||
expected_text: "こんにちは",
|
||||
},
|
||||
TestCase {
|
||||
name: "iso2022jp.txt",
|
||||
bytes: vec![
|
||||
let test_cases: Vec<(&str, &[u8], &str)> = vec![
|
||||
("utf8.txt", "こんにちは".as_bytes(), "こんにちは"), // "こんにちは" is Japanese "Hello"
|
||||
(
|
||||
"sjis.txt",
|
||||
&[0x82, 0xb1, 0x82, 0xf1, 0x82, 0xc9, 0x82, 0xbf, 0x82, 0xcd],
|
||||
"こんにちは",
|
||||
),
|
||||
(
|
||||
"eucjp.txt",
|
||||
&[0xa4, 0xb3, 0xa4, 0xf3, 0xa4, 0xcb, 0xa4, 0xc1, 0xa4, 0xcf],
|
||||
"こんにちは",
|
||||
),
|
||||
(
|
||||
"iso2022jp.txt",
|
||||
&[
|
||||
0x1b, 0x24, 0x42, 0x24, 0x33, 0x24, 0x73, 0x24, 0x4b, 0x24, 0x41, 0x24, 0x4f, 0x1b,
|
||||
0x28, 0x42,
|
||||
],
|
||||
expected_text: "こんにちは",
|
||||
},
|
||||
TestCase {
|
||||
name: "win1252.txt",
|
||||
bytes: vec![0x43, 0x61, 0x66, 0xe9],
|
||||
expected_text: "Café",
|
||||
},
|
||||
TestCase {
|
||||
name: "gbk.txt",
|
||||
bytes: vec![
|
||||
"こんにちは",
|
||||
),
|
||||
// Western Europe (Windows-1252)
|
||||
// "Café" -> 0xE9 is 'é' in Windows-1252 (it is typically 0xC3 0xA9 in UTF-8)
|
||||
("win1252.txt", &[0x43, 0x61, 0x66, 0xe9], "Café"),
|
||||
// Chinese Simplified (GBK)
|
||||
// Note: We use a slightly longer string here because short byte sequences can be ambiguous
|
||||
// in multi-byte encodings. Providing more context helps the heuristic detector guess correctly.
|
||||
// Text: "今天天气不错" (Today's weather is not bad / nice)
|
||||
// Bytes:
|
||||
// 今: BD F1
|
||||
// 天: CC EC
|
||||
// 天: CC EC
|
||||
// 气: C6 F8
|
||||
// 不: B2 BB
|
||||
// 错: B4 ED
|
||||
(
|
||||
"gbk.txt",
|
||||
&[
|
||||
0xbd, 0xf1, 0xcc, 0xec, 0xcc, 0xec, 0xc6, 0xf8, 0xb2, 0xbb, 0xb4, 0xed,
|
||||
],
|
||||
expected_text: "今天天气不错",
|
||||
},
|
||||
// UTF-16LE with BOM
|
||||
TestCase {
|
||||
name: "utf16le_bom.txt",
|
||||
bytes: vec![
|
||||
"今天天气不错",
|
||||
),
|
||||
(
|
||||
"utf16le_bom.txt",
|
||||
&[
|
||||
0xFF, 0xFE, // BOM
|
||||
0x53, 0x30, 0x93, 0x30, 0x6B, 0x30, 0x61, 0x30, 0x6F, 0x30,
|
||||
0x53, 0x30, // こ
|
||||
0x93, 0x30, // ん
|
||||
0x6B, 0x30, // に
|
||||
0x61, 0x30, // ち
|
||||
0x6F, 0x30, // は
|
||||
],
|
||||
expected_text: "こんにちは",
|
||||
},
|
||||
// UTF-16BE with BOM
|
||||
TestCase {
|
||||
name: "utf16be_bom.txt",
|
||||
bytes: vec![
|
||||
0xFE, 0xFF, // BOM
|
||||
0x30, 0x53, 0x30, 0x93, 0x30, 0x6B, 0x30, 0x61, 0x30, 0x6F,
|
||||
"こんにちは",
|
||||
),
|
||||
(
|
||||
"utf8_bom.txt",
|
||||
&[
|
||||
0xEF, 0xBB, 0xBF, // UTF-8 BOM
|
||||
0xE3, 0x81, 0x93, // こ
|
||||
0xE3, 0x82, 0x93, // ん
|
||||
0xE3, 0x81, 0xAB, // に
|
||||
0xE3, 0x81, 0xA1, // ち
|
||||
0xE3, 0x81, 0xAF, // は
|
||||
],
|
||||
expected_text: "こんにちは",
|
||||
},
|
||||
// UTF-16LE without BOM (ASCII only)
|
||||
// This relies on the "null byte heuristic" we implemented.
|
||||
// "ABC" -> 41 00 42 00 43 00
|
||||
TestCase {
|
||||
name: "utf16le_ascii_no_bom.txt",
|
||||
bytes: vec![0x41, 0x00, 0x42, 0x00, 0x43, 0x00],
|
||||
expected_text: "ABC",
|
||||
},
|
||||
];
|
||||
|
||||
// --- Failure Cases ---
|
||||
let failure_cases = vec![
|
||||
// Binary File (Should be detected by heuristic and return Error)
|
||||
// Contains random bytes and mixed nulls that don't match UTF-16 patterns
|
||||
TestCase {
|
||||
name: "binary.bin",
|
||||
bytes: vec![0x00, 0xFF, 0x12, 0x00, 0x99, 0x88, 0x77, 0x66, 0x00],
|
||||
expected_text: "", // Not used
|
||||
},
|
||||
"こんにちは",
|
||||
),
|
||||
];
|
||||
|
||||
let root_path = if cfg!(windows) {
|
||||
@@ -2658,11 +2642,15 @@ async fn test_load_file_encoding(cx: &mut TestAppContext) {
|
||||
};
|
||||
|
||||
let fs = FakeFs::new(cx.background_executor.clone());
|
||||
fs.create_dir(root_path).await.unwrap();
|
||||
|
||||
for case in success_cases.iter().chain(failure_cases.iter()) {
|
||||
let path = root_path.join(case.name);
|
||||
fs.write(&path, &case.bytes).await.unwrap();
|
||||
let mut files_json = serde_json::Map::new();
|
||||
for (name, _, _) in &test_cases {
|
||||
files_json.insert(name.to_string(), serde_json::Value::String("".to_string()));
|
||||
}
|
||||
|
||||
for (name, bytes, _) in &test_cases {
|
||||
let path = root_path.join(name);
|
||||
fs.write(&path, bytes).await.unwrap();
|
||||
}
|
||||
|
||||
let tree = Worktree::local(
|
||||
@@ -2679,54 +2667,34 @@ async fn test_load_file_encoding(cx: &mut TestAppContext) {
|
||||
cx.read(|cx| tree.read(cx).as_local().unwrap().scan_complete())
|
||||
.await;
|
||||
|
||||
let rel_path = |name: &str| {
|
||||
RelPath::new(&Path::new(name), PathStyle::local())
|
||||
.unwrap()
|
||||
.into_arc()
|
||||
};
|
||||
|
||||
// Run Success Tests
|
||||
for case in success_cases {
|
||||
for (name, _, expected) in test_cases {
|
||||
let loaded = tree
|
||||
.update(cx, |tree, cx| tree.load_file(&rel_path(case.name), cx))
|
||||
.await;
|
||||
if let Err(e) = &loaded {
|
||||
panic!("Failed to load success case '{}': {:?}", case.name, e);
|
||||
}
|
||||
let loaded = loaded.unwrap();
|
||||
.update(cx, |tree, cx| tree.load_file(rel_path(name), cx))
|
||||
.await
|
||||
.with_context(|| format!("Failed to load {}", name))
|
||||
.unwrap();
|
||||
|
||||
assert_eq!(
|
||||
loaded.text, case.expected_text,
|
||||
loaded.text, expected,
|
||||
"Encoding mismatch for file: {}",
|
||||
case.name
|
||||
name
|
||||
);
|
||||
}
|
||||
|
||||
// Run Failure Tests
|
||||
for case in failure_cases {
|
||||
let loaded = tree
|
||||
.update(cx, |tree, cx| tree.load_file(&rel_path(case.name), cx))
|
||||
.await;
|
||||
assert!(
|
||||
loaded.is_err(),
|
||||
"Failure case '{}' unexpectedly succeeded! It should have been detected as binary.",
|
||||
case.name
|
||||
);
|
||||
let err_msg = loaded.unwrap_err().to_string();
|
||||
println!("Got expected error for {}: {}", case.name, err_msg);
|
||||
}
|
||||
}
|
||||
|
||||
#[gpui::test]
async fn test_write_file_encoding(cx: &mut gpui::TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());

let root_path = if cfg!(windows) {
Path::new("C:\\root")
} else {
Path::new("/root")
};
fs.create_dir(root_path).await.unwrap();
let file_path = root_path.join("test.txt");

fs.insert_file(&file_path, "initial".into()).await;

let worktree = Worktree::local(
root_path,
@@ -2739,107 +2707,33 @@ async fn test_write_file_encoding(cx: &mut gpui::TestAppContext) {
.await
.unwrap();

// Define test case structure
struct TestCase {
name: &'static str,
text: &'static str,
encoding: &'static encoding_rs::Encoding,
has_bom: bool,
expected_bytes: Vec<u8>,
}
let path: Arc<Path> = Path::new("test.txt").into();
let rel_path = RelPath::new(&path, PathStyle::local()).unwrap().into_arc();

let cases = vec![
// Shift_JIS with Japanese
TestCase {
name: "Shift_JIS with Japanese",
text: "こんにちは",
encoding: encoding_rs::SHIFT_JIS,
has_bom: false,
expected_bytes: vec![0x82, 0xb1, 0x82, 0xf1, 0x82, 0xc9, 0x82, 0xbf, 0x82, 0xcd],
},
// UTF-8 No BOM
TestCase {
name: "UTF-8 No BOM",
text: "AB",
encoding: encoding_rs::UTF_8,
has_bom: false,
expected_bytes: vec![0x41, 0x42],
},
// UTF-8 with BOM
TestCase {
name: "UTF-8 with BOM",
text: "AB",
encoding: encoding_rs::UTF_8,
has_bom: true,
expected_bytes: vec![0xEF, 0xBB, 0xBF, 0x41, 0x42],
},
// UTF-16LE No BOM with Japanese
// NOTE: This passes thanks to the manual encoding fix implemented in `write_file`.
TestCase {
name: "UTF-16LE No BOM with Japanese",
text: "こんにちは",
encoding: encoding_rs::UTF_16LE,
has_bom: false,
expected_bytes: vec![0x53, 0x30, 0x93, 0x30, 0x6b, 0x30, 0x61, 0x30, 0x6f, 0x30],
},
// UTF-16LE with BOM
TestCase {
name: "UTF-16LE with BOM",
text: "A",
encoding: encoding_rs::UTF_16LE,
has_bom: true,
expected_bytes: vec![0xFF, 0xFE, 0x41, 0x00],
},
// UTF-16BE No BOM with Japanese
// NOTE: This passes thanks to the manual encoding fix.
TestCase {
name: "UTF-16BE No BOM with Japanese",
text: "こんにちは",
encoding: encoding_rs::UTF_16BE,
has_bom: false,
expected_bytes: vec![0x30, 0x53, 0x30, 0x93, 0x30, 0x6b, 0x30, 0x61, 0x30, 0x6f],
},
// UTF-16BE with BOM
TestCase {
name: "UTF-16BE with BOM",
text: "A",
encoding: encoding_rs::UTF_16BE,
has_bom: true,
expected_bytes: vec![0xFE, 0xFF, 0x00, 0x41],
},
let text = text::Rope::from("こんにちは");

let task = worktree.update(cx, |wt, cx| {
wt.write_file(
rel_path,
text,
text::LineEnding::Unix,
encoding_rs::SHIFT_JIS,
false,
cx,
)
});

task.await.unwrap();

let bytes = fs.load_bytes(&file_path).await.unwrap();

let expected_bytes = vec![
0x82, 0xb1, // こ
0x82, 0xf1, // ん
0x82, 0xc9, // に
0x82, 0xbf, // ち
0x82, 0xcd, // は
];

for (i, case) in cases.into_iter().enumerate() {
let file_name = format!("test_{}.txt", i);
let path: Arc<Path> = Path::new(&file_name).into();
let file_path = root_path.join(&file_name);

fs.insert_file(&file_path, "".into()).await;

let rel_path = RelPath::new(&path, PathStyle::local()).unwrap().into_arc();
let text = text::Rope::from(case.text);

let task = worktree.update(cx, |wt, cx| {
wt.write_file(
rel_path,
text,
text::LineEnding::Unix,
case.encoding,
case.has_bom,
cx,
)
});

if let Err(e) = task.await {
panic!("Unexpected error in case '{}': {:?}", case.name, e);
}

let bytes = fs.load_bytes(&file_path).await.unwrap();

assert_eq!(
bytes, case.expected_bytes,
"case '{}' mismatch. Expected {:?}, but got {:?}",
case.name, case.expected_bytes, bytes
);
}
assert_eq!(bytes, expected_bytes, "Should be saved as Shift-JIS");
}
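The BOM and endianness behavior these test cases exercise can be sketched with only the standard library. This is a minimal sketch of the idea behind the "manual encoding fix" the test notes mention, not a copy of the actual `write_file` implementation; the function name `encode_utf16_bytes` is introduced here for illustration.

```rust
// Encode a &str as UTF-16 bytes, optionally prepending the byte-order mark.
// UTF-16LE BOM is FF FE; UTF-16BE BOM is FE FF (as in the test cases above).
fn encode_utf16_bytes(text: &str, little_endian: bool, with_bom: bool) -> Vec<u8> {
    let mut out = Vec::new();
    if with_bom {
        out.extend_from_slice(if little_endian { &[0xFF, 0xFE] } else { &[0xFE, 0xFF] });
    }
    for unit in text.encode_utf16() {
        let bytes = if little_endian {
            unit.to_le_bytes()
        } else {
            unit.to_be_bytes()
        };
        out.extend_from_slice(&bytes);
    }
    out
}

fn main() {
    // Matches the "UTF-16LE with BOM" case: "A" -> FF FE 41 00.
    assert_eq!(encode_utf16_bytes("A", true, true), vec![0xFF, 0xFE, 0x41, 0x00]);
    // Matches the first code unit of "UTF-16BE No BOM with Japanese": こ (U+3053) -> 30 53.
    assert_eq!(encode_utf16_bytes("こ", false, false), vec![0x30, 0x53]);
}
```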

@@ -571,11 +571,6 @@ fn main() {
dap_adapters::init(cx);
auto_update_ui::init(cx);
reliability::init(client.clone(), cx);
// Initialize the language model registry first, then set up the extension proxy
// BEFORE extension_host::init so that extensions can register their LLM providers
// when they load.
language_model::init(app_state.client.clone(), cx);
language_models::init_extension_proxy(cx);
extension_host::init(
extension_host_proxy.clone(),
app_state.fs.clone(),
@@ -601,6 +596,7 @@ fn main() {
cx,
);
supermaven::init(app_state.client.clone(), cx);
language_model::init(app_state.client.clone(), cx);
language_models::init(app_state.user_store.clone(), app_state.client.clone(), cx);
acp_tools::init(cx);
edit_prediction_ui::init(cx);

353
docs/AGENTS.md
@@ -1,353 +0,0 @@
# Documentation Automation Agent Guidelines

This file governs automated documentation updates triggered by code changes. All automation phases must comply with these rules.

## Documentation System

This documentation uses **mdBook** (https://rust-lang.github.io/mdBook/).

### Key Files

- **`docs/src/SUMMARY.md`**: Table of contents following mdBook format (https://rust-lang.github.io/mdBook/format/summary.html)
- **`docs/book.toml`**: mdBook configuration
- **`docs/.prettierrc`**: Prettier config (80 char line width)

### SUMMARY.md Format

The `SUMMARY.md` file defines the book structure. Format rules:

- Chapter titles are links: `[Title](./path/to/file.md)`
- Nesting via indentation (2 spaces per level)
- Separators: `---` for horizontal rules between sections
- Draft chapters: `[Title]()` (empty parens, not yet written)

Example:

```markdown
# Section Title

- [Chapter](./chapter.md)
  - [Nested Chapter](./nested.md)

---

# Another Section
```

### Custom Preprocessor

The docs use a custom preprocessor (`docs_preprocessor`) that expands special commands:

| Syntax | Purpose | Example |
| ----------------------------- | ------------------------------------- | ------------------------------- |
| `{#kb action::ActionName}` | Keybinding for action | `{#kb agent::ToggleFocus}` |
| `{#action agent::ActionName}` | Action reference (renders as command) | `{#action agent::OpenSettings}` |

**Rules:**

- Always use preprocessor syntax for keybindings instead of hardcoding
- Action names use `snake_case` in the namespace, `PascalCase` for the action
- Common namespaces: `agent::`, `editor::`, `assistant::`, `vim::`
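
Taken together, a documentation sentence using both commands might look like this (using the two example actions from the table above):

```markdown
Press {#kb agent::ToggleFocus} to focus the panel, or run
{#action agent::OpenSettings} from the command palette.
```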

### Formatting Requirements

All documentation must pass **Prettier** formatting:

```sh
cd docs && npx prettier --check src/
```

Before any documentation change is considered complete:

1. Run Prettier to format: `cd docs && npx prettier --write src/`
2. Verify it passes: `cd docs && npx prettier --check src/`

Prettier config: 80 character line width (`docs/.prettierrc`)

### Section Anchors

Use `{#anchor-id}` syntax for linkable section headers:

```markdown
## Getting Started {#getting-started}

### Custom Models {#anthropic-custom-models}
```

Anchor IDs should be:

- Lowercase with hyphens
- Unique within the page
- Descriptive (can include parent context like `anthropic-custom-models`)

### Code Block Annotations

Use annotations after the language identifier to indicate file context:

```markdown
\`\`\`json [settings]
{
  "agent": { ... }
}
\`\`\`

\`\`\`json [keymap]
[
  { "bindings": { ... } }
]
\`\`\`
```

Valid annotations: `[settings]` (for settings.json), `[keymap]` (for keymap.json)

### Blockquote Formatting

Use bold labels for callouts:

```markdown
> **Note:** Important information the user should know.

> **Tip:** Helpful advice that saves time or improves workflow.

> **Warn:** Caution about potential issues or gotchas.
```

### Image References

Images are hosted externally. Reference format:

```markdown
![Alt text](https://zed.dev/img/...)
```

### Cross-Linking

- Relative links for same-directory: `[Agent Panel](./agent-panel.md)`
- With anchors: `[Custom Models](./llm-providers.md#anthropic-custom-models)`
- Parent directory: `[Telemetry](../telemetry.md)`
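
Combining these link forms, a typical "See Also" footer built from the examples above might read:

```markdown
## See Also

- [Agent Panel](./agent-panel.md)
- [Custom Models](./llm-providers.md#anthropic-custom-models)
- [Telemetry](../telemetry.md)
```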

## Scope

### In-Scope Documentation

- All Markdown files in `docs/src/`
- `docs/src/SUMMARY.md` (mdBook table of contents)
- Language-specific docs in `docs/src/languages/`
- Feature docs (AI, extensions, configuration, etc.)

### Out-of-Scope (Do Not Modify)

- `CHANGELOG.md`, `CONTRIBUTING.md`, `README.md` at repo root
- Inline code comments and rustdoc
- `CLAUDE.md`, `GEMINI.md`, or other AI instruction files
- Build configuration (`book.toml`, theme files, `docs_preprocessor`)
- Any file outside `docs/src/`

## Page Structure Patterns

### Standard Page Layout

Most documentation pages follow this structure:

1. **Title** (H1) - Single sentence or phrase
2. **Overview/Introduction** - 1-3 paragraphs explaining what this is
3. **Getting Started** `{#getting-started}` - Prerequisites and first steps
4. **Main Content** - Feature details, organized by topic
5. **Advanced/Configuration** - Power user options
6. **See Also** (optional) - Related documentation links

### Settings Documentation Pattern

When documenting settings:

1. Show the Settings Editor (UI) approach first
2. Then show JSON as "Or add this to your settings.json:"
3. Always show complete, valid JSON with surrounding structure:

```json [settings]
{
  "agent": {
    "default_model": {
      "provider": "anthropic",
      "model": "claude-sonnet-4"
    }
  }
}
```

### Provider/Feature Documentation Pattern

For each provider or distinct feature:

1. H3 heading with anchor: `### Provider Name {#provider-name}`
2. Brief description (1-2 sentences)
3. Setup steps (numbered list)
4. Configuration example (JSON code block)
5. Custom models section if applicable: `#### Custom Models {#provider-custom-models}`

## Style Rules

Inherit all conventions from `docs/.rules`. Key points:

### Voice

- Second person ("you"), present tense
- Direct and concise—no hedging ("simply", "just", "easily")
- Honest about limitations; no promotional language

### Formatting

- Keybindings: backticks with `+` for simultaneous keys (`Cmd+Shift+P`)
- Show both macOS and Linux/Windows variants when they differ
- Use `sh` code blocks for terminal commands
- Settings: show Settings Editor UI first, JSON as secondary

### Terminology

| Use | Instead of |
| --------------- | -------------------------------------- |
| folder | directory |
| project | workspace |
| Settings Editor | settings UI |
| command palette | command bar |
| panel | sidebar (be specific: "Project Panel") |

## Zed-Specific Conventions

### Recognized Rules Files

When documenting rules/instructions for AI, note that Zed recognizes these files (in priority order):

- `.rules`
- `.cursorrules`
- `.windsurfrules`
- `.clinerules`
- `.github/copilot-instructions.md`
- `AGENT.md`
- `AGENTS.md`
- `CLAUDE.md`
- `GEMINI.md`

### Settings File Locations

- macOS: `~/.config/zed/settings.json`
- Linux: `~/.config/zed/settings.json`
- Windows: `%AppData%\Zed\settings.json`

### Keymap File Locations

- macOS: `~/.config/zed/keymap.json`
- Linux: `~/.config/zed/keymap.json`
- Windows: `%AppData%\Zed\keymap.json`

## Safety Constraints

### Must Not

- Delete existing documentation files
- Remove sections documenting existing functionality
- Change URLs or anchor links without verifying references
- Modify `SUMMARY.md` structure without corresponding content
- Add speculative documentation for unreleased features
- Include internal implementation details not relevant to users

### Must

- Preserve existing structure when updating content
- Maintain backward compatibility of documented settings/commands
- Flag uncertainty explicitly rather than guessing
- Link to related documentation when adding new sections

## Change Classification

### Requires Documentation Update

- New user-facing features or commands
- Changed keybindings or default behaviors
- Modified settings schema or options
- Deprecated or removed functionality
- API changes affecting extensions

### Does Not Require Documentation Update

- Internal refactoring without behavioral changes
- Performance optimizations (unless user-visible)
- Bug fixes that restore documented behavior
- Test changes
- CI/CD changes

## Output Format

### Phase 4 Documentation Plan

When generating a documentation plan, use this structure:

```markdown
## Documentation Impact Assessment

### Summary

Brief description of code changes analyzed.

### Documentation Updates Required: [Yes/No]

### Planned Changes

#### 1. [File Path]

- **Section**: [Section name or "New section"]
- **Change Type**: [Update/Add/Deprecate]
- **Reason**: Why this change is needed
- **Description**: What will be added/modified

#### 2. [File Path]

...

### Uncertainty Flags

- [ ] [Description of any assumptions or areas needing confirmation]

### No Changes Needed

- [List files reviewed but not requiring updates, with brief reason]
```

### Phase 6 Summary Format

```markdown
## Documentation Update Summary

### Changes Made

| File | Change | Related Code |
| -------------- | ----------------- | ----------------- |
| path/to/doc.md | Brief description | link to PR/commit |

### Rationale

Brief explanation of why these updates were made.

### Review Notes

Any items reviewers should pay special attention to.
```

## Behavioral Guidelines

### Conservative by Default

- When uncertain whether to document something, flag it for human review
- Prefer smaller, focused updates over broad rewrites
- Do not "improve" documentation unrelated to the triggering code change

### Traceability

- Every documentation change should trace to a specific code change
- Include references to relevant commits, PRs, or issues in summaries

### Incremental Updates

- Update existing sections rather than creating parallel documentation
- Maintain consistency with surrounding content
- Follow the established patterns in each documentation area
@@ -2687,6 +2687,9 @@ These values take in the same options as the root-level settings with the same n

```json [settings]
{
  "language_models": {
    "anthropic": {
      "api_url": "https://api.anthropic.com"
    },
    "google": {
      "api_url": "https://generativelanguage.googleapis.com"
    },

823
extensions/google-ai/Cargo.lock
generated
@@ -1,823 +0,0 @@
|
||||
# This file is automatically @generated by Cargo.
|
||||
# It is not intended for manual editing.
|
||||
version = 4
|
||||
|
||||
[[package]]
|
||||
name = "adler2"
|
||||
version = "2.0.1"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "320119579fcad9c21884f5c4861d16174d0e06250625266f50fe6898340abefa"
|
||||
|
||||
[[package]]
|
||||
name = "anyhow"
|
||||
version = "1.0.100"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "a23eb6b1614318a8071c9b2521f36b424b2c83db5eb3a0fead4a6c0809af6e61"
|
||||
|
||||
[[package]]
|
||||
name = "auditable-serde"
|
||||
version = "0.8.0"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "5c7bf8143dfc3c0258df908843e169b5cc5fcf76c7718bd66135ef4a9cd558c5"
|
||||
dependencies = [
|
||||
"semver",
|
||||
"serde",
|
||||
"serde_json",
|
||||
"topological-sort",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "bitflags"
|
||||
version = "2.10.0"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "812e12b5285cc515a9c72a5c1d3b6d46a19dac5acfef5265968c166106e31dd3"
|
||||
|
||||
[[package]]
|
||||
name = "cfg-if"
|
||||
version = "1.0.4"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "9330f8b2ff13f34540b44e946ef35111825727b38d33286ef986142615121801"
|
||||
|
||||
[[package]]
|
||||
name = "crc32fast"
|
||||
version = "1.5.0"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "9481c1c90cbf2ac953f07c8d4a58aa3945c425b7185c9154d67a65e4230da511"
|
||||
dependencies = [
|
||||
"cfg-if",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "displaydoc"
|
||||
version = "0.2.5"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "97369cbbc041bc366949bc74d34658d6cda5621039731c6310521892a3a20ae0"
|
||||
dependencies = [
|
||||
"proc-macro2",
|
||||
"quote",
|
||||
"syn",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "equivalent"
|
||||
version = "1.0.2"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "877a4ace8713b0bcf2a4e7eec82529c029f1d0619886d18145fea96c3ffe5c0f"
|
||||
|
||||
[[package]]
|
||||
name = "flate2"
|
||||
version = "1.1.5"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "bfe33edd8e85a12a67454e37f8c75e730830d83e313556ab9ebf9ee7fbeb3bfb"
|
||||
dependencies = [
|
||||
"crc32fast",
|
||||
"miniz_oxide",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "foldhash"
|
||||
version = "0.1.5"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "d9c4f5dac5e15c24eb999c26181a6ca40b39fe946cbe4c263c7209467bc83af2"
|
||||
|
||||
[[package]]
|
||||
name = "form_urlencoded"
|
||||
version = "1.2.2"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "cb4cb245038516f5f85277875cdaa4f7d2c9a0fa0468de06ed190163b1581fcf"
|
||||
dependencies = [
|
||||
"percent-encoding",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "futures"
|
||||
version = "0.3.31"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "65bc07b1a8bc7c85c5f2e110c476c7389b4554ba72af57d8445ea63a576b0876"
|
||||
dependencies = [
|
||||
"futures-channel",
|
||||
"futures-core",
|
||||
"futures-executor",
|
||||
"futures-io",
|
||||
"futures-sink",
|
||||
"futures-task",
|
||||
"futures-util",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "futures-channel"
|
||||
version = "0.3.31"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "2dff15bf788c671c1934e366d07e30c1814a8ef514e1af724a602e8a2fbe1b10"
|
||||
dependencies = [
|
||||
"futures-core",
|
||||
"futures-sink",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "futures-core"
|
||||
version = "0.3.31"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "05f29059c0c2090612e8d742178b0580d2dc940c837851ad723096f87af6663e"
|
||||
|
||||
[[package]]
|
||||
name = "futures-executor"
|
||||
version = "0.3.31"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "1e28d1d997f585e54aebc3f97d39e72338912123a67330d723fdbb564d646c9f"
|
||||
dependencies = [
|
||||
"futures-core",
|
||||
"futures-task",
|
||||
"futures-util",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "futures-io"
|
||||
version = "0.3.31"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "9e5c1b78ca4aae1ac06c48a526a655760685149f0d465d21f37abfe57ce075c6"
|
||||
|
||||
[[package]]
|
||||
name = "futures-macro"
|
||||
version = "0.3.31"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "162ee34ebcb7c64a8abebc059ce0fee27c2262618d7b60ed8faf72fef13c3650"
|
||||
dependencies = [
|
||||
"proc-macro2",
|
||||
"quote",
|
||||
"syn",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "futures-sink"
|
||||
version = "0.3.31"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "e575fab7d1e0dcb8d0c7bcf9a63ee213816ab51902e6d244a95819acacf1d4f7"
|
||||
|
||||
[[package]]
|
||||
name = "futures-task"
|
||||
version = "0.3.31"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "f90f7dce0722e95104fcb095585910c0977252f286e354b5e3bd38902cd99988"
|
||||
|
||||
[[package]]
|
||||
name = "futures-util"
|
||||
version = "0.3.31"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "9fa08315bb612088cc391249efdc3bc77536f16c91f6cf495e6fbe85b20a4a81"
|
||||
dependencies = [
|
||||
"futures-channel",
|
||||
"futures-core",
|
||||
"futures-io",
|
||||
"futures-macro",
|
||||
"futures-sink",
|
||||
"futures-task",
|
||||
"memchr",
|
||||
"pin-project-lite",
|
||||
"pin-utils",
|
||||
"slab",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "google-ai"
|
||||
version = "0.1.0"
|
||||
dependencies = [
|
||||
"serde",
|
||||
"serde_json",
|
||||
"zed_extension_api",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "hashbrown"
|
||||
version = "0.15.5"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "9229cfe53dfd69f0609a49f65461bd93001ea1ef889cd5529dd176593f5338a1"
|
||||
dependencies = [
|
||||
"foldhash",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "hashbrown"
|
||||
version = "0.16.1"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "841d1cc9bed7f9236f321df977030373f4a4163ae1a7dbfe1a51a2c1a51d9100"
|
||||
|
||||
[[package]]
|
||||
name = "heck"
|
||||
version = "0.5.0"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "2304e00983f87ffb38b55b444b5e3b60a884b5d30c0fca7d82fe33449bbe55ea"
|
||||
|
||||
[[package]]
|
||||
name = "icu_collections"
|
||||
version = "2.1.1"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "4c6b649701667bbe825c3b7e6388cb521c23d88644678e83c0c4d0a621a34b43"
|
||||
dependencies = [
|
||||
"displaydoc",
|
||||
"potential_utf",
|
||||
"yoke",
|
||||
"zerofrom",
|
||||
"zerovec",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "icu_locale_core"
|
||||
version = "2.1.1"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "edba7861004dd3714265b4db54a3c390e880ab658fec5f7db895fae2046b5bb6"
|
||||
dependencies = [
|
||||
"displaydoc",
|
||||
"litemap",
|
||||
"tinystr",
|
||||
"writeable",
|
||||
"zerovec",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "icu_normalizer"
|
||||
version = "2.1.1"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "5f6c8828b67bf8908d82127b2054ea1b4427ff0230ee9141c54251934ab1b599"
|
||||
dependencies = [
|
||||
"icu_collections",
|
||||
"icu_normalizer_data",
|
||||
"icu_properties",
|
||||
"icu_provider",
|
||||
"smallvec",
|
||||
"zerovec",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "icu_normalizer_data"
|
||||
version = "2.1.1"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "7aedcccd01fc5fe81e6b489c15b247b8b0690feb23304303a9e560f37efc560a"
|
||||
|
||||
[[package]]
|
||||
name = "icu_properties"
|
||||
version = "2.1.1"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "e93fcd3157766c0c8da2f8cff6ce651a31f0810eaa1c51ec363ef790bbb5fb99"
|
||||
dependencies = [
|
||||
"icu_collections",
|
||||
"icu_locale_core",
|
||||
"icu_properties_data",
|
||||
"icu_provider",
|
||||
"zerotrie",
|
||||
"zerovec",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "icu_properties_data"
|
||||
version = "2.1.1"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "02845b3647bb045f1100ecd6480ff52f34c35f82d9880e029d329c21d1054899"
|
||||
|
||||
[[package]]
|
||||
name = "icu_provider"
|
||||
version = "2.1.1"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "85962cf0ce02e1e0a629cc34e7ca3e373ce20dda4c4d7294bbd0bf1fdb59e614"
|
||||
dependencies = [
|
||||
"displaydoc",
|
||||
"icu_locale_core",
|
||||
"writeable",
|
||||
"yoke",
|
||||
"zerofrom",
|
||||
"zerotrie",
|
||||
"zerovec",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "id-arena"
|
||||
version = "2.2.1"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "25a2bc672d1148e28034f176e01fffebb08b35768468cc954630da77a1449005"
|
||||
|
||||
[[package]]
|
||||
name = "idna"
|
||||
version = "1.1.0"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "3b0875f23caa03898994f6ddc501886a45c7d3d62d04d2d90788d47be1b1e4de"
|
||||
dependencies = [
|
||||
"idna_adapter",
|
||||
"smallvec",
|
||||
"utf8_iter",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "idna_adapter"
|
||||
version = "1.2.1"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "3acae9609540aa318d1bc588455225fb2085b9ed0c4f6bd0d9d5bcd86f1a0344"
|
||||
dependencies = [
|
||||
"icu_normalizer",
|
||||
"icu_properties",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "indexmap"
|
||||
version = "2.12.1"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "0ad4bb2b565bca0645f4d68c5c9af97fba094e9791da685bf83cb5f3ce74acf2"
|
||||
dependencies = [
|
||||
"equivalent",
|
||||
"hashbrown 0.16.1",
|
||||
"serde",
|
||||
"serde_core",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "itoa"
|
||||
version = "1.0.15"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "4a5f13b858c8d314ee3e8f639011f7ccefe71f97f96e50151fb991f267928e2c"
|
||||
|
||||
[[package]]
|
||||
name = "leb128fmt"
|
||||
version = "0.1.0"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "09edd9e8b54e49e587e4f6295a7d29c3ea94d469cb40ab8ca70b288248a81db2"
|
||||
|
||||
[[package]]
|
||||
name = "litemap"
|
||||
version = "0.8.1"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "6373607a59f0be73a39b6fe456b8192fcc3585f602af20751600e974dd455e77"
|
||||
|
||||
[[package]]
|
||||
name = "log"
|
||||
version = "0.4.29"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "5e5032e24019045c762d3c0f28f5b6b8bbf38563a65908389bf7978758920897"
|
||||
|
||||
[[package]]
|
||||
name = "memchr"
|
||||
version = "2.7.6"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "f52b00d39961fc5b2736ea853c9cc86238e165017a493d1d5c8eac6bdc4cc273"
|
||||
|
||||
[[package]]
|
||||
name = "miniz_oxide"
|
||||
version = "0.8.9"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "1fa76a2c86f704bdb222d66965fb3d63269ce38518b83cb0575fca855ebb6316"
|
||||
dependencies = [
|
||||
"adler2",
|
||||
"simd-adler32",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "once_cell"
|
||||
version = "1.21.3"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "42f5e15c9953c5e4ccceeb2e7382a716482c34515315f7b03532b8b4e8393d2d"
|
||||
|
||||
[[package]]
|
||||
name = "percent-encoding"
|
||||
version = "2.3.2"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "9b4f627cb1b25917193a259e49bdad08f671f8d9708acfd5fe0a8c1455d87220"
|
||||
|
||||
[[package]]
|
||||
name = "pin-project-lite"
|
||||
version = "0.2.16"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "3b3cff922bd51709b605d9ead9aa71031d81447142d828eb4a6eba76fe619f9b"
|
||||
|
||||
[[package]]
|
||||
name = "pin-utils"
|
||||
version = "0.1.0"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "8b870d8c151b6f2fb93e84a13146138f05d02ed11c7e7c54f8826aaaf7c9f184"

[[package]]
name = "potential_utf"
version = "0.1.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b73949432f5e2a09657003c25bca5e19a0e9c84f8058ca374f49e0ebe605af77"
dependencies = [
 "zerovec",
]

[[package]]
name = "prettyplease"
version = "0.2.37"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "479ca8adacdd7ce8f1fb39ce9ecccbfe93a3f1344b3d0d97f20bc0196208f62b"
dependencies = [
 "proc-macro2",
 "syn",
]

[[package]]
name = "proc-macro2"
version = "1.0.103"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5ee95bc4ef87b8d5ba32e8b7714ccc834865276eab0aed5c9958d00ec45f49e8"
dependencies = [
 "unicode-ident",
]

[[package]]
name = "quote"
version = "1.0.42"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a338cc41d27e6cc6dce6cefc13a0729dfbb81c262b1f519331575dd80ef3067f"
dependencies = [
 "proc-macro2",
]

[[package]]
name = "ryu"
version = "1.0.20"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "28d3b2b1366ec20994f1fd18c3c594f05c5dd4bc44d8bb0c1c632c8d6829481f"

[[package]]
name = "semver"
version = "1.0.27"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d767eb0aabc880b29956c35734170f26ed551a859dbd361d140cdbeca61ab1e2"
dependencies = [
 "serde",
 "serde_core",
]

[[package]]
name = "serde"
version = "1.0.228"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9a8e94ea7f378bd32cbbd37198a4a91436180c5bb472411e48b5ec2e2124ae9e"
dependencies = [
 "serde_core",
 "serde_derive",
]

[[package]]
name = "serde_core"
version = "1.0.228"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "41d385c7d4ca58e59fc732af25c3983b67ac852c1a25000afe1175de458b67ad"
dependencies = [
 "serde_derive",
]

[[package]]
name = "serde_derive"
version = "1.0.228"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d540f220d3187173da220f885ab66608367b6574e925011a9353e4badda91d79"
dependencies = [
 "proc-macro2",
 "quote",
 "syn",
]

[[package]]
name = "serde_json"
version = "1.0.145"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "402a6f66d8c709116cf22f558eab210f5a50187f702eb4d7e5ef38d9a7f1c79c"
dependencies = [
 "itoa",
 "memchr",
 "ryu",
 "serde",
 "serde_core",
]

[[package]]
name = "simd-adler32"
version = "0.3.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d66dc143e6b11c1eddc06d5c423cfc97062865baf299914ab64caa38182078fe"

[[package]]
name = "slab"
version = "0.4.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7a2ae44ef20feb57a68b23d846850f861394c2e02dc425a50098ae8c90267589"

[[package]]
name = "smallvec"
version = "1.15.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "67b1b7a3b5fe4f1376887184045fcf45c69e92af734b7aaddc05fb777b6fbd03"

[[package]]
name = "spdx"
version = "0.10.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c3e17e880bafaeb362a7b751ec46bdc5b61445a188f80e0606e68167cd540fa3"
dependencies = [
 "smallvec",
]

[[package]]
name = "stable_deref_trait"
version = "1.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6ce2be8dc25455e1f91df71bfa12ad37d7af1092ae736f3a6cd0e37bc7810596"

[[package]]
name = "syn"
version = "2.0.111"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "390cc9a294ab71bdb1aa2e99d13be9c753cd2d7bd6560c77118597410c4d2e87"
dependencies = [
 "proc-macro2",
 "quote",
 "unicode-ident",
]

[[package]]
name = "synstructure"
version = "0.13.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "728a70f3dbaf5bab7f0c4b1ac8d7ae5ea60a4b5549c8a5914361c99147a709d2"
dependencies = [
 "proc-macro2",
 "quote",
 "syn",
]

[[package]]
name = "tinystr"
version = "0.8.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "42d3e9c45c09de15d06dd8acf5f4e0e399e85927b7f00711024eb7ae10fa4869"
dependencies = [
 "displaydoc",
 "zerovec",
]

[[package]]
name = "topological-sort"
version = "0.2.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ea68304e134ecd095ac6c3574494fc62b909f416c4fca77e440530221e549d3d"

[[package]]
name = "unicode-ident"
version = "1.0.22"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9312f7c4f6ff9069b165498234ce8be658059c6728633667c526e27dc2cf1df5"

[[package]]
name = "unicode-xid"
version = "0.2.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ebc1c04c71510c7f702b52b7c350734c9ff1295c464a03335b00bb84fc54f853"

[[package]]
name = "url"
version = "2.5.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "08bc136a29a3d1758e07a9cca267be308aeebf5cfd5a10f3f67ab2097683ef5b"
dependencies = [
 "form_urlencoded",
 "idna",
 "percent-encoding",
 "serde",
]

[[package]]
name = "utf8_iter"
version = "1.0.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b6c140620e7ffbb22c2dee59cafe6084a59b5ffc27a8859a5f0d494b5d52b6be"

[[package]]
name = "wasm-encoder"
version = "0.227.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "80bb72f02e7fbf07183443b27b0f3d4144abf8c114189f2e088ed95b696a7822"
dependencies = [
 "leb128fmt",
 "wasmparser",
]

[[package]]
name = "wasm-metadata"
version = "0.227.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ce1ef0faabbbba6674e97a56bee857ccddf942785a336c8b47b42373c922a91d"
dependencies = [
 "anyhow",
 "auditable-serde",
 "flate2",
 "indexmap",
 "serde",
 "serde_derive",
 "serde_json",
 "spdx",
 "url",
 "wasm-encoder",
 "wasmparser",
]

[[package]]
name = "wasmparser"
version = "0.227.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0f51cad774fb3c9461ab9bccc9c62dfb7388397b5deda31bf40e8108ccd678b2"
dependencies = [
 "bitflags",
 "hashbrown 0.15.5",
 "indexmap",
 "semver",
]

[[package]]
name = "wit-bindgen"
version = "0.41.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "10fb6648689b3929d56bbc7eb1acf70c9a42a29eb5358c67c10f54dbd5d695de"
dependencies = [
 "wit-bindgen-rt",
 "wit-bindgen-rust-macro",
]

[[package]]
name = "wit-bindgen-core"
version = "0.41.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "92fa781d4f2ff6d3f27f3cc9b74a73327b31ca0dc4a3ef25a0ce2983e0e5af9b"
dependencies = [
 "anyhow",
 "heck",
 "wit-parser",
]

[[package]]
name = "wit-bindgen-rt"
version = "0.41.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c4db52a11d4dfb0a59f194c064055794ee6564eb1ced88c25da2cf76e50c5621"
dependencies = [
 "bitflags",
 "futures",
 "once_cell",
]

[[package]]
name = "wit-bindgen-rust"
version = "0.41.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9d0809dc5ba19e2e98661bf32fc0addc5a3ca5bf3a6a7083aa6ba484085ff3ce"
dependencies = [
 "anyhow",
 "heck",
 "indexmap",
 "prettyplease",
 "syn",
 "wasm-metadata",
 "wit-bindgen-core",
 "wit-component",
]

[[package]]
name = "wit-bindgen-rust-macro"
version = "0.41.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ad19eec017904e04c60719592a803ee5da76cb51c81e3f6fbf9457f59db49799"
dependencies = [
 "anyhow",
 "prettyplease",
 "proc-macro2",
 "quote",
 "syn",
 "wit-bindgen-core",
 "wit-bindgen-rust",
]

[[package]]
name = "wit-component"
version = "0.227.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "635c3adc595422cbf2341a17fb73a319669cc8d33deed3a48368a841df86b676"
dependencies = [
 "anyhow",
 "bitflags",
 "indexmap",
 "log",
 "serde",
 "serde_derive",
 "serde_json",
 "wasm-encoder",
 "wasm-metadata",
 "wasmparser",
 "wit-parser",
]

[[package]]
name = "wit-parser"
version = "0.227.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ddf445ed5157046e4baf56f9138c124a0824d4d1657e7204d71886ad8ce2fc11"
dependencies = [
 "anyhow",
 "id-arena",
 "indexmap",
 "log",
 "semver",
 "serde",
 "serde_derive",
 "serde_json",
 "unicode-xid",
 "wasmparser",
]

[[package]]
name = "writeable"
version = "0.6.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9edde0db4769d2dc68579893f2306b26c6ecfbe0ef499b013d731b7b9247e0b9"

[[package]]
name = "yoke"
version = "0.8.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "72d6e5c6afb84d73944e5cedb052c4680d5657337201555f9f2a16b7406d4954"
dependencies = [
 "stable_deref_trait",
 "yoke-derive",
 "zerofrom",
]

[[package]]
name = "yoke-derive"
version = "0.8.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b659052874eb698efe5b9e8cf382204678a0086ebf46982b79d6ca3182927e5d"
dependencies = [
 "proc-macro2",
 "quote",
 "syn",
 "synstructure",
]

[[package]]
name = "zed_extension_api"
version = "0.8.0"
dependencies = [
 "serde",
 "serde_json",
 "wit-bindgen",
]

[[package]]
name = "zerofrom"
version = "0.1.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "50cc42e0333e05660c3587f3bf9d0478688e15d870fab3346451ce7f8c9fbea5"
dependencies = [
 "zerofrom-derive",
]

[[package]]
name = "zerofrom-derive"
version = "0.1.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d71e5d6e06ab090c67b5e44993ec16b72dcbaabc526db883a360057678b48502"
dependencies = [
 "proc-macro2",
 "quote",
 "syn",
 "synstructure",
]

[[package]]
name = "zerotrie"
version = "0.2.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2a59c17a5562d507e4b54960e8569ebee33bee890c70aa3fe7b97e85a9fd7851"
dependencies = [
 "displaydoc",
 "yoke",
 "zerofrom",
]

[[package]]
name = "zerovec"
version = "0.11.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6c28719294829477f525be0186d13efa9a3c602f7ec202ca9e353d310fb9a002"
dependencies = [
 "yoke",
 "zerofrom",
 "zerovec-derive",
]

[[package]]
name = "zerovec-derive"
version = "0.11.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "eadce39539ca5cb3985590102671f2567e659fca9666581ad3411d59207951f3"
dependencies = [
 "proc-macro2",
 "quote",
 "syn",
]
@@ -1,17 +0,0 @@
[package]
name = "google-ai"
version = "0.1.0"
edition = "2021"
publish = false
license = "Apache-2.0"

[workspace]

[lib]
path = "src/google_ai.rs"
crate-type = ["cdylib"]

[dependencies]
zed_extension_api = { path = "../../crates/extension_api" }
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
@@ -1 +0,0 @@
../../LICENSE-APACHE
@@ -1,13 +0,0 @@
id = "google-ai"
name = "Google AI"
description = "Google Gemini LLM provider for Zed."
version = "0.1.0"
schema_version = 1
authors = ["Zed Team"]
repository = "https://github.com/zed-industries/zed"

[language_model_providers.google]
name = "Google AI"

[language_model_providers.google.auth]
env_vars = ["GEMINI_API_KEY", "GOOGLE_AI_API_KEY"]
@@ -1,3 +0,0 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M7.44 12.27C7.81333 13.1217 8 14.0317 8 15C8 14.0317 8.18083 13.1217 8.5425 12.27C8.91583 11.4183 9.4175 10.6775 10.0475 10.0475C10.6775 9.4175 11.4183 8.92167 12.27 8.56C13.1217 8.18667 14.0317 8 15 8C14.0317 8 13.1217 7.81917 12.27 7.4575C11.4411 7.1001 10.6871 6.5895 10.0475 5.9525C9.4105 5.31293 8.8999 4.55891 8.5425 3.73C8.18083 2.87833 8 1.96833 8 1C8 1.96833 7.81333 2.87833 7.44 3.73C7.07833 4.58167 6.5825 5.3225 5.9525 5.9525C5.31293 6.5895 4.55891 7.1001 3.73 7.4575C2.87833 7.81917 1.96833 8 1 8C1.96833 8 2.87833 8.18667 3.73 8.56C4.58167 8.92167 5.3225 9.4175 5.9525 10.0475C6.5825 10.6775 7.07833 11.4183 7.44 12.27Z" fill="black"/>
</svg>
|
Before Width: | Height: | Size: 762 B |
File diff suppressed because it is too large
Load Diff
@@ -1,38 +0,0 @@
## Triage Watcher v0.1
# This is a small script that watches for new issues on the Zed repository and opens each one in a new browser tab interactively.
#
## Installing Julia
#
# You need Julia installed on your system:
#   curl -fsSL https://install.julialang.org | sh
#
## Running this script:
# 1. It only works on macOS/Linux.
#    Open a new Julia REPL with `julia` inside the `zed` repo.
# 2. Paste the following code.
# 3. Whenever you close your computer, press the Up arrow in the REPL + Enter to rerun the loop and resume.
function get_issues()
    # Keep only issues still awaiting triage; each `gh` output line starts with the issue number, followed by a tab.
    entries = filter(x -> occursin("state:needs triage", x), split(read(`gh issue list -L 10`, String), '\n'))
    top = findfirst.('\t', entries) .- 1
    [entries[i][begin:top[i]] for i in eachindex(entries)]
end

nums = get_issues();
while true
    new_nums = get_issues()
    # Open each new issue in a new browser tab
    for issue_num in setdiff(new_nums, nums)
        url = "https://github.com/zed-industries/zed/issues/" * issue_num
        println("\nOpening $url")
        open_tab = `/Applications/Google\ Chrome.app/Contents/MacOS/Google\ Chrome $url`
        try
            # Play a notification sound; `catch` (rather than the original empty `finally`) swallows
            # the error if the sound file or `afplay` is missing, so the loop keeps running.
            sound_file = "/Users/mrg/Downloads/mario_coin_sound.mp3"
            run(`afplay -v 0.02 $sound_file`)
        catch
        end
        run(open_tab)
    end
    nums = new_nums
    print("🧘🏼")
    sleep(60)
end
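The core of `get_issues` above — keep only `gh issue list` lines still labeled `state:needs triage`, then slice each line up to the first tab to get the issue number — can be sketched in Python as well. This is a minimal illustration, not part of the script; the `sample` string below is a hypothetical two-line `gh issue list` output.

```python
def parse_issue_numbers(gh_output: str) -> list[str]:
    """Return issue numbers (the text before the first tab) for lines
    still labeled 'state:needs triage'."""
    return [
        line.split("\t", 1)[0]
        for line in gh_output.splitlines()
        if "state:needs triage" in line and "\t" in line
    ]

# Hypothetical sample: columns are number, state, title, labels, age.
sample = (
    "43683\tOPEN\tExample bug\tstate:needs triage\t2d\n"
    "43600\tOPEN\tAlready triaged bug\tbug\t3d"
)
print(parse_issue_numbers(sample))  # → ['43683']
```

The Julia version computes the same slice with `findfirst('\t', entry) - 1`; splitting on the first tab is the equivalent operation.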