Merge remote-tracking branch 'origin/master' into goto_next_reference

pull/6465/head
Anthony Templeton 2 years ago
commit f77c800ab4

@@ -85,6 +85,7 @@ jobs:
    rust: stable
    target: x86_64-pc-windows-msvc
    cross: false
  # 23.03: build issues
  - build: aarch64-macos
    os: macos-latest
    rust: stable
@@ -113,6 +114,12 @@ jobs:
    mkdir -p runtime/grammars/sources
    tar xJf grammars/grammars.tar.xz -C runtime/grammars/sources
# The rust-toolchain action ignores rust-toolchain.toml files.
# Removing this before building with cargo ensures that the rust-toolchain
# is considered the same between installation and usage.
- name: Remove the rust-toolchain.toml file
run: rm rust-toolchain.toml
  - name: Install ${{ matrix.rust }} toolchain
    uses: dtolnay/rust-toolchain@master
    with:
@@ -155,6 +162,10 @@ jobs:
    shell: bash
    if: matrix.build == 'aarch64-linux' || matrix.build == 'x86_64-linux'
    run: |
# Required as of 22.x https://github.com/AppImage/AppImageKit/wiki/FUSE
sudo add-apt-repository universe
sudo apt install libfuse2
      mkdir dist
      name=dev
@@ -244,7 +255,7 @@ jobs:
    exe=".exe"
  fi
  pkgname=helix-$GITHUB_REF_NAME-$platform
  mkdir -p $pkgname
  cp $source/LICENSE $source/README.md $pkgname
  mkdir $pkgname/contrib
  cp -r $source/contrib/completion $pkgname/contrib

@@ -1,3 +1,276 @@
# 23.03 (2023-03-31)
23.03 brings some long-awaited and exciting features. Thank you to everyone involved! This release saw changes from 102 contributors.
For the full log, check out the [git log](https://github.com/helix-editor/helix/compare/22.12..23.03).
Also check out the [release notes](https://helix-editor.com/news/release-23-03-highlights/) for more commentary on larger features.
Breaking changes:
- Select diagnostic range in `goto_*_diag` commands ([#4713](https://github.com/helix-editor/helix/pull/4713), [#5164](https://github.com/helix-editor/helix/pull/5164), [#6193](https://github.com/helix-editor/helix/pull/6193))
- Remove jump behavior from `increment`/`decrement` ([#4123](https://github.com/helix-editor/helix/pull/4123), [#5929](https://github.com/helix-editor/helix/pull/5929))
- Select change range in `goto_*_change` commands ([#5206](https://github.com/helix-editor/helix/pull/5206))
- Split file modification indicator from filename statusline elements ([#4731](https://github.com/helix-editor/helix/pull/4731), [#6036](https://github.com/helix-editor/helix/pull/6036))
- Jump to symbol ranges in LSP goto commands ([#5986](https://github.com/helix-editor/helix/pull/5986))
- Workspace detection now stops at the first `.helix/` directory (merging multiple `.helix/languages.toml` configurations is no longer supported) ([#5748](https://github.com/helix-editor/helix/pull/5748))
Features:
- Dynamic workspace symbol picker ([#5055](https://github.com/helix-editor/helix/pull/5055))
- Soft-wrap ([#5420](https://github.com/helix-editor/helix/pull/5420), [#5786](https://github.com/helix-editor/helix/pull/5786), [#5893](https://github.com/helix-editor/helix/pull/5893), [#6142](https://github.com/helix-editor/helix/pull/6142), [#6440](https://github.com/helix-editor/helix/pull/6440))
- Initial support for LSP snippet completions ([#5864](https://github.com/helix-editor/helix/pull/5864), [b1f7528](https://github.com/helix-editor/helix/commit/b1f7528), [#6263](https://github.com/helix-editor/helix/pull/6263), [bbf4800](https://github.com/helix-editor/helix/commit/bbf4800), [90348b8](https://github.com/helix-editor/helix/commit/90348b8), [f87299f](https://github.com/helix-editor/helix/commit/f87299f), [#6371](https://github.com/helix-editor/helix/pull/6371), [9fe3adc](https://github.com/helix-editor/helix/commit/9fe3adc))
- Add a statusline element for showing the current version control HEAD ([#5682](https://github.com/helix-editor/helix/pull/5682))
- Display LSP type hints ([#5420](https://github.com/helix-editor/helix/pull/5420), [#5934](https://github.com/helix-editor/helix/pull/5934), [#6312](https://github.com/helix-editor/helix/pull/6312))
- Enable the Kitty keyboard protocol on terminals with support ([#4939](https://github.com/helix-editor/helix/pull/4939), [#6170](https://github.com/helix-editor/helix/pull/6170), [#6194](https://github.com/helix-editor/helix/pull/6194), [#6438](https://github.com/helix-editor/helix/pull/6438))
- Add a statusline element for the basename of the current file ([#5318](https://github.com/helix-editor/helix/pull/5318))
- Add substring matching syntax for the picker ([#5658](https://github.com/helix-editor/helix/pull/5658))
- Support LSP `textDocument/prepareRename` ([#6103](https://github.com/helix-editor/helix/pull/6103))
- Allow multiple runtime directories with priorities ([#5411](https://github.com/helix-editor/helix/pull/5411))
- Allow configuring whether to insert or replace completions ([#5728](https://github.com/helix-editor/helix/pull/5728))
- Allow per-workspace config file `.helix/config.toml` ([#5748](https://github.com/helix-editor/helix/pull/5748))
- Add `workspace-lsp-roots` config option to support multiple LSP roots for use with monorepos ([#5748](https://github.com/helix-editor/helix/pull/5748))
Commands:
- `:pipe-to` which pipes selections into a shell command and ignores output ([#4931](https://github.com/helix-editor/helix/pull/4931))
- `merge_consecutive_selections` (`A-_`) combines all consecutive selections ([#5047](https://github.com/helix-editor/helix/pull/5047))
- `rotate_view_reverse` which focuses the previous view ([#5356](https://github.com/helix-editor/helix/pull/5356))
- `goto_declaration` (`gD`, requires LSP) which jumps to a symbol's declaration ([#5646](https://github.com/helix-editor/helix/pull/5646))
- `file_picker_in_current_buffer_directory` ([#4666](https://github.com/helix-editor/helix/pull/4666))
- `:character-info` which shows information about the character under the cursor ([#4000](https://github.com/helix-editor/helix/pull/4000))
- `:toggle-option` for toggling config options at runtime ([#4085](https://github.com/helix-editor/helix/pull/4085))
- `dap_restart` for restarting a debug session in DAP ([#5651](https://github.com/helix-editor/helix/pull/5651))
- `:lsp-stop` to stop the language server of the current buffer ([#5964](https://github.com/helix-editor/helix/pull/5964))
- `:reset-diff-change` for resetting a diff hunk to its original text ([#4974](https://github.com/helix-editor/helix/pull/4974))
- `:config-open-workspace` for opening the config file local to the current workspace ([#5748](https://github.com/helix-editor/helix/pull/5748))
Usability improvements:
- Remove empty detail section in completion menu when LSP doesn't send details ([#4902](https://github.com/helix-editor/helix/pull/4902))
- Pass client information on LSP initialization ([#4904](https://github.com/helix-editor/helix/pull/4904))
- Allow specifying environment variables for language servers in language config ([#4004](https://github.com/helix-editor/helix/pull/4004))
- Allow detached git worktrees to be recognized as root paths ([#5097](https://github.com/helix-editor/helix/pull/5097))
- Improve error message handling for theme loading failures ([#5073](https://github.com/helix-editor/helix/pull/5073))
- Print the names of binaries required for LSP/DAP in health-check ([#5195](https://github.com/helix-editor/helix/pull/5195))
- Improve sorting in the picker in cases of ties ([#5169](https://github.com/helix-editor/helix/pull/5169))
- Add theming for prompt suggestions ([#5104](https://github.com/helix-editor/helix/pull/5104))
- Open a file picker when using `:open` on directories ([#2707](https://github.com/helix-editor/helix/pull/2707), [#5278](https://github.com/helix-editor/helix/pull/5278))
- Reload language config with `:config-reload` ([#5239](https://github.com/helix-editor/helix/pull/5239), [#5381](https://github.com/helix-editor/helix/pull/5381), [#5431](https://github.com/helix-editor/helix/pull/5431))
- Improve indent queries for python when the tree is errored ([#5332](https://github.com/helix-editor/helix/pull/5332))
- Picker: Open files without closing the picker with `A-ret` ([#4435](https://github.com/helix-editor/helix/pull/4435))
- Allow theming cursors by primary/secondary and by mode ([#5130](https://github.com/helix-editor/helix/pull/5130))
- Allow configuration of the minimum width for the line-numbers gutter ([#4724](https://github.com/helix-editor/helix/pull/4724), [#5696](https://github.com/helix-editor/helix/pull/5696))
- Use filename completer for `:run-shell-command` command ([#5729](https://github.com/helix-editor/helix/pull/5729))
- Surround with line-endings with `ms<ret>` ([#4571](https://github.com/helix-editor/helix/pull/4571))
- Hide duplicate symlinks in file pickers ([#5658](https://github.com/helix-editor/helix/pull/5658))
- Tabulate buffer picker contents ([#5777](https://github.com/helix-editor/helix/pull/5777))
- Add an option to disable LSP ([#4425](https://github.com/helix-editor/helix/pull/4425))
- Short-circuit tree-sitter and word object motions ([#5851](https://github.com/helix-editor/helix/pull/5851))
- Add exit code to failed command message ([#5898](https://github.com/helix-editor/helix/pull/5898))
- Make `m` textobject look for pairs enclosing selections ([#3344](https://github.com/helix-editor/helix/pull/3344))
- Negotiate LSP position encoding ([#5894](https://github.com/helix-editor/helix/pull/5894), [a48d1a4](https://github.com/helix-editor/helix/commit/a48d1a4))
- Display deprecated LSP completions with strikethrough ([#5932](https://github.com/helix-editor/helix/pull/5932))
- Add JSONRPC request ID to failed LSP/DAP request log messages ([#6010](https://github.com/helix-editor/helix/pull/6010), [#6018](https://github.com/helix-editor/helix/pull/6018))
- Ignore case when filtering LSP completions ([#6008](https://github.com/helix-editor/helix/pull/6008))
- Show current language when no arguments are passed to `:set-language` ([#5895](https://github.com/helix-editor/helix/pull/5895))
- Refactor and rewrite all book documentation ([#5534](https://github.com/helix-editor/helix/pull/5534))
- Separate diagnostic picker message and code ([#6095](https://github.com/helix-editor/helix/pull/6095))
- Add a config option to bypass undercurl detection ([#6253](https://github.com/helix-editor/helix/pull/6253))
- Only complete appropriate arguments for typed commands ([#5966](https://github.com/helix-editor/helix/pull/5966))
- Discard outdated LSP diagnostics ([3c9d5d0](https://github.com/helix-editor/helix/commit/3c9d5d0))
- Discard outdated LSP workspace edits ([b6a4927](https://github.com/helix-editor/helix/commit/b6a4927))
- Run shell commands asynchronously ([#6373](https://github.com/helix-editor/helix/pull/6373))
- Show diagnostic codes in LSP diagnostic messages ([#6378](https://github.com/helix-editor/helix/pull/6378))
- Highlight the current line in a DAP debug session ([#5957](https://github.com/helix-editor/helix/pull/5957))
- Hide signature help if it overlaps with the completion menu ([#5523](https://github.com/helix-editor/helix/pull/5523), [7a69c40](https://github.com/helix-editor/helix/commit/7a69c40))
Fixes:
- Fix behavior of `auto-completion` flag for completion-on-trigger ([#5042](https://github.com/helix-editor/helix/pull/5042))
- Reset editor mode when changing buffers ([#5072](https://github.com/helix-editor/helix/pull/5072))
- Respect scrolloff settings in mouse movements ([#5255](https://github.com/helix-editor/helix/pull/5255))
- Avoid trailing `s` when only one file is opened ([#5189](https://github.com/helix-editor/helix/pull/5189))
- Fix erroneous indent between closers of auto-pairs ([#5330](https://github.com/helix-editor/helix/pull/5330))
- Expand `~` when parsing file paths in `:open` ([#5329](https://github.com/helix-editor/helix/pull/5329))
- Fix theme inheritance for default themes ([#5218](https://github.com/helix-editor/helix/pull/5218))
- Fix `extend_line` with a count when the current line(s) are selected ([#5288](https://github.com/helix-editor/helix/pull/5288))
- Prompt: Fix autocompletion for paths containing periods ([#5175](https://github.com/helix-editor/helix/pull/5175))
- Skip serializing JSONRPC params if params is null ([#5471](https://github.com/helix-editor/helix/pull/5471))
- Fix interaction with the `xclip` clipboard provider ([#5426](https://github.com/helix-editor/helix/pull/5426))
- Fix undo/redo execution from the command palette ([#5294](https://github.com/helix-editor/helix/pull/5294))
- Fix highlighting of non-block cursors ([#5575](https://github.com/helix-editor/helix/pull/5575))
- Fix panic when nooping in `join_selections` and `join_selections_space` ([#5423](https://github.com/helix-editor/helix/pull/5423))
- Fix selecting a changed file in global search ([#5639](https://github.com/helix-editor/helix/pull/5639))
- Fix initial syntax highlight layer sort order ([#5196](https://github.com/helix-editor/helix/pull/5196))
- Fix UTF-8 length handling for shellwords ([#5738](https://github.com/helix-editor/helix/pull/5738))
- Remove C-j and C-k bindings from the completion menu ([#5070](https://github.com/helix-editor/helix/pull/5070))
- Always commit to history when pasting ([#5790](https://github.com/helix-editor/helix/pull/5790))
- Properly handle LSP position encoding ([#5711](https://github.com/helix-editor/helix/pull/5711))
- Fix infinite loop in `copy_selection_on_prev_line` ([#5888](https://github.com/helix-editor/helix/pull/5888))
- Fix completion popup positioning ([#5842](https://github.com/helix-editor/helix/pull/5842))
- Fix a panic when uncommenting a line with only a comment token ([#5933](https://github.com/helix-editor/helix/pull/5933))
- Fix panic in `goto_window_center` at EOF ([#5987](https://github.com/helix-editor/helix/pull/5987))
- Ignore invalid file URIs sent by a language server ([#6000](https://github.com/helix-editor/helix/pull/6000))
- Decode LSP URIs for the workspace diagnostics picker ([#6016](https://github.com/helix-editor/helix/pull/6016))
- Fix incorrect usages of `tab_width` with `indent_width` ([#5918](https://github.com/helix-editor/helix/pull/5918))
- DAP: Send Disconnect if the Terminated event is received ([#5532](https://github.com/helix-editor/helix/pull/5532))
- DAP: Validate key and index exist when requesting variables ([#5628](https://github.com/helix-editor/helix/pull/5628))
- Check LSP renaming support before prompting for rename text ([#6257](https://github.com/helix-editor/helix/pull/6257))
- Fix indent guide rendering ([#6136](https://github.com/helix-editor/helix/pull/6136))
- Fix division by zero panic ([#6155](https://github.com/helix-editor/helix/pull/6155))
- Fix lacking space panic ([#6109](https://github.com/helix-editor/helix/pull/6109))
- Send error replies for malformed and unhandled LSP requests ([#6058](https://github.com/helix-editor/helix/pull/6058))
- Fix table column calculations for dynamic pickers ([#5920](https://github.com/helix-editor/helix/pull/5920))
- Skip adding jumplist entries for `:<n>` line number previews ([#5751](https://github.com/helix-editor/helix/pull/5751))
- Fix completion race conditions ([#6173](https://github.com/helix-editor/helix/pull/6173))
- Fix `shrink_selection` with multiple cursors ([#6093](https://github.com/helix-editor/helix/pull/6093))
- Fix indentation calculation for lines with mixed tabs/spaces ([#6278](https://github.com/helix-editor/helix/pull/6278))
- No-op `client/registerCapability` LSP requests ([#6258](https://github.com/helix-editor/helix/pull/6258))
- Send the STOP signal to all processes in the process group ([#3546](https://github.com/helix-editor/helix/pull/3546))
- Fix workspace edit client capabilities declaration ([7bf168d](https://github.com/helix-editor/helix/commit/7bf168d))
- Fix highlighting in picker results with multiple columns ([#6333](https://github.com/helix-editor/helix/pull/6333))
- Canonicalize paths before stripping the current dir as a prefix ([#6290](https://github.com/helix-editor/helix/pull/6290))
- Fix truncation behavior for long path names in the file picker ([#6410](https://github.com/helix-editor/helix/pull/6410), [67783dd](https://github.com/helix-editor/helix/commit/67783dd))
- Fix theme reloading behavior in `:config-reload` ([ab819d8](https://github.com/helix-editor/helix/commit/ab819d8))
Themes:
- Update `serika` ([#5038](https://github.com/helix-editor/helix/pull/5038), [#6344](https://github.com/helix-editor/helix/pull/6344))
- Update `flatwhite` ([#5036](https://github.com/helix-editor/helix/pull/5036), [#6323](https://github.com/helix-editor/helix/pull/6323))
- Update `autumn` ([#5051](https://github.com/helix-editor/helix/pull/5051), [#5397](https://github.com/helix-editor/helix/pull/5397), [#6280](https://github.com/helix-editor/helix/pull/6280), [#6316](https://github.com/helix-editor/helix/pull/6316))
- Update `acme` ([#5019](https://github.com/helix-editor/helix/pull/5019), [#5486](https://github.com/helix-editor/helix/pull/5486), [#5488](https://github.com/helix-editor/helix/pull/5488))
- Update `gruvbox` themes ([#5066](https://github.com/helix-editor/helix/pull/5066), [#5333](https://github.com/helix-editor/helix/pull/5333), [#5540](https://github.com/helix-editor/helix/pull/5540), [#6285](https://github.com/helix-editor/helix/pull/6285), [#6295](https://github.com/helix-editor/helix/pull/6295))
- Update `base16_transparent` ([#5105](https://github.com/helix-editor/helix/pull/5105))
- Update `dark_high_contrast` ([#5105](https://github.com/helix-editor/helix/pull/5105))
- Update `dracula` ([#5236](https://github.com/helix-editor/helix/pull/5236), [#5627](https://github.com/helix-editor/helix/pull/5627), [#6414](https://github.com/helix-editor/helix/pull/6414))
- Update `monokai_pro_spectrum` ([#5250](https://github.com/helix-editor/helix/pull/5250), [#5602](https://github.com/helix-editor/helix/pull/5602))
- Update `rose_pine` ([#5267](https://github.com/helix-editor/helix/pull/5267), [#5489](https://github.com/helix-editor/helix/pull/5489), [#6384](https://github.com/helix-editor/helix/pull/6384))
- Update `kanagawa` ([#5273](https://github.com/helix-editor/helix/pull/5273), [#5571](https://github.com/helix-editor/helix/pull/5571), [#6085](https://github.com/helix-editor/helix/pull/6085))
- Update `emacs` ([#5334](https://github.com/helix-editor/helix/pull/5334))
- Add `github` themes ([#5353](https://github.com/helix-editor/helix/pull/5353), [efeec12](https://github.com/helix-editor/helix/commit/efeec12))
- Dark themes: `github_dark`, `github_dark_colorblind`, `github_dark_dimmed`, `github_dark_high_contrast`, `github_dark_tritanopia`
- Light themes: `github_light`, `github_light_colorblind`, `github_light_dimmed`, `github_light_high_contrast`, `github_light_tritanopia`
- Update `solarized` variants ([#5445](https://github.com/helix-editor/helix/pull/5445), [#6327](https://github.com/helix-editor/helix/pull/6327))
- Update `catppuccin` variants ([#5404](https://github.com/helix-editor/helix/pull/5404), [#6107](https://github.com/helix-editor/helix/pull/6107), [#6269](https://github.com/helix-editor/helix/pull/6269), [#6464](https://github.com/helix-editor/helix/pull/6464))
- Use curly underlines in built-in themes ([#5419](https://github.com/helix-editor/helix/pull/5419))
- Update `zenburn` ([#5573](https://github.com/helix-editor/helix/pull/5573))
- Rewrite `snazzy` ([#3971](https://github.com/helix-editor/helix/pull/3971))
- Add `monokai_aqua` ([#5578](https://github.com/helix-editor/helix/pull/5578))
- Add `markup.strikethrough` to existing themes ([#5619](https://github.com/helix-editor/helix/pull/5619))
- Update `sonokai` ([#5440](https://github.com/helix-editor/helix/pull/5440))
- Update `onedark` ([#5755](https://github.com/helix-editor/helix/pull/5755))
- Add `ayu_evolve` ([#5638](https://github.com/helix-editor/helix/pull/5638), [#6028](https://github.com/helix-editor/helix/pull/6028), [#6225](https://github.com/helix-editor/helix/pull/6225))
- Add `jellybeans` ([#5719](https://github.com/helix-editor/helix/pull/5719))
- Update `fleet_dark` ([#5605](https://github.com/helix-editor/helix/pull/5605), [#6266](https://github.com/helix-editor/helix/pull/6266), [#6324](https://github.com/helix-editor/helix/pull/6324), [#6375](https://github.com/helix-editor/helix/pull/6375))
- Add `darcula-solid` ([#5778](https://github.com/helix-editor/helix/pull/5778))
- Remove text background from monokai themes ([#6009](https://github.com/helix-editor/helix/pull/6009))
- Update `pop_dark` ([#5992](https://github.com/helix-editor/helix/pull/5992), [#6208](https://github.com/helix-editor/helix/pull/6208), [#6227](https://github.com/helix-editor/helix/pull/6227), [#6292](https://github.com/helix-editor/helix/pull/6292))
- Add `everblush` ([#6086](https://github.com/helix-editor/helix/pull/6086))
- Add `adwaita-dark` ([#6042](https://github.com/helix-editor/helix/pull/6042), [#6342](https://github.com/helix-editor/helix/pull/6342))
- Update `papercolor` ([#6162](https://github.com/helix-editor/helix/pull/6162))
- Update `onelight` ([#6192](https://github.com/helix-editor/helix/pull/6192), [#6276](https://github.com/helix-editor/helix/pull/6276))
- Add `molokai` ([#6260](https://github.com/helix-editor/helix/pull/6260))
- Update `ayu` variants ([#6329](https://github.com/helix-editor/helix/pull/6329))
- Update `tokyonight` variants ([#6349](https://github.com/helix-editor/helix/pull/6349))
- Update `nord` variants ([#6376](https://github.com/helix-editor/helix/pull/6376))
New languages:
- BibTeX ([#5064](https://github.com/helix-editor/helix/pull/5064))
- Mermaid.js ([#5147](https://github.com/helix-editor/helix/pull/5147))
- Crystal ([#4993](https://github.com/helix-editor/helix/pull/4993), [#5205](https://github.com/helix-editor/helix/pull/5205))
- MATLAB/Octave ([#5192](https://github.com/helix-editor/helix/pull/5192))
- `tfvars` (uses HCL) ([#5396](https://github.com/helix-editor/helix/pull/5396))
- Ponylang ([#5416](https://github.com/helix-editor/helix/pull/5416))
- DHall ([1f6809c](https://github.com/helix-editor/helix/commit/1f6809c))
- Sagemath ([#5649](https://github.com/helix-editor/helix/pull/5649))
- MSBuild ([#5793](https://github.com/helix-editor/helix/pull/5793))
- pem ([#5797](https://github.com/helix-editor/helix/pull/5797))
- passwd ([#4959](https://github.com/helix-editor/helix/pull/4959))
- hosts ([#4950](https://github.com/helix-editor/helix/pull/4950), [#5914](https://github.com/helix-editor/helix/pull/5914))
- uxntal ([#6047](https://github.com/helix-editor/helix/pull/6047))
- Yuck ([#6064](https://github.com/helix-editor/helix/pull/6064), [#6242](https://github.com/helix-editor/helix/pull/6242))
- GNU gettext PO ([#5996](https://github.com/helix-editor/helix/pull/5996))
- Sway ([#6023](https://github.com/helix-editor/helix/pull/6023))
- NASM ([#6068](https://github.com/helix-editor/helix/pull/6068))
- PRQL ([#6126](https://github.com/helix-editor/helix/pull/6126))
- reStructuredText ([#6180](https://github.com/helix-editor/helix/pull/6180))
- Smithy ([#6370](https://github.com/helix-editor/helix/pull/6370))
- VHDL ([#5826](https://github.com/helix-editor/helix/pull/5826))
- Rego (OpenPolicy Agent) ([#6415](https://github.com/helix-editor/helix/pull/6415))
- Nim ([#6123](https://github.com/helix-editor/helix/pull/6123))
Updated languages and queries:
- Use diff syntax for patch files ([#5085](https://github.com/helix-editor/helix/pull/5085))
- Add Haskell textobjects ([#5061](https://github.com/helix-editor/helix/pull/5061))
- Fix commonlisp configuration ([#5091](https://github.com/helix-editor/helix/pull/5091))
- Update Scheme ([bae890d](https://github.com/helix-editor/helix/commit/bae890d))
- Add indent queries for Bash ([#5149](https://github.com/helix-editor/helix/pull/5149))
- Recognize `c++` as a C++ extension ([#5183](https://github.com/helix-editor/helix/pull/5183))
- Enable HTTP server in `metals` (Scala) config ([#5551](https://github.com/helix-editor/helix/pull/5551))
- Change V-lang language server to `v ls` from `vls` ([#5677](https://github.com/helix-editor/helix/pull/5677))
- Inject comment grammar into Nix ([#5208](https://github.com/helix-editor/helix/pull/5208))
- Update Rust highlights ([#5238](https://github.com/helix-editor/helix/pull/5238), [#5349](https://github.com/helix-editor/helix/pull/5349))
- Fix HTML injection within Markdown ([#5265](https://github.com/helix-editor/helix/pull/5265))
- Fix comment token for godot ([#5276](https://github.com/helix-editor/helix/pull/5276))
- Expand injections for Vue ([#5268](https://github.com/helix-editor/helix/pull/5268))
- Add `.bash_aliases` as a Bash file-type ([#5347](https://github.com/helix-editor/helix/pull/5347))
- Fix comment token for sshclientconfig ([#5351](https://github.com/helix-editor/helix/pull/5351))
- Update Prisma ([#5417](https://github.com/helix-editor/helix/pull/5417))
- Update C++ ([#5457](https://github.com/helix-editor/helix/pull/5457))
- Add more file-types for Python ([#5593](https://github.com/helix-editor/helix/pull/5593))
- Update tree-sitter-scala ([#5576](https://github.com/helix-editor/helix/pull/5576))
- Add an injection regex for Lua ([#5606](https://github.com/helix-editor/helix/pull/5606))
- Add `build.gradle` to java roots configuration ([#5641](https://github.com/helix-editor/helix/pull/5641))
- Add Hub PR files to markdown file-types ([#5634](https://github.com/helix-editor/helix/pull/5634))
- Add an external formatter configuration for Cue ([#5679](https://github.com/helix-editor/helix/pull/5679))
- Add injections for builders and writers to Nix ([#5629](https://github.com/helix-editor/helix/pull/5629))
- Update tree-sitter-xml to fix whitespace parsing ([#5685](https://github.com/helix-editor/helix/pull/5685))
- Add `Justfile` to the make file-types configuration ([#5687](https://github.com/helix-editor/helix/pull/5687))
- Update tree-sitter-sql and highlight queries ([#5683](https://github.com/helix-editor/helix/pull/5683), [#5772](https://github.com/helix-editor/helix/pull/5772))
- Use the bash grammar and queries for env language ([#5720](https://github.com/helix-editor/helix/pull/5720))
- Add podspec files to ruby file-types ([#5811](https://github.com/helix-editor/helix/pull/5811))
- Recognize `.C` and `.H` file-types as C++ ([#5808](https://github.com/helix-editor/helix/pull/5808))
- Recognize plist and mobileconfig files as XML ([#5863](https://github.com/helix-editor/helix/pull/5863))
- Fix `select` indentation in Go ([#5713](https://github.com/helix-editor/helix/pull/5713))
- Check for external file modifications when writing ([#5805](https://github.com/helix-editor/helix/pull/5805))
- Recognize containerfiles as dockerfile syntax ([#5873](https://github.com/helix-editor/helix/pull/5873))
- Update godot grammar and queries ([#5944](https://github.com/helix-editor/helix/pull/5944), [#6186](https://github.com/helix-editor/helix/pull/6186))
- Improve DHall highlights ([#5959](https://github.com/helix-editor/helix/pull/5959))
- Recognize `.env.dist` and `source.env` as env language ([#6003](https://github.com/helix-editor/helix/pull/6003))
- Update tree-sitter-git-rebase ([#6030](https://github.com/helix-editor/helix/pull/6030), [#6094](https://github.com/helix-editor/helix/pull/6094))
- Improve SQL highlights ([#6041](https://github.com/helix-editor/helix/pull/6041))
- Improve markdown highlights and inject LaTeX ([#6100](https://github.com/helix-editor/helix/pull/6100))
- Add textobject queries for Elm ([#6084](https://github.com/helix-editor/helix/pull/6084))
- Recognize graphql schema file type ([#6159](https://github.com/helix-editor/helix/pull/6159))
- Improve highlighting in comments ([#6143](https://github.com/helix-editor/helix/pull/6143))
- Improve highlighting for JavaScript/TypeScript/ECMAScript languages ([#6205](https://github.com/helix-editor/helix/pull/6205))
- Improve PHP highlights ([#6203](https://github.com/helix-editor/helix/pull/6203), [#6250](https://github.com/helix-editor/helix/pull/6250), [#6299](https://github.com/helix-editor/helix/pull/6299))
- Improve Go highlights ([#6204](https://github.com/helix-editor/helix/pull/6204))
- Highlight unchecked sqlx functions as SQL in Rust ([#6256](https://github.com/helix-editor/helix/pull/6256))
- Improve Erlang highlights ([cdd6c8d](https://github.com/helix-editor/helix/commit/cdd6c8d))
- Improve Nix highlights ([fb4d703](https://github.com/helix-editor/helix/commit/fb4d703))
- Improve gdscript highlights ([#6311](https://github.com/helix-editor/helix/pull/6311))
- Improve Vlang highlights ([#6279](https://github.com/helix-editor/helix/pull/6279))
- Improve Makefile highlights ([#6339](https://github.com/helix-editor/helix/pull/6339))
- Remove auto-pair for `'` in OCaml ([#6381](https://github.com/helix-editor/helix/pull/6381))
- Fix indents in switch statements in ECMA languages ([#6369](https://github.com/helix-editor/helix/pull/6369))
- Recognize xlb and storyboard file-types as XML ([#6407](https://github.com/helix-editor/helix/pull/6407))
- Recognize cts and mts file-types as TypeScript ([#6424](https://github.com/helix-editor/helix/pull/6424))
- Recognize SVG file-type as XML ([#6431](https://github.com/helix-editor/helix/pull/6431))
- Add theme scopes for (un)checked list item markup scopes ([#6434](https://github.com/helix-editor/helix/pull/6434))
- Update git commit grammar and add the comment textobject ([#6439](https://github.com/helix-editor/helix/pull/6439), [#6493](https://github.com/helix-editor/helix/pull/6493))
- Recognize ARB file-type as JSON ([#6452](https://github.com/helix-editor/helix/pull/6452))
- Inject markdown into markdown strings in Julia ([#6489](https://github.com/helix-editor/helix/pull/6489))
Packaging:
- Fix Nix flake devShell for darwin hosts ([#5368](https://github.com/helix-editor/helix/pull/5368))
- Add Appstream metadata file to `contrib/` ([#5643](https://github.com/helix-editor/helix/pull/5643))
- Increase the MSRV to 1.65 ([#5570](https://github.com/helix-editor/helix/pull/5570), [#6185](https://github.com/helix-editor/helix/pull/6185))
- Expose the Nix flake's `wrapper` ([#5994](https://github.com/helix-editor/helix/pull/5994))
# 22.12 (2022-12-06)
This is a great big release filled with changes from 99 contributors. A big _thank you_ to you all!

Cargo.lock (generated, 585 lines changed) — file diff suppressed because it is too large.

@@ -32,6 +32,3 @@ inherits = "test"
package.helix-core.opt-level = 2
package.helix-tui.opt-level = 2
package.helix-term.opt-level = 2
[patch.crates-io]
tree-sitter = { git = "https://github.com/tree-sitter/tree-sitter", rev = "c51896d32dcc11a38e41f36e3deb1a6a9c4f4b14" }

@@ -1 +1 @@
23.03

@@ -30,6 +30,9 @@ You can use a custom configuration file by specifying it with the `-c` or
Additionally, you can reload the configuration file by sending the USR1
signal to the Helix process on Unix operating systems, such as by using the command `pkill -USR1 hx`.
Finally, you can have a `config.toml` local to a project by putting it under a `.helix` directory in your repository.
Its settings will be merged with the `config.toml` in your configuration directory and the built-in configuration.
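For example, a minimal per-workspace override might look like the following (the chosen options are illustrative; any option from this chapter can appear here):

```toml
# .helix/config.toml in the repository root — merged over the global config.toml
theme = "onedark"

[editor]
rulers = [100]
color-modes = true
```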
 ## Editor

 ### `[editor]` Section
@@ -57,7 +60,8 @@ signal to the Helix process on Unix operating systems, such as by using the comm
 | `rulers` | List of column positions at which to display the rulers. Can be overridden by language specific `rulers` in `languages.toml` file | `[]` |
 | `bufferline` | Renders a line at the top of the editor displaying open buffers. Can be `always`, `never` or `multiple` (only shown if more than one buffer is in use) | `never` |
 | `color-modes` | Whether to color the mode indicator with different colors depending on the mode itself | `false` |
-| `text-width` | Maximum line length. Used for the `:reflow` command and soft-wrapping if `soft-wrap.wrap_at_text_width` is set | `80` |
+| `text-width` | Maximum line length. Used for the `:reflow` command and soft-wrapping if `soft-wrap.wrap-at-text-width` is set | `80` |
+| `workspace-lsp-roots` | Directories relative to the workspace root that are treated as LSP roots. Should only be set in `.helix/config.toml` | `[]` |
### `[editor.statusline]` Section ### `[editor.statusline]` Section
@@ -123,6 +127,7 @@ The following statusline elements can be configured:
 | `auto-signature-help` | Enable automatic popup of signature help (parameter hints) | `true` |
 | `display-inlay-hints` | Display inlay hints[^2] | `false` |
 | `display-signature-help-docs` | Display docs under signature help popup | `true` |
+| `snippets` | Enables snippet completions. Requires a server restart (`:lsp-restart`) to take effect after `:config-reload`/`:set`. | `true` |

 [^1]: By default, a progress spinner is shown in the statusline beside the file path.
 [^2]: You may also have to activate them in the LSP config for them to appear, not just in Helix.
@@ -156,7 +161,7 @@ All git related options are only enabled in a git repository.
 | Key | Description | Default |
 |--|--|---------|
 |`hidden` | Enables ignoring hidden files | true
-|`follow-links` | Follow symlinks instead of ignoring them | true
+|`follow-symlinks` | Follow symlinks instead of ignoring them | true
 |`deduplicate-links` | Ignore symlinks that point at files already shown in the picker | true
 |`parents` | Enables reading ignore files from parent directories | true
 |`ignore` | Enables reading `.ignore` files | true

@@ -2,9 +2,9 @@
 Helix's editing model is strongly inspired from Vim and Kakoune, and a notable
 difference from Vim (and the most striking similarity to Kakoune) is that Helix
-follows the `selection → action` model. This means that the whatever you are
-going to act on (a word, a paragraph, a line, etc) is selected first and the
-action itself (delete, change, yank, etc) comes second. A cursor is simply a
+follows the `selection → action` model. This means that whatever you are
+going to act on (a word, a paragraph, a line, etc.) is selected first and the
+action itself (delete, change, yank, etc.) comes second. A cursor is simply a
 single width selection.

 See also Kakoune's [Migrating from Vim](https://github.com/mawww/kakoune/wiki/Migrating-from-Vim) and Helix's [Migrating from Vim](https://github.com/helix-editor/helix/wiki/Migrating-from-Vim).

@@ -9,6 +9,7 @@
 | bicep | ✓ | | | `bicep-langserver` |
 | c | ✓ | ✓ | ✓ | `clangd` |
 | c-sharp | ✓ | ✓ | | `OmniSharp` |
+| cabal | ✓ | | | |
 | cairo | ✓ | | | |
 | capnp | ✓ | | ✓ | |
 | clojure | ✓ | | | `clojure-lsp` |
@@ -27,6 +28,7 @@
 | diff | ✓ | | | |
 | dockerfile | ✓ | | | `docker-langserver` |
 | dot | ✓ | | | `dot-language-server` |
+| dtd | ✓ | | | |
 | edoc | ✓ | | | |
 | eex | ✓ | | | |
 | ejs | ✓ | | | |
@@ -59,6 +61,7 @@
 | heex | ✓ | ✓ | | `elixir-ls` |
 | hosts | ✓ | | | |
 | html | ✓ | | | `vscode-html-language-server` |
+| hurl | ✓ | | ✓ | |
 | idris | | | | `idris2-lsp` |
 | iex | ✓ | | | |
 | ini | ✓ | | | |
@@ -68,7 +71,8 @@
 | json | ✓ | | ✓ | `vscode-json-language-server` |
 | jsonnet | ✓ | | | `jsonnet-language-server` |
 | jsx | ✓ | ✓ | ✓ | `typescript-language-server` |
-| julia | ✓ | | | `julia` |
+| julia | ✓ | ✓ | ✓ | `julia` |
+| just | ✓ | ✓ | ✓ | |
 | kdl | ✓ | | | |
 | kotlin | ✓ | | | `kotlin-language-server` |
 | latex | ✓ | ✓ | | `texlab` |
@@ -79,6 +83,7 @@
 | llvm-mir-yaml | ✓ | | ✓ | |
 | lua | ✓ | ✓ | ✓ | `lua-language-server` |
 | make | ✓ | | | |
+| markdoc | ✓ | | | `markdoc-ls` |
 | markdown | ✓ | | | `marksman` |
 | markdown.inline | ✓ | | | |
 | matlab | ✓ | | | |
@@ -94,6 +99,7 @@
 | ocaml | ✓ | | ✓ | `ocamllsp` |
 | ocaml-interface | ✓ | | | `ocamllsp` |
 | odin | ✓ | | | `ols` |
+| opencl | ✓ | ✓ | ✓ | `clangd` |
 | openscad | ✓ | | | `openscad-lsp` |
 | org | ✓ | | | |
 | pascal | ✓ | ✓ | | `pasls` |
@@ -116,6 +122,7 @@
 | rego | ✓ | | | `regols` |
 | rescript | ✓ | ✓ | | `rescript-language-server` |
 | rmarkdown | ✓ | | ✓ | `R` |
+| robot | ✓ | | | `robotframework_ls` |
 | ron | ✓ | | ✓ | |
 | rst | ✓ | | | |
 | ruby | ✓ | ✓ | ✓ | `solargraph` |

@@ -46,7 +46,7 @@
 | `:character-info`, `:char` | Get info about the character under the primary cursor. |
 | `:reload` | Discard changes and reload from the source file. |
 | `:reload-all` | Discard changes and reload all documents from the source files. |
-| `:update` | Write changes only if the file has been modified. |
+| `:update`, `:u` | Write changes only if the file has been modified. |
 | `:lsp-workspace-command` | Open workspace command picker |
 | `:lsp-restart` | Restarts the Language Server that is in use by the current doc |
 | `:lsp-stop` | Stops the Language Server that is in use by the current doc |
@@ -70,6 +70,7 @@
 | `:tree-sitter-subtree`, `:ts-subtree` | Display tree sitter subtree under cursor, primarily for debugging queries. |
 | `:config-reload` | Refresh user config. |
 | `:config-open` | Open the user config.toml file. |
+| `:config-open-workspace` | Open the workspace config.toml file. |
 | `:log-open` | Open the helix log file. |
 | `:insert-output` | Run shell command, inserting output before each selection. |
 | `:append-output` | Run shell command, appending output after each selection. |

@@ -12,6 +12,7 @@
 - [macOS](#macos)
   - [Homebrew Core](#homebrew-core)
 - [Windows](#windows)
+  - [Winget](#winget)
   - [Scoop](#scoop)
   - [Chocolatey](#chocolatey)
   - [MSYS2](#msys2)
@@ -40,8 +41,6 @@ line.
 ## Linux, macOS, Windows and OpenBSD packaging status

-Helix is available for Linux, macOS and Windows via the official repositories listed below.
-
 [![Packaging status](https://repology.org/badge/vertical-allrepos/helix.svg)](https://repology.org/project/helix/versions)

 ## Linux
@@ -50,7 +49,7 @@ The following third party repositories are available:
 ### Ubuntu

-Helix is available via [Maveonair's PPA](https://launchpad.net/~maveonair/+archive/ubuntu/helix-editor):
+Add the `PPA` for Helix:

 ```sh
 sudo add-apt-repository ppa:maveonair/helix-editor
@@ -60,7 +59,7 @@ sudo apt install helix
 ### Fedora/RHEL

-Helix is available via `copr`:
+Enable the `COPR` repository for Helix:

 ```sh
 sudo dnf copr enable varlad/helix
@@ -91,8 +90,8 @@ If you are using a version of Nix without flakes enabled,
 ### AppImage

-Install Helix using [AppImage](https://appimage.org/).
-Download Helix AppImage from the [latest releases](https://github.com/helix-editor/helix/releases/latest) page.
+Install Helix using the Linux [AppImage](https://appimage.org/) format.
+Download the official Helix AppImage from the [latest releases](https://github.com/helix-editor/helix/releases/latest) page.

 ```sh
 chmod +x helix-*.AppImage # change permission for executable mode
@@ -109,9 +108,17 @@ brew install helix
 ## Windows

-Install on Windows using [Scoop](https://scoop.sh/), [Chocolatey](https://chocolatey.org/)
+Install on Windows using [Winget](https://learn.microsoft.com/en-us/windows/package-manager/winget/), [Scoop](https://scoop.sh/), [Chocolatey](https://chocolatey.org/)
 or [MSYS2](https://msys2.org/).

+### Winget
+Windows Package Manager winget command-line tool is by default available on Windows 11 and modern versions of Windows 10 as a part of the App Installer.
+You can get [App Installer from the Microsoft Store](https://www.microsoft.com/p/app-installer/9nblggh4nns1#activetab=pivot:overviewtab). If it's already installed, make sure it is updated with the latest version.
+
+```sh
+winget install Helix.Helix
+```
+
 ### Scoop

 ```sh
@@ -134,33 +141,37 @@ pacman -S mingw-w64-ucrt-x86_64-helix
 ## Building from source

-Clone the repository:
+Requirements:
+
+- The [Rust toolchain](https://www.rust-lang.org/tools/install)
+- The [Git version control system](https://git-scm.com/)
+- A c++14 compatible compiler to build the tree-sitter grammars, for example GCC or Clang
+
+If you are using the `musl-libc` standard library instead of `glibc` the following environment variable must be set during the build to ensure tree-sitter grammars can be loaded correctly:
+
+```sh
+RUSTFLAGS="-C target-feature=-crt-static"
+```
+
+1. Clone the repository:

 ```sh
 git clone https://github.com/helix-editor/helix
 cd helix
 ```

-Compile from source:
+2. Compile from source:

 ```sh
 cargo install --path helix-term --locked
 ```

 This command will create the `hx` executable and construct the tree-sitter
-grammars in the local `runtime` folder. To build the tree-sitter grammars requires
-a c++ compiler to be installed, for example `gcc-c++`.
+grammars in the local `runtime` folder.

-> 💡 If you are using the musl-libc instead of glibc the following environment variable must be set during the build
-> to ensure tree-sitter grammars can be loaded correctly:
->
-> ```sh
-> RUSTFLAGS="-C target-feature=-crt-static"
-> ```
-
 > 💡 Tree-sitter grammars can be fetched and compiled if not pre-packaged. Fetch
-> grammars with `hx --grammar fetch` (requires `git`) and compile them with
-> `hx --grammar build` (requires a C++ compiler). This will install them in
+> grammars with `hx --grammar fetch` and compile them with
+> `hx --grammar build`. This will install them in
 > the `runtime` directory within the user's helix config directory (more
 > [details below](#multiple-runtime-directories)).

@@ -63,7 +63,8 @@ These configuration keys are available:
 | `config` | Language Server configuration |
 | `grammar` | The tree-sitter grammar to use (defaults to the value of `name`) |
 | `formatter` | The formatter for the language, it will take precedence over the lsp when defined. The formatter must be able to take the original file as input from stdin and write the formatted file to stdout |
-| `text-width` | Maximum line length. Used for the `:reflow` command and soft-wrapping if `soft-wrap.wrap_at_text_width` is set, defaults to `editor.text-width` |
+| `text-width` | Maximum line length. Used for the `:reflow` command and soft-wrapping if `soft-wrap.wrap-at-text-width` is set, defaults to `editor.text-width` |
+| `workspace-lsp-roots` | Directories relative to the workspace root that are treated as LSP roots. Should only be set in `.helix/config.toml`. Overwrites the setting of the same name in `config.toml` if set. |
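Taken together, keys like those in the table above could appear in a `languages.toml` entry along these lines. The language name and formatter command are hypothetical; the key names are the documented ones:

```toml
# Hypothetical languages.toml entry combining the keys documented above.
[[language]]
name = "mylang"    # hypothetical language name
text-width = 100   # overrides editor.text-width for this language
# The formatter takes the file on stdin and writes the result to stdout.
formatter = { command = "mylang-fmt", args = ["--stdin"] }
```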
 ### File-type detection and the `file-types` key

@@ -278,8 +278,11 @@ These scopes are used for theming the editor interface:
 | `ui.cursor.primary.normal` | |
 | `ui.cursor.primary.insert` | |
 | `ui.cursor.primary.select` | |
+| `ui.debug.breakpoint` | Breakpoint indicator, found in the gutter |
+| `ui.debug.active` | Indicator for the line at which debugging execution is paused at, found in the gutter |
 | `ui.gutter` | Gutter |
 | `ui.gutter.selected` | Gutter for the line the cursor is on |
+| `ui.highlight.frameline` | Line at which debugging execution is paused at |
 | `ui.linenr` | Line numbers |
 | `ui.linenr.selected` | Line number for the line the cursor is on |
 | `ui.statusline` | Statusline |

@@ -36,6 +36,9 @@
   <content_rating type="oars-1.1" />

   <releases>
+    <release version="23.03" date="2023-03-31">
+      <url>https://helix-editor.com/news/release-23-03-highlights/</url>
+    </release>
     <release version="22.12" date="2022-12-6">
       <url>https://helix-editor.com/news/release-22-12-highlights/</url>
     </release>

@@ -18,9 +18,6 @@
     },
     "dream2nix": {
       "inputs": {
-        "alejandra": [
-          "nci"
-        ],
         "all-cabal-json": [
           "nci"
         ],
@@ -28,6 +25,8 @@
         "devshell": [
           "nci"
         ],
+        "drv-parts": "drv-parts",
+        "flake-compat": "flake-compat",
         "flake-parts": [
           "nci",
           "parts"
@@ -51,6 +50,7 @@
           "nci",
           "nixpkgs"
         ],
+        "nixpkgsV1": "nixpkgsV1",
        "poetry2nix": [
           "nci"
         ],
@@ -62,11 +62,11 @@
         ]
       },
       "locked": {
-        "lastModified": 1677289985,
-        "narHash": "sha256-lUp06cTTlWubeBGMZqPl9jODM99LpWMcwxRiscFAUJg=",
+        "lastModified": 1680258209,
+        "narHash": "sha256-lEo50RXI/17/a9aCIun8Hz62ZJ5JM5RGeTgclIP+Lgc=",
         "owner": "nix-community",
         "repo": "dream2nix",
-        "rev": "28b973a8d4c30cc1cbb3377ea2023a76bc3fb889",
+        "rev": "6f512b5a220fdb26bd3c659f7b55e4f052ec8b35",
         "type": "github"
       },
       "original": {
@@ -75,6 +75,54 @@
         "type": "github"
       }
     },
+    "drv-parts": {
+      "inputs": {
+        "flake-compat": [
+          "nci",
+          "dream2nix",
+          "flake-compat"
+        ],
+        "flake-parts": [
+          "nci",
+          "dream2nix",
+          "flake-parts"
+        ],
+        "nixpkgs": [
+          "nci",
+          "dream2nix",
+          "nixpkgs"
+        ]
+      },
+      "locked": {
+        "lastModified": 1680172861,
+        "narHash": "sha256-QMyI338xRxaHFDlCXdLCtgelGQX2PdlagZALky4ZXJ8=",
+        "owner": "davhau",
+        "repo": "drv-parts",
+        "rev": "ced8a52f62b0a94244713df2225c05c85b416110",
+        "type": "github"
+      },
+      "original": {
+        "owner": "davhau",
+        "repo": "drv-parts",
+        "type": "github"
+      }
+    },
+    "flake-compat": {
+      "flake": false,
+      "locked": {
+        "lastModified": 1673956053,
+        "narHash": "sha256-4gtG9iQuiKITOjNQQeQIpoIB6b16fm+504Ch3sNKLd8=",
+        "owner": "edolstra",
+        "repo": "flake-compat",
+        "rev": "35bb57c0c8d8b62bbfd284272c928ceb64ddbde9",
+        "type": "github"
+      },
+      "original": {
+        "owner": "edolstra",
+        "repo": "flake-compat",
+        "type": "github"
+      }
+    },
     "flake-utils": {
       "locked": {
         "lastModified": 1659877975,
@@ -119,11 +167,11 @@
         ]
       },
       "locked": {
-        "lastModified": 1677297103,
-        "narHash": "sha256-ArlJIbp9NGV9yvhZdV0SOUFfRlI/kHeKoCk30NbSiLc=",
+        "lastModified": 1680329418,
+        "narHash": "sha256-+KN0eQLSZvL1J0kDO8/fxv0UCHTyZCADLmpIfeeiSGo=",
         "owner": "yusdacra",
         "repo": "nix-cargo-integration",
-        "rev": "a79272a2cb0942392bb3a5bf9a3ec6bc568795b2",
+        "rev": "98c1d2ff5155f0fee5d290f6b982cb990839d540",
         "type": "github"
       },
       "original": {
@@ -134,11 +182,11 @@
     },
     "nixpkgs": {
       "locked": {
-        "lastModified": 1677063315,
-        "narHash": "sha256-qiB4ajTeAOVnVSAwCNEEkoybrAlA+cpeiBxLobHndE8=",
+        "lastModified": 1680213900,
+        "narHash": "sha256-cIDr5WZIj3EkKyCgj/6j3HBH4Jj1W296z7HTcWj1aMA=",
         "owner": "nixos",
         "repo": "nixpkgs",
-        "rev": "988cc958c57ce4350ec248d2d53087777f9e1949",
+        "rev": "e3652e0735fbec227f342712f180f4f21f0594f2",
         "type": "github"
       },
       "original": {
@@ -151,11 +199,11 @@
     "nixpkgs-lib": {
       "locked": {
         "dir": "lib",
-        "lastModified": 1675183161,
-        "narHash": "sha256-Zq8sNgAxDckpn7tJo7V1afRSk2eoVbu3OjI1QklGLNg=",
+        "lastModified": 1678375444,
+        "narHash": "sha256-XIgHfGvjFvZQ8hrkfocanCDxMefc/77rXeHvYdzBMc8=",
         "owner": "NixOS",
         "repo": "nixpkgs",
-        "rev": "e1e1b192c1a5aab2960bf0a0bd53a2e8124fa18e",
+        "rev": "130fa0baaa2b93ec45523fdcde942f6844ee9f6e",
         "type": "github"
       },
       "original": {
@@ -166,6 +214,21 @@
         "type": "github"
       }
     },
+    "nixpkgsV1": {
+      "locked": {
+        "lastModified": 1678500271,
+        "narHash": "sha256-tRBLElf6f02HJGG0ZR7znMNFv/Uf7b2fFInpTHiHaSE=",
+        "owner": "NixOS",
+        "repo": "nixpkgs",
+        "rev": "5eb98948b66de29f899c7fe27ae112a47964baf8",
+        "type": "github"
+      },
+      "original": {
+        "id": "nixpkgs",
+        "ref": "nixos-22.11",
+        "type": "indirect"
+      }
+    },
     "parts": {
       "inputs": {
         "nixpkgs-lib": [
@@ -174,11 +237,11 @@
         ]
       },
       "locked": {
-        "lastModified": 1675933616,
-        "narHash": "sha256-/rczJkJHtx16IFxMmAWu5nNYcSXNg1YYXTHoGjLrLUA=",
+        "lastModified": 1679737941,
+        "narHash": "sha256-srSD9CwsVPnUMsIZ7Kt/UegkKUEBcTyU1Rev7mO45S0=",
         "owner": "hercules-ci",
         "repo": "flake-parts",
-        "rev": "47478a4a003e745402acf63be7f9a092d51b83d7",
+        "rev": "3502ee99d6dade045bdeaf7b0cd8ec703484c25c",
         "type": "github"
       },
       "original": {
@@ -192,11 +255,11 @@
       "nixpkgs-lib": "nixpkgs-lib"
     },
     "locked": {
-      "lastModified": 1675933616,
-      "narHash": "sha256-/rczJkJHtx16IFxMmAWu5nNYcSXNg1YYXTHoGjLrLUA=",
+      "lastModified": 1679737941,
+      "narHash": "sha256-srSD9CwsVPnUMsIZ7Kt/UegkKUEBcTyU1Rev7mO45S0=",
       "owner": "hercules-ci",
       "repo": "flake-parts",
-      "rev": "47478a4a003e745402acf63be7f9a092d51b83d7",
+      "rev": "3502ee99d6dade045bdeaf7b0cd8ec703484c25c",
       "type": "github"
     },
     "original": {
@@ -221,11 +284,11 @@
       ]
     },
     "locked": {
-      "lastModified": 1677292251,
-      "narHash": "sha256-D+6q5Z2MQn3UFJtqsM5/AvVHi3NXKZTIMZt1JGq/spA=",
+      "lastModified": 1680315536,
+      "narHash": "sha256-0AsBuKssJMbcRcw4HJQwJsUHhZxR5+gaf6xPQayhR44=",
       "owner": "oxalica",
       "repo": "rust-overlay",
-      "rev": "34cdbf6ad480ce13a6a526f57d8b9e609f3d65dc",
+      "rev": "5c8c151bdd639074a0051325c16df1a64ee23497",
       "type": "github"
     },
     "original": {

@@ -123,8 +123,6 @@
         then ''$RUSTFLAGS -C link-arg=-fuse-ld=lld -C target-cpu=native -Clink-arg=-Wl,--no-rosegment''
         else "$RUSTFLAGS";
     in {
-      # by default NCI adds rust-analyzer component, but helix toolchain doesn't have rust-analyzer
-      nci.toolchains.shell.components = ["rust-src" "rustfmt" "clippy"];
       nci.projects."helix-project".relPath = "";
       nci.crates."helix-term" = {
         overrides = {

@@ -29,9 +29,10 @@ tree-sitter = "0.20"
 once_cell = "1.17"
 arc-swap = "1"
 regex = "1"
-bitflags = "2.0"
+bitflags = "2.2"
 ahash = "0.8.3"
 hashbrown = { version = "0.13.2", features = ["raw"] }
+dunce = "1.0"
 log = "0.4"
 serde = { version = "1.0", features = ["derive"] }
@@ -44,7 +45,7 @@ encoding_rs = "0.8"
 chrono = { version = "0.4", default-features = false, features = ["alloc", "std"] }
-etcetera = "0.4"
+etcetera = "0.7"
 textwrap = "0.16.0"

 [dev-dependencies]

@@ -36,55 +36,12 @@ pub mod unicode {
     pub use unicode_width as width;
 }

+pub use helix_loader::find_workspace;
+
 pub fn find_first_non_whitespace_char(line: RopeSlice) -> Option<usize> {
     line.chars().position(|ch| !ch.is_whitespace())
 }

-/// Find project root.
-///
-/// Order of detection:
-/// * Top-most folder containing a root marker in current git repository
-/// * Git repository root if no marker detected
-/// * Top-most folder containing a root marker if not git repository detected
-/// * Current working directory as fallback
-pub fn find_root(root: Option<&str>, root_markers: &[String]) -> std::path::PathBuf {
-    let current_dir = std::env::current_dir().expect("unable to determine current directory");
-
-    let root = match root {
-        Some(root) => {
-            let root = std::path::Path::new(root);
-            if root.is_absolute() {
-                root.to_path_buf()
-            } else {
-                current_dir.join(root)
-            }
-        }
-        None => current_dir.clone(),
-    };
-
-    let mut top_marker = None;
-    for ancestor in root.ancestors() {
-        if root_markers
-            .iter()
-            .any(|marker| ancestor.join(marker).exists())
-        {
-            top_marker = Some(ancestor);
-        }
-
-        if ancestor.join(".git").exists() {
-            // Top marker is repo root if not root marker was detected yet
-            if top_marker.is_none() {
-                top_marker = Some(ancestor);
-            }
-            // Don't go higher than repo if we're in one
-            break;
-        }
-    }
-
-    // Return the found top marker or the current_dir as fallback
-    top_marker.map_or(current_dir, |a| a.to_path_buf())
-}
-
 pub use ropey::{self, str_utils, Rope, RopeBuilder, RopeSlice};

 // pub use tendril::StrTendril as Tendril;

@@ -62,7 +62,7 @@ pub fn move_vertically_visual(
     annotations: &mut TextAnnotations,
 ) -> Range {
     if !text_fmt.soft_wrap {
-        move_vertically(slice, range, dir, count, behaviour, text_fmt, annotations);
+        return move_vertically(slice, range, dir, count, behaviour, text_fmt, annotations);
     }
     annotations.clear_line_annotations();
     let pos = range.cursor(slice);

@@ -40,6 +40,21 @@ pub fn expand_tilde(path: &Path) -> PathBuf {
 /// needs to improve on.
 /// Copied from cargo: <https://github.com/rust-lang/cargo/blob/070e459c2d8b79c5b2ac5218064e7603329c92ae/crates/cargo-util/src/paths.rs#L81>
 pub fn get_normalized_path(path: &Path) -> PathBuf {
+    // normalization strategy is to canonicalize first ancestor path that exists (i.e., canonicalize as much as possible),
+    // then run handrolled normalization on the non-existent remainder
+    let (base, path) = path
+        .ancestors()
+        .find_map(|base| {
+            let canonicalized_base = dunce::canonicalize(base).ok()?;
+            let remainder = path.strip_prefix(base).ok()?.into();
+            Some((canonicalized_base, remainder))
+        })
+        .unwrap_or_else(|| (PathBuf::new(), PathBuf::from(path)));
+
+    if path.as_os_str().is_empty() {
+        return base;
+    }
+
     let mut components = path.components().peekable();
     let mut ret = if let Some(c @ Component::Prefix(..)) = components.peek().cloned() {
         components.next();
@@ -63,7 +78,7 @@ pub fn get_normalized_path(path: &Path) -> PathBuf {
             }
         }
     }

-    ret
+    base.join(ret)
 }

 /// Returns the canonical, absolute form of a path with all intermediate components normalized.
@@ -82,13 +97,19 @@ pub fn get_canonicalized_path(path: &Path) -> std::io::Result<PathBuf> {
 }

 pub fn get_relative_path(path: &Path) -> PathBuf {
+    let path = PathBuf::from(path);
     let path = if path.is_absolute() {
-        let cwdir = std::env::current_dir().expect("couldn't determine current directory");
-        path.strip_prefix(cwdir).unwrap_or(path)
+        let cwdir = std::env::current_dir()
+            .map(|path| get_normalized_path(&path))
+            .expect("couldn't determine current directory");
+        get_normalized_path(&path)
+            .strip_prefix(cwdir)
+            .map(PathBuf::from)
+            .unwrap_or(path)
     } else {
         path
     };
-    fold_home_dir(path)
+    fold_home_dir(&path)
 }

 /// Returns a truncated filepath where the basepart of the path is reduced to the first

@@ -109,7 +109,7 @@ pub fn visual_coords_at_pos(text: RopeSlice, pos: usize, tab_width: usize) -> Po
 /// softwrapping positions are estimated with an O(1) algorithm
 /// to ensure consistent performance for large lines (currently unimplemented)
 ///
-/// Usualy you want to use `visual_offset_from_anchor` instead but this function
+/// Usually you want to use `visual_offset_from_anchor` instead but this function
 /// can be useful (and faster) if
 /// * You already know the visual position of the block
 /// * You only care about the horizontal offset (column) and not the vertical offset (row)
@@ -291,7 +291,7 @@ pub fn pos_at_visual_coords(text: RopeSlice, coords: Position, tab_width: usize)
 ///
 /// If no (text) grapheme starts at exactly at the specified column the
 /// start of the grapheme to the left is returned. If there is no grapheme
-/// to the left (for example if the line starts with virtual text) then the positiong
+/// to the left (for example if the line starts with virtual text) then the positioning
 /// of the next grapheme to the right is returned.
 ///
 /// If the `line` coordinate is beyond the end of the file, the EOF

@@ -38,7 +38,7 @@ use std::borrow::Cow;
 /// Ranges are considered to be inclusive on the left and
 /// exclusive on the right, regardless of anchor-head ordering.
 /// This means, for example, that non-zero-width ranges that
-/// are directly adjecent, sharing an edge, do not overlap.
+/// are directly adjacent, sharing an edge, do not overlap.
 /// However, a zero-width range will overlap with the shared
 /// left-edge of another range.
 ///

@@ -294,14 +294,14 @@ mod test {
     #[test]
     fn test_lists() {
         let input =
-            r#":set statusline.center ["file-type","file-encoding"] '["list", "in", "qoutes"]'"#;
+            r#":set statusline.center ["file-type","file-encoding"] '["list", "in", "quotes"]'"#;
         let shellwords = Shellwords::from(input);
         let result = shellwords.words().to_vec();
         let expected = vec![
             Cow::from(":set"),
             Cow::from("statusline.center"),
             Cow::from(r#"["file-type","file-encoding"]"#),
-            Cow::from(r#"["list", "in", "qoutes"]"#),
+            Cow::from(r#"["list", "in", "quotes"]"#),
         ];
         assert_eq!(expected, result);
     }

@@ -20,7 +20,7 @@ use std::{
     fmt,
     hash::{Hash, Hasher},
     mem::{replace, transmute},
-    path::Path,
+    path::{Path, PathBuf},
     str::FromStr,
     sync::Arc,
 };
@@ -127,6 +127,10 @@ pub struct LanguageConfiguration {
     pub auto_pairs: Option<AutoPairs>,
     pub rulers: Option<Vec<u16>>, // if set, override editor's rulers
+    /// Hardcoded LSP root directories relative to the workspace root, like `examples` or `tools/fuzz`.
+    /// Falling back to the current working directory if none are configured.
+    pub workspace_lsp_roots: Option<Vec<PathBuf>>,
 }

 #[derive(Debug, PartialEq, Eq, Hash)]
@@ -551,6 +555,8 @@ impl LanguageConfiguration {
 #[serde(default, rename_all = "kebab-case", deny_unknown_fields)]
 pub struct SoftWrap {
     /// Soft wrap lines that exceed viewport width. Default to off
+    // NOTE: Option on purpose because the struct is shared between language config and global config.
+    // By default the option is None so that the language config falls back to the global config unless explicitly set.
     pub enable: Option<bool>,
     /// Maximum space left free at the end of the line.
     /// This space is used to wrap text at word boundaries. If that is not possible within this limit

@@ -172,7 +172,7 @@ impl TextAnnotations {
         for char_idx in char_range {
             if let Some((_, Some(highlight))) = self.overlay_at(char_idx) {
                 // we don't know the number of chars the original grapheme takes
-                // however it doesn't matter as highlight bounderies are automatically
+                // however it doesn't matter as highlight boundaries are automatically
                 // aligned to grapheme boundaries in the rendering code
                 highlights.push((highlight.0, char_idx..char_idx + 1))
             }
@@ -203,7 +203,7 @@ impl TextAnnotations {
     /// Add new grapheme overlays.
     ///
-    /// The overlayed grapheme will be rendered with `highlight`
+    /// The overlaid grapheme will be rendered with `highlight`
     /// patched on top of `ui.text`.
     ///
     /// The overlays **must be sorted** by their `char_idx`.

@@ -62,12 +62,10 @@ impl Client {
         if command.is_empty() {
             return Result::Err(Error::Other(anyhow!("Command not provided")));
         }
-        if transport == "tcp" && port_arg.is_some() {
-            Self::tcp_process(command, args, port_arg.unwrap(), id).await
-        } else if transport == "stdio" {
-            Self::stdio(command, args, id)
-        } else {
-            Result::Err(Error::Other(anyhow!("Incorrect transport {}", transport)))
+        match (transport, port_arg) {
+            ("tcp", Some(port_arg)) => Self::tcp_process(command, args, port_arg, id).await,
+            ("stdio", _) => Self::stdio(command, args, id),
+            _ => Result::Err(Error::Other(anyhow!("Incorrect transport {}", transport))),
         }
     }
@@ -512,4 +510,10 @@ impl Client {
         self.call::<requests::SetExceptionBreakpoints>(args)
     }
+
+    pub fn current_stack_frame(&self) -> Option<&StackFrame> {
+        self.stack_frames
+            .get(&self.thread_id?)?
+            .get(self.active_frame?)
+    }
 }

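The `if`/`else if` chain above is replaced by a match on the `(transport, port_arg)` pair. A minimal standalone sketch (hypothetical names, not the real `helix-dap` API) shows why: the port is only bound inside the `"tcp"` arm that needs it, eliminating the `port_arg.is_some()` check followed by `unwrap()`.

```rust
// Sketch of the transport-selection refactor: matching on the tuple
// binds `port` only where it is actually used, so no unwrap is needed.
fn select_transport(transport: &str, port_arg: Option<u16>) -> Result<String, String> {
    match (transport, port_arg) {
        ("tcp", Some(port)) => Ok(format!("tcp process on port {port}")),
        ("stdio", _) => Ok("stdio process".to_string()),
        _ => Err(format!("Incorrect transport {transport}")),
    }
}
```

Note that `("tcp", None)` now falls through to the error arm instead of panicking, which is the behavioral improvement the diff buys.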
@@ -230,38 +230,48 @@ impl Transport {
         }
     }

-    async fn recv(
+    async fn recv_inner(
         transport: Arc<Self>,
         mut server_stdout: Box<dyn AsyncBufRead + Unpin + Send>,
         client_tx: UnboundedSender<Payload>,
-    ) {
+    ) -> Result<()> {
         let mut recv_buffer = String::new();
         loop {
-            match Self::recv_server_message(&mut server_stdout, &mut recv_buffer).await {
-                Ok(msg) => {
-                    transport
-                        .process_server_message(&client_tx, msg)
-                        .await
-                        .unwrap();
-                }
-                Err(err) => {
-                    error!("err: <- {:?}", err);
-                    break;
-                }
-            }
+            let msg = Self::recv_server_message(&mut server_stdout, &mut recv_buffer).await?;
+            transport.process_server_message(&client_tx, msg).await?;
         }
     }

-    async fn send(
+    async fn recv(
+        transport: Arc<Self>,
+        server_stdout: Box<dyn AsyncBufRead + Unpin + Send>,
+        client_tx: UnboundedSender<Payload>,
+    ) {
+        if let Err(err) = Self::recv_inner(transport, server_stdout, client_tx).await {
+            error!("err: <- {:?}", err);
+        }
+    }
+
+    async fn send_inner(
         transport: Arc<Self>,
         mut server_stdin: Box<dyn AsyncWrite + Unpin + Send>,
         mut client_rx: UnboundedReceiver<Payload>,
-    ) {
+    ) -> Result<()> {
         while let Some(payload) = client_rx.recv().await {
             transport
                 .send_payload_to_server(&mut server_stdin, payload)
-                .await
-                .unwrap()
+                .await?;
         }
+        Ok(())
+    }
+
+    async fn send(
+        transport: Arc<Self>,
+        server_stdin: Box<dyn AsyncWrite + Unpin + Send>,
+        client_rx: UnboundedReceiver<Payload>,
+    ) {
+        if let Err(err) = Self::send_inner(transport, server_stdin, client_rx).await {
+            error!("err: <- {:?}", err);
+        }
     }

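The transport change above follows a common Rust pattern: a fallible `*_inner` function that propagates errors with `?`, wrapped by a thin public function that owns the logging. A minimal synchronous sketch (hypothetical names, strings standing in for the real payload types) of the same split:

```rust
// The `_inner` function returns Result and uses `?` instead of the old
// in-loop match + unwrap; the wrapper logs and swallows the error once.
fn process_all_inner(messages: &[Result<&str, &str>]) -> Result<Vec<String>, String> {
    let mut processed = Vec::new();
    for msg in messages {
        // was: match { Ok(msg) => ..unwrap(), Err(err) => { log; break } }
        let msg = msg.map_err(|e| e.to_string())?;
        processed.push(msg.to_uppercase());
    }
    Ok(processed)
}

fn process_all(messages: &[Result<&str, &str>]) -> Vec<String> {
    match process_all_inner(messages) {
        Ok(processed) => processed,
        Err(err) => {
            eprintln!("err: <- {err:?}");
            Vec::new()
        }
    }
}
```

The point of the refactor is that a failed `process_server_message` no longer panics the receive loop via `unwrap()`; it is reported through the same logging path as a failed read.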
@@ -17,7 +17,7 @@ path = "src/main.rs"
 anyhow = "1"
 serde = { version = "1.0", features = ["derive"] }
 toml = "0.7"
-etcetera = "0.4"
+etcetera = "0.7"
 tree-sitter = "0.20"
 once_cell = "1.17"
 log = "0.4"
@@ -27,6 +27,7 @@ log = "0.4"
 # cloning/compiling tree-sitter grammars
 cc = { version = "1" }
 threadpool = { version = "1.0" }
+tempfile = "3.5.0"

 [target.'cfg(not(target_arch = "wasm32"))'.dependencies]
-libloading = "0.7"
+libloading = "0.8"

@@ -1,4 +1,5 @@
 use std::borrow::Cow;
+use std::path::Path;
 use std::process::Command;

 const VERSION: &str = include_str!("../VERSION");
@@ -11,7 +12,7 @@ fn main() {
         .filter(|output| output.status.success())
         .and_then(|x| String::from_utf8(x.stdout).ok());

-    let version: Cow<_> = match git_hash {
+    let version: Cow<_> = match &git_hash {
         Some(git_hash) => format!("{} ({})", VERSION, &git_hash[..8]).into(),
         None => VERSION.into(),
     };
@@ -23,4 +24,40 @@ fn main() {
     println!("cargo:rerun-if-changed=../VERSION");
     println!("cargo:rustc-env=VERSION_AND_GIT_HASH={}", version);
+
+    if git_hash.is_none() {
+        return;
+    }
+
+    // we need to rev-parse because the git dir could be anywhere if you are
+    // using detached worktrees but there is no good way to obtain an OsString
+    // from command output so for now we can't accept non-utf8 paths here
+    // probably rare enough that it doesn't matter, though we could use gitoxide
+    // here, but that would make it a hard dependency and slow compile times
+    let Some(git_dir): Option<String> = Command::new("git")
+        .args(["rev-parse", "--git-dir"])
+        .output()
+        .ok()
+        .filter(|output| output.status.success())
+        .and_then(|x| String::from_utf8(x.stdout).ok())
+    else { return; };
+
+    // If HEAD starts pointing at something else (different branch)
+    // we need to rerun
+    let head = Path::new(&git_dir).join("HEAD");
+    if head.exists() {
+        println!("cargo:rerun-if-changed={}", head.display());
+    }
+
+    // if the thing HEAD points to (branch) itself changes
+    // we need to rerun
+    let Some(head_ref): Option<String> = Command::new("git")
+        .args(["symbolic-ref", "HEAD"])
+        .output()
+        .ok()
+        .filter(|output| output.status.success())
+        .and_then(|x| String::from_utf8(x.stdout).ok())
+    else { return; };
+
+    let head_ref = Path::new(&git_dir).join(head_ref);
+    if head_ref.exists() {
+        println!("cargo:rerun-if-changed={}", head_ref.display());
+    }
 }

@@ -9,37 +9,38 @@ pub fn default_lang_config() -> toml::Value {

 /// User configured languages.toml file, merged with the default config.
 pub fn user_lang_config() -> Result<toml::Value, toml::de::Error> {
-    let config = crate::local_config_dirs()
-        .into_iter()
-        .chain([crate::config_dir()].into_iter())
-        .map(|path| path.join("languages.toml"))
-        .filter_map(|file| {
-            std::fs::read_to_string(file)
-                .map(|config| toml::from_str(&config))
-                .ok()
-        })
-        .collect::<Result<Vec<_>, _>>()?
-        .into_iter()
-        .chain([default_lang_config()].into_iter())
-        .fold(toml::Value::Table(toml::value::Table::default()), |a, b| {
+    let config = [
+        crate::config_dir(),
+        crate::find_workspace().0.join(".helix"),
+    ]
+    .into_iter()
+    .map(|path| path.join("languages.toml"))
+    .filter_map(|file| {
+        std::fs::read_to_string(file)
+            .map(|config| toml::from_str(&config))
+            .ok()
+    })
+    .collect::<Result<Vec<_>, _>>()?
+    .into_iter()
+    .fold(default_lang_config(), |a, b| {
         // combines for example
         // b:
         //   [[language]]
         //   name = "toml"
         //   language-server = { command = "taplo", args = ["lsp", "stdio"] }
         //
         // a:
         //   [[language]]
         //   language-server = { command = "/usr/bin/taplo" }
         //
         // into:
         //   [[language]]
         //   name = "toml"
         //   language-server = { command = "/usr/bin/taplo" }
         //
         // thus it overrides the third depth-level of b with values of a if they exist, but otherwise merges their values
-            crate::merge_toml_values(b, a, 3)
-        });
+        crate::merge_toml_values(a, b, 3)
+    });

     Ok(config)
 }

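The `fold` rewrite above changes merge precedence: instead of chaining `default_lang_config()` at the end and folding with swapped arguments, the defaults are now the fold's initial accumulator and each config file is merged on top, so later sources win. A standalone sketch of that precedence (a flat string map stands in for the real `toml::Value` tables and `merge_toml_values`; the names are illustrative):

```rust
use std::collections::BTreeMap;

type Cfg = BTreeMap<&'static str, &'static str>;

// Merge `overlay` onto `base`: overlay keys win, everything else is kept.
fn merge(base: Cfg, overlay: Cfg) -> Cfg {
    let mut merged = base;
    merged.extend(overlay);
    merged
}

fn merged_config() -> Cfg {
    let defaults = BTreeMap::from([("command", "taplo"), ("args", "lsp stdio")]);
    let global = BTreeMap::from([("command", "/usr/bin/taplo")]);
    let workspace = BTreeMap::from([("args", "lsp tcp")]);
    // .fold(default_lang_config(), ..): defaults first, then each source on top,
    // so global config overrides defaults and workspace config overrides both.
    [global, workspace].into_iter().fold(defaults, merge)
}
```

The real `merge_toml_values(a, b, 3)` does the same thing recursively down to a depth of three table levels rather than replacing whole top-level keys.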
@@ -1,4 +1,4 @@
-use anyhow::{anyhow, Context, Result};
+use anyhow::{anyhow, bail, Context, Result};
 use serde::{Deserialize, Serialize};
 use std::fs;
 use std::time::SystemTime;
@@ -8,6 +8,7 @@ use std::{
     process::Command,
     sync::mpsc::channel,
 };
+use tempfile::TempPath;
 use tree_sitter::Language;

 #[cfg(unix)]
@@ -97,15 +98,12 @@ pub fn fetch_grammars() -> Result<()> {
     let mut git_up_to_date = 0;
     let mut non_git = Vec::new();

-    for res in results {
+    for (grammar_id, res) in results {
         match res {
             Ok(FetchStatus::GitUpToDate) => git_up_to_date += 1,
-            Ok(FetchStatus::GitUpdated {
-                grammar_id,
-                revision,
-            }) => git_updated.push((grammar_id, revision)),
-            Ok(FetchStatus::NonGit { grammar_id }) => non_git.push(grammar_id),
-            Err(e) => errors.push(e),
+            Ok(FetchStatus::GitUpdated { revision }) => git_updated.push((grammar_id, revision)),
+            Ok(FetchStatus::NonGit) => non_git.push(grammar_id),
+            Err(e) => errors.push((grammar_id, e)),
         }
     }
@@ -137,10 +135,10 @@ pub fn fetch_grammars() -> Result<()> {
     if !errors.is_empty() {
         let len = errors.len();
-        println!("{} grammars failed to fetch", len);
-        for (i, error) in errors.into_iter().enumerate() {
-            println!("\tFailure {}/{}: {}", i + 1, len, error);
+        for (i, (grammar, error)) in errors.into_iter().enumerate() {
+            println!("Failure {}/{len}: {grammar} {error}", i + 1);
         }
+        bail!("{len} grammars failed to fetch");
     }

     Ok(())
@@ -157,11 +155,11 @@ pub fn build_grammars(target: Option<String>) -> Result<()> {
     let mut already_built = 0;
     let mut built = Vec::new();

-    for res in results {
+    for (grammar_id, res) in results {
         match res {
             Ok(BuildStatus::AlreadyBuilt) => already_built += 1,
-            Ok(BuildStatus::Built { grammar_id }) => built.push(grammar_id),
-            Err(e) => errors.push(e),
+            Ok(BuildStatus::Built) => built.push(grammar_id),
+            Err(e) => errors.push((grammar_id, e)),
         }
     }
@@ -178,10 +176,10 @@ pub fn build_grammars(target: Option<String>) -> Result<()> {
     if !errors.is_empty() {
         let len = errors.len();
-        println!("{} grammars failed to build", len);
-        for (i, error) in errors.into_iter().enumerate() {
-            println!("\tFailure {}/{}: {}", i, len, error);
+        for (i, (grammar_id, error)) in errors.into_iter().enumerate() {
+            println!("Failure {}/{len}: {grammar_id} {error}", i + 1);
         }
+        bail!("{len} grammars failed to build");
     }

     Ok(())
@@ -213,7 +211,7 @@ fn get_grammar_configs() -> Result<Vec<GrammarConfiguration>> {
     Ok(grammars)
 }

-fn run_parallel<F, Res>(grammars: Vec<GrammarConfiguration>, job: F) -> Vec<Result<Res>>
+fn run_parallel<F, Res>(grammars: Vec<GrammarConfiguration>, job: F) -> Vec<(String, Result<Res>)>
 where
     F: Fn(GrammarConfiguration) -> Result<Res> + Send + 'static + Clone,
     Res: Send + 'static,
@@ -228,7 +226,7 @@ where
         pool.execute(move || {
             // Ignore any SendErrors, if any job in another thread has encountered an
            // error the Receiver will be closed causing this send to fail.
-            let _ = tx.send(job(grammar));
+            let _ = tx.send((grammar.grammar_id.clone(), job(grammar)));
         });
     }
@@ -239,13 +237,8 @@ where
 enum FetchStatus {
     GitUpToDate,
-    GitUpdated {
-        grammar_id: String,
-        revision: String,
-    },
-    NonGit {
-        grammar_id: String,
-    },
+    GitUpdated { revision: String },
+    NonGit,
 }

 fn fetch_grammar(grammar: GrammarConfiguration) -> Result<FetchStatus> {
@@ -286,17 +279,12 @@ fn fetch_grammar(grammar: GrammarConfiguration) -> Result<FetchStatus> {
             )?;
             git(&grammar_dir, ["checkout", &revision])?;

-            Ok(FetchStatus::GitUpdated {
-                grammar_id: grammar.grammar_id,
-                revision,
-            })
+            Ok(FetchStatus::GitUpdated { revision })
         } else {
             Ok(FetchStatus::GitUpToDate)
         }
     } else {
-        Ok(FetchStatus::NonGit {
-            grammar_id: grammar.grammar_id,
-        })
+        Ok(FetchStatus::NonGit)
     }
 }
@@ -346,7 +334,7 @@ where
 enum BuildStatus {
     AlreadyBuilt,
-    Built { grammar_id: String },
+    Built,
 }

 fn build_grammar(grammar: GrammarConfiguration, target: Option<&str>) -> Result<BuildStatus> {
@@ -413,6 +401,18 @@ fn build_tree_sitter_library(
     let mut library_path = parser_lib_path.join(&grammar.grammar_id);
     library_path.set_extension(DYLIB_EXTENSION);

+    // if we are running inside a buildscript emit cargo metadata
+    // to detect if we are running from a buildscript check some env variables
+    // that cargo only sets for build scripts
+    if std::env::var("OUT_DIR").is_ok() && std::env::var("CARGO").is_ok() {
+        if let Some(scanner_path) = scanner_path.as_ref().and_then(|path| path.to_str()) {
+            println!("cargo:rerun-if-changed={scanner_path}");
+        }
+        if let Some(parser_path) = parser_path.to_str() {
+            println!("cargo:rerun-if-changed={parser_path}");
+        }
+    }
+
     let recompile = needs_recompile(&library_path, &parser_path, &scanner_path)
         .context("Failed to compare source and binary timestamps")?;
@@ -433,16 +433,53 @@ fn build_tree_sitter_library(
     for (key, value) in compiler.env() {
         command.env(key, value);
     }
     command.args(compiler.args());
+    // used to delay dropping the temporary object file until after the compilation is complete
+    let _path_guard;

-    if cfg!(all(windows, target_env = "msvc")) {
+    if compiler.is_like_msvc() {
         command
             .args(["/nologo", "/LD", "/I"])
             .arg(header_path)
             .arg("/Od")
-            .arg("/utf-8");
+            .arg("/utf-8")
+            .arg("/std:c11");
         if let Some(scanner_path) = scanner_path.as_ref() {
-            command.arg(scanner_path);
+            if scanner_path.extension() == Some("c".as_ref()) {
+                command.arg(scanner_path);
+            } else {
+                let mut cpp_command = Command::new(compiler.path());
+                cpp_command.current_dir(src_path);
+                for (key, value) in compiler.env() {
+                    cpp_command.env(key, value);
+                }
+                cpp_command.args(compiler.args());
+                let object_file =
+                    library_path.with_file_name(format!("{}_scanner.obj", &grammar.grammar_id));
+                cpp_command
+                    .args(["/nologo", "/LD", "/I"])
+                    .arg(header_path)
+                    .arg("/Od")
+                    .arg("/utf-8")
+                    .arg("/std:c++14")
+                    .arg(format!("/Fo{}", object_file.display()))
+                    .arg("/c")
+                    .arg(scanner_path);
+                let output = cpp_command
+                    .output()
+                    .context("Failed to execute C++ compiler")?;
+                if !output.status.success() {
+                    return Err(anyhow!(
+                        "Parser compilation failed.\nStdout: {}\nStderr: {}",
+                        String::from_utf8_lossy(&output.stdout),
+                        String::from_utf8_lossy(&output.stderr)
+                    ));
+                }
+                command.arg(&object_file);
+                _path_guard = TempPath::from_path(object_file);
+            }
         }
@@ -454,20 +491,49 @@ fn build_tree_sitter_library(
             .arg("-shared")
             .arg("-fPIC")
             .arg("-fno-exceptions")
-            .arg("-g")
             .arg("-I")
             .arg(header_path)
             .arg("-o")
-            .arg(&library_path)
-            .arg("-O3");
+            .arg(&library_path);
         if let Some(scanner_path) = scanner_path.as_ref() {
             if scanner_path.extension() == Some("c".as_ref()) {
-                command.arg("-xc").arg("-std=c99").arg(scanner_path);
+                command.arg("-xc").arg("-std=c11").arg(scanner_path);
             } else {
-                command.arg(scanner_path);
+                let mut cpp_command = Command::new(compiler.path());
+                cpp_command.current_dir(src_path);
+                for (key, value) in compiler.env() {
+                    cpp_command.env(key, value);
+                }
+                cpp_command.args(compiler.args());
+                let object_file =
+                    library_path.with_file_name(format!("{}_scanner.o", &grammar.grammar_id));
+                cpp_command
+                    .arg("-fPIC")
+                    .arg("-fno-exceptions")
+                    .arg("-I")
+                    .arg(header_path)
+                    .arg("-o")
+                    .arg(&object_file)
+                    .arg("-std=c++14")
+                    .arg("-c")
+                    .arg(scanner_path);
+                let output = cpp_command
+                    .output()
+                    .context("Failed to execute C++ compiler")?;
+                if !output.status.success() {
+                    return Err(anyhow!(
+                        "Parser compilation failed.\nStdout: {}\nStderr: {}",
+                        String::from_utf8_lossy(&output.stdout),
+                        String::from_utf8_lossy(&output.stderr)
+                    ));
+                }
+                command.arg(&object_file);
+                _path_guard = TempPath::from_path(object_file);
             }
         }
-        command.arg("-xc").arg(parser_path);
+        command.arg("-xc").arg("-std=c11").arg(parser_path);
         if cfg!(all(
             unix,
             not(any(target_os = "macos", target_os = "illumos"))
@@ -487,9 +553,7 @@ fn build_tree_sitter_library(
         ));
     }

-    Ok(BuildStatus::Built {
-        grammar_id: grammar.grammar_id,
-    })
+    Ok(BuildStatus::Built)
 }

 fn needs_recompile(

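The `run_parallel` change above pairs each result with its grammar id at the send site, so fetch and build failures can be reported per grammar. A self-contained sketch of that pattern (using `std::thread` in place of the `threadpool` crate the real code uses, and a string id in place of `GrammarConfiguration`):

```rust
use std::sync::mpsc::channel;
use std::thread;

// Each worker sends back (id, Result) instead of a bare Result, so the
// caller can attribute every failure to a specific grammar.
fn run_parallel(
    ids: Vec<String>,
    job: fn(&str) -> Result<(), String>,
) -> Vec<(String, Result<(), String>)> {
    let (tx, rx) = channel();
    for id in ids {
        let tx = tx.clone();
        thread::spawn(move || {
            // Ignore SendErrors, mirroring the original: a closed receiver
            // just means collection has already stopped.
            let _ = tx.send((id.clone(), job(&id)));
        });
    }
    drop(tx); // close our sender so the collection below terminates
    let mut results: Vec<_> = rx.into_iter().collect();
    results.sort_by(|a, b| a.0.cmp(&b.0)); // deterministic order for reporting
    results
}
```

With the id attached, the reporting loop can print `Failure 1/2: <grammar> <error>` and then `bail!` with the total count, as the diff does.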
@@ -42,7 +42,7 @@ fn prioritize_runtime_dirs() -> Vec<PathBuf> {
     let mut rt_dirs = Vec::new();
     if let Ok(dir) = std::env::var("CARGO_MANIFEST_DIR") {
         // this is the directory of the crate being run by cargo, we need the workspace path so we take the parent
-        let path = std::path::PathBuf::from(dir).parent().unwrap().join(RT_DIR);
+        let path = PathBuf::from(dir).parent().unwrap().join(RT_DIR);
         log::debug!("runtime dir: {}", path.to_string_lossy());
         rt_dirs.push(path);
     }
@@ -113,15 +113,6 @@ pub fn config_dir() -> PathBuf {
     path
 }

-pub fn local_config_dirs() -> Vec<PathBuf> {
-    let directories = find_local_config_dirs()
-        .into_iter()
-        .map(|path| path.join(".helix"))
-        .collect();
-    log::debug!("Located configuration folders: {:?}", directories);
-    directories
-}
-
 pub fn cache_dir() -> PathBuf {
     // TODO: allow env var override
     let strategy = choose_base_strategy().expect("Unable to find the config directory!");
@@ -137,6 +128,10 @@ pub fn config_file() -> PathBuf {
         .unwrap_or_else(|| config_dir().join("config.toml"))
 }

+pub fn workspace_config_file() -> PathBuf {
+    find_workspace().0.join(".helix").join("config.toml")
+}
+
 pub fn lang_config_file() -> PathBuf {
     config_dir().join("languages.toml")
 }
@@ -145,22 +140,6 @@ pub fn log_file() -> PathBuf {
     cache_dir().join("helix.log")
 }

-pub fn find_local_config_dirs() -> Vec<PathBuf> {
-    let current_dir = std::env::current_dir().expect("unable to determine current directory");
-    let mut directories = Vec::new();
-
-    for ancestor in current_dir.ancestors() {
-        if ancestor.join(".git").exists() {
-            directories.push(ancestor.to_path_buf());
-            // Don't go higher than repo if we're in one
-            break;
-        } else if ancestor.join(".helix").is_dir() {
-            directories.push(ancestor.to_path_buf());
-        }
-    }
-
-    directories
-}
-
 /// Merge two TOML documents, merging values from `right` onto `left`
 ///
 /// When an array exists in both `left` and `right`, `right`'s array is
@@ -302,3 +281,21 @@ mod merge_toml_tests {
         )
     }
 }
+
+/// Finds the current workspace folder.
+/// Used as a ceiling dir for LSP root resolution, the filepicker and potentially as a future filewatching root
+///
+/// This function starts searching the FS upward from the CWD
+/// and returns the first directory that contains either `.git` or `.helix`.
+/// If no workspace was found returns (CWD, true).
+/// Otherwise (workspace, false) is returned
+pub fn find_workspace() -> (PathBuf, bool) {
+    let current_dir = std::env::current_dir().expect("unable to determine current directory");
+    for ancestor in current_dir.ancestors() {
+        if ancestor.join(".git").exists() || ancestor.join(".helix").exists() {
+            return (ancestor.to_owned(), false);
+        }
+    }
+
+    (current_dir, true)
+}

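The new `find_workspace` above replaces the old per-ancestor config-dir collection with a single upward walk that stops at the first `.git` or `.helix`. A filesystem-free sketch of the same ancestor walk, where a probe closure (an assumption of this sketch, not part of the real API) stands in for the existence checks:

```rust
use std::path::{Path, PathBuf};

// Climb from `start` toward the filesystem root; return the first directory
// the probe accepts as a workspace root, or (start, true) as the CWD fallback,
// matching find_workspace's (PathBuf, bool) contract.
fn find_workspace_in(start: &Path, is_root: impl Fn(&Path) -> bool) -> (PathBuf, bool) {
    for ancestor in start.ancestors() {
        if is_root(ancestor) {
            return (ancestor.to_owned(), false);
        }
    }
    (start.to_owned(), true)
}
```

Note that `Path::ancestors` yields `start` itself first, so a `.git` in the current directory wins before any parent is considered.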
@@ -27,3 +27,4 @@ thiserror = "1.0"
 tokio = { version = "1.27", features = ["rt", "rt-multi-thread", "io-util", "io-std", "time", "process", "macros", "fs", "parking_lot", "sync"] }
 tokio-stream = "0.1.12"
 which = "4.4"
+parking_lot = "0.12.1"

@@ -1,22 +1,26 @@
 use crate::{
-    jsonrpc,
+    find_lsp_workspace, jsonrpc,
     transport::{Payload, Transport},
     Call, Error, OffsetEncoding, Result,
 };
-use helix_core::{find_root, ChangeSet, Rope};
+use helix_core::{find_workspace, path, ChangeSet, Rope};
 use helix_loader::{self, VERSION_AND_GIT_HASH};
-use lsp::PositionEncodingKind;
+use lsp::{
+    notification::DidChangeWorkspaceFolders, DidChangeWorkspaceFoldersParams, OneOf,
+    PositionEncodingKind, WorkspaceFolder, WorkspaceFoldersChangeEvent,
+};
 use lsp_types as lsp;
+use parking_lot::Mutex;
 use serde::Deserialize;
 use serde_json::Value;
-use std::collections::HashMap;
 use std::future::Future;
 use std::process::Stdio;
 use std::sync::{
     atomic::{AtomicU64, Ordering},
     Arc,
 };
+use std::{collections::HashMap, path::PathBuf};
 use tokio::{
     io::{BufReader, BufWriter},
     process::{Child, Command},
@@ -26,6 +30,17 @@ use tokio::{
     },
 };

+fn workspace_for_uri(uri: lsp::Url) -> WorkspaceFolder {
+    lsp::WorkspaceFolder {
+        name: uri
+            .path_segments()
+            .and_then(|segments| segments.last())
+            .map(|basename| basename.to_string())
+            .unwrap_or_default(),
+        uri,
+    }
+}
+
 #[derive(Debug)]
 pub struct Client {
     id: usize,
@@ -36,11 +51,121 @@ pub struct Client {
     config: Option<Value>,
     root_path: std::path::PathBuf,
     root_uri: Option<lsp::Url>,
-    workspace_folders: Vec<lsp::WorkspaceFolder>,
+    workspace_folders: Mutex<Vec<lsp::WorkspaceFolder>>,
+    initialize_notify: Arc<Notify>,
+    /// workspace folders added while the server is still initializing
     req_timeout: u64,
 }

 impl Client {
+    pub fn try_add_doc(
+        self: &Arc<Self>,
+        root_markers: &[String],
+        manual_roots: &[PathBuf],
+        doc_path: Option<&std::path::PathBuf>,
+        may_support_workspace: bool,
+    ) -> bool {
+        let (workspace, workspace_is_cwd) = find_workspace();
+        let workspace = path::get_normalized_path(&workspace);
+        let root = find_lsp_workspace(
+            doc_path
+                .and_then(|x| x.parent().and_then(|x| x.to_str()))
+                .unwrap_or("."),
+            root_markers,
+            manual_roots,
+            &workspace,
+            workspace_is_cwd,
+        );
+        let root_uri = root
+            .as_ref()
+            .and_then(|root| lsp::Url::from_file_path(root).ok());
+
+        if self.root_path == root.unwrap_or(workspace)
+            || root_uri.as_ref().map_or(false, |root_uri| {
+                self.workspace_folders
+                    .lock()
+                    .iter()
+                    .any(|workspace| &workspace.uri == root_uri)
+            })
+        {
+            // workspace URI is already registered so we can use this client
+            return true;
+        }
+
+        // this server definitely doesn't support multiple workspaces, no need to check capabilities
+        if !may_support_workspace {
+            return false;
+        }
+
+        let Some(capabilities) = self.capabilities.get() else {
+            let client = Arc::clone(self);
+            // initialization hasn't finished yet, deal with this new root later
+            // TODO: In the edgecase that a **new root** is added
+            // for an LSP that **doesn't support workspace_folders** before initialization is finished
+            // the new roots are ignored.
+            // That particular edgecase would require retroactively spawning new LSP
+            // clients and therefore also require us to retroactively update the corresponding
+            // documents' LSP client handle. It's doable but a pretty weird edgecase so let's
+            // wait and see if anyone ever runs into it.
+            tokio::spawn(async move {
+                client.initialize_notify.notified().await;
+                if let Some(workspace_folders_caps) = client
+                    .capabilities()
+                    .workspace
+                    .as_ref()
+                    .and_then(|cap| cap.workspace_folders.as_ref())
+                    .filter(|cap| cap.supported.unwrap_or(false))
+                {
+                    client.add_workspace_folder(
+                        root_uri,
+                        &workspace_folders_caps.change_notifications,
+                    );
+                }
+            });
+            return true;
+        };
+
+        if let Some(workspace_folders_caps) = capabilities
+            .workspace
+            .as_ref()
+            .and_then(|cap| cap.workspace_folders.as_ref())
+            .filter(|cap| cap.supported.unwrap_or(false))
+        {
+            self.add_workspace_folder(root_uri, &workspace_folders_caps.change_notifications);
+            true
+        } else {
+            // the server doesn't support multiple workspaces, we need a new client
+            false
+        }
+    }
+
+    fn add_workspace_folder(
+        &self,
+        root_uri: Option<lsp::Url>,
+        change_notifications: &Option<OneOf<bool, String>>,
+    ) {
+        // root_uri being None just means that there isn't really any LSP workspace
+        // associated with this file. For servers that support multiple workspaces
+        // there is just one server so we can always just use that shared instance.
+        // No need to add a new workspace root here as there is no logical root for this file;
+        // let the server deal with it.
+        let Some(root_uri) = root_uri else {
+            return;
+        };
+
+        // server supports workspace folders, let's add the new root to the list
+        self.workspace_folders
+            .lock()
+            .push(workspace_for_uri(root_uri.clone()));
+        if &Some(OneOf::Left(false)) == change_notifications {
+            // server specifically opted out of DidWorkspaceChange notifications
+            // let's assume the server will request the workspace folders itself
+            // and that we can therefore reuse the client (but are done now)
+            return;
+        }
+        tokio::spawn(self.did_change_workspace(vec![workspace_for_uri(root_uri)], Vec::new()));
+    }
+
     #[allow(clippy::type_complexity)]
     #[allow(clippy::too_many_arguments)]
     pub fn start(
@@ -49,6 +174,7 @@ impl Client {
         config: Option<Value>,
         server_environment: HashMap<String, String>,
         root_markers: &[String],
+        manual_roots: &[PathBuf],
         id: usize,
         req_timeout: u64,
         doc_path: Option<&std::path::PathBuf>,
@@ -75,27 +201,26 @@ impl Client {
         let (server_rx, server_tx, initialize_notify) =
             Transport::start(reader, writer, stderr, id);
+        let (workspace, workspace_is_cwd) = find_workspace();
+        let workspace = path::get_normalized_path(&workspace);
+        let root = find_lsp_workspace(
+            doc_path
+                .and_then(|x| x.parent().and_then(|x| x.to_str()))
+                .unwrap_or("."),
-        let root_path = find_root(
-            doc_path.and_then(|x| x.parent().and_then(|x| x.to_str())),
             root_markers,
+            manual_roots,
+            &workspace,
+            workspace_is_cwd,
         );

-        let root_uri = lsp::Url::from_file_path(root_path.clone()).ok();
+        // `root_uri` and `workspace_folder` can be empty in case there is no workspace
+        // `root_path` can not, use `workspace` as a fallback
+        let root_path = root.clone().unwrap_or_else(|| workspace.clone());
+        let root_uri = root.and_then(|root| lsp::Url::from_file_path(root).ok());

-        // TODO: support multiple workspace folders
         let workspace_folders = root_uri
             .clone()
-            .map(|root| {
-                vec![lsp::WorkspaceFolder {
-                    name: root
-                        .path_segments()
-                        .and_then(|segments| segments.last())
-                        .map(|basename| basename.to_string())
-                        .unwrap_or_default(),
-                    uri: root,
-                }]
-            })
+            .map(|root| vec![workspace_for_uri(root)])
             .unwrap_or_default();

         let client = Self {
@@ -106,10 +231,10 @@ impl Client {
             capabilities: OnceCell::new(),
             config,
             req_timeout,
             root_path,
             root_uri,
-            workspace_folders,
+            workspace_folders: Mutex::new(workspace_folders),
+            initialize_notify: initialize_notify.clone(),
         };

         Ok((client, server_rx, initialize_notify))
@@ -154,7 +279,7 @@ impl Client {
                     "utf-16" => Some(OffsetEncoding::Utf16),
                     "utf-32" => Some(OffsetEncoding::Utf32),
                     encoding => {
-                        log::error!("Server provided invalid position encording {encoding}, defaulting to utf-16");
+                        log::error!("Server provided invalid position encoding {encoding}, defaulting to utf-16");
                         None
                     },
                 })
@@ -165,8 +290,10 @@ impl Client {
         self.config.as_ref()
     }

-    pub fn workspace_folders(&self) -> &[lsp::WorkspaceFolder] {
-        &self.workspace_folders
+    pub async fn workspace_folders(
+        &self,
+    ) -> parking_lot::MutexGuard<'_, Vec<lsp::WorkspaceFolder>> {
+        self.workspace_folders.lock()
     }

     /// Execute a RPC request on the language server.
@@ -286,7 +413,7 @@ impl Client {
     // General messages
     // -------------------------------------------------------------------------------------------

-    pub(crate) async fn initialize(&self) -> Result<lsp::InitializeResult> {
+    pub(crate) async fn initialize(&self, enable_snippets: bool) -> Result<lsp::InitializeResult> {
         if let Some(config) = &self.config {
             log::info!("Using custom LSP config: {}", config);
         }
@@ -294,7 +421,7 @@ impl Client {
         #[allow(deprecated)]
         let params = lsp::InitializeParams {
             process_id: Some(std::process::id()),
-            workspace_folders: Some(self.workspace_folders.clone()),
+            workspace_folders: Some(self.workspace_folders.lock().clone()),
             // root_path is obsolete, but some clients like pyright still use it so we specify both.
             // clients will prefer _uri if possible
             root_path: self.root_path.to_str().map(|path| path.to_owned()),
@ -334,7 +461,7 @@ impl Client {
text_document: Some(lsp::TextDocumentClientCapabilities { text_document: Some(lsp::TextDocumentClientCapabilities {
completion: Some(lsp::CompletionClientCapabilities { completion: Some(lsp::CompletionClientCapabilities {
completion_item: Some(lsp::CompletionItemCapability { completion_item: Some(lsp::CompletionItemCapability {
snippet_support: Some(true), snippet_support: Some(enable_snippets),
resolve_support: Some(lsp::CompletionItemCapabilityResolveSupport { resolve_support: Some(lsp::CompletionItemCapabilityResolveSupport {
properties: vec![ properties: vec![
String::from("documentation"), String::from("documentation"),
@ -413,8 +540,8 @@ impl Client {
}), }),
general: Some(lsp::GeneralClientCapabilities { general: Some(lsp::GeneralClientCapabilities {
position_encodings: Some(vec![ position_encodings: Some(vec![
PositionEncodingKind::UTF32,
PositionEncodingKind::UTF8, PositionEncodingKind::UTF8,
PositionEncodingKind::UTF32,
PositionEncodingKind::UTF16, PositionEncodingKind::UTF16,
]), ]),
..Default::default() ..Default::default()
@ -465,6 +592,16 @@ impl Client {
) )
} }
pub fn did_change_workspace(
&self,
added: Vec<WorkspaceFolder>,
removed: Vec<WorkspaceFolder>,
) -> impl Future<Output = Result<()>> {
self.notify::<DidChangeWorkspaceFolders>(DidChangeWorkspaceFoldersParams {
event: WorkspaceFoldersChangeEvent { added, removed },
})
}
// ------------------------------------------------------------------------------------------- // -------------------------------------------------------------------------------------------
// Text document // Text document
// ------------------------------------------------------------------------------------------- // -------------------------------------------------------------------------------------------

@ -10,11 +10,15 @@ pub use lsp::{Position, Url};
pub use lsp_types as lsp; pub use lsp_types as lsp;
use futures_util::stream::select_all::SelectAll; use futures_util::stream::select_all::SelectAll;
use helix_core::syntax::{LanguageConfiguration, LanguageServerConfiguration}; use helix_core::{
path,
syntax::{LanguageConfiguration, LanguageServerConfiguration},
};
use tokio::sync::mpsc::UnboundedReceiver; use tokio::sync::mpsc::UnboundedReceiver;
use std::{ use std::{
collections::{hash_map::Entry, HashMap}, collections::{hash_map::Entry, HashMap},
path::{Path, PathBuf},
sync::{ sync::{
atomic::{AtomicUsize, Ordering}, atomic::{AtomicUsize, Ordering},
Arc, Arc,
@ -128,7 +132,11 @@ pub mod util {
) -> Option<usize> { ) -> Option<usize> {
let pos_line = pos.line as usize; let pos_line = pos.line as usize;
if pos_line > doc.len_lines() - 1 { if pos_line > doc.len_lines() - 1 {
return None; // If it extends past the end, truncate it to the end. This is because the
// way the LSP describes the range including the last newline is by
// specifying a line number after what we would call the last line.
log::warn!("LSP position {pos:?} out of range assuming EOF");
return Some(doc.len_chars());
} }
// We need to be careful here to fully comply with the LSP spec. // We need to be careful here to fully comply with the LSP spec.
@ -144,10 +152,10 @@ pub mod util {
// > \n, \r\n and \r. Positions are line end character agnostic. // > \n, \r\n and \r. Positions are line end character agnostic.
// > So you can not specify a position that denotes \r|\n or \n| where | represents the character offset. // > So you can not specify a position that denotes \r|\n or \n| where | represents the character offset.
// //
// This means that while the line must be in bounds the `charater` // This means that while the line must be in bounds the `character`
// must be capped to the end of the line. // must be capped to the end of the line.
// Note that the end of the line here is **before** the line terminator // Note that the end of the line here is **before** the line terminator
// so we must use `line_end_char_index` istead of `doc.line_to_char(pos_line + 1)` // so we must use `line_end_char_index` instead of `doc.line_to_char(pos_line + 1)`
// //
// FIXME: Helix does not fully comply with the LSP spec for line terminators. // FIXME: Helix does not fully comply with the LSP spec for line terminators.
// The LSP standard requires that line terminators are ['\n', '\r\n', '\r']. // The LSP standard requires that line terminators are ['\n', '\r\n', '\r'].
@ -238,9 +246,20 @@ pub mod util {
pub fn lsp_range_to_range( pub fn lsp_range_to_range(
doc: &Rope, doc: &Rope,
range: lsp::Range, mut range: lsp::Range,
offset_encoding: OffsetEncoding, offset_encoding: OffsetEncoding,
) -> Option<Range> { ) -> Option<Range> {
// This is sort of an edge case. It's not clear from the spec how to deal with
// ranges where end < start. They don't make much sense, but vscode simply caps start to end,
// and because the behavior is unspecified quite a few language servers rely on it (for example the TS server)
if range.start > range.end {
log::error!(
"Invalid LSP range start {:?} > end {:?}, using an empty range at the end instead",
range.start,
range.end
);
range.start = range.end;
}
let start = lsp_pos_to_pos(doc, range.start, offset_encoding)?; let start = lsp_pos_to_pos(doc, range.start, offset_encoding)?;
let end = lsp_pos_to_pos(doc, range.end, offset_encoding)?; let end = lsp_pos_to_pos(doc, range.end, offset_encoding)?;
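As a sanity check, the capping rule above can be modeled on plain `(line, character)` tuples, which order lexicographically just like `lsp_types::Position`; `cap_range` is a hypothetical helper for illustration, not part of helix-lsp:

```rust
// Mirrors the capping above: an inverted LSP range collapses to an
// empty range at its `end` position.
fn cap_range(start: (u32, u32), end: (u32, u32)) -> ((u32, u32), (u32, u32)) {
    if start > end {
        // vscode-compatible fallback for end < start
        (end, end)
    } else {
        (start, end)
    }
}

fn main() {
    // An inverted range becomes empty at `end`.
    assert_eq!(cap_range((2, 5), (1, 0)), ((1, 0), (1, 0)));
    // A well-formed range passes through unchanged.
    assert_eq!(cap_range((0, 0), (0, 3)), ((0, 0), (0, 3)));
    println!("ok");
}
```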
@ -605,7 +624,7 @@ impl Notification {
#[derive(Debug)] #[derive(Debug)]
pub struct Registry { pub struct Registry {
inner: HashMap<LanguageId, (usize, Arc<Client>)>, inner: HashMap<LanguageId, Vec<(usize, Arc<Client>)>>,
counter: AtomicUsize, counter: AtomicUsize,
pub incoming: SelectAll<UnboundedReceiverStream<(usize, Call)>>, pub incoming: SelectAll<UnboundedReceiverStream<(usize, Call)>>,
@ -629,18 +648,24 @@ impl Registry {
pub fn get_by_id(&self, id: usize) -> Option<&Client> { pub fn get_by_id(&self, id: usize) -> Option<&Client> {
self.inner self.inner
.values() .values()
.flatten()
.find(|(client_id, _)| client_id == &id) .find(|(client_id, _)| client_id == &id)
.map(|(_, client)| client.as_ref()) .map(|(_, client)| client.as_ref())
} }
pub fn remove_by_id(&mut self, id: usize) { pub fn remove_by_id(&mut self, id: usize) {
self.inner.retain(|_, (client_id, _)| client_id != &id) self.inner.retain(|_, clients| {
clients.retain(|&(client_id, _)| client_id != id);
!clients.is_empty()
})
} }
pub fn restart( pub fn restart(
&mut self, &mut self,
language_config: &LanguageConfiguration, language_config: &LanguageConfiguration,
doc_path: Option<&std::path::PathBuf>, doc_path: Option<&std::path::PathBuf>,
root_dirs: &[PathBuf],
enable_snippets: bool,
) -> Result<Option<Arc<Client>>> { ) -> Result<Option<Arc<Client>>> {
let config = match &language_config.language_server { let config = match &language_config.language_server {
Some(config) => config, Some(config) => config,
@ -655,15 +680,23 @@ impl Registry {
// initialize a new client // initialize a new client
let id = self.counter.fetch_add(1, Ordering::Relaxed); let id = self.counter.fetch_add(1, Ordering::Relaxed);
let NewClientResult(client, incoming) = let NewClientResult(client, incoming) = start_client(
start_client(id, language_config, config, doc_path)?; id,
language_config,
config,
doc_path,
root_dirs,
enable_snippets,
)?;
self.incoming.push(UnboundedReceiverStream::new(incoming)); self.incoming.push(UnboundedReceiverStream::new(incoming));
let (_, old_client) = entry.insert((id, client.clone())); let old_clients = entry.insert(vec![(id, client.clone())]);
tokio::spawn(async move { for (_, old_client) in old_clients {
let _ = old_client.force_shutdown().await; tokio::spawn(async move {
}); let _ = old_client.force_shutdown().await;
});
}
Ok(Some(client)) Ok(Some(client))
} }
@ -673,10 +706,12 @@ impl Registry {
pub fn stop(&mut self, language_config: &LanguageConfiguration) { pub fn stop(&mut self, language_config: &LanguageConfiguration) {
let scope = language_config.scope.clone(); let scope = language_config.scope.clone();
if let Some((_, client)) = self.inner.remove(&scope) { if let Some(clients) = self.inner.remove(&scope) {
tokio::spawn(async move { for (_, client) in clients {
let _ = client.force_shutdown().await; tokio::spawn(async move {
}); let _ = client.force_shutdown().await;
});
}
} }
} }
@ -684,30 +719,39 @@ impl Registry {
&mut self, &mut self,
language_config: &LanguageConfiguration, language_config: &LanguageConfiguration,
doc_path: Option<&std::path::PathBuf>, doc_path: Option<&std::path::PathBuf>,
root_dirs: &[PathBuf],
enable_snippets: bool,
) -> Result<Option<Arc<Client>>> { ) -> Result<Option<Arc<Client>>> {
let config = match &language_config.language_server { let config = match &language_config.language_server {
Some(config) => config, Some(config) => config,
None => return Ok(None), None => return Ok(None),
}; };
match self.inner.entry(language_config.scope.clone()) { let clients = self.inner.entry(language_config.scope.clone()).or_default();
Entry::Occupied(entry) => Ok(Some(entry.get().1.clone())), // check if we already have a client for this documents root that we can reuse
Entry::Vacant(entry) => { if let Some((_, client)) = clients.iter_mut().enumerate().find(|(i, (_, client))| {
// initialize a new client client.try_add_doc(&language_config.roots, root_dirs, doc_path, *i == 0)
let id = self.counter.fetch_add(1, Ordering::Relaxed); }) {
return Ok(Some(client.1.clone()));
let NewClientResult(client, incoming) =
start_client(id, language_config, config, doc_path)?;
self.incoming.push(UnboundedReceiverStream::new(incoming));
entry.insert((id, client.clone()));
Ok(Some(client))
}
} }
// initialize a new client
let id = self.counter.fetch_add(1, Ordering::Relaxed);
let NewClientResult(client, incoming) = start_client(
id,
language_config,
config,
doc_path,
root_dirs,
enable_snippets,
)?;
clients.push((id, client.clone()));
self.incoming.push(UnboundedReceiverStream::new(incoming));
Ok(Some(client))
} }
pub fn iter_clients(&self) -> impl Iterator<Item = &Arc<Client>> { pub fn iter_clients(&self) -> impl Iterator<Item = &Arc<Client>> {
self.inner.values().map(|(_, client)| client) self.inner.values().flatten().map(|(_, client)| client)
} }
} }
@ -798,6 +842,8 @@ fn start_client(
config: &LanguageConfiguration, config: &LanguageConfiguration,
ls_config: &LanguageServerConfiguration, ls_config: &LanguageServerConfiguration,
doc_path: Option<&std::path::PathBuf>, doc_path: Option<&std::path::PathBuf>,
root_dirs: &[PathBuf],
enable_snippets: bool,
) -> Result<NewClientResult> { ) -> Result<NewClientResult> {
let (client, incoming, initialize_notify) = Client::start( let (client, incoming, initialize_notify) = Client::start(
&ls_config.command, &ls_config.command,
@ -805,6 +851,7 @@ fn start_client(
config.config.clone(), config.config.clone(),
ls_config.environment.clone(), ls_config.environment.clone(),
&config.roots, &config.roots,
config.workspace_lsp_roots.as_deref().unwrap_or(root_dirs),
id, id,
ls_config.timeout, ls_config.timeout,
doc_path, doc_path,
@ -820,7 +867,7 @@ fn start_client(
.capabilities .capabilities
.get_or_try_init(|| { .get_or_try_init(|| {
_client _client
.initialize() .initialize(enable_snippets)
.map_ok(|response| response.capabilities) .map_ok(|response| response.capabilities)
}) })
.await; .await;
@ -842,6 +889,65 @@ fn start_client(
Ok(NewClientResult(client, incoming)) Ok(NewClientResult(client, incoming))
} }
/// Find an LSP workspace of a file using the following mechanism:
/// * if the file is outside `workspace` return `None`
/// * start at `file` and search the file tree upward
/// * stop the search at the first `root_dirs` entry that contains `file`
/// * if no `root_dirs` matches `file` stop at workspace
/// * Returns the topmost directory that contains a `root_marker`
/// * If no root marker and we stopped at a `root_dirs` entry, return the directory we stopped at
/// * If we stopped at `workspace` instead and `workspace_is_cwd == false` return `None`
/// * If we stopped at `workspace` instead and `workspace_is_cwd == true` return `workspace`
pub fn find_lsp_workspace(
file: &str,
root_markers: &[String],
root_dirs: &[PathBuf],
workspace: &Path,
workspace_is_cwd: bool,
) -> Option<PathBuf> {
let file = std::path::Path::new(file);
let mut file = if file.is_absolute() {
file.to_path_buf()
} else {
let current_dir = std::env::current_dir().expect("unable to determine current directory");
current_dir.join(file)
};
file = path::get_normalized_path(&file);
if !file.starts_with(workspace) {
return None;
}
let mut top_marker = None;
for ancestor in file.ancestors() {
if root_markers
.iter()
.any(|marker| ancestor.join(marker).exists())
{
top_marker = Some(ancestor);
}
if root_dirs
.iter()
.any(|root_dir| path::get_normalized_path(&workspace.join(root_dir)) == ancestor)
{
// We stopped at a `root_dirs` entry that contains the file: return the
// topmost marker if one was found, otherwise the directory we stopped at.
return Some(top_marker.unwrap_or(ancestor).to_owned());
}
if ancestor == workspace {
// if the workspace is the CWD, let the LSP decide what the workspace
// is
return top_marker
.or_else(|| (!workspace_is_cwd).then_some(workspace))
.map(Path::to_owned);
}
}
debug_assert!(false, "workspace must be an ancestor of <file>");
None
}
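The resolution rules in the doc comment above can be sketched as a self-contained function; `resolve_workspace` and its `marker_dirs` parameter are hypothetical simplifications (the real `find_lsp_workspace` checks `root_markers` on disk and normalizes paths via `helix_core::path`):

```rust
use std::path::{Path, PathBuf};

// Simplified sketch of the workspace-resolution rules. `marker_dirs`
// stands in for the filesystem check of `root_markers` (it lists
// ancestors known to contain a marker), and paths are assumed absolute
// and already normalized.
fn resolve_workspace(
    file: &Path,
    marker_dirs: &[PathBuf],
    root_dirs: &[PathBuf],
    workspace: &Path,
    workspace_is_cwd: bool,
) -> Option<PathBuf> {
    // A file outside the workspace has no LSP workspace.
    if !file.starts_with(workspace) {
        return None;
    }
    let mut top_marker = None;
    for ancestor in file.ancestors() {
        if marker_dirs.iter().any(|m| m == ancestor) {
            // Remember the topmost ancestor carrying a root marker.
            top_marker = Some(ancestor);
        }
        if root_dirs.iter().any(|d| workspace.join(d) == ancestor) {
            // Stop at the first configured root dir that contains the file.
            return Some(top_marker.unwrap_or(ancestor).to_owned());
        }
        if ancestor == workspace {
            // Fall back to the workspace itself only when it is a real
            // workspace rather than just the current working directory.
            return top_marker
                .or_else(|| (!workspace_is_cwd).then_some(workspace))
                .map(Path::to_owned);
        }
    }
    None
}

fn main() {
    let ws = Path::new("/ws");
    // No markers, workspace found by `find_workspace`: fall back to it.
    assert_eq!(
        resolve_workspace(Path::new("/ws/a/f.rs"), &[], &[], ws, false),
        Some(PathBuf::from("/ws"))
    );
    // Workspace is just the cwd and nothing matched: let the server decide.
    assert_eq!(resolve_workspace(Path::new("/ws/a/f.rs"), &[], &[], ws, true), None);
    println!("ok");
}
```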
#[cfg(test)] #[cfg(test)]
mod tests { mod tests {
use super::{lsp, util::*, OffsetEncoding}; use super::{lsp, util::*, OffsetEncoding};
@ -860,16 +966,16 @@ mod tests {
test_case!("", (0, 0) => Some(0)); test_case!("", (0, 0) => Some(0));
test_case!("", (0, 1) => Some(0)); test_case!("", (0, 1) => Some(0));
test_case!("", (1, 0) => None); test_case!("", (1, 0) => Some(0));
test_case!("\n\n", (0, 0) => Some(0)); test_case!("\n\n", (0, 0) => Some(0));
test_case!("\n\n", (1, 0) => Some(1)); test_case!("\n\n", (1, 0) => Some(1));
test_case!("\n\n", (1, 1) => Some(1)); test_case!("\n\n", (1, 1) => Some(1));
test_case!("\n\n", (2, 0) => Some(2)); test_case!("\n\n", (2, 0) => Some(2));
test_case!("\n\n", (3, 0) => None); test_case!("\n\n", (3, 0) => Some(2));
test_case!("test\n\n\n\ncase", (4, 3) => Some(11)); test_case!("test\n\n\n\ncase", (4, 3) => Some(11));
test_case!("test\n\n\n\ncase", (4, 4) => Some(12)); test_case!("test\n\n\n\ncase", (4, 4) => Some(12));
test_case!("test\n\n\n\ncase", (4, 5) => Some(12)); test_case!("test\n\n\n\ncase", (4, 5) => Some(12));
test_case!("", (u32::MAX, u32::MAX) => None); test_case!("", (u32::MAX, u32::MAX) => Some(0));
} }
#[test] #[test]

@ -61,7 +61,7 @@ fn render_elements(
offset: &mut usize, offset: &mut usize,
tabstops: &mut Vec<(usize, (usize, usize))>, tabstops: &mut Vec<(usize, (usize, usize))>,
newline_with_offset: &str, newline_with_offset: &str,
include_placeholer: bool, include_placeholder: bool,
) { ) {
use SnippetElement::*; use SnippetElement::*;
@ -89,7 +89,7 @@ fn render_elements(
offset, offset,
tabstops, tabstops,
newline_with_offset, newline_with_offset,
include_placeholer, include_placeholder,
); );
} }
&Tabstop { tabstop } => { &Tabstop { tabstop } => {
@ -100,14 +100,14 @@ fn render_elements(
value: inner_snippet_elements, value: inner_snippet_elements,
} => { } => {
let start_offset = *offset; let start_offset = *offset;
if include_placeholer { if include_placeholder {
render_elements( render_elements(
inner_snippet_elements, inner_snippet_elements,
insert, insert,
offset, offset,
tabstops, tabstops,
newline_with_offset, newline_with_offset,
include_placeholer, include_placeholder,
); );
} }
tabstops.push((*tabstop, (start_offset, *offset))); tabstops.push((*tabstop, (start_offset, *offset)));
@ -127,7 +127,7 @@ fn render_elements(
pub fn render( pub fn render(
snippet: &Snippet<'_>, snippet: &Snippet<'_>,
newline_with_offset: &str, newline_with_offset: &str,
include_placeholer: bool, include_placeholder: bool,
) -> (Tendril, Vec<SmallVec<[(usize, usize); 1]>>) { ) -> (Tendril, Vec<SmallVec<[(usize, usize); 1]>>) {
let mut insert = Tendril::new(); let mut insert = Tendril::new();
let mut tabstops = Vec::new(); let mut tabstops = Vec::new();
@ -139,7 +139,7 @@ pub fn render(
&mut offset, &mut offset,
&mut tabstops, &mut tabstops,
newline_with_offset, newline_with_offset,
include_placeholer, include_placeholder,
); );
// sort in ascending order (except for 0, which should always be the last one (per lsp doc)) // sort in ascending order (except for 0, which should always be the last one (per lsp doc))

@ -68,7 +68,7 @@ grep-searcher = "0.1.11"
[target.'cfg(not(windows))'.dependencies] # https://github.com/vorner/signal-hook/issues/100 [target.'cfg(not(windows))'.dependencies] # https://github.com/vorner/signal-hook/issues/100
signal-hook-tokio = { version = "0.3", features = ["futures-v0_3"] } signal-hook-tokio = { version = "0.3", features = ["futures-v0_3"] }
libc = "0.2.140" libc = "0.2.142"
[build-dependencies] [build-dependencies]
helix-loader = { version = "0.6", path = "../helix-loader" } helix-loader = { version = "0.6", path = "../helix-loader" }

@ -25,7 +25,7 @@ use crate::{
config::Config, config::Config,
job::Jobs, job::Jobs,
keymap::Keymaps, keymap::Keymaps,
ui::{self, overlay::overlayed}, ui::{self, overlay::overlaid},
}; };
use log::{debug, error, warn}; use log::{debug, error, warn};
@ -169,7 +169,7 @@ impl Application {
std::env::set_current_dir(first).context("set current dir")?; std::env::set_current_dir(first).context("set current dir")?;
editor.new_file(Action::VerticalSplit); editor.new_file(Action::VerticalSplit);
let picker = ui::file_picker(".".into(), &config.load().editor); let picker = ui::file_picker(".".into(), &config.load().editor);
compositor.push(Box::new(overlayed(picker))); compositor.push(Box::new(overlaid(picker)));
} else { } else {
let nr_of_files = args.files.len(); let nr_of_files = args.files.len();
for (i, (file, pos)) in args.files.into_iter().enumerate() { for (i, (file, pos)) in args.files.into_iter().enumerate() {
@ -361,6 +361,9 @@ impl Application {
ConfigEvent::Update(editor_config) => { ConfigEvent::Update(editor_config) => {
let mut app_config = (*self.config.load().clone()).clone(); let mut app_config = (*self.config.load().clone()).clone();
app_config.editor = *editor_config; app_config.editor = *editor_config;
if let Err(err) = self.terminal.reconfigure(app_config.editor.clone().into()) {
self.editor.set_error(err.to_string());
};
self.config.store(Arc::new(app_config)); self.config.store(Arc::new(app_config));
} }
} }
@ -393,20 +396,23 @@ impl Application {
/// Refresh theme after config change /// Refresh theme after config change
fn refresh_theme(&mut self, config: &Config) -> Result<(), Error> { fn refresh_theme(&mut self, config: &Config) -> Result<(), Error> {
if let Some(theme) = config.theme.clone() { let true_color = config.editor.true_color || crate::true_color();
let true_color = self.true_color(); let theme = config
let theme = self .theme
.theme_loader .as_ref()
.load(&theme) .and_then(|theme| {
.map_err(|err| anyhow::anyhow!("Failed to load theme `{}`: {}", theme, err))?; self.theme_loader
.load(theme)
if true_color || theme.is_16_color() { .map_err(|e| {
self.editor.set_theme(theme); log::warn!("failed to load theme `{}` - {}", theme, e);
} else { e
anyhow::bail!("theme requires truecolor support, which is not available") })
} .ok()
} .filter(|theme| (true_color || theme.is_16_color()))
})
.unwrap_or_else(|| self.theme_loader.default_theme(true_color));
self.editor.set_theme(theme);
Ok(()) Ok(())
} }
@ -416,6 +422,8 @@ impl Application {
.map_err(|err| anyhow::anyhow!("Failed to load config: {}", err))?; .map_err(|err| anyhow::anyhow!("Failed to load config: {}", err))?;
self.refresh_language_config()?; self.refresh_language_config()?;
self.refresh_theme(&default_config)?; self.refresh_theme(&default_config)?;
self.terminal
.reconfigure(default_config.editor.clone().into())?;
// Store new config // Store new config
self.config.store(Arc::new(default_config)); self.config.store(Arc::new(default_config));
Ok(()) Ok(())
@ -431,10 +439,6 @@ impl Application {
} }
} }
fn true_color(&self) -> bool {
self.config.load().editor.true_color || crate::true_color()
}
#[cfg(windows)] #[cfg(windows)]
// no signal handling available on windows // no signal handling available on windows
pub async fn handle_signals(&mut self, _signal: ()) {} pub async fn handle_signals(&mut self, _signal: ()) {}
@ -472,7 +476,17 @@ impl Application {
} }
} }
signal::SIGCONT => { signal::SIGCONT => {
self.claim_term().await.unwrap(); // Copied from neovim's fix for the same issue:
// https://github.com/neovim/neovim/issues/12322
// https://github.com/neovim/neovim/pull/13084
for retries in 1..=10 {
match self.claim_term().await {
Ok(()) => break,
Err(err) if retries == 10 => panic!("Failed to claim terminal: {}", err),
Err(_) => continue,
}
}
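The retry loop above follows a common bounded-retry shape: attempt the operation up to a fixed number of times and only surface the error on the final attempt. A generic sketch, where `retry_or_panic` is a hypothetical helper and not a Helix API:

```rust
use std::fmt::Display;

// Retry a fallible operation a bounded number of times, panicking with
// the last error only after the final attempt -- the same shape as the
// SIGCONT terminal-claim loop.
fn retry_or_panic<T, E: Display>(max: u32, mut op: impl FnMut() -> Result<T, E>) -> T {
    for attempt in 1..=max {
        match op() {
            Ok(val) => return val,
            Err(err) if attempt == max => panic!("failed after {max} attempts: {err}"),
            Err(_) => continue,
        }
    }
    unreachable!("max must be at least 1")
}

fn main() {
    let mut calls = 0;
    // Succeeds on the third attempt, well within the bound of 10.
    let value = retry_or_panic(10, || {
        calls += 1;
        if calls < 3 { Err("terminal busy") } else { Ok(calls) }
    });
    assert_eq!(value, 3);
    println!("ok");
}
```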
// redraw the terminal // redraw the terminal
let area = self.terminal.size().expect("couldn't get terminal size"); let area = self.terminal.size().expect("couldn't get terminal size");
self.compositor.resize(area); self.compositor.resize(area);
@ -1018,7 +1032,7 @@ impl Application {
let language_server = let language_server =
self.editor.language_servers.get_by_id(server_id).unwrap(); self.editor.language_servers.get_by_id(server_id).unwrap();
Ok(json!(language_server.workspace_folders())) Ok(json!(&*language_server.workspace_folders().await))
} }
Ok(MethodCall::WorkspaceConfiguration(params)) => { Ok(MethodCall::WorkspaceConfiguration(params)) => {
let result: Vec<_> = params let result: Vec<_> = params
@ -1034,8 +1048,7 @@ impl Application {
None => self None => self
.editor .editor
.language_servers .language_servers
.get_by_id(server_id) .get_by_id(server_id)?
.unwrap()
.config()?, .config()?,
}; };
if let Some(section) = item.section.as_ref() { if let Some(section) = item.section.as_ref() {

@ -12,7 +12,7 @@ pub use typed::*;
use helix_core::{ use helix_core::{
char_idx_at_visual_offset, comment, char_idx_at_visual_offset, comment,
doc_formatter::TextFormat, doc_formatter::TextFormat,
encoding, find_first_non_whitespace_char, find_root, graphemes, encoding, find_first_non_whitespace_char, find_workspace, graphemes,
history::UndoKind, history::UndoKind,
increment, indent, increment, indent,
indent::IndentStyle, indent::IndentStyle,
@ -54,8 +54,8 @@ use crate::{
job::Callback, job::Callback,
keymap::ReverseKeymap, keymap::ReverseKeymap,
ui::{ ui::{
self, editor::InsertEvent, overlay::overlayed, FilePicker, Picker, Popup, Prompt, self, editor::InsertEvent, lsp::SignatureHelp, overlay::overlaid, FilePicker, Picker,
PromptEvent, Popup, Prompt, PromptEvent,
}, },
}; };
@ -347,6 +347,7 @@ impl MappableCommand {
goto_first_nonwhitespace, "Goto first non-blank in line", goto_first_nonwhitespace, "Goto first non-blank in line",
trim_selections, "Trim whitespace from selections", trim_selections, "Trim whitespace from selections",
extend_to_line_start, "Extend to line start", extend_to_line_start, "Extend to line start",
extend_to_first_nonwhitespace, "Extend to first non-blank in line",
extend_to_line_end, "Extend to line end", extend_to_line_end, "Extend to line end",
extend_to_line_end_newline, "Extend to line end", extend_to_line_end_newline, "Extend to line end",
signature_help, "Show signature help", signature_help, "Show signature help",
@ -841,6 +842,24 @@ fn kill_to_line_end(cx: &mut Context) {
fn goto_first_nonwhitespace(cx: &mut Context) { fn goto_first_nonwhitespace(cx: &mut Context) {
let (view, doc) = current!(cx.editor); let (view, doc) = current!(cx.editor);
goto_first_nonwhitespace_impl(
view,
doc,
if cx.editor.mode == Mode::Select {
Movement::Extend
} else {
Movement::Move
},
)
}
fn extend_to_first_nonwhitespace(cx: &mut Context) {
let (view, doc) = current!(cx.editor);
goto_first_nonwhitespace_impl(view, doc, Movement::Extend)
}
fn goto_first_nonwhitespace_impl(view: &mut View, doc: &mut Document, movement: Movement) {
let text = doc.text().slice(..); let text = doc.text().slice(..);
let selection = doc.selection(view.id).clone().transform(|range| { let selection = doc.selection(view.id).clone().transform(|range| {
@ -848,7 +867,7 @@ fn goto_first_nonwhitespace(cx: &mut Context) {
if let Some(pos) = find_first_non_whitespace_char(text.line(line)) { if let Some(pos) = find_first_non_whitespace_char(text.line(line)) {
let pos = pos + text.line_to_char(line); let pos = pos + text.line_to_char(line);
range.put_cursor(text, pos, cx.editor.mode == Mode::Select) range.put_cursor(text, pos, movement == Movement::Extend)
} else { } else {
range range
} }
@ -1563,7 +1582,7 @@ fn half_page_down(cx: &mut Context) {
} }
#[allow(deprecated)] #[allow(deprecated)]
// currently uses the deprected `visual_coords_at_pos`/`pos_at_visual_coords` functions // currently uses the deprecated `visual_coords_at_pos`/`pos_at_visual_coords` functions
// as this function ignores softwrapping (and virtual text) and instead only cares // as this function ignores softwrapping (and virtual text) and instead only cares
// about "text visual position" // about "text visual position"
// //
@ -2149,7 +2168,7 @@ fn global_search(cx: &mut Context) {
Some((path.clone().into(), Some((*line_num, *line_num)))) Some((path.clone().into(), Some((*line_num, *line_num))))
}, },
); );
compositor.push(Box::new(overlayed(picker))); compositor.push(Box::new(overlaid(picker)));
}, },
)); ));
Ok(call) Ok(call)
@ -2421,11 +2440,9 @@ fn append_mode(cx: &mut Context) {
} }
fn file_picker(cx: &mut Context) { fn file_picker(cx: &mut Context) {
// We don't specify language markers, root will be the root of the current let root = find_workspace().0;
// git repo or the current dir if we're not in a repo
let root = find_root(None, &[]);
let picker = ui::file_picker(root, &cx.editor.config()); let picker = ui::file_picker(root, &cx.editor.config());
cx.push_layer(Box::new(overlayed(picker))); cx.push_layer(Box::new(overlaid(picker)));
} }
fn file_picker_in_current_buffer_directory(cx: &mut Context) { fn file_picker_in_current_buffer_directory(cx: &mut Context) {
@ -2442,12 +2459,12 @@ fn file_picker_in_current_buffer_directory(cx: &mut Context) {
}; };
let picker = ui::file_picker(path, &cx.editor.config()); let picker = ui::file_picker(path, &cx.editor.config());
cx.push_layer(Box::new(overlayed(picker))); cx.push_layer(Box::new(overlaid(picker)));
} }
fn file_picker_in_current_directory(cx: &mut Context) { fn file_picker_in_current_directory(cx: &mut Context) {
let cwd = std::env::current_dir().unwrap_or_else(|_| PathBuf::from("./")); let cwd = std::env::current_dir().unwrap_or_else(|_| PathBuf::from("./"));
let picker = ui::file_picker(cwd, &cx.editor.config()); let picker = ui::file_picker(cwd, &cx.editor.config());
cx.push_layer(Box::new(overlayed(picker))); cx.push_layer(Box::new(overlaid(picker)));
} }
fn buffer_picker(cx: &mut Context) { fn buffer_picker(cx: &mut Context) {
@ -2512,7 +2529,7 @@ fn buffer_picker(cx: &mut Context) {
Some((meta.id.into(), Some((line, line)))) Some((meta.id.into(), Some((line, line))))
}, },
); );
cx.push_layer(Box::new(overlayed(picker))); cx.push_layer(Box::new(overlaid(picker)));
} }
fn jumplist_picker(cx: &mut Context) { fn jumplist_picker(cx: &mut Context) {
@ -2551,6 +2568,13 @@ fn jumplist_picker(cx: &mut Context) {
} }
} }
for (view, _) in cx.editor.tree.views_mut() {
for doc_id in view.jumps.iter().map(|e| e.0).collect::<Vec<_>>().iter() {
let doc = doc_mut!(cx.editor, doc_id);
view.sync_changes(doc);
}
}
let new_meta = |view: &View, doc_id: DocumentId, selection: Selection| { let new_meta = |view: &View, doc_id: DocumentId, selection: Selection| {
let doc = &cx.editor.documents.get(&doc_id); let doc = &cx.editor.documents.get(&doc_id);
let text = doc.map_or("".into(), |d| { let text = doc.map_or("".into(), |d| {
@ -2594,7 +2618,7 @@ fn jumplist_picker(cx: &mut Context) {
Some((meta.path.clone()?.into(), Some((line, line)))) Some((meta.path.clone()?.into(), Some((line, line))))
}, },
); );
cx.push_layer(Box::new(overlayed(picker))); cx.push_layer(Box::new(overlaid(picker)));
} }
impl ui::menu::Item for MappableCommand { impl ui::menu::Item for MappableCommand {
@ -2668,7 +2692,7 @@ pub fn command_palette(cx: &mut Context) {
} }
} }
}); });
compositor.push(Box::new(overlayed(picker))); compositor.push(Box::new(overlaid(picker)));
}, },
)); ));
} }
@ -4189,7 +4213,7 @@ pub fn completion(cx: &mut Context) {
None => return, None => return,
}; };
// setup a chanel that allows the request to be canceled // setup a channel that allows the request to be canceled
let (tx, rx) = oneshot::channel(); let (tx, rx) = oneshot::channel();
// set completion_request so that this request can be canceled // set completion_request so that this request can be canceled
// by setting completion_request, the old channel stored there is dropped // by setting completion_request, the old channel stored there is dropped
@ -4242,7 +4266,7 @@ pub fn completion(cx: &mut Context) {
let (view, doc) = current_ref!(editor); let (view, doc) = current_ref!(editor);
// check if the completion request is stale. // check if the completion request is stale.
// //
// Completions are completed asynchrounsly and therefore the user could // Completions are completed asynchronously and therefore the user could
// switch document/view or leave insert mode. In all of those cases the // switch document/view or leave insert mode. In all of those cases the
// completion should be discarded // completion should be discarded
if editor.mode != Mode::Insert || view.id != trigger_view || doc.id() != trigger_doc { if editor.mode != Mode::Insert || view.id != trigger_view || doc.id() != trigger_doc {
@ -4265,7 +4289,7 @@ pub fn completion(cx: &mut Context) {
} }
let size = compositor.size(); let size = compositor.size();
let ui = compositor.find::<ui::EditorView>().unwrap(); let ui = compositor.find::<ui::EditorView>().unwrap();
ui.set_completion( let completion_area = ui.set_completion(
editor, editor,
savepoint, savepoint,
items, items,
@ -4274,6 +4298,15 @@ pub fn completion(cx: &mut Context) {
trigger_offset, trigger_offset,
size, size,
); );
let size = compositor.size();
let signature_help_area = compositor
.find_id::<Popup<SignatureHelp>>(SignatureHelp::ID)
.map(|signature_help| signature_help.area(size, editor));
// Delete the signature help popup if they intersect.
if matches!((completion_area, signature_help_area),(Some(a), Some(b)) if a.intersects(b))
{
compositor.remove(SignatureHelp::ID);
}
}, },
); );
} }
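The hunk above removes the signature-help popup when its area intersects the freshly opened completion menu. A self-contained sketch of that overlap check, with a stand-in `Rect` whose fields and `intersects` logic are assumed rather than copied from `helix_view`:

```rust
// Minimal stand-in for helix_view::graphics::Rect (fields assumed).
#[derive(Clone, Copy, Debug, PartialEq)]
struct Rect { x: u16, y: u16, width: u16, height: u16 }

impl Rect {
    fn intersects(self, other: Rect) -> bool {
        self.x < other.x + other.width
            && other.x < self.x + self.width
            && self.y < other.y + other.height
            && other.y < self.y + self.height
    }
}

// Mirrors the matches! check in the hunk: both areas must exist and overlap.
fn should_remove_signature_help(completion: Option<Rect>, signature: Option<Rect>) -> bool {
    matches!((completion, signature), (Some(a), Some(b)) if a.intersects(b))
}

fn main() {
    let a = Rect { x: 0, y: 0, width: 10, height: 4 };
    let b = Rect { x: 5, y: 2, width: 10, height: 4 };
    let c = Rect { x: 40, y: 0, width: 5, height: 2 };
    assert!(should_remove_signature_help(Some(a), Some(b)));
    assert!(!should_remove_signature_help(Some(a), Some(c)));
    assert!(!should_remove_signature_help(Some(a), None));
}
```

Using `matches!` with a guard keeps the "both present and overlapping" condition in one expression, which is why the hunk reads the way it does.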

@ -2,7 +2,7 @@ use super::{Context, Editor};
use crate::{ use crate::{
compositor::{self, Compositor}, compositor::{self, Compositor},
job::{Callback, Jobs}, job::{Callback, Jobs},
ui::{self, overlay::overlayed, FilePicker, Picker, Popup, Prompt, PromptEvent, Text}, ui::{self, overlay::overlaid, FilePicker, Picker, Popup, Prompt, PromptEvent, Text},
}; };
use dap::{StackFrame, Thread, ThreadStates}; use dap::{StackFrame, Thread, ThreadStates};
use helix_core::syntax::{DebugArgumentValue, DebugConfigCompletion, DebugTemplate}; use helix_core::syntax::{DebugArgumentValue, DebugConfigCompletion, DebugTemplate};
@ -270,7 +270,7 @@ pub fn dap_launch(cx: &mut Context) {
let templates = config.templates.clone(); let templates = config.templates.clone();
cx.push_layer(Box::new(overlayed(Picker::new( cx.push_layer(Box::new(overlaid(Picker::new(
templates, templates,
(), (),
|cx, template, _action| { |cx, template, _action| {

@ -26,7 +26,7 @@ use helix_view::{
use crate::{ use crate::{
compositor::{self, Compositor}, compositor::{self, Compositor},
ui::{ ui::{
self, lsp::SignatureHelp, overlay::overlayed, DynamicPicker, FileLocation, FilePicker, self, lsp::SignatureHelp, overlay::overlaid, DynamicPicker, FileLocation, FilePicker,
Popup, PromptEvent, Popup, PromptEvent,
}, },
}; };
@ -81,7 +81,7 @@ impl ui::menu::Item for lsp::Location {
// Most commonly, this will not allocate, especially on Unix systems where the root prefix // Most commonly, this will not allocate, especially on Unix systems where the root prefix
// is a simple `/` and not `C:\` (with whatever drive letter) // is a simple `/` and not `C:\` (with whatever drive letter)
write!(&mut res, ":{}", self.range.start.line) write!(&mut res, ":{}", self.range.start.line + 1)
.expect("Will only fail if allocating fails"); .expect("Will only fail if allocating fails");
res.into() res.into()
} }
@ -205,7 +205,9 @@ fn jump_to_location(
log::warn!("lsp position out of bounds - {:?}", location.range); log::warn!("lsp position out of bounds - {:?}", location.range);
return; return;
}; };
doc.set_selection(view.id, Selection::single(new_range.anchor, new_range.head)); // we flip the range so that the cursor sits on the start of the symbol
// (for example start of the function).
doc.set_selection(view.id, Selection::single(new_range.head, new_range.anchor));
align_view(doc, view, Align::Center); align_view(doc, view, Align::Center);
} }
@ -372,7 +374,7 @@ pub fn symbol_picker(cx: &mut Context) {
}; };
let picker = sym_picker(symbols, current_url, offset_encoding); let picker = sym_picker(symbols, current_url, offset_encoding);
compositor.push(Box::new(overlayed(picker))) compositor.push(Box::new(overlaid(picker)))
} }
}, },
) )
@ -431,7 +433,7 @@ pub fn workspace_symbol_picker(cx: &mut Context) {
future.boxed() future.boxed()
}; };
let dyn_picker = DynamicPicker::new(picker, Box::new(get_symbols)); let dyn_picker = DynamicPicker::new(picker, Box::new(get_symbols));
compositor.push(Box::new(overlayed(dyn_picker))) compositor.push(Box::new(overlaid(dyn_picker)))
}, },
) )
} }
@ -454,7 +456,7 @@ pub fn diagnostics_picker(cx: &mut Context) {
DiagnosticsFormat::HideSourcePath, DiagnosticsFormat::HideSourcePath,
offset_encoding, offset_encoding,
); );
cx.push_layer(Box::new(overlayed(picker))); cx.push_layer(Box::new(overlaid(picker)));
} }
} }
@ -471,7 +473,7 @@ pub fn workspace_diagnostics_picker(cx: &mut Context) {
DiagnosticsFormat::ShowSourcePath, DiagnosticsFormat::ShowSourcePath,
offset_encoding, offset_encoding,
); );
cx.push_layer(Box::new(overlayed(picker))); cx.push_layer(Box::new(overlaid(picker)));
} }
impl ui::menu::Item for lsp::CodeActionOrCommand { impl ui::menu::Item for lsp::CodeActionOrCommand {
@ -491,7 +493,7 @@ impl ui::menu::Item for lsp::CodeActionOrCommand {
/// ///
/// While the `kind` field is defined as open ended in the LSP spec (any value may be used) /// While the `kind` field is defined as open ended in the LSP spec (any value may be used)
/// in practice a closed set of common values (mostly suggested in the LSP spec) are used. /// in practice a closed set of common values (mostly suggested in the LSP spec) are used.
/// VSCode displays each of these categories seperatly (seperated by a heading in the codeactions picker) /// VSCode displays each of these categories separately (separated by a heading in the codeactions picker)
/// to make them easier to navigate. Helix does not display these headings to the user. /// to make them easier to navigate. Helix does not display these headings to the user.
/// However it does sort code actions by their categories to achieve the same order as the VScode picker, /// However it does sort code actions by their categories to achieve the same order as the VScode picker,
/// just without the headings. /// just without the headings.
@ -521,7 +523,7 @@ fn action_category(action: &CodeActionOrCommand) -> u32 {
} }
} }
fn action_prefered(action: &CodeActionOrCommand) -> bool { fn action_preferred(action: &CodeActionOrCommand) -> bool {
matches!( matches!(
action, action,
CodeActionOrCommand::CodeAction(CodeAction { CodeActionOrCommand::CodeAction(CodeAction {
@ -600,12 +602,12 @@ pub fn code_action(cx: &mut Context) {
} }
// Sort codeactions into a useful order. This behaviour is only partially described in the LSP spec. // Sort codeactions into a useful order. This behaviour is only partially described in the LSP spec.
// Many details are modeled after vscode because langauge servers are usually tested against it. // Many details are modeled after vscode because language servers are usually tested against it.
// VScode sorts the codeaction two times: // VScode sorts the codeaction two times:
// //
// First the codeactions that fix some diagnostics are moved to the front. // First the codeactions that fix some diagnostics are moved to the front.
// If both codeactions fix some diagnostics (or both fix none) the codeaction // If both codeactions fix some diagnostics (or both fix none) the codeaction
// that is marked with `is_preffered` is shown first. The codeactions are then shown in seperate // that is marked with `is_preferred` is shown first. The codeactions are then shown in separate
// submenus that only contain a certain category (see `action_category`) of actions. // submenus that only contain a certain category (see `action_category`) of actions.
// //
// Below, this is done in a single sorting step // Below, this is done in a single sorting step
@ -627,10 +629,10 @@ pub fn code_action(cx: &mut Context) {
return order; return order;
} }
// if one of the codeactions is marked as prefered show it first // if one of the codeactions is marked as preferred show it first
// otherwise keep the original LSP sorting // otherwise keep the original LSP sorting
action_prefered(action1) action_preferred(action1)
.cmp(&action_prefered(action2)) .cmp(&action_preferred(action2))
.reverse() .reverse()
}); });
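The comments above describe a two-key ordering: actions that fix diagnostics come first, and among the rest, preferred actions win, with the original LSP order kept otherwise. A simplified std-only sketch of that comparator (ignoring the category key, with an illustrative `Action` type):

```rust
#[derive(Debug)]
struct Action { title: &'static str, fixes_diagnostics: bool, is_preferred: bool }

// Two-key sort: diagnostics-fixing actions first, then preferred ones.
// `sort_by` is stable, so ties keep the server's original order.
fn sort_actions(actions: &mut Vec<Action>) {
    actions.sort_by(|a, b| {
        let order = a.fixes_diagnostics.cmp(&b.fixes_diagnostics).reverse();
        if order != std::cmp::Ordering::Equal {
            return order;
        }
        a.is_preferred.cmp(&b.is_preferred).reverse()
    });
}

fn main() {
    let mut actions = vec![
        Action { title: "extract", fixes_diagnostics: false, is_preferred: false },
        Action { title: "quickfix", fixes_diagnostics: true, is_preferred: false },
        Action { title: "preferred", fixes_diagnostics: false, is_preferred: true },
    ];
    sort_actions(&mut actions);
    let titles: Vec<_> = actions.iter().map(|a| a.title).collect();
    assert_eq!(titles, ["quickfix", "preferred", "extract"]);
}
```

`.reverse()` on the boolean comparison is what moves `true` entries to the front, matching the `action_preferred(a).cmp(&action_preferred(b)).reverse()` call in the hunk.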
@ -955,7 +957,7 @@ fn goto_impl(
}, },
move |_editor, location| Some(location_to_file_location(location)), move |_editor, location| Some(location_to_file_location(location)),
); );
compositor.push(Box::new(overlayed(picker))); compositor.push(Box::new(overlaid(picker)));
} }
} }
} }
@ -1271,10 +1273,25 @@ pub fn signature_help_impl(cx: &mut Context, invoked: SignatureHelpInvoked) {
contents.set_active_param_range(active_param_range()); contents.set_active_param_range(active_param_range());
let old_popup = compositor.find_id::<Popup<SignatureHelp>>(SignatureHelp::ID); let old_popup = compositor.find_id::<Popup<SignatureHelp>>(SignatureHelp::ID);
let popup = Popup::new(SignatureHelp::ID, contents) let mut popup = Popup::new(SignatureHelp::ID, contents)
.position(old_popup.and_then(|p| p.get_position())) .position(old_popup.and_then(|p| p.get_position()))
.position_bias(Open::Above) .position_bias(Open::Above)
.ignore_escape_key(true); .ignore_escape_key(true);
// Don't create a popup if it intersects the auto-complete menu.
let size = compositor.size();
if compositor
.find::<ui::EditorView>()
.unwrap()
.completion
.as_mut()
.map(|completion| completion.area(size, editor))
.filter(|area| area.intersects(popup.area(size, editor)))
.is_some()
{
return;
}
compositor.replace_or_push(SignatureHelp::ID, popup); compositor.replace_or_push(SignatureHelp::ID, popup);
}, },
); );

@ -116,7 +116,7 @@ fn open(cx: &mut compositor::Context, args: &[Cow<str>], event: PromptEvent) ->
let call: job::Callback = job::Callback::EditorCompositor(Box::new( let call: job::Callback = job::Callback::EditorCompositor(Box::new(
move |editor: &mut Editor, compositor: &mut Compositor| { move |editor: &mut Editor, compositor: &mut Compositor| {
let picker = ui::file_picker(path, &editor.config()); let picker = ui::file_picker(path, &editor.config());
compositor.push(Box::new(overlayed(picker))); compositor.push(Box::new(overlaid(picker)));
}, },
)); ));
Ok(call) Ok(call)
@ -1335,7 +1335,7 @@ fn lsp_workspace_command(
let picker = ui::Picker::new(commands, (), |cx, command, _action| { let picker = ui::Picker::new(commands, (), |cx, command, _action| {
execute_lsp_command(cx.editor, command.clone()); execute_lsp_command(cx.editor, command.clone());
}); });
compositor.push(Box::new(overlayed(picker))) compositor.push(Box::new(overlaid(picker)))
}, },
)); ));
Ok(call) Ok(call)
@ -1371,13 +1371,19 @@ fn lsp_restart(
return Ok(()); return Ok(());
} }
let editor_config = cx.editor.config.load();
let (_view, doc) = current!(cx.editor); let (_view, doc) = current!(cx.editor);
let config = doc let config = doc
.language_config() .language_config()
.context("LSP not defined for the current document")?; .context("LSP not defined for the current document")?;
let scope = config.scope.clone(); let scope = config.scope.clone();
cx.editor.language_servers.restart(config, doc.path())?; cx.editor.language_servers.restart(
config,
doc.path(),
&editor_config.workspace_lsp_roots,
editor_config.lsp.snippets,
)?;
// This collect is needed because refresh_language_server would need to re-borrow editor. // This collect is needed because refresh_language_server would need to re-borrow editor.
let document_ids_to_refresh: Vec<DocumentId> = cx let document_ids_to_refresh: Vec<DocumentId> = cx
@ -1764,12 +1770,12 @@ fn toggle_option(
let pointer = format!("/{}", key.replace('.', "/")); let pointer = format!("/{}", key.replace('.', "/"));
let value = config.pointer_mut(&pointer).ok_or_else(key_error)?; let value = config.pointer_mut(&pointer).ok_or_else(key_error)?;
if let Value::Bool(b) = *value { let Value::Bool(old_value) = *value else {
*value = Value::Bool(!b);
} else {
anyhow::bail!("Key `{}` is not toggle-able", key) anyhow::bail!("Key `{}` is not toggle-able", key)
} };
let new_value = !old_value;
*value = Value::Bool(new_value);
// This unwrap should never fail because we only replace one boolean value // This unwrap should never fail because we only replace one boolean value
// with another, maintaining a valid json config // with another, maintaining a valid json config
let config = serde_json::from_value(config).unwrap(); let config = serde_json::from_value(config).unwrap();
@ -1778,6 +1784,8 @@ fn toggle_option(
.config_events .config_events
.0 .0
.send(ConfigEvent::Update(config))?; .send(ConfigEvent::Update(config))?;
cx.editor
.set_status(format!("Option `{}` is now set to `{}`", key, new_value));
Ok(()) Ok(())
} }
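The `toggle_option` hunk above switches from `if let`/`else` to the `let..else` syntax stabilized in Rust 1.65. A runnable sketch of the same early-bail shape, using a plain `HashMap<String, bool>` instead of the editor's JSON config:

```rust
use std::collections::HashMap;

// let..else bails out early when the key is missing (or, in the real code,
// not a boolean); otherwise the value is flipped in place.
fn toggle_option(config: &mut HashMap<String, bool>, key: &str) -> Result<bool, String> {
    let Some(value) = config.get_mut(key) else {
        return Err(format!("Key `{}` is not toggle-able", key));
    };
    *value = !*value;
    Ok(*value)
}

fn main() {
    let mut config = HashMap::from([("auto-pairs".to_string(), true)]);
    assert_eq!(toggle_option(&mut config, "auto-pairs"), Ok(false));
    assert_eq!(toggle_option(&mut config, "auto-pairs"), Ok(true));
    assert!(toggle_option(&mut config, "missing").is_err());
}
```

Binding the old value before writing the new one is also what lets the real hunk report the new state in the `:set` status message.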
@ -1970,6 +1978,20 @@ fn open_config(
Ok(()) Ok(())
} }
fn open_workspace_config(
cx: &mut compositor::Context,
_args: &[Cow<str>],
event: PromptEvent,
) -> anyhow::Result<()> {
if event != PromptEvent::Validate {
return Ok(());
}
cx.editor
.open(&helix_loader::workspace_config_file(), Action::Replace)?;
Ok(())
}
fn open_log( fn open_log(
cx: &mut compositor::Context, cx: &mut compositor::Context,
_args: &[Cow<str>], _args: &[Cow<str>],
@ -2105,20 +2127,16 @@ fn reset_diff_change(
let scrolloff = editor.config().scrolloff; let scrolloff = editor.config().scrolloff;
let (view, doc) = current!(editor); let (view, doc) = current!(editor);
// TODO refactor to use let..else once MSRV is raised to 1.65 let Some(handle) = doc.diff_handle() else {
let handle = match doc.diff_handle() { bail!("Diff is not available in the current buffer")
Some(handle) => handle,
None => bail!("Diff is not available in the current buffer"),
}; };
let diff = handle.load(); let diff = handle.load();
let doc_text = doc.text().slice(..); let doc_text = doc.text().slice(..);
let line = doc.selection(view.id).primary().cursor_line(doc_text); let line = doc.selection(view.id).primary().cursor_line(doc_text);
// TODO refactor to use let..else once MSRV is raised to 1.65 let Some(hunk_idx) = diff.hunk_at(line as u32, true) else {
let hunk_idx = match diff.hunk_at(line as u32, true) { bail!("There is no change at the cursor")
Some(hunk_idx) => hunk_idx,
None => bail!("There is no change at the cursor"),
}; };
let hunk = diff.nth_hunk(hunk_idx); let hunk = diff.nth_hunk(hunk_idx);
let diff_base = diff.diff_base(); let diff_base = diff.diff_base();
@ -2479,7 +2497,7 @@ pub const TYPABLE_COMMAND_LIST: &[TypableCommand] = &[
}, },
TypableCommand { TypableCommand {
name: "update", name: "update",
aliases: &[], aliases: &["u"],
doc: "Write changes only if the file has been modified.", doc: "Write changes only if the file has been modified.",
fun: update, fun: update,
signature: CommandSignature::none(), signature: CommandSignature::none(),
@ -2646,6 +2664,13 @@ pub const TYPABLE_COMMAND_LIST: &[TypableCommand] = &[
fun: open_config, fun: open_config,
signature: CommandSignature::none(), signature: CommandSignature::none(),
}, },
TypableCommand {
name: "config-open-workspace",
aliases: &[],
doc: "Open the workspace config.toml file.",
fun: open_workspace_config,
signature: CommandSignature::none(),
},
TypableCommand { TypableCommand {
name: "log-open", name: "log-open",
aliases: &[], aliases: &[],

@ -1,27 +1,34 @@
use crate::keymap::{default::default, merge_keys, Keymap}; use crate::keymap;
use crate::keymap::{merge_keys, Keymap};
use helix_loader::merge_toml_values;
use helix_view::document::Mode; use helix_view::document::Mode;
use serde::Deserialize; use serde::Deserialize;
use std::collections::HashMap; use std::collections::HashMap;
use std::fmt::Display; use std::fmt::Display;
use std::fs;
use std::io::Error as IOError; use std::io::Error as IOError;
use std::path::PathBuf;
use toml::de::Error as TomlError; use toml::de::Error as TomlError;
#[derive(Debug, Clone, PartialEq, Deserialize)] #[derive(Debug, Clone, PartialEq)]
#[serde(deny_unknown_fields)]
pub struct Config { pub struct Config {
pub theme: Option<String>, pub theme: Option<String>,
#[serde(default = "default")]
pub keys: HashMap<Mode, Keymap>, pub keys: HashMap<Mode, Keymap>,
#[serde(default)]
pub editor: helix_view::editor::Config, pub editor: helix_view::editor::Config,
} }
#[derive(Debug, Clone, PartialEq, Deserialize)]
#[serde(deny_unknown_fields)]
pub struct ConfigRaw {
pub theme: Option<String>,
pub keys: Option<HashMap<Mode, Keymap>>,
pub editor: Option<toml::Value>,
}
impl Default for Config { impl Default for Config {
fn default() -> Config { fn default() -> Config {
Config { Config {
theme: None, theme: None,
keys: default(), keys: keymap::default(),
editor: helix_view::editor::Config::default(), editor: helix_view::editor::Config::default(),
} }
} }
@ -33,6 +40,12 @@ pub enum ConfigLoadError {
Error(IOError), Error(IOError),
} }
impl Default for ConfigLoadError {
fn default() -> Self {
ConfigLoadError::Error(IOError::new(std::io::ErrorKind::NotFound, "placeholder"))
}
}
impl Display for ConfigLoadError { impl Display for ConfigLoadError {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result { fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self { match self {
@ -43,17 +56,72 @@ impl Display for ConfigLoadError {
} }
impl Config { impl Config {
pub fn load(config_path: PathBuf) -> Result<Config, ConfigLoadError> { pub fn load(
match std::fs::read_to_string(config_path) { global: Result<String, ConfigLoadError>,
Ok(config) => toml::from_str(&config) local: Result<String, ConfigLoadError>,
.map(merge_keys) ) -> Result<Config, ConfigLoadError> {
.map_err(ConfigLoadError::BadConfig), let global_config: Result<ConfigRaw, ConfigLoadError> =
Err(err) => Err(ConfigLoadError::Error(err)), global.and_then(|file| toml::from_str(&file).map_err(ConfigLoadError::BadConfig));
} let local_config: Result<ConfigRaw, ConfigLoadError> =
local.and_then(|file| toml::from_str(&file).map_err(ConfigLoadError::BadConfig));
let res = match (global_config, local_config) {
(Ok(global), Ok(local)) => {
let mut keys = keymap::default();
if let Some(global_keys) = global.keys {
merge_keys(&mut keys, global_keys)
}
if let Some(local_keys) = local.keys {
merge_keys(&mut keys, local_keys)
}
let editor = match (global.editor, local.editor) {
(None, None) => helix_view::editor::Config::default(),
(None, Some(val)) | (Some(val), None) => {
val.try_into().map_err(ConfigLoadError::BadConfig)?
}
(Some(global), Some(local)) => merge_toml_values(global, local, 3)
.try_into()
.map_err(ConfigLoadError::BadConfig)?,
};
Config {
theme: local.theme.or(global.theme),
keys,
editor,
}
}
// if either config is invalid, return that error first
(_, Err(ConfigLoadError::BadConfig(err)))
| (Err(ConfigLoadError::BadConfig(err)), _) => {
return Err(ConfigLoadError::BadConfig(err))
}
(Ok(config), Err(_)) | (Err(_), Ok(config)) => {
let mut keys = keymap::default();
if let Some(keymap) = config.keys {
merge_keys(&mut keys, keymap);
}
Config {
theme: config.theme,
keys,
editor: config.editor.map_or_else(
|| Ok(helix_view::editor::Config::default()),
|val| val.try_into().map_err(ConfigLoadError::BadConfig),
)?,
}
}
// these are just two IO errors; return the one for the global config
(Err(err), Err(_)) => return Err(err),
};
Ok(res)
} }
pub fn load_default() -> Result<Config, ConfigLoadError> { pub fn load_default() -> Result<Config, ConfigLoadError> {
Config::load(helix_loader::config_file()) let global_config =
fs::read_to_string(helix_loader::config_file()).map_err(ConfigLoadError::Error);
let local_config = fs::read_to_string(helix_loader::workspace_config_file())
.map_err(ConfigLoadError::Error);
Config::load(global_config, local_config)
} }
} }
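The new `Config::load` above layers a workspace-local config over the global one, with three rules: parse errors win over everything, a single readable file is used alone, and local values override global ones. A simplified std-only sketch of that `(global, local)` match, with illustrative types in place of Helix's `ConfigRaw` and TOML merging:

```rust
#[derive(Debug, PartialEq)]
enum LoadError { BadConfig(String), Io(String) }

#[derive(Debug, PartialEq)]
struct Config { theme: Option<String> }

// Merge order sketched from the hunk: BadConfig errors are reported first,
// a lone readable config is used as-is, and local settings win over global.
fn load(
    global: Result<Config, LoadError>,
    local: Result<Config, LoadError>,
) -> Result<Config, LoadError> {
    match (global, local) {
        (Ok(g), Ok(l)) => Ok(Config { theme: l.theme.or(g.theme) }),
        (_, Err(e @ LoadError::BadConfig(_))) | (Err(e @ LoadError::BadConfig(_)), _) => Err(e),
        (Ok(c), Err(_)) | (Err(_), Ok(c)) => Ok(c),
        (Err(e), Err(_)) => Err(e),
    }
}

fn main() {
    let g = Config { theme: Some("global".into()) };
    let l = Config { theme: Some("local".into()) };
    assert_eq!(load(Ok(g), Ok(l)).unwrap().theme.as_deref(), Some("local"));
    // a missing local file falls back to the global config alone
    let io = LoadError::Io("not found".into());
    let solo = Config { theme: Some("solo".into()) };
    assert_eq!(load(Err(io), Ok(solo)).unwrap().theme.as_deref(), Some("solo"));
}
```

The real implementation additionally merges the `editor` tables with `merge_toml_values` and folds both key maps into the defaults, but the error-priority shape is the same.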
@ -61,6 +129,12 @@ impl Config {
mod tests { mod tests {
use super::*; use super::*;
impl Config {
fn load_test(config: &str) -> Config {
Config::load(Ok(config.to_owned()), Err(ConfigLoadError::default())).unwrap()
}
}
#[test] #[test]
fn parsing_keymaps_config_file() { fn parsing_keymaps_config_file() {
use crate::keymap; use crate::keymap;
@ -77,18 +151,24 @@ mod tests {
A-F12 = "move_next_word_end" A-F12 = "move_next_word_end"
"#; "#;
let mut keys = keymap::default();
merge_keys(
&mut keys,
hashmap! {
Mode::Insert => Keymap::new(keymap!({ "Insert mode"
"y" => move_line_down,
"S-C-a" => delete_selection,
})),
Mode::Normal => Keymap::new(keymap!({ "Normal mode"
"A-F12" => move_next_word_end,
})),
},
);
assert_eq!( assert_eq!(
toml::from_str::<Config>(sample_keymaps).unwrap(), Config::load_test(sample_keymaps),
Config { Config {
keys: hashmap! { keys,
Mode::Insert => Keymap::new(keymap!({ "Insert mode"
"y" => move_line_down,
"S-C-a" => delete_selection,
})),
Mode::Normal => Keymap::new(keymap!({ "Normal mode"
"A-F12" => move_next_word_end,
})),
},
..Default::default() ..Default::default()
} }
); );
@ -97,11 +177,11 @@ mod tests {
#[test] #[test]
fn keys_resolve_to_correct_defaults() { fn keys_resolve_to_correct_defaults() {
// From serde default // From serde default
let default_keys = toml::from_str::<Config>("").unwrap().keys; let default_keys = Config::load_test("").keys;
assert_eq!(default_keys, default()); assert_eq!(default_keys, keymap::default());
// From the Default trait // From the Default trait
let default_keys = Config::default().keys; let default_keys = Config::default().keys;
assert_eq!(default_keys, default()); assert_eq!(default_keys, keymap::default());
} }
} }

@ -2,7 +2,6 @@ pub mod default;
pub mod macros; pub mod macros;
pub use crate::commands::MappableCommand; pub use crate::commands::MappableCommand;
use crate::config::Config;
use arc_swap::{ use arc_swap::{
access::{DynAccess, DynGuard}, access::{DynAccess, DynGuard},
ArcSwap, ArcSwap,
@ -16,7 +15,7 @@ use std::{
sync::Arc, sync::Arc,
}; };
use default::default; pub use default::default;
use macros::key; use macros::key;
#[derive(Debug, Clone)] #[derive(Debug, Clone)]
@ -417,12 +416,10 @@ impl Default for Keymaps {
} }
/// Merge default config keys with user overwritten keys for custom user config. /// Merge default config keys with user overwritten keys for custom user config.
pub fn merge_keys(mut config: Config) -> Config { pub fn merge_keys(dst: &mut HashMap<Mode, Keymap>, mut delta: HashMap<Mode, Keymap>) {
let mut delta = std::mem::replace(&mut config.keys, default()); for (mode, keys) in dst {
for (mode, keys) in &mut config.keys {
keys.merge(delta.remove(mode).unwrap_or_default()) keys.merge(delta.remove(mode).unwrap_or_default())
} }
config
} }
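The reworked `merge_keys` above no longer consumes a `Config`; it folds a delta of user bindings into the default map in place, mode by mode. A runnable sketch with plain string maps standing in for `Mode` and `Keymap`:

```rust
use std::collections::HashMap;

type Keymap = HashMap<String, String>;

// In-place merge, as in the hunk: user bindings (`delta`) are folded into
// the defaults (`dst`) per mode; modes absent from `delta` are untouched.
fn merge_keys(dst: &mut HashMap<String, Keymap>, mut delta: HashMap<String, Keymap>) {
    for (mode, keys) in dst.iter_mut() {
        keys.extend(delta.remove(mode).unwrap_or_default());
    }
}

fn main() {
    let mut defaults = HashMap::from([(
        "normal".to_string(),
        HashMap::from([("i".to_string(), "insert_mode".to_string())]),
    )]);
    let user = HashMap::from([(
        "normal".to_string(),
        HashMap::from([("q".to_string(), "quit".to_string())]),
    )]);
    merge_keys(&mut defaults, user);
    let normal = &defaults["normal"];
    assert_eq!(normal.len(), 2);
    assert_eq!(normal["q"], "quit");
}
```

Taking `&mut` avoids the old `std::mem::replace` dance and lets global and workspace keymaps be merged into the same defaults one after another.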
#[cfg(test)] #[cfg(test)]
@ -449,26 +446,24 @@ mod tests {
#[test] #[test]
fn merge_partial_keys() { fn merge_partial_keys() {
let config = Config { let keymap = hashmap! {
keys: hashmap! { Mode::Normal => Keymap::new(
Mode::Normal => Keymap::new( keymap!({ "Normal mode"
keymap!({ "Normal mode" "i" => normal_mode,
"i" => normal_mode, "无" => insert_mode,
"无" => insert_mode, "z" => jump_backward,
"z" => jump_backward, "g" => { "Merge into goto mode"
"g" => { "Merge into goto mode" "$" => goto_line_end,
"$" => goto_line_end, "g" => delete_char_forward,
"g" => delete_char_forward, },
}, })
}) )
)
},
..Default::default()
}; };
let mut merged_config = merge_keys(config.clone()); let mut merged_keyamp = default();
assert_ne!(config, merged_config); merge_keys(&mut merged_keyamp, keymap.clone());
assert_ne!(keymap, merged_keyamp);
let mut keymap = Keymaps::new(Box::new(Constant(merged_config.keys.clone()))); let mut keymap = Keymaps::new(Box::new(Constant(merged_keyamp.clone())));
assert_eq!( assert_eq!(
keymap.get(Mode::Normal, key!('i')), keymap.get(Mode::Normal, key!('i')),
KeymapResult::Matched(MappableCommand::normal_mode), KeymapResult::Matched(MappableCommand::normal_mode),
@ -486,7 +481,7 @@ mod tests {
"Leaf should replace node" "Leaf should replace node"
); );
let keymap = merged_config.keys.get_mut(&Mode::Normal).unwrap(); let keymap = merged_keyamp.get_mut(&Mode::Normal).unwrap();
// Assumes that `g` is a node in default keymap // Assumes that `g` is a node in default keymap
assert_eq!( assert_eq!(
keymap.root().search(&[key!('g'), key!('$')]).unwrap(), keymap.root().search(&[key!('g'), key!('$')]).unwrap(),
@ -506,30 +501,28 @@ mod tests {
"Old leaves in subnode should be present in merged node" "Old leaves in subnode should be present in merged node"
); );
assert!(merged_config.keys.get(&Mode::Normal).unwrap().len() > 1); assert!(merged_keyamp.get(&Mode::Normal).unwrap().len() > 1);
assert!(merged_config.keys.get(&Mode::Insert).unwrap().len() > 0); assert!(merged_keyamp.get(&Mode::Insert).unwrap().len() > 0);
} }
#[test] #[test]
fn order_should_be_set() { fn order_should_be_set() {
let config = Config { let keymap = hashmap! {
keys: hashmap! { Mode::Normal => Keymap::new(
Mode::Normal => Keymap::new( keymap!({ "Normal mode"
keymap!({ "Normal mode" "space" => { ""
"space" => { "" "s" => { ""
"s" => { "" "v" => vsplit,
"v" => vsplit, "c" => hsplit,
"c" => hsplit,
},
}, },
}) },
) })
}, )
..Default::default()
}; };
let mut merged_config = merge_keys(config.clone()); let mut merged_keyamp = default();
assert_ne!(config, merged_config); merge_keys(&mut merged_keyamp, keymap.clone());
let keymap = merged_config.keys.get_mut(&Mode::Normal).unwrap(); assert_ne!(keymap, merged_keyamp);
let keymap = merged_keyamp.get_mut(&Mode::Normal).unwrap();
// Make sure mapping works // Make sure mapping works
assert_eq!( assert_eq!(
keymap keymap

@ -3,7 +3,7 @@ use crossterm::event::EventStream;
use helix_loader::VERSION_AND_GIT_HASH; use helix_loader::VERSION_AND_GIT_HASH;
use helix_term::application::Application; use helix_term::application::Application;
use helix_term::args::Args; use helix_term::args::Args;
use helix_term::config::Config; use helix_term::config::{Config, ConfigLoadError};
use std::path::PathBuf; use std::path::PathBuf;
fn setup_logging(logpath: PathBuf, verbosity: u64) -> Result<()> { fn setup_logging(logpath: PathBuf, verbosity: u64) -> Result<()> {
@ -126,18 +126,19 @@ FLAGS:
helix_loader::initialize_config_file(args.config_file.clone()); helix_loader::initialize_config_file(args.config_file.clone());
let config = match std::fs::read_to_string(helix_loader::config_file()) { let config = match Config::load_default() {
Ok(config) => toml::from_str(&config) Ok(config) => config,
.map(helix_term::keymap::merge_keys) Err(ConfigLoadError::Error(err)) if err.kind() == std::io::ErrorKind::NotFound => {
.unwrap_or_else(|err| { Config::default()
eprintln!("Bad config: {}", err); }
eprintln!("Press <ENTER> to continue with default config"); Err(ConfigLoadError::Error(err)) => return Err(Error::new(err)),
use std::io::Read; Err(ConfigLoadError::BadConfig(err)) => {
let _ = std::io::stdin().read(&mut []); eprintln!("Bad config: {}", err);
Config::default() eprintln!("Press <ENTER> to continue with default config");
}), use std::io::Read;
Err(err) if err.kind() == std::io::ErrorKind::NotFound => Config::default(), let _ = std::io::stdin().read(&mut []);
Err(err) => return Err(Error::new(err)), Config::default()
}
}; };
let syn_loader_conf = helix_core::config::user_syntax_loader().unwrap_or_else(|err| { let syn_loader_conf = helix_core::config::user_syntax_loader().unwrap_or_else(|err| {
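The startup hunk above treats a missing config file as fine (defaults apply) while any other IO error aborts. That distinction hinges on matching `std::io::ErrorKind::NotFound` with a guard; a minimal sketch:

```rust
use std::io::ErrorKind;

// A missing config file is not an error: fall back to empty contents.
// Any other IO failure (permissions, etc.) is propagated to the caller.
fn load_or_default(read: Result<String, std::io::Error>) -> Result<String, std::io::Error> {
    match read {
        Ok(contents) => Ok(contents),
        Err(err) if err.kind() == ErrorKind::NotFound => Ok(String::new()),
        Err(err) => Err(err),
    }
}

fn main() {
    let missing = Err(std::io::Error::new(ErrorKind::NotFound, "missing"));
    assert_eq!(load_or_default(missing).unwrap(), "");
    let denied = Err(std::io::Error::new(ErrorKind::PermissionDenied, "denied"));
    assert!(load_or_default(denied).is_err());
}
```

In the real hunk this check lives on `ConfigLoadError::Error`, and a `BadConfig` parse error instead prompts the user before falling back to defaults.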

@ -141,16 +141,12 @@ impl Completion {
} }
}; };
let start_offset = let Some(range) = util::lsp_range_to_range(doc.text(), edit.range, offset_encoding) else{
match util::lsp_pos_to_pos(doc.text(), edit.range.start, offset_encoding) { return Transaction::new(doc.text());
Some(start) => start as i128 - primary_cursor as i128, };
None => return Transaction::new(doc.text()),
}; let start_offset = range.anchor as i128 - primary_cursor as i128;
let end_offset = let end_offset = range.head as i128 - primary_cursor as i128;
match util::lsp_pos_to_pos(doc.text(), edit.range.end, offset_encoding) {
Some(end) => end as i128 - primary_cursor as i128,
None => return Transaction::new(doc.text()),
};
(Some((start_offset, end_offset)), edit.new_text) (Some((start_offset, end_offset)), edit.new_text)
} else { } else {
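The completion hunk above keeps the edit offsets as `i128` because a text edit can start before the cursor, so the deltas must be signed even though document positions are `usize`. A tiny sketch of that arithmetic:

```rust
// Positions are usize, but the edit range may start before the cursor,
// so the deltas relative to the cursor are computed as signed i128.
fn offsets(range: (usize, usize), cursor: usize) -> (i128, i128) {
    (
        range.0 as i128 - cursor as i128,
        range.1 as i128 - cursor as i128,
    )
}

fn main() {
    // an edit replacing the 3 characters immediately before the cursor
    assert_eq!(offsets((7, 10), 10), (-3, 0));
}
```

Widening to `i128` sidesteps any overflow concern when subtracting two `usize` values on 64-bit targets.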
@ -414,6 +410,10 @@ impl Completion {
true true
} }
pub fn area(&mut self, viewport: Rect, editor: &Editor) -> Rect {
self.popup.area(viewport, editor)
}
} }
impl Component for Completion { impl Component for Completion {
@ -481,7 +481,7 @@ impl Component for Completion {
}; };
let popup_area = { let popup_area = {
let (popup_x, popup_y) = self.popup.get_rel_position(area, cx); let (popup_x, popup_y) = self.popup.get_rel_position(area, cx.editor);
let (popup_width, popup_height) = self.popup.get_size(); let (popup_width, popup_height) = self.popup.get_size();
Rect::new(popup_x, popup_y, popup_width, popup_height) Rect::new(popup_x, popup_y, popup_width, popup_height)
}; };

@ -118,7 +118,7 @@ pub fn render_document(
fn translate_positions( fn translate_positions(
char_pos: usize, char_pos: usize,
first_visisble_char_idx: usize, first_visible_char_idx: usize,
translated_positions: &mut [TranslatedPosition], translated_positions: &mut [TranslatedPosition],
text_fmt: &TextFormat, text_fmt: &TextFormat,
renderer: &mut TextRenderer, renderer: &mut TextRenderer,
@ -126,7 +126,7 @@ fn translate_positions(
) { ) {
// check if any positions translated on the fly (like cursor) has been reached // check if any positions translated on the fly (like cursor) has been reached
for (char_idx, callback) in &mut *translated_positions { for (char_idx, callback) in &mut *translated_positions {
if *char_idx < char_pos && *char_idx >= first_visisble_char_idx { if *char_idx < char_pos && *char_idx >= first_visible_char_idx {
// by replacing the char_index with the large value usize::MAX we ensure // by replacing the char_index with the large value usize::MAX we ensure
// that the same position is only translated once // that the same position is only translated once
// text will never reach usize::MAX as rust memory allocations are limited // text will never reach usize::MAX as rust memory allocations are limited
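The comments above describe a sentinel trick: once a position has been translated, it is overwritten with `usize::MAX`, which no real character index can reach, so later passes skip it. A runnable sketch:

```rust
// Once a position is translated to screen coordinates it is overwritten
// with usize::MAX, so repeated passes over the same slice skip it.
// Real char indices never reach usize::MAX (allocations can't grow that large).
fn translate_once(positions: &mut [usize], char_pos: usize, first_visible: usize) -> usize {
    let mut translated = 0;
    for pos in positions.iter_mut() {
        if *pos < char_pos && *pos >= first_visible {
            translated += 1;
            *pos = usize::MAX; // sentinel: never matches the range check again
        }
    }
    translated
}

fn main() {
    let mut positions = [3, 8, 42];
    assert_eq!(translate_once(&mut positions, 10, 0), 2);
    // a second pass finds nothing new to translate
    assert_eq!(translate_once(&mut positions, 10, 0), 0);
    assert_eq!(positions, [usize::MAX, usize::MAX, 42]);
}
```

This keeps the hot render loop free of a separate "already translated" bitset.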
@ -259,7 +259,7 @@ pub fn render_text<'t>(
} }
} }
// aquire the correct grapheme style // acquire the correct grapheme style
if char_pos >= style_span.1 { if char_pos >= style_span.1 {
style_span = styles.next().unwrap_or((Style::default(), usize::MAX)); style_span = styles.next().unwrap_or((Style::default(), usize::MAX));
} }
@ -404,7 +404,7 @@ impl<'a> TextRenderer<'a> {
let cut_off_start = self.col_offset.saturating_sub(position.col); let cut_off_start = self.col_offset.saturating_sub(position.col);
let is_whitespace = grapheme.is_whitespace(); let is_whitespace = grapheme.is_whitespace();
// TODO is it correct to apply the whitspace style to all unicode white spaces? // TODO is it correct to apply the whitespace style to all unicode white spaces?
if is_whitespace { if is_whitespace {
style = style.patch(self.whitespace_style); style = style.patch(self.whitespace_style);
} }

@ -93,40 +93,6 @@ impl EditorView {
let mut line_decorations: Vec<Box<dyn LineDecoration>> = Vec::new(); let mut line_decorations: Vec<Box<dyn LineDecoration>> = Vec::new();
let mut translated_positions: Vec<TranslatedPosition> = Vec::new(); let mut translated_positions: Vec<TranslatedPosition> = Vec::new();
// DAP: Highlight current stack frame position
let stack_frame = editor.debugger.as_ref().and_then(|debugger| {
if let (Some(frame), Some(thread_id)) = (debugger.active_frame, debugger.thread_id) {
debugger
.stack_frames
.get(&thread_id)
.and_then(|bt| bt.get(frame))
} else {
None
}
});
if let Some(frame) = stack_frame {
if doc.path().is_some()
&& frame
.source
.as_ref()
.and_then(|source| source.path.as_ref())
== doc.path()
{
let line = frame.line - 1; // convert to 0-indexing
let style = theme.get("ui.highlight");
let line_decoration = move |renderer: &mut TextRenderer, pos: LinePos| {
if pos.doc_line != line {
return;
}
renderer
.surface
.set_style(Rect::new(area.x, pos.visual_line, area.width, 1), style);
};
line_decorations.push(Box::new(line_decoration));
}
}
if is_focused && config.cursorline { if is_focused && config.cursorline {
line_decorations.push(Self::cursorline_decorator(doc, view, theme)) line_decorations.push(Self::cursorline_decorator(doc, view, theme))
} }
@ -135,6 +101,23 @@ impl EditorView {
Self::highlight_cursorcolumn(doc, view, surface, theme, inner, &text_annotations); Self::highlight_cursorcolumn(doc, view, surface, theme, inner, &text_annotations);
} }
// Set DAP highlights, if needed.
if let Some(frame) = editor.current_stack_frame() {
let dap_line = frame.line.saturating_sub(1) as usize;
let style = theme.get("ui.highlight.frameline");
let line_decoration = move |renderer: &mut TextRenderer, pos: LinePos| {
if pos.doc_line != dap_line {
return;
}
renderer.surface.set_style(
Rect::new(inner.x, inner.y + pos.visual_line, inner.width, 1),
style,
);
};
line_decorations.push(Box::new(line_decoration));
}
let mut highlights = let mut highlights =
Self::doc_syntax_highlights(doc, view.offset.anchor, inner.height, theme); Self::doc_syntax_highlights(doc, view.offset.anchor, inner.height, theme);
let overlay_highlights = Self::overlay_syntax_highlights( let overlay_highlights = Self::overlay_syntax_highlights(
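The relocated DAP hunk above registers the frameline highlight as a boxed closure that captures the (0-indexed) frame line and only paints its own line. A self-contained sketch of that shape, recording painted lines into a `Vec` instead of writing to a render surface:

```rust
// Stand-in for the renderer's per-line position (fields assumed).
struct LinePos { doc_line: usize }

// A decoration is a boxed closure invoked once per visible line.
type LineDecoration = Box<dyn Fn(&mut Vec<usize>, &LinePos)>;

fn frameline_decoration(frame_line_1_indexed: usize) -> LineDecoration {
    // DAP reports 1-indexed lines; saturating_sub avoids underflow on 0.
    let dap_line = frame_line_1_indexed.saturating_sub(1);
    Box::new(move |painted, pos| {
        if pos.doc_line != dap_line {
            return;
        }
        painted.push(pos.doc_line);
    })
}

fn main() {
    let deco = frameline_decoration(3); // frame on line 3, i.e. doc line 2
    let mut painted = Vec::new();
    for doc_line in 0..5 {
        deco(&mut painted, &LinePos { doc_line });
    }
    assert_eq!(painted, [2]);
}
```

Note the hunk also swaps the raw `frame.line - 1` for `saturating_sub(1)` and scopes the theme key to `ui.highlight.frameline`.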
@ -422,6 +405,7 @@ impl EditorView {
let primary_selection_scope = theme
.find_scope_index_exact("ui.selection.primary")
.unwrap_or(selection_scope);
let base_cursor_scope = theme
.find_scope_index_exact("ui.cursor")
.unwrap_or(selection_scope);
@ -968,7 +952,7 @@ impl EditorView {
start_offset: usize,
trigger_offset: usize,
size: Rect,
) -> Option<Rect> {
let mut completion = Completion::new(
editor,
savepoint,
@ -980,15 +964,17 @@ impl EditorView {
if completion.is_empty() {
// skip if we got no completion results
return None;
}
let area = completion.area(size, editor);
editor.last_completion = None;
self.last_insert.1.push(InsertEvent::TriggerCompletion);

// TODO: propagate required size on resize to completion too
completion.required_size((size.width, size.height));
self.completion = Some(completion);
Some(area)
}

pub fn clear_completion(&mut self, editor: &mut Editor) {
@ -1272,13 +1258,15 @@ impl Component for EditorView {
// let completion swallow the event if necessary
let mut consumed = false;
if let Some(completion) = &mut self.completion {
let res = {
// use a fake context here
let mut cx = Context {
editor: cx.editor,
jobs: cx.jobs,
scroll: None,
};
completion.handle_event(event, &mut cx)
};

if let EventResult::Consumed(callback) = res {
consumed = true;
@ -1286,6 +1274,12 @@ impl Component for EditorView {
if callback.is_some() {
// assume close_fn
self.clear_completion(cx.editor);
// In case the popup was deleted because of an intersection w/ the auto-complete menu.
commands::signature_help_impl(
&mut cx,
commands::SignatureHelpInvoked::Automatic,
);
}
}
}

@ -54,7 +54,7 @@ impl QueryAtom {
}

fn indices(&self, matcher: &Matcher, item: &str, indices: &mut Vec<usize>) -> bool {
// for inverse there are no indices to return
// just return whether we matched
if self.inverse {
return self.matches(matcher, item);
@ -120,7 +120,7 @@ enum QueryAtomKind {
///
/// Usage: `foo`
Fuzzy,
/// Item contains query atom as a continuous substring
///
/// Usage `'foo`
Substring,
@ -213,7 +213,7 @@ impl FuzzyQuery {
Some(score)
}

pub fn fuzzy_indices(&self, item: &str, matcher: &Matcher) -> Option<(i64, Vec<usize>)> {
let (score, mut indices) = self.first_fuzzy_atom.as_ref().map_or_else(
|| Some((0, Vec::new())),
|atom| matcher.fuzzy_indices(item, atom),

@ -7,8 +7,8 @@ fn run_test<'a>(query: &str, items: &'a [&'a str]) -> Vec<String> {
items
.iter()
.filter_map(|item| {
let (_, indices) = query.fuzzy_indices(item, &matcher)?;
let matched_string = indices
.iter()
.map(|&pos| item.chars().nth(pos).unwrap())
.collect();

@ -347,6 +347,7 @@ impl<T: Item + 'static> Component for Menu<T> {
offset: scroll,
selected: self.cursor,
},
false,
);

if let Some(cursor) = self.cursor {

@ -16,7 +16,7 @@ pub struct Overlay<T> {
}

/// Surrounds the component with a margin of 5% on each side, and an additional 2 rows at the bottom
pub fn overlaid<T>(content: T) -> Overlay<T> {
Overlay {
content,
calc_child_size: Box::new(|rect: Rect| clip_rect_relative(rect.clip_bottom(2), 90, 90)),

@ -794,7 +794,7 @@ impl<T: Item + 'static> Component for Picker<T> {
// might be inconsistencies. This is the best we can do since only the
// text in Row is displayed to the end user.
let (_score, highlights) = FuzzyQuery::new(self.prompt.line())
.fuzzy_indices(&line, &self.matcher)
.unwrap_or_default();

let highlight_byte_ranges: Vec<_> = line
@ -885,6 +885,7 @@ impl<T: Item + 'static> Component for Picker<T> {
offset: 0,
selected: Some(cursor),
},
self.truncate_start,
);
}

@ -6,7 +6,10 @@ use crate::{
use tui::buffer::Buffer as Surface;

use helix_core::Position;
use helix_view::{
graphics::{Margin, Rect},
Editor,
};
// TODO: share logic with Menu, it's essentially Popup(render_fn), but render fn needs to return
// a width/height hint. maybe Popup(Box<Component>)
@ -88,10 +91,10 @@ impl<T: Component> Popup<T> {
/// Calculate the position where the popup should be rendered and return the coordinates of the
/// top left corner.
pub fn get_rel_position(&mut self, viewport: Rect, editor: &Editor) -> (u16, u16) {
let position = self
.position
.get_or_insert_with(|| editor.cursor().0.unwrap_or_default());

let (width, height) = self.size;
@ -155,6 +158,16 @@ impl<T: Component> Popup<T> {
pub fn contents_mut(&mut self) -> &mut T {
&mut self.contents
}
pub fn area(&mut self, viewport: Rect, editor: &Editor) -> Rect {
// trigger required_size so we recalculate if the child changed
self.required_size((viewport.width, viewport.height));
let (rel_x, rel_y) = self.get_rel_position(viewport, editor);
// clip to viewport
viewport.intersection(Rect::new(rel_x, rel_y, self.size.0, self.size.1))
}
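`Popup::area` above clips the popup's rectangle against the viewport with `Rect::intersection` so it never renders out of bounds. A sketch of that clipping under a simplified `Rect` (not the real `helix_view::graphics::Rect`):

```rust
// Simplified Rect with an intersection() like the one Popup::area relies on.
#[derive(Debug, PartialEq, Clone, Copy)]
struct Rect {
    x: u16,
    y: u16,
    width: u16,
    height: u16,
}

impl Rect {
    fn right(&self) -> u16 {
        self.x + self.width
    }
    fn bottom(&self) -> u16 {
        self.y + self.height
    }
    // Overlapping region of two rectangles; empty dimensions saturate to 0.
    fn intersection(self, other: Rect) -> Rect {
        let x = self.x.max(other.x);
        let y = self.y.max(other.y);
        let right = self.right().min(other.right());
        let bottom = self.bottom().min(other.bottom());
        Rect { x, y, width: right.saturating_sub(x), height: bottom.saturating_sub(y) }
    }
}

fn main() {
    let viewport = Rect { x: 0, y: 0, width: 80, height: 24 };
    // A popup placed near the bottom-right corner gets clipped to fit.
    let popup = Rect { x: 70, y: 20, width: 20, height: 10 };
    let area = viewport.intersection(popup);
    assert_eq!(area, Rect { x: 70, y: 20, width: 10, height: 4 });
    println!("{:?}", area);
}
```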
}

impl<T: Component> Component for Popup<T> {
@ -232,16 +245,9 @@ impl<T: Component> Component for Popup<T> {
}

fn render(&mut self, viewport: Rect, surface: &mut Surface, cx: &mut Context) {
let area = self.area(viewport, cx.editor);
cx.scroll = Some(self.scroll);

// clear area
let background = cx.editor.theme.get("ui.popup");
surface.clear_with(area, background);

@ -511,11 +511,21 @@ impl Component for Prompt {
ctrl!('e') | key!(End) => self.move_end(),
ctrl!('a') | key!(Home) => self.move_start(),
ctrl!('w') | alt!(Backspace) | ctrl!(Backspace) => {
self.delete_word_backwards(cx.editor);
(self.callback_fn)(cx, &self.line, PromptEvent::Update);
}
alt!('d') | alt!(Delete) | ctrl!(Delete) => {
self.delete_word_forwards(cx.editor);
(self.callback_fn)(cx, &self.line, PromptEvent::Update);
}
ctrl!('k') => {
self.kill_to_end_of_line(cx.editor);
(self.callback_fn)(cx, &self.line, PromptEvent::Update);
}
ctrl!('u') => {
self.kill_to_start_of_line(cx.editor);
(self.callback_fn)(cx, &self.line, PromptEvent::Update);
}
ctrl!('h') | key!(Backspace) | shift!(Backspace) => {
self.delete_char_backwards(cx.editor);
(self.callback_fn)(cx, &self.line, PromptEvent::Update);

@ -2,8 +2,6 @@
mod test {
mod helpers;

use helix_core::{syntax::AutoPairConfig, Selection};
use helix_term::config::Config;

@ -3,7 +3,7 @@ use std::{
ops::RangeInclusive,
};

use helix_core::{diagnostic::Severity, path::get_normalized_path};
use helix_view::doc;

use super::*;
@ -23,7 +23,7 @@ async fn test_write_quit_fail() -> anyhow::Result<()> {
assert_eq!(1, docs.len());
let doc = docs.pop().unwrap();
assert_eq!(Some(&get_normalized_path(file.path())), doc.path());
assert_eq!(&Severity::Error, app.editor.get_status().unwrap().1);
}),
false,
@ -269,7 +269,7 @@ async fn test_write_scratch_to_new_path() -> anyhow::Result<()> {
assert_eq!(1, docs.len());
let doc = docs.pop().unwrap();
assert_eq!(Some(&get_normalized_path(file.path())), doc.path());
}),
false,
)
@ -341,7 +341,7 @@ async fn test_write_new_path() -> anyhow::Result<()> {
Some(&|app| {
let doc = doc!(app.editor);
assert!(!app.editor.is_err());
assert_eq!(&get_normalized_path(file1.path()), doc.path().unwrap());
}),
),
(
@ -349,7 +349,7 @@ async fn test_write_new_path() -> anyhow::Result<()> {
Some(&|app| {
let doc = doc!(app.editor);
assert!(!app.editor.is_err());
assert_eq!(&get_normalized_path(file2.path()), doc.path().unwrap());
assert!(app.editor.document_by_path(file1.path()).is_none());
}),
),

@ -1,6 +1,7 @@
use std::{
fs::File,
io::{Read, Write},
mem::replace,
path::PathBuf,
time::Duration,
};
@ -222,10 +223,11 @@ pub fn temp_file_with_contents<S: AsRef<str>>(
/// Generates a config with defaults more suitable for integration tests
pub fn test_config() -> Config {
Config {
editor: test_editor_config(),
keys: helix_term::keymap::default(),
..Default::default()
}
}
pub fn test_editor_config() -> helix_view::editor::Config {
@ -300,8 +302,10 @@ impl AppBuilder {
// Remove this attribute once `with_config` is used in a test:
#[allow(dead_code)]
pub fn with_config(mut self, mut config: Config) -> Self {
let keys = replace(&mut config.keys, helix_term::keymap::default());
merge_keys(&mut config.keys, keys);
self.config = config;
self
}
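The `with_config` change above uses `mem::replace` to pull the user's keymap out of the config, substitute the defaults, then merge the user's bindings back on top so they win. A sketch of the same trick with a plain `HashMap` standing in for Helix's keymap types:

```rust
use std::collections::HashMap;
use std::mem::replace;

fn main() {
    // User config with one custom binding.
    let mut config: HashMap<&str, &str> = HashMap::from([("w", "custom_save")]);

    // `replace` swaps the defaults into place and hands back the user's keys.
    let user_keys = replace(
        &mut config,
        HashMap::from([("w", "save"), ("q", "quit")]),
    );

    // Merging the user's keys back makes them override the defaults.
    config.extend(user_keys);

    assert_eq!(config.get("w"), Some(&"custom_save")); // user binding wins
    assert_eq!(config.get("q"), Some(&"quit")); // default preserved
    println!("{} bindings", config.len());
}
```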

@ -391,7 +391,7 @@ async fn cursor_position_newly_opened_file() -> anyhow::Result<()> {
#[tokio::test(flavor = "multi_thread")]
async fn cursor_position_append_eof() -> anyhow::Result<()> {
// Selection is forwards
test((
"#[foo|]#",
"abar<esc>",

@ -1,5 +1,7 @@
use super::*;
use helix_core::path::get_normalized_path;

#[tokio::test(flavor = "multi_thread")]
async fn test_split_write_quit_all() -> anyhow::Result<()> {
let mut file1 = tempfile::NamedTempFile::new()?;
@ -25,21 +27,21 @@ async fn test_split_write_quit_all() -> anyhow::Result<()> {
let doc1 = docs
.iter()
.find(|doc| doc.path().unwrap() == &get_normalized_path(file1.path()))
.unwrap();
assert_eq!("hello1", doc1.text().to_string());
let doc2 = docs
.iter()
.find(|doc| doc.path().unwrap() == &get_normalized_path(file2.path()))
.unwrap();
assert_eq!("hello2", doc2.text().to_string());
let doc3 = docs
.iter()
.find(|doc| doc.path().unwrap() == &get_normalized_path(file3.path()))
.unwrap();
assert_eq!("hello3", doc3.text().to_string());

@ -16,7 +16,7 @@ include = ["src/**/*", "README.md"]
default = ["crossterm"]

[dependencies]
bitflags = "2.2"
cassowary = "0.3"
unicode-segmentation = "1.10"
crossterm = { version = "0.26", optional = true }

@ -63,6 +63,7 @@ pub struct CrosstermBackend<W: Write> {
buffer: W,
capabilities: Capabilities,
supports_keyboard_enhancement_protocol: OnceCell<bool>,
mouse_capture_enabled: bool,
}

impl<W> CrosstermBackend<W>
@ -74,25 +75,25 @@ where
buffer,
capabilities: Capabilities::from_env_or_default(config),
supports_keyboard_enhancement_protocol: OnceCell::new(),
mouse_capture_enabled: false,
}
}

#[inline]
fn supports_keyboard_enhancement_protocol(&self) -> bool {
*self.supports_keyboard_enhancement_protocol
.get_or_init(|| {
use std::time::Instant;
let now = Instant::now();
let supported = matches!(terminal::supports_keyboard_enhancement(), Ok(true));
log::debug!(
"The keyboard enhancement protocol is {}supported in this terminal (checked in {:?})",
if supported { "" } else { "not " },
Instant::now().duration_since(now)
);
supported
})
}
}
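The change above swaps `get_or_try_init`/`.copied()` for a plain `get_or_init`, so the (possibly slow) terminal probe runs once and its boolean result is cached for every later call. A minimal sketch of that caching, with a counter in place of the real `terminal::supports_keyboard_enhancement()` probe:

```rust
use std::cell::{Cell, OnceCell};

struct Backend {
    supports: OnceCell<bool>,
    probes: Cell<u32>, // counts how often the expensive probe runs
}

impl Backend {
    fn supports_enhancement(&self) -> bool {
        // get_or_init runs the closure only on the first call; afterwards
        // the cached bool is returned directly.
        *self.supports.get_or_init(|| {
            self.probes.set(self.probes.get() + 1);
            true // stand-in for the real terminal capability probe
        })
    }
}

fn main() {
    let backend = Backend { supports: OnceCell::new(), probes: Cell::new(0) };
    assert!(backend.supports_enhancement());
    assert!(backend.supports_enhancement());
    // The probe ran exactly once despite two calls.
    assert_eq!(backend.probes.get(), 1);
    println!("probes: {}", backend.probes.get());
}
```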
@ -124,8 +125,9 @@ where
execute!(self.buffer, terminal::Clear(terminal::ClearType::All))?;
if config.enable_mouse_capture {
execute!(self.buffer, EnableMouseCapture)?;
self.mouse_capture_enabled = true;
}
if self.supports_keyboard_enhancement_protocol() {
execute!(
self.buffer,
PushKeyboardEnhancementFlags(
@ -137,13 +139,26 @@ where
Ok(())
}
fn reconfigure(&mut self, config: Config) -> io::Result<()> {
if self.mouse_capture_enabled != config.enable_mouse_capture {
if config.enable_mouse_capture {
execute!(self.buffer, EnableMouseCapture)?;
} else {
execute!(self.buffer, DisableMouseCapture)?;
}
self.mouse_capture_enabled = config.enable_mouse_capture;
}
Ok(())
}
fn restore(&mut self, config: Config) -> io::Result<()> {
// reset cursor shape
write!(self.buffer, "\x1B[0 q")?;
if config.enable_mouse_capture {
execute!(self.buffer, DisableMouseCapture)?;
}
if self.supports_keyboard_enhancement_protocol() {
execute!(self.buffer, PopKeyboardEnhancementFlags)?;
}
execute!(
@ -345,9 +360,9 @@ impl ModifierDiff {
}
}

/// Crossterm uses semicolon as a separator for colors
/// this is actually not spec compliant (although commonly supported)
/// However the correct approach is to use colons as a separator.
/// This usually doesn't make a difference for emulators that do support colored underlines.
/// However terminals that do not support colored underlines will ignore underlines colors with colons
/// while escape sequences with semicolons are always processed which leads to weird visual artifacts.

@ -14,6 +14,7 @@ pub use self::test::TestBackend;
pub trait Backend {
fn claim(&mut self, config: Config) -> Result<(), io::Error>;
fn reconfigure(&mut self, config: Config) -> Result<(), io::Error>;
fn restore(&mut self, config: Config) -> Result<(), io::Error>;
fn force_restore() -> Result<(), io::Error>;
fn draw<'a, I>(&mut self, content: I) -> Result<(), io::Error>

@ -111,6 +111,10 @@ impl Backend for TestBackend {
Ok(())
}
fn reconfigure(&mut self, _config: Config) -> Result<(), io::Error> {
Ok(())
}
fn restore(&mut self, _config: Config) -> Result<(), io::Error> {
Ok(())
}

@ -433,6 +433,47 @@ impl Buffer {
(x_offset as u16, y)
}
pub fn set_spans_truncated(&mut self, x: u16, y: u16, spans: &Spans, width: u16) -> (u16, u16) {
// prevent panic if out of range
if !self.in_bounds(x, y) || width == 0 {
return (x, y);
}
let mut x_offset = x as usize;
let max_offset = min(self.area.right(), width.saturating_add(x));
let mut start_index = self.index_of(x, y);
let mut index = self.index_of(max_offset as u16, y);
let content_width = spans.width();
let truncated = content_width > width as usize;
if truncated {
self.content[start_index].set_symbol("…");
start_index += 1;
} else {
index -= width as usize - content_width;
}
for span in spans.0.iter().rev() {
for s in span.content.graphemes(true).rev() {
let width = s.width();
if width == 0 {
continue;
}
let start = index - width;
if start < start_index {
break;
}
self.content[start].set_symbol(s);
self.content[start].set_style(span.style);
for i in start + 1..index {
self.content[i].reset();
}
index -= width;
x_offset += width;
}
}
(x_offset as u16, y)
}
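The new `set_spans_truncated` above keeps the *end* of content that is wider than its cell and marks the cut with a leading "…". The real code walks grapheme clusters and display widths via unicode-segmentation; a simplified sketch of the same idea, assuming 1-column characters:

```rust
// Simplified leading-ellipsis truncation: keep the tail of the string
// and reserve one column for "…". The real implementation operates on
// grapheme clusters and display widths, not chars.
fn truncate_start(s: &str, width: usize) -> String {
    let len = s.chars().count();
    if len <= width || width == 0 {
        return s.to_string();
    }
    let keep = width - 1; // one column reserved for the ellipsis
    let tail: String = s.chars().skip(len - keep).collect();
    format!("…{}", tail)
}

fn main() {
    // Long paths keep their most informative part: the file name.
    assert_eq!(truncate_start("helix-term/src/ui/picker.rs", 12), "…i/picker.rs");
    // Content that already fits is left untouched.
    assert_eq!(truncate_start("short", 12), "short");
    println!("{}", truncate_start("helix-term/src/ui/picker.rs", 12));
}
```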
pub fn set_spans(&mut self, x: u16, y: u16, spans: &Spans, width: u16) -> (u16, u16) {
let mut remaining_width = width;
let mut x = x;

@ -116,6 +116,10 @@ where
self.backend.claim(config)
}
pub fn reconfigure(&mut self, config: Config) -> io::Result<()> {
self.backend.reconfigure(config)
}
pub fn restore(&mut self, config: Config) -> io::Result<()> {
self.backend.restore(config)
}

@ -354,7 +354,13 @@ impl TableState {
impl<'a> Table<'a> {
// type State = TableState;
pub fn render_table(
mut self,
area: Rect,
buf: &mut Buffer,
state: &mut TableState,
truncate: bool,
) {
if area.area() == 0 {
return;
}
@ -401,6 +407,7 @@ impl<'a> Table<'a> {
width: *width,
height: max_header_height,
},
truncate,
);
col += *width + self.column_spacing;
}
@ -457,6 +464,7 @@ impl<'a> Table<'a> {
width: *width,
height: table_row.height,
},
truncate,
);
col += *width + self.column_spacing;
}
@ -464,20 +472,24 @@ impl<'a> Table<'a> {
}
}

fn render_cell(buf: &mut Buffer, cell: &Cell, area: Rect, truncate: bool) {
buf.set_style(area, cell.style);
for (i, spans) in cell.content.lines.iter().enumerate() {
if i as u16 >= area.height {
break;
}
if truncate {
buf.set_spans_truncated(area.x, area.y + i as u16, spans, area.width);
} else {
buf.set_spans(area.x, area.y + i as u16, spans, area.width);
}
}
}

impl<'a> Widget for Table<'a> {
fn render(self, area: Rect, buf: &mut Buffer) {
let mut state = TableState::default();
Table::render_table(self, area, buf, &mut state, false);
}
}

@ -17,7 +17,7 @@ tokio = { version = "1", features = ["rt", "rt-multi-thread", "time", "sync", "p
parking_lot = "0.12"
arc-swap = { version = "1.6.0" }
gix = { version = "0.43.0", default-features = false , optional = true }
imara-diff = "0.1.5"
anyhow = "1"

@ -14,7 +14,7 @@ default = []
term = ["crossterm"]

[dependencies]
bitflags = "2.2"
anyhow = "1"
helix-core = { version = "0.6", path = "../helix-core" }
helix-loader = { version = "0.6", path = "../helix-loader" }

@ -10,6 +10,7 @@ use crate::{
view::ViewPosition,
Align, Document, DocumentId, View, ViewId,
};
use dap::StackFrame;
use helix_vcs::DiffProviderRegistry;

use futures_util::stream::select_all::SelectAll;
@ -281,6 +282,8 @@ pub struct Config {
/// Whether to color modes with different colors. Defaults to `false`.
pub color_modes: bool,
pub soft_wrap: SoftWrap,
/// Workspace specific lsp ceiling dirs
pub workspace_lsp_roots: Vec<PathBuf>,
}

#[derive(Debug, Default, Clone, PartialEq, Eq, Serialize, Deserialize)]
@ -349,6 +352,8 @@ pub struct LspConfig {
pub display_signature_help_docs: bool,
/// Display inlay hints
pub display_inlay_hints: bool,
/// Whether to enable snippet support
pub snippets: bool,
}

impl Default for LspConfig {
@ -359,6 +364,7 @@ impl Default for LspConfig {
auto_signature_help: true,
display_signature_help_docs: true,
display_inlay_hints: false,
snippets: true,
}
}
}
@ -743,9 +749,13 @@ impl Default for Config {
bufferline: BufferLine::default(),
indent_guides: IndentGuidesConfig::default(),
color_modes: false,
soft_wrap: SoftWrap {
enable: Some(false),
..SoftWrap::default()
},
text_width: 80,
completion_replace: false,
workspace_lsp_roots: Vec::new(),
}
}
}
@ -844,7 +854,7 @@ pub struct Editor {
pub config_events: (UnboundedSender<ConfigEvent>, UnboundedReceiver<ConfigEvent>),
/// Allows asynchronous tasks to control the rendering
/// The `Notify` allows asynchronous tasks to request the editor to perform a redraw
/// The `RwLock` blocks the editor from performing the render until an exclusive lock can be acquired
pub redraw_handle: RedrawHandle,
pub needs_redraw: bool,
/// Cached position of the cursor calculated during rendering.
@ -1086,15 +1096,15 @@ impl Editor {
}
}

// if doc doesn't have a URL it's a scratch buffer, ignore it
let doc = self.document(doc_id)?;
let (lang, path) = (doc.language.clone(), doc.path().cloned());
let config = doc.config.load();
let root_dirs = &config.workspace_lsp_roots;

// try to find a language server based on the language name
let language_server = lang.as_ref().and_then(|language| {
self.language_servers
.get(language, path.as_ref(), root_dirs, config.lsp.snippets)
.map_err(|e| {
log::error!(
"Failed to initialize the LSP for `{}` {{ {} }}",
@ -1652,6 +1662,12 @@ impl Editor {
doc.restore_cursor = false;
}
}
pub fn current_stack_frame(&self) -> Option<&StackFrame> {
self.debugger
.as_ref()
.and_then(|debugger| debugger.current_stack_frame())
}
}

fn try_restore_indent(doc: &mut Document, view: &mut View) {

@ -2,7 +2,7 @@ use std::fmt::Write;
use crate::{
editor::GutterType,
graphics::{Style, UnderlineStyle},
Document, Editor, Theme, View,
};
@ -245,9 +245,9 @@ pub fn breakpoints<'doc>(
theme: &Theme,
_is_focused: bool,
) -> GutterFn<'doc> {
let error = theme.get("error");
let info = theme.get("info");
let breakpoint_style = theme.get("ui.debug.breakpoint");

let breakpoints = doc.path().and_then(|path| editor.breakpoints.get(path));
@ -265,30 +265,52 @@ pub fn breakpoints<'doc>(
.iter()
.find(|breakpoint| breakpoint.line == line)?;

let style = if breakpoint.condition.is_some() && breakpoint.log_message.is_some() {
error.underline_style(UnderlineStyle::Line)
} else if breakpoint.condition.is_some() {
error
} else if breakpoint.log_message.is_some() {
info
} else {
breakpoint_style
};

let sym = if breakpoint.verified { "●" } else { "◯" };
write!(out, "{}", sym).unwrap();
Some(style)
},
)
}

fn execution_pause_indicator<'doc>(
editor: &'doc Editor,
doc: &'doc Document,
theme: &Theme,
is_focused: bool,
) -> GutterFn<'doc> {
let style = theme.get("ui.debug.active");
let current_stack_frame = editor.current_stack_frame();
let frame_line = current_stack_frame.map(|frame| frame.line - 1);
let frame_source_path = current_stack_frame.map(|frame| {
frame
.source
.as_ref()
.and_then(|source| source.path.as_ref())
});
let should_display_for_current_doc =
doc.path().is_some() && frame_source_path.unwrap_or(None) == doc.path();
Box::new(
move |line: usize, _selected: bool, first_visual_line: bool, out: &mut String| {
if !first_visual_line
|| !is_focused
|| line != frame_line?
|| !should_display_for_current_doc
{
return None;
}
let sym = "▶";
write!(out, "{}", sym).unwrap();
Some(style)
},
@ -304,9 +326,11 @@ pub fn diagnostics_or_breakpoints<'doc>(
) -> GutterFn<'doc> {
let mut diagnostics = diagnostic(editor, doc, view, theme, is_focused);
let mut breakpoints = breakpoints(editor, doc, view, theme, is_focused);
let mut execution_pause_indicator = execution_pause_indicator(editor, doc, theme, is_focused);
Box::new(move |line, selected, first_visual_line: bool, out| { Box::new(move |line, selected, first_visual_line: bool, out| {
breakpoints(line, selected, first_visual_line, out) execution_pause_indicator(line, selected, first_visual_line, out)
.or_else(|| breakpoints(line, selected, first_visual_line, out))
.or_else(|| diagnostics(line, selected, first_visual_line, out)) .or_else(|| diagnostics(line, selected, first_visual_line, out))
}) })
} }
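The combined gutter resolves renderers by priority: the first closure to return `Some` wins, so the pause indicator overrides breakpoints, which override diagnostics. A minimal runnable sketch of that `Option::or_else` chain, using hypothetical stand-in renderers that return plain symbols instead of writing styled output:

```rust
// Hypothetical stand-ins for the real gutter renderers: each returns
// Some(symbol) when it has something to draw on this line, else None.
fn pause(line: usize) -> Option<&'static str> {
    if line == 3 { Some("▶") } else { None }
}
fn breakpoint(line: usize) -> Option<&'static str> {
    if line == 3 || line == 7 { Some("●") } else { None }
}
fn diagnostic(line: usize) -> Option<&'static str> {
    if line == 9 { Some("E") } else { None }
}

// The fallback chain: or_else only runs the next renderer when the
// previous one returned None, giving pause > breakpoint > diagnostic.
fn gutter_symbol(line: usize) -> Option<&'static str> {
    pause(line)
        .or_else(|| breakpoint(line))
        .or_else(|| diagnostic(line))
}

fn main() {
    assert_eq!(gutter_symbol(3), Some("▶")); // pause indicator wins over the breakpoint
    assert_eq!(gutter_symbol(7), Some("●"));
    assert_eq!(gutter_symbol(9), Some("E"));
    assert_eq!(gutter_symbol(1), None);
}
```

Because `or_else` takes a closure, the lower-priority renderers are only invoked when needed, matching the lazy evaluation of the boxed gutter closures above.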

@ -321,6 +321,7 @@ impl Editor {
         }
     }
     None => {
+        self.debugger = None;
         self.set_status(
             "Terminated debugging session and disconnected debugger.",
         );
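The fix above drops the debugger handle when the session terminates, so later "is a session active?" checks no longer see a stale client. A minimal sketch, assuming simplified stand-in types (the real `Editor` holds a DAP client, not a `String`):

```rust
// Hypothetical minimal model of the change: the editor owns an optional
// debugger session and must clear it when the session terminates.
struct Editor {
    debugger: Option<String>, // stand-in for the real debugger client
    status: String,
}

impl Editor {
    fn on_terminated(&mut self) {
        // Dropping the handle here is the actual fix; without it the
        // Option still holds the disconnected client.
        self.debugger = None;
        self.status = "Terminated debugging session and disconnected debugger.".to_string();
    }
}

fn main() {
    let mut editor = Editor {
        debugger: Some("dap-session".into()),
        status: String::new(),
    };
    editor.on_terminated();
    assert!(editor.debugger.is_none());
}
```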

@ -128,7 +128,7 @@ impl Loader {
         let parent_palette = parent_theme_toml.get("palette");
         let palette = theme_toml.get("palette");
-        // handle the table seperately since it needs a `merge_depth` of 2
+        // handle the table separately since it needs a `merge_depth` of 2
         // this would conflict with the rest of the theme merge strategy
         let palette_values = match (parent_palette, palette) {
             (Some(parent_palette), Some(palette)) => {

@ -212,7 +212,7 @@ source = { git = "https://github.com/tree-sitter/tree-sitter-c", rev = "7175a6dd
 name = "cpp"
 scope = "source.cpp"
 injection-regex = "cpp"
-file-types = ["cc", "hh", "c++", "cpp", "hpp", "h", "ipp", "tpp", "cxx", "hxx", "ixx", "txx", "ino", "C", "H"]
+file-types = ["cc", "hh", "c++", "cpp", "hpp", "h", "ipp", "tpp", "cxx", "hxx", "ixx", "txx", "ino", "C", "H", "cu", "cuh"]
 roots = []
 comment-token = "//"
 language-server = { command = "clangd" }
@ -606,7 +606,7 @@ indent = { tab-width = 2, unit = "  " }

 [[grammar]]
 name = "ruby"
-source = { git = "https://github.com/tree-sitter/tree-sitter-ruby", rev = "4c600a463d97e36a0ca5ac57e11f3ac8c297a0fa" }
+source = { git = "https://github.com/tree-sitter/tree-sitter-ruby", rev = "206c7077164372c596ffa8eaadb9435c28941364" }

 [[language]]
 name = "bash"
@ -875,14 +875,14 @@ name = "haskell"
 scope = "source.haskell"
 injection-regex = "haskell"
 file-types = ["hs", "hs-boot"]
-roots = ["Setup.hs", "stack.yaml", "*.cabal"]
+roots = ["Setup.hs", "stack.yaml", "cabal.project"]
 comment-token = "--"
 language-server = { command = "haskell-language-server-wrapper", args = ["--lsp"] }
 indent = { tab-width = 2, unit = "  " }

 [[grammar]]
 name = "haskell"
-source = { git = "https://github.com/tree-sitter/tree-sitter-haskell", rev = "b6ec26f181dd059eedd506fa5fbeae1b8e5556c8" }
+source = { git = "https://github.com/tree-sitter/tree-sitter-haskell", rev = "98fc7f59049aeb713ab9b72a8ff25dcaaef81087" }

 [[language]]
 name = "purescript"
@ -978,8 +978,8 @@ source = { git = "https://github.com/uyha/tree-sitter-cmake", rev = "6e51463ef30

 [[language]]
 name = "make"
 scope = "source.make"
-file-types = ["Makefile", "makefile", "mk", "Justfile", "justfile", ".justfile"]
-injection-regex = "(make|makefile|Makefile|mk|just)"
+file-types = ["Makefile", "makefile", "mk"]
+injection-regex = "(make|makefile|Makefile|mk)"
 roots = []
 comment-token = "#"
 indent = { tab-width = 4, unit = "\t" }
@ -1126,7 +1126,7 @@ indent = { tab-width = 2, unit = "  " }

 [[grammar]]
 name = "markdown"
-source = { git = "https://github.com/MDeiml/tree-sitter-markdown", rev = "7e7aa9a25ca9729db9fe22912f8f47bdb403a979", subpath = "tree-sitter-markdown" }
+source = { git = "https://github.com/MDeiml/tree-sitter-markdown", rev = "fa6bfd51727e4bef99f7eec5f43947f73d64ea7d", subpath = "tree-sitter-markdown" }

 [[language]]
 name = "markdown.inline"
@ -1138,7 +1138,7 @@ grammar = "markdown_inline"

 [[grammar]]
 name = "markdown_inline"
-source = { git = "https://github.com/MDeiml/tree-sitter-markdown", rev = "7e7aa9a25ca9729db9fe22912f8f47bdb403a979", subpath = "tree-sitter-markdown-inline" }
+source = { git = "https://github.com/MDeiml/tree-sitter-markdown", rev = "fa6bfd51727e4bef99f7eec5f43947f73d64ea7d", subpath = "tree-sitter-markdown-inline" }

 [[language]]
 name = "dart"
@ -1194,7 +1194,7 @@ text-width = 72

 [[grammar]]
 name = "git-commit"
-source = { git = "https://github.com/the-mikedavis/tree-sitter-git-commit", rev = "7421fd81840950c0ff4191733cee3b6ac06cb295" }
+source = { git = "https://github.com/the-mikedavis/tree-sitter-git-commit", rev = "db88cffa3952dd2328b741af5d0fc69bdb76704f" }

 [[language]]
 name = "diff"
@ -1425,7 +1425,7 @@ language-server = { command = "gleam", args = ["lsp"] }

 [[grammar]]
 name = "gleam"
-source = { git = "https://github.com/gleam-lang/tree-sitter-gleam", rev = "d6cbdf3477fcdb0b4d811518a356f9b5cd1795ed" }
+source = { git = "https://github.com/gleam-lang/tree-sitter-gleam", rev = "ae79782c00656945db69641378e688cdb78d52c1" }

 [[language]]
 name = "ron"
@ -1437,6 +1437,20 @@ comment-token = "//"
 indent = { tab-width = 4, unit = "    " }
 grammar = "rust"

+[[language]]
+name = "robot"
+scope = "source.robot"
+injection-regex = "robot"
+file-types = ["robot", "resource"]
+comment-token = "#"
+roots = []
+indent = { tab-width = 4, unit = "    " }
+language-server = { command = "robotframework_ls" }
+
+[[grammar]]
+name = "robot"
+source = { git = "https://github.com/Hubro/tree-sitter-robot", rev = "f1142bfaa6acfce95e25d2c6d18d218f4f533927" }
+
 [[language]]
 name = "r"
 scope = "source.r"
@ -1446,7 +1460,7 @@ shebangs = ["r", "R"]
 roots = []
 comment-token = "#"
 indent = { tab-width = 2, unit = "  " }
-language-server = { command = "R", args = ["--slave", "-e", "languageserver::run()"] }
+language-server = { command = "R", args = ["--no-echo", "-e", "languageserver::run()"] }

 [[grammar]]
 name = "r"
@ -1545,6 +1559,7 @@ file-types = ["gd"]
 shebangs = []
 roots = ["project.godot"]
 auto-format = true
+formatter = { command = "gdformat", args = ["-"] }
 comment-token = "#"
 indent = { tab-width = 4, unit = "\t" }
@ -2068,7 +2083,7 @@ source = { git = "https://github.com/Unoqwy/tree-sitter-kdl", rev = "e1cd292c6d1
 name = "xml"
 scope = "source.xml"
 injection-regex = "xml"
-file-types = ["xml", "mobileconfig", "plist", "xib", "storyboard", "svg"]
+file-types = ["xml", "mobileconfig", "plist", "xib", "storyboard", "svg", "xsd"]
 indent = { tab-width = 2, unit = "  " }
 roots = []
@ -2084,6 +2099,26 @@ roots = []
 name = "xml"
 source = { git = "https://github.com/RenjiSann/tree-sitter-xml", rev = "48a7c2b6fb9d515577e115e6788937e837815651" }

+[[language]]
+name = "dtd"
+scope = "source.dtd"
+injection-regex = "dtd"
+file-types = ["dtd", "ent"]
+indent = {tab-width = 2, unit = "  "}
+roots = []
+
+[language.auto-pairs]
+'(' = ')'
+'[' = ']'
+'"' = '"'
+"'" = "'"
+'<' = '>'
+
+[[grammar]]
+name = "dtd"
+source = { git = "https://github.com/KMikeeU/tree-sitter-dtd", rev = "6116becb02a6b8e9588ef73d300a9ba4622e156f"}
+
 [[language]]
 name = "wit"
 scope = "source.wit"
@ -2364,7 +2399,7 @@ file-types = ["smithy"]
 roots = ["smithy-build.json"]
 comment-token = "//"
 indent = { tab-width = 4, unit = "    " }
-language-server = { command = "cs", args = ["launch", "com.disneystreaming.smithy:smithy-language-server:latest.release", "--", "0"] }
+language-server = { command = "cs", args = ["launch", "--contrib", "smithy-language-server", "--", "0"] }

 [[grammar]]
 name = "smithy"
@ -2421,3 +2456,65 @@ language-server = { command = "nimlangserver" }

 [[grammar]]
 name = "nim"
 source = { git = "https://github.com/aMOPel/tree-sitter-nim", rev = "240239b232550e431d67de250d1b5856209e7f06" }
[[language]]
name = "cabal"
scope = "source.cabal"
file-types = [ "cabal" ]
roots = ["cabal.project", "Setup.hs"]
indent = { tab-width = 2, unit = " " }
comment-token = "--"
[[grammar]]
name = "cabal"
source = { git = "https://gitlab.com/magus/tree-sitter-cabal", rev = "7d5fa6887ae05a0b06d046f1e754c197c8ad869b" }
[[language]]
name = "hurl"
scope = "source.hurl"
injection-regex = "hurl"
file-types = ["hurl"]
roots = []
comment-token = "#"
indent = { tab-width = 2, unit = " " }
[[grammar]]
name = "hurl"
source = { git = "https://github.com/pfeiferj/tree-sitter-hurl", rev = "264c42064b61ee21abe88d0061f29a0523352e22" }
[[language]]
name = "markdoc"
scope = "text.markdoc"
roots = []
file-types = ["mdoc"]
language-server = { command = "markdoc-ls", args = ["--stdio"] }
[[grammar]]
name = "markdoc"
source = { git = "https://github.com/markdoc-extra/tree-sitter-markdoc", rev = "5ffe71b29e8a3f94823913ea9cea51fcfa7e3bf8" }
[[language]]
name = "opencl"
scope = "source.cl"
injection-regex = "(cl|opencl)"
file-types = ["cl"]
roots = []
comment-token = "//"
language-server = { command = "clangd" }
[[grammar]]
name = "opencl"
source = { git = "https://github.com/lefp/tree-sitter-opencl", rev = "8e1d24a57066b3cd1bb9685bbc1ca9de5c1b78fb" }
[[language]]
name = "just"
scope = "source.just"
file-types = ["justfile", "Justfile", "just"]
injection-regex = "just"
roots = []
comment-token = "#"
indent = { tab-width = 4, unit = "\t" }
[[grammar]]
name = "just"
source = { git = "https://github.com/IndianBoy42/tree-sitter-just", rev = "8af0aab79854aaf25b620a52c39485849922f766" }

@ -107,6 +107,7 @@
(null) @constant (null) @constant
(number_literal) @constant.numeric (number_literal) @constant.numeric
(char_literal) @constant.character (char_literal) @constant.character
(escape_sequence) @constant.character.escape
(call_expression (call_expression
function: (identifier) @function) function: (identifier) @function)

@ -0,0 +1,15 @@
(comment) @comment
[
"cabal-version"
(field_name)
] @type
(section_name) @type
[
(section_type)
"if"
"elseif"
"else"
] @keyword

@ -0,0 +1,39 @@
; highlights.scm
(comment) @comment
[
"ELEMENT"
"ATTLIST"
] @keyword
[
"#REQUIRED"
"#IMPLIED"
"#FIXED"
"#PCDATA"
] @keyword.directive
[
"EMPTY"
"ANY"
"SYSTEM"
"PUBLIC"
] @constant
(element_name) @module
(attribute_name) @attribute
(system_literal) @string
(pubid_literal) @string
(attribute_value) @string
[
">"
"</"
"<?"
"?>"
"<!"
] @punctuation.bracket

@ -0,0 +1,2 @@
((comment) @injection.content
(#set! injection.language "comment"))

@ -10,5 +10,9 @@
(change kind: "modified" @diff.delta) (change kind: "modified" @diff.delta)
(change kind: "renamed" @diff.delta.moved) (change kind: "renamed" @diff.delta.moved)
[":" "->" (scissors)] @punctuation.delimiter (trailer
key: (trailer_key) @variable.other.member
value: (trailer_value) @string)
[":" "=" "->" (scissors)] @punctuation.delimiter
(comment) @comment (comment) @comment

@ -77,6 +77,7 @@
"if" "if"
"import" "import"
"let" "let"
"panic"
"todo" "todo"
"try" "try"
"type" "type"

@ -14,8 +14,6 @@
(argument_list) (argument_list)
(field_declaration_list) (field_declaration_list)
(block) (block)
(type_switch_statement)
(expression_switch_statement)
(var_declaration) (var_declaration)
] @indent ] @indent
@ -24,5 +22,19 @@
")" ")"
] @outdent ] @outdent
((_ "}" @outdent) @outer (#not-kind-eq? @outer "select_statement")) ; Switches and selects aren't indented, only their case bodies are.
(communication_case) @extend ; Outdent all closing braces except those closing switches or selects.
(
(_ "}" @outdent) @outer
(#not-kind-eq? @outer "select_statement")
(#not-kind-eq? @outer "type_switch_statement")
(#not-kind-eq? @outer "expression_switch_statement")
)
; Starting a line after a new case should indent.
[
(communication_case)
(expression_case)
(default_case)
(type_case)
] @extend

@ -12,7 +12,7 @@
(identifier) @local.definition)) (identifier) @local.definition))
(var_spec (var_spec
name: (identifier) @local.definition) (identifier) @local.definition)
(for_statement (for_statement
(range_clause (range_clause

@ -1,2 +1,6 @@
((comment) @injection.content ((comment) @injection.content
(#set! injection.language "comment")) (#set! injection.language "comment"))
(quasiquote
(quoter) @injection.language
(quasiquote_body) @injection.content)

@ -2,7 +2,7 @@
[ [
(adt) (adt)
(decl_type) (type_alias)
(newtype) (newtype)
] @class.around ] @class.around

@ -0,0 +1,127 @@
[
"[QueryStringParams]"
"[FormParams]"
"[MultipartFormData]"
"[Cookies]"
"[Captures]"
"[Asserts]"
"[Options]"
"[BasicAuth]"
] @attribute
(comment) @comment
[
(key_string)
(json_key_string)
] @variable.other.member
(value_string) @string
(quoted_string) @string
(json_string) @string
(file_value) @string.special.path
(regex) @string.regex
[
"\\"
(regex_escaped_char)
(quoted_string_escaped_char)
(key_string_escaped_char)
(value_string_escaped_char)
(oneline_string_escaped_char)
(multiline_string_escaped_char)
(filename_escaped_char)
(json_string_escaped_char)
] @constant.character.escape
(method) @type.builtin
(multiline_string_type) @type
[
"status"
"url"
"header"
"cookie"
"body"
"xpath"
"jsonpath"
"regex"
"variable"
"duration"
"sha256"
"md5"
"bytes"
] @function.builtin
(filter) @attribute
(version) @string.special
[
"null"
"cacert"
"location"
"insecure"
"max-redirs"
"retry"
"retry-interval"
"retry-max-count"
(variable_option "variable")
"verbose"
"very-verbose"
] @constant.builtin
(boolean) @constant.builtin.boolean
(variable_name) @variable
[
"not"
"equals"
"=="
"notEquals"
"!="
"greaterThan"
">"
"greaterThanOrEquals"
">="
"lessThan"
"<"
"lessThanOrEquals"
"<="
"startsWith"
"endsWith"
"contains"
"matches"
"exists"
"includes"
"isInteger"
"isFloat"
"isBoolean"
"isString"
"isCollection"
] @keyword.operator
(integer) @constant.numeric.integer
(float) @constant.numeric.float
(status) @constant.numeric
(json_number) @constant.numeric.float
[
":"
","
] @punctuation.delimiter
[
"["
"]"
"{"
"}"
"{{"
"}}"
] @punctuation.special
[
"base64,"
"file,"
"hex,"
] @string.special

@ -0,0 +1,11 @@
[
(json_object)
(json_array)
(xml_tag)
] @indent
[
"}"
"]"
(xml_close_tag)
] @outdent

@ -0,0 +1,14 @@
((comment) @injection.content
(#set! injection.language "comment"))
((json_value) @injection.content
(#set! injection.language "json"))
((xml) @injection.content
(#set! injection.language "xml"))
((multiline_string
(multiline_string_type) @injection.language
(multiline_string_content) @injection.content)
(#set! injection.include-children)
(#set! injection.combined))

@ -0,0 +1,16 @@
[
(struct_definition)
(macro_definition)
(function_definition)
(compound_expression)
(let_statement)
(if_statement)
(for_statement)
(while_statement)
(do_clause)
(parameter_list)
] @indent
[
"end"
] @outdent

@ -26,3 +26,9 @@
prefix: (identifier) @function.macro) @injection.content prefix: (identifier) @function.macro) @injection.content
(#eq? @function.macro "re") (#eq? @function.macro "re")
(#set! injection.language "regex")) (#set! injection.language "regex"))
(
(prefixed_string_literal
prefix: (identifier) @function.macro) @injection.content
(#eq? @function.macro "md")
(#set! injection.language "markdown"))

@ -0,0 +1,46 @@
(function_definition (_)? @function.inside) @function.around
(short_function_definition (_)? @function.inside) @function.around
(macro_definition (_)? @function.inside) @function.around
(struct_definition (_)? @class.inside) @class.around
(abstract_definition (_)? @class.inside) @class.around
(primitive_definition (_)? @class.inside) @class.around
(parameter_list
; Match all children of parameter_list *except* keyword_parameters
([(identifier)
(slurp_parameter)
(optional_parameter)
(typed_parameter)
(tuple_expression)
(interpolation_expression)
(call_expression)]
@parameter.inside . ","? @parameter.around) @parameter.around)
(keyword_parameters
((_) @parameter.inside . ","? @parameter.around) @parameter.around)
(argument_list
((_) @parameter.inside . ","? @parameter.around) @parameter.around)
(type_parameter_list
((_) @parameter.inside . ","? @parameter.around) @parameter.around)
(line_comment) @comment.inside
(line_comment)+ @comment.around
(block_comment) @comment.inside
(block_comment)+ @comment.around
(_expression (macro_identifier
(identifier) @_name
(#match? @_name "^(test|test_throws|test_logs|inferred|test_deprecated|test_warn|test_nowarn|test_broken|test_skip)$")
)
.
(macro_argument_list) @test.inside) @test.around

@ -0,0 +1,4 @@
(body) @fold
(recipe) @fold
(interpolation) @fold
(item (_) @fold)

@ -0,0 +1,33 @@
(assignment (NAME) @variable)
(alias (NAME) @variable)
(value (NAME) @variable)
(parameter (NAME) @variable)
(setting (NAME) @keyword)
(setting "shell" @keyword)
(call (NAME) @function)
(dependency (NAME) @function)
(depcall (NAME) @function)
(recipeheader (NAME) @function)
(depcall (expression) @variable.parameter)
(parameter) @variable.parameter
(variadic_parameters) @variable.parameter
["if" "else"] @keyword.control.conditional
(string) @string
(boolean ["true" "false"]) @constant.builtin.boolean
(comment) @comment
; (interpolation) @string
(shebang interpreter:(TEXT) @keyword ) @comment
["export" "alias" "set"] @keyword
["@" "==" "!=" "+" ":="] @operator
[ "(" ")" "[" "]" "{{" "}}" "{" "}"] @punctuation.bracket

@ -0,0 +1,3 @@
[
(recipe_body)
] @indent

@ -0,0 +1,16 @@
((comment) @injection.content
(#set! injection.language "comment"))
(shebang_recipe
(shebang
interpreter:(TEXT) @injection.language)
(shebang_body) @injection.content
)
(source_file
(item (setting lang:(NAME) @injection.language))
(item (recipe (body (recipe_body) @injection.content)))
)
; ((interpolation (expression) @injection.content)
; (#set! injection.language "just"))

@ -0,0 +1,10 @@
(assignment (NAME) @local.definition)
(alias left:(NAME) @local.definition)
(alias right:(NAME) @local.reference)
(value (NAME) @local.reference)
(parameter (NAME) @local.definition)
(call (NAME) @local.reference)
(dependency (NAME) @local.reference)
(depcall (NAME) @local.reference)
(recipeheader (NAME) @local.definition)

@ -0,0 +1,48 @@
(body) @function.inside
(recipe) @function.around
(expression
if:(expression) @function.inside
)
(expression
else:(expression) @function.inside
)
(interpolation (expression) @function.inside) @function.around
(settinglist (stringlist) @function.inside) @function.around
(call (NAME) @class.inside) @class.around
(dependency (NAME) @class.inside) @class.around
(depcall (NAME) @class.inside)
(dependency) @parameter.around
(depcall) @parameter.inside
(depcall (expression) @parameter.inside)
(stringlist
(string) @parameter.inside
. ","? @_end
; Commented out since we don't support `#make-range!` at the moment
; (#make-range! "parameter.around" @parameter.inside @_end)
)
(parameters
[(parameter)
(variadic_parameters)] @parameter.inside
. " "? @_end
; Commented out since we don't support `#make-range!` at the moment
; (#make-range! "parameter.around" @parameter.inside @_end)
)
(expression
(condition) @function.inside
) @function.around
(expression
if:(expression) @function.inside
)
(expression
else:(expression) @function.inside
)
(item [(alias) (assignment) (export) (setting)]) @class.around
(recipeheader) @class.around
(line) @class.around
(comment) @comment.around

@ -0,0 +1,16 @@
tag_name: (identifier) @tag
(tag_self_closing "/" @tag)
(tag_close "/" @tag)
([(tag_start) (tag_end) "="] @tag)
(attribute [key : (identifier)] @attribute)
(attribute [shorthand : (identifier)] @attribute)
(variable [variable : (identifier) (variable_sigil)] @variable)
(variable_tail property : (identifier) @variable.other.member)
(function function_name : (identifier) @function)
(function_parameter_named parameter : (identifier) @variable.parameter)
(hash_key key: (identifier) @variable.other.member)
(string) @string
(number) @constant.numeric
(boolean) @constant.builtin.boolean
(null) @constant.builtin

@ -0,0 +1,2 @@
((markdown) @injection.content
(#set! injection.language "markdown"))

@ -0,0 +1,150 @@
[
"sizeof"
; @todo why does "uniform" break highlighting?
; "uniform" ; OpenCL C 3.0.13 reserves this as a keyword, but doesn't seem to use it for anything
(function_qualifier)
] @keyword
[
"enum"
"struct"
"typedef"
"union"
] @keyword.storage.type
[
"extern"
"register"
(type_qualifier)
(access_qualifier)
(storage_class_specifier)
(address_space_qualifier)
] @keyword.storage.modifier
[
"goto"
"break"
"continue"
] @keyword.control
[
"do"
"for"
"while"
] @keyword.control.repeat
[
"if"
"else"
"switch"
"case"
"default"
] @keyword.control.conditional
"return" @keyword.control.return
[
"defined"
"#define"
"#elif"
"#else"
"#endif"
"#if"
"#ifdef"
"#ifndef"
"#include"
(preproc_directive)
] @keyword.directive
(pointer_declarator "*" @type.builtin)
(abstract_pointer_declarator "*" @type.builtin)
[
"+"
"-"
"*"
"/"
"++"
"--"
"%"
"=="
"!="
">"
"<"
">="
"<="
"&&"
"||"
"!"
"&"
"|"
"^"
"~"
"<<"
">>"
"="
"+="
"-="
"*="
"/="
"%="
"<<="
">>="
"&="
"^="
"|="
"?"
] @operator
(conditional_expression ":" @operator)
"..." @punctuation
["," "." ":" ";" "->" "::"] @punctuation.delimiter
["(" ")" "[" "]" "{" "}"] @punctuation.bracket
[(true) (false)] @constant.builtin.boolean
(enumerator name: (identifier) @type.enum.variant)
(string_literal) @string
(system_lib_string) @string
(null) @constant
(number_literal) @constant.numeric
(char_literal) @constant.character
(call_expression
function: (identifier) @function)
(call_expression
function: (field_expression
field: (field_identifier) @function))
(call_expression (argument_list (identifier) @variable))
(function_declarator
declarator: [(identifier) (field_identifier)] @function)
(parameter_declaration
declarator: (identifier) @variable.parameter)
(parameter_declaration
(pointer_declarator
declarator: (identifier) @variable.parameter))
(preproc_function_def
name: (identifier) @function.special)
(attribute
name: (identifier) @attribute)
(field_identifier) @variable.other.member
(statement_identifier) @label
(type_identifier) @type
(scalar_type) @type.builtin
(sized_type_specifier) @type.builtin
(vector_type) @type.builtin
(other_builtin_type) @type.builtin
((identifier) @constant
(#match? @constant "^[A-Z][A-Z\\d_]*$"))
(identifier) @variable
(comment) @comment
