teal-language / teal-language-server
A language server for Teal, a typed dialect of Lua
Hi,
I would like to package teal-language-server for nixpkgs, and while it's doable as-is, I would prefer to package a tagged release (it's also simpler). Otherwise one has to specify the "dev" server to install the software, i.e. `luarocks install --server=https://luarocks.org/dev teal-language-server`
https://luarocks.org/modules/3uclidian/teal-language-server
Use teal-language/tl#369 to grab type info from the cursor position
I think the best candidate for this would be LuaSocket, but I'm open to other suggestions like luv or cqueues.
The CLI should expose the `get_config()` function so we can load a project's config and type-check things correctly; currently we can't preload any modules or add include dirs.
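A minimal sketch of what that could look like, assuming `get_config` reads the project's `tlconfig.lua` (the `Config` record and its fields here are illustrative, based on common tlconfig keys, not the server's actual API):

```teal
-- Hypothetical sketch: load a project's tlconfig.lua so that
-- preload modules and include dirs can be honored when type checking.
-- The Config record and get_config name are assumptions.
local record Config
   include_dir: {string}
   preload_modules: {string}
end

local function get_config(root: string): Config, string
   local chunk, err = loadfile(root .. "/tlconfig.lua")
   if not chunk then
      return nil, err
   end
   local ok, result = pcall(chunk)
   if not ok then
      return nil, tostring(result)
   end
   -- tlconfig.lua conventionally returns a plain table of options
   return result as Config, nil
end
```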
This should probably be solved upstream by having `tl.Error` include a `length: number` field, or `start_x` and `end_x` fields, or something similar.
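A sketch of the proposed upstream shape (the field names here, including the existing ones, are assumptions about `tl.Error`'s layout rather than its actual definition):

```teal
-- Proposed extension (illustrative only): today an error carries a
-- single point, which is not enough to highlight a range in an editor.
local record Error
   y: integer
   x: integer
   msg: string
   -- option A: how many characters the error spans
   length: integer
   -- option B: explicit start and end columns
   start_x: integer
   end_x: integer
end
```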
I want to type-annotate a function that keeps returning functions until the argument is nil. The function is defined like this:
```teal
local type ExecuteResult = function(): string
local type ExecuteArgument = ExecuteArgument | function(string): ExecuteResult

local function execute(prog: string): ExecuteArgument
   local cmd = prog
   local function f(arg: string | nil)
      if arg is nil then
         local f = io.popen(cmd)
         local s = assert(f:read('*a'))
         f:close()
         return s
      else
         cmd = cmd .. " " .. arg
         return f
      end
   end
   return f
end

execute "echo" "hello" "world"()
```
This does not work because `ExecuteArgument` becomes recursive, crashing Teal. Is there any way around this?
At the very least we can incrementally update the token list without re-parsing the entire document. Whether this speedup is worth it, I don't know. We could bring in my ltreesitter module to do this, but then there's the issue of translating the CST that tree-sitter generates into an AST that tl understands, which might not be worth it.
We could get a real speedup if we could figure out a way to incrementally update the type-checking environment instead of regenerating it each time.
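The token-list half of that idea is mostly a splice. A minimal sketch, with an assumed token shape (tl's real token records look different):

```teal
-- Assumed token shape for illustration only.
local record Token
   y: integer
   x: integer
   text: string
end

-- Replace tokens [first, last] of the old list with freshly lexed ones,
-- so only the edited span is re-lexed. Positions of the trailing tokens
-- would still need shifting by the size of the edit; that bookkeeping
-- is omitted here.
local function splice_tokens(old: {Token}, first: integer, last: integer, fresh: {Token}): {Token}
   local out: {Token} = {}
   for i = 1, first - 1 do
      out[#out + 1] = old[i]
   end
   for _, t in ipairs(fresh) do
      out[#out + 1] = t
   end
   for i = last + 1, #old do
      out[#out + 1] = old[i]
   end
   return out
end
```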
Currently it only looks at the hovered token, which is fine for simple things, but for indexing in particular it gives no information at all. For example:
```teal
local record Foo
   x: number
   record Bar
      y: string
   end
end
```
Hovering on `Foo` will show the definition, but hovering on something like `Foo.Bar` will not show any info (except for when you hover over the `Foo` part, but whatever).
To implement this without reimplementing the parser, we either need changes in the teal api to expose more info about nodes or bring in something like tree-sitter. While tree-sitter is nice, I'd like to avoid external deps when possible, especially one that can require some extra steps to install like tree-sitter.
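Whatever the parsing strategy, the lookup itself is just a walk down the dotted chain. Roughly, with a stand-in `Type` record (not tl's internal representation):

```teal
-- Stand-in type representation, for illustration only.
local record Type
   name: string
   fields: {string: Type}
end

-- Resolve a chain like {"Foo", "Bar"} by descending through
-- record field types, returning nil if any step is unknown.
local function resolve_chain(scope: {string: Type}, parts: {string}): Type
   local t = scope[parts[1]]
   for i = 2, #parts do
      if t == nil or t.fields == nil then
         return nil
      end
      t = t.fields[parts[i]]
   end
   return t
end
```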
I'm trying to get this language server working in Neovim, but sadly the hover action doesn't seem to be working. See this gif:
I'd expect some popup to be shown with the type of the variable my cursor currently is on. Did I get this wrong and the behavior is perfectly fine?
Other language servers, like the ones for Lua or Go, work fine.
Is this working for others? If so, can you reproduce with my config below? Do you have suggestions how to further track down the problem?
```
$ nvim --version
NVIM v0.9.5
Build type: Release
LuaJIT 2.1.1702233742
```
Reduced config:
```vim
set runtimepath=$XDG_DATA_HOME/nvim/,$XDG_CONFIG_HOME/nvim,$VIMRUNTIME,$XDG_CONFIG_HOME/nvim/after
if !has('nvim')
  set viminfo+='1000,n$XDG_DATA_HOME/nvim/viminfo
else
  " Do nothing here to use the neovim default
  " or do something like:
  set viminfo+=n~/.shada
endif

let mapleader = "-"
let maplocalleader = ","
let g:python3_host_prog = "~/.local/share/nvim/nvim-env/bin/python3"

" Plugins will be downloaded under the specified directory.
call plug#begin('~/.config/nvim/plugged')
Plug 'neovim/nvim-lspconfig'
Plug 'teal-language/vim-teal'
call plug#end()

lua <<EOF
local util = require('lspconfig.util')

-- Advertise the client's actual capabilities to the server.
local capabilities = vim.lsp.protocol.make_client_capabilities()

local function on_attach(client, bufnr)
  vim.api.nvim_buf_set_option(bufnr, 'omnifunc', 'v:lua.vim.lsp.omnifunc')
  -- Mappings. nvim_buf_set_keymap() only accepts a string rhs, so a Lua
  -- callback like vim.lsp.buf.hover has to go through vim.keymap.set().
  vim.keymap.set('n', 'K', vim.lsp.buf.hover, { buffer = bufnr, silent = true })
end

require('lspconfig').teal_ls.setup{
  on_attach = on_attach,
  capabilities = capabilities,
  root_dir = util.root_pattern("tlconfig.lua", ".git"),
}
EOF
```
This is particularly sad, since querying types (and also completion for variables and entries in records) is a core feature when working with a somewhat larger codebase. It is also arguably one of tl's big advantages that the types are known, and I hoped this would make coding easier.