I'm glad to hear good news on LSP v2 porting progress.
The pylib__linux field might indeed be modified somehow. It's a weird way to parameterize Python in the first place. I change it regularly when I boot a live Linux ISO and install CudaText.
I am coming around to the opinion that CudaText should adopt uv for all plug-ins. $UV_PYTHON_PREFERENCE can prioritize the system Python over any others installed. Meanwhile, uv could eliminate the kind of dependency problem I have hit before, where a plug-in fails because my system lacks a dep. Instead of the user needing to (a) find and then (b) install a missing dep, uv could do it automatically, even pulling in a separate Python version if needed. Right now, CudaText plug-ins have no explicit dependency definitions; at best there are comments and error messages. So there is more involved than just dropping in a portable Python and modding some $PYTHON_* vars. A rough sketch of what declared dependencies could look like follows below.
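To make that concrete, here is a minimal sketch using the PEP 723 inline-metadata format that uv reads when running a single-file script. Everything here is illustrative only: the file name, the "requests" dependency, and the helper function are made up, and CudaText has no such mechanism today.

    # hypothetical file: cuda_example_plugin.py
    # /// script
    # requires-python = ">=3.8"
    # dependencies = [
    #     "requests",   # made-up example dependency
    # ]
    # ///
    import requests

    def fetch_text(url):
        # illustrative helper only; a real plug-in would use the cudatext API
        return requests.get(url).text

Run it with the system interpreter preferred, letting uv fill in the deps (and a Python, if none fits) only when missing:

    UV_PYTHON_PREFERENCE=system uv run cuda_example_plugin.py

With something like that in place, a missing dep would be fetched into an isolated environment automatically instead of the plug-in dying with an ImportError.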
CudaText being written in Pascal is odd for an app so reliant on Python; maybe wxPython could work. But I love CudaText's UI customization. I have a highly modified menu layout.
Anyway, here are some third-party quotes pulled from articles on uv. Nothing below is mine. Thanks!
A year of uv: pros, cons, and should you migrate - Yes, probably.
Because I'm a freelancer dev, and also a trainer...I've seen all other tools fail spectacularly. pyenv, poetry, pipenv, pdm, pyflow, pipx, anaconda...PATH, PYTHONPATH, terrible naming conventions, having multiple Python versions on the same machine, optional packages on Linux, and Python being a system dependency create a thousand ways to shoot yourself in the foot....uv brought...more than Python project management....uv capabilities...alter deeply the way you use Python scripts....Personally, I used to manage a gigantic venv just for my local scripts, which I had to kill and clean every year. Now, you are free to use whatever. It's fast. Transparent. Efficient. Self-descriptive.
Hacker News Review
With the advent of uv, I'm finally feeling like Python packaging is solved. As mentioned in the article, being able to have inline dependencies in a single-file Python script and running it naturally is just beautiful.
One other key part of this is freezing a timestamp with your dependency list...This has also let me easily reconstruct some older environments in less than a minute, when I've been version hunting for 30-60 minutes in the past.
uv uses a platform-independent resolution for its lockfiles and supports features that Poetry does not
There's like 5 different ways to create virtual environments. With uv, you don't have to care about any of that. The venv and your Python install are just handled for you by 'uv run', which is magic.
conda user for 10 years and uv skeptic for 18 months. I get it! I loved my long-lived curated conda envs. I finally tried uv to manage an environment and it's got me hooked....No more meticulous tracking of an env.yml or requirements.txt, just 'uv add' and 'uv sync' and that's it! I just don't think about it anymore
A big thing that trips people up until they try to use a public project (from source) or an older project, is the concept of a dependencies file and a lock file.
The dependency file (what requirements.txt is supposed to be) just documents the things you depend on directly, and possibly known version constraints. A lock file captures the exact version of your direct and indirect dependencies at the moment in time it's generated. When you go to use the project, it will read the lock file, if it exists, and match those versions for anything listed directly or indirectly in the dependency file. It's like keeping a snapshot of the exact last-working dependency configuration. You can always tell it to update the lock file and it will try to recalculate everything from latest that meets your dependency constraints in the dependency file, but if something doesn't work with that you'll presumably have your old lock file to fall back on _that will still work_.
It's a standard issue/pattern in all dependency managers, but it's only been getting attention for a handful of years...It has the side effect of helping old projects keep working much longer though.
uv can handle things like downloading the correct python version, creating a venv (or activating an existing venv if one exists) and essentially all the other cognitive load in a way that's completely transparent to the user. It means you can give someone a Python project and a single command to run it, and you can have confidence it will work regardless of the platform or a dozen other little variables that trip people up.
I am a casual python user, and for that I love uv. Something I haven't quite figured out yet is integration with the pyright lsp - when I edit random projects in neovim, any imports have red squiggles. Does anyone know of a good way to resolve imports for the lsp via uv?
I start a shell with "uv run bash" and start neovim from there. I'm sure there's other ways but it's a quick fix and doesn't involve mucking around with neovim config.
That's brilliant, thanks
EDIT - 'uv run nvim' works also
A Deep Dive into UV: The Fast Python Package Manager | Better Stack Community
Unlike tools like pyenv, uv integrates Python version control with dependency management, ensuring a consistent and optimized development experience.
UV — Intuitively and Exhaustively Explained