The other day I got to share a demo by pasting this command in a Discord channel:
uvx --with 'llm==0.17a0' --with 'llm-claude-3==0.6a0' \
llm -m claude-3.5-sonnet 'describe image' \
-a https://static.simonwillison.net/static/2024/pelicans.jpg
That single command creates a throwaway virtual environment, installs an alpha release of my LLM tool, adds an alpha release of my llm-claude-3 plugin, runs that command with those options, returns the results and then throws everything away again. And because of the way uv uses symlinks and caching, running the command a second time is almost instant.
(That stuff is no longer alpha, see https://simonwillison.net/2024/Oct/29/llm-multi-modal/)
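For anyone trying this today, the stable releases should work without the version pins. A minimal sketch, assuming the current llm-claude-3 release still exposes the claude-3.5-sonnet alias (uvx infers the llm package from the command name):
uvx --with llm-claude-3 \
llm -m claude-3.5-sonnet 'describe image' \
-a https://static.simonwillison.net/static/2024/pelicans.jpg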
Also, how is that different from `pipx run`?
Contrast that with other Python virtual environment tools, where you can end up with the exact same dependency copied dozens of times across your system.
"pipx run", for example, can leave you with identical copies of the same dependency, one per tool that uses it.
1. Replaced `flake8` with `ruff`;
2. Replaced `black` with `ruff`;
3. Replaced `pip` with `uv`;
4. Replaced `poetry` with `uv` (rough command equivalents sketched below).
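For anyone who hasn't made the switch yet, a sketch of the day-to-day command swaps (not an exhaustive mapping, and your invocations may differ):
# linting: was `flake8 .`
ruff check .
# formatting: was `black .`
ruff format .
# installing into the current environment: was `pip install requests`
uv pip install requests
# adding a project dependency (pyproject.toml + lock file): was `poetry add requests`
uv add requests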
Their next project appears to be exactly what I _wish_ their next project to be — a replacement for pyright et al (https://github.com/astral-sh/ruff/discussions/10581). Type checking is my least favorite part of my Python toolchain at the moment; I can't wait to take it for a spin.
If Astral can leapfrog Pyright too, great. But I think it will be significantly harder than leapfrogging Flake8 and Black was.
Seems like `ruff server` is an LSP?
Ruff is great because you lint your code all the time, and it can save maybe a minute per CI run.
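For reference, the whole lint step can be as small as the two commands below, and on most repos they usually finish in well under a second, which is where that minute comes from:
# typical CI lint step with ruff: lint rules plus a formatting check
ruff check .
ruff format --check .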
As for Python package management, my team is migrating to Bazel, which has its own way of locking Python dependencies and then pulling them from a remote cache. Under Bazel, we only re-examine the dependencies when someone proposes a change that regenerates the lock. That's rare enough that a new, faster tool for that step wouldn't be a meaningful benefit.
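For context, a sketch of what that refresh looks like under a conventional rules_python setup (the //third_party path is hypothetical; compile_pip_requirements() generates a <name>.update run target):
# regenerate the Python lock file; only needed when someone changes the requirements
bazel run //third_party:requirements.update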
Have you considered Pants[0], Buck[1] or Waf[2]? What ultimately made you decide to go for Bazel?
Over the course of about a weekend I was able to find a reasonable set of settings for my pyproject.toml and roll that out to many of my projects.
The tooling is maturing and converging on more common ground (e.g. pyproject.toml) while adopting existing, accepted practices and standards.
The documentation is awesome and integrates well with my IDE of choice (PyCharm).
Myself, I run ruff with most of the linters turned on. It is quite strict and opinionated (to the point of being pedantic) -- a good learning path for me, because I would never read PEPs anyway.
But once it is running I just stay within bounds and take the enforced rigor around docstrings, syntactic sugar, and other little things.
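As a rough illustration of what "most of the linters turned on" looks like from the command line (the flags are real; the specific ignores are just one way to silence ruff's own conflicting-rule warnings, and the same selections usually live under [tool.ruff.lint] in pyproject.toml):
# enable every rule family, then drop one of each mutually exclusive docstring rule pair
ruff check --select ALL --ignore D203,D213 .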
I'm not on the uv train yet because I have no issues with venv and pip right now.
For now their decisions align with the community's needs and everyone is happy.
But what happens when their investors' needs diverge from the community's, or worse, when they fail to meet their investors' financial expectations and shut down?
If it gets abandoned, the community will be able to take over its maintenance or fork it.
Tails, you win, heads, you win.
I'd like to hear some disadvantages of using Astral's tools at this point in time, and how they plan to improve.
I really hope that any developer reading this who is interested in this project goes and does it, because I don't feel like doing it myself, but I can honestly feel its impact: I once wanted to try nomadnet on an Arch machine, tried pipx, and it was so slow.
From https://astral.sh/blog/uv-unified-python-packaging
> Tool management: `uv tool install` and `uv tool run` (aliased to `uvx`). uv can now install command-line tools in isolated virtual environments and execute one-off commands without explicit installation (e.g., `uvx ruff check`), making it a high-performance, unified alternative to tools like pipx.
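To make the pipx comparison concrete, roughly the same one-off run with each tool (a sketch; the point is the shared cache, not the specific tool):
# pipx: each one-off run gets a venv with its own copies of the tool's dependencies
pipx run llm --version
# uvx: the throwaway venv is assembled from uv's global cache, so a repeat run is near-instant
uvx llm --version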
I do miss being able to shell into an env, but it hasn't been a big deal because you can still run `uv run -- code .` to launch VSCode with the right Python interpreter.
It's surprising to me that the future toolchain for Python productivity is Rust, but the results are awesome.
(I highly recommend using direnv and sourcing the virtual environment in a .envrc file though… when your editor has support for that it works so well compared to bespoke shell tools)
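Concretely, the .envrc in question can be as small as this, assuming the project's venv lives in .venv (which is where uv puts it by default); run `direnv allow` once in the project directory and an editor with direnv support picks it up from there:
# .envrc: direnv activates the project venv whenever you cd into the directory
source .venv/bin/activate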
What I would be excited for is an actual LSP alternative to Pylance. Pylance is basically a proprietary monopoly. Kind of crazy for Python to not have its own high-performance LSP.
Solution: make python execution as disposable as paper plates :)
uv works very well for me, but it does still sometimes run into weird issues when the underlying Python installation is bad (e.g. installing a library that depends on a dev build).
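One workaround when the system Python is the problem is to let uv manage the interpreter as well; a sketch (the version number is arbitrary):
# download a uv-managed interpreter and build the venv against it instead of the system one
uv python install 3.12
uv venv --python 3.12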
Such a breeze to work with.
In contrast, I had to use pipenv recently and I’d forgotten how terrible its UX was. I wanted to upgrade one of the deps in a smallish project and it took 43 seconds to resolve the change. If anyone reading this is using pipenv, switch to uv right now and thank me later.
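For comparison, the same single-dependency upgrade in each tool (the package name is just an example):
# pipenv: re-locks the project, which is the slow part
pipenv update requests
# uv: upgrades just that package in uv.lock, typically in a second or two at most
uv lock --upgrade-package requests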
Yet another package manager is not what Python needs.
Respectfully, your dogmatism doesn't seem to serve you with numbers like that.