For one example, a number of years back, I built a Python package, environment, and version manager. It was written entirely in Rust and distributed as a binary. Since I knew users would likely have pip installed, shipping it as a wheel gave them an easy way to install it, regardless of OS.
You could go even further, like in this case, and use wheels + PyPI for something unrelated to Python.
I believe it's used for cross-platform linking of Rust/maturin wheels, which seems nice because it's one fewer unusual install script to integrate into your project if Zig isn't packaged for Debian yet.
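For what it's worth, once it's installed, the PyPI-packaged toolchain is just a module you can shell out to. A minimal sketch, assuming the `ziglang` wheel exposes a `python -m ziglang` entry point (check its README for the exact invocation):

```python
# Calling a compiler installed from PyPI instead of the system package manager.
# Assumes `pip install ziglang` has been run and that the wheel exposes a
# `python -m ziglang` entry point (verify against the package's README).
import subprocess
import sys

def zig(*args: str) -> None:
    """Forward arguments to the bundled Zig binary via its module shim."""
    subprocess.run([sys.executable, "-m", "ziglang", *args], check=True)

if __name__ == "__main__":
    zig("version")  # prints the version of the Zig bundled in the wheel
    # zig("cc", "-target", "x86_64-unknown-linux-gnu", "hello.c")  # use it as a cross C compiler
```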
It's useful as a distro-agnostic distribution method. CMake is also installable like this despite having nothing to do with Python.
Or I should say it was useful as a distribution method, because most people already had Python available. Since most distros now don't let you install anything outside a venv, you need uv to install things (via `uv tool install`), and we're not yet at the point where most people already have uv installed.
Regular Python bindings / C extensions don't depend on a PyPI-packaged copy of gcc or LLVM, though. It's understood that those are provided externally, by the "system" environment.
I know some of this has already happened with Rust, but perhaps there's a broader reckoning that needs to occur here wrt standards around how language-specific build and packaging systems handle cross-language projects… which could well point to phasing them out in favour of nix or pixi, which are designed from the get-go to support this use case.
Arbitrary binaries stuffed into Python wheels are usually self-contained single executables, with as little dynamic-linking nonsense as possible, so they don't break all the time or run into dependency conflicts.
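To make that concrete, here's a rough sketch of the launcher-shim pattern these wheels tend to use: the wheel carries one self-contained executable plus a tiny Python entry point that just re-execs it. The names (`mytool`, `bin/`) are made up for illustration, not any particular package's layout:

```python
# Hypothetical launcher shim for a wheel that bundles a self-contained binary.
# The console-script entry point is just this `main`, and `bin/mytool` is the
# statically linked executable shipped inside the package (names made up).
import os
import sys
from pathlib import Path

def main() -> None:
    exe = Path(__file__).parent / "bin" / ("mytool.exe" if os.name == "nt" else "mytool")
    # Hand the process over to the real binary so exit codes and signals pass through.
    os.execv(str(exe), [str(exe), *sys.argv[1:]])

if __name__ == "__main__":
    main()
```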
It seems to consistently work really well for binaries, although it would be nice to have first-class support for integrating npm packages.
That's really cool, actually. Now that AI is a little more commonly available in developer tooling, I feel like it's easier than ever to learn any programming language, since you can braindrain the model.
The standard models are pretty bad at Zig right now, since the language is so new and changes so fast. The entire language spec is available in one HTML file, though, so you can have a little better success feeding that in as context.
> The entire language spec is available in one HTML file, though, so you can have a little better success feeding that in as context.
This is what I've started doing for every library I use. I go to their GitHub, download their docs, and drop the whole thing into my project. Then, whenever the AI gets confused, I say "consult docs/somelib/".
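Rough sketch of that workflow, if anyone wants to script it; the Zig language-reference URL is an assumption on my part, so swap in whatever docs you actually vendor:

```python
# Vendor a library's docs into docs/<name>/ so the assistant can be pointed at
# them. The Zig language-reference URL below is an assumption; replace the dict
# with whatever docs you actually want on hand.
import urllib.request
from pathlib import Path

DOCS = {
    "zig": "https://ziglang.org/documentation/master/index.html",
}

def vendor_docs(root: str = "docs") -> None:
    for name, url in DOCS.items():
        target = Path(root) / name
        target.mkdir(parents=True, exist_ok=True)
        with urllib.request.urlopen(url) as resp:
            (target / "index.html").write_bytes(resp.read())

if __name__ == "__main__":
    vendor_docs()  # then: "consult docs/zig/"
```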