
A neat little thing I like about Zig is one of the options for installing it is via PyPI like this: https://pypi.org/project/ziglang/

  pip install ziglang
Which means you don't even have to install it separately to try it out via uvx. If you have uv installed already try this:

  cd /tmp
  echo '#include <stdio.h>
  
  int main() {
      printf("Hello, World!");
      return 0;
  }' > hello.c

  uvx --from ziglang python-zig cc /tmp/hello.c
  ./a.out


For anyone not familiar: You can bundle arbitrary software as Python wheels. Can be convenient in cases like this!
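For anyone curious how that works mechanically: such a wheel is a platform-tagged archive carrying the actual native binary, plus a tiny Python shim that execs it. A rough sketch of the pattern (illustrative only, not ziglang's actual code; `bundled_exe` is a made-up helper name):

```python
import os
import sys

def bundled_exe(pkg_dir: str, name: str) -> str:
    """Path of the native binary shipped inside the wheel's package dir."""
    suffix = ".exe" if os.name == "nt" else ""
    return os.path.join(pkg_dir, name + suffix)

def main() -> None:
    # Replace this Python process with the bundled compiler,
    # forwarding all command-line arguments untouched.
    exe = bundled_exe(os.path.dirname(os.path.abspath(__file__)), "zig")
    os.execv(exe, [exe, *sys.argv[1:]])
```

Exposing `main` as a console script or `python -m` entry point is what makes `pip install` hand you a working compiler.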


What "cases" are those? Tell me one useful and neat case. Why is it useful and neat, you think?


For one example, a number of years back I built a Python package, environment, and version manager. It was written entirely in Rust and distributed as a binary. Since I knew users would likely have pip installed, it gave them an easy way to install it, regardless of OS.

You could go further, as in this case, and use wheels + PyPI for something unrelated to Python.


I believe it is used for cross-platform linking of Rust/maturin wheels, which seems nice because it's one fewer unusual install script to integrate into your project, if Zig isn't packaged for Debian yet.


It's useful as a distro-agnostic distribution method. CMake is also installable like this despite having nothing to do with Python.

Or I should say it *was* useful as a distribution method, because most people already had Python available. Since most distros now don't let you pip-install anything outside a venv (PEP 668), you need uv to install tools (via `uv tool install`), and we're not yet at the point where most people already have uv installed.


Bundling a browser frontend together with your Python application.


uv and ruff use that approach. They are Python-related tools, but written 100% in Rust.


For this sort of stuff I find micromamba / pixi a better way of managing packages, as opposed to the pip / uv family of tools.


Pixi, Conan, or Nix are all better choices than abusing the Python ecosystem to ship arbitrary executables.


It could easily be the case that the zig compiler is useful in some mixed-language project and this is not actually "abuse".


Regular Python bindings / C extensions don’t depend on a PyPI-packaged instance of gcc or llvm though. It’s understood that these things are provided externally by the “system” environment.

I know some of it has already happened with Rust, but perhaps there’s a broader reckoning that needs to occur here wrt standards around how language-specific build and packaging systems handle cross-language projects… which could well point to phasing those out in favour of nix or pixi, which are designed from the get-go to support this use case.


What do those systems do that uv/PyPI doesn't?

Usually, arbitrary binaries stuffed into Python wheels are self-contained single binaries, with as little dynamic-linking nonsense as possible, so they don't break all the time or have dependency conflicts.

It seems to consistently work really well for binaries, although it would be nice to have first-class support for integrating npm packages.


reinventing nix but worse.


Not even close, that's still imperative package management


I wish we had that for Nim too!


try pixi!


That's really cool actually. Now that AI is a little more commonly available for developer tooling, I feel like it's easier than ever to learn any programming language, since you can brain-drain the model.


The standard models are pretty bad at Zig right now, since the language is so new and changes so fast. The entire language spec is available in one HTML file though, so you can have a little better success feeding that in for context.
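One cheap trick for that is flattening the single-file HTML spec into plain text before pasting it into context, so markup doesn't eat tokens. A sketch (the tag-stripping regex is deliberately crude, and `html_to_text` is a made-up name):

```python
import html
import re

def html_to_text(markup: str) -> str:
    """Crudely strip tags and collapse whitespace, e.g. for LLM context."""
    text = re.sub(r"<[^>]+>", " ", markup)    # drop tags
    text = html.unescape(text)                # &amp; -> & etc.
    return re.sub(r"\s+", " ", text).strip()  # collapse whitespace

# usage: text = html_to_text(open("langref.html").read())
```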


> The entire language spec is available in one html file though so you can have a little better success feeding that for context.

This is what I've started doing for every library I use. I go to their GitHub, download their docs, and drop the whole thing into my project. Then whenever the AI gets confused, I say "consult docs/somelib/".


Just use the gh_grep MCP and the model will fetch what it needs if you tell it to; no need to download from GitHub manually like this.


that's what context7 mcp is for!


I, on the other hand, see most languages becoming superfluous as coding agents keep improving.

Over the last year I have been observing how MCP, tools, and agents have reduced the amount of language-specific code we used to write.


That's a nice trick!



