
"The constraint system offered by Guidance is extremely powerful. It can ensure that the output conforms to any context free grammar (so long as the backend LLM has full support for Guidance). More on this below." --from https://github.com/guidance-ai/guidance/

I didn't find anything more on that below. Is there a list of supported LLMs?



Good point re: documentation...

We have support for Hugging Face Transformers, llama.cpp, vLLM, SGLang, and TensorRT-LLM, along with some smaller providers (e.g. mistral.rs). Using any of these libraries as an inference host means you can run an OSS model with the guidance backend for full support. Most open-source models will run on at least one of these backends, with vLLM probably being the most popular hosted solution and Transformers/llama.cpp the most popular options for local models.
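To make "full support" concrete: a backend is fully supported when guidance can steer decoding token by token, masking out any token that would take the partial output outside the grammar. A toy sketch of that idea in plain Python (this is not the actual Guidance implementation; the grammar is simplified to a finite set of allowed strings, and `score_fn` is a hypothetical stand-in for the model's next-token scores):

```python
def constrained_decode(score_fn, vocabulary, allowed_strings):
    """Greedy decoding restricted to a 'grammar', given here as a finite
    set of allowed strings. At each step, only tokens that keep the
    partial output a prefix of some allowed string are considered --
    the same masking idea a CFG-constrained backend applies to logits."""
    out = ""
    while out not in allowed_strings:
        # Mask: keep only tokens consistent with some allowed string.
        candidates = [t for t in vocabulary
                      if any(s.startswith(out + t) for s in allowed_strings)]
        # Greedily pick the highest-scoring surviving token.
        out += max(candidates, key=lambda t: score_fn(out, t))
    return out

# Even if the stand-in "model" only mildly prefers 'y', the result is
# guaranteed to be one of the allowed strings:
score = lambda prefix, tok: {"y": 1.0}.get(tok, 0.5)
print(constrained_decode(score, list("yesno"), {"yes", "no"}))  # -> yes
```

A real CFG backend does the masking incrementally with a parser over the tokenizer's vocabulary at the logits level, which is why it needs first-class support in the inference host.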

We're also the backend used by OpenAI/Azure OpenAI for structured outputs on the closed-source model side.



