If anything this removes a major roadblock for libraries/languages that want to employ LLM calls as a primitive, no? Although, I fear the vendor lock-in intensifies here, also given how restrictive and specific the Chat API is.
Either way, as part of the LMQL team, I am actually pretty excited about this, also with respect to what we want to build going forward. This makes language model programming much easier.
`Although, I fear the vendor lock-in intensifies here, also given how restrictive and specific the Chat API is.`
Eh, would be pretty easy to write a wrapper that takes a functions-like JSON Schema object and interpolates it into a traditional "You MUST return ONLY JSON in the following format:" prompt snippet.
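A minimal sketch of what such a wrapper could look like, assuming a functions-style JSON Schema as input (all helper names here are illustrative, not any real library's API):

```python
import json


def schema_to_prompt(name: str, schema: dict) -> str:
    """Interpolate a functions-style JSON Schema into a plain prompt
    snippet, for providers without a native function-calling API.
    (Hypothetical helper; name and wording are illustrative.)"""
    return (
        f"You MUST return ONLY JSON matching this schema for '{name}':\n"
        + json.dumps(schema, indent=2)
        + "\nDo not include any text outside the JSON object."
    )


def parse_reply(text: str) -> dict:
    """Extract the first JSON object from a model reply, tolerating
    leading/trailing chatter around the braces."""
    start = text.index("{")
    end = text.rindex("}") + 1
    return json.loads(text[start:end])


# Example schema in the same shape as the Chat API's `functions` parameter.
schema = {
    "type": "object",
    "properties": {
        "city": {"type": "string"},
        "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
    },
    "required": ["city"],
}

prompt = schema_to_prompt("get_weather", schema)

# Simulated model output; a real call would go to any chat/completion endpoint.
reply = 'Sure! {"city": "Zurich", "unit": "celsius"}'
args = parse_reply(reply)
print(args["city"])  # → Zurich
```

The brace-scanning parse is deliberately naive; in practice you would want retries or constrained decoding when the model strays from the schema.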