The example I gave was using this as a backend for a chat bot in a private server, and I'm not comfortable sharing the prompt. However, if you look up the leaked Bing prompt, that might give you some ideas for how to prompt an LLM into being a chatbot that can answer coding questions. I've had pretty good results using it as a bot (with some glue code that does fairly vanilla regex-based prompt cleaning, but not too much; it's mostly prompt).
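To give a rough idea of what I mean by "vanilla regex-based cleaning" (this is an illustrative sketch, not my actual glue code; the role labels and patterns are made up for the example):

```javascript
// Hypothetical cleanup pass over a raw model completion.
// Two common failure modes for a completion-style model used as a chatbot:
//  1. it echoes its own role label ("Bot: ...") at the start of the reply
//  2. it keeps going and hallucinates the user's next turn
function cleanReply(raw) {
  return raw
    .replace(/^(Assistant|Bot):\s*/i, "")      // strip a leaked role label
    .replace(/\n(User|Human):[\s\S]*$/i, "")   // cut off a hallucinated next turn
    .trim();
}

console.log(cleanReply("Bot: Sure, here's the fix.\nUser: thanks"));
// prints "Sure, here's the fix."
```

Nothing fancy; the point is just that a couple of regexes on top of a good prompt go a long way.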
If you're not trying to get it to be a chatbot, it's much easier. Here's a prompt that worked for me on the first try in the default mode with 13B q4 on a 1080 Ti:
~~~
Here is a short, clear, well written example of a program that lists the first 10 numbers of the fibonacci sequence, written in javascript:

```js
~~~

and when given that, it finished it with:

```js
function Fib(n) {
  if (n == 0 || n == 1) return 1;
  else return Fib(n - 1) + Fib(n - 2);
}

var i = 0;
while (i < 10) {
  console.log("The number " + i + " is: " + Fib(i));
  i++;
}
```
(I don't work at OpenAI, so take this with a grain of salt.) Yes and no; they are similar. It's basically just a fancy autocomplete like LLaMA, but I believe it's been specifically trained on chat content, or at least fine-tuned on it, and it probably uses a more chat-focused labeling scheme on the training data as well, to help it perform well on that specific task and be conversational.