That was fast, nice work! I posted in your last thread. Now that you support GPT-4 I can probably use this.
Couple Q’s before I set some time aside to try it out:
Do you have a public API if I want to define some custom prompt commands? I’d like to do some simple things, like defining region-specific functions with a custom prompt string. I could hack it up, but it would be helpful to know if there’s an API surface I should use to keep things stable between versions.
Sure. You can treat the functions without double dashes as the public API. If you want something specific, open an issue, or you can use the llm library directly if you want more control.
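For illustration, a minimal sketch of such a region-specific wrapper, assuming the package exposes some non-double-dash command that accepts a prompt string; `mypkg-instruct` below is a placeholder, not the package's actual name:

  ;; Hypothetical example: the package's real entry point may differ.
  ;; Assume a public (non-double-dash) function that takes a prompt string.
  (defun my-explain-region ()
    "Send the active region to the LLM with a custom prompt prefix."
    (interactive)
    (unless (use-region-p)
      (user-error "No active region"))
    (let ((text (buffer-substring-no-properties
                 (region-beginning) (region-end))))
      ;; `mypkg-instruct' stands in for whichever public command
      ;; the package actually provides.
      (mypkg-instruct
       (concat "Explain the following code:\n\n" text))))

The same shape works per region of text or per language: bind each wrapper to a key and vary only the prompt prefix.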