Hi all,
I wanted to share a project I have been working on.
With Instrukt, you can build, customize, debug and instruct AI agents straight from the terminal.
I made a quick demo to showcase the main features here: https://youtu.be/_mkIoqiY0dE
Looking forward to your feedback.
Seems cool. One more step towards a full operating system manager.
I guess from the video that you use the toolformer library (https://github.com/lucidrains/toolformer-pytorch), and it looks like it uses the PaLM model underneath.
How tied is your project/functionality to that specific model? I’m planning to set up a local LLM and would like a framework that can use the model I choose.
I don’t use toolformer; it only appears in the demo. I simply scan the PDF of the paper and generate embeddings, which are used for question answering. The AI logic is built with LangChain.
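Roughly, the PDF question-answering part looks something like this (a minimal sketch using LangChain's classic API; the file name and question are placeholders, and exact import paths depend on your LangChain version):

```python
from langchain.document_loaders import PyPDFLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import FAISS
from langchain.llms import OpenAI
from langchain.chains import RetrievalQA

# Load the paper and split it into chunks small enough to embed.
docs = PyPDFLoader("toolformer.pdf").load()  # placeholder path
chunks = RecursiveCharacterTextSplitter(
    chunk_size=1000, chunk_overlap=100
).split_documents(docs)

# Embed the chunks and build a vector index for retrieval.
index = FAISS.from_documents(chunks, OpenAIEmbeddings())

# Answer questions by retrieving relevant chunks and passing them to the LLM.
qa = RetrievalQA.from_chain_type(llm=OpenAI(), retriever=index.as_retriever())
print(qa.run("What does the Toolformer paper propose?"))
```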
It currently uses OpenAI models, but my next priority is supporting local LLMs, which should be straightforward with LangChain.
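Continuing the sketch above, swapping in a local model should mostly come down to changing the `llm` argument, for example with LangChain's LlamaCpp wrapper (untested sketch; the model path is a placeholder for whatever local weights you run):

```python
from langchain.llms import LlamaCpp

# Local llama.cpp-backed model instead of the OpenAI LLM.
local_llm = LlamaCpp(model_path="/path/to/local-model.gguf", n_ctx=2048)

# Reuse the same retriever built from the PDF embeddings.
qa = RetrievalQA.from_chain_type(llm=local_llm, retriever=index.as_retriever())
```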
Cool. I’ve tabbed your project for later. Thanks for sharing…
You’re welcome.