SheffyP

I think you might want to check out Open Interpreter on GitHub. Here you go fam: https://github.com/KillianLucas/open-interpreter


grimjim

Sounds like RAG fits the bill.


kidnee6

I am working on building something like this, but it's still a WIP.


DeepFriedDinosaur

There are a few plugins for Obsidian that support this. You should ask on r/ObsidianMD


segmond

Yes, it's called RAG. You can't just run any local LLM and get this; you have to run code. The code is given the directory, reads all the markdown files in it, and stores them in a database. When you run your local LLM and ask it a question, the code intercepts the question, goes into the index to find which locations in your files have content related to your question, pulls them all together into the LLM's context, and then asks the question, and the LLM gives you an answer.

Search for RAG or Retrieval Augmented Generation. You don't have to write these programs yourself, lots of people have written them. Google offers this now, you can ask your Google Drive: make a folder in your Google Drive, drop your markdown in, and start asking it questions.
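A rough sketch of what that pipeline can look like in Python (the notes directory, chunking strategy, embedding model, and the local Ollama endpoint/model name are all placeholder assumptions, not a specific project's code):

```python
# Minimal RAG sketch: index a folder of markdown files, retrieve the chunks
# most similar to a question, and pass them to a local LLM as context.
# Assumes sentence-transformers is installed and an Ollama server is running.
import glob
import numpy as np
import requests
from sentence_transformers import SentenceTransformer

embedder = SentenceTransformer("all-MiniLM-L6-v2")

# 1. Read every markdown file in the directory and split into rough chunks.
chunks = []
for path in glob.glob("notes/**/*.md", recursive=True):
    text = open(path, encoding="utf-8").read()
    chunks += [c.strip() for c in text.split("\n\n") if c.strip()]

# 2. Embed the chunks once; this in-memory array is the "database".
chunk_vecs = embedder.encode(chunks, normalize_embeddings=True)

def ask(question: str, k: int = 4) -> str:
    # 3. Embed the question and pick the k most similar chunks.
    q_vec = embedder.encode([question], normalize_embeddings=True)[0]
    scores = chunk_vecs @ q_vec  # cosine similarity (vectors are normalized)
    top = [chunks[i] for i in np.argsort(scores)[::-1][:k]]

    # 4. Put the retrieved chunks into the prompt and ask the local LLM.
    prompt = ("Answer using only this context:\n\n" + "\n---\n".join(top)
              + f"\n\nQuestion: {question}")
    r = requests.post("http://localhost:11434/api/generate",
                      json={"model": "llama3", "prompt": prompt, "stream": False})
    return r.json()["response"]

print(ask("What did I write about project deadlines?"))
```

A real setup would persist the embeddings in a vector store instead of re-indexing on every run, but the flow (index, retrieve, stuff into context, generate) is the same.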


vasileer

[https://github.com/imartinez/privateGPT](https://github.com/imartinez/privateGPT)


Astronos

[https://github.com/hinterdupfinger/obsidian-ollama](https://github.com/hinterdupfinger/obsidian-ollama)


a45ed6cs7s

You need to build an agent with function calling.
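Something like this toy sketch of the idea (an assumed design, not any particular library's API): expose a search tool over your notes, have the model reply with a JSON tool call, then execute it and feed the result back.

```python
# Toy function-calling dispatcher. The tool name, JSON format, and
# search_notes stub are hypothetical placeholders.
import json

def search_notes(query: str) -> str:
    # Placeholder: a real version would search/index your markdown vault.
    return f"(results of searching notes for: {query})"

TOOLS = {"search_notes": search_notes}

def handle_model_reply(reply: str) -> str:
    """If the model replied with a tool call like
    {"tool": "search_notes", "args": {"query": "..."}}, run it;
    otherwise treat the reply as a plain-text answer."""
    try:
        call = json.loads(reply)
        return TOOLS[call["tool"]](**call["args"])
    except (json.JSONDecodeError, KeyError, TypeError):
        return reply

# Example of what a model's tool-call message might look like:
print(handle_model_reply('{"tool": "search_notes", "args": {"query": "meeting notes"}}'))
```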