  • I have used this small R package that reads the text content of a PDF and sends it to a local llama model via Ollama, or to one of the large LLM APIs. With that I could get structured answers in JSON format for a whole folder of papers, but the context length of a typical model is only long enough to hold a single (roughly 40-page) paper in memory. So I had to get a separate structured answer for each paper and then generate a complete summary from those. Unfortunately that is not user-friendly yet. A rough sketch of the workflow is below.
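
A minimal sketch of what that per-paper loop can look like, assuming a local Ollama server on its default port. The model name, prompt wording, and JSON fields are illustrative assumptions, not the actual package's interface.

```r
library(pdftools)
library(httr)
library(jsonlite)

extract_paper_info <- function(pdf_path, model = "llama3") {
  # Read the full paper text; pdf_text() returns one string per page.
  text <- paste(pdf_text(pdf_path), collapse = "\n")
  # Illustrative fields; the real prompt would be tailored to the papers.
  prompt <- paste0(
    "Answer in JSON with the fields \"title\", \"method\" and \"main_finding\".\n\n",
    text
  )
  # Ollama's REST API; format = "json" asks the model for valid JSON output.
  res <- POST(
    "http://localhost:11434/api/generate",
    body = toJSON(list(model = model, prompt = prompt,
                       stream = FALSE, format = "json"),
                  auto_unbox = TRUE),
    content_type_json()
  )
  # The API reply wraps the model's JSON answer in a "response" field.
  fromJSON(fromJSON(content(res, as = "text", encoding = "UTF-8"))$response)
}

# One structured answer per paper (each fits the context window on its own);
# a complete summary is then generated from the collected answers.
papers  <- list.files("papers", pattern = "\\.pdf$", full.names = TRUE)
answers <- lapply(papers, extract_paper_info)
```
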

  • I work as a research economist, use half a model zoo and several APIs regularly, and have even written a small R package for working with LLM APIs. I use ChatGPT Pro via the web interface for programming questions (helping me write or document my R code; for example, I have used it to easily translate tax laws into R functions for microsimulations [of course I double-check them]). I use Anthropic’s or Groq’s APIs to process large amounts of documents fast (for example, creating JSON lists about papers to make them more easily searchable). I have one small script with many useful prompts that really helps me rephrase texts (e.g. “Please rephrase this paragraph to be more clear and concise. Give me {n} versions. {paragraph}”), which I have included in my browser; a sketch of such a helper is below. I also used Stable Diffusion to generate images for my Christmas cards.
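
A hypothetical version of such a templated prompt helper, here wired to Groq's OpenAI-compatible chat endpoint. The function name, the model id, and the `GROQ_API_KEY` environment variable are assumptions, not taken from the original script.

```r
library(httr)
library(jsonlite)
library(glue)

rephrase <- function(paragraph, n = 3, model = "llama-3.1-8b-instant") {
  # glue() interpolates {n} and {paragraph}, mirroring the templated prompt.
  prompt <- glue("Please rephrase this paragraph to be more clear and concise. ",
                 "Give me {n} versions.\n\n{paragraph}")
  res <- POST(
    "https://api.groq.com/openai/v1/chat/completions",
    add_headers(Authorization = paste("Bearer", Sys.getenv("GROQ_API_KEY"))),
    body = toJSON(list(model = model,
                       messages = list(list(role = "user",
                                            content = as.character(prompt)))),
                  auto_unbox = TRUE),
    content_type_json()
  )
  # Return the first completion's text.
  content(res)$choices[[1]]$message$content
}

cat(rephrase("The results, which were obtained by us, show an effect.", n = 2))
```
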