Hi @anonuser
Thank you for reaching out and using Phidata! I’ve tagged the relevant engineers to assist you with your query. We aim to respond within 24 hours.
If this is urgent, please feel free to let us know, and we’ll do our best to prioritize it.
Thanks for your patience!
Thanks for the reply @manthanguptaa.
Workflows seem to be a good fit where the use case is static and orchestratable, but I am interested in a more dynamic setup.
Let me elaborate. Imagine a case where a user asks the LLM a question and optionally attaches a bunch of files. The LLM has access to a sandboxed code interpreter (some VM/container) where it generates code and runs it in the sandbox to solve the user's problem. For example: "Looking at the Excel files I uploaded, generate an executive BI report." RAG wouldn't work here because the files are structured and large, all of them need to be considered when generating the output, and chunks are not enough. Azure achieved this with the Azure OpenAI Assistants Code Interpreter.
I would like to know if this is already part of phidata or on the roadmap.
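To make the shape of what I'm imagining concrete, here is a rough sketch using pieces I found in the phidata cookbook (treat it as pseudocode; I may be misreading the API, and PythonTools appears to execute code locally rather than in an isolated VM/container, which is exactly the missing piece I'm asking about):

```python
# Rough sketch only: an assistant that writes and runs code over uploaded files.
from phi.assistant import Assistant
from phi.llm.openai import OpenAIChat
from phi.tools.python import PythonTools

assistant = Assistant(
    llm=OpenAIChat(model="gpt-4o"),
    # PythonTools lets the LLM generate and execute Python, but as far as I can
    # tell it runs locally, not inside a sandboxed VM/container.
    tools=[PythonTools()],
    show_tool_calls=True,
)

# The user's uploaded Excel files would sit in a working directory that the
# generated code can read in full, instead of being chunked for RAG.
assistant.print_response(
    "Read sales_q1.xlsx and sales_q2.xlsx from the current directory "
    "and generate an executive BI report."
)
```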
The other question was: "Does any chat completion API work with phidata, or does it have to be the special Assistants API?"
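To clarify what I mean by "any chat completion API": any backend that speaks the standard /v1/chat/completions protocol, e.g. a local server reached through the OpenAI client's `base_url` (the endpoint and model name below are hypothetical, just to pin down the question):

```python
# Illustration of the kind of endpoint I mean, not a phidata example.
from openai import OpenAI

# Hypothetical OpenAI-compatible server (vLLM, Ollama, LM Studio, etc.)
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="my-local-model",  # placeholder model name
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```

My question is whether phidata can drive a model behind an endpoint like this, or whether it requires the Assistants API specifically.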