Description
After implementing a large language model (LLM) function-call feature with instructor-ai/instructor as a dependency, I observed limits in how flexibly the LLM interprets inputs. Specifically, with irregular or exceptional inputs, the robustness of the LLM dropped significantly compared to well-formed natural-language inputs. This forced me to supplement the LLM with additional mechanisms, such as string-similarity checks and correction dictionaries, to filter and preprocess inputs effectively.
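To give a sense of what I mean, here is a sketch of the kind of pre/post-processing I had to add around the model. The action vocabulary, correction dictionary, threshold, and helper name are hypothetical examples of my own, not part of instructor or BAML.

```python
# Illustrative sketch of supplementary input normalization around an LLM
# function call; the vocabulary, corrections, and threshold are hypothetical.
import difflib

CANONICAL_ACTIONS = ["create_order", "cancel_order", "query_status"]
CORRECTIONS = {"make order": "create_order", "check state": "query_status"}

def normalize_action(raw: str, cutoff: float = 0.75) -> str | None:
    """Map an irregular model output onto a known action, or reject it."""
    value = raw.strip().lower()
    if value in CANONICAL_ACTIONS:   # already well-formed
        return value
    if value in CORRECTIONS:         # known misspelling / paraphrase
        return CORRECTIONS[value]
    # Fuzzy fallback via string similarity.
    matches = difflib.get_close_matches(value, CANONICAL_ACTIONS, n=1, cutoff=cutoff)
    return matches[0] if matches else None

print(normalize_action("create  order"))  # -> "create_order"
print(normalize_action("delete user"))    # -> None (rejected, needs clarification)
```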
This experience highlighted the value of the work done by BoundaryML/BAML. However, I found BAML lacking in the following areas:
Limited API Control: The lack of public APIs prevents fine-grained control, making it challenging to adapt BAML to specific requirements.
Framework Incompatibility: BAML does not support custom frameworks for completion, agent, or dialogue management, limiting its flexibility for tailored use cases.
Suggestions
To enhance BAML’s usability and adoption, I suggest the following improvements:
Provide a Comprehensive API: Introduce APIs to enable precise and fine-grained control over the core functionalities. This would make it easier to integrate BAML into custom workflows and frameworks.
Improve Framework Compatibility: Enhance BAML to support custom implementations of completion, agent, and dialogue management. This would allow developers to adapt BAML to their own environments and requirements; a rough sketch of the plug points I have in mind follows below.
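To make these suggestions concrete, the sketch below shows the granularity of control I am asking for. Nothing in it exists in BAML today as far as I know; the DialogueManager protocol, run_turn helper, and the three callables are purely hypothetical, with the caller's own completion client and dialogue policy in the middle and BAML supplying prompt rendering and parsing at the edges.

```python
# Hypothetical plug points illustrating the request; none of these names are
# real BAML APIs. BAML would supply `render` (from a prompt declaration) and
# `parse` (structured-output parsing); the caller supplies completion, agent,
# and dialogue management.
from typing import Callable, Protocol


class DialogueManager(Protocol):
    """User-supplied policy deciding which history feeds the next turn."""
    def build_context(self, history: list[dict]) -> dict: ...


def run_turn(
    history: list[dict],
    dialogue: DialogueManager,
    render: Callable[[dict], list[dict]],    # BAML: args -> chat messages
    complete: Callable[[list[dict]], str],   # caller's own LLM client
    parse: Callable[[str], dict],            # BAML: raw text -> typed result
) -> dict:
    args = dialogue.build_context(history)   # custom dialogue management
    messages = render(args)                  # prompt built from the .baml declaration
    raw = complete(messages)                 # completion handled by the caller's stack
    return parse(raw)                        # structured output via BAML's parser
```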
Motivation
With these improvements, BAML could serve as a more robust and flexible solution for managing LLM interactions, especially in scenarios that require handling edge cases or integrating with custom frameworks. I would be eager to try BAML again if these enhancements are implemented.
We'd like to use specific components of BAML - specifically the Prompt Declaration (.baml files, no clients) and Schema-Aligned Parsing (SAP) features - and integrate them into our existing pipeline. We don't need the full client implementation since we already have one. Could you help us:
Construct agent prompts by passing arguments
Access the parsed output
If there's a way to use these features independently with OpenAI's API or other APIs, please let us know. We already have a client implementation and only need these specific components.
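For clarity, this is the shape of the integration we have in mind. The OpenAI usage stands in for our existing pipeline; render_messages and parse_into_type are placeholders for whatever BAML entry points (if any) expose prompt rendering and SAP without the bundled client. They are assumptions on our side, not documented BAML APIs.

```python
# Sketch of the intended integration. The OpenAI usage reflects our existing
# client; `render_messages` and `parse_into_type` are placeholders for the BAML
# entry points we are asking about, not documented BAML APIs.
from openai import OpenAI

from our_baml_glue import render_messages, parse_into_type  # hypothetical module

client = OpenAI()  # our existing client implementation stands in here


def extract_invoice(document_text: str):
    # 1. Build the agent prompt from a .baml prompt declaration by passing arguments.
    messages = render_messages("ExtractInvoice", {"document": document_text})
    # 2. Send it through our own pipeline (routing, retries, logging already exist).
    resp = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    # 3. Recover the typed result by parsing the raw completion (the SAP step).
    return parse_into_type("ExtractInvoice", resp.choices[0].message.content)
```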
Thank you for considering these suggestions!