Dynamic LLM Model Switching
Feature Description
This feature lets you choose between the Lumenore Query Agent and Auto in the LLM selection dropdown, or pick a specific AI model such as ChatGPT 4.1, 4.1 Mini, 4o, 4o Mini, and more. This gives you the flexibility to run queries through the Lumenore proprietary query engine, let the system auto-select the best LLM to enhance query accuracy, or pin a preferred model for a more tailored experience.
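The selection flow described above can be pictured as a small routing function. The sketch below is purely illustrative: names like select_model and MODEL_CATALOG, and the auto-selection heuristic, are assumptions for explanation and are not Lumenore's actual API or routing logic.

```python
# Hypothetical sketch of the dropdown-to-engine routing described above.
# All identifiers and the heuristic are illustrative assumptions.

MODEL_CATALOG = ["gpt-4.1", "gpt-4.1-mini", "gpt-4o", "gpt-4o-mini"]

def select_model(choice: str, query: str) -> str:
    """Resolve the user's dropdown selection to the engine that runs the query."""
    if choice == "lumenore-query-agent":
        # Route to the proprietary (non-GPT) query engine.
        return "lumenore-query-agent"
    if choice == "auto":
        # Illustrative heuristic only: send longer, more complex queries
        # to a larger model and short routine ones to a lighter model.
        return "gpt-4.1" if len(query.split()) > 20 else "gpt-4o-mini"
    if choice in MODEL_CATALOG:
        # User pinned a specific model for consistent responses.
        return choice
    raise ValueError(f"Unknown model choice: {choice}")
```

In practice the Auto path would use far richer signals than query length (intent, schema complexity, cost budgets); the point of the sketch is only that every query resolves to exactly one engine, whether user-pinned or system-selected.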
End User Business Benefits
Flexibility of Choice
Users decide whether they want speed (via Lumenore Query Agent) or depth (via GPT-based AI), depending on the complexity of their query.
Accuracy in Responses
Auto-selection ensures the system intelligently matches the query type with the best AI model, reducing errors and improving confidence in results.
Cost Optimization
By routing routine queries to the internal Lumenore Query Agent instead of an advanced GPT model, organizations reduce unnecessary consumption of AI resources and save costs.
Tailored User Experience
Business users can stick with their preferred AI model for consistent responses aligned with their needs.
Scalability Across Teams
Different departments can adopt different query modes (basic vs. advanced) without disrupting workflows, making the feature enterprise-ready.
Enhanced Adoption & Trust
The ability to transparently choose or auto-select models increases user trust, as they see the platform aligning with their exact use case and performance expectations.
Explore the full capabilities of this feature.