Custodian AI+
Build your AI solutions from the ground up
In this world of shiny new AI, it is important to remember that successful legal AI solutions need to be built from the ground up. Well-managed master data, metadata and grounding data are the foundation for well-performing AI apps, and the Custodian AI+ solution can tie your AI/LLM of choice to relevant data across the firm’s entire tech stack.
Data-driven AI for law firms and other PSOs
Custodian AI+ utilises PSA’s unique position as ‘custodians’ of our clients’ enterprise data, helping connect generative AI solutions to the relevant grounding data stored in systems and services across the firm’s existing tech stack.
The cutting-edge solution, built on Microsoft Azure OpenAI, is designed to support different end-user tools, including a native Custodian AI+ UI, ChatGPT, relevant Microsoft Copilots and iManage Insight+ (future).
See Custodian AI+ in action on Vimeo.
Retrieval-augmented generation (RAG)
Custodian AI+ uses a retrieval-augmented generation (RAG) model, which combines the power of leading Large Language Models (LLMs) with selected, curated firm data stored in, for example, the document management system and the firm’s other key data sources (ERP, CRM, HR etc.).
Retrieval-augmented generation (RAG) is a technique for enhancing the accuracy and reliability of generative AI models with facts fetched from external sources.
In other words, it fills a gap in how LLMs work. Under the hood, LLMs are neural networks, typically measured by how many parameters they contain. An LLM’s parameters essentially represent the general patterns of how humans use words to form sentences.
That deep understanding, sometimes called parameterized knowledge, makes LLMs useful in responding to general prompts at light speed. However, it does not serve users who want a deeper dive into a current or more specific topic.
Retrieval-augmented generation gives models sources they can cite, so users can check any claims. That builds trust.
What’s more, the technique can help models clear up ambiguity in a user query. It also reduces the possibility a model will make a wrong guess, a phenomenon sometimes called hallucination.
Another great advantage of RAG is that it’s faster and less expensive than retraining a model with additional datasets. And it lets users hot-swap new sources on the fly.
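For readers who want a feel for the mechanics, the sketch below illustrates the RAG pattern in outline: documents are retrieved by similarity to the question, injected into the prompt as grounding context, and only then does the LLM generate its answer. This is an illustrative sketch, not Custodian AI+ code; the embed and generate callables are hypothetical placeholders for whatever embedding model and LLM endpoint (for example, an Azure OpenAI deployment) a firm chooses.

```python
# Minimal sketch of retrieval-augmented generation (RAG).
# embed() and generate() are hypothetical placeholders for an embedding
# model and an LLM endpoint; swap in your provider's SDK calls.

from typing import Callable


def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(y * y for y in b) ** 0.5
    return dot / (norm_a * norm_b)


def answer(question: str,
           documents: list[str],
           embed: Callable[[str], list[float]],
           generate: Callable[[str], str],
           top_k: int = 3) -> str:
    # 1. Retrieve: rank the firm's documents by similarity to the question.
    q_vec = embed(question)
    ranked = sorted(documents, key=lambda d: cosine(embed(d), q_vec), reverse=True)
    context = "\n\n".join(ranked[:top_k])

    # 2. Augment: ground the prompt in the retrieved passages so the model
    #    can cite them instead of relying only on its parameterized knowledge.
    prompt = (
        "Answer using only the context below and cite the passages you rely on.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )

    # 3. Generate: the LLM produces an answer grounded in the retrieved sources.
    return generate(prompt)
```

Note that swapping the documents list for a different corpus changes what the model can draw on without retraining it, which is the "hot-swap new sources on the fly" advantage described above.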
Key Benefits
Industry Optimised
Tailored to the specific needs of law firms and other PSOs, with standard Custodian connectors to the systems and services used in those organisations.
Innovative & Secure
Rely on Microsoft’s innovation and AI roadmap, as well as its security and performance, to make your AI solution future-proof.
Easy to deploy
Our Custodian standard connectors make it easy to connect to iManage, Deltek Maconomy and all your firm’s other key data sources.
Build Competitive Advantage
The flexibility of Custodian AI+ means it can be configured to support your firm’s specific use cases, building competitive advantage.