Creating value through Large Language Models
Exploring JMAN’s capabilities and recent successes in deploying LLM-based solutions
Key Benefits
10 days saved
15 days to develop
50% reduction in pipeline execution time
Introduction
Gen AI is a hot topic in every boardroom, office kitchen and around every water cooler at the moment. While many are still cautious about its implementation, JMAN has been expanding its capabilities in this field, using Large Language Models as the basis for Generative AI approaches that solve problems and create value.
As many of you will know, Large Language Models (LLMs) are trained to recognise, understand and generate human language in response to a user’s prompts. Already this technology has been deployed across a range of use cases.
Here are some of our recent success stories for LLM-based solutions.
Case Studies
Streamlining Deal Origination for a UK-Based Mid-Market PE Fund
JMAN’s advanced LLM framework streamlined sector categorisation for 2100+ Software & Data companies, enhancing data accuracy and investment efficiency for a growth-focused PE client:
Context/Challenge
The fund faced significant challenges stemming from data quality issues within the source data (originating from Salesforce), which led to companies being misclassified within their respective business sectors. This inaccuracy posed a major hurdle, complicating the reliable assessment of businesses for investment purposes.
Solution
First, we employed a keyword extraction model to gather content from company websites, facilitated by web scraping tools and bots. This served as the foundational dataset for the sector categorisation process. Next, a tailored LLM framework (a semantic search model) was developed and implemented to process the foundational data. This framework leveraged vector embeddings to categorise companies into their respective sectors with a high level of accuracy and speed, thereby streamlining the process through technical automation; a minimal sketch of the matching step is shown below.
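The following is a minimal, illustrative sketch of embedding-based sector categorisation, assuming the open-source sentence-transformers library; the model name, sector list and sample text are placeholders rather than the exact framework used on the engagement.

```python
# Illustrative sketch: assign scraped company descriptions to sectors by
# embedding both and picking the sector with the highest cosine similarity.
from sentence_transformers import SentenceTransformer, util

SECTORS = ["Vertical SaaS", "Data & Analytics", "Cybersecurity", "FinTech"]  # placeholder taxonomy

model = SentenceTransformer("all-MiniLM-L6-v2")  # any pre-trained embedding model
sector_vecs = model.encode(SECTORS, convert_to_tensor=True)

def categorise(company_text: str) -> str:
    """Return the sector whose embedding is closest to the company's website text."""
    company_vec = model.encode(company_text, convert_to_tensor=True)
    scores = util.cos_sim(company_vec, sector_vecs)[0]  # similarity to each sector
    return SECTORS[int(scores.argmax())]

print(categorise("Cloud platform for multi-currency payment reconciliation"))
```

In practice the scraped website content would be cleaned and chunked first, and borderline scores flagged for manual review, but the core matching step reduces to a vector similarity lookup as above.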
Value Creation
The impact of this solution was twofold.
Firstly, it significantly reduced the time required to complete the categorisation and removed the need for lengthy manual mapping. This task would have taken the client around 25 days to complete manually, whereas it took only 15 days to develop, train and deploy the LLM solution, which can also be applied to similar tasks in the future. Additionally, once established and trained, the model minimises the need for further manual intervention.
Furthermore, the use of the LLM solution has contributed to significant improvements in data quality and accuracy at source, resulting in increased reliability of the data for the investment process.
Data Quality Enhancements for a PE-Backed Payments Business
JMAN developed and established an LLM-based model to enhance data quality for 2500+ company records, improving data reliability for a leading provider of international and multi-currency payment services.
Context/Challenge
The client was grappling with poor data quality in their Salesforce database, making it challenging to reconcile revenue figures between the source data and management accounts. The data quality problems were numerous: spelling errors, extraneous whitespace, and a significant number of duplicate entries.
Solution
The solution used a pre-trained language model to rectify and standardise the source data, applying paraphrase mining to match customer names and addresses and produce a single-source-of-truth customer list. The standardised and cleaned output of the model was then reintroduced into the pipeline with minimal manual intervention. This contrasts with alternative methods that would necessitate the development of labour-intensive Python scripts or the use of Alteryx fuzzy matching, both of which can be particularly challenging when dealing with substantial volumes of data. A minimal sketch of the matching step is shown below.
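The following is a minimal, illustrative sketch of paraphrase mining for duplicate detection, assuming the open-source sentence-transformers library; the sample records and similarity threshold are placeholders, not the client’s data or production settings.

```python
# Illustrative sketch: use paraphrase mining to surface near-duplicate
# customer records that differ only in spelling, abbreviation or whitespace.
from sentence_transformers import SentenceTransformer, util

records = [
    "Acme Payments Ltd, 1 King St, London",
    "ACME  Payments Limited, 1 King Street, London",
    "Globex FX Services, 22 Rue de Rivoli, Paris",
]

model = SentenceTransformer("all-MiniLM-L6-v2")

# paraphrase_mining returns (score, i, j) triples for the most similar pairs.
pairs = util.paraphrase_mining(model, records)

# Pairs above a similarity threshold are candidates to merge into one customer.
for score, i, j in pairs:
    if score > 0.85:  # threshold chosen for illustration
        print(f"Likely duplicate ({score:.2f}): {records[i]!r} <-> {records[j]!r}")
```

Matched groups can then be collapsed into a single canonical record before the cleaned list is written back into the pipeline.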
Value Creation
The LLM-based model helped reduce pipeline execution time by 50%, delivering significant cost savings. It also enhanced data accuracy at the source, thereby improving data quality and leading to a more reliable decision-making process for the client. Additionally, the model now runs for the client with minimal manual intervention, ensuring high-quality data standardisation within the source system.
External Knowledge Management for a Global Buyout PE Fund
Building on an established relationship with a Global Private Equity Fund, JMAN developed a solution to enhance the client’s implementation of Azure OpenAI, improving the efficiency and privacy of information sharing.
Context
The existing system returned customer insights from Azure Blob Storage, Azure Cognitive Search indexes and local files based on user prompts. However, it lacked any user-level segregation of documents, diminishing the quality of outputs by creating ambiguous results.
Solution
In response to this challenge, the team engineered a custom data pipeline which connected SharePoint with Azure OpenAI, via Azure Blob Storage, providing uniquely segmented document access for user groups. To ensure information security, the team connected data stored in SharePoint, which determined folder and file access restrictions, to Azure OpenAI using REST APIs. This solution leveraged SharePoint folder access indexing to isolate available data sources on a fully bespoke basis, as sketched below. Moreover, using Terraform for Infrastructure as Code (IaC) enabled the team to scale and redeploy the solution across multiple offices and teams.
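The following is a minimal, illustrative sketch of per-user document segregation, assuming a Microsoft Graph call to list the folders a signed-in user can access and an Azure AI Search index with a hypothetical folder_path field; the endpoints, field names and placeholder values are assumptions rather than the exact pipeline built for the client.

```python
# Illustrative sketch: restrict search results to folders the user can access.
import requests
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

GRAPH = "https://graph.microsoft.com/v1.0"

def accessible_folders(drive_id: str, user_token: str) -> list[str]:
    """List top-level SharePoint folders visible to the signed-in user."""
    resp = requests.get(
        f"{GRAPH}/drives/{drive_id}/root/children",
        headers={"Authorization": f"Bearer {user_token}"},
        timeout=30,
    )
    resp.raise_for_status()
    return [item["name"] for item in resp.json().get("value", []) if "folder" in item]

def search_for_user(query: str, folders: list[str]) -> list[dict]:
    """Query the search index, filtered to the user's permitted folders."""
    client = SearchClient(
        endpoint="https://<search-service>.search.windows.net",  # placeholder
        index_name="documents",                                   # placeholder index
        credential=AzureKeyCredential("<api-key>"),               # placeholder key
    )
    folder_filter = " or ".join(f"folder_path eq '{f}'" for f in folders)
    return list(client.search(search_text=query, filter=folder_filter))
```

The filtered results are then passed to Azure OpenAI as retrieval context, so each user only receives answers grounded in documents they are entitled to see.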
Value Creation
This solution created significant value for the client by providing bespoke insights based on a user’s specific folder access. The approach also streamlined the document upload process by enabling a live sync of files between SharePoint and the data source. Ultimately, this increased operational productivity by enabling efficient knowledge sharing between offices and quick deployment of bespoke insights. Furthermore, the IaC templates can readily be adapted and redeployed for other interested portfolio companies.
JMAN Internal Knowledge Management
In a recent research and development project, JMAN utilised a large language model as part of a broader tech stack to create a chatbot designed to improve internal knowledge sharing.
Context
JMAN recently embarked on a fortnight-long research and development project to create a chatbot for internal use to aid knowledge sharing between projects and across the business.
Solution
This involved developing an easy-to-use application built on Chainlit, an open-source Python package, which provides the interface for receiving user prompts and questions. These queries are fed through a Python script which connects the interface to the LLM. The script converts the words in each query into vector representations; through a process of ‘embedding’, each vector is assigned semantic information that locates it in a multi-dimensional space, similar to how coordinates identify a location on a map. The query vector is then matched against a vector database, which returns the stored vectors with the highest similarity, and the associated content is fed back into the LLM to generate a response for the chatbot user.
For JMAN’s chatbot, an externally sourced LLM, orchestrated through the open-source LangChain framework, was used for text processing, as this was deemed more cost- and time-effective for the two-week project; a minimal sketch of the flow is shown below. The R&D is still underway as the team looks to hone the model with increasing volumes of relevant data to extract improved insights.
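The following is a minimal, illustrative sketch of that retrieval flow, assuming Chainlit together with LangChain’s OpenAI integrations and a FAISS vector store; the sample documents, model name and package layout (which follows recent LangChain releases) are assumptions rather than JMAN’s actual stack.

```python
# Illustrative sketch: a Chainlit chatbot that retrieves similar documents
# from a vector store and asks an LLM to answer using that context.
import chainlit as cl
from langchain_community.vectorstores import FAISS
from langchain_openai import ChatOpenAI, OpenAIEmbeddings

# Embed internal case-study snippets and index their vectors for similarity search.
docs = [
    "Case study: sector categorisation for a mid-market PE fund using semantic search.",
    "Case study: data quality remediation for a payments business using paraphrase mining.",
]
vector_store = FAISS.from_texts(docs, OpenAIEmbeddings())
llm = ChatOpenAI(model="gpt-4o-mini")  # model choice is illustrative

@cl.on_message
async def respond(message: cl.Message):
    # Embed the user's question and retrieve the most similar stored documents.
    context = vector_store.similarity_search(message.content, k=2)
    prompt = (
        "Answer the question using this context:\n"
        + "\n".join(d.page_content for d in context)
        + f"\n\nQuestion: {message.content}"
    )
    # Feed the retrieved context back through the LLM to generate the reply.
    answer = await llm.ainvoke(prompt)
    await cl.Message(content=answer.content).send()
```

Run with `chainlit run app.py`; Chainlit serves the chat interface while the script handles embedding, retrieval and response generation.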
Value Creation
This model drives value for JMAN by allowing meaningful insights to be extracted from existing case studies without going directly to the consultants involved in the project. By using this tool, knowledge-sharing conversations within JMAN are made more meaningful and time-efficient.
Summary
Much has been written about the potential of LLMs to revolutionise the workplace, and many companies are beginning to step away from speculation and act on the opportunity. By combining our leading data and tech expertise with pragmatic commercial acumen, JMAN is implementing LLMs to deliver tangible value for ourselves and our clients. Please reach out to the team if you would like to explore how JMAN can help your business leverage the potential of LLMs.