In the rapidly evolving world of work, continuous learning and skill development are more important than ever. But how can we identify our skill gaps and find the right learning resources to fill them? Enter the power of large language models. This blog showcases a prototype LLM system designed to do just that.
This LLM system is not just a tool; it's a friendly guide, a mentor if you will, that assists our employees in identifying their skill gaps and creating personalized learning paths. It's like having a personal career coach, available 24/7, ready to provide constructive feedback and growth opportunities tailored specifically to each individual's needs.
Before we delve deeper, let's highlight a key benefit that sets it apart.
It is designed for internal use, making it an ideal low-risk first implementation of LLMs for a business. It doesn't involve any customer or financial data, focusing solely on employee skill development.
This is a significant advantage for those concerned about data security: the system uses a GPT-4 model deployed on Azure OpenAI within your own Azure subscription, so it operates within the confines of the organisation's internal network, keeping data secure and private.
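As a rough illustration, pointing the openai Python SDK at a GPT-4 deployment in your own Azure subscription takes only a few lines. The endpoint, environment variable names and deployment name below are placeholders rather than details of the prototype itself:

```python
import os
from openai import AzureOpenAI

# Client pointing at a GPT-4 deployment inside your own Azure subscription.
# Endpoint, API version and deployment name are illustrative placeholders.
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # e.g. https://<your-resource>.openai.azure.com
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

GPT4_DEPLOYMENT = "gpt-4"  # the name you gave the deployment in Azure OpenAI Studio
```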
Now let's take a closer look at how it works. Imagine an employee, let's call her Jane. Jane interacts with the LLM, expressing her desire to improve her skills. The LLM, using prebuilt functions that return all the available context about Jane, understands her current skill set, identifies gaps, and suggests a learning path. It considers Jane's role, aspirations, available time and the skills she needs when suggesting this path. It might recommend additional training or certifications that align with Jane's career goals and the company's needs.
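Here is a minimal sketch of what those prebuilt context functions might look like. The function names, the employee record and the hard-coded values are purely illustrative; in a real deployment they would query your HR and learning-management systems:

```python
from dataclasses import dataclass, field

@dataclass
class EmployeeContext:
    name: str
    role: str
    current_skills: list[str]
    aspirations: str
    weekly_learning_hours: int
    history: list[str] = field(default_factory=list)

def get_employee_context(employee_id: str) -> EmployeeContext:
    """Hypothetical lookup against internal HR / LMS systems."""
    return EmployeeContext(
        name="Jane",
        role="Data Engineer",
        current_skills=["Python", "SQL"],
        aspirations="Move towards a platform architect role",
        weekly_learning_hours=3,
        history=["Completed the 'Azure Fundamentals' learning path"],
    )

def get_internal_resources(skills: list[str]) -> list[str]:
    """Hypothetical search over the company's own training material first."""
    return ["https://intranet.example.com/learning/databricks-101"]
```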
But it doesn't stop there: it goes a step further by providing Jane with links to resources suited to her level. These resources come from the company's own material first, then from official Databricks and Azure sources, ensuring that Jane is learning from trusted, high-quality content. And don't worry, it checks each link in advance to ensure it's active and relevant.
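One simple way that link pre-check could work is a lightweight HTTP request before a resource is ever shown to the employee. This is a sketch of the idea, not necessarily the exact mechanism used in the prototype:

```python
import requests

def link_is_live(url: str, timeout: float = 5.0) -> bool:
    """Return True if the URL responds with a non-error status code."""
    try:
        response = requests.head(url, allow_redirects=True, timeout=timeout)
        return response.status_code < 400
    except requests.RequestException:
        return False

def filter_verified(urls: list[str]) -> list[str]:
    """Keep only links that are currently reachable."""
    return [u for u in urls if link_is_live(u)]
```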
Now, Jane has a question. She interacts with the LLM, asking for clarification or additional information. It responds to Jane's query based on the context provided. If the query can't be answered, it's honest and says 'I don't know'. But more often than not, it provides Jane with the information she needs to continue her learning journey.
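Pulling the pieces together, answering Jane's query could look roughly like the sketch below, which reuses the client and helper functions from the earlier snippets. The abbreviated system prompt here stands in for the full prompt shown in the next section:

```python
def answer_query(employee_id: str, query: str) -> str:
    # Gather the employee context and pre-verified resources for this request.
    ctx = get_employee_context(employee_id)
    resources = filter_verified(get_internal_resources(ctx.current_skills))

    system_prompt = (
        "You support employees in finding skill gaps and building learning paths. "
        "Only answer from the context provided; if you cannot, say 'I don't know'.\n"
        f"Resources: {resources}\n"
        f"EmployeeContext: {ctx}\n"
        f"History: {ctx.history}"
    )

    response = client.chat.completions.create(
        model=GPT4_DEPLOYMENT,
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": query},
        ],
        temperature=0.2,
    )
    return response.choices[0].message.content
```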
Here’s a small snippet of what the prompt looks like:
Prompt: Your role is to support XX employees in discovering areas for skill improvement and developing customized learning paths. Take into account the employee's current position, career goals, and necessary skills when recommending learning paths. Propose additional training or certifications that align with their career ambitions and the company's requirements. Provide links from trusted Databricks and Azure sources to resources that match the employee's skill level. Always pre-verify the links. Always ensure that the suggestions are seen as opportunities for growth and not as criticisms of their current skill set.
……………….
Remember to:
………………..
• Utilize the context provided about the employee to provide personalized recommendations.
• Engage in dialogue, asking about their career goals, preferred learning methods, and specific needs.
• Encourage peer learning and suggest employees to connect with for certain skills.
• Provide advice on time management and suggest learning resources that are designed to be consumed in short bursts for busy schedules
………………….
Resources: {*Function that returns list of internal available resources}
EmployeeContext: {*Function that returns list of employee attributes}
History: {*Function that returns employee history}
Query: {User query}
***This is a sample of the full prompt
This is just one example of an LLM system prompt. It's designed to be flexible, adapting to each employee's unique needs and circumstances.
Investment and Cost Estimation
With any LLM system, it's important to consider the investment required. The cost of implementing and running such a system will depend on several factors, including the number of employees using it and the frequency and complexity of their interactions.
Let's consider a scenario where each employee interacts with the system 5-6 times a day, but only 2-3 days a week, with each interaction using approximately 250 tokens.
Given these parameters, the cost per interaction, including both the prompt and completion, would be approximately £0.018. Over a month, the cost per employee would be around £1.008.
Additionally, the system creates embeddings for training resources and some employee context. Assuming that the system creates embeddings for 1,000 tokens of training resources and employee context per month, the cost for embeddings would be £0.08.
So, the total cost per employee per month, including the cost of embeddings, would be approximately £1.088.
This is a rough estimate, and the actual cost can vary depending on the number of tokens used per interaction, the number of interactions per month, and the amount of data used for embeddings.
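For transparency, the arithmetic behind this estimate is easy to reproduce. The figures below simply restate the assumptions above (roughly 56 interactions a month at £0.018 each, plus £0.08 for embeddings):

```python
# Reproduce the rough monthly cost estimate from the text.
cost_per_interaction_gbp = 0.018   # prompt + completion, ~250 tokens per interaction
interactions_per_month = 56        # ~14 interactions a week (5-6 a day, 2-3 days a week) over 4 weeks
embedding_cost_gbp = 0.08          # ~1,000 tokens of resources and employee context per month

chat_cost = interactions_per_month * cost_per_interaction_gbp  # £1.008
total_per_employee = chat_cost + embedding_cost_gbp            # £1.088
print(f"Approximate monthly cost per employee: £{total_per_employee:.3f}")
```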
It's also worth noting that this estimate does not include the cost of any additional Azure services that might be used in conjunction with the LLM system. However, even with these costs, the potential return on investment from improved employee productivity, reduced training costs, and increased employee satisfaction could be significant.
Benefits
The benefits of this system are numerous, and some of the potential ROI metrics include:
- Reduced Training Costs: By identifying skill gaps and providing personalized learning paths, the system could reduce the need for external training providers, leading to cost savings.
- Improved Employee Productivity: As employees upgrade their skills more quickly and effectively, their productivity could increase, leading to higher output and potentially increased revenue.
- Reduced Employee Turnover: If the system leads to higher employee satisfaction and engagement (as employees feel the company is investing in their development), this could reduce turnover and the associated costs of hiring and training new staff.
- Competitive Advantage: Enhancing employee skills can give the company a competitive advantage in the market.
- Resource Prioritization: The system prioritizes teaching and recommending from the company's own resources first, ensuring internal knowledge is fully utilized.
In the world of LLMs, we're just scratching the surface of what's possible. But one thing is clear: Generative AI has the potential to revolutionize the way we learn and work. And at Advancing Analytics, we're excited to be at the forefront of this movement. To learn how LLMs can be used within your business, download our LLM Workshop flyer or get in touch with us today.
Author
Alexandru Malanca