Handling multiple interactions with Langchain

There are many tutorials on getting started with Langchain and LLMs to create simple chat applications. I want to go slightly beyond this post and go into a bit of detail on the role of memory has in chat applications, and lastly touch on how you can scale your application across multiple sessions and multiple users. What is Langchain? Langchain is an open-source python package that helps in creating LLM solutions....

<span title='2023-10-24 07:00:00 +0000 UTC'>October 24, 2023</span>&nbsp;·&nbsp;5 min&nbsp;·&nbsp;James Malcolm

Counting Pennies - Deploy or buy GenAI?

In this post, we explore the cost of deploying or buying your generative AI. Specifically, I want to focus on the computing cost - not the additional costs which contribute to the total cost of ownership. In this, I want to explore three options, these are: Managed: Use OpenAI directly Self-managed: Deploy using AWS Self-managed: Deploy using Google Cloud This post is part of my wider LLM series. Handling multiple interactions with Langchain LLM Risks - Prompt Injection Or a full list of posts, available here....

<span title='2023-08-07 02:13:04 +0000 UTC'>August 7, 2023</span>&nbsp;·&nbsp;5 min&nbsp;·&nbsp;James Malcolm

LLM Risks - Prompt Injection

Generative AI models are all the rage nowadays. For data people, generative models have been around for several years, but the power and usability of products such as ChatGPT have taken the world by storm. This emergence has brought in new and emerging security risks with it. One of the largest and most novel risks is prompt injection. Prompt injection attacks can affect all large language and generative AI models....

<span title='2023-06-15 02:13:04 +0000 UTC'>June 15, 2023</span>&nbsp;·&nbsp;4 min&nbsp;·&nbsp;James Malcolm