LLM Context
#446 · Mar 10, 2024 · pro · 18:22 CC
Building on the previous episode, we look at refactoring our background job into a more maintainable object and providing context to the LLM so that we can chain responses together for a more conversational experience.
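
A minimal sketch of that shape, assuming a Chat model with ordered messages and a hypothetical LlmClient wrapper; the class and method names are illustrative, not taken from the episode:

class ConversationContext
  def initialize(chat)
    @chat = chat
  end

  # Collect the prior exchange so the next request can build on earlier answers.
  def to_messages
    @chat.messages.order(:created_at).map do |message|
      { role: message.role, content: message.content }
    end
  end
end

class LlmResponseJob < ApplicationJob
  queue_as :default

  def perform(chat)
    context = ConversationContext.new(chat).to_messages
    LlmClient.new.complete(messages: context) # hypothetical client wrapper
  end
end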

Streaming LLM Responses
#445 · Mar 3, 2024 · free · 24:10 CC
In this episode, we look at running a self-hosted Large Language Model (LLM) and consuming it from a Rails application. We will use a background job to make API requests to the LLM and then stream the responses in real time to the browser.
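
One plausible shape for that job, assuming the self-hosted LLM exposes a streaming HTTP endpoint (the URL below is a placeholder) that returns JSON lines with a "response" field, and that a Turbo Stream broadcast on the Message model pushes each update to the browser:

require "net/http"
require "json"

class StreamLlmResponseJob < ApplicationJob
  def perform(message)
    uri = URI("http://localhost:11434/api/generate") # placeholder endpoint
    request = Net::HTTP::Post.new(uri, "Content-Type" => "application/json")
    request.body = { prompt: message.prompt, stream: true }.to_json

    Net::HTTP.start(uri.host, uri.port) do |http|
      http.request(request) do |response|
        response.read_body do |chunk|
          token = JSON.parse(chunk)["response"].to_s
          # Append each streamed token; the model's Turbo broadcast
          # pushes the updated content to the browser as it arrives.
          message.update!(content: message.content.to_s + token)
        end
      end
    end
  end
end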

Detect Spam with AI
#427 · Nov 5, 2023 · free · 29:22 CC
We can create a small Python service that uses a Large Language Model (LLM) to detect whether a message is spam. We then tie this service into our Rails application so that any comment created is evaluated for spam, and we explore a few different routes for handling messages flagged as spam.
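
A rough sketch of the Rails side of that wiring, assuming the spam-detection service is reachable over HTTP at a placeholder URL and that Comment has a boolean spam column; the names are illustrative:

require "net/http"
require "json"

class SpamCheckJob < ApplicationJob
  def perform(comment)
    uri = URI("http://localhost:8000/classify") # placeholder service URL
    response = Net::HTTP.post(uri, { text: comment.body }.to_json,
                              "Content-Type" => "application/json")
    result = JSON.parse(response.body)

    # One possible route: quietly flag the comment instead of deleting it.
    comment.update!(spam: result["spam"])
  end
end

class Comment < ApplicationRecord
  after_create_commit -> { SpamCheckJob.perform_later(self) }
end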

Text to Image with Machine Learning
#398 · Apr 23, 2023 · pro · 26:09 CC
In this episode, we take a new approach to the Python service that our Rails background jobs call. This approach is more stable, thread-safe, and simpler to implement. We'll look at creating a text-to-image generation service that our Rails application interacts with.
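
A sketch of how the Rails job might call such a service, assuming it runs as a standalone HTTP server at a placeholder URL, returns raw PNG bytes, and that the record uses Active Storage for the generated image:

require "net/http"
require "json"
require "stringio"

class GenerateImageJob < ApplicationJob
  def perform(image_request)
    uri = URI("http://localhost:8000/generate") # placeholder service URL
    response = Net::HTTP.post(uri, { prompt: image_request.prompt }.to_json,
                              "Content-Type" => "application/json")

    # Attach the returned image bytes to the record via Active Storage.
    image_request.image.attach(
      io: StringIO.new(response.body),
      filename: "generated.png",
      content_type: "image/png"
    )
  end
end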

OpenAI API Integration
#390 · Feb 26, 2023 · pro · 26:33 CC
In this episode, we have a look at using the OpenAI API to create AI responses to comments. Rather than simply calling the API directly with Net::HTTP, we'll also build an API wrapper around Net::HTTP to simplify the responsibilities of our OpenAI integration.
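
A minimal sketch of what such a wrapper could look like, assuming the chat completions endpoint and an API key in the environment; the class name, method name, and model choice are assumptions, not necessarily what the episode uses:

require "net/http"
require "json"

class OpenaiClient
  BASE_URI = URI("https://api.openai.com/v1/chat/completions")

  def initialize(api_key: ENV["OPENAI_API_KEY"])
    @api_key = api_key
  end

  # Callers hand in a comment body and get a reply string back;
  # the HTTP details stay contained in this class.
  def reply_to(comment_body)
    request = Net::HTTP::Post.new(BASE_URI)
    request["Authorization"] = "Bearer #{@api_key}"
    request["Content-Type"] = "application/json"
    request.body = {
      model: "gpt-3.5-turbo",
      messages: [{ role: "user", content: comment_body }]
    }.to_json

    response = Net::HTTP.start(BASE_URI.host, BASE_URI.port, use_ssl: true) do |http|
      http.request(request)
    end
    JSON.parse(response.body).dig("choices", 0, "message", "content")
  end
end

With a wrapper like this, the rest of the app only ever calls something like OpenaiClient.new.reply_to(comment.body), keeping the HTTP plumbing out of models and jobs.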