
Build a Chatbot with DeepSeek

Updated by Abby

Compatible with all bot types

To follow along, you'll need a DeepSeek API key with a payment method attached.

You can find the web template here, but note that you'll need to add your API key in the webhook block before it will work.

The Flow

The basic flow is a loop: the user asks questions and DeepSeek responds. The questions and responses are saved in a memory variable that we also pass to DeepSeek to keep the context.

We'll start the flow with an Ask a question block; the variable saved here should be @user_text

Next we'll add a Set a field block and save an array-type variable as @memory; the value will be an empty array: []

This empty array is where we'll add the conversation history as it comes in, so DeepSeek has context in case the user wants to ask follow-up questions.

The following block is a Formula block where we'll encode the user input.

This way the user has freedom to input any character or line break without breaking the JSON we send to DeepSeek

The output of the formula should also be @user_text. This will overwrite this: "Hi I have a question" with this: "Hi%I%have%a%question"

The exact formula is: Encode(@user_text)
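As a rough illustration, Landbot's Encode formula behaves like standard URL (percent) encoding. Here's a minimal Python sketch using the standard library; the exact characters Landbot's Encode produces may differ slightly from `urllib.parse.quote`:

```python
# Sketch of what Encode(@user_text) does: percent-encode the text so that
# spaces, quotes, and line breaks can't break the JSON body we send later.
# This is an analogy, not Landbot's actual implementation.
from urllib.parse import quote

def encode(user_text: str) -> str:
    # safe="" means every reserved character gets percent-encoded
    return quote(user_text, safe="")

print(encode("Hi I have a question"))  # Hi%20I%20have%20a%20question
```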

Next is the webhook block.

The webhook

Here's the url we'll use: https://api.deepseek.com/chat/completions

This should be a POST API call

You'll need your DeepSeek API key here. Headers:

Authorization: Bearer <DeepSeek API Key>

In the body we'll add our prompt with the message history (the @memory variable) for context

The prompt can be anything, but here's what we used:

You are a helpful Landbot assistant, respond in one paragraph, chat history: @memory

The content for the user is the variable @user_text that we encoded

Body:

{
"model": "deepseek-chat",
"messages": [
{"role": "system", "content": "You are a helpful Landbot assistant, respond in one paragraph, chat history: @memory"},
{"role": "user", "content": "@user_text"}
],
"stream": false
}


Landbot isn't currently capable of handling streaming responses from the API, so for now we need to set "stream" to the boolean false
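To see the same request outside Landbot, here's a Python sketch that builds the headers and body exactly as above. `API_KEY` is a placeholder for your real key, and the `requests.post` line is shown but not run:

```python
# Sketch of the webhook call: same URL, headers, and body as the Landbot
# webhook block. API_KEY is a placeholder, not a real key.
import json

API_KEY = "<DeepSeek API Key>"

def build_request(user_text: str, memory: list):
    headers = {
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    }
    body = {
        "model": "deepseek-chat",
        "messages": [
            {"role": "system",
             "content": ("You are a helpful Landbot assistant, respond in "
                         f"one paragraph, chat history: {memory}")},
            {"role": "user", "content": user_text},
        ],
        "stream": False,  # Landbot can't handle streaming responses
    }
    return headers, body

headers, body = build_request("Hi%20I%20have%20a%20question", [])
print(json.dumps(body, indent=2))
# To actually send it:
# requests.post("https://api.deepseek.com/chat/completions",
#               headers=headers, json=body)
```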

Next we'll test the request with a test value for @user_text (no need to give @memory a value for now).

Once we've tested the request and gotten a 200 response we can save the response as a variable

Since there's a lot of extra information that we don't need or want to show the user, we'll click on the purple banner in 'Save Responses as Fields' and select choices.0.message.content.

We'll save this as a String variable called @response
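The path choices.0.message.content maps directly onto the JSON the API returns. Here's a Python sketch with a made-up sample response (the illustrative content is not real API output):

```python
# Sketch: how Landbot's "choices.0.message.content" path reads the response.
# sample_response is a made-up illustration of the response shape.
sample_response = {
    "choices": [
        {"message": {"role": "assistant", "content": "Hello! How can I help?"}}
    ]
}

# "choices.0.message.content" is plain indexing: choices -> item 0 -> message -> content
response = sample_response["choices"][0]["message"]["content"]
print(response)  # Hello! How can I help?
```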

The Flow (again)

Next we're going to push our question and the response we got into our memory variable

Here's the formula we'll use: Push(@memory, @user_text, @response)

The output will be @memory
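Assuming the Push formula appends each extra argument to the array (an assumption about Landbot's formula, not confirmed by this article), its effect looks like this in Python:

```python
# Sketch of what Push(@memory, @user_text, @response) does, assuming it
# appends both values to the array and returns it.
def push(memory: list, *items: str) -> list:
    memory.extend(items)
    return memory

memory = []
memory = push(memory, "Hi%20I%20have%20a%20question", "Hello! How can I help?")
print(memory)  # ['Hi%20I%20have%20a%20question', 'Hello! How can I help?']
```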

We'll connect this to another (separate) Ask a question block. The text of the block is the response variable (@response), and the variable saved is @user_text again.

This block will loop back to the Encode formula from before.
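Putting the whole loop together, here's a compact Python sketch of the flow described above. `ask_user` and `call_deepseek` are hypothetical stand-ins for the Ask a question and webhook blocks:

```python
# Sketch of the full flow as a loop. ask_user and call_deepseek are
# hypothetical stand-ins for the Landbot blocks, not real APIs.
from urllib.parse import quote

def chat_loop(ask_user, call_deepseek):
    memory = []                                  # Set a field: @memory = []
    user_text = ask_user("How can I help?")      # first Ask a question block
    while user_text is not None:                 # None ~ user escapes the loop
        user_text = quote(user_text, safe="")    # Formula: Encode(@user_text)
        response = call_deepseek(user_text, memory)   # webhook block
        memory = memory + [user_text, response]  # Push(@memory, @user_text, @response)
        user_text = ask_user(response)           # Ask a question showing @response
    return memory
```

Each pass through the loop grows @memory by one question/answer pair, which is what gives DeepSeek its context for follow-ups.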

That's it! We recommend using a Persistent Menu block so that the user can escape the loop if needed, though it's completely optional.
