How I Built a Serverless Custom Chatbot with ChatGPT API and AWS in 10 Minutes

--

The AI revolution is astonishing the world. Large Language Models have arrived to stay. Companies like OpenAI, Google, and Meta are developing amazing technologies that extend human capabilities… if people learn to use them.

That’s why I set out to create a simple guide for people like me, who are closer to the business side than to the depths of how these models are built, and to show one of the many possible ways to put them to work creating value.

Here is a step-by-step tutorial on how to use the OpenAI API, specifically the ChatGPT model, to build a customizable solution for any purpose you’d like. Best of all: how to do it serverless, without needing to know anything about configuring servers, deployments, and so on.

For this solution I used Docker, AWS S3, AWS Lambda, and AWS API Gateway. Here is a diagram for a more graphic understanding:

Prerequisites:

Create a Docker Container

Use a Docker container to create a zip file with the OpenAI library:

# Create the container
docker run -it ubuntu
# Update the packaging tool (APT)
apt-get update
# Install the tool for managing PPAs, then add the deadsnakes PPA
apt-get install software-properties-common
add-apt-repository ppa:deadsnakes/ppa
# Update the packaging tool again to pick up the new repository
apt-get update
# Install the Python version you will use
apt-get install python3.9
# Install pip
apt-get install python3-pip
# Install the Python virtual environment module
apt install python3.9-venv
# Create an environment to install the openai library in isolation
python3.9 -m venv openai_venv
# Activate the Python environment
source openai_venv/bin/activate
# Create a folder to hold the libraries you will install
mkdir python
# Install the openai library into the python folder
pip install openai -t python
# Install the zip utility
apt-get install zip
# Save the python folder as a zip file
zip -r openai.zip python
# Exit the Docker container
exit
# From your host machine, copy the zip file to a local path
docker cp [CONTAINER-ID]:./openai.zip [LOCAL-PATH]
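The folder is named `python` on purpose: Lambda expects a Python layer’s dependencies to sit under a top-level `python/` directory inside the zip. As a quick sanity check before uploading, you can inspect the archive’s layout — a minimal sketch, where the helper name and zip path are my own:

```python
import zipfile

def check_layer_layout(zip_path):
    """Return True if every entry in the zip sits under a top-level python/ folder."""
    with zipfile.ZipFile(zip_path) as zf:
        return all(name.startswith("python/") for name in zf.namelist())

# Example usage (assumes openai.zip is in the current directory):
# print(check_layer_layout("openai.zip"))
```

If this returns False, Lambda will not find the library at import time.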

Upload the Zip File to S3

Create a bucket for uploading layers for Lambda functions:

aws s3 cp openai.zip s3://layers-bucket/openai.zip --profile [aws-cli-profile]

Generate a Lambda Function

Create a layer with the zip file uploaded to the S3 bucket

Create a Lambda Function

Add the layer to the Lambda function

Create the code in the Lambda function:

import json
import openai

openai.api_key = "PUT OPENAI KEY HERE"

def lambda_handler(event, context):
    # Call the ChatGPT model with a few-shot conversation
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": "You are a writer. When the user gives you words, create a literary phrase"},
            {"role": "user", "content": "storms"},
            {"role": "assistant", "content": "I am not afraid of storms, for I am learning how to sail my ship."},
            {"role": "user", "content": event["word"]}
        ]
    )

    # Extract the generated text from the first choice
    text = response['choices'][0]['message']['content']
    print(text)

    return {
        'statusCode': 200,
        'body': json.dumps(text)
    }

In this case, the code generates poetic phrases. You import the OpenAI library, add your API key, and then build a message structure like the one shown in the example, where the messages use three roles:
- System: tells the model how it should behave. In my case, like a writer.
- User: the person who will use the service. You just pass one example of a user prompt to “prime” the model; the next user prompt is fed in through the API.
- Assistant: the chatbot’s response. You only give one example response to the first example user prompt, and the model will be able to generate similar responses for new prompts.
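The three roles above can be sketched in plain Python: the `messages` list is just role-tagged dictionaries, and each new request appends the next user prompt to the fixed few-shot conversation. A minimal sketch — the `build_messages` helper is my own, not part of the OpenAI library:

```python
def build_messages(history, new_word):
    """Append the next user prompt to the few-shot conversation."""
    return history + [{"role": "user", "content": new_word}]

# The few-shot "priming" conversation from the Lambda code above
history = [
    {"role": "system", "content": "You are a writer. When the user gives you words, create a literary phrase"},
    {"role": "user", "content": "storms"},
    {"role": "assistant", "content": "I am not afraid of storms, for I am learning how to sail my ship."},
]

messages = build_messages(history, "love")
# messages now ends with {"role": "user", "content": "love"}
```

Because `history` is never mutated, every invocation starts from the same primed state — which is exactly what the Lambda code does.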

Response test

# Test the Lambda function with this JSON

{
"word": "love"
}

# You will get a response like this:
{
"statusCode": 200,
"body": "\"Love is like a wildflower; it can be found in the most unexpected places and brings beauty and joy to those lucky enough to witness its bloom.\""
}
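Note that `body` is a JSON string inside JSON, because the Lambda code runs `json.dumps` on the generated text. A client consuming this response would decode it twice — a minimal sketch, with the `raw` value taken from the example response above:

```python
import json

raw = '{"statusCode": 200, "body": "\\"Love is like a wildflower; it can be found in the most unexpected places and brings beauty and joy to those lucky enough to witness its bloom.\\""}'

response = json.loads(raw)             # outer JSON: statusCode + body
phrase = json.loads(response["body"])  # inner JSON: the quoted string itself
print(phrase)  # prints the phrase without the surrounding quotes
```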

Add an API Gateway as a Trigger

Test the Lambda Function

Open a Bash console and follow these steps:

# Define the JSON that feeds the function from the API Gateway, as a shell variable
# I used the word "Sadness" for the test
export PAYLOAD=$(echo '{"word": "Sadness"}' | base64)

# Invoke the Lambda function
aws lambda invoke --function-name OpenAiPrueba --payload $PAYLOAD response.json --profile [PROFILE AWS CLI HERE]

# Print the response
cat response.json && echo

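The base64 step exists because AWS CLI v2 treats binary parameters such as `--payload` as base64-encoded by default. The same encoding can be produced in Python — a minimal sketch, equivalent to the `echo ... | base64` pipe above (`echo` also appends a trailing newline, which the JSON parser tolerates):

```python
import base64
import json

payload = json.dumps({"word": "Sadness"})
encoded = base64.b64encode(payload.encode("utf-8")).decode("ascii")
print(encoded)  # base64-encoded JSON, ready to pass to --payload
```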

Here is an example of a final test response:


Data 4 Dummies by Cristian Restrepo

Professional blog by Cristian Restrepo | Business & Data Science passionate | Data Engineer at Arkho