In this short post, I will demonstrate how to authenticate and make API calls during the execution of a workflow built on AWS Step Functions. Before the HTTPS Endpoint state was added, we had to fall back on invoking a Lambda function for this kind of call out. While that approach still has its benefits, the post below walks through invoking ChatGPT directly inside a Step Functions workflow.
Step 1: EventBridge Connection
The HTTPS Endpoint state requires an EventBridge Connection, which in this case stores our API token.
In the web console, navigate to EventBridge Connections. Click Create Connection.
On the first portion of the screen, give your connection a valid name and select Public. Optionally provide a description.
At the bottom of the page, select API Key.
This part is important!
- For the API Key Name, enter Authorization
- For the Value, enter Bearer sk-XXXXXX, where sk-XXXXXX is your OpenAI API token. Note that there is a space between Bearer and your API token!
Click Create.
In my case, this is what I see after creating the connection.
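If you prefer to create the connection programmatically rather than in the console, EventBridge's CreateConnection API (via the CLI or an SDK) accepts roughly the payload sketched below. The connection name matches the one used later in this post, the description is just an example, and sk-XXXXXX is a placeholder for your real token:
{
  "Name": "OpenaiDec24",
  "Description": "Stores the OpenAI API token for Step Functions HTTP Tasks",
  "AuthorizationType": "API_KEY",
  "AuthParameters": {
    "ApiKeyAuthParameters": {
      "ApiKeyName": "Authorization",
      "ApiKeyValue": "Bearer sk-XXXXXX"
    }
  }
}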
Step 2: SFN Workflow
The first iteration will be as simple as it can get. Create a new Step Functions workflow in the AWS Console. NOTE: When prompted, select JSONata as the query language.
After adding the HTTPS Endpoint state to the canvas, we need to provide a few bits of information to configure our HTTP call out.
- API Endpoint: https://api.openai.com/v1/chat/completions
- Method: POST
- Connection: select the Connection you created in Step 1 above
Below is a basic request body in the format OpenAI expects:
{
  "model": "gpt-4o",
  "messages": [
    {
      "role": "developer",
      "content": "You are a helpful assistant."
    },
    {
      "role": "user",
      "content": "Hello!"
    }
  ]
}
As I noted in my previous article, beyond the low-code workflow builder in the web console, we can also fully define our workflows via ASL, enabling Infrastructure-as-Code deployments.
Below is the JSON ASL for the workflow:
{
  "QueryLanguage": "JSONata",
  "Comment": "Dec 2024 flow updated as POC for alternatives, and also to show we don't always need to hop into a Lambda.",
  "StartAt": "Call HTTPS APIs",
  "States": {
    "Call HTTPS APIs": {
      "Type": "Task",
      "Resource": "arn:aws:states:::http:invoke",
      "Arguments": {
        "ApiEndpoint": "https://api.openai.com/v1/chat/completions",
        "Method": "POST",
        "InvocationConfig": {
          "ConnectionArn": "arn:aws:events:us-east-1:00000000000000:connection/OpenaiDec24/6a2df29c-c945-4e9f-a446-9957ee9de7e1"
        },
        "RequestBody": {
          "model": "gpt-4o",
          "messages": [
            {
              "role": "developer",
              "content": "You are a helpful assistant."
            },
            {
              "role": "user",
              "content": "Hello!"
            }
          ]
        }
      },
      "Retry": [
        {
          "ErrorEquals": [
            "States.ALL"
          ],
          "BackoffRate": 2,
          "IntervalSeconds": 1,
          "MaxAttempts": 3,
          "JitterStrategy": "FULL"
        }
      ],
      "End": true,
      "Comment": "- Authorization is the key, Bearer <TOKEN> is the value via EventBridge connection"
    }
  }
}
You can copy the JSON ASL above and paste it into the workflow editor on the Console: select the Code pill at the top of the screen, paste the definition into the editor on the left-hand side, and then navigate back to the Design tab.
Step 3: Execute the Workflow
If you haven’t done so already, rename and save the workflow. Click the Config tab, rename the workflow, select an execution role that allows HTTP invocations, and click Save.
You can let the Console create the IAM execution role for you automatically. The benefit is that AWS will ensure you have the necessary permissions. If you are new to IAM configuration, this is a great feature and one that can help you learn the required elements. To review the role, navigate to IAM > Roles and search for the execution role AWS created for you. Clicking into the role will allow you to review its permissions.
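If you would rather define the role yourself, the execution role's policy needs to allow the workflow to invoke HTTP endpoints and to read the connection's credentials (EventBridge stores these in a Secrets Manager secret behind the scenes, typically under the events!connection prefix). A minimal sketch is below; the connection ARN reuses the placeholder account ID from the definitions in this post, and you can scope the statements down further, for example by limiting states:InvokeHTTPEndpoint to specific endpoints with condition keys:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "states:InvokeHTTPEndpoint",
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": "events:RetrieveConnectionCredentials",
      "Resource": "arn:aws:events:us-east-1:000000000000:connection/OpenaiDec24/*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "secretsmanager:GetSecretValue",
        "secretsmanager:DescribeSecret"
      ],
      "Resource": "arn:aws:secretsmanager:*:*:secret:events!connection/*"
    }
  ]
}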
With our workflow defined and saved, we can now test the execution. I am not going to define a payload. Leave this screen blank and click Start Execution.
In the graph view, if everything is configured correctly (i.e. the EventBridge Connection is set up as outlined above), you should see a successful execution.
For the Task state output, you should see a ResponseBody key containing the data sent back from the API call out to ChatGPT.
That’s it. In my case, ChatGPT responded with “Hello! How can I help you today?”
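For reference, the ResponseBody sits alongside the HTTP status and headers in the task result. An abbreviated (and approximate) shape of the output for a run like this looks roughly as follows, with headers and most of OpenAI's metadata trimmed:
{
  "StatusCode": 200,
  "StatusText": "OK",
  "ResponseBody": {
    "object": "chat.completion",
    "model": "gpt-4o",
    "choices": [
      {
        "index": 0,
        "message": {
          "role": "assistant",
          "content": "Hello! How can I help you today?"
        },
        "finish_reason": "stop"
      }
    ]
  }
}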
Optional: Variables and JSONata
The example above was kept simple to ensure everything is wired up correctly to call the OpenAI API from within our Step Functions workflow. Below, I will demonstrate how we can dynamically pass the prompt to OpenAI. As I did in the previous post, I am going to use a Pass state to set the prompt, and then refer to this variable within the definition of the request body.
The updated workflow will make use of Variables and JSONata to refer to the prompt defined in the Pass state.
Below is the ASL definition of the updated workflow:
{
  "QueryLanguage": "JSONata",
  "Comment": "Dec 2024 flow updated as POC for alternatives, and also to show we don't always need to hop into a Lambda.",
  "StartAt": "Pass",
  "States": {
    "Pass": {
      "Type": "Pass",
      "Next": "Call HTTPS APIs",
      "Assign": {
        "prompt": "How many wins do the Boston Bruins have this season?"
      }
    },
    "Call HTTPS APIs": {
      "Type": "Task",
      "Resource": "arn:aws:states:::http:invoke",
      "Arguments": {
        "ApiEndpoint": "https://api.openai.com/v1/chat/completions",
        "Method": "POST",
        "InvocationConfig": {
          "ConnectionArn": "arn:aws:events:us-east-1:000000000000:connection/OpenaiDec24/6a2df29c-c945-4e9f-a446-9957ee9de7e1"
        },
        "RequestBody": {
          "model": "gpt-4o",
          "messages": [
            {
              "role": "developer",
              "content": "You are a helpful assistant."
            },
            {
              "role": "user",
              "content": "{% $prompt %}"
            }
          ]
        }
      },
      "Retry": [
        {
          "ErrorEquals": [
            "States.ALL"
          ],
          "BackoffRate": 2,
          "IntervalSeconds": 1,
          "MaxAttempts": 3,
          "JitterStrategy": "FULL"
        }
      ],
      "End": true,
      "Comment": "- Authorization is the key, Bearer <TOKEN> is the value via EventBridge connection"
    }
  }
}
A few notes on the above:
- We define a variable called prompt in the Pass state.
- In the definition of the request body, we use JSONata to dynamically pull in the prompt via the variable defined above: "content": "{% $prompt %}"
After a successful execution, we see that ChatGPT acknowledges that the model does not have the ability to recall current information.
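As a side note, because the workflow is in JSONata mode, you can also shape what the state returns instead of passing along the entire HTTP response. A minimal sketch, assuming you want just the assistant's reply: add an Output field like the one below to the Call HTTPS APIs task ($states.result refers to the task's raw result, and the reply key name here is arbitrary).
"Output": {
  "reply": "{% $states.result.ResponseBody.choices[0].message.content %}"
}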
Optional: Input Prompt
Finally, a more realistic workflow would expect that the prompt would be included in the payload that triggers the execution of the Step Function.
The changes:
- Remove the Pass state that was added above to highlight Variables
- Include the prompt in the payload at the start of the execution.
We will make use of the reserved $states variable, namely $states.input, to isolate the prompt passed into the workflow execution.
The ASL definition is below:
{
  "QueryLanguage": "JSONata",
  "Comment": "Dec 2024 flow updated as POC for alternatives, and also to show we don't always need to hop into a Lambda.",
  "StartAt": "Call HTTPS APIs",
  "States": {
    "Call HTTPS APIs": {
      "Type": "Task",
      "Resource": "arn:aws:states:::http:invoke",
      "Arguments": {
        "ApiEndpoint": "https://api.openai.com/v1/chat/completions",
        "Method": "POST",
        "InvocationConfig": {
          "ConnectionArn": "arn:aws:events:us-east-1:000000000000:connection/OpenaiDec24/6a2df29c-c945-4e9f-a446-9957ee9de7e1"
        },
        "RequestBody": {
          "model": "gpt-4o",
          "messages": [
            {
              "role": "developer",
              "content": "You are a helpful assistant."
            },
            {
              "role": "user",
              "content": "{% $states.input.prompt %}"
            }
          ]
        }
      },
      "Retry": [
        {
          "ErrorEquals": [
            "States.ALL"
          ],
          "BackoffRate": 2,
          "IntervalSeconds": 1,
          "MaxAttempts": 3,
          "JitterStrategy": "FULL"
        }
      ],
      "End": true,
      "Comment": "- Authorization is the key, Bearer <TOKEN> is the value via EventBridge connection"
    }
  }
}
- Notice that the body of our request to OpenAI was modified to refer to the prompt key from $states.input via "content": "{% $states.input.prompt %}", which passes the execution input directly to OpenAI.
Save and execute the workflow with the input shown below.
{
  "prompt": "Tell me a joke"
}
That’s it! We were able to pull the input that triggered the execution into our call to OpenAI.
Summary
The post above walks you through the basics of making call outs to external APIs from inside an AWS Step Functions workflow. I used OpenAI’s API to make simple calls to ChatGPT, highlighting that you have options beyond Amazon Bedrock in your agentic solutions built on top of Step Functions.
Finally, you can call public APIs that do not require credentials by creating an EventBridge Connection with a placeholder username/password. Even though the API you want to leverage may not require a user/pass combination, this satisfies the connection’s required authorization input.
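For example, a connection for an unauthenticated public API could be created with BASIC authorization and throwaway values; the placeholder credentials satisfy the connection requirement, and most public APIs will simply ignore them. A sketch of such a CreateConnection payload, where the name and credential values are arbitrary placeholders:
{
  "Name": "PublicApiPlaceholder",
  "AuthorizationType": "BASIC",
  "AuthParameters": {
    "BasicAuthParameters": {
      "Username": "placeholder",
      "Password": "placeholder"
    }
  }
}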