Check out a free preview of the full First Look: ChatGPT API for Web Developers course

The "OpenAI Node.js API" Lesson is part of the full, First Look: ChatGPT API for Web Developers course featured in this preview video. Here's what you'd learn in this lesson:

Maximiliano demonstrates the process of creating a Node.js server for making API requests, which offers the additional benefit of keeping the API key secure on the server side.


Transcript from the "OpenAI Node.js API" Lesson

[00:00:00]
>> We've mentioned this a couple of times, but it's worth repeating: never store your key in a public place. Now we will see how to move to the server. But even when you move to the server, you should control who accesses the services that are using OpenAI behind the scenes, and how.

[00:00:21]
Because if someone can inject into your prompts, they can use OpenAI through you and charge your credit card. Or if you get lots of users, you may run out of money quickly. By the way, you can set spending caps at OpenAI; in fact, I have one on my key.

[00:00:45]
You can say, I don't wanna spend more than $10 or $1,000, and please send me an email when we are close to $5 or $50. And of course, after that, you are done. Which also means that when you have users asking the same thing, you may wanna do caching.

[00:01:06]
So you can cache responses and have your own memory, so you don't have to go to OpenAI all the time. I mean, if people are asking the same thing and you already got an answer from OpenAI, you can save it and cache it. So how can we move this to the server?
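
As an aside, here is a minimal sketch of that caching idea, keyed by the prompt text. askOpenAI is a hypothetical helper that performs the actual API request; a real setup would likely add an expiry and use something like Redis instead of an in-memory Map.

    // Reuse answers for repeated prompts instead of calling OpenAI every time.
    const cache = new Map();

    async function cachedAnswer(prompt) {
      if (cache.has(prompt)) {
        return cache.get(prompt); // already answered: no extra OpenAI charge
      }
      const answer = await askOpenAI(prompt); // hypothetical helper that calls the API
      cache.set(prompt, answer);              // a real app would also cap size / add a TTL
      return answer;
    }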

[00:01:25]
Because we know that this is not good. Well, actually, if you're using Node.js 18 or greater (or 17.5 with the experimental flag), you can just copy and paste the fetch that we have here, like this, from the fetch to response.json, and move that to the server. It's no more difficult than that.

[00:01:51]
So you open, in this case, server.js. And, for example, for /api/chat, you just paste something like that, because Node.js now supports the Fetch API. So the same code should work; we just need to set up a few quick things. For example, we need to read the value from the request as JSON, like so.
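
For reference, a minimal sketch of what that proxy route in server.js could look like, assuming Express and the built-in fetch from Node.js 18+; the port, model, and key value are placeholders, not the course's exact code.

    const express = require('express');
    const app = express();
    app.use(express.json()); // read the incoming request body as JSON

    const OPENAI_API_KEY = 'sk-...'; // stays on the server, never shipped to the client

    app.post('/api/chat', async (req, res) => {
      const { messages } = req.body; // whatever the client sends us
      const response = await fetch('https://api.openai.com/v1/chat/completions', {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
          Authorization: `Bearer ${OPENAI_API_KEY}`
        },
        body: JSON.stringify({ model: 'gpt-3.5-turbo', messages })
      });
      const json = await response.json();
      res.json(json); // at this point we are only a pass-through proxy
    });

    app.listen(3000);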

[00:02:14]
And then, well, the OpenAI API key, I think I need to update that one, because the one that I have on the server is an old one. So I will also update the one on the server, like so. And then I need to respond with JSON, and I can just, in this case, await the other response.

[00:02:38]
Because we are kind of a proxy at this point. Again, this is not adding any security or anything else. We can do that on the server, but there is a better way: instead of making your HTTP request manually, you can use the OpenAI Node.js API, the official library.

[00:03:00]
So when we were on the website looking, for example, at how to create the chat completion, here we can change from cURL or Python to Node.js. So we have here how to set up and import the library, and then how to send a message. We can just copy and paste that or write it from scratch.
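
The documentation snippet looks roughly like this for the v3 openai npm package that was current at the time (newer versions of the library expose a different API):

    const { Configuration, OpenAIApi } = require('openai');

    const configuration = new Configuration({
      apiKey: process.env.OPENAI_API_KEY // read from the environment by default
    });
    const openai = new OpenAIApi(configuration);

    async function main() {
      const completion = await openai.createChatCompletion({
        model: 'gpt-3.5-turbo',
        messages: [{ role: 'user', content: 'Hello!' }]
      });
      console.log(completion.data.choices[0].message);
    }
    main();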

[00:03:29]
And to try it, I will use this other general endpoint that I have here. What's the idea? To display the chat, I have our own playground. I created our own playground, which is pretty simple: it's just a form with a textarea and a button.

[00:03:47]
And the idea is that I'm going to call the general API, sending the prompt and expecting the response. So let's see what it looks like. I'm going to /playground on my localhost, and it looks like this. In this case, it's not a chat, it's just for testing simple prompts.
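
Here is a sketch of what the client side of that playground might look like; the /api/general endpoint name and the element selectors are assumptions for illustration, not the course's exact App.js.

    const form = document.querySelector('form');
    const textarea = document.querySelector('textarea');
    const output = document.querySelector('#output'); // assumed output element

    form.addEventListener('submit', async (event) => {
      event.preventDefault();
      const response = await fetch('/api/general', { // assumed endpoint name
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ prompt: textarea.value })
      });
      const data = await response.json();
      // The server may return the whole message object or just its content string
      output.textContent = typeof data === 'string' ? data : data.content;
    });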

[00:04:07]
You don't need context all the time. Sometimes I just need a translation, say to translate something to Portuguese; that's a single call, you don't need context for that. So in this case, we have this simple playground to play with. I just need to do the server part.

[00:04:26]
So then, I'm going to server.js, and first I'm going to paste what I have from the documentation. This is a require, so it's importing the library. Then it sets up the library; by default, it reads the key from process.env, which is pretty cool for deploying this, but we are learning.

[00:04:47]
So we are actually making this simpler: we are just putting the OpenAI key right here. By the way, I have a typo here, it's apiKey. Like so, and then we create the OpenAI object. And then the idea is that we use that object, in this case, when we receive a POST request to the API endpoint.

[00:05:15]
And what we need to change here is that we need to read the body that we are receiving. So the body, this is Express.js, nothing very fancy. And here we have the messages. So instead of using this for the content, remember that in this case we are not going to create a chat.

[00:05:36]
So we are not going to send previous history. But because we're using the OpenAI ChatGPT chat version, we still need to send a chat message like a real user, even if it's just one. And for the content, we are going to use body.prompt. Why is it called .prompt? Because from App.js we are passing an object with a property named prompt, that's all.

[00:06:07]
So we are passing the prompt, and instead of doing a console.log of the message, what I need is to send that message back to the client. So I'm going to respond with JSON for the completion, blah, blah, blah, I think that's okay. So, coming back here, let's see if that works.
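
Putting those pieces together, the handler might look something like this; the /api/general route name is a guess, and it assumes the Express app and openai object set up earlier.

    app.post('/api/general', async (req, res) => {
      const body = req.body; // requires app.use(express.json())
      const completion = await openai.createChatCompletion({
        model: 'gpt-3.5-turbo',
        // one real user message built from the prompt, no previous chat history
        messages: [{ role: 'user', content: body.prompt }]
      });
      // Send the assistant's message back to the client instead of console.log-ing it
      res.json(completion.data.choices[0].message);
    });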

[00:06:26]
If I say hi and press Send, I receive an object on the other side. Why? Because I'm sending back the whole message, and the message contains the role and the content inside. If I just want the content, I say .content, so let's try again. Hi. "Hi there, how can I assist you today?"
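
In other words, the message object coming back looks roughly like { role: 'assistant', content: 'Hi there, how can I assist you today?' }, so returning only the text is a one-line change in the sketch above:

    res.json(completion.data.choices[0].message.content); // just the text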

[00:06:50]
So now, on the client side, I don't have the key. Anyway, I still have a security risk, because anyone can call my API. So you need to set limits, or you need to create your own token. Maybe the user needs to be logged in, and you set limits per user.
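
One common way to add such limits is a small Express middleware. This per-IP counter is only an illustrative sketch; a real app would use proper authentication and a maintained package such as express-rate-limit.

    const hits = new Map();
    const LIMIT = 20;                 // max requests...
    const WINDOW_MS = 60 * 60 * 1000; // ...per IP per hour

    app.use('/api', (req, res, next) => {
      const now = Date.now();
      const entry = hits.get(req.ip) ?? { count: 0, start: now };
      if (now - entry.start > WINDOW_MS) {
        entry.count = 0;
        entry.start = now;
      }
      entry.count += 1;
      hits.set(req.ip, entry);
      if (entry.count > LIMIT) {
        return res.status(429).json({ error: 'Too many requests' });
      }
      next();
    });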

[00:07:05]
Because remember, you're going to pay for every request. But now we have a nice internal playground that works server-side, where we can try different things. Remember that you can use this for a lot of different tasks with the API.
