Lesson Description

The "Standard Prompt" Lesson is part of the full, Practical Prompt Engineering course featured in this preview video. Here's what you'd learn in this lesson:

Sabrina introduces the "Standard Prompt," which is a direct question or instruction to the LLM. The quality of the question directly relates to the quality of the answer. Some examples of standard prompts are provided using Claude. Any AI/LLM tool can be used to follow along with these examples.

Transcript from the "Standard Prompt" Lesson

[00:00:00]
>> Sabrina Goldfarb: Now we've gone through all of the boring history, background, everything, right? Now we get to just spend the rest of the day talking about and trying out a bunch of prompts. So the first prompt we're going to talk about is called the standard prompt. And yes, this is what it's actually called when you look at any of the research papers or talk to anyone in the community. It's just going to be called the standard prompt.

[00:00:27]
This is something that if you've used any of these models ever, you have done before, right? It is a direct question or an instruction to the AI. It is the simplest form of prompting, where you're literally just asking. You just want to know something. It's the foundation though that everything else builds on, so it's really important that we get comfortable with these standard prompts. One really important thing to know is that the quality of the question directly relates to the quality of the answer.

[00:00:59]
This shouldn't be surprising to us, right? This is the same thing with us as humans. If I go up to someone on the street and ask them a question about, I don't know, fixing my code, they're probably not going to have a great answer for me. But if I go up to someone with context on what I'm working on, and if that person has an idea of the features that are being built, and maybe the bug that I'm running into, they're going to be able to give me a much better answer.

[00:01:29]
So the quality of our question is going to directly relate to the quality of the answer. But that doesn't necessarily mean that we can't just have some fun looking at some standard prompts. So let's go into OpenAI or Claude. I will be using Claude. You are welcome to use any application you want: ChatGPT, Claude, Cursor, Copilot. I will be using Copilot later when we're working on the actual code, but for now, I'd like to show you a little bit of how all of these models act differently, right?

[00:02:03]
So I will be using Claude chat. I personally have the Max plan for today, just to make sure that I have plenty of bandwidth to use. You're welcome to use free plans, pro plans, anything you have, any model that you like to use, any provider you like to use. Maybe you want to use Cursor, maybe you want to use Copilot for this, maybe you want to use Claude or ChatGPT, maybe you want to use Google Gemini.

[00:02:26]
Whatever you're having fun with and feeling, go for it. But in each section we're going to talk about what the prompting technique is, then we're going to have a little bit of fun in chat and just test out a couple of prompts, and then I'm going to move over to Copilot, which I'll explain in a few minutes, and use that for our actual codebase. So for these initial prompting techniques, while we're just testing them out, I'm going to stay in Claude chat, no code necessary, nothing like that.

[00:02:59]
So let's get a couple of examples of a standard prompt going just so we can get comfortable with how these applications interact with us. And we have to remember a standard prompt is very simply us asking what we want from the agent or the LLM. But something I do want to mention is that this is different than using Google. If you have never used an AI agent or an LLM before, it's different than Google because in Google, you often try to summarize your question in just a couple of words, right?

[00:03:35]
Like, best tacos New York City kind of thing. But when it comes to these agents, we want to ask them full questions, full sentences, right? And I was always that person that typed into Google, what are the best tacos in New York City? And so that's why I think I really enjoy using these LLMs, because now I can just do that and people don't laugh at me every time I do it. But we really want to make sure we're using full sentences, the best structure we possibly can, full questions, kind of like you'd ask a neighbor.
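
If you'd rather follow along from code instead of a chat window, here's a minimal sketch of that same idea using the Anthropic Python SDK; the model id below is a placeholder, so substitute whatever model your plan gives you. The point is simply that a standard prompt is one user message containing a full-sentence question.

    import anthropic

    # The SDK reads the ANTHROPIC_API_KEY environment variable by default.
    client = anthropic.Anthropic()

    # A standard prompt: a single user message with a full-sentence question.
    response = client.messages.create(
        model="claude-sonnet-4-5",  # placeholder model id; use any model you have access to
        max_tokens=500,
        messages=[{"role": "user", "content": "What are the best tacos in New York City?"}],
    )
    print(response.content[0].text)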

[00:04:09]
So for the first one, because we've been talking about it so much, I'm going to ask, what color is the sky? Okay, so we can see that some of the stuff that I mentioned earlier is actually happening right now. And if you all are typing in, whether it's to Claude Sonnet 4.5, maybe you're talking to ChatGPT-4, or maybe you're talking to Gemini 2, right? You can probably see that you got a different answer than I did. Even if you're using exactly Sonnet 4.5, even if you're sitting right next to me, and even if you pressed send at the exact same time, you would still get a different answer than I did, because, remember, LLMs are nondeterministic.

[00:04:54]
So let's see what Claude said. Claude said the sky is typically blue during the day when the sun is up. This happens because of a phenomenon called scattering. Didn't know that. Sunlight interacts with molecules in Earth's atmosphere, and blue light is scattered more than other colors, making the sky appear blue to our eyes. Of course, the sky changes color throughout the day: at sunrise and sunset, orange, pink, red, and purple hues; at night, dark blue to black dotted with stars; on overcast days, gray or white; during storms, dark gray or even greenish.

[00:05:28]
The exact shade can also vary based on where you are, the time of year, pollution levels, and atmospheric conditions. So remember when we were talking about blue being that next most likely token? We got that next most likely token, but we didn't exactly get it in the order that we thought we would, right? "The sky is typically" all came before "blue." But we still got blue, and we also still got those other next most likely tokens of orange and gray, and even some other colors like greenish.
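
As a rough sketch of that nondeterminism point, again assuming the Anthropic Python SDK and a placeholder model id: if you send the exact same standard prompt twice, the answers will almost always be worded differently, because the model samples from that distribution of next most likely tokens instead of returning one fixed answer.

    import anthropic

    client = anthropic.Anthropic()

    def ask(prompt: str) -> str:
        # With the default sampling temperature, identical requests
        # can still come back worded differently.
        response = client.messages.create(
            model="claude-sonnet-4-5",  # placeholder model id
            max_tokens=300,
            messages=[{"role": "user", "content": prompt}],
        )
        return response.content[0].text

    first = ask("What color is the sky?")
    second = ask("What color is the sky?")
    print(first == second)  # almost always False: same prompt, different wording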

[00:06:05]
So, another example. I'm going to open up new chats every time I prompt, just to have a fresh slate every time. You all are welcome to do that or not, but I like the fresh slate. Another standard prompt that we can ask Claude is, why is thunder so scary? If I can type today. And again we can see that these are standard prompts. These are very simple. They're just a direct question, just a sentence.

[00:06:38]
They don't have to be anything fancy, but we can see that we actually get really interesting content from them. Just because they're standard prompts doesn't mean they have to be bad prompts, and it doesn't mean they have to be this short either, which we'll see in a moment when we go to look at making our Prompt Library. So, thunder can feel scary for several good reasons: the surprise factor, it's loud and overwhelming, ancient instincts (really interesting), the unknown, and then the things that are associated with storms.

[00:07:09]
Things that are associated with storms can be dangerous, so your brain is like, hey, potential hazard. The good news is thunder itself is just sound waves. It can't hurt you, right? Are you dealing with a thunderstorm right now or just curious why it affects people this way? We can see that these agents are also prompted to keep prompting us to continue using them, right? So we can use this to our advantage when we're prompting: maybe we're using a standard prompt and we're trying to build an application, and we can say, hey, if you have any other questions for me, make sure you ask those first before you start to implement this feature, right?

[00:07:50]
It's just a simple standard prompt, but it really adds to that consistency in output that we were talking about, the kind OpenAI says we should try to get. Hey, if you don't understand what I'm saying, make sure that you come and check with me first. A simple standard prompt, but it can make a huge difference in your applications.
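
As a rough illustration of that last pattern, here's a tiny sketch that just composes such a prompt; the feature described is made up for the example, and the resulting text can be pasted into Claude chat, Copilot, or whatever tool you're following along with.

    # A standard prompt for feature work, with the "check with me first"
    # instruction appended. The feature below is a made-up example.
    feature_request = "Add a CSV export button to the reports page of my web app."
    clarify_first = (
        "If you have any questions or anything is unclear, ask me those first, "
        "before you start to implement this feature."
    )
    print(f"{feature_request} {clarify_first}")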
