Open Source AI with Python & Hugging Face

Sentiment Analysis & Text Generation

Steve Kinney
Temporal

Lesson Description

The "Sentiment Analysis & Text Generation" Lesson is part of the full, Open Source AI with Python & Hugging Face course featured in this preview video. Here's what you'd learn in this lesson:

Steve explores text processing beyond Q&A, focusing on practical applications like sentiment analysis and text generation. He covers nuances such as sarcasm detection, model parameters like temperature, and encourages experimenting with different techniques for creative results.


Transcript from the "Sentiment Analysis & Text Generation" Lesson

[00:00:00]
>> Steve Kinney: With text, we think a lot about the pattern of "I ask a question, or I prompt ChatGPT, and it gives a response," right? And that is a very popular paradigm, but it is not the only one, and it is one that we will work with a lot today.

[00:00:18]
But I did want to introduce a few other ones because they have practical purposes, and when we see the code, you'll see that there might be a use case for this in my application that I could just use tomorrow. And it is not a heavy lift.

[00:00:34]
So I think it is the responsible and practical choice to kind of go through all of them. I will say one more time that, especially for all the tech stuff, there's also a TypeScript SDK. And I understand that you are tuned into frontendmasters.com, and so the likelihood that you will at some point touch JavaScript or TypeScript during your day-to-day life is high.

[00:00:56]
All of these concepts apply there as well. And honestly, the syntax isn't all that much different, and the APIs from the libraries are all the same, so on and so forth. Even if you're like, that's cool, but I don't have Python in my app, it's okay, you can do the same basic thing in TypeScript as well.

[00:01:14]
So one of the things we can do is sentiment analysis. This is like the hello world of any kind of machine learning stuff: you take words and you give them a score. Are these happy words or sad words? And there's nuance there, right? Simple sentiment analysis is simple and easy, but it's not good with nuance like sarcasm, which I have employed one or two times in my life.
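To make that concrete, here is a minimal sketch of what a sentiment analysis call looks like with the transformers pipeline API. The example sentences and the printed results are illustrative assumptions, not output shown in the lesson.

```python
from transformers import pipeline

# With no model specified, the pipeline falls back to a default English
# sentiment model and returns a label plus a confidence score.
classifier = pipeline("sentiment-analysis")

print(classifier("This setup was way easier than I expected."))
# e.g. [{'label': 'POSITIVE', 'score': 0.999...}]

# Sarcasm is where simple sentiment analysis tends to fall over:
print(classifier("Oh great, another meeting. Just what I needed."))
# a basic classifier will often score this as POSITIVE
```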

[00:01:47]
It's not great with that. And we'll see how stuff like Transformers handles that and why ChatGPT is better with my sass than the basic sentiment analysis that we look at. But it is kind of the first and easiest way to wrap our heads around this and to make sure any of our setup works.

[00:02:10]
So we'll take a look at sentiment analysis, but then we get to kind of the popular kid in school, which is text generation. And I call this one out specifically because for a lot of us it is the one we think about first and foremost. But with some of these other ones that we'll look at after this one, you might think, that must be how ChatGPT or Gemini or Claude or DeepSeek or what have you work.

[00:02:37]
No, text generation is predominantly the one. It uses some interesting techniques in order to do that. And some of the other tools we're gonna see are not dissimilar, just approaching it from a different angle. Right, and I will say this again in a little bit, I'll say this a whole bunch of times today in fact.

[00:02:55]
So why not set expectations where they belong? When we think about AI in terms of ChatGPT and Gemini and Claude, it is a very sophisticated mathematical model that is just guessing what the next word probably is. So if you've ever lied to your parents, you know exactly how this works.

[00:03:15]
And so that is effectively how most of these models work. And they're incredibly sophisticated in how they do that. But at the end of the day, that's how it goes down. We'll see lots of different parameters in there, but there are a few interesting ones, like how many new tokens, that is, how much new text, do we want to generate.
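As a sketch of that knob, here is what limiting the amount of generated text looks like with a text-generation pipeline. The model name (distilgpt2) and the prompt are assumptions chosen for illustration, not taken from the lesson.

```python
from transformers import pipeline

# distilgpt2 is a small model that is quick to download for experiments.
generator = pipeline("text-generation", model="distilgpt2")

result = generator(
    "Open source models are interesting because",
    max_new_tokens=40,  # how much new text to generate beyond the prompt
)
print(result[0]["generated_text"])
```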

[00:03:33]
The really interesting one is this idea of temperature, which we'll play with numerous times. If the idea is that an LLM, or what have you, guesses what the next most likely word is, then always picking the exact next most likely word would get you the same output every time.

[00:04:01]
And occasionally maybe that's what you want. A lot of times you need a little bit of creativity or spiciness in there as well. And then from Hugging Face you can pull down legit models, depending on how much room you have on your computer, how much RAM, so on and so forth.
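Here is a hedged sketch of that trade-off: a very low temperature makes the sampling nearly deterministic, while a higher one lets the model take more chances. The specific values, the prompt, and distilgpt2 as the small test model are assumptions for illustration.

```python
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")
prompt = "The secret to a good cup of coffee is"

# Low temperature: the model almost always picks the most likely next token,
# so repeated runs look very similar.
safe = generator(prompt, do_sample=True, temperature=0.1, max_new_tokens=30)

# Higher temperature: less likely tokens get a real chance, which adds
# the "creativity or spiciness" at the cost of some coherence.
spicy = generator(prompt, do_sample=True, temperature=1.5, max_new_tokens=30)

print(safe[0]["generated_text"])
print(spicy[0]["generated_text"])
```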

[00:04:25]
Ones that range from thousands of parameters up to 105 billion parameters, so on and so forth. And so with those open source models that you can pull down from Hugging Face, a lot of these concepts and tools that we see when you're running a full-on LLM are like knobs that you can tweak.

[00:04:42]
So I think it's kind of fun to learn exactly what those knobs mean, see them in small cases and realize that they apply to the bigger models. The only difference with stuff like the hosted ones, like ChatGPT, Gemini, and Claude, is that they have the knobs and you don't.

[00:04:57]
But those knobs exist in all of the other models as well. It's just that they're abstracted away from you on the hosted services. And then, yeah, with text generation you can generate multiple different responses and then pick which one you want. And so you could start to do interesting things here as you build up the building blocks, and we've only talked about two of the modalities so far.
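As a sketch of what having the knobs yourself looks like, here is a lower-level call where the same sampling parameters are passed explicitly to generate(). The specific model and parameter values are assumptions, not settings from the lesson.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Any small causal language model works here; distilgpt2 is just an example.
tokenizer = AutoTokenizer.from_pretrained("distilgpt2")
model = AutoModelForCausalLM.from_pretrained("distilgpt2")

inputs = tokenizer("Once upon a time", return_tensors="pt")
outputs = model.generate(
    **inputs,
    do_sample=True,          # sample instead of always taking the top token
    temperature=0.8,         # how adventurous the sampling is
    top_k=50,                # only consider the 50 most likely tokens
    top_p=0.95,              # ...or the smallest set covering 95% probability
    max_new_tokens=30,       # how much new text to generate
    num_return_sequences=3,  # generate several candidates in one call
)
for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))
```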

[00:05:22]
But you could see, like, hey, I want to maybe generate five different strings of text, and then I'm going to send them through sentiment analysis to see which one is the least sassy, or the most positive is a better way to put it. So you begin to stack these things in interesting ways, and I would love your brain to start churning on some of these things now as we begin to see some of the rest.
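A minimal sketch of that stacking idea, assuming the same pipelines as above: generate a handful of candidates, score each with sentiment analysis, and keep the most positive one. The prompt and the number of candidates are illustrative.

```python
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")
classifier = pipeline("sentiment-analysis")

prompt = "Thanks for your email. My thoughts on the proposal are"

# Generate five candidate continuations in one call.
candidates = generator(
    prompt,
    do_sample=True,
    max_new_tokens=40,
    num_return_sequences=5,
)

# Score each candidate and keep the most positive (least sassy) one.
scored = []
for candidate in candidates:
    text = candidate["generated_text"]
    result = classifier(text)[0]
    positivity = result["score"] if result["label"] == "POSITIVE" else 1 - result["score"]
    scored.append((positivity, text))

best_score, best_text = max(scored, key=lambda pair: pair[0])
print(f"Most positive candidate ({best_score:.2f}):\n{best_text}")
```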

[00:05:44]
Because any given one, cool. Starting to mix and match them is where I think the creative parts come in and the ability to build new and interesting things.
