
The "Transfer Learning Overview" Lesson is part of the full, Machine Learning in JavaScript with TensorFlow.js course featured in this preview video. Here's what you'd learn in this lesson:

Charlie introduces transfer learning, a technique that provides a pre-trained model with data for a new task. For example, an image classification model could be given a new set of image data to recognize a different set of objects. The Teachable Machine website is demonstrated, which allows users to create classes of data and train a model. Once the model is trained, it can be tested in the browser.


Transcript from the "Transfer Learning Overview" Lesson

[00:00:00]
>> All right, so for transfer learning, just a recap: we're still using a pre-trained model, but we're adding some samples to make it perform a new task. So in terms of, for example, using image detection models, we're able to add custom images. In the projects that we built in the first part, across the different tasks, we were relying on the classes that the model had been trained with, so it recognized a person or a cat pretty well.

[00:00:29]
But, as I was telling you before with my risotto story, it did not recognize the picture of the risotto. With transfer learning, we can still use the image recognition model, and I can add pictures of risotto, if I want, with the label risotto. And then it will be able to recognize risotto, or anything else that you want to predict as well.

[00:00:50]
So it's a pretty quick way to leverage the work that has already been done and add your own samples, and you can do that in the browser. So you can build experiments where people can add their own images and have that kind of functionality as well.

[00:01:08]
So I'll talk a little bit about the quick start, how to do it quickly. There's an easy way and a more difficult way, and I'm gonna try to do both as part of this second section. But in this section about transfer learning, I'm going to introduce some terminology that I'm gonna try to explain as best I can.

[00:01:30]
Also, it's probably gonna make a bit more sense once we actually write the code, as part of the more complicated way of doing this. So you might have heard of epochs, or maybe not. An epoch is basically an iteration: when you're training a machine learning model, you don't just give it the data once and then it's done and you have predictions.

[00:01:49]
A machine learning model, or at least a neural network, has different layers, and you can pass the number of iterations that you want the model to go through, so it kind of refines its predictions over time. So if I say the word epoch, you should think about iterations, but we'll see that when we write some code later on.
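
As a rough, hypothetical sketch of where that number goes in TensorFlow.js (not the exact code from this lesson), the epochs option is passed to the training call; the tiny model and random data below are just placeholders:

```js
import * as tf from '@tensorflow/tfjs';

// Placeholder model: 4 input features, 3 output classes.
const model = tf.sequential();
model.add(tf.layers.dense({ inputShape: [4], units: 8, activation: 'relu' }));
model.add(tf.layers.dense({ units: 3, activation: 'softmax' }));
model.compile({ optimizer: 'adam', loss: 'categoricalCrossentropy' });

// Made-up training data: 6 samples with 4 features each, and one-hot labels.
const xs = tf.randomNormal([6, 4]);
const ys = tf.oneHot(tf.tensor1d([0, 1, 2, 0, 1, 2], 'int32'), 3);

// `epochs` is the number of full passes (iterations) over the training data.
await model.fit(xs, ys, { epochs: 10 });
```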

[00:02:09]
You have the notion of an activation function, which is something added to a neural network to help the network learn complex patterns in the data. There are different types of activation functions. I don't necessarily know all of them, so we'll see the one that we use later on as well.
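
Just as a hedged illustration, this is where an activation function usually shows up when you define layers in TensorFlow.js; 'relu' and 'softmax' are common choices, not necessarily the ones used later in the course:

```js
import * as tf from '@tensorflow/tfjs';

const model = tf.sequential();
// The activation function is attached to a layer; ReLU is one option that
// helps the network learn non-linear, more complex patterns in the data.
model.add(tf.layers.dense({ inputShape: [4], units: 16, activation: 'relu' }));
// Softmax is often used on the last layer to turn outputs into class probabilities.
model.add(tf.layers.dense({ units: 3, activation: 'softmax' }));
```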

[00:02:25]
And in terms of batch size, it's the number of training examples that are used at each training step. So if you record 10 or 20 new images, your batch size could be that whole set, if you want to feed all of them at every step and reuse the same images each time.

[00:02:42]
But what you probably want is to split your dataset into different chunks; the batch size would be the size of one chunk, and you just pass them along as you go. The learning rate is a hyperparameter, and that's kind of a new word I didn't introduce.
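
As another hedged sketch with made-up numbers: in TensorFlow.js the batch size is just another option on the training call, and it's what splits the data into those chunks. Here, 40 placeholder samples with a batch size of 8 means 5 batches per epoch:

```js
import * as tf from '@tensorflow/tfjs';

const model = tf.sequential();
model.add(tf.layers.dense({ inputShape: [4], units: 8, activation: 'relu' }));
model.add(tf.layers.dense({ units: 2, activation: 'softmax' }));
model.compile({ optimizer: 'adam', loss: 'categoricalCrossentropy' });

// 40 made-up samples; a batch size of 8 splits them into 5 chunks per epoch.
const xs = tf.randomNormal([40, 4]);
const ys = tf.oneHot(tf.randomUniform([40], 0, 2, 'int32'), 2);

await model.fit(xs, ys, { epochs: 5, batchSize: 8 });
```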

[00:02:55]
A hyperparameter is something that you can tune when you use the model. There's no set rule where, if you use 0.1, then it will always work or something like that. It's what we usually call hyperparameter tuning, where you start with a certain value and then you see the output that you get.

[00:03:12]
And you kind of tune that and see how it affects the accuracy of your model. So there's no real way to know the best value in advance; it's something that you'll tune. And we can see that later in the examples, where we can try to tweak it and see how it affects the accuracy of our model.

[00:03:28]
But basically, the learning rate is how much the model changes each time the weights are updated. And if you remember what I mentioned about the weights before, they represent how important certain attributes of the data are. So the learning rate controls how big each of those weight updates is as the model learns which attributes matter more.
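
Purely as a hypothetical illustration of that tuning idea (the values and the helper below are made up), the learning rate is usually passed when you create the optimizer, and you compare a few values by looking at the resulting accuracy:

```js
import * as tf from '@tensorflow/tfjs';

// Hypothetical helper: build the same small model for each trial run.
function buildModel(learningRate) {
  const model = tf.sequential();
  model.add(tf.layers.dense({ inputShape: [4], units: 8, activation: 'relu' }));
  model.add(tf.layers.dense({ units: 2, activation: 'softmax' }));
  model.compile({
    optimizer: tf.train.sgd(learningRate), // how big each weight update is
    loss: 'categoricalCrossentropy',
    metrics: ['accuracy'],
  });
  return model;
}

const xs = tf.randomNormal([40, 4]);
const ys = tf.oneHot(tf.randomUniform([40], 0, 2, 'int32'), 2);

// There's no value that's guaranteed to work, so you try a few and compare.
for (const lr of [0.1, 0.01, 0.001]) {
  const history = await buildModel(lr).fit(xs, ys, { epochs: 5, batchSize: 8 });
  // Depending on the tfjs version the metric key is 'acc' or 'accuracy'.
  console.log(`learning rate ${lr}:`, history.history.acc ?? history.history.accuracy);
}
```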

[00:03:56]
So in terms of a tensor, well, it's part of TensorFlow and TensorFlow.js: it's a mathematical object that describes physical properties. You could think about it as a data type or a data structure that's more specific to working with machine learning. And then an optimizer, we'll see that again when we write code, but it's a function that adjusts different attributes of the neural network, such as the weights and the learning rate.
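
A small hedged sketch of those last two terms, with placeholder numbers: a tensor is the data structure TensorFlow.js passes around, and the optimizer (with its learning rate) is what you hand to the model when compiling it:

```js
import * as tf from '@tensorflow/tfjs';

// A tensor is essentially a typed, n-dimensional array: here a 2x3 matrix.
const t = tf.tensor2d([[1, 2, 3], [4, 5, 6]]);
console.log(t.shape); // [2, 3]

// The optimizer is the function that adjusts the weights during training;
// Adam is one common choice, and 0.01 is a placeholder learning rate.
const model = tf.sequential();
model.add(tf.layers.dense({ inputShape: [3], units: 1 }));
model.compile({ optimizer: tf.train.adam(0.01), loss: 'meanSquaredError' });
```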

[00:04:21]
So if right now these terms don't really make sense, that's normal; as long as you haven't written some code, they're kind of abstract. But I wanted to mention them to give you an idea of certain terms that you might come across, or might have heard already, when working with transfer learning or building your own model.

[00:04:41]
So we're starting with transfer learning because it's a nice intro towards building our custom model, which will be the last part of the workshop. But before we start writing some code, I just wanted to quickly show a tool that allows you to do transfer learning pretty fast.

[00:04:58]
It's also made by Google to work with TensorFlow.js. It's called Teachable Machine, and it allows you to train your own model in the browser on top of a pre-trained model. The one that we'll use is going to be the standard image model. And this UI here allows you to actually train more labels, or more classes, on top of your model.

[00:05:22]
So you don't need to do that now; we'll do it in a little bit, I'll just do a quick intro here. For example, if I create a class called tilt right, oops, tilt right, I'm gonna enable the webcam. And I'm gonna tilt right and hold to record it; it records a lot of data.

[00:05:41]
Oops, you can see my hand here. Anyway, so I have 39 sample images with the label tilt right, and then I'll do tilt left. And again, I do the same thing, and I tilt left. And you don't have to use exactly the same number of images, and then you can train it on the go.

[00:06:00]
All you need to do is sit back and not close your browser. It's basically retraining a new model on top of the image recognition model that's used in the background. So you have a little progress bar, and then right away you get it: tilt right, tilt left, tilt right, tilt left.

[00:06:15]
And now we have a model that's trained on our custom input. So we're going to see how to actually export that model and then implement it in a project. That first part is going to be shorter because all the training is done here. But then there's a second part where we're kind of going to build an interface ourselves, basically that same interface but not as pretty.

[00:06:36]
But really, instead of linking your users to Teachable Machine to export your model, you'd be doing all of that in your own application.
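
To give an idea of that "export and use it in your application" step, here's a hedged sketch using Google's @teachablemachine/image helper library; the model URL is a placeholder you'd get from Teachable Machine's export dialog, and the exact code used later in the course may differ:

```js
import * as tmImage from '@teachablemachine/image';

// Placeholder URL from Teachable Machine's "Export model" dialog.
const MODEL_URL = 'https://teachablemachine.withgoogle.com/models/<your-model-id>/';

async function classify(imageElement) {
  // Load the transfer-learned model plus its metadata (the class labels).
  const model = await tmImage.load(
    MODEL_URL + 'model.json',
    MODEL_URL + 'metadata.json'
  );

  // predict() returns one probability per class, e.g. "tilt right" / "tilt left".
  const predictions = await model.predict(imageElement);
  predictions.forEach(p => console.log(p.className, p.probability.toFixed(2)));
}

// Usage: classify(document.querySelector('img'));
```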
