{"id":3680,"date":"2024-08-29T09:46:51","date_gmt":"2024-08-29T14:46:51","guid":{"rendered":"https:\/\/frontendmasters.com\/blog\/?p=3680"},"modified":"2024-08-29T09:46:52","modified_gmt":"2024-08-29T14:46:52","slug":"ai-in-chrome","status":"publish","type":"post","link":"https:\/\/frontendmasters.com\/blog\/ai-in-chrome\/","title":{"rendered":"AI in Chrome"},"content":{"rendered":"\n<p><a href=\"https:\/\/developer.chrome.com\/docs\/ai\/built-in\">Chrome is experimentally shipping<\/a> with Gemini Nano, their smallest Large Language Model (LLM), baked right in, and offering APIs to use it.<\/p>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>In Chrome, these APIs are built to run inference against Gemini Nano with fine-tuning or an expert model. Designed to run locally on most modern devices, Gemini Nano is best for language-related use cases, such as summarization, rephrasing, or categorization.<\/p>\n<\/blockquote>\n\n\n\n<p>It&#8217;s mostly a regular API, with methods you call to get responses. 
<a href=\"https:\/\/www.raymondcamden.com\/2024\/08\/13\/a-quick-look-at-ai-in-chrome\">Raymond Camden had a look:<\/a><\/p>\n\n\n<pre class=\"wp-block-code\" aria-describedby=\"shcb-language-1\" data-shcb-language-name=\"JavaScript\" data-shcb-language-slug=\"javascript\"><span><code class=\"hljs language-javascript\"><span class=\"hljs-keyword\">const<\/span> model = <span class=\"hljs-keyword\">await<\/span> <span class=\"hljs-built_in\">window<\/span>.ai.createTextSession();\n<span class=\"hljs-keyword\">await<\/span> model.prompt(<span class=\"hljs-string\">\"Who are you?\"<\/span>);\n\n<span class=\"hljs-comment\">\/\/ I am a large language model, trained by Google.<\/span><\/code><\/span><small class=\"shcb-language\" id=\"shcb-language-1\"><span class=\"shcb-language__label\">Code language:<\/span> <span class=\"shcb-language__name\">JavaScript<\/span> <span class=\"shcb-language__paren\">(<\/span><span class=\"shcb-language__slug\">javascript<\/span><span class=\"shcb-language__paren\">)<\/span><\/small><\/pre>\n\n\n<p>Using AI in this way means 1) it&#8217;s fast (no network trip), 2) it works offline, 3) it&#8217;s private (maybe), and 4) it&#8217;s free to use.<\/p>\n\n\n\n<p>I admit that&#8217;s awfully compelling. I suspect this will ship for real and end up very heavily used.<\/p>\n\n\n\n<p>Don&#8217;t we need to think about standards here though? What if Apple ships <code>window.ai.instantiateIntelligence()<\/code> with an <code>.ask()<\/code> method? And Firefox ships <code>navigator.llm('dolly').enqueueQuery()<\/code>? I&#8217;d just like to remind everyone that when browsers just ship whatever and compete on proprietary features, everybody loses.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Chrome is experimentally shipping with Gemini Nano, their smallest Large Language Model (LLM), baked right in, and offering APIs to use it. In Chrome, these APIs are built to run inference against Gemini Nano with fine-tuning or an expert model. 
Designed to run locally on most modern devices, Gemini Nano is best for language-related use [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"sig_custom_text":"","sig_image_type":"featured-image","sig_custom_image":0,"sig_is_disabled":false,"inline_featured_image":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[29],"tags":[104,3,55],"class_list":["post-3680","post","type-post","status-publish","format-standard","hentry","category-the-beat","tag-ai","tag-javascript","tag-web-standards"],"acf":[],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/frontendmasters.com\/blog\/wp-json\/wp\/v2\/posts\/3680","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/frontendmasters.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/frontendmasters.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/frontendmasters.com\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/frontendmasters.com\/blog\/wp-json\/wp\/v2\/comments?post=3680"}],"version-history":[{"count":2,"href":"https:\/\/frontendmasters.com\/blog\/wp-json\/wp\/v2\/posts\/3680\/revisions"}],"predecessor-version":[{"id":3682,"href":"https:\/\/frontendmasters.com\/blog\/wp-json\/wp\/v2\/posts\/3680\/revisions\/3682"}],"wp:attachment":[{"href":"https:\/\/frontendmasters.com\/blog\/wp-json\/wp\/v2\/media?parent=3680"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/frontendmasters.com\/blog\/wp-json\/wp\/v2\/categories?post=3680"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/frontendmasters.com\/blog\/wp-json\/wp\/v2\/tags?post=3680"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}