@elbruno
Created April 30, 2024 16:29
skllama3rpi.cs
// Copyright (c) 2024
// Author : Bruno Capuano
// Change Log :
// - Sample console application that uses the llama3 LLM running on a Raspberry Pi with Semantic Kernel
//
// The MIT License (MIT)
//
// Permission is hereby granted, free of charge, to any person obtaining a copy
// of this software and associated documentation files (the "Software"), to deal
// in the Software without restriction, including without limitation the rights
// to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
// copies of the Software, and to permit persons to whom the Software is
// furnished to do so, subject to the following conditions:
//
// The above copyright notice and this permission notice shall be included in
// all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
// IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
// FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
// AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
// LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
// OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
// THE SOFTWARE.
#pragma warning disable SKEXP0001, SKEXP0003, SKEXP0010, SKEXP0011, SKEXP0050, SKEXP0052
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;

// Create a kernel that talks to the Ollama server running on the Raspberry Pi.
// The OpenAI connector is pointed at a custom HTTP endpoint; the model id must
// match the model served by Ollama, and a placeholder API key is supplied
// because the connector expects one.
var builder = Kernel.CreateBuilder();
builder.AddOpenAIChatCompletion("llama", endpoint: new Uri("http://rpillm05:11434"), "apikey");
var kernel = builder.Build();

// Send a simple prompt and print the model's reply.
string prompt = "Write a joke about kittens";
var response = await kernel.InvokePromptAsync(prompt);
Console.WriteLine("RPI Llama3 Response: " + response.GetValue<string>());
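// A possible extension (not part of the original gist): stream the reply as it is
// generated, by resolving the chat completion service from the same kernel. This is
// a sketch that assumes the Ollama endpoint registered above serves the
// OpenAI-compatible chat API used by the connector.
var chat = kernel.GetRequiredService<IChatCompletionService>();
var history = new ChatHistory();
history.AddSystemMessage("You are a friendly assistant running on a Raspberry Pi.");
history.AddUserMessage(prompt);

Console.Write("RPI Llama3 Streaming Response: ");
await foreach (var chunk in chat.GetStreamingChatMessageContentsAsync(history, kernel: kernel))
{
    // Each chunk carries a fragment of the generated text.
    Console.Write(chunk.Content);
}
Console.WriteLine();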