Go + Ollama: Build Your RAG Application Locally
This article walks you through setting up and interacting with a local large language model (LLM) using Golang, as a foundation for building AI applications locally.
Setting Up a Local LLM
First, we need to get an LLM running on the local machine. For this we will use Ollama (available on GitHub: ollama[1]). There are several options for loading models locally, but I have found Ollama to be one of the easiest to use. Ollama supports macOS, Linux, and Windows. The code in this article has not been tested on Windows, so I will focus on macOS and Linux here, but the process on Windows should be similar.
Download and install Ollama from the official website.
Next, open a terminal and pick a model to run. The simplest approach is to choose one from the Ollama model library; you can also install other models from sources such as Hugging Face. Once you have chosen a model, run one of the following commands in the terminal:
ollama pull <model name>
or
ollama run <model name>
With that, the LLM is running on your local machine.
Interacting with the LLM from Go
Assuming you already have a Go environment set up (if not, see the official Go installation instructions), we can start writing code.
1. Initialize the project
mkdir go-ollama
cd go-ollama
go mod init go-ollama
2. Write the code
Open a new file main.go in your favorite editor. First, define the chat request structs according to the Ollama API documentation:
package main
type Request struct {
	Model    string    `json:"model"`
	Messages []Message `json:"messages"`
	Stream   bool      `json:"stream"`
}

type Message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}
The fields mean the following:

- Model: the model downloaded with Ollama; here, llama3.1.
- Stream: decides whether to receive a continuous stream of responses (set to true) or a single response (set to false). We will use false in this example.
- The Message struct: holds the question to send to the AI.
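To see what these structs produce on the wire, here is a minimal sketch that marshals a request and prints the JSON payload. The structs are repeated so the snippet compiles on its own; the payloadJSON helper is our own name for illustration, not part of the Ollama API:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Request and Message mirror the structs defined above.
type Request struct {
	Model    string    `json:"model"`
	Messages []Message `json:"messages"`
	Stream   bool      `json:"stream"`
}

type Message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

// payloadJSON renders the request exactly as it will be POSTed to /api/chat.
func payloadJSON(req Request) string {
	js, _ := json.Marshal(req)
	return string(js)
}

func main() {
	req := Request{
		Model:    "llama3.1",
		Stream:   false,
		Messages: []Message{{Role: "user", Content: "Why is the sky blue?"}},
	}
	fmt.Println(payloadJSON(req))
	// → {"model":"llama3.1","messages":[{"role":"user","content":"Why is the sky blue?"}],"stream":false}
}
```

The json struct tags are what map Go's exported field names onto the lowercase keys the API expects.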
Now construct the request in main:

package main

type Request struct {
	Model    string    `json:"model"`
	Messages []Message `json:"messages"`
	Stream   bool      `json:"stream"`
}

type Message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

func main() {
	msg := Message{
		Role:    "user",
		Content: "Why is the sky blue?",
	}
	req := Request{
		Model:    "llama3.1",
		Stream:   false,
		Messages: []Message{msg},
	}
	_ = req // the request is sent in the next step
}
3. Send the request and receive the reply
Next, let's send this request to Ollama and read the response. Here is the response struct, based on the documentation:
package main

import "time"

type Response struct {
	Model              string    `json:"model"`
	CreatedAt          time.Time `json:"created_at"`
	Message            Message   `json:"message"`
	Done               bool      `json:"done"`
	TotalDuration      int64     `json:"total_duration"`
	LoadDuration       int64     `json:"load_duration"`
	PromptEvalCount    int       `json:"prompt_eval_count"`
	PromptEvalDuration int64     `json:"prompt_eval_duration"`
	EvalCount          int       `json:"eval_count"`
	EvalDuration       int64     `json:"eval_duration"`
}
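To see how this struct maps onto the wire format, here is a small sketch that decodes a sample reply. The JSON values are illustrative, not real model output, and parseResponse is our own helper name; only the shape follows the documentation:

```go
package main

import (
	"encoding/json"
	"fmt"
	"time"
)

// Message and a trimmed Response mirror the structs above,
// repeated so this snippet compiles on its own.
type Message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

type Response struct {
	Model         string    `json:"model"`
	CreatedAt     time.Time `json:"created_at"`
	Message       Message   `json:"message"`
	Done          bool      `json:"done"`
	TotalDuration int64     `json:"total_duration"`
}

// parseResponse decodes a raw /api/chat reply into the Response struct.
func parseResponse(raw []byte) (Response, error) {
	var r Response
	err := json.Unmarshal(raw, &r)
	return r, err
}

func main() {
	raw := []byte(`{"model":"llama3.1","created_at":"2024-07-22T20:33:28.123Z",` +
		`"message":{"role":"assistant","content":"The sky is blue because..."},` +
		`"done":true,"total_duration":5191566416}`)
	resp, err := parseResponse(raw)
	if err != nil {
		panic(err)
	}
	fmt.Println(resp.Message.Content) // → The sky is blue because...
}
```

Note that created_at is an RFC 3339 timestamp, which encoding/json parses into time.Time automatically.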
Let's create a simple HTTP client using Go's standard library:
func talkToOllama(url string, ollamaReq Request) (*Response, error) {
	js, err := json.Marshal(&ollamaReq)
	if err != nil {
		return nil, err
	}
	client := http.Client{}
	httpReq, err := http.NewRequest(http.MethodPost, url, bytes.NewReader(js))
	if err != nil {
		return nil, err
	}
	httpReq.Header.Set("Content-Type", "application/json")
	httpResp, err := client.Do(httpReq)
	if err != nil {
		return nil, err
	}
	defer httpResp.Body.Close()
	ollamaResp := Response{}
	err = json.NewDecoder(httpResp.Body).Decode(&ollamaResp)
	return &ollamaResp, err
}
- talkToOllama: takes the Ollama API URL and the request struct.
- JSON marshaling: converts the request struct into JSON.
- HTTP request: creates and sends a POST request carrying the JSON payload.
- Response handling: decodes the JSON into the Response struct and returns it.
4. Run the program
Here is the complete code:
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"os"
	"time"
)

type Request struct {
	Model    string    `json:"model"`
	Messages []Message `json:"messages"`
	Stream   bool      `json:"stream"`
}

type Message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

type Response struct {
	Model              string    `json:"model"`
	CreatedAt          time.Time `json:"created_at"`
	Message            Message   `json:"message"`
	Done               bool      `json:"done"`
	TotalDuration      int64     `json:"total_duration"`
	LoadDuration       int64     `json:"load_duration"`
	PromptEvalCount    int       `json:"prompt_eval_count"`
	PromptEvalDuration int64     `json:"prompt_eval_duration"`
	EvalCount          int       `json:"eval_count"`
	EvalDuration       int64     `json:"eval_duration"`
}

const defaultOllamaURL = "http://localhost:11434/api/chat"

func main() {
	start := time.Now()
	msg := Message{
		Role:    "user",
		Content: "Why is the sky blue?",
	}
	req := Request{
		Model:    "llama3.1",
		Stream:   false,
		Messages: []Message{msg},
	}
	resp, err := talkToOllama(defaultOllamaURL, req)
	if err != nil {
		fmt.Println(err)
		os.Exit(1)
	}
	fmt.Println(resp.Message.Content)
	fmt.Printf("Completed in %v\n", time.Since(start))
}

func talkToOllama(url string, ollamaReq Request) (*Response, error) {
	js, err := json.Marshal(&ollamaReq)
	if err != nil {
		return nil, err
	}
	client := http.Client{}
	httpReq, err := http.NewRequest(http.MethodPost, url, bytes.NewReader(js))
	if err != nil {
		return nil, err
	}
	httpReq.Header.Set("Content-Type", "application/json")
	httpResp, err := client.Do(httpReq)
	if err != nil {
		return nil, err
	}
	defer httpResp.Body.Close()
	ollamaResp := Response{}
	err = json.NewDecoder(httpResp.Body).Decode(&ollamaResp)
	return &ollamaResp, err
}
go run main.go
After running it, you should see a response similar to this:
The sky appears blue to us because of a phenomenon called scattering, which occurs when sunlight interacts with the tiny molecules of gases in the atmosphere. Here's a simplified explanation:
1. **Sunlight enters the Earth's atmosphere**: When sunlight enters our atmosphere, it consists of a broad spectrum of electromagnetic radiation, including all the colors of the visible light (red, orange, yellow, green, blue, indigo, and violet).
2. **Scattering occurs**: As sunlight travels through the atmosphere, it encounters tiny molecules of gases such as nitrogen (N2) and oxygen (O2). These molecules are much smaller than the wavelength of light, so they scatter the shorter (blue) wavelengths more efficiently than the longer (red) wavelengths.
3. **Blue light is scattered in all directions**: The scattering process favors blue light because it has a shorter wavelength, which allows it to be deflected by the gas molecules more easily. This scattered blue light reaches our eyes from all parts of the sky.
4. **Our eyes perceive the sky as blue**: Since we see the scattered blue light from every direction in the atmosphere, our brains interpret this as a blue color for the entire sky.
Other factors can affect the apparent color of the sky, such as:
* **Dust and pollutants**: Tiny particles in the air can scatter light in a way that adds a reddish tint to the sky.
* **Clouds**: When sunlight passes through water droplets or ice crystals in clouds, it scatters in all directions, giving the sky a white or gray appearance.
* **Time of day**: The angle of the sun changes throughout the day, which can alter the intensity and color of the scattered light. For example, during sunrise and sunset, the light has to travel through more of the Earth's atmosphere, scattering off more particles and making the sky appear redder.
In summary, the sky appears blue due to the scattering of sunlight by the tiny molecules in the atmosphere, which favors shorter wavelengths like blue light.
Completed in 38.315152042s
Conclusion
In this article, we set up a local LLM and queried it using nothing but Go's standard library. This is only a starting point: you are free to build on it. You could host Ollama on a different machine to make the code more production-ready, build logic chains for user interaction, create conversations, or develop RAG applications.
References
[1] ollama: https://github.com/ollama/ollama
This article was AMP-transcoded by Readfog; copyright belongs to the original author.
Source: https://mp.weixin.qq.com/s/TbVFaLmD_OlJh8KJm55N9A