package ollama

import (
	"encoding/json"
	"fmt"
	"net/http"
)

// ListRunningModelsResponse mirrors the payload returned by the /ps endpoint,
// which lists the models currently loaded into memory.
type ListRunningModelsResponse struct {
	Models []struct {
		Name    string `json:"name"`
		Model   string `json:"model"`
		Size    int64  `json:"size"`
		Digest  string `json:"digest"`
		Details struct {
			Format            string   `json:"format"`
			Family            string   `json:"family"`
			Families          []string `json:"families"`
			ParameterSize     string   `json:"parameter_size"`
			QuantizationLevel string   `json:"quantization_level"`
		} `json:"details"`
		ExpiresAt string `json:"expires_at"`
		// size_vram is reported by the server as a number of bytes, so it
		// must be decoded into an integer type, not a string.
		SizeVram      int64 `json:"size_vram"`
		ContextLength int   `json:"context_length"`
	} `json:"models"`
}

// ListRunningModels calls the /ps endpoint and returns the decoded response,
// the HTTP status code (-1 if no response was received or the body could not
// be decoded), and any error.
func (o Ollama) ListRunningModels() (ListRunningModelsResponse, int, error) {
	req, err := http.NewRequest(http.MethodGet, fmt.Sprintf("%s/ps", o.baseUrl), nil)
	if err != nil {
		return ListRunningModelsResponse{}, -1, err
	}
	for key, val := range o.customHeaders {
		req.Header.Set(key, val)
	}
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return ListRunningModelsResponse{}, -1, err
	}
	defer resp.Body.Close()
	if resp.StatusCode != http.StatusOK {
		return ListRunningModelsResponse{}, resp.StatusCode, fmt.Errorf("unexpected status code %d", resp.StatusCode)
	}
	var respBody ListRunningModelsResponse
	if err := json.NewDecoder(resp.Body).Decode(&respBody); err != nil {
		return ListRunningModelsResponse{}, -1, err
	}
	return respBody, resp.StatusCode, nil
}