# Abso

TypeScript LLM client

Abso provides a unified interface for calling various LLMs while maintaining full type safety.

## Features

- OpenAI-compatible API 🔁
- Lightweight & fast ⚡
- Embeddings support 🧮
- Unified tool calling 🛠️
- Tokenizer and cost calculation (soon) 🔢 for accurate token counting and cost estimation
- Smart routing (soon) to the best model for your request

## Providers

| Provider   | Chat | Streaming | Tool Calling | Embeddings | Tokenizer | Cost Calculation |
| ---------- | :--: | :-------: | :----------: | :--------: | :-------: | :--------------: |
| OpenAI     | ✅   | ✅        | ✅           | ✅         | 🚧        | 🚧               |
| Anthropic  | ✅   | ✅        | ✅           |            | 🚧        | 🚧               |
| xAI Grok   | ✅   | ✅        | ✅           |            | 🚧        | 🚧               |
| Mistral    | ✅   | ✅        | ✅           |            | 🚧        | 🚧               |
| Groq       | ✅   | ✅        | ✅           |            | 🚧        |                  |
| Ollama     | ✅   | ✅        | ✅           |            | 🚧        |                  |
| OpenRouter | ✅   | ✅        | ✅           |            | 🚧        |                  |
| Voyage     |      |           |              | ✅         |           |                  |
| Azure      | 🚧   | 🚧        | 🚧           | 🚧         | 🚧        |                  |
| Bedrock    | 🚧   | 🚧        | 🚧           | 🚧         | 🚧        |                  |
| Gemini     | ✅   | ✅        | ✅           |            |           |                  |
| DeepSeek   | ✅   | ✅        | ✅           |            |           |                  |

## Installation

```bash
npm install abso-ai
```

## Usage

```ts
import { abso } from "abso-ai"

const result = await abso.chat.create({
  messages: [{ role: "user", content: "Say this is a test" }],
  model: "gpt-4o",
})

console.log(result.choices[0].message.content)
```

## Manually selecting a provider

Abso tries to infer the best provider for a given model, but you can also manually select a provider.

```ts
const result = await abso.chat.create({
  messages: [{ role: "user", content: "Say this is a test" }],
  model: "openai/gpt-4o",
  provider: "openrouter",
})

console.log(result.choices[0].message.content)
```
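Because the API is OpenAI-compatible, unified tool calling presumably follows the OpenAI function-calling format. A sketch of a tool definition under that assumption (the `get_weather` tool is hypothetical, for illustration only):

```typescript
// An OpenAI-style tool definition (assumed shape, based on the OpenAI-compatible API).
const tools = [
  {
    type: "function",
    function: {
      name: "get_weather",
      description: "Get the current weather for a city",
      parameters: {
        type: "object",
        properties: {
          city: { type: "string", description: "City name" },
        },
        required: ["city"],
      },
    },
  },
]

// Passed alongside messages, e.g.:
// const result = await abso.chat.create({ model: "gpt-4o", messages, tools })
console.log(tools[0].function.name) // "get_weather"
```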

## Streaming

```ts
const stream = await abso.chat.stream({
  messages: [{ role: "user", content: "Say this is a test" }],
  model: "gpt-4o",
})

for await (const chunk of stream) {
  console.log(chunk)
}
```
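Assuming chunks follow the OpenAI streaming delta shape (the `ChatChunk` type below is a guess, not Abso's real type), the full reply can be accumulated like this, shown here with a mock stream instead of a live API call:

```typescript
// Minimal chunk shape, assuming the OpenAI-compatible streaming delta format.
interface ChatChunk {
  choices: { delta: { content?: string } }[]
}

// Accumulate the assistant reply from a stream of chunks.
async function collectText(stream: AsyncIterable<ChatChunk>): Promise<string> {
  let text = ""
  for await (const chunk of stream) {
    text += chunk.choices[0]?.delta.content ?? ""
  }
  return text
}

// Mock stream standing in for abso.chat.stream(...), so no API key is needed.
async function* mockStream(): AsyncIterable<ChatChunk> {
  yield { choices: [{ delta: { content: "Hello, " } }] }
  yield { choices: [{ delta: { content: "world!" } }] }
}

collectText(mockStream()).then((text) => console.log(text)) // "Hello, world!"
```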

## Embeddings

```ts
const embeddings = await abso.embed({
  model: "text-embedding-3-small",
  input: ["A cat was playing with a ball on the floor"],
})

console.log(embeddings.data[0].embedding)
```
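Embedding vectors are typically compared with cosine similarity. A small helper for that (not part of Abso itself) could look like:

```typescript
// Cosine similarity between two embedding vectors: dot(a, b) / (|a| * |b|).
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0
  let normA = 0
  let normB = 0
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i]
    normA += a[i] * a[i]
    normB += b[i] * b[i]
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB))
}

console.log(cosineSimilarity([1, 0], [1, 0])) // 1
console.log(cosineSimilarity([1, 0], [0, 1])) // 0
```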

## Tokenizers (soon)

```ts
const tokens = await abso.tokenize({
  messages: [{ role: "user", content: "Hello, world!" }],
  model: "gpt-4o",
})

console.log(`${tokens.count} tokens`)
```

## Custom Providers

```ts
import { Abso } from "abso-ai"
import { MyCustomProvider } from "./myCustomProvider"

const abso = new Abso([])
abso.registerProvider(new MyCustomProvider(/* config */))

const result = await abso.chat.create({
  model: "my-custom-model",
  messages: [{ role: "user", content: "Hello!" }],
})
```
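What `MyCustomProvider` might look like is sketched below. The interface here is a guess for illustration only; check the Abso source for the real provider contract, method names, and types.

```typescript
// Hypothetical request/result shapes, mirroring the OpenAI-compatible API.
interface ChatRequest {
  model: string
  messages: { role: string; content: string }[]
}

interface ChatResult {
  choices: { message: { role: string; content: string } }[]
}

// A provider sketch; the real Abso provider interface may differ.
class MyCustomProvider {
  name = "my-custom-provider"

  // Decide whether this provider handles a given model id.
  matchesModel(model: string): boolean {
    return model.startsWith("my-custom-")
  }

  // Forward the request to your own backend; here a canned echo for illustration.
  async createChat(request: ChatRequest): Promise<ChatResult> {
    const last = request.messages[request.messages.length - 1]
    return {
      choices: [{ message: { role: "assistant", content: `echo: ${last.content}` } }],
    }
  }
}

const provider = new MyCustomProvider()
provider
  .createChat({ model: "my-custom-model", messages: [{ role: "user", content: "Hello!" }] })
  .then((r) => console.log(r.choices[0].message.content)) // "echo: Hello!"
```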

## Ollama

```ts
import { abso } from "abso-ai"

const result = await abso.chat.create({
  messages: [{ role: "user", content: "Hi, what's up?" }],
  model: "llama3.2",
  provider: "ollama",
})

console.log(result.choices[0].message.content)
```

## Contributing

See our Contributing Guide.

## Roadmap

- More providers
- Built-in caching
- Tokenizers
- Cost calculation
- Smart routing