# Abso

Abso makes calling various LLMs simple, typed, and extensible. It provides a unified interface across providers while maintaining full type safety and streaming capabilities.

## Features

- 🔁 OpenAI-compatible API across multiple LLM providers
- 🚀 Lightweight & fast: minimal overhead
- 🔍 TypeScript-first: strongly typed methods, requests, and responses
- 📦 Streaming support via both events and async iteration
- 🛠️ Easy extensibility: add new providers with minimal fuss
- 🧮 Embeddings support for semantic search and text analysis
- 🔢 Tokenizer support for accurate token counting and cost estimation

## Providers

| Provider   | Chat | Streaming | Tool Calling | Embeddings | Tokenizer | Cost Calculation |
| ---------- | :--: | :-------: | :----------: | :--------: | :-------: | :--------------: |
| OpenAI     | ✅   | ✅        | ✅           | ✅         | 🚧        | 🚧               |
| Anthropic  | ✅   | ✅        | ✅           |            | 🚧        | 🚧               |
| xAI Grok   | ✅   | ✅        | ✅           |            | 🚧        | 🚧               |
| Mistral    | ✅   | ✅        | ✅           |            | 🚧        | 🚧               |
| Groq       | ✅   | ✅        | ✅           |            | 🚧        |                  |
| Ollama     | ✅   | ✅        | ✅           |            | 🚧        |                  |
| OpenRouter | ✅   | ✅        | ✅           |            | 🚧        |                  |
| Voyage     |      |           |              | ✅         |           |                  |
| Azure      | 🚧   | 🚧        | 🚧           | 🚧         | 🚧        |                  |
| Bedrock    | 🚧   | 🚧        | 🚧           | 🚧         | 🚧        |                  |
| Gemini     | ✅   | ✅        | ✅           |            |           |                  |

## Installation

```shell
npm install abso-ai
```

## Usage

```ts
import { abso } from "abso-ai";

const result = await abso.chat.create({
  messages: [{ role: "user", content: "Say this is a test" }],
  model: "gpt-4o",
});

console.log(result.choices[0].message.content);
```

## Manually selecting a provider

Abso tries to infer the best provider for a given model, but you can also select a provider manually.

```ts
const result = await abso.chat.create({
  messages: [{ role: "user", content: "Say this is a test" }],
  model: "openai/gpt-4o",
  provider: "openrouter",
});

console.log(result.choices[0].message.content);
```
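Conceptually, provider inference amounts to a lookup on the model name. The sketch below is illustrative only — the routing table and `inferProvider` function are assumptions for this example, not Abso's actual logic:

```ts
// Illustrative model-name routing table; Abso's real inference logic
// lives in the library source and may differ.
const MODEL_PREFIXES: Array<[prefix: string, provider: string]> = [
  ["gpt-", "openai"],
  ["claude-", "anthropic"],
  ["grok-", "xai"],
  ["mistral-", "mistral"],
];

// Return the first provider whose model prefix matches, if any.
function inferProvider(model: string): string | undefined {
  const match = MODEL_PREFIXES.find(([prefix]) => model.startsWith(prefix));
  return match?.[1];
}
```

Passing `provider` explicitly, as in the example above, bypasses any such inference.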

## Streaming

```ts
const stream = await abso.chat.stream({
  messages: [{ role: "user", content: "Say this is a test" }],
  model: "gpt-4o",
});

for await (const chunk of stream) {
  console.log(chunk);
}
```
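Since the API is OpenAI-compatible, each chunk should carry its text in `choices[0].delta.content`, and the full message can be rebuilt by concatenating the deltas. A minimal helper — the `ChatChunk` type here is a simplified assumption, not a type exported by Abso:

```ts
// Simplified chunk shape, assumed to mirror the OpenAI streaming format.
type ChatChunk = {
  choices: { delta: { content?: string } }[];
};

// Concatenate the text deltas of a stream into the final message.
function collectText(chunks: Iterable<ChatChunk>): string {
  let text = "";
  for (const chunk of chunks) {
    text += chunk.choices[0]?.delta.content ?? "";
  }
  return text;
}
```

In the `for await` loop above you would append each chunk's delta as it arrives instead of logging the raw chunk.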

## Tokenizers (soon)

```ts
const tokens = await abso.tokenize({
  messages: [{ role: "user", content: "Hello, world!" }],
  model: "gpt-4o",
});

console.log(`${tokens.count} tokens`);
```
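Until the tokenizer API lands, a crude estimate can tide you over. The ~4 characters per token figure below is a common rule of thumb for English text, not Abso's tokenizer, and real tokenizers differ per model:

```ts
// Very rough token estimate: ~4 characters per token is a common
// rule of thumb for English text. Model tokenizers will differ.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}
```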

## Custom Providers

```ts
import { Abso } from "abso-ai";
import { MyCustomProvider } from "./myCustomProvider";

const abso = new Abso([]);
abso.registerProvider(new MyCustomProvider(/* config */));

const result = await abso.chat.create({
  model: "my-custom-model",
  messages: [{ role: "user", content: "Hello!" }],
});
```
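The exact surface a provider must implement is defined by Abso's provider interface in the library source. The sketch below only illustrates the general shape; every name in it (`matchesModel`, `createCompletion`, the request and result types) is an assumption for this example, not Abso's actual API:

```ts
// Hypothetical request/result shapes, modeled on the OpenAI format.
interface ChatRequest {
  model: string;
  messages: { role: string; content: string }[];
}

interface ChatResult {
  choices: { message: { role: string; content: string } }[];
}

// Sketch of a custom provider: advertise which models it serves and
// answer chat requests. Method names here are illustrative guesses.
class MyCustomProvider {
  matchesModel(model: string): boolean {
    return model.startsWith("my-custom-");
  }

  async createCompletion(req: ChatRequest): Promise<ChatResult> {
    // A real provider would call its backend here; this stub echoes
    // the last user message back as the assistant reply.
    const last = req.messages[req.messages.length - 1];
    return {
      choices: [
        { message: { role: "assistant", content: `echo: ${last.content}` } },
      ],
    };
  }
}
```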

## Ollama

```ts
import { abso } from "abso-ai";

const result = await abso.chat.create({
  messages: [{ role: "user", content: "Hi, what's up?" }],
  model: "llama3.2",
  provider: "ollama",
});

console.log(result.choices[0].message.content);
```

## Contributing

See our Contributing Guide.

## Roadmap

- More providers
- Built-in caching
- Tokenizers
- Cost calculation