
import { Callout, Tabs } from "nextra/components";
import { snippets } from "@/lib/generated/snippets";
import { Snippet } from "@/components/code";
import PackageManagerInstall from "@/components/PackageManagerInstall";
import UniversalTabs from "@/components/UniversalTabs";
<Tabs items={["OpenAI", "Anthropic", "Groq", "Vercel AI SDK", "Ollama"]}>
<Tabs.Tab title="OpenAI">
OpenAI's [Chat Completions API](https://platform.openai.com/docs/guides/text-generation) provides access to GPT models for text generation, function calling, and structured outputs. It's the most widely adopted LLM API and supports streaming, tool use, and JSON mode.
<UniversalTabs items={["Python", "TypeScript", "Go", "Ruby"]} variant="hidden">
<Tabs.Tab title="Python">
<PackageManagerInstall packages={{ python: "openai" }} />
<Snippet src={snippets.python.guides.integrations.llm_openai.open_ai_usage} />
</Tabs.Tab>
<Tabs.Tab title="TypeScript">
<PackageManagerInstall packages={{ typescript: "openai" }} />
<Snippet src={snippets.typescript.guides.integrations.llm_openai.open_ai_usage} />
</Tabs.Tab>
<Tabs.Tab title="Go">
<PackageManagerInstall packages={{ go: "github.com/sashabaranov/go-openai" }} />
<Snippet src={snippets.go.guides.integrations.llm_openai.open_ai_usage} />
</Tabs.Tab>
<Tabs.Tab title="Ruby">
<PackageManagerInstall packages={{ ruby: "openai" }} />
<Snippet src={snippets.ruby.guides.integrations.llm_openai.open_ai_usage} />
</Tabs.Tab>
</UniversalTabs>
</Tabs.Tab>
<Tabs.Tab title="Anthropic">
Anthropic's [Messages API](https://docs.anthropic.com/en/docs/build-with-claude/text-generation) powers the Claude family of models, including Claude Sonnet and Claude Haiku. Claude excels at long-context reasoning, careful instruction following, and tool use with extended thinking support.
<UniversalTabs items={["Python", "TypeScript", "Go", "Ruby"]} variant="hidden">
<Tabs.Tab title="Python">
<PackageManagerInstall packages={{ python: "anthropic" }} />
<Snippet src={snippets.python.guides.integrations.llm_anthropic.anthropic_usage} />
</Tabs.Tab>
<Tabs.Tab title="TypeScript">
<PackageManagerInstall packages={{ typescript: "@anthropic-ai/sdk" }} />
<Snippet src={snippets.typescript.guides.integrations.llm_anthropic.anthropic_usage} />
</Tabs.Tab>
<Tabs.Tab title="Go">
<Callout type="info">
Anthropic ships an official Go SDK (`go get github.com/anthropics/anthropic-sdk-go`); wire `client.Messages.New()` into your complete function.
</Callout>
</Tabs.Tab>
<Tabs.Tab title="Ruby">
<Callout type="info">
Anthropic ships an official Ruby SDK (`bundle add anthropic`); wire the client into your complete function.
</Callout>
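If you'd rather stay on the standard library, the [Messages API](https://docs.anthropic.com/en/api/messages) is a single POST with `Net::HTTP`. A minimal sketch; the model id is a placeholder:

```ruby
require "json"
require "net/http"
require "uri"

# Build the Messages API payload (model id is a placeholder).
def anthropic_payload(model, prompt)
  {
    model: model,
    max_tokens: 1024,
    messages: [{ role: "user", content: prompt }]
  }
end

# Send one prompt and return the first text block of the reply.
def complete(prompt)
  uri = URI("https://api.anthropic.com/v1/messages")
  req = Net::HTTP::Post.new(uri)
  req["x-api-key"] = ENV.fetch("ANTHROPIC_API_KEY")
  req["anthropic-version"] = "2023-06-01"
  req["content-type"] = "application/json"
  req.body = anthropic_payload("claude-sonnet-4-20250514", prompt).to_json
  res = Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(req) }
  JSON.parse(res.body).dig("content", 0, "text")
end

puts complete("Say hello in one word.") if ENV["ANTHROPIC_API_KEY"]
```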
</Tabs.Tab>
</UniversalTabs>
</Tabs.Tab>
<Tabs.Tab title="Groq">
[Groq](https://console.groq.com/docs/overview) provides ultra-fast inference for open-source models such as Llama and Mixtral on its custom LPU hardware. Its OpenAI-compatible API makes it a drop-in replacement when you need low latency.
<UniversalTabs items={["Python", "TypeScript", "Go", "Ruby"]} variant="hidden">
<Tabs.Tab title="Python">
<PackageManagerInstall packages={{ python: "groq" }} />
<Snippet src={snippets.python.guides.integrations.llm_groq.groq_usage} />
</Tabs.Tab>
<Tabs.Tab title="TypeScript">
<PackageManagerInstall packages={{ typescript: "groq-sdk" }} />
<Snippet src={snippets.typescript.guides.integrations.llm_groq.groq_usage} />
</Tabs.Tab>
<Tabs.Tab title="Go">
<Callout type="info">
Groq has no official Go SDK; call the OpenAI-compatible endpoint `https://api.groq.com/openai/v1/chat/completions` with `net/http`. See [Groq docs](https://console.groq.com/docs).
</Callout>
</Tabs.Tab>
<Tabs.Tab title="Ruby">
<Callout type="info">
Groq Ruby: use the community `groq` gem (`bundle add groq`) or call the OpenAI-compatible endpoint with any HTTP client. See [Groq docs](https://console.groq.com/docs).
</Callout>
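The HTTP route needs only the standard library. A minimal sketch against the OpenAI-compatible endpoint; the model id is a placeholder:

```ruby
require "json"
require "net/http"
require "uri"

# Build an OpenAI-style chat completion payload (model id is a placeholder).
def groq_payload(model, prompt)
  { model: model, messages: [{ role: "user", content: prompt }] }
end

# Send one prompt and return the first choice's content.
def complete(prompt)
  uri = URI("https://api.groq.com/openai/v1/chat/completions")
  req = Net::HTTP::Post.new(uri)
  req["Authorization"] = "Bearer #{ENV.fetch('GROQ_API_KEY')}"
  req["Content-Type"] = "application/json"
  req.body = groq_payload("llama-3.3-70b-versatile", prompt).to_json
  res = Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(req) }
  JSON.parse(res.body).dig("choices", 0, "message", "content")
end

puts complete("Say hello in one word.") if ENV["GROQ_API_KEY"]
```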
</Tabs.Tab>
</UniversalTabs>
</Tabs.Tab>
<Tabs.Tab title="Vercel AI SDK">
The [Vercel AI SDK](https://sdk.vercel.ai/docs) is a TypeScript toolkit that provides a unified interface across providers (OpenAI, Anthropic, Google, and more). It includes helpers for streaming, tool calls, and structured object generation via `generateText` and `streamText`.
<UniversalTabs items={["Python", "TypeScript", "Go", "Ruby"]} variant="hidden">
<Tabs.Tab title="Python">
<Callout type="info">
The Vercel AI SDK is JavaScript/TypeScript only. In Python, use the OpenAI, Anthropic, or Groq SDK directly.
</Callout>
</Tabs.Tab>
<Tabs.Tab title="TypeScript">
<PackageManagerInstall packages={{ typescript: "ai @ai-sdk/openai" }} />
<Snippet src={snippets.typescript.guides.integrations.llm_vercel_ai_sdk.vercel_ai_sdk_usage} />
</Tabs.Tab>
<Tabs.Tab title="Go">
<Callout type="info">
The Vercel AI SDK is JavaScript/TypeScript only.
</Callout>
</Tabs.Tab>
<Tabs.Tab title="Ruby">
<Callout type="info">
The Vercel AI SDK is JavaScript/TypeScript only.
</Callout>
</Tabs.Tab>
</UniversalTabs>
</Tabs.Tab>
<Tabs.Tab title="Ollama">
[Ollama](https://ollama.com/) runs open-source models locally with no API key required. It supports Llama, Mistral, Gemma, and others through a simple REST API on `localhost:11434`. Ideal for development, air-gapped environments, or when you want full control over your model.
<UniversalTabs items={["Python", "TypeScript", "Go", "Ruby"]} variant="hidden">
<Tabs.Tab title="Python">
<PackageManagerInstall packages={{ python: "ollama" }} />
<Snippet src={snippets.python.guides.integrations.llm_ollama.ollama_usage} />
</Tabs.Tab>
<Tabs.Tab title="TypeScript">
<Callout type="info">
POST to `http://localhost:11434/api/chat` with `fetch`, setting `stream: false` to get a single JSON response. See [Ollama API](https://github.com/ollama/ollama/blob/main/docs/api.md).
</Callout>
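A minimal `fetch` sketch; the model name assumes you've already run `ollama pull llama3.2` (substitute any pulled model):

```typescript
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

// Build the /api/chat request body; stream: false returns one JSON object
// instead of newline-delimited chunks.
export function buildChatBody(model: string, messages: ChatMessage[]) {
  return { model, messages, stream: false };
}

// Send one prompt to the local Ollama server and return the reply text.
export async function complete(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify(
      buildChatBody("llama3.2", [{ role: "user", content: prompt }]),
    ),
  });
  if (!res.ok) throw new Error(`Ollama returned ${res.status}`);
  const data = await res.json();
  // Non-streaming responses carry a single assistant message.
  return data.message.content;
}

// Usage (requires a running Ollama server):
// const reply = await complete("Say hello in one word.");
```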
</Tabs.Tab>
<Tabs.Tab title="Go">
<Callout type="info">
POST to `http://localhost:11434/api/chat` with `net/http`, setting `stream: false` to get a single JSON response. See [Ollama API](https://github.com/ollama/ollama/blob/main/docs/api.md).
</Callout>
</Tabs.Tab>
<Tabs.Tab title="Ruby">
<Callout type="info">
POST to `http://localhost:11434/api/chat` with `Net::HTTP`, setting `stream: false` to get a single JSON response. See [Ollama API](https://github.com/ollama/ollama/blob/main/docs/api.md).
</Callout>
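A minimal standard-library sketch; the model name assumes you've already run `ollama pull llama3.2`:

```ruby
require "json"
require "net/http"
require "uri"

# Build the /api/chat payload; stream: false returns a single JSON object.
def chat_payload(model, prompt)
  { model: model, messages: [{ role: "user", content: prompt }], stream: false }
end

# Send one prompt to the local Ollama server and return the reply text.
def complete(prompt)
  uri = URI("http://localhost:11434/api/chat")
  res = Net::HTTP.post(uri, chat_payload("llama3.2", prompt).to_json,
                       "Content-Type" => "application/json")
  JSON.parse(res.body).dig("message", "content")
end

# puts complete("Say hello in one word.")  # requires a running Ollama server
```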
</Tabs.Tab>
</UniversalTabs>
</Tabs.Tab>
</Tabs>