
Commit 215908f

samchon and Copilot authored
feat(website): also introduce HttpLlm module. (#1791)
* feat(website): also introduce `HttpLlm` module.
* Potential fix for pull request finding

Co-authored-by: Copilot Autofix powered by AI <175728472+Copilot@users.noreply.github.com>
1 parent 6fe96b7 · commit 215908f

File tree

4 files changed: +429 −12 lines changed


README.md

Lines changed: 1 addition & 3 deletions

```diff
@@ -127,10 +127,8 @@ Check out the document in the [website](https://typia.io/docs/):
 - LLM Function Calling
   - [`application()` function](https://typia.io/docs/llm/application/)
   - [`structuredOutput()` function](https://typia.io/docs/llm/structuredOutput/)
+  - [`HttpLlm` module](https://typia.io/docs/llm/http/)
   - [`LlmJson` module](https://typia.io/docs/llm/json/)
-  - [MCP (Model Context Protocol)](https://typia.io/docs/llm/mcp/)
-  - [Vercel AI SDK](https://typia.io/docs/llm/vercel/)
-  - [LangChain](https://typia.io/docs/llm/langchain/)
 - Protocol Buffer
   - [Message Schema](https://typia.io/docs/protobuf/message)
   - [`decode()` functions](https://typia.io/docs/protobuf/decode/)
```

website/src/content/docs/llm/_meta.ts

Lines changed: 6 additions & 9 deletions

```diff
@@ -5,14 +5,11 @@ export default {
   structuredOutput: "structuredOutput() function",
   parameters: "parameters() function",
   schema: "schema() function",
+  http: "HttpLlm module",
   json: "LlmJson module",
-  mcp: "Model Context Protocol",
-  vercel: "Vercel AI SDK",
-  langchain: "LangChain",
-  chat: {
-    display: "hidden",
-  },
-  strategy: {
-    display: "hidden",
-  },
+  mcp: { display: "hidden" },
+  vercel: { display: "hidden" },
+  langchain: { display: "hidden" },
+  chat: { display: "hidden" },
+  strategy: { display: "hidden" },
 } satisfies MetaRecord;
```
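The `_meta.ts` diff above collapses the hidden entries to one-liners and keeps the `satisfies MetaRecord` clause. A minimal sketch of why `satisfies` is used here rather than a plain type annotation, using a simplified stand-in for `MetaRecord` (the real type comes from the site's docs framework, so this shape is an assumption):

```typescript
// Simplified stand-in for the framework's MetaRecord type (an assumption,
// not the real definition): each sidebar entry is either a label string
// or an options object such as { display: "hidden" }.
type MetaRecord = Record<string, string | { display: "hidden" }>;

const meta = {
  http: "HttpLlm module",
  json: "LlmJson module",
  mcp: { display: "hidden" },
} satisfies MetaRecord;

// `satisfies` type-checks the object against MetaRecord without widening
// it, so `meta.http` is still known to be a string here (a `const meta:
// MetaRecord = ...` annotation would widen every value to the union).
const label: string = meta.http;
console.log(label); // prints "HttpLlm module"
```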

website/src/content/docs/llm/application.mdx

Lines changed: 92 additions & 0 deletions

@@ -115,6 +115,98 @@

> [💻 Playground Link](/playground/?script=JYWwDg9gTgLgBAbzgSQDIBsQEExncAYwEMZgIA7OAXzgDMoIQ4AiAAQGciQCALCgeghgApuSJhgzANwAoUJFhwYATwlE6DJsxVrpMuHPDR4SAEIAjdlliF0wgMrCoAN0LDqGxiwB0-C1ZsCO0cXNz0DAgp2eHEwAC4UDGxcfGJSCjgAXiVVYCJvdExvWNSSMnIAHn9rUiCHJ1cCYQA+AAoASllI8nYIOwKIAHNW2M6gA)

All changed lines are additions, forming a new "Integrations" section inserted after the playground link:

## Integrations

`typia.llm.controller<Class>()` wraps a TypeScript class into an `ILlmController` that can be plugged into any supported framework. Every class method becomes a tool — JSDoc comments become descriptions, TypeScript types become JSON schemas, and validation feedback is embedded automatically.
<Tabs items={["Vercel AI SDK", "LangChain", "Model Context Protocol"]}>
  <Tabs.Tab>
```typescript filename="src/main.ts"
import { openai } from "@ai-sdk/openai";
import { toVercelTools } from "@typia/vercel";
import { generateText, Tool } from "ai";
import typia from "typia";

import { BbsArticleService } from "./BbsArticleService";

const tools: Record<string, Tool> = toVercelTools({
  controllers: [
    typia.llm.controller<BbsArticleService>(
      "bbs",
      new BbsArticleService(),
    ),
  ],
});

const result = await generateText({
  model: openai("gpt-4o"),
  tools,
  prompt: "I want to create a new article about TypeScript",
});
```
  </Tabs.Tab>
  <Tabs.Tab>
```typescript filename="src/main.ts"
import { ChainValues, Runnable } from "@langchain/core";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { DynamicStructuredTool } from "@langchain/core/tools";
import { ChatOpenAI } from "@langchain/openai";
import { toLangChainTools } from "@typia/langchain";
import { AgentExecutor, createToolCallingAgent } from "langchain/agents";
import typia from "typia";

import { BbsArticleService } from "./BbsArticleService";

const tools: DynamicStructuredTool[] = toLangChainTools({
  controllers: [
    typia.llm.controller<BbsArticleService>(
      "bbs",
      new BbsArticleService(),
    ),
  ],
});

const agent: Runnable = createToolCallingAgent({
  llm: new ChatOpenAI({ model: "gpt-4o" }),
  tools,
  prompt: ChatPromptTemplate.fromMessages([
    ["system", "You are a helpful assistant."],
    ["human", "{input}"],
    ["placeholder", "{agent_scratchpad}"],
  ]),
});
const executor: AgentExecutor = new AgentExecutor({ agent, tools });
const result: ChainValues = await executor.invoke({
  input: "I want to create a new article about TypeScript",
});
```
  </Tabs.Tab>
  <Tabs.Tab>
```typescript filename="src/main.ts"
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { registerMcpControllers } from "@typia/mcp";
import typia from "typia";

import { BbsArticleService } from "./BbsArticleService";

const server: McpServer = new McpServer({
  name: "my-server",
  version: "1.0.0",
});

registerMcpControllers({
  server,
  controllers: [
    typia.llm.controller<BbsArticleService>(
      "bbs",
      new BbsArticleService(),
    ),
  ],
});
```
  </Tabs.Tab>
</Tabs>
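All three framework examples import a `BbsArticleService` class that the diff does not include. As a rough sketch of the kind of plain class `typia.llm.controller<Class>()` expects — with method names, parameter shapes, and JSDoc all hypothetical, not taken from the typia repository:

```typescript
// Hypothetical controller class for illustration. typia.llm.controller
// would turn each public method into an LLM tool, using the JSDoc comment
// as the tool description and the parameter types as its JSON schema.
interface IBbsArticle {
  id: string;
  title: string;
  body: string;
}

class BbsArticleService {
  private articles: IBbsArticle[] = [];

  /**
   * Create a new article.
   *
   * @param input Title and body of the article to create
   * @returns The created article record
   */
  public create(input: { title: string; body: string }): IBbsArticle {
    const article: IBbsArticle = {
      id: String(this.articles.length + 1),
      title: input.title,
      body: input.body,
    };
    this.articles.push(article);
    return article;
  }

  /**
   * List every article created so far.
   */
  public index(): IBbsArticle[] {
    return this.articles;
  }
}

// The class also works as ordinary application code:
const service = new BbsArticleService();
service.create({ title: "TypeScript", body: "About TypeScript" });
console.log(service.index().length); // prints 1
```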
The hunk's trailing context is the existing section that follows:

## Lenient JSON Parsing

<Tabs items={[
