# Recipe: Figma design extraction

Pull frames from a Figma file, render them as PNGs, and feed them to a vision-capable model for design QA.
```typescript
import { createRuntime } from '@agentskit/runtime'
import { anthropic } from '@agentskit/adapters'
import { figma } from '@agentskit/tools/integrations'
import { imagePart, textPart } from '@agentskit/core'

const tools = figma({
  apiToken: process.env.FIGMA_API_TOKEN!,
})

const runtime = createRuntime({
  adapter: anthropic({
    apiKey: process.env.ANTHROPIC_API_KEY!,
    model: 'claude-sonnet-4-6',
  }),
  tools,
  systemPrompt: `You QA Figma frames against our design system. For each frame:
- Verify spacing (8px grid).
- Verify typography roles match Tailwind tokens.
- Flag any free-form colors not in our palette.`,
})

const result = await runtime.run({
  role: 'user',
  content: [
    textPart('QA the latest frames in file abc123, frames "checkout-v2-*"'),
  ],
})
```

The runtime handles the Figma → PNG export step automatically (one of the integration's sub-tools). Vision-enabled adapters (`anthropic`, `openai`, `gemini`) consume the resulting `imagePart`s.
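If you already have a rendered PNG in hand (e.g. exported outside the runtime), you can build the multimodal turn yourself with `textPart` and `imagePart`. The part shapes below are local stand-ins written for illustration; the real helpers live in `@agentskit/core` and their exact output shape is an assumption here:

```typescript
// Hypothetical part shapes -- the real textPart/imagePart helpers come from
// @agentskit/core; these local versions only illustrate the message layout.
type TextPart = { type: 'text'; text: string }
type ImagePart = { type: 'image'; mediaType: string; data: string } // base64 payload

const textPart = (text: string): TextPart => ({ type: 'text', text })
const imagePart = (data: string, mediaType = 'image/png'): ImagePart => ({
  type: 'image',
  mediaType,
  data,
})

// One user turn mixing instructions with a rendered frame.
const turn = {
  role: 'user' as const,
  content: [
    textPart('QA this frame against the design system.'),
    imagePart('iVBORw0KGgo...'), // base64-encoded PNG (truncated placeholder)
  ],
}
```

The same `{ role, content: [...] }` shape is what `runtime.run` receives in the snippet above; the only difference is who produces the image part.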
## Output a checklist

Add instructions like these to the system prompt to coerce structured output:
````text
End your reply with a single fenced JSON block:

```json
{ "frames": [{ "id": "...", "issues": [{ "rule": "...", "severity": "low|med|high" }] }] }
```
````

Pair with `safeParseArgs` from `@agentskit/core` to parse and validate.
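Before validation, you still need to pull the fenced JSON block out of the reply. A minimal sketch of that extraction step, using plain `JSON.parse` as a stand-in (the docs point at `safeParseArgs` for the parse-and-validate step, but its signature isn't shown here, so this sketch doesn't assume it):

```typescript
// Assumed report shape, mirroring the JSON contract in the prompt above.
interface FrameIssue {
  rule: string
  severity: 'low' | 'med' | 'high'
}

interface ChecklistReport {
  frames: { id: string; issues: FrameIssue[] }[]
}

// Grab the last fenced JSON block in the reply and parse it; return null
// if no block is present or the JSON is malformed.
function extractChecklist(reply: string): ChecklistReport | null {
  const matches = [...reply.matchAll(/```json\s*([\s\S]*?)```/g)]
  if (matches.length === 0) return null
  try {
    return JSON.parse(matches[matches.length - 1][1]) as ChecklistReport
  } catch {
    return null
  }
}
```

Taking the last block (rather than the first) matches the prompt's "end your reply with" instruction, so any JSON the model quotes mid-answer is ignored.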
## Related

- Recipe: multi-modal → image input semantics across providers.
- Integration: the `figma` integration page.
- Recipe: schema-first agent → formalise the output contract.