Worksona.js: An Embeddable Agent Library for the JavaScript Ecosystem
#worksona#portfolio#javascript#sdk#npm#ai-agents
David Olsson

Most AI SDKs arrive as heavyweight installations with multi-package dependency trees and framework opinions baked in. Worksona.js takes the opposite position: the entire agent runtime ships as a single minifiable JavaScript file, installable in one command or droppable into a <script> tag from a CDN.
What It Is
Worksona.js (npm i worksona-js) is the distributable form of the Worksona agent model. It is a multi-provider agent library supporting Anthropic (Claude Opus 4.5, Sonnet 4.5), OpenAI (GPT-5, GPT-4o, o3, GPT Image 1.5), and Google (Gemini Pro, Gemini Flash) from a single unified API surface.
The library is dual-mode. As an embedded library, it runs directly in browsers and Node.js with no mandatory dependencies. As a REST server (v0.3.0), it exposes 32+ HTTP endpoints for agent management, document processing, image pipelines, text-to-speech, and web scraping — all secured with rate limiting and HTTP security headers.
Agent personality is configurable along four dimensions: formality, technicality, verbosity, and empathy. Agents hold conversation history, emit events, and support multi-agent chaining where the output of one agent feeds as input to the next.
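The shape of that configuration can be sketched in plain JavaScript. Note that the names here (`makeAgent`, `chain`, the `traits` object) are illustrative assumptions, not the actual Worksona.js API; the mock `respond` functions stand in for real LLM calls.

```javascript
// Illustrative sketch only -- not the Worksona.js API.
// An agent carries the four personality dimensions plus its own history.
function makeAgent(name, traits, respond) {
  return { name, traits, history: [], respond };
}

// Multi-agent chaining: the output of one agent feeds the next.
function chain(agents, input) {
  return agents.reduce((text, agent) => {
    const reply = agent.respond(text);
    agent.history.push({ in: text, out: reply }); // agents hold conversation history
    return reply;
  }, input);
}

// Mock agents; a real agent's respond() would call a provider.
const summarizer = makeAgent(
  "summarizer",
  { formality: 0.7, technicality: 0.9, verbosity: 0.2, empathy: 0.3 },
  (text) => text.split(" ").slice(0, 3).join(" ")
);
const shouter = makeAgent(
  "shouter",
  { formality: 0.1, technicality: 0.1, verbosity: 0.6, empathy: 0.8 },
  (text) => text.toUpperCase()
);

const result = chain([summarizer, shouter], "agents can be chained together easily");
// result === "AGENTS CAN BE"
```

The point of the sketch is the data flow: each link in the chain sees only the previous link's output, while every agent retains its own record of the exchange.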
Why It Matters
Three properties make Worksona.js useful in contexts where most agent libraries are not.
First, CDN deliverability. Because the library is a single file with zero runtime dependencies, it integrates into environments that cannot run a full build pipeline — static sites, embedded widgets, rapid prototypes, legacy frontend projects. The agent model reaches those environments without modification.
Second, frontier model parity. The library targets the latest model versions across all three major providers. This is a deliberate choice: Worksona.js functions as a capability explorer, tracking new model releases rather than pinning to stable versions. Teams evaluating new models can swap providers with a configuration change.
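A minimal sketch of what "swap providers with a configuration change" looks like in practice. The endpoint URLs, model identifiers, and the `resolveProvider` helper are assumptions for illustration, not Worksona.js internals.

```javascript
// Sketch: one config field selects the provider; everything else is unchanged.
// URLs and default model names are illustrative assumptions.
const PROVIDERS = {
  anthropic: { endpoint: "https://api.anthropic.com/v1/messages", model: "claude-opus-4-5" },
  openai:    { endpoint: "https://api.openai.com/v1/chat/completions", model: "gpt-5" },
  google:    { endpoint: "https://generativelanguage.googleapis.com/v1beta", model: "gemini-pro" },
};

function resolveProvider(config) {
  const entry = PROVIDERS[config.provider];
  if (!entry) throw new Error(`Unknown provider: ${config.provider}`);
  // A caller-supplied model overrides the provider's default.
  return { ...entry, model: config.model ?? entry.model };
}

const route = resolveProvider({ provider: "openai" });
// route.model === "gpt-5"; switching to Anthropic is a one-field change:
const alt = resolveProvider({ provider: "anthropic" });
```

An evaluation team can hold prompts, agents, and plumbing constant while varying only this one field, which is what makes the library usable as a capability explorer.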
The event-driven architecture is the third distinction. Agents are event emitters — callers subscribe to message, tool_call, and error events rather than awaiting a promise. This enables reactive integration patterns: a downstream system can respond to agent activity without polling or managing async chains manually.
How It Works
```mermaid
flowchart TD
    A[Caller Code] -->|npm / CDN| B[worksona.js]
    B --> C{Agent Instance}
    C -->|emit: message| D[Event Listener]
    C -->|emit: tool_call| E[Tool Handler]
    C --> F{Provider Router}
    F -->|Anthropic API| G[Claude Opus/Sonnet]
    F -->|OpenAI API| H[GPT-5 / o3]
    F -->|Google API| I[Gemini Pro/Flash]
    B --> J[REST Server Mode]
    J --> K[32+ Endpoints]
    K --> L["Document Processing<br/>PDF / DOCX / XLSX"]
    K --> M["Image Pipeline<br/>DALL-E / GPT Image"]
    K --> N["Text-to-Speech<br/>Web Scraping"]
```
In library mode, the caller creates a Worksona instance, configures agents via JSON, and subscribes to events. The library handles prompt assembly, provider-specific API formatting, and conversation history management internally.
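The conversation-history half of that work can be sketched as follows. The `Conversation` class and its method names are assumptions made for illustration; the real library's internal structures may look quite different.

```javascript
// Sketch: per-agent history kept internally and assembled into a
// provider-ready message array on demand. Illustrative only.
class Conversation {
  constructor(systemPrompt) {
    this.system = systemPrompt;
    this.turns = [];
  }
  addUser(content) { this.turns.push({ role: "user", content }); }
  addAssistant(content) { this.turns.push({ role: "assistant", content }); }
  // Prompt assembly: system prompt first, then the full turn history.
  toMessages() {
    return [{ role: "system", content: this.system }, ...this.turns];
  }
}

const convo = new Conversation("You are a concise analyst.");
convo.addUser("Summarize Q3.");
convo.addAssistant("Revenue up 12%.");
convo.addUser("And Q4?");
// convo.toMessages() yields a 4-element array ready to send to a provider.
```

The caller never touches this array directly; the library rebuilds it on every turn and reformats it per provider, which is why switching providers does not disturb an in-flight conversation.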
In REST server mode, the same agent model is exposed over HTTP. Document processing endpoints accept a file and return LLM-ready text in a single call — PDF, DOCX, XLSX, and CSV parsing are native to the server, not external preprocessing steps. The image pipeline connects DALL-E and GPT Image generation, analysis, and editing through the same endpoint structure.
A real-time control panel is available in development mode for monitoring agent metrics, transactions, and live configuration.
Where It Fits in Worksona
Worksona.js is the portfolio's external distribution channel. Where worksona-api is a self-hosted backend runtime and worksona-mcp-server targets Claude Desktop integration, Worksona.js is designed to be dropped into any JavaScript project by any developer who finds it on npm.
The library is the underlying runtime for the Worksona playground and related demo projects. Its npm publication makes the Worksona agent model a reusable dependency — something that can appear in a package.json rather than requiring a cloned repository.
The single-file constraint is not a technical limitation. It is an architectural commitment: the agent model should be deliverable anywhere JavaScript runs, without negotiation.