Vector Stream Systems

Solutions

MBSE, reimagined for agent-ready teams

Model-Based Systems Engineering is the practice of using formal, linked digital models as the primary artifacts for analysis and verification. VectorOWL makes those models machine-accessible to AI agents—without losing the rigor.

What is MBSE?

Replace documents with linked, queryable models

Traditional engineering relies on Word docs, spreadsheets, and slide decks to capture requirements, architecture, and verification. MBSE replaces those static artifacts with formal digital models—requirements, architecture, behavior, and interfaces—linked together in a graph you can query, diff, and validate.

The result: less ambiguity, better traceability, and a single source of truth that scales with system complexity.

Why MBSE matters

  • 01 Reduces ambiguity from narrative-only specifications
  • 02 Improves traceability from intent to evidence
  • 03 Scales better for aerospace, automotive, and defense
  • 04 Enables automated validation and impact analysis

Why VectorOWL?

AI-augmented systems engineering

Semantic retrieval

Traditional MBSE tools are purely symbolic. VectorOWL adds embeddings so you can query by meaning—"find wing designs similar to this one"—not just by ID.
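To make "query by meaning" concrete, here is a minimal sketch of embedding-based retrieval: elements are represented as vectors and ranked by cosine similarity to a query vector. The toy 3-dimensional vectors and element names are invented for illustration; real embeddings are high-dimensional outputs of a learned model, and VectorOWL's actual API may differ.

```python
import math

def cosine(a, b):
    # Cosine similarity: 1.0 = same direction, 0.0 = orthogonal.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy 3-d "embeddings" for three model elements (names are hypothetical).
designs = {
    "wing-a": [0.9, 0.1, 0.2],
    "wing-b": [0.8, 0.2, 0.3],
    "fuel-pump": [0.1, 0.9, 0.4],
}

# Embedding of the query "wing designs similar to this one".
query = [0.85, 0.15, 0.25]
ranked = sorted(designs, key=lambda k: cosine(query, designs[k]), reverse=True)
print(ranked)  # the two wings rank above the fuel pump
```

At scale, an exact scan like this is replaced by an approximate nearest-neighbor (ANN) index, which is what the comparison table below means by "Vector ANN over designs."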

Agent-ready context

Coding agents need structured context, not PDF exports. VectorOWL exposes the graph via Model Context Protocol so agents reason over the same model humans review.

Hard constraints

When AI suggests a design change, anchors verify it against safety margins and policy limits before it reaches production. No silent violations.
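A minimal sketch of the anchor idea: hard limits are evaluated against a proposed change, and any violation blocks the change. The anchor names, paths, and limits here are invented for illustration; VectorOWL's actual anchor definitions may look different.

```python
# Hypothetical anchors: hard constraints checked before a change can ship.
ANCHORS = [
    {"name": "max-skin-temp", "path": "thermal.skin_temp_c", "limit": 180.0},
    {"name": "max-von-mises", "path": "stress.von_mises_mpa", "limit": 450.0},
]

def get_path(design, dotted):
    # Walk a dotted path like "thermal.skin_temp_c" through nested dicts.
    for key in dotted.split("."):
        design = design[key]
    return design

def check_anchors(design):
    # Return violated anchor names; an empty list means the change may proceed.
    return [a["name"] for a in ANCHORS if get_path(design, a["path"]) > a["limit"]]

proposed = {"thermal": {"skin_temp_c": 172.5}, "stress": {"von_mises_mpa": 460.0}}
violations = check_anchors(proposed)
print(violations)  # stress bound exceeded, so this change is blocked
```

Wiring a check like this into CI is what the comparison table below calls "dynamic anchors + CI gates."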

Documents vs Models vs VectorOWL

| Capability | Documents | Traditional MBSE | VectorOWL |
| --- | --- | --- | --- |
| Traceability | Manual cross-references | Linked model elements | Queryable graph + vectors |
| AI readiness | None | Low (symbolic only) | Native (MCP + embeddings) |
| Validation | Manual review | Static constraints | Dynamic anchors + CI gates |
| Version control | SharePoint / email | Tool-specific exports | Git-native ontology diffs |
| Similarity search | None | None | Vector ANN over designs |

Install

Install VectorOWL

VectorOWL will be open source. The repository is currently private while we finalize the public release. GitHub Sponsors receive immediate repo access and can clone, build, and run locally today.

Docker (quickest — no build required)

Pulls the pre-built image radsilent/vectorowl:latest from Docker Hub. No Rust toolchain or source code needed. Includes a Caddy reverse proxy. Pick your platform:

macOS

curl -L https://github.com/radsilent/vectorowl-deploy/archive/main.tar.gz | tar xz
mv vectorowl-deploy-main vectorowl && cd vectorowl
cp .env.example .env
# Edit .env and set VECTOROWL_LICENSE_KEY
docker compose up -d

Requires Docker Desktop for Mac.

Windows

# PowerShell
Invoke-WebRequest -Uri https://github.com/radsilent/vectorowl-deploy/archive/main.tar.gz -OutFile vectorowl-deploy.tar.gz
tar -xzf vectorowl-deploy.tar.gz
Rename-Item vectorowl-deploy-main vectorowl
Set-Location vectorowl
Copy-Item .env.example .env
# Edit .env and set VECTOROWL_LICENSE_KEY
docker compose up -d

Requires Docker Desktop with the WSL2 backend; alternatively, run the Linux steps inside WSL2.

Linux

curl -L https://github.com/radsilent/vectorowl-deploy/archive/main.tar.gz | tar xz
mv vectorowl-deploy-main vectorowl && cd vectorowl
cp .env.example .env
# Edit .env and set VECTOROWL_LICENSE_KEY
docker-compose up -d

Requires Docker Engine + Compose. Use docker compose (space) if you have the Compose plugin.

VectorOWL Desktop (Tauri)

The desktop app bundles the server, web UI, and system tray into a single native application. Requires Tauri prerequisites (Rust, Node.js, WebKit/GTK).

1. Clone & build server sidecar

git clone https://github.com/radsilent/VectorOWL.git
cd VectorOWL
cargo build --release -p vectorowld

# Stage sidecar for the desktop bundle
mkdir -p src-tauri/binaries src-tauri/binaries/lib
cp target/release/vectorowld \
  src-tauri/binaries/vectorowld-x86_64-unknown-linux-gnu

Requires Rust 1.75+. CPU-only mode works without a GPU.

2. Build the desktop app

# Install Tauri CLI if you haven't
cargo install tauri-cli

# Build the desktop app
cargo tauri build

Produces .deb, .rpm, and .AppImage under target/release/bundle/.

3. Run VectorOWL Desktop

VECTOROWL_LICENSE_KEY=<your-key> \
  VECTOROWL_REQUIRE_TORCH_GPU=false \
  ./target/release/vectorowl-desktop

A valid license key is required at startup. Contact Vector Stream Systems or visit pricing to obtain one.

Self-hosted server

Run the headless API server and web UI separately for remote access or CI integration.

Build & run the server

# CPU-only (no GPU required)
VECTOROWL_REQUIRE_TORCH_GPU=false \
  cargo run -p vectorowld

# GPU-enabled (auto-detects CUDA)
cargo run -p vectorowld

Default port is 8080. Override with VECTOROWL_PORT=<port>.

License key (required)

export VECTOROWL_LICENSE_KEY="your-license-key"

# Keys are generated per subscription and validated at startup.
# Contact streamline@vectorstreamsystems.com to obtain a key.

The server will not start without a valid license. Contact us or visit pricing.

MCP server setup

# Claude Desktop
python3 scripts/patch_claude_mcp.py

# Cursor
# Add to ~/.cursor/mcp.json
{
  "mcpServers": {
    "vectorowl-runtime": {
      "command": "vectorowl-mcp",
      "args": ["--url", "http://localhost:8080"]
    }
  }
}

See MCP setup docs for full config.

Verify

curl http://localhost:8080/openapi.json

You should see the OpenAPI spec. The UI dev server runs separately on port 5173.

Import formats

VectorOWL natively imports models from standard engineering interchange formats. Upload via the API or drop files into the import directory.

  • SysML: v1 XMI · v2 / KerML XMI
  • UML: 2.x XMI · Ecore / EMOF
  • Semantic Web: RDF/XML · Turtle · N-Triples
  • Requirements: ReqIF · CSV triples · JSON
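As a sketch of the flattest of these formats, here is what parsing "CSV triples" might look like: one subject/predicate/object row per statement. The exact column layout VectorOWL expects is an assumption here, and the requirement and test IDs are invented for illustration.

```python
import csv
import io

# Hypothetical CSV-triples payload: subject,predicate,object per row.
payload = """\
subject,predicate,object
REQ-001,verifies,TEST-014
REQ-001,allocatedTo,wing-spar
REQ-002,derivesFrom,REQ-001
"""

# Each row becomes one edge in the requirements graph.
triples = [
    (row["subject"], row["predicate"], row["object"])
    for row in csv.DictReader(io.StringIO(payload))
]
print(triples)
```

Richer structure (classes, restrictions, annotations) would come in through the RDF or XMI importers instead; flat triples are the lowest-friction path from a spreadsheet.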

Workflow

From zero to governed graph in 4 steps

1

Model your system in OWL

Define classes, individuals, and relationships. Start with requirements and architecture; add behavior and interfaces as you go.

2

Ingest evidence

Load simulation results, telemetry, and documents into the vector layer. Link them to ontology nodes for retrieval.

3

Define anchors

Encode hard constraints—temperature limits, stress bounds, policy gates—so no design ships without proof.

4

Connect your toolchain

Register VectorOWL MCP servers in Claude, Cursor, or your own host. Agents now reason over the same graph your team reviews.