Solutions
MBSE, reimagined for agent-ready teams
Model-Based Systems Engineering is the practice of using formal, linked digital models as the primary artifacts for analysis and verification. VectorOWL makes those models machine-accessible to AI agents without sacrificing rigor.
What is MBSE?
Replace documents with linked, queryable models
Traditional engineering relies on Word docs, spreadsheets, and slide decks to capture requirements, architecture, and verification. MBSE replaces those static artifacts with formal digital models—requirements, architecture, behavior, and interfaces—linked together in a graph you can query, diff, and validate.
The result: less ambiguity, better traceability, and a single source of truth that scales with system complexity.
Why MBSE matters
1. Reduces ambiguity from narrative-only specifications
2. Improves traceability from intent to evidence
3. Scales better for aerospace, automotive, and defense
4. Enables automated validation and impact analysis
Why VectorOWL?
AI-augmented systems engineering
Semantic retrieval
Traditional MBSE tools are purely symbolic. VectorOWL adds embeddings so you can query by meaning—"find wing designs similar to this one"—not just by ID.
Agent-ready context
Coding agents need structured context, not PDF exports. VectorOWL exposes the graph via Model Context Protocol so agents reason over the same model humans review.
Hard constraints
When AI suggests a design change, anchors verify it against safety margins and policy limits before it reaches production. No silent violations.
Documents vs Models vs VectorOWL
| Capability | Documents | Traditional MBSE | VectorOWL |
|---|---|---|---|
| Traceability | Manual cross-references | Linked model elements | Queryable graph + vectors |
| AI readiness | None | Low (symbolic only) | Native (MCP + embeddings) |
| Validation | Manual review | Static constraints | Dynamic anchors + CI gates |
| Version control | SharePoint / email | Tool-specific exports | Git-native ontology diffs |
| Similarity search | None | None | Vector ANN over designs |
Install
Install VectorOWL
VectorOWL will be open source. The repository is currently private while we finalize the public release. GitHub Sponsors receive immediate repo access and can clone, build, and run locally today.
Docker (quickest — no build required)
Pulls the pre-built image radsilent/vectorowl:latest from Docker Hub. No Rust toolchain or source code needed. Includes a Caddy reverse proxy. Pick your platform:
macOS
curl -L https://github.com/radsilent/vectorowl-deploy/archive/main.tar.gz | tar xz
mv vectorowl-deploy-main vectorowl && cd vectorowl
cp .env.example .env
# Edit .env and set VECTOROWL_LICENSE_KEY
docker compose up -d
Requires Docker Desktop for Mac.
Windows
# PowerShell
Invoke-WebRequest -Uri https://github.com/radsilent/vectorowl-deploy/archive/main.tar.gz -OutFile vectorowl-deploy.tar.gz
tar -xzf vectorowl-deploy.tar.gz
Rename-Item vectorowl-deploy-main vectorowl
Set-Location vectorowl
Copy-Item .env.example .env
# Edit .env and set VECTOROWL_LICENSE_KEY
docker compose up -d
Requires Docker Desktop or WSL2. Or run the Linux steps inside WSL2.
Linux
curl -L https://github.com/radsilent/vectorowl-deploy/archive/main.tar.gz | tar xz
mv vectorowl-deploy-main vectorowl && cd vectorowl
cp .env.example .env
# Edit .env and set VECTOROWL_LICENSE_KEY
docker-compose up -d
Requires Docker Engine + Compose. Use docker compose (space) if you have the Compose plugin.
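Once the stack is up on any of the three platforms, it can help to confirm the containers are healthy before moving on. These are standard Docker Compose commands; the service names come from the compose file in vectorowl-deploy:
# List the services started by the compose file and their status
docker compose ps
# Follow the logs if a container is restarting or unhealthy
docker compose logs -f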
VectorOWL Desktop (Tauri)
The desktop app bundles the server, web UI, and system tray into a single native application. Requires Tauri prerequisites (Rust, Node.js, WebKit/GTK).
1. Clone & build server sidecar
git clone https://github.com/radsilent/VectorOWL.git
cd VectorOWL
cargo build --release -p vectorowld
# Stage sidecar for the desktop bundle
mkdir -p src-tauri/binaries src-tauri/binaries/lib
cp target/release/vectorowld \
  src-tauri/binaries/vectorowld-x86_64-unknown-linux-gnu
Requires Rust 1.75+. CPU-only mode works without a GPU.
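The sidecar filename follows Tauri's external-binary convention of <name>-<target-triple>, so on other hosts stage the binary under your own triple. A sketch, assuming an Apple Silicon macOS build host:
# Print your host target triple
rustc -vV | grep host
# Stage the sidecar under that triple (example: Apple Silicon macOS)
cp target/release/vectorowld \
  src-tauri/binaries/vectorowld-aarch64-apple-darwin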
2. Build the desktop app
# Install Tauri CLI if you haven't
cargo install tauri-cli
# Build the desktop app
cargo tauri build
Produces .deb, .rpm, and .AppImage under target/release/bundle/.
3. Run VectorOWL Desktop
VECTOROWL_LICENSE_KEY=<your-key> \
VECTOROWL_REQUIRE_TORCH_GPU=false \
./target/release/vectorowl-desktop
A valid license key is required at startup. Contact Vector Stream Systems or visit pricing to obtain one.
Self-hosted server
Run the headless API server and web UI separately for remote access or CI integration.
Build & run the server
# CPU-only (no GPU required)
VECTOROWL_REQUIRE_TORCH_GPU=false \
cargo run -p vectorowld
# GPU-enabled (auto-detects CUDA)
cargo run -p vectorowld
Default port is 8080. Override with VECTOROWL_PORT=<port>.
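For example, to run the CPU-only server on a non-default port using the VECTOROWL_PORT override noted above (9090 is just an example value):
# CPU-only server listening on port 9090
VECTOROWL_PORT=9090 \
VECTOROWL_REQUIRE_TORCH_GPU=false \
cargo run -p vectorowld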
License key (required)
export VECTOROWL_LICENSE_KEY="your-license-key"
# Keys are generated per subscription and validated at startup.
# Contact streamline@vectorstreamsystems.com to obtain a key.
The server will not start without a valid license. Contact us or visit pricing.
MCP server setup
# Claude Desktop
python3 scripts/patch_claude_mcp.py
# Cursor
# Add to ~/.cursor/mcp.json
{
  "mcpServers": {
    "vectorowl-runtime": {
      "command": "vectorowl-mcp",
      "args": ["--url", "http://localhost:8080"]
    }
  }
}
See MCP setup docs for full config.
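If you prefer not to run the patch script, Claude Desktop accepts the same entry by hand. A sketch, assuming the standard claude_desktop_config.json file and that vectorowl-mcp is on your PATH:
# Add to claude_desktop_config.json (open it via Settings > Developer in Claude Desktop)
{
  "mcpServers": {
    "vectorowl-runtime": {
      "command": "vectorowl-mcp",
      "args": ["--url", "http://localhost:8080"]
    }
  }
}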
Verify
curl http://localhost:8080/openapi.json
You should see the OpenAPI spec. The UI dev server runs separately on port 5173.
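To spot-check the response instead of scrolling the full spec, a quick filter works too (assumes jq is installed):
# Print the API title/version and count the exposed routes
curl -s http://localhost:8080/openapi.json | jq '{info: .info, routes: (.paths | keys | length)}'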
Import formats
VectorOWL natively imports models from standard engineering interchange formats. Upload via the API or drop files into the import directory.
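As a sketch of the API path, an upload is a single POST of the model file; the endpoint below is a placeholder, so list the real import routes from the OpenAPI spec in the Verify step above before using it:
# Placeholder route — confirm the actual import endpoint in /openapi.json
curl -X POST http://localhost:8080/api/import \
  -H "Content-Type: application/octet-stream" \
  --data-binary @system-model.owl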
Workflow
From zero to governed graph in 4 steps
Model your system in OWL
Define classes, individuals, and relationships. Start with requirements and architecture; add behavior and interfaces as you go.
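A minimal sketch of what that first cut can look like in Turtle; the class names, property, and namespace here are purely illustrative:
# Illustrative ontology: one requirement traced to one component
cat > system-model.ttl <<'TTL'
@prefix :    <http://example.com/demo#> .
@prefix owl: <http://www.w3.org/2002/07/owl#> .

:Requirement  a owl:Class .
:Component    a owl:Class .
:satisfiedBy  a owl:ObjectProperty .

:REQ-001      a :Requirement ;
              :satisfiedBy :WingAssembly .
:WingAssembly a :Component .
TTL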
Ingest evidence
Load simulation results, telemetry, and documents into the vector layer. Link them to ontology nodes for retrieval.
Define anchors
Encode hard constraints—temperature limits, stress bounds, policy gates—so no design ships without proof.
Connect your toolchain
Register VectorOWL MCP servers in Claude, Cursor, or your own host. Agents now reason over the same graph your team reviews.
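You can also sanity-check the MCP binary outside a host by running the same command and arguments the hosts launch. It communicates over stdio, so expect no output until a client attaches:
# Launch the MCP server by hand (same command and args as the host config)
vectorowl-mcp --url http://localhost:8080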
