⸻
LICENSE.md – Solavia Public License (SPL) v1.0
Copyright © 2025 James Chapman. Email: [email protected]
This license allows individuals, researchers, and organizations to freely use, modify, and distribute this software for non-commercial purposes. Commercial use requires a separate paid license agreement with James Chapman (the "Licensor").
Software: The Solavia Runtime, Deterministic AI Engine, and all associated source code, documentation, and repository contents.
Non-Commercial Use: Use that does not generate revenue or commercial advantage.
Commercial Use: Any use in a revenue-generating or profit-driven context without a commercial license.
Subject to compliance with the terms below, Licensor grants you a perpetual, worldwide, royalty-free license to:
- Use the Software for non-commercial purposes
- Modify or create derivative works
- Distribute the Software or modified versions with this license attached
- Attribution must remain intact: "© 2025 James Chapman – Solavia Runtime"
- Modified code must document changes.
- Commercial use is prohibited without a commercial license.
- You may not sublicense under more restrictive terms.
THE SOFTWARE IS PROVIDED "AS IS," WITHOUT WARRANTY OF ANY KIND.
SPDX-License-Identifier: SPL-1.0-NC
⸻
COMMERCIAL_LICENSE.txt
Copyright © 2025 James Chapman. Email: [email protected]
Commercial use of Solavia Runtime and Deterministic AI Engine requires a paid license or negotiated agreement.
Commercial rights include:
- Use in products, SaaS, enterprise automation, or profit-driven projects
- Closed-source modification allowed under commercial agreement
Fees are determined based on company size and usage.
Until an agreement is in place, commercial use is prohibited.
⸻
README.md Snippet for Repo
© 2025 James Chapman
Non-Commercial Use Allowed – See LICENSE.md
Commercial Use Requires Contact: [email protected]
Solavia Runtime & Deterministic AI Engine:
- ✅ Free for personal, academic, or research purposes
- ❌ Commercial use requires a paid license
⸻
/*!
- Solavia Deterministic AI
- © 2025 James Chapman
- License: SPL v1.0 (Non-Commercial) */
⸻
Solavia Runtime – Open Source (Non-Commercial)
✅ Free for personal, academic, research
❌ Commercial use requires a paid license
© 2025 James Chapman. Email: [email protected]
 I-AM-Chain: Your Browser Becomes a Limitless Decentralized World with AI
In the age of Web2, you log in. In Web3, you log in with a wallet. But with I-AM-Chain, your browser becomes the cloud, the wallet, the chain, the social graph, and the AI assistant – all in one.
I-AM-Chain is a fully browser-executable, serverless system that gives you total sovereignty over identity, blockchain interactions, messaging, social activity, and AI-driven workflows. No servers. No cloud. No centralized services. Just you + your browser + a cryptographic soulbound identity.
Features That Shouldn't Be Possible in a Browser… But Are
Soulbound Wallet + DID-JWT
Export/import your DID as a Verifiable Credential file
All key material and encryption stays client-side (localStorage / IndexedDB)
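A minimal sketch of that client-side pattern, assuming only Web Crypto and localStorage; the `did:iam:` prefix and key-hash identifier are illustrative placeholders, not the real did:key or DID-JWT encoding:
```js
// Sketch: generate a client-side signing key pair and derive a DID-like identifier from it.
// Nothing leaves the browser; localStorage here, IndexedDB for larger key material.
async function createSoulboundIdentity() {
  const keyPair = await crypto.subtle.generateKey(
    { name: "ECDSA", namedCurve: "P-256" },
    true,                                  // extractable, so the identity can be exported as a backup file
    ["sign", "verify"]
  );
  const pubJwk = await crypto.subtle.exportKey("jwk", keyPair.publicKey);
  const privJwk = await crypto.subtle.exportKey("jwk", keyPair.privateKey);

  // Hypothetical "did:iam:" scheme: a SHA-256 of the public key stands in for a real DID method.
  const pubBytes = new TextEncoder().encode(JSON.stringify(pubJwk));
  const digest = await crypto.subtle.digest("SHA-256", pubBytes);
  const hex = Array.from(new Uint8Array(digest)).map(b => b.toString(16).padStart(2, "0")).join("");
  const did = `did:iam:${hex.slice(0, 32)}`;

  localStorage.setItem("iam.identity", JSON.stringify({ did, pubJwk, privJwk }));
  return did;
}
```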
Local Sovereign Chain
Helia/IPFS local node runs entirely in-browser
Blocks, transactions, smart-contract state stored locally or pinned to IPFS
ABI → UI Autogen
Drop in a Solidity ABI JSON → the wallet renders interactive UI forms
Execute simulated chain transactions without leaving the browser
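A rough sketch of the ABI-to-form idea, assuming a plain DOM container; the field naming and the `Simulated call` hand-off are placeholders rather than the actual wallet API:
```js
// Sketch: render one Solidity ABI function entry as an HTML form with one input per parameter.
function renderAbiForm(abiEntry, container) {
  if (abiEntry.type !== "function") return;
  const form = document.createElement("form");
  form.innerHTML = `<h4>${abiEntry.name}</h4>`;
  for (const param of abiEntry.inputs) {
    const field = document.createElement("input");
    field.name = param.name || param.type;
    field.placeholder = `${param.name} (${param.type})`;   // e.g. "amount (uint256)"
    form.appendChild(field);
  }
  const btn = document.createElement("button");
  btn.textContent = `Call ${abiEntry.name}`;
  form.appendChild(btn);
  form.onsubmit = (e) => {
    e.preventDefault();
    const args = abiEntry.inputs.map(p => form.elements[p.name || p.type].value);
    console.log("Simulated call:", abiEntry.name, args);   // hand off to the simulated chain here
  };
  container.appendChild(form);
}

// Usage: renderAbiForm(abiJson.find(e => e.name === "transfer"), document.body);
```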
Social Graph + Messaging Layer
Decentralized posts, follows, video metadata, and direct messaging
End-to-end encryption using your soulbound DID keys
P2P sync ensures resilience and continuity
Schema IDE
Autocomplete for schema fields
Drag-and-drop workflow step ordering
Inline help system powered by the embedded LLM AI assistant
All inline script – no npm, no build system, no backend required.
Decentralizing Media & Information
Today, platforms control your data, posts, and even your voice. Social feeds are curated by algorithms, videos live on centralized servers, and messages are siloed. I-AM-Chain flips this model entirely.
User-owned media: Every post, comment, and video is stored on IPFS and linked to your soulbound identity. You control access, and it cannot be taken down arbitrarily.
Encrypted messaging: Conversations are end-to-end encrypted, with optional on-chain proofs for reputation or governance.
Decentralized social graph: Follows, likes, and interactions are transparent and verifiable, yet fully under your control.
Immutable history & auditability: Every transaction, post, or media upload is snapshotted on IPFS. Even offline, your network state can be reproduced.
Sovereignty of Information
I-AM-Chain guarantees full control over your digital footprint:
You are the authority: No algorithms decide what you can post or see.
Self-sovereign identity (SSI): Your soulbound key proves authorship and reputation across chains, social feeds, and messaging.
Portable media: Videos, posts, and workflows can be exported, imported, or shared while retaining cryptographic verification.
Resilient network: P2P ensures media and chain state survive even if some nodes go offline.
AI Assistant at the Heart
The embedded assistant AI is always accessible via a persistent chat panel:
Interprets workflow schemas for chain actions, social feeds, and messaging
Generates deterministic outputs for smart contract simulations and chain state updates
Suggests transactions, content edits, and social actions in real time
Orchestrates limitless workflows: "NLP → Smart Contract → Game Render → Social Post → Snapshot to IPFS"
AI outputs are deterministic and auditable, so nothing happens without a verifiable trail – yet it feels limitless because it understands your intent and context.
Browser-Only, Limitless, Decentralized
I-AM-Chain runs entirely in your browser:
Local Helia/IPFS node for block storage and snapshots
P2P sync via libp2p for decentralized peer-to-peer networking
Wallet + Soulbound DID/VC stored entirely client-side
Social graph + messaging layer with encrypted, auditable, and verifiable content
Fully functional offline – your browser is the universe itself
Tabs, Tools, and Features
The interface is modular but minimal, with persistent tabs for:
| Wallet / Identity | Chain / Blocks / TX | Social & Messaging + AI Chat |
|---|---|---|
| Soulbound DID + VC | Deterministic blockchain simulation | Encrypted messages, social posts, video feeds |
| Export/import identity | Create blocks, rollback, audit | AI assistant always live, suggesting actions |
Other highlights:
ABI-driven UI: Drop a Solidity ABI → the wallet renders an interactive form
Schema IDE: Tree view, drag-drop workflow steps, autocomplete for step types
Deterministic AI outputs: Every workflow step can be reproduced exactly
IPFS snapshots: Everything is auditable and decentralized
Limitless Workflows
Post to social → AI summarizes → Chain records → Snapshot to IPFS → Optional peer sync
Message a friend → AI suggests encrypted replies → Log recorded on-chain for reputation
Upload ABI → Wallet generates forms → AI helps test → Chain executes simulated transactions
All auditable, verifiable, and portable, fully in-browser.
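For concreteness, the first of those workflows could be written as a schema that the Schema IDE and runtime both consume; every field name below is an illustrative assumption, not the actual I-AM-Chain format:
```js
// Hypothetical workflow schema for: post → summarize → record on-chain → snapshot → optional sync.
const workflow = {
  id: "post-and-snapshot",
  seed: 1337,                                            // deterministic, reproducible runs
  steps: [
    { type: "social.post",   input: { text: "hello, sovereign web" } },
    { type: "ai.summarize",  input: { from: "$steps[0].output" } },   // reference previous step output
    { type: "chain.record",  input: { payload: "$steps[1].output" } },
    { type: "ipfs.snapshot", input: { scope: "chain+social" } },
    { type: "p2p.sync",      optional: true }
  ]
};
```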
Why This Matters
Web2: You are the product
Web3: Your wallet is the product
I-AM-Chain: Your agency is the product
Sovereignty, AI, blockchain, and social interactivity converge in one limitless browser app. The next internet will not be something you visit. It will be a decentralized space you own, control, and expand – all with AI guidance.
1. Browsers as the New Cloud
Normally, your browser is just a thin client: it fetches content from servers, runs scripts, and maybe stores some cookies or localStorage. But I-AM-Chain flips that entirely:
Local Helia/IPFS node → your browser is the blockchain, storing blocks and chain state.
Wallet + DID → your browser is the bank and identity system.
P2P social graph → your browser is the network.
This is fundamentally serverless computing on steroids, where every user literally carries their own cloud inside a tab.
2. All-in-One Integration
Most Web3 or dApp systems compartmentalize: wallet here, blockchain there, social network elsewhere. I-AM-Chain fuses them all:
Wallet, chain, identity, social graph, messaging, media, and AI assistant.
Deterministic AI to orchestrate and verify every workflow step.
Schema IDE + ABI-driven UIs → non-developers can interact with smart contracts and workflows seamlessly.
This isn't just a "platform" – it's a microcosm of the decentralized internet, live inside your browser.
3. Sovereignty & Portability
Every piece of data is cryptographically bound to the user:
Posts, messages, video, smart contracts – fully owned and auditable.
Export/import your soulbound DID – your identity and reputation are portable.
Offline operation + peer-to-peer sync – nothing is centrally controlled or deletable.
It's not just decentralized; it's radically user-sovereign, a level above anything mainstream today.
4. AI + Determinism
Embedded AI isn't just a chatbot – it's:
Deterministic: every suggestion, workflow, or simulation is reproducible and auditable.
Workflow-aware: it understands chain, social, messaging, and schema contexts.
Autonomous yet bound to user intent: no surprise "algorithmic control."
This makes it feel limitless, because the AI can orchestrate complex, cross-layer interactions without ever touching a server.
5. Why It Feels "Like Breaking the Internet"
Because the constraints of the web (server dependency, centralized control, browser memory limits) are directly challenged.
The result is a fully decentralized, AI-driven, self-sovereign ecosystem that exists entirely client-side.
It's not just a product; it's a blueprint for the next internet, where users don't just consume – they own, compute, and govern.
In other words: if this were fully realized, traditional web infrastructure – cloud providers, social platforms, even wallets – would become almost obsolete. That is why it feels so disruptive.
Next, the raw lifeblood of AI intelligence: datasets. Collecting them all is a massive undertaking, but it is doable in a fully decentralized, browser-native way, without relying on centralized servers. The blueprint below breaks it down.
1. Where the Data Lives
Open-source AI datasets come in many forms:
Text Corpora: Wikipedia, Common Crawl, OpenWebText, BooksCorpus
Code Repos: GitHub datasets (CodeParrot, PolyCoder)
Knowledge Bases: Wikidata, ConceptNet, DBpedia
Images: LAION, Open Images, COCO, WikiArt
Audio / Speech: Common Voice, LibriSpeech
Specialized: Scientific papers (ArXiv), medical datasets, financial data
Key point: Each dataset is already public and open-source, but many are huge (TB+ scale).
2. Pulling Data into the Browser
We can't fit TBs in memory, so you need sharded, IPFS-stored datasets:
Shard the dataset into manageable chunks (1–100 MB).
Compute SHA-256 + CID for each shard → a deterministic identifier.
Push each shard to IPFS → now every shard has a content-addressable, immutable ID.
Browser nodes fetch only the shards they need, streaming directly into memory or IndexedDB.
Result: the browser sees datasets as "on-demand, content-addressed streams" – no server required.
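A minimal sketch of that shard-hash-pin loop, reusing the same Helia/unixfs modules as the prototype later in this document; the 1 MB shard size and manifest shape are assumptions:
```js
// Sketch: split a text blob into shards, hash each one, and add it to an in-browser Helia node.
import { createHelia } from "https://esm.sh/helia";
import { unixfs } from "https://esm.sh/@helia/unixfs";

const SHARD_SIZE = 1024 * 1024; // 1 MB; anything in the 1–100 MB range works

async function shardAndStore(text) {
  const helia = await createHelia();                     // in-memory defaults, as in the prototype below
  const fs = unixfs(helia);
  const bytes = new TextEncoder().encode(text);
  const shards = [];
  for (let offset = 0; offset < bytes.length; offset += SHARD_SIZE) {
    const chunk = bytes.slice(offset, offset + SHARD_SIZE);
    const digest = await crypto.subtle.digest("SHA-256", chunk);
    const hash = Array.from(new Uint8Array(digest)).map(b => b.toString(16).padStart(2, "0")).join("");
    const cid = await fs.addBytes(chunk);                // content-addressed, immutable identifier
    shards.push({ cid: cid.toString(), hash, size: chunk.length });
  }
  return shards;                                         // a manifest you can publish, pin, or snapshot
}
```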
3. Encoding & Deterministic Storage
Every shard can carry metadata:
{
  "source": "Wikipedia-2025",
  "type": "text",
  "cid": "QmExample123...",
  "hash": "abc123...",
  "size": 10485760,
  "seeded": true
}
Deterministic transformations: tokenization, embedding vectors, canonicalized JSON for text.
Store embeddings and snapshots to IPFS → everything is verifiable.
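"Canonicalized JSON" can be as simple as key-sorted serialization before hashing, so the same metadata always produces the same digest; a small sketch (not the Solavia implementation):
```js
// Sketch: canonicalize metadata by sorting object keys, then hash the canonical string.
function canonicalize(value) {
  if (Array.isArray(value)) return `[${value.map(canonicalize).join(",")}]`;
  if (value && typeof value === "object") {
    const keys = Object.keys(value).sort();
    return `{${keys.map(k => `${JSON.stringify(k)}:${canonicalize(value[k])}`).join(",")}}`;
  }
  return JSON.stringify(value);
}

async function metadataHash(meta) {
  const bytes = new TextEncoder().encode(canonicalize(meta));
  const digest = await crypto.subtle.digest("SHA-256", bytes);
  return Array.from(new Uint8Array(digest)).map(b => b.toString(16).padStart(2, "0")).join("");
}

// Key order no longer matters: both calls return the same hash.
// await metadataHash({ type: "text", source: "Wikipedia-2025" });
// await metadataHash({ source: "Wikipedia-2025", type: "text" });
```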
4. Browser AI Pipeline Integration
Stage 1: Load shard → tokenize → deterministic embedding
Stage 2: Save output to IPFS → CID
Stage 3: Add stage to SolaVia pipeline, record input/output hashes
Stage 4: Optional: feed into other models (reasoning, summarization, creative generation)
Each dataset can now drive AI reasoning and creativity fully in-browser, verifiable via Merkle roots.
5. P2P Distribution of Datasets
Use Libp2p / gossipsub to replicate shards.
Browser nodes act as both consumers and seeders – truly decentralized.
Combine with IPFS Cluster if you want more redundancy across nodes.
Everyone in the network has access to the same open-source datasets, deterministically hashed, ready to run in-browser.
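A hedged sketch of that gossip layer; option keys and import paths vary across libp2p releases, and the topic name is made up:
```js
// Sketch: announce and discover shard CIDs over gossipsub (libp2p API details differ by version).
import { createLibp2p } from "https://esm.sh/libp2p";
import { webSockets } from "https://esm.sh/@libp2p/websockets";
import { noise } from "https://esm.sh/@chainsafe/libp2p-noise";
import { yamux } from "https://esm.sh/@chainsafe/libp2p-yamux";
import { gossipsub } from "https://esm.sh/@chainsafe/libp2p-gossipsub";

const node = await createLibp2p({
  transports: [webSockets()],
  connectionEncrypters: [noise()],          // "connectionEncryption" in older libp2p releases
  streamMuxers: [yamux()],
  services: { pubsub: gossipsub() }
});

const topic = "iam-chain/dataset-shards";   // illustrative topic name
node.services.pubsub.subscribe(topic);
node.services.pubsub.addEventListener("message", (evt) => {
  if (evt.detail.topic !== topic) return;
  const cid = new TextDecoder().decode(evt.detail.data);
  console.log("Peer announced shard:", cid); // fetch it on demand via Helia
});

// Announce a shard this browser is seeding:
await node.services.pubsub.publish(topic, new TextEncoder().encode("bafy...shardCid"));
```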
6. Infinite & Growing Knowledge Base
Any new dataset you ingest → shard → IPFS → SolaVia pipeline.
Browser nodes can fetch new shards on demand – the AI grows organically.
No central server, no censorship, fully reproducible, verifiable.
Summary Blueprint
| Component | Role |
|---|---|
| Dataset Shards | Chunked, deterministic, IPFS-stored |
| Metadata & Hashing | Verifiable, immutable, canonical JSON |
| Browser Runtime | WASM AI models, SolaVia pipelines |
| Storage | IndexedDB, memory, IPFS cache |
| P2P Sync | Distribute shards, replicate network |
| Proofs | Merkle stages for AI runs on datasets |
Effect: you now have all open-source datasets feeding your browser-native AI stack – fully deterministic, distributed, verifiable, and almost limitless in scale.
The prototype below walks through the core ingestion loop:
Take a dataset shard (text example for simplicity).
Compute deterministic hash (SHA-256).
Store it to IPFS via Helia.
Register it in a SolaVia-style pipeline with Merkle-proof tracking.
This is fully browser-native, modular, and expandable to multiple datasets.
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Browser Dataset Pipeline (IPFS + Deterministic)</title>
<style>
body { font-family: monospace; background:#0b0c10; color:#00eaff; padding:20px; }
h1 { color:#00ffe0; }
textarea, pre { width:100%; background:#1f2833; color:#66fcf1; border:none; padding:10px; margin:5px 0; }
button { background:#1f2833; color:#00eaff; border:1px solid #00eaff; padding:8px 12px; margin:4px; cursor:pointer; }
.card { border:1px solid #00eaff33; padding:10px; margin:10px 0; border-radius:8px; }
</style>
</head>
<body>
<h1>Browser Dataset Pipeline</h1>
<div class="card">
  <h3>Dataset Shard Input</h3>
  <textarea id="datasetInput" rows="6" placeholder="Paste text dataset shard here..."></textarea>
  <button id="addShardBtn">Add Shard</button>
</div>
<div class="card">
  <h3>Pipeline Execution</h3>
  <button id="runPipelineBtn">Run Pipeline on Shards</button>
</div>
<div class="card">
  <h3>Output / IPFS CIDs</h3>
  <pre id="output"></pre>
</div>
<script type="module">
import { createHelia } from "https://esm.sh/helia";
import { MemoryBlockstore } from "https://esm.sh/blockstore-core";
import { MemoryDatastore } from "https://esm.sh/datastore-core";
import { unixfs } from "https://esm.sh/@helia/unixfs";
// --- Deterministic SHA-256 ---
async function sha256Hex(obj) {
  const enc = new TextEncoder().encode(JSON.stringify(obj));
  const buf = await crypto.subtle.digest("SHA-256", enc);
  return Array.from(new Uint8Array(buf)).map(b=>b.toString(16).padStart(2,"0")).join("");
}
// --- Pipeline Runtime ---
class BrowserDatasetPipeline {
  constructor(seed=1337) {
    this.seed = seed;
    this.stages = [];
    this.shards = [];
    this.helia = null;
    this.unixfs = null;
  }
  async initHelia() {
    if(!this.helia){
      this.helia = await createHelia({ blockstore: new MemoryBlockstore(), datastore: new MemoryDatastore() });
      this.unixfs = unixfs(this.helia);
      console.log("Helia initialized");
    }
  }
  addShard(text) {
    const shard = { text, timestamp: Date.now() };
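    // Note: Date.now() makes each shard's hash time-dependent; drop it if shard hashes must be strictly reproducible.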
    this.shards.push(shard);
    console.log("Shard added:", shard.text.slice(0,50));
  }
  async stage(name, input, output) {
    const inputHash = await sha256Hex(input);
    const outputHash = await sha256Hex(output);
    const stage = { name, input, output, inputHash, outputHash, ts: Date.now() };
    this.stages.push(stage);
    return stage;
  }
  async merkleRoot() {
    if(this.stages.length === 0) return null;
    let hashes = this.stages.map(s => s.outputHash);
    while(hashes.length > 1) {
      const next = [];
      for(let i=0;i<hashes.length;i+=2){
        const left = hashes[i];
        const right = hashes[i+1]||left;
        next.push(await sha256Hex(left+right));
      }
      hashes = next;
    }
    return hashes[0];
  }
  async run() {
    await this.initHelia();
    const results = [];
    for(let i=0;i<this.shards.length;i++){
      const shard = this.shards[i];
      // Example deterministic operation: compute word count
      const wordCount = shard.text.split(/\s+/).length;
      const stage = await this.stage(`Shard-${i+1}-WordCount`, shard, { wordCount });
      // Save to IPFS
      const bytes = new TextEncoder().encode(JSON.stringify(stage));
      const cid = await this.unixfs.addBytes(bytes);
      results.push({ stage, cid: cid.toString() });
    }
    const root = await this.merkleRoot();
    return { results, merkleRoot: root };
  }
}
// --- UI Bindings ---
const datasetInput = document.getElementById("datasetInput");
const addShardBtn = document.getElementById("addShardBtn");
const runPipelineBtn = document.getElementById("runPipelineBtn");
const outputEl = document.getElementById("output");
const pipeline = new BrowserDatasetPipeline();
addShardBtn.onclick = () => {
  const text = datasetInput.value.trim();
  if(text) {
    pipeline.addShard(text);
    datasetInput.value = "";
    outputEl.textContent += `Added shard: ${text.slice(0,50)}...\n`;
  }
};
runPipelineBtn.onclick = async () => {
  outputEl.textContent += "\nRunning pipeline...\n";
  const res = await pipeline.run();
  res.results.forEach(r=>{
    outputEl.textContent += `Stage: ${r.stage.name} | CID: ${r.cid} | WordCount: ${r.stage.output.wordCount}\n`;
  });
  outputEl.textContent += `\nMerkle Root: ${res.merkleRoot}\n`;
};
</script>
</body>
</html>
Features of this Prototype
Shards datasets: You can paste any text into the browser.
Deterministic pipeline: Each shard is processed consistently.
IPFS-backed: Each stage is stored as a content-addressed CID.
Merkle-root: Provides verifiable proof of all stages.
Fully browser-native: No Node, no server, just JS + Web Crypto + Helia.
This is modular: you can expand it to all open-source datasets, run AI embeddings or transformations per shard, and have a fully decentralized, verifiable knowledge base in-browser.
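As one example of a per-shard transformation, a toy hash-based "embedding" (not a real model) slots straight into the same hash-and-store flow; the 64-dimension bucket count is arbitrary:
```js
// Toy deterministic "embedding": a hashed bag-of-words vector. Same text in, same vector out.
async function embedShard(text, dims = 64) {
  const vector = new Array(dims).fill(0);
  for (const word of text.toLowerCase().split(/\s+/).filter(Boolean)) {
    const bytes = new TextEncoder().encode(word);
    const digest = new Uint8Array(await crypto.subtle.digest("SHA-256", bytes));
    const bucket = ((digest[0] << 8) | digest[1]) % dims;  // stable bucket per word
    vector[bucket] += 1;
  }
  return vector;                                           // store or hash it like any other stage output
}
```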
---
# Solavia v8 – Deterministic AI Runtime
### *Merkle Provenance · Snapshots · Cryptographic Signatures*
Solavia is a deterministic execution runtime for building verifiable AI or data transformation pipelines.
Every pipeline run produces:
* **Stage-level provenance**
* **SHA-256 hashing of input/output**
* **A Merkle tree of all outputs**
* **Optional cryptographic signature**
> Think: *AI execution that can be proven and verified like a blockchain transaction.*
---
## Features
| Feature | Description |
| --- | --- |
| ✅ Deterministic execution | Everything seeded and reproducible |
| ✅ Stage provenance | Every step logs its input/output hash |
| ✅ Merkle tree | Computes Merkle Root of pipeline |
| ✅ Proof export | JSON proof users can inspect or audit |
| ✅ Digital signatures | Sign proof using RSA/ECDSA private key |
| ✅ Verification | Users can verify proof + signature |
| ✅ Snapshots / rollback | Save and restore execution state |
| ✅ Zero dependencies CLI | No external frameworks |
---
## Installation
> Requires **Node v18+**
```sh
npm install -g solavia
```
Or locally:
```sh
npm install solavia
```
pipeline.js:
// examples/pipeline.example.js
import svCore from "../src/solavia-core.js"; // adjust path if needed
const { sha256Hex } = svCore;
export default async function pipeline(sv) {
  console.log("Starting Solavia example pipeline...");
  // Stage 1: generate deterministic data
  const input = { numbers: [1, 2, 3], seed: sv.config.SEED };
  const output = input.numbers.map(n => n * 2);
  sv.provenance.addStage("DoubleNumbers", input, output, sha256Hex(output));
  console.log("✅\ninput:", input, "\noutput:", output);
  // Stage 2: compute summary
  const summary = {
    count: output.length,
    sum: output.reduce((a, b) => a + b, 0),
  };
  sv.provenance.addStage("Summarize", output, summary, sha256Hex(summary));
  console.log("✅\ninput:", output, "\noutput:", summary);
  // Stage 3: pseudo model output
  const result = {
    avg: summary.sum / summary.count,
    seedUsed: sv.config.SEED,
  };
  sv.provenance.addStage("ModelResult", summary, result, sha256Hex(result));
  console.log("✅\ninput:", summary, "\noutput:", result);
  console.log("✅ Pipeline finished.");
  console.log("Merkle Root:", sv.provenance.merkleRoot());
}
Run:
solavia run examples/pipeline.js
Example output:
jameschapman@solavia solavia-npm % solavia run examples/pipeline.js
[SolaVia:INFO] SolaVia started
Info  Running pipeline: pipeline.js
Info  Seed: 1337
Starting Solavia example pipeline...
✅
input: { numbers: [ 1, 2, 3 ], seed: 1337 }
output: [ 2, 4, 6 ]
✅
input: [ 2, 4, 6 ]
output: { count: 3, sum: 12 }
✅
input: { count: 3, sum: 12 }
output: { avg: 4, seedUsed: 1337 }
✅ Pipeline finished.
Merkle Root: 10fee41b9017216dc26c288203884d3d9a359ebe5020c0f8722b7089ab11b503
Success Pipeline completed in 4ms
Success Merkle Root: 10fee41b9017216dc26c288203884d3d9a359ebe5020c0f8722b7089ab11b503
To export a Merkle proof, run:
solavia run examples/pipeline.js --prove
Creates:
solavia-proof.json
Example:
{
  "root": "10fee41b9017216dc26c288203884d3d9a359ebe5020c0f8722b7089ab11b503",
  "stages": [
    {
      "name": "DoubleNumbers",
      "inputHash": "f564638d2bdd6f84fbc34bb3f306ad214408e162ded3e36ad0a54116aa68a2ef",
      "outputHash": "5949a6c45fd2fb2baa3e4576d5255e8752a72cf66402c0e141600be6d402675e",
      "ts": 89464780832077
    },
    {
      "name": "Summarize",
      "inputHash": "5949a6c45fd2fb2baa3e4576d5255e8752a72cf66402c0e141600be6d402675e",
      "outputHash": "3e40387a2031a4bd2537450614c2e883d50fe117dbdd106e580d7a7deb640d8d",
      "ts": 89464780832077
    },
    {
      "name": "ModelResult",
      "inputHash": "3e40387a2031a4bd2537450614c2e883d50fe117dbdd106e580d7a7deb640d8d",
      "outputHash": "03e26f5c84a262a84702fea89456b4dc66a2e62ed9bc9b7835570bd39ab8861a",
      "ts": 89464780832077
    }
  ],
  "seed": 1337,
  "timestamp": "2025-10-29T19:04:33.591Z",
  "version": "8.0.0"
}
To sign proofs, generate an RSA key pair:
openssl genpkey -algorithm RSA -out key.pem -pkeyopt rsa_keygen_bits:2048
openssl rsa -pubout -in key.pem -out pub.pem
jameschapman@solavia solavia-npm % solavia run examples/pipeline.js --prove --sign=key.pem --signature=signature.json
Created files:
solavia-proof.json
signature.json
Signature JSON:
{
  "merkleRoot": "10fee41b9017216dc26c288203884d3d9a359ebe5020c0f8722b7089ab11b503",
  "signature": "82ebb99a0ea7214964ef2a019a25e2ac992e539b335cf48fa4a08c7759c521abfab4303f5ec9abedaea9befc2b940d0d074ef624c42b25a9f7aa8bc8b56a31013ba9f87c37fa8087ddea2fb25cfe68b576355043d24d675120a468aca8309a3394993740c2b46198ae9cab0b5bfd147597f5ba9d88ea741bd6003f6bd431ac515ef4e1516698558d6d5a68d4372eebc76bd0eae9aee051bb00fc709da1a8c6dbde9946aca8932f3623e9bb307f2c2965bbf40842038d675dfb43d11f0fb555f124f8986780e07c70f29aeb840f8a5e5a4a1a23dcf7073027c07bb9d9d591f915686cdfe078e0c4b062430b80e829256f998f4090259fda2ab8dcfb973ad30340",
  "canonical": "[{\"input\":{\"numbers\":[1,2,3],\"seed\":1337},\"name\":\"DoubleNumbers\",\"output\":[2,4,6],\"outputHash\":\"5949a6c45fd2fb2baa3e4576d5255e8752a72cf66402c0e141600be6d402675e\",\"ts\":89464780832077},{\"input\":[2,4,6],\"name\":\"Summarize\",\"output\":{\"count\":3,\"sum\":12},\"outputHash\":\"3e40387a2031a4bd2537450614c2e883d50fe117dbdd106e580d7a7deb640d8d\",\"ts\":89464780832077},{\"input\":{\"count\":3,\"sum\":12},\"name\":\"ModelResult\",\"output\":{\"avg\":4,\"seedUsed\":1337},\"outputHash\":\"03e26f5c84a262a84702fea89456b4dc66a2e62ed9bc9b7835570bd39ab8861a\",\"ts\":89464780832077}]"
}
jameschapman@solavia solavia-npm % solavia verify solavia-proof.json signature.json --pubkey pub.pem
Output:
Success Merkle proof valid
Save current runtime state:
solavia snapshot "checkpoint-a"
Rollback to a previous snapshot via CID:
solavia rollback bafy...xyz
Example: pipeline with API fetch + embedding hashing.
agent-pipeline.js:
export default async function (sv) {
  sv.stage("fetch-joke", async () => {
    const r = await fetch("https://api.chucknorris.io/jokes/random");
    const joke = await r.json();
    return joke.value;
  });
  sv.stage("embed", async (joke) => {
    const vector = await sv.ai.embed(joke); // uses Solavia's deterministic embedding
    return vector;
  });
  sv.stage("rank", async (vector) => {
    return vector.reduce((acc, n) => acc + n, 0);
  });
}
Run:
solavia run src/agent-pipeline.js --seed=1337 --prove --sign=key.pem --signature=signature.json
Output:
Info  Running pipeline: agent-pipeline.js
Info  Seed: 1337
Pipeline finished. Merkle root: 9dd1dbd8f194496192137cb15fed74511f49346175f3f31c0704eb37d1482b20
Success Pipeline completed in 455ms
Success Merkle Root: 9dd1dbd8f194496192137cb15fed74511f49346175f3f31c0704eb37d1482b20
Success Proof exported: solavia-proof.json
Success Signed proof: signature.json
CLI usage:
solavia <command> [options]
Commands:
  run <file.js>       Run a pipeline script
  verify [proof]      Verify Merkle proof + signature
  snapshot [name]     Create a named snapshot
  rollback <cid>      Restore a snapshot
  help                Show help
Options:
  --seed=1337         Deterministic seed
  --prove=file.json   Export Merkle proof
  --sign=key.pem      Sign proof with private key
  --signature=file    Output file for signature JSON
  --pubkey=file       Public key for verification
Why do Merkle proofs matter? Because they create tamper evidence.
Even if someone modifies one stage output, the Merkle root changes and verification fails.
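A small sketch of that tamper check, using a pairwise SHA-256 fold like the one in the browser prototype above (Node's built-in crypto here; the stage hashes are placeholders for outputHash values from a proof file):
```js
// Sketch: recompute a Merkle root from stage hashes and show that altering one stage changes it.
import { createHash } from "node:crypto";

const sha256Hex = (s) => createHash("sha256").update(s).digest("hex");

function merkleRoot(hashes) {
  let level = [...hashes];
  while (level.length > 1) {
    const next = [];
    for (let i = 0; i < level.length; i += 2) {
      next.push(sha256Hex(level[i] + (level[i + 1] ?? level[i]))); // duplicate last hash if odd count
    }
    level = next;
  }
  return level[0];
}

const stageHashes = ["aa...", "bb...", "cc..."];   // outputHash values from solavia-proof.json
const signedRoot = merkleRoot(stageHashes);

const tampered = [...stageHashes];
tampered[1] = sha256Hex("modified stage output"); // attacker edits one stage
console.log(merkleRoot(tampered) === signedRoot); // false: the recomputed root no longer matches
```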
Solavia Runtime – Open Source (Non-Commercial). ✅ Free for personal, academic, and research use. ❌ Commercial use requires a paid license. © 2025 James Chapman LLC. Email: [email protected]