We are in the Agentic AI era. This is the infrastructure it runs on.

Give Your Agents
Context They Can Trust

Your agents just ask and tell. We handle everything underneath.

Feed NocturnusAI plain text, get back structured facts. Ask it questions in natural language, get verified answers with proof. Rules, inference, memory lifecycle, and consistency all happen automatically — your agent doesn't need to know how.

The logic engine at the foundation of every serious AI agent.

$ curl -fsSL nocturnus.ai/install | bash
Start Building → See the Architecture
Hexastore — 6-way indexed · Backward + Forward Chaining · Truth Maintenance System · Temporal Atoms · Salience Memory · ACID Transactions · WAL + Snapshots · MCP · A2A · REST
nocturnus — the logic engine
# 1. Feed it plain English — facts extracted and stored automatically
$ POST /extract  { "text": "Acme Corp is on the enterprise plan. They get 24/7 SLA support.", "assert": true }
✓ Extracted & stored 2 facts
customer_tier(acme_corp, enterprise) · sla_support(acme_corp, 24_7)
# 2. Agent asks a natural language question — gets a verified answer
$ POST /synthesize  { "question": "What support level does Acme Corp have?" }
✓ "Acme Corp has 24/7 SLA support on the enterprise plan."
sourced from: sla_support(acme_corp, 24_7) · customer_tier(acme_corp, enterprise)
# 3. Or query directly via MCP — same KB, same verified truth
$ mcp tool: ask  { "predicate": "customer_tier", "args": ["acme_corp", "?tier"] }
✓ customer_tier(acme_corp, enterprise)
Works with LangChain · CrewAI · Claude · Cursor · Windsurf · AutoGen · any MCP client
The Architecture

Not a Plugin. A Foundation.

Other tools sit on top of your LLM and hope for the best. Nocturnus sits beneath your agents and provides the substrate that makes correct reasoning possible — regardless of which LLM or framework is above it.

Your Agent Layer
LangChain · CrewAI · AutoGen · Claude · Custom Agent
Protocol Layer
MCP (9 tools) · HTTP REST API · Python SDK · TypeScript SDK · A2A Protocol
NocturnusAI — The Logic Engine
Hexastore
6-way indexed KB
Dual Inference
Backward + Rete
Truth Maintenance
Cascade retracts
Temporal Atoms
Time-aware facts
Salience Memory
Recency · freq · priority
ACID Transactions
Atomic reasoning
WAL + Snapshots
Crash recovery
Multi-Tenancy
DB + tenant headers
agent.py — any framework connects the same way
from langchain_anthropic import ChatAnthropic
from langchain.agents import AgentExecutor, create_tool_calling_agent
from langchain_core.prompts import ChatPromptTemplate
from nocturnusai.langchain import get_nocturnusai_tools

# Point your agent at the logic engine
tools = get_nocturnusai_tools("http://localhost:9300")
# tell, ask, teach, forget, recall, context
# — all backed by the Hexastore + inference engine

prompt = ChatPromptTemplate.from_messages([
    ("system", "Answer using the NocturnusAI tools."),
    ("human", "{input}"),
    ("placeholder", "{agent_scratchpad}"),
])

llm = ChatAnthropic(model="claude-sonnet-4-20250514")
agent = create_tool_calling_agent(llm, tools, prompt)
executor = AgentExecutor(agent=agent, tools=tools)

result = executor.invoke({
  "input": "Is Acme Corp eligible for premium SLA?"
})
# Agent reasons over verified facts, not LLM memory.
# Answer is provable. Traceable. Consistent.
What the infrastructure provides: provable · consistent · durable
// 1. Hexastore returns the fact in <100ms
customer_tier(acme, enterprise) → TRUE
// 2. Inference engine derives eligibility via rule
eligible_for_sla(acme) → DERIVED
via: eligible_for_sla(?c) :- customer_tier(?c, enterprise)
// 3. Full proof trace returned to agent
{ "result": true,
"proof": [
"customer_tier(acme, enterprise)",
"rule: eligible_for_sla(?c) :- ..."
] }
// Agent answers from infrastructure, not from
// statistical weights. Correct every time. ✓

9 MCP tools, all backed by the logic engine

tell
teach
ask
forget
recall
context
compress
cleanup
predicates
Built for agentic developers

Your Agent Asks. NocturnusAI Knows.

Start with natural language — feed it text, ask it questions, connect via MCP. The logic engine, inference, and memory management all happen beneath the surface.

Natural language

Plain English In, Verified Facts Out

POST /extract with any text. NocturnusAI calls your LLM to pull out structured facts and stores them automatically. No schema design, no parsing code, no mapping logic — your agent just feeds it context.
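
A minimal sketch of the call, using only the request shape shown in the terminal example above (the host and port assume a default local install on :9300):

```python
import json
import urllib.request

# Build the POST /extract request; payload fields mirror the example above.
payload = {
    "text": "Acme Corp is on the enterprise plan. They get 24/7 SLA support.",
    "assert": True,  # store the extracted facts immediately
}
req = urllib.request.Request(
    "http://localhost:9300/extract",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# with urllib.request.urlopen(req) as resp:   # uncomment against a live server
#     print(json.load(resp))
```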

Q&A

Ask Questions, Get Grounded Answers

POST /synthesize with a natural language question. NocturnusAI queries its fact store, runs inference, and returns a sourced answer with a derivation trail — not a hallucinated guess from token probabilities.
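
The same pattern works for questions; a sketch against the request shape from the terminal example above (default local port assumed):

```python
import json
import urllib.request

# Ask a natural-language question via POST /synthesize.
question = {"question": "What support level does Acme Corp have?"}
req = urllib.request.Request(
    "http://localhost:9300/synthesize",
    data=json.dumps(question).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# with urllib.request.urlopen(req) as resp:   # uncomment against a live server
#     answer = json.load(resp)                # answer text + derivation trail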

MCP native

9 MCP Tools, Zero Integration Work

Connect any MCP-compatible agent, IDE, or framework with a two-line config. tell, ask, teach, forget, recall, context — your agent gets a complete reasoning toolkit without writing any integration code.
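
The exact keys vary by MCP client; as one hypothetical example, a client that follows the common `mcpServers` convention might point at a local server like this (the `/mcp` path and port here are assumptions for the sketch, not documented values):

```json
{
  "mcpServers": {
    "nocturnusai": { "url": "http://localhost:9300/mcp" }
  }
}
```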

What happens underneath
Memory lifecycle

Salience-Ranked Memory

Composite scoring keeps the most relevant facts surfaced for your agent's context window. Episodic patterns consolidate into semantic summaries. Low-relevance facts decay automatically.
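
As a rough illustration of composite scoring (the multiplicative form, weights, and decay half-life here are sketch assumptions, not Nocturnus internals):

```python
import math

def salience(age_s: float, access_count: int, priority: float,
             half_life_s: float = 3600.0) -> float:
    """Illustrative composite score: recency decay * log-frequency * priority."""
    recency = math.exp(-age_s / half_life_s)       # decays with fact age
    frequency = math.log1p(access_count)           # diminishing returns on hits
    return recency * frequency * priority

# A fact touched seconds ago with 9 accesses outranks a 10-hour-stale fact
# with 100 accesses: recency dominates.
fresh = salience(age_s=0.0, access_count=9, priority=1.0)
stale = salience(age_s=36000.0, access_count=100, priority=1.0)
```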

Consistency

Truth Maintenance System

Retract a fact and every conclusion that depended on it disappears automatically. No stale inferences, no manual cleanup — the knowledge base stays consistent by design.
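
The cascade can be pictured with a toy justification graph (a sketch, not the engine's actual TMS data structures):

```python
# Toy truth-maintenance sketch: each derived fact records the base facts that
# justify it; retracting a base fact cascades to everything it supported.
kb = {"customer_tier(acme, enterprise)", "eligible_for_sla(acme)"}
supports = {
    "eligible_for_sla(acme)": {"customer_tier(acme, enterprise)"},
}

def retract(fact: str) -> None:
    kb.discard(fact)
    # Cascade: any derived fact justified by `fact` is retracted too.
    for derived, deps in list(supports.items()):
        if fact in deps and derived in kb:
            retract(derived)

retract("customer_tier(acme, enterprise)")
# kb is now empty: the derived eligibility fact disappeared with its support.
```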

Time-aware

Temporal Atoms

Every fact carries validFrom, validUntil, and TTL fields. Facts auto-expire. Query what was true at any point in time. Agents reason over history, not just the present snapshot.
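
A point-in-time validity check might look like this (the field names follow the validFrom/validUntil wording above, but the record shape is a sketch, not the wire format):

```python
from datetime import datetime, timezone

def is_valid_at(atom: dict, t: datetime) -> bool:
    # A fact holds at t if t falls inside its [validFrom, validUntil) window.
    after_start = atom["validFrom"] <= t
    before_end = atom["validUntil"] is None or t < atom["validUntil"]
    return after_start and before_end

atom = {
    "fact": "customer_tier(acme, enterprise)",
    "validFrom": datetime(2025, 1, 1, tzinfo=timezone.utc),
    "validUntil": datetime(2025, 3, 1, tzinfo=timezone.utc),
}
during = is_valid_at(atom, datetime(2025, 2, 1, tzinfo=timezone.utc))  # True
after = is_valid_at(atom, datetime(2025, 6, 1, tzinfo=timezone.utc))   # False
```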

Transactional

ACID Transactions

Multi-agent systems write concurrently. Transactions ensure atomic commits with contradiction detection — agents can explore hypotheticals without polluting shared state.
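
One way to picture contradiction detection at commit time (an in-memory sketch; the class and method names are hypothetical, not the Nocturnus SDK):

```python
# In-memory sketch of an atomic commit with contradiction detection.
class Txn:
    def __init__(self, kb: dict):
        self.kb = kb
        self.staged = {}

    def assert_fact(self, predicate: str, subject: str, value: str) -> None:
        self.staged[(predicate, subject)] = value   # buffered, not yet visible

    def commit(self) -> bool:
        # Contradiction: the same (predicate, subject) is already committed
        # with a different value. Roll back the whole write set if one exists.
        for key, value in self.staged.items():
            if key in self.kb and self.kb[key] != value:
                self.staged.clear()
                return False
        self.kb.update(self.staged)   # all staged writes land together
        return True

kb = {("subscription_tier", "acme_corp"): "starter"}
txn = Txn(kb)
txn.assert_fact("subscription_tier", "acme_corp", "enterprise")
committed = txn.commit()   # False: contradicts the committed "starter" fact
```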

Ops-ready

Production Durability

WAL + snapshots for crash recovery. Leader/follower replication for read scaling. Prometheus metrics. Kubernetes-ready health probes. Self-hosted, your data, your infrastructure.

Interoperable

Universal Protocol Support

MCP, REST, Python SDK, TypeScript SDK, A2A agent discovery. Whatever your stack, NocturnusAI plugs in. New protocols don't require rewriting your knowledge layer.

⚡ Zero to production

Up and Running in 60 Seconds

No signup. No cloud dependency. No schemas to design. Production-grade infrastructure, self-hosted, on your terms.

01
30 seconds

Deploy the Logic Engine

One curl command. The installer checks Docker, pulls the image, starts the server, waits for healthy, and installs the native CLI binary. Nocturnus is live on port 9300 in under 30 seconds.

02
Hexastore + TMS

Load Your World

Assert facts about your domain: customers, products, rules, state, relationships. Everything is structured, typed, and time-aware. Rules you define teach the engine what to derive. The KB grows as your world grows.

03
MCP · SDK · REST

Connect Your Agents

Point any MCP-compatible framework, the Python SDK, TypeScript SDK, or direct HTTP at the running server. Your agents get 9 tools backed by the full reasoning stack — ask questions, get provable answers.

bash
$ curl -fsSL https://raw.githubusercontent.com/Auctalis/nocturnusai/main/install.sh | bash
✓ Nocturnus live on :9300 — WAL ready · Hexastore ready · Inference ready
✓ CLI installed → nocturnusai
$ curl localhost:9300/health
{ "status": "ok", "ready": true }

The Difference

What happens when your agent needs to know a customer's subscription tier?

❌ Without Nocturnus: LLM guessing
// Agent prompt stuffing...
"Based on the conversation, I believe
 the customer is on the premium plan.
 I'm not entirely sure, but they
 mentioned something about enterprise
 features in a previous message..."

// Wrong. The customer is on "starter".
// Your agent just offered a 50% discount
// to the wrong tier. 💸
✓ With Nocturnus: fact-grounded answer
// 1. Ingest plain English → facts extracted
POST /extract
{ "text": "Acme Corp is on the starter plan",
  "assert": true }

// ✓ Extracted: subscription_tier(acme_corp, starter)

// 2. Agent asks in natural language
POST /synthesize
{ "question": "What plan is Acme on?" }

{ "answer": "Acme Corp is on the starter plan.",
  "derivation": ["subscription_tier(acme_corp, starter)"],
  "confidence": 0.95 }

// Correct. Sourced. Provable. ✓

Built for production from day one

< 100ms
Fact Retrieval
6-way Hexastore indexing
ACID
Transactional Truth
Commit or rollback atomically
WAL
Crash Recovery
Write-ahead log + snapshots
MCP
Protocol Native
9 tools, any agent framework