Simplex Cognitive Models

AI Models for Every Task

Pre-trained cognitive models with calibrated confidence, belief revision, and specialized adapters for 50+ business domains.

52 Specialists
100% Open Source

Simplex Cognitive Base Models

Pre-trained models with confidence calibration, belief revision, and memory context protocol.

Divine Tier
32B

simplex-cognitive-32b

Cross-hive reasoning, complex synthesis, high-stakes decisions.

Size: 18 GB (Q4) · Context: 128K tokens · License: Apache 2.0
Coming Soon
Hive Tier
8B

simplex-cognitive-8b

The primary small language model (SLM) for each hive; handles specialist inference and reasoning.

Size: 4.1 GB (Q4) · Context: 128K tokens · License: Apache 2.0
Download GGUF
Edge Tier
3B

simplex-cognitive-3b

Mobile and embedded deployment. Fast local inference.

Size: 700 MB (Q4) · Context: 32K tokens · License: Apache 2.0
Coming Soon

Specialist Models

Hot-swappable LoRA adapters trained for specific business domains.

How to Use

Get started with Simplex Cognitive models in minutes.

Ollama
# Pull the base model
ollama pull simplex-cognitive-8b

# Run with a prompt
ollama run simplex-cognitive-8b "Summarize this document..."

# With confidence output
ollama run simplex-cognitive-8b "What is the capital of France? Include confidence."
Python (Transformers)
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load base model and tokenizer
tokenizer = AutoTokenizer.from_pretrained("senuamedia/simplex-cognitive-8b")
model = AutoModelForCausalLM.from_pretrained(
    "senuamedia/simplex-cognitive-8b"
)

# Load specialist adapter
model = PeftModel.from_pretrained(
    model, "senuamedia/simplex-lora-coding"
)

# Generate with confidence
prompt = "What is the capital of France? Include confidence."
inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=256, temperature=0.7, do_sample=True)
print(tokenizer.decode(output[0], skip_special_tokens=True))

Ready to Build?

Explore the full model catalog or learn more about the Simplex language.