---
title: README
emoji: 🔩
colorFrom: gray
colorTo: blue
sdk: static
pinned: false
thumbnail: >-
  https://cdn-uploads.huggingface.co/production/uploads/69c6fa91d74254bbb63f3348/10R3gfJB36vsT0scshb5e.png
---
<div class="center">
  <h1>Tinman Lab</h1>
  <h3>Autonomous Machines. Second-Order Systems.</h3>
  <div class="sub">AGENT MEMORY · ADVERSARIAL SAFETY · AGENTIC ECONOMY · PERCEPTION SYSTEMS · APPLIED RESEARCH</div>
</div>

<hr>

<p>We build on-device AI systems that reason, remember, and self-correct — small models designed to run autonomously at the edge with calibrated uncertainty and adversarial robustness.</p>

<h2>Research Areas</h2>

<ul>
  <li><strong>Agent Memory</strong> — Encrypted semantic memory infrastructure for persistent agent context</li>
  <li><strong>Adversarial Safety</strong> — Multi-agent stress-testing and trust verification for autonomous systems</li>
  <li><strong>Perception Systems</strong> — On-device vision, voice, and multimodal understanding</li>
  <li><strong>Disposition Distillation</strong> — A three-arc study finding that imitation, attention-head tempering, and frozen-base sidecars all fail to move judge-measured disposition without damaging content quality at sub-billion scale (<a href="https://arxiv.org/abs/2604.11867">arXiv:2604.11867</a>).</li>
</ul>

<h2>Open-Source Releases</h2>

<p><strong>Tinman SmolOmni (MLA)</strong> — small omnimodal models with multi-head latent attention.</p>

<table>
  <thead>
    <tr><th>Model</th><th>Description</th></tr>
  </thead>
  <tbody>
    <tr><td><a href="https://huggingface.co/Tinman-Lab/Tinman-SmolOmni-MLA-256M">Tinman-SmolOmni-MLA-256M</a></td><td>256M parameter omnimodal</td></tr>
    <tr><td><a href="https://huggingface.co/Tinman-Lab/Tinman-SmolOmni-MLA-500M">Tinman-SmolOmni-MLA-500M</a></td><td>500M parameter omnimodal</td></tr>
    <tr><td><a href="https://huggingface.co/Tinman-Lab/Tinman-SmolOmni-MLA-Toolkit">Tinman-SmolOmni-MLA-Toolkit</a></td><td>Training and inference toolkit</td></tr>
  </tbody>
</table>
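
<p>A minimal loading sketch, assuming the checkpoints expose standard <code>transformers</code> entry points (the <code>Auto*</code> classes and <code>trust_remote_code</code> usage below are assumptions; see each model card for the actual API):</p>

```python
# Minimal sketch: load a SmolOmni checkpoint via transformers.
# Assumption: the repo ships custom code reachable through the Auto*
# classes; check the model card for the real entry points and modalities.
from transformers import AutoProcessor, AutoModelForCausalLM

repo = "Tinman-Lab/Tinman-SmolOmni-MLA-256M"
processor = AutoProcessor.from_pretrained(repo, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(repo, trust_remote_code=True)

# Text-only round trip; image/audio inputs would go through the same
# processor if the model card documents them.
inputs = processor(text="Describe multi-head latent attention.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(processor.batch_decode(outputs, skip_special_tokens=True)[0])
```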

<p><strong>Tinman Companion</strong> — Gemma 4 fine-tunes for on-device companion use cases.</p>

<table>
  <thead>
    <tr><th>Model</th><th>Description</th></tr>
  </thead>
  <tbody>
    <tr><td><a href="https://huggingface.co/Tinman-Lab/Tinman-gemma4-companion-merged">Tinman-gemma4-companion-merged</a></td><td>Full-precision merged model</td></tr>
    <tr><td><a href="https://huggingface.co/Tinman-Lab/Tinman-gemma4-companion-gguf">Tinman-gemma4-companion-gguf</a></td><td>GGUF quantized for llama.cpp</td></tr>
    <tr><td><a href="https://huggingface.co/Tinman-Lab/Tinman-gemma4-companion-litert-lm">Tinman-gemma4-companion-litert-lm</a></td><td>LiteRT-LM for on-device deployment</td></tr>
    <tr><td><a href="https://huggingface.co/Tinman-Lab/Tinman-gemma4-companion-sft">Tinman-gemma4-companion-sft</a></td><td>SFT checkpoint</td></tr>
    <tr><td><a href="https://huggingface.co/Tinman-Lab/Tinman-gemma4-companion-dpo">Tinman-gemma4-companion-dpo</a></td><td>DPO checkpoint</td></tr>
  </tbody>
</table>
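
<p>A minimal sketch for running the GGUF build on-device with <code>llama-cpp-python</code>; the quantization filename below is hypothetical, so list the repo files first to pick a variant that actually exists:</p>

```python
# Minimal sketch: run the companion GGUF locally with llama-cpp-python.
# Assumption: the filename below is hypothetical -- browse the repo's
# file listing for the quantizations actually published.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

path = hf_hub_download(
    repo_id="Tinman-Lab/Tinman-gemma4-companion-gguf",
    filename="tinman-gemma4-companion-Q4_K_M.gguf",  # hypothetical name
)
llm = Llama(model_path=path, n_ctx=4096)
out = llm("User: hello!\nAssistant:", max_tokens=128, stop=["User:"])
print(out["choices"][0]["text"])
```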

<h2>Links</h2>

<p class="links">
  <a href="https://tinmanlab.com">Website</a>
  <a href="https://github.com/tinmanlabsl/">GitHub</a>
  <a href="https://arxiv.org/abs/2604.11867">Paper (arXiv)</a>
  <a href="https://github.com/tinmanlabsl/disposition-distillation">DD Artifacts</a>
</p>

<style>
  body { font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, 'Helvetica Neue', Arial, sans-serif; max-width: 720px; margin: 0 auto; padding: 2rem 1.5rem; color: #1a1a2e; line-height: 1.6; background: #fff; }
  .center { text-align: center; }
  h1 { font-size: 2rem; margin-bottom: 0.2rem; font-weight: 700; }
  h3 { font-size: 1.1rem; font-weight: 400; color: #333; margin-top: 0.2rem; margin-bottom: 0.3rem; }
  .sub { font-size: 0.75rem; letter-spacing: 0.15em; color: #888; margin-bottom: 1.5rem; }
  hr { border: none; border-top: 1px solid #e0e0e0; margin: 1.5rem 0; }
  h2 { font-size: 1.2rem; margin-top: 1.8rem; margin-bottom: 0.8rem; }
  p { margin-bottom: 0.8rem; }
  ul { padding-left: 1.2rem; }
  li { margin-bottom: 0.4rem; }
  table { width: 100%; border-collapse: collapse; margin: 1rem 0; font-size: 0.95rem; }
  th, td { text-align: left; padding: 0.5rem 0.8rem; border-bottom: 1px solid #eee; }
  th { font-weight: 600; border-bottom: 2px solid #ddd; }
  a { color: #2563eb; text-decoration: none; }
  a:hover { text-decoration: underline; }
  .links a { margin-right: 1.5rem; }
  strong { font-weight: 600; }
</style>