<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8" />
  <meta name="viewport" content="width=device-width, initial-scale=1" />
  <title>Tinman Lab</title>
  <style>
    body {
      font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, 'Helvetica Neue', Arial, sans-serif;
      max-width: 720px;
      margin: 0 auto;
      padding: 2rem 1.5rem;
      color: #1a1a2e;
      line-height: 1.6;
      background: #fff;
    }
    .center { text-align: center; }
    h1 {
      font-size: 2rem;
      margin-bottom: 0.2rem;
      font-weight: 700;
    }
    h3 {
      font-size: 1.1rem;
      font-weight: 400;
      color: #333;
      margin-top: 0.2rem;
      margin-bottom: 0.3rem;
    }
    .sub {
      font-size: 0.75rem;
      letter-spacing: 0.15em;
      color: #888;
      margin-bottom: 1.5rem;
    }
    hr {
      border: none;
      border-top: 1px solid #e0e0e0;
      margin: 1.5rem 0;
    }
    h2 {
      font-size: 1.2rem;
      margin-top: 1.8rem;
      margin-bottom: 0.8rem;
    }
    p { margin-bottom: 0.8rem; }
    ul {
      padding-left: 1.2rem;
    }
    li {
      margin-bottom: 0.4rem;
    }
    table {
      width: 100%;
      border-collapse: collapse;
      margin: 1rem 0;
      font-size: 0.95rem;
    }
    th, td {
      text-align: left;
      padding: 0.5rem 0.8rem;
      border-bottom: 1px solid #eee;
    }
    th {
      font-weight: 600;
      border-bottom: 2px solid #ddd;
    }
    a {
      color: #2563eb;
      text-decoration: none;
    }
    a:hover {
      text-decoration: underline;
    }
    .links a {
      margin-right: 1.5rem;
    }
    strong { font-weight: 600; }
  </style>
</head>
<body>

<div class="center">
  <h1>Tinman Lab</h1>
  <h3>Autonomous Machines. Second-Order Systems.</h3>
  <div class="sub">AGENT MEMORY &middot; ADVERSARIAL SAFETY &middot; AGENTIC ECONOMY &middot; PERCEPTION SYSTEMS &middot; APPLIED RESEARCH</div>
</div>

<hr />

<p>We build on-device AI systems that reason, remember, and self-correct &mdash; small models designed to run autonomously at the edge with calibrated uncertainty and adversarial robustness.</p>

<h2>Research Areas</h2>

<ul>
  <li><strong>Agent Memory</strong> &mdash; Encrypted semantic memory infrastructure for persistent agent context</li>
  <li><strong>Adversarial Safety</strong> &mdash; Multi-agent stress-testing and trust verification for autonomous systems</li>
  <li><strong>Perception Systems</strong> &mdash; On-device vision, voice, and multimodal understanding</li>
  <li><strong>Disposition Distillation</strong> &mdash; A three-arc study finding that imitation, attention-head tempering, and frozen-base sidecars all fail to move judge-measured disposition without damaging content quality at sub-billion scale (<a href="https://arxiv.org/abs/2604.11867">arXiv:2604.11867</a>).</li>
</ul>

<h2>Open-Source Releases</h2>

<p><strong>Tinman SmolOmni (MLA)</strong> &mdash; small omnimodal models with multi-head latent attention.</p>

<table>
  <thead>
    <tr><th>Model</th><th>Description</th></tr>
  </thead>
  <tbody>
    <tr><td><a href="https://huggingface.co/Tinman-Lab/Tinman-SmolOmni-MLA-256M">Tinman-SmolOmni-MLA-256M</a></td><td>256M-parameter omnimodal model</td></tr>
    <tr><td><a href="https://huggingface.co/Tinman-Lab/Tinman-SmolOmni-MLA-500M">Tinman-SmolOmni-MLA-500M</a></td><td>500M-parameter omnimodal model</td></tr>
    <tr><td><a href="https://huggingface.co/Tinman-Lab/Tinman-SmolOmni-MLA-Toolkit">Tinman-SmolOmni-MLA-Toolkit</a></td><td>Training and inference toolkit</td></tr>
  </tbody>
</table>

<p><strong>Tinman Companion</strong> &mdash; Gemma 4 fine-tunes for on-device companion use cases.</p>

<table>
  <thead>
    <tr><th>Model</th><th>Description</th></tr>
  </thead>
  <tbody>
    <tr><td><a href="https://huggingface.co/Tinman-Lab/Tinman-gemma4-companion-merged">Tinman-gemma4-companion-merged</a></td><td>Full-precision merged model</td></tr>
    <tr><td><a href="https://huggingface.co/Tinman-Lab/Tinman-gemma4-companion-gguf">Tinman-gemma4-companion-gguf</a></td><td>GGUF quantized for llama.cpp</td></tr>
    <tr><td><a href="https://huggingface.co/Tinman-Lab/Tinman-gemma4-companion-litert-lm">Tinman-gemma4-companion-litert-lm</a></td><td>LiteRT-LM for on-device deployment</td></tr>
    <tr><td><a href="https://huggingface.co/Tinman-Lab/Tinman-gemma4-companion-sft">Tinman-gemma4-companion-sft</a></td><td>SFT checkpoint</td></tr>
    <tr><td><a href="https://huggingface.co/Tinman-Lab/Tinman-gemma4-companion-dpo">Tinman-gemma4-companion-dpo</a></td><td>DPO checkpoint</td></tr>
  </tbody>
</table>

<h2>Links</h2>

<p class="links">
  <a href="https://tinmanlab.com">Website</a>
  <a href="https://github.com/tinmanlabsl/">GitHub</a>
  <a href="https://arxiv.org/abs/2604.11867">Paper (arXiv)</a>
  <a href="https://github.com/tinmanlabsl/disposition-distillation">DD Artifacts</a>
</p>

</body>
</html>