---
license: apache-2.0
language:
  - en
library_name: transformers
tags:
  - haptics
  - time-series
  - robotics
  - sensor-fusion
  - mamba
  - transformer
pipeline_tag: time-series-classification
---

# Motoko 1B

Motoko 1B is the core foundation model of the Motoko family: a general-purpose haptic model pretrained on touch, force, and sensor-interaction data.

## Model Details

- **Parameters:** 1B
- **Architecture:** Mamba / hybrid CNN + Transformer
- **Input:** Force, torque, pressure, and vibration time series
- **Output:** Next-state prediction and signal classification
- **Sequence length:** Up to 2048 timesteps
- **Sampling rate:** Up to 1 kHz
- **License:** Apache 2.0
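
Given the 2048-timestep sequence limit and up-to-1 kHz sampling, longer recordings need to be split into windows before inference. A minimal sketch of non-overlapping windowing with NumPy (the window length and hop here are illustrative, not values taken from this repository):

```python
import numpy as np

def window_signal(signal: np.ndarray, window: int = 2048, hop: int = 2048) -> np.ndarray:
    """Split a (timesteps, channels) array into (n_windows, window, channels).

    Trailing samples that do not fill a complete window are dropped.
    """
    if len(signal) < window:
        return np.empty((0, window, signal.shape[1]))
    n_windows = 1 + (len(signal) - window) // hop
    return np.stack([signal[i * hop : i * hop + window] for i in range(n_windows)])

# Example: 5 seconds of 3-axis force sampled at 1 kHz -> 5000 timesteps.
force = np.random.randn(5000, 3)
windows = window_signal(force)
print(windows.shape)  # (2, 2048, 3)
```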

## Intended Use

Motoko 1B is designed for:

- Haptic signal classification and understanding
- Grasp stability prediction
- Material and texture recognition from touch
- Force-state forecasting
- Fine-tuning as a base for downstream haptic tasks
- Serving as the parent model for Motoko LoRA adapters

## Repository Layout

```
.
├── README.md
├── config.json
├── tokenizer_config.json
├── tokenizer.json
├── model/
│   ├── model.safetensors
│   └── model.safetensors.index.json
├── preprocessor/
│   ├── preprocessor_config.json
│   └── feature_extractor.py
├── configs/
│   ├── training_config.yaml
│   └── sensor_config.yaml
├── examples/
│   ├── inference.py
│   ├── grasp_stability.py
│   ├── material_recognition.py
│   └── force_forecasting.py
└── .gitattributes
```

## Input Format

The model expects multichannel haptic time-series windows containing one or more of the following modalities:

- Force
- Torque
- Pressure
- Vibration

Signals should be normalized and resampled according to `preprocessor/preprocessor_config.json` before inference.
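
The exact normalization statistics and target rate live in the preprocessor config; purely as an illustration of the two operations (nothing below reflects the repository's actual schema or values), per-channel z-scoring and linear-interpolation resampling might look like:

```python
import numpy as np

def zscore(x: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Normalize each channel of a (timesteps, channels) array to zero mean, unit variance."""
    return (x - x.mean(axis=0)) / (x.std(axis=0) + eps)

def resample(x: np.ndarray, src_hz: float, dst_hz: float) -> np.ndarray:
    """Linearly interpolate a (timesteps, channels) array from src_hz to dst_hz."""
    n_src = len(x)
    n_dst = int(round(n_src * dst_hz / src_hz))
    t_src = np.arange(n_src) / src_hz
    t_dst = np.arange(n_dst) / dst_hz
    return np.stack([np.interp(t_dst, t_src, x[:, c]) for c in range(x.shape[1])], axis=1)

pressure = np.random.randn(500, 16)            # 500 samples at 250 Hz
resampled = resample(pressure, 250.0, 1000.0)  # upsample to 1 kHz
normalized = zscore(resampled)
print(normalized.shape)  # (2000, 16)
```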

## Tasks

### Grasp Stability Prediction

Given a short force or tactile sequence collected during grasping, the model predicts whether the grasp is stable or likely to fail.
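
As a sketch of how a two-way stability prediction could be post-processed (the logits here are a hypothetical stand-in for model output, and the label set is assumed, not part of this repository):

```python
import numpy as np

LABELS = ["stable", "failing"]  # illustrative label set, not from the released config

def classify_grasp(logits: np.ndarray) -> tuple[str, float]:
    """Convert a 2-way logit vector into a label and its softmax probability."""
    probs = np.exp(logits - logits.max())  # subtract max for numerical stability
    probs /= probs.sum()
    idx = int(probs.argmax())
    return LABELS[idx], float(probs[idx])

# Stand-in for the model's output on one grasp window.
label, prob = classify_grasp(np.array([2.1, -0.4]))
print(label, round(prob, 3))  # stable 0.924
```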

### Material Recognition

Given touch-only or force-plus-vibration sequences, the model classifies the material category or texture family.

### Force Forecasting

Given a recent trajectory of haptic observations, the model predicts the next force state or a short-horizon continuation.
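
Short-horizon continuation is typically produced autoregressively: predict one step, append it to the context, repeat. A minimal sketch of that loop, with a trivial persistence predictor standing in for the model (`predict_next` is a placeholder, not this repository's API):

```python
import numpy as np

def predict_next(context: np.ndarray) -> np.ndarray:
    """Stand-in one-step predictor: persistence (repeat the last observation)."""
    return context[-1]

def forecast(context: np.ndarray, horizon: int) -> np.ndarray:
    """Roll a one-step predictor forward `horizon` steps autoregressively."""
    ctx = context.copy()
    preds = []
    for _ in range(horizon):
        nxt = predict_next(ctx)
        preds.append(nxt)
        ctx = np.vstack([ctx, nxt])  # feed the prediction back as context
    return np.array(preds)

history = np.random.randn(256, 3)  # recent 3-axis force trajectory
future = forecast(history, horizon=10)
print(future.shape)  # (10, 3)
```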

## Example Usage

```python
from pathlib import Path

import numpy as np

from preprocessor.feature_extractor import MotokoFeatureExtractor

# Build the feature extractor from the repository's preprocessing config.
extractor = MotokoFeatureExtractor.from_config(
    Path("preprocessor/preprocessor_config.json")
)

# A 256-timestep window of synthetic multichannel haptic data:
# 3-axis force, 3-axis torque, and a 16-channel pressure array.
sample = {
    "force": np.random.randn(256, 3),
    "torque": np.random.randn(256, 3),
    "pressure": np.random.randn(256, 16),
}

features = extractor(sample)
print(features["input_values"].shape)
```

## Training

Base training hyperparameters are stored in `configs/training_config.yaml`, and sensor assumptions are defined in `configs/sensor_config.yaml`.

## Limitations

- This repository currently contains scaffold configuration and examples.
- `model/model.safetensors` is a placeholder and should be replaced with actual trained weights.
- The final tokenizer and preprocessing values should be aligned with the released checkpoint.

## Citation

```bibtex
@misc{motoko1b,
  title        = {Motoko 1B},
  author       = {Motoko Team},
  year         = {2026},
  howpublished = {\url{https://huggingface.co/}},
  note         = {Foundation model for haptic understanding and forecasting}
}
```