# Tomoul Sentence Embeddings (all-MiniLM-L6-v2, 384-dim)

Built with Tomoul, a minimalist AI inference engine written in Zig.
## Quick Start (Browser)

```javascript
const response = await fetch('https://huggingface.co/tomoul/sentence-transformer/resolve/main/bin/tomoul_sentence_transformer_web_wasm32_bundled.wasm');
const wasm = await WebAssembly.instantiate(await response.arrayBuffer());
wasm.instance.exports.init();
```
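The quick start above only loads and initializes the module. Downstream, sentence embeddings from all-MiniLM-L6-v2 are usually compared with cosine similarity; a minimal helper in plain JavaScript (independent of the WASM module):

```javascript
// Cosine similarity between two equal-length embedding vectors
// (e.g. the 384-dim float vectors this model produces).
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
```

Values range from -1 to 1, with 1 meaning the two sentences point in the same direction in embedding space.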
## API Reference

| Function | Description |
|---|---|
| `init` | Initialize the model; call once before embedding. |
| `embed` | Compute an embedding for the text in the input buffer. |
| `get_input_buffer_ptr` | Get a pointer to the input buffer in WASM memory. |
| `get_max_input_bytes` | Get the maximum input size in bytes. |
| `get_output_buffer_ptr` | Get a pointer to the output embedding buffer. |
| `is_ready` | Check whether the model is initialized and ready. |
| `get_version` | Get the library version. |
| `reset` | Reset internal state. |
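Putting the exports together, a full embed round trip might look like the sketch below. The exact calling convention of `embed` (here assumed to take the input length in bytes) and the name of the memory export (here assumed to be the conventional `memory`) are assumptions, not documented facts; check the C header for the real signatures.

```javascript
// Hypothetical round trip through the exported functions. The signature of
// `embed` and the `memory` export name are ASSUMPTIONS for illustration.
function embedText(instance, text) {
  const { init, is_ready, embed,
          get_input_buffer_ptr, get_max_input_bytes,
          get_output_buffer_ptr, memory } = instance.exports;

  if (!is_ready()) init();

  // UTF-8-encode the text and copy it into the module's input buffer.
  const bytes = new TextEncoder().encode(text);
  if (bytes.length > get_max_input_bytes()) {
    throw new Error('input exceeds maximum input size');
  }
  new Uint8Array(memory.buffer, get_input_buffer_ptr(), bytes.length).set(bytes);

  // Run inference; `embed` is assumed to take the input length in bytes.
  embed(bytes.length);

  // Read the 384-dim float32 embedding from the output buffer. `.slice()`
  // copies it out, since memory.buffer can be invalidated by memory growth.
  return new Float32Array(memory.buffer, get_output_buffer_ptr(), 384).slice();
}
```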
## Files

| File | Description |
|---|---|
| `sentence_transformer.tl` | Raw model weights |
| `bin/tomoul_sentence_transformer_web_wasm32_bundled.wasm` | Browser WASM |
| `bin/tomoul_linux_x86_64_bundled` | Linux x64 |
| `bin/tomoul_linux_aarch64_bundled` | Linux ARM64 |
| `bin/tomoul_mac_x86_64_bundled` | macOS Intel |
| `bin/tomoul_mac_aarch64_bundled` | macOS ARM64 |
| `include/tomoul_sentence_transformer.h` | C header |
| `lib/libtomoul_sentence_transformer_*.a` | Static libraries |
| `lib/libtomoul_sentence_transformer_*.so` | Shared libraries (Linux) |
| `lib/libtomoul_sentence_transformer_*.dylib` | Dynamic libraries (macOS) |
## License

MIT License