Parakeet TDT v2 OpenVINO Models

Downloaded from: https://huggingface.co/FluidInference/parakeet-tdt-0.6b-v2-ov

Model Files

Preprocessor (Mel Spectrogram)

  • parakeet_melspectogram.xml - Model architecture
  • parakeet_melspectogram.bin - Weights (66 KB)

Encoder (Acoustic Model)

  • parakeet_encoder.xml - Model architecture
  • parakeet_encoder.bin - Weights (1.1 GB)

Decoder (LSTM Language Model)

  • parakeet_decoder.xml - Model architecture
  • parakeet_decoder.bin - Weights (14 MB)

Joint Network (Token + Duration Prediction)

  • parakeet_joint.xml - Model architecture
  • parakeet_joint.bin - Weights (3.3 MB)

Vocabulary

  • parakeet_vocab.json - SentencePiece vocabulary (1024 tokens + blank)

Model Configuration

  • Model Version: Parakeet TDT 0.6b v2
  • Blank Token ID: 1024
  • Vocabulary Size: 1024 regular tokens + 1 blank token
  • Duration Bins: [0, 1, 2, 3, 4] frames
  • Sample Rate: 16000 Hz
  • Frame Duration: 80 ms per encoder frame

Usage Example

See docs/parakeet_openvino.md for usage examples.

Basic usage:

#include "eddy/models/parakeet/parakeet_openvino.hpp"

#include <iostream>

// Create an OpenVINO backend targeting the CPU device.
auto backend = std::make_shared<eddy::OpenVINOBackend>(
    eddy::OpenVINOOptions{.device = "CPU"}
);

// Point at the four model components and the vocabulary listed above.
eddy::parakeet::ModelPaths paths{
    .preprocessor = {.path = "models/parakeet/parakeet_melspectogram.xml"},
    .encoder = {.path = "models/parakeet/parakeet_encoder.xml"},
    .decoder = {.path = "models/parakeet/parakeet_decoder.xml"},
    .joint = {.path = "models/parakeet/parakeet_joint.xml"},
    .tokenizer_json = "models/parakeet/parakeet_vocab.json"
};

// Runtime settings matching the model configuration above.
eddy::parakeet::RuntimeConfig cfg{
    .device = "CPU",
    .blank_token_id = 1024,
    .duration_bins = {0, 1, 2, 3, 4}
};

auto model = eddy::parakeet::make_openvino_parakeet(backend, paths, cfg);
model->warmup();  // run a dummy inference so the first real call is fast

// Transcribe audio
eddy::parakeet::AudioSegment segment;
segment.sample_rate = 16000;
segment.pcm = /* your 16kHz mono float32 audio */;

auto result = model->infer(segment, {});
std::cout << "Transcription: " << result.text << "\n";

Next Steps

See docs/GETTING_STARTED_PARAKEET_CPP.md for a complete implementation guide.
