Neru.ai 0.3B e100

Try it: https://huggingface.co/spaces/ezfiez/Neru

Model Details

Model Description

Neru.ai 0.3B e100 is a fully open-source Turkish language model.

  • Parameters: 300M
  • Context Window: 128 tokens
  • Memory: No conversational memory; each message is processed independently.
  • Language: Turkish only
  • Capabilities: Informational queries, general conversation.
  • Limitations: Incapable of mathematics or coding; cannot generate long-form text; limited by its training-data cutoff; may produce inaccurate or harmful information.

This model is released as a preliminary version ahead of Neru.ai 1B e200. It is fully open source and may be modified, fine-tuned, and used freely, provided proper attribution is given.
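As a quick start, a minimal generation sketch is shown below. It assumes the Hub repo id ezfiez/neru.ai.0.3b.e100 and a standard transformers checkpoint layout; neither is guaranteed by this card.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ezfiez/neru.ai.0.3b.e100"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Merhaba, nasılsın?"  # Turkish prompt; the model is Turkish-only
inputs = tokenizer(prompt, return_tensors="pt")

# Keep generation short: prompt plus completion must fit in the 128-token window.
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The sampling settings are illustrative, not recommended values from this card.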

Developed by

  • Developer: Ezfiez
  • Architecture: Mistral-based (trained from scratch)

Uses

Direct Use

  • Turkish text generation
  • Information sharing and casual conversation
  • Chatbot applications
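Because the model is stateless, a chatbot built on it should send each message on its own rather than prepending conversation history. A minimal sketch, under the same assumed repo id:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ezfiez/neru.ai.0.3b.e100"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

def reply(message: str) -> str:
    # Truncate the prompt to leave room for the completion in the 128-token window.
    inputs = tokenizer(message, return_tensors="pt", truncation=True, max_length=96)
    outputs = model.generate(**inputs, max_new_tokens=32)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:],
                            skip_special_tokens=True)

while True:
    user = input("> ")
    if not user:
        break
    print(reply(user))  # no history is kept: each call is independent
```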

Downstream Use

  • Educational research and fine-tuning experimentation
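For fine-tuning experiments, a minimal causal-LM training sketch follows. The repo id, the local file turkish_corpus.txt, and all hyperparameters are assumptions for illustration, not values from this card.

```python
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_id = "ezfiez/neru.ai.0.3b.e100"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
if tokenizer.pad_token is None:  # a custom tokenizer may lack a pad token
    tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_id)

# Hypothetical local dataset: one Turkish sentence per line.
dataset = load_dataset("text", data_files={"train": "turkish_corpus.txt"})

def tokenize(batch):
    # Stay within the 128-token context window.
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="neru-ft",
                           per_device_train_batch_size=8,
                           num_train_epochs=1),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```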

Out-of-Scope Use

  • Mathematical calculations
  • Code generation
  • Long-form content creation
  • Generation of harmful or factually incorrect content

Bias, Risks, and Limitations

  • Trained exclusively on Turkish data; may perform poorly with specialized terminology.
  • Lacks conversational context (stateless); does not remember previous messages in a session.
  • Potential risk of generating inaccurate, incomplete, or biased information.
  • Users are advised to keep these limitations in mind when using the model.

Training Details

Training Data

  • Trained on a cleaned corpus of Turkish text.
  • Dataset includes conversational data, news, and open-source text repositories.

Training Procedure

  • Trained from scratch using the Mistral architecture.
  • Uses a custom-built tokenizer (an illustrative setup is sketched after this list).
  • File Size: ~1GB
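The exact hyperparameters are not published, so the sketch below only illustrates the general setup: a BPE tokenizer trained from scratch and a Mistral configuration sized to land near 300M parameters with a 128-token context. Every value here is an assumption.

```python
from transformers import MistralConfig, MistralForCausalLM
from tokenizers import Tokenizer, models, pre_tokenizers, trainers

# Train a custom BPE tokenizer on Turkish text (hypothetical corpus file).
tok = Tokenizer(models.BPE(unk_token="[UNK]"))
tok.pre_tokenizer = pre_tokenizers.Whitespace()
tok.train(["turkish_corpus.txt"],
          trainers.BpeTrainer(vocab_size=32000,
                              special_tokens=["[UNK]", "</s>"]))

config = MistralConfig(
    vocab_size=32000,
    hidden_size=1024,             # illustrative sizing, not published values
    intermediate_size=4096,
    num_hidden_layers=16,
    num_attention_heads=16,
    num_key_value_heads=8,
    max_position_embeddings=128,  # matches the stated context window
)
model = MistralForCausalLM(config)  # random init: trained from scratch
print(f"{sum(p.numel() for p in model.parameters())/1e6:.0f}M parameters")
```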

Evaluation

  • Human evaluation indicates consistent and coherent performance in short-form conversations.
  • Performance is limited in long-form generation.
  • Inadequate for tasks involving mathematics or programming.

Technical Specifications

Model Architecture

  • Type: Causal Language Model (Causal LM)
  • Parameters: 300M
  • Context: 128 tokens (see the token-budget sketch after this list)
  • Tokenizer: Custom Turkish Tokenizer
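Since the context is only 128 tokens, it helps to count tokens before generating. A small sketch, again under the assumed repo id:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("ezfiez/neru.ai.0.3b.e100")  # assumed

prompt = "Türkiye'nin başkenti neresidir?"
n_tokens = len(tokenizer(prompt)["input_ids"])
print(n_tokens)  # check whether the prompt fits

# Leave headroom for the completion: prompt + new tokens must fit in 128.
budget = 128 - 32  # assuming up to 32 generated tokens
encoded = tokenizer(prompt, truncation=True, max_length=budget)
```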

Compute Requirements

  • Memory: ~1GB RAM/VRAM is sufficient.
  • Performance: Inference is faster on GPU (see the sketch below).
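A minimal GPU-loading sketch, assuming the same repo id; half precision keeps the memory footprint near the stated ~1GB.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ezfiez/neru.ai.0.3b.e100"  # assumed repo id
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    # fp16 on GPU halves memory use; fall back to fp32 on CPU
    torch_dtype=torch.float16 if device == "cuda" else torch.float32,
).to(device)

inputs = tokenizer("Türkiye'nin başkenti neresidir?", return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```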

Citation

Developer: Ezfiez
License: MIT

APA Style: Ezfiez. (2026). Neru.ai 0.3B e100.
