---
language: en
license: mit
library_name: jax
tags:
- llama
- text-generation
- shakespeare
model_name: LLaMA3
base_model: llama
---

# LLaMA3

A custom Llama-based model trained on the Tiny Shakespeare text using JAX.

## Model Details

- **Architecture**: Llama-based Transformer
- **Parameters**: ~31.6M
- **Training Framework**: JAX
- **Tokenization**: GPT-2 encoding (tiktoken)

## How to use

```python
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("parkneurals/LLaMA3")
```