WikiChat: Stopping the Hallucination of Large Language Model Chatbots by Few-Shot Grounding on Wikipedia
Paper: arXiv 2305.14292
Quantization by Richard Erkhov.
Llama-2-7b-WikiChat-fused - bnb 8bits
This is a fine-tuned LLaMA-2 (7B) model. Please accept the LLaMA-2 license agreement before downloading it.
Refer to the following for more information:
GitHub repository: https://github.com/stanford-oval/WikiChat
Paper: https://aclanthology.org/2023.findings-emnlp.157/ — WikiChat: Stopping the Hallucination of Large Language Model Chatbots by Few-Shot Grounding on Wikipedia
Online demo: https://wikichat.genie.stanford.edu
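As a minimal sketch of how an 8-bit bitsandbytes checkpoint like this one is typically loaded with the `transformers` library: the repo id below is a placeholder (use the actual Hub repo hosting this quantization), and loading the weights requires a GPU and an accepted LLaMA-2 license.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Hypothetical repo id for illustration -- substitute the real Hub repo.
MODEL_ID = "Llama-2-7b-WikiChat-fused-8bits"

# 8-bit bitsandbytes quantization, matching the "bnb 8bits" label above.
quant_config = BitsAndBytesConfig(load_in_8bit=True)


def load_model(model_id: str = MODEL_ID):
    """Download the quantized checkpoint and return (tokenizer, model).

    Requires a CUDA GPU with bitsandbytes installed.
    """
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        quantization_config=quant_config,  # load weights in 8-bit
        device_map="auto",                 # place layers on available devices
    )
    return tokenizer, model
```

Calling `load_model()` then using `model.generate` on tokenized input follows the standard `transformers` generation workflow; no WikiChat-specific prompt format is documented in this card.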