🚀 Released URA-LLaMa: Vietnamese Large Language Models (7B, 13B, 70B parameters)

Hello everyone,

We are a research team with members from Ho Chi Minh City University of Technology (HCMUT) - VNU-HCM and Stanford University, and we are pleased to introduce our large language models, affectionately named URA-LLaMa, to the community. They are fine-tuned from Meta's original LLaMa-2 models on Vietnamese datasets and come in three sizes: 7B, 13B, and 70B parameters.

Model Family

We provide these models free of charge for research purposes. They come with evaluation results on 10 different tasks, covering a wide range of real-world usage scenarios. If you would like to try them out, see the sketch below.
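The following is a minimal sketch of loading one of the models with the Hugging Face Transformers library. The repository ID used here is an assumption for illustration only; please check the links in the Resources section for the exact model names.

```python
# Minimal sketch: load a URA-LLaMa checkpoint and generate a Vietnamese reply.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ura-hcmut/ura-llama-7b"  # hypothetical repo ID; confirm on the model hub

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to reduce GPU memory usage
    device_map="auto",          # place layers automatically across available devices
)

prompt = "Xin chào! Bạn có thể giới thiệu về Việt Nam không?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```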

Resources

About the Research Group

If you would like to contribute to the development of large language models for Vietnamese, please do not hesitate to contact us!