Model:
kobkrit/openthaigpt-0.1.0-beta
Task:
Text Generation
Datasets:
kobkrit/rd-taxqa iapp_wiki_qa_squad Thaweewat/alpaca-cleaned-52k-th Thaweewat/instruction-wild-52k-th Thaweewat/databricks-dolly-15k-th Thaweewat/hc3-24k-th Thaweewat/gpteacher-20k-th Thaweewat/onet-m6-social Thaweewat/alpaca-finance-43k-th
License:
apache-2.0
OpenThaiGPT Version 0.1.0-beta is a 7B-parameter LLaMA model fine-tuned to follow Thai-translated instructions, built on the Hugging Face LLaMA implementation.
Source Code: Apache Software License 2.0. Weights: for research use only (subject to Facebook's LLaMA weight license). Note: a commercial-use license for the OpenThaiGPT 0.1.0 weights will be released soon.
Finetune Code: https://github.com/OpenThaiGPT/openthaigpt-finetune-010beta
Inference Code: https://github.com/OpenThaiGPT/openthaigpt
Weights: https://huggingface.co/kobkrit/openthaigpt-0.1.0-beta
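A minimal usage sketch with the Hugging Face transformers library, assuming the weight repository above hosts full causal-LM weights loadable via AutoModelForCausalLM; if it instead contains LoRA adapter weights, they would first need to be merged onto the base LLaMA 7B weights (e.g. with the peft library). The prompt format shown is only an illustrative assumption.

```python
# Sketch: load the published weights and generate text from a Thai instruction.
# Assumes full causal-LM weights in the repo; adjust if only a LoRA adapter is shipped.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "kobkrit/openthaigpt-0.1.0-beta"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # 7B parameters; half precision fits on a single 16 GB GPU
    device_map="auto",
)

# Hypothetical Thai instruction prompt (the exact prompt template is an assumption).
prompt = "คำถาม: ภาษีมูลค่าเพิ่มคืออะไร\nคำตอบ:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```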
Pantip.com, ThaiSC
OpenThaiGPT Volunteers, Artificial Intelligence Entrepreneur Association of Thailand (AIEAT), and Artificial Intelligence Association of Thailand (AIAT)
Kobkrit Viriyayudhakorn (kobkrit@iapp.co.th), Sumeth Yuenyong (sumeth.yue@mahidol.edu), and Thaweewat Ruksujarit (thaweewr@scg.com).
Disclaimer: Responses generated by the model are not guaranteed to be accurate.