OLMo3-190M-zh-full
Full model for lesson L04 of the zero-to-hero LLM training bootcamp (llm001): 190M parameters, trained for 1 full epoch. Final training loss 3.521, eval loss 3.450.
Model configuration
- hidden_size: 768, num_layers: 12, num_heads: 12, intermediate_size: 3072
- vocab_size: 48000, sliding_window: 4096
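A quick back-of-the-envelope check that this configuration lands near the 190M parameters in the model name. This is a sketch under assumptions not stated in the card: a gated (SwiGLU-style) MLP with three weight matrices, untied input/output embeddings, and norm/bias parameters ignored.

```python
# Rough parameter count from the config above.
# Assumptions (hypothetical, not from the card): gated MLP with
# gate/up/down projections, untied lm_head; norms and biases ignored.
hidden, layers, inter, vocab = 768, 12, 3072, 48000

emb = vocab * hidden                      # token embedding table
attn = 4 * hidden * hidden                # Q, K, V, O projections per layer
mlp = 3 * hidden * inter                  # gate, up, down projections per layer
total = 2 * emb + layers * (attn + mlp)   # 2x emb for the untied lm_head

print(f"{total / 1e6:.0f}M")              # ≈ 187M, consistent with "190M"
```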
Training configuration
- Data: cmz1024/llm101-olmo3-zh-demo-data (500M tokens), re-converted with the tokenizer from 42ailab/OLMo3-190M-zh
- Training: A800, max_steps=-1 (full epoch), bs=24×5=120, lr=5e-4, bf16
Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("complexly/olmo3-190m-zh-full")
tok = AutoTokenizer.from_pretrained("complexly/olmo3-190m-zh-full")
```