Instructions for using mjobe105/qlora70b with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- PEFT
How to use mjobe105/qlora70b with PEFT:
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the base model, then apply the QLoRA adapter on top of it.
base_model = AutoModelForCausalLM.from_pretrained("meta-llama/Meta-Llama-3-70B-Instruct")
model = PeftModel.from_pretrained(base_model, "mjobe105/qlora70b")
```

- Notebooks
- Google Colab
- Kaggle
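
Loading the 70B base model in full precision requires well over 100 GB of memory, so in practice a QLoRA adapter like this one is usually applied on top of a 4-bit quantized base model. A minimal sketch using `BitsAndBytesConfig` — the NF4/double-quantization settings below are the common QLoRA defaults, not values taken from this repository, so treat them as assumptions and adjust for your hardware:

```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# Assumed 4-bit NF4 quantization settings (typical QLoRA defaults);
# the actual training configuration of mjobe105/qlora70b may differ.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
    bnb_4bit_use_double_quant=True,
)

# Quantize the base model at load time, then attach the adapter.
base_model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Meta-Llama-3-70B-Instruct",
    quantization_config=bnb_config,
    device_map="auto",
)
model = PeftModel.from_pretrained(base_model, "mjobe105/qlora70b")
```

Note that `meta-llama/Meta-Llama-3-70B-Instruct` is a gated repository, so you will need an access-approved Hugging Face token before either download succeeds.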