Inference on tiny-llama #562
-
Need some help here! I have fine-tuned tiny-llama on my personal books. How do I perform inference on it? Command I used: ` `
Replies: 1 comment 1 reply
-
Code to do inference can be found in the README.md of the output repo/folder. (josh-ops)
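For reference, a minimal sketch of what such an inference script typically looks like, assuming the output folder is a Hugging Face-format checkpoint (config, tokenizer, and weights saved together). The directory name `out/tiny-llama-finetuned` and the function name `generate` are placeholders, not from the repo's README — adjust to your actual output path:

```python
# Hypothetical inference sketch for a fine-tuned causal LM saved in
# Hugging Face format. The model directory below is a placeholder.

def generate(prompt, model_dir="out/tiny-llama-finetuned", max_new_tokens=100):
    """Load the fine-tuned checkpoint and generate a completion for `prompt`."""
    # Imports kept local so the sketch can be read/imported without
    # transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_dir)
    model = AutoModelForCausalLM.from_pretrained(model_dir)

    # Tokenize the prompt, generate, and decode back to text.
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Summarize chapter one:"))
```

The exact loading code may differ if the fine-tuning tool saved the model in its own format rather than a standard Hugging Face checkpoint — in that case, the README in the output folder is the authoritative source, as the reply says.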