Inference on tiny-llama #562

Closed · Answered by abhishekkrthakur
MrAnayDongre asked this question in Q&A
The code to run inference can be found in the README.md of the output repo/folder.
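For readers who can't locate that README, the snippet it contains is typically a short `transformers` generation script. Below is a minimal sketch under stated assumptions: `my-autotrain-llm` is a placeholder for your actual output repo/folder, and the `### Instruction:` prompt template is an assumed format — use whatever template your fine-tune used. The authoritative version is the one AutoTrain writes into the output README.md.

```python
def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in a minimal instruction-style template.

    This template is an assumption for illustration; adjust it to match
    the format the model was actually fine-tuned with.
    """
    return f"### Instruction:\n{instruction}\n\n### Response:\n"


def generate(instruction: str, model_dir: str = "my-autotrain-llm") -> str:
    """Load the fine-tuned checkpoint from model_dir and generate a reply."""
    # Imports are local so build_prompt stays importable without
    # torch/transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_dir)
    model = AutoModelForCausalLM.from_pretrained(
        model_dir,
        torch_dtype=torch.float16,  # half precision to fit TinyLlama on a small GPU
        device_map="auto",
    )
    inputs = tokenizer(build_prompt(instruction), return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=128, do_sample=False)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)


# Usage (requires the trained checkpoint on disk or on the Hub):
# print(generate("Explain what TinyLlama is in one sentence."))
```

If the training run produced a PEFT/LoRA adapter rather than merged weights, the README snippet will instead load the adapter (e.g. via `peft`) on top of the base model.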

Replies: 1 comment 1 reply

Answer selected by MrAnayDongre