This project fine-tunes large language models (LLMs) for text-based recommendations, using a novel prompt mechanism to improve accuracy and user satisfaction. It demonstrates efficient model adaptation with diverse datasets, leveraging advanced libraries and techniques for optimal performance.
🐳 Aurora is a Chinese-version MoE model: a follow-up work based on Mixtral-8x7B that activates the model's Chinese open-domain chat capability.