# transformer-architecture

Here are 216 public repositories matching this topic...

This study investigates the effectiveness of three Transformers (BERT, RoBERTa, XLNet) in handling data sparsity and cold-start problems in recommender systems. We present a Transformer-based hybrid recommender system that predicts missing ratings and extracts semantic embeddings from user reviews to mitigate these issues (a sketch of the embedding step appears below).

  • Updated May 30, 2024
  • Jupyter Notebook
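
As a rough illustration of the embedding-extraction step described above, here is a minimal Python sketch; it is not code from the repository, and the BERT checkpoint name, truncation length, and mean pooling are assumptions. It only shows how a pretrained Transformer can turn one user review into a fixed-size semantic vector.

```python
# Minimal sketch (not the repository's code): extracting a semantic embedding
# from a user review with a pretrained BERT model, assuming the HuggingFace
# `transformers` library and PyTorch are installed.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "bert-base-uncased"  # RoBERTa/XLNet checkpoints could be swapped in

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)
model.eval()

def review_embedding(review: str) -> torch.Tensor:
    """Return a fixed-size embedding for one review (mean pooling is an assumption)."""
    inputs = tokenizer(review, return_tensors="pt", truncation=True, max_length=256)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, hidden_dim)
    return hidden.mean(dim=1).squeeze(0)            # (hidden_dim,)

emb = review_embedding("Great battery life, but the screen scratches easily.")
print(emb.shape)  # torch.Size([768]) for bert-base-uncased
```

Such vectors could then feed a downstream rating predictor; the hybrid model's actual architecture is described in the repository itself.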

Seq2SeqSharp is a tensor-based, fast and flexible deep neural network framework written in .NET (C#). It has many highlighted features, such as automatic differentiation, multiple network types (Transformer, LSTM, BiLSTM and so on), multi-GPU support, cross-platform support (Windows, Linux, x86, x64, ARM), multimodal models for text and images, and more; an illustrative sketch of the Transformer layer type follows below.

  • Updated May 29, 2024
  • C#
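
Seq2SeqSharp itself is a C# framework, so the snippet below does not use its API. It is only a small PyTorch sketch of the Transformer encoder layer, the network type named in the description, with all dimensions chosen arbitrarily for illustration.

```python
# Illustrative only: a single Transformer encoder stack in PyTorch,
# showing the "Transformer" network type mentioned above.
# This is NOT Seq2SeqSharp code.
import torch
import torch.nn as nn

layer = nn.TransformerEncoderLayer(
    d_model=512,          # token embedding size
    nhead=8,              # number of attention heads
    dim_feedforward=2048, # hidden size of the position-wise feed-forward block
    batch_first=True,     # (batch, seq, feature) layout
)
encoder = nn.TransformerEncoder(layer, num_layers=6)

x = torch.randn(2, 10, 512)  # batch of 2 sequences, 10 tokens each
y = encoder(x)               # output keeps the same shape: (2, 10, 512)
print(y.shape)
```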
