GPT-based protein language model for PTM site prediction (Jupyter Notebook; updated May 28, 2024)
Access to other API endpoints, with support for Mixtral, OpenAI, Gemini, and more. Repository from OpenLLM.
A PyTorch implementation of fine-tuning GPT-2 (Generative Pre-trained Transformer 2) for dialogue generation.
A GPT-2 implementation for R using OpenAI's trained weights.
This repository contains demos I made with the Transformers library by HuggingFace.
Evaluating language models based on their strategic game-playing capabilities using chess as a benchmark.
An implementation of MiniGPT (a 20M-parameter model) in PyTorch.
Improving Chest X-Ray Report Generation by Leveraging Warm-Starting
Annotations of interesting ML papers I have read.
Arabic text regression with various models, GPT-2 text generation, and BERT-based text classification.
A pure Haskell implementation of a decoder-only transformer (GPT)
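As background for decoder-only (GPT-style) transformers like the one above, here is a minimal sketch of single-head causal self-attention in plain Python. It is illustrative only, not code from any listed repository: it omits learned Q/K/V projections (queries, keys, and values are the raw token vectors) and multi-head batching, keeping just the causal-mask logic that makes the model decoder-only.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def causal_self_attention(x):
    """Toy single-head self-attention with a causal mask.

    x: list of T token vectors, each of dimension D.
    For simplicity Q = K = V = x (no learned projections).
    Position i may only attend to positions j <= i.
    """
    T = len(x)
    D = len(x[0])
    out = []
    for i in range(T):
        # Scaled dot-product scores against past positions only.
        scores = [sum(x[i][d] * x[j][d] for d in range(D)) / math.sqrt(D)
                  for j in range(i + 1)]
        w = softmax(scores)
        # Weighted sum of the visible value vectors.
        out.append([sum(w[j] * x[j][d] for j in range(i + 1))
                    for d in range(D)])
    return out
```

Because of the mask, the first output position attends only to itself, so it reproduces the first input vector; later positions mix in earlier ones.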
A Python chatbot built on AutoGen and tinygrad, using advanced agents for dynamic conversations and function orchestration to extend traditional chatbot capabilities.
Visual Studio Code client for Tabnine. https://marketplace.visualstudio.com/items?itemName=TabNine.tabnine-vscode
A practical example of deploying an API with FastAPI.
Code for the ICML 2024 paper "MADA: Meta-Adaptive Optimizers through Hyper-Gradient Descent".
🛸 Use pretrained transformers like BERT, XLNet and GPT-2 in spaCy
Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)