Representation Learning Basics and Feature Extraction in Text
Updated Jun 17, 2021 · Jupyter Notebook
PyTorch implementation of the NLP experiment described in the original contrastive predictive coding paper (2018).
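The training objective at the heart of contrastive predictive coding is the InfoNCE loss: score a context representation against a batch of candidate future encodings and treat the matching one as the positive. A minimal NumPy sketch of that loss follows — my own illustration, not code from the repository; the function name and argument layout are assumptions.

```python
import numpy as np

def info_nce_loss(context, future):
    """InfoNCE loss as used in contrastive predictive coding (illustrative).

    context: (batch, dim) predicted representations for step t+k
    future:  (batch, dim) actual encodings of the future frames
    Row i's positive pair is row i of `future`; all other rows act as negatives.
    """
    logits = context @ future.T                      # (batch, batch) similarity scores
    logits -= logits.max(axis=1, keepdims=True)      # subtract row max for numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # positives sit on the diagonal; minimize their negative log-softmax
    return -np.mean(np.diag(log_probs))
```

When the context perfectly predicts its matching future encoding and is dissimilar to the rest, the loss approaches zero; random pairings give a loss near `log(batch)`.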
Code for reproducing results in Representation Learning in Sequence to Sequence Tasks: Multi-filter Gaussian Mixture Autoencoder.
Repository for my MSc thesis on "Scene Representation and Pre-Tagging for Autonomous Systems"
Floral Classifier using Discriminative Feature Learning
Implementation of the A* pathfinding algorithm in JavaScript (p5.js). The program represents the world as a matrix of nodes.
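The matrix-of-nodes idea behind that repo can be sketched compactly: treat each free cell as a node, expand nodes in order of cost-so-far plus a Manhattan-distance heuristic, and walk parent links back from the goal. A standalone Python sketch (my own illustration assuming a 0/1 occupancy grid; the repo itself is JavaScript/p5.js):

```python
import heapq

def a_star(grid, start, goal):
    """A* on a 2D grid of 0 (free) / 1 (wall) cells with a Manhattan heuristic.

    `grid` is a list of lists; `start` and `goal` are (row, col) tuples.
    Returns the list of cells on a shortest path, or None if unreachable.
    """
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_heap = [(h(start), start)]          # entries are (f-score, node)
    came_from = {start: None}                # parent links for path reconstruction
    g_score = {start: 0}
    closed = set()
    while open_heap:
        _, node = heapq.heappop(open_heap)
        if node == goal:                     # first pop of goal is optimal (admissible h)
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        if node in closed:
            continue                         # stale heap entry
        closed.add(node)
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) and grid[nr][nc] == 0:
                ng = g_score[node] + 1
                if ng < g_score.get((nr, nc), float("inf")):
                    g_score[(nr, nc)] = ng
                    came_from[(nr, nc)] = node
                    heapq.heappush(open_heap, (ng + h((nr, nc)), (nr, nc)))
    return None
```

On a 3×3 grid with a wall across the middle row, the returned path detours around the wall: seven cells for six unit moves.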
Implementation of k-Step Latent (KSL)
Notebook for running self-supervised learning (SSL) methods, including a plain autoencoder, a denoising autoencoder, a parallel autoencoder with an embedding loss, and SimCLR. All experiments use a simple conv6 backbone.
Topics in the Python ecosystem
TensorFlow unsupervised learning to denoise images from the MNIST digits dataset.
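The denoising-autoencoder recipe behind repos like this one is simple: corrupt the input, then train a network to reconstruct the *clean* input from the corrupted version. A minimal NumPy sketch of that training loop — my own illustration, not the repo's TensorFlow code; the function name and hyperparameters are assumptions:

```python
import numpy as np

def train_denoising_autoencoder(x, hidden=8, steps=300, lr=0.3, noise=0.3, seed=0):
    """Train a one-hidden-layer denoising autoencoder by gradient descent.

    x: (n, d) data matrix. Gaussian noise corrupts the input each step,
    but the mean-squared reconstruction error is taken against clean x.
    Returns the per-step loss history.
    """
    rng = np.random.default_rng(seed)
    n, d = x.shape
    w1 = rng.normal(0, 0.1, (d, hidden)); b1 = np.zeros(hidden)
    w2 = rng.normal(0, 0.1, (hidden, d)); b2 = np.zeros(d)
    losses = []
    for _ in range(steps):
        x_noisy = x + rng.normal(0, noise, x.shape)   # corrupt the input
        h = np.tanh(x_noisy @ w1 + b1)                # encode
        x_hat = h @ w2 + b2                           # decode (linear output)
        err = x_hat - x                               # compare against the CLEAN input
        losses.append(float((err ** 2).mean()))
        # backpropagate the mean-squared error
        g_out = 2 * err / err.size
        g_w2 = h.T @ g_out; g_b2 = g_out.sum(0)
        g_h = g_out @ w2.T * (1 - h ** 2)             # tanh derivative
        g_w1 = x_noisy.T @ g_h; g_b1 = g_h.sum(0)
        w1 -= lr * g_w1; b1 -= lr * g_b1
        w2 -= lr * g_w2; b2 -= lr * g_b2
    return losses
```

Training on fresh noise each step forces the hidden layer to capture structure that survives corruption, which is the representation-learning payoff of the denoising objective.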
Self-Supervised Bayesian Representation Learning of Acoustic Emissions from Laser Powder Bed Fusion Process for In-situ Monitoring
Rafika Boutalbi's personal website
Deep-learning-based representation learning of MS imaging data.
C++ implementation of the paper "Word-like n-gram embedding" (EMNLP 2018 Workshop on Noisy User-generated Text).
A deep convolutional network made of stacked feature extractors
Single Shot Multi-Box Detect Infer & Repeat implementation in PyTorch