
ptq

Here are 14 public repositories matching this topic.

Model Compression Toolkit (MCT) is an open-source project for optimizing neural network models for deployment on efficient, resource-constrained hardware. It provides researchers, developers, and engineers with advanced quantization and compression tools for deploying state-of-the-art neural networks.

  • Updated Jun 6, 2024
  • Python
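To make the topic concrete: the core idea that PTQ tools such as MCT automate is mapping trained floating-point weights onto a low-bit integer grid without retraining. The sketch below is a hedged, minimal illustration of symmetric int8 min-max quantization in plain Python; it is not MCT's API, and the function names are invented for this example.

```python
# Minimal sketch of post-training quantization (PTQ).
# Not MCT's API -- just symmetric int8 min-max quantization
# of a weight tensor, in pure Python for clarity.

def quantize_int8(weights):
    """Map floats to int8 using a single symmetric scale."""
    max_abs = max(abs(w) for w in weights)
    scale = max_abs / 127 if max_abs else 1.0
    q = [max(-128, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate floats from the int8 values."""
    return [v * scale for v in q]

weights = [0.12, -0.5, 0.33, 0.99, -0.07]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
# Round-trip error is bounded by half the quantization step.
max_err = max(abs(a - b) for a, b in zip(weights, restored))
print(q, round(max_err, 4))
```

Real toolkits go well beyond this: per-channel scales, calibration on sample data, and mixed-precision search are what distinguish production PTQ from this toy example.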

This is the official implementation of "LLM-QBench: A Benchmark Towards the Best Practice for Post-training Quantization of Large Language Models". It is also an efficient LLM compression tool offering a range of advanced compression methods and supporting multiple inference backends.

  • Updated Jun 7, 2024
  • Python
