[12/13] GPTQ: Accurate Post-Training Quantization for Generative Pre-trained Transformers
Date: 2022.12.13
Presenter: Jeonghoon Kim
Keywords: Post-training quantization, GPT, causal language modeling, acceleration, CUDA kernel
This paper performs post-training quantization on models up to 175B parameters using a single A100 GPU.
The methodology is quite different from existing SOTA approaches, which I find fascinating, so I have been following this paper with great interest; a rough sketch of the core idea follows below the link.
Paper (ICLR 2023): https://openreview.net/forum?id=tcbBPnfwxS
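
For intuition, here is a minimal NumPy sketch of the kind of column-by-column, inverse-Hessian-compensated quantization the paper builds on (in the spirit of Optimal Brain Quantization). This is a simplified illustration, not the paper's implementation: the round-to-nearest quantizer, the explicit matrix inverse, and all function names and sizes are assumptions; the actual GPTQ algorithm uses lazy batched updates and a Cholesky factorization to scale to 175B parameters.

```python
import numpy as np

def quantize_rtn(x, scale):
    # Symmetric round-to-nearest quantizer (a simplifying assumption;
    # the paper's grid and clipping choices are more careful).
    return scale * np.round(x / scale)

def gptq_quantize_row(w, H_inv, scale):
    """Quantize one weight row column-by-column, pushing each column's
    quantization error onto the not-yet-quantized columns using
    inverse-Hessian information (the OBQ-style update GPTQ accelerates).

    w     : (d,) weights of one output row
    H_inv : (d, d) inverse of the layer-input Hessian H = 2 * X @ X.T
            (assumed precomputed here; GPTQ works with a Cholesky form)
    scale : quantization step size
    """
    w = w.copy()
    q = np.zeros_like(w)
    for i in range(len(w)):
        q[i] = quantize_rtn(w[i], scale)
        err = (w[i] - q[i]) / H_inv[i, i]
        # Compensate the remaining weights so the layer output
        # error ||Xw - Xq||^2 stays small.
        w[i + 1:] -= err * H_inv[i, i + 1:]
    return q

# Toy usage with random calibration data (hypothetical sizes).
rng = np.random.default_rng(0)
d, n = 64, 256
X = rng.standard_normal((d, n))         # calibration inputs
H = 2 * X @ X.T + 1e-2 * np.eye(d)      # damped Hessian
q_row = gptq_quantize_row(rng.standard_normal(d), np.linalg.inv(H), 0.05)
```

The key design choice, as I understand it, is that quantizing all rows in the same fixed column order lets the expensive Hessian work be shared across rows, which is what makes PTQ at this scale feasible on a single GPU.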