
[12/13] 박승철, A Fast Post-Training Pruning Framework for Transformers #31

snudm-starlab opened this issue Dec 7, 2022 · 0 comments

When: 12/13
Who: 박승철 ([email protected])

What
Title: A Fast Post-Training Pruning Framework for Transformers
Link: https://arxiv.org/pdf/2204.09656.pdf

Keywords:

Structured Pruning, PTQ

Abstract
The selected paper proposes a structured pruning technique that, much like post-training quantization (PTQ) in the quantization setting, achieves good performance at a very small training cost. Compared with other structured pruning methods, it does not achieve the best accuracy at a given compression ratio, but the time required for compression is much shorter.
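To make the idea concrete, here is a minimal sketch (not the paper's exact algorithm) of post-training structured pruning: each attention head is scored with an empirical-Fisher-style proxy (mean squared gradient of the loss with respect to the head's output mask), and only the highest-scoring heads are kept under a budget, with no retraining afterwards. The function names, toy data, and the `keep_ratio` budget below are illustrative assumptions.

```python
import numpy as np

def head_importance(grads_per_head):
    # grads_per_head: (num_samples, num_heads) gradients of the loss
    # w.r.t. each head's mask variable, collected on a small sample set.
    # The diagonal empirical Fisher is the mean of squared gradients.
    return (grads_per_head ** 2).mean(axis=0)

def prune_heads(importance, keep_ratio):
    # Keep the top-k most important heads; return a boolean keep-mask.
    num_heads = len(importance)
    k = max(1, int(round(keep_ratio * num_heads)))
    keep = np.argsort(importance)[::-1][:k]
    mask = np.zeros(num_heads, dtype=bool)
    mask[keep] = True
    return mask

# Toy example: 12 heads whose gradient magnitudes grow with head index.
rng = np.random.default_rng(0)
grads = rng.normal(size=(32, 12)) * np.linspace(0.1, 1.0, 12)
scores = head_importance(grads)
mask = prune_heads(scores, keep_ratio=0.5)
print(mask.sum(), "heads kept out of", len(mask))
```

Because the scores come from a single pass over a small sample set and no fine-tuning follows, the whole procedure is cheap; this is the sense in which the paper's framework trades a little accuracy for a large reduction in compression time.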

Presentation slides (to be shared later)
