KAIST AI
Official implementation of "TAID: Temporally Adaptive Interpolated Distillation for Efficient Knowledge Transfer in Language Models"
Official implementation of LANTERN (ICLR'25) and LANTERN++ (ICLRW-SCOPE'25)
Official code for "Divide and Translate: Compositional First-Order Logic Translation and Verification for Complex Logical Reasoning", ICLR 2025.
This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & V…
DeepSeekMoE: Towards Ultimate Expert Specialization in Mixture-of-Experts Language Models
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
An official implementation of "Diffusion Video Autoencoders: Toward Temporally Consistent Face Video Editing via Disentangled Video Encoding" (CVPR 2023) in PyTorch.
Official PyTorch implementation of the paper "Learning Input-agnostic Manipulation Directions in StyleGAN with Text Guidance" (accepted to ICLR 2023)
Personalized and Reliable Predictive Models for Healthcare (trustworthy, personalized prediction/diagnosis models based on medical data)
Official Pytorch implementation of "Unbiased Classification Through Bias-Contrastive and Bias-Balanced Learning (NeurIPS 2021)
A curated list for Efficient Large Language Models
General technology for enabling AI capabilities w/ LLMs and MLLMs
Are Intermediate Layers and Labels Really Necessary? A General Language Model Distillation Method; GKD: A General Knowledge Distillation Framework for Large-scale Pre-trained Language Model
TAM: Topology-Aware Margin Loss for Class-Imbalanced Node Classification
GraphENS: Neighbor-Aware Ego Network Synthesis for Class-Imbalanced Node Classification (ICLR'22)
Official PyTorch implementation of SGEM: Test-Time Adaptation for Automatic Speech Recognition via Sequential-Level Generalized Entropy Minimization (INTERSPEECH 2023 Oral Presentation)
🐙 Guides, papers, lectures, notebooks, and resources for prompt engineering
Meta-learning from the initialization induced by word embeddings
This repository is the official PyTorch implementation of "Distilling Linguistic Context for Language Model Compression" by Geondo Park, Gyeongman Kim, and Eunho Yang.
Implementations for "Trimming the ℓ₁ Regularizer: Statistical Analysis, Optimization, and Applications to Deep Learning" Published on ICML 2019
An official implementation of "Decomposed Knowledge Distillation for Class-incremental Semantic Segmentation" (NeurIPS 2022) in PyTorch.
Google Research
An official implementation of "Exploiting a Joint Embedding Space for Generalized Zero-Shot Semantic Segmentation" (ICCV 2021) in PyTorch.
A curated list of awesome computer vision resources
An official implementation of "Learning Memory-guided Normality for Anomaly Detection" (CVPR 2020) in PyTorch.
An official implementation of "Learning with Privileged Information for Efficient Image Super-Resolution" (ECCV2020) in PyTorch.