Curriculum Temperature for Knowledge Distillation
Metadata: AAAI 2023. Authors: [[Zheng Li]], [[Xiang Li]], [[Lingfeng Yang]]
2023-12-13
Swing Distillation: A Privacy-Preserving Knowledge Distillation Framework
Metadata: Authors: [[Junzhuo Li]], [[Xinwei Wu]]
2023-12-13
Knowledge Distillation: A Survey
Metadata: Item Type: [[Article]]. Authors: [[Jianping Gou]], [[Baosheng Yu]], [[Stephen J.
DP-Forward
DP-Forward: Fine-tuning and Inference on Language Models with Differential Privacy in Forward Pass. Metadata: Tags: #Di
2023-12-03
FM with FL
When Foundation Model Meets Federated Learning: Motivations, Challenges, and Future Directions. Metadata: Tags: #Founda
2023-11-27
HeteroFL
HeteroFL: Computation and Communication Efficient Federated Learning for Heterogeneous Clients. Metadata: Tags: #Hetero
2023-11-23
RAIL-KD
RAIL-KD: RAndom Intermediate Layer Mapping for Knowledge Distillation. Metadata: Authors: [[Md Akmal Haidar]], [[Nithin Anc
2023-11-18
FedPEAT: Parameter-Efficient Fine Tuning, and Emulator Assisted Tuning
FedPEAT: Convergence of Federated Learning, Parameter-Efficient Fine Tuning, and Emulator Assisted Tuning for Artificial
2023-11-17
Low-Parameter Federated Learning with Large Language Models
Metadata: Tags: #LLM #Federated-Learning. Authors: [[Jin
2023-11-09
FederatedScope-LLM: A Comprehensive Package for Fine-tuning Large Language Models in Federated Learning
Metadata: Authors:
2023-11-06
Offsite-Tuning: Transfer Learning without Full Model
#LLM #Fine-tune. Metadata: Authors: [[Guangxuan Xiao]], [[Ji Lin]], [
2023-11-06
DaFKD: Domain-Aware Federated Knowledge Distillation
#Federated-Learning #GAN #Knowledge-Distillation #CVPR. Metadata: Item
2023-11-04