Core Concepts
Target Block Fine-Tuning (TBFT) improves model performance at reduced energy cost by fine-tuning only the specific blocks that correspond to the detected type of data drift, rather than the full model.
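The core idea can be sketched as selecting which block to unfreeze based on the detected drift type. The drift categories, block names, and mapping below are illustrative assumptions for the sketch, not the paper's exact configuration:

```python
# Minimal sketch of TBFT's block selection: only the block matched to the
# detected drift type is marked trainable; all other blocks stay frozen.
# DRIFT_TO_BLOCK is a hypothetical mapping, assumed for illustration.
DRIFT_TO_BLOCK = {
    "input_drift": "embedding",
    "feature_drift": "encoder",
    "output_drift": "classifier",
}

def select_trainable_blocks(model_blocks, drift_type):
    """Return a dict mapping each block name to whether it is trainable.

    model_blocks: iterable of block names in the model.
    drift_type: the detected drift category.
    """
    target = DRIFT_TO_BLOCK[drift_type]
    return {name: (name == target) for name in model_blocks}

blocks = ["embedding", "encoder", "classifier"]
trainable = select_trainable_blocks(blocks, "feature_drift")
# Only the "encoder" block would receive gradient updates here;
# in a real framework this flag would set requires_grad per block.
```

Because gradients are computed and applied for a single block instead of the whole network, each fine-tuning pass touches far fewer parameters, which is the source of the energy savings described below.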
Stats
Compared with BlockAvg, a baseline in which each block is fine-tuned individually and the resulting performance improvements are averaged, TBFT improves model accuracy by 15.30% on average. It also reduces energy consumption by 41.57% on average compared with full fine-tuning.