ysautoml.optimization.losssearch
Functional Modules
ysautoml.optimization.losssearch.train_losssearch
ysautoml.optimization.losssearch.custom_loss
Run Loss Function Search (LFS) using a custom, learnable loss function integrated into the YSAutoML optimization system. This module supports two main functionalities:
Launching a complete end-to-end training pipeline (train_losssearch) where both the model and the loss function are jointly optimized.
Directly accessing and using the learnable criterion (custom_loss) in custom training or evaluation scripts.
ysautoml.optimization.losssearch.train_losssearch
ysautoml.optimization.losssearch.train_losssearch(**kwargs)
Launches a complete end-to-end training pipeline in which the model and the custom loss function are jointly optimized.
Parameters
epochs (int, default 100): Total number of training epochs.
lr_model (float, default 0.1): Learning rate for the model optimizer.
lr_loss (float, default 0.0001): Learning rate for the optimizer that updates the custom loss parameters.
momentum (float, default 0.9): Momentum factor for the model and loss optimizers.
weight_decay (float, default 0.0005): Weight decay coefficient for L2 regularization in model optimization.
save_dir (str, default "./logs/losssearch"): Directory to save logs, checkpoints, and learned loss parameters.
device (str, default "cuda:0"): Device identifier (e.g., "cuda:0", "cuda:1", or "cpu").
Returns
None: All training logs, learned loss parameter values, and model checkpoints are saved under the specified save_dir.
Logs: ${save_dir}/training.log
TensorBoard events: ${save_dir}/events.out.tfevents.*
Checkpoints: ${save_dir}/model_epoch_XX.pth
Learned loss parameters: saved in the model state dict as criterion.theta
Examples
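A minimal usage sketch, assuming train_losssearch accepts the keyword arguments documented above and handles data loading and model construction internally; argument names should be checked against your installed version.

```python
from ysautoml.optimization.losssearch import train_losssearch

# Launch the joint model / loss-function search. The keyword arguments below
# mirror the documented defaults; override only what you need.
train_losssearch(
    epochs=100,                    # total number of training epochs
    lr_model=0.1,                  # learning rate for the model optimizer
    lr_loss=0.0001,                # learning rate for the loss-parameter optimizer
    momentum=0.9,                  # momentum for both optimizers
    weight_decay=0.0005,           # L2 regularization for model optimization
    save_dir="./logs/losssearch",  # logs, checkpoints, and loss parameters go here
    device="cuda:0",               # or "cuda:1", "cpu", ...
)
```

After training, the learned loss parameters can be read back from a saved checkpoint. The sketch below assumes the checkpoint file is the model state dict itself; the epoch number in the filename is illustrative.

```python
import torch

# Hypothetical checkpoint name: the XX in model_epoch_XX.pth is the epoch index.
state = torch.load("./logs/losssearch/model_epoch_99.pth", map_location="cpu")

# Per the Returns section, the learned loss parameters are stored in the model
# state dict under criterion.theta.
theta = {k: v for k, v in state.items() if "criterion.theta" in k}
print(theta)
```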
ysautoml.optimization.losssearch.custom_loss
ysautoml.optimization.losssearch.custom_loss()
Provides direct access to the learnable criterion (custom_loss) for use in custom training or evaluation scripts.
Parameters
None
Examples
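A minimal training-loop sketch, assuming custom_loss() returns an nn.Module-style criterion whose learnable parameters (theta, per the train_losssearch documentation) are registered and can be given their own optimizer. The model, batch, and hyperparameters here are placeholders, and the actual search schedule used by train_losssearch may differ.

```python
import torch
from ysautoml.optimization.losssearch import custom_loss

# Assumed: custom_loss() returns a torch.nn.Module criterion with learnable
# parameters that participate in backpropagation.
criterion = custom_loss()

model = torch.nn.Linear(32, 10)  # placeholder model
opt_model = torch.optim.SGD(model.parameters(), lr=0.1,
                            momentum=0.9, weight_decay=0.0005)
opt_loss = torch.optim.SGD(criterion.parameters(), lr=0.0001, momentum=0.9)

inputs = torch.randn(8, 32)               # placeholder batch
targets = torch.randint(0, 10, (8,))

# One joint update step: the same backward pass produces gradients for both
# the model weights and the loss parameters.
loss = criterion(model(inputs), targets)
opt_model.zero_grad()
opt_loss.zero_grad()
loss.backward()
opt_model.step()
opt_loss.step()
```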