ysautoml.network.oneshot
Functional Modules

- ysautoml.network.oneshot.train_dynas
- ysautoml.network.oneshot.quantize_dynas
ysautoml.network.oneshot.train_dynas
ysautoml.network.oneshot.train_dynas(**kwargs)
Run DYNAS (Subnet-Aware Dynamic Supernet Training for Neural Architecture Search) or SPOS baseline training using the original train_spos.py script, integrated into the YSAutoML API system.
Parameters
- `log_dir` (str, default `logs/spos_dynamic`): Directory path to store TensorBoard and log outputs.
- `file_name` (str, default `spos_dynamic`): Base name for experiment log and validation pickle files.
- `seed` (int, default `0`): Random seed for reproducibility.
- `epochs` (int, default `250`): Total number of training epochs.
- `lr` (float, default `0.025`): Initial learning rate for the SGD optimizer.
- `momentum` (float, default `0.9`): Momentum factor for SGD optimization.
- `wd` (float, default `0.0005`): Weight decay coefficient for L2 regularization.
- `nesterov` (bool, default `True`): Whether to enable Nesterov momentum in SGD.
- `train_batch_size` (int, default `64`): Batch size used for the training dataset.
- `val_batch_size` (int, default `256`): Batch size used for the validation dataset.
- `method` (str, default `dynas`): Training mode. Use `baseline` for SPOS baseline training, or `dynas` for dynamic architecture training with adaptive learning rate scheduling.
- `max_coeff` (float, default `4.0`): Maximum coefficient `γ_max` controlling adaptive learning rate scaling for DYNAS.
Returns
None: All training logs, pickled validation results, and TensorBoard summaries are saved under `log_dir`.

- Logs: `${log_dir}/spos_dynamic.log`
- TensorBoard events: `${log_dir}/events.out.tfevents.*`
- Validation accuracy pickle: `exps/NAS-Bench-201-algos/valid_accs/{file_name}.pkl`
- Kendall correlation report (CIFAR10/100, ImageNet)
Examples
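A minimal usage sketch. The keyword arguments mirror the documented defaults; the call itself is left commented because it launches a full training run and assumes `ysautoml` is installed:

```python
# Hedged sketch: build the kwargs for train_dynas from the documented defaults.
train_kwargs = dict(
    log_dir="logs/spos_dynamic",  # TensorBoard and log output directory
    file_name="spos_dynamic",     # base name for log and validation pickle files
    seed=0,
    epochs=250,
    lr=0.025,
    momentum=0.9,
    wd=0.0005,
    nesterov=True,
    train_batch_size=64,
    val_batch_size=256,
    method="dynas",               # or "baseline" for SPOS baseline training
    max_coeff=4.0,                # γ_max for adaptive learning rate scaling
)

# Uncomment to launch training (requires ysautoml and its datasets):
# from ysautoml.network import oneshot
# oneshot.train_dynas(**train_kwargs)
```

Passing `method="baseline"` with the same remaining kwargs runs the SPOS baseline instead of DYNAS.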
ysautoml.network.oneshot.quantize_dynas
ysautoml.network.oneshot.quantize_dynas(**kwargs)
Run post-training quantization for a searched DYNAS/SPOS subnet using the original comparison.py script, integrated into the YSAutoML API system. This function loads the searched architecture from a text file, restores a trained checkpoint, applies uniform k-bit quantization, and saves the quantized checkpoint automatically.
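Uniform k-bit quantization maps each floating-point weight onto one of 2^k evenly spaced levels spanning the weight range. The following is a simplified pure-Python sketch of that idea, not the exact `comparison.py` implementation:

```python
def uniform_quantize(weights, bits):
    """Round each float weight to the nearest of 2**bits evenly spaced
    levels covering [min(weights), max(weights)].

    Simplified illustration of uniform k-bit quantization; the real
    script quantizes per-tensor checkpoint weights.
    """
    lo, hi = min(weights), max(weights)
    levels = 2 ** bits - 1                 # number of steps between levels
    scale = (hi - lo) / levels if hi > lo else 1.0
    # Snap to the nearest integer level index, then map back to a float value.
    return [lo + round((w - lo) / scale) * scale for w in weights]

ws = [-0.5, -0.1, 0.0, 0.2, 0.5]
q8 = uniform_quantize(ws, 8)   # each value moves by at most scale / 2
```

With 8 bits the quantization step is (max − min) / 255, so the rounding error per weight is at most half a step.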
Parameters
- `arch_file` (str, default `best_structure.txt`): Path to the text file containing the searched architecture configuration. The file must contain comma-separated integers (e.g., `2, 2, 6, 0, ...`).
- `ckpt` (str, default `epoch-241.pt`): Path to the trained floating-point checkpoint file to be quantized.
- `quant_bits` (int, default `8`): Bit-width for uniform weight quantization (e.g., 8-bit, 6-bit).
- `log_name` (str, default `quantize_output`): Tag name for quantization process logging.
Returns
None: All quantization results are saved automatically as a new checkpoint file.

- Quantized checkpoint: `${ckpt_dir}/${ckpt_name}_quant{quant_bits}.pt`
- Console output: the loaded architecture tuple, quantization status, and the final saved checkpoint path.
Examples
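A minimal usage sketch mirroring the documented defaults. The call itself is left commented because it loads a real checkpoint and assumes `ysautoml` is installed:

```python
# Hedged sketch: kwargs for quantize_dynas, taken from the documented defaults.
quant_kwargs = dict(
    arch_file="best_structure.txt",  # comma-separated searched architecture
    ckpt="epoch-241.pt",             # trained floating-point checkpoint
    quant_bits=8,                    # uniform 8-bit weight quantization
    log_name="quantize_output",
)

# Uncomment to run quantization (requires ysautoml and the checkpoint file):
# from ysautoml.network import oneshot
# oneshot.quantize_dynas(**quant_kwargs)
```

With these defaults the quantized checkpoint is written next to the input checkpoint with a `_quant8` suffix, following the output path pattern above.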