ysautoml.network.zeroshot
Functional Modules
- ysautoml.network.zeroshot.autoformer
- ysautoml.network.zeroshot.mobilenetv2
ysautoml.network.zeroshot.autoformer
ysautoml.network.zeroshot.autoformer.run_search_zeroshot
ysautoml.network.zeroshot.autoformer.run_search_zeroshot(**kwargs)
Run zero-shot architecture search using AutoFormer on ImageNet.
Parameters
- param_limits (float): Maximum parameter count limit (e.g., 6, 23, 54 for Tiny/Small/Base).
- min_param_limits (float): Minimum parameter count limit.
- cfg (str): Path or name of the YAML search space (e.g., 'space-T.yaml').
- output_dir (str): Directory to save search results.
- data_path (str, default '/dataset/ILSVRC2012'): Dataset root path.
- population_num (int, default 10000): Number of architectures to sample.
- seed (int, default 123): Random seed.
- gp (bool, default True): Enable Gaussian process metric.
- relative_position (bool, default True): Enable relative positional embedding.
- change_qkv (bool, default True): Use QKV reparameterization.
- dist_eval (bool, default True): Enable distributed evaluation.
Returns
None: Logs and search results (including best_arch.yaml and search.log) are saved under output_dir.
Examples
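A minimal usage sketch. The keyword names follow the documented signature; the concrete values (budgets, paths) are illustrative assumptions, and the call itself is commented out because it requires the ImageNet dataset and a GPU environment.

```python
# Illustrative kwargs for a Tiny-budget AutoFormer zero-shot search.
# Parameter names come from the documented signature; values are examples.
search_kwargs = dict(
    param_limits=6.0,              # upper parameter budget (Tiny)
    min_param_limits=5.0,          # lower parameter budget (assumed value)
    cfg="space-T.yaml",            # Tiny search space
    output_dir="./zeroshot_search",
    data_path="/dataset/ILSVRC2012",
    population_num=10000,
    seed=123,
)

# Requires the dataset and GPUs to actually run:
# from ysautoml.network.zeroshot.autoformer import run_search_zeroshot
# run_search_zeroshot(**search_kwargs)
```

After the run, `best_arch.yaml` and `search.log` should appear under `output_dir`.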
ysautoml.network.zeroshot.autoformer.run_retrain_zeroshot
ysautoml.network.zeroshot.autoformer.run_retrain_zeroshot(**kwargs)
Retrain the best subnet architecture found by AZ-NAS using AutoFormer.
Parameters
- cfg (str): YAML configuration file path (absolute or relative).
- output_dir (str): Output directory for retraining results.
- data_path (str, default '/dataset/ILSVRC2012'): Dataset root path.
- epochs (int, default 500): Number of training epochs.
- warmup_epochs (int, default 20): Warm-up epochs.
- batch_size (int, default 256): Batch size per GPU.
- model_type (str, default 'AUTOFORMER'): Model type to train.
- mode (str, default 'retrain'): Training mode.
- relative_position (bool): Use relative positional embedding.
- change_qkv (bool): Use QKV reparameterization.
- gp (bool): Enable Gaussian process module.
- dist_eval (bool): Enable distributed evaluation.
- device (str, default '0,1,2,3,4,5,6,7'): Visible CUDA devices.
- nproc_per_node (int, default 8): Number of distributed processes per node.
- master_port (int, default 6666): Master port for distributed training.
Returns
None: Trained model weights and logs are saved under output_dir.
Examples
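A hedged usage sketch. The kwargs mirror the documented defaults; the `cfg` path is a hypothetical location for the architecture file produced by the search step, and the call is commented out because it launches distributed GPU training.

```python
# Illustrative kwargs for retraining the searched subnet.
# The cfg path below is an assumed location, not a fixed convention.
retrain_kwargs = dict(
    cfg="./zeroshot_search/best_arch.yaml",  # e.g., output of run_search_zeroshot
    output_dir="./retrain_out",
    data_path="/dataset/ILSVRC2012",
    epochs=500,
    warmup_epochs=20,
    batch_size=256,
    device="0,1,2,3,4,5,6,7",
    nproc_per_node=8,
    master_port=6666,
)

# Requires a multi-GPU environment to actually run:
# from ysautoml.network.zeroshot.autoformer import run_retrain_zeroshot
# run_retrain_zeroshot(**retrain_kwargs)
```

Note that `device` should list at least `nproc_per_node` GPU IDs so each distributed process gets its own device.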
ysautoml.network.zeroshot.mobilenetv2
ysautoml.network.zeroshot.mobilenetv2.run_search_zeroshot
ysautoml.network.zeroshot.mobilenetv2.run_search_zeroshot(**kwargs)
Run zero-shot evolution search for MobileNetV2 variants (AZ-NAS).
Parameters
- gpu (int, default 0): GPU ID to use.
- seed (int, default 123): Random seed.
- metric (str, default 'AZ_NAS'): Zero-shot score metric.
- population_size (int, default 1024): Population size for evolution search.
- evolution_max_iter (int, default 1e5): Maximum iterations for evolution search.
- resolution (int, default 224): Input image resolution.
- budget_flops (float, default 1e9): FLOPs constraint.
- max_layers (int, default 16): Maximum number of layers.
- batch_size (int, default 64): Batch size.
- data_path (str): Path to ImageNet dataset.
- num_classes (int, default 1000): Number of classes.
- search_space (str, default 'SearchSpace/search_space_IDW_fixfc.py'): Search space file.
Returns
None: Outputs the best architecture (best_structure.txt) and a FLOPs/params summary in save_dir.
Examples
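A minimal sketch of the evolution-search call. Keyword names follow the documented signature; the FLOPs budget and data path are illustrative assumptions, and the call is commented out since it needs ImageNet and a GPU.

```python
# Illustrative kwargs for the MobileNetV2 AZ-NAS evolution search.
# Values are examples; defaults are taken from the documentation above.
mbv2_search_kwargs = dict(
    gpu=0,
    seed=123,
    metric="AZ_NAS",
    population_size=1024,
    evolution_max_iter=int(1e5),
    resolution=224,
    budget_flops=450e6,            # assumed 450M-FLOPs budget (default is 1e9)
    max_layers=16,
    batch_size=64,
    data_path="/dataset/ILSVRC2012",
    num_classes=1000,
    search_space="SearchSpace/search_space_IDW_fixfc.py",
)

# Requires the dataset and a GPU to actually run:
# from ysautoml.network.zeroshot.mobilenetv2 import run_search_zeroshot
# run_search_zeroshot(**mbv2_search_kwargs)
```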
ysautoml.network.zeroshot.mobilenetv2.run_retrain_zeroshot
ysautoml.network.zeroshot.mobilenetv2.run_retrain_zeroshot(**kwargs)
Retrain the searched MobileNetV2 architecture using AZ-NAS configuration (via Horovod).
Parameters
- gpu_devices (str, default '0,1,2,3,4,5,6,7'): Visible GPU device IDs.
- metric (str, default 'AZ_NAS'): Search metric name.
- population_size (int, default 1024): Population size used in search.
- evolution_max_iter (int, default 1e5): Number of evolution iterations used in search.
- seed (int, default 123): Random seed.
- num_workers (int, default 12): Number of data loader workers.
- init (str, default 'custom_kaiming'): Weight initialization method.
- epochs (int, default 150): Number of training epochs.
- resolution (int, default 224): Input image resolution.
- batch_size_per_gpu (int, default 64): Batch size per GPU.
- world_size (int, default 8): Number of distributed workers.
- data_path (str): Dataset root path.
- best_structure_path (str, optional): Path to best_structure.txt (absolute or relative to the current working directory).
Returns
Path: Directory containing the retraining outputs.
Examples
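A hedged usage sketch for the Horovod-based retraining entry point. The kwargs mirror the documented defaults; the `best_structure_path` value is a hypothetical location for the file produced by the search step, and the call is commented out because it launches distributed training.

```python
# Illustrative kwargs for retraining the searched MobileNetV2 architecture.
# best_structure_path below is an assumed location, not a fixed convention.
mbv2_retrain_kwargs = dict(
    gpu_devices="0,1,2,3,4,5,6,7",
    metric="AZ_NAS",
    seed=123,
    num_workers=12,
    init="custom_kaiming",
    epochs=150,
    resolution=224,
    batch_size_per_gpu=64,
    world_size=8,
    data_path="/dataset/ILSVRC2012",
    best_structure_path="./save_dir/best_structure.txt",  # from run_search_zeroshot
)

# Requires Horovod and a multi-GPU environment to actually run;
# the function returns the output directory as a Path:
# from ysautoml.network.zeroshot.mobilenetv2 import run_retrain_zeroshot
# out_dir = run_retrain_zeroshot(**mbv2_retrain_kwargs)
```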