Update config
/api/{projectId}/optimize/config

Updates the EON Tuner configuration for the specified project.
Path parameters
Project ID

Body parameters
Target latency in ms: 0
Maximum number of training cycles: 5
Maximum number of trials: 2
Maximum number of parallel workers/jobs: 1
Number of initial trials: 5
Number of optimization rounds: 3
Number of trials per optimization round: 3
Whether to import metrics from previous EON Tuner runs in the same project to accelerate the hyperparameter search
Whether to import resource usage (RAM/ROM/latency) metrics to accelerate the hyperparameter search
Number of project trials to import
Number of resource usage trials to import
Enable standard error of the mean (SEM)
Standard error of the trial accuracy mean
Standard error of the trial latency mean
Hyperparameter optimization objectives and weights, in string format
Enable trial-level early stopping based on loss metrics during training
Stop the EON Tuner if the feasible (mean) objective has not improved over the past window_size iterations
Threshold in [0, 1] for considering relative improvement over the best point
Enable multi-fidelity, multi-objective optimization
Enable verbose logging
Disable search constraints
Disable trial deduplication
Tuning algorithm used to search the hyperparameter space: random, hyperband, bayesian, or custom
Model variant to optimize for: float32 or int8
List of impulses specifying the EON Tuner search space
Target device
Search space template
Search space source
Hyperparameter optimization objectives and corresponding weights
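As a quick illustration of calling this endpoint, the sketch below sends an updated configuration using Python and the requests library. The parameter table above only preserves descriptions, so the JSON field names, the POST method, the Studio base URL, and the x-api-key header are assumptions based on common Edge Impulse API conventions rather than the documented schema; consult the OpenAPI spec or a generated client for the exact field names.

```python
import requests

# Hypothetical field names: the table above lists descriptions only, so the
# keys below are illustrative guesses, not the documented request schema.
config = {
    "targetLatency": 0,               # target latency in ms
    "trainingCycles": 5,              # maximum number of training cycles
    "tuningMaxTrials": 2,             # maximum number of trials
    "tuningWorkers": 1,               # maximum number of parallel workers/jobs
    "tuningAlgorithm": "bayesian",    # random, hyperband, bayesian, or custom
    "optimizationPrecision": "int8",  # model variant: float32 or int8
    "verboseLogging": False,
}

project_id = 12345  # your project ID (path parameter)

# Base URL and auth header assumed to follow the usual Edge Impulse Studio API conventions.
url = f"https://studio.edgeimpulse.com/v1/api/{project_id}/optimize/config"
headers = {"x-api-key": "ei_..."}  # project API key

response = requests.post(url, headers=headers, json=config)
response.raise_for_status()
print(response.json())
```

A successful call typically returns a JSON body indicating whether the update was accepted; inspect response.json() for details.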