Update config
Project ID
Target latency in ms. Example: 0
Target device
Maximum number of training cycles. Example: 5
Maximum number of trials. Example: 2
Maximum number of parallel workers/jobs. Example: 1
Number of initial trials. Example: 5
Number of optimization rounds. Example: 3
Number of trials per optimization round. Example: 3
Tuning algorithm used to search the hyperparameter space
Whether to import metrics from previous EON Tuner runs in the same project to accelerate the hyperparameter search process
Whether to import resource usage (RAM/ROM/latency) metrics to accelerate the hyperparameter search process
Number of project trials to import
Number of resource usage trials to import
Enable standard error of the mean (SEM)
Standard error of the trial accuracy mean
Standard error of the trial latency mean
Hyperparameter optimization objectives ordered by priority
Hyperparameter optimization objectives and weights, in string format
Model variant to optimize for
Enable trial-level early stopping based on loss metrics during training
Stops the EON Tuner if the feasible (mean) objective has not improved over the past 'window_size' iterations
Threshold (in [0, 1]) for considering relative improvement over the best point
Enable multi-fidelity, multi-objective optimization
Enable verbose logging
List of impulses specifying the EON Tuner search space
Search space template
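For orientation, here is a minimal sketch of how a config update like this might be sent over HTTP from Python. The endpoint path, authentication header, and JSON field names (targetLatency, trainingCycles, and so on) are assumptions made for illustration and do not come from this reference; the parameter descriptions above are the authoritative list.

```python
# Minimal sketch, not the documented client: the endpoint path, auth header,
# and JSON field names below are assumptions made for illustration only.
import requests

API_BASE = "https://studio.edgeimpulse.com/v1/api"  # assumed base URL
PROJECT_ID = 12345                                  # placeholder project ID
API_KEY = "ei_0123..."                              # placeholder API key

# Assumed field names mirroring the parameter descriptions above.
config = {
    "targetLatency": 0,                  # target latency in ms
    "targetDevice": "cortex-m4f-80mhz",  # assumed device identifier
    "trainingCycles": 5,                 # maximum number of training cycles
    "tuningMaxTrials": 2,                # maximum number of trials
    "tuningWorkers": 1,                  # maximum number of parallel workers/jobs
    "initialTrials": 5,                  # number of initial trials
    "optimizationRounds": 3,             # number of optimization rounds
    "trialsPerOptimizationRound": 3,     # number of trials per optimization round
    "tuningAlgorithm": "bayesian",       # assumed algorithm name
    "verboseLogging": True,              # enable verbose logging
}

resp = requests.post(
    f"{API_BASE}/{PROJECT_ID}/optimize/config",  # assumed endpoint path
    headers={"x-api-key": API_KEY},
    json=config,
    timeout=30,
)
resp.raise_for_status()  # raise on HTTP-level errors
```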
Response: OK
Whether the operation succeeded
Optional error description (set if 'success' was false)
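Assuming the body follows the generic success/error shape described above (the 'success' field is referenced in the description; the 'error' field name is an assumption), the result of the sketch above could be checked like this:

```python
# Continues the sketch above; 'error' as a field name is an assumption.
body = resp.json()

if body.get("success"):
    print("EON Tuner config updated")
else:
    # the error description is only expected when 'success' is false
    print("Update failed:", body.get("error", "no error description"))
```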