Data Fundamentals (H) - Week 07 Quiz
1. Simulated annealing uses what metaheuristic to help avoid getting trapped in local minima?
Randomised restart.
Crossover rules.
Hill climbing.
A population of solutions.
A temperature schedule.
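For context on the temperature-schedule option, here is a minimal sketch of simulated annealing with a geometric cooling schedule; the function name, step distribution, and parameter values are illustrative assumptions, not course code:

```python
import math
import random

def simulated_annealing(f, x0, n_steps=5000, t0=1.0, cooling=0.999):
    # Minimise f over the reals with a geometric temperature schedule.
    # A high temperature t lets the search accept uphill moves and so
    # escape local minima; as t decays, it behaves like hill climbing.
    # Parameter names and values here are illustrative assumptions.
    x, fx = x0, f(x0)
    best_x, best_fx = x, fx
    t = t0
    for _ in range(n_steps):
        x_new = x + random.gauss(0.0, 1.0)   # random local move
        f_new = f(x_new)
        # Metropolis rule: always accept improvements; accept worse
        # moves with probability exp(-(f_new - fx) / t).
        if f_new < fx or random.random() < math.exp(-(f_new - fx) / t):
            x, fx = x_new, f_new
            if fx < best_fx:
                best_x, best_fx = x, fx
        t *= cooling                          # cool the schedule
    return best_x, best_fx
```

Note that only the cooling of `t` distinguishes this from plain random local search with hill climbing.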
2. A hyperparameter of an optimisation algorithm is:
A value that affects how a solution is searched for.
A direction in hyperspace.
A value that is used to impose constraints on the solution.
The determinant of the Hessian.
A measure of how good a solution is.
3. First-order optimisation requires that objective functions be:
monotonic
\(C^1\) continuous
disconcerting
one-dimensional
invertible
4. The gradient vector \(\nabla L(\theta)\) is a vector which, at any given point \(\theta\), will:
point towards the global minimum of \(L(\theta)\)
point in the direction of steepest ascent
have \(L_2\) norm 1
be equal to \(\theta\)
be zero
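The role of the gradient can be seen in a minimal gradient-descent sketch, which also illustrates a hyperparameter (the learning rate); names and values are illustrative assumptions:

```python
def gradient_descent(grad, theta, lr=0.1, n_steps=100):
    # The negative gradient -grad(theta) points in the direction of
    # steepest descent, so we step that way. The learning rate lr is
    # a hyperparameter: it affects how the solution is searched for,
    # not how good any particular solution is. Values are illustrative.
    for _ in range(n_steps):
        theta = theta - lr * grad(theta)
    return theta
```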
5. Finite differences is not an effective way to implement first-order optimisation because:
of numerical roundoff issues.
the curse of dimensionality
the effect of measurement noise
none of the above
all of the above
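A central-difference gradient estimator makes the trade-offs in this question concrete; the function name and step size are illustrative assumptions:

```python
def finite_difference_grad(f, theta, h=1e-5):
    # Central-difference estimate of the gradient of f at theta (a
    # list of floats). Each coordinate needs two evaluations of f, so
    # the cost scales with dimensionality; dividing by the tiny 2*h
    # also amplifies floating-point roundoff and any measurement
    # noise in f. The step size h is an illustrative choice.
    grad = []
    for i in range(len(theta)):
        up, down = theta[:], theta[:]
        up[i] += h
        down[i] -= h
        grad.append((f(up) - f(down)) / (2 * h))
    return grad
```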
6. Ant colony optimisation applies which two metaheuristics to improve random local search?
gradient descent and crossover
memory and population
temperature and memory
thants
random restart and hyperdynamics