Data Fundamentals (H) - Week 08 Quiz
1. For a multi-objective optimisation, Pareto optimality means that:
There is no possible improvement to any sub-objective function.
Any improvement in any sub-objective function makes at least one other worse.
Every combination of the sub-objective functions has been searched.
All sub-objective functions are zero.
Gradient descent is invalid.
2. What property of a probability distribution always holds true?
Probabilities are equally divided among outcomes
The product of all probabilities is 1
The determinant of probabilities is \(\infty\)
The sum of all probabilities is 0
The sum of all probabilities is 1
3. Bayesians use probability as:
a calculus of truth
a prayer book
a calculus of belief
complex angles
a representation of the long-term average of frequencies of outcomes
4. The conditional probability P(A|B) is defined to be: (\(\land\) means "and" and \(\lor\) means "or")
\(P(A)P(B)\)
\(P(A \land B) P(B)\)
\(P(A \land B) / P(B)\)
\(P(A \lor B) + P(B \lor A)\)
\(P(A||B) - B(A||P)\)
5. If I have a joint distribution over two random variables \(A\) and \(B\), \(P(A,B)\), how can I compute \(P(A)\)?
Sum/integrate over \(P(A,B)\) for every value of \(A\)
Sum/integrate over \(P(A,B)\) for every value of \(B\)
Sum/integrate \(P(A,B)\) for every value of \(A\) and \(B\)
\(P(A,B) - P(A|B)\)
Divide \(P(A,B)\) by \(P(B)\)
6. In an optimisation problem, a penalty function can be used to:
reduce the need for random search
accelerate gradient descent
implement genetic algorithms
issue red cards
implement soft constraints