1. Using Newton’s method, find a recursive formula to approximate $\sqrt{2}$.
To help you, remember that Newton’s method iterates $x_{k+1} = x_{k} - \frac{f\left( x_{k} \right)}{f^{\prime}\left( x_{k} \right)}$.
- $x_{k+1} = x_{k} - \frac{2x_{k}}{x_{k}^{2} - 2}$
- $x_{k+1} = \frac{x_{k}^{2} - 2}{2x_{k}}$
- $x_{k+1} = \frac{2x_{k}}{x_{k}^{2} - 2}$
- $x_{k+1} = x_{k} - \frac{x_{k}^{2} - 2}{2x_{k}}$
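For a concrete check, here is a minimal Python sketch of Newton’s method applied to $f\left( x \right) = x^{2} - 2$, whose positive root is $\sqrt{2}$ (the starting guess $x_{0} = 2$ and the tolerance are assumptions for illustration):

```python
def newton_sqrt2(x0=2.0, tol=1e-12, max_iter=50):
    """Newton's method on f(x) = x^2 - 2, whose positive root is sqrt(2)."""
    x = x0
    for _ in range(max_iter):
        x_next = x - (x**2 - 2) / (2 * x)  # x_{k+1} = x_k - f(x_k)/f'(x_k)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

print(newton_sqrt2())  # ~1.4142135623730951
```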
2. Regarding the previous question, suppose you don’t know any approximation for $\sqrt{2}$. Which of the following initial values $x_{0}$ would you choose?
- $4$
- $3$
- $2$
- The initial value does not affect the convergence of Newton’s method.
3. Let’s continue investigating the method we are developing to compute $\sqrt{2}$. What would happen if we chose a negative initial value?
- The algorithm would not converge.
- The algorithm would converge to $\sqrt{2}$.
- The algorithm would converge to the negative root of $x^{2} - 2$.
- The algorithm would converge to $0$.
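A small sketch, assuming the question asks what happens with a negative initial value (as the options hint): the iteration moves toward the root on the same side of zero as $x_{0}$.

```python
def newton_step_sqrt2(x0, steps=20):
    """Iterate x_{k+1} = x_k - (x_k^2 - 2) / (2 * x_k) from a given start."""
    x = x0
    for _ in range(steps):
        x = x - (x**2 - 2) / (2 * x)
    return x

print(newton_step_sqrt2(3.0))   # ->  1.414... (positive root)
print(newton_step_sqrt2(-3.0))  # -> -1.414... (the sign of x0 picks the root)
```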
4. Did you know that it is possible to calculate the reciprocal of any number without performing division? (The reciprocal of a non-zero real number $a$ is $\frac{1}{a}$.)
Setting $f\left( x \right) = \frac{1}{x} - a$ for a non-zero real number $a$, the root of $f$ is exactly $\frac{1}{a}$, so Newton’s method can approximate it using only multiplication and subtraction.
This method was in fact used in older IBM computers to implement division in hardware!
So, the iteration formula to find the reciprocal of $a$ is:
- $x_{k+1} = 2x_{k} - ax_{k}^{2}$
- $x_{k+1} = 2x_{k} + ax_{k}^{2}$
- $x_{k+1} = 2x_{k} - x_{k}^{2}$
- $x_{k+1} = x_{k} - ax_{k}^{2}$
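To see where the first candidate comes from: with $f\left( x \right) = \frac{1}{x} - a$ we have $f^{\prime}\left( x \right) = -\frac{1}{x^{2}}$, so the Newton step is $x_{k+1} = x_{k} - \frac{1/x_{k} - a}{-1/x_{k}^{2}} = 2x_{k} - ax_{k}^{2}$, which needs no division. A minimal sketch (the starting guess is an assumption; the iteration converges when $0 \lt x_{0} \lt \frac{2}{a}$):

```python
def reciprocal(a, x0=0.1, steps=30):
    """Approximate 1/a without division via x <- 2*x - a*x*x.
    Converges when the start x0 lies in (0, 2/a)."""
    x = x0
    for _ in range(steps):
        x = 2 * x - a * x * x  # only multiplication and subtraction
    return x

print(reciprocal(7.0))  # ~0.14285714... == 1/7
```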
5. Suppose we want to find the minimum value (suppose we already know that the minimum exists and is unique) of $f\left( x \right) = x \log{\left( x \right)}$ for $x \gt 0$, using Newton’s method. Which iteration formula should we use?
Hint: minimizing $f$ amounts to finding a root of $f^{\prime}$, so apply Newton’s method to the derivative: $x_{k+1} = x_{k} - \frac{f^{\prime}\left( x_{k} \right)}{f^{\prime \prime}\left( x_{k} \right)}$.
- $x_{k+1} = x_{k} - \frac{x_{k} \log{\left( x_{k} \right)}}{\log{\left( x_{k} \right)} + 1}$
- $x_{k+1} = x_{k} - x_{k}^{2} \log{\left( x_{k} \right)}$
- $x_{k+1} = x_{k} - \log{\left( x_{k} \right)}$
- $x_{k+1} = x_{k} - x_{k} \left( \log{\left( x_{k} \right)} + 1 \right)$
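For reference, assuming the target function is $f\left( x \right) = x \log{\left( x \right)}$ (which matches the expressions in the options): with $f^{\prime}\left( x \right) = \log{\left( x \right)} + 1$ and $f^{\prime \prime}\left( x \right) = \frac{1}{x}$, the Newton step becomes $x_{k+1} = x_{k} - x_{k}\left( \log{\left( x_{k} \right)} + 1 \right)$, which converges to the minimizer $\frac{1}{e}$. A minimal sketch:

```python
import math

def newton_minimize_xlogx(x0=0.5, steps=20):
    """Newton's method on f'(x) = log(x) + 1, with f''(x) = 1/x,
    to minimize f(x) = x*log(x); the minimizer is 1/e."""
    x = x0
    for _ in range(steps):
        x = x - x * (math.log(x) + 1)
    return x

print(newton_minimize_xlogx())  # ~0.36787944... == 1/e
```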
6. Regarding the Second Derivative Test to decide whether a point $x$ with $f^{\prime}\left( x \right) = 0$ is a local maximum or a local minimum, which of the following statements is true?
- If $f^{\prime \prime} \left( x \right) \lt 0$ then $x$ is a local minimum.
- If $f^{\prime \prime} \left( x \right) \gt 0$ then $x$ is a local minimum.
- If $f^{\prime \prime} \left( x \right) = 0$ then $x$ is an inflection point.
- If $f^{\prime \prime} \left( x \right) = 0$ then the test is inconclusive.
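A classic pair illustrating the inconclusive case: $f\left( x \right) = x^{4}$ and $f\left( x \right) = x^{3}$ both satisfy $f^{\prime}\left( 0 \right) = f^{\prime \prime}\left( 0 \right) = 0$, yet the first has a local minimum at $0$ and the second an inflection point, so $f^{\prime \prime}\left( x \right) = 0$ alone decides nothing.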
7. Let
- (A)
- (B)
- (C)
- (D)
8. How many parameters does a Neural Network with the following architecture have?
- Input layer of size 3
- One hidden layer with 3 neurons
- One hidden layer with 2 neurons
- Output layer with size 1
- $11$
- $8$
- $23$
- $3$
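Counting weights and biases layer by layer (assuming, as is standard, one bias per neuron): $\left( 3 \cdot 3 + 3 \right) + \left( 3 \cdot 2 + 2 \right) + \left( 2 \cdot 1 + 1 \right) = 12 + 8 + 3 = 23$. A small hypothetical helper to check the arithmetic:

```python
def count_params(layer_sizes):
    """Parameters of a fully connected network: for each consecutive pair
    of layers, (inputs * outputs) weights plus one bias per output neuron."""
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))

print(count_params([3, 3, 2, 1]))  # (9 + 3) + (6 + 2) + (2 + 1) = 23
```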
9. Given the following Single Layer Perceptron, with the sigmoid function as activation and log-loss as the loss function:
- $-\left( y - \hat{y} \right)$
- $-\left( y - \hat{y} \right) x_{1}$
- $-\left( y - \hat{y} \right) x_{2}$
- $1$
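For context, the standard simplification behind these options, assuming (as the options suggest) two inputs with weights $w_{1}, w_{2}$ and bias $b$: with $\hat{y} = \sigma\left( w_{1} x_{1} + w_{2} x_{2} + b \right)$ and log-loss $L = -y \log{\left( \hat{y} \right)} - \left( 1 - y \right) \log{\left( 1 - \hat{y} \right)}$, the factor $\sigma^{\prime}\left( z \right) = \sigma\left( z \right)\left( 1 - \sigma\left( z \right) \right)$ cancels against the derivative of the loss, leaving

$$\frac{\partial L}{\partial b} = -\left( y - \hat{y} \right), \qquad \frac{\partial L}{\partial w_{i}} = -\left( y - \hat{y} \right) x_{i}.$$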
10. Suppose you have a function
Then the point is a:
- Local maximum.
- Local minimum.
- Saddle point.
- We can’t infer anything with the given information.
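For reference, the usual classification at a critical point uses the eigenvalues of the Hessian: all positive gives a local minimum, all negative a local maximum, mixed signs a saddle point, and a zero eigenvalue makes the test inconclusive.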