Neural Networks and Deep Learning (7) Shallow Neural Networks · Shallow Neural Network · Activation Functions. So far we have been using sigmoid as our activation function, but we can use other activation functions, such as the hyperbolic tangent (tanh). Tanh falls within the interval (-1, 1) and, when plotted, looks like a shifted sigmoid function. Using tanh can center your data to have a mean of 0 rather than 0.5 when using a sigm.. 2024. 11. 25.
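A minimal sketch (assuming NumPy) of the comparison in that post: tanh is exactly a scaled and shifted sigmoid, and its outputs over symmetric inputs average to 0 while sigmoid's average to 0.5.

```python
import numpy as np

def sigmoid(z):
    # Sigmoid squashes into (0, 1); outputs center around 0.5.
    return 1.0 / (1.0 + np.exp(-z))

z = np.linspace(-4.0, 4.0, 101)

# tanh squashes into (-1, 1) and is a shifted/scaled sigmoid:
# tanh(z) = 2 * sigmoid(2z) - 1
print(np.allclose(np.tanh(z), 2.0 * sigmoid(2.0 * z) - 1.0))  # True

# Mean activation over symmetric inputs: ~0 for tanh, ~0.5 for sigmoid.
print(round(float(np.tanh(z).mean()), 3))   # 0.0
print(round(float(sigmoid(z).mean()), 3))   # 0.5
```

The zero-centered outputs are why tanh usually works better than sigmoid for hidden units.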
Neural Networks and Deep Learning (6) Shallow Neural Networks. Build a neural network with one hidden layer, using forward propagation and backpropagation. Learning objectives: describe hidden units and hidden layers; use units with a non-linear activation function, such as tanh; implement forward and backward propagation; apply random initialization to your neural network; increase fluency in deep learning notations and Neural Network Repre.. 2024. 11. 23.
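Those objectives can be sketched end to end. A minimal NumPy implementation (the toy data, shapes, and variable names are my own, not the course's starter code) of a one-hidden-layer network with tanh hidden units, random initialization, forward propagation, and backpropagation:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: 2 features, 4 hidden units, m = 200 examples, separable labels.
n_x, n_h, m = 2, 4, 200
X = rng.normal(size=(n_x, m))
Y = (X[0:1] + X[1:2] > 0).astype(float)   # shape (1, m)

# Random initialization with small weights breaks hidden-unit symmetry.
W1 = rng.normal(size=(n_h, n_x)) * 0.01
b1 = np.zeros((n_h, 1))
W2 = rng.normal(size=(1, n_h)) * 0.01
b2 = np.zeros((1, 1))

lr = 1.0
for _ in range(2000):
    # Forward propagation: tanh hidden layer, sigmoid output.
    Z1 = W1 @ X + b1
    A1 = np.tanh(Z1)
    Z2 = W2 @ A1 + b2
    A2 = sigmoid(Z2)

    # Backpropagation for binary cross-entropy loss.
    dZ2 = A2 - Y
    dW2 = dZ2 @ A1.T / m
    db2 = dZ2.mean(axis=1, keepdims=True)
    dZ1 = (W2.T @ dZ2) * (1.0 - A1 ** 2)   # tanh'(z) = 1 - tanh(z)^2
    dW1 = dZ1 @ X.T / m
    db1 = dZ1.mean(axis=1, keepdims=True)

    # Gradient descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

acc = float(((A2 > 0.5) == Y).mean())
print(f"train accuracy: {acc:.2f}")
```

If the weights were initialized to zero instead, every hidden unit would compute the same function and receive the same gradient, so the network could never learn distinct features.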
Neural Networks and Deep Learning (5) Neural Networks Basics · Quiz. Q1: In logistic regression, given $\mathbf{x}$ and parameters $w \in \mathbb{R}^{n_x}$, $b \in \mathbb{R}$, which of the following best expresses what we want $\hat y$ to tell us? Choices: 1. $\sigma(W \mathbf{x} + b)$ 2. $P(y = \hat y \mid x)$ 3. $\sigma(W \mathbf{x})$ 4. $P(y = 1 \mid \mathbf{x})$ Hint: remember that we are interested in the probability that $y = 1$. Answer: 4. Yes. We want the output $\hat y$ to tel.. 2024. 11. 18.
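A sketch of the correct choice in code (NumPy; the weights and input are made-up numbers): the model computes $\hat y = \sigma(w^T \mathbf{x} + b)$, and that number is interpreted as $P(y = 1 \mid \mathbf{x})$.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.array([0.5, -1.0])   # hypothetical learned weights
b = 0.1                     # hypothetical learned bias
x = np.array([2.0, 0.5])    # one input example

y_hat = sigmoid(w @ x + b)  # z = 0.6, so y_hat = sigmoid(0.6)
print(round(float(y_hat), 3))  # 0.646: a ~65% estimated chance that y = 1
```

Choice 1 is merely the formula that produces $\hat y$; choice 4 is what that value means.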
Neural Networks and Deep Learning (4) Neural Networks Basics · Python and Vectorization · Vectorization. Vectorization is how to apply a function in parallel across all examples simultaneously. In Python, vectorization avoids the for-loop and reduces computation time. True or false: vectorization cannot be done without a GPU. Answer: False. More Vectorization Examples: we can apply any function to matrices or vectors. In our example, .. 2024. 11. 17.
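A minimal sketch of the loop-versus-vectorized comparison, run on an ordinary CPU (no GPU involved), which is exactly why the quiz answer is False:

```python
import time
import numpy as np

rng = np.random.default_rng(1)
a = rng.normal(size=500_000)
b = rng.normal(size=500_000)

# Explicit Python for-loop dot product.
t0 = time.perf_counter()
dot_loop = 0.0
for i in range(len(a)):
    dot_loop += a[i] * b[i]
loop_s = time.perf_counter() - t0

# Vectorized dot product: one NumPy call, same result.
t0 = time.perf_counter()
dot_vec = float(np.dot(a, b))
vec_s = time.perf_counter() - t0

print(np.isclose(dot_loop, dot_vec))   # True: same value either way
print(f"loop {loop_s:.4f}s vs vectorized {vec_s:.6f}s")
```

The speedup comes from NumPy dispatching the whole operation to optimized compiled code (SIMD, cache-friendly loops) instead of interpreting one Python iteration per element.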
Neural Networks and Deep Learning (3) Neural Networks Basics · Logistic Regression as a Neural Network · Computation Graph. A computation graph organizes the computation of a neural network as a forward pass (propagation) to compute the network's output, followed by a backward pass (propagation) to calculate its gradients. This is like looking at a math formula and calculating it step by step to get the answer. One step of ________ propagation.. 2024. 11. 14.
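A sketch of the idea with a small function like the lecture's $J(a, b, c) = 3(a + bc)$: the forward pass computes intermediate values left to right, and the backward pass applies the chain rule right to left to get every derivative.

```python
# Forward pass: compute the formula step by step.
a, b, c = 5.0, 3.0, 2.0
u = b * c          # u = 6
v = a + u          # v = 11
J = 3.0 * v        # J = 33

# Backward pass: chain rule, right to left through the graph.
dJ_dv = 3.0             # dJ/dv
dJ_du = dJ_dv * 1.0     # dv/du = 1
dJ_da = dJ_dv * 1.0     # dv/da = 1  -> 3
dJ_db = dJ_du * c       # du/db = c  -> 6
dJ_dc = dJ_du * b       # du/dc = b  -> 9

print(J, dJ_da, dJ_db, dJ_dc)  # 33.0 3.0 6.0 9.0
```

One forward pass yields the value of $J$; one backward pass yields all the gradients, each computed from the gradient of the node just downstream of it.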
Neural Networks and Deep Learning (2) Neural Networks Basics. Set up a machine learning problem with a neural network mindset and use vectorization to speed up your models. Learning objectives: build a logistic regression model structured as a shallow neural network; build the general architecture of a learning algorithm, including parameter initialization, cost function, gradient calculation, and optimization implementation (gradient .. 2024. 11. 13.
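The pipeline in those objectives — initialization, cost function, gradient calculation, and gradient-descent optimization — can be sketched for logistic regression as (NumPy; the toy data and hyperparameters are made up):

```python
import numpy as np

rng = np.random.default_rng(42)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: n_x features, m examples, linearly separable labels.
n_x, m = 2, 300
X = rng.normal(size=(n_x, m))
Y = (X[0:1] - X[1:2] > 0).astype(float)

# Parameter initialization (zeros are fine for logistic regression).
w = np.zeros((n_x, 1))
b = 0.0

lr = 0.5
for _ in range(1000):
    A = sigmoid(w.T @ X + b)                      # forward: y-hat, shape (1, m)
    cost = -float(np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A)))
    dw = X @ (A - Y).T / m                        # gradient of cost w.r.t. w
    db = float((A - Y).mean())                    # gradient of cost w.r.t. b
    w -= lr * dw                                  # gradient descent update
    b -= lr * db

acc = float(((A > 0.5) == Y).mean())
print(f"final cost {cost:.3f}, train accuracy {acc:.2f}")
```

Unlike the hidden-layer case, zero initialization works here because logistic regression has no hidden-unit symmetry to break.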