

Coursera

Calculus for Machine Learning and Data Science (4) Derivatives and Optimization
Quiz. Q1: Consider the following lines. What can be said about the slopes at their intersection? (1) Slope(Line 1) > Slope(Line 2). (2) Slope(Line 1) < Slope(Line 2). (3) Slope(Line 1) = Slope(Line 2). (4) It is impossible to infer anything from the given information. Answer: 2. Correct! Line 2 is steeper than Line 1, therefore its slope is higher. Q2: Given the following graph, what is the slope of the line? You..

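The quiz compares the slopes of two lines at their intersection. As a quick sketch (the two example lines below are my own, not the ones from the quiz figure), slopes can be computed from two points on each line and compared directly:

```python
# Minimal sketch: compute a line's slope from two points on it.
# The example lines are illustrative, not the ones from the quiz figure.

def slope(p1, p2):
    """Slope = rise over run = (y2 - y1) / (x2 - x1)."""
    (x1, y1), (x2, y2) = p1, p2
    return (y2 - y1) / (x2 - x1)

line1 = slope((0, 0), (2, 1))   # a shallow line: slope 0.5
line2 = slope((0, 0), (1, 2))   # a steeper line: slope 2.0

print(line1, line2)   # 0.5 2.0
print(line2 > line1)  # True: the steeper line has the larger slope
```
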
Calculus for Machine Learning and Data Science (3) Derivatives and Optimization
Derivatives. Existence of the derivative: functions where we cannot find the derivative at every point are called non-differentiable functions. x = 1: correct, there is a cusp at x = 1, so the derivative does not exist. x = -1: correct, there is a discontinuity (a jump) at x = -1, so the derivative does not exist. Functions with a vertical tangent are not differentiable at the origin (x = 0)..

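The excerpt refers to plots from the course, but the idea can be checked numerically. A minimal sketch, assuming f(x) = |x - 1| as a stand-in for a function with a cusp at x = 1 (not the course's exact example): the one-sided difference quotients disagree there, so the derivative does not exist.

```python
# Sketch: at a cusp the left and right difference quotients disagree,
# so the derivative does not exist. f(x) = |x - 1| is an assumed stand-in
# with a cusp at x = 1, not necessarily the function shown in the course.

def f(x):
    return abs(x - 1)

h = 1e-6
right = (f(1 + h) - f(1)) / h   # one-sided quotient from the right
left = (f(1) - f(1 - h)) / h    # one-sided quotient from the left

print(right, left)  # roughly 1.0 and -1.0: the one-sided limits differ
```
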
Calculus for Machine Learning and Data Science (2) Derivatives and Optimization
Derivatives. Some common derivatives - Lines: here are some of the common derivatives. Some common derivatives - Quadratics: for the above slide, plug in the values for $\Delta x$ (the change in $x$) and $x$, where $x = 1$, to get $\Delta f$ and then the slope. Some common derivatives - Higher degree polynomials. Some common derivatives - Other power functions: for all power functio..

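To make the $\Delta f / \Delta x$ computation concrete, here is a small sketch assuming the quadratic is $f(x) = x^2$ (an assumption; the excerpt does not show the slide). The difference quotient at $x = 1$ approaches the derivative $2x = 2$ as $\Delta x$ shrinks:

```python
# Sketch: difference quotient of f(x) = x^2 (assumed example quadratic)
# at x = 1 for shrinking Δx; it should approach the derivative 2x = 2.

def f(x):
    return x ** 2

x = 1.0
for dx in (0.1, 0.01, 0.001, 1e-6):
    df = f(x + dx) - f(x)
    print(dx, df / dx)  # 2.1, 2.01, 2.001, ~2.000001 -> tends to 2
```
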
Calculus for Machine Learning and Data Science (1) Derivatives and Optimization
Learning Objectives. After completing this course, you will be able to: perform gradient descent in neural networks with different activation and cost functions; visually interpret the differentiation of different types of functions commonly used in machine learning; approximately optimize different types of functions commonly used in machine learning using first-order (grad..

Calculus for Machine Learning and Data Science (0)
What you'll learn: analytically optimize different types of functions commonly used in machine learning using properties of derivatives and gradients; approximately optimize different types of functions commonly used in machine learning; visually interpret the differentiation of different types of functions commonly used in machine learning; perform gradient descent in neural networks with different act..

Linear Algebra for Machine Learning and Data Science (17) Determinants and Eigenvectors
Check your knowledge. Q1: If $\det M = 20$ and $\det N = 10$, and $M, N$ have the same size, what is the value of $\det(MN)$ and $\det(N^{-1})$? (Videos: Determinant of a product, Determinant of inverses) Answer: $\det(MN)$ is 200 and $\det(N^{-1})$ is 1/10, or 0.1. Q2: Does the set $\{(1,2),(4,8)\}$ form a basis for $\mathbb{R}^2$? (Videos..

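A quick numerical check of both answers (the matrices M and N below are arbitrary diagonal examples chosen to have determinants 20 and 10; the quiz gives only the determinants, not the matrices):

```python
import numpy as np

# Arbitrary 2x2 examples with det M = 20 and det N = 10.
M = np.array([[4.0, 0.0], [0.0, 5.0]])
N = np.array([[2.0, 0.0], [0.0, 5.0]])

print(np.linalg.det(M @ N))             # ~200.0: det(MN) = det(M) * det(N)
print(np.linalg.det(np.linalg.inv(N)))  # ~0.1:   det(N^{-1}) = 1 / det(N)

# Q2: {(1, 2), (4, 8)} is not a basis for R^2, because (4, 8) = 4 * (1, 2).
# The matrix with the two vectors as columns has rank 1 and determinant 0.
B = np.array([[1.0, 4.0], [2.0, 8.0]])
print(np.linalg.matrix_rank(B), np.linalg.det(B))  # 1, ~0.0
```
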
Linear Algebra for Machine Learning and Data Science (16) Determinants and Eigenvectors
Eigenvalues and Eigenvectors. Dimensionality Reduction and Projection: dimensionality reduction helps preserve information and helps with visualization in exploratory analysis; without it, we would simply be removing features, which loses information. Projection: moving data points into a vector space of a different dimension. Here we are talking about dimensionality reductio..

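Projection in this sense can be shown with a tiny sketch (the 2D points and the direction vector are made-up examples): each 2D point is projected onto a single direction, reducing it to one coordinate.

```python
import numpy as np

# Sketch: project 2D points onto one direction (made-up example data),
# reducing each point to a single coordinate along that direction.
points = np.array([[1.0, 2.0], [2.0, 4.1], [3.0, 5.9]])
direction = np.array([1.0, 2.0])
u = direction / np.linalg.norm(direction)  # unit vector to project onto

coords = points @ u               # 1D coordinate of each point
projected = np.outer(coords, u)   # projected points, back in 2D

print(coords)     # one number per original 2D point
print(projected)  # each row lies on the line spanned by `direction`
```
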
Linear Algebra for Machine Learning and Data Science (15) Determinants and Eigenvectors
Eigenvalues and Eigenvectors. Bases in Linear Algebra: a matrix can be seen as a linear transformation from a plane to a plane. Basis: two vectors coming from the origin that define the plots (square, parallelogram, etc.). The main property of a basis is that every point in the space can be expressed as a linear combination of elements in the basis. Two vectors that go in the sam..

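As a small sketch of the "every point is a linear combination of the basis" property (the basis vectors and the target point below are made-up examples), finding the coefficients amounts to solving a 2x2 linear system:

```python
import numpy as np

# Sketch: with a basis b1, b2 of R^2 (made-up example vectors), any point
# can be written as c1*b1 + c2*b2; finding (c1, c2) means solving the
# linear system whose columns are the basis vectors.
b1 = np.array([1.0, 2.0])
b2 = np.array([3.0, 1.0])
point = np.array([5.0, 5.0])

B = np.column_stack([b1, b2])  # invertible, so b1 and b2 form a basis
c = np.linalg.solve(B, point)  # coefficients c1, c2

print(c)                                          # [2. 1.]
print(np.allclose(c[0] * b1 + c[1] * b2, point))  # True
```
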
