Lecture from: 22.04.2024 | Video: Video ETHZ
Differentiable Functions
Clicker Question: Finding a Minimum
Consider the function . We are at the point , where .
Observation: A minimum for this function exists because $f$ is continuous, and as $|x| \to \infty$, $f(x) \to +\infty$ (the leading term dominates). This means the function must “turn around” somewhere.
![](/Semester-2/Analysis-1/Lecture-Notes-2024/attachments/Pasted-image-20250529095935.png)
The graph shows . If we take a small step to the right from , say to , the function value is slightly less than . This means .
- Question: Starting at $x_0$, should we move left or right to find a minimum of $f$?
- Answer: The graph suggests that moving to the right might lead us towards a minimum, because the function is decreasing in the immediate vicinity to the right of $x_0$ (the values just to the right of $x_0$ are smaller than $f(x_0)$). (This is a baby example of an idea called Gradient Descent.)
Follow-up Question: How could we have made this decision (left or right) if we didn’t have the graph? This is what we want to explore!
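One way to answer this without a picture is to estimate the slope numerically with a difference quotient and step against its sign. The following is a minimal sketch of that idea in Python; the function `f` is a made-up example (the lecture's function is not reproduced here), and the helper names `numerical_slope` and `step_towards_minimum` are hypothetical.

```python
def numerical_slope(f, x0, h=1e-6):
    """Symmetric difference quotient: approximates the slope of f at x0."""
    return (f(x0 + h) - f(x0 - h)) / (2 * h)

def step_towards_minimum(f, x0, step=0.01):
    """Take one small step in the direction in which f decreases (a baby gradient descent)."""
    slope = numerical_slope(f, x0)
    return x0 - step if slope > 0 else x0 + step  # slope > 0: go left; slope < 0: go right

# Made-up example function with several local minima:
f = lambda x: x**4 - 3 * x**2 + x
x = 1.0
for _ in range(100):
    x = step_towards_minimum(f, x)
print(x, f(x))  # x has drifted towards a nearby local (not necessarily global) minimum
```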
Let’s look at a broader view of the function in this example:
![](/Semester-2/Analysis-1/Lecture-Notes-2024/attachments/Pasted-image-20250529100032.png)
It seems our initial step to the right from $x_0$ led us to a minimum, but maybe not the overall (global) minimum.
Question: How can we find or describe minima if we don’t have a picture of the function?
Unfortunately, the minimum we found by moving slightly to the right from $x_0$ was only a local minimum, not the true “global” minimum of the function.
![](/Semester-2/Analysis-1/Lecture-Notes-2024/attachments/Pasted-image-20250529100051.png)
Key Question: What conclusions can we draw about the global behavior of a function based on its local behavior?
Goal of This Chapter
- Investigate the local change of functions. To do this, we’ll use local approximation by straight lines (these lines are called “tangents”).
- Direct Applications:
- Determining the monotonicity of functions (where they are increasing or decreasing).
- Finding local extrema (minima and maxima).
- Taylor Approximation (approximating functions with polynomials of higher order).
The Derivative: Definition and Elementary Consequences
Let $D \subseteq \mathbb{R}$, $f: D \to \mathbb{R}$, and let $x_0 \in D$ be an accumulation point of $D$.
We want to define the “slope of the tangent line” to $f$ at $x_0$ as the limit of the “slopes of secant lines.”
![](/Semester-2/Analysis-1/Lecture-Notes-2024/attachments/Pasted-image-20250529100114.png)
The slope of the secant line between $(x_0, f(x_0))$ and $(x, f(x))$ is given by the difference quotient:
$$\frac{f(x) - f(x_0)}{x - x_0}$$
This difference quotient should provide a good approximation for the slope of $f$ at $x_0$ if $x$ is close to $x_0$.
Definition: Differentiability
Let $D \subseteq \mathbb{R}$, $f: D \to \mathbb{R}$, and let $x_0 \in D$ be an accumulation point of $D$. The function $f$ is differentiable at $x_0$ if the limit
$$f'(x_0) := \lim_{x \to x_0} \frac{f(x) - f(x_0)}{x - x_0}$$
exists in $\mathbb{R}$ (i.e., it’s a finite real number). If this limit exists, it is called the derivative of $f$ at $x_0$ and is denoted by $f'(x_0)$.
Remark: Alternative Limit Form for the Derivative
It is often convenient to set $h := x - x_0$. As $x \to x_0$, we have $h \to 0$. Then the derivative can also be expressed as:
$$f'(x_0) = \lim_{h \to 0} \frac{f(x_0 + h) - f(x_0)}{h}$$
Remark: The Tangent Line
If $f$ is differentiable at $x_0$, then the line defined by the equation
$$y = f(x_0) + f'(x_0)(x - x_0)$$
is called the tangent line to $f$ at $x_0$. This line is the best linear approximation of $f$ near $x_0$.
![](/Semester-2/Analysis-1/Lecture-Notes-2024/attachments/Pasted-image-20250529100230.png)
The “approximation error” $f(x) - \big(f(x_0) + f'(x_0)(x - x_0)\big)$ converges to $0$ faster than $|x - x_0|$ as $x \to x_0$. Specifically, it holds that:
$$\lim_{x \to x_0} \frac{f(x) - f(x_0) - f'(x_0)(x - x_0)}{x - x_0} = 0$$
Exercise: Verify this limit using the definitions of $f'(x_0)$ and the tangent line.
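To make this concrete, here is a quick check with $f(x) = x^2$ (an illustrative example, not from the lecture). Using $f'(x_0) = 2x_0$ (computed in the examples below),
$$f(x) - \big(f(x_0) + f'(x_0)(x - x_0)\big) = x^2 - x_0^2 - 2x_0(x - x_0) = (x - x_0)^2,$$
so the error divided by $x - x_0$ equals $x - x_0$, which indeed tends to $0$ as $x \to x_0$.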
Theorem: Reformulation of Differentiability
Let $D \subseteq \mathbb{R}$, and let $f: D \to \mathbb{R}$ be a function. Suppose $x_0 \in D$ is an accumulation point of $D$.
Then $f$ is differentiable at $x_0$ if and only if there exists a function $\phi: D \to \mathbb{R}$ that is continuous at $x_0$ and satisfies the equation:
$$f(x) = f(x_0) + \phi(x)(x - x_0) \quad \text{for all } x \in D$$
In this case, the value of $\phi$ at $x_0$ gives the derivative:
$$\phi(x_0) = f'(x_0)$$
Explanation: What is $\phi$ doing?
Here’s an intuitive explanation of what this function $\phi$ is doing:
- Rearranging the Formula: For any $x \neq x_0$ in the domain $D$, you can rearrange the formula to solve for $\phi(x)$:
$$\phi(x) = \frac{f(x) - f(x_0)}{x - x_0}$$
This fraction, $\frac{f(x) - f(x_0)}{x - x_0}$, is the slope of the secant line connecting the point $(x_0, f(x_0))$ and any other point $(x, f(x))$ on the graph of $f$. So, for $x \neq x_0$, $\phi(x)$ exactly represents this average slope between $x_0$ and $x$.
- Connecting to Differentiability: Satz 4.1.4 states that a function $f$ is differentiable at $x_0$ if and only if such a function $\phi$ exists that is defined on all of $D$ and is continuous at $x_0$.
- The Role of Continuity: The condition that $\phi$ is continuous at $x_0$ means, by Definition 3.2.1 or its characterization in Satz 3.2.4, that the limit of $\phi(x)$ as $x$ approaches $x_0$ exists and is equal to $\phi(x_0)$. That is, $\lim_{x \to x_0} \phi(x) = \phi(x_0)$.
- What $\phi(x_0)$ Must Be: Now, let’s combine these ideas. Since $\phi(x) = \frac{f(x) - f(x_0)}{x - x_0}$ for $x \neq x_0$, and $\lim_{x \to x_0} \phi(x)$ exists and equals $\phi(x_0)$, this means:
$$\lim_{x \to x_0} \frac{f(x) - f(x_0)}{x - x_0} = \phi(x_0)$$
By the definition of the derivative (Definition 4.1.1), the limit on the left is exactly $f'(x_0)$. Therefore, if $f$ is differentiable at $x_0$ and $\phi$ satisfies the condition in Satz 4.1.4, it must be that $\phi(x_0) = f'(x_0)$.
Why is this important?
- The formula $f(x) = f(x_0) + \phi(x)(x - x_0)$ looks like the linear approximation of $f$, but with a twist: the slope $\phi(x)$ depends on $x$, not just a fixed derivative.
- If $\phi$ is continuous at $x_0$, it ensures that the secant “slopes” $\phi(x)$ smoothly approach the true derivative as $x \to x_0$. This gives us differentiability.
- Key idea: Differentiability at a point $x_0$ is equivalent to being able to write the function locally as a linear approximation with a slope function $\phi$ that is continuous at $x_0$. (A worked example follows below.)
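As a concrete illustration (not from the lecture, just an example), take $f(x) = x^2$ and any $x_0 \in \mathbb{R}$. Since
$$f(x) - f(x_0) = x^2 - x_0^2 = (x + x_0)(x - x_0),$$
the function $\phi(x) := x + x_0$ satisfies $f(x) = f(x_0) + \phi(x)(x - x_0)$ for all $x \in \mathbb{R}$, is continuous at $x_0$, and has $\phi(x_0) = 2x_0$. By the theorem, $f$ is differentiable at $x_0$ with $f'(x_0) = 2x_0$.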
Sketch of Proof
- Define $\phi$ as above: $\phi(x) := \frac{f(x) - f(x_0)}{x - x_0}$ for $x \neq x_0$, and $\phi(x_0) := f'(x_0)$.
- If $f$ is differentiable at $x_0$, then the difference quotient converges to $f'(x_0)$, so $\lim_{x \to x_0} \phi(x) = f'(x_0) = \phi(x_0)$. That makes $\phi$ continuous at $x_0$, and the formula $f(x) = f(x_0) + \phi(x)(x - x_0)$ holds.
- Conversely, if the formula holds and $\phi$ is continuous at $x_0$, then the difference quotient equals $\phi(x)$ for $x \neq x_0$, and:
$$\lim_{x \to x_0} \frac{f(x) - f(x_0)}{x - x_0} = \lim_{x \to x_0} \phi(x) = \phi(x_0)$$
So the derivative exists, and equals $\phi(x_0)$.
Corollary: Differentiability Implies Continuity
If $f$ is differentiable at $x_0$, then $f$ is continuous at $x_0$.
Proof
If $f$ is differentiable at $x_0$, then by the theorem above, $f(x) = f(x_0) + \phi(x)(x - x_0)$, where $\phi$ is continuous at $x_0$.
Then $\lim_{x \to x_0} f(x) = \lim_{x \to x_0} \big(f(x_0) + \phi(x)(x - x_0)\big) = f(x_0) + \phi(x_0) \cdot 0 = f(x_0)$. Since $\lim_{x \to x_0} f(x) = f(x_0)$, $f$ is continuous at $x_0$.
Examples of Derivatives
- Constant Function: $f(x) = c$ (for some constant $c \in \mathbb{R}$) for all $x \in \mathbb{R}$. Then $\frac{f(x) - f(x_0)}{x - x_0} = \frac{c - c}{x - x_0} = 0$. So, $f'(x_0) = 0$ for all $x_0 \in \mathbb{R}$.
- Identity Function: $f(x) = x$ for all $x \in \mathbb{R}$. Then $\frac{f(x) - f(x_0)}{x - x_0} = \frac{x - x_0}{x - x_0} = 1$. So, $f'(x_0) = 1$ for all $x_0 \in \mathbb{R}$.
- Quadratic Function: $f(x) = x^2$ for all $x \in \mathbb{R}$. $\frac{f(x) - f(x_0)}{x - x_0} = \frac{x^2 - x_0^2}{x - x_0} = x + x_0 \to 2x_0$ as $x \to x_0$. So, $f'(x_0) = 2x_0$ for all $x_0 \in \mathbb{R}$.
- Absolute Value Function: $f(x) = |x|$ for all $x \in \mathbb{R}$. Is $f$ differentiable at $x_0 = 0$? Consider the limit $\lim_{x \to 0} \frac{|x| - |0|}{x - 0} = \lim_{x \to 0} \frac{|x|}{x}$. If $x > 0$, $\frac{|x|}{x} = 1$. So, $\lim_{x \to 0^+} \frac{|x|}{x} = 1$. If $x < 0$, $\frac{|x|}{x} = -1$. So, $\lim_{x \to 0^-} \frac{|x|}{x} = -1$. Since the right-sided limit ($1$) and the left-sided limit ($-1$) are not equal, the overall limit does not exist. Therefore, $f(x) = |x|$ is not differentiable at $x_0 = 0$.
![](/Semester-2/Analysis-1/Lecture-Notes-2024/attachments/Pasted-image-20250529110638.png)
For any other point $x_0 \neq 0$, $f(x) = |x|$ is differentiable: If $x_0 > 0$, then for $x$ near $x_0$, $|x| = x$, so $f'(x_0) = 1$. If $x_0 < 0$, then for $x$ near $x_0$, $|x| = -x$, so $f'(x_0) = -1$. So, for $x_0 \neq 0$, $f'(x_0) = \frac{x_0}{|x_0|}$. (A small numerical illustration follows below.)
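As a small numerical illustration of the failure at $x_0 = 0$ (a sketch, not part of the script), one can compare one-sided difference quotients of $|x|$:

```python
def diff_quotient(f, x0, h):
    """One-sided difference quotient (f(x0 + h) - f(x0)) / h."""
    return (f(x0 + h) - f(x0)) / h

for h in (0.1, 0.01, 0.001):
    right = diff_quotient(abs, 0.0, h)    # h > 0: quotient is +1
    left = diff_quotient(abs, 0.0, -h)    # h < 0: quotient is -1
    print(h, right, left)
# The right-sided quotients stay at +1 and the left-sided ones at -1,
# so the two one-sided limits disagree and |x| is not differentiable at 0.
```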
Definition: Differentiable Function
Let $D \subseteq \mathbb{R}$, $f: D \to \mathbb{R}$. The function $f$ is differentiable (on $D$) if $f$ is differentiable at every $x_0 \in D$ that is an accumulation point of $D$.
Remark: Typically, $D$ is a union of intervals with endpoints $a < b$ (i.e., intervals of positive length). In this case, every point of $D$ is an accumulation point of $D$.
In this case, for a differentiable function $f: D \to \mathbb{R}$, we get a new function $f': D \to \mathbb{R}$, $x \mapsto f'(x)$, called the derivative of $f$.
(Example: ).
Examples of Differentiable Functions
- $\exp: \mathbb{R} \to \mathbb{R}$ is differentiable, and $\exp' = \exp$.
- $\sin, \cos: \mathbb{R} \to \mathbb{R}$ are differentiable. $\sin' = \cos$, $\cos' = -\sin$.
Proof for $\sin$
Consider $\frac{\sin(x_0 + h) - \sin(x_0)}{h}$ for $x_0 \in \mathbb{R}$. For $h \neq 0$,
$$\frac{\sin(x_0 + h) - \sin(x_0)}{h} = \frac{2\cos\left(x_0 + \frac{h}{2}\right)\sin\left(\frac{h}{2}\right)}{h} = \cos\left(x_0 + \tfrac{h}{2}\right) \cdot \frac{\sin(h/2)}{h/2}.$$
Let $g(h) := \frac{\sin(h)}{h} = \sum_{k=0}^{\infty} \frac{(-1)^k h^{2k}}{(2k+1)!}$ for $h \neq 0$, with $g(0) := 1$. This is a power series (in $h$) with radius of convergence $+\infty$.
Since $g$ is a convergent power series, it is continuous at $0$ (by Theorem 3.7.11).
Therefore, $\lim_{h \to 0} g(h) = g(0) = 1$. So, $\lim_{h \to 0} \frac{\sin(h/2)}{h/2} = \lim_{h \to 0} g(h/2) = 1$.
Thus, $\sin'(x_0) = \lim_{h \to 0} \cos\left(x_0 + \tfrac{h}{2}\right) \cdot \frac{\sin(h/2)}{h/2} = \cos(x_0)$.
(For $\cos$ and $\exp$, see script.)
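As a quick numerical sanity check of $\sin' = \cos$ (an illustration, not part of the proof), one can compare a small-$h$ difference quotient with $\cos$ at a few points:

```python
import math

def approx_derivative(f, x0, h=1e-7):
    """Symmetric difference quotient approximating f'(x0)."""
    return (f(x0 + h) - f(x0 - h)) / (2 * h)

for x0 in (0.0, 0.5, 1.0, 2.0):
    print(x0, approx_derivative(math.sin, x0), math.cos(x0))
# Both columns agree to many digits, consistent with sin' = cos.
```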
Rules for Differentiation
Theorem: Differentiation Rules
Let $D \subseteq \mathbb{R}$, let $x_0 \in D$ be an accumulation point of $D$, and let $\lambda \in \mathbb{R}$. Let $f, g: D \to \mathbb{R}$ be differentiable at $x_0$. Then:
- Linearity of the Derivative: The functions $f + g$ and $\lambda \cdot f$ are differentiable at $x_0$, and
$$(f + g)'(x_0) = f'(x_0) + g'(x_0), \qquad (\lambda \cdot f)'(x_0) = \lambda \cdot f'(x_0)$$
- Product Rule: The function $f \cdot g$ is differentiable at $x_0$, and
$$(f \cdot g)'(x_0) = f'(x_0)\, g(x_0) + f(x_0)\, g'(x_0)$$
- Quotient Rule: If $g(x_0) \neq 0$, then $x_0$ is an accumulation point of $D' := \{x \in D : g(x) \neq 0\}$, and the function $\frac{f}{g}: D' \to \mathbb{R}$ is differentiable at $x_0$ (on this new domain), with
$$\left(\frac{f}{g}\right)'(x_0) = \frac{f'(x_0)\, g(x_0) - f(x_0)\, g'(x_0)}{g(x_0)^2}$$
Proof (using Theorem 4.1.4 - Reformulation of Differentiability)
Since $f$ and $g$ are differentiable at $x_0$, there exist functions $\phi, \psi: D \to \mathbb{R}$, continuous at $x_0$, such that for all $x \in D$:
- $f(x) = f(x_0) + \phi(x)(x - x_0)$ with $\phi(x_0) = f'(x_0)$
- $g(x) = g(x_0) + \psi(x)(x - x_0)$ with $\psi(x_0) = g'(x_0)$
Sum for 1
$(f + g)(x) = f(x_0) + g(x_0) + \big(\phi(x) + \psi(x)\big)(x - x_0)$. Since $\phi$ and $\psi$ are continuous at $x_0$, their sum $\phi + \psi$ is also continuous at $x_0$.
By Theorem 4.1.4, $f + g$ is differentiable at $x_0$, and $(f + g)'(x_0) = \phi(x_0) + \psi(x_0) = f'(x_0) + g'(x_0)$.
Product for 2
Expanding, $(f \cdot g)(x) = f(x_0)g(x_0) + \big(\phi(x)\, g(x_0) + f(x_0)\, \psi(x) + \phi(x)\psi(x)(x - x_0)\big)(x - x_0)$. The term in parentheses is continuous at $x_0$ because $\phi, \psi$ are continuous at $x_0$, and $(x - x_0) \to 0$ as $x \to x_0$.
Specifically, its value at $x_0$ is $\phi(x_0)\, g(x_0) + f(x_0)\, \psi(x_0) = f'(x_0)\, g(x_0) + f(x_0)\, g'(x_0)$.
By Theorem 4.1.4, $f \cdot g$ is differentiable at $x_0$, and $(f \cdot g)'(x_0) = f'(x_0)\, g(x_0) + f(x_0)\, g'(x_0)$.
Properties (1) and (2) are proven. For (3), see script.
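The product rule can also be checked numerically (a small sketch with arbitrarily chosen $f$, $g$, and evaluation point, not part of the proof):

```python
import math

def approx_derivative(f, x0, h=1e-7):
    """Symmetric difference quotient approximating f'(x0)."""
    return (f(x0 + h) - f(x0 - h)) / (2 * h)

f = lambda x: x**2
g = math.sin
x0 = 1.3

lhs = approx_derivative(lambda x: f(x) * g(x), x0)                       # (f*g)'(x0)
rhs = approx_derivative(f, x0) * g(x0) + f(x0) * approx_derivative(g, x0)  # f'g + fg' at x0
print(lhs, rhs)  # the two values agree up to numerical error
```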
Examples: Applying Differentiation Rules
- Power Rule (by induction): For $n \in \mathbb{N}$, $(x^n)' = n x^{n-1}$ for all $x \in \mathbb{R}$.
- Base Case ($n = 1$): $(x)' = 1 = 1 \cdot x^0$. (From Example 4.1.6)
- Inductive Step: Assume $(x^n)' = n x^{n-1}$ holds for some $n \in \mathbb{N}$. Consider $x^{n+1} = x^n \cdot x$. Using the product rule: $(x^{n+1})' = (x^n)' \cdot x + x^n \cdot (x)' = n x^{n-1} \cdot x + x^n \cdot 1$ (by inductive hypothesis) $= n x^n + x^n = (n+1) x^n$. The formula holds.
- Tangent Function: $\tan = \frac{\sin}{\cos}$ is differentiable in its domain $\mathbb{R} \setminus \left\{\frac{\pi}{2} + k\pi : k \in \mathbb{Z}\right\}$. Using the quotient rule: $\tan'(x) = \frac{\sin'(x)\cos(x) - \sin(x)\cos'(x)}{\cos^2(x)} = \frac{\cos^2(x) + \sin^2(x)}{\cos^2(x)} = \frac{1}{\cos^2(x)}$. So, $\tan'(x) = \frac{1}{\cos^2(x)} = 1 + \tan^2(x)$. This holds for $x \neq \frac{\pi}{2} + k\pi$, $k \in \mathbb{Z}$.
- Cotangent Function: Analogous to (2). $\cot = \frac{\cos}{\sin}$. $\cot'(x) = \frac{-\sin^2(x) - \cos^2(x)}{\sin^2(x)} = -\frac{1}{\sin^2(x)}$ for $x \neq k\pi$, $k \in \mathbb{Z}$. (A quick numerical check follows below.)
Continue here: 17 Chain Rule, Derivatives of Inverse Functions, Central Theorems (Local Extrema, Rolle’s Theorem, Mean Value Theorem)