A number of years after I finished school, I was in a new town without a job, and got hired to teach a freshman algebra course at the nearby Big Ten university. About halfway into teaching the class, I was struck by the realization that virtually every problem was solved in the same way, by recognizing the "form" of a problem and applying an algorithm appropriate for that form, drawn from the most recent chapter.
In the TFA, the natural log in the integrand was a dead give-away because it only comes from one place in the standard order of topics in calculus class.
Is this what we call intuition?
The students called this the "trick." Many of them had come from high school math under the impression that math was subjective, and was a matter of guessing the teacher's preferred trick from among the many possible.
For instance, all of the class problems involving maxima and minima involved a quadratic equation, since it was the only form with an extremum that the students had learned. Every min/max problem culminated with completing the square. I taught my students a formula that they could just memorize.
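The memorized formula was presumably the standard vertex result; completing the square on a general quadratic gives it in one line:

```latex
ax^2 + bx + c = a\left(x + \frac{b}{2a}\right)^2 + \left(c - \frac{b^2}{4a}\right),
\qquad \text{so the extremum sits at } x = -\frac{b}{2a}.
```

For a > 0 it is a minimum and for a < 0 a maximum, with value c - b^2/(4a).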
The whole affair left me with a bad taste in my mouth.
The thing I hated about integration was that figuring out which approach would work, and what the best option within each approach was, came down to "do a lot and see what's right", and I was too lazy :).
is super interesting, related to your last sentence.
I think not doing this starting in middle school is a big part of the reason why people think math/science is useless. Unless the exact scenario they have been taught pops up, they can very rarely see the application. But the real world NEVER works this way. A problem is NEVER formulated as a straightforward, well-formed problem. Figuring out how to mold it into something you can apply the tools you know to is in and of itself a REALLY important skill to practice, and sadly, we almost NEVER practice it. Only in grad school does that type of thing come up.
When things are just for fun, the impact of unfair questions is much lower than when they can cause people to fail a class or get a lower GPA. This is why these kinds of unfair questions sometimes get designated as extra credit, since it would be unfair for them to actually count against you.
I just haven’t had to use integral calculus in so many years, I don’t recall what the symbols mean and I certainly don’t care about them. That doesn’t mean I wouldn’t find the problem domain interesting, if it was expressed as such. Instead, though, I get a strong dose of mathematical formalism disconnected from anything I can meaningfully reason about. Too bad.
The key to the trick is that we construct the morph so that:

a) we can tell the rate at which it increases the "area under curve";

b) the rate is easier to integrate than the original function; and

c) the starting function has a known integral.
a) is generally easier because differentiation under the integral sign lets us use the standard differentiation rules.
b) this is where the difficulty in constructing the morph lies.
So we start from a known value of the integral (from c above) and then just add whatever the morph adds, which is the integral of the rate from a) over the interval of the morph.
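As a sketch of how a), b), and c) fit together, here is the standard textbook example (the same integral discussed further down the thread), with t as the morph parameter:

```latex
I(t) = \int_0^1 \frac{x^t - 1}{\ln x}\,dx, \qquad I(0) = 0 \quad \text{(c: known starting value)}
\\[4pt]
I'(t) = \int_0^1 \frac{\partial}{\partial t}\,\frac{x^t - 1}{\ln x}\,dx
      = \int_0^1 x^t\,dx = \frac{1}{t+1} \quad \text{(a and b: an easy rate)}
\\[4pt]
I(t) = I(0) + \int_0^t \frac{ds}{s+1} = \ln(t+1).
```

The last line is exactly "known starting value plus whatever the morph adds".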
[0] https://archive.org/details/advancedcalculus031579mbp/mode/1...
OTOH, if I'm given the expression, it's just mechanical and unrewarding.
It can be frustrating when math does not have any clear single path, but that's just the nature of the beast. In the beginning you'll just have to explore all the paths, but do that a couple of hundred times and you'll start to notice patterns and what will work / what will not. Kind of like chess, where a good player can think N moves ahead.
I'm a math major, but I consider the time spent learning the tricks for antidifferentiation to be kinda useless.
This is the most important lesson I learned in grad school. Methods are so important. I really think it is the core of what we call "critical thinking" - knowing how facts are made.
I have needed to know the values of a few integrals in my job, but I have always ended up with a close enough answer using computational methods. What am I missing by not solving analytically?
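For what it's worth, "close enough by computational methods" can be as simple as a few lines; here is a sketch using the example integral that appears later in this thread (the midpoint rule is just one arbitrary choice):

```python
import math

def midpoint_integral(f, a, b, n=200_000):
    """Approximate the integral of f over [a, b] with the midpoint rule.

    Midpoints avoid the endpoints, which matters here: the integrand
    below is 0/0 at x = 1 (a removable singularity) and hits ln(0) at x = 0.
    """
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

def integrand(x, t=1.0):
    # (x^t - 1) / ln(x), the example integrand from this thread, at t = 1
    return (x**t - 1.0) / math.log(x)

approx = midpoint_integral(integrand, 0.0, 1.0)
exact = math.log(2.0)  # the analytic answer ln(t + 1) at t = 1
print(approx, exact)   # the two agree to several decimal places
```

What the analytic form buys you, as the sibling comments note, is seeing the ln(t + 1) dependence at a glance rather than one number at a time.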
In our experiments, we need to know how the frequency of an electromagnetic resonator will change when we couple it to a quantum system. We calculate these frequency shifts with integrals. Being able to calculate these integrals analytically for some limiting cases helps us understand the dependence on the parameters. And usually you can patch the limiting cases together and not even have to compute the integrals numerically.
To give an example, consider the moment generating function or the Laplace transform. Their symbolic expressions can be very informative.
Consider the Mercator projection. It was designed without any idea of the closed form of the required integral; it was mostly done by estimate and gut feel. Now that we know the actual form (an entirely serendipitous discovery), we feel more confident that we understand the transform. This is largely psychological, but not entirely.
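For reference, the integral in question is that of the secant (the vertical stretching of the projection), whose closed form was only found long after the projection was in use:

```latex
y(\varphi) = \int_0^{\varphi} \sec\theta\,d\theta
           = \ln\bigl|\sec\varphi + \tan\varphi\bigr|
           = \ln\Bigl|\tan\Bigl(\frac{\pi}{4} + \frac{\varphi}{2}\Bigr)\Bigr|.
```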
Note that when actually drawing a map in the Mercator projection we still have to fall back on numerical estimation. But it helps that parts of the transform are built from functions that have names: it means we have seen the same functions elsewhere, which instills a sense of familiarity and understanding.
There are far too many functions to name, so the ones we have given names to are a bit special.
But far better is developing a sense of what's "about right".
I have taught people who studied Electronic Engineering "properly" who calculate that the resistors need to be 20.7kΩ and 21.3kΩ for a given circuit and then will go mad scouring Farnell, Mouser et al for those values.
You or I would say "That needs to be a 22kΩ resistor and an 18kΩ resistor in series with a 4.7kΩ pot, because that is going to need adjusting on test because of the tolerances in everything else", wouldn't we?
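As an illustration, here is a hypothetical helper that snaps a computed value to the nearest E24 (5%) preferred value. The function name and the plain nearest-value rule are my own; real part selection also weighs tolerance stacking and adjustability, as the comment says:

```python
# E24 preferred values (the 5% tolerance series), one decade
E24 = [1.0, 1.1, 1.2, 1.3, 1.5, 1.6, 1.8, 2.0, 2.2, 2.4, 2.7, 3.0,
       3.3, 3.6, 3.9, 4.3, 4.7, 5.1, 5.6, 6.2, 6.8, 7.5, 8.2, 9.1]

def nearest_e24(ohms):
    """Return the E24 value closest to the requested resistance.

    Illustrative only: a real choice also considers power rating,
    tolerance stacking, and whether a trimmer belongs in the circuit.
    """
    candidates = [m * 10**e for e in range(0, 8) for m in E24]
    return min(candidates, key=lambda c: abs(c - ohms))

print(nearest_e24(20700))  # 20000.0 -- "about right" beats three decimal places
```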
First, a motivational anecdote, then some straightforward theory, a simple (yet impressive) example fully worked out, the general method, and further examples of increasing difficulty for practice with hints.
Feynman’s trick is equivalent to extending it into a double integral and then switching the order of integration.
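Sketching that equivalence on the integral from the article: write the integrand as an inner integral in a new variable s, then swap the order of integration:

```latex
\int_0^1 \frac{x^t - 1}{\ln x}\,dx
  = \int_0^1\!\!\int_0^t x^s\,ds\,dx
  = \int_0^t\!\!\int_0^1 x^s\,dx\,ds
  = \int_0^t \frac{ds}{s+1}
  = \ln(t+1),
```

since \int_0^t x^s\,ds = (x^t - 1)/\ln x. Differentiating under the integral sign in t and Fubini do the same bookkeeping.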
I'(t) = \int_0^1 \partial/(\partial t) ((x^t - 1)/\ln x) dx = \int_0^1 x^t dx = 1/(t+1), when it is actually equal to \int_0^1 x^{t-1}/\ln x \, dx.
These two are definitely not always equal to each other.
d/dt (x^t - 1)/ln(x) = d/dt [exp(ln(x)t) - 1]/ln(x) = ln(x)exp(ln(x)t)/ln(x) = exp(ln(x)t) = x^t.
Edit: d/dt exp(ln(x)t) = ln(x)exp(ln(x)t) by the chain rule, while d/dt (1/ln(x)) = 0 since the expression is constant with respect to t.
There are convergence considerations that were not discussed in the blog post, but the computations seem to be correct.
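Those computations are also easy to sanity-check numerically; here is a minimal sketch (midpoint quadrature plus a central difference, with step sizes chosen arbitrarily):

```python
import math

def I(t, n=100_000):
    """Midpoint-rule approximation of I(t) = integral_0^1 (x^t - 1)/ln(x) dx."""
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h  # midpoints dodge the 0/0 at x = 1 and ln(0) at x = 0
        total += (x**t - 1.0) / math.log(x)
    return total * h

t = 2.0
h = 1e-4
# Central-difference estimate of I'(t), against the claimed closed form 1/(t + 1)
numeric = (I(t + h) - I(t - h)) / (2.0 * h)
claimed = 1.0 / (t + 1.0)
print(numeric, claimed)  # the two agree to several decimal places
```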
But I had always loved maths and went back to it much later. After having done some computer science, some concepts just made it click more for me. Like sets were a big one. Seeing functions as just a mapping between sets. Seeing functions as set elements. Seeing derivatives and integrals as simply the mapping between sets of functions.
What fascinates me is that differentiation is solved, basically. Don't come at me about known closed form expressions. But integration is not. Now this makes a certain amount of sense. Differentiation is non-injective after all. But what's more fascinating (and possibly really good evidence of my own neurodivergence) is that integration isn't just an algorithm. It requires some techniques to find, of which the Feynman technique is just one. I think I was introduced to it with the Basel problem. I have to confess I end up watching daily Tiktok integration problems. It scratches an itch.
I kinda wish I'd made it to complex analysis at least in college. I mean I kinda did. I do remember doing something with contour integrals. But it just wasn't structured well. By that I mean it never properly connected to Laplace transforms, poles of a function in the S-plane, and analytic continuation.
I'm not particularly proficient at the Feynman technique. Like I can't generally spot the alpha substitution that should be made. Maybe one day.