Calculus for the Curious (thonyc.wordpress.com)
243 points by diodorus on Oct 24, 2019 | 38 comments


I just stumbled upon Infinite Powers in B&N and, after reading a bit, decided to buy it. Loving it so far. I’ve been curious about calculus for a long time now. Since I’ve never taken a formal class on it and am now in a graduate CS program, I feel like I’m missing the deeper understanding of many of the formulas that are presented.

My plan is to get through it to get some background on the main ideas of calculus, then work through Khan Academy and/or read through Aleksandrov’s Mathematics: Its Content, Methods and Meaning.

If anyone knows of active forums/q&a/online practice for self-learning calculus, it’d be a huge help if you could share.


Try http://www.math.smith.edu/~callahan/intromine.html which is a very conceptual and simulation-focused calculus curriculum, without so much symbol twiddling.

The concepts of calculus (the mathematics of motion and change) are absolutely fundamental across science, but being able to get closed-form solutions to tricky indefinite integrals (while an enjoyable puzzle) is only marginally useful per se.
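In that simulation-focused spirit, the "mathematics of motion and change" can start as small as approximating a definite integral with a Riemann sum instead of hunting for a closed form. A minimal sketch, with an arbitrary placeholder integrand and interval of my own choosing:

```python
def riemann(f, a, b, n=100_000):
    # Midpoint Riemann sum: approximate the integral of f over [a, b]
    # by sampling f at the midpoint of each of n equal subintervals.
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

approx = riemann(lambda x: x ** 2, 0.0, 1.0)  # exact answer is 1/3
```

No antiderivative needed; for most applied work a numerical answer like this is all you ever use.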

> work through khan academy

YMMV, but I don’t find KA to be especially pedagogically enlightened. I think of it as roughly an average-quality high school lecture videotaped, plus a big pile of mindless exercises. It’s still nice that it exists: it makes a quality floor for students with below-average teachers, and is free for everyone in the world, without any forced schedule.

You’ll get roughly the same result from just reading a standard introductory textbook and working the problems.

For better results, if you can afford the time, hire / find a private tutor to meet with regularly face to face.


I think KA fulfills a critical niche in the ecosystem of online study materials out there: he actually works through the math in real time on his 'blackboard'. That's one thing you can't get from watching 3Blue1Brown videos, or reading through textbooks.

Personally, I find I learn best when I can work my way through at least three or four different forms of understanding: an initial (ideally geometric) intuition for the problem (e.g. 3B1B, BetterExplained); the theoretical, proof-based treatment you get from textbooks, where you can see the derivation of the concept; and working through problems with pen and paper. The fourth is being able to code it from scratch or with help from a basic library like numpy. KA helps immensely with that third form.


Downloaded Calculus in Context when I read your comment. That is exactly the kind of approach to math I appreciate and understand. I've been reading it and enjoying making simulations over the weekend. Thanks!


This is a fantastic thread from Taleb: https://twitter.com/nntaleb/status/1163192701472428032?lang=... As always, a little over the top, but mostly on point. I've seen the exact phenomenon he describes play out so many times now it's not even funny.

I'm doing a PhD in mathematical statistics, which is sort of like "calculus on steroids"; I basically do calculus day in, day out. Many of the problems we attack are simply not of much interest to mathematicians. I used to run my problems by math profs, and they would say something like "yeah, it can be done, have you tried Mathematica?" instead of buckling down to do it.

For instance, yesterday I had to prove that the Fisher information of the Cauchy distribution is 1/2. That's entirely calculus. You take the function f(x,t) = (pi*(1+(x-t)^2))^-1 and take its log. Then you differentiate that w.r.t. t; what you get is called the Fisher score function S(t). You differentiate the score function again; let's call that g(x,t), and stick a negative sign in front of it. Now you have a complicated-looking new function -g(x,t). You multiply -g(x,t) by your original f(x,t) and integrate the product over the reals. The result is 1/2.

Most mathematicians usually get stuck on that last step, but for (mathematical) statisticians this is sort of our bread-and-butter integral, so we know a bunch of tricks. Here's one such trick: https://stats.stackexchange.com/questions/145017/cauchy-dist...
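As a sanity check on that recipe, the whole computation can be run numerically in a few lines. A sketch, assuming a second derivative of the log-density that I worked out by hand and an integration range truncated to a finite interval:

```python
from math import pi

def f(x, t=0.0):
    # Cauchy density: f(x, t) = 1 / (pi * (1 + (x - t)^2))
    return 1.0 / (pi * (1.0 + (x - t) ** 2))

def neg_g(x, t=0.0):
    # -g(x, t) = -d^2/dt^2 log f(x, t) = 2 * (1 - (x-t)^2) / (1 + (x-t)^2)^2
    u = x - t
    return 2.0 * (1.0 - u * u) / (1.0 + u * u) ** 2

# Midpoint Riemann sum of -g(x, t) * f(x, t) over [-200, 200]; the
# integrand decays like 1/x^4, so the truncated tails are negligible.
a, b, n = -200.0, 200.0, 400_000
h = (b - a) / n
info = sum(neg_g(a + (i + 0.5) * h) * f(a + (i + 0.5) * h) for i in range(n)) * h
# info comes out at approximately 0.5
```

Not a proof, of course, but a cheap way to catch a sign error before spending an afternoon on the contour integral.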

It's like riding a bicycle. Those who ride the most know how to ride. But there are some who want to know how bikes actually work... which is not going to help you much with riding the bike.


Oh, hey. Are you me? I am wrapping up my time in a CS grad program, and I also never took calculus as a formal class.

Now, I will say, save machine learning/AI, calculus isn't really necessary; the world is completely discrete.

That being said, knowing calculus can still _enhance_ your ability to understand and digest some of the more difficult reductions and proofs in, say, a theory of computation course.

I relied on "The Calculus Tutoring Handbook"[0]. I wanted a book that had answers to _all_ the exercises for confidence building purposes. The book goes slow and provides a great amount of detail -- the authors are pretty good at not hand-waving.

I also found r/learnmath useful as an "I have a problem and can't ask anyone" site. They are really friendly.

[0] https://www.amazon.com/Calculus-Tutoring-Book-Carol-Ash/dp/0...


Am I wrong to say that even if the universe of your concern is discrete, calculus can at least describe the behavior of recursive discrete processes, among other things?


Depends on the process and the exact form of the discreteness.

Discreteness introduces discontinuities and errors, and it's usually possible to describe the errors analytically. But there are situations where discrete systems become numerically unstable and blow up while the smooth analytic equivalent has no problems.
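A classic toy example of that blow-up is forward Euler applied to a stiff decay equation. A sketch with an equation and step sizes of my own choosing, not any particular system from the thread:

```python
# Discretize y' = -20 * y, y(0) = 1; the exact solution e^(-20 t) decays to 0.
def euler(h, steps):
    y = 1.0
    for _ in range(steps):
        y += h * (-20.0 * y)  # update: y_{n+1} = (1 - 20 h) * y_n
    return y

decays = euler(0.01, 100)   # |1 - 20 h| = 0.8 < 1: discrete solution decays
blows_up = euler(0.2, 100)  # |1 - 20 h| = 3 > 1: discrete solution explodes
```

The smooth equation is perfectly well behaved; only the discretization misbehaves, and only for some step sizes.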


Apparently I'm beyond the edit window of my original post. I mean the "world [in CS] is completely discrete[, in the context of mathematical modeling and abstraction.]"

I did not intend to imply that the world is discrete in the strictest sense. Just that, except for AI/ML, discrete math will prove much more helpful to understanding the concepts and material presented in a graduate CS curriculum.

The benefit studying continuous maths provides in the context of CS is the rigor and modeling skills one gains.

All of my thesis is rooted in Programming Languages, Compilers, and Type Theory. Continuous math is utterly useless in this context. It's all SAT/SMT, set theory, and graphs -- all of which are topics in discrete math.


Maybe I'm getting old, but I can't imagine a CS grad student not having taken a formal calculus class.

When I started my undergrad CS program in 1989 it required 4 semesters (2 full years) of calculus. This was in addition to 4 semesters of physics.

Maybe I'm just not up to date on what "Computer Science" is today.


>... calculus isn't really necessary; the world is completely discrete

Erwin Schrödinger would like a word with you.


I recommend "The Mechanical Universe" TV program, produced by Caltech with the Annenberg foundation.

Although it's not about calculus per se, it shows how calculus is used in physics; Newton, after all, invented calculus in order to describe physics. The program does, however, use the more standard Leibniz notation.

You probably won't be able to sit down and solve integrals after the show, but the program takes a practical and beautiful branch of mathematics and gives viewers an intuition for its applications that I didn't find in an actual math course.

https://www.youtube.com/playlist?list=PL8_xPU5epJddRABXqJ5h5...


If you have the chance, try to go through a formal proof and analysis based course that requires convergence proofs and all that (mathematical "analysis" isn't what you might think; it's a specific subject). For calculus, it is what drove home the point and the magic for me.

We used this book (https://smile.amazon.com/Introduction-Calculus-Analysis-Clas...) but I expect it may be a bad fit for self study. Try to find something with similar subject matter.


I loved calculus so much more once the professor walked us through the proofs and I was willing and able to understand them. Proving convergence is like being a cheeky kid saying, "Well, if you pick that small number, I'll just find a smaller one!"
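That game has a neat computational reading: for the sequence a_n = 1/n converging to 0, whatever small number the cheeky kid picks, you can hand back a winning index. A toy sketch, where the sequence and the formula for N are my own illustrative choices:

```python
import math

def winning_N(eps):
    # For a_n = 1/n -> 0: take N = ceil(1/eps); then n > N implies 1/n < eps.
    return math.ceil(1.0 / eps)

for eps in (0.1, 0.001, 1e-9):   # "if you pick that small number..."
    N = winning_N(eps)           # "...here's an index past which you lose!"
    assert 1.0 / (N + 1) < eps
```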


Dr Jim Fowler, from Ohio State Uni, explains Calculus concepts with simple examples, https://www.youtube.com/user/kisonecat/videos


Spivak has a pretty decent calculus textbook that you might find interesting. I used a different textbook, but I believe that Spivak is more popular.


The point of Spivak’s book is to be rigorous and offer hard problems, not teach the basics of how calculus is used.

Doing epsilon–delta proofs can be fun, but it’s mostly useful for aspiring pure mathematicians, and not really relevant per se for the grandparent poster.


I took this to mean that they're interested in a more rigorous treatment.

> [I] am now in a graduate CS program, I feel like I’m missing the deeper understanding of many of the formulas that are presented

> My plan is to get through it to get some background on the main ideas of calculus


If someone has never taken a calculus course, then the many formulas presented in other kinds of technical books which were developed using calculus will seem somewhat mysterious/foreign.

Doing delta–epsilon proofs isn’t necessary to clear that up though. Just a regular introductory calculus curriculum is likely sufficient.


I used to always associate 'math smarts' with 'code smarts'. I spent most of my life telling myself that since I was bad at math I had no hope of learning how to code. Now I'm two weeks into a coding bootcamp after losing my job and am realizing they are completely different parts of the brain.

I believe in myself more with every push to Heroku. Maybe I will do a calculus class next and prove to myself I can learn anything. Anyone have a recommendation for an online calc course for the math-insecure?


I have bury-the-needle good spatial reasoning and somewhat above average general intelligence. Gifted programs, all that stuff. Ought to be anywhere from somewhat good to very good at mathematics. But... I feel the way I figure dyslexics feel reading human language, when I read math.

Code? Natural and easy, even “hard” concepts. There, my measured natural abilities come out just as you’d think they would. My best guess is I find algorithmic thinking easy, but proof/equational thinking unnatural. All I can figure. I’ve had some limited success approaching math with a “what does this term _do_?” attitude, but it’s slow going.

IOW don’t worry, there are others out there. Sometimes we even get a reputation for being the ones to go with for the tricky stuff. Go figure.


> Anyone have recommendation on an online calc course for the math-insecure?

Honestly I'd say do one thing at a time, and do it well. It takes a year to stop being bad at anything worth doing, and a lifetime to get good. You're only two weeks into learning to code. Maybe you should go full-bore into that for the next year and pick up calculus some other time.


I used to suck at algebra; then I wrote C++ code for 30 years. Now I can see how they differ and how they are similar.

Sometimes when you are programming in a big body of code you are dealing with lots of types (types in the programming sense), and with functions whose type signatures must be satisfied to avoid compile errors. So you wrap this type in that one so you can call that function, or you convert the return value from one type to another so you can call something else.

Anyway, that is all similar to what you are doing when you manipulate an equation: you follow the rules and change it into a form that is more useful.


> Spent most of my life telling myself that since I was bad at math

Note that very few people are actually “bad at math” in any kind of inherent way. The problem is usually a combination of psychologically damaging (and technically poor) teaching, parental/peer pressure, etc. leading to a phobia/mental block, which eventually leads people to construct an identity as “not a math person” (which has been tragically normalized in our society – in some places this doesn’t happen).

Plenty of the folks who say they are “bad at math” try again later under more relaxed and encouraging circumstances and are plenty successful.

So good luck!

If you are serious about it, my recommendation is to try to find a private tutor to meet with face to face.


What did you do before, if you don't mind me asking? Would you have done an investment banking bootcamp if it guaranteed higher pay?


Apple Retail. I didn't qualify for a Genius position after trying very hard, and never felt like I fit in. It inspired me to play with Swift Playgrounds, which led me to the bootcamp program, which I am LOVING.


I just finished reading this, and thoroughly enjoyed it. It provides a clear description of the fundamental intuition at the heart of calculus (chop stuff up infinitely, then put it back together again), and a mix of the historical background of its development and its ancient and modern applications.

All books of this nature are somewhat idiosyncratic – it's not a history, not a textbook, nor an applied maths book; it's a packet of passion sent by someone who's clearly excited and enthralled with his topic: "Here's a story about something really cool! Maybe you'll think it's cool too!"

I've been investigating recent books with different approaches to calculus (trying to build a free online course that empowers people in maths). Here are some other books I can recommend/mention:

[0] "Burn Math Class: And Reinvent Mathematics for Yourself" Goes from 1+1 all the way to derivatives and integrals. My favorite work of math demystification and pedagogy.

[1] "Calculus Reordered: A History of the Big Ideas" Released this year, exactly as the title says. An accessible history, explaining each idea as it enters the world stage. I've only just started this, but it's a definitely more historically thorough, albeit less engaging, book than Infinite Powers. (Since historical accuracy seems to be TFA's main focus, I wonder what they would think of this book.)

[2] "Change Is the Only Constant: The Wisdom of Calculus in a Madcap World" Another just-published book – this is the year for Calculus! Amazon has lost/delayed my preordered copy, but from the author's other work I expect this to be a LOT of fun.

[3] "Magnificent Principia: Exploring Isaac Newton's Masterpiece" Sort of "the annotated Newton". Outlines Sir Isaac's history, social environment, and development of the Principia Mathematica. The bulk of this book is going through each section of the Principia, translating the language into modern speech/formulations (where needed), and explaining what Newton was getting at. Also not as gripping as Infinite Powers, but a great way of reading and understanding one of the most foundational scientific/mathematical texts of all time.

[4] "Introductory Calculus For Infants" I'm about to have my first child, so am naturally collecting suitable reading material for the budding babe (suggestions welcome!).

[0] https://amzn.to/2pVXJwj [1] https://amzn.to/2pMCZr5 [2] https://amzn.to/31Ny32e [3] https://amzn.to/2BIimyM [4] https://amzn.to/33ZxbJj


Thony's math and astronomy posts are great: very careful with the details (a real scholar) yet as readable as Asimov. Quite brave of Strogatz!

(Notice the tag cloud on that page? ... he's been at it for a while!)


"See for example Euler, who made great strides in the development of calculus without any really defined concepts of convergence, divergence or limits, but who doesn’t appear here at all."

Can anyone say more about this? I always suspected this was so, because the end of high school / start of uni is roughly that sort of time period, and I always thought there was something about convergence/limits that was missing, it seemed very ad hoc.


As I understand it our modern notion of what rigorous mathematics is didn’t exist back then. The justification for analysis was basically physical intuition and simply that it worked.

Euler and company just worked directly with the intuition of infinitesimal quantities. To be fair though, infinitesimals are the essential intuition for how calculus works anyway.

Personally I don’t really care about constructions of the real numbers or the technical details of calculus. It’s enough for rigorous mathematics to know that models exist of the theories which produce calculus; that is, models of complete ordered fields, and even of plain old ordered fields with infinitesimals.


Euler lived in the 1700s. Not sure how that would have affected your training unless you’re much older than average.

Cauchy and Taylor both formalized many concepts, again in the 18th and 19th centuries.

What are you thinking of as ad hoc?


In high school mathematics, you're not really given the definition of a limit. Consider the definition of the derivative

limit as h -> 0 of (f(x+h) - f(x)) / h.

That's well-defined on (0, inf) but not on [0, inf). So you can't just evaluate at h=0 and be done with it.

The intuition is 'as h gets smaller and smaller, the ratio gets closer and closer to a new function of x'. But many high-school students aren't given a clear definition of what it means for one function to be 'close' to another, or what it means for x to 'get smaller and smaller'.
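You can watch that 'smaller and smaller' behavior directly. A quick sketch of my own with f(x) = x^2 at x = 3, where the derivative is 6:

```python
def diff_quotient(f, x, h):
    # The ratio (f(x+h) - f(x)) / h, defined only for h != 0.
    return (f(x + h) - f(x)) / h

# Shrinking h: the quotient approaches 6, but plugging in h = 0
# outright would divide by zero.
vals = [diff_quotient(lambda x: x * x, 3.0, 10.0 ** -k) for k in range(1, 6)]
# vals is roughly [6.1, 6.01, 6.001, 6.0001, 6.00001]
```

The limit definition makes precise exactly what "approaches" means here.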

To see the confusion more clearly, try having the debate about whether 0.999... = 1 with someone who doesn't understand what a limit is.


Let n = 0.999... Then multiply both sides of the equality by 10, so that we have 10n = 9.999... Then subtract n from both sides of the resulting equality to get 9n = 9.000... Finally, divide both sides by 9 and voilà, we have n = 1, which is what we wanted to show.


My gut feeling is that that proof isn't quite correct, since you haven't used the notion of a limit anywhere. There's a fundamental fact about convergence of geometric series that you need to use.

I think your proof goes wrong because you haven't justified how arithmetic operations work with infinite decimals. AFAIK the only way to add non-terminating decimals is to convert them to fractions (or sequences of fractions, as with pi, e, etc.), add the fractions, and convert back. So if you convert 0.999... and 9.999... to fractions, you've assumed the conclusion.

To play devil's advocate, I can try to rephrase your proof without infinite decimal arithmetic as follows.

Assume

n = 0.999... = 1 - epsilon, where epsilon is 'infinitesimal' (an ill-defined version of not-quite-zero). We'd like to show that epsilon is zero.

10n = 9.999... = 10 - 10*epsilon

9n = 9.999... - (1 - epsilon) = 9 - 9*epsilon

9n = 8.999... + epsilon = 9 - 9*epsilon

The only way to get the epsilons to cancel is to assume epsilon = 0, which is to assume the conclusion.
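For what it's worth, the geometric-series fact can be checked with exact rational arithmetic, with no infinite-decimal manipulation at all. A sketch using Python's Fraction:

```python
from fractions import Fraction

def partial_sum(n):
    # s_n = 9/10 + 9/100 + ... + 9/10^n = 1 - 10^(-n), exactly.
    return sum(Fraction(9, 10 ** k) for k in range(1, n + 1))

gap = 1 - partial_sum(20)  # exactly 1/10^20; the gap shrinks toward 0
```

The statement 0.999... = 1 is then just the statement that these partial sums have limit 1.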


I had trouble with the whole 0.999... = 1 thing as a young lad until I noticed something about fractions.

Any digit over 9 equals zero point that digit repeating. I.e.:

1/9 = 0.111...
2/9 = 0.222...
3/9 = 1/3 = 0.333...
...
1 = 9/9 = 0.999...
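That pattern falls right out of long division. A toy sketch generating the first few decimal digits of k/9 for k = 1..8:

```python
def ninth_digits(k, n=6):
    # First n decimal digits of k/9 (for k = 1..8) via long division:
    # at each step, bring down a zero, divide by 9, and keep the remainder.
    digits, r = [], k
    for _ in range(n):
        r *= 10
        digits.append(r // 9)
        r %= 9
    return digits

# ninth_digits(2) gives [2, 2, 2, 2, 2, 2], i.e. 2/9 = 0.222...
# Extending the pattern to k = 9 is exactly the claim 9/9 = 0.999... = 1.
```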


This one is better than the other child comment, but you still need to show that the limit of a sum is the sum of the limits. Not as easy as you might think, and not usually done in high school!


It is an odd review, mixing negative comments and positive appreciation. The last bit got me, as I am not a historian but want to know the latest developments of the field, and lately the Zeno-style philosophical discussions.


Holy, commas, Batman



