>>14412846
I mean, I can't think of anything that doesn't use calculus to some degree.
Calculus is the entire basis of backpropagation/gradient descent, which is how machine learning (well, neural networks at least) works.
You use the chain rule to derive the gradient of the loss function with respect to the weights of your deep learning model, and that gradient is what gradient descent methods use to update the weights. No calculus/chain rule, no deep learning.
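To make it concrete, here's a toy sketch (my own illustration, not anything from the quoted post): a one-weight model y_hat = w*x with squared loss, where the chain rule gives dL/dw = 2*(y_hat - y)*x, and plain gradient descent uses that to update w. The data, learning rate, and step count are all made-up values just for the example.

```python
# Toy example: chain rule + gradient descent on y_hat = w * x with squared loss.
# Illustration only; data, lr, and step count are arbitrary.

data = [(1.0, 2.0), (2.0, 4.1), (3.0, 5.9)]  # (x, y) pairs, roughly y = 2x
w = 0.0      # the single weight we're learning
lr = 0.05    # learning rate

for step in range(200):
    grad = 0.0
    for x, y in data:
        y_hat = w * x                 # forward pass
        # loss = (y_hat - y)^2
        # chain rule: dloss/dw = dloss/dy_hat * dy_hat/dw = 2*(y_hat - y) * x
        grad += 2.0 * (y_hat - y) * x
    grad /= len(data)                 # average gradient over the data
    w -= lr * grad                    # gradient descent update

print(w)  # ends up near 2.0
```

Scale that same chain-rule bookkeeping up to millions of weights across many layers and you have backprop; autodiff frameworks just do the bookkeeping for you.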