“This guy is physics. He might know some calculus and differential equations, but what does he know about programming, data science and A.I? Like, why’s he talking about this stuff?”
If you asked yourself that question, I’m honestly glad. I would have been just as skeptical if I heard a random dude on the internet making the kinds of claims I’ve made so far. Allow me to first clarify what I mean by being an “outsider”:
I am not actively involved in any research or endeavors that are necessarily conducive to the growth and advancement of A.I paradigms. Having used certain methodologies that invoke A.I to solve certain problems does not make me an expert or put me on the frontier. I am forever grateful to the wonderful scientific community that has made it significantly easier to use these technologies.
Long story short, I learned how to code only because I was so interested in machine learning and its possible applications in physics. My first ever university research project involved the use of a convolutional neural network to detect images containing atmospheric contrails.
I trained the model with 1,600 images that I collected and labeled myself and used 400 to test it. I ended up getting training and test accuracies of 97.5% and 98.5% respectively. Needless to say, I lost my mind and thought this was the coolest thing ever. No, literally, I mean the coolest thing ever. Of course, the high subsided after about two weeks when I realized this algorithm was not going to solve my multivariate statistics take-home exam (ironically…?).
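For readers wondering what “convolutional” actually means here, a minimal sketch may help. This is not the contrail model itself (that code isn’t in the article), just an illustration of the core operation a convolutional layer performs: a small kernel slides over the image and computes weighted sums that respond to local patterns, such as the streak-like edges a contrail produces.

```python
# A minimal, pure-Python sketch of the operation inside a convolutional
# layer (valid-mode 2D cross-correlation). Illustrative only -- not the
# actual contrail-detection model from the project described above.

def conv2d(image, kernel):
    """Slide `kernel` over `image` and return the map of weighted sums."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(ih - kh + 1):
        row = []
        for j in range(iw - kw + 1):
            # Weighted sum of the image patch currently under the kernel.
            s = sum(image[i + di][j + dj] * kernel[di][dj]
                    for di in range(kh) for dj in range(kw))
            row.append(s)
        out.append(row)
    return out

# Example: a horizontal-edge kernel applied to a tiny 4x4 "image"
# whose bottom half is bright -- the kernel fires along the edge.
image = [[0, 0, 0, 0],
         [0, 0, 0, 0],
         [9, 9, 9, 9],
         [9, 9, 9, 9]]
kernel = [[ 1,  1],
          [-1, -1]]
print(conv2d(image, kernel))  # -> [[0, 0, 0], [-18, -18, -18], [0, 0, 0]]
```

A real CNN stacks many such kernels (with learned weights) and nonlinearities, but every layer bottoms out in this sliding weighted sum.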
It was only after that semester was over that I finally decided to take the time to understand the mathematics behind some popular ML algorithms and then code them from scratch: simple single-variable linear regression, K-means clustering, random forests, principal component analysis, and some types of deep neural networks. Admittedly, I didn’t suffer too much during this endeavor because of my physics background. If anything, I sincerely enjoyed learning about the theoretical aspects of ML. I mean, come on, who doesn’t get excited when they see vectors, matrices, gradients and entropies?
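To give a flavor of the “from scratch” exercise, here is a sketch of the simplest item on that list, single-variable linear regression, using the closed-form least-squares solution (slope = cov(x, y) / var(x)). This is my own illustrative version under those standard formulas, not code from the article:

```python
# Simple single-variable linear regression from scratch, via the
# closed-form least-squares solution:
#   slope     = cov(x, y) / var(x)
#   intercept = mean(y) - slope * mean(x)

def fit_line(xs, ys):
    """Return (slope, intercept) minimizing the sum of squared errors."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    slope = cov / var
    return slope, my - slope * mx

# Points lying exactly on y = 2x + 1 recover slope 2 and intercept 1.
xs = [0, 1, 2, 3, 4]
ys = [1, 3, 5, 7, 9]
print(fit_line(xs, ys))  # -> (2.0, 1.0)
```

The same fit can be done iteratively with gradient descent, which is the more instructive route if the goal is to build intuition for how neural networks are trained.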
As a result, I developed a newfound appreciation for ML. In essence, it’s like the rekindled love you feel for Newton’s second law when you realize it’s not just the “F=ma” you throw numbers into, but in fact a system of second-order differential equations. Then you start appreciating the Lagrangian/Hamiltonian formalisms, which use energy differences or aggregates to derive equations of motion in settings not practically suited to treatment with Newton’s second law. Then you get to quantum mechanics and realize how useful probability theory and linear algebra really are, but later also discover some of their caveats, constraints and limitations. I could go on forever, but the point I’m trying to make here is:
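The “system of second-order differential equations” point can be made concrete with a few lines of code. The sketch below (my own illustration, not from the article) integrates a mass on a spring, m·x″ = −k·x, with the semi-implicit Euler method; the exact solution with x(0)=1, v(0)=0 is x(t) = cos(√(k/m)·t), so the numerical result can be checked against it:

```python
# Newton's second law as a differential equation, not a plug-in formula:
# integrate m * x'' = -k * x (a mass on a spring) step by step.
import math

def simulate_spring(m=1.0, k=1.0, x0=1.0, v0=0.0, dt=1e-3, t_end=1.0):
    """Integrate m*x'' = -k*x with semi-implicit Euler; return x(t_end)."""
    x, v = x0, v0
    for _ in range(int(t_end / dt)):
        a = -k * x / m   # acceleration from F = ma
        v += a * dt      # update velocity first (semi-implicit Euler)
        x += v * dt      # then position
    return x

# With m = k = 1, the exact answer at t = 1 is cos(1) ~ 0.5403.
print(simulate_spring(), math.cos(1.0))
```

The fact that you can only get x(t) by integrating, rather than by plugging numbers into a formula, is exactly the shift in perspective described above.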
The deeper your understanding, the more nuanced your perception.