Sunday, November 29, 2020

Machine Learning and Physics

Machine learning has been in use at the Large Hadron Collider for long enough that there is now a Coursera online course about it. Basically, machine learning is used to help handle the roughly one petabyte per second of data produced by particle collisions, for example by filtering which collision events are worth recording. That is one kind of use of machine learning.
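As a rough sketch of what that kind of filtering might look like, here is a toy Python example: a small classifier trained on synthetic events decides which ones to keep. The feature names, values, and the 0.9 threshold are all invented for illustration; the real trigger systems are far more elaborate.

```python
# Toy "trigger" sketch: keep or discard simulated collision events
# based on a few summary features. Everything here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Pretend the columns are total energy, missing transverse momentum,
# and jet multiplicity (all made up for this illustration).
background = rng.normal(loc=[50.0, 10.0, 2.0], scale=[10.0, 5.0, 1.0], size=(1000, 3))
signal = rng.normal(loc=[120.0, 40.0, 4.0], scale=[15.0, 8.0, 1.0], size=(1000, 3))

X = np.vstack([background, signal])
y = np.array([0] * 1000 + [1] * 1000)  # 0 = discard, 1 = keep

clf = LogisticRegression(max_iter=1000).fit(X, y)

# A new event is kept only if the model is confident it is interesting,
# mimicking (very loosely) how a trigger reduces the raw data rate.
event = np.array([[110.0, 35.0, 4.0]])
print("keep event?", clf.predict_proba(event)[0, 1] > 0.9)
```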

Neural networks recognize the content of images and video, and transcribe speech, far better than the traditional algorithms people can write by hand, so they are absolutely the right technique for giving computers the senses of vision and hearing. Detecting a "cat" or a "utility pole" in a two-dimensional array of bytes, which is computer vision, generalizes to "seeing" patterns in N-dimensional arrays of data. This "data pattern sense" is a sense organ humans lack, and neural networks can help provide us with that sixth sense.
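Here is a minimal sketch of that generalization on synthetic data: exactly the same network code classifies a flattened 8x8 "image" and an arbitrary 10-dimensional data set. The cluster parameters are made up; the point is only that the dimensionality and meaning of the input are interchangeable.

```python
# The same "pattern sense" code, applied to inputs of any dimension.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)

def train_and_score(n_features):
    # Two overlapping clusters in an n_features-dimensional space; the
    # network learns to separate them regardless of what the axes "mean".
    a = rng.normal(0.0, 1.0, size=(500, n_features))
    b = rng.normal(1.5, 1.0, size=(500, n_features))
    X = np.vstack([a, b])
    y = np.array([0] * 500 + [1] * 500)
    net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
    return net.fit(X, y).score(X, y)

print(train_and_score(64))  # e.g. a tiny 8x8 "image", flattened
print(train_and_score(10))  # an arbitrary 10-dimensional data set
```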

Beyond that, neural networks have to connect up with some kind of symbolic representation in order to handle concepts, even simple relations like "bigger than", "smaller than", "behind", and "above". I learned about this back in February from the seventh lecture, Neurosymbolic AI, of MIT's introduction to deep learning, 6.S191, and one regret this year is that I have not been able to follow up on it. The idea is something like this (words added to clips from David Cox's slides):


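As a rough illustration of the idea in code (my own toy sketch, not David Cox's architecture): a "neural" stage would estimate object attributes such as size and depth, and a symbolic stage evaluates relations over those attributes. The attribute values below are hard-coded stand-ins for a network's perception output.

```python
# Toy neurosymbolic pattern: neural attributes in, symbolic relations out.
from dataclasses import dataclass

@dataclass
class Thing:
    name: str
    size: float   # pretend this came from a neural network's output
    depth: float  # distance from the camera, also a network estimate

# Symbolic relations defined over the neural attributes.
def bigger_than(a: Thing, b: Thing) -> bool:
    return a.size > b.size

def behind(a: Thing, b: Thing) -> bool:
    return a.depth > b.depth

# In a real system these attributes would come from a vision model;
# here they are hard-coded stand-ins.
cat = Thing("cat", size=0.5, depth=2.0)
pole = Thing("utility pole", size=8.0, depth=5.0)

print(bigger_than(pole, cat))  # True
print(behind(pole, cat))       # True
```

The division of labor is the appealing part: the network handles messy perception, while the relations stay explicit, composable, and inspectable.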
I believe the computer will have to connect what it perceives with its "data pattern sense" to symbolic representations, and from there, what it can do will be no better and no worse than what automated reasoning can do today, for instance in mathematical theorem provers. So if mathematics falls to Artificial Intelligence, then physics may follow, but not otherwise.
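To make "automated reasoning" a little more concrete, here is a minimal sketch of forward chaining over Horn clauses, one of the simplest techniques in that family. Real theorem provers are vastly more capable; the facts and rules below are invented for illustration.

```python
# Forward chaining: apply rules (premises -> conclusion) until no new
# facts appear. A tiny taste of symbolic automated reasoning.
def forward_chain(facts, rules):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

rules = [
    ({"man(socrates)"}, "mortal(socrates)"),
    ({"mortal(socrates)", "mortal(plato)"}, "all_observed_are_mortal"),
]
facts = forward_chain({"man(socrates)", "mortal(plato)"}, rules)
print(facts)  # derives "mortal(socrates)" and then "all_observed_are_mortal"
```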