Yesterday, I read an excellent article by Mumford entitled “The Dawning of the Age of Stochasticity”.
The core message of the article – that probability and statistics represent a new frontier – is old news for people working in machine learning, and more generally among computer scientists. However, there is one important way in which (I think) Mumford’s argument is more refined than many previous articles and manifestos I have read on this topic. He argues for promoting notions such as probability and random variables to the status of first-class entities in mathematics – which means there is a need for tighter integration between probabilistic methods and other tools such as geometry, topology, or algebra, and room for exciting new algorithmic techniques arising from such unions. In fact, the paper includes examples of how such a merger is not just good for applications but can also resolve deep questions at the foundations of mathematics and theoretical physics.
This message is, in my personal opinion, much more appealing than the standard line I often hear from ML, AI, and CS folks – that statistical and probabilistic methods are to be used as exclusive replacements for other analytical tools. In my experience, many of the more difficult open problems in areas such as machine learning for dynamically dexterous robots remain unsolved precisely because existing statistical and probabilistic tools are not sharp enough to capture the subtleties of the solutions. On the other hand, one look at the biological world makes it clear that even primitive animals often utilize deep “insights” (albeit acquired and tuned over eons via evolutionary processes).
I believe that a pragmatic merger of probabilistic methods with other forms of mathematical analysis yields a much sharper scalpel with which to address real problems. So, I am pleased to hear a similar message from someone vastly more experienced and knowledgeable than I am.