Research on number theory conjectures, leading to the development of new machine learning techniques, initially applied to math problems before being tested on real-life problems.

VincentGranville/Experimental-Math-Number-Theory

The following content is now available. Source code and images are explained in detail in the PDF accompanying these articles:

  • Detecting Subtle Departures from Randomness, available here.

    Summary:

    I discuss, in simple English, how to detect weak deviations from randomness, and workarounds that yield better random-looking, unbreakable sequences. The theoretical background is tied to the famous unsolved Riemann Hypothesis in number theory. The article also offers a solid, state-of-the-art summary of the topic, accessible to machine learning practitioners and beginners, and to decision makers in the field. The subject is usually explained in obscure jargon or inane generalities; on the contrary, this article will intrigue you with the beauty and power of the theory. A minimal randomness test is sketched in the first code example after this list.

  • New Perspective on the Riemann Hypothesis, available here.

    Summary:

    In about 10 pages (plus Python code, exercises, and figures), this article constitutes a crash course on the subject. It covers a wide range of topics, both recent and unpublished, in a very compact style. Full of clickable references, the document covers the basics and offers a light reading experience, yet it also includes plenty of advanced, state-of-the-art material explained as simply as possible. Written by a machine learning professional working on experimental math, it targets other machine learning professionals; physicists, mathematicians, quants, statisticians, and engineers will hopefully find it easy to read, interesting, and a source of new research horizons. Exercise 8 is particularly intriguing, showing a potential new path to proving the Riemann Hypothesis. A minimal numerical illustration is given in the second code example after this list.

  • Math-free, Parameter-free Gradient Descent in Python, available here.

    Summary:

    I discuss techniques related to the gradient descent method in 2D. The goal is to find the minima of a target function, called the cost function. The values of the function are computed at evenly spaced locations on a grid and stored in memory. As a result, the approach is not directly based on derivatives, and no calculus is involved: it implicitly uses discrete derivatives, but it is foremost a simple geometric algorithm. The learning parameter typically attached to gradient descent is explicitly fixed here: it is equal to the granularity of the mesh and does not need fine-tuning. In addition to gradient descent and ascent, I also show how to build contour lines and orthogonal trajectories with the exact same algorithm. I apply the method to investigate one of the most famous unsolved problems in mathematics, the Riemann Hypothesis; the functions studied here are defined on the complex plane. A minimal reconstruction of the grid-based descent is given in the third code example after this list.
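
The first sketch below is a minimal, hedged illustration for the randomness article: a standard chi-square test on non-overlapping bit blocks, applied to the binary digits of the square root of 2 (a sequence conjectured, not proven, to be normal). Both the test and the choice of sequence are my assumptions for illustration; the article's own methods may differ.

```python
# Hypothetical illustration, not the article's code: chi-square test on
# non-overlapping k-bit blocks of a binary sequence. For truly random bits,
# each of the 2**k patterns is equally likely, and the statistic roughly
# follows a chi-square distribution with 2**k - 1 degrees of freedom.
from collections import Counter
from itertools import product
from math import isqrt

def block_chi_square(bits, k=3):
    blocks = [tuple(bits[i:i + k]) for i in range(0, len(bits) - k + 1, k)]
    counts = Counter(blocks)
    expected = len(blocks) / 2 ** k
    return sum((counts.get(p, 0) - expected) ** 2 / expected
               for p in product((0, 1), repeat=k))

# Test sequence: the first 100,000 fractional binary digits of sqrt(2),
# obtained from floor(sqrt(2) * 2**n) via exact integer arithmetic.
n = 100_000
digits = bin(isqrt(2 << (2 * n)))[2:]  # binary digits of floor(sqrt(2) * 2**n)
bits = [int(d) for d in digits[1:]]    # drop the integer bit, keep n fractional bits
print(block_chi_square(bits))          # for k=3, values above ~14.1 are suspicious at the 5% level
```

A statistic near the number of degrees of freedom (7 for k = 3) is consistent with randomness; the subtle departures that the article targets require far more sensitive tests than this one.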
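
The second sketch relates to the Riemann Hypothesis article: it numerically locates the first nontrivial zero of the zeta function on the critical line, using the alternating Dirichlet eta series η(s) = Σ (−1)^(n+1) n^(−s), which converges for Re(s) > 0, together with ζ(s) = η(s) / (1 − 2^(1−s)). This is a generic textbook approach, sketched here under my own assumptions, not the article's implementation.

```python
# Hypothetical sketch: approximate zeta(1/2 + i*t) via a truncated eta series,
# then scan |zeta| along the critical line. The first nontrivial zero is
# known to lie near t = 14.1347; truncation error is roughly n_terms**-0.5.
import numpy as np

def zeta_critical_line(t, n_terms=100_000):
    n = np.arange(1, n_terms + 1, dtype=np.float64)
    sign = np.where(n % 2 == 1, 1.0, -1.0)  # (-1)**(n+1)
    s = 0.5 + 1j * t
    eta = np.sum(sign * n ** (-s))          # Dirichlet eta, truncated
    return eta / (1 - 2 ** (1 - s))         # convert eta to zeta

ts = np.arange(10.0, 16.0, 0.05)
vals = [abs(zeta_critical_line(t)) for t in ts]
print("|zeta| is smallest near t =", ts[int(np.argmin(vals))])  # ~14.1 on this grid
```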
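
The third sketch reconstructs the grid-based, calculus-free descent described in the last summary, under assumptions of mine (8-neighbor moves, a toy quadratic cost function): the cost is tabulated on a mesh of granularity h, and each step moves to the lowest-valued neighboring cell, so h plays the role of the learning parameter.

```python
# Minimal reconstruction (not the article's exact code) of gradient descent
# on a grid: no derivatives, just repeated moves to the best of the 8
# neighboring cells until no neighbor has a lower cost value.
import numpy as np

def grid_descent(values, start):
    """Walk downhill on a 2D array of precomputed cost values.

    Returns the list of (row, col) cells visited, ending at a local minimum.
    """
    rows, cols = values.shape
    r, c = start
    path = [start]
    while True:
        neighbors = [(r + dr, c + dc)
                     for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                     if (dr or dc) and 0 <= r + dr < rows and 0 <= c + dc < cols]
        best = min(neighbors, key=lambda cell: values[cell])
        if values[best] >= values[r, c]:  # no lower neighbor: local minimum found
            return path
        r, c = best
        path.append(best)

# Toy cost function tabulated on a mesh of granularity h (the implicit
# learning parameter); its minimum sits at (0.7, -0.4).
h = 0.01
x, y = np.meshgrid(np.arange(-2, 2, h), np.arange(-2, 2, h), indexing="ij")
cost = (x - 0.7) ** 2 + (y + 0.4) ** 2
path = grid_descent(cost, start=(10, 10))
r, c = path[-1]
print(f"{len(path)} steps, ended near ({x[r, c]:.2f}, {y[r, c]:.2f})")
```

Gradient ascent is the same walk with max instead of min; contour lines and orthogonal trajectories, as described in the article, follow from the same neighbor-comparison idea.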
