Current Research Projects
Students will be divided into three groups, and each group will work closely with two faculty mentors on an open research problem. The faculty mentors will introduce their students to one of the following research projects.
- Mathematical Biology and Differential Equations
- Title: Simulating the propulsion of microorganisms using deep learning.
- Expected Background: Prior exposure to the Python programming language and successful completion of single- and multi-variable calculus.
- Project Description: The study of biological microorganisms has led to the design of micro-robots that are being used for a variety of tasks. One application of these micro-robots is in healthcare, where they are packaged in microscopic sacs to deliver drugs to a targeted site safely and efficiently. However, while great progress has been made in our understanding of the ways microorganisms move, several questions and complex scenarios remain to be explored.
In this project we will investigate the propulsion of various swimming microorganisms numerically, using deep learning (large artificial neural networks). In the first 3-4 weeks of the program, faculty mentors will give students a working introduction to the fundamentals of deep learning and will train them in the use of DeepXDE, the Python toolbox that will be used throughout the project. For the remainder of the program, students will simulate specific problems related to the motion of microorganisms in various fluids with one goal in mind: approximating the propulsion speed of the swimmer.
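As a rough, hypothetical illustration of the kind of DeepXDE workflow students will learn (not one of the swimmer models studied in the project), the sketch below trains a small network to solve a toy one-dimensional Poisson problem. It assumes a recent release of DeepXDE; names such as dde.icbc.DirichletBC and the iterations keyword vary slightly across versions.

```python
import deepxde as dde

# Toy problem for illustration only: solve -u''(x) = 2 on (0, 1) with u(0) = u(1) = 0.
# The exact solution is u(x) = x(1 - x), so the trained network can be checked against it.

def pde(x, u):
    du_xx = dde.grad.hessian(u, x)  # second derivative of the network output u(x)
    return -du_xx - 2.0             # PDE residual the network is trained to drive to zero

geom = dde.geometry.Interval(0, 1)
bc = dde.icbc.DirichletBC(geom, lambda x: 0, lambda x, on_boundary: on_boundary)
data = dde.data.PDE(geom, pde, bc, num_domain=64, num_boundary=2)

net = dde.nn.FNN([1, 32, 32, 1], "tanh", "Glorot uniform")  # small fully connected network
model = dde.Model(data, net)
model.compile("adam", lr=1e-3)
model.train(iterations=5000)
```

The same pattern of geometry, PDE residual, boundary conditions, network, and training carries over when the residual comes from the fluid equations governing a swimmer, with the quantity of interest (here, the propulsion speed) extracted from the learned solution.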
- Graph Theory and Model Theory
- Title: Applications of Model Theory to Graph Theory.
- Expected Background: Prior knowledge of model theory or logic is not expected. Prior knowledge of graph theory is not required, but may be helpful.
- Project Description: Model theory is a branch of mathematical logic which studies structures in the abstract. When applied to the specific field of graph theory, many interesting results emerge. For example, a recent paper by Malliaris and Shelah shows that, under a model-theoretic assumption known as stability, finite graphs have a particularly nice regularity: they can be partitioned into pieces of roughly the same size such that, between any two pieces, the edges are distributed roughly uniformly. In this research project, we want to explore further connections between model theory and graph theory. In particular, we want to answer questions about coloring properties of classes of graphs through a model-theoretic lens. Which classes of graphs are indivisible (if we color the vertices of a large graph in the class with finitely many colors, must it contain smaller graphs from the class in a single color)? How does this change if we modify our notion of indivisibility?
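As a small, self-contained illustration of the quantity that regularity statements control, the hypothetical Python snippet below builds a random graph and measures how uniformly its edges are spread between the parts of a vertex partition; the graph size, edge probability, and partition are arbitrary choices made only for this example.

```python
import itertools
import random

def edge_density(edges, part_a, part_b):
    """Fraction of the possible edges between part_a and part_b that are present."""
    edge_set = {frozenset(e) for e in edges}
    possible = len(part_a) * len(part_b)
    present = sum(1 for u, v in itertools.product(part_a, part_b)
                  if frozenset((u, v)) in edge_set)
    return present / possible

# Random graph on 60 vertices with edge probability 0.3, split into three equal parts.
n, p = 60, 0.3
vertices = list(range(n))
edges = [(u, v) for u, v in itertools.combinations(vertices, 2) if random.random() < p]
parts = [vertices[0:20], vertices[20:40], vertices[40:60]]

# Regularity statements say that, for a suitable partition, these densities behave
# very uniformly; for a purely random graph they all hover near p.
for a, b in itertools.combinations(range(3), 2):
    print(f"density between part {a} and part {b}: {edge_density(edges, parts[a], parts[b]):.2f}")
```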
- Probability and Neural Networks
- Title: Probabilistic Expressive Power of Neural Networks.
- Expected Background: Successful completion of single- and multi-variable calculus, as well as a course in probability.
- Project Description: Neural networks (particularly deep networks) are currently the most successful and prevalent technique for many machine-learning tasks. Correspondingly, research into the mathematical properties of neural networks has greatly expanded during the past decade. Once a neural network has a set of weights and biases, it produces an underlying function from a parameterized family of functions. One of the choices made before the weights and biases are determined is the activation function, and among the most widely used activation functions is the ReLU function, which takes the positive part of its input. We will study ReLU neural networks, that is, networks that use the ReLU function for their activation functions.
Recent work of Lu and collaborators analyzed the likelihood that the underlying function of a ReLU network is constant, a likelihood that becomes significant for deep networks. Such a network is ineffective, as it will be unable to train on data. In our project, we will build on these ideas. For example, noting that a constant function can be realized by a small network, we will analyze the likelihood that the underlying function of a neural network is equivalent to the underlying function of a smaller network. We will also consider how likely it is for networks with different architectures to produce underlying functions of comparable complexity.
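To make the phenomenon concrete, here is a small hypothetical NumPy sketch that samples a deep, narrow fully connected ReLU network with Gaussian weights and biases and checks how close its underlying function is to a constant on a grid of inputs; the depth, width, and initialization are arbitrary choices for illustration, not the precise random model analyzed by Lu and collaborators.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_relu_network(widths):
    """Return the function computed by a fully connected ReLU network with random
    Gaussian weights and biases; `widths` lists the layer sizes from input to output."""
    params = [(rng.normal(size=(m, n)), rng.normal(size=n))
              for m, n in zip(widths[:-1], widths[1:])]

    def f(x):
        h = x
        for i, (W, b) in enumerate(params):
            h = h @ W + b
            if i < len(params) - 1:      # ReLU on hidden layers only
                h = np.maximum(h, 0.0)
        return h

    return f

# One-dimensional input, 20 hidden layers of width 5, scalar output.
f = random_relu_network([1] + [5] * 20 + [1])
grid = np.linspace(-3, 3, 200).reshape(-1, 1)
values = f(grid)
print("range of outputs on the grid:", values.max() - values.min())
```

A printed range at or near zero means the sampled network has effectively collapsed to a constant underlying function, the failure mode described above; with many narrow layers this tends to happen frequently.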