Pittsburgh Interdisciplinary Mathematics Review
https://pimr.pitt.edu/pimr
An open-access journal on pure and interdisciplinary mathematics edited by students from Pittsburgh, PA.
University Library System, University of Pittsburgh. Language: en-US. ISSN: 2995-6544.

Proofs Without Words
https://pimr.pitt.edu/pimr/article/view/46
Wordless proofs of the formula for the area of a regular dodecagon and of Viviani's Theorem.
Paul Gartside. Copyright (c) 2024 Paul Gartside. License: https://creativecommons.org/licenses/by/4.0. Published 2024-12-06. Vol. 2, p. 142. DOI: 10.5195/pimr.2024.46.

An Interview with Professor Piotr Hajłasz
https://pimr.pitt.edu/pimr/article/view/42
Piotr Hajłasz was born in Warsaw in 1966. He went to the University of Warsaw for his undergraduate studies and stayed there until 2004, working his way up through the ranks to associate professor after receiving his Ph.D. in 1994 under the supervision of Bogdan Bojarski. As of today, he has been a professor of mathematics at the University of Pittsburgh for a full twenty years. He was elected a Fellow of the American Mathematical Society in 2017. In recognition of his contributions, he was awarded the prestigious Sierpiński Medal by the Polish Mathematical Society in 2021, placing him among the ranks of distinguished mathematicians including Paul Erdős, Stanisław Ulam, and Benoit Mandelbrot. His research is in geometric function theory, which covers a wide range of topics on the borderline of classical analysis, geometric analysis, the theory of Sobolev spaces, and analysis on metric spaces; he is known for the Hajłasz-Sobolev spaces.
Neil MacLachlan, Lark Song. Copyright (c) 2024 Piotr Hajłasz, Neil MacLachlan, Lark Song. License: https://creativecommons.org/licenses/by/4.0. Published 2024-12-06. Vol. 2, pp. 106-114. DOI: 10.5195/pimr.2024.42.

Mathematics of Machine Learning: An Introduction
https://pimr.pitt.edu/pimr/article/view/38
Over the course of just a few decades, machine learning has grown into a force which shapes modern life perhaps as much as the combustion engine or wireless communication. As we drive to work, machine learning algorithms extract license plate numbers from images captured by automatic cameras at busy intersections. At work, they measure our productivity and govern our supply chains. In our personal lives, they power product recommendations on online shopping sites and suggestions on social media. In our homes and on our devices, they recognize our voices and our faces. They process our loan applications and evaluate our medical images. More globally, they stabilize power grids and assist in planning flight routes. There is not a space in our public and personal lives which machine learning has not at least begun to affect.

For a field so ubiquitous, machine learning is fairly poorly represented in our common knowledge. This article aims to give a high-level introduction to some core tasks, ideas, and methods of machine learning for readers who are familiar with at least some undergraduate mathematics. Advanced readers may get more from certain sections, but our goal is to present a self-contained picture which requires little knowledge beyond multivariable calculus and elementary linear algebra. For readers who have not taken many advanced classes, this may also motivate why certain fields of study are of interest in applications.

One big omission in this article is deep neural networks, i.e., the models which underlie 'deep learning.' Due to their special importance, they will be discussed in a separate companion article.
Stephan Wojtowytsch. Copyright (c) 2024 Stephan Wojtowytsch. License: https://creativecommons.org/licenses/by/4.0. Published 2024-12-06. Vol. 2, pp. 1-25. DOI: 10.5195/pimr.2024.38.
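To give a flavor of the core tasks Wojtowytsch's article introduces, here is a minimal sketch, in Python, of perhaps the most basic learning problem: fitting a linear model by gradient descent on a squared-error loss. (This example is ours, not the article's, and the data are synthetic.)

```python
import numpy as np

# Hypothetical illustration (not from the article): fit y ~ X @ w + b by
# gradient descent on the mean squared error, the prototypical learning task.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                          # 200 samples, 3 features
w_true, b_true = np.array([1.5, -2.0, 0.5]), 0.7
y = X @ w_true + b_true + 0.1 * rng.normal(size=200)   # noisy labels

w, b, lr = np.zeros(3), 0.0, 0.1
for _ in range(500):
    r = X @ w + b - y                 # residuals
    grad_w = 2 * X.T @ r / len(y)     # gradient of the MSE with respect to w
    grad_b = 2 * r.mean()             # gradient of the MSE with respect to b
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)  # should approach w_true and b_true
```

Everything here is calculus and linear algebra: the loss is a smooth function of the parameters, and each step moves downhill along its gradient.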
Stirring Coffee
https://pimr.pitt.edu/pimr/article/view/36
Have you ever pondered, after gently swirling a spoon through your steaming cup of coffee, whether, despite your meticulous stirring efforts, a solitary coffee molecule might have ended up exactly where it was before you stirred? Legend has it that the Dutch mathematician Luitzen Egbertus Jan Brouwer (1881-1966) engaged in precisely such an experiment, ultimately establishing what is now celebrated as the Brouwer Fixed Point Theorem, a beautiful result combining topology with analysis.
Armin Schikorra. Copyright (c) 2024 Armin Schikorra. License: https://creativecommons.org/licenses/by/4.0. Published 2024-12-06. Vol. 2, pp. 26-32. DOI: 10.5195/pimr.2024.36.

The Category of Graphs
https://pimr.pitt.edu/pimr/article/view/37
The category of graphs and the mappings between them is considered. The monomorphisms and epimorphisms are characterized. Reflective and coreflective subcategories are identified, and the terminal, initial, projective, and injective objects are characterized. Parallels with the category of topological spaces are discussed.
Russell Walker. Copyright (c) 2024 Russell Walker. License: https://creativecommons.org/licenses/by/4.0. Published 2024-12-06. Vol. 2, pp. 33-48. DOI: 10.5195/pimr.2024.37.

Defining Probabilities Over the Prime Numbers
https://pimr.pitt.edu/pimr/article/view/35
This paper details Fawcett's experience solving a problem he stumbled upon while tutoring math and statistics at Cape Fear Community College in Wilmington, North Carolina. Though at first glance the problem seemed simple, he soon realized that the solution would require knowledge of the Riemann zeta function, a common fascination for math enthusiasts like him. To follow the arguments in this paper, all that is needed is some elementary knowledge of probability and Calculus II.
Oscar Fawcett. Copyright (c) 2024 Oscar Fawcett. License: https://creativecommons.org/licenses/by/4.0. Published 2024-12-06. Vol. 2, pp. 49-58. DOI: 10.5195/pimr.2024.35.
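As background for Fawcett's article: the zeta function provides a standard way to put a probability distribution on the positive integers (this is the classical construction; the paper's own setup may differ). For \(s > 1\), set

\[
\mathbb{P}_s(\{n\}) = \frac{n^{-s}}{\zeta(s)}, \qquad \zeta(s) = \sum_{n=1}^{\infty} n^{-s}.
\]

The probability that a random integer \(N\) drawn this way is divisible by a fixed \(m\) is then

\[
\mathbb{P}_s(m \text{ divides } N) = \sum_{k=1}^{\infty} \frac{(km)^{-s}}{\zeta(s)} = m^{-s},
\]

and divisibility by distinct primes gives independent events; since \(N = 1\) is the only integer divisible by no prime, this recovers the Euler product \(\zeta(s)^{-1} = \prod_p \left(1 - p^{-s}\right)\).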
Beach Math
https://pimr.pitt.edu/pimr/article/view/45
You're on the beach. The water's too chilly for a swim, and the sand is blazing outside of your trusty umbrella and towel. To top it off, you forgot your charger, and your novel is sitting at home because you thought you wouldn't need it. Your friends have wandered off, so you are left with nothing but your thoughts. What else is there to do but think about that math problem you saw in PIMR a while ago? That's where Beach Math comes in: bringing you challenging yet relaxing problems to keep your vacation boredom at bay. We hope you enjoy!
Ryder Pham, Evan Hyzer, Anastasiia Rudenko. Copyright (c) 2024 Ryder Pham, Evan Hyzer, Anastasiia Rudenko. License: https://creativecommons.org/licenses/by/4.0. Published 2024-12-06. Vol. 2, pp. 138-141. DOI: 10.5195/pimr.2024.45.

A History of the Department of Mathematics at the University of Pittsburgh, 1787-1995
https://pimr.pitt.edu/pimr/article/view/43
This article outlines a history of the Department of Mathematics at the University of Pittsburgh from 1787 to 1995.
Stuart Hastings. Copyright (c) 2024 Stuart Hastings. License: https://creativecommons.org/licenses/by/4.0. Published 2024-12-06. Vol. 2, pp. 115-132. DOI: 10.5195/pimr.2024.43.

Applying to a Math PhD Program
https://pimr.pitt.edu/pimr/article/view/44
In this brief article, Neilan gives his personal recommendations based on what he has learned by going through the process of applying to math PhD programs himself and by serving as the Director of Graduate Studies at Pitt Math for the last two years. The advice is mostly geared towards current undergraduates at US universities applying to a mathematics PhD program, but some aspects may also be applicable to prospective master's students and international students.
Michael Neilan. Copyright (c) 2024 Michael Neilan. License: https://creativecommons.org/licenses/by/4.0. Published 2024-12-06. Vol. 2, pp. 133-137. DOI: 10.5195/pimr.2024.44.

Shallow Neural Networks and Laplace's Equation on the Half-Space with Dirichlet Boundary Data
https://pimr.pitt.edu/pimr/article/view/39
In this paper we investigate the ability of shallow neural networks, i.e., neural networks with one hidden layer, to solve Laplace's equation on the half-space. We are interested in the following question: if it is possible to fit the boundary values using a neural network, is it then possible to learn the solution to the PDE in the entire region using the same network? Our analysis is done primarily in Barron spaces, which are function spaces designed to include neural networks with a single hidden layer and infinite width. Our results indicate that, in general, the solution is not in the Barron space even if the boundary values are. However, the solution can be approximated to accuracy \(\sim \varepsilon^2\) by functions of low Barron norm. We implement a physics-informed neural network with a custom loss function to demonstrate some of these theoretical results.
Malhar Vaishampayan. Copyright (c) 2024 Malhar Vaishampayan. License: https://creativecommons.org/licenses/by/4.0. Published 2024-12-06. Vol. 2, pp. 59-70. DOI: 10.5195/pimr.2024.39.
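For readers unfamiliar with physics-informed neural networks: a typical loss for Laplace's equation on the half-space with Dirichlet data \(g\) (a generic sketch; Vaishampayan's custom loss function may differ) penalizes the PDE residual at interior sample points and the boundary mismatch at boundary sample points,

\[
\mathcal{L}(\theta) = \frac{1}{N} \sum_{i=1}^{N} \left| \Delta u_\theta(x_i) \right|^2 + \frac{\lambda}{M} \sum_{j=1}^{M} \left| u_\theta(y_j) - g(y_j) \right|^2,
\]

where \(u_\theta\) is the network with parameters \(\theta\), the \(x_i\) lie in the interior of the half-space, the \(y_j\) lie on its boundary, and \(\lambda > 0\) weights the boundary term. Minimizing \(\mathcal{L}\) drives \(u_\theta\) towards a harmonic function attaining the prescribed boundary values.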
A Generalization of Placing Identical Items into Identical Bins
https://pimr.pitt.edu/pimr/article/view/40
A common approach to counting the number of ways to place identical items into identical bins is casework. In this article, an alternative approach is introduced, and robust mathematical formulas are established to calculate the number of ways of placing an arbitrary number of identical items into an arbitrary number of identical bins. First, single closed formulas for the cases of two and three bins are developed for an arbitrary number of items. Second, a recursive formula for more than three bins is derived, again for an arbitrary number of items. This recursive formula reduces the number of bins by one at each step until reaching the base case of three bins, to which the closed formula derived in this paper can be applied. A Python program implementing the derived formulas counts the number of ways for arbitrary numbers of bins and items.
Annie Wang. Copyright (c) 2024 Annie Wang. License: https://creativecommons.org/licenses/by/4.0. Published 2024-12-06. Vol. 2, pp. 71-88. DOI: 10.5195/pimr.2024.40.

On The Number of Robust Geometric Graphs in a Euclidean Space
https://pimr.pitt.edu/pimr/article/view/41
We say that a graph \(\Gamma = (V, E)\) with vertices in a \(k\)-dimensional Euclidean space is an \(\varepsilon\)-robust distance graph with threshold \(\tau\) if any two vertices \(v, w \in V\) are adjacent if and only if \(\Vert v - w \Vert_2 \leq (1 + \varepsilon)^{-1}\tau\) and are non-adjacent if and only if \(\Vert v - w \Vert_2 > (1 + \varepsilon)\tau\) (so that no pairwise distance falls in the intermediate range). We show that there are universal constants \(C', c', c > 0\) with the following property. For \(k \geq C' d \log{n}\), asymptotically almost every \(d\)-regular graph on \(n\) vertices is isomorphic to a \(\frac{c}{\sqrt{d}}\)-robust distance graph in \(\mathbb{R}^k\), whereas for \(k \leq \frac{c' d \log{n}}{\log{d}}\), asymptotically almost every \(d\)-regular graph on \(n\) vertices cannot be represented as a \(\frac{c}{\sqrt{d}}\)-robust distance graph.
Lufei Yang, Colin Yip, Madison Zhao, Konstantin Tikhomirov. Copyright (c) 2024 Lufei Yang, Colin Yip, Madison Zhao, Konstantin Tikhomirov. License: https://creativecommons.org/licenses/by/4.0. Published 2024-12-06. Vol. 2, pp. 89-105. DOI: 10.5195/pimr.2024.41.
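Returning to Wang's article above: placing \(n\) identical items into \(k\) identical bins amounts to counting the partitions of \(n\) into at most \(k\) parts. For comparison, here is a minimal Python sketch using the standard partition recursion (not the closed and recursive formulas derived in the article):

```python
from functools import lru_cache

# Standard partition recursion (not the article's derived formulas):
# ways(n, k) = number of ways to place n identical items into k identical
# bins, i.e. partitions of n into at most k parts. Either some bin is empty
# (giving ways(n, k-1)), or every bin gets at least one item and removing
# one item from each bin gives ways(n-k, k).
@lru_cache(maxsize=None)
def ways(n: int, k: int) -> int:
    if n == 0:
        return 1   # one way: all bins empty
    if k == 0:
        return 0   # items remain but no bins are left
    if n < 0:
        return 0
    return ways(n, k - 1) + ways(n - k, k)

print(ways(5, 3))  # 5 = 5 = 4+1 = 3+2 = 3+1+1 = 2+2+1, so 5 ways
```

The article's recursion likewise reduces the number of bins step by step, but it bottoms out at the closed three-bin formula rather than recursing all the way down.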