Spring Term Schedule 2020

All meetings this term will be conducted remotely using the conferencing app Zoom.


Friday, May 15, 2-3 PM

Joel Shapiro will speak on:

What the Poisson Summation Formula knows about the Heat Equation

Abstract: For the real line, the solution to the initial-value problem for the time-varying heat equation can be obtained by convolving the initial temperature distribution with a gaussian kernel. The same problem for the unit circle also involves a convolution, but now with a kernel that cannot be expressed in closed form. In this talk I’ll show how the Poisson Summation Formula connects these two convolution kernels in a way that allows us to infer useful properties of the enigmatic circle kernel from those of the straightforward gaussian.
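
For orientation (normalizing the equation as \(u_t = u_{xx}\) and taking the circle to have circumference \(2\pi\)), the real-line solution and kernel are
\[ u(x,t) = \int_{-\infty}^{\infty} G_t(x-y)\, f(y)\, dy, \qquad G_t(x) = \frac{1}{\sqrt{4\pi t}}\, e^{-x^2/(4t)}, \]
while the circle kernel is the “wrapped” Gaussian, which the Poisson Summation Formula rewrites as a rapidly converging Fourier series:
\[ \Theta_t(x) = \sum_{n\in\mathbb{Z}} G_t(x + 2\pi n) = \frac{1}{2\pi} \sum_{k\in\mathbb{Z}} e^{-k^2 t}\, e^{ikx}. \]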

Notes for this talk are here … (~22 MB)


Friday, May 8, 2-3 PM

Thomas Wicks will speak on:

The space of bounded deformations

Abstract: T. Mura and S. Lee developed a variational problem in 1963 to determine the limiting problem of plastic flow. The space of bounded deformations, in which an extremum of this functional could be found, was then introduced by P. Suquet in 1978: its elements are displacement fields that are Lebesgue integrable but whose deformation gradients are bounded Radon measures.
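
In modern notation (a standard formulation, given here for orientation rather than exactly as in Suquet’s paper), the space in question is
\[ BD(\Omega) = \left\{ u \in L^1(\Omega; R^n) : Eu := \tfrac{1}{2}\bigl(Du + Du^{T}\bigr) \text{ is a bounded Radon measure on } \Omega \right\}. \]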

Since then there have been over 1,700 publications on this space alone, with many more on approximating the solution of the variational problem that gave rise to it. Mathematical machinery allowing one to find a solution in this space was introduced in 1980 by R. Temam and G. Strang.

This talk will discuss the work of J.-F. Babadjian, who in 2015 discovered and corrected inconsistencies in the proofs of basic theorems about this space.

Notes for this talk are here


Friday, May 1, 2–3 PM

Joel Shapiro will speak on:

The Poisson Summation Formula

Abstract: The Poisson Summation Formula reveals a remarkable connection between Fourier series and Fourier integrals. In this talk we’ll review the relevant Fourier theory, prove the Summation Formula, and then see how it lies at the heart of the Shannon-Nyquist Sampling Theorem of signal analysis.
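
In its most common form (for sufficiently smooth, rapidly decaying \(f\), with \(\hat f(\xi) = \int_{-\infty}^{\infty} f(x)\, e^{-2\pi i x \xi}\, dx\)), the formula reads
\[ \sum_{n\in\mathbb{Z}} f(n) = \sum_{k\in\mathbb{Z}} \hat f(k). \]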

Notes for this talk are here … (~21 MB)


Friday, April 24, 2–3 PM

Tuyen Tran (PSU) will speak on:

Convex Analysis with Applications to Multifacility Location Problems

Abstract: Convex analysis is one of the most important and useful areas of mathematical sciences with numerous applications to optimization, economics, systems control, and statistics. The presence of convexity makes it possible not only to comprehensively investigate qualitative properties of optimal solutions and derive necessary and sufficient conditions for optimality, but also to develop effective numerical algorithms for solving convex optimization problems.

However, solving large-scale optimization problems without the presence of convexity still remains a challenge. In this talk, we will give an overview of some useful techniques and numerical algorithms in convex analysis to deal with facility location problems in which the objective functions are not necessarily convex. We also provide numerical examples and present a broader picture of the development of nonsmooth optimization methods in solving multifacility location and clustering problems.
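
One standard model of the kind of problem meant here (an illustrative formulation, not necessarily the exact one used in the talk): given data points \(a_1,\dots,a_m\) and \(k\) facilities, minimize
\[ f(x_1,\dots,x_k) = \sum_{i=1}^{m} \min_{1\le j\le k} \|x_j - a_i\|. \]
The inner minimum makes \(f\) nonconvex, but it is a difference of convex (DC) functions, since \(\min_j g_j = \sum_j g_j - \max_{l}\sum_{j\ne l} g_j\).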


Friday, April 17, 2–3 PM

Pieter Vandenberge (PSU) will speak on:

Visualizing the method of Lagrange multipliers

Abstract: The method of Lagrange multipliers offers a way to find the maximum and minimum values of a function when it obeys constraints—often defined by functions themselves. The key insight of the technique is that at a critical point, the gradient of the constrained function must lie in a subspace generated by the gradients of the constraint functions.

All of this is quite possible to visualize. In this talk, we will recall the original method of Lagrange multipliers in the case of a single constraint, and visualize an example. Then we’ll move on to multiple constraints to see the constraint subspaces come to life.
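
In symbols: if \(x^*\) is a constrained critical point of \(f\) subject to \(g_1(x) = \dots = g_m(x) = 0\) (with the constraint gradients independent at \(x^*\)), then there are multipliers \(\lambda_1,\dots,\lambda_m\) with
\[ \nabla f(x^*) = \sum_{i=1}^{m} \lambda_i\, \nabla g_i(x^*). \]
For a single constraint this is the familiar \(\nabla f = \lambda \nabla g\); for example, maximizing \(f(x,y) = xy\) on the line \(x + y = 1\) gives \((y, x) = \lambda(1,1)\), hence \(x = y = \tfrac12\).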

Prerequisites for this talk: Calc 4 and Linear Algebra.


Friday, April 10, 2–3 PM

Logan Fox (PSU) will speak on:

The Hausdorff metric and geodesics in the space of compact sets

Abstract: Given a metric space (X,d), the Pompeiu-Hausdorff metric allows us to form a metric space consisting of the compact subsets of X. In many cases, properties of this space of compact sets are inherited from the underlying metric space (X,d). However, the geometry of this space is not well understood.
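
Explicitly, for nonempty compact sets \(A, B \subseteq X\) the Pompeiu-Hausdorff distance is
\[ d_H(A,B) = \max\left\{ \sup_{a\in A}\,\inf_{b\in B} d(a,b),\ \sup_{b\in B}\,\inf_{a\in A} d(a,b) \right\}. \]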

After a review of geodesic metric spaces and the Pompeiu-Hausdorff distance between sets, we will examine a potentially new result concerning the existence of geodesics in the space of compact sets.

Slides for this talk are here.


Friday, April 3, 2-3 PM

Prof. Mau Nam Nguyen (PSU) will speak on:

Local Convex Functions and Local DC (Difference of Convex) Functions



Winter Term Schedule 2020

Friday, February 28, 2-3 PM in FMH 462

Jim Rulla will speak on: Some Somewhat Surprising Sums

Abstract. We’ll investigate a remarkably simple summation formula and apply it to several examples including the Dirichlet kernel (sums of trig functions) and Faulhaber’s identities (sums of polynomials).  We’ll also derive the corresponding formula for integrals. The derivation is quite elementary. 
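
Two examples of the sums in question (stated here for orientation; the summation formula itself is left for the talk): the Dirichlet kernel has the closed form
\[ D_n(x) = \sum_{k=-n}^{n} e^{ikx} = \frac{\sin\bigl((n+\tfrac12)x\bigr)}{\sin(x/2)}, \]
and the simplest Faulhaber identities are \(\sum_{k=1}^{n} k = \tfrac{n(n+1)}{2}\) and \(\sum_{k=1}^{n} k^2 = \tfrac{n(n+1)(2n+1)}{6}\).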

Notes for this talk are here.


Friday, February 21, 2-3 PM in FMH 462

Jim Rulla will speak on: Groups of operators and the Midpoint Rule

Abstract. The midpoint rule for integrals is the familiar approximation \[ \int_a^b f(x)\,dx \approx f\left(\frac{a+b}{2}\right)\left(b - a\right). \] The “midpoint” in the name is \(\bar x = \frac{a+b}{2}\), the midpoint of the interval \([a, b]\). The corresponding rule for sums is \[ \sum_{k = 1}^{n} f(x_k) \approx n f\left(\bar x\right), \] where \(\bar x = \frac{1}{n}\sum_{k=1}^{n} x_k\) is the average of the arguments of \(f\), and \(n\) is the number of summands. The midpoint rule makes short work of a class of sums involving groups of operators. We’ll discuss how groups of operators arise in the study of initial value problems, give several examples, and apply the midpoint rule to derive the Dirichlet kernel and Faulhaber’s identities.
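
A quick numerical illustration of both rules: for \(f(x) = x^2\) on \([0,2]\),
\[ \int_0^2 x^2\,dx = \tfrac{8}{3} \approx 2.67, \qquad f(1)\,(2-0) = 2, \]
and for the three arguments \(x_k = 1, 2, 3\),
\[ \sum_{k=1}^{3} x_k^2 = 14, \qquad 3\, f(\bar x) = 3 \cdot 2^2 = 12. \]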

The material is quite elementary. All are welcome!

Notes for this talk are here.


Friday, February 7 & 14, 2-3 PM in FMH 462

Joel Shapiro will speak on: How to interpolate linear operators

Abstract: One of the most surprising and useful results in present-day analysis is the “Riesz-Thorin Interpolation Theorem,” a special case of which asserts (roughly) that whenever a linear transformation is bounded on \(L^p\) for two different values of \(p\) then it is also bounded on \(L^p\) for every intermediate value of \(p\). In this talk we’ll discuss this remarkable theorem in its full generality, and show how it gives unified proofs of some classical inequalities from real analysis, Fourier series, and integral operators.
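
Quantitatively, in the special case of a single measure space: if \(T\) is bounded on \(L^{p_0}\) with norm \(M_0\) and on \(L^{p_1}\) with norm \(M_1\), then for \(0 < \theta < 1\) and \(\frac{1}{p} = \frac{1-\theta}{p_0} + \frac{\theta}{p_1}\),
\[ \|T\|_{L^p \to L^p} \le M_0^{\,1-\theta} M_1^{\,\theta}. \]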

Notes for these talks are here.


Friday, January 24 & 31, 2-3 PM in FMH 462

Logan Fox will speak on: One-sided derivative of distance in Aleksandrov space

Abstract: In an Aleksandrov space (that is, a complete, locally compact, and intrinsic metric space satisfying a bounded curvature condition) the semi-differentiability of distance is a common analytic tool. In these spaces, it is well-known that the distance between a geodesic and a compact set is right-differentiable; however, complete proofs of this result are difficult to find. This talk will review some of the basic properties of Aleksandrov spaces and explore a method for proving the semi-differentiability of distance to a compact set.

The paper on which this talk is based is here


Friday, January 10 & 17, 2-3 PM in FMH 462

Robert Lyons will speak on: Spread Spectrum

Abstract: Spread spectrum uses a collection of carrier frequencies to transmit a single message. We can also use spread-spectrum techniques to estimate alignment information for an image distorted by an affine transform. To accomplish this, we embed a constellation of frequency impulses in an image and use least squares to extract the embedded signal. The least-squares transform estimate recovers the linear part of the affine distortion, and we use partial-pixel frequency interpolation to estimate the least-squares frequency impulse locations to the nearest pixel. Along the way we will give some insight into windowing and its consequences.
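
A minimal sketch of the embedding-and-detection idea (the image, amplitude, frequencies, and variable names below are illustrative assumptions, not those of the talk): embed one frequency impulse as a low-amplitude 2D sinusoid and locate it as a peak in the FFT magnitude spectrum.

import numpy as np

rng = np.random.default_rng(0)
N = 256
image = rng.random((N, N))                      # stand-in for a real image
u0, v0 = 20, 33                                 # embedded spatial frequencies (cycles per image)
y, x = np.mgrid[0:N, 0:N]
carrier = np.cos(2 * np.pi * (u0 * x + v0 * y) / N)
marked = image + 0.05 * carrier                 # weak frequency-impulse embedding

spectrum = np.abs(np.fft.fft2(marked - marked.mean()))
spectrum[0, 0] = 0                              # suppress any residual DC term
peak = np.unravel_index(np.argmax(spectrum), spectrum.shape)
print("detected impulse near (row, col) =", peak)   # approximately (v0, u0) or its conjugate alias

A full constellation of several such impulses, together with a least-squares fit of their displacements, is what recovers the affine alignment described above.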



Fall Term Schedule 2019

Friday, December 6, 2-3 PM in FMH Room 462

Joel Shapiro will speak on: \(L^p\)-convergence of Fourier series

Abstract: Thanks to Weierstrass approximation and basic Hilbert-space theory, the Fourier series of each function in the \(L^2\) space of an interval converges to that function in the space’s metric. We’ll review this result, and then investigate the \(L^p\)-situation for other values of \(p\).
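
Here convergence means that the partial sums \(S_N f(x) = \sum_{|n| \le N} \hat f(n) e^{inx}\), where \(\hat f(n) = \frac{1}{2\pi}\int_{-\pi}^{\pi} f(t)\, e^{-int}\, dt\), satisfy
\[ \|S_N f - f\|_{L^p} \to 0 \qquad (N \to \infty); \]
this holds for every \(f \in L^p\) when \(1 < p < \infty\), but fails at the endpoints \(p = 1\) and \(p = \infty\).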

Notes for this lecture appear as \(\S\)7 of this handout


Friday, November 8, 2-3 PM in FMH Room 462

Edward G. Effros (UCLA) will speak on: Quantum Mechanics and Non-Commutative Functional Analysis

Abstract: We will review the notions of quantized real variables (unbounded self-adjoint operators) and their applications. Time permitting, we’ll discuss the Bell Inequalities.

The paper on which this talk is based is here.


Friday, November 1, 2-3 PM in FMH Room 462

Joel Shapiro will speak on: Conjugate Functions à la Riesz

Abstract: The famous Hungarian mathematician Marcel Riesz found himself (circa 1920) stumped by the problem of relating the \(L^4\)-norms of the real and imaginary parts of an arbitrary complex polynomial. One day, as he set questions for an examination, the solution magically appeared to him. What was this solution? Why did Riesz care? 

We will investigate—encountering along the way: complex analysis, PDE, Fourier series, and Banach space theory. 
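
The underlying theorem, stated here for reference, is what is now called M. Riesz’s conjugate function theorem: for each \(1 < p < \infty\) there is a constant \(C_p\) such that
\[ \|v\|_{L^p} \le C_p\, \|u\|_{L^p} \]
whenever \(u + iv\) is a polynomial with \(v(0) = 0\), the norms being taken over the unit circle.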

Notes for this talk are here.


Friday, October 18 & 25, 2-3 PM in FMH Room 462

Logan Fox will speak on: Aleksandrov Spaces

Abstract: An Aleksandrov space (also known as a generalized Riemannian space) is a complete and locally compact length space with curvature bounded above or below. Such a space retains many of the geometric properties of a Riemannian manifold, but without the dependence on the smoothness of the manifold. This talk will cover basic definitions, such as geodesics and angles in length spaces, as well as some properties of Aleksandrov spaces. Some prior knowledge of Riemannian manifolds may be helpful, but is not necessary.


Friday, October 11, 2-3 PM in FMH Room 462

Jim Rulla will speak on: Sobolev’s Embeddings II

Abstract: The Sobolev Embedding Theorems say that when \(f\) and its first partial derivatives belong to \(L^p\), then \(f\) automatically belongs to some “nicer” space, either \(L^q\) for some \(q > p\) or \(C^\lambda\), the space of (Hölder) continuous functions. This week, we’ll prove the embedding theorem into \(C^\lambda\). Armed with this embedding, we’ll see why the restriction of \(f\) to a lower-dimensional subset (think: the boundary of a domain in \(R^n\)) is well-defined, even though the lower-dimensional set has measure zero.
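
Quantitatively, the target space is given by Morrey’s inequality: for \(p > n\) and \(\lambda = 1 - \frac{n}{p}\) there is a constant \(C\), depending only on \(n\) and \(p\), with
\[ |f(x) - f(y)| \le C\, \|\nabla f\|_{L^p}\, |x - y|^{\lambda} \qquad (x, y \in R^n), \]
which is the embedding \(W^{1,p}(R^n) \hookrightarrow C^{\lambda}\).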

This week’s embedding theorem is independent of last week’s; no prerequisite is assumed. All are welcome! Notes for Jim’s talks on Sobolev embedding are here.


Friday, October 4, 2-3 PM in FMH Room 462 (Fariborz Maseeh Hall, formerly called Neuberger Hall)

Jim Rulla will speak on: Sobolev’s Embeddings

Abstract. As a rule of thumb, if \(q>p\) then functions in \(L^q(R^n)\) are nicer than functions in \(L^p(R^n)\). Thus, on this nice-ness scale, \(L^\infty\) is closer to the space of continuous functions than is \(L^1\). The Sobolev embeddings say that when \(f\) and its partial derivatives belong to \(L^p\), then \(f\) automatically belongs to the “nicer” space \(L^q\) for some \(q > p\). In one dimension, the embedding is immediate: if \(f'\) is merely in \(L^1\), then \(f\) is bounded and (absolutely) continuous. Embeddings in higher dimensions depend on the dimension in a somewhat surprising way.
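
The one-dimensional case really is one line: if \(f' \in L^1(R)\) and \(f\) vanishes at \(-\infty\), then
\[ |f(x)| = \left| \int_{-\infty}^{x} f'(t)\, dt \right| \le \|f'\|_{L^1}; \]
in higher dimensions the basic embedding takes \(W^{1,p}(R^n)\) into \(L^q(R^n)\) with the dimension-dependent exponent \(q = \frac{np}{n - p}\) for \(1 \le p < n\).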

We’ll prove the easiest and most important of the embedding theorems, using only Hölder’s inequality and the Fundamental Theorem of Calculus.

Notes for this talk are here.
