
This Week in Mathematics


Current Week [Sep 22, 2019 - Sep 28, 2019]
Entries for this week: 9
Tuesday September 24, 2019

Topology and Geometry Seminar
Arithmetic hyperbolic manifolds in low dimensions II
    - Sam Ballas, FSU
Time: 3:35 Room: 201
Abstract/Desc: In this talk I will discuss how we can extend the techniques from last time to produce examples of compact hyperbolic manifolds in all dimensions.

Wednesday September 25, 2019

Departmental Tea Time
C is for cookie, and shorthand for C[0,1] w/the sup norm
Time: 3:00 Room: 204 LOV

Biomathematics Seminar
TBA
    - Angie Davenport, FSU Math
Time: 3:35 Room: LOV 200

Thursday September 26, 2019

Algebra Seminar
    - Michael Niemeier, FSU
Time: 3:35pm Room: 104 Love

Geometry, Topology and Data
What is a zigzag module with continuous index?
    - Haibin Hang, FSU
Time: 3:35 pm Room: 106 LOV
Abstract/Desc: Persistence modules and zigzag modules are basic objects of study in topological data analysis. In this work, we generalize and unify these objects in the framework of correspondence modules, which use partial linear relations between vector spaces as a replacement for linear maps. We show that a decomposition theorem still holds in this more general framework. This leads to formulations of persistent homology that enable us to analyze data using barcodes containing richer information.
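
A rough sketch of the objects named in the abstract, in my own notation and assuming the standard definitions (not necessarily the speaker's conventions): a zigzag module is a diagram of vector spaces

    V_1 \leftrightarrow V_2 \leftrightarrow \cdots \leftrightarrow V_n

in which each arrow is a linear map pointing either forward (V_i \to V_{i+1}) or backward (V_i \leftarrow V_{i+1}); a persistence module is the special case with all maps pointing forward. A correspondence between vector spaces V and W is a linear subspace C \subseteq V \oplus W, i.e. a partial linear relation generalizing the graph {(v, f(v))} of a linear map f: V \to W; a correspondence module replaces each map in the diagram above by such a relation.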

Financial Mathematics
Goodness-of-fit testing of copulas using quasi-Monte Carlo methods
    - Yiran Chen, FSU Mathematics
Time: 3:35 Room: 201
Abstract/Desc: Simulations of copulas can be done by Monte Carlo methods or quasi-Monte Carlo methods. Goodness-of-fit tests can be used to find the best simulation algorithms for copulas. In this talk, I will introduce a new goodness-of-fit test based on the collision test and low-discrepancy sequences, and present numerical results comparing its efficiency with some current tests.
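
As background for the simulation side of the talk (a minimal sketch, not the speaker's new test), the Python snippet below draws from a 2-d Gaussian copula with an illustrative correlation of my choosing, either from pseudo-random uniforms (Monte Carlo) or from SciPy's scrambled Sobol low-discrepancy sequence (quasi-Monte Carlo).

    import numpy as np
    from scipy.stats import norm, qmc

    # Illustrative Gaussian copula with correlation 0.5 (parameters are mine,
    # not taken from the talk).
    rho = 0.5
    L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))

    def gaussian_copula_sample(n, engine=None, seed=0):
        """Draw n points from the 2-d Gaussian copula; engine=None means
        plain Monte Carlo, otherwise a QMC engine such as SciPy's Sobol."""
        if engine is None:
            u = np.random.default_rng(seed).random((n, 2))  # pseudo-random uniforms
        else:
            u = engine.random(n)                            # low-discrepancy points
        z = norm.ppf(u) @ L.T      # correlated standard normals
        return norm.cdf(z)         # back to [0,1]^2: a copula sample

    mc_pts  = gaussian_copula_sample(1024)
    qmc_pts = gaussian_copula_sample(1024, engine=qmc.Sobol(d=2, scramble=True, seed=0))

A goodness-of-fit test would then compare such samples against the target copula; the collision-test statistic introduced in the talk is not reproduced here.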

Friday September 27, 2019

Colloquium Tea
Time: 3:00 pm Room: 204 LOV

Mathematics Colloquium
A solution to a local version of the fifth Busemann-Petty Problem.
    - Dmitri Ryabogin, Kent State University
Time: 3:35 Room: Love 101
Abstract/Desc: In 1956, Busemann and Petty posed a series of questions about symmetric convex bodies, of which only the first one has been solved. Their fifth problem asks the following. Let K be an origin-symmetric convex body in n-dimensional Euclidean space, and let H_x be the hyperplane through the origin orthogonal to a unit direction x. Consider a hyperplane G parallel to H_x and supporting K, and let C(K,x) = vol(K ∩ H_x) dist(0, G). If there exists a constant C such that C(K,x) = C for all directions x, does it follow that K is an ellipsoid? We give an affirmative answer to this problem for bodies sufficiently close to the Euclidean ball in the Banach-Mazur distance. This is joint work with Maria Alfonseca, Fedor Nazarov and Vlad Yaskin.
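
Rewriting the quantity from the abstract in display form (my formatting; identifying dist(0, G) with the support function is a standard rephrasing, not a claim about the proof):

    C(K, x) = vol_{n-1}(K \cap H_x) \cdot h_K(x),    x \in S^{n-1},

where H_x = x^\perp and h_K(x) = dist(0, G) is the support function of K in the direction x. The fifth Busemann-Petty problem asks whether C(K, x) being constant in x forces K to be an ellipsoid.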

Special Colloquium
Momentum Acceleration Under Random Gradient Noise
    - Mert Gurbuzbalaban, Rutgers University
Time: 1:25pm Room: LOV 102
Abstract/Desc: For many large-scale optimization and machine learning problems, first-order methods and their accelerated variants based on momentum have been a leading approach for computing low-to-medium accuracy solutions because of their cheap iterations and mild dependence on the problem dimension and data size. Even though the momentum-based accelerated gradient (AG) methods proposed by Nesterov for convex optimization converge provably faster than gradient descent (GD) in the absence of noise, the comparison is no longer clear in the presence of gradient noise. In this talk, we focus on GD and AG methods for minimizing strongly convex functions when the gradient has random errors in the form of additive i.i.d. noise. We study the trade-offs between convergence rate and robustness to gradient errors in designing a first-order algorithm. Our results show that AG can achieve acceleration while being more robust to random gradient errors. Our framework also leads to practical optimal algorithms that can perform better than other state-of-the-art methods in the presence of random gradient noise. This is joint work with Serhat Aybat, Alireza Fallah and Asu Ozdaglar.
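
As a toy illustration of the setting described above (a minimal sketch under my own assumptions, not the speakers' framework or results), the Python snippet below runs gradient descent and Nesterov's accelerated gradient on a strongly convex quadratic whose gradient is corrupted by additive i.i.d. Gaussian noise, using the textbook step size 1/L and momentum (sqrt(L) - sqrt(mu)) / (sqrt(L) + sqrt(mu)).

    import numpy as np

    rng = np.random.default_rng(0)
    d = 50
    A = np.diag(np.linspace(1.0, 100.0, d))   # eigenvalues span [mu, L] = [1, 100]
    mu, Lip = 1.0, 100.0
    sigma = 0.1                               # gradient-noise level (illustrative)

    def noisy_grad(x):
        return A @ x + sigma * rng.standard_normal(d)

    def run(method, iters=500):
        x = np.ones(d); y = x.copy()
        step = 1.0 / Lip
        beta = (np.sqrt(Lip) - np.sqrt(mu)) / (np.sqrt(Lip) + np.sqrt(mu))  # momentum
        for _ in range(iters):
            if method == "gd":
                x = x - step * noisy_grad(x)
            else:  # Nesterov AG for strongly convex objectives
                x_new = y - step * noisy_grad(y)
                y = x_new + beta * (x_new - x)
                x = x_new
        return 0.5 * x @ A @ x                # suboptimality f(x) - f(x*), since x* = 0

    print("final suboptimality  GD: %.3e   AG: %.3e" % (run("gd"), run("ag")))

Varying sigma in this toy setup gives a hands-on feel for the rate-versus-robustness comparison the abstract refers to.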


Problems? Email webmaster@math.fsu.edu.