<p>Leonid Petrov. Integrable Probability (posts feed, https://lpetrov.cc/posts/feed.xml, generated 2019-09-17).</p>
<h2><a href="https://lpetrov.cc/2019/09/TASEP">Mapping TASEP back in time</a> (2019-09-09)</h2>
<p>We obtain a new relation between the distributions $\mu_t$ at different times $t\ge 0$ of the continuous-time TASEP (Totally Asymmetric Simple Exclusion Process) started from the step initial configuration. Namely, we present a continuous-time Markov process with local interactions and particle-dependent rates which maps the TASEP distributions $\mu_t$ backwards in time. Under the backwards process, particles jump to the left, and the dynamics can be viewed as a version of the discrete-space Hammersley process. Combined with the forward TASEP evolution, this leads to a stationary Markov dynamics preserving $\mu_t$, which in turn brings new identities for expectations with respect to $\mu_t$. Based on a <a href="https://lpetrov.cc/2019/07/backwards_TASEP/">joint work with Axel Saenz</a>.</p>
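<p>For readers who want to experiment, here is a minimal continuous-time simulation of TASEP from the step initial configuration (a Gillespie-style sketch; the helper name <code>tasep_step_ic</code> is ours, not from the paper). Each particle attempts to jump one site to the right at rate 1, and the jump is allowed only when the target site is empty:</p>

```python
import random

def tasep_step_ic(num_particles, t, seed=0):
    """Simulate continuous-time TASEP up to time t, started from the
    step initial configuration (particles at sites 0, -1, -2, ...).
    Each particle jumps right at rate 1 if the site ahead is empty."""
    rng = random.Random(seed)
    # pos[k] = position of the k-th particle counted from the right
    pos = [-k for k in range(num_particles)]
    time = 0.0
    while True:
        # particles currently free to jump (exclusion constraint);
        # the rightmost particle (k = 0) is always free
        free = [k for k in range(num_particles)
                if k == 0 or pos[k] + 1 < pos[k - 1]]
        # next jump happens after an Exp(total rate) waiting time
        time += rng.expovariate(len(free))
        if time > t:
            return pos
        pos[rng.choice(free)] += 1
```

<p>The exclusion rule guarantees that the particle ordering is preserved for all times, so the returned positions are always strictly decreasing.</p>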
<p>Note that the original keynote presentation contained videos which are not included in the PDF download.</p>
<!--more-->
<p><a href="https://storage.lpetrov.cc/research_files/talks/TASEP_back_Osaka.pdf" target="_blank">PDF (13.5 MB)</a></p>
<h2><a href="https://lpetrov.cc/2019/08/AMS-UVA">Special Session on Integrable Probability 2020</a> (2019-08-06)</h2>
<div><a href="http://www.ams.org/meetings/sectional/2273_program_ss27.html">Special Session on Integrable Probability at the 2020 AMS Spring Southeastern Sectional Meeting at University of Virginia, March 13-15, 2020</a></div>
<h2><a href="https://lpetrov.cc/2019/07/rmt-announce">Course notes on random matrices</a> (2019-07-29)</h2>
<div>MATH 8380 Random Matrices (Fall 2019) • <a href="https://rmt-fall2019.s3.amazonaws.com/rmt-fall2019.pdf">PDF course notes</a> • <a href="https://lpetrov.cc/rmt19/">Course page</a></div>
<h2><a href="https://lpetrov.cc/rmt">MATH 8380 • Random Matrices</a> (2019-07-28)</h2>
<h3 
id="course-notes--found-typo-or-mistake-let-me-know"><a href="https://rmt-fall2019.s3.amazonaws.com/rmt-fall2019.pdf">Course notes</a> • Found a typo or mistake? Let me know!</h3>
<div><object data="https://rmt-fall2019.s3.amazonaws.com/up.txt" style="height:30px"></object></div>
<ul>
<li>Chapter 1. Some history</li>
<li>Chapter 2. Gaussian Unitary Ensemble and Dyson Brownian Motion</li>
<li>Interlude. Markov chains and stochastic differential equations</li>
<li>Chapter 3. Two ways to derive the GUE eigenvalue distribution</li>
<li>Chapter 4. Wigner Semicircle Law</li>
<li>Chapter 5. Free convolution</li>
<li>Chapter 6. Representation-theoretic discrete analogue of the GUE</li>
</ul>
<!--more-->
<hr />
<h3 id="syllabus">Syllabus</h3>
<p><strong>Instructor.</strong> Leonid Petrov. Contact information is at <a href="https://lpetrov.cc"><code class="highlighter-rouge">https://lpetrov.cc</code></a></p>
<p>The class meets on Tuesdays and Thursdays at 9:30-10:45 in Kerchof 128.</p>
<p>Office hours: Tuesdays and Thursdays, 11:30-1 (or just drop in at any time). Office: Kerchof 209.</p>
<p><strong>Description.</strong> The study of random matrices is an exciting topic whose first major advances came in the mid-20th century in connection with statistical (quantum) physics. Since then it has found numerous connections to algebra, geometry, and combinatorics, as well as to the core of probability theory. The applications are also numerous (e.g., statistics, number theory, engineering, neuroscience), with more of them discovered every month. The course will discuss fundamental problems and results of Random Matrix Theory, and their connections to tools of algebra and combinatorics.</p>
<p><strong>Course homepage.</strong> The course homepage is at <a href="https://lpetrov.cc/rmt19/"><code class="highlighter-rouge">https://lpetrov.cc/rmt19/</code></a>. It contains
the syllabus, link to course notes, and other relevant information.</p>
<p><strong>Structure.</strong> The course discusses:</p>
<ol>
<li>Limit shape results for random matrices (such as Wigner’s Semicircle Law). Connections to Free Probability.</li>
<li>Concrete ensembles of random matrices (GUE, circular, and Beta ensembles). Bulk and edge asymptotics via exact computations. Connection to determinantal point processes.</li>
<li>Unitary invariant Hermitian matrices. Interlacing arrays of reals
and their boundary.</li>
<li>Dynamics on matrices and spectra. Dyson’s Brownian Motion.</li>
<li>Universality of random matrix asymptotics.</li>
<li>(optional) Discrete analogues of random matrix models: random permutations, random tilings, interacting particle systems.</li>
<li>(optional) Applications to machine learning, neural networks.</li>
</ol>
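<p>Topics 2 and 4 in the list above can be previewed numerically. The sketch below (our own illustration, not part of the course materials) samples the eigenvalues of a GUE matrix with NumPy, normalized so that the empirical spectral distribution approaches Wigner's semicircle law on $[-2,2]$:</p>

```python
import numpy as np

def gue_eigenvalues(n, seed=0):
    """Sample the eigenvalues of an n x n GUE matrix, scaled by 1/sqrt(n)
    so that the empirical spectral distribution converges to the
    semicircle law supported on [-2, 2]."""
    rng = np.random.default_rng(seed)
    # complex Ginibre matrix: iid entries with independent N(0,1)
    # real and imaginary parts
    a = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    # Hermitian part; off-diagonal entries have E|H_ij|^2 = 1
    h = (a + a.conj().T) / 2
    return np.linalg.eigvalsh(h) / np.sqrt(n)
```

<p>Plotting a histogram of <code>gue_eigenvalues(2000)</code> against the density $\frac{1}{2\pi}\sqrt{4-x^2}$ gives a quick visual check of the Semicircle Law from Chapter 4.</p>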
<p><strong>References.</strong> There are several textbooks which I will consult while teaching the course. It is not required to buy any of them to successfully participate in the course.</p>
<ol>
<li>Mehta, M.L. <em>“Random Matrices”</em>.</li>
<li>Anderson, G.W., Guionnet, A. and Zeitouni, O. <em>“An Introduction to Random Matrices”</em>.</li>
<li>Pastur, L. and Shcherbina, M. <em>“Eigenvalue Distribution of Large Random Matrices”</em>.</li>
<li>Tao, T. <em>“Topics in random matrix theory”</em>.</li>
</ol>
<p>Course notes will be posted on this website, and updated regularly.
Direct download link is <a href="https://rmt-fall2019.s3.amazonaws.com/rmt-fall2019.pdf"><code class="highlighter-rouge">https://rmt-fall2019.s3.amazonaws.com/rmt-fall2019.pdf</code></a></p>
<p><strong>Grading.</strong>
The course grade is based on homework and class engagement
(participation in in-class discussions; asking questions in class
and at office hours; volunteering to type up homework solutions;
possibly volunteering to give short expository talks detailing
an aspect of the course; etc.).
There is no midterm or final exam.</p>
<p>The homework will be assigned in the course notes (look for
green background). The deadline for each problem is 2.5 or 3 weeks
after the corresponding lecture, which means:</p>
<ul>
<li>if a problem relates to a lecture on Tuesday of week $n$, then it is
due on Thursday of week $n+3$ (but no later than the official
final exam date for the course);</li>
<li>if a problem relates to a lecture on Thursday of week $n$, then it is still
due on Thursday of week $n+3$ (but no later than the official
final exam date for the course).</li>
</ul>
<p>The level of homework problems ranges from easy to very difficult.
It is understood that you won’t turn in all problems all the time,
but putting adequate effort into solving homework problems and
communicating your solutions clearly is of paramount importance
for your learning.</p>
<p>Homework can be submitted either by email (scan or typeset, and send; this is the
preferred method) or turned in in class (in which case please still
scan the homework to keep a copy).</p>
<hr />
<p><sub><strong>Required official statement.</strong> All students with special needs requiring accommodations should present the appropriate paperwork from the Student Disability Access Center (SDAC). It is the student’s responsibility to present this paperwork in a timely fashion and follow up with the instructor about the accommodations being offered. Accommodations for test-taking (e.g., extended time) should be arranged at least 5 business days before an exam.</sub></p>
<h2><a href="https://lpetrov.cc/2019/07/backwards_TASEP">Mapping TASEP back in time</a> (2019-07-04)</h2>
<p>We obtain a new relation between the distributions $\mu_t$ at different times $t\ge 0$ of the continuous-time TASEP (Totally Asymmetric Simple Exclusion Process) started from the step initial configuration. Namely, we present a continuous-time Markov process with local interactions and particle-dependent rates which maps the TASEP distributions $\mu_t$ backwards in time. Under the backwards process, particles jump to the left, and the dynamics can be viewed as a version of the discrete-space Hammersley process. Combined with the forward TASEP evolution, this leads to a stationary Markov dynamics preserving $\mu_t$, which in turn brings new identities for expectations with respect to $\mu_t$.</p>
<p>The construction of the backwards dynamics is based on Markov maps interchanging parameters of Schur processes, and is motivated by bijectivizations of the Yang-Baxter equation. We also present a number of corollaries, extensions, and open questions arising from our constructions.</p>
<h2><a href="https://lpetrov.cc/2019/05/travel-2020">2020 travel</a> (2019-05-20)</h2>
<h5 id="january">January</h5>
<p>15-18
•
Denver, CO
•
<a href="http://jointmathematicsmeetings.org/meetings/national/jmm2020/2245_intro">AMS Joint Mathematics Meeting</a></p>
<h5 id="february">February</h5>
<p>3-7
•
Los Angeles, CA
•
<a href="http://www.ipam.ucla.edu/aac2020">Workshop on Asymptotic Algebraic Combinatorics at IPAM</a></p>
<!-- ##### March -->
<!-- ##### April -->
<!-- ##### May -->
<!-- ##### June -->
<h5 id="july">July</h5>
<p>27-7
•
Oxford, UK
•
<a href="https://www.claymath.org/events/cmi-himr-integrable-probability-summer-school">CMI-HIMR Integrable Probability Summer School</a></p>
<h5 id="august">August</h5>
<p>17-21
•
Seoul, South Korea
•
<a href="http://wc2020.org/index.php">World Congress in Probability and Statistics</a></p>
<!-- ##### September -->
<!-- ##### October -->
<!-- ##### November -->
<!-- ##### December -->
<h2><a href="https://lpetrov.cc/2019/05/CP16-erratum">Erratum to “Stochastic higher spin vertex models on the line”</a> (2019-05-19)</h2>
<p>(<a href="https://storage.lpetrov.cc/research_files/Petrov-publ/1502_erratum.pdf">PDF</a>)</p>
<p>This is an erratum to the paper <a href="https://lpetrov.cc/2015/02/stoch-higher/">“Stochastic higher spin vertex models on the line”</a>. The aim of the note is to address two separate
errors in the paper: finite vertical spin Plancherel identities, and a false duality claim. The other main statements of the paper (the definition of new stochastic particle systems,
duality relations for them, and contour integral observables)
are not affected.</p>
<h2><a href="https://lpetrov.cc/2019/05/BMP_YB">Yang-Baxter random fields and stochastic vertex models</a> (2019-05-15)</h2>
<p>Bijectivization refines the Yang-Baxter equation into a pair of local Markov moves which randomly update the configuration of the vertex model. Employing this approach, we introduce new Yang-Baxter random fields of Young diagrams based on spin $q$-Whittaker and spin Hall-Littlewood symmetric functions. We match certain scalar Markovian marginals of these fields with (1) the stochastic six vertex model; (2) the stochastic higher spin six vertex model; and (3) a new vertex model with pushing which generalizes the $q$-Hahn PushTASEP introduced recently by Corwin-Matveev-Petrov (2018). Our matchings include models with two-sided stationary initial data, and we obtain Fredholm determinantal expressions for the $q$-Laplace transforms of the height functions of all these models. Moreover, we also discover difference operators acting diagonally on spin $q$-Whittaker or (stable) spin Hall-Littlewood symmetric functions.</p>
<h2><a href="https://lpetrov.cc/2019/04/q-vol-simulations">Simulations of the q reversal in q-vol lozenge tilings of the hexagon</a> (2019-04-30)</h2>
<div><a href="https://lpetrov.cc/2019/04/q-vol-simulations/">Simulations of the $q$ parameter reversal in $q^{volume}$ lozenge tilings of the hexagon</a></div>
<!--more-->
<ul>
<li>
<h2 id="1---simulations-of-a-new-left-jumping-dynamics"><a href="https://lpetrov.cc/simulations/2019-04-30-qvol/">1</a> - Simulations of a new left-jumping dynamics</h2>
</li>
<li>
<h2 id="2---a-naive-mirroring-dynamics"><a href="https://lpetrov.cc/simulations/2019-05-02-qvol-mirroring/">2</a> - A naive mirroring dynamics</h2>
</li>
</ul>
<h2><a href="https://lpetrov.cc/GLnq_6V">From infinite random matrices over finite fields to square ice</a> (2019-03-12)</h2>
<p>Asymptotic representation theory of symmetric groups is a rich and beautiful subject with deep connections to probability, mathematical physics, and algebraic combinatorics. A one-parameter deformation of this theory, related to infinite random matrices over a finite field, leads to a randomization of the classical Robinson-Schensted correspondence between words and Young tableaux. Exploring such randomizations, we find unexpected applications to six vertex (square ice) type models and traffic systems on a one-dimensional lattice.</p>
<!--more-->
<p><a href="https://storage.lpetrov.cc/research_files/talks/GLnqSquareIce.pdf" target="_blank">PDF (32 MB)</a></p>
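<p>The classical (deterministic) Robinson-Schensted row insertion mentioned above can be sketched in a few lines; this is an illustrative implementation, not the randomized version from the talk. With this bumping rule, the first row of the resulting shape has length equal to the longest weakly increasing subsequence of the word:</p>

```python
def rsk_insert(tableau, x):
    """Row-insert x into a tableau (list of weakly increasing rows):
    in each row, bump the leftmost entry strictly greater than x."""
    for row in tableau:
        for i, y in enumerate(row):
            if y > x:
                row[i], x = x, y  # bump y into the next row
                break
        else:
            row.append(x)  # x is >= everything in this row
            return tableau
    tableau.append([x])  # bumped entry starts a new row
    return tableau

def rsk_shape(word):
    """Shape (row lengths) of the insertion tableau of a word."""
    tableau = []
    for x in word:
        rsk_insert(tableau, x)
    return [len(row) for row in tableau]
```

<p>For example, inserting the word $(3,1,2)$ gives the shape $(2,1)$: the letter $1$ bumps $3$ into a new row, and then $2$ appends to the first row.</p>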