Hello, Bonjour, Sabaidee, Stravo! Welcome to the fifth week of Statistical Mechanics: Algorithms and Computations from the Physics Department of Ecole normale supérieure. For four weeks now, we have concentrated on classical statistical mechanics, and from the equiprobability principle, we have just arrived at the Boltzmann distribution. The time has come to pay a three-week visit to the world of quantum physics, the world of wave functions and of the Schrödinger equation. We will go even farther, into the world of quantum statistical mechanics, where we have at the same time the quantum wave functions and the Boltzmann distribution of thermal equilibrium. In this lecture, lecture 5, we introduce one of the basic models in quantum physics, namely a particle in a harmonic potential, described by energy levels and wave functions that we know exactly. Here is the ground-state wave function of the particle, at energy E = 1/2. The square of the wave function gives the probability for the particle to be at position x... And here is the first excited state, with energy 3/2; the second excited state, with energy 5/2; and so on, and so on... Wait a few moments to create all these states by yourself! At a given temperature, these energy levels are subject to the equiprobability principle and to the Boltzmann distribution. In the lecture, in just a few moments, we will discuss exactly how this works, and this will lead us very quickly to the density matrix and the celebrated Feynman path integral that describes the spread of the wave functions through the fluctuations of a path. We all know that at high temperature, the world is not really governed by quantum physics. The essence of our approach to quantum statistical mechanics is a certain transformation called the Trotter decomposition, which iteratively brings us from the semiclassical world at high temperature down to the full quantum world at low temperatures. How this works exactly will be explained in this week's tutorial.
We will also discuss the time evolution and program the quantum equivalent of molecular dynamics for simple quantum systems. This week's homework session will again be all about practical computing: you will take your first steps in computing wave functions and in Quantum Monte Carlo, for the harmonic oscillator. The quantum-mechanical harmonic oscillator describes a particle of mass m in a potential 1/2 m omega^2 x^2, governed by the Schrödinger equation. Let us simplify this equation by setting Planck's constant h_bar = 1, the mass of the particle m = 1, and the oscillator constant omega = 1. This is not a restriction, and Michael, Alberto, Vivien, and I will present the most important equations both with and without the constants. We arrive at the time-independent Schrödinger equation H psi = E psi, where H is the Hamilton operator, or the Hamiltonian. Its solutions, which we saw before, are the ground-state wave function of energy 1/2, the first excited state of energy 3/2, the second excited state of energy 5/2, and so on, and so on... These wave functions were produced with the program "harmonic_wavefunction.py", which implements an exact recursion relation for the Hermite polynomials. The wave functions go to zero in the limits x -> -infinity and x -> +infinity. In addition, they are normalized, which means that the integral of psi_n squared from -infinity to +infinity is equal to one. Finally, the wave functions are orthogonal. You don't have to believe me that the wave functions computed in "harmonic_wavefunction.py" actually solve the Schrödinger equation. You can check this for yourself. To do so, let's rewrite the Schrödinger equation as H psi / psi = E, and let's write a little program "harmonic_wavefunctions_check.py" with a discrete approximation for the second derivative. Sure enough, for the ground-state wave function psi_0 we find H psi_0 / psi_0 = 1/2 for all x. And for the first excited state, we find H psi_1 / psi_1 = 3/2 everywhere.
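The recursion and the check described above can be sketched in a few lines of Python. This is a minimal sketch, not the course programs themselves; the function names harmonic_wavefunctions and energy_check are ours. With h_bar = m = omega = 1, the recursion psi_n = sqrt(2/n) x psi_{n-1} - sqrt((n-1)/n) psi_{n-2} builds the normalized wave functions:

```python
import math

def harmonic_wavefunctions(x, n_max):
    """Normalized harmonic-oscillator wave functions psi_0(x), ..., psi_n_max(x)
    (units h_bar = m = omega = 1), built from the Hermite-polynomial recursion."""
    psi = [math.exp(-x ** 2 / 2.0) / math.pi ** 0.25]       # psi_0
    if n_max >= 1:
        psi.append(math.sqrt(2.0) * x * psi[0])             # psi_1
    for n in range(2, n_max + 1):
        psi.append(math.sqrt(2.0 / n) * x * psi[n - 1]
                   - math.sqrt((n - 1.0) / n) * psi[n - 2])
    return psi

def energy_check(n, x, dx=1.0e-3):
    """Discrete version of (H psi_n) / psi_n, which should equal E_n = n + 1/2.
    The second derivative is approximated by (psi(x+dx) - 2 psi(x) + psi(x-dx)) / dx^2."""
    pm = harmonic_wavefunctions(x - dx, n)[n]
    p0 = harmonic_wavefunctions(x, n)[n]
    pp = harmonic_wavefunctions(x + dx, n)[n]
    second = (pp - 2.0 * p0 + pm) / dx ** 2
    return (-0.5 * second + 0.5 * x ** 2 * p0) / p0
```

Away from the zeros of psi_n, energy_check(n, x) should give n + 1/2 for any x, which is the check performed by harmonic_wavefunctions_check.py.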
Now let us move right away into quantum statistical mechanics. Quantum means that for a particle in the state n, the probability to be at the position x is given by |psi_n(x)|^2. But we are also doing statistical mechanics, and the probability to be in the state n is given by exp(-E_n / k_B T), or exp(-beta E_n), where beta = 1/(k_B T). Now let's put the two pieces together, and we find that the probability to be in state n and at position x is proportional to exp(-beta E_n) * |psi_n(x)|^2. Before plunging into this subject, please take a moment to download, run, and modify the two programs we discussed in this section. On the Coursera website, you will find the program harmonic_wavefunction.py, which implements the recursion for the Hermite polynomials. There is also the nice program harmonic_wavefunctions_check.py, which checks that the Schrödinger equation is solved, that the wave functions are normalized, and that they are orthogonal. As we discussed a few moments ago, the probability to be in state n and at position x is proportional to e^(-beta E_n) psi_n(x) psi_n*(x). In this equation, the asterisk refers to the complex conjugate. In this lecture, and in this week's homework, the wave functions are real-valued, so psi* = psi, but in this week's tutorial we will have to take into account complex wave functions, so we had better use the correct formulas from the beginning. Notice that in this equation, we have two different types of probabilities. We have the thermal probability of the Boltzmann distribution, and the quantum-mechanical probability of the wave functions: two completely separate worlds meet in this equation. However, the energy levels and wave functions cannot normally be computed, and this expression leads nowhere, even for simple problems!
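For the harmonic oscillator, where the levels and wave functions are known, the combined probability can actually be summed numerically. Here is a sketch (the function name pi_quant is ours) that truncates the sum over n and compares well against the known closed form sqrt(tanh(beta/2)/pi) exp(-x^2 tanh(beta/2)) for this potential:

```python
import math

def pi_quant(x, beta, n_max=40):
    """pi(x) = sum_n e^{-beta E_n} |psi_n(x)|^2 / Z for the harmonic oscillator
    (h_bar = m = omega = 1, E_n = n + 1/2), with the sum truncated at n_max."""
    # harmonic-oscillator wave functions from the Hermite recursion
    psi_0 = math.exp(-x ** 2 / 2.0) / math.pi ** 0.25
    psi = [psi_0, math.sqrt(2.0) * x * psi_0]
    for n in range(2, n_max + 1):
        psi.append(math.sqrt(2.0 / n) * x * psi[n - 1]
                   - math.sqrt((n - 1.0) / n) * psi[n - 2])
    weights = [math.exp(-beta * (n + 0.5)) for n in range(n_max + 1)]
    Z = sum(weights)                 # truncated partition function
    return sum(w * p ** 2 for w, p in zip(weights, psi)) / Z
```

At beta of order one, the Boltzmann weights decay so fast that a few dozen levels suffice; but remember that this route requires the E_n and psi_n, which is exactly what we will not have for a general potential.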
To make progress, we discard the information about the individual energy levels and consider what is called the (diagonal) density matrix: the probability to be at x is proportional to rho(x, x, beta) = Σ_n e^(-beta E_n) psi_n(x) psi_n*(x). We also consider a more general object, the non-diagonal density matrix, rho(x, x', beta) = Σ_n psi_n(x) e^(-beta E_n) psi_n*(x'). This is the central object of quantum statistical mechanics. For example, the partition function Z(beta) is given by the trace of the density matrix, Z(beta) = ∫ dx rho(x, x, beta). As discussed in previous weeks, the partition function Z is the sum of the Boltzmann weights π_n; but here the n are no longer positions in space, but energy levels. We next discuss the three fundamental properties of the density matrix. First of all, each density matrix possesses the convolution property. This means that the integral over x' of rho(x, x', beta_1) * rho(x', x'', beta_2) can be written as an integral over x' of a double sum over n and m. The integral and the double sum can be exchanged, and the orthogonality property that we just discussed then allows us to write the result as Σ_n psi_n(x) e^(-(beta_1 + beta_2) E_n) psi_n*(x''): in other words, the density matrix rho(x, x'', beta_1 + beta_2). In this exact equation, let us set beta_1 equal to beta_2. We find that the integral over x' of rho(x, x', beta) * rho(x', x'', beta) is equal to the density matrix rho(x, x'', 2 beta). Now realize that beta = 1/temperature. So in this equation we compute the density matrix at 2 beta, that means at low temperature, through a product of density matrices at high temperature. If we know the density matrix at high temperature, we can use this equation to compute it at twice lower temperature. Then we can use it again to compute it at 4 times lower temperature, 8 times lower temperature, and so on, and so on... until we reach the full quantum regime. The second property concerns the free density matrix, the density matrix of a free particle (V = 0): rho_free(x, x', beta) = e^(-(x - x')^2 / (2 beta)) / sqrt(2 pi beta) (with h_bar = m = 1).
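On a computer, the convolution property can be checked directly by putting rho on a grid and replacing the integral over x' by a Riemann sum. This is a sketch with our own names rho_free and matrix_square, using the standard free-particle density matrix as a test case, since convolving it with itself must reproduce the free density matrix at 2 beta:

```python
import math

def rho_free(x, xp, beta):
    """Density matrix of a free particle (h_bar = m = 1)."""
    return (math.exp(-(x - xp) ** 2 / (2.0 * beta))
            / math.sqrt(2.0 * math.pi * beta))

def matrix_square(rho, dx):
    """Convolution rho(x, x'', 2 beta) = int dx' rho(x, x', beta) rho(x', x'', beta),
    with the integral over x' replaced by a Riemann sum on the grid (step dx)."""
    K = len(rho)
    return [[dx * sum(rho[i][k] * rho[k][j] for k in range(K))
             for j in range(K)] for i in range(K)]
```

The grid must extend far enough that rho is negligible at its edges, otherwise the Riemann sum misses weight.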
We will derive this equation at the beginning of this week's tutorial, making sure that you understand the role of the non-diagonal elements of this density matrix, and we will illustrate it with nice pictures of the entire density matrix at high temperature, and at lower and lower temperatures. As beta becomes larger, the variance of the Gaussian becomes larger, and the system becomes more and more quantum. Finally, the third property of the density matrix concerns the high-temperature limit. For a Hamiltonian H = H_free + V, the density matrix at small beta (high temperature) is given by rho(x, x', beta) = e^(-beta/2 V(x)) * rho_free(x, x', beta) * e^(-beta/2 V(x')). So you see, at high temperature, the correction to the free density matrix is given by a simple Boltzmann factor e^(-beta V) split in half between x and x'. Notice that through this expression, we have an explicit formula for the density matrix rho(x, x', beta), for any potential, without solving the Schrödinger equation. So this is the density matrix at high temperature (small beta). Now let us invoke the convolution property: from this density matrix at inverse temperature beta, let's compute it at inverse temperature 2 beta, 4 beta, 8 beta, 16 beta, and so on... until we reach the full quantum regime. Please take a moment to download and to run this program as written, for the harmonic oscillator. You can then modify it for other potentials by just changing an exponential factor, not by solving a new Schrödinger equation. We will pursue this great story further in this week's homework session. In matrix squaring, the subject of the last section, we convolved two density matrices at temperature T to obtain a new density matrix at temperature T/2. By iterating this process, we could go to lower and lower temperatures, starting from the high-temperature quasi-classical limit. Normally, however, we cannot do this matrix squaring analytically.
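The two properties combine into a short self-contained script. This is our own minimal sketch (not the course program itself, and all variable names are ours): start from the Trotter-type formula at small beta for V(x) = x^2/2, then square the matrix repeatedly until beta = 4:

```python
import math

# grid on [-L, L] with K points
L, K = 5.0, 101
dx = 2.0 * L / (K - 1)
grid = [-L + k * dx for k in range(K)]

beta_final = 4.0
n_squarings = 6
beta = beta_final / 2 ** n_squarings      # high-temperature starting point

# Trotter-type density matrix at small beta, for V(x) = x^2 / 2:
# rho ~ e^{-beta V(x)/2} rho_free(x, x', beta) e^{-beta V(x')/2}
rho = [[math.exp(-0.25 * beta * x ** 2)
        * math.exp(-(x - xp) ** 2 / (2.0 * beta)) / math.sqrt(2.0 * math.pi * beta)
        * math.exp(-0.25 * beta * xp ** 2)
        for xp in grid] for x in grid]

# matrix squaring: each step takes beta -> 2 beta (temperature halves)
for _ in range(n_squarings):
    rho = [[dx * sum(rho[i][k] * rho[k][j] for k in range(K))
            for j in range(K)] for i in range(K)]
    beta *= 2.0

Z = dx * sum(rho[i][i] for i in range(K))       # partition function (trace)
pi_of_x = [rho[i][i] / Z for i in range(K)]     # probability to be at grid[i]
```

For other potentials, only the two exponential factors in the construction of rho change, exactly as said above; for the harmonic oscillator, pi_of_x can be compared with the known exact result sqrt(tanh(beta/2)/pi) e^(-x^2 tanh(beta/2)).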
For a large number of particles, we would soon run out of space to store a reasonable discretized approximation of rho(x, x', beta) on the computer, so we cannot do the matrix squaring numerically here either. We now see how the Feynman path integral overcomes this problem, how it leads to the use of Monte Carlo methods and to the idea of path sampling. Instead of evaluating the convolution integrals one after the other, as we did in matrix squaring, let us write them out all together. So we write the density matrix rho(x, x', beta) = integral dx'' rho(x, x'', beta/2) rho(x'', x', beta/2). Each of the density matrices at beta/2 can in turn be written as an integral over two density matrices at inverse temperature beta/4. This gives an integral over dx'', dx''', dx'''' of four density matrices at beta/4. Now, each of the density matrices at beta/4 can again be written as a product of two density matrices at beta/8, and this would lead us to multiple integrals over dx''''', dx'''''', dx''''''' and dx''''''''. The idea we are pursuing is great, but we are having a notational nightmare... Let us write {x0, x1, x2, x3, ...} instead of the cumbersome {x, x', x'', x''', ...}. This gives the density matrix... [formula on screen] For the partition function, which is the trace of the density matrix, as we discussed before, we find that... [formula on screen] The sequence x0, x1, ..., xN in these integrals is called a "path", and we can imagine the variable x_k to sit at position tau_k = k beta/N of an imaginary-time variable tau that goes from 0 to beta in little steps of Delta tau = beta/N. Density matrices and partition functions can thus be expressed as multiple integrals over path variables, so-called path integrals. In Markov-chain Monte Carlo, we can move from one path configuration to the next by choosing one position x_k and making a little displacement delta x that can be positive or negative.
We compute the weight of the path after the move and before the move, and accept the move with the Metropolis acceptance probability. Note that we can also move x0, which sits between x1 and x(N-1), so that the path can move as a whole. Configurations of a Markov-chain simulation for the harmonic oscillator are shown here. The histogram of the x-positions in this simulation gives the probability of the particle to be at position x, in other words the diagonal density matrix rho(x, x, beta). In Python this gives the program naive_harmonic_path.py, which I ask you to download and run from the Coursera website. You will modify this program in this week's homework, where you will do your own Markov-chain Monte Carlo simulation of a quantum system: a path-integral Monte Carlo simulation. In conclusion, in this session of Statistical Mechanics: Algorithms and Computations, we have plunged into the world of quantum physics and quantum statistical mechanics. What I have shown you, the case of the harmonic oscillator, can be greatly generalized, as we will see in the coming weeks. The solution of the Schrödinger equation needs a new technique for each potential, whereas the Feynman path integral is more general. It is for this reason that it is so famous. It naturally leads to the idea of Monte Carlo simulations, and to path-integral Monte Carlo algorithms. As we are becoming great experts in Monte Carlo simulation, this approach is of course just right for us... So as I said, the story will continue to unfold in homeworks, lectures, and tutorials, and I hope to keep up your interest in this fascinating subject. Finally, let me thank you for your attention, and see you again in further sessions of this lecture course.
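The single-bead Metropolis move just described can be sketched as follows. This is a minimal sketch in the spirit of naive_harmonic_path.py, not that program itself; the names rho_free and path_mc and the parameter choices are ours:

```python
import math, random

def rho_free(x, xp, beta):
    """Free-particle density matrix (h_bar = m = 1)."""
    return (math.exp(-(x - xp) ** 2 / (2.0 * beta))
            / math.sqrt(2.0 * math.pi * beta))

def path_mc(beta, N, n_steps, delta):
    """Naive path-integral Monte Carlo for the harmonic oscillator V(x) = x^2/2:
    pick a random slice k, propose x_k -> x_k + uniform(-delta, delta), and
    accept with the Metropolis probability.  Because of the trace, the path is
    periodic, so k = 0 (neighbors x_1 and x_{N-1}) moves like any other slice."""
    dtau = beta / N
    x = [0.0] * N                    # initial path
    samples = []
    for step in range(n_steps):
        k = random.randint(0, N - 1)
        k_prev, k_next = (k - 1) % N, (k + 1) % N
        x_new = x[k] + random.uniform(-delta, delta)
        # only the two adjacent free-density-matrix links and one
        # potential factor e^{-dtau V(x_k)} change in the path weight
        old_w = (rho_free(x[k_prev], x[k], dtau) * rho_free(x[k], x[k_next], dtau)
                 * math.exp(-0.5 * dtau * x[k] ** 2))
        new_w = (rho_free(x[k_prev], x_new, dtau) * rho_free(x_new, x[k_next], dtau)
                 * math.exp(-0.5 * dtau * x_new ** 2))
        if random.uniform(0.0, 1.0) < new_w / old_w:
            x[k] = x_new
        samples.append(x[0])         # histogram of x_0 -> rho(x, x, beta) / Z
    return samples
```

The histogram of the returned samples approximates the diagonal density matrix discussed above, up to the discretization error of the finite number of slices N.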