VSOP19, Quy Nhon 3-18/08/2013

Ngo Van Thanh, Institute of Physics, Hanoi, Vietnam.

Part I. Basics
  I.1. Introduction
  I.2. Thermodynamics and statistical mechanics
  I.3. Phase transition
  I.4. Probability theory

Part II. Monte Carlo Simulation Methods
  II.1. The spin models
  II.2. Boundary conditions
  II.3. Simple sampling Monte Carlo methods
  II.4. Importance sampling Monte Carlo methods

Part III. Finite size effects and Reweighting methods
  III.1. Finite size effects
  III.2. Single histogram method
  III.3. Multiple histogram method
  III.4. Wang-Landau method
  III.5. The applications

References:

A Guide to Monte Carlo Simulations in Statistical Physics, D. Landau and K. Binder (Cambridge University Press, 2009).

Monte Carlo Simulation in Statistical Physics: An Introduction, K. Binder and D. W. Heermann (Springer-Verlag Berlin Heidelberg, 2010).

Understanding Molecular Simulation: From Algorithms to Applications, D. Frenkel (Academic Press, 2002).

Frustrated Spin Systems, H. T. Diep, 2nd Ed. (World Scientific, 2013).

Lecture notes (PDF files): http://iop.vast.ac.vn/~nvthanh/cours/vsop/
Example code: http://iop.vast.ac.vn/~nvthanh/cours/vsop/code/

I.1. Introduction

Revolution of science:

The old division of physics into "experimental" and "theoretical" branches is no longer complete. Computer simulation has become a third branch, standing alongside experiment and theory in the study of nature.

(Diagram: Nature connected to Experiment, Theory and Simulation.)

Computer simulation:

Developed during and after the Second World War as a tool to exploit electronic computing machines, originally for the development of nuclear weapons and for code breaking.
Molecular dynamics and Monte Carlo simulation methods followed in the 1950s.
First review of the use of Monte Carlo simulations: Metropolis and Ulam (1949).
A simulation performs an experiment on the computer, or tests a theory, using the complete Hamiltonian without any approximative techniques.

I.2. Thermodynamics and statistical mechanics

Basic notations:

Partition function
    Z = \sum_{\{s\}} e^{-\mathcal{H}(s)/k_B T}    (1.1)
where the summation is taken over all the possible states s of the system; \mathcal{H} is the Hamiltonian, k_B the Boltzmann constant, and T the temperature.

Probability of finding the system in state s
    P_s = e^{-\mathcal{H}(s)/k_B T} / Z

Free energy
    F = -k_B T \ln Z    (1.2)

Internal energy
    U = \langle \mathcal{H} \rangle = -\partial \ln Z / \partial \beta, \quad \beta = 1/k_B T    (1.3)

Entropy
    S = -\partial F / \partial T    (1.4)

Free energy differences can be obtained from ratios of the corresponding partition functions.

Fluctuations

The probability that the system has energy E
    P(E) = W(E)\, e^{-E/k_B T} / Z    (1.5)
with W(E) the number of states with energy E. The average energy
    \langle E \rangle = \sum_E E\, P(E)    (1.6)

Specific heat
    C = \partial U / \partial T = \left( \langle E^2 \rangle - \langle E \rangle^2 \right) / (k_B T^2)    (1.7)

Magnetization
    m = (1/N) \sum_i \langle \sigma_i \rangle    (1.8)

Susceptibility
    \chi = N \left( \langle m^2 \rangle - \langle m \rangle^2 \right) / (k_B T)    (1.9)

Consider the NVT ensemble. For a system in a pure phase the fluctuations are related to response functions: the energy fluctuations give the specific heat,
    \langle E^2 \rangle - \langle E \rangle^2 = k_B T^2 C_V    (1.10)
and the magnetization fluctuations give the susceptibility,
    \langle M^2 \rangle - \langle M \rangle^2 = k_B T \chi    (1.11)
The entropy S and the pressure p are the conjugate variables of T and V, with analogous fluctuation relations (1.12)-(1.14).
Fluctuations of extensive variables (like S) scale with the volume; fluctuations of intensive variables (like p) scale with the inverse volume.
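To make Eqs. (1.1)-(1.9) concrete, here is a minimal sketch (not taken from the lecture's example code; the coupling J, ring size N and temperature T are arbitrary choices) that evaluates Z, F, the internal energy, and the fluctuation expressions for the specific heat and susceptibility by brute-force enumeration of a small one-dimensional Ising ring:

```python
# Minimal sketch: brute-force check of Eqs. (1.1)-(1.9) for a tiny 1D Ising ring,
# H = -J * sum_i s_i s_{i+1}, with s_i = +/-1. All parameter values are illustrative.
import itertools
import math

J = 1.0          # coupling (assumed)
N = 8            # number of spins, small enough to enumerate all 2^N states
kB = 1.0         # Boltzmann constant in reduced units
T = 2.0          # temperature

def energy(spins):
    """H = -J * sum over nearest-neighbour bonds on a ring."""
    return -J * sum(spins[i] * spins[(i + 1) % N] for i in range(N))

beta = 1.0 / (kB * T)
Z = 0.0
E_sum = E2_sum = M_sum = M2_sum = 0.0
for spins in itertools.product([-1, 1], repeat=N):
    E = energy(spins)
    M = sum(spins)
    w = math.exp(-beta * E)                  # Boltzmann weight, Eq. (1.1)
    Z += w
    E_sum += E * w
    E2_sum += E * E * w
    M_sum += abs(M) * w                      # <|M|> is used since <M> = 0 by symmetry
    M2_sum += M * M * w

E_avg = E_sum / Z                                      # internal energy <E>
C = (E2_sum / Z - E_avg**2) / (kB * T**2)              # specific heat from fluctuations
chi = (M2_sum / Z - (M_sum / Z)**2) / (kB * T)         # susceptibility from fluctuations
F = -kB * T * math.log(Z)                              # free energy, Eq. (1.2)

print(f"Z = {Z:.4f}, F = {F:.4f}, <E>/N = {E_avg/N:.4f}, C/N = {C/N:.4f}, chi/N = {chi/N:.4f}")
```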

I.3. Phase transition

A phase transition is the transformation of a thermodynamic system from one phase (or state of matter) to another.

(Example: solid, liquid and gas phases, connected by freezing, melting and the related transformations.)

Order parameter: the order parameter is a quantity which is zero in one phase and non-zero in the other.
    Ferromagnet: spontaneous magnetization.
    Liquid-gas: difference in the density.
    Liquid crystals: degree of orientational order.
An order parameter may be a scalar quantity or a multicomponent quantity.

Correlation function

Two-point correlation function, space-dependent:
    \Gamma(r) = \langle \rho(0)\,\rho(r) \rangle    (1.15)
where \rho is the quantity whose correlation is being measured and r is the spatial distance.

Time correlation function between two quantities A and B:
    \phi_{AB}(t) = \langle A(0)\,B(t) \rangle - \langle A \rangle \langle B \rangle    (1.16)

If A = B, this is the autocorrelation function of A:
    \phi_{AA}(t) = \langle A(0)\,A(t) \rangle - \langle A \rangle^2    (1.17)

Defining the fluctuation in the quantity A, \delta A(t) = A(t) - \langle A \rangle, we can write
    \phi_{AA}(t) = \langle \delta A(0)\, \delta A(t) \rangle    (1.18)
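A minimal sketch of how the autocorrelation function of Eqs. (1.17)-(1.18) could be estimated from a measured time series; the AR(1) test series and its parameters are illustrative assumptions, not the lecture's data:

```python
# Minimal sketch: estimating the normalized time autocorrelation
# phi_AA(t) = <A(0)A(t)> - <A>^2 from a data series.
import numpy as np

def autocorrelation(a, t_max):
    """Normalized autocorrelation of the series a for lags 0..t_max-1."""
    a = np.asarray(a, dtype=float)
    da = a - a.mean()                      # fluctuations delta A = A - <A>
    var = np.mean(da * da)
    return np.array([np.mean(da[:len(a) - t] * da[t:]) / var
                     for t in range(t_max)])

# usage with a synthetic correlated series (AR(1) process, for illustration only)
rng = np.random.default_rng(0)
x = np.zeros(10000)
for i in range(1, len(x)):
    x[i] = 0.9 * x[i - 1] + rng.normal()
print(autocorrelation(x, 5))               # should decay roughly like 0.9**t
```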

First order & second order transitions

Consider a system which is in thermal equilibrium and undergoes a phase transition between a disordered state and an ordered one.

First order transition:
    The first derivatives of the free energy are discontinuous at T_C.
    The internal energy is discontinuous at T_C (latent heat), and the magnetization is discontinuous at T_C.
    Metastable states exist on either side of the transition.
    Double peak in the energy histogram.

(Figures: F and U versus T, showing the discontinuity, the latent heat and the metastable branches at T_C; energy histogram of the Ising antiferromagnetic model on the FCC lattice with N = 24, showing the double peak.)

Second order transition:
    The first derivatives of the free energy are continuous at the critical temperature T_C.
    Internal energy and magnetization are continuous; they only change curvature at the critical temperature T_C (critical point).

(Figures: F and U versus T for the Ising ferromagnetic model on the FCC lattice with N = 24.)
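As an illustration of the double-peak criterion, the following sketch histograms a set of measured energies; the `energies` array here is synthetic (two Gaussians standing in for the ordered and disordered phases), not actual FCC Ising data:

```python
# Minimal sketch: histogramming energies per spin to look for the double peak
# that signals a first-order transition. In a real study the energies would come
# from a Monte Carlo run near T_C; here they are synthetic, for illustration only.
import numpy as np

rng = np.random.default_rng(1)
energies = np.concatenate([rng.normal(-1.9, 0.05, 5000),   # "ordered" peak
                           rng.normal(-1.5, 0.05, 5000)])  # "disordered" peak

hist, edges = np.histogram(energies, bins=40)
centers = 0.5 * (edges[:-1] + edges[1:])
for c, h in zip(centers, hist):
    print(f"{c:8.3f}  {'#' * int(h // 40)}")   # crude text plot: two peaks should be visible
```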

Phase diagrams

A phase diagram is a type of chart used to show conditions at which thermodynamically distinct phases can occur at equilibrium.

(Figure: simplified pressure-temperature phase diagram for water, with the solid, liquid and vapor regions separated by the sublimation, melting and vaporization curves, which meet at the triple point; the vaporization curve ends at the critical point.)

Critical behavior and exponents

Critical exponents describe the behaviour of physical quantities near the phase transition. Define the reduced distance from the critical temperature
    \varepsilon = |1 - T/T_C|    (1.19)
For a magnet, the asymptotic expressions are valid only as \varepsilon \to 0:
    Specific heat      C \sim \varepsilon^{-\alpha}    (1.20)
    Magnetization      m \sim \varepsilon^{\beta}      (1.21)
    Susceptibility     \chi \sim \varepsilon^{-\gamma} (1.22)
where \alpha, \beta, \gamma are the "critical exponents". At \varepsilon = 0 (T = T_C), for a ferromagnet in a field,
    m \sim H^{1/\delta}    (1.23)
The correlation length diverges as
    \xi \sim \varepsilon^{-\nu}    (1.24)
For a system in d dimensions, above the critical temperature the two-body correlation function has the Ornstein-Zernike form
    \Gamma(r) \propto r^{-(d-1)/2}\, e^{-r/\xi}, \quad r \to \infty    (1.25)
At T_C it decays as a power law,
    \Gamma(r) \propto r^{-(d-2+\eta)}    (1.26)

Two-dimensional Ising square lattice: \alpha = 0, i.e. a logarithmic divergence of the specific heat.
Two-dimensional XY-models: the correlation length diverges exponentially,
    \xi \propto \exp(a\,\varepsilon^{-\nu})    (1.27)
which is the signature of the Kosterlitz and Thouless phase transition.
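A hedged sketch of how a critical exponent could be extracted from simulation data near T_C: fit log \chi against log \varepsilon, Eq. (1.22). The data below are synthetic, generated with \gamma = 1.75 (the exact 2D Ising value) purely for illustration:

```python
# Minimal sketch: extracting an exponent from the power law chi ~ eps^(-gamma)
# by a straight-line fit of log(chi) versus log(eps). Synthetic data only.
import numpy as np

eps = np.logspace(-3, -1, 10)          # reduced temperatures |1 - T/Tc|
chi = 0.5 * eps**(-1.75) * (1 + 0.05 * np.random.default_rng(2).normal(size=eps.size))

slope, intercept = np.polyfit(np.log(eps), np.log(chi), 1)
print("estimated gamma =", -slope)      # should be close to 1.75
```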

Universality and scaling

Consider a simple Ising ferromagnet in a small magnetic field H,
    \mathcal{H} = -J \sum_{\langle i,j \rangle} \sigma_i \sigma_j - H \sum_i \sigma_i    (1.28)
When T is near the critical point, the free energy takes the scaling form
    F(T, H) = |\varepsilon|^{2-\alpha}\, \mathcal{F}^{o}\!\left( H/|\varepsilon|^{\Delta} \right)
where \Delta is the gap exponent and H/|\varepsilon|^{\Delta} is the "scaled" variable.
The correlation function is written in scaling form as
    \Gamma(r, \xi, H) = r^{-(d-2+\eta)}\, \mathcal{G}\!\left( r/\xi,\ H/|\varepsilon|^{\Delta} \right)    (1.29)
The critical exponents obey the Rushbrooke equality
    \alpha + 2\beta + \gamma = 2    (1.30)
and the "hyperscaling" expression involving the lattice dimensionality d,
    d\nu = 2 - \alpha    (1.31)
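As a quick consistency check (an added example, using the exactly known 2D Ising exponents \alpha = 0, \beta = 1/8, \gamma = 7/4, \nu = 1 and d = 2), both relations are satisfied:
    \alpha + 2\beta + \gamma = 0 + \tfrac{1}{4} + \tfrac{7}{4} = 2
    d\nu = 2 \times 1 = 2 = 2 - \alpha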

Landau theory

The free energy of a d-dimensional system near a phase transition is written as an expansion in the order parameter m(x), a form that is very useful for comparison to the simulations:
    F = \int d^d x \left[ \frac{r}{2} m^2 + \frac{u}{4} m^4 + \frac{v}{6} m^6 - \frac{H}{k_B T} m + \frac{R^2}{2d} (\nabla m)^2 \right]    (1.32)
where r, u, v are dimensionless coefficients and R is the interaction range of the model.

For the case of a homogeneous system (m(x) = m, H = 0),
    F = V \left[ \frac{r}{2} m^2 + \frac{u}{4} m^4 + \frac{v}{6} m^6 \right]    (1.33)
In equilibrium the free energy must be a minimum,
    \partial F / \partial m = 0    (1.34)

If u > 0, the m^6 term can be neglected. Solve the equation
    r m + u m^3 = 0    (1.35)
we have the solutions m_1 = 0 and m_2^2 = -r/u. Expanding r in the vicinity of T_C, r = r' (T - T_C)/T_C:
For T > T_C, m_1 = 0 corresponds to the solution above T_C (disordered phase).
For T < T_C,
    m = (r'/u)^{1/2} (1 - T/T_C)^{1/2}    (1.36)
so Landau theory predicts the critical exponent \beta = 1/2.

If u < 0 (with v > 0), solve the equation
    r m + u m^3 + v m^5 = 0    (1.37)
we have the ordered solutions from the quadratic equation in m^2, r + u m^2 + v m^4 = 0 (1.38)-(1.40).
If u < 0: the transition is first order.
A tricritical point appears when u = 0, with its own set of tricritical exponents.
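A minimal numerical sketch (the coefficients r', u and the chosen temperatures are arbitrary assumptions) that minimizes the homogeneous Landau free energy on a grid and reproduces the m \propto (1 - T/T_C)^{1/2} behaviour of Eq. (1.36):

```python
# Minimal sketch: numerical minimization of f(m) = (r/2) m^2 + (u/4) m^4
# with r = r'*(T - Tc)/Tc and u > 0, checking m_min against sqrt(-r/u).
import numpy as np

r_prime, u, Tc = 1.0, 1.0, 1.0          # coefficients and Tc (assumed values)
m_grid = np.linspace(-2, 2, 4001)       # order-parameter grid

for T in [0.99, 0.96, 0.91, 0.84]:
    r = r_prime * (T - Tc) / Tc
    f = 0.5 * r * m_grid**2 + 0.25 * u * m_grid**4
    m_min = abs(m_grid[np.argmin(f)])   # location of the free-energy minimum
    print(f"T = {T:.2f}:  m = {m_min:.3f},  sqrt(-r/u) = {np.sqrt(-r/u):.3f}")
```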

I.4. Probability theory

Basic notions:

Consider an elementary event with a countable set of random outcomes \lambda_1, \lambda_2, \ldots Suppose this event occurs repeatedly N times, and we count how often the outcome \lambda_i is observed, N_i.

Define the probability for the outcome \lambda_i:
    p_i = \lim_{N \to \infty} N_i / N    (1.41)
    p_i = 0 : the outcome never occurs;
    p_i = 1 : it is certain to occur.
From its definition, we conclude that
    0 \le p_i \le 1, \qquad \sum_i p_i = 1    (1.42)

\lambda_i and \lambda_j are "mutually exclusive" events if the occurrence of \lambda_i implies that \lambda_j does not occur and vice versa; then
    p(\lambda_i\ \mathrm{and}\ \lambda_j) = 0    (1.43)

Consider two events: one with outcomes \lambda_i and probabilities p_i, the second with outcomes \mu_j and probabilities q_j. Define the outcome (\lambda_i, \mu_j) and the joint probabilities p_{ij}.
If the events are independent,
    p_{ij} = p_i\, q_j    (1.44)
If they are not independent, one defines the conditional probability that \mu_j occurs, given that \lambda_i occurs,
    p(\mu_j \mid \lambda_i) = p_{ij} / p_i    (1.45)

For the outcome of random events taking real values x_i with probabilities p_i, define the expectation value of a random variable
    \langle x \rangle = \sum_i p_i x_i    (1.46)
The expectation value of a real function g(x):
    \langle g(x) \rangle = \sum_i p_i\, g(x_i)    (1.47)
For two functions g_1, g_2, the expectation value of a linear combination is linear,
    \langle a\, g_1(x) + b\, g_2(x) \rangle = a \langle g_1(x) \rangle + b \langle g_2(x) \rangle

Define the nth moment as
    \langle x^n \rangle = \sum_i p_i x_i^n    (1.48)
and the cumulants
    \langle (x - \langle x \rangle)^n \rangle    (1.49)
For n = 2, the cumulant is called the "variance":
    \langle (x - \langle x \rangle)^2 \rangle = \langle x^2 \rangle - \langle x \rangle^2    (1.50)

For two random variables x and y,
    \langle x y \rangle = \sum_{ij} p_{ij}\, x_i y_j    (1.51)
If x and y are independent,
    \langle x y \rangle = \langle x \rangle \langle y \rangle    (1.52)
The covariance of x and y,
    \mathrm{cov}(x, y) = \langle (x - \langle x \rangle)(y - \langle y \rangle) \rangle = \langle x y \rangle - \langle x \rangle \langle y \rangle    (1.53)
serves as a measure of the degree of independence of the two random variables.
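A minimal sketch estimating the mean, variance and covariance of Eqs. (1.46)-(1.53) from samples; the two correlated Gaussian variables are constructed artificially for illustration:

```python
# Minimal sketch: sample estimates of moments, the variance and the covariance.
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=100000)
y = 0.6 * x + 0.8 * rng.normal(size=100000)   # correlated with x by construction

mean_x = x.mean()                              # <x>
second_moment = np.mean(x**2)                  # <x^2>
variance = second_moment - mean_x**2           # <x^2> - <x>^2
covariance = np.mean(x * y) - x.mean() * y.mean()   # <xy> - <x><y>
print(f"<x> = {mean_x:.3f}, var(x) = {variance:.3f}, cov(x,y) = {covariance:.3f}")
# for independent variables the covariance would vanish; here it is ~0.6 by construction
```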

Special probability distributions and the central limit theorem

Consider a very large number of events where the two outcomes are mutually exclusive and exhaustive, with probabilities
    p \quad \mathrm{and} \quad q = 1 - p    (1.54)
Suppose that N independent samples of these events occur; each outcome x_i is either 0 or 1, and we denote the sum of these outcomes
    X = \sum_{i=1}^{N} x_i    (1.55)

The binomial distribution: the probability that n of the x_i are 1 and N - n are 0 is
    P_N(n) = \binom{N}{n} p^n (1 - p)^{N-n}    (1.56)
with
    \langle X \rangle = N p, \qquad \langle X^2 \rangle - \langle X \rangle^2 = N p (1 - p)    (1.57)

Suppose we have two outcomes (1, 0) of an experiment: if the outcome is 0, the experiment is repeated, otherwise we stop. The random variable of interest is the number n of experiments until we get the outcome 1. The geometrical distribution:
    p_n = p\,(1 - p)^{n-1}    (1.58)

In the case that the probability of success is very small, we have the Poisson distribution
    p_n = \frac{\lambda^n}{n!}\, e^{-\lambda}    (1.59)

Gaussian distribution
    P(x) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left( -\frac{(x - \bar{x})^2}{2\sigma^2} \right)    (1.60)
This is an approximation to the binomial distribution in the case of a very large number of possible outcomes and a very large number of samples.
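A minimal sketch (sample sizes and parameters are arbitrary choices) checking the Poisson and Gaussian limits of the binomial distribution by direct sampling:

```python
# Minimal sketch: sampling the binomial distribution and comparing it with its
# Poisson (small p) and Gaussian (large N) limits, Eqs. (1.56)-(1.60).
import numpy as np

rng = np.random.default_rng(4)

# Poisson limit: N large, p small, lambda = N*p fixed
N, p = 10000, 0.0005
binom = rng.binomial(N, p, size=200000)
poisson = rng.poisson(N * p, size=200000)
print("binomial mean/var :", binom.mean(), binom.var())
print("poisson  mean/var :", poisson.mean(), poisson.var())   # both close to N*p = 5

# Gaussian limit: N large, p fixed
N, p = 10000, 0.3
binom = rng.binomial(N, p, size=200000)
print("mean, std            :", binom.mean(), binom.std())
print("N*p, sqrt(N*p*(1-p)) :", N * p, np.sqrt(N * p * (1 - p)))  # Gaussian parameters
```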

Statistical errors

Suppose the quantity A is distributed according to a Gaussian with mean value \langle A \rangle and width \sigma. Consider n statistically independent observations \{A_i\} of this quantity A. An unbiased estimator of the mean of this distribution is
    \bar{A} = \frac{1}{n} \sum_{i=1}^{n} A_i    (1.61)
and the standard error of this estimate is
    \mathrm{error} = \sigma / \sqrt{n}    (1.62)

Consider the deviations \delta A_i = A_i - \bar{A}; the mean square deviation is
    \overline{(\delta A)^2} = \frac{1}{n} \sum_{i=1}^{n} (A_i - \bar{A})^2    (1.63)
Using the relation
    \langle \bar{A}^2 \rangle - \langle \bar{A} \rangle^2 = \frac{1}{n} \left( \langle A^2 \rangle - \langle A \rangle^2 \right)    (1.64)
we have, for simple sampling Monte Carlo, the computation of errors of averages
    (\mathrm{error})^2 = \frac{\langle A^2 \rangle - \langle A \rangle^2}{n}    (1.65)
For importance sampling Monte Carlo the measurements are correlated, and
    (\mathrm{error})^2 = \frac{\langle A^2 \rangle - \langle A \rangle^2}{n} \left( 1 + 2\tau_A / \delta t \right)    (1.66)
where \tau_A is the correlation time and \delta t the time interval between subsequently generated states.
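A minimal sketch of the error estimates of Eqs. (1.65)-(1.66); the correlated series is a synthetic AR(1) process and the integrated-autocorrelation estimate of \tau is a crude illustration, not the lecture's prescription:

```python
# Minimal sketch: naive standard error (independent data) versus the corrected
# error (1 + 2*tau/delta_t) for correlated data, with delta_t = 1.
import numpy as np

rng = np.random.default_rng(5)
a = np.zeros(100000)                            # synthetic correlated measurements
for i in range(1, a.size):
    a[i] = 0.9 * a[i - 1] + rng.normal()

n = a.size
var = a.var()
err_naive = np.sqrt(var / n)                    # Eq. (1.65): assumes independent samples

# crude integrated autocorrelation time tau (time interval delta_t = 1)
da = a - a.mean()
rho = [np.mean(da[:n - t] * da[t:]) / var for t in range(1, 200)]
tau = sum(r for r in rho if r > 0)              # sum of positive autocorrelations
err_corr = err_naive * np.sqrt(1 + 2 * tau)     # Eq. (1.66)
print(f"naive error = {err_naive:.4f}, tau ~ {tau:.1f}, corrected error = {err_corr:.4f}")
```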

Markov chains and master equations

The concept of Markov chains is central to Monte Carlo simulations.

Define a stochastic process at discrete times labeled consecutively t_1, t_2, t_3, \ldots, for a system with a finite set of possible states S_1, S_2, S_3, \ldots. Denote by X_t the state the system is in at time t, and consider the conditional probability of finding the state S_{i_n} at time t_n, given the complete previous history,
    P(X_{t_n} = S_{i_n} \mid X_{t_{n-1}} = S_{i_{n-1}}, \ldots, X_{t_1} = S_{i_1})    (1.67)

Markov process: this conditional probability is independent of all states but the immediate predecessor,
    P(X_{t_n} = S_{i_n} \mid X_{t_{n-1}} = S_{i_{n-1}})    (1.68)

Markov chain: the corresponding sequence of states \{X_t\}. The transition probability to move from state i to state j is
    W_{ij} = W(S_i \to S_j) = P(X_{t_n} = S_j \mid X_{t_{n-1}} = S_i)    (1.69)
and we require that
    W_{ij} \ge 0, \qquad \sum_j W_{ij} = 1
The total probability that at time t_n the system is in state S_j is
    P(X_{t_n} = S_j) = \sum_i P(X_{t_{n-1}} = S_i)\, W_{ij}    (1.70)

Master equation: the change of the total probability between two successive times is
    P(S_j, t_{n+1}) - P(S_j, t_n) = \sum_i \left[ W_{i \to j} P(S_i, t_n) - W_{j \to i} P(S_j, t_n) \right]    (1.71)
Treating time as a continuous variable, we rewrite this as
    \frac{\partial P(S_j, t)}{\partial t} = \sum_i \left[ W_{i \to j} P(S_i, t) - W_{j \to i} P(S_j, t) \right]    (1.72)
This equation describes the balance of gain and loss processes. Since the probabilities of the events are "mutually exclusive", the total probability for a move away from the state j is
    \sum_i W_{j \to i} = 1    (1.73)

The master equation with the equilibrium probability P_{eq}(S_j) has a stationary solution,
    \frac{\partial P_{eq}(S_j, t)}{\partial t} = 0    (1.74)
which is guaranteed by the detailed balance condition with the equilibrium probability,
    W_{j \to i}\, P_{eq}(S_j) = W_{i \to j}\, P_{eq}(S_i)    (1.75)
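A minimal sketch (the two-level system and its energy gap are assumed purely for illustration) of a Markov chain whose transition probabilities satisfy the detailed balance condition (1.75), iterating the discrete-time evolution (1.70) until P(t) relaxes to the equilibrium distribution:

```python
# Minimal sketch: a two-state Markov chain with Metropolis-like transition
# probabilities. Detailed balance holds with Boltzmann weights, and repeated
# application of W drives any initial P towards the equilibrium distribution.
import numpy as np

beta, dE = 1.0, 1.0                          # inverse temperature, energy gap (assumed)
w01 = np.exp(-beta * dE)                     # uphill move 0 -> 1 accepted with exp(-beta*dE)
W = np.array([[1.0 - w01, w01],              # W[i, j]: transition probability i -> j
              [1.0,       0.0]])             # downhill move always accepted; rows sum to 1

P_eq = np.array([1.0, np.exp(-beta * dE)])
P_eq /= P_eq.sum()                           # Boltzmann equilibrium distribution

# detailed balance check: P_eq[0] * W[0,1] == P_eq[1] * W[1,0]
print("detailed balance:", np.isclose(P_eq[0] * W[0, 1], P_eq[1] * W[1, 0]))

P = np.array([1.0, 0.0])                     # start entirely in state 0
for t in range(20):                          # iterate P(t+1) = P(t) W, Eq. (1.70)
    P = P @ W
print("P(t=20) :", P)
print("P_eq    :", P_eq)                     # P(t) relaxes to the equilibrium values
```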