# Demo - Normalized Cross-Correlation

In this lesson we show how to use Normalized Cross-Correlation (NCC) to find similarities between two images. We also demonstrate that implementing the NCC with convolution in the frequency domain speeds up the computation of similarities.

# NCC implementation

The NCC is a measure of similarity between images. Let $f$ be a 2D image and $h$ a smaller 2D image (the template). The goal is to find the point in $f$ that best matches $h$. The NCC at a point $(x, y)$ is given by the following equation:

$$\mathrm{NCC}(x,y) = \frac{\sum_{(u,v)} \left[f(x+u,\,y+v) - \bar{f}(x,y)\right]\left[h(u,v) - \bar{h}\right]}{\sqrt{\sum_{(u,v)} \left[f(x+u,\,y+v) - \bar{f}(x,y)\right]^{2}}}$$

where $\bar{h}$ is the mean of $h$, $\bar{f}(x,y) = \frac{1}{N}\sum_{(u,v)} f(x+u,\,y+v)$ is the mean of $f$ over a neighborhood with the shape of $h$, and $N$ is the size of $h$. The constant factor $\sqrt{\sum_{(u,v)}[h(u,v)-\bar{h}]^{2}}$, which appears in the textbook denominator, is omitted here since it does not change the location of the maximum.
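As a sanity check of the equation, the following pure-NumPy snippet evaluates the formula directly at a single point by brute force (the helper name `ncc_at` is ours, for illustration only, not the convolution-based implementation shown later). When the window of $f$ equals the template exactly, the score reduces to the norm of $h$ minus its mean:

```python
import numpy as np

def ncc_at(f, h, x, y):
    """Evaluate the NCC formula directly at one point.

    (x, y) is the top-left corner of the window of f compared with h;
    this is a brute-force check, not an efficient implementation."""
    win = f[x:x + h.shape[0], y:y + h.shape[1]].astype(float)
    fk = win - win.mean()             # window minus its local mean
    hk = h.astype(float) - h.mean()   # template minus its mean
    return (fk * hk).sum() / np.sqrt((fk ** 2).sum())

# Toy example: the window at (2, 1) is exactly the template,
# so the score equals the norm of (h - mean(h)).
f = np.arange(36, dtype=float).reshape(6, 6)
h = f[2:5, 1:4].copy()
score = ncc_at(f, h, 2, 1)
```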

The following implementation of NCC uses periodic convolution (`iapconv`) to calculate the mean of `f` over a neighborhood of each point with the same shape as `h`.
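The `iapconv` helper comes from the ia636 toolbox. As a rough pure-NumPy sketch of what this local mean looks like, the following computes a wrap-around (periodic) neighborhood mean with `np.roll`; the function name and the odd-sized-kernel assumption are ours, not part of the toolbox:

```python
import numpy as np

def periodic_local_mean(f, shape):
    """Mean of f over a wrap-around neighborhood of the given shape,
    centered at each pixel (sketch of convolving with ones(shape)/size)."""
    mh, nh = shape
    acc = np.zeros_like(f, dtype=float)
    # Accumulate the periodically shifted copies covering the neighborhood
    for du in range(-(mh // 2), mh - mh // 2):
        for dv in range(-(nh // 2), nh - nh // 2):
            acc += np.roll(np.roll(f, du, axis=0), dv, axis=1)
    return acc / (mh * nh)

f = np.arange(16, dtype=float).reshape(4, 4)
m = periodic_local_mean(f, (3, 3))
```

Because the kernel is normalized and the borders wrap around, the global mean of the result equals the global mean of the input.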

```
from numpy import *
from ia636 import *

def NCC(f, h):
    f = asarray(f)
    h = asarray(h)

    # Uniform kernels: one normalized (for local means), one of ones (for local sums)
    neigh_h  = ones(h.shape)/h.size
    neigh_h1 = ones(h.shape)

    fM = iapconv(f, neigh_h)   # local mean of f around each pixel
    hM = h.mean()              # global mean of h

    fk = f - fM
    hk = h - hM
    hk = hk[::-1, ::-1]        # flip h so that convolution computes correlation

    return iapconv(fk, hk)/sqrt(iapconv(fk**2, neigh_h1))
```

## Using NCC to find the best matching point

Let's apply the NCC function we just implemented. Consider an image `f` and a template `h` cut from `f`. Applying our NCC function produces a correlation map whose point of maximum value is where `h` best matches `f`.

```
f = adread('cameraman.pgm')
h = f[25:40+25, 90:90+40]

adshow(h, title='2D image h - a cutting of f')
g = NCC(f, h)
peak = unravel_index(argmax(g), g.shape)
adshow(ianormalize(g), title='Correlation map - peak at %s' % (str(peak)))
h1 = zeros(f.shape)
h1[:h.shape[0], :h.shape[1]] = h
h1 = iaptrans(h1, array(peak) - array(h.shape)/2)
adshow(ianormalize((f + h1*2.)/3), title='Image f mixed with h in the point of maximum correlation')
```

# Faster NCC implementation using convolution in the frequency domain

Now we replace the spatial-domain convolution with a frequency-domain one using the fast Fourier transform: by the convolution theorem, periodic convolution corresponds to pointwise multiplication of the discrete Fourier transforms.
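As a minimal sketch of the convolution theorem this relies on, the following compares a direct circular convolution with the FFT product for small 1D signals (pure NumPy, independent of the ia636 helpers):

```python
import numpy as np

def circular_conv(a, b):
    """Direct circular (periodic) convolution of two equal-length 1D signals."""
    n = len(a)
    return np.array([sum(a[k] * b[(i - k) % n] for k in range(n))
                     for i in range(n)])

a = np.array([1.0, 2.0, 3.0, 4.0])
b = np.array([0.5, 0.0, -1.0, 2.0])

direct = circular_conv(a, b)
# Convolution theorem: circular convolution <-> pointwise product of DFTs
via_fft = np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)).real
```

The two results agree to floating-point precision; the 2D case used below works the same way with `fft2`/`ifft2`.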

```
def iafreqpconv(f, h):
    # Periodic convolution computed in the frequency domain
    f = asarray(f)
    h = asarray(h)

    mh, nh = h.shape
    filter = zeros(f.shape, dtype=float)
    filter[:mh, :nh] = h                      # zero-pad h to the shape of f
    h = iaptrans(filter, -array(h.shape)/2)   # shift the kernel center to the origin

    F = fft.fft2(f)
    H = fft.fft2(h)

    return fft.ifft2(F*H).real

def fNCC(f, h):
    f = asarray(f)
    h = asarray(h)

    neigh_h  = ones(h.shape)/h.size
    neigh_h1 = ones(h.shape)

    fM = iafreqpconv(f, neigh_h)
    hM = h.mean()

    fk = f - fM
    hk = h - hM
    hk = hk[::-1, ::-1]

    return iafreqpconv(fk, hk)/sqrt(iafreqpconv(fk**2, neigh_h1))
```

# Comparing performance

Here we run our NCC and fNCC functions with templates of several sizes. The performance gap widens as the template grows: the cost of spatial-domain convolution is proportional to the template area, while the FFT-based version depends essentially only on the image size. We can also see that it is hard to find the point of maximum correlation when the template is too small; for the 3x3 template below, the correlation map even contains NaN values, which makes `adshow` fail. Overall, the implementation using the fast Fourier transform clearly outperforms convolution in the space domain.

```
import time

h = f[30:30+3, 95:95+3]

adshow(h, title='Template with size 3x3')

timeBefore = time.time()
g = NCC(f, h)
timeAfter = time.time()
adshow(ianormalize(g), title='Convolution at space domain - Time %f seconds' % (timeAfter-timeBefore))

timeBefore = time.time()
g = fNCC(f, h)
timeAfter = time.time()
adshow(ianormalize(g), title='Convolution at frequency domain - Time %f seconds' % (timeAfter-timeBefore))
```
```
------------------------------------------------------------
*** Exception while evaluating code:
File "<string>", line 14, in <module>
File "aws_python.py", line 195, in adshow
if (int(fmin) < 0) or (int(fmax) > 255):
ValueError: cannot convert float NaN to integer
------------------------------------------------------------
```
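The exception above is caused by the NCC denominator: wherever the local patch of `f` is constant, the sum of squared deviations is zero, the division produces NaN, and `ianormalize`/`adshow` then fail. A small pure-NumPy sketch of the effect, together with a common epsilon guard (the guard is our suggestion, not part of the original code):

```python
import numpy as np

win = np.full((3, 3), 7.0)            # a flat 3x3 patch of f
fk = win - win.mean()                 # all zeros
denom = np.sqrt((fk ** 2).sum())      # 0.0, so the NCC ratio is 0/0

with np.errstate(invalid='ignore'):
    score = (fk * fk).sum() / denom   # 0/0 -> nan

# Common guard: add a small epsilon (or mask pixels with zero local variance)
eps = 1e-12
safe = (fk * fk).sum() / (denom + eps)  # 0.0 instead of nan
```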
```
h = f[30:30+9, 95:95+9]

adshow(h, title='Template with size 9x9')

timeBefore = time.time()
g = NCC(f, h)
timeAfter = time.time()
adshow(ianormalize(g), title='Convolution at space domain - Time %f seconds' % (timeAfter-timeBefore))

timeBefore = time.time()
g = fNCC(f, h)
timeAfter = time.time()
adshow(ianormalize(g), title='Convolution at frequency domain - Time %f seconds' % (timeAfter-timeBefore))
```
```
h = f[30:30+27, 95:95+27]

adshow(h, title='Template with size 27x27')

timeBefore = time.time()
g = NCC(f, h)
timeAfter = time.time()
adshow(ianormalize(g), title='Convolution at space domain - Time %f seconds' % (timeAfter-timeBefore))

timeBefore = time.time()
g = fNCC(f, h)
timeAfter = time.time()
adshow(ianormalize(g), title='Convolution at frequency domain - Time %f seconds' % (timeAfter-timeBefore))
```
```
h = f[20:20+81, 90:90+81]

adshow(h, title='Template with size 81x81')

timeBefore = time.time()
g = NCC(f, h)
timeAfter = time.time()
adshow(ianormalize(g), title='Convolution at space domain - Time %f seconds' % (timeAfter-timeBefore))

timeBefore = time.time()
g = fNCC(f, h)
timeAfter = time.time()
adshow(ianormalize(g), title='Convolution at frequency domain - Time %f seconds' % (timeAfter-timeBefore))
```
```
h = f[20:20+162, 10:10+162]

adshow(h, title='Template with size 162x162')

timeBefore = time.time()
g = NCC(f, h)
timeAfter = time.time()
adshow(ianormalize(g), title='Convolution at space domain - Time %f seconds' % (timeAfter-timeBefore))

timeBefore = time.time()
g = fNCC(f, h)
timeAfter = time.time()
adshow(ianormalize(g), title='Convolution at frequency domain - Time %f seconds' % (timeAfter-timeBefore))
```