# Description

Train the Optimum-Path Forest (OPF) classifier.

# Synopse

• OPF = iafit(feats, labels, fdist, mst=False)
• feats: Feature matrix.
• labels: Labels.
• fdist: Distance function between feature vectors.
• mst: If True, run the IFT on the MST graph instead of the full graph.
• OPF: Tuple with (costs to root, parents, labels).
```
def iafit(feats, labels, fdist, mst=False):
    from scipy.spatial import distance
    from iaOPF import iaprim, iamstseeds, iadijkstra
    from numpy import array

    # Pairwise distance matrix of the training samples (full graph).
    A = distance.squareform(distance.pdist(feats, fdist))
    # Minimum spanning tree of the full graph.
    B = iaprim(A)
    # Prototypes (seeds): samples on MST edges linking different classes.
    P = array(iamstseeds(B, labels))

    labels = array(labels)
    seeds_labels = labels[P]
    if mst:
        # IFT restricted to the MST graph.
        OPF = iadijkstra(B, P, seeds_labels)
    else:
        # IFT on the full graph.
        OPF = iadijkstra(A, P, seeds_labels)
    return OPF
```
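The first two steps above (pairwise distances, then an MST) can be sketched without `iaOPF`, using SciPy's `minimum_spanning_tree` as a stand-in for `iaprim`; the toy `feats` below are an assumption for illustration only:

```python
import numpy as np
from scipy.spatial import distance
from scipy.sparse.csgraph import minimum_spanning_tree

# four samples in two well-separated clusters (hypothetical data)
feats = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])

# full-graph pairwise distance matrix, as built in iafit
A = distance.squareform(distance.pdist(feats, 'euclidean'))

# minimum spanning tree of the full graph (stand-in for iaprim)
mst = minimum_spanning_tree(A).toarray()

# an MST over n samples keeps exactly n - 1 edges
print(int((mst > 0).sum()))  # 3
```

The classifier then looks for MST edges joining samples of different classes; their endpoints become the seed prototypes passed to the IFT.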

# Examples

```
from numpy import *
from iaOPF import *
import time

from scikits.learn import datasets

def read_iris():
    iris = datasets.load_iris()
    labels = iris.target
    feats = iris['data']
    return feats, labels

def euclidian(X, Y):
    E = X - Y
    return sqrt(dot(E, E))

feats, labels = read_iris()

t = time.time()
OPF = iafit(feats, labels, euclidian)
print "Fit via IFT full graph: time elapsed: %f secs" % (time.time()-t)

print 'costs to root: ', OPF[0]
print 'parents: ', OPF[1]
print 'labels: ', OPF[2]

t = time.time()
OPF2 = iafit(feats, labels, euclidian, True)
print "Fit via IFT MST graph: time elapsed: %f secs" % (time.time()-t)

print 'fit2 costs to root: ', (abs(OPF[0]-OPF2[0])<0.1).all(), abs(OPF[0]-OPF2[0]).max()
print 'fit2 parents: ', (abs(OPF[1]-OPF2[1])<0.1).all(), abs(OPF[1]-OPF2[1]).max()
print 'fit2 labels: ', (abs(OPF[2]-OPF2[2])<0.1).all(), abs(OPF[2]-OPF2[2]).max()
```
```
Fit via IFT full graph: time elapsed: 0.236963 secs
costs to root:  [ 0.2236068   0.2236068   0.2236068   0.2236068   0.2236068   0.34641016
0.2236068   0.2236068   0.2236068   0.2236068   0.2236068   0.2236068
0.2236068   0.24494897  0.41231056  0.36055513  0.34641016  0.2236068
0.34641016  0.24494897  0.3         0.24494897  0.45825757  0.          0.3
0.2236068   0.2         0.2236068   0.2236068   0.2236068   0.2236068
0.3         0.34641016  0.34641016  0.2236068   0.2236068   0.3
0.2236068   0.2236068   0.2236068   0.2236068   0.6244998   0.2236068
0.2236068   0.36055513  0.2236068   0.24494897  0.2236068   0.2236068
0.2236068   0.31622777  0.31622777  0.31622777  0.31622777  0.31622777
0.3         0.31622777  0.38729833  0.31622777  0.38729833  0.38729833
0.31622777  0.48989795  0.33166248  0.42426407  0.31622777  0.2
0.31622777  0.50990195  0.31622777  0.          0.33166248  0.
0.33166248  0.31622777  0.31622777  0.31622777  0.          0.33166248
0.34641016  0.31622777  0.31622777  0.31622777  0.          0.
0.37416574  0.31622777  0.50990195  0.31622777  0.31622777  0.31622777
0.33166248  0.31622777  0.38729833  0.31622777  0.31622777  0.31622777
0.31622777  0.          0.31622777  0.42426407  0.          0.4
0.36055513  0.36055513  0.52915026  0.          0.43588989  0.55677644
0.63245553  0.2236068   0.34641016  0.36055513  0.26457513  0.48989795
0.37416574  0.36055513  0.81853528  0.52915026  0.          0.36055513
0.31622777  0.52915026  0.24494897  0.36055513  0.4         0.24494897
0.14142136  0.36055513  0.4         0.43588989  0.81853528  0.36055513
0.          0.53851648  0.53851648  0.37416574  0.36055513  0.
0.36055513  0.36055513  0.36055513  0.          0.36055513  0.36055513
0.36055513  0.24494897  0.          0.37416574  0.28284271]
parents:  [   7.    9.   47.   29.    0.   10.   47.   26.   38.   30.   48.    7.
9.   38.   33.   33.   10.    7.    5.   48.   27.   19.    6.   23.
11.   30.   23.    0.    0.   11.   29.   20.   46.   32.   30.   49.
10.   30.   42.    7.    0.    8.   47.   26.   46.    1.   19.    3.
27.    7.   52.   75.   77.   89.   58.   66.   51.   98.   65.   89.
93.   96.   92.   78.   82.   86.   84.   82.   72.   89.   70.   97.
72.   63.   75.   65.   58.   77.   61.   81.   89.   69.   69.   83.
84.   56.   52.   68.   96.   94.   55.   78.   69.   57.   90.   96.
94.   74.   98.   94.  136.  101.  120.  116.  128.  107.  106.  125.
128.  143.  147.  147.  139.  101.  121.  110.  147.  105.  122.  119.
140.  101.  105.  126.  120.  102.  127.  138.  103.  125.  107.  117.
128.  133.  103.  130.  148.  116.  138.  145.  104.  145.  101.  140.
140.  147.  123.  147.  115.  127.]
labels:  [ 0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.
0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.
0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  1.  1.  1.  1.
1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  2.  2.  2.  2.  2.  2.  2.  2.
2.  2.  2.  2.  2.  2.  2.  2.  2.  2.  2.  2.  2.  2.  2.  2.  2.  2.
2.  2.  2.  2.  2.  2.  2.  2.  2.  2.  2.  2.  2.  2.  2.  2.  2.  2.
2.  2.  2.  2.  2.  2.]
Fit via IFT MST graph: time elapsed: 0.169598 secs
fit2 costs to root:  False 0.264575131106
fit2 parents:  False 32.0
fit2 labels:  True 0.0
```
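The output above can be read as a forest: `OPF[1]` gives each sample's predecessor on its optimum path, samples with cost 0 in `OPF[0]` are the prototypes, and `OPF[2]` carries the label inherited from the root. A minimal sketch of walking a parent array to its root, assuming a toy encoding in which prototypes point to themselves (the actual iaOPF encoding may differ):

```python
import numpy as np

# hypothetical toy forest: prototypes are self-loops (the cost-0 roots)
parents = np.array([0, 0, 1, 5, 3, 5])
labels = np.array([0, 0, 0, 1, 1, 1])

def root_of(i, parents):
    # follow predecessor pointers until reaching a prototype (self-loop)
    while parents[i] != i:
        i = parents[i]
    return i

# every sample inherits the label of its root prototype
print(labels[root_of(2, parents)])  # 0
print(labels[root_of(4, parents)])  # 1
```

This is why the full-graph and MST fits above agree on the labels even though the parents (and some path costs) differ: different paths can still lead back to prototypes of the same class.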