r.kappa(1)                    Grass User's Manual                   r.kappa(1)

NAME

       r.kappa - Calculates the error matrix and kappa parameter for accuracy
       assessment of a classification result.


KEYWORDS

       raster, statistics, classification


SYNOPSIS

       r.kappa
       r.kappa --help
       r.kappa   [-wh]   classification=name   reference=name    [output=name]
       [title=string]     [--overwrite]    [--help]    [--verbose]   [--quiet]
       [--ui]

   Flags:
       -w
           Wide report
           132 columns (default: 80)

       -h
           No header in the report

       --overwrite
           Allow output files to overwrite existing files

       --help
           Print usage summary

       --verbose
           Verbose module output

       --quiet
           Quiet module output

       --ui
           Force launching GUI dialog

   Parameters:
       classification=name [required]
           Name of raster map containing classification result

       reference=name [required]
           Name of raster map containing reference classes

       output=name
           Name for output file containing error matrix and kappa
           If not given, write to standard output

       title=string
           Title for error matrix and kappa
           Default: ACCURACY ASSESSMENT


DESCRIPTION

       r.kappa tabulates the error matrix of a classification result by
       crossing the classified map layer with the reference map layer.  Both
       the overall kappa (accompanied by its variance) and conditional kappa
       values are calculated.  This analysis program respects the current
       geographic region and mask settings.

       r.kappa calculates the error matrix of the two map layers and prepares
       the table from which the report is created.  Kappa values are computed
       for the overall classification and for each class, along with their
       variances.  Also tabulated are the percentages of commission and
       omission error, the total number of correctly classified pixels, the
       total area in pixel counts, and the percentage of overall correctly
       classified pixels.

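       The overall figures follow the standard definition of Cohen's kappa:
       observed agreement corrected for chance agreement.  A minimal sketch of
       that computation, using a made-up 3x3 error matrix rather than actual
       r.kappa output:

```python
# Sketch of the overall-accuracy and overall-kappa computation from an
# error matrix.  The matrix below is hypothetical; rows = classified map,
# columns = reference map.
matrix = [
    [50, 2, 3],
    [4, 40, 1],
    [2, 3, 45],
]

n = sum(sum(row) for row in matrix)                       # total assessed pixels
correct = sum(matrix[i][i] for i in range(len(matrix)))   # diagonal: agreement
po = correct / n                                          # observed agreement
row_tot = [sum(row) for row in matrix]                    # classified totals
col_tot = [sum(col) for col in zip(*matrix)]              # reference totals
pe = sum(r * c for r, c in zip(row_tot, col_tot)) / n**2  # chance agreement

kappa = (po - pe) / (1 - pe)
print(f"correct: {correct}/{n}  accuracy: {po:.4f}  kappa: {kappa:.4f}")
```

       The diagonal holds the agreeing pixel counts; the row and column totals
       supply the marginal probabilities used in the chance-agreement term.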
       The report is written in plain-text format to an output file named by
       the user, or to standard output if no output file is given.

       The body of the report is arranged in panels.  The categories of the
       classified result map layer are arranged along the vertical axis of
       the table, while the categories of the reference map layer run along
       the horizontal axis.  Each panel holds a maximum of 5 categories (9 in
       the wide format) across the top.  In addition, the last column of the
       last panel gives a cross total of each column for each row.  All of
       the categories of the map layer arranged along the vertical axis,
       i.e., the classified result map layer, are included in each panel.
       There is a total at the bottom of each column representing the sum of
       all the rows in that column.


NOTES

       It is recommended to reclassify the categories of the classified
       result map layer into a more manageable number before running r.kappa,
       because r.kappa calculates and then reports information for each and
       every category.

       NA’s in the output file mark entries that are not applicable, e.g.
       when a MASK is present.

       The Estimated kappa value in r.kappa is the value for one class only,
       i.e. the observed agreement between the classifications for those
       observations that classifier 1 has assigned to class i.  In other
       words, the choice of the reference map is important here.

       It is calculated as:

       kpp[i] = (pii[i] - pi[i] * pj[i]) / (pi[i] - pi[i] * pj[i])

       where:

           ·   pii[i] is the probability of agreement (i.e. the number of
               pixels for which there is agreement, divided by the total
               number of assessed pixels)

           ·   pi[i] is the probability that classifier 1 has classified the
               point as i

           ·   pj[i] is the probability that classifier 2 (the reference)
               has classified the point as i.

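       The per-class formula above can be sketched as follows; the 3x3 error
       matrix is hypothetical, not r.kappa output:

```python
# Conditional (per-class) kappa, kpp[i] = (pii - pi*pj) / (pi - pi*pj).
# Rows = classified map (classifier 1), columns = reference (classifier 2).
matrix = [
    [50, 2, 3],
    [4, 40, 1],
    [2, 3, 45],
]

n = sum(sum(row) for row in matrix)
row_tot = [sum(row) for row in matrix]
col_tot = [sum(col) for col in zip(*matrix)]

kpp = []
for i in range(len(matrix)):
    pii = matrix[i][i] / n    # agreement probability for class i
    pi = row_tot[i] / n       # classifier 1 assigned class i
    pj = col_tot[i] / n       # classifier 2 assigned class i
    kpp.append((pii - pi * pj) / (pi - pi * pj))

print(["%.4f" % k for k in kpp])
```

       Note the asymmetry: swapping the classification and reference maps
       changes pi and pj, which is why the choice of reference matters.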

EXAMPLE

       Example for the North Carolina sample dataset:
       g.region raster=landclass96 -p
       r.kappa -w classification=landuse96_28m reference=landclass96

       Verification of a classified LANDSAT scene against training areas:
       r.kappa -w classification=lsat7_2002_classes reference=training


SEE ALSO

       g.region, r.category, r.mask, r.reclass, r.report, r.stats


AUTHOR

       Tao Wen, University of Illinois at Urbana-Champaign, Illinois

       Last changed: $Date: 2018-10-18 21:13:18 +0200 (Thu, 18 Oct 2018) $


SOURCE CODE

       Available at: r.kappa source code (history)

       © 2003-2019 GRASS Development Team, GRASS GIS 7.4.4 Reference Manual



GRASS 7.4.4                                                         r.kappa(1)