r.kappa(1)                  GRASS GIS User's Manual                 r.kappa(1)

NAME

       r.kappa - Calculates error matrix and kappa parameter for accuracy
       assessment of classification result.

KEYWORDS

       raster, statistics, classification

SYNOPSIS

       r.kappa
       r.kappa --help
       r.kappa [-whm] classification=name reference=name [output=name]
       [title=string] format=string [--overwrite] [--help] [--verbose]
       [--quiet] [--ui]
   Flags:
       -w
           Wide report
           132 columns (default: 80)

       -h
           No header in the report

       -m
           Print matrix only

       --overwrite
           Allow output files to overwrite existing files

       --help
           Print usage summary

       --verbose
           Verbose module output

       --quiet
           Quiet module output

       --ui
           Force launching GUI dialog
   Parameters:
       classification=name [required]
           Name of raster map containing classification result

       reference=name [required]
           Name of raster map containing reference classes

       output=name
           Name for output file containing error matrix and kappa
           If not given, write to standard output

       title=string
           Title for error matrix and kappa
           Default: ACCURACY ASSESSMENT

       format=string [required]
           Output format
           Options: plain, json
           Default: plain
           plain: Plain text output
           json: JSON (JavaScript Object Notation)

DESCRIPTION

       r.kappa tabulates the error matrix of a classification result by
       crossing the classified map layer with the reference map layer.  Both
       the overall kappa (accompanied by its variance) and conditional kappa
       values are calculated.  This analysis program respects the current
       geographic region and mask settings.

       r.kappa calculates the error matrix of the two map layers and
       prepares the table from which the report is created.  Kappa values
       for the overall classification and for each class are computed along
       with their variances.  The percentages of commission and omission
       error, the total count of correctly classified pixels, the total
       area in pixel counts, and the percentage of overall correctly
       classified pixels are also tabulated.
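
       The statistics listed above can be sketched in a few lines of
       Python.  This is an illustration only, not the r.kappa
       implementation; the two input lists are hypothetical class values
       standing in for overlapping raster maps:

       ```python
       # Illustrative sketch of the error-matrix statistics described above;
       # not the actual r.kappa implementation.  The class values below are
       # hypothetical stand-ins for two overlapping raster maps.
       from collections import Counter

       classified = [1, 1, 2, 2, 3, 3, 1, 2]   # hypothetical classification
       reference  = [1, 2, 2, 2, 3, 1, 1, 3]   # hypothetical reference

       # Error matrix: rows = classified categories, columns = reference.
       counts = Counter(zip(classified, reference))
       cats = sorted(set(classified) | set(reference))
       matrix = [[counts[(r, c)] for c in cats] for r in cats]

       total = len(classified)
       correct = sum(counts[(c, c)] for c in cats)
       overall_accuracy = 100.0 * correct / total

       for i, cat in enumerate(cats):
           row_sum = sum(matrix[i])                               # classified as cat
           col_sum = sum(matrix[r][i] for r in range(len(cats)))  # truly cat
           users_acc = 100.0 * matrix[i][i] / row_sum if row_sum else None
           producers_acc = 100.0 * matrix[i][i] / col_sum if col_sum else None
           commission = 100.0 - users_acc if users_acc is not None else None
           omission = 100.0 - producers_acc if producers_acc is not None else None
           print(cat, users_acc, commission, producers_acc, omission)

       print("overall accuracy (%):", overall_accuracy)
       ```

       The None fallbacks mirror the NA handling described in the NOTES
       section, where a value cannot be computed without dividing by zero.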

       The report is written in plain text format to the output file named
       by the user, or to standard output if no output file is given.  To
       obtain a machine readable version, specify the json output format.

       The body of the report is arranged in panels.  The categories of the
       classified result map layer are arranged along the vertical axis of
       the table, while the categories of the reference map layer run along
       the horizontal axis.  Each panel has a maximum of 5 categories (9 if
       wide format) across the top.  In addition, the last column of the
       last panel holds the cross total of each row.  All of the categories
       of the map layer arranged along the vertical axis are included in
       each panel.  A total at the bottom of each column represents the sum
       of all the rows in that column.

OUTPUT VARIABLES

       All output variables (except kappa variance) have been validated to
       produce correct values in accordance with the formulas given by
       Rossiter, D.G., 2004.  "Technical Note: Statistical methods for
       accuracy assessment of classified thematic maps".

       Observations
           Overall count of observed cells (sum of both correct and
           incorrect ones).

       Correct
           Overall count of correct cells (cells with equal value in
           reference and classification maps).

       Overall accuracy
           Number of correct cells divided by overall cell count (expressed
           in percent).

       User’s accuracy
           Share of correctly classified cells out of all cells classified
           as belonging to the specified class (expressed in percent).
           Inverse of commission error.

       Commission
           Commission error = 100 - user’s accuracy.

       Producer’s accuracy
           Share of correctly classified cells out of all cells known to
           belong to the specified class (expressed in percent).  Inverse
           of omission error.

       Omission
           Omission error = 100 - producer’s accuracy.

       Kappa
           Cohen’s kappa index value.

       Kappa variance
           Variance of the kappa index.  Correctness needs to be validated.

       Conditional kappa
           Conditional user’s kappa for the specified class.

       MCC
           The Matthews Correlation Coefficient is implemented according to
           Grandini, M., Bagli, E., Visani, G. 2020.  "Metrics for
           multi-class classification: An overview."
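
       The multi-class MCC of Grandini et al. can be expressed directly in
       terms of the error matrix.  The following is a sketch of that
       formula, not the r.kappa code; the matrix values are a hypothetical
       example (rows = classified categories, columns = reference
       categories):

       ```python
       # Illustrative sketch of the multi-class MCC formula from Grandini
       # et al. (2020); not the actual r.kappa implementation.  The error
       # matrix below is a hypothetical example.
       import math

       matrix = [[2, 1, 0],
                 [0, 2, 1],
                 [1, 0, 1]]

       n = len(matrix)
       s = sum(sum(row) for row in matrix)               # total cell count
       c = sum(matrix[k][k] for k in range(n))           # correct cells
       p = [sum(matrix[k]) for k in range(n)]            # classified-as counts
       t = [sum(matrix[r][k] for r in range(n)) for k in range(n)]  # reference counts

       numerator = c * s - sum(p[k] * t[k] for k in range(n))
       denominator = math.sqrt((s * s - sum(x * x for x in p)) *
                               (s * s - sum(x * x for x in t)))
       mcc = numerator / denominator if denominator else None
       print("MCC:", mcc)
       ```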

NOTES

       It is recommended to reclassify the categories of the classified
       result map layer into a more manageable number before running
       r.kappa, because r.kappa calculates and then reports information for
       each and every category.

       NA’s in the output mean it was not possible to calculate the value
       (e.g. the calculation would involve division by zero).  In JSON
       output, NA’s are represented with the value null.  If there is no
       overlap between the two maps, a warning is printed and output values
       are set to 0 or null, respectively.

       The estimated kappa value in r.kappa is the value for only one
       class, i.e. the observed agreement between the classifications for
       those observations that have been classified by classifier 1 into
       class i.  In other words, the choice of reference is important here.

       It is calculated as:

       kpp[i] = (pii[i] - pi[i] * pj[i]) / (pi[i] - pi[i] * pj[i])

       where:

           •   pii[i] is the probability of agreement (i.e. the number of
               pixels for which there is agreement divided by the total
               number of assessed pixels)

           •   pi[i] is the probability of classification i having
               classified the point as i

           •   pj[i] is the probability of classification j having
               classified the point as i.
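
       As an illustration of the per-class formula above (again a sketch,
       not the r.kappa source), the conditional kappa values for a small
       hypothetical error matrix can be computed as:

       ```python
       # Illustrative sketch of the conditional (per-class) kappa formula
       # above; not the r.kappa source.  The error matrix is hypothetical:
       # rows = classified categories, columns = reference categories.
       matrix = [[2, 1, 0],
                 [0, 2, 1],
                 [1, 0, 1]]

       n = len(matrix)
       s = sum(sum(row) for row in matrix)        # total assessed pixels

       kpp = []
       for i in range(n):
           pii = matrix[i][i] / s                 # agreement proportion for class i
           pi = sum(matrix[i]) / s                # classified-as-i proportion
           pj = sum(matrix[r][i] for r in range(n)) / s  # reference-is-i proportion
           denom = pi - pi * pj
           kpp.append((pii - pi * pj) / denom if denom else None)

       print("conditional kappa:", kpp)
       ```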

       Some of the reported values (overall accuracy, Cohen’s kappa, MCC)
       can be misleading if the cell count among classes is not balanced.
       See e.g.  Powers, D.M.W., 2012.  "The Problem with Kappa"; Zhu, Q.,
       2020.  "On the performance of Matthews correlation coefficient (MCC)
       for imbalanced dataset".

EXAMPLE

       Example for North Carolina sample dataset:

       g.region raster=landclass96 -p
       r.kappa -w classification=landuse96_28m reference=landclass96

       # export kappa matrix as CSV file "kappa.csv"
       r.kappa classification=landuse96_28m reference=landclass96 \
           output=kappa.csv -m -h

       Verification of classified LANDSAT scene against training areas:

       r.kappa -w classification=lsat7_2002_classes reference=training

SEE ALSO

       g.region, r.category, r.mask, r.reclass, r.report, r.stats

AUTHORS

       Tao Wen, University of Illinois at Urbana-Champaign, Illinois
       Maris Nartiss, University of Latvia (JSON output, MCC)

SOURCE CODE

       Available at: r.kappa source code (history)

       Accessed: Saturday Oct 28 18:17:38 2023

       © 2003-2023 GRASS Development Team, GRASS GIS 8.3.1 Reference Manual

GRASS 8.3.1                                                         r.kappa(1)