r.kappa(1)                  GRASS GIS User's Manual                 r.kappa(1)

NAME

       r.kappa - Calculates error matrix and kappa parameter for accuracy
       assessment of classification result.

KEYWORDS

       raster, statistics, classification

SYNOPSIS

       r.kappa
       r.kappa --help
       r.kappa [-whm] classification=name reference=name [output=name]
           [title=string] [--overwrite] [--help] [--verbose] [--quiet] [--ui]
17
   Flags:
       -w
           Wide report
           132 columns (default: 80)

       -h
           No header in the report

       -m
           Print matrix only

       --overwrite
           Allow output files to overwrite existing files

       --help
           Print usage summary

       --verbose
           Verbose module output

       --quiet
           Quiet module output

       --ui
           Force launching GUI dialog

   Parameters:
       classification=name [required]
           Name of raster map containing classification result

       reference=name [required]
           Name of raster map containing reference classes

       output=name
           Name for output file containing error matrix and kappa
           If not given, write to standard output

       title=string
           Title for error matrix and kappa
           Default: ACCURACY ASSESSMENT

DESCRIPTION

       r.kappa tabulates the error matrix of a classification result by
       crossing the classified map layer with a reference map layer.  Both
       the overall kappa (accompanied by its variance) and conditional kappa
       values are calculated.  This analysis program respects the current
       geographic region and mask settings.

       r.kappa calculates the error matrix of the two map layers and prepares
       the table from which the report is created.  Kappa values for the
       overall classification and for each individual class are computed
       along with their variances.  The percentages of commission and
       omission error, the total count of correctly classified pixels, the
       total area in pixel counts and the percentage of overall correctly
       classified pixels are also tabulated.
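
       The sketch below is not part of r.kappa itself; it is a plain-Python
       illustration of the per-class commission and omission errors and the
       overall accuracy described above, assuming an error matrix whose rows
       hold the classified categories and whose columns hold the reference
       categories.

       def error_matrix_summary(m):
           """Per-class commission/omission error (%) and overall accuracy (%)."""
           n = len(m)
           total = sum(sum(row) for row in m)       # total assessed pixels
           row_sums = [sum(row) for row in m]       # pixels classified as class i
           col_sums = [sum(row[j] for row in m) for j in range(n)]
           commission = [100.0 * (row_sums[i] - m[i][i]) / row_sums[i]
                         for i in range(n)]
           omission = [100.0 * (col_sums[j] - m[j][j]) / col_sums[j]
                       for j in range(n)]
           overall = 100.0 * sum(m[i][i] for i in range(n)) / total
           return commission, omission, overall

       # Toy 3x3 error matrix (rows: classification, columns: reference)
       print(error_matrix_summary([[50, 3, 2],
                                   [4, 40, 6],
                                   [1, 5, 39]]))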

       The report is written in plain text format, either to the output file
       named by the user or, if no output file is given, to standard output.

       The body of the report is arranged in panels.  The categories of the
       classified result map layer are arranged along the vertical axis of
       the table, while the categories of the reference map layer are
       arranged along the horizontal axis.  Each panel has a maximum of 5
       categories (9 if wide format) across the top.  In addition, the last
       column of the last panel reflects a cross total of each column for
       each row.  All of the categories of the map layer arranged along the
       vertical axis are included in each panel.  There is a total at the
       bottom of each column representing the sum of all the rows in that
       column.

NOTES

       It is recommended to reclassify the categories of the classified
       result map layer into a more manageable number before running r.kappa
       on the classified raster map layer, because r.kappa calculates and
       then reports information for each and every category.

       NA's in the output file mean non-applicable and can occur when a MASK
       exists.

       The Estimated kappa value in r.kappa is the value for only one class,
       i.e. the observed agreement between the classifications for those
       observations that have been classified by classifier 1 into class i.
       In other words, here the choice of reference is important.

       It is calculated as:

       kpp[i] = (pii[i] - pi[i] * pj[i]) / (pi[i] - pi[i] * pj[i])

       where:

           •   pii[i] is the probability of agreement (i.e. number of pixels
               for which there is agreement divided by total number of
               assessed pixels)

           •   pi[i] is the probability of classification i having classified
               the point as i

           •   pj[i] is the probability of classification j having classified
               the point as i.
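
       The following plain-Python sketch is not taken from the r.kappa
       source; it simply restates the formula above, assuming an error
       matrix m whose rows hold the categories assigned by classifier 1 (the
       classification) and whose columns the categories assigned by
       classifier 2 (the reference):

       def conditional_kappa(m, i):
           """Per-class (conditional) kappa for class i, as in the formula above."""
           total = float(sum(sum(row) for row in m))
           pii = m[i][i] / total                  # agreement on class i
           pi = sum(m[i]) / total                 # classifier 1 assigned class i
           pj = sum(row[i] for row in m) / total  # classifier 2 assigned class i
           return (pii - pi * pj) / (pi - pi * pj)

       def overall_kappa(m):
           """Overall (Cohen's) kappa, shown for comparison."""
           total = float(sum(sum(row) for row in m))
           po = sum(m[k][k] for k in range(len(m))) / total
           pe = sum(sum(m[k]) * sum(row[k] for row in m)
                    for k in range(len(m))) / total ** 2
           return (po - pe) / (1 - pe)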

EXAMPLE

       Example for the North Carolina sample dataset:
       g.region raster=landclass96 -p
       r.kappa -w classification=landuse96_28m reference=landclass96
       # export the Kappa matrix as CSV file "kappa.csv"
       r.kappa classification=landuse96_28m reference=landclass96 output=kappa.csv -m -h

       Verification of a classified LANDSAT scene against training areas:
       r.kappa -w classification=lsat7_2002_classes reference=training
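
       The same report can also be generated from a Python script with the
       GRASS scripting library; a minimal sketch, assuming it is run inside
       a GRASS session with the North Carolina sample dataset:

       import grass.script as gs

       # Run r.kappa (matrix only, no header) and capture the plain-text report
       report = gs.read_command(
           "r.kappa",
           classification="landuse96_28m",
           reference="landclass96",
           flags="mh",
       )
       print(report)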

SEE ALSO

       g.region, r.category, r.mask, r.reclass, r.report, r.stats

AUTHOR

       Tao Wen, University of Illinois at Urbana-Champaign, Illinois

SOURCE CODE

       Available at: r.kappa source code (history)

       Accessed: Mon Jun 20 16:46:10 2022
       © 2003-2022 GRASS Development Team, GRASS GIS 8.2.0 Reference Manual


GRASS 8.2.0                                                         r.kappa(1)