string::token(n)           Text and string utilities          string::token(n)



______________________________________________________________________________

NAME
       string::token - Regex based iterative lexing

SYNOPSIS
       package require Tcl 8.5

       package require string::token ?1?

       package require fileutil

       ::string token text lex string

       ::string token file lex path

       ::string token chomp lex startvar string resultvar

______________________________________________________________________________

DESCRIPTION
       This package provides commands for regular expression based lexing
       (tokenization) of strings.

       The complete set of procedures is described below.

       ::string token text lex string
              This command takes an ordered dictionary lex mapping regular
              expressions to labels, and tokenizes the string according to
              this dictionary.

              The result of the command is a list of tokens, where each
              token is a 3-element list of label, start- and end-index in
              the string.

              The command will throw an error if it is not able to tokenize
              the whole string.

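              For example, a minimal lexer might use the dictionary below;
              the patterns and labels are illustrative, not part of the
              package:

```tcl
package require string::token

# Illustrative lexing dictionary: an ordered mapping from
# regular expressions to token labels.
set lex {
    {[0-9]+}      NUMBER
    {[a-zA-Z_]+}  IDENT
    {[ \t]+}      SPACE
}

# Each token in the result is a 3-element list of label,
# start index, and end index into the input string.
set input  "foo 42"
set tokens [string token text $lex $input]
foreach token $tokens {
    lassign $token label start end
    puts "$label: [string range $input $start $end]"
}
```
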
       ::string token file lex path
              This command is a convenience wrapper around ::string token
              text above, and fileutil::cat, enabling the easy tokenization
              of whole files. Note that this command loads the file wholly
              into memory before starting to process it.

              If the file is too large for this mode of operation, a command
              directly based on ::string token chomp below will be
              necessary.

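              The call mirrors the text form; in this sketch the path and
              dictionary are placeholders:

```tcl
package require string::token
package require fileutil

# Illustrative dictionary; /tmp/input.txt is a placeholder path.
set lex {
    {[0-9]+}   NUMBER
    {[^0-9]+}  OTHER
}

# The file is read fully into memory, then tokenized as one string.
set tokens [string token file $lex /tmp/input.txt]
```
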
       ::string token chomp lex startvar string resultvar
              This command is the workhorse underlying ::string token text
              above. It is exposed to enable users to write their own
              lexers, which may, for example, apply different lexing
              dictionaries according to some internal state.

              The command takes an ordered dictionary lex mapping regular
              expressions to labels, a variable startvar which indicates
              where to start lexing in the input string, and a result
              variable resultvar to extend.

              The result of the command is a tri-state numeric code
              indicating one of

              0      No token found.

              1      Token found.

              2      End of string reached.

              Note that recognition of a token from lex is started at the
              character index in startvar.

              If a token was recognized (status 1) the command will update
              the index in startvar to point to the first character of the
              string past the recognized token, and it will further extend
              the resultvar with a 3-element list containing the label
              associated with the regular expression of the token, and the
              start- and end-character-indices of the token in string.

              Neither startvar nor resultvar will be updated if no token is
              recognized at all.

              Note that the regular expressions are applied (tested) in the
              order they are specified in lex, and the first matching
              pattern stops the process. Because of this it is recommended
              to order the patterns in lex from the most specific to the
              most general.

              Further note that all regex patterns are implicitly prefixed
              with the constraint escape \A to ensure that a match starts
              exactly at the character index found in startvar.

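              A hand-written lexer built on chomp can be sketched as
              follows; the dictionary, input, and error handling are
              illustrative only:

```tcl
package require string::token

# Most specific pattern first, per the ordering note above.
set lex {
    {[0-9]+}  NUMBER
    {[a-z]+}  WORD
    {[ \t]+}  SPACE
}

set input  "abc 123"
set start  0        ;# character index where lexing resumes
set tokens {}       ;# extended by chomp on each recognized token

while {1} {
    set code [string token chomp $lex start $input tokens]
    switch -exact -- $code {
        0 { error "no token recognized at index $start" }
        1 { continue }  ;# token appended, start advanced past it
        2 { break }     ;# end of string reached
    }
}
```

              Because startvar and resultvar are passed by name, such a loop
              can switch to a different lex dictionary between iterations
              while keeping its position in the input.
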
BUGS, IDEAS, FEEDBACK
       This document, and the package it describes, will undoubtedly contain
       bugs and other problems. Please report such in the category textutil
       of the Tcllib Trackers [http://core.tcl.tk/tcllib/reportlist]. Please
       also report any ideas for enhancements you may have for either the
       package and/or the documentation.

       When proposing code changes, please provide unified diffs, i.e. the
       output of diff -u.

       Note further that attachments are strongly preferred over inlined
       patches. Attachments can be made by going to the Edit form of the
       ticket immediately after its creation, and then using the left-most
       button in the secondary navigation bar.

KEYWORDS
       lexing, regex, string, tokenization

CATEGORY
       Text processing



tcllib                                 1                      string::token(n)