Test::Assertions(3)   User Contributed Perl Documentation   Test::Assertions(3)

NAME
    Test::Assertions - a simple set of building blocks for both unit and
    runtime testing

SYNOPSIS
        #ASSERT does nothing
        use Test::Assertions;

        #ASSERT warns "Assertion failure"...
        use Test::Assertions qw(warn);

        #ASSERT dies with "Assertion failure"...
        use Test::Assertions qw(die);

        #ASSERT warns "Assertion failure"... with stack trace
        use Test::Assertions qw(cluck);

        #ASSERT dies with "Assertion failure"... with stack trace
        use Test::Assertions qw(confess);

        #ASSERT prints ok/not ok
        use Test::Assertions qw(test);

        #Will cause an assertion failure
        ASSERT(1 == 0);

        #Optional message
        ASSERT(0 == 1, "daft");

        #Checks if coderef dies
        ASSERT(
                DIED( sub {die()} )
        );

        #Check if perl compiles OK
        ASSERT(
                COMPILES('program.pl')
        );

        #Deep comparisons
        ASSERT(
                EQUAL(\@a, \@b),
                "lists of widgets match" # an optional message
        );
        ASSERT(
                EQUAL(\%a, \%b)
        );

        #Compare to a canned value
        ASSERT(
                EQUALS_FILE($foo, 'bar.dat'),
                "value matched stored value"
        );

        #Compare to a canned value (regex match using file contents as regex)
        ASSERT(
                MATCHES_FILE($foo, 'bar.regex')
        );

        #Compare file contents
        ASSERT(
                FILES_EQUAL('foo.dat', 'bar.dat')
        );

        #returns 'not ok for Foo::Bar Tests (1 errors in 3 tests)'
        ASSESS(
                ['ok 1', 'not ok 2', 'A comment', 'ok 3'], 'Foo::Bar Tests', 0
        );

        #Collate results from another test script
        ASSESS_FILE("test.pl");

        #File routines
        $success = WRITE_FILE('bar.dat', 'hello world');
        ASSERT( WRITE_FILE('bar.dat', 'hello world'), 'file was written');
        $string = READ_FILE('example.out');
        ASSERT( READ_FILE('example.out'), 'file has content' );

    The helper routines don't need to be used inside ASSERT():

        if ( EQUALS_FILE($string, $filename) ) {
                print "File hasn't changed - skipping\n";
        } else {
                my $rc = run_complex_process($string);
                print "File changed - string was reprocessed with result '$rc'\n";
        }

        ($boolean, $output) = COMPILES('file.pl');
        # or...
        my $string;
        ($boolean, $standard_output) = COMPILES('file.pl', 1, \$string);
        # $string now contains standard error, separate from $standard_output

    In test mode:

        use Test::Assertions qw(test);
        plan tests => 4;
        plan tests;    #will attempt to deduce the number
        only (1,2);    #Only report ok/not ok for these tests
        ignore 2;      #Skip this test

        #In test/ok mode...
        use Test::Assertions qw(test/ok);
        ok(1);         #synonym for ASSERT

DESCRIPTION
    Test::Assertions provides a convenient set of tools for constructing
    tests, such as unit tests or run-time assertion checks (like C's
    ASSERT macro). Unlike some of the Test:: modules available on CPAN,
    Test::Assertions is not limited to unit test scripts; for example, it
    can be used to check that output is as expected within a benchmarking
    script. When it is used for unit tests, it generates output in the
    standard form for CPAN unit testing (under Test::Harness).

    The package's import method is used to control the behaviour of
    ASSERT: whether it dies, warns, prints 'ok'/'not ok', or does nothing.

    In 'test' mode the module also exports plan(), only() and ignore()
    functions. In 'test/ok' mode an ok() function is also exported for
    compatibility with Test/Test::Harness. The plan function attempts to
    count the number of tests if it isn't told a number (this works fine
    in simple test scripts but not in loops/subroutines). In either mode,
    a warning will be emitted if the planned number of tests is not the
    same as the number of tests actually run, e.g.

        # Looks like you planned 2 tests but actually ran 1.
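
    For example, a minimal test script in 'test' mode might look like
    this (an illustrative sketch; the two assertions are invented for
    demonstration):

        use Test::Assertions qw(test);

        plan tests => 2;    # declare the expected number of tests

        # Each ASSERT prints an 'ok N' or 'not ok N' line in the form
        # expected by Test::Harness
        ASSERT(1 + 1 == 2, "arithmetic works");
        ASSERT(defined $0, "script name is defined");
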
METHODS
    plan $number_of_tests
        Specify the number of tests to expect. If $number_of_tests isn't
        supplied, Test::Assertions tries to deduce the number itself by
        parsing the calling script and counting the number of calls to
        ASSERT. It also returns the number of tests, should you wish to
        make use of that figure at some point. In 'test' and 'test/ok'
        mode a warning will be emitted if the actual number of tests does
        not match the number planned, similar to Test::More.
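
        For instance, letting plan() count the ASSERT calls itself, as in
        the SYNOPSIS (illustrative sketch; the deduced count can also be
        captured from plan's return value if needed):

            use Test::Assertions qw(test);

            plan tests;    # deduce the number from the ASSERT calls below

            ASSERT(1 == 1, "first");
            ASSERT(2 == 2, "second");
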
    only(@test_numbers)
        Only display the results of these tests.

    ignore(@test_numbers)
        Don't display the results of these tests.
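
        For example (illustrative; the test numbers are arbitrary):

            use Test::Assertions qw(test);
            plan tests => 3;

            only(1, 3);     # report results for tests 1 and 3 only
            #ignore(2);     # equivalently, suppress just test 2

            ASSERT(1, "test 1");
            ASSERT(1, "test 2 - not reported");
            ASSERT(1, "test 3");
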
    ASSERT($bool, $comment)
        The workhorse function. Behaviour depends on how the module was
        imported. $comment is optional.

    ASSESS(@result_strings)
        Collate the results from a set of tests. In a scalar context
        returns a result string starting with "ok" or "not ok"; in a
        list context returns 1=pass or 0=fail, followed by a
        description.

            ($bool, $desc) = ASSESS(@args)

        is equivalent to

            ($bool, $desc) = INTERPRET(scalar ASSESS(@args))
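
        For example, using the arrayref calling form shown in the
        SYNOPSIS (illustrative; the result strings are invented):

            my $results = ['ok 1', 'not ok 2', 'ok 3'];

            # Scalar context: a single "ok ..."/"not ok ..." summary string
            my $summary = ASSESS($results, 'Widget tests', 0);

            # List context: 1=pass or 0=fail, plus a description
            my ($passed, $description) = ASSESS($results, 'Widget tests', 0);
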
    ASSESS_FILE($file, $verbose, $timeout)
        $verbose is an optional boolean; the default timeout is 60
        seconds (0 = never time out).

        In a scalar context returns a result string; in a list context
        returns 1=pass or 0=fail, followed by a description. The timeout
        uses alarm(), but has no effect on platforms which do not
        implement alarm().
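
        For example (illustrative; 'unit_test.pl' is a hypothetical test
        script):

            # Scalar context: a single summary string
            my $summary = ASSESS_FILE("unit_test.pl");

            # List context, verbose, with a two-minute timeout
            my ($passed, $description) = ASSESS_FILE("unit_test.pl", 1, 120);
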
    ($bool, $desc) = INTERPRET($result_string)
        Interprets a result string. $bool indicates 1=pass/0=fail; $desc
        is an optional description.
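
        For example, interpreting the kind of summary string returned by
        ASSESS (illustrative):

            my ($ok, $desc) =
                INTERPRET('not ok for Foo::Bar Tests (1 errors in 3 tests)');
            # $ok is 0 (fail); $desc holds the descriptive part
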
    $bool = EQUAL($item1, $item2)
        Deep comparison of 2 data structures (i.e. references to some
        kind of structure) or scalars.
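
        For example, comparing nested structures (illustrative):

            my $expected = { name => 'widget', sizes => [1, 2, 3] };
            my $got      = { name => 'widget', sizes => [1, 2, 3] };

            ASSERT( EQUAL($got, $expected), "structures match" );
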
    $bool = EQUALS_FILE($string, $filename)
        Compares a string with a canned value in a file.

    $bool = MATCHES_FILE($string, $regexfilename)
        Compares a value with a regex that is read from a file. The
        regex has the '^' anchor prepended and the '$' anchor appended
        after being read in from the file. Handy if you have random
        numbers or dates in your output.
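
        For example, if a hypothetical file 'report.regex' contains the
        pattern

            Report generated at \d\d:\d\d:\d\d

        (applied as ^Report generated at \d\d:\d\d:\d\d$), a timestamped
        report can still be checked:

            my $report = "Report generated at 09:15:02";
            ASSERT( MATCHES_FILE($report, 'report.regex'), "report format OK" );
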
    $bool = FILES_EQUAL($filename1, $filename2)
        Test if 2 files' contents are identical.

    $bool = DIED($coderef)
        Test if the coderef died when it was called.
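
        For example (illustrative):

            # Passes: the coderef dies when called
            ASSERT( DIED( sub { die "boom" } ), "error path raises an exception" );

            # Fails: the coderef returns normally
            ASSERT( DIED( sub { return 1 } ), "this will be reported as a failure" );
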
    COMPILES($filename, $strict, $scalar_reference)
        Test if the perl code in $filename compiles OK, like perl -c.
        If $strict is true, tests with the options -Mstrict -w.

        In scalar context it returns 1 if the code compiled, 0
        otherwise. In list context it returns the same boolean, followed
        by the output (that is, standard output and standard error
        combined) of the syntax check.

        If $scalar_reference is supplied and is a scalar reference then
        the standard output and standard error of the syntax check
        subprocess will be captured separately. Standard error will be
        put into this scalar - IO::CaptureOutput is loaded on demand to
        do this - and standard output will be returned as described
        above.
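
        For example (illustrative; 'script.pl' is a hypothetical file):

            # Scalar context: just the boolean
            my $ok = COMPILES('script.pl');

            # List context with -Mstrict -w, capturing standard error
            # separately via a scalar reference
            my $stderr;
            my ($compiled, $stdout) = COMPILES('script.pl', 1, \$stderr);
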
    $contents = READ_FILE($filename)
        Reads the specified file and returns the contents. Returns undef
        if the file cannot be read.

    $success = WRITE_FILE($filename, $contents)
        Writes the given contents to the specified file. Returns undef
        if the file cannot be written.
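
        For example, a simple round trip (illustrative; 'scratch.dat' is
        a hypothetical file):

            ASSERT( WRITE_FILE('scratch.dat', "some test data\n"),
                    "file was written" );

            my $contents = READ_FILE('scratch.dat');
            ASSERT( EQUAL($contents, "some test data\n"),
                    "contents round-tripped" );
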
OVERHEAD
    When Test::Assertions is imported with no arguments, ASSERT is
    aliased to an empty coderef. If this is still too much runtime
    overhead for you, you can use a constant to optimise out ASSERT
    statements at compile time. See the section on runtime testing in
    Test::Assertions::Manual for a discussion of overheads, some
    examples and some benchmark results.
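
    A minimal sketch of the constant approach (DEBUG and expensive_check()
    are arbitrary names used here for illustration):

        use Test::Assertions;           # ASSERT is a no-op in this mode
        use constant DEBUG => 0;

        # When DEBUG is a compile-time constant 0, perl discards the whole
        # statement during constant folding, so expensive_check() is never
        # even called
        ASSERT( expensive_check(), "invariant holds" ) if DEBUG;
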
DEPENDENCIES
    The following modules are loaded on demand:

        Carp
        File::Spec
        Test::More
        File::Compare
        IO::CaptureOutput

RELATED MODULES
    Test and Test::Simple
        Minimal unit testing modules.

    Test::More
        Richer unit testing toolkit compatible with Test and
        Test::Simple.

    Carp::Assert
        Runtime testing toolkit.

TODO
    - Declare ASSERT() with the :assertions attribute in versions of
      perl >= 5.9 so that it can be optimised away at runtime. It should
      be possible to declare the attribute conditionally in a BEGIN
      block (with eval) for backwards compatibility.

SEE ALSO
    Test::Assertions::Manual - A guide to using Test::Assertions

VERSION
    $Revision: 1.54 $ on $Date: 2006/08/07 10:44:42 $ by $Author: simonf $

AUTHOR
    John Alden with additions from Piers Kent and Simon Flack
    <cpan _at_ bbc _dot_ co _dot_ uk>

COPYRIGHT
    (c) BBC 2005. This program is free software; you can redistribute it
    and/or modify it under the GNU GPL.

    See the file COPYING in this distribution, or
    http://www.gnu.org/licenses/gpl.txt

perl v5.36.0                      2022-07-22               Test::Assertions(3)