MetaTesting.jl
MetaTesting is a collection of utilities for testing "testers," functions that run tests. It is primarily intended as a test dependency.
API
MetaTesting.errors — Function

errors(f, msg_pattern="")

Returns true if at least 1 error is recorded into a testset with an error message matching the given pattern.
f should be a function that takes no arguments and calls some code that uses @test. msg_pattern is a regex or a string that should be contained in the error message. If no pattern is passed, it defaults to the empty string, which matches any error message.
If a test fails (rather than passing or erroring) then errors will throw an error.
Examples
using MetaTesting, Test
function test_approx(x, y)
    @test x ≈ y
end

@testset "test_approx tests" begin
    test_approx(1.0, 1.0) # passes
    @test errors() do
        test_approx(1.0, (2.0,)) # errors: isapprox is not defined for these argument types
    end
end;
# output
Test Summary: | Pass Total Time
test_approx tests | 2 2 0.0s
Test.DefaultTestSet("test_approx tests", Any[], 2, false, false, true, 1.684442215984607e9, 1.68444221725024e9, false)
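The msg_pattern argument is not exercised in the doctest above. The following is a minimal sketch of restricting the match to a particular error; it assumes the message recorded for this MethodError contains the text "MethodError" (the exact wording can vary between Julia versions):

using MetaTesting, Test

# With do-block syntax the tested code becomes the first argument,
# i.e. this call is errors(f, r"MethodError").
errors(r"MethodError") do
    @test 1.0 ≈ (2.0,) # errors: isapprox has no method for these argument types
end # true when the recorded error message matches the pattern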
MetaTesting.fails — Method

fails(f)

f should be a function that takes no arguments and calls some code that uses @test. fails(f) returns true if at least 1 @test fails. If a test errors, fails will display that error and throw an error of its own.
Examples
using MetaTesting, Test
function test_equal(x, y)
    @test x == y
end

@testset "test_equal tests" begin
    test_equal(1, 1) # passes
    @test fails() do
        test_equal(1, 2) # fails
    end
end;
# output
Test Summary: | Pass Total Time
test_equal tests | 2 2 0.0s
Test.DefaultTestSet("test_equal tests", Any[], 2, false, false, true, 1.684442217674312e9, 1.684442217756266e9, false)

MetaTesting.nonpassing_results — Method

nonpassing_results(f)

f should be a function that takes no arguments and calls some code that uses @test. Invoking it via nonpassing_results(f) prevents those @test results from being added to the current testset and returns a collection of all nonpassing test results.
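There is no doctest for nonpassing_results above. The following is a minimal sketch of how it might be used; treating the returned elements as standard Test.Fail / Test.Error objects is an assumption, since the docstring only promises "a collection of all nonpassing test results":

using MetaTesting, Test

results = nonpassing_results() do
    @test 1 == 2 # fails, so this result is collected
    @test 1 == 1 # passes, so it is not collected
end

length(results) # expected to be 1
# If the elements are Test.Fail / Test.Error objects (an assumption),
# they can be inspected further, e.g. count(r -> r isa Test.Fail, results).

Because the captured @test results never reach the current testset, these deliberate failures do not pollute the surrounding test run.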