
NAME

Config::Model::Tester - Test framework for Config::Model

VERSION

version 2.057

SYNOPSIS

 # in t/model_test.t
 use warnings;
 use strict;

 use Config::Model::Tester ;
 use ExtUtils::testlib;

 my $arg = shift || ''; # typically e t l
 my $test_only_app = shift || ''; # only run one set of tests
 my $do = shift ; # select subtests to run with a regexp

 run_tests($arg, $test_only_app, $do) ;

DESCRIPTION

This class provides a way to test configuration models with test files. It was designed to test several models and several test cases per model.

A specific layout for test files must be followed.

Simple test file layout

Each test case is represented by a configuration file (not a directory) in the *-examples directory. This configuration file is used by the model under test and is copied as $conf_dir/$conf_file_name using the global variables explained below.

In the example below, we have one application model to test (lcdproc) and two test cases. The app name matches a file in a lib/Config/Model/*.d directory. In this case, the app name matches lib/Config/Model/system.d/lcdproc.

 t
 |-- model_test.t
 \-- model_tests.d           # do not change directory name
     |-- lcdproc-test-conf.pl   # test specification for lcdproc app
     \-- lcdproc-examples
         |-- t0              # subtest t0
         \-- LCDD-0.5.5      # subtest for older LCDproc

The test specification is written in the lcdproc-test-conf.pl file (i.e. this module looks for files named like <app-name>-test-conf.pl).

Subtests are specified as files in the lcdproc-examples directory (i.e. this module looks for subtests in a <app-name>-examples directory). lcdproc-test-conf.pl contains instructions so that each file is used as a /etc/LCDd.conf file during each test case.

lcdproc-test-conf.pl can contain specifications for more test cases. Each test case requires a new file in the lcdproc-examples directory.

See "Examples" for a link to the actual LCDproc model tests

Test file layout for multi-file configuration

When a configuration is spread over several files, each test case is provided in a sub-directory. This sub-directory is copied into $conf_dir (a global variable, as explained below).

In the example below, the test specification is written in dpkg-test-conf.pl. The Dpkg layout requires several files per test case. dpkg-test-conf.pl contains instructions so that each directory under dpkg-examples is used as a test case.

 t/model_tests.d
 \-- dpkg-test-conf.pl         # test specification
 \-- dpkg-examples
     \-- libversion            # example subdir, used as subtest name
         \-- debian            # directory for one test case
             |-- changelog
             |-- compat
             |-- control
             |-- copyright
             |-- rules
             |-- source
             |   \-- format
             \-- watch

See "Examples" for a link to the (many) Dpkg model tests

More complex file layout

Each test case is a sub-directory in the *-examples directory and contains several files. The destination of the test files may depend on the system (e.g. the OS). For instance, the system wide ssh_config is stored in /etc/ssh on Linux, and directly in /etc on MacOS.

These files are copied in a test directory using a setup parameter:

  setup => {
    test_file_in_example_dir => 'destination'
  }

Let's consider this example of a test case for ssh:

 t/model_tests.d/
 |-- ssh-test-conf.pl
 |-- ssh-examples
     \-- basic
         |-- system_ssh_config
         \-- user_ssh_config

Unfortunately, user_ssh_config is a user file, so you must specify the home directory for the tests with another global variable:

  $home_for_test = '/home/joe' ;

For Linux only, the setup parameter is:

 setup => {
   'system_ssh_config' => '/etc/ssh/ssh_config',
   'user_ssh_config'   => "~/.ssh/config"
 }

On the other hand, the system wide config file location is different on MacOS, so the test file must be copied to the correct location. When the value of the setup hash is another hash, the keys of this inner hash specify the target location for each OS (as returned by the Perl $^O variable):

      setup => {
        'system_ssh_config' => {
            'darwin' => '/etc/ssh_config',
            'default' => '/etc/ssh/ssh_config',
        },
        'user_ssh_config' => "~/.ssh/config"
      }

See the actual Ssh and Sshd model tests.
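
Tying the pieces above together, an ssh test specification could be sketched as follows. The model name is an assumption, and the setup parameter is assumed to belong to a test case entry, as in the test scenario described below:

 # t/model_tests.d/ssh-test-conf.pl -- sketch
 $model_to_test = "SystemSsh";   # assumed model name
 $home_for_test = '/home/joe';

 @tests = (
     {
         name  => 'basic',
         setup => {
             'system_ssh_config' => {
                 'darwin'  => '/etc/ssh_config',
                 'default' => '/etc/ssh/ssh_config',
             },
             'user_ssh_config' => "~/.ssh/config",
         },
     },
 );

 1; # to keep Perl happy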

Basic test specification

Each model test is specified in <model>-test-conf.pl. This file contains a set of global variables (yes, global variables are often a bad idea in programs, but they are handy for tests):

 # config file name (used to copy test case into test wr_root directory)
 $conf_file_name = "fstab" ;
 # config dir where to copy the file (optional)
 #$conf_dir = "etc" ;
 # home directory for this test
 $home_for_test = '/home/joe' ;

Here, the t0 file will be copied to wr_root/test-t0/etc/fstab.

 # config model name to test
 $model_to_test = "Fstab" ;

 # list of tests. This module looks for the @tests global variable
 @tests = (
    {
     # test name
     name => 't0',
     # add optional specification here for t0 test
    },
    {
     name => 't1',
     # add optional specification here for t1 test
    },
 );

 1; # to keep Perl happy

You can suppress warnings by specifying no_warnings => 1. On the other hand, you may also want to check for warnings specific to your model. In this case, avoid specifying no_warnings here and specify warning tests or warning filters as mentioned below.
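
For instance, a test case entry that silences all warnings could be sketched like this:

 @tests = (
     {
         name        => 't0',
         no_warnings => 1,   # do not check warnings for this test case
     },
 );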

See actual fstab test.

Internal tests or backend tests

Some tests require the creation of a configuration class dedicated to the test (typically to test corner cases of a backend).

This test class can be created directly in the test specification by calling create_config_class on the $model variable. See for instance the layer test or the test for the shellvar backend.
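
A test-only class declaration could be sketched as follows. The class and element names are made up for illustration; a real backend test would also declare the read/write backend of the class:

 # in <model>-test-conf.pl, before @tests
 $model->create_config_class(
     name    => 'TestClass',   # hypothetical class name
     element => [
         foo => { type => 'leaf', value_type => 'uniline' },
     ],
 );

 $model_to_test = 'TestClass';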

Test specification with arbitrary file names

In some models (e.g. Multistrap), the config file is chosen by the user. In this case, the file name must be specified for each test case:

 $model_to_test = "Multistrap";

 @tests = (
    {
        name        => 'arm',
        config_file => '/home/foo/my_arm.conf',
        check       => {},
    },
 );

See actual multistrap test.

Test scenario

Each subtest follows the sequence explained below. Each step of this sequence may be altered by adding specifications in <model-to-test>-test-conf.pl (a combined example is sketched after this list):

  • Setup the test in wr_root/<subtest name>/. If your configuration file layout depends on the target system, you have to specify the path using the setup parameter. See "More complex file layout".

  • Create configuration instance, load config data and check its validity. Use load_check => 'no' if your file is not valid.

  • Check for config data warning. You should pass the list of expected warnings. E.g.

        load_warnings => [ qr/Missing/, (qr/deprecated/) x 3 , ],

    Use an empty array_ref to mask load warnings.

  • Optionally run update command:

        update => { in => 'some-test-data.txt', returns => 'foo' , no_warnings => [ 0 | 1 ] }

    returns is the expected return value (optional). All other arguments are passed to the update method. Note that quiet => 1 may be useful for a less verbose test.

  • Optionally load configuration data. You should design this config data to suppress any error or warning mentioned above. E.g.:

        load => 'binary:seaview Synopsis="multiplatform interface for sequence alignment"',

    See Config::Model::Loader for the syntax of the string accepted by load parameter.

  • Optionally, run a check before running apply_fix (if any). This step is useful to check warning messages:

       check_before_fix => {
          dump_errors   => [ ... ],   # optional, see below
          dump_warnings => [ ... ],   # optional, see below
       }

    Use dump_errors if you expect issues:

      check_before_fix => {
        dump_errors =>  [
            # the issues  and a way to fix the issue using Config::Model::Node::load
            qr/mandatory/ => 'Files:"*" Copyright:0="(c) foobar"',
            qr/mandatory/ => ' License:FOO text="foo bar" ! Files:"*" License short_name="FOO" '
        ],
      }

    Likewise, specify any expected warnings (note that the list must contain only refs to regular expressions):

      check_before_fix => {
            dump_warnings => [ (qr/deprecated/) x 3 ],
      }

    You can tolerate any dump warning this way:

      check_before_fix => {
            dump_warnings => undef ,
      }

    Both dump_warnings and dump_errors can be specified in the check_before_fix hash.

  • Optionally, call apply_fixes:

        apply_fix => 1,
  • Call dump_tree to check the validity of the data after optional apply_fix. This step is not optional.

    As with check_before_fix, both dump_errors and dump_warnings can be used.

  • Run specific content checks to verify that configuration data was retrieved correctly:

        check => {
            'fs:/proc fs_spec' => "proc",
            'fs:/proc fs_file' => "/proc",
            'fs:/home fs_file' => "/home",
        },

    The keys of the hash point to the values to be checked, using the syntax described in "grab(...)" in Config::Model::AnyThing.

    You can run the check using different check modes (see "fetch( ... )" in Config::Model::Value) by passing a hash ref instead of a scalar:

        check  => {
            'sections:debian packages:0' => { mode => 'layered', value => 'dpkg-dev' },
            'sections:base packages:0'   => { mode => 'layered', value => 'gcc-4.2-base' },
        },

    The whole hash content (except "value") is passed to grab and fetch.

    A regexp can also be used to check value:

       check => {
          "License text" => qr/gnu/i,
          "License text" => { mode => 'custom', value => qr/gnu/i },
       }
  • Verify if a hash contains one or more keys (or keys matching a regexp):

     has_key => [
        'sections' => 'debian', # sections must point to a hash element
        'control' => [qw/source binary/],
        'copyright Files' => qr/.c$/,
        'copyright Files' => [ qr/\.h$/, qr/\.c$/ ],
     ],
  • Verify that a hash does not have a key (or a key matching a regexp):

     has_not_key => [
        'copyright Files' => qr/.virus$/ # silly, isn't it?
     ],
  • Verify annotation extracted from the configuration file comments:

        verify_annotation => {
                'source Build-Depends' => "do NOT add libgtk2-perl to build-deps (see bug #554704)",
                'source Maintainer' => "what a fine\nteam this one is",
            },
  • Write back the config data in wr_root/<subtest name>/. Note that write back is forced, so the tested configuration files are written back even if the configuration values were not changed during the test.

    You can skip warnings when writing back with this global parameter:

        no_warnings => 1,
  • Check the content of the written file(s) with Test::File::Contents. Tests can be grouped in an array ref:

       file_contents => {
                "/home/foo/my_arm.conf" => "really big string" ,
                "/home/bar/my_arm.conf" => [ "really big string" , "another"], ,
            }
    
       file_contents_like => {
                "/home/foo/my_arm.conf" => [ qr/should be there/, qr/as well/ ] ,
       }
    
       file_contents_unlike => {
                "/home/foo/my_arm.conf" => qr/should NOT be there/ ,
       }
  • Check added or removed configuration files. If you expect changes, specify a subref to alter the file list:

        file_check_sub => sub {
            my $list_ref = shift ;
            # file added during tests
            push @$list_ref, "/debian/source/format" ;
        },
  • Copy all config data from wr_root/<subtest name>/ to wr_root/<subtest name>-w/. This step is necessary to check that the configuration written back has the same content as the original configuration.

  • Create another configuration instance to read the configuration file that was just copied (configuration data is checked).

  • You can skip the load check if the written file still contains errors (e.g. some errors were ignored and cannot be fixed) with load_check2 => 'no'.

  • Compare the data read back with the original data.

  • Run specific content checks on the written config file to verify that configuration data was written and retrieved correctly:

        wr_check => {
            'fs:/proc fs_spec' =>          "proc" ,
            'fs:/proc fs_file' =>          "/proc",
            'fs:/home fs_file' =>          "/home",
        },

    Like the check item explained above, you can run wr_check using different check modes.
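
Putting several of the optional specifications above together, a single test case entry could be sketched like this (element names and values are made up for illustration):

 @tests = (
     {
         name          => 't0',
         load_warnings => [ qr/deprecated/ ],     # expected warnings at load time
         load          => 'some-elt=some-value',  # tweak config data before checks
         apply_fix     => 1,
         check         => {
             'some-elt' => 'some-value',
         },
         wr_check      => {
             'some-elt' => 'some-value',
         },
     },
 );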

Running the test

Run all tests with one of these commands:

 prove -l t/model_test.t :: [ t|l|e [ <model_name> [ <regexp> ]]]
 perl -Ilib t/model_test.t  [ t|l|e [ <model_name> [ <regexp> ]]]

By default, all tests are run on all models.

You can pass arguments to t/model_test.t:

  • a bunch of letters: 't' to get test traces, 'e' to get a stack trace in case of errors, 'l' to show logs. All other letters are ignored. E.g.

      # run with log and error traces
      prove -lv t/model_test.t :: el
  • The model name to test. E.g.:

      # run only fstab tests
      prove -lv t/model_test.t :: x fstab
  • A regexp to filter subtests. E.g.:

      # run only fstab tests foobar subtest
      prove -lv t/model_test.t :: x fstab foobar
    
      # run only fstab tests foo subtest
      prove -lv t/model_test.t :: x fstab '^foo$'

Examples

SEE ALSO

AUTHOR

Dominique Dumont

COPYRIGHT AND LICENSE

This software is Copyright (c) 2013-2016 by Dominique Dumont.

This is free software, licensed under:

  The GNU Lesser General Public License, Version 2.1, February 1999

SUPPORT

Websites

The following websites have more information about this module, and may be of help to you. As always, in addition to those websites please use your favorite search engine to discover more resources.

Bugs / Feature Requests

Please report any bugs or feature requests by email to ddumont at cpan.org, or through the web interface at https://github.com/dod38fr/config-model-tester/issues. You will be automatically notified of any progress on the request by the system.

Source Code

The code is open to the world, and available for you to hack on. Please feel free to browse it and play with it, or whatever. If you want to contribute patches, please send me a diff or prod me to pull from your repository :)

http://github.com/dod38fr/config-model-tester.git

  git clone git://github.com/dod38fr/config-model-tester.git