Module Version: 0.07

NAME

Data::StreamSerializer - non-blocking serializer.

SYNOPSIS

  use Data::StreamSerializer;

  my $sr = Data::StreamSerializer->new('Your data');

  while(defined(my $part = $sr->next)) {
      print $socket $part;
  }

DESCRIPTION

Sometimes you need to serialize a lot of data. Doing it in one call to Data::Dumper can take a long time, which may be unacceptable if your code runs inside an event loop. This module lets you serialize your data progressively and do something else between serialization iterations.

This module works slower than Data::Dumper, but it serializes objects progressively, so you can do other work between serialization iterations.
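The interleaving pattern can be sketched as follows. This is a minimal example: it drains the stream into a string, with a comment marking where an event loop would regain control (in real code each part would typically be written to a socket instead).

```perl
use strict;
use warnings;
use Data::StreamSerializer;

my $sr = Data::StreamSerializer->new({ list => [ 1 .. 1000 ] });

my $output = '';
while (defined(my $part = $sr->next)) {
    $output .= $part;    # in real code: syswrite to a socket
    # ... here an event loop could process other pending events ...
}
```

After the loop finishes, is_eof returns TRUE and further calls to next return undef.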

Recognized types:

HASH

ARRAY

REF

Regexp

SCALAR

METHODS

new

Constructor. All arguments will be serialized.

next

Returns the next part of the serialized string, or undef if all data have been serialized.

block_size

The block size for one iteration. A smaller value means less time spent in each iteration, but increases the total serialization time. A value between 200 and 2000 bytes is a reasonable choice (the default is 512). See BENCHMARKS to make a decision.
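A short sketch of tuning the block size, assuming block_size acts as a set-on-call accessor invoked before iteration begins (check the module source for the exact calling convention):

```perl
use strict;
use warnings;
use Data::StreamSerializer;

my $sr = Data::StreamSerializer->new([ 1 .. 500 ]);
$sr->block_size(1024);    # larger chunks: fewer, longer iterations

my ($count, $str) = (0, '');
while (defined(my $part = $sr->next)) {
    $str .= $part;
    $count++;
}
```

With a larger block_size the loop above runs fewer iterations, each taking longer; with a smaller one, the opposite.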

recursion_depth

If the serialized object contains recursive references, they are replaced by empty objects. If this value is higher than 1, recursive references are re-serialized until the given depth is reached.

Example:

    my $t = { a => 'b' };
    $t->{c} = $t;

This example will be serialized into string:

    {"c",{"c",{},"a","b"},"a","b"}

If you increment recursion_depth, the same example will be serialized into the string:

    {"c",{"c",{"c",{},"a","b"},"a","b"},"a","b"}

etc.

recursion_detected

Returns TRUE if a recursion was detected.
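A minimal sketch of detecting recursion after the stream has been drained, using the circular structure from the recursion_depth example above:

```perl
use strict;
use warnings;
use Data::StreamSerializer;

my $t = { a => 'b' };
$t->{c} = $t;    # create a circular reference

my $sr = Data::StreamSerializer->new($t);
1 while defined $sr->next;    # drain the stream

print "recursion detected\n" if $sr->recursion_detected;
```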

is_eof

Returns TRUE if the end of the stream has been reached; in that case subsequent calls to next will return undef.

SEE ALSO

Data::StreamDeserializer.

BENCHMARKS

You can try a few scripts in the benchmark/ directory, which also contains a few test arrays.

Here are some test results from my system.

Array which contains 100 hashes:

    $ perl benchmark/vs_dumper.pl -n 1000 -b 512 benchmark/tests/01_100x10
    38296 bytes were read
    First serializing by eval... done
    First serializing by Data::StreamSerializer... done
    Starting 1000 iterations for Dumper... done (40.376 seconds)
    Starting 1000 iterations for Data::StreamSerializer... done (137.960 seconds)

    Dumper statistic:
            1000 iterations were done
            maximum serialization time: 0.0867 seconds
            minimum serialization time: 0.0396 seconds
            average serialization time: 0.0404 seconds

    Data::StreamSerializer statistic:
            1000 iterations were done
            58000 SUBiterations were done
            maximum serialization time: 0.1585 seconds
            minimum serialization time: 0.1356 seconds
            average serialization time: 0.1380 seconds
            average subiteration  time: 0.00238 seconds

Array which contains 1000 hashes:

    $  perl benchmark/vs_dumper.pl -n 1000 -b 512 benchmark/tests/02_1000x10
    355623 bytes were read
    First serializing by eval... done
    First serializing by Data::StreamSerializer... done
    Starting 1000 iterations for Dumper... done (405.334 seconds)
    Starting 1000 iterations for Data::StreamSerializer... done (1407.899 seconds)

    Dumper statistic:
            1000 iterations were done
            maximum serialization time: 0.4564 seconds
            minimum serialization time: 0.4018 seconds
            average serialization time: 0.4053 seconds

    Data::StreamSerializer statistic:
            1000 iterations were done
            520000 SUBiterations were done
            maximum serialization time: 2.0050 seconds
            minimum serialization time: 1.3862 seconds
            average serialization time: 1.4079 seconds
            average subiteration  time: 0.00271 seconds

You can see that in both cases each subiteration takes roughly the same time, regardless of the total data size.

AUTHOR

Dmitry E. Oboukhov, <unera@debian.org>

COPYRIGHT AND LICENSE

Copyright (C) 2011 by Dmitry E. Oboukhov

This library is free software; you can redistribute it and/or modify it under the same terms as Perl itself, either Perl version 5.10.1 or, at your option, any later version of Perl 5 you may have available.

VCS

The project is hosted in my git repository: http://git.uvw.ru/?p=data-stream-serializer;a=summary
