

Module Version: 0.612
Math::Symbolic::VectorCalculus - Symbolically computing gradients, Jacobi matrices etc.

  use Math::Symbolic qw/:all/;
  use Math::Symbolic::VectorCalculus; # not loaded by Math::Symbolic

  @gradient = grad 'x+y*z';
  # or:
  $function = parse_from_string('a*b^c');
  @gradient = grad $function;
  # or:
  @signature = qw(x y z);
  @gradient = grad 'a*x+b*y+c*z', @signature; # Gradient only for x, y, z
  # or:
  @gradient = grad $function, @signature;

  # Similar syntax variations as with the gradient:
  $divergence = div @functions;
  $divergence = div @functions, @signature;

  # Again, similar DWIM syntax variations as with grad:
  @rotation = rot @functions;
  @rotation = rot @functions, @signature;

  # Signatures always inferred from the functions here:
  @matrix = Jacobi @functions;
  # @matrix is now an array of array references. These hold
  # Math::Symbolic trees. Or:
  @matrix = Jacobi @functions, @signature;

  # Similar to Jacobi:
  @matrix = Hesse $function;
  # or:
  @matrix = Hesse $function, @signature;

  $wronsky_determinant = WronskyDet @functions, @vars;
  # or:
  $wronsky_determinant = WronskyDet @functions; # functions of 1 variable

  $differential = TotalDifferential $function;
  $differential = TotalDifferential $function, @signature;
  $differential = TotalDifferential $function, @signature, @point;

  $dir_deriv = DirectionalDerivative $function, @vector;
  $dir_deriv = DirectionalDerivative $function, @vector, @signature;

  $taylor = TaylorPolyTwoDim $function, $var1, $var2, $degree;
  $taylor = TaylorPolyTwoDim $function, $var1, $var2, $degree,
                             $var1_0, $var2_0;
  # example:
  $taylor = TaylorPolyTwoDim 'sin(x)*cos(y)', 'x', 'y', 2;

This module provides several subroutines related to vector calculus, such as computing gradients, divergence, rotation, and Jacobi/Hesse matrices of Math::Symbolic trees. Furthermore, it provides means of computing directional derivatives, the total differential of a scalar function, and the Wronsky Determinant of a set of n scalar functions.

Please note that the code herein may or may not be refactored into the OO-interface of the Math::Symbolic module in the future.

None by default.

You may choose to have any of the following routines exported to the calling namespace. The ':all' tag exports all of the following:

grad div rot Jacobi Hesse WronskyDet TotalDifferential DirectionalDerivative TaylorPolyTwoDim

This subroutine computes the gradient of a Math::Symbolic tree representing a function.

The gradient of a function f(x1, x2, ..., xn) is defined as the vector:

  ( df(x1, x2, ..., xn) / d(x1),
    df(x1, x2, ..., xn) / d(x2),
    ...,
    df(x1, x2, ..., xn) / d(xn) )

(These are all partial derivatives.) Any good book on calculus will have more details on this.

grad uses prototypes to allow for a variety of usages. In its most basic form, it accepts only one argument, which may either be a Math::Symbolic tree or a string, both of which will be interpreted as the function to compute the gradient for. Optionally, you may specify a second argument which must be a (literal) array of Math::Symbolic::Variable objects or valid Math::Symbolic variable names (strings). These variables will then be used for the gradient instead of the x1, ..., xn inferred from the function signature.
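For instance, a minimal sketch of both calling styles (assuming Math::Symbolic and this module are installed; the expressions are arbitrary examples):

```perl
use strict;
use warnings;
use Math::Symbolic qw/:all/;
use Math::Symbolic::VectorCalculus qw/grad/;

# Signature inferred from the function: one partial derivative per
# variable occurring in 'x + y*z'.
my @gradient = grad 'x + y*z';
print "$_\n" for @gradient;

# Explicit signature: derive only with respect to x and y, so that
# a and b are treated as parameters.
my @partial = grad 'a*x + b*y', qw/x y/;
print "$_\n" for @partial;
```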

This subroutine computes the divergence of a set of Math::Symbolic trees representing a vectorial function.

The divergence of a vectorial function F = (f1(x1, ..., xn), ..., fn(x1, ..., xn)) is defined as follows:

sum_from_i=1_to_n( dfi(x1, ..., xn) / dxi )

That is, the sum of the partial derivatives of the i-th component function with respect to the i-th coordinate. See your favourite book on calculus for details. Obviously, it is important to keep in mind that the number of function components must equal the number of variables/coordinates.

Similar to grad, div uses prototypes to offer a comfortable interface. First argument must be a (literal) array of strings and Math::Symbolic trees which represent the vectorial function's components. If no second argument is passed, the variables used for computing the divergence will be inferred from the functions. That means the function signatures will be joined to form a signature for the vectorial function.

If the optional second argument is specified, it has to be a (literal) array of Math::Symbolic::Variable objects and valid variable names (strings). These will then be interpreted as the list of variables for computing the divergence.
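Putting the two calling styles together, a minimal sketch (assuming the module is installed; the component functions are arbitrary examples). Note that, because of the prototypes, the components and variables are passed as named arrays:

```perl
use strict;
use warnings;
use Math::Symbolic qw/:all/;
use Math::Symbolic::VectorCalculus qw/div/;

# F(x, y, z) = (x*y, y*z, x*z): three components, three coordinates.
my @F = ('x*y', 'y*z', 'x*z');

# Variables inferred from the combined function signatures:
my $divergence = div @F;
print $divergence, "\n";

# Explicit list of coordinates:
my @coords = qw/x y z/;
my $div2 = div @F, @coords;
print $div2, "\n";
```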

This subroutine computes the rotation of a set of three Math::Symbolic trees representing a vectorial function.

The rotation of a vectorial function F = (f1(x1, x2, x3), f2(x1, x2, x3), f3(x1, x2, x3)) is defined as the following vector:

( ( df3/dx2 - df2/dx3 ), ( df1/dx3 - df3/dx1 ), ( df2/dx1 - df1/dx2 ) )

Or "nabla x F" for short. Again, I have to refer to the literature for the details on what rotation is. Please note that there have to be exactly three function components and three coordinates because the cross product and hence rotation is only defined in three dimensions.

As with the previously introduced subroutines div and grad, rot offers a prototyped interface. First argument must be a (literal) array of strings and Math::Symbolic trees which represent the vectorial function's components. If no second argument is passed, the variables used for computing the rotation will be inferred from the functions. That means the function signatures will be joined to form a signature for the vectorial function.

If the optional second argument is specified, it has to be a (literal) array of Math::Symbolic::Variable objects and valid variable names (strings). These will then be interpreted as the list of variables for computing the rotation. (And please excuse my copying the last two paragraphs from above.)
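A minimal sketch (module assumed installed; the example field is chosen so the result is known in advance):

```perl
use strict;
use warnings;
use Math::Symbolic qw/:all/;
use Math::Symbolic::VectorCalculus qw/rot/;

# F = (y*z, x*z, x*y) is the gradient of x*y*z, so its rotation is
# mathematically zero; the returned trees may need simplification
# to show this.
my @F = ('y*z', 'x*z', 'x*y');
my @rotation = rot @F;
print $_->simplify(), "\n" for @rotation;
```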

Jacobi() returns the Jacobi matrix of a given vectorial function. It expects any number of arguments (strings and/or Math::Symbolic trees) which will be interpreted as the vectorial function's components. Variables used for computing the matrix are, by default, inferred from the combined signature of the components. By specifying a literal array of variable names as a second argument, you may override this behaviour.

The Jacobi matrix is the vector of gradient vectors of the vectorial function's components.
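A minimal sketch (module assumed installed; the components are arbitrary examples):

```perl
use strict;
use warnings;
use Math::Symbolic qw/:all/;
use Math::Symbolic::VectorCalculus qw/Jacobi/;

# Two components of two variables; @matrix is an array of array
# references, each row holding the gradient of one component.
my @F = ('x*y', 'x + y');
my @matrix = Jacobi @F;
for my $row (@matrix) {
    print join(", ", @$row), "\n";
}
```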

Hesse() returns the Hesse matrix of a given scalar function. First argument must be a string (to be parsed as a Math::Symbolic tree) or a Math::Symbolic tree. As with Jacobi(), Hesse() optionally accepts an array of signature variables as second argument.

The Hesse matrix is the Jacobi matrix of the gradient of a scalar function.
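For example (module assumed installed; the function is an arbitrary choice):

```perl
use strict;
use warnings;
use Math::Symbolic qw/:all/;
use Math::Symbolic::VectorCalculus qw/Hesse/;

# 2x2 matrix (array of array references) of the second partial
# derivatives of f(x, y) = x^2*y + y^3.
my @hesse = Hesse 'x^2 * y + y^3';
for my $row (@hesse) {
    print join(", ", @$row), "\n";
}
```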

This function computes the total differential of a scalar function of multiple variables in a certain point.

First argument must be the function to derive. The second argument is an optional (literal) array of variable names (strings) and Math::Symbolic::Variable objects to be used for deriving. If this argument is not specified, the function's signature will be used. The third argument is also an optional array and denotes the set of variable names to use for indicating the point at which to evaluate the differential. It must have the same number of elements as the second argument. If not specified, the variable names used as coordinates (the second argument) with '_0' appended will be used as the point's components.
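A minimal sketch of the argument variations (module assumed installed; the function and point names are arbitrary examples):

```perl
use strict;
use warnings;
use Math::Symbolic qw/:all/;
use Math::Symbolic::VectorCalculus qw/TotalDifferential/;

# Differential of f(x, y) = x^2 + y^2; the point defaults to the
# variables x_0 and y_0.
my $differential = TotalDifferential 'x^2 + y^2';
print $differential, "\n";

# Explicit signature and point variables (same number of elements):
my @sig   = qw/x y/;
my @point = qw/a b/;
my $d2 = TotalDifferential 'x^2 + y^2', @sig, @point;
print $d2, "\n";
```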

DirectionalDerivative computes the directional derivative of a scalar function in the direction of a specified vector. With f being the function and X, A being vectors, it looks like this: (this is a partial derivative)

df(X)/dA = grad(f(X)) * (A / |A|)

First argument must be the function to derive (either a string or a valid Math::Symbolic tree). Second argument must be the vector into whose direction to derive. It is to be specified as an array of variable names and objects. Third argument is the optional signature to be used for computing the gradient. Please see the documentation of the grad subroutine for details. Its dimension must match that of the directional vector.
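A minimal sketch (module assumed installed; the function and the direction-vector variable names a, b are arbitrary examples):

```perl
use strict;
use warnings;
use Math::Symbolic qw/:all/;
use Math::Symbolic::VectorCalculus qw/DirectionalDerivative/;

# Derivative of f(x, y) = x^2 + y^2 in the direction of the vector
# A = (a, b), given as an array of variable names.
my @A = qw/a b/;
my $dd = DirectionalDerivative 'x^2 + y^2', @A;
print $dd, "\n";

# With an explicit signature (same dimension as @A):
my @sig = qw/x y/;
my $dd2 = DirectionalDerivative 'x^2 + y^2', @A, @sig;
print $dd2, "\n";
```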

This subroutine computes the Taylor Polynomial for functions of two variables. Please refer to the documentation of the TaylorPolynomial function in the Math::Symbolic::MiscCalculus package for an explanation of single dimensional Taylor Polynomials. This is the counterpart in two dimensions.

First argument must be the function to approximate with the Taylor Polynomial, either as a string or a Math::Symbolic tree. Second and third arguments must be the names of the two coordinates. (These may alternatively be Math::Symbolic::Variable objects.) Fourth argument must be the degree of the Taylor Polynomial. Fifth and sixth arguments are optional and specify the names of the variables to introduce as the point of approximation. These default to the names of the coordinates with '_0' appended.
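Using the example from the synopsis above (module assumed installed):

```perl
use strict;
use warnings;
use Math::Symbolic qw/:all/;
use Math::Symbolic::VectorCalculus qw/TaylorPolyTwoDim/;

# Second-degree Taylor Polynomial of sin(x)*cos(y) around the point
# (x_0, y_0), the default point variable names.
my $taylor = TaylorPolyTwoDim 'sin(x)*cos(y)', 'x', 'y', 2;
print $taylor, "\n";
```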

WronskyDet() computes the Wronsky Determinant of a set of n functions.

First argument is required and is a (literal) array of n functions. Second argument is optional and is a (literal) array of n variables or variable names. If the second argument is omitted, the variables used for deriving are inferred from the function signatures. This requires, however, that each function's signature have exactly one element. (That is, each function must be a function of exactly one variable.)
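A minimal sketch of the one-argument form (module assumed installed; the function pair is chosen so the result is known in advance):

```perl
use strict;
use warnings;
use Math::Symbolic qw/:all/;
use Math::Symbolic::VectorCalculus qw/WronskyDet/;

# Both functions have the single variable x, so the signature can
# be inferred. Mathematically the Wronsky Determinant of sin and
# cos is -1 (by sin^2 + cos^2 = 1), though the returned tree is
# not necessarily reduced to that form.
my @f = ('sin(x)', 'cos(x)');
my $w = WronskyDet @f;
print $w, "\n";
```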

Please send feedback, bug reports, and support requests to the Math::Symbolic support mailing list: math-symbolic-support at lists dot sourceforge dot net. Please consider letting us know how you use Math::Symbolic. Thank you.

If you're interested in helping with the development or extending the module's functionality, please contact the developers' mailing list: math-symbolic-develop at lists dot sourceforge dot net.

List of contributors:

Steffen Müller, symbolic-module at steffen-mueller dot net
Stray Toaster, mwk at users dot sourceforge dot net
Oliver Ebenhöh

New versions of this module can be found on http://steffen-mueller.net or CPAN. The module development takes place on Sourceforge at http://sourceforge.net/projects/math-symbolic/
