[Haskell-cafe] matrix computations based on the GSL

Henning Thielemann lemming at henning-thielemann.de
Wed Jun 29 16:23:23 EDT 2005


On Wed, 29 Jun 2005, Jacques Carette wrote:

> > If we instead distinguish row and column vectors because we treat them as
> >matrices, then the quadratic form
> >  x^T * A * x
> > denotes a 1x1 matrix, not a real.
> >
> But if you consider x to be a vector without orientation, writing down
> x^T is *completely meaningless*!

That's what I stated.

>  If x is oriented, then x^T makes sense.
> Also, if x is oriented, then
> x^T * (A * x) = (x^T * A) * x.
> What is the meaning of  (x * A) for a 'vector' x ?

Of course it has no meaning.
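
To make this concrete, here is a minimal sketch with hypothetical types and
operator names (not taken from any existing library): if vectors carry no
orientation, the quadratic form is just a scalar-valued function, no
transpose is needed, and an expression like x * A cannot even be written,
because no such operation exists.

  -- vectors carry no orientation; a matrix is stored as a list of rows
  newtype Vector a = Vector [a]
  newtype Matrix a = Matrix [[a]]

  -- scalar product of two unoriented vectors
  (<.>) :: Num a => Vector a -> Vector a -> a
  Vector xs <.> Vector ys = sum (zipWith (*) xs ys)

  -- application of the operator A to the vector x
  (#>) :: Num a => Matrix a -> Vector a -> Vector a
  Matrix rows #> x = Vector (map (\r -> Vector r <.> x) rows)

  -- the quadratic form: a plain scalar, and no transpose in sight
  quadForm :: Num a => Matrix a -> Vector a -> a
  quadForm a x = x <.> (a #> x)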

Mathematical notation has the problem that it fails to distinguish things
that are different, while at the same time it distinguishes things that are
essentially the same. If your design goal is to stay as close as possible
to common notational habits, you have already lost! As I pointed out in an
earlier discussion, I see it the other way round: computer languages are
the touchstone for mathematical notation, because you cannot tell a
computer about an imprecise expression: "Don't be stupid, you _know_ what
I mean!"

More specifically:
 You give two different things the same name. You write
  A*B
 and you mean matrix multiplication. Matrix multiplication means finding a
representation of the composition of the operators represented by A and B.
 But you also write
  A*x
 and you mean matrix-vector multiplication. This corresponds to the
application of the operator represented by A to the vector x. (The same
distinction exists for plain functions; see the sketch below.)
 You see: two different things, but the same sign (*). Why? You like this
ambiguity because of its conciseness. You are used to it. What else?
 But then you have to introduce an orientation for vectors, and thus you
distinguish two things (row and column vectors) which are essentially the
same!
 What is the overall benefit?
 It reminds me of the effort of most Haskell XML libraries to get by with
as few combinator functions as possible (namely one: o), which forces you
not to distinguish the types of the functions to be combined (the three
essential types are unified to a -> [a]) and, even worse, forces you to
put conversions from the natural types (like a -> Bool or a -> a) into
every atomic function!
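
For comparison, a tiny sketch in plain Haskell terms: composition and
application of functions are two different operations, with two different
types and two different signs, and nobody finds that inconvenient.

  -- composition of two operators: the analogue of matrix-matrix multiplication
  compose :: (b -> c) -> (a -> b) -> (a -> c)
  compose = (.)

  -- application of an operator to a value: the analogue of matrix-vector
  -- multiplication
  apply :: (a -> b) -> a -> b
  apply = ($)

  -- what is usually written (A*B)*x = A*(B*x) then reads
  --   (f `compose` g) `apply` x  ==  f `apply` (g `apply` x)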

>  It gets much worse when the middle A is of the form B.C.  To ensure
> that everything is as associative as can be, but no more, is very
> difficult.

I don't see the problem. There are three very different kinds of
multiplication, so they should also have their own signs: the scalar
product, matrix-vector multiplication, and matrix-matrix multiplication.
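
A sketch of the third operation, again with a hypothetical operator name
and repeating the Matrix type from the sketch above so that this fragment
stands alone:

  import Data.List (transpose)

  newtype Matrix a = Matrix [[a]]   -- a list of rows, as above

  -- matrix-matrix multiplication: a representation of the composition
  -- of the operators represented by A and B
  (<#>) :: Num a => Matrix a -> Matrix a -> Matrix a
  Matrix a <#> Matrix b =
     Matrix [ [ sum (zipWith (*) row col) | col <- transpose b ] | row <- a ]

With three distinct signs (<.> for the scalar product, #> for matrix-vector
multiplication, <#> for matrix-matrix multiplication) each expression says
which operation it means, and vectors never need an orientation.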

> I don't have the time to go into all the details of why this design
> 'works' (and it does!),

I have worked with Maple and finally dropped it because of its design. I
dropped MatLab, too, because of its distinction between row and column
vectors: it makes no sense to distinguish between, say, convolving row
vectors and convolving column vectors. Many routines have to be aware of
a difference that is irrelevant to them, many others work with only one
of the two kinds, and you are constantly busy transposing.
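
As a small sketch, assuming nothing more than plain Haskell lists: a
convolution needs only a sequence of samples, and the question of whether
that sequence is a 'row' or a 'column' cannot even be asked.

  -- discrete convolution of two sample sequences; no orientation anywhere
  convolve :: Num a => [a] -> [a] -> [a]
  convolve []     _  = []
  convolve _      [] = []
  convolve (x:xs) ys = addLong (map (x *) ys) (0 : convolve xs ys)
    where
      -- pointwise addition that keeps the tail of the longer list
      addLong (a:as) (b:bs) = a + b : addLong as bs
      addLong as     []     = as
      addLong []     bs     = bs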

> giving the 'expected' result to all meaningful linear algebra
> operations, but let me just say that this was the result of long and
> intense debate, and was the *only* design that actually allowed us to
> translate all of linear algebra idioms into convenient notation.

If translating all existing idioms is your goal, then this is certainly
the only design. But adopting sloppy (and not really convenient)
mathematical notation is not a good design guideline. I advise everyone
who likes this kind of convenience to use Maple, MatLab, and friends
instead!


