# [Haskell] Newbie : How come that cyclic recursive lists are efficient ?

Lennart Augustsson lennart at augustsson.net
Mon Jan 24 09:21:55 EST 2005

It doesn't have to be a top level definition, it works anyway.

-- Lennart
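
A minimal sketch (mine, not from the thread) of Lennart's point: a `let`-bound cyclic list is shared exactly like a top-level one, because the binding is a single node in the heap graph. `hammingLocal` is a hypothetical name; `merge` is copied from the snippet quoted below.

```haskell
-- Sketch: the cyclic list need not be top level. The let-bound 'h'
-- is one shared thunk, so each element is computed at most once.
hammingLocal :: [Integer]
hammingLocal =
  let h = 1 : merge (map (2*) h) (merge (map (3*) h) (map (5*) h))
  in h
  where
    -- same merge as in the quoted snippet below; only needs to
    -- handle infinite lists, so no [] cases
    merge :: Ord a => [a] -> [a] -> [a]
    merge (x:xs) (y:ys)
      | x == y    = x : merge xs ys
      | x <  y    = x : merge xs (y:ys)
      | otherwise = y : merge (x:xs) ys
```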

Bruno Abdon wrote:
> 'hamming', in your code, is a top-level definition. When used three
> times inside its own definition, it's the same variable being used
> three times. You don't recompute a variable value in order to reuse
> it.
>
> As an example, if you do
>
> foo :: [Integer]
> foo = [1,2,3] ++ [4,5]
>
> bar = foo ++ foo  ++ foo
>
> the concatenation used to produce foo will not be done three times in
> order to calculate the value of bar. The same would hold for any
> function foo might be defined with, not only concatenation.
>
> Bruno Abdon
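
This sharing can be observed directly with `Debug.Trace` (a side experiment of mine, not part of Bruno's message): the trace message fires only the first time `foo` is demanded, however many times `bar` mentions it.

```haskell
import Debug.Trace (trace)

-- 'trace' prints its message (to stderr) when the value is first forced.
foo :: [Integer]
foo = trace "computing foo" ([1,2,3] ++ [4,5])

bar :: [Integer]
bar = foo ++ foo ++ foo
-- Forcing all of bar prints "computing foo" only once: foo is
-- evaluated on first demand and its result is reused thereafter.
```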
>
> On Mon, 24 Jan 2005 10:38:35 +0100, Francis Girard
> <francis.girard at free.fr> wrote:
>
>>Hi,
>>
>>The classical Hamming problem have the following solution in Haskell :
>>
>>*** BEGIN SNAP
>>-- hamming.hs
>>
>>-- Merges two infinite lists
>>merge :: (Ord a) => [a] -> [a] -> [a]
>>merge (x:xs) (y:ys)
>>  | x == y    = x : merge xs ys
>>  | x <  y    = x : merge xs (y:ys)
>>  | otherwise = y : merge (x:xs) ys
>>
>>-- Lazily produce the hamming sequence
>>hamming :: [Integer]
>>hamming
>>  = 1 : merge (map (2*) hamming) (merge (map (3*) hamming) (map (5*) hamming))
>>*** END SNAP
>>
>>I just love these algorithms that run after their tail (they make my brain
>>melt), but I don't know how it is that they are efficient.
>>
>>Here, hamming recursively calls itself three times. For this algorithm to
>>be efficient, the Haskell system somehow has to "remember" the already
>>generated sequence THROUGH RECURSION (i.e. not only intermediate "local"
>>results), otherwise it would end up regenerating the beginning of the
>>sequence over and over again.
>>
>>It seems that Haskell does remember the already computed results of the
>>RECURSION, since executing the program with GHCI runs quite smoothly and
>>responsively.
>>
>>That Haskell manages to do that is for me "magnifique". But I need to know (if
>>only a little) about how it achieves this in order to know what I, as a
>>lambda programmer, can do, and how to compute the Big-Oh complexity of the
>>algorithm.
>>
>>Thank you,
>>
>>Francis Girard
>>FRANCE
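
On the Big-Oh question: because the list is shared, each element of such a cyclic list is computed exactly once, so producing the first n Hamming numbers costs roughly O(n) merge comparisons and multiplications rather than re-deriving the prefix each time. The classic Fibonacci stream (a standard textbook example, not from this thread) shows the same knot-tying in its smallest form:

```haskell
-- The list refers back to its own already-evaluated prefix, so the
-- n-th element costs one addition, not an exponential recomputation.
fibs :: [Integer]
fibs = 0 : 1 : zipWith (+) fibs (tail fibs)
```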