[Haskell-cafe] *GROUP HUG*

Richard O'Keefe ok at cs.otago.ac.nz
Wed May 25 08:30:41 CEST 2011


On 25/05/2011, at 12:41 AM, Johannes Waldmann wrote:

> Yves Parès <limestrael <at> gmail.com> writes:
> 
>> For instance, one of my friends asked me once why the operation of 
>> calculating the length of list has an O(n) complexity, 
>> since in his opinion, you could just store the size inside the list 
>> and increment it when elements are appended. 
> 
> Then tell me, why does calculating the length of a (Haskell) 
> list have O(n) complexity.

Because it is a rather rare operation and the cost of doing
this would be far higher than you think.
> 
> I could just store the length of the list

Let's take "abcdef" as an example.
This is a list of length 6,
containing a list of length 5,
containing a list of length 4,
containing a list of length 3,
containing a list of length 2,
containing a list of length 1,
containing a list of length 0 (which can be shared).

So for this little example, you're saying "why not store six more
numbers?"

And I say, "why push up the space and time requirements so high
for something that is so very seldom useful?"
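To make the trade-off concrete, here is a hypothetical length-carrying list type (a sketch only; nothing like this exists in the standard libraries). Every cell pays for an extra strict Int field, and that field must be computed eagerly at every cons:

```haskell
-- Hypothetical list that caches its length in every cons cell.
-- Each cell costs one extra machine word compared with ordinary (:),
-- and the strict count forbids the laziness tricks plain lists allow.
data SizedList a = Nil | Cons !Int a (SizedList a)

sizedCons :: a -> SizedList a -> SizedList a
sizedCons x Nil            = Cons 1 x Nil
sizedCons x l@(Cons n _ _) = Cons (n + 1) x l

sizedLength :: SizedList a -> Int
sizedLength Nil          = 0
sizedLength (Cons n _ _) = n   -- O(1), but paid for on every cons

main :: IO ()
main = print (sizedLength (sizedCons 'a' (sizedCons 'b' (sizedCons 'c' Nil))))
```

Note that because the count is strict, building a cell forces the length of its tail, which is exactly what rules this design out for lazy or infinite lists.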

Then of course there are lists like
   fibs = 1 : 1 : zipWith (+) fibs (tail fibs)
that don't _have_ a finite length.
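That definition only works because lists are lazy: asking for the length of fibs would loop forever, while taking a finite prefix is perfectly fine.

```haskell
-- An infinite list of Fibonacci numbers; it has no length to store.
fibs :: [Integer]
fibs = 1 : 1 : zipWith (+) fibs (tail fibs)

main :: IO ()
main = print (take 8 fibs)  -- prints [1,1,2,3,5,8,13,21]; length fibs never terminates
```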

> 
> On the other hand, there are implementations of strict sequences
> with an O(1) size implementation.
> http://hackage.haskell.org/packages/archive/containers/0.4.0.0/doc/html/Data-Sequence.html#v:length

The O(1) size comes at a cost; if all you need is what lists do well,
it's a good idea to avoid finger trees.  If you DO need the things
that finger trees do well, by all means use them.
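For comparison, a quick sketch of the Data.Sequence API linked above, which does give O(1) length (at the cost of heavier cells and a spine that cannot be infinite):

```haskell
import Data.Sequence ((|>))
import qualified Data.Sequence as Seq

main :: IO ()
main = do
  let s = Seq.fromList "abcdef"
  print (Seq.length s)           -- O(1), prints 6
  print (Seq.length (s |> 'g'))  -- appending at the right end is also cheap
```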

The contrast here is not that Haskell doesn't store the length of
lists and Java does, but that Java doesn't do plain lists at all,
and Haskell does.



