Comments on "Are new versions released often enough?"

I see new versions coming out pretty quickly, with lots of cool new features. I am impressed.
It is far more important that the stable releases really are stable than that there are many of them.
Although the releases are not infrequent, I think it would be nice to have bug-fix releases 3-4 times a year. But considering the number of people that actively work on GHC, that could be unreasonable.
really, i don't have any opinion. why should i complete this field? :)
A large project really doesn't want to see new versions too often - it's very costly to change compiler versions if there are -any- incompatibilities.
Releases are, if anything, a bit too frequent.
I would like to see faster bug fixing turnaround.
There always seems to be a working version compatible with my fairly conservative code, for all platforms.
However, I think it would be better if releases were a /bit/ more frequent.
Real even-odd numbering, with lots of point releases, would be great, if the personnel exists for it.
Rather too often, I think, and maybe a touch too enthusiastic about adding new stuff and not keeping backwards compatibility. I suppose this is the cost of living at the bleeding edge.
Yes. The exception is of course GpH; the GpH team keeps me and my students tied to old and stupid C plus MPI.
Perhaps it would be a good idea to announce how much disk space is needed; my first build of version 6.4 failed because I had only about 900M left in my root partition :-(
I often use older versions because they meet my needs.
Has a free software project ever released often enough for the users? *grin*
I would like it if new versions could find their way into Debian better.
I'm not a heavy user of new features.
Actually, I would rather see more, but that's more my craving for novelty. I don't *need* to.
GHC 6.4 (6.4 without Cabal support and 7.0 with, perhaps) should have been released a lot earlier, IMHO. I've been using GHC 6.3 just to access the new TH for a long time and it hasn't been pleasant.
possibly too often
Well, I don't think that the current tempo is a problem. But I do think that GHC could be released a little more often. Often when some new language feature is implemented people are eager to try it out. As soon as the implementation is done GHC could be shipped in an unstable version for people to play with.
We don't want to have to fix things too often. :-) On the other hand, of course, there is always pressure for new features. :-) It can become tricky to support a range of GHC versions. We end up needing a lot of #if/#endif stuff. Perhaps that's inevitable.
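
A minimal sketch of the kind of #if/#endif version gating described in the comment above, not taken from any survey response: it assumes GHC's __GLASGOW_HASKELL__ macro (which the compiler defines when the C preprocessor is enabled), the module and function names (Compat, assocSize) are invented for the sketch, and the Data.FiniteMap to Data.Map change around GHC 6.4 is used purely as an illustration.

    {-# OPTIONS -cpp #-}
    -- Compatibility shim: build the same module against GHC 6.2.x and 6.4.
    -- __GLASGOW_HASKELL__ expands to e.g. 602 for GHC 6.2 and 604 for GHC 6.4.
    module Compat (assocSize) where

    #if __GLASGOW_HASKELL__ >= 604
    import qualified Data.Map as Map          -- Data.Map is new in GHC 6.4
    #else
    import qualified Data.FiniteMap as FM     -- pre-6.4 finite maps
    #endif

    -- One portable entry point; the implementation is gated per compiler version.
    assocSize :: Ord k => [(k, v)] -> Int
    #if __GLASGOW_HASKELL__ >= 604
    assocSize = Map.size . Map.fromList
    #else
    assocSize = FM.sizeFM . FM.listToFM
    #endif

In practice these conditionals multiply across modules, which is exactly the #if/#endif clutter the comment complains about.
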
Instead of new versions, I'd like to see a focus on developing a new standard version of Haskell, solidifying the most popular extensions. I'm concerned that programs I write using (sometimes necessary) extensions with GHC today won't work in the future.
I am impressed with how often they are released! I think the frequency is just right, long enough for the last one to get settled in, but not so long that it feels like all development has stopped.
See comments above. One always likes new versions. My reply is forced by the form rather than considered.
Too often
Haven't been into Haskell long enough to know.
Just about the time when I start wondering when the next GHC will be released, for whatever reason, the next version is generally announced.
I hardly ever feel that I need the latest-greatest type system extension. However, this might be different for other researchers. As a library maintainer, stability is much more important. Maybe you should consider two kinds of GHC releases: latest type-system wonders, and latest Haskell98 stable?
Compatibility between releases seems sticky sometimes at the application level.
At present GHC seems to be adding cool new things faster than I have time to learn about them.
I think it's good that new versions are not released all too often. It's become reasonably simple to build from CVS, so if one wants new features to play with, that's ok. I would like more releases of supporting software, however: happy, alex, especially haddock are released too infrequently for my taste.
this is a lively thing!
things generally seem to come out about as fast as I keep up with them
For the bleeding edge, CVS is not so broken most of the time. Though GHC 6.3 was hard to build during the compatibility library transition.
Recently, as I needed some new features of Data.Generics, I had to install the beta (6.4-dated) version. But it works well for me.
I haven't had much of a reason to upgrade my installation of GHC for a long time... The parts I've used (i.e. the "core" Haskell stuff) have been working alright for some time.
Haven't thought about this before for GHC. Generally, I think that any actively developed software should have a stable release regularly, every 6 to 12 months, but I also realize that this is very much a question of the time and money of the people involved.
Don't want too many releases; want to stabilise on a good release and perhaps only release minor bug fixes.
I'd like to keep up-to-date with the latest-and-greatest so that I could work out examples from recent academic papers, but I don't want to have to build my own GHC from the head to get there.
I actually don't know, but there is no option for that.
I don't have time to install updates all the time. I prefer stability to frequency.
GHC seems to be developed a lot faster than, say, GCC, which has very infrequent releases.
As you might see above, I have not even come to try all the existing features. 99% of the time I'm coding basic Haskell without any extensions or special features.
Often enough that my own limited time means I don't miss features for long.
It doesn't matter; why release new versions if there is nothing new?
No idea! Just got here!
Although it was a long wait for 6.4 :)
Sorry, not enough experience w/GHC yet.
Working on internals myself, I consider the version numbering somewhat strange, issuing "minor" release numbers to important changes, e.g. 5.04: completely new library structure (compared to 5.02); 6.4: completely new backend.
Especially the daily builds are frequent!
It might be nice to have more "bug-fix" releases, but I realize that with a small team this is probably too much work.
It's a trade-off with stability. I'm happy with the process.
release early - release often.
The wait for GHC 6.4 was a long one; on the other hand, there is no point releasing something that is not ready. Maybe more releases, so that the difference between them is not so large.
I seldom go to the trouble of installing (possibly unstable) CVS versions, which means that I have to wait for new features.
I just started using it so I am not sure how often new versions are released.
GHC stable versions are often enough. I do wish for metastable releases for testing specific features.
Release them when they are ready. Make them stable. Though GHC is effectively the de facto standard Haskell implementation, we could do with a new Haskell language standard.
I think the release frequency is about right.
Well, yes and no really. Yes, because new releases break the ABI, i.e. all libraries need to be rebuilt with the new version. No, since it would be nice to have more gradual, regular updates. :)
I think software should be released often, without putting too many changes in each release. This way it would be easier to track bugs and correct them.
Although my opinion here probably doesn't mean much because I'm tracking CVS-HEAD anyway.
I compile from CVS every one or two weeks. So in fact I am more interested in features than in versions
The FFI doesn't find the function in this program; instead I get the error below (... but I don't know where it is wrong):

    FFmpeg.o(.text+0x1398):fake: undefined reference to `av_init_packet'

(hsc)

    -- -*- mode: haskell -*-
    {-# OPTIONS -fglasgow-exts #-}
    #include <avformat.h>
    #include <avcodec.h>

    module FFmpeg where

    import Foreign

    data CAVPacket = CAVPacket
      { pktPts         :: !(#type int64_t)
      , pktDts         :: !(#type int64_t)
      , pktDatas       :: !(Ptr (#type uint8_t))
      , pktSize        :: !Int
      , pktStreamIndex :: !Int
      , pktFlags       :: !Int
      , pktDuration    :: !Int
      } deriving (Eq, Show)

    instance Storable CAVPacket where
      peek p = do
        pts          <- (#peek AVPacket, pts) p
        dts          <- (#peek AVPacket, dts) p
        datas        <- (#peek AVPacket, data) p
        size         <- (#peek AVPacket, size) p
        stream_index <- (#peek AVPacket, stream_index) p
        flags        <- (#peek AVPacket, flags) p
        duration     <- (#peek AVPacket, duration) p
        return $! CAVPacket pts dts datas size stream_index flags duration
      poke p (CAVPacket pts dts datas size stream_index flags duration) = do
        (#poke AVPacket, pts) p pts
        (#poke AVPacket, dts) p dts
        (#poke AVPacket, data) p datas
        (#poke AVPacket, size) p size
        (#poke AVPacket, stream_index) p stream_index
        (#poke AVPacket, flags) p flags
        (#poke AVPacket, duration) p duration
      sizeOf _ = (#size AVPacket)
      -- not confident about the alignment value
      alignment _ = 7

    av_init_packet :: IO (Ptr CAVPacket)
    av_init_packet = alloca $ \pkt -> do
      c_av_init_packet pkt
      return pkt

    foreign import ccall unsafe "av_init_packet"
      c_av_init_packet :: Ptr CAVPacket -> IO ()

And now, the newest release candidate package for Windows can't load a DLL:

    Loading package base-1.0 ... linking ... done.
    Loading package OpenGL-2.0 ... ghc.exe: can't load .so/.DLL for: m (addDLL: unknown error
now that I'm more deeply involved in the release process (via cabal) I might change my mind, but in the past it has always seemed frequent enough.
IMHO a compiler should not have too frequent releases, except for bug fixes. Introducing new features every 6 months or so would lead to maintenance problems in users' code and potentially great difficulties in reusing very old as well as very recent code.
Most of what is in a new release falls into the nice-to-have category, so the timing of releases is not a big deal.
Maybe when I start using more of the advanced idioms I might like new versions becoming available quicker (I'm certainly looking forward to using GADTs in 6.4).
Persons depending on very new features should hopefully be willing to patch.
In general, yes. But not if you're on Windows and have no interim snapshots.
New major releases should be released only when they are stable enough. Bug-fix versions could instead be released more often.
Yes, usually, but not 6.4.
Yes of course. New versions of a compiler are not needed often. What is needed is carefully crafting bindings to standard libraries so that the design of the bindings doesn't impede further work on the compiler.
Developers who want to live on the cutting edge can get more recent versions anyway, either straight from CVS or from the automatically bundled tar files.
this should not be an issue because cvs snapshots are available
But there are sometimes major changes, as in the module structure between 6.2.2 and 6.4, or as in the set of (syntactically) accepted programs between 6.* and 6.2.2.
It's hard for me to remember back about this. Also, I have lots of conflicting interests. More frequent releases make packaging harder in some ways, but on the other hand it's possible to have less buggy packages without having a huge diff build up between releases. On balance I think a higher frequency of point releases, especially for x.y.1 when x.y inevitably quickly reveals a number of issues (like doesn't compile on powerpc this time round), would be good. Unfortunately, due to a combination of where we are in the Debian release process coupled with technical details I won't bore you with I've had to keep 6.4 out of Debian for the time being, so problems arising from us won't be found as quickly as I'd normally hope we'd be able to.
Actually I don't know, I'm sort of new to this.
Please don't get driven by timescales; make it driven by need/functionality.
New versions seem to be released frequently.
In hindsight, a release between 6.2.2 and 6.4 could have been a good idea. That way, 6.4 (or what then would have been 6.4.1) could have waited until Cabal was finished, which it seems will not be the case. But then again, things like these are hard to predict. More than a year between 6.2.2 and 6.4 seems long though, especially since 6.4 brings many new toys, like GADTs. :)
The nightly snapshots are a great idea: less hassle than CVS and more reliable reference points, and always there when the latest official release is not up to the task...
No, the automated daily builds should be sufficient to be on the bleeding edge. A release cycle of 6-9 months is OK.
Considering the improvements and features added in each new version, I'd say the development cycle is efficient.
More often than I need 'em.
Yes, considering the improvements they make. Of course, I wouldn't have minded having the same feature set and stability two years ago. But for that, GHC needs more developers. This month I've switched from a full-time to a part-time job (4/5), so I'll have one free day during the week for playing with things like GHC development :-)
I would rather have versions of GHC released whenever a major new feature (or set of related features) is added, instead of grouping a bunch of unrelated features together, seemingly arbitrarily. Thus, the current GHC 6.4 would have been preceded by a series of intermediate versions, 6.3.1, 6.3.2, 6.3.3, etc. I think having these intermediate releases would result in more testing. For example, people on Windows could not even test GHC 6.3 without building from source, but if they wanted to build everything from source they probably wouldn't be using Windows (and plus, it is time consuming to build GHC from source). If you had released GHC 6.3.1, GHC 6.3.2, etc. then I think that more people would use these intermediate releases and would find bugs sooner, with the result being that GHC 6.4 would look a lot more like the (presumably) upcoming GHC 6.4.1 release.
Not quite. Automatic incremental software updates would be nice.
Well, I don't know, but there is a new release this month :)
Maybe even too often.