<div dir="ltr"><br><br><div class="gmail_quote">On Mon, May 16, 2011 at 11:23 AM, Erik de Castro Lopo <span dir="ltr"><<a href="mailto:mle%2Bhs@mega-nerd.com">mle+hs@mega-nerd.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">
Hi all,<br>
<br>
I'm continuing work on the HTTP proxy I'm writing. The absolute<br>
bare basics are working with Warp, Wai and http-enumerator as long<br>
as the upstream web server doesn't send gzipped or chunked data. For<br>
these latter two cases, http-enumerator helpfully gunzips/unchunks<br>
the data. That, however, causes a problem.<br>
<br>
If my proxy simply takes the HTTP headers and data it gets from<br>
http-enumerator and passes them on to the client, the client barfs,<br>
because the headers claim the data is gzipped or chunked when it<br>
actually isn't.<br>
<br>
There are a number of possible solutions to this:<br>
<br>
a) Strip the Content-Encoding/Transfer-Encoding headers and<br>
add a Content-Length header instead. I think this is<br>
probably possible with the API as it is, but I haven't<br>
figured out how yet.<br>
<br>
b) Rechunk or re-gzip the data. This seems rather wasteful<br>
of CPU resources.<br>
<br>
c) Modify the Network.HTTP.Enumerator.http function so that<br>
de-chunking/gunzipping is optional.<br>
<br>
d) Expose the iterHeaders function that is internal to the<br>
http-enumerator package so that client code can grab the<br>
headers before deciding how to handle the body.<br>
<br>
Are there any other options I haven't thought of yet?<br>
<br>
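For what it's worth, option (a) boils down to a small pure transformation over the response headers. The sketch below is only illustrative: the `fixHeaders` name and the plain `(String, String)` header pairs are my own simplification, not Wai's real case-insensitive `ByteString` header types. It drops any encoding headers (and any stale Content-Length) and substitutes a Content-Length computed from the already-decoded body.<br>

```haskell
import Data.Char (toLower)

-- Simplified stand-in for the real Wai/http-types header types.
type Header = (String, String)

-- Option (a): since http-enumerator has already gunzipped/unchunked
-- the body, strip the now-inaccurate encoding headers and replace
-- them with a Content-Length for the decoded body. Header names are
-- compared case-insensitively, as HTTP requires.
fixHeaders :: Int -> [Header] -> [Header]
fixHeaders bodyLen hdrs =
    ("Content-Length", show bodyLen) : filter (not . stale) hdrs
  where
    stale (name, _) =
        map toLower name `elem`
            [ "content-encoding", "transfer-encoding", "content-length" ]
```

The catch is that computing the Content-Length this way requires the whole decoded body in memory before the first byte can be sent on, which gives up the streaming behaviour the enumerator design is meant to provide.<br>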