RE: content-length transfer-encoding issue

I have partially figured out the problem, as described below, and would like
to propose a fix that solves my issue and seems prudent unless someone can
identify a better solution.  There have been a couple of unanswered 'I lost
data on a POST response' postings that make me think other people are seeing
this (albeit rarely).

My fix is a one-liner.

Here is a trace of what happens in the scenario I described in a prior
email:

POST /blah blahxyz.html HTTP/1.1
Accept: */*
Accept-Encoding: gzip,deflate
Accept-Language: en-us
Expect: 100-continue
Host: xzy.com
User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0; .NET CLR
1.0.2914)/1.0 libwww/5.3.1
Cache-Control: no-cache
Connection: TE,Keep-Alive
Date: Fri, 20 Dec 2002 08:32:03 GMT
Content-Length: 172
Content-Type: application/x-www-form-urlencoded


HTTP/1.1 100 Continue
Date: Fri, 20 Dec 2002 08:31:24 GMT
Content-Length: 0
Server: SilverStream Server/10.0

postdata goes here
10HTTP/1.1 200 OK
Date: Fri, 20 Dec 2002 08:31:28 GMT
Transfer-Encoding: chunked
Content-Type: text/html
Server: SilverStream Server/10.0

1A72
<HTML>.....
....
</HTML>


Here is what happens:

1.  The POST kicks off a 100 Continue with a Content-Length header of 0.
That sets the HTResponse length to 0.

2.  When the 200 OK comes in, the HTResponse length is still 0 (fallout from
the Content-Length header in the prior 100).  As a consequence of that prior
100 response, and the fact that Transfer-Encoding: chunked is being used,
pumpData (HTStream * me) in HTMIME.c returns prematurely, before all the body
data has been processed (because it **thinks** the length is 0).
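
For context, the decision in pumpData hinges entirely on that stored length.
A simplified sketch of the shape of the check (illustrative only, not the
actual HTMIME.c source; body_len is a made-up field name):

	    /* Illustrative sketch only -- not the real HTMIME.c code.
	    ** pumpData trusts any non-negative length left in the
	    ** HTResponse object. */
	    long cl = HTResponse_length(me->response);
	    if (cl >= 0) {
	        /* 0 means "empty body", so the stream is treated as
	        ** complete -- even though that 0 is stale, left over from
	        ** the 100 Continue, and the real 200 OK body is chunked. */
	        me->body_len = cl;      /* hypothetical field name */
	    } else {
	        /* -1 means "unknown": the end of the body is then found
	        ** via Transfer-Encoding: chunked or connection close. */
	    }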

My fix is a one-liner in HTTPStatus_put_block in HTTP.c:


From:

	    status = (*me->info_target->isa->put_block)(me->info_target, b, l+1);
	    if (status != HT_CONTINUE) return status;
To:

	    status = (*me->info_target->isa->put_block)(me->info_target, b, l+1);
	    if (status != HT_CONTINUE) return status;
	    /* Re-initialize to -1 in case the 100 Continue carried a
	    ** Content-Length: 0 header. */
	    HTResponse_setLength(me->request->response, -1);
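
Resetting the length to -1 restores what appears to be the "not yet known"
default of a fresh HTResponse, so the 200 OK either supplies its own
Content-Length or, as here, the end of the body is determined by the chunked
decoding in HTMIME.c.  If the info_target forwarding path can ever be reached
for something other than a 1xx response (I have not verified whether it can),
a slightly more defensive variant would guard the reset on the parsed status
code.  Untested sketch; me->status is a guess at the field name:

	    status = (*me->info_target->isa->put_block)(me->info_target, b, l+1);
	    if (status != HT_CONTINUE) return status;
	    /* Reset to -1 (length unknown) only after a 1xx informational
	    ** response, so its Content-Length: 0 cannot leak into the
	    ** final response. */
	    if (me->status >= 100 && me->status < 200)
	        HTResponse_setLength(me->request->response, -1);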


Thoughts?
thx
fhc
-----Original Message-----
From: www-lib-request@w3.org [mailto:www-lib-request@w3.org] On Behalf Of
Fred Covely
Sent: Thursday, December 19, 2002 11:43 PM
To: www-lib@w3.org
Subject: content-length transfer-encoding issue



I have observed a bug when accessing a SilverStream (Novell) web server as a
client using libwww.  Any help is appreciated.

What happens is that I do a POST over SSL, then I do a non-SSL POST.

The second POST returns to the application via HTReader_read with status =
HT_LOADED, then goes to HTTPEvent (HTTP.c), which returns HT_OK after a call
to CLEANUP, which releases the response object.

The problem is that the second POST still has data coming in, so I lose part
of the response from the server.

I have it down to this:

1.  The server does not send back a Content-Length header.  It does send
back Transfer-Encoding: chunked, which ought to allow proper detection of
the end of the response body (see the sketch of the chunked framing below).
I can see a hex-encoded chunk length in the response, but libwww seems to
ignore it altogether.
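
For background on why the missing Content-Length should not matter: with
Transfer-Encoding: chunked, the end of the body is signalled in-band by a
zero-length chunk, so no Content-Length is needed at all.  A minimal,
standalone illustration of the framing (not libwww code; it just decodes a
chunked body held in memory):

	#include <stdio.h>
	#include <stdlib.h>
	#include <string.h>

	/* Decode a chunked body held entirely in 'p'.  Each chunk is a hex
	** size line, CRLF, the data, CRLF; a size of 0 marks the end of the
	** body (the "1A72" seen in the trace is such a hex chunk size). */
	static void decode_chunked (const char * p)
	{
	    for (;;) {
	        char * end;
	        long size = strtol(p, &end, 16);    /* hex chunk size */
	        p = strstr(end, "\r\n") + 2;        /* skip to chunk data */
	        if (size == 0) {                    /* zero-size chunk: done */
	            printf("end of body\n");
	            return;
	        }
	        fwrite(p, 1, (size_t) size, stdout); /* emit the chunk data */
	        p += size + 2;                       /* skip data and CRLF */
	    }
	}

	int main (void)
	{
	    decode_chunked("6\r\n<HTML>\r\n7\r\n</HTML>\r\n0\r\n\r\n");
	    return 0;
	}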

When I debug the following lines in HTMIME.c:

	/* Check if CL at all - thanks to jwei@hal.com (John Wei) */
	long cl = HTResponse_length(me->response);

cl comes back 0 during the failure.  It appears that the HTResponse length
is not being set correctly when there is no Content-Length header and there
is instead a hex-coded chunk length along with a Transfer-Encoding: chunked
header.

That seems like a scenario which should happen all the time, so perhaps the
SSL is messing something up?  Maybe not!

Any ideas appreciated.

thx
Fred Covely
BCF Technology
