These are problems known to exist at the time of this release. Feel free to
join in and help us correct one or more of these! Also be sure to check the
changelog of the current development status, as one or more of these problems
may have been fixed since this was written!
* If you use a very large number of file descriptors (more than FD_SETSIZE)
  and then use libcurl, it might crash in its use of select(), which then
  stores data out of bounds. Bug report #948950.
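  As a minimal sketch of why this can crash (illustrative code, not taken
  from libcurl): handing a descriptor numbered at or above FD_SETSIZE to a
  plain select() wait makes FD_SET() write past the end of the fd_set:

      #include <sys/select.h>

      /* FD_SET() indexes a fixed-size bitmap of FD_SETSIZE bits, so marking
         a descriptor numbered >= FD_SETSIZE writes outside the fd_set
         object, which is the out-of-bounds store described above. */
      void wait_readable(int sockfd)
      {
        fd_set readable;
        struct timeval timeout = { 1, 0 };

        FD_ZERO(&readable);
        FD_SET(sockfd, &readable);  /* undefined if sockfd >= FD_SETSIZE */
        select(sockfd + 1, &readable, NULL, NULL, &timeout);
      }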
* --limit-rate using -d or -F does not work. This is because the limit logic
  is provided by the curl app in its read/write callbacks, and when doing
  -d/-F the callbacks aren't used! Bug report #921395.
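  As a rough libcurl-level illustration of that distinction (hypothetical
  application code, not the curl tool's own source): a throttling read
  callback only takes effect for uploads that go through
  CURLOPT_READFUNCTION, while a -d style post hands its data to libcurl
  directly and never invokes the callback:

      #include <curl/curl.h>
      #include <stdio.h>

      /* Hypothetical throttling callback: an application can meter how
         fast it hands data to libcurl in here. */
      static size_t throttled_read(void *ptr, size_t size, size_t nmemb,
                                   void *userp)
      {
        /* ...sleep or otherwise limit the rate, then... */
        return fread(ptr, size, nmemb, (FILE *)userp);
      }

      int main(void)
      {
        CURL *curl = curl_easy_init();
        if(!curl)
          return 1;

        curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/");

        /* A -d style post: libcurl reads this buffer itself, so
           throttled_read() (and any limiting inside it) is never called. */
        curl_easy_setopt(curl, CURLOPT_POSTFIELDS, "name=value");
        curl_easy_setopt(curl, CURLOPT_READFUNCTION, throttled_read);

        curl_easy_perform(curl);
        curl_easy_cleanup(curl);
        return 0;
      }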
* Doing a resumed upload over HTTP does not work with '-C -', because curl
  doesn't do a HEAD first to get the initial size. For HTTP PUT resume to
  work, you need to find out the remote size manually and then use
  '-C [index]' with that offset.
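  A minimal sketch of that manual two-step approach, expressed at the
  libcurl level (hypothetical URL and file name, error handling omitted):
  first ask the server how much it already has, then resume from that
  offset, which mirrors doing a HEAD by hand and then using '-C [index]':

      #include <curl/curl.h>
      #include <stdio.h>

      int main(void)
      {
        CURL *curl = curl_easy_init();
        FILE *upload = fopen("file.bin", "rb");   /* assumed local file */
        double remote_size = 0.0;

        if(!curl || !upload)
          return 1;

        /* step 1: HEAD-style request to learn the current remote size */
        curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/file.bin");
        curl_easy_setopt(curl, CURLOPT_NOBODY, 1L);
        curl_easy_perform(curl);
        curl_easy_getinfo(curl, CURLINFO_CONTENT_LENGTH_DOWNLOAD,
                          &remote_size);

        /* step 2: PUT the rest, starting at the offset found above */
        curl_easy_setopt(curl, CURLOPT_NOBODY, 0L);
        curl_easy_setopt(curl, CURLOPT_UPLOAD, 1L);
        curl_easy_setopt(curl, CURLOPT_READDATA, upload);
        curl_easy_setopt(curl, CURLOPT_RESUME_FROM, (long)remote_size);
        curl_easy_perform(curl);

        curl_easy_cleanup(curl);
        fclose(upload);
        return 0;
      }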
* CURLOPT_USERPWD and CURLOPT_PROXYUSERPWD have no way of providing user names
  that contain a colon. This can't be fixed easily in a backwards compatible
  way without adding new options (and then, they should most probably allow
  setting user name and password separately).
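  To illustrate (hypothetical credentials): both options take a single
  "user:password" string, and the first colon is always taken as the
  separator, so a user name such as "domain:user" cannot be expressed:

      #include <curl/curl.h>

      int main(void)
      {
        CURL *curl = curl_easy_init();
        if(!curl)
          return 1;

        curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/");

        /* parsed as user "domain" with password "user:secret",
           not as user "domain:user" with password "secret" */
        curl_easy_setopt(curl, CURLOPT_USERPWD, "domain:user:secret");

        curl_easy_perform(curl);
        curl_easy_cleanup(curl);
        return 0;
      }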
* libcurl ignores empty path parts in FTP URLs, whereas RFC1738 states that
  such parts should be sent to the server as 'CWD ' (without an argument).
  The only exception to this rule is that we knowingly break it if the
  empty part is first in the path, as then we use the double slashes to
  indicate that the user wants to reach the root dir (this exception SHALL
  remain even when this bug is fixed).
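  As an illustration (hypothetical URL), a URL like

      ftp://example.com/first//second/file.txt

  contains an empty path part between 'first' and 'second'. Read strictly
  per RFC1738, the server would be sent

      CWD first
      CWD
      CWD second
      RETR file.txt

  while libcurl currently skips the argument-less CWD in the middle.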
* libcurl doesn't treat the content-length of compressed data properly, as
  it seems HTTP servers send the *uncompressed* length in that header and
  libcurl thinks of it as the *compressed* length. Some explanations are here:
  http://curl.haxx.se/mail/lib-2003-06/0146.html
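  A response of this (hypothetical) shape is what triggers the confusion:
  the header advertises the size of the original document rather than the
  number of compressed bytes that actually arrive on the wire:

      HTTP/1.1 200 OK
      Content-Encoding: gzip
      Content-Length: 10240       (uncompressed size, per the report above)

      [compressed body, fewer than 10240 bytes]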
* Downloading 0 (zero) byte files over FTP will not create a zero byte file
  locally, which is because libcurl doesn't call the write callback with zero
  bytes. Explained here: http://curl.haxx.se/mail/archive-2003-04/0143.html
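  A hypothetical application-side write callback shows why this matters: a
  common pattern is to open the output file the first time data arrives,
  and since libcurl never invokes the callback for a zero-byte FTP
  download, the fopen() never runs and no local file gets created. Opening
  the file before starting the transfer works around it.

      #include <curl/curl.h>
      #include <stdio.h>

      struct output {
        const char *name;
        FILE *file;
      };

      /* installed with CURLOPT_WRITEFUNCTION / CURLOPT_WRITEDATA */
      static size_t write_cb(void *ptr, size_t size, size_t nmemb,
                             void *userp)
      {
        struct output *out = (struct output *)userp;
        if(!out->file)
          out->file = fopen(out->name, "wb"); /* never reached for 0 bytes */
        return fwrite(ptr, size, nmemb, out->file);
      }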
* IPv6 support on AIX 4.3.3 doesn't work due to a missing sockaddr_storage
  struct. It has been reported to work on AIX 5.1 though.
* GOPHER transfers seem broken
* If an HTTP server responds to a HEAD request with a body (thus violating
  RFC2616), curl won't wait to read that body but just stops reading and
  returns. If a second request (let's assume a GET) is then immediately made
  to the same server, the connection will of course be re-used and the second
  request will be sent off fine, but when its response is to be read, curl
  will read the left-over body of the previous response instead, and havoc
  is what happens.

  More details on this are found in this libcurl mailing list thread:
  http://curl.haxx.se/mail/lib-2002-08/0000.html
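  A hypothetical sequence that triggers it: a HEAD request followed by a
  GET on the same handle, where the server (incorrectly) sent a body with
  the HEAD response and the connection gets re-used:

      #include <curl/curl.h>

      int main(void)
      {
        CURL *curl = curl_easy_init();
        if(!curl)
          return 1;

        curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/resource");

        /* first request: HEAD */
        curl_easy_setopt(curl, CURLOPT_NOBODY, 1L);
        curl_easy_perform(curl);

        /* second request: GET on the same handle, re-using the connection;
           its response read picks up the stale HEAD body instead */
        curl_easy_setopt(curl, CURLOPT_NOBODY, 0L);
        curl_easy_setopt(curl, CURLOPT_HTTPGET, 1L);
        curl_easy_perform(curl);

        curl_easy_cleanup(curl);
        return 0;
      }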