
Return HTTP 413 (Request Entity Too Large) when http.max_content_length exceeded #2902

Closed
dbertram opened this issue Apr 16, 2013 · 11 comments

Labels
:Core/Infra/REST API REST infrastructure and utilities >enhancement

Comments

@dbertram

Currently, Elasticsearch drops the connection if http.max_content_length is exceeded. While this is acceptable behavior per RFC 2616 (http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html#sec10.4.14), it's not particularly friendly to client libraries.

Depending on the library being used, it can be difficult to determine the exact size of the HTTP request prior to actually sending it. Additionally, when the connection is simply closed, the underlying cause of the problem remains ambiguous unless you also inspect the Elasticsearch logs.

Proposed change: add an option to return HTTP 413 when http.max_content_length is exceeded instead of just dropping the connection.

Steps to repro the current behavior:

  1. Set http.max_content_length = 1kb in elasticsearch.yml
  2. Create a test index:
    curl -XPUT 'http://localhost:9200/testindex/'
  3. Index a document that exceeds http.max_content_length:
    curl -v -XPUT 'http://localhost:9200/testindex/testtype/1' -d '{
    "message": "large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message large message"
    }'

Expected: HTTP status code 413 (Request Entity Too Large)

Actual: the connection is dropped client-side, and a TooLongFrameException appears in the Elasticsearch log

Here's the output from curl:

* About to connect() to localhost port 9200 (#0)
*   Trying 127.0.0.1... connected
* Connected to localhost (127.0.0.1) port 9200 (#0)
> PUT /testindex/testtype/1 HTTP/1.1
> User-Agent: curl/7.20.1 (i686-pc-cygwin) libcurl/7.20.1 OpenSSL/0.9.8o zlib/1.2.5 libidn/1.18 libssh2/1.2.5
> Host: localhost:9200
> Accept: */*
> Content-Length: 1026
> Content-Type: application/x-www-form-urlencoded
> Expect: 100-continue
>
< HTTP/1.1 100 Continue
* Empty reply from server
* Connection #0 to host localhost left intact
curl: (52) Empty reply from server
* Closing connection #0
@mike-wade

+1. This is causing me trouble at the moment (I have a specific list of indexes to search); specifically, the Pyes Python library raises a NoServerAvailable exception when this happens, which is not very helpful!

@spinscale
Contributor

Looks like a dupe of #2137 (or am I missing something?). You need to do some extra work to support this with Netty 3 IIRC, so maybe Netty 4 will help here.

@seti123

seti123 commented Feb 8, 2014

+1 Version 0.90.10

org.elasticsearch.common.netty.handler.codec.frame.TooLongFrameException: HTTP content length exceeded 104857600 bytes.

Update: I increased the limit in /etc/elasticsearch/elasticsearch.yml and it works again:

    # Set a custom allowed content length:
    # 
    http.max_content_length: 500mb

@denispeplin

An exception with the same name can also be caused by a large header; that should be fixed differently: #5665

@wflanagan

Is there a workaround for this? We are regularly getting this error on some larger documents.

@jasontedor
Member

Is there a workaround for this? We are regularly getting this error on some larger documents.

@wflanagan Did you increase http.max_content_length?

The issue here is that the connection is terminated without returning HTTP status code 413 when the configured max content length is exceeded (by the way, this is an underlying issue with Netty).

@karmi
Contributor

karmi commented Jun 9, 2016

I can confirm that the problem still exists, and it's painful for the clients, since there's no good way to properly handle the situation or propagate the error to the user (see the linked issue for the Ruby client). I've decreased the limit when launching Elasticsearch:

$ ./tmp/builds/elasticsearch-2.4.0-SNAPSHOT/bin/elasticsearch -D es.http.max_content_length=1kb

When I try to index a document via curl, I get back a 100 Continue response, which is really weird:

$ curl -v -X POST localhost:9200/test/test/1 -d @/Users/karmi/Contracts/Elasticsearch/Projects/BuildSystem/API/test/fixtures/builds_elasticsearch.json
* Hostname was NOT found in DNS cache
*   Trying ::1...
* Connected to localhost (::1) port 9200 (#0)
> POST /test/test/1 HTTP/1.1
> User-Agent: curl/7.37.1
> Host: localhost:9200
> Accept: */*
> Content-Length: 75680
> Content-Type: application/x-www-form-urlencoded
> Expect: 100-continue
> 
< HTTP/1.1 100 Continue
* Empty reply from server
* Connection #0 to host localhost left intact
curl: (52) Empty reply from server

This is the log output from Elasticsearch:

[2016-06-09 17:23:58,635][WARN ][http.netty               ] [Armor] Caught exception while handling client http traffic, closing connection [id: 0xc370c548, /0:0:0:0:0:0:0:1:62732 => /0:0:0:0:0:0:0:1:9200]
org.jboss.netty.handler.codec.frame.TooLongFrameException: HTTP content length exceeded 1024 bytes.
  at org.jboss.netty.handler.codec.http.HttpChunkAggregator.messageReceived(HttpChunkAggregator.java:169)

I don't know what options we have for handling the situation when somebody sends a too-big request, but I think we should try hard here to be correct and return a 413 Payload Too Large response.

@clintongormley
Contributor

Netty has added the ability to respond with a 413. See netty/netty#2211
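
For reference, here is a minimal Netty 4 pipeline sketch (illustrative only, not Elasticsearch's actual HTTP transport code; the class name and wiring are assumptions). In Netty 4, HttpObjectAggregator itself rejects oversized messages with 413 Request Entity Too Large (or 417 Expectation Failed when the client sent Expect: 100-continue) instead of silently closing the connection, so the server mostly just needs to place it in the pipeline with the configured limit:

    import io.netty.channel.ChannelInitializer;
    import io.netty.channel.socket.SocketChannel;
    import io.netty.handler.codec.http.HttpObjectAggregator;
    import io.netty.handler.codec.http.HttpServerCodec;

    // Hypothetical initializer, not Elasticsearch's real transport class.
    public class MaxContentLengthInitializer extends ChannelInitializer<SocketChannel> {

        private final int maxContentLength; // e.g. http.max_content_length converted to bytes

        public MaxContentLengthInitializer(int maxContentLength) {
            this.maxContentLength = maxContentLength;
        }

        @Override
        protected void initChannel(SocketChannel ch) {
            ch.pipeline()
              .addLast(new HttpServerCodec())
              // Aggregates chunks into full requests; when the aggregated body
              // exceeds maxContentLength, Netty 4 replies with 413 (or 417 for
              // "Expect: 100-continue" requests) rather than dropping the connection.
              .addLast(new HttpObjectAggregator(maxContentLength));
            // ... application handlers (e.g. a REST dispatcher) would follow here
        }
    }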

@jasontedor
Member

jasontedor commented Dec 16, 2016

With the upgrade to Netty 4, this is now handled correctly:

06:10:30 [jason:~] $ ~/elasticsearch/elasticsearch-6.0.0-alpha1-SNAPSHOT/bin/elasticsearch -E http.max_content_length=64m -d
06:10:32 [jason:~] $ dd if=/dev/zero of=zero bs=1024k count=128
128+0 records in
128+0 records out
134217728 bytes transferred in 0.199715 secs (672045843 bytes/sec)
06:10:42 [jason:~] $ curl -v -H "Expect:" -XPOST localhost:9200/ --data-binary @zero
Note: Unnecessary use of -X or --request, POST is already inferred.
*   Trying ::1...
* TCP_NODELAY set
* Connected to localhost (::1) port 9200 (#0)
> POST / HTTP/1.1
> Host: localhost:9200
> User-Agent: curl/7.51.0
> Accept: */*
> Content-Length: 134217728
> Content-Type: application/x-www-form-urlencoded
>
< HTTP/1.1 413 Request Entity Too Large
< content-length: 0
* HTTP error before end of send, stop sending
<
* Curl_http_done: called premature == 0
* Closing connection 0

Note that if you send an Expect: 100-continue header informing Elasticsearch that you would like to send a request with a Content-Length greater than http.max_content_length, Elasticsearch responds with 417 Expectation Failed, as the 100 Continue cannot be granted due to the excessive Content-Length:

06:10:49 [jason:~] $ curl -v -XPOST localhost:9200/ --data-binary @zero
Note: Unnecessary use of -X or --request, POST is already inferred.
*   Trying ::1...
* TCP_NODELAY set
* Connected to localhost (::1) port 9200 (#0)
> POST / HTTP/1.1
> Host: localhost:9200
> User-Agent: curl/7.51.0
> Accept: */*
> Content-Length: 134217728
> Content-Type: application/x-www-form-urlencoded
> Expect: 100-continue
>
< HTTP/1.1 417 Expectation Failed
< content-length: 0
* HTTP error before end of send, stop sending
<
* Curl_http_done: called premature == 0
* Closing connection 0

@jasontedor
Member

Closed by #19526

@jasontedor removed the help wanted adoptme label Dec 16, 2016
@jkryanchou

+1
