Re: [jetty-dev] Help on Connection Management

Hi,

On Wed, Sep 7, 2016 at 11:09 AM, Prakash Nandihal <p.nandihal@xxxxxxxxx> wrote:
> I am doing a POC on the Jetty high-level HTTP/2 client; I am using Jetty
> version 9.3.7.v20160115. Below is my code snippet.
>
> I am facing an issue with the Jetty high-level API on connection management.
> Say I have to post 100000 requests to a destination URL, fetching data
> from the DB one by one. The default value of maxRequestsQueuedPerDestination
> in HttpClient is 1024; does it mean one connection can handle only 1024
> requests? After processing 1024 requests, do we have to create a new
> connection for the remaining requests, or is there another way to handle
> connections? Also, please explain the reason behind allowing only 1024
> requests per connection.

maxRequestsQueuedPerDestination is the maximum number of requests that
may be queued on a destination.

In HTTP/1.1, a destination holds a pool of connections (controlled by
maxConnectionsPerDestination, by default 64).
So if you send requests in a tight loop, assuming connection
establishment takes zero time, you can have at most 64 (on the
network) + 1024 (queued) outstanding requests. The next one will be
rejected.
The reason to limit the number of queued requests is to avoid memory exhaustion.
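The interaction of the two limits above can be sketched with a toy model
(this is not Jetty code; the names and numbers simply mirror Jetty's
HTTP/1.1 defaults):

```java
// Toy model of an HTTP/1.1 destination: up to 64 requests on pooled
// connections, up to 1024 more parked in the destination queue.
public class DestinationLimits {
    static final int MAX_CONNECTIONS_PER_DESTINATION = 64;
    static final int MAX_REQUESTS_QUEUED_PER_DESTINATION = 1024;

    int inFlight;
    int queued;

    /** Returns false when the request would be rejected. */
    boolean offer() {
        if (inFlight < MAX_CONNECTIONS_PER_DESTINATION) {
            inFlight++;   // a pooled connection is available
            return true;
        }
        if (queued < MAX_REQUESTS_QUEUED_PER_DESTINATION) {
            queued++;     // park the request in the destination queue
            return true;
        }
        return false;     // queue full: Jetty rejects the request
    }

    public static void main(String[] args) {
        DestinationLimits destination = new DestinationLimits();
        int accepted = 0;
        while (destination.offer())
            accepted++;
        // 64 on the network + 1024 queued = 1088 outstanding requests
        System.out.println(accepted);
    }
}
```

Running this prints 1088: request number 1089, sent from a tight loop
with zero-time connection establishment, is the first to be rejected.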

In HTTP/2, in Jetty 9.3.x, things are a bit different, since a
destination only has one connection (in Jetty 9.4.x a destination has
a connection pool also for HTTP/2).
There is also a limit on how many requests you can send on that
connection (it defaults to 1024 as well, see
MultiplexHttpDestination.maxRequestsPerConnection), and this number is
controlled by the server.
While clients may change this value, servers are prepared to accept
only a limited number of requests per connection.
Servers communicate this number to clients, and past that number they
will start to reject requests (this is defined in the HTTP/2 spec as
MAX_CONCURRENT_STREAMS).
Clients should honor the server setting (and Jetty does).

Now, if you really want to bomb a server you control with a large
number of requests, then configure the server to support a large
number of concurrent streams, which will be automatically reflected in
the client as the max number of requests per connection.
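As a sketch of that server-side configuration, assuming a Jetty 9.x
server you control (cleartext h2c is used here for brevity; the port
number and stream limit are illustrative values, not recommendations):

```java
import org.eclipse.jetty.http2.server.HTTP2CServerConnectionFactory;
import org.eclipse.jetty.server.HttpConfiguration;
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.server.ServerConnector;

public class H2Server {
    public static void main(String[] args) throws Exception {
        Server server = new Server();
        HttpConfiguration httpConfig = new HttpConfiguration();
        HTTP2CServerConnectionFactory h2c = new HTTP2CServerConnectionFactory(httpConfig);
        // Advertised to clients as SETTINGS_MAX_CONCURRENT_STREAMS;
        // Jetty's HttpClient mirrors it as maxRequestsPerConnection.
        h2c.setMaxConcurrentStreams(8192);
        ServerConnector connector = new ServerConnector(server, h2c);
        connector.setPort(8080);
        server.addConnector(connector);
        server.start();
    }
}
```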

If you use Jetty 9.4.x, then Jetty will open new connections as
required (up to maxConnectionsPerDestination), since it has a
connection pool that Jetty 9.3.x lacks, so you can use a smaller
number of max concurrent streams but open more connections.

A final note: opening a connection may take enough time for a
tight loop to fill the destination queue in the meantime.
You probably need to "prime" the connection with a first request, and
then you can have your tight loop.
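The "prime, then loop" pattern could look like this (a sketch against
the Jetty 9.x client API; the URL is hypothetical and error handling is
omitted):

```java
import org.eclipse.jetty.client.HttpClient;

public class Primer {
    public static void main(String[] args) throws Exception {
        HttpClient httpClient = new HttpClient();
        httpClient.start();

        String url = "http://localhost:8080/data";

        // First request is blocking: it establishes (and pools) the
        // connection, so the tight loop below does not race connection
        // establishment and overflow the destination queue.
        httpClient.newRequest(url).send();

        for (int i = 0; i < 100_000; i++) {
            httpClient.newRequest(url).send(result -> {
                // Handle each response asynchronously.
            });
        }
    }
}
```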

-- 
Simone Bordet
----
http://cometd.org
http://webtide.com
Developer advice, training, services and support
from the Jetty & CometD experts.

