This version (2017/05/27 13:44) is a draft.
Approvals: 0/1

[19:48:43] <eris0xff> hi! new vertx user here. a couple of comments: 1) The simple example for unit testing doesn't start up a server on http.port. I explicitly started an HTTP server and it started working. 2) My first test web microservice running on Linux in a docker container gets around 15k requests per second just doing a basic text response. Is there a way to set up the vertx web module to achieve the 1M requests per second in the published tests?
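The fix eris0xff describes in point 1) — explicitly starting an HTTP server before the test issues requests — can be sketched generically. The actual Vert.x test code isn't shown in the log, so this minimal sketch uses the JDK's built-in com.sun.net.httpserver.HttpServer as a stand-in; the class name, route, and "hello" response body are illustrative assumptions, not taken from the conversation.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.URL;

public class HttpTestSketch {

    // Explicitly start an HTTP server before any request is made --
    // the step the example test was missing. Port 0 asks the OS for
    // an ephemeral port, which avoids collisions in test runs.
    static HttpServer startServer() throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/", exchange -> {
            byte[] body = "hello".getBytes();
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        return server;
    }

    // Simple blocking GET, standing in for the request the test makes.
    static String get(String url) throws Exception {
        try (InputStream in = new URL(url).openStream()) {
            return new String(in.readAllBytes());
        }
    }

    public static void main(String[] args) throws Exception {
        HttpServer server = startServer();
        int port = server.getAddress().getPort();
        String body = get("http://localhost:" + port + "/");
        server.stop(0);
        System.out.println(body);
    }
}
```

In Vert.x the same pattern would mean deploying the verticle (or calling `vertx.createHttpServer()...listen(...)`) in the test's setup phase and waiting for the listen callback before asserting against the port.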

[19:50:06] <eris0xff> (also, very cool library so thanks. I'm coming from libuv, so I'm used to the reactive model. lots of great functionality built into vertx)

[20:06:56] <eris0xff> (oh also… i noticed that on the latest techempower benchmarks, netty is now benchmarking about 3x faster than vertx. is there a new version of netty that has higher performance or did they just change how the benchmark runs?)