Echo test is very slow at 60ms for 99% between 2 powerful VM: #54

Open
Gipxy opened this issue Nov 1, 2024 · 28 comments

Comments
@Gipxy

Gipxy commented Nov 1, 2024

Hi, I ran the echo benchmark between 2 VMs and the result is not good, 10-60ms. Can you advise?

12034.047 0.400000000000 44011 1.67
15859.711 0.500000000000 55006 2.00
17891.327 0.550000000000 60524
....
53084.159 0.978125000000 107599 45.71
55345.151 0.981250000000 107942 53.33
...
64258.047 0.989062500000 108797 91.43
66486.271 0.990625000000 108969 106.67
...
#[Mean = 18838.771, StdDeviation = 14477.590]
#[Max = 95092.735, Total count = 110000]
#[Buckets = 32, SubBuckets = 2048]

Environment details:

  • Network: a ping test between the two servers gives 99% at 800us and 90% at 600us
  • Hardware: powerful VMs with 8 cores and 64GB RAM, almost idle while the perf test runs
  • Java: Zulu OpenJDK 17.0.5
  • Run commands:
    #-- server
    JVM_OPTS="
    -Duk.co.real_logic.benchmarks.remote.output.directory=./server-output
    -Duk.co.real_logic.benchmarks.aeron.remote.embedded.media.driver=true
    -Duk.co.real_logic.benchmarks.aeron.remote.source.channel=aeron:udp?endpoint=10.52.19.65:13000
    -Duk.co.real_logic.benchmarks.aeron.remote.destination.channel=aeron:udp?endpoint=10.52.19.5:13001" aeron/echo-server

#-- client

JVM_OPTS="
-Duk.co.real_logic.benchmarks.aeron.remote.embedded.media.driver=true
-Duk.co.real_logic.benchmarks.aeron.remote.source.channel=aeron:udp?endpoint=10.52.19.65:13000
-Duk.co.real_logic.benchmarks.aeron.remote.destination.channel=aeron:udp?endpoint=10.52.19.5:13001"
./benchmark-runner --output-file "aeron-echo-test" --messages "100K" --message-length "288" --iterations 10 "aeron/echo-client"

Note: I had to change the number of messages from 501K to 10K, because 501K reports message loss.

Thanks

@vyazelenko
Contributor

If you can post the AeronStat output from both nodes and run with -Daeron.print.configuration=true, then we can have a look.

Also make sure that you test a UDP connection between the nodes (not just ping), i.e. run something like iperf3.
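A minimal iperf3 UDP check between the two hosts could look like the following sketch (the port, target bandwidth, and duration are illustrative; the 288-byte datagram size mirrors the benchmark message length):

# on 10.52.19.65 (receiver)
iperf3 -s -p 5201

# on 10.52.19.5 (sender): UDP, 288-byte datagrams at ~10 Mbit/s for 30 seconds;
# the final summary reports packet loss and jitter
iperf3 -c 10.52.19.65 -p 5201 -u -l 288 -b 10M -t 30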

@Gipxy
Author

Gipxy commented Nov 1, 2024

Hi Vyazelenko, I don't have iperf3; I will create simple Java UDP/TCP socket echo apps and provide a benchmark. Here are the Aeron stats:

Client AeronStat:

CnC version: 0.2.0
PID: 566350
Start time: 2024-11-01 15:56:58.644+0800
================================================================
  0:           47,954,528 - Bytes sent
  1:           47,954,120 - Bytes received
  2:                    0 - Failed offers to ReceiverProxy
  3:                    0 - Failed offers to SenderProxy
  4:                    0 - Failed offers to DriverConductorProxy
  5:                    6 - NAKs sent
  6:                   32 - NAKs received
  7:                  121 - Status Messages sent
  8:                  125 - Status Messages received
  9:                    3 - Heartbeats sent
 10:                    3 - Heartbeats received
 11:                   29 - Retransmits sent
 12:                    0 - Flow control under runs
 13:                    0 - Flow control over runs
 14:                    0 - Invalid packets
 15:                    0 - Errors: version=1.46.6 commit=187a1ec2d7
 16:                    0 - Short sends
 17:                    0 - Failed attempts to free log buffers
 18:                    0 - Sender flow control limits, i.e. back-pressure events
 19:                    0 - Unblocked Publications
 20:                    0 - Unblocked Control Commands
 21:                    0 - Possible TTL Asymmetry
 22:                    0 - ControllableIdleStrategy status
 23:                    0 - Loss gap fills
 24:                    0 - Client liveness timeouts
 25:                    0 - Resolution changes: driverName=null
 26:           77,038,003 - Conductor max cycle time doing its work in ns: DEDICATED
 27:                    0 - Conductor work cycle exceeded threshold count: threshold=1000000000ns DEDICATED
 28:           29,010,153 - Sender max cycle time doing its work in ns: DEDICATED
 29:                    0 - Sender work cycle exceeded threshold count: threshold=1000000000ns DEDICATED
 30:           44,143,664 - Receiver max cycle time doing its work in ns: DEDICATED
 31:                    0 - Receiver work cycle exceeded threshold count: threshold=1000000000ns DEDICATED
 32:               23,995 - NameResolver max time in ns
 33:                    0 - NameResolver exceeded threshold count
 34:               77,318 - Aeron software: version=1.46.6 commit=187a1ec2d7
 35:          648,032,256 - Bytes currently mapped
 36:               52,160 - Retransmitted bytes
 37:                    0 - Retransmit Pool Overflow count
 38:                    1 - snd-channel: aeron:udp?endpoint=10.52.19.5:13001 10.52.19.65:36314
 39:                    1 - snd-local-sockaddr: 38 10.52.19.65:36314
 40:           47,954,432 - pub-pos (sampled): 2 177056341 77777 aeron:udp?endpoint=10.52.19.5:13001
 41:           56,343,040 - pub-lmt: 2 177056341 77777 aeron:udp?endpoint=10.52.19.5:13001
 42:           47,954,432 - snd-pos: 2 177056341 77777 aeron:udp?endpoint=10.52.19.5:13001
 43:           49,895,424 - snd-lmt: 2 177056341 77777 aeron:udp?endpoint=10.52.19.5:13001
 44:                    0 - snd-bpe: 2 177056341 77777 aeron:udp?endpoint=10.52.19.5:13001
 45:    1,730,447,839,143 - client-heartbeat: id=1 name= version=1.46.6 commit=187a1ec2d7
 46:                    1 - rcv-channel: aeron:udp?endpoint=10.52.19.65:13000 10.52.19.65:13000
 47:                    1 - rcv-local-sockaddr: 46 10.52.19.65:13000
 48:           47,954,432 - sub-pos: 3 -1625805811 55555 aeron:udp?endpoint=10.52.19.65:13000 @0
 49:           47,954,432 - rcv-hwm: 4 -1625805811 55555 aeron:udp?endpoint=10.52.19.65:13000
 50:           47,954,432 - rcv-pos: 4 -1625805811 55555 aeron:udp?endpoint=10.52.19.65:13000

Server AeronStat:

CnC version: 0.2.0
PID: 3655188
Start time: 2024-11-01 15:56:30.568+0800
================================================================
  0:           47,958,112 - Bytes sent
  1:           47,954,152 - Bytes received
  2:                    0 - Failed offers to ReceiverProxy
  3:                    0 - Failed offers to SenderProxy
  4:                    0 - Failed offers to DriverConductorProxy
  5:                   32 - NAKs sent
  6:                    6 - NAKs received
  7:                  176 - Status Messages sent
  8:                  121 - Status Messages received
  9:                  115 - Heartbeats sent
 10:                    4 - Heartbeats received
 11:                    6 - Retransmits sent
 12:                    0 - Flow control under runs
 13:                    0 - Flow control over runs
 14:                    0 - Invalid packets
 15:                    0 - Errors: version=1.46.6 commit=187a1ec2d7
 16:                    0 - Short sends
 17:                    0 - Failed attempts to free log buffers
 18:                    0 - Sender flow control limits, i.e. back-pressure events
 19:                    0 - Unblocked Publications
 20:                    0 - Unblocked Control Commands
 21:                    0 - Possible TTL Asymmetry
 22:                    0 - ControllableIdleStrategy status
 23:                    0 - Loss gap fills
 24:                    0 - Client liveness timeouts
 25:                    0 - Resolution changes: driverName=null
 26:           17,850,419 - Conductor max cycle time doing its work in ns: DEDICATED
 27:                    0 - Conductor work cycle exceeded threshold count: threshold=1000000000ns DEDICATED
 28:           13,386,220 - Sender max cycle time doing its work in ns: DEDICATED
 29:                    0 - Sender work cycle exceeded threshold count: threshold=1000000000ns DEDICATED
 30:            5,730,823 - Receiver max cycle time doing its work in ns: DEDICATED
 31:                    0 - Receiver work cycle exceeded threshold count: threshold=1000000000ns DEDICATED
 32:               28,874 - NameResolver max time in ns
 33:                    0 - NameResolver exceeded threshold count
 34:               77,318 - Aeron software: version=1.46.6 commit=187a1ec2d7
 35:          648,032,256 - Bytes currently mapped
 36:               23,040 - Retransmitted bytes
 37:                    0 - Retransmit Pool Overflow count
 38:                    1 - snd-channel: aeron:udp?endpoint=10.52.19.65:13000 10.52.19.5:37841
 39:                    1 - snd-local-sockaddr: 38 10.52.19.5:37841
 40:           47,954,432 - pub-pos (sampled): 2 -1625805811 55555 aeron:udp?endpoint=10.52.19.65:13000
 41:           47,954,432 - pub-lmt: 2 -1625805811 55555 aeron:udp?endpoint=10.52.19.65:13000
 42:           47,954,432 - snd-pos: 2 -1625805811 55555 aeron:udp?endpoint=10.52.19.65:13000
 43:           49,576,384 - snd-lmt: 2 -1625805811 55555 aeron:udp?endpoint=10.52.19.65:13000
 44:                    0 - snd-bpe: 2 -1625805811 55555 aeron:udp?endpoint=10.52.19.65:13000
 45:    1,730,447,850,352 - client-heartbeat: id=1 name= version=1.46.6 commit=187a1ec2d7
 46:                    1 - rcv-channel: aeron:udp?endpoint=10.52.19.5:13001 10.52.19.5:13001
 47:                    1 - rcv-local-sockaddr: 46 10.52.19.5:13001
 48:           47,954,432 - sub-pos: 3 177056341 77777 aeron:udp?endpoint=10.52.19.5:13001 @0
 49:           47,954,432 - rcv-hwm: 4 177056341 77777 aeron:udp?endpoint=10.52.19.5:13001
 50:           47,954,432 - rcv-pos: 4 177056341 77777 aeron:udp?endpoint=10.52.19.5:13001

@vyazelenko
Contributor

@Gipxy iperf3 provides everything out of the box, so there is no need to write any code. Plus, it is a reliable tool that can be trusted.

From the provided AeronStat output I can see that there is packet loss (i.e. the NAKs sent/received counters are not zero). Try running your test at lower message rates: 1K, 10K, 100K, and see if the loss disappears and the latency numbers improve.

You also haven't shared the configuration output from running with -Daeron.print.configuration=true.
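For example, a lower-rate run could reuse the original client invocation with only the --messages value changed (illustrative; everything else stays the same):

# same client command as before, but at a 1K message rate
JVM_OPTS="
-Duk.co.real_logic.benchmarks.aeron.remote.embedded.media.driver=true
-Duk.co.real_logic.benchmarks.aeron.remote.source.channel=aeron:udp?endpoint=10.52.19.65:13000
-Duk.co.real_logic.benchmarks.aeron.remote.destination.channel=aeron:udp?endpoint=10.52.19.5:13001"
./benchmark-runner --output-file "aeron-echo-test-1K" --messages "1K" --message-length "288" --iterations 10 "aeron/echo-client"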

@Gipxy
Author

Gipxy commented Nov 1, 2024

Yeah, but I don't have admin permission on that server, so I can't install it.
A simple Java UDP socket echo test sending 1 byte gives: 99% = 1640us, 90% = 703us, avg = 438us.
A TCP socket echo test shows similar results.

@Gipxy
Author

Gipxy commented Nov 1, 2024

Server log:

[thanhhoailuong@x01sfxolapp1a scripts]$ JVM_OPTS="\
> -Duk.co.real_logic.benchmarks.remote.output.directory=./server-output \
> -Duk.co.real_logic.benchmarks.aeron.remote.embedded.media.driver=true \
> -Duk.co.real_logic.benchmarks.aeron.remote.source.channel=aeron:udp?endpoint=10.52.19.65:13000 \
> -Daeron.print.configuration=true \
> -Duk.co.real_logic.benchmarks.aeron.remote.destination.channel=aeron:udp?endpoint=10.52.19.5:13001" aeron/echo-server
starting aeron-server...
+++ dirname aeron/echo-server
++ cd aeron
++ pwd
+ DIR=/var/tmp/runtime/scripts/aeron
++ basename aeron/echo-server .sh
+ PROCESS_FILE_NAME=echo-server
+ export PROCESS_FILE_NAME
+ exec /var/tmp/runtime/scripts/aeron/../run-java uk.co.real_logic.benchmarks.aeron.remote.EchoNode /var/tmp/runtime/scripts/aeron/low-latency-driver.properties /var/tmp/runtime/scripts/aeron/benchmark.properties
+++ dirname /var/tmp/runtime/scripts/aeron/../run-java
++ cd /var/tmp/runtime/scripts/aeron/..
++ pwd
+ DIR=/var/tmp/runtime/scripts
+ class_name=EchoNode
+ log_file_name=echo-server
+ '[' -d /dev/shm ']'
+ gc_log_file=/dev/shm/echo-server-gc.log
+ error_log_file=/dev/shm/echo-server-crash.log
+ java_options=(-Djava.net.preferIPv4Stack=true -Djava.net.preferIPv6Addresses=false -XX:+UnlockExperimentalVMOptions -XX:+TrustFinalNonStaticFields -XX:+UnlockDiagnosticVMOptions -XX:+DebugNonSafepoints -XX:+UseParallelGC -XX:ErrorFile="${error_log_file}" -Xms4G -Xmx4G -XX:+AlwaysPreTouch -XX:MaxMetaspaceSize=1G -XX:ReservedCodeCacheSize=1G -XX:+EnableDynamicAgentLoading "-Xlog:gc*,safepoint=info,arguments=info:file=${gc_log_file}:time")
+ add_opens=(--add-opens java.base/java.util.zip=ALL-UNNAMED)
+ jvm_opts=()
+ [[ -v JVM_OPTS ]]
+ [[ -n -Duk.co.real_logic.benchmarks.remote.output.directory=./server-output -Duk.co.real_logic.benchmarks.aeron.remote.embedded.media.driver=true -Duk.co.real_logic.benchmarks.aeron.remote.source.channel=aeron:udp?endpoint=10.52.19.65:13000 -Daeron.print.configuration=true -Duk.co.real_logic.benchmarks.aeron.remote.destination.channel=aeron:udp?endpoint=10.52.19.5:13001 ]]
+ IFS=' '
+ read -ra jvm_opts
+ unset IFS
+ exec /bin/java -cp /var/tmp/runtime/scripts/../benchmarks-all/build/libs/benchmarks.jar -Djava.net.preferIPv4Stack=true -Djava.net.preferIPv6Addresses=false -XX:+UnlockExperimentalVMOptions -XX:+TrustFinalNonStaticFields -XX:+UnlockDiagnosticVMOptions -XX:+DebugNonSafepoints -XX:+UseParallelGC -XX:ErrorFile=/dev/shm/echo-server-crash.log -Xms4G -Xmx4G -XX:+AlwaysPreTouch -XX:MaxMetaspaceSize=1G -XX:ReservedCodeCacheSize=1G -XX:+EnableDynamicAgentLoading '-Xlog:gc*,safepoint=info,arguments=info:file=/dev/shm/echo-server-gc.log:time' --add-opens java.base/java.util.zip=ALL-UNNAMED -Duk.co.real_logic.benchmarks.remote.output.directory=./server-output -Duk.co.real_logic.benchmarks.aeron.remote.embedded.media.driver=true '-Duk.co.real_logic.benchmarks.aeron.remote.source.channel=aeron:udp?endpoint=10.52.19.65:13000' -Daeron.print.configuration=true '-Duk.co.real_logic.benchmarks.aeron.remote.destination.channel=aeron:udp?endpoint=10.52.19.5:13001' uk.co.real_logic.benchmarks.aeron.remote.EchoNode /var/tmp/runtime/scripts/aeron/low-latency-driver.properties /var/tmp/runtime/scripts/aeron/benchmark.properties
WARNING: Could not set desired SO_SNDBUF, adjust OS to allow aeron.socket.so_sndbuf attempted=2097152, actual=212992
WARNING: Could not set desired SO_RCVBUF, adjust OS to allow aeron.socket.so_sndbuf attempted=2097152, actual=212992
MediaDriver.Context
{
    isConcluded=true
    isClosed=false
    cncVersion=0.2.0
    aeronDirectory=/dev/shm/aeron
    enabledExperimentalFeatures=false
    aeronDirectoryName='/dev/shm/aeron'
    cncFile=/dev/shm/aeron/cnc.dat
    countersMetaDataBuffer=UnsafeBuffer{addressOffset=140513615279104, capacity=419430400, byteArray=null, byteBuffer=java.nio.DirectByteBuffer[pos=0 lim=536875008 cap=536875008]}
    countersValuesBuffer=UnsafeBuffer{addressOffset=140514034709504, capacity=104857600, byteArray=null, byteBuffer=java.nio.DirectByteBuffer[pos=0 lim=536875008 cap=536875008]}
    driverTimeoutMs=30000
    printConfigurationOnStart=true
    useWindowsHighResTimer=true
    warnIfDirectoryExists=false
    dirDeleteOnStart=true
    dirDeleteOnShutdown=true
    termBufferSparseFile=true
    performStorageChecks=false
    spiesSimulateConnection=true
    reliableStream=true
    tetherSubscriptions=true
    rejoinStream=true
    receiverGroupConsideration=INFER
    conductorBufferLength=1049344
    toClientsBufferLength=1048704
    counterValuesBufferLength=104857600
    errorBufferLength=10485760
    lowStorageWarningThreshold=167772160
    timerIntervalNs=1000000000
    clientLivenessTimeoutNs=10000000000
    imageLivenessTimeoutNs=10000000000
    publicationUnblockTimeoutNs=15000000000
    publicationConnectionTimeoutNs=5000000000
    publicationLingerTimeoutNs=5000000000
    untetheredWindowLimitTimeoutNs=5000000000
    untetheredRestingTimeoutNs=10000000000
    retransmitUnicastDelayNs=0
    retransmitUnicastLingerNs=10000000
    nakUnicastDelayNs=100000
    nakMulticastMaxBackoffNs=10000000
    nakMulticastGroupSize=10
    statusMessageTimeoutNs=200000000
    counterFreeToReuseTimeoutNs=1000000000
    conductorCycleThresholdNs=1000000000
    senderCycleThresholdNs=1000000000
    receiverCycleThresholdNs=1000000000
    nameResolverThresholdNs=5000000000
    publicationTermBufferLength=16777216
    ipcTermBufferLength=67108864
    publicationTermWindowLength=0
    ipcPublicationTermWindowLength=0
    initialWindowLength=2097152
    socketSndbufLength=2097152
    socketRcvbufLength=2097152
    socketMulticastTtl=0
    mtuLength=1408
    ipcMtuLength=1408
    filePageSize=4096
    publicationReservedSessionIdLow=-1
    publicationReservedSessionIdHigh=1000
    lossReportBufferLength=10485760
    epochClock=org.agrona.concurrent.SystemEpochClock@4f970963
    nanoClock=org.agrona.concurrent.SystemNanoClock@61f8bee4
    cachedEpochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0
    cachedNanoClock=org.agrona.concurrent.CachedNanoClock@887af79
    threadingMode=DEDICATED
    conductorThreadFactory=io.aeron.driver.MediaDriver$Context$$Lambda$41/0x0000000801028648@7fac631b
    senderThreadFactory=io.aeron.driver.MediaDriver$Context$$Lambda$42/0x0000000801028870@5b87ed94
    receiverThreadFactory=io.aeron.driver.MediaDriver$Context$$Lambda$43/0x0000000801028a98@6e0e048a
    sharedThreadFactory=null
    sharedNetworkThreadFactory=null
    conductorIdleStrategy=NoOpIdleStrategy{alias=noop}
    senderIdleStrategy=NoOpIdleStrategy{alias=noop}
    receiverIdleStrategy=NoOpIdleStrategy{alias=noop}
    sharedNetworkIdleStrategy=null
    sharedIdleStrategy=null
    sendChannelEndpointSupplier=io.aeron.driver.DefaultSendChannelEndpointSupplier@5bc79255
    receiveChannelEndpointSupplier=io.aeron.driver.DefaultReceiveChannelEndpointSupplier@47ef968d
    receiveChannelEndpointThreadLocals=ReceiveChannelEndpointThreadLocals{nextReceiverId=-4713806441138890018}
    tempBuffer=UnsafeBuffer{addressOffset=16, capacity=512, byteArray=byte[512], byteBuffer=null}
    unicastFlowControlSupplier=DefaultUnicastFlowControlSupplier{flowControlClass=io.aeron.driver.UnicastFlowControl}
    multicastFlowControlSupplier=DefaultMulticastFlowControlSupplier{flowControlClass=io.aeron.driver.MaxMulticastFlowControl}
    applicationSpecificFeedback=[]
    receiverGroupTag=null
    flowControlGroupTag=-1
    flowControlGroupMinSize=0
    flowControlReceiverTimeoutNs=5000000000
    reResolutionCheckIntervalNs=1000000000
    receiverGroupConsideration=INFER
    congestionControlSupplier=io.aeron.driver.DefaultCongestionControlSupplier@305fd85d
    terminationValidator=io.aeron.driver.DefaultDenyTerminationValidator@458c1321
    terminationHook=null
    nameResolver=io.aeron.driver.DefaultNameResolver@11438d26
    resolverName='null'
    resolverInterface='null'
    resolverBootstrapNeighbor='null'
    sendToStatusMessagePollRatio=6
    unicastFeedbackDelayGenerator=StaticDelayGenerator{delayInNs=100000, retryDelayNs=10000000}
    multicastFeedbackDelayGenerator=OptimalMulticastDelayGenerator{randMax=3.302585092994046E-7, baseX=1.2613558468370601E-8, constantT=3027931.0656411387, factorT=7.927976886994827E7, shouldFeedbackImmediately=false}
    retransmitUnicastDelayGenerator=StaticDelayGenerator{delayInNs=0, retryDelayNs=0}
    retransmitUnicastLingerGenerator=StaticDelayGenerator{delayInNs=10000000, retryDelayNs=10000000}
    errorLog=org.agrona.concurrent.errors.DistinctErrorLog@86be70a
    errorHandler=org.agrona.concurrent.errors.LoggingErrorHandler@480bdb19
    useConcurrentCountersManager=false
    countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}
    systemCounters=SystemCounters{counterByDescriptorMap={BYTES_SENT=AtomicCounter{isClosed=false, id=0, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, BYTES_RECEIVED=AtomicCounter{isClosed=false, id=1, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, RECEIVER_PROXY_FAILS=AtomicCounter{isClosed=false, id=2, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, SENDER_PROXY_FAILS=AtomicCounter{isClosed=false, id=3, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, CONDUCTOR_PROXY_FAILS=AtomicCounter{isClosed=false, id=4, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, NAK_MESSAGES_SENT=AtomicCounter{isClosed=false, id=5, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, NAK_MESSAGES_RECEIVED=AtomicCounter{isClosed=false, id=6, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, STATUS_MESSAGES_SENT=AtomicCounter{isClosed=false, id=7, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, STATUS_MESSAGES_RECEIVED=AtomicCounter{isClosed=false, id=8, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, HEARTBEATS_SENT=AtomicCounter{isClosed=false, id=9, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, HEARTBEATS_RECEIVED=AtomicCounter{isClosed=false, id=10, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, RETRANSMITS_SENT=AtomicCounter{isClosed=false, id=11, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, FLOW_CONTROL_UNDER_RUNS=AtomicCounter{isClosed=false, id=12, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, FLOW_CONTROL_OVER_RUNS=AtomicCounter{isClosed=false, id=13, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, INVALID_PACKETS=AtomicCounter{isClosed=false, id=14, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, ERRORS=AtomicCounter{isClosed=false, id=15, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, 
SHORT_SENDS=AtomicCounter{isClosed=false, id=16, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, FREE_FAILS=AtomicCounter{isClosed=false, id=17, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, SENDER_FLOW_CONTROL_LIMITS=AtomicCounter{isClosed=false, id=18, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, UNBLOCKED_PUBLICATIONS=AtomicCounter{isClosed=false, id=19, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, UNBLOCKED_COMMANDS=AtomicCounter{isClosed=false, id=20, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, POSSIBLE_TTL_ASYMMETRY=AtomicCounter{isClosed=false, id=21, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, CONTROLLABLE_IDLE_STRATEGY=AtomicCounter{isClosed=false, id=22, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, LOSS_GAP_FILLS=AtomicCounter{isClosed=false, id=23, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, CLIENT_TIMEOUTS=AtomicCounter{isClosed=false, id=24, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, RESOLUTION_CHANGES=AtomicCounter{isClosed=false, id=25, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, CONDUCTOR_MAX_CYCLE_TIME=AtomicCounter{isClosed=false, id=26, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, CONDUCTOR_CYCLE_TIME_THRESHOLD_EXCEEDED=AtomicCounter{isClosed=false, id=27, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, SENDER_MAX_CYCLE_TIME=AtomicCounter{isClosed=false, id=28, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, SENDER_CYCLE_TIME_THRESHOLD_EXCEEDED=AtomicCounter{isClosed=false, id=29, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, RECEIVER_MAX_CYCLE_TIME=AtomicCounter{isClosed=false, id=30, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, RECEIVER_CYCLE_TIME_THRESHOLD_EXCEEDED=AtomicCounter{isClosed=false, id=31, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], 
epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, NAME_RESOLVER_MAX_TIME=AtomicCounter{isClosed=false, id=32, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, NAME_RESOLVER_TIME_THRESHOLD_EXCEEDED=AtomicCounter{isClosed=false, id=33, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, AERON_VERSION=AtomicCounter{isClosed=false, id=34, value=77318, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, BYTES_CURRENTLY_MAPPED=AtomicCounter{isClosed=false, id=35, value=547360768, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, RETRANSMITTED_BYTES=AtomicCounter{isClosed=false, id=36, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, RETRANSMIT_OVERFLOW=AtomicCounter{isClosed=false, id=37, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}}}
    lossReport=LossReport{nextRecordOffset=0, buffer=UnsafeBuffer{addressOffset=140515574820864, capacity=10485760, byteArray=null, byteBuffer=java.nio.DirectByteBuffer[pos=0 lim=10485760 cap=10485760]}}
    logFactory=io.aeron.driver.buffer.FileStoreLogFactory@26a7b76d
    dataTransportPoller=DataTransportPoller{}
    controlTransportPoller=ControlTransportPoller{}
    driverCommandQueue={}
    receiverCommandQueue=org.agrona.concurrent.OneToOneConcurrentArrayQueue@4abdb505
    senderCommandQueue=org.agrona.concurrent.OneToOneConcurrentArrayQueue@7ce6a65d
    receiverProxy=ReceiverProxy{threadingMode=DEDICATED, failCount=AtomicCounter{isClosed=false, id=2, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}}
    senderProxy=SenderProxy{threadingMode=DEDICATED, failCount=AtomicCounter{isClosed=false, id=3, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}}
    driverConductorProxy=DriverConductorProxy{threadingMode=DEDICATED, failCount=AtomicCounter{isClosed=false, id=4, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}}
    clientProxy=ClientProxy{}
    toDriverCommands=org.agrona.concurrent.ringbuffer.ManyToOneRingBuffer@59906517
    lossReportBuffer=java.nio.DirectByteBuffer[pos=0 lim=10485760 cap=10485760]
    cncByteBuffer=java.nio.DirectByteBuffer[pos=0 lim=536875008 cap=536875008]
    cncMetaDataBuffer=UnsafeBuffer{addressOffset=140513613180928, capacity=48, byteArray=null, byteBuffer=java.nio.DirectByteBuffer[pos=0 lim=536875008 cap=536875008]}
    channelSendTimestampClock=org.agrona.concurrent.SystemEpochNanoClock@5bfbf16f
    channelReceiveTimestampClock=org.agrona.concurrent.SystemEpochNanoClock@25af5db5
    conductorDutyCycleTracker=DutyCycleStallTracker{maxCycleTime=AtomicCounter{isClosed=false, id=26, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, cycleTimeThresholdExceededCount=AtomicCounter{isClosed=false, id=27, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, cycleTimeThresholdNs=1000000000}
    senderDutyCycleTracker=DutyCycleStallTracker{maxCycleTime=AtomicCounter{isClosed=false, id=28, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, cycleTimeThresholdExceededCount=AtomicCounter{isClosed=false, id=29, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, cycleTimeThresholdNs=1000000000}
    receiverDutyCycleTracker=DutyCycleStallTracker{maxCycleTime=AtomicCounter{isClosed=false, id=30, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, cycleTimeThresholdExceededCount=AtomicCounter{isClosed=false, id=31, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@7b49cea0}}, cycleTimeThresholdNs=1000000000}
    senderWildcardPortRange=null
    receiverWildcardPortRange=null
    senderPortManager=io.aeron.driver.media.WildcardPortManager@262b2c86
    receiverPortManager=io.aeron.driver.media.WildcardPortManager@371a67ec
    resourceFreeLimit=10
    asyncTaskExecutorThreads=1
    asyncTaskExecutor=java.util.concurrent.ThreadPoolExecutor@5ed828d[Running, pool size = 1, active threads = 0, queued tasks = 0, completed tasks = 0]
    maxResend=16
}
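Regarding the SO_SNDBUF/SO_RCVBUF warnings above: the driver asked for 2097152-byte socket buffers but the OS capped them at 212992. A possible fix, assuming root access on the VMs, would be to raise the kernel limits to at least the requested size (values shown match the 2MB request; persist them via sysctl.conf if needed):

# raise the OS socket buffer ceilings so the driver's 2MB request can take effect
sudo sysctl -w net.core.wmem_max=2097152
sudo sysctl -w net.core.rmem_max=2097152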

Client log:


[thanhhoailuong@x01sfxolapp2a scripts]$ JVM_OPTS="\
> -Duk.co.real_logic.benchmarks.aeron.remote.embedded.media.driver=true \
> -Duk.co.real_logic.benchmarks.aeron.remote.source.channel=aeron:udp?endpoint=10.52.19.65:13000 \
> -Daeron.print.configuration=true \
> -Duk.co.real_logic.benchmarks.aeron.remote.destination.channel=aeron:udp?endpoint=10.52.19.5:13001" \
> ./benchmark-runner --output-file "aeron-echo-test" --messages "100K" --message-length "288" --iterations 10 "aeron/echo-client"
+ iterations=60
+ messageRate=5K
+ burstSize=1
+ messageLength=288
+ [[ 9 -gt 0 ]]
+ option=--output-file
+ case ${option} in
+ outputFileName=aeron-echo-test
+ shift
+ shift
+ [[ 7 -gt 0 ]]
+ option=--messages
+ case ${option} in
+ command=--messages
+ shift
+ [[ 6 -gt 0 ]]
+ option=100K
+ case ${option} in
+ command=100K
+ shift
+ [[ 5 -gt 0 ]]
+ option=--message-length
+ case ${option} in
+ messageLength=288
+ shift
+ shift
+ [[ 3 -gt 0 ]]
+ option=--iterations
+ case ${option} in
+ iterations=10
+ shift
+ shift
+ [[ 1 -gt 0 ]]
+ option=aeron/echo-client
+ case ${option} in
+ command=aeron/echo-client
+ shift
+ [[ 0 -gt 0 ]]
+ [[ -z '' ]]
+ JVM_OPTS='-Duk.co.real_logic.benchmarks.aeron.remote.embedded.media.driver=true -Duk.co.real_logic.benchmarks.aeron.remote.source.channel=aeron:udp?endpoint=10.52.19.65:13000 -Daeron.print.configuration=true -Duk.co.real_logic.benchmarks.aeron.remote.destination.channel=aeron:udp?endpoint=10.52.19.5:13001 -Duk.co.real_logic.benchmarks.remote.output.file=aeron-echo-test -Duk.co.real_logic.benchmarks.remote.iterations=10 -Duk.co.real_logic.benchmarks.remote.message.rate=5K -Duk.co.real_logic.benchmarks.remote.batch.size=1 -Duk.co.real_logic.benchmarks.remote.message.length=288'
+ aeron/echo-client
+++ dirname aeron/echo-client
++ cd aeron
++ pwd
+ DIR=/var/tmp/runtime/scripts/aeron
++ basename aeron/echo-client .sh
+ PROCESS_FILE_NAME=echo-client
+ export PROCESS_FILE_NAME
+ export 'JVM_OPTS=-Duk.co.real_logic.benchmarks.aeron.remote.embedded.media.driver=true -Duk.co.real_logic.benchmarks.aeron.remote.source.channel=aeron:udp?endpoint=10.52.19.65:13000 -Daeron.print.configuration=true -Duk.co.real_logic.benchmarks.aeron.remote.destination.channel=aeron:udp?endpoint=10.52.19.5:13001 -Duk.co.real_logic.benchmarks.remote.output.file=aeron-echo-test -Duk.co.real_logic.benchmarks.remote.iterations=10 -Duk.co.real_logic.benchmarks.remote.message.rate=5K -Duk.co.real_logic.benchmarks.remote.batch.size=1 -Duk.co.real_logic.benchmarks.remote.message.length=288 -Duk.co.real_logic.benchmarks.remote.message.transceiver=uk.co.real_logic.benchmarks.aeron.remote.EchoMessageTransceiver'
+ JVM_OPTS='-Duk.co.real_logic.benchmarks.aeron.remote.embedded.media.driver=true -Duk.co.real_logic.benchmarks.aeron.remote.source.channel=aeron:udp?endpoint=10.52.19.65:13000 -Daeron.print.configuration=true -Duk.co.real_logic.benchmarks.aeron.remote.destination.channel=aeron:udp?endpoint=10.52.19.5:13001 -Duk.co.real_logic.benchmarks.remote.output.file=aeron-echo-test -Duk.co.real_logic.benchmarks.remote.iterations=10 -Duk.co.real_logic.benchmarks.remote.message.rate=5K -Duk.co.real_logic.benchmarks.remote.batch.size=1 -Duk.co.real_logic.benchmarks.remote.message.length=288 -Duk.co.real_logic.benchmarks.remote.message.transceiver=uk.co.real_logic.benchmarks.aeron.remote.EchoMessageTransceiver'
+ exec /var/tmp/runtime/scripts/aeron/../run-java uk.co.real_logic.benchmarks.remote.LoadTestRig /var/tmp/runtime/scripts/aeron/low-latency-driver.properties /var/tmp/runtime/scripts/aeron/benchmark.properties
+++ dirname /var/tmp/runtime/scripts/aeron/../run-java
++ cd /var/tmp/runtime/scripts/aeron/..
++ pwd
+ DIR=/var/tmp/runtime/scripts
+ class_name=LoadTestRig
+ log_file_name=echo-client
+ '[' -d /dev/shm ']'
+ gc_log_file=/dev/shm/echo-client-gc.log
+ error_log_file=/dev/shm/echo-client-crash.log
+ java_options=(-Djava.net.preferIPv4Stack=true -Djava.net.preferIPv6Addresses=false -XX:+UnlockExperimentalVMOptions -XX:+TrustFinalNonStaticFields -XX:+UnlockDiagnosticVMOptions -XX:+DebugNonSafepoints -XX:+UseParallelGC -XX:ErrorFile="${error_log_file}" -Xms4G -Xmx4G -XX:+AlwaysPreTouch -XX:MaxMetaspaceSize=1G -XX:ReservedCodeCacheSize=1G -XX:+EnableDynamicAgentLoading -Duk.co.real_logic.benchmarks.remote.output.directory=./out -Duk.co.real_logic.benchmarks.aeron.remote.output.directory=./output "-Xlog:gc*,safepoint=info,arguments=info:file=${gc_log_file}:time")
+ add_opens=(--add-opens java.base/java.util.zip=ALL-UNNAMED)
+ jvm_opts=()
+ [[ -v JVM_OPTS ]]
+ [[ -n -Duk.co.real_logic.benchmarks.aeron.remote.embedded.media.driver=true -Duk.co.real_logic.benchmarks.aeron.remote.source.channel=aeron:udp?endpoint=10.52.19.65:13000 -Daeron.print.configuration=true -Duk.co.real_logic.benchmarks.aeron.remote.destination.channel=aeron:udp?endpoint=10.52.19.5:13001 -Duk.co.real_logic.benchmarks.remote.output.file=aeron-echo-test -Duk.co.real_logic.benchmarks.remote.iterations=10 -Duk.co.real_logic.benchmarks.remote.message.rate=5K -Duk.co.real_logic.benchmarks.remote.batch.size=1 -Duk.co.real_logic.benchmarks.remote.message.length=288 -Duk.co.real_logic.benchmarks.remote.message.transceiver=uk.co.real_logic.benchmarks.aeron.remote.EchoMessageTransceiver ]]
+ IFS=' '
+ read -ra jvm_opts
+ unset IFS
+ exec /bin/java -cp /var/tmp/runtime/scripts/../benchmarks-all/build/libs/benchmarks.jar -Djava.net.preferIPv4Stack=true -Djava.net.preferIPv6Addresses=false -XX:+UnlockExperimentalVMOptions -XX:+TrustFinalNonStaticFields -XX:+UnlockDiagnosticVMOptions -XX:+DebugNonSafepoints -XX:+UseParallelGC -XX:ErrorFile=/dev/shm/echo-client-crash.log -Xms4G -Xmx4G -XX:+AlwaysPreTouch -XX:MaxMetaspaceSize=1G -XX:ReservedCodeCacheSize=1G -XX:+EnableDynamicAgentLoading -Duk.co.real_logic.benchmarks.remote.output.directory=./out -Duk.co.real_logic.benchmarks.aeron.remote.output.directory=./output '-Xlog:gc*,safepoint=info,arguments=info:file=/dev/shm/echo-client-gc.log:time' --add-opens java.base/java.util.zip=ALL-UNNAMED -Duk.co.real_logic.benchmarks.aeron.remote.embedded.media.driver=true '-Duk.co.real_logic.benchmarks.aeron.remote.source.channel=aeron:udp?endpoint=10.52.19.65:13000' -Daeron.print.configuration=true '-Duk.co.real_logic.benchmarks.aeron.remote.destination.channel=aeron:udp?endpoint=10.52.19.5:13001' -Duk.co.real_logic.benchmarks.remote.output.file=aeron-echo-test -Duk.co.real_logic.benchmarks.remote.iterations=10 -Duk.co.real_logic.benchmarks.remote.message.rate=5K -Duk.co.real_logic.benchmarks.remote.batch.size=1 -Duk.co.real_logic.benchmarks.remote.message.length=288 -Duk.co.real_logic.benchmarks.remote.message.transceiver=uk.co.real_logic.benchmarks.aeron.remote.EchoMessageTransceiver uk.co.real_logic.benchmarks.remote.LoadTestRig /var/tmp/runtime/scripts/aeron/low-latency-driver.properties /var/tmp/runtime/scripts/aeron/benchmark.properties
WARNING: Could not set desired SO_SNDBUF, adjust OS to allow aeron.socket.so_sndbuf attempted=2097152, actual=212992
WARNING: Could not set desired SO_RCVBUF, adjust OS to allow aeron.socket.so_sndbuf attempted=2097152, actual=212992
MediaDriver.Context
{
    isConcluded=true
    isClosed=false
    cncVersion=0.2.0
    aeronDirectory=/dev/shm/aeron
    enabledExperimentalFeatures=false
    aeronDirectoryName='/dev/shm/aeron'
    cncFile=/dev/shm/aeron/cnc.dat
    countersMetaDataBuffer=UnsafeBuffer{addressOffset=140599179080704, capacity=419430400, byteArray=null, byteBuffer=java.nio.DirectByteBuffer[pos=0 lim=536875008 cap=536875008]}
    countersValuesBuffer=UnsafeBuffer{addressOffset=140599598511104, capacity=104857600, byteArray=null, byteBuffer=java.nio.DirectByteBuffer[pos=0 lim=536875008 cap=536875008]}
    driverTimeoutMs=30000
    printConfigurationOnStart=true
    useWindowsHighResTimer=true
    warnIfDirectoryExists=false
    dirDeleteOnStart=true
    dirDeleteOnShutdown=true
    termBufferSparseFile=true
    performStorageChecks=false
    spiesSimulateConnection=true
    reliableStream=true
    tetherSubscriptions=true
    rejoinStream=true
    receiverGroupConsideration=INFER
    conductorBufferLength=1049344
    toClientsBufferLength=1048704
    counterValuesBufferLength=104857600
    errorBufferLength=10485760
    lowStorageWarningThreshold=167772160
    timerIntervalNs=1000000000
    clientLivenessTimeoutNs=10000000000
    imageLivenessTimeoutNs=10000000000
    publicationUnblockTimeoutNs=15000000000
    publicationConnectionTimeoutNs=5000000000
    publicationLingerTimeoutNs=5000000000
    untetheredWindowLimitTimeoutNs=5000000000
    untetheredRestingTimeoutNs=10000000000
    retransmitUnicastDelayNs=0
    retransmitUnicastLingerNs=10000000
    nakUnicastDelayNs=100000
    nakMulticastMaxBackoffNs=10000000
    nakMulticastGroupSize=10
    statusMessageTimeoutNs=200000000
    counterFreeToReuseTimeoutNs=1000000000
    conductorCycleThresholdNs=1000000000
    senderCycleThresholdNs=1000000000
    receiverCycleThresholdNs=1000000000
    nameResolverThresholdNs=5000000000
    publicationTermBufferLength=16777216
    ipcTermBufferLength=67108864
    publicationTermWindowLength=0
    ipcPublicationTermWindowLength=0
    initialWindowLength=2097152
    socketSndbufLength=2097152
    socketRcvbufLength=2097152
    socketMulticastTtl=0
    mtuLength=1408
    ipcMtuLength=1408
    filePageSize=4096
    publicationReservedSessionIdLow=-1
    publicationReservedSessionIdHigh=1000
    lossReportBufferLength=10485760
    epochClock=org.agrona.concurrent.SystemEpochClock@1e6d1014
    nanoClock=org.agrona.concurrent.SystemNanoClock@76707e36
    cachedEpochClock=org.agrona.concurrent.CachedEpochClock@614ddd49
    cachedNanoClock=org.agrona.concurrent.CachedNanoClock@1f554b06
    threadingMode=DEDICATED
    conductorThreadFactory=io.aeron.driver.MediaDriver$Context$$Lambda$38/0x000000080102e528@694e1548
    senderThreadFactory=io.aeron.driver.MediaDriver$Context$$Lambda$39/0x000000080102e750@1c3a4799
    receiverThreadFactory=io.aeron.driver.MediaDriver$Context$$Lambda$40/0x000000080102e978@131276c2
    sharedThreadFactory=null
    sharedNetworkThreadFactory=null
    conductorIdleStrategy=NoOpIdleStrategy{alias=noop}
    senderIdleStrategy=NoOpIdleStrategy{alias=noop}
    receiverIdleStrategy=NoOpIdleStrategy{alias=noop}
    sharedNetworkIdleStrategy=null
    sharedIdleStrategy=null
    sendChannelEndpointSupplier=io.aeron.driver.DefaultSendChannelEndpointSupplier@26aa12dd
    receiveChannelEndpointSupplier=io.aeron.driver.DefaultReceiveChannelEndpointSupplier@3fd7a715
    receiveChannelEndpointThreadLocals=ReceiveChannelEndpointThreadLocals{nextReceiverId=-213902925383840465}
    tempBuffer=UnsafeBuffer{addressOffset=16, capacity=512, byteArray=byte[512], byteBuffer=null}
    unicastFlowControlSupplier=DefaultUnicastFlowControlSupplier{flowControlClass=io.aeron.driver.UnicastFlowControl}
    multicastFlowControlSupplier=DefaultMulticastFlowControlSupplier{flowControlClass=io.aeron.driver.MaxMulticastFlowControl}
    applicationSpecificFeedback=[]
    receiverGroupTag=null
    flowControlGroupTag=-1
    flowControlGroupMinSize=0
    flowControlReceiverTimeoutNs=5000000000
    reResolutionCheckIntervalNs=1000000000
    receiverGroupConsideration=INFER
    congestionControlSupplier=io.aeron.driver.DefaultCongestionControlSupplier@6b57696f
    terminationValidator=io.aeron.driver.DefaultDenyTerminationValidator@5bb21b69
    terminationHook=null
    nameResolver=io.aeron.driver.DefaultNameResolver@6b9651f3
    resolverName='null'
    resolverInterface='null'
    resolverBootstrapNeighbor='null'
    sendToStatusMessagePollRatio=6
    unicastFeedbackDelayGenerator=StaticDelayGenerator{delayInNs=100000, retryDelayNs=10000000}
    multicastFeedbackDelayGenerator=OptimalMulticastDelayGenerator{randMax=3.302585092994046E-7, baseX=1.2613558468370601E-8, constantT=3027931.0656411387, factorT=7.927976886994827E7, shouldFeedbackImmediately=false}
    retransmitUnicastDelayGenerator=StaticDelayGenerator{delayInNs=0, retryDelayNs=0}
    retransmitUnicastLingerGenerator=StaticDelayGenerator{delayInNs=10000000, retryDelayNs=10000000}
    errorLog=org.agrona.concurrent.errors.DistinctErrorLog@5f3a4b84
    errorHandler=org.agrona.concurrent.errors.LoggingErrorHandler@27f723
    useConcurrentCountersManager=false
    countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}
    systemCounters=SystemCounters{counterByDescriptorMap={BYTES_SENT=AtomicCounter{isClosed=false, id=0, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, BYTES_RECEIVED=AtomicCounter{isClosed=false, id=1, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, RECEIVER_PROXY_FAILS=AtomicCounter{isClosed=false, id=2, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, SENDER_PROXY_FAILS=AtomicCounter{isClosed=false, id=3, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, CONDUCTOR_PROXY_FAILS=AtomicCounter{isClosed=false, id=4, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, NAK_MESSAGES_SENT=AtomicCounter{isClosed=false, id=5, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, NAK_MESSAGES_RECEIVED=AtomicCounter{isClosed=false, id=6, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, STATUS_MESSAGES_SENT=AtomicCounter{isClosed=false, id=7, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, STATUS_MESSAGES_RECEIVED=AtomicCounter{isClosed=false, id=8, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, HEARTBEATS_SENT=AtomicCounter{isClosed=false, id=9, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, HEARTBEATS_RECEIVED=AtomicCounter{isClosed=false, id=10, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, RETRANSMITS_SENT=AtomicCounter{isClosed=false, id=11, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, FLOW_CONTROL_UNDER_RUNS=AtomicCounter{isClosed=false, id=12, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, FLOW_CONTROL_OVER_RUNS=AtomicCounter{isClosed=false, id=13, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, INVALID_PACKETS=AtomicCounter{isClosed=false, id=14, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, ERRORS=AtomicCounter{isClosed=false, id=15, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, 
SHORT_SENDS=AtomicCounter{isClosed=false, id=16, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, FREE_FAILS=AtomicCounter{isClosed=false, id=17, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, SENDER_FLOW_CONTROL_LIMITS=AtomicCounter{isClosed=false, id=18, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, UNBLOCKED_PUBLICATIONS=AtomicCounter{isClosed=false, id=19, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, UNBLOCKED_COMMANDS=AtomicCounter{isClosed=false, id=20, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, POSSIBLE_TTL_ASYMMETRY=AtomicCounter{isClosed=false, id=21, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, CONTROLLABLE_IDLE_STRATEGY=AtomicCounter{isClosed=false, id=22, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, LOSS_GAP_FILLS=AtomicCounter{isClosed=false, id=23, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, CLIENT_TIMEOUTS=AtomicCounter{isClosed=false, id=24, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, RESOLUTION_CHANGES=AtomicCounter{isClosed=false, id=25, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, CONDUCTOR_MAX_CYCLE_TIME=AtomicCounter{isClosed=false, id=26, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, CONDUCTOR_CYCLE_TIME_THRESHOLD_EXCEEDED=AtomicCounter{isClosed=false, id=27, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, SENDER_MAX_CYCLE_TIME=AtomicCounter{isClosed=false, id=28, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, SENDER_CYCLE_TIME_THRESHOLD_EXCEEDED=AtomicCounter{isClosed=false, id=29, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, RECEIVER_MAX_CYCLE_TIME=AtomicCounter{isClosed=false, id=30, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, RECEIVER_CYCLE_TIME_THRESHOLD_EXCEEDED=AtomicCounter{isClosed=false, id=31, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], 
epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, NAME_RESOLVER_MAX_TIME=AtomicCounter{isClosed=false, id=32, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, NAME_RESOLVER_TIME_THRESHOLD_EXCEEDED=AtomicCounter{isClosed=false, id=33, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, AERON_VERSION=AtomicCounter{isClosed=false, id=34, value=77318, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, BYTES_CURRENTLY_MAPPED=AtomicCounter{isClosed=false, id=35, value=547360768, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, RETRANSMITTED_BYTES=AtomicCounter{isClosed=false, id=36, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, RETRANSMIT_OVERFLOW=AtomicCounter{isClosed=false, id=37, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}}}
    lossReport=LossReport{nextRecordOffset=0, buffer=UnsafeBuffer{addressOffset=140601146159104, capacity=10485760, byteArray=null, byteBuffer=java.nio.DirectByteBuffer[pos=0 lim=10485760 cap=10485760]}}
    logFactory=io.aeron.driver.buffer.FileStoreLogFactory@8646db9
    dataTransportPoller=DataTransportPoller{}
    controlTransportPoller=ControlTransportPoller{}
    driverCommandQueue={}
    receiverCommandQueue=org.agrona.concurrent.OneToOneConcurrentArrayQueue@37374a5e
    senderCommandQueue=org.agrona.concurrent.OneToOneConcurrentArrayQueue@4671e53b
    receiverProxy=ReceiverProxy{threadingMode=DEDICATED, failCount=AtomicCounter{isClosed=false, id=2, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}}
    senderProxy=SenderProxy{threadingMode=DEDICATED, failCount=AtomicCounter{isClosed=false, id=3, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}}
    driverConductorProxy=DriverConductorProxy{threadingMode=DEDICATED, failCount=AtomicCounter{isClosed=false, id=4, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}}
    clientProxy=ClientProxy{}
    toDriverCommands=org.agrona.concurrent.ringbuffer.ManyToOneRingBuffer@43bd930a
    lossReportBuffer=java.nio.DirectByteBuffer[pos=0 lim=10485760 cap=10485760]
    cncByteBuffer=java.nio.DirectByteBuffer[pos=0 lim=536875008 cap=536875008]
    cncMetaDataBuffer=UnsafeBuffer{addressOffset=140599176982528, capacity=48, byteArray=null, byteBuffer=java.nio.DirectByteBuffer[pos=0 lim=536875008 cap=536875008]}
    channelSendTimestampClock=org.agrona.concurrent.SystemEpochNanoClock@33723e30
    channelReceiveTimestampClock=org.agrona.concurrent.SystemEpochNanoClock@64f6106c
    conductorDutyCycleTracker=DutyCycleStallTracker{maxCycleTime=AtomicCounter{isClosed=false, id=26, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, cycleTimeThresholdExceededCount=AtomicCounter{isClosed=false, id=27, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, cycleTimeThresholdNs=1000000000}
    senderDutyCycleTracker=DutyCycleStallTracker{maxCycleTime=AtomicCounter{isClosed=false, id=28, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, cycleTimeThresholdExceededCount=AtomicCounter{isClosed=false, id=29, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, cycleTimeThresholdNs=1000000000}
    receiverDutyCycleTracker=DutyCycleStallTracker{maxCycleTime=AtomicCounter{isClosed=false, id=30, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, cycleTimeThresholdExceededCount=AtomicCounter{isClosed=false, id=31, value=0, countersManager=CountersManager{freeToReuseTimeoutMs=1000, highWaterMarkId=37, freeList=[], epochClock=org.agrona.concurrent.CachedEpochClock@614ddd49}}, cycleTimeThresholdNs=1000000000}
    senderWildcardPortRange=null
    receiverWildcardPortRange=null
    senderPortManager=io.aeron.driver.media.WildcardPortManager@543c6f6d
    receiverPortManager=io.aeron.driver.media.WildcardPortManager@13eb8acf
    resourceFreeLimit=10
    asyncTaskExecutorThreads=1
    asyncTaskExecutor=java.util.concurrent.ThreadPoolExecutor@51c8530f[Running, pool size = 1, active threads = 0, queued tasks = 0, completed tasks = 0]
    maxResend=16
}

Starting latency benchmark using the following configuration:
Configuration{
    warmUpIterations=10
    warmupMessageRate=10000
    iterations=10
    messageRate=5K
    batchSize=1
    messageLength=288
    messageTransceiverClass=uk.co.real_logic.benchmarks.aeron.remote.EchoMessageTransceiver
    idleStrategy=NoOpIdleStrategy{alias=noop}
    trackHistory=false
    reportProgress=true
    outputDirectory=/var/tmp/runtime/scripts/./out
    outputFileNamePrefix=aeron-echo-test_rate=5K_batch=1_length=288_sha=4e356a9fa57cbfdb3a782bad5bacebcf3992b6767a317b2db4a235bb0fa222a0
}

Running warmup for 10 iterations of 10,000 messages each, with 288 bytes payload and a burst size of 1...
Send rate: 9,913 msgs/sec (1 of 10)
Send rate: 10,000 msgs/sec (2 of 10)
Send rate: 9,981 msgs/sec (3 of 10)
Send rate: 9,984 msgs/sec (4 of 10)
Send rate: 9,992 msgs/sec (5 of 10)
Send rate: 9,975 msgs/sec (6 of 10)
Send rate: 10,000 msgs/sec (7 of 10)
Send rate: 9,985 msgs/sec (8 of 10)
Send rate: 9,995 msgs/sec (9 of 10)

Running measurement for 10 iterations of 5,000 messages each, with 288 bytes payload and a burst size of 1...
Send rate: 4,986 msgs/sec (1 of 10)
Send rate: 4,992 msgs/sec (2 of 10)
Send rate: 4,998 msgs/sec (3 of 10)
Send rate: 5,000 msgs/sec (4 of 10)
Send rate: 4,999 msgs/sec (5 of 10)
Send rate: 4,999 msgs/sec (6 of 10)
Send rate: 4,998 msgs/sec (7 of 10)
Send rate: 5,000 msgs/sec (8 of 10)
Send rate: 5,000 msgs/sec (9 of 10)

Histogram of RTT latencies in microseconds.
       Value     Percentile TotalCount 1/(1-Percentile)

     156.799 0.000000000000          1           1.00
    2985.983 0.100000000000       5005           1.11
    3872.767 0.200000000000      10004           1.25
    4796.415 0.300000000000      15031           1.43
    6328.319 0.400000000000      20001           1.67
    8970.239 0.500000000000      25004           2.00
   10395.647 0.550000000000      27518           2.22
   11870.207 0.600000000000      30005           2.50
   13385.727 0.650000000000      32501           2.86
   14983.167 0.700000000000      34996           3.33
   16760.831 0.750000000000      37514           4.00
   17694.719 0.775000000000      38747           4.44
   18628.607 0.800000000000      39995           5.00
   19677.183 0.825000000000      41246           5.71
   20791.295 0.850000000000      42503           6.67
   22167.551 0.875000000000      43747           8.00
   23003.135 0.887500000000      44373           8.89
   23969.791 0.900000000000      45006          10.00
   25034.751 0.912500000000      45619          11.43
   26378.239 0.925000000000      46259          13.33
   28147.711 0.937500000000      46870          16.00
   29163.519 0.943750000000      47185          17.78
   30359.551 0.950000000000      47505          20.00
   31571.967 0.956250000000      47819          22.86
   32964.607 0.962500000000      48131          26.67
   34570.239 0.968750000000      48432          32.00
   35553.279 0.971875000000      48588          35.56
   36569.087 0.975000000000      48761          40.00
   37552.127 0.978125000000      48904          45.71
   38567.935 0.981250000000      49059          53.33
   39780.351 0.984375000000      49220          64.00
   40370.175 0.985937500000      49296          71.11
   40992.767 0.987500000000      49374          80.00
   41615.359 0.989062500000      49449          91.43
   42369.023 0.990625000000      49533         106.67
   43188.223 0.992187500000      49604         128.00
   43778.047 0.992968750000      49648         142.22
   44367.871 0.993750000000      49687         160.00
   44957.695 0.994531250000      49721         182.86
   45744.127 0.995312500000      49760         213.33
   46563.327 0.996093750000      49802         256.00
   46989.311 0.996484375000      49821         284.44
   47382.527 0.996875000000      49839         320.00
   47775.743 0.997265625000      49857         365.71
   48332.799 0.997656250000      49876         426.67
   48791.551 0.998046875000      49896         512.00
   49184.767 0.998242187500      49908         568.89
   49577.983 0.998437500000      49917         640.00
   49971.199 0.998632812500      49927         731.43
   50561.023 0.998828125000      49936         853.33
   51150.847 0.999023437500      49945        1024.00
   51576.831 0.999121093750      49950        1137.78
   52166.655 0.999218750000      49954        1280.00
   53149.695 0.999316406250      49959        1462.86
   54165.503 0.999414062500      49964        1706.67
   55148.543 0.999511718750      49969        2048.00
   55771.135 0.999560546875      49972        2275.56
   56164.351 0.999609375000      49974        2560.00
   56557.567 0.999658203125      49976        2925.71
   57147.391 0.999707031250      49979        3413.33
   57573.375 0.999755859375      49981        4096.00
   57966.591 0.999780273438      49983        4551.11
   58163.199 0.999804687500      49984        5120.00
   58359.807 0.999829101563      49985        5851.43
   58556.415 0.999853515625      49986        6826.67
   58753.023 0.999877929688      49987        8192.00
   58949.631 0.999890136719      49988        9102.22
   59146.239 0.999902343750      49989       10240.00
   59146.239 0.999914550781      49989       11702.86
   59375.615 0.999926757813      49990       13653.33
   59375.615 0.999938964844      49990       16384.00
   59572.223 0.999945068359      49991       18204.44
   59572.223 0.999951171875      49991       20480.00
   59572.223 0.999957275391      49991       23405.71
   59768.831 0.999963378906      49992       27306.67
   59768.831 0.999969482422      49992       32768.00
   59768.831 0.999972534180      49992       36408.89
   59768.831 0.999975585938      49992       40960.00
   59768.831 0.999978637695      49992       46811.43
   59965.439 0.999981689453      49993       54613.33
   59965.439 1.000000000000      49993
#[Mean    =    11683.581, StdDeviation   =     9223.233]
#[Max     =    59965.439, Total count    =        49993]
#[Buckets =           32, SubBuckets     =         2048]

*** WARNING: Target message rate not achieved: expected to send 50,000 messages in total but managed to send only 49,993 messages (loss 0.0140%)!

@Gipxy
Author

Gipxy commented Nov 1, 2024

Changed to 1K, no msg lost, but similar result:

#[Mean    =    23274.977, StdDeviation   =    15027.193]
#[Max     =    92209.151, Total count    =        10000]
#[Buckets =           32, SubBuckets     =         2048]

@Gipxy
Author

Gipxy commented Nov 1, 2024

FYI, I'm comparing messaging libs. I wrote simple Java code for gRPC and an HTTP REST API as well between these 2 servers:

gRPC     : 99% = 2366 us
REST API : 99% = 3479 us

Sending an FX rate object of about 50 bytes, and the result is a lot better than Aeron transport.

@vyazelenko
Contributor

@Gipxy Have you seen this warning:

WARNING: Could not set desired SO_SNDBUF, adjust OS to allow aeron.socket.so_sndbuf attempted=2097152, actual=212992
WARNING: Could not set desired SO_RCVBUF, adjust OS to allow aeron.socket.so_sndbuf attempted=2097152, actual=212992

It means that the socket buffers could not be configured as desired (i.e. were not set to 2MB). You need to fix your OS configuration if you want to run with benchmark defaults. Otherwise, configure smaller socket buffer sizes and receiver window for your channels:

-Duk.co.real_logic.benchmarks.aeron.remote.source.channel=aeron:udp?endpoint=10.52.19.65:13000|so-sndbuf=128k|so-rcvbuf=128k|rcv-wnd=128k
-Duk.co.real_logic.benchmarks.aeron.remote.destination.channel=aeron:udp?endpoint=10.52.19.5:13001|so-sndbuf=128k|so-rcvbuf=128k|rcv-wnd=128k
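
For the OS-side fix on Linux, a minimal sketch assuming the kernel's net.core limits are what rejects the 2MB request (values are illustrative and need root):

sudo sysctl -w net.core.wmem_max=2097152   # allow up to 2MB SO_SNDBUF
sudo sysctl -w net.core.rmem_max=2097152   # allow up to 2MB SO_RCVBUF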

@Gipxy
Author

Gipxy commented Nov 1, 2024

@vyazelenko Applied to both server and client, but it is not working; I still get the warning on both client and server. Results are similar.

@vyazelenko
Contributor

vyazelenko commented Nov 1, 2024

On a properly configured system Aeron can achieve several orders of magnitude higher throughput and lower latency than gRPC.

Here are some figures from our own testing with 288-byte messages at 100K msgs/sec:

  • On-prem:
| %lie   |   Java | c-ef_vi |
-----------------------------
| p99    |  19 us |    7 us |
| p999   |  21 us |    8 us |
| p9999  |  31 us |    8 us |
| p99999 |  88 us |   64 us |
| max    | 154 us |  111 us |
  • Cloud:
| %lie   |    java |  c-dpdk |
------------------------------
| p99    |  106 us |   42 us |
| p999   |  183 us |   47 us |
| p9999  |  224 us |   58 us |
| p99999 | 8167 us |  113 us |
| max    | 9568 us |  170 us |

@vyazelenko
Contributor

vyazelenko commented Nov 1, 2024

@Gipxy You are seeing the warning because the MediaDriver settings still refer to 2MB (see low-latency-driver.properties). That should not affect the results though, i.e. the settings from the channel URIs take precedence.
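
Alternatively, to silence the warning by aligning the driver defaults with the 128k channel settings, a minimal sketch of the relevant lines in low-latency-driver.properties (values in bytes, 131072 = 128 KB; please double-check the property names against your Aeron version):

# Socket buffers requested by the media driver
aeron.socket.so_sndbuf=131072
aeron.socket.so_rcvbuf=131072
# Initial receiver window
aeron.rcv.initial.window.length=131072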

@vyazelenko
Contributor

@Gipxy If you want some professional help with your benchmarks please contact [email protected].

@Gipxy
Author

Gipxy commented Nov 1, 2024

Yeah, just found that as well and changed it too, but still the same result of avg 23 ms :(
Anything else I can try? @vyazelenko

@vyazelenko
Contributor

@Gipxy You must ensure that the system has enough CPU cores to run those benchmarks. In the default config the media driver runs in DEDICATED mode, which requires 3 CPU cores, plus 1 CPU core is needed for the client/server, i.e. you need 4 cores just for Aeron and it will busy-spin on them.

You can try running with -Daeron.threading.mode=SHARED to see if CPU is the issue.
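
A minimal sketch of what that could look like with the launch scripts used in this thread ("..." stands for the existing -D options; if low-latency-driver.properties also sets the threading mode, keep the two consistent):

JVM_OPTS="... -Daeron.threading.mode=SHARED" aeron/echo-server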

@Gipxy
Author

Gipxy commented Nov 1, 2024

You are right, I didn't notice that there were existing test runs which had not exited yet! Killed all of them and ran again; before the run the CPU of both machines was at 99%, and while running it is about 45-50% (both have 8 cores), and the new result is a lot better:

524.607 0.987500000000      49375          80.00
3661.823 0.989062500000      49455          91.43
3794.943 0.990625000000      49532         106.67
3999.743 0.992187500000      49610         128.00

...

#[Mean    =      643.659, StdDeviation   =      936.833]
#[Max     =    25919.487, Total count    =        50000]
#[Buckets =           32, SubBuckets     =         2048]

@Gipxy
Author

Gipxy commented Nov 1, 2024

Tried with 10k msgs, or changing to 100 iterations, or changing the message length to "32"; all give a similar result of about 3 ms. So for the 99% comparison (in us) of sending about 32 bytes across all of these:

UDP Java socket : 1600
gRPC            : 2366
Aeron echo      : 2800
REST API        : 3479

Still slower than gRPC; is there anything quick we can do to improve this, @vyazelenko?

@vyazelenko
Contributor

@Gipxy your Aeron results are 20 to 150 times worse than in our tests (see #54 (comment)). This is most likely caused by the OS configuration and/or machine setup (i.e. number of CPU cores, memory, networking capabilities etc.). For instance, you have mentioned that both VMs have 8 cores each, but are those real CPU cores or virtual CPU cores? In the latter case you won't have enough CPU cores to run the media driver in DEDICATED mode.

Tried with 10k msgs, or changing to 100 iterations, or changing the message length to "32"; all give a similar result of about 3 ms. So for the 99% comparison (in us) of sending about 32 bytes across all of these:

UDP Java socket : 1600
gRPC            : 2366
Aeron echo      : 2800
REST API        : 3479

Still slower than gRPC; is there anything quick we can do to improve this, @vyazelenko?

How were the above results obtained, i.e. did you run Aeron benchmarks or is it all custom code? Do you know that we have gRPC benchmarks included (see https://github.com/real-logic/benchmarks/tree/master/scripts/grpc)?

If you want to compare Aeron and gRPC please use the built-in benchmarks.

@Gipxy
Author

Gipxy commented Nov 1, 2024

  • For CPU, yes, these are just VMs, meaning only virtual cores, but as mentioned, the CPU is still 45% free while running with DEDICATED mode, so it should not be a CPU issue?
  • For the Aeron result, it's the standard code from the Aeron benchmarks master.
  • For gRPC, it's my own code benchmarked with JMH. It's just simple gRPC code sending an object (id, bidRate, askRate, ccypair, timestamp).

@vyazelenko
Contributor

  • You don’t have enough CPUs to run with DEDICATED mode. Run on larger instances or change to SHARED mode instead.
  • For the like for like comparison you must use the same test harness. Please run Aeron Echo benchmarks vs gRPC benchmarks from this repository. Provide complete information on how you’ve configured the benchmarks, capture outputs/AeronStat/result histograms.
  • If you need more guidance on how to tune Aeron/OS or explore commercial options around Aeron, please contact [email protected].

@Gipxy
Author

Gipxy commented Nov 1, 2024

  • My CPU is still 40% free when running in DEDICATED mode, so is it really not enough CPU?
  • Yes, I will compare with that gRPC test code as well.

@vyazelenko
Contributor

@Gipxy Yes, you really do not have enough CPU cores. Let me clarify: the benchmarks configure the noop idle strategy by default, meaning that if there is no work a thread will busy-spin and burn 100% CPU until some useful work can be executed. If, on top of that, you also pin threads, then all four cores will be unavailable for anything else.

Because your current setup is oversubscribed, the OS will force frequent context switches between threads to get access to those 4 CPU cores. Reducing the number of active benchmark threads from 4 to 2 will help as there will be fewer busy-spinning threads.

What you can do in addition is use different idle strategies. For a full list of supported strategies see Configuration#agentIdleStrategy.
Note: depending on the threading mode used, different properties need to be set in order to change the idle strategy for the media driver (see MediaDriver.Context#concludeIdleStrategies).
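
As a rough sketch, in DEDICATED mode the three driver agents can be moved off pure busy-spinning via the media driver idle strategy properties, e.g. in low-latency-driver.properties (class names are from Agrona; please verify the property names against your Aeron version):

# Less aggressive idling for the conductor, sender and receiver threads
aeron.conductor.idle.strategy=org.agrona.concurrent.BackoffIdleStrategy
aeron.sender.idle.strategy=org.agrona.concurrent.BackoffIdleStrategy
aeron.receiver.idle.strategy=org.agrona.concurrent.BackoffIdleStrategy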

@vyazelenko vyazelenko reopened this Nov 1, 2024
@Gipxy
Author

Gipxy commented Nov 2, 2024

Thanks for your clarification, I will try with different idle strategies.

Here is the result of running both the standard gRPC and Aeron echo tests on those 2 VMs with defaults first:

[image: gRPC vs Aeron echo latency results]

As expected, we can see Aeron is slightly better than gRPC :)

@Gipxy
Author

Gipxy commented Nov 3, 2024

Tried with different idle strategies (latencies in us):

Type      99%    50%
noop      1838   204
sleep-ns  10854  246
spin      1630   206
yield     1935   210

If I change to SHARED mode, then the result is not good, almost double!

@Gipxy
Author

Gipxy commented Nov 3, 2024

Meaning so far the best Aeron option is only about 30% better than gRPC on these 2 VMs with 8 cores.
May I know how to configure this, @vyazelenko? "Reducing the number of active benchmark threads from 4 to 2 will help as there will be fewer busy-spinning threads"

@vyazelenko
Contributor

Meaning so far the best Aeron option is only about 30% better than gRPC on these 2 VMs with 8 cores. May I know how to configure this, @vyazelenko? "Reducing the number of active benchmark threads from 4 to 2 will help as there will be fewer busy-spinning threads"

@Gipxy In order to change the threading mode of the media driver you need to set the aeron.threading.mode property (see low-latency-driver.properties). For the list of all supported threading modes see https://github.com/real-logic/aeron/wiki/Configuration-Options. You could try running with SHARED or SHARED_NETWORK.
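
A minimal sketch of the relevant line in low-latency-driver.properties (SHARED_NETWORK keeps the conductor on its own thread while sender and receiver share one; SHARED collapses all three onto a single thread):

# Use fewer driver threads on CPU-constrained hosts
aeron.threading.mode=SHARED_NETWORK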

@vyazelenko
Contributor

vyazelenko commented Nov 3, 2024

Thanks for your clarification, I will try with different idle strategies.

Here is the result of running both the standard gRPC and Aeron echo tests on those 2 VMs with defaults first:

[image: gRPC vs Aeron echo latency results]

As expected, we can see Aeron is slightly better than gRPC :)

@Gipxy please answer the following:

  • What was the message rate used for both tests?
  • What was the message size used for both tests?
  • What were the number of iterations and the number of warmup iterations used?
  • How many runs of each test were performed?
  • Did you ensure that no other Java processes were running during those tests?
  • Did you stop any heavy processes that might affect the results (e.g. DB processes etc.)?
  • Did you see any message loss during the tests, i.e. did the test succeed or fail?
  • Can you post the complete output from each test including all of the configuration files (i.e. benchmarks.properties, low-latency-driver.properties etc.)? Please also include the files produced during the run such as AeronStat output, gc logs etc.

@Gipxy
Author

Gipxy commented Nov 4, 2024

Hi @vyazelenko:

  • What was the message rate used for both tests? => 1k
  • What was the message size used for both tests? => 288
  • What were the number of iterations and the number of warmup iterations used? => For warmup: default, 10 iterations of 10k each. For normal: 100 iterations.
  • How many runs of each test were performed? => Ran 2-3 times, but similar results, so took the result from one of them.
  • Did you ensure that no other Java processes were running during those tests? => Yup, as mentioned: 98% free before the run and 45% free while the run is going (I used "mpstat 1").
  • Did you stop any heavy processes that might affect the results (e.g. DB processes etc.)? => Nope, these VM app servers are only for this POC testing.
  • Did you see any message loss during the tests, i.e. did the test succeed or fail? => Nope, tried changing to a 10k rate as well, no msg lost.
  • Can you post the complete output from each test including all of the configuration files (i.e. benchmarks.properties, low-latency-driver.properties etc.)? Please also include the files produced during the run such as AeronStat output, gc logs etc. => Let me re-run and provide them.

@Gipxy
Author

Gipxy commented Nov 4, 2024

Just re-ran both of them; here are the running logs and AeronStat output:
aeron-grpc-compare-log.zip
