【Environment】Production
【Versions】
obbinlog-ce 4.2.5 with OceanBase 5.7.25-OceanBase-v4.2.5.3
【Problem description】We use the OceanBase connector of Flink CDC 3.1.1 to capture incremental logs. Around midnight every day, capture stops producing data, yet the Flink job itself keeps running normally.
【Attachments and logs】
Flink TaskManager log excerpt:
2025-11-07 09:15:51,347 INFO com.oceanbase.clogproxy.client.connection.ClientHandler [] - Connected to LogProxyServer, ip:172.28.4.182, version:4.2.5-b2547fdc317a4fee7b4e731f031968642b7b4387
2025-11-07 09:15:51,385 ERROR com.oceanbase.clogproxy.client.connection.ClientHandler [] - LogProxy refused handshake request: code: 1
message: "Failed to create oblogreader"
2025-11-07 09:15:51,386 ERROR com.oceanbase.clogproxy.client.connection.ClientHandler [] - Exception occurred ClientId: 6662: rootserver_list=xxx:2882:2881, cluster_id=, cluster_user=xxx, cluster_password=******, , sys_user=, sys_password=******, tb_white_list=xxx, tb_black_list=|, start_timestamp=0, start_timestamp_us=0, timezone=+08:00, working_mode=storage, with LogProxy: xxx:2983
com.oceanbase.clogproxy.client.exception.LogProxyClientException: LogProxy refused handshake request: code: 1
message: "Failed to create oblogreader"
at com.oceanbase.clogproxy.client.connection.ClientHandler.handleErrorResponse(ClientHandler.java:236) ~[blob_p-e19d134d450ba3b11a5be60cf0335f72a7a53ba8-9aa24a50e23c5abf64af807eafde6745:?]
at com.oceanbase.clogproxy.client.connection.ClientHandler.channelRead(ClientHandler.java:166) ~[blob_p-e19d134d450ba3b11a5be60cf0335f72a7a53ba8-9aa24a50e23c5abf64af807eafde6745:?]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [blob_p-e19d134d450ba3b11a5be60cf0335f72a7a53ba8-9aa24a50e23c5abf64af807eafde6745:?]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [blob_p-e19d134d450ba3b11a5be60cf0335f72a7a53ba8-9aa24a50e23c5abf64af807eafde6745:?]
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [blob_p-e19d134d450ba3b11a5be60cf0335f72a7a53ba8-9aa24a50e23c5abf64af807eafde6745:?]
at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286) [blob_p-e19d134d450ba3b11a5be60cf0335f72a7a53ba8-9aa24a50e23c5abf64af807eafde6745:?]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [blob_p-e19d134d450ba3b11a5be60cf0335f72a7a53ba8-9aa24a50e23c5abf64af807eafde6745:?]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [blob_p-e19d134d450ba3b11a5be60cf0335f72a7a53ba8-9aa24a50e23c5abf64af807eafde6745:?]
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [blob_p-e19d134d450ba3b11a5be60cf0335f72a7a53ba8-9aa24a50e23c5abf64af807eafde6745:?]
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410) [blob_p-e19d134d450ba3b11a5be60cf0335f72a7a53ba8-9aa24a50e23c5abf64af807eafde6745:?]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [blob_p-e19d134d450ba3b11a5be60cf0335f72a7a53ba8-9aa24a50e23c5abf64af807eafde6745:?]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [blob_p-e19d134d450ba3b11a5be60cf0335f72a7a53ba8-9aa24a50e23c5abf64af807eafde6745:?]
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919) [blob_p-e19d134d450ba3b11a5be60cf0335f72a7a53ba8-9aa24a50e23c5abf64af807eafde6745:?]
at io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:795) [blob_p-e19d134d450ba3b11a5be60cf0335f72a7a53ba8-9aa24a50e23c5abf64af807eafde6745:?]
at io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:480) [blob_p-e19d134d450ba3b11a5be60cf0335f72a7a53ba8-9aa24a50e23c5abf64af807eafde6745:?]
at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:378) [blob_p-e19d134d450ba3b11a5be60cf0335f72a7a53ba8-9aa24a50e23c5abf64af807eafde6745:?]
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989) [blob_p-e19d134d450ba3b11a5be60cf0335f72a7a53ba8-9aa24a50e23c5abf64af807eafde6745:?]
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) [blob_p-e19d134d450ba3b11a5be60cf0335f72a7a53ba8-9aa24a50e23c5abf64af807eafde6745:?]
at java.lang.Thread.run(Unknown Source) [?:?]
2025-11-07 09:15:51,388 INFO com.oceanbase.clogproxy.client.connection.ClientStream [] - Try to stop this client
2025-11-07 09:15:52,681 INFO com.oceanbase.clogproxy.client.connection.ClientStream [] - Client process thread exit
2025-11-07 09:15:52,682 INFO com.oceanbase.clogproxy.client.connection.ClientStream [] - Client stopped successfully
2025-11-07 09:15:52,682 INFO com.oceanbase.clogproxy.client.connection.ClientHandler [] - Channel closed with ClientId: 6662, LogProxy: 172.28.4.182:2983
2025-11-07 09:16:24,422 INFO org.apache.flink.runtime.taskmanager.Task [] - Attempting to cancel task Source: Custom Source (1/1)#1 (852ed247850ddabc478a2aaa240f0a0c_bc764cd8ddf7a0cff126f51c16239658_0_1).
2025-11-07 09:16:24,422 INFO org.apache.flink.runtime.taskmanager.Task [] - Source: Custom Source (1/1)#1 (852ed247850ddabc478a2aaa240f0a0c_bc764cd8ddf7a0cff126f51c16239658_0_1) switched from RUNNING to CANCELING.
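Since the task goes from RUNNING to CANCELING after the refused handshake, one stop-gap while root-causing is to let Flink restart the job automatically. A minimal sketch, assuming a recent Flink (1.16+) where the `restart-strategy.type` key is available; the attempt and delay values below are placeholders, not recommendations:

```yaml
# flink-conf.yaml — fixed-delay restart strategy (illustrative values)
restart-strategy.type: fixed-delay
restart-strategy.fixed-delay.attempts: 10   # retries before the job fails permanently
restart-strategy.fixed-delay.delay: 30 s    # wait between retries
```

Note that this only masks the symptom: the "Failed to create oblogreader" refusal on the LogProxy side still needs a root cause.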
oblogreader log at /home/ds/oblogproxy/run/6662/log/oblogreader.log:
[2025-11-08 16:42:08.905706] [info] comm.cpp(380): About to write mssage: DATA_CLIENT, ch: 360344791018373129, msg type: 6
[2025-11-08 16:42:09.046748] [info] comm.cpp(380): About to write mssage: DATA_CLIENT, ch: 360344791018373129, msg type: 6
[2025-11-08 16:42:09.047236] [info] comm.cpp(380): About to write mssage: DATA_CLIENT, ch: 360344791018373129, msg type: 6
[2025-11-08 16:42:09.909425] [info] comm.cpp(380): About to write mssage: DATA_CLIENT, ch: 360344791018373129, msg type: 6
[2025-11-08 16:42:09.970968] [error] comm.cpp(398): Failed to send message through channel:360344791018373129, error:断开的管道 (Broken pipe)
[2025-11-08 16:42:09.971027] [warning] sender_routine.cpp(204): Failed to send record data message to client, peer: 360344791018373129
[2025-11-08 16:42:09.971061] [error] sender_routine.cpp(170): Failed to write LogMessage to client: id:360344791018373129, fd:9, addr:83899309, port:41296
[2025-11-08 16:42:09.971110] [info] comm.cpp(63): Communicator stopping
[2025-11-08 16:42:09.971125] [info] comm.cpp(65): Communicator stopped
[2025-11-08 16:42:10.804515] [info] counter.cpp(69): Counter:[Span:2000ms][Delay:93841709,93880355][RCNT:7][RRPS:3][RIOS:647][WCNT:6][WRPS:3][WIOS:8182,AVG:2727][XWIOS:3273,AVG:1091][RFETCH:1914891][ROFFER:261][SPOLL:1914054][SENCODE:0][SSEND:320][NRecordQ:0]
[2025-11-08 16:42:11.909478] [info] reader_routine.cpp(84): Fetch liboblog timeout, nothing incoming...
[2025-11-08 16:42:11.909756] [info] counter.cpp(69): Counter:[Span:1105ms][Delay:94946931,94985577][RCNT:0][RRPS:0][RIOS:0][WCNT:0][WRPS:0][WIOS:0,AVG:0][XWIOS:0,AVG:0][RFETCH:0][ROFFER:0][SPOLL:0][SENCODE:0][SSEND:0][NRecordQ:0]
[2025-11-08 16:42:11.909813] [info] counter.cpp(81): #### Counter thread stop, tid: 109207
[2025-11-08 16:42:11.910232] [info] thread.cpp(46): << Joined thread: ReaderRoutine(109204)
[2025-11-08 16:42:11.910263] [info] thread.cpp(44): << Joining thread: SenderRoutine(109206)
[2025-11-08 16:42:13.575169] [info] thread.cpp(46): << Joined thread: SenderRoutine(109206)
[2025-11-08 16:42:13.575282] [warning] oblogreader_entry.cpp(69): !!! Exiting oblogreader process(109203) with peer: id:360344791018373129, fd:9, addr:83899309, port:41296, client meta: type:0, id:6662, ip:173.51.0.5, version:1.1.2, configuration:tb_white_list=xxxx cluster_user=xxx timezone=+08:00 rootserver_list=xxx:2882:2881 cluster_password=AD1B8C097F40ED3C0E35F228DE1213787C965682 cluster_id= tb_black_list=| working_mode=storage first_start_timestamp_us=0 first_start_timestamp=0 , pid:0, peer:fd:9, register_time:1762507312, enable_monitor:0, packet_version:2,
[2025-11-08 16:42:13.575312] [info] comm.cpp(63): Communicator stopping
[2025-11-08 16:42:13.575320] [info] comm.cpp(65): Communicator stopped
No abnormal entries were found in oblogproxy's logproxy.log.
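To correlate the client-side handshake failures with the reader-side broken-pipe exits, a quick first step is to pull the timestamps of all `[error]` entries out of the oblogreader log. A minimal sketch; the default log path and the embedded sample line are assumptions taken from this report, so adjust them to your deployment:

```shell
# Hypothetical default path, taken from this report (assumption).
LOG="${1:-/home/ds/oblogproxy/run/6662/log/oblogreader.log}"

# Fall back to a self-contained demo file so the sketch runs anywhere.
if [ ! -f "$LOG" ]; then
  LOG=$(mktemp)
  printf '%s\n' \
    '[2025-11-08 16:42:09.970968] [error] comm.cpp(398): Failed to send message through channel:360344791018373129, error:Broken pipe' \
    > "$LOG"
fi

# Print the timestamp of each [error] entry (the bracketed prefix of the line).
ERR_TS=$(sed -n 's/^\[\([^]]*\)\] \[error\].*/\1/p' "$LOG")
echo "$ERR_TS"
```

Comparing these timestamps against the Flink TaskManager's "refused handshake" entries and against system logs on the LogProxy host (for example OOM-killer messages or cron jobs scheduled around midnight) can show which side dropped the connection first.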