Ignore logging of duplicated exception messages of third party libs
I need to deal with duplication of specific exceptions in my logs.
I use slf4j with logback for logging in my application. The application talks to several external services (a database, Apache Kafka, third-party libraries, and so on). When the connection to such a service is lost, I get an exception such as:
[kafka-producer-network-thread | producer-1] WARN o.a.kafka.common.network.Selector - Error in I/O with localhost/127.0.0.1
java.net.ConnectException: Connection refused: no further information
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:1.8.0_45]
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) ~[na:1.8.0_45]
at org.apache.kafka.common.network.Selector.poll(Selector.java:238) ~[kafka-clients-0.8.2.0.jar:na]
at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:192) [kafka-clients-0.8.2.0.jar:na]
at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:191) [kafka-clients-0.8.2.0.jar:na]
at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:122) [kafka-clients-0.8.2.0.jar:na]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_45]
The problem is that I receive this message every second. The exception messages flood my log file, so after N hours it grows to several gigabytes.
I would like to get a log message for this exception only once every 1-5 minutes. Is there any way to ignore duplicated exceptions in the log file?
Possible solutions:
- Ignore all logging for the specific package and class. [Bad, because I could skip important messages.]
- Use http://logback.qos.ch/manual/filters.html#DuplicateMessageFilter . [Not good, because I can only set the AllowedRepetitions or CacheSize properties; it matches all messages, while I need it only for specific exceptions.]
- Write a custom filter.
Perhaps you know of an already implemented solution?
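For reference, the stock filter mentioned above is enabled with a configuration like the following (the property values here are illustrative). AllowedRepetitions and CacheSize are its only knobs, and it applies to every message, which is exactly the limitation described:

```xml
<configuration>
    <!-- Suppresses ANY message after it has repeated AllowedRepetitions times;
         the cache remembers the last CacheSize distinct messages. -->
    <turboFilter class="ch.qos.logback.classic.turbo.DuplicateMessageFilter">
        <AllowedRepetitions>2</AllowedRepetitions>
        <CacheSize>100</CacheSize>
    </turboFilter>
</configuration>
```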
I think your best bet is to simply extend the DuplicateMessageFilter you have already found. It is not final, and it is quite easy to:
- implement a new TurboFilter that makes the initial decision based on the class name, the exception type, or whatever other criterion you want, and
- then delegate to the parent class for the duplicity check.
The parameters available to you:
public FilterReply decide(Marker marker, Logger logger, Level level,
                          String format, Object[] params, Throwable t) {
include the Throwable and the Logger.
It is really easy to write a new turbo filter and implement whatever logic you need to deny specific logging events.
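To illustrate the exception-type idea with a standalone sketch (this helper class is hypothetical, not logback API): a filter's decide() receives the Throwable, so it can walk the cause chain and target only a given exception class.

```java
import java.net.ConnectException;

public class CauseChainMatcher {
    // Returns true if t, or any exception in its cause chain, is an instance
    // of the given type. A TurboFilter's decide() could call this on the
    // Throwable parameter to single out e.g. java.net.ConnectException.
    public static boolean hasCause(Throwable t, Class<? extends Throwable> type) {
        while (t != null) {
            if (type.isInstance(t)) {
                return true;
            }
            t = t.getCause();
        }
        return false;
    }

    public static void main(String[] args) {
        Throwable wrapped = new RuntimeException(new ConnectException("Connection refused"));
        System.out.println(hasCause(wrapped, ConnectException.class)); // the cause matches
    }
}
```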
I have installed the new filter using the following configuration in logback.xml:
<turboFilter class="package.DuplicationTimeoutTurboFilter">
    <MinutesToBlock>3</MinutesToBlock>
    <KeyPattern>
        <loggerClass>org.apache.kafka.common.network.Selector</loggerClass>
        <message>java.net.ConnectException: Connection refused: no further information</message>
    </KeyPattern>
</turboFilter>
And the implementation:
import ch.qos.logback.classic.Level;
import ch.qos.logback.classic.Logger;
import ch.qos.logback.classic.turbo.TurboFilter;
import ch.qos.logback.core.spi.FilterReply;
import org.slf4j.Marker;

import java.time.LocalDateTime;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Objects;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import java.util.stream.Collectors;

public class DuplicationTimeoutTurboFilter extends TurboFilter {

    private static final int CLEAN_UP_THRESHOLD = 1000;

    private final ConcurrentHashMap<KeyPattern, LocalDateTime> recentlyMatchedPatterns = new ConcurrentHashMap<>();
    private final Set<KeyPattern> ignoringPatterns = new HashSet<>();
    private long minutesToBlock = 3L;

    @Override
    public FilterReply decide(Marker marker, Logger logger, Level level, String format, Object[] params, Throwable t) {
        // Sometimes the throwable is passed inside the params argument, so build
        // the raw text from all three sources before matching.
        String rawLogMessage = format + Arrays.toString(params) + Objects.toString(t);

        Set<KeyPattern> matchedIgnoringSet = ignoringPatterns.stream()
                .filter(key -> match(key, logger, rawLogMessage))
                .collect(Collectors.toSet());

        if (!matchedIgnoringSet.isEmpty() && isLoggedRecently(matchedIgnoringSet)) {
            return FilterReply.DENY;
        }
        return FilterReply.NEUTRAL;
    }

    private boolean match(KeyPattern keyPattern, Logger logger, String rawText) {
        String loggerClass = keyPattern.getLoggerClass();
        String messagePattern = keyPattern.getMessage();
        return loggerClass.equals(logger.getName()) && rawText.contains(messagePattern);
    }

    private boolean isLoggedRecently(Set<KeyPattern> matchedIgnoredList) {
        for (KeyPattern pattern : matchedIgnoredList) {
            LocalDateTime now = LocalDateTime.now();
            LocalDateTime lastLogTime = recentlyMatchedPatterns.putIfAbsent(pattern, now);
            if (lastLogTime == null) {
                return false; // first occurrence: let it through
            }
            LocalDateTime blockedTillTime = lastLogTime.plusMinutes(minutesToBlock);
            if (now.isBefore(blockedTillTime)) {
                return true; // still inside the blocking window: suppress
            }
            // Window has expired: let the event through and restart the window.
            recentlyMatchedPatterns.put(pattern, now);
            cleanupIfNeeded();
            return false;
        }
        return true;
    }

    private void cleanupIfNeeded() {
        if (recentlyMatchedPatterns.size() > CLEAN_UP_THRESHOLD) {
            LocalDateTime oldTime = LocalDateTime.now().minusMinutes(minutesToBlock * 2);
            // Drop entries whose last hit is older than twice the blocking window.
            recentlyMatchedPatterns.values().removeIf(lastLogTime -> lastLogTime.isBefore(oldTime));
        }
    }

    public long getMinutesToBlock() {
        return minutesToBlock;
    }

    public void setMinutesToBlock(long minutesToBlock) {
        this.minutesToBlock = minutesToBlock;
    }

    public void addKeyPattern(KeyPattern keyPattern) {
        ignoringPatterns.add(keyPattern);
    }

    public static class KeyPattern {
        private String loggerClass;
        private String message;
        //constructor, getters, setters, equals, hashcode
    }
}
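The time-window bookkeeping at the heart of isLoggedRecently() can be exercised on its own. Below is a minimal standalone sketch of that logic (the class name and the explicit "now" parameter are illustrative, introduced here so the behavior is deterministic to test; the filter above uses LocalDateTime.now() directly):

```java
import java.time.Duration;
import java.time.Instant;
import java.util.concurrent.ConcurrentHashMap;

public class DuplicateWindow {
    private final ConcurrentHashMap<String, Instant> lastSeen = new ConcurrentHashMap<>();
    private final Duration window;

    public DuplicateWindow(Duration window) {
        this.window = window;
    }

    // Returns true if the key was already seen within the window (suppress),
    // false if it is new or the window expired (log it; the timestamp resets).
    public boolean seenRecently(String key, Instant now) {
        Instant previous = lastSeen.putIfAbsent(key, now);
        if (previous == null) {
            return false;            // first occurrence: log it
        }
        if (now.isBefore(previous.plus(window))) {
            return true;             // inside the window: suppress
        }
        lastSeen.put(key, now);      // window expired: log again, restart window
        return false;
    }
}
```

With a 3-minute window, the first event is logged, a repeat 60 seconds later is suppressed, and a repeat after the window expires is logged again.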