
log4j2-elasticsearch's People

Contributors

cuneytcalishkan, haikal00, harryssuperman, martinhenningjensen, nithinnambiar, rfoltyns, thaarbach, tillerino, turesheim


log4j2-elasticsearch's Issues

Problem with dependencies - io.searchbox:jest

Hello,

I think there is a problem with the dependency on io.searchbox:jest:2.4.0. This dependency now resolves to version 6.3.1, and there is a breaking change in the API:
BufferedBulk.getURI() doesn't exist anymore; it now takes a parameter of type ElasticsearchVersion

String requestURL = getRequestURL(getNextServer(), clientRequest.getURI());

gradle dependencies:

implementation "com.lmax:disruptor:3.4.2"
implementation 'io.netty:netty-buffer:4.1.32.Final'
implementation 'org.appenders.log4j:log4j2-elasticsearch-jest:1.3.5'
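A possible workaround, hedged since I can't verify it against this project: force Gradle to keep resolving io.searchbox:jest at the version the report above says the 1.3.5 appender expects (2.4.0) instead of the transitively upgraded 6.3.1:

```groovy
configurations.all {
    resolutionStrategy {
        // pin io.searchbox:jest to the version log4j2-elasticsearch-jest:1.3.5 was built against
        force 'io.searchbox:jest:2.4.0'
    }
}
```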


I took the below configuration from your examples.
log4j2 appender configuration:

<Elasticsearch name="elasticsearchAsyncBatch">
    <RollingIndexName indexName="log4j2_test_jest" pattern="yyyy-MM-dd-HH" timeZone="Europe/Warsaw" />
    <ThresholdFilter level="INFO" onMatch="ACCEPT"/>
    <JacksonJsonLayout>
        <PooledItemSourceFactory poolName="itemPool" itemSizeInBytes="512" initialPoolSize="10000"
                                 monitored="true" monitorTaskInterval="10000" resizeTimeout="500">
            <UnlimitedResizePolicy resizeFactor="0.6" />
        </PooledItemSourceFactory>
    </JacksonJsonLayout>
    <AsyncBatchDelivery batchSize="10000"
                        deliveryInterval="3000" >
        <IndexTemplate name="test_template_jest" path="classpath:indexTemplate.json" />
        <JestBufferedHttp serverUris="http://localhost:9200"
                          connTimeout="500"
                          readTimeout="30000"
                          maxTotalConnection="40"
                          defaultMaxTotalConnectionPerRoute="8"
                          mappingType="index">
            <PooledItemSourceFactory poolName="batchPool" itemSizeInBytes="5120000" initialPoolSize="3"
                                     monitored="true" monitorTaskInterval="10000" resizeTimeout="500">
                <UnlimitedResizePolicy resizeFactor="0.70" />
            </PooledItemSourceFactory>
        </JestBufferedHttp>
    </AsyncBatchDelivery>
</Elasticsearch>

As soon as the logger configuration kicks in, the following error occurs:

java.lang.NoSuchMethodError: org.appenders.log4j2.elasticsearch.jest.BufferedBulk.getURI()Ljava/lang/String;
	at org.appenders.log4j2.elasticsearch.jest.BufferedJestHttpClient.prepareRequest(BufferedJestHttpClient.java:63) ~[log4j2-elasticsearch-jest-1.3.5.jar:?]
	at org.appenders.log4j2.elasticsearch.jest.BufferedJestHttpClient.executeAsync(BufferedJestHttpClient.java:47) ~[log4j2-elasticsearch-jest-1.3.5.jar:?]
	at org.appenders.log4j2.elasticsearch.jest.JestHttpObjectFactory$1.apply(JestHttpObjectFactory.java:195) ~[log4j2-elasticsearch-jest-1.3.5.jar:?]
	at org.appenders.log4j2.elasticsearch.jest.JestHttpObjectFactory$1.apply(JestHttpObjectFactory.java:178) ~[log4j2-elasticsearch-jest-1.3.5.jar:?]
	at org.appenders.log4j2.elasticsearch.BulkEmitter.notifyListener(BulkEmitter.java:93) ~[log4j2-elasticsearch-core-1.3.5.jar:?]
	at org.appenders.log4j2.elasticsearch.BulkEmitter.stop(BulkEmitter.java:159) ~[log4j2-elasticsearch-core-1.3.5.jar:?]
	at org.appenders.log4j2.elasticsearch.AsyncBatchDelivery.stop(AsyncBatchDelivery.java:182) ~[log4j2-elasticsearch-core-1.3.5.jar:?]
	at org.appenders.log4j2.elasticsearch.ItemSourceAppender.stop(ItemSourceAppender.java:72) ~[log4j2-elasticsearch-core-1.3.5.jar:?]
	at org.appenders.log4j2.elasticsearch.ElasticsearchAppender.lifecycleStop(ElasticsearchAppender.java:240) ~[log4j2-elasticsearch-core-1.3.5.jar:?]
	at org.appenders.log4j2.elasticsearch.ElasticsearchAppender.stop(ElasticsearchAppender.java:219) ~[log4j2-elasticsearch-core-1.3.5.jar:?]
	at org.apache.logging.log4j.core.config.AbstractConfiguration.stop(AbstractConfiguration.java:398) ~[log4j-core-2.12.1.jar:2.12.1]
	at org.apache.logging.log4j.core.AbstractLifeCycle.stop(AbstractLifeCycle.java:136) ~[log4j-core-2.12.1.jar:2.12.1]
	at org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:584) ~[log4j-core-2.12.1.jar:2.12.1]
	at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:651) ~[log4j-core-2.12.1.jar:2.12.1]
	at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:668) ~[log4j-core-2.12.1.jar:2.12.1]

Any help is much appreciated; thank you very much for the good work.

Cheers

elasticsearch without Xpack

Hi,

We run Elasticsearch without X-Pack (the free version).
Clients like curl and others using Basic Authentication work fine.
The configuration below does not work:

conf

(the configuration was attached as an image and cannot be reproduced here)
exception
2019-01-15 09:47:40,202 main ERROR Could not create plugin of type class org.appenders.log4j2.elasticsearch.jest.XPackAuth for element XPackAuth org.apache.logging.log4j.core.config.ConfigurationException: Arguments given for element XPackAuth are invalid: field 'certInfo' has invalid value 'null'

Java SPI Classpath Issue

When deploying an application using this appender we are seeing that the Java Service Provider Interface is used to register the BulkEmitter service. Once an application is deployed, all subsequent deployments fail with this error:

Failed to deploy artifact [xxx-xxxxx-xxxxx]
Caused by: java.util.ServiceConfigurationError: org.appenders.log4j2.elasticsearch.BatchEmitterFactory: Provider org.appenders.log4j2.elasticsearch.jest.BulkEmitterFactory not a subtype
at java.util.ServiceLoader.fail(ServiceLoader.java:239) ~[?:1.8.0_202]
at java.util.ServiceLoader.access$300(ServiceLoader.java:185) ~[?:1.8.0_202]
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:376) ~[?:1.8.0_202]
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404) ~[?:1.8.0_202]
at java.util.ServiceLoader$1.next(ServiceLoader.java:480) ~[?:1.8.0_202]
at org.appenders.log4j2.elasticsearch.spi.BatchEmitterServiceProvider.createInstance(BatchEmitterServiceProvider.java:66) ~[?:?]

Is there any recommendation or workaround known for this issue?
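No fix from the project here, but a hedged note on the error class itself: "not a subtype" from java.util.ServiceLoader usually means the SPI interface (BatchEmitterFactory in this report) was loaded by two different classloaders, e.g. once per deployment, so placing the appender jars in a shared/parent classloader rather than in each deployment is a common remedy. A standalone sketch of classloader-explicit loading; CharsetProvider is only a stand-in so the snippet runs without the library:

```java
import java.nio.charset.spi.CharsetProvider;
import java.util.ServiceLoader;

public class SpiProbe {
    public static void main(String[] args) {
        // Passing an explicit classloader, instead of relying on the thread
        // context classloader, makes the lookup deterministic: providers are
        // then checked against the same Class object that requested them.
        ServiceLoader<CharsetProvider> loader = ServiceLoader.load(
                CharsetProvider.class, CharsetProvider.class.getClassLoader());
        loader.forEach(p -> System.out.println(p.getClass().getName()));
    }
}
```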

Thank You

Supporting Filter

Hi Rafal,
I tried to combine the ElasticsearchAppender with a ThresholdFilter, but it looks like Filters are not supported.
Would it be possible to support Filters?
I need something like this:

		<Elasticsearch name="elasticsearch">
			<JsonLayout compact="true" properties="true" />
			<ThresholdFilter level="INFO" onMatch="ACCEPT"/>
			<AsyncBatchDelivery indexName="log4j2" deliveryInterval="5000" batchSize="500">
				<JestHttp serverUris="http://localhost:9200" />
			</AsyncBatchDelivery>
		</Elasticsearch>
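As a possible workaround until the appender itself accepts filters, Log4j2 allows a filter on the AppenderRef instead; a sketch using the appender name from the snippet above:

```xml
<Loggers>
    <Root level="debug">
        <AppenderRef ref="elasticsearch">
            <ThresholdFilter level="INFO" onMatch="ACCEPT" onMismatch="DENY"/>
        </AppenderRef>
    </Root>
</Loggers>
```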

How to override LogEventJacksonJsonMixIn properly?

I am trying to add two new properties to every log entry using a CustomLogEventMixIn.

import com.fasterxml.jackson.annotation.JsonProperty
import com.fasterxml.jackson.databind.annotation.JsonSerialize
import org.apache.logging.log4j.core.LogEvent
import org.apache.logging.log4j.core.jackson.LogEventJacksonJsonMixIn
import java.time.Instant

@JsonSerialize(`as` = LogEvent::class)
abstract class CustomLogEventMixIn : LogEventJacksonJsonMixIn() {

    @JsonProperty("@timestamp")
    fun getTimestamp(): Instant {
        return Instant.ofEpochMilli(this.timeMillis)
    }

    @JsonProperty
    fun getEnvironment(): String {
        return System.getenv("SPRING_PROFILES_ACTIVE")
    }

}

My log4j2.xml

<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN">
    <Appenders>
        <Console name="Console" target="SYSTEM_OUT">
            <PatternLayout pattern="%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n"/>
        </Console>
        <Elasticsearch name="elasticsearchAsyncBatch">
            <JacksonJsonLayout>
                <JacksonMixIn mixInClass="de.ottonow.reporting.CustomLogEventMixIn"
                              targetClass="org.apache.logging.log4j.core.LogEvent"/>
            </JacksonJsonLayout>
            <RollingIndexName indexName="log4j2" pattern="yyyy-MM-dd" />
            <AsyncBatchDelivery>

                <JestHttp serverUris="https:/foobar/_bulk" />
            </AsyncBatchDelivery>
        </Elasticsearch>
    </Appenders>
    <Loggers>
        <Root level="info">
            <AppenderRef ref="Console"/>
            <AppenderRef ref="elasticsearchAsyncBatch"/>
        </Root>
    </Loggers>
</Configuration>

I would expect the two properties to be in every log entry, however, the properties never get to the index and I can't see them.

I am using Elasticsearch 7.

Any help is appreciated.

Supporting ecs-logging-java

Feature Request

Elastic has created a uniform schema, the Elastic Common Schema (ECS), for the Elastic Stack. For logging with Log4j2, the log4j2-ecs-layout has been published.

This feature would also make the log4j2 appenders compatible with the Logs UI in Kibana.

See
ECS-based logging for Java applications
Log4j2 ECS Layout

Example:

<Appenders>
    <Elasticsearch name="elasticsearchAsyncBatch">
        <IndexName indexName="log4j2"/>
        <EcsLayout>
                <KeyValuePair key="additionalField1" value="constant value"/>
                <KeyValuePair key="additionalField2" value="$${ctx:key}"/>
         </EcsLayout>
        <AsyncBatchDelivery batchSize="1000" deliveryInterval="10000" >
            <HCHttp serverUris="http://localhost:9200">
                <PooledItemSourceFactory poolName="batchPool" itemSizeInBytes="1024000" initialPoolSize="3"/>
            </HCHttp>
        </AsyncBatchDelivery>
    </Elasticsearch>
</Appenders>

This could also make <IndexTemplate name="log4j2" path="classpath:indexTemplate-7.json" /> obsolete.

Can't authenticate as AWS Elasticsearch Service does not return WWW-Authenticate header

Description
I am using an AWS Elasticsearch instance (which doesn't support X-Pack) with fine-grained access control, using the master username and password as basic credentials. I got the exception below.

2020-05-31 23:52:57,142 I/O dispatcher 1 WARN Unrecognized token 'Unauthorized': was expecting (JSON String, Number, Array, Object or token 'null', 'true' or 'false') at [Source: (org.appenders.log4j2.elasticsearch.hc.ItemSourceContentInputStream); line: 1, column: 13]
com.fasterxml.jackson.core.JsonParseException: Unrecognized token 'Unauthorized': was expecting (JSON String, Number, Array, Object or token 'null', 'true' or 'false') at [Source: (org.appenders.log4j2.elasticsearch.hc.ItemSourceContentInputStream); line: 1, column: 13]
	at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:1840)
	at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:722)
	at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._reportInvalidToken(UTF8StreamJsonParser.java:3556)
	at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._handleUnexpectedValue(UTF8StreamJsonParser.java:2651)
	at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._nextTokenNotInObject(UTF8StreamJsonParser.java:856)
	at com.fasterxml.jackson.core.json.UTF8StreamJsonParser.nextToken(UTF8StreamJsonParser.java:753)
	at com.fasterxml.jackson.databind.ObjectReader._initForReading(ObjectReader.java:357)
	at com.fasterxml.jackson.databind.ObjectReader._bindAndClose(ObjectReader.java:1704)
	at com.fasterxml.jackson.databind.ObjectReader.readValue(ObjectReader.java:1228)
	at org.appenders.log4j2.elasticsearch.hc.HCHttp$1.deserializeResponse(HCHttp.java:190)
	at org.appenders.log4j2.elasticsearch.hc.HCHttp$1.deserializeResponse(HCHttp.java:158)
	at org.appenders.log4j2.elasticsearch.hc.HCResultCallback.completed(HCResultCallback.java:55)
	at org.appenders.log4j2.elasticsearch.hc.HCResultCallback.completed(HCResultCallback.java:38)
	at org.apache.http.concurrent.BasicFuture.completed(BasicFuture.java:122)
	at org.apache.http.impl.nio.client.DefaultClientExchangeHandlerImpl.responseCompleted(DefaultClientExchangeHandlerImpl.java:181)
	at org.apache.http.nio.protocol.HttpAsyncRequestExecutor.processResponse(HttpAsyncRequestExecutor.java:448)
	at org.apache.http.nio.protocol.HttpAsyncRequestExecutor.inputReady(HttpAsyncRequestExecutor.java:338)
	at org.apache.http.impl.nio.DefaultNHttpClientConnection.consumeInput(DefaultNHttpClientConnection.java:265)
	at org.apache.http.impl.nio.client.InternalIODispatch.onInputReady(InternalIODispatch.java:81)
	at org.apache.http.impl.nio.client.InternalIODispatch.onInputReady(InternalIODispatch.java:39)
	at org.apache.http.impl.nio.reactor.AbstractIODispatch.inputReady(AbstractIODispatch.java:121)
	at org.apache.http.impl.nio.reactor.BaseIOReactor.readable(BaseIOReactor.java:162)
	at org.apache.http.impl.nio.reactor.AbstractIOReactor.processEvent(AbstractIOReactor.java:337)
	at org.apache.http.impl.nio.reactor.AbstractIOReactor.processEvents(AbstractIOReactor.java:315)
	at org.apache.http.impl.nio.reactor.AbstractIOReactor.execute(AbstractIOReactor.java:276)
	at org.apache.http.impl.nio.reactor.BaseIOReactor.execute(BaseIOReactor.java:104)
	at org.apache.http.impl.nio.reactor.AbstractMultiworkerIOReactor$Worker.run(AbstractMultiworkerIOReactor.java:591)
	at java.lang.Thread.run(Thread.java:748)

Can you please suggest a solution?

Configuration
<Elasticsearch name="elasticsearch">
    <IndexName indexName="customerdata"/>
    <JacksonJsonLayout>
        <PooledItemSourceFactory poolName="itemPool" itemSizeInBytes="1024" initialPoolSize="3000"/>
    </JacksonJsonLayout>
    <AsyncBatchDelivery batchSize="1000" deliveryInterval="10000">
        <HCHttp serverUris="${env:AWS_ES_URL}">
            <Security>
                <BasicCredentials username="USERNAME" password="PASSWORD" />
            </Security>
            <PooledItemSourceFactory poolName="batchPool" itemSizeInBytes="1024000" initialPoolSize="3"/>
        </HCHttp>
    </AsyncBatchDelivery>
</Elasticsearch>

Runtime (please complete the following information):

  • log4j2-elasticsearch-hc:1.4.1
  • ES version 7.4
  • JVM openJDK
  • OS: Ubuntu

Usage of pattern yyyy-ww does not change index every 7 days

Description
Hello,

For a couple of months we have been using your appender to send log events to ELK, but for some reason it is not rolling the weekly index.
What we actually see is that the index stays for more than 10 days, and when it does change, it is neither on a Sunday nor on a Monday.

Configuration

<Appenders>
	<Elasticsearch name="elasticsearchAsyncBatch">
		<RollingIndexName indexName="log4j" pattern="yyyy-ww" timeZone="UTC" separator="." />
		<ThresholdFilter level="ALL" onMatch="ACCEPT"/>
		<JacksonJsonLayout>
			<PooledItemSourceFactory poolName="itemPool" itemSizeInBytes="512" initialPoolSize="10000"
										monitored="true" monitorTaskInterval="10000" resizeTimeout="500">
				<!--<UnlimitedResizePolicy resizeFactor="0.6"/>-->
			</PooledItemSourceFactory>
			<JacksonMixIn mixInClass="come.project.elk.CustomLogEventJacksonJsonMixIn"
                          targetClass="org.apache.logging.log4j.core.LogEvent"/>
		</JacksonJsonLayout>
		<AsyncBatchDelivery batchSize="5000"
							deliveryInterval="3000">
			<IndexTemplate name="vegas" path="classpath:elasticLogIndexTemplate.json"/>
			<JestBufferedHttp serverUris="http://somehost:9200"
								connTimeout="500"
								readTimeout="30000"
								maxTotalConnection="40"
								defaultMaxTotalConnectionPerRoute="8"
								mappingType="_doc">
				<PooledItemSourceFactory poolName="batchPool" itemSizeInBytes="5120000" initialPoolSize="5"
											monitored="true" monitorTaskInterval="10000" resizeTimeout="500">
					<UnlimitedResizePolicy resizeFactor="0.70"/>
				</PooledItemSourceFactory>
			</JestBufferedHttp>
		</AsyncBatchDelivery>
	</Elasticsearch>
	
</Appenders>
  
<Loggers>
	<Root level="trace">
		<AppenderRef ref="elasticsearchAsyncBatch" />
	</Root>
</Loggers>

Runtime (please complete the following information):

  • log4j2-elasticsearch-jest-1.4.1.jar
  • log4j2-elasticsearch-core-1.4.1.jar
  • WebLogic Server 12c
  • JVM 8
  • OS: Red Hat 7.5

Additional context
Trying with daily or hourly pattern it is ok.
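Not an answer, but a detail worth knowing while debugging this: if the pattern is rendered with java.text.SimpleDateFormat or similar, the 'ww' week-of-year field takes its first-day-of-week and minimal-days-in-first-week from the locale, so week boundaries need not fall on a Sunday or a Monday consistently across machines. A standalone sketch (the locale choices are illustrative, not taken from the appender):

```java
import java.text.SimpleDateFormat;
import java.util.GregorianCalendar;
import java.util.Locale;
import java.util.TimeZone;

public class WeekPatternDemo {
    public static void main(String[] args) {
        TimeZone utc = TimeZone.getTimeZone("UTC");

        // 2019-01-06 was a Sunday.
        GregorianCalendar cal = new GregorianCalendar(utc);
        cal.clear();
        cal.set(2019, 0, 6);

        SimpleDateFormat us = new SimpleDateFormat("yyyy-ww", Locale.US);
        us.setTimeZone(utc);
        SimpleDateFormat fr = new SimpleDateFormat("yyyy-ww", Locale.FRANCE);
        fr.setTimeZone(utc);

        // 'ww' depends on the locale's week rules, so the same instant can
        // land in different "weeks" depending on the JVM's default locale.
        System.out.println("US: " + us.format(cal.getTime()));     // weeks start Sunday
        System.out.println("FRANCE: " + fr.format(cal.getTime())); // ISO weeks start Monday
    }
}
```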

Missing any kind of internal logging

Please provide some kind of internal logging.
At the moment I have a problem with an async appender that switches to a failover appender, but there is no log to check the exact error that occurs.

How to add custom fields

Hi,

Would it be possible to add a custom field value to the JSON message sent to Elasticsearch?
For example, it would be good to add the name or IP of the host, so that you can recognize where the log comes from.
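One way this is often done, assuming the appender's JacksonJsonLayout is in use, is its VirtualProperty element combined with a Log4j2 lookup (both appear in other issues in this list); a sketch, hedged on the ${hostName} lookup being available on your platform:

```xml
<JacksonJsonLayout>
    <VirtualProperty name="hostname" value="${hostName}"/>
</JacksonJsonLayout>
```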

Thanks.

Exception in thread "BatchNotifier" java.lang.StackOverflowError

Hello,

I am getting:
2019-08-30 09:06:54,998 BatchNotifier ERROR Deferred operation failed: IndexTemplate not added: {"root_cause":[{"type":"mapper_parsing_exception","reason":"Root mapping definition has unsupported parameters: [default : {dynamic_templates=[{strings={mapping={type=keyword}, match_mapping_type=string, match=}}], _all={enabled=false}, properties={loggerName={type=text, fields={keyword={ignore_above=256, type=keyword}}}, message={type=text, fields={keyword={ignore_above=256, type=keyword}}}, timeMillis={format=epoch_millis, type=date}}}]"}],"type":"mapper_parsing_exception","reason":"Failed to parse mapping [_doc]: Root mapping definition has unsupported parameters: [default : {dynamic_templates=[{strings={mapping={type=keyword}, match_mapping_type=string, match=}}], _all={enabled=false}, properties={loggerName={type=text, fields={keyword={ignore_above=256, type=keyword}}}, message={type=text, fields={keyword={ignore_above=256, type=keyword}}}, timeMillis={format=epoch_millis, type=date}}}]","caused_by":{"type":"mapper_parsing_exception","reason":"Root mapping definition has unsupported parameters: [default : {dynamic_templates=[{strings={mapping={type=keyword}, match_mapping_type=string, match=*}}], _all={enabled=false}, properties={loggerName={type=text, fields={keyword={ignore_above=256, type=keyword}}}, message={type=text, fields={keyword={ignore_above=256, type=keyword}}}, timeMillis={format=epoch_millis, type=date}}}]"}}

Exception in thread "BatchNotifier" java.lang.StackOverflowError
at java.util.concurrent.ConcurrentHashMap.get(ConcurrentHashMap.java:936)
at com.google.gson.Gson.getAdapter(Gson.java:434)
at com.google.gson.internal.bind.TypeAdapterRuntimeTypeWrapper.write(TypeAdapterRuntimeTypeWrapper.java:56)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1.write(ReflectiveTypeAdapterFactory.java:127)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter.write(ReflectiveTypeAdapterFactory.java:245)

Thanks for your help

How can I use different layouts with HCHttp client , other than JacksonJsonLayout.

Good day, sir. I am having a small issue here and could really use your advice or recommendations.

Currently I'm trying to append logs to Elasticsearch using the Elasticsearch appender configured with the Jest client:

<Elasticsearch name="SERVICE_LOGS_ELASTICSEARCH">
    <IndexName indexName="java-service-${bundle:application:info.build.archiveBaseName}"/>
    <PatternLayout pattern="%m%n"/>
    <AsyncBatchDelivery batchSize="1000" deliveryInterval="5000">
        <JestHttp serverUris="${bundle:application:spring.elasticsearch.jest.uris}" mappingType="_doc"/>
    </AsyncBatchDelivery>
</Elasticsearch>

You might be wondering why I would use PatternLayout in the Elasticsearch appender; the main reason is that I use custom messages when logging, to escape the metadata that log4j2 inserts into the log message. Everything works just fine with this configuration. A little later I decided to use the HCHttp client instead of the Jest client, because the documentation mentions it is more optimized. Here is the appender configuration:

<Elasticsearch name="SERVICE_LOGS_ELASTICSEARCH">
    <IndexName indexName="java-service-${bundle:application:info.build.archiveBaseName}"/>
    <PatternLayout pattern="%m%n"/>
    <AsyncBatchDelivery batchSize="1000" deliveryInterval="5000">
        <HCHttp serverUris="${bundle:application:spring.elasticsearch.jest.uris}">
            <PooledItemSourceFactory poolName="batchPool" itemSizeInBytes="1024000" initialPoolSize="3"/>
        </HCHttp>
    </AsyncBatchDelivery>
</Elasticsearch>
This configuration does not work, throwing java.lang.UnsupportedOperationException: Use ItemSource based API instead.
Here is my app, if you wonder what I am trying to achieve: https://github.com/jafarjafarov/actuatorSwaggerCRUDSample. The main idea is to expose different Java objects in Elasticsearch in JSON format during some flow execution.
Thanks in advance

Jest with LogstashLayout

I'm trying to use this project with layout from https://github.com/vy/log4j2-logstash-layout

        <Elasticsearch name="elasticsearchAsyncBatch">
            <IndexName indexName="log4j2"/>
                <LogstashLayout dateTimeFormatPattern="yyyy-MM-dd'T'HH:mm:ss.SSSZZZ"
                                eventTemplateUri="classpath:EcsLayout.json"
                                prettyPrintEnabled="true"
                                stackTraceEnabled="true">
                    <EventTemplateAdditionalFields>
                        <KeyValuePair key="correlationId" value="$${CorrelationId:}"/>
                        <KeyValuePair key="serviceName" value="$${ServiceName:}"/>
                        <KeyValuePair key="environment" value="marx"/>
                        <KeyValuePair key="@timestamp" value="${date:yyyy-MM-dd HH:mm:ss.SSS}"/>
                        <KeyValuePair key="host_name" value="${hostName}"/>
                    </EventTemplateAdditionalFields>
                </LogstashLayout>
            <AsyncBatchDelivery>
                <JestHttp serverUris="http://localhost:9200/_bulk"/>
            </AsyncBatchDelivery>
        </Elasticsearch>

However I'm getting error:

2020-01-13 15:25:20,025 main ERROR Could not create plugin of type class org.appenders.log4j2.elasticsearch.ElasticsearchAppender for element Elasticsearch: java.lang.IllegalArgumentException: Can not set org.apache.logging.log4j.core.layout.AbstractLayout field org.appenders.log4j2.elasticsearch.ElasticsearchAppender$Builder.layout to com.vlkan.log4j2.logstash.layout.LogstashLayout java.lang.IllegalArgumentException: Can not set org.apache.logging.log4j.core.layout.AbstractLayout field org.appenders.log4j2.elasticsearch.ElasticsearchAppender$Builder.layout to com.vlkan.log4j2.logstash.layout.LogstashLayout
	at java.base/jdk.internal.reflect.UnsafeFieldAccessorImpl.throwSetIllegalArgumentException(UnsafeFieldAccessorImpl.java:167)
	at java.base/jdk.internal.reflect.UnsafeFieldAccessorImpl.throwSetIllegalArgumentException(UnsafeFieldAccessorImpl.java:171)
	at java.base/jdk.internal.reflect.UnsafeObjectFieldAccessorImpl.set(UnsafeObjectFieldAccessorImpl.java:81)
	at java.base/java.lang.reflect.Field.set(Field.java:780)
	at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.injectFields(PluginBuilder.java:188)
	at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.build(PluginBuilder.java:121)
	at org.apache.logging.log4j.core.config.AbstractConfiguration.createPluginObject(AbstractConfiguration.java:964)
	at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:904)
	at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:896)
	at org.apache.logging.log4j.core.config.AbstractConfiguration.doConfigure(AbstractConfiguration.java:514)
	at org.apache.logging.log4j.core.config.AbstractConfiguration.initialize(AbstractConfiguration.java:238)
	at org.apache.logging.log4j.core.config.AbstractConfiguration.start(AbstractConfiguration.java:250)
	at org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:548)
	at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:620)
	at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:637)
	at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:231)
	at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:153)
	at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:45)
	at org.apache.logging.log4j.LogManager.getContext(LogManager.java:194)
	at org.apache.commons.logging.LogAdapter$Log4jLog.<clinit>(LogAdapter.java:155)
	at org.apache.commons.logging.LogAdapter$Log4jAdapter.createLog(LogAdapter.java:122)
	at org.apache.commons.logging.LogAdapter.createLog(LogAdapter.java:89)
	at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:67)
	at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:59)
	at org.springframework.boot.SpringApplication.<clinit>(SpringApplication.java:195)
	at com.ydc.discovery.Application.main(Application.java:26)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at org.springframework.boot.maven.AbstractRunMojo$LaunchRunner.run(AbstractRunMojo.java:543)
	at java.base/java.lang.Thread.run(Thread.java:831)

2020-01-13 15:25:20,036 main ERROR Unable to invoke factory method in class org.appenders.log4j2.elasticsearch.ElasticsearchAppender for element Elasticsearch: java.lang.IllegalStateException: No factory method found for class org.appenders.log4j2.elasticsearch.ElasticsearchAppender java.lang.IllegalStateException: No factory method found for class org.appenders.

Referenced issue: vy/log4j2-logstash-layout#51

The field timestamp doesn't exist

Hi. I am trying to use your appender, but when I create the index pattern in Kibana there is no filter by timestamp, and my index doesn't have a "timestamp" field. Could you please help me? Thanks in advance.


Ignoring nodes with non-default cluster name?

Description
I have a "None of the configured nodes are available" exception thrown at org.appenders.log4j2.elasticsearch.bulkprocessor.BulkProcessorObjectFactory line 90.
After digging for a while, I realized that in BulkProcessorObjectFactory.createClient(), after client.addTransportAddress(..), the client still has zero nodes but one filtered node, and during putTemplate(), this.ensureNodesAreAvailable(nodes) throws this exception.

Using custom Lookups for Virtual Properties

I want to use a custom lookup extended from ValueResolver, for a virtual property to resolve the value. The virtual property is given as in the following configuration.

<JacksonJsonLayout>
   <VirtualProperty name="AdditionalField" value="UnresolvedValue" dynamic="true"/>
</JacksonJsonLayout>

I see the value resolver for JacksonJsonLayout is set to Log4j2Lookup. How can I plug in my custom value resolver to do this?

Custom rolling index separator

Hello,
Currently the rolling index separator is hard-coded as the hyphen ("-") character.
It would be nice to make it configurable to a custom character, e.g. a dot.

What do you think about adding this improvement?
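For readers landing here later: another configuration in this issue list (the yyyy-ww report) already passes a separator attribute to RollingIndexName, which suggests the improvement exists in newer versions; a sketch, hedged on your version supporting it:

```xml
<RollingIndexName indexName="log4j2" pattern="yyyy-MM-dd" separator="." timeZone="UTC"/>
```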

Best regards,
Cuneyt

Remaining timer thread when reconfiguring log4j

Hi,

When log4j2 is reconfigured, a timer thread remains alive. The timer thread comes from the BulkEmitter constructor.
BulkEmitter#stop() should cancel the timer, but with the configuration below that is not the case (stop is never called):

<Elasticsearch name="elasticsearchAsyncBatch">
            <RollingIndexName indexName="myindex" pattern="yyyy-MM-dd-HH" />
            <AsyncBatchDelivery batchSize="${myBatchSize}" deliveryInterval="1000" >
                <JestHttp serverUris="${myserver}"
                          connTimeout="10000"
                          readTimeout="10000"
                          maxTotalConnection="2"
                          defaultMaxTotalConnectionPerRoute="2"
                          ioThreadCount="2" >
                </JestHttp>
                <AppenderRefFailoverPolicy>
                    <AppenderRef ref="elastic.appender.fallback" level="WARN" />
                </AppenderRefFailoverPolicy>
            </AsyncBatchDelivery>
            <MyCustomLayout />
</Elasticsearch>

Code used to reconfigure :

URI uri = URI.create("myConfig.xml");
LoggerContext context = (org.apache.logging.log4j.core.LoggerContext)
                    LogManager.getFactory().getContext("myclassname", null, null, false, uri, "sg-logger");

//... LATER

context.setConfigLocation(uri);

Should I change something in the configuration file?

Thanks in advance,
Jerome

Additional fields Elasticsearch

This is the way I currently add logs into Elasticsearch:
{
  "_index" : "log4j2",
  "_type" : "index",
  "_id" : "JpTcx20Bt2rAkaoY4EfT",
  "_score" : 1.0,
  "_source" : {
    "timeMillis" : 1571016201881,
    "loggerName" : "Test",
    "level" : "INFO",
    "message" : "Test Message",
    "thread" : "main"
  }
}

I want to add a new field, sessionId.

I have gone through all the available content on the web and the source code, but could not find a way to do it.

Is it possible to share an example of how I can do this?
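One option, hedged on your version providing JacksonJsonLayout's VirtualProperty element (it appears in other issues in this list), is to put the session id into the Log4j2 ThreadContext and resolve it per event with the ctx lookup:

```xml
<JacksonJsonLayout>
    <!-- assumes the application calls ThreadContext.put("sessionId", ...) -->
    <VirtualProperty name="sessionId" value="$${ctx:sessionId}" dynamic="true"/>
</JacksonJsonLayout>
```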

java.lang.IllegalAccessError

Hello,

I got:
java.lang.IllegalAccessError: tried to access method org.apache.http.client.entity.EntityBuilder.<init>()V from class org.apache.http.client.entity.ByteBufEntityBuilder
at org.apache.http.client.entity.ByteBufEntityBuilder.<init>(ByteBufEntityBuilder.java:10)
at org.appenders.log4j2.elasticsearch.jest.BufferedJestHttpClient.prepareRequest(BufferedJestHttpClient.java:66)
at org.appenders.log4j2.elasticsearch.jest.BufferedJestHttpClient.executeAsync(BufferedJestHttpClient.java:47)
at org.appenders.log4j2.elasticsearch.jest.JestHttpObjectFactory$1.apply(JestHttpObjectFactory.java:186)
at org.appenders.log4j2.elasticsearch.jest.JestHttpObjectFactory$1.apply(JestHttpObjectFactory.java:169)
at org.appenders.log4j2.elasticsearch.BulkEmitter.notifyListener(BulkEmitter.java:93)
at org.appenders.log4j2.elasticsearch.BulkEmitter$1.run(BulkEmitter.java:129)
at java.util.TimerThread.mainLoop(Timer.java:555)
at java.util.TimerThread.run(Timer.java:505)

Thanks for your help

ClassCastException with failover appender

Description
When using async batch delivery with a failover appender, ClassCastException is thrown when the connection to Elastic server cannot be established.
It seems to be an issue introduced in version 1.4.0; it doesn't exist with 1.3.6.

Configuration

<Elasticsearch name="ElasticsearchAsyncBatch">
    <RollingIndexName indexName="myindex" pattern="yyyy-ww" timeZone="UTC"/>
    <ThresholdFilter level="INFO" onMatch="ACCEPT"/>
    <JacksonJsonLayout>
        <PooledItemSourceFactory poolName="itemPool" itemSizeInBytes="512" initialPoolSize="10000"
                                    monitored="true" monitorTaskInterval="10000" resizeTimeout="500">
            <UnlimitedResizePolicy resizeFactor="0.6"/>
        </PooledItemSourceFactory>
    </JacksonJsonLayout>
    <AsyncBatchDelivery batchSize="10000"
                        deliveryInterval="3000">
        <IndexTemplate name="expense_ng" path="classpath:elasticLogIndexTemplate.json"/>
        <JestBufferedHttp serverUris="http://localhost:9200"
                            connTimeout="500"
                            readTimeout="30000"
                            maxTotalConnection="40"
                            defaultMaxTotalConnectionPerRoute="8"
                            mappingType="index">
            <PooledItemSourceFactory poolName="batchPool" itemSizeInBytes="5120000" initialPoolSize="3"
                                        monitored="true" monitorTaskInterval="10000" resizeTimeout="500">
                <UnlimitedResizePolicy resizeFactor="0.70"/>
            </PooledItemSourceFactory>
        </JestBufferedHttp>
        <AppenderRefFailoverPolicy>
            <AppenderRef ref="Console"/>
        </AppenderRefFailoverPolicy>
    </AsyncBatchDelivery>
</Elasticsearch>

Runtime (please complete the following information):

  • log4j2-elasticsearch-jest:1.4.0
  • spring-boot 2.2.5
  • JVM openjdk version "1.8.0_242
  • OS: macOS Mojave 10.14.6

Additional context
Stackstrace of the exception thrown:

2020-03-17 09:57:10,328 restartedMain ERROR IndexTemplate not added: Could not connect to http://localhost:9200
2020-03-17 09:57:10,360 pool-2-thread-1 ERROR Unable to execute failover java.lang.ClassCastException: io.netty.buffer.CompositeByteBuf cannot be cast to java.lang.String
	at org.appenders.log4j2.elasticsearch.AppenderRefFailoverPolicy.deliver(AppenderRefFailoverPolicy.java:41)
	at org.appenders.log4j2.elasticsearch.FailoverPolicy.deliver(FailoverPolicy.java:49)
	at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
	at java.util.concurrent.ConcurrentLinkedQueue$CLQSpliterator.forEachRemaining(ConcurrentLinkedQueue.java:857)
	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
	at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
	at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
	at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:485)
	at org.appenders.log4j2.elasticsearch.jest.BufferedJestHttpObjectFactory.lambda$createFailureHandler$1(BufferedJestHttpObjectFactory.java:122)
	at org.appenders.log4j2.elasticsearch.jest.BufferedJestHttpObjectFactory$1.failed(BufferedJestHttpObjectFactory.java:157)
	at org.appenders.log4j2.elasticsearch.jest.BufferedJestHttpClient$BufferedResultCallback.failed(BufferedJestHttpClient.java:116)
	at org.apache.http.concurrent.BasicFuture.failed(BasicFuture.java:137)
	at org.apache.http.impl.nio.client.DefaultClientExchangeHandlerImpl.executionFailed(DefaultClientExchangeHandlerImpl.java:101)
	at org.apache.http.impl.nio.client.AbstractClientExchangeHandler.failed(AbstractClientExchangeHandler.java:426)
	at org.apache.http.impl.nio.client.AbstractClientExchangeHandler.connectionRequestFailed(AbstractClientExchangeHandler.java:348)
	at org.apache.http.impl.nio.client.AbstractClientExchangeHandler.access$100(AbstractClientExchangeHandler.java:62)
	at org.apache.http.impl.nio.client.AbstractClientExchangeHandler$1.failed(AbstractClientExchangeHandler.java:392)
	at org.apache.http.concurrent.BasicFuture.failed(BasicFuture.java:137)
	at org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager$1.failed(PoolingNHttpClientConnectionManager.java:316)
	at org.apache.http.concurrent.BasicFuture.failed(BasicFuture.java:137)
	at org.apache.http.nio.pool.RouteSpecificPool.failed(RouteSpecificPool.java:162)
	at org.apache.http.nio.pool.AbstractNIOConnPool.requestFailed(AbstractNIOConnPool.java:613)
	at org.apache.http.nio.pool.AbstractNIOConnPool$InternalSessionRequestCallback.failed(AbstractNIOConnPool.java:893)
	at org.apache.http.impl.nio.reactor.SessionRequestImpl.failed(SessionRequestImpl.java:177)
	at org.apache.http.impl.nio.reactor.DefaultConnectingIOReactor.processEvent(DefaultConnectingIOReactor.java:176)
	at org.apache.http.impl.nio.reactor.DefaultConnectingIOReactor.processEvents(DefaultConnectingIOReactor.java:148)
	at org.apache.http.impl.nio.reactor.AbstractMultiworkerIOReactor.execute(AbstractMultiworkerIOReactor.java:351)
	at org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager.execute(PoolingNHttpClientConnectionManager.java:221)
	at org.apache.http.impl.nio.client.CloseableHttpAsyncClientBase$1.run(CloseableHttpAsyncClientBase.java:64)
	at java.lang.Thread.run(Thread.java:748)
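Until this is fixed, one possible workaround, assuming appender-based failover matters more than buffer pooling, is to fall back to the non-buffered JestHttp client, whose failed payloads are plain Strings and can therefore be cast by AppenderRefFailoverPolicy. A sketch based on the config above (note that the PooledItemSourceFactory would also be dropped from JacksonJsonLayout so events are serialized to Strings):

```xml
<AsyncBatchDelivery batchSize="10000" deliveryInterval="3000">
    <IndexTemplate name="expense_ng" path="classpath:elasticLogIndexTemplate.json"/>
    <JestHttp serverUris="http://localhost:9200"
              connTimeout="500"
              readTimeout="30000"
              maxTotalConnection="40"
              defaultMaxTotalConnectionPerRoute="8"
              mappingType="index"/>
    <AppenderRefFailoverPolicy>
        <AppenderRef ref="Console"/>
    </AppenderRefFailoverPolicy>
</AsyncBatchDelivery>
```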

Mistyped ElasticsearchAppender#layout field causing plugin injection errors

ElasticsearchAppender.Builder#layout is of type AbstractLayout, whereas it should have been Layout. See Log4j 2.0 FileAppender for an example. This renders ElasticsearchAppender unusable for layouts that don't extend from AbstractLayout. For instance, LogstashLayout doesn't extend from AbstractLayout class, but implements Layout<String> interface, hence the Log4j field injection failure reported in #32.

Allow Bulk API parameters in JestBufferedHttp

After applying a hot fix for this issue, handling of additional Bulk API parameters was removed.

Given that sometimes it might be useful to define them, this feature should be brought back (see Jest AbstractAction.buildQueryString()).
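For illustration, this is roughly what appending Bulk API parameters to the request URI could look like. The helper below is hypothetical and not the library's actual API; Jest itself builds this string in AbstractAction.buildQueryString():

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.StringJoiner;

public class BulkQueryString {

    // Builds "?key1=v1&key2=v2" from the given parameters.
    // Values are assumed to be URL-safe in this sketch (no encoding performed).
    static String buildQueryString(Map<String, String> params) {
        if (params.isEmpty()) {
            return "";
        }
        StringJoiner joiner = new StringJoiner("&", "?", "");
        for (Map.Entry<String, String> e : params.entrySet()) {
            joiner.add(e.getKey() + "=" + e.getValue());
        }
        return joiner.toString();
    }

    public static void main(String[] args) {
        Map<String, String> params = new LinkedHashMap<>();
        params.put("refresh", "true");
        params.put("routing", "user1");
        // prints "/_bulk?refresh=true&routing=user1"
        System.out.println("/_bulk" + buildQueryString(params));
    }
}
```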

org.apache.logging.log4j.core.config.ConfigurationException: No layout provided for Elasticsearch appender

Can vanilla Spring (no Spring Boot) be used with log4j2-elasticsearch?

I'm trying to log to ES with log4j2-elasticsearch.

Using Spring 4.3.18.

I'm having trouble loading indexTemplate.json:

ERROR StatusLogger Could not create plugin of type class org.appenders.log4j2.elasticsearch.IndexTemplate for element IndexTemplate
org.apache.logging.log4j.core.config.ConfigurationException
at org.appenders.log4j2.elasticsearch.IndexTemplate$Builder.loadClasspathResource(IndexTemplate.java:195)
at org.appenders.log4j2.elasticsearch.IndexTemplate$Builder.loadSource(IndexTemplate.java:136)
at org.appenders.log4j2.elasticsearch.IndexTemplate$Builder.build(IndexTemplate.java:106)
at org.appenders.log4j2.elasticsearch.IndexTemplate$Builder.build(IndexTemplate.java:67)
at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.build(PluginBuilder.java:122)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createPluginObject(AbstractConfiguration.java:1002)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:942)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:934)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:934)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:934)
at org.apache.logging.log4j.core.config.AbstractConfiguration.doConfigure(AbstractConfiguration.java:552)
at org.apache.logging.log4j.core.config.AbstractConfiguration.initialize(AbstractConfiguration.java:241)
at org.apache.logging.log4j.core.config.AbstractConfiguration.start(AbstractConfiguration.java:288)
at org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:579)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:651)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:668)
at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:253)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:153)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:45)
at org.apache.logging.log4j.LogManager.getContext(LogManager.java:194)
at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:138)
at org.apache.logging.slf4j.Log4jLoggerFactory.getContext(Log4jLoggerFactory.java:45)
at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:48)
at org.apache.logging.slf4j.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:30)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:358)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:383)
at ec.ep.dgcomm.wmu.eng.WebAppInitializer.(WebAppInitializer.java:35)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.lang.Class.newInstance(Class.java:442)
at org.springframework.web.SpringServletContainerInitializer.onStartup(SpringServletContainerInitializer.java:152)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5303)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:145)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:753)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:729)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:717)
at org.apache.catalina.startup.HostConfig.manageApp(HostConfig.java:1733)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.tomcat.util.modeler.BaseModelMBean.invoke(BaseModelMBean.java:300)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.invoke(DefaultMBeanServerInterceptor.java:819)
at com.sun.jmx.mbeanserver.JmxMBeanServer.invoke(JmxMBeanServer.java:801)
at org.apache.catalina.mbeans.MBeanFactory.createStandardContext(MBeanFactory.java:484)
at org.apache.catalina.mbeans.MBeanFactory.createStandardContext(MBeanFactory.java:433)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.tomcat.util.modeler.BaseModelMBean.invoke(BaseModelMBean.java:300)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.invoke(DefaultMBeanServerInterceptor.java:819)
at com.sun.jmx.mbeanserver.JmxMBeanServer.invoke(JmxMBeanServer.java:801)
at com.sun.jmx.remote.security.MBeanServerAccessController.invoke(MBeanServerAccessController.java:468)
at javax.management.remote.rmi.RMIConnectionImpl.doOperation(RMIConnectionImpl.java:1468)
at javax.management.remote.rmi.RMIConnectionImpl.access$300(RMIConnectionImpl.java:76)
at javax.management.remote.rmi.RMIConnectionImpl$PrivilegedOperation.run(RMIConnectionImpl.java:1309)
at java.security.AccessController.doPrivileged(Native Method)
at javax.management.remote.rmi.RMIConnectionImpl.doPrivilegedOperation(RMIConnectionImpl.java:1408)
at javax.management.remote.rmi.RMIConnectionImpl.invoke(RMIConnectionImpl.java:829)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at sun.rmi.server.UnicastServerRef.dispatch(UnicastServerRef.java:361)
at sun.rmi.transport.Transport$1.run(Transport.java:200)
at sun.rmi.transport.Transport$1.run(Transport.java:197)
at java.security.AccessController.doPrivileged(Native Method)
at sun.rmi.transport.Transport.serviceCall(Transport.java:196)
at sun.rmi.transport.tcp.TCPTransport.handleMessages(TCPTransport.java:568)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run0(TCPTransport.java:826)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.lambda$run$0(TCPTransport.java:683)
at java.security.AccessController.doPrivileged(Native Method)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run(TCPTransport.java:682)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.NullPointerException
at java.io.Reader.&lt;init&gt;(Reader.java:78)
at java.io.InputStreamReader.&lt;init&gt;(InputStreamReader.java:97)
at org.appenders.log4j2.elasticsearch.IndexTemplate$Builder.loadClasspathResource(IndexTemplate.java:184)
... 78 more
ERROR StatusLogger Could not create plugin of type class org.appenders.log4j2.elasticsearch.ElasticsearchAppender for element Elasticsearch
org.apache.logging.log4j.core.config.ConfigurationException: No layout provided for Elasticsearch appender
at org.appenders.log4j2.elasticsearch.ElasticsearchAppender$Builder.build(ElasticsearchAppender.java:121)
at org.appenders.log4j2.elasticsearch.ElasticsearchAppender$Builder.build(ElasticsearchAppender.java:81)
at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.build(PluginBuilder.java:122)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createPluginObject(AbstractConfiguration.java:1002)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:942)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:934)
at org.apache.logging.log4j.core.config.AbstractConfiguration.doConfigure(AbstractConfiguration.java:552)
at org.apache.logging.log4j.core.config.AbstractConfiguration.initialize(AbstractConfiguration.java:241)
at org.apache.logging.log4j.core.config.AbstractConfiguration.start(AbstractConfiguration.java:288)
at org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:579)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:651)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:668)
at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:253)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:153)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:45)
at org.apache.logging.log4j.LogManager.getContext(LogManager.java:194)
at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:138)
at org.apache.logging.slf4j.Log4jLoggerFactory.getContext(Log4jLoggerFactory.java:45)
at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:48)
at org.apache.logging.slf4j.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:30)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:358)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:383)
at ec.ep.dgcomm.wmu.eng.WebAppInitializer.(WebAppInitializer.java:35)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.lang.Class.newInstance(Class.java:442)
at org.springframework.web.SpringServletContainerInitializer.onStartup(SpringServletContainerInitializer.java:152)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5303)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:145)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:753)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:729)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:717)
at org.apache.catalina.startup.HostConfig.manageApp(HostConfig.java:1733)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.tomcat.util.modeler.BaseModelMBean.invoke(BaseModelMBean.java:300)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.invoke(DefaultMBeanServerInterceptor.java:819)
at com.sun.jmx.mbeanserver.JmxMBeanServer.invoke(JmxMBeanServer.java:801)
at org.apache.catalina.mbeans.MBeanFactory.createStandardContext(MBeanFactory.java:484)
at org.apache.catalina.mbeans.MBeanFactory.createStandardContext(MBeanFactory.java:433)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.tomcat.util.modeler.BaseModelMBean.invoke(BaseModelMBean.java:300)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.invoke(DefaultMBeanServerInterceptor.java:819)
at com.sun.jmx.mbeanserver.JmxMBeanServer.invoke(JmxMBeanServer.java:801)
at com.sun.jmx.remote.security.MBeanServerAccessController.invoke(MBeanServerAccessController.java:468)
at javax.management.remote.rmi.RMIConnectionImpl.doOperation(RMIConnectionImpl.java:1468)
at javax.management.remote.rmi.RMIConnectionImpl.access$300(RMIConnectionImpl.java:76)
at javax.management.remote.rmi.RMIConnectionImpl$PrivilegedOperation.run(RMIConnectionImpl.java:1309)
at java.security.AccessController.doPrivileged(Native Method)
at javax.management.remote.rmi.RMIConnectionImpl.doPrivilegedOperation(RMIConnectionImpl.java:1408)
at javax.management.remote.rmi.RMIConnectionImpl.invoke(RMIConnectionImpl.java:829)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at sun.rmi.server.UnicastServerRef.dispatch(UnicastServerRef.java:361)
at sun.rmi.transport.Transport$1.run(Transport.java:200)
at sun.rmi.transport.Transport$1.run(Transport.java:197)
at java.security.AccessController.doPrivileged(Native Method)
at sun.rmi.transport.Transport.serviceCall(Transport.java:196)
at sun.rmi.transport.tcp.TCPTransport.handleMessages(TCPTransport.java:568)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run0(TCPTransport.java:826)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.lambda$run$0(TCPTransport.java:683)
at java.security.AccessController.doPrivileged(Native Method)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run(TCPTransport.java:682)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
ERROR StatusLogger Null object returned for Elasticsearch in appenders.
ERROR StatusLogger Unable to locate appender "elasticsearchAsyncBatch" for logger config "root"
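For what it's worth, the `Caused by: java.lang.NullPointerException` at `Reader.<init>` indicates that `getResourceAsStream()` returned null, i.e. indexTemplate.json was not found on the classpath at runtime, and the reader was then constructed around a null stream. A minimal stand-alone check, using a hypothetical helper name, can verify whether a `classpath:` path actually resolves:

```java
import java.io.InputStream;
import java.util.Optional;

public class ClasspathCheck {

    // Returns the resource stream if the path resolves; empty otherwise,
    // instead of letting a null stream blow up later in a Reader constructor.
    static Optional<InputStream> load(String path) {
        String resource = path.replace("classpath:", "");
        InputStream in = ClasspathCheck.class.getClassLoader().getResourceAsStream(resource);
        return Optional.ofNullable(in);
    }

    public static void main(String[] args) {
        // A resource that exists on every JVM resolves to a present Optional:
        System.out.println(load("classpath:java/lang/String.class").isPresent());
        // A missing resource yields an empty Optional instead of an NPE:
        System.out.println(load("classpath:indexTemplate.json").isPresent());
    }
}
```

If the check prints `false` for your template, the file is not being packaged where the class loader can see it.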

Add template in classpath

Hi, I'd like to ask how I can add the template to the classpath?
I tried adding it to the resources folder and changing the value of the path param, but to no avail!

The error I get:
(screenshot attachment: selection_015)
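If the template sits under src/main/resources it should normally end up on the classpath automatically. If it does not, one thing to verify, assuming a Maven build, is that the directory holding the template is actually declared as a resource directory:

```xml
<!-- Sketch: ensure the folder holding indexTemplate.json is packaged into the jar -->
<build>
  <resources>
    <resource>
      <directory>src/main/resources</directory>
    </resource>
  </resources>
</build>
```

With the file at the jar root, `path="classpath:indexTemplate.json"` should then resolve.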

VirtualProperty to retrieve value associated to a key of MapMessage

Description
I have a MapMessage containing a key named "packageName". I would like this key/value pair to be mapped as a field.
I am trying to use VirtualProperty to retrieve the associated value through $${map:packageName} (or even ${map:packageName}), but I cannot make it work.
I end up with the literal "${map:packageName}" value in ES.
Is there a way to make VirtualProperty work with MapMessage?

N.B. with e.g. RollingRandomAccessFile I can use "%map{packageName}" in the pattern of the layout.

Configuration
Configuration used to produce the behavior: XML

<Elasticsearch name="elasticsearchAsyncBatch">
        <IndexName indexName="log4j2" />
	<JacksonJsonLayout afterburner="true">
              <VirtualProperty name="hostname" value="$${env:hostname:-undefined}" />
	      <VirtualProperty name="packageName" value="$${map:packageName}" dynamic="true" />
	      <PooledItemSourceFactory poolName="itemPool" itemSizeInBytes="1024" initialPoolSize="3000" />
        </JacksonJsonLayout>
        <AsyncBatchDelivery batchSize="1000" deliveryInterval="10000" >
		<IndexTemplate name="template_log4j2" path="classpath:indexTemplate-2.json" />
		<HCHttp serverUris="${elasticsearch-url}" mappingType="log">
                    <PooledItemSourceFactory poolName="batchPool" itemSizeInBytes="1024000" initialPoolSize="3" />
                </HCHttp>
         </AsyncBatchDelivery>
</Elasticsearch>

Runtime :

  • Module name and version: log4j2-elasticsearch-hc-1.4.1, log4j2-elasticsearch-core-1.4.1, log4j-core-2.12.1.jar
  • Framework/server/module system used : exotic (webMethods IS) / ES 5
  • JVM : Hotspot 1.8.0_101 (52.0)
  • OS: RHEL

Additional context
None
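For reference, this is the behavior a dynamic `${map:...}` resolution would need: look the key up in the message's map at serialization time and fall back when the key is absent. A stand-alone simulation (not the library's code, names are illustrative):

```java
import java.util.Map;

public class MapLookupDemo {

    // Resolves a "${map:key}"-style placeholder against a MapMessage-like map,
    // returning the raw placeholder when the key is absent (mirrors Log4j's
    // behavior of leaving unresolved lookups as-is).
    static String resolve(String placeholder, Map<String, String> messageMap) {
        String prefix = "${map:";
        if (placeholder.startsWith(prefix) && placeholder.endsWith("}")) {
            String key = placeholder.substring(prefix.length(), placeholder.length() - 1);
            return messageMap.getOrDefault(key, placeholder);
        }
        return placeholder;
    }

    public static void main(String[] args) {
        Map<String, String> map = Map.of("packageName", "com.example.billing");
        // prints "com.example.billing"
        System.out.println(resolve("${map:packageName}", map));
        // prints "${map:missing}" (unresolved placeholder left intact)
        System.out.println(resolve("${map:missing}", map));
    }
}
```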

Incompatible Log4J version

I tried adding the appender to the Minecraft logger

Minecraft Log4J Version log4j-core:2.8.1

[15:35:35 ERROR]: [org.bukkit.craftbukkit.v1_15_R1.CraftServer] tried to access class org.apache.logging.log4j.core.jackson.StackTraceElementMixIn from class org.apache.logging.log4j.core.jackson.ExtendedLog4j2JsonModule initializing ElasticSearchAppender v1.0 (Is it up to date?)
java.lang.IllegalAccessError: tried to access class org.apache.logging.log4j.core.jackson.StackTraceElementMixIn from class org.apache.logging.log4j.core.jackson.ExtendedLog4j2JsonModule
    at org.apache.logging.log4j.core.jackson.ExtendedLog4j2JsonModule.setupModule(ExtendedLog4j2JsonModule.java:37) ~[?:?]
    at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:745) ~[?:?]
    at org.appenders.log4j2.elasticsearch.JacksonJsonLayout$Builder.createConfiguredWriter(JacksonJsonLayout.java:122) ~[?:?]
    at org.appenders.log4j2.elasticsearch.JacksonJsonLayout$Builder.build(JacksonJsonLayout.java:114) ~[?:?]
    at de.cytooxien.elasticsearch.ElasticSearchPlugin.createElasticsearchAppenderBuilder(ElasticSearchPlugin.java:117) ~[?:?]
    at de.cytooxien.elasticsearch.ElasticSearchPlugin.onLoad(ElasticSearchPlugin.java:44) ~[?:?]
    at org.bukkit.craftbukkit.v1_15_R1.CraftServer.loadPlugins(CraftServer.java:361) ~[patched_1.15.1.jar:git-Paper-48]
    at net.minecraft.server.v1_15_R1.DedicatedServer.init(DedicatedServer.java:226) ~[patched_1.15.1.jar:git-Paper-48]
    at net.minecraft.server.v1_15_R1.MinecraftServer.run(MinecraftServer.java:884) ~[patched_1.15.1.jar:git-Paper-48]
    at java.lang.Thread.run(Thread.java:748) [?:1.8.0_232]
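ExtendedLog4j2JsonModule relies on package-private Jackson mixins inside log4j-core, so the log4j-core version on the runtime classpath must match what log4j2-elasticsearch was built against; Minecraft's bundled 2.8.1 appears to be too old. A possible mitigation, assuming the server setup permits overriding the bundled version, is to provide a newer log4j-core (the version below is illustrative):

```xml
<dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-core</artifactId>
    <version>2.11.1</version>
</dependency>
```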

public void onLoad() {
    instance = this;
    Logger logger = LogManager.getLogger();
    org.apache.logging.log4j.core.Logger coreLogger = (org.apache.logging.log4j.core.Logger)logger;
    createLoggerProgrammatically(coreLogger, createElasticsearchAppenderBuilder(coreLogger, false, false), Configuration::getAsyncLoggerConfigDelegate);
}

public  ElasticsearchAppender.Builder createElasticsearchAppenderBuilder(org.apache.logging.log4j.core.Logger coreLogger, boolean messageOnly, boolean buffered) {

    JestHttpObjectFactory.Builder jestHttpObjectFactoryBuilder;
    if (buffered) {
        jestHttpObjectFactoryBuilder = BufferedJestHttpObjectFactory.newBuilder();

        int estimatedBatchSizeInBytes = BATCH_SIZE * INITIAL_ITEM_SIZE_IN_BYTES;

        ((BufferedJestHttpObjectFactory.Builder)jestHttpObjectFactoryBuilder).withItemSourceFactory(
                PooledItemSourceFactory.newBuilder()
                        .withPoolName("batchPool")
                        .withInitialPoolSize(INITIAL_BATCH_POOL_SIZE)
                        .withItemSizeInBytes(estimatedBatchSizeInBytes)
                        .withMonitored(true)
                        .withMonitorTaskInterval(10000)
                        .build()
        );
    } else {
        jestHttpObjectFactoryBuilder = JestHttpObjectFactory.newBuilder();
    }

    jestHttpObjectFactoryBuilder.withConnTimeout(1000)
            .withReadTimeout(10000)
            .withIoThreadCount(8)
            .withDefaultMaxTotalConnectionPerRoute(8)
            .withMaxTotalConnection(8)
            .withMappingType("_doc");

    jestHttpObjectFactoryBuilder.withServerUris(hostname);

    IndexTemplate indexTemplate = new IndexTemplate.Builder()
            .withName("log4j2_test_jest")
            .withSource(loadClasspathResource("classpath:indexTemplate.json"))
            .build();

    BatchDelivery asyncBatchDelivery = new SpigotAsyncBatchDelivery(BATCH_SIZE + ADDITIONAL_BATCH_SIZE, 1000, jestHttpObjectFactoryBuilder.build(), AsyncBatchDelivery.Builder.DEFAULT_FAILOVER_POLICY, indexTemplate);

    IndexNameFormatter indexNameFormatter = RollingIndexNameFormatter.newBuilder()
            .withIndexName("log4j2_test_jest")
            .withPattern("yyyy-MM-dd-HH")
            .build();

    JacksonJsonLayout.Builder layoutBuilder = JacksonJsonLayout.newBuilder();

    if (buffered) {
        PooledItemSourceFactory sourceFactoryConfig = PooledItemSourceFactory.newBuilder()
                .withPoolName("itemPool")
                .withInitialPoolSize(INITIAL_ITEM_POOL_SIZE)
                .withItemSizeInBytes(INITIAL_ITEM_SIZE_IN_BYTES)
                .withMonitored(true)
                .withMonitorTaskInterval(10000)
                .build();
        layoutBuilder.withItemSourceFactory(sourceFactoryConfig).build();
    }

    return ElasticsearchAppender.newBuilder()
            .withName("elastic-appender")
            .withMessageOnly(messageOnly)
            .withBatchDelivery(asyncBatchDelivery)
            .withIndexNameFormatter(indexNameFormatter)
            .withLayout(layoutBuilder.build())
            .withIgnoreExceptions(false);
}

private String loadClasspathResource(String path) {
    String resource = path.replace("classpath:", "");
    // try-with-resources so the stream is closed even when reading fails
    try (BufferedReader br = new BufferedReader(new InputStreamReader(
            instance.getClassLoader().getResourceAsStream(resource), "UTF-8"))) {
        StringBuilder sb = new StringBuilder();
        String line;
        while ((line = br.readLine()) != null) {
            sb.append(line).append('\n');
        }
        return sb.toString();
    } catch (Exception e) {
        throw new ConfigurationException(e.getMessage(), e);
    }
}

public void createLoggerProgrammatically(org.apache.logging.log4j.core.Logger coreLogger, ElasticsearchAppender.Builder appenderBuilder, Function<Configuration, AsyncLoggerConfigDelegate> delegateSupplier) {

    LoggerContext ctx = coreLogger.getContext();

    final Configuration config = ctx.getConfiguration();

    Appender appender = appenderBuilder.build();
    appender.start();

    ((LifeCycle)delegateSupplier.apply(config)).start();

    config.addAppender(appender);
}

public class SpigotAsyncBatchDelivery extends AsyncBatchDelivery {

    public SpigotAsyncBatchDelivery(int batchSize, int deliveryInterval, ClientObjectFactory objectFactory, FailoverPolicy failoverPolicy, IndexTemplate indexTemplate) {
        super(batchSize, deliveryInterval, objectFactory, failoverPolicy, indexTemplate);
    }

    @Override
    protected BatchEmitterServiceProvider createBatchEmitterServiceProvider() {
        return new SpigotBatchEmitterServiceProvider();
    }
}

public class SpigotBatchEmitterServiceProvider extends BatchEmitterServiceProvider{

    private final Logger LOG = StatusLogger.getLogger();

    @Override
    public BatchEmitter createInstance(int batchSize, int deliveryInterval, ClientObjectFactory clientObjectFactory, FailoverPolicy failoverPolicy) {
        BatchEmitterFactory factory = new BulkEmitterFactory(); //Service#load not supported
        LOG.info("BatchEmitterFactory class found {}", factory.getClass().getName());
        if (factory.accepts(clientObjectFactory.getClass())) {
            LOG.info("Using {} as BatchEmitterFactoryProvider", factory);
            return factory.createInstance(batchSize, deliveryInterval, clientObjectFactory, failoverPolicy);
        }
        throw new ConfigurationException(String.format("No compatible BatchEmitter implementations for %s found", clientObjectFactory.getClass().getName()));
    }
}

Logj42 ThreadContext - Not visible

Hi, I set an org.apache.logging.log4j.ThreadContext in my app, and it doesn't show up in Kibana. I use the configuration you have in the README:

		<Elasticsearch name="elasticsearchAsyncBatch">
			<IndexName indexName="log4j2" />
			<AsyncBatchDelivery>
				<IndexTemplate name="log4j2"
					path="classpath:indexTemplate.json" />
				<JestHttp serverUris="http://localhost:9200" />
			</AsyncBatchDelivery>
		</Elasticsearch>

And this is the data registered in Elasticsearch, but I can't see the ThreadContext info:

{
  "_index": "log4j2",
  "_type": "index",
  "_id": "fCVIQGoBxjbw00kUTTJ1",
  "_version": 1,
  "_score": 0,
  "_source": {
    "timeMillis": 1555856640973,
    "loggerName": "com.myapp.conversion.configuration.LoggerInterceptor",
    "level": "INFO",
    "message": "thread=END",
    "thread": "http-nio-8080-exec-1"
  },
  "highlight": {
    "loggerName": [
      "@kibana-highlighted-field@com.myapp.conversion.configuration.LoggerInterceptor@/kibana-highlighted-field@"
    ]
  }
}

This is my log on the console:
2019-04-21 09:24:00 INFO (LoggerInterceptor.java:47) => thread=END {execution-duration=733, fusionId=123, path=/DataConvert/v2/operacion/123, threadId=89cbe3be-d7f9-4977-8e15-0af1632c6dc2, userIp=0:0:0:0:0:0:0:1}
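One thing to check: the serialized fields depend on the layout. The indexed document above contains no contextMap field, so the ThreadContext data is being dropped during serialization. Assuming a 1.3+ release, switching to an explicit JacksonJsonLayout (which serializes the LogEvent, including its context map, via Jackson) may help. A sketch based on the config above:

```xml
<Elasticsearch name="elasticsearchAsyncBatch">
    <IndexName indexName="log4j2" />
    <JacksonJsonLayout />
    <AsyncBatchDelivery>
        <IndexTemplate name="log4j2" path="classpath:indexTemplate.json" />
        <JestHttp serverUris="http://localhost:9200" />
    </AsyncBatchDelivery>
</Elasticsearch>
```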

Logs not sent on application closing

Description
I tried this plugin and I must say it works great, thank you!
But I've noticed that the last log entries written before my application exits never appear in ES.

Configuration
I use this plugin to provide log shipping for the build system we use: Quickbuild (www.pmease.com)

Runtime (please complete the following information):

  • Module name and version: log4j2-elasticsearch-hc-1.4.1
  • Framework/server/module system used:
  • JVM OpenJDK8 Hotspot
  • OS: Windows 10

Additional context
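Batched appenders buffer events, so anything logged within the last deliveryInterval is lost if the JVM exits before the final flush. The usual pattern is to shut the logging system down explicitly before exit (with log4j-core 2.x, LogManager.shutdown()) so pending batches are delivered. A dependency-free sketch of the flush-before-exit idea, with a stand-in buffer in place of the real batch delivery:

```java
import java.util.ArrayList;
import java.util.List;

public class FlushOnExit {

    // Stand-ins for the appender's internal batch buffer and the ES sink
    static final List<String> buffer = new ArrayList<>();
    static final List<String> delivered = new ArrayList<>();

    // In a real app this is what LogManager.shutdown() triggers:
    // deliver everything still buffered before the JVM goes away.
    static void flush() {
        delivered.addAll(buffer);
        buffer.clear();
    }

    public static void main(String[] args) {
        buffer.add("last log line before exit");
        // Call the flush explicitly as the final step of the application,
        // instead of relying on the JVM tearing threads down mid-batch.
        flush();
        System.out.println(delivered);
    }
}
```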

Error in Appenders and loggers

Hi, I'm getting the following error message when I run my application:
2020-02-05 15:57:11,466 WrapperListener_start_runner ERROR Appenders contains an invalid element or attribute "Elasticsearch"
2020-02-05 15:57:11,487 WrapperListener_start_runner ERROR loggers Loggers has no parameter that matches element AppenderRef

Here is my log4j2.xml:

<Appenders>

		<Elasticsearch name="elasticsearchAsyncBatch">
    		<IndexName indexName="log4j2" />
    		<AsyncBatchDelivery batchSize="1000" deliveryInterval="5000" >
        		<IndexTemplate name="log4j2" path="classpath:indexTemplate.json" />
        		<JestHttp serverUris="http://localhost:9200" mappingType="_doc"/>
    		</AsyncBatchDelivery>
		</Elasticsearch>
</Appenders>
<Loggers>
    
    <!-- Http Logger shows wire traffic on DEBUG. -->
    <AsyncLogger name="org.mule.service.http" level="WARN"/>
    <AsyncLogger name="org.mule.extension.http" level="WARN"/>

	<!-- Mule logger -->        
    <AsyncLogger name="org.mule.runtime.core.internal.processor.LoggerMessageProcessor" level="INFO"/>
    <AsyncLogger name="org.mule" level="INFO"/>
    <AsyncLogger name="com.mulesoft" level="INFO"/>
	<!-- Cloudhub logger -->
	<AsyncLogger name="com.gigaspaces" level="ERROR"/>
    <AsyncLogger name="com.j_spaces" level="ERROR"/>
    <AsyncLogger name="com.sun.jini" level="ERROR"/>
    <AsyncLogger name="net.jini" level="ERROR"/>
    <AsyncLogger name="org.apache" level="WARN"/>
    <AsyncLogger name="org.apache.cxf" level="WARN"/>
    <AsyncLogger name="org.springframework.beans.factory" level="WARN"/>
    <AsyncLogger name="org.mule" level="INFO"/>
    <AsyncLogger name="com.mulesoft" level="INFO"/>
    <AsyncLogger name="org.jetel" level="WARN"/>
    <AsyncLogger name="Tracking" level="WARN"/>

	<AsyncLogger name="elasticsearch" level="info" additivity="false"/>
		<AppenderRef ref="elasticsearchAsyncBatch" />

    <AsyncRoot level="INFO">
    </AsyncRoot>
</Loggers>


Issues with Log4j2Plugins.dat when shading with maven-shade-plugin

Description
Configuring a simple Java application to log to a local Elasticsearch node

Configuration
pom.xml

 <dependencies>
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-clients</artifactId>
            <version>1.1.0</version>
        </dependency>
        <dependency>
            <groupId>commons-cli</groupId>
            <artifactId>commons-cli</artifactId>
            <version>1.4</version>
        </dependency>
        <dependency>
            <groupId>io.dropwizard.metrics</groupId>
            <artifactId>metrics-core</artifactId>
            <version>4.1.5</version>
        </dependency>
        <dependency>
            <groupId>com.google.code.gson</groupId>
            <artifactId>gson</artifactId>
            <version>2.8.6</version>
        </dependency>
        <dependency>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
            <version>22.0</version>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.13</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.logging.log4j</groupId>
            <artifactId>log4j-slf4j-impl</artifactId>
            <version>2.13.2</version>
        </dependency>
        <dependency>
            <groupId>org.appenders.log4j</groupId>
            <artifactId>log4j2-elasticsearch-hc</artifactId>
            <version>1.4.2</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
            <version>2.11.0</version>

        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-core</artifactId>
            <version>2.11.0</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-annotations</artifactId>
            <version>2.11.0</version>
        </dependency>
        <dependency>
            <groupId>io.netty</groupId>
            <artifactId>netty-buffer</artifactId>
            <version>4.1.32.Final</version>
        </dependency>
        <dependency>
            <groupId>net.openhft</groupId>
            <artifactId>chronicle-map</artifactId>
            <version>3.19.4</version>
        </dependency>
        <dependency>
            <groupId>com.lmax</groupId>
            <artifactId>disruptor</artifactId>
            <version>3.4.2</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.module</groupId>
            <artifactId>jackson-module-afterburner</artifactId>
            <version>2.11.0</version>
        </dependency>
        </dependencies>

log4j2.xml - basically copied and pasted from your examples

<?xml version="1.0" encoding="UTF-8"?>
<Configuration >
    <Appenders>
        <Console name="Console" target="SYSTEM_OUT">
            <PatternLayout pattern="%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n"/>
        </Console>
        <!-- This configuration is much more verbose than the minimal one (few lines). It's here just to demonstrate the optional features -->
        <Elasticsearch name="elasticsearch">
            <RollingIndexName indexName="logs-audit-" pattern="yyyy-MM-dd"  />
            <ThresholdFilter level="DEBUG" onMatch="ACCEPT"/>
            <JacksonJsonLayout>
                <VirtualProperty name="hostname" value="$${sys:hostname:-undefined}" />
                <PooledItemSourceFactory poolName="itemPool" itemSizeInBytes="1024" initialPoolSize="6000"
                                         monitored="true" monitorTaskInterval="10000" resizeTimeout="500">
                    <UnlimitedResizePolicy resizeFactor="0.6" />
                </PooledItemSourceFactory>
            </JacksonJsonLayout>

            <AsyncBatchDelivery batchSize="10" deliveryInterval="200">
                <!-- Use 'classpath:BOOT-INF/classes/<template file name>' if template file is in the Spring Boot app resources (yes, it's ugly) -->
                <!-- Use 'classpath:<template file name>' if template file is provided by one of your dependencies -->
                <IndexTemplate name="logs-audit-" path="classpath:indexTemplate.json" />
                <HCHttp serverUris="http://localhost:9200"
                        connTimeout="500"
                        readTimeout="30000"
                        maxTotalConnections="1">
                    <Security>
                        <BasicCredentials username="elastic" password="changeit" />
                    </Security>
                    <PooledItemSourceFactory poolName="batchPool" itemSizeInBytes="2048000" initialPoolSize="3"
                                             monitored="true" monitorTaskInterval="10000" resizeTimeout="500">
                        <UnlimitedResizePolicy resizeFactor="0.70" />
                    </PooledItemSourceFactory>
                    <BatchLimitBackoffPolicy maxBatchesInFlight="4" />
                </HCHttp>
                <!--<ChronicleMapRetryFailoverPolicy fileName="failedItems.chronicleMap"
                                                 numberOfEntries="100000"
                                                 averageValueSize="2048"
                                                 batchSize="1000"
                                                 retryDelay="4000"
                                                 monitored="true"
                                                 monitorTaskInterval="30000">
                    <SingleKeySequenceSelector sequenceId="2"/>
                </ChronicleMapRetryFailoverPolicy>-->
            </AsyncBatchDelivery>
        </Elasticsearch>
    </Appenders>
    <Loggers>
        <AsyncLogger name="elasticsearch">
            <AppenderRef ref="elasticsearch" />
        </AsyncLogger>
        <Root level="debug">
            <AppenderRef ref="Console"/>
        </Root>
    </Loggers>
</Configuration>

Runtime (please complete the following information):

  • log4j2-elasticsearch-jest:1.4.2
  • plain jar
  • openJDK11
  • Ubuntu

Additional context
Log4j messages print out fine, but no index is created and there is no other information in the debug messages.

PatternLayout patterns creating a JSON message are not sent correctly

I'm using a PatternLayout that specifies a number of fields and massages them into a JSON string in the log4j2.xml file (not in code):

<Elasticsearch name="JmeterRunlogESAppender" messageOnly="true">
      <IndexName indexName="jmeter-runlog-${date:yyyy.MM.dd}" />
      <PatternLayout>
        <alwaysWriteExceptions>false</alwaysWriteExceptions>
        <pattern>{"timestamp": "%d{yyyy-MM-dd'T'HH:mm:ss.SSS'Z'}{GMT+0}", "Level": "%p", "i_gn": "%X{group}", "i_tn": "%t", "i_tnu": "%X{threadnum}", "i_un": "%X{username}", "i_uid": "%X{userId}", "i_cid": "%X{companyId}", "i_cty": "%X{country}", "i_corr_id": "%X{correlationId}", "m": "%m%enc{%ex}{JSON}"}%n</pattern>
      </PatternLayout>
      <AsyncBatchDelivery batchSize="1000" deliveryInterval="2000">
        <JestHttp serverUris="http://${ESServer}:${ESPort}" />
      </AsyncBatchDelivery>
    </Elasticsearch>

This appears to cause failures when sending, as I see a number of messages in the log:

[INFO] 2019-08-07 17:07:57,994 I/O dispatcher 1 WARN One or more of the items in the Bulk request failed, check BulkResult.getItems() for more information.
[INFO] 2019-08-07 17:07:58,002 I/O dispatcher 1 WARN Batch of 19 items failed. Redirecting to org.appenders.log4j2.elasticsearch.NoopFailoverPolicy
[INFO] 2019-08-07 17:07:59,070 I/O dispatcher 1 WARN One or more of the items in the Bulk request failed, check BulkResult.getItems() for more information.
[INFO] 2019-08-07 17:07:59,071 I/O dispatcher 1 WARN Batch of 5 items failed. Redirecting to org.appenders.log4j2.elasticsearch.NoopFailoverPolicy

The following PatternLayout works correctly:
<pattern>%m%n</pattern>

I suspect it is related to the need to have the message already in JSON format as per https://github.com/rfoltyns/log4j2-elasticsearch/tree/master/log4j2-elasticsearch-core#raw-log-message. Is this a correct assumption? Can the appender handle PatternLayouts that create the JSON string on the fly?
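
A quick way to see why such a pattern can produce invalid JSON: %m is interpolated verbatim, so any quote or newline in the rendered message breaks the surrounding JSON document, and Elasticsearch rejects the item. A minimal sketch of the problem (class and method names are illustrative only):

```java
public class PatternJsonEscaping {

    // Naive interpolation, like a pattern that drops %m straight into a JSON template.
    static String naive(String msg) {
        return "{\"m\": \"" + msg + "\"}";
    }

    // A JSON-safe variant escapes backslashes, quotes and newlines first
    // (roughly what Log4j2's %enc{...}{JSON} converter does).
    static String escaped(String msg) {
        return "{\"m\": \"" + msg
                .replace("\\", "\\\\")
                .replace("\"", "\\\"")
                .replace("\n", "\\n") + "\"}";
    }

    public static void main(String[] args) {
        String msg = "user said \"hello\"";
        System.out.println(naive(msg));   // invalid JSON: unescaped inner quotes
        System.out.println(escaped(msg)); // valid JSON
    }
}
```

So applying %enc{...}{JSON} to the message itself, not only to %ex, may be enough to make the failing batches parse.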

Testing with Elasticsearch7.x

Hi,

I am just starting to use this library. Your documentation mentions that it works up to Elasticsearch 6.x. I tested against Elasticsearch 7.2 and I can see the logs via Kibana, except that the date field is stored as milliseconds. I tried possible options like:

  1. creating the index in Elasticsearch rather than from code.

But no luck. Any insight would be greatly appreciated.

regards
Raj

Couldn't send dynamic field

I'm trying to send dynamic objects using the SLF4J fluent API and Log4j2, for example:
logger.atError().addKeyValue("host_name", "33333").addKeyValue("test","value").log("11111111111111111");
This is what I see in Kibana:

{
  "_index": "XXXX",
  "_type": "_doc",
  "_id": "LKN_9nMB8k09b5T7Y1j1",
  "_version": 1,
  "_score": null,
  "_source": {
    "timeMillis": 1597568559213,
    "loggerName": "XXXX",
    "level": "ERROR",
    "message": "host_name=33333 test=value 11111111111111111",
    "thread": "http-nio-7080-exec-2"
  },
  "fields": {
    "timeMillis": [
      "2020-08-16T09:02:39.213Z"
    ]
  },
  "sort": [
    1597568559213
  ]
}

Here is my log4j2.xml appender:

<Elasticsearch name="elasticsearch">
            <IndexName indexName="xxx"/>
            <ThresholdFilter level="DEBUG" onMatch="ACCEPT"/>
            <JacksonJsonLayout>
                <PooledItemSourceFactory poolName="itemPool" itemSizeInBytes="1024" initialPoolSize="3000"/>
            </JacksonJsonLayout>
            <AsyncBatchDelivery batchSize="1" deliveryInterval="2" >
                <IndexTemplate name="xxx" path="classpath:indexTemplate.json" />
                <HCHttp serverUris="http://xxx:9200">
                    <PooledItemSourceFactory poolName="batchPool" itemSizeInBytes="1024000" initialPoolSize="3"/>
                </HCHttp>
            </AsyncBatchDelivery>
        </Elasticsearch>

Can you tell me what I'm doing wrong?
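
For context, the addKeyValue pairs of the fluent API end up rendered inside the formatted message string, as the Kibana document above shows. One possible workaround, based on the VirtualProperty support used in other configurations in these issues, is to put the values into the ThreadContext on the logging thread and expose them as virtual properties (the property names below are examples, not a confirmed fix for the fluent API itself):

    <JacksonJsonLayout>
        <PooledItemSourceFactory poolName="itemPool" itemSizeInBytes="1024" initialPoolSize="3000"/>
        <VirtualProperty name="host_name" value="$${ctx:host_name:-undefined}" dynamic="true" />
        <VirtualProperty name="test" value="$${ctx:test:-undefined}" dynamic="true" />
    </JacksonJsonLayout>

with ThreadContext.put("host_name", "33333") called before the log statement.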

ConcurrentModificationException error

During a logging operation, the appender raises this exception:

2019-01-25 03:16:54,541 pool-17-thread-9 ERROR An exception occurred processing Appender cksLoggingAppender java.util.ConcurrentModificationException
at java.util.LinkedList$ListItr.checkForComodification(LinkedList.java:966)
at java.util.LinkedList$ListItr.next(LinkedList.java:888)
at io.searchbox.core.Bulk.getData(Bulk.java:64)
at io.searchbox.client.http.JestHttpClient.prepareRequest(JestHttpClient.java:101)
at io.searchbox.client.http.JestHttpClient.executeAsync(JestHttpClient.java:80)
at org.appenders.log4j2.elasticsearch.jest.JestHttpObjectFactory$1.apply(JestHttpObjectFactory.java:115)
at org.appenders.log4j2.elasticsearch.jest.JestHttpObjectFactory$1.apply(JestHttpObjectFactory.java:107)
at org.appenders.log4j2.elasticsearch.BulkEmitter.notifyListener(BulkEmitter.java:73)
at org.appenders.log4j2.elasticsearch.BulkEmitter.add(BulkEmitter.java:84)
at org.appenders.log4j2.elasticsearch.AsyncBatchDelivery.add(AsyncBatchDelivery.java:88)
at org.appenders.log4j2.elasticsearch.AsyncBatchDelivery.add(AsyncBatchDelivery.java:43)
at org.appenders.log4j2.elasticsearch.ElasticsearchAppender.append(ElasticsearchAppender.java:72)
at org.apache.logging.log4j.core.config.AppenderControl.tryCallAppender(AppenderControl.java:156)
at org.apache.logging.log4j.core.config.AppenderControl.callAppender0(AppenderControl.java:129)
at org.apache.logging.log4j.core.config.AppenderControl.callAppenderPreventRecursion(AppenderControl.java:120)
at org.apache.logging.log4j.core.config.AppenderControl.callAppender(AppenderControl.java:84)
at org.apache.logging.log4j.core.config.LoggerConfig.callAppenders(LoggerConfig.java:448)
at org.apache.logging.log4j.core.config.LoggerConfig.processLogEvent(LoggerConfig.java:433)
at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:417)
at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:403)
at org.apache.logging.log4j.core.config.AwaitCompletionReliabilityStrategy.log(AwaitCompletionReliabilityStrategy.java:63)
at org.apache.logging.log4j.core.Logger.logMessage(Logger.java:146)
at org.apache.logging.log4j.spi.AbstractLogger.tryLogMessage(AbstractLogger.java:2170)
at org.apache.logging.log4j.spi.AbstractLogger.logMessageTrackRecursion(AbstractLogger.java:2125)
at org.apache.logging.log4j.spi.AbstractLogger.logMessageSafely(AbstractLogger.java:2108)
at org.apache.logging.log4j.spi.AbstractLogger.logMessage(AbstractLogger.java:2002)
at org.apache.logging.log4j.spi.AbstractLogger.logIfEnabled(AbstractLogger.java:1974)
at org.apache.logging.log4j.spi.AbstractLogger.debug(AbstractLogger.java:318)
at com.enel.twobeat.cks.listener.handler.KeepAliveEventHandler.sessionOpened(KeepAliveEventHandler.java:77)
at org.apache.mina.core.filterchain.DefaultIoFilterChain$TailFilter.sessionOpened(DefaultIoFilterChain.java:789)
at org.apache.mina.core.filterchain.DefaultIoFilterChain.callNextSessionOpened(DefaultIoFilterChain.java:476)
at org.apache.mina.core.filterchain.DefaultIoFilterChain.access$800(DefaultIoFilterChain.java:48)
at org.apache.mina.core.filterchain.DefaultIoFilterChain$EntryImpl$1.sessionOpened(DefaultIoFilterChain.java:922)
at org.apache.mina.core.filterchain.IoFilterEvent.fire(IoFilterEvent.java:101)
at org.apache.mina.core.session.IoEvent.run(IoEvent.java:63)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

Our logger configuration is:

                <Elasticsearch name="cksLoggingAppender">
			<JsonLayout compact="true">
                        <KeyValuePair key="uuid" value="$${ctx:uuid}"/>
				<KeyValuePair key="server.address" value="${host}"/>
				<KeyValuePair key="env" value="${env}"/>
				<KeyValuePair key="process" value="KA-LOG"/>
				<KeyValuePair key="event-date" value="$${date:yyyy-MM-d'T'HH:mm:ss.S}" />
			</JsonLayout>
			<RollingIndexName indexName="events-cks" pattern="yyyy-MM-dd" timeZone="Europe/Warsaw" />
			<AsyncBatchDelivery>
				<IndexTemplate name="template" path="${sys:cks.home}/config/elasticsearch/events-cks-template.json" />
				<JestHttp serverUris="URLPLACEHOLDER" />
			</AsyncBatchDelivery>
		</Elasticsearch>

                <Logger name="cksLogging" level="trace" additivity="false">
			<AppenderRef ref="cksLoggingAppender"/>
		</Logger>

NotXContentException

I've tried several configurations and always wind up with the same error from Elasticsearch 7.1.

[2019-06-17T20:48:35,297][DEBUG][o.e.a.b.TransportShardBulkAction] [my-pc] [log4j2][0] failed to execute bulk item (index) index {[log4j2][index][ILBFaGsBxxbZTGDaecI6], source[Info]}
org.elasticsearch.index.mapper.MapperParsingException: failed to parse

Caused by: org.elasticsearch.common.compress.NotXContentException: Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes
        at org.elasticsearch.common.compress.CompressorFactory.compressor(CompressorFactory.java:56) ~[elasticsearch-7.1.1.jar:7.1.1]

My configuration:

<Configuration status="INFO">
    <Appenders>
        <Console name="console" target="SYSTEM_OUT"/>
        <Elasticsearch name="elasticsearch">
            <JacksonJsonLayout>
                <!-- let's test LogEvent mixin override -->
                <JacksonMixIn mixInClass="org.apache.logging.log4j.core.jackson.LogEventMixIn"
                              targetClass="org.apache.logging.log4j.core.LogEvent"/>
                <PooledItemSourceFactory poolName="itemPool" itemSizeInBytes="512" initialPoolSize="100"
                                         monitored="true" monitorTaskInterval="10000" resizeTimeout="500">
                    <UnlimitedResizePolicy resizeFactor="0.6" />
                </PooledItemSourceFactory>
            </JacksonJsonLayout>
            <AsyncBatchDelivery batchSize="100"
                               deliveryInterval="3000" >
            <IndexName indexName="myindex" />
                <JestBufferedHttp serverUris="http://localhost:9200"
                    connTimeout="500"
                    readTimeout="30000"
                    maxTotalConnection="40"
                    defaultMaxTotalConnectionPerRoute="8">
                    <PooledItemSourceFactory poolName="batchPool" itemSizeInBytes="5120000" initialPoolSize="3"
                                             monitored="true" monitorTaskInterval="10000" resizeTimeout="500">
                        <UnlimitedResizePolicy resizeFactor="0.70" />
                    </PooledItemSourceFactory>
                </JestBufferedHttp>
                <!--<AppenderRefFailoverPolicy>
                    <AppenderRef ref="CONSOLE" />
                </AppenderRefFailoverPolicy>-->
            </AsyncBatchDelivery>
        </Elasticsearch>
    </Appenders>

    <Loggers>
        <Logger name="org" level="info" />
        <Root level="info">
            <AppenderRef ref="console" />
            <AppenderRef ref="elasticsearch" />
        </Root>
    </Loggers>
</Configuration>

Use TimeZone.getDefault as default time zone

If no time zone is explicitly set, a new index may not be created when expected.
Example: the system time zone is Europe/Berlin (UTC+1):

<Appenders>
   <Elasticsearch name="elasticsearch">
       <JsonLayout compact="true" properties="true" />
	<RollingIndexName indexName="log4j2" pattern="yyyy-MM-dd"/>
	<AsyncBatchDelivery deliveryInterval="5000" batchSize="500">
	    <IndexTemplate name="log4j2" path="classpath:log4j2-index-template.json" />
		  <JestHttp serverUris="localhost:9200" />
	</AsyncBatchDelivery>
    </Elasticsearch>
</Appenders>

In this case no new index will be created at local midnight, because
2018-01-09 00:00:00 UTC+1 -> 2018-01-08 23:00:00 UTC, so the index name resolves to log4j2-2018-01-08.

Solution:
Use TimeZone.getDefault() as the default time zone.
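
The offset arithmetic above can be reproduced with plain JDK date formatting: the same instant yields different dates in UTC and Europe/Berlin, hence different rolled index names (the indexName helper below is only an illustration of what a rolling formatter computes):

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class IndexNameTimeZone {

    // Formats a dated index name for the given zone, like a rolling index formatter would.
    static String indexName(long epochMillis, String zoneId) {
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd");
        fmt.setTimeZone(TimeZone.getTimeZone(zoneId));
        return "log4j2-" + fmt.format(new Date(epochMillis));
    }

    public static void main(String[] args) {
        long millis = 1515454200000L; // 2018-01-08 23:30:00 UTC == 2018-01-09 00:30 in Berlin
        System.out.println(indexName(millis, "UTC"));           // log4j2-2018-01-08
        System.out.println(indexName(millis, "Europe/Berlin")); // log4j2-2018-01-09
    }
}
```

With TimeZone.getDefault() as the fallback, a Berlin-hosted JVM would roll to the new index at local midnight instead of one hour late.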

Additional fields for logs

Hi. I'm using your library to stream logs from a Spring application directly to Elasticsearch,
but I'm facing a problem: I don't understand how to add extra fields.
My log4j2.xml config looks like:

<Elasticsearch name="elasticsearchAsyncBatch">
            <IndexName indexName="log4j2test" />
            <JacksonJsonLayout afterburner="true">
                <PooledItemSourceFactory itemSizeInBytes="1024" initialPoolSize="4000" />
            </JacksonJsonLayout>
            <AsyncBatchDelivery>
                <IndexTemplate name="log4j2test" path="classpath:indexTemplate.json" />
                <JestBufferedHttp serverUris="http://localhost:9200">
                    <PooledItemSourceFactory itemSizeInBytes="1024000" initialPoolSize="4" />
                </JestBufferedHttp>
            </AsyncBatchDelivery>
        </Elasticsearch>

Logs in Elasticsearch look like:

    timeMillis: 1,559,135,125,693
    loggerName: com.test..api.UserController
    level: ERROR
    message: TEST
    thread: http-nio-3201-exec-6
    _id: GJkSVL8MFE2REW
    _type: index
    _index: log4j2test
    _score: 0

I want to add, for example: X-B3-ParentSpanId, X-B3-SpanId, X-B3-TraceId.

BTW, my Elasticsearch version is 7.0.1.
Any help is appreciated.

adding custom mappings

@rfoltyns
Hi Rafal
It would be great if you could also specify a path to a custom index template file. E.g. the field timeMillis is a long and will be indexed as a long; in that case you can't use this field as a time filter in Kibana. Furthermore, you cannot specify any other optimizations, e.g. dynamic templates.
Example index-template.json:

{
	"template": "log4j2*",
	"settings": {
		"index": {
			"refresh_interval": "5s"
		}
	},
	"mappings": {
		"_default_": {
			"dynamic_templates": [
				{
					"strings": {
						"mapping": {
							"type": "keyword"
						},
						"match_mapping_type": "string",
						"match": "*"
					}
				}
			],
			"_all": {
				"enabled": false
			},
			"properties": {
				"loggerName": {
					"type": "text",
					"fields": {
						"keyword": {
							"ignore_above": 256,
							"type": "keyword"
						}
					}
				},
				"message": {
					"type": "text",
					"fields": {
						"keyword": {
							"ignore_above": 256,
							"type": "keyword"
						}
					}
				},
				"timeMillis": {
					"type": "date",
					"format": "epoch_millis"
				}
			}
		}
	}
}

The template should be sent to ES during or after appender initialization.
Further information here: Index Templates

Control output on empty virtual property values

Hi,

Great work Rafal !

In order to conform to ECS, I use a lot of virtual properties with the thread context map, with an empty String as the default, for example <VirtualProperty name="event.category" value="$${ctx:event.category:-}" dynamic="true" />.

Is there a way to filter out empty properties from the output?

Thx

High number of "I/O dispatcher" threads using log4j2-elasticsearch-jest

log4j2-elasticsearch-jest can create a high number of "I/O dispatcher" threads. Actually, it creates one "I/O dispatcher" thread per available processor. Depending on the host machine, the thread count can be very high per JVM.

This seems to be caused by the httpcore-nio library (org.apache.httpcomponents:httpcore-nio version 4.4.4):
in IOReactorConfig, line 66:

this.ioThreadCount = AVAIL_PROCS;

Then the threads are created in AbstractMultiworkerIOReactor, line 324:

for (int i = 0; i < this.workerCount; i++) {
                final BaseIOReactor dispatcher = this.dispatchers[i];
                this.workers[i] = new Worker(dispatcher, eventDispatch);
                this.threads[i] = this.threadFactory.newThread(this.workers[i]);
}

Is there a workaround to avoid that and create only 1 or 2 threads?

Thanks!
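
For reference, httpcore-nio itself does allow the reactor thread count to be overridden; a sketch of the underlying setting (whether the appender's Jest client exposes this knob is a separate question):

    // requires org.apache.httpcomponents:httpcore-nio on the classpath
    IOReactorConfig ioReactorConfig = IOReactorConfig.custom()
            .setIoThreadCount(1) // default: Runtime.getRuntime().availableProcessors()
            .build();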

Package naming conventions

I'm using this brilliant library in a modular project (OSGi).
Of course the library is not OSGi compliant, but a bunch of wrapping instructions and a dirty solution for refreshing the log4j2 PluginRegistry and log4j2 context are enough to make it work in an Apache Karaf environment.

Newer versions are much more problematic to run in a modular environment because of the unusual package naming inside the jars.
For example, inside log4j2-elasticsearch-core we have:

  • com.fasterxml.jackson.core
  • org.apache.logging.log4j.core.jackson
  • org.appenders.core.logging
  • org.appenders.log4j2.elasticsearch

For now, the biggest problem is com.fasterxml.jackson.core. That package is already present in jackson-core, which breaks package resolution.

Is there a special reason for that kind of packaging? Most projects in the Java world try to keep things under a single package root, equal to the project groupId + moduleName. That really helps with modularity problems.

How to pass a value from ThreadContext to VirtualProperty?

Hi!
I tried passing a value from ThreadContext to VirtualProperty like this.

My code in the application:

ThreadContext.put("account", "elvisPresley");
Logger logger = LogManager.getLogger("elasticsearch");
logger.info("HELLO");

The section with VirtualProperty in log4j2.xml:

<JacksonJsonLayout>
    <VirtualProperty name="name" value="$${ctx:account:-undefined}" />
</JacksonJsonLayout>

And I keep getting 'undefined' in Kibana for my field.
Does anyone know how to solve this?
Is it my mistake or a bug?
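
One common cause: ThreadContext is backed by a thread-local map, so the ${ctx:account} lookup resolves to the default whenever serialization happens on a thread that never called ThreadContext.put. A plain-JDK sketch of that thread-locality (no Log4j2 involved; names are illustrative):

```java
public class ThreadContextScope {

    // Stand-in for Log4j2's ThreadContext, which is also backed by a thread-local map.
    static final ThreadLocal<String> CTX = new ThreadLocal<>();

    // Reads the value from a freshly started thread, which never called CTX.set().
    static String readFromOtherThread() {
        final String[] seen = new String[1];
        Thread t = new Thread(() -> seen[0] = CTX.get());
        t.start();
        try {
            t.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return seen[0];
    }

    public static void main(String[] args) {
        CTX.set("elvisPresley");
        System.out.println("same thread:  " + CTX.get());             // elvisPresley
        System.out.println("other thread: " + readFromOtherThread()); // null
    }
}
```

So it is worth checking that the put and the log call happen on the same thread, and that the VirtualProperty is marked dynamic="true" so it is resolved per event rather than once at startup.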

Could not create plugin of type class org.appenders.log4j2.elasticsearch.AsyncBatchDelivery for element AsyncBatchDelivery

2019-11-22 18:16:54,080 main ERROR Could not create plugin of type class org.appenders.log4j2.elasticsearch.AsyncBatchDelivery for element AsyncBatchDelivery org.apache.logging.log4j.core.config.ConfigurationException: Arguments given for element AsyncBatchDelivery are invalid: field 'clientObjectFactory' has invalid value 'null'
at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.injectFields(PluginBuilder.java:208)
at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.build(PluginBuilder.java:121)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createPluginObject(AbstractConfiguration.java:1002)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:942)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:934)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:934)
at org.apache.logging.log4j.core.config.AbstractConfiguration.doConfigure(AbstractConfiguration.java:552)
at org.apache.logging.log4j.core.config.AbstractConfiguration.initialize(AbstractConfiguration.java:241)
at org.apache.logging.log4j.core.config.AbstractConfiguration.start(AbstractConfiguration.java:288)
at org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:579)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:651)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:668)
at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:253)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:153)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:45)
at org.apache.logging.log4j.LogManager.getContext(LogManager.java:194)
at org.apache.commons.logging.LogAdapter$Log4jLog.<init>(LogAdapter.java:155)
at org.apache.commons.logging.LogAdapter$Log4jAdapter.createLog(LogAdapter.java:122)
at org.apache.commons.logging.LogAdapter.createLog(LogAdapter.java:89)
at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:67)
at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:59)
at org.springframework.boot.SpringApplication.<init>(SpringApplication.java:196)
at com.mosambee.ElasticSearchLog4j2SampleAppApplication.main(ElasticSearchLog4j2SampleAppApplication.java:12)

2019-11-22 18:16:54,081 main DEBUG Building Plugin[name=appender, class=org.appenders.log4j2.elasticsearch.ElasticsearchAppender].
2019-11-22 18:16:54,082 main ERROR No BatchDelivery method provided for ElasticSearch appender: batchDelivery
2019-11-22 18:16:54,083 main DEBUG ElasticsearchAppender$Builder(, name="elasticsearchAsyncBatch", ThresholdFilter(INFO), JacksonJsonLayout(org.appenders.log4j2.elasticsearch.JacksonJsonLayout@6392827e), ignoreExceptions="null", AsyncBatchDelivery(null), messageOnly="null", RollingIndexName(org.appenders.log4j2.elasticsearch.RollingIndexNameFormatter@2ed2d9cb))
2019-11-22 18:16:54,083 main ERROR Could not create plugin of type class org.appenders.log4j2.elasticsearch.ElasticsearchAppender for element Elasticsearch org.apache.logging.log4j.core.config.ConfigurationException: Arguments given for element Elasticsearch are invalid: field 'batchDelivery' has invalid value 'null'
at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.injectFields(PluginBuilder.java:208)
at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.build(PluginBuilder.java:121)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createPluginObject(AbstractConfiguration.java:1002)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:942)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:934)
at org.apache.logging.log4j.core.config.AbstractConfiguration.doConfigure(AbstractConfiguration.java:552)
at org.apache.logging.log4j.core.config.AbstractConfiguration.initialize(AbstractConfiguration.java:241)
at org.apache.logging.log4j.core.config.AbstractConfiguration.start(AbstractConfiguration.java:288)
at org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:579)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:651)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:668)
at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:253)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:153)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:45)
at org.apache.logging.log4j.LogManager.getContext(LogManager.java:194)
at org.apache.commons.logging.LogAdapter$Log4jLog.<init>(LogAdapter.java:155)
at org.apache.commons.logging.LogAdapter$Log4jAdapter.createLog(LogAdapter.java:122)
at org.apache.commons.logging.LogAdapter.createLog(LogAdapter.java:89)
at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:67)
at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:59)
at org.springframework.boot.SpringApplication.<init>(SpringApplication.java:196)
at com.mosambee.ElasticSearchLog4j2SampleAppApplication.main(ElasticSearchLog4j2SampleAppApplication.java:12)

indexname with date pattern

@rfoltyns is there any way to use a date pattern as part of the index name?
I tried 'log4j2-'yyyy-MM-dd but this doesn't work; no index is created.
If this is missing, it would be great if you could implement it.
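
For reference, other configurations in these issues achieve this with the RollingIndexName element instead of IndexName, e.g.:

    <RollingIndexName indexName="log4j2" pattern="yyyy-MM-dd" timeZone="Europe/Warsaw" />

where the pattern appears to follow SimpleDateFormat-style date patterns.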
