oracle / weblogic-logging-exporter
Export server logs from WebLogic Server in JSON format to Elasticsearch.

License: Universal Permissive License v1.0

Language: Java 100.00%
Topics: weblogic, elasticsearch, kibana, logs, logging, exporter

weblogic-logging-exporter's Introduction

WebLogic Logging Exporter

NOTE: The WebLogic Logging Exporter project has been archived. The repository is now read-only: all issues, pull requests, code, labels, milestones, and so on, are read-only as well. Users with access to the repository can only fork or star the project.

Users are encouraged to use Fluentd or Logstash. If you use Fluentd to export logs to Elasticsearch or OpenSearch, then you may be interested in the WebLogic Kubernetes Operator documentation that describes how you can use Fluentd to export WebLogic logs to Elasticsearch.

The goal of this project is to provide an easy-to-configure, robust, and production-ready solution for accessing WLS log information through Elasticsearch and Kibana.

The WebLogic Logging Exporter adds a log event handler to WebLogic Server so that WebLogic Server logs can be integrated directly into the Elastic Stack in Kubernetes, by using the Elasticsearch REST API.

The current version of the WebLogic Logging Exporter is 1.0.1, which was released on Wednesday, January 27, 2021. This version supports pushing logs into Elasticsearch using the REST API.

The following features are planned for the next few releases:

  • Push logs into a Fluentd aggregator using the REST API.
  • Write logs in JSON format into the file system so that they can be collected and published by a sidecar, for example, Fluentd or Logstash.
  • Provide the ability to publish other logs (that is, logs other than the server logs).

Contents

  • Download the release
  • Building from source
  • Installation
  • Running Elasticsearch and Kibana locally for testing
  • Contributing
  • Security
  • License

Download the release

You can download the WebLogic Logging Exporter already compiled for you from the releases page.

Building from source

If you prefer, you can build the WebLogic Logging Exporter from the source code. To do this, you will need access to some WebLogic Server libraries. There are two ways to get these libraries:

  • Populate your local Maven repository with the required files from a local WebLogic Server installation using the Oracle Maven Synchronization plugin, or
  • Use the Oracle Maven repository to download them as part of your build; this requires registration and configuring your local Maven installation with the appropriate authentication details.

Populating your local Maven repository from a local WebLogic Server installation

You can use the Oracle Maven Synchronization plugin, which is included in your WebLogic Server installation, to install the necessary dependencies into your local Maven repository.

There are two steps:

  • Install the Oracle Maven Synchronization plugin.
  • Run the push goal to populate your local Maven repository from your WebLogic Server installation.

Installing the Oracle Maven Synchronization plugin

To install the plugin, navigate to your WebLogic Server installation and enter the following commands (this example assumes WebLogic Server is installed in /u01/wlshome):

cd /u01/wlshome/oracle_common/plugins/maven/com/oracle/oracle-maven-sync/12.2.1
mvn install:install-file -DpomFile=oracle-maven-sync-12.2.1.pom -Dfile=oracle-maven-sync-12.2.1.jar
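
You can optionally confirm that the plugin is now available to Maven, for example with the Maven help plugin (the coordinates below match the 12.2.1 example above):

mvn help:describe -Dplugin=com.oracle.maven:oracle-maven-sync:12.2.1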

Populating your local Maven repository

To populate your local Maven repository from your WebLogic Server installation, enter this command:

mvn com.oracle.maven:oracle-maven-sync:push -DoracleHome=/u01/wlshome

You can verify that the dependencies were installed by looking in your local Maven repository, which is normally located at ~/.m2/repository/com/oracle/weblogic.
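
For example (assuming the default local repository location under ~/.m2):

ls ~/.m2/repository/com/oracle/weblogic

If the push goal succeeded, this directory contains the WebLogic Server POM and JAR artifacts that the build needs.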

Using the Oracle Maven repository

Note: If you populated your local repository using the Oracle Maven Synchronization plugin, then this step is not required.

To access the Oracle Maven repository, refer to the Oracle Maven repository documentation.
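
As a rough sketch of what that configuration involves (the repository URL and server id below are the standard Oracle Maven repository values; the complete HTTP wagon and authentication settings are covered in the Oracle documentation), your Maven settings.xml ends up with entries similar to:

```
<!-- Sketch only; refer to the Oracle Maven repository documentation for the complete settings. -->
<settings>
  <servers>
    <server>
      <!-- The id must match the repository id used below. -->
      <id>maven.oracle.com</id>
      <!-- Your Oracle Technology Network (OTN) credentials. -->
      <username>OTN_USERNAME</username>
      <password>OTN_PASSWORD</password>
    </server>
  </servers>
  <profiles>
    <profile>
      <id>oracle-maven-repo</id>
      <repositories>
        <repository>
          <id>maven.oracle.com</id>
          <url>https://maven.oracle.com</url>
        </repository>
      </repositories>
    </profile>
  </profiles>
  <activeProfiles>
    <activeProfile>oracle-maven-repo</activeProfile>
  </activeProfiles>
</settings>
```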

Building the WebLogic Logging Exporter

To build the WebLogic Logging Exporter, clone the project from GitHub and then build it with Maven:

git clone git@github.com:oracle/wls-logging-exporter.git
mvn install

The weblogic-logging-exporter.jar will be available under the target directory.

Installation

This section outlines the steps that are required to add the WebLogic Logging Exporter to WebLogic Server.

  1. Download or build the WebLogic Logging Exporter as described above.

  2. Copy the weblogic-logging-exporter.jar into a suitable location, e.g. into your domain directory.

  3. Add a startup class to your domain configuration.

    • In the Administration Console, navigate to "Environment" then "Startup and Shutdown classes" in the main menu.
    • Add a new Startup class. You may choose any descriptive name, but the class name must be weblogic.logging.exporter.Startup.
    • Target the startup class to each server that you want to export logs from.

    You can verify this by checking for the update in your config.xml which should be similar to this example:

    <startup-class>
        <name>LoggingExporterStartupClass</name>
        <target>AdminServer</target>
        <class-name>weblogic.logging.exporter.Startup</class-name>
    </startup-class>
    
  4. Add weblogic-logging-exporter.jar and snakeyaml-1.27.jar to your classpath.

    This project requires snakeyaml to parse the YAML configuration file. If you built the project locally, you can find this JAR file in your local maven repository at ~/.m2/repository/org/yaml/snakeyaml/1.27/snakeyaml-1.27.jar. Otherwise, you can download it from Maven Central.

    Place the file(s) in a suitable location, e.g. your domain directory.

    Update the server classpath to include these file(s). This can be done by adding a statement to the end of your setDomainEnv.sh script in your domain's bin directory as follows (this example assumes your domain directory is /u01/base_domain):

    export CLASSPATH="/u01/base_domain/weblogic-logging-exporter.jar:/u01/base_domain/snakeyaml-1.27.jar:$CLASSPATH"
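
    Putting the pieces of this step together, a sketch (assuming the domain directory /u01/base_domain used above, and fetching snakeyaml 1.27 directly from Maven Central) looks like:

    cd /u01/base_domain
    # place weblogic-logging-exporter.jar (downloaded or built earlier) in this directory
    curl -L -O https://repo1.maven.org/maven2/org/yaml/snakeyaml/1.27/snakeyaml-1.27.jar
    echo 'export CLASSPATH="/u01/base_domain/weblogic-logging-exporter.jar:/u01/base_domain/snakeyaml-1.27.jar:$CLASSPATH"' >> bin/setDomainEnv.sh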
    
  5. Create a configuration file for the WebLogic Logging Exporter.

    There are currently two options: the version 1.x configuration or the new version 2.x configuration. Note that the 2.x configuration is alpha and therefore subject to change as we get closer to the 2.0 release.

    a. Version 1.x configuration

    Create a file named `WebLogicLoggingExporter.yaml` in your domain's `config` directory.  You can copy the
    [sample provided in this project](samples/WebLogicLoggingExporter.yaml) as a starting point.  That sample
    contains details of all of the available configuration options.  A completed configuration file might look
    like this:
    
    ```
    publishHost:  localhost
    publishPort:  9200
    domainUID:  domain1
    weblogicLoggingExporterEnabled: true
    weblogicLoggingIndexName:  domain1-wls
    weblogicLoggingExporterSeverity:  Notice
    weblogicLoggingExporterBulkSize: 1
    weblogicLoggingExporterFilters:
    - filterExpression:  'severity > Warning'
    ```
    
    Note that you must give a unique `domainUID` to each domain.  This value is used to filter logs by domain when you
    send the logs from multiple domains to the same Elasticsearch server.  If you are using the WebLogic Kubernetes
    Operator, it is strongly recommended that you use the same `domainUID` value that you use for the domain.
    
    It is also strongly recommended that you use a different Elasticsearch index name for each domain.
    

    b. Version 2.x configuration

    If you prefer to place the configuration file in a different location, you can set the environment variable
    `WEBLOGIC_LOGGING_EXPORTER_CONFIG_FILE` to point to the location of the file.
    
    If you want to write the JSON logs to a file instead of sending them directly to Elasticsearch, use the following configuration
    [file](samples/WebLogicFileLoggingExporter.yaml) and adjust it to your needs. Make sure to rename it to WebLogicLoggingExporter.yaml.
    
  6. Restart the servers to activate the changes. After restarting, the servers will load the WebLogic Logging Exporter and start sending their logs to the specified Elasticsearch instance. You can then access the logs in Kibana, as shown in the example below. You will need to create an index pattern in Kibana first and then go to the visualization page.

Kibana screenshot

You can also use a curl command similar to the following example to verify that logs have been posted to Elasticsearch. The default index name is wls, and docs.count should be greater than zero indicating that log entries are being sent to Elasticsearch.

$ curl "localhost:9200/_cat/indices?v"
health status index               uuid pri rep docs.count docs.deleted store.size pri.store.size
yellow open   wls                 q4Q2v2dXTBOyYsHZMdDe3H 5   1         23            0      101kb          101kb
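
To inspect the exported documents themselves, you can query the index directly (this example assumes the default wls index name):

$ curl "localhost:9200/wls/_search?pretty&size=1"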

Running Elasticsearch and Kibana locally for testing

If you wish to test on your local machine, a sample is provided to run Elasticsearch and Kibana in Docker.
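
If you only need a disposable, single-node setup for a quick test, commands similar to the following also work; the image tags and settings here are illustrative and are not taken from the project's sample:

docker network create elastic
docker run -d --name elasticsearch --net elastic -p 9200:9200 -e "discovery.type=single-node" docker.elastic.co/elasticsearch/elasticsearch:7.17.9
docker run -d --name kibana --net elastic -p 5601:5601 -e "ELASTICSEARCH_HOSTS=http://elasticsearch:9200" docker.elastic.co/kibana/kibana:7.17.9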

Contributing

This project welcomes contributions from the community. Before submitting a pull request, please review our contribution guide.

Security

Please consult the security guide for our responsible security vulnerability disclosure process.

License

Copyright (c) 2017, 2021 Oracle and/or its affiliates.

Released under the Universal Permissive License v1.0 as shown at https://oss.oracle.com/licenses/upl/.

weblogic-logging-exporter's People

Contributors

anissalam, dependabot[bot], hzhao-github, markxnelson, mriccell, rjeberhard, rosemarymarano


weblogic-logging-exporter's Issues

Export logs to Elasticsearch server using HTTPS

It would be very helpful for customers using Elastic Cloud if the WebLogic logging tools supported exporting the WLS operator logs to ELK over an HTTPS connection.

Currently, we can specify an existing Elasticsearch server using a command like:

helm upgrade \
  --namespace sample-weblogic-operator-ns \
  --set image=ghcr.io/oracle/weblogic-kubernetes-operator:3.2.1 \
  --set serviceAccount=sample-weblogic-operator-sa \
  --set "elkIntegrationEnabled=true" \
  --set "elasticSearchHost=sample.elasticsearch.com" \
  --set "elasticSearchPort=9200" \
  --set "enableClusterRoleBinding=true" \
  --set "domainNamespaceSelectionStrategy=LabelSelector" \
  --set "domainNamespaceLabelSelector=weblogic-operator\=enabled" \
  --wait \
  weblogic-operator \
  kubernetes/charts/weblogic-operator

It would be great to be able to specify the scheme with a variable, like:

helm upgrade \
  --namespace sample-weblogic-operator-ns \
  --set image=ghcr.io/oracle/weblogic-kubernetes-operator:3.2.1 \
  --set serviceAccount=sample-weblogic-operator-sa \
  --set "elkIntegrationEnabled=true" \
  --set "elasticSearchHost=sample.elasticsearch.com" \
  --set "elasticSearchPort=9200" \
  --set "elasticSearchUser=elastic" \
  --set "elasticSearchPassword=111111111111111111111" \
  --set "elasticSearchScheme=https" \
  --set "enableClusterRoleBinding=true" \
  --set "domainNamespaceSelectionStrategy=LabelSelector" \
  --set "domainNamespaceLabelSelector=weblogic-operator\=enabled" \
  --wait \
  weblogic-operator \
  kubernetes/charts/weblogic-operator

support for ODL format and specifying a Logger by name

This is more of an enhancement request than a problem...
There is a need to support logging for components on top of WebLogic, like Service Bus, Analytics Publisher, and more.
Instead of using the default WebLogic logger, one must be able to specify the logger by name, like "oracle.osb".
The ODL format is different from standard WebLogic logging, almost like JSON, with brackets around most of the items:
[timestamp] [component id] [messagetype:level] [message-id] [module id]
([field-name: field-value])* message-text [supplemental-detail]

Startup class error

I have followed the guide to install the logging exporter, but after starting my WebLogic Server I get an error message that I can't solve.

I am using:
WebLogic Version: 14.1.1.
Linux: Oracle Linux

Error Message:
<Failed to invoke startup class "StartupClass-0", java.lang.NoClassDefFoundError: org/yaml/snakeyaml/scanner/ScannerException.
java.lang.NoClassDefFoundError: org/yaml/snakeyaml/scanner/ScannerException
at weblogic.logging.exporter.Startup.main(Startup.java:34)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
Truncated. see log file for complete stacktrace
Caused By: java.lang.ClassNotFoundException: org.yaml.snakeyaml.scanner.ScannerException
at com.oracle.classloader.PolicyClassLoader.findClass(PolicyClassLoader.java:399)
at com.oracle.classloader.PolicyClassLoader.loadClass(PolicyClassLoader.java:372)
at com.oracle.classloader.weblogic.LaunchClassLoader.loadClass(LaunchClassLoader.java:55)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
at weblogic.logging.exporter.Startup.main(Startup.java:34)
Truncated. see log file for complete stacktrace

Does somebody have a solution for this problem?

Response had end of stream after 0 bytes

Hi,

I have followed and configured all the steps, and it was successful, with the message below:

`======================= Weblogic Logging Exporter Startup class called

JavaProperty/EnvVariable WEBLOGIC_LOGGING_EXPORTER_CONFIG_FILE:null

Env variable WEBLOGIC_LOGGING_EXPORTER_CONFIG_FILE is not set. Defaulting to:config/WebLogicLoggingExporter.yaml

Reading configuration from file name: C:\Oracle\Middleware\wls_12214\user_projects\domains\cari_domain\config\WebLogicLoggingExporter.yaml

Notice - Log Management - The server has successfully established a connection with the Domain level Diagnostic Service.

Config{weblogicLoggingIndexName='wls', publishHost='localhost', publishPort=9200, weblogicLoggingExporterSeverity='Notice', weblogicLoggingExporterBulkSize='2', enabled=true, weblogicLoggingExporterFilters=[FilterConfig{expression='MSGID != 'BEA-000449'', servers=[]}], domainUID='base_domain'}`

But then, the exception below occurred.

javax.ws.rs.ProcessingException: java.io.EOFException: Response had end of stream after 0 bytes
at org.glassfish.jersey.client.internal.HttpUrlConnector.apply(HttpUrlConnector.java:287)
at org.glassfish.jersey.client.ClientRuntime.invoke(ClientRuntime.java:255)
at org.glassfish.jersey.client.JerseyInvocation$1.call(JerseyInvocation.java:684)
at org.glassfish.jersey.client.JerseyInvocation$1.call(JerseyInvocation.java:681)
at org.glassfish.jersey.internal.Errors.process(Errors.java:315)
at org.glassfish.jersey.internal.Errors.process(Errors.java:297)
at org.glassfish.jersey.internal.Errors.process(Errors.java:228)
at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:444)
at org.glassfish.jersey.client.JerseyInvocation.invoke(JerseyInvocation.java:681)
at org.glassfish.jersey.client.JerseyInvocation$Builder.method(JerseyInvocation.java:437)
at org.glassfish.jersey.client.JerseyInvocation$Builder.put(JerseyInvocation.java:326)
at weblogic.logging.exporter.LogExportHandler.executePutOrPostOnUrl(LogExportHandler.java:167)
at weblogic.logging.exporter.LogExportHandler.createMappings(LogExportHandler.java:297)
at weblogic.logging.exporter.LogExportHandler.<init>(LogExportHandler.java:53)
at weblogic.logging.exporter.Startup.main(Startup.java:45)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at weblogic.management.deploy.classdeployment.ClassDeploymentManager.invokeMain(ClassDeploymentManager.java:449)
at weblogic.management.deploy.classdeployment.ClassDeploymentManager.invokeClass(ClassDeploymentManager.java:359)
at weblogic.management.deploy.classdeployment.ClassDeploymentManager.access$100(ClassDeploymentManager.java:63)
at weblogic.management.deploy.classdeployment.ClassDeploymentManager$1.run(ClassDeploymentManager.java:286)
at weblogic.management.deploy.classdeployment.ClassDeploymentManager$1.run(ClassDeploymentManager.java:273)
at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:344)
at weblogic.security.service.SecurityManager.runAsForUserCode(SecurityManager.java:197)
at weblogic.management.deploy.classdeployment.ClassDeploymentManager.invokeClassDeployment(ClassDeploymentManager.java:272)
at weblogic.management.deploy.classdeployment.ClassDeploymentManager.invokeClassDeployments(ClassDeploymentManager.java:253)
at weblogic.management.deploy.classdeployment.ClassDeploymentManager.runStartupsAfterAppAdminState(ClassDeploymentManager.java:215)
at weblogic.management.deploy.classdeployment.StartupClassPrelistenService.start(StartupClassPrelistenService.java:29)
at weblogic.server.AbstractServerService.postConstruct(AbstractServerService.java:76)
at sun.reflect.GeneratedMethodAccessor5.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.glassfish.hk2.utilities.reflection.ReflectionHelper.invoke(ReflectionHelper.java:1287)
at org.jvnet.hk2.internal.ClazzCreator.postConstructMe(ClazzCreator.java:333)
at org.jvnet.hk2.internal.ClazzCreator.create(ClazzCreator.java:375)
at org.jvnet.hk2.internal.SystemDescriptor.create(SystemDescriptor.java:487)
at org.glassfish.hk2.runlevel.internal.AsyncRunLevelContext.findOrCreate(AsyncRunLevelContext.java:305)
at org.glassfish.hk2.runlevel.RunLevelContext.findOrCreate(RunLevelContext.java:85)
at org.jvnet.hk2.internal.Utilities.createService(Utilities.java:2126)
at org.jvnet.hk2.internal.ServiceHandleImpl.getService(ServiceHandleImpl.java:116)
at org.jvnet.hk2.internal.ServiceHandleImpl.getService(ServiceHandleImpl.java:90)
at org.glassfish.hk2.runlevel.internal.CurrentTaskFuture$QueueRunner.oneJob(CurrentTaskFuture.java:1237)
at org.glassfish.hk2.runlevel.internal.CurrentTaskFuture$QueueRunner.run(CurrentTaskFuture.java:1168)
at weblogic.work.SelfTuningWorkManagerImpl$WorkAdapterImpl.run(SelfTuningWorkManagerImpl.java:681)
at weblogic.invocation.ComponentInvocationContextManager._runAs(ComponentInvocationContextManager.java:352)
at weblogic.invocation.ComponentInvocationContextManager.runAs(ComponentInvocationContextManager.java:337)
at weblogic.work.LivePartitionUtility.doRunWorkUnderContext(LivePartitionUtility.java:57)
at weblogic.work.PartitionUtility.runWorkUnderContext(PartitionUtility.java:41)
at weblogic.work.SelfTuningWorkManagerImpl.runWorkUnderContext(SelfTuningWorkManagerImpl.java:655)
at weblogic.work.ExecuteThread.execute(ExecuteThread.java:420)
at weblogic.work.ExecuteThread.run(ExecuteThread.java:360)
Caused by: java.io.EOFException: Response had end of stream after 0 bytes
at weblogic.net.http.MessageHeader.isHTTP(MessageHeader.java:312)
at weblogic.net.http.MessageHeader.parseHeader(MessageHeader.java:232)
at weblogic.net.http.HttpClient.parseHTTP(HttpClient.java:556)
at weblogic.net.http.HttpURLConnection.getInputStream(HttpURLConnection.java:757)
at weblogic.net.http.SOAPHttpURLConnection.getInputStream(SOAPHttpURLConnection.java:42)
at weblogic.net.http.HttpURLConnection.getResponseCode(HttpURLConnection.java:1616)
at org.glassfish.jersey.client.internal.HttpUrlConnector._apply(HttpUrlConnector.java:394)
at org.glassfish.jersey.client.internal.HttpUrlConnector.apply(HttpUrlConnector.java:285)
... 52 more

Support "data streams"

Elasticsearch has had data streams for a while now. However, the logging exporter doesn't send documents in the right way.

According to here: https://www.elastic.co/guide/en/elasticsearch/reference/7.10/use-a-data-stream.html

"To add multiple documents with a single request, use the bulk API. Only create actions are supported."

curl -X PUT "localhost:9200/my-data-stream/_bulk?refresh&pretty" -H 'Content-Type: application/json' -d'
{"create":{ }}
{ "@timestamp": "2099-03-08T11:04:05.000Z", "user": { "id": "vlb44hny" }, "message": "Login attempt failed" }
{"create":{ }}
{ "@timestamp": "2099-03-08T11:06:07.000Z", "user": { "id": "8a4f500d" }, "message": "Login successful" }
{"create":{ }}
{ "@timestamp": "2099-03-09T11:07:08.000Z", "user": { "id": "l7gk7f82" }, "message": "Logout successful" }
'
But the exporter sends:

POST /wls/doc/_bulk?pretty HTTP/1.1
Accept: application/json
Content-Type: application/json
User-Agent: Jersey/2.22.4 (HttpUrlConnection 1.8.0_281)
Host: localhost:9200
Connection: Keep-Alive
Content-Length: 1512

{ "index" : { }}
{"messageID": "BEA-2162611","message": "Creating ManagedScheduledExecutorService "DefaultManagedScheduledExecutorService" (partition="DOMAIN", module="null", application="bea_wls_deployment_internal", workmanager="default")","timestamp": 1641037065050,"serverName": "server_8080","threadName": "[ACTIVE] ExecuteThread: '10' for queue: 'weblogic.kernel.Default (self-tuning)'","severity": "Info","userId": "","level": "Info","loggerName": "CONCURRENCY","formattedDate": "Jan 1, 2022 11:37:45,050 AM UTC","subSystem": "CONCURRENCY","machineName": "server","transactionId": "","diagnosticContextId": "7434236c-676a-4857-839f-9b4b00bc7fc7-000000a5","sequenceNumber": 5865,"domainUID": "domainid"}

and ES produces an error:

{
"took" : 0,
"errors" : true,
"items" : [
{
"index" : {
"_index" : "wls",
"_type" : "doc",
"_id" : null,
"status" : 400,
"error" : {
"type" : "illegal_argument_exception",
"reason" : "only write ops with an op_type of create are allowed in data streams"
}
}
},
{
"index" : {
"_index" : "wls",
"_type" : "doc",
"_id" : null,
"status" : 400,
"error" : {
"type" : "illegal_argument_exception",
"reason" : "only write ops with an op_type of create are allowed in data streams"
}
}
}
]
}

Since (I assume) the exporter never needs to update a doc, it could use "create" here. I'll try this out and submit a pull request if it works.

Push data to ES, got successful=false

<weblogic.logging.exporter.LogExportHandler> logging of { "index" : { }}
{"messageID": "BEA-000337","message": "[STUCK] ExecuteThread: '5' for queue: 'weblogic.kernel.Default (self-tuning)' has been busy for "115" seconds working on the request "Http Request Information: weblogic.servlet.internal.ServletRequestImpl@c6bc2dc[GET /WebRoot/stuck.jsp]
", which is more than the configured time (StuckThreadMaxTime) of "60" seconds in "server-failure-trigger". Stack trace:
java.lang.Thread.sleep(Native Method)
jsp_servlet.__stuck._jspService(__stuck.java:91)
weblogic.servlet.jsp.JspBase.service(JspBase.java:35)
weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:295)
weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:260)
weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:137)
weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:353)
weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:250)
weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3862)
weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3829)
weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:344)
weblogic.security.service.SecurityManager.runAsForUserCode(SecurityManager.java:197)
weblogic.servlet.provider.WlsSecurityProvider.runAsForUserCode(WlsSecurityProvider.java:203)
weblogic.servlet.provider.WlsSubjectHandle.run(WlsSubjectHandle.java:71)
weblogic.servlet.internal.WebAppServletContext.processSecuredExecute(WebAppServletContext.java:2502)
weblogic.servlet.internal.WebAppServletContext.doSecuredExecute(WebAppServletContext.java:2351)
weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2326)
weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2304)
weblogic.servlet.internal.ServletRequestImpl.runInternal(ServletRequestImpl.java:1779)
weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1733)
weblogic.servlet.provider.ContainerSupportProviderImpl$WlsRequestExecutor.run(ContainerSupportProviderImpl.java:272)
weblogic.invocation.ComponentInvocationContextManager._runAs(ComponentInvocationContextManager.java:352)
weblogic.invocation.ComponentInvocationContextManager.runAs(ComponentInvocationContextManager.java:337)
weblogic.work.LivePartitionUtility.doRunWorkUnderContext(LivePartitionUtility.java:57)
weblogic.work.PartitionUtility.runWorkUnderContext(PartitionUtility.java:41)
weblogic.work.SelfTuningWorkManagerImpl.runWorkUnderContext(SelfTuningWorkManagerImpl.java:651)
weblogic.work.ExecuteThread.execute(ExecuteThread.java:420)
weblogic.work.ExecuteThread.run(ExecuteThread.java:360)
","timestamp": 1599719366983,"serverName": "AdminServer","threadName": "[ACTIVE] ExecuteThread: '9' for queue: 'weblogic.kernel.Default (self-tuning)'","severity": "Error","userId": "","level": "Error","loggerName": "WebLogicServer","formattedDate": "Sep 10, 2020 2:29:26,983 PM CST","subSystem": "WebLogicServer","machineName": "testServer210","transactionId": "","diagnosticContextId": "8040dd9a-a265-4ede-858b-4c19734e1de9-00000015","sequenceNumber": 258,"domainUID": "testDomain"}
{ "index" : { }}
{"messageID": "BEA-002959","message": "Self-tuning thread pool contains 2 running threads, 10 idle threads, and 0 standby threads","timestamp": 1599719473889,"serverName": "AdminServer","threadName": "Timer-2","severity": "Info","userId": "","level": "Info","loggerName": "WorkManager","formattedDate": "Sep 10, 2020 2:31:13,889 PM CST","subSystem": "WorkManager","machineName": "testServer210","transactionId": "","diagnosticContextId": "8040dd9a-a265-4ede-858b-4c19734e1de9-00000010","sequenceNumber": 259,"domainUID": "testDomain"}
got result Result{response='null', status=400, successful=false}

My WebLogicLoggingExporter.yaml is as follows:
weblogicLoggingIndexName: wls
publishHost: 192.168.3.210
publishPort: 9200
domainUID: testDomain
weblogicLoggingExporterEnabled: true
weblogicLoggingExporterSeverity: Debug
weblogicLoggingExporterBulkSize: 2

How to fix this issue?
Thanks.

Feature to write to a file system

Hello,

We would like to use the feature "Write logs in JSON format into the file system so that they could be collected and published by a sidecar, e.g. fluentd or Logstash.", but I can't see an example for the configuration in "WebLogicLoggingExporter.yaml".

Can you help me with that, please?

Kind regards
Ralf

Application Log export

It would be nice to have a way to provide application logs via the same/similar process.

Support exporting logs to Elasticsearch cluster with multiple nodes

We have a scenario that consists of 2 Elasticsearch (ES) master nodes deployed on Kubernetes and exposed as a 'NodePort' service, so that we can access the ES service using either node's IP/port and can also use multiple node addresses in, for example, a Logstash configuration.

In such a case, we should be able to use both nodes' IP/port as publishHost and publishPort (as a failover mechanism).

I think this will be a required feature in the WebLogic exporter, unless there is already a way to do this.
Also, when the ES service is exposed via a load balancer, the WebLogic exporter should support configuring the load balancer URL instead of specifying the host/port of the ES service.
