

pgAudit Analyze
Open Source PostgreSQL pgAudit Analyzer

Introduction

The PostgreSQL Audit extension (pgAudit) provides detailed session and/or object audit logging via the standard PostgreSQL logging facility. However, logs are not the ideal place to store audit information. The PostgreSQL Audit Log Analyzer (pgAudit Analyze) reads audit entries from the PostgreSQL logs and loads them into a database schema to aid in analysis and auditing.

Installation

  • Install pgAudit following the instructions included with the extension.

  • Update the log settings in postgresql.conf as follows:

log_destination = 'csvlog'
logging_collector = on
log_connections = on

The log files must end in .csv and follow a naming convention that ensures they sort alphabetically with respect to creation time. The log directory is passed to pgAudit Analyze on the command line (see Running below).
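For example, the common date-based pattern sorts alphabetically in creation order (a suggested setting, not one mandated by pgAudit Analyze):

log_filename = 'postgresql-%Y-%m-%d_%H%M%S.log'

With log_destination = 'csvlog', the logging collector writes the CSV output using the same pattern with the .log suffix replaced by .csv.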

  • Install pgAudit Analyze:

Copy the bin and lib directories to any location you prefer, but make sure they remain in the same parent directory.

  • Execute audit.sql in the database you want to audit as postgres:
psql -U postgres -f sql/audit.sql <db name>
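To confirm the installation, a quick hedged check that the audit objects exist (audit.sql creates the pgaudit schema and its tables, such as pgaudit.log_event and pgaudit.logon referenced below):

psql -U postgres -d <db name> -c '\dt pgaudit.*'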

Running

pgAudit Analyze is intended to be run as a daemon process.

./pgaudit_analyze --daemon /path/to/log/files
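The daemon also accepts the options seen in the invocations quoted in the issues below; a hedged example with placeholder paths:

./pgaudit_analyze --daemon --user postgres --log-file /path/to/pgaudit_analyze.log /path/to/log/files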

Testing

Regression tests are located in the test directory. See test/README.md for more information.

Caveats

  • The pgaudit.logon table contains the logon information for users of the database. If a user is renamed, the user must also be renamed in this table or the logon history will be lost (see the sketch after this list).

  • Reads and writes to the pgAudit schema by the user running pgAudit Analyze are never logged.
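A minimal sketch of the rename described in the first caveat, assuming pgaudit.logon keys logon history by a user_name column (the column name is an assumption; verify it against audit.sql):

update pgaudit.logon
   set user_name = 'new_name'  -- assumed column name; check audit.sql
 where user_name = 'old_name';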

Author

The PostgreSQL Audit Log Analyzer was written by David Steele.

Contributors

amalek215, cmwshang, crunchyheath, dwsteele, prlaurence


Issues

pgaudit schema being audited

Hi David,

Installed pgaudit/pgaudit_analyze on PostgreSQL 9.5 successfully, but while the analyzer daemon is running I see thousands of log entries for the pgaudit schema tables by the user postgres.

I tried ALTER ROLE on the postgres user to set pgaudit.role to '' and pgaudit.log to 'none', but I see no effect in the log files.

The log grows very fast, about 100 MB/min.

This is the kind of entry filling my logs; this example is an insert into pgaudit.log_event, but other pgaudit tables are being audited too:

2019-05-28 12:23:15.168 EST,"postgres","MYDB",5722,"[local]",5cec92b3.165a,91403524,"INSERT",2019-05-27 20:45:23 EST,3/22984457,190169372,LOG,00000,"AUDIT: SESSION,91403522,1,WRITE,INSERT,,,""insert into pgaudit.log_event (session_id, log_time, session_line_num, command, error_severity, sql_state_code,virtual_transaction_id, transaction_id, message, detail, hint, query, query_pos,
internal_query, internal_query_pos, context, location)
values ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, $13, $14, $15, $16, $17)"",""5cec92b3.165a,2019-05-28 11:15:40.658-05,84846193,insert,log,00000,3/21340228,187976738,,,,,,,,,""",,,,,,,,,""

Any advice on how to correct this issue? Did I misconfigure something?

Thanks in advance
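For reference, the per-role overrides the reporter describes would look like the following (a hedged reconstruction of the attempt; note that per the Caveats above, only activity by the user actually running pgAudit Analyze is exempt from logging):

alter role postgres set pgaudit.log = 'none';
alter role postgres set pgaudit.role = '';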

Question: Unable to start pgaudit_analyze - DB root does not exist

Hello team,

I am using the script below to run pgaudit_analyze as a service.

[Service]
User=postgres
Group=postgres
Type=simple
ExecStart=/bin/bash -c '/app/extensions/pgaudit_analyze/bin/pgaudit_analyze --user postgres --log-file /app/log/postgresql/pganalyze.log /var/log/postgresql/'
TimeoutStartSec=0

The PostgreSQL config has been modified to output logs as CSV.

The error I get in /app/log/postgresql/pganalyze.log is that database "root" does not exist.

I am running the service as the postgres user.

reading postgresql-12-main.csv
DBI connect('dbname=root;port=5432;','postgres',...) failed: FATAL:  database "root" does not exist at /app/extensions/pgaudit_analyze/bin/pgaudit_analyze line 166.
 at /app/extensions/pgaudit_analyze/bin/pgaudit_analyze line 44, <$hFile> line 1098.
        main::__ANON__('DBI connect(\'dbname=root;port=5432;\',\'postgres\',...) fail...') called at /usr/share/perl5/vendor_perl/Carp.pm line 100
        Carp::croak('DBI connect(\'dbname=root;port=5432;\',\'postgres\',...) fail...') called at /usr/lib64/perl5/vendor_perl/DBI.pm line 683
        DBI::__ANON__(undef, undef) called at /usr/lib64/perl5/vendor_perl/DBI.pm line 739
        DBI::connect('DBI', 'dbi:Pg:dbname=root;port=5432;', 'postgres', undef, 'HASH(0x21effe0)') called at /app/extensions/pgaudit_analyze/bin/pgaudit_analyze line 166
        main::databaseGet('root') called at /app/extensions/pgaudit_analyze/bin/pgaudit_analyze line 693
        eval {...} called at /app/extensions/pgaudit_analyze/bin/pgaudit_analyze line 649


If I run the command directly in the postgres user's shell, I get the same error.

Can you please help me resolve this issue?

pgaudit_analyze - unable to open log file

Hi, greetings!
I have configured pgaudit successfully and I can see the change in the log files.
After that I followed the steps to configure pgaudit_analyze: made the modifications in postgresql.conf and executed the audit.sql file (all objects were created in the database).
Now when I run the following command:

./pgaudit_analyze --daemon

I am getting the following error:

Unable to open log file: Permission denied at ./pgaudit_analyze line 633
at ./pgaudit_analyze line 44
main::__ANON__('Unable to open log file: Permission denied at ./pgaudit_analy...') called at /usr/share/perl5/Carp.pm line 45
Carp::confess('Unable to open log file: Permission denied') called at ./pgaudit_analyze line 633

I have checked the permissions of the logs in pg_log; they are all owned by the postgres user.

I really appreciate any help.

It does not work with postgres 11 / segmentation fault

While configuring:

[root@postgressql-demo bin]# ./pgaudit_analyze --daemon /var/lib/pgsql/11/data/log
Segmentation fault
[root@postgressql-demo bin]#

I can see the following in /var/log/messages:

Mar 15 08:06:38 postgressql-demo systemd: Started PostgreSQL 11 database server.
Mar 15 08:23:43 postgressql-demo kernel: pgaudit_analyze[15126]: segfault at 0 ip 00007feebc1ae05f sp 00007ffed81b23b8 error 4 in libc-2.17.so[7feebc068000+1c2000]

pg logs filling up very fast after starting the pgAudit Analyzer daemon

Hi,
I have set up pgAudit/pgAudit Analyze on our Postgres server. After starting the pgAudit Analyze daemon, the Postgres log files (and the csv files) fill up continuously and very fast (1 MB every 5 seconds).
I have found that it is related to the parameter log_duration = on, which we have enabled in our postgresql.conf. We need this parameter on for our pgBadger process to include query timing data in its reports.
As soon as we kill the pgAudit Analyze daemon, the entries stop in both the log and csv files.
I have also verified that after setting log_duration = off the log/csv files stop growing rapidly.
This is what is continuously logged in the csv files when log_duration = on and the pgAudit Analyze daemon is running:

2017-06-26 10:25:00.081 ACST,"postgres","postgres",21741,"[local]",59505b04.54ed,164481,"COMMIT",2017-06-26 10:23:24 ACST,5/0,0,LOG,00000,"duration: 0.484 ms",,,,,,,,,""
2017-06-26 10:25:00.081 ACST,"postgres","postgres",21741,"[local]",59505b04.54ed,164482,"BEGIN",2017-06-26 10:23:24 ACST,5/49734,0,LOG,00000,"duration: 0.013 ms",,,,,,,,,""
2017-06-26 10:25:00.081 ACST,"postgres","postgres",21741,"[local]",59505b04.54ed,164483,"BIND",2017-06-26 10:23:24 ACST,5/49734,0,LOG,00000,"duration: 0.023 ms",,,,,,,,,""
2017-06-26 10:25:00.081 ACST,"postgres","postgres",21741,"[local]",59505b04.54ed,164484,"INSERT",2017-06-26 10:23:24 ACST,5/49734,368612248,LOG,00000,"duration: 0.077 ms",,,,,,,,,""
2017-06-26 10:25:00.082 ACST,"postgres","postgres",21741,"[local]",59505b04.54ed,164485,"COMMIT",2017-06-26 10:23:24 ACST,5/0,0,LOG,00000,"duration: 0.486 ms",,,,,,,,,""
2017-06-26 10:25:00.082 ACST,"postgres","postgres",21741,"[local]",59505b04.54ed,164486,"BEGIN",2017-06-26 10:23:24 ACST,5/49735,0,LOG,00000,"duration: 0.014 ms",,,,,,,,,""
2017-06-26 10:25:00.082 ACST,"postgres","postgres",21741,"[local]",59505b04.54ed,164487,"BIND",2017-06-26 10:23:24 ACST,5/49735,0,LOG,00000,"duration: 0.023 ms",,,,,,,,,""
2017-06-26 10:25:00.082 ACST,"postgres","postgres",21741,"[local]",59505b04.54ed,164488,"INSERT",2017-06-26 10:23:24 ACST,5/49735,368612249,LOG,00000,"duration: 0.077 ms",,,,,,,,,""
2017-06-26 10:25:00.083 ACST,"postgres","postgres",21741,"[local]",59505b04.54ed,164489,"COMMIT",2017-06-26 10:23:24 ACST,5/0,0,LOG,00000,"duration: 0.482 ms",,,,,,,,,""
2017-06-26 10:25:00.083 ACST,"postgres","postgres",21741,"[local]",59505b04.54ed,164490,"BEGIN",2017-06-26 10:23:24 ACST,5/49736,0,LOG,00000,"duration: 0.014 ms",,,,,,,,,""
2017-06-26 10:25:00.083 ACST,"postgres","postgres",21741,"[local]",59505b04.54ed,164491,"BIND",2017-06-26 10:23:24 ACST,5/49736,0,LOG,00000,"duration: 0.023 ms",,,,,,,,,""
2017-06-26 10:25:00.083 ACST,"postgres","postgres",21741,"[local]",59505b04.54ed,164492,"INSERT",2017-06-26 10:23:24 ACST,5/49736,368612250,LOG,00000,"duration: 0.077 ms",,,,,,,,,""
2017-06-26 10:25:00.084 ACST,"postgres","postgres",21741,"[local]",59505b04.54ed,164493,"COMMIT",2017-06-26 10:23:24 ACST,5/0,0,LOG,00000,"duration: 0.505 ms",,,,,,,,,""
2017-06-26 10:25:00.084 ACST,"postgres","postgres",21741,"[local]",59505b04.54ed,164494,"BEGIN",2017-06-26 10:23:24 ACST,5/49737,0,LOG,00000,"duration: 0.014 ms",,,,,,,,,""
2017-06-26 10:25:00.084 ACST,"postgres","postgres",21741,"[local]",59505b04.54ed,164495,"BIND",2017-06-26 10:23:24 ACST,5/49737,0,LOG,00000,"duration: 0.023 ms",,,,,,,,,""
2017-06-26 10:25:00.084 ACST,"postgres","postgres",21741,"[local]",59505b04.54ed,164496,"INSERT",2017-06-26 10:23:24 ACST,5/49737,368612251,LOG,00000,"duration: 0.076 ms",,,,,,,,,""

This is what is in the log file:

2017-06-26 09:53:55 ACST [16083]: [49345-1] db=postgres,user=postgres app=[unknown],host=[local] LOG: duration: 0.491 ms
2017-06-26 09:53:55 ACST [16083]: [49346-1] db=postgres,user=postgres app=[unknown],host=[local] LOG: duration: 0.014 ms
2017-06-26 09:53:55 ACST [16083]: [49347-1] db=postgres,user=postgres app=[unknown],host=[local] LOG: duration: 0.024 ms
2017-06-26 09:53:55 ACST [16083]: [49348-1] db=postgres,user=postgres app=[unknown],host=[local] LOG: duration: 0.077 ms
2017-06-26 09:53:55 ACST [16083]: [49349-1] db=postgres,user=postgres app=[unknown],host=[local] LOG: duration: 0.463 ms
2017-06-26 09:53:55 ACST [16083]: [49350-1] db=postgres,user=postgres app=[unknown],host=[local] LOG: duration: 0.014 ms
2017-06-26 09:53:55 ACST [16083]: [49351-1] db=postgres,user=postgres app=[unknown],host=[local] LOG: duration: 0.024 ms
2017-06-26 09:53:55 ACST [16083]: [49352-1] db=postgres,user=postgres app=[unknown],host=[local] LOG: duration: 0.077 ms
2017-06-26 09:53:55 ACST [16083]: [49353-1] db=postgres,user=postgres app=[unknown],host=[local] LOG: duration: 0.486 ms
2017-06-26 09:53:55 ACST [16083]: [49354-1] db=postgres,user=postgres app=[unknown],host=[local] LOG: duration: 0.014 ms

Thanks for your help.

Regards
Dylan

pganalyze stuck in a loop or stops writing to pgaudit schema

Hi,

Pganalyze is stuck in a loop with the error "unable to connect to database".

Case 1 -

We had 4 databases in the cluster, and then we dropped one database as it was a test db. After dropping the db, pganalyze keeps trying to connect to the test db with this error:

2022-08-20 12:16:43: reading postgresql-Sat.csv
2022-08-20 12:16:43: DBI connect('dbname=PLW_DEV_03_26072022;port=5432;','postgres',...) failed: connection to server on socket "/var/run/postgresql/.s.PGSQL.5432" failed: FATAL: database "PLW_DEV_03_26072022" does not exist at /usr/pgaudit_analyze/bin/pgaudit_analyze line 166.
at /usr/pgaudit_analyze/bin/pgaudit_analyze line 44, <$hFile> line 616.
main::__ANON__('DBI connect('dbname=PLW_DEV_03_26072022;port=5432;','postg...') called at /usr/share/perl5/vendor_perl/Carp.pm line 100
Carp::croak('DBI connect('dbname=PLW_DEV_03_26072022;port=5432;','postg...') called at /usr/lib64/perl5/vendor_perl/DBI.pm line 683
DBI::__ANON__(undef, undef) called at /usr/lib64/perl5/vendor_perl/DBI.pm line 739
DBI::connect('DBI', 'dbi:Pg:dbname=PLW_DEV_03_26072022;port=5432;', 'postgres', undef, 'HASH(0x15ef130)') called at /usr/pgaudit_analyze/bin/pgaudit_analyze line 166
main::databaseGet('PLW_DEV_03_26072022') called at /usr/pgaudit_analyze/bin/pgaudit_analyze line 706
eval {...} called at /usr/pgaudit_analyze/bin/pgaudit_analyze line 662

Case 2 -
If there are multiple CSV files in the log dir, pganalyze stops writing to the pgaudit schema.
log_filename = postgresql-%a.log [%a = Mon, Tue, Wed, and so on]
log_rotation_age = 1d

So at midnight when the logfile rotates, we have 2 or more csv files present in the log dir,
and the pganalyze daemon stops writing audit records to the pgaudit schema.

Solution for case 1: remove all entries for the dropped DB from the csv file; then it works fine.
Solution for case 2: I have to rename/move the old csv files and then restart the daemon process to make it work.

Does pganalyze behave this way by design, or do I need to set some parameters?
Also, why is it monitoring all 4 databases? pgaudit should only audit the DB where audit.sql was executed.

Thanks
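One hedged suggestion for case 2: the README above requires CSV file names that sort alphabetically with respect to creation time, which the weekday pattern postgresql-%a.log does not satisfy; a date-based pattern does:

log_filename = 'postgresql-%Y-%m-%d_%H%M%S.log'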

add use warnings

use warnings should be added in CSV.pm.

A regex match in Perl has limits: before 5.30 the limit is about 32k, and from 5.30 on it is about 64k.

If a single csv line is larger than 32k and the OS is CentOS 7 (Perl 5.16), the loop in getline will never exit.
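A minimal sketch of the suggested pragmas at the top of CSV.pm (standard Perl; the package name shown is illustrative, not confirmed from the source):

package PgAudit::CSV;  # illustrative package name

use strict;
use warnings;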

Can't locate DBI.pm in @INC

I'm having trouble launching pgaudit_analyze:
pgdata]$ /pgdata/pgaudit_analyze/bin/pgaudit_analyze --help

Can't locate DBI.pm in @INC (@INC contains: /pgdata/pgaudit_analyze/ /usr/local/lib64/perl5 /usr/local/share/perl5 /usr/lib64/perl5/vendor_perl /usr/share/perl5/vendor_perl /usr/lib64/perl5 /usr/share/perl5 .) at /pgdata/pgaudit_analyze/bin/pgaudit_analyze line 13.

BEGIN failed--compilation aborted at /pgdata/pgaudit_analyze/bin/pgaudit_analyze line 13.
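pgaudit_analyze needs the DBI and DBD::Pg Perl modules at runtime; on RHEL/CentOS-style systems the distribution packages are typically named as below (a hedged example, since package names vary by distro):

sudo yum install perl-DBI perl-DBD-Pg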

time zone bug?

[postgres@vm10-0-0-2 bin]$ date
Fri Jul 26 10:12:01 CST 2019
[postgres@vm10-0-0-2 bin]$ date -R
Fri, 26 Jul 2019 10:12:04 +0800

2019-07-26 10:04:05.148 CST,"postgres","test",14723,"[local]",5d3a5f91.3983,3,"idle",2019-07-26 10:04:01 CST,3/3887,0,LOG,00000,"statement: select 1;",,,,,,,,,"psql"
2019-07-26 10:04:05.149 CST,"postgres","test",14723,"[local]",5d3a5f91.3983,4,"SELECT",2019-07-26 10:04:01 CST,3/0,0,LOG,00000,"duration: 1.026 ms",,,,,,,,,"psql"
[postgres@vm10-0-0-2 bin]$ vim ../data/log/postgresql-2019-07-26_000000.csv 
[postgres@vm10-0-0-2 bin]$ ./psql test 
psql (11.4)
Type "help" for help.

test=# select '2019-07-26 10:04:05'::timestamptz - '2019-07-27 00:00:56'::timestamptz;
 ?column?  
-----------
 -13:56:51
(1 row)

test=# show timezone;
 TimeZone 
----------
 PRC
(1 row)

test=# 

After pgaudit_analyze loads the row:

-[ RECORD 2 ]----------+------------------------------------------------------------------
session_id             | 5d3a5f91.3983
session_line_num       | 3
log_time               | 2019-07-27 00:04:05.148+08
command                | idle
error_severity         | log
sql_state_code         | 00000
virtual_transaction_id | 3/3887
transaction_id         | 0
message                | statement: select 1;
detail                 | 
hint                   | 
query                  | 
query_pos              | 
internal_query         | 
internal_query_pos     | 
context                | 
location               | 

why?
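A hedged reading: the stored log_time (2019-07-27 00:04:05.148+08) is exactly 14 hours ahead of the CSV value (2019-07-26 10:04:05.148 CST), which matches the gap between China CST (+08:00) and US Central CST (-06:00), suggesting the ambiguous abbreviation "CST" was resolved to US Central during parsing. The offset arithmetic:

select '2019-07-26 10:04:05 -06'::timestamptz - '2019-07-26 10:04:05 +08'::timestamptz;
-- 14:00:00

Check: 10:04:05 at -06:00 is 16:04:05 UTC, which displays as 2019-07-27 00:04:05 at +08:00, matching the stored value.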

pgaudit_analyze process filling up audit log file

I have several databases in the PostgreSQL instance, and I created the extension and set up the pgaudit analyzer for these databases, including the postgres database.

After starting pgAudit Analyze as a daemon process, ./pgaudit_analyze --daemon /var/lib/postgresql/9.6/main/pg_log/,
the log file and the csv file fill up continuously and very fast (1 MB per second).
The files are filling up with the following continuously:

Regards,
Panha

Is it possible to pass a <host> parameter on the command line?

Hi,
We want to use pgaudit_analyze on standby database audit records, but in the current situation it is not possible to pass a host parameter to pgaudit_analyze to connect to a remote database. So currently, pgaudit_analyze can only connect to the instance where the log files reside.

Is it possible to add a host parameter so pgaudit_analyze can connect to a central PostgreSQL audit vault database?

Hint: I have done this by changing the pgaudit_analyze Perl file linked here, and I have a write-up (in Turkish) on Medium. But of course it would be more convenient for this to be done by the owner; a sketch follows below.

Best Regards.
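A hedged sketch of the kind of change described: DBD::Pg already accepts a host in its DSN, so the connect call inside pgaudit_analyze could carry one (the variable names here are hypothetical, not the script's actual ones):

use DBI;

# $strDbName and $strHost are hypothetical placeholders for illustration
my $hDb = DBI->connect("dbi:Pg:dbname=$strDbName;host=$strHost;port=5432;",
                       'postgres', undef, {RaiseError => 1});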

Infinite log growth

I parse the log of the current instance and import it back into the same instance. Is this an incorrect operation? The log grows indefinitely. Is the parsing tool meant to import into a separate set of instances? Or is there a setting so that logs generated by the user specified with --user are not parsed?

Running multiple processes to audit logfiles

Hi,

I'm looking for an answer to the question I've included below.

Can I run multiple processes so that pgaudit_analyze can read various log files from the same database? Somehow the process was stopped and we have weeks of log files to audit.

Regards,
Ravi

pgaudit_analyze daemon failing to connect to "audit" database.

I'm having trouble setting up pgaudit_analyze.
Thus far, I have taken the following steps:

  1. Installed pgaudit.
  2. Installed pgaudit_analyze under the postgres home directory.
  3. psql -c "CREATE DATABASE testdb"
  4. psql -f $HOME/pgaudit_analyze/sql/audit.sql testdb
  5. $HOME/pgaudit_analyze/bin/pgaudit_analyze --daemon --log-file /var/log/dbaas/pgaudit_analyze.log /var/log/postgres
  6. /var/log/dbaas/pgaudit_analyze.log is repeating the following error:

2021-06-18 19:10:18: DBI connect('dbname=audit;port=5432;','postgres',...) failed: FATAL: database "audit" does not exist at /var/lib/pgsql/pgaudit_analyze/bin/pgaudit_analyze line 166.
at /var/lib/pgsql/pgaudit_analyze/bin/pgaudit_analyze line 44, <$hFile> line 3.
main::__ANON__("DBI connect('dbname=audit;port=5432;','postgres',...) failed:"...) called at /usr/share/perl5/vendor_perl/Carp.pm line 167
Carp::croak("DBI connect('dbname=audit;port=5432;','postgres',...) failed:"...) called at /usr/lib64/perl5/vendor_perl/DBI.pm line 690
DBI::__ANON__(undef, undef) called at /usr/lib64/perl5/vendor_perl/DBI.pm line 746
DBI::connect("DBI", "dbi:Pg:dbname=audit;port=5432;", "postgres", undef, HASH(0x558d6ad00708)) called at /var/lib/pgsql/pgaudit_analyze/bin/pgaudit_analyze line 166
main::databaseGet("audit") called at /var/lib/pgsql/pgaudit_analyze/bin/pgaudit_analyze line 704
eval {...} called at /var/lib/pgsql/pgaudit_analyze/bin/pgaudit_analyze line 660

Database "audit" does not exist. The SQL setup script was run against the "testdb" database. I have verified that the associated tables and view were created in "testdb". I cannot find "audit" configured anywhere. Is this a default?

Huge size logfiles

Hi,

As soon as I start the pgaudit_analyze daemon, the log file gets filled with insert into pgaudit.log_event statements. How do I stop these messages? I know I do not have to audit the auditor user, which is pgaudit.

pg_p20717_5: insert into pgaudit.log_event (session_id, log_time, session_line_num, command, error_severity, sql_state_code,
virtual_transaction_id, transaction_id, message, detail, hint, query, query_pos,
internal_query, internal_query_pos, context, location)
values ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, $13, $14, $15, $16, $17)', $10 = 'parameters:

I have enabled auditing on the db:
ALTER DATABASE pwa SET pgaudit.log = 'All';  (owner is role1)

The pgaudit schema exists in DB auditdb (owner is audit_user).

This is how pg_roles looks:
role1 |{pgaudit.log=ALL}
ravi | [null]
audit_user | [null]
pgaudit_owner | [null]
pgaudit_etl | [null]
pgaudit | {pgaudit.log=none,pgaudit.role=}

Please share what parameters need to be set.
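A hedged sketch, building on the Caveats above: a role-level override outranks the database-level pgaudit.log = 'All', so the role the daemon connects as can be excluded in the audited database (role and database names are taken from this report; verify which role the daemon actually uses):

alter role pgaudit in database pwa set pgaudit.log = 'none';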
