domainaware / parsedmarc

A Python package and CLI for parsing aggregate and forensic DMARC reports

Home Page: https://domainaware.github.io/parsedmarc/

License: Apache License 2.0

Shell 0.18% Python 99.74% Dockerfile 0.08%

parsedmarc's Introduction

parsedmarc


A screenshot of DMARC summary charts in Kibana

parsedmarc is a Python module and CLI utility for parsing DMARC reports. When used with Elasticsearch and Kibana (or Splunk), it works as a self-hosted open-source alternative to commercial DMARC report processing services such as Agari Brand Protection, Dmarcian, OnDMARC, ProofPoint Email Fraud Defense, and Valimail.

Note

Domain-based Message Authentication, Reporting, and Conformance (DMARC) is an email authentication protocol.

Help Wanted

This project is maintained by one developer. Please consider reviewing the open issues to see how you can contribute code, documentation, or user support. Assistance on the pinned issues would be particularly helpful.

Thanks to all contributors!

Features

  • Parses draft and 1.0 standard aggregate/rua reports
  • Parses forensic/failure/ruf reports
  • Can parse reports from an inbox over IMAP, Microsoft Graph, or Gmail API
  • Transparently handles gzip or zip compressed reports
  • Consistent data structures
  • Simple JSON and/or CSV output (see the usage sketch after this list)
  • Optionally email the results
  • Optionally send the results to Elasticsearch and/or Splunk, for use with premade dashboards
  • Optionally send reports to Apache Kafka
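As a quick illustration of the JSON output, here is a minimal sketch that drives the CLI from Python and reads the results back. It assumes the CLI prints a JSON object with aggregate_reports and forensic_reports keys to stdout (the shape shown in the issue output further below); sample.xml is a placeholder path.

import json
import subprocess

# Run the parsedmarc CLI on a local report file and capture its JSON output.
proc = subprocess.run(
    ["parsedmarc", "sample.xml"],
    capture_output=True,
    text=True,
    check=True,
)

results = json.loads(proc.stdout)
print(len(results["aggregate_reports"]), "aggregate report(s)")
print(len(results["forensic_reports"]), "forensic report(s)")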


parsedmarc's Issues

Not a valid aggregate or forensic report

Can't parse some reports:
Mail:
Report Domain: xxxxxx.de Submitter: ikea.com Report-ID: 2018_10_05_5bb7e9b4f3e8a
This is an aggregate report from ikea.com.

Attachment:
ikea.com!xxxxxx.de!1538690400!1538776800.xml.gz
ERROR:parsedmarc:Failed to parse ikea.com!xxxxxx.de!1538690400!1538776800.xml - Not a valid aggregate or forensic report
{
"aggregate_reports": [],
"forensic_reports": []
}

Example:
https://pastebin.com/Pzgq4eGh
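When a .xml.gz attachment is rejected like this, it can help to decompress it and attempt a plain XML parse by hand to see whether the file is truncated or not XML at all. A minimal sketch using only the standard library; the filename is the one from the report above.

import gzip
import xml.etree.ElementTree as ET

path = "ikea.com!xxxxxx.de!1538690400!1538776800.xml.gz"

# Decompress the attachment and try a raw XML parse to surface the real error.
with gzip.open(path, "rb") as report_file:
    raw_xml = report_file.read()

try:
    root = ET.fromstring(raw_xml)
    print("Root element:", root.tag)  # an aggregate/rua report should have a <feedback> root
except ET.ParseError as error:
    print("Not well-formed XML:", error)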

Error starting nginx service

I keep getting this error when starting the nginx service:
nginx: [emerg] unexpected end of file, expecting ";" or "}" in /etc/nginx/sites-enabled/kibana:33

I have checked the config and I do not see a missing ; or }.

Your insight will be greatly appreciated.

Unexpected error: [Errno 13] Permission denied: '.public_suffix_list.dat'

I am on Ubuntu 18.04, parsedmarc version 6.0.1.
The IMAP part of the parsedmarc service retrieves reports from the inbox fine, but it cannot pass them to Elasticsearch.
I get an unexpected error because parsedmarc apparently cannot access the .public_suffix_list.dat file.
I have reproduced this on two Ubuntu systems. I also get this error when I try to process the test aggregate report. Example of the error:

Feb 09 19:24:04 kvw-dmarc parsedmarc[12718]: WARNING:__init__.py:1037:Message with subject "Aggregate test" is not a valid aggregate DMARC report: Unexpected error: [Errno 13] Permission denied: '.public_suffix_list.dat'

I can process the same report manually. When I do that I get:

root@kvw-dmarc:~# parsedmarc -c parsedmarctest.ini file_path acme.org!example.com!1335571200!1335657599.zip
DEBUG:elastic.py:281:Saving aggregate report to Elasticsearch
DEBUG:elastic.py:208:Creating Elasticsearch index: dmarc_aggregate-2012-04-28

The report is processed successfully.
When I do an ls -la I see 2 files added in the directory from which I ran parsedmarc:
.GeoLite2-Country.mmdb
.public_suffix_list.dat

I am wondering if this has something to do with the change to /etc/parsedmarc.ini and the parsedmarc user/group. I understand and applaud this change from a security perspective, but I suspect this is a permission issue, which is why I raise it.
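Since the hidden cache files are written to the working directory of the process, a quick way to confirm whether this is a permissions problem is to check, as the service user, whether that directory and any existing cache files are writable. A minimal sketch; /var/lib/parsedmarc is only an example of wherever the service runs from.

import os

workdir = "/var/lib/parsedmarc"  # example working directory of the service
cache_files = [".public_suffix_list.dat", ".GeoLite2-Country.mmdb"]

print("directory writable:", os.access(workdir, os.W_OK))
for name in cache_files:
    path = os.path.join(workdir, name)
    if os.path.exists(path):
        # A cache file owned by another user (e.g. root) would explain Errno 13.
        print(name, "writable:", os.access(path, os.W_OK))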

Your help in solving this mystery is much appreciated.

Greetings from the Netherlands!

Forensic Report Attachments

Can you add an option to parsedmarc to handle forensic reports without storing the attachments (e.g. PDF and Office files)? The Elasticsearch index grows very fast: about 300 MB in one day. Nobody needs the attachments for reporting :-)
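For what it's worth, the tracebacks elsewhere in these issues show a strip_attachment_payloads parameter being passed through the parser. Assuming the module-level parse_report_file helper exists and accepts the same keyword (an assumption on my part, not confirmed here), dropping the sample payloads before indexing might look like this:

# Hedged sketch: parse_report_file and its strip_attachment_payloads keyword are
# assumed from the parameter names visible in the tracebacks in these issues.
from parsedmarc import parse_report_file

result = parse_report_file(
    "forensic_sample.eml",           # hypothetical forensic report email on disk
    strip_attachment_payloads=True,  # drop attachment bodies before indexing
)
print(sorted(result.keys()))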

dmarc_aggregate not creating

Dear Sir,
I updated to version 6.0.2, and since then the Elasticsearch dmarc_aggregate index is not being created. Messages are moved successfully to the Aggregate folder on the mail server, but without the dmarc_aggregate index being created. This started after the update to 6.0.2. I then updated to the latest 6.0.3 and moved one mail from the Aggregate folder back to the Inbox for processing; the debug output now flags it as invalid and moves the message to the Invalid folder on the mail server.
Kindly let me know what debug information I should send for this.
Regards,
R.Arun Vasan

Publicsuffix

Okt 22 14:44:33 Hostname parsedmarc[10654]: WARNING:parsedmarc:Failed to download an updated PSL HTTPSConnectionPool(host='publicsuffix.org', port=443): Max retries exceeded with url: /list/public_suffix_list.dat (Caused by NewConnection

parsedmarc tries to connect to publicsuffix.org without using a proxy.
Can you add a proxy feature? Otherwise I have to add a direct connection rule.
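Until a proxy option exists, one workaround is to fetch the list yourself through the proxy and drop it next to where parsedmarc runs as .public_suffix_list.dat (the cached file name mentioned in the permission issue above). A hedged sketch using only the standard library; proxy.example.com is a placeholder, and whether parsedmarc reuses a pre-seeded copy before its cache expires is an assumption.

import urllib.request

# Placeholder proxy address; replace with your real HTTP(S) proxy.
proxy = urllib.request.ProxyHandler({"https": "http://proxy.example.com:3128"})
opener = urllib.request.build_opener(proxy)

url = "https://publicsuffix.org/list/public_suffix_list.dat"
with opener.open(url, timeout=30) as response:
    data = response.read()

# Pre-seed the cached copy in the directory parsedmarc runs from (assumption:
# parsedmarc reuses an existing fresh copy instead of re-downloading it).
with open(".public_suffix_list.dat", "wb") as cache_file:
    cache_file.write(data)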


Option to move invalid reports

Awesome project! One issue I've run into: invalid DMARC reports are left where they are, so on subsequent runs parsedmarc will attempt to parse them again.

I think an additional folder under Archive (say "Invalid") would be a good option for people who don't want to delete such reports, but rather move them somewhere.

Connection Problem

Hello,

I have had a problem since an update of parsedmarc.
I see you have now added a configuration file for the connection, but now I get an error when I try to connect.
Before the update it worked.

0it [00:00, ?it/s]
DEBUG:__init__.py:928:Connecting to IMAP over plain text
ERROR:cli.py:463:IMAP Error: socket error: EOF

Config File:

# This is an example comment

[general]
save_aggregate = True
save_forensic = True
nameservers = 10.2.1.105

[imap]
host = servername
user = [email protected]
password = password
watch = True
ssl = false
delete = true

What's wrong here?
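One way to narrow this down is to test the plain-text IMAP connection outside of parsedmarc with imapclient, the library the tracebacks in other issues show parsedmarc using. The host and credentials below are the placeholders from the config above, not real values.

from imapclient import IMAPClient

# ssl=False matches the "ssl = false" setting in the [imap] section above.
client = IMAPClient("servername", ssl=False)
client.login("[email protected]", "password")
print(client.capabilities())  # if this also fails with EOF, the server side is the problem
client.logout()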

parsedmarc exits with unexpected internal exception

Today I noticed in the syslog that the parsedmarc service is exiting repeatedly. The service is restarted after exiting, an IMAP check runs (but the inbox of the user that collects the DMARC reports is empty), and after about one minute it exits again.

Dec 5 14:07:09 smtp-test systemd[1]: parsedmarc.service: Service hold-off time over, scheduling restart.
Dec 5 14:07:09 smtp-test systemd[1]: parsedmarc.service: Scheduled restart job, restart counter is at 20.
Dec 5 14:07:09 smtp-test systemd[1]: Stopped parsedmarc mailbox watcher.
Dec 5 14:07:09 smtp-test systemd[1]: Started parsedmarc mailbox watcher.
Dec 5 14:07:13 smtp-test dovecot: imap-login: Login: user=, method=PLAIN, rip=192.168.206.71, lip=192.168.206.71, mpid=17323, secured, session=<hEt+D0Z89L/AqM5H>
Dec 5 14:08:08 smtp-test parsedmarc[17299]: RPython traceback:
Dec 5 14:08:08 smtp-test parsedmarc[17299]: File "pypy_interpreter.c", line 38745, in BuiltinCode2_fastcall_2
Dec 5 14:08:08 smtp-test parsedmarc[17299]: File "pypy_module__io_1.c", line 15945, in W_BufferedReader_readline_w
Dec 5 14:08:08 smtp-test parsedmarc[17299]: File "implement_2.c", line 47370, in dispatcher_69
Dec 5 14:08:08 smtp-test parsedmarc[17299]: File "pypy_module__io.c", line 16566, in W_BufferedReader__raw_read_1
Dec 5 14:08:08 smtp-test parsedmarc[17299]: Traceback (most recent call last):
Dec 5 14:08:08 smtp-test parsedmarc[17299]: File "/usr/local/bin/parsedmarc", line 11, in <module>
Dec 5 14:08:08 smtp-test parsedmarc[17299]: sys.exit(_main())
Dec 5 14:08:08 smtp-test parsedmarc[17299]: File "/opt/venvs/parsedmarc/site-packages/parsedmarc/cli.py", line 373, in _main
Dec 5 14:08:08 smtp-test parsedmarc[17299]: dns_timeout=args.timeout, strip_attachment_payloads=sa)
Dec 5 14:08:08 smtp-test parsedmarc[17299]: File "/opt/venvs/parsedmarc/site-packages/parsedmarc/__init__.py", line 1527, in watch_inbox
Dec 5 14:08:08 smtp-test parsedmarc[17299]: responses = server.idle_check(timeout=wait)
Dec 5 14:08:08 smtp-test parsedmarc[17299]: File "/opt/venvs/parsedmarc/site-packages/imapclient/imapclient.py", line 156, in wrapper
Dec 5 14:08:08 smtp-test parsedmarc[17299]: return func(client, *args, **kwargs)
Dec 5 14:08:08 smtp-test parsedmarc[17299]: File "/opt/venvs/parsedmarc/site-packages/imapclient/imapclient.py", line 795, in idle_check
Dec 5 14:08:08 smtp-test parsedmarc[17299]: line = self._imap._get_line()
Dec 5 14:08:08 smtp-test parsedmarc[17299]: File "/opt/pypy3.5-6.0.0-linux_x86_64-portable/lib-python/3/imaplib.py", line 1133, in _get_line
Dec 5 14:08:08 smtp-test parsedmarc[17299]: line = self.readline()
Dec 5 14:08:08 smtp-test parsedmarc[17299]: File "/opt/pypy3.5-6.0.0-linux_x86_64-portable/lib-python/3/imaplib.py", line 297, in readline
Dec 5 14:08:08 smtp-test parsedmarc[17299]: line = self.file.readline(_MAXLINE + 1)
Dec 5 14:08:08 smtp-test parsedmarc[17299]: SystemError: unexpected internal exception (please report a bug): <BlockingIOError object at 0x7fe88a727d40>; internal traceback was dumped to stderr
Dec 5 14:08:08 smtp-test dovecot: imap(dmparser): Connection closed (IDLE running for 0.001 + waiting input for 25.045 secs, 0.001 in locks, 2 B in + 10 B out, state=wait-input) in=295 out=1675
Dec 5 14:08:08 smtp-test systemd[1]: parsedmarc.service: Main process exited, code=exited, status=1/FAILURE
Dec 5 14:08:08 smtp-test systemd[1]: parsedmarc.service: Failed with result 'exit-code'.
Dec 5 14:08:08 smtp-test dovecot: imap(dmparser): Connection closed (UID SEARCH finished 55.139 secs ago) in=240 out=1233

The server is Ubuntu 18.04.1, parsedmarc 5.1.0, pypy3.5-6.0.0-linux_x86_64-portable and the service command line is:
/usr/local/bin/parsedmarc --watch --silent -H -u dmparser -p --imap-no-ssl --elasticsearch-index-suffix unitntest --save-aggregate --save-forensic -n

Imap Trash

Another feature request: can you add a purge-messages option?
I can only delete messages after parsing the emails, but the deleted mails end up in the recycle bin and the mailbox keeps growing. Keep up the good work on this project :-)
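Until something like that exists, the recycle bin can be emptied outside of parsedmarc with imapclient (the library parsedmarc already uses, per the tracebacks elsewhere in these issues). The folder name "Trash" and the connection details are placeholders.

from imapclient import IMAPClient

client = IMAPClient("servername")  # placeholder host; SSL is the default
client.login("[email protected]", "password")

client.select_folder("Trash")      # adjust to your server's trash folder name
uids = client.search("ALL")
if uids:
    client.delete_messages(uids)   # flag the messages as \Deleted ...
    client.expunge()               # ... and actually purge them
client.logout()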

Remove credentials from CLI

This is a feature request, but it also has a security impact.

Currently the IMAP credentials are passed on the command line, which makes them visible in a lot of places, such as log files and status outputs.
Is there a way to store them in a separate file and point to it from the command line?
That would reduce the exposure.

DMARC Forensic Samples

Dashboard/DMARC Forensic Samples

Could not locate that index-pattern-field (id: sample.reply_to.address.keyword)
Could not locate that index-pattern-field (id: source_country.keyword)
Could not locate that index-pattern-field (id: source_reverse_dns.keyword)
Could not locate that index-pattern-field (id: source_country.keyword)

If you need a sample forensic report, you can get one from me.

Error starting nginx: no such file dhparam.pem

I am following the documentation for installing this and ran into an issue. I'm doing this on a new VM running a fresh minimal install of Debian 9.5. Your documentation does not include any instructions for creating the dhparam.pem file, and not having the file will cause nginx not to start.

I recommend adding instructions for creating the file just after the step that creates the self-signed certificate. I used the following command to create it:

sudo openssl dhparam -out dhparam.pem 4096

Make sure that the dhparam.pem file is inside the ssl folder.

got an unexpected keyword argument 'ssl'

Attempting to configure SMTP without SSL.

This is the error received:

Traceback (most recent call last):
  File "/usr/local/bin/parsedmarc", line 10, in <module>
    sys.exit(_main())
  File "/usr/local/lib/python3.5/dist-packages/parsedmarc/cli.py", line 435, in _main
    subject=opts.smtp_subject)
TypeError: email_results() got an unexpected keyword argument 'ssl'

Config file

[smtp]
host = ***
port = 25
ssl = False
from = ***
to = ***
subject = ParseDMARC Report

Error: SMTP AUTH extension not supported by server

Hi all,

I've successfully configured parsedmarc to read reports from a mailbox via IMAP and index them in our ELK cluster without issue, and I think it's a great piece of work! Now I'm trying to configure it to send an output email, but I'm stuck configuring it with a Gmail account.

The version I installed using pip3 is:

sergiofernandez@hyperion:~$ pip3 list | grep parsedmarc
parsedmarc (3.9.7)

My command is (data intentionally hidden):

/usr/local/bin/parsedmarc -H "imap.gmail.com" -u "[email protected]" -p "XXXXXXXXXXXXXX" -E https://dmarc_user:[email protected]:9243 -O "smtp.gmail.com" -U "[email protected]" -P "XXXXXXXXXXXXXXXX" -F "[email protected]" -T "[email protected]" -S "DMARC reports for example.com"

The report is generated but then an SMTP error appears:

ERROR:parsedmarc:SMTP Error: SMTP AUTH extension not supported by server

I've tried the three ways of sending mail through Gmail described here, with no success:

https://support.google.com/a/answer/176600?hl=en

How can I solve this issue?
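The error means the SMTP server never advertised the AUTH extension on the connection parsedmarc opened; Gmail only advertises AUTH after STARTTLS on port 587 (or on the implicit-TLS port 465). A short diagnostic sketch, separate from parsedmarc's own code, to see what the server offers:

import smtplib

# Gmail advertises AUTH only once the connection is encrypted.
server = smtplib.SMTP("smtp.gmail.com", 587, timeout=30)
server.ehlo()
print("AUTH before STARTTLS:", server.has_extn("auth"))  # expected: False
server.starttls()
server.ehlo()
print("AUTH after STARTTLS:", server.has_extn("auth"))   # expected: True
server.quit()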

Documentation suggestion

This note could use a small update.

Warning
The default JVM heap size for Elasticsearch is very small (1g), which will cause it to crash under a heavy load. To fix this, increase the minimum and maximum JVM heap sizes in /etc/elasticsearch/jvm.options to more reasonable levels, depending on your server's resources.
Always set the minimum and maximum JVM heap sizes to the same value. For example, to set a 4 GB heap size, set
-Xms4g
-Xmx4g
See https://www.elastic.co/guide/en/elasticsearch/reference/current/heap-size.html for more information.

I would add here:

Make sure the host machine has (at least) 2 GB more RAM than the assigned JVM heap size.

Failed at "raw_msg = raw_msg[msg_key]"

Hello,

First of all, excellent tool!

However, I am unable to successfully run against a mailbox. I have installed per the documentation and have everything working up to the actual processing of the available messages in the specified folder. Execution fails with the following:

> parsedmarc -c /etc/parsedmarc.ini --debug
   DEBUG:__init__.py:906:Connecting to IMAP over plain text
   DEBUG:__init__.py:829:IMAP server supports: ['IMAP4REV1', 'AUTH=LOGIN', 'MOVE', 'SPECIAL-USE']
   DEBUG:__init__.py:977:Found 3 messages in IMAP folder DMARC
   DEBUG:__init__.py:983:Processing message 1 of 3: UID 2
Traceback (most recent call last):
  File "/usr/local/bin/parsedmarc", line 10, in <module>
    sys.exit(_main())
  File "/usr/local/lib/python3.5/dist-packages/parsedmarc/cli.py", line 404, in _main
    strip_attachment_payloads=sa
  File "/usr/local/lib/python3.5/dist-packages/parsedmarc/__init__.py", line 995, in get_dmarc_reports_from_inbox
    raw_msg = raw_msg[msg_key]
KeyError: ''

I am running through Davmail to connect to Exchange and have installed msgconvert.

Running against a single report stored on the local file system does parse correctly.

I have also pulled the latest dev release via Git with the same result.

Fresh install of Ubuntu 16.04 LTS

Thank you for any assistance.

Problem Parsing since last version

WARNING:parsedmarc:Message with subject "Report Domain: remondis.de Submitter: xxxxxxxx.de Report-ID:
<1539772340.15746>" is not a valid DMARC report
Traceback (most recent call last):
File "/usr/local/lib/python3.6/dist-packages/parsedmarc/init.py", line 561, in parse_forensic_report
strip_attachment_payloads=strip_attachment_payloads)
File "/usr/local/lib/python3.6/dist-packages/parsedmarc/utils.py", line 461, in parse_email
payload = base64.b64decode(payload)
File "/usr/lib/python3.6/base64.py", line 87, in b64decode
return binascii.a2b_base64(s)
binascii.Error: Incorrect padding

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/usr/local/lib/python3.6/dist-packages/parsedmarc/init.py", line 701, in parse_report_email
strip_attachment_payloads=strip_attachment_payloads)
File "/usr/local/lib/python3.6/dist-packages/parsedmarc/init.py", line 586, in parse_forensic_report
"Unexpected error: {0}".format(error.str()))
parsedmarc.InvalidForensicReport: Unexpected error: Incorrect padding

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/usr/local/bin/parsedmarc", line 11, in
load_entry_point('parsedmarc==4.3.4', 'console_scripts', 'parsedmarc')()
File "/usr/local/lib/python3.6/dist-packages/parsedmarc/cli.py", line 305, in _main
strip_attachment_payloads=sa
File "/usr/local/lib/python3.6/dist-packages/parsedmarc/init.py", line 958, in get_dmarc_reports_from_inbox
strip_attachment_payloads=sa)
File "/usr/local/lib/python3.6/dist-packages/parsedmarc/init.py", line 703, in parse_report_email
raise ParserError(e.str())
parsedmarc.ParserError: Unexpected error: Incorrect padding
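The underlying failure is base64.b64decode() refusing a payload whose length is not a multiple of four. A small stand-alone illustration, with a hedged workaround of re-padding before decoding (whether the report sender or the parser should be fixed is a separate question):

import base64

payload = "SGVsbG8sIERNQVJDIQ"  # valid base64 that is missing its "==" padding

try:
    base64.b64decode(payload, validate=True)
except Exception as error:
    print("decode failed:", error)  # binascii.Error: Incorrect padding

# Workaround: pad the string out to a multiple of four before decoding.
padded = payload + "=" * (-len(payload) % 4)
print(base64.b64decode(padded))  # b'Hello, DMARC!'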

indefinite hang on forensic email

I noticed you pushed an update to handle bad forensic emails a while ago. I have a sender whose emails always cause parsedmarc to hang until I move the email out of its way. It has 3 attachments:
a seemingly empty .msg file and two .file attachments like these
====================================
ATT00001.file
Feedback-Type: auth-failure
User-Agent: szn-mime/2.0.41
Version: 1
Original-Rcpt-To: [email protected]
Source-Ip: 24.42.59.243
Authentication-Results: email.seznam.cz 1;
spf_align=fail;
dkim_align=fail
Delivery-Result: reject
===============================
ATT00003.file
Arrival-Date: Sun, 06 Jan 2019 22:36:18 +0100
Reporting-MTA: dns; email.seznam.cz

Final-Recipient: rfc822; =========@=======.com
Status: 2.0.0
Diagnostic-Code: x-uknown;
Action: x-unknown
Original-Recipient: rfc822; [email protected]

Reports not pulled during Idle

Configuration: My DMARC reports are going to a folder/label in a G Suite email account.
parsedmarc version: 4.4.1

When parsedmarc starts, it grabs all the reports properly.

However, any subsequent reports will sit unread while parsedmarc keeps refreshing the IDLE session. During this period, new (unrelated) emails have arrived in the Inbox.

invalid forensic report - inline?

Are inline forensic reports like the following supported?

I got the following from Baruwa as an inline message ... is this supposed to be valid? Is this a parsedmarc issue, or is Baruwa doing something wrong?

Here is the message:

A message claiming to be from you has failed the published DMARC
policy for your domain.

  Sender Domain: domain.tld
  Sender IP Address: 191.252.30.42
  Received Date: Wed, 20 Feb 2019 08:52:51 +0100
  SPF Alignment: no
  DKIM Alignment: no
  DMARC Results: Reject

------ This is a copy of the headers that were received before the error
       was detected.

Received-SPF: pass (mailwatch.domain.tld: domain of fantozziassociates.com.br designates 191.252.30.42 as permitted sender) client-ip=191.252.30.42; [email protected]; helo=mcegress-30-lw-42.correio.biz;
Received: from mcegress-30-lw-42.correio.biz ([191.252.30.42])
        by mailwatch.domain.tld with esmtp (Baruwa 2.0)
        (envelope-from <[email protected]>)
        id 1gwMgU-000BZz-Ii ret-id none;
        for [email protected]; Wed, 20 Feb 2019 08:52:51 +0100
X-Sender-Id: x-authuser|[email protected]
Received: from mcbain0002.correio.biz (mcingress0005.correio.biz [10.30.225.40])
        by mcrelay.correio.biz (Postfix) with ESMTP id B6599E7FB4
        for <[email protected]>; Wed, 20 Feb 2019 04:38:27 -0300 (-03)
X-Sender-Id: x-authuser|[email protected]
Received: from mcbain0002.correio.biz (mcbain0002.email.locaweb.com.br
 [10.30.224.225])
        by 0.0.0.0:2500 (trex/5.9.14);
        Wed, 20 Feb 2019 04:38:27 -0300
X-LW-Relay: Bad
X-LW-SenderId: x-authuser|[email protected]
Received: from mcbain0002.correio.biz (localhost [127.0.0.1])
        by mcbain0002.correio.biz (Postfix) with ESMTP id B868780E109
        for <[email protected]>; Wed, 20 Feb 2019 04:38:21 -0300 (-03)
Received: from proxy.email-ssl.com.br (bartf0034.email.locaweb.com.br [10.31.120.66])
        by mcbain0002.correio.biz (Postfix) with ESMTP id 94B8680A079
        for <[email protected]>; Wed, 20 Feb 2019 04:38:21 -0300 (-03)
x-locaweb-id: dhgMVzE2N6Che-U8r-uy0-8GOczG90QXqWf1mPzhNnDb041_bM9xYl5CAv6pKmafkEV_FrqY_ktG7QelXU2nuAMZ-FM0gZyq1NL_r6PEX6tVUKIbsEn68tDhhitNMPjObNP-7nv3utX6nG-515B_SgMtXHH3zlkoLYwRtAKgSAfpExUPhcydzhHlDIGrndZP4N0HtsTlDy_yd6vJ7hXHgzhqip2MJfNPEKDMaq66Ogg= NzQ2NTYzNjg2ZTZmNmM2ZjY3Nzk0MDY2NjE2ZTc0NmY3YTdhNjk2MTczNzM2ZjYzNjk2MTc0NjU3MzJlNjM2ZjZkMmU2Mjcy
X-LocaWeb-COR: locaweb_2009_x-mail
X-AuthUser: [email protected]
Received: from [177-129-200-154.nnt.net.br] (unknown [177.129.200.154])
        (Authenticated sender: [email protected])
        by proxy.email-ssl.com.br (Postfix) with ESMTPSA id 7AEE27A0352
        for <[email protected]>; Wed, 20 Feb 2019 04:38:24 -0300 (-03)
List-Subscribe:
 <http://mailer.fantozziassociates.com.br/misc/pages/subscribe/q5ricwgpnd38z1uhjha6616rhonlor2a2djt9zc6hy3a9r8bsf5onqcz7n0f>,
  <mailto:[email protected]?subject=Subscribe+87334_3110_6_776050_8348>
To: [email protected]
Date: Wed, 20 Feb 2019 08:38:25 +0100
Errors-To: [email protected]
X-aid: 8691818038
X-Priority: 2
Message-ID:
 <53kz0kecyil122wsjpzgbqwqp@zqzb6cnx632waplm7idrcqlr0spvpwvvanoz677w9wgdd6wfrq2uls>
Content-Transfer-Encoding: base64
Content-Type: text/plain; charset=UTF-8
Abuse-Reports-To: [email protected]
Subject: hhp
From: <[email protected]>
X-Sender: [email protected]
X-Outbound-RspamD: yes
X-MC: yes

If you think this is an issue with parsedmarc that should be supported, please let me know and I will supply any missing information.

If this is an issue with Baruwa, I will file an issue on their support channel.

ALREADYEXISTS after 1st run

Hi

I'm experiencing this issue after the first run:

ERROR:parsedmarc:IMAP Error: create failed: [ALREADYEXISTS] Mailbox already exists

(Using this command: /bin/parsedmarc --watch --silent --delete --output /home/dmarc/www/ -a OLD -H "my_mail_server" -u "my_mailbox" -p "my_password")

Using dovecot 2.2.10

The first manual run completed without errors; from the 2nd one onwards I get the error.

If I change the archive folder with the -a switch it works, but starting from the 2nd run I get the error again.
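For reference, imapclient (which parsedmarc uses, per the tracebacks in other issues) can check whether the folder exists before trying to create it, which is presumably what a fix would look like. A stand-alone sketch with the placeholder credentials from the command above:

from imapclient import IMAPClient

client = IMAPClient("my_mail_server")
client.login("my_mailbox", "my_password")

archive = "OLD"  # the folder passed with the -a switch above
if not client.folder_exists(archive):
    client.create_folder(archive)  # only create it when it is really missing
client.logout()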

Ubuntu 16.04 install error

Hello,

I am attempting to install and am receiving these errors:

$ sudo -H pip3 install -U parsedmarc
Collecting parsedmarc
Exception:
Traceback (most recent call last):
File "/usr/share/python-wheels/urllib3-1.13.1-py2.py3-none-any.whl/urllib3/connectionpool.py", line 560, in urlopen
body=body, headers=headers)
File "/usr/share/python-wheels/urllib3-1.13.1-py2.py3-none-any.whl/urllib3/connectionpool.py", line 346, in _make_request
self._validate_conn(conn)
File "/usr/share/python-wheels/urllib3-1.13.1-py2.py3-none-any.whl/urllib3/connectionpool.py", line 787, in validate_conn
conn.connect()
File "/usr/share/python-wheels/urllib3-1.13.1-py2.py3-none-any.whl/urllib3/connection.py", line 252, in connect
ssl_version=resolved_ssl_version)
File "/usr/share/python-wheels/urllib3-1.13.1-py2.py3-none-any.whl/urllib3/util/ssl
.py", line 305, in ssl_wrap_socket
return context.wrap_socket(sock, server_hostname=server_hostname)
File "/usr/lib/python3.5/ssl.py", line 377, in wrap_socket
_context=self)
File "/usr/lib/python3.5/ssl.py", line 752, in init
self.do_handshake()
File "/usr/lib/python3.5/ssl.py", line 988, in do_handshake
self._sslobj.do_handshake()
File "/usr/lib/python3.5/ssl.py", line 633, in do_handshake
self._sslobj.do_handshake()
ConnectionResetError: [Errno 104] Connection reset by peer

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/usr/lib/python3/dist-packages/pip/basecommand.py", line 209, in main
status = self.run(options, args)
File "/usr/lib/python3/dist-packages/pip/commands/install.py", line 328, in run
wb.build(autobuilding=True)
File "/usr/lib/python3/dist-packages/pip/wheel.py", line 748, in build
self.requirement_set.prepare_files(self.finder)
File "/usr/lib/python3/dist-packages/pip/req/req_set.py", line 360, in prepare_files
ignore_dependencies=self.ignore_dependencies))
File "/usr/lib/python3/dist-packages/pip/req/req_set.py", line 577, in _prepare_file
session=self.session, hashes=hashes)
File "/usr/lib/python3/dist-packages/pip/download.py", line 810, in unpack_url
hashes=hashes
File "/usr/lib/python3/dist-packages/pip/download.py", line 649, in unpack_http_url
hashes)
File "/usr/lib/python3/dist-packages/pip/download.py", line 842, in _download_http_url
stream=True,
File "/usr/share/python-wheels/requests-2.9.1-py2.py3-none-any.whl/requests/sessions.py", line 480, in get
return self.request('GET', url, **kwargs)
File "/usr/lib/python3/dist-packages/pip/download.py", line 378, in request
return super(PipSession, self).request(method, url, *args, **kwargs)
File "/usr/share/python-wheels/requests-2.9.1-py2.py3-none-any.whl/requests/sessions.py", line 468, in request
resp = self.send(prep, **send_kwargs)
File "/usr/share/python-wheels/requests-2.9.1-py2.py3-none-any.whl/requests/sessions.py", line 576, in send
r = adapter.send(request, **kwargs)
File "/usr/share/python-wheels/CacheControl-0.11.5-py2.py3-none-any.whl/cachecontrol/adapter.py", line 46, in send
resp = super(CacheControlAdapter, self).send(request, **kw)
File "/usr/share/python-wheels/requests-2.9.1-py2.py3-none-any.whl/requests/adapters.py", line 376, in send
timeout=timeout
File "/usr/share/python-wheels/urllib3-1.13.1-py2.py3-none-any.whl/urllib3/connectionpool.py", line 610, in urlopen
_stacktrace=sys.exc_info()[2])
File "/usr/share/python-wheels/urllib3-1.13.1-py2.py3-none-any.whl/urllib3/util/retry.py", line 228, in increment
total -= 1
TypeError: unsupported operand type(s) for -=: 'Retry' and 'int'

Ubuntu 16.04.5
pip 8.1.1
python 3.5.2

Any help would be greatly appreciated!

Problem parsing forensic report

I recently started receiving forensic reports (which may or may not be valid).
However, they generated an unexpected error.
I updated to the latest 5.2.1 and deleted the forensic reports from my mailbox, but still get the error below.

parsedmarc -n adsrms01.rms.com -t 1 -o dmarc-rpts -H outlook.office365.com -u xxx -p xxx --imap-port 993 -E localhost:9200 --save-aggregate --save-forensic -w >> dmarc.log
Traceback (most recent call last):
File "c:\python\python37-32\lib\site-packages\parsedmarc_init_.py", line 537, in parse_forensic_report
"Forensic sample is not a valid email")
parsedmarc.InvalidForensicReport: Forensic sample is not a valid email

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "c:\python\python37-32\lib\site-packages\parsedmarc_init_.py", line 724, in parse_report_email
strip_attachment_payloads=strip_attachment_payloads)
File "c:\python\python37-32\lib\site-packages\parsedmarc_init_.py", line 607, in parse_forensic_report
"Unexpected error: {0}".format(error.str()))
parsedmarc.InvalidForensicReport: Unexpected error: Forensic sample is not a valid email

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "c:\python\python37-32\lib\runpy.py", line 193, in run_module_as_main
"main", mod_spec)
File "c:\python\python37-32\lib\runpy.py", line 85, in run_code
exec(code, run_globals)
File "C:\Python\Python37-32\Scripts\parsedmarc.exe_main
.py", line 9, in
File "c:\python\python37-32\lib\site-packages\parsedmarc\cli.py", line 343, in main
strip_attachment_payloads=sa
File "c:\python\python37-32\lib\site-packages\parsedmarc_init
.py", line 1010, in get_dmarc_reports_from_inbox
strip_attachment_payloads=sa)
File "c:\python\python37-32\lib\site-packages\parsedmarc_init
.py", line 726, in parse_report_email
raise ParserError(e.str())
parsedmarc.ParserError: Unexpected error: Forensic sample is not a valid email

Occasional 'corrupt' message

Every now and again I get a 'corrupt' message in my DMARC reporting mailbox and the parser fails with the errors below. I clear the offending email and it all starts working again. I'm not sure how I can share the message with you, but it has an attachment called 'eml' and that's it. It's certainly not a proper DMARC report XML.

Another question: do you have a recommendation on server specs for running this? DMARC is configured to report on 1% of email, and I find the reports from Google take up to 3 hours to process. This is ~450k emails, so we'd be talking 4.5 million if I take that up to 100% of emails being reported...

Dec 04 10:46:35 dmarc-proc systemd[1]: parsedmarc.service: Failed with result 'exit-code'.
Dec 04 10:46:35 dmarc-proc systemd[1]: parsedmarc.service: Main process exited, code=exited, status=1/FAILURE
Dec 04 10:46:35 dmarc-proc parsedmarc[19229]: parsedmarc.ParserError: Unexpected error: 'NoneType' object has no attribute 'isoformat'
Dec 04 10:46:35 dmarc-proc parsedmarc[19229]: raise ParserError(e.__str__())
Dec 04 10:46:35 dmarc-proc parsedmarc[19229]: File "/opt/venvs/parsedmarc/site-packages/parsedmarc/__init__.py", line 718, in parse_report_email
Dec 04 10:46:35 dmarc-proc parsedmarc[19229]: strip_attachment_payloads=sa)
Dec 04 10:46:35 dmarc-proc parsedmarc[19229]: File "/opt/venvs/parsedmarc/site-packages/parsedmarc/__init__.py", line 1001, in get_dmarc_reports_f
Dec 04 10:46:35 dmarc-proc parsedmarc[19229]: strip_attachment_payloads=sa
Dec 04 10:46:35 dmarc-proc parsedmarc[19229]: File "/opt/venvs/parsedmarc/site-packages/parsedmarc/cli.py", line 321, in _main
Dec 04 10:46:35 dmarc-proc parsedmarc[19229]: sys.exit(_main())
Dec 04 10:46:35 dmarc-proc parsedmarc[19229]: File "/usr/local/bin/parsedmarc", line 11, in <module>
Dec 04 10:46:35 dmarc-proc parsedmarc[19229]: Traceback (most recent call last):
Dec 04 10:46:35 dmarc-proc parsedmarc[19229]: During handling of the above exception, another exception occurred:
Dec 04 10:46:35 dmarc-proc parsedmarc[19229]: parsedmarc.InvalidForensicReport: Unexpected error: 'NoneType' object has no attribute 'isoformat'
Dec 04 10:46:35 dmarc-proc parsedmarc[19229]: "Unexpected error: {0}".format(error.__str__()))
Dec 04 10:46:35 dmarc-proc parsedmarc[19229]: File "/opt/venvs/parsedmarc/site-packages/parsedmarc/__init__.py", line 599, in parse_forensic_repor
Dec 04 10:46:35 dmarc-proc parsedmarc[19229]: strip_attachment_payloads=strip_attachment_payloads)
Dec 04 10:46:35 dmarc-proc parsedmarc[19229]: File "/opt/venvs/parsedmarc/site-packages/parsedmarc/__init__.py", line 716, in parse_report_email
Dec 04 10:46:35 dmarc-proc parsedmarc[19229]: Traceback (most recent call last):
Dec 04 10:46:35 dmarc-proc parsedmarc[19229]: During handling of the above exception, another exception occurred:
Dec 04 10:46:35 dmarc-proc parsedmarc[19229]: AttributeError: 'NoneType' object has no attribute 'isoformat'
Dec 04 10:46:35 dmarc-proc parsedmarc[19229]: parsed_report["arrival_date"] = msg_date.isoformat()
Dec 04 10:46:35 dmarc-proc parsedmarc[19229]: File "/opt/venvs/parsedmarc/site-packages/parsedmarc/__init__.py", line 530, in parse_forensic_repor
Dec 04 10:46:35 dmarc-proc parsedmarc[19229]: Traceback (most recent call last):
Dec 04 10:46:30 dmarc-proc systemd[1]: Started parsedmarc mailbox watcher.
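The AttributeError at the bottom of the traceback shows that the message's Date header could not be parsed, so msg_date is None when .isoformat() is called on it. A stand-alone illustration of the failure mode, with a hedged fallback to the current time (my suggestion for a guard, not necessarily how parsedmarc should handle it):

from datetime import datetime, timezone
from email import message_from_string
from email.utils import parsedate_to_datetime

# A sample with no usable Date header, like the 'corrupt' message described above.
sample = message_from_string("Subject: eml attachment only\n\nbody\n")

date_header = sample.get("Date")
try:
    msg_date = parsedate_to_datetime(date_header) if date_header else None
except (TypeError, ValueError):
    msg_date = None

if msg_date is None:
    # Guard so isoformat() never runs on None.
    msg_date = datetime.now(timezone.utc)

print(msg_date.isoformat())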

ERROR:utils.py:274:More than one match found for (...

Hello, I'm currently testing parsedmarc 6.0.1 :)

I saw some errors in the console that are not present in the log_file.

The error is the following:

ERROR:utils.py:274:More than one match found for (?:id\s+(?P<id>.+?)(?:\s*[(]?envelope-from|\s*[(]?envelope-sender|\s+from|\s+by|\s+with(?! cipher)|\s+for|\s+via|;)) in from bulk1-smtp.messagingengine.com bulk1-smtp.messagingengine.com 123.123.123.123 ^M
 using TLSv1.2 with cipher ECDHE-RSA-AES256-GCM-SHA384 256/256 bits ^M
 Client did not present a certificate ^M
 by mygateway.mydomain.com MTA with ESMTPS id 43ySkp6wpdz4RFQs^M
 for <[email protected]>; Mon, 11 Feb 2019 02:24:50 +0100 CET

How bad is it?

Unable to parse email KeyError: b'RFC822'

Hi. I have recently stood up parsedmarc using the install instructions on https://domainaware.github.io/parsedmarc/index.html
It works well when using the CLI to parse reports from a local directory.

running version 5.2.1 with Davmail

DEBUG:__init__.py:905:Connecting to IMAP over plain text
DEBUG:__init__.py:976:Found 2 messages in IMAP folder INBOX
DEBUG:__init__.py:982:Processing message 1 of 2: UID 10
Traceback (most recent call last):
File "/usr/local/bin/parsedmarc", line 11, in <module>
sys.exit(_main())
File "/usr/local/lib/python3.5/dist-packages/parsedmarc/cli.py", line 343, in _main
strip_attachment_payloads=sa
File "/usr/local/lib/python3.5/dist-packages/parsedmarc/__init__.py", line 987, in get_dmarc_reports_from_inbox
["RFC822"])[message_uid][b"RFC822"]
KeyError: b'RFC822'

2019-01-17 08:50:38,644 INFO [ImapConnection-34854] davmail.connection - LOGON - 127.0.0.1:34854 [email protected]
2019-01-17 08:50:38,644 DEBUG [ImapConnection-34854] davmail - > MIKK1 OK Authenticated
2019-01-17 08:50:38,644 DEBUG [ImapConnection-34854] davmail - < MIKK2 LIST "" "Archive"
2019-01-17 08:50:38,692 DEBUG [ImapConnection-34854] davmail - > * LIST (\HasChildren) "/" "Archive"
2019-01-17 08:50:38,693 DEBUG [ImapConnection-34854] davmail - > MIKK2 OK LIST completed
2019-01-17 08:50:38,734 DEBUG [ImapConnection-34854] davmail - < MIKK3 LIST "" "Archive/Aggregate"
2019-01-17 08:50:38,812 DEBUG [ImapConnection-34854] davmail - > * LIST (\HasNoChildren) "/" "Archive/Aggregate"
2019-01-17 08:50:38,812 DEBUG [ImapConnection-34854] davmail - > MIKK3 OK LIST completed
2019-01-17 08:50:38,854 DEBUG [ImapConnection-34854] davmail - < MIKK4 LIST "" "Archive/Aggregate"
2019-01-17 08:50:38,933 DEBUG [ImapConnection-34854] davmail - > * LIST (\HasNoChildren) "/" "Archive/Aggregate"
2019-01-17 08:50:38,933 DEBUG [ImapConnection-34854] davmail - > MIKK4 OK LIST completed
2019-01-17 08:50:38,974 DEBUG [ImapConnection-34854] davmail - < MIKK5 LIST "" "Archive/Forensic"
2019-01-17 08:50:39,054 DEBUG [ImapConnection-34854] davmail - > * LIST (\HasNoChildren) "/" "Archive/Forensic"
2019-01-17 08:50:39,054 DEBUG [ImapConnection-34854] davmail - > MIKK5 OK LIST completed
2019-01-17 08:50:39,094 DEBUG [ImapConnection-34854] davmail - < MIKK6 LIST "" "Archive/Invalid"
2019-01-17 08:50:39,169 DEBUG [ImapConnection-34854] davmail - > * LIST (\HasNoChildren) "/" "Archive/Invalid"
2019-01-17 08:50:39,169 DEBUG [ImapConnection-34854] davmail - > MIKK6 OK LIST completed
2019-01-17 08:50:39,210 DEBUG [ImapConnection-34854] davmail - < MIKK7 SELECT "INBOX"
2019-01-17 08:50:39,248 DEBUG [ImapConnection-34854] davmail.exchange.ExchangeSession - Folder INBOX - Search items current count: 3 fetchCount: 500 highest uid: 12 lowest uid: 10
2019-01-17 08:50:39,248 DEBUG [ImapConnection-34854] davmail.exchange.ExchangeSession - Message IMAP uid: 10 uid: AAAAAMJFXiqcxYZLn8LeyLhsBJ8BAOkEKOL0zPtKlch7lA7QutsAAAAADSgAAA== ItemId: AAMkAGZlODU3MGNkLThhNjEtNDZmOS05MjIxLTFkMjZjYzE5YjVhZgBGAAAAAADCRV4qnMWGS5/C3si4bASfBwDpBCji9Mz7SpXIe5QO0LrbAAAAAAEMAADpBCji9Mz7SpXIe5QO0LrbAAAAAA0oAAA= ChangeKey: CQAAABYAAADpBCji9Mz7SpXIe5QO0LrbAAAAABDQ
2019-01-17 08:50:39,248 DEBUG [ImapConnection-34854] davmail.exchange.ExchangeSession - Message IMAP uid: 11 uid: AAAAAMJFXiqcxYZLn8LeyLhsBJ8BAOkEKOL0zPtKlch7lA7QutsAAAAADSkAAA== ItemId: AAMkAGZlODU3MGNkLThhNjEtNDZmOS05MjIxLTFkMjZjYzE5YjVhZgBGAAAAAADCRV4qnMWGS5/C3si4bASfBwDpBCji9Mz7SpXIe5QO0LrbAAAAAAEMAADpBCji9Mz7SpXIe5QO0LrbAAAAAA0pAAA= ChangeKey: CQAAABYAAADpBCji9Mz7SpXIe5QO0LrbAAAAABDR
2019-01-17 08:50:39,248 DEBUG [ImapConnection-34854] davmail.exchange.ExchangeSession - Message IMAP uid: 12 uid: AAAAAMJFXiqcxYZLn8LeyLhsBJ8BAOkEKOL0zPtKlch7lA7QutsAAAAADSoAAA== ItemId: AAMkAGZlODU3MGNkLThhNjEtNDZmOS05MjIxLTFkMjZjYzE5YjVhZgBGAAAAAADCRV4qnMWGS5/C3si4bASfBwDpBCji9Mz7SpXIe5QO0LrbAAAAAAEMAADpBCji9Mz7SpXIe5QO0LrbAAAAAA0qAAA= ChangeKey: CQAAABYAAADpBCji9Mz7SpXIe5QO0LrbAAAAABDS
2019-01-17 08:50:39,248 DEBUG [ImapConnection-34854] davmail - > * 3 EXISTS
2019-01-17 08:50:39,248 DEBUG [ImapConnection-34854] davmail - > * 3 RECENT
2019-01-17 08:50:39,248 DEBUG [ImapConnection-34854] davmail - > * OK [UIDVALIDITY 1]
2019-01-17 08:50:39,248 DEBUG [ImapConnection-34854] davmail - > * OK [UIDNEXT 13]
2019-01-17 08:50:39,248 DEBUG [ImapConnection-34854] davmail - > * FLAGS (\Answered \Deleted \Draft \Flagged \Seen $Forwarded Junk)
2019-01-17 08:50:39,248 DEBUG [ImapConnection-34854] davmail - > * OK [PERMANENTFLAGS (\Answered \Deleted \Draft \Flagged \Seen $Forwarded Junk *)]
2019-01-17 08:50:39,248 DEBUG [ImapConnection-34854] davmail - > MIKK7 OK [READ-WRITE] SELECT completed
2019-01-17 08:50:39,334 DEBUG [ImapConnection-34854] davmail - < MIKK8 UID SEARCH ALL
2019-01-17 08:50:39,378 DEBUG [ImapConnection-34854] davmail.exchange.ExchangeSession - Folder INBOX - Search items current count: 3 fetchCount: 500 highest uid: 12 lowest uid: 10
2019-01-17 08:50:39,378 DEBUG [ImapConnection-34854] davmail.exchange.ExchangeSession - Message IMAP uid: 10 uid: AAAAAMJFXiqcxYZLn8LeyLhsBJ8BAOkEKOL0zPtKlch7lA7QutsAAAAADSgAAA== ItemId: AAMkAGZlODU3MGNkLThhNjEtNDZmOS05MjIxLTFkMjZjYzE5YjVhZgBGAAAAAADCRV4qnMWGS5/C3si4bASfBwDpBCji9Mz7SpXIe5QO0LrbAAAAAAEMAADpBCji9Mz7SpXIe5QO0LrbAAAAAA0oAAA= ChangeKey: CQAAABYAAADpBCji9Mz7SpXIe5QO0LrbAAAAABDQ
2019-01-17 08:50:39,378 DEBUG [ImapConnection-34854] davmail.exchange.ExchangeSession - Message IMAP uid: 11 uid: AAAAAMJFXiqcxYZLn8LeyLhsBJ8BAOkEKOL0zPtKlch7lA7QutsAAAAADSkAAA== ItemId: AAMkAGZlODU3MGNkLThhNjEtNDZmOS05MjIxLTFkMjZjYzE5YjVhZgBGAAAAAADCRV4qnMWGS5/C3si4bASfBwDpBCji9Mz7SpXIe5QO0LrbAAAAAAEMAADpBCji9Mz7SpXIe5QO0LrbAAAAAA0pAAA= ChangeKey: CQAAABYAAADpBCji9Mz7SpXIe5QO0LrbAAAAABDR
2019-01-17 08:50:39,378 DEBUG [ImapConnection-34854] davmail.exchange.ExchangeSession - Message IMAP uid: 12 uid: AAAAAMJFXiqcxYZLn8LeyLhsBJ8BAOkEKOL0zPtKlch7lA7QutsAAAAADSoAAA== ItemId: AAMkAGZlODU3MGNkLThhNjEtNDZmOS05MjIxLTFkMjZjYzE5YjVhZgBGAAAAAADCRV4qnMWGS5/C3si4bASfBwDpBCji9Mz7SpXIe5QO0LrbAAAAAAEMAADpBCji9Mz7SpXIe5QO0LrbAAAAAA0qAAA= ChangeKey: CQAAABYAAADpBCji9Mz7SpXIe5QO0LrbAAAAABDS
2019-01-17 08:50:39,378 DEBUG [ImapConnection-34854] davmail - > * SEARCH 10 11 12
2019-01-17 08:50:39,378 DEBUG [ImapConnection-34854] davmail - > MIKK8 OK SEARCH completed
2019-01-17 08:50:39,418 DEBUG [ImapConnection-34854] davmail - < MIKK9 UID FETCH 10 (RFC822)
2019-01-17 08:50:39,429 DEBUG [ImapConnection-34854] davmail.imap.ImapConnection - * 1 FETCH (UID 10
2019-01-17 08:50:39,449 DEBUG [ImapConnection-34854] davmail.exchange.ExchangeSession - Downloaded full message content for IMAP UID 10 (12048 bytes)
2019-01-17 08:50:39,449 DEBUG [ImapConnection-34854] davmail - > BODY[null] {12048}
2019-01-17 08:50:39,449 DEBUG [ImapConnection-34854] davmail - > )
2019-01-17 08:50:39,449 DEBUG [ImapConnection-34854] davmail - > MIKK9 OK UID FETCH completed
2019-01-17 08:50:39,539 INFO [ImapConnection-34854] davmail.connection - DISCONNECT - 127.0.0.1:34854

Problem when trying to create elasticsearch indexes

Hi,

I've been using parsedmarc without difficulties until today, when I tried to connect it to Elasticsearch/Kibana following the documentation.
I ran parsedmarc -o $reportdir -H $host -u $user -p $pass --save-aggregate -E 127.0.0.1:9200, and the output is:

Traceback (most recent call last):
File "/usr/local/bin/parsedmarc", line 11, in
load_entry_point('parsedmarc==4.1.0', 'console_scripts', 'parsedmarc')()
File "/usr/local/lib/python3.5/dist-packages/parsedmarc/cli.py", line 194, in _main
elastic.create_indexes([es_aggregate_index, es_forensic_index])
File "/usr/local/lib/python3.5/dist-packages/parsedmarc/elastic.py", line 187, in create_indexes
index.put_settings(settings)
File "/usr/local/lib/python3.5/dist-packages/elasticsearch_dsl/index.py", line 430, in put_settings
return self._get_connection(using).indices.put_settings(index=self._name, **kwargs)
File "/usr/local/lib/python3.5/dist-packages/elasticsearch/client/utils.py", line 76, in _wrapped
return func(*args, params=params, **kwargs)
TypeError: put_settings() missing 1 required positional argument: 'body'

Any idea on how to solve this?

Publicsuffix

Okt 22 15:28:14 DE9899SDR parsedmarc[10654]: WARNING:parsedmarc:Failed to download an updated PSL HTTPSConnectionPool(host='publicsuffix.org', port=443): Max retries exceeded with url: /list/public_suffix_list.dat (Caused by NewConnectio

parsedmarc tries to connect to publicsuffix.org without using a proxy. Can you add a proxy feature?
Otherwise I have to add a direct connection rule.

Message with subject <> is not a valid DMARC report / Error in visualization: responseAggs is undefined

Hi!

I've installed parsedmarc 6.0.2 on Ubuntu 16.04 Server LTS (fresh install) with Elasticsearch/Kibana 6.6.0.

The following problems occur:

  1. I receive a DMARC report from Google and receive the message:
    Message with subject "Report domain: Submitter: google.com Report ID: 123456789" is not a valid DMARC report

  2. On the Kibana Dashboard (DMARC-Summary) I get the messages "Error in visualization: responseAggs is undefined"

Am I doing anything wrong or could that be a bug?

I would be very happy to receive an answer because I am very excited about parsedmarc and would like to use it.

Sorry for my bad English :-)

Suggestion: bundle time per day.

As I noticed that most organisations only report once a day, I changed the visualization for the two "over time" graphs. The X-axis date range is set to an interval of "auto", but I changed it to daily so the graphs make more sense.

It might be prudent to change this in the online template, or at least to suggest the option.

I was also looking into changing the colors, as blue is not the color people associate with something being wrong. So I changed alignment issues to dark yellow and DMARC "errors" to red.

Elasticsearch monthly indexes

A mechanism is needed to switch between daily and monthly indexes.

In "home" usage I receive only 3-5 reports per day, so a monthly index is more efficient than a daily one.

Problems with indexes and Kibana dashboard

Hi there,
sorry for bothering about an issue caused by myself doing dumb operations on the server, but I'm not able to display new data in the dashboard anymore. It's not a bug of parsedmarc, but I don't know where else to ask for help.
Yesterday I updated parsedmarc to version 6.0.0 and I thought it were a good idea to renew the Kibana index patterns as if I were coming from a version prior to 5. So I followed the instructions but, when I tried to import the indexes from the downloaded kibana_saved_objects.json, I got the message you can see in Screenshot_001. After creating 2 new index patterns (dmarc_aggregate* and dmarc_forensic* that matched all the daily indexes of each type) I'm now able to display data until 02/05/2019, but even if 02/06 and 02/07 reports have been parsed (after all the mess), the results are not shown in the dashboard, as you can see in Screenshot_002. Any suggestion, apart starting from scratch with a clean install and paying some more attention? Should you need any further information, please don't hesitate to ask. Thank you very much for any hint you feel to share with me.

IMAP Connection timeout

I've been running parsedmarc as a service for several months, and it's working well with the Kibana dashboard. But I'm having a problem now that I'm getting about 200 reports a day. Is there a way to run in batches of 10 messages or so? My mail server is timing out and closing the connection before the reports can be archived. If I could run in batches, or at least archive the messages that have already been processed instead of starting over when more messages arrive in the inbox before the last run finishes, I think I could get around my timeout issue.
I'm also looking into other ideas for dealing with this, like spreading the load across more than one inbox and process. Maybe you have some suggestions, if no changes to the script are possible?
Great tool, though! I've been pretty happy so far, which is why I increased the work it was doing so much. lol.

From the logs:
Jan 23 14:04:45 mtk-unms-srv01 parsedmarc[8919]: ERROR:parsedmarc:IMAP error: Error moving message UID 6447: socket error: [Errno 32] Broken pipe
Jan 23 14:04:45 mtk-unms-srv01 parsedmarc[8919]: ERROR:parsedmarc:IMAP error: Error moving message UID 6448: socket error: [Errno 32] Broken pipe
Jan 23 14:04:45 mtk-unms-srv01 parsedmarc[8919]: ERROR:parsedmarc:IMAP error: Error moving message UID 6449: socket error: [Errno 32] Broken pipe
Jan 23 14:04:45 mtk-unms-srv01 parsedmarc[8919]: ERROR:parsedmarc:IMAP error: Error moving message UID 6450: socket error: [Errno 32] Broken pipe
Jan 23 14:04:45 mtk-unms-srv01 parsedmarc[8919]: ERROR:parsedmarc:IMAP error: Error moving message UID 6451: socket error: [Errno 32] Broken pipe
Jan 23 14:21:40 mtk-unms-srv01 parsedmarc[8919]: Traceback (most recent call last):
Jan 23 14:21:40 mtk-unms-srv01 parsedmarc[8919]: File "/usr/local/bin/parsedmarc", line 11, in <module>
Jan 23 14:21:40 mtk-unms-srv01 parsedmarc[8919]: sys.exit(_main())
Jan 23 14:21:40 mtk-unms-srv01 parsedmarc[8919]: File "/usr/local/lib/python3.6/dist-packages/parsedmarc/cli.py", line 321, in _main
Jan 23 14:21:40 mtk-unms-srv01 parsedmarc[8919]: strip_attachment_payloads=sa
Jan 23 14:21:40 mtk-unms-srv01 parsedmarc[8919]: File "/usr/local/lib/python3.6/dist-packages/parsedmarc/__init__.py", line 1162, in get_dmarc_reports_from_inbox
Jan 23 14:21:40 mtk-unms-srv01 parsedmarc[8919]: results=results
Jan 23 14:21:40 mtk-unms-srv01 parsedmarc[8919]: File "/usr/local/lib/python3.6/dist-packages/parsedmarc/__init__.py", line 1162, in get_dmarc_reports_from_inbox
Jan 23 14:21:40 mtk-unms-srv01 parsedmarc[8919]: results=results
Jan 23 14:21:40 mtk-unms-srv01 parsedmarc[8919]: File "/usr/local/lib/python3.6/dist-packages/parsedmarc/__init__.py", line 1162, in get_dmarc_reports_from_inbox
Jan 23 14:21:40 mtk-unms-srv01 parsedmarc[8919]: results=results
Jan 23 14:21:40 mtk-unms-srv01 parsedmarc[8919]: [Previous line repeated 11 more times]
Jan 23 14:21:40 mtk-unms-srv01 parsedmarc[8919]: File "/usr/local/lib/python3.6/dist-packages/parsedmarc/__init__.py", line 979, in get_dmarc_reports_from_inbox
Jan 23 14:21:40 mtk-unms-srv01 parsedmarc[8919]: ["RFC822"])[message_uid][b"RFC822"]
Jan 23 14:21:40 mtk-unms-srv01 parsedmarc[8919]: KeyError: 6448
Jan 23 14:21:40 mtk-unms-srv01 systemd[1]: parsedmarc.service: Main process exited, code=exited, status=1/FAILURE
Jan 23 14:21:40 mtk-unms-srv01 systemd[1]: parsedmarc.service: Failed with result 'exit-code'.
Jan 23 14:26:40 mtk-unms-srv01 systemd[1]: parsedmarc.service: Service hold-off time over, scheduling restart.
Jan 23 14:26:40 mtk-unms-srv01 systemd[1]: parsedmarc.service: Scheduled restart job, restart counter is at 3.
Jan 23 14:26:40 mtk-unms-srv01 systemd[1]: Stopped parsedmarc mailbox watcher.
Jan 23 14:26:40 mtk-unms-srv01 systemd[1]: Started parsedmarc mailbox watcher.
Jan 23 15:26:26 mtk-unms-srv01 parsedmarc[31307]: ERROR:parsedmarc:IMAP error: IMAP error: Skipping message UID 6662: socket error: [Errno 104] Connection reset by peer
Jan 23 15:26:26 mtk-unms-srv01 parsedmarc[31307]: ERROR:parsedmarc:IMAP error: IMAP error: Skipping message UID 6663: socket error: [Errno 32] Broken pipe

Using Azure Event Hub

I'm trying to use Azure Event Hub which can act as a Kafka endpoint. This is the command:

sudo parsedmarc --save-aggregate --save-forensic -H "{hostname}" -u "{user}" -p "{password}" -K "{Event Hub Connection String}" --test --debug

The Event Hub Connection String is an endpoint URL that contains the endpoint hostname, access key and topic. I get the following error:

ERROR:parsedmarc:Kafka Error: encoding with 'idna' codec failed (UnicodeError: label too long)
ERROR:parsedmarc:Kafka Error: local variable 'kafka_client' referenced before assignment
ERROR:parsedmarc:Kafka Error: local variable 'kafka_client' referenced before assignment
ERROR:parsedmarc:Kafka Error: local variable 'kafka_client' referenced before assignment
(This local variable error then repeats for each message in the mailbox.)
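The idna error suggests the whole connection string is being treated as the Kafka bootstrap host name. Event Hubs' Kafka endpoint expects only the namespace host on port 9093, with the connection string itself used as the SASL password. A hedged sketch of pulling the host out of a connection string of the usual Endpoint=sb://...;SharedAccessKeyName=...;SharedAccessKey=... form (the namespace below is a placeholder):

from urllib.parse import urlparse

connection_string = (
    "Endpoint=sb://mynamespace.servicebus.windows.net/;"
    "SharedAccessKeyName=RootManageSharedAccessKey;"
    "SharedAccessKey=xxxxxxxx;"
    "EntityPath=dmarc"
)

# Split the semicolon-separated key=value pairs of the connection string.
parts = dict(item.split("=", 1) for item in connection_string.split(";") if item)
endpoint = urlparse(parts["Endpoint"])

# Only this short host name is a valid Kafka bootstrap server; the full
# connection string is far too long to be a single DNS label.
print("{0}:9093".format(endpoint.hostname))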

Imap Port

I can't use IMAP without SSL with the current build. Can you add an option to parsedmarc to connect over IMAP without SSL?

Error pulling dmarc reports

I've installed parsedmarc on a new system (tried both Ubuntu and Debian) and when pulling DMARC reports I'm getting this error:

ERROR:parsedmarc:Elasticsearch Error: Elasticsearch error: Range accepts a single dictionary or a set of keyword arguments.

The error wasn't present in 4.4.1. Hopefully it's not something I'm doing on my new system...

Elasticsearch with username/password hashtag

I'm having an issue setting the hosts value under the [elasticsearch] section with credentials.
According to the docs, the host takes the form:

[elasticsearch]
hosts = https://username:password@hostname:port

The issue I noticed is that when the password contains a hash character (#), the string seems to be cut off at it,
i.e. hosts = https://username:password#001@hostname:port

I get the error:

Traceback (most recent call last):
  File "c:\users\xq66\appdata\local\continuum\anaconda3\lib\runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "c:\users\xq66\appdata\local\continuum\anaconda3\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "C:\Users\xq66\AppData\Local\Continuum\anaconda3\Scripts\parsedmarc.exe\__main__.py", line 9, in <module>
  File "c:\users\xq66\appdata\local\continuum\anaconda3\lib\site-packages\parsedmarc\cli.py", line 330, in _main
    opts.elasticsearch_ssl_cert_path)
  File "c:\users\xq66\appdata\local\continuum\anaconda3\lib\site-packages\parsedmarc\elastic.py", line 192, in set_hosts
    connections.create_connection(**conn_params)
  File "c:\users\xq66\appdata\local\continuum\anaconda3\lib\site-packages\elasticsearch_dsl\connections.py", line 66, in create_connection
    conn = self._conns[alias] = Elasticsearch(**kwargs)
  File "c:\users\xq66\appdata\local\continuum\anaconda3\lib\site-packages\elasticsearch\client\__init__.py", line 188, in __init__
    self.transport = transport_class(_normalize_hosts(hosts), **kwargs)
  File "c:\users\xq66\appdata\local\continuum\anaconda3\lib\site-packages\elasticsearch\client\__init__.py", line 43, in _normalize_hosts
    if parsed_url.port:
  File "c:\users\xq66\appdata\local\continuum\anaconda3\lib\urllib\parse.py", line 159, in port
    port = int(port, 10)
ValueError: invalid literal for int() with base 10: 'password'
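The # starts the URL fragment, so everything after it (including @hostname:port) is dropped from the network location and urllib then tries to parse "password" as the port. Percent-encoding the password inside the URL avoids this; a small stand-alone sketch (hostname and 9200 are placeholders):

from urllib.parse import quote, urlparse

password = "password#001"  # example password containing '#'
host_value = "https://username:{0}@hostname:9200".format(quote(password, safe=""))

print(host_value)                 # https://username:password%23001@hostname:9200
print(urlparse(host_value).port)  # 9200 -- the port now parses correctly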

Could not locate that visualization

This error appears in the Kibana dashboards below the visualizations that are populated.

There are two IDs which can't be found:

(id: 1fad3f60-2881-11e8-b8b2-15742da3055c)
(id: 40e7a5b0-2883-11e8-b8b2-15742da3055c)

These are referenced in the DMARC Summary dashboard JSON, but is there any clue as to how to get them to populate, or even what they are?

For now I have removed them from the dashboard, as they are not referenced in the visualization section.

Tabs in email subjects / filename.

I noticed an issue where some DMARC reports aren't being picked up via the IMAP client.

These reports have the subject/filename as receiver.com!myorg.com!epoch1!epoch2!dmarc.xml.gz

These aren't being removed from the mailbox nor sent to ES/Kafka. Processing a report straight from the file works fine, so I suspect it might be the IMAP client? I'm not seeing any exceptions. I'll dig in deeper when I have some time.

Error Importing to Elastic Search

Fresh Install, Ubuntu 18.04.1

Python 3.6.6
parsedmarc 4.3.8
elasticsearch: 6.4.3

Reports seem to be retrieved okay via IMAP, but fail when being imported into Elasticsearch.

Traceback (most recent call last):
File "/usr/local/bin/parsedmarc", line 11, in
sys.exit(_main())
File "/opt/venvs/parsedmarc/site-packages/parsedmarc/cli.py", line 333, in _main
process_reports(results)
File "/opt/venvs/parsedmarc/site-packages/parsedmarc/cli.py", line 42, in process_reports
report, index=es_aggregate_index)
File "/opt/venvs/parsedmarc/site-packages/parsedmarc/elastic.py", line 292, in save_aggregate_report_to_elasticsearch
agg_doc.save()
File "/opt/venvs/parsedmarc/site-packages/parsedmarc/elastic.py", line 88, in save
return super().save(** kwargs)
File "/opt/venvs/parsedmarc/site-packages/elasticsearch_dsl/document.py", line 383, in save
self.full_clean()
File "/opt/venvs/parsedmarc/site-packages/elasticsearch_dsl/utils.py", line 444, in full_clean
self.clean_fields()
File "/opt/venvs/parsedmarc/site-packages/elasticsearch_dsl/utils.py", line 430, in clean_fields
data = field.clean(data)
File "/opt/venvs/parsedmarc/site-packages/elasticsearch_dsl/field.py", line 207, in clean
data.full_clean()
File "/opt/venvs/parsedmarc/site-packages/elasticsearch_dsl/utils.py", line 444, in full_clean
self.clean_fields()
File "/opt/venvs/parsedmarc/site-packages/elasticsearch_dsl/utils.py", line 430, in clean_fields
data = field.clean(data)
File "/opt/venvs/parsedmarc/site-packages/elasticsearch_dsl/field.py", line 95, in clean
data = self.deserialize(data)
File "/opt/venvs/parsedmarc/site-packages/elasticsearch_dsl/field.py", line 91, in deserialize
return self._deserialize(data)
File "/opt/venvs/parsedmarc/site-packages/elasticsearch_dsl/field.py", line 314, in _deserialize
return int(data)
ValueError: invalid literal for int() with base 10: '0:1:d:s'

Installation Issues

I recreated the same issue when installing via both methods described in the instructions (straight pip install and install via Git). The installer fails with the issue below:

Collecting publicsuffix (from parsedmarc==3.7.2)
Using cached https://files.pythonhosted.org/packages/76/8e/2be900ba8397bafe88c9d17fab456faddab7af22d7105df18699d3dd97de/publicsuffix-1.1.0.tar.gz
Complete output from command python setup.py egg_info:
Traceback (most recent call last):
File "", line 1, in
File "/tmp/pip-build-zl_4wfaa/publicsuffix/setup.py", line 19, in
long_description=get_long_description(),
File "/tmp/pip-build-zl_4wfaa/publicsuffix/setup.py", line 13, in get_long_description
read_doc("LICENSE")
File "/tmp/pip-build-zl_4wfaa/publicsuffix/setup.py", line 7, in read_doc
return open(os.path.join(os.path.dirname(__file__), name)).read()
File "/usr/lib/python3.5/encodings/ascii.py", line 26, in decode
return codecs.ascii_decode(input, self.errors)[0]
UnicodeDecodeError: 'ascii' codec can't decode byte 0xc5 in position 23: ordinal not in range(128)

----------------------------------------

Command "python setup.py egg_info" failed with error code 1 in /tmp/pip-build-zl_4wfaa/publicsuffix/

Install attempted using Debian on Windows Subsystem for Linux
x> uname -vrma
x> Linux laptop 4.4.0-17134-Microsoft #1-Microsoft Tue Apr 10 18:04:00 PST 2018 x86_64 GNU/Linux

IMAP error: command: FETCH => socket error: EOF

I'm having trouble with IMAP to a G Suite account with a large volume of DMARC messages (~300). I've just updated to 6.0.1 (running with pypy3 on Ubuntu 18.04.1 LTS), but this problem was happening with the earlier version too.

This version (6.0.1) seems to handle a connection timeout with the IMAP server better, but after 5 hours of processing time the script started to get socket errors from the IMAP server, as seen in the log below. Just before the script exits, it produces an error (Unexpected error: [Errno 24] Too many open files) when processing a DMARC message:

2019-02-07 18:52:11,294 - DEBUG - [init.py:829] - IMAP server supports: ['IMAP4REV1', 'UNSELECT', 'IDLE', 'NAMESPACE', 'QUOTA', 'ID', 'XLIST', 'CHILDREN', 'X-GM-EXT-1', 'UIDPLUS', 'COMPRESS=DEFLATE', 'ENABLE', 'MOVE', 'CONDSTORE', 'ESEARCH', 'UTF8=ACCEPT', 'LIST-EXTENDED', 'LIST-STATUS', 'LITERAL-', 'SPECIAL-USE', 'APPENDLIMIT=35651584']
2019-02-07 18:52:13,390 - DEBUG - [init.py:977] - Found 369 messages in IMAP folder TMP
2019-02-07 18:52:13,390 - DEBUG - [init.py:983] - Processing message 1 of 369: UID 91
...
2019-02-07 19:55:54,473 - DEBUG - [init.py:983] - Processing message 26 of 369: UID 116
2019-02-07 20:12:16,920 - DEBUG - [init.py:999] - IMAP error: [Errno 110] Connection timed out
2019-02-07 20:12:16,920 - DEBUG - [init.py:1000] - Reconnecting to IMAP
2019-02-07 20:14:49,550 - DEBUG - [init.py:983] - Processing message 27 of 369: UID 117
2019-02-07 20:15:08,831 - DEBUG - [init.py:983] - Processing message 28 of 369: UID 118
...
2019-02-07 23:54:47,861 - DEBUG - [init.py:983] - Processing message 112 of 369: UID 202
2019-02-07 23:54:47,862 - ERROR - [init.py:1035] - IMAP error: IMAP error: Skipping message UID 202: command: FETCH => socket error: EOF
2019-02-07 23:54:47,863 - DEBUG - [init.py:983] - Processing message 113 of 369: UID 203
2019-02-07 23:54:47,863 - ERROR - [init.py:1035] - IMAP error: IMAP error: Skipping message UID 203: command: FETCH => socket error: EOF
...
2019-02-07 23:54:48,161 - DEBUG - [init.py:983] - Processing message 369 of 369: UID 459
2019-02-07 23:54:48,161 - ERROR - [init.py:1035] - IMAP error: IMAP error: Skipping message UID 459: command: FETCH => socket error: EOF
2019-02-07 23:54:48,162 - DEBUG - [init.py:1094] - Moving aggregate report messages from TMP to TMP
2019-02-07 23:54:48,162 - DEBUG - [init.py:1100] - Moving message 1 of 97: UID 91
2019-02-07 23:54:48,162 - DEBUG - [init.py:935] - Moving message UID(s) 91 to ParseDMARC/Aggregate
2019-02-07 23:54:48,163 - ERROR - [init.py:1109] - IMAP error: Error moving message UID 91: command: UID => socket error: EOF
...
2019-02-07 23:54:48,313 - DEBUG - [init.py:1100] - Moving message 97 of 97: UID 201
2019-02-07 23:54:48,313 - DEBUG - [init.py:935] - Moving message UID(s) 201 to ParseDMARC/Aggregate
2019-02-07 23:54:48,313 - ERROR - [init.py:1109] - IMAP error: Error moving message UID 201: command: UID => socket error: EOF
2019-02-07 23:54:51,437 - DEBUG - [init.py:977] - Found 355 messages in IMAP folder TMP
2019-02-07 23:54:51,437 - DEBUG - [init.py:983] - Processing message 1 of 355: UID 91
2019-02-07 23:54:58,037 - DEBUG - [init.py:983] - Processing message 2 of 355: UID 92
2019-02-07 23:55:04,585 - DEBUG - [init.py:983] - Processing message 3 of 355: UID 93
...
2019-02-08 01:14:38,104 - DEBUG - [init.py:999] - IMAP error: [Errno 110] Connection timed out
2019-02-08 01:14:38,104 - DEBUG - [init.py:1000] - Reconnecting to IMAP
2019-02-08 01:17:10,307 - DEBUG - [init.py:983] - Processing message 27 of 355: UID 117
...
2019-02-08 02:34:55,124 - DEBUG - [init.py:983] - Processing message 97 of 355: UID 201
2019-02-08 04:16:45,938 - WARNING - [init.py:1037] - Message with subject "Report domain: [anonymized].com.br Submitter: [anonymized].com Report-ID: [anonymized]" is not a valid aggregate DMARC report: Unexpected error: [Errno 24] Too many open files
2019-02-08 04:16:45,939 - DEBUG - [init.py:1046] - Moving message UID 201 to ParseDMARC/Invalid
2019-02-08 04:16:45,939 - DEBUG - [init.py:935] - Moving message UID(s) 201 to ParseDMARC/Invalid
2019-02-08 04:16:45,940 - ERROR - [cli.py:411] - IMAP Error: command: CAPABILITY => socket error: EOF

CentOS 7

Hello,

Any setup guide for CentOS 7?
