IntelMQ command line tool to process events and send out email notifications.

Home Page: http://intevation.github.io/intelmq-mailgen/

License: Other


intelmq-mailgen's Introduction

IntelMQ Mailgen

IntelMQ command line tool to process events.

Call intelmqcbmail --help to see the current usage.

The concept's documentation can be found here: http://intevation.github.io/intelmq-mailgen/

Installation

Dependencies

These libraries and programs are required:

  • The Python library psycopg2 (python3-psycopg2) for PostgreSQL communication.
  • The Python library gpg (python3-gpg), part of GPGME. Due to packaging issues with Ubuntu 20.04, this dependency is not installed when installing with pip or setup.py. Other distribution channels (deb packages) are not affected by this bug.
  • GnuPG (>= 2.2) for python3-gpg.

For the Python dependencies, see the install_requires section in setup.py.

If you install the deb-packages, the package management handles all dependencies.

For an installation from source use this command:

pip3 install -v -e .

In order to use IntelMQ Mailgen, you need a working certbund-contact-expert in IntelMQ, as Mailgen makes use of information and data that is not available in IntelMQ's default fields.

IntelMQ Configuration

For Mailgen to work, the following IntelMQ bots will need to be configured first:

  1. Expert: CERT-bund Contact Database
  2. Expert: CERT-bund Contact Rules
  3. Output: PostgreSQL

You must follow the setup instructions for these bots before setting up Mailgen.

Database

The intelmq-events database and the intelmq database-user should already have been set up by the configuration of the PostgreSQL output bot. For use with Mailgen this setup has to be extended:

As database-superuser (usually via system user postgres):

  1. Create a new database-user:

    createuser --encrypted --pwprompt intelmq_mailgen
    
  2. Extend the database:

    psql -f sql/notifications.sql intelmq-events
    
  3. Grant intelmq the right to insert new events via a trigger:

    psql -c "GRANT eventdb_insert TO intelmq" intelmq-events
    
  4. Grant the new user the right to send out notifications:

    psql -c "GRANT eventdb_send_notifications TO intelmq_mailgen" intelmq-events
    

Interaction with IntelMQ and the events database

The events written into the events database have been processed by the rules bot which adds notification directives to the events. The directives tell mailgen which notifications to generate based on that event. The statements in sql/notifications.sql add triggers and tables to the event database that process these directives as they come in and prepare them for use by mailgen. In particular:

  • The directives table contains all the directives. The main attributes of a directive are

    • ID of the event
    • recipient address
    • data format
    • template name (see "Templates" below)
    • how to aggregate
    • whether and when it was sent; this is recorded as the ID of the corresponding row in the sent table (see below)
  • When a new event is inserted into the events table, a trigger procedure extracts the directives and inserts them into directives.

  • The sent table records which notifications have actually been sent. Its main attributes are

    • the ticket number generated for the notification
    • a time stamp indicating when it was sent

When mailgen processes the directives, it reads the still unsent directives from the database, aggregates directives that are sufficiently similar to be sent in the same mail, and calls a series of scripts for each aggregated directive. These scripts inspect the directive and, if they can process it, generate mails from it. mailgen then sends these mails and records them in the sent table.
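The aggregation step can be sketched as follows; the grouping key fields used here are assumptions for illustration, not mailgen's actual schema:

```python
from itertools import groupby

def aggregate_directives(directives):
    """Group directives that could be sent in the same mail.

    Sketch only: real mailgen aggregates according to the directive's own
    aggregation instructions; here we group by a fixed, assumed key of
    recipient, template, and data format.
    """
    key = lambda d: (d["recipient_address"], d["template_name"],
                     d["event_data_format"])
    for k, group in groupby(sorted(directives, key=key), key=key):
        yield k, list(group)
```

Each yielded group would then be handed to the format scripts as one unit, producing a single mail per group.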

Ticket Numbers

For every email sent by Mailgen, a ticket number is generated. If a mail was sent successfully, this number is stored in the sent table, together with a timestamp of when the mail was sent.

Configuration

intelmq-mailgen currently searches for configuration files in two places:

  1. $HOME/.intelmq/intelmq-mailgen.conf (user configuration file) and
  2. /etc/intelmq/intelmq-mailgen.conf (system configuration file).

Settings are read from both files with the one in the user's home directory taking precedence.

The system configuration file path can be overridden with the --config command line parameter.

Both files must be in JSON format. A complete example can be found in intelmq-mailgen.conf.example.
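A minimal sketch of this lookup, assuming a flat, shallow merge of the two JSON files (an illustration of the described precedence, not mailgen's actual code):

```python
import json

def read_configuration(system_path, user_path):
    """Read both JSON configuration files; settings from the user file
    take precedence over the system file (shallow top-level merge)."""
    config = {}
    for path in (system_path, user_path):  # user file read last, so it wins
        try:
            with open(path) as conf_file:
                config.update(json.load(conf_file))
        except FileNotFoundError:
            pass  # in this sketch, both files are optional
    return config
```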

OpenPGP Signatures

gnupg_home has to point to the GnuPG home directory for email signatures. It must:

  • contain the private and public key parts for the OpenPGP signature, without password protection.
  • be readable and writable for the user running intelmq-mailgen.

For example, the following steps will create such a directory and import a test signing key.

mkdir /tmp/gnupghome
chmod og-rwx /tmp/gnupghome
GNUPGHOME=/tmp/gnupghome gpg2 --list-secret-keys
GNUPGHOME=/tmp/gnupghome gpg2 --import src/intelmq-mailgen/tests/keys/test1.sec

Depending on your GnuPG version you may want to set additional options by editing $GNUPGHOME/gpg.conf.

For example, the following settings will set the default digest algorithm, suppress emitting the GnuPG version, and add a comment line for signatures:

personal-digest-preferences SHA256
no-emit-version
comment Key verification <https://example.org/hints-about-verification>

(See the GnuPG documentation for details.)

Now, you should be able to sign using this key without being prompted for a passphrase. Try, for example:

echo Moin moin. | GNUPGHOME=/tmp/gnupghome gpg2 --clearsign --local-user "5F503EFAC8C89323D54C252591B8CD7E15925678"

Templates

mailgen comes with a templating mechanism that the scripts processing the directives can use. This mechanism assumes that all templates are files in the directory given by the template_dir setting in the configuration file.

The scripts that come with mailgen simply take the template name from the directive they are processing. This means that the name is set by the rules used by the rules bot, so see its documentation and configuration for which templates you need.

Template Format

The first line of a template file is used as the subject line for mails. The remaining lines will become the mail body. The body may optionally be separated from the subject line by one or more empty lines.

Both subject and body text will be interpreted as Python3 Template strings and may allow some substitutions depending on the format. Subject and body allow the same substitutions.

Typically supported substitutions:

  • All formats:

    • ${ticket_number}
  • Additional substitutions for CSV-based formats:

    • ${events_as_csv} for the CSV-formatted event data. This is only useful in the body.
  • When aggregating by event fields the event fields can also be used. E.g. if a directive aggregates by source.asn you can use ${source.asn}

    Like the template name, aggregation is determined by the rules bot, so see there for details.
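As a sketch, the subject/body split and substitutions described above can be reproduced with Python's string.Template; the read_template helper and the sample field values are illustrative, not mailgen's actual API:

```python
from string import Template

def read_template(text):
    # First line: mail subject. Remaining lines, after any blank
    # separator lines, become the mail body.
    first, _, rest = text.partition("\n")
    return Template(first.strip()), Template(rest.lstrip("\n"))

subject_tmpl, body_tmpl = read_template(
    "Incident report ${ticket_number}\n"
    "\n"
    "Please find the affected systems below:\n"
    "${events_as_csv}\n"
)
subject = subject_tmpl.substitute(ticket_number="CE-20160818-1234-5678")
```

Note that Template.substitute raises a KeyError for placeholders without a value, which makes missing substitutions easy to detect.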

Database

The database section in the configuration may look like:

    "database": {
        "event": {
            "name": "intelmq-events",
            "username": "intelmq_mailgen",
            "password": "your DB password",
            "host": "localhost",
            "port": 5432
        },
        "additional_directive_where": ""
    },

The additional_directive_where parameter is optional and can contain SQL code to be appended to the WHERE clause of the SELECT operation on the directives table. The AND is appended automatically. The columns of the directives table are available as d3 and the columns of the events table as events. Normally the events table is not queried, and it is only joined for the WHERE statement if additional_directive_where contains events.. Examples:

        "additional_directive_where": "\"template_name\" = 'qakbot_provider'"
        "additional_directive_where": "events.\"feed.code\" = 'oneshot'"

Mind the correct quoting. If access to the events table is required, the PostgreSQL user in use needs UPDATE permission on that table. By default this is not the case for mailgen installations! This imperfection is a result of the update-locking on the directives table and the join of events in the same sub-statement.
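The append-with-AND behaviour can be pictured with a tiny sketch (hypothetical helper and column names, not mailgen's actual query builder):

```python
def extend_where(base_query, additional_directive_where):
    """Append the configured SQL snippet to an existing WHERE clause;
    the AND is added automatically, as described above."""
    if not additional_directive_where:
        return base_query
    return base_query + " AND " + additional_directive_where

# "sent_id" is an assumed column name used purely for illustration.
query = extend_where(
    "SELECT id FROM directives WHERE sent_id IS NULL",
    "\"template_name\" = 'qakbot_provider'")
```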

Operation manual

The log file should be monitored for errors to detect unwanted conditions. In particular, grep for:

 * 'ERROR'
 * 'Error:'

Each error condition should be handled by an administrator or service technician promptly. It is recommended to use a monitoring system that notifies administrators as soon as such a string occurs in the log.

Log file contents

There should be no Traceback or other ERROR information in mailgen's log. Please read the lines in question; they often contain good hints about the cause of the failure. Some problems can be solved by correcting the configuration.

INFO lines appear during normal operation. One condition that produces an INFO message is when Mailgen detects that it is already running, so that a second instance does not start. In that case, the running Mailgen process may still have problems, and due to the nature of the log file, the messages of the Mailgen instance that tries to start up may appear interwoven with the error conditions.

Mailgen needs to lock db rows

If mailgen is started a second time during a run, it will fail to lock the necessary rows in the database. The postgres.log file will record the failed locks, e.g. like

2020-12-15 09:00:02 UTC ERROR:  could not obtain lock on row in relation "directives"

which can be ignored.

Mailgen tries to continue

Mailgen will try to continue processing directives and sending mails, even if some batch of mails could not be sent for some reason.

If it can't find templates, for instance, it will continue with the next directive and log an error message and the stacktrace. The error message contains information about the directives that could not be processed. The directive_ids part in the output is a list with the IDs of the rows in the directives table and event_ids a list with ids for events in the events table.

An administrator can use this information to see in detail which events and emails may not have gone out, and to deal with them later, possibly with a small script, depending on the cause of the problem.

Developer Information

Security Considerations

  • It is assumed that we need to protect against malicious external data coming to us via the database.
  • We cannot (and do not need to) protect against local attacks with administration rights.
  • As our command will be able to run with and without user interaction, we assume that only users with administration rights have access to the machine and are allowed to start the interactive variant.
  • The private key material for signing will have no extra protection by passphrase, thus the system itself needs to be secured adequately. (This can include separating the setup of intelmq itself on a different machine with only access to fill the database.)
  • We should pay attention to preventing the complete system from becoming an effective signature (or encryption) oracle. To explain: consider an attacker who will receive an automatic notification from our system. If this attacker can also trigger a warning via a feed we use, she may partly control which plaintext is signed (or, elsewhere, encrypted) and gets the automated result. Under some circumstances there is a small potential that this can be used for an adaptive chosen-plaintext attack.

Column Names

It is possible to define names for the CSV-columns in code. For instance in example_scripts/10shadowservercsv.py, the dictionary standard_column_titles maps event field names to column titles. These are used by most of the CSV formats later defined in table_formats. The formats specified there can still use special column titles if necessary.

Transformations

Currently, data is not transformed when it is being added to the CSV output.

Mailgen always removes the "UTC" notations from time stamps in time.source. It ensures that time stamps will always be UTC.
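A sketch of the described removal (illustrative only, not mailgen's code):

```python
def strip_utc_marker(timestamp):
    # Drop a trailing " UTC" marker; the value is understood to be UTC
    # either way, so the marker carries no information in the CSV output.
    suffix = " UTC"
    if timestamp.endswith(suffix):
        return timestamp[:-len(suffix)]
    return timestamp
```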

Testing

An easy way to test the actual sending of emails is to use Python's smtpd module running the DebuggingServer:

python3 -m smtpd -d -n -c DebuggingServer localhost:8025

(Don't forget to configure the corresponding SMTP host and port in your config.)

If you want to capture emails in Maildir format you can use https://pypi.org/project/dsmtpd/, e.g.

git clone git://github.com/matrixise/dsmtpd.git
cd dsmtpd
python3 -m dsmtpd -i localhost -p 8025 -d /path/to/Maildir

/path/to/Maildir has to be either an existing Maildir or non-existing, in which case it will be created by dsmtpd.

You can access the Maildir with mutt, for example:

mutt -f  /path/to/Maildir

Hint: By default Esc P will trigger mutt's <check-traditional-pgp> function, in case you want to check a no-MIME signature.

Test Suite

The test suite is split into two parts because some tests may fail depending on hardware specs (execution time) and their failure would not indicate errors per se.

The regular unit tests which must succeed can be started with make check; to run the complete test suite, use make check_all.

History

The intelmq-mailgen file was initially copied from https://github.com/certat/intelmq/blob/a29da5d798bd114535326ffdd2f5000c4b6a21e7/intelmq/bin/intelmqcli (revision from 2016-03-08).

intelmq-mailgen's People

Contributors

bernhard-herzog, bernhardreiter, dmth, gsiv, rolandgeider, swilde, th-certbund, wagner-intevation


intelmq-mailgen's Issues

Create a unique ticket number per email usable for help desks

Each email should have a unique ticket number.
It should be usable for help desks, this means:

  • Shorter and readable is better
  • should make it hard to guess the number of sent reports.

The idea is: use a prefix for the CERT (for cert-example, "CE"),
then an ISO-like date and a unique random number,
formatted to be readable over the phone.

Example
CE-20160818-1234-5678

Variants:

  • Using hexadecimal numbers one could save one character of length, which could be used
    to make the number shorter or to implement a simple checksum to prevent typos

Implementation ideas:

  • Use the postgresql database to have a table that marks the randomly chosen ticket ids
    to avoid collisions. Could be done by a postgresql internal function.
  • use a dict internal to one mailgen run, and limit the mailgen runs to 1 per day
    (or add the hours behind the date)
  • sync mailgen runs via an extra database (redis?) :)
  • add a small service that only draws new unique numbers.

Using PostgreSQL seems preferable in order not to introduce more dependencies.
If the round trip to the db or to the service becomes a problem, one could draw a couple of ids and cache them for later use.
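The proposed format could be sketched like this (hypothetical helper; the collision check against a table of used numbers, discussed above, is deliberately omitted):

```python
import secrets
from datetime import datetime, timezone

def draw_ticket_number(prefix="CE"):
    # CERT prefix, ISO-like date, then 8 random digits split 4-4 so the
    # number is readable over the phone (format CE-YYYYMMDD-XXXX-XXXX).
    # 10^8 possibilities per day; uniqueness would still have to be
    # enforced against a table of already-used numbers.
    today = datetime.now(timezone.utc).strftime("%Y%m%d")
    number = secrets.randbelow(10**8)
    return f"{prefix}-{today}-{number // 10000:04d}-{number % 10000:04d}"
```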

mailgen creates email without orig-date header

The "Date:" header (aka the orig-date field) is not only extremely useful, but also required according to RFC 5322, section 3.6 "Field Definitions".

But currently this header is missing from mails generated by mailgen. Mailgen should add this header, with the current date/time from the moment the mail is generated. For further semantics see the RFC.
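A sketch of the requested fix with Python's email library (illustrative; the addresses are placeholders, and whether mailgen uses this exact API is an assumption):

```python
from email.message import EmailMessage
from email.utils import formatdate

msg = EmailMessage()
msg["Subject"] = "Example notification"
msg["From"] = "sender@example.org"
msg["To"] = "recipient@example.org"
# RFC 5322 requires an orig-date; set it when the mail is generated.
msg["Date"] = formatdate(localtime=True)
msg.set_content("Example body.")
```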

Next step for testing, add instructions for an smtpd which saves the emails

The debugging smtpd of python3 only dumps the email to stdout.
For analysis, and for manual and automatic testing, it makes sense to save the emails
somewhere, maybe on disk in Maildir format, so that emails can be inspected by email
clients or other Python scripts.

I'll give it a shot.
Next steps:

  • look for an smtpd python module which can already do this?
  • see if there is a python module for maildir

logging: prepare for python standard configuration methods

Right now, only the logging_level for intelmqmail can be set via cb.main()
and the configuration file. This is close to how intelmq does it right now.

If logging is used in production with more requirements, it may make sense to

  1. resort to a logging configuration standard file that python's logging module accepts
  2. unify this with intelmq itself, because there should be a central place to configure logging for the whole solution (covering the core and all components)

'source_directives' JSON object not added to column "extra" from table "events"

Hi, I have a problem with inserting into the directives table, which is later used for sending mails. As far as I understood, this trigger on the events table adds a new row to the directives table through a few nested procedures:
"
create trigger events_insert_directive_trigger after
insert
on
public.events for each row execute procedure events_insert_directives_for_row();
"

events_insert_directives_for_row procedure:
"
CREATE OR REPLACE FUNCTION public.events_insert_directives_for_row()
RETURNS trigger
LANGUAGE plpgsql
SECURITY DEFINER
AS $function$
BEGIN
PERFORM directives_from_extra(NEW.id, NEW.extra);
RETURN NEW;
END
$function$
;
"

directives_from_extra procedure:
"
CREATE OR REPLACE FUNCTION public.directives_from_extra(event_id bigint, extra json)
RETURNS void
LANGUAGE plpgsql
AS $function$
DECLARE
json_directives JSON := extra -> 'certbund' -> 'source_directives';
directive JSON;
BEGIN
IF json_directives IS NOT NULL THEN
FOR directive
IN SELECT * FROM json_array_elements(json_directives) LOOP
PERFORM insert_directive(event_id, directive, 'source');
END LOOP;
END IF;
END
$function$
;
"

The last procedure searches for the JSON object "source_directives", but neither the CERT-bund Contact Database nor the CERT-bund Contact Rules expert seems to add this information to the "extra" column in the "events" table. This is how my "extra" column looks, formatted:
{
  "features": "cmd,stat_v2,shell_v2",
  "certbund": {
    "source_contacts": {
      "organisations": [
        {
          "import_source": "",
          "name": "test1",
          "id": 0,
          "managed": "manual",
          "sector": null,
          "contacts": [
            {
              "email": "[email protected]",
              "managed": "manual",
              "email_status": "enabled",
              "annotations": []
            }
          ],
          "annotations": []
        }
      ],
      "matches": [
        {
          "organisations": [0],
          "managed": "manual",
          "field": "asn",
          "annotations": []
        }
      ]
    }
  },
  "model": "SM-G960F",
  "name": "starltexx",
  "tag": "adb",
  "device": "starlte"
}

I only have source_contacts. I added the contact "[email protected]" from the fody application, but I didn't insert all info, only ASN and mail. Could this be the problem (which I doubt), or can you help me with finding where the CERT-bund expert should add the "source_directives" info to the "extra" column?

Thanks in advance.

mailgen uses only default formatting

The current implementation of mailgen can only create two variations
of mails:

  1. when the "format" field of the record in the notifications table
    contains "feed_specific" as value, the mail will be created from
    the "specific.txt" template, with CSV data containing
    `botnet_drone_csv_columns'.
  2. for any other value of "format", a mail will be generated from the
    template specified in the notification record, with CSV data containing
    `most_csv_columns'.

Looking at the code, line 599 ff., this comes as no surprise:

    if notification["format"] == "feed_specific":
        formatter = known_formatters['feed_specific'].get(('csv',
                                                           notification["feed_name"]))
        if formatter is not None:
            formatter = known_formatters['feed_specific'].get(('csv',
                                                               'DEFAULT'))

    else:
        formatter = known_formatters['generic'].get((notification["format"],
                                                     "GENERIC"))

Besides being hilarious, the "if formatter is not None:" check makes no sense
at all and should be deleted. The whole
mail_format_feed_specific_as_csv should be deleted, too, as there is
already a generic fallback...

Test data via database

In order to facilitate tests, we should have a database sql extract (part of a dump) that we can insert
and run mailgen on.

It would be good to have a few examples of all patterns represented in the database table.

Should be easy to produce from a test run with intelMQ inserting the events.

Modify CSV output format

As per customer request:

CSV output should:

  • be separated by commas
  • quote every field using quotation marks

I'll implement the change.

Support python3-gpg (for Ubuntu 20.04 LTS)

GnuPG has published official Python bindings.

The currently used GnuPG python bindings (pygpgme) are being phased out from GNU/Linux distributions and are not very actively maintained.

We shall support the official bindings (and test with Ubuntu 20.04 LTS)
additionally. The official bindings are in package python3-gpg (see https://packages.ubuntu.com/focal/python3-gpg).

Support for pygpgme can be dropped once Ubuntu 16.04 LTS stops getting maintenance updates (April 2021): https://ubuntu.com/about/release-cycle https://wiki.ubuntu.com/Releases

details on old library

It is https://launchpad.net/pygpgme, which would be the package python3-gpgme for Ubuntu 16.04 LTS (https://packages.ubuntu.com/search?suite=default&section=all&arch=any&keywords=python3-gpgme&searchon=names).

Handle unknown notification formats

We need to decide on how to handle notifications for which mailgen cannot determine the format to use for the message. E.g. if a notification specifies a format which mailgen simply does not know about, what should mailgen do?

Make ticket number unguessable

Split out from #28
Make the ticket number unguessable.

Implementation idea:
Add a table that keeps the used ticket ids,
draw new ones randomly until you find one that has not been in use.

Within our current design size:
100,000,000 possibilities for numbers per day,
and aiming for sending out 1,000,000 mails per day,
this process will need a redraw in at most 1% of cases.
So we are okay.
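The 1% figure checks out as a worst-case bound: even with all of the day's 1,000,000 numbers already taken, a fresh uniform draw collides with probability at most 1,000,000 / 100,000,000:

```python
# Worst-case collision chance for a single draw at the end of the day.
taken_per_day = 1_000_000        # mails sent per day
space_per_day = 100_000_000      # possible ticket numbers per day
worst_case = taken_per_day / space_per_day
assert worst_case == 0.01        # at most 1% of draws need a redraw
```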

Less attractive implementation idea:
Use a Feistel cipher, as suggested on the PostgreSQL wiki.
It is less clear how to prove that the cipher will create no collisions
and what would be needed (in terms of the used "round" function or "key")
to make it unguessable if the source is known.

enforce quoted-printable encoding for text/plain emails to be sent out - even if only 7-bit body

In the rare case that very long lines are in there, quoted-printable is necessary.
For the other cases for a text/plain body it is not strictly necessary,
but we have some requests to do quoted-printable there anyway.
(Probably because of the compatibility with some email receivers.)

== Technical Analysis:
The version of python3 on Debian Jessie does not apply quoted-printable to very long lines,
so enforcing it could make the solution more robust.
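With Python's modern email API the encoding can simply be forced; a sketch (whether mailgen can use this API here is an assumption):

```python
from email.message import EmailMessage

msg = EmailMessage()
# Request quoted-printable even for a plain 7-bit body.
msg.set_content("A short 7-bit body, encoded as quoted-printable anyway.",
                cte="quoted-printable")
```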

Documentation flaws

The documentation needs some cleanup and rework, two especially important points:

  • the documentation contains a rather incomprehensible (at least to me) section "Specific Templates".
  • Some CSV output formatters use the information from the classification identifier as the malware name.
    The documentation should explain this fact and point out that the modify expert should be
    used to fill the classification identifier with the wanted values.

non-interactive installation for apache password

The Debian packages should allow a non-interactive setting
of the intelmq Apache password. This is a precondition for automatic tests.

Solution idea:
Just generate a password and write it into the system-wide configuration file,
so admins can look it up.

OpenPGP/MIME signatures

Email receivers that want to get emails with attachments (e.g. x-arf or csv attachments) want an OpenPGP/MIME signature to be able to verify the sender in a standards-compliant way.

A module or code can be useful to implement parts of certtools/intelmq#534 .

technical

RFCs 2015 and 3156 define a MIME-compatible solution for OpenPGP-signed emails. The advantage is that encodings and MIME types will be handled nicely, even in the case that the mail user agent does not know about crypto.

Packaging-Debian: Leave system configuration untouched

@gsiv: does this part of debian/rules

'''sh
sed 's@/usr/local/lib/intelmq@/usr/lib/intelmq@' \
  debian/intelmq-mailgen/usr/share/doc/intelmq-mailgen/examples/intelmq-mailgen.conf.example \
  > debian/intelmq-mailgen/etc/intelmq/intelmq-mailgen.conf
'''

leave the system configuration untouched?
(The replacement also seems unnecessary right now.)

Allow parallel email creation to raise speed

Right now only one mailgen instance can run at a time,
and mailgen only uses one (Python) thread.

If email sending speed becomes an issue, it may be possible to enable
more email creation processes to work in parallel.

Possibilities:

  • allow starting several mailgen workers (from different machines)
  • use threads within mailgen (because sending and crypto will be I/O-bound from mailgen's side)
  • make sure crypto processes can run on a different thread/core.

These ideas would allow a machine with several cores to utilise them better.

If implemented, the sql interactions must be checked for race conditions.

Technically, the locking in the SQL selections should right now prevent
a second active mailgen script from running.

@bernhard-herzog
https://github.com/Intevation/intelmq-mailgen/blob/master/intelmq-mailgen#L715
has FOR UPDATE NOWAIT.
Does this prevent mailgen from running twice, like you've said?

package not installable because of missing pyxarf package

# apt install intelmq-mailgen
Reading package lists... Done
Building dependency tree       
Reading state information... Done
Some packages could not be installed. This may mean that you have
requested an impossible situation or if you are using the unstable
distribution that some required packages have not yet been created
or been moved out of Incoming.
The following information may help to resolve the situation:

The following packages have unmet dependencies:
 intelmq-mailgen : Depends: python3 (>= 3.6) but 3.5.3-1 is to be installed
                   Depends: gnupg (>= 2.2) but 2.1.18-8~deb9u4 is to be installed
                   Recommends: python3-pyxarf (>= 0.0.5) but it is not installable
E: Unable to correct problems, you have held broken packages.

The pyxarf package is not available in any Ubuntu repository.
It is available for xenial here: http://apt.intevation.de/dists/xenial/intelmq-testing/binary-amd64/
but that is pretty useless nowadays.
pyxarf is highly optional, so it should not be in Recommends either.

Sending out x-arf emails

Mailgen should be able to send out x-arf emails.
The specification is available from http://www.x-arf.org;
the question is which version: v0.2 or the v0.3 draft.

TODO: List x-arf senders and receivers. Look for example emails.

Reading of configuration files should be optional

To enable testing as a regular user, reading the system configuration file should be optional,
and if there is a complete system configuration file, there should be no need to additionally
require a local user one.

turn intelmq-mailgen into a module to help unit test single functions

The intelmq-mailgen file contains about 20 functions right now.
In order to be able to write unit tests for them, we should probably turn it into a module.

Within the tests directory I would then want to import the functions and write tests for them.

E.g. move the code into a directory "mailgen" with an __init__.py
and turn intelmq-mailgen into a file that just imports that module and runs it.

fast way to notice that a "load of events" has been processed (as indicator for sending)

A number of feeds come as one block, e.g. once a day. (This means the events will have the same time.observation value in intelmq.)

Each recipient wants to get one aggregated email with all notifications for this block, as fast and complete as possible.

technical implementation thoughts

@bernhard-herzog has implemented a way to notice when a directive was last inserted for a specific set of aggregation values: e.g. when aggregating on time.observation, if max(d.inserted_at) is 2 hours ago, we trigger sending the email.

This method has the drawback that, between the first and the last event of one load (or batch) for this set of aggregation values, it can take a long time before another directive enters the database, so the time interval has to be quite long to reliably detect that processing has finished.

This issue is about using a better detection mechanism that can detect the completion of processing faster, thus sending emails faster on average.

implementation idea

Use an extra table that, for each feed.name and time.observation, keeps the time of the last inserted directive. This way the email aggregation script can use a simple additional query to see with higher reliability that the batch has been processed fully.

Necessary implementation steps (roughly):

  • extend trigger to enter information in the new table
  • add new table to db schema
  • extend mail generation to use the new information
  • provide a way to deal with old pending notifications for the migration to the new system

Adding proper header file

The main file, in my view, needs a header stating the authors, copyright and license.

And, by the way: shouldn't the shebang be #!/usr/bin/env python3, to be sure?

Deb package build process no longer runs unit tests

Before issue #19, dpkg-buildpackage would run (a part of) mailgen's
test suite via its Makefile.

After the restructuring and switch to pybuild, this is no longer the
case:

I: pybuild base:170: cd /2auto/intelmq-mailgen/.pybuild/pythonX.Y_3.4/build; python3.4 -m unittest discover -v 

----------------------------------------------------------------------
Ran 0 tests in 0.000s

OK
I: pybuild base:170: cd /2auto/intelmq-mailgen/.pybuild/pythonX.Y_3.5/build; python3.5 -m unittest discover -v 

----------------------------------------------------------------------
Ran 0 tests in 0.000s

OK

Tests can still be run manually with make, but we should probably
fix this regression.
