
datanommer's Introduction

Datanommer

Datanommer is an application comprised of a Fedora Messaging consumer that places every message into a Postgres / TimescaleDB database.

It consists of three modules:

  • datanommer.consumer: the Fedora Messaging consumer that monitors the queue and places every message into the database
  • datanommer.models: the database models used by the consumer. These models are also used by Datagrepper, FMN, and fedbadges. Typically, to access the information stored in the database by datanommer, use the Datagrepper JSON API.
  • datanommer.commands: a set of command-line tools for use by developers and sysadmins.

Refer to the online documentation for details.

datanommer's People

Contributors

abitrolly, abompard, ari3s, cverna, dependabot[bot], encukou, gridhead, housewifehacker, hwmrocker, james02135, lenkaseg, lmacken, mergify[bot], mikebonnet, nbebout, olasd, pypingou, ralphbean, renovate[bot], rtnpro, ryanlerch, siddharthvipul, silverrain, sontek, stephencoady


datanommer's Issues

Use tox for running datanommer tests

Ensure the datanommer tests run under tox.

In addition to the unit tests, also ensure that tox runs the following checks (a sample configuration sketch follows the list):

  • bandit
  • flake8
  • black
  • liccheck
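
A sketch of what the tox configuration could look like; the env names, targets, and tool invocations are illustrative and would need adjusting to the actual repo layout:

[tox]
envlist = py3,bandit,flake8,black,liccheck

[testenv]
deps = pytest
commands = pytest

[testenv:bandit]
deps = bandit
commands = bandit -r .

[testenv:flake8]
deps = flake8
commands = flake8 .

[testenv:black]
deps = black
commands = black --check .

[testenv:liccheck]
deps = liccheck
commands = liccheck -r requirements.txt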

On bigger loads there is a problem with queries on the Postgres table

The error in the response was

    This probably means the server terminated abnormally
    before or while processing the request.
 'SELECT count(*) AS count_1 
  FROM (SELECT 
                  messages.id AS messages_id, 
                  messages.msg_id AS messages_msg_id, 
                  messages.i AS messages_i, 
                  messages.topic AS messages_topic, 
                  messages.timestamp AS messages_timestamp,
                  messages.certificate AS messages_certificate,
                  messages.signature AS messages_signature, 
                  messages.category AS messages_category, 
                  messages.username AS messages_username, 
                  messages.crypto AS messages_crypto, 
                  messages.source_name AS messages_source_name,
                  messages.source_version AS messages_source_version,
                  messages._msg AS messages__msg, 
                  messages._headers AS messages__headers
               FROM messages
               WHERE messages.timestamp BETWEEN %(timestamp_1)s AND %(timestamp_2)s 
                     AND messages.topic = %(topic_1)s) AS anon_1' {'timestamp_2': datetime.datetime(2021, 1, 22, 11, 13, 3), 'topic_1': u'org.fedoraproject.prod.bodhi.update.status.testing.koji-build-group.build.complete', 'timestamp_1': datetime.datetime(2021, 1, 22, 10, 56, 2)}

and

 Can't reconnect until invalid transaction is rolled back (original cause: InvalidRequestError: Can't reconnect until invalid transaction is rolled back) u'
SELECT count(*) AS count_1 
FROM (SELECT messages.id AS messages_id, messages.msg_id AS messages_msg_id, messages.i AS messages_i, messages.topic AS messages_topic, messages.timestamp AS messages_timestamp, messages.certificate AS messages_certificate, messages.signature AS messages_signature, messages.category AS messages_category, messages.username AS messages_username, messages.crypto AS messages_crypto, messages.source_name AS messages_source_name, messages.source_version AS messages_source_version, messages._msg AS messages__msg, messages._headers AS messages__headers 
FROM messages \nWHERE messages.timestamp BETWEEN %(timestamp_1)s AND %(timestamp_2)s AND messages.topic = %(topic_1)s) AS anon_1' [immutabledict({})]

Introduce the use of the alembic tool.

The problem: if we make changes to datanommer's database models in development, we then have to update the production database schema by hand, or wipe the existing data and start over, each time we modify the schema. This is not acceptable for a tool that is meant to provide a consistent, reliable archive of the fedmsg bus.

Solution: We should use alembic for versioning datanommer's database. http://alembic.readthedocs.org/en/latest/
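
A rough sketch of the workflow this would give us (directory names and revision messages are placeholders):

# one-time setup: create the migration environment alongside datanommer.models
alembic init alembic

# after changing the models, generate a versioned migration script
alembic revision --autogenerate -m "describe the schema change"

# apply any pending migrations to the (production) database
alembic upgrade head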

Rework datanommer-stats to work with single message table

datanommer-stats used to show the number of messages in each table in the db.

Now that we have only one table/model, it should output the number of messages in each category.

Maybe if you passed --topic it could show the number of messages in each topic, i.e. much more fine-grained.
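
A minimal sketch of the per-category count, assuming the session and Message model exposed by datanommer.models:

import datanommer.models as m
from sqlalchemy import func

def category_counts():
    # number of messages per category, e.g. [("bodhi", 12345), ("git", 678)]
    query = m.session.query(
        m.Message.category, func.count(m.Message.id)
    ).group_by(m.Message.category)
    return query.all()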

Convert datanommer.models to store fedora-messaging style messages

Currently, datanommer.models stores messages in the database in the fedmsg format. The metadata (dates, topics, etc.) that it stores as additional fields is also extracted from the fedmsg-style message.

When we implement #132, the messages that we add to the database via the model's add() function will need to be properly handled in that function.

Note too that this issue is dependent on the outcome of the #128 spike ticket.
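
A sketch of what the reworked add() might look like; the column names follow the messages table shown in the query above, the model constructor keywords are assumptions, and the headers attribute on the fedora-messaging Message object is assumed here:

import datetime

import datanommer.models as m

def add(message):
    # `message` is a fedora_messaging.api.Message
    m.session.add(
        m.Message(
            msg_id=message.id,
            topic=message.topic,
            timestamp=datetime.datetime.utcnow(),
            msg=message.body,          # stored in the _msg column
            headers=message._headers,  # stored in the _headers column (assumed attribute)
        )
    )
    m.session.commit()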

Add dependabot

And configure it to merge minor and patch updates automatically, with mergify (see #127)
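
A sketch of the Dependabot configuration (paths and schedule are placeholders; the mergify auto-merge rule from #127 is not shown):

# .github/dependabot.yml
version: 2
updates:
  - package-ecosystem: "pip"
    directory: "/"
    schedule:
      interval: "weekly"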

datanommer-dump should print out valid JSON

Currently, each individual message that it prints out is valid JSON, but the whole output is not a valid JSON blob.

The test of this would be if you could do the following at the shell:

datanommer-dump > myfile.json

and then do the following in Python:

import json
with open("myfile.json") as f:
    my_object = json.loads(f.read())

Setup coverage

Set it to fail the test run if the coverage gets below the current value.
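
One way to do this is pytest with the pytest-cov plugin (a sketch; the threshold below is a placeholder for whatever the current value is):

pytest --cov=datanommer --cov-fail-under=90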

do not enable datanommer by default

After installing datanommer on F19, it is enabled by default in /etc/fedmsg.d/datanommer.py and run when I start fedmsg-hub. This is unexpected as it needs configuration and is insecure by default.

Convert datanommer.consumer to use fedora-messaging

Currently the datanommer consumer consumes messages from the fedmsg queue. We still get all messages thanks to the bridges, but we need to convert the consumer to use fedora-messaging.

The likely approach here will be to convert the consumer code to a fedora-messaging callback (https://fedora-messaging.readthedocs.io/en/stable/consuming.html#the-callback)

and run the consumer with a command like:

fedora-messaging consume --callback=datanommer.consumer:Nommer
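
A minimal sketch of such a callback; the add() helper and session in datanommer.models are assumed here, and error handling is omitted:

import datanommer.models as m

class Nommer:
    """Store every fedora-messaging message that arrives on the queue."""

    def __call__(self, message):
        # `message` is a fedora_messaging.api.Message
        m.add(message)
        m.session.commit()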

pip install fails

When I try to install datanommer via pip it fails:

# pip3 install datanommer
Collecting datanommer
  Downloading datanommer-0.3.0.tar.gz
    Complete output from command python setup.py egg_info:
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/tmp/pip-build-7h6uph24/datanommer/setup.py", line 24, in <module>
        long_description = f.read().strip()
      File "/usr/lib/python3.6/encodings/ascii.py", line 26, in decode
        return codecs.ascii_decode(input, self.errors)[0]
    UnicodeDecodeError: 'ascii' codec can't decode byte 0xc2 in position 2393: ordinal not in range(128)
    
    ----------------------------------------
Command "python setup.py egg_info" failed with error code 1 in /tmp/pip-build-7h6uph24/datanommer/

This also happens with the current master:

root@fedmsg-test:~# pip3 install https://github.com/fedora-infra/datanommer.git
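
The traceback points at the long_description read in setup.py failing on a non-ASCII byte under an ASCII locale. A likely fix (a sketch; the filename is illustrative) is to read the file as UTF-8 explicitly:

import io

# decode the long description as UTF-8 regardless of the locale default
with io.open("README.rst", encoding="utf-8") as f:
    long_description = f.read().strip()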

How to store messages in the datanommer db?

Currently, messages (whether sent via fedmsg or fedora-messaging) are stored in the datanommer db in the fedmsg format (e.g. https://apps.fedoraproject.org/datagrepper/id?id=2021-01b823ed-6b7c-480c-a4a4-4af0670692ef&is_raw=true&size=extra-large). However, this is not the native fedora-messaging format.

After we convert datanommer to fedora-messaging, do we:

  • store messages from that point in the fedora-messaging format, but leave all the old ones in the old format?
  • convert all the old messages to the fedora-messaging format?
  • convert all new messages to the fedmsg format and keep doing that for all time?

Add unit tests for the grep method

The grep method in datanommer.models is shaping up to be the one super awesome api call that anyone and everyone will need.

It also currently doesn't have any unit tests associated with it. We should write some so that we can guarantee API stability.
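
A sketch of one such test; it assumes grep() returns a (total, pages, messages) tuple and that a fixture has already stored exactly one message on the topic used below:

import datanommer.models as m

def test_grep_by_topic():
    topic = "org.fedoraproject.dev.datanommer.example"
    # assumes the test database contains exactly one message on this topic
    total, pages, messages = m.Message.grep(topics=[topic])
    assert total == 1
    assert messages[0].topic == topic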

Database timestamps are stored using the local timezone

Hi folks,

datanommer seems to store the messages using the local timezone instead of UTC. That looks wrong to me, but maybe there's a reason.

I can come up with a patch if you'd like. Depending on the deployment, there might be a need for a migration too.

Cheers,
Nicolas
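
A sketch of one possible fix on the model side (illustrative, not datanommer's actual column definition); a data migration would still be needed for existing rows:

from datetime import datetime, timezone

from sqlalchemy import Column, DateTime

# store timezone-aware UTC timestamps instead of naive local times
timestamp = Column(
    DateTime(timezone=True),
    default=lambda: datetime.now(timezone.utc),
    index=True,
)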

Create a fedmsg "replayer"

The name is up for debate, but we need a tool that can take a slice of datanommer's history and export it to a file-based format and then another tool to read that file and "replay" those history messages on the bus.

The idea is that we want to be able to simulate events/interactions from our production environment, in our staging environment.

It should be able to perform the replay either "in time" or "instantaneously".

create alembic upgrade script to update old topics to new ones

Before we standardized on git.receive for all pkgs.fp.o git receives, we used topics like git.receive.pkgname.f18.

We should have an alembic upgrade script to update all the old topic names into new ones, so that when people search a topic they actually get the full history.

(via #fedora-apps, 2013-08-31 18:20 UTC)
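
A sketch of what that migration could do, using Alembic's op.execute and PostgreSQL's regexp_replace; the exact rewrite rule is illustrative and would need to be checked against the real old topics:

from alembic import op

def upgrade():
    # collapse old per-package topics like ...git.receive.pkgname.f18
    # down to the standardized ...git.receive topic
    op.execute(
        r"UPDATE messages SET topic = "
        r"regexp_replace(topic, '\.git\.receive\..+$', '.git.receive')"
    )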

Make the category column non nullable.

And give it some default value, whether that's a string literal like "unsorted" or some dynamic default (where it figures out the right category if none is provided).
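
A sketch of the corresponding Alembic migration, using the "unsorted" string-literal option mentioned above:

from alembic import op

def upgrade():
    # backfill existing NULLs, then tighten the constraint
    op.execute("UPDATE messages SET category = 'unsorted' WHERE category IS NULL")
    op.alter_column(
        "messages",
        "category",
        nullable=False,
        server_default="unsorted",
    )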

Deploy latest datanommer to staging

A couple of things need to happen for this. Here's a checklist:

get @HousewifeHacker some rights!

package up the latest datanommer

  • Double check that we're ready to do a release.
  • Do a git-flow release with "$ git flow release start $VERSION_NUMBER"
  • Upload to pypi with "python setup.py sdist upload"
  • Update the datanommer rpms with the "fedpkg" command
  • Create bodhi updates for the new rpms
  • To avoid waiting a day for those updates to get mashed into the epel-test repository, have threebean copy them into the infrastructure-testing repo by hand.
  • Stop the fedmsg-hub (which will stop the datanommer consumer)
  • Do a "$ yum update python-datanommer-*" on busgateway01.stg.phx2.fedoraproject.org
  • Run alembic update scripts to upgrade the postgresql db
  • Run the datanommer-latest command to make sure it works
  • Restart the fedmsg-hub (and the datanommer consumer with it)

Add two new fields to every data model

As of fedmsg-0.6.0, the fedmsg.text module provides two functions, get_relevant_usernames and get_relevant_packages, that produce, well, lists of relevant metadata about messages. It would definitely make sense to store these in each table alongside messages.

Shoulds:

  • You should modify the base model and add the two columns.
  • You should use a sqlalchemy mechanism to set the values to sensible defaults based on the output of fedmsg.text (see the sketch after this list).
  • The change to the database should be logged in alembic so we can upgrade the production database with this change.

Note that the ticket on introducing alembic should be done before we can move ahead with this one.
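
A sketch of the sqlalchemy mechanism for the defaults, using a before_insert event listener; the fedmsg.text call signatures, the msg attribute on the model, and the comma-joined storage format are all assumptions:

import fedmsg.text
from sqlalchemy import event

import datanommer.models as m

def _set_relevant_metadata(mapper, connection, target):
    # populate the two new columns just before the row is written
    msg = {"topic": target.topic, "msg": target.msg}
    target.usernames = ",".join(fedmsg.text.get_relevant_usernames(msg))
    target.packages = ",".join(fedmsg.text.get_relevant_packages(msg))

# attach to the base model so every per-topic model gets the behaviour
event.listen(m.BaseMessage, "before_insert", _set_relevant_metadata, propagate=True)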

Per-file licensing info missing

Hi folks,

It seems that none of the datanommer files have a licensing / copyright header. I assume all of it is GPLv3+ but, well, I'd like to be sure :)

Thanks,
Nicolas

insecure temp database

datanommer uses /tmp/datanommer.db as default database and is enabled by default in Fedora 19. This might lead to security problems.

datanommer.commands test does not pass in Python 3

There is a Python 3 compatibility issue related to json_object in datanommer.commands:

[datanommer.commands]$ python3 setup.py test
running test
Searching for datanommer.models
Best match: datanommer.models 0.9.0
Processing datanommer.models-0.9.0-py3.6.egg
...

======================================================================
ERROR: test_latest (test_commands.TestCommands)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/jberan/git/datanommer/datanommer.commands/tests/test_commands.py", line 812, in test_latest
    eq_(json_object[0]['git']['msg'], 'Message 3')
KeyError: 'git'

======================================================================
FAIL: test_latest_timesince (test_commands.TestCommands)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/jberan/git/datanommer/datanommer.commands/.eggs/freezegun-0.3.9-py3.6.egg/freezegun/api.py", line 495, in wrapper
    result = func(*args, **kwargs)
  File "/home/jberan/git/datanommer/datanommer.commands/tests/test_commands.py", line 709, in test_latest_timesince
    assert int(json_object[0])<=1.1
AssertionError

======================================================================
FAIL: test_latest_timesince_human (test_commands.TestCommands)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/jberan/git/datanommer/datanommer.commands/tests/test_commands.py", line 765, in test_latest_timesince_human
    assert_not_in('day', json_object[0])
AssertionError: 'day' unexpectedly found in '1 day, 0:00:00.046187'

======================================================================
FAIL: test_latest_timestamp (test_commands.TestCommands)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/jberan/git/datanommer/datanommer.commands/.eggs/freezegun-0.3.9-py3.6.egg/freezegun/api.py", line 495, in wrapper
    result = func(*args, **kwargs)
  File "/home/jberan/git/datanommer/datanommer.commands/tests/test_commands.py", line 655, in test_latest_timestamp
    eq_(json_object[0], time.mktime(datetime(2013,2,16).timetuple()))
AssertionError: 1360882800.0 != 1360969200.0

======================================================================
FAIL: test_latest_timestamp_human (test_commands.TestCommands)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/jberan/git/datanommer/datanommer.commands/.eggs/freezegun-0.3.9-py3.6.egg/freezegun/api.py", line 495, in wrapper
    result = func(*args, **kwargs)
  File "/home/jberan/git/datanommer/datanommer.commands/tests/test_commands.py", line 603, in test_latest_timestamp_human
    eq_(json_object[0], "2013-02-16 16:16:16.000016")
AssertionError: '2013-02-15 15:15:15.000015' != '2013-02-16 16:16:16.000016'

----------------------------------------------------------------------
Ran 15 tests in 0.992s

FAILED (failures=4, errors=1)
Test failed: <unittest.runner.TextTestResult run=15 errors=1 failures=4>
error: Test failed: <unittest.runner.TextTestResult run=15 errors=1 failures=4>

datanommer does not create schema on startup

$ fedmsg-hub 
[moksha.hub] INFO 2013-08-25 20:39:37,681 Loading the Moksha Hub
Loading the Moksha Hub
[moksha.hub] WARNING 2013-08-25 20:39:37,681 No 'zmq_publish_endpoints' set.  Are you sure?
No 'zmq_publish_endpoints' set.  Are you sure?
[moksha.hub] INFO 2013-08-25 20:39:38,683 Loading Consumers
Loading Consumers
[moksha.hub] WARNING 2013-08-25 20:39:38,801 Failed to load 'fedmsg-tweet' consumer.
Failed to load 'fedmsg-tweet' consumer.
[moksha.hub] WARNING 2013-08-25 20:39:38,801 No module named bitlyapi
No module named bitlyapi
  enabled by config  - datanommer.consumer:Nommer
* disabled by config - fedmsg.consumers.relay:RelayConsumer
[moksha.hub] WARNING 2013-08-25 20:39:38,920 EntryPoint.parse('fedmsg-ircbot = fedmsg.consumers.ircbot:IRCBotConsumer') didn't initialize correctly.  Did you call super(..).__init__?
EntryPoint.parse('fedmsg-ircbot = fedmsg.consumers.ircbot:IRCBotConsumer') didn't initialize correctly.  Did you call super(..).__init__?
* disabled by config - fedmsg.consumers.dummy:DummyConsumer
[moksha.hub] WARNING 2013-08-25 20:39:38,920 EntryPoint.parse('fedmsg-ircbot = fedmsg.consumers.ircbot:IRCBotConsumer') didn't initialize correctly.  Did you call super(..).__init__?
EntryPoint.parse('fedmsg-ircbot = fedmsg.consumers.ircbot:IRCBotConsumer') didn't initialize correctly.  Did you call super(..).__init__?
* disabled by config - fedmsg.consumers.gateway:GatewayConsumer
[moksha.hub] WARNING 2013-08-25 20:39:38,921 EntryPoint.parse('fedmsg-ircbot = fedmsg.consumers.ircbot:IRCBotConsumer') didn't initialize correctly.  Did you call super(..).__init__?
EntryPoint.parse('fedmsg-ircbot = fedmsg.consumers.ircbot:IRCBotConsumer') didn't initialize correctly.  Did you call super(..).__init__?
* disabled by config - fedmsg.consumers.ircbot:IRCBotConsumer
[moksha.hub] WARNING 2013-08-25 20:39:39,045 EntryPoint.parse('fedmsg-ircbot = fedmsg.consumers.ircbot:IRCBotConsumer') didn't initialize correctly.  Did you call super(..).__init__?
EntryPoint.parse('fedmsg-ircbot = fedmsg.consumers.ircbot:IRCBotConsumer') didn't initialize correctly.  Did you call super(..).__init__?
[moksha.hub] INFO 2013-08-25 20:39:39,046 Loading Producers
Loading Producers
[moksha.hub] INFO 2013-08-25 20:39:39,048 Running the MokshaHub reactor
Running the MokshaHub reactor
/usr/lib64/python2.7/site-packages/sqlalchemy/engine/default.py:459: SAWarning: Unicode type received non-unicode bind param value.
  param.append(processors[key](compiled_params[key]))
Got error: (OperationalError) no such table: messages u'INSERT INTO messages (i, topic, timestamp, certificate, signature, category, _msg) VALUES (?, ?, ?, ?, ?, ?, ?)' (1, u'org.fedoraproject.prod.buildsys.repo.done', '2013-08-25 20:39:48.565850', u'LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSUVTVENDQTdLZ0F3SUJBZ0lCVkRBTkJna3Fo\na2lHOXcwQkFRVUZBRENCb0RFTE1Ba0dBMVVFQmhNQ1ZWTXgKQ3pBSkJnTlZCQWdUQWs1RE1SQXdE\nZ1lEVlFRSEV3ZFNZV3hsYVdkb01SY3dGUVlEVlFRS0V3NUdaV1J2Y21FZwpVSEp2YW1WamRERVBN\nQTBHQTFVRUN4TUdabVZrYlhObk1ROHdEUVlEVlFRREV3Wm1aV1J0YzJjeER6QU5CZ05WCkJDa1RC\nbVpsWkcxelp6RW1NQ1FHQ1NxR1NJYjNEUUVKQVJZWFlXUnRhVzVBWm1Wa2IzSmhjSEp2YW1WamRD\nNXYKY21jd0hoY05NVE13TVRJNE1Ua3lORFU1V2hjTk1qTXdNVEkyTVRreU5EVTVXakNCMkRFTE1B\na0dBMVVFQmhNQwpWVk14Q3pBSkJnTlZCQWdUQWs1RE1SQXdEZ1lEVlFRSEV3ZFNZV3hsYVdkb01S\nY3dGUVlEVlFRS0V3NUdaV1J2CmNtRWdVSEp2YW1WamRERVBNQTBHQTFVRUN4TUdabVZrYlhObk1T\nc3dLUVlEVlFRREV5SnJiMnBwTFd0dmFta3cKTXk1d2FIZ3lMbVpsWkc5eVlYQnliMnBsWTNRdWIz\nSm5NU3N3S1FZRFZRUXBFeUpyYjJwcExXdHZhbWt3TXk1dwphSGd5TG1abFpHOXlZWEJ5YjJwbFkz\nUXViM0puTVNZd0pBWUpLb1pJaHZjTkFRa0JGaGRoWkcxcGJrQm1aV1J2CmNtRndjbTlxWldOMExt\nOXlaekNCbnpBTkJna3Foa2lHOXcwQkFRRUZBQU9CalFBd2dZa0NnWUVBNVV6aVd2NlYKQjNjVW1w\nLzh1VEJQTzlCZlpNRmRpVnhjalVwR1ZJQWNWSjMwaTMrV2RiVlFpbW5HUS95TmhZTktVcGQrdkgr\nbApXUE12Z3Zaa1NDL1F2K1pjRS9hYUlqOEoxdDVZVFQxa090SWRIaGdldURJK3JpRmJmZWZoOHBC\nZEUxeVVaVDZiCkptcWlLeHBpTmNOT0YyaU96OXQ1b21vT01nZEJoMjJya0FrQ0F3RUFBYU9DQVZj\nd2dnRlRNQWtHQTFVZEV3UUMKTUFBd0xRWUpZSVpJQVliNFFnRU5CQ0FXSGtWaGMza3RVbE5CSUVk\nbGJtVnlZWFJsWkNCRFpYSjBhV1pwWTJGMApaVEFkQmdOVkhRNEVGZ1FVZjJVQzA3U1hPTGlEVDU1\nM0MyZ1l2eDB3ZWw4d2dkVUdBMVVkSXdTQnpUQ0J5b0FVCmEwQmErUklJaVZubldlVUY5UUlkQ2s1\nL0ZBQ2hnYWFrZ2FNd2dhQXhDekFKQmdOVkJBWVRBbFZUTVFzd0NRWUQKVlFRSUV3Sk9RekVRTUE0\nR0ExVUVCeE1IVW1Gc1pXbG5hREVYTUJVR0ExVUVDaE1PUm1Wa2IzSmhJRkJ5YjJwbApZM1F4RHpB\nTkJnTlZCQXNUQm1abFpHMXpaekVQTUEwR0ExVUVBeE1HWm1Wa2JYTm5NUTh3RFFZRFZRUXBFd1pt\nClpXUnRjMmN4SmpBa0Jna3Foa2lHOXcwQkNRRVdGMkZrYldsdVFHWmxaRzl5WVhCeWIycGxZM1F1\nYjNKbmdna0EKNDFBZVIwOFhIa1V3RXdZRFZSMGxCQXd3Q2dZSUt3WUJCUVVIQXdJd0N3WURWUjBQ\nQkFRREFnZUFNQTBHQ1NxRwpTSWIzRFFFQkJRVUFBNEdCQUFBQkhRMEUva3RtYUEyR1doUUM4amFm\neVBHSUdiVjR5a1dMMlJIQndoOHp1eHdSClVibHJFN3BnT3NtdWo4MWd6VVRJdE80WDh5dFV2cDVo\nM2xTMmU2bE8vL05VeWNPanYzcG95Y21BTzc2NlVFTnMKaVhMN2RBMUkwdllQT0xsMGVpTGQrbUI2\nRjdZODhIK2E4S3hJeWxPYXVRK3BwaC9POVJTZ1d2TFpIZGpECi0tLS0tRU5EIENFUlRJRklDQVRF\nLS0tLS0K\n', u'xGFlfhctbhqtaQ4rR+zBlEwb7ld976cyw+xGwaXydETfTAdaRGpu8ER6ZEQDhESAz7imIznIkiFJ\nlZ+YfewTd2aZSDkQ4y3ZI/D+k7islk0O4K6gm+PzCDfGakRohlO1/+m2C//sBQxmLwZ5LkxyX9Q7\n+2dZQiDOxKYcO7gsy6s=\n', 'buildsys', '{"repo_id":316125,"tag":"f18-build","tag_id":209}')
^C[moksha.hub] INFO 2013-08-25 20:39:54,473 MokshaHub reactor stopped
MokshaHub reactor stopped
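
One possible fix (a sketch; whether models.init() accepts a create flag like this is an assumption) is for the consumer to create the schema when it starts up:

import datanommer.models as m

# in the consumer's initialization, before the first message is nommed
m.init(uri=self.hub.config["datanommer.sqlalchemy.url"], create=True)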

Get the filters list from fedmsg.meta

In datanommer.models, BaseMessage defines a list of filters. Since it is hardcoded there, every time a new message source comes online, we'll have to go back to that code and edit it.

The fedmsg.meta module provides a list of message types and methods for extracting metadata about them. Maybe we can use it somehow to build that filters list dynamically.
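
A sketch of building the list dynamically; it assumes the processors registered by fedmsg.meta expose a usable __name__ attribute:

import fedmsg.config
import fedmsg.meta

def build_filters():
    config = fedmsg.config.load_config()
    fedmsg.meta.make_processors(**config)
    # one filter per registered message processor (bodhi, git, koji, ...)
    return [proc.__name__ for proc in fedmsg.meta.processors]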

Add a uuid field to the models

And an alembic upgrade script that assigns every message that doesn't have a uuid with a random one.

Then, add a classmethod to Message that allows us to query for a unique message.

Then, add an extra uuid option to the grep classmethod.
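
A sketch of the upgrade portion of that migration (SQLAlchemy 1.4-style statements; the column name and length are illustrative):

import uuid

import sqlalchemy as sa
from alembic import op

def upgrade():
    op.add_column("messages", sa.Column("uuid", sa.Unicode(36)))
    connection = op.get_bind()
    # assign a random uuid to every existing message that lacks one
    rows = connection.execute(sa.text("SELECT id FROM messages WHERE uuid IS NULL"))
    for (row_id,) in rows:
        connection.execute(
            sa.text("UPDATE messages SET uuid = :u WHERE id = :i"),
            {"u": str(uuid.uuid4()), "i": row_id},
        )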

Add a datanommer-latest command

It should just print the JSON of the latest message to be stored.

You would need to:

  • Add a new command to the commands.py file.
  • Register that command along with the others in setup.py's entry-points section.

The new command would need to use sqlalchemy to find the last message added and then print it out just like datanommer-dump does (except it should print out just the last message, not all of them).
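
A sketch of the command, assuming the session exposed by datanommer.models and a __json__() helper on the Message model (both assumptions):

import json

import datanommer.models as m

def latest(uri):
    """Print the JSON of the most recently stored message."""
    m.init(uri)
    message = (
        m.session.query(m.Message)
        .order_by(m.Message.timestamp.desc())
        .first()
    )
    print(json.dumps(message.__json__()))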

Add development instructions

Add a set of instructions to the README for how to hack on datanommer.

Right now, the only instructions are how to install it from yum. It would be useful for people who want to develop on it to know how to set it up from git.
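
A sketch of what those instructions could cover (the virtualenv workflow below is an assumption, not the project's documented process):

git clone https://github.com/fedora-infra/datanommer.git
cd datanommer
python3 -m venv .venv && source .venv/bin/activate
pip install -e datanommer.models -e datanommer.commands -e datanommer.consumer
pytest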

Add "source" column in data table

A "source" column needs to be added to the data table.

Data put in the table by datanommer should have that field set to "datanommer"; this is so we can see what inserted which data (older data would be put in with a script, for example).

Investigate work needed for timescale migration

Things to figure out:

  • installation and configuration of the timescaledb extension
  • schema
  • initial setup of the schema (like Metadata.create_all()); see the sketch after this list
  • migration of the schema when it's using timescaledb
  • migration of the data from the current production schema to the new schema
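
A sketch of the initial TimescaleDB setup; the time column name follows the current messages schema, and any chunking options are left at their defaults:

-- enable the extension, then turn the messages table into a hypertable
CREATE EXTENSION IF NOT EXISTS timescaledb;
SELECT create_hypertable('messages', 'timestamp');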
