
datasette-publish-fly's Introduction

datasette-publish-fly


Datasette plugin for deploying Datasette instances to Fly.io.

Project background: Using SQLite and Datasette with Fly Volumes

Installation

Install this plugin in the same environment as Datasette.

$ datasette install datasette-publish-fly

Deploying read-only data

First, install the flyctl command-line tool by following their instructions.

Run flyctl auth signup to create an account there, or flyctl auth login if you already have one.

You can now use datasette publish fly to publish one or more SQLite database files:

datasette publish fly my-database.db --app="my-data-app"

The argument you pass to --app will be used for the URL of your application: my-data-app.fly.dev.

To update an application, run the publish command passing the same application name to the --app option.

Fly have a free tier, beyond which they will charge you monthly for each application you have live. Details of their pricing can be found on their site.

Your application will be deployed at https://your-app-name.fly.dev/ - be aware that it may take several minutes to start working the first time you deploy it.

Using Fly volumes for writable databases

Fly Volumes provide persistent disk storage for Fly applications. Volumes can be 1GB or more in size and the Fly free tier includes 3GB of volume space.

Datasette plugins such as datasette-uploads-csvs and datasette-tiddlywiki can be deployed to Fly and store their mutable data in a volume.

⚠️ You should only run a single instance of your application if your database accepts writes. Fly has excellent support for running multiple instances in different geographical regions, but datasette-publish-fly with volumes is not yet compatible with that model. You should probably use Fly PostgreSQL instead.

Here's how to deploy datasette-tiddlywiki with authentication provided by datasette-auth-passwords.

First, you'll need to create a root password hash to use to sign into the instance.

You can do that by installing the plugin and running the datasette hash-password command, or by using this hosted tool.

The hash should look like pbkdf2_sha256$... - you'll need this for the next step.
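The hash format can be illustrated with a short sketch. This is an assumption based on the Django-style pbkdf2_sha256 scheme (`pbkdf2_sha256$iterations$salt$base64digest`); for real deployments generate the hash with `datasette hash-password` rather than rolling your own:

```python
# Illustrative sketch of the pbkdf2_sha256 hash format only - use
# `datasette hash-password` for real deployments.
import base64
import hashlib
import secrets

def pbkdf2_sha256_hash(password, iterations=260000):
    salt = secrets.token_hex(16)
    digest = hashlib.pbkdf2_hmac(
        "sha256", password.encode("utf-8"), salt.encode("utf-8"), iterations
    )
    encoded = base64.b64encode(digest).decode("ascii")
    return f"pbkdf2_sha256${iterations}${salt}${encoded}"

print(pbkdf2_sha256_hash("hunter2").split("$")[0])  # pbkdf2_sha256
```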

In this example we're also deploying a read-only database called content.db.

Pick a name for your new application, then run the following:

datasette publish fly \
content.db \
--app your-application-name \
--create-volume 1 \
--create-db tiddlywiki \
--install datasette-auth-passwords \
--install datasette-tiddlywiki \
--plugin-secret datasette-auth-passwords root_password_hash 'pbkdf2_sha256$...'

This will create the new application, deploy the content.db read-only database, create a 1GB volume for that application, create a new database in that volume called tiddlywiki.db, then install the two plugins and configure the password you specified.

Updating applications that use a volume

Once you have deployed an application using a volume, you can update that application without needing the --create-volume or --create-db options. To add the datasette-graphql plugin to your deployed application you would run the following:

datasette publish fly \
content.db \
--app your-application-name \
--install datasette-auth-passwords \
--install datasette-tiddlywiki \
--install datasette-graphql \
--plugin-secret datasette-auth-passwords root_password_hash 'pbkdf2_sha256$...'

Since the application name is the same you don't need the --create-volume or --create-db options - these are persisted automatically between deploys.

You do need to specify the full list of plugins that you want to have installed, and any plugin secrets.

You also need to include any read-only database files that are part of the instance - content.db in this example - otherwise the new deployment will not include them.

Advanced volume usage

datasette publish fly will add a volume called datasette to your Fly application. You can customize the name using the --volume-name custom_name option.

Fly can be used to scale applications to run multiple instances in multiple regions around the world. This works well with read-only Datasette but is not currently recommended for Datasette with volumes, since each Fly replica would need its own volume and data stored in one instance would not be visible in the others.

If you want to use multiple instances with volumes you will need to switch to using the flyctl command directly. The --generate-dir option, described below, can help with this.

Generating without deploying

Use the --generate-dir option to generate a directory that can be deployed to Fly rather than deploying directly:

datasette publish fly my-database.db \
  --app="my-generated-app" \
  --generate-dir /tmp/deploy-this

You can then manually deploy your generated application using the following:

cd /tmp/deploy-this
flyctl apps create my-generated-app
flyctl deploy

datasette publish fly --help

Usage: datasette publish fly [OPTIONS] [FILES]...

  Deploy an application to Fly that runs Datasette against the provided database
  files.

  Usage example:

      datasette publish fly my-database.db --app="my-data-app"

  Full documentation: https://datasette.io/plugins/datasette-publish-fly

Options:
  -m, --metadata FILENAME         Path to JSON/YAML file containing metadata to
                                  publish
  --extra-options TEXT            Extra options to pass to datasette serve
  --branch TEXT                   Install datasette from a GitHub branch e.g.
                                  main
  --template-dir DIRECTORY        Path to directory containing custom templates
  --plugins-dir DIRECTORY         Path to directory containing custom plugins
  --static MOUNT:DIRECTORY        Serve static files from this directory at
                                  /MOUNT/...
  --install TEXT                  Additional packages (e.g. plugins) to install
  --plugin-secret <TEXT TEXT TEXT>...
                                  Secrets to pass to plugins, e.g. --plugin-
                                  secret datasette-auth-github client_id xxx
  --version-note TEXT             Additional note to show on /-/versions
  --secret TEXT                   Secret used for signing secure values, such as
                                  signed cookies
  --title TEXT                    Title for metadata
  --license TEXT                  License label for metadata
  --license_url TEXT              License URL for metadata
  --source TEXT                   Source label for metadata
  --source_url TEXT               Source URL for metadata
  --about TEXT                    About label for metadata
  --about_url TEXT                About URL for metadata
  --spatialite                    Enable SpatiaLite extension
  --region TEXT                   Fly region to deploy to, e.g. sjc - see
                                  https://fly.io/docs/reference/regions/
  --create-volume INTEGER RANGE   Create and attach volume of this size in GB
                                  [x>=1]
  --create-db TEXT                Names of read-write database files to create
  --volume-name TEXT              Volume name to use
  -a, --app TEXT                  Name of Fly app to deploy  [required]
  -o, --org TEXT                  Name of Fly org to deploy to
  --generate-dir DIRECTORY        Output generated application files and stop
                                  without deploying
  --show-files                    Output the generated Dockerfile, metadata.json
                                  and fly.toml
  --setting SETTING...            Setting, see
                                  docs.datasette.io/en/stable/settings.html
  --crossdb                       Enable cross-database SQL queries
  --help                          Show this message and exit.

Development

To contribute to this tool, first check out the code. Then create a new virtual environment:

cd datasette-publish-fly
python -m venv venv
source venv/bin/activate

Or if you are using pipenv:

pipenv shell

Now install the dependencies and test dependencies:

pip install -e '.[test]'

To run the tests:

pytest

Integration tests

The tests in tests/test_integration.py make actual calls to Fly to deploy a test application.

These tests are skipped by default. If you have flyctl installed and configured, you can run the integration tests like this:

pytest --integration -s

The -s option here ensures that output from the deploys will be visible to you - otherwise it can look like the tests have hung.

The tests will create applications on Fly that start with the prefix publish-fly-temp- and then delete them at the end of the run.

datasette-publish-fly's People

Contributors

cldellow, simonw


datasette-publish-fly's Issues

Document usage with datasette-scale-to-zero

Following #29 I decided to try this:

datasette publish fly \
   fixtures.db \
   --app datasette-publish-fly-issue-29 \
   --install datasette-auth-passwords \
   --plugin-secret datasette-auth-passwords root_password_hash 'pbkdf2_sha256$480000$9ce99372d1fa079f770d4e2245bcf335$zJjskTDc6M8sxmEUYZBr/EC0e730Q9pzcF8RJB43c/c=' \
   --install datasette-scale-to-zero \
   --plugin-secret datasette-scale-to-zero duration 5m

Using https://datasette.io/plugins/datasette-scale-to-zero ... and it worked!

I watched the Fly Dashboard and the app at https://datasette-publish-fly-issue-29.fly.dev/ now successfully scales to zero if it receives no hits for 5 minutes, then starts running again when traffic arrives:

Here's it exiting 5m after the last request:

[Screenshot: CleanShot 2023-08-21 at 12:14:55]

And here's it rebooting when a new request comes in:

[Screenshot: CleanShot 2023-08-21 at 12:16:59]

Large sqlite db, 10min threshold guess

Tried:

datasette publish fly x.db \
--app "test" \
--region "sin" \
--extra-options="--setting sql_time_limit_ms 15000" \
--metadata metadata.yml \
--install pysqlite3-binary \
--install datasette-auth-tokens

The x.db SQLite file is 2GB.

After ~10 minutes it stops with an error.

The deployment used to work well. I'm guessing it failed because it hit the remote builder's inactivity threshold, which is a documented feature: "They turn off automatically after 10 minutes of inactivity"

==> Verifying app config
→ Verified app config
==> Building image
Remote builder fly-builder-white-wood-1586 ready
==> Creating build context
→ Creating build context done
==> Building image with Docker
→ docker host: 20.10.12 linux x86_64
[+] Building 596.1s (0/1)
=> [internal] load remote build context 596.1s
ERRO[0622] Can't add file /private/var/folders/7g/qc77bj6j63q0gr0mrzvbmxkc0000gn/T/tmph1n8z2gw/test/x.db to tar: io: read/write on closed pipe
ERRO[0622] Can't close tar writer: io: read/write on closed pipe
Error failed to fetch an image or build from source: error building: unexpected EOF

Assuming the guess is correct, is any workaround possible? Would building the Docker image locally help avoid consuming the full 10 minutes?

Allow loading SQLite extensions

I tried adding a custom SQLite extension like this:

--extra-options '--load-extension ./text.so'

but it fails to deploy. For now I am using the --generate-dir flag, copying the .so manually and then running flyctl deploy myself.

`*.db` database being created

Running this command:

datasette publish fly \
--app my-new-fly-application-2 \
--volume-name vol1 \
--create-volume 1 \
--create-db tiddlywiki \
--install datasette-auth-passwords \
--install datasette-tiddlywiki \
--plugin-secret datasette-auth-passwords root_password_hash $ROOT_PW

Resulted in a database called * being created. This is using the released versions of everything.

Deployment fails with "incompatible types: TOML value has type string; destination has type integer"

I didn't change anything in the configuration, yet redeployment started to fail:

 datasette publish fly wfms.db --app workflows --metadata metadata.yaml --plugins-dir=plugins/ --static assets:static-files/
  shell: /usr/bin/bash -e {0}
  env:
    FLY_API_TOKEN: ***
    pythonLocation: /opt/hostedtoolcache/Python/3.10.5/x64
    PKG_CONFIG_PATH: /opt/hostedtoolcache/Python/3.10.5/x64/lib/pkgconfig
    LD_LIBRARY_PATH: /opt/hostedtoolcache/Python/3.10.5/x64/lib
  
Error failed loading app config from fly.toml: toml: line 18 (last key "services.ports.port"): incompatible types: TOML value has type string; destination has type integer

Make --app optional

Fly can generate random names for applications. --app should be an optional argument, letting Fly generate a random name when it is omitted.

I should also have datasette publish fly output a command-clickable URL when it finishes (with an https:// prefix).

Refs superfly/flyctl#82

`--plugin-secret` should use Fly secrets

In writing the documentation for the new volumes stuff - #12 (comment) (also refs #10) - I realized that if I'm going to show people how to use volumes I need them to deploy writable plugins. But if I show them that, it's not responsible to show them how to do it without authentication. So I should include datasette-auth-passwords - but that requires them to set a password secret using --plugin-secret...

... and I don't want them to have to set the same plugin secret every time they run a deploy to update an existing instance!

So, instead, I'm going to see if I can get --plugin-secret to set a Fly secret, which will persist for the lifetime of the application across multiple deploys.
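The idea can be sketched as follows: store the secret value once with `flyctl secrets set`, and have the generated metadata reference it via Datasette's `{"$env": ...}` pattern so redeploys don't need the secret again. The environment variable name shown here is an illustrative assumption:

```python
# Sketch: map a --plugin-secret to a Fly secret. The generated metadata
# references an environment variable via Datasette's {"$env": ...} pattern;
# the env var name derivation below is an illustrative assumption.
import json

plugin, key = "datasette-auth-passwords", "root_password_hash"
env_name = f"{plugin}_{key}".upper().replace("-", "_")

metadata = {"plugins": {plugin: {key: {"$env": env_name}}}}
print(json.dumps(metadata, indent=2))
# The value itself would be set once per application, e.g.:
#   flyctl secrets set DATASETTE_AUTH_PASSWORDS_ROOT_PASSWORD_HASH='...'
```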

TypeError: __init__() got an unexpected keyword argument 'capture_output'

Hi,
I am following instructions in this blog post:

https://fly.io/blog/making-datasets-fly-with-datasette-and-fly/

but have hit this error:

$ datasette publish fly squirrels.db --app squirrelsx
Traceback (most recent call last):
  File "/home/darreng/code/flyio_datasette_example/env/bin/datasette", line 8, in <module>
    sys.exit(cli())
  File "/home/darreng/code/flyio_datasette_example/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/home/darreng/code/flyio_datasette_example/env/lib/python3.6/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/home/darreng/code/flyio_datasette_example/env/lib/python3.6/site-packages/click/core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/darreng/code/flyio_datasette_example/env/lib/python3.6/site-packages/click/core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/darreng/code/flyio_datasette_example/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/darreng/code/flyio_datasette_example/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/home/darreng/code/flyio_datasette_example/env/lib/python3.6/site-packages/datasette_publish_fly/__init__.py", line 106, in fly
    apps = existing_apps()
  File "/home/darreng/code/flyio_datasette_example/env/lib/python3.6/site-packages/datasette_publish_fly/__init__.py", line 119, in existing_apps
    process = run(["flyctl", "apps", "list"], capture_output=True)
  File "/usr/lib/python3.6/subprocess.py", line 423, in run
    with Popen(*popenargs, **kwargs) as process:
TypeError: __init__() got an unexpected keyword argument 'capture_output'

But it appears that capture_output is new in Python 3.7.

Should the README be updated to indicate that 3.7 is required? Or could you accommodate older versions using the PIPE workaround detailed here:
https://stackoverflow.com/questions/53209127/subprocess-unexpected-keyword-argument-capture-output
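The PIPE workaround mentioned above can be sketched like this: on Python 3.6, where subprocess.run() has no capture_output= parameter, passing PIPE handles for stdout and stderr is equivalent:

```python
# Sketch of the suggested fallback for Python 3.6, which lacks the
# capture_output= keyword added to subprocess.run() in Python 3.7.
import subprocess
import sys

def run_captured(args):
    if sys.version_info >= (3, 7):
        return subprocess.run(args, capture_output=True)
    return subprocess.run(args, stdout=subprocess.PIPE, stderr=subprocess.PIPE)

result = run_captured([sys.executable, "-c", "print('hello')"])
print(result.stdout.decode().strip())  # hello
```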

Thanks
Darren

--setting option

As seen in datasette-publish-vercel - easier than using --extra-options

App published to fly.io with volume is losing data after `fly apps restart`

I successfully deployed datasette to fly.io using the command below. I am trying to keep the .db in a persisted volume so that I can add and update data periodically but am running into some problems including:

  • After creating a new table, then running fly apps restart, the database is missing the table. Steps to create the new table were:
    • fly ssh console -a some-project
    • apt update && apt install sqlite3
    • sqlite3
    • > .open some.db
    • > .tables
      some_table
    • > create table test_table as select 1 col1 union select 2;
    • > .tables
      some_table  test_table
  • The --install datasette-saved-queries flag doesn't do anything. If I install it locally first before deploying, then it works.
  • Created a file with echo some_data > /app/vol1/some_file.txt and it was gone after fly apps restart.

Here's the full command I'm using

datasette publish fly data/outputs/sqlite/some.db \
	--app="some-project" \
	--create-volume 2 \
	--volume-name vol1 \
	--static vol1:data/outputs/test/ \
	--install datasette-saved-queries \
	--setting sql_time_limit_ms 25000

I'm not sure if this is a bug or user error, and appreciate any guidance. There's a good chance I'm missing something fundamental about how fly.io volumes work.

Parameter --plugin-secret is not working

Deploying an app with

datasette publish fly \
    data.db \
    --app my-app

works and I can access the data.

But using the script with --install ... and --plugin-secret ... flags, e.g.

datasette publish fly \
   data.db \
   --app my-app \
   --install datasette-auth-passwords \
   --plugin-secret datasette-auth-passwords root_password_hash 'pbkdf2_sha2 ...'

this error shows up:

./start-fly.sh
Error: Error calling 'flyctl secrets set':

Error: unknown flag: --json

Is there a change in the flyctl secret ... command?

Return code not non-zero on errors

I have the following in my GHA configuration:

name: Fly Deploy
on: [push]
env:
  FLY_API_TOKEN: ${{ secrets.FLY_API_TOKEN }}
jobs:
  deploy:
      name: Deploy app
      runs-on: ubuntu-latest
      steps:
        - uses: actions/checkout@v3
        - uses: actions/setup-python@v4
          with:
            python-version: '3.10'
            cache: 'pip'
        - run: pip install -r requirements.txt
        - run: yaml-to-sqlite wfms.db wfms wfms.yaml
        - uses: superfly/flyctl-actions/setup-flyctl@master
        - run: datasette publish fly wfms.db --app workflows --metadata metadata.json --plugins-dir=plugins/ --static assets:static-files/

which failed in the last step with this:


Run datasette publish fly wfms.db --app workflows --metadata metadata.json --plugins-dir=plugins/ --static assets:static-files/
  datasette publish fly wfms.db --app workflows --metadata metadata.json --plugins-dir=plugins/ --static assets:static-files/
  shell: /usr/bin/bash -e {0}
  env:
    FLY_API_TOKEN: ***
    pythonLocation: /opt/hostedtoolcache/Python/3.10.5/x64
    PKG_CONFIG_PATH: /opt/hostedtoolcache/Python/3.10.5/x64/lib/pkgconfig
    LD_LIBRARY_PATH: /opt/hostedtoolcache/Python/3.10.5/x64/lib
==> Verifying app config
--> Verified app config
==> Building image
Waiting for remote builder fly-builder-black-cloud-1435...
Error failed to fetch an image or build from source: error connecting to docker: failed building options: lookup dfw2.gateway.6pn.dev on 127.0.0.53:53: server misbehaving

but GHA did not fail the job, which usually happens when a job returns a non-zero return/error code.
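A possible fix can be sketched as follows: propagate the subprocess's failure as a non-zero exit code so CI systems like GitHub Actions mark the job failed. The command passed here stands in for the real flyctl invocation:

```python
# Sketch: surface a subprocess failure as a non-zero exit code. The
# command shown is a stand-in for the real flyctl deploy call.
import subprocess
import sys

def run_and_check(args):
    """Run a subprocess and return its exit code."""
    return subprocess.run(args).returncode

code = run_and_check([sys.executable, "-c", "import sys; sys.exit(3)"])
if code != 0:
    print(f"command failed with exit code {code}")
    # sys.exit(code)  # the real CLI would exit here, failing the CI job
```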

Don't break if upgrade message shown

Just saw this error:

  File "/usr/local/lib/python3.7/site-packages/datasette_publish_fly/__init__.py", line 106, in fly
    apps = existing_apps()
  File "/usr/local/lib/python3.7/site-packages/datasette_publish_fly/__init__.py", line 123, in existing_apps
    assert lines[0].startswith("NAME")
AssertionError
~/Downloads $ flyctl apps list
Update available 0.0.108 -> 0.0.109
  NAME                 OWNER            LATEST DEPLOY         
  datasette-demo       simon-willison   2020-03-19T01:59:40Z  
  squirrel-map         simon-willison   1m19s ago             
  summer-violet-8396   simon-willison   2020-03-19T22:10:45Z  
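A more tolerant parser can be sketched like this: skip any banner lines (such as the "Update available ..." message) until the NAME header is seen, rather than asserting that the header is the first line:

```python
# Sketch: skip banner lines before the NAME header instead of asserting
# that the header appears first in flyctl's output.
def parse_apps(output):
    apps = []
    collect = False
    for line in (l.strip() for l in output.split("\n")):
        if collect and line:
            apps.append(line.split()[0])
        elif line.startswith("NAME"):
            collect = True
    return apps

sample = """Update available 0.0.108 -> 0.0.109
NAME                 OWNER            LATEST DEPLOY
datasette-demo       simon-willison   2020-03-19T01:59:40Z
squirrel-map         simon-willison   1m19s ago
"""
print(parse_apps(sample))  # ['datasette-demo', 'squirrel-map']
```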

`--setting force_https_urls 1` fails due to missing imports

Traceback (most recent call last):
  File "/home/cldellow/src/dux-publish/venv/lib/python3.8/site-packages/datasette_publish_fly/__init__.py", line 62, in convert
    return name, value_as_boolean(value)
NameError: name 'value_as_boolean' is not defined

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/cldellow/src/dux-publish/venv/bin/datasette", line 8, in <module>
    sys.exit(cli())
  File "/home/cldellow/src/dux-publish/venv/lib/python3.8/site-packages/click/core.py", line 1130, in __call__
    return self.main(*args, **kwargs)
  File "/home/cldellow/src/dux-publish/venv/lib/python3.8/site-packages/click/core.py", line 1055, in main
    rv = self.invoke(ctx)
  File "/home/cldellow/src/dux-publish/venv/lib/python3.8/site-packages/click/core.py", line 1657, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/cldellow/src/dux-publish/venv/lib/python3.8/site-packages/click/core.py", line 1655, in invoke
    sub_ctx = cmd.make_context(cmd_name, args, parent=ctx)
  File "/home/cldellow/src/dux-publish/venv/lib/python3.8/site-packages/click/core.py", line 920, in make_context
    self.parse_args(ctx, args)
  File "/home/cldellow/src/dux-publish/venv/lib/python3.8/site-packages/click/core.py", line 1378, in parse_args
    value, args = param.handle_parse_result(ctx, opts, args)
  File "/home/cldellow/src/dux-publish/venv/lib/python3.8/site-packages/click/core.py", line 2360, in handle_parse_result
    value = self.process_value(ctx, value)
  File "/home/cldellow/src/dux-publish/venv/lib/python3.8/site-packages/click/core.py", line 2316, in process_value
    value = self.type_cast_value(ctx, value)
  File "/home/cldellow/src/dux-publish/venv/lib/python3.8/site-packages/click/core.py", line 2302, in type_cast_value
    return tuple(convert(x) for x in check_iter(value))
  File "/home/cldellow/src/dux-publish/venv/lib/python3.8/site-packages/click/core.py", line 2302, in <genexpr>
    return tuple(convert(x) for x in check_iter(value))
  File "/home/cldellow/src/dux-publish/venv/lib/python3.8/site-packages/click/types.py", line 82, in __call__
    return self.convert(value, param, ctx)
  File "/home/cldellow/src/dux-publish/venv/lib/python3.8/site-packages/datasette_publish_fly/__init__.py", line 63, in convert
    except ValueAsBooleanError:
NameError: name 'ValueAsBooleanError' is not defined

Adding the two missing imports seems to resolve it, #27 has a fix
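The conversion logic involved can be sketched as follows. The helper is a stand-in for datasette.utils' value_as_boolean (the name shown in the traceback): try a boolean interpretation, fall back to integer, then leave the value as a string:

```python
# Sketch of the --setting value conversion; value_as_boolean here is an
# illustrative stand-in for the helper imported from datasette.utils.
def value_as_boolean(value):
    if value.lower() not in ("on", "off", "true", "false", "1", "0"):
        raise ValueError
    return value.lower() in ("on", "true", "1")

def convert_setting(name, value):
    try:
        return name, value_as_boolean(value)
    except ValueError:
        try:
            return name, int(value)
        except ValueError:
            return name, value

print(convert_setting("force_https_urls", "1"))  # ('force_https_urls', True)
```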

`--generate-dir` works confusingly

I tried out --generate-dir and ran into two problems with the code that's responsible:

if generate_dir:
    dir = pathlib.Path(generate_dir)
    if not dir.exists():
        dir.mkdir()
    # Copy files from current directory to dir
    for file in pathlib.Path(".").glob("*"):
        shutil.copy(str(file), str(dir / file.name))
    (dir / "fly.toml").write_text(fly_toml, "utf-8")
    return

  • It can only copy files, not directories (e.g. the plugin directory!)
  • If you give a relative path, the directory is generated not in the current working directory, but in the temporary directory which is cleaned up afterwards!

As a workaround I had to give an absolute path, but that was after some confusion and digging into the code.
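A fix for both problems can be sketched like this, under the assumption that the implementation copies a staged build directory: resolve the output path before any chdir into a temporary directory, and use copytree so directories (such as a custom plugins dir) are copied too:

```python
# Sketch: resolve the destination path early (before any chdir into the
# temporary build directory) and handle directories with copytree.
import pathlib
import shutil

def copy_build(source, generate_dir):
    dest = pathlib.Path(generate_dir).resolve()  # pin down relative paths early
    dest.mkdir(parents=True, exist_ok=True)
    for item in pathlib.Path(source).glob("*"):
        if item.is_dir():
            shutil.copytree(item, dest / item.name)
        else:
            shutil.copy(item, dest / item.name)
    return dest
```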

Take advantage of new --json option

The code here parses the output of flyctl apps list:

def existing_apps():
    process = run(["flyctl", "apps", "list"], capture_output=True)
    output = process.stdout.decode("utf8")
    all_lines = [l.strip() for l in output.split("\n")]
    # Skip lines until we find the NAME line
    lines = []
    collect = False
    for line in all_lines:
        if collect:
            lines.append(line)
        elif line.startswith("NAME"):
            collect = True
    apps = [l.strip().split()[0] for l in lines if l.strip()]
    return apps

Fly now has a --json flag which would make this more robust: https://fly.io/blog/flyctl-meets-json/
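The more robust approach can be sketched as follows: ask flyctl for JSON output and parse it, instead of scraping the human-readable table. The "Name" key is an assumption - check it against the actual `flyctl apps list --json` output:

```python
# Sketch: parse flyctl's JSON output rather than its table layout.
# The "Name" key is an assumed field name, not confirmed from flyctl docs.
import json
from subprocess import run

def parse_app_names(json_text):
    return [app["Name"] for app in json.loads(json_text)]

def existing_apps():
    process = run(["flyctl", "apps", "list", "--json"], capture_output=True)
    return parse_app_names(process.stdout)

print(parse_app_names('[{"Name": "datasette-demo"}, {"Name": "squirrel-map"}]'))
# ['datasette-demo', 'squirrel-map']
```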

"Can't close tar writer: io: read/write on closed pipe"

(datasette-demo) datasette-demo % datasette publish fly fixtures.db -a datasette-demo
Update available 0.0.200 -> 0.0.229
Update with flyctl version update

Deploying datasette-demo
==> Validating app configuration
--> Validating app configuration done
Services
TCP 80/443 ⇢ 8080
Remote builder fly-builder-blue-surf-155 ready
==> Creating build context
--> Creating build context done
==> Building image with Docker
ERRO[0007] Can't add file /private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpq6ofghqk/datasette-demo/Dockerfile to tar: io: read/write on closed pipe 
ERRO[0007] Can't close tar writer: io: read/write on closed pipe 
Error error building: error building with docker: error during connect: Post "https://fly-builder-blue-surf-155.fly.dev:10000/v1.41/build?buildargs=%7B%7D&cachefrom=null&cgroupparent=&cpuperiod=0&cpuquota=0&cpusetcpus=&cpusetmems=&cpushares=0&dockerfile=&labels=null&memory=0&memswap=0&networkmode=&platform=linux%2Famd64&rm=0&shmsize=0&t=registry.fly.io%2Fdatasette-demo%3Adeployment-1627679663&target=&ulimits=null&version=": read tcp 10.0.0.8:62141->213.188.211.42:10000: read: connection reset by peer

Tool now asks interactive questions, it did not before

% datasette publish fly fixtures.db -a datasette-fly-test-december-2020

Selected App Name: datasette-fly-test-december-2020

Automatically selected personal organization: Simon Willison

? Select builder:  [Use arrows to move, type to filter]
> Dockerfile
    (Do not set a builder and use the existing Dockerfile)

`--generate-dir` option

I'm going to imitate datasette publish vercel which has this option:

  --generate-dir DIRECTORY        Output generated application files and stop
                                  without deploying

Originally posted by @simonw in #12 (comment)
