giscience / openpoiservice

Home Page: https://openrouteservice.org

License: Apache License 2.0
openpoiservice's Introduction

Openpoiservice


Openpoiservice (ops) is a Flask application which hosts a highly customizable points of interest database derived from OpenStreetMap.org data, and thereby exploits OSM's notion of tags.

An OpenStreetMap tag consists of a key and a value and describes a specific feature of a map element (node, way, or relation) or changeset. Both key and value are free-format text fields, but often represent numeric or otherwise structured items.

This service consumes OSM tags on nodes, ways and relations by grouping them into predefined categories. If it picks up an OSM object tagged with one of the OSM keys defined in categories.yml, it will import this point of interest together with specific additional tags which may be defined in ops_settings.yml. Any additional tag, for instance wheelchair or smoking, may then be used to query the service via the API after import.

For instance, if you want to request all POIs accessible by wheelchair within a geometry, you could then add wheelchair: ['yes', 'dedicated'] to filters within the body of your HTTP POST request.
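As a sketch, such a request body could be assembled in Python before posting it to the service (the field names follow the curl examples later in this README; the coordinates are hypothetical):

```python
import json

# Sketch of a POST body for /pois, filtering on wheelchair accessibility.
# Field names follow the curl examples in this README.
body = {
    "request": "pois",
    "geometry": {
        "geojson": {"type": "Point", "coordinates": [8.8034, 53.0756]},
        "buffer": 250,
    },
    "filters": {
        # common OSM values of the wheelchair tag
        "wheelchair": ["yes", "dedicated"],
    },
}

print(json.dumps(body, indent=2))
```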

You may pass three different types of geometry within the request to the database. Currently, "Point" and "LineString" with a corresponding buffer are supported, as well as "Polygon". Points of interest will be returned within the given geometry.
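The three geometry variants can be sketched as follows (illustrative payloads only, using the field names from the examples further down; the coordinates are hypothetical):

```python
# Illustrative geometry payloads for the three supported types.
# Point and LineString take a buffer (in meters); Polygon delimits the area itself.
point_geom = {
    "geojson": {"type": "Point", "coordinates": [8.8034, 53.0756]},
    "buffer": 250,
}

line_geom = {
    "geojson": {
        "type": "LineString",
        "coordinates": [[8.8034, 53.0756], [8.7834, 53.0456]],
    },
    "buffer": 100,
}

polygon_geom = {
    "geojson": {
        "type": "Polygon",
        # one linear ring, closed (first point == last point)
        "coordinates": [[[8.80, 53.07], [8.78, 53.07], [8.78, 53.05], [8.80, 53.07]]],
    }
}
```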

You can control the maximum size of geometries and further restrictions in the settings file of this service.

Import Process

The OSM file(s) to be imported are parsed several times to extract points of interest from relations (osm_type 3), ways (osm_type 2) and nodes (osm_type 1), in that order. The type a specific point of interest originated from will be returned within the response - this will help you find the object directly on OpenStreetMap.org.
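Since OpenStreetMap.org exposes objects under /node/&lt;id&gt;, /way/&lt;id&gt; and /relation/&lt;id&gt;, the returned osm_type can be mapped to a browser link with a small helper (hypothetical, not part of the service):

```python
# Map the numeric osm_type returned by openpoiservice to the
# corresponding object type on OpenStreetMap.org.
OSM_TYPES = {1: "node", 2: "way", 3: "relation"}

def osm_url(osm_type: int, osm_id: int) -> str:
    """Build a browser URL for an OSM object, e.g. a node for osm_type 1."""
    return f"https://www.openstreetmap.org/{OSM_TYPES[osm_type]}/{osm_id}"

print(osm_url(1, 240109189))
```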

Installation

You can either run openpoiservice on your host machine in a virtual environment or simply with Docker. The provided Dockerfile installs a WSGI server (gunicorn) which starts the Flask service on port 5000.

Technical specs for storing and importing OSM files

Python version

As this service makes use of the Python collections library, in particular deques and their functions, it only supports Python 3.5 and greater.

Database

This application uses a PostgreSQL/PostGIS setup for storing the points of interest. We highly recommend using this Docker container.

Importer

Please consider the following technical requirements for parsing & importing osm files.

Region    Memory
Germany   8 GB
Europe    32 GB
Planet    128 GB

Note: Openpoiservice will import any OSM pbf file located in the osm folder or any subdirectory within it. This way you can split the planet file into smaller regions (e.g. downloaded from Geofabrik; a scraper script for the download links can be found in the osm folder) and use a smaller instance to import the global data set (as long as the OSM files don't exceed 5 GB of disk space, 16 GB of memory will suffice to import the entire planet).
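The recursive scan of the osm folder can be imitated in a few lines (a sketch; the real importer's file matching may differ):

```python
from pathlib import Path

def find_pbf_files(osm_dir):
    """Recursively collect .pbf files below the osm folder, mimicking the importer's scan."""
    return sorted(str(p) for p in Path(osm_dir).rglob("*.pbf"))
```

Splitting the planet into per-region extracts then simply means dropping several smaller .pbf files into subfolders of osm.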

Run as Docker Container (Flask + Gunicorn)

Make your necessary changes to the settings in ops_settings_docker.yml and, if needed, to the categories in categories_docker.yml. These files are mounted as volumes into the Docker container. If you are planning to import a different OSM file, please download it to the osm folder (any folder within will be scanned for OSM files), as this is a shared volume.

Docker Compose

All-in-one docker image

This docker-compose setup runs openpoiservice together with a psql/postgis image, allowing you to deploy the project quickly.

Important: the database is not exposed, so you won't be able to access it from outside the container. If you want to access it, simply add these lines to the database definition inside docker-compose-with-postgis.yml:

ports:
   - <PORT YOU WANT>:5432

Don't forget to change the host name and port inside ops_settings_docker.yml to the ones given to the Docker container for the database.

  • Hostname default value : psql_postgis_db
  • Port default value : 5432

Note: if openpoiservice can't connect to the database, it's probably because the settings inside ops_settings_docker.yml and docker-compose-with-postgis.yml don't match.

Command to use to run all-in-one docker container

docker-compose -f /path/to/docker-compose.yml up api -d

Only deploy openpoiservice

This will only run openpoiservice inside a container, meaning that you will need to handle the database yourself and connect it to this container.

docker-compose -f /path/to/docker-compose-standalone.yml up api -d

After deploy

Once the container is built you can either create the empty database:

$ docker exec -it container_name /ops_venv/bin/python manage.py create-db

Delete the database:

$ docker exec -it container_name /ops_venv/bin/python manage.py drop-db

Or import the OSM data:

$ docker exec -it container_name /ops_venv/bin/python manage.py import-data

Init and Update DB with docker

You can initialize the POI database with the Docker service init:

docker-compose -f /path/to/docker-compose.yml up init

Or update the POI database:

docker-compose -f /path/to/docker-compose.yml up update

Protocol Buffers (protobuf) for imposm.parser

This repository uses imposm.parser to parse the OpenStreetMap pbf files, which uses Google's protobuf library under the hood.

The imposm.parser requirement will not build with pip unless you are running protobuf 3.0.0.

To this end, please make sure that you are running the aforementioned version of protobuf if pip install -r requirements.txt fails (install protobuf from source)

Prepare settings.yml

Update openpoiservice/server/ops_settings.yml with your necessary settings and then run one of the following commands.

Optionally export the application config first (choose either ProductionConfig or DevelopmentConfig):

$ export APP_SETTINGS="openpoiservice.server.config.ProductionConfig"

Create the POI DB

$ python manage.py create-db

Drop the POI DB

$ python manage.py drop-db

Parse and import OSM data

$ python manage.py import-data

Run the Application with Flask-Werkzeug

$ python manage.py run

By default you can access the application at http://localhost:5000/

Want to specify a different port?

$ python manage.py run -h 0.0.0.0 -p 8080

Tests

$ export TESTING="True" && python manage.py test

Category IDs and their configuration

openpoiservice/server/categories/categories.yml is a list of (note: not all!) OpenStreetMap tags mapped to arbitrary category IDs. If you keep the structure as follows, you can manipulate this list as you wish.

transport:
   id: 580
   children:
       aeroway:
           aerodrome: 581        
           aeroport: 582 
           helipad: 598         
           heliport: 599 
       amenity:
           bicycle_parking: 583  
           
sustenance:
   id: 560             
   children:
       amenity:
           bar: 561             
           bbq: 562   
...

Openpoiservice uses this mapping while it imports pois from the OpenStreetMap data and assigns the custom category IDs accordingly.
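Conceptually, the importer needs a reverse lookup from an OSM key/value pair to a category ID. A sketch of how the YAML structure above could be flattened (using a plain dict in place of the parsed file; the actual implementation may differ):

```python
# A small excerpt of categories.yml, represented as an already-parsed dict
# (in the real service this would come from loading the YAML file).
categories = {
    "transport": {
        "id": 580,
        "children": {
            "aeroway": {"aerodrome": 581, "helipad": 598},
            "amenity": {"bicycle_parking": 583},
        },
    },
    "sustenance": {
        "id": 560,
        "children": {"amenity": {"bar": 561, "bbq": 562}},
    },
}

def flatten(categories):
    """Build a (osm_key, osm_value) -> category_id lookup table."""
    lookup = {}
    for group in categories.values():
        for osm_key, values in group["children"].items():
            for osm_value, cat_id in values.items():
                lookup[(osm_key, osm_value)] = cat_id
    return lookup

lookup = flatten(categories)
print(lookup[("amenity", "bar")])  # -> 561
```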

column_mappings in openpoiservice/server/ops_settings.yml controls which OSM information will be considered in the database and whether it may be queried by the user via the API, e.g.

wheelchair:

smoking:

fees:

This means that the OpenStreetMap tag wheelchair will be considered during import and saved to the database. A user may then add a list of common values in the filters object, e.g. wheelchair: ['yes', 'dedicated', ...], which correspond to the common values of the OSM tag itself, see https://wiki.openstreetmap.org/wiki/Key:wheelchair.

API Documentation

The documentation for this flask service is provided via flasgger and can be accessed via http://localhost:5000/apidocs/.

Generally you have three different request types pois, stats and list.

Using request=pois in the POST body will return a GeoJSON FeatureCollection in your specified bounding box or geometry.

Using request=stats will do the same but group by the categories, ultimately returning a JSON object with the absolute numbers of pois of a certain group.

Finally, request=list will return a JSON object generated from openpoiservice/server/categories/categories.yml.

Endpoints

The default base url is http://localhost:5000/.

The openpoiservice exposes the endpoint /pois:

Method   Parameter   Values [optional]
POST     request     pois, stats, list
         geometry    bbox, geojson, buffer
         filters     category_group_ids, category_ids, [name, wheelchair, smoking, fee]
         limit       integer
         sortby      category, distance
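For illustration, the parameters above can be combined into a request from Python; a sketch using only the standard library (the body layout follows the curl examples below, the helper itself is hypothetical):

```python
import json

BASE_URL = "http://localhost:5000"  # default base url of the service

def build_pois_request(request_type, geometry, filters=None, limit=None, sortby=None):
    """Assemble url, headers and JSON body for a POST to /pois."""
    body = {"request": request_type, "geometry": geometry}
    if filters:
        body["filters"] = filters
    if limit is not None:
        body["limit"] = limit
    if sortby:
        body["sortby"] = sortby
    return BASE_URL + "/pois", {"Content-Type": "application/json"}, json.dumps(body)

url, headers, data = build_pois_request(
    "pois",
    {"geojson": {"type": "Point", "coordinates": [8.8034, 53.0756]}, "buffer": 100},
    filters={"category_group_ids": [160]},
    limit=200,
)
# Send e.g. with urllib.request.urlopen(urllib.request.Request(url, data.encode(), headers))
```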

Examples

POIS around a buffered point
curl -X POST \
  http://localhost:5000/pois \
  -H 'Content-Type: application/json' \
  -d '{
  "request": "pois",
  "geometry": {
    "bbox": [
      [8.8034, 53.0756],
      [8.7834, 53.0456]
    ],
    "geojson": {
      "type": "Point",
      "coordinates": [8.8034, 53.0756]
    },
    "buffer": 250  
  }
}'
POIs of given categories
curl -X POST \
  http://localhost:5000/pois \
  -H 'Content-Type: application/json' \
  -d '{
  "request": "pois",
  "geometry": {
    "bbox": [
      [8.8034, 53.0756],
      [8.7834, 53.0456]
    ],
    "geojson": {
      "type": "Point",
      "coordinates": [8.8034, 53.0756]
    },
    "buffer": 100  
  },
  "limit": 200,
  "filters": {
    "category_ids": [180, 245]
  } 
}'
POIs of given category groups
curl -X POST \
  http://localhost:5000/pois \
  -H 'Content-Type: application/json' \
  -d '{
  "request": "pois",
  "geometry": {
    "bbox": [
      [8.8034, 53.0756],
      [8.7834, 53.0456]
    ],
    "geojson": {
      "type": "Point",
      "coordinates": [8.8034, 53.0756]
    },
    "buffer": 100  
  },
  "limit": 200,
  "filters": {
    "category_group_ids": [160]
  } 
}'
POI Statistics
curl -X POST \
  http://localhost:5000/pois \
  -H 'Content-Type: application/json' \
  -d '{
  "request": "stats",
  "geometry": {
    "bbox": [
      [8.8034, 53.0756],
      [8.7834, 53.0456]
    ],
    "geojson": {
      "type": "Point",
      "coordinates": [8.8034, 53.0756]
    },
    "buffer": 100  
  }
}'
POI Categories as a list
curl -X POST \
  http://127.0.0.1:5000/pois \
  -H 'content-type: application/json' \
  -d '{
	"request": "list"
}'

openpoiservice's People

Contributors

aoles, dependabot[bot], isikl, larsrinn, nilsnolde, remi-sap, slowmo24, takb, thegreatrefrigerator, timmccauley, zephylac

openpoiservice's Issues

Include 'bbox' in response

All other ORS responses provide a bbox field in the response. Really helpful for web mapping to set zoom levels. I'll give it a try when I'm on vacation :)

Improve HTTP error codes

More proper error codes: right now everything is 500, even though most are user input errors, i.e. rather the 4xx family

Support multiple isochrone when request to api

Currently I'm working on a large dataset, and I have to query my openpoiservice for every data instance I have. This makes the execution time very long. Maybe I missed something, but I think we can only call openpoiservice with one isochrone at a time. Maybe we could implement a way to pass multiple parameters inside geojson to avoid large amounts of requests?

I estimated that for about 40,000 examples I would have 32 hours of computing.
I allocated 4 cores, 8 GB of RAM and 1 GB of swap on my Docker, which seems enough. I've never reached 3 GB of memory usage.

Support newer Python versions

Add support & Travis configs for Python 3.7 & 3.8.

Problem I could figure out so far: gunicorn needs to be updated to >=19.9.0, maybe ,<20.0.0, as there's been conflicts from 3.7 on.

is it normal? "database system was shut down"

I start Docker with docker-compose -f "C:\docker\openpoiservice\docker-compose-with-postgis.yml" up, then I try to run docker exec -it openpoiservice_gunicorn_flask_1 /ops_venv/bin/python manage.py create-db and get the error could not connect to server: Connection refused.
Maybe it's caused by "database system was shut down" in the Docker container?

Investigate usage of alternative ways to process PBFs

Since imposm has been deprecated for a long while and we really don't wanna start maintaining it, we need to investigate alternative ways to process PBFs:

  • pyosmium could be an OK candidate. Very limited API, but does accept callbacks for OSM types
  • https://pypi.org/project/esy-osm-pbf/ seems to be a relatively new package, doing what we'd need (couldn't find it on any VCS platform though..)
  • use osmium/osmosis or other command-line utilities or even Pelias' pbf2json utility. All at the expense of creating more non-Python dependencies..

So, these will have to be evaluated a little in terms of performance, with clear favorites being the first two options, as only the protobuf lib is needed as a non-Python dep.

temp: collect observations

Let's create separate issues from this list once we discussed them a little bit @zephylac. If you'd like to be involved too, we can see which ones who tackles:

  • support newest Python versions 3.7 & 3.8. Problem I could figure out so far: gunicorn needs to be updated to >=19.9.0, maybe ,<20.0.0, as there's been conflicts from 3.7 on (#79)
  • move away from imposm since deprecated since a long while (#80)
  • Update project to work on newest packages, most notably Flask>=1.0, which also means Werkzeug>=1.0 (#81)
  • Dockerfile and template docker-compose.yml can be improved:
    • Use a Python base image
    • has a lot of unnecessary RUN statements making the final image huge
    • Use configs and OSM file as build ARG and keep exposing via volumes, so that we could host the image on dockerhub/other registry, see ORS docker as example (#82)
  • take tests out of the main package: can have very undesirable side effects otherwise as the root __init__.py is executed when running the tests (#83)
  • more proper error codes: right now all is 500, even when most are user input errors, i.e. rather the 4xx family (#84)
  • add test for MultiPolygon API request
  • make README more readable in terms of structure etc
  • area checks to determine if the request exceeds the server limits are done in EPSG:3857, which should ideally be a global equal-area projection like Mollweide
  • the response type had a breaking change when you introduced MultiPolygon: now they're all lists of GeoJSON FeatureCollection where I don't really understand the point of. It's all the same geometry.type, so a list of FeatureCollection is pretty much an anti-pattern. Why not wrap all features for all polygons (if MultiPolygon) in a single FeatureCollection? If you really need to relate the polygons of a MultiPolygon to the index in the response list (only reason I can think of) then doing a multipart to singlepart conversion and requesting with single Polygons makes more sense to me. Can we please revert that again?
  • remove all user-settable files from git and provide maintained template versions of them: so we don't accidentally commit our passwords and stuff
  • slight changes to make the project more accessible: move ops_settings.yml to the root of the main package, remove server folder (redundant when tests are moved)

Optional:

  • use a proper dependency manager: I'm in love with Poetry. Super easy to use and million times better than any other.
  • Try to incorporate tox to test multiple Python versions easily
  • Introduce linter, e.g. yapf
  • improve testing framework: right now, every single test will create dbs, import data and drop dbs (#96)
  • lots of data duplication in the request handlers in views.py, which could be avoided, as that's a potential performance drain

Multipolygon overlapping issue

MultiPolygon doesn't support overlapping polygons:
explained here
official doc info

As stated in the OGC standard for MultiPolygon features (downloadable PDF, page 22):

2.1.12 MultiPolygon
A MultiPolygon is a MultiSurface whose elements are Polygons.
The assertions for MultiPolygons are:
The assertions for MultiPolygons are :

  1. The interiors of 2 Polygons that are elements of a MultiPolygon may not intersect.
    ∀ M ∈ MultiPolygon, ∀ Pi, Pj ∈ M.Geometries(), i≠j, Interior(Pi) ∩ Interior(Pj) = ∅
  2. The Boundaries of any 2 Polygons that are elements of a MultiPolygon may not 'cross' and may touch
    at only a finite number of points. (Note that crossing is prevented by assertion 1 above.)
    ∀ M ∈ MultiPolygon, ∀ Pi, Pj ∈ M.Geometries(), ∀ ci ∈ Pi.Boundaries(), cj ∈ Pj.Boundaries()
    ci ∩ cj = {p1, ….., pk | pi ∈ Point, 1 <= i <= k}
  3. A MultiPolygon is defined as topologically closed.
  4. A MultiPolygon may not have cut lines, spikes or punctures; a MultiPolygon is a Regular, Closed point
    set:
    ∀ M ∈ MultiPolygon, M = Closure(Interior(M))
  5. The interior of a MultiPolygon with more than 1 Polygon is not connected; the number of connected
    components of the interior of a MultiPolygon is equal to the number of Polygons in the MultiPolygon.

This issue happens as I try to merge Polygons into a single MultiPolygon. That way I make one request and get multiple responses. As I make lots of requests, this hackish way allows me to reduce the number of requests, thus gaining processing time.

I would suggest keeping it like that to respect the OGC standard, but maybe adding official 'multi-params' support.
I would guess that support could be achieved by taking an array as parameter instead of a single object.
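Assertion 1 (disjoint interiors) is the one violated when merging arbitrary polygons into one MultiPolygon. For axis-aligned boxes it reduces to a one-line test, illustrated here as a toy sketch (not the service's validation code):

```python
def interiors_overlap(a, b):
    """True if the open interiors of two axis-aligned boxes (xmin, ymin, xmax, ymax) intersect."""
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    return ax0 < bx1 and bx0 < ax1 and ay0 < by1 and by0 < ay1

# Overlapping boxes -> not a valid MultiPolygon per OGC assertion 1
print(interiors_overlap((0, 0, 2, 2), (1, 1, 3, 3)))  # -> True
# Touching only along an edge is allowed (interiors stay disjoint)
print(interiors_overlap((0, 0, 1, 1), (1, 0, 2, 1)))  # -> False
```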

Refactor geometry parameter

  • Take bbox and radius out of geometry
  • geometry will optionally (?) be passed a geometry from GeoJSON
  • Rename radius to buffer and ignore on polygon/multipolygon input
  • bbox can stay and clip all geometries from geometry

Invalid JSON object in request

Request:

curl -X POST \
  127.0.0.1:8080/pois \
  -H 'Content-Type: application/json' \
  -d '{
  "request": "pois",
    "geojson": {
      "type": "Point",
      "coordinates": [-96.791028, 32.816137]
    }
  }
}'

Response:

{
  "code": 4000,
  "message": "Invalid JSON object in request"
}

The request body is taken from the examples, but it doesn't work.
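As a side note, the body in the request above is not even well-formed JSON: it contains one closing brace too many, so any JSON parser rejects it. A quick local check (hypothetical, not the service's code):

```python
import json

# The exact body from the failing request above (note the extra closing brace).
body = '''{
  "request": "pois",
    "geojson": {
      "type": "Point",
      "coordinates": [-96.791028, 32.816137]
    }
  }
}'''

try:
    json.loads(body)
except json.JSONDecodeError as exc:
    print("invalid JSON:", exc)
```

Note also that in the other examples in this README, geojson sits inside a geometry object.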

Returned categories list differs from categories.yml specification

The categories list returned by the following query to the openrouteservice pois API differs in structure from the definition. For example, animals: children: amenity and animals: children: shop contain 103 and 123 (!) elements, respectively.

curl -X POST https://api.openrouteservice.org/pois?api_key=<ORS_API_KEY> \
  -H 'content-type: application/json' -d '{"request": "list"}' > categories_json.txt

OUTPUT: categories_json.txt

Restructure POST body

e.g.:

{
   "request": "pois"|"category_stats"|"category_list",
   "category_ids": [cat_id_1, cat_id_2, ...],
   "category_group_ids": [cat_group_id_1, cat_group_id_2, ...],
   "geometry": {
       "type": "polygon"|"point"|"linestring",
       "geom": [[lat,lng],[...,...],...],
       "bbox": [[lat,lng],[...,...],...],
       "radius": 10000
   },
   "filters": {
       "wheelchair": "yes"|"no"|"...",
       "smoking": "yes"|"no"|"...",
       ...
   },
   "limit": 100,
   "sortby": "distance"
}

All time Internal Server Error

I'm trying to make a simple request:

curl -X POST \
  http://127.0.0.1:5000/pois \
  -H 'Content-Type: application/json' \
  -d '{
  "request": "pois",
  "geometry": {
    "bbox": [
      [32.70149630013472, -96.72892645140573],
      [32.70137085098112, -96.72854256023189],
      [32.710300884253556, -96.72420399392107]
    ] 
  }
}'

But every time I get status 500 and:

<html>
  <head>
    <title>Internal Server Error</title>
  </head>
  <body>
    <h1><p>Internal Server Error</p></h1>
    
  </body>
</html>

Is the bbox wrong, or is it an error in openpoiservice? How can I check?

Specify encoding in response `Content-Type`

Currently the response Content-Type contains just application/json and does not specify any encoding, unlike the other ORS endpoints. For completeness and consistency, consider setting it to application/json; charset=utf-8.

Inconsistency for 'pois' vs 'category_stats'

Now being back again, it's fine to write issues;)

I think the limit parameter might not work correctly. When I take Kreuzberg as polygon and search for pubs (ID 569) in Kreuzberg, it gives me:

  • 81 features for limit=100
  • 208 features for limit=500
  • 377 features for limit=2000

However, when asking for category_stats on the same category, it gives me 361 as total_count.

Below is the base request data block:

{'filters': {'category_ids': [569]},
 'geometry': {'geom': [[13.43926404, 52.48961046],
                       [13.42040115, 52.49586382],
                       [13.42541101, 52.48808523],
                       [13.42368155, 52.48635829],
                       [13.40788599, 52.48886084],
                       [13.40852944, 52.487142],
                       [13.40745989, 52.48614988],
                       [13.40439187, 52.48499746],
                       [13.40154731, 52.48500125],
                       [13.40038591, 52.48373202],
                       [13.39423818, 52.4838664],
                       [13.39425346, 52.48577149],
                       [13.38629096, 52.48582648],
                       [13.38626853, 52.48486362],
                       [13.3715694, 52.48495055],
                       [13.37402099, 52.4851697],
                       [13.37416365, 52.48771105],
                       [13.37353615, 52.48798191],
                       [13.37539925, 52.489432],
                       [13.37643416, 52.49167597],
                       [13.36821531, 52.49333093],
                       [13.36952826, 52.49886974],
                       [13.37360623, 52.50416333],
                       [13.37497726, 52.50337776],
                       [13.37764916, 52.5079675],
                       [13.37893813, 52.50693045],
                       [13.39923153, 52.50807711],
                       [13.40022883, 52.50938108],
                       [13.40443425, 52.50777471],
                       [13.4052848, 52.50821063],
                       [13.40802944, 52.50618019],
                       [13.40997081, 52.50692569],
                       [13.41152096, 52.50489127],
                       [13.41407284, 52.50403794],
                       [13.41490921, 52.50491634],
                       [13.41760145, 52.50417013],
                       [13.41943091, 52.50564912],
                       [13.4230412, 52.50498109],
                       [13.42720031, 52.50566607],
                       [13.42940229, 52.50857222],
                       [13.45335235, 52.49752496],
                       [13.45090795, 52.49710803],
                       [13.44765912, 52.49472124],
                       [13.44497623, 52.49442276],
                       [13.43926404, 52.48961046]],
              'radius': 2000,
              'type': 'polygon'},
 'limit': 100,
 'request': 'pois',
 'sortby': 'distance'}

Reconsider DB schema

Things to think about:

  • The tags are 1:1 with OSM objects, so they can be merged into one table
  • uuid is mostly useless it seems; it's not the PK in most tables
  • categories are extracted and stored for each OSM object, duplicating data millions of times
  • allow (simplified) MultiPolygon in DB. Would need some serious performance testing though for the intersects query stage. Would be nicer than only querying for centroids, which is often meaningless for big areas like parks or even true MultiPolygon objects.

Is it possible to use an external database?

Can someone please explain the mechanism of working with OSM files? Do I understand correctly that after importing, the OSM file is no longer used and can be deleted? That is, is all data for operation taken only from the database, or not? If so, can I run the database on a separate server and connect to it?

Error response format differs from the convention used by other ORS backends

The format of errors returned by the pois API differs from the one used by the rest of ORS endpoints. It is an object with code and message at top level

{
    "code": 4000,
    "message": "Invalid JSON object in request"
}

rather than a JSON object with code and message nested in error such as for directions, for example:

{
  "error": {
    "code": 2001,
    "message": "Parameter 'profile' is missing."
  },
  "info": {
    "engine": {
      "version": "4.5.0",
      "build_date": "2018-03-26T13:41:46Z"
    },
    "timestamp": 1528809944677
  }
}

Error message inconsistent with ORS error messages

One more:
when HTTP error occurs, the message body is {'message':..}, while for most ORS errors it seems to be {'error': {'message':...}}. Would make sense to provide the same format. Very minor issue. But in ors-py, it's handled in ORS format.

Generate response in psql

GeoJSON response is currently built in Python which has an impact on the response time, this should ultimately be added to the work handled by postgres.

Improve Docker setup

Dockerfile and template docker-compose.yml can be improved:

  • Use a Python base image
  • has a lot of unnecessary RUN statements making the final image huge
  • Use configs and OSM file as build ARG and keep exposing via volumes, so that we could host the image on dockerhub/other registry, see ORS docker as example

Returning http 401 for requesting too many categories

If more than 5 categories are requested, the API returns an HTTP 401 (unauthorized) with a response like:

{
  "message": "Must be one of [100, 120, 130, 150, 160, 190, 200, 220, 260, 330, 360, 390, 420, 560, 580, 620] for dictionary value @ data['filters']['category_group_ids']"
}

This should be an HTTP 500 with internal error codes 4001, 4002 etc., similar to the directions or isochrones endpoints.

Take tests out of main package

Take tests out of the main package: can have very undesirable side effects otherwise as the root __init__.py is executed when running the tests

Docker enhancement

A little bit related to #63

I found it quite annoying to connect the psql/postgis database with openpoiservice when both are running in separate networks. I finally ended up using one docker-compose.yml for both openpoiservice and psql/postgis.

I understand it might frustrate people to force them to use one docker image, but on the other hand, when you end up with a ready-to-test solution it's quite satisfying.

Maybe we could add a demo docker-compose.yml or enhance the existing one?
