itisfoundation / osparc-simcore

🐼 osparc-simcore simulation framework

Home Page: https://osparc.io

License: MIT License

simulation osparc osparc-simcore docker python beta-distribution neuroscience simulation-modeling simulator swarm-cluster

osparc-simcore's Introduction

osparc-simcore platform


The SIM-CORE, named o2S2PARC (Open Online Simulations for Stimulating Peripheral Activity to Relieve Conditions), is one of the three integrative cores of the SPARC program's Data Resource Center (DRC). The aim of o2S2PARC is to establish a comprehensive, freely accessible, intuitive, and interactive online platform for simulating peripheral nervous system neuromodulation/stimulation and its impact on organ physiology in a precise and predictive manner. To achieve this, the platform comprises both state-of-the-art and highly detailed animal and human anatomical models with realistic tissue property distributions that make it possible to perform simulations ranging from the molecular scale up to the complexity of the human body.

Getting Started

A production instance of o2S2PARC is running at oSPARC.io.

If you want to spin up your own instance, you can follow the common workflow to build and deploy locally using the Linux command line (Ubuntu recommended). Make sure you first install all the requirements mentioned in the section below.

  # clone code repository
  git clone https://github.com/ITISFoundation/osparc-simcore.git
  cd osparc-simcore

  # show setup info and build core services
  make info build

  # starts swarm and deploys services
  make up-prod

  # display swarm configuration
  make info-swarm

  # open front-end in the browser
  #  127.0.0.1.nip.io:9081 - simcore front-end site
  #
  xdg-open http://127.0.0.1.nip.io:9081/

  # to stop the swarm
  make down

Some routes can only be reached via DNS, such as UUID.services.DNS. Since UUID.services.127.0.0.1 is not a valid DNS name, the solution is to use nip.io, a service that maps <anything>[.-]<IP Address>.nip.io in "dot", "dash" or "hexadecimal" notation to the corresponding <IP Address>.
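As a small illustration (the helper below is hypothetical, not part of the repository), such per-service hostnames can be built with plain string formatting:

```python
# Sketch: building nip.io hostnames that resolve to a local IP.
# Any name of the form <anything>.<ip>.nip.io resolves to <ip>,
# so each service UUID gets its own DNS name without extra setup.

def service_hostname(uuid: str, ip: str = "127.0.0.1") -> str:
    """Return a nip.io hostname for a service UUID (dot notation)."""
    return f"{uuid}.services.{ip}.nip.io"

print(service_hostname("a1b2c3d4"))
# a1b2c3d4.services.127.0.0.1.nip.io
```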

Services are deployed in two stacks: simcore-stack comprises all core services in the framework, and ops-stack is a subset of services from ITISFoundation/osparc-ops used for operations during development.

Requirements

To build and run:

  • git
  • docker
  • make >=4.2
  • awk, jq (optional tools within makefiles)

To develop, in addition:

  • python 3.10: we recommend using the python manager pyenv
  • nodejs for client part: we recommend using the node manager nvm
  • vscode (highly recommended)

To verify that your system meets the requirements (base OS, Docker and Python build versions), run:

    make info

Setting up other Operating Systems

When developing on these platforms you are on your own.

On Windows, it works under WSL2 (Windows Subsystem for Linux, version 2).

MacOS is currently not supported.

Upgrading services requirements

Requirements are upgraded using a docker container and pip-sync. Build and start the container:

    cd requirements/tools
    make build
    make shell

Once inside the container navigate to the service's requirements directory.

To upgrade all requirements run:

    make reqs

To upgrade a single requirement, e.g. fastapi, run:

    make reqs upgrade=fastapi

Releases

WARNING This application is still under development.

Development build

For developers wanting to add or test code changes, a version can be built that incorporates changes made in the source directory into the running containers on the fly. To enable this, use the following commands to build, instead of the ones provided in the Getting Started section:

  # clone code repository
  git clone https://github.com/ITISFoundation/osparc-simcore.git
  cd osparc-simcore

  # setup python environment and activate
  make devenv
  source .venv/bin/activate

  # show setup info and build core services
  make info build build-devel

  # starts swarm and deploys services
  make up-devel

  # The above command will keep running with "[RUN] Running command..."
  # Open another terminal session, to continue

  # display swarm configuration
  make info-swarm

  # open front-end in the browser
  #  127.0.0.1.nip.io:9081 - simcore front-end site
  #
  xdg-open http://127.0.0.1.nip.io:9081/

  # to stop the swarm
  make down

Contributing

Would you like to make a change or add something new? Please read the contributing guidelines.

License

This project is licensed under the terms of the MIT license.


Made with love (and lots of hard work) at www.z43.swiss

osparc-simcore's People

Contributors

bisgaard-itis, colinrawlings, dependabot-preview[bot], dependabot[bot], ehzastrow, elisabettai, githk, ignapas, jsaq007, matusdrobuliak66, mguidon, mrnicegyu11, odeimaiz, oetiker, pcrespov, pyup-bot, sanderegg, surfict, wvangeit, yuryhrytsuk

osparc-simcore's Issues

Field Viewer

As a user, I want to visualize fields

  • support scalar and vector data
  • real and complex valued

Integration of DAT-CORE Dev

  • Create a test that queries the interface: search, pull and push. Add a little sample code to upload/download
  • Store Colin-Clancy in DAT-CORE together with information about which computational services and inputs were used. This would allow us to hash things in a way that we know what changed or not

Postgres DB: migration and backup

  • Set up storage for Postgres database
  • Persist database from Postgres container on node (beyond life of the container)
  • Backup database data (beyond life of cluster)
  • handle migration of database structure

Integration of DAT-CORE

USE CASE

The file manager allows selecting files from DAT-CORE and from SIM-CORE.
It only displays files that belong to the current user. Files are grouped by project name (SIM-CORE) and by dataset (DAT-CORE). I can search for files by name and by metadata.
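A sketch of the grouping behavior, using a hypothetical file-entry layout (the field names are illustrative, not the real metadata):

```python
# Sketch: grouping a user's files by project (SIM-CORE) or
# dataset (DAT-CORE), as the file manager would display them.
from collections import defaultdict

files = [
    {"name": "states.dat", "source": "DAT-CORE", "group": "clancy-inputs"},
    {"name": "result.raw", "source": "SIM-CORE", "group": "Clancy Study"},
    {"name": "plot.jpg",   "source": "SIM-CORE", "group": "Clancy Study"},
]

def group_files(files):
    """Group file entries by (source, project-or-dataset name)."""
    groups = defaultdict(list)
    for f in files:
        groups[(f["source"], f["group"])].append(f["name"])
    return dict(groups)

print(group_files(files))
```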

I want to run the Clancy model by using the input file which is provided via the DAT-CORE. I want to store the raw data output file and one of the corresponding plots as a jpg.

USER STORY

  • I login to Blackfynn, create a pair of secret keys.
  • I upload the initial states file for the Clancy model (via Blackfynn).
  • I create a user on Sim-Core and add the tokens to the user settings.
  • Then I create the pipeline for executing the solver and the postprocessing.
  • I attach a filemanager to the solver and as the input I can select the file which I have previously uploaded via Blackfynn.
  • The postpro service has output ports to which I can attach file exporters.
  • The file exporter allows me to upload the file locally or to Blackfynn (which I'm doing for a jpg file and for the raw output data).
  • Now I go to the Blackfynn platform and have a look at my jpg file and at the data in a table view.

  • Integrate DAT-CORE database using their API
  • use for now to get input and store output parameters/data, potentially metadata and information about study (pipeline)
  • tbd: user account management across COREs
  • store Clancy-model output data on DAT_CORE
  • pull, push and search functions are available

Amazon Follow-Up

User story

As a user I want to make sure that the platform runs as cost-effectively as possible. Only the absolutely necessary services should run 24/7. Other computational resources shall be allocated dynamically on demand.

Definition of done

  • Identify parts of the platform that need to be up and running permanently
  • Integrate AWS python SDK for allocating resources

Use case

Dag scheduler works with dynamically allocated hardware.

SAPD

  • FG Case 101322
  • Investigate containerization of XPostProcessor

Nerve electrophysiology model (NEURON)

User story

As a user I want to be able to run NEURON simulations. For this purpose I need to be able to import hoc files or create splines representing neurons. Also I want to couple them to an external potential from an EM solver.

Definition of done

  • Yale neuron in a container
  • s4l_neuron in a container
  • users must be able to assign trajectories and specify parameters (similar to s4l simulation settings)
  • link to exposing potential from EM solver
  • at the current time, we only need to support the MRG model (motor or sensory)
  • tbd: what do we store? (point sensor? line sensor?)

Background

  • User story for this and the EM case is to be able to reproduce the two verification models provided by Warren Grill
  • (also consider Eilif's input and format ... start with S4L side possibly)

Analysis and visualization (3D)

User story

Some results will need visualization with 3D field viewers. As a developer I want to be able to visualize such data, for now defined in a file, but using a data format that is flexible enough to be reused later for the MVP, i.e. the output of the unstructured LF solver (the decision whether structured or unstructured is used for the MVP is not finalized yet; hopefully decided in early July).

Definition of Done

  • There is a dedicated panel/view/dashboard in the frontend (3D view?, dashboard?)
  • There is a defined data structure for such results (vtk?)
  • The field viewer supports scalar and vector data
  • The field viewer can visualize slices and allows for selecting components of the field (real, imaginary parts)

Use case

Use mockup output data from the sim4life unstructured LF solver.


User story (from user perspective)

(suggestion how the user could do it)

  • solids (e.g., from a modeling service) can be coupled to a suitable input port of the service
  • so can result fields (3D scalar and vector for now, potentially time varying later) to another input port
  • inside the service a 3D view exists, as well as a selection of available fields (e.g., in a tree, or the Team comes up with another idea). In that tree, suitable viewers can be attached to the fields (initially I suggest at least a slice field viewer and perhaps a vector or isosurface viewer), which are then rendered along with the solids in the 3D view.
  • when selecting such a viewer in the tree (or a better way of offering/selecting) the related settings appear

Platform Resource Management

GOAL
As DevOps, I want to have full control of hardware resources consumed by the platform. I also need good monitoring/logging/debugging tools to keep track of issues and problems.

Lifetime of Services

  • if a user logs out of the platform, resources should automatically be freed
  • after a long time of inactivity, resources should automatically be freed

Protocol for Problem Handling

  • what to do if a node fails
  • what to do if a service fails

Monitoring Tools

  • access to swarm logs
  • access to node logs
  • access to service logs
  • look into available tools, i.e., docker EE

Hardware Allocation

  • automation of cloud resource management
  • integration into scheduler

Analysis and visualization (2D)

User story

Some results will need visualization with tables and/or 2d plots. As a user I want to be able to visualize such data

Definition of Done

  • There is a dedicated panel/view/dashboard in the frontend
  • There is a defined data structure for such results
  • Tables shall be sortable and editable
  • Selecting subsets in the table should dynamically update the plots
  • In addition to a Jupyter like script representation, a standalone 2D xy-viewer is needed

Use case

Clancy case is used as a first goal.

see comments below (at the end) for the updated definition of done.

s4l as a service

User story

As a developer I want to have a clear picture in mind how to create a non-UI s4l service running in a container.

Definition of done

  • Build minimal s4l modeling service without any user interface plugins
  • Find common issues for different plugins and propose strategies how to fix them
  • Python: I can import s4l_v1 without a GUI

The code is in GitLab and shall contain:

  • server: move and adapt
  • clients
  • Dockerfile
  • CI

SIM - MAP-CORE ToDos

STORY PART I

Communication

According to meeting minutes of our last meeting:

  • send tissue list of NEUROCOUPLE to Peter
  • setup of meeting with them to discuss technical aspects (mapping, ontologies, API, metadata, etc.)

STORY PART II

Import simulation data provided by MAP-CORE and visualize

User story

As a user I want to import simulation data provided by MAP-CORE into the SIM-CORE platform. I want to visualize them.

Definition of done

  • Analyze data types and visualizers required
  • Define where and how in the UI this should happen

Use case

We have to contact them in order to find examples.

EM simulation of VNS implant (oSPARC LF solver)

User story

As a user I want to upload a multi domain mesh that is used as a base for the LF solver. I want to be able to interactively assign conductivity parameters (sigma) and boundary conditions (PEC, voltage or current) for every domain in the mesh. Furthermore I can edit solver specific settings such as convergence criterion. I can run the simulation and will be able to visualize results with the 3d field viewer.

Definition of done

  • Allow input of geometries (involves in the user-story a mixture of anatomical model and CAD-based electrodes)
  • The user can specify dielectric/PEC parameters for domains (ideally, using materials from a material data-base that has been linked to another input port, and ideally supporting tagged structures for auto-assignment)
  • The user can specify boundary conditions for domains
  • We currently do not specify a mesh and create voxel. We might mimic a discretization process and just use a pre-prepared voxel file later.
  • The output after running the solver is mapped to a port that can link to a field viewer

** previously we wrote (revisit later)

  • Allow upload of multidomain meshes and visualization
  • The user can specify dielectric/PEC parameters for domains
  • The user can specify boundary conditions for domains
  • The output can be postprocessed for the field viewer

Use case

Setup a simulation in sim4life and recreate it in simcore. There should be a tutorial available.

Modeling: Issues in the Modeler

  • Add visibility checkboxes to the tree view
  • Usage of External modeler is broken in sphere/spline creation since GLTF transfer was added
  • Fully support window resizing
  • Allow multiselection by picking
  • Enable rotation while creating entities interactively
  • Move/Rotation handles should be centered in the object they are applied to
  • Clicking outside the object should reset the move/rotate buttons
  • Check what happens with the websocket when server gets restarted
  • The version of the client should be available in the GUI, and the tag on the docker image should be set up automatically if possible
  • After a sphere was added, the button should unclick itself as it is not possible to create another sphere right away (or it should be still possible to create another sphere and then the state is correct)
  • The invisible plane is not functioning correctly when creating several shapes (create 2 cylinders, then it's not possible to define the height of a third cylinder if the mouse is not hovering onto the previous 2 cylinders)
  • The selection state is reset when creating a new object. The buttons do not reflect this. And actually it should either be unclicked or it should maintain the selection mode.
  • After the first s4l spline is created, creating a second one blocks after 2 control points are set.

Porting of S4L Modeler to Linux

User story

As a developer I want to be able to have a s4l modeler service running on a linux node in the simcore platform.

Definition of Done

  • Investigate license terms for ACIS on linux
  • Identify plugins and 3rd party libraries needed for modeling functionality
  • Refactor/Throw away platform dependent parts

Use case

Make current prototype for s4l modeling service part of a pure linux docker swarm

Import of Geometries/CAD and Volume Meshes

User story

As a user I have CAD files or meshes needed for my computational service. I want to be able to upload those files and have them visualized in the frontend. For a start I need support for STEP files and volume meshes.

Definition of Done

  • I can import a CAD file in STEP format.
  • I can import a volume mesh (vtk format? bryn?).
  • The imported data is shown in the 3D view.

Background

We are not promising CAD design or discretization as part of the MVP. Instead, it might be easier to implement CAD & mesh importing.

Unidirectional Pipelining

User story

As a developer, I want to create a pipeline of dependent jobs that are scheduled via a distributed queuing system. Those jobs are computational services and consist of an input descriptor, a link to a docker image with the actual kernel, and an output descriptor. Consecutive jobs are assumed to be compatible with each other. To reduce data traffic, parts of the pipeline can run on the same node.

Definition of Done

  • Define json descriptors for input and output data structures for typical data types (such that they can later on be interpreted for UI elements)
  • The queue allows for dependent jobs
  • Jobs can be aborted and there is a possibility to access logs and progress
  • When a job is done, a notification will be sent to listeners
  • Input/Output descriptors are attached to the docker images and can be queried by the frontend later on

Bonus

  • The computational backend is integrated into the qx/tornado framework

Use case

Generalize the prototype workflow example (function parser + evaluator). The rendering should be a second, dependent job.
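The dependent-job ordering described above can be sketched with Python's standard graphlib; the job names follow the parser/evaluator/renderer use case and are illustrative only:

```python
# Sketch of dependency-ordered scheduling for a unidirectional pipeline:
# each job lists the jobs it depends on, and a job may run only after
# all of its dependencies are done.
from graphlib import TopologicalSorter

pipeline = {
    "parser":    [],            # function parser, no dependencies
    "evaluator": ["parser"],    # depends on the parser's output
    "renderer":  ["evaluator"], # the second, dependent job
}

def schedule(pipeline: dict) -> list:
    """Return one valid execution order honoring all dependencies."""
    return list(TopologicalSorter(pipeline).static_order())

print(schedule(pipeline))
# ['parser', 'evaluator', 'renderer']
```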

iSeg -> Open Source

USER STORY

As an external contributor I would like to get access to the iSeg code and add features or fix issues.

DEFINITION OF DONE

  • Code is cleaned up (using what metric?)
  • Carefully check 3rd party dependencies for legal issues w.r.t. the iSeg license (tbd)
  • The code is published somewhere
  • we assume that Esra is able to check out, compile and run it, being provided only with the GitHub repository link

As outcome from the review:

  • also provide precompiled version of it on github
  • still needs some minor cleanup and renaming

BACKGROUND

As part of our agreement, iSeg shall be openly available for the public.

Examine Web/Online 3D Solid Modeling Packages

Goal:
In order to facilitate the assessment of web/online modeling capabilities, it would be great to examine existing state-of-the-art solutions. This is about the question of how close we can push the user experience of an online modeling/simulation solution to the desktop experience, e.g., similar to the one in S4L, with respect to 3D solid modeling combined with the handling of heavy models, e.g., a ViP model.

This shall encompass:

  • have a look at www.onshape.com
  • look for alternative packages, check what is industry standard, etc.
  • generate a login, dig into it, check the loading and handling of heavy models and meshes, snapping, drawing capabilities, etc.
  • check what 3D modeling libraries/packages are used within these solutions ... what import export formats do they support, etc. ... licensing/ open source restrictions for these libraries, etc.
  • examine the user experience when doing modeling operations on heavy models/meshes ... latency, delays, slow down, etc.
  • is the user experience similar to a desktop solution ?

oSPARC Framework, Design, Mockup

STORY
Based on the current GUI/style/suggestion from the Team:

  • Nik initiates and establishes a team and procedures to address all oSPARC platform GUI/design issues, user experience, perception, etc.
  • 2 dedicated meetings next week (w 19)
  • MVP GUI workflows and procedures are outlined and defined in detail
  • mockups of GUI, workflows, situations are provided
  • GUI design, style, perception are outlined and provided

AMC support

User story

It has been decided that AMC gets developer support for various issues related to NEURON. This support is time-boxed to 2 days a month.

Definition of done

  • scu (SimCoreUser) will be the new user for host&container in order to avoid being root. ID with a very high number. Mechanism that the user in the host and in docker image are the same one.
  • adapt director
  • adapt sidecar (connected to PR #199)
  • remove the need for export RUN_DOCKER_ENGINE_ROOT=1, export DOCKER_GID=1042 and export HOST_GID=1000 in the Makefile

requires #215
links #300

Check ACIS Linux License

STORY
(note: this case does not require any work yet from the Team!)

  • Nik checks with Spatial (Frau Dietrich) if we have a Linux license for the ACIS modeler
  • if not -> check about cost (extend our Win license to Linux, etc.)

Strategy for Reducing Network Traffic when Loading Large Models

User story

As a user, I want to be able to interact with large models interactively without latency. Also, I do not want to spend much money on data traffic.

Definition of done

  • Investigate strategies for data reduction and come up with a solution
  • How can caching on client side help
  • A large vip model can be visualized in the browser without the user noticing that data is fetched on-demand from the server.
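One candidate strategy, client-side chunk caching, can be sketched with functools.lru_cache (fetch_chunk is a hypothetical stand-in for the real network call):

```python
# Sketch: client-side caching of on-demand model chunks so repeated
# views do not re-fetch the same data from the server.
from functools import lru_cache

FETCHES = {"count": 0}  # counts actual (simulated) network fetches

@lru_cache(maxsize=128)
def fetch_chunk(model_id: str, chunk: int) -> bytes:
    """Pretend to download one chunk of a large model."""
    FETCHES["count"] += 1
    return f"{model_id}:{chunk}".encode()

fetch_chunk("vip-duke", 0)
fetch_chunk("vip-duke", 0)  # served from the cache, no second fetch
print(FETCHES["count"])
# 1
```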

Pipeline creation

User story

As a developer I want to be able to create arbitrary pipelines out of computational services in the frontend. Mockup services can be used for finding compatible services but the metadata for input/output data structures need to be defined such that they can be used for the MVP.

Definition of Done

  • Services can be interactively added to the pipeline
  • Only compatible services can be attached to each other
  • Define the data structure needed for this (compatible with the pipeline that is being used in the comp. backend)
  • Multiple in- and outputs can exist (ports)
  • Settings can be defined by input
  • Branching
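A minimal sketch of the compatibility check, assuming each port declares a type in its metadata (the field names are illustrative, not the real descriptor format):

```python
# Sketch: deciding whether two services can be connected, based on
# declared port types, so only compatible services attach to each other.

def compatible(out_port: dict, in_port: dict) -> bool:
    """An output can feed an input when their declared types match."""
    return out_port["type"] == in_port["type"]

solver_out = {"name": "field",  "type": "vtk-mesh"}
viewer_in  = {"name": "data",   "type": "vtk-mesh"}
plot_in    = {"name": "series", "type": "timeseries"}

print(compatible(solver_out, viewer_in))  # True
print(compatible(solver_out, plot_in))   # False
```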

Cardiac model from Kember (Java code)

User story

As a user I want to create a pipeline consisting of the cardiac model from Kember and a post-processing service that reproduces the plots in the official MVP use case. I am able to edit all necessary input settings directly in the frontend. Optionally, I would like to have the option to choose a compiler and compiler flags in case I want to build the kernel first.

Definition of done

  • The model is available in the list of services
  • The required pipeline can be created
  • The code is containerized as binary and as code only with compilation performed on the fly
  • For the second part of the pipeline, either python or octave/matlab will be used
  • check if there is a need/possibility to have settings (property window)
  • the results of the python postpro script should be mapped as time series to the outputs, such that they can be visualized with the 2d plot service (item was modified)

Use case

There is documentation/code example for this case in filesrv. In addition, the full workflow is available on MaG's VM.

Initial Python Service (Accepts Script as Input)

User story

As a user I want to have the possibility to use my python scripts as part of the pipeline to do, for example, postprocessing.

Definition of done

  • Docker service that accepts a python script as input and runs it as a service on the osparc platform
  • Optional: provide a window in which the python script can be edited (guess what that is gonna be)

simcore - devops

This is a placeholder for issues related to simcore as a platform and for automation of the development process.

In-house cluster

  • Mirror AWS setup locally (what does this mean)

AWS EC2

Cardiac Tissue model (myocytes) from Colleen Clancy (C++ code)

User story

As a user I want to create a pipeline consisting of the cardiac tissue model from Colleen Clancy and a post-processing service that reproduces the plots in the official MVP use case. I am able to edit all necessary input settings directly in the frontend. Optionally, I would like to have the option to choose a compiler and compiler flags in case I want to build the kernel first.

Definition of Done

  • The model is available in the list of services
  • The required pipeline can be created
  • The code is containerized as binary and as code only with compilation performed on the fly
  • For the second part of the pipeline, either python or octave/matlab will be used
  • check if there is a need/possibility to have settings (property window)
  • the results of the python postpro script should be mapped as time series to the outputs, such that they can be visualized with the 2d plot service (item was modified)

Use case
There is documentation/code/Kepler example for this case in filesrv. In addition, the full workflow is available on MaG's VM.

Properties/Settings Input (Window)

User Story

As a user I want to be able to select from a list of available services. When selecting one I expect to have access to the specific input parameters for this service and I can edit them.

Definition of Done

  • A properties window is available
  • Following input data types can be edited: floats, integer, strings, files (uploadable)
  • User is able to select an input port as setting instead of typing it in

Use Case

Start with mockup data but keep MVP in mind (Clancy).
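A minimal sketch of type-checked settings covering the listed input types (the descriptor layout is an assumption, not the real metadata format):

```python
# Sketch: validating property-window inputs against a setting's
# declared type (floats, integers, strings, file references).

CASTS = {"float": float, "integer": int, "string": str, "file": str}

def validate(setting: dict, value):
    """Cast a raw value to the setting's declared type, or raise."""
    cast = CASTS[setting["type"]]
    return cast(value)

dt = {"name": "dt", "type": "float"}
print(validate(dt, "0.01"))  # 0.01
```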

qx-py-s4l boilerplate

  • Hello world button in qooxdoo
  • Python based Web server serving qooxdoo HelloWorld website
  • Web socket connecting Server and Client
  • Connection between Server and S4L
  • Everything is in containers
  • Use this as an example to create a protocol for creating an automated landing page for the review

Multi-User Framework Setup

Goal:
To setup a general multi-user web platform framework.

As a user:

  • I want to see a form where I can register.
  • Typical login, name, pass, etc. setup ... recovery, etc. ... email sent, etc.
  • When accessing the platform, i do so by login and i will then only get access to services/data/models for which i have corresponding rights
  • Optional (not so urgent): I can define specific user levels/rights.
  • Later: relate these accounts to the DAT-CORE accounts

user story:

  • As a user I log in and can see the studies that I own. In addition, I can access studies from others, provided I have corresponding rights (look at the pipeline and results, run the pipeline, create a modifiable copy, modify parameters in the original study and run it, or even modify the study setup). For that purpose, it should be possible to provide other users or user groups with access to one's project at the corresponding level (e.g., by specifying an invitation with rights in the dashboard that gets sent by mail to users). I only get access to services for which I have access permission.
  • For ISAN, we only need limited functionality: 1) a non-login (default) user that can only access predefined studies (Ward, Clancy, Bornstein for now), (tbd: modify parameters and run the studies), and look at results; 2) access with a user login that provides access to own studies and to all services we decide to make available to beta users. See as well #640.
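The access levels above can be sketched as an ordered hierarchy (the level names are illustrative, not the platform's actual permission model):

```python
# Sketch: ordered access levels for shared studies; a stronger level
# implies every weaker one (view < run < copy < edit).

LEVELS = ["view", "run", "copy", "edit"]

def allowed(granted: str, requested: str) -> bool:
    """A granted level permits that action and every weaker one."""
    return LEVELS.index(requested) <= LEVELS.index(granted)

print(allowed("run", "view"))  # True
print(allowed("run", "edit"))  # False
```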

Hardcore Modeling Operations

User story

As a user I can model and place an "electrode" onto a ViP model/mesh.

Definition of done

  • I can load a ViP model into the 3D framework.
  • I can pan and rotate it smoothly.
  • I can snap onto the surface/mesh of the ViP model.
  • I can draw a spline onto the surface of the ViP model using snapping and drawing points/elements of the spline.
  • In this sense, I can interactively draw a cylinder (= "electrode") normal to the surface of the ViP model at a selected location.
  • Or I can draw the cylinder somewhere else and then use rotation and translation to bring it close to the ViP model surface.

Amazon

User story

As a developer I want to have the prototype and the associated technologies running on AWS. I want to get a feeling for prices and network latency when transferring large docker images and models.

Definition of done

  • SIM-CORE platform as defined in D1.1 is setup on AWS
  • Private SIM-CORE docker registry stored on AWS
  • Cost estimates for operating the full platform
  • Documentation
  • Deploy in house: Reproducible hardware/software configurations (Ansible)
