Comments (19)
@moxgeek ,
Thank you for opening this issue.
I'm not sure I understand where this happens.
Are these many different clients or one WebSocket client with many requests?
Do you have example code I can see?
Thanks,
Bo.
from plezi.
@boazsegev thank you again ...
I created multiple clients (200) using Ruby, JS, and even C# code.
Each client sends specific data as the request body to the server (Plezi), read with data = JSON.parse(request.body.read).
So my code is like this:
```ruby
def create
  data = JSON.parse(request.body.read)
  ids = data['id']
  content = data['content']
  # here I put the data into a database, but let's just display it in the log terminal
  puts content.to_json
  puts "done !"
end
```
So every client sends different data to the server, but at the same time.
The result is that some data is duplicated and other data just disappears.
I was sure this was linked to synchronizing the function,
so I did the same thing with a Mutex and it works fine (no duplication or disappearance of data):
```ruby
def create
  semaphore = Mutex.new
  semaphore.synchronize {
    data = JSON.parse(request.body.read)
    ids = data['id']
    content = data['content']
    # here I put the data into a database, but let's just display it in the log terminal
    puts content.to_json
    puts "done !"
  }
end
```
I believe iodine should handle this itself, not leave it to the controller.
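The lost-update behavior described here can be reproduced in plain Ruby, without any server: a read-modify-write on shared state is not atomic across threads, and a Mutex serializes it. This is a minimal sketch of the pattern, not the Plezi controller itself:

```ruby
require 'json'
require 'thread'

store = { "User:1" => "[]" } # shared state standing in for the database key
lock = Mutex.new

threads = 100.times.map do |i|
  Thread.new do
    # Without this synchronize, two threads can read the same old list and
    # one append overwrites the other (the "disappearing data" symptom).
    lock.synchronize do
      list = JSON.parse(store["User:1"])
      list.push("content-#{i}")
      store["User:1"] = list.to_json
    end
  end
end
threads.each(&:join)

puts JSON.parse(store["User:1"]).length # 100: no update was lost
```

Removing the `lock.synchronize` wrapper makes the final count nondeterministic, which is exactly the duplicated/missing data observed above.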
Hi @moxgeek ,
I think I understand a little better, thank you.
You have many clients creating data at the same time.
The `create` function is called once per client (?) but the data in the database is corrupted unless the database access is protected using a Mutex.
If I understand correctly, this might be an issue related to database connection corruption or the database access.
Are you using ActiveRecord or Sequel? Are you using something different for the database access?
The iodine gem tries to support ActiveRecord and Sequel out of the box because it's important to close and reconnect the connection pool every time a worker is spawned. As long as you are using a connection pool properly, this shouldn't happen.
Which version of iodine are you using? Which version of Plezi?
Thanks,
Bo.
Plezi version: 0.16.3
Iodine 0.7.18
and I'm storing the data in Redis.
I don't think the problem is related to the database; even when I remove the database storage and just display the length of the data (displaying the whole data would never help, it would be hard to debug),
the problem is still there.
> i don't think the problem is related to the database...
I can't recreate the issue. Do you have an example I can run? a controller?
Also, the client connections, are they WebSocket or HTTP? I tried recreating with both, but couldn't recreate a storage error (no corrupt or missing data).
> and i'am storing data in redis
Which Redis connector are you using? Are you sure it isn't related to the Redis connection?
Consider that concurrency is important for web servers and applications. All connections should be able to run in parallel, otherwise the application might be slow if it needs database access (exactly like when using a single thread).
This means that database connections should protect against data corruption issues that relate to concurrency. If this is iodine's Redis connection, maybe the issue is in that connection management? If it's another gem, maybe there's also an issue with that.
It's HTTP, not WebSocket (I use WebSocket to send data back).
My current controller (no problem with this structure):
```ruby
require "json"
require "thread"

# Replace this sample with real code.
class ExampleCtrl
  CHANNEL = "User"

  # HTTP
  def index
    # any String returned will be appended to the response. We return a String.
    "render"
  end

  def new
    'Should we make something new?'
  end

  # called when the request is POST or PUT and params['id'] isn't defined or params[:id] == "new"
  def create
    redis = Redis.new
    semaphore = Mutex.new
    semaphore.synchronize {
      "Hit the road jack ... no more no more no more"
      data = JSON.parse(request.body.read)
      ids = data['id']
      content = data['content']
      puts "#{content.length} --"
      ids.each { |x|
        begin
          # oldcontent is a simple array used to collect the data: if data for
          # this key already exists in Redis, the new data is appended to it;
          # otherwise an empty array is used. The begin/rescue handles the
          # exception JSON.parse raises when parsing an empty or nil value.
          oldcontent = []
          oldcontent.clear
          oldcontent = JSON.parse(redis.get("User:#{x}"))
        rescue # optionally: `rescue Exception => ex`
          puts "hello is empty"
          oldcontent.clear
        end
        oldcontent.push(content)
        puts "#{content.length} #{x} -- x"
        redis.set("User:#{x}", oldcontent.to_json)
        publish CHANNEL + x.to_s, oldcontent.to_json
        # with the Mutex, the same length is printed for the whole result,
        # even when multiple threads send data at the same time
        puts oldcontent.to_json.length
      }
    }
    puts "done !"
    redis.quit
  end
end

Plezi.route '/getdata', ExampleCtrl
```
My old controller (here the problem can be seen):
```ruby
require "json"

# Replace this sample with real code.
class ExampleCtrl
  CHANNEL = "User"

  # HTTP
  def index
    # any String returned will be appended to the response. We return a String.
    "render"
  end

  def new
    'Should we make something new?'
  end

  # called when the request is POST or PUT and params['id'] isn't defined or params[:id] == "new"
  def create
    redis = Redis.new
    "Hit the road jack ... no more no more no more"
    data = JSON.parse(request.body.read)
    ids = data['id']
    content = data['content']
    puts "#{content.length} --"
    ids.each { |x|
      begin # "try" block
        oldcontent = []
        oldcontent.clear
        oldcontent = JSON.parse(redis.get("User:#{x}"))
      rescue # optionally: `rescue Exception => ex`
        puts "hello is empty"
        oldcontent.clear
      end
      oldcontent.push(content)
      puts "#{content.length} #{x} -- x"
      redis.set("User:#{x}", oldcontent.to_json)
      publish CHANNEL + x.to_s, oldcontent.to_json
      # without the Mutex, every request prints a different length, which means
      # other threads appended multiple contents to oldcontent, while others
      # saw empty or only old content...
      puts oldcontent.to_json.length
    }
    puts "done !"
    redis.quit
  end
end

Plezi.route '/getdata', ExampleCtrl
```
For the Redis connector, I use only gem 'redis' in the Gemfile.
@moxgeek ,
After reviewing the issue, I think I have two observations you might want to consider.
- Your code introduces a race condition within the Redis database. Other database connections could alter the existing (old) data between the `get` and the `set` operations. See Redis transactions for more information. This isn't something Plezi could solve for you.
- At the moment, the `create` method opens a new connection each time it is called (using `Redis.new`). This is not only a performance-heavy task, it also exposes the application to known bugs. I'll explain: the `redis` gem appears to use `select` internally. The `select` system call has known limitations on many OSs; namely, any socket with a file descriptor over 1024 (or, on some implementations, 2048) will introduce bugs. This means that if you have 2048 concurrent clients (iodine supports tens of thousands of concurrent connections), none of them are allowed to start new Redis connections.
As for the first concern:
I think it will need a change of approach, such as pushing new data into a Redis list (using `LPUSH` and `LPOP`) rather than re-packaging the data as a mutable JSON string.
I can't help with this error because it is very application specific. It should be considered a bug and resolved on your end.
Using a Mutex solved it in a very local way. It doesn't help with scaling, since the Mutex is process/machine bound.
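The "Redis transactions" pointer refers to optimistic locking with WATCH/MULTI: re-read and retry whenever another writer touches the key between the get and the set. This toy in-memory model (plain Ruby, not the redis gem API, only its semantics) shows why the retry loop closes the race:

```ruby
require 'json'
require 'thread'

# Toy key/value store modeling Redis WATCH/MULTI semantics: a write only
# commits if the watched key's version is unchanged since it was read.
class ToyStore
  def initialize
    @data = {}
    @versions = Hash.new(0)
    @lock = Mutex.new
  end

  def get(key)
    @lock.synchronize { [@data[key], @versions[key]] }
  end

  # Commit only if nobody wrote the key in the meantime (optimistic locking).
  def set_if_unchanged(key, value, seen_version)
    @lock.synchronize do
      return false unless @versions[key] == seen_version
      @data[key] = value
      @versions[key] += 1
      true
    end
  end
end

store = ToyStore.new
threads = 10.times.map do |i|
  Thread.new do
    loop do
      raw, version = store.get("User:1")
      list = raw ? JSON.parse(raw) : []
      list.push("content-#{i}")
      break if store.set_if_unchanged("User:1", list.to_json, version)
      # another writer got in first: re-read and retry, like WATCH/MULTI
    end
  end
end
threads.each(&:join)

raw, _version = store.get("User:1")
puts JSON.parse(raw).length # 10: all writes survive, none are lost
```

A plain get/modify/set without the version check loses updates under concurrency, which is the bug described above.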
As for the second concern:
There are two possible solutions:
- Use a connection pool. This will minimize the number of open connections. Moreover, because the connection pool can be initialized before any clients connect, the numerical values of the file descriptors (sockets) will stay below 1024.
- Use iodine's evented Redis connection engine. Iodine uses two Redis connections per cluster. This is significantly fewer than the number of connections your application currently employs, and even significantly fewer than any possible connection pool. However, iodine doesn't allow Redis to block. In effect, calling Redis commands through iodine returns no meaningful value; the methods do nothing more than internal housekeeping and scheduling. All processing should be performed within callbacks, so it's impossible to return Redis results to the HTTP request.
For a connection pool, you might consider something similar to this (untested) code:
```ruby
require 'thread'
require 'redis'

# Use a Ruby Queue primitive for a thread-safe pool
REDIS_POOL = Queue.new

# Initialize the pool when iodine starts (once per process)
Iodine.on_state(:on_start) do
  # 16 connections per process
  16.times { REDIS_POOL << Redis.new }
end

# Close the pooled connections when we're done
Iodine.on_state(:on_finish) do
  REDIS_POOL.pop.quit until REDIS_POOL.empty?
end

# Replace this sample with real code.
class ExampleCtrl
  def create
    # collect a connection from the pool
    redis = REDIS_POOL.pop
    # ... do stuff, then return the connection to the pool
  ensure
    # return the connection even if an exception was raised
    REDIS_POOL.push(redis) if redis
  end
end
```
Using iodine's bundled Redis connection might look something like this:
```ruby
REDIS_CONNECTION = if Iodine::PubSub.default.is_a? Iodine::PubSub::Redis
                     # Redis was initialized from the command-line
                     Iodine::PubSub.default
                   else
                     # create a new Redis connection instance
                     Iodine::PubSub::Redis.new "redis://localhost:6379/"
                   end

# Replace this sample with real code.
class ExampleCtrl
  CHANNEL = "User"

  def create
    begin
      data = JSON.parse(request.body.read)
    rescue
      # replace this line with: render(:error) or whatever you use
      return [400, {}, ["Bad request!"]]
    end
    content = data['content'].to_json
    # replace redis.get & redis.set with a list command, LPUSH
    data['id'].each do |x|
      REDIS_CONNECTION.cmd("LPUSH", "User:#{x}", content) do
        # publish the data only after it was submitted to the Redis database
        publish CHANNEL + x.to_s, content
      end
    end
    "Scheduled!"
  end
end
```
Hi, sorry, I was away from my computer.
I tried your proposition and it's working well...
My code right now looks like this:
```ruby
#coding: utf-8
require "json"
require "thread"
require "redis"

Encoding.default_external = Encoding::UTF_8

REDIS_CONNECTION = if Iodine::PubSub.default.is_a? Iodine::PubSub::Redis
                     # Redis was initialized from the command-line
                     Iodine::PubSub.default
                   else
                     # create a new Redis connection instance
                     Iodine::PubSub::Redis.new "redis://localhost:6379/"
                   end

# Replace this sample with real code.
class ExampleCtrl
  CHANNEL = "User"

  # HTTP
  def index
    # any String returned will be appended to the response. We return a String.
    "render"
  end

  def new
    'Should we make something new?'
  end

  # called when the request is POST or PUT and params['id'] isn't defined or params[:id] == "new"
  def create
    data = JSON.parse(request.body.read)
    ids = data['id']
    content = data['content']
    ids.each { |x|
      REDIS_CONNECTION.cmd("LPUSH", "User:#{x}", content.to_json)
      REDIS_CONNECTION.cmd("LRANGE", "User:#{x}", "0", "-1") { |result|
        publish CHANNEL + x.to_s, "[#{result.join(', ')}]"
      }
    }
    puts "done !"
    "Hit the road jack ... no more no more no more"
  end

  # called when the request is POST or PUT and params['id'] exists and isn't "new"
  def update
  end

  # WebSockets
  def on_open
    Encoding.default_external = Encoding::UTF_8
    userid = params['id']
    subscribe "ALL"
    subscribe CHANNEL + userid.to_s
    REDIS_CONNECTION.cmd("LRANGE", "User:" + userid, "0", "-1") { |result|
      publish CHANNEL + userid, "[#{result.join(', ')}]"
    }
  end

  def on_message(data)
    userid = params['id'.freeze]
    REDIS_CONNECTION.cmd("del", "User:" + userid)
    publish CHANNEL + userid, "[]"
  end

  def on_close
    userid = params['id'.freeze]
    publish CHANNEL + userid, "#{@handle} left us :-("
    puts "#{userid} Disconnected !"
  end
end

Plezi.route '/getdata', ExampleCtrl
```
Everything works great if I submit POST data one by one; however, if I send data from threads (10 in my example), it shows a "bizarre" exception:
controllers/example.rb:35: [BUG] object allocation during garbage collection phase
the full stack : https://www.pastiebin.com/5c544b9b3fc7e
I tried to solve it by updating the Ruby version to 2.6.0 and 2.6.1 and nothing changed.
Hi @moxgeek ,
Thank you for exposing this issue 🙏!
I found the issue and I'm working on a fix. You'll have it later today, I hope.
Bo.
thanks!
Another issue, maybe...
Make a POST to /getdata with this JSON body:
{"id":[1,2,555],"content":{"entity":"forfait","update":[{"id":1,"start_date":"2018-12-28 13:38:02 +0100","end_date":"2018-12-31 13:38:02 +0100","duration":50,"rate_group":"XXX","status":"enabled","status_date":"12:38:02 +0100","created_at":"2018-12-27 13:38:02 +0100","description":" fil","image_path":"http://lorempixel.com/300/100/nature/","push_status":null}]}}
It works (the client receives a well-formatted JSON response),
but if we send this:
{"id":[1,2,555],"content":{"entity":"forfait","update":[{"id":1,"name":"XXXXX","start_date":"2018-12-28 13:38:02 +0100","end_date":"2018-12-31 13:38:02 +0100","duration":50,"rate_group":"XXX","status":"enabled","status_date":"12:38:02 +0100","created_at":"2018-12-27 13:38:02 +0100","description":" fil","image_path":"http://lorempixel.com/300/100/nature/","push_status":null}]}}
(I added only "name":"XXXXX")
the client receives a "bizarre" response:
[{
"entity": "������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������:null}]}]
Hi @moxgeek ,
Thank you for your patience and for exposing the Redis issue.
I released a patch for the Redis engine (v. 0.7.21). This should fix the issue where Ruby crashes due to a Global Lock violation.
As for the JSON issue - I can't replicate the issue. Maybe this is related to the content of XXXX or maybe I'm not replicating this issue properly.
Please let me know if the patch fixes your issue and if you can show me a way to replicate the JSON error.
Thanks!
Bo.
hello @boazsegev, thank you, I will pull iodine and let you know.
About the JSON: the XXXX is not confidential; the JSON example is exactly what I really send to Plezi.
In a browser-based ws-client, and in a Java environment:
[{
"entity": "��������������������������������������������
When I print the result in the terminal right after reading the request body, it shows the value as sent (working).
When I print the value from redis-cli, it shows the real value (working).
But when I print the value of the list after reading it back from Redis, exactly here:
```ruby
REDIS_CONNECTION.cmd("LRANGE", "User:" + userid, "0", "-1") { |result|
  puts "[#{result.join(', ')}]"
  publish CHANNEL + userid, "[#{result.join(', ')}]"
}
```
the result is:
[{"entity":":null}]}]
[{"entity":":null}]}]
[{"entity":":null}]}]
If I replace `"[#{result.join(', ')}]"` with `"[#{result.to_json}]"`, the result is:
[["{\"entity\":\"\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0
000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000:null}]}"]]
I just tried it with a simpler JSON. It fails with:
```json
{"id":[555],"content":{
"mmm":"xxx",
"WWW":"xxx",
"XXX":"xxx",
"AAA":"xxx",
"BBB":"xxx",
"CCC":"xxx",
"ZZZ":"xxx",
"EEEE":"xxx",
"1mmm":"xxx",
"1WWW":"xxx",
"1XXX":"xxx",
"1AAA":"xxx",
"1BBB":"xxx",
"1CCC":"xxx",
"1ZZZ":"xxx",
"1EEEE":"xxx",
"2mmm":"xxx",
"2WWW":"xxx",
"2XXX":"xxx",
"2AAA":"xxx",
"2BBB":"xxx",
"2CCC":"xxx",
"2ZZZ":"xxx",
"2EEEE":"xxx"
}}
```
and it works with:
```json
{"id":[555],"content":{
"mmm":"xxx",
"WWW":"xxx",
"XXX":"xxx",
"AAA":"xxx",
"BBB":"xxx",
"CCC":"xxx",
"ZZZ":"xxx",
"EEEE":"xxx",
"1mmm":"xxx",
"1WWW":"xxx",
"1XXX":"xxx"
}}
```
Maybe it is linked to the length of the JSON.
NB: this only happens when I use a Redis list; it works with a plain Redis string value.
I worked around the problem by using Redis without iodine:
```ruby
#coding: utf-8
require "json"
require "thread"
require "redis"

REDIS = Redis.new

# Replace this sample with real code.
class ExampleCtrl
  CHANNEL = "User"

  # HTTP
  def index
    # any String returned will be appended to the response. We return a String.
    "render"
  end

  def new
    'Should we make something new?'
  end

  # called when the request is POST or PUT and params['id'] isn't defined or params[:id] == "new"
  def create
    data = JSON.parse(request.body.read)
    Encoding.default_external = Encoding::UTF_8
    ids = data['id']
    content = data['content']
    ids.each { |x|
      REDIS.lpush("User:#{x}", content.to_json)
      rs = REDIS.lrange("User:#{x}", "0", "-1")
      publish CHANNEL + x.to_s, "[#{rs.to_json}]"
    }
    puts "done !"
    "Hit the road jack ... no more no more no more"
  end

  # called when the request is POST or PUT and params['id'] exists and isn't "new"
  def update
  end

  # WebSockets
  def on_open
    puts defined? REDIS
    userid = params['id']
    subscribe "ALL"
    subscribe CHANNEL + userid.to_s
    rs = REDIS.lrange("User:#{userid}", "0", "-1")
    publish CHANNEL + userid.to_s, "[#{rs.to_json}]"
  end

  def on_message(data)
    userid = params['id'.freeze]
    REDIS.del("User:" + userid)
    publish CHANNEL + userid, "[]"
  end

  def on_close
    userid = params['id'.freeze]
    # publish CHANNEL + userid, "#{@handle} left us :-("
    puts "#{userid} Disconnected !"
  end
end

Plezi.route '/getdata', ExampleCtrl
```
Hi @moxgeek ,
Thank you for keeping me posted. I understand that you upgraded to the latest iodine version and still experience the issue...?
Please note that:
- The regular Redis gem is likely NOT process/`fork` safe. This means you might want to re-initialize the Redis connection in worker processes, i.e.:
```ruby
Iodine.on_state(:on_start) do
  REDIS.quit
  Object.send(:remove_const, :REDIS)
  # a plain `REDIS = ...` inside a block is a SyntaxError, so use const_set
  Object.const_set(:REDIS, Redis.new)
end
```
Also, I think the Redis gem is thread safe, but I'm not sure it is. You'll need to check it out and possibly protect against race conditions.
- I think the regular Redis gem uses blocking IO (no async communication), which might mean that using it will add a performance penalty.
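The re-initialization hint above depends on Ruby's reflective constant API, because a literal constant assignment is not allowed inside a block. The dance can be checked in isolation with placeholder values (no Redis needed; the symbols here are stand-ins for real connection objects):

```ruby
# Placeholder value standing in for a live Redis connection.
REDIS = :old_connection

# A literal `REDIS = ...` inside a block (e.g. inside Iodine.on_state) raises
# "dynamic constant assignment" at parse time, so the constant has to be
# removed and re-bound reflectively:
Object.send(:remove_const, :REDIS)
Object.const_set(:REDIS, :new_connection)

puts REDIS.inspect # :new_connection, with no "already initialized" warning
```

In the real worker-process case, `:new_connection` would be `Redis.new`, created after the fork so the child does not share its parent's socket.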
As for the actual issue:
I'm currently traveling in Costa Rica and my computer-time and internet access are both limited.
I will continue looking into the issue. I suspect it might be related to memory reference counting or marking, either in relation to the Ruby GC or in relation to facil.io's memory allocations... which means that engaging the memory allocator might be required to reproduce the issue.
I am still trying to reproduce the issue.
Kindly,
Bo.
hello again @boazsegev
Actually, the only problem I have left is the encoding problem.
The Redis crash is solved by using the latest patch from the iodine git :).
To work around the encoding problem, I use the redis gem instead of iodine's Redis.
Two possible problems:
1 - maybe it's related to the implementation approach of Redis in iodine.
2 - or maybe it's related to my environment (NB: the problem manifests only in the LRANGE case with the cmd method).
3 - or both :).
btw, enjoy the beauty of Costa Rica ;)
Hi @moxgeek ,
Thank you very much for all the information, it was very helpful in exposing the issue.
I managed to re-create the issue and I think I fixed it in my latest commit.
I'm not releasing the patch yet, since I want to do some more tests, but I would appreciate it if you could confirm that the patch actually solves the encoding issue on your end.
Thank you very much for exposing this concern! It only happens with JSON encoded messages in the Redis engine reply that have many escaped characters, so it wasn't easy to pinpoint.
Cheers!
Bo.
(y) thanks, it's working without any encoding problem.
I changed iodine in the Gemfile to gem 'iodine', git: 'https://github.com/boazsegev/iodine.git'
and it's working now with Iodine::PubSub::Redis.
cheers 👍
I have a question: what is the difference between Iodine::PubSub::Redis and Redis.new from the Ruby gem?
Are there any performance benefits?
I'm happy we managed to solve this for you.
Thank you again for all the information and for opening this issue 👍🏻🙏🏻
As for your question, the Redis gem and iodine's Redis engine are significantly different in design.
TL;DR:
The iodine Redis engine is more opinionated and makes many tasks easier for the developer (set and forget).
The Redis gem offers more control over connectivity, but assumes the developer will manage connections, process forking and any other design choices.
Iodine::PubSub::Redis
The iodine Redis engine / driver uses the facil.io Redis extension under the hood.
This design is based on the idea that it's easier to horizontally scale application instances than to cluster database instances, and that the most important resource is the Redis database.
This approach attempts to minimize the number of connections hitting the database (2 connections per application instance) and avoids pipelining requests to minimize the memory load on the Redis server.
Iodine's approach also assumes an evented application. Calling Redis commands on iodine returns immediately, before the command is actually processed by (or even sent to) Redis, improving the responsiveness of WebSocket applications.
Overall, this might add slight latency to each Redis request (due to IPC), but it can significantly lower the costs of running a Redis Cluster.
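The "returns immediately, results only through callbacks" behavior described above can be sketched with a plain Ruby queue acting as the event loop. This is a toy model of the pattern, not iodine's actual engine; `ToyEngine` and its canned replies are invented for illustration:

```ruby
require 'thread'

# Toy command scheduler modeling an evented client: `cmd` returns at once,
# and a worker thread later executes the command and runs the callback.
class ToyEngine
  def initialize
    @queue = Queue.new
    @worker = Thread.new do
      while (job = @queue.pop) # a pushed nil ends the loop
        command, callback = job
        reply = "reply-to-#{command}" # stand-in for the real Redis reply
        callback.call(reply) if callback
      end
    end
  end

  # Schedules the command; deliberately returns no meaningful value.
  def cmd(command, &callback)
    @queue << [command, callback]
    nil
  end

  def shutdown
    @queue << nil
    @worker.join
  end
end

engine = ToyEngine.new
received = nil
ret = engine.cmd("LPUSH") { |reply| received = reply }
engine.shutdown

puts ret.inspect # nil: the caller got no result back
puts received    # the reply arrived through the callback instead
```

This is why, with an evented engine, results cannot be returned from the HTTP handler itself: by the time the reply exists, the handler has already returned.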
The Redis gem
On the other hand, the Redis gem uses either a Ruby socket client or the hiredis C library.
This is a straightforward approach that assumes nothing. The developer is required to make their own design choices and find their own solutions for performance and scaling concerns.
Thanks again!
Bo.