
Comments (20)

nkzawa commented on July 4, 2024

Umm, can't you just use common stream events like readable, finish, end, drain and error to track those states?

from socket.io-stream.

peteruithoven commented on July 4, 2024

Maybe. I'm having a hard time reading the docs, especially because there are multiple versions.

I would really appreciate some help clearing this up.

  • I understand that readable is emitted when data can be read, but that isn't the same as it actually being read (streaming), right?
  • I understand finish is a writable stream event, so we could use that to know whether an outgoing stream (in our case the stream we send to a consumer, using an emit) has finished successfully?
  • The end event would then be the way to know whether an incoming stream finished successfully / was completely consumed?
  • I'm not sure how I could use the drain event; would it indicate that a writable / outgoing stream is being handled (piped) and is ready to receive events?

Summary:

  • readable: ?
  • finish: outgoing stream has finished successfully?
  • end: incoming stream finished successfully?
  • drain: ?
  • error: something went wrong with the outgoing or incoming stream.


nkzawa commented on July 4, 2024

I was just listing events which might be useful for your purpose. I'll answer your questions as much as possible, but please understand that the stream API is not easy for me either.

I understand that readable is emitted when data can be read, but that isn't the same as it actually being read (streaming), right?

I think it depends on how you define streaming. When you pull data, readable means data is available so you can call the read method, which might count as streaming for you. But if you want data pushed to you, the data event might be more appropriate.

I understand finish is a writable stream event, so we could use that to know whether an outgoing stream (in our case the stream we send to a consumer, using an emit) has finished successfully?

This is not always true, because an outgoing stream can also read data, so the finish event only means all data has been sent to the corresponding remote stream. The end event is the same in that respect.

I'm not sure how I could use the drain event; would it indicate that a writable / outgoing stream is being handled (piped) and is ready to receive events?

drain means the buffered data has been consumed, so you can write more data to the stream. This might not be useful for your purpose.

I'd like to add more APIs only when they are necessary, or when they are very useful for a common operation that happens frequently.


peteruithoven commented on July 4, 2024

@nkzawa, that's very honest. I'll try to set up some tests to figure out how we can use these events.
I think this is more necessary when using socket.io-stream, because there is more latency (limited internet bandwidth), an internet connection is fragile, and we don't always know what happens on the sending or receiving side. That last aspect is something I'm facing because I'm creating an API.

I understand that readable is emitted when data can be read, but that isn't the same as it actually being read (streaming), right?

I think it depends on how you define streaming. When you pull data, readable means data is available so you can call the read method, which might count as streaming for you. But if you want data pushed to you, the data event might be more appropriate.

But available doesn't necessarily mean someone has piped the stream (or is handling it in another way) on the receiving side, right?
I understand that the data event is interesting, but I also understand that it puts the stream in another mode; I'd like to leave that choice to the receiving end (my API users).

I understand finish is a writable stream event, so we could use that to know whether an outgoing stream (in our case the stream we send to a consumer, using an emit) has finished successfully?

This is not always true, because an outgoing stream can also read data, so the finish event only means all data has been sent to the corresponding remote stream. The end event is the same in that respect.

I'm not following you; isn't the fact that all data has been sent the same as finished successfully?

I'm not sure how I could use the drain event; would it indicate that a writable / outgoing stream is being handled (piped) and is ready to receive events?

drain means the buffered data has been consumed, so you can write more data to the stream. This might not be useful for your purpose.

That does sound promising; it does indicate reading activity on the consuming end. I only worry about how long it takes before we get the event.


nkzawa commented on July 4, 2024

But available doesn't necessarily mean someone has piped the stream (or is handling it in another way) on the receiving side, right?

No. I'd like to know why you need this.

I'm not following you; isn't the fact that all data has been sent the same as finished successfully?

Ah, I just wanted to say IOStream is duplex, so both reading and writing are enabled.

I'm thinking that additional events for IOStream might be more appropriate if we add an API.


peteruithoven commented on July 4, 2024

But available doesn't necessarily mean someone has piped the stream (or is handling it in another way) on the receiving side, right?

No. I'd like to know why you need this.

  1. Because of this issue: #36
  2. I'm building an API where I need to forward streams, and I want to make sure a stream is completed before a sender can send the next one. So I would also like to check whether the current stream is already being consumed; if it isn't (because there was no consumer), the current stream is replaced with the new stream.


nkzawa commented on July 4, 2024

Because of this issue: #36

Am I correct that additional events are not required when you use flowing mode?
Anyway, I think piping to multiple streams is not officially supported in Node.js, even though it might work by chance.

I'm building an API where I need to forward streams, and I want to make sure a stream is completed before a sender can send the next one. So I would also like to check whether the current stream is already being consumed; if it isn't (because there was no consumer), the current stream is replaced with the new stream.

In this case, you can just use the finish or end event IMO.


peteruithoven commented on July 4, 2024

Because of this issue: #36

Am I correct that additional events are not required when you use flowing mode?
Anyway, I think piping to multiple streams is not officially supported in Node.js, even though it might work by chance.

Well, the act of listening for the data event is actually consuming the stream; if the consumer hasn't responded and piped the stream yet, this will probably break the stream. (I don't like flowing mode.)

I'm building an API where I need to forward streams, and I want to make sure a stream is completed before a sender can send the next one. So I would also like to check whether the current stream is already being consumed; if it isn't (because there was no consumer), the current stream is replaced with the new stream.

In this case, you can just use the finish or end event IMO.

Not true; these don't tell me whether a stream is currently being consumed (someone is consuming it but hasn't finished yet). I currently use the internal socket.io-stream ...-read event for this.


nkzawa commented on July 4, 2024

Anyway, I think it's not good to add a status property or new events only for piping multiple streams. If you have any other reasons, please let me know.


peteruithoven commented on July 4, 2024

It's also needed to throttle incoming streams; see point 2 in #37 (comment)


nkzawa commented on July 4, 2024

Sorry, maybe I don't understand what you meant, but can you really not use the finish event?
The docs for the finish event say it is emitted when all data has been flushed to the underlying system, and the underlying system is a remote IOStream in this case.

If I'm misunderstanding, could you post sample code for throttling incoming streams, please?


peteruithoven commented on July 4, 2024

This would be my idea for throttling incoming streams:

var nsp = io.of("/mynamespace");
var fileStreams = {};
var currentStreamID;

nsp.on('connection', function(socket){
  socket.on('streamEvent', function(stream, data, callback) {

    var currentStream = fileStreams[currentStreamID];
    if(currentStream) {
      if(currentStream.streaming) {
        // Cancel incoming stream: still streaming a previous stream in this namespace
        if(callback) callback(new Error("Still streaming a previous stream in this namespace"));
        return;
      } else {
        // Remove previous stream
        delete fileStreams[currentStreamID];
      }
    }
    currentStreamID = stream.id;
    currentStream = stream;
    fileStreams[currentStreamID] = currentStream;

    // Remove stream when finished streaming.
    // Capture the id, so this handler doesn't delete a later stream
    // after currentStreamID has moved on.
    var streamID = currentStreamID;
    currentStream.on('end', function() {
      delete fileStreams[streamID];
    });

    // ToDo: handle failed stream

    var outgoingStream = ss.createStream();
    ss(targetSocket).emit("forwardedStreamEvent", outgoingStream);
    stream.pipe(outgoingStream);
    // Hack to determine whether targetSocket is actually consuming the stream
    targetSocket.once(ss.Socket.event + '-read', function() {
      // Consumer is consuming the stream
      currentStream.streaming = true;
    });
  });
});

So it doesn't accept a new stream while there is still a stream that is being consumed.


nkzawa commented on July 4, 2024

Why is the streaming property required? It looks like you can just check whether currentStream exists or not.


nkzawa commented on July 4, 2024

Please explain why you need to know when a stream starts to read data.


peteruithoven commented on July 4, 2024

No, because I only want to cancel an incoming stream when there is a stream that is being consumed. When the current stream isn't being consumed, it can just be overridden.
So I need to know whether it's being consumed, to decide whether to reject an incoming stream (allowing the current stream to finish streaming) or override the current stream.


nkzawa commented on July 4, 2024

Sorry, but I think this is not something this module should support; your use case is too specific.
Maybe you can solve it yourself. For example, you could emit a socket.io event as a signal that consumption has started, instead of listening for $stream-read.


peteruithoven commented on July 4, 2024

You mean from the consumer (socket.io-client)? I can't, since I'm building an API; I don't know what happens there and can't depend on it.
Maybe once I get around to running the tests I want to do, I'll figure out a simple way and open a pull request. It could be just a streaming boolean.


peteruithoven commented on July 4, 2024

I found a solution for my problem of not being able to know whether a stream is being read: I override the push method.

var streamStreaming = false;

var orgPush = stream.push;
stream.push = function() {
  // The first push means data is actually being delivered to a reader.
  streamStreaming = true;
  stream.push = orgPush; // restore the original method after the first call
  return orgPush.apply(this, arguments);
};


peteruithoven commented on July 4, 2024

Also, checking stream._readableState.flowing and stream._readableState.ended looks very interesting. It works for regular file read streams.
You can use this to prevent a second, later pipe, which would normally deliver a partial "file".

The downside of using these _readableState properties is that they are "internal" / "private" properties; they are not part of the official Stream API, so there is a bigger chance of them changing in new versions.


peteruithoven commented on July 4, 2024

Looking into stream._readableState.flowing, I find the push call a slightly better indicator, mostly when you want to pipe the same stream to multiple streams.
stream._readableState.flowing is set to true after a first read, but at that point you can still pipe the same stream to another stream, because no data has been sent from the stream yet. After a push call, something has been written to the piped streams, so at that point you can't pipe it anymore.
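That check can be sketched as a small helper, with the caveat repeated: _readableState is internal and undocumented, so this may break between Node versions.

```javascript
// Heuristic: a stream can still safely be piped (from the start) if its
// readable side is neither flowing nor ended. _readableState is an
// internal, undocumented property, so treat this as fragile.
function canStillPipe(readable) {
  var state = readable._readableState;
  return !state.flowing && !state.ended;
}
```

The push-override trick above remains the stricter indicator, since it only fires once data has actually been handed to a reader.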

