Comments (3)
Did you try to convert the model using TensorRT?
No, I haven't tried! You are welcome to try, and please share your feedback so that others can benefit.
do you think building Tensorflow from source helps accelerate the project or it is just a minor improvement?
I don't think building TF from source will help much, since the prebuilt binaries are already optimised: for CPU they use the Intel acceleration instruction sets, and for NVIDIA GPUs they can use CUDA. Ideally you could download these binaries readily and use them. But since TF unfortunately doesn't offer prebuilt libtensorflow_cc.so (monolithic) binaries, you have to build it manually on your system. Instructions can be found here
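For reference, the monolithic C++ library is typically built with Bazel along these lines; treat this as a sketch, since the exact flags depend on your TF version and platform:

```shell
# Clone TensorFlow and build the monolithic C++ shared library.
git clone https://github.com/tensorflow/tensorflow.git
cd tensorflow
./configure                                       # answer the CPU/CUDA prompts
bazel build --config=opt //tensorflow:libtensorflow_cc.so
# The resulting library lands under bazel-bin/tensorflow/
```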
Is there any handling of the buffering needed for RT, e.g. Ring buffers for writing and reading that makes the library integration into a VST easier?
Yes, you would have to write a complete RTSP-like buffering layer to feed the stream in real time. Use FFmpeg and extend the existing code to support RTSP (Real-Time Streaming Protocol). Search online for more info; you will also find some good GitHub repositories for this.
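The ring-buffer idea mentioned in the question can be sketched roughly as below. This is a minimal single-producer/single-consumer buffer, not part of the spleeter sources; all names are illustrative, and a real VST integration would additionally use atomic indices (e.g. `std::atomic<std::size_t>`) so the audio callback and the inference thread can run concurrently without locks:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Sketch of a ring buffer for real-time audio: the audio callback
// pushes samples, a worker thread pops fixed-size blocks for inference.
class RingBuffer {
 public:
  explicit RingBuffer(std::size_t capacity)
      : data_(capacity + 1), head_(0), tail_(0) {}

  // Returns false (instead of blocking) when full, which keeps the
  // audio callback real-time safe.
  bool Push(float sample) {
    std::size_t next = (head_ + 1) % data_.size();
    if (next == tail_) return false;  // buffer full
    data_[head_] = sample;
    head_ = next;
    return true;
  }

  // Returns false when there is nothing to read.
  bool Pop(float* sample) {
    if (tail_ == head_) return false;  // buffer empty
    *sample = data_[tail_];
    tail_ = (tail_ + 1) % data_.size();
    return true;
  }

  std::size_t Size() const {
    return (head_ + data_.size() - tail_) % data_.size();
  }

 private:
  std::vector<float> data_;
  std::size_t head_;  // write index (producer)
  std::size_t tail_;  // read index (consumer)
};
```

The non-blocking `Push`/`Pop` pair is the important design choice: a real-time audio thread must never wait on a lock or allocate, so overflow is signalled to the caller rather than handled inside the buffer.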
what's the expected latency?
I have not estimated or measured the latency for this model, so I can't really comment. But since the model is quite small, it should be pretty fast if you have a decently accelerated platform.
from spleeter.
Hi @SuperKogito Yes, you could use NVIDIA CUDA acceleration by converting the model to TensorRT and adding a new inference engine that uses the NVIDIA inference APIs and model. If this is not an option, you could use the TFLite model (Android/iOS compatible) in real time by leveraging the Neural Engines that modern devices offer.
Basically, for each type of accelerator, you can introduce a new inference engine in the source code and convert the model for that accelerator. This should work out of the box (it may require a few tweaks here and there).
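The "one inference engine per accelerator" idea can be sketched as a small interface plus a factory. This is a hypothetical illustration, not the actual spleeter sources; a TFLite or TensorRT engine would wrap the corresponding runtime where the stub sits:

```cpp
#include <memory>
#include <string>
#include <vector>

// Common interface every accelerator-specific engine implements.
class InferenceEngine {
 public:
  virtual ~InferenceEngine() = default;
  virtual std::vector<float> Run(const std::vector<float>& input) = 0;
};

// Placeholder engine: a real TFLiteEngine / TensorRTEngine would load
// the converted model and call its runtime here. This stub just echoes
// the input so the sketch stays self-contained.
class StubEngine : public InferenceEngine {
 public:
  std::vector<float> Run(const std::vector<float>& input) override {
    return input;
  }
};

// Factory: select the engine for whichever accelerator is available.
std::unique_ptr<InferenceEngine> MakeEngine(const std::string& backend) {
  // Real code would branch on `backend` ("tflite", "tensorrt", ...).
  (void)backend;
  return std::make_unique<StubEngine>();
}
```

The rest of the application only talks to `InferenceEngine`, so adding a new accelerator means converting the model and implementing one new subclass, without touching the calling code.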
Thank you for this fast response <3
Did you try to convert the model using TensorRT? I took a look at the library, and as far as I know it has no max-pooling layer, which is part of the Spleeter architecture.
Leveraging the Neural Engines that modern devices offer is actually exactly what I am trying to do, but with https://github.com/gvne/spleeterpp. Currently, though, I am failing to use a TensorFlow built from source. As you are more experienced with inference engines, do you think building TensorFlow from source helps accelerate the project, or is it just a minor improvement?
Based on my analysis of your neat project, I noticed that this library mainly does inference on a WAV file that is read and passed as a CLI argument. Is there any handling of the buffering needed for real time, e.g. ring buffers for writing and reading, that would make integrating the library into a VST easier?
I am sorry for the many questions, but I have one more: in your experience, what's the expected latency?
I really appreciate you taking the time to answer me. Thank you so much :))