Comments (4)

marcelblum commented on May 20, 2024

Sure. There are a few gotchas because of Electron, Chromium, and Web Audio API quirks, but I have it working pretty well now. I was thinking of publishing a package for this if there's demand, since there's not much info on the web about how to do this.

On Windows I have it working well with Electron 12+. On Mac, you'll need to use Electron 14+ for it to work well because of Chromium Mac-specific issues with audioWorklet in earlier versions. Note also the warning in the Electron docs about using native addons in workers, but I have not run into any of the problems mentioned there while using Audify.

In the main process, you'll need to instantiate your BrowserWindow with a webPreferences options object that includes { nodeIntegrationInWorker: true, nodeIntegration: true, contextIsolation: false }. Probably helps to add backgroundThrottling: false too.

In the renderer process, the best way I found to do this is to include your audioWorklet code as a string that is turned into a Blob and then an object URL. This is because require inside workers in Electron has some quirks: due to isolation/security it does not have the relative path resolution capabilities of regular require, so you have to feed it the absolute path to the Audify module, which of course varies depending on whether the app is packaged and where it is installed on the user's system. To get around this, resolve the path to Audify at runtime in the renderer using require.resolve().
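In isolation, the path injection looks something like this (a minimal sketch; only the require line of the worklet source is shown):

const audifyPath = require.resolve("audify"); //absolute path, valid whether or not the app is packaged
const workletSource = `const { RtAudio } = require(${JSON.stringify(audifyPath)});`; //plus the rest of the worklet code
const workletURL = URL.createObjectURL(new Blob([workletSource], { type: "application/javascript; charset=utf-8" }));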

Also, note that in an audioWorklet the Web Audio buffer size is fixed at 128 samples, at least for now (see the notes here). So it's easiest to manage if you set your RtAudio stream to a 128 frameSize, though it is possible to handle smaller or larger sizes if needed (sometimes that's necessary with Windows ASIO, see #18); it just adds a bit of complexity to the worker code, as in the sketch below.
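For example, if a device needs a larger frameSize such as 256, the worklet can accumulate two 128-sample blocks per channel before each write. A minimal sketch of that idea (the 256 frameSize and the pushBlock helper are just illustrative assumptions):

const webAudioBlockSize = 128;                      //fixed Web Audio render quantum
const rtFrameSize = 256;                            //example RtAudio frameSize (assumption)
const pending = new Float32Array(rtFrameSize * 2);  //non-interleaved stereo: [left..., right...]
let filled = 0;                                     //samples accumulated per channel so far
function pushBlock (left, right) {                  //call from process() with inputs[0][0] and inputs[0][1]
  pending.set(left, filled);
  pending.set(right, rtFrameSize + filled);
  filled += webAudioBlockSize;
  if (filled === rtFrameSize) {
    RtASIOInstance.write(pending);                  //a full RtAudio frame is ready
    filled = 0;
  }
}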

So a basic example has renderer code that looks something like this (it assumes you have already created an audioContext, decided which deviceId and firstChannel you want to use, and a fixed buffer size of 128; it also assumes ASIO):

var RtNode;
audioContext.audioWorklet.addModule(URL.createObjectURL(new Blob([`
  const RtASIOInstance = new (require(${JSON.stringify(require.resolve('audify'))}).RtAudio)(7); //7 = RtAudioApi.WINDOWS_ASIO
  const defaultSamplesPerFrame = 128;
  const samplesPerFrame2x = defaultSamplesPerFrame * 2;
  RtASIOInstance.openStream(
    { deviceId: ${yourChosenDeviceID}, nChannels: 2, firstChannel: ${yourChosen1stChannel} },
    undefined,
    16, //RtAudioFormat.RTAUDIO_FLOAT32; float32 is the native format web audio delivers buffers to process()
    ${audioContext.sampleRate},
    defaultSamplesPerFrame,
    "MyStream",
    undefined,
    () => {
      console.log("played 1st buffer successfully");
      RtASIOInstance.setFrameOutputCallback();
    },
    1, //non-interleaved
    (error, message) => console.warn(error, message)
  );
  RtASIOInstance.start();
  class RtRouter extends AudioWorkletProcessor {
    constructor () {
      super(); //required before `this` can be used in a derived class
      //might want to do stuff here like initialize messaging between the worker and renderer process
      //this.port.onmessage = ...
    }
    process (inputs) {
      if (inputs[0]?.length > 1) { //inputs will be empty if no audio node is connected or something goes wrong in the renderer
        const bufferConcatenation = new Float32Array(samplesPerFrame2x);
        bufferConcatenation.set(inputs[0][0]); //left channel data
        bufferConcatenation.set(inputs[0][1], defaultSamplesPerFrame); //right channel data
        RtASIOInstance.write(bufferConcatenation);
      }
      return true;
    }
  }
  registerProcessor("deviceRouter", RtRouter);
`], { type: "application/javascript; charset=utf-8" }))).then(() => {
  RtNode = new AudioWorkletNode(audioContext, "deviceRouter");  
  RtNode.onprocessorerror = (e) => console.warn("error from RtNode", e);
  //now you can do anyWebAudioNode.connect(RtNode);
}).catch((reason) => console.warn("Rt device routing failed", reason));

Obviously this is a simple example; in real-world use you'll likely want to handle messaging with the worklet, run getDevices() and return the result, etc. The above code is untested; I just adapted it from more complex code specific to my own app, so apologies in advance if it has some stupid typo or error :)
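For instance, a device-listing round trip over the worklet's message port could look something like this (a rough sketch layered on the example above; the "getDevices" message name is made up):

//inside RtRouter's constructor in the worklet string, after super():
this.port.onmessage = (e) => {
  if (e.data === "getDevices") {
    this.port.postMessage({ devices: RtASIOInstance.getDevices() });
  }
};

//back in the renderer, once RtNode exists:
RtNode.port.onmessage = (e) => console.log("available output devices", e.data.devices);
RtNode.port.postMessage("getDevices");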

marcelblum commented on May 20, 2024

In case anyone comes across this, I just wanted to post the solution I settled on. The best way to get Web Audio raw output buffers in realtime seems to be via an audioWorklet, which is actually a pretty simple method with no need to mess with WebRTC streams at all. So I have the worklet itself instantiate RtAudio from audify and pipe the raw input PCM data directly to rtAudio.write() (in Electron, worklets can call Node.js modules as long as nodeIntegrationInWorker is true). I connect it to a Web Audio node back in the main thread (in my case I connect my main master output node to the worklet), and the audio is routed to the RtAudio stream nicely, tested up to 32-bit 96kHz output with no glitching. The only issue I came across is that some ASIO devices I tested did not seem to work with RtAudio. I may open a separate issue about that to investigate further if I can't figure it out.

If anyone's more curious I can post some sample code. Or if anyone has any feedback/ideas for a better implementation, I'm open to hearing them.

Durgaprasad-Budhwani commented on May 20, 2024

@marcelblum - if possible, could you please provide some sample code?

Durgaprasad-Budhwani commented on May 20, 2024

Hi @marcelblum, thank you very much for the code snippet. Much appreciated. I will reach out to you again if I get stuck anywhere.
