foobar404 / wave.js

Audio visualizer library for javascript. Create dynamic animations that react to an audio file or audio stream.

Home Page: https://foobar404.dev/wave.js/

License: MIT License

JavaScript 0.09% HTML 77.14% TypeScript 9.76% CSS 12.45% SCSS 0.55%
audio visualization canvas music oscillator javascript

wave.js's Introduction

Wave.js

Audio visualizer library for javascript.

Installation

Install With CDN

<script src="https://cdn.jsdelivr.net/gh/foobar404/wave.js/dist/bundle.js"></script>

Or NPM

npm i @foobar404/wave

Setup

If you're using NPM:

import {Wave} from "@foobar404/wave";

Usage

let audioElement = document.querySelector("#audioElmId");
let canvasElement = document.querySelector("#canvasElmId");
let wave = new Wave(audioElement, canvasElement);

// Simple example: add an animation
wave.addAnimation(new wave.animations.Wave());

// Intermediate example: add an animation with options
wave.addAnimation(new wave.animations.Wave({
    lineWidth: 10,
    lineColor: "red",
    count: 20
}));

// Expert example: add multiple animations with options
wave.addAnimation(new wave.animations.Square({
    count: 50,
    diameter: 300
}));

wave.addAnimation(new wave.animations.Glob({
    fillColor: {gradient: ["red","blue","green"], rotate: 45},
    lineWidth: 10,
    lineColor: "#fff"
}));

// The animations will start playing when the provided audio element is played

// 'wave.animations' is an object with all possible animations on it.

// Each animation is a class, so instantiate it with 'new' when passing it to 'addAnimation'

Contributing

Get involved! Check out the Contributing Guide for how to get started.

License

MIT

wave.js's People

Contributors

b-stud, cgratie, cyntler, dependabot[bot], foobar404, github-actions[bot], kgarner7, mrvwman, mvishu405, rogeriochaves, wooldoughnut310, zeusthedev


wave.js's Issues

Padding on 'rings' style is very large.

I was wondering if it is possible to decrease the padding on the waveform when using the 'rings' style.

As you can see there is a lot of whitespace left when using this style:
image

'ring' style for comparison:
image

P.S.
Another thing I noticed is that the image of the waveform is invisible until I click somewhere on the page, not sure though if it is supposed to do that.

React example doesn't work

The React example inside test/react-test doesn't work. The audio player works and I can hear/control the audio, but nothing is rendered to the canvas.

I've tried running it with the latest versions of both Chrome and Firefox, with the same result.

MediaStream with multiple tracks

Hi,
first off, you did great work.

I'm trying to visualize a stream with multiple tracks, but the visuals are based on only one track (the first track in the stream).
I'm new to the Web Audio API; I would really appreciate your help.
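
One possible workaround, sketched below as an assumption rather than a confirmed fix: the Web Audio API can mix every audio track of the stream into a single-track MediaStream before it is handed to the visualizer. The variable name originalStream stands in for the multi-track stream you already have.

// Sketch: sum all audio tracks of a MediaStream into one track before visualizing.
declare const originalStream: MediaStream; // the existing multi-track stream

const audioCtx = new AudioContext();
const destination = audioCtx.createMediaStreamDestination();

originalStream.getAudioTracks().forEach((track) => {
    // Wrap each track in its own single-track stream so it can become a source node.
    const source = audioCtx.createMediaStreamSource(new MediaStream([track]));
    source.connect(destination); // every track is summed into the destination
});

// destination.stream now carries one mixed audio track; visualize this instead.
const mixedStream = destination.stream;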

Package does not have Angular ng-add support

Wave.js doesn't seem to have ng-add support for Angular. This is making it difficult to use with Angular, especially since this package can't be recognized by Angular even after an NPM install.

Error: src/app/components/posts/record/record.component.ts:7:20 - error TS7016: Could not find a declaration file for module '@foobar404/wave'. '/home/electrenator/Documents/Code/[redacted]/front-end/node_modules/@foobar404/wave/dist/bundle.cjs.js' implicitly has an 'any' type.
  Try `npm i --save-dev @types/foobar404__wave` if it exists or add a new declaration (.d.ts) file containing `declare module '@foobar404/wave';`

7 import {Wave} from '@foobar404/wave';
                     ~~~~~~~~~~~~~~~~~

I installed this package via the Angular CLI. This yields the same result as installing it via NPM. My install log:

$ ng add @foobar404/wave
ℹ Using package manager: npm
✔ Found compatible package version: @foobar404/[email protected].
✔ Package information loaded.

The package @foobar404/[email protected] will be installed and executed.
Would you like to proceed? Yes
✔ Package successfully installed.
The package that you are trying to add does not support schematics. You can try using a different version of the package or contact the package author to add ng-add support.

Currently I can't figure out how I can still use this package within my project. It would be nice if I could. :)

Software versions

Angular CLI: 12.2.13
Node: 12.21.0
Package Manager: npm 8.1.3
OS: linux x64
typescript: 4.3.5
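
A minimal workaround for the TS7016 part, following the compiler's own suggestion: add a local ambient declaration so TypeScript stops treating the module as implicitly any. The file path below is only a convention; any .d.ts file picked up by your tsconfig works.

// src/types/foobar404__wave.d.ts
// Ambient declaration so the untyped bundle can be imported without TS7016.
declare module "@foobar404/wave";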

Does not work at all

I tried the example from the Readme: created a simple HTML file and pasted the code. It did not work; all I could hear was an echo.

<script src="https://raw.githubusercontent.com/foobar404/Wave.js/v1.2.6/dist/bundle.iife.js"></script>
<script>
    let wave = new Wave();
    navigator.mediaDevices.getUserMedia({ audio: true })
        .then(function (stream) {
            wave.fromStream(stream, "output", { type: "shine", colors: ["red", "white", "blue"] });
        })
        .catch(function (err) { console.log(err.message) });
</script>

In safari for IOS17 audio is broken

I am developing a web site that uses this library, which involves audio playback.

This playback is accompanied by an animated canvas, and on iOS 17 in Safari this does not work: the audio is muted and the canvas does not animate.

However, in iOS 16 Safari it works perfectly.

I have also tried Safari on macOS and iPadOS and it works, which leads me to believe the problem is specific to iOS 17 Safari.

I should emphasize that on iOS 17 it does not work, yet no error appears in the console.

I would love to get help, or to know whether this is a bug so I can report it.

Can I help?

Hi @foobar404 , I absolutely love your library! I was in the process of creating my own JavaScript audio visualization library before I discovered your amazing work. The only other solution I found was WaveSurfer, but it didn't appeal to me for various reasons. I've noticed that your library is relatively new and could benefit from some enhancements. One area that seems to need improvement is the documentation. It can be challenging for those not familiar with TypeScript or those who prefer step-by-step guides. I have expertise in JavaScript, TypeScript, and React JS, although I haven't previously contributed to an open-source project.

Would it be possible for me to contribute? Perhaps we could collaborate on developing a more comprehensive documentation site. I'm eager to play a significant role in your project and would love to help create an engaging documentation site, possibly using Next.js. If you're looking for someone to assist in building a robust documentation website, both in terms of coding and potentially writing guides, I believe I'm the right person for the job!

I need someone to have a conversation with about a few aspects of project management. As I mentioned, I haven't contributed to an open-source project before. With a bit of guidance on your Git model, preferences, and best practices, along with some planning, I could potentially play a significant role in this awesome project. I have plenty of time available, possibly over 4 hours each day, to dedicate to this project.

thank you <3

Not showing up on iPhone safari

Hi,
first off what an amazing script you got here. Great job on this!

I was able to successfully display the waves in Firefox and Chrome, on desktop and Android devices.
But for some reason the waves do not appear in Safari on iPhone.

I'm running an icecast server and displaying the waves via audio tag like so..

<audio id="radio" crossorigin="anonymous">
       <source src="https://mystation.com/stream" type="audio/mpeg" />
       <source src="https://mystation.com/stream.ogg" type="audio/ogg" />
</audio>

and I am triggering the playing of the audio like this..

var player = document.getElementById('radio');
player.load();
player.play();

I have a canvas html element on the page like this..
<canvas id="wavesCanvas" height="250" width="500"></canvas>

The wave is set up like this..

var wave = new Wave();
var options = {type:"shockwave", colors: ["#6c2a99", "#e01919", "#b4751d"]};
wave.fromElement("radio","wavesCanvas",options);

The audio plays on all devices, including Safari on iPhone.
The visualization shows in all browsers except Safari on iPhone.
I should note that I have not tested Safari on a Mac because I do not own a Mac at the moment.

Is this a known issue?
Are any special configurations needed for this to work on Safari?
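
One thing worth checking (an assumption, not a confirmed fix): iOS Safari generally refuses to start audio, and the Web Audio graph behind the visualizer, outside of a user gesture, so calling player.play() on page load can leave the canvas blank even when desktop browsers cope. A sketch that ties playback to a tap, assuming a hypothetical button with id "play-button" next to the audio element:

// Start playback from a user gesture so iOS Safari allows audio and Web Audio.
const player = document.getElementById('radio') as HTMLAudioElement;
const playButton = document.getElementById('play-button') as HTMLButtonElement;

playButton.addEventListener('click', () => {
    player.load();
    player.play().catch((err) => console.log(err.message));
});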

IWaveOptions types and density (trying to reproduce shockwave)

(2.0 - I just upgraded today)

I'm trying to program some pre-sets for wave options and frequencies, and I'm finding that IWaveOptions (and its enum for frequencyBand) are hidden away, meaning I have to cast through any or do hacks like frequencyBand: f as "base" | "lows" | "mids" | "highs" to get it to work. I'm trying to reconstruct what was "shockwave" in version 1 (3 frequency waves, each a different color).

Also, Wave (the animation) is kind of over-powered when not frequency-split. I find that on a small canvas it is often at 100%, and so isn't really a very interesting visualization.
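
Until the option interfaces are exported, a local stand-in type avoids the as casts. This is only a sketch; it assumes nothing beyond the four frequencyBand string values already used in this thread, and wave is whatever instance you created per the README.

// Local stand-in for the unexported frequencyBand union.
type FrequencyBand = "base" | "lows" | "mids" | "highs";
declare const wave: any; // existing Wave instance from the README setup

const bands: FrequencyBand[] = ["base", "lows", "mids", "highs"];
for (const f of bands) {
    // f is typed, so no cast is needed when building the options object.
    wave.addAnimation(new wave.animations.Wave({ frequencyBand: f }));
}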

You did really good work! Thank you

I love this. It's just what I'm looking for. Effortlessly and efficiently selecting between multiple cool looking animations. Beautiful!

Thank you so much!

fromStream() Doesn't Visualize Due To self.visualize() Being Called Without the Frame Parameter

In the renderFrame() function inside of the fromStream(stream, canvas_id, options = {}) function, self.visualize() is being called without the frame parameter, and results in frame being undefined inside of the visualize function.

Simply changing the line that goes:
self.visualize(self.current_stream.data, self.current_stream.id, self.current_stream.options);

to instead be:
self.visualize(self.current_stream.data, self.current_stream.id, self.current_stream.options, 1);

I don't know if there's more to be fixed, but it seems that all the arguments passed for the frame parameter are 1 in the other functions. This solution worked enough for me to visualize a WebRTC stream.


Allowing to externalise sources storage to be able to create / destroy / create again the player

When using single-page apps that render the DOM depending on the current route, if the route that renders the Wave.js visualizer is destroyed and then created again, it will result in the following error:

bundle.cjs.js:13 Uncaught DOMException: Failed to execute 'createMediaElementSource' on 'AudioContext': HTMLMediaElement already connected previously to a different MediaElementSourceNode.

This error occurs because the sources are being stored in the class itself which is destroyed and created again :

this.sources[element.toString()] = {
                "audioCtx": audioCtx,
                "analyser": analyser,
                "source": source
            };

For classic webpages there won't be any problem since the visualizer will normally not be destroyed, but for all modern frameworks like React, Angular, VueJS, etc.. it will fail.

I think we should allow a way to override the default sources storage if needed, by passing a setSource method which will be used instead in all such cases:

this.sources[element.toString()] = {
                "audioCtx": audioCtx,
                "analyser": analyser,
                "source": source
            };

And a getSource method which will be used as a getter in such cases:

this.sources[element.toString()].animation

I created a simple working fiddle : http://jsfiddle.net/ma3zu8n5/1/
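
A hypothetical sketch of the proposed API, using the setSource / getSource names from this issue (they are not part of the current library): the app owns a cache that outlives component destroy/create cycles, and Wave.js would read and write through it instead of this.sources.

// Hypothetical: an app-level source cache that survives component re-creation.
const externalSources: Record<string, unknown> = {};

const options = {
    // Wave.js would call these instead of touching this.sources[...] directly.
    setSource: (key: string, value: unknown) => { externalSources[key] = value; },
    getSource: (key: string) => externalSources[key],
};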

Visualise Audio with muted audio

Is there a way you could implement visualising audio while the audio tag is muted? I want the bars to jump on autoplay but the user has to unmute it for a better UX. In a previous issue, someone suggested { connectDestination: false } but that no longer works

The sound disappears!

I really liked your library. And I want to use it in my pet project. I copied the code from your example, here it is:
https://jsfiddle.net/j6evzynt/

And there is no sound, no errors, and no visualization... only the seconds pass, as if the sound were muted.
What am I missing?

I checked in Chrome and Firefox, on Windows and macOS.

Rounding on lines doesn't work properly

image
When the lines are inactive they are rounded to a circle, but when extended they lose the rounded corners and return to hard lines.

Here is my code:

let audioElement: HTMLAudioElement | null = document.querySelector('#audio');
let canvasElement: HTMLCanvasElement | null = document.querySelector('#canvas');
if (audioElement && canvasElement) {
    let wave = new Wave(audioElement, canvasElement, true);

    wave.addAnimation(
        new wave.animations.Lines({
            lineWidth: 3,
            lineColor: 'white',
            count: 100,
            rounded: true,
            center: true,
            mirroredY: true
        })
    );
}

User interaction events watcher should be optional

Waiting for user events might be a must when the visualizer is loaded as soon as the page is loaded.

But when using it with single-page app frameworks, the user might have already interacted with the browser before the visualizer is loaded, which results in waiting uselessly for a user event before rendering the canvas.

The code that handles that :

if (this.activated) {
        run.call(waveContext);
    } else {
        //wait for a valid user gesture 
        document.body.addEventListener("touchstart", create, { once: true });
        document.body.addEventListener("touchmove", create, { once: true });
        document.body.addEventListener("touchend", create, { once: true });
        document.body.addEventListener("mouseup", create, { once: true });
        document.body.addEventListener("click", create, { once: true });
        element.addEventListener("play", create, { once: true });
    }

My suggestion is to add a boolean option, waitForUserInteractionBeforeRendering, defaulting to true; if false, then don't wait for anything.
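
A sketch of how that option could gate the snippet above; the option name comes from this proposal and is not part of the current API, and run, create and waveContext are the same internals quoted above.

// Hypothetical: skip the gesture listeners when the caller opts out.
const waitForUserInteraction = options.waitForUserInteractionBeforeRendering ?? true;

if (this.activated || !waitForUserInteraction) {
    run.call(waveContext);
} else {
    // existing behaviour: wait for a valid user gesture
    document.body.addEventListener("click", create, { once: true });
}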

When drawing a wave, keep already drawn elements on canvas.

Hey @foobar404, thanks for this amazing library!

When using the fromElement function it seems that it removes elements already drawn on the canvas and displays the wave. Is there an option to keep those elements and draw the wave on top of them?

Also, can we change the wave's positions using options?

v2.0 - changes to Wave types

Hello, I've been using the v1.x version of your lib and I was able to make the effect shown in the attached image with the Flower type.

flower_old

I wanted to migrate to v2.0, but the Flower effect is completely different. Shine looks kind of similar, but:

  1. The lines only go outside of the diameter.
  2. When I slide the volume to max, the lines are too big and I couldn't fix it with any options.

This is old Flower with maximum volume:

old_flower_max

This is new Shine with maximum volume:

shine_max

Do you think I can achieve such an effect with v2.0? Or would it be possible to get an option to specify the max width/height of specific animation types somehow?

Thank you.

Ability to visualise muted audio

In some circumstances, it is useful to visualise audio, but mute/set volume to 0 to not hear it at the same time. Please consider adding this feature or the ability to control the volume of the audio that is being visualised. I've tried to set the volume of the audio object I'm passing to "fromElement" but no luck - I can still hear myself (visualising my microphone input).

Doesn't work randomly in "Android Chrome"

I am working on a project with a "Words Picture Gallery". When you click on a word/picture, the web app plays "the word sound" (an mp3 file) and shows the bars wave using "foobar404/wave".

On desktop this works 100%, but in Android Chrome it randomly doesn't work, and when this happens the wave library gets stuck in an infinite loop over the renderFrame function. I noticed this by putting some console logs in the renderFrame function.

Also, when this happens, a console.log on the "data" variable shows this:

Uint8Array(1024) [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, …]

Also, the event "ended" on the audio element never fires

[Evolution Proposition] Full rewrite of Wave.js in Typescript, general improvements & fixes

Hello @foobar404,

Since more and more people are using Wave.js, I think this is the right moment to go a step further and rewrite the library in TypeScript to make it cleaner, more consistent, and easier to maintain.
I spent some days on this task and I'm coming back with a working, tested proposition.

Since it would be a huge PR and I'm not sure how you wish (or don't wish) to integrate it, I put the proposition in my own temporary repository for the moment. You can take a look and decide what to do with it Here.

I'd be happy to help you with the integration if you need anything.

I tried to address the following concerns :

  • Taking benefit of all typescript features.
  • Typing variables (safer usage, clearer exposure, compiler checks benefit, easier readability & easier maintainability, autocompletion).
  • Separation (isolation) of concerns for the different helpers (eg. skipUserCheck is only used with fromElement).
  • Sanitization (Removing dead code, useless variables, optimizing code when possible, adding more consistency to the style).
  • Using ESLint to adopt strong standards and protect commits.
  • Using Webpack to facilitate (and make it more customizable) the build generation.
  • Removal of draw-round-layers (was not doing anything, origamijs is no longer needed).
  • Creating unit tests with the Jest framework (It would be wonderful to increase the coverage rate in the future).
  • Using "standardized-audio-context" to deal with browsers specificities.
  • Important note about fromFile: From my understanding, this helper is supposed to regularly call a callback with a single argument corresponding to a base64 image. If so, according to the code base, it seems this one was not correctly implemented (calling the callback at 'ended' event instead of during playing, I adjusted it correspondingly to make it work based on the understanding explained right before).
  • Adding width & height options for fromFile (which will allow optimizing the rendering and will still default to window size)
  • Adding an existingMediaStreamSource option to fromElement options, so that if the user already has a source it can pass it to be used safely without facing an Error from the browser. Null by default. Useful if, for example, an existing source does exist on the audio element (to apply audio effects for example).
  • Adding a format parameter to be used to render the canvas with fromFile, default to "png"
  • Fixing the multi-types feature (that was not working for me, ctx.clearRect(0, 0, w, h) was being called for each type instead of once before all).
  • Automatically closing the AudioContext for all 3 helpers, except if an existingMediaStreamSource is passed to the fromElement helper; in that case it is not appropriate to close the context, since it was created externally and might still be in use somewhere else.
  • Removing the "react-test" since it will lead to incompatibility issues with babel-jest that we are now using, creating instead 3 examples with vanilla js in examples/ folder. Appropriate npm scripts are available to launch these examples (npm run example-from-file /npm run example-from-stream /npm run example-from-element).
  • The fromElement example shows an advanced example of multiple canvas with multiple effects on each one, the 2 other examples are basic.
  • WaveJS.fromStream(...), WaveJS.fromFile(...) and WaveJS.fromElement(...) will now all return an object containing a deactivate() method. This method will stop all the operations (the example files use it to automatically deactivate the visualizers after an arbitrary time of 10 seconds). If needed, in the future, we will be able to add more methods to this object in order to have more control over the current execution.
  • When used inside a browser, the library is available through the global object WaveJS.
  • When used server-side or with ES6, the library is available using import WaveJS from 'wave.js-typescript'; and hopefully after the integration: import WaveJS from '@foobar404/wave';.

The spirit and the usage of the library have not been altered. There is, though, a breaking change (which led me to set the 2.0.0 version) with the options parameter, since it is now specific to each helper (fromFile, fromStream, fromElement).
These per-helper options must now be passed as a third argument instead of being mixed with the second argument.

This rewrite might also address the following issues :

And it integrates the following PRs :

Please update the dist

The cdn/dist points to what seems to be an outdated version of the library.

Things like fromStream don't work, but do when using the code from src.

Uncaught TypeError: this.current_stream.loop is not a function

I followed the guide on using the CDN version and starting the stream, etc., but I am getting the following error.

wave.js:266 Uncaught TypeError: this.current_stream.loop is not a function
    at Wave.playStream (wave.js:266)
    at playSound (player.js:46)
    at change (player.js:72)
    at HTMLElement.onclick ((index):214)

Visualizing waveform (time domain)

Hi, great work! Is it somehow possible to visualize amplitude over time (and not frequency)? So, the data returned by getByteTimeDomainData of analyzer node. In a limited range of course (several seconds), kind of moving right as the sound plays.
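
Wave.js's built-in animations are frequency-based, but a time-domain trace can be drawn independently with a plain AnalyserNode. Below is a minimal standalone sketch, assuming an <audio id="audio"> element and a separate <canvas id="scope"> (the ids are illustrative). It draws the current analysis window only; a scrolling multi-second history would need extra buffering on top of this.

// Minimal time-domain (waveform) trace using getByteTimeDomainData.
const audio = document.querySelector("#audio") as HTMLAudioElement;
const canvas = document.querySelector("#scope") as HTMLCanvasElement;
const ctx = canvas.getContext("2d")!;

// Create the graph after a user gesture so the AudioContext is allowed to start.
const audioCtx = new AudioContext();
const source = audioCtx.createMediaElementSource(audio);
const analyser = audioCtx.createAnalyser();
analyser.fftSize = 2048;
source.connect(analyser);
analyser.connect(audioCtx.destination); // keep the audio audible

const data = new Uint8Array(analyser.fftSize);
function draw() {
    requestAnimationFrame(draw);
    analyser.getByteTimeDomainData(data);
    ctx.clearRect(0, 0, canvas.width, canvas.height);
    ctx.beginPath();
    for (let i = 0; i < data.length; i++) {
        const x = (i / data.length) * canvas.width;
        const y = (data[i] / 255) * canvas.height;
        if (i === 0) ctx.moveTo(x, y); else ctx.lineTo(x, y);
    }
    ctx.stroke();
}
draw();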

Getting 'infinite loop' with turntable

let i = 0;
for (const f of ['base', "lows", "mids", "highs"]) {
    wave.addAnimation(new wave.animations.Turntable({
        frequencyBand: f as "base" | "lows" | "mids" | "highs", // this can't be typed?
        lineWidth: 4 - i,
        lineColor: colors[i],
        fillColor: colors[(i + 2) % 4],
        cubeHeight: 10 * i,
        count: 5 * (4 - i),
        rounded: true,
        glow: { strength: 15, color: colors[i] },
        mirroredX: false,
        diameter: 5
    } as ITurntableOptions));
    i++;
}

This type of code worked fine for Glob, Arcs, Cube, and Wave, but for Turntable I'm getting an infinite loop at 100% CPU. Any ideas what might trigger that? I'll dig around when I have some free time, but I'm just wondering if you have a suggestion for where to look in the source code.
