audiokit / audiokit
Audio synthesis, processing, & analysis platform for iOS, macOS, and tvOS
Home Page: http://audiokit.io
License: MIT License
No idea what the issue is, but I spent hours trying to get AudioKit to compile following your tutorial video. I kept getting the following errors:
dyld: Library not loaded: libsndfile
dyld: Library not loaded: CsoundLib
If you follow the tutorial video exactly but your project name/dir has a space in it, you will get these errors.
The only thing that fixed it was creating a project name with no space in it. You might want to include this in the readme.
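The root cause isn't confirmed here, but an unquoted path in a copy-frameworks script is the classic way a space in the project directory breaks a build. A minimal illustration (the path is hypothetical):

```shell
# A framework path inside a project directory named with a space
SRC="/tmp/My Synth/CsoundLib.framework"

# Unquoted, the shell word-splits the path into two arguments
set -- $SRC
unquoted_args=$#   # 2 arguments: "/tmp/My" and "Synth/CsoundLib.framework"

# Quoted, the path stays a single argument, as a copy script must pass it
set -- "$SRC"
quoted_args=$#     # 1 argument
```

Renaming the project directory sidesteps the problem, as reported; quoting every path variable in custom build phases fixes it for good.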
Hi Aure,
The following code gives a nice and pleasing sound, only not the sine wave I expected when setting a waveform:
// Instrument Properties
_amplitude = [self createPropertyWithValue:0.5 minimum:0.0 maximum:1.0];
_adsr = [[AKLinearADSREnvelope alloc] initWithAttackDuration:akp(0.1)
                                               decayDuration:akp(0.1)
                                                sustainLevel:akp(0.5)
                                             releaseDuration:akp(0.1)
                                                       delay:akp(0)];
// Instrument Definition
self.oscillator = [AKFMOscillator oscillator];
self.oscillator.baseFrequency = note.frequency;
self.oscillator.amplitude = [_adsr scaledBy:_amplitude];
AKTable *sineTable = [AKTable standardSineWave];
[sineTable scaleBy:0.7];
self.oscillator.waveform = sineTable;
[self setAudioOutput:[self.oscillator scaledBy:_amplitude]];
If I remove the _adsr part, it produces a nice sine. If I then replace the sine with a square, I get a square sound. If I then put in the _adsr part, I get the original sound, not a square sound.
I've made a diff to the ContinuousControl example that demonstrates this: https://www.dropbox.com/s/qzvfzvia19yq4th/AKFMOscillator_issue.diff?dl=0
How can I keep using the AKFMOscillator with custom AKTables, yet still have an ADSR instead of a fixed amplitude?
Cheers
Nik
Even though AudioKit is up to version 2.1, we have not updated CocoaPods to reflect this version because we have not verified that our podspec is sufficient to create a working project.
I've filed this issue so we have a discussion area, but the best write-up of the details is at http://audiokit.io/pod/
I dragged the AudioKit folder into my project, then built and ran. The app crashes at launch. The error is as follows:
dyld: Library not loaded: @rpath/CsoundLib.framework/CsoundLib
Referenced from: /Users/xxx/Library/Developer/CoreSimulator/Devices/51776503-6943-4AC0-903F-FE08A6AFB058/data/Containers/Bundle/Application/FBB6B689-8CE9-406D-B768-DEC2FD580054/audioSame.app/audioSame
Reason: image not found
I can't compile with either the static library or the dynamic library... the error:
Traceback (most recent call last):
File "/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang", line 63, in <module>
output, err = process.communicate()
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/subprocess.py", line 798, in communicate
return self._communicate(input)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/subprocess.py", line 1402, in _communicate
stdout, stderr = self._communicate_with_select(input)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/subprocess.py", line 1503, in _communicate_with_select
rlist, wlist, xlist = select.select(read_set, write_set, [])
KeyboardInterrupt
Command /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang failed with exit code 1
I am using Xcode 6.4 (6E35B), have set the -ObjC flag, and have /AudioKit/AudioKit in User Header Search Paths.
Hi,
Do you have an AKFFT example with microphone input?
How can I get the FFT data as an array?
The Harmonizer Example is not working.
thanks
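On the FFT question: independent of AKFFT's actual API (not verified here), getting spectrum magnitudes as a plain array can be sketched with a naive DFT in plain Swift (function names are illustrative):

```swift
import Foundation

// Naive DFT magnitude spectrum; O(n^2), fine for illustration.
func magnitudeSpectrum(_ samples: [Double]) -> [Double] {
    let n = samples.count
    return (0..<n / 2).map { k in
        var re = 0.0, im = 0.0
        for (i, x) in samples.enumerated() {
            let phase = -2.0 * Double.pi * Double(k) * Double(i) / Double(n)
            re += x * cos(phase)
            im += x * sin(phase)
        }
        return sqrt(re * re + im * im)
    }
}

// A 64-sample sine completing exactly 4 cycles peaks in bin 4.
let n = 64
let signal = (0..<n).map { sin(2.0 * Double.pi * 4.0 * Double($0) / Double(n)) }
let spectrum = magnitudeSpectrum(signal)
```

A real app would use a fast transform (e.g. Accelerate), but the array-of-magnitudes shape of the result is the same.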
The following HelloWorld app currently has a graphing feature implemented:
It would be nice to have graphing implemented in the rest of the HelloWorld examples as well:
Things I'd like if you're adding tests for this project:
Tooling: XCTest, TravisCI
Areas to look at:
Notes:
I've installed your pod and added this to my bridging header in my Swift iOS project:
#import <AudioKit/AKFoundation.h>
When trying to use your classes:
let instrument = AKInstrument()
Xcode can't see them. I do NOT want to add a separate subproject just to use this in my project. I can't even see your AudioKit.swift file under the pod in Xcode.
AudioKit has the ability to play / loop / "oscillate" any table, and it would be a nice example to play a sine wave and then be able to use a touch gesture / mouse to draw a new waveform, set its limits (maximum / minimum / number of points), and then restart the sound to hear what the new waveform sounds like. This is mostly a UI challenge, not so much an AudioKit challenge, so I'm opening the idea up for anyone to work on or collaborate with us on.
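The drawing side aside, the resampling step — turning an ordered set of hand-drawn points into a fixed-size table — is a few lines of linear interpolation. A sketch (the function name is illustrative, not AudioKit API):

```swift
// Resample an ordered set of hand-drawn (x, y) points, x in 0...1,
// into a fixed-size wavetable by linear interpolation.
func wavetable(from points: [(x: Double, y: Double)], size: Int) -> [Double] {
    let pts = points.sorted { $0.x < $1.x }
    return (0..<size).map { i in
        let x = Double(i) / Double(size - 1)
        // Find the first drawn point at or past x, then blend with its neighbor.
        guard let upper = pts.firstIndex(where: { $0.x >= x }) else { return pts.last!.y }
        if upper == 0 { return pts[0].y }
        let a = pts[upper - 1], b = pts[upper]
        let t = (x - a.x) / (b.x - a.x)
        return a.y + t * (b.y - a.y)
    }
}

// A triangle drawn with three points becomes a symmetric ramp.
let table = wavetable(from: [(0.0, 0.0), (0.5, 1.0), (1.0, 0.0)], size: 5)
```

The resulting array is exactly the shape a table-lookup oscillator wants to loop over.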
How can I implement in Swift something similar to this project?
We really like the dual functionality of this cocoa object:
https://github.com/larsacus/LARSBar
Adding it just means copying their source and tying it to output amplitude. Then it would be good to provide a usage example.
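Tying a LARSBar-style level strip to output amplitude mostly comes down to mapping linear amplitude onto a dB scale. A hedged sketch (the -60 dB floor and function name are arbitrary illustrative choices):

```swift
import Foundation

// Map a linear output amplitude (0...1) onto a dB-scaled meter
// position (0...1), the way an EQ-style level strip expects.
func meterLevel(amplitude: Double, floorDb: Double = -60.0) -> Double {
    guard amplitude > 0 else { return 0 }
    let db = 20.0 * log10(amplitude)       // full scale = 0 dB
    return max(0, min(1, 1 - db / floorDb))
}
```

Linear amplitude looks wrong on a meter because perceived loudness is roughly logarithmic; the dB mapping keeps quiet material visible.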
I'm trying to process input from the microphone, perform some effects, and play the effects back in real time. This is the current state of my project
class ViewController: UIViewController {
    var microphone = Microphone()
    var stereoAmplifier: StereoAmplifier?
    var stereoAudio: AKStereoAudio = AKStereoAudio.globalParameter()

    required init(coder aDecoder: NSCoder) {
        // Create Stereo Audio
        stereoAudio.rightOutput = microphone.auxilliaryOutput
        stereoAudio.leftOutput = microphone.auxilliaryOutput
        stereoAmplifier = StereoAmplifier(audioSource: stereoAudio)
        super.init(coder: aDecoder)
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        if let stereoAmp = stereoAmplifier {
            AKOrchestra.addInstrument(microphone)
            AKOrchestra.addInstrument(stereoAmp)
            microphone.start()
            stereoAmp.start()
        }
    }
}
I'm new to this library and I can't find any examples or figure out how to add effects to the microphone Instrument. I tried adding a high and low pass filter, but the output was never modified.
That's all :)
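On the filter side of the question, independent of how effects get wired to the microphone instrument in AudioKit, a one-pole low-pass over raw samples is just an exponential moving average, which is useful as a sanity check:

```swift
// One-pole low-pass filter over a buffer of samples:
// an exponential moving average toward each incoming sample.
func lowPass(_ samples: [Float], alpha: Float) -> [Float] {
    var out: [Float] = []
    out.reserveCapacity(samples.count)
    var y: Float = 0
    for x in samples {
        y += alpha * (x - y)   // y leans toward x by a factor of alpha
        out.append(y)
    }
    return out
}

// alpha = 1 passes the signal through untouched; smaller values smooth it.
let smoothed = lowPass([1, 1, 1, 1], alpha: 0.5)
```

If a chain like this audibly changes a test buffer but the AudioKit graph doesn't, the problem is in how the nodes are connected rather than in the filter itself.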
We are having a periodic crash in the AudioKit IOThread when it is trying to update csound.
The crash is in Csound_Render calling csoundPerformKsmps. I'm not really sure what all of this is doing... we are just playing sounds periodically using a Conductor/Instrument pattern. Maybe this happens when something gets dealloc'd? Here is a screenshot of the crash; I'll try to get a proper stack/code block next time it crashes.
AudioKit Version 2.0
iOS 8
let audio = AKAudioOutput(audioSource: osc)
self.connect(audio)
As soon as I connect the AKAudioOutput to a subclass of AKInstrument, a prompt asking to use the Microphone is issued. Since I am not trying to do anything at all with input, this seems like a bug.
Is there any workaround to this?
For a Swift OSX project that installs AudioKit via CocoaPods, what additional steps are needed to run? I've tried it both with all the steps in INSTALL-OSX.md and with none of them, and I always get:
dyld: Library not loaded: CsoundLib
Referenced from: /Users/xyz/Library/Developer/Xcode/DerivedData/story-ftxkjqthkgxxqbhkgiizafqfffso/Build/Products/Debug/story
Reason: image not found
I noticed that this: http://audiokit.io/downloads/ says "If you are creating an iOS project using Objective-C, you can use CocoaPods to install AudioKit." Does that mean Swift+OSX projects can't use CocoaPods for AudioKit?
FWIW, I'm using Xcode version 7b5, pod version 0.37.2.
Hi all,
I just started working on a project using the 3D audio capabilities of AudioKit. I implemented the 3D binaural example, but I can't seem to get any event that tells me whether a sound has finished playing. What is the best way to do that?
I am looking for something similar to the AVAudioPlayer that has a - (void)audioPlayerDidFinishPlaying.
Thank you for your help
I'm having issues with AudioKit not playing any music in the simulator; are there any tricks to this?
Hi! I'm getting this error when trying to build the latest develop with Xcode 7 beta 3. I think the problem is in the podspec.
AudioKit/Templates/README should point to the right directory to install for Xcode 4+:
~/Library/Developer/Xcode/Templates/Application/Project\ Templates
For first-time users, this directory has to be created first:
cd ~/Library/Developer/Xcode/
mkdir -p Templates/Application/Project\ Templates
Hi, to include AudioKit from CocoaPods, I had to add the dependency pod 'TPCircularBuffer' to avoid linker errors for missing functions in EZAudio.o. Perhaps this dependency should be added to the CocoaPods podspec?
I want to synthesize a sine wave with a given frequency and duration; how do I do that?
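Whatever AudioKit operation ends up driving the output, the underlying math for a sine of a given frequency and duration is worth having in hand. A plain-Swift sketch (the function name is illustrative):

```swift
import Foundation

// Render a sine wave at a given frequency and duration into a sample buffer.
func sineWave(frequency: Double, duration: Double, sampleRate: Double = 44_100) -> [Double] {
    let count = Int(duration * sampleRate)
    return (0..<count).map { sin(2.0 * Double.pi * frequency * Double($0) / sampleRate) }
}

// One second of A440 at 44.1 kHz:
let tone = sineWave(frequency: 440, duration: 1.0)
```

Changing frequency or duration per note is then just a matter of calling this with different arguments before handing the buffer to the audio engine.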
Thanks for the ultra-fast responses to other issues, and the meaningful actions. I hope to get at least one example working on OS X; the iOS examples are fine (and great).
Here is the new issue log:
Ld /Users/cerkut/Library/Developer/Xcode/DerivedData/Sequences-ailklqydfjokpuhbecfmeqyjnaju/Build/Products/Debug/Sequences.app/Contents/MacOS/Sequences normal x86_64
cd /Users/cerkut/Developer/audiokit/AudioKit/Examples/OSX/Sequences
export MACOSX_DEPLOYMENT_TARGET=10.9
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang -arch x86_64 -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.10.sdk -L/Users/cerkut/Library/Developer/Xcode/DerivedData/Sequences-ailklqydfjokpuhbecfmeqyjnaju/Build/Products/Debug -F/Users/cerkut/Library/Developer/Xcode/DerivedData/Sequences-ailklqydfjokpuhbecfmeqyjnaju/Build/Products/Debug -F/Library/Frameworks -F/Users/cerkut/Developer/audiokit/AudioKit/Examples/OSX/Sequences/../../../AudioKit/Libraries/OSX/csound-OSX -filelist /Users/cerkut/Library/Developer/Xcode/DerivedData/Sequences-ailklqydfjokpuhbecfmeqyjnaju/Build/Intermediates/Sequences.build/Debug/Sequences.build/Objects-normal/x86_64/Sequences.LinkFileList -mmacosx-version-min=10.9 -fobjc-arc -fobjc-link-runtime -framework CsoundLib64 -Xlinker -dependency_info -Xlinker /Users/cerkut/Library/Developer/Xcode/DerivedData/Sequences-ailklqydfjokpuhbecfmeqyjnaju/Build/Intermediates/Sequences.build/Debug/Sequences.build/Objects-normal/x86_64/Sequences_dependency_info.dat -o /Users/cerkut/Library/Developer/Xcode/DerivedData/Sequences-ailklqydfjokpuhbecfmeqyjnaju/Build/Products/Debug/Sequences.app/Contents/MacOS/Sequences
Undefined symbols for architecture x86_64:
"_csoundCompileOrc", referenced from:
-[CsoundObj updateOrchestra:] in CsoundObj.o
"_csoundReadScore", referenced from:
-[CsoundObj sendScore:] in CsoundObj.o
ld: symbol(s) not found for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
Is AKSequence supposed to be sample-accurate while the device is locked or is it intended for events that don't need to be precise? I tried making a small metronome, but it seems to go out of sync on sleep, and even before. I notice it's using performSelector:withObject:afterDelay:inModes: which I have heard is inaccurate when it comes to audio timing.
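Whatever AKSequence does internally, the difference between relative and absolute scheduling is easy to demonstrate with plain arithmetic: each relative delay inherits the previous callback's lateness, while absolute times never accumulate it. A simulated 100 ms metronome whose callbacks each fire 2 ms late:

```swift
// Simulate a 100 ms metronome where every callback fires 2 ms late.
let interval = 0.100
let lateness = 0.002

// Relative: each event is scheduled "interval after the last firing",
// so the lateness compounds with every tick.
var relativeTime = 0.0
for _ in 0..<10 { relativeTime += interval + lateness }

// Absolute: each event is scheduled against the start time,
// so only the current tick is ever late.
let absoluteTime = 10.0 * interval + lateness
```

This is why performSelector:withObject:afterDelay:-style timing drifts audibly: it is relative scheduling on a runloop, with runloop jitter as the lateness.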
I'm working on a synth app that includes a bank of 8 FM oscillators. Whenever I activate more than 2 of them I start getting digital noise. Initially the effect was only in Simulator and not on device, although now it occasionally happens on the device as well. The noise is much more pronounced on Simulator. It doesn't appear to be destructive interference, as it happens even when all 8 oscillators are set to the same frequency and mod index.
link to sound sample (recorded from simulator using SoundFlower): https://soundcloud.com/ars-one/aksample-distortion
link to repo for project: https://github.com/arsone/swarmatron
Not sure if it's a memory management issue, or an issue with handling multiple simultaneous audio threads, or if I just screwed something up in my implementation, but any guidance you guys could provide would be much appreciated.
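One possible cause — an assumption, not a confirmed diagnosis — is headroom: eight unit-amplitude oscillators summed can peak at eight times full scale, and everything past 1.0 clips into exactly this kind of digital noise. A quick numeric check:

```swift
import Foundation

// Summing 8 unit-amplitude oscillators at the same frequency and phase
// peaks at 8x full scale; anything past 1.0 clips on output.
let oscillatorCount = 8
let mix = (0..<64).map { i -> Double in
    let phase = 2.0 * Double.pi * Double(i) / 64.0
    return (0..<oscillatorCount).reduce(0.0) { sum, _ in sum + sin(phase) }
}
let peak = mix.map { abs($0) }.max()!       // 8x full scale

// Scaling each voice by 1/voiceCount restores headroom.
let normalized = mix.map { $0 / Double(oscillatorCount) }
let normalizedPeak = normalized.map { abs($0) }.max()!
```

If scaling each oscillator's amplitude by 1/8 makes the noise vanish, it was clipping rather than a threading or memory bug.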
I followed the installation instructions in "AudioKit 2.1 Configuration for OSX Applications" exactly, but I get the error "Framework not found CsoundLib".
My Xcode is 7.0.1 (currently the latest version).
Hi,
I have a problem setting up the dynamic framework with Xcode 7 and iOS 9. I followed your video step by step, but when I build I get this error:
ld: framework not found CsoundLib
clang: error: linker command failed with exit code 1 (use -v to see invocation)
PS: The setup with the static library works fine, but I need the dynamic framework for a commercial app.
Thanks for the help
It would be great to have project templates as well as individual file templates available.
In addition, it would be nice to have prebuilt storyboard templates for this, like:
Experiencing a lot of noise in the audio on the iOS 8 simulator, Xcode 6.1 (6A1052d), running Yosemite 10.10.
It almost sounds like a bit-depth or sample-rate issue.
Sound is normal on device.
I tried all 4 OS X example apps. When I try to run them they all crash at the following line
int ret = csoundCompile(cs, 5, argv);
It's inside the - (void)runCsound:(NSString *)csdFilePath method. The debugger reports EXC_BAD_ACCESS. When I hover over cs and argv, they are both non-nil, so I'm not sure why it's crashing.
I'm running Xcode 6.1.1 on Yosemite 10.10.1.
Up until now I've been playing with iOS example code only. I'll try to create an OS X app following the tutorial on AudioKit.io page to see if I can find out why it's crashing.
I want to learn a bit about audio development; can you help me?
It would be great to have AKAudioOutputSpectrogramPlot and AKAudioInputSpectrogramPlot in our arsenal.
Hi,
thanks for the csound porting.
I would like to know if anybody has an app in the App Store with AudioKit.
The GPL license is incompatible with the terms of service and I am a bit worried.
Thanks again
elio
It would be nice if AudioKit.xcodeproj built AudioKit.framework. This would make initial setup easier, removing the need for the explicit user header search paths setting, for example. It would also provide compatibility with Carthage.
Some UI elements don't exist on tvOS, e.g. UISlider.
Also, CoreMIDI isn't available on tvOS.
Hi, everyone.
I am writing an app and want to get the audio's frequency. Reading AudioKit's docs, I found the AKTrackedFrequency class, which seems to meet my purpose, but I don't know how to get the frequency value from an instance of it: the AKTrackedFrequency documentation doesn't show any property or method for reading the value.
So I want to know how to get the audio's frequency. Is there another way to get the value, or am I using AKTrackedFrequency the wrong way?
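I can't confirm AKTrackedFrequency's property names here, but as a fallback that works on raw samples, counting positive-going zero crossings gives a rough frequency estimate:

```swift
import Foundation

// Rough pitch estimate from raw samples: count positive-going
// zero crossings and divide by the buffer's duration.
func estimatedFrequency(_ samples: [Double], sampleRate: Double) -> Double {
    var crossings = 0
    for i in 1..<samples.count where samples[i - 1] < 0 && samples[i] >= 0 {
        crossings += 1
    }
    return Double(crossings) * sampleRate / Double(samples.count)
}

// A synthetic one-second 440 Hz sine comes back within a few Hz.
let sampleRate = 44_100.0
let buffer = (0..<44_100).map { sin(2.0 * Double.pi * 440.0 * Double($0) / sampleRate) }
let hz = estimatedFrequency(buffer, sampleRate: sampleRate)
```

Zero crossings are only reliable for clean, mostly monophonic signals; a tracker like AKTrackedFrequency exists precisely because real material needs something more robust.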
Can AudioKit play the OGG file type? Sorry, I cannot find anywhere else to submit questions.
Thank you
While trying to put all of the AudioKit classes together in a static library, I noticed some apparent code duplication with AudioKitDemo, which becomes a problem.
Specifically, the demo app has its own Tambourine instrument which differs slightly from the one in the AudioKit library itself. I think it would make sense to get rid of the definition inside the demo and just reuse the one from the library, or alternatively to rename this instrument to something that doesn't conflict, if it is supposed to be a useful example. That would also solve the linking issues I'm having, since we can't selectively choose which files are included in the static library.
Hi Aurelius,
has the updated podspec been sent to CocoaPods yet? When I search, I find version 2.0.1 but not 2.1.
So to test it out, I pointed my podfile directly:
pod 'AudioKit', :git => 'https://github.com/AudioKit/AudioKit.git'
Having done this, I get "csound.h" not found on line 32 in CsoundObj.h
Any idea what this is about?
Cheers
Nik
We have full MIDI support, but no OSC support yet. There seem to be a few options listed on Charles Martin's blog:
http://charlesmartin.com.au/blog/2013/3/26/finding-a-good-osc-library-for-ios
but so far it looks like his Metatone might be best
https://github.com/cpmpercussion/MetatoneOSC
and it comes with this example
https://github.com/cpmpercussion/ExampleOSC
If you are an OSC wiz and want to knock this out, that would be great; otherwise, you can just comment that you would also like to have this feature, which will motivate us to prioritize it higher.
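For anyone sizing up the work: the wire format itself is small. A minimal OSC 1.0 message encoder — address and type-tag strings null-terminated and padded to 4-byte boundaries, arguments big-endian — fits in a few lines. This is a sketch of the spec, not of any library listed above:

```swift
// Minimal OSC 1.0 message encoder for a single float argument.
// Address and type-tag strings are null-terminated and padded to
// 4-byte boundaries; the float is big-endian IEEE 754.
func oscMessage(address: String, float value: Float) -> [UInt8] {
    func padded(_ s: String) -> [UInt8] {
        var bytes = Array(s.utf8) + [0]
        while bytes.count % 4 != 0 { bytes.append(0) }
        return bytes
    }
    var out = padded(address) + padded(",f")
    let bits = value.bitPattern.bigEndian
    withUnsafeBytes(of: bits) { out.append(contentsOf: $0) }
    return out
}

// "/freq" pads to 8 bytes, ",f" to 4, plus a 4-byte float = 16 total.
let message = oscMessage(address: "/freq", float: 440.0)
```

A full implementation also needs bundles, timetags, and the other argument types, which is where the libraries above earn their keep.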
dyld: Library not loaded: @loader_path/libFLAC.8.2.0.dylib
Referenced from: /Users/thisUser/Library/Developer/Xcode/DerivedData/ContinuousControl-chlbcchuixwzgbfvniyapyuzytat/Build/Products/Debug/ContinuousControl.app/Contents/Frameworks/CsoundLib64.framework/libs/libsndfile.1.dylib
Reason: image not found
Checking for a fix, will issue a Pull request if I manage.
There is nothing mentioned about this in the documentation, but I get the following linker errors if I don't manually include
pod 'TPCircularBuffer'
in the Podfile. Is this expected? I was going to just ignore it, but adding use_frameworks! to the Podfile has made the issue come back despite the inclusion of the TPCircularBuffer library.
This is on iOS 8.x using AudioKit 2.0.
Undefined symbols for architecture i386:
"_TPCircularBufferCleanup", referenced from:
+[EZAudio freeCircularBuffer:] in EZAudio.o
"_TPCircularBufferClear", referenced from:
+[EZAudio freeCircularBuffer:] in EZAudio.o
"_TPCircularBufferInit", referenced from:
+[EZAudio circularBuffer:withSize:] in EZAudio.o
ld: symbol(s) not found for architecture i386
clang: error: linker command failed with exit code 1 (use -v to see invocation)
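Until the podspec declares the dependency itself, listing it explicitly in the Podfile is the workaround. A minimal sketch (the target name and version constraint are illustrative):

```ruby
platform :ios, '8.0'

target 'MyApp' do          # hypothetical target name
  pod 'AudioKit', '~> 2.0'
  pod 'TPCircularBuffer'   # explicit workaround until the podspec declares it
end
```

With the dependency present, the _TPCircularBuffer* symbols EZAudio.o references resolve at link time.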
I'm trying to use a group of AKVibratos to build a chorus effect, but when I put one out of phase I get an EXC_BAD_ACCESS error inside Csound_render.
[self setVibrato1:[[AKVibrato alloc] initWithShape:[AKTable standardSineWave]
                                  averageFrequency:self.rate
                               frequencyRandomness:akp(0.0)
                        minimumFrequencyRandomness:akp(0.0)
                        maximumFrequencyRandomness:akp(0.0)
                                  averageAmplitude:[self.depth scaledBy:akp(0.125)]
                                amplitudeDeviation:akp(0.0)
                        minimumAmplitudeRandomness:akp(0.0)
                        maximumAmplitudeRandomness:akp(0.0)
                                             phase:akp(0.25)]];