svhawks / SceneKitVideoRecorder
Record your SceneKit and ARKit scenes easily.
License: MIT License
@eric-krikey I am opening a new issue as it is not relevant to where you posted.
Initializing the recorder with audio disables haptic feedback on iPhone 7 Plus. Apparently this is normal behavior, since the native camera app does the same thing. Is it possible to set up the audio when the recording starts, instead of in the initializer?
After a new build, the app always crashes the first time we try to record, then never crashes again (until we rebuild in Xcode).
Note that we're using ARKit, Xcode 9, Swift 4, and iOS 11.
In SceneKitVideoRecorder.swift
line 284
self?.writer.startSession(atSourceTime: (self?.getAppendTime())!)
2017-10-10 17:36:02.021044+0200 ARText[1935:518028] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVAssetWriter startSessionAtSourceTime:] invalid parameter not satisfying: CMTIME_IS_NUMERIC(startTime)'
*** First throw call stack:
(0x186297d38 0x1857ac528 0x18baff378 0x1009731c4 0x100970db4 0x1015e549c 0x1015e545c 0x1015f4110 0x1015e89a4 0x1015f5104 0x1015fc100 0x185ec2fe0 0x185ec2c30)
libc++abi.dylib: terminating with uncaught exception of type NSException
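The crash above means startSession(atSourceTime:) received a non-numeric CMTime. A defensive sketch (an assumption about the surrounding code, not the library's actual fix) would validate the timestamp instead of force-unwrapping:

```swift
import AVFoundation

// Hypothetical guard around line 284; getAppendTime() is the library's helper.
if let time = self?.getAppendTime(), time.isNumeric {
  self?.writer.startSession(atSourceTime: time)
}
// Otherwise skip starting the session until a valid timestamp exists,
// rather than passing an invalid CMTime to AVAssetWriter.
```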
When trying to record a scene with https://github.com/ProjectDent/ARKit-CoreLocation/ I get the following error:
validateArgumentsForTextureViewOnDevice:1037: failed assertion `source texture pixelFormat (MTLPixelFormatRGB10A8_2P_XR10_sRGB) not compatible with texture view pixelFormat (MTLPixelFormatBGRA8Unorm).'
I would like to add a scene object on a tap event.
I can see the object appear in the sceneView (after the tap), but it doesn't show up in the recorded output video :(
How can I do that? Thank you.
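For reference, here is a minimal sketch of adding a node on tap, assuming the recorder captures the same sceneView.scene that is on screen (the handler name, geometry, and position are made up):

```swift
import SceneKit
import UIKit

// Hypothetical tap handler: the node must be added to the scene graph the
// recorder renders (sceneView.scene), not to a separate scene instance.
@objc func handleTap(_ gesture: UITapGestureRecognizer) {
  let node = SCNNode(geometry: SCNSphere(radius: 0.05))
  node.position = SCNVector3(0, 0, -0.5)
  sceneView.scene.rootNode.addChildNode(node)
}
```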
Hello, this library works GREAT!!!
When recording, or more importantly when stopping and saving, how do I handle errors if the video URL is somehow messed up?
The current callback in your example Swift file receives a URL:
self.recorder?.finishWriting().onSuccess { [weak self] url in
  self?.checkAuthorizationAndPresentActivityController(toShare: url, using: self!)
}
Is there a tuple available with a URL and an NSError?
self.recorder?.finishWriting().onSuccess { [weak self] (url, error) in
  if let error = error {
    // alert user recording failed
    return
  }
  self?.checkAuthorizationAndPresentActivityController(toShare: url, using: self!)
}
Or is the error handling in an entirely different function, meaning we call one for success and another for failure?
self.sceneKitVideoRecorder?.finishWriting().onFailure(callback: { [weak self] (error) in
  print("Recording Failed")
})
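Assuming the futures-style API shown in the example above, both callbacks can be chained on the same finishWriting() call, so a (url, error) tuple isn't needed:

```swift
// Sketch, assuming finishWriting() returns a future exposing both callbacks
// (the exact labels depend on the futures library the recorder uses).
self.recorder?.finishWriting()
  .onSuccess { [weak self] url in
    guard let self = self else { return }
    self.checkAuthorizationAndPresentActivityController(toShare: url, using: self)
  }
  .onFailure(callback: { error in
    // Alert the user that recording failed.
    print("Recording failed:", error)
  })
```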
Hi guys,
The lib is awesome, but we're seeing a performance issue with frame drops.
Do you think recording into a buffer first and performing the actual write after the recording finishes would solve the issue?
Recorded video is sometimes choppy.
Before submitting an issue please make sure that:
Issue environment:
Issue Details:
Hi, I have a 3D model with animations. When I record the scene with this framework, the model is idle in the movie! Why does this happen, and how can I fix it?
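One possible cause (an assumption, not confirmed from the library's source): an offscreen SCNRenderer does not advance the scene's animation clock on its own, so animations sit at time zero in the capture. Driving sceneTime manually before each captured frame may help:

```swift
import SceneKit
import QuartzCore

// Hypothetical: `renderer` is the offscreen SCNRenderer used for capture.
// Advance the animation clock before rendering each frame.
renderer.sceneTime = CACurrentMediaTime()
```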
In commit e4a8caa you changed the order of startDisplayLink() and startInputPipeline(). That can lead to appending to the pixel buffer on another thread before the session has started, in some cases (depending on thread scheduling).
Is that a bug?
var destinationTexture = currentDrawable.texture.makeTextureView(pixelFormat: .bgra8Unorm)
2017-09-21 22:51:56.034942+0800 SceneKitVideoRecorderDemo[2648:1261128] [DYMTLInitPlatform] platform initialization successful
2017-09-21 22:51:56.156224+0800 SceneKitVideoRecorderDemo[2648:1261064] Metal GPU Frame Capture Enabled
2017-09-21 22:51:56.156763+0800 SceneKitVideoRecorderDemo[2648:1261064] Metal API Validation Enabled
2017-09-21 22:51:56.284797+0800 SceneKitVideoRecorderDemo[2648:1261136] [MC] System group container for systemgroup.com.apple.configurationprofiles path is /private/var/containers/Shared/SystemGroup/systemgroup.com.apple.configurationprofiles
2017-09-21 22:51:56.285126+0800 SceneKitVideoRecorderDemo[2648:1261136] [MC] Reading from public effective user settings.
2017-09-21 22:51:56.321274+0800 SceneKitVideoRecorderDemo[2648:1261064] refreshPreferences: HangTracerEnabled: 0
2017-09-21 22:51:56.321323+0800 SceneKitVideoRecorderDemo[2648:1261064] refreshPreferences: HangTracerDuration: 500
2017-09-21 22:51:56.321356+0800 SceneKitVideoRecorderDemo[2648:1261064] refreshPreferences: ActivationLoggingEnabled: 0 ActivationLoggingTaskedOffByDA:0
2017-09-21 22:51:56.819496+0800 SceneKitVideoRecorderDemo[2648:1261164] [DeviceMotion] Event ref invalid
2017-09-21 22:52:27.431265+0800 SceneKitVideoRecorderDemo[2648:1261134] validateArgumentsForTextureViewOnDevice, line 1037: error 'source texture pixelFormat (MTLPixelFormatRGB10A8_2P_XR10_sRGB) not compatible with texture view pixelFormat (MTLPixelFormatRG11B10Float).'
validateArgumentsForTextureViewOnDevice:1037: failed assertion `source texture pixelFormat (MTLPixelFormatRGB10A8_2P_XR10_sRGB) not compatible with texture view pixelFormat (MTLPixelFormatRG11B10Float).'
(lldb)
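The assertion above fires because the texture view's pixel format is hard-coded while wide-color (XR) devices use a different framebuffer format. One hedged workaround sketch is to derive the view's format from the drawable itself:

```swift
import Metal

// Sketch: match the texture view's format to the source texture rather than
// hard-coding .bgra8Unorm. Any downstream readback code must then handle
// whatever format the drawable actually uses.
let sourceTexture = currentDrawable.texture
let destinationTexture = sourceTexture.makeTextureView(pixelFormat: sourceTexture.pixelFormat)
```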
First, thanks for the great work on this repo.
I want to render AR content and apply some custom image filters (such as a style filter, a color-adjustment filter, etc.). With ARSCNView I have no way to do this.
Now I load the 3D model using SCNScene and can render it with SCNRenderer, but I don't know how to apply the camera transform to the scene.
Thank you very much for any suggestions!
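A rough sketch of mirroring the AR camera onto an offscreen SCNRenderer, assuming you run your own ARSession and render per frame (the function name and viewport handling are made up):

```swift
import ARKit
import SceneKit

// Hypothetical per-frame update: copy the AR camera's pose and projection
// onto the renderer's point of view.
func updateCamera(from frame: ARFrame, renderer: SCNRenderer, viewportSize: CGSize) {
  let cameraNode = renderer.pointOfView ?? SCNNode()
  if cameraNode.camera == nil { cameraNode.camera = SCNCamera() }
  cameraNode.simdTransform = frame.camera.transform
  let projection = frame.camera.projectionMatrix(for: .portrait,
                                                 viewportSize: viewportSize,
                                                 zNear: 0.001, zFar: 1000)
  cameraNode.camera?.projectionTransform = SCNMatrix4(projection)
  renderer.pointOfView = cameraNode
}
```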
Issue environment:
Issue Details:
How can I make an image view that is a subview of the ARSCNView appear in the recorded video? I'm trying to add an image watermark to the saved video.
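Since the recorder draws the SceneKit content rather than the UIKit view hierarchy, a UIImageView subview won't be captured. One hedged approach (a post-processing sketch, not part of the library) is to composite the watermark onto the finished file with AVFoundation:

```swift
import AVFoundation
import UIKit

// Hypothetical post-processing step: build a video composition that overlays
// a watermark image; pass the result to AVAssetExportSession.videoComposition.
func watermarkComposition(for asset: AVAsset, image: UIImage) -> AVMutableVideoComposition? {
  guard let track = asset.tracks(withMediaType: .video).first else { return nil }
  let size = track.naturalSize

  let imageLayer = CALayer()
  imageLayer.contents = image.cgImage
  imageLayer.frame = CGRect(x: 20, y: 20, width: 120, height: 120)

  let videoLayer = CALayer()
  videoLayer.frame = CGRect(origin: .zero, size: size)

  let parentLayer = CALayer()
  parentLayer.frame = videoLayer.frame
  parentLayer.addSublayer(videoLayer)
  parentLayer.addSublayer(imageLayer)

  let composition = AVMutableVideoComposition(propertiesOf: asset)
  composition.animationTool = AVVideoCompositionCoreAnimationTool(
    postProcessingAsVideoLayer: videoLayer, in: parentLayer)
  return composition
}
```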
When I don't call setupMicrophone(), the recorder crashes at line 226 in SceneKitVideoRecorder (captureSession.stopRunning()) with the error 'Unexpectedly found nil while unwrapping an Optional value'.
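A defensive sketch of the failing line, assuming the capture session stays nil when setupMicrophone() was never called:

```swift
// Hypothetical guard around line 226: avoid force-unwrapping the session.
if let session = captureSession, session.isRunning {
  session.stopRunning()
}
```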
When recording an ARSCNView, there are empty (black) frames at the beginning of the video.
Tested on an iPhone 7 Plus.
When will Swift 4.0 be supported?
Running with ARKit and rendering a ARSCNView
Issue Details:
It seems the physics simulation gets run twice, once in the actual sceneView rendering and once in the SCNRenderer when recording. This results in a speedup of any physics-driven motion.
A temporary and ugly workaround is to set the physics simulation speed to half before recording and restore it after recording is finished (sceneView.scene.physicsWorld.speed = 0.5).
Might be somewhat related to #28
Thanks for a very useful lib!
The code seems to work without this line.
Before submitting an issue please make sure that:
Issue environment:
Issue Details:
The last commit was over 4 months ago, and there haven't been any updates since.
How can I check if the recorder is recording? For eg.
if recorder?.isWriting != true {
  _ = recorder?.startWriting()
} else {
  recorder?.finishWriting().onSuccess { [weak self] url in
    print("recording success", url)
  }.onFailure(callback: { (error) in
    print("recording failed", error)
  })
}
I can't make the new version work. I'm probably doing something wrong.
Like the demo project, I'm calling:
recorder = try! SceneKitVideoRecorder(withARSCNView: sceneView)
in viewDidLoad(),
recorder.setupMicrophone()
in viewWillAppear, and
recorder.startWriting()
when pushing the record button. The startDisplayLink() method is never called, so when finishWriting is called, isRecording is false and it returns without actually stopping the recording.
Then when I push the record button again I get: Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: '*** -[AVAssetWriter addInput:] Cannot call method when status is 3'.
Issue environment:
Issue Details:
The audio is not being recorded with the new recorder?.setupAudio() method. I have tried adding it at the beginning of the controller and also in the record-button-tapped method. Still no audio. The microphone permission was requested the first time and allowed.
I don't think you guys realize how much trouble this library has saved me. Could you add a donate button to the page or how best to show appreciation?
Thanks.
Issue environment:
swift4 branch
Issue Details:
Recordings are shorter than they should be when recorded on an iPhone X.
Here is a screenshot from a recording of a regular SCNSphere, so it should be perfectly round.
My hope is that I can understand this library and send a PR.
Before submitting an issue please make sure that:
Issue environment:
Issue Details:
After calling startWriting and finishWriting multiple times in a row, the recorder stops working. A file is created, but it is empty. This happens with the demo project as well as my own.
In the demo project I'm getting the error: Video /private/var/mobile/Containers/Data/Application/09B7703C-A173-4EB5-BC8E-F482C2ACDAD8/tmp/output.mp4 cannot be saved to the saved photos album: Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSUnderlyingError=0x1c0445310 {Error Domain=NSOSStatusErrorDomain Code=-12893 "(null)"
Issue environment:
👋
We got several crashes (3 today) in Fabric related to:
SceneKitVideoRecorder.swift line 285
specialized SceneKitVideoRecorder.(renderSnapshot() -> ()).(closure #1)
The issue might be in the Apple stack, but I wanted to share it with you just in case. Feel free to close 😅
Commit 71e239324b2a0366ab3a499e076f70ced50d7a70
Any thoughts on this?
Hello,
How do you increase the audio sample rate? I see the default is 12000.
Per
var assetWriterAudioInputSettings: [String : Any] {
  return [
    AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
    AVSampleRateKey: 12000,
    AVNumberOfChannelsKey: 1,
    AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
  ]
}
Thanks,
Marc
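Assuming you build from a fork where these settings are editable, raising AVSampleRateKey to a standard AAC rate such as 44100 should be enough:

```swift
import AVFoundation

// Sketch: the same settings dictionary with the sample rate raised from 12000.
var assetWriterAudioInputSettings: [String : Any] {
  return [
    AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
    AVSampleRateKey: 44100,  // standard AAC sample rate
    AVNumberOfChannelsKey: 1,
    AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
  ]
}
```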
The app crashes when recording audio.
let aAudioAssetTrack : AVAssetTrack = aAudioAsset.tracks(withMediaType: AVMediaTypeAudio)[0]
No ready for media data
2017-10-25 10:59:52.949599+0530 CookbookARTK[3121:1288667] *** Terminating app due to uncaught exception 'NSRangeException', reason: '*** -[__NSArray0 objectAtIndex:]: index 0 beyond bounds for empty NSArray'
*** First throw call stack:
(0x180f6dd04 0x1801bc528 0x180ec90c4 0x10606d934 0x106082f98 0x105b2f418 0x105b2cbb4 0x105b2b6b4 0x181937b70 0x106c6149c 0x106c6145c 0x106c6d56c 0x106c72b54 0x106c72880 0x180b97120 0x180b96c20)
libc++abi.dylib: terminating with uncaught exception of type NSException
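The NSRangeException above comes from indexing [0] on an empty track array. A defensive sketch (an assumption about the surrounding merge code, not the library's actual fix) would bail out instead:

```swift
// Hypothetical guard: don't index into the track array blindly.
guard let aAudioAssetTrack = aAudioAsset.tracks(withMediaType: .audio).first else {
  // No audio track was captured; skip the audio merge.
  return
}
```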
Issue environment:
Issue Details:
Issue environment:
Issue Details:
Sometimes the recorder crashes at line 343:
self?.pixelBufferAdaptor.append(pixelBuffer, withPresentationTime: appendTime)
Error message:
'NSInternalInconsistencyException', reason: '*** -[AVAssetWriterInputPixelBufferAdaptor appendPixelBuffer:withPresentationTime:] A pixel buffer cannot be appended when readyForMoreMediaData is NO.'
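A defensive sketch of the failing append, dropping the frame when the input isn't ready (an assumption about acceptable behavior, not the library's actual fix):

```swift
// Hypothetical guard around line 343.
if self?.pixelBufferAdaptor.assetWriterInput.isReadyForMoreMediaData == true {
  self?.pixelBufferAdaptor.append(pixelBuffer, withPresentationTime: appendTime)
}
```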
It seems that video capture is not working in release mode. Any ideas on what's causing it?
Issue environment:
Device Model: iPhone 8
Xcode Version: 9.3.1
iOS Version: 11.3.1
Pod Version or Repo Commit: 1.5.1
Issue Details:
Shadows from scene lighting are not rendered/captured when the pod is installed. Both images below show an SCNBox with autoenablesDefaultLighting enabled. The bottom image is the SCNBox with no shadows after the pod is installed.
Runs well on iPhone 6S with iOS 11 beta.
Issue environment:
Issue Details:
The video is corrupted; please have a look at the photo. The incorrect-size bug can also be seen on the iPhone 6S Plus (without the color bug), so I think this is a problem on all 5.5-inch iPhones.
Hi,
The video recording works fine, but all the videos have a single transparent frame. How can I avoid that, or remove it, so that thumbnails show the correct image everywhere?
Issue environment:
Issue Details:
Before submitting an issue please make sure that:
Issue environment:
Issue Details:
I want to record a screen video the way the Animoji app does, in my iMessage extension app, which tracks face expressions using SCNView. Everything works, but the recording is not shown after it completes; it's just showing the back video.