afathi / arvideokit
Capture & record ARKit videos, photos, Live Photos, and GIFs.
License: Apache License 2.0
Hi, when I play audio via runAction, the sound is not played, and the app crashes while recording. If I remove the recorder, the sound works.
I tried multiple ways of playing the sound before discovering that the recorder was causing the issue.
SceneKit`__UpdateAudioTransform:
0x1990e776c <+0>: stp x22, x21, [sp, #-0x30]!
0x1990e7770 <+4>: stp x20, x19, [sp, #0x10]
0x1990e7774 <+8>: stp x29, x30, [sp, #0x20]
0x1990e7778 <+12>: add x29, sp, #0x20 ; =0x20
0x1990e777c <+16>: mov x20, x0
-> 0x1990e7780 <+20>: ldrb w8, [x20, #0xde]
0x1990e7784 <+24>: tbz w8, #0x4, 0x1990e77e4 ; <+120>
0x1990e7788 <+28>: adrp x8, 101960
0x1990e778c <+32>: add x8, x8, #0x7c8 ; =0x7c8
0x1990e7790 <+36>: ldr x1, [x8]
0x1990e7794 <+40>: mov x0, x20
0x1990e7798 <+44>: bl 0x1991f24cc ; C3DEntityGetAttribute
0x1990e779c <+48>: mov x19, x0
0x1990e77a0 <+52>: mov x0, x20
0x1990e77a4 <+56>: bl 0x1990e46c0 ; C3DNodeGetWorldMatrix
0x1990e77a8 <+60>: mov x20, x0
0x1990e77ac <+64>: mov x0, x19
0x1990e77b0 <+68>: bl 0x1940cd95c
0x1990e77b4 <+72>: mov x21, x0
0x1990e77b8 <+76>: cmp x21, #0x1 ; =0x1
0x1990e77bc <+80>: b.lt 0x1990e77e4 ; <+120>
0x1990e77c0 <+84>: mov x22, #0x0
0x1990e77c4 <+88>: mov x0, x19
0x1990e77c8 <+92>: mov x1, x22
0x1990e77cc <+96>: bl 0x1940cd960
0x1990e77d0 <+100>: mov x1, x20
0x1990e77d4 <+104>: bl 0x199223aac ; C3DAudioPlayerSetTransform
0x1990e77d8 <+108>: add x22, x22, #0x1 ; =0x1
0x1990e77dc <+112>: cmp x21, x22
0x1990e77e0 <+116>: b.ne 0x1990e77c4 ; <+88>
0x1990e77e4 <+120>: ldp x29, x30, [sp, #0x20]
0x1990e77e8 <+124>: ldp x20, x19, [sp, #0x10]
0x1990e77ec <+128>: ldp x22, x21, [sp], #0x30
0x1990e77f0 <+132>: ret
I got this error while compiling with the provided ARVideoKit.framework.
Can anyone help?
I want to disable audio for AR in my project. (Apple declined the app because I don't use the microphone at all, yet the app still requires the permission.)
Setting requestMicPermission or enableAudio after RecordAR initialization is too late: setup() has already run in viewDidLoad(), and it requests mic permission even when I don't want to use it.
We could add a read-only configuration object and pass it into the RecordAR object at initialization, e.g. @objc override public init?(ARSceneKit: ARSCNView, configuration: RecordARConfiguration). setup() would then respect the configuration object instead of falling back to default values on every initialization.
What do you think? I wanted to make it a struct with let properties, but that would break @objc compatibility.
struct RecordARConfiguration {
let enableAudio: Bool = true
}
For now, as a workaround, I opened RecordAR for subclassing and overrode the enableAudio and requestMicPermission properties, then used my subclass as the RecordAR view.
class DefaultRecordAR: RecordAR {
private var isEnableAudio: Bool = false
override var enableAudio: Bool {
get {
return isEnableAudio
}
set {
isEnableAudio = newValue
}
}
}
So:
Quick hotfix: make RecordAR open for subclassing, and make the configuration variables open for subclassing too.
Proper fix: add a configuration object to inject into RecordAR and set it up properly.
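A sketch of what the proposed injection could look like, using a class instead of a struct to preserve @objc compatibility (all names here are hypothetical, not current ARVideoKit API):

```swift
import ARKit

// Hypothetical configuration object; a class (not a struct) so it stays @objc-compatible.
@objc public class RecordARConfiguration: NSObject {
    @objc public let enableAudio: Bool
    @objc public let requestMicPermission: Bool

    @objc public init(enableAudio: Bool = true, requestMicPermission: Bool = true) {
        self.enableAudio = enableAudio
        self.requestMicPermission = requestMicPermission
        super.init()
    }
}

// Hypothetical initializer: setup() would read the configuration instead of defaults,
// so no mic permission prompt is triggered when enableAudio is false.
// @objc override public init?(ARSceneKit: ARSCNView, configuration: RecordARConfiguration)
```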
When comparing video taken with the stock iOS camera to video taken with ARVideoKit, the ARVideoKit video appears to have no image stabilization. I think AVCaptureVideoStabilizationMode needs to be set to .auto or .cinematic, or perhaps be made user configurable.
I'm not an expert on this, just got the solution idea from this SO question: https://stackoverflow.com/questions/26043995/ios-8-avfoundation-how-to-enable-video-stabilization-on-capable-devices/26471527#26471527
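ARKit doesn't expose its underlying AVCaptureSession, so it's unclear whether ARVideoKit can apply this directly, but the AVFoundation-level approach from the linked answer looks roughly like this (a sketch, assuming access to a capture session):

```swift
import AVFoundation

// Sketch: enable stabilization on every video connection of a capture session.
// Works only where an AVCaptureSession is available; ARKit does not expose one,
// so for ARVideoKit this likely has to happen inside the framework, if at all.
func enableStabilization(on session: AVCaptureSession) {
    for output in session.outputs {
        for connection in output.connections where connection.isVideoStabilizationSupported {
            // .auto lets the system pick the best mode; .cinematic is stronger
            // but adds latency and crops the frame slightly.
            connection.preferredVideoStabilizationMode = .auto
        }
    }
}
```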
This looks like an incredible framework, but I would love to be able to use it with custom 3D objects that I have using Unity as my 3D renderer rather than SceneKit.
Thank you for sharing this framework, which is really useful!
I see that record() starts or resumes recording a video, but we can only access the video once recording stops. In my case, I need to get the video stream from ARKit continuously and in real time (streaming app). How can I get the stream?
Eric
The exporting methods section outlines how to export media to a user's Photos library. However, there are use cases where persisting media to other locations is preferred. For example, if a user wants to share a recently exported media file, but does not want it persisted to Photos, the device's temporary directory might be more suitable.
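For example, a hedged sketch of writing to the temporary directory instead (assuming videoURL is the file URL of a finished recording handed back by the recorder):

```swift
import Foundation

// Sketch: instead of exporting to Photos, move the rendered file into the temporary
// directory so it can be shared (e.g. via UIActivityViewController) without persisting.
func moveToTemporaryDirectory(_ videoURL: URL) throws -> URL {
    let tempURL = FileManager.default.temporaryDirectory
        .appendingPathComponent(videoURL.lastPathComponent)
    // Overwrite any stale copy from a previous recording.
    if FileManager.default.fileExists(atPath: tempURL.path) {
        try FileManager.default.removeItem(at: tempURL)
    }
    try FileManager.default.moveItem(at: videoURL, to: tempURL)
    return tempURL
}
```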
Thank you for creating such a useful library.
Instead of showing emojis, I'd like to show animations. How can I do this?
Thank you.
This is my first time managing video in an SCNView. I started using your library and solved a lot of problems, but I still hit an issue when starting and then immediately stopping the video. I also noticed that your Swift example project crashes the same way, with this log:
2018-02-07 14:25:23.808628+0100 ARVideoKit-Example[3957:1450433] [MC] System group container for systemgroup.com.apple.configurationprofiles path is /private/var/containers/Shared/SystemGroup/systemgroup.com.apple.configurationprofiles
2018-02-07 14:25:23.808945+0100 ARVideoKit-Example[3957:1450433] [MC] Reading from public effective user settings.
2018-02-07 14:25:25.226704+0100 ARVideoKit-Example[3957:1450312] *** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: '*** -[AVAssetWriter finishWritingWithCompletionHandler:] Cannot call method when status is 0'
*** First throw call stack:
(0x18131b164 0x180564528 0x186c2ad68 0x104df119c 0x104e0f3f0 0x104de4934 0x105a852cc 0x105a8528c 0x105a89ea0 0x1812c3344 0x1812c0f20 0x1811e0c58 0x18308cf84 0x18a9395c4 0x1045a1100 0x180d0056c)
libc++abi.dylib: terminating with uncaught exception of type NSException
It seems there is an unknown status (value 0) in assetsWriter.status.rawValue. I can avoid the error with a check for when this value is 0, but the next time you start a recording, the application won't record.
Hello,
I need an optional feature that uses ARKit and ARVideoKit in my app, but I don't want it to prevent users on iOS versions below 11.0 from downloading and using the rest of the app. Is there a way to weak-link the ARVideoKit framework and add @available checks, as we do for other iOS frameworks?
The error I get when trying to import it in a project with deployment target 9.0 is:
Module file's minimum deployment target is ios11.0 v11.0: <PROJECT_ROOT>/ARVideoKit.framework/Modules/ARVideoKit.swiftmodule/arm64.swiftmodule
Thanks.
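A common approach for shipping an iOS 11-only feature while keeping a lower deployment target is to mark the framework as Optional (weak-linked) under "Link Binary With Libraries" and guard every call site with availability checks; whether this works here also depends on the Swift module's own minimum deployment target, as the error above shows. A sketch of the call-site guard (presentARFeature and showUnsupportedMessage are hypothetical entry points):

```swift
import ARKit

// Guard all ARKit/ARVideoKit usage behind an availability check so the
// weak-linked framework is never touched on older iOS versions or on
// devices without world-tracking support.
if #available(iOS 11.0, *), ARWorldTrackingConfiguration.isSupported {
    presentARFeature()        // hypothetical AR-only part of the app
} else {
    showUnsupportedMessage()  // hypothetical fallback for older devices
}
```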
func exportMessage(success: Bool, status:PHAuthorizationStatus, url:URL) {
if success {
let alert = UIAlertController(title: "Exported", message: "Media exported to camera roll successfully!", preferredStyle: .alert)
alert.addAction(UIAlertAction(title: "Awesome", style: .cancel, handler: nil))
self.present(alert, animated: true, completion: nil)
}else if status == .denied || status == .restricted || status == .notDetermined {
let errorView = UIAlertController(title: "Oops!", message: "Please allow access to the photo library in order to save this media file.", preferredStyle: .alert)
let settingsBtn = UIAlertAction(title: "Open Settings", style: .cancel) { (_) -> Void in
guard let settingsUrl = URL(string: UIApplicationOpenSettingsURLString) else {
return
}
if UIApplication.shared.canOpenURL(settingsUrl) {
if #available(iOS 10.0, *) {
UIApplication.shared.open(settingsUrl, completionHandler: { (success) in
})
} else {
UIApplication.shared.openURL(URL(string:UIApplicationOpenSettingsURLString)!)
}
}
}
errorView.addAction(UIAlertAction(title: "Later", style: UIAlertActionStyle.default, handler: nil))
errorView.addAction(settingsBtn)
self.present(errorView, animated: true, completion: nil)
}else{
let alert = UIAlertController(title: "Exporting Failed", message: "There was an error while exporting your media file.", preferredStyle: .alert)
alert.addAction(UIAlertAction(title: "OK", style: .cancel, handler: nil))
self.present(alert, animated: true, completion: nil)
}
}
The application falls into the else branch when you try to modify the export URL. Is changing the file save name not allowed?
Hi, ARVideoKit is very useful to me.
I've discovered that the quality of photos/videos captured with ARVideoKit is not the same as what I see on the screen (downgraded resolution, I guess).
So I want to know:
How can I improve the quality of video or photos captured by ARVideoKit?
Is there any way to change the resolution?
Thanks a lot.
I am using SceneKit.
I want to stop videos from being saved to the iPhone Photos library, because the newly created video is already saved in the app's Documents directory.
Thanks.
Hi,
Thanks for the awesome work. I recently started using ARKit and ARVideoKit, and because of the Swift runtime my app got bloated in size. So I'd like to implement ARKit in Objective-C, and would like to know if you have any plans to support ARVideoKit there?
Thanks.
Thanks for the great work of this repo.
I think performance could be improved if we render the scene only once.
Normally, ARSCNView does the world tracking and applies the camera transform to the scene. If we use only ARSession, not ARSCNView, we can update the scene camera manually and render the scene to an FBO with SCNRenderer. After that, we display the FBO with Metal or OpenGL ES and send the FBO buffer to the video recorder.
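A minimal sketch of this single-render idea, assuming scene, pointOfView (updated each frame from the ARSession camera transform), time, commandBuffer, and renderPassDescriptor are managed elsewhere (this is an outline, not part of ARVideoKit):

```swift
import SceneKit
import Metal

// Sketch: drive SceneKit with SCNRenderer against an offscreen Metal render target,
// then reuse that texture for both on-screen display and the video writer,
// instead of rendering the scene a second time for the recorder.
let device = MTLCreateSystemDefaultDevice()!
let renderer = SCNRenderer(device: device, options: nil)
renderer.scene = scene
renderer.pointOfView = pointOfView  // camera node updated from ARSession manually

// Render one frame into the texture bound by renderPassDescriptor; the same
// texture can then be drawn to screen and handed to the recorder's pixel-buffer path.
renderer.render(atTime: time,
                viewport: CGRect(x: 0, y: 0, width: 1920, height: 1080),
                commandBuffer: commandBuffer,
                passDescriptor: renderPassDescriptor)
```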
The crash:
Fatal Exception: NSInvalidArgumentException
*** -[AVAssetWriterInput initWithMediaType:outputSettings:sourceFormatHint:] 4 is not a valid channel count for Format ID 'aac '. Use kAudioFormatProperty_AvailableEncodeNumberChannels (<AudioToolbox/AudioFormat.h>) to enumerate available channel counts for a given format.
Fatal Exception: NSInvalidArgumentException
0 CoreFoundation 0x186d31d04 __exceptionPreprocess
1 libobjc.A.dylib 0x185f80528 objc_exception_throw
2 AVFoundation 0x18c5c5a04 -[AVAssetWriterInput dealloc]
3 ARVideoKit 0x101bac6bc (Missing)
4 ARVideoKit 0x101bacd8c (Missing)
5 ARVideoKit 0x101bb41b4 (Missing)
6 AudioToolbox 0x18a94be60 AudioSessionRequestRecordPermission
7 AVFAudio 0x18c4e5d18 -[AVAudioSession requestRecordPermission:]
I have a problem with frame rate. The default in ARKit is 60 fps, but when I turn on recording the whole scene runs twice as fast: all physics and animations run 2x faster in record mode than in non-record mode. In the debugger I see that the SceneKit fps jumps to 120 while recording. Is there a workaround? I tried setting the recorder fps to 60, but the same behaviour still happens.
Hi,
Thank you for the framework! Super useful
I'm wondering if you (or anyone else) have gotten the following error:
When capturing photo or video with ARVideoKit, in some specific instances the photos/videos don't match the placement of nodes in the scene.
(See the attached images: the one with a circular camera button at the bottom is a screenshot of how ARSCNView displays the image on my device, and the other is the image captured by ARVideoKit and saved to the camera roll.)
This is only happening for me when nodes are populated by fetching the user's current position from currentFrame with frame.camera.transform. When I populate nodes in the scene with positions relative to the rootNode, ARVideoKit works perfectly and captures look exactly like what is displayed on the device.
I'm wondering if this is something having to do with how ARVideoKit interacts with currentFrame, and if getting the location in this way is somehow adjusting the position that the nodes appear to be in the scene when it's passed to ARVideoKit to render and export? Any help would be very much appreciated!
Thanks
I have been getting this error over the last week and am unable to resolve it. Apologies if this is incorrect usage of the "issues" tab.
Auto-Linking bitcode bundle could not be generated because '/Users/allenwixted/Desktop/iOS : OS X/NoPlaceLike/Frameworks/ARVideoKit.framework/ARVideoKit' was built without full bitcode. All frameworks and dylibs for bitcode must be generated from Xcode Archive or Install build
I've tried this article and it worked for a day or so. Then I realised I was working on an Xcode beta and needed to build for iOS 11.3, and I'm unable to resolve the issue this time around. Any help is much appreciated.
Hello, I'm a graphic designer and I like to collaborate with open source projects. I would like to design a logo for your Project I will be happy to collaborate with you :)
Are you able to make the video a square? So the output of the recording is a movie file with a 1x1 aspect ratio?
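As far as I can tell, ARVideoKit doesn't expose a 1:1 output size, but one possible approach is to crop the finished recording to a centered square with AVFoundation in a post-processing step. A sketch (assuming asset is the recorded video):

```swift
import AVFoundation

// Sketch: build a video composition that crops a centered square out of the
// recording; use it with AVAssetExportSession via its videoComposition property.
func squareComposition(for asset: AVAsset) -> AVMutableVideoComposition? {
    guard let track = asset.tracks(withMediaType: .video).first else { return nil }
    let side = min(track.naturalSize.width, track.naturalSize.height)

    let composition = AVMutableVideoComposition()
    composition.renderSize = CGSize(width: side, height: side)
    composition.frameDuration = CMTime(value: 1, timescale: 30)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRange(start: .zero, duration: asset.duration)

    // Shift the frame so the centered square region lands at the origin.
    let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
    let dx = (track.naturalSize.width - side) / 2
    let dy = (track.naturalSize.height - side) / 2
    layerInstruction.setTransform(CGAffineTransform(translationX: -dx, y: -dy), at: .zero)

    instruction.layerInstructions = [layerInstruction]
    composition.instructions = [instruction]
    return composition
}
```

Note this ignores any preferredTransform rotation on the track, which a production version would need to fold into the transform.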
The recorder only records part of the scene view, not all of it. This is the actual scene view:
You can see both eyes here, but the recorder only records this much of the area:
Recorder is configured like this:
`lazy var recorder:RecordAR = {
let recorder = RecordAR(ARSceneKit: self.sceneView)
recorder?.videoOrientation = .auto
recorder?.fps = ARVideoFrameRate.auto
recorder?.contentMode = .auto
recorder?.enableAdjsutEnvironmentLighting = false
return recorder!
}()`
And sceneview and camera is configured like this:
` override func viewDidLoad() {
super.viewDidLoad()
self.deleteButton.isHidden = true
self.replayButton.isHidden = true
self.sceneView.scene = SCNScene()
sceneView.rendersContinuously = true
sceneView.backgroundColor = UIColor.white
sceneView.automaticallyUpdatesLighting = true
sceneView.scene.background.contents = UIColor.white
createFaceGeometry()
}
override func viewDidAppear(_ animated: Bool) {
super.viewDidAppear(animated)
self.recorder.requestMicrophonePermission()
resetTracking()
}
override func viewWillDisappear(_ animated: Bool) {
super.viewWillDisappear(animated)
session.pause()
session.delegate = nil
}
override func didReceiveMemoryWarning() {
super.didReceiveMemoryWarning()
// Dispose of any resources that can be recreated.
}
func createFaceGeometry() {
sceneView.isHidden = false
self.faceNode = SCNNode()
faceNode?.addChildNode(robotHead)
wrapperNode?.addChildNode(faceNode!)
sceneView.scene.rootNode.addChildNode(wrapperNode!)
self.configureCameraNode()
setupConstraints()
}
func configureCameraNode(){
let camera = SCNCamera()
camera.usesOrthographicProjection = true
camera.orthographicScale = 25
let cameraNode = SCNNode()
cameraNode.position = SCNVector3Make(0, 0, 50)
cameraNode.camera = camera
wrapperNode?.addChildNode(cameraNode)
sceneView.pointOfView = cameraNode
}`
How can I make the recorder record everything that is in the scene view?
First, great work with this library - really really useful!
In the app that I am building I am overlaying 2d content (UIViews) on top of the AR content.
When recording, I would love to be able to record photo/video with everything that's in this other view that's overlaying the ARSCNView, not just the ARSCNView content.
Any thoughts on whether / how / when that would be possible?
Thanks!
Octavian.
Hi! Today my colleagues and I experimented with the framework and took a photo using the recorder's photo() method. However, the resulting image had dimensions equal to the screen's, which means it was simply a screenshot rather than a photo taken by the phone camera, which has a much higher resolution. Is that expected behavior?
First of all, thank you for sharing this framework!
I've been getting errors when the app goes into the background:
Execution of the command buffer was aborted due to an error during execution. Insufficient Permission (to submit GPU work from background) (IOAF code 6)
I suspect RecordAR is never deallocated, maybe because RecordAR holds a strong reference to ARSceneKit, which might create a retain cycle in certain cases?
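If such a cycle exists, the usual fix on the framework side is for the recorder to hold the view weakly. A minimal sketch of the pattern (RecordARSketch is a hypothetical stand-in, not the actual RecordAR implementation):

```swift
import ARKit

class RecordARSketch {
    // Weak reference: the recorder observes the view but does not keep it alive,
    // so the view controller owning the ARSCNView can deallocate normally.
    private weak var sceneView: ARSCNView?

    init(ARSceneKit: ARSCNView) {
        self.sceneView = ARSceneKit
    }

    deinit {
        // With a weak reference this fires when the owner releases the recorder,
        // allowing GPU work to be torn down before the app is backgrounded.
        print("recorder deallocated")
    }
}
```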
Thank you
Exporting an IPA fails due to:
I use the ARVideoKit-Example demo; the deinit method in SCNViewController is not called after going back to MainViewController.
Archiving succeeds, but the export fails afterwards.
error: Cannot extract bundle from /var/folders/6g/4qwgtxn519973ksvtg8jds080000gn/T/IDEDistributionOptionThinning.yV7/Payload/ARTeam.app/Frameworks/ARVideoKit.framework/ARVideoKit (i386)
I use ARVideoKit.framework version 1.12.
Enable Bitcode is set to false in my Xcode project.
Fastlane failed:
error: exportArchive: Failed to verify bitcode in ARVideoKit.framework/ARVideoKit:
error: Cannot extract bundle from /var/folders/45/sr50c3y14sxcsdgn_q1gr9200000gn/T/XcodeDistPipeline.Xzl/Root/Payload/xxxxxx.app/Frameworks/ARVideoKit.framework/ARVideoKit (i386)
I updated Xcode to the 9.3 beta. It shows me the message "Module compiled with Swift 4.0.3 cannot be imported in Swift 4.1".
Failed to verify bitcode in ARVideoKit.framework/ARVideoKit:
error: Cannot extract bundle from /var/folders/2v/_0rv13nn7911fdl1dt16wmjc0000gn/T/XcodeDistPipeline.JDt/Root/Payload/test.app/Frameworks/ARVideoKit.framework/ARVideoKit (x86_64)
Hi, FYI, I ran into a problem where the app crashed on startup as soon as I added the framework. I was able to resolve it by adding a build step to copy the framework, as described in the last steps of this answer: https://stackoverflow.com/a/43197278/1884907
Hey,
I've used ARVideoKit for recording in my AR applications, but when I start recording, a video I recorded for 10 seconds shows only a black screen. Photos show a black screen too.
How do I use this library for recording video in both landscape and portrait apps? Does ARVideoKit always set the device orientation to portrait mode?
I want to record videos with custom sizes, but under the current implementation the only options appear to be landscape and portrait.
Hi team,
Video and photo capture are supported only in iPhone applications, not in iPad applications; the video looks very stretched.
Kindly fix this and update it for iPad too.
The underlying camera image and the SceneKit content are misaligned on the iPhone X (I presume because of the display's different aspect ratio). It looks like the SceneKit content is rendered too wide relative to the camera image.
Here's an example. First is a screenshot of what it looks like in the app; second is an image exported from ARVideoKit (videos have the same issue; it was just easier to demonstrate with a photo):
I followed the first setup instructions and the app ran fine on my physical device; however, it's not able to compile on the simulators. How can I work around this problem, as I still need to test other aspects of my app on the simulator? Thanks.
On validating the archive, codesign fails because of a missing password for the key "access".
It doesn't capture the drawing made using this: https://github.com/laanlabs/ARBrush.
I'm not exactly sure what's going wrong; any leads are appreciated. :)
func frame(didRender buffer: CVPixelBuffer, with time: CMTime, using rawBuffer: CVPixelBuffer) {
if capture != nil {
capture?.didCaptureCVPixelBufferRef(buffer, time: time, usingRowBuffer: rawBuffer)
}
}
I use WebRTC to capture and send the CVPixelBuffer.
I updated ARVideoKit.framework yesterday afternoon, and a memory issue is causing a crash.
It seems to be in CVPixelBufferCreate, CVPixelBufferBacking::initWithPixelBufferDescription, and CVPixelBufferStandardMemoryLayout.