
⬆️ Media Capture in Swift

Home Page: http://nextlevel.engineering

License: MIT License

Ruby 0.19% Swift 99.15% Objective-C 0.51% Shell 0.15%
swift nextlevel ios video photography camera capture media avfoundation coreimage

nextlevel's Introduction

Next Level

NextLevel is a Swift camera system designed for easy integration, customized media capture, and image streaming in iOS. Integration can optionally leverage AVFoundation or ARKit.


Features
🎬 “Vine-like” video clip recording and editing
🖼 photo capture (raw, jpeg, and video frame)
👆 customizable gestural interaction and interface
💠 ARKit integration (beta)
📷 dual, wide angle, telephoto, & true depth support
🐢 adjustable frame rate on supported hardware (i.e. fast/slow-motion capture)
🎢 depth data capture support & portrait effects matte support
🔍 video zoom
white balance, focus, and exposure adjustment
🔦 flash and torch support
👯 mirroring support
low light boost
🕶 smooth auto-focus
configurable encoding and compression settings
🛠 simple media capture and editing API
🌀 extensible API for image processing and CV
🐈 animated GIF creator
😎 face, QR code, and barcode recognition
🐦 Swift 5

Need a different version of Swift?

  • 5.0 - Target your Podfile to the latest release or master
  • 4.2 - Target your Podfile to the swift4.2 branch

Quick Start

# CocoaPods
pod "NextLevel", "~> 0.16.3"

# Carthage
github "nextlevel/NextLevel" ~> 0.16.3

# Swift PM
let package = Package(
    dependencies: [
        .package(url: "https://github.com/NextLevel/NextLevel", from: "0.16.3")
    ]
)

Alternatively, drop the NextLevel source files or project file into your Xcode project.

Important Configuration Note for ARKit and True Depth

The ARKit and True Depth Camera software features are enabled by including the Swift compiler flags USE_ARKIT and USE_TRUE_DEPTH, respectively.

Apple will reject apps that link against ARKit or the True Depth Camera API and do not use them.

If you use CocoaPods, you can include -D USE_ARKIT or -D USE_TRUE_DEPTH with the following Podfile addition, or by adding the flag to your Xcode build settings.

  post_install do |installer|
    installer.pods_project.targets.each do |target|
      # set up NextLevel for ARKit use
      if target.name == 'NextLevel'
        target.build_configurations.each do |config|
          config.build_settings['OTHER_SWIFT_FLAGS'] = ['$(inherited)', '-DUSE_ARKIT']
        end
      end
    end
  end
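In Swift source, code that depends on these features can then be guarded by the same flags, for example:

```swift
#if USE_ARKIT
import ARKit

// ARKit-dependent capture code is only compiled when the
// USE_ARKIT Swift compiler flag is set.
#endif

#if USE_TRUE_DEPTH
// True Depth Camera code, compiled only with -D USE_TRUE_DEPTH.
#endif
```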

Overview

Before starting, ensure that permission keys have been added to your app's Info.plist.

<key>NSCameraUsageDescription</key>
    <string>Allowing access to the camera lets you take photos and videos.</string>
<key>NSMicrophoneUsageDescription</key>
    <string>Allowing access to the microphone lets you record audio.</string>

Recording Video Clips

Import the library.

import NextLevel

Setup the camera preview.

let screenBounds = UIScreen.main.bounds
self.previewView = UIView(frame: screenBounds)
if let previewView = self.previewView {
    previewView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    previewView.backgroundColor = UIColor.black
    NextLevel.shared.previewLayer.frame = previewView.bounds
    previewView.layer.addSublayer(NextLevel.shared.previewLayer)
    self.view.addSubview(previewView)
}

Configure the capture session.

override func viewDidLoad() {
    NextLevel.shared.delegate = self
    NextLevel.shared.deviceDelegate = self
    NextLevel.shared.videoDelegate = self
    NextLevel.shared.photoDelegate = self

    // modify .videoConfiguration, .audioConfiguration, .photoConfiguration properties
    // Compression, resolution, and maximum recording time options are available
    NextLevel.shared.videoConfiguration.maximumCaptureDuration = CMTimeMakeWithSeconds(5, 600)
    NextLevel.shared.audioConfiguration.bitRate = 44000
 }

Start/stop the session when appropriate. These methods create a new session instance for NextLevel.shared.session when called.

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)     
    NextLevel.shared.start()
    // …
}
override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)        
    NextLevel.shared.stop()
    // …
}

Video record/pause.

// record
NextLevel.shared.record()

// pause
NextLevel.shared.pause()
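A common pattern is to drive record/pause from a press-and-hold gesture, Vine-style. A minimal sketch; the gesture wiring shown here is illustrative, not part of NextLevel's API:

```swift
// Press and hold to record, release to pause.
@objc func handleLongPress(_ gesture: UILongPressGestureRecognizer) {
    switch gesture.state {
    case .began:
        NextLevel.shared.record()
    case .ended, .cancelled, .failed:
        NextLevel.shared.pause()
    default:
        break
    }
}
```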

Editing Recorded Clips

Editing and finalizing the recorded session.

if let session = NextLevel.shared.session {

    //..

    // undo
    session.removeLastClip()

    // various editing operations can be done using the NextLevelSession methods

    // export
    session.mergeClips(usingPreset: AVAssetExportPresetHighestQuality, completionHandler: { (url: URL?, error: Error?) in
        if let _ = url {
            //
        } else if let _ = error {
            //
        }
     })

    //..

}
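Once merged, the exported file can be handled like any other video URL. For example, a sketch of saving it to the Photos library; this assumes photo library authorization has already been granted, and saveVideoToLibrary is a hypothetical helper name:

```swift
import Photos

// Hypothetical helper: save a merged clip to the user's Photos library.
func saveVideoToLibrary(at url: URL) {
    PHPhotoLibrary.shared().performChanges({
        PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: url)
    }) { success, error in
        if !success {
            print("failed to save video: \(String(describing: error))")
        }
    }
}
```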

Videos can also be processed using the NextLevelSessionExporter, a media transcoding library in Swift.

Custom Buffer Rendering

NextLevel was designed for sample buffer analysis and custom modification in real time, alongside a rich set of camera features.

Note that modifications performed on a buffer and provided back to NextLevel may affect frame rate.

Enable custom rendering.

NextLevel.shared.isVideoCustomContextRenderingEnabled = true

Optional hook that allows reading sampleBuffer for analysis.

extension CameraViewController: NextLevelVideoDelegate {

    // ...

    // video frame processing
    public func nextLevel(_ nextLevel: NextLevel, willProcessRawVideoSampleBuffer sampleBuffer: CMSampleBuffer) {
        // Use the sampleBuffer parameter in your system for continual analysis
    }
}

Another optional hook allows reading buffers for modification, imageBuffer. This is also the recommended place to provide the modified buffer back to NextLevel for recording.

extension CameraViewController: NextLevelVideoDelegate {

    // ...

    // enabled by isVideoCustomContextRenderingEnabled
    public func nextLevel(_ nextLevel: NextLevel, renderToCustomContextWithImageBuffer imageBuffer: CVPixelBuffer, onQueue queue: DispatchQueue) {
        // provide the frame back to NextLevel for recording
        if let frame = self._availableFrameBuffer {
            nextLevel.videoCustomContextImageBuffer = frame
        }
    }
}

NextLevel checks this property when writing buffers to a destination file. This works both for video and for photos captured with capturePhotoFromVideo.

nextLevel.videoCustomContextImageBuffer = modifiedFrame
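For instance, a Core Image filter could be applied to each frame before handing it back. A minimal sketch, assuming you manage your own CIContext and a pre-allocated output CVPixelBuffer; the buffer management shown here is hypothetical, not part of NextLevel's API:

```swift
import CoreImage
import CoreVideo

let ciContext = CIContext()

// Apply a filter to an incoming frame and render the result into outputBuffer,
// which could then be assigned to nextLevel.videoCustomContextImageBuffer.
func applyNoirFilter(to imageBuffer: CVPixelBuffer, into outputBuffer: CVPixelBuffer) {
    let image = CIImage(cvPixelBuffer: imageBuffer)
    guard let filter = CIFilter(name: "CIPhotoEffectNoir") else { return }
    filter.setValue(image, forKey: kCIInputImageKey)
    if let output = filter.outputImage {
        ciContext.render(output, to: outputBuffer)
    }
}
```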

About

NextLevel was initially a weekend project that has grown into an open community of camera platform enthusiasts. The software provides foundational components for managing media recording, camera interface customization, gestural interaction customization, and image streaming on iOS. The same capabilities can also be found in apps such as Snapchat, Instagram, and Vine.

The goal is to continue to provide a good foundation for quick integration (enabling projects to be taken to the next level), allowing focus to be placed on the functionality that matters most, whether that's real-time image processing, computer vision methods, augmented reality, or computational photography.

ARKit

NextLevel provides components for capturing ARKit video and photo. This enables a variety of new camera features while leveraging the existing recording capabilities and media management of NextLevel.

If you are trying to capture frames from SceneKit for ARKit recording, check out the examples project.

Documentation

You can find the docs here. Documentation is generated with jazzy and hosted on GitHub-Pages.

Stickers

If you found this project to be helpful, check out the Next Level stickers.

Project

NextLevel is a community – contributions and discussions are welcome!

  • Feature idea? Open an issue.
  • Found a bug? Open an issue.
  • Need help? Use Stack Overflow with the tag 'nextlevel'.
  • Questions? Use Stack Overflow with the tag 'nextlevel'.
  • Want to contribute? Submit a pull request.

Related Projects

Resources

License

NextLevel is available under the MIT license, see the LICENSE file for more information.

nextlevel's People

Contributors

ayaibrahim, boaztelem, brikerman, captainuberawesome, cemaleker, dthuering, dushmank, dylanreich, giogus, gottagetswifty, mshtmfv, nikolaygenov, piemonte, roger-blinto, s1ddok, tkohout, willi


nextlevel's Issues

Flipping camera results in blocking the main thread

Trying to flip the camera while capturing video causes the main thread to freeze.

I am trying to do a flip animation while flipping the camera.

Am I doing anything wrong or is there a way to avoid blocking the UI while flipping the camera?

Odd frame timing at 240fps

I'm recording a video at 240fps on an iPhone 6, and I'm getting some strange video timing. Where I'd expect frame durations of 4.167ms, I'm getting alternating durations of 5ms and 3.33ms. Over the whole video the frame timings come out to a standard deviation of 0.8431996842ms. (I've cut out the frames captured at the start because there's a bunch of frame loss when the asset writer kicks in, but after that there's no real frame loss once audio capture is disabled.)

You can see some of the data around the frame timestamps I'm capturing here: https://docs.google.com/spreadsheets/d/1JTTQKT9yZA5_quej1LxAb4N0AsjdeZlvQUROS5IHFNc/edit?usp=sharing. I was wondering if you had any ideas about where the offset might be coming from?

Thanks.

Development Target Problem

Thanks for your good library.
I tried to add NextLevel via pod.
It needs deployment_target 10.0.
Can you make it work on 9.0 also?

Optional NextLevelDelegate methods?

Maybe not everyone needs to conform to every single method in NextLevelDelegate. We can think of more suitable names; I used these comments as a guide. Let me know what you think?

protocol NextLevelPermissionDelegate {}
protocol NextLevelConfigurationDelegate {}
protocol NextLevelSessionDelegate {}
protocol NextLevelDeviceModeOrientationDelegate {}
protocol NextLevelOrientationDelegate {}
protocol NextLevelApertureDelegate {}
protocol NextLevelFocusExposureWhiteBalanceDelegate {}
protocol NextLevelTorchFlashDelegate {}
protocol NextLevelZoomDelegate {}
protocol NextLevelPreviewDelegate {}
protocol NextLevelVideoProcessingDelegate {}
protocol NextLevelVideoRecordingSessionDelegate {}
protocol NextLevelVideoFramePhotoDelegate {}
protocol NextLevelPhotoDelegate {}

Inspired from: https://ashfurrow.com/blog/protocols-and-swift/

Sample project fail?

I have just downloaded and tried the sample project, and it seems dead.
You start the app, you get 3 buttons, and nothing really happens. You press the checkmark and it says saved, but I don't get any feedback like recording time or a final video.
There are like a billion lines of code, but nothing really happens. Is this intentional?

Image pixelated when using capture photo from video

The image is pixelated when using capture photo from video. I require a high-quality photo. I am using the following code:

NextLevel.sharedInstance.capturePhotoFromVideo()

Properties that I am setting on the shared instance of NextLevel:

let nextLevel = NextLevel.sharedInstance
nextLevel.deviceOrientation = .portrait
nextLevel.videoConfiguration.maximumCaptureDuration = CMTimeMakeWithSeconds(maximumDuration, 600)

// video configuration
nextLevel.videoConfiguration.bitRate = 2000000
nextLevel.videoConfiguration.scalingMode = AVVideoScalingModeResizeAspectFill

// audio configuration
nextLevel.audioConfiguration.bitRate = 128000

nextLevel.isVideoCustomContextRenderingEnabled = true
nextLevel.videoConfiguration.preset = AVCaptureSessionPresetHigh

🛠 NextLevel enhancements

  • add improved authorization checks
  • custom output directory
  • AV session interruption recovery support
  • add better error reporting
  • videoConfiguration could have pre-defined sizing presets

Orientation

Do I have to do anything special to ensure the orientation is correct?
I start the app in portrait, rotate the phone to landscape, hit record, but continue to get
Rotation: 0 in the resulting file.

exiftool -Rotation 1488148498200.mp4
Rotation : 0

Can't record twice

Presenting the example CameraViewController works the first time only.
The second time it fails at start because captureSession was not nil.
Clearing that still failed because sharedInstance held some internal state.
Using a non-singleton NextLevel instance fared better but resulted in unreleased KVO errors.
Removing KVO in deinit finally allowed CameraViewController to work multiple times.

Play Music while Recording

Hello!

First off, I would like to congratulate you on building such an incredible engine. It has really accelerated development on my project. I was wondering if there is any way to play music during a session, similar to Snapchat, Vine, and Instagram Stories. I've tried setting the AVAudioSession category to both Ambient and PlayAndRecord, but the music still pauses when the camera session opens.

Thank you in advance!

Regarding the presence of an outputFormat for video and photo

Hi,

I have been migrating my code from PBJVision to NextLevel in one of my projects. I can't find an equivalent of the PBJOutputFormat enum. I am trying to get the image/video in square format.

Please let me know if there is an enum for square output format.

Thank you.

The flashMode property doesn't work

I'm pretty new with the AVFoundation apis so I know I might be wrong, but here goes.

The flashMode setter performs an AVCaptureDevice lockForConfiguration() / unlockForConfiguration() which I believe can be removed since Apple deprecated the flashMode property and it's never set anyway.

Also I think the flash problem is caused because the flashMode is never passed on to the AVCapturePhotoSettings before capture. I managed to fix it locally by adding photoSettings.flashMode = self.photoConfiguration.flashMode before the capture photo call. The cleaner way would have been to add it as an entry to the avcaptureDictionary() function on the NextLevelPhotoConfiguration but I was unable to find the corresponding key in the Apple Docs.

P.S. Thanks for the great library!

Problem with CocoaPods

I'm having trouble installing through CocoaPods. I'm receiving this error:

[!] Unable to satisfy the following requirements:

 - `NextLevel (~> 0.0.1)` required by `Podfile`

Any idea what might be going wrong? Thanks.

Camera Freezes when In call bar is showing

I recently built my own camera application, and it had the same problem as this API. Whenever someone calls me and I open my application, the camera freezes on one frame and that frozen frame gets really dark. Do you have any idea why this happens or how to fix it?

Video Quality

Hi Piemonte,

We have below configuration for video recording:

  1. Bit rate : 2000000
  2. AVAssetExportPresetHighestQuality

but after video recording we are getting a pixelated video file. Can you please help?

Flash not working.

Hi,

I have included this library in one of my projects. I tried to turn on the flash using the following code.

NextLevel.sharedInstance.flashMode = .on

But it is not working; "NextLevel.sharedInstance.flashMode" always returns off.

Please check the issue.

previewView not updated with camera view on second presentation of CameraViewController

First off, amazing Lib.

I am using NextLevel in one of my apps, and I need to present the CameraViewController() multiple times modally to allow for video capture. Everything works well until I present the CameraViewController a second time, when the previewView does not update to the NextLevel.sharedInstance.previewLayer. The previewView will update if the camera is flipped and then flipped back.

I have created a video detailing the bug.
https://www.youtube.com/watch?v=TztPcTT5TKk&feature=youtu.be

The culprit is on line ~90 of CameraViewController

        
        // preview
        self.previewView = self.cameraView
        if let previewView = self.previewView {
            previewView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
            previewView.backgroundColor = UIColor.blue
            NextLevel.sharedInstance.previewLayer.frame = previewView.bounds
            previewView.layer.addSublayer(NextLevel.sharedInstance.previewLayer)
            self.view.addSubview(previewView)
        }

Thanks again for sharing this Lib with all of us!!!!

Cocoapods issue

I tried to install NextLevel via CocoaPods but got this error:

Analyzing dependencies
[!] Unable to satisfy the following requirements:

  • NextLevel (~> 0.3.4) required by Podfile

Specs satisfying the NextLevel (~> 0.3.4) dependency were found, but they required a higher minimum deployment target.

Flipping camera issue while recording

Hi,

I just hit an issue with flipping the camera while recording.
Start recording the video with the device position set to back, then pause the recording and flip the camera to front.
The camera moves to the front position. Now resume recording; the session duration increases. Then tap the save button: only the back-camera recording is available, and no front-camera recording is present in the saved video.

Thank you.

How can we integrate GPUImage2

Is there a way to integrate GPUImage2 with NextLevel, so that we have all the features of NextLevel, but the filtering processing is handled by GPUImage2.

Flip camera while recording causes audio sync issues

@piemonte I found the same issue in this lib that I found in your other lib.

When you capture continuously and flip the camera mid-capture, the audio gets chopped and loses sync with the video stream.

The ideal user experience would be for audio to keep recording without interruptions and to show some black frames or a freeze frame while the camera is flipping (changing inputs). Snapchat keeps audio seamless, so it's possible...

AVAudioRecorder, records audio uninterrupted even when flipping the camera. So it sounds like there might be a solution retiming the samples from the sample buffer?

I need to solve this for our lib in the next days so I'll keep you posted on our solution.

Any ideas?

How to initiate?

I get these error messages when trying to implement NextLevel in my view controller:

  • Cannot assign value of type 'CreateAccountViewController' to type 'NextLevelDelegate?'
  • Value of type 'NextLevel' has no member 'photoDelegate'
  • Value of type 'CreateAccountViewController' has no member 'previewView'

NextLevel.sharedInstance.start()
gives me:

  • Call can throw, but it is not marked with 'try' and the error is not handled
import NextLevel


class CreateAccountViewController: UIViewController, UINavigationControllerDelegate {

  
    override func viewDidLoad() {
        super.viewDidLoad()
        NextLevel.sharedInstance.delegate = self
        NextLevel.sharedInstance.photoDelegate = self
        setupCameraPreview()
    }

    func setupCameraPreview(){
        let screenBounds = UIScreen.main.bounds
        self.previewView = UIView(frame: screenBounds)
        if let previewView = self.previewView {
            previewView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
            previewView.backgroundColor = UIColor.black
            NextLevel.sharedInstance.previewLayer.frame = previewView.bounds
            previewView.layer.addSublayer(NextLevel.sharedInstance.previewLayer)
            self.view.addSubview(previewView)
        }
    }
}

Is there some example-code how to implement?

Issue related to video format when recorded

Hi,

I have been migrating my code from PBJVision to NextLevel in one of my projects. I am getting an issue related to the video type when it is recorded and uploaded to the server with mimeType: "video/mp4".

It gives a success message through PBJVision but an error with status code 422 through NextLevel.

My initial thought is that the video recording/saving format is different.

Thank you.

flash functions

Hi There,

Thanks for the wonderful library. In PBJVision we had flash on/off and a maximum video length function. Does NextLevel have equivalent functions?

Thanks
Monish

Remove clip time error

When I try to delete a clip and then record another one, the session duration is NaN. I found a solution. The problem is where you subtract the session duration.

public func remove(clipAt idx: Int, removeFile: Bool) {
    self.executeClosureSyncOnSessionQueueIfNecessary {
        if self._clips.indices.contains(idx) {
            let clip = self._clips.remove(at: idx)
            
          //  Use CMTimeSubtract to subtract CMTime,
          //  and the subtraction should come before clip.remove()

            self._duration = CMTimeSubtract(self._duration, clip.duration)
            
            if removeFile {
                clip.removeFile()
            }

        }
    }
}

Merging "External" Video with Original

So I have a function that takes in a video and returns the same video but in reverse. How can I record a video using NextLevel, input it into my reverse() function, then merge the original and reversed videos together so that they become one final video? I was going to use the add(clip:) method, but it requires a NextLevelClip, which I'm not sure how to construct.

Usage Advice

How can I record a 5 second video by only tapping once on the record button without holding? Thanks!

Still image capture doesn't work (previewPhotoSampleBuffer always nil)

I'm trying to use NextLevel to capture a still photo. I've set NextLevel.sharedInstance.cameraMode = .photo and I'm calling NextLevel.sharedInstance.capturePhoto()

However, I noticed the nextLevel(_:didProcessPhotoCaptureWith:photoConfiguration:) delegate method is never called. Looking at NextLevel.swift, I see that previewPhotoSampleBuffer is expected to be non-nil.

public func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {
        if let sampleBuffer = photoSampleBuffer,
            let previewBuffer = previewPhotoSampleBuffer {

In my case previewPhotoSampleBuffer is always nil. Is there some configuration property that needs to be set to allow still photos to be taken? Thanks!

How to apply a CIFilter to preview and video?

I've looked into it, but everything I find suggests taking the sample buffer and converting it to a UIImage, which doesn't seem right, particularly when it would require updating the preview layer each frame.

I'm currently looking at the VideoDelegate methods:

public func nextLevel(_ nextLevel: NextLevel, willProcessRawVideoSampleBuffer sampleBuffer: CMSampleBuffer) {}

public func nextLevel(_ nextLevel: NextLevel, renderToCustomContextWithImageBuffer imageBuffer: CVPixelBuffer, onQueue queue: DispatchQueue) {}

Any tips would be appreciated.

iOS 9 support

Hello!

Are there plans to support iOS 9? The podspec currently specifies iOS 10 only.

crash on launch

Hi Piemonte,

My app crashed due to other reasons, and after that, when I open it again, it gives the crash below. The app fails to open 2-3 times.

Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVCaptureSession addInput:] Multiple audio/video AVCaptureInputs are not currently supported'
*** First throw call stack:
(0x186de51b8 0x18581c55c 0x18e5c8c70 0x1011efe2c 0x1011edf00 0x1011ed6bc 0x1011ee530 0x1011ec188 0x1011e79d8 0x101fbd258 0x101fbd218 0x101fcaaec 0x101fc0ce0 0x101fcb088 0x101fcce2c 0x101fccb78 0x185e772a0 0x185e76d8c)
libc++abi.dylib: terminating with uncaught exception of type NSException

`didStartClipInSession` called multiple times while the camera is running

While the camera is running, didStartClipInSession is called continually.

    func nextLevel(_ nextLevel: NextLevel, didStartClipInSession session: NextLevelSession) {
        print("Did start clip")
    }

When I pause, clip.asset and session.asset are nil.

func start() {
     NextLevel.sharedInstance.record()
}

func pause() {
     NextLevel.sharedInstance.pause()
}

func nextLevel(_ nextLevel: NextLevel, didCompleteClip clip: NextLevelClip, inSession session: NextLevelSession) {
        print(clip.asset)
        print(session.asset)
    }

I was expecting access to the asset in this method to be able to export it.

🛠 sample proj additions

  • example clip saving
  • add a focus indicator example
  • add pan to zoom video example
  • add capture support when using the hardware volume buttons

Dynamic Video Overlays

Hi,

I'm using your video capture library in an app I'm making, thanks for all the work!

I am trying to add dynamic overlays to video I'm capturing using this library, but I've been unsuccessful thus far. I'm trying to capture the overlay in a video that'll be saved so I tried adding it to the preview layer but can't seem to get it to show up.

Thanks!

    let subtitleText = CATextLayer()
    subtitleText.fontSize = 72
    subtitleText.frame = CGRect(x: 0.0, y: 50.0, width: UIScreen.main.bounds.width, height: 100.0)
    subtitleText.string = "Text Overlay!"
    subtitleText.alignmentMode = kCAAlignmentCenter
    subtitleText.foregroundColor = UIColor.green.cgColor
    subtitleText.shadowOpacity = 1.0
    subtitleText.shadowOffset = CGSize(width: 0.0, height: 0.0)
    subtitleText.shadowRadius = 6.0
    
    let textOverlayLayer = CALayer()
    textOverlayLayer.addSublayer(subtitleText)
    textOverlayLayer.frame = CGRect(x: 0.0, y: 0.0, width: UIScreen.main.bounds.width, height: UIScreen.main.bounds.height)

   // Neither of these work
    NextLevel.sharedInstance.previewLayer.addSublayer(textOverlayLayer)
    NextLevel.sharedInstance.previewLayer.insertSublayer(textOverlayLayer, at: 0)

So... Is this still a thing?

I would really like a PBJVision library in Swift, but I'm not sure if this project is still active. If it is, could you please provide documentation for installation and setup?

support for real-time recorded progress

Hi,

I have been migrating my code from PBJVision to NextLevel in one of my projects. I am having an issue with the live update of the recorded duration. I am using the value:

session.duration.seconds

but it returns the constant value of the last recorded clip's duration; that is, it is not updating live. I am using the value in the following video delegate method:

func nextLevel(_ nextLevel: NextLevel, didAppendVideoSampleBuffer sampleBuffer: CMSampleBuffer, inSession session: NextLevelSession) {
    log.debug("\(session.duration.seconds)..")
}

Please check the issue.

Thank you.
