This project is forked from heestand-xyz/pixelkit.


Live Graphics Framework

Home Page: http://pixelkit.net

License: MIT License

Languages: Swift 79.35%, Metal 15.24%, Ruby 0.79%, Objective-C 4.59%, C 0.04%


PixelKit


Live Graphics Framework for iOS, macOS and tvOS
runs on RenderKit - powered by Metal - inspired by TouchDesigner

Examples: Camera Effects - Green Screen - Hello Pixels App - Code Reference - Code Examples - Code Demos
Info: Website - Coordinate Space - Blend Operators - Effect Convenience Funcs - High Bit Mode - Apps

Camera, Depth Camera, Image, Video, Screen Capture, Stream In
Color, Circle, Rectangle, Polygon, Arc, Line, Gradient, Noise, Text, Metal
Levels, Blur, Edge, Threshold, Quantize, Transform, Kaleidoscope, Twirl, Feedback, Delay
Channel Mix, Chroma Key, Corner Pin, Hue Saturation, Crop, Flip Flop, Range, Sharpen, Slope, Sepia
Blend, Cross, Lookup, Displace, Remap, Reorder, Res Convert, Clamp, Freeze, Flare
Blends, Luma Levels, Luma Blur, Time Machine, Array, AirPlay, Record, Stream Out

Install

CocoaPods:

CocoaPods is a dependency manager for Cocoa projects. For usage and installation instructions, visit their website. To integrate PixelKit into your Xcode project using CocoaPods, specify it in your Podfile:

pod 'PixelKit'

And import:

import PixelKit

Note that PixelKit only has simulator support in Xcode 11 for iOS 13 on macOS Catalina. In Xcode 10 or earlier, Metal for iOS can only run on a physical device.

To gain camera access, on macOS, check Camera in the App Sandbox in your Xcode project settings under Capabilities.

To get access to all dependency features:

import LiveValues
import RenderKit

Setup

UIKit

import UIKit
import PixelKit

class ViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        let circlePix = CirclePIX(at: .fullscreen)

        let blurPix = BlurPIX()
        blurPix.input = circlePix
        blurPix.radius = 0.25

        let finalPix: PIX = blurPix
        finalPix.view.frame = view.bounds
        view.addSubview(finalPix.view)
        
    }
    
}

SwiftUI

import SwiftUI
import PixelKit

struct ContentView: View {
    var body: some View {
        BlurPIXUI {
            CirclePIXUI()
        }
            .radius(0.25)
            .edgesIgnoringSafeArea(.all)
    }
}

Docs

LiveValues Docs
RenderKit Docs
PixelKit Docs

Tutorials

Getting started with PixelKit in Swift
Getting started with Metal in PixelKit
Green Screen in Swift & PixelKit
Particles in VertexKit & PixelKit

Examples

Hello Pixels App
Code Reference
Code Examples
Code Demos

Example: Camera Effects

import PixelKit

let camera = CameraPIX()

let levels = LevelsPIX()
levels.input = camera
levels.brightness = 1.5
levels.gamma = 0.5

let hueSat = HueSaturationPIX()
hueSat.input = levels
hueSat.sat = 0.5

let blur = BlurPIX()
blur.input = hueSat
blur.radius = 0.25

let res: Resolution = .custom(w: 1500, h: 1000)
let circle = CirclePIX(at: res)
circle.radius = 0.45
circle.bgColor = .clear

let finalPix: PIX = blur & (camera * circle)
finalPix.view.frame = view.bounds
view.addSubview(finalPix.view)

This can also be done with Effect Convenience Funcs:

let pix = CameraPIX()._brightness(1.5)._gamma(0.5)._saturation(0.5)._blur(0.25)

SwiftUI

struct ContentView: View {
    var content: PIXUI {
        HueSaturationPIXUI {
            LevelsPIXUI {
                ResPIXUI {
                    CameraPIXUI()
                }
            }
                .gamma(0.5)
                .brightness(LiveFloat(1.5))
        }
            .saturation(LiveFloat(0.5))
    }
    var body: some View {
        BlendsPIXUI {
            BlurPIXUI {
                RawPIXUI(pix: content.pix)
            }
                .radius(0.25)
            BlendsPIXUI {
                CirclePIXUI()
                    .bgColor(.clear)
                    .radius(0.25)
                RawPIXUI(pix: content.pix)
            }
                .blendMode(.multiply)
        }
            .blendMode(.over)
    }
}

Remember to add NSCameraUsageDescription to your Info.plist.

Example: Green Screen

import PixelKit

let cityImage = ImagePIX()
cityImage.image = UIImage(named: "city")

let supermanVideo = VideoPIX()
supermanVideo.load(fileNamed: "superman", withExtension: "mov")

let supermanKeyed = ChromaKeyPIX()
supermanKeyed.input = supermanVideo
supermanKeyed.keyColor = .green

let blendPix = BlendPIX()
blendPix.blendingMode = .over
blendPix.inputA = cityImage
blendPix.inputB = supermanKeyed

let finalPix: PIX = blendPix
finalPix.view.frame = view.bounds
view.addSubview(finalPix.view)

This can also be done with Blend Operators and Effect Convenience Funcs:

let pix = cityImage & supermanVideo._chromaKey(.green)

SwiftUI

struct ContentView: View {
    var body: some View {
        BlendsPIXUI {
            ImagePIXUI(image: UIImage(named: "city")!)
            ChromaKeyPIXUI {
                VideoPIXUI(fileNamed: "superman", withExtension: "mov")
            }
                .keyColor(.green)
        }
            .blendMode(.over)
    }
}

This is a representation of the Pixel Nodes Green Screen project.

Example: Depth Camera

import PixelKit

let cameraPix = CameraPIX()
cameraPix.camera = .front

let depthCameraPix = DepthCameraPIX.setup(with: cameraPix)

let levelsPix = LevelsPIX()
levelsPix.input = depthCameraPix
levelsPix.inverted = true

let lumaBlurPix = cameraPix._lumaBlur(with: levelsPix, radius: 0.1)

let finalPix: PIX = lumaBlurPix
finalPix.view.frame = view.bounds
view.addSubview(finalPix.view)

The DepthCameraPIX was added in PixelKit v0.8.4 and requires an iPhone X or newer.

Note: use the setup(with:filter:) method of DepthCameraPIX.
It takes care of orientation and color, and enables depth on the CameraPIX.

To gain access to depth values outside of the 0.0 to 1.0 bounds,
enable 16-bit mode like this: PixelKit.main.render.bits = ._16

Example: Multi Camera

let cameraPix = CameraPIX()
cameraPix.camera = .back

let multiCameraPix = MultiCameraPIX.setup(with: cameraPix, camera: .front)

let movedMultiCameraPix = multiCameraPix._scale(by: 0.25)._move(x: 0.375 * (9 / 16), y: 0.375)

let finalPix: PIX = cameraPix & movedMultiCameraPix
finalPix.view.frame = view.bounds
view.addSubview(finalPix.view)

Note MultiCameraPIX requires iOS 13.

Coordinate Space

The PixelKit coordinate space is normalized to the vertical axis (1.0 in height), with the origin (0.0, 0.0) in the center.
Note that compared to native UIKit views, the vertical axis is flipped and the origin is moved; this is more convenient when working with graphics in PixelKit. A full rotation is defined by 1.0.

Center: CGPoint(x: 0, y: 0)
Bottom Left: CGPoint(x: -0.5 * aspectRatio, y: -0.5)
Top Right: CGPoint(x: 0.5 * aspectRatio, y: 0.5)

Tip: Resolution has an .aspect property:
let aspectRatio: LiveFloat = Resolution._1080p.aspect
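
For example, the _position(at:) convenience func (listed under Effect Convenience Funcs below) moves content in this space. A minimal sketch placing a small circle in the top-right corner of a 16:9 canvas:

import PixelKit

// The top-right corner sits at (0.5 * aspectRatio, 0.5); for 16:9 that is roughly (0.889, 0.5).
let circle = CirclePIX(at: ._1080p)
circle.radius = 0.1
let cornerCircle: PIX = circle._position(at: CGPoint(x: 0.5 * (16.0 / 9.0), y: 0.5))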

Blend Operators

A quick and convenient way to blend PIXs.
These are the supported BlendingMode operators:

& .over, !& .under, + .add, - .subtract, * .multiply, ** .power, !** .gamma, % .difference, ~ .average, ° .cosine
<> .minimum, >< .maximum, ++ .addWithAlpha, -- .subtractWithAlpha, <-> .inside, >-< .outside, +-+ .exclusiveOr
let blendPix = (CameraPIX() !** NoisePIX(at: .fullHD(.portrait))) * CirclePIX(at: .fullHD(.portrait))

Note that when using Live values, one-line if-else statements are written with <?> & <=>:

let a: LiveFloat = 1.0
let b: LiveFloat = 2.0
let isOdd: LiveBool = .seconds % 2.0 < 1.0
let ab: LiveFloat = isOdd <?> a <=> b
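
Such a live condition can drive a PIX parameter directly; a minimal sketch continuing the snippet above (assuming CirclePIX's radius accepts the resulting LiveFloat):

let circle = CirclePIX(at: ._1080p)
circle.radius = isOdd <?> a <=> b  // alternates between a (1.0) and b (2.0) each second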

The default global blend operator fill mode is .aspectFit; change it like this:
PIX.blendOperators.globalPlacement = .aspectFill

Live Values

Live values can be declared as constants (let) and still have changing values. Live values are easy to animate with the .live or .seconds static properties.

The Live Values:

  • CGFloat --> LiveFloat
  • Int --> LiveInt
  • Bool --> LiveBool
  • CGPoint --> LivePoint
  • CGSize --> LiveSize
  • CGRect --> LiveRect

Static properties:

  • LiveFloat.live
  • LiveFloat.seconds
  • LiveFloat.secondsSince1970
  • LiveFloat.touch / LiveFloat.mouseLeft
  • LiveFloat.touchX / LiveFloat.mouseX
  • LiveFloat.touchY / LiveFloat.mouseY
  • LivePoint.touchXY / LiveFloat.mouseXY
  • LiveFloat.gyroX
  • LiveFloat.gyroY
  • LiveFloat.gyroZ
  • LiveFloat.accelerationX/Y/Z
  • LiveFloat.magneticFieldX/Y/Z
  • LiveFloat.deviceAttitudeX/Y/Z
  • LiveFloat.deviceGravityX/Y/Z
  • LiveFloat.deviceHeading

Functions:

  • liveFloat.delay(seconds: 1.0)
  • liveFloat.filter(seconds: 1.0)
  • liveFloat.filter(frames: 60)

Static functions:

  • LiveFloat.noise(range: -1.0...1.0, for: 1.0)
  • LiveFloat.wave(range: -1.0...1.0, for: 1.0)
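
For example, a live value from the lists above can be assigned straight to a PIX parameter and it keeps animating; a minimal sketch, assuming CirclePIX.radius accepts a LiveFloat:

import PixelKit

let circle = CirclePIX(at: ._1080p)
circle.radius = LiveFloat.wave(range: 0.2...0.3, for: 2.0)  // pulses once every 2 seconds
circle.bgColor = .clear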

Effect Convenience Funcs

  • pix._reRes(to: ._1080p * 0.5) -> ResPIX
  • pix._reRes(by: 0.5) -> ResPIX
  • pix._brightness(0.5) -> LevelsPIX
  • pix._darkness(0.5) -> LevelsPIX
  • pix._contrast(0.5) -> LevelsPIX
  • pix._gamma(0.5) -> LevelsPIX
  • pix._invert() -> LevelsPIX
  • pix._opacity(0.5) -> LevelsPIX
  • pix._blur(0.5) -> BlurPIX
  • pix._edge() -> EdgePIX
  • pix._threshold(at: 0.5) -> ThresholdPIX
  • pix._quantize(by: 0.5) -> QuantizePIX
  • pix._position(at: CGPoint(x: 0.5, y: 0.5)) -> TransformPIX
  • pix._rotate(to: .pi) -> TransformPIX
  • pix._scale(by: 0.5) -> TransformPIX
  • pix._kaleidoscope() -> KaleidoscopePIX
  • pix._twirl(0.5) -> TwirlPIX
  • pix._swap(.red, .blue) -> ChannelMixPIX
  • pix._key(.green) -> ChromaKeyPIX
  • pix._hue(0.5) -> HueSaturationPIX
  • pix._saturation(0.5) -> HueSaturationPIX
  • pix._crop(CGRect(x: 0.25, y: 0.25, width: 0.5, height: 0.5)) -> CropPIX
  • pix._flipX() -> FlipFlopPIX
  • pix._flipY() -> FlipFlopPIX
  • pix._flopLeft() -> FlipFlopPIX
  • pix._flopRight() -> FlipFlopPIX
  • pix._range(inLow: 0.0, inHigh: 0.5, outLow: 0.5, outHigh: 1.0) -> RangePIX
  • pix._range(inLow: .clear, inHigh: .gray, outLow: .gray, outHigh: .white) -> RangePIX
  • pix._sharpen() -> SharpenPIX
  • pix._slope() -> SlopePIX
  • pixA._lookup(with: pixB, axis: .x) -> LookupPIX
  • pixA._lumaBlur(with: pixB, radius: 0.5) -> LumaBlurPIX
  • pixA._lumaLevels(with: pixB, brightness: 2.0) -> LumaLevelsPIX
  • pixA._vignetting(with: pixB, inset: 0.25, gamma: 0.5) -> LumaLevelsPIX
  • pixA._displace(with: pixB, distance: 0.5) -> DisplacePIX
  • pixA._remap(with: pixB) -> RemapPIX

Keep in mind that these funcs create new PIXs.
Be careful not to overload GPU memory; some funcs create several PIXs.
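
One way to keep the PIX count and resolution in check is to downscale early in a chain; a small sketch using only funcs from the list above:

// Three PIXs are created here: a CameraPIX, a ResPIX and a BlurPIX.
let preview = CameraPIX()._reRes(by: 0.25)._blur(0.25)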

MIDI

Here's an example of live MIDI values in the range 0.0 to 1.0.

let circle = CirclePIX(at: ._1024)
circle.radius = .midi("13")
circle.color = .midi("17")

You can find the addresses by enabling logging like this:

MIDI.main.log = true

High Bit Mode

Some effects like DisplacePIX and SlopePIX can benefit from a higher bit depth.
The default is 8 bits. Change it like this: PixelKit.main.render.bits = ._16

Enable high bit mode before you create any PIXs.
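
A minimal sketch of that order of operations, assuming GradientPIX takes the same at: resolution argument as the other generators:

import PixelKit

PixelKit.main.render.bits = ._16         // switch from the default 8 bits first
let gradient = GradientPIX(at: ._1080p)  // then create PIXs
let smoothSlope = gradient._slope()      // SlopePIX benefits from the extra precision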

Note that resources do not support higher bit depths yet.
There is currently some gamma offset with resources.

MetalPIXs

let metalPix = MetalPIX(at: ._1080p, code:
    """
    pix = float4(u, v, 0.0, 1.0);
    """
)
let metalEffectPix = MetalEffectPIX(code:
    """
    float gamma = 0.25;
    pix = pow(input, 1.0 / gamma);
    """
)
metalEffectPix.input = CameraPIX()
let metalMergerEffectPix = MetalMergerEffectPIX(code:
    """
    pix = pow(inputA, 1.0 / inputB);
    """
)
metalMergerEffectPix.inputA = CameraPIX()
metalMergerEffectPix.inputB = ImagePIX("img_name")
let metalMultiEffectPix = MetalMultiEffectPIX(code:
    """
    float4 inPixA = inTexs.sample(s, uv, 0);
    float4 inPixB = inTexs.sample(s, uv, 1);
    float4 inPixC = inTexs.sample(s, uv, 2);
    pix = inPixA + inPixB + inPixC;
    """
)
metalMultiEffectPix.inputs = [ImagePIX("img_a"), ImagePIX("img_b"), ImagePIX("img_c")]

Uniforms:

var lumUniform = MetalUniform(name: "lum")
let metalPix = MetalPIX(at: ._1080p, code:
    """
    pix = float4(in.lum, in.lum, in.lum, 1.0);
    """,
    uniforms: [lumUniform]
)
lumUniform.value = 0.5
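
Since the uniform's value is set after creation, it can also be updated over time; a sketch, assuming MetalUniform.value accepts the same live values as PIX parameters:

lumUniform.value = LiveFloat.wave(range: 0.0...1.0, for: 4.0)  // fade the luminance up and down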

Apps

A Live Graphics Node Editor for iPad, powered by PixelKit.

A camera app that lets you layer live filters of your choice;
combine effects to create new, cool styles.

Live camera filters.
Load photos from library.

VJLive is a dual deck asset playback system with effects.
Assets can be loaded from Photos. Live camera support. AirPlay support.


by Anton Heestand, Hexagons
