
ruanjx / VideoLab

License: MIT License
High-performance and flexible video editing and effects framework, based on AVFoundation and Metal.

Programming Languages

  • Swift
  • Metal

Projects that are alternatives of or similar to VideoLab

Metalvideoprocess
MetalVideoProcess is a high-performance video effects processing framework. It's based on GPUImage3 (Metal), with support for asynchronous and multithreaded processing.
Stars: ✭ 52 (-92.16%)
Mutual labels:  metal, avfoundation, video-processing
Avdemo
Demo projects for iOS Audio & Video development.
Stars: ✭ 136 (-79.49%)
Mutual labels:  avfoundation, video-processing
Cabbage
A video composition framework built on top of AVFoundation. It's simple to use and easy to extend.
Stars: ✭ 1,030 (+55.35%)
Mutual labels:  avfoundation, video-processing
Metalimage
MetalImage is faster and more powerful than OpenGL ES on iOS. It is very similar to the GPUImage framework, but offers better 3D rendering and multithreaded computing abilities.
Stars: ✭ 207 (-68.78%)
Mutual labels:  metal, video-processing
Metalpetal
A GPU accelerated image and video processing framework built on Metal.
Stars: ✭ 907 (+36.8%)
Mutual labels:  metal, video-processing
Agimagecontrols
Cool tools for image editing.
Stars: ✭ 217 (-67.27%)
Mutual labels:  metal, avfoundation
VideoProcessingLibrary
The easiest library for video processing
Stars: ✭ 52 (-92.16%)
Mutual labels:  video-processing
FYCachedURLAsset
Enhanced AVURLAsset with seamless cache layer
Stars: ✭ 18 (-97.29%)
Mutual labels:  avfoundation
kagefunc
A collection of VapourSynth functions. kagefunc.py is the only relevant file for users.
Stars: ✭ 44 (-93.36%)
Mutual labels:  video-processing
Provision
Digital Rebar Provision is a simple and powerful Golang executable that provides a complete API-driven DHCP/PXE/TFTP provisioning system.
Stars: ✭ 252 (-61.99%)
Mutual labels:  metal
generativepy
Library for creating generative art and maths animations
Stars: ✭ 70 (-89.44%)
Mutual labels:  video-processing
mini-lab
A small, virtual setup to run the metal-stack locally.
Stars: ✭ 50 (-92.46%)
Mutual labels:  metal
MetalCity
MetalCity - a procedural night city landscape generator
Stars: ✭ 29 (-95.63%)
Mutual labels:  metal
ImageEnhanceViaFusion
A Java implementation of underwater image and video enhancement by fusion.
Stars: ✭ 58 (-91.25%)
Mutual labels:  video-processing
Holo
A dummy camera that works on the simulator without changes.
Stars: ✭ 55 (-91.7%)
Mutual labels:  avfoundation
NabaztagHackKit
A simple SDK to get your hands dirty with Nabaztag
Stars: ✭ 28 (-95.78%)
Mutual labels:  metal
Video-Stabilization-and-image-mosaicing
Video stabilization: stabilizes videos taken with a wavering camera. Image mosaicing: stitches multiple overlapping snapshots of a video together to produce one large image.
Stars: ✭ 16 (-97.59%)
Mutual labels:  video-processing
Cocos2d X
Cocos2d-x is a suite of open-source, cross-platform, game-development tools used by millions of developers all over the world.
Stars: ✭ 15,713 (+2269.98%)
Mutual labels:  metal
Logodetect
Find logos in images and videos in just one shot. Never be embarrassed again to say that you have a small data situation!
Stars: ✭ 41 (-93.82%)
Mutual labels:  video-processing
Mg
C# Vulkan interface/polyfill for Windows and macOS.
Stars: ✭ 19 (-97.13%)
Mutual labels:  metal

VideoLab

README (Chinese version) · Introduction to the framework design and implementation

High-performance and flexible video editing and effects framework, based on AVFoundation and Metal.

Framework design and implementation

Features

  • High-performance real-time video editing and exporting.
  • Highly flexible combination of video, image, and audio.
  • Support audio pitch setting and volume adjustment.
  • Support CALayer vector animations, enabling complex text animations.
  • Support keyframe animation.
  • Support After Effects-like pre-compose.
  • Support transitions.
  • Support custom effects, such as LUT filters, zoom blur, etc.

The following are some GIFs of the features (multiple layers, text animation, keyframe animation, pre-compose, and transition).

Requirements

  • iOS 11.0+
  • Swift 5.0+

Installation

VideoLab is available through CocoaPods. Specify the following in your Podfile:

source 'https://github.com/CocoaPods/Specs.git'
platform :ios, '11.0'
use_frameworks!

target '<Your Target>' do
  pod 'VideoLab'
end
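
Then run pod install; the framework should be importable in your sources as import VideoLab (the module name matches the pod name).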

Usage

Basic Concept

RenderLayer

RenderLayer is the most basic unit in the VideoLab framework. A video, image, or audio clip can be a RenderLayer, and even a standalone effect can be a RenderLayer. RenderLayer is similar to the concept of a layer in After Effects.

RenderComposition

RenderComposition works as a composite: it can set the frame rate and canvas size, contains multiple RenderLayers, and can take a CALayer to support vector animations.

VideoLab

VideoLab can be considered a lab in which an AVPlayerItem, AVAssetExportSession, or AVAssetImageGenerator can be generated according to a RenderComposition.

Basic Usage

// 1. Layer 1
var url = Bundle.main.url(forResource: "video1", withExtension: "MOV")
var asset = AVAsset(url: url!)
var source = AVAssetSource(asset: asset)
source.selectedTimeRange = CMTimeRange(start: CMTime.zero, duration: asset.duration)
var timeRange = source.selectedTimeRange
let renderLayer1 = RenderLayer(timeRange: timeRange, source: source)
    
// 1. Layer 2
url = Bundle.main.url(forResource: "video2", withExtension: "MOV")
asset = AVAsset(url: url!)
source = AVAssetSource(asset: asset)
source.selectedTimeRange = CMTimeRange(start: CMTime.zero, duration: asset.duration)
timeRange = source.selectedTimeRange
timeRange.start = CMTimeRangeGetEnd(renderLayer1.timeRange)
let renderLayer2 = RenderLayer(timeRange: timeRange, source: source)
    
// 2. Composition
let composition = RenderComposition()
composition.renderSize = CGSize(width: 1280, height: 720)
composition.layers = [renderLayer1, renderLayer2]

// 3. VideoLab
let videoLab = VideoLab(renderComposition: composition)

// 4. Make playerItem
let playerItem = videoLab.makePlayerItem()
  1. Create RenderLayers
  2. Create a RenderComposition, setting its renderSize and layers
  3. Create a VideoLab with the renderComposition
  4. Make an AVPlayerItem or AVAssetExportSession (a playback sketch follows below)
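
The AVPlayerItem from step 4 can be previewed with a regular AVPlayer. A minimal playback sketch using only standard AVFoundation APIs (playerView is an assumed UIView in your view hierarchy):

import AVFoundation
import UIKit

// Preview the composition with a standard AVPlayer.
let playerItem = videoLab.makePlayerItem()
let player = AVPlayer(playerItem: playerItem)

// playerView is assumed to be a UIView already in your view hierarchy.
let playerLayer = AVPlayerLayer(player: player)
playerLayer.frame = playerView.bounds
playerView.layer.addSublayer(playerLayer)

player.play()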

More Advanced Usage

Transform

var center = CGPoint(x: 0.25, y: 0.25)
var transform = Transform(center: center, rotation: 0, scale: 0.5)
renderLayer1.transform = transform
  1. Create a Transform with center, rotation, and scale
  2. Set the transform on the RenderLayer

Audio Configuration

let audioConfiguration = AudioConfiguration()
let volumeRampTimeRange = CMTimeRange(start: CMTime.zero, duration: CMTime(seconds: 5, preferredTimescale: 600))
let volumeRamp1 = VolumeRamp(startVolume: 0.0, endVolume: 0.0, timeRange: volumeRampTimeRange)
audioConfiguration.volumeRamps = [volumeRamp1]
renderLayer2.audioConfiguration = audioConfiguration
  1. Create an AudioConfiguration
  2. Create a VolumeRamp with startVolume, endVolume, and timeRange
  3. Set volumeRamps on the AudioConfiguration
  4. Set the audioConfiguration on the RenderLayer

CALayer Animation

For exporting, set your customized CALayer on the RenderComposition:

composition.animationLayer = <Your customized CALayer>

For playback, add an AVSynchronizedLayer to your view's layer. See more detail in the Text Animation Demo.
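
As a minimal sketch of the playback case (playerItem comes from videoLab.makePlayerItem(), and animatedTextLayer is an assumed CALayer carrying your CAAnimations):

import AVFoundation

// Animations hosted by an AVSynchronizedLayer are interpreted in the
// player item's timeline; use AVCoreAnimationBeginTimeAtZero instead of 0
// for their beginTime.
let syncLayer = AVSynchronizedLayer(playerItem: playerItem)
syncLayer.frame = playerView.bounds
syncLayer.addSublayer(animatedTextLayer)
playerView.layer.addSublayer(syncLayer)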

Keyframe Animation

// 1. Keyframe animation
let keyTimes = [CMTime(seconds: 2, preferredTimescale: 600),
                CMTime(seconds: 4, preferredTimescale: 600),
                CMTime(seconds: 6, preferredTimescale: 600)]
let animation = KeyframeAnimation(keyPath: "blendOpacity",
                                  values: [1.0, 0.2, 1.0],
                                  keyTimes: keyTimes, timingFunctions: [.linear, .linear])
renderLayer1.animations = [animation]
    
var transform = Transform.identity
let animation1 = KeyframeAnimation(keyPath: "scale",
                                   values: [1.0, 1.3, 1.0],
                                   keyTimes: keyTimes, timingFunctions: [.quadraticEaseInOut, .quadraticEaseInOut])
let animation2 = KeyframeAnimation(keyPath: "rotation",
                                   values: [0, Float.pi / 2.0, 0],
                                   keyTimes: keyTimes, timingFunctions: [.quadraticEaseInOut, .quadraticEaseInOut])
transform.animations = [animation1, animation2]
renderLayer1.transform = transform
  1. Create a KeyframeAnimation with keyPath, values, keyTimes, and timingFunctions
  2. Set animations on a struct or class that implements the Animatable protocol (e.g. the Transform struct or the RenderLayer class)

RenderLayerGroup (After Effects-like pre-compose)

let layerGroup = RenderLayerGroup(timeRange: timeRange)
layerGroup.layers = [renderLayer1, renderLayer2]
  1. Create a RenderLayerGroup with a timeRange (one way to derive it is sketched below)
  2. Set sub-layers for the layerGroup. See more detail in the Layer Group Demo.
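
One way to derive the group's timeRange, assuming the two layers from the basic usage example, is to take the union of their time ranges (an illustrative sketch, not the only option):

// Cover both sub-layers by taking the union of their time ranges.
let groupTimeRange = renderLayer1.timeRange.union(renderLayer2.timeRange)
let layerGroup = RenderLayerGroup(timeRange: groupTimeRange)
layerGroup.layers = [renderLayer1, renderLayer2]

// The group then behaves like a single layer in the composition.
composition.layers = [layerGroup]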

Transition

We don't have a transition layer; instead, you can add a transform or operations to each RenderLayer to create a transition (see the sketch below). See more detail in the Transition Demo.
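
For example, a simple cross-fade could be built from the keyframe animation API shown above, fading one layer's blendOpacity out while the next layer's fades in over the interval where the two layers overlap (the time values below are illustrative):

// Overlap interval shared by the outgoing and incoming layers.
let fadeTimes = [CMTime(seconds: 4, preferredTimescale: 600),
                 CMTime(seconds: 5, preferredTimescale: 600)]

// Fade the outgoing layer out...
let fadeOut = KeyframeAnimation(keyPath: "blendOpacity",
                                values: [1.0, 0.0],
                                keyTimes: fadeTimes, timingFunctions: [.linear])
renderLayer1.animations = [fadeOut]

// ...while the incoming layer fades in over the same interval.
let fadeIn = KeyframeAnimation(keyPath: "blendOpacity",
                               values: [0.0, 1.0],
                               keyTimes: fadeTimes, timingFunctions: [.linear])
renderLayer2.animations = [fadeIn]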

Custom Effects

// Filter
var filter = LookupFilter()
filter.addTexture(lutTextures[0], at: 0)
renderLayer.operations = [filter]

// Zoom Blur
var zoomblur = ZoomBlur()
let animation = KeyframeAnimation(keyPath: "blurSize",
                                  values: [0.0, 3.0],
                                  keyTimes: keyTimes, timingFunctions: [.quarticEaseOut])
zoomblur.animations = [animation]
layerGroup1.operations = [zoomblur]
  1. Create a custom Operation inherited from BasicOperation; BasicOperation also conforms to the Animatable protocol
  2. Set operations on the RenderLayer.

TODO

  • Support OpenGL rendering.
  • Add speed adjustment for RenderLayer.
  • Provide a more convenient way to use transitions, possibly by providing a TransitionLayer.
  • Add a logging system.

Author

License

VideoLab is available under the MIT license. See the LICENSE file for more info.

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].