Camera and Microphone streaming library via RTMP, HLS for iOS, macOS, tvOS.

Overview

HaishinKit (formerly lf)


  • If you understand Japanese, please write issues in Japanese!

Sponsored with 💖 by
Stream Chat
Enterprise Grade APIs for Feeds & Chat. Try the iOS Chat tutorial 💬

Communication

  • If you need help making live-streaming requests with HaishinKit, open a GitHub issue using the Bug report template.
    • The trace-level log is very useful. Please set Logboard.with(HaishinKitIdentifier).level = .trace.
    • If you don't use an issue template, I will immediately close your issue without comment.
  • If you'd like to discuss a feature request, open a GitHub issue using the Feature request template.
  • If you want email-based support without a GitHub issue:
    • The consulting fee is $50 per incident. I am able to respond within a few days.
  • If you want to contribute, submit a pull request!

Features

RTMP

  • Authentication
  • Publish and Recording (H264/AAC)
  • Playback (Beta)
  • Adaptive bitrate streaming
    • Handling (see also #126)
    • Automatic frame dropping
  • Action Message Format
    • AMF0
    • AMF3
  • SharedObject
  • RTMPS
    • Native (RTMP over SSL/TLS)
    • Tunneled (RTMPT over SSL/TLS) (Technical Preview)
  • RTMPT (Technical Preview)
  • ReplayKit Live as a Broadcast Upload Extension (Technical Preview)
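Choosing a new bitrate for adaptive streaming is left to the app. A minimal step-down policy sketch (the nextBitrate helper and its thresholds are illustrations, not part of HaishinKit's API):

```swift
import Foundation

// Hypothetical bitrate policy: halve on congestion, recover by 25% otherwise,
// clamped between a floor and a ceiling (all values are assumptions).
func nextBitrate(current: Int, congested: Bool,
                 floor: Int = 160_000, ceiling: Int = 2_000_000) -> Int {
    if congested {
        return max(current / 2, floor)
    }
    return min(current + current / 4, ceiling)
}

// Hypothetical usage with the rtmpStream shown in the usage section:
// rtmpStream.videoSettings = [.bitrate: nextBitrate(current: 640_000, congested: true)]
```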

HLS

  • HTTPService
  • HLS Publish

Rendering

               HKView                      GLHKView   MTHKView
  Engine       AVCaptureVideoPreviewLayer  OpenGL ES  Metal
  Publish      ○                           ○          ○
  Playback     ×                           ○          ○
  VisualEffect ×                           ○          ○
  Condition    Stable                      Stable     Beta

Others

  • Support tvOS 10.2+ (Technical Preview)
    • tvOS can't publish camera or microphone input; the playback feature is available.
  • Hardware acceleration for H264 video encoding, AAC audio encoding
  • Support "Allow app extension API only" option
  • Support GPUImage framework (~> 0.5.12)
  • Objective-C Bridging

Requirements

  Version  iOS   OSX     tvOS   Xcode  Swift  CocoaPods  Carthage
  1.1.0+   9.0+  10.11+  10.2+  12.0+  5.0+   1.5.0+     0.29.0+
  1.0.0+   8.0+  10.11+  10.2+  11.0+  5.0+   1.5.0+     0.29.0+
  0.11.0+  8.0+  10.11+  10.2+  10.0+  5.0    1.5.0+     0.29.0+

Cocoa Keys

Please add the following entries to your Info.plist.

iOS 10.0+

  • NSMicrophoneUsageDescription
  • NSCameraUsageDescription

macOS 10.14+

  • NSMicrophoneUsageDescription
  • NSCameraUsageDescription
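As a plist fragment, these keys might look like the following (the description strings are examples; write your own):

```xml
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to capture video for live streaming.</string>
<key>NSMicrophoneUsageDescription</key>
<string>This app uses the microphone to capture audio for live streaming.</string>
```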

Installation

Please set up your project with Swift 5.3.

CocoaPods

source 'https://github.com/CocoaPods/Specs.git'
use_frameworks!

def import_pods
    pod 'HaishinKit', '~> 1.1.4'
end

target 'Your Target'  do
    platform :ios, '9.0'
    import_pods
end

Carthage

github "shogo4405/HaishinKit.swift" ~> 1.1.4

Swift Package Manager

https://github.com/shogo4405/HaishinKit.swift
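The dependency can also be declared in a Package.swift manifest; a sketch (the version, product, and target names are assumptions — adjust to your project):

```swift
// swift-tools-version:5.3
import PackageDescription

let package = Package(
    name: "MyApp",
    dependencies: [
        .package(url: "https://github.com/shogo4405/HaishinKit.swift.git", from: "1.1.4")
    ],
    targets: [
        .target(
            name: "MyApp",
            dependencies: [.product(name: "HaishinKit", package: "HaishinKit.swift")]
        )
    ]
)
```

In Xcode you can instead use File > Swift Packages > Add Package Dependency and paste the URL above.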

License

BSD-3-Clause

Donation

Paypal

Bitcoin

3FnjC3CmwFLTzNY5WPNz4LjTo1uxGNozUR

Prerequisites

Make sure you set up and activate your AVAudioSession.

import AVFoundation
let session = AVAudioSession.sharedInstance()
do {
    // https://stackoverflow.com/questions/51010390/avaudiosession-setcategory-swift-4-2-ios-12-play-sound-on-silent
    if #available(iOS 10.0, *) {
        try session.setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker, .allowBluetooth])
    } else {
        session.perform(NSSelectorFromString("setCategory:withOptions:error:"), with: AVAudioSession.Category.playAndRecord, with: [
            AVAudioSession.CategoryOptions.allowBluetooth,
            AVAudioSession.CategoryOptions.defaultToSpeaker]
        )
        try session.setMode(.default)
    }
    try session.setActive(true)
} catch {
    print(error)
}

RTMP Usage

Real Time Messaging Protocol (RTMP).

let rtmpConnection = RTMPConnection()
let rtmpStream = RTMPStream(connection: rtmpConnection)
rtmpStream.attachAudio(AVCaptureDevice.default(for: AVMediaType.audio)) { error in
    // print(error)
}
rtmpStream.attachCamera(DeviceUtil.device(withPosition: .back)) { error in
    // print(error)
}

let hkView = HKView(frame: view.bounds)
hkView.videoGravity = AVLayerVideoGravity.resizeAspectFill
hkView.attachStream(rtmpStream)

// add ViewController#view
view.addSubview(hkView)

rtmpConnection.connect("rtmp://localhost/appName/instanceName")
rtmpStream.publish("streamName")
// if you want to record a stream.
// rtmpStream.publish("streamName", type: .localRecord)

RTMP URL Format

  • rtmp://server-ip-address[:port]/application/[appInstance]/[prefix:[path1[/path2/]]]streamName
    • Square brackets [] denote optional parts.
    rtmpConnection.connect("rtmp://server-ip-address[:port]/application/[appInstance]")
    rtmpStream.publish("[prefix:[path1[/path2/]]]streamName")
    
  • rtmp://localhost/live/streamName
    rtmpConnection.connect("rtmp://localhost/live")
    rtmpStream.publish("streamName")
    

Settings

var rtmpStream = RTMPStream(connection: rtmpConnection)

rtmpStream.captureSettings = [
    .fps: 30, // FPS
    .sessionPreset: AVCaptureSession.Preset.medium, // input video width/height
    // .isVideoMirrored: false,
    // .continuousAutofocus: false, // use camera autofocus mode
    // .continuousExposure: false, //  use camera exposure mode
    // .preferredVideoStabilizationMode: AVCaptureVideoStabilizationMode.auto
]
rtmpStream.audioSettings = [
    .muted: false, // mute audio
    .bitrate: 32 * 1000,
]
rtmpStream.videoSettings = [
    .width: 640, // video output width
    .height: 360, // video output height
    .bitrate: 160 * 1000, // video output bitrate
    .profileLevel: kVTProfileLevel_H264_Baseline_3_1, // H264 Profile require "import VideoToolbox"
    .maxKeyFrameIntervalDuration: 2, // key frame / sec
]
// "0" means the same as the input
rtmpStream.recorderSettings = [
    AVMediaType.audio: [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
        AVSampleRateKey: 0,
        AVNumberOfChannelsKey: 0,
        // AVEncoderBitRateKey: 128000,
    ],
    AVMediaType.video: [
        AVVideoCodecKey: AVVideoCodecH264,
        AVVideoHeightKey: 0,
        AVVideoWidthKey: 0,
        /*
        AVVideoCompressionPropertiesKey: [
            AVVideoMaxKeyFrameIntervalDurationKey: 2,
            AVVideoProfileLevelKey: AVVideoProfileLevelH264Baseline30,
            AVVideoAverageBitRateKey: 512000
        ]
        */
    ],
]

// Set the 2nd argument to false
rtmpStream.attachAudio(AVCaptureDevice.default(for: AVMediaType.audio), automaticallyConfiguresApplicationAudioSession: false)

Authentication

var rtmpConnection = RTMPConnection()
rtmpConnection.connect("rtmp://username:[email protected]/appName/instanceName")

Screen Capture

// iOS
rtmpStream.attachScreen(ScreenCaptureSession(shared: UIApplication.shared))
// macOS
rtmpStream.attachScreen(AVCaptureScreenInput(displayID: CGMainDisplayID()))

HTTP Usage

HTTP Live Streaming (HLS). Your iPhone/Mac becomes an IP camera. A basic snippet. You can view the stream at http://ip.address:8080/hello/playlist.m3u8

var httpStream = HTTPStream()
httpStream.attachCamera(DeviceUtil.device(withPosition: .back))
httpStream.attachAudio(AVCaptureDevice.default(for: AVMediaType.audio))
httpStream.publish("hello")

var hkView = HKView(frame: view.bounds)
hkView.attachStream(httpStream)

var httpService = HLSService(domain: "", type: "_http._tcp", name: "HaishinKit", port: 8080)
httpService.startRunning()
httpService.addHTTPStream(httpStream)

// add ViewController#view
view.addSubview(hkView)

FAQ

How can I run the example project?

git clone https://github.com/shogo4405/HaishinKit.swift.git
cd HaishinKit.swift

carthage bootstrap --use-xcframeworks

open HaishinKit.xcodeproj
