React Native RTMP live stream client. Made with ♥ by api.video

Home Page: https://api.video

License: MIT License


api.video-reactnative-live-stream's Introduction


React Native RTMP live stream client

api.video is the video infrastructure for product builders. Lightning fast video APIs for integrating, scaling, and managing on-demand & low latency live streaming features in your app.

Project description

This module lets you broadcast an RTMP live stream from your smartphone camera.

Getting started

⚠️ The React Native Live Stream SDK is designed for React Native 0.69.1. Using the SDK with a React Native version above 0.69.1 can cause unexpected behaviour.

Installation

npm install @api.video/react-native-livestream

or

yarn add @api.video/react-native-livestream

Note: if you are on iOS, there are two extra steps:

  1. Install the native dependencies with CocoaPods:
cd ios && pod install
  2. This project contains Swift code. If it is your first dependency with Swift code, you need to create an empty Swift file in your project (with the bridging header) from Xcode. Find out how to do that.

Permissions

To be able to broadcast, you must:

  1. On Android: ask for internet, camera and microphone permissions:
<manifest>
  <uses-permission android:name="android.permission.INTERNET" />
  <uses-permission android:name="android.permission.RECORD_AUDIO" />
  <uses-permission android:name="android.permission.CAMERA" />
</manifest>

Your application must dynamically require android.permission.CAMERA and android.permission.RECORD_AUDIO.

  2. On iOS: update Info.plist with a usage description for camera and microphone:
<key>NSCameraUsageDescription</key>
<string>Your own description of the purpose</string>

<key>NSMicrophoneUsageDescription</key>
<string>Your own description of the purpose</string>
  3. On React Native, you must handle the permission requests before starting your live stream. If permissions are not granted, you will not be able to broadcast.
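
The permission step above can be sketched as a small gate before starting the stream. This is a minimal sketch: the `ensurePermissions` helper and its injected `requestFn` are illustrative and not part of this SDK; in a real app, `requestFn` would wrap a platform permission API such as React Native's PermissionsAndroid.request.

```typescript
// Illustrative helper (not part of the SDK): require every permission
// to be granted before the live stream is started.
type PermissionResult = 'granted' | 'denied';

async function ensurePermissions(
  permissions: string[],
  requestFn: (permission: string) => Promise<PermissionResult>
): Promise<boolean> {
  for (const permission of permissions) {
    const result = await requestFn(permission);
    if (result !== 'granted') {
      return false; // missing any permission means we cannot broadcast
    }
  }
  return true;
}
```

Injecting `requestFn` keeps the sketch independent of any specific permission library; only call startStreaming when the helper resolves to true.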

Code sample

import React, { useRef, useState } from 'react';
import { View, TouchableOpacity } from 'react-native';
import { LiveStreamView } from '@api.video/react-native-livestream';

const App = () => {
  const ref = useRef(null);
  const [streaming, setStreaming] = useState(false);

  return (
    <View style={{ flex: 1, alignItems: 'center' }}>
      <LiveStreamView
        style={{ flex: 1, backgroundColor: 'black', alignSelf: 'stretch' }}
        ref={ref}
        camera="back"
        enablePinchedZoom={true}
        video={{
          fps: 30,
          resolution: '720p',
          bitrate: 2 * 1024 * 1024, // 2 Mbps
          gopDuration: 1, // 1 second
        }}
        audio={{
          bitrate: 128000,
          sampleRate: 44100,
          isStereo: true,
        }}
        isMuted={false}
        onConnectionSuccess={() => {
          //do what you want
        }}
        onConnectionFailed={(e) => {
          //do what you want
        }}
        onDisconnect={() => {
          //do what you want
        }}
      />
      <View style={{ position: 'absolute', bottom: 40 }}>
        <TouchableOpacity
          style={{
            borderRadius: 50,
            backgroundColor: streaming ? 'red' : 'white',
            width: 50,
            height: 50,
          }}
          onPress={() => {
            if (streaming) {
              ref.current?.stopStreaming();
              setStreaming(false);
            } else {
              ref.current?.startStreaming('YOUR_STREAM_KEY');
              setStreaming(true);
            }
          }}
        />
      </View>
    </View>
  );
}

export default App;

Documentation

Props & Methods

type LiveStreamProps = {
  // Styles for the view containing the preview
  style: ViewStyle;
  // camera facing orientation
  camera?: 'front' | 'back';
  video: {
    // frame rate
    fps: number;
    // resolution
    resolution: '240p' | '360p' | '480p' | '720p' | '1080p';
    // video bitrate, in bps. Depends on the resolution.
    bitrate: number;
    // duration between 2 key frames in seconds
    gopDuration: number;
  };
  audio: {
    // sample rate. Only for Android. Recommended: 44100
    sampleRate: 44100;
    // true for stereo, false for mono. Only for Android. Recommended: true
    isStereo: true;
    // audio bitrate. Recommended: 128000
    bitrate: number;
  };
  // Mute/unmute microphone
  isMuted: false;
  // Enables/disables the zoom gesture handled natively
  enablePinchedZoom?: boolean;
  // will be called when the connection is successful
  onConnectionSuccess?: () => void;
  // will be called when connection failed
  onConnectionFailed?: (code: string) => void;
  // will be called when the live-stream is stopped
  onDisconnect?: () => void;
};
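
As a sketch, the video prop shape above can also be checked at runtime before it reaches the view, which is useful when the config comes from untyped JavaScript or a remote source. The `validateVideoConfig` helper and its error messages are hypothetical, not part of the SDK:

```typescript
// Hypothetical runtime check of a video config against the documented
// prop types; returns a list of problems (empty means it looks valid).
const RESOLUTIONS = ['240p', '360p', '480p', '720p', '1080p'] as const;
type Resolution = (typeof RESOLUTIONS)[number];

interface VideoConfig {
  fps: number;
  resolution: Resolution;
  bitrate: number;      // in bps
  gopDuration: number;  // in seconds
}

function validateVideoConfig(config: VideoConfig): string[] {
  const errors: string[] = [];
  if (!RESOLUTIONS.includes(config.resolution)) {
    errors.push(`unsupported resolution: ${config.resolution}`);
  }
  if (config.fps <= 0) errors.push('fps must be positive');
  if (config.bitrate <= 0) errors.push('bitrate must be positive');
  if (config.gopDuration <= 0) errors.push('gopDuration must be positive');
  return errors;
}
```

The runtime `includes` check is redundant for TypeScript callers but catches bad values coming from plain JavaScript or deserialized data.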

type LiveStreamMethods = {
  // Start the stream
  // streamKey: your live stream RTMP key
  // url: RTMP server url, default: rtmp://broadcast.api.video/s
  startStreaming: (streamKey: string, url?: string) => void;
  // Stops the stream
  stopStreaming: () => void;
  // Sets the zoomRatio
  // Intended for use with React Native Gesture Handler, a slider or similar.
  setZoomRatio: (zoomRatio) => void;
};
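
Since setZoomRatio is intended to be fed from a gesture handler or slider, it can help to clamp the incoming value before passing it on. A minimal sketch; the default 1 to 4 bounds are illustrative assumptions, not SDK constants:

```typescript
// Illustrative clamp for zoom values coming from a slider or pinch
// gesture before they are handed to setZoomRatio.
function clampZoomRatio(value: number, min = 1, max = 4): number {
  return Math.min(max, Math.max(min, value));
}
```

For example: `ref.current?.setZoomRatio(clampZoomRatio(gestureScale))`.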

Example App

You can try our example app; feel free to test it.

Setup

Be sure to follow the React Native installation steps before anything else.

  1. Open a new terminal
  2. Clone the repository and go into it
git clone https://github.com/apivideo/api.video-reactnative-live-stream.git livestream_example_app && cd livestream_example_app

Android

Install the packages and launch the application

yarn && yarn example android

iOS

  1. Install the packages
yarn install
  2. Go into example/ios and install the Pods
cd example/ios && pod install
  3. Sign your application

Open Xcode, click on "Open a project or file" and open the Example.xcworkspace file.
You can find it in YOUR_PROJECT_NAME/example/ios.
Click on Example, go to the Signing & Capabilities tab, add your team and create a unique bundle identifier.

  4. Launch the application from the root of your project
yarn example ios

Plugins

The api.video live stream library uses external native libraries for broadcasting:

  • StreamPack (Android)
  • HaishinKit (iOS)

FAQ

If you have any questions, ask us on https://community.api.video or open an issue.

api.video-reactnative-live-stream's People

Contributors

anthony-dantard, arthemium, bluebazze, erikkai, mardi66, olivierapivideo, pfcodes, robwalkerco, romainpetit1, thibaultbee, titozzz


api.video-reactnative-live-stream's Issues

[Bug]: Bitrate is not set correctly on initial render

Version

v1.2.3

Which operating systems have you used?

  • Android
  • iOS

Environment that reproduces the issue

Samsung Galaxy S20 - Android 13 - wrong initial bitrate
Samsung Galaxy S9 - Android 10 - works fine

Is it reproducible in the example application?

Yes

RTMP Server

restream.io

Reproduction steps

  1. Start streaming with some bitrate (e.g. 0.5 Mbps)
  2. On any page that shows the bitrate (e.g. restream.io), the bitrate will not match
  3. Change the bitrate in the video params object via code while streaming is active, or with any controls
  4. The bitrate will change to a valid one (e.g. 1 Mbps)

Expected result

Bitrate is accepted

Actual result

For params below streaming is started with 2.6Mbps

{
  fps: 30,
  resolution: '720p',
  bitrate: 1 * 1024 * 1024, // 1 Mbps
}

Additional context

Works fine on S9 with Android 10.

Relevant logs output

No response

audiosamplerate not included on iOS

Describe the bug
The onMetaData from iOS has audiodatarate but not audiosamplerate (using the info parsed in #47 )

To Reproduce
Steps to reproduce the behavior:

  1. Send video with iOS
  2. Parse onMetaData message
  3. Observe audiodatarate is included but audiosamplerate is not

Expected behavior
audiosamplerate is also included with audiodatarate

Smartphone (please complete the following information):

  • Device: iPhone 14 Pro Max
  • OS: 16.1.2
  • Browser: Safari
  • Version: 16.1

Additional context
Knowing the sample rate of the audio allows proper handling of the audio stream that's coming in

App sometimes crashes on both iOS and Android.

Describe the bug
I have a screen where I use the component like this:

<LiveStreamView
  isMuted={false}
  video={{
    fps: 30,
    resolution: '720p',
    bitrate: 2 * 1024 * 1024, // 2 Mbps
  }}
  audio={{
    bitrate: 128000,
    sampleRate: 44100,
    isStereo: true,
  }}
  ref={streamRef}
  style={styles.stream}
  camera={isFront ? 'front' : 'back'}
/>

This is the startStream function I use inside useEffect to get a key from the backend and then call startStreaming():

const startStream = () => {
  liveStreamServices
    .startStream()
    .then(res => res.json())
    .then(data => {
      const stream = data.data;
      streamRef.current?.startStreaming(stream.key, LIVESTREAM_URL);
    });
};

useEffect(() => {
  startStream();
  IdleTimerManager.setIdleTimerDisabled(true);
  return () => onUnmount();
}, []);

Rarely, the app crashes as soon as I get to this screen. I don't know why it happens, and I am not doing anything special when it does, so I can't tell how to reproduce it. I am hoping you can work out from the logs what could be going wrong.

Here are the logs from firebase crashlytics:

Crashed: com.apple.main-thread
0 klipster 0x123ed8 specialized AVVideoIOUnit.output.getter + 251 (AVVideoIOUnit.swift:251)
1 klipster 0x118bd4 AVMixer.preferredVideoStabilizationMode.setter + 236 (AVVideoIOUnit.swift:236)
2 klipster 0x11869c key path setter for AVMixer.preferredVideoStabilizationMode : AVMixer + 4309403292 (:4309403292)
3 libswiftCore.dylib 0x1786d0 NonmutatingWritebackBuffer.__deallocating_deinit + 324
4 libswiftCore.dylib 0x3da1d4 _swift_release_dealloc + 56
5 libswiftCore.dylib 0x17a7f8 swift_setAtReferenceWritableKeyPath + 276
6 klipster 0x111658 specialized Setting.subscript.setter + 41 (Setting.swift:41)
7 klipster 0x110ad8 specialized Setting.observer.didset + 4309371608
8 klipster 0x15af5c NetStream.captureSettings.setter + 4309675868 (:4309675868)
9 klipster 0x9160 ApiVideoLiveStream.prepareVideo() + 4308291936 (:4308291936)
10 klipster 0x875c ApiVideoLiveStream.init(initialAudioConfig:initialVideoConfig:preview:) + 145 (ApiVideoLiveStream.swift:145)
11 klipster 0x402c18 ReactNativeLiveStreamView.audioConfig.didset + 57 (ReactNativeLiveStreamView.swift:57)
12 klipster 0x402f30 ReactNativeLiveStreamView.audio.didset + 90 (ReactNativeLiveStreamView.swift:90)
13 klipster 0x402fcc @objc ReactNativeLiveStreamView.audio.setter + 1826316
14 klipster 0x24d464 __49-[RCTComponentData createPropBlock:isShadowView:]_block_invoke.101 + 315 (RCTComponentData.m:315)
15 klipster 0x24d618 __49-[RCTComponentData createPropBlock:isShadowView:]_block_invoke_2.102 + 333 (RCTComponentData.m:333)
16 klipster 0x24d878 __37-[RCTComponentData setProps:forView:]_block_invoke + 371 (RCTComponentData.m:371)
17 CoreFoundation 0x13948 NSDICTIONARY_IS_CALLING_OUT_TO_A_BLOCK + 24
18 CoreFoundation 0x9ded0 -[__NSDictionaryM enumerateKeysAndObjectsWithOptions:usingBlock:] + 212
19 klipster 0x24d7f4 -[RCTComponentData setProps:forView:] + 373 (RCTComponentData.m:373)
20 klipster 0x2a1fdc __44-[RCTUIManager flushUIBlocksWithCompletion:]_block_invoke + 1199 (RCTUIManager.m:1199)
21 klipster 0x2a20d4 __44-[RCTUIManager flushUIBlocksWithCompletion:]_block_invoke.426 + 1222 (RCTUIManager.m:1222)
22 libdispatch.dylib 0x24b4 _dispatch_call_block_and_release + 32
23 libdispatch.dylib 0x3fdc _dispatch_client_callout + 20
24 libdispatch.dylib 0x127f4 _dispatch_main_queue_drain + 928
25 libdispatch.dylib 0x12444 _dispatch_main_queue_callback_4CF + 44
26 CoreFoundation 0x9aa08 CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE + 16
27 CoreFoundation 0x7c368 __CFRunLoopRun + 2036
28 CoreFoundation 0x811e4 CFRunLoopRunSpecific + 612
29 GraphicsServices 0x1368 GSEventRunModal + 164
30 UIKitCore 0x3a2d88 -[UIApplication _run] + 888
31 UIKitCore 0x3a29ec UIApplicationMain + 340
32 klipster 0x75e8 main + 7 (main.m:7)
33 ??? 0x1b3db1948 (Missing)

Add Data message before streaming video

Is your feature request related to a problem? Please describe.
Currently, there's no information defining the properties of the video, and a Data message can provide that.

Describe the solution you'd like
Send a Data message with relevant info before streaming the video

Describe alternatives you've considered
Not sure there's an alternative to adding this (other than just leaving things as is)

Additional context
Here's an example of the info that ffmpeg sends when you send a file to an RTMP server:

{
    "duration": 0.0,
    "width": 1920.0,
    "height": 1080.0,
    "videodatarate": 0.0,
    "framerate": 24.0,
    "videocodecid": 7.0,
    "audiodatarate": 159.7607421875,
    "audiosamplerate": 44100.0,
    "audiosamplesize": 16.0,
    "stereo": true,
    "audiocodecid": 10.0,
    "major_brand": "qt  ",
    "minor_version": "0",
    "compatible_brands": "qt  ",
    "com.apple.quicktime.creationdate": "2022-04-12T20:24:15-0400",
    "com.apple.quicktime.make": "Apple",
    "com.apple.quicktime.model": "iPhone XR",
    "com.apple.quicktime.software": "15.3.1",
    "encoder": "Lavf59.27.100",
    "filesize": 0.0
}

Defaulting with front camera is very slow to start

Describe the bug
As titled, if I set the default camera to front, it is much slower to start than defaulting to the back camera.

To Reproduce
Steps to reproduce the behavior:

  1. Set the broadcast screen as part of Stack Navigation
  2. Set the default to front camera const [camera, setCamera] = React.useState<'front' | 'back'>('front');
  3. Navigate to the broadcast screen

Expected behavior
The screen should open at least as fast as defaulting to the back camera.

Screenshots
If applicable, add screenshots to help explain your problem.

Smartphone (please complete the following information):

  • Device: iPhone 13 Pro
  • OS: iOS 15.x
  • Version 1.2.1

Additional context
Add any other context about the problem here.

Android: crashes on nativeClose on .stopStreaming

Describe the bug
When using react navigation.
Having a broadcaster page with the live stream view.
Navigating away from this page causes the entire app to crash.

When navigating away from the broadcaster page.
React navigation will unmount that page and destroy the components on that page.

Crashlog
https://gist.github.com/BlueBazze/b3dc08e342b732cd5b78af4c59c2d3b1

Reproduction
Reproduced with the example app
https://github.com/BlueBazze/api.video-reactnative-live-stream

To cause the crash

  1. Open app
  2. Press "Go to broadcaster"
  3. Press the back button in the app bar

Video
https://drive.google.com/file/d/1uOoys-dEo5siEGvJx3yvyahj195arbM7/view?usp=sharing

Smartphone (please complete the following information):

  • Device: ["Samsung A22", "Huawei p20 lite"]
  • OS: ["Android 12", "Android 9"]

Additional context
You can hook into the lifecycle of react with the following
https://gist.github.com/BlueBazze/f371f26f21fefdbf1a9251038d727fe1

I wouldn't know how to reproduce this with a native app.

Android: Failed to createCodec on .stopStreaming()

Describe the bug
Using the example code in the readme, streaming works fine but as soon as I stop recording the app crashes.

To Reproduce
Steps to reproduce the behavior:

  1. Use example code
  2. Press the record button to stop streaming

Expected behavior
The app should not crash

Smartphone (please complete the following information):

  • Device: Realme 8
  • OS: 12

Additional context
I'm using Expo. Starting the stream consistently works, but as soon as I stop it, it crashes. The majority of the time it crashes the first time I stop streaming; very occasionally I'll stop streaming and it'll be fine, then I start and stop again and it crashes.

Crash on Android if the livestream is muted. Happens whether you mute it midway or start from the muted state.

Describe the bug
Crashes when muting the livestream, only on Android.

To Reproduce
Steps to reproduce the behavior:

  1. Create a livestream view
  2. Start the stream in a muted state or mute it midway.

Expected behavior
Should not crash, and should continue streaming

Screenshots
If applicable, add screenshots to help explain your problem.

Smartphone (please complete the following information):

  • Device: Pixel 6
  • OS: Android 13(TP1A.221105.002)
  • React Native(0.70.6)
  • Version: 1.2.1

Additional context

FATAL EXCEPTION: AMediaCodecThread
Process: com.instaselllive, PID: 22006
java.lang.IndexOutOfBoundsException: off=0, len=4096 out of bounds (size=3584)
	at java.nio.Buffer.checkBounds(Buffer.java:587)
	at java.nio.DirectByteBuffer.put(DirectByteBuffer.java:304)
	at io.github.thibaultbee.streampack.internal.sources.AudioCapture.getFrame(AudioCapture.kt:134)
	at io.github.thibaultbee.streampack.streamers.bases.BaseStreamer$audioEncoderListener$1.onInputFrame(BaseStreamer.kt:94)
	at io.github.thibaultbee.streampack.internal.encoders.MediaCodecEncoder$encoderCallback$1.onInputBufferAvailable(MediaCodecEncoder.kt:138)
	at android.media.MediaCodec$EventHandler.handleCallback(MediaCodec.java:1824)
	at android.media.MediaCodec$EventHandler.handleMessage(MediaCodec.java:1752)
	at android.os.Handler.dispatchMessage(Handler.java:106)
	at android.os.Looper.loopOnce(Looper.java:201)
	at android.os.Looper.loop(Looper.java:288)
	at android.os.HandlerThread.run(HandlerThread.java:67)

App crash on Android phones when navigate back from camera screen

Describe the bug
App crash when I navigate to another screen from the camera screen(where this lib is used).

To Reproduce
Steps to reproduce the behavior:

  1. Open the camera screen
  2. Press back
  3. And it crashed

Expected behavior
It should not crash the app when moving between screens.

Crash logs
Here is the full crash logs details provided by the sentry:
https://gist.github.com/tarun-showday/62bbd2cbc4b938440808928b02979d87

Smartphone (please complete the following information):

  • Device: Mi A1
  • OS: Android 9

Additional context
It works on some Android devices, but on the devices I have mentioned, and also on the Vivo Y1, it crashes all the time.

Failed to create codec for: {mime=video/avc, width=720, height=1280}

Describe the bug
README example failing on Samsung Galaxy J7 Prime device (Android).

Error while updating property 'video' of a view managed by: ReactNativeLiveStreamView

null

Failed to create codec for: {mime=video/avc, width=720, height=1280}

[stack trace]

To Reproduce
Steps to reproduce the behavior:

  1. Try to run the README example

Expected behavior
The app should not crash. Maybe handle unsupported codecs on devices gracefully?

Smartphone (please complete the following information):

  • Device: Samsung Galaxy J7 Prime
  • OS: Android
  • Version 1.2.1

Additional context
I packaged react-native-livestream into custom Expo dev-client.

java.security.InvalidParameterException: Failed to create codec for: {mime=video/avc, width=1080, height=1920}

Describe the bug
Crashes abruptly as soon as the livestream screen is mounted.

To Reproduce
Steps to reproduce the behavior:

  1. Create a screen with <LivestreamView />
  2. Navigate to the screen

Expected behavior
Should not crash on the screen

Screenshots
If applicable, add screenshots to help explain your problem.

Smartphone (please complete the following information):

  • Device: Vivo V25
  • OS: Android 12
  • Version 1.2.1

Additional context

# Crashlytics - Stack trace
# Platform: android
# Version: 1.0 (1)
# Issue: 1eda585a3bdb2ccba46496e2e842614b
# Session: 638EC6A402E100015D3DD0EFEB972BB8_DNE_0_v2
# Date: Tue Dec 06 2022 10:08:08 GMT+0530 (India Standard Time)

Fatal Exception: com.facebook.react.bridge.JSApplicationIllegalArgumentException: Error while updating property 'video' of a view managed by: ReactNativeLiveStreamView
       at com.facebook.react.uimanager.ViewManagersPropertyCache$PropSetter.updateViewProp(ViewManagersPropertyCache.java:101)
       at com.facebook.react.uimanager.ViewManagerPropertyUpdater$FallbackViewManagerSetter.setProperty(ViewManagerPropertyUpdater.java:136)
       at com.facebook.react.uimanager.ViewManagerPropertyUpdater.updateProps(ViewManagerPropertyUpdater.java:56)
       at com.facebook.react.uimanager.ViewManager.updateProperties(ViewManager.java:86)`
       at com.facebook.react.uimanager.ViewManager.createViewInstance(ViewManager.java:188)
       at com.facebook.react.uimanager.ViewManager.createView(ViewManager.java:115)
       at com.facebook.react.uimanager.NativeViewHierarchyManager.createView(NativeViewHierarchyManager.java:281)
       at com.facebook.react.uimanager.UIViewOperationQueue$CreateViewOperation.execute(UIViewOperationQueue.java:194)
       at com.facebook.react.uimanager.UIViewOperationQueue$DispatchUIFrameCallback.dispatchPendingNonBatchedOperations(UIViewOperationQueue.java:1110)
       at com.facebook.react.uimanager.UIViewOperationQueue$DispatchUIFrameCallback.doFrameGuarded(UIViewOperationQueue.java:1081)
       at com.facebook.react.uimanager.GuardedFrameCallback.doFrame(GuardedFrameCallback.java:29)
       at com.facebook.react.modules.core.ReactChoreographer$ReactChoreographerDispatcher.doFrame(ReactChoreographer.java:175)
       at com.facebook.react.modules.core.ChoreographerCompat$FrameCallback$1.doFrame(ChoreographerCompat.java:85)
       at android.view.Choreographer$CallbackRecord.run(Choreographer.java:1226)
       at android.view.Choreographer.doCallbacks(Choreographer.java:1012)
       at android.view.Choreographer.doFrame(Choreographer.java:917)
       at android.view.Choreographer$FrameDisplayEventReceiver.run(Choreographer.java:1213)
       at android.os.Handler.handleCallback(Handler.java:938)
       at android.os.Handler.dispatchMessage(Handler.java:99)
       at android.os.Looper.loopOnce(Looper.java:233)
       at android.os.Looper.loop(Looper.java:334)
       at android.app.ActivityThread.main(ActivityThread.java:8501)
       at java.lang.reflect.Method.invoke(Method.java)
       at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:582)
       at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1068)

Caused by java.lang.reflect.InvocationTargetException:
       at java.lang.reflect.Method.invoke(Method.java)
       at com.facebook.react.uimanager.ViewManagersPropertyCache$PropSetter.updateViewProp(ViewManagersPropertyCache.java:93)
       at com.facebook.react.uimanager.ViewManagerPropertyUpdater$FallbackViewManagerSetter.setProperty(ViewManagerPropertyUpdater.java:136)
       at com.facebook.react.uimanager.ViewManagerPropertyUpdater.updateProps(ViewManagerPropertyUpdater.java:56)
       at com.facebook.react.uimanager.ViewManager.updateProperties(ViewManager.java:86)
       at com.facebook.react.uimanager.ViewManager.createViewInstance(ViewManager.java:188)
       at com.facebook.react.uimanager.ViewManager.createView(ViewManager.java:115)
       at com.facebook.react.uimanager.NativeViewHierarchyManager.createView(NativeViewHierarchyManager.java:281)
       at com.facebook.react.uimanager.UIViewOperationQueue$CreateViewOperation.execute(UIViewOperationQueue.java:194)
       at com.facebook.react.uimanager.UIViewOperationQueue$DispatchUIFrameCallback.dispatchPendingNonBatchedOperations(UIViewOperationQueue.java:1110)
       at com.facebook.react.uimanager.UIViewOperationQueue$DispatchUIFrameCallback.doFrameGuarded(UIViewOperationQueue.java:1081)
       at com.facebook.react.uimanager.GuardedFrameCallback.doFrame(GuardedFrameCallback.java:29)
       at com.facebook.react.modules.core.ReactChoreographer$ReactChoreographerDispatcher.doFrame(ReactChoreographer.java:175)
       at com.facebook.react.modules.core.ChoreographerCompat$FrameCallback$1.doFrame(ChoreographerCompat.java:85)
       at android.view.Choreographer$CallbackRecord.run(Choreographer.java:1226)
       at android.view.Choreographer.doCallbacks(Choreographer.java:1012)
       at android.view.Choreographer.doFrame(Choreographer.java:917)
       at android.view.Choreographer$FrameDisplayEventReceiver.run(Choreographer.java:1213)
       at android.os.Handler.handleCallback(Handler.java:938)
       at android.os.Handler.dispatchMessage(Handler.java:99)
       at android.os.Looper.loopOnce(Looper.java:233)
       at android.os.Looper.loop(Looper.java:334)
       at android.app.ActivityThread.main(ActivityThread.java:8501)
       at java.lang.reflect.Method.invoke(Method.java)
       at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:582)
       at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1068)

Caused by java.security.InvalidParameterException: Failed to create codec for: {mime=video/avc, width=1080, height=1920}
       at io.github.thibaultbee.streampack.internal.encoders.MediaCodecEncoder.createCodec(MediaCodecEncoder.kt:185)
       at io.github.thibaultbee.streampack.internal.encoders.VideoMediaCodecEncoder.createVideoCodec(VideoMediaCodecEncoder.kt:81)
       at io.github.thibaultbee.streampack.internal.encoders.VideoMediaCodecEncoder.configure(VideoMediaCodecEncoder.kt:59)
       at io.github.thibaultbee.streampack.streamers.bases.BaseStreamer.resetVideo(BaseStreamer.kt:364)
       at io.github.thibaultbee.streampack.streamers.bases.BaseStreamer.stopStream(BaseStreamer.kt:321)
       at io.github.thibaultbee.streampack.streamers.bases.BaseCameraStreamer.stopPreview(BaseCameraStreamer.kt:126)
       at io.github.thibaultbee.streampack.streamers.bases.BaseCameraStreamer.release(BaseCameraStreamer.kt:134)
       at io.github.thibaultbee.streampack.streamers.bases.BaseStreamer.configure(BaseStreamer.kt:243)
       at video.api.livestream.ApiVideoLiveStream.setVideoConfig(ApiVideoLiveStream.kt:86)
       at video.api.reactnative.livestream.ReactNativeLiveStreamView.setVideoConfig(ReactNativeLiveStreamView.kt:59)
       at video.api.reactnative.livestream.ReactNativeLiveStreamViewManager.setVideoConfig(ReactNativeLiveStreamViewManager.kt:62)
       at java.lang.reflect.Method.invoke(Method.java)
       at com.facebook.react.uimanager.ViewManagersPropertyCache$PropSetter.updateViewProp(ViewManagersPropertyCache.java:93)
       at com.facebook.react.uimanager.ViewManagerPropertyUpdater$FallbackViewManagerSetter.setProperty(ViewManagerPropertyUpdater.java:136)
       at com.facebook.react.uimanager.ViewManagerPropertyUpdater.updateProps(ViewManagerPropertyUpdater.java:56)
       at com.facebook.react.uimanager.ViewManager.updateProperties(ViewManager.java:86)
       at com.facebook.react.uimanager.ViewManager.createViewInstance(ViewManager.java:188)
       at com.facebook.react.uimanager.ViewManager.createView(ViewManager.java:115)
       at com.facebook.react.uimanager.NativeViewHierarchyManager.createView(NativeViewHierarchyManager.java:281)
       at com.facebook.react.uimanager.UIViewOperationQueue$CreateViewOperation.execute(UIViewOperationQueue.java:194)
       at com.facebook.react.uimanager.UIViewOperationQueue$DispatchUIFrameCallback.dispatchPendingNonBatchedOperations(UIViewOperationQueue.java:1110)
       at com.facebook.react.uimanager.UIViewOperationQueue$DispatchUIFrameCallback.doFrameGuarded(UIViewOperationQueue.java:1081)
       at com.facebook.react.uimanager.GuardedFrameCallback.doFrame(GuardedFrameCallback.java:29)
       at com.facebook.react.modules.core.ReactChoreographer$ReactChoreographerDispatcher.doFrame(ReactChoreographer.java:175)
       at com.facebook.react.modules.core.ChoreographerCompat$FrameCallback$1.doFrame(ChoreographerCompat.java:85)
       at android.view.Choreographer$CallbackRecord.run(Choreographer.java:1226)
       at android.view.Choreographer.doCallbacks(Choreographer.java:1012)
       at android.view.Choreographer.doFrame(Choreographer.java:917)
       at android.view.Choreographer$FrameDisplayEventReceiver.run(Choreographer.java:1213)
       at android.os.Handler.handleCallback(Handler.java:938)
       at android.os.Handler.dispatchMessage(Handler.java:99)
       at android.os.Looper.loopOnce(Looper.java:233)
       at android.os.Looper.loop(Looper.java:334)
       at android.app.ActivityThread.main(ActivityThread.java:8501)
       at java.lang.reflect.Method.invoke(Method.java)
       at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:582)
       at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1068)

Camera in use indicator doesn't go away

After calling stopStreaming(), the onDisconnect event successfully fires and the streaming stops. However, my phone still shows the green indicator that the camera is in use, and it only goes away if I close the application entirely. What is the problem?

Versions:
"@api.video/react-native-livestream": "^1.0.0",
"react-native": "0.66.4",

Testing device:
iPhone XS
ios 15.4.1

Cannot start a live streaming on Android 12 and Android 11

Hello everyone,

I'm trying to use this library with Cloudflare Stream and I can't make it work properly on the latest Android versions.

It seems to work fine on Android 9.

This is the code:

<LivestreamView
  style={{
    height: 300,
    backgroundColor: "black",
    alignSelf: "stretch",
  }}
  ref={cameraViewRef}
  video={{
    fps: 30,
    resolution: "720p",
    camera: "front",
    orientation: "portrait",
  }}
  rtmpServerUrl="rtmps://live.cloudflare.com:443/live/"
  liveStreamKey={
    "32d69c287f287a426405a0840b581043k341b948cd3334c5f8c1c74e97794adc6"
  }
  onConnectionSuccess={() => {
    console.log("success");
    setStreaming(true);

    //do what you want
  }}
  onConnectionFailed={(e) => {
    console.log("fail", e);
    setStreaming(false);
    setStreamKey(null);
    //do what you want
  }}
  onDisconnect={() => {
    console.log("disconnect");
    setStreaming(false);
    setStreamKey(null);

    //do what you want
  }}
/>

<Button
  onPress={() => {
    if (streaming) {
      console.log('streaming');
      cameraViewRef.current?.stopStreaming();
      setStreaming(false);
    } else {
      console.log('else');
      cameraViewRef.current?.startStreaming();
    }
  }}
  title={streaming ? "stop" : "start 2"}
  color="red"
/>

I can't see any logs on metro.

If I debug the APK this is the error I get:

2022-01-19 22:04:11.443 28188-28188/? E/unknown:UIViewOperationQueue: Unhandled SoftException
    java.io.IOException: Could not start RTMP streaming. audioReady=false, videoReady=true
        at video.api.livestream_module.ApiVideoLiveStream.startStreaming(ApiVideoLiveStream.kt:179)
        at com.apivideoreactnativelivestream.ReactNativeLivestreamViewManager.startStreaming(ReactNativeLivestreamViewManager.kt:177)
        at com.apivideoreactnativelivestream.ReactNativeLivestreamViewManager.receiveCommand(ReactNativeLivestreamViewManager.kt:88)
        at com.facebook.react.uimanager.NativeViewHierarchyManager.dispatchCommand(NativeViewHierarchyManager.java:839)
        at com.facebook.react.uimanager.UIViewOperationQueue$DispatchCommandOperation.executeWithExceptions(UIViewOperationQueue.java:317)
        at com.facebook.react.uimanager.UIViewOperationQueue$1.run(UIViewOperationQueue.java:873)
        at com.facebook.react.uimanager.UIViewOperationQueue.flushPendingBatches(UIViewOperationQueue.java:1019)
        at com.facebook.react.uimanager.UIViewOperationQueue.access$2600(UIViewOperationQueue.java:47)
        at com.facebook.react.uimanager.UIViewOperationQueue$DispatchUIFrameCallback.doFrameGuarded(UIViewOperationQueue.java:1079)
        at com.facebook.react.uimanager.GuardedFrameCallback.doFrame(GuardedFrameCallback.java:29)
        at com.facebook.react.modules.core.ReactChoreographer$ReactChoreographerDispatcher.doFrame(ReactChoreographer.java:175)
        at com.facebook.react.modules.core.ChoreographerCompat$FrameCallback$1.doFrame(ChoreographerCompat.java:85)
        at android.view.Choreographer$CallbackRecord.run(Choreographer.java:1035)
        at android.view.Choreographer.doCallbacks(Choreographer.java:845)
        at android.view.Choreographer.doFrame(Choreographer.java:775)
        at android.view.Choreographer$FrameDisplayEventReceiver.run(Choreographer.java:1022)
        at android.os.Handler.handleCallback(Handler.java:938)
        at android.os.Handler.dispatchMessage(Handler.java:99)
        at android.os.Looper.loopOnce(Looper.java:201)
        at android.os.Looper.loop(Looper.java:288)
        at android.app.ActivityThread.main(ActivityThread.java:7842)
        at java.lang.reflect.Method.invoke(Native Method)
        at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:548)
        at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1003)

Any ideas on what I'm missing or how I can fix it?

Thank you :)

Stuck for 10 seconds when mounting a page with LivestreamView

The video:
https://user-images.githubusercontent.com/660581/158354018-50f43d5a-820a-4696-b7b7-9922de1343a0.mp4

Core dependencies

  "dependencies": {
    "@api.video/react-native-livestream": "^0.2.1",
    "react-native": "~0.63.4",
    "@react-navigation/native": "~5.9.3",
    "@react-navigation/stack": "~5.14.3",
    ...

Logs (tapped on 18:49:56, entered on 18:50:05):

2022-15-03 18:49:56.459 [Warn] [com.haishinkit.HaishinKit] [AVVideoIOUnit.swift:158] continuousAutofocus > focusMode(2) is not supported
2022-15-03 18:49:56.459 [Warn] [com.haishinkit.HaishinKit] [AVVideoIOUnit.swift:217] continuousExposure > exposureMode(2) is not supported
2022-15-03 18:49:56.462 [Info] [com.haishinkit.HaishinKit] [AVVideoIOUnit.swift:91] fps > (fps: 30.0, duration: __C.CMTime(value: 100, timescale: 3000, flags: __C.CMTimeFlags(rawValue: 1), epoch: 0))
2022-15-03 18:50:05.470 [Info] [com.haishinkit.HaishinKit] [AVVideoIOUnit.swift:91] fps > (fps: 30.0, duration: __C.CMTime(value: 100, timescale: 3000, flags: __C.CMTimeFlags(rawValue: 1), epoch: 0))
orientation 1

Android: Camera view is stretched.

Version

v1.2.3

Which operating systems have you used?

  • Android
  • iOS

Environment that reproduces the issue

Any android device.
I have
Samsung galaxy s10+
Samsung galaxy A20
Samsung galaxy F62

Is it reproducible in the example application?

Yes

RTMP Server

rtmps://global-live.mux.com:443/app

Reproduction steps

  1. Open the app
  2. Start the live stream in landscape mode.
  3. See that the camera view is stretched.

Expected result

The camera view should not be stretched.

Actual result

The camera view is stretched.

Additional context

"react": "16.13.1",
"react-native": "0.64.2",
"@api.video/react-native-livestream": "^1.2.3",
"react-native-orientation-locker": "^1.4.0",

Relevant logs output

-

Not able to capture stream from Android

Hey! 👋🏻😁
Thanks for the library, we've been looking for a maintainable streaming library for a while, and here it is 🎉

However, we're experiencing a weird issue, possibly caused by a configuration on our side, but we wanted to check with you first. The capture process works on iOS but not on Android.

Here's our configuration:

const liveStreamVideoOptions = {
  fps: 30,
  resolution: '720p',
  camera: 'front',
  orientation: 'portrait',
};
 /// {...}
  const [streaming, setStreaming] = useState(false);
  const ref = useRef<ReactNativeLivestreamMethods>(null);

  const handlePress = () => {
    if (streaming) {
      ref.current?.stopStreaming();
      setStreaming(false);
    } else {
      ref.current?.startStreaming();
      setStreaming(true);
    }
  };

  return <LivestreamView
        style={stylesheet.livestreamView}
        ref={ref}
        video={liveStreamVideoOptions}
        liveStreamKey={streamKey}
        rtmpServerUrl={rtmpUrl}
      />

According to the docs, some permission setup is needed, which is already implemented on our side:

  <uses-permission android:name="android.permission.INTERNET" />
  <uses-permission android:name="android.permission.RECORD_AUDIO" />
  <uses-permission android:name="android.permission.CAMERA" />

The permissions are also requested at runtime using the react-native-permissions library.
So I wonder why we're not able to capture from Android devices; are we missing something?
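The check-then-request flow described above can be sketched as a pure helper with injected `check`/`request` functions. This is only an illustration under assumed names; in a real app those functions would wrap react-native-permissions calls for CAMERA and RECORD_AUDIO:

```typescript
// Hypothetical permission gate: resolves true only when every permission
// is already granted, or becomes granted after an explicit request.
async function allPermissionsGranted(
  permissions: { check: () => Promise<boolean>; request: () => Promise<boolean> }[]
): Promise<boolean> {
  for (const p of permissions) {
    if (!(await p.check()) && !(await p.request())) {
      return false; // a single denial is enough to block capture
    }
  }
  return true;
}
```

If this gate ever resolves false on the affected Android devices, capture will silently fail even though the manifest entries are present, since Android requires the runtime grant in addition to the manifest declaration.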

RTMP Stream not received on Android

Describe the bug
I am running the example app on the Android emulator (API levels 31 and 33). I am able to Start Streaming, but the RTMP stream is not received by the streaming provider.
Meanwhile, testing the example app on an iOS device, it works as expected.

To Reproduce
Steps to reproduce the behavior:

  1. yarn install
  2. yarn example android (Emulator)
  3. Settings > Set RTMP endpoint and Streaming Key
  4. Start Streaming

Expected behavior
On the streaming service I am using (Livepeer.studio) I expect to receive a stream; instead, nothing happens.
Doing the same thing on an iOS device works as expected.

Smartphone (please complete the following information):

  • Device: Pixel_3a_API_33_arm64-v8a(AVD) - 13 (Android Virtual Device)
  • OS: API 33

Additional context
Should it work on the emulator, or only on a real Android device?

disable and enable video

Adding an option to enable and disable video would be great.
Similar to the mute option we have for audio, this would allow us to broadcast audio only. In that situation, it would also be great to be able to show an image or some alternative to a black/blank video to viewers while video is disabled.

[Android] Error while updating property...

Hey guys!
I get this error on an Android device when trying to render the LiveStreamView component. I also had the same error saying "Error while updating property 'audio'". I can't figure out what the problem is. Has anyone encountered it? My RN version is 0.69.
BTW: I am using an Apple MacBook with an M1 chip.

Screen Shot 2022-07-21 at 3 39 06 PM

Here is my component source code (basically the same as example in README, but with some permissions settings):

const App = () => {
  const ref = useRef(null);
  const [streaming, setStreaming] = useState(false);
  const [initialized, setInitialized] = useState(false);

  const init = async () => {
    const hasMicPermission = await checkMicrophonePermission();
    const hasCamPermission = await checkCameraPermission();

    const micPermissionGranted =
      hasMicPermission ||
      (await requestMicrophonePermission()) === RESULTS.GRANTED;

    const camPermissionGranted =
      hasCamPermission || (await requestCameraPermission()) === RESULTS.GRANTED;

    if (micPermissionGranted && camPermissionGranted) {
      setInitialized(true);
    }
  };

  useEffect(() => {
    init();
  }, []);

  if (!initialized) {
    return (
      <View style={{ flex: 1, justifyContent: 'center', alignItems: 'center' }}>
        <ActivityIndicator />
      </View>
    );
  }

  return (
    <View style={{ flex: 1, alignItems: 'center' }}>
      <LiveStreamView
        style={{ flex: 1, backgroundColor: 'black', alignSelf: 'stretch' }}
        ref={ref}
        camera="back"
        video={{
          fps: 30,
          resolution: '720p',
          bitrate: 2 * 1024 * 1024, // 2 Mbps
        }}
        audio={{
          bitrate: 128000,
          sampleRate: 44100,
          isStereo: true,
        }}
        isMuted={false}
        onConnectionSuccess={() => {
          //do what you want
        }}
        onConnectionFailed={e => {
          //do what you want
        }}
        onDisconnect={() => {
          //do what you want
        }}
      />
      <View style={{ position: 'absolute', bottom: 40 }}>
        <TouchableOpacity
          style={{
            borderRadius: 50,
            backgroundColor: streaming ? 'red' : 'white',
            width: 50,
            height: 50,
          }}
          onPress={() => {
            if (streaming) {
              ref.current?.stopStreaming();
              setStreaming(false);
            } else {
              ref.current?.startStreaming('*****');
              setStreaming(true);
            }
          }}
        />
      </View>
    </View>
  );
};

export default App;

Difference from video recorded on the phone?

Describe the bug
If we record a video on an iPhone and then stream it to an RTMP server using ffmpeg, the video works; but if we send it directly to the RTMP server using this module, it ends up being green. Is there a difference in the codec or in the way the video is recorded compared to what the camera app does? One detail I've noticed: in the recorded video the NALU headers are 0x67, 0x68, 0x65, but when using this module they are 0x27, 0x28, 0x06, so I think that indicates something different that's related to this issue.
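For context on the header values quoted above: in H.264 the NAL header byte is laid out as forbidden_zero_bit (1 bit), nal_ref_idc (2 bits), and nal_unit_type (5 bits). So 0x67 and 0x27 are both SPS (type 7), 0x68 and 0x28 are both PPS (type 8), differing only in nal_ref_idc, while 0x65 is an IDR slice (type 5) and 0x06 is SEI (type 6). A quick decoder to check this:

```typescript
// Extract the NAL unit type: the low 5 bits of the header byte.
function nalUnitType(header: number): number {
  return header & 0x1f;
}

// Extract nal_ref_idc: bits 5-6 of the header byte.
function nalRefIdc(header: number): number {
  return (header >> 5) & 0x03;
}
```

So the two streams carry the same parameter-set types; the green output is more likely a pixel-format or color-plane mismatch than a NALU-type difference.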

To Reproduce
Steps to reproduce the behavior:

  1. Stream from module to RTMP server
  2. Log NALU Headers
  3. Observe difference

Expected behavior
Video looks correct from both a recorded file and a stream from the module

Screenshots
Screenshot 2022-12-02 at 9 03 06 PM

Smartphone (please complete the following information):

  • Device: iPhone 14 Pro Max
  • OS: 16.1.2
  • Browser: Safari
  • Version: 16.1 (18614.2.9.1.12)

Fatal crash when opening screen

Describe the bug
When installing the application, no matter what I do, the app crashes on my Samsung S7 (Android 8.0). At first I thought it was because of the Android version, but I have just tested on my Redmi Note 10 with Android 12: same issue, different outcome. There, the stream starts but doesn't send any packets and disconnects as soon as it has started. It's worth noting I have tested two different backends, written in Go and Node.js, and the output is still the same, but both work well with OBS.

To Reproduce
Steps to reproduce the behavior:

  1. Clone example repo and install dep.
  2. Run react-native
  3. The app crashes on Android 8.0, or doesn't send any packets on Android 12.

Expected behavior
The application is supposed to stream my screen.

Smartphone (please complete the following information):

  • Device: Redmi Note 10
  • OS: Android
  • Version :12

Additional context
Here is the different logs from flipper

`*** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***

Cannot figure out the cause


Build fingerprint: 'Redmi/sunny_in/sunny:12/SKQ1.210908.001/V13.0.1.0.SKGINXM:user/release-keys'
Revision: '0'
ABI: 'arm64'
Timestamp: 2022-05-23 13:07:47.398812292+0100
Process uptime: 0s
Cmdline: video.api.reactnative.livestream.example
pid: 19554, tid: 19773, name: VMediaCodecThre >>> video.api.reactnative.livestream.example <<<
uid: 10422
signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x18
Cause: null pointer dereference
x0 0000000000000000 x1 0000000072465250 x2 00000000000009b8 x3 0000007e6acc0f5f
x4 0000007e527f6688 x5 000000000000004a x6 284901ff3a435328 x7 7f7f7f7f7f7f7f7f
x8 0000000000000000 x9 9e24f0d7579eb989 x10 0000000000430000 x11 000000000000001c
x12 00000000000000ff x13 0000000000000000 x14 7ffbffff00000000 x15 0000000000000000
x16 0000007dc985ef10 x17 0000007dcd85a504 x18 0000007d9576e000 x19 b400007e53a58400
x20 0000000000000000 x21 0000000000000030 x22 0000007debe5fc06 x23 0000000000004070
x24 0000007e6ae00880 x25 0000007e527f68a8 x26 0000007e527f68bc x27 0000007e527f68a8
x28 0000007e527f67a0 x29 0000007e527f6780
lr 0000007e6aed4048 sp 0000007e527f6730 pc 0000007dcd85a520 pst 0000000060000000
backtrace:
#00 pc 0000000000014520 /data/app/_QAk2eMv1ChyjpC-9zD8Ww==/video.api.reactnative.livestream.example-k3-VntwSyovRnzc5_56SLw==/lib/arm64/librtmp.so (RTMP_Write+28) (BuildId: bfd03f379ed41c82df8ea932a4ccf15bcfbc66dd)
#1 pc 00000000002d4044 /apex/com.android.art/lib64/libart.so (art_quick_generic_jni_trampoline+148) (BuildId: 34e3dd028e2e682b63a512d6a4f1b5eb)
#2 pc 000000000020a0a0 /apex/com.android.art/lib64/libart.so (nterp_helper+4016) (BuildId: 34e3dd028e2e682b63a512d6a4f1b5eb)
#3 pc 000000000017ac06 [anon:dalvik-classes6.dex extracted in memory from /data/app/
_QAk2eMv1ChyjpC-9zD8Ww==/video.api.reactnative.livestream.example-k3-VntwSyovRnzc5_56SLw==/base.apk!classes6.dex]
#4 pc 000000000020a044 /apex/com.android.art/lib64/libart.so (nterp_helper+3924) (BuildId: 34e3dd028e2e682b63a512d6a4f1b5eb)
#5 pc 0000000000467ad2 [anon:dalvik-classes.dex extracted in memory from /data/app/_QAk2eMv1ChyjpC-9zD8Ww==/video.api.reactnative.livestream.example-k3-VntwSyovRnzc5_56SLw==/base.apk]
#6 pc 000000000020ae64 /apex/com.android.art/lib64/libart.so (nterp_helper+7540) (BuildId: 34e3dd028e2e682b63a512d6a4f1b5eb)
#7 pc 00000000004791a8 [anon:dalvik-classes.dex extracted in memory from /data/app/
_QAk2eMv1ChyjpC-9zD8Ww==/video.api.reactnative.livestream.example-k3-VntwSyovRnzc5_56SLw==/base.apk]
#8 pc 000000000020ae64 /apex/com.android.art/lib64/libart.so (nterp_helper+7540) (BuildId: 34e3dd028e2e682b63a512d6a4f1b5eb)
#9 pc 000000000046b978 [anon:dalvik-classes.dex extracted in memory from /data/app/_QAk2eMv1ChyjpC-9zD8Ww==/video.api.reactnative.livestream.example-k3-VntwSyovRnzc5_56SLw==/base.apk]
#10 pc 000000000020ae64 /apex/com.android.art/lib64/libart.so (nterp_helper+7540) (BuildId: 34e3dd028e2e682b63a512d6a4f1b5eb)
#11 pc 0000000000479324 [anon:dalvik-classes.dex extracted in memory from /data/app/
_QAk2eMv1ChyjpC-9zD8Ww==/video.api.reactnative.livestream.example-k3-VntwSyovRnzc5_56SLw==/base.apk]
#12 pc 000000000020ae64 /apex/com.android.art/lib64/libart.so (nterp_helper+7540) (BuildId: 34e3dd028e2e682b63a512d6a4f1b5eb)
#13 pc 00000000004688f6 [anon:dalvik-classes.dex extracted in memory from /data/app/~~_QAk2eMv1ChyjpC-9zD8Ww==/video.api.reactnative.livestream.example-k3-VntwSyovRnzc5_56SLw==/base.apk]
#14 pc 000000000020a044 /apex/com.android.art/lib64/libart.so (nterp_helper+3924) (BuildId: 34e3dd028e2e682b63a512d6a4f1b5eb)
#15 pc 000000000031f806 /system/framework/framework.jar
#16 pc 000000000020a044 /apex/com.android.art/lib64/libart.so (nterp_helper+3924) (BuildId: 34e3dd028e2e682b63a512d6a4f1b5eb)
#17 pc 000000000031fae0 /system/framework/framework.jar
#18 pc 000000000200ece0 /memfd:jit-cache (deleted) (android.os.Handler.dispatchMessage+272)
#19 pc 0000000002038b48 /memfd:jit-cache (deleted) (android.os.Looper.loopOnce+1480)
#20 pc 0000000002016878 /memfd:jit-cache (deleted) (android.os.Looper.loop+696)
#21 pc 00000000002ca9e8 /apex/com.android.art/lib64/libart.so (art_quick_invoke_static_stub+568) (BuildId: 34e3dd028e2e682b63a512d6a4f1b5eb)
#22 pc 00000000002ee6b8 /apex/com.android.art/lib64/libart.so (art::interpreter::ArtInterpreterToCompiledCodeBridge(art::Thread*, art::ArtMethod*, art::ShadowFrame*, unsigned short, art::JValue*)+320) (BuildId: 34e3dd028e2e682b63a512d6a4f1b5eb)
#23 pc 000000000040ade4 /apex/com.android.art/lib64/libart.so (bool art::interpreter::DoCall<false, false>(art::ArtMethod*, art::Thread*, art::ShadowFrame&, art::Instruction const*, unsigned short, art::JValue*)+820) (BuildId: 34e3dd028e2e682b63a512d6a4f1b5eb)
#24 pc 000000000076d4b8 /apex/com.android.art/lib64/libart.so (MterpInvokeStatic+3812) (BuildId: 34e3dd028e2e682b63a512d6a4f1b5eb)
#25 pc 00000000002c5014 /apex/com.android.art/lib64/libart.so (mterp_op_invoke_static+20) (BuildId: 34e3dd028e2e682b63a512d6a4f1b5eb)
#26 pc 00000000004490d8 /system/framework/framework.jar
#27 pc 000000000027d840 /apex/com.android.art/lib64/libart.so (art::interpreter::Execute(art::Thread*, art::CodeItemDataAccessor const&, art::ShadowFrame&, art::JValue, bool, bool) (.llvm.3351068054637636664)+644) (BuildId: 34e3dd028e2e682b63a512d6a4f1b5eb)
#28 pc 000000000027c9e8 /apex/com.android.art/lib64/libart.so (artQuickToInterpreterBridge+1176) (BuildId: 34e3dd028e2e682b63a512d6a4f1b5eb)
#29 pc 00000000002d4178 /apex/com.android.art/lib64/libart.so (art_quick_to_interpreter_bridge+88) (BuildId: 34e3dd028e2e682b63a512d6a4f1b5eb)
#30 pc 00000000002ca764 /apex/com.android.art/lib64/libart.so (art_quick_invoke_stub+548) (BuildId: 34e3dd028e2e682b63a512d6a4f1b5eb)
#31 pc 000000000030e980 /apex/com.android.art/lib64/libart.so (art::ArtMethod::Invoke(art::Thread*, unsigned int*, unsigned int, art::JValue*, char const*)+156) (BuildId: 34e3dd028e2e682b63a512d6a4f1b5eb)
#32 pc 00000000003c1db4 /apex/com.android.art/lib64/libart.so (art::JValue art::InvokeVirtualOrInterfaceWithJValuesart::ArtMethod*(art::ScopedObjectAccessAlreadyRunnable const&, _jobject*, art::ArtMethod*, jvalue const*)+380) (BuildId: 34e3dd028e2e682b63a512d6a4f1b5eb)
#33 pc 00000000004578ec /apex/com.android.art/lib64/libart.so (art::Thread::CreateCallback(void*)+992) (BuildId: 34e3dd028e2e682b63a512d6a4f1b5eb)
#34 pc 00000000000f0d34 /apex/com.android.runtime/lib64/bionic/libc.so (__pthread_start(void*)+264) (BuildId: e1f31041be009cf8613d6678cdccd160)
#35 pc 000000000008d57c /apex/com.android.runtime/lib64/bionic/libc.so (__start_thread+68) (BuildId: e1f31041be009cf8613d6678cdccd160)


`

README example freezes on start/stop/start

Running the README example. After I stopStreaming the first time, the "camera view" freezes on next startStreaming.

This is how it goes:

  1. Mount screen ✅
  2. Do not start streaming yet
  3. Move camera around, the "camera view" works ✅
  4. Start streaming
  5. onConnectionSuccess event is fired ✅
  6. Move camera around, the "camera view" works ✅
  7. Stop streaming
  8. onDisconnect event is fired ✅
  9. Move camera around, the "camera view" works ✅
  10. Start streaming again
  11. onConnectionSuccess event is fired ✅
  12. "Camera view" does NOT work ❌
  13. Stop streaming
  14. Button changes color red --> white (so not the screen freezing) ✅
  15. onDisconnect event is NOT fired ❌

No JS errors, no crash. Just the "camera view" freezing.

Any help/hint would be appreciated 🙏

Example app not working on macOS 13 Ventura, Xcode 14, iPhone with iOS 16

Describe the bug
I am trying to run the example app on a device, but it fails with the error Failed to build iOS project and a lot of warnings like The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 10.0, but the range of supported deployment target versions is 11.0 to 16.0.99.

To Reproduce
Steps to reproduce the behavior:

  1. In the project folder I run yarn
  2. yarn example ios --device iPhonezee (my iPhone device)

And I get errors regarding the pod dependencies minimum version:

$ yarn --cwd example ios --device iPhonezee
$ react-native run-ios --device iPhonezee
info Found Xcode workspace "Example.xcworkspace"
info Building (using "xcodebuild -workspace Example.xcworkspace -configuration Debug -scheme Example -destination id=00008101-00044C162684001E")
error Failed to build iOS project. We ran "xcodebuild" command but it exited with error code 65. To debug build logs further, consider building your app with Xcode.app, by opening Example.xcworkspace.
Command line invocation:
    /Applications/Xcode_14.app/Contents/Developer/usr/bin/xcodebuild -workspace Example.xcworkspace -configuration Debug -scheme Example -destination id=00008101-00044C162684001E

User defaults from command line:
    IDEPackageSupportUseBuiltinSCM = YES

Prepare packages

Computing target dependency graph and provisioning inputs

Create build description
Build description signature: bee723534211d98cef7253eb80756ead
Build description path: /Users/claudio/Library/Developer/Xcode/DerivedData/Example-fvizzzwihcblhzdgbmhfljdbmehx/Build/Intermediates.noindex/XCBuildData/bee723534211d98cef7253eb80756ead-desc.xcbuild

note: Building targets in dependency order
/Users/claudio/Downloads/api.video-reactnative-live-stream/example/ios/Pods/Pods.xcodeproj: warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 10.0, but the range of supported deployment target versions is 11.0 to 16.0.99. (in target 'Flipper-Folly' from project 'Pods')
/Users/claudio/Downloads/api.video-reactnative-live-stream/example/ios/Pods/Pods.xcodeproj: warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 9.0, but the range of supported deployment target versions is 11.0 to 16.0.99. (in target 'SocketRocket' from project 'Pods')
/Users/claudio/Downloads/api.video-reactnative-live-stream/example/ios/Pods/Pods.xcodeproj: warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 11.0 to 16.0.99. (in target 'YogaKit' from project 'Pods')
/Users/claudio/Downloads/api.video-reactnative-live-stream/example/ios/Pods/Pods.xcodeproj: warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 9.0, but the range of supported deployment target versions is 11.0 to 16.0.99. (in target 'CocoaAsyncSocket' from project 'Pods')
warning: Run script build phase 'Start Packager' will be run during every build because it does not specify any outputs. To address this warning, either add output dependencies to the script phase, or configure it to run in every build by unchecking "Based on dependency analysis" in the script phase. (in target 'Example' from project 'Example')
warning: Run script build phase 'Bundle React Native code and images' will be run during every build because it does not specify any outputs. To address this warning, either add output dependencies to the script phase, or configure it to run in every build by unchecking "Based on dependency analysis" in the script phase. (in target 'Example' from project 'Example')
/Users/claudio/Downloads/api.video-reactnative-live-stream/example/ios/Pods/Pods.xcodeproj: warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 10.0, but the range of supported deployment target versions is 11.0 to 16.0.99. (in target 'FlipperKit' from project 'Pods')
/Users/claudio/Downloads/api.video-reactnative-live-stream/example/ios/Pods/Pods.xcodeproj: warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.4, but the range of supported deployment target versions is 11.0 to 16.0.99. (in target 'Flipper-PeerTalk' from project 'Pods')
/Users/claudio/Downloads/api.video-reactnative-live-stream/example/ios/Pods/Pods.xcodeproj: warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 10.0, but the range of supported deployment target versions is 11.0 to 16.0.99. (in target 'fmt' from project 'Pods')
/Users/claudio/Downloads/api.video-reactnative-live-stream/example/ios/Pods/Pods.xcodeproj: warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 10.0, but the range of supported deployment target versions is 11.0 to 16.0.99. (in target 'Flipper' from project 'Pods')
/Users/claudio/Downloads/api.video-reactnative-live-stream/example/ios/Pods/Pods.xcodeproj: warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 10.0, but the range of supported deployment target versions is 11.0 to 16.0.99. (in target 'Flipper-Boost-iOSX' from project 'Pods')
/Users/claudio/Downloads/api.video-reactnative-live-stream/example/ios/Pods/Pods.xcodeproj: warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 9.0, but the range of supported deployment target versions is 11.0 to 16.0.99. (in target 'Flipper-DoubleConversion' from project 'Pods')
/Users/claudio/Downloads/api.video-reactnative-live-stream/example/ios/Pods/Pods.xcodeproj: warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 10.0, but the range of supported deployment target versions is 11.0 to 16.0.99. (in target 'Flipper-RSocket' from project 'Pods')
/Users/claudio/Downloads/api.video-reactnative-live-stream/example/ios/Pods/Pods.xcodeproj: warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 9.0, but the range of supported deployment target versions is 11.0 to 16.0.99. (in target 'RNVectorIcons' from project 'Pods')
/Users/claudio/Downloads/api.video-reactnative-live-stream/example/ios/Pods/Pods.xcodeproj: warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 10.0, but the range of supported deployment target versions is 11.0 to 16.0.99. (in target 'libevent' from project 'Pods')
/Users/claudio/Downloads/api.video-reactnative-live-stream/example/ios/Pods/Pods.xcodeproj: error: Signing for "React-Core-AccessibilityResources" requires a development team. Select a development team in the Signing & Capabilities editor. (in target 'React-Core-AccessibilityResources' from project 'Pods')
/Users/claudio/Downloads/api.video-reactnative-live-stream/example/ios/Pods/Pods.xcodeproj: warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 9.0, but the range of supported deployment target versions is 11.0 to 16.0.99. (in target 'Flipper-Glog' from project 'Pods')
/Users/claudio/Downloads/api.video-reactnative-live-stream/example/ios/Pods/Pods.xcodeproj: warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 9.0, but the range of supported deployment target versions is 11.0 to 16.0.99. (in target 'OpenSSL-Universal' from project 'Pods')
/Users/claudio/Downloads/api.video-reactnative-live-stream/example/ios/Pods/Pods.xcodeproj: warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 10.0, but the range of supported deployment target versions is 11.0 to 16.0.99. (in target 'Flipper-Fmt' from project 'Pods')
/Users/claudio/Downloads/api.video-reactnative-live-stream/example/ios/Pods/Pods.xcodeproj: warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 9.0, but the range of supported deployment target versions is 11.0 to 16.0.99. (in target 'RCT-Folly' from project 'Pods')

** BUILD FAILED **

NOTE: if I run on the simulator with yarn example ios, it works as expected.

Expected behavior
The example app should run on device.

Smartphone (please complete the following information):

  • Device: iPhone 12 Pro Max
  • OS: 16.1.1
  • Browser Safari
  • Version 16.1
  • ruby 3.1.2p20 (2022-04-12 revision 4491bb740a) [arm64-darwin22]
  • node v18.9.0
  • Xcode 14

gop, keyframes

Can I change this setting (the keyframe interval)? I need to be able to set it between 2 and 8 seconds.

Streaming status

Hi,
Currently, the ref.current?.startStreaming(); method starts the stream from the LiveStreamView, but it does not give any feedback on the real status of the stream.
Is this a feature you plan to add to the SDK?
If not, what would be the best way to handle it?
Thank you for your reply.

v1.2.1 Stream URL Problem

Describe the bug
There is a problem with the Stream Key section when opening a live broadcast in the application. The stream URL sent to the RTMP server is live/ffd83834-f73e-459e-a0f5-80964700f2f9live1flashverFMLE/3.0 (compatible; FMSc/1.0)1.0)/ffd83834-f73e-459e-a0f5-80964700f2f. It arrives at the RTMP server repeatedly, each time with different information appended to it.

To Reproduce
Steps to reproduce the behavior:

  1. Create a new project v1.2.1
  2. Use custom rtmp server (e.g.: srs, zlmediakit, livego etc.)
  3. The live stream does not start because the stream URL is not parsable

Expected behavior
The stream URL currently arrives at the server as: live/ffd83834-f73e-459e-a0f5-80964700f2f9live1flashverFMLE/3.0 (compatible; FMSc/1.0)1.0)/ffd83834-f73e-459e-a0f5-80964700f2f
It should arrive as: live/ffd83834-f73e-459e-a0f5-80964700f2f9
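The expected behavior can be expressed as a small parsing function: given a full RTMP path, the stream key should be exactly the last path segment, with nothing appended. A sketch (splitRtmpUrl is a hypothetical helper, not part of the SDK), assuming the stream key is the last path segment:

```typescript
// Hypothetical helper (not part of the SDK): split a full RTMP URL into
// the server URL and the stream key, assuming the key is the last path
// segment, e.g.
//   rtmp://host/live/ffd83834-...  ->  { url: 'rtmp://host/live/', key: 'ffd83834-...' }
function splitRtmpUrl(fullUrl: string): { url: string; key: string } {
  const trimmed = fullUrl.replace(/\/+$/, ''); // drop any trailing slashes
  const i = trimmed.lastIndexOf('/');
  if (i <= 'rtmp://'.length) {
    // no path segment after the host: nothing to use as a stream key
    throw new Error(`Cannot extract a stream key from: ${fullUrl}`);
  }
  return { url: trimmed.slice(0, i + 1), key: trimmed.slice(i + 1) };
}
```

The two parts can then be passed separately as ref.current?.startStreaming(key, url), matching the (key, url) argument order used in the example code.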

Screenshots
(screenshot attached)

Smartphone (please complete the following information):

  • Device: Google Pixel 4
  • OS: Android 12.0

Additional context
There was no bug like this in 1.2.0; it was working properly.

Unexpectedly found nil while unwrapping an Optional value

Describe the bug
The bug occurs as soon as the screen with the <LiveStreamView /> component is opened a second time (> 1) and the component is re-rendered. The app crashes immediately and an error log is issued with the following error:
Fatal error: Unexpectedly found nil while unwrapping an Optional value.

To Reproduce
Steps to reproduce the behavior:

  1. Create a react-native application with more than one screen
  2. Put the <LiveStreamView /> component on the second screen
  3. Open the screen more than one time (or probably trigger a rerender in another way)

Expected behavior
The app is going to crash with the following error message: Fatal error: Unexpectedly found nil while unwrapping an Optional value

Screenshots
If applicable, add screenshots to help explain your problem.

Smartphone (please complete the following information):

  • Device: iPhone 12
  • OS: iOS 14.5
  • Version 1.0.0

Translated Report (Full Report Below)

Incident Identifier: DDB756FE-59F8-40BB-8D86-FA41FBD58A30
CrashReporter Key: 10736A7A-E9D7-3061-3D4C-DB9C574F2979
Hardware Model: MacBookPro16,1
Process: studio [8005]
Path: /Users/USER/Library/Developer/CoreSimulator/Devices/381ADF7A-5ED0-4D10-B2AD-D91E3F05E31E/data/Containers/Bundle/Application/A58DF142-0EEB-422D-A4AC-C35D637C2898/studio.app/studio
Identifier: com.shopshot.studio
Version: 1.0.0 (1)
Code Type: X86-64 (Native)
Role: Foreground
Parent Process: launchd_sim [613]
Coalition: com.apple.CoreSimulator.SimDevice.381ADF7A-5ED0-4D10-B2AD-D91E3F05E31E [709]
Responsible Process: SimulatorTrampoline [542]

Date/Time: 2022-03-28 23:06:27.6416 +0200
Launch Time: 2022-03-28 22:55:58.0951 +0200
OS Version: macOS 12.2 (21D49)
Release Type: User
Report Version: 104

Exception Type: EXC_BAD_INSTRUCTION (SIGILL)
Exception Codes: 0x0000000000000001, 0x0000000000000000
Exception Note: EXC_CORPSE_NOTIFY
Termination Reason: SIGNAL 4 Illegal instruction: 4
Terminating Process: exc handler [8005]

Triggered by Thread: 0

Application Specific Information:
CoreSimulator 802.6 - Device: iPhone 12 (381ADF7A-5ED0-4D10-B2AD-D91E3F05E31E) - Runtime: iOS 14.5 (18E182) - DeviceType: iPhone 12
HaishinKit/AVVideoIOUnit.swift:251: Fatal error: Unexpectedly found nil while unwrapping an Optional value
dyld4 config: DYLD_ROOT_PATH=/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS 14.5.simruntime/Contents/Resources/RuntimeRoot

Are any settings variable during stream?

Thanks for the work you've done on this library. The issue I'm having is that once I start streaming, I lose control over the settings and seem unable to change the resolution or bitrate. Sometimes, due to connection speed, it may be necessary to reduce the resolution or bitrate to ensure the stream continues. How would you recommend we work around this?

Crashing/ANR during live stream on some devices

Describe the bug
Getting these errors, and the app crashes or ANRs on some devices; other devices work fine.

io.github.thibaultbee.streampack.internal.encoders.VideoMediaCodecEncoder$CodecSurface.stopStream (VideoMediaCodecEncoder.kt:247)

main(native)
video.api.rtmpdroid.Rtmp.nativeConnectStream (Native method)

lib.so
Crashed: Thread #1
SIGSEGV 0x0000000000000018

To Reproduce
Steps to reproduce the behavior:

  1. Start live streaming (the issue happens on OnePlus 6 and OnePlus Nord, but works fine on OnePlus Nord 2 CE)

Expected behavior
It should live stream properly without any ANR or crash

Smartphone (please complete the following information):

  • Device: OnePlus 6 and OnePlus Nord
  • OS: Android 10/11
  • Version [Latest]

A problem occurred configuring project ':@api.video/react-native-livestream'.

react-native: v0.61.5
@api.video/react-native-livestream: v1.1.0

The project name '@api.video/react-native-livestream' must not contain any of the following characters: [/, \, :, <, >, ", ?, *, |]. Set the 'rootProject.name' or adjust the 'include' statement (see https://docs.gradle.org/6.1.1/dsl/org.gradle.api.initialization.Settings.html#org.gradle.api.initialization.Settings:include(java.lang.String[]) for more details).

[Bug]: App interrupts streaming and freezes when trying to stop stream

Version

v1.2.3

Which operating systems have you used?

  • Android
  • iOS

Environment that reproduces the issue

Samsung Galaxy S9 - Android 10

Is it reproducible in the example application?

Yes

RTMP Server

rtmp://***.in.streamster.io/in (key: ***)

Reproduction steps

  1. Start streaming with any parameters
  2. After some time (usually around 10-15 seconds) streaming will stop - server will not receive data
  3. Preview will still work fine
  4. Try to stop streaming, app will freeze but preview will be visible

Expected result

App does not freeze

Actual result

App freezes; no logs on the react-native side, no errors in logcat, only an ANR crash report is saved

https://gist.github.com/Pitros/e3d1239dd1875538d57d3b7725bbf281

Additional context

Streaming is also stuttering on this device, other phone (Galaxy S20) does not stutter and does not have this freeze issue.

Relevant logs output

No response

Running in Expo gives the error Invariant Violation: requireNativeComponent: "ReactNativeLiveStreamView" was not found in the UIManager.


Front camera as default not working

Describe the bug
Using the example app, when I set the default camera to 'front', the back camera is still selected by default. I have to tap the switch camera button twice to get it back in sync and working normally.

To Reproduce
Steps to reproduce the behavior:

  1. Change :const [camera, setCamera] = React.useState<'front' | 'back'>('back');
    to: const [camera, setCamera] = React.useState<'front' | 'back'>('front');
  2. Tap the switch camera button.
  3. Nothing happens
  4. Tap switch camera button again.
  5. Works as expected.

Expected behavior
The app starts up with the front facing camera.

Smartphone (please complete the following information):

  • Device: iPhone 8
  • OS: iOS 14.6

zoom feature

It would be great if you added a zoom feature as well.

Android app crashes on the camera screen

Describe the bug
Navigating to the camera screen crashes the app on some Android phones.

To Reproduce
Steps to reproduce the behavior:

  1. Navigate to the camera screen
  2. The camera loads, and then the app crashes within a second

Expected behavior
It should not crash the app

Smartphone (please complete the following information):

  • Device: Samsung galaxy Note 9 (SM-N960U)
  • OS: Android 10
  • API stream SDK version: 1.2.0

Crash logs
https://gist.github.com/tarun-showday/616a3b4454b4137d4e94a0bfd7bffeaa

Live streaming from camera works on iOS but not Android

Hello! I'm currently using the Amazon IVS service and trying to use this library with it.
I used the React Native example code from this repository and also added the permissions to the 'AndroidManifest.xml' file, the same as in the example.
It works on iOS devices but not on Android.

import React, {useRef, useState} from 'react';
import {
  StatusBar,
  Text, TouchableOpacity,
  View
} from 'react-native';
import {LiveStreamView} from '@api.video/react-native-livestream';

interface Props {
  children?: React.ReactNode;
}
const VideoPage: React.FC<Props> = (props) => {
  const ref = useRef(null);
  const [streaming, setStreaming] = useState(false);

  return (
    <View style={{flex: 1, alignItems: 'center'}}>
      <StatusBar animated={true} barStyle="light-content" />
      <LiveStreamView
        style={{flex: 1, backgroundColor: 'black', alignSelf: 'stretch'}}
        ref={ref}
        camera={'back'}
        video={{
          fps: 30,
          resolution: '480p',
          bitrate: 2 * 1024 * 1024, // 2 Mbps
        }}
        audio={{
          bitrate: 128000, // 128 kbps
          // isStereo: false,
        }}
        isMuted={true}
        onConnectionSuccess={() => {
          //do what you want
        }}
        onConnectionFailed={(e) => {
          //do what you want
        }}
        onDisconnect={() => {
          //do what you want
        }}
      />
      <View style={{position: 'absolute', bottom: 40}}>
        <TouchableOpacity
          style={{
            borderRadius: 50,
            backgroundColor: streaming ? 'red' : 'white',
            width: 50,
            height: 50,
          }}
          onPress={() => {
            if (streaming) {
              ref.current?.stopStreaming();
              setStreaming(false);
            } else {
              ref.current?.startStreaming('key', 'url_ends_with_slash');
              setStreaming(true);
            }
          }}
        />
      </View>
    </View>
  );
};
export default VideoPage;
<manifest>
  <uses-permission android:name="android.permission.INTERNET" />
  <uses-permission android:name="android.permission.RECORD_AUDIO" />
  <uses-permission android:name="android.permission.CAMERA" />
</manifest>

Screenshots of the Amazon IVS report

Android: (screenshots attached)

iOS: (screenshots attached)

Smartphone

[working] IOS

  • Device: iPad mini 6th gen
  • OS: iOS15.4
  • Browser: React Native
  • Version: 0.66.4

[not working] Android

  • Device: Galaxy S20
  • OS: android 12
  • Browser: React Native
  • Version: 0.66.4

Do you have any idea why it is not working properly on android device?

Front camera freezes after 2 to 3 seconds

Describe the bug
The front camera gets stuck on Android after it opens and the screen freezes, whereas the back camera works without getting stuck.

On iOS, both cameras work.

I tested on Android 13 and Android 9 devices; on both, the front camera got stuck.

Support autofocus

Is your feature request related to a problem? Please describe.
Auto focus would improve stream quality a lot.

Describe the solution you'd like
I think HaishinKit supports continuousFocus; could we expose it as a prop?

Describe alternatives you've considered
I am still exploring.

Could not resolve org.apache.httpcomponents:httpcore:4.4.6.

 Could not resolve all artifacts for configuration ':api.video_react-native-livestream:classpath'.
   > Could not resolve org.apache.httpcomponents:httpcore:4.4.6.
     Required by:
         project :api.video_react-native-livestream > com.android.tools.build:gradle:3.2.1 > com.android.tools.analytics-library:crash:26.2.1       
         project :api.video_react-native-livestream > com.android.tools.build:gradle:3.2.1 > com.android.tools.build:builder:3.2.1 > com.android.tools:sdklib:26.2.1
      > Could not resolve org.apache.httpcomponents:httpcore:4.4.6.
         > Could not get resource 'https://jcenter.bintray.com/org/apache/httpcomponents/httpcore/4.4.6/httpcore-4.4.6.pom'.
            > Could not GET 'https://jcenter.bintray.com/org/apache/httpcomponents/httpcore/4.4.6/httpcore-4.4.6.pom'. Received status code 502 from server: Bad Gateway
   > Could not resolve org.apache.httpcomponents:httpcore:4.4.6.
     Required by:
         project :api.video_react-native-livestream > com.android.tools.build:gradle:3.2.1 > com.android.tools.analytics-library:crash:26.2.1 > org.apache.httpcomponents:httpclient:4.5.3
      > Could not resolve org.apache.httpcomponents:httpcore:4.4.6.
         > Could not get resource 'https://jcenter.bintray.com/org/apache/httpcomponents/httpcore/4.4.6/httpcore-4.4.6.pom'.
            > Could not GET 'https://jcenter.bintray.com/org/apache/httpcomponents/httpcore/4.4.6/httpcore-4.4.6.pom'. Received status code 502 from server: Bad Gateway

Versions

 "@api.video/react-native-livestream": "^0.2.0",
 "react": "17.0.2",
 "react-native": "0.66.4",

Camera Orientation issue.

Describe the bug
When we live stream in landscape mode, the stream always displays in portrait mode, even though my device/UI orientation is landscape.

To Reproduce
Steps to reproduce the behavior:

  1. Open the app.
  2. Rotate your device in landscape mode.
  3. Start streaming.

Expected behavior
If the device orientation is landscape, the stream should always display in landscape mode.

Smartphone (please complete the following information):

  • Device: iPhone 11 pro max
  • iPhone OS Version: 16.2
  • Xcode version 14.2
  • Mac OS: Monterey 12.6

Additional context
"@api.video/react-native-livestream": "v1.2.2"
"react": "16.13.1",
"react-native": "0.64.2",
"react-native-orientation-locker": "^1.4.0",

Crash with error [AVCaptureSession startRunning] startRunning may not be called between calls to beginConfiguration and commitConfiguration'

The video:
https://user-images.githubusercontent.com/660581/158349595-d5ba5cb1-5aa4-430b-b2e4-c9ac69feea7b.mp4

Repeatedly mounting and unmounting the page causes this crash randomly.

2022-03-15 17:44:27.254875+0800 shootingRange[10321:2880744] [javascript] Camera permission denied
2022-15-03 17:44:27.669 [Warn] [com.haishinkit.HaishinKit] [AVVideoIOUnit.swift:158] continuousAutofocus > focusMode(2) is not supported
2022-15-03 17:44:27.669 [Warn] [com.haishinkit.HaishinKit] [AVVideoIOUnit.swift:217] continuousExposure > exposureMode(2) is not supported
2022-15-03 17:44:27.673 [Info] [com.haishinkit.HaishinKit] [AVVideoIOUnit.swift:91] fps > (fps: 30.0, duration: __C.CMTime(value: 100, timescale: 3000, flags: __C.CMTimeFlags(rawValue: 1), epoch: 0))
orientation 1
2022-15-03 17:44:27.680 [Info] [com.haishinkit.HaishinKit] [AVVideoIOUnit.swift:91] fps > (fps: 30.0, duration: __C.CMTime(value: 100, timescale: 3000, flags: __C.CMTimeFlags(rawValue: 1), epoch: 0))
2022-03-15 17:44:27.706448+0800 shootingRange[10321:2880744] [javascript] 'Native Error', '*** -[AVCaptureSession startRunning] startRunning may not be called between calls to beginConfiguration and commitConfiguration\n(\n    "4   libc++abi.dylib                     0x00000001b9c33fa0 15191D90-33BD-3E37-A891-243CA73D63DE + 77728",\n    "5   libc++abi.dylib                     0x00000001b9c33f2c _ZSt9terminatev + 48",\n    "6   libdispatch.dylib                   0x000000010820ddf4 _dispatch_client_callout + 40",\n    "7   libdispatch.dylib                   0x0000000108221ffc _dispatch_root_queue_drain + 856",\n    "8   libdispatch.dylib                   0x00000001082227ec _dispatch_worker_thread2 + 136"\n)'
2022-03-15 17:44:28.186853+0800 shootingRange[10321:2881801] [connection] nw_endpoint_handler_set_adaptive_read_handler [C19.1 8.209.73.42:443 ready channel-flow (satisfied (Path is satisfied), viable, interface: en0, ipv4, dns)] unregister notification for read_timeout failed
2022-03-15 17:44:28.186956+0800 shootingRange[10321:2881801] [connection] nw_endpoint_handler_set_adaptive_write_handler [C19.1 8.209.73.42:443 ready channel-flow (satisfied (Path is satisfied), viable, interface: en0, ipv4, dns)] unregister notification for write_timeout failed
2022-03-15 17:44:32.708969+0800 shootingRange[10321:2880612] RELEASING LOCKED RN EXCEPTION HANDLER
2022-03-15 17:44:32.710391+0800 shootingRange[10321:2882118] *** Terminating app due to uncaught exception 'NSGenericException', reason: '*** -[AVCaptureSession startRunning] startRunning may not be called between calls to beginConfiguration and commitConfiguration'
*** First throw call stack:
(0x1a4ff6dc0 0x1b9b537a8 0x1beffce50 0x102f295b4 0x102f2c7ac 0x10820c0b4 0x10820dde0 0x108221ffc 0x1082227ec 0x1f0dd6768 0x1f0ddd74c)
libc++abi: terminating with uncaught exception of type NSException
*** Terminating app due to uncaught exception 'NSGenericException', reason: '*** -[AVCaptureSession startRunning] startRunning may not be called between calls to beginConfiguration and commitConfiguration'
terminating with uncaught exception of type NSException

Core dependencies

  "dependencies": {
    "@api.video/react-native-livestream": "^0.2.1",
    "react-native": "~0.63.4",
    "@react-navigation/native": "~5.9.3",
    "@react-navigation/stack": "~5.14.3",
    ...

iOS: Live stream starts in mirrored mode.

To Reproduce
Steps to reproduce the behavior:

  1. Open the app.
  2. Start the live stream.
  3. See the camera view

Expected behavior
The stream should always display in non-mirrored mode.

Screenshots
See the attached video.

Smartphone (please complete the following information):

Device: iPhone 11 pro max

  • iPhone OS Version: 16.2
  • Xcode version 14.2
  • Mac OS: Monterey 12.6

Additional context
"@api.video/react-native-livestream": "v1.2.3"
"react": "16.13.1",
"react-native": "0.64.2",
"react-native-orientation-locker": "^1.4.0",
