fyhertz / libstreaming

A solution for streaming H.264, H.263, AMR, AAC using RTP on Android

License: Apache License 2.0

Java 100.00%
android rtsp-server rtp mediacodec aac amr java mediarecorder mediarecorder-api h264

Introduction

What it does

libstreaming is an API that allows you, with only a few lines of code, to stream the camera and/or microphone of an Android-powered device using RTP over UDP.

  • Android 4.0 or more recent is required.
  • Supported encoders include H.264, H.263, AAC and AMR.

The first step you need to take to start a streaming session with some peer is called 'signaling'. During this step you contact the receiver and send a description of the incoming streams. libstreaming gives you three ways to do that.

  • With the RTSP client: if you want to stream to a Wowza Media Server, this is the way to go. Example 3 illustrates that use case.
  • With the RTSP server: in that case the phone acts as an RTSP server and waits for an RTSP client to request a stream. This use case is illustrated in example 1.
  • Or you can use libstreaming without the RTSP protocol at all, and signal the session using SDP over a protocol of your choice. Example 2 illustrates that use case; a sample session description is shown below.
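
Whichever way you signal, what gets exchanged is an SDP session description (RFC 4566), which is also what getSessionDescription() (covered below) returns. For an H.264 + AAC session it looks roughly like this; all addresses, ports and parameters here are made-up placeholders, not values produced by the library:

    v=0
    o=- 0 0 IN IP4 192.168.1.10
    s=Unnamed
    c=IN IP4 192.168.1.20
    t=0 0
    m=video 5006 RTP/AVP 96
    a=rtpmap:96 H264/90000
    a=fmtp:96 packetization-mode=1;profile-level-id=42801e;sprop-parameter-sets=...
    m=audio 5004 RTP/AVP 96
    a=rtpmap:96 mpeg4-generic/16000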

The full javadoc documentation of the API is available here: http://guigui.us/libstreaming/doc

How does it work? You should really read this, it's important!

There are three ways on Android to get encoded data from the peripherals:

  • With the MediaRecorder API and a simple hack.
  • With the MediaCodec API and the buffer-to-buffer method which requires Android 4.1.
  • With the MediaCodec API and the surface-to-buffer method which requires Android 4.3.

Encoding with the MediaRecorder API

The MediaRecorder API was not intended for streaming applications but can be used to retrieve encoded data from the peripherals of the phone. The trick is to configure a MediaRecorder instance to write to a LocalSocket instead of a regular file (see MediaStream.java).

Edit: as of Android Lollipop, using a LocalSocket is no longer possible for security reasons, but using a ParcelFileDescriptor does the trick. More details in the file MediaStream.java! (Thanks to those guys for the insight.)
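
For illustration, here is a minimal sketch of that trick using a ParcelFileDescriptor pipe. This is not the library's actual code (see MediaStream.java for that), and the audio format below is an arbitrary choice:

    import android.media.MediaRecorder;
    import android.os.ParcelFileDescriptor;

    import java.io.IOException;
    import java.io.InputStream;

    public class RecorderPipeSketch {

        // Streams encoded audio from the microphone through a pipe instead of
        // a regular file, which is the essence of the MediaRecorder hack.
        public InputStream startRecorder() throws IOException {
            // createPipe() returns { read side, write side }
            ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();

            MediaRecorder recorder = new MediaRecorder();
            recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
            recorder.setOutputFormat(MediaRecorder.OutputFormat.AMR_NB);
            recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
            // The trick: hand the recorder the write end of the pipe instead
            // of a path to a regular file.
            recorder.setOutputFile(pipe[1].getFileDescriptor());
            recorder.prepare();
            recorder.start();

            // The encoded stream can now be read from the other end and
            // handed to a packetizer.
            return new ParcelFileDescriptor.AutoCloseInputStream(pipe[0]);
        }
    }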

This hack has some limitations:

  • Lip sync can be approximate.
  • The MediaRecorder internal buffers can introduce significant jitter. libstreaming tries to compensate for that jitter.

It's hard to tell how well this hack will work on a given phone, but it does work well on many devices.

Encoding with the MediaCodec API

The MediaCodec API does not present the limitations just mentioned, but it has its own issues. There are actually two ways to use the MediaCodec API: with buffers or with a surface.

The buffer-to-buffer method uses calls to dequeueInputBuffer and queueInputBuffer (http://developer.android.com/reference/android/media/MediaCodec.html#queueInputBuffer(int, int, int, long, int)) to feed the encoder with raw data. That seems easy, right? Well, it's not, because the video encoders you get access to with this API use different color formats and you need to support all of them. A list of those color formats is available here. Moreover, many encoders claim support for color formats they don't actually handle properly, or exhibit small glitches.
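
To make the buffer-to-buffer flow concrete, here is a minimal sketch. It is not the library's code; the resolution, bitrate and, above all, the color format are illustrative, and on a real device the color format must be one the encoder actually reports, which is exactly the problem described above:

    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;

    import java.io.IOException;
    import java.nio.ByteBuffer;

    public class BufferToBufferSketch {

        // Configures an H.264 encoder fed with raw buffers (Android 4.1+ style).
        public MediaCodec createEncoder() throws IOException {
            MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
            MediaFormat format = MediaFormat.createVideoFormat("video/avc", 320, 240);
            format.setInteger(MediaFormat.KEY_BIT_RATE, 500000);
            format.setInteger(MediaFormat.KEY_FRAME_RATE, 20);
            // Device dependent: the encoder may want semi-planar, planar, ...
            format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                    MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
            format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
            encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            encoder.start();
            return encoder;
        }

        // Feeds one raw YUV frame to the encoder and drains one encoded buffer.
        public void encodeFrame(MediaCodec encoder, byte[] frame, long timeUs) {
            int inIndex = encoder.dequeueInputBuffer(10000); // wait up to 10 ms
            if (inIndex >= 0) {
                ByteBuffer in = encoder.getInputBuffers()[inIndex];
                in.clear();
                in.put(frame);
                encoder.queueInputBuffer(inIndex, 0, frame.length, timeUs, 0);
            }

            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            int outIndex = encoder.dequeueOutputBuffer(info, 10000);
            if (outIndex >= 0) {
                ByteBuffer out = encoder.getOutputBuffers()[outIndex];
                // bytes [info.offset, info.offset + info.size) go to the packetizer
                encoder.releaseOutputBuffer(outIndex, false);
            }
        }
    }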

The whole hw package is dedicated to solving those issues. See in particular the EncoderDebugger class.

If streaming with that API fails, libstreaming falls back on streaming with the MediaRecorder API.

The surface-to-buffer method uses the createInputSurface() method. This is probably the best way to encode raw video from the camera, but it requires Android 4.3 and up.

The gl package is dedicated to using the MediaCodec API with a surface.

It is not yet enabled by default in libstreaming, but you can force it with the setStreamingMethod(byte) method. The sketch below shows the basic setup.
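
Again a sketch with illustrative parameters, not the library's actual gl code:

    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import android.view.Surface;

    import java.io.IOException;

    public class SurfaceToBufferSketch {

        // Android 4.3+: the encoder provides an input Surface, and anything
        // rendered into it is encoded, with no manual color format conversion.
        public Surface startEncoder() throws IOException {
            MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
            MediaFormat format = MediaFormat.createVideoFormat("video/avc", 320, 240);
            format.setInteger(MediaFormat.KEY_BIT_RATE, 500000);
            format.setInteger(MediaFormat.KEY_FRAME_RATE, 20);
            format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                    MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
            format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
            encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            // Must be called after configure() and before start().
            Surface inputSurface = encoder.createInputSurface();
            encoder.start();
            // Render camera frames into inputSurface with EGL/GLES, then drain
            // the encoder's output buffers as in the buffer-to-buffer case.
            return inputSurface;
        }
    }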

Packetization process

Once raw data from the peripherals has been encoded, it is encapsulated in a proper RTP stream. The packetization algorithm to use depends on the format of the data (H.264, H.263, AMR or AAC); each is specified in its respective RFC:

  • RFC 3984 for H.264: H264Packetizer.java
  • RFC 4629 for H.263: H263Packetizer.java
  • RFC 3267 for AMR: AMRNBPacketizer.java
  • RFC 3640 for AAC: AACADTSPacketizer.java or AACLATMPacketizer.java

If you are looking for a basic implementation of one of the RFCs mentioned above, check the source of the corresponding class.

Since version 2.0 of libstreaming, RTCP packets are also sent to the receiver. Only Sender Reports are implemented; they are needed for lip sync.

The rtp package handles the packetization of encoded data into RTP packets.
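
All of those packetizers prepend the fixed 12-byte RTP header of RFC 3550. A minimal sketch of building one packet follows; the values are illustrative, and in practice the rtp package takes care of this, along with sequencing and timestamping:

    import java.nio.ByteBuffer;

    public class RtpHeaderSketch {

        // Builds one RTP packet: the fixed 12-byte header (RFC 3550) followed
        // by the payload produced by one of the packetizers above.
        public static byte[] buildPacket(byte[] payload, int payloadType,
                                         int sequenceNumber, int timestamp, int ssrc) {
            // ByteBuffer defaults to big endian, i.e. network byte order.
            ByteBuffer packet = ByteBuffer.allocate(12 + payload.length);
            packet.put((byte) 0x80);                 // V=2, no padding, no extension, no CSRC
            packet.put((byte) (payloadType & 0x7F)); // marker=0, payload type (e.g. 96, dynamic)
            packet.putShort((short) sequenceNumber); // incremented by one for each packet
            packet.putInt(timestamp);                // in units of the payload clock (90 kHz for video)
            packet.putInt(ssrc);                     // identifies the source of the stream
            packet.put(payload);                     // e.g. an H.264 NAL unit or a fragment of one
            return packet.array();
        }
    }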

Using libstreaming in your app

Required permissions

<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.CAMERA" />

How to stream H.264 and AAC

This example is extracted from this simple Android app. It could be part of an Activity, a Fragment or a Service.

    protected void onCreate(Bundle savedInstanceState) {

        ...

        mSession = SessionBuilder.getInstance()
                .setCallback(this)
                .setSurfaceView(mSurfaceView)
                .setPreviewOrientation(90)
                .setContext(getApplicationContext())
                .setAudioEncoder(SessionBuilder.AUDIO_NONE)
                .setAudioQuality(new AudioQuality(16000, 32000))
                .setVideoEncoder(SessionBuilder.VIDEO_H264)
                .setVideoQuality(new VideoQuality(320, 240, 20, 500000))
                .build();

        mSurfaceView.getHolder().addCallback(this);

        ...

    }

    public void onPreviewStarted() {
        Log.d(TAG, "Preview started.");
    }

    @Override
    public void onSessionConfigured() {
        Log.d(TAG, "Preview configured.");
        // Once the stream is configured, you can get an SDP-formatted session
        // description that you can send to the receiver of the stream.
        // For example, to receive the stream in VLC, store the session
        // description in a .sdp file and open it with VLC while streaming.
        Log.d(TAG, mSession.getSessionDescription());
        mSession.start();
    }

    @Override
    public void onSessionStarted() {
        Log.d(TAG, "Streaming session started.");
        ...
    }

    @Override
    public void onSessionStopped() {
        Log.d(TAG, "Streaming session stopped.");
        ...
    }

    @Override
    public void onBitrateUpdate(long bitrate) {
        // Informs you of the bandwidth consumption of the streams
        Log.d(TAG, "Bitrate: " + bitrate);
    }

    @Override
    public void onSessionError(int message, int streamType, Exception e) {
        // Might happen if streaming at the requested resolution is not supported
        // or if the preview surface is not ready...
        // Check the Session class for a list of the possible errors.
        Log.e(TAG, "An error occurred", e);
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {

    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        // Starts the preview of the Camera
        mSession.startPreview();
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        // Stops the streaming session
        mSession.stop();
    }

The SessionBuilder simply facilitates the creation of Session objects. The call to setSurfaceView is needed for video streaming; that should not come as a surprise, since Android requires a valid surface for recording video (an annoying limitation of the MediaRecorder API). On Android 4.3, streaming with no SurfaceView is possible but not yet implemented. The call to setContext(Context) is necessary: it allows H264Stream and AACStream objects to store and recover data using SharedPreferences.

A Session object represents a streaming session to some peer. It contains one or more Stream objects that are started (resp. stopped) when the start() (resp. stop()) method is invoked.

The method getSessionDescription() returns an SDP description of the session in the form of a String. Before calling it, you must make sure that the Session has been configured. After calling configure() or startPreview() on your Session instance, the callback onSessionConfigured() will be called.

In the example presented above, the Session instance is used in an asynchronous manner, and calls to its methods do not block; the callbacks tell you when an operation has completed.

You can also use a Session object in a synchronous manner, like this:

    // Blocks until all the streams are configured
    try {
         mSession.syncConfigure();
    } catch (Exception e) {
         ...
    }
    String sdp = mSession.getSessionDescription();
    ...
    // Blocks until streaming actually starts.
    try {
         mSession.syncStart();
    } catch (Exception e) {
         ...
    }
    ...
    mSession.syncStop();

How to use the RTSP client

Check out this page of the wiki and the example 3.

How to use the RTSP server

Add this to your manifest:

<service android:name="net.majorkernelpanic.streaming.rtsp.RtspServer"/>

If you decide to override RtspServer, change the line above accordingly.

You can change the port used by the RtspServer:

Editor editor = PreferenceManager.getDefaultSharedPreferences(this).edit();
editor.putString(RtspServer.KEY_PORT, String.valueOf(1234));
editor.commit();

The port is indeed stored as a String in the preferences, and there is a good reason for that: the EditTextPreference object saves its input as a String and cannot easily be configured to store it as an Integer (one would need to override it).
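
Reading the port back therefore requires parsing the String; a small sketch (the default value below is an assumption, check RtspServer for the actual default port):

SharedPreferences prefs = PreferenceManager.getDefaultSharedPreferences(this);
// The port was stored as a String, so parse it back to an int.
int port = Integer.parseInt(prefs.getString(RtspServer.KEY_PORT, "8086"));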

Configure its behavior with the SessionBuilder:

SessionBuilder.getInstance()
        .setSurfaceHolder(mSurfaceView.getHolder())
        .setContext(getApplicationContext())
        .setAudioEncoder(SessionBuilder.AUDIO_AAC)
        .setVideoEncoder(SessionBuilder.VIDEO_H264);

Start and stop the server like this:

// Starts the RTSP server
context.startService(new Intent(context, RtspServer.class));
// Stops the RTSP server
context.stopService(new Intent(context, RtspServer.class));

Spydroid-ipcamera

Visit this GitHub page to see how this streaming stack can be used and how it performs.

Contributors

bilthon, brunosiqueira, kentvu, pnemonic78, serpro


libstreaming's Issues

Green blocky video bug in streaming

I'm trying to stream with the following qualities:

"320x180, 30 fps, 250 Kbps";
"320x240, 30 fps, 250 Kbps";
"320x240, 25 fps, 600 Kbps";
"640x360, 30 fps, 600 Kbps";
"640x480, 30 fps, 600 Kbps";
"320x240, 30 fps, 300 Kbps";

but the output on Wowza is always blocky, and at some resolutions it is just a green screen. What are the best parameters for decent streaming?

Stream video is rotated

The video stream is always rotated by 90 degrees, even when using the MediaCodec API (I know that it is impossible to set the correct rotation with the MediaRecorder API). Is there any way to rotate it?

Record stream on Wowza into video file

Thanks for this great library and example apps! :)

I want to record the streams sent to Wowza (v3.6.3) using your example 3 app. I have followed these instructions:

https://github.com/fyhertz/libstreaming/wiki/Using-libstreaming-with-Wowza-Media-Server

But I changed to rtp-live-record in live/Application.xml, which seems to be the documented way to record streams. The app connects to Wowza fine and I get connect rtsp, create stream, unpublish stream, destroy stream and disconnect rtsp messages in the logs, but no video file appears. If I add another folder to the rtsp path, it creates a folder of that name in the applications/live/sharedobjects folder, but that is it, no data.

I have tried changing many settings in Wowza and in your app but can't get the video data to be saved. I had to disable digest authentication to get the streams working, but I can't see why that would cause the video files not to save. Do you have any idea what I need to do to save the streams as flv or mp4 files?

Any help you can give will be greatly appreciated!

Many thanks :)

Decoder buffer not big enough, decoder did not decode anything

Hi, I followed the instructions in Section 3, Creating Android Project, to create a new project using libstreaming as a library:

http://www.androidhive.info/2014/06/android-streaming-live-camera-video-to-web-page/

Then I followed example2 to try to stream H.264 without RTSP. I deploy this app to my Galaxy Nexus and it fails with "The decoder did not decode anything." This is using VideoQuality.DEFAULT_VIDEO_QUALITY. If I use a larger resolution (640x480, supported according to a log entry from VideoQuality), the error changes to "The decoder input buffer is not big enough (nal=91280, capacity=65536)." Every combination of resolution, fps, and bitrate I've tried results in one of these two errors.

I've been struggling for days and cannot get libstreaming to work for me. Where do these errors come from and what do I need to look at to get them resolved?

Question also asked here, with code and log outputs: http://stackoverflow.com/questions/24128279/libstreaming-errors-decoder-buffer-not-big-enough-decoder-did-not-decode-anyth

(Somewhat unrelated, but I had to deviate slightly from example2: I'm building the session in a button click, not in onCreate(), because findViewById() was always returning null for my SurfaceView if I called it from onCreate().)

Thanks!

Video stream orientation on the receiver side

I've been trying to stream videos to Wowza in portrait mode. Setting the preview orientation doesn't seem to have any effect. Is there any correction available to make it work?

Save stream as playable file

Is there a way to save an RTSP stream as a playable file (i.e. mp4)?
I want to separate a stream into different files; that's why I'm not using Android's MediaRecorder directly.

Thanks.

example3, encodeWithMediaCodecMethod2: createInputSurface() returns null

I tried encodeWithMediaCodecMethod2 on an HTC M8 device, but it failed when I clicked record, and I cannot find the reason.
The configuration is: H264, 176x144.
The detailed log follows:

07-02 11:37:10.062: I/RtspClient(8391): RECORD rtsp://202.153.207.34:1935/jacklive/test.stream RTSP/1.0
07-02 11:37:10.062: I/RtspClient(8391): Range: npt=0.000-
07-02 11:37:10.062: I/RtspClient(8391): CSeq: 5
07-02 11:37:10.062: I/RtspClient(8391): Content-Length: 0
07-02 11:37:10.062: I/RtspClient(8391): Session: 925019696
07-02 11:37:10.062: I/RtspClient(8391): Authorization: Digest username="hodo_jack",realm="Streaming Server",nonce="6a43415836208e65df339760e13446fe",uri="rtsp://202.153.207.34:1935/jacklive/test.stream",response="5f19c9b02e323cedcf116c6c388a1a30"
07-02 11:37:10.062: I/RtspClient(8391):
07-02 11:37:10.082: D/RtspClient(8391): Response from server line=RTSP/1.0 200 OK
07-02 11:37:10.082: D/RtspClient(8391): Response from server line=Range: npt=now-
07-02 11:37:10.082: D/RtspClient(8391): Response from server line=Session: 925019696;timeout=60
07-02 11:37:10.082: D/RtspClient(8391): Response from server line=Cseq: 5
07-02 11:37:10.082: D/RtspClient(8391): Response from server line=Server: Wowza Streaming Engine 4.0.4 build11775
07-02 11:37:10.082: D/RtspClient(8391): Response from server line=Cache-Control: no-cache
07-02 11:37:10.082: D/RtspClient(8391): Response from server line=
07-02 11:37:10.082: D/RtspClient(8391): Response from server: 200
07-02 11:37:10.082: D/libc(8391): [NET] getaddrinfo+,hn 14(0x3230322e313533),sn(),family 0,flags 4
07-02 11:37:10.082: D/libc(8391): [NET] getaddrinfo-, SUCCESS
07-02 11:37:10.082: D/jack-h264stream(8391): enter H264stream.configure
07-02 11:37:10.092: D/jack-videostream(8391): createCamera, camera preview size:176 x 144
07-02 11:37:10.092: D/jack-h264stream(8391): testMediaCodecAPI enter updateCamera
07-02 11:37:10.092: D/jack-videostream(8391): enter updateCamera, mUpdated=true,mQuality=176 x 144
07-02 11:37:10.092: D/MP4Config(8391): PPS: aM4G4g==
07-02 11:37:10.092: D/MP4Config(8391): SPS: Z0KAFNoLE6AbQoTU
07-02 11:37:10.092: D/VideoStream(8391): Video encoded using the MediaCodec API with a surface
07-02 11:37:10.092: D/jack-videostream(8391): createCamera, camera preview size:176 x 144
07-02 11:37:10.092: D/jack-videostream(8391): encodeWithMediaCodecMethod2 enter updateCamera
07-02 11:37:10.092: D/jack-videostream(8391): enter updateCamera, mUpdated=true,mQuality=176 x 144
07-02 11:37:10.922: D/VideoStream(8391): Actual framerate: 25
07-02 11:37:11.042: E/ACodec(8391): [OMX.qcom.video.encoder.avc] storeMetaDataInBuffers (output) failed w/ err -2147483648
07-02 11:37:11.042: I/RtspClient(8391): TEARDOWN rtsp://202.153.207.34:1935/jacklive/test.stream RTSP/1.0
07-02 11:37:11.062: W/System.err(8391): java.lang.NullPointerException
07-02 11:37:11.062: W/System.err(8391): at net.majorkernelpanic.streaming.gl.SurfaceManager.(SurfaceManager.java:70)
07-02 11:37:11.062: W/System.err(8391): at net.majorkernelpanic.streaming.gl.SurfaceView.addMediaCodecSurface(SurfaceView.java:105)
07-02 11:37:11.062: W/System.err(8391): at net.majorkernelpanic.streaming.video.VideoStream.encodeWithMediaCodecMethod2(VideoStream.java:524)
07-02 11:37:11.072: W/System.err(8391): at net.majorkernelpanic.streaming.video.VideoStream.encodeWithMediaCodec(VideoStream.java:400)
07-02 11:37:11.072: W/System.err(8391): at net.majorkernelpanic.streaming.MediaStream.start(MediaStream.java:248)
07-02 11:37:11.072: W/System.err(8391): at net.majorkernelpanic.streaming.video.VideoStream.start(VideoStream.java:279)
07-02 11:37:11.072: W/System.err(8391): at net.majorkernelpanic.streaming.video.H264Stream.start(H264Stream.java:100)
07-02 11:37:11.072: W/System.err(8391): at net.majorkernelpanic.streaming.Session.syncStart(Session.java:456)
07-02 11:37:11.072: W/System.err(8391): at net.majorkernelpanic.streaming.Session.syncStart(Session.java:501)
07-02 11:37:11.072: W/System.err(8391): at net.majorkernelpanic.streaming.rtsp.RtspClient$4.run(RtspClient.java:258)
07-02 11:37:11.072: W/System.err(8391): at android.os.Handler.handleCallback(Handler.java:733)
07-02 11:37:11.072: W/System.err(8391): at android.os.Handler.dispatchMessage(Handler.java:95)
07-02 11:37:11.072: W/System.err(8391): at android.os.Looper.loop(Looper.java:157)
07-02 11:37:11.072: W/System.err(8391): at android.os.HandlerThread.run(HandlerThread.java:61)

Connection lost by timeout!

Hi! I'm using your library to stream to a Wowza server.

I have problems connecting to Wowza:
2013/11/19 21:13:57 background: Connecting to wowza server
2013/11/19 21:13:57 background: Connecting to 10.0.1.8:1935/live/test.stream
2013/11/19 21:13:57 background: quality set...
2013/11/19 21:13:57 background: credentials set. username: admin; password: admin
2013/11/19 21:13:57 background: server address set
2013/11/19 21:13:57 background: path set. Starting stream...
2013/11/19 21:15:28 background: starting stream failed: Connection lost
2013/11/19 21:15:28 background: starting stream failed: java.net.SocketException: Connection lost

The problem occurs on this line:
mClient.startStream(1);

The Wowza server is reachable (I can ping it) and ip_address:1935 is visible from the phone.

Do you have any workarounds?

Stream without preview

Hi,
first of all I'd like to say that your library is awesome and helped me a lot. This is not really an issue, but since Android 4 you can provide a SurfaceTexture instead of a SurfaceView as the preview. A big advantage of this method is that you don't have to add the SurfaceTexture to an activity for it to be a valid preview. I tried it by just replacing every call to setSurfaceView with setSurfaceTexture, and it works perfectly on my Nexus 4. I'm actually not sure whether this is already supported by your library, as I did not review the complete code. I just thought this might be interesting in case you didn't know.

Best

Delays

Hey,

This is an excellent library and it was really easy to setup and work with.

I have this small problem: when connecting using VLC there is a delay in the video (up to 2 seconds).

  1. The Android device and the PC are on the same Wi-Fi network.
  2. I tried both H.263 and H.264, with and without audio.
  3. Frame rate set to 15.
  4. Resolution 320x240.
  5. Tested with a Galaxy S2 / S3.

Ideas?

Preview Color

I am having a color problem in the preview (the streaming is fine). The displayed colors are wrong; it looks like the red and blue values are inverted. Once in a while the colors are correct (maybe 10% of the time). Should I look at how I am creating the rendering surface?

Streaming and playing RTSP

I am developing a simple app that lets you record video/audio using this library and stream it over the network. The app has basically two SurfaceViews (one for recording/preview and the other for playback). What I am doing is as follows:

  1. Start RtspServer service
  2. Create SessionBuilder instance and configure it.
  3. Create a session using SessionBuilder and start it (for preview and recording)
  4. Create a MediaPlayer passing the url "rtsp://127.0.0.1:8086". Set the second SurfaceView for display and start.
  5. A connection is made to the RtspServer, but when the "SETUP" request is sent, the RtspServer start()s the session created by the same SessionBuilder (as of step 2) during the DESCRIBE request. It throws an error in play().

Kindly let me know a simple way of achieving recording and playback on the same device using RTSP. I checked the spydroid camera, but its code is currently too complex for me to understand, as I'm short of time.

VLC not receiving any audio packets

Hey, I was trying the example 2 application that uses this library, with VLC on the other end as a client playing via the sdp file. VLC recognises the audio track, but it's not receiving any audio blocks to decode. I guess something is not right with the audio streaming package; the spydroid app also had the same problem. I tried with both AAC and AMR, to no avail.

Time is going backwards

Hello,

I am attempting to stream via RTSP from an Android device to another Android device. However, I have encountered the following problem: on certain wireless networks, I receive this error:
W/SystemClock﹕ time going backwards: prev 1895085352768999(ioctl) vs now 1895085352738482(ioctl), tid=31682

I have no idea what to do about this, and I'm using example 1 as the backbone for the streaming service.

Preview doesn't work when the view is recreated

Both preview and streaming are working fine for me with libstreaming until the view that I'm using for it is recreated, either by starting the activity a second time, or by tilting the phone so that it changes between landscape and portrait.

The output shows the message "sending message to a handler on a dead thread", and it's caused in the Session.startPreview() method.

Both example2 and example3 downloaded here have the same problem when I modify them to allow screen rotation.

Anyone else who has/had this problem?

Audio/video synchronization

Hello,

I'm having problems with audio/video synchronization. I debugged it in multiple ways and I still can't pinpoint the problem.

First of all, I want to mention that the demo application (spydroid) also has the delay between audio and video, i.e. the video is about 1.2 seconds late. The problem appears both when using spydroid as an RTSP server and when using the library to stream to a Wowza server and then connecting with a player to the Wowza stream.

AFAICT the packets sent over the RTCP protocol should solve any sync problems. The packets are sent by the phone and received by the player (I tried with VLC and a Flash player). It would help me a lot if you gave me some debugging ideas, or if you could tell me whether it works on your end.

I want to also mention that I tested on a Nexus 4 and a Samsung Galaxy S3.

Thank you

Example 2

Running Example 2 on a Samsung Galaxy SII running Android 4.0.4:

After running example2 a few times, it operates normally right after clicking the start button. But when I click the stop button, about 3/4 of the screen shows a roughly one-second loop of the last few frames, while the remaining portion is a live view of the camera preview, but with a bad color format.

How to capture an image

I want to capture an image from the camera while streaming is running. How can I do it with libstreaming?

Blank preview screen when app is re-launched

It seems from the examples that launching the app the first time works well.
After closing and relaunching it, the video preview is black and doesn't start until the start button is pressed (before, the preview started automatically when the app was launched).

Hack to stream audio from microphone

Hi, I'm trying to stream the audio from the microphone in AAC-ADTS format.
I have the following:

server = new LocalServerSocket(LOCAL_SOCKET_ADDRESS);
sender = new LocalSocket();
sender.connect(new LocalSocketAddress(LOCAL_SOCKET_ADDRESS));

and my MediaRecorder records into it:

myAudioRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
myAudioRecorder.setOutputFormat(MediaRecorder.OutputFormat.AAC_ADTS);
myAudioRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
myAudioRecorder.setOutputFile(sender.getFileDescriptor());

I also have a LocalSocket receiver:

receiver = server.accept();

I also have the code to build the Session and Start the server:

SessionBuilder.getInstance()
.setContext(getApplication())
.setAudioEncoder(SessionBuilder.AUDIO_AAC)
.setAudioQuality(new AudioQuality(44100,128000))
.setVideoEncoder(SessionBuilder.VIDEO_NONE)
.build();
this.startService(new Intent(this, RtspServer.class));

I am checking the code of the libstreaming library, but I can't find where I can connect the stream coming from this local socket to the one that goes out as a stream.

Can you please help me with how to do this hack to stream the audio?

Thank You!
Gabriel.

Video Recording Inconsistencies & Playback Problems

I am currently testing out the libstreaming library for use in recording videos to Wowza in the same way that the Example 3 works.

The recording of the videos works great, but the final video output seems to have some inconsistencies.

80% of the time the videos are recorded and play back correctly. The other 20% of the time, there are 8-10 seconds at the beginning of the video that don't have any metadata associated with them. When I play any of those videos in VLC, they play correctly but do not show any progress information or duration. When I try to play them back in Android using a simple VideoView, the video does not play at all.

What might cause this to happen?

Here is an example of a video that was recorded, and shows the issue that occurs: http://sharesend.com/wuk0lm1m

Here is an example of a video that I recorded a few seconds later that does not have any problems: http://sharesend.com/xbaks1fc

Also, here is the full implementation of the libstreaming library in my VideoRecorder class: http://sharesend.com/9pbac4nn

Finally, the device logs during playback: http://sharesend.com/am45eh1e

If I can provide any more info, just let me know.

Any help would be appreciated.

RTSP server using 3G and VLC

Hi!

I am using this library to stream from my Android phone to VLC.

On a local network, I can watch the stream in VLC fine, but when I disable Wi-Fi and use 3G/LTE, I cannot.

I am using example 1 of libstreaming-examples.

Is it possible? And how? Thanks!

TCP support

I'm trying Example 3 with Wowza.
All works fine on the local network, but I have a problem if the smartphone tries to connect to Wowza via the server's public IP address.
I have set up port forwarding for my Wowza port, and when I click the start button, streaming starts, and in the Wowza manager I can see the connection in incoming streams.

The problem is that Wowza does not receive packets from the Android client.
The Network Throughput section shows:
bytes in 0, bytes out 0
and of course no video is shown.
From the smartphone I can ping my Wowza server's public IP; port 1935 is open and forwarded to the local address of the server. What am I missing? Must I open and forward some other ports?

Smartphone: Android 4.1.2.
Wowza server on Windows 7 (firewall disabled).

MediaRecorder Start failed: -38

Using example 1, if I set:

.setAudioEncoder(SessionBuilder.AUDIO_AMRNB)
.setVideoEncoder(SessionBuilder.VIDEO_NONE);

How can I stream audio only with no video?

Can't seem to access the stream

Hello,

This is really a question about functionality rather than an "issue".

I just started investigating the API after a few failed attempts to build something similar from scratch. My intention is to stream the camera preview from one Android device to another Android device with WifiP2pDirect. Any thoughts on the ability to do that? I started building my own activity which handles the p2p and will include the libstreaming code. Before I got in over my head, I thought I'd check out the examples.

Based on my needs, I thought example 2 would be the best route. I'm a little confused about which control I should send the stream to, though. I know a SurfaceView is required for the camera, but what about the other end: how do I pipe the stream into the client control? I'm trying to run through the source, but there's a lot of it and I'd prefer not to waste too much time on an incompatible solution. The distance between the devices will be short, and the stream will have to be as close to real time as possible.

Thanks for any help.

TCP Streaming

I am having an issue with TCP. It stops sending packets after a few minutes. If I stream to Wowza, I can see that the connection is still open, but the data rate goes to zero and the app stops updating the debug console with sending messages (it only shows bitrate updates). Any ideas?

Streaming in the background

Hey!

I want to stream in the background, as a service, with this awesome library. Is it possible, and if so, how should I do it?

Thanks, Jonas

Fatal signal 11 (SIGSEGV) at 0x00000000 (code=1)

Hi,
I am using your library and my app has crashed. This is the error: http://pastebin.com/9Zdxgs3G
My device is a rooted Asus Transformer TF101 tablet with a custom 4.3 ROM. The app crashed when the SurfaceView was instantiated. I must mention that this error appears only on tablets; on phones it works well, but I don't have a phone with Android 4.3 for testing.

February example3 working with an old libstreaming export: no callback buffer error

In libstreaming-example-master of February 21st, 2014, the example works correctly with the libstreaming-4.0.jar in the libs directory. If I try the same example code with the latest libstreaming-master.zip code, I always get 'Callback buffer was too small!'.
I need to use libstreaming-master.zip because of some edits I made to it, so I can't use the old .jar file in my project.

I really don't know where the issue is...

Camera cannot autofocus, and SurfaceView does not start when the activity is entered a second time

Although I have added:
< uses-feature android:name="android.hardware.camera.autofocus" / >

The camera still cannot autofocus.
How do I fix it?

camera.getParameters().setFocusMode(android.hardware.Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
may have to be set, but I do not know how to access the camera instance of libstreaming.

Is it possible to control the parameters of the camera myself?


In onDestroy of the activity,
super.onDestroy();
mClient.release();
mSession.release();
have been added.
The rest of the code is almost exactly the same as example 3.

But when I enter this activity again, the surface view does not show anything.
(I have checked and confirmed that onDestroy runs when I go back to the previous activity.)
What's wrong?

How to stream audio and video with the MediaRecorder

I am having issues streaming audio and video with the MediaRecorder option; MediaCodec works fine. The issue I see is that both AudioStream and VideoStream try to open and start a MediaRecorder. How would that work, since only one instance of MediaRecorder can run at a time?

.setAudioEncoder(SessionBuilder.AUDIO_AMRNB)
.setVideoEncoder(SessionBuilder.VIDEO_H264);

Your response would be greatly appreciated.

Weird problems in received video

We have incorporated your library into our app to stream a short video to our Wowza server, similar to your example 3. This is working, but we consistently get videos like the following:

https://www.dropbox.com/s/w3hcblfctmkko3p/libstreaming.mp4

The 3 problems we are consistently getting are the following:

  • The first second of video doesn't change
  • There are strange glitches in some frames
  • The quality starts very bad but gets very good by the end

I can see that the kbps is variable, and we need it to be fixed. For us, it is much more important that the entire video is good quality than that it keeps up with real time. Is it possible to fix the video quality and buffer if it can't keep up with the stream?

Any idea why some of the frames have glitches? It is very important that we get a good quality video.

We have tested this on a Samsung Galaxy S3 and Nexus 4.

Many thanks.

Audio-Video Sync

Hi..

I am using Example 3 to stream from Nexus 7 to Wowza Server. I am using MediaCodec for encoding audio and video. I am using AUDIO_AAC for audio.

Initially audio and video are in sync, but over time the video lags the audio and the lag increases.
For example, after 20 min the video lags the audio by 4 sec.

What is the issue? Can anyone help me sync the audio and video?

Thanks
Alan

Using libstreaming

Hi.
I'm new to Android, but I'm trying to make an application that streams the audio from the mic over RTSP (part of a school project). Somehow I'm not able to figure out how to make it work.

In my onResume() I have:

    Editor editor = PreferenceManager.getDefaultSharedPreferences(this).edit();
    editor.putString(RtspServer.KEY_PORT, String.valueOf(myPort));
    editor.commit();

    SessionBuilder.getInstance()
    .setSurfaceView(mSurfaceView)
    .setPreviewOrientation(90)
    .setContext(getApplicationContext())
    .setAudioEncoder(SessionBuilder.AUDIO_AAC)
    .setAudioQuality(new AudioQuality(44100,12800))
    .setVideoEncoder(SessionBuilder.VIDEO_NONE)
    .build();

    this.startService(new Intent(this,RtspServer.class)); 

But how do I get the audio from the mic?

I've read that it needs a "hack" to make MediaRecorder write to a LocalSocket, but how should I do it? And how do I set it as the source to stream?

Thanks for your response and help.

Auto Focus

How can we add autofocus to the SurfaceView while recording the stream?

example3: MediaCodec.dequeueInputBuffer fails at 640x480

I tried to encode at 640x480, but dequeueInputBuffer always fails, so it falls back to the MediaRecorder method, whose quality is poor.
My questions are:

  1. Can MediaCodec support 640x480, or is it just impossible?
  2. How can I improve the MediaRecorder encoding quality (skipped frames, not stable)?

I really like this project and it is awesome!

Wrong minimum version in manifest

The manifest has minSdkVersion set to 9, but when I try to build it against version 15, it fails (android.media.MediaCodec wasn't added until 16).

What's the actual minimum version? Any chance it could be made to work for 15?

Thanks.

Commercial License

I'm interested in commercially licensing this library. I've reached out with the email that you provided, but haven't heard back. Can you provide a contact email address for discussing this?

Thanks,
Adam

No Voice

I can't get voice working while streaming to Wowza. I am using the AAC codec. Here are some interesting bits of the Wowza log:

INFO rtsp announce 535626079 -
INFO server comment - RTPUDPTransport.bind[53158ecb69702d46f11e0000/_definst_]: 0.0.0.0/0.0.0.0:6970
INFO server comment - RTPUDPTransport.bind[53158ecb69702d46f11e0000/_definst_]: 0.0.0.0/0.0.0.0:6971
INFO server comment - RTPUDPTransport.bind[53158ecb69702d46f11e0000/_definst_]: 0.0.0.0/0.0.0.0:6972
INFO server comment - RTPUDPTransport.bind[53158ecb69702d46f11e0000/_definst_]: 0.0.0.0/0.0.0.0:6973

Wowza is successfully bound to 4 ports as a result of the RTSP setup.

then

INFO server comment - UDPTransport.firstPacket: bind:0.0.0.0/0.0.0.0:6972 msg:/192.168.1.2:49585
INFO server comment - RTCPHandler.sendFirstRTCPRR[1349905744,6971,/192.168.1.2:34097]
INFO server comment - UDPTransport.firstPacket: bind:0.0.0.0/0.0.0.0:6971 msg:/192.168.1.2:34097
INFO server comment - RTCPHandler.sendFirstRTCPRR[1475757431,6973,/192.168.1.2:57219]
INFO server comment - UDPTransport.firstPacket: bind:0.0.0.0/0.0.0.0:6973 msg:/192.168.1.2:57219
INFO server comment - LiveStreamPacketizerSanJose.init[53158ecb69702d46f11e0000/_definst_/53158f0069702d46f1200000.sdp]: chunkDurationTarget: 10000
INFO server comment - LiveStreamPacketizerSanJose.init[53158ecb69702d46f11e0000/_definst_/53158f0069702d46f1200000.sdp]: chunkDurationTolerance: 500
INFO server comment - LiveStreamPacketizerSanJose.init[53158ecb69702d46f11e0000/_definst_/53158f0069702d46f1200000.sdp]: playlistChunkCount:4
INFO server comment - MediaStreamMap.getLiveStreamPacketizer[53158ecb69702d46f11e0000/_definst_/53158f0069702d46f1200000.sdp]: Create live stream packetizer: sanjosestreamingpacketizer:53158f0069702d46f1200000.sdp
INFO server comment - SanJosePacketHandler.startStream[53158ecb69702d46f11e0000/_definst_/53158f0069702d46f1200000.sdp]
INFO server comment - LiveStreamPacketizerSanJose.handlePacket[53158ecb69702d46f11e0000/_definst_/53158f0069702d46f1200000.sdp]: Video codec: H264
INFO stream play 53158f0069702d46f1200000.sdp -
INFO server comment - LiveStreamPacketizerSanJose.handlePacket[53158ecb69702d46f11e0000/_definst_/53158f0069702d46f1200000.sdp]: Audio codec: AAC
INFO server comment - UDPTransport.firstPacket: bind:0.0.0.0/0.0.0.0:6970 msg:/192.168.1.2:52789

Wowza recognizes the H264 and AAC codecs correctly, but in the player I am getting only video. Actually, it only happens on some devices.

Any ideas what might go wrong with the sound?

Reading the stream in Android MediaPlayer doesn't work

Hi,
I tried to read the stream on another phone using the MediaPlayer API:

  • Using h263, the MediaPlayer stops with an unknown error code (1).

  • Using h264, I fixed some warnings (like the port number that has to be even, and the source declaration) but couldn't get it to work. The stream starts and I get a buffering event, but the buffer never gets anything in it. And I see nothing bad in the log apart from this:

    I/MyHandler( 174): dropping damaged access unit.
    I/MyHandler( 174): dropping damaged access unit.
    I/MyHandler( 174): dropping damaged access unit.
    I/MyHandler( 174): dropping damaged access unit.
    I/MyHandler( 174): dropping damaged access unit.

which happens a lot, but diving into the Android source code for this error did not ring any bells for me...
I don't know if you are aware of this issue or if you have any ideas that could help me...

Regards,
Martin
