
Agora Android 1-to-1 Tutorial

This tutorial enables you to quickly get started in your development efforts to create an Android app with real-time video calls, voice calls, and interactive broadcasting. With this sample app you can:

  • Join and leave a channel.
  • Mute and unmute audio.
  • Enable or disable video.
  • Choose between the front or rear camera.

A more complete demonstration app can be found here.

Prerequisites

  • Android Studio 2.0 or above.
  • Android device (e.g. Nexus 5X). A real device is recommended because some simulators have missing functionality or lack the performance necessary to run the sample.

Quick Start

This section shows you how to prepare, build, and run the sample application.

Create an Account and Obtain an App ID

In order to build and run the sample application you must obtain an App ID:

  1. Create a developer account at agora.io. Once you finish the signup process, you are redirected to the Dashboard.
  2. In the Dashboard tree on the left, navigate to Projects > Project List and copy the App ID of your project.
  3. Open app/src/main/res/values/strings.xml and replace <#YOUR APP ID#> with your App ID:
<string name="agora_app_id"><#YOUR APP ID#></string>

Integrate the Agora Video SDK into the sample project

The SDK must be integrated into the sample project before it can be opened and built. There are two ways to integrate the Agora Video SDK into the sample project: the first uses JCenter to fetch the SDK automatically; the second requires you to copy the SDK files into the project manually.

Method 1 - Integrate the SDK Automatically Using JCenter (Recommended)

  1. Clone this repository.
  2. Open app/build.gradle and add the following line to the dependencies list:
...
dependencies {
    ...
    compile 'io.agora.rtc:full-sdk:2.1.0' 
}

Method 2 - Manually copy the SDK files

  1. Clone this repository.
  2. Download the Agora Video SDK from Agora.io SDK.
  3. Unzip the downloaded SDK package.
  4. Copy the .jar file from the libs folder of the downloaded SDK package to the /app/libs folder of the sample application.
  5. Copy the .so files from the arm64-v8a folder of the downloaded SDK package to the /app/src/main/jniLibs/arm64-v8a folder of the sample application.
  6. Copy the .so files from the armeabi-v7a folder of the downloaded SDK package to the /app/src/main/jniLibs/armeabi-v7a folder of the sample application.
  7. Copy the .so files from the x86 folder of the downloaded SDK package to the /app/src/main/jniLibs/x86 folder of the sample application.

Obtain and Build the Sample Application

  1. Open the sample application in Android Studio.
  2. Build and run the sample project. This should display the application on your device.

Steps to Create the Sample

For details about the APIs used to develop this sample, see the Agora.io Documentation.

Set Permissions

In the AndroidManifest.xml file, uses-permission settings were added for the Internet, audio recording, audio settings, network state, camera, and Bluetooth so that the app can access these features:

<?xml version="1.0" encoding="UTF-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="io.agora.tutorials1v1vcall">

    <uses-permission android:name="android.permission.INTERNET" />
    <uses-permission android:name="android.permission.RECORD_AUDIO" />
    <uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
    <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
    <uses-permission android:name="android.permission.CAMERA" />
    <uses-permission android:name="android.permission.BLUETOOTH" />

    <application
        android:allowBackup="true"
        android:icon="@drawable/ic_launcher"
        android:label="@string/app_name"
        android:supportsRtl="true"
        android:theme="@style/AppTheme">
        <activity
            android:name=".VideoChatViewActivity"
            android:screenOrientation="sensorPortrait"
            android:theme="@style/FullScreenVideoTheme">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />

                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>
    </application>

</manifest>

Create Visual Assets

Add the following icon assets for the user interface to the /res/drawable folder:

Asset Description
btn_end_call.png An image of a red telephone for a hang up button.
btn_mute.png An image of a microphone to mute audio.
btn_switch_camera.png An image of a camera and rotational arrows to switch between the two cameras.
btn_video.png An image of a camera to start video.
btn_voice.png An image of an arrow indicating that audio chat is enabled.
ic_launcher.png A desktop icon for users to invoke the sample application.

Design the User Interface

The sample contains a single activity called VideoChatViewActivity and its layout is defined in /layout/activity_video_chat_view.xml.

The main aspects of this layout are shown here: ActivityViewChat.png

Component Description
activity_video_chat_view A view that handles the main video feed. This view contains other views.
remote_video_view_container A view displaying the remote, incoming video feed (that is, the other user's video).
local_video_view_container A smaller view at the top right corner showing the local video feed.
quick_tips_when_use_agora_sdk Displays quick tip information.
LinearLayout (unnamed) A layout that encapsulates four buttons: Pause Video, Audio Mute, Switch Camera, and Hang Up. Each button uses the assets described above.

Configure Resources

To configure 1-to-1 communication resources:

Create an Agora Instance

The code samples in this section are in VideoChatViewActivity.java.

The following imports define the interface of the Agora API that provides communication functionality:

  • io.agora.rtc.Constants
  • io.agora.rtc.IRtcEngineEventHandler
  • io.agora.rtc.RtcEngine
  • io.agora.rtc.video.VideoCanvas

Create a singleton by invoking RtcEngine.create() during initialization, passing the application ID stored in strings.xml and a reference to the activity's event handler. The Agora API uses events to inform the application about Agora engine runtime events, such as joining or leaving a channel and adding new participants.

import io.agora.rtc.Constants;
import io.agora.rtc.IRtcEngineEventHandler;
import io.agora.rtc.RtcEngine;
import io.agora.rtc.video.VideoCanvas;

...

private void initializeAgoraEngine() {
    try {
        mRtcEngine = RtcEngine.create(getBaseContext(), getString(R.string.agora_app_id), mRtcEventHandler);
    } catch (Exception e) {
        Log.e(LOG_TAG, Log.getStackTraceString(e));

        throw new RuntimeException("NEED TO check rtc sdk init fatal error\n" + Log.getStackTraceString(e));
    }
}

In the sample project, a helper method called initializeAgoraEngine() creates the singleton and is invoked by another helper method called initAgoraEngineAndJoinChannel():

private void initAgoraEngineAndJoinChannel() {
    initializeAgoraEngine();     
    ...
}

initAgoraEngineAndJoinChannel() is invoked in the activity's onCreate() method after it checks for the necessary permissions:

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_video_chat_view);

    if (checkSelfPermission(Manifest.permission.RECORD_AUDIO, PERMISSION_REQ_ID_RECORD_AUDIO) && checkSelfPermission(Manifest.permission.CAMERA, PERMISSION_REQ_ID_CAMERA)) {
        initAgoraEngineAndJoinChannel();
    }
}
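The checkSelfPermission() helper used above wraps Android's runtime-permission check. Its exact implementation is not shown in this walkthrough; the following is a minimal sketch (the request-code constants are assumed to be defined in the activity, and the user's decision is delivered asynchronously to onRequestPermissionsResult()):

```java
public boolean checkSelfPermission(String permission, int requestCode) {
    if (ContextCompat.checkSelfPermission(this, permission)
            != PackageManager.PERMISSION_GRANTED) {
        // Prompt the user; the outcome arrives asynchronously in
        // onRequestPermissionsResult() with the same request code.
        ActivityCompat.requestPermissions(this, new String[]{permission}, requestCode);
        return false;
    }
    return true;
}
```

ContextCompat and ActivityCompat come from the Android support library, which projects of this era pull in via com.android.support:appcompat-v7.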

Configure Video Mode

The next step is to enable video mode, configure the video encoding profile, and specify if the width and height can change when switching from portrait to landscape:

private void setupVideoProfile() {
    mRtcEngine.enableVideo();
    mRtcEngine.setVideoProfile(Constants.VIDEO_PROFILE_360P, false);
}

private void initAgoraEngineAndJoinChannel() {
    initializeAgoraEngine();     
    setupVideoProfile();        
    ...
}

In the sample, a helper method called setupVideoProfile() contains this logic and is invoked by initAgoraEngineAndJoinChannel() during the activity's creation:

  • It starts by enabling video with enableVideo.
  • The video encoding profile is then set to 360p and the swapWidthAndHeight parameter is set to false via setVideoProfile. Each profile includes a set of parameters such as resolution, frame rate, bitrate, etc.

Note: If a device's camera does not support the specified resolution, the SDK automatically chooses a suitable camera resolution. However, the encoder resolution still uses the profile specified by setVideoProfile().

Since this configuration takes place before entering a channel, the end user will start in video mode rather than audio mode. If video mode is enabled during a call, the app will switch from audio to video mode.

Set up Local Video

The logic for the local video feed is contained within a helper method called setupLocalVideo() that is invoked by initAgoraEngineAndJoinChannel():

private void setupLocalVideo() {
    FrameLayout container = (FrameLayout) findViewById(R.id.local_video_view_container);
    SurfaceView surfaceView = RtcEngine.CreateRendererView(getBaseContext());
    surfaceView.setZOrderMediaOverlay(true);
    container.addView(surfaceView);
    mRtcEngine.setupLocalVideo(new VideoCanvas(surfaceView, VideoCanvas.RENDER_MODE_ADAPTIVE, 0));
}

private void initAgoraEngineAndJoinChannel() {
    initializeAgoraEngine();     
    setupVideoProfile();         
    setupLocalVideo();           
    ...
}

setupLocalVideo() creates a View object for the video stream and initializes the following:

  • The Z order media overlay is set to true to overlay the view on top of the parent view.
  • The View is added to the local_video_view_container layout.

The call to setupLocalVideo then passes a new VideoCanvas object to the engine that binds the video window (view) of local video streams and configures the video display settings.

Join a Channel

A helper method called joinChannel() invokes RtcEngine.joinChannel(), which enables the user to join a specific channel:

private void joinChannel() {
    mRtcEngine.joinChannel(null, "demoChannel1", "Extra Optional Data", 0); 
}

The first parameter is the channel token, which is null here because the sample does not use token security. The channelName parameter receives the name of the channel to join (demoChannel1), and passing 0 as the uid lets the SDK assign a user ID automatically. The call to RtcEngine.joinChannel() also enables the speakerphone when using Agora.

Note: Users in the same channel can talk to each other, but users with different app IDs cannot call each other even if they join the same channel.

In this code sample, the helper method joinChannel() is invoked by initAgoraEngineAndJoinChannel():

private void initAgoraEngineAndJoinChannel() {
    initializeAgoraEngine();     
    setupVideoProfile();         
    setupLocalVideo();           
    joinChannel();               
}

Set up Video Chat View Activity

The VideoChatViewActivity class contains an IRtcEngineEventHandler to handle various video session events:

private final IRtcEngineEventHandler mRtcEventHandler = new IRtcEngineEventHandler() { 
    @Override
    public void onFirstRemoteVideoDecoded(final int uid, int width, int height, int elapsed) { 
        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                setupRemoteVideo(uid);
            }
        });
    }

    @Override
    public void onUserOffline(int uid, int reason) { 
        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                onRemoteUserLeft();
            }
        });
    }

    @Override
    public void onUserMuteVideo(final int uid, final boolean muted) { 
        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                onRemoteUserVideoMuted(uid, muted);
            }
        });
    }
};

The onFirstRemoteVideoDecoded() method is invoked once another user is connected and the first remote video frame is received and decoded. This method invokes a helper method called setupRemoteVideo():

private void setupRemoteVideo(int uid) {
    FrameLayout container = (FrameLayout) findViewById(R.id.remote_video_view_container);

    if (container.getChildCount() >= 1) {
        return;
    }

    SurfaceView surfaceView = RtcEngine.CreateRendererView(getBaseContext());
    container.addView(surfaceView);
    mRtcEngine.setupRemoteVideo(new VideoCanvas(surfaceView, VideoCanvas.RENDER_MODE_ADAPTIVE, uid));

    surfaceView.setTag(uid); // store the remote uid so it can be checked later
    View tipMsg = findViewById(R.id.quick_tips_when_use_agora_sdk); // optional UI
    tipMsg.setVisibility(View.GONE);
}

setupRemoteVideo() performs the following:

  • Gets a reference to the remote video view in the layout.
  • Creates and adds a View object to the layout.
  • Creates a VideoCanvas and associates the view with it.
  • Tags the View with the remote user's uid.
  • Hides the quick tips.

The onUserOffline() method is invoked when another user leaves the channel. This method invokes a helper method called onRemoteUserLeft():

private void onRemoteUserLeft() {
    FrameLayout container = (FrameLayout) findViewById(R.id.remote_video_view_container);
    container.removeAllViews();

    View tipMsg = findViewById(R.id.quick_tips_when_use_agora_sdk); // optional UI
    tipMsg.setVisibility(View.VISIBLE);
}

onRemoteUserLeft performs the following:

  • Obtains a reference to the remote video view in the layout and removes it.
  • Shows the quick tips.

The onUserMuteVideo() method is invoked when a remote user pauses their stream. This method invokes a helper method called onRemoteUserVideoMuted():

private void onRemoteUserVideoMuted(int uid, boolean muted) {
    FrameLayout container = (FrameLayout) findViewById(R.id.remote_video_view_container);

    SurfaceView surfaceView = (SurfaceView) container.getChildAt(0);

    Object tag = surfaceView.getTag();
    if (tag != null && (Integer) tag == uid) {
        surfaceView.setVisibility(muted ? View.GONE : View.VISIBLE);
    }
}

onRemoteUserVideoMuted performs the following:

  • Gets a reference to the remote video view in the layout and the associated View.
  • Checks that the uid stored in the View's tag matches the uid passed in from Agora.
  • Toggles the visibility of the remote video view.

Manage Communication Features

Implement the following communication features:

Hang Up and End the Call

VideoChatViewActivity contains a helper method called leaveChannel() with the logic to leave the current video call (channel). It is invoked in onDestroy() when the application shuts down:

private void leaveChannel() {
    mRtcEngine.leaveChannel();
}

@Override
protected void onDestroy() {
    super.onDestroy();

    leaveChannel();
    ...
}

leaveChannel() invokes RtcEngine.leaveChannel() to leave the channel.
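Beyond leaving the channel, the Agora 2.x SDK also provides a static RtcEngine.destroy() method to release all engine resources once the app no longer needs them. The elided portion of onDestroy() above is not reproduced here; the sketch below only illustrates what such cleanup typically looks like:

```java
// Hypothetical cleanup helper; not part of the sample code shown above.
private void destroyEngine() {
    RtcEngine.destroy();  // releases all resources held by the SDK (static method)
    mRtcEngine = null;    // the old instance must not be used afterwards
}
```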

The class also contains a helper method called onEncCallClicked(), which invokes finish() to trigger the onDestroy() callback. The btn_end_call button that ends a call has been configured in activity_video_chat_view.xml to invoke onEncCallClicked() in response to the onClick() event:

<ImageView
    android:layout_width="0dp"
    android:layout_height="match_parent"
    android:layout_weight="20"
    android:onClick="onEncCallClicked"
    android:scaleType="centerInside"
    android:src="@drawable/btn_end_call" />
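The handler wired to this button is not listed in this walkthrough; consistent with the description above, a minimal sketch would be:

```java
// Invoked via android:onClick="onEncCallClicked" in the layout above.
public void onEncCallClicked(View view) {
    finish();  // finishing the activity triggers onDestroy(), which leaves the channel
}
```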

Toggle Cameras

To enable the user to choose between the front and rear cameras, the activity defines a method called onSwitchCameraClicked():

public void onSwitchCameraClicked(View view) {
    mRtcEngine.switchCamera();
}

onSwitchCameraClicked() invokes RtcEngine.switchCamera() to toggle between the device's front and rear cameras. The btn_switch_camera button that switches the cameras has been configured in activity_video_chat_view.xml to invoke onSwitchCameraClicked() in response to the onClick() event:

<ImageView
    android:layout_width="0dp"
    android:layout_height="match_parent"
    android:layout_weight="20"
    android:onClick="onSwitchCameraClicked"
    android:scaleType="centerInside"
    android:src="@drawable/btn_switch_camera" />

Mute Audio and Video

To allow the user to mute audio, the activity defines a method called onLocalAudioMuteClicked():

public void onLocalAudioMuteClicked(View view) {
    ImageView iv = (ImageView) view;
    if (iv.isSelected()) {
        iv.setSelected(false);
        iv.clearColorFilter();
    } else {
        iv.setSelected(true);
        iv.setColorFilter(getResources().getColor(R.color.colorPrimary), PorterDuff.Mode.MULTIPLY);
    }

    mRtcEngine.muteLocalAudioStream(iv.isSelected());
}

The btn_mute button that mutes audio has been configured in activity_video_chat_view.xml to invoke onLocalAudioMuteClicked() in response to the onClick() event:

<ImageView
    android:layout_width="0dp"
    android:layout_height="match_parent"
    android:layout_weight="20"
    android:onClick="onLocalAudioMuteClicked"
    android:scaleType="centerInside"
    android:src="@drawable/btn_mute" />

onLocalAudioMuteClicked() performs the following:

  • Gets a reference to the View containing the mute audio button.
  • Determines if the state of the button is selected or not.
  • Toggles the View's selected state.
  • Clears the View's button color when deselecting it and sets the button color to red when selecting it.
  • Invokes RtcEngine.muteLocalAudioStream() to toggle audio based on the selected state.

To allow the user to mute local video (for example, to prevent video of the current user from being broadcast to other users), the activity defines a method called onLocalVideoMuteClicked():

public void onLocalVideoMuteClicked(View view) {
    ImageView iv = (ImageView) view;
    if (iv.isSelected()) {
        iv.setSelected(false);
        iv.clearColorFilter();
    } else {
        iv.setSelected(true);
        iv.setColorFilter(getResources().getColor(R.color.colorPrimary), PorterDuff.Mode.MULTIPLY);
    }

    mRtcEngine.muteLocalVideoStream(iv.isSelected());

    FrameLayout container = (FrameLayout) findViewById(R.id.local_video_view_container);
    SurfaceView surfaceView = (SurfaceView) container.getChildAt(0);
    surfaceView.setZOrderMediaOverlay(!iv.isSelected());
    surfaceView.setVisibility(iv.isSelected() ? View.GONE : View.VISIBLE);
}

onLocalVideoMuteClicked() performs the following:

  • Obtains a reference to the View containing the mute video button.
  • Determines if the state of the button is selected or not.
  • Toggles the View's selected state.
  • Clears the View's button color when deselecting it and sets the button color to red when selecting it.
  • Invokes RtcEngine.muteLocalVideoStream() to toggle video based on the selected state.
  • Obtains a reference to the layout for the Local Video View.
  • Toggles the Z order and visibility based on the Selected state.

The btn_voice button that mutes video has been configured in activity_video_chat_view.xml to invoke onLocalVideoMuteClicked() in response to the onClick() event:

<ImageView
    android:layout_width="0dp"
    android:layout_height="match_parent"
    android:layout_weight="20"
    android:onClick="onLocalVideoMuteClicked"
    android:scaleType="centerInside"
    android:src="@drawable/btn_voice" />

Resources

  • You can find full API documentation at the Document Center.
  • You can file bugs about this sample here.

License

This software is licensed under the MIT License (MIT). View the license.

agora-android-tutorial-1to1's People

Contributors

guohai, prwrl, tongjiangyong, ioleon13, yeyuzhou, zhangtao1104, dubian0608
