MLPerf Mobile App

This project contains the MLPerf mobile app, an app-based implementation of the MLPerf Inference tasks.

Please note that this app is not yet official and its integration with LoadGen is not yet complete.

Overview

The MLPerf app offers a simple mobile UI for executing MLPerf inference tasks and comparing results. The user can select a task, a supported reference model (float or quantized), and initiate both latency and accuracy validation for that task. As single-stream represents the most common inference execution on mobile devices, it is the default mode of measurement; results report the 90th-percentile latency and the task-specific accuracy metric (e.g., top-1 accuracy for image classification).
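
For context, here is a minimal sketch of how these two run types map onto the MLPerf LoadGen C++ API. The scenario and mode names below come from LoadGen's public test_settings.h header; the include path and the helper function itself are illustrative assumptions, not code from this app.

#include "loadgen/test_settings.h"  // path may differ in your checkout

// Hypothetical helper: single-stream scenario with either a latency
// (performance) pass or an accuracy pass.
mlperf::TestSettings MakeMobileSettings(bool accuracy_run) {
  mlperf::TestSettings settings;
  settings.scenario = mlperf::TestScenario::SingleStream;
  settings.mode = accuracy_run ? mlperf::TestMode::AccuracyOnly
                               : mlperf::TestMode::PerformanceOnly;
  return settings;
}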

Several important mobile-specific considerations are addressed in the app:

  • Limited disk space - Certain datasets are quite large (multiple gigabytes), which makes an exhaustive evaluation difficult. By default, the app does not include the full dataset for validation. The client can optionally push part or all of the task validation datasets, depending on their use-case.
  • Device variability - The number of CPU, GPU, and DSP/NPU hardware permutations in the mobile ecosystem is quite large. To that end, the app offers options to customize hardware execution, e.g., adjusting the number of threads for CPU inference, enabling GPU acceleration, or enabling NN API acceleration (Android’s ML abstraction layer for accelerating inference); a configuration sketch follows this list.
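
To illustrate the customization options above, the sketch below builds a TensorFlow Lite interpreter with a configurable CPU thread count and an optional NN API delegate. The calls are from the public TensorFlow Lite C++ API; the helper function itself is hypothetical, not the app's actual code.

#include <memory>

#include "tensorflow/lite/delegates/nnapi/nnapi_delegate.h"
#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

// Hypothetical helper: configure CPU threads and optionally route supported
// ops through Android's NN API (which can target DSP/NPU/GPU backends).
std::unique_ptr<tflite::Interpreter> BuildInterpreter(
    const tflite::FlatBufferModel& model, int num_threads, bool use_nnapi) {
  tflite::ops::builtin::BuiltinOpResolver resolver;
  std::unique_ptr<tflite::Interpreter> interpreter;
  tflite::InterpreterBuilder(model, resolver)(&interpreter);
  interpreter->SetNumThreads(num_threads);  // CPU inference thread count
  if (use_nnapi) {
    static tflite::StatefulNnApiDelegate nnapi_delegate;
    interpreter->ModifyGraphWithDelegate(&nnapi_delegate);
  }
  interpreter->AllocateTensors();
  return interpreter;
}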

The initial version of the app builds on a lightweight C++ task evaluation pipeline originally built for TensorFlow Lite. Most of the default MLPerf inference reference implementations are written in Python, which is generally incompatible with mobile deployment. This C++ evaluation pipeline has a minimal set of dependencies for dataset pre-processing and result post-processing, is compatible with iOS and Android (as well as desktop platforms), and integrates with the standard MLPerf LoadGen library. While the initial version of the app uses TensorFlow Lite as the default inference engine, the plan is to support the addition of alternative inference frameworks contributed by the broader MLPerf community.
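
To make the LoadGen integration concrete, here is a sketch of the SystemUnderTest interface that such a pipeline implements. The virtual methods follow LoadGen's public system_under_test.h (as of the v0.5-era API, which may have since evolved); the class and its inference hook are hypothetical, not this app's code.

#include <string>
#include <vector>

#include "loadgen/loadgen.h"            // mlperf::QuerySamplesComplete
#include "loadgen/system_under_test.h"  // mlperf::SystemUnderTest

// Hypothetical backend wrapper: LoadGen issues queries, the SUT runs
// inference, and each sample's completion is reported back to LoadGen.
class TfliteSut : public mlperf::SystemUnderTest {
 public:
  const std::string& Name() const override { return name_; }

  void IssueQuery(const std::vector<mlperf::QuerySample>& samples) override {
    for (const mlperf::QuerySample& sample : samples) {
      // ... run inference on the input identified by sample.index ...
      mlperf::QuerySampleResponse response{sample.id, /*data=*/0, /*size=*/0};
      mlperf::QuerySamplesComplete(&response, 1);
    }
  }

  void FlushQueries() override {}
  void ReportLatencyResults(
      const std::vector<mlperf::QuerySampleLatency>& /*latencies_ns*/) override {}

 private:
  const std::string name_ = "TFLite";
};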

Requirements

  • Bazel, which is used to build the app
  • The Android SDK and NDK (both downloadable via Android Studio), or
  • Docker, if you prefer to build with the prebuilt image instead

Getting Started

There are two ways to build the app. If you want to use your own environment, first make sure to download the Android SDK and NDK via Android Studio. Then set the following environment variables:

export ANDROID_HOME=Path/to/SDK # Ex: $HOME/Android/Sdk
export ANDROID_NDK_HOME=Path/to/NDK # Ex: $ANDROID_HOME/ndk/(your version)

The app can be built with the following command:

bazel build -c opt --cxxopt='--std=c++14' \
    --fat_apk_cpu=x86,arm64-v8a,armeabi-v7a \
    //java/org/mlperf/inference:mlperf_app

Alternatively, you can use our prebuilt Docker image to build the app:

docker run \
    -v `pwd`:/mobile_app \
    -v <path to your cache dir>:/cache \
    -w /mobile_app \
    thaink/android-bazel:latest --output_user_root=/cache/bazel build \
    -c opt --cxxopt='--std=c++14' \
    --fat_apk_cpu=x86,arm64-v8a,armeabi-v7a \
    //java/org/mlperf/inference:mlperf_app

Please see these instructions for installing and using the app.

FAQ

Will this be available in the app store(s)?

Yes, eventually, but not with the 0.5 release.

When will an iOS version be available?

This is a priority for the community but requires some additional resourcing.

Will the app support all MLPerf Inference tasks?

That is the eventual goal. To start, it supports only those tasks specifically targeting mobile and/or edge use-cases (e.g., Classification w/ MobileNet, Detection w/ SSD MobileNet).

Will the app support more than just TensorFlow Lite for inference?

Yes, that is the plan, though this is largely dependent on contributions from teams and organizations who desire this.

Please search https://groups.google.com/forum/#!forum/mlperf-inference-submitters for additional help and related questions.
