Realtime AI Application on Mobile Devices

Based on his research, Wei Niu has also designed several creative real-time AI applications for mobile devices; here are some examples:

More videos can be found on our YouTube channel or Bilibili channel.

Selected Publications

* means equal contribution

MEST: Accurate and Fast Memory-Economic Sparse Training Framework on the Edge

Achieving on-Mobile Real-Time Super-Resolution with Neural Architecture and Pruning Search

GRIM: A General, Real-Time Deep Learning Inference Framework for Mobile Devices based on Fine-Grained Structured Weight Sparsity

ClickTrain: Efficient and Accurate End-to-End Deep Learning Training via Fine-Grained Architecture-Preserving Pruning

DNNFusion: Accelerating Deep Neural Networks Execution with Advanced Operator Fusion

NPAS: A Compiler-aware Framework of Unified Network Pruning and Architecture Search for Beyond Real-Time Mobile Acceleration

Neural Pruning Search for Real-Time Object Detection of Autonomous Vehicles

Real-Time Mobile Acceleration of DNNs: From Computer Vision to Medical Applications

YOLObile: Real-Time Object Detection on Mobile Devices via Compression-Compilation Co-Design

Achieving Real-Time Execution of 3D Convolutional Neural Networks on Mobile Devices

CoCoPIE: Enabling Real-Time AI on Off-the-Shelf Mobile Devices via Compression-Compilation Co-Design

An Image Enhancing Pattern-based Sparsity for Real-Time Inference on Mobile Devices

A Privacy-Preserving-Oriented DNN Pruning and Mobile Acceleration Framework

RTMobile: Beyond Real-Time Mobile Acceleration of RNNs for Speech Recognition

PCONV: The Missing but Desirable Sparsity in DNN Weight Pruning for Real-time Execution on Mobile Devices

PatDNN: Achieving Real-Time DNN Execution on Mobile Devices with Pattern-based Weight Pruning