
This repo was moved from the @motlabs group. Thanks to @jwkanggist, the leader of the motlabs community.

Awesome Machine Learning DEMOs with iOS

We tackle the challenge of running machine learning models on iOS using Core ML and ML Kit (TensorFlow Lite).

Korean README

Contents

Machine Learning Framework for iOS

Flow of Model When Using Core ML


The overall flow is very similar across most mobile ML frameworks. Each framework has its own compatible model format, so a model trained in TensorFlow must be converted into the appropriate format for each mobile ML framework.

Once the compatible model is prepared, you can run inference with the ML framework. Note that you must perform pre- and post-processing manually.
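
For example, a minimal sketch of that inference step with Core ML and Vision could look like the following; `MobileNet` is just a placeholder for whatever model class Xcode generates from your converted .mlmodel, and the pre/postprocessing shown is illustrative only.

```swift
import CoreML
import Vision

// A minimal sketch of running inference with a converted model via Core ML + Vision.
// `MobileNet` stands in for the model class Xcode generates from your .mlmodel.
final class ImagePredictor {
    private let request: VNCoreMLRequest

    init() throws {
        let mlModel = try MobileNet(configuration: MLModelConfiguration()).model
        let visionModel = try VNCoreMLModel(for: mlModel)
        request = VNCoreMLRequest(model: visionModel) { request, _ in
            // Postprocessing is up to you: here we just read the top classification.
            guard let results = request.results as? [VNClassificationObservation],
                  let top = results.first else { return }
            print("\(top.identifier) (\(top.confidence))")
        }
        // Preprocessing choice: how the input image is cropped/scaled to the model's input size.
        request.imageCropAndScaleOption = .centerCrop
    }

    func predict(cgImage: CGImage) {
        let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
        try? handler.perform([request])
    }
}
```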

If you want more explanation, check out this slide (Korean).

Flow of Model When Using Create ML

(screenshot: playground-createml-validation-001)
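
As a rough sketch of that flow (assuming a hypothetical folder of labeled training images; the paths and names below are placeholders), a macOS playground can train and validate a classifier with Create ML and export a .mlmodel for use with Core ML on iOS:

```swift
import CreateML
import Foundation

// Minimal Create ML sketch (macOS playground / script), not tied to any project in this repo.
// Assumes one subdirectory per class under the training directory.
let trainingDirURL = URL(fileURLWithPath: "/path/to/training-images")

// Train an image classifier from labeled subdirectories.
let trainingData = MLImageClassifier.DataSource.labeledDirectories(at: trainingDirURL)
let classifier = try MLImageClassifier(trainingData: trainingData)

// Inspect validation metrics, then export a .mlmodel for use with Core ML on iOS.
print(classifier.validationMetrics)
try classifier.write(to: URL(fileURLWithPath: "/path/to/SimpleClassifier.mlmodel"))
```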

Baseline Projects

DONE

  • Using built-in model with Core ML

  • Using built-in on-device model with ML Kit

  • Using custom model for Vision with Core ML and ML Kit

  • Object Detection with Core ML

TODO

  • Object Detection with ML Kit
  • Using built-in cloud model on ML Kit
    • Landmark recognition
  • Using custom model for NLP with Core ML and ML Kit
  • Using custom model for Audio with Core ML and ML Kit
    • Audio recognition
    • Speech recognition
    • TTS

Image Classification

| Name | Note |
| ---- | ---- |
| ImageClassification-CoreML | - |
| MobileNet-MLKit | - |

Object Detection & Recognition

| Name | Note |
| ---- | ---- |
| ObjectDetection-CoreML | - |
| TextDetection-CoreML | - |
| TextRecognition-MLKit | - |
| FaceDetection-MLKit | - |

Pose Estimation

| Name | Note |
| ---- | ---- |
| PoseEstimation-CoreML | - |
| PoseEstimation-TFLiteSwift | - |
| PoseEstimation-MLKit | - |
| FingertipEstimation-CoreML | - |

Depth Prediction

| Name | Note |
| ---- | ---- |
| DepthPrediction-CoreML | - |

Semantic Segmentation

| Name | Note |
| ---- | ---- |
| SemanticSegmentation-CoreML | - |

Application Projects

| Name | Note |
| ---- | ---- |
| dont-be-turtle-ios | - |
| WordRecognition-CoreML-MLKit (preparing...) | Detects characters, finds the word being pointed at, and then recognizes that word using Core ML and ML Kit. |

Annotation Tool

| Name | Note |
| ---- | ---- |
| KeypointAnnotation | Annotation tool for creating your own custom keypoint estimation dataset |

Create ML Projects

| Name | Note |
| ---- | ---- |
| SimpleClassification-CreateML-CoreML | A simple classification using Create ML and Core ML |

Performance

Execution Time = Inference Time + Postprocessing Time

| Name (on iPhone X) | Inference Time (ms) | Execution Time (ms) | FPS |
| ---- | ---- | ---- | ---- |
| ImageClassification-CoreML | 40 | 40 | 23 |
| MobileNet-MLKit | 120 | 130 | 6 |
| ObjectDetection-CoreML | 100 ~ 120 | 110 ~ 130 | 5 |
| TextDetection-CoreML | 12 | 13 | 30 (max) |
| TextRecognition-MLKit | 35 ~ 200 | 40 ~ 200 | 5 ~ 20 |
| PoseEstimation-CoreML | 51 | 65 | 14 |
| PoseEstimation-MLKit | 200 | 217 | 3 |
| DepthPrediction-CoreML | 624 | 640 | 1 |
| SemanticSegmentation-CoreML | 178 | 509 | 1 |
| WordRecognition-CoreML-MLKit | 23 | 30 | 14 |
| FaceDetection-MLKit | - | - | - |

📏 Measure module

You can see the measured latency (inference time and total execution time) and the FPS at the top of the screen.

If you have a more elegant method for measuring the performance, please suggest it in an issue!
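
As a rough illustration of the idea (not the exact measure module used in these repos), per-frame inference time, execution time, and FPS can be computed along these lines:

```swift
import QuartzCore

// A minimal sketch of measuring inference time, execution time, and FPS per frame.
// Illustrative only; the projects' own 📏 measure module may differ in detail.
final class PerformanceMeasure {
    private var startTime: CFTimeInterval = 0
    private var inferenceEndTime: CFTimeInterval = 0
    private var previousEndTime: CFTimeInterval = 0

    func didStartInference() {
        startTime = CACurrentMediaTime()
    }

    func didEndInference() {
        inferenceEndTime = CACurrentMediaTime()
    }

    // Call after postprocessing; prints inference time, execution time, and FPS.
    func didEndExecution() {
        let endTime = CACurrentMediaTime()
        let inferenceMs = (inferenceEndTime - startTime) * 1000
        let executionMs = (endTime - startTime) * 1000
        let fps = previousEndTime > 0 ? 1.0 / (endTime - previousEndTime) : 0
        previousEndTime = endTime
        print(String(format: "inference: %.0f ms, execution: %.0f ms, fps: %.1f",
                     inferenceMs, executionMs, fps))
    }
}
```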

Implementation Status

| Name | Measure 📏 | Unit Test | Bunch Test |
| ---- | ---- | ---- | ---- |
| ImageClassification-CoreML | O | X | X |
| MobileNet-MLKit | O | X | X |
| ObjectDetection-CoreML | O | O | X |
| TextDetection-CoreML | O | X | X |
| TextRecognition-MLKit | O | X | X |
| PoseEstimation-CoreML | O | O | X |
| PoseEstimation-MLKit | O | X | X |
| DepthPrediction-CoreML | O | X | X |
| SemanticSegmentation-CoreML | O | X | X |

See also

  • WWDC
  • Core ML
  • Create ML and Turi Create
  • Common ML
  • Metal
  • AR
  • Examples