Elvis Miranda

Real Computer Vision for mobile and embedded

This is the second part of the series of articles about Computer Vision for mobile and embedded devices. Last time I discussed ways to optimize image preprocessing even before you try to run ML model inference directly on the device.

In this article I am going to talk about the most crucial step:

on-device ML model execution

What is the right ML tool for mobile? And this is a really good question! The answer can be found by going through the following items:

  • Performance, in other words, how often are you going to run your ML model on the device? Some mobile apps (like a photo editor with smart ML effects) may use the ML output once per session; others need to track the ML result once per minute or even once per second. But if we are talking about a real computer-vision application, it is a good idea to do it 10–30 times per second (10–30 FPS), as often as we get a new frame from our video source (see the rough time-budget sketch after this list).
  • ML operations (layers) capability of the ML framework which was used for the server-side model training process. This item is mostly about the compatibility of ML operations (layers) between different training frameworks. In some cases, it can be a rather complicated task to make your server-side trained ML model work in a mobile (embedded) ML environment due to the absence of necessary operations, and this can be a vital factor in choosing a tool.

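To make the Performance item more concrete, here is a rough back-of-the-envelope sketch of the per-frame time budget at a real-time frame rate. The stage timings are made-up placeholders, not measurements; the point is only that the model inference has to fit inside whatever is left of the frame interval.

```python
# Back-of-the-envelope per-frame time budget for real-time CV.
# The stage timings below are hypothetical; measure your own pipeline.
target_fps = 30
frame_budget_ms = 1000.0 / target_fps        # ~33 ms available per frame

preprocess_ms = 5                            # resize / normalize the camera frame
postprocess_ms = 8                           # e.g. decode detections, draw overlays
inference_budget_ms = frame_budget_ms - preprocess_ms - postprocess_ms

print(f"At {target_fps} FPS the model has ~{inference_budget_ms:.0f} ms per frame")
```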

Let’s take a closer look at the two most popular mobile platforms.

iOS

Apple gave iOS developers a brilliant gift: CoreML. But it is not the only solution for this mobile platform. So what options do we have? Let’s check them one by one.

The pure Apple solution is **CoreML**

  1. Performance: It works with a high level of performance through Metal shaders, directly on the mobile GPU (or on hardware specially dedicated to ML operations in the latest iPhone models).
  2. ML operations (layers) capability: Almost all modern server-side ML frameworks have ready-made scripts for converting to the CoreML format (a conversion sketch follows this list). Even if the converter does not support the necessary layers, you can write these operations yourself using Metal shaders. But be careful with that! From the **second generation of CoreML**, it is better to use only layers “from the box”, at least for the latest iPhones (iPhone XS, XS Max, and XR). In that case, all the operations will be executed on the dedicated hardware, which leads to faster performance and lower power consumption. Custom Metal-shader operations will bring your ML model **back to the GPU**, and you will not get the advantages of CoreML 2 and above.
  3. Hardware specifications: Since all iOS phones come from the same hardware vendor, this is a simple topic. We have around 10 device specifications, and they support the same kind of technologies.
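As an illustration of the conversion path from item 2, here is a minimal sketch using coremltools. It assumes coremltools 4+ and a Keras model saved as model.h5 with a 224×224 RGB image input; the file name and input shape are placeholders, not something prescribed by the article.

```python
import coremltools as ct

# Convert a server-side trained Keras/TensorFlow model to the CoreML format.
# "model.h5" and the 224x224x3 image input are placeholder assumptions.
mlmodel = ct.convert(
    "model.h5",
    inputs=[ct.ImageType(shape=(1, 224, 224, 3))],
)
mlmodel.save("MyModel.mlmodel")
```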

TensorFlow Lite is associated in our minds with Google (Android) technologies, but it can also be a solution for the iOS platform.

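If you pick TensorFlow Lite, the server-side model first has to be converted to the .tflite format. A minimal sketch, assuming TensorFlow 2.x and a SavedModel directory (the path and the optional quantization flag are placeholders for your own setup):

```python
import tensorflow as tf

# Convert a SavedModel to a .tflite flatbuffer ("saved_model/" is a placeholder path).
converter = tf.lite.TFLiteConverter.from_saved_model("saved_model/")
converter.optimizations = [tf.lite.Optimize.DEFAULT]   # optional weight quantization
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```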

The third alternative is Caffe2. Made in Facebook’s labs, it uses NNPACK and QNNPACK to perform as fast as possible on ARM CPUs.

In terms of usage and performance, Caffe2 is similar to TF Lite but much more flexible in the conversion process. Using ONNX as an intermediate format, you can easily bring your server-side model to the iOS platform.
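As an illustration of the ONNX route, here is a minimal sketch of exporting a PyTorch model to ONNX; the MobileNetV2 model and the 224×224 input shape are just placeholder assumptions. The resulting .onnx file is what you then hand over to the mobile-side runtime.

```python
import torch
import torchvision

# Export a (placeholder) torchvision model to ONNX as the intermediate format.
model = torchvision.models.mobilenet_v2(pretrained=True).eval()
dummy_input = torch.randn(1, 3, 224, 224)   # assumed input shape

torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",
    input_names=["image"],
    output_names=["logits"],
)
```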

Android

As I mentioned above, efficient on-device ML inference is quite a hardware-specific task, and that causes a lot of trouble on Android devices. Nowadays there are more than 16 000 Google Play certified device models, and more than 24 000 overall! Each model can have its own hardware as well as software specifications. So the answer to the question of selecting “the right tool” can be application-specific. Let’s take a look at our options with a short description of each.

TensorFlow Lite is the Android ML tool most heavily promoted by Google, and there are several reasons for that.

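Before wiring a converted model into the Android app, it can be useful to sanity-check the .tflite file with the Python interpreter on a desktop. A minimal sketch, assuming TensorFlow 2.x and the model.tflite file produced by the conversion step shown earlier:

```python
import numpy as np
import tensorflow as tf

# Load the converted model and run it on one dummy frame.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# A fake camera frame with the model's expected shape and dtype.
frame = np.random.random_sample(input_details[0]["shape"]).astype(
    input_details[0]["dtype"]
)
interpreter.set_tensor(input_details[0]["index"], frame)
interpreter.invoke()

result = interpreter.get_tensor(output_details[0]["index"])
print("output shape:", result.shape)
```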

Qualcomm Neural Processing SDK for AI is a brilliant example of excellent developer support by a hardware vendor. Qualcomm provides us with a set of efficient tools to establish the whole pipeline of ML processing on the device. I am not talking only about their high-performance ML libraries, but also about tools for digital signal processing, video stream processing, compilation, etc.


HUAWEI HiAI is one more example of a hardware-specific solution. The good thing about this product is that it has an Android Studio plugin which does a lot of the work for you; the bad thing is that this plugin has plenty of bugs. Anyway, you have the opportunity to convert your ML model and generate the Java code that uses it through the UI tool.


Hardware specifications: HiAI is designed for the Kirin platform by HiSilicon (part of Huawei). A surprising fact for me is that each version of the HiAI library is designed for only one Kirin model: if you add HiAI 2.0 to your app, it will work only with the Kirin 980, and for older models you should use an older version of HiAI. I guess this makes it troublesome to use in production.

MACE by XiaoMi is a good attempt to create a unified ML solution for ARM-based devices.


Caffe2 can also be an option for Android devices. Its CPU runtime performs well, in many cases better than other frameworks’ CPU runtimes. And, as we remember, it is optimized for ARM-based devices, so it should work on almost all Android phones.

From all the above, you can see that “the right mobile ML tool” really depends on the needs of your app. It is a good idea to investigate the potential market and find out which mobile hardware and software dominate it before you start your mobile ML project. In the next article, I am going to talk about the possible outputs of ML models in **Computer Vision** and ways to process them efficiently. **Output post-processing** can be a tricky thing in terms of application performance.

Don’t forget to give us your 👏 !

