In today’s fast-moving world, people have come to expect mobile apps to be intelligent – adapting to users’ activity or delighting them with surprising smarts. As a result, we think machine learning will become an essential tool in mobile development. That’s why on Tuesday at Google I/O, Google introduced ML Kit in beta: a new SDK that brings Google’s machine learning expertise to mobile developers in a powerful, yet easy-to-use package on Firebase.
Machine learning for all skill levels
Getting started with machine learning can be difficult for many developers. Typically, new ML developers spend countless hours learning the intricacies of implementing low-level models, using frameworks, and more. Even for the seasoned expert, adapting and optimizing models to run on mobile devices can be a huge undertaking. Beyond the machine learning complexities, sourcing training data can be an expensive and time-consuming process, especially when considering a global audience.
With ML Kit, you can use machine learning to build compelling features, on Android and iOS, regardless of your machine learning expertise. More details below!
Production-ready for common use cases
If you’re a beginner who just wants to get the ball rolling, ML Kit gives you five ready-to-use (“base”) APIs that address common mobile use cases:
- Text recognition
- Face detection
- Barcode scanning
- Image labeling
- Landmark recognition
With these base APIs, you simply pass data in to ML Kit and get back an intuitive response. For example: Lose It!, one of our early users, used ML Kit to build several features in the latest version of its calorie tracker app. Using our text recognition API together with a custom-built model, the app can quickly capture nutrition information from a photo of a product label and log the food’s contents.
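To give a feel for the call shape, here is a minimal sketch of on-device text recognition in Kotlin. It assumes the `firebase-ml-vision` dependency from the ML Kit beta is in your build; class and method names follow the beta docs and may differ across SDK versions:

```kotlin
import android.graphics.Bitmap
import com.google.firebase.ml.vision.FirebaseVision
import com.google.firebase.ml.vision.common.FirebaseVisionImage

// Runs on-device text recognition on a bitmap (e.g., a photo of a
// nutrition label) and logs each recognized block of text.
fun recognizeText(bitmap: Bitmap) {
    val image = FirebaseVisionImage.fromBitmap(bitmap)
    val recognizer = FirebaseVision.getInstance().onDeviceTextRecognizer

    recognizer.processImage(image)
        .addOnSuccessListener { result ->
            for (block in result.textBlocks) {
                println(block.text)
            }
        }
        .addOnFailureListener { e ->
            println("Text recognition failed: $e")
        }
}
```

The other base APIs follow the same pattern: build a `FirebaseVisionImage`, pass it to a detector, and handle the result asynchronously.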
ML Kit gives you both on-device and Cloud APIs, all in a common and simple interface, allowing you to choose the ones that fit your requirements best. The on-device APIs process data quickly and will work even when there’s no network connection, while the cloud-based APIs leverage the power of Google Cloud Platform’s machine learning technology to give a higher level of accuracy.
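As an illustration of that choice, here is a hedged sketch of image labeling in both flavors, again using beta-era class names from the ML Kit for Firebase docs (these identifiers are my best reading of the beta API and may vary by SDK version):

```kotlin
import com.google.firebase.ml.vision.FirebaseVision
import com.google.firebase.ml.vision.common.FirebaseVisionImage

// On-device: fast and works offline.
fun labelOnDevice(image: FirebaseVisionImage) {
    FirebaseVision.getInstance().visionLabelDetector
        .detectInImage(image)
        .addOnSuccessListener { labels ->
            labels.forEach { println("${it.label}: ${it.confidence}") }
        }
}

// Cloud-based: requires a network connection, but taps Google Cloud
// Platform's models for higher accuracy.
fun labelInCloud(image: FirebaseVisionImage) {
    FirebaseVision.getInstance().visionCloudLabelDetector
        .detectInImage(image)
        .addOnSuccessListener { labels ->
            labels.forEach { println("${it.label}: ${it.confidence}") }
        }
}
```

Because both variants share the same call shape, switching between them is largely a one-line change.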
See these APIs in action on your Firebase console.
Heads up: We’re planning to release two more APIs in the coming months. First is a smart reply API allowing you to support contextual messaging replies in your app, and the second is a high-density face contour addition to the face detection API. Sign up here to give them a try!
Deploy custom models
If you’re seasoned in machine learning and you don’t find a base API that covers your use case, ML Kit lets you deploy your own TensorFlow Lite models. You simply upload them via the Firebase console, and we’ll take care of hosting and serving them to your app’s users. This way you can keep your models out of your APK/bundles, which reduces your app’s install size. Also, because ML Kit serves your model dynamically, you can always update your model without having to republish your app.
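A rough sketch of wiring up a hosted custom model, based on the beta custom-model API (`"my_model"` is a hypothetical name you would have chosen when uploading in the Firebase console; class names follow the beta docs and may differ by SDK version):

```kotlin
import com.google.firebase.ml.custom.FirebaseModelInterpreter
import com.google.firebase.ml.custom.FirebaseModelManager
import com.google.firebase.ml.custom.FirebaseModelOptions
import com.google.firebase.ml.custom.model.FirebaseCloudModelSource

// Register the TensorFlow Lite model hosted in the Firebase console.
val cloudSource = FirebaseCloudModelSource.Builder("my_model")
    .enableModelUpdates(true) // pick up new versions without an app release
    .build()
FirebaseModelManager.getInstance().registerCloudModelSource(cloudSource)

// Create an interpreter bound to the hosted model; input/output shapes
// are then configured to match your model before running inference.
val options = FirebaseModelOptions.Builder()
    .setCloudModelName("my_model")
    .build()
val interpreter = FirebaseModelInterpreter.getInstance(options)
```

With `enableModelUpdates` set, the SDK can fetch a newer model version the next time one is published, which is what makes over-the-air model updates possible.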
Introducing ML Kit
Developers are increasingly relying on machine learning to enhance their app’s user experience, and only with finely-tuned machine learning models can they deliver those powerful features to delight their users. Introducing ML Kit, a Machine Learning SDK available on Firebase. ML Kit comes with a set of ready-to-use APIs focused on common mobile use cases: recognizing text, detecting faces, recognizing landmarks, scanning barcodes, and labeling images.
This video is also subtitled in Chinese, Indonesian, Italian, Japanese, Korean, Portuguese, and Spanish.
Check out the documentation to learn more → https://goo.gl/hYLrJu
But there is more. As apps have grown to do more, their size has increased, hurting app store install rates and potentially costing users more in data overages. Machine learning can further exacerbate this trend, since models can reach tens of megabytes in size. So we decided to invest in model compression. Specifically, we are experimenting with a feature that allows you to upload a full TensorFlow model, along with training data, and receive in return a compressed TensorFlow Lite model. The technology behind this is evolving rapidly, so we are looking for a few developers to try it and give us feedback. If you are interested, please sign up here.
Better together with other Firebase products
Since ML Kit is available through Firebase, it’s easy for you to take advantage of the broader Firebase platform. For example, Remote Config and A/B Testing let you experiment with multiple custom models. Because Remote Config can dynamically switch values in your app, it’s a great fit for swapping the custom model your users receive on the fly. You can even create population segments and experiment with several models in parallel.
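As a sketch of that pattern, the snippet below reads a model name from Remote Config so an A/B test can vary it per experiment group. The key `"active_model_name"` is a hypothetical parameter you would define in the Firebase console; the fetch/activate calls follow the Remote Config API of that era:

```kotlin
import com.google.firebase.remoteconfig.FirebaseRemoteConfig

// Fetches the Remote Config value that names the custom model this
// user's experiment group should use, then hands it to the caller.
fun fetchActiveModelName(onReady: (String) -> Unit) {
    val config = FirebaseRemoteConfig.getInstance()
    config.fetch(3600) // cache for up to an hour
        .addOnCompleteListener { task ->
            if (task.isSuccessful) {
                config.activateFetched()
            }
            onReady(config.getString("active_model_name"))
        }
}
```

The returned name can then be passed into ML Kit’s custom model setup, so switching models for a segment of users is just a console change, with no app update required.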