Google Developers Blog: Firebase

Posted by Brahim Elbouchikhi, Director of Product Management and Matej Pfajfar, Engineering Director

We launched ML Kit at I/O last year with the mission of simplifying machine learning for everyone. We couldn’t be happier about the experiences that ML Kit has enabled thousands of developers to create, and, more importantly, user engagement with features powered by ML Kit is growing by more than 60% per month. Below is a small sample of the apps we have been working with.

But there is a lot more. At I/O this year, we are excited to introduce four new features.

The Object Detection and Tracking API lets you identify the prominent object in an image and then track it in real time. You can pair this API with a cloud solution (e.g. Google Cloud’s Product Search API) to create a real-time visual search experience.

When you pass an image or video stream to the API, it will return the coordinates of the primary object as well as a coarse classification. The API then provides a handle for tracking this object's coordinates over time.
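To make this concrete, here is a minimal Kotlin sketch of that flow, assuming the firebase-ml-vision API surface that shipped around I/O 2019 (class names may differ in your SDK version, and the frame-processing helper is ours for illustration):

// A minimal sketch of live object detection and tracking with ML Kit's
// firebase-ml-vision library. Assumes the I/O 2019-era API surface;
// check the current documentation for exact class names.
import com.google.firebase.ml.vision.FirebaseVision
import com.google.firebase.ml.vision.common.FirebaseVisionImage
import com.google.firebase.ml.vision.objects.FirebaseVisionObjectDetectorOptions

val options = FirebaseVisionObjectDetectorOptions.Builder()
    .setDetectorMode(FirebaseVisionObjectDetectorOptions.STREAM_MODE) // track across frames
    .enableClassification()                                           // coarse category labels
    .build()

val detector = FirebaseVision.getInstance().getOnDeviceObjectDetector(options)

// Call this for each camera frame (e.g. from a CameraX ImageAnalysis callback).
fun processFrame(image: FirebaseVisionImage) {
    detector.processImage(image)
        .addOnSuccessListener { objects ->
            for (obj in objects) {
                val box = obj.boundingBox                 // coordinates of the object
                val id = obj.trackingId                   // stable handle across frames
                val category = obj.classificationCategory // coarse classification
                // Draw the box, or send a crop to a product search backend.
            }
        }
        .addOnFailureListener { e -> /* handle errors */ }
}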

A number of partners have already built experiences powered by this API. For example, Adidas built a visual search experience right into their app.

The On-device Translation API allows you to use the same offline models that support Google Translate to provide fast, dynamic translation of text in your app into 58 languages. This API operates entirely on-device so the context of the translated text never leaves the device.

You can use this API to let users communicate with people who don't speak their language, or to translate user-generated content.
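As an illustration, here is a minimal Kotlin sketch of a one-off translation, assuming the firebase-ml-natural-language API surface from around I/O 2019 (the helper function is ours for illustration):

// A minimal sketch of on-device translation with ML Kit's
// firebase-ml-natural-language library. Assumes the I/O 2019-era API
// surface; check the current documentation for exact class names.
import com.google.firebase.ml.naturallanguage.FirebaseNaturalLanguage
import com.google.firebase.ml.naturallanguage.translate.FirebaseTranslateLanguage
import com.google.firebase.ml.naturallanguage.translate.FirebaseTranslatorOptions

fun translateToSpanish(text: String) {
    val options = FirebaseTranslatorOptions.Builder()
        .setSourceLanguage(FirebaseTranslateLanguage.EN)
        .setTargetLanguage(FirebaseTranslateLanguage.ES)
        .build()
    val translator = FirebaseNaturalLanguage.getInstance().getTranslator(options)

    // Download the offline model if it isn't on the device yet, then
    // translate locally; the text itself never leaves the device.
    translator.downloadModelIfNeeded()
        .addOnSuccessListener {
            translator.translate(text)
                .addOnSuccessListener { translated -> println(translated) }
                .addOnFailureListener { e -> /* handle errors */ }
        }
        .addOnFailureListener { e -> /* model download failed */ }
}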

ML Kit's text recognition, language detection, and translation APIs can also be combined into a single experience.

We also collaborated with the Material Design team to produce a set of design patterns for integrating ML into your apps. We are open sourcing implementations of these patterns and hope that they will further accelerate your adoption of ML Kit and AI more broadly.

Our design patterns for machine learning powered features will be available on the Material.io site.

With AutoML Vision Edge, you can easily create custom image classification models tailored to your needs. For example, you may want your app to identify different types of food or distinguish between species of animals. Whatever your need, just upload your training data to the Firebase console and Google's AutoML technology will build a custom TensorFlow Lite model that runs locally on your users' devices. And if collecting training datasets is hard, you can use our open source app, which makes the process simpler and more collaborative.
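For illustration, here is a minimal Kotlin sketch of running such a model on-device, assuming the firebase-ml-vision-automl API surface from around I/O 2019 and a model manifest bundled in your app's assets (the asset path and helper function are placeholders of ours):

// A minimal sketch of on-device inference with an AutoML Vision Edge model
// (firebase-ml-vision + firebase-ml-vision-automl). Assumes the I/O 2019-era
// API surface; check the current documentation for exact class names.
import com.google.firebase.ml.common.modeldownload.FirebaseAutoMLLocalModel
import com.google.firebase.ml.vision.FirebaseVision
import com.google.firebase.ml.vision.common.FirebaseVisionImage
import com.google.firebase.ml.vision.label.FirebaseVisionOnDeviceAutoMLImageLabelerOptions

val localModel = FirebaseAutoMLLocalModel.Builder()
    .setAssetFilePath("automl/manifest.json") // manifest exported by AutoML Vision Edge
    .build()

val labelerOptions = FirebaseVisionOnDeviceAutoMLImageLabelerOptions.Builder(localModel)
    .setConfidenceThreshold(0.5f) // only return reasonably confident labels
    .build()

val labeler = FirebaseVision.getInstance().getOnDeviceAutoMLImageLabeler(labelerOptions)

fun classify(image: FirebaseVisionImage) {
    labeler.processImage(image)
        .addOnSuccessListener { labels ->
            for (label in labels) {
                println("${label.text}: ${label.confidence}") // e.g. "sushi: 0.92"
            }
        }
        .addOnFailureListener { e -> /* handle errors */ }
}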

Wrapping up

We are excited by this first year and really hope that our progress will inspire you to get started with Machine Learning. Please head over to g.co/mlkit to learn more or visit Firebase to get started right away.

Posted by Mertcan Mermerkaya, Software Engineer

We have great news for web developers who use Firebase Cloud Messaging to send notifications to clients! The FCM v1 REST API now integrates fully with the Web Notifications API. This integration lets you set icons, images, actions, and more for your web notifications from your server! Better yet, as the Web Notifications API continues to grow and change, these options will be immediately available to you. You won't have to wait for an update to FCM to support them!

Below is a sample payload you can send to your web clients on browsers that support the Push API. This notification would be useful for a web app that supports image posting, and it can encourage users to engage with the app.

{   "message": {     "webpush": {       "notification": {         "title": "Fish Photos 🐟",         "body":           "Thanks for signing up for Fish Photos! You now will receive fun daily photos of fish!",         "icon": "firebase-logo.png",         "image": "guppies.jpg",         "data": {           "notificationType": "fishPhoto",           "photoId": "123456"         },         "click_action": "https://example.com/fish_photos",         "actions": [           {             "title": "Like",             "action": "like",             "icon": "icons/heart.png"           },           {             "title": "Unsubscribe",             "action": "unsubscribe",             "icon": "icons/cross.png"           }         ]       }     },     "token": "<APP_INSTANCE_REGISTRATION_TOKEN>"   } }

Notice that you can set new parameters, such as actions, which give the user different ways to interact with the notification. In the example above, users can choose to like the photo or to unsubscribe.

To handle action clicks in your app, you need to add an event listener in the default firebase-messaging-sw.js file (or your custom service worker). If an action button was clicked, event.action will contain the string that identifies the clicked action. Here's how to handle the "like" and "unsubscribe" events on the client:

// Retrieve an instance of Firebase Messaging so that it can handle background messages.
const messaging = firebase.messaging();

// Add an event listener to handle notification clicks.
self.addEventListener('notificationclick', function(event) {
  if (event.action === 'like') {
    // Like button was clicked.
    const photoId = event.notification.data.photoId;
    like(photoId);
  } else if (event.action === 'unsubscribe') {
    // Unsubscribe button was clicked.
    const notificationType = event.notification.data.notificationType;
    unsubscribe(notificationType);
  }
  event.notification.close();
});

The SDK will still handle regular notification clicks and redirect the user to your click_action link if provided. To see more on how to handle click actions on the client, check out the guide.

Since different browsers support different parameters on different platforms, it's important to check the browser compatibility documentation to ensure your notifications work as intended. Want to learn more about what the Send API can do? Check out the FCM Send API documentation and the Web Notifications API documentation. If you're using the FCM Send API and you incorporate the Web Notifications API in a cool way, let us know! Find Firebase on Twitter at @Firebase, and on Facebook and Google+ by searching "Firebase".

Posted by Brahim Elbouchikhi, Product Manager

In today's fast-moving world, people have come to expect mobile apps to be intelligent: adapting to users' activity or delighting them with surprising smarts. As a result, we think machine learning will become an essential tool in mobile development. That's why on Tuesday at Google I/O, we introduced ML Kit in beta: a new SDK that brings Google's machine learning expertise to mobile developers in a powerful, yet easy-to-use package on Firebase. We couldn't be more excited!

Machine learning for all skill levels

Getting started with machine learning can be difficult for many developers. Typically, new ML developers spend countless hours learning the intricacies of implementing low-level models, using frameworks, and more. Even for the seasoned expert, adapting and optimizing models to run on mobile devices can be a huge undertaking. Beyond the machine learning complexities, sourcing training data can be an expensive and time-consuming process, especially when considering a global audience.

With ML Kit, you can use machine learning to build compelling features, on Android and iOS, regardless of your machine learning expertise. More details below!

Production-ready for common use cases

If you're a beginner who just wants to get the ball rolling, ML Kit gives you five ready-to-use ("base") APIs that address common mobile use cases: