
Diego Giorgini
Software Engineer


Firebase Cloud Messaging (FCM) is a cross-platform messaging solution that lets you reliably deliver messages to your apps and sites. It provides two types of messages:

  1. Notification Messages display a simple notification popup, with optional data payload.
  2. Data Messages deliver a JSON payload to your application and let your code handle it.

Data Messages are a great way to build custom notifications, when the layout provided by notification messages is not enough, or to trigger background operations like a database sync or the download of additional content (image attachments, emails, etc.)

How should Data Messages trigger background operations?

The best way to trigger a background operation from a data message is to use WorkManager to schedule the operation for a time that is best for the user (for example, avoiding extra work when the battery is very low, or when the CPU is already heavily used by foreground applications).

Background Process Optimizations in Android O

To get started, check out the Android O Developer Preview site where you will find instructions on downloading and installing the required SDKs. For Firebase Development, you'll also need to install the Firebase SDKs for Android. Be sure to use version 10.2.1 or later for Android O development.

Android O introduces new background process optimizations, which make the use of JobScheduler (or a wrapper library like WorkManager) a requirement for long-running background operations. Due to these optimizations, the FCM (and hence GCM) callbacks onMessageReceived() and onTokenRefresh() have a guaranteed life cycle limited to 10 seconds (the same as a Broadcast Receiver). After the guaranteed period of 10 seconds, Android considers your process eligible for termination, even if your code is still executing inside the callback. To avoid your process being terminated before your callback completes, perform only quick operations inside the callback (like updating a local database, or displaying a custom notification), and use WorkManager to schedule longer background work (like downloading additional images or syncing the database with a remote source).

@Override
public void onMessageReceived(RemoteMessage remoteMessage) {
  if (/* Check if data needs to be processed by a long-running job */ true) {
    // For long-running tasks (10 seconds or more) use WorkManager
    scheduleJob();
  } else {
    // Handle the message within 10 seconds
    handleNow();
  }
}

/**
 * Schedule a job using WorkManager.
 */
private void scheduleJob() {
  WorkManager.getInstance().enqueue(
      new OneTimeWorkRequest.Builder(MyWorker.class).build());
}

/**
 * Perform an immediate, but quick, processing of the message.
 */
private void handleNow() {
  Log.d(TAG, "Short lived task is done.");
}
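The MyWorker class referenced above is where the long-running work actually happens. As a minimal sketch, assuming a recent androidx.work dependency (the class name, tag, and log message are just illustrative), it could look like this:

import android.content.Context;
import android.util.Log;

import androidx.annotation.NonNull;
import androidx.work.Worker;
import androidx.work.WorkerParameters;

public class MyWorker extends Worker {

  private static final String TAG = "MyWorker";

  public MyWorker(@NonNull Context context, @NonNull WorkerParameters params) {
    super(context, params);
  }

  @NonNull
  @Override
  public Result doWork() {
    // Long-running work goes here, for example syncing the database with a
    // remote source or downloading attachments referenced by the data message.
    Log.d(TAG, "Long running task is done.");
    return Result.success();
  }
}

WorkManager runs doWork() on a background thread it manages, so the work can safely outlive the 10-second FCM callback window.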

We hope this helps you understand how to use FCM to schedule long-running background operations. This approach helps Android preserve battery life and ensures that your application works well on Android O. If you have any questions, don't hesitate to ask us on our support channels.

Abe Haskins
Developer Programs Engineer

Cloud Functions are a great solution for running backend code for your Firebase app. You can write a function which is triggered by many different actions like user sign-ups, writes to the Realtime Database, changes to a Cloud Storage bucket, or conversion events in Firebase Analytics. Cloud Functions can also be triggered by some external sources, for example you could tie a Cloud Function to an HTTPS endpoint or a Cloud Pub/Sub topic.

Reacting to these events is very powerful, but you may not always want to react to an event - sometimes you may want to run a function based on a time interval. For example, you could clean up extra data in your Realtime Database every night, or run analysis on your Analytics data every hour. If you have a task like this, you'll want to use App Engine Cron with Cloud Functions for Firebase to reliably trigger a function at a regular interval.

How to Schedule Functions



Cloud Functions for Firebase doesn't have any built-in scheduling support, so we'll use App Engine Cron to trigger functions on a schedule. In fact, the solution we'll implement is nearly identical to the one we recommend for doing reliable task scheduling on Google Compute Engine.

The trick is to create a tiny App Engine shim that provides hooks for App Engine Cron. These hooks will then push to Cloud Pub/Sub topics for each scheduled job.

We will then configure our Cloud Function to handle incoming messages on that Pub/Sub topic.
Although this solution is the preferred option for scheduling functions, it isn't the only way to achieve this goal. If you're interested in an alternative method, check out the functions-samples repository, which explains how to achieve a similar result using an external scheduling service.

Deploying the App Engine App

It just so happens that we've already written the App Engine app you'll need to set up scheduled functions. It's available in the firebase/functions-cron repo on GitHub.

By default, this sample triggers hourly, daily, and weekly Cloud Pub/Sub ticks. If you want to customize this schedule for your app, you can modify cron.yaml.

For details on configuring this, please see the cron.yaml Reference in the App Engine documentation.
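To give a sense of the format, here's a hedged sketch of what an entry in that cron.yaml looks like (the /publish/hourly-tick path matches the job we'll trigger manually in step 5; the exact descriptions and schedules live in the repo, and the second entry is purely an illustration of adding your own):

cron:
- description: hourly tick
  url: /publish/hourly-tick
  schedule: every 1 hours
# An illustrative custom entry you might add for your own topic:
- description: nightly cleanup tick
  url: /publish/nightly-tick
  schedule: every day 03:00

Each entry simply tells App Engine Cron to hit an HTTP handler in the shim app, which in turn publishes to the corresponding Cloud Pub/Sub topic.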

Let's get started!

1. Prerequisites

Install (or check that you have previously installed) the tools used in the steps below: git, the Google Cloud SDK (for the gcloud command-line tool), Python with pip, and the Firebase CLI.

2. Clone this repository

To clone the GitHub repository to your computer, run the following command:

git clone https://github.com/firebase/functions-cron
Change directories to the functions-cron directory. The exact path depends on where you placed the directory when you cloned the sample files from GitHub.

cd functions-cron

3. Deploy to App Engine

Configure the gcloud command-line tool to use your Firebase project.

gcloud config set project 
Change directory to appengine/
cd appengine/

Install the Python dependencies
pip install -t lib -r requirements.txt
Create an App Engine App

gcloud app create
Deploy the application to App Engine.
gcloud app deploy app.yaml cron.yaml
Open Google Cloud Logging and, in the dropdown on the right, select "GAE Application". If you don't see this option, App Engine may still be in the process of deploying.

Look for a log entry calling /_ah/start. If this entry isn't an error, then you're done deploying the App Engine app.

4. Deploy to Google Cloud Functions for Firebase

Ensure you're back at the root of the repository (cd .. if you're still in the appengine/ directory)
Deploy the sample hourly_job function to Google Cloud Functions

firebase deploy --only functions --project 
Warning: This will remove any existing functions you have deployed. If you have existing functions, copy the example from functions/index.js into your project's index.js
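For reference, the deployed hourly_job is just a Pub/Sub-triggered function. Here's a minimal sketch of what the sample's functions/index.js contains (the exact code lives in the repo, and the log message here is illustrative):

const functions = require('firebase-functions');

// Runs whenever the App Engine shim publishes to the 'hourly-tick' topic.
exports.hourly_job = functions.pubsub.topic('hourly-tick').onPublish(event => {
  console.log('This job runs once an hour!');
  return true;
});

You can add daily and weekly jobs the same way, each subscribed to its corresponding tick topic.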

5. Verify your Cron Jobs

We can verify that our function is wired up correctly by opening the Task Queue tab in App Engine and clicking on Cron Jobs. Each of these jobs has a Run Now button next to it.

The sample we deployed has only one function: hourly_job. To trigger this job, let's hit the Run Now button for the /publish/hourly-tick job.

Then, go to your terminal and run...

firebase functions:log --project 
You should see a successful console.log from your hourly_job.

You're Done!

Your cron jobs will now "tick" along forever. As we mentioned above, you're not limited to the hourly-tick, daily-tick and weekly-tick that are included in the App Engine app.

You can add more scheduled functions by modifying the cron.yaml file and re-deploying the app.

Todd Kerpelman
Developer Advocate
By now, you probably already know that you can export your Firebase Analytics data to BigQuery, which lets you run all sorts of sophisticated ad hoc queries against your analytics data.

At first, the data set in BigQuery might seem confusing to work with. If you've worked with any of our public BigQuery data sets in the past (like the Hacker News post data, or the recent San Francisco public data that our Developer Advocate Reto Meier had fun with), it probably looked a lot like a big ol' SQL table. Something like this:
The truth of the matter is that BigQuery can get much more sophisticated than that. The rows of a BigQuery table don't just have to be straightforward key-value pairs. They can look more like rows of JSON objects, containing some simple data (like strings, integers, and floats), but also more complex data like arrays, structs, or even arrays of structs. Something a little more like this:

Firebase Analytics takes advantage of this format to bundle all of your users' user properties together in the same row. Rather than have you perform some kind of join against a separate user_properties table, all of your user properties are included in the same BigQuery row as an array of structs.

A slightly simplified version of the user_properties struct in your BigQuery data 

The same thing holds true for your events. Your event parameters are included inside your events as an array of structs. And it turns out these events themselves are stored inside of an array. One single row of data in BigQuery will often contain 2 or 3 Firebase Analytics events all bundled together.
This means a single row in your BigQuery table can contain an array of user properties, as well as an array of events, which in turn also have their own arrays of event parameters. I know combining all of that information into a data structure like this seems confusing at first, but in the long run, it actually makes your life easier because there aren't any JOINs with other tables for you to worry about.

Important note: For all of these examples, I'm going to be using standard SQL, which is what all the cool kids are doing these days1. If you want to follow along, turn off Legacy SQL in your BigQuery options. Also, you'll need to follow this link to access the sample Firebase Analytics data we'll be using.

For example, I can see all of my event data at once just by calling

#standardSQL
SELECT event_dim 
FROM `firebase-analytics-sample-data.android_dataset.app_events_20160607` 
LIMIT 50
and I'll get back all of my event data, along with all of the event parameters, in one nice little table
And then if I want to get a list of all of my "Round completed" events, I can just write some SQL like this…

#standardSQL
SELECT event_dim 
FROM `firebase-analytics-sample-data.android_dataset.app_events_20160607` 
WHERE event_dim.name = "round_completed"
...which gives me a nice result of...

Error: Cannot access field name on a value with type ARRAY<STRUCT<date STRING, name STRING, params ARRAY<STRUCT<key STRING, value STRUCT<string_value STRING, int_value INT64, float_value FLOAT64, ...>>>, ...>> at [2:17]

Oh. Oh dear. 

Okay, so this won't win any awards for "Best Error Message of 2017"2 , but if you think about it, the reason it's barfing makes sense. You're trying to compare a string value to "an element of a struct that's buried inside of an array". Sure, that element ends up being a string, but they're fairly different objects.

So to fix this, you can use the UNNEST function. The UNNEST function will take an array and break it out into each of its individual elements. Let's start with a simple example.

Calling:

#standardSQL
WITH data AS (
  SELECT "primes under 15" AS description,
  [1,2,3,5,7,11,13] AS primes_array)
SELECT * 
FROM data 
will give you back a single row consisting of a string, and that array of data.

Instead, try something like this:

#standardSQL
WITH data AS (
  SELECT "primes under 15" AS description,
  [1,2,3,5,7,11,13] AS primes_array)
SELECT description, prime 
FROM data CROSS JOIN UNNEST (primes_array) as prime
What you're basically saying is, "Hey, BigQuery, please break up that primes_array into its individual members. Then join each of these members with a clone of the original row." So you end up with a data structure that looks more like this:
The results are similar to before, but now each prime is in its own row:
You'll notice that the original primes_array is still included in the data structure. In some cases (as you'll see below), this can be useful. In this particular case, I found it was a little confusing, which is why I only asked for the individual fields of description and prime instead of SELECT *.3

It's also common convention to replace that CROSS JOIN syntax with a comma, so you get a query that looks like this.

#standardSQL
WITH data AS (
  SELECT "primes under 15" AS description,
  [1,2,3,5,7,11,13] AS primes_array)
SELECT description, prime 
FROM data, UNNEST (primes_array) as prime
It's the exact same query as the previous one; it's just a little more readable. Plus, I can now stand by my original statement that this data format means you don't have to perform any JOINs. :)

And the nice thing here is that I now have one piece of "prime" data per row that I can interact with. So I can start to do comparisons like this:

#standardSQL
WITH data AS (
  SELECT "primes under 15" AS description,
  [1,2,3,5,7,11,13] AS primes_array)
SELECT description, prime 
FROM data, UNNEST (primes_array) as prime
WHERE prime > 8
To get just that list of prime numbers between 8 and 15.
So going back to our Firebase Analytics data, I can now use the UNNEST function to look for events that have a specific name. 

#standardSQL
SELECT event.name, event.timestamp_micros
FROM `firebase-analytics-sample-data.android_dataset.app_events_20160607`, 
  UNNEST(event_dim) as event
WHERE event.name = "round_completed"

As you'll recall, events have their own params array, which contains all of the event parameters. If I were to UNNEST those as well, I'd be able to query for specific events that contain specific event parameter values:

#standardSQL
SELECT event, event.name, event.timestamp_micros
FROM `firebase-analytics-sample-data.android_dataset.app_events_20160607`, 
  UNNEST(event_dim) as event,
  UNNEST(event.params) as event_param
WHERE event.name = "round_completed"
AND event_param.key = "score"
AND event_param.value.int_value > 10000

Note that in this case, I am selecting "event" as one of the fields in my query, which gives me the original array of all my event parameters nicely grouped together in my table results.

Querying against user properties works in a similar manner. Let's say I'm curious as to what language my users prefer using for my app, something our app is tracking in a "language" user property. First, I'll use the UNNEST query to get just a list of each user and their preferred language.

#standardSQL
SELECT
 user_dim.app_info.app_instance_id as unique_id,
  MAX(user_prop.key) as keyname,
  MAX(user_prop.value.value.string_value) as keyvalue
FROM `firebase-analytics-sample-data.android_dataset.app_events_20160607`,
  UNNEST(user_dim.user_properties) AS user_prop
WHERE user_prop.key = "language"
GROUP BY unique_id
And then I can use that as my inner selection to grab the total number of users4 that fall into each group.
#standardSQL
SELECT keyvalue, count(*) as count
FROM (
  SELECT
   user_dim.app_info.app_instance_id as unique_id,
    MAX(user_prop.key) as keyname,
    MAX(user_prop.value.value.string_value) as keyvalue
  FROM `firebase-analytics-sample-data.android_dataset.app_events_20160607`,
    UNNEST(user_dim.user_properties) AS user_prop
  WHERE user_prop.key = "language"
  GROUP BY unique_id
) 
GROUP BY keyvalue
ORDER BY count DESC
I can also UNNEST both my event parameters and my user properties if I want to create one great big query (no pun intended), where I look at events of a specific name whose event parameters match particular criteria, while also filtering by users who meet certain criteria:

#standardSQL
SELECT user_dim, event, event.name, event.timestamp_micros
FROM `firebase-analytics-sample-data.android_dataset.app_events_20160607`,
  UNNEST(event_dim) as event,
  UNNEST(event.params) as event_param,
  UNNEST(user_dim.user_properties) as user_prop
WHERE event.name = "round_completed"
  AND event_param.key = "squares_daubed"
  AND event_param.value.int_value > 20
  AND user_prop.key = "elite_powers"
  AND (CAST(user_prop.value.value.string_value as int64)) > 1
Once you start playing around with the UNNEST function, you'll find that it's really powerful and it can make working with Firebase Analytics data a lot more fun. If you want to find out more, you can check out the Working with Arrays section of BigQuery's standard SQL documentation.

And don't forget, you get 1 terabyte of usage data for free every month with BigQuery, so don't be afraid to play around with it. Go crazy, you array expander, you!



1 The BigQuery team has asked me to inform you that this is really because standard SQL is the preferred SQL dialect for querying data stored in BigQuery. But I'm pretty sure they're just saying that so they get invited to all the good parties.

2 Yet another year the Messies have slipped from our grasp!

3   I could have also done this by saying "SELECT * EXCEPT (primes_array)", which can be pretty convenient sometimes.

4 Okay, technically, each "App Instance" -- a user interacting with my app from multiple devices would get counted multiple times here.









Doug Stevenson
Developer Advocate
A while back, we discussed how Firebase initializes on Android. There was a lot of great discussion around that, and it sounded like some of you experimented with the same technique for getting your own Android libraries initialized. Many of you also noted that there were a few situations when you couldn't use the normal automatic init procedure.

What if you have a customized build system?

Normally, Android apps using Firebase are built with Gradle and the Google Services Gradle Plugin. This plugin pulls your Firebase project data out of google-services.json, and adds it to your app's resources. Once the resources are added to your project, there is a component called FirebaseInitProvider that automatically picks up those values and initializes Firebase with them.
However, if you have a different build system, such as Bazel, or you're otherwise unable to use the Gradle plugin, you need to find another way to get those resources into your app. The solution could be as simple as creating your own resource XML file and adding the correct values to it. The documentation for the plugin gives the details of how to get those values out of your google-services.json file and into your resources.
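As a hedged sketch, the generated resources look roughly like the following values file; the exact set of string names depends on which Firebase features your google-services.json enables, so treat the plugin documentation as the source of truth (the placeholder values below mirror the ones used later in this post):

<?xml version="1.0" encoding="utf-8"?>
<resources>
    <!-- Values normally generated by the Google Services Gradle plugin.
         Copy the real values from your google-services.json file. -->
    <string name="google_app_id" translatable="false">1:0123456789012:android:0123456789abcdef</string>
    <string name="google_api_key" translatable="false">your_api_key</string>
    <string name="firebase_database_url" translatable="false">https://your-app.firebaseio.com</string>
    <string name="google_storage_bucket" translatable="false">your-app.appspot.com</string>
    <string name="gcm_defaultSenderId" translatable="false">0123456789012</string>
    <string name="project_id" translatable="false">your-app</string>
</resources>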

What if you need to select your app's Firebase project at runtime?

It's very rare, but sometimes an app must be able to select its Firebase project at runtime. This means your app can't use the automatic init provided by the FirebaseInitProvider that's merged into your app. In that case, the solution has two tasks.

1. Disabling FirebaseInitProvider

FirebaseInitProvider is normally merged into your app automatically by the Android build tools when you build with Gradle. If you're doing your own init, however, you'll need to make sure it doesn't get merged at all. The way to do that is to use your own app's manifest to override that behavior. In your manifest, add an entry for FirebaseInitProvider, and use a node marker to set its tools:node attribute to the value "remove". This tells the Android build tools not to include this component in your app:
<provider
    android:name="com.google.firebase.provider.FirebaseInitProvider"
    android:authorities="${applicationId}.firebaseinitprovider"
    tools:node="remove"
    />

If you don't have the "tools" namespace added to your manifest root tag, you'll have to add that as well:
<manifest
    xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    package="your.package"
    >

2. Calling FirebaseApp.initializeApp() to initialize

Because you removed FirebaseInitProvider, you'll need to perform the same init somewhere in your app (in your own ContentProvider during its onCreate, to ensure Analytics can measure your app correctly). The method you'll need to call is FirebaseApp.initializeApp(Context, FirebaseOptions). If you click through to the javadoc, you'll see a few varieties of that method. The one you want takes both a Context and FirebaseOptions to initialize the default FirebaseApp instance. You can create a FirebaseOptions object using its Builder:

FirebaseOptions.Builder builder = new FirebaseOptions.Builder()
    .setApplicationId("1:0123456789012:android:0123456789abcdef")
    .setApiKey("your_api_key")
    .setDatabaseUrl("https://your-app.firebaseio.com")
    .setStorageBucket("your-app.appspot.com");
FirebaseApp.initializeApp(this, builder.build());
The documentation for the plugin will help you locate the correct strings for your project in your google-services.json file.

With these changes in place, you no longer need the Google services plugin and its JSON config file in your project.

Again, if you're happy with the way your Android app builds, you shouldn't need to implement any of the changes here. Otherwise, if your situation requires it, the information here should be all you need to take control of your Firebase init.

Tyler Rockwood
Software Engineer
The Firebase Realtime Database has traditionally been a black box that doesn't really give you a lot of insight into its performance. We're changing that. Today, you'll be able to get insights into how your database instance is performing by using the profiler built into the Firebase CLI. You can now easily monitor your database writes and reads at the path level, collecting granular data on bandwidth usage and speed.


To start, make sure you have the latest version of the Firebase CLI installed and initialized. Start profiling using the database:profile command.


firebase database:profile



This will start streaming operations from your Realtime Database. When you press enter, the CLI aggregates the collected data into a summary table broken down into three main categories: speed, bandwidth and unindexed queries. Speed and bandwidth reports are further broken down by the operation type (write or read) and the path in your Realtime Database. If you have a location with more than 25 children (for example, if you're using .push in the SDKs), the summary table collapses those paths into a single entry and replaces the push ids with $wildcard.


Speed*

This table displays 4 items: the path, the number of times that location has been hit, the number of milliseconds it took for the server to process the request, and the number of times that the path has been denied by rules.




Bandwidth**

The table displays 3 items: the path, the total amount of bandwidth for the path, and the average bandwidth per operation.


Unindexed Queries

This shows 3 things: the path, the index rule that should be added to your rules, and the number of times the location has been queried. Warnings for these queries also show up in the SDK logs.



What if this isn't enough?


You can also collect the raw operations from your server by using the --raw flag (you'll probably also want to specify an output file with --output) to get more detailed information, like IP addresses and user agent strings for connected applications. See the profiler reference page for a complete list of possible operations you can collect information about and what they show.
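For example (the output filename here is just an illustration):

# Stream raw profiler operations to a file instead of printing the summary table
firebase database:profile --raw --output profile-results.log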


* Speeds are reported at millisecond resolution and refer to the time it takes for the database to process an operation. However, you may see vastly different latencies depending on network conditions and numerous other factors.


** Bandwidth is an estimate based on the data payload and is not a valid measure of the billable amount. Values here could be over or under your actual billed bandwidth, and the stats sent by the profiler also count towards your bandwidth bill.

Mike McDonald
Product Manager
Since the initial release of Firebase Storage at Google I/O 2016, we've been happy to see mobile app developers make use of its scalable, secure, and robust file storage to power their apps. Hundreds of thousands of developers have created buckets, and we serve hundreds of millions of requests for photos, videos, audio, and other rich media every day.

But we're not done yet: we have a few more features lined up that will make it faster and easier to store and share your app's content.

Use multiple buckets in your projects

After our launch at I/O '16, Firebase projects were limited to a single bucket, located in the United States. With our announcement at Google Cloud Next '17, any Firebase project on the Blaze payment plan can now create buckets in any of the regions and storage classes supported by Google Cloud Storage. This enables some powerful use cases:
  • Logically separate different types of data (e.g. user data from developer data).
  • Store data in a location closer to users, either to optimize performance or support regulatory compliance.
  • Reduce cost by storing infrequently accessed data (e.g. backups) in a different storage class.
Creating new buckets in the Firebase Console is easy: just select the location and storage class and give it an easy-to-remember name!

Link existing buckets to your projects

Because every Firebase project is also a Google Cloud Platform project, you can easily use any existing Cloud Storage buckets directly with Firebase SDKs for Cloud Storage. This means your mobile and web apps can access data in your buckets without having to do an expensive data migration. This is a useful feature for existing apps looking to modernize by integrating Firebase.

Linking your existing bucket to Firebase is easier than creating a new one: just select the bucket you want to import, configure your security rules to allow access, and start using the bucket directly from your app.
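On Android, for example, pointing the SDK at a non-default bucket is a one-line change; here's a minimal sketch (the bucket name and path are illustrative):

// Access a linked or secondary bucket by name instead of the project's default bucket.
FirebaseStorage storage = FirebaseStorage.getInstance("gs://my-other-bucket");
StorageReference imageRef = storage.getReference().child("images/profile.jpg");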


Integrate with Google Cloud Functions

At Google Cloud Next '17 we also announced Cloud Functions for Firebase, which enables developers to write code that responds to events in Cloud or Firebase features. Cloud Storage for Firebase integrates well with that, allowing you to trigger code when a file is uploaded, changed, or deleted from a storage bucket. This powerful mechanism enables developers to build new functionality on top of their project storage, such as automatically converting images, generating thumbnails, moderating images with the Google Cloud Vision API, and extracting metadata. Previously, these tasks would have required maintenance of a custom backend, but now, Cloud Functions makes it easy to automate by deploying code with a single command line.

Same feature, new name

With these new features and integrations with Cloud Storage, we're proud to announce Firebase Storage is now Cloud Storage for Firebase. We want to highlight the fact that Firebase Storage is Google Cloud Storage, and that using Firebase means that you're getting the ease of use of an SDK tailored for mobile and web developers, plus the full scale and performance of Google's infrastructure.

You can continue to use the existing Firebase SDKs for Cloud Storage on iOS, Android, JavaScript, C++, and Unity, knowing that your data is stored on the same infrastructure that powers Snapchat, Spotify, and Google Photos. And if you want to access data from Cloud Functions or your own server, you can always use the Cloud Storage server SDKs.

We think you're going to love the expanded Cloud Storage for Firebase. When you're building your next app with us, reach out on Twitter, Facebook, Slack, or our Google Group and let us know how it's going. We can't wait to see what you build!

Brendan Lim
Product Manager
Firebase started with the belief that apps could be built with mostly client code since it was, in many instances, easier and faster. However, there are still some cases where server code is needed, such as executing trusted code, authenticating to a third party API, or running battery intensive operations. In these instances, you had to stand up your own server —  until now.

Today we are excited to announce the beta launch of Cloud Functions for Firebase. It lets you write small pieces of JavaScript, deploy them to Google's Cloud infrastructure, and execute them in response to events from throughout the Firebase ecosystem. This has been the most requested feature since Firebase launched. The ability to extend and connect Firebase features using Cloud Functions makes Firebase more powerful, allowing you to do even more with your app without having to think about servers.



Cloud Functions is a versatile tool for building your mobile app. Here are just a few of the many tasks you can perform with the integrations available at launch:

Firebase Analytics integration lets you trigger a function when a specific conversion event is fired. You can create functions to automate growth and retention workflows for your mobile apps, all without ever needing to update your client code.
Firebase Authentication integration lets you trigger a function when a new user is created or deleted.
Firebase Realtime Database integration lets you trigger a function when data is created, updated, or deleted at a specific path in the database.
Cloud Storage integration lets you trigger a function when an object is written, updated, or deleted within a particular storage bucket.
HTTP endpoint integration gives your Cloud Function a URL that can be used as a webhook. These functions are triggered when a request is made to their own unique, secure URLs.
We'll continue to add more integrations in the future.


"We were early testers of Cloud Functions for Firebase and were excited to see how easy it was to extend the Realtime Database to export data and integrate with other services."
- Erling Mårtensson, Master Architect, Sony

Firebase SDK and tooling

Cloud Functions for Firebase provides a first-class experience for Firebase developers, built on top of Google Cloud Functions. Cloud Functions are single-purpose JavaScript functions that are executed in a secure, managed Node.js environment. The Firebase SDK for Cloud Functions gives you an API that allows you to choose an event source (such as writes to Firebase Realtime Database at a specific data location) and implement a function that triggers on every matching event. Our SDK also works with TypeScript to support code completion and help you catch syntax errors early.
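For example, a function that triggers on writes to a specific path in the Realtime Database might look like this (a sketch using the event-based signature from the beta-era SDK; the path and function name are illustrative, and newer SDK versions pass a change and context pair instead of a single event):

const functions = require('firebase-functions');

// Runs every time data is written under /messages/{messageId}.
exports.logNewMessage = functions.database.ref('/messages/{messageId}')
    .onWrite(event => {
      console.log('Message written:', event.data.val());
    });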

The SDK works in tandem with the Firebase CLI to provide a seamless experience when deploying your functions. This tight integration allows you to deploy all of your functions using only a single command.

Pricing

Cloud Functions is available on all Firebase pricing plans, including our free tier. The free tier allows you to quickly experiment and try out integration with other Firebase products. For our Blaze plan, you only pay for what you use. Blaze customers also receive a monthly allotment of free usage for Cloud Functions.

"Thanks to Cloud Functions for Firebase, I built a company with no permanent employees but myself (so far), with no serious scaling concerns, and no major costs maintaining or upgrading the backend as the app grows. It's something of a miracle."

- Paul Budnitz, Founder/CEO, Wuu

Create and deploy your first Cloud Function today

Getting started is easy! Walk through our step-by-step codelab, which takes you through setting up your first Cloud Function. You can refer to our full documentation for all the details.

We can't wait to see what you build!

James Tamplin
Product Manager

Since expanding Firebase to become Google's mobile application development platform at Google I/O last year, our amazing community of developers has created over 1 million Firebase projects.

We're thrilled so many of you use and trust us. While Firebase is a full suite of products for building and growing apps, we know that some apps need more than we offer out-of-the-box. That's why we're bringing Firebase much closer to Google Cloud Platform (GCP) to serve even the most demanding applications, whether you're a new startup or a large enterprise.

Product Integrations

Firebase already shares the same account and billing system as GCP, so you can attach Firebase services to your GCP project and vice-versa. This makes for powerful combinations, such as exporting raw event data from Firebase Analytics into BigQuery for ad-hoc analysis. Starting today, we're beginning to share products too.

First, Firebase developers have been asking for ways to extend their app's functionality without spinning up a server, and Cloud Functions for Firebase lets you do just that. Cloud Functions is our new event-driven serverless compute offering that enters public beta today. The infrastructure is shared between Cloud and Firebase, allowing you to invoke a function or access resources from throughout the Cloud/Firebase ecosystem. For more information, read the announcement on the Firebase blog and Cloud blog.

Next, we're bringing Firebase Storage closer to Cloud Storage. Firebase Storage launched 10 months ago, and lets you easily upload and download files from your device directly to Cloud Storage. Previously we gave you a single bucket for your files. Now we're fully aligning the two products and letting you use any Cloud Storage bucket, from any global region and from any storage class, all straight from the Firebase SDK. To reflect this alignment we're renaming the product Cloud Storage for Firebase. Read more in our blog post.

Stay tuned for more product integrations in the future as Firebase continues to provide direct client-side access to GCP infrastructure through our iOS, Web, and Android SDKs.

Streamlined Terms of Service

We love lawyers almost as much as developers, so we're extending GCP's Terms of Service to cover several Firebase products. This makes Firebase and Cloud simpler to evaluate and use together. Products to be covered include: Authentication, Hosting, Storage, Functions, and Test Lab. Our streamlined Terms of Service will take effect soon.

The Big Picture

Firebase brings together the best of Google on mobile -- whether that's Google's flagship advertising solutions like AdMob and AdWords, or Google's analytics expertise in the form of Firebase Analytics.

Google Cloud Platform lets you benefit from the institutional knowledge Google has developed from almost two decades of running global-scale computing infrastructure.

By bringing together the ease of use of Firebase with the full range of GCP infrastructure offerings, we're better able to serve you up and down the stack. If you're a startup using Firebase to quickly get to market, you can now easily scale into a full public cloud. If you're an existing business running on GCP that wants to ship a mobile app, we've got you covered too.

We can't wait to see what you build with Firebase and Google Cloud Platform!

Laurence Moroney
Developer Advocate
Firebase Notifications is a free service that enables targeted user notifications for mobile app developers. It is based on Firebase Cloud Messaging (FCM), and provides an option if you need a flexible notification platform that requires minimal effort to get started. It includes a graphical console for sending messages, so no programming is needed when picking your target segment and creating the message they’ll receive.

We've now expanded the ways you can target your notifications. With the first release of Firebase Notifications, we provided the ability to target users in a particular app, optionally filtering on Firebase Analytics Audience membership, language, or specific app version. Today we're releasing a new set of enhancements to the Firebase Notifications console that will allow you to select a much more granular audience to receive your notification -- and as you know, accurately targeted notifications give a far greater chance of user engagement.

First of all, the user audience filter has more granular control. Now, if you choose to filter on User Audience, you'll see a set of tools that let you flexibly target users by their membership in multiple different audiences.





This allows you to target or exclude users who are members of a particular set of audiences. Using a combination of several targeting criteria, as well as user-defined audiences, you have great control over which users will receive your notification.

Also new is the facility to filter your desired target segment based on user property. Firebase User Properties are attributes that allow you to define segments of your user base, beyond the typical app level properties. So, for example, if you’re building a fitness app, and you want to track how many people have met their fitness goals, you could define a user property for that, and then send notifications to them accordingly.

Now, you can target a notification by selecting a user property to compare against a desired value. So, if the user property contains a string, you can check if it contains, does not contain, is in, or is not in a range of values, or you can target users with user properties that match a Regex using RE2 syntax.



And of course if the user property contains a numeric value, there are several operators that you can use to compare it to a value -- such as equals, greater than, less than and more.

With these tools you’re equipped to more accurately target your notifications, to improve response rates while avoiding notification spam. We’re continuing to improve all of Firebase, including Firebase Notifications, and would love to hear your feedback on this or any other features.

To learn more, check out the Firebase Notifications documentation site.


Steve Ganem
Product Manager
When it comes to analytics, the sooner you can see your data, the quicker you can react to it, and the more valuable it can be. This has been at the front of our mind since launching Firebase Analytics, and we've been working hard to get you closer to that goal.

Back in November we launched real-time conversions. This meant that your most important events were delivered immediately from end-user devices. Around the same time, we also launched real-time export to BigQuery, so that your client events could be analyzed within BigQuery as soon as they were sent to our servers.


But for those of you not using BigQuery, you'd often have to wait several hours before you could start to see any meaningful results from Firebase Analytics. And when it comes to tasks like debugging analytics within your app, or just getting a quick snapshot of how your community is reacting to your latest changes, those hours can feel realllly long.


To address those needs, we are pleased to announce that StreamView and DebugView from Firebase Analytics are being rolled out to the general public.


StreamView visualizes events as they flow into Firebase Analytics,  and gives you a general sense of how your users are interacting with your app. With StreamView you can find out if a brand new feature is as well-received as you were hoping, watch the roll-out of the latest version of your app, or gauge the response to your latest re-engagement efforts - all in real-time as events are received by Firebase Analytics.


StreamView can give you a sense of where in the world people are using your app, right down to the city level…




...or what user properties are most common among your users….





You can also get an overview of what kind of events are being triggered in your app, and for any of those events, you can see a breakdown of all the custom event parameters you've been recording...





...and you can use all of these values as report filters to see where in the world certain events are occurring or certain products are popular, and you can see what user properties or events are common among users in certain countries or cities.


One unique feature available in StreamView is the User Snapshot. This will give you a live stream of the events from a random user. You can use the view to gain insight into the individual user journeys within your app.





Within the User Snapshot view you can see the user properties that have been set, the user's device and location, and drill into the parameters of the events they're sending. You can filter the users included by their location and app version.





While you may still need to use BigQuery to get full and complete reports, or cross-reference certain sets of data against others, there's a surprising amount you can learn about your app just by following the events as they happen in StreamView. The Santa Tracker team was an early customer of StreamView and they've assured me that it's (and this is a direct quote), "Pretty bad-ass."


You can check out StreamView right away in our Demo Project!


While StreamView is a great real-time picture of your app once it's gone out to the general public, event delivery from devices is still batched. This is great for users' battery life, but difficult when you're developing your app and need to check right away whether your analytics setup is correct!


With DebugView, you can immediately see which events are being reported to Firebase Analytics. This will be activated for any device for which you've turned on Analytics Debugging, and is great for making sure you're logging the right events with the right parameters while you're building your app.
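On Android, for instance, Analytics Debugging is typically enabled for a single device with an adb command along these lines (the package name is illustrative; on iOS you pass the -FIRDebugEnabled launch argument instead):

# Enable Analytics debug mode for your app on the connected device
adb shell setprop debug.firebase.analytics.app com.example.myapp

# Turn it off again when you're done
adb shell setprop debug.firebase.analytics.app .none.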





DebugView shows you events, parameters and user properties for any individual development device. It can also highlight any events that contain invalid parameters, so that you can get those fixed before you publish your app.





To get started yourself, take a look at our documentation on DebugView, or just look for these features in your own project console in the coming days and try them out!