Here we describe how to obtain the different types of credentials that can be important when working with a Google API. This can matter for both regular users and package developers.
Note that most users of gargle-using packages do not need to read this and can just enjoy the automatic token flow. This article is for people who have a specific reason to be more proactive about auth.
You will need a Google Cloud Platform (GCP) project to hold your credentials.
Go to the Google Cloud Platform Console:
This console is your general destination for inspecting and modifying your GCP projects.
Create a new project here, if necessary. Otherwise, select the project of interest, if you have more than one.
Enable the relevant API(s) for your GCP project.
In the left sidebar, navigate to APIs & Services > Library.
Identify the API of interest. Click Enable.
If you get this wrong, i.e. you realize later that you need to enable more APIs, you can always come back and do so then.
For some APIs, you won’t be able to do anything interesting with the credentials hosted in your project unless you have also linked a billing account. This is true, for example, for BigQuery and anything that has to do with Maps. This is NOT true, for example, for Drive or Sheets or Gmail.
If your target API requires a billing account, that obviously raises the stakes for how you manage any API keys, OAuth clients, or service account tokens. Plan accordingly.
If you’re new to Google Cloud Platform, you’ll get to enjoy GCP Free Tier. At the time of writing, this means you get $300 credit and no additional billing will happen without your express consent. So there is a low-stress way to experiment with APIs, with a billing account enabled, without putting actual money on the line.
Some APIs accept requests to read public resources, in which case the request can be sent with an API key in lieu of a token. If this is possible, it’s a good idea to expose this workflow in a wrapper package, because then your users can decide to go into a “de-authed” mode. When using the package in a non-interactive or indirect fashion (e.g. a scheduled job on a remote server or via Shiny), it is wonderful to NOT have to manage a token, if the work can be done with an API key instead.
Some APIs aren’t really usable without a token, in which case an API key may not be relevant and you can ignore this section.
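To make the key-only workflow concrete, here is a minimal sketch of a request for a public resource sent with an API key instead of a token. It assumes the Drive v3 `files` endpoint; the file ID and key are placeholders you would supply yourself:

```r
library(httr)

# sketch: fetch metadata for a *public* Drive file using only an API key;
# replace YOUR_FILE_ID and YOUR_API_KEY with real values
resp <- GET(
  "https://www.googleapis.com/drive/v3/files/YOUR_FILE_ID",
  query = list(key = "YOUR_API_KEY")
)
content(resp)
```

Note that a request like this will only succeed for resources that are truly public; anything else still requires a token.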
Package maintainers might want to build an API key in as a fallback, possibly taking some measures to obfuscate the key and limit its use to your package.
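As a rough illustration (not taken from any real package, and obfuscation is not real security), a maintainer might store the fallback key lightly disguised and let a user-supplied key take precedence; all names here are hypothetical:

```r
# hypothetical: the fallback key, split into pieces to discourage
# simple copy-paste scraping (obfuscation, not protection)
.fallback_key_pieces <- c("AIzaFAKE", "fallback", "keypieces")

mypkg_api_key <- function() {
  # a key configured by the user always wins over the built-in fallback
  user_key <- getOption("mypkg.api_key", default = "")
  if (nzchar(user_key)) {
    return(user_key)
  }
  paste0(.fallback_key_pieces, collapse = "")
}
```

Restricting the key in the GCP Console (e.g. to specific APIs) limits the damage if it is scraped anyway.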
Package users could register an API key for use with a wrapper package. For example, in googlesheets4, one would use googlesheets4::gs4_auth_configure() to store a key for use in downstream requests, i.e. after a call to googlesheets4::gs4_deauth():
```r
library(googlesheets4)

gs4_auth_configure(api_key = "YOUR_API_KEY_GOES_HERE")
gs4_deauth()

# now you can read public resources, such as official example Sheets,
# without any need for auth
gs4_example("gapminder") %>%
  read_sheet()
```
Most APIs are used to create and modify resources on behalf of the user and these requests must include the user’s token. A regular user will generally need to send an OAuth2 token, which is obtained under the auspices of an OAuth “app” or “client”. This is called three-legged OAuth, where the 3 legs are the app or client, the user, and Google.
The basic steps are described in the Prerequisites section of Google's documentation on OAuth 2.0 for Mobile & Desktop Apps:
There are two ways to package this info for use with httr or gargle, both of which require an object of class oauth_app. In both cases, I suggest you devise a nickname for each OAuth credential and use it as the credential's name in the GCP Console and as the appname argument to httr::oauth_app().
Package maintainers might want to build this app in as a fallback, possibly taking some measures to obfuscate the client ID and secret and limit its use to your package.
Package users could register this app for use with a wrapper package. For example, in googledrive, one would use
googledrive::drive_auth_configure() to do this:
```r
library(googledrive)

# method 1: direct provision of client ID and secret
google_app <- httr::oauth_app(
  "my-very-own-google-app",
  key = "123456789.apps.googleusercontent.com",
  secret = "abcdefghijklmnopqrstuvwxyz"
)
drive_auth_configure(app = google_app)

# method 2: provide filepath to JSON containing client ID and secret
drive_auth_configure(
  path = "/path/to/the/JSON/you/downloaded/from/gcp/console.json"
)

# now any new OAuth tokens are obtained with the configured app
```
For a long time, the recommended way to make authorized requests to an API in a non-interactive context was to use a service account token. As of April 2021, an alternative exists – workload identity federation – which is available to applications running on specific non-Google Cloud platforms, such as AWS. If you can use workload identity federation, you probably should (see the next section). But for those who can’t, here we outline the use of a conventional service account.
Google provides an official overview of service accounts in its documentation. But note that it's not necessary to understand all of that in order to use a service account token.
In the Console, create a service account, generate a JSON key for it, and download the key file. Make a note of the path to this file.
Authors of wrapper packages can use the symmetric encryption strategy described in Managing tokens securely to use this token on remote servers, such as continuous integration services like GitHub Actions.
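As a rough sketch of the symmetric-encryption idea (the gargle article linked above is the authoritative recipe; the sodium-based approach and the env var name here are illustrative assumptions):

```r
library(sodium)

# derive a symmetric key from a passphrase stored in an env var;
# register the same env var as a secret on your CI service
key <- sha256(charToRaw(Sys.getenv("MYPKG_TOKEN_PASSWORD")))

# encrypt the service account token file locally, once
plain  <- readBin("service-account-token.json", "raw",
                  n = file.size("service-account-token.json"))
nonce  <- random(24)
cipher <- data_encrypt(plain, key, nonce)
writeBin(c(nonce, cipher), "service-account-token.json.enc")

# on CI: recover the plain token, then hand it to the auth function
enc    <- readBin("service-account-token.json.enc", "raw",
                  n = file.size("service-account-token.json.enc"))
plain2 <- data_decrypt(enc[-(1:24)], key, nonce = enc[1:24])
writeBin(plain2, "service-account-token.json")
```

Only the encrypted file is committed or uploaded; the passphrase lives solely in the CI service's secret store.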
You could provide the token’s filepath to a wrapper package’s main auth function, e.g.:
```r
# googledrive
drive_auth(path = "/path/to/your/service-account-token.json")
```
Alternatively, you could put the token somewhere (or store its location in an environment variable) so that it is auto-discovered by the Application Default Credentials search strategy.
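For example, the Application Default Credentials search consults the conventional GOOGLE_APPLICATION_CREDENTIALS environment variable, so something like the following (the path shown is a placeholder) lets auth succeed with no explicit path in the R code:

```r
# set outside of R, e.g. in .Renviron or in the shell:
#   GOOGLE_APPLICATION_CREDENTIALS=/path/to/your/service-account-token.json

# then, in R, auth can be fully automatic:
library(googledrive)
drive_auth()
```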
Workload identity federation is a new (as of April 2021) keyless authentication mechanism that allows applications running on a non-Google Cloud platform, such as AWS, to access Google Cloud resources without using a conventional service account token. This eliminates the dilemma of how to safely manage service account credential files.
Unlike service accounts, the configuration file for workload identity federation contains no secrets. Instead, it holds non-sensitive metadata. The external application obtains the needed sensitive data “on-the-fly” from the running instance. The combined data is then used for a token exchange that ultimately yields a short-lived GCP access token. This access token allows the external application to impersonate a service account and inherit the permissions of the service account to access GCP resources.
So what’s not to love? Well, first, this auth flow is only available if your code is running on AWS, Azure, or another platform that supports the OpenID Connect protocol. Second, there’s a non-trivial amount of pre-configuration necessary on both ends. But once that is done, you can download a configuration file that makes auth work automagically with gargle.
This feature is still experimental in gargle and currently only supports AWS. For more, see the documentation for credentials_external_account(). Like conventional service account tokens, workload identity federation is a great fit for the Application Default Credentials strategy for discovering credentials. See credentials_app_default() for more about that.
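A minimal sketch, assuming you have already completed the configuration on both the GCP and AWS sides and downloaded the external account configuration file (the path and scope shown are placeholders):

```r
# fetch a token via workload identity federation; this only works when
# the code is running on AWS and the configuration file is in place
token <- gargle::credentials_external_account(
  scopes = "https://www.googleapis.com/auth/drive",
  path   = "/path/to/external-account-config.json"
)
```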
These two links provide, respectively, a high-level overview and step-by-step instructions for this flow: