

With googleComputeEngineR and the new gcer-public project containing public images, including one with cronR already installed, this is as simple as the few lines of code below:

```r
library(googleComputeEngineR)

# get the tag for the prebuilt Docker image with googleAuthRverse, cronR and tidyverse
tag <- gce_tag_container("google-auth-r-cron-tidy", project = "gcer-public")
# gcr.io/gcer-public/google-auth-r-cron-tidy

# launch a VM running that image
vm <- gce_vm("cron-vm",
             template = "rstudio",
             dynamic_image = tag,
             username = "me", password = "mypassword")
```

To help with this, on Google Cloud you can authenticate with the storage services above using the same details you used to launch the VM (all are covered under the same scope). When on a GCE VM, you can access this auth in R via `googleAuthR::gar_gce_auth()`.

An example skeleton of the kind of script you might be scheduling is shown below. It downloads authentication files, does an API call, then saves the result up to the cloud again, starting from:

```r
library(googleAuthR)

# use the GCS auth to download the auth files for your API
```
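A minimal sketch of what the rest of such a skeleton could look like, assuming googleCloudStorageR for the download and upload steps; the bucket name, object names and the API call itself are placeholders, not from the original:

```r
library(googleAuthR)
library(googleCloudStorageR)

# authenticate on the GCE VM with its own credentials
gar_gce_auth()

# use the GCS auth to download the auth files for your API
# ("my-bucket" and the object names below are placeholders)
gcs_get_object("auth/my_api_auth_file", bucket = "my-bucket",
               saveToDisk = "my_api_auth_file", overwrite = TRUE)

# ... authenticate with your API using that file and make the call here ...
result <- data.frame()  # placeholder for the API response

# save the result back up to the cloud, not on the VM
tmp <- tempfile(fileext = ".csv")
write.csv(result, tmp, row.names = FALSE)
gcs_upload(tmp, bucket = "my-bucket", name = "results/latest.csv")
```

Scheduled via cronR, this keeps the VM stateless: everything it needs comes down from GCS at the start of each run, and everything it produces goes back up at the end.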

I would suggest not saving or using data in the same place you are doing the scheduling. Use a service like BigQuery (bigQueryR) or Google Cloud Storage (googleCloudStorageR) to first load any necessary data, do your work, then save it out again. This may be a bit more complicated to set up, but will save you tears if the VM or service goes down - you still have your data.
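The load/work/save pattern with BigQuery could be sketched as follows; this is an assumption-laden illustration, and the project, dataset and table names are placeholders, not from the original:

```r
library(googleAuthR)
library(bigQueryR)

# authenticate with the VM's own credentials
gar_gce_auth()

# load any necessary data from BigQuery (IDs are placeholders)
dat <- bqr_query(projectId = "my-project",
                 datasetId = "my_dataset",
                 query = "SELECT * FROM my_table")

# ... do your work on `dat` here ...

# save the result out to BigQuery again, not on the VM's disk
bqr_upload_data(projectId = "my-project",
                datasetId = "my_dataset",
                tableId = "my_results",
                upload_data = dat)
```

Because the data lives in BigQuery rather than on the VM, the scheduling instance can be deleted or replaced at any time without losing anything.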
