We’ve delivered processed result data to clients in various ways. Sometimes we create a custom Shiny dashboard hosted on Amazon Web Services or shinyapps.io. In simpler cases, it is a flat text file or an Excel sheet; for more complex cases, a hosted database solution with a Tableau implementation sitting on top.
Then along came Jenny Bryan’s Google Sheets R package. This was a game changer for us, since many of our clients need up-to-date data in an environment they are comfortable in, which is typically Excel. Rather than set up a remote database and dashboard, or even schedule e-mails to send Excel reports on a regular basis, we just need to schedule an R script to run, which then updates all of the data in the same place.
Here’s a high-level summary of how to accomplish this:
1. Install the googlesheets package.
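Assuming installation from CRAN, this is a one-liner:

install.packages('googlesheets')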
2. Set up an OAuth token.
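Run interactively, a minimal sketch of the token setup looks like this (gs_auth() opens a browser for Google sign-in and caches the OAuth token locally, so later scheduled runs can reuse it):

library(googlesheets)
gs_auth()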
3. Create the Google Sheet via Google Docs.
4. Access the sheet by its name using
gs_title('mysheetname'). This will display its key, which we then use to identify the sheet for future updates.
5. Create a Google Sheet object in R:
gs_object <- gs_key('some_key_from_step_4')
6. Once the data.frame in R ('result' in this example) is ready to upload to Google Sheets, it takes two commands to update the sheet. We have been overwriting the data each time, rather than appending new data, since we aren’t dealing with large amounts of data. The first command sets the header names (anchored at cell A1). The second command uploads the data itself. We recommend setting trim=TRUE so the sheet only uses the minimum number of rows and columns needed.
gs_edit_cells(gs_object, ws='sheetname', input=colnames(result), byrow=TRUE, anchor="A1")
gs_edit_cells(gs_object, ws='sheetname', input=result, anchor="A2", col_names=FALSE, trim=TRUE)
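Put together, the scheduled script can be sketched roughly as follows; the sheet key, worksheet name, and the run_analysis() function that produces the result data.frame are placeholders standing in for your own:

library(googlesheets)
gs_auth()  # reuses the locally cached OAuth token on scheduled runs
gs_object <- gs_key('some_key_from_step_4')
result <- run_analysis()  # hypothetical function returning the result data.frame
gs_edit_cells(gs_object, ws='sheetname', input=colnames(result), byrow=TRUE, anchor="A1")
gs_edit_cells(gs_object, ws='sheetname', input=result, anchor="A2", col_names=FALSE, trim=TRUE)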