December 10, 2017

Running a local script “from” Google Sheets

It’s been two years since I started learning Python. Incidentally, or not, it has also been two years since I last posted something on here. It’s been a fun ride and I’ve now got several projects going on: web-scrapers and APIs for personal use, automation scripts, and one nice but practically useless predictive model utilizing machine learning.

Challenging subjects like ML are fun to practice, but nothing beats automation in my world.

For a recent project, we needed to run a local script whenever a button was clicked in a Google Sheets spreadsheet. Each script automates a specific After Effects composition, based on data in its respective spreadsheet. So we have sheet-script pairs, one button per sheet, and many users interacting with different sheets.

Google makes it fairly easy to manipulate Drive files based on certain events. Apps Script runs on Google's own cloud platform and can do pretty much anything a human can with Google's own services. Invoking a locally hosted script, however, isn't one of them.

Generalizing the problem helped pave the way: what we needed was to notify the machine, in realtime, of an event taking place on the web. Once that’s done, executing the appropriate script is straightforward.

There are two ways to make a computer aware of remote events: we either have it pull them periodically from the sender, or we have the sender push each event to the machine. The pull approach is simpler, but wouldn't have worked here: it's possible to write a script that pings Google's servers every second, but that's also a surefire way to exceed the API quota.

So we need the sender to push the notification, something Google's platform doesn't provide out of the box. The one thing a Google script can push out of the box, however, turns out to be even better: a filesystem event. That is, if Google Drive is synced as a local folder, changes in this folder are normal file events as far as the operating system is concerned, and those are among the easiest things to monitor. Now the question becomes how best to do that.1

For our use-case, I decided to create a timestamped JSON file each time there's a button event. It sure would've been nice to do the tracking in one file, but with multiple users and sheets, such a design would create a substantial risk of race conditions.2

Here’s where I complicate things a bit: Since I already had Dropbox installed, I decided to save the JSON files there, and not in Google Drive. This changes the implementation, but not the underlying concept.

So, each time one of the users clicks a button, a Google script is invoked, creating a new file in the Dropbox subfolder “json_jobs”:

### filename example: rhino-12-8-2017 10/32/30.json
{
    "processed": false,
    "script": "C:\\ae_folder\\ae_script_1.ps1",
    "generated_at": "12-8-2017 10:34:02AM",
    "delegator_spreadsheet": "spreadsheet_id"
}

Whenever a file like this is created, it's grabbed and inspected by a Python script running in the background: this script invokes the PowerShell script at the specified path, moves the resulting files to their final destination, and, if all went well, flips the processed flag to true. This way retries won't reprocess completed jobs.3

But that’s the end result. The very first step was to write the script that creates the JSON file and saves it to Dropbox. Here’s a slightly modified version of the main function, the one connected to the button-click event:

function main(spreadsheet_id) {
  var json_dict = fillDict(spreadsheet_id);
  var json_obj = convertDictToJSON(json_dict);
  sendJSONToDropbox(json_obj);
}

“Connected” is a little misleading here: the file holding this code isn't actually bound to any one spreadsheet. Instead, all spreadsheets involved import and utilize it as a library, passing their IDs whenever they call it. From there, we pass spreadsheet_id to a function that constructs a dictionary with the appropriate values, convert the dictionary to a JSON object, and send it to sendJSONToDropbox():

function sendJSONToDropbox(json_object) {
  var timestamp = timeStamp();
  var parameters = {
    "path": "/json_jobs/rhino" + "-" + timestamp + ".json",
    "mode": "add", // do not overwrite existing files
    "autorename": true,
    "mute": false // notify the Dropbox client
  };

  var headers = {
    "Content-Type": "application/octet-stream",
    "Authorization": "Bearer " + "dropbox_access_token",
    "Dropbox-API-Arg": JSON.stringify(parameters)
  };

  var options = {
    "method": "POST",
    "headers": headers,
    "payload": json_object
  };

  // Dropbox's file-upload endpoint
  var API_url = "https://content.dropboxapi.com/2/files/upload";
  var response = JSON.parse(UrlFetchApp.fetch(API_url, options).getContentText());
}
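
Each bound spreadsheet, for its part, only needs a thin trigger that calls into the shared code. A sketch, assuming the library is imported under the identifier JobLib (my name, not the real one):

```javascript
// Lives in each spreadsheet's bound script; "JobLib" is an assumed library identifier.
function onButtonClick() {
  // Pass this sheet's own ID to the shared main() in the library.
  JobLib.main(SpreadsheetApp.getActiveSpreadsheet().getId());
}
```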


And that’s one way to run a local script from Google Sheets.

  1. Yes, we do end up listening to something, but polling the filesystem is overwhelmingly cheaper than polling Google’s servers. ↩︎
  2. I haven’t found a way to directly append data to a Dropbox file using the API. ↩︎
  3. This is one of several reasons to call the scripts indirectly via Python. The main reason: I didn’t write these scripts myself, and have no intentions whatsoever to learn PowerShell. ↩︎