Use this as a guide, but instead of using the Java runtime, we'll use Python.
The setup can be accomplished with virtualenv to create an isolated environment.
The project folder structure will look like this:
cloudfunction/
- index.js
- pipeline.py
- ENV/
So...
mkdir cloudfunction
cd cloudfunction
pip install virtualenv
virtualenv ENV
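If your default python is 3.x, point virtualenv at a 2.7 interpreter instead, since the 2.4.0 SDK runs on Python 2 only:
virtualenv -p python2.7 ENV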
Install the Dataflow SDK (2.4.0) inside the virtual environment.
source ENV/bin/activate
pip install google-cloud-dataflow==2.4.0
Dump your Cloud Function index.js script and Dataflow pipeline.py script into the project folder.
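For reference, here's a minimal sketch of what pipeline.py might look like. The gs:// paths and step names are placeholders, not values from this guide; the index.js (not shown) launches this script, typically by invoking the Python interpreter bundled in ENV/, which is why ENV/ gets packaged along with everything else.

# pipeline.py -- a minimal word-count sketch; gs:// paths are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def run(argv=None):
    # Runner options (--project, --runner=DataflowRunner, --temp_location, ...)
    # are expected to arrive via argv when the Cloud Function launches this script.
    options = PipelineOptions(argv)
    p = beam.Pipeline(options=options)
    (p
     | 'Read' >> beam.io.ReadFromText('gs://YOUR_BUCKET/input.txt')
     | 'Split' >> beam.FlatMap(lambda line: line.split())
     | 'Count' >> beam.combiners.Count.PerElement()
     | 'Format' >> beam.Map(lambda word_count: '%s: %d' % word_count)
     | 'Write' >> beam.io.WriteToText('gs://YOUR_BUCKET/output'))
    p.run()

if __name__ == '__main__':
    run()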
Now bundle all the files by compressing the contents of the cloudfunction/ folder into a single zip file.
It should go from this:
cloudfunction/
- index.js
- pipeline.py
- ENV/
To this:
cloudfunction/
- bundled.zip
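From inside the project folder, a command along these lines does it (the zip file name is arbitrary):
cd cloudfunction
zip -r bundled.zip index.js pipeline.py ENV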
Use your trigger type of choice when deploying; the command below uses a Cloud Storage bucket trigger.
gcloud alpha functions deploy FUNCTION --stage-bucket BUCKET --trigger-bucket BUCKET
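For example, with hypothetical function and bucket names:
gcloud alpha functions deploy dataflow-trigger --stage-bucket my-staging-bucket --trigger-bucket my-upload-bucket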
Or, deploy from the Cloud Console.