Create a virtual environment
python3 -m venv .venv
source .venv/bin/activate
Install requirements
from google.cloud.logging_v2.types import logging_config
from google.cloud.logging_v2.services.config_service_v2 import ConfigServiceV2Client
from google.protobuf.field_mask_pb2 import FieldMask
# FYI - https://realpython.com/python-f-strings/ for "f"
REGION = "global"
PROJECT_ID = "rjc-testing-project"
BUCKET_ID = "my-bucket-id"
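The f-string note above is relevant because the Logging API addresses a log bucket by a single resource-name string built from these constants. A minimal sketch of how they might be combined (the values mirror the snippet; the resulting name format is the standard Logging bucket pattern):

```python
# Values repeated from the snippet above.
REGION = "global"
PROJECT_ID = "rjc-testing-project"
BUCKET_ID = "my-bucket-id"

# The Logging API identifies a log bucket by its full resource name;
# an f-string interpolates the pieces in place.
bucket_name = f"projects/{PROJECT_ID}/locations/{REGION}/buckets/{BUCKET_ID}"
print(bucket_name)
# → projects/rjc-testing-project/locations/global/buckets/my-bucket-id
```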
This script is provided without any warranties and is for testing purposes.
python3 -m venv .venv
source .venv/bin/activate
pip install --upgrade pip
import sys
import json
from pprint import pprint
from googleapiclient import discovery
from oauth2client.client import GoogleCredentials

def get_forwarding_rules(service, project):
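The snippet is cut off after the function signature. As a sketch of how such a function is typically completed with the googleapiclient pagination idiom (`aggregatedList` / `aggregatedList_next`), assuming `service` is a Compute Engine client from `discovery.build("compute", "v1", ...)`:

```python
def get_forwarding_rules(service, project):
    """Collect forwarding rules across all regions of a project.

    Sketch only: assumes `service` is a googleapiclient Compute Engine
    client; the real function body is not shown in the original.
    """
    rules = []
    request = service.forwardingRules().aggregatedList(project=project)
    while request is not None:
        response = request.execute()
        # aggregatedList groups results per scope (region); flatten them.
        for scope in response.get("items", {}).values():
            rules.extend(scope.get("forwardingRules", []))
        request = service.forwardingRules().aggregatedList_next(
            previous_request=request, previous_response=response
        )
    return rules
```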
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"net"
	"net/http"
	"os"
	"sync"
# The following script downloads complaint data from the Consumer Financial Protection Bureau.
URL="https://data.consumerfinance.gov/api/views/s6ew-h6mp/rows.csv?accessType=DOWNLOAD"
FILE="complaints.csv"
if [ ! -f "$FILE" ]; then
  echo "-- Downloading file --"
  curl -L "$URL" --output "$FILE"
fi
python process.py complaints.csv
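The process.py script itself is not shown above. A hypothetical sketch of what such a processing step might look like (the "Product" column name is an assumption about the CFPB CSV layout, not taken from the original):

```python
import csv
import sys
from collections import Counter

def count_by_product(path):
    """Tally complaint rows by the "Product" column.

    Hypothetical example of a process.py; the real script is not shown
    in the original, and the "Product" header name is an assumption.
    """
    with open(path, newline="") as f:
        return Counter(row["Product"] for row in csv.DictReader(f))

if __name__ == "__main__" and len(sys.argv) > 1:
    for product, total in count_by_product(sys.argv[1]).most_common():
        print(f"{product}: {total}")
```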
from __future__ import print_function
from apiclient.discovery import build
from oauth2client import service_account
import httplib2

# https://cloud.google.com/dataflow/docs/templates/executing-templates#example-1-custom-template-batch-job
# https://cloud.google.com/dataflow/docs/reference/rest/v1b3/projects.templates/launch
# https://cloud.google.com/dataflow/docs/templates/executing-templates#using-gcloud
projectId = "my-project-id"
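Per the projects.templates.launch reference linked above, the launch request carries a plain JSON body with the job name, template parameters, and an environment block. A sketch of assembling it (the job name, parameter names, and bucket paths are placeholders, not values from the original):

```python
projectId = "my-project-id"

# Sketch of the request body that projects.templates.launch expects
# (see the REST reference above); all values below are placeholders.
launch_body = {
    "jobName": "example-template-job",
    "parameters": {
        "inputFile": "gs://my-bucket/input.txt",
        "output": "gs://my-bucket/output",
    },
    "environment": {
        "tempLocation": "gs://my-bucket/temp",
        "zone": "us-central1-f",
    },
}
```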
using System;
using Xunit;
using System.IO;
using Google.Cloud.Dialogflow.V2;
using Grpc.Core;
using Google.Apis.Auth.OAuth2;
using Grpc.Auth;

namespace dialogflowtest
{
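The Dialogflow V2 API used above addresses a conversation by a session resource name of the form `projects/<project>/agent/sessions/<session>`. A small sketch of constructing that name (the project and session IDs are placeholders, not values from the snippet):

```python
# Dialogflow identifies a conversation by a session resource name;
# the IDs below are placeholders for illustration only.
def session_path(project_id, session_id):
    return f"projects/{project_id}/agent/sessions/{session_id}"

print(session_path("my-project-id", "session-1"))
# → projects/my-project-id/agent/sessions/session-1
```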
Create two files: main.go and docker-compose.yml. Once both are created, run docker-compose up, which builds the necessary images and starts Elasticsearch. I found it necessary to add elastic.SetSniff(false) or I could not connect. Also remember that the Docker containers have security enabled for Elasticsearch, with the credentials set to elastic:changeme. You can test it using curl:

curl http://127.0.0.1:9200/_cat/health -u elastic:changeme

Once it is running, you can run the main.go file with go run main.go.
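The same health check can be expressed without curl; a sketch using only the Python standard library, with the host, port, and elastic:changeme credentials as described above:

```python
import base64
import urllib.request

# Build the same request as the curl command above: basic auth with
# the elastic:changeme credentials against the _cat/health endpoint.
token = base64.b64encode(b"elastic:changeme").decode("ascii")
req = urllib.request.Request(
    "http://127.0.0.1:9200/_cat/health",
    headers={"Authorization": f"Basic {token}"},
)
# urllib.request.urlopen(req) performs the call once the containers
# from docker-compose up are running.
```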
The following snippet checks whether a topic exists; if it doesn't, the topic is created. It then publishes messages to that topic and creates a new subscription to receive them.
package main
import (
"context"
"log"
"time"