- Switch to project (you can see project ID on top left near search bar):
gcloud config set project <project-id>
- Connect to instance:
gcloud beta compute ssh --zone "instance_zone_name" "vm-instance-name" --project "project-name"
Example:
gcloud beta compute ssh --zone "us-central1-a" "instance-1" --project "clgcporg8-052"
- Activate Cloud Shell and use the below command to copy a file to your local machine:
gcloud compute scp --zone "instance_zone_name" vm_instance_name:path_from_copy local_path_to_copy
To copy a folder, use:
gcloud compute scp --recurse --zone "instance_zone_name" vm_instance_name:path_from_copy local_path_to_copy
Example:
gcloud compute scp --zone "us-central1-a" instance-1:~/node/index.js ~
- Install NodeJS:
sudo apt-get update
cd ~
curl -sL https://deb.nodesource.com/setup_16.x -o /tmp/nodesource_setup.sh
sudo bash /tmp/nodesource_setup.sh
sudo apt install nodejs
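Once installed, you can verify that the toolchain is on the PATH (the exact versions printed depend on the setup script used):

```shell
# Verify Node.js and npm are installed; versions will vary
node -v
npm -v
```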
- Setup NextJS project:
npx create-next-app@latest
- Install nginx:
sudo apt-get update
sudo apt install nginx -y
- Setup Nginx to proxy NextJS from port 3000 to 80:
cd /etc/nginx/sites-available
sudo mv default def.bak
sudo vim default
Paste below code in config:
server {
    listen 80;
    listen [::]:80;

    location / {
        # reverse proxy for next server
        proxy_pass http://localhost:3000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}
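The block above answers on any host name. If the VM will serve a specific domain, a `server_name` directive can be added; this is a sketch, with example.com as a placeholder domain:

```nginx
# Variant of the block above with an explicit server_name
# (example.com is a placeholder, not from the original notes)
server {
    listen 80;
    listen [::]:80;
    server_name example.com;

    location / {
        proxy_pass http://localhost:3000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}
```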
Test the configuration and restart Nginx:
sudo nginx -t
sudo systemctl restart nginx
- Install pm2 using below command:
sudo npm install pm2@latest -g
- Build NextJS application:
npm run build
- Run below command to run the app's start script under pm2:
pm2 start npm --name "nextapp" -- start
- Check status of running app process using command:
pm2 status
- Stop process using command:
pm2 stop nextapp
- To update app, rebuild application and restart process using command:
pm2 restart nextapp
- Install Jenkins:
sudo apt update
sudo apt install openjdk-17-jdk
curl -fsSL https://pkg.jenkins.io/debian-stable/jenkins.io-2023.key | sudo tee \
/usr/share/keyrings/jenkins-keyring.asc > /dev/null
echo deb [signed-by=/usr/share/keyrings/jenkins-keyring.asc] \
https://pkg.jenkins.io/debian-stable binary/ | sudo tee \
/etc/apt/sources.list.d/jenkins.list > /dev/null
sudo apt-get update
sudo apt-get install jenkins -y
- Check Jenkins status:
sudo systemctl status jenkins
Start jenkins if not started:
sudo systemctl start jenkins
- Allow Jenkins port 8080 through the firewall (if required):
sudo ufw enable
sudo ufw allow 8080
sudo ufw status
- Access Jenkins on port 8080 of the server. Run below command to get the initial password to set up Jenkins:
sudo cat /var/lib/jenkins/secrets/initialAdminPassword
- Switch to project (you can see project ID on top left near search bar):
gcloud config set project <project-id>
- Connect to instance:
gcloud beta compute ssh --zone "instance_zone_name" "vm-instance-name" --project "project-name"
- Install Java:
sudo apt update
sudo apt install openjdk-17-jdk
- Copy the download link from the Apache Kafka downloads page, then download and extract:
wget https://downloads.apache.org/kafka/3.5.0/kafka_2.12-3.5.0.tgz
tar -xzf kafka_2.12-3.5.0.tgz
cd kafka_2.12-3.5.0/
- Start zookeeper server service:
bin/zookeeper-server-start.sh config/zookeeper.properties
- Open another terminal in same project and connect to instance:
gcloud beta compute ssh --zone "instance_zone_name" "vm-instance-name" --project "project-name"
- Start Kafka broker service:
bin/kafka-server-start.sh config/server.properties
- Open another terminal in same project and connect to instance:
gcloud beta compute ssh --zone "instance_zone_name" "vm-instance-name" --project "project-name"
- Create the topic if it does not already exist, then start a Kafka producer to write some events to it:
bin/kafka-topics.sh --create --topic quickstart-events --bootstrap-server localhost:9092
bin/kafka-console-producer.sh --topic quickstart-events --bootstrap-server localhost:9092
Type any message in this terminal to send it to this topic.
- Open another terminal in same project and connect to instance:
gcloud beta compute ssh --zone "instance_zone_name" "vm-instance-name" --project "project-name"
- Start Kafka consumer to read the messages from the topic:
bin/kafka-console-consumer.sh --topic quickstart-events --from-beginning --bootstrap-server localhost:9092
First make sure the Kafka port 9092 is not blocked by the firewall (check both ufw on the VM and the GCP firewall rules; you can verify reachability from your machine with e.g. nc -vz SERVER_IP 9092). Then go to the config folder inside the Kafka folder and edit server.properties:
nano server.properties
Look for the lines below and remove the hash (#) from the beginning of each:
advertised.listeners=PLAINTEXT://localhost:9092
listeners=PLAINTEXT://0.0.0.0:9092
Update the above lines as below, replacing SERVER_IP with the VM's external IP:
advertised.listeners=PLAINTEXT://SERVER_IP:9092
listeners=PLAINTEXT://0.0.0.0:9092
Example:
advertised.listeners=PLAINTEXT://35.208.219.6:9092
listeners=PLAINTEXT://0.0.0.0:9092
Example NodeJS code to connect to Kafka. First install the kafkajs dependency (npm install kafkajs):
// producer.js
// import the `Kafka` instance from the kafkajs library
const { Kafka } = require("kafkajs")
// the client ID lets kafka know who's producing the messages
const clientId = "my-app"
// we can define the list of brokers in the cluster
const brokers = ["localhost:9092"] // or SERVER_IP:PORT
// this is the topic to which we want to write messages
const topic = "message-log"
// initialize a new kafka client and initialize a producer from it
const kafka = new Kafka({ clientId, brokers })
const producer = kafka.producer()
// we define an async function that writes a new message each second
const produce = async () => {
  await producer.connect()
  let i = 0

  // after the producer has connected, we start an interval timer
  setInterval(async () => {
    try {
      // send a message to the configured topic with
      // the key and value formed from the current value of `i`
      await producer.send({
        topic,
        messages: [
          {
            key: String(i),
            value: "this is message " + i,
          },
        ],
      })

      // if the message is written successfully, log it and increment `i`
      console.log("writes: ", i)
      i++
    } catch (err) {
      console.error("could not write message " + err)
    }
  }, 1000)
}

const consumer = kafka.consumer({ groupId: clientId })

const consume = async () => {
  // first, we wait for the client to connect and subscribe to the given topic
  await consumer.connect()
  // kafkajs v2 expects an array of topics here (v1 accepted { topic })
  await consumer.subscribe({ topics: [topic] })
  await consumer.run({
    // this function is called every time the consumer gets a new message
    eachMessage: ({ message }) => {
      // here, we just log the message to the standard output
      console.log(`received message: ${message.value}`)
    },
  })
}

module.exports = { produce, consume }
// index.js
const kafka = require("./producer")
const { produce, consume } = kafka;
// call the `produce` function and log an error if it occurs
produce().catch((err) => {
console.error("error in producer: ", err)
})
// start the consumer, and log any errors
consume().catch((err) => {
console.error("error in consumer: ", err)
})
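The payloads the producer sends are plain key/value objects; as a small standalone sketch (runnable without a broker; buildMessage is a hypothetical helper, not part of kafkajs), this is the shape producer.send expects in its messages array:

```javascript
// Standalone sketch: build the same message object shape the producer
// above passes to producer.send. No Kafka connection is needed.
// `buildMessage` is a made-up helper name for illustration only.
function buildMessage(i) {
  return { key: String(i), value: "this is message " + i }
}

console.log(JSON.stringify(buildMessage(0)))
// → {"key":"0","value":"this is message 0"}
```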