CREATE TABLE temp_20200817(
  id integer PRIMARY KEY,
  temp_id integer,
  temp_status VARCHAR(16),
  update_type VARCHAR(14),
  price numeric(16,8),
  quantity numeric,
  timestamp timestamp,
  seqno integer,
  ts timestamp,
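The DDL above is truncated, so the following is only an illustrative sketch exercising the columns that are visible. SQLite stands in for the (likely PostgreSQL) target database, and the inserted row values are invented for the example.

```python
import sqlite3

# Sketch only: reproduces just the columns shown in the truncated DDL above.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE temp_20200817 (
        id          INTEGER PRIMARY KEY,
        temp_id     INTEGER,
        temp_status VARCHAR(16),
        update_type VARCHAR(14),
        price       NUMERIC(16, 8),
        quantity    NUMERIC,
        timestamp   TIMESTAMP,
        seqno       INTEGER,
        ts          TIMESTAMP
    )
""")
# Hypothetical row; values are made up for illustration.
conn.execute(
    "INSERT INTO temp_20200817 (id, temp_id, temp_status, update_type, price, quantity) "
    "VALUES (?, ?, ?, ?, ?, ?)",
    (1, 42, "ACTIVE", "insert", 101.25, 3),
)
row = conn.execute(
    "SELECT temp_status, price FROM temp_20200817 WHERE id = 1"
).fetchone()
print(row)  # ('ACTIVE', 101.25)
```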
diff --git a/hudi-common/src/main/java/org/apache/hudi/common/model/DebeziumAvroPayload.java b/hudi-common/src/main/java/org/apache/hudi/common/model/DebeziumAvroPayload.java
new file mode 100644
index 00000000..cae8f13a
--- /dev/null
+++ b/hudi-common/src/main/java/org/apache/hudi/common/model/DebeziumAvroPayload.java
@@ -0,0 +1,43 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
# Download the Confluent platform zip locally: https://www.confluent.io/download/?_ga=2.149291254.1340520780.1594928883-290224092.1594928883&_gac=1.220398892.1594951593.EAIaIQobChMIm6Cmz5nT6gIVCa_ICh0IeAjlEAAYASAAEgLWkfD_BwE
# Choose the zip option. Unzip after download and set it up in your home directory.
export CONFLUENT_HOME=<path_to_confluent_home>
export PATH=$PATH:$CONFLUENT_HOME/bin
# Start services
confluent local start
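The exports above simply put the Confluent `bin` directory on `PATH`. A small sketch of that wiring, with a placeholder install location (not a real Confluent home):

```python
import os

# Mirrors the PATH wiring done by the exports above; the fallback
# location is a placeholder, not a real Confluent installation.
confluent_home = os.environ.get("CONFLUENT_HOME", "/opt/confluent")
bin_dir = os.path.join(confluent_home, "bin")
path = os.environ.get("PATH", "") + os.pathsep + bin_dir
on_path = bin_dir in path.split(os.pathsep)
print(on_path)  # True
```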
Hoodie columns removed:
{
  "type" : "record",
  "name" : "spark_schema",
  "fields" : [ {
    "name" : "timestamp",
    "type" : [ "null", "double" ],
    "default" : null
  }, {
{
  "type" : "record",
  "name" : "spark_schema",
  "fields" : [ {
    "name" : "timestamp",
    "type" : [ "null", "double" ],
    "default" : null
  }, {
    "name" : "_row_key",
    "type" : [ "null", "string" ],
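The snippet above is truncated, so the sketch below reproduces only the two fields that are visible; the null default on `_row_key` is an assumption following the same pattern as `timestamp`. It shows why these Avro unions list `"null"` first: a union field's default must match the union's first branch, so `"default": null` is only legal with `"null"` leading.

```python
import json

# Inspect a minimal version of the spark_schema record shown above.
# The null default on _row_key is assumed, following the timestamp field.
schema = json.loads("""
{
  "type" : "record",
  "name" : "spark_schema",
  "fields" : [
    { "name" : "timestamp", "type" : [ "null", "double" ], "default" : null },
    { "name" : "_row_key",  "type" : [ "null", "string" ], "default" : null }
  ]
}
""")
# Every field is nullable: "null" is the first union branch in each case.
names = [f["name"] for f in schema["fields"]]
all_nullable = all(f["type"][0] == "null" for f in schema["fields"])
print(names, all_nullable)  # ['timestamp', '_row_key'] True
```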
Screenshot page (image not reproduced)
Configs: attached (ds_configs.tgz)
Upload configs:
tar -zxvf ds_configs.tgz
hadoop fs -copyFromLocal ds_configs <DFS_CONFIG_ROOT>/
Spark submit command:
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export HADOOP_CONF_DIR=/home/guoyihua/wireline/hadoop-conf
export HUDI_UTILITIES_BUNDLE=<PATH_TO>/hoodie-utilities-0.4.8-SNAPSHOT.jar
->compactions show all
╔═════════════════════════╤═══════════╤═══════════════════════════════╗
║ Compaction Instant Time │ State     │ Total FileIds to be Compacted ║
╠═════════════════════════╪═══════════╪═══════════════════════════════╣
║ 20190605181247          │ COMPLETED │ 1106                          ║
╟─────────────────────────┼───────────┼───────────────────────────────╢
║ 20190605115126          │ COMPLETED │ 1204                          ║
╟─────────────────────────┼───────────┼───────────────────────────────╢
║ 20190605053033          │ COMPLETED │ 1303                          ║
╚═════════════════════════╧═══════════╧═══════════════════════════════╝
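A quick sanity check over the CLI output above can be sketched as follows; the tuples are transcribed from the table, not fetched from a live CLI.

```python
# Rows transcribed from the `compactions show all` table above.
compactions = [
    ("20190605181247", "COMPLETED", 1106),
    ("20190605115126", "COMPLETED", 1204),
    ("20190605053033", "COMPLETED", 1303),
]
# No instant should be left pending; sum the file IDs compacted overall.
pending = [t for t, state, _ in compactions if state != "COMPLETED"]
total_file_ids = sum(n for _, _, n in compactions)
print(pending, total_file_ids)  # [] 3613
```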
diff --git a/docker/compose/docker-compose_hadoop284_hive233_spark231.yml b/docker/compose/docker-compose_hadoop284_hive233_spark231.yml
index bbb9f10e..015c9e2b 100644
--- a/docker/compose/docker-compose_hadoop284_hive233_spark231.yml
+++ b/docker/compose/docker-compose_hadoop284_hive233_spark231.yml
@@ -145,6 +145,45 @@ services:
       - "8081:8081"
     environment:
       - "SPARK_MASTER=spark://sparkmaster:7077"
+      - "SPARK_WORKER_WEBUI_PORT=8081"
+    links:
# Apply this patch to add DFS properties:
https://github.com/bvaradar/hudi/commit/a4f79a7ab6955503e3cca0a36876305a544991ee
Use the following instead of Step (1) in the demo:
varadarb-C02SH0P1G8WL:hudi varadarb$ docker exec -it adhoc-2 /bin/bash
# Create the DFS root directory
root@adhoc-2:/opt# hadoop fs -mkdir -p /var/data/input_batch/