Import All Tables:
-----------------
sqoop import-all-tables --connect jdbc:mysql://localhost/my1st --username=root --password=cloudera -m 1
See the imported data (import-all-tables writes each table into its own directory under the user's HDFS home): hdfs dfs -ls /user/root
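For a slightly more controlled run, import-all-tables also accepts a target warehouse directory and a list of tables to skip. The sketch below assumes the same my1st database and cloudera credentials as above; the warehouse path /user/root/my1st_import, the skipped table temp_table, and the <table_name> placeholder are just illustrations, not values from this post.

# land every table under one warehouse directory, skipping a staging table
# (path and table names below are placeholders)
sqoop import-all-tables \
  --connect jdbc:mysql://localhost/my1st \
  --username root --password cloudera \
  --warehouse-dir /user/root/my1st_import \
  --exclude-tables temp_table \
  -m 1

# each table gets its own sub-directory; list them and inspect one
hdfs dfs -ls /user/root/my1st_import
hdfs dfs -cat /user/root/my1st_import/<table_name>/part-m-00000

With -m 1 no split column is needed, so even tables without a primary key import cleanly; the single mapper typically produces one part-m-00000 file per table.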