$ mysql -u root -p
Enter password:
mysql> SHOW VARIABLES LIKE 'validate_password%';
+--------------------------------------+--------+
| Variable_name                        | Value  |
+--------------------------------------+--------+
| validate_password.check_user_name    | ON     |
| validate_password.dictionary_file    |        |
| validate_password.length             | 8      |
| validate_password.mixed_case_count   | 1      |
| validate_password.number_count       | 1      |
| validate_password.policy             | MEDIUM |
| validate_password.special_char_count | 1      |
+--------------------------------------+--------+
7 rows in set (0.06 sec)
mysql> SET GLOBAL validate_password.length = 6;
Query OK, 0 rows affected (0.00 sec)
mysql> SET GLOBAL validate_password.number_count = 0;
Query OK, 0 rows affected (0.00 sec)
mysql> SET GLOBAL validate_password.policy = LOW;
Query OK, 0 rows affected (0.00 sec)
mysql> CREATE USER 'hiveuser'@'localhost' IDENTIFIED BY 'mypassword';
...
mysql> REVOKE ALL PRIVILEGES, GRANT OPTION FROM 'hiveuser'@'localhost';
mysql> GRANT ALL PRIVILEGES ON metastore.* TO 'hiveuser'@'localhost';
mysql> FLUSH PRIVILEGES;
mysql> quit;
$ mysql -u hiveuser -p
Enter password: mypassword
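As a quick sanity check, the new account's grants can be listed before moving on. A sketch (it uses the 'hiveuser'/'mypassword' credentials created above, and simply reports "skipped" on a machine without a MySQL client or server):

```shell
# Guarded check: list the grants visible to hiveuser. Requires a running
# MySQL server with the account created above; otherwise it no-ops.
if command -v mysql >/dev/null 2>&1; then
  mysql -u hiveuser -pmypassword -e "SHOW GRANTS FOR CURRENT_USER();" || true
  GRANT_CHECK="attempted"
else
  GRANT_CHECK="skipped (mysql client not on PATH)"
fi
echo "grant check: $GRANT_CHECK"
```

The expected grants are exactly the ALL PRIVILEGES on metastore.* given above and nothing global.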
hive-site.xml:
--------------
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost/metastore?createDatabaseIfNotExist=true</value>
    <description>metadata is stored in a MySQL server</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.cj.jdbc.Driver</value>
    <description>MySQL Connector/J 8.x JDBC driver class</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hiveuser</value>
    <description>user name for connecting to the MySQL server</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>mypassword</value>
    <description>password for connecting to the MySQL server</description>
  </property>
</configuration>
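Before restarting Hive it is worth confirming that the file really declares all four JDBC properties. A minimal sketch of that check (it greps a temporary copy mirroring the configuration above so it runs anywhere; point CONF at your real conf/hive-site.xml instead):

```shell
# Write a copy of the configuration to a temp file, then grep for the four
# javax.jdo.option properties the metastore connection needs.
CONF=$(mktemp)
cat > "$CONF" <<'EOF'
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost/metastore?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.cj.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hiveuser</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>mypassword</value>
  </property>
</configuration>
EOF
MISSING=0
for prop in ConnectionURL ConnectionDriverName ConnectionUserName ConnectionPassword; do
  grep -q "javax.jdo.option.$prop" "$CONF" || { echo "missing: $prop"; MISSING=1; }
done
[ "$MISSING" -eq 0 ] && echo "hive-site.xml: all four JDBC properties present"
```

A missing or misspelled property here is the most common cause of Hive silently falling back to an embedded Derby metastore.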
Copy hive-site.xml from Hive's conf folder into Spark's conf folder:
cp /home/hadoop/apache-hive-3.1.2-bin/conf/hive-site.xml /home/hadoop/spark-3.0.0-preview2-bin-hadoop3.2/conf
Download the MySQL connector directly into Spark's jars folder:
hadoop@hadoop:~/spark-3.0.0-preview2-bin-hadoop3.2/jars$ wget https://repo1.maven.org/maven2/mysql/mysql-connector-java/8.0.20/mysql-connector-java-8.0.20.jar
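Spark picks up any jar placed in its jars/ folder at startup, so a plain directory listing confirms the download. A sketch of that check (simulated here in a scratch directory so it runs anywhere; on a real system run the same ls/grep against ~/spark-3.0.0-preview2-bin-hadoop3.2/jars):

```shell
# Simulate Spark's jars/ folder and confirm the connector jar is present.
JARS=$(mktemp -d)
touch "$JARS/mysql-connector-java-8.0.20.jar"   # stands in for the wget download
FOUND=$(ls "$JARS" | grep -c 'mysql-connector-java')
echo "connector jars found: $FOUND"
```

If the count is 0 on the real system, re-run the wget from inside the jars folder.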
Derby installation:
sudo apt install derby-tools libderby-java libderbyclient-java
hadoop@hadoop:~/apache-hive-3.1.2-bin/conf$ whereis derby
derby: /usr/share/derby
hive-site.xml - only for Derby (the databaseName attribute is dropped since the database path is already part of the URL):
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:derby:/home/hadoop/apache-hive-3.1.2-bin/metastore_db;create=true</value>
</property>
hive --service metastore
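By default the metastore service listens on port 9083 (hive.metastore.port), so a quick port probe tells you whether it came up. A guarded sketch; it simply reports "not reachable" on a machine without a running metastore or without nc:

```shell
# Probe the default metastore port; harmless if nothing is listening.
if command -v nc >/dev/null 2>&1 && nc -z localhost 9083 2>/dev/null; then
  METASTORE_STATUS="listening on 9083"
else
  METASTORE_STATUS="not reachable on 9083"
fi
echo "metastore: $METASTORE_STATUS"
```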
If anything goes wrong, drop the metastore database in MySQL and re-run the step below to re-create it.
Create the metastore schema in MySQL:
schematool -dbType mysql -initSchema
Create the metastore schema in Derby:
schematool -initSchema -dbType derby
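After -initSchema, schematool can report the installed schema version via its -info flag. A guarded sketch (assumes Hive's bin directory is on PATH; it no-ops elsewhere):

```shell
# Query the installed metastore schema version; skip cleanly if Hive is absent.
if command -v schematool >/dev/null 2>&1; then
  schematool -dbType mysql -info || true
  SCHEMA_CHECK="attempted"
else
  SCHEMA_CHECK="skipped (schematool not on PATH)"
fi
echo "schema check: $SCHEMA_CHECK"
```

A successful -info run prints the Hive distribution and metastore schema versions, confirming the tables were created.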