Hadoop Hive org.apache.hadoop.ipc.RpcException RPC response exceeds maximum data length


I am setting up a remote PostgreSQL database as the Hive metastore, using:

Hive 3.1.3, Hadoop 3.3.4, Spark 3.2.4, PostgreSQL 14.9
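
The metastore schema itself was initialized with schematool against PostgreSQL, roughly along these lines (the exact invocation below is just the standard step; the connection details come from hive-site.xml further down):

ivan@ivan-Z590I-AORUS-ULTRA ~ $ $HIVE_HOME/bin/schematool -dbType postgres -initSchema
# creates the metastore tables in the "metastore" PostgreSQL database configured in hive-site.xml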

I can successfully use Hive and HiveServer2, as shown below.

HIVESERVER2

ivan@ivan-Z590I-AORUS-ULTRA ~ $ beeline
Beeline version 2.3.9 by Apache Hive
beeline> !connect jdbc:hive2://localhost:10000
Connecting to jdbc:hive2://localhost:10000
Enter username for jdbc:hive2://localhost:10000: ivan
Enter password for jdbc:hive2://localhost:10000: ****
2023-11-28 08:49:52,557 INFO jdbc.Utils: Supplied authorities: localhost:10000
2023-11-28 08:49:52,557 INFO jdbc.Utils: Resolved authority: localhost:10000
Connected to: Apache Hive (version 3.1.3)
Driver: Hive JDBC (version 2.3.9)
Transaction isolation: TRANSACTION_REPEATABLE_READ
0: jdbc:hive2://localhost:10000> show tables;
INFO  : Compiling command(queryId=ivan_20231128085006_a2c14d58-b983-45cd-9693-09200f796b0c): show tables
INFO  : Concurrency mode is disabled, not creating a lock manager
INFO  : Semantic Analysis Completed (retrial = false)
INFO  : Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:tab_name, type:string, comment:from deserializer)], properties:null)
INFO  : Completed compiling command(queryId=ivan_20231128085006_a2c14d58-b983-45cd-9693-09200f796b0c); Time taken: 0.429 seconds
INFO  : Concurrency mode is disabled, not creating a lock manager
INFO  : Executing command(queryId=ivan_20231128085006_a2c14d58-b983-45cd-9693-09200f796b0c): show tables
INFO  : Starting task [Stage-0:DDL] in serial mode
INFO  : Completed executing command(queryId=ivan_20231128085006_a2c14d58-b983-45cd-9693-09200f796b0c); Time taken: 0.041 seconds
INFO  : OK
INFO  : Concurrency mode is disabled, not creating a lock manager
+-----------+
| tab_name  |
+-----------+
+-----------+
No rows selected (0.623 seconds)

HIVE

hive> SHOW DATABASES;
OK
default
Time taken: 0.016 seconds, Fetched: 1 row(s)

However, I cannot create tables or databases using Hive. I get the same type of error in both cases:

HIVE

hive> CREATE SCHEMA userdb;
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: org.apache.hadoop.ipc.RpcException RPC response exceeds maximum data length)

HIVESERVER2

0: jdbc:hive2://localhost:10000> CREATE SCHEMA userdb;
INFO  : Compiling command(queryId=ivan_20231128085354_033255ed-36fb-4482-ab0f-a4df056bf04c): CREATE SCHEMA userdb
INFO  : Concurrency mode is disabled, not creating a lock manager
INFO  : Semantic Analysis Completed (retrial = false)
INFO  : Returning Hive schema: Schema(fieldSchemas:null, properties:null)
INFO  : Completed compiling command(queryId=ivan_20231128085354_033255ed-36fb-4482-ab0f-a4df056bf04c); Time taken: 0.009 seconds
INFO  : Concurrency mode is disabled, not creating a lock manager
INFO  : Executing command(queryId=ivan_20231128085354_033255ed-36fb-4482-ab0f-a4df056bf04c): CREATE SCHEMA userdb
INFO  : Starting task [Stage-0:DDL] in serial mode
ERROR : FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: org.apache.hadoop.ipc.RpcException RPC response exceeds maximum data length)
INFO  : Completed executing command(queryId=ivan_20231128085354_033255ed-36fb-4482-ab0f-a4df056bf04c); Time taken: 0.028 seconds
INFO  : Concurrency mode is disabled, not creating a lock manager
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: org.apache.hadoop.ipc.RpcException RPC response exceeds maximum data length) (state=08S01,code=1)

This happens with the following configuration:

core-site.xml

    <property>
        <name>fs.default.name</name>
        <value>hdfs://localhost:9000</value>
    </property>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
    <property>
        <name>io.file.buffer.size</name>
        <value>131072</value>
        <description>sandbox Buffer size</description>
    </property>
    <property>
      <name>hadoop.proxyuser.[username].groups</name>
      <value>*</value>
    </property>
    <property>
      <name>hadoop.proxyuser.[username].hosts</name>
      <value>*</value>
    </property>

hdfs-site.xml

<property>
    <name>ipc.maximum.data.length</name>
    <value>134217728</value>
</property>
<property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
</property>
<property>
    <name>dfs.replication</name>
    <value>2</value>
</property>
 
<property>
    <name>dfs.name.dir</name>
    <value>file:///home/ivan/hadoopdata/hdfs/namenode</value>
</property>
 
<property>
    <name>dfs.data.dir</name>
    <value>file:///home/ivan/hadoopdata/hdfs/datanode</value>
</property>
<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>

<property>
  <name>dfs.datanode.use.datanode.hostname</name>
  <value>false</value>
</property>

<property>
  <name>dfs.namenode.datanode.registration.ip-hostname-check</name>
  <value>false</value>
</property>

mapred-site.xml

<property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
</property>

hive-site.xml

<property>
  <name>system:java.io.tmpdir</name>
  <value>/tmp/hive/java</value>
</property>
<property>
  <name>system:user.name</name>
  <value>${user.name}</value>
</property>
<property>
  <name>hive.server2.enable.doAs</name>
  <value>false</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:postgresql://localhost:5432/metastore</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>org.postgresql.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>1234</value>
</property>
<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>hdfs://localhost:9870/user/hive/warehouse</value>
</property>
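
If it helps to narrow things down, I can run checks like the following against HDFS; the two URIs below are just the fs.defaultFS RPC endpoint and the port used in hive.metastore.warehouse.dir above, and I am not sure which of them is the relevant one:

ivan@ivan-Z590I-AORUS-ULTRA ~ $ hdfs getconf -confKey fs.defaultFS
ivan@ivan-Z590I-AORUS-ULTRA ~ $ hdfs dfs -ls hdfs://localhost:9000/user/hive/warehouse
ivan@ivan-Z590I-AORUS-ULTRA ~ $ hdfs dfs -ls hdfs://localhost:9870/user/hive/warehouse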

I would really appreciate any help.
