Importing from Apache Hadoop Hive DDL, getting error: FATAL [MBCM_F0029] Input model is null
Description
When importing from Apache Hadoop Hive DDL, the following error is returned: FATAL [MBCM_F0029] Input model is null
Resolution
I am using the bridge to import Apache Hadoop HiveQL DDL as SQL DDL and I am getting the error:
ERROR: Could not find required directory: <directory path>
FATAL: [MBCM_F0029] Input model is null
Please make sure that you:
1. Follow the instructions to generate the DDL (HQL) using the utility provided at '${MODEL_BRIDGE_HOME}/bin/hive_generate_ddl.sh'. Place this utility on the Hive cluster (it should not matter where). Once there, connect remotely with a bash shell and execute the utility. It will extract the DDL from all the schemas in the cluster into a file named 'tables.hql'.
2. In the Import Wizard, make sure the file path points to a directory (NOT the file). The directory should contain only the tables.hql file. A sketch of both steps is shown below.
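For illustration, here is a minimal sketch of both steps run from a local workstation. The host name 'hive-cluster.example.com', the user 'hadoop', and the local directory '/data/hive_ddl_import' are hypothetical placeholders, and the location where the utility writes tables.hql may differ in your environment, so adjust the paths accordingly.

    # Step 1: copy the utility onto the Hive cluster and execute it there
    scp ${MODEL_BRIDGE_HOME}/bin/hive_generate_ddl.sh hadoop@hive-cluster.example.com:/tmp/
    ssh hadoop@hive-cluster.example.com 'cd /tmp && bash hive_generate_ddl.sh'

    # Step 2: retrieve tables.hql into a local directory that contains nothing else
    # (assumes the utility wrote tables.hql to the directory it was run from)
    mkdir -p /data/hive_ddl_import
    scp hadoop@hive-cluster.example.com:/tmp/tables.hql /data/hive_ddl_import/
    ls /data/hive_ddl_import    # should list only: tables.hql

In the Import Wizard, enter the directory path (here, /data/hive_ddl_import), not the path to tables.hql itself.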