Can variables be passed to a SQL*Loader control file via sqlldr command?


Below is my control file example:

    OPTIONS (skip=1,errors=1000,direct=true,rows=10000)
    load data 
    append
    into table TABLE_NAME
    fields terminated by ','
    OPTIONALLY ENCLOSED BY '"'
    trailing nullcols(
      DATE_ID       DATE_ID_VALUE,
      DESC1         char(1000),
      DESC2         char(1000),
      DISP_URL      char(1000),
      DEST_URL      char(1000),
      ACCT_ID       ACCOUNTID_VALUE,
      Acct_num      ACCOUNT_NUM,
      created_date SYSDATE
    )

I need to pass the DATE_ID_VALUE, ACCOUNTID_VALUE and ACCOUNT_NUM values from the sqlldr command. I am passing the remaining columns' data via a CSV file, which in turn is supplied through the sqlldr DATA parameter. Is there a way to pass the other required values via the sqlldr command, or any other way to do it?

Below is my sqlldr command:

    sqlldr userid=abc/abcdef@abcdefgh CONTROL=cont.ctl DATA=$csvFilePath LOG=admaster.log BAD=admaster.bad

There are 2 answers below.


I know this is old, but I just stumbled on it and I recently answered a similar question. See my reply to the question below for a technique that creates the control file from a wrapper program:

insert timestanp of INFILE into a column from SQLLOADER
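
The idea in that answer is to generate the control file from a script, so the fixed values come from the caller. As a rough sketch of that approach (not the linked answer's exact code; the argument names and the use of SQL*Loader's CONSTANT keyword here are my own illustration), a shell wrapper could look something like this:

    #!/bin/sh
    # Sketch of the wrapper idea (hypothetical argument names):
    # build the control file on the fly so the fixed values come from
    # the calling environment, then run sqlldr against it.
    DATE_ID_VALUE="$1"
    ACCOUNTID_VALUE="$2"
    ACCOUNT_NUM="$3"
    csvFilePath="$4"

    # SQL*Loader's CONSTANT keyword loads a literal value into a column and
    # SYSDATE loads the load timestamp, so only the DESC/URL fields need to
    # be present in the CSV.
    cat > cont.ctl <<EOF
    OPTIONS (skip=1,errors=1000,direct=true,rows=10000)
    load data
    append
    into table TABLE_NAME
    fields terminated by ','
    OPTIONALLY ENCLOSED BY '"'
    trailing nullcols(
      DATE_ID       CONSTANT '$DATE_ID_VALUE',
      DESC1         char(1000),
      DESC2         char(1000),
      DISP_URL      char(1000),
      DEST_URL      char(1000),
      ACCT_ID       CONSTANT '$ACCOUNTID_VALUE',
      Acct_num      CONSTANT '$ACCOUNT_NUM',
      created_date  SYSDATE
    )
    EOF

    sqlldr userid=abc/abcdef@abcdefgh CONTROL=cont.ctl DATA="$csvFilePath" LOG=admaster.log BAD=admaster.bad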


It is not possible to parameterise the variables in the .ctl file, but it is possible to dispense with a .ctl file altogether and parameterise whatever you want.

Rather than calling sqlldr directly, you declare an external table and then run a SQL insert into TABLE_NAME select * from EXTERNAL_TABLE. The external table declaration contains the sqlldr parameters that are used behind the scenes, and it is an ordinary SQL statement that can be run directly from the Unix shell, with all the required parameters supplied as Unix environment variables or command substitutions.

For example, having set the values of $DATE_ID_VALUE and the other two variables in your calling environment, first create an external table:

    echo "create table myschema.temp_table_name (
        DATE_ID       INTEGER,
        DESC1         char(1000),
        DESC2         char(1000),
        DISP_URL      char(1000),
        DEST_URL      char(1000),
        ACCT_ID       INTEGER,
        Acct_num      INTEGER,
        created_date  DATE)
       organization external
         (
         type oracle_loader
         default directory mydir
         access parameters (
           records delimited by newline
           badfile bad_dir: 'temp_ext_temp_table_name_load.bad'
           logfile log_dir: 'temp_ext_temp_table_name_load.log'
           fields terminated by ',' (
              DESC1         char(1000),
              DESC2         char(1000),
              DISP_URL      char(1000),
              DEST_URL      char(1000)
           )
           column transforms (
              DATE_ID       FROM CONSTANT '$DATE_ID_VALUE',
              ACCT_ID       FROM CONSTANT '$ACCOUNTID_VALUE',
              Acct_num      FROM CONSTANT '$ACCOUNT_NUM',
              created_date  FROM CONSTANT \"`date '+%d-%b-%Y'`\"
           )
         )
         location ('temp_table_name.dat')
         )
         reject limit 1000;" | sqlplus -s /

The column transforms clause will populate the external table with the constant values resolved from your environment variables and the Unix date command.
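
For example, the calling shell might set the variables and stage the data file before running the statement above (the values and the directory path here are purely illustrative):

    # Illustrative values -- use whatever the calling process supplies
    export DATE_ID_VALUE=20150101
    export ACCOUNTID_VALUE=12345
    export ACCOUNT_NUM=67890

    # The external table reads temp_table_name.dat from the 'mydir' directory
    # object, so copy the incoming CSV there under that name
    cp "$csvFilePath" /path/to/mydir/temp_table_name.dat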

Then do the insert into the target table (optional append hint for direct path load):

    insert /*+ append */ into table_name
    select * from myschema.temp_table_name;
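
If this step also has to be driven from the Unix shell, it can be piped through sqlplus in the same way as the create statement (a sketch assuming the same OS-authenticated connection; the explicit commit just avoids relying on sqlplus committing on exit):

    echo "insert /*+ append */ into table_name
    select * from myschema.temp_table_name;
    commit;" | sqlplus -s /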

I couldn't find a way to include SYSDATE, so I used the Unix date command equivalent instead.