Loading 5GB of data into Autonomous Database?


I need to load 5 GB of data from object storage into Autonomous Database. Can I create one external table over this 5 GB file and load the data from it, or do I have to split the file into smaller parts first? Is there a maximum size for a file in object storage that can be loaded into Autonomous Database?


On Autonomous Database on Shared Infrastructure, you can create an external table over your 5 GB file (or a much larger one; there is no practical file-size limitation), or you can load it directly into a regular table, in both cases using the DBMS_CLOUD package.
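Either way, the database first needs a credential object to authenticate to your object store. A minimal sketch using DBMS_CLOUD.CREATE_CREDENTIAL, assuming an OCI user with an auth token (the user name and token below are placeholders, not real values):

```sql
BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'DEF_CRED_NAME',
    username        => 'oci_user@example.com',  -- placeholder: your cloud user
    password        => 'your_auth_token'        -- placeholder: an OCI auth token
  );
END;
/
```

The credential_name you choose here is what you pass to the later DBMS_CLOUD calls.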

To create an external table over data in your object store, use the DBMS_CLOUD.CREATE_EXTERNAL_TABLE procedure.

E.g.:

BEGIN
  DBMS_CLOUD.CREATE_EXTERNAL_TABLE(
    table_name      => 'CHANNELS_EXT',
    credential_name => 'DEF_CRED_NAME',
    file_uri_list   => 'https://objectstorage.us-phoenix-1.oraclecloud.com/n/namespace-string/b/bucketname/o/channels.txt',
    format          => json_object('delimiter' value ','),
    column_list     => 'CHANNEL_ID NUMBER, CHANNEL_DESC VARCHAR2(20), CHANNEL_CLASS VARCHAR2(20)'
  );
END;
/
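Before querying a large file at scale, it can be worth checking that it parses cleanly. A sketch using DBMS_CLOUD.VALIDATE_EXTERNAL_TABLE, which scans the source rows and records any rejects in log and bad-file tables:

```sql
BEGIN
  -- scan the source file(s) and report rows that fail to parse
  DBMS_CLOUD.VALIDATE_EXTERNAL_TABLE(
    table_name => 'CHANNELS_EXT'
  );
END;
/

-- the external table can then be queried like any other table
SELECT COUNT(*) FROM channels_ext;
```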

To load the data directly into an existing table in your database, use the DBMS_CLOUD.COPY_DATA procedure.

E.g.:

BEGIN
 DBMS_CLOUD.COPY_DATA(
    table_name =>'CHANNELS',
    credential_name =>'DEF_CRED_NAME',
    file_uri_list =>'https://swiftobjectstorage.us-phoenix-1.oraclecloud.com/v1/idthydc0kinr/mybucket/channels.txt',
    format => json_object('delimiter' value ',')
 );
END;
/
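After a COPY_DATA call, you can check the outcome of the load (status, rows loaded, and the names of the generated log and bad-file tables) in the USER_LOAD_OPERATIONS view. A sketch:

```sql
-- most recent copy operations first
SELECT id, type, status, rows_loaded, logfile_table, badfile_table
FROM   user_load_operations
WHERE  type = 'COPY'
ORDER  BY id DESC;
```

If any rows were rejected, the badfile_table named here contains them, which is useful when deciding whether a 5 GB load completed cleanly.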

For more details on the parameter options and the credentials required for your object store of choice, refer to the DBMS_CLOUD documentation.