I have a data staging table that is populated by procedural code.
```sql
CREATE TABLE BatchRecords (
    BatchRecordID NUMBER PRIMARY KEY,
    BatchID       NUMBER,         -- assigned by the procedural code
    RecordID      NUMBER,         -- relative to the batch
    FieldID       NUMBER,         -- assigned by the procedural code; FK to another table
    Value         VARCHAR2(4000),
    Instance      NUMBER
);
```
Each row in BatchRecords then needs to be transformed into three related tables: Records, RecordFields, and FieldValues.
```sql
CREATE TABLE Records (
    RecordID NUMBER PRIMARY KEY,
    BatchID  NUMBER  -- same as in BatchRecords
);
```
```sql
CREATE TABLE RecordFields (
    RecordFieldID NUMBER PRIMARY KEY,
    RecordID      NUMBER,  -- FK to Records
    FieldID       NUMBER   -- same as in BatchRecords
);
```
```sql
CREATE TABLE FieldValues (
    RecordFieldID NUMBER,          -- FK to RecordFields
    Instance      NUMBER,          -- same as in BatchRecords
    Value         VARCHAR2(4000),  -- same as in BatchRecords
    PRIMARY KEY (RecordFieldID, Instance)
);
```
In SQL Server, I can accomplish this with a `MERGE` statement, using the `OUTPUT` clause to capture the newly inserted keys:
```sql
CREATE TABLE #InsertedRecords (RecordID INT, NewRecordID INT);

MERGE INTO Records
USING (
    SELECT RecordID
    FROM BatchRecords
    WHERE BatchID = @BatchID
    GROUP BY RecordID
) AS BR
ON 1 = 0  -- never matches, so every source row falls into WHEN NOT MATCHED
WHEN NOT MATCHED THEN
    INSERT (BatchID)
    VALUES (@BatchID)
OUTPUT BR.RecordID, INSERTED.RecordID INTO #InsertedRecords;
```
A similar approach is taken to insert RecordFields, and an `INSERT ... SELECT` statement is used to insert into FieldValues.
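For context, that last step might look like the following T-SQL. The `#InsertedRecordFields` temp table and its columns are assumptions standing in for the output of the second `MERGE`:

```sql
-- Assumed to exist: #InsertedRecordFields(NewRecordID INT, FieldID INT, NewRecordFieldID INT),
-- populated by the second MERGE ... OUTPUT, mapping each (new RecordID, FieldID)
-- pair to its newly generated RecordFieldID.
INSERT INTO FieldValues (RecordFieldID, Instance, Value)
SELECT IRF.NewRecordFieldID, BR.Instance, BR.Value
FROM BatchRecords BR
JOIN #InsertedRecords IR
    ON IR.RecordID = BR.RecordID
JOIN #InsertedRecordFields IRF
    ON IRF.NewRecordID = IR.NewRecordID
   AND IRF.FieldID = BR.FieldID
WHERE BR.BatchID = @BatchID;
```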
In Oracle, however, it is not possible to use `RETURNING` (the equivalent of `OUTPUT`) in a `MERGE` statement, or even in an `INSERT ... SELECT` statement.

Is there a way to accomplish the same task in Oracle while also avoiding a `FORALL` loop, since the total number of BatchRecords rows could be upward of several million?
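To make the limitation concrete, here is a sketch of what Oracle does and does not accept (the sequence `records_seq` is an assumption):

```sql
-- Accepted: RETURNING on a single-row INSERT ... VALUES.
DECLARE
    v_record_id Records.RecordID%TYPE;
BEGIN
    INSERT INTO Records (RecordID, BatchID)
    VALUES (records_seq.NEXTVAL, :batch_id)
    RETURNING RecordID INTO v_record_id;
END;
/

-- Rejected: RETURNING is not allowed on an INSERT whose rows come from a
-- subquery, and MERGE has no RETURNING clause at all.
-- INSERT INTO Records (RecordID, BatchID)
-- SELECT records_seq.NEXTVAL, :batch_id
-- FROM (SELECT DISTINCT RecordID FROM BatchRecords WHERE BatchID = :batch_id)
-- RETURNING RecordID BULK COLLECT INTO ...;
```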
You can try using an Oracle pipelined table function to select all the rows from BatchRecords and then pipe each row out for insertion into the different tables. If needed I can show an example.
EDIT

I have shown an example for inserting into Records. You would do the same for the other two tables. Hope this helps!
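Reconstructing the kind of example described above, a minimal sketch for Records might look like this. The type names `record_row_t`/`record_row_tab_t`, the function name `get_batch_records`, and the sequence `records_seq` are all illustrative assumptions:

```sql
-- Object type describing one piped row, including the old-to-new key mapping.
CREATE TYPE record_row_t AS OBJECT (
    OldRecordID NUMBER,  -- RecordID relative to the batch
    NewRecordID NUMBER,  -- key generated for the Records table
    BatchID     NUMBER
);
/
CREATE TYPE record_row_tab_t AS TABLE OF record_row_t;
/

CREATE OR REPLACE FUNCTION get_batch_records (p_batch_id NUMBER)
    RETURN record_row_tab_t PIPELINED
IS
BEGIN
    -- One piped row per distinct RecordID in the batch; rows are produced
    -- incrementally, so millions of source rows are never held in memory at once.
    FOR r IN (SELECT DISTINCT RecordID
              FROM BatchRecords
              WHERE BatchID = p_batch_id)
    LOOP
        PIPE ROW (record_row_t(r.RecordID, records_seq.NEXTVAL, p_batch_id));
    END LOOP;
    RETURN;
END;
/

INSERT INTO Records (RecordID, BatchID)
SELECT NewRecordID, BatchID
FROM TABLE(get_batch_records(:batch_id));
```

Note that each call to the function draws fresh sequence values, so the old-to-new key mapping it pipes out should be staged once (for example in a global temporary table) and then reused for the RecordFields and FieldValues inserts.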