I have a fact table DEVICE with the following structure:
DEVICE_NAME VARCHAR(50)
DEVICE_DATE DATETIME
DEVICE_NUMBER INT
where DEVICE_NAME and DEVICE_DATE together form the PRIMARY KEY.
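In DDL terms the table looks roughly like this (the constraint name and the NOT NULL settings are my assumptions):

    CREATE TABLE dbo.DEVICE (
        DEVICE_NAME   VARCHAR(50) NOT NULL,
        DEVICE_DATE   DATETIME    NOT NULL,
        DEVICE_NUMBER INT         NOT NULL,
        CONSTRAINT PK_DEVICE PRIMARY KEY (DEVICE_NAME, DEVICE_DATE)
    );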
I would like to import a text file with the same information into this table.
My problem is that the text file contains records which will violate my primary key constraint. In that case, I only want to insert the record whose DEVICE_NUMBER is not equal to ZERO, and discard and log the others.
If the records that violate the primary key constraint both have DEVICE_NUMBER not equal to ZERO, discard both and log them. For example, if two rows share the same DEVICE_NAME and DEVICE_DATE, one with DEVICE_NUMBER = 0 and one with DEVICE_NUMBER = 5, only the second should be inserted; if both had non-zero values, neither should be inserted.
Does anyone have a good suggestion on how to do this?
In the Data Flow you can check for all of these situations. Start with a Flat File Source.
To prevent insertion of records with a zero DEVICE_NUMBER, use a Conditional Split transform. For a new output, write an expression that tests for DEVICE_NUMBER == 0. Any rows that match will follow that output. Connect the next component to the default output, thus discarding the matched rows.
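To make the intent of this step concrete, here is a rough set-based T-SQL equivalent; it assumes the file has first been loaded into a staging table I am calling dbo.DEVICE_STAGE (that name is hypothetical, not part of the package):

    -- Keep only rows whose DEVICE_NUMBER is not zero; this mirrors sending
    -- the DEVICE_NUMBER == 0 rows to a dead-end output of the Conditional Split.
    SELECT DEVICE_NAME, DEVICE_DATE, DEVICE_NUMBER
    FROM dbo.DEVICE_STAGE
    WHERE DEVICE_NUMBER <> 0;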
To prevent insertion of rows that already exist in the table, use a Lookup transform against the DEVICE table on DEVICE_NAME and DEVICE_DATE. Set its error configuration to redirect rows on error. This means that when the lookup does not find a match, those "new" rows will flow to the error output, while "matched" rows will flow down the default output. Connect the next component to the error output, again discarding the matched rows.
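Again only as a sketch, the Lookup's "no match" path corresponds to an anti-join against the existing table (using the same assumed dbo.DEVICE_STAGE staging table):

    -- Rows whose key is not already present in DEVICE; these are the "new"
    -- rows that would come out of the Lookup's redirected (error) output.
    SELECT s.DEVICE_NAME, s.DEVICE_DATE, s.DEVICE_NUMBER
    FROM dbo.DEVICE_STAGE AS s
    WHERE NOT EXISTS (
        SELECT 1
        FROM dbo.DEVICE AS d
        WHERE d.DEVICE_NAME = s.DEVICE_NAME
          AND d.DEVICE_DATE = s.DEVICE_DATE
    );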
If you suspect the new rows may themselves contain duplicates within the file, use an Aggregate transform, which can group on the key columns and output unique rows only.
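Putting the pieces together, a minimal set-based sketch of the whole rule set (still assuming the hypothetical dbo.DEVICE_STAGE staging table, and leaving the logging of discarded rows as a separate step) might look like this:

    -- Insert one row per key, but only when exactly one non-zero row exists
    -- for that key and the key is not already in DEVICE.
    INSERT INTO dbo.DEVICE (DEVICE_NAME, DEVICE_DATE, DEVICE_NUMBER)
    SELECT s.DEVICE_NAME, s.DEVICE_DATE, MAX(s.DEVICE_NUMBER)
    FROM dbo.DEVICE_STAGE AS s
    WHERE s.DEVICE_NUMBER <> 0
      AND NOT EXISTS (
          SELECT 1
          FROM dbo.DEVICE AS d
          WHERE d.DEVICE_NAME = s.DEVICE_NAME
            AND d.DEVICE_DATE = s.DEVICE_DATE
      )
    GROUP BY s.DEVICE_NAME, s.DEVICE_DATE
    HAVING COUNT(*) = 1;  -- a key with more than one non-zero row is discarded entirely

In the package itself, the rows that fall out at each step can be logged simply by connecting those outputs to a destination, for example a flat file or an error table, instead of leaving them unconnected.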