The source file's columns keep changing, so the data in the target table gets jumbled. I want to avoid this by building something in Informatica Cloud that checks the column order and column names of the incoming source file against the existing (correct) source file. If the columns match, it should proceed with inserting the data into the target; otherwise it should fail the job. Thanks in advance.
How to compare column names and column order of two files in Informatica Cloud
574 Views, asked by gunjan gupta
There is 1 best solution below.
You can do this in two ways.

Option 1 - Use UNIX

Run head -1 dailyfile > file1 and head -1 modelfile > file2, then compare file1 and file2. If they are identical, the daily file is good; otherwise replace dailyfile with a 0-byte file and inform the support team (or take whatever other action you need).

Option 2 - A little tricky

Create two pipelines - one for the model file, one for the daily file.
Attach an Expression transformation to each and add a Sequence Generator (configured to always start from 1).
Then add a Filter that keeps only the row with NEXTVAL = 1, i.e. the first row of each file.
Compare the two filtered rows using a Joiner - join on all columns, or concatenate them into one long string and compare that single column. If they match, generate flag = 'Pass', else 'NoPass'. Then join SQ_dailyfile with this output, with the join condition on the flag column: 'Pass' = 'Pass' lets the rows through, otherwise 0 rows are produced. You can also call ABORT() when flag = 'NoPass'. The whole mapping should look like the picture below.
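Outside Informatica, the header check both options implement can be sketched as a small shell script. This is a minimal sketch, not a production job: the file names, the sample header, and the notification step are all assumptions for illustration.

```shell
#!/bin/sh
# Sketch of the header check. dailyfile.csv / modelfile.csv are assumed
# names; the demo data below stands in for files arriving from the source.
printf 'id,name,amount\n1,abc,10\n' > dailyfile.csv   # today's file (demo)
printf 'id,name,amount\n'           > modelfile.csv   # known-good model file

head -1 dailyfile.csv > file1   # header row of the daily file
head -1 modelfile.csv > file2   # header row of the model file

if cmp -s file1 file2; then
    flag=Pass        # column names and order match: safe to load
else
    flag=NoPass      # mismatch: truncate the daily file so nothing loads
    : > dailyfile.csv
    # ...and notify the support team here (mail/ticket - site-specific)
fi
echo "$flag" > flag.txt
echo "$flag"
```

The same Pass/NoPass flag is what the Joiner condition in Option 2 keys on; here the mismatch branch instead empties the daily file, as Option 1 suggests.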