Connection Timeout Error while reading the table having more than 100 columns in Mosaic Decisions


I am reading a table via the Snowflake reader node. When the table has a smaller number of columns/attributes (around 50-80), it is read successfully on the Mosaic Decisions canvas. But when the number of attributes increases (approx. 385 columns), the reader node fails. As a workaround I tried adding a WHERE clause with 1=2; in that case it pulls the structure of the table. But when I try to read the records, even with a LIMIT of only 10 records applied to the query, it throws a connection timeout error.
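For reference, the queries I tried look roughly like this (the table name here is just a placeholder, not my actual table):

```sql
-- Structure-only workaround: validates and returns the schema, no rows
SELECT * FROM MY_LARGE_TABLE WHERE 1 = 2;

-- Still times out in the reader node, even with a small limit
SELECT * FROM MY_LARGE_TABLE LIMIT 10;
```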


BEST ANSWER

I faced a similar issue while reading a table with approx. 300 columns, and I managed it with the help of the input parameters available in Mosaic. In your case, you will have to change the copy field variable used in the query to 1=1 at run time.

The following steps can be used to achieve this:

  1. Create a parameter (e.g. copy_variable) with a default value of 2 for the copy field variable.

  2. In the reader node, write the SQL with 1 = $(copy_variable). While validating, this is the same as the 1=2 condition, so it should validate fine.

  3. Once it is validated and the schema is generated, update the default value of $(copy_variable) to 1, so that at run time the condition becomes 1=1 and you still get all the records.
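Putting the steps above together, the reader node query would look something like the sketch below (the table name is a placeholder; $(copy_variable) is the Mosaic input parameter created in step 1):

```sql
-- Reader node SQL using the Mosaic input parameter $(copy_variable).
-- During validation: copy_variable = 2  ->  WHERE 1 = 2 (schema only, no rows, fast)
-- During the run:    copy_variable = 1  ->  WHERE 1 = 1 (all records returned)
SELECT *
FROM MY_LARGE_TABLE
WHERE 1 = $(copy_variable);
```

The trick is that validation never has to fetch any of the wide rows, so it avoids the timeout, while the actual run still reads the full table.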