Connection Timeout Error while reading a table having more than 100 columns in Mosaic Decisions

I am reading a table via the Snowflake reader node. When the table has a small number of columns (around 50-80), it reads fine on the Mosaic Decisions canvas, but when the number of columns increases (approx. 385 columns), the Mosaic reader node fails. As a workaround I tried adding a `WHERE 1=2` clause, and in that case it does pull the structure of the table. But when I try to read the actual records, even with a `LIMIT` of only 10 records applied to the query, it throws a connection timeout error.
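For reference, the two workaround queries described above look roughly like the following sketch (the fully qualified table name is a placeholder):

```sql
-- Pulls only the table structure: the 1=2 predicate matches no rows.
SELECT * FROM MY_DB.MY_SCHEMA.WIDE_TABLE WHERE 1 = 2;

-- Attempts to read just 10 records, but still times out.
SELECT * FROM MY_DB.MY_SCHEMA.WIDE_TABLE LIMIT 10;
```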
I faced a similar issue while reading a table with approx. 300 columns, and I managed it with the help of the input parameters available in Mosaic. In your case, you will have to change the copy field variable used in the query to `1=1` at run time. The steps below can be followed to achieve this (a sketch of the resulting query follows the list):

1. Create a parameter (e.g. `copy_variable`) that will contain the default value `2` for the copy field variable.
2. In the reader node, write the SQL with `1 = $(copy_variable)`. While validating, this is the same as the `1=2` condition, so it should validate fine.
3. Once it is validated and the schema is generated, update the default value of `$(copy_variable)` to `1`, so that the condition behaves like `1=1` at run time and you still get all records.
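To make the mechanics concrete, here is a minimal sketch of the reader-node SQL, assuming a hypothetical table name; `$(copy_variable)` is the Mosaic input parameter created in step 1:

```sql
-- Reader-node SQL sketch; MY_DB.MY_SCHEMA.WIDE_TABLE is a placeholder name.
-- $(copy_variable) is replaced by Mosaic with the parameter's current value.
SELECT *
FROM MY_DB.MY_SCHEMA.WIDE_TABLE
WHERE 1 = $(copy_variable);

-- copy_variable = 2 (validation): WHERE 1 = 2  -> no rows, schema only
-- copy_variable = 1 (run time):   WHERE 1 = 1  -> all rows returned
```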