Connection Timeout Error while reading a table with more than 100 columns in Mosaic Decisions

I am reading a table via the Snowflake reader node. When the table has a relatively small number of columns (around 50-80), it is read successfully on the Mosaic Decisions canvas. But when the number of columns increases (approx. 385 columns), the Mosaic reader node fails. As a workaround I tried adding a WHERE clause with 1=2; in that case it pulls in the structure of the table. But when I try to read the actual records, even with a limit (only 10 records) applied to the query, it throws a connection timeout error.
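For reference, the two queries behave roughly like this (MY_SCHEMA.WIDE_TABLE is a placeholder name, not from the original setup):

```sql
-- Schema-only probe: 1=2 is always false, so no rows are returned,
-- but the reader can still infer the column structure.
SELECT * FROM MY_SCHEMA.WIDE_TABLE WHERE 1 = 2;

-- Reading actual rows times out, even with a small limit applied.
SELECT * FROM MY_SCHEMA.WIDE_TABLE LIMIT 10;
```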
Asked by Abhijeet Vipat
There is 1 best solution below:
I faced a similar issue while reading a table (approx. 300 columns) and managed it with the help of the input parameters available in Mosaic. In your case, you will have to change the copy field variable used in the query to 1=1 at run time. The steps below can be followed to achieve this:

- Create a parameter (e.g. copy_variable) that will contain the default value 2 for the copy field variable.
- In the reader node, write the SQL with 1 = $(copy_variable). While validating, this is the same as the 1=2 condition, so it should validate fine.
- Once the query is validated and the schema is generated, update the default value of $(copy_variable) to 1, so that while running you will still get all records.
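As a minimal sketch of the resulting reader SQL (again with a placeholder table name), the parameterized predicate would look like this:

```sql
-- copy_variable = 2 at validation time: the predicate becomes 1 = 2
-- (always false), so only the schema is fetched and validation passes.
-- copy_variable = 1 at run time: the predicate becomes 1 = 1
-- (always true), so all records are read.
SELECT * FROM MY_SCHEMA.WIDE_TABLE WHERE 1 = $(copy_variable);
```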