In openGauss, gs_restore fails while importing an MOT table of about 1.3 GB. The log shows that the requested 1.3 GB exceeds the configured 1 GB limit. What should I do?
Problem importing an MOT table with gs_restore in openGauss
Asked by Sarah
The error logs on the server side should contain more detail about which memory limit (global or local) was reached; please check them. Also, instead of loading everything in a single transaction, try the --jobs option.
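If the log points at an MOT memory cap, that limit lives in the MOT configuration file (mot.conf), not in postgresql.conf. A minimal sketch of raising it follows; the parameter name and value here are assumptions based on the MOT documentation and should be verified against your openGauss version before use:

```
# mot.conf -- hypothetical fragment, verify parameter names for your version
# Raise the global MOT memory cap above the ~1.3 GB the restore requested:
max_mot_global_memory = 4 GB
```

A server restart is typically required for MOT memory settings to take effect.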
https://opengauss.org/en/docs/3.0.0/docs/Toolreference/gs_restore.html
-j, --jobs=NUM
Specifies the number of concurrent jobs used for the most time-consuming tasks of gs_restore (loading data, creating indexes, or creating constraints). This option can greatly reduce the time needed to import a large database into a server running on a multiprocessor machine.
Each job is one process or one thread, depending on the OS, and uses a separate connection to the server.
The optimal value depends on the server hardware, the client, the network, the number of CPU cores, and the disk setup. A good starting point is the number of CPU cores on the server; larger values can further speed up the import in many cases, but an overly large value degrades performance because of thrashing.
This option supports custom-format archives only, and the input file must be a regular file (not a pipe). It can be ignored when you select the script method rather than connecting to a database server. Multiple jobs cannot be combined with the --single-transaction option.
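Putting the advice above together, the restore invocation would look something like the sketch below. The host, port, database name, and archive file are placeholders, not values from this thread; only the flags come from the gs_restore reference quoted above:

```shell
# Build the restore command; all connection values are placeholders.
# -F c : custom-format archive (required when using --jobs)
# -j 4 : four parallel jobs; the docs suggest matching the server's CPU core count
CMD='gs_restore -h 127.0.0.1 -p 5432 -d mydb -F c -j 4 mot_backup.dmp'
echo "$CMD"   # review the command, then run it against your own server
```

Note that --jobs splits the work across separate connections, so it cannot be combined with --single-transaction.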