What are best practices for handling large data imports (100k+ rows) in NocoBase? Should I chunk uploads or move to API batch import?
- What is the NocoBase version?
- How many fields and relationship fields are there in the data table?
Thanks for the reply!
- NocoBase Version: 1.7 (Docker install)
- Table Structure:
- ~5–10 fields per table
- ~3–5 relationship fields
- Mostly text, number, date fields — no file uploads or complex field types
I expect import volumes to reach 100k+ rows (e.g., inventory history, transactions, order lines), so the 2k-row import limit will not work for me.
Is there an alternative?
Appreciate any best practices you can share.
The “plugin-action-import-pro” plugin supports importing large volumes of data. Please refer to the documentation for details on import performance:
Import Pro - NocoBase
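If you'd rather script the import yourself, chunking the upload through the standard REST API is a workable alternative. Below is a minimal Python sketch; the base URL, the `transactions` collection name, the API token, and the chunk size are all assumptions you'd adapt to your instance, and whether the `:create` endpoint accepts an array payload for true bulk insert should be verified against your NocoBase version:

```python
# Sketch: chunked data import to a NocoBase collection via the REST API.
# Assumptions (adapt to your instance): base URL, collection name
# "transactions", and an API token with write access to that collection.
import json
import urllib.request
from typing import Iterator, List


def chunked(rows: List[dict], size: int = 200) -> Iterator[List[dict]]:
    """Yield successive chunks of at most `size` rows."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]


def import_rows(base_url: str, collection: str, token: str,
                rows: List[dict], chunk_size: int = 200) -> None:
    """POST rows to the standard create endpoint, one chunk at a time.

    Each record is sent individually here; whether `:create` accepts an
    array body for bulk insert should be checked for your version.
    """
    for chunk in chunked(rows, chunk_size):
        for row in chunk:
            req = urllib.request.Request(
                f"{base_url}/api/{collection}:create",
                data=json.dumps(row).encode("utf-8"),
                headers={
                    "Content-Type": "application/json",
                    "Authorization": f"Bearer {token}",
                },
                method="POST",
            )
            urllib.request.urlopen(req)  # raises on HTTP error status
```

Keeping chunks modest (a few hundred rows) limits memory use and makes it easier to retry a failed chunk without restarting the whole 100k-row import.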