Out of Memory error while deleting full-text index Debug-ID: D598359509
Due to many performance problems with a Zotero instance that has a big database, we decided to delete the full-text index (mostly following this discussion: https://forums.zotero.org/discussion/comment/367168/#Comment_367168).
But after a few seconds (less than a minute) we get the following error message:
Error: Error(s) encountered during statement execution: out of memory [QUERY: DELETE FROM fulltextItemWords WHERE itemID NOT IN (SELECT itemID FROM itemAttachments WHERE linkMode = 3)] [PARAMS: ] [ERROR: out of memory]
Finally, we disabled all plugins, restarted Zotero and the computer, and closed all unnecessary programs before retrying the deletion of the index (excluding web links).
Debug-ID: D598359509
Current index statistics
indexed: 80765
partially indexed: 1876
not indexed: 9339
words: 4414058
For the operation above, your database is running up against memory limits in the Windows version of Zotero, which for technical reasons are currently much lower than on macOS or Linux. But let's focus on the actual performance problems you've reported in your other thread.
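Since the single large DELETE with a NOT IN subquery is what exhausts memory, one common workaround (a sketch, not Zotero's own code) is to delete the rows in small batches so no single statement has to materialize the whole result set. The function name and batch size below are illustrative assumptions; the table and column names come from the failing query above.

```python
import sqlite3

def delete_index_in_batches(db_path, batch_size=10000):
    """Delete orphaned full-text index rows in batches.

    Sketch of a batched alternative to the single large
    DELETE FROM fulltextItemWords WHERE itemID NOT IN (...)
    statement. Assumes fulltextItemWords is an ordinary rowid table.
    """
    conn = sqlite3.connect(db_path)
    try:
        while True:
            cur = conn.execute(
                """
                DELETE FROM fulltextItemWords
                WHERE rowid IN (
                    SELECT rowid FROM fulltextItemWords
                    WHERE itemID NOT IN (
                        SELECT itemID FROM itemAttachments WHERE linkMode = 3
                    )
                    LIMIT ?
                )
                """,
                (batch_size,),
            )
            conn.commit()  # release memory and locks between batches
            if cur.rowcount == 0:
                break  # nothing left to delete
    finally:
        conn.close()
```

Each iteration touches at most `batch_size` rows, so peak memory use stays bounded regardless of how many of the ~4.4 million word rows are orphaned. Editing `zotero.sqlite` directly is only safe with Zotero closed and a backup in place.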
But my client normally doesn't use the full-text search anyway, so it's worth a shot. He'll use a second Zotero instance as a copy on another computer, which will be used to create the full-text index and sync it to the cloud. Only his main instance, the one with all the recent problems, will have no full-text index, since working with it has gotten worse and worse.
We worked around the out-of-memory error by rebuilding the index with lower limits for words and pages per file, and finally rebuilding it with 0 for each setting.
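For reference, the two limits mentioned above can also be set directly in Zotero's Config Editor (Advanced preferences) instead of the Search settings pane. The preference names below are assumptions based on Zotero's full-text indexing preferences, so verify them in the Config Editor before changing anything:

```js
// Assumed preference names; check in Zotero's Config Editor first.
// Setting both to 0 effectively disables new full-text indexing.
user_pref("extensions.zotero.fulltext.textMaxLength", 0); // max characters per file
user_pref("extensions.zotero.fulltext.pdfMaxPages", 0);   // max PDF pages per file
```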