How big can a library get?

Realistically, how big can a library get before noticeable issues (e.g., slower performance in search) begin to emerge simply due to the library's size? Here, I am just assuming a library solely populated by references; no PDFs or notes—those are stored elsewhere. My library is currently at 5428 items and I've noticed that Zotero is a bit sluggish on start-up and search seems slower, although still much faster than other reference managers I've used in the past.
  • Depends on lots of factors including your computer as well as your definition of noticeable. As you say, you certainly _notice_ the size of the library at 5k, but it's still pretty quick. My sense is that for most people things get noticeably slower once you go over ~20k, you hit serious performance issues in the 50-100k range, and it becomes difficult to work with Zotero much beyond 100k items.

    Zotero is designed to frontload a lot of its database work on startup, so if possible it's recommended to just have it running.
  • Well, I am running it off spinning rust, so I imagine that even the small dip in performance would become less noticeable if it were launching on an SSD instead. I'm glad to hear that there's plenty of headroom for large libraries though. I can't imagine reaching 20k, but then again I didn't foresee reaching 5k either!
  • Is the best way to deal with large libraries without having to delete things to move some of the older/less relevant/specialized items into another library? Or do you have another suggestion?
  • It really depends. I'd say that for most researchers, a library large enough to actually require worrying about this is unusual, so these are mostly going to be special cases with particular needs.

    Moving projects to groups will work to some extent (currently that still affects the word processor add-on, which searches through all groups), and in some situations different accounts & profiles could be an option.
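  • If you want to check where your own library sits relative to the rough thresholds mentioned above, you can count items directly in Zotero's SQLite database. A minimal sketch, assuming Zotero's schema (the `items` and `deletedItems` table names are assumptions based on the published schema; query a copy of `zotero.sqlite`, since Zotero locks the live file while it is running):

    ```python
    import sqlite3

    def count_items(db_path):
        """Count non-deleted items in a copy of zotero.sqlite.

        Assumes an `items` table keyed by itemID and a `deletedItems`
        table listing trashed items; adjust if your schema differs.
        """
        con = sqlite3.connect(db_path)
        try:
            (n,) = con.execute(
                "SELECT COUNT(*) FROM items "
                "WHERE itemID NOT IN (SELECT itemID FROM deletedItems)"
            ).fetchone()
            return n
        finally:
            con.close()
    ```

    This is read-only and only approximate (item counts include attachments and notes stored as items), but it gives a quick sense of scale.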