Database size

Searching through the documentation and forums, I didn't find any information on how big a database can get before performance drops off significantly.

I discovered Zotero just a week ago, so my library isn't very extensive yet: it contains about 200 items (including attached files) tagged with about 60 tags. Most of them are also sorted into about 20 folders (btw, is there a statistics screen in Zotero showing this info?)

Although the collection isn't very large, the SQLite file is already about 4 MB. That's why I'm a bit worried about how big the file is going to get and what that means for the performance of Zotero and Firefox.

Is there a theoretical maximum size for the database, and what are your experiences with large ones?

Thanks!

rantanplan
  • edited May 11, 2007
    I wouldn't worry about it; SQLite has a reputation of being robust and fast. This page seems to suggest you'll be fine up to 20 or 30 GB.
  • Actual size on disk of the database file shouldn't really affect performance much.

    The bigger issues are in the JavaScript layer, and we're constantly working to improve performance there. We've made some improvements in recent versions that should speed things up (at the cost of a slight delay when first opening the Zotero pane after starting Firefox), though some of these gains were offset by new features. We're hoping to do further optimization to make it usable with much larger libraries.

    As it stands, you shouldn't really see a slowdown up to about 5,000–10,000 items, but it really depends on computer speed and the kinds of items you have in your library. (If you're curious how many items you actually have, see the sketch below.)
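
    On the statistics question: I don't think there's a built-in statistics screen, but the counts can be read straight out of zotero.sqlite. Here is a minimal sketch in Python; the table names (items, tags, collections) are an assumption about the schema, so verify them against your own database first, and make sure Firefox is closed (or work on a copy) so the file isn't locked:

        # Minimal sketch: count items, tags, and collections in zotero.sqlite.
        # The table names below are assumptions about the Zotero schema --
        # check them against your own database before trusting the numbers.
        import sqlite3

        db_path = "/path/to/zotero.sqlite"  # hypothetical path; point this at your Firefox profile
        conn = sqlite3.connect(db_path)

        # Table names come from a fixed tuple, so building the query by
        # string formatting is safe here.
        for table in ("items", "tags", "collections"):
            (count,) = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()
            print(f"{table}: {count}")

        conn.close()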
  • Thanks to both of you for reassuring me!

    I think that will do for me for the time being. It will take me a long time to collect 5,000–10,000 items, and I don't expect to get far beyond that... but who knows?!