Large database?
Apologies if this topic has been discussed already.
My library contains around 3500 items (not including sub-items), and the size of the database is around 1.8 GB.
What is the maximum recommended size of the database?
Should I split the library?
Thanks
A.
Would it then make sense to work with Zotero on a library of around 170'000 items?
Have there been any attempts to use an alternative database engine in place of SQLite, in case that is the bottleneck (e.g. the open-source WakandaDB, http://wakandadb.org)?
Do the server- and client-side databases have the same "size restrictions", or could the server database store more entries than the client database? How would synchronization work in that case?
Regards,
Adrian
Sync will currently also crumble under databases of that size, although that might improve.
I can't speak to the other questions, though I believe the problem isn't just SQLite but also XULRunner's (i.e. Mozilla's) SQLite implementation, and switching database systems would be a huge undertaking, likely beyond what's feasible.
My library has 1161 items, and most of them (~95%) have an attached PDF journal article. Sometimes Zotero takes about 9 seconds to display the items. Is that normal? I remember that when I had fewer than 50 items in the library, Zotero was much faster. The version I am using is 4.0.8.
Regards.
I have concerns similar to those of the previous posters. I have a 250 MB (3500-item) database running on Windows 7, Firefox 32.0.3, Zotero 4.0.22. Almost all records have a PDF attached, and Zotero takes about 15 seconds to load the first time; while it loads, Firefox doesn't respond. Is it because of the size of the database? The number of attached PDFs? Laptop memory/CPU, or some other reason? Or all of the above?
Many thanks
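One way to narrow down questions like the ones above is to check how much of the footprint is the SQLite database file itself versus the attachment files on disk (Zotero stores attachments as ordinary files, not inside the database). Below is a minimal sketch using only the Python standard library; the paths are assumptions and should be pointed at your own Zotero data directory (the `zotero.sqlite` file and the `storage` folder next to it).

```python
# Sketch: compare the size of the SQLite database itself against the
# total size of attachment files. Paths below are examples, not the
# actual location on any given machine.
import os
import sqlite3


def sqlite_size_bytes(db_path: str) -> int:
    """Database size as SQLite itself reports it (pages x page size)."""
    with sqlite3.connect(db_path) as conn:
        (page_count,) = conn.execute("PRAGMA page_count").fetchone()
        (page_size,) = conn.execute("PRAGMA page_size").fetchone()
    return page_count * page_size


def dir_size_bytes(root: str) -> int:
    """Total size of all files under root (e.g. Zotero's 'storage' dir)."""
    total = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            total += os.path.getsize(os.path.join(dirpath, name))
    return total
```

If `dir_size_bytes` on the storage folder dominates, the slowness is unlikely to come from the database size alone; a `zotero.sqlite` that is itself hundreds of megabytes points more toward the database (full-text index, etc.).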