I notice when I duplicate a record in my database that if it's a recent addition (i.e., created in the last 100 records I've added), the duplicate is made almost immediately. (This is on my Dual 867 MHz PPC G4, running OS X 10.4.10. I'm running BE with a 16 MB internal cache for 6500 records.)
If, on the other hand, I duplicate a record that was originally created long ago -- say, when I first converted my EndNote database over to Bookends a couple of years ago -- the duplicate will not be made for as much as 45 seconds. During this time the beachball spins and Bookends seems not to be doing anything. (Activity Monitor will even report that Bookends is "Not Responding.") Rebuilding the database index doesn't seem to improve the delay.
This is often annoying, but something I've become accustomed to. I assume that this difference in the time required to duplicate a record is related to internal database maintenance. Am I correct in that assumption? Is there any chance that future versions of Bookends will handle this task more gracefully?
TH
Technical question: duplicating records & time required
Hi,
I'm guessing that this is because Bookends tries to make the duplicate's unique id the original's id + 1. For references made with Bookends this is easy and instantaneous. If you imported from EN, which uses sequential unique ids, Bookends has to try each unique id in sequence until it finds an "empty" one. So if you have 1000 references with unique ids of 1-1000, and you duplicate reference 1, it will try 1000 times until it finally settles on an id of 1001.
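In pseudocode, the search is essentially this (a simplified sketch, not the actual Bookends code; `used_ids` stands in for the internal id index):

```python
# Simplified sketch of the current duplicate-id search (not actual
# Bookends code). `used_ids` stands in for the internal id index.
def next_free_id(original_id, used_ids):
    candidate = original_id + 1
    while candidate in used_ids:  # one probe per occupied id
        candidate += 1
    return candidate

# With EndNote-style sequential ids 1-1000, duplicating reference 1
# probes ids 2 through 1000 before settling on 1001.
print(next_free_id(1, set(range(1, 1001))))  # -> 1001
```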
Jon
Sonny Software
Duplicating records & time required
Ah. That makes perfect sense. Sort of a problem, however, if you want to convey a good impression to users who have just migrated from EndNote. I'm not sure if this would add more complexity than is wise, but might I suggest a tweak to the algorithm used to assign ids in duplicated records?

Jon wrote:
If you imported from EN, which uses sequential unique ids, Bookends has to try each unique id in sequence until it finds an "empty" one. So if you have 1000 references with unique ids of 1-1000, and you duplicate reference 1, it will try 1000 times until it finally settles on an id of 1001.
- when a new record is to be duplicated, start with the id of the original and count up by units of 1, looking for the next free slot (as you do now)
- if no slot is found after a cycle of, say, 100 increments, accelerate the process by changing the increment unit to 10 (or even 100)
If I understand correctly how this works, that should speed things up quite a bit.
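To make the suggestion concrete, here's a rough sketch of the two-step search (hypothetical, of course; `used_ids` again stands in for whatever lookup Bookends actually does):

```python
# Hypothetical sketch of the proposed tweak: probe by 1 at first, then
# widen the stride after each burst of misses (1 -> 10 -> 100 ...).
def next_free_id_accel(original_id, used_ids, burst=100):
    candidate, step, misses = original_id + 1, 1, 0
    while candidate in used_ids:
        misses += 1
        if misses % burst == 0:  # a full burst of misses: accelerate
            step *= 10
        candidate += step
    return candidate

# Same scenario as before: ids 1-1000 in use, duplicating reference 1.
# The accelerated search escapes the occupied range in about 190 probes
# instead of roughly 1000.
print(next_free_id_accel(1, set(range(1, 1001))))  # -> 1001
```

One caveat: once the stride widens, the search finds a free id quickly but not necessarily the lowest one, since it can jump over a small gap in the sequence. That seems like a fair trade for the speedup, given that the ids only need to be unique.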