Too much time in import #11
Comments
Any update on this?
I looked into this and see that I did not use the bulk_import method, so documents are inserted one by one instead of by the hundreds at a time. I will fix this over the weekend and push an update to GitHub and PyPI.
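For readers following along, here is a minimal sketch of what batching looks like against Elasticsearch's `_bulk` endpoint. ESClient's own bulk method may differ in name and arguments, and the URL, index/type names, and batch size below are assumptions, not the library's actual implementation.

```python
# Sketch only: batched indexing via Elasticsearch's _bulk endpoint using the
# requests library. The URL, index/type names and batch size are assumptions;
# ESClient's own bulk method may have a different name and signature.
import json
import requests

ES_URL = "http://localhost:9200"   # assumed local node
BATCH_SIZE = 500                   # flush every few hundred documents

def flush(lines):
    # The bulk body is newline-delimited JSON and must end with a newline.
    body = "\n".join(lines) + "\n"
    resp = requests.post(
        ES_URL + "/_bulk",
        data=body,
        headers={"Content-Type": "application/x-ndjson"},
    )
    resp.raise_for_status()

def bulk_index(index, doctype, docs, batch_size=BATCH_SIZE):
    buffer = []
    for doc in docs:
        # Each document is an action line followed by its source line.
        buffer.append(json.dumps({"index": {"_index": index, "_type": doctype}}))
        buffer.append(json.dumps(doc))
        if len(buffer) >= 2 * batch_size:   # two lines per document
            flush(buffer)
            buffer = []
    if buffer:
        flush(buffer)
```

Calling `bulk_index("myindex", "mytype", docs)` would then issue one HTTP request per 500 documents instead of one request per document, which is where most of the speed-up comes from.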
Thanks a lot.
Sorry for breaking a promise, but the issue is more difficult than I first estimated. To make this work correctly with parent/child documents, the easy fix I had planned is not enough.
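For context on why parent/child makes this harder: each child document's bulk action has to carry its parent id so the child is routed to the same shard as its parent. The snippet below is only an assumed illustration of the bulk action metadata from that era of Elasticsearch, not ESClient's internals; the index, type, ids, and field values are made up.

```python
# Sketch only: in the old bulk format, a child document's action metadata
# carries the parent id so the child lands on the parent's shard.
# Index, type, ids and the comment example are illustrative assumptions.
import json

action = {"index": {"_index": "myindex",
                    "_type": "comment",   # child type
                    "_id": "42",
                    "_parent": "7"}}      # id of the assumed parent document
source = {"text": "a child document"}

bulk_lines = [json.dumps(action), json.dumps(source)]
```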
OK, no problem. Thanks for the efforts.
Hi,
First of all, good work: this plugin really works.
The only issue I see is that the import takes a long time. I had an index of 273,781 documents, each around 1 KB in size, and importing it took around 20-30 minutes. The whole index is around 50 MB.
We are planning to back up 50 GB or 100 GB of data, and based on the above it would take days for the import to complete.
Any hints or ideas to resolve this would be appreciated.
thanks
pranav.