Parallel indexing? #12

Open
gsuberland opened this issue Nov 19, 2021 · 1 comment
Comments

gsuberland commented Nov 19, 2021

Would it be possible to parallelise the indexing process, or at least parts of it, to improve the overall speed?

Running this over a 6.4GB repository with 275,000 files in it, on Windows, the process is bottlenecked on neither CPU nor disk IO, but indexing takes over an hour. Running two index commands on two repos on the same NVMe SSD in parallel results in around 20% disk IO utilisation and barely taxes one core. Memory usage is only around 400MB per process.

I suspect that sequentially opening each file, reading and processing its contents, storing the results, and then moving on to the next file causes a heavy throughput limitation when there are many thousands of small files.
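Conceptually, what I'm imagining is a bounded pool of reader goroutines feeding a single goroutine that owns the index, so the index writer itself never has to be thread-safe. A rough, untested sketch of that shape (where `indexFile` is just a placeholder for whatever per-file work the indexer actually does):

```go
// Sketch only: a bounded worker pool reads files concurrently and hands
// the contents to a single goroutine that owns the index.
package main

import (
	"fmt"
	"io/fs"
	"os"
	"path/filepath"
	"sync"
)

// indexFile is a hypothetical stand-in for the real per-file indexing work
// (trigram extraction, posting-list updates, and so on).
func indexFile(path string, data []byte) {
	fmt.Printf("indexed %s (%d bytes)\n", path, len(data))
}

type fileData struct {
	path string
	data []byte
}

func main() {
	root := "."
	if len(os.Args) > 1 {
		root = os.Args[1]
	}

	paths := make(chan string, 256)
	results := make(chan fileData, 256)

	// A handful of readers: enough to keep an NVMe SSD busy on small files.
	var readers sync.WaitGroup
	for i := 0; i < 8; i++ {
		readers.Add(1)
		go func() {
			defer readers.Done()
			for p := range paths {
				data, err := os.ReadFile(p)
				if err != nil {
					continue // skip unreadable files in this sketch
				}
				results <- fileData{p, data}
			}
		}()
	}

	// Close the results channel once every reader has finished.
	go func() {
		readers.Wait()
		close(results)
	}()

	// Walk the tree and feed paths to the readers.
	go func() {
		filepath.WalkDir(root, func(p string, d fs.DirEntry, err error) error {
			if err == nil && !d.IsDir() {
				paths <- p
			}
			return nil
		})
		close(paths)
	}()

	// Single consumer: the only goroutine that ever touches the index.
	for f := range results {
		indexFile(f.path, f.data)
	}
}
```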

I don't know enough Go to implement this properly myself, unfortunately. Is this something you could potentially investigate?

makuto commented Jun 4, 2022

I was thinking about this a bit. One potential solution without much code change is to make it embarrassingly parallel, i.e. run many instances of the cindex executable on subsets of the repo (and with separate output indexes). It's a bit gross, but it would be a way to do it without touching Go.
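A small driver for that could just be a shell loop; purely as a sketch (written in Go here for concreteness, and assuming this cindex honours the CSEARCHINDEX environment variable for the output index path, like google/codesearch's cindex does), it might look like:

```go
// Sketch of a driver that runs one cindex per top-level subdirectory,
// each writing to its own index file via CSEARCHINDEX (assumed to be
// honoured by this cindex, as in google/codesearch).
package main

import (
	"fmt"
	"os"
	"os/exec"
	"path/filepath"
	"sync"
)

func main() {
	root := "."
	if len(os.Args) > 1 {
		root = os.Args[1]
	}

	entries, err := os.ReadDir(root)
	if err != nil {
		panic(err)
	}

	var wg sync.WaitGroup
	for _, e := range entries {
		if !e.IsDir() {
			continue
		}
		dir := filepath.Join(root, e.Name())
		wg.Add(1)
		go func(dir string) {
			defer wg.Done()
			cmd := exec.Command("cindex", dir)
			// Give each instance its own output index file.
			cmd.Env = append(os.Environ(), "CSEARCHINDEX="+dir+".csearchindex")
			if out, err := cmd.CombinedOutput(); err != nil {
				fmt.Fprintf(os.Stderr, "%s: %v\n%s", dir, err, out)
			}
		}(dir)
	}
	wg.Wait()
}
```

The catch, as noted, is that the results land in separate index files, so searches would either need to run against each index in turn, or the indexes would need merging afterwards if the format supports that.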
