
When the data is large (over 100 MB), the Brown-cluster program gets killed. How can I fix this error? #12

Open
luongtieumy opened this issue Nov 7, 2014 · 2 comments

Comments

@luongtieumy

No description provided.

@ajaech (Contributor) commented Nov 7, 2014

The size of the input data doesn't matter as much as the size of the vocabulary. How big is the vocabulary you are dealing with?
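A quick way to check this is to count the distinct tokens in the corpus. Below is a minimal sketch, assuming a whitespace-tokenized plain-text corpus with one sentence per line (the input format brown-cluster expects); `corpus.txt` is a placeholder path.

```python
from collections import Counter

# Count distinct whitespace-separated tokens (the vocabulary) in the corpus.
counts = Counter()
with open("corpus.txt", encoding="utf-8") as f:
    for line in f:
        counts.update(line.split())

print(f"vocabulary size: {len(counts):,}")
print("10 most frequent tokens:", counts.most_common(10))
```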

@luongtieumy (Author)

My data has a vocabulary of about 80,000 words. It's working now; I think I didn't have enough memory to run the program before.
Is there a limit on how big the vocabulary can be for this program?
Thank you so much!
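If memory is still tight, one common workaround (not a feature of brown-cluster itself) is to prune the vocabulary before clustering by mapping rare tokens to a single placeholder. Here is a hedged sketch; `MIN_COUNT` and the `<unk>` symbol are illustrative choices, and `corpus.txt` / `corpus.pruned.txt` are placeholder paths.

```python
from collections import Counter

MIN_COUNT = 5  # illustrative threshold: tokens seen fewer times are replaced

# First pass: count token frequencies across the corpus.
counts = Counter()
with open("corpus.txt", encoding="utf-8") as f:
    for line in f:
        counts.update(line.split())

# Second pass: rewrite the corpus with rare tokens collapsed to <unk>,
# shrinking the vocabulary the clustering algorithm must hold in memory.
with open("corpus.txt", encoding="utf-8") as src, \
     open("corpus.pruned.txt", "w", encoding="utf-8") as dst:
    for line in src:
        tokens = [t if counts[t] >= MIN_COUNT else "<unk>" for t in line.split()]
        dst.write(" ".join(tokens) + "\n")
```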
