
RLE in trip2seq #32

Open
jlliRUC opened this issue Mar 7, 2023 · 0 comments

Hi boathit,

Thanks for this inspirational work!

I noticed that you adopt RLE (Run-Length Encoding) instead of the full series of tokens as the input sequence for each trajectory, which isn't mentioned in your paper. As far as I know, this is not a standard way to compress trajectory data: if you keep only the deduplicated values without the counts, the original trajectory cannot be recovered, so this compression may cause some information loss during training. I compared model_rle with model_full, and model_full surprisingly outperforms model_rle on all three tasks, especially self-similarity : )
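To make the distinction concrete, here is a minimal sketch (not from the repo; the function names are my own) contrasting lossless RLE, which keeps (token, count) pairs, with the counts-free variant that only collapses consecutive duplicate cell tokens:

```python
# Hypothetical illustration of the two preprocessing choices discussed above.
# rle_full is lossless; rle_tokens_only (the counts-free variant) discards
# the run lengths, so the original trajectory cannot be reconstructed.
from itertools import groupby

def rle_full(tokens):
    """Lossless run-length encoding: (token, run length) pairs."""
    return [(t, len(list(g))) for t, g in groupby(tokens)]

def rle_tokens_only(tokens):
    """Counts-free variant: collapse consecutive duplicate tokens."""
    return [t for t, _ in groupby(tokens)]

cells = [5, 5, 5, 8, 8, 2, 2, 2, 2]
print(rle_full(cells))         # [(5, 3), (8, 2), (2, 4)]
print(rle_tokens_only(cells))  # [5, 8, 2] -- run lengths are lost
```

From `rle_tokens_only` alone there is no way to tell how long the trajectory dwelled in each cell, which is the information loss described above.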

In addition, applying RLE to the cell tokens reduces the trajectory length by a large margin. When I tried to apply t2vec to other datasets with longer trajectories, using the full trajectories instead of RLE, I could not handle trajectories longer than 400 points with the default model parameters (on a GPU server with 24 GB of graphics memory).

I hope this repo is still active : ) Thanks in advance.
