@nv-guomingz hello there, thanks for the quick reply! 😀
No, I don't see anything in the docs specifically calling out supported models. I was under the impression that the models built and published in the HF repos for EAGLE should work here for a quick demo/check.
Per the docs:
Limitations
EAGLE-2 is not supported.
All EAGLE choices have to have exactly the same depth as num_eagle_layers of the engine.
Pipeline parallelism is not supported.
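As a side note, here is a minimal sketch of what that depth constraint means, assuming the Medusa-style choice-tree format used in the examples (the variable names and values below are illustrative, not taken from the repo):

```python
# Illustrative only: assumes EAGLE choices are lists of per-level branch
# indices, as in the Medusa-style trees used by the speculative decoding
# examples. The constraint above requires every path to be exactly
# num_eagle_layers deep.
NUM_EAGLE_LAYERS = 4  # hypothetical engine setting

# Valid: every choice path has depth NUM_EAGLE_LAYERS.
valid_choices = [
    [0, 0, 0, 0],
    [0, 0, 0, 1],
    [0, 1, 0, 0],
    [1, 0, 0, 0],
]

# Invalid per the limitation above: one path is shorter than num_eagle_layers.
invalid_choices = [
    [0, 0, 0, 0],
    [0, 1],  # depth 2 != 4 -> would be rejected
]

def check_depth(choices, num_eagle_layers):
    """Return True if every choice path matches the engine's num_eagle_layers."""
    return all(len(path) == num_eagle_layers for path in choices)

assert check_depth(valid_choices, NUM_EAGLE_LAYERS)
assert not check_depth(invalid_choices, NUM_EAGLE_LAYERS)
```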
Hi @JoJoLev, vicuna-7b-v1.3 is a verified example model for the EAGLE feature; I think the team hasn't verified the models you mentioned above.
So the failure is quite possible, but I consider it a corner case.
Let me give it a try and get back to you ASAP.
I went through the README for Llama 3 8B Instruct in examples/eagle. When running the convert checkpoint script, I get the error: KeyError: 'fc.bias'.
I used Llama 3 8B Instruct from Hugging Face, with the EAGLE version of the model for the EAGLE model directory.
Not really sure what the issue is. Maybe I need to recreate the EAGLE model?
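One quick check I can do before recreating anything is to look at which keys the EAGLE draft checkpoint actually contains. A rough sketch, assuming the draft model was downloaded locally and ships as either a pytorch_model.bin or a safetensors file (paths and file names are assumptions on my side):

```python
# Diagnostic sketch (not from the repo): list the fc.* keys in the EAGLE draft
# checkpoint to see whether 'fc.bias' is present at all.
import os
import torch

eagle_dir = "./EAGLE-LLaMA3-Instruct-8B"  # hypothetical local path to the draft model
ckpt_path = os.path.join(eagle_dir, "pytorch_model.bin")

if os.path.exists(ckpt_path):
    state_dict = torch.load(ckpt_path, map_location="cpu")
else:
    # Some checkpoints ship as safetensors instead of a .bin file.
    from safetensors.torch import load_file
    state_dict = load_file(os.path.join(eagle_dir, "model.safetensors"))

print("has fc.bias:", "fc.bias" in state_dict)
print([k for k in state_dict if k.startswith("fc")])
```

If the key really is missing from the checkpoint, that would point at a mismatch between the draft model I downloaded and what the conversion script expects, rather than a problem with the base Llama 3 8B Instruct weights.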