[sharktank] Llama 3.1 f16 HF import presets #2290
```diff
@@ -15,6 +15,7 @@
 from pathlib import Path
 from sharktank.models.llm.llm import PagedLlmModelV1
 from sharktank.models.llama.toy_llama import generate
+from sharktank.utils import chdir
 from sharktank.utils.export_artifacts import IreeCompileException
 from sharktank.utils.testing import (
     is_mi300x,
```
```diff
@@ -210,3 +211,21 @@ def test_import_llama3_8B_instruct(tmp_path: Path):
         ]
     )
     assert irpa_path.exists()
+
+
+@pytest.mark.expensive
+def test_import_llama3_8B_instruct_from_preset(tmp_path: Path):
+    from sharktank.tools.import_hf_dataset_from_hub import main
+
+    irpa_path = tmp_path / "llama3.1/8b/instruct/f16/model.irpa"
+    tokenizer_path = tmp_path / "llama3.1/8b/instruct/f16/tokenizer.json"
+    tokenizer_config_path = tmp_path / "llama3.1/8b/instruct/f16/tokenizer_config.json"
+    with chdir(tmp_path):
+        main(
+            [
+                "--preset=meta_llama3_1_8b_instruct_f16",
+            ]
+        )
+    assert irpa_path.exists()
```
We should verify these files in some form instead of just checking that they exist, in case of a bad download or something. Maybe keep an md5sum that we can compare against?

It is possible that it fails silently in such a way. It would be pretty sad if the HF hub package failed there, since robust downloading is presumably one of its major goals.

This works for now. But as Ian pointed out, if there are more reliable ways to verify that the IRPA generation was complete and successful, that would be great.

What I think should ultimately happen is to run a model-import job before the CI test jobs. Another option is a nightly job that imports from HF and then uploads to Azure, so that other runners can update their model cache. The downside is that we might overwrite existing model files with faulty ones if a bug appears; in that scenario, a more thorough model validation would be needed before uploading.
```diff
+    assert tokenizer_path.exists()
+    assert tokenizer_config_path.exists()
```
I would hesitate to add this flag, as we currently cannot directly consume it: sharktank expects naming conventions from the GGUF format, not Hugging Face.
Where are we supposed to consume it in GGUF format? The converter does on-the-fly conversion to our format, which is derived from GGUF. We save only in IRPA; we can't save in GGUF, we can only read it as a `sharktank.types.Dataset`.