14 changes: 11 additions & 3 deletions sharktank/sharktank/layers/configs/llm_configs.py
@@ -51,9 +51,17 @@ def is_hugging_face_llama3_config(hf_config: dict[str, Any]) -> bool:

@dataclass
class LlamaHParams:
"""Corresponds 1:1 with the 'LLM' section of the GGUF docs.

Comments are only provided if they differ from this source.
"""This was originally designed to correspond 1:1 with the 'LLM'
section of the GGUF docs. It has experienced some semantic drift,
and now additionally collects additional model parameters needed
for model export. Currently there are some "subsections" that
refer to a specific model, but this is not ideal due to the
iterative nature of model arch design, so don't follow this
trend when adding more Params.

See the optional_configs under the get_custom_configs function
for an example of how to add support for custom Params without
basing conditionals on a name prefix.
Contributor:
I am a bit confused, as get_custom_configs would conditionally populate fields based on the model name prefix.

I think the main point is to have fields in the structure that have general/abstract names such that they survive and can be reused for new models.

Member Author:
Yeah, what I was trying to get at is that the optional_keys under that function are agnostic to the name prefix provided. I'll make that clearer.
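A minimal sketch of that prefix-agnostic idea (hypothetical field names and a simplified helper, not the actual sharktank `get_custom_configs` API): optional fields get general names on the dataclass, and population is driven by which keys appear in the raw config, with no conditionals on a model-name prefix.

```python
from dataclasses import dataclass, fields
from typing import Any, Optional


@dataclass
class HParams:
    # Core fields with general/abstract names so they can be reused
    # across model architectures.
    context_length: int = 4096
    # Optional fields: populated only when present in the raw config.
    # Note there is no branching on a model-name prefix anywhere.
    rope_scaling_factor: Optional[float] = None
    sliding_window: Optional[int] = None


def from_raw_config(raw: dict[str, Any]) -> HParams:
    """Copy any raw key whose name matches a declared field,
    regardless of which model produced the config."""
    known = {f.name for f in fields(HParams)}
    return HParams(**{k: v for k, v in raw.items() if k in known})
```

New optional params are then supported by adding one field to the dataclass; unknown model-specific keys are simply ignored rather than special-cased.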

"""

# Attention config