
KeyError: hook_point_in #87

Open
BoyuanJackChen opened this issue Feb 8, 2025 · 1 comment
BoyuanJackChen commented Feb 8, 2025

I was following ./examples/loading_llamascope_saes.ipynb and received the following error on the line below. Please advise on how to fix it. Thanks!

sae = SparseAutoEncoder.from_pretrained("fnlp/Llama3_1-8B-Base-L15R-8x")

Local path `fnlp/Llama3_1-8B-Base-L15R-8x` not found. Downloading from huggingface model hub.
Downloading Llama Scope SAEs.
Fetching 3 files: 100%|██████████| 3/3 [00:00<00:00, 37900.34it/s]
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
Cell In[4], line 1
----> 1 sae = SparseAutoEncoder.from_pretrained("fnlp/Llama3_1-8B-Base-L15R-8x")

File ~/Desktop/Language-Model-SAEs/src/lm_saes/sae.py:585, in SparseAutoEncoder.from_pretrained(cls, pretrained_name_or_path, strict_loading, **kwargs)
    583 @classmethod
    584 def from_pretrained(cls, pretrained_name_or_path: str, strict_loading: bool = True, **kwargs):
--> 585     cfg = SAEConfig.from_pretrained(pretrained_name_or_path, strict_loading=strict_loading, **kwargs)
    586     return cls.from_config(cfg)

File ~/Desktop/Language-Model-SAEs/src/lm_saes/config.py:84, in BaseSAEConfig.from_pretrained(cls, pretrained_name_or_path, strict_loading, **kwargs)
     82 sae_config["sae_pretrained_name_or_path"] = pretrained_name_or_path
     83 sae_config["strict_loading"] = strict_loading
---> 84 return cls.model_validate({**sae_config, **kwargs})

    [... skipping hidden 1 frame]

File ~/Desktop/Language-Model-SAEs/src/lm_saes/config.py:53, in BaseSAEConfig.<lambda>(validated_model)
     51 sae_type: Literal["sae", "crosscoder", "mixcoder"]
     52 hook_point_in: str
---> 53 hook_point_out: str = Field(default_factory=lambda validated_model: validated_model["hook_point_in"])
     54 d_model: int
     55 expansion_factor: int

KeyError: 'hook_point_in'
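
For anyone debugging this: the traceback shows the failure comes from the hook_point_out default factory, which reads hook_point_in from the validated data, so the error means the downloaded config is missing that key. Below is a minimal sketch of the failing pattern (not the repo's actual code), assuming pydantic >= 2.10, where default_factory may take the validated data as an argument; the hook-point value is a placeholder:

from pydantic import BaseModel, Field

class MiniSAEConfig(BaseModel):
    hook_point_in: str
    # The factory receives the already-validated data (pydantic >= 2.10),
    # so it raises KeyError if "hook_point_in" is absent from the input dict.
    hook_point_out: str = Field(default_factory=lambda data: data["hook_point_in"])

MiniSAEConfig.model_validate({"hook_point_in": "blocks.15.hook_resid_post"})  # ok (placeholder value)
MiniSAEConfig.model_validate({})  # KeyError: 'hook_point_in' -- same failure as above
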
Hzfinfdu (Member) commented

Hi! Our codebase is constantly evolving, and we've realized we aren't able to maintain a stable version for external use XD. Apologies for this! However, you can use Llama Scope with SAELens. For example,

from sae_lens import SAE

device = "cuda"  # or "cpu"; the original snippet assumed `device` was already defined

sae, cfg_dict, sparsity = SAE.from_pretrained(
    release="llama_scope_lxr_32x",  # see other options in sae_lens/pretrained_saes.yaml
    sae_id="l0r_32x",  # won't always be a hook point
    device=device,
)

should work.
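
If it helps, here is a rough usage sketch for the loaded SAE. This is not from the original reply: the model name, prompt, and hook point are placeholder assumptions ("l0r" suggests the layer-0 residual stream, i.e. blocks.0.hook_resid_post), and it assumes SAELens's sae.encode / sae.decode API and TransformerLens support for Llama 3.1:

import torch
from transformer_lens import HookedTransformer

# Placeholder model name and prompt; verify the hook point against the SAE's config.
model = HookedTransformer.from_pretrained("meta-llama/Llama-3.1-8B", device=device)

with torch.no_grad():
    _, cache = model.run_with_cache("The quick brown fox")
    acts = cache["blocks.0.hook_resid_post"]  # activations at the assumed hook point
    feats = sae.encode(acts)                  # sparse feature activations
    recon = sae.decode(feats)                 # reconstructed activations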

Please reach out if you have any further questions :)
