lm_memorization_data

Data for "Quantifying Memorization Across Neural Language Models".

This repository provides the prefixes and model continuations that we used in our analysis of memorization in large language models.

Tip: The data can be downloaded from here.

Because obtaining the prefixes would otherwise require downloading the entire 800 GB Pile dataset, this repository contains the extracted data (570 MB), as described here.
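Once the prefix and continuation data are available, the core check in a memorization analysis is whether a model, given a prefix, reproduces the ground-truth continuation verbatim. The sketch below illustrates that exact-match check on toy token-id sequences; it is not the authors' evaluation code, and the function name and data are illustrative assumptions.

```python
# Hypothetical sketch of an exact-match memorization check.
# Not the authors' code; names and token ids are illustrative.

def is_memorized(true_continuation, model_continuation):
    """Return True if the model reproduced the ground-truth
    continuation token-for-token."""
    return list(true_continuation) == list(model_continuation)

# Toy token-id sequences standing in for real tokenized text.
truth = [101, 7, 42, 9]
print(is_memorized(truth, [101, 7, 42, 9]))  # verbatim reproduction
print(is_memorized(truth, [101, 7, 42, 8]))  # diverges on the last token
```

In practice one would apply a check like this over every (prefix, continuation) pair in the dataset and aggregate the fraction memorized per model and prefix length.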

Citation

@article{lm-memorization,
  title={Quantifying Memorization Across Neural Language Models},
  author={Carlini, Nicholas and Ippolito, Daphne and Jagielski, Matthew and Lee, Katherine and Tram\`er, Florian and Zhang, Chiyuan},
  journal={arXiv:2202.07646},
  url={https://arxiv.org/abs/2202.07646},
  year={2022}
}