Not able to generate synthetic data after model fitting #45

Open · MedhaviShruti opened this issue Jan 17, 2024 · 5 comments

Labels: bug (Something isn't working), help wanted (Extra attention is needed)

Comments

@MedhaviShruti

I have tabular data with 11 rows and 25 columns. I trained a model with the following command:

model = GReaT(llm='distilgpt2', batch_size=32, epochs=25)

After fitting, I tried to generate synthetic data for this table, but it fails with the error below:

An error has occurred: Breaking the generation loop!
To address this issue, consider fine-tuning the GReaT model for a longer period. This can be achieved by increasing the number of epochs.
Alternatively, you might consider increasing the max_length parameter within the sample function. For example: model.sample(n_samples=10, max_length=2000)

I also tried model = GReaT(llm='distilgpt2', batch_size=25, epochs=100), but I get the same error.

Please let me know how the sampling command should be given for successful generation.
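As a minimal sketch of what the error message suggests, assuming sample() raises an exception when the generation loop breaks (worth checking in the installed be_great version); the toy DataFrame, the epoch and batch-size values, and the max_length schedule are purely illustrative:

import pandas as pd
from be_great import GReaT

# Illustrative stand-in for the 11-row, 25-column table described above.
df = pd.DataFrame({f"col_{i}": list(range(11)) for i in range(25)})

model = GReaT(llm='distilgpt2', batch_size=8, epochs=100)
model.fit(df)

# Retry sampling with progressively larger max_length budgets (values are illustrative).
synthetic = None
for max_len in (500, 1000, 2000):
    try:
        synthetic = model.sample(n_samples=10, max_length=max_len)
        break  # sampling succeeded
    except Exception as err:  # "Breaking the generation loop!" surfaces here
        print(f"Sampling failed with max_length={max_len}: {err}")

if synthetic is not None:
    print(synthetic.head())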

@unnir
Collaborator

unnir commented Apr 19, 2024

I suggest training it longer, 100+ epochs.

However, 11 rows and 25 columns is a very small dataset.
For a dataset that small, I would recommend prompt engineering with ChatGPT, Mixtral, or Claude instead.
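As a minimal sketch of that prompt-engineering route: serialize the real rows and ask a chat model to extend the table. The toy DataFrame, the prompt wording, and the 20-row request are illustrative; the snippet only builds the prompt and leaves the choice of LLM or API client to the reader.

import pandas as pd

# Illustrative stand-in for the small table; replace with the real data.
df = pd.DataFrame({f"col_{i}": list(range(11)) for i in range(25)})

# Serialize the real rows as CSV and ask a chat model to extend the table.
prompt = (
    "Below is a small table in CSV format. Generate 20 additional rows that "
    "follow the same schema and plausible value ranges. Return CSV only, "
    "without repeating the header.\n\n"
    + df.to_csv(index=False)
)

print(prompt)  # paste into ChatGPT, Mixtral, or Claude, or send via an API client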

@bvanbreugel

Hi all,

I very much appreciate the clean and easy-to-use repo. In my limited experience with it, however, I've encountered OP's issue many times: the generation loop breaks and no data is output. I've tried increasing the number of epochs (e.g. 200) and max_length, but neither helps reliably. This remains true even when the dataset is not tiny (e.g. 100 samples). To reproduce, use for example the UCI Spambase dataset (58 features):

from be_great import GReaT
from sklearn.datasets import fetch_openml

# load the Spambase dataset from OpenML, reduce to 100 samples
data = fetch_openml(data_id=44, as_frame=True).frame[:100]

# train the model, takes about 12 minutes on a single GPU
model = GReaT(llm='distilgpt2', batch_size=32, epochs=200, fp16=True)
model.fit(data.to_numpy(), column_names=list(data.columns))

# generate: this raises the error "Breaking the generation loop!"
synthetic_data = model.sample(n_samples=1000, max_length=2000)

print(len(synthetic_data))
assert len(synthetic_data) > 0  # this fails

Of course, in an example like the above you would expect the model to overfit, but it's frustrating that the model doesn't generate anything at all. Is there any guidance on when GReaT can be used reliably?
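One workaround sketch, not a confirmed fix: draw the 1000 rows in smaller chunks and relax the sampling temperature when a chunk fails. It assumes the fitted model from the script above and that sample() accepts temperature and k keyword arguments (check the installed be_great version); the chunk size and the temperature schedule are illustrative.

import pandas as pd

# Assumes `model` is the fitted GReaT instance from the script above.
chunks = []
for temperature in (0.7, 1.0, 1.3):  # illustrative schedule
    try:
        while sum(len(c) for c in chunks) < 1000:
            chunks.append(model.sample(n_samples=100, max_length=2000,
                                       temperature=temperature, k=100))
        break  # reached the target number of rows
    except Exception as err:  # generation loop broke for this chunk
        print(f"temperature={temperature} failed: {err}")

synthetic_data = pd.concat(chunks) if chunks else pd.DataFrame()
print(len(synthetic_data))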

@unnir added the bug (Something isn't working) and help wanted (Extra attention is needed) labels on May 22, 2024
@unnir
Collaborator

unnir commented May 22, 2024

Thank you for providing your script and sorry for the issues with our model's sampling function.

I agree that the current behavior of the model is not optimal, and we should guide users better.
I will try to make an update in the near future.

@bvanbreugel

Thanks for the quick response! That'd be very helpful 😀

@iamamiramine

iamamiramine commented Jul 4, 2024

Hello, I am facing the same issue with the NHANES 1999-2014 dataset, which consists of 6833 samples (rows) and 29 features (columns). I trained the model for 300 epochs and tried generating with different max_length values. Any suggestions on how to fix this?
