
Issue training model on batch (ValueError: tf.function-decorated function tried to create variables on non-first call.) #21

Open
sehgalsakshi opened this issue Sep 24, 2020 · 2 comments

Comments


sehgalsakshi commented Sep 24, 2020

[screenshot: traceback ending in "ValueError: tf.function-decorated function tried to create variables on non-first call."]
Hi,
I'm getting this error even after annotating the loss function with @tf.function.
I googled it some more; it seems the loss function needs to be reinitialized at every epoch, because training breaks after the first epoch.
Could you please help me out with this as soon as possible?

@sehgalsakshi sehgalsakshi changed the title Issue training model on batch Issue training model on batch (ValueError: tf.function-decorated function tried to create variables on non-first call.) Sep 24, 2020
sehgalsakshi (Author) commented:

@tf.function
def get_compiled_model():
    model = create_model(args, maxlen, vocab)
    model.get_layer('word_emb').trainable = False
    model.compile(optimizer=optimizer, loss=max_margin_loss,
                  metrics=[max_margin_loss])
    return model

On adding this wrapper around the model, I'm getting the error below:
[screenshot: traceback showing the same ValueError]


JoannaSimm commented Oct 29, 2021

I'm getting the same error. @madrugado is there a solution for this problem?
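Editor's note: this ValueError usually means variables are being created inside a tf.function after its first trace, which TensorFlow forbids. Decorating a function that builds and compiles a model (as in the snippet above) triggers exactly that. A minimal sketch of the usual fix, using a hypothetical stand-in model and synthetic data (the thread's create_model, args, maxlen, vocab, and max_margin_loss are not reproduced here): build and compile the model in plain Python, and wrap only the per-batch train step in @tf.function.

```python
import tensorflow as tf

def get_compiled_model():
    # Build and compile OUTSIDE any tf.function: Keras creates its
    # variables here, and that must happen in eager Python code.
    # Stand-in architecture for illustration only.
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(4,)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

model = get_compiled_model()

# Only the per-step computation is traced; it reads the variables that
# already exist instead of creating new ones, so repeated calls are safe.
@tf.function
def train_step(x, y):
    with tf.GradientTape() as tape:
        pred = model(x, training=True)
        loss = tf.reduce_mean(tf.square(pred - y))
    grads = tape.gradient(loss, model.trainable_variables)
    model.optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss

x = tf.random.normal((8, 4))
y = tf.random.normal((8, 1))
loss1 = train_step(x, y)
loss2 = train_step(x, y)  # second call: no ValueError, variables already exist
```

The same pattern applies with a custom loss such as max_margin_loss: keep the model/compile code undecorated and move @tf.function down to the training step (or simply rely on model.fit / train_on_batch, which handle tracing internally).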
