Hello! I'm using faker to generate random data for my unit tests, and it appears that when tests run quickly enough on my CPU, two different tests get the same seed, so faker generates the same email for two different users:
func TestCreateUserInDb(t *testing.T) {
	userDao := persistence.NewUserDao()
	fake := faker.New()
	user := dto.NewUser(fake.Person().Name(), fake.Internet().Email(), fake.Internet().Password(), "Pony")
	userDao.CreateUser(user)
}
func TestSelectUserById(t *testing.T) {
	userDao := persistence.NewUserDao()
	fake := faker.New()
	// The test above runs at the same exact epoch, causing faker to initialize
	// with the same seed, so the first fake.Internet().Email() returns the same address.
	user := dto.NewUser(fake.Person().Name(), fake.Internet().Email(), fake.Internet().Password(), "Pony")
	userDao.CreateUser(user)
	readUser := userDao.GetUserById(user.Id)
	if !reflect.DeepEqual(user, readUser) {
		t.Errorf("User and readUser are not equal")
	}
}
2024/11/09 04:33:07 Creating user: [email protected]
2024/11/09 04:33:07 Creating user: [email protected]
2024/11/09 04:33:07 Could not create user: pq: duplicate key value violates unique constraint "users_email_key"
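The collision is easy to demonstrate directly: two Faker instances created within the same second walk the same pseudo-random sequence, so their first generated emails match. A minimal sketch, assuming the jaswdr/faker v1 behavior where faker.New() seeds from the current time at second resolution:

package main

import (
	"fmt"

	"github.com/jaswdr/faker"
)

func main() {
	a := faker.New()
	b := faker.New() // created within the same second as a

	// When the seeds collide, both fakers produce identical sequences,
	// so the first emails are equal.
	fmt.Println(a.Internet().Email() == b.Internet().Email()) // true when seeds collide
}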
I guess what we need is a way to control the seed, or at least to mix some unique component into it.
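In the meantime, a possible workaround is to seed each Faker explicitly rather than relying on the time-based default. This is only a sketch: it assumes the jaswdr/faker v1 NewWithSeed constructor (which takes a math/rand Source), and the newTestFaker helper and persistence_test package name are made up for illustration:

package persistence_test

import (
	"math/rand"
	"testing"
	"time"

	"github.com/jaswdr/faker"
)

// newTestFaker is a hypothetical helper: each test gets its own Faker
// seeded from a nanosecond timestamp instead of the second-resolution
// default, so two tests starting in the same second no longer share a seed.
func newTestFaker(t *testing.T) faker.Faker {
	t.Helper()
	return faker.NewWithSeed(rand.NewSource(time.Now().UnixNano()))
}

Each test would then call fake := newTestFaker(t) instead of faker.New(). A nanosecond seed is still not unique by construction, but it makes same-second collisions like the one above very unlikely.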