In the following `__getitem__()` method, the `index` argument is never used, so every batch generated is identical because the seed is fixed by `get_predefined_generator()`.
I'm not sure if this is intended behavior. Essentially, we are training on the same batch for (1000 / batch_size) iterations per epoch, and across epochs.
    def __getitem__(self, index):
        batch_x, batch_y = [], []
        for _ in range(self.batch_size):
            x, y = self.generate_pair()
            batch_x.append(x)
            batch_y.append(y)
        return self.encode_x_batch(batch_x), self.encode_y_batch(batch_y)
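One possible fix is to derive the random state from `index`, so each batch is different yet still reproducible. The sketch below is a minimal, self-contained illustration of that idea; `PairSequence`, `base_seed`, and the toy `generate_pair` body are hypothetical stand-ins for the real class and its helpers, not the actual implementation.

```python
import random

class PairSequence:
    """Minimal sketch of the batch-generation pattern (hypothetical names)."""

    def __init__(self, batch_size=4, base_seed=42):
        self.batch_size = batch_size
        self.base_seed = base_seed

    def generate_pair(self, rng):
        # Stand-in for the real pair generator; draws from the RNG it is given.
        x = rng.random()
        return x, 2 * x

    def __getitem__(self, index):
        # Seed from the batch index: batch `index` is always the same
        # (reproducible), but different indices yield different batches.
        rng = random.Random(self.base_seed + index)
        batch_x, batch_y = [], []
        for _ in range(self.batch_size):
            x, y = self.generate_pair(rng)
            batch_x.append(x)
            batch_y.append(y)
        return batch_x, batch_y

seq = PairSequence()
assert seq[0] != seq[1]   # different indices give different batches
assert seq[0] == seq[0]   # each index is still deterministic
```

With the original code, the fixed global seed made `seq[0]` and `seq[1]` identical; seeding per index keeps reproducibility without collapsing the whole epoch to one batch.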