How did you choose the size of your bucket? It seems that if it is too small, your computing time increases, but if it is too large, you potentially leave up to bucket_size/2 on the table.
Hi Stephane,
We experimented with different values of X. There was a tradeoff between having enough training examples for each of the X buckets in our (unbiased) training data and giving the optimizer more flexibility. We ended up using X = 50. Inference time was not a factor in the decision, as TensorFlow Serving returns predictions in virtually the same amount of time for X = 1 to 100 candidates in a single call to the service.
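To illustrate why inference time is flat in X: all X candidates can be scored in one request by stacking them as rows of the `instances` array in TensorFlow Serving's REST predict payload. This is a minimal sketch of building such a payload; the feature names (`beds`, `city_id`, `price`) and the endpoint are hypothetical, not the actual model schema.

```python
import json

def build_predict_payload(listing_features, candidate_prices):
    # One instance per candidate price; the listing's other features
    # are repeated across all rows so the model scores each candidate.
    instances = [dict(listing_features, price=p) for p in candidate_prices]
    return json.dumps({"instances": instances})

X = 50
candidate_prices = [100 + 5 * i for i in range(X)]  # 50 candidate price points
payload = build_predict_payload({"beds": 2, "city_id": 7}, candidate_prices)

# POST this payload to the serving endpoint in a single call, e.g.:
#   http://<host>:8501/v1/models/<model_name>:predict
# The server batches all X rows through the model at once, which is why
# latency is roughly constant whether X is 1 or 100.
```

Because the model evaluates the whole batch in one forward pass, scoring 100 candidates costs almost the same as scoring one, so bucket count can be chosen purely on training-data and optimizer grounds.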