alibi_detect.utils.tensorflow.prediction module

alibi_detect.utils.tensorflow.prediction.predict_batch(x, model, batch_size=10000000000, preprocess_fn=None, dtype=np.float32)[source]

Make batch predictions on a model.

Parameters:
  • x (Union[list, ndarray, Tensor]) – Batch of instances.

  • model (Union[Callable, Model]) – tf.keras model or other permitted callable that returns the model outputs.

  • batch_size (int) – Batch size used during prediction.

  • preprocess_fn (Optional[Callable]) – Optional preprocessing function for each batch.

  • dtype (Union[Type[generic], DType]) – Model output type, e.g. np.float32 or tf.float32.

Return type:

Union[ndarray, Tensor, tuple]

Returns:

NumPy array, TensorFlow tensor, or a tuple of those containing the model outputs.
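
A minimal usage sketch for predict_batch, assuming a small illustrative tf.keras model and random input data (the architecture, shapes and batch size below are assumptions for illustration, not part of the API):

import numpy as np
import tensorflow as tf
from alibi_detect.utils.tensorflow.prediction import predict_batch

# Illustrative toy model: 4 input features -> 2 outputs.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(16, activation='relu'),
    tf.keras.layers.Dense(2)
])

x = np.random.rand(1000, 4).astype(np.float32)

# Predict in batches of 256; the batch outputs are concatenated into a
# single np.float32 array of shape (1000, 2).
preds = predict_batch(x, model, batch_size=256, dtype=np.float32)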

alibi_detect.utils.tensorflow.prediction.predict_batch_transformer(x, model, tokenizer, max_len, batch_size=10000000000, dtype=np.float32)[source]

Make batch predictions using a transformers tokenizer and model.

Parameters:
  • x (Union[list, ndarray]) – Batch of instances.

  • model (Model) – Transformer model.

  • tokenizer (Callable) – Tokenizer for the model.

  • max_len (int) – Max token length.

  • batch_size (int) – Batch size.

  • dtype (Union[Type[generic], DType]) – Model output type, e.g. np.float32 or tf.float32.

Return type:

Union[ndarray, Tensor]

Returns:

NumPy array or TensorFlow tensor with the model outputs.
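
A minimal sketch for predict_batch_transformer, assuming a Hugging Face tokenizer together with alibi-detect's TransformerEmbedding wrapper as the model; the checkpoint name, embedding type and input text are illustrative assumptions:

from transformers import AutoTokenizer
from alibi_detect.models.tensorflow import TransformerEmbedding
from alibi_detect.utils.tensorflow.prediction import predict_batch_transformer

model_name = 'bert-base-uncased'  # assumed pretrained checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Wrap the transformer so that calling it on tokenized input returns
# embeddings (here the [CLS] hidden state of the last layer).
embedding = TransformerEmbedding(model_name, embedding_type='hidden_state_cls', layers=[-1])

x = ['first example sentence', 'another, longer example sentence']

# Tokenization (padding/truncation to max_len) happens inside the function;
# the result is an np.float32 array with one embedding row per instance.
emb = predict_batch_transformer(x, embedding, tokenizer, max_len=100, batch_size=32)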