TensorFlow: Quick Start
In this tutorial, we are going to deploy an image classifier to Model Zoo with TensorFlow and use it to make sample predictions.
You can follow along with this tutorial in any Python environment you’re comfortable with, such as a Python IDE, a Jupyter notebook, or a Python terminal. The easiest option is to open this tutorial directly in Colab:
Install the Model Zoo client library via pip:
```shell
!pip install modelzoo-client[tensorflow]
```
To deploy and use your own models, you’ll need to create an account and configure an API key. You can do so from the command line:
First, we’ll need a TensorFlow model to deploy. Model Zoo is solely focused on the deployment and monitoring of models, so feel free to train or load a TensorFlow model using any method, tool, or infrastructure. For the purposes of this demo, we’ll use the TensorFlow official example to classify images of clothing.
```python
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential(
    [
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10),
    ]
)
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)

(
    (train_images, train_labels),
    (test_images, test_labels),
) = tf.keras.datasets.fashion_mnist.load_data()

# Scale pixel values from [0, 255] to [0, 1].
train_images = train_images / 255.0
test_images = test_images / 255.0

model.fit(train_images, train_labels, epochs=5)

test_loss, test_acc = model.evaluate(test_images, test_labels, verbose=2)
print("\nTest accuracy:", test_acc)
```
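Note that the final Dense layer has no activation, so the model returns raw logits rather than probabilities; that is why the loss is constructed with `from_logits=True`. If you want probabilities out of the returned scores, you can apply a softmax yourself. A minimal numpy sketch (the logits here are made up for illustration):

```python
import numpy as np

def softmax(logits):
    # Subtract the max before exponentiating for numerical stability.
    exps = np.exp(logits - np.max(logits))
    return exps / exps.sum()

# Example: raw logits for the 10 Fashion-MNIST classes.
logits = np.array([1.2, -0.5, 0.3, 2.8, 0.0, -1.1, 0.7, 3.5, -0.2, 0.9])
probs = softmax(logits)
print(probs.sum())        # ~1.0
print(np.argmax(probs))   # 7 -- softmax preserves the argmax
```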
To deploy this TensorFlow model to a production-ready HTTP endpoint, use the `modelzoo.tensorflow.deploy()` function:

```python
import modelzoo.tensorflow

model_name = modelzoo.tensorflow.deploy(model)
```
That’s all there is to it! Behind the scenes, Model Zoo serialized your model, uploaded it to object storage, deployed a container to serve HTTP requests made to the model, and set up a load balancer to route requests across multiple model shards. If you’d like, take some time to explore the model via the Web UI link. There you can modify documentation, test the model with raw or visual inputs, and monitor its metrics and logs. By default, only your account (or anybody you share your API key with) will be able to access this model.
You can specify the name of the model you’d like to deploy via a `model_name` argument. If a name is omitted, Model Zoo will choose a unique one for you. Model names must be unique to your account.
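Since names must be unique to your account, one simple client-side scheme for generating names yourself (this is just an illustration, not part of the Model Zoo API) is to suffix a base name with a short random string:

```python
import uuid

def unique_model_name(base="fashion-mnist-classifier"):
    # Append a short random hex suffix so repeated deploys don't collide.
    return f"{base}-{uuid.uuid4().hex[:8]}"

print(unique_model_name())  # e.g. fashion-mnist-classifier-1a2b3c4d
```

You could then pass the result as the name of your deployment instead of letting Model Zoo pick one.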
Now that the model is deployed, you can use our Python client API to query the model for a prediction. The `modelzoo.tensorflow.predict()` function requires the model_name and a payload for prediction – in this case a numpy array representing a test image.
```python
image = test_images[0]
print("\nTest label ground truth:", test_labels[0])

scores = modelzoo.tensorflow.predict(model_name, image)
scores_by_class = list(zip(range(10), scores))
model_prediction = np.argmax(scores)

print("\nModel scores by class:", scores_by_class)
print("\nModel prediction:", model_prediction)
```
Great! At this point, we’ve successfully queried our deployed model for a prediction on an image it hasn’t seen during training.
By default, Model Zoo will deploy your model and wait for it to reach a HEALTHY state, meaning that it’s ready for predictions. You can always check on the state of a model from the Web UI or via the client library. To save resources, you can shut down any model you aren’t using and start it again later.
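When managing model state yourself, it helps to wait until the model reports HEALTHY before sending traffic. A minimal polling sketch, assuming a hypothetical `get_state(model_name)` helper (the real client call may differ):

```python
import time

def wait_until_healthy(get_state, model_name, timeout=300, interval=5):
    # Poll the model's state until it reports HEALTHY or we time out.
    deadline = time.time() + timeout
    while time.time() < deadline:
        if get_state(model_name) == "HEALTHY":
            return True
        time.sleep(interval)
    return False

# Usage with a stub state function (replace with a real status lookup):
assert wait_until_healthy(lambda name: "HEALTHY", "my-model")
```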
With Model Zoo you can manage model state manually or automatically. By default, our free trial will stop any model that has seen no request activity for 15 minutes, saving you resources if you forget to stop it manually. Our unlimited version has more options for controlling autoscaling behavior.