TensorFlow Lite Use Case

TensorFlow Lite was designed for on-device machine learning inference with low latency and a small binary size. These properties also make it well suited to running inside Intel SGX enclaves with the help of SCONE.
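
For context, below is a minimal sketch of the kind of TensorFlow Lite inference workload such an enclave would run. The model path ("model.tflite") and the use of the tflite_runtime Python package are illustrative assumptions; the curated image may package and invoke TensorFlow Lite differently.

```python
# Minimal TensorFlow Lite inference sketch.
# Assumes a model file "model.tflite" and an installed tflite_runtime package.
import numpy as np
from tflite_runtime.interpreter import Interpreter

# Load the model and allocate input/output tensors.
interpreter = Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy input matching the model's expected shape and type.
input_shape = input_details[0]["shape"]
dummy_input = np.zeros(input_shape, dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy_input)

# Run inference and read the result.
interpreter.invoke()
output = interpreter.get_tensor(output_details[0]["index"])
print(output)
```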

We will add more documentation about the curated TensorFlow Lite image later. Until then, you can have a look at our TensorFlow Lite screencast.

If you want to evaluate the TensorFlow Lite image, send us an email.