# Requirements for the application
# Python version used in the container: 3.9.23
fastapi==0.121.3
jinja2==3.1.6
numpy==1.26.4
pillow==11.3.0
python-multipart==0.0.20
uvicorn==0.38.0
# Additional requirements for testing
# Uncomment if needed
# measure_inference.py requires psutil for memory sampling
# measure_http.py requires requests for HTTP requests
# psutil==7.1.3
# requests==2.32.5
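#
# Illustrative sketch only (not the contents of measure_inference.py): the kind of
# memory sampling psutil enables; run_inference() is a hypothetical workload stand-in.
#
#   import psutil
#
#   proc = psutil.Process()               # handle to the current process
#   rss_before = proc.memory_info().rss   # resident set size, in bytes
#   run_inference()                       # hypothetical: the code under measurement
#   rss_after = proc.memory_info().rss
#   print(f"RSS delta: {(rss_after - rss_before) / 1e6:.1f} MB")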
# The TFLite model interpreter is installed in the Dockerfile from a wheel compiled manually from the LiteRT repo:
# https://github.com/google-ai-edge/LiteRT
# Commit used for the provided wheel: cc245c70a9113041467a4add21be6d1553b8d831
# If replicating the environment without the provided wheel, remove/comment the interpreter
# installation line in the Dockerfile and install one of the following instead
# (a fallback import sketch follows the list):
#
# - full tensorflow (still ships the TFLite interpreter up to and including TF 2.20)
# USAGE: from tensorflow.lite.python.interpreter import Interpreter (valid for TF 2.20.0; other versions may differ)
# tensorflow==2.20.0
#
# - tflite-runtime if the model was trained/converted with older TF versions; smaller package, but deprecated and not compatible with recent op versions
# USAGE: from tflite_runtime.interpreter import Interpreter
# tflite-runtime==2.14.0
#
# - ai-edge-litert for the latest TFLite interpreter with extended op support, at the cost of a larger package size
# USAGE: from ai_edge_litert.interpreter import Interpreter
# ai-edge-litert==2.0.3
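#
# Illustrative sketch: a fallback import chain covering the three options above, so the
# same code runs whichever package is installed ("model.tflite" is a placeholder path):
#
#   try:
#       from ai_edge_litert.interpreter import Interpreter
#   except ImportError:
#       try:
#           from tflite_runtime.interpreter import Interpreter
#       except ImportError:
#           from tensorflow.lite.python.interpreter import Interpreter
#
#   interpreter = Interpreter(model_path="model.tflite")
#   interpreter.allocate_tensors()  # must be called before set_tensor()/invoke()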