Configuring AI Inference Software
To perform AI inference, users need to configure the inference software with a trained machine learning model and the input data it will process. Popular deep learning frameworks, such as TensorFlow and PyTorch, support GPU acceleration for inference workloads.
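Before running inference, it is worth confirming that the framework can actually see the provisioned GPU. A minimal PyTorch sketch is shown below; the equivalent TensorFlow check is `tf.config.list_physical_devices('GPU')`.

```python
import torch

# Select the GPU if PyTorch can see one, otherwise fall back to CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"Running inference on: {device}")

if device.type == "cuda":
    # Report which GPU was detected on the provisioned instance.
    print(f"Detected GPU: {torch.cuda.get_device_name(0)}")
```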
Users specify the paths to their trained model and input data, and the inference software loads both onto the provisioned GPU instance to produce predictions.
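As a sketch of that workflow, the example below loads a model and an input tensor from user-supplied paths and runs inference on the GPU. It assumes the model was exported as TorchScript and the input saved with `torch.save`; the file names `model.pt` and `input.pt` are placeholders for your own paths.

```python
import torch

# Placeholder paths; substitute the locations of your own files.
MODEL_PATH = "model.pt"   # a TorchScript model saved with torch.jit.save
INPUT_PATH = "input.pt"   # an input tensor saved with torch.save

# Use the GPU if available, otherwise fall back to CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Load the trained model directly onto the target device
# and switch it to evaluation mode for inference.
model = torch.jit.load(MODEL_PATH, map_location=device)
model.eval()

# Load the input data onto the same device as the model.
inputs = torch.load(INPUT_PATH, map_location=device)

# Run inference without tracking gradients.
with torch.no_grad():
    outputs = model(inputs)

print(outputs)
```

Loading with `map_location=device` places the weights and data on the GPU in one step, avoiding a separate CPU-to-GPU copy after loading.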