Hugging Face
satya - 1/23/2024, 2:08:46 PM
Quick tour of the Transformers library
satya - 1/23/2024, 2:12:05 PM
What is the difference between the Hugging Face Transformers library and the Inference API?
satya - 1/23/2024, 3:47:31 PM
Scikit-learn, PyTorch, TensorFlow
- Scikit-learn: a collection of ML algorithms to train, test, and deploy traditional (non-deep-learning) ML models
- PyTorch, TensorFlow: core low-level libraries for deep learning: define neural networks, run data through them, train, test, etc. (sketch below)
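A minimal sketch of the contrast, using made-up toy data (the models and values are illustrative, not from these notes): scikit-learn fits a classical model in one call, while PyTorch has you define the network and the training step yourself.

```python
from sklearn.linear_model import LogisticRegression
import torch
import torch.nn as nn

# scikit-learn: fit a traditional model in a couple of lines
X = [[0.0, 1.0], [1.0, 0.0], [1.0, 1.0], [0.0, 0.0]]
y = [1, 0, 1, 0]
clf = LogisticRegression().fit(X, y)
print(clf.predict([[0.5, 0.9]]))

# PyTorch: define a tiny network and run one manual training step
model = nn.Sequential(nn.Linear(2, 1), nn.Sigmoid())
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.BCELoss()

inputs = torch.tensor(X)
targets = torch.tensor(y, dtype=torch.float32).unsqueeze(1)
opt.zero_grad()
loss = loss_fn(model(inputs), targets)
loss.backward()  # backpropagate gradients through the network
opt.step()       # update the weights
```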
satya - 1/23/2024, 3:51:07 PM
Transformers library from Hugging Face
- For developing, training, or fine-tuning machine learning models. This library allows users to work with pre-trained models or to train their own models.
- Uses other libraries like PyTorch and TensorFlow underneath
- A large collection of pretrained models
- Provides an execution environment if needed (see the pipeline sketch after this list)
- It runs in the user's own environment, which means the user needs to manage the computational resources, dependencies, and everything else required to run the models.
- It supports multiple deep learning frameworks like TensorFlow and PyTorch and provides interfaces in several programming languages, predominantly Python.
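A minimal sketch of running a pretrained model locally through the library's pipeline API; the first call downloads the default sentiment-analysis model into the local cache, so the machine running this supplies the disk, memory, and compute.

```python
from transformers import pipeline

# A task-based pipeline picks a pretrained model and runs it in this process,
# using PyTorch or TensorFlow underneath.
classifier = pipeline("sentiment-analysis")
print(classifier("Hugging Face pipelines make local inference straightforward."))
```

Everything happens on the user's own hardware, which is the resource-management trade-off noted above.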
satya - 1/23/2024, 3:53:44 PM
The Hugging Face inference API
- Models are managed in the cloud
- API simplifies the interaction with the models
- Used for running already-trained models, not for training them
- Models are grouped by the type of task they perform
- The API is likewise organized by task
- The API signature (the inputs and outputs it expects) also varies by task (see the request sketch after this list)
- Suitable for small to mid-size companies that don't want to manage their own model hosting
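A hedged sketch of calling the hosted Inference API over plain HTTP; the model id is just an example, and HF_TOKEN is a placeholder for a real access token. The request and response shapes depend on the task the model belongs to.

```python
import requests

API_URL = "https://api-inference.huggingface.co/models/distilbert-base-uncased-finetuned-sst-2-english"
headers = {"Authorization": "Bearer HF_TOKEN"}  # placeholder; use a real access token

payload = {"inputs": "The models stay in Hugging Face's cloud; we only send text."}
response = requests.post(API_URL, headers=headers, json=payload)
print(response.json())  # task-specific output, e.g. labels with scores
```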
satya - 1/23/2024, 4:14:02 PM
Here is the Hugging Face TGI: Text Generation Inference server
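A minimal sketch of querying a TGI server over its REST interface; the localhost:8080 address and the max_new_tokens value are assumptions about how the server was launched (e.g., the official Docker image with its port mapped to 8080), not details from these notes.

```python
import requests

TGI_URL = "http://localhost:8080/generate"  # assumed local deployment

payload = {
    "inputs": "Explain what a text-generation inference server does.",
    "parameters": {"max_new_tokens": 64},
}
response = requests.post(TGI_URL, json=payload)
print(response.json()["generated_text"])
```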