This short blog post shows how to set up a demo environment for using Caikit and Hugging Face models on your local machine.
- Caikit is an AI toolkit that enables users to manage models through a set of developer-friendly APIs. It provides a consistent format for creating and using AI models against a wide variety of data domains and tasks, and it streamlines the management of AI models for application usage by letting AI model authors focus on solving well-known problems with novel technology.
- Hugging Face is the AI community building the future: a platform to build, train, and deploy state-of-the-art models powered by the reference open source libraries in machine learning.
The demo setup consists of the backend Caikit server and a UI that lets you explore how to use the gRPC API with various Hugging Face models.
You can find more details about this example in the caikit-huggingface-demo GitHub project.
Step 1: Clone the project and navigate to the project folder
git clone https://github.com/caikit/caikit-huggingface-demo
cd caikit-huggingface-demo
Step 2: Set up a virtual environment
Also see: Set up a virtual environment for Python
python3.11 -m venv caikit-env-3.11
Step 3: Activate the virtual environment
source ./caikit-env-3.11/bin/activate
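(Optional) To double-check that the shell now picks up the interpreter from the virtual environment rather than the system Python, you can run the following; the exact path and patch version will differ on your machine:
which python3        # should point into ./caikit-env-3.11/bin
python3 --version    # should report a Python 3.11.x release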
Step 4: Install the needed modules and libraries
# optional: python3 -m pip install --upgrade pip
python3 -m pip install -r requirements.txt
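(Optional) If you want to confirm that the Caikit and Hugging Face related packages actually landed in the virtual environment, you can ask pip; the package list and versions depend on the current requirements.txt:
python3 -m pip list | grep -iE 'caikit|transformers|gradio'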
Step 5: Verify the server configuration
cat ./caikit_huggingface_demo/runtime/config/config.yml
- Example output:
runtime:
  # The runtime library (or libraries) whose models we want to serve using Caikit Runtime. This should
  # be a snake case string, e.g., caikit_nlp or caikit_cv.
  library: runtime
  local_models_dir: models
  # Service exposure options
  port: 8085
  find_available_port: True
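Because find_available_port is set to True, the runtime simply picks another port if 8085 is already taken. If you want to check up front that nothing else is listening on 8085 (Linux/macOS), an optional quick check looks like this:
# no output means the port is free
lsof -i :8085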
Step 6: Copy some example models into a new models folder
cd caikit_huggingface_demo
mkdir models
cp -r example_models_extras/image_classification models/
cp -r example_models/sentiment models/
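A quick listing shows whether the two example models ended up in the right place:
ls models/
# image_classification  sentiment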
Step 7: Start the server and the UI
python3 app.py
- Example output:
<function register_backend_type at 0x12fc7ec00> is still in the BETA phase and subject to change!
Command-line enabled Caikit gRPC backend server and frontend gradio UI
▶️ Starting the backend Caikit inference server...
{'log_code': '<COR56759744W>', 'message': 'No backend configured! Trying to configure using default config file.', 'args': None}
No model was supplied, defaulted to distilbert-base-uncased-finetuned-sst-2-english and revision af0f99b (https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english).
Using a pipeline without specifying a model name and revision in production is not recommended.
No model was supplied, defaulted to google/vit-base-patch16-224 and revision 5dca96d (https://huggingface.co/google/vit-base-patch16-224).
Using a pipeline without specifying a model name and revision in production is not recommended.
Downloading (…)lve/main/config.json: 100%|███████████| 69.7k/69.7k [00:00<00:00, 19.7MB/s]
Downloading pytorch_model.bin: 100%|███████████████████| 346M/346M [00:44<00:00, 7.79MB/s]
Downloading (…)rocessor_config.json: 100%|███████████████| 160/160 [00:00<00:00, 1.18MB/s]
<function ModuleBase.metadata at 0x12fc7f600> is still in the WIP phase and subject to change!
<function ModuleBase.metadata at 0x12fc7f600> is still in the WIP phase and subject to change!
✅️ Sentiment tab is enabled!
✅️ ImageClassification tab is enabled!
▶️ Starting the frontend gradio UI with using backend target=localhost:8085
Running on local URL: http://127.0.0.1:7860
To create a public link, set `share=True` in `launch()`.
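At this point the backend serves the Caikit gRPC API on localhost:8085, which is exactly the target the gradio UI talks to. If you would rather poke at the API directly, a generic tool such as grpcurl can help; this is only a sketch and assumes grpcurl is installed and that the runtime exposes gRPC server reflection:
# list the gRPC services exposed by the Caikit runtime
grpcurl -plaintext localhost:8085 list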
Step 8: Use the example UI
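Open the gradio UI in your browser at http://127.0.0.1:7860 (the URL printed in the output above) and try the Sentiment and ImageClassification tabs. On macOS you can open it straight from the terminal; on Linux, xdg-open does the same:
open http://127.0.0.1:7860          # macOS
# xdg-open http://127.0.0.1:7860    # Linux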

I hope this was useful to you. Let's see what's next!
Greetings,
Thomas
#python, #venv, #caikit, #Huggingface, #ai