How to define a custom OpenAPI specification for a Watson Machine Learning deployment to integrate it into watsonx Assistant

This blog post is about how to define a custom OpenAPI specification for a Watson Machine Learning deployment on IBM Cloud to integrate it into watsonx Assistant.

Watson Machine Learning deployments make it easy for data scientists to turn AI prototypes into something applications can integrate: they keep working with the Jupyter Notebooks and Python they are used to, without having to write containers or set up runtimes. Once they deploy, developers can consume the implemented AI functionality via a REST API.

This blog post has the following sections:

  1. Objective
  2. Technical background
  3. Write a custom OpenAPI specification
  4. Summary

1. Objective

In Watson Machine Learning, you can externalize functionalities you have implemented in Watson Studio by using deployments.

“Create an online (also called Web service) deployment to load a model or Python code when the deployment is created to generate predictions online, in real time. For example, if you create a classification model to test whether a new customer is likely to participate in a sales promotion, you can create an online deployment for the model. Then, you can enter the new customer data to get an immediate prediction.”

IBM Documentation

Maybe you want to integrate these functionalities into watsonx Assistant by using extensions or Bring Your Own Search. At the moment, that setup requires an OpenAPI 3.0.x specification.

This post shows how to write one.

2. Technical background

The following content is related to an IBM Cloud Pak for Data as a Service instance.

Simplified dependencies diagram

You can create a deployment for a Python function defined inside a Jupyter Notebook. This gives other applications access to the custom AI functionality you have implemented in your notebooks.

A deployment provides a runtime environment that hosts your Python function; you access the function by calling it through the Watson Machine Learning REST API.

With an OpenAPI specification, you define how the REST API can be used.

watsonx Assistant uses this Open API specification to integrate an external application using an extension. Here are some details about the OpenAPI security configuration in the watsonx Assistant documentation.

For more details about a deployment, please visit the IBM Documentation.

3. Write a custom OpenAPI specification

Before we start with the custom OpenAPI specification, note that for IBM REST APIs you can usually find an existing OpenAPI specification in the IBM API documentation.

In this example, we focus on only one REST API call: predictions.
To access the deployed function with the predictions REST API:

  1. You need an IBM Cloud IAM token.
  2. You can invoke the predictions API call. Note: you need to ensure that your function fulfills the given input format for the payload and returns its result in the right format.
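Step 1 can be sketched in Python using only the standard library. This is a minimal sketch, not official IBM client code: the token endpoint and the apikey grant type are taken from the authentication snippet later in this post, while the function names are my own:

```python
import json
import urllib.parse
import urllib.request

IAM_TOKEN_URL = "https://iam.cloud.ibm.com/identity/token"

def build_token_request(api_key: str):
    """Return the URL and form-encoded body for the IAM token exchange."""
    body = urllib.parse.urlencode({
        "grant_type": "urn:ibm:params:oauth:grant-type:apikey",
        "apikey": api_key,
    })
    return IAM_TOKEN_URL, body

def fetch_iam_token(api_key: str) -> str:
    """Exchange an IBM Cloud API key for a short-lived IAM bearer token."""
    url, body = build_token_request(api_key)
    req = urllib.request.Request(
        url,
        data=body.encode(),
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )
    with urllib.request.urlopen(req) as resp:
        # The JSON response carries the bearer token in "access_token".
        return json.load(resp)["access_token"]
```

The returned bearer token is then sent in the `Authorization` header of the predictions call in step 2.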

In this case, we only need to define a specification that fulfills this REST API definition. We need to define the version and the deployment_id as parameters:

'https://us-south.ml.cloud.ibm.com/ml/v4/deployments/:deployment_id/predictions?version=2020-09-01'
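How the two parameters combine into that URL, and how the call itself looks, can be sketched as follows. This is an illustrative sketch assuming the us-south region and a bearer token obtained separately from IAM; the function names are mine, not part of the Watson Machine Learning API:

```python
import json
import urllib.parse
import urllib.request

WML_BASE = "https://us-south.ml.cloud.ibm.com"  # region-specific host

def build_predictions_url(deployment_id: str, version: str = "2020-09-01") -> str:
    """deployment_id is a path parameter; version is a required query parameter."""
    path = f"/ml/v4/deployments/{urllib.parse.quote(deployment_id)}/predictions"
    return f"{WML_BASE}{path}?{urllib.parse.urlencode({'version': version})}"

def call_predictions(token: str, deployment_id: str, payload: dict) -> dict:
    """POST the scoring payload; `token` is an IAM bearer token."""
    req = urllib.request.Request(
        build_predictions_url(deployment_id),
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```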

With this in mind, we are looking at two major sections of our custom Open API specification:

  • Authentication. This code snippet shows how authentication is implemented in the full example below.
securitySchemes:
  oAuth2:
    type: oauth2
    description: "See https://cloud.ibm.com/docs/account?topic=account-iamoverview"
    flows:
      x-apikey:
        tokenUrl: https://iam.cloud.ibm.com/identity/token
        grantType: "urn:ibm:params:oauth:grant-type:apikey"
        secretKeys: ["apikey"]
        paramKeys: []
        scopes: {}

  • Parameters. This code snippet shows how the path and query parameters are implemented in the full example below.

parameters:
  - in: query
    name: version
    schema:
      type: string
    required: true
    description: Version number as a string
  - in: path
    name: deploymentid
    schema:
      type: string
    required: true
    description: Deployment ID as a string

You can find a working example of how to specify the IAM token for oAuth2 in the language-model-watsonx OpenAPI JSON specification, part of the excellent starter kits for watsonx Assistant extensions.

I created the example OpenAPI specification in YAML format because you can easily paste it into the Swagger Editor, verify the content there, and then convert it to OpenAPI JSON format.

openapi: 3.0.3
info:
  title: Your Example Integration
  description: Using OpenAPI v3.0.3 for integration.
  version: 1.0.0
servers:
  - description: Watson Machine Learning deployment URL
    url: https://us-south.ml.cloud.ibm.com
security:
  - oAuth2: []
paths:
  /ml/v4/deployments/{deploymentid}/predictions:
    post:
      summary: Get the predictions
      operationId: predictions_post
      parameters:
        - in: query
          name: version
          schema:
            type: string
          required: true  
          description: Version number as a string
        - in: path
          name: deploymentid
          schema:
            type: string
          required: true
          description: Deployment ID as a string
      requestBody:
        content:
          application/json:
            schema:
              $ref: '#/components/schemas/Predictions_post_request'
        required: true
      responses:
        '200':
          description: Successful Response
          content:
            application/json:
              schema:
                $ref: '#/components/schemas/Predictions_post_response'
components:
  securitySchemes:
    oAuth2:
      type: oauth2
      description: "See https://cloud.ibm.com/docs/account?topic=account-iamoverview"
      flows: 
        x-apikey: 
          tokenUrl: https://iam.cloud.ibm.com/identity/token
          grantType: "urn:ibm:params:oauth:grant-type:apikey"
          secretKeys: ["apikey"]
          paramKeys: []
          scopes: {}
  schemas:
    Predictions_post_request:
      type: object
      required: [input_data]
      properties:
        input_data:
          type: array
          items:
            $ref: '#/components/schemas/Predictions_post_request_content'
    Predictions_post_request_content:
      type: object
      required: [fields, values]
      properties:
        fields:
          type: array
          items:
            type: string
            example: "YOUR CONTENT"
        values:
          type: array
          items:
            type: array
            items:
              type: string
              example: "YOUR CONTENT"
    Predictions_post_response:
      type: object
      properties:
        predictions:
          type: array
          items:
            $ref: '#/components/schemas/Predictions_post_response_content'
    Predictions_post_response_content:
      type: object
      properties:
        fields:
          type: array
          items:
            type: string
            example: "YOUR RETURN VALUE"
        values:
          type: array
          items:
            type: array
            items:
              type: object
              properties:
                response:
                  type: string
                  example: "YOUR INFORMATION"
                references:
                  type: array
                  items:
                    type: object
                    properties:
                      text:
                        type: string
                        example: "YOUR TEXT"
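A request body matching the Predictions_post_request schema can be assembled with a small helper like this. The field name "question" is only a placeholder; your deployed function defines the fields it actually expects:

```python
import json

def make_scoring_payload(fields, rows):
    """Build a body matching the Predictions_post_request schema:
    an input_data array whose single entry carries `fields` and `values`."""
    return {
        "input_data": [
            {"fields": list(fields), "values": [list(r) for r in rows]}
        ]
    }

payload = make_scoring_payload(["question"], [["What is Watson Machine Learning?"]])
print(json.dumps(payload, indent=2))
```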

4. Summary

With an OpenAPI specification, you have a powerful tool to describe REST APIs for your custom integration objectives, for example, to integrate into watsonx Assistant.

Watson Machine Learning deployments make it easy for data scientists to expose AI prototypes to applications: they keep working in Jupyter Notebooks with Python, deploy without touching containers or runtimes, and developers consume the result via a REST API.


I hope this was useful to you. Let’s see what’s next!

Greetings,

Thomas

#openapi, #watsonxassistant, #deployments, #watsonmachinelearning, #restapi, #ibmcloud
