Overview
API Protect can integrate with the IBM DataPower Gateway to provide active protection and “continuous discovery” of APIs. This guide explains the requirements for using API Protect and how the integration works.
Requirements
Supported IBM DataPower Gateway Versions
IBM DataPower Gateway 2018.4
IBM DataPower Gateway 7.6
IBM DataPower Gateway 10.0.x
API Protect Analyzer
To keep request data inside your cloud or on-premises infrastructure, the API Protect Analyzer is deployed within that infrastructure. Our Analyzer is resource-efficient, but take care to provision enough resources to handle your traffic; in most cases, 4 vCPUs and 8 GB of RAM will handle normal traffic and spikes.
Supported Deployments
Kubernetes Helm Chart (see the example install after this list)
Docker Compose file
VMware OVA
Linux package
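For example, the Kubernetes deployment is typically a single Helm install. The repository URL, chart name, and release name below are placeholders, and passing clientId as a chart value is an assumption; use the values provided with your generated configuration.
# Placeholder repository URL, chart name, and clientId value; substitute
# the values provided with your API Protect configuration.
$ helm repo add datatheorem https://example.com/charts
$ helm install api-protect-analyzer datatheorem/api-protect-analyzer \
    --set clientId=<your-clientId>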
How The Integration Works
DataPower Gateways apply security policies to requests flowing through the gateway. The gateway can also execute GatewayScript to dynamically route requests, transform payloads, and apply policies. After you install the API Protect GatewayScript, requests flowing through the gateway are analyzed by the API Protect Analyzer running inside your cloud or on-premises infrastructure. All requests are analyzed to maintain a live observability profile of your gateway traffic, and, based on the API Protect rules you configure and enable, requests can also be blocked at the gateway.
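The exchange below is a purely illustrative sketch of that flow: the GatewayScript forwards request metadata to the Analyzer and acts on the verdict it returns. The endpoint path, payload fields, and response format are hypothetical; the actual wire protocol is implemented by the generated api_protect.js and the Analyzer.
# Hypothetical request/verdict exchange with the Analyzer on port 8080;
# the /analyze path and all field names are placeholders for illustration only.
$ curl -s -X POST http://analyzer-host:8080/analyze \
    -H 'Content-Type: application/json' \
    -d '{"method": "POST", "uri": "/v1/payments", "clientIp": "203.0.113.7"}'
{"action": "allow"}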
Installation
GatewayScript Install
We generate and provide you with an api_protect.js GatewayScript file that can be loaded either from a local location using local://some_local_directory/api_protect.js or from the store using store://some_store_directory/api_protect.js.
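To place the file in the local: filestore, one option is the DataPower REST management interface (port 5554 by default). The host, credentials, and domain below are placeholders, and the exact filestore request shape can vary by firmware release, so verify it against the IBM documentation for your version.
# Hypothetical upload of api_protect.js to local://some_local_directory via the
# DataPower REST management interface; host, credentials, and domain ("default")
# are placeholders, and the filestore API shape should be confirmed for your firmware.
$ curl -k -u admin:password -X PUT \
    "https://dp-host:5554/mgmt/filestore/default/local/some_local_directory/api_protect.js" \
    -d "{\"file\": {\"name\": \"api_protect.js\", \"content\": \"$(base64 < api_protect.js | tr -d '\n')\"}}"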
API Protect Analyzer
Installation instructions for the API Protect Analyzer vary depending on the type of deployment you choose for your environment, i.e., container or VM. We provide detailed instructions and support for whichever style of deployment you choose.
For evaluation purposes, our Docker Compose file can have the Analyzer running in minutes.
# Generate and download the compose file and .env file with your clientId
# from Data Theorem Portal and place them in a local directory:
$ ls -A
docker-compose.yml  .env
$ docker compose --env-file .env up
Once the Analyzer is running, it accepts TCP traffic on port 8080.
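To confirm the Analyzer is reachable from the gateway's network, a basic TCP check is enough; analyzer-host below is a placeholder for the host where you deployed the Analyzer.
# Check TCP connectivity to the Analyzer; replace analyzer-host with the
# hostname or IP of your deployment.
$ nc -zv analyzer-host 8080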