Product Academy for Teams - San Jose

Last week I had the opportunity to attend the three-day Product Academy for Teams course at the IBM Silicon Valley Lab in San Jose.

This brought together members of our team from across different disciplines - design, product management, user research, and engineering. It was fantastic to spend time face to face with colleagues we usually only work with remotely, and to go through the education together, learning from each other's approaches and ideas. The API Connect attendees were split into three smaller teams to work on separate items, and each was joined by a facilitator to help us work through the exercises.

We spent time together learning about the different phases of the product development lifecycle, looking in each phase at the process, some of the best practices, and ways to apply them to our product. It was particularly effective to use real examples from our roadmap in the exercises, so we could collaboratively apply the new approaches and see how they relate directly to our product plan.

Each day of the course looked at a different phase of the product development lifecycle - Discovery, Delivery and Launch & Scale:

Discovery - Are we building the right product? - looking at and assessing opportunities and possible solutions we could offer for them, using evidence to build confidence and reviewing the impact this would have on our North Star Metric.

Delivery - Are we building it right? - ensuring we have a clear understanding of the outcomes we're looking for, how we can achieve them and how we can measure success.

Launch & Scale - Are customers getting value? - ensuring we enable customers to be successful in their use of the product and that we are able to get feedback and data to measure this and improve.

Each of these phases takes an iterative approach, and we looked at how we could apply each of them to our product plan. We also looked at some of the tools and techniques that can help, and members of the different product teams attending shared how they are using these today.

On the final day of the course I also had the opportunity to share some of our journey with instrumentation - how it has evolved and some of the lessons we learnt along the way, such as the benefits of having a data scientist on the team. I am looking forward to sharing this with the wider team and seeing how we apply some of the learning to improve our systems going forward - for example, better validation of decisions through measuring and improving our use of data.

API Connect Quick Check

This script originated as part of a much wider framework of tests that we put together when I was in the API Connect SRE team. However, I've found this set of functions useful to run quickly from time to time in different contexts, giving a high-level answer to 'Is it working?'

The steps this script takes are as follows:

  • Authenticate to the API Manager Platform API and retrieve an access token
  • Take a templated API and Product and modify it to return a unique value
  • Publish this API to a nominated catalog
  • Invoke the API through the gateway (looping until a successful response is seen)
  • Query the Analytics APIs to find the event logged for this invocation (again looping until found)

Whilst the entire test framework makes a lot of assumptions about how our environments are built and deployed, this test was relatively standalone - I just needed to make a couple of updates for it to work outside of our cloud deployments: adding support for turning off certificate validation and for username/password authentication instead of API key based authentication. Take a look at the script on GitHub.

If you want to try this in your own environment you can follow these steps:

  1. Clone the repository

    git clone https://github.com/ibm-apiconnect/quick-check.git
    
  2. Install the required Python dependencies

    pip install -r requirements.txt
    
  3. Identify the appropriate credentials to use for your stack - either username/password, or an API key if you are using an OIDC registry. Set these as environment variables (APIC_API_KEY, or APIC_REALM, APIC_USERNAME & APIC_PASSWORD) or use the command line parameters - either of:

    • -u <apim_user> -p <apim_password> -r provider/default-idp-2 (Local user registry or LDAP)
    • -a <apim_api_key> (OIDC e.g. SaaS)
  4. Download the credentials.json file from the toolkit download page to identify the client id and client secret for your environment - these can either be set as environment variables (CLIENT_ID / CLIENT_SECRET) or as command line parameters (--client_id / --client_secret)

  5. Run the script according to the usage examples

    python api-deploy-check.py -s <platform-api-hostname> -o <provider_org_name> -c <catalog_name> [credential parameters]
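
For example, a run against a SaaS stack using an API key might look like this (the hostname, provider org and catalog names here are placeholders for your own values):

    export CLIENT_ID=my-client-id
    export CLIENT_SECRET=my-client-secret
    python api-deploy-check.py -s platform-api.example.com -o my-org -c sandbox -a $APIC_API_KEY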
    

If successful, you should see output like this:

Example output from the script

I'd be interested to hear if you find this useful or if you have other similar utilities you use already - let me know!

Originally posted on the IBM API Connect Community Blog

Time with the team in Kochi

I was fortunate to finally get a chance to visit my team in Kochi and had a fantastic few days with them - it was so great to be able to spend some time face to face after working together remotely for several years. I travelled to Kochi via Chennai with British Airways. In Chennai this meant I arrived through the international terminal and then had to clear immigration and head over to the domestic terminal next door. Immigration went smoothly, but at the domestic terminal they didn't recognise the boarding pass British Airways had given me for the IndiGo codeshare, so I had to get help at the check-in desks. Once this was resolved I passed through security again without trouble, and after efficient boarding and a short flight I arrived in Kochi around 9 in the morning, where Akhil and Midhun met me at the airport to take me to the office.

I didn't have very much time to explore Kochi, arriving on Monday morning and heading back over to Chennai on the Thursday night to spend Friday with the team there. We did manage to get away earlier one afternoon and head for the coast - on the way it poured with rain, so we ended up at a nice spot, the Old Lighthouse Lounge, overlooking the coast, where we could get some food and drink.

I tried a lot of different foods (whilst sticking to the less spicy options), heading to different places with the team each day I was there. I think my favourite was the local fish wrapped with spices and baked in a banana leaf, followed closely by parotta (a flaky layered flatbread) and some of the different paneer-based dishes.

There was a real buzz in the office and it was fantastic to see the sense of community and the collaboration that went on within the teams. I managed to have a lot of conversations with different groups across the team, a few one-on-ones and a couple of full team meetings, but there was lots more that could easily have taken another week - the time flew by too quickly. I hope to see them all again soon!

Remote Gateway on OpenShift

This post guides you through the steps to deploy a DataPower gateway to use as a remote gateway with API Connect Reserved Instance, optionally routing the inbound management traffic through IBM Cloud Satellite Connector.

Overview diagram

Installing the Operators

To install the operators in your cluster, follow the steps in the documentation on how to install the operators.

Set up certificates

Again follow the steps for creating certificates in the documentation.

Deploy the Gateway Cluster

Deploying the gateway cluster into OpenShift is just a case of creating the GatewayCluster CR - you can start from this template.

  • Create a pull secret with access to download the DataPower images and reference it under imagePullSecrets. You can download the image to mirror to your registry from the 'Download Gateway' button in the Reserved Instance Config Manager.

  • Create a secret containing the password for the DataPower admin user and ensure it is referenced under adminUser - you can use the following command to create it:

    oc create secret generic admin-secret --from-literal=password={SET-PASSWORD-HERE!}

  • Update imageRegistry to point to your image registry.

  • Update the jwksUrl for your reserved instance - this needs to be the platform API endpoint for the reserved instance followed by /api/cloud/oauth2/certs, e.g. https://{platform-api-hostname}/api/cloud/oauth2/certs. You can find the platform API endpoint URL from the 'Download Clients' link in the API Manager interface.

  • Select and configure the appropriate profile for your cluster.

  • Create a secret with the CA that the reserved instance endpoints are signed by - the Let's Encrypt ISRG Root X2 CA (download it from the Let's Encrypt site) - and ensure the secretName for mgmtPlatformEndpointCASecret points to this:

    curl https://letsencrypt.org/certs/isrg-root-x2.pem -o isrg-root-x2.pem
    oc create secret generic isrg-root-x2 --from-file=ca.crt=isrg-root-x2.pem

  • [Optional] If you are routing the inbound traffic through IBM Cloud Satellite Connector, you will need to configure the hostname for the gatewayManagerEndpoint to match the private cloud endpoint hostname for your connector - typically c-01.private.{region}.link.satellite.cloud.ibm.com.

  • Apply the gateway cluster yaml (see the example commands after this list)

  • You can check the status of the cluster using oc get gatewaycluster. If you see any issues you can use oc describe gatewaycluster for more details.
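
A minimal sketch of applying and checking the cluster, assuming the GatewayCluster CR is saved as gatewaycluster.yaml and deployed into a namespace called apicri-gateway (the namespace used in the Satellite example below):

    oc apply -f gatewaycluster.yaml -n apicri-gateway
    oc get gatewaycluster -n apicri-gateway
    oc describe gatewaycluster -n apicri-gateway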

Set up Satellite Connector [optional]

Optionally, deploy a Satellite Connector agent on the same cluster as the remote gateway.

  • Create a Satellite Connector
  • Deploy the agent
  • Create a Connector Endpoint for the gateway management interface (see the docs). For the gateway management endpoint you will need the following details:
    • Destination FQDN: {gateway-cluster-name}-datapower.{namespace}.svc e.g. api-gateway-datapower.apicri-gateway.svc
    • Destination Port: 3000
    • Protocol: TCP

Register gateway with Reserved Instance

Create a TLS Client Profile so that the manager can trust the CA that signs the certificate for the gateway management endpoint. This can be done through the Reserved Instance Config Manager under TLS.

  • Create a Trust Store containing the CA certificate, which can be obtained by copying the ca.crt out of the gateway-manager-endpoint-secret and putting it in a file named ca.pem - API Connect needs the .pem extension for the upload to be accepted. (See the example command after this list.)
  • Create TLS Client Profile referencing the Trust store created
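
To extract the CA certificate, a one-line sketch (assuming the secret lives in the apicri-gateway namespace used earlier):

    oc get secret gateway-manager-endpoint-secret -n apicri-gateway -o jsonpath='{.data.ca\.crt}' | base64 -d > ca.pem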

Create TLS Server profiles to present to clients invoking the APIs:

  • Create TLS Key Store containing the certificate and private key to present - typically these would be obtained through an external Certificate Authority.
  • Create a Server profile referencing the key store created.

To register the gateway on the 'Gateways' tab you will need the following details:

  • URL of management endpoint: If using Satellite, this is the link endpoint URL including the port number. If not, this will be the hostname from the gatewayManagerEndpoint in the gateway cluster CR.
  • TLS Client Profile: profile created above
  • Base URL of API Invocation endpoint: The host from the gatewayEndpoint in the CR that inbound clients will use
  • TLS Server Profile: profile created above

Lambda Integration in IBM API Connect on AWS

This blog post will give you an overview of our Lambda integration in API Connect on AWS, what you could use it for, and a simple worked example of setting it up. If you'd prefer, you can watch me demonstrate this in our video on YouTube.

Lambda is a serverless computing platform provided by Amazon Web Services (AWS) which lets you build and deploy your code in a number of different programming languages, with easy integrations to AWS services, without having to manage infrastructure or servers. This makes it an ideal place to build out your API implementation, and when you combine it with our API Connect on AWS SaaS offering you can quickly build, manage and socialise your APIs without worrying about the infrastructure behind them.

The Lambda policy in the SaaS service makes it simple and straightforward to integrate these functions, so you can build out your APIs around numerous different AWS services, then manage them centrally through API Connect and share them with consumers through our customisable developer portal.

To manage the integration securely we use AWS Security Token Service (STS), so there is no need for you to give API Connect the credentials for your AWS account - you can just create an IAM role and grant permission for API Connect to assume that role.

The API Connect service is running in our AWS account and we have a fixed STS role that the service uses for each region:

  • US East: arn:aws:iam::623947394061:role/ibm-apiconnect-us-east-a
  • Frankfurt: arn:aws:iam::623947394061:role/ibm-apiconnect-eu-central-a
  • London: arn:aws:iam::623947394061:role/ibm-apiconnect-eu-west-a

In your AWS account you would set up a role that has permission to invoke your Lambda function(s), setting the principal for the trust policy to be our service role and the external ID to be the provider org ID of your API Connect service instance. You can set this role to grant permission for just a single Lambda function or multiple, depending on what you need.

Within API Connect you will find the Lambda policy in the API assembly editor on the Gateway tab, along with all the other policies you can use to build out your API. The preferred model is to configure the Lambda policy using the STS Assume Role option; however, we also support providing an Access Key ID and Secret Access Key. Along with the credentials, you just need to specify the name of the function and the region your Lambda function is deployed in.

To do this for yourself you can follow along with my simple worked example:

1. Create Lambda function

From the AWS Console, select Lambda then Create function - either start from scratch or make use of the ‘Hello World’ blueprint example. If you start from scratch and want to use the simple echo function I demonstrate in the video, give your function a name and select NodeJS as the Runtime - you can leave all the other settings the same unless your organisation has specific requirements around roles and permissions. Then click Create function.
Here is the code for the simple example from the video, which you can copy and paste into your function - alternatively you can use an existing function or write something more useful yourself:

export const handler = async(event) => {
    // TODO implement something more interesting here!  
    const response = event;
    return response;
};

After the code is updated in the function, click Deploy. Take a note of the function name and the region your function is deployed in - both are part of the displayed function ARN: arn:aws:lambda:{{region}}:{{account}}:function:{{functionName}}
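
Optionally, you can check the function behaves as expected before wiring it into API Connect - for example with the AWS CLI (the function name here is a placeholder for whatever you named yours):

    # Invoke the function directly and print the echoed payload
    aws lambda invoke --function-name my-echo-function \
        --cli-binary-format raw-in-base64-out \
        --payload '{"hello": "world"}' response.json
    cat response.json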

2. Create a role

Now you have the function created, you will need to create a role which can invoke the function and which API Connect has permission to assume - for this, go to IAM and select Roles within the AWS Console.

  • Select Create role
  • Select Custom trust policy
  • Update the trust policy to set the Principal to the API Connect role for the region you are using (see the list above) and set the Condition to require an external ID matching the provider org you are using in the API Connect service - you should end up with something like the following:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Statement1",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::623947394061:role/ibm-apiconnect-eu-central-a"
            },
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringEquals": {
                    "sts:ExternalId": "ac51909e-a379-4153-a0e8-ef4a25a83405"
                }
            }
        }
    ]
}

Next, create a permission policy for the role to allow it to invoke Lambda functions - something like this (you can also specify the function ARN in the Resource field to limit which Lambda functions API Connect can invoke):

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": "lambda:InvokeFunction",
            "Resource": "*"
        }
    ]
}

Now you have the Lambda function and role configured, we're ready to start making use of the function within API Connect.

3. Create your API in API Connect

To do this, open API Manager - if you don't already have a trial instance to use, you can create a free trial now.

  • Select Develop APIs and Products.
  • Create a new API using the Add button and selecting API (from REST, GraphQL or SOAP).
  • Select New OpenAPI.
  • Complete the details for the name and path and click Next.
  • Confirm the security options and click Next.

Now you have completed the guided steps to create the basic API, you can set up the OpenAPI specification for it as you would like before moving on to the implementation. To configure the implementation, select the Gateway tab and remove the default Invoke policy by hovering over it until you see the bin icon, then clicking it.

You can now drag the Lambda policy across in its place and fill in the details from your Lambda function as the parameters (role, function name and region).

Now you have the Lambda policy in place, you can save the API, toggle the Online indicator to make it available to test, and then try it out under the Test tab.

There we have an API implemented end to end from a serverless function in AWS Lambda through to defining the OpenAPI spec for it and managing it in API Connect. You could now go on to share this API with Consumers through the Developer Portal, or continue to build out the functionality with additional policies first.
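
Once the API is published and online, you can also invoke it from outside the Test tab. A minimal sketch using curl, where the gateway endpoint, provider org, catalog, API path and client ID are all placeholders for the values from your own environment:

    # Call the API through the gateway, passing the consumer application's client ID
    curl 'https://{gateway-endpoint}/{provider-org}/{catalog}/{api-path}' \
        -H 'X-IBM-Client-Id: {client-id}'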

Try it for yourself by signing up for our free 30 day trial today.