Prerequisites

Ensure that you meet the following requirements before installing the agent:

Ensure you have an active subscription

To use CxLink Backup on your SAP server, you need an active contract with valid licenses in your CxLink account.


Prepare your Storage Provider Account - Amazon Web Services

Before you can start using Amazon S3 as your storage provider, you will need to perform some actions in your AWS account.

0. Required AWS knowledge

  • Basic understanding of the AWS S3, AWS KMS, AWS IAM, AWS CloudFormation, and AWS SNS services
  • Understanding of IAM instance profiles, IAM policies, and IAM roles
  • Ability to launch the provided CloudFormation template
  • Understanding of S3 billing

1. Create or Prepare your Amazon S3 Bucket

You will need to create a bucket to store your database backups.

VPC endpoints

To obtain the best performance for your backups, we highly recommend setting up an S3 gateway endpoint in the VPC where your EC2 instance is running (there is no additional charge for using a gateway endpoint). To verify that you are using the endpoint, see S3 Endpoints.

If you don't have one, you can use the Amazon S3 console, Amazon S3 APIs, AWS CLI, or AWS SDKs to create a bucket, following the guidelines in https://docs.aws.amazon.com/AmazonS3/latest/userguide/create-bucket-overview.html:

  1. Sign in to the AWS Management Console and open the Amazon S3 console at https://console.aws.amazon.com/s3/

  2. Choose Create bucket.

  3. In Bucket name, enter a DNS-compliant name for your bucket. The bucket name must:

    • Be unique across all of Amazon S3.
    • Be between 3 and 63 characters long.
    • Not contain uppercase characters.
    • Start with a lowercase letter or number.

    Bucket name compliance

    After you create the bucket, you can't change its name. For information about naming buckets, see Bucket naming rules.

    Avoid including sensitive information, such as account numbers, in the bucket name. The bucket name is visible in the URLs that point to the objects in the bucket.

  4. In Region, choose the AWS Region where you want the bucket to reside.

    Choose a Region close to you to minimize latency and costs and address regulatory requirements. Objects stored in a Region never leave that Region unless you explicitly transfer them to another Region. For a list of Amazon S3 AWS Regions, see AWS service endpoints in the Amazon Web Services General Reference.

  5. In Bucket settings for Block Public Access, choose the Block Public Access settings that you want to apply to the bucket.

    We recommend that you keep all settings enabled unless you know that you need to turn off one or more of them for your use case, such as to host a public website. Block Public Access settings that you enable for the bucket are also enabled for all access points that you create on the bucket. For more information about blocking public access, see Blocking public access to your Amazon S3 storage.

  6. (Optional) If you want to enable S3 Object Lock, do the following:

    • Choose Advanced settings, and read the message that appears.

      Important

      You can only enable S3 Object Lock for a bucket when you create it. If you enable Object Lock for the bucket, you can't disable it later. Enabling Object Lock also enables versioning for the bucket. After you enable Object Lock for the bucket, you must configure the Object Lock settings before any objects in the bucket will be protected. For more information about configuring protection for objects, see Using S3 Object Lock.

    • If you want to enable Object Lock, enter enable in the text box and choose Confirm.

      For more information about the S3 Object Lock feature, see Using S3 Object Lock.

      Note

      To create an Object Lock enabled bucket, you must have the following permissions: s3:CreateBucket, s3:PutBucketVersioning and s3:PutBucketObjectLockConfiguration.

  7. Choose Create bucket.
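The naming rules from step 3 can be checked locally before you attempt to create the bucket. The following sketch is illustrative only: it covers just the rules listed in this guide, not the full S3 bucket naming specification.

```python
import re

def is_valid_backup_bucket_name(name: str) -> bool:
    """Check a proposed bucket name against the rules listed above.

    Covers only the subset of S3 naming rules mentioned in this guide:
    3-63 characters, no uppercase, starts with a lowercase letter or digit.
    """
    if not 3 <= len(name) <= 63:
        return False  # wrong length
    if name != name.lower():
        return False  # contains uppercase characters
    # Must start with a lowercase letter or number
    return re.match(r"^[a-z0-9]", name) is not None

print(is_valid_backup_bucket_name("sap-backups-example"))  # True
print(is_valid_backup_bucket_name("SAP-Backups"))          # False (uppercase)
print(is_valid_backup_bucket_name("ab"))                   # False (too short)
```

Remember that uniqueness across all of Amazon S3 can only be verified by S3 itself when you create the bucket.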


2. Create or Maintain an IAM Policy

For CxLink Backup to be able to store and retrieve backups from Amazon S3, you must create an IAM Policy with the following permissions and attach it to your EC2 service role.

Amazon IAM Policies

If you need additional information, follow the guidelines described in https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_create.html#access_policies_create-json-editor

The following permissions should be granted to use all the configuration options available in the CxLink Backup agent settings:

| AWS Service | AWS Permission | Description | Resource |
| --- | --- | --- | --- |
| EC2 | ec2:DescribeRegions | List all available AWS Regions from the Configuration Wizard | * |
| EC2 | ec2:DescribeInstances | Retrieve EC2 instance tags to be sent to the LinkeIT Console (optional) | * |
| S3 | s3:ListAllMyBuckets | List all buckets in the AWS account from the Configuration Wizard | * |
| S3 | s3:ListBucketVersions | List metadata about all of the versions of objects in a bucket | * |
| S3 | s3:HeadBucket | Determine whether a bucket exists and whether you have permission to access it | * |
| S3 | s3:* | Allow all operations on the backup bucket | "arn:aws:s3:::<bucket_name>", "arn:aws:s3:::<bucket_name>/*" |
| KMS | kms:ListKeys | List all KMS keys from the Configuration Wizard | * |
| KMS | kms:ListAliases | List the aliases of your KMS keys from the Configuration Wizard | * |
| KMS | kms:GetPublicKey, kms:GenerateDataKey, kms:Decrypt, kms:Encrypt, kms:GetKeyPolicy | Encrypt and decrypt your backups | <kms_key_arn> |
| SNS | sns:Publish | Report failed backups via the AWS Simple Notification Service | <sns_topic_arn> |
| STS | sts:AssumeRole | Assume a role in another AWS account | <role_arn> |

You can use any of the following IAM policy templates as the base for your SAP Server instance profile:

S3 Policy example:
{
    "Version": "2012-10-17",
    "Statement": [
    {
        "Sid": "VisualEditor0",
        "Effect": "Allow",
        "Action": [
            "s3:ListAllMyBuckets",
            "s3:HeadBucket"
        ],
        "Resource": "*"
    },
    {
        "Sid": "VisualEditor1",
        "Effect": "Allow",
        "Action": "s3:*",
        "Resource": [
            "arn:aws:s3:::<YOUR_BUCKET_NAME>/*",
            "arn:aws:s3:::<YOUR_BUCKET_NAME>"
        ]
    }
    ]
}
EC2 Instance example:
{
    "Version": "2012-10-17",
    "Statement": [
    {
        "Sid": "VisualEditor0",
        "Effect": "Allow",
        "Action": [
            "ec2:DescribeInstances",
            "ec2:DescribeRegions"
        ],
        "Resource": "*"
    }
    ]
}
Amazon KMS example:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "kms:GetPublicKey",
                "kms:Decrypt",
                "kms:Encrypt",
                "kms:GenerateDataKey",
                "kms:DescribeKey",
                "kms:Verify"
            ],
            "Resource": [
                "arn:aws:kms:eu-west-1:${AWS::AccountId}:key/<KMS_KEY_NAME>",
                "arn:aws:kms:eu-west-1:${AWS::AccountId}:alias/<KMS_KEY_ALIAS>"
            ]
        },
        {
            "Sid": "VisualEditor1",
            "Effect": "Allow",
            "Action": [
                "kms:ListKeys",
                "kms:GenerateRandom",
                "kms:ListAliases"
            ],
            "Resource": "*"
        }
    ]
}
Amazon SNS example:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": "sns:Publish",
            "Resource": "arn:aws:sns:eu-west-1:${AWS::AccountId}:CxLink Backup-Topic-Name"
        }
    ]
}
Assume Role example:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AssumeCrossAccountRole",
            "Effect": "Allow",
            "Action": "sts:AssumeRole",
            "Resource": "arn:aws:iam::<REMOTE_AWS_ACCOUNT_ID>:role/<RemoteRole>"
        }
    ]
}
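The templates above contain placeholders such as <YOUR_BUCKET_NAME> that must be filled in before the policy is usable. As a hedged sketch, the S3 template can be rendered programmatically and then pasted into the IAM console; the bucket name below is purely illustrative.

```python
import json

def build_s3_policy(bucket_name: str) -> str:
    """Render the S3 policy template above for a concrete bucket name."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "VisualEditor0",
                "Effect": "Allow",
                "Action": ["s3:ListAllMyBuckets", "s3:HeadBucket"],
                "Resource": "*",
            },
            {
                "Sid": "VisualEditor1",
                "Effect": "Allow",
                "Action": "s3:*",
                "Resource": [
                    f"arn:aws:s3:::{bucket_name}/*",
                    f"arn:aws:s3:::{bucket_name}",
                ],
            },
        ],
    }
    return json.dumps(policy, indent=4)

# Hypothetical bucket name, for illustration only
print(build_s3_policy("my-sap-backups"))
```

Rendering the policy with `json.dumps` also guarantees the result is syntactically valid JSON, which the IAM policy editor requires.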

3. Attach IAM Policy to the EC2 Instance as a Service Role

Ensure that the IAM Policies have been added to the EC2 instance profile of your SAP Database server.


4. Allow access to remote accounts (optional)

If you want to access AWS resources (S3 and KMS, or an SNS topic) in a different account, you will need to create the proper IAM policy and role on both accounts to grant the required permissions.

  1. Create an IAM policy in the AWS account where the SAP server resides that allows assuming the role in the remote account.

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AssumeCrossAccountRole",
                "Effect": "Allow",
                "Action": "sts:AssumeRole",
                "Resource": "arn:aws:iam::<REMOTE_AWS_ACCOUNT_ID>:role/<RemoteRole>"
            }
        ]
    }
    
  2. Attach the newly created policy to the EC2 Service Role.

  3. Grant access to the role. In the remote account, create a new role with a trust relationship that allows it to be assumed by the EC2 instance role, the one that you attach to your database EC2 instances.

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "TrustingPolicy",
                "Effect": "Allow",
                "Principal": {
                    "AWS": "arn:aws:iam::<DATABASE_SERVER_AWS_ACCOUNT>:role/<EC2InstanceRole>"
                },
                "Action": "sts:AssumeRole"
            }
        ]
    }
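The two documents above must reference each other consistently: the Resource in the assume-role policy names the remote role, while the Principal in the trust policy names the instance role that is allowed to assume it. The following sketch builds the matching pair; all account IDs and role names are hypothetical.

```python
import json

def cross_account_pair(local_account: str, instance_role: str,
                       remote_account: str, remote_role: str):
    """Build the matching assume-role policy (attached in the local account)
    and trust policy (on the role in the remote account)."""
    # Policy attached to the EC2 instance role in the SAP server's account
    assume_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "AssumeCrossAccountRole",
            "Effect": "Allow",
            "Action": "sts:AssumeRole",
            "Resource": f"arn:aws:iam::{remote_account}:role/{remote_role}",
        }],
    }
    # Trust relationship on the role in the remote account
    trust_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "TrustingPolicy",
            "Effect": "Allow",
            "Principal": {
                "AWS": f"arn:aws:iam::{local_account}:role/{instance_role}"
            },
            "Action": "sts:AssumeRole",
        }],
    }
    return json.dumps(assume_policy, indent=4), json.dumps(trust_policy, indent=4)

# Hypothetical account IDs and role names, for illustration only
assume, trust = cross_account_pair("111111111111", "SapEc2InstanceRole",
                                   "222222222222", "CxLinkBackupRemoteRole")
print(assume)
print(trust)
```

If the assume-role policy's Resource and the trust policy's Principal do not line up this way, the AssumeRole call will be denied.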
    

Prepare your Storage Provider Account - Microsoft Azure

Before you can start using Microsoft Azure as your storage provider, you will need to perform some actions in your Azure account.

1. Create or Prepare your Microsoft Azure Storage account and container

You will need to create a storage account and at least one container.

If you have neither a storage account nor a container, you can use the Microsoft Azure Portal to create them, following the guidelines in https://docs.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-portal:

  1. Sign in to the Microsoft Azure Portal

  2. Choose Storage accounts, then Create.

  3. Set the storage account's basic attributes:

    • Resource group where the storage account will be created
    • Storage account name
    • Region
    • Performance
    • Redundancy
  4. Choose Review + create

  5. Under the storage account attributes, go to Data storage -> Containers

  6. Set a container name and set the Public access level to Private


2. (optional) Configure managed identity for virtual machines

For CxLink Backup to be able to store and retrieve backups from Azure, you can either configure the storage account name and storage account key, or configure a managed identity on your virtual machines.

The following storage account storage access roles must be granted to use all the configuration options available in CxLink Backup agent settings:

  • Owner
  • Storage Blob Data Contributor

As an example, for a virtual machine named emoryazdb2 and a storage account named emorydevelopment, this would be the role assignment:

Azure virtual machine managed identity roles


Prepare your Storage Provider Account - Google Cloud

Before you can start using Google Cloud as your storage provider, you will need to perform some actions in your Google Cloud Project.

1. Select the Project ID

For the next steps, select the proper Project ID in your Google Cloud account where you want to store your backups.

2. Create your Google Cloud IAM Service Account Key File

  1. Create a custom Role with the following permissions:

    • cloudkms.cryptoKeyVersions.useToDecrypt
    • cloudkms.cryptoKeyVersions.useToEncrypt
    • cloudkms.cryptoKeyVersions.useToSign
    • cloudkms.cryptoKeyVersions.useToVerify
    • cloudkms.cryptoKeyVersions.viewPublicKey
    • cloudkms.locations.get
    • cloudkms.locations.list
    • storage.buckets.get
    • storage.buckets.list
    • storage.multipartUploads.abort
    • storage.multipartUploads.create
    • storage.multipartUploads.list
    • storage.multipartUploads.listParts
    • storage.objects.create
    • storage.objects.delete
    • storage.objects.get
    • storage.objects.getIamPolicy
    • storage.objects.list
    • storage.objects.update
  2. Create a Service Account and a private key file in JSON format; the key will be used to grant the CxLink Backup agent access to the Google Cloud Storage bucket. Store this file securely.

  3. Under the IAM section, grant the newly created Role to the Service Account
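Instead of clicking through the console, the permission list above can also be kept in a role-definition file and created with the gcloud CLI (for example, `gcloud iam roles create <ROLE_ID> --project=<PROJECT_ID> --file=role.json`). The sketch below builds such a file; the role title is hypothetical.

```python
import json

# Permissions required by the CxLink Backup agent, as listed above
CXLINK_PERMISSIONS = [
    "cloudkms.cryptoKeyVersions.useToDecrypt",
    "cloudkms.cryptoKeyVersions.useToEncrypt",
    "cloudkms.cryptoKeyVersions.useToSign",
    "cloudkms.cryptoKeyVersions.useToVerify",
    "cloudkms.cryptoKeyVersions.viewPublicKey",
    "cloudkms.locations.get",
    "cloudkms.locations.list",
    "storage.buckets.get",
    "storage.buckets.list",
    "storage.multipartUploads.abort",
    "storage.multipartUploads.create",
    "storage.multipartUploads.list",
    "storage.multipartUploads.listParts",
    "storage.objects.create",
    "storage.objects.delete",
    "storage.objects.get",
    "storage.objects.getIamPolicy",
    "storage.objects.list",
    "storage.objects.update",
]

def build_role_definition(title: str) -> str:
    """Build a custom-role definition file for `gcloud iam roles create`."""
    role = {
        "title": title,
        "stage": "GA",
        "includedPermissions": CXLINK_PERMISSIONS,
    }
    return json.dumps(role, indent=2)

# Hypothetical role title, for illustration only
print(build_role_definition("CxLink Backup Agent"))
```

Keeping the role definition in a file makes it easy to review the granted permissions and to reproduce the role in other projects.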

3. (optional) Create your custom KMS Key for encryption

Under Security -> Key Management, you can create the KMS key that will be used to encrypt data at rest in the Cloud Storage bucket.

  1. (optional) Create a new keyring:
    • Keyring name
    • Location type: Region or Multi-region
  2. Create a new key inside the keyring:
    • Generated key
    • Key name: this key name will be used to encrypt data in the bucket
    • Protection level: Software
    • Purpose: Symmetric encrypt/decrypt
    • Key rotation period: specify the number of days after which your keys will be rotated automatically

4. Create or Prepare your Google Cloud Storage account and bucket

Create a new bucket under Cloud Storage that will be used for storing your backups:

  • Name your bucket
  • Choose where to store your data: Multi-region, Dual-region, or Region
  • Choose a storage class for your data: select Autoclass for an automatically optimized pricing model
  • Choose how to control access to objects: enforce public access prevention on this bucket
  • Choose how to protect object data:
    • Choose a Google-managed encryption key if you want Google to manage your keys
    • Choose a customer-managed encryption key to use the previously generated key to encrypt the data