

This article is introduced by a blog post providing an overview and summary.


Watch a video of the content being presented (duration 1h 34 mins, recorded May 2020).


User Management

SAP provides a default Identity Provider

This default provides:

  • Manual user creation through ‘Menu-Users’
  • Manual user import from a CSV file
  • User provisioning through a SCIM API
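As a hedged sketch of the SCIM route: the payload below follows the SCIM 2.0 core schema, but the endpoint path, tenant URL and token are placeholders to verify against your tenant’s API documentation.

```python
import json

# Placeholder tenant URL -- the endpoint path is an assumption to verify
# against your tenant's API documentation.
SCIM_USERS_ENDPOINT = "https://<your-tenant>.sapanalytics.cloud/api/v1/scim/Users"

def build_scim_user(user_id, email, first_name, last_name):
    """Build a SCIM 2.0 core-schema user payload."""
    return {
        "schemas": ["urn:ietf:params:scim:schemas:core:2.0:User"],
        "userName": user_id,
        "name": {"givenName": first_name, "familyName": last_name},
        # Lowercase the email: SAP Analytics Cloud user mapping is case-sensitive
        "emails": [{"value": email.lower(), "primary": True}],
    }

payload = build_scim_user("JDOE", "John.Doe@example.com", "John", "Doe")
print(json.dumps(payload, indent=2))
# The actual request would be an authenticated POST, e.g. with `requests`:
# requests.post(SCIM_USERS_ENDPOINT, json=payload,
#               headers={"Authorization": "Bearer <token>"})
```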

It is typically more suitable to integrate with your existing custom IdP:

  • Must be SAML 2.0 based
  • Commonly used IdPs: Okta, ADFS, Microsoft Azure
  • Grant users access to the ‘SAP Analytics Cloud’ application within the IdP


  • Enable ‘dynamic user creation’ during the custom IdP setup
    • Enables automatic assignment of new users to a ‘concurrent session’ license
      • Much more in the blog
    • Map IdP user attributes to ‘Teams’ in SAP Analytics Cloud

  • SAP Analytics Cloud is case-sensitive during the user mapping process. This means:
    • If the IdP changes the case (typically of the email address), the user won’t match the corresponding user in SAP Analytics Cloud
    • You need to work with SAP to resolve user mismatches until a full solution is available
  • One lesson learnt is to normalise the case of all email addresses in the custom IdP prior to integration
  • Fully self-service
    • Bookmark the ‘Identity Provider Administration’ tool to rescue yourself if you get locked out!
  • More resources
  • See demo!
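The case-normalisation lesson above can be sketched as a small pre-processing step over the IdP’s user export. This is a minimal illustration only; the ‘EMAIL’ column name is an assumption, so adjust it to your IdP’s export format.

```python
import csv
import io

def lowercase_emails(csv_text: str) -> str:
    """Return the CSV with the 'EMAIL' column normalised to lowercase.

    The column name 'EMAIL' is an assumption -- adjust to your export format.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        # Strip whitespace and lowercase, so the IdP and SAP Analytics Cloud
        # always agree on the (case-sensitive) user mapping value
        row["EMAIL"] = row["EMAIL"].strip().lower()
        writer.writerow(row)
    return out.getvalue()

sample = "USER_ID,EMAIL\nJDOE,John.Doe@Example.com\n"
print(lowercase_emails(sample))
```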


Roles, Teams, User concepts

  • Authorisations are performed within SAP Analytics Cloud
  • Key concepts inside SAP Analytics Cloud:
    • Roles
    • Teams
    • Users
  • Best Practice Summary:
    • Use inheritance on objects:
      • Avoid assigning rights on individual files or individuals
      • Assign rights to Teams and Folders
      • Assign Teams to Roles
      • Assign Users to those Teams and not directly to Roles
        • though there is an exception for ‘concurrent session’ license assignment
      • For more details on “Managing Licenses with Roles and Teams”: blog and wiki

Best Practices setup

Data Security

Acquired data models

Row level data security

Two ways to define Data access in a model

Option 1: Read/Write Property in Model Dimension

  • Assign, per dimension value, who can access it
  • Can use Team membership
  • Works with hierarchies
  • ‘Data Access Control’ for the dimension must be switched on

Option 2: Data Access Filter in Role

  • Assign data access filters in the role
  • Can use ‘Current User’; ideal for organisation dimensions when using Planning
  • Typically assign users to a team, where the team is a member of the role
  • ‘Model Data Privacy’ must be switched on
  • Tip! An easy way to enable/disable access to a model without changing other file permissions!

Live connections
SAP Analytics Cloud - Live Connection and Security - Best Practices

  • Live connections are a true direct connection between the browser and the data source
  • Data is not sent or stored in SAP Analytics Cloud
  • Enabled with the use of Cross-Origin Resource Sharing (CORS) and is fully encrypted and secured

  • SAP Analytics Cloud stores just the meta-data:
    • Connection name, description, server/port (no usernames or passwords are stored)
    • Model meta data: field definitions, data type (measure, dimension, decimals, aggregation type etc.)
    • Story definitions (layout, labels, styling, filter values, formulas etc.)

Single Sign-On to ‘live’ data sources means:

  • Data source applies all data security
  • Data security is completely externalised
  • No need to replicate data security within SAP Analytics Cloud
  • Only need to manage ‘application’ rights
    • What the user can/can’t do (create/edit stories) – via ‘Roles’
    • What content the user has rights over – via ‘File permissions’
  • SSO typically performed via SAML2, but can be others:
    • X509 certificate, Kerberos or SAP Logon tokens are also possible if the database supports it
    • OAuth also supported for Cloud sources

On-premise Components

Acquired Data: Cloud Connector and Cloud Agent

  • Architecture
    • Each SAP Analytics Cloud Service can use multiple SAP Cloud Connectors
    • One SAP Cloud Connector can use only one Agent
  • Cloud Connector
    • Cloud Connector Documentation link
    • Includes details on sizing and setting up a ‘Master’ and ‘Shadow’ to enable High Availability
    • Each Connector must be given a unique location name (to enable a ‘default’ location, leave the location name blank)
  • Configuration
    • Recommended to use the same server for both the Connector and the Agent
    • The Connector, Agent and data source need to connect to each other
    • Typically takes a few hours, but can be just 15 minutes with the ‘SAP Analytics Cloud Agent Simple Deployment Kit’
  • Tips
    • When configuring the SAP Cloud Connector ‘subaccount’, it will be a different account from any SAP Cloud Platform subaccount you’ve used before
    • How to add members to existing subaccount KBA 2463966
    • How to connect multiple Cloud Connectors to one subaccount KBA 2712296
    • Don’t delete any ‘SAP’ users in the subaccount – they are for SAP ‘DevOps’ and critical to its function. They have no access to data

Live Connectivity including advanced options

Setup Overview

  • No need for any special on-premise components
  • Configuration of on-premise data sources:
    • Set up SSO, typically SAML
    • Enable CORS (trust relationship, SSL certificate)
    • Set up the SameSite cookie configuration
  • Client browsers:
    • Allow 3rd-party cookies and pop-up windows
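For illustration, the CORS trust relationship above means the on-premise data source must answer cross-origin requests from your tenant with headers along these lines. The tenant URL is a placeholder and the exact header list varies by data source, so check the connection guide for your specific source:

```
Access-Control-Allow-Origin: https://<your-tenant>.sapanalytics.cloud
Access-Control-Allow-Credentials: true
Access-Control-Allow-Methods: GET, POST, OPTIONS
Access-Control-Allow-Headers: Authorization, Content-Type, X-Csrf-Token
```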

Areas of Expertise

SAP Analytics Cloud system owner

SAP Analytics Cloud settings, such as SAML 2.0 settings, users and roles management, and connection settings

Data source expert

Connectivity layer and security (SAP HANA, SAP BW or SAP BW/4HANA, SAP S/4HANA, and so on)

Network expert

Proxy, firewall, DNS server etc.

Security expert

SAML 2.0, your organization's Identity Provider (IdP), SSL certificates, etc.

Information system architect

General architecture topics

Application expert

SAP or non-SAP, depending on your data sources: connectivity, security, modelling etc.

Live Connections requiring additional components

  • Using the iOS Mobile App
    • Enabled via an optional connection setting
  • Scheduling/publishing based on a live data source
  • The system owner needs to change settings:
    • ‘Allow live data to securely leave my network’
    • Add additional properties to existing live connections
  • Set up the SAP Cloud Connector for ‘Principal Propagation’ and the trust relationship

Data Blending within Stories

  • Blending of data is possible with other ‘live’ models and acquired models
  • See below for matrix of possibilities!
  • Depending upon the level of support, either browser-based or SDI-based blending is possible

  • Browser-based

    • Blending occurs in the Cloud, not in the browser!
    • Data is transmitted via the browser and so has performance and size limits
    • No configuration needed!

  • Smart Data Integration (SDI)

    • Suitable for larger data volumes
    • Requires setup of a Provisioning Agent
    • You can switch to SDI-based blending later, if needed


(please ensure you refer to the Support Matrix wiki, not this image as it will NOT be updated)

Live SAP HANA Data Access for Smart Predict

  • Train and apply a predictive model using business data that stays in your on-premise SAP HANA system
  • Configuration steps:
    • Install the SAP HANA Automated Predictive Library (APL)
    • Configure HANA Technical User
    • Configure SAP Cloud Connector
    • Add and Configure the Data Repository in SAP Analytics Cloud


R Visualisations

  • Use the SAP hosted ‘R-server’

  • Use your own R-Server
    • Install and configure
    • Can use any R-package
    • The only time SAP Analytics Cloud will make an outbound call!
    • SAP Cloud Connector is not used
    • Place your R-Server ‘close’ to the SAP Data Centre

  • For both options: all data within the visualisation, including data from a ‘live’ connection, will leave your on-premise network and travel to the Cloud
  • For data from ‘live’ connections, you must enable the administration setting ‘R on Live Data Models’

On-premise Dependencies & Keeping up-to-date

Notice of what’s coming

  • A ‘Fast Track’ release cycle, consumed by only a small number of customers, enables:
    • early notification of any on-premise dependencies.
    • ample time to plan.
  • On-premise ‘true dependencies’ are rare, but they do occur.
  • You do need to follow certain SAP Notes to ensure a consistent service is provided.


Primary on-premise components:

SAP Cloud Connector

SAP Analytics Cloud Agent

  • Email notification ahead of time should the agent require an update.
  • You can add an email address for system notifications.

For all on-premise components

Other notification settings

  • Within each user’s Profile Settings, enable:
    • ‘System Notifications’
    • ‘Product Updates & Learning’

Keeping up-to-date – data source requirements

  • For Live connectivity to: BW, HANA and S/4:
    • Minimum requirements do change for these source systems
    • They don't change just because SAP updated SAP Analytics Cloud
      • Only additional features brought into SAP Analytics Cloud would require updates to on-premise systems
      • All existing features/functions will work as before
      • Typically new features will only be enabled to the user if the backend system was updated
      • Follow SAP Notes 2715030 2541557
    • However, minimum requirements are raised as per the SAP Support Strategy and are typically related to defect corrections only

      • In order to stay supported, on-premise systems are expected to be updated

      • Follow SAP Note 375631

  • For Live connectivity to: SAP BusinessObjects Universe:
    • The on-premise connector may require an update upon an SAP Analytics Cloud update
    • Occasionally changes to the BI Platform are required
    • Follow SAP Note 2771921

  • Once a currently supported on-premise version reaches its end of maintenance, then minimum requirements will be increased
  • Minimum requirements are subject to change


Content Network

  • Shared folder in the Cloud
  • Hosts ‘Unit’ or ‘Package’ files
  • Secured sharing

Promote content, not security

  • In general, the users in the source and target systems are different sets of users
  • Content transport will respect target folder security

Content Network Demonstration: Transport a public folder ‘Project A’ to another SAP Analytics Cloud Service

  • Create a new package and save it into a new folder
  • Export the package
  • Share the folder with the target
  • In the target, import the package
  • See demo!

Best Practices

  • Create a folder, in the Content Network, for each Source System and each Project
  • Store Packages for each project in their respective folder
  • Set sharing and security settings on the folder, to take advantage of inheritance
  • Only transport model data if really needed
  • Keep the packages small, rather than monolithic!
    • Makes managing the content much easier

  • The properties of the package can contain a lot of text. Include:
    • Date of export
    • SAP Analytics Cloud wave version at time of export
    • ‘Status’ of the package, e.g. ‘Waiting for test’, ‘Approved for import’ etc.
      • This can be changed and updated over time
      • The ‘Edit’ right on the Content Network folder enables users in the target to edit the package properties
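As an illustration of the kind of properties text suggested above (all values are made up):

```
Exported: 2020-05-15
SAC wave at export: 2020.9
Status: Waiting for test
Contents: Project A stories and models (no model data)
```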

  • Re-use existing packages to ‘re-capture’ changed content rather than creating a new package each time
  • Turn on ‘Modify content’ to re-capture the source
  • Remember: respect the ID of objects – create content once, then transport it to other environments

Monitoring & Usage Tracking

  • Resource Usage Tracking
    • Standard content providing usage information:
      • Stories, models, user and sessions
    • Keep the content in a secured public folder
      • Personal folders can cause issues when you wish to update the content
    • Provides data across the whole service, exposing usernames, model names etc.
  • System Monitor
    • License consumption
    • Historical Usage
    • System Usage by Storage and by User
    • Trace
  • Activity Log
    • Audit log of events
    • Filter and download as needed
    • Notified as approaching max 500,000 rows
  • Data Changes Log
    • Shows actual measure data changes when a private planning version is published to a public version
    • Notified as approaching the max of 500,000 rows
  • Remember: these logs contain personal data
  • See demo!

Other System Administration Tasks & Tips

  • BW Live Connections
    • Change the ‘Parallel number of queries’ from 0 to a max of 12

  • Content Namespace

  • App Integration – Trusted Origins
    • Allows other apps to embed SAP Analytics Cloud

  • Default Appearance
    • The Logo is also used in the ‘clock’!

  • Catalog
    • For those without an Analytics Hub, enable and configure

  • Use a local web server for hosting:
    • Profile pictures (instead of users having to upload their own)
    • Add the web server to the white list in System-Administration
    • Static content for Application Building static files (custom widgets etc.)
    • Custom fonts
      • To use SAP Fiori icons within SAP Analytics Cloud, download the Fiori Font
      • Blog

In general, please post your comments to the blog post that introduces this wiki page rather than directly here. Thank you
