
This article is introduced by a blog providing an overview and summary


Video preview (Duration 1h 16 mins. Recorded May 2020)

Video download 

Out of date

(warning) This page is now out of date and will be deleted soon.

(warning) The updated content is now available on the blog page.


  • SAP Analytics Cloud is a Software as a Service offering
  • This article refers to an individual SAP Analytics Cloud ‘instance’ as a Service
    • The term ‘tenant’ or ‘system’ is often used, but it is synonymous with Service
    • ‘Service’ is the correct term to use as this is what you consume!

Landscape Architecture: On-premise v Cloud

On-premise landscapes

‘Traditional’ on-premise landscape environments typically consist of

  • Two primary tiers
    • One for the application / database. E.g. SAP BW
    • One for the Business Intelligence or Planning system. E.g. SAP BusinessObjects BI Suite
  • Both primary tiers hosted on-premise
  • Each environment (Sandbox, Dev, QA, Prod) has a unique role with regard to the content life-cycle
  • So, what changes when working with a Cloud Service?


What’s the same

  • Typically still need two primary tiers
    • Though some use-cases can be fulfilled solely with SAP Analytics Cloud
  • Still need multiple environments (Sandbox, Dev, QA, Prod)
    • The role these environments provide remains valid, albeit now with a more obvious associated cost


What’s different

  • The SAP Analytics Cloud services are updated on a scheduled basis
  • Organisations may want fewer Cloud environments to reduce Cloud Service fees
  • There could be Cloud to On-premise dependencies
  • Impacts
    • life-cycle management activities
    • on-premise software update cycles

Landscape Architecture

There are many factors that determine the landscape and these include

  • Choice of Public and Private hosted editions
  • Any use of SAP Analytics Cloud Test Service and its Preview option
  • Pros/Cons of Quarterly Release and Fast Track Update Release cycles
  • Options and limitations to promote content between the environments
  • Maturity of lifecycle management and change control practices

Public and Private Editions


  • Public Edition
    • Internal SAP HANA database instance is shared, with a schema per customer
    • Database performance could be impacted by other customer activity
  • Private Edition
    • Still has some shared components but the core SAP HANA database instance is dedicated
    • Dedicated database instance reduces impact from other customers’ database activity
    • Typically chosen for larger enterprise deployments
    • Need to ‘size’ correctly to avoid hitting a hardware limit
  • Public Editions are typically chosen
    • but not exclusively since Private Editions have increased database performance protection from other customer services

Private Editions – a little more about sizing and performance

  • Analytic Models connecting to ‘live’ data sources require minimal hardware resources on SAP Analytics Cloud
    • Most of the load is on the data source
  • Analytic Models using acquired data use the underlying HANA database, mostly for read access, so there is a load but it is not too heavy
    • Likely to require more CPU than Memory
  • Planning Models using acquired data can require significant resources for both CPU and Memory
    • Planning model performance depends heavily on the model design
      • # of exception aggregations, # of dimensions, scope of the users’ security context when performing data entry
      • Physical size of a model isn’t necessarily the determining factor for performance!
  • Other factors: # of concurrent users, size and shape of the data, # of schedules etc.

Applicable for all environments

  • Public and Private editions are applicable for all environments
    • There are no restrictions on what type of edition can be used for any particular life-cycle purpose
    • However, performance load testing is only permitted against Private editions
    • Penetration testing is positively encouraged against all editions
      • For both load & penetration testing follow SAP Note 2249479 to ensure compliance with contractual agreements
        and so we don’t block you thinking it’s a denial of service attack!

Test Services

  • Provides a unique opportunity, designed with testing in mind
  • Includes named user Planning Professional licenses and a SAP Digital Boardroom license
    • Planning Professional includes Planning Standard license, and Planning Standard includes Business Intelligence named user license
    • However there are no Analytics Hub users, nor any concurrent Business Intelligence licenses available with this service
  • Typically all Beta features, if requested and granted, can be enabled on Test Services
  • Cannot be used for Productive use
    • Does not adhere to standard SLAs
    • Means you cannot log a Priority 1 support incident with SAP
    • Does not mean it cannot connect to a Productive data source (i.e. a Test Service can still connect to a productive data source)
  • A Test Service is not required for testing as such; a regular non-Test Service can be used for all non-Production life-cycle use-cases
  • Test Services are available for both Public and Private editions
    • However Public Test editions have restrictions with the Test Preview option described later

Quarterly Release Cycle (QRC)

Primary Update Release Cycle

  • There is one primary update release cycle, the ‘Quarterly Release Cycle’ (QRC)
  • If you purchased SAP Analytics Cloud, then your service will be on this update release cycle by default
  • It means your SAP Analytics Cloud Service is updated
    • on a scheduled basis, and you cannot elect for it to be ‘not updated’
    • once a quarter, so 4 times a year

About the wave version number

  • For the year 2020, the Q1 QRC wave version is 2020.02, the 2020 Q2 wave version is 2020.08, etc.
    • The wave version number is a combination of the year, version and patch number
    • E.g. version 2020.02.21 is version 2 of the year 2020, with patch 21
    • Waves and patches are always cumulative (they include all the features of everything before them)
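The wave version format described above can be decomposed programmatically. A minimal sketch — the `parse_wave_version` helper is hypothetical, not an SAP API, and the format is assumed purely from the examples above:

```python
def parse_wave_version(wave: str) -> dict:
    """Split a wave version string like '2020.02.21' into its parts.

    Format assumed from the examples above: <year>.<version>[.<patch>].
    """
    parts = wave.split(".")
    return {
        "year": int(parts[0]),                               # e.g. 2020
        "version": int(parts[1]),                            # e.g. 2 (the wave)
        "patch": int(parts[2]) if len(parts) > 2 else None,  # e.g. 21, if present
    }

# Version 2020.02.21 is version 2 of the year 2020, with patch 21
print(parse_wave_version("2020.02.21"))
```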

Life-cycle use of QRC in overall landscape

  • Life-cycle management requires the ability to transport objects between Dev, QA and Prod environments at all times
    • The ability to transport content is critical to running any productive system
    • There should be no ‘blackout’ period, where transporting objects is not possible
  • Thus, it makes perfect sense for Dev, QA and Prod to be on the same Release Cycle
    • Content can always be transported between all environments, but most typically between Dev and QA, and Dev and Production (as shown in diagram)

Test Preview Quarterly Update Release Cycle

Test Preview

  • Only available with a Private Edition Test Service
    • i.e. not available for a Public Edition Test Service

  • This ‘Test Preview’ service receives the Quarterly Update Release, but ~1 month earlier
    • It’s a regular Test Service, but the update release cycle is the ‘Quarterly Preview’
  • Provides a unique opportunity to validate the new version with productive data and productive content
  • It is expected that ‘Test Preview’ connects to a Productive data source, even though it is classed as a Test Service. This is necessary to validate the new version with existing Productive content, both data source and SAP Analytics Cloud content (models, stories etc.)
  • A ‘Test Preview’ service, like a regular Test Service, cannot be used for productive purposes, but unlike a regular Test Service, it also cannot be used for development purposes

Life-cycle use of ‘Test Preview’ in overall landscape

  • ‘Test Preview’ introduces a new environment into the landscape, almost unique to the cloud
  • It is neither development, QA, pre-prod nor production
  • Preview should never be used for development purposes; its role is purely to validate new software with existing productive content
  • Since it is updated ~1 month ahead, for that month, you cannot transport content from it into another environment until those other environments are updated to the same version
    • i.e. you can only import into it (not export from it) during the month overlap
    • Typically you transport content from Production into Preview, but not exclusively
  • Dev content would also be transported into it at times
  • For 2 months of each quarter, it remains aligned with the Quarterly Release Cycle allowing you to easily transport content between all environments
    • Remember from a license contract point of view Test Preview can not be used for development or productive purposes

Fast Track Update Release Cycle

Fast Track

  • Updates are made ~ every 2 weeks, so about 26 times a year
  • Not provided by default and needs to be ordered specially
    • (SAP representative needs to request a ‘DevOps’ ticket to be raised prior to the order)
  • Up to 8 wave versions ahead of the Quarterly Release Cycle
    • As the version is considerably ahead, it’s tricky to transport content in and out of it

  • Transport of content from Fast Track to others requires you to ‘hold’ the content until the target is updated to the same version
  • However the content must be the same or the previous version, so the ‘export’ needs to be performed in a small time window, otherwise it is ‘blocked’!  (1) (3)
  • Transport of content into Fast Track from others is limited to only a few times a year
  • ‘Older’ content can be transported into Fast Track, but the target can only be the Quarterly Release version, otherwise it is ‘blocked’!  (2) (3)

  1. Technically, if content is transported via the ‘Content Network’ then all earlier versions (not just the current or previous) can be imported, but it is not supported.
  2. Technically, if content is transported via the ‘Content Network’ then non-QRC versions can be imported, but it is not supported.
  3. If content has been exported ‘manually’ (via Menu-Deployment-Export) then it is not even technically possible to import it. Additionally, this manual method is required if the source and target services are hosted in mixture of SAP and non-SAP data centres. If the source and target are all non-SAP (or all SAP) data centres then the ‘Content Network’ can be used to transport content, even across geographic regions

Life-cycle use of ‘Fast Track’ in overall landscape

  • Perfect for validating and testing new features that will come to the Quarterly Release later
  • Occasionally Beta features can be made available in non-test Services, allowing organisations to provide early feedback and allow for SAP to resolve issues ahead of general availability
  • Suitable for Sandbox testing only
  • Explicitly not suitable for productive or development of any content
  • Do not rely upon the ability to transport content into or out of Fast Track
  • For the Fast Track Update Release cycle:
    • Occasionally the update schedule changes in an ad-hoc fashion to cater for platform updates or other unplanned events
    • Rarely, but not completely unheard of, an entire update is skipped, so the next update jumps by two versions. It’s possible the window of opportunity for transporting content is closed for some quarters

Update Release Cycles

For SAP Analytics Cloud Services within the same update release cycle:

  • All services, public and private editions, are updated at the same time when they are hosted in the same data centre
  • Data centres are hosted all around the globe. Each having a schedule that will vary slightly from others

    • Quarterly Release Cycle schedule dates are published in SAP Note 2888562

    • Exact dates vary slightly by region and data centre host (SAP, Amazon and more are planned)

    • Some fluidity in the schedule is necessary for operational reasons, so the update schedule for Fast Track is not published

    • Data centres are complex and updates are occasionally delayed to ensure service levels and maintenance windows are not breached. Delays can be caused by urgent service updates to the underlying infrastructure

  • It's important to ensure all your SAP Analytics Cloud Services are hosted in the same Data Centre to ensure version consistency with regard to life-cycle management

Why multiple environments

Objects relate to other objects by ID

  • Objects inside SAP Analytics Cloud are related to each other by identifiers (IDs)
  • It is therefore not possible to manage the life-cycle of different objects independently of each other within the same service
  • For example, taking a copy of a story means that copy will:
    • still point to the original model (red line in diagram)
    • not point to a copy of any model taken
    • lose dependencies on all comments or bookmarks associated with the original
      • Other dependencies include SAP Digital Boardroom and discussions, but there are many more
    • have a new ‘ID’ itself, so end-user browser-based bookmarks will still point to the original story
  • Re-pointing a copied story to a copied model requires considerable manual effort and is not appropriate for life-cycle management change control
  • Multiple environments are thus mandatory for proper change and version control

What life-cycle management can be achieved within a single Service

Some life-cycle aspects can be achieved within a single Service

  • Although objects dependent upon models (like Stories or Applications) cannot be independently managed within one ‘system’, some life-cycle aspects can be managed within one SAP Analytics Cloud Service

  • This typically applies to ‘live’ data sources
  • such as Data Warehouse Cloud, HANA, BW, BPC, S/4, BI Platform etc.
  • this tends not to apply to ‘acquired’ data models, though it is still possible

  • For example, when a new version of the data source (or copied SAC model) has been created, that new version can be validated in the same SAP Analytics Cloud Service
    • For ‘live’ data sources, the step of creating a new Model is simple, quick and easy
    • Creating a new Story with new, often simple, widgets can help validate ‘changed’ aspects of the data source (or copied SAC model)
    • It thus helps validate changes made to the next data source version
    • ‘live’ data sources are typically hosted in other cloud services or on-premise systems
  • It would not validate the existing SAP Analytics Cloud content within that same Service, but it does provide a level of validation and an opportunity to re-develop or adjust the model/data source design
  • Thus, albeit in a limited way, two environments can be supported within a single SAP Analytics Cloud Service, at least for the data source

Simplifying the landscape

  • Some organisations have 4 systems in their on-premise landscape; however, mirroring this setup in the Cloud is typically undesired as the costs are more obvious
  • The option to validate data source changes within one SAP Analytics Cloud service is available
  • This means 4 data source systems can be managed with 3 SAP Analytics Cloud Services
    • Sandbox could be removed for this purpose - still may need it for another!
      • See Fast Track Services

  • Development validates new data source (and SAC model) changes by creating new story widgets that are discarded soon afterwards
  • Once data source changes have been validated, the changes can then be made to the data source that supports all the existing content dependent upon it
  • This enables the development and validation of SAP Analytics Cloud content independently of the ‘next’ data source version that may be in development

Typical landscape options chosen

  • Arrows indicate most common path of adoption
  • First choice is typically Option 1
    • Useful for initial evaluation
  • Next is to add a Dev environment, Option 2
    • Ideal for developing new content and starting to use the transport services available
  • ‘Test Preview’ often follows, Option 4
    • Most customers find a need to validate new upcoming features with productive content
  • Sandbox options are typical for customers wishing to try out features well ahead of general availability. Having validated features ahead of time reduces the risk to the project
  • QA environments are not so common
  • Update Release Cycles are as shown
    • Sandbox and Preview are not on QRC
    • Dev, QA and Prod are on QRC
  • Non-productive environments only need a handful of users, perhaps 10 users or fewer
    • You will need at least as many users as developers/testers, but this doesn’t necessarily need to be a significant investment
  • Due to license terms on Test Services, Preview has a minimum number of users: 20 for Public Editions and 50 for Private Editions

  • Typically, only the ‘Test Preview’ Service is using a ‘Test Service’, all others are non-Test Services including Sandbox, Dev, QA

  • Private editions are recommended for large enterprise deployments, commonly for production

Typical landscape connectivity (for ‘live’ data sources)

  • It's very important to validate content against production size and quality data
    • Production data sources are accessed by Prod, but also Preview and occasionally Sandbox
    • Preview requires access to Prod data source to validate content with the upcoming wave version
      • Even though Preview is a ‘Test’ service, it can still connect to Prod data
    • Use a copy of Production data for QA purposes where possible
  • In order to switch the connection to use the next data source environment, access to that next data source is required from the current environment (hence the dotted line from Dev to QA and Prod)
    • Switch the connection before transporting new model versions
      • Since there is no concept of ‘connection mappings’ or ‘dynamic connection switching’ currently
    • Switch the connection back after creating the transport unit

Respect the ID and Best Practices for updating objects

Updating objects is performed by ID, not by name

  • When objects are transported from one environment to another, upon updating the target, objects are matched on the ID
    • If the IDs match, the object can be updated
    • If there is no match by ID, the object will be new
  • Can lead to multiple objects with the same name in the target as shown in this workflow
  • It is impossible to create two new objects with the same ID, even across different environments
    • An object, when transported from one environment to another, maintains its ID (and its Content Namespace of where it was first created)

  • In order to update an object created in Production with a new version in Dev, it must first be transported to Dev (Step 2)
  • The object can then be updated to the new version (Step 3) before being transported back to Prod (Step 4)
    • Typically it should go to QA first for testing!
  • Since the ‘ID’ has been respected, the object is matched by ID and is correctly updated
    • There is no duplicated object name in the target
    • All dependencies are also respected
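The matching rule above can be sketched in code. A minimal illustration with hypothetical data structures (this is not the actual SAC implementation): the target Service keys its objects by ID, and an incoming transported object either updates a matching ID or is created as new — the name plays no part in matching.

```python
def import_object(target_objects: dict, incoming: dict) -> str:
    """Sketch of the import matching rule: objects are matched on ID, not name.

    target_objects maps object ID -> object dict in the target Service.
    incoming is a transported object, e.g. {"id": "...", "name": "..."}.
    """
    if incoming["id"] in target_objects:
        target_objects[incoming["id"]].update(incoming)  # ID match: update in place
        return "updated"
    target_objects[incoming["id"]] = dict(incoming)      # no ID match: object is new
    return "created"

# A 'copy' or 'save as' gets a fresh ID, so importing it would create a
# duplicate-named object in the target instead of updating the original.
```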

Best Practices for updating objects

  • Only create objects once and then transport them to other environments
  • Avoid ‘copying’ objects or ‘save as’ as this creates a new object with a new ID
  • Create objects with a name that doesn’t refer to its current environment
    • i.e. avoid “Test Finance”, instead use “Finance”
    • Applicable for all object types: folders, stories, teams, roles, models etc.

Content Namespaces and Best Practices

Changing the default

  • Each SAP Analytics Cloud Service has a Content Namespace
  • Its default format is a simple [Character].[Character]
    • Default examples include: t.2, t.0, t.O
  • You can change the Content Namespace to almost anything you like
    • e.g. ‘MyDevelopment’

  • When any content is created, it keeps the Content Namespace that Service had at the time of its creation
    • Like an object’s ID, you cannot change the Content Namespace of an object
  • This also means, that when content is transported, it keeps that Content Namespace with it
    • As shown above, the model and story, when transported into QA, maintain the Content Namespace of Dev
  • It's likely any one Service will contain content with different Content Namespaces
    • For example, the Samples or Business Content imported from SAP or Partners will have different Content Namespaces

Pros and cons of consistent values

  • Benefit of different Content Namespaces across environments:
    • You can identify in which environment the object was originally created
    • Though the namespace is hidden from end-users, you can see the namespace in logs and browser consoles

  • Benefits of the same and consistent Content Namespace across all environments:
    • Easier coding of Service APIs (REST, URL, SCIM)
      • These APIs often refer or include the Content Namespace, so a consistent value means slightly easier coding
    • Reduces project risk, typically when using APIs
      • There are occasional issues when multiple Content Namespaces are used, whilst SAP will fix these, it might be better to avoid any surprises by using a consistent value
  • Known issues when changing Content Namespaces
    • Teams with Custom Identity Provider
      • Once changed, new objects will have new Content Namespaces and this includes ‘Teams’
      • This also means the SCIM API used by Custom Identity Providers, if utilising the ‘Team’ attribute mapping could fail when Teams are using a mixture of Content Namespace values (KBA 2901506 internal reference FPA60-2979). A workaround is possible, but better to avoid until a full solution is available
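To illustrate why a consistent namespace eases API scripting, here is a hypothetical sketch — the `namespace:name` identifier shape and the `qualified_team_id` helper are illustrative assumptions only, not the documented SCIM/REST identifier format:

```python
# Hypothetical sketch: a consistent Content Namespace lets scripts hard-code
# a single prefix; mixed namespaces would force a per-object lookup instead.
NAMESPACE = "t.0"  # one consistent value across Dev, QA and Prod

def qualified_team_id(team_name: str, namespace: str = NAMESPACE) -> str:
    # Illustrative identifier shape only; consult the API docs for the real format
    return f"{namespace}:{team_name}"

print(qualified_team_id("Finance"))
```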

Best Practices

  • Do set the Content Namespace to be the same and consistent value across all environments
  • Change the Content Namespace as one of the first things you do when setting up your new Service
    • Do this before setting up any custom Identity Provider
    • Do this before creating any Teams
  • Set the value with a small number of characters and keep the format the same (e.g. T.0)
    • Lengthy values (e.g. MyWonderfulDevelopmentSystem) cause extra payload in communications, and if the number of objects referred to in a call is large, this could have a performance impact
    • No need for a lengthy name, just keep it short and simple!
  • Do NOT change the Content Namespace if you already have a working productive landscape
    • Especially if you are using a Custom Identity Provider and mapping user attributes to Teams
    • Don’t fix something that ain’t broke

Transporting Objects, Best Practice and Recommendations

There are 2 options for transporting content:

  • Option 1: (legacy) manual ‘Deployment-Export/Import’
  • Option 2: Content Network

Option 1: (legacy) ‘Deployment-Export/Import’

  • Unlike Content Network:
    • Can be used for transporting between a mixture of SAP and non-SAP data centres (between NEO and CF)
    • Sharing settings are included
  • Requires manual management of ‘Unit’ files: downloading from source and uploading to the target
  • Has limitations on file size

Option 2: Content Network

  • ‘Unit’ files are hosted in the Cloud
    • Can be organised into folders including with folder security options
    • Processing occurs in the background
    • No need for any manual download/upload
  • Unlike (legacy) manual deployment option:
    • Sharing settings are not included
    • Supports a greater number of object types
  • Will only show Units that can be imported into the Service
    • Units created by newer versions of the Service will not be shown (i.e. Units created on Fast Track Release Cycle, but the current Service has yet to be updated to that version)
    • Older units that are not supported can still be imported, but a warning message is shown
      • Fully supported units are when the version is the same, the previous or the previous Quarter Release
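The support rule above can be expressed as a small check. A sketch with hypothetical wave-number strings (the helper and its parameters are illustrative, not an SAC API):

```python
def unit_fully_supported(unit_wave: str, target_wave: str,
                         previous_wave: str, previous_qrc_wave: str) -> bool:
    """Fully supported: the unit was created on the target's current wave,
    the wave immediately before it, or the previous Quarterly Release wave.
    Older units may still import via the Content Network, but with a warning."""
    return unit_wave in {target_wave, previous_wave, previous_qrc_wave}
```

Units created on a newer wave than the target (e.g. on the Fast Track Release Cycle) are simply not shown for import until the target catches up.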

  • It is best practice and recommended to use the Content Network for transporting content between environments whenever possible

Best Practice Summary

Key take-aways

  • Use multiple Services to life-cycle manage content
  • Do keep Dev, QA and Prod on the same update release cycle
  • Always validate content against production quantity and quality of data
  • Use the Test Preview to validate new wave versions with existing Production content
  • Use Fast Track to reduce risk to the project especially when success is dependent upon upcoming features
  • Respect the ID of objects and create content once, then transport it to other environments
  • Change the Content Namespace only at initial setup and set all Services to the same value
  • Use the Content Network to transport objects between Services

Frequently Asked Questions

Question: What is the best practice to manage the life-cycle of content with just one SAP Analytics Cloud Service?

  • Answer: SAP Analytics Cloud has been designed so that content life-cycle management requires multiple SAP Analytics Cloud Services. This means a single SAP Analytics Cloud Service cannot manage the life-cycle of content on its own.
  • The service provides various tools by which content can be transported from one SAP Analytics Cloud Service to another and these tools are constantly being developed and improved
  • It is explicitly recommended not to attempt to manage the content life-cycle (of SAP Analytics Cloud content) within a single SAP Analytics Cloud Service

In general, please post your comments to the blog post that introduces this wiki page rather than directly here. Thank you



  1. I have a question on best practice for setting up live connections in a multiple SAC tenant landscape with regard to the life-cycle management process: we have 3 SAC tenants DEV-QAS-PROD. We created user stories and models on top of DEV data sources, and we have set up 2 live connections on the DEV tenant: a SAP BW live connection and a SAP BPC live connection, both to on-premise SAP DEV data sources; we called them something like DEVBPC and DEVBW. Now we want to transport the user stories with the models built on the DEV connections to the QAS tenant. Once we have transported the user stories to QAS, we want the models to be connected to QASBPC and QASBW automatically, not DEVBPC and DEVBW. We don't want to create DEVBPC and DEVBW in the QAS tenant as it is QAS; we don't want DEV connections set up in that tenant. When we transport to the PROD tenant, same thing => transport user stories and models, but models should point to PRDBPC and PRDBW; we don't want DEV and QAS connections in PROD. What is the best practice to achieve this?

    1. Hi Angelique,

      Best practice is to create the connections on your dev tenant and call them simply BPC and BW. You use the export/import feature to import these connections into your qas and prod tenants as well, but there you simply change the connection details. If you now export and import a model, the connection object ID will be the same but it will connect to a different system on each tenant.

      Kind regards,

      Martijn van Foeken | Interdobs