Registration

Dear SAP Community Member,
In order to fully benefit from what the SAP Community has to offer, please register at:
http://scn.sap.com
Thank you,
The SAP Community team.
Skip to end of metadata
Go to start of metadata

Author: Alexey Arseniev
Submitted: 20.11.2013
Other code samples from me:

Why

There are a lot of other implementations of the ABAP to JSON Serializer and Deserializer in SDN, but for different reasons, all implementations I have found were not suitable to my needs. From SAP_BASIS 7.40 there is also simple transformation available for converting ABAP to JSON and JSON to ABAP. It is the best choice if you need maximal performance and does not care about serialization format, but for proper handling of ABAP types and name pretty printing, it fits bad. 

So, I have written my own JSON serializer/deserializer which has some key differentiates from other implementation.

Below you can find a snippet of the class I wrote, that you can use as a local class or global renamed.

An original and actual version of the source can be found in class /UI2/CL_JSON delivered with UI2 Add-on (can be applied on SAP_BASIS 700 – 76X).

What it can

ABAP to JSON

  • Serialize classes, structures, internal tables, class and data references, any kind of elementary types. Complex types, as a table of structures/classes, classes with complex attributes etc. are also supported and recursively processed.
  • ABAP to JavaScript adopted way of data type serializations:
    • strings, character types to JavaScript string format,
    • ABAP_BOOL / BOOLEAN / XFELD / BOOLE_D to JavaScript Boolean,
    • Built in TRIBOOL (TRUE/FALSE/UNDEFINED = 'X'/'-'/'') support, for better control of initial values when serializing into JavaScript  Boolean
    • int/floats/numeric/packed to JavaScript Integers/floats,
    • date/time to JavaScript date/time string representation as "2015-03-24" or "15:30:48",
    • timestamp to JavaScript  integer or to ISO8601 string
    • structures to JavaScript objects (include types are also supported; aliases => AS are ignored)
    • internal tables to JavaScript arrays or associative arrays (objects)
  • Pretty Printing of JavaScript property names: MY_DATA -> myData, /SAPAPO/MY_DATA -> sapapoMyData.
  • Condensing of default values: initial values are not rendered into resulting JSON string
  • Performance is optimized for processing big internal tables with structures

JSON to ABAP

  • Deserialize JSON objects, arrays and any elementary types into corresponding ABAP structures. Complex objects, with embedded arrays and objects with any level of nesting, are also supported.
  • Generic deserialization of JSON objects into reference data types: 
    • as simple data types (integer, boolean or string into generic data reference (REF TO DATA) -> ABAP type is selected based on JSON type.
    • as dynamically generated complex object (structures, tables, mixed) for initial REF TO DATA fields
    • as typed references for prefilled REF TO DATA fields (you assign a reference to typed empty data object to REF TO DATA field in execution time)
  • Deserialization of unknown JSON structures possible using method GENERATE into on the fly created data types
  • On JSON to ABAP transformation following rules are used:
    • objects parsed into corresponding ABAP structures, classes (only classes with constructors with no obligatory parameters are supported) or to internal hash/sorted tables
    • arrays converted to internal tables (complex tables are also supported). 
    • Boolean converted as ABAP_BOOL (‘’ or ‘X’)
    • Date/Time/Timestamps from JSON converted based on type of corresponding ABAP element
    • integers/floats/strings moved to corresponding fields using ABAP move semantic (strings are un-escaped)
    • elementary data types of are converted if do not match: JavaScript integer can come into ABAP string or JavaScript string into ABAP integer and etc.
    • Transformation takes into account property naming guidelines for JSON and ABAP so that camelCase names will be copied into corresponding CAMEL_CASE field if CAMELCASE field is not found in ABAP structure. Do not forget to use the same PRETTY_MODE for deserialization, as you have used for serialization.
    • Default field values, specified in reference ABAP variable are preserved, and not overwritten in not found in JSON object
    • Transformation of JSON structures into ABAP class instances is NOT supported.
  • Deserializer uses single-pass parsing and optimized to provide best possible performance. But for time critical applications, which shall run on SAP_BASIS 7.02 and higher it is recommended to use built-in JSON to ABAP transformations (CALL TRANSFORMATION).

Usage example

ABAP to JSON usage example
DATA: lt_flight TYPE STANDARD TABLE OF sflight,
      lrf_descr TYPE REF TO cl_abap_typedescr,
      lv_json   TYPE string.

 
SELECT * FROM sflight INTO TABLE lt_flight.
 
* serialize table lt_flight into JSON, skipping initial fields and converting ABAP field names into camelCase
lv_json = /ui2/cl_json=>serialize( data = lt_flight compress = abap_true pretty_name = /ui2/cl_json=>pretty_mode-camel_case ).
WRITE / lv_json.

CLEAR lt_flight.
 
* deserialize JSON string json into internal table lt_flight doing camelCase to ABAP like field name mapping
/ui2/cl_json=>deserialize( EXPORTING json = lv_json pretty_name = /ui2/cl_json=>pretty_mode-camel_case CHANGING data = lt_flight ).

* serialize ABAP object into JSON string
lrf_descr = cl_abap_typedescr=>describe_by_data( lt_flight ).
lv_json = /ui2/cl_json=>serialize( lrf_descr ).
WRITE / lv_json.

Output

[{"mandt":"000","carrid":"AA","connid":0017,"fldate":20130515,"price":422.94,"currency":"USD","planetype":"747-400","seatsmax":385,
"seatsocc":375,"paymentsum":192683.30,"seatsmaxB":31,"seatsoccB":31,"seatsmaxF":21,"seatsoccF":19},{"mandt":"000","carrid":"AA",
"connid....
{"ABSOLUTE_NAME":"\\TYPE=%_T00004S00000000O0000012480","ADMIN_TAB":"\\TYPE=%_T00004S00000000O0000012480",
"ADMIN_TAB_LINE":"\\TYPE=%_T00004S00000000O0000012480","DECIMALS":0,"HAS_UNIQUE_KEY":false,"INITIAL_SIZE":0,
"KEY":[{"NAME":"MANDT"},{"NAME":"CARRID"},{"NAME":"CO....

API description

There are two static methods that are of most interest in common cases: SERIALIZE and DESERIALIZE. The rest of public methods are done public only for reuse purpose if you would like to build/extend your own serialization/deserialization code. 

SERIALIZE: Serialize ABAP object into JSON

  • > DATA (any) - any ABAP object/structure/table/element to be serialized
  • > COMPRESS (bool, default=false) - tells serializer to skip empty elements/objects during serialization. So, all for which IS INITIAL = TRUE. 
  • > NAME (string, optional) - optional name of the serialized object. Will '"name" : {...}' instead of ' {...} ' if supplied. 
  • > PRETTY_NAME (enum, optional)  - mode, controlling how ABAP field names transformed in JSON attribute names. See description below.  
  • > TYPE_DESCR (ref to CL_ABAP_TYPEDESCR, optional) - if you know object type already - pass it to improve performance. 
  • > ASSOC_ARRAYS (bool, default = false) - controls how to serialize hash or sorted tables with unique keys. See below for details.
  • > ASSOC_ARRAYS_OPT (bool, default = false) - when set, serializer will optimize rendering of name-value associated arrays (hash maps) in JSON
  • > TS_AS_ISO8601 (bool, default = false) - says serializer to output timestamps using ISO8601 format.
  • > NUMC_AS_STRING (bool, default = false) - Controls the way how NUMC fields are serialzed. If set to ABAP_TRUE, NUMC fields serialized not as integers, but as strings, with all leading zeroes. Deserialization works compatibly with both ways of NUMC serialized data.
  • < R_JSON - output JSON string.

DESERIALIZE: Deserialize ABAP object from JSON string

  • > JSON (string) - input JSON object string to deserialize
  • > PRETTY_NAME (enum, optional) - mode, controlling how JSON field names mapped to ABAP component names. See description below.  
  • > ASSOC_ARRAYS (bool, default = false) -  controls how to deserialize JSON objects into hash or sorted tables with unique keys. See below for details.
  • > ASSOC_ARRAYS_OPT (bool, default = false) - when set, the deserializer will take into account optimized rendering of associated arrays (properties) in JSON. 
  • > TS_AS_ISO8601 (bool, default = false) - says deserializer to read timestamps from strings into timestamps fields using ISO8601 format.
  • <> DATA (any) - ABAP object/structure/table/element to be filled from JSON string. If ABAP structure contains more fields than in JSON object, a content of unmatched fields is preserved.

GENERATE: Generates ABAP object from JSON

  • > JSON (string) - input JSON object string to deserialize
  • > PRETTY_NAME (enum, optional) - mode, controlling how JSON field names mapped to ABAP component names. See description below.  
  • < RR_DATA (REF TO DATA) - reference to ABAP structure/table dynamically generated from JSON string.

In addition to explained methods, there are two options, that need wider explanation:

PRETTY_NAME : enumeration of modes, defined as constant /UI2/CL_JSON=>pretty_name.

  • NONE - ABAP component names serialized as is (UPPERCASE).
  • LOW_CASE - ABAP component names serialized in low case 
  • CAMEL_CASE - ABAP component types serialized in CamelCase where symbol "_" treated as word separator (and removed from resulting name). 
  • EXTENDED - works the same way as CAMEL_CASE but in addition, replace "__" (double underscore) by "_" (single underscore) and "___" (triple underscore) by "." (dot). 

    NONE and LOW_CASE works the same way for DESERIALIZE.

ASSOC_ARRAYS :

This option controls the way how hashed or sorted tables with unique keys serialized/deserialized. Normally, ABAP internal tables serialized into JSON arrays, but in some case, you will like to serialize them as associated arrays (JSON object) where every row of the table shall be reflected as separated property of JSON object. This can be achieved by setting ASSOC_ARRAYS parameter to TRUE. If set, serializer checks for sorted/hashed tables with a UNIQUE key(s) and serialize them as an object. The JSON property name, reflecting row, constructed from values of fields, used in key separated by constant MC_KEY_SEPARATOR = '-'. If the table has only one field marked as key, the value of this single field become a property name and REMOVED from the associated object (to eliminate redundancy). If TABLE_LINE used as a unique key, all values of all fields construct key property name (separated by MC_KEY_SEPARATOR). During deserialization, logic works vice versa: if ASSOC_ARRAYS set to TRUE, and JSON object matches internal hash or sorted table with the unique key, the object is transformed into the table, where every object property reflected in separated table row. If ABAP table has only one key field, property name transformed into a value of this key field.

ASSOC_ARRAYS_OPT:

By default, when dumping hash/sorted tables with a unique key into JSON, the serializer will write key field as property name and rest of fields will write object value of properties:

Dumping of hash tables from ABAP to JSON
 TYPES: BEGIN OF ts_record,
        key TYPE string,
        value TYPE string,
       END OF ts_record.
 
DATA: lt_act TYPE SORTED TABLE OF ts_record WITH UNIQUE KEY key.
lv_json = /ui2/cl_json=>serialize( data = lt_exp assoc_arrays = abap_true ).
Output JSON
{
    "KEY1": {
        "value": "VALUE1"
    },
    "KEY2": {
        "value": "VALUE2"
    }
}

But if you will use assoc_arrays_opt flag during serialization, the serializer will try to omit unnecessary object nesting on dumping of simple, name/value tables, containing only one key field and one value field:

Dumping of hash tables from ABAP to JSON
lv_json = /ui2/cl_json=>serialize( data = lt_exp assoc_arrays = abap_true assoc_arrays_opt = abap_true ).
Output JSON
{
    "KEY1": "VALUE1",
    "KEY2": "VALUE2"
}

For deserialization, the flag is used to tell the deserializer that value shall be placed in a non-key field of the structure.

Supported SAP_BASIS releases

The code was tested from SAP_BASIS 7.00 and higher, but I do not see reasons why it cannot be downported on lower releases too. But if you plan to use it on SAP_BASIS 7.02 and higher (and do not need property name pretty printing) better consider the standard solution for ABAP, using CALL TRANSFORMATION. It shall be definitely faster, while implemented in the kernel. See blog of Horst Keller for details. Maybe the best will be, if you need support in lower SAP_BASIS releases as well as in 7.02 and higher, to modified provided a class in a way to generate same JSON format as standard ABAP CALL TRANSFORMATION for JSON does and redirect flow to home-made code or built-in ABAP transformation depending on SAP_BASIS release.

Related notes

  • 2292558 - Corrections for JSON serializer /UI2/CL_JSON
  • 2300508 - /UI2/CL_JSON deserialization of recursive structures
  • 2330592 - /UI2/CL_JSON corrections
  • 2368774 - /UI2/CL_JSON corrections - optimization for serialization of name/value tables
  • 2382783 - /UI2/CL_JSON corrections - bug fixes, supporting of firing of an exception on parsing error, helper methods for working with XSTRING output
  • 2429758 - /UI2/CL_JSON corrections - bug fixes control on serialization of NUMC types, support of on the fly ABAP data generation for unknown JSON structure
  • 2480119 - /UI2/CL_JSON corrections - fix for GENERATE method, fix for deserializing empty tables, auto-generation of initial fields with REF TO DATA, guided generation for initialized REF TO DATA fields (type of the referenced data is used)

Further optimizations

Escaping of property values can be expensive. To optimize performance, in this case, you can replace escapement code by some kernel implemented function (from cl_http_utility class for example), instead of explicit REPLACE ALL OCCURRENCES calls.

Remarks

  • Due to optimization reasons, some methods were converted to macros, to reduce overhead for calling methods for data type serialization. If performance in your case is not critical, and you prefer clean/debuggable code you can replace macro calls by corresponding methods.

The /UI2/CL_JSON code

Below you can find the code itself, you can use.

If want to use the class globally, I suggest to create a proxy class, in your own namespace, with a reduced interface (serialize/deserialize only) and call local copy (local class of it) of /UI2/CL_JSON. Then you can easily update to new version of /UI2/CL_JSON from SDN or call UI Addon implementation if it is installed.

/UI2/CL_JSON code
*----------------------------------------------------------------------*
*       CLASS zcl_json DEFINITION
*----------------------------------------------------------------------*
CLASS zcl_json DEFINITION.

  PUBLIC SECTION.
    TYPE-POOLS abap .
    CLASS cx_sy_conversion_error DEFINITION LOAD .

    TYPES json TYPE string .
    TYPES bool TYPE char1 .
    TYPES tribool TYPE char1 .
    TYPES pretty_name_mode TYPE char1 .

    CONSTANTS:
      BEGIN OF pretty_mode,
        none       TYPE char1  VALUE ``,
        low_case   TYPE char1  VALUE `L`,
        camel_case TYPE char1  VALUE `X`,
        extended   TYPE char1  VALUE `Y`,
      END OF  pretty_mode,
      BEGIN OF c_bool,
        true  TYPE bool  VALUE `X`,
        false TYPE bool  VALUE ``,
      END OF  c_bool,
      BEGIN OF c_tribool,
        true      TYPE tribool  VALUE c_bool-true,
        false     TYPE tribool  VALUE `-`,
        undefined TYPE tribool  VALUE ``,
      END OF  c_tribool,
      mc_key_separator TYPE string VALUE `-` ##NO_TEXT,
      version          TYPE i VALUE 4 ##NO_TEXT.

    CLASS-DATA sv_white_space TYPE string READ-ONLY .
    CLASS-DATA mc_bool_types TYPE string READ-ONLY VALUE `\TYPE-POOL=ABAP\TYPE=ABAP_BOOL\TYPE=BOOLEAN\TYPE=BOOLE_D\TYPE=XFELD` ##NO_TEXT.
    CLASS-DATA mc_bool_3state TYPE string READ-ONLY VALUE `\TYPE=BOOLEAN` ##NO_TEXT.
    CLASS-DATA mc_json_type TYPE string READ-ONLY .

    CLASS-METHODS class_constructor .
    CLASS-METHODS string_to_xstring IMPORTING in TYPE string CHANGING VALUE(out) TYPE any .
    CLASS-METHODS xstring_to_string IMPORTING in TYPE any RETURNING VALUE(out) TYPE string .
    CLASS-METHODS raw_to_string IMPORTING iv_xstring TYPE xstring iv_encoding TYPE abap_encoding OPTIONAL RETURNING VALUE(rv_string) TYPE string .
    CLASS-METHODS string_to_raw IMPORTING iv_string TYPE string iv_encoding TYPE abap_encoding OPTIONAL RETURNING VALUE(rv_xstring) TYPE xstring .
    CLASS-METHODS bool_to_tribool IMPORTING iv_bool TYPE bool RETURNING VALUE(rv_tribool) TYPE tribool .
    CLASS-METHODS tribool_to_bool IMPORTING iv_tribool TYPE tribool RETURNING VALUE(rv_bool) TYPE bool .

    CLASS-METHODS deserialize
      IMPORTING
        json             TYPE json OPTIONAL
        jsonx            TYPE xstring OPTIONAL
        pretty_name      TYPE pretty_name_mode DEFAULT pretty_mode-none
        assoc_arrays     TYPE bool DEFAULT c_bool-false
        assoc_arrays_opt TYPE bool DEFAULT c_bool-false
      CHANGING
        data             TYPE data .
    CLASS-METHODS serialize
      IMPORTING
        data             TYPE data
        compress         TYPE bool DEFAULT c_bool-false
        name             TYPE string OPTIONAL
        pretty_name      TYPE pretty_name_mode DEFAULT pretty_mode-none
        type_descr       TYPE REF TO cl_abap_typedescr OPTIONAL
        assoc_arrays     TYPE bool DEFAULT c_bool-false
        ts_as_iso8601    TYPE bool DEFAULT c_bool-false
        expand_includes  TYPE bool DEFAULT c_bool-true
        assoc_arrays_opt TYPE bool DEFAULT c_bool-false
        numc_as_string   TYPE bool DEFAULT c_bool-false
      RETURNING
        VALUE(r_json)    TYPE json .
    CLASS-METHODS generate
      IMPORTING
        json           TYPE json
        pretty_name    TYPE pretty_name_mode DEFAULT pretty_mode-none
      RETURNING
        VALUE(rr_data) TYPE REF TO data .
    METHODS deserialize_int
      IMPORTING
        json  TYPE json OPTIONAL
        jsonx TYPE xstring OPTIONAL
      CHANGING
        data  TYPE data
      RAISING
        cx_sy_move_cast_error .
    METHODS generate_int
      IMPORTING
        json           TYPE json
      RETURNING
        VALUE(rr_data) TYPE REF TO data .
    METHODS serialize_int
      IMPORTING
        data          TYPE data
        name          TYPE string OPTIONAL
        type_descr    TYPE REF TO cl_abap_typedescr OPTIONAL
      RETURNING
        VALUE(r_json) TYPE json .
    METHODS constructor
      IMPORTING
        compress         TYPE bool DEFAULT c_bool-false
        pretty_name      TYPE pretty_name_mode DEFAULT pretty_mode-none
        assoc_arrays     TYPE bool DEFAULT c_bool-false
        ts_as_iso8601    TYPE bool DEFAULT c_bool-false
        expand_includes  TYPE bool DEFAULT c_bool-true
        assoc_arrays_opt TYPE bool DEFAULT c_bool-false
        strict_mode      TYPE bool DEFAULT c_bool-false
        numc_as_string   TYPE bool DEFAULT c_bool-false .

  PROTECTED SECTION.

    TYPES:
      BEGIN OF pretty_name_pair,
        in  TYPE string,
        out TYPE string,
      END OF pretty_name_pair,
      BEGIN OF t_s_symbol,
        header       TYPE string,
        name         TYPE string,
        type         TYPE REF TO cl_abap_datadescr,
        value        TYPE REF TO data,
        compressable TYPE abap_bool,
        read_only    TYPE abap_bool,
      END OF t_s_symbol,
      t_t_symbol TYPE STANDARD TABLE OF t_s_symbol WITH DEFAULT KEY,
      BEGIN OF t_s_field_cache,
        name  TYPE string,
        type  TYPE REF TO cl_abap_datadescr,
        value TYPE REF TO data,
      END OF t_s_field_cache,
      t_t_field_cache TYPE HASHED TABLE OF t_s_field_cache WITH UNIQUE KEY name.

    DATA mv_compress TYPE bool .
    DATA mv_pretty_name TYPE pretty_name_mode .
    DATA mv_assoc_arrays TYPE bool .
    DATA mv_ts_as_iso8601 TYPE bool .
    DATA mt_cache_pretty TYPE HASHED TABLE OF pretty_name_pair WITH UNIQUE KEY in .
    DATA mv_expand_includes TYPE bool .
    DATA mv_assoc_arrays_opt TYPE bool .
    DATA mv_strict_mode TYPE bool .
    DATA mv_numc_as_string TYPE bool .

    METHODS dump_symbols FINAL IMPORTING it_symbols TYPE t_t_symbol RETURNING VALUE(r_json) TYPE json .
    METHODS get_symbols FINAL
      IMPORTING
        type_descr      TYPE REF TO cl_abap_typedescr
        data            TYPE REF TO data OPTIONAL
        object          TYPE REF TO object OPTIONAL
        include_aliases TYPE abap_bool DEFAULT abap_false
      RETURNING
        VALUE(result)   TYPE t_t_symbol .
    METHODS get_fields
          FINAL
      IMPORTING
        type_descr       TYPE REF TO cl_abap_typedescr
        data             TYPE REF TO data OPTIONAL
        object           TYPE REF TO object OPTIONAL
      RETURNING
        VALUE(rt_fields) TYPE t_t_field_cache .
    METHODS dump_int FINAL
      IMPORTING
        data          TYPE data
        type_descr    TYPE REF TO cl_abap_typedescr OPTIONAL
      RETURNING
        VALUE(r_json) TYPE json .
    METHODS is_compressable
      IMPORTING
        type_descr         TYPE REF TO cl_abap_typedescr
        name               TYPE csequence
      RETURNING
        VALUE(rv_compress) TYPE abap_bool .
    METHODS restore FINAL
      IMPORTING
        json              TYPE json
        length            TYPE i
        VALUE(type_descr) TYPE REF TO cl_abap_typedescr OPTIONAL
        field_cache       TYPE t_t_field_cache OPTIONAL
      CHANGING
        data              TYPE data OPTIONAL
        offset            TYPE i DEFAULT 0
      RAISING
        cx_sy_move_cast_error .
    METHODS restore_type FINAL
      IMPORTING
        json              TYPE json
        length            TYPE i
        VALUE(type_descr) TYPE REF TO cl_abap_typedescr OPTIONAL
        field_cache       TYPE t_t_field_cache OPTIONAL
      CHANGING
        data              TYPE data OPTIONAL
        offset            TYPE i DEFAULT 0
      RAISING
        cx_sy_move_cast_error .
    METHODS pretty_name_ex
      IMPORTING
        in         TYPE csequence
      RETURNING
        VALUE(out) TYPE string .
    METHODS generate_int_ex FINAL
      IMPORTING
        json   TYPE json
        length TYPE i
      CHANGING
        data   TYPE data
        offset TYPE i .
    METHODS pretty_name
      IMPORTING
        in         TYPE csequence
      RETURNING
        VALUE(out) TYPE string .
ENDCLASS.

*----------------------------------------------------------------------*
*       CLASS zcl_json MACROS
*----------------------------------------------------------------------*

DEFINE escape_json_inplace.
  replace all occurrences of `\` in &1 with `\\`.
  replace all occurrences of `"` in &1 with `\"`.
END-OF-DEFINITION.

DEFINE escape_json.
  move &1 to &2.
  escape_json_inplace &2.
END-OF-DEFINITION.

DEFINE dump_type.

  case &2->type_kind.
    when cl_abap_typedescr=>typekind_float or cl_abap_typedescr=>typekind_int or cl_abap_typedescr=>typekind_int1 or
         cl_abap_typedescr=>typekind_int2 or cl_abap_typedescr=>typekind_packed or `8`. " TYPEKIND_INT8 -> '8' only from 7.40.
      if &2->type_kind eq cl_abap_typedescr=>typekind_packed and mv_ts_as_iso8601 eq c_bool-true and &2->absolute_name cp `\TYPE=TIMESTAMP*`.
        if &1 is initial.
          &3 = `""`.
        else.
          move &1 to &3.
          if &2->absolute_name eq `\TYPE=TIMESTAMP`.
            concatenate `"` &3(4) `-` &3+4(2) `-` &3+6(2) `T` &3+8(2) `:` &3+10(2) `:` &3+12(2) `.0000000Z"`  into &3.
          elseif &2->absolute_name eq `\TYPE=TIMESTAMPL`.
            concatenate `"` &3(4) `-` &3+4(2) `-` &3+6(2) `T` &3+8(2) `:` &3+10(2) `:` &3+12(2) `.` &3+15(7) `Z"`  into &3.
          endif.
        endif.
      elseif &1 is initial.
        &3 = `0`.
      else.
        move &1 to &3.
        if &1 lt 0.
          if &2->type_kind <> cl_abap_typedescr=>typekind_float. "float: sign is already at the beginning
            shift &3 right circular.
          endif.
        else.
          condense &3.
        endif.
      endif.
    when cl_abap_typedescr=>typekind_num.
      if mv_numc_as_string eq abap_true.
        if &1 is initial.
          &3 = `""`.
        else.
          concatenate `"` &1 `"` into &3.
        endif.
      else.
        if &1 is initial.
        &3 = `0`.
      else.
        move &1 to &3.
        shift &3 left deleting leading ` 0`.
        endif.
      endif.
    when cl_abap_typedescr=>typekind_string or cl_abap_typedescr=>typekind_csequence or cl_abap_typedescr=>typekind_clike.
      if &1 is initial.
        &3 = `""`.
      elseif &2->absolute_name eq mc_json_type.
        &3 = &1.
      else.
        escape_json &1 &3.
        concatenate `"` &3 `"` into &3.
      endif.
    when cl_abap_typedescr=>typekind_xstring or cl_abap_typedescr=>typekind_hex.
      if &1 is initial.
        &3 = `""`.
      else.
        &3 = xstring_to_string( &1 ).
        escape_json_inplace &3.
        concatenate `"` &3 `"` into &3.
      endif.
    when cl_abap_typedescr=>typekind_char.
      if &2->output_length eq 1 and mc_bool_types cs &2->absolute_name.
        if &1 eq c_bool-true.
          &3 = `true`.                                      "#EC NOTEXT
        elseif mc_bool_3state cs &2->absolute_name and &1 is initial.
          &3 = `null`.                                      "#EC NOTEXT
        else.
          &3 = `false`.                                     "#EC NOTEXT
        endif.
      else.
        escape_json &1 &3.
        concatenate `"` &3 `"` into &3.
      endif.
    when cl_abap_typedescr=>typekind_date.
      concatenate `"` &1(4) `-` &1+4(2) `-` &1+6(2) `"` into &3.
    when cl_abap_typedescr=>typekind_time.
      concatenate `"` &1(2) `:` &1+2(2) `:` &1+4(2) `"` into &3.
    when others.
      if &1 is initial.
        &3 = `null`.                                        "#EC NOTEXT
      else.
        move &1 to &3.
      endif.
  endcase.

END-OF-DEFINITION.

DEFINE format_name.
  case &2.
    when pretty_mode-camel_case.
      &3 = pretty_name( &1 ).
    when pretty_mode-extended.
      &3 = pretty_name_ex( &1 ).
    when pretty_mode-low_case.
      &3 = &1.
      translate &3 to lower case.                         "#EC SYNTCHAR
    when others.
      &3 = &1.
  endcase.
END-OF-DEFINITION.

DEFINE throw_error.
  raise exception type cx_sy_move_cast_error.
END-OF-DEFINITION.

DEFINE while_offset_cs.
  while offset < length.
    find first occurrence of json+offset(1) in &1.
    if sy-subrc is not initial.
      exit.
    endif.
    offset = offset + 1.
  endwhile.
END-OF-DEFINITION.


DEFINE eat_white.
  while_offset_cs sv_white_space.
END-OF-DEFINITION.

DEFINE eat_string.
  if json+offset(1) eq `"`.
    mark   = offset + 1.
    offset = mark.
    unescape = abap_false.
    do.
      find first occurrence of `"` in section offset offset of json match offset pos.
      if sy-subrc is not initial.
        throw_error.
      endif.
        offset = pos.
        pos = pos - 1.
        " if escaped search further
        while pos ge 0 and json+pos(1) eq `\`.
          pos = pos - 1.
        unescape = abap_true.
      endwhile.
      match = ( offset - pos ) mod 2.
      if match ne 0.
        exit.
      endif.
      offset = offset + 1.
    enddo.
    match = offset - mark.
    &1 = json+mark(match).
    if unescape eq abap_true.
      replace all occurrences of `\"` in &1 with `"`.
    endif.
    " \ shall be unescaped always, while we do not have check for that
    replace all occurrences of `\\` in &1 with `\`.
    offset = offset + 1.
  else.
    throw_error.
  endif.
END-OF-DEFINITION.

DEFINE eat_number.
  mark   = offset.
  while_offset_cs `0123456789+-eE.`.                        "#EC NOTEXT
  match = offset - mark.
  &1 = json+mark(match).
END-OF-DEFINITION.

DEFINE eat_bool.
  mark   = offset.
  while_offset_cs `aeflnrstu`.                              "#EC NOTEXT
  match = offset - mark.
  if json+mark(match) eq `true`.                            "#EC NOTEXT
    &1 = c_bool-true.
  elseif json+mark(match) eq `false`.                       "#EC NOTEXT
    if type_descr is bound and mc_bool_3state cs type_descr->absolute_name.
      &1 = c_tribool-false.
    else.
      &1 = c_bool-false.
    endif.
  elseif json+mark(match) eq `null`.                        "#EC NOTEXT
    clear &1.
  endif.
END-OF-DEFINITION.

DEFINE eat_char.
  if offset < length and json+offset(1) eq &1.
    offset = offset + 1.
  else.
    throw_error.
  endif.
END-OF-DEFINITION.

*----------------------------------------------------------------------*
*       CLASS zcl_json IMPLEMENTATION
*----------------------------------------------------------------------*

CLASS zcl_json IMPLEMENTATION.

  METHOD class_constructor.

    DATA: lo_bool_type_descr    TYPE REF TO cl_abap_typedescr,
          lo_tribool_type_descr TYPE REF TO cl_abap_typedescr,
          lo_json_type_descr    TYPE REF TO cl_abap_typedescr,
          lv_json_string        TYPE json.

    lo_bool_type_descr    = cl_abap_typedescr=>describe_by_data( c_bool-true ).
    lo_tribool_type_descr = cl_abap_typedescr=>describe_by_data( c_tribool-true ).
    lo_json_type_descr    = cl_abap_typedescr=>describe_by_data( lv_json_string ).

    CONCATENATE mc_bool_types lo_bool_type_descr->absolute_name lo_tribool_type_descr->absolute_name INTO mc_bool_types.
    CONCATENATE mc_bool_3state lo_tribool_type_descr->absolute_name INTO mc_bool_3state.
    CONCATENATE mc_json_type lo_json_type_descr->absolute_name INTO mc_json_type.

    sv_white_space = cl_abap_char_utilities=>get_simple_spaces_for_cur_cp( ).

  ENDMETHOD.                    "class_constructor

  METHOD constructor.
    mv_compress       = compress.
    mv_pretty_name    = pretty_name.
    mv_assoc_arrays   = assoc_arrays.
    mv_ts_as_iso8601  = ts_as_iso8601.
    mv_expand_includes  = expand_includes.
    mv_assoc_arrays_opt = assoc_arrays_opt.
    mv_strict_mode      = strict_mode.
    mv_numc_as_string   = numc_as_string.
  ENDMETHOD.

  METHOD deserialize.

    DATA: lo_json TYPE REF TO zcl_json.

    " **********************************************************************
    "! Usage examples and documentation can be found on SCN:
    " http://wiki.scn.sap.com/wiki/display/Snippets/One+more+ABAP+to+JSON+Serializer+and+Deserializer
    " **********************************************************************  "

    IF json IS NOT INITIAL OR jsonx IS NOT INITIAL.

      CREATE OBJECT lo_json
        EXPORTING
          pretty_name      = pretty_name
          assoc_arrays     = assoc_arrays
          assoc_arrays_opt = assoc_arrays_opt.

      TRY .
          lo_json->deserialize_int( EXPORTING json = json jsonx = jsonx CHANGING data = data ).
        CATCH cx_sy_move_cast_error.
      ENDTRY.

    ENDIF.

  ENDMETHOD.                    "deserialize

  METHOD deserialize_int.

    DATA: length    TYPE i,
          unescaped LIKE json.

    " **********************************************************************
    "! Usage examples and documentation can be found on SCN:
    " http://wiki.scn.sap.com/wiki/display/Snippets/One+more+ABAP+to+JSON+Serializer+and+Deserializer
    " **********************************************************************  "

    IF json IS NOT INITIAL OR jsonx IS NOT INITIAL.

      IF jsonx IS NOT INITIAL.
        unescaped = raw_to_string( jsonx ).
      ELSE.
        unescaped = json.
      ENDIF.

      " to eliminate numeric replacement calls for every single sting value, we do
      " replacement over all JSON text, while this shall not destroy JSON structure
      REPLACE ALL OCCURRENCES OF `\r\n` IN unescaped WITH cl_abap_char_utilities=>cr_lf.
      REPLACE ALL OCCURRENCES OF `\n`   IN unescaped WITH cl_abap_char_utilities=>newline.
      REPLACE ALL OCCURRENCES OF `\t`   IN unescaped WITH cl_abap_char_utilities=>horizontal_tab.

      length = numofchar( unescaped ).
      restore_type( EXPORTING json = unescaped length = length CHANGING data = data ).

    ENDIF.

  ENDMETHOD.                    "deserialize_int

  METHOD serialize.

    DATA: lo_json  TYPE REF TO zcl_json.

    CREATE OBJECT lo_json
      EXPORTING
        compress         = compress
        pretty_name      = pretty_name
        assoc_arrays     = assoc_arrays
        assoc_arrays_opt = assoc_arrays_opt
        expand_includes  = expand_includes
        numc_as_string   = numc_as_string
        ts_as_iso8601    = ts_as_iso8601.

    r_json = lo_json->serialize_int( name = name data = data type_descr = type_descr ).

  ENDMETHOD.                    "serialize
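  " Example call (illustrative sketch; ls_data stands for any serializable
  " structure, internal table, or reference):
  "
  "   DATA lv_json TYPE string.
  "   lv_json = zcl_json=>serialize( data        = ls_data
  "                                  compress    = abap_true
  "                                  pretty_name = zcl_json=>pretty_mode-camel_case ).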

  METHOD serialize_int.

    DATA: lo_descr   TYPE REF TO cl_abap_typedescr.

    IF type_descr IS INITIAL.
      lo_descr = cl_abap_typedescr=>describe_by_data( data ).
    ELSE.
      lo_descr = type_descr.
    ENDIF.

    r_json = dump_int( data = data type_descr = lo_descr ).

    " we do not escape whitespace characters in every single string value, but once
    " on the complete output, replacing many calls by only three; as we do not
    " serialize outlined/formatted JSON, this causes no harm
    REPLACE ALL OCCURRENCES OF cl_abap_char_utilities=>cr_lf          IN r_json WITH `\r\n`.
    REPLACE ALL OCCURRENCES OF cl_abap_char_utilities=>newline        IN r_json WITH `\n`.
    REPLACE ALL OCCURRENCES OF cl_abap_char_utilities=>horizontal_tab IN r_json WITH `\t`.

    IF name IS NOT INITIAL AND ( mv_compress IS INITIAL OR r_json IS NOT INITIAL ).
      CONCATENATE `"` name `":` r_json INTO r_json.
    ENDIF.

  ENDMETHOD.                    "serialize_int

  METHOD generate.

    DATA: lo_json TYPE REF TO zcl_json,
          lv_json LIKE json.

    lv_json = json.

    REPLACE ALL OCCURRENCES OF `\r\n` IN lv_json WITH cl_abap_char_utilities=>cr_lf.
    REPLACE ALL OCCURRENCES OF `\n`   IN lv_json WITH cl_abap_char_utilities=>newline.
    REPLACE ALL OCCURRENCES OF `\t`   IN lv_json WITH cl_abap_char_utilities=>horizontal_tab.

    CREATE OBJECT lo_json
      EXPORTING
        pretty_name      = pretty_name
        assoc_arrays     = c_bool-true
        assoc_arrays_opt = c_bool-true.

    TRY .
        rr_data = lo_json->generate_int( lv_json ).
      CATCH cx_sy_move_cast_error.
    ENDTRY.

  ENDMETHOD.
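  " Example call: generate builds a matching data object on the fly and returns
  " a reference to it (sketch; the JSON content is an assumption):
  "
  "   DATA lr_data TYPE REF TO data.
  "   FIELD-SYMBOLS <data> TYPE data.
  "
  "   lr_data = zcl_json=>generate( json = '{"name":"Key","value":42}' ).
  "   IF lr_data IS BOUND.
  "     ASSIGN lr_data->* TO <data>.
  "   ENDIF.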

  METHOD generate_int.

    TYPES: BEGIN OF ts_field,
             name  TYPE string,
             value TYPE json,
           END OF ts_field.

    DATA: length TYPE i,
          offset TYPE i.

    DATA: lt_json   TYPE STANDARD TABLE OF json WITH DEFAULT KEY,
          lv_json   LIKE LINE OF lt_json,
          lt_fields TYPE SORTED TABLE OF ts_field WITH UNIQUE KEY name,
          lo_type   TYPE REF TO cl_abap_datadescr,
          lt_comp   TYPE abap_component_tab,
          ls_comp   LIKE LINE OF lt_comp.

    FIELD-SYMBOLS: <data>   TYPE any,
                   <struct> TYPE any,
                   <field>  LIKE LINE OF lt_fields,
                   <table>  TYPE STANDARD TABLE.

    length = numofchar( json ).

    eat_white.

    CASE json+offset(1).
      WHEN `{`. "result must be a structure
        restore_type( EXPORTING json = json length = length CHANGING  data = lt_fields ).
        IF lt_fields IS NOT INITIAL.
          ls_comp-type = cl_abap_refdescr=>get_ref_to_data( ).
          LOOP AT lt_fields ASSIGNING <field>.
            ls_comp-name = <field>-name.
            TRANSLATE ls_comp-name USING `/_:_~_._-_`. " replace characters not allowed in component names
            IF mv_pretty_name EQ pretty_mode-camel_case OR mv_pretty_name EQ pretty_mode-extended.
              REPLACE ALL OCCURRENCES OF REGEX `([a-z])([A-Z])` IN ls_comp-name WITH `$1_$2`. "#EC NOTEXT
            ENDIF.
            APPEND ls_comp TO lt_comp.
          ENDLOOP.
          TRY.
              lo_type = cl_abap_structdescr=>create( p_components = lt_comp p_strict = c_bool-false ).
              CREATE DATA rr_data TYPE HANDLE lo_type.
              ASSIGN rr_data->* TO <struct>.
              LOOP AT lt_fields ASSIGNING <field>.
                ASSIGN COMPONENT sy-tabix OF STRUCTURE <struct> TO <data>.
                <data> = generate_int( <field>-value ).
              ENDLOOP.
            CATCH cx_sy_create_data_error cx_sy_struct_creation.
          ENDTRY.
        ENDIF.
      WHEN `[`. "result must be a table of references
        restore_type( EXPORTING json = json length = length CHANGING  data = lt_json ).
        CREATE DATA rr_data TYPE TABLE OF REF TO data.
        ASSIGN rr_data->* TO <table>.
        LOOP AT lt_json INTO lv_json.
          APPEND INITIAL LINE TO <table> ASSIGNING <data>.
          <data> = generate_int( lv_json ).
        ENDLOOP.
      WHEN OTHERS.
        IF json+offset(1) EQ `"`.
          CREATE DATA rr_data TYPE string.
        ELSEIF json+offset(1) CA `-0123456789.`.
          IF json+offset CS '.'.
            CREATE DATA rr_data TYPE f.
          ELSE.
            CREATE DATA rr_data TYPE i.
          ENDIF.
        ELSEIF json+offset EQ `true` OR json+offset EQ `false`.
          CREATE DATA rr_data TYPE abap_bool.
        ENDIF.
        IF rr_data IS BOUND.
          ASSIGN rr_data->* TO <data>.
          restore_type( EXPORTING json = json length = length CHANGING  data = <data> ).
        ENDIF.
    ENDCASE.

  ENDMETHOD.

  METHOD generate_int_ex.

    DATA: lv_assoc_arrays     LIKE mv_assoc_arrays,
          lv_assoc_arrays_opt LIKE mv_assoc_arrays_opt,
          lv_mark             LIKE offset,
          lv_match            LIKE lv_mark,
          lv_json             TYPE zcl_json=>json.

    lv_mark = offset.
    restore_type( EXPORTING json = json length = length CHANGING offset = offset ).
    lv_match = offset - lv_mark.
    lv_json = json+lv_mark(lv_match).

    lv_assoc_arrays     = mv_assoc_arrays.
    lv_assoc_arrays_opt = mv_assoc_arrays_opt.

    mv_assoc_arrays     = abap_true.
    mv_assoc_arrays_opt = abap_true.

    data = generate_int( lv_json ).

    mv_assoc_arrays = lv_assoc_arrays.
    mv_assoc_arrays_opt = lv_assoc_arrays_opt.

  ENDMETHOD.

  METHOD dump_int.

    DATA: lo_typedesc   TYPE REF TO cl_abap_typedescr,
          lo_elem_descr TYPE REF TO cl_abap_elemdescr,
          lo_classdesc  TYPE REF TO cl_abap_classdescr,
          lo_structdesc TYPE REF TO cl_abap_structdescr,
          lo_tabledescr TYPE REF TO cl_abap_tabledescr,
          lt_symbols    TYPE t_t_symbol,
          lt_keys       LIKE lt_symbols,
          lt_properties TYPE STANDARD TABLE OF string,
          lt_fields     TYPE STANDARD TABLE OF string,
          lo_obj_ref    TYPE REF TO object,
          lo_data_ref   TYPE REF TO data,
          ls_skip_key   TYPE LINE OF abap_keydescr_tab,
          lv_array_opt  TYPE abap_bool,
          lv_prop_name  TYPE string,
          lv_keyval     TYPE string,
          lv_itemval    TYPE string.

    FIELD-SYMBOLS: <line>   TYPE any,
                   <value>  TYPE any,
                   <data>   TYPE data,
                   <key>    TYPE LINE OF abap_keydescr_tab,
                   <symbol> LIKE LINE OF lt_symbols,
                   <table>  TYPE ANY TABLE.

    CASE type_descr->kind.
      WHEN cl_abap_typedescr=>kind_ref.

        IF data IS INITIAL.
          r_json = `null`.                                  "#EC NOTEXT
        ELSEIF type_descr->type_kind EQ cl_abap_typedescr=>typekind_dref.
          lo_data_ref ?= data.
          lo_typedesc = cl_abap_typedescr=>describe_by_data_ref( lo_data_ref ).
          ASSIGN lo_data_ref->* TO <data>.
          r_json = dump_int( data = <data> type_descr = lo_typedesc ).
        ELSE.
          lo_obj_ref ?= data.
          lo_classdesc ?= cl_abap_typedescr=>describe_by_object_ref( lo_obj_ref ).
          lt_symbols = get_symbols( type_descr = lo_classdesc object = lo_obj_ref ).
          r_json = dump_symbols( lt_symbols ).
        ENDIF.

      WHEN cl_abap_typedescr=>kind_elem.
        lo_elem_descr ?= type_descr.
        dump_type data lo_elem_descr r_json.

      WHEN cl_abap_typedescr=>kind_struct.

        lo_structdesc ?= type_descr.
        GET REFERENCE OF data INTO lo_data_ref.
        lt_symbols = get_symbols( type_descr = lo_structdesc data = lo_data_ref ).
        r_json = dump_symbols( lt_symbols ).

      WHEN cl_abap_typedescr=>kind_table.

        lo_tabledescr ?= type_descr.
        lo_typedesc = lo_tabledescr->get_table_line_type( ).

        ASSIGN data TO <table>.

        " optimization for structured tables
        IF lo_typedesc->kind EQ cl_abap_typedescr=>kind_struct.
          lo_structdesc ?= lo_typedesc.
          CREATE DATA lo_data_ref LIKE LINE OF <table>.
          ASSIGN lo_data_ref->* TO <line>.
          lt_symbols = get_symbols( type_descr = lo_structdesc data = lo_data_ref ).

          " here we differentiate: a standard table is rendered as a JSON array, while a
          " sorted or hashed table with a unique key becomes a JSON associative array (object)
          IF lo_tabledescr->has_unique_key IS NOT INITIAL AND mv_assoc_arrays IS NOT INITIAL.

            IF lo_tabledescr->key_defkind EQ lo_tabledescr->keydefkind_user.
              LOOP AT lo_tabledescr->key ASSIGNING <key>.
                READ TABLE lt_symbols WITH KEY name = <key>-name ASSIGNING <symbol>.
                APPEND <symbol> TO lt_keys.
              ENDLOOP.
            ENDIF.

            IF lines( lo_tabledescr->key ) EQ 1.
              READ TABLE lo_tabledescr->key INDEX 1 INTO ls_skip_key.
              DELETE lt_symbols WHERE name EQ ls_skip_key-name.
              " remove object wrapping for simple name-value tables
              IF mv_assoc_arrays_opt EQ abap_true AND lines( lt_symbols ) EQ 1.
                lv_array_opt = abap_true.
              ENDIF.
            ENDIF.

            LOOP AT <table> INTO <line>.
              CLEAR: lt_fields, lv_prop_name.
              LOOP AT lt_symbols ASSIGNING <symbol>.
                ASSIGN <symbol>-value->* TO <value>.
                IF mv_compress IS INITIAL OR <value> IS NOT INITIAL OR <symbol>-compressable EQ abap_false.
                  IF <symbol>-type->kind EQ cl_abap_typedescr=>kind_elem.
                    lo_elem_descr ?= <symbol>-type.
                    dump_type <value> lo_elem_descr lv_itemval.
                  ELSE.
                    lv_itemval = dump_int( data = <value> type_descr = <symbol>-type ).
                  ENDIF.
                  IF lv_array_opt EQ abap_false.
                    CONCATENATE <symbol>-header lv_itemval INTO lv_itemval.
                  ENDIF.
                  APPEND lv_itemval TO lt_fields.
                ENDIF.
              ENDLOOP.

              IF lo_tabledescr->key_defkind EQ lo_tabledescr->keydefkind_user.
                LOOP AT lt_keys ASSIGNING <symbol>.
                  ASSIGN <symbol>-value->* TO <value>.
                  MOVE <value> TO lv_keyval.
                  CONDENSE lv_keyval.
                  IF lv_prop_name IS NOT INITIAL.
                    CONCATENATE lv_prop_name mc_key_separator lv_keyval INTO lv_prop_name.
                  ELSE.
                    lv_prop_name = lv_keyval.
                  ENDIF.
                ENDLOOP.
              ELSE.
                LOOP AT lt_symbols ASSIGNING <symbol>.
                  ASSIGN <symbol>-value->* TO <value>.
                  MOVE <value> TO lv_keyval.
                  CONDENSE lv_keyval.
                  IF lv_prop_name IS NOT INITIAL.
                    CONCATENATE lv_prop_name mc_key_separator lv_keyval INTO lv_prop_name.
                  ELSE.
                    lv_prop_name = lv_keyval.
                  ENDIF.
                ENDLOOP.
              ENDIF.

              CONCATENATE LINES OF lt_fields INTO lv_itemval SEPARATED BY `,`.
              IF lv_array_opt EQ abap_false.
                CONCATENATE `"` lv_prop_name `":{` lv_itemval `}` INTO lv_itemval.
              ELSE.
                CONCATENATE `"` lv_prop_name `":` lv_itemval `` INTO lv_itemval.
              ENDIF.
              APPEND lv_itemval TO lt_properties.

            ENDLOOP.

            CONCATENATE LINES OF lt_properties INTO r_json SEPARATED BY `,`.
            CONCATENATE `{` r_json `}` INTO r_json.

          ELSE.

            LOOP AT <table> INTO <line>.
              CLEAR lt_fields.
              LOOP AT lt_symbols ASSIGNING <symbol>.
                ASSIGN <symbol>-value->* TO <value>.
                IF mv_compress IS INITIAL OR <value> IS NOT INITIAL OR <symbol>-compressable EQ abap_false.
                  IF <symbol>-type->kind EQ cl_abap_typedescr=>kind_elem.
                    lo_elem_descr ?= <symbol>-type.
                    dump_type <value> lo_elem_descr lv_itemval.
                  ELSE.
                    lv_itemval = dump_int( data = <value> type_descr = <symbol>-type ).
                  ENDIF.
                  CONCATENATE <symbol>-header lv_itemval INTO lv_itemval.
                  APPEND lv_itemval TO lt_fields.
                ENDIF.
              ENDLOOP.

              CONCATENATE LINES OF lt_fields INTO lv_itemval SEPARATED BY `,`.
              CONCATENATE `{` lv_itemval `}` INTO lv_itemval.
              APPEND lv_itemval TO lt_properties.
            ENDLOOP.

            CONCATENATE LINES OF lt_properties INTO r_json SEPARATED BY `,`.
            CONCATENATE `[` r_json `]` INTO r_json.

          ENDIF.
        ELSE.
          LOOP AT <table> ASSIGNING <value>.
            lv_itemval = dump_int( data = <value> type_descr = lo_typedesc ).
            APPEND lv_itemval TO lt_properties.
          ENDLOOP.

          CONCATENATE LINES OF lt_properties INTO r_json SEPARATED BY `,`.
          CONCATENATE `[` r_json `]` INTO r_json.
        ENDIF.

    ENDCASE.

  ENDMETHOD.                    "dump_int
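  " Illustration of the associative-array handling above (field and key names
  " are examples): for a SORTED TABLE of a structure (name, value) WITH UNIQUE
  " KEY name, holding the lines ('key1','a') and ('key2','b'), the output is:
  "
  "   assoc_arrays = abap_false:  [{"name":"key1","value":"a"},{"name":"key2","value":"b"}]
  "   assoc_arrays = abap_true:   {"key1":{"value":"a"},"key2":{"value":"b"}}
  "   ... plus assoc_arrays_opt:  {"key1":"a","key2":"b"}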

  METHOD dump_symbols.

    DATA: lv_properties TYPE STANDARD TABLE OF string,
          lv_itemval    TYPE string.

    FIELD-SYMBOLS: <value>  TYPE any,
                   <symbol> LIKE LINE OF it_symbols.

    LOOP AT it_symbols ASSIGNING <symbol>.
      ASSIGN <symbol>-value->* TO <value>.
      IF mv_compress IS INITIAL OR <value> IS NOT INITIAL OR <symbol>-compressable EQ abap_false.
        lv_itemval = dump_int( data = <value> type_descr = <symbol>-type ).
        CONCATENATE <symbol>-header lv_itemval INTO lv_itemval.
        APPEND lv_itemval TO lv_properties.
      ENDIF.
    ENDLOOP.

    CONCATENATE LINES OF lv_properties INTO r_json SEPARATED BY `,`.
    CONCATENATE `{` r_json `}` INTO r_json.

  ENDMETHOD.

  METHOD get_fields.

    DATA: lt_symbols TYPE t_t_symbol,
          lv_name    TYPE char128,
          ls_field   LIKE LINE OF rt_fields.

    FIELD-SYMBOLS: <sym>   LIKE LINE OF lt_symbols.

    lt_symbols = get_symbols( type_descr = type_descr data = data object = object include_aliases = abap_true ).

    LOOP AT lt_symbols ASSIGNING <sym> WHERE read_only EQ abap_false.
      ls_field-name  = <sym>-name.
      ls_field-type  = <sym>-type.
      ls_field-value = <sym>-value.

      " insert as UPPER CASE
      INSERT ls_field INTO TABLE rt_fields.

      " insert as lower case
      TRANSLATE ls_field-name TO LOWER CASE.
      INSERT ls_field INTO TABLE rt_fields.

      " as pretty printed
      IF mv_pretty_name NE pretty_mode-none AND mv_pretty_name NE pretty_mode-low_case.
        format_name ls_field-name mv_pretty_name ls_field-name.
        INSERT ls_field INTO TABLE rt_fields.
        " let us check for not well formed camelCase, to stay compatible with the old logic
        lv_name = ls_field-name.
        TRANSLATE lv_name(1) TO UPPER CASE.
        ls_field-name = lv_name.
        INSERT ls_field INTO TABLE rt_fields.
      ENDIF.

    ENDLOOP.

  ENDMETHOD.

  METHOD get_symbols.

    DATA: comp_tab     TYPE cl_abap_structdescr=>component_table,
          symb_tab     LIKE result,
          symb         LIKE LINE OF symb_tab,
          class_descr  TYPE REF TO cl_abap_classdescr,
          struct_descr TYPE REF TO cl_abap_structdescr.

    FIELD-SYMBOLS: <comp>  LIKE LINE OF comp_tab,
                   <attr>  LIKE LINE OF cl_abap_objectdescr=>attributes,
                   <field> TYPE any.

    IF type_descr->kind EQ cl_abap_typedescr=>kind_struct.

      struct_descr ?= type_descr.
      comp_tab = struct_descr->get_components( ).

      LOOP AT comp_tab ASSIGNING <comp>.
        IF <comp>-name IS NOT INITIAL AND
          ( <comp>-as_include EQ abap_false OR include_aliases EQ abap_true OR mv_expand_includes EQ abap_false ).
          symb-name = <comp>-name.
          symb-type = <comp>-type.
          symb-compressable = is_compressable( type_descr = symb-type name = symb-name ).
          ASSIGN data->(symb-name) TO <field>.
          GET REFERENCE OF <field> INTO symb-value.
          format_name symb-name mv_pretty_name symb-header.
          CONCATENATE `"` symb-header  `":` INTO symb-header.
          APPEND symb TO result.
        ENDIF.
        IF <comp>-as_include EQ abap_true AND mv_expand_includes EQ abap_true.
          struct_descr ?= <comp>-type.
          symb_tab = get_symbols( type_descr = struct_descr data = data include_aliases = include_aliases ).
          LOOP AT symb_tab INTO symb.
            CONCATENATE symb-name <comp>-suffix INTO symb-name.
            symb-compressable = is_compressable( type_descr = symb-type name = symb-name ).
            ASSIGN data->(symb-name) TO <field>.
            GET REFERENCE OF <field> INTO symb-value.
            format_name symb-name mv_pretty_name symb-header.
            CONCATENATE `"` symb-header  `":` INTO symb-header.
            APPEND symb TO result.
          ENDLOOP.
        ENDIF.
      ENDLOOP.

    ELSEIF type_descr->type_kind EQ cl_abap_typedescr=>typekind_class.

      class_descr ?= type_descr.
      LOOP AT class_descr->attributes ASSIGNING <attr> WHERE is_constant IS INITIAL AND alias_for IS INITIAL AND
        ( is_interface IS INITIAL OR type_kind NE cl_abap_typedescr=>typekind_oref ).
        ASSIGN object->(<attr>-name) TO <field>.
        CHECK sy-subrc IS INITIAL. " we can only assign to public attributes
        symb-name = <attr>-name.
        symb-read_only = <attr>-is_read_only.
        symb-type = class_descr->get_attribute_type( <attr>-name ).
        symb-compressable = is_compressable( type_descr = symb-type name = symb-name ).
        GET REFERENCE OF <field> INTO symb-value.
        format_name symb-name mv_pretty_name symb-header.
        CONCATENATE `"` symb-header  `":` INTO symb-header.
        APPEND symb TO result.
      ENDLOOP.

    ENDIF.

  ENDMETHOD.                    "GET_SYMBOLS

  METHOD is_compressable.
    " by default, any initial value may be skipped in compress mode;
    " redefine this method to force output of selected fields
    rv_compress = abap_true.
  ENDMETHOD.

  METHOD pretty_name.

    DATA: tokens TYPE TABLE OF char128,
          cache  LIKE LINE OF mt_cache_pretty.

    FIELD-SYMBOLS: <token> LIKE LINE OF tokens,
                   <cache> LIKE LINE OF mt_cache_pretty.

    READ TABLE mt_cache_pretty WITH TABLE KEY in = in ASSIGNING <cache>.
    IF sy-subrc IS INITIAL.
      out = <cache>-out.
    ELSE.
      out = in.

      " protect doubled underscores, so that they survive as a single literal underscore
      REPLACE ALL OCCURRENCES OF `__` IN out WITH `*`.

      TRANSLATE out TO LOWER CASE.
      TRANSLATE out USING `/_:_~_`.
      SPLIT out AT `_` INTO TABLE tokens.
      LOOP AT tokens ASSIGNING <token> FROM 2.
        TRANSLATE <token>(1) TO UPPER CASE.
      ENDLOOP.

      CONCATENATE LINES OF tokens INTO out.
      REPLACE ALL OCCURRENCES OF `*` IN out WITH `_`.

      cache-in  = in.
      cache-out = out.
      INSERT cache INTO TABLE mt_cache_pretty.
    ENDIF.

  ENDMETHOD.                    "pretty_name
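  " Examples of the transformation above (input ABAP name -> JSON name):
  "
  "   'FIELD_NAME'  -> 'fieldName'
  "   'FIELD__NAME' -> 'field_name'   (doubled underscore kept as a single one)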

  METHOD pretty_name_ex.

    DATA: tokens TYPE TABLE OF char128,
          cache  LIKE LINE OF mt_cache_pretty.

    FIELD-SYMBOLS: <token> LIKE LINE OF tokens,
                   <cache> LIKE LINE OF mt_cache_pretty.

    READ TABLE mt_cache_pretty WITH TABLE KEY in = in ASSIGNING <cache>.
    IF sy-subrc IS INITIAL.
      out = <cache>-out.
    ELSE.
      out = in.

      " extended mode: a triple underscore maps to a dot, a double underscore to a literal underscore
      REPLACE ALL OCCURRENCES OF `___` IN out WITH `.`.
      REPLACE ALL OCCURRENCES OF `__` IN out WITH `*`.

      TRANSLATE out TO LOWER CASE.
      TRANSLATE out USING `/_:_~_`.
      SPLIT out AT `_` INTO TABLE tokens.
      LOOP AT tokens ASSIGNING <token> FROM 2.
        TRANSLATE <token>(1) TO UPPER CASE.
      ENDLOOP.

      CONCATENATE LINES OF tokens INTO out.
      REPLACE ALL OCCURRENCES OF `*` IN out WITH `_`.

      cache-in  = in.
      cache-out = out.
      INSERT cache INTO TABLE mt_cache_pretty.
    ENDIF.

  ENDMETHOD.                    "pretty_name_ex
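  " Examples of the extended transformation above (input ABAP name -> JSON name):
  "
  "   'FIELD_NAME'   -> 'fieldName'
  "   'FIELD___NAME' -> 'field.name'  (triple underscore maps to a dot)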

  METHOD restore.

    DATA: mark       LIKE offset,
          match      LIKE offset,
          pos        LIKE offset,
          unescape   TYPE abap_bool,
          ref_descr  TYPE REF TO cl_abap_refdescr,
          data_descr TYPE REF TO cl_abap_datadescr,
          data_ref   TYPE REF TO data,
          object_ref TYPE REF TO object,
          fields     LIKE field_cache,
          name_json  TYPE string.

    FIELD-SYMBOLS: <value>       TYPE any,
                   <field_cache> LIKE LINE OF field_cache.

    fields = field_cache.

    IF type_descr IS NOT INITIAL AND type_descr->kind EQ type_descr->kind_ref.
      ref_descr ?= type_descr.
      type_descr = ref_descr->get_referenced_type( ).
      IF ref_descr->type_kind EQ ref_descr->typekind_oref.
        IF data IS INITIAL.
          " may raise an exception if the type is abstract or the constructor is protected
          CREATE OBJECT data TYPE (type_descr->absolute_name).
        ENDIF.
        object_ref ?= data.
        fields = get_fields( type_descr = type_descr object = object_ref ).
      ELSEIF ref_descr->type_kind EQ ref_descr->typekind_dref.
        IF data IS INITIAL.
          data_descr ?= type_descr.
          CREATE DATA data TYPE HANDLE data_descr.
        ENDIF.
        data_ref ?= data.
        ASSIGN data_ref->* TO <value>.
        fields = get_fields( type_descr = type_descr data = data_ref ).
        restore( EXPORTING json = json length = length type_descr = type_descr field_cache = fields
                   CHANGING data = <value> offset = offset ).
        RETURN.
      ENDIF.
    ENDIF.

    IF fields IS INITIAL AND type_descr IS NOT INITIAL AND type_descr->kind EQ type_descr->kind_struct.
      GET REFERENCE OF data INTO data_ref.
      fields = get_fields( type_descr = type_descr data = data_ref ).
    ENDIF.

    eat_white.
    eat_char `{`.
    eat_white.

    WHILE offset < length AND json+offset(1) NE `}`.

      eat_white.
      eat_string name_json.
      eat_white.
      eat_char `:`.
      eat_white.

      READ TABLE fields WITH TABLE KEY name = name_json ASSIGNING <field_cache>.
      IF sy-subrc IS NOT INITIAL.
        TRANSLATE name_json TO UPPER CASE.
        READ TABLE fields WITH TABLE KEY name = name_json ASSIGNING <field_cache>.
      ENDIF.

      IF sy-subrc IS INITIAL.
        ASSIGN <field_cache>-value->* TO <value>.
        restore_type( EXPORTING json = json length = length type_descr = <field_cache>-type CHANGING data = <value> offset = offset ).
      ELSE.
        restore_type( EXPORTING json = json length = length CHANGING offset = offset ).
      ENDIF.

      eat_white.

      IF offset < length AND json+offset(1) NE `}`.
        eat_char `,`.
      ELSE.
        EXIT.
      ENDIF.

    ENDWHILE.

    eat_char `}`.

  ENDMETHOD.                    "restore

  METHOD restore_type.

    DATA: mark        LIKE offset,
          match       LIKE offset,
          unescape    TYPE abap_bool,
          sdummy      TYPE string,                          "#EC NEEDED
          lr_idummy   TYPE REF TO i,                        "#EC NEEDED
          lr_bdummy   TYPE REF TO bool,                     "#EC NEEDED
          lr_sdummy   TYPE REF TO string,                   "#EC NEEDED
          pos         LIKE offset,
          line        TYPE REF TO data,
          key_ref     TYPE REF TO data,
          data_ref    TYPE REF TO data,
          key_name    TYPE string,
          key_value   TYPE string,
          lt_fields   LIKE field_cache,
          lt_symbols  TYPE t_t_symbol,
          lo_exp      TYPE REF TO cx_root,
          elem_descr  TYPE REF TO cl_abap_elemdescr,
          table_descr TYPE REF TO cl_abap_tabledescr,
          data_descr  TYPE REF TO cl_abap_datadescr.

    FIELD-SYMBOLS: <line>      TYPE any,
                   <value>     TYPE any,
                   <data>      TYPE data,
                   <field>     LIKE LINE OF lt_fields,
                   <table>     TYPE ANY TABLE,
                   <value_sym> LIKE LINE OF lt_symbols.

    IF type_descr IS INITIAL AND data IS SUPPLIED.
      type_descr = cl_abap_typedescr=>describe_by_data( data ).
    ENDIF.

    eat_white.

    TRY .
        IF type_descr IS NOT INITIAL AND type_descr->absolute_name EQ mc_json_type.
          " skip deserialization
          mark = offset.
          restore_type( EXPORTING json = json length = length CHANGING offset = offset ).
          match = offset - mark.
          data = json+mark(match).
        ENDIF.

        CASE json+offset(1).
          WHEN `{`. " object
            IF type_descr IS NOT INITIAL.
              IF mv_assoc_arrays EQ c_bool-true AND type_descr->kind EQ cl_abap_typedescr=>kind_table.
                table_descr ?= type_descr.
                data_descr = table_descr->get_table_line_type( ).
                IF table_descr->has_unique_key IS NOT INITIAL.
                  eat_char `{`.
                  eat_white.
                  IF json+offset(1) NE `}`.
                    ASSIGN data TO <table>.
                    CLEAR <table>.
                    CREATE DATA line LIKE LINE OF <table>.
                    ASSIGN line->* TO <line>.
                    lt_fields = get_fields( type_descr = data_descr data = line ).
                    IF table_descr->key_defkind EQ table_descr->keydefkind_user AND lines( table_descr->key ) EQ 1.
                      READ TABLE table_descr->key INDEX 1 INTO key_name.
                      READ TABLE lt_fields WITH TABLE KEY name = key_name ASSIGNING <field>.
                      key_ref = <field>-value.
                      IF mv_assoc_arrays_opt EQ c_bool-true.
                        lt_symbols = get_symbols( type_descr = data_descr data = line ).
                        DELETE lt_symbols WHERE name EQ key_name.
                        IF lines( lt_symbols ) EQ 1.
                          READ TABLE lt_symbols INDEX 1 ASSIGNING <value_sym>.
                        ENDIF.
                      ENDIF.
                    ENDIF.
                    WHILE offset < length AND json+offset(1) NE `}`.
                      CLEAR <line>.
                      eat_white.
                      eat_string key_value.
                      eat_white.
                      eat_char `:`.
                      eat_white.
                      IF <value_sym> IS ASSIGNED.
                        ASSIGN <value_sym>-value->* TO <value>.
                        restore_type( EXPORTING json = json length = length type_descr = <value_sym>-type
                                      CHANGING data = <value> offset = offset ).
                      ELSE.
                        restore_type( EXPORTING json = json length = length type_descr = data_descr field_cache = lt_fields
                                      CHANGING data = <line> offset = offset ).
                      ENDIF.
                      IF table_descr->key_defkind EQ table_descr->keydefkind_user.
                        IF key_ref IS BOUND.
                          ASSIGN key_ref->* TO <value>.
                          IF <value> IS INITIAL.
                            MOVE key_value TO <value>.
                          ENDIF.
                        ENDIF.
                      ELSEIF <line> IS INITIAL.
                        MOVE key_value TO <line>.
                      ENDIF.

                      INSERT <line> INTO TABLE <table>.
                      eat_white.
                      IF offset < length AND json+offset(1) NE `}`.
                        eat_char `,`.
                      ELSE.
                        EXIT.
                      ENDIF.
                    ENDWHILE.
                  ELSE.
                    CLEAR data.
                  ENDIF.
                  eat_char `}`.
                ELSE.
                  restore( EXPORTING json = json length = length CHANGING  offset = offset ).
                ENDIF.
              ELSEIF type_descr->type_kind EQ cl_abap_typedescr=>typekind_dref.
                IF data IS INITIAL.
                  generate_int_ex( EXPORTING json = json length = length CHANGING offset = offset data = data ).
                ELSE.
                  data_ref ?= data.
                  type_descr = cl_abap_typedescr=>describe_by_data_ref( data_ref ).
                  ASSIGN data_ref->* TO <data>.
                  restore_type( EXPORTING json = json length = length type_descr = type_descr CHANGING data = <data> offset = offset ).
                ENDIF.
              ELSE.
                restore( EXPORTING json = json length = length type_descr = type_descr field_cache = field_cache
                         CHANGING data = data offset = offset ).
              ENDIF.
            ELSE.
              restore( EXPORTING json = json length = length CHANGING  offset = offset ).
            ENDIF.
          WHEN `[`. " array
            IF type_descr IS NOT INITIAL AND type_descr->type_kind EQ cl_abap_typedescr=>typekind_dref.
              IF data IS INITIAL.
                generate_int_ex( EXPORTING json = json length = length CHANGING offset = offset data = data ).
              ELSE.
                data_ref ?= data.
                type_descr = cl_abap_typedescr=>describe_by_data_ref( data_ref ).
                ASSIGN data_ref->* TO <data>.
                restore_type( EXPORTING json = json length = length type_descr = type_descr CHANGING data = <data> offset = offset ).
              ENDIF.
            ELSE.
              eat_char `[`.
              eat_white.
              IF json+offset(1) NE `]`.
                IF type_descr IS NOT INITIAL AND type_descr->kind EQ cl_abap_typedescr=>kind_table.
                  table_descr ?= type_descr.
                  data_descr = table_descr->get_table_line_type( ).
                  ASSIGN data TO <table>.
                  CLEAR <table>.
                  CREATE DATA line LIKE LINE OF <table>.
                  ASSIGN line->* TO <line>.
                  lt_fields = get_fields( type_descr = data_descr data = line ).
                  WHILE offset < length AND json+offset(1) NE `]`.
                    CLEAR <line>.
                    restore_type( EXPORTING json = json length = length type_descr = data_descr field_cache = lt_fields
                                  CHANGING data = <line> offset = offset ).
                    INSERT <line> INTO TABLE <table>.
                    eat_white.
                    IF offset < length AND json+offset(1) NE `]`.
                      eat_char `,`.
                    ELSE.
                      EXIT.
                    ENDIF.
                  ENDWHILE.
                ELSE.
                  " skip array
                  WHILE offset < length AND json+offset(1) NE `]`.
                    eat_white.
                    restore_type( EXPORTING json = json length = length CHANGING offset = offset ).
                    eat_white.
                    IF offset < length AND json+offset(1) NE `]`.
                      eat_char `,`.
                    ELSE.
                      EXIT.
                    ENDIF.
                  ENDWHILE.
                ENDIF.
              ELSE.
                CLEAR data.
              ENDIF.
              eat_char `]`.
            ENDIF.
          WHEN `"`. " string
            eat_string sdummy.
            IF type_descr IS NOT INITIAL.
              " unescape string
              IF sdummy IS NOT INITIAL.
                IF type_descr->kind EQ cl_abap_typedescr=>kind_elem.
                  elem_descr ?= type_descr.
                  CASE elem_descr->type_kind.
                    WHEN cl_abap_typedescr=>typekind_char.
                      IF elem_descr->output_length EQ 1 AND mc_bool_types CS elem_descr->absolute_name.
                        IF sdummy(1) CA `XxTt1`.
                          data = c_bool-true.
                        ELSE.
                          data = c_bool-false.
                        ENDIF.
                        RETURN.
                      ENDIF.
                    WHEN cl_abap_typedescr=>typekind_xstring OR cl_abap_typedescr=>typekind_hex.
                      string_to_xstring( EXPORTING in = sdummy CHANGING out = data ).
                      RETURN.
                    WHEN cl_abap_typedescr=>typekind_date.
                      REPLACE FIRST OCCURRENCE OF REGEX `(\d{4})-(\d{2})-(\d{2})` IN sdummy WITH `$1$2$3`
                      REPLACEMENT LENGTH match REPLACEMENT OFFSET pos. "#EC NOTEXT
                      IF sy-subrc EQ 0 AND pos EQ 0.
                        sdummy = sdummy(match).
                      ENDIF.
                    WHEN cl_abap_typedescr=>typekind_time.
                      REPLACE FIRST OCCURRENCE OF REGEX `(\d{2}):(\d{2}):(\d{2})` IN sdummy WITH `$1$2$3`
                      REPLACEMENT LENGTH match REPLACEMENT OFFSET pos. "#EC NOTEXT
                      IF sy-subrc EQ 0 AND pos EQ 0.
                        sdummy = sdummy(match).
                      ENDIF.
                    WHEN cl_abap_typedescr=>typekind_packed.
                      REPLACE FIRST OCCURRENCE OF REGEX `(\d{4})-?(\d{2})-?(\d{2})T(\d{2}):?(\d{2}):?(\d{2})(?:[\.,](\d{0,7}))?Z?` IN sdummy WITH `$1$2$3$4$5$6.$7`
                      REPLACEMENT LENGTH match REPLACEMENT OFFSET pos. "#EC NOTEXT
                      IF sy-subrc EQ 0 AND pos EQ 0.
                        sdummy = sdummy(match).
                      ENDIF.
                  ENDCASE.
                ELSEIF type_descr->type_kind EQ cl_abap_typedescr=>typekind_dref.
                  CREATE DATA lr_sdummy TYPE string.
                  MOVE sdummy TO lr_sdummy->*.
                  data ?= lr_sdummy.
                  RETURN.
                ELSE.
                  throw_error. " Other wise dumps with OBJECTS_MOVE_NOT_SUPPORTED
                ENDIF.
              ENDIF.
              MOVE sdummy TO data. " to avoid crashes due to data type inconsistency
            ENDIF.
          WHEN `-`. " number
            IF type_descr IS NOT INITIAL.
              IF type_descr->kind EQ type_descr->kind_ref AND type_descr->type_kind EQ cl_abap_typedescr=>typekind_dref.
                CREATE DATA lr_idummy TYPE i.
                eat_number lr_idummy->*.                    "#EC NOTEXT
                data ?= lr_idummy.
              ELSEIF type_descr->kind EQ type_descr->kind_elem.
                eat_number data.                            "#EC NOTEXT
              ELSE.
                eat_number sdummy.                          "#EC NOTEXT
              ENDIF.
            ELSE.
              eat_number sdummy.                            "#EC NOTEXT
            ENDIF.
          WHEN OTHERS.
            FIND FIRST OCCURRENCE OF json+offset(1) IN `0123456789`.
            IF sy-subrc IS INITIAL. " number
              IF type_descr IS NOT INITIAL.
                IF type_descr->kind EQ type_descr->kind_ref AND type_descr->type_kind EQ cl_abap_typedescr=>typekind_dref.
                  CREATE DATA lr_idummy TYPE i.
                  eat_number lr_idummy->*.                  "#EC NOTEXT
                  data ?= lr_idummy.
                ELSEIF type_descr->kind EQ type_descr->kind_elem.
                  eat_number data.                          "#EC NOTEXT
                ELSE.
                  eat_number sdummy.                        "#EC NOTEXT
                ENDIF.
              ELSE.
                eat_number sdummy.                          "#EC NOTEXT
              ENDIF.
            ELSE. " true/false/null
              IF type_descr IS NOT INITIAL.
                IF type_descr->kind EQ type_descr->kind_ref AND type_descr->type_kind EQ cl_abap_typedescr=>typekind_dref.
                  CREATE DATA lr_bdummy TYPE bool.
                  eat_bool lr_bdummy->*.                    "#EC NOTEXT
                  data ?= lr_bdummy.
                ELSEIF type_descr->kind EQ type_descr->kind_elem.
                  eat_bool data.                            "#EC NOTEXT
                ELSE.
                  eat_bool sdummy.                          "#EC NOTEXT
                ENDIF.
              ELSE.
                eat_bool sdummy.                            "#EC NOTEXT
              ENDIF.
            ENDIF.
        ENDCASE.
      CATCH cx_sy_move_cast_error cx_sy_conversion_no_number cx_sy_conversion_overflow INTO lo_exp.
        CLEAR data.
        IF mv_strict_mode EQ abap_true.
          RAISE EXCEPTION TYPE cx_sy_move_cast_error EXPORTING previous = lo_exp.
        ENDIF.
    ENDTRY.

  ENDMETHOD.                    "restore_type

  METHOD string_to_raw.

    CALL FUNCTION 'SCMS_STRING_TO_XSTRING'
      EXPORTING
        text     = iv_string
        encoding = iv_encoding
      IMPORTING
        buffer   = rv_xstring
      EXCEPTIONS
        OTHERS   = 1.

    IF sy-subrc IS NOT INITIAL.
      CLEAR rv_xstring.
    ENDIF.

  ENDMETHOD.

  METHOD raw_to_string.

    DATA: lv_output_length TYPE i,
          lt_binary_tab    TYPE STANDARD TABLE OF sdokcntbin.

    CALL FUNCTION 'SCMS_XSTRING_TO_BINARY'
      EXPORTING
        buffer        = iv_xstring
      IMPORTING
        output_length = lv_output_length
      TABLES
        binary_tab    = lt_binary_tab.

    CALL FUNCTION 'SCMS_BINARY_TO_STRING'
      EXPORTING
        input_length  = lv_output_length
        encoding      = iv_encoding
      IMPORTING
        text_buffer   = rv_string
        output_length = lv_output_length
      TABLES
        binary_tab    = lt_binary_tab.

  ENDMETHOD.

  METHOD string_to_xstring.

    DATA: lv_xstring TYPE xstring.

    CALL FUNCTION 'SSFC_BASE64_DECODE'
      EXPORTING
        b64data = in
      IMPORTING
        bindata = lv_xstring
      EXCEPTIONS
        OTHERS  = 1.

    IF sy-subrc IS INITIAL.
      MOVE lv_xstring TO out.
    ELSE.
      MOVE in TO out.
    ENDIF.

  ENDMETHOD.                    "string_to_xstring

  METHOD xstring_to_string.

    DATA: lv_xstring TYPE xstring.

    " let us fix data conversion issues here
    lv_xstring = in.

    CALL FUNCTION 'SSFC_BASE64_ENCODE'
      EXPORTING
        bindata = lv_xstring
      IMPORTING
        b64data = out
      EXCEPTIONS
        OTHERS  = 1.

    IF sy-subrc IS NOT INITIAL.
      MOVE in TO out.
    ENDIF.

  ENDMETHOD.                    "xstring_to_string

  METHOD tribool_to_bool.
    IF iv_tribool EQ c_tribool-true.
      rv_bool = c_bool-true.
    ELSEIF iv_tribool EQ c_tribool-undefined.
      rv_bool = abap_undefined. " fall back to abap_undefined
    ENDIF.
  ENDMETHOD.                    "tribool_to_bool

  METHOD bool_to_tribool.
    IF iv_bool EQ c_bool-true.
      rv_tribool = c_tribool-true.
    ELSEIF iv_bool EQ abap_undefined. " fall back for abap_bool
      rv_tribool = c_tribool-undefined.
    ELSE.
      rv_tribool = c_tribool-false.
    ENDIF.
  ENDMETHOD.                    "bool_to_tribool

ENDCLASS.

Serialization/deserialization of hierarchical/recursive data

Handling recursive data structures in ABAP is not trivial, and serializing and deserializing them is not trivial either.
If you would like to model hierarchical (tree-like) data as ABAP structures, the only allowed way is to use references to generic data, as in the example below:

Modeling of recursive data types in ABAP
TYPES: 
  BEGIN OF ts_node,
    id        TYPE i,
    children  TYPE STANDARD TABLE OF REF TO data WITH DEFAULT KEY,
  END OF ts_node.

DATA: lv_exp    TYPE string,
      lv_act    TYPE string,
      ls_data   TYPE ts_node,
      lr_data   LIKE REF TO ls_data.

ls_data-id = 1.

CREATE DATA lr_data.
lr_data->id = 2.
APPEND lr_data TO ls_data-children.

This approach is more or less straightforward and works, but it loses the type information of the data persisted in the children table. That means you need to cast the data when accessing it. It also prevents deserializing such data from JSON, because the parser cannot deduce the type of the data to be created in the children table. Serialization, however, works fine:

Serialization of recursive ABAP structures
lv_exp = '{"ID":1,"CHILDREN":[{"ID":2,"CHILDREN":[]}]}'.
lv_act = /ui2/cl_json=>serialize( data = ls_data ).
cl_aunit_assert=>assert_equals( act = lv_act exp = lv_exp msg = 'Serialization of recursive data structure fails' ).

A better way to model hierarchical data in ABAP is with objects, since objects are always processed as references and ABAP allows you to create nested data structures referring to objects of the same type:

Modeling of recursive data in ABAP using objects
CLASS lcl_test DEFINITION FINAL.
  PUBLIC SECTION.
    DATA: id TYPE i.
    DATA: children TYPE STANDARD TABLE OF REF TO lcl_test.
ENDCLASS.                    "lcl_test DEFINITION

In that manner, you can process the data in the same way as with ABAP structures, but with typed access, and serialization/deserialization to and from JSON works fine, since the types can be deduced from the object's type information.

Serialization/deserialization of recursive objects in ABAP
DATA: lo_act    TYPE REF TO lcl_test,
      lo_exp    TYPE REF TO lcl_test,
      lv_json   TYPE string,
      lo_child  LIKE lo_exp.

CREATE OBJECT lo_exp.

lo_exp->id = 1.

CREATE OBJECT lo_child.
lo_child->id = 2.
APPEND lo_child TO lo_exp->children.

lv_json = /ui2/cl_json=>serialize( data = lo_exp ).
/ui2/cl_json=>deserialize( EXPORTING json = lv_json CHANGING data = lo_act ).

Remark: Some constraints on the data design exist with regard to the deserialization of objects:

  • You cannot use constructors with obligatory parameters
  • References to interfaces will not be deserialized

Partial serialization/deserialization

When it is needed:

  • You deserialize JSON to ABAP but would like some known parts to remain as JSON strings, because you do not know the nested JSON structure.
  • You deserialize a collection (array/associative array) of objects with heterogeneous structure (for example, the same field has a different type depending on the object type). With partial deserialization, you can restore such a field as a JSON string in ABAP and apply an additional, type-specific deserialization later.
  • You serialize ABAP to JSON and have some ready JSON pieces (strings) which you would like to mix in.

For this, /UI2/CL_JSON provides the type /UI2/CL_JSON=>JSON (an alias for the built-in type string). ABAP fields declared with this type are serialized/deserialized as JSON pieces. Be aware that during serialization from ABAP to JSON the content of such a JSON piece is not validated, so if you pass an invalid JSON block, it may corrupt the whole resulting JSON string.

Below you can find examples for partial serialization/deserialization.

Serialization:

Partial serialization of ABAP to JSON
TYPES: BEGIN OF ts_record,
        id      TYPE string,
        columns TYPE /ui2/cl_json=>json,
       END OF ts_record.

DATA: lv_json   TYPE /ui2/cl_json=>json,
      lt_data   TYPE SORTED TABLE OF ts_record WITH UNIQUE KEY id,
      ls_data   LIKE LINE OF lt_data.

ls_data-id = 'O000001ZZ_SO_GRES_CONTACTS'.
ls_data-columns = '{"AGE":{"bVisible":true,"iPosition":2},"BRSCH":{"bVisible":true}}'.
INSERT ls_data INTO TABLE lt_data.

ls_data-id = 'O000001ZZ_TRANSIENT_TEST_A'.
ls_data-columns = '{"ABTNR":{"bVisible":false},"CITY1":{"bVisible":false},"IC_COMPANY_KEY":{"bVisible":true}}'.
INSERT ls_data INTO TABLE lt_data.

lv_json = /ui2/cl_json=>serialize( data = lt_data assoc_arrays = abap_true pretty_name = /ui2/cl_json=>pretty_mode-camel_case ).

WRITE / lv_json. 

Results in:

JSON Output
{
    "O000001ZZ_SO_GRES_CONTACTS": {
        "columns": {
            "AGE": {
                "bVisible": true,
                "iPosition": 2
            },
            "BRSCH": {
                "bVisible": true
            }
        }
    },
    "O000001ZZ_TRANSIENT_TEST_A": {
        "columns": {
            "ABTNR": {
                "bVisible": false
            },
            "CITY1": {
                "bVisible": false
            },
            "IC_COMPANY_KEY": {
                "bVisible": true
            }
        }
    }
}

Deserialization:

Partial deserialization of JSON into ABAP
TYPES: BEGIN OF ts_record,
        id      TYPE string,
        columns TYPE /ui2/cl_json=>json,
       END OF ts_record.

DATA: lv_json  TYPE string,
      lt_act   TYPE SORTED TABLE OF ts_record WITH UNIQUE KEY id.

CONCATENATE 
'{"O000001ZZ_SO_GRES_CONTACTS":{"columns":{"AGE":{"bVisible":true,"iPosition":2},"BRSCH":{"bVisible":true}}},'
'"O000001ZZ_TRANSIENT_TEST_A":{"columns":{"ABTNR":{"bVisible":false},"CITY1":{"bVisible":false},"IC_COMPANY_KEY":{"bVisible":true}}}}'
INTO lv_json.

" if you know first level of undelying structure ("columns" field) -> Output Var 1
/ui2/cl_json=>deserialize( EXPORTING json = lv_json assoc_arrays = abap_true CHANGING data = lt_act ).
 
" if you do not know underlying structure of first level (naming of second filed e.g columns in example does not matter )
" => result is a little bit different -> Output Var 2
/ui2/cl_json=>deserialize( EXPORTING json = lv_json assoc_arrays = abap_true assoc_arrays_opt = abap_true CHANGING data = lt_act ).

Results in the following ABAP data:

ABAP Output (variant 1)
ID(CString)	                COLUMNS(CString)
O000001ZZ_SO_GRES_CONTACTS  {"AGE":{"bVisible":true,"iPosition":2},"BRSCH":{"bVisible":true}}
O000001ZZ_TRANSIENT_TEST_A  {"ABTNR":{"bVisible":false},"CITY1":{"bVisible":false},"IC_COMPANY_KEY":{"bVisible":true}}
ABAP Output (variant 2)
ID(CString)	                COLUMNS(CString)
O000001ZZ_SO_GRES_CONTACTS  {"columns":{"AGE":{"bVisible":true,"iPosition":2},"BRSCH":{"bVisible":true}}}
O000001ZZ_TRANSIENT_TEST_A  {"columns":{"ABTNR":{"bVisible":false},"CITY1":{"bVisible":false},"IC_COMPANY_KEY":{"bVisible":true}}}

/UI2/CL_JSON extension

If the standard class functionality does not fit your requirements, there are two ways to adapt it to your needs:

  • Use a local copy of the class /UI2/CL_JSON and modify the logic directly by changing the original code.
  • Inherit from class /UI2/CL_JSON and override the methods where different logic is required.

The advantage of the first approach is that you are completely free in what you may change and have full control over the class lifecycle. The disadvantage is that you will probably need to merge your changes with /UI2/CL_JSON updates.

With the second approach you can use /UI2/CL_JSON directly (prerequisite is the latest version of note 2330592) and do not need to care about merges, but you can override only certain methods. The methods are:

IS_COMPRESSABLE – called to check whether the output of a given type may be suppressed during ABAP to JSON serialization when the value is initial.

  • > TYPE_DESCR (ref to CL_ABAP_TYPEDESCR) – value type
  • < RV_COMPRESS (bool) – compress initial value

The default implementation allows compressing of any initial value.

PRETTY_NAME – called to format an ABAP field name written to JSON, or a JSON attribute deserialized to an ABAP field, when the pretty_name parameter of the SERIALIZE/DESERIALIZE method equals PRETTY_MODE-CAMEL_CASE.

  • > IN (CSEQUENCE) – Field name to pretty print.
  • < OUT (STRING) – Pretty printed field name

The default implementation applies camelCase formatting based on the “_” symbol. To output a literal “_”, use a double “__” in the field name.
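As an illustration, the snippet below sketches the expected effect of camelCase pretty printing. The field names are made up for the example, and the JSON shown in the comment is the expected result based on the rules above, not verified output:

```abap
TYPES: BEGIN OF ts_demo,
         first_name TYPE string,
         last_name  TYPE string,
       END OF ts_demo.

DATA: ls_demo TYPE ts_demo,
      lv_json TYPE string.

ls_demo-first_name = 'John'.
ls_demo-last_name  = 'Doe'.

" "first_name" should be rendered as "firstName", "last_name" as "lastName"
lv_json = /ui2/cl_json=>serialize( data        = ls_demo
                                   pretty_name = /ui2/cl_json=>pretty_mode-camel_case ).
" lv_json expected to be similar to: {"firstName":"John","lastName":"Doe"}
```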

PRETTY_NAME_EX – called to format an ABAP field name written to JSON, or a JSON attribute deserialized to an ABAP field, when the pretty_name parameter of the SERIALIZE/DESERIALIZE method equals PRETTY_MODE-EXTENDED.

  • > IN (CSEQUENCE) – Field name to pretty print.
  • < OUT (STRING) – Pretty printed field name

The default implementation does the same as PRETTY_NAME, plus converts the ABAP sequence “___” into the JSON character “.”.

CLASS_CONSTRUCTOR – used to initialize static variables. You cannot override it, but you can implement your own class constructor that adapts the default globals, for example, to add an additional boolean type to be recognized during serialization/deserialization.

SERIALIZE/DESERIALIZE – these methods are static and therefore cannot be redefined. They are helpers for consuming code, hiding the construction of the class instance and the further *_INT calls. So, if you would like to use something similar in your custom class, you need to copy these methods to new ones, e.g. *_EX, replace the /UI2/CL_JSON type there with your custom class name, and use these methods instead of the standard ones.

Extension using inheritance:

Extension of /UI2/CL_JSON
CLASS lc_json_custom DEFINITION FINAL INHERITING FROM /ui2/cl_json.
  PUBLIC SECTION.
    CLASS-METHODS:
      class_constructor,
      deserialize_ex IMPORTING json TYPE json OPTIONAL
                        pretty_name TYPE pretty_name_mode DEFAULT pretty_mode-none
                      CHANGING data TYPE data,
      serialize_ex IMPORTING data TYPE data
                         compress TYPE bool DEFAULT c_bool-false
                      pretty_name TYPE pretty_name_mode DEFAULT pretty_mode-none
          RETURNING value(r_json) TYPE json .

  PROTECTED SECTION.
    METHODS:
      is_compressable REDEFINITION,
      pretty_name REDEFINITION.
ENDCLASS.                    "lc_json_custom DEFINITION

CLASS lc_json_custom IMPLEMENTATION.
  METHOD class_constructor.
    CONCATENATE mc_bool_types `\TYPE=/UI2/BOOLEAN` INTO mc_bool_types.
  ENDMETHOD.                    "class_constructor
  METHOD is_compressable.
    IF type_descr->absolute_name EQ `\TYPE=STRING` OR name EQ `INITIAL`.
      rv_compress = abap_false.
    ELSE.
      rv_compress = abap_true.
    ENDIF.
  ENDMETHOD.                    "is_compressable
  METHOD pretty_name.
    out = super->pretty_name( in ).
    CONCATENATE out 'Xxx' INTO out.
  ENDMETHOD.                    "pretty_name
  METHOD serialize_ex.
    DATA: lo_json  TYPE REF TO lc_json_custom.
    CREATE OBJECT lo_json
      EXPORTING
        compress         = compress
        pretty_name      = pretty_name
        assoc_arrays     = abap_true
        assoc_arrays_opt = abap_true
        expand_includes  = abap_true
        numc_as_string   = abap_true
        ts_as_iso8601    = abap_true.
    r_json = lo_json->serialize_int( data = data ).
  ENDMETHOD.                    "serialize_ex
  METHOD deserialize_ex.
    DATA: lo_json TYPE REF TO lc_json_custom.
    IF json IS NOT INITIAL.
      CREATE OBJECT lo_json
        EXPORTING
          pretty_name      = pretty_name
          assoc_arrays     = abap_true
          assoc_arrays_opt = abap_true.
      TRY .
          lo_json->deserialize_int( EXPORTING json = json CHANGING data = data ).
        CATCH cx_sy_move_cast_error.
      ENDTRY.
    ENDIF.
  ENDMETHOD.                    "deserialize_ex 
ENDCLASS.                    "lc_json_custom IMPLEMENTATION

TYPES:
 BEGIN OF tp_s_data,
   tribool   TYPE lc_json_custom=>tribool,
   bool      TYPE lc_json_custom=>bool,
   str1      TYPE string,
   str2      TYPE string,
   initial   TYPE i,
 END OF tp_s_data.

DATA: ls_exp          TYPE tp_s_data,
      ls_act          LIKE ls_exp,
      lo_json_custom  TYPE REF TO lc_json_custom,
      lv_json_custom  TYPE lc_json_custom=>json.

ls_exp-tribool = lc_json_custom=>c_tribool-false.
ls_exp-bool    = lc_json_custom=>c_bool-false.
ls_exp-str1    = ''.
ls_exp-str2    = 'ABC'.
ls_exp-initial = 0.
CREATE OBJECT lo_json_custom
  EXPORTING
    compress    = abap_true
    pretty_name = lc_json_custom=>pretty_mode-camel_case.

lv_json_custom = lo_json_custom->serialize_int( data = ls_exp ).
lo_json_custom->deserialize_int( EXPORTING json = lv_json_custom CHANGING data = ls_act ).
 
" alternative way 
lc_json_custom=>deserialize_ex( EXPORTING json = lv_json_custom CHANGING data = ls_act ).
cl_aunit_assert=>assert_equals( act = ls_act exp = ls_exp msg = 'Custom pretty name fails!' ).

WRITE / lv_json_custom. 

Results in the following JSON:

Serialization with custom /UI2/CL_JSON
{
	"triboolXxx": false,
	"str1Xxx": "",
	"str2Xxx": "ABC",
	"initialXxx": 0
}

Deserialization of untyped (unknown) JSON object

If you need to deserialize a JSON object with an unknown structure, do not have a matching data type on the ABAP side, or the data type of the resulting object may vary, you can generate the ABAP data object on the fly using the corresponding GENERATE method. The method has some limitations compared to standard deserialization:

  • all fields are generated as references (even elementary types)
  • you cannot control how arrays or timestamps are deserialized
  • you cannot access components of the generated structure statically (since the structure is unknown at compile time) and need to use dynamic access

The simplest example, with straightforward access:

Generating of ABAP Data using /UI2/CL_JSON
DATA: lv_json TYPE /ui2/cl_json=>json,
      lr_data TYPE REF TO data.

FIELD-SYMBOLS:
  <data>   TYPE data,
  <struct> TYPE any,
  <field>  TYPE any.

lv_json = `{"name":"Key1","properties":{"field1":"Value1","field2":"Value2"}}`.
lr_data = /ui2/cl_json=>generate( json = lv_json ).

" OK, generated, now let us access somete field :(
IF lr_data IS BOUND.
  ASSIGN lr_data->* TO <data>.
  ASSIGN COMPONENT `PROPERTIES` OF STRUCTURE <data> TO <field>.
  IF <field> IS ASSIGNED.
    lr_data = <field>.
    ASSIGN lr_data->* TO <data>.
    ASSIGN COMPONENT `FIELD1` OF STRUCTURE <data> TO <field>.
    IF <field> IS ASSIGNED.
      lr_data = <field>.
      ASSIGN lr_data->* TO <data>.
      WRITE: <data>. " We got it -> Value1
    ENDIF.
  ENDIF.
ENDIF.

A nice alternative is to use a dynamic data accessor helper class:

Access generated ABAP data object using dynamic data accessor helper
DATA: lv_json TYPE /ui2/cl_json=>json,
      lr_data TYPE REF TO data,
      lv_val  TYPE string.

lv_json = `{"name":"Key1","properties":{"field1":"Value1","field2":"Value2"}}`.
lr_data = /ui2/cl_json=>generate( json = lv_json ).

zcl_dyn_access=>create( ir_data = lr_data iv_component = `properties-field1`)->value( IMPORTING ev_data = lv_val ).
WRITE: lv_val.

Implicit generation of ABAP objects on deserialization

In addition to explicit generation of ABAP data objects from a JSON string, the deserializer supports an implicit way of generation during the DESERIALIZE(_INT) call. To trigger generation, your output data structure must contain a field of type REF TO DATA, and the name of the field must match the JSON attribute (pretty name rules are considered). Depending on the value of the field, the behavior differs:

  • The value is not bound (initial): the deserializer will use generation rules when creating the corresponding data types of the referenced value.
  • The value is bound (but maybe empty): the deserializer will create a new referenced value based on the referenced type.
Example of implicit generation of ABAP data from JSON string
TYPES:
  BEGIN OF ts_dyn_data1,
    name     TYPE string,
    value    TYPE string,
  END OF ts_dyn_data1,
  BEGIN OF ts_dyn_data2,
    key      TYPE string,
    value    TYPE string,
  END OF ts_dyn_data2,
  BEGIN OF ts_data,
    str     TYPE string,
    data    TYPE REF TO data,
  END OF ts_data.

DATA:
  ls_data  TYPE ts_data,
  lv_json  TYPE /ui2/cl_json=>json.

lv_json = `{"str":"Test","data":{"name":"name1","value":"value1"}}`.

" deserialize data and use generic generation for field "data",
" the same as with method GENERATE (using temporary data type)
/ui2/cl_json=>deserialize( EXPORTING json = lv_json CHANGING data = ls_data ).

" deserialize data and use type TS_DYN_DATA1 for the field "data"
CREATE DATA ls_data-data TYPE ts_dyn_data1.
/ui2/cl_json=>deserialize( EXPORTING json = lv_json CHANGING data = ls_data ).

" deserialize data and use alternative type TS_DYN_DATA2 for the field "data"
CREATE DATA ls_data-data TYPE ts_dyn_data2.
/ui2/cl_json=>deserialize( EXPORTING json = lv_json CHANGING data = ls_data ).

JSON/ABAP serialization/deserialization with runtime type information

Automatic deserialization of JSON into an appropriate ABAP structure is not supported. The default implementation assumes that you know the target data structure (or at least a partial structure, which also works) in order to deserialize JSON into ABAP and then work with typed data.

But if for some reason you need the ability to deserialize JSON into the source ABAP structure in a generic way, you can extend both serialize/deserialize methods and wrap the inputs/outputs of /UI2/CL_JSON with technical metadata describing the source ABAP structure, and use this information during deserialization (or use the GENERATE method). Of course, you need to ensure that the source ABAP data type is known in the deserialization scope (global and local types are "visible").

See example below:

Serialization and deserialization with runtime type information
TYPES: BEGIN OF ts_json_meta,
         abap_type LIKE cl_abap_typedescr=>absolute_name,
         data      TYPE string,
       END OF ts_json_meta.

DATA: lt_flight TYPE STANDARD TABLE OF sflight,
      lv_json   TYPE string,
      lo_data   TYPE REF TO data,
      ls_json   TYPE ts_json_meta.

FIELD-SYMBOLS: <data> TYPE any.

SELECT * FROM sflight INTO TABLE lt_flight.

* serialize table lt_flight into JSON, skipping initial fields and converting ABAP field names into camelCase
ls_json-data      = /ui2/cl_json=>serialize( data = lt_flight compress = abap_true pretty_name = /ui2/cl_json=>pretty_mode-camel_case ).
ls_json-abap_type = cl_abap_typedescr=>describe_by_data( lt_flight )->absolute_name.
lv_json           = /ui2/cl_json=>serialize( data = ls_json compress = abap_true pretty_name = /ui2/cl_json=>pretty_mode-camel_case ).
WRITE / lv_json.

CLEAR: ls_json, lt_flight.

* deserialize JSON string json into internal table lt_flight doing camelCase to ABAP like field name mapping
/ui2/cl_json=>deserialize( EXPORTING json = lv_json pretty_name = /ui2/cl_json=>pretty_mode-camel_case CHANGING data = ls_json ).
CREATE DATA lo_data TYPE (ls_json-abap_type).
ASSIGN lo_data->* TO <data>.
/ui2/cl_json=>deserialize( EXPORTING json = ls_json-data pretty_name = /ui2/cl_json=>pretty_mode-camel_case CHANGING data = <data> ).

IF lo_data IS NOT INITIAL.  
  BREAK-POINT. " check here lo_data
ENDIF.

Exception Handling in /UI2/CL_JSON

By default, /UI2/CL_JSON tries to hide thrown exceptions (which may happen during deserialization) from the consumer by catching them at all levels. In some cases this results in missing attributes; in other cases, when the error was critical and the parser cannot recover, you will get an empty object back. The main TRY/CATCH block, which does not let exceptions out, is in the DESERIALIZE method.

If you want error reporting, use the instance method DESERIALIZE_INT, which may raise CX_SY_MOVE_CAST_ERROR. The reporting is rather limited: all errors are translated into CX_SY_MOVE_CAST_ERROR and no additional information is available.
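A minimal sketch of such error handling, following the DESERIALIZE_INT call pattern shown in the extension example above (TS_DATA and the JSON string are hypothetical placeholders):

```abap
DATA: lo_json TYPE REF TO /ui2/cl_json,
      ls_data TYPE ts_data, " hypothetical target structure
      lv_json TYPE string.

lv_json = `{"str":"Test"}`.

CREATE OBJECT lo_json.
TRY.
    lo_json->deserialize_int( EXPORTING json = lv_json CHANGING data = ls_data ).
  CATCH cx_sy_move_cast_error.
    " deserialization failed - react here instead of silently receiving empty data
ENDTRY.
```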

How to use CALL TRANSFORMATION for JSON

Below is a small example of CALL TRANSFORMATION usage to produce JSON from ABAP structures. Do not ask me for details - I do not know them. (smile) It was just a small test of mine.

CALL TRANSFORMATION for JSON
DATA: lt_flight          TYPE STANDARD TABLE OF sflight,
      lo_writer          TYPE REF TO cl_sxml_string_writer,
      lv_output_length   TYPE i,
      lt_binary_tab      TYPE STANDARD TABLE OF sdokcntbin,
      lv_jsonx           TYPE xstring,
      lv_json            TYPE string.

SELECT * FROM sflight INTO TABLE lt_flight.

* ABAP to JSON
lo_writer = cl_sxml_string_writer=>create( type = if_sxml=>co_xt_json ).
CALL TRANSFORMATION id SOURCE text = lt_flight RESULT XML lo_writer.
lv_jsonx = lo_writer->get_output( ).

CALL FUNCTION 'SCMS_XSTRING_TO_BINARY'
  EXPORTING
    buffer                = lv_jsonx
  IMPORTING
    output_length         = lv_output_length
  TABLES
    binary_tab            = lt_binary_tab.

CALL FUNCTION 'SCMS_BINARY_TO_STRING'
  EXPORTING
    input_length          = lv_output_length
  IMPORTING
    text_buffer           = lv_json
    output_length         = lv_output_length
  TABLES
    binary_tab            = lt_binary_tab.

* JSON to ABAP
CALL TRANSFORMATION id SOURCE XML lv_jsonx RESULT text = lt_flight.

91 Comments

  1. Alexey - works ok , thanks for sharing. Worth trying.

    1. Use it with pleasure (wink)

  2. Thanks for this class... we encountered an issue using it... apparently the JSON in Javascript needs to have the number surrounded by double-quotes also...

    I added them in the concatenate in the macro and it works like a charm. Thanks (smile)

    1. Hi Greg,

      I'm facing the same problem.

      Could you please post how did you overcome the issue. In which macro and how did you add the concatenate statement?

      Thanks in advance,

      Sunil 

      1. Hey mate, sorry my account has been changed... anyway, we made some changes but I think Alexey already provided the same solution.

  3. Hi Greg,

    based on my knowledge, and on the JSON RFC, numbers shall not be surrounded by quotes.

    I assume in your case you need special handling, since the JS on the client side expects a string instead of a number. The proper way will then be to change the ABAP structure in such a way that it corresponds to the expected format (for example, change the base data type of the attribute from I to N).

    Can you please post here an example that does not work?

    BR, Alexey.


    1. Hey,

      Unfortunately, we can not change the structure as it is dynamic. We actually use JSON because oData is even less flexible. The issue is indeed in the JavaScript client that does not recognise the object during parsing.

      Here is an example:

      var test = {
          "Data" : [ {
              "COL_EMPNO" : 00000001,
              "COL_MANM1" : "Roberts, Marcia",
              "COL_MADOB" : 19630416,
          }]
      };

      JSON.parse(test) fails... Error: Unexpected number

  4. Hi Greg,

    is this last comma after 19630416 generated, or did you just add it in your example? If it comes from the serializer - it is a bug. But I think the JSON parser on the JS side shall overcome it.

    I think the reason for the error is: "COL_EMPNO" : 00000001.

    Please try example like this if it works:

    JSON.parse('{"Data":[{"COL_EMPNO":1,"COL_MANM1":"Roberts, Marcia","COL_MADOB":19630416}]}');

    If this 00000001 is the reason, I will try to fix it and update the parser. The workaround on your side, until I update the parser, will be to use type I instead of NUMC for COL_EMPNO.

    BR, Alexey.

  5. Hello Alexey / Greg,

    Accidentally came across this blog; but found the information very helpful. Thank you for the blog and comments. (smile)

    We had written a custom JSON parser some time ago and I was interested in the standard SAP utility.

    But I have encountered 2 issues while testing this;

    One: as Greg pointed out, when the output JSON string is parsed with external parsers, they have errors as the numbers need to be in quotes. After reading https://tools.ietf.org/html/rfc7158 I see that JSON numbers need not be in quotes but they cannot have leading zeros. So I guess, the solution could be either to have quotes around the number values or to have SAP's character based number formats without leading zeros.

    Two: the standard utility ( /ui2/cl_json=>serialize ) dumps when you have a structure that uses the INCLUDE statement without a "group" name. I don't know if this is fixed in a higher component version, but you can try with the example below.

    The code below dumps for me:

    types: begin of ty_s_str01,
             c01 type c length 1,
             c02 type c length 1,
           end of ty_s_str01.
    types: begin of ty_s_str02,
             c03 type c length 1.
             include type ty_s_str01.
    types: end of ty_s_str02.
    data ls_strc type ty_s_str02.
    data lv_json type string.

    ls_strc-c01 = 'X'.
    ls_strc-c03 = 'X'.
    lv_json = /ui2/cl_json=>serialize( data        = ls_strc
                                       compress    = abap_true
                                       pretty_name = abap_true ).

    Now replace the definition of ty_s_str02 as below and it should work.

    types: begin of ty_s_str02,
             c03 type c length 1.
             include type ty_s_str01 as str01.
    types: end of ty_s_str02.


    The same applies to a DDIC structure having an include without a "group" name.


    Hope this helps; just thought that I would point it out.


    Thanks.

  6. Hi John,

    the "SAP standard", I think, is to use CALL TRANSFORMATION with the JSON format: I have added an example at the bottom of the article showing how to serialize data with it. But, as I have written, that code will only work from 7.02 on, and one does not have much freedom (no easy way) to control the output format. It is faster, though.

    If one uses /UI2/CL_JSON, you get nicer consumption and more functionality, but less performance (since it is pure ABAP).

    Back to problems.

    1) The output of leading zeroes is a bug I will correct.

    2) Missing support for INCLUDE is a known bug, already fixed in the delivered /UI2/CL_JSON but not yet here. I will update the code together with the fix for leading zeroes soon.

    BR, Alexey

  7. Hi Alexey,

    Thank you for the updates and information on using the transformation. (smile)

    I was aware of using transformation but it was good to see an example.

    About the problems, not a show stopper for me; instead I just thought I would point these out.

    Thanks.

  8. Hi Guys,

    I have corrected both errors: the leading zeroes and the include structures. Try the new version.

    BR, Alexey.

  9. Added fix for type conversion overflow on deserializing. 

  10. Hello,

    I have some wish to improvement for /UI2/CL_JSON. 

    If you try to deserialize a character value into a numeric data field, the program dumps with a system error. It is not a planned situation, but sometimes consumed data comes in a bad format.

    For example:

    try.
        data json type string.
        json = '{ "userName": "sap", "password": "123456" }'.
        data: begin of user,
                username type string,
                password type int4,
              end of user.
        /ui2/cl_json=>deserialize( exporting json = json
                                   changing  data = user ).
      catch cx_root.
    endtry.

    The CATCH cx_root does not catch anything; the program is terminated.

     

    System analysis:

    In statement "REPLACE", only character-type data objects are supported at
    argument position "DATA".

    In this case, operand "DATA" has the non-character-type type "I".

    Method  RESTORE_TYPE:

    73     WHEN `"`. " string
    74       IF data IS SUPPLIED.
    75         eat_string data.
    76         " unescape string
    >>>>       REPLACE ALL OCCURRENCES OF `\"` IN data WITH `"`.

    When I consume JSON data from outside, I cannot rule out such badly formatted values.

    Could you please implement proper exception handling or, as an advanced option, JSON schema validation of the input?

    Thanks.

     


    1. Hi Jan,

      accepted. Please check new version.

      BR, Alexey.

      1. You are so quick. It functions great. Thanks.

    2. Agreed, I bumped into a similar issue before I saw Alexey's correction. The original in method restore_type() is:

          WHEN `"`. " string
            IF data IS SUPPLIED.
              eat_string data.
              REPLACE ALL OCCURRENCES OF `\"` IN data WITH `"`. " unescape string


      I just changed it to 

      WHEN `"`. " string
        IF data IS SUPPLIED.
          eat_string data.
          type_descr = cl_abap_typedescr=>describe_by_data( data ).
          IF type_descr->type_kind EQ cl_abap_typedescr=>typekind_char.
            REPLACE ALL OCCURRENCES OF `\"` IN data WITH `"`. " unescape string
      ...

      I had to copy this class to make changes...


      George

      1. Hi George,

        wrapping the /UI2/CL_JSON class in your own class as a local class is the preferred way if you want to protect your code from changes, which can happen if the standard delivered /UI2/CL_JSON is modified in a way that does not fit your purposes. And you can always copy the actual version of the code from here.

        About the change you suggest: please use the actual one from the article. It is more robust.

        BR, Alexey.

  11. New version published
    • Support for DATE, TIME, HEX, XSTRING data types added, for both serialization and deserialization:
      • DATE -> formatted string such as "2015-03-24" (not locale dependent). Changed from integer.
      • TIME -> formatted string such as "15:03:45" (not locale dependent). Changed from integer.
      • XSTRING -> Base64-encoded string, e.g. "q83v" for 'ABCDEF' (XString)
      • HEX -> Base64-encoded string, e.g. "AAAAAAAAOt5osQ==" for '0000000000003ADE68B1' (Hex)
    • Optimized performance of the deserialization
    • Support of XFELD as a Boolean type
    • Exception raising in case of malformed input (you need to catch cx_sy_move_cast_error)
    • Better support of data conversion when the input data type does not fit the output data type
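
    As a sanity check of the Base64 values listed above, the following Node.js snippet (Buffer is Node-specific) reproduces both encodings from the raw bytes:

```javascript
// ABAP XSTRING 'ABCDEF' is the three bytes AB CD EF.
const xstringB64 = Buffer.from('ABCDEF', 'hex').toString('base64');

// ABAP HEX '0000000000003ADE68B1' is ten bytes.
const hexB64 = Buffer.from('0000000000003ADE68B1', 'hex').toString('base64');

console.log(xstringB64); // "q83v"
console.log(hexB64);     // "AAAAAAAAOt5osQ=="
```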
  12. Another item: in /UI2/CL_JSON_SERIALIZER, method GET_VALUES(),

          ELSE.
    *       null
            IF mv_case_type = /ui2/if_serialize=>c_case_type-camel_case_s.
              IF <lv_field>-descr->type_kind = cl_abap_typedescr=>typekind_struct1
              OR <lv_field>-descr->type_kind = cl_abap_typedescr=>typekind_struct2
              OR <lv_field>-descr->type_kind = cl_abap_typedescr=>typekind_dref.
                DATA lv_u TYPE string.
                DATA lv_l TYPE string.
                FIND REGEX '([a-z])([a-z]*)' IN lv_field_name SUBMATCHES lv_u lv_l.
                TRANSLATE lv_l TO LOWER CASE.
                TRANSLATE lv_u TO UPPER CASE.
                CONCATENATE lv_u lv_l INTO lv_field_name.
              ENDIF.
            ENDIF.
            CONCATENATE '"' lv_field_name '":null' INTO lv_value.
          ENDIF.

    Here, the situation I have is that the interface does not expect "null", so I made the change to:

    CONCATENATE '"' lv_field_name '":' INTO lv_value.

    then there is another issue,

    in this line:

        ASSIGN COMPONENT sy-tabix OF STRUCTURE <lv_data> TO <lv_value>.
        IF <lv_field>-numeric = abap_true.

    if this component of structure <lv_data> is a table with no entries, then <lv_value> will be considered initial.

    once this <lv_value> is considered initial, it will come to the following code:

        ELSEIF <lv_value> IS INITIAL.
          IF <lv_field>-descr->type_kind = cl_abap_typedescr=>typekind_char AND
             <lv_field>-descr->length .
    *       IF <lv_field>-name <> 'NUMBER_FORMAT'.
            CONCATENATE '"' lv_field_name '":" "' INTO lv_value.
          ELSE.
    *       null
            IF mv_case_type = /ui2/if_serialize=>c_case_type-camel_case_s.
              IF <lv_field>-descr->type_kind = cl_abap_typedescr=>typekind_struct1
              OR <lv_field>-descr->type_kind = cl_abap_typedescr=>typekind_struct2
              OR <lv_field>-descr->type_kind = cl_abap_typedescr=>typekind_dref.
                DATA lv_u TYPE string.
                DATA lv_l TYPE string.
                FIND REGEX '([a-z])([a-z]*)' IN lv_field_name SUBMATCHES lv_u lv_l.
                TRANSLATE lv_l TO LOWER CASE.
                TRANSLATE lv_u TO UPPER CASE.
                CONCATENATE lv_u lv_l INTO lv_field_name.
              ENDIF.
            ENDIF.
            CONCATENATE '"' lv_field_name '":null' INTO lv_value.
          ENDIF.

    then I will miss the square brackets [ ] in the output. I did a fix here as well, with something like a check for

    <lv_field>-descr->type_kind =  cl_abap_typedescr=>typekind_table.

    if it is, then copy your code 

            GET REFERENCE OF <lv_value> INTO lr_dref.
            lo_tabledescr ?= <lv_field>-descr.
            lv_value = serialize_table( io_tabledescr = lo_tabledescr ir_data = lr_dref ).
            CONCATENATE '"' lv_field_name '":' lv_value INTO lv_value.

    So I think the case when <lv_data> is initial needs further thought and changes.

    Overall, thank you Alexey: this is a very good program, runs very fast, and is easy to debug. Thank you very much. It is preferable to the transformation, as it is dynamic. Going forward, I am wondering if we can determine the export ABAP structure at runtime.

    George

  13. Hi George,

    do not use the class /UI2/CL_JSON_SERIALIZER - it is deprecated and not supported any more; it was left in only for compatibility reasons.

    If you need a portable class for JSON serialization/deserialization, /UI2/CL_JSON is the proper one.

    About the: "I am wondering if we can determine the export abap structure at runtime". 

    This feature is not supported, because I do not see a reason for it. If you deserialize something, you still need to be able to read it in a typed way in ABAP. So the idea is that you need to know the target data structure (or at least a partial structure, which will also work) to deserialize JSON, and then work with your typed target.

    But if you for some reason need the ability to deserialize JSON into the source ABAP structure in a generic way, you may extend both the serialize/deserialize methods (you have copied the class anyway) and wrap the outputs/inputs of /UI2/CL_JSON with technical metadata describing the source ABAP structure, then use this information during deserialization. For example:

    TYPES: BEGIN OF ts_json_meta,
             abap_type LIKE cl_abap_typedescr=>absolute_name,
             data      TYPE string,
           END OF ts_json_meta.
    DATA: lt_flight TYPE STANDARD TABLE OF sflight,
          lv_json   TYPE string,
          lo_data   TYPE REF TO data,
          ls_json   TYPE ts_json_meta.
    FIELD-SYMBOLS: <data> TYPE any.
    SELECT * FROM sflight INTO TABLE lt_flight.
    " serialize table lt_flight into JSON, skipping initial fields and converting ABAP field names into camelCase
    ls_json-data      = /ui2/cl_json=>serialize( data = lt_flight compress = abap_true pretty_name = /ui2/cl_json=>pretty_mode-camel_case ).
    ls_json-abap_type = cl_abap_typedescr=>describe_by_data( lt_flight )->absolute_name.
    lv_json           = /ui2/cl_json=>serialize( data = ls_json compress = abap_true pretty_name = /ui2/cl_json=>pretty_mode-camel_case ).
    WRITE / lv_json.
    CLEAR: ls_json, lt_flight.
    " deserialize JSON string json into internal table lt_flight doing camelCase to ABAP like field name mapping
    TRY.
        /ui2/cl_json=>deserialize( EXPORTING json = lv_json pretty_name = /ui2/cl_json=>pretty_mode-camel_case CHANGING data = ls_json ).
        CREATE DATA lo_data TYPE (ls_json-abap_type).
        ASSIGN lo_data->* TO <data>.
        /ui2/cl_json=>deserialize( EXPORTING json = ls_json-data pretty_name = /ui2/cl_json=>pretty_mode-camel_case CHANGING data = <data> ).
      CATCH cx_sy_move_cast_error.
        CLEAR: lo_data.
    ENDTRY.
    IF lo_data IS NOT INITIAL.
      " check here lo_data
      BREAK-POINT.
    ENDIF. 

     

    Best regards,

    Alexey.

    1. thank you Alexey! completely agree with you

       

      George

  14. Hi all,

    This serialization does not work with REF TO DATA, which it assumes to be objects ==> CX_SY_MOVE_CAST_ERROR.

    Do you have any solution that works with "REF TO DATA" ?

    Taryck.

    1. Do you mean the type mismatch? Can you give an example of what you expect as the input parameter, a TYPE REF TO DATA?

       

    2. Hi Taryck,

      as George mentioned: please provide a test example and the expected result. I will check.

      In general, serialization of TYPE REF TO DATA shall also work (as in the example above for the reference to LT_FLIGHT), but I have not done extensive tests for such use cases.

      Best regards,

      Alexey

  15. OK here it is :

     

    data t_param type abap_parmbind_tab.
    data s_param like line of t_param.
    data integer type int4.
    data str     type string.

    s_param-name = 'INTEGER'.
    s_param-kind = 'E'.
    integer = 3.
    get reference of integer into s_param-value.
    insert s_param into table t_param.

    str = /ui2/cl_json=>serialize( data = t_param ).

     

    The error is in the method DUMP: when "type_descr->kind = kind_ref" you assume it is an object. If you look at type_descr->type_kind you'll see it is the dref type kind...

    1. Hello Taryck,

      thanks for the example. Yes, I see - I will try to extend that soon.

      As you know, this is not an official SAP JSON parser, so you can copy and extend it as you like and share your suggestions here. Everyone will appreciate a valuable contribution!

      BR, Alexey.

       

      1. Hi,

        OK, thanks. But how can it be unofficial and yet be part of SAP's packages?

        I've tried to find solutions for serializing, and above all deserializing, REF TO DATA, but have not found any.

        I've tried CALL TRANSFORMATION id OPTIONS data_ref = 'embedded' ... which is OK for serialization but fails on deserialization...

        1. The /UI2/CL_JSON class was created to solve specific needs of UI2 services and is not intended to be a generic solution for anyone needing JSON serialize/deserialize abilities (see the introduction part of the article). For a generic solution, use CALL TRANSFORMATION, or request something from the Gateway colleagues, going the official way with messages, dev requests, etc. You can also try the same with /UI2/CL_JSON requests.

          Here I am presenting a local copy anyone can modify, and I try to help people with their questions & requirements, if I have time and see the need.

          I've try CALL TRANSFORMATION ID Options data_ref = 'embedded' ... Which is OK for Serialization but fails on deserialize...

          How do you imagine deserializing a value from typeless JSON into a generic data reference in ABAP (looking back at your example)?

           

           

          1. Well, I imagined that from CALL TRANSFORMATION, because the data type is present in the XML.

            For JSON I'm not an expert, so I expected this could be done. But if only the data is stored, without any type definition, well, I understand that it will be almost impossible...

            1. Hi Taryck,

              I have updated the code to support serialization of data references. Deserialization, as I have already tried to explain, is not possible in the generic case, since JSON does not include type information, and deserialization shall support any JSON, not only JSON previously serialized with this class. But at least the provided code will support your example, with deserialization of simple types in a similar way (so it can deserialize string, Boolean and integer values, but cannot, of course, deserialize into the specific type of the original structure).

              Please verify.

              Best regards,

              Alexey

              1. Hi,

                 

                It's OK. Your example :

                {"ABSOLUTE_NAME":"\\TYPE=%_T00004S00000000O0000012480","ADMIN_TAB":"\\TYPE=%_T00004S00000000O0000012480",  "ADMIN_TAB_LINE":"\\TYPE=%_T00004S00000000O0000012480"

                It provides type definitions, so I thought JSON could handle type definitions also.

                 

                Thanks.

                1. Hello Taryck

                  probably the mentioned example is a little bit misleading. Its purpose was only to show that you can also serialize an ABAP object; it does not serialize type information into JSON. It is just a dump of the CL_ABAP_TYPEDESCR class, which was used as an easy example.

                  BR, Alexey

                   

  16. Hello Alexey,

     

    I see that my code is living inside SAP standard code. One question: I checked the usage of /UI2/CL_JSON. It's mainly used in NWBC. Is there any other usage of this class in the SAP standard? You can write me a direct message. I tried to send you one via SCN but couldn't, since you are not following me.

     

    Kind Regards,

    Ümit Coşkun Aydınoğlu

    1. Hi Coşkun,

      /UI2/CL_JSON is part of SAP coding, but not the standard class for serialization/deserialization of JSON in SAP. We use it in NWBC and in Fiori; it is a public class and part of the UI add-on, so anyone may use it. I do not know the other usages, but I know they exist. There may also be copies of the class encapsulated as local classes, following the guidelines I gave in the article, so there may be even more usages.

      BR, Alexey.

  17. Hi Alexey,

    is there any way to set Pretty Printing in UpperCamelCase instead of lowerCamelCase ?

     

    Best regards

    Diego

  18. Hi Diego,

    there is no such pretty-printing option, but it is easy to build in if you use a local copy of the class, as suggested:

    • introduce a new format => constants: pretty_mode => ucamel_case TYPE char1 VALUE `U`
    • add a new formatting method, or extend the pretty_print method with an optional parameter. The code would set the first character of the name returned by pretty_name to upper case => TRANSLATE out(1) TO UPPER CASE. Or, even better, do it directly in the macro below:
    • update the macro format_name with a new case statement =>
      when pretty_mode-ucamel_case.
        &3 = pretty_name( &1 ).
        TRANSLATE &3(1) TO UPPER CASE.
    • Done.

    But to be honest, you can do it even more easily, without any modification. What you need is just to start all your field names with "_" and use camel_case as the formatting option. I assume it will result in the formatting you need.

    BR, Alexey. 

    1. Hi Alexey,

      I've tried what you suggested ("start fields with '_'"), but it does not work; that's the reason for my question. Thanks anyway, great work!!

       

      Best Regards

      Diego

       

      1. I would say it is a bug that this does not work. Actually, there is dedicated code blocking the usage of _ in front of names, which I now consider unneeded...

        The quick correction would be to replace the method PRETTY_NAME with the following code:

        METHOD pretty_name.
         
          DATA: tokens TYPE TABLE OF char128.
          FIELD-SYMBOLS: <token> LIKE LINE OF tokens.
         
          out = in.
         
          REPLACE ALL OCCURRENCES OF `__` IN out WITH `*`.
         
          TRANSLATE out TO LOWER CASE.
          TRANSLATE out USING `/_:_~_`.
          SPLIT out AT `_` INTO TABLE tokens.
          LOOP AT tokens ASSIGNING <token> FROM 2.
            TRANSLATE <token>(1) TO UPPER CASE.
          ENDLOOP.
         
          CONCATENATE LINES OF tokens INTO out.
          REPLACE ALL OCCURRENCES OF `*` IN out WITH `_`.
         
        ENDMETHOD.
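
        For readers who want to follow the algorithm without an ABAP system, here is a rough JavaScript transcription of the PRETTY_NAME logic above (the function name prettyName is mine; this is illustrative only, not part of the class):

```javascript
// Sketch of /UI2/CL_JSON pretty_name: double underscores are protected,
// the name is lower-cased, namespace separators become underscores, and
// each underscore-separated token after the first is capitalized.
function prettyName(name) {
  let out = name.split('__').join('*'); // protect __ (survives as one _)
  out = out.toLowerCase();
  out = out.replace(/[/:~]/g, '_');     // '/', ':', '~' -> '_'
  const tokens = out.split('_');
  for (let i = 1; i < tokens.length; i++) {
    if (tokens[i].length > 0) {
      tokens[i] = tokens[i][0].toUpperCase() + tokens[i].slice(1);
    }
  }
  out = tokens.join('');
  return out.split('*').join('_');      // restore protected underscores
}

console.log(prettyName('MY_FIELD_NAME')); // "myFieldName"
console.log(prettyName('FIELD__NAME'));   // "field_name"
```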

         

         

  19. Hi Again,

    I've realized that the class produces invalid JSON when the DATA parameter contains non-printable characters.

    You need to add content cleaning, something like:

    replace all occurrences of regex '[^[:print:]]+(?!$)' in STRING with ` `.

    replace all occurrences of regex '[^[:print:]]+$' in STRING with ''.

     

    P.S.: It's good to know that my code lives in essential parts of SAP like NWBC and Fiori.

    Kind Regards,

    Coşkun

    1. Hello Coskun,

      thanks for the feedback.

      Can you be more concrete and provide an example to check? Do you mean SERIALIZE or DESERIALIZE?

      Why are non-printable characters a problem for JSON? Do you mean they are not allowed inside string literals? For Unicode text processing they are not a problem; only the " has to be escaped.

      In addition, such a non-transparent replace leads to data loss, which is not obvious to the consumer of the API. If one knows one needs to transport binary or not-well-formed content, one should expose the data as XSTRING, which will then be Base64 encoded, without any data loss. Or do the preliminary escaping oneself, before serialization.

      And I have not understood suggested regular expressions:

      What does this mean: '[^[:print:]]+(?!$)'

      • capture all neighboring non-printable characters, plus a negative lookahead (?!$)? Is that supported by the ABAP regex engine (BOOST 1.35)? I thought lookahead in general could only be used in front of the pattern. Is the lookahead there to skip the match when it is at the end of the buffer/line?

      And this: '[^[:print:]]+$'

      • "replace all occurrences" makes no sense here, since there can be at most one match => the trailing non-printable characters. Or is it about multiline content? But even then, only trailing ones...

      I think something like this would be enough, but I will not build it in, since it leads to data loss:

      replace all occurrences of regex '[^[:print:]]' in STRING with ` `.
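
      For what it is worth, strict JSON does require control characters (U+0000 to U+001F) inside string literals to be escaped, which supports escaping over replacing: a raw newline makes JSON.parse fail, while stringify-style escaping round-trips without data loss. A small JavaScript check:

```javascript
// A raw (unescaped) newline inside a JSON string literal is invalid.
let rawControlCharFails = false;
try {
  JSON.parse('"line1\nline2"'); // the \n here is a real newline character
} catch (e) {
  rawControlCharFails = true;
}

// Escaping instead of replacing preserves the data exactly.
const escaped = JSON.stringify('line1\nline2'); // '"line1\\nline2"'
const roundTrip = JSON.parse(escaped);

console.log(rawControlCharFails, roundTrip === 'line1\nline2');
```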

      Best regards,

      Alexey.

       

  20. Hi,

    I've faced several problems/issues that I could not fix without changing the code. Here they are:

    1. The pretty printer does not fit the REST API I had to use:
      1. In my situation I connect SAP to JIRA. JIRA uses custom fields (custom_10110) whose numbers change (or might change) from one server to another (DEV to Prod). So I can't have separate developments for each server. Instead, I use the functional name of the custom field (My_Field) in my ABAP structure, but I need in the JSON class a kind of specific pretty printer that maps My_Field to custom_10110 on DEV and custom_10221 on Prod...
      2. I suggest adding a new pretty printer mode, "user specific", that uses a class to translate field names from SAP to JSON and from JSON to SAP.
    2. JSON fields can't be set to null when compress is ON:
      1. I've got fields that I need to set to blank/null but still require compress mode, because I've got other ABAP fields that must not be modified (with no compress they would be set to blank/null and then have to be deleted from the data) or that don't exist in this REST API, since the fields that are allowed to be changed can vary depending on the JIRA request status.
      2. I suggest adding an exception list (table or class) of fields that are not compressed when compress is on.

    I've made these two changes on your previous version and it works fine.

    Taryck.

    1. Hi Taryck,

      post your code here; I will try to integrate it, or will check how the same functionality can be achieved.

      • User-specific pretty printing of names on serialization/deserialization will probably not be so problematic, and comes without a performance penalty for the standard use case...
      • Fields that shall be ignored on "compress" may be more problematic, since the checks are done in several places and I would need to do type reflection beforehand - this may influence performance...

      But nevertheless, show what you have, and I will check what I can do (smile)

      Best regards,

      Alex.

      1. I have already done a solution for unwanted compression, but only for booleans, by introducing the TRIBOOL type; I doubt, though, that a similar solution (using types) is the best in general. And a table of exceptions, at least by field names, is also not bulletproof, since you can have the same field names in different structures and may want different compression behavior for them.

      2. Hi,

         

        This is a quick workaround; I guess the best way would be to use an interface and provide a pretty-printing class.

        Here the main method added :

           <ITEM CMPNAME="DFS_JIRA_CUSTOM_FIELD_NAME" STATE="1" MTDDECLTYP="1">

            <TEXTS>

             <ITEM LANG="E" TEXT="ABAP Name to  JIRA cutom field name"/>

             <ITEM LANG="F" TEXT="ABAP Name to  JIRA cutom field name"/>

            </TEXTS>

            <PARAMETERS>

             <ITEM SCONAME="IN" CMPTYPE="1" PARPASSTYP="1" TYPTYPE="1" TYPE="CSEQUENCE">

              <TEXTS>

               <ITEM LANG="E" TEXT="IN"/>

               <ITEM LANG="F" TEXT="IN"/>

              </TEXTS>

             </ITEM>

             <ITEM SCONAME="OUT" CMPTYPE="1" PARDECLTYP="3" TYPTYPE="1" TYPE="STRING">

              <TEXTS>

               <ITEM LANG="E" TEXT="OUT"/>

               <ITEM LANG="F" TEXT="OUT"/>

              </TEXTS>

             </ITEM>

            </PARAMETERS>

            <SOURCE>`

          FIELD-SYMBOLS: &lt;m&gt; LIKE LINE OF dfs_fields_abap_2_jira.

          CLEAR dfs_no_compress.

          READ TABLE dfs_fields_abap_2_jira ASSIGNING &lt;m&gt;

               WITH TABLE KEY abap_fieldname = in.

          IF sy-subrc = 0.

            out = &lt;m&gt;-jira_customfield.

            READ TABLE dfs_nocomp_list TRANSPORTING NO FIELDS WITH TABLE KEY table_line = in.

            IF sy-subrc = 0.    dfs_no_compress = abap_true.     ENDIF.

          ELSE.

            out = in.

          ENDIF.</SOURCE>

        And here are the call in your code :

           <ITEM CMPNAME="PRETTY_NAME" EXPOSURE="2" STATE="1" MTDDECLTYP="1">

            <SOURCE>`

          STATICS: st_cache TYPE HASHED TABLE OF pretty_name_pair WITH UNIQUE KEY in.

        ...

          IF sy-subrc IS INITIAL.

            out = &lt;cache&gt;-out.

          ELSE.

            out = dfs_jira_custom_field_name( in ).

            REPLACE ALL OCCURRENCES OF `__` IN out WITH `*`.

        ...

           INSERT cache INTO TABLE st_cache.

          ENDIF.</SOURCE>

        And

           <ITEM CMPNAME="PRETTY_NAME_EX" EXPOSURE="2" STATE="1" MTDDECLTYP="1">

            <SOURCE>`

          STATICS: st_cache TYPE HASHED TABLE OF pretty_name_pair WITH UNIQUE KEY in.

          ...

          READ TABLE st_cache WITH TABLE KEY in = in ASSIGNING &lt;cache&gt;.

          IF sy-subrc IS INITIAL.

            out = &lt;cache&gt;-out.

          ELSE.

            out = dfs_jira_custom_field_name( in ).

            REPLACE ALL OCCURRENCES OF `___` IN out WITH `.`.

        ...

            INSERT cache INTO TABLE st_cache.

          ENDIF.</SOURCE>

        And

           <ITEM CMPNAME="PRETTY_NAME_LOWER_CASE" EXPOSURE="2" STATE="1" MTDDECLTYP="1">

            <SOURCE>`

          out = dfs_jira_custom_field_name( in ).

          TRANSLATE out TO LOWER CASE.                            &quot;#EC SYNTCHAR

            </SOURCE>

        And

           <ITEM CMPNAME="PRETTY_NAME_OTHERS" EXPOSURE="2" STATE="1" MTDDECLTYP="1">

            <SOURCE>`

          out = dfs_jira_custom_field_name( in ).

           </SOURCE>

        This was for field names

         

      3. And this is for compress: define the list of fields that won't be compressed.

           <ITEM CMPNAME="DFS_SET_NO_COMP_FIELD_LIST" EXPOSURE="2" STATE="1" MTDDECLTYP="1">

            <TEXTS>

             <ITEM LANG="E" TEXT="Define not compressed field list"/>

            </TEXTS>

            <PARAMETERS>

             <ITEM SCONAME="T_LIST" CMPTYPE="1" PARPASSTYP="1" TYPTYPE="1" TYPE="TT_ABAP_FIELDLIST" PAROPTIONL="X"/>

            </PARAMETERS>

            <SOURCE>`

          DATA wt_list TYPE tt_abap_fieldlist.

          FIELD-SYMBOLS: &lt;l&gt; LIKE LINE OF wt_list.

          wt_list = t_list.

          SORT wt_list. DELETE ADJACENT DUPLICATES FROM wt_list.

          LOOP AT wt_list ASSIGNING &lt;l&gt;.

            TRANSLATE &lt;l&gt; TO UPPER CASE.    &quot; NOTEXT

          ENDLOOP.

          dfs_nocomp_list = wt_list.</SOURCE>

        And the method that determines whether compression applies or not:

           <ITEM CMPNAME="DFS_DO_COMPRESS" STATE="1" MTDDECLTYP="1">

            <TEXTS>

             <ITEM LANG="E" TEXT="Do we compress this field ?"/>

            </TEXTS>

            <PARAMETERS>

             <ITEM SCONAME="COMPRESS" CMPTYPE="1" PARPASSTYP="1" TYPTYPE="1" TYPE="ABAP_BOOL">

              <TEXTS>

               <ITEM LANG="E" TEXT="compressionis asked"/>

              </TEXTS>

             </ITEM>

             <ITEM SCONAME="FIELDNAME" CMPTYPE="1" PARPASSTYP="1" TYPTYPE="1" TYPE="CSEQUENCE">

              <TEXTS>

               <ITEM LANG="E" TEXT="field name"/>

              </TEXTS>

             </ITEM>

             <ITEM SCONAME="RESULT" CMPTYPE="1" PARDECLTYP="3" TYPTYPE="1" TYPE="ABAP_BOOL">

              <TEXTS>

               <ITEM LANG="E" TEXT="Compress or not"/>

              </TEXTS>

             </ITEM>

            </PARAMETERS>

            <SOURCE>`

          DATA wl_fieldname TYPE td_abap_fieldname.

          IF compress = abap_false. RETURN.   ENDIF.

          wl_fieldname = fieldname.

          TRANSLATE wl_fieldname TO UPPER CASE.

          READ TABLE dfs_nocomp_list TRANSPORTING NO FIELDS WITH TABLE KEY table_line = wl_fieldname.

          IF sy-subrc &lt;&gt; 0.    result = abap_true.     ENDIF.</SOURCE>

        And changes in your code :

           <ITEM CMPNAME="DUMP" EXPOSURE="2" STATE="1" MTDDECLTYP="1">

            <SOURCE>`

         ...

                LOOP AT lo_classdesc-&gt;attributes ASSIGNING &lt;attr&gt; WHERE is_constant IS INITIAL AND alias_for IS INITIAL AND

                  ( is_interface IS INITIAL OR type_kind NE cl_abap_typedescr=&gt;typekind_oref ).

                  ASSIGN lo_obj_ref-&gt;(&lt;attr&gt;-name) TO &lt;value&gt;.

        *            IF compress IS INITIAL OR &lt;value&gt; IS NOT INITIAL.

                  IF dfs_do_compress( compress = compress fieldname = &lt;attr&gt;-name ) IS INITIAL OR &lt;value&gt; IS NOT INITIAL.

            lo_typedesc = cl_abap_typedescr=>describe_by_data( <value> ).
...
          ENDIF.
        ENDLOOP.
...
    WHEN cl_abap_typedescr=>kind_struct.
      IF data IS INITIAL.
        r_json = `null`.                                    "#EC NOTEXT
      ELSE.
        lo_structdesc ?= type_descr.
        lt_symbols = get_symbols( struct_descr = lo_structdesc pretty_name = pretty_name ).
        LOOP AT lt_symbols ASSIGNING <symbol>.
          ASSIGN COMPONENT <symbol>-idx OF STRUCTURE data TO <value>.
*          IF compress EQ c_bool-false OR <value> IS NOT INITIAL.
          IF dfs_do_compress( compress = compress fieldname = <symbol>-name ) IS INITIAL OR <value> IS NOT INITIAL.
            lv_itemval = dump( data = <value> compress = compress pretty_name = pretty_name assoc_arrays = assoc_arrays ts_as_iso8601 = ts_as_iso8601 type_descr = <symbol>-type ).
            CONCATENATE <symbol>-header lv_itemval INTO lv_itemval.
            APPEND lv_itemval TO lv_properties.
          ENDIF.
        ENDLOOP.
...
    WHEN cl_abap_typedescr=>kind_table.
      lo_tabledescr ?= type_descr.
...
          LOOP AT <table> ASSIGNING <line>.
            CLEAR: lv_fields, lv_prop_name.
            LOOP AT lt_symbols ASSIGNING <symbol>.
              ASSIGN COMPONENT <symbol>-idx OF STRUCTURE <line> TO <value>.
*              IF compress IS INITIAL OR <value> IS NOT INITIAL.
              IF dfs_do_compress( compress = compress fieldname = <symbol>-name ) IS INITIAL OR <value> IS NOT INITIAL.
                IF <symbol>-type->kind EQ cl_abap_typedescr=>kind_elem.
                  lo_elem_descr ?= <symbol>-type.
                  dump_type <value> lo_elem_descr lv_itemval.
                ELSE.
...
          LOOP AT <table> ASSIGNING <line>.
            CLEAR lv_fields.
            LOOP AT lt_symbols ASSIGNING <symbol>.
              ASSIGN COMPONENT <symbol>-idx OF STRUCTURE <line> TO <value>.
*              IF compress IS INITIAL OR <value> IS NOT INITIAL.
              IF dfs_do_compress( compress = compress fieldname = <symbol>-name ) IS INITIAL OR <value> IS NOT INITIAL.
                IF <symbol>-type->kind EQ cl_abap_typedescr=>kind_elem.
...

I've got the same question about the field name for compress. I do not have this trouble with JIRA, but I could suggest, as I did in the ABAP-to-XML conversion of my framework ZAPLINK, keeping a "stack table" that contains the route of tree nodes leading to the field. This helps to know where we are if several fields have the same name.

        1. Hi Taryck,

  as you have seen, I have already posted the updated code, which may cover your needs. I have allowed instance creation of the class and changed several methods from static to instance. This way it became possible to create a custom serializer overriding the desired methods. Looking at your suggestion now, it should work for you. See my examples about the extension of /UI2/CL_JSON at the end of the article. 

  I have not selected the approach of passing a "strategy" for pretty printing, because I would like to keep the solution compact, contained in one class, and I do not want to introduce a new global interface. 

  The only difference from your suggestion is that I use the type, rather than the field name, for detecting fields to be compressed. I find it more flexible than names, since it does not influence the output structure, although it still requires a change to the input ABAP structure... Maybe I will update the code with an additional field name parameter for the IS_COMPRESSABLE method. It should not influence the flow or performance. As for the stack... I would say it may be performance-critical and still not sufficient.
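  A custom compression rule along these lines can be sketched as a subclass (a hedged sketch only: the exact IS_COMPRESSABLE signature and parameter names may differ between versions of /UI2/CL_JSON):

  ```abap
  * Sketch: assumes a redefinable protected method IS_COMPRESSABLE
  * with TYPE_DESCR and NAME importing parameters, as discussed above.
  CLASS lc_json_keep_id DEFINITION FINAL INHERITING FROM /ui2/cl_json.
    PROTECTED SECTION.
      METHODS is_compressable REDEFINITION.
  ENDCLASS.

  CLASS lc_json_keep_id IMPLEMENTATION.
    METHOD is_compressable.
      " never compress (skip) the ID field, even when it is initial
      IF name = `ID`.
        rv_compress = abap_false.
      ELSE.
        rv_compress = super->is_compressable( type_descr = type_descr
                                              name       = name ).
      ENDIF.
    ENDMETHOD.
  ENDCLASS.
  ```

  Serializing through an instance of such a subclass then keeps initial ID values in the output while all other initial fields are still compressed.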

  Please check the current solution and give feedback.

          BR, Alex.

  21. Hi Alexey,

    When I am trying to run your first code, I am getting the error "Field "pretty_name-camel_case" is unknown. It is neither in one of the specified tables nor defined by a "DATA" statement".

    Please help me with this.

    And one more thing: can we create a nested JSON array with this method?

    1. Hi Ankit,

      sorry for the delay with the response - I have not been notified about your post for some reason...

      Can you please provide the example code which does not run? In my examples there is no pretty_name-camel_case, but there is pretty_mode-camel_case. Could it be that you have mistyped it? 

      If I got your second question right: yes, you can serialize an ABAP table with nested tables into a corresponding JSON array with nested arrays.
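      For example (a sketch; the type and field names are illustrative):

      ```abap
      TYPES:
        BEGIN OF ts_item,
          id TYPE i,
        END OF ts_item,
        BEGIN OF ts_group,
          name  TYPE string,
          items TYPE STANDARD TABLE OF ts_item WITH DEFAULT KEY,
        END OF ts_group,
        tt_group TYPE STANDARD TABLE OF ts_group WITH DEFAULT KEY.

      DATA lt_groups TYPE tt_group.
      " ... fill lt_groups ...
      DATA(lv_json) = /ui2/cl_json=>serialize( data = lt_groups ).
      " each group line becomes a JSON object whose ITEMS component
      " is rendered as a nested JSON array of objects
      ```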

      BR, Alexey.

  22. Thank you for your work)

    1. You are welcome, use it in good health! (wink)

  23. Hi all, 

    Sorry for such a beginners question. 

    I have a web service that returns JSON with two fields: one is just a string (RESULT) and the other is a table of entries (OUTTAB) (check the image attached).

    I tried using the sample code from Partial deserialization of JSON into ABAP, but the lt_act columns are empty. (The ID fields are filled with 'RESULT' and 'OUTTAB'.)

    I want to just get the string from RESULT to check whether the call was successful, then bring the OUTTAB table into an internal table. 

    What is the best way of accomplishing this?


    return json sample

    1. Hi Juan,

      yes, you are right, this is a beginner question (wink) It has nothing to do with partial deserialization; you have a standard use case.

      See the code below, which should work:

      TYPES: 
        BEGIN OF t_out,
          nachn   TYPE string,
          vorna   TYPE string,
          actdir  TYPE string,
          pernr   TYPE string,
          stell   TYPE string,
          persk   TYPE string,
          btrtl   TYPE string,
          werks   TYPE string,
          usrid   TYPE string,
          usrty   TYPE string,
          kostl   TYPE string,
          orgeh   TYPE string,
          plans   TYPE string,
          begda1  TYPE d,
          endda1  TYPE d,
          stat2   TYPE string,
          begda2  TYPE d,
          endda2  TYPE d,
        END OF t_out,
        BEGIN OF t_response,
          result TYPE string,
          outtab TYPE STANDARD TABLE OF t_out WITH DEFAULT KEY,
        END OF t_response.
      
      DATA: ls_response TYPE t_response,
            lv_json     TYPE string. " <-- this is your input JSON string
      
      /ui2/cl_json=>deserialize( EXPORTING json = lv_json CHANGING data = ls_response ).

      Best regards,

      Alexey.

      1. Thanks for typing out the solution. It is much more automated than I expected. 


  24. New /ui2/CL_JSON version posted:

    • /UI2/CL_JSON creates an unnecessary wrapping JSON object around the value for name/value tables (tables with 2 fields and 1 unique key)
    • Performance of serialization/deserialization of big tables into/from JSON associative arrays (maps) is slow
    • When trying to deserialize an invalid (not matching) structure from JSON to ABAP, a short dump OBJECTS_MOVE_NOT_SUPPORTED occurs
  25. Thanks for this class.

    I've one question:
    Would it be possible to also include a UTF-8 conversion of the values before writing the JSON output?

    This is the coding that could be used:

    METHOD convert_to_utf8.

      DATA:
        lr_cvto_utf8    TYPE REF TO cl_abap_conv_out_ce,
        lr_converter    TYPE REF TO cl_abap_conv_in_ce,
        lv_utf8_xml_out TYPE xstring.

      lr_cvto_utf8 = cl_abap_conv_out_ce=>create( encoding = 'UTF-8' ).
      lr_cvto_utf8->write( data = i_string ).
      lv_utf8_xml_out = lr_cvto_utf8->get_buffer( ).

      lr_converter = cl_abap_conv_in_ce=>create( input = lv_utf8_xml_out ).
      lr_converter->read( IMPORTING data = r_string ).

    ENDMETHOD.

      Thanks.

    1. Hello Thorsten,

      The serialization into JSON is done in string/textual format (which is represented in the system encoding, which may or may not be Unicode), not xstring (binary data). So it is not possible to dump only the values in UTF-8 (and at the same time as xstring → binary); the whole output would have to be produced as xstring then.

      If the UTF-8 encoding conversion were applied multiple times, once for each value, it would decrease performance significantly. 

      But none of that is needed → the idea is that you apply the conversion to the desired encoding on top of the produced string, after the serialization. It does no harm and works the same way as encoding each string value separately. And because it is a single kernel call, I assume it would also be faster (even if the data size is larger). 
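      A post-serialization conversion along those lines might look like this (a sketch; ls_data is a placeholder for your payload, and cl_abap_codepage is the standard codepage conversion class available from SAP_BASIS 7.02):

      ```abap
      " serialize first, then convert the whole result string to UTF-8 binary
      DATA(lv_json)  = /ui2/cl_json=>serialize( data = ls_data ).
      DATA(lv_xjson) = cl_abap_codepage=>convert_to( source   = lv_json
                                                     codepage = `UTF-8` ).
      " lv_xjson (xstring) can now be sent over HTTP etc.
      ```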

      BR, Alexey.

      P.S.: in the next version of the class (not yet released) I have already added helper methods for converting the result from/to XSTRING, but without allowing to specify the desired encoding. I will try not to forget to extend it with the encoding too (wink)

  26. New /UI2/CL_JSON version posted (note 2382783):

    Fixed:

    • Unescape of symbol '\' on JSON deserialization does not work
    • Short dump on serialization of classes with protected/private attributes
    • Short dump when serializing dynamic, non-existing types

    New:

    • The DESERIALIZE_INT method throws an exception CX_SY_MOVE_CAST_ERROR and stops further processing in case malformed data is found and the STRICT_MODE parameter in the constructor is set to TRUE.
    • Added support of XSTRING as input for deserialization.
  27. Hello,

    I have a question about partial deserialization.

    For example

    where the count of elements and the name of each element (JSON object) vary, and the content of each element may vary too.

    In ABAP I created this data type for the JSON:

    How can I deserialize this JSON and receive something like this in an ABAP table for valueLists:

    ID (string)                                           Content (JSON string)

    locno_vl                                              {….

    zvdog_vl                                             {…


    1. Hi Dmitry,

      it is a little bit tricky, but it may work (smile) 

      You need assoc_arrays_opt = abap_true in addition to assoc_arrays = abap_true when calling deserialize to get it working (because you do not know the underlying structure). I have also updated the example in the article accordingly. 
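      A sketch of such a call (field names are illustrative; lv_json holds your input, and the JSON type is the one declared in /UI2/CL_JSON, which keeps a value as a raw JSON string):

      ```abap
      TYPES:
        BEGIN OF ts_value_list,
          id      TYPE string,             " object key, e.g. locno_vl
          content TYPE /ui2/cl_json=>json, " raw JSON value kept as string
        END OF ts_value_list.

      DATA lt_value_lists TYPE SORTED TABLE OF ts_value_list
                          WITH UNIQUE KEY id.

      /ui2/cl_json=>deserialize( EXPORTING json             = lv_json
                                           assoc_arrays     = abap_true
                                           assoc_arrays_opt = abap_true
                                 CHANGING  data             = lt_value_lists ).
      " each map entry lands as one line: ID = object key,
      " CONTENT = the raw JSON of that element
      ```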

      BR, Alexey.

  28. Hello, I created a method for dynamically generating ABAP data from JSON. If it can be helpful for somebody, here it is:

    1. Enhance the class /UI2/CL_JSON with a public static method (ZZ_GENER_DATA_FROM_JSON in my case) with an importing parameter I_JSON TYPE JSON and a returning parameter ER_DATA TYPE REF TO DATA
    2. Insert code into the method
    zz_gener_data_from_json
      METHOD zz_gener_data_from_json.
    
        DATA: json   TYPE string,
              length TYPE i,
              offset TYPE i.
        DATA: li_json TYPE TABLE OF json,
              BEGIN OF ls_part,
                id    TYPE string,
                value TYPE json,
              END OF ls_part,
              li_part  LIKE SORTED TABLE OF ls_part WITH UNIQUE KEY id,
              l_string TYPE string.
        DATA: lr_ref  TYPE REF TO data,
              lo_type TYPE REF TO cl_abap_datadescr,
              li_comp TYPE abap_component_tab.
        FIELD-SYMBOLS: <l_data>  TYPE any,
                       <ls_data> TYPE any,
                       <li_data> TYPE STANDARD TABLE.
    
        CLEAR er_data.
    
        json = i_json.
    
        REPLACE ALL OCCURRENCES OF `\r\n` IN json WITH cl_abap_char_utilities=>cr_lf.
        REPLACE ALL OCCURRENCES OF `\n`   IN json WITH cl_abap_char_utilities=>newline.
        REPLACE ALL OCCURRENCES OF `\t`   IN json WITH cl_abap_char_utilities=>horizontal_tab.
    
        length = numofchar( json ).
    
        eat_white.
    
        CASE json+offset(1).
          WHEN `{`."result must be a structure
            CALL METHOD deserialize
              EXPORTING
                json             = i_json
                assoc_arrays     = 'X'
                assoc_arrays_opt = 'X'
              CHANGING
                data             = li_part.
            IF li_part IS NOT INITIAL.
              lo_type ?= cl_abap_typedescr=>describe_by_data( p_data = lr_ref ).
              LOOP AT li_part
                  ASSIGNING FIELD-SYMBOL(<ls_part>).
                APPEND VALUE #( name = <ls_part>-id type = lo_type ) TO li_comp.
              ENDLOOP.
              TRY.
                  lo_type = cl_abap_structdescr=>create( p_components = li_comp p_strict = space ).
                CLEANUP.
                  "break drybyakov.
              ENDTRY.
              CREATE DATA er_data TYPE HANDLE lo_type.
              ASSIGN er_data->* TO <ls_data>.
              LOOP AT li_part
                  ASSIGNING <ls_part>.
                ASSIGN COMPONENT <ls_part>-id OF STRUCTURE <ls_data> TO <l_data>.
                <l_data> = zz_gener_data_from_json( i_json = <ls_part>-value ).
              ENDLOOP.
            ENDIF.
          WHEN `[`."result must be a table of ref
            CREATE DATA er_data LIKE TABLE OF lr_ref.
            ASSIGN er_data->* TO <li_data>.
            CALL METHOD deserialize
              EXPORTING
                json = i_json
              CHANGING
                data = li_json.
            LOOP AT li_json
                INTO json.
              APPEND zz_gener_data_from_json( i_json = json ) TO <li_data>.
            ENDLOOP.
          WHEN OTHERS."result must be a simple data
            IF json+offset = 'true' OR "boolean
               json+offset = 'false'.
              er_data = NEW bool( ).
            ELSEIF json+offset = 'null'."null
              "null value
            ELSE.
              IF find( val = json+offset regex = '^\d+\.?\d*$' ) = -1.
                "string
                CREATE DATA er_data LIKE l_string.
              ELSE.
                "number
                IF json+offset CP '.'.
                  "decimals
                  er_data = NEW f( ).
                ELSE.
                  "integer
                  er_data = NEW i( ).
                ENDIF.
              ENDIF.
            ENDIF.
            IF er_data IS BOUND.
              ASSIGN er_data->* TO <l_data>.
              CALL METHOD deserialize
                EXPORTING
                  json = i_json
                CHANGING
                  data = <l_data>.
            ENDIF.
        ENDCASE.
    
      ENDMETHOD.

    Examples of usage

    JSON example
    {"code":"2000","message":"Resource CRUD success","output":{"$schema":"http://json-schema.org/draft-04/schema#","title":"Rule Service","id":"#root","description":"Rule service schema","type":"object","required":["vocabulary","executionContext"],"additionalProperties":false,"properties":{"description":{"description":"Rule service description","type":"string","maxLength":256},"vocabulary":{"description":"Reference to vocabulary resource in repository","type":"string","maxLength":256},"output":{"description":"Reference to output in vocabulary","type":"string","maxLength":256},"readOnly":{"description":"NON-FUNCTIONAL","type":"boolean","default":false},"resultView":{"description":"Indicates if result view should be created","type":"string","enum":["withResultView","resultViewOnly"]},"ruleAssignment":{"description":"Indicates if the rule assignment is done manually by each rule, or automatically according to the service ruleGroup","type":"string","enum":["Automatic","Manual"],"default":"Automatic"},"ruleGroup":{"description":"Condition in rule expression language (REL) for matching rule selection","type":"string","maxLength":5000,"minLength":1},"executionContext":{"description":"Service execution details","id":"#executionContext","type":"object","additionalProperties":false,"properties":{"dataObject":{"description":"Represents the business entity on which the rule will be evaluated","type":"object","required":["name","keys"],"additionalProperties":false,"properties":{"name":{"description":"Data object name","type":"string","maxLength":256},"keys":{"description":"A set of data object fields which is returned when a rule is matched (does not have to be unique)","type":"array","items":{"type":"string","maxLength":256},"uniqueItems":true,"minItems":1}}},"filter":{"description":"A REL-based expression; filters in data objects to be evaluated in business rule","type":"string","maxLength":5000,"minLength":1},"parameters":{"description":"Input parameters; can be persisted in 
the system and can be used in a filter REL-based expression","type":"object","required":["definition"],"additionalProperties":false,"properties":{"definition":{"description":"parameters definition","type":"array","items":{"type":"object","oneOf":[{"$ref":"#basicParameterFirstLevel"},{"$ref":"#structureParameterFirstLevel"},{"$ref":"#dataObjectParameter"}]},"uniqueItems":true},"associations":{"description":"Mapping between parameters and data objects","type":"array","items":{"type":"object","required":["targetDataObject","attributes"],"additionalProperties":false,"properties":{"targetDataObject":{"description":"Name of target data object","type":"string"},"attributes":{"description":"Mapping details","type":"array","uniqueItems":true,"items":{"type":"object","required":["parameterName","targetAttribute"],"properties":{"parameterName":{"description":"Parameter name","type":"string"},"targetAttribute":{"description":"Name of data object's target attribute","type":"string"}}}}}}}}}},"executionContextDefinitions":{"basicParameterFirstLevel":{"id":"#basicParameterFirstLevel","required":["name","dataType"],"additionalProperties":false,"properties":{"name":{"description":"Parameter name","type":"string"},"dataType":{"description":"SAP HANA data type of the attribute","$ref":"#dataType"},"size":{"description":"SAP HANA data type size of the attribute","type":"string","maxLength":256},"businessDataType":{"description":"Data type of the attribute","$ref":"#businessDataType"},"description":{"description":"Parameter description","type":"string"},"persist":{"description":"NON-FUNCTIONAL","type":"boolean","default":false}}},"structureParameterFirstLevel":{"id":"#structureParameterFirstLevel","required":["name","dataType","attributes"],"additionalProperties":false,"properties":{"name":{"description":"Parameter name","type":"string"},"dataType":{"description":"Parameter with Structure or Collection data 
type","type":"string","enum":["Structure","Collection"]},"description":{"description":"Parameter description","type":"string"},"persist":{"description":"NON-FUNCTIONAL","type":"boolean","default":false},"attributes":{"description":"structure attributes","type":"array","items":{"type":"object","oneOf":[{"$ref":"#basicParameter"},{"$ref":"#structureParameter"}]}}}},"basicParameter":{"id":"#basicParameter","required":["name","dataType"],"additionalProperties":false,"properties":{"name":{"description":"Parameter name","type":"string"},"dataType":{"description":"SAP HANA data type of the attribute","$ref":"#dataType"},"size":{"description":"SAP HANA data type size of the attribute","type":"string","maxLength":256},"businessDataType":{"description":"Data type of the attribute","$ref":"#businessDataType"},"description":{"description":"Parameter description","type":"string"}}},"dataObjectParameter":{"id":"#dataObjectParameter","required":["name","dataType","dataObject"],"additionalProperties":false,"properties":{"name":{"description":"Parameter name","type":"string"},"dataType":{"description":"'Data Object' data type","type":"string","enum":["DataObject"]},"description":{"description":"Parameter description","type":"string"},"persist":{"description":"NON-FUNCTIONAL","type":"boolean","default":false},"dataObject":{"description":"Referenced object of the parameter","type":"object","required":["name","identifiers"],"additionalProperties":false,"properties":{"name":{"description":"Data object name","type":"string","maxLength":256},"identifiers":{"description":"Data object identifiers","type":"array","uniqueItems":true,"minItems":1,"items":{"type":"string"}}}}}},"structureParameter":{"id":"#structureParameter","required":["name","dataType","attributes"],"additionalProperties":false,"properties":{"name":{"description":"Parameter name","type":"string"},"dataType":{"description":"Parameter with Structure or Collection data 
type","type":"string","enum":["Structure","Collection"]},"description":{"description":"Parameter description","type":"string"},"attributes":{"description":"Structure attributes","type":"array","items":{"type":"object","oneOf":[{"$ref":"#basicParameter"},{"$ref":"#structureParameter"}]}}}},"businessDataType":{"id":"#businessDataType","description":"Model data type","type":"string","enum":["String","Number","Timestamp","Boolean","TimeSpan","Date","Time"]},"dataType":{"id":"#dataType","description":"SAP HANA data type","type":"string","enum":["CHAR","VARCHAR","NVARCHAR","ALPHANUM","SHORTTEXT","DATE","TIME","TINYINT","SMALLINT","INTEGER","BIGINT","SMALLDECIMAL","DECIMAL","REAL","DOUBLE","TIMESTAMP","SECONDDATE","Condition"]}}},"dependsOn":{"description":"Definition of the dependencies","type":"array","maxItems":1,"uniqueItems":true,"items":{"type":"object","required":["name","package","suffix"],"additionalProperties":false,"properties":{"package":{"type":"string"},"name":{"type":"string"},"suffix":{"type":"string"}}}},"conversionFlagsMap":{"type":"object","additionalProperties":false,"properties":{"isValueListConverted":{"type":"boolean","enum":[true]}}}},"dependencies":{"ruleGroup":{"description":"rule group is only allowed for automatic rule assignment","properties":{"ruleAssignment":{"enum":["Automatic"]}}}}},"details":[]}

    To create the data in ABAP, use this:

    Usage of ABAP data generation
    DATA(lr_ref) = /ui2/cl_json=>ZZ_GENER_DATA_FROM_JSON( i_json = json ). 
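    Since every generated node is a REF TO DATA, consuming the result means dereferencing step by step. A sketch (the component name 'code' comes from the example JSON above; exact casing of generated component names may differ):

    ```abap
    DATA(lr_ref) = /ui2/cl_json=>zz_gener_data_from_json( i_json = json ).

    FIELD-SYMBOLS: <ls_root> TYPE any,
                   <lr_comp> TYPE REF TO data,
                   <lv_code> TYPE any.

    " the root object is generated as a structure of data references
    ASSIGN lr_ref->* TO <ls_root>.
    ASSIGN COMPONENT 'code' OF STRUCTURE <ls_root> TO <lr_comp>.
    IF <lr_comp> IS ASSIGNED AND <lr_comp> IS BOUND.
      " each component is itself a REF TO data, so dereference again
      ASSIGN <lr_comp>->* TO <lv_code>.
    ENDIF.
    ```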

    And you receive this



    1. Hi Dmitry,

      nice piece of code! I have formatted it a little, downported the language constructs to SAP_BASIS 700, and shown how to integrate it without changing /UI2/CL_JSON.

      It would also be nice if you provided a usage example for your code.

      See below my adaptations:

      LC_JSON_CUSTOM
      *----------------------------------------------------------------------*
      *       CLASS lc_json_custom DEFINITION
      *----------------------------------------------------------------------*
      *
      *----------------------------------------------------------------------*
      CLASS lc_json_custom DEFINITION FINAL INHERITING FROM /ui2/cl_json.
        PUBLIC SECTION.
          METHODS:
            generate IMPORTING i_json TYPE json                          
                     RETURNING value(er_data) TYPE REF TO data.
        PROTECTED SECTION.
          METHODS:
            eat_white IMPORTING json TYPE json length TYPE i
                      CHANGING offset TYPE i.
      ENDCLASS.                    "lc_json_custom DEFINITION
      
      *----------------------------------------------------------------------*
      *       CLASS lc_json_custom IMPLEMENTATION
      *----------------------------------------------------------------------*
      *
      *----------------------------------------------------------------------*
      CLASS lc_json_custom IMPLEMENTATION.
      
        METHOD eat_white.
          WHILE offset < length.
            FIND FIRST OCCURRENCE OF json+offset(1) IN sv_white_space.
            IF sy-subrc IS NOT INITIAL.
              EXIT.
            ENDIF.
            offset = offset + 1.
          ENDWHILE.
        ENDMETHOD. "eat_white
        METHOD generate.
      
          TYPES: BEGIN OF ts_field,
                  id    TYPE string,
                  value TYPE json,
                END OF ts_field.
      
          DATA: json   TYPE string,
                length TYPE i,
                offset TYPE i.
      
          DATA: lt_json   TYPE STANDARD TABLE OF json WITH DEFAULT KEY,
                lt_fields TYPE SORTED TABLE OF ts_field WITH UNIQUE KEY id,
                lv_string TYPE string,
                lr_ref    TYPE REF TO data,
                lo_type   TYPE REF TO cl_abap_datadescr,
                lt_comp   TYPE abap_component_tab,
                ls_comp   LIKE LINE OF lt_comp.
      
          FIELD-SYMBOLS: <data>   TYPE ANY,
                         <struct> TYPE ANY,
                         <field>  LIKE LINE OF lt_fields,
                         <table>  TYPE STANDARD TABLE.
      
          CLEAR er_data.
      
          json = i_json.
      
          REPLACE ALL OCCURRENCES OF `\r\n` IN json WITH cl_abap_char_utilities=>cr_lf.
          REPLACE ALL OCCURRENCES OF `\n`   IN json WITH cl_abap_char_utilities=>newline.
          REPLACE ALL OCCURRENCES OF `\t`   IN json WITH cl_abap_char_utilities=>horizontal_tab.
      
          length = NUMOFCHAR( json ).
      
          eat_white( EXPORTING json = json length = length CHANGING offset = offset ).
      
          CASE json+offset(1).
            WHEN `{`."result must be a structure
              deserialize( EXPORTING json = i_json assoc_arrays = c_bool-true assoc_arrays_opt = c_bool-true
                           CHANGING  data = lt_fields ).
              IF lt_fields IS NOT INITIAL.
                lo_type ?= cl_abap_typedescr=>describe_by_data( p_data = lr_ref ).
                LOOP AT lt_fields ASSIGNING <field>.
                  ls_comp-name = <field>-id.
                  ls_comp-type = lo_type.
                  APPEND ls_comp TO lt_comp.
                ENDLOOP.
                TRY.
                    lo_type = cl_abap_structdescr=>create( p_components = lt_comp p_strict = c_bool-false ).
                  CLEANUP.
                    "break drybyakov.
                ENDTRY.
                CREATE DATA er_data TYPE HANDLE lo_type.
                ASSIGN er_data->* TO <struct>.
                LOOP AT lt_fields ASSIGNING <field>.
                  ASSIGN COMPONENT <field>-id OF STRUCTURE <struct> TO <data>.
                  <data> = generate( i_json = <field>-value ).
                ENDLOOP.
              ENDIF.
            WHEN `[`."result must be a table of ref
              deserialize( EXPORTING json = i_json CHANGING data = lt_json ).
              CREATE DATA er_data TYPE TABLE OF REF TO data.
              ASSIGN er_data->* TO <table>.
              LOOP AT lt_json INTO json.
                APPEND INITIAL LINE TO <table> ASSIGNING <data>.
                <data> = generate( i_json = json ).
              ENDLOOP.
            WHEN OTHERS."result must be a simple data (string)
              deserialize( EXPORTING json = i_json CHANGING data = lv_string ).
              CREATE DATA er_data TYPE string.
              ASSIGN er_data->* TO <data>.
              <data> = lv_string.
          ENDCASE.
      
        ENDMETHOD.  "generate
      
      ENDCLASS. "lc_json_custom IMPLEMENTATION

      BR, Alexey.

      1. Alexey, I proposed making it an enhancement, because outside the class /ui2/cl_json we can't use the macro "eat_white"

        And of course I don't mind if you include this into the class /ui2/cl_json in a future release (smile)

        1. Hi Dmitry,

          yes, you are completely right... I have tested the code in a unit test for /ui2/cl_json (a local class) and there it was working because, obviously, macros are visible to local classes. I have extended the code in my previous post with a copy of the eat_white macro (but now as a method).

          I will evaluate taking the code into /UI2/CL_JSON, but to do so, it shall be more comprehensive and well tested (wink) - taking it into the standard also means supporting and bug fixing it (wink) 

          One possible extension I see is to support other end types on generation as well. At least for booleans and integers it may work correctly. Imagine an extension of the last WHEN OTHERS branch, with a regular expression evaluating the value format. 

          Plus, if it is done with inheritance, the method DESERIALIZE may be replaced by DESERIALIZE_INT to avoid redundant creation of /ui2/cl_json instances.

          And the pretty printing of names is not supported (wink)

          What do you think?

          BR, Alexey.

          1. Hi Alexey,

            I can’t agree more (smile).

            Of course, my solution was dictated by the need to do it quickly and with minimal modifications/enhancements.

            A realization as an instance method, with suitable end data types and support for pretty-printed names, would be best.

            1. (smile)  - I was hoping to get all this nice handling of end data types, optimizations and pretty name support from you (wink)

              Actually, with pretty names I was too fast; it cannot work the same way as for deserialization: on deserialization I "map" JSON names to existing ABAP names, but here this is not possible. At least proper handling of unsupported names like "ABC.BCD" or "1" shall be done, though. 
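              A hypothetical name-sanitizing step could look like this (illustrative only; lv_json_name stands for the incoming JSON key, and the rules /UI2/CL_JSON finally applies may differ):

              ```abap
              " derive a valid ABAP component name from an arbitrary JSON key
              DATA lv_abap_name TYPE string.

              lv_abap_name = to_upper( lv_json_name ).
              " replace every character that is not allowed in an ABAP name
              REPLACE ALL OCCURRENCES OF REGEX `[^A-Z0-9_]` IN lv_abap_name WITH `_`.
              " names must not be empty or start with a digit
              IF lv_abap_name IS INITIAL OR lv_abap_name(1) CA `0123456789`.
                CONCATENATE `J_` lv_abap_name INTO lv_abap_name.
              ENDIF.
              " e.g. "ABC.BCD" becomes ABC_BCD, "1" becomes J_1
              ```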

              I will try to integrate your generate method, with proper modifications from me, into the next update of the class, but I cannot place any reference to you in the code. So you need to accept that the copyright goes to SAP SE here. Otherwise it will not work. Is that OK with you?

              BR, Alex.

              1. Hi, Alexey,

                Data type handling can be done like this

                handling suitable data types
                WHEN OTHERS."result must be a simple data
                        IF json+offset = 'true' OR "boolean
                           json+offset = 'false'.
                          er_data = NEW bool( ).
                        ELSEIF json+offset = 'null'."null
                          "null value
                        ELSE.
                          IF find( val = json+offset regex = '^\d+\.?\d*$' ) = -1.
                            "string
                            CREATE DATA er_data LIKE l_string.
                          ELSE.
                            "number
                            IF json+offset CP '.'.
                              "decimals
                              er_data = NEW f( ).
                            ELSE.
                              "integer
                              er_data = NEW i( ).
                            ENDIF.
                          ENDIF.
                        ENDIF.
                        IF er_data IS BOUND.
                          ASSIGN er_data->* TO <l_data>.
                          CALL METHOD deserialize
                            EXPORTING
                              json = i_json
                            CHANGING
                              data = <l_data>.
                        ENDIF.

                For pretty name support we can use the case function from_mixed( ... ), and for handling incorrect ABAP structure names (and for pretty names too) we can use a mapping:

                1. Create a public class (smile) with a public attribute JSON_NAME = <ls_part>-id and a public attribute ABAP_NAME = to_upper( <ls_part>-id ) if this is a correct ABAP name, or any other generated correct ABAP name otherwise. This class must also have an attribute SUB_MAPP, a table of references to <this class> (for deeply nested structures).
                2. Add an exporting parameter to the generate method with a type like <this-class>->sub_mapp and an optional importing parameter of type ref to <this class>.
                3. Fill this mapping in the generate method.

                YES, I don’t mind. I share this small piece of code with everybody and you like smbd else can use it at your discretion

                (I updated my post with the zz_gener_data_from_json method with type handling.)

                P.S. if there will be such kind of mapping it would be nice to have the mapping in serialization too (smile)


                1. Hi Dmitry,

                  Please find below my updated example for an extension of /UI2/CL_JSON. I have integrated the code for end type handling and added checks for proper ABAP component names, plus support for pretty printing of names (only camel case; let us see if this is sufficient). I was also thinking of using elementary types directly, not references to them, but unfortunately it is not easy to handle this for JSON arrays, because you never know whether all elements in a JSON array have the same type. (sad) And it would also be inconvenient for consumption if some elements were references and some not.

                  LC_JSON_CUSTOM
                  *----------------------------------------------------------------------*
                  *       CLASS lc_json_custom DEFINITION
                  *----------------------------------------------------------------------*
                  *
                  *----------------------------------------------------------------------*
                  CLASS lc_json_custom DEFINITION FINAL INHERITING FROM /ui2/cl_json.
                  
                    PUBLIC SECTION.
                      CLASS-METHODS:
                        generate IMPORTING json TYPE json
                                           pretty_name TYPE pretty_name_mode DEFAULT pretty_mode-none
                                 RETURNING value(rr_data) TYPE REF TO data.
                      METHODS:
                        generate_int IMPORTING json TYPE json
                                     RETURNING value(rr_data) TYPE REF TO data.
                  
                    PROTECTED SECTION.
                      CLASS-METHODS:
                        eat_white IMPORTING json TYPE json length TYPE i
                                  CHANGING offset TYPE i.
                  
                  ENDCLASS.                    "lc_json_custom DEFINITION
                  
                  *----------------------------------------------------------------------*
                  *       CLASS lc_json_custom IMPLEMENTATION
                  *----------------------------------------------------------------------*
                  *
                  *----------------------------------------------------------------------*
                  CLASS lc_json_custom IMPLEMENTATION.
                  
                    METHOD eat_white.
                      WHILE offset < length.
                        FIND FIRST OCCURRENCE OF json+offset(1) IN sv_white_space.
                        IF sy-subrc IS NOT INITIAL.
                          EXIT.
                        ENDIF.
                        offset = offset + 1.
                      ENDWHILE.
                    ENDMETHOD. "eat_white
                  
                    METHOD generate.
                  
                      DATA: lo_json   TYPE REF TO lc_json_custom,
                            lv_json   LIKE json.
                  
                      lv_json = json.
                  
                      REPLACE ALL OCCURRENCES OF `\r\n` IN lv_json WITH cl_abap_char_utilities=>cr_lf.
                      REPLACE ALL OCCURRENCES OF `\n`   IN lv_json WITH cl_abap_char_utilities=>newline.
                      REPLACE ALL OCCURRENCES OF `\t`   IN lv_json WITH cl_abap_char_utilities=>horizontal_tab.
                  
                      CREATE OBJECT lo_json
                        EXPORTING
                          pretty_name      = pretty_name
                          assoc_arrays     = c_bool-true
                          assoc_arrays_opt = c_bool-true.
                  
                      TRY .
                          rr_data = lo_json->generate_int( lv_json ).
                        CATCH cx_sy_move_cast_error.
                      ENDTRY.
                  
                    ENDMETHOD.                    "generate
                  
                    METHOD generate_int.
                  
                      TYPES: BEGIN OF ts_field,
                              name  TYPE string,
                              value TYPE json,
                            END OF ts_field.
                  
                      DATA: length TYPE i,
                            offset TYPE i.
                  
                      DATA: lt_json   TYPE STANDARD TABLE OF json WITH DEFAULT KEY,
                            lv_json   LIKE LINE OF lt_json,
                            lt_fields TYPE SORTED TABLE OF ts_field WITH UNIQUE KEY name,
                            lo_type   TYPE REF TO cl_abap_datadescr,
                            lt_comp   TYPE abap_component_tab,
                            ls_comp   LIKE LINE OF lt_comp.
                  
                      FIELD-SYMBOLS: <data>   TYPE ANY,
                                     <struct> TYPE ANY,
                                     <field>  LIKE LINE OF lt_fields,
                                     <table>  TYPE STANDARD TABLE.
                  
                      length = numofchar( json ).
                  
                      eat_white( EXPORTING json = json length = length CHANGING offset = offset ).
                  
                      CASE json+offset(1).
                        WHEN `{`."result must be a structure
                          restore_type( EXPORTING json = json length = length CHANGING  data = lt_fields ).
                          IF lt_fields IS NOT INITIAL.
                            ls_comp-type = cl_abap_refdescr=>get_ref_to_data( ).
                            LOOP AT lt_fields ASSIGNING <field>.
                              ls_comp-name = <field>-name.
                              TRANSLATE ls_comp-name USING `/_:_~_._-_`. " remove characters not allowed in component names
                              IF mv_pretty_name EQ pretty_mode-camel_case OR mv_pretty_name EQ pretty_mode-extended.
                                REPLACE ALL OCCURRENCES OF REGEX `([a-z])([A-Z])` IN ls_comp-name WITH `$1_$2`. "#EC NOTEXT
                              ENDIF.
                              APPEND ls_comp TO lt_comp.
                            ENDLOOP.
                            TRY.
                                lo_type = cl_abap_structdescr=>create( p_components = lt_comp p_strict = c_bool-false ).
                                CREATE DATA rr_data TYPE HANDLE lo_type.
                                ASSIGN rr_data->* TO <struct>.
                                LOOP AT lt_fields ASSIGNING <field>.
                                  ASSIGN COMPONENT sy-tabix OF STRUCTURE <struct> TO <data>.
                                  <data> = generate_int( <field>-value ).
                                ENDLOOP.
                              CATCH cx_sy_create_data_error cx_sy_struct_creation.
                            ENDTRY.
                          ENDIF.
                        WHEN `[`."result must be a table of ref
                          restore_type( EXPORTING json = json length = length CHANGING  data = lt_json ).
                          CREATE DATA rr_data TYPE TABLE OF REF TO data.
                          ASSIGN rr_data->* TO <table>.
                          LOOP AT lt_json INTO lv_json.
                            APPEND INITIAL LINE TO <table> ASSIGNING <data>.
                            <data> = generate_int( lv_json ).
                          ENDLOOP.
                        WHEN OTHERS.
                          IF json+offset(1) EQ `"`.
                            CREATE DATA rr_data TYPE string.
                          ELSEIF json+offset(1) CA `-0123456789.`.
                            IF json+offset CS '.'.
                              CREATE DATA rr_data TYPE f.
                            ELSE.
                              CREATE DATA rr_data TYPE i.
                            ENDIF.
                          ELSEIF json+offset EQ `true` OR json+offset EQ `false`.
                            CREATE DATA rr_data TYPE abap_bool.
                          ENDIF.
                          IF rr_data IS BOUND.
                            ASSIGN rr_data->* TO <data>.
                            restore_type( EXPORTING json = json length = length CHANGING  data = <data> ).
                          ENDIF.
                      ENDCASE.
                  
                    ENDMETHOD.                    "generate_int
                  
                  ENDCLASS.                    "lc_json_custom IMPLEMENTATION
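
                  For readers trying out the local class above, a minimal usage sketch could look like the following (the JSON payload, component name, and variable names are illustrative assumptions, not part of the original example):

                  ```abap
                  " Hypothetical usage of the local class lc_json_custom defined above.
                  DATA: lr_data TYPE REF TO data.
                  FIELD-SYMBOLS: <data> TYPE any,
                                 <name> TYPE any.

                  " GENERATE builds a matching ABAP data structure on the fly
                  lr_data = lc_json_custom=>generate(
                    json        = `{ "userName": "Smith", "age": 42 }`
                    pretty_name = /ui2/cl_json=>pretty_mode-camel_case ).

                  IF lr_data IS BOUND.
                    ASSIGN lr_data->* TO <data>. " dynamically created structure
                    " with camel_case mode, "userName" should map to component USER_NAME;
                    " note that each component is itself a REF TO data
                    ASSIGN COMPONENT 'USER_NAME' OF STRUCTURE <data> TO <name>.
                  ENDIF.
                  ```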

                   

                  The mapping for serialization/deserialization is already supported. The way to do it is to inherit from /UI2/CL_JSON and redefine the PRETTY_NAME and PRETTY_NAME_EX methods. This was my chosen approach, since I want the logic to be encapsulated in a single class, without external dependencies. Otherwise, of course, one can imagine an external "formatter" class (interface) that is passed to all methods and called for name processing.

                  The current approach for pretty printing/adaptation of names during generation does not allow any flexibility beyond the built-in logic. I do not want to introduce a method for un-pretty-printing that is used only in generation, because it would bring confusion to the current approach. But if feedback shows that additional flexibility is requested, I will look for a better way.

                  I will integrate the code above (with adaptations for internal usage) in the next update/correction for /UI2/CL_JSON.

                  Thanks a lot for your help and great idea (wink)

                  BR, Alexey.

                  1. Hi Alexey,

                    we want to extend the /UI2/CL_JSON class; however, we are on SAP_UI 740 SP16 and, e.g., note 2330592 does not contain corrections for SAP_UI 740. We have already opened an OSS issue, but the only suggestion was to go to SP17, which is currently difficult. What could we do? Why do the notes not contain corrections for SAP_UI 740?

                    What will happen with your next corrections? Still not for SAP_UI 740?

                    Thanks,

                     Wolfgang

                    1. Hi Wolfgang,

                      no good news. I have also just answered the person who processed your message (I assume). If you check the related notes section of the blog, and the notes themselves, you will see that none of them has a correction for SAP_UI 740, only for SAP_UI 750. This is relevant for the notes preceding 2330592 and the notes after it. In general, the mentioned corrections are corrections for UI Add-on V2, but they are automatically integrated into SAP_UI 750, so the corrections are also valid there. To get a correction for SAP_UI 740, we would need to provide a correction for UI Add-on V1, which is not maintained anymore :/ The general recommendation for UI Add-on support on the customer side was → apply the new SP. That is why you got such a response. For me, providing the correction for SAP_UI 740 would mean providing corrections for all previous related notes and all subsequent ones... So, a significant effort. Plus, I am not sure that we could properly test the implementation.

                      I will check what is possible; maybe we can limit it to some SP, but I do not promise anything. As a workaround, you can always use a local class, as recommended in the blog, with the actual code state from the blog.

                      BR, Alexey.

                      1. Hi Alexey,

                        thanks. I have now updated to SP17 and we will check further possibilities.

                        Regards,

                          Wolfgang

                        1. Hello Wolfgang,

                          I have also invested some hours yesterday creating and testing corrections for UI Add-on 1 (SAP_UI 740): for note 2330592 (including previous corrections) and for the subsequent notes, which are already delivered with SP18. I hope the notes will be released on Monday. Because I have updated all notes to include a correction for SAP_UI 740, new notes will also contain corrections for SAP_UI 740 - it is not much effort now to support both add-on versions, since the code basis is the same.

                          My recommendation to you → also apply the subsequent notes once they are re-released. That means 2368774 and 2382783, to have the latest code state.

                          BR, Alexey.

                2. Hi Dmitry,

                  I have created a class (this time it is a local class) that may help in traversing dynamic data, generated by /UI2/CL_JSON=>GENERATE (or your variant):

                  Dynamic Data Accessor Helper Class for ABAP

                  Feedback is appreciated (wink)

                  BR, Alexey.

                   

  29. Hi Alexey,

    Amazing code, first of all (smile). Thanks a lot! My current FIORI project is based on a single GATEWAY service, using several columns and a string field called CONTEXT which contains the whole JSON data. I used this class to send JSON data to the FIORI application and to parse it back after receiving the JSON from FIORI. It decreased my development time a lot. With one service, I finished the whole complicated project by using this class only. No matter how complicated your internal table or structure is, it works fine.

    There was only one issue (I noticed you have fixed it already), about dynamic internal tables whose columns are created dynamically. Now it works like a charm.

    1. Hi Bilen, 

      thanks for the feedback! I am planning to release a new version of the class soon (with support for data generation, contributed by Dmitry Rybyakov), so it will be even more generic. (wink)

      And next time you find a bug, report it here -  I will try to help asap.

      BR, Alexey.

       

      1. Hi Alexey,

        I think I found an issue. I have a test structure as below: HEADER and COLUMNS have a flat structure reference, but my DATA component is dynamically generated using the CL_ABAP_STRUCTDESCR class.

         

        I have GET and CREATE methods, and I am using the same structures during data transfer. After I retrieve the JSON file, I am using:

        myZJSONCLASS=>JSON_DESERIALIZE( exporting IV_JSON = LV_JSON changing CT_DATA = LS_DATA2 ).

        Here it does not assign the field values for the DATA component (the rest is fine). As a workaround, I used

            LR_DATA = myZJSONCLASS=>GENERATE( JSON = LV_JSON ).

        and assigned field variables from the reference. Is that the correct way, or should the initial DESERIALIZE method work?


        1. Hi Bilen,

          as far as I see, it works as designed. The DESERIALIZE method works only with typed data; it does not generate data if it does not know the type. So, basically, you need to provide a ready typed structure to get it filled. It will not assign anything to a REF TO DATA field, because it does not know what to create there. It is a design decision that unmatched JSON attributes are skipped. DESERIALIZE always tries to work in sync with SERIALIZE.

          In contrast to DESERIALIZE, GENERATE always generates the data, and you have very little control over how that is done (which types are used). But it does not require knowing the receiving data type.

          In your example, I would assume it shall work (if not, it is a bug) if you pre-create the DATA field in the LS_DATA2 structure (create an empty table of the expected line type). Then the type is known at runtime and deserialization may work. Previously, without the GENERATE method, I suggested using partial deserialization (see the blog above): deserialize the data into /ui2/cl_json=>json and later repeat the deserialization of this member with the proper receiving data type. Now you can do mostly the same but apply GENERATE to the DATA member (you can extend the structure by one more field and assign the generated data to it). Then you will have partially typed and partially dynamic access. Or use GENERATE alone with fully dynamic access.

          BR, Alexey.

        2. Hi Bilen,

          I thought about your use case... And probably there is a way such an object can be deserialized in a more or less controllable way.

          As I wrote, there is already a way to do partial deserialization: if there is an ABAP field matching a JSON attribute, and that ABAP field has the type /ui2/cl_json=>json (an alias for string), the content of that attribute is copied into the ABAP field as JSON. The idea is to use similar logic and execute generation on top of that JSON string if the matched ABAP field has the type REF TO DATA. I think that may work. For now, as you have seen, such an attribute is simply ignored.

          I will evaluate that option, and maybe it will be included in the next version of /ui2/cl_json.

          BR, Alexey

          1. Thank you, Alexey! Without this class it would have been very hard for me to complete the project (big grin)

            I haven't tried partial deserialization yet, but wrote a generic method for my own local class as below. LR_DATA is filled inside the GENERATE method, and I re-created my dynamic table <LT_DYN_REF>; the rest is below. I have very few cases where tables are dynamic.

             

            field-symbols: <LV_TAB_DATA> type standard table.

            assign LR_DATA->* to field-symbol(<LV_DATA>).
            assign component 'DATA' of structure <LV_DATA> to field-symbol(<LV_REF_DATA>).
            assign <LV_REF_DATA>->* to <LV_TAB_DATA>. " dereference the generated table
            loop at <LV_TAB_DATA> assigning field-symbol(<LV_LINE_DATA>).
              assign <LV_LINE_DATA>->* to field-symbol(<LV_STR_DATA>).
              append initial line to <LT_DYN_REF> assigning field-symbol(<LS_DYN_REF>).
              LR_DYN_COL ?= CL_ABAP_TYPEDESCR=>DESCRIBE_BY_DATA( <LV_STR_DATA> ).
              loop at LR_DYN_COL->COMPONENTS into data(LS_COMP).
                assign component LS_COMP-NAME of structure <LV_STR_DATA> to field-symbol(<LV_FIORI_REF>).
                assign <LV_FIORI_REF>->* to field-symbol(<LV_FIORI_VAL>).
                assign component LS_COMP-NAME of structure <LS_DYN_REF> to field-symbol(<LV_SAP_VAL>).
                <LV_SAP_VAL> = <LV_FIORI_VAL>.
              endloop.
            endloop.



            1. The example is incomplete (wink)

              Yes, that may also work.

              But I assume it can be done more easily, with the deserialize method. Your structure for deserialization should look like this:

              TYPES:
                BEGIN OF ts_data,
                  BEGIN OF header,
                    ...
                  END OF header,
                  columns TYPE STANDARD TABLE OF ...,
                  data TYPE REF TO data,
                END OF ts_data.

              Then, BEFORE calling the DESERIALIZE method, you need to set ts_data-data to a reference to an empty table of the type you need (LT_DYN_REF). In this case, I assume, the deserialize method shall be able to fill it, because the resulting structure is known beforehand.
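
              To make the suggestion concrete, here is a minimal sketch; the line type SFLIGHT and the sample JSON are illustrative assumptions, and (per the discussion above) whether DESERIALIZE fills the pre-created reference depends on your version of the class:

              ```abap
              TYPES: BEGIN OF ts_data,
                       columns TYPE string_table,
                       data    TYPE REF TO data,
                     END OF ts_data.

              DATA: ls_data TYPE ts_data,
                    lv_json TYPE /ui2/cl_json=>json.

              " Pre-create the referenced table, so the target type is known at runtime
              CREATE DATA ls_data-data TYPE STANDARD TABLE OF sflight.

              lv_json = `{ "columns": [ "CARRID" ], "data": [ ] }`.
              /ui2/cl_json=>deserialize( EXPORTING json = lv_json CHANGING data = ls_data ).
              ```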

              BR, Alexey.

              1. hi Alexey,

                 

                 I did the same thing, but it doesn't fill the variable. I am exporting the same structure to a table; later the user changes some variables and saves. Converting the dynamic itab to JSON is fine, but after the save event, when I try to DESERIALIZE, it doesn't work :/ Probably I am doing something wrong, but it is OK, I have already found a workaround.

                I just want to ask: is there a way to parse JSON and get back the subnodes of the JSON? Something like oModel.getProperty("/INPUT").

                In this case, I can call DESERIALIZE using a field symbol (a standard table), like this:

                1) Get json node.

                 /ui2/cl_json=>deserialize( EXPORTING json = lv_node iv_node = 'INPUT' changing CT_DATA = <lt_data> ).

                I am thinking it currently can't deserialize just because my structure contains a data reference. Maybe if we just give the node directly and send the field symbol as a standard table, I can get it back (smile). Maybe partial deserialize is already doing this. I will try now.


                1. Hi Bilen,

                  I would say that if it does not deserialize, even though the type of the referenced data can be detected, it is a bug. I will try to recreate the problem and fix it.

                  About your idea: there is no such "query" support, as in JS, for accessing data, since the JSON itself is not persisted anywhere in the object. It is single-pass parsing, filling things recursively as they are found.

                  But you are heading in the right direction (wink) With partial deserialization, as I wrote, you can get a JSON node as a string and later apply deserialize or generate on it. But that would require you to use a different receiving structure, in which the DATA field has the type /UI2/CL_JSON=>JSON and not REF TO DATA. An easier option, if you do not care too much about performance, is to run DESERIALIZE first on the input JSON, with DATA as REF TO DATA (DATA stays unfilled), and then run the GENERATE method on the same JSON. Then get the reference of the generated DATA field and assign it to the DATA field in the deserialized structure. You can use the dynamic data accessor helper for easy traversal of the generated data.

                  But probably in next version of the class, it will work out of the box.

                  BR, Alexey.

  30. Hi Alexey,

    I use your class with Microsoft Graph API (which is OData v4).

    Unfortunately, they use a few special characters in their property names, namely @ and - (and .), which are illegal in ABAP names. Hence the need for an extension.

    I have a question regarding the extension concept: when I inherit from /UI2/CL_JSON in a new class, e.g. ZJSON, and only redefine PRETTY_NAME_EX, calling ZJSON=>SERIALIZE will still call /UI2/CL_JSON=>SERIALIZE_INT and thus not use the redefinition. Is this intended?

    Currently I inserted in PRETTY_NAME_EX

        REPLACE ALL OCCURRENCES OF `_AT_` IN out WITH `@` IGNORING CASE.
        REPLACE ALL OCCURRENCES OF `_DASH_` IN out WITH `-` IGNORING CASE.

    for dealing with these characters. Do you have a better suggestion?

    Regards,

     Wolfgang

    1. Hi Wolfgang,

      SERIALIZE/DESERIALIZE are static methods that cannot be redefined; they exist to minimize the code needed on the consumer side for serialization/deserialization. Basically, they hide the constructor call and the further call to serialize_int/deserialize_int. And because the static methods in the /UI2/CL_JSON class do not know the subclass name, they create an instance of /UI2/CL_JSON and not of your extension class (wink) So, it is by design. I do not know an elegant way to force them to create an instance of the subclass instead of the superclass.

      So, recommendations are:

      • use instance methods serialize_int/deserialize_int directly or 
      • create static methods SERIALIZE_EX/DESERIALIZE_EX, copying the code from SERIALIZE/DESERIALIZE and just replacing the type of the object with your custom class.
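
      A sketch of the second option might look like this; the class name ZCL_JSON and the exact signatures of SERIALIZE_INT and PRETTY_NAME_EX are assumptions based on the blog, so check them against your installed version:

      ```abap
      CLASS zcl_json DEFINITION INHERITING FROM /ui2/cl_json CREATE PUBLIC.
        PUBLIC SECTION.
          CLASS-METHODS serialize_ex
            IMPORTING data          TYPE data
                      pretty_name   TYPE pretty_name_mode DEFAULT pretty_mode-camel_case
            RETURNING VALUE(r_json) TYPE json.
          METHODS pretty_name_ex REDEFINITION.
      ENDCLASS.

      CLASS zcl_json IMPLEMENTATION.
        METHOD serialize_ex.
          DATA lo_json TYPE REF TO zcl_json.
          " Instantiate the subclass, so the redefined PRETTY_NAME_EX is used
          CREATE OBJECT lo_json
            EXPORTING
              pretty_name = pretty_name.
          r_json = lo_json->serialize_int( data = data ).
        ENDMETHOD.

        METHOD pretty_name_ex.
          " Do not forget the superclass implementation, then apply custom mapping
          out = super->pretty_name_ex( in ).
          REPLACE ALL OCCURRENCES OF `_AT_` IN out WITH `@` IGNORING CASE.
        ENDMETHOD.
      ENDCLASS.
      ```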

      Regarding the adjusting: you can redefine PRETTY_NAME or PRETTY_NAME_EX (or both); just use the proper mode later on serialization/deserialization (see the corresponding section in the blog for details). Do not forget to call the superclass implementation. Take care of the field name length in ABAP; as far as I remember it is 30 characters, and _DASH_ alone eats 6 of them.

      BR, Alexey.

  31. Hello Alexey!

    One of the things I noticed with this solution is that it explicitly does not deal with mapping to class instances.

    I wrote my last JSON parser for ABAP about two years ago; in it, I make provision for mapping to class references as a way to handle "unknown" elements (because some JSON documents contain a mix of "known" and "dynamic" parts). This allows you to then process those parts of the JSON document separately by interrogating their sub-elements, similar to how you are doing it. (The mapper simply assigns the reference of the parsed JSON structure to the data element it encounters.) Not sure if this is an option for this solution as well, but I thought it might be useful to be able to handle both known and unknown elements in the same document.

    I describe my solution here: http://ceronio.net/2015/03/yet-another-abap-json-parser-and-some-other-stuff/. If you scroll down, you will come across a snippet where I demonstrate parsing an Elasticsearch response payload (which contains a variable section) by defining part of the receiving ABAP data structure with a type ref to a class representing a JSON element. (The snippet is actually here: https://gist.github.com/mydoghasworms/2232055307715178e89e#file-elasticsearch_response-abap)

    Anyway, I only discovered /UI2/CL_JSON today after someone brought up REST services from ABAP again and I went looking for ABAP JSON parsers. A lot of people have put effort into JSON ↔ ABAP over the years.

    Kind Regards,

    Martin

    1. Hi Martin,

      thanks for the feedback! 

      I have read your article and understood your approach. I think you can get a similar result with /UI2/CL_JSON, but in different ways (there are two alternatives). My design goal for the ABAP<>JSON parsing solution was to have a single self-contained class that one can easily copy and modify. So, I do not want additional classes/interfaces for handling "JSON objects" (you have one parser class + a JSON object traverser); this imposes some restrictions on the directions I can take in development.

      So, there are two ways /UI2/CL_JSON can handle "unknown" parts in a JSON structure, without the need for new entities.

      1) Partial deserialization (see details in the blog): here, for your "unknown" JSON element, you define a corresponding ABAP field of type /UI2/CL_JSON=>JSON (e.g. SOURCE TYPE /UI2/CL_JSON=>JSON) and the deserializer will put the unknown element/node into that field as a JSON string. Later, one can apply /UI2/CL_JSON=>DESERIALIZE to that field once more, with a properly typed value, to get a typed ABAP object (the analog of your map_data). Of course, at this stage you need to already know the type of the serialized data. This way has been available for a long time.

      2) Implicit generation of ABAP objects on deserialization (see the blog for details): this is the way I introduced in the latest update. With implicit generation, you define an ABAP field of type REF TO DATA and the deserializer generates an ABAP data structure corresponding to the JSON structure. You cannot control how the deserializer assigns types, but if you know the result type at execution time, you can assign to that REF TO DATA field an empty object of the proper type and the deserializer will use it as a "template". For working with the generated dynamic data object, you can use another class of mine, for dynamic data access, which has XPath-like syntax and is exception-safe; it makes life much easier (wink)
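
      The "template" trick from point 2 can be sketched as follows; the field names and sample JSON are illustrative assumptions:

      ```abap
      DATA: BEGIN OF ls_result,
              known   TYPE i,
              unknown TYPE REF TO data,
            END OF ls_result.

      " Without this line, the deserializer would pick types on its own;
      " with it, the pre-created empty table serves as a typed "template".
      CREATE DATA ls_result-unknown TYPE string_table.

      /ui2/cl_json=>deserialize(
        EXPORTING json = `{ "known": 1, "unknown": [ "a", "b" ] }`
        CHANGING  data = ls_result ).
      ```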

      Best regards,

      Alexey.

       

  32. Hello Alexey,

    Thank you for your reply.

    I am also not a great fan of distributing code over many classes in the ABAP repository, but using functional methods gives you a form of expressiveness that is otherwise difficult to achieve in ABAP (at the cost of some performance, perhaps). My workaround is just to create local classes in an include; not pretty, but very portable (smile) (I wonder if SAP will ever introduce proper functions in ABAP to replace subroutines; but that is another discussion.)

    I have tried your first approach on our system with the following code (I see /UI2/CL_JSON=>JSON is just a TYPE STRING):

    data: begin of d1,
            abc type i,
            def type string_table,
            ghi type /ui2/cl_json=>json,
          end of d1.
    data(json) = '{"abc": 25, "def": [ "a", "b", "c"], ghi: { jkl: "some value" } }'.
    data(lr_json) = new /ui2/cl_json( ).
    lr_json->deserialize(
      exporting
        json         = conv #( json )
      changing
        data         = d1 ).

    However, the field 'ghi' is not being filled with the character data (it remains blank).

    My solution for dynamic querying of a document was to provide a generic GET method which you can call on an array (to get a value by index) or an object (to get a value by key). Using this, you can chain method calls together to form a query. You could potentially build an XPath-like query syntax on top of it, which I think should be pretty trivial, but I never got round to it. So you can do, for example:

      try.
          data(json) = '{"abc": 25, "def": { "ghi": [ { "jkl": [ 44, 45, 56 ] } ] } }'.
          data(jdoc) = json_document=>parse( conv #( json ) ).
          data(res) = jdoc->root->get( 'def' )->get( 'ghi' )->get( 1 )->get( 'jkl' )->get( 2 ). "THE QUERY
          write: / cast json_number( res )->value. " -> 45
        catch json_error into data(lx_json).
      endtry.

    Granted, having to know the type for the conversion is a bit clunky; I should probably add another level in my type hierarchy to group together scalar values (wink)

    Regards,

    Martin

    EDIT: I forgot to mention: for my parser, if you want to handle a sub-element dynamically, you define it with TYPE REF TO JSON_VALUE (or OBJECT and cast later), from where you can query it as above.

    data: begin of d1,
            abc type i,
            def type ref to json_value,
          end of d1.
    
    start-of-selection.
      try.
          data(json) = '{"abc": 25, "def": { "ghi": [ { "jkl": [ 44, 45, 56 ] } ] } }'.
          json_document=>parse( conv #( json ) )->map_root( CHANGING data = d1 ).
          data(res) = d1-def->get( 'ghi' )->get( 1 )->get( 'jkl' )->get( 2 ). "QUERY SUB-ELEMENT DYNAMICALLY
          write: / cast json_number( res )->value. " -> 45
        catch json_error into data(lx_json).
      endtry.
    1. Hi Martin,

      you are providing an invalid JSON, that is why it does not work. Try this:

      Partial deserialization of JSON into ABAP
      DATA: BEGIN OF d1,
              abc TYPE i,
              def TYPE string_table,
              ghi TYPE /ui2/cl_json=>json,
            END OF d1.
      
      DATA(json) = `{"abc": 25, "def": [ "a", "b", "c"], "ghi": { "jkl": "some value" } }`.
      /ui2/cl_json=>deserialize( EXPORTING json = json CHANGING data = d1 ).

      And here is a variant with implicit generation:

      Implicit generation of the ABAP object on JSON deserialization
      DATA: BEGIN OF d1,
              abc TYPE i,
              def TYPE string_table,
              ghi TYPE REF TO data,
            END OF d1.
      
      DATA(json) = `{"abc": 25, "def": [ "a", "b", "c"], "ghi": { "jkl": "some value" } }`.
      /ui2/cl_json=>deserialize( EXPORTING json = json CHANGING data = d1 ).

      And if you want to access generated data in a nice way, you can use helper class for dynamic data access (ZCL_DYN_ACCESS) I already mentioned:

      Dynamic access to ABAP object fields generated from unknown JSON
      DATA: lo_data TYPE REF TO zcl_dyn_access,
            lv_val  TYPE string.
      
      CREATE OBJECT lo_data EXPORTING iv_data = d1.
      lo_data->at(`ghi`)->at(`jkl`)->value( IMPORTING ev_data = lv_val ).
       
      "or directly, with XPath like syntax
      zcl_dyn_access=>create( iv_data = d1 iv_component = `ghi-jkl`)->value( IMPORTING ev_data = lv_val ).

      Best regards,

      Alexey.

      1. Ah yes, absolutely. You should perhaps consider raising parse exceptions so one can deal with them.

        As for the rest, I was merely sharing my solution to some of the common problems (in this case querying) that people normally encounter. Not that my solution is better, but it's always nice to have alternatives (thumbs up)

        1. (smile) 

          And I am always interested in the ways people use /UI2/CL_JSON, while looking for new ways to improve it (wink)

          The class actually raises an exception, but it is caught in the DESERIALIZE method. That is done for compatibility reasons. To get the exception propagated, one needs to use the deserialize_int instance method directly. But honestly, the exception is not very detailed. (smile)