Author: Alexey Arsenyev
Submitted: 20.11.2013
Why
There are many other implementations of ABAP to JSON serializers and deserializers on SDN, but for various reasons none of the implementations I found suited my needs. As of SAP_BASIS 7.40 there is also a simple transformation available for converting ABAP to JSON and JSON to ABAP. It is the best choice if you need maximal performance and do not care about the serialization format, but it handles ABAP types and name pretty-printing poorly.
So, I have written my own ABAP JSON serializer and deserializer, which have some key differences from other implementations.
Below you can find a snippet of the ABAP JSON class I wrote, which you can use as a local class, or as a global class after renaming.
The original and up-to-date version of the source can be found in class /UI2/CL_JSON delivered with the UI2 add-on (applicable to SAP_BASIS 700 – 76X). So you can use this ABAP JSON parser in your standard code on almost any system.
What it can
ABAP to JSON
- Serialize classes, structures, internal tables, class and data references, any kind of elementary types. Complex types, such as a table of structures/classes, classes with complex attributes, etc. are also supported and recursively processed.
- ABAP data types are serialized in a JavaScript-adapted way:
- strings, character types to JavaScript string format (no length limitation),
- ABAP_BOOL / BOOLEAN / XFELD / BOOLE_D to JavaScript Boolean,
- Built-in TRIBOOL (TRUE/FALSE/UNDEFINED = 'X'/'-'/'') support, for better control of initial values when serializing into JavaScript Boolean
- int/floats/numeric/packed to JavaScript Integers/floats,
- date/time to JavaScript date/time string representation as "2015-03-24" or "15:30:48",
- timestamp to JavaScript integer or ISO8601 string
- structures to JavaScript objects (include types are also supported; aliases => AS are ignored)
- ABAP internal tables to JSON, i.e. JavaScript arrays or associative arrays (objects)
- Support of conversion exits on ABAP data serialization
- Pretty Printing of JavaScript property names: MY_DATA -> myData, /SAPAPO/MY_DATA -> sapapoMyData.
- Condensing of default values: initial values are not rendered into the resulting JSON string
- Optionally apply JSON formatting/beautifying/pretty-printing to the serialized JSON
- Performance is optimized for processing big internal tables with structures
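To illustrate a few of the points above, a minimal serialization sketch (the structure and field names are invented for illustration): a boolean field is rendered as a JavaScript Boolean, a timestamp as an ISO 8601 string when ts_as_iso8601 is set, and names are pretty-printed to camelCase:

TYPES: BEGIN OF ts_example,
         user_name  TYPE string,
         is_active  TYPE abap_bool,
         changed_at TYPE timestamp,
       END OF ts_example.

DATA: ls_example TYPE ts_example,
      lv_json    TYPE /ui2/cl_json=>json.

ls_example-user_name = 'John'.
ls_example-is_active = abap_true.
GET TIME STAMP FIELD ls_example-changed_at.

" is_active is rendered as true, changed_at as an ISO 8601 string
" like "2018-08-15T15:30:48.0000000Z", field names as camelCase
lv_json = /ui2/cl_json=>serialize( data          = ls_example
                                   pretty_name   = /ui2/cl_json=>pretty_mode-camel_case
                                   ts_as_iso8601 = abap_true ).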
JSON to ABAP
- Deserialize JSON objects, arrays, and any elementary types into corresponding ABAP structures. Complex objects, with embedded arrays and objects with any level of nesting, are also supported.
- Convert JSON to an internal table
- Generic deserialization of JSON objects into reference data types:
- as simple data types (integer, boolean, or string) into a generic data reference (REF TO DATA) -> the ABAP type is selected based on the JSON type.
- as dynamically generated complex object (structures, tables, mixed) for initial REF TO DATA fields
- as typed references for prefilled REF TO DATA fields (you assign a reference to a typed empty data object to the REF TO DATA field at execution time)
- Deserialization of unknown JSON structures is possible using the method GENERATE, which creates the data types on the fly
- On JSON to ABAP transformation, the following rules are used:
- objects are parsed into corresponding ABAP structures, classes (only classes whose constructors have no obligatory parameters are supported), or internal hashed/sorted tables
- arrays are converted to internal tables (complex tables are also supported).
- Booleans are converted to ABAP_BOOL ('' or 'X')
- Date/Time/Timestamp values from JSON are converted based on the type of the corresponding ABAP element
- integers/floats/strings are moved to corresponding fields using ABAP move semantics (strings are unescaped). There is no limit on the size of deserialized strings; the only restriction is the constraints of the receiving data type. Escaped Unicode symbols (\u001F) in strings are decoded.
- elementary data types are converted if they do not match: a JavaScript integer can go into an ABAP string, a JavaScript string into an ABAP integer, etc.
- The transformation takes into account property naming guidelines for JSON and ABAP, so that camelCase names are copied into the corresponding CAMEL_CASE field if a CAMELCASE field is not found in the ABAP structure. Do not forget to use the same PRETTY_MODE for deserialization as you used for serialization.
- Default field values specified in the receiving ABAP variable are preserved and are not overwritten if they are not found in the JSON object
- Transformation of JSON structures into ABAP class instances is NOT supported.
- Support of conversion exits on deserialization
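The reference-handling rules above can be sketched as follows (the JSON payloads are invented for illustration): an initial REF TO DATA receives a dynamically generated type, while a prefilled typed reference keeps its type:

DATA: lr_data  TYPE REF TO data,
      lr_typed TYPE REF TO data.

" untyped: the deserializer generates a matching data type on the fly
/ui2/cl_json=>deserialize( EXPORTING json = '{"a":1,"b":"text"}'
                           CHANGING  data = lr_data ).

" typed: assign a reference to an empty typed data object first;
" the value is then parsed into that type
CREATE DATA lr_typed TYPE i.
/ui2/cl_json=>deserialize( EXPORTING json = '42'
                           CHANGING  data = lr_typed ).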
The parser used for serialization/deserialization does single-pass parsing and is optimized to provide the best possible performance in ABAP in a release-independent way. But for time-critical applications running on kernel version 7.20 or higher, it is recommended to use the built-in JSON to ABAP transformations (CALL TRANSFORMATION). If the transformation does not work for some reason, please consult the following notes: 1650141 and 1648418.
Usage example
CLASS demo DEFINITION.
  PUBLIC SECTION.
    CLASS-METHODS main.
ENDCLASS.

CLASS demo IMPLEMENTATION.
  METHOD main.
    DATA: lt_flight TYPE STANDARD TABLE OF sflight,
          lrf_descr TYPE REF TO cl_abap_typedescr,
          lv_json   TYPE /ui2/cl_json=>json.

    SELECT * FROM sflight INTO TABLE lt_flight.

    " serialize table lt_flight into JSON, skipping initial fields and converting ABAP field names into camelCase
    lv_json = /ui2/cl_json=>serialize( data        = lt_flight
                                       pretty_name = /ui2/cl_json=>pretty_mode-camel_case
                                       compress    = abap_true ).
    cl_demo_output=>write_json( lv_json ).

    CLEAR lt_flight.

    " deserialize JSON string json into internal table lt_flight doing camelCase to ABAP like field name mapping
    /ui2/cl_json=>deserialize( EXPORTING json        = lv_json
                                         pretty_name = /ui2/cl_json=>pretty_mode-camel_case
                               CHANGING  data        = lt_flight ).

    " serialize ABAP object into JSON string
    lrf_descr = cl_abap_typedescr=>describe_by_data( lt_flight ).
    lv_json = /ui2/cl_json=>serialize( data = lrf_descr format_output = abap_true ).
    cl_demo_output=>write_json( lv_json ).
    cl_demo_output=>display( ).
  ENDMETHOD.
ENDCLASS.

START-OF-SELECTION.
  demo=>main( ).
[
{
"mandt":"120",
"carrid":"AA",
"connid":17,
"fldate":"2018-08-15",
"price":422.94,
"currency":"USD",
"planetype":"747-400",
"seatsmax":385,
"seatsocc":268,
"paymentsum":192361.84
},
{
"mandt":"120",
"carrid":"AA",
"connid":17,....
{
"ABSOLUTE_NAME":"\\TYPE=%_T00004S00000000O0000014656",
"DECIMALS":0,
"HAS_UNIQUE_KEY":false,
"INITIAL_SIZE":0,
"KEY":
[
{
"NAME":"MANDT"
},
{
"NAME":"CARRID"
},
{
"NAME":"CONNID....
API description
Two static methods are of most interest in common cases: SERIALIZE and DESERIALIZE. The rest of the public methods are made public only for reuse, in case you would like to build or extend your own serialization/deserialization code.
SERIALIZE : Serialize ABAP object into JSON
- > DATA (any) - any ABAP object/structure/table/element to be serialized
- > COMPRESS (bool, default=false) - tells the serializer to skip empty elements/objects during serialization, i.e. everything for which IS INITIAL is true.
- > NAME (string, optional) - optional name of the serialized object. If supplied, the output is '"name" : {...}' instead of '{...}'.
- > PRETTY_NAME (enum, optional) - mode controlling how ABAP field names are transformed into JSON attribute names. See the description below.
- > TYPE_DESCR (ref to CL_ABAP_TYPEDESCR, optional) - if you know object type already - pass it to improve performance.
- > ASSOC_ARRAYS (bool, default = false) - controls how to serialize hash or sorted tables with unique keys. See below for details.
- > ASSOC_ARRAYS_OPT (bool, default = false) - when set, serializer will optimize rendering of name-value associated arrays (hash maps) in JSON
- > TS_AS_ISO8601 (bool, default = false) - tells the serializer to output timestamps in ISO 8601 format.
- > NUMC_AS_STRING (bool, default = false) - controls how NUMC fields are serialized. If set to ABAP_TRUE, NUMC fields are serialized not as integers but as strings, with all leading zeros. Deserialization is compatible with both ways of serializing NUMC data.
- > NAME_MAPPINGS (table) - ABAP<->JSON Name Mapping Table
- > CONVERSION_EXITS (bool, default = false) - use DDIC conversion exits on serialize of values (performance loss!)
- > FORMAT_OUTPUT (bool, default = false) - Indent, add formatting spaces and split in lines serialized JSON
- > HEX_AS_BASE64 (bool, default = true) - Serialize hex values as base64
- < R_JSON - output JSON string.
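For example, the NAME_MAPPINGS table (type /UI2/CL_JSON=>NAME_MAPPINGS, declared in the class below) can override the generated JSON name for individual fields. A minimal sketch; the field and JSON names are invented for illustration:

TYPES: BEGIN OF ts_data,
         schema TYPE string,
       END OF ts_data.

DATA: ls_data    TYPE ts_data,
      lt_mapping TYPE /ui2/cl_json=>name_mappings,
      ls_mapping TYPE /ui2/cl_json=>name_mapping,
      lv_json    TYPE /ui2/cl_json=>json.

ls_mapping-abap = 'SCHEMA'.
ls_mapping-json = '$schema'.
INSERT ls_mapping INTO TABLE lt_mapping.

ls_data-schema = 'test'.
" the field SCHEMA is rendered under the JSON name "$schema"
lv_json = /ui2/cl_json=>serialize( data          = ls_data
                                   name_mappings = lt_mapping ).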
DESERIALIZE : Deserialize ABAP object from JSON string
- > JSON (string) - input JSON object string to deserialize
- > JSONX (xstring) - input JSON object as raw string to deserialize
- > PRETTY_NAME (enum, optional) - mode controlling how JSON field names are mapped to ABAP component names. See the description below.
- > ASSOC_ARRAYS (bool, default = false) - controls how to deserialize JSON objects into hash or sorted tables with unique keys. See below for details.
- > ASSOC_ARRAYS_OPT (bool, default = false) - when set, the deserializer takes into account the optimized rendering of associated arrays (properties) in JSON.
- > TS_AS_ISO8601 (bool, default = false) - tells the deserializer to read timestamps from strings into timestamp fields using ISO 8601 format.
- > NAME_MAPPINGS (table) - ABAP<->JSON Name Mapping Table
- > CONVERSION_EXITS (bool, default = false) - use DDIC conversion exits on deserialize of values (performance loss!)
- > HEX_AS_BASE64 (bool, default = true) - Deserialize hex values as base64
- <> DATA (any) - ABAP object/structure/table/element to be filled from JSON string. If the ABAP structure contains more fields than in the JSON object, the content of unmatched fields is preserved.
GENERATE : Generates ABAP object from JSON
- > JSON (string) - input JSON object string to deserialize
- > PRETTY_NAME (enum, optional) - mode controlling how JSON field names are mapped to ABAP component names. See the description below.
- > NAME_MAPPINGS (table) - ABAP<->JSON Name Mapping Table
- < RR_DATA (REF TO DATA) - reference to ABAP structure/table dynamically generated from JSON string.
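A sketch of GENERATE usage (the JSON payload and component name are invented for illustration); the generated components are accessed dynamically via field symbols:

DATA: lr_data TYPE REF TO data.
FIELD-SYMBOLS: <data> TYPE data,
               <comp> TYPE data.

" generate an ABAP data object with a type created on the fly
lr_data = /ui2/cl_json=>generate( json = '{"name":"test","count":5}' ).

IF lr_data IS BOUND.
  ASSIGN lr_data->* TO <data>.
  " access a generated component dynamically
  ASSIGN COMPONENT 'NAME' OF STRUCTURE <data> TO <comp>.
  IF sy-subrc IS INITIAL.
    " <comp> now points to the value parsed from the "name" property
  ENDIF.
ENDIF.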
In addition to the methods explained above, there are two options that need a wider explanation:
PRETTY_NAME : enumeration of modes, defined as constant /UI2/CL_JSON=>PRETTY_MODE.
- NONE - ABAP component names serialized as is (UPPERCASE).
- LOW_CASE - ABAP component names serialized in lower case
- CAMEL_CASE - ABAP component names serialized in CamelCase, where the symbol "_" is treated as a word separator (and removed from the resulting name).
- EXTENDED - works the same way as CAMEL_CASE but also has extended logic for encoding special characters, such as ".", "@", "~", etc. It shall be used if you need JSON names with characters not allowed in ABAP component names. Do not use it if you do not have special characters in JSON names - performance is slower compared to CAMEL_CASE mode. Example: the ABAP name '__A__SCHEMA' translates into the JSON name '@schema'
Encoding rules (ABAP name → JSON name):
- '__E__' → '!'
- '__N__' → '#'
- '__D__' → '$'
- '__P__' → '%'
- '__M__' → '&'
- '__S__' → '*'
- '__H__' → '-'
- '__T__' → '~'
- '__L__' → '/'
- '__C__' → ':'
- '__V__' → '|'
- '__A__' → '@'
- '__O__' or '___' → '.'
NONE and LOW_CASE work the same way for DESERIALIZE.
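The EXTENDED mode and the encoding rules above can be sketched like this, reusing the '__A__SCHEMA' → '@schema' example (the structure itself is invented for illustration):

TYPES: BEGIN OF ts_ext,
         __a__schema TYPE string,
       END OF ts_ext.

DATA: ls_ext  TYPE ts_ext,
      lv_json TYPE /ui2/cl_json=>json.

ls_ext-__a__schema = 'test'.
" the component __A__SCHEMA is rendered under the JSON name "@schema"
lv_json = /ui2/cl_json=>serialize( data        = ls_ext
                                   pretty_name = /ui2/cl_json=>pretty_mode-extended ).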
ASSOC_ARRAYS :
This option controls the way hashed or sorted tables with unique keys are serialized/deserialized. Normally, ABAP internal tables are serialized into JSON arrays, but in some cases you may want to serialize them as associative arrays (JSON objects), where every row of the table is reflected as a separate property of the JSON object. This can be achieved by setting the ASSOC_ARRAYS parameter to TRUE. If set, the serializer checks for sorted/hashed tables with UNIQUE key(s) and serializes them as objects. The JSON property name reflecting a row is constructed from the values of the key fields, separated by the constant MC_KEY_SEPARATOR = '-'. If the table has only one field marked as key, the value of this single field becomes the property name and is REMOVED from the associated object (to eliminate redundancy). If TABLE_LINE is used as the unique key, the values of all fields construct the key property name (separated by MC_KEY_SEPARATOR). During deserialization, the logic works vice versa: if ASSOC_ARRAYS is set to TRUE and a JSON object matches an internal hashed or sorted table with a unique key, the object is transformed into the table, where every object property is reflected in a separate table row. If the ABAP table has only one key field, the property name is transformed into the value of this key field.
ASSOC_ARRAYS_OPT:
By default, when dumping hashed/sorted tables with a unique key into JSON, the serializer writes the key field as the property name and the remaining fields as the object value of the property:
TYPES: BEGIN OF ts_record,
         key   TYPE string,
         value TYPE string,
       END OF ts_record.
DATA: lt_exp TYPE SORTED TABLE OF ts_record WITH UNIQUE KEY key.

lv_json = /ui2/cl_json=>serialize( data = lt_exp assoc_arrays = abap_true ).
{
"KEY1": {
"value": "VALUE1"
},
"KEY2": {
"value": "VALUE2"
}
}
But if you use the assoc_arrays_opt flag during serialization, the serializer will omit the unnecessary object nesting when dumping simple name/value tables containing only one key field and one value field:
lv_json = /ui2/cl_json=>serialize( data = lt_exp assoc_arrays = abap_true assoc_arrays_opt = abap_true ).
{
"KEY1": "VALUE1",
"KEY2": "VALUE2"
}
For deserialization, the flag tells the deserializer that the value shall be placed in the non-key field of the structure.
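A round trip with this optimized format might look like the sketch below (reusing the ts_record type from the serialization example above):

TYPES: BEGIN OF ts_record,
         key   TYPE string,
         value TYPE string,
       END OF ts_record.

DATA: lt_act  TYPE SORTED TABLE OF ts_record WITH UNIQUE KEY key,
      lv_json TYPE /ui2/cl_json=>json.

lv_json = `{"KEY1":"VALUE1","KEY2":"VALUE2"}`.

" property names fill the key field, property values the single non-key field
/ui2/cl_json=>deserialize( EXPORTING json             = lv_json
                                     assoc_arrays     = abap_true
                                     assoc_arrays_opt = abap_true
                           CHANGING  data             = lt_act ).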
Supported SAP_BASIS releases
The code was tested on SAP_BASIS 7.00 and higher, but I see no reason why it cannot be downported to lower releases too. However, if you plan to use it on SAP_BASIS 7.02 or higher (and do not need property-name pretty-printing), better consider the standard ABAP solution using CALL TRANSFORMATION. It should be faster, since it is implemented in the kernel. See the blog of Horst Keller for details. If you need support on lower SAP_BASIS releases as well as on 7.02 and higher, the best approach may be to modify the provided class to generate the same JSON format as the standard ABAP CALL TRANSFORMATION for JSON does, and redirect the flow to the home-made code or the built-in ABAP transformation depending on the SAP_BASIS release.
Further optimizations
- Be aware that usage of the flag conversion_exits may significantly decrease performance; use it only in cases when you are sure that you need it.
- Escaping property values can be expensive. To optimize performance in this case, you can replace the escaping code with a kernel-implemented function (from the cl_http_utility class, for example) instead of explicit REPLACE ALL OCCURRENCES calls.
- Unescaping can hurt deserialization performance even more, depending on whether your JSON contains encoded \n\r\t\f\b\x sequences. So, avoid using them if you can.
- It is possible to significantly increase serialization/deserialization performance by dropping support for releases below 7.40. This can be realized by moving the base parsing from ABAP to the kernel-implemented classes cl_sxml_string_writer and cl_sxml_string_reader.
Remarks
For optimization reasons, some methods were converted to macros to reduce the overhead of method calls during data type serialization. If performance is not critical in your case and you prefer clean, debuggable code, you can replace the macro calls with the corresponding methods.
The /UI2/CL_JSON code
Below you can find the code itself, which you can use (it corresponds to the state of PL16 - see the value of the version attribute of the class).
If you want to use the class globally, I suggest creating a proxy class in your namespace with a reduced interface (serialize/deserialize only) that calls a local copy (local class) of /UI2/CL_JSON. Then you can easily update to a new version of /UI2/CL_JSON from SDN, or call the UI add-on implementation if it is installed.
*----------------------------------------------------------------------*
* CLASS zcl_json DEFINITION
*----------------------------------------------------------------------*
*
*----------------------------------------------------------------------*
CLASS zcl_json DEFINITION.
PUBLIC SECTION.
TYPE-POOLS abap .
CLASS cl_abap_tstmp DEFINITION LOAD .
CLASS cx_sy_conversion_error DEFINITION LOAD .
TYPES:
json TYPE string,
BEGIN OF name_mapping,
abap TYPE abap_compname,
json TYPE string,
END OF name_mapping,
name_mappings TYPE HASHED TABLE OF name_mapping WITH UNIQUE KEY abap,
ref_tab TYPE STANDARD TABLE OF REF TO data WITH DEFAULT KEY,
bool TYPE char1,
tribool TYPE char1 ,
pretty_name_mode TYPE char1 .
CONSTANTS:
BEGIN OF pretty_mode,
none TYPE char1 VALUE ``,
low_case TYPE char1 VALUE `L`,
camel_case TYPE char1 VALUE `X`,
extended TYPE char1 VALUE `Y`,
user TYPE char1 VALUE `U`,
user_low_case TYPE char1 VALUE `C`,
END OF pretty_mode,
BEGIN OF c_bool,
true TYPE bool VALUE `X`,
false TYPE bool VALUE ``,
END OF c_bool ,
BEGIN OF c_tribool,
true TYPE tribool VALUE c_bool-true,
false TYPE tribool VALUE `-`,
undefined TYPE tribool VALUE ``,
END OF c_tribool,
mc_key_separator TYPE string VALUE `-`, "#EC NOTEXT
version TYPE i VALUE 16.
CLASS-DATA sv_white_space TYPE string READ-ONLY .
CLASS-DATA mc_bool_types TYPE string READ-ONLY VALUE `\TYPE-POOL=ABAP\TYPE=ABAP_BOOL\TYPE=BOOLEAN\TYPE=BOOLE_D\TYPE=XFELD`. "#EC NOTEXT
CLASS-DATA mc_bool_3state TYPE string READ-ONLY VALUE `\TYPE=BOOLEAN`. "#EC NOTEXT
CLASS-DATA mc_json_type TYPE string READ-ONLY .
CLASS-METHODS class_constructor .
CLASS-METHODS string_to_xstring
IMPORTING
in TYPE string
CHANGING
VALUE(out) TYPE any .
CLASS-METHODS xstring_to_string
IMPORTING
in TYPE any
RETURNING
VALUE(out) TYPE string .
CLASS-METHODS raw_to_string
IMPORTING
iv_xstring TYPE xstring
iv_encoding TYPE abap_encoding OPTIONAL
RETURNING
VALUE(rv_string) TYPE string .
CLASS-METHODS string_to_raw
IMPORTING
iv_string TYPE string
iv_encoding TYPE abap_encoding OPTIONAL
RETURNING
VALUE(rv_xstring) TYPE xstring .
CLASS-METHODS dump
IMPORTING
data TYPE data
compress TYPE bool DEFAULT c_bool-false
type_descr TYPE REF TO cl_abap_typedescr OPTIONAL
pretty_name TYPE pretty_name_mode DEFAULT pretty_mode-none
assoc_arrays TYPE bool DEFAULT c_bool-false
ts_as_iso8601 TYPE bool DEFAULT c_bool-false
RETURNING
VALUE(r_json) TYPE json .
CLASS-METHODS deserialize
IMPORTING
json TYPE json OPTIONAL
jsonx TYPE xstring OPTIONAL
pretty_name TYPE pretty_name_mode DEFAULT pretty_mode-none
assoc_arrays TYPE bool DEFAULT c_bool-false
assoc_arrays_opt TYPE bool DEFAULT c_bool-false
name_mappings TYPE name_mappings OPTIONAL
conversion_exits TYPE bool DEFAULT c_bool-false
hex_as_base64 TYPE bool DEFAULT c_bool-true
CHANGING
data TYPE data .
CLASS-METHODS serialize
IMPORTING
data TYPE data
compress TYPE bool DEFAULT c_bool-false
name TYPE string OPTIONAL
pretty_name TYPE pretty_name_mode DEFAULT pretty_mode-none
type_descr TYPE REF TO cl_abap_typedescr OPTIONAL
assoc_arrays TYPE bool DEFAULT c_bool-false
ts_as_iso8601 TYPE bool DEFAULT c_bool-false
expand_includes TYPE bool DEFAULT c_bool-true
assoc_arrays_opt TYPE bool DEFAULT c_bool-false
numc_as_string TYPE bool DEFAULT c_bool-false
name_mappings TYPE name_mappings OPTIONAL
conversion_exits TYPE bool DEFAULT c_bool-false
format_output TYPE bool DEFAULT c_bool-false
hex_as_base64 TYPE bool DEFAULT c_bool-true
RETURNING
VALUE(r_json) TYPE json .
METHODS deserialize_int
IMPORTING
json TYPE json OPTIONAL
jsonx TYPE xstring OPTIONAL
CHANGING
data TYPE data
RAISING
cx_sy_move_cast_error .
CLASS-METHODS generate
IMPORTING
json TYPE json
pretty_name TYPE pretty_name_mode DEFAULT pretty_mode-none
name_mappings TYPE name_mappings OPTIONAL
RETURNING
VALUE(rr_data) TYPE REF TO data .
METHODS serialize_int
IMPORTING
data TYPE data
name TYPE string OPTIONAL
type_descr TYPE REF TO cl_abap_typedescr OPTIONAL
RETURNING
VALUE(r_json) TYPE json .
METHODS generate_int
IMPORTING
json TYPE json
VALUE(length) TYPE i OPTIONAL
CHANGING
data TYPE REF TO data
offset TYPE i DEFAULT 0
RAISING
cx_sy_move_cast_error .
METHODS constructor
IMPORTING
compress TYPE bool DEFAULT c_bool-false
pretty_name TYPE pretty_name_mode DEFAULT pretty_mode-none
assoc_arrays TYPE bool DEFAULT c_bool-false
ts_as_iso8601 TYPE bool DEFAULT c_bool-false
expand_includes TYPE bool DEFAULT c_bool-true
assoc_arrays_opt TYPE bool DEFAULT c_bool-false
strict_mode TYPE bool DEFAULT c_bool-false
numc_as_string TYPE bool DEFAULT c_bool-false
name_mappings TYPE name_mappings OPTIONAL
conversion_exits TYPE bool DEFAULT c_bool-false
format_output TYPE bool DEFAULT c_bool-false
hex_as_base64 TYPE bool DEFAULT c_bool-true
bool_types TYPE string DEFAULT mc_bool_types
bool_3state TYPE string DEFAULT mc_bool_3state
initial_ts TYPE string DEFAULT `""`
initial_date TYPE string DEFAULT `""`
initial_time TYPE string DEFAULT `""` .
CLASS-METHODS bool_to_tribool
IMPORTING
iv_bool TYPE bool
RETURNING
VALUE(rv_tribool) TYPE tribool .
CLASS-METHODS tribool_to_bool
IMPORTING
iv_tribool TYPE tribool
RETURNING
VALUE(rv_bool) TYPE bool .
PROTECTED SECTION.
TYPES:
BEGIN OF t_s_field_cache,
name TYPE string,
type TYPE REF TO cl_abap_datadescr,
elem_type TYPE REF TO cl_abap_elemdescr,
convexit_out TYPE string,
convexit_in TYPE string,
value TYPE REF TO data,
END OF t_s_field_cache ,
BEGIN OF t_s_symbol,
header TYPE string,
compressable TYPE abap_bool,
read_only TYPE abap_bool.
INCLUDE TYPE t_s_field_cache.
TYPES: END OF t_s_symbol ,
t_t_symbol TYPE STANDARD TABLE OF t_s_symbol WITH DEFAULT KEY ,
t_t_field_cache TYPE HASHED TABLE OF t_s_field_cache WITH UNIQUE KEY name ,
name_mappings_ex TYPE HASHED TABLE OF name_mapping WITH UNIQUE KEY json .
TYPES:
BEGIN OF t_s_name_value,
name TYPE string,
value TYPE json,
END OF t_s_name_value .
TYPES:
t_t_name_value TYPE SORTED TABLE OF t_s_name_value WITH UNIQUE KEY name ,
t_t_json TYPE STANDARD TABLE OF json WITH DEFAULT KEY .
TYPES:
BEGIN OF t_s_struct_type,
keys TYPE string,
type TYPE REF TO cl_abap_datadescr,
END OF t_s_struct_type .
TYPES:
t_t_struct_type TYPE SORTED TABLE OF t_s_struct_type WITH UNIQUE KEY keys ,
BEGIN OF t_s_struct_cache_res,
data TYPE REF TO data,
symbols TYPE t_t_symbol,
END OF t_s_struct_cache_res ,
BEGIN OF t_s_struct_cache,
type_descr TYPE REF TO cl_abap_structdescr,
include_aliases TYPE abap_bool,
level TYPE i,
result TYPE t_s_struct_cache_res,
END OF t_s_struct_cache ,
t_t_struct_cache TYPE HASHED TABLE OF t_s_struct_cache WITH UNIQUE KEY type_descr include_aliases level .
CONSTANTS mc_default_indent TYPE string VALUE ` `. "#EC NOTEXT
DATA mv_bool_types TYPE string.
DATA mv_bool_3state TYPE string.
DATA mv_initial_ts TYPE string VALUE `""`. "#EC NOTEXT
DATA mv_initial_date TYPE string VALUE `""`. "#EC NOTEXT
DATA mv_initial_time TYPE string VALUE `""`. "#EC NOTEXT
DATA mv_compress TYPE bool .
DATA mv_pretty_name TYPE pretty_name_mode .
DATA mv_assoc_arrays TYPE bool .
DATA mv_ts_as_iso8601 TYPE bool .
DATA mv_expand_includes TYPE bool .
DATA mv_assoc_arrays_opt TYPE bool .
DATA mv_strict_mode TYPE bool .
DATA mv_numc_as_string TYPE bool .
DATA mv_format_output TYPE bool .
DATA mv_conversion_exits TYPE bool .
DATA mv_hex_as_base64 TYPE bool .
DATA mt_name_mappings TYPE name_mappings .
DATA mt_name_mappings_ex TYPE name_mappings_ex .
DATA mt_struct_type TYPE t_t_struct_type .
DATA mt_struct_cache TYPE t_t_struct_cache .
CLASS-DATA mc_name_symbols_map TYPE string VALUE ` _/_\_:_;_~_._,_-_+_=_>_<_|_(_)_[_]_{_}_@_+_*_?__&_$_#_%_^_'_`. "#EC NOTEXT
CLASS-DATA so_type_s TYPE REF TO cl_abap_elemdescr .
CLASS-DATA so_type_f TYPE REF TO cl_abap_elemdescr .
CLASS-DATA so_type_p TYPE REF TO cl_abap_elemdescr .
CLASS-DATA so_type_i TYPE REF TO cl_abap_elemdescr .
CLASS-DATA so_type_b TYPE REF TO cl_abap_elemdescr .
CLASS-DATA so_type_t_json TYPE REF TO cl_abap_tabledescr .
CLASS-DATA so_type_t_name_value TYPE REF TO cl_abap_tabledescr .
CLASS-METHODS unescape
IMPORTING
escaped TYPE string
RETURNING
VALUE(unescaped) TYPE string .
CLASS-METHODS get_convexit_func
IMPORTING
elem_descr TYPE REF TO cl_abap_elemdescr
input TYPE abap_bool OPTIONAL
RETURNING
VALUE(rv_func) TYPE string .
METHODS dump_symbols
FINAL
IMPORTING
it_symbols TYPE t_t_symbol
opt_array TYPE bool OPTIONAL
format_scope TYPE bool DEFAULT abap_true
level TYPE i
RETURNING
VALUE(r_json) TYPE json .
METHODS get_symbols_struct
FINAL
IMPORTING
type_descr TYPE REF TO cl_abap_structdescr
include_aliases TYPE abap_bool DEFAULT abap_false
data TYPE REF TO data OPTIONAL
level TYPE i DEFAULT 0
RETURNING
VALUE(result) TYPE t_s_struct_cache_res .
METHODS get_symbols_class
FINAL
IMPORTING
type_descr TYPE REF TO cl_abap_classdescr
object TYPE REF TO object OPTIONAL
RETURNING
VALUE(result) TYPE t_t_symbol .
METHODS get_symbols
FINAL
IMPORTING
type_descr TYPE REF TO cl_abap_typedescr
data TYPE REF TO data OPTIONAL
object TYPE REF TO object OPTIONAL
include_aliases TYPE abap_bool DEFAULT abap_false
RETURNING
VALUE(result) TYPE t_t_symbol .
METHODS get_fields
FINAL
IMPORTING
type_descr TYPE REF TO cl_abap_typedescr
data TYPE REF TO data OPTIONAL
object TYPE REF TO object OPTIONAL
RETURNING
VALUE(rt_fields) TYPE t_t_field_cache .
METHODS dump_int
IMPORTING
data TYPE data
type_descr TYPE REF TO cl_abap_typedescr OPTIONAL
convexit TYPE string OPTIONAL
level TYPE i DEFAULT 0
RETURNING
VALUE(r_json) TYPE json .
METHODS is_compressable
IMPORTING
type_descr TYPE REF TO cl_abap_typedescr ##NEEDED
name TYPE csequence ##NEEDED
RETURNING
VALUE(rv_compress) TYPE abap_bool .
METHODS restore
IMPORTING
json TYPE json
length TYPE i
VALUE(type_descr) TYPE REF TO cl_abap_typedescr OPTIONAL
field_cache TYPE t_t_field_cache OPTIONAL
CHANGING
data TYPE data OPTIONAL
offset TYPE i DEFAULT 0
RAISING
cx_sy_move_cast_error .
METHODS restore_type
IMPORTING
json TYPE json
length TYPE i
VALUE(type_descr) TYPE REF TO cl_abap_typedescr OPTIONAL
field_cache TYPE t_t_field_cache OPTIONAL
convexit TYPE string OPTIONAL
CHANGING
data TYPE data OPTIONAL
offset TYPE i DEFAULT 0
RAISING
cx_sy_move_cast_error .
METHODS dump_type
IMPORTING
data TYPE data
type_descr TYPE REF TO cl_abap_elemdescr
convexit TYPE string
RETURNING
VALUE(r_json) TYPE json .
METHODS dump_type_ex
IMPORTING
data TYPE data
RETURNING
VALUE(r_json) TYPE json .
METHODS pretty_name_ex
IMPORTING
in TYPE csequence
RETURNING
VALUE(out) TYPE string .
METHODS generate_int_ex
FINAL
IMPORTING
json TYPE json
length TYPE i
CHANGING
data TYPE data
offset TYPE i .
METHODS pretty_name
IMPORTING
in TYPE csequence
RETURNING
VALUE(out) TYPE string .
CLASS-METHODS escape ##SHADOW[ESCAPE]
IMPORTING
in TYPE any
RETURNING
VALUE(out) TYPE string .
CLASS-METHODS edm_datetime_to_ts
IMPORTING
ticks TYPE string
offset TYPE string OPTIONAL
typekind TYPE abap_typekind
RETURNING
VALUE(r_data) TYPE string .
CLASS-METHODS get_indent
IMPORTING
level TYPE i DEFAULT 0
RETURNING
VALUE(indent) TYPE string .
METHODS generate_struct
CHANGING
fields TYPE t_t_name_value
data TYPE REF TO data .
PRIVATE SECTION.
DATA mv_extended TYPE bool .
CLASS-DATA mc_me_type TYPE string .
CLASS-DATA mc_cov_error TYPE c .
ENDCLASS.
DEFINE escape_json.
&2 = &1.
* replace all occurrences of regex `[\\"]` in &1 with `\\$0`. <-- this is slower than 2 plain replaces
REPLACE ALL OCCURRENCES OF `\` IN &2 WITH `\\`.
REPLACE ALL OCCURRENCES OF `"` IN &2 WITH `\"`.
REPLACE ALL OCCURRENCES OF cl_abap_char_utilities=>cr_lf IN &2 WITH `\r\n`.
REPLACE ALL OCCURRENCES OF cl_abap_char_utilities=>newline IN &2 WITH `\n`.
REPLACE ALL OCCURRENCES OF cl_abap_char_utilities=>horizontal_tab IN &2 WITH `\t`.
END-OF-DEFINITION.
DEFINE is_compressable.
IF mv_compress EQ abap_false.
&3 = abap_false.
ELSEIF mv_extended IS INITIAL.
&3 = abap_true.
ELSE.
&3 = is_compressable( type_descr = &1 name = &2 ).
ENDIF.
END-OF-DEFINITION.
DEFINE dump_type.
IF mv_extended IS INITIAL.
dump_type_int &1 &2 &3 &4.
ELSE.
&3 = dump_type( data = &1 type_descr = &2 convexit = &4 ).
ENDIF.
END-OF-DEFINITION.
DEFINE xstring_to_string_int.
IF mv_hex_as_base64 IS INITIAL.
MOVE &1 TO &2.
ELSE.
&2 = xstring_to_string( &1 ).
ENDIF.
END-OF-DEFINITION.
DEFINE string_to_xstring_int.
IF mv_hex_as_base64 IS INITIAL.
MOVE &1 TO &2.
ELSE.
string_to_xstring( EXPORTING in = &1 CHANGING out = &2 ).
ENDIF.
END-OF-DEFINITION.
DEFINE format_list_output.
IF mv_format_output EQ abap_true AND &2 IS NOT INITIAL.
CONCATENATE `,` lv_indent INTO lv_lb.
CONCATENATE LINES OF &2 INTO &4 SEPARATED BY lv_lb.
CONCATENATE &1 lv_indent &4 indent &3 INTO &4.
ELSE.
CONCATENATE LINES OF &2 INTO &4 SEPARATED BY `,`.
CONCATENATE &1 &4 &3 INTO &4.
ENDIF.
END-OF-DEFINITION. " format_list_output
DEFINE dump_type_int.
IF &4 IS NOT INITIAL AND &1 IS NOT INITIAL.
TRY.
CALL FUNCTION &4
EXPORTING
input = &1
IMPORTING
output = &3
EXCEPTIONS
OTHERS = 1.
IF sy-subrc IS INITIAL.
CONCATENATE `"` &3 `"` INTO &3.
ENDIF.
CATCH cx_root. "#EC NO_HANDLER
ENDTRY.
ELSE.
CASE &2->type_kind.
WHEN cl_abap_typedescr=>typekind_float OR cl_abap_typedescr=>typekind_int OR cl_abap_typedescr=>typekind_int1 OR
cl_abap_typedescr=>typekind_int2 OR cl_abap_typedescr=>typekind_packed OR `8`. " TYPEKIND_INT8 -> '8' only from 7.40.
IF &2->type_kind EQ cl_abap_typedescr=>typekind_packed AND mv_ts_as_iso8601 EQ c_bool-true AND &2->absolute_name CP `\TYPE=TIMESTAMP*`.
IF &1 IS INITIAL.
&3 = mv_initial_ts.
ELSE.
&3 = &1.
IF &2->absolute_name EQ `\TYPE=TIMESTAMP`.
CONCATENATE `"` &3(4) `-` &3+4(2) `-` &3+6(2) `T` &3+8(2) `:` &3+10(2) `:` &3+12(2) `.0000000Z"` INTO &3.
ELSEIF &2->absolute_name EQ `\TYPE=TIMESTAMPL`.
CONCATENATE `"` &3(4) `-` &3+4(2) `-` &3+6(2) `T` &3+8(2) `:` &3+10(2) `:` &3+12(2) `.` &3+15(7) `Z"` INTO &3.
ENDIF.
ENDIF.
ELSEIF &1 IS INITIAL.
&3 = `0`.
ELSE.
&3 = &1.
IF &1 LT 0.
IF &2->type_kind <> cl_abap_typedescr=>typekind_float. "float: sign is already at the beginning
SHIFT &3 RIGHT CIRCULAR.
ENDIF.
ELSE.
CONDENSE &3.
ENDIF.
ENDIF.
WHEN cl_abap_typedescr=>typekind_num.
IF mv_numc_as_string EQ abap_true.
IF &1 IS INITIAL.
&3 = `""`.
ELSE.
CONCATENATE `"` &1 `"` INTO &3.
ENDIF.
ELSE.
&3 = &1.
SHIFT &3 LEFT DELETING LEADING ` 0`.
IF &3 IS INITIAL.
&3 = `0`.
ENDIF.
ENDIF.
WHEN cl_abap_typedescr=>typekind_string OR cl_abap_typedescr=>typekind_csequence OR cl_abap_typedescr=>typekind_clike.
IF &1 IS INITIAL.
&3 = `""`.
ELSEIF &2->absolute_name EQ mc_json_type.
&3 = &1.
ELSE.
escape_json &1 &3.
CONCATENATE `"` &3 `"` INTO &3.
ENDIF.
WHEN cl_abap_typedescr=>typekind_xstring OR cl_abap_typedescr=>typekind_hex.
IF &1 IS INITIAL.
&3 = `""`.
ELSE.
xstring_to_string_int &1 &3.
CONCATENATE `"` &3 `"` INTO &3.
ENDIF.
WHEN cl_abap_typedescr=>typekind_char.
IF &2->output_length EQ 1 AND mv_bool_types CS &2->absolute_name.
IF &1 EQ c_bool-true.
&3 = `true`. "#EC NOTEXT
ELSEIF &1 IS INITIAL AND mv_bool_3state CS &2->absolute_name.
&3 = `null`. "#EC NOTEXT
ELSE.
&3 = `false`. "#EC NOTEXT
ENDIF.
ELSE.
escape_json &1 &3.
CONCATENATE `"` &3 `"` INTO &3.
ENDIF.
WHEN cl_abap_typedescr=>typekind_date.
IF &1 IS INITIAL.
&3 = mv_initial_date.
ELSE.
CONCATENATE `"` &1(4) `-` &1+4(2) `-` &1+6(2) `"` INTO &3.
ENDIF.
WHEN cl_abap_typedescr=>typekind_time.
IF &1 IS INITIAL.
&3 = mv_initial_time.
ELSE.
CONCATENATE `"` &1(2) `:` &1+2(2) `:` &1+4(2) `"` INTO &3.
ENDIF.
WHEN `k`. " cl_abap_typedescr=>typekind_enum
&3 = &1.
CONCATENATE `"` &3 `"` INTO &3.
WHEN OTHERS.
IF &1 IS INITIAL.
&3 = `null`. "#EC NOTEXT
ELSE.
&3 = &1.
ENDIF.
ENDCASE.
ENDIF.
END-OF-DEFINITION.
DEFINE format_name.
CASE &2.
WHEN pretty_mode-camel_case.
&3 = pretty_name( &1 ).
WHEN pretty_mode-extended.
&3 = pretty_name_ex( &1 ).
WHEN pretty_mode-user_low_case.
READ TABLE mt_name_mappings WITH TABLE KEY abap = &1 ASSIGNING <cache>. "#EC WARNOK
IF sy-subrc IS INITIAL.
&3 = <cache>-json.
ELSE.
&3 = &1.
TRANSLATE &3 TO LOWER CASE. "#EC SYNTCHAR
ENDIF.
WHEN pretty_mode-user.
READ TABLE mt_name_mappings WITH TABLE KEY abap = &1 ASSIGNING <cache>. "#EC WARNOK
IF sy-subrc IS INITIAL.
&3 = <cache>-json.
ELSE.
&3 = &1.
ENDIF.
WHEN pretty_mode-low_case.
&3 = &1.
TRANSLATE &3 TO LOWER CASE. "#EC SYNTCHAR
WHEN OTHERS.
&3 = &1.
ENDCASE.
END-OF-DEFINITION.
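" Illustration (comment only, not part of the original class): with default
" serializer settings, format_name maps an ABAP component name USER_NAME to:
"   pretty_mode-camel_case -> userName
"   pretty_mode-low_case   -> user_name
"   pretty_mode-none       -> USER_NAME (unchanged)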
DEFINE restore_reference.
CREATE DATA data TYPE HANDLE &1.
ASSIGN data->* TO <data>.
restore_type( EXPORTING json = json length = length type_descr = &1 CHANGING offset = offset data = <data> ).
END-OF-DEFINITION.
DEFINE throw_error.
RAISE EXCEPTION TYPE cx_sy_move_cast_error.
END-OF-DEFINITION.
DEFINE while_offset_cs.
WHILE offset < length.
FIND FIRST OCCURRENCE OF json+offset(1) IN &1.
IF sy-subrc IS NOT INITIAL.
EXIT.
ENDIF.
offset = offset + 1.
ENDWHILE.
END-OF-DEFINITION.
DEFINE while_offset_not_cs.
WHILE offset < length.
FIND FIRST OCCURRENCE OF &2+offset(1) IN &1.
IF sy-subrc IS INITIAL.
EXIT.
ENDIF.
offset = offset + 1.
ENDWHILE.
END-OF-DEFINITION.
DEFINE eat_white.
while_offset_cs sv_white_space.
IF offset GE length.
throw_error.
ENDIF.
END-OF-DEFINITION.
DEFINE eat_name.
IF json+offset(1) EQ `"`.
mark = offset + 1.
offset = mark.
FIND FIRST OCCURRENCE OF `"` IN SECTION OFFSET offset OF json MATCH OFFSET offset.
IF sy-subrc IS NOT INITIAL.
throw_error.
ENDIF.
match = offset - mark.
&1 = json+mark(match).
offset = offset + 1.
ELSE.
throw_error.
ENDIF.
END-OF-DEFINITION.
DEFINE eat_string.
IF json+offset(1) EQ `"`.
mark = offset + 1.
offset = mark.
IF json+mark(1) EQ `"`.
CLEAR &1.
ELSE.
DO.
FIND FIRST OCCURRENCE OF `"` IN SECTION OFFSET offset OF json MATCH OFFSET pos.
IF sy-subrc IS NOT INITIAL.
throw_error.
ENDIF.
offset = pos.
pos = pos - 1.
" if the quote is escaped, search further
WHILE pos GE 0 AND json+pos(1) EQ `\`.
pos = pos - 1.
ENDWHILE.
match = ( offset - pos ) MOD 2.
IF match NE 0.
EXIT.
ENDIF.
offset = offset + 1.
ENDDO.
match = offset - mark.
&1 = json+mark(match).
" unescape escaped characters, e.g. \\, \", \/ etc.,
" but ONLY if someone really needs the data
IF type_descr IS NOT INITIAL.
&1 = unescape( &1 ).
ENDIF.
ENDIF.
offset = offset + 1.
ELSE.
throw_error.
ENDIF.
END-OF-DEFINITION.
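" Illustration (comment only, not part of the original class): the MOD 2 check
" above counts the backslashes directly before a found quote. In the input
" "a\"b" the first quote candidate is preceded by one backslash (odd count),
" so it is escaped and scanning continues; the closing quote is preceded by
" zero backslashes (even count) and terminates the string.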
DEFINE eat_number.
mark = offset.
while_offset_cs `0123456789+-eE.`. "#EC NOTEXT
match = offset - mark.
&1 = json+mark(match).
END-OF-DEFINITION.
DEFINE eat_bool.
mark = offset.
while_offset_cs `aeflnrstu`. "#EC NOTEXT
match = offset - mark.
IF json+mark(match) EQ `true`. "#EC NOTEXT
&1 = c_bool-true.
ELSEIF json+mark(match) EQ `false`. "#EC NOTEXT
IF type_descr IS BOUND AND mv_bool_3state CS type_descr->absolute_name.
&1 = c_tribool-false.
ELSE.
&1 = c_bool-false.
ENDIF.
ELSEIF json+mark(match) EQ `null`. "#EC NOTEXT
CLEAR &1.
ENDIF.
END-OF-DEFINITION.
DEFINE eat_char.
IF offset < length AND json+offset(1) EQ &1.
offset = offset + 1.
ELSE.
throw_error.
ENDIF.
END-OF-DEFINITION.
CLASS zcl_json IMPLEMENTATION.
METHOD bool_to_tribool.
IF iv_bool EQ c_bool-true.
rv_tribool = c_tribool-true.
ELSEIF iv_bool EQ abap_undefined. " fallback for ABAP_BOOL with value abap_undefined
rv_tribool = c_tribool-undefined.
ELSE.
rv_tribool = c_tribool-false.
ENDIF.
ENDMETHOD. "bool_to_tribool
METHOD class_constructor.
DATA: lo_bool_type_descr TYPE REF TO cl_abap_typedescr,
lo_tribool_type_descr TYPE REF TO cl_abap_typedescr,
lo_json_type_descr TYPE REF TO cl_abap_typedescr,
lv_pos LIKE sy-fdpos,
lv_json_string TYPE json.
lo_bool_type_descr = cl_abap_typedescr=>describe_by_data( c_bool-true ).
lo_tribool_type_descr = cl_abap_typedescr=>describe_by_data( c_tribool-true ).
lo_json_type_descr = cl_abap_typedescr=>describe_by_data( lv_json_string ).
CONCATENATE mc_bool_types lo_bool_type_descr->absolute_name lo_tribool_type_descr->absolute_name INTO mc_bool_types.
CONCATENATE mc_bool_3state lo_tribool_type_descr->absolute_name INTO mc_bool_3state.
CONCATENATE mc_json_type lo_json_type_descr->absolute_name INTO mc_json_type.
FIND FIRST OCCURRENCE OF `\TYPE=` IN lo_json_type_descr->absolute_name MATCH OFFSET lv_pos.
IF sy-subrc IS INITIAL.
mc_me_type = lo_json_type_descr->absolute_name(lv_pos).
ENDIF.
sv_white_space = cl_abap_char_utilities=>get_simple_spaces_for_cur_cp( ).
mc_cov_error = cl_abap_conv_in_ce=>uccp( '0000' ).
so_type_s = cl_abap_elemdescr=>get_string( ).
so_type_f = cl_abap_elemdescr=>get_f( ).
so_type_p ?= cl_abap_typedescr=>describe_by_name( 'p' ).
so_type_i = cl_abap_elemdescr=>get_i( ).
so_type_b ?= cl_abap_typedescr=>describe_by_name( 'ABAP_BOOL' ).
so_type_t_json ?= cl_abap_typedescr=>describe_by_name( 'T_T_JSON' ).
so_type_t_name_value ?= cl_abap_typedescr=>describe_by_name( 'T_T_NAME_VALUE' ).
ENDMETHOD. "class_constructor
METHOD constructor.
DATA: rtti TYPE REF TO cl_abap_classdescr,
pair LIKE LINE OF name_mappings.
mv_compress = compress.
mv_pretty_name = pretty_name.
mv_assoc_arrays = assoc_arrays.
mv_ts_as_iso8601 = ts_as_iso8601.
mv_expand_includes = expand_includes.
mv_assoc_arrays_opt = assoc_arrays_opt.
mv_strict_mode = strict_mode.
mv_numc_as_string = numc_as_string.
mv_conversion_exits = conversion_exits.
mv_format_output = format_output.
mv_hex_as_base64 = hex_as_base64.
mv_bool_types = bool_types.
mv_bool_3state = bool_3state.
mv_initial_ts = initial_ts.
mv_initial_date = initial_date.
mv_initial_time = initial_time.
LOOP AT name_mappings INTO pair.
TRANSLATE pair-abap TO UPPER CASE.
INSERT pair INTO TABLE mt_name_mappings.
ENDLOOP.
" if it dumps here, you have passed an ambiguous mapping to the API;
" please check your code for duplicates - ABAP/JSON pairs must be unique
INSERT LINES OF mt_name_mappings INTO TABLE mt_name_mappings_ex.
IF mt_name_mappings IS NOT INITIAL.
IF mv_pretty_name EQ pretty_mode-none.
mv_pretty_name = pretty_mode-user.
ELSEIF pretty_name EQ pretty_mode-low_case.
mv_pretty_name = pretty_mode-user_low_case.
ENDIF.
ENDIF.
rtti ?= cl_abap_classdescr=>describe_by_object_ref( me ).
IF rtti->absolute_name NE mc_me_type.
mv_extended = c_bool-true.
ENDIF.
ENDMETHOD.
METHOD deserialize.
DATA: lo_json TYPE REF TO zcl_json.
IF json IS NOT INITIAL OR jsonx IS NOT INITIAL.
CREATE OBJECT lo_json
EXPORTING
pretty_name = pretty_name
name_mappings = name_mappings
assoc_arrays = assoc_arrays
conversion_exits = conversion_exits
hex_as_base64 = hex_as_base64
assoc_arrays_opt = assoc_arrays_opt.
TRY .
lo_json->deserialize_int( EXPORTING json = json jsonx = jsonx CHANGING data = data ).
CATCH cx_sy_move_cast_error. "#EC NO_HANDLER
ENDTRY.
ENDIF.
ENDMETHOD. "deserialize
METHOD deserialize_int.
DATA: length TYPE i,
offset TYPE i,
unescaped LIKE json.
IF json IS NOT INITIAL OR jsonx IS NOT INITIAL.
IF jsonx IS NOT INITIAL.
unescaped = raw_to_string( jsonx ).
ELSE.
unescaped = json.
ENDIF.
" skip leading BOM characters
length = strlen( unescaped ).
while_offset_not_cs `"{[` unescaped.
restore_type( EXPORTING json = unescaped length = length CHANGING data = data offset = offset ).
ENDIF.
ENDMETHOD. "deserialize_int
METHOD dump.
DATA: lo_json TYPE REF TO zcl_json.
CREATE OBJECT lo_json
EXPORTING
compress = compress
pretty_name = pretty_name
assoc_arrays = assoc_arrays
ts_as_iso8601 = ts_as_iso8601.
r_json = lo_json->dump_int( data = data type_descr = type_descr ).
ENDMETHOD. "dump
METHOD dump_int.
DATA: lo_typedesc TYPE REF TO cl_abap_typedescr,
lo_elem_descr TYPE REF TO cl_abap_elemdescr,
lo_classdesc TYPE REF TO cl_abap_classdescr,
lo_structdesc TYPE REF TO cl_abap_structdescr,
lo_tabledescr TYPE REF TO cl_abap_tabledescr,
ls_struct_sym TYPE t_s_struct_cache_res,
lt_symbols TYPE t_t_symbol,
lt_keys TYPE STANDARD TABLE OF REF TO data WITH DEFAULT KEY,
lt_properties TYPE STANDARD TABLE OF string,
lo_obj_ref TYPE REF TO object,
lo_data_ref TYPE REF TO data,
ls_skip_key TYPE LINE OF abap_keydescr_tab,
lv_array_opt TYPE abap_bool,
indent TYPE string,
lv_indent LIKE indent,
lv_level LIKE level,
lv_lb TYPE string,
lv_prop_name TYPE string,
lv_keyval TYPE string,
lv_itemval TYPE string.
FIELD-SYMBOLS: <line> TYPE any,
<value> TYPE any,
<data> TYPE data,
<key> TYPE LINE OF abap_keydescr_tab,
<symbol> TYPE t_s_symbol,
<table> TYPE ANY TABLE.
" increase hierarchy level
lv_level = level + 1.
" macros are used here instead of method calls for performance reasons
" (based on SAT measurements)
CASE type_descr->kind.
WHEN cl_abap_typedescr=>kind_ref.
IF data IS INITIAL.
r_json = `null`. "#EC NOTEXT
ELSEIF type_descr->type_kind EQ cl_abap_typedescr=>typekind_dref.
lo_data_ref ?= data.
lo_typedesc = cl_abap_typedescr=>describe_by_data_ref( lo_data_ref ).
ASSIGN lo_data_ref->* TO <data>.
r_json = dump_int( data = <data> type_descr = lo_typedesc level = level ).
ELSE.
lo_obj_ref ?= data.
lo_classdesc ?= cl_abap_typedescr=>describe_by_object_ref( lo_obj_ref ).
lt_symbols = get_symbols_class( type_descr = lo_classdesc object = lo_obj_ref ).
r_json = dump_symbols( it_symbols = lt_symbols level = level ).
ENDIF.
WHEN cl_abap_typedescr=>kind_elem.
lo_elem_descr ?= type_descr.
dump_type data lo_elem_descr r_json convexit.
WHEN cl_abap_typedescr=>kind_struct.
lo_structdesc ?= type_descr.
ls_struct_sym = get_symbols_struct( type_descr = lo_structdesc level = level ).
ASSIGN ls_struct_sym-data->* TO <data>.
<data> = data.
r_json = dump_symbols( it_symbols = ls_struct_sym-symbols level = level ).
WHEN cl_abap_typedescr=>kind_table.
lo_tabledescr ?= type_descr.
lo_typedesc = lo_tabledescr->get_table_line_type( ).
ASSIGN data TO <table>.
IF mv_format_output EQ abap_true.
indent = get_indent( level ).
lv_indent = get_indent( lv_level ).
ENDIF.
" optimization for structured tables
IF lo_typedesc->kind EQ cl_abap_typedescr=>kind_struct.
lo_structdesc ?= lo_typedesc.
ls_struct_sym = get_symbols_struct( type_descr = lo_structdesc level = level ).
ASSIGN ls_struct_sym-data->* TO <line>.
" differentiate the output here: a standard table becomes a JSON array, while a
" sorted or hashed table with a unique key becomes a JSON associative array (object)
IF lo_tabledescr->has_unique_key IS NOT INITIAL AND mv_assoc_arrays IS NOT INITIAL.
IF lo_tabledescr->key_defkind EQ lo_tabledescr->keydefkind_user.
LOOP AT lo_tabledescr->key ASSIGNING <key>.
READ TABLE ls_struct_sym-symbols WITH KEY name = <key>-name ASSIGNING <symbol>.
APPEND <symbol>-value TO lt_keys.
ENDLOOP.
ENDIF.
IF lines( lo_tabledescr->key ) EQ 1.
READ TABLE lo_tabledescr->key INDEX 1 INTO ls_skip_key.
DELETE ls_struct_sym-symbols WHERE name EQ ls_skip_key-name.
" remove object wrapping for simple name-value tables
IF mv_assoc_arrays_opt EQ abap_true AND lines( ls_struct_sym-symbols ) EQ 1.
lv_array_opt = abap_true.
ENDIF.
ENDIF.
LOOP AT <table> INTO <line>.
CLEAR: lv_prop_name.
" construct key attribute name
IF lo_tabledescr->key_defkind EQ lo_tabledescr->keydefkind_user.
LOOP AT lt_keys INTO lo_data_ref.
ASSIGN lo_data_ref->* TO <value>.
lv_keyval = <value>.
CONDENSE lv_keyval.
IF lv_prop_name IS NOT INITIAL.
CONCATENATE lv_prop_name mc_key_separator lv_keyval INTO lv_prop_name.
ELSE.
lv_prop_name = lv_keyval.
ENDIF.
ENDLOOP.
ELSE.
LOOP AT ls_struct_sym-symbols ASSIGNING <symbol>.
ASSIGN <symbol>-value->* TO <value>.
lv_keyval = <value>.
CONDENSE lv_keyval.
IF lv_prop_name IS NOT INITIAL.
CONCATENATE lv_prop_name mc_key_separator lv_keyval INTO lv_prop_name.
ELSE.
lv_prop_name = lv_keyval.
ENDIF.
ENDLOOP.
ENDIF.
lv_itemval = dump_symbols( it_symbols = ls_struct_sym-symbols opt_array = lv_array_opt format_scope = abap_false level = lv_level ).
IF lv_array_opt EQ abap_true.
IF mv_format_output EQ abap_true AND lv_itemval IS NOT INITIAL.
CONCATENATE `"` lv_prop_name `": ` lv_itemval INTO lv_itemval.
ELSE.
CONCATENATE `"` lv_prop_name `":` lv_itemval INTO lv_itemval.
ENDIF.
ELSE.
IF mv_format_output EQ abap_true AND lv_itemval IS NOT INITIAL.
CONCATENATE `"` lv_prop_name `": {` lv_itemval lv_indent `}` INTO lv_itemval.
ELSE.
CONCATENATE `"` lv_prop_name `":{` lv_itemval `}` INTO lv_itemval.
ENDIF.
ENDIF.
APPEND lv_itemval TO lt_properties.
ENDLOOP.
format_list_output `{` lt_properties `}` r_json.
ELSE.
LOOP AT <table> INTO <line>.
lv_itemval = dump_symbols( it_symbols = ls_struct_sym-symbols level = lv_level ).
APPEND lv_itemval TO lt_properties.
ENDLOOP.
format_list_output `[` lt_properties `]` r_json.
ENDIF.
ELSE.
LOOP AT <table> ASSIGNING <value>.
lv_itemval = dump_int( data = <value> type_descr = lo_typedesc level = lv_level ).
APPEND lv_itemval TO lt_properties.
ENDLOOP.
format_list_output `[` lt_properties `]` r_json.
ENDIF.
ENDCASE.
ENDMETHOD. "dump_int
METHOD dump_symbols.
DATA: lt_fields TYPE STANDARD TABLE OF string,
lv_indent TYPE string,
lv_level LIKE level,
lv_itemval TYPE string.
FIELD-SYMBOLS: <value> TYPE any,
<symbol> LIKE LINE OF it_symbols.
" increase hierarchy level
lv_level = level + 1.
IF mv_format_output EQ abap_true AND opt_array EQ abap_false.
lv_indent = get_indent( lv_level ).
ENDIF.
LOOP AT it_symbols ASSIGNING <symbol>.
ASSIGN <symbol>-value->* TO <value>.
CHECK <symbol>-compressable EQ abap_false OR <value> IS NOT INITIAL OR opt_array EQ abap_true.
IF <symbol>-elem_type IS NOT INITIAL.
dump_type <value> <symbol>-elem_type lv_itemval <symbol>-convexit_out.
ELSE.
lv_itemval = dump_int( data = <value> type_descr = <symbol>-type convexit = <symbol>-convexit_out level = lv_level ).
ENDIF.
IF opt_array EQ abap_false.
IF mv_format_output EQ abap_true.
CONCATENATE lv_indent <symbol>-header lv_itemval INTO lv_itemval.
ELSE.
CONCATENATE <symbol>-header lv_itemval INTO lv_itemval.
ENDIF.
ENDIF.
APPEND lv_itemval TO lt_fields.
ENDLOOP.
CONCATENATE LINES OF lt_fields INTO r_json SEPARATED BY `,`.
IF format_scope EQ abap_true.
IF r_json IS INITIAL.
r_json = `{}`.
ELSEIF mv_format_output EQ abap_true.
lv_indent = get_indent( level ).
CONCATENATE `{` r_json lv_indent `}` INTO r_json.
ELSE.
CONCATENATE `{` r_json `}` INTO r_json.
ENDIF.
ENDIF.
ENDMETHOD.
METHOD dump_type.
CONSTANTS: lc_typekind_utclong TYPE abap_typekind VALUE 'p', " CL_ABAP_TYPEDESCR=>TYPEKIND_UTCLONG -> 'p' only from 7.60
lc_typekind_int8 TYPE abap_typekind VALUE '8'. " TYPEKIND_INT8 -> '8' only from 7.40
IF convexit IS NOT INITIAL AND data IS NOT INITIAL.
TRY.
CALL FUNCTION convexit
EXPORTING
input = data
IMPORTING
output = r_json
EXCEPTIONS
OTHERS = 1.
IF sy-subrc IS INITIAL.
CONCATENATE `"` r_json `"` INTO r_json.
ENDIF.
CATCH cx_root. "#EC NO_HANDLER
ENDTRY.
ELSE.
CASE type_descr->type_kind.
WHEN cl_abap_typedescr=>typekind_float OR cl_abap_typedescr=>typekind_int OR cl_abap_typedescr=>typekind_int1 OR
cl_abap_typedescr=>typekind_int2 OR cl_abap_typedescr=>typekind_packed OR lc_typekind_utclong OR lc_typekind_int8.
IF mv_ts_as_iso8601 EQ c_bool-true AND
( type_descr->type_kind EQ lc_typekind_utclong OR
( type_descr->type_kind EQ cl_abap_typedescr=>typekind_packed AND type_descr->absolute_name CP `\TYPE=TIMESTAMP*` ) ).
IF data IS INITIAL.
r_json = mv_initial_ts.
ELSE.
r_json = data.
IF type_descr->absolute_name EQ `\TYPE=TIMESTAMP`.
CONCATENATE `"` r_json(4) `-` r_json+4(2) `-` r_json+6(2) `T` r_json+8(2) `:` r_json+10(2) `:` r_json+12(2) `.0000000Z"` INTO r_json.
ELSEIF type_descr->absolute_name EQ `\TYPE=TIMESTAMPL`.
CONCATENATE `"` r_json(4) `-` r_json+4(2) `-` r_json+6(2) `T` r_json+8(2) `:` r_json+10(2) `:` r_json+12(2) `.` r_json+15(7) `Z"` INTO r_json.
ENDIF.
ENDIF.
ELSEIF data IS INITIAL.
r_json = `0`.
ELSE.
r_json = data.
IF data LT 0.
IF type_descr->type_kind <> cl_abap_typedescr=>typekind_float. "float: sign is already at the beginning
SHIFT r_json RIGHT CIRCULAR.
ENDIF.
ELSE.
CONDENSE r_json.
ENDIF.
ENDIF.
WHEN cl_abap_typedescr=>typekind_num.
IF mv_numc_as_string EQ abap_true.
IF data IS INITIAL.
r_json = `""`.
ELSE.
CONCATENATE `"` data `"` INTO r_json.
ENDIF.
ELSE.
r_json = data.
SHIFT r_json LEFT DELETING LEADING ` 0`.
IF r_json IS INITIAL.
r_json = `0`.
ENDIF.
ENDIF.
WHEN cl_abap_typedescr=>typekind_string OR cl_abap_typedescr=>typekind_csequence OR cl_abap_typedescr=>typekind_clike.
IF data IS INITIAL.
r_json = `""`.
ELSEIF type_descr->absolute_name EQ mc_json_type.
r_json = data.
ELSE.
r_json = escape( data ).
CONCATENATE `"` r_json `"` INTO r_json.
ENDIF.
WHEN cl_abap_typedescr=>typekind_xstring OR cl_abap_typedescr=>typekind_hex.
IF data IS INITIAL.
r_json = `""`.
ELSE.
xstring_to_string_int data r_json.
CONCATENATE `"` r_json `"` INTO r_json.
ENDIF.
WHEN cl_abap_typedescr=>typekind_char.
IF type_descr->output_length EQ 1 AND mv_bool_types CS type_descr->absolute_name.
IF data EQ c_bool-true.
r_json = `true`. "#EC NOTEXT
ELSEIF data IS INITIAL AND mv_bool_3state CS type_descr->absolute_name.
r_json = `null`. "#EC NOTEXT
ELSE.
r_json = `false`. "#EC NOTEXT
ENDIF.
ELSE.
r_json = escape( data ).
CONCATENATE `"` r_json `"` INTO r_json.
ENDIF.
WHEN cl_abap_typedescr=>typekind_date.
IF data IS INITIAL.
r_json = mv_initial_date.
ELSE.
CONCATENATE `"` data(4) `-` data+4(2) `-` data+6(2) `"` INTO r_json.
ENDIF.
WHEN cl_abap_typedescr=>typekind_time.
IF data IS INITIAL.
r_json = mv_initial_time.
ELSE.
CONCATENATE `"` data(2) `:` data+2(2) `:` data+4(2) `"` INTO r_json.
ENDIF.
WHEN 'k'. " cl_abap_typedescr=>typekind_enum
r_json = data.
CONCATENATE `"` r_json `"` INTO r_json.
WHEN OTHERS.
IF data IS INITIAL.
r_json = `null`. "#EC NOTEXT
ELSE.
r_json = data.
ENDIF.
ENDCASE.
ENDIF.
ENDMETHOD. "dump_type
METHOD dump_type_ex.
DATA: lo_descr TYPE REF TO cl_abap_elemdescr,
lv_convexit TYPE string.
lo_descr ?= cl_abap_typedescr=>describe_by_data( data ).
IF mv_conversion_exits EQ abap_true.
lv_convexit = get_convexit_func( elem_descr = lo_descr input = abap_false ).
ENDIF.
r_json = dump_type( data = data type_descr = lo_descr convexit = lv_convexit ).
ENDMETHOD. "DUMP_TYPE_EX
METHOD edm_datetime_to_ts.
CONSTANTS: lc_epochs TYPE string VALUE `19700101000000`.
DATA: lv_ticks TYPE p,
lv_seconds TYPE p,
lv_subsec TYPE p,
lv_timestamps TYPE string,
lv_timestamp TYPE timestampl VALUE `19700101000000.0000000`.
lv_ticks = ticks.
lv_seconds = lv_ticks / 1000. " in seconds
lv_subsec = lv_ticks MOD 1000. " in subsec
IF lv_subsec GT 0.
lv_timestamps = lv_subsec.
CONCATENATE lc_epochs `.` lv_timestamps INTO lv_timestamps.
lv_timestamp = lv_timestamps.
ENDIF.
lv_timestamp = cl_abap_tstmp=>add( tstmp = lv_timestamp secs = lv_seconds ).
IF offset IS NOT INITIAL.
lv_ticks = offset+1.
lv_ticks = lv_ticks * 60. "offset is in minutes
IF offset(1) = '+'.
lv_timestamp = cl_abap_tstmp=>subtractsecs( tstmp = lv_timestamp secs = lv_ticks ).
ELSE.
lv_timestamp = cl_abap_tstmp=>add( tstmp = lv_timestamp secs = lv_ticks ).
ENDIF.
ENDIF.
CASE typekind.
WHEN cl_abap_typedescr=>typekind_time.
r_data = lv_timestamp.
r_data = r_data+8(6).
WHEN cl_abap_typedescr=>typekind_date.
r_data = lv_timestamp.
r_data = r_data(8).
WHEN cl_abap_typedescr=>typekind_packed.
r_data = lv_timestamp.
ENDCASE.
ENDMETHOD.
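" Illustration (comment only, not part of the original class): Edm.DateTime
" ticks are milliseconds since 1970-01-01T00:00:00 (Unix epoch). E.g. ticks =
" 86400000 (exactly one day) with no offset and typekind_packed yields the
" timestamp 19700102000000.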
METHOD escape.
escape_json in out.
ENDMETHOD. "escape
METHOD generate.
DATA: lo_json TYPE REF TO zcl_json,
offset TYPE i,
length TYPE i.
" skip leading BOM characters
length = strlen( json ).
while_offset_not_cs `"{[` json.
CREATE OBJECT lo_json
EXPORTING
pretty_name = pretty_name
name_mappings = name_mappings
assoc_arrays = c_bool-true
assoc_arrays_opt = c_bool-true.
TRY .
lo_json->generate_int( EXPORTING json = json length = length CHANGING offset = offset data = rr_data ).
CATCH cx_sy_move_cast_error. "#EC NO_HANDLER
ENDTRY.
ENDMETHOD.
METHOD generate_int.
DATA: lt_json TYPE t_t_json,
lt_fields TYPE t_t_name_value.
FIELD-SYMBOLS: <data> TYPE data,
<struct> TYPE data,
<json> LIKE LINE OF lt_json,
<field> LIKE LINE OF lt_fields,
<table> TYPE STANDARD TABLE.
IF length IS NOT SUPPLIED.
length = strlen( json ).
ENDIF.
eat_white.
CASE json+offset(1).
WHEN `{`."result must be a structure
restore_type( EXPORTING json = json length = length type_descr = so_type_t_name_value CHANGING offset = offset data = lt_fields ).
generate_struct( CHANGING fields = lt_fields data = data ).
IF data IS BOUND.
ASSIGN data->* TO <struct>.
LOOP AT lt_fields ASSIGNING <field>.
ASSIGN COMPONENT sy-tabix OF STRUCTURE <struct> TO <data>.
generate_int( EXPORTING json = <field>-value CHANGING data = <data> ).
ENDLOOP.
ENDIF.
WHEN `[`."result must be a table of ref
restore_type( EXPORTING json = json length = length type_descr = so_type_t_json CHANGING offset = offset data = lt_json ).
CREATE DATA data TYPE ref_tab.
ASSIGN data->* TO <table>.
LOOP AT lt_json ASSIGNING <json>.
APPEND INITIAL LINE TO <table> ASSIGNING <data>.
generate_int( EXPORTING json = <json> CHANGING data = <data> ).
ENDLOOP.
WHEN `"`."string
restore_reference so_type_s.
WHEN `-` OR `0` OR `1` OR `2` OR `3` OR `4` OR `5` OR `6` OR `7` OR `8` OR `9`. " number
IF json+offset CS '.'.
restore_reference so_type_f.
ELSEIF length GT 9.
restore_reference so_type_p.
ELSE.
restore_reference so_type_i.
ENDIF.
WHEN OTHERS.
IF json+offset EQ `true` OR json+offset EQ `false`. "#EC NOTEXT
restore_reference so_type_b.
ENDIF.
ENDCASE.
ENDMETHOD.
METHOD generate_int_ex.
DATA: lv_assoc_arrays LIKE mv_assoc_arrays,
lv_assoc_arrays_opt LIKE mv_assoc_arrays_opt.
lv_assoc_arrays = mv_assoc_arrays.
lv_assoc_arrays_opt = mv_assoc_arrays_opt.
mv_assoc_arrays = abap_true.
mv_assoc_arrays_opt = abap_true.
generate_int( EXPORTING json = json length = length CHANGING offset = offset data = data ).
mv_assoc_arrays = lv_assoc_arrays.
mv_assoc_arrays_opt = lv_assoc_arrays_opt.
ENDMETHOD.
METHOD get_convexit_func.
DATA: ls_dfies TYPE dfies.
elem_descr->get_ddic_field(
RECEIVING
p_flddescr = ls_dfies " Field Description
EXCEPTIONS
not_found = 1
no_ddic_type = 2
OTHERS = 3
).
IF sy-subrc IS INITIAL AND ls_dfies-convexit IS NOT INITIAL.
IF input EQ abap_true.
CONCATENATE 'CONVERSION_EXIT_' ls_dfies-convexit '_INPUT' INTO rv_func.
ELSE.
CONCATENATE 'CONVERSION_EXIT_' ls_dfies-convexit '_OUTPUT' INTO rv_func.
ENDIF.
ENDIF.
ENDMETHOD.
METHOD get_fields.
DATA: lt_symbols TYPE t_t_symbol,
lv_name TYPE char128,
ls_field LIKE LINE OF rt_fields.
FIELD-SYMBOLS: <sym> LIKE LINE OF lt_symbols,
<cache> LIKE LINE OF mt_name_mappings.
lt_symbols = get_symbols( type_descr = type_descr data = data object = object include_aliases = abap_true ).
LOOP AT lt_symbols ASSIGNING <sym> WHERE read_only EQ abap_false.
MOVE-CORRESPONDING <sym> TO ls_field.
" insert as UPPER CASE
INSERT ls_field INTO TABLE rt_fields.
" insert as lower case
TRANSLATE ls_field-name TO LOWER CASE.
INSERT ls_field INTO TABLE rt_fields.
" as pretty printed
IF mv_pretty_name NE pretty_mode-none AND mv_pretty_name NE pretty_mode-low_case.
format_name <sym>-name mv_pretty_name ls_field-name.
INSERT ls_field INTO TABLE rt_fields.
" also tolerate not-well-formed camelCase (first letter upper-cased),
" to stay compatible with the old logic
lv_name = ls_field-name.
TRANSLATE lv_name(1) TO UPPER CASE.
ls_field-name = lv_name.
INSERT ls_field INTO TABLE rt_fields.
ENDIF.
ENDLOOP.
ENDMETHOD.
METHOD get_symbols.
DATA: class_descr TYPE REF TO cl_abap_classdescr,
struct_descr TYPE REF TO cl_abap_structdescr,
struct_cache TYPE t_s_struct_cache_res.
IF type_descr->kind EQ cl_abap_typedescr=>kind_struct.
struct_descr ?= type_descr.
struct_cache = get_symbols_struct( type_descr = struct_descr data = data include_aliases = include_aliases ).
result = struct_cache-symbols.
ELSEIF type_descr->type_kind EQ cl_abap_typedescr=>typekind_class.
class_descr ?= type_descr.
result = get_symbols_class( type_descr = class_descr object = object ).
ENDIF.
ENDMETHOD. "GET_SYMBOLS
METHOD is_compressable.
rv_compress = abap_true.
ENDMETHOD.
METHOD pretty_name.
DATA: tokens TYPE TABLE OF char128,
cache LIKE LINE OF mt_name_mappings.
FIELD-SYMBOLS: <token> LIKE LINE OF tokens,
<cache> LIKE LINE OF mt_name_mappings.
READ TABLE mt_name_mappings WITH TABLE KEY abap = in ASSIGNING <cache>.
IF sy-subrc IS INITIAL.
out = <cache>-json.
ELSE.
out = in.
REPLACE ALL OCCURRENCES OF `__` IN out WITH `*`.
TRANSLATE out TO LOWER CASE.
TRANSLATE out USING `/_:_~_`.
SPLIT out AT `_` INTO TABLE tokens.
LOOP AT tokens ASSIGNING <token> FROM 2.
TRANSLATE <token>(1) TO UPPER CASE.
ENDLOOP.
CONCATENATE LINES OF tokens INTO out.
REPLACE ALL OCCURRENCES OF `*` IN out WITH `_`.
cache-abap = in.
cache-json = out.
INSERT cache INTO TABLE mt_name_mappings.
INSERT cache INTO TABLE mt_name_mappings_ex.
ENDIF.
ENDMETHOD. "pretty_name
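" Illustration (comment only, not part of the original class): pretty_name
" produces, for example:
"   USER_NAME   -> userName
"   FIELD__NAME -> field_name   (a doubled underscore survives as a single one)
"   WBS/ELEMENT -> wbsElement   (/, : and ~ are treated like underscores)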
METHOD pretty_name_ex.
DATA: tokens TYPE TABLE OF char128,
cache LIKE LINE OF mt_name_mappings.
FIELD-SYMBOLS: <token> LIKE LINE OF tokens,
<cache> LIKE LINE OF mt_name_mappings.
READ TABLE mt_name_mappings WITH TABLE KEY abap = in ASSIGNING <cache>.
IF sy-subrc IS INITIAL.
out = <cache>-json.
ELSE.
out = in.
TRANSLATE out TO LOWER CASE.
TRANSLATE out USING `/_:_~_`.
REPLACE ALL OCCURRENCES OF `__e__` IN out WITH `!`.
REPLACE ALL OCCURRENCES OF `__n__` IN out WITH `#`.
REPLACE ALL OCCURRENCES OF `__d__` IN out WITH `$`.
REPLACE ALL OCCURRENCES OF `__p__` IN out WITH `%`.
REPLACE ALL OCCURRENCES OF `__m__` IN out WITH `&`.
REPLACE ALL OCCURRENCES OF `__s__` IN out WITH `*`.
REPLACE ALL OCCURRENCES OF `__h__` IN out WITH `-`.
REPLACE ALL OCCURRENCES OF `__t__` IN out WITH `~`.
REPLACE ALL OCCURRENCES OF `__l__` IN out WITH `/`.
REPLACE ALL OCCURRENCES OF `__c__` IN out WITH `:`.
REPLACE ALL OCCURRENCES OF `__v__` IN out WITH `|`.
REPLACE ALL OCCURRENCES OF `__a__` IN out WITH `@`.
REPLACE ALL OCCURRENCES OF `__o__` IN out WITH `.`.
REPLACE ALL OCCURRENCES OF `___` IN out WITH `.`.
REPLACE ALL OCCURRENCES OF `__` IN out WITH `"`.
SPLIT out AT `_` INTO TABLE tokens.
LOOP AT tokens ASSIGNING <token> FROM 2.
TRANSLATE <token>(1) TO UPPER CASE.
ENDLOOP.
CONCATENATE LINES OF tokens INTO out.
REPLACE ALL OCCURRENCES OF `"` IN out WITH `_`.
cache-abap = in.
cache-json = out.
INSERT cache INTO TABLE mt_name_mappings.
INSERT cache INTO TABLE mt_name_mappings_ex.
ENDIF.
ENDMETHOD. "pretty_name_ex
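" Illustration (comment only, not part of the original class): pretty_name_ex
" additionally decodes special-character escape groups, for example:
"   FIELD__A__NAME -> field@name   (__a__ -> @)
"   FIELD__S__NAME -> field*name   (__s__ -> *)
"   FIELD___NAME   -> field.name   (___   -> .)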
METHOD raw_to_string.
DATA: lv_output_length TYPE i,
lt_binary_tab TYPE STANDARD TABLE OF sdokcntbin.
CALL FUNCTION 'SCMS_XSTRING_TO_BINARY'
EXPORTING
buffer = iv_xstring
IMPORTING
output_length = lv_output_length
TABLES
binary_tab = lt_binary_tab.
CALL FUNCTION 'SCMS_BINARY_TO_STRING'
EXPORTING
input_length = lv_output_length
encoding = iv_encoding
IMPORTING
text_buffer = rv_string
output_length = lv_output_length
TABLES
binary_tab = lt_binary_tab.
ENDMETHOD.
METHOD restore.
DATA: mark LIKE offset,
match LIKE offset,
ref_descr TYPE REF TO cl_abap_refdescr,
data_descr TYPE REF TO cl_abap_datadescr,
data_ref TYPE REF TO data,
object_ref TYPE REF TO object,
fields LIKE field_cache,
name_json TYPE string.
FIELD-SYMBOLS: <value> TYPE any,
<field_cache> LIKE LINE OF field_cache.
fields = field_cache.
IF type_descr IS NOT INITIAL AND type_descr->kind EQ type_descr->kind_ref.
ref_descr ?= type_descr.
type_descr = ref_descr->get_referenced_type( ).
IF ref_descr->type_kind EQ ref_descr->typekind_oref.
IF data IS INITIAL.
" may raise an exception if the class is abstract or its constructor is protected
CREATE OBJECT data TYPE (type_descr->absolute_name).
ELSE.
type_descr = cl_abap_typedescr=>describe_by_object_ref( data ).
ENDIF.
object_ref ?= data.
fields = get_fields( type_descr = type_descr object = object_ref ).
ELSEIF ref_descr->type_kind EQ ref_descr->typekind_dref.
IF data IS INITIAL.
data_descr ?= type_descr.
CREATE DATA data TYPE HANDLE data_descr.
ELSE.
type_descr = cl_abap_typedescr=>describe_by_data_ref( data ).
ENDIF.
data_ref ?= data.
ASSIGN data_ref->* TO <value>.
fields = get_fields( type_descr = type_descr data = data_ref ).
restore( EXPORTING json = json
length = length
type_descr = type_descr
field_cache = fields
CHANGING data = <value>
offset = offset ).
RETURN.
ENDIF.
ENDIF.
IF fields IS INITIAL AND type_descr IS NOT INITIAL AND type_descr->kind EQ type_descr->kind_struct.
GET REFERENCE OF data INTO data_ref.
fields = get_fields( type_descr = type_descr data = data_ref ).
ENDIF.
eat_white.
eat_char `{`.
eat_white.
WHILE offset < length AND json+offset(1) NE `}`.
eat_name name_json.
eat_white.
eat_char `:`.
eat_white.
READ TABLE fields WITH TABLE KEY name = name_json ASSIGNING <field_cache>.
IF sy-subrc IS NOT INITIAL.
TRANSLATE name_json TO UPPER CASE.
READ TABLE fields WITH TABLE KEY name = name_json ASSIGNING <field_cache>.
ENDIF.
IF sy-subrc IS INITIAL.
ASSIGN <field_cache>-value->* TO <value>.
restore_type( EXPORTING json = json length = length type_descr = <field_cache>-type convexit = <field_cache>-convexit_in CHANGING data = <value> offset = offset ).
ELSE.
restore_type( EXPORTING json = json length = length CHANGING offset = offset ).
ENDIF.
eat_white.
IF offset < length AND json+offset(1) NE `}`.
eat_char `,`.
eat_white.
ELSE.
EXIT.
ENDIF.
ENDWHILE.
eat_char `}`.
ENDMETHOD. "restore
METHOD restore_type.
DATA: mark LIKE offset,
match LIKE offset,
sdummy TYPE string, "#EC NEEDED
rdummy TYPE REF TO data, "#EC NEEDED
pos LIKE offset,
line TYPE REF TO data,
key_ref TYPE REF TO data,
data_ref TYPE REF TO data,
key_name TYPE string,
key_value TYPE string,
lt_fields LIKE field_cache,
ls_symbols TYPE t_s_struct_cache_res,
lv_ticks TYPE string,
lv_offset TYPE string,
lv_convexit LIKE convexit,
lo_exp TYPE REF TO cx_root,
elem_descr TYPE REF TO cl_abap_elemdescr,
table_descr TYPE REF TO cl_abap_tabledescr,
struct_descr TYPE REF TO cl_abap_structdescr,
data_descr TYPE REF TO cl_abap_datadescr.
FIELD-SYMBOLS: <line> TYPE any,
<value> TYPE any,
<data> TYPE data,
<field> LIKE LINE OF lt_fields,
<table> TYPE ANY TABLE,
<value_sym> TYPE t_s_symbol.
lv_convexit = convexit.
IF type_descr IS INITIAL AND data IS SUPPLIED.
type_descr = cl_abap_typedescr=>describe_by_data( data ).
IF mv_conversion_exits EQ abap_true AND lv_convexit IS INITIAL AND type_descr->kind EQ cl_abap_typedescr=>kind_elem.
elem_descr ?= type_descr.
lv_convexit = get_convexit_func( elem_descr = elem_descr input = abap_true ).
ENDIF.
ENDIF.
eat_white.
TRY .
IF data IS SUPPLIED AND type_descr->absolute_name EQ mc_json_type.
" skip deserialization
mark = offset.
restore_type( EXPORTING json = json length = length CHANGING offset = offset ).
match = offset - mark.
data = json+mark(match).
ELSE.
CASE json+offset(1).
WHEN `{`. " object
IF data IS SUPPLIED.
IF mv_assoc_arrays EQ c_bool-true AND type_descr->kind EQ cl_abap_typedescr=>kind_table.
table_descr ?= type_descr.
data_descr = table_descr->get_table_line_type( ).
IF table_descr->has_unique_key IS NOT INITIAL.
eat_char `{`.
eat_white.
IF json+offset(1) NE `}`.
ASSIGN data TO <table>.
CLEAR <table>.
CREATE DATA line LIKE LINE OF <table>.
ASSIGN line->* TO <line>.
lt_fields = get_fields( type_descr = data_descr data = line ).
IF table_descr->key_defkind EQ table_descr->keydefkind_user AND lines( table_descr->key ) EQ 1.
READ TABLE table_descr->key INDEX 1 INTO key_name.
READ TABLE lt_fields WITH TABLE KEY name = key_name ASSIGNING <field>.
key_ref = <field>-value.
IF mv_assoc_arrays_opt EQ c_bool-true.
struct_descr ?= data_descr.
ls_symbols = get_symbols_struct( type_descr = struct_descr data = line ).
DELETE ls_symbols-symbols WHERE name EQ key_name.
IF lines( ls_symbols-symbols ) EQ 1.
READ TABLE ls_symbols-symbols INDEX 1 ASSIGNING <value_sym>.
ENDIF.
ENDIF.
ENDIF.
eat_white.
WHILE offset < length AND json+offset(1) NE `}`.
CLEAR <line>.
eat_name key_value.
eat_white.
eat_char `:`.
eat_white.
IF <value_sym> IS ASSIGNED.
ASSIGN <value_sym>-value->* TO <value>.
restore_type( EXPORTING json = json
length = length
type_descr = <value_sym>-type
convexit = <value_sym>-convexit_in
CHANGING data = <value>
offset = offset ).
ELSE.
restore_type( EXPORTING json = json
length = length
type_descr = data_descr
field_cache = lt_fields
CHANGING data = <line>
offset = offset ).
ENDIF.
IF table_descr->key_defkind EQ table_descr->keydefkind_user.
IF key_ref IS BOUND.
ASSIGN key_ref->* TO <value>.
IF <value> IS INITIAL.
<value> = key_value.
ENDIF.
ENDIF.
ELSEIF <line> IS INITIAL.
<line> = key_value.
ENDIF.
INSERT <line> INTO TABLE <table>.
eat_white.
IF offset < length AND json+offset(1) NE `}`.
eat_char `,`.
eat_white.
ELSE.
EXIT.
ENDIF.
ENDWHILE.
ELSE.
CLEAR data.
ENDIF.
eat_char `}`.
ELSE.
restore( EXPORTING json = json length = length CHANGING offset = offset ).
ENDIF.
ELSEIF type_descr->type_kind EQ cl_abap_typedescr=>typekind_dref.
IF data IS INITIAL.
generate_int_ex( EXPORTING json = json length = length CHANGING offset = offset data = data ).
ELSE.
data_ref ?= data.
type_descr = cl_abap_typedescr=>describe_by_data_ref( data_ref ).
ASSIGN data_ref->* TO <data>.
restore_type( EXPORTING json = json length = length type_descr = type_descr CHANGING data = <data> offset = offset ).
ENDIF.
ELSE.
restore( EXPORTING json = json
length = length
type_descr = type_descr
field_cache = field_cache
CHANGING data = data
offset = offset ).
ENDIF.
ELSE.
restore( EXPORTING json = json length = length CHANGING offset = offset ).
ENDIF.
WHEN `[`. " array
IF data IS SUPPLIED AND type_descr->type_kind EQ cl_abap_typedescr=>typekind_dref.
IF data IS INITIAL.
generate_int_ex( EXPORTING json = json length = length CHANGING offset = offset data = data ).
ELSE.
data_ref ?= data.
type_descr = cl_abap_typedescr=>describe_by_data_ref( data_ref ).
ASSIGN data_ref->* TO <data>.
restore_type( EXPORTING json = json length = length type_descr = type_descr CHANGING data = <data> offset = offset ).
ENDIF.
ELSE.
eat_char `[`.
eat_white.
IF json+offset(1) NE `]`.
IF data IS SUPPLIED AND type_descr->kind EQ cl_abap_typedescr=>kind_table.
table_descr ?= type_descr.
data_descr = table_descr->get_table_line_type( ).
ASSIGN data TO <table>.
CLEAR <table>.
CREATE DATA line LIKE LINE OF <table>.
ASSIGN line->* TO <line>.
lt_fields = get_fields( type_descr = data_descr data = line ).
WHILE offset < length AND json+offset(1) NE `]`.
CLEAR <line>.
restore_type( EXPORTING json = json
length = length
type_descr = data_descr
field_cache = lt_fields
CHANGING data = <line>
offset = offset ).
INSERT <line> INTO TABLE <table>.
eat_white.
IF offset < length AND json+offset(1) NE `]`.
eat_char `,`.
eat_white.
ELSE.
EXIT.
ENDIF.
ENDWHILE.
ELSE.
" skip array
eat_white.
WHILE offset < length AND json+offset(1) NE `]`.
restore_type( EXPORTING json = json length = length CHANGING offset = offset ).
eat_white.
IF offset < length AND json+offset(1) NE `]`.
eat_char `,`.
eat_white.
ELSE.
EXIT.
ENDIF.
ENDWHILE.
IF data IS SUPPLIED. " JSON to ABAP type match error
eat_char `]`.
throw_error.
ENDIF.
ENDIF.
ELSEIF data IS SUPPLIED.
CLEAR data.
ENDIF.
eat_char `]`.
ENDIF.
WHEN `"`. " string
eat_string sdummy.
IF data IS SUPPLIED.
" unescape string
IF sdummy IS NOT INITIAL.
IF type_descr->kind EQ cl_abap_typedescr=>kind_elem.
elem_descr ?= type_descr.
IF lv_convexit IS NOT INITIAL.
TRY .
CALL FUNCTION lv_convexit
EXPORTING
input = sdummy
IMPORTING
output = data
EXCEPTIONS
error_message = 2
OTHERS = 1.
IF sy-subrc IS INITIAL.
RETURN.
ENDIF.
CATCH cx_root. "#EC NO_HANDLER
ENDTRY.
ENDIF.
CASE elem_descr->type_kind.
WHEN cl_abap_typedescr=>typekind_char.
IF elem_descr->output_length EQ 1 AND mv_bool_types CS elem_descr->absolute_name.
IF sdummy(1) CA `XxTt1`.
data = c_bool-true.
ELSE.
data = c_bool-false.
ENDIF.
RETURN.
ENDIF.
WHEN cl_abap_typedescr=>typekind_xstring.
string_to_xstring_int sdummy data.
RETURN.
WHEN cl_abap_typedescr=>typekind_hex.
" support for Edm.Guid
REPLACE FIRST OCCURRENCE OF REGEX `^([0-9A-F]{8})-([0-9A-F]{4})-([0-9A-F]{4})-([0-9A-F]{4})-([0-9A-F]{12})$` IN sdummy
WITH `$1$2$3$4$5` REPLACEMENT LENGTH match IGNORING CASE. "#EC NOTEXT
IF sy-subrc EQ 0.
sdummy = sdummy(match).
TRANSLATE sdummy TO UPPER CASE.
data = sdummy.
ELSE.
string_to_xstring_int sdummy data.
ENDIF.
RETURN.
WHEN cl_abap_typedescr=>typekind_date.
" support for ISO8601 => https://en.wikipedia.org/wiki/ISO_8601
REPLACE FIRST OCCURRENCE OF REGEX `^(\d{4})-(\d{2})-(\d{2})` IN sdummy WITH `$1$2$3`
REPLACEMENT LENGTH match. "#EC NOTEXT
IF sy-subrc EQ 0.
sdummy = sdummy(match).
ELSE.
" support for Edm.DateTime => http://www.odata.org/documentation/odata-version-2-0/json-format/
FIND FIRST OCCURRENCE OF REGEX `^\/Date\((-?\d+)([+-]\d{1,4})?\)\/` IN sdummy SUBMATCHES lv_ticks lv_offset IGNORING CASE. "#EC NOTEXT
IF sy-subrc EQ 0.
sdummy = edm_datetime_to_ts( ticks = lv_ticks offset = lv_offset typekind = elem_descr->type_kind ).
ELSE.
" support for Edm.Time => https://www.w3.org/TR/xmlschema11-2/#nt-durationRep
REPLACE FIRST OCCURRENCE OF REGEX `^-?P(?:(\d+)Y)?(?:(\d+)M)?(?:(\d+)D)?(?:T(?:(\d+)H)?(?:(\d+)M)?(?:(\d+)(?:\.(\d+))?S)?)?` IN sdummy WITH `$1$2$3`
REPLACEMENT LENGTH match. "#EC NOTEXT
IF sy-subrc EQ 0.
sdummy = sdummy(match).
ENDIF.
ENDIF.
ENDIF.
WHEN cl_abap_typedescr=>typekind_time.
" support for ISO8601 => https://en.wikipedia.org/wiki/ISO_8601
REPLACE FIRST OCCURRENCE OF REGEX `^(\d{2}):(\d{2}):(\d{2})` IN sdummy WITH `$1$2$3`
REPLACEMENT LENGTH match. "#EC NOTEXT
IF sy-subrc EQ 0.
sdummy = sdummy(match).
ELSE.
" support for Edm.DateTime => http://www.odata.org/documentation/odata-version-2-0/json-format/
FIND FIRST OCCURRENCE OF REGEX '^\/Date\((-?\d+)([+-]\d{1,4})?\)\/' IN sdummy SUBMATCHES lv_ticks lv_offset IGNORING CASE. "#EC NOTEXT
IF sy-subrc EQ 0.
sdummy = edm_datetime_to_ts( ticks = lv_ticks offset = lv_offset typekind = elem_descr->type_kind ).
ELSE.
" support for Edm.Time => https://www.w3.org/TR/xmlschema11-2/#nt-durationRep
REPLACE FIRST OCCURRENCE OF REGEX `^-?P(?:(\d+)Y)?(?:(\d+)M)?(?:(\d+)D)?(?:T(?:(\d+)H)?(?:(\d+)M)?(?:(\d+)(?:\.(\d+))?S)?)?` IN sdummy WITH `$4$5$6`
REPLACEMENT LENGTH match. "#EC NOTEXT
IF sy-subrc EQ 0.
sdummy = sdummy(match).
ENDIF.
ENDIF.
ENDIF.
WHEN cl_abap_typedescr=>typekind_packed.
REPLACE FIRST OCCURRENCE OF REGEX `^(\d{4})-?(\d{2})-?(\d{2})T(\d{2}):?(\d{2}):?(\d{2})(?:[\.,](\d{0,7}))?Z?` IN sdummy WITH `$1$2$3$4$5$6.$7`
REPLACEMENT LENGTH match. "#EC NOTEXT
IF sy-subrc EQ 0.
sdummy = sdummy(match).
ELSE.
FIND FIRST OCCURRENCE OF REGEX '^\/Date\((-?\d+)([+-]\d{1,4})?\)\/' IN sdummy SUBMATCHES lv_ticks lv_offset IGNORING CASE. "#EC NOTEXT
IF sy-subrc EQ 0.
sdummy = edm_datetime_to_ts( ticks = lv_ticks offset = lv_offset typekind = elem_descr->type_kind ).
ELSE.
" support for Edm.Time => https://www.w3.org/TR/xmlschema11-2/#nt-durationRep
REPLACE FIRST OCCURRENCE OF REGEX `^-?P(?:(\d+)Y)?(?:(\d+)M)?(?:(\d+)D)?(?:T(?:(\d+)H)?(?:(\d+)M)?(?:(\d+)(?:\.(\d+))?S)?)?` IN sdummy WITH `$1$2$3$4$5$6.$7`
REPLACEMENT LENGTH match. "#EC NOTEXT
IF sy-subrc EQ 0.
sdummy = sdummy(match).
ENDIF.
ENDIF.
ENDIF.
WHEN `k`. "cl_abap_typedescr=>typekind_enum
TRY.
CALL METHOD ('CL_ABAP_XSD')=>('TO_VALUE')
EXPORTING
cs = sdummy
CHANGING
val = data.
RETURN.
CATCH cx_sy_dyn_call_error.
throw_error. " Deserialization of enums is not supported
ENDTRY.
ENDCASE.
ELSEIF type_descr->type_kind EQ cl_abap_typedescr=>typekind_dref.
CREATE DATA rdummy TYPE string.
ASSIGN rdummy->* TO <data>.
<data> = sdummy.
data ?= rdummy.
RETURN.
ELSE.
throw_error. " Other wise dumps with OBJECTS_MOVE_NOT_SUPPORTED
ENDIF.
data = sdummy.
ELSEIF type_descr->kind EQ cl_abap_typedescr=>kind_elem.
CLEAR data.
ELSE.
throw_error. " Other wise dumps with OBJECTS_MOVE_NOT_SUPPORTED
ENDIF.
ENDIF.
WHEN `-` OR `0` OR `1` OR `2` OR `3` OR `4` OR `5` OR `6` OR `7` OR `8` OR `9`. " number
IF data IS SUPPLIED.
IF type_descr->kind EQ type_descr->kind_ref AND type_descr->type_kind EQ cl_abap_typedescr=>typekind_dref.
eat_number sdummy. "#EC NOTEXT
match = strlen( sdummy ).
IF sdummy CS '.'. " float.
CREATE DATA rdummy TYPE f.
ELSEIF match GT 9. " packed
CREATE DATA rdummy TYPE p.
ELSE. " integer
CREATE DATA rdummy TYPE i.
ENDIF.
ASSIGN rdummy->* TO <data>.
<data> = sdummy.
data ?= rdummy.
ELSEIF type_descr->kind EQ type_descr->kind_elem.
IF lv_convexit IS NOT INITIAL.
TRY .
eat_number sdummy. "#EC NOTEXT
CALL FUNCTION lv_convexit
EXPORTING
input = sdummy
IMPORTING
output = data
EXCEPTIONS
error_message = 2
OTHERS = 1.
IF sy-subrc IS INITIAL.
RETURN.
ENDIF.
CATCH cx_root. "#EC NO_HANDLER
ENDTRY.
ENDIF.
eat_number data. "#EC NOTEXT
ELSE.
eat_number sdummy. "#EC NOTEXT
throw_error.
ENDIF.
ELSE.
eat_number sdummy. "#EC NOTEXT
ENDIF.
WHEN OTHERS. " boolean, e.g true/false/null
IF data IS SUPPLIED.
IF type_descr->kind EQ type_descr->kind_ref AND type_descr->type_kind EQ cl_abap_typedescr=>typekind_dref.
CREATE DATA rdummy TYPE bool.
ASSIGN rdummy->* TO <data>.
eat_bool <data>. "#EC NOTEXT
data ?= rdummy.
ELSEIF type_descr->kind EQ type_descr->kind_elem.
eat_bool data. "#EC NOTEXT
ELSE.
eat_bool sdummy. "#EC NOTEXT
throw_error.
ENDIF.
ELSE.
eat_bool sdummy. "#EC NOTEXT
ENDIF.
ENDCASE.
ENDIF.
CATCH cx_sy_move_cast_error cx_sy_conversion_no_number cx_sy_conversion_overflow INTO lo_exp.
CLEAR data.
IF mv_strict_mode EQ abap_true.
RAISE EXCEPTION TYPE cx_sy_move_cast_error EXPORTING previous = lo_exp.
ENDIF.
ENDTRY.
ENDMETHOD. "restore_type
METHOD serialize.
DATA: lo_json TYPE REF TO zcl_json.
CREATE OBJECT lo_json
EXPORTING
compress = compress
pretty_name = pretty_name
name_mappings = name_mappings
assoc_arrays = assoc_arrays
assoc_arrays_opt = assoc_arrays_opt
expand_includes = expand_includes
numc_as_string = numc_as_string
conversion_exits = conversion_exits
format_output = format_output
hex_as_base64 = hex_as_base64
ts_as_iso8601 = ts_as_iso8601.
r_json = lo_json->serialize_int( name = name data = data type_descr = type_descr ).
ENDMETHOD. "serialize
METHOD serialize_int.
DATA: lo_descr TYPE REF TO cl_abap_typedescr,
lo_elem_descr TYPE REF TO cl_abap_elemdescr,
lv_convexit TYPE string.
IF type_descr IS INITIAL.
lo_descr = cl_abap_typedescr=>describe_by_data( data ).
ELSE.
lo_descr = type_descr.
ENDIF.
IF mv_conversion_exits EQ abap_true AND lo_descr->kind EQ cl_abap_typedescr=>kind_elem.
lo_elem_descr ?= lo_descr.
lv_convexit = get_convexit_func( elem_descr = lo_elem_descr input = abap_false ).
ENDIF.
r_json = dump_int( data = data type_descr = lo_descr convexit = lv_convexit ).
IF name IS NOT INITIAL AND ( mv_compress IS INITIAL OR r_json IS NOT INITIAL ).
CONCATENATE `"` name `":` r_json INTO r_json.
ENDIF.
ENDMETHOD. "serialize
METHOD string_to_raw.
CALL FUNCTION 'SCMS_STRING_TO_XSTRING'
EXPORTING
text = iv_string
encoding = iv_encoding
IMPORTING
buffer = rv_xstring
EXCEPTIONS
OTHERS = 1.
IF sy-subrc IS NOT INITIAL.
CLEAR rv_xstring.
ENDIF.
ENDMETHOD.
METHOD string_to_xstring.
DATA: lv_xstring TYPE xstring.
CALL FUNCTION 'SSFC_BASE64_DECODE'
EXPORTING
b64data = in
IMPORTING
bindata = lv_xstring
EXCEPTIONS
OTHERS = 1.
IF sy-subrc IS INITIAL.
out = lv_xstring.
ELSE.
out = in.
ENDIF.
ENDMETHOD. "string_to_xstring
METHOD tribool_to_bool.
IF iv_tribool EQ c_tribool-true.
rv_bool = c_bool-true.
ELSEIF iv_tribool EQ c_tribool-undefined.
rv_bool = abap_undefined. " fall back to abap_undefined
ENDIF.
ENDMETHOD. "TRIBOOL_TO_BOOL
METHOD unescape.
DATA: lv_offset TYPE i,
lv_match TYPE i,
lv_delta TYPE i,
lv_length TYPE i,
lv_offset_e TYPE i,
lv_length_e TYPE i,
lv_unicode_symb TYPE c,
lv_unicode_escaped TYPE string,
lt_matches TYPE match_result_tab.
FIELD-SYMBOLS: <match> LIKE LINE OF lt_matches.
" see reference for escaping rules in JSON RFC
" https://www.ietf.org/rfc/rfc4627.txt
unescaped = escaped.
lv_length = strlen( unescaped ).
FIND FIRST OCCURRENCE OF REGEX `\\[rntfbu]` IN unescaped RESPECTING CASE.
IF sy-subrc IS INITIAL.
FIND ALL OCCURRENCES OF REGEX `\\.` IN unescaped RESULTS lt_matches RESPECTING CASE.
LOOP AT lt_matches ASSIGNING <match>.
lv_match = <match>-offset - lv_delta.
lv_offset = lv_match + 1.
CASE unescaped+lv_offset(1).
WHEN `r`.
REPLACE SECTION OFFSET lv_match LENGTH 2 OF unescaped WITH cl_abap_char_utilities=>cr_lf(1).
lv_delta = lv_delta + 1.
WHEN `n`.
REPLACE SECTION OFFSET lv_match LENGTH 2 OF unescaped WITH cl_abap_char_utilities=>newline.
lv_delta = lv_delta + 1.
WHEN `t`.
REPLACE SECTION OFFSET lv_match LENGTH 2 OF unescaped WITH cl_abap_char_utilities=>horizontal_tab.
lv_delta = lv_delta + 1.
WHEN `f`.
REPLACE SECTION OFFSET lv_match LENGTH 2 OF unescaped WITH cl_abap_char_utilities=>form_feed.
lv_delta = lv_delta + 1.
WHEN `b`.
REPLACE SECTION OFFSET lv_match LENGTH 2 OF unescaped WITH cl_abap_char_utilities=>backspace.
lv_delta = lv_delta + 1.
WHEN `u`.
lv_offset = lv_offset + 1.
lv_offset_e = lv_offset + 4.
lv_length_e = lv_length + lv_delta.
IF lv_offset_e LE lv_length_e.
lv_unicode_escaped = unescaped+lv_offset(4).
TRANSLATE lv_unicode_escaped TO UPPER CASE.
lv_unicode_symb = cl_abap_conv_in_ce=>uccp( lv_unicode_escaped ).
IF lv_unicode_symb NE mc_cov_error.
REPLACE SECTION OFFSET lv_match LENGTH 6 OF unescaped WITH lv_unicode_symb.
lv_delta = lv_delta + 5.
ENDIF.
ENDIF.
ENDCASE.
ENDLOOP.
ENDIF.
" based on RFC mentioned above, _any_ character can be escaped, and so shall be enscaped
" the only exception is Unicode symbols, that shall be kept untouched, while serializer does not handle them
" unescaped singe characters, e.g \\, \", \/ etc
REPLACE ALL OCCURRENCES OF REGEX `\\(.)` IN unescaped WITH `$1` RESPECTING CASE.
ENDMETHOD.
METHOD xstring_to_string.
DATA: lv_xstring TYPE xstring.
" let us fix data conversion issues here
lv_xstring = in.
CALL FUNCTION 'SSFC_BASE64_ENCODE'
EXPORTING
bindata = lv_xstring
IMPORTING
b64data = out
EXCEPTIONS
OTHERS = 1.
IF sy-subrc IS NOT INITIAL.
out = in.
ENDIF.
ENDMETHOD. "xstring_to_string
METHOD generate_struct.
DATA: lv_comp_name TYPE abap_compname,
lt_comp TYPE abap_component_tab,
lt_keys TYPE STANDARD TABLE OF string,
lv_invalid TYPE abap_bool,
ls_type LIKE LINE OF mt_struct_type,
lt_names TYPE HASHED TABLE OF string WITH UNIQUE KEY table_line,
cache LIKE LINE OF mt_name_mappings_ex,
ls_comp LIKE LINE OF lt_comp.
FIELD-SYMBOLS: <field> LIKE LINE OF fields,
<cache> LIKE LINE OF mt_name_mappings_ex.
CHECK fields IS NOT INITIAL.
" prepare structure type key
LOOP AT fields ASSIGNING <field>.
APPEND <field>-name TO lt_keys.
ENDLOOP.
CONCATENATE LINES OF lt_keys INTO ls_type-keys.
READ TABLE mt_struct_type WITH TABLE KEY keys = ls_type-keys INTO ls_type.
IF sy-subrc IS NOT INITIAL.
ls_comp-type = cl_abap_refdescr=>get_ref_to_data( ).
LOOP AT fields ASSIGNING <field>.
READ TABLE mt_name_mappings_ex WITH TABLE KEY json = <field>-name ASSIGNING <cache>.
IF sy-subrc IS INITIAL.
ls_comp-name = <cache>-abap.
ELSE.
cache-json = ls_comp-name = <field>-name.
" remove characters not allowed in component names
TRANSLATE ls_comp-name USING mc_name_symbols_map.
IF mv_pretty_name EQ pretty_mode-camel_case OR mv_pretty_name EQ pretty_mode-extended.
REPLACE ALL OCCURRENCES OF REGEX `([a-z])([A-Z])` IN ls_comp-name WITH `$1_$2`. "#EC NOTEXT
ENDIF.
TRANSLATE ls_comp-name TO UPPER CASE.
cache-abap = ls_comp-name = lv_comp_name = ls_comp-name. " truncate by allowed field name length
INSERT cache INTO TABLE mt_name_mappings_ex.
ENDIF.
INSERT ls_comp-name INTO TABLE lt_names.
IF sy-subrc IS INITIAL.
APPEND ls_comp TO lt_comp.
ELSE.
DELETE fields.
lv_invalid = abap_true.
ENDIF.
ENDLOOP.
TRY.
ls_type-type = cl_abap_structdescr=>create( p_components = lt_comp p_strict = c_bool-false ).
CATCH cx_sy_struct_creation. "#EC NO_HANDLER
ENDTRY.
IF lv_invalid EQ abap_false.
INSERT ls_type INTO TABLE mt_struct_type.
ENDIF.
ENDIF.
IF ls_type-type IS NOT INITIAL.
TRY.
CREATE DATA data TYPE HANDLE ls_type-type.
CATCH cx_sy_create_data_error. "#EC NO_HANDLER
ENDTRY.
ENDIF.
ENDMETHOD.
METHOD get_indent.
STATICS: st_indent TYPE STANDARD TABLE OF string WITH DEFAULT KEY.
DATA: lv_filled TYPE i.
READ TABLE st_indent INDEX level INTO indent.
IF sy-subrc IS NOT INITIAL.
lv_filled = lines( st_indent ).
indent = cl_abap_char_utilities=>cr_lf.
DO level TIMES.
CONCATENATE indent mc_default_indent INTO indent.
IF sy-index GT lv_filled.
APPEND indent TO st_indent.
ENDIF.
ENDDO.
ENDIF.
ENDMETHOD.
METHOD get_symbols_class.
DATA: symb LIKE LINE OF result.
FIELD-SYMBOLS: <attr> LIKE LINE OF cl_abap_objectdescr=>attributes,
<cache> LIKE LINE OF mt_name_mappings,
<field> TYPE any.
LOOP AT type_descr->attributes ASSIGNING <attr> WHERE is_constant IS INITIAL AND alias_for IS INITIAL AND
( is_interface IS INITIAL OR type_kind NE cl_abap_typedescr=>typekind_oref ).
ASSIGN object->(<attr>-name) TO <field>.
CHECK sy-subrc IS INITIAL. " we can only assign to public attributes
symb-name = <attr>-name.
symb-read_only = <attr>-is_read_only.
symb-type = type_descr->get_attribute_type( <attr>-name ).
IF symb-type->kind EQ cl_abap_typedescr=>kind_elem.
symb-elem_type ?= symb-type.
ELSE.
CLEAR symb-elem_type.
ENDIF.
IF mv_conversion_exits EQ abap_true AND symb-elem_type IS NOT INITIAL.
symb-convexit_in = get_convexit_func( elem_descr = symb-elem_type input = abap_true ).
symb-convexit_out = get_convexit_func( elem_descr = symb-elem_type input = abap_false ).
ENDIF.
is_compressable symb-type symb-name symb-compressable.
GET REFERENCE OF <field> INTO symb-value.
format_name symb-name mv_pretty_name symb-header.
CONCATENATE `"` symb-header `":` INTO symb-header.
IF mv_format_output EQ abap_true.
CONCATENATE symb-header ` ` INTO symb-header.
ENDIF.
APPEND symb TO result.
ENDLOOP.
ENDMETHOD. "GET_SYMBOLS
METHOD get_symbols_struct.
DATA: comp_tab TYPE cl_abap_structdescr=>component_table,
sym_cache LIKE result,
symbol TYPE t_s_symbol,
struct_descr TYPE REF TO cl_abap_structdescr,
struct_cache LIKE LINE OF mt_struct_cache.
FIELD-SYMBOLS: <comp> LIKE LINE OF comp_tab,
<symbol> LIKE symbol,
<cache> LIKE LINE OF mt_name_mappings,
<struct> LIKE LINE OF mt_struct_cache,
<data> TYPE data,
<field> TYPE any.
READ TABLE mt_struct_cache WITH TABLE KEY type_descr = type_descr include_aliases = include_aliases level = level
ASSIGNING <struct>.
IF sy-subrc IS NOT INITIAL.
struct_cache-type_descr = type_descr.
struct_cache-include_aliases = include_aliases.
struct_cache-level = level.
CREATE DATA struct_cache-result-data TYPE HANDLE type_descr.
INSERT struct_cache INTO TABLE mt_struct_cache ASSIGNING <struct>.
ASSIGN <struct>-result-data->* TO <data>.
comp_tab = type_descr->get_components( ).
LOOP AT comp_tab ASSIGNING <comp>.
IF <comp>-name IS NOT INITIAL AND
( <comp>-as_include EQ abap_false OR include_aliases EQ abap_true OR mv_expand_includes EQ abap_false ).
symbol-name = <comp>-name.
symbol-type = <comp>-type.
IF symbol-type->kind EQ cl_abap_typedescr=>kind_elem.
symbol-elem_type ?= symbol-type.
ELSE.
CLEAR symbol-elem_type.
ENDIF.
IF mv_conversion_exits EQ abap_true AND symbol-elem_type IS NOT INITIAL.
symbol-convexit_in = get_convexit_func( elem_descr = symbol-elem_type input = abap_true ).
symbol-convexit_out = get_convexit_func( elem_descr = symbol-elem_type input = abap_false ).
ENDIF.
is_compressable symbol-type symbol-name symbol-compressable.
ASSIGN COMPONENT symbol-name OF STRUCTURE <data> TO <field>.
GET REFERENCE OF <field> INTO symbol-value.
format_name symbol-name mv_pretty_name symbol-header.
CONCATENATE `"` symbol-header `":` INTO symbol-header.
IF mv_format_output EQ abap_true.
CONCATENATE symbol-header ` ` INTO symbol-header.
ENDIF.
APPEND symbol TO <struct>-result-symbols.
ENDIF.
IF <comp>-as_include EQ abap_true AND mv_expand_includes EQ abap_true.
struct_descr ?= <comp>-type.
sym_cache = get_symbols_struct( type_descr = struct_descr include_aliases = include_aliases ).
LOOP AT sym_cache-symbols INTO symbol.
CONCATENATE symbol-name <comp>-suffix INTO symbol-name.
IF symbol-type->kind EQ cl_abap_typedescr=>kind_elem.
symbol-elem_type ?= symbol-type.
ELSE.
CLEAR symbol-elem_type.
ENDIF.
IF mv_conversion_exits EQ abap_true AND symbol-elem_type IS NOT INITIAL.
symbol-convexit_in = get_convexit_func( elem_descr = symbol-elem_type input = abap_true ).
symbol-convexit_out = get_convexit_func( elem_descr = symbol-elem_type input = abap_false ).
ENDIF.
is_compressable symbol-type symbol-name symbol-compressable.
ASSIGN COMPONENT symbol-name OF STRUCTURE <data> TO <field>.
GET REFERENCE OF <field> INTO symbol-value.
format_name symbol-name mv_pretty_name symbol-header.
CONCATENATE `"` symbol-header `":` INTO symbol-header.
IF mv_format_output EQ abap_true.
CONCATENATE symbol-header ` ` INTO symbol-header.
ENDIF.
APPEND symbol TO <struct>-result-symbols.
ENDLOOP.
ENDIF.
ENDLOOP.
ENDIF.
result = <struct>-result.
IF data IS BOUND AND data NE <struct>-result-data.
result-data = data.
ASSIGN data->* TO <data>.
LOOP AT result-symbols ASSIGNING <symbol>.
ASSIGN COMPONENT <symbol>-name OF STRUCTURE <data> TO <field>.
GET REFERENCE OF <field> INTO <symbol>-value.
ENDLOOP.
ENDIF.
ENDMETHOD. "GET_SYMBOLS_STRUCT
ENDCLASS.
Custom ABAP to JSON, JSON to ABAP name mapping
By default, you control the way JSON names are formatted/mapped to ABAP names by selecting the proper pretty_mode as a parameter of the SERIALIZE/DESERIALIZE/GENERATE methods. But in some cases the standard, hard-coded formatting is not enough, for example, when you need special rules for name formatting (to use special characters), or when a JSON attribute name is too long to be mapped to an ABAP name (which is limited to 30 characters).
The recommended way for custom mapping used to be an extension of the /UI2/CL_JSON class, redefining the methods PRETTY_NAME or PRETTY_NAME_EX, but since note 2526405 there is an easier way that does not require an own class. If you have a static list of field mappings between ABAP and JSON, you can pass the name mapping table as a parameter to the constructor/SERIALIZE/DESERIALIZE and control the way JSON names are formatted/mapped to ABAP names.
TYPES:
BEGIN OF tp_s_data,
sschema TYPE string,
odatacontext TYPE string,
shortened_abap_name TYPE string,
standard TYPE string,
END OF tp_s_data.
DATA: ls_exp TYPE tp_s_data,
lt_mapping TYPE /ui2/cl_json=>name_mappings,
lv_json TYPE /ui2/cl_json=>json.
lt_mapping = VALUE #( ( abap = `SSCHEMA` json = `$schema` )
( abap = `ODATACONTEXT` json = `@odata.context` )
( abap = `SHORTENED_ABAP_NAME` json = `VeeeeryyyyyLooooongJSONAttrbuuuuuuuuuteeeeeeeeeee` ) ).
lv_json = /ui2/cl_json=>serialize( data = ls_exp name_mappings = lt_mapping ).
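The same mapping table also works in the opposite direction. A minimal sketch, reusing the types and variables from the example above:

```abap
" Deserialization honors the same mapping table: the JSON attributes
" "$schema", "@odata.context" and the long attribute name are written
" back into the ABAP fields listed in LT_MAPPING.
DATA: ls_act TYPE tp_s_data.
/ui2/cl_json=>deserialize( EXPORTING json          = lv_json
                                     name_mappings = lt_mapping
                           CHANGING  data          = ls_act ).
```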
In some cases, you need custom formatting for your ABAP data when serializing it into JSON. Another use case: you have custom, DDIC-defined data types that are not automatically recognized by the standard code, so no appropriate formatting is applied (for example, a custom boolean or timestamp type).
In this case, you have the following options:
- Extend the class and overwrite the method DUMP_TYPE. See an example in section "/UI2/CL_JSON extension".
- Add conversion exits for your custom type and apply formatting as part of the conversion exit.
- Create an alternative structure, with your custom types replaced by supported types, only for serialization, and do the move of data before the serialization.
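For the second option, serialization has to be started with the conversion_exits flag enabled. A minimal sketch, assuming ls_data contains a field typed with a DDIC element whose domain carries a conversion routine (e.g. ALPHA):

```abap
" Sketch of the conversion-exit option: with CONVERSION_EXITS enabled,
" the CONVERSION_EXIT_*_OUTPUT function is applied to such fields during
" serialization, and CONVERSION_EXIT_*_INPUT during deserialization.
DATA: lv_json TYPE /ui2/cl_json=>json.
lv_json = /ui2/cl_json=>serialize( data             = ls_data
                                   conversion_exits = abap_true ).
```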
Serialization/deserialization of hierarchical/recursive data
Handling recursive data structures in ABAP is not trivial, and neither is serializing and deserializing them.
If you would like to model your hierarchical (tree-like) data as ABAP structures, the only way is to do it as in the example below, using references to generic data:
TYPES:
BEGIN OF ts_node,
id TYPE i,
children TYPE STANDARD TABLE OF REF TO data WITH DEFAULT KEY,
END OF ts_node.
DATA: lv_exp TYPE string,
lv_act TYPE string,
ls_data TYPE ts_node,
lr_data LIKE REF TO ls_data.
ls_data-id = 1.
CREATE DATA lr_data.
lr_data->id = 2.
APPEND lr_data TO ls_data-children.
Such a way is more or less straightforward and will work, but it loses the type information of the data persisted in the children table. That means you need to cast the data when you access it. In addition, it prevents you from deserializing such data from JSON, because the parser cannot deduce the type of the data to be created in the children table. Serialization, however, works fine:
lv_exp = '{"ID":1,"CHILDREN":[{"ID":2,"CHILDREN":[]}]}'.
lv_act = /ui2/cl_json=>serialize( data = ls_data ).
cl_aunit_assert=>assert_equals( act = lv_act exp = lv_exp msg = 'Serialization of recursive data structure fails' ).
The better way to model hierarchical data in ABAP is with objects, since objects are always processed as references and ABAP allows you to create nested data structures referring to objects of the same type:
CLASS lcl_test DEFINITION FINAL.
PUBLIC SECTION.
DATA: id TYPE i.
DATA: children TYPE STANDARD TABLE OF REF TO lcl_test.
ENDCLASS. "lcl_test DEFINITION
In that manner, you can process data in the same way as with ABAP structures, but with typed access; serialization/deserialization to and from JSON works fine, since the types can be deduced from the referenced class:
DATA: lo_act TYPE REF TO lcl_test,
lo_exp TYPE REF TO lcl_test,
lv_json TYPE string,
lo_child TYPE REF TO lcl_test.
CREATE OBJECT lo_exp.
lo_exp->id = 1.
CREATE OBJECT lo_child.
lo_child->id = 2.
APPEND lo_child TO lo_exp->children.
lv_json = /ui2/cl_json=>serialize( data = lo_exp ).
/ui2/cl_json=>deserialize( EXPORTING json = lv_json CHANGING data = lo_act ).
Remark: there are some constraints on data design regarding the deserialization of objects:
- You cannot use constructors with obligatory parameters.
- References to interfaces will not be deserialized.
Serializing of protected and private attributes
If you do the serialization from outside of your class, you can access only the public attributes of that class. To serialize all attributes (private and protected as well), you need to grant /UI2/CL_JSON access to them. This can be done by defining /UI2/CL_JSON as a friend of your class. In this way, you do not break your encapsulation for other classes but enable /UI2/CL_JSON to access all data of your class.
If you do not own the class you want to serialize, you can probably inherit your own class from it and add the friend declaration there. In this case, you can access at least the protected attributes.
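A minimal sketch of the FRIENDS approach for a global class (the class name zcl_my_model and its attributes are placeholders):

```abap
" Granting /UI2/CL_JSON access to non-public attributes: declare it as
" a (global) friend of your class, so private and protected members are
" serialized as well.
CLASS zcl_my_model DEFINITION PUBLIC CREATE PUBLIC
  GLOBAL FRIENDS /ui2/cl_json.
  PUBLIC SECTION.
    DATA: id TYPE i.
  PRIVATE SECTION.
    DATA: secret TYPE string. " accessible thanks to the FRIENDS clause
ENDCLASS.

CLASS zcl_my_model IMPLEMENTATION.
ENDCLASS.
```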
Partial serialization/deserialization
When it is needed:
- You deserialize JSON to ABAP but would like some known parts to be deserialized as a JSON string, because you do not know the nested JSON structure.
- You deserialize a collection (array/associative array) that contains objects with heterogeneous structures (for example, the same field has a different type depending on the object type). Using partial deserialization, you can restore such a type as a JSON string in ABAP and apply an additional deserialization later, based on the object type.
- You serialize ABAP to JSON and have some ready JSON pieces (strings) which you would like to mix in.
The solution /UI2/CL_JSON offers for this is the type /UI2/CL_JSON=>JSON (an alias for the built-in type string). ABAP fields declared with this type are serialized/deserialized as raw JSON pieces. Be aware that during serialization from ABAP to JSON, the content of such a JSON piece is not validated, so if you pass an invalid JSON block, it may corrupt the whole resulting JSON string.
Below you can find examples for partial serialization/deserialization.
Serialization:
TYPES: BEGIN OF ts_record,
id TYPE string,
columns TYPE /ui2/cl_json=>json,
END OF ts_record.
DATA: lv_json TYPE /ui2/cl_json=>json,
lt_data TYPE SORTED TABLE OF ts_record WITH UNIQUE KEY id,
ls_data LIKE LINE OF lt_data.
ls_data-id = 'O000001ZZ_SO_GRES_CONTACTS'.
ls_data-columns = '{"AGE":{"bVisible":true,"iPosition":2},"BRSCH":{"bVisible":true}}'.
INSERT ls_data INTO TABLE lt_data.
ls_data-id = 'O000001ZZ_TRANSIENT_TEST_A'.
ls_data-columns = '{"ABTNR":{"bVisible":false},"CITY1":{"bVisible":false},"IC_COMPANY_KEY":{"bVisible":true}}'.
INSERT ls_data INTO TABLE lt_data.
lv_json = /ui2/cl_json=>serialize( data = lt_data assoc_arrays = abap_true pretty_name = /ui2/cl_json=>pretty_mode-camel_case ).
WRITE / lv_json.
Results in:
{
  "O000001ZZ_SO_GRES_CONTACTS": {
    "columns": {
      "AGE": {
        "bVisible": true,
        "iPosition": 2
      },
      "BRSCH": {
        "bVisible": true
      }
    }
  },
  "O000001ZZ_TRANSIENT_TEST_A": {
    "columns": {
      "ABTNR": {
        "bVisible": false
      },
      "CITY1": {
        "bVisible": false
      },
      "IC_COMPANY_KEY": {
        "bVisible": true
      }
    }
  }
}
Deserialization:
TYPES: BEGIN OF ts_record,
id TYPE string,
columns TYPE /ui2/cl_json=>json,
END OF ts_record.
DATA: lv_json TYPE string,
lt_act TYPE SORTED TABLE OF ts_record WITH UNIQUE KEY id.
CONCATENATE
'{"O000001ZZ_SO_GRES_CONTACTS":{"columns":{"AGE":{"bVisible":true,"iPosition":2},"BRSCH":{"bVisible":true}}},'
'"O000001ZZ_TRANSIENT_TEST_A":{"columns":{"ABTNR":{"bVisible":false},"CITY1":{"bVisible":false},"IC_COMPANY_KEY":{"bVisible":true}}}}'
INTO lv_json.
" if you know first level of underlying structure ("columns" field) -> Output Var 1
/ui2/cl_json=>deserialize( EXPORTING json = lv_json assoc_arrays = abap_true CHANGING data = lt_act ).
" if you do not know underlying structure of first level (naming of second filed e.g columns in example does not matter )
" => result is a little bit different -> Output Var 2
/ui2/cl_json=>deserialize( EXPORTING json = lv_json assoc_arrays = abap_true assoc_arrays_opt = abap_true CHANGING data = lt_act ).
Results in the following ABAP data:
Output Var 1:
ID(CString) COLUMNS(CString)
O000001ZZ_SO_GRES_CONTACTS {"AGE":{"bVisible":true,"iPosition":2},"BRSCH":{"bVisible":true}}
O000001ZZ_TRANSIENT_TEST_A {"ABTNR":{"bVisible":false},"CITY1":{"bVisible":false},"IC_COMPANY_KEY":{"bVisible":true}}
Output Var 2:
ID(CString) COLUMNS(CString)
O000001ZZ_SO_GRES_CONTACTS {"columns":{"AGE":{"bVisible":true,"iPosition":2},"BRSCH":{"bVisible":true}}}
O000001ZZ_TRANSIENT_TEST_A {"columns":{"ABTNR":{"bVisible":false},"CITY1":{"bVisible":false},"IC_COMPANY_KEY":{"bVisible":true}}}
/UI2/CL_JSON extension
If the standard class functionality does not fit your requirements, there are two ways to adapt it to your needs:
- Use a local copy of the class /UI2/CL_JSON and modify logic directly, by the change of original code.
- Inherit from class /UI2/CL_JSON and override methods where another logic is required.
The advantage of the first approach is that you are completely free in what you may change and have full control of the class lifecycle. The disadvantage: you will probably need to merge your changes with /UI2/CL_JSON updates.
With the second approach you can use /UI2/CL_JSON directly (prerequisite is the latest version of note 2330592) and do not need to care about merges, but you can override only certain methods. These methods are:
IS_COMPRESSABLE – called to check whether the output for the given type may be suppressed during ABAP to JSON serialization when the value is initial.
- > TYPE_DESCR (ref to CL_ABAP_TYPEDESCR) – value type
- < RV_COMPRESS (bool) – compress initial value
The default implementation of the method allows compressing any initial value.
PRETTY_NAME – called to format an ABAP field name written to JSON, or to map a JSON attribute back to an ABAP field, when the pretty_name parameter of the SERIALIZE/DESERIALIZE methods equals PRETTY_MODE-CAMEL_CASE.
- > IN (CSEQUENCE) – Field name to pretty print.
- < OUT (STRING) – Pretty printed field name
The default implementation applies camelCase formatting based on the “_” symbol. To output a “_” symbol, use a double “__” in the field name.
PRETTY_NAME_EX – called to format an ABAP field name written to JSON, or to map a JSON attribute back to an ABAP field, when the pretty_name parameter of the SERIALIZE/DESERIALIZE methods equals PRETTY_MODE-EXTENDED.
- > IN (CSEQUENCE) – Field name to pretty print.
- < OUT (STRING) – Pretty printed field name
The default implementation does the same as PRETTY_NAME, and in addition converts the special characters "!#$%&*-~/:|@.".
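As an illustration of the camelCase mode described above, a short sketch:

```abap
" PRETTY_MODE-CAMEL_CASE: the ABAP component FIRST_NAME is rendered
" as the JSON attribute "firstName" (the "_" is dropped and the
" following letter is capitalized).
TYPES: BEGIN OF ts_demo,
         first_name TYPE string,
       END OF ts_demo.
DATA: ls_demo TYPE ts_demo,
      lv_json TYPE /ui2/cl_json=>json.
ls_demo-first_name = 'John'.
lv_json = /ui2/cl_json=>serialize(
            data        = ls_demo
            pretty_name = /ui2/cl_json=>pretty_mode-camel_case ).
" lv_json => {"firstName":"John"}
```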
DUMP_INT – called for recursive serialization of complex ABAP data objects (structures, classes, tables) into a JSON string.
- > DATA (DATA) – Any data to serialize.
- > TYPE_DESCR (ref to CL_ABAP_TYPEDESCR, optional) – Type of data provided
- < R_JSON (JSON) – serialized JSON value
DUMP_TYPE – called for serialization of elementary ABAP data types (string, boolean, timestamp, etc.) into a JSON attribute value. Override it if you, for example, want to apply output data conversion or currency rounding.
- > DATA (DATA) – Any data to serialize
- > TYPE_DESCR (ref to CL_ABAP_TYPEDESCR) – Type of data provided
- < R_JSON (JSON) – serialized JSON value
RESTORE – called to deserialize JSON objects into ABAP structures.
- > JSON (JSON) – JSON string to deserialize
- > LENGTH (I) – Length of the JSON string
- > TYPE_DESCR (ref to CL_ABAP_TYPEDESCR, optional) – Type of changing data provided
- > FIELD_CACHE (type T_T_FIELD_CACHE, optional) – Cache of ABAP data fields with type information
- <> DATA (type DATA, optional) – ABAP data object to fill
- <> OFFSET (I) – parsing start point in JSON string
RESTORE_TYPE – called to deserialize simple JSON attributes and JSON arrays.
- > JSON (JSON) – JSON string to deserialize
- > LENGTH (I) – Length of the JSON string
- > TYPE_DESCR (ref to CL_ABAP_TYPEDESCR, optional) – Type of changing data provided
- > FIELD_CACHE (type T_T_FIELD_CACHE, optional) – Cache of ABAP data fields with type information
- <> DATA (type DATA, optional) – ABAP data object to fill
- <> OFFSET (I) – parsing start point in JSON string
CLASS_CONSTRUCTOR – used to initialize static variables. You cannot override it, but you can implement your own class constructor that adapts the defaults, for example, adding additional boolean types to be recognized during serialization/deserialization.
SERIALIZE/DESERIALIZE - these methods are static and therefore cannot be redefined. They are convenience helpers for consuming code, hiding the construction of the class instance and the subsequent *_INT calls. If you want similar helpers in your custom class, copy these methods into new ones (e.g. *_EX), replace the /UI2/CL_JSON type there with your custom class name, and use them instead of the standard ones.
Extension using inheritance:
CLASS lc_json_custom DEFINITION FINAL INHERITING FROM /ui2/cl_json.
  PUBLIC SECTION.
    CLASS-METHODS:
      class_constructor,
      deserialize_ex IMPORTING json TYPE json OPTIONAL
                               pretty_name TYPE pretty_name_mode DEFAULT pretty_mode-none
                     CHANGING data TYPE data,
      serialize_ex IMPORTING data TYPE data
                             compress TYPE bool DEFAULT c_bool-false
                             pretty_name TYPE pretty_name_mode DEFAULT pretty_mode-none
                   RETURNING value(r_json) TYPE json.
  PROTECTED SECTION.
    METHODS:
      is_compressable REDEFINITION,
      pretty_name REDEFINITION,
      dump_type REDEFINITION.
ENDCLASS. "lc_json_custom DEFINITION
CLASS lc_json_custom IMPLEMENTATION.
  METHOD class_constructor.
    CONCATENATE mc_bool_types `\TYPE=/UI2/BOOLEAN` INTO mc_bool_types.
  ENDMETHOD. "class_constructor

  METHOD is_compressable.
    IF type_descr->absolute_name EQ `\TYPE=STRING` OR name EQ `INITIAL`.
      rv_compress = abap_false.
    ELSE.
      rv_compress = abap_true.
    ENDIF.
  ENDMETHOD. "is_compressable

  METHOD pretty_name.
    out = super->pretty_name( in ).
    CONCATENATE out 'Xxx' INTO out.
  ENDMETHOD. "pretty_name

  METHOD dump_type.
    DATA: is_ddic    TYPE abap_bool,
          ddic_field TYPE dfies.

    is_ddic = type_descr->is_ddic_type( ).
    IF is_ddic EQ abap_true.
      ddic_field = type_descr->get_ddic_field( ).
      IF mv_ts_as_iso8601 EQ c_bool-true AND ddic_field-domname EQ `TZNTSTMPL`.
        r_json = data.
        CONCATENATE `"` r_json(4) `-` r_json+4(2) `-` r_json+6(2) `T` r_json+8(2) `:` r_json+10(2) `:` r_json+12(2) `.` r_json+15(7) `Z"` INTO r_json.
        RETURN.
      ENDIF.
    ENDIF.

    IF mv_ts_as_iso8601 EQ c_bool-true AND type_descr->absolute_name EQ `\TYPE=LCM_CHANGED_ON`.
      r_json = data.
      CONCATENATE `"` r_json(4) `-` r_json+4(2) `-` r_json+6(2) `T` r_json+8(2) `:` r_json+10(2) `:` r_json+12(2) `.` r_json+15(7) `Z"` INTO r_json.
      RETURN.
    ENDIF.

    r_json = super->dump_type( data = data type_descr = type_descr convexit = convexit ).
  ENDMETHOD. "dump_type

  METHOD serialize_ex.
    DATA: lo_json TYPE REF TO lc_json_custom.

    CREATE OBJECT lo_json
      EXPORTING
        compress         = compress
        pretty_name      = pretty_name
        assoc_arrays     = abap_true
        assoc_arrays_opt = abap_true
        expand_includes  = abap_true
        numc_as_string   = abap_true
        ts_as_iso8601    = abap_true.
    r_json = lo_json->serialize_int( data = data ).
  ENDMETHOD. "serialize_ex

  METHOD deserialize_ex.
    DATA: lo_json TYPE REF TO lc_json_custom.

    IF json IS NOT INITIAL.
      CREATE OBJECT lo_json
        EXPORTING
          pretty_name      = pretty_name
          assoc_arrays     = abap_true
          assoc_arrays_opt = abap_true.
      TRY.
          lo_json->deserialize_int( EXPORTING json = json CHANGING data = data ).
        CATCH cx_sy_move_cast_error.
      ENDTRY.
    ENDIF.
  ENDMETHOD. "deserialize_ex
ENDCLASS. "lc_json_custom IMPLEMENTATION
TYPES:
  BEGIN OF tp_s_data,
    tribool TYPE lc_json_custom=>tribool,
    bool    TYPE lc_json_custom=>bool,
    str1    TYPE string,
    str2    TYPE string,
    initial TYPE i,
  END OF tp_s_data.

DATA: ls_exp         TYPE tp_s_data,
      ls_act         LIKE ls_exp,
      lo_json_custom TYPE REF TO lc_json_custom,
      lv_json_custom TYPE lc_json_custom=>json.

ls_exp-tribool = lc_json_custom=>c_tribool-false.
ls_exp-bool    = lc_json_custom=>c_bool-false.
ls_exp-str1    = ''.
ls_exp-str2    = 'ABC'.
ls_exp-initial = 0.

CREATE OBJECT lo_json_custom
  EXPORTING
    compress    = abap_true
    pretty_name = lc_json_custom=>pretty_mode-camel_case.

lv_json_custom = lo_json_custom->serialize_int( data = ls_exp ).
lo_json_custom->deserialize_int( EXPORTING json = lv_json_custom CHANGING data = ls_act ).

" alternative way
lc_json_custom=>deserialize_ex( EXPORTING json = lv_json_custom CHANGING data = ls_act ).

cl_aunit_assert=>assert_equals( act = ls_act exp = ls_exp msg = 'Custom pretty name fails!' ).
WRITE / lv_json_custom.
Results in the following JSON:
{
"triboolXxx": false,
"str1Xxx": "",
"str2Xxx": "ABC",
"initialXxx": 0
}
Deserialization of an untyped (unknown) JSON object
If you need to deserialize a JSON object with an unknown structure, if you do not have a matching data type on the ABAP side, or if the data type of the resulting object may vary, you can generate an ABAP object on the fly using the corresponding GENERATE method. The method has some limitations compared to standard deserialization:
- all fields are generated as references (even elementary types)
- you cannot control how arrays or timestamps are deserialized
- you cannot access components of the generated structure statically (since the structure is unknown at compile time) and need to use dynamic access
The simplest example, with straightforward access:
DATA: lv_json TYPE /ui2/cl_json=>json,
lr_data TYPE REF TO data.
FIELD-SYMBOLS:
<data> TYPE data,
<struct> TYPE any,
<field> TYPE any.
lv_json = `{"name":"Key1","properties":{"field1":"Value1","field2":"Value2"}}`.
lr_data = /ui2/cl_json=>generate( json = lv_json ).
" OK, generated, now let us access somete field :(
IF lr_data IS BOUND.
ASSIGN lr_data->* TO <data>.
ASSIGN COMPONENT `PROPERTIES` OF STRUCTURE <data> TO <field>.
IF <field> IS ASSIGNED.
lr_data = <field>.
ASSIGN lr_data->* TO <data>.
ASSIGN COMPONENT `FIELD1` OF STRUCTURE <data> TO <field>.
IF <field> IS ASSIGNED.
lr_data = <field>.
ASSIGN lr_data->* TO <data>.
WRITE: <data>. " We got it -> Value1
ENDIF.
ENDIF.
ENDIF.
A nice alternative, using dynamic data accessor helper class:
DATA: lv_json TYPE /ui2/cl_json=>json,
lr_data TYPE REF TO data,
lv_val TYPE string.
lv_json = `{"name":"Key1","properties":{"field1":"Value1","field2":"Value2"}}`.
lr_data = /ui2/cl_json=>generate( json = lv_json ).
/ui2/cl_data_access=>create( ir_data = lr_data iv_component = `properties-field1`)->value( IMPORTING ev_data = lv_val ).
WRITE: lv_val.
Implicit generation of ABAP objects on deserialization
In addition to the explicit generation of ABAP data objects from a JSON string, the deserializer supports an implicit way of generation during the DESERIALIZE(_INT) call. To trigger generation, your output data structure shall contain a field typed REF TO DATA, and the name of the field shall match the JSON attribute (pretty name rules are considered). Depending on the value of the field, the behavior differs:
- The value is not bound (initial): the deserializer will use generation rules when creating the corresponding data types of the referenced value.
- The value is bound (but may be empty): the deserializer will create a new referenced value based on the referenced type.
TYPES:
BEGIN OF ts_dyn_data1,
name TYPE string,
value TYPE string,
END OF ts_dyn_data1,
BEGIN OF ts_dyn_data2,
key TYPE string,
value TYPE string,
END OF ts_dyn_data2,
BEGIN OF ts_data,
str TYPE string,
data TYPE REF TO data,
END OF ts_data.
DATA:
ls_data TYPE ts_data,
lv_json TYPE /ui2/cl_json=>json.
lv_json = `{"str":"Test","data":{"name":"name1","value":"value1"}}`.
" deserialize data and use generic generation for field "data",
" the same as with method GENERATE (using temporary data type)
/ui2/cl_json=>deserialize( EXPORTING json = lv_json CHANGING data = ls_data ).
" deserialize data and use type TS_DYN_DATA1 for the field "data"
CREATE DATA ls_data-data TYPE ts_dyn_data1.
/ui2/cl_json=>deserialize( EXPORTING json = lv_json CHANGING data = ls_data ).
" deserialize data and use alternative type TS_DYN_DATA2 for the field "data"
CREATE DATA ls_data-data TYPE ts_dyn_data2.
/ui2/cl_json=>deserialize( EXPORTING json = lv_json CHANGING data = ls_data ).
Automatic deserialization of JSON into an appropriate ABAP structure is not supported. The default implementation assumes that you know the target data structure (at least partially; that also works) to deserialize JSON in ABAP and then work with typed data.
But if for some reason you need to deserialize JSON into the source ABAP structure in a generic way, you can extend both serialize/deserialize methods and wrap the outputs/inputs of /UI2/CL_JSON with technical metadata describing the source ABAP structure, and use this information during deserialization (or use the GENERATE method). Of course, you need to ensure that the source ABAP data type is known in the deserialization scope (global and local types are "visible").
See the example below:
TYPES: BEGIN OF ts_json_meta,
abap_type LIKE cl_abap_typedescr=>absolute_name,
data TYPE string,
END OF ts_json_meta.
DATA: lt_flight TYPE STANDARD TABLE OF sflight,
lv_json TYPE string,
lo_data TYPE REF TO data,
ls_json TYPE ts_json_meta.
FIELD-SYMBOLS: <data> TYPE any.
SELECT * FROM sflight INTO TABLE lt_flight.
* serialize table lt_flight into JSON, skipping initial fields and converting ABAP field names into camelCase
ls_json-data = /ui2/cl_json=>serialize( data = lt_flight compress = abap_true pretty_name = /ui2/cl_json=>pretty_mode-camel_case ).
ls_json-abap_type = cl_abap_typedescr=>describe_by_data( lt_flight )->absolute_name.
lv_json = /ui2/cl_json=>serialize( data = ls_json compress = abap_true pretty_name = /ui2/cl_json=>pretty_mode-camel_case ).
WRITE / lv_json.
CLEAR: ls_json, lt_flight.
* deserialize JSON string json into internal table lt_flight doing camelCase to ABAP like field name mapping
/ui2/cl_json=>deserialize( EXPORTING json = lv_json pretty_name = /ui2/cl_json=>pretty_mode-camel_case CHANGING data = ls_json ).
CREATE DATA lo_data TYPE (ls_json-abap_type).
ASSIGN lo_data->* TO <data>.
/ui2/cl_json=>deserialize( EXPORTING json = ls_json-data pretty_name = /ui2/cl_json=>pretty_mode-camel_case CHANGING data = <data> ).
IF lo_data IS NOT INITIAL.
BREAK-POINT. " check here lo_data
ENDIF.
Exception Handling in /UI2/CL_JSON
By default, /UI2/CL_JSON tries to hide thrown exceptions (which may happen during deserialization) from the consumer, catching them on all levels. In some cases this results in missing attributes; in other cases, when the error is critical and the parser cannot recover, you get an empty object back. The main TRY/CATCH block that keeps exceptions from escaping is in the DESERIALIZE method.
If you want error reporting, use the instance method DESERIALIZE_INT, which may raise CX_SY_MOVE_CAST_ERROR. The reporting is rather limited: all errors are translated into CX_SY_MOVE_CAST_ERROR and no additional information is available.
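Error-aware deserialization via the instance API can be sketched roughly as follows (the target structure ls_data and the malformed input string are illustrative assumptions, not part of the class API):

```abap
" Sketch: catch parser errors by using the instance method DESERIALIZE_INT
" instead of the static DESERIALIZE, which swallows exceptions.
DATA: lo_json TYPE REF TO /ui2/cl_json,
      lv_json TYPE /ui2/cl_json=>json.

lv_json = `{"str": 123}`. " possibly malformed or type-mismatched input

CREATE OBJECT lo_json.
TRY.
    lo_json->deserialize_int( EXPORTING json = lv_json CHANGING data = ls_data ).
  CATCH cx_sy_move_cast_error.
    " all parser errors surface as this single exception type
    MESSAGE 'Invalid JSON input' TYPE 'I'.
ENDTRY.
```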
Below is a small example of CALL TRANSFORMATION usage to produce JSON from ABAP structures. Do not ask me for details - I do not know them; it was just a small test of mine.
DATA: lt_flight TYPE STANDARD TABLE OF sflight,
lo_writer TYPE REF TO cl_sxml_string_writer,
lv_output_length TYPE i,
lt_binary_tab TYPE STANDARD TABLE OF sdokcntbin,
lv_jsonx TYPE xstring,
lv_json TYPE string.
SELECT * FROM sflight INTO TABLE lt_flight.
* ABAP to JSON
lo_writer = cl_sxml_string_writer=>create( type = if_sxml=>co_xt_json ).
CALL TRANSFORMATION id SOURCE text = lt_flight RESULT XML lo_writer.
lv_jsonx = lo_writer->get_output( ).
CALL FUNCTION 'SCMS_XSTRING_TO_BINARY'
EXPORTING
buffer = lv_jsonx
IMPORTING
output_length = lv_output_length
TABLES
binary_tab = lt_binary_tab.
CALL FUNCTION 'SCMS_BINARY_TO_STRING'
EXPORTING
input_length = lv_output_length
IMPORTING
text_buffer = lv_json
output_length = lv_output_length
TABLES
binary_tab = lt_binary_tab.
* JSON to ABAP
CALL TRANSFORMATION id SOURCE XML lv_jsonx RESULT text = lt_flight.
FAQ
GENERATE or DESERIALIZE into REF TO DATA vs DESERIALIZE into a typed data structure
It is always better to deserialize into an explicit data structure than into an anonymous reference:
- It is faster
- It is type-safe
- Processing deserialized result is much easier.
Deserializing into REF TO DATA is the same as using the GENERATE method and results in generating ABAP types at runtime, which is quite slow. You cannot specify the resulting types for elements, so the deserializer needs to guess. To process generated results, you always need dynamic programming, which is slow by default (or /UI2/CL_DATA_ACCESS, which is more comfortable but still uses dynamic programming internally).
Serialize huge data objects into JSON and short dumps (SYSTEM_NO_ROLL, STRING_SIZE_TOO_LARGE, MEMORY_NO_MORE_PAGING)
You are using the class /ui2/cl_json to serialize your data into JSON. Unfortunately, sometimes you pass tables that are too big, which results in a JSON string that is too long (for example, longer than 1 GB), and this leads to dumps because the system cannot allocate such a big contiguous memory chunk. Potentially this specific case can be solved by increasing the memory allocation limit, but you would still run into the INT4 limit for string length, which cannot exceed 2 GB.
A string (JSON) of such size cannot be created, transported, or persisted. You would need special handling on your side for this case.
E.g. if you want to serialize such a big amount of data, you will need to split the input into chunks and do the serialization and transport of the resulting JSON chunks by parts.
The memory exceptions are not catchable, so you will need to evaluate the data size on your side before calling serialization.
So, the only robust way to solve the issue is to have a limit on serialized data size, which can be enforced only on the /ui2/cl_json consumer side.
Even if you selected another format for serialization (XML or ABAP JSON), you would still face some limit. So, there is no other way.
If you still need to serialize everything, you may split the data into chunks and give them to the serializer one by one, and then deserialize all chunks into the same data object for merging.
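The chunking approach above can be sketched roughly like this (lt_data and the chunk size are illustrative assumptions, not part of the class API):

```abap
" Sketch: serialize a big table in chunks to stay below the 2 GB string limit.
CONSTANTS lc_chunk_size TYPE i VALUE 100000.
DATA: lt_chunk       LIKE lt_data,
      lt_json_chunks TYPE STANDARD TABLE OF string,
      lv_json        TYPE string,
      lv_from        TYPE i VALUE 1,
      lv_to          TYPE i.

WHILE lv_from <= lines( lt_data ).
  lv_to = lv_from + lc_chunk_size - 1.
  CLEAR lt_chunk.
  APPEND LINES OF lt_data FROM lv_from TO lv_to TO lt_chunk.
  lv_json = /ui2/cl_json=>serialize( data = lt_chunk ).
  " transport or persist each JSON chunk separately
  APPEND lv_json TO lt_json_chunks.
  lv_from = lv_to + 1.
ENDWHILE.
```

On the receiving side, each chunk can then be deserialized into the same table one after another to merge the data back.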
Encoding of Unicode characters (for example Chinese) with /UI2/CL_JSON
/ui2/cl_json does not do any explicit character encoding; this is done by ABAP. Normally, ABAP works with UTF-16, a 2-byte Unicode encoding that can represent any character, including Chinese. That is why you see Chinese characters in the debugger. After serializing to JSON (check in the debugger: the Chinese characters are still there), you pass the JSON string further, maybe as a REST response. There it is probably converted into UTF-8, a multibyte encoding where some characters (Latin) are encoded with one byte and others (Chinese, Russian, etc.) with several bytes. The viewer of such UTF-8 text must then interpret the characters properly and display them. If you do not see the characters as expected in your viewer tool, then probably nothing is corrupt and the receiver will get them fine; it is just an issue of the viewer, which does not recognize UTF-8 or has lost or misinterpreted the encoding ID.
Incompatible change for initial date/time fields serializing with PL16
First of all, I agree that this is an incompatible change, and I apologize for the effort it causes you. But it was an intentional change, and I was aware that someone could already rely on the current behavior and run into issues.
The reason for this change of default was a customer complaint regarding the handling of initial date/time values rendered as 0000-00-00 or 00:00:00. In general, 0000-00-00 is an invalid date; 00:00:00 is a valid time, but how can the receiver tell that it is an initial value and not explicit midnight?
Because of that, I decided not to render initial date/time values at all, giving the receiver a way to recognize that the value is initial and to apply its own default/initial processing. I know it is incompatible, but I want the default behavior to be the best and most common choice, even at the cost of modifying consumer code that relies on the old behavior :/.
Because custom rendering of initial date/time values is quite exotic, I have enabled it only via constructor calls and have not extended the SERIALIZE method, to keep the common API simple. If I get multiple requests to extend SERIALIZE with these defaults, I will do so.
My recommendation for you:
- Variant 1: Adapt your unit tests to the new initial values for date/time.
- Variant 2: Use the instance methods for serialization, e.g. a parametrized CONSTRUCTOR + SERIALIZE_INT.
- Variant 3: Extend the /ui2/cl_json class, or create a helper method in your own class with a static SERIALIZE call that already passes the new defaults to the /ui2/cl_json constructor.
Version History
Click here for details...
/UI2/CL_JSON
- Fixed: Serializing of hash tables with empty values and the parameters assoc_arrays_opt and compress produces invalid JSON.
- Fixed: Method serialize produces corrupt data when an initial date or time field is serialized (something like "--" or "::" ).
- Fixed: Performance for serializing and deserializing dynamic data objects (REF TO DATA) is greatly improved.
- New: You can apply text formatting/beautifying to serialized JSON with the new parameter format_output.
- New: You can control the format in which hex values are serialized or deserialized with the new parameter hex_as_base64 for the SERIALIZE and DESERIALIZE methods. If hex_as_base64 is set to abap_true, binary/hex values are processed as base64 (default, compatible behaviour); if it is set to abap_false, values are processed in raw hex form.
- New: Now you can provide bool types (true, false), 3-state bool types (true, false, undefined), initial timestamp, initial time, and initial date (the JSON value used for an initial ABAP value) in the instance constructor. This allows you to parametrize the class instead of inheriting from it (better performance).
Click here for details...
/UI2/CL_JSON
- Fixed. Generating of ABAP structures for JSON attributes which include special characters like "/\:;~.,-+=><|()[]{}@+*?!&$#%^'§`" fails.
- Fixed. Edm.Guid with the mixed or lower case is not recognized.
- Fixed. Iterating and data assignments for the generated data object (REF TO DATA), produced by GENERATE and DESERIALIZE methods fail in ABAP Cloud.
/UI2/CL_DATA_ACCESS
- Fixed. Index access to generated tables in LOOP construction fails.
Click here for details...
/UI2/CL_JSON
- Fixed. Unescaping of strings with a single Unicode entity (e.g "\uXXXX") does not work
- New. More robust logic for handling invalid JSON (e.g cases with extra "," without further element { "a": 1, } )
Click here for details...
/UI2/CL_JSON
- Fixed. Conversion exits do not work when data is located inside of internal tables.
- Fixed. TIMESTAMPL subsecond values are truncated when deserializing from Edm.DateTime.
Click here for details...
/UI2/CL_JSON
- New. DESERIALIZE and GENERATE methods supporting decoding of Unicode symbols (\u001F)
- Fixed. Invalid JSON causing <STRING_OFFSET_TOO_LARGE> exception and dump.
/UI2/CL_DATA_ACCESS
- Fixed. Access to fields with special characters in the name (e.g "/BIC/YEAR") fails.
Click here for details...
/UI2/CL_JSON
- Optimized. Performance lost, introduced in PL10 (note 2763854) when unescaping special characters (\r\n\t\")
- Fixed. Short dump, with <STRING_OFFSET_TOO_LARGE> when running GENERATE method with empty or invalid input
/UI2/CL_DATA_ACCESS
- Fixed. Short dump, when accessing elements of null array
Click here for details...
- Fixed: Deserialization and generation of the ABAP data from JSON strings with Unicode characters fail
- Fixed: Unescaping of \\t and \\n char combinations in strings handled incorrectly
- Fixed: GENERATE method fails on JSON attribute names containing spaces
Click here for details...
- New: Support for deserialization of OData Edm.Guid
- New: Support of Enum data types in ABAP (from SAP_BASIS 7.51; below that, enums are ignored).
- New: Support of conversion exits for serializing and deserializing.
- Fixed: SERIALIZE method delivers an invalid JSON string when a NUMC type filled with spaces is used.
Click here for details...
- New: JSON timestamp fields, serialized in OData Edm.DateTime format (e.g. "\/Date(1467981296000)\/") are supported, and properly deserialized in ABAP date, time or timestamp fields
- New: JSON timestamp fields, serialized in OData Edm.Time format (e.g. "PT10H34M55S") are supported, and properly deserialized in ABAP date, time, or timestamp fields
- Fixed: content is scrambled, when using GENERATE method for JSON objects with a name containing special characters (for example "__metadata")
- Fixed: GENERATE method does not consider custom name mapping pairs passed as a parameter for CONSTRUCTOR or GENERATE methods
- Fixed: generation of very long integers (serialized numeric date) fails, due to I type overflow (you get 0 instead of an expected number)
Click here for details...
- Fixed: Deserialization of the inconsistent data (JSON string into ABAP table) leads to a short dump if the JSON string is empty.
- Fixed: Serialization of data with includes with defined suffix (RENAME WITH SUFFIX) dumps
- Fixed: GENERATE method fails, if the JSON object contains duplicate attributes and PRETTY_MODE-CAMEL_CASE is not used.
- Fixed: GENERATE method fails, if JSON object contains attribute names longer than 30 characters (allowed ABAP field length). Can also occur in case the name is shorter than 30 characters, but PRETTY_MODE-CAMEL_CASE is used.
- New: methods DUMP_INT, DUMP_TYPE, RESTORE_TYPE, and RESTORE can be overridden now. So, you can introduce your data type conversion on serialization and deserialization.
- New: now it is possible to pass a name mapping table as a parameter for the constructor/serialize/deserialize and control the way JSON names are formatted/mapped to ABAP names. This may help if you need special rules for name formatting (for special characters or too long JSON attribute names) and the standard pretty-printing modes cannot help. With this feature, you may eliminate the need for a class extension that redefines the PRETTY_NAME and PRETTY_NAME_EX methods.
- New: PRETTY_NAME_EX method extended to support the encoding of more special characters (characters needed in JSON names but that can not be used as part of ABAP name). The supported characters are: "!#$%&*-~/:|@.". Used with pretty_mode-extended.
- New: /UI2/CL_DATA_ACCESS class for working with dynamic ABAP data object (generated with method /UI2/CL_JSON=>GENERATE). The class can be used as a replacement for multiple ASSIGN COMPONENT language constructions.
Click here for details...
- Fixed: Empty JSON objects, serialized as entries of the table, are not deserialized into corresponding ABAP structures and further parsing of the JSON string after an empty object is skipped.
- Fixed: JSON fields containing stringified timestamp representation in ISO 8601 format are not deserialized properly in the corresponding ABAP timestamp field.
- Fixed: Recursive (hierarchical) JSON objects cannot be deserialized.
Click here for details...
- Fixed: Partial serialization/deserialization of the JSON is not supported
- New: Extending of the class is supported
- New: Added support for serializing named include structures from ABAP as embedded sub-objects in JSON
Click here for details...
- New: GENERATE method creates local custom class for deserialization (lc_json_custom), instead of standard /ui2/cl_json
- Fixed: Internal tables are not initialized when deserializing JSON with empty arrays
- New: Deserialization into a field with REF TO data type, if the field is bound, using a referenced data type
- New: Deserialization uses automatic generation of the data if the field has "REF TO DATA" type and bound data is initial
95 Comments
Former Member
Alexey - works ok , thanks for sharing. Worth trying.
Alexey Arsenyev
Use it with pleasure
Former Member
Thanks for this class... we encountered an issue using it... apparently the JSON in Javascript needs to have the number surrounded by double-quotes also...
I added them in the concatenate in the macro and it works like a charm. Thanks
Sunil Kumar Verma
Hi Greg,
I'm facing the same problem.
Could you please post how you overcame the issue? In which macro, and how did you add the concatenate statement?
Thanks in advance,
Sunil
Gregory Tutt
Hey mate, sorry my account has been changed... anyway, we made some changes but I think Alexey already provided the same solution.
Alexey Arsenyev
Hi Greg,
based on my knowledge and on the JSON RFC, numbers shall not be surrounded by quotes.
I assume that in your case you need special handling, while the JS on the client side expects a string instead of a number. The proper way would then be to change the ABAP structure in such a way that it corresponds to the expected format (e.g. change the base data type of the attribute from I-based to N-based).
Can you please post here an example that does not work?
BR, Alexey.
Former Member
Hey,
Unfortunately, we cannot change the structure as it is dynamic. We actually use JSON because OData is even less flexible. The issue is indeed in the JavaScript client, which does not recognise the object during parsing.
Here is an example:
var test = {
"Data" : [ {
"COL_EMPNO" : 00000001,
"COL_MANM1" : "Roberts, Marcia",
"COL_MADOB" : 19630416,
}]
};
JSON.parse(test) fails... Error: Unexpected number
Alexey Arsenyev
Hi Greg,
is this last comma after 19630416 generated, or did you just add it in your example? If it comes from the serializer, it is a bug. But I think the JSON parser on the JS side should overcome it.
I think the reason for the error is: "COL_EMPNO" : 00000001.
Please try example like this if it works:
JSON.parse( {
"Data" : [ {
"COL_EMPNO" : 1,
"COL_MANM1" : "Roberts, Marcia",
"COL_MADOB" : 19630416
}]
}) ;
If this 00000001 is the reason, I will try to fix it and update the parser. The workaround on your side, until I update the parser, is to use type I instead of NUMC for COL_EMPNO.
BR, Alexey.
Former Member
Hello Alexey / Greg,
Accidentally came across this blog; but found the information very helpful. Thank you for the blog and comments.
We had written a custom JSON parser some time ago and I was interested in the standard SAP utility.
But I have encountered 2 issues while testing this;
One: as Greg pointed out, when the output JSON string is parsed with external parsers, they report errors because the numbers need to be in quotes. After reading https://tools.ietf.org/html/rfc7158 I see that JSON numbers need not be in quotes, but they cannot have leading zeros. So I guess the solution could be either to have quotes around the number values or to output SAP's character-based number formats without leading zeros.
Two: the standard utility ( /ui2/cl_json=>serialize ) dumps when you have a meta-structure via the INCLUDE command without a "group" name. I don't know if this is fixed in a higher component version, but you can try the example below.
The below dumps for me :
types: begin of ty_s_str01,
c01 type c length 1,
c02 type c length 1,
end of ty_s_str01.
types: begin of ty_s_str02,
c03 type c length 1.
include type ty_s_str01 .
types: end of ty_s_str02.
data ls_strc type ty_s_str02.
data lv_json type string.
ls_strc-c01 = 'X'.
ls_strc-c03 = 'X'.
lv_json = /ui2/cl_json=>serialize( data = ls_strc
compress = abap_true
pretty_name = abap_true ).
Now replace the definition of ty_s_str02 as below and it should work.
types: begin of ty_s_str02,
c03 type c length 1.
include type ty_s_str01 as str01.
types: end of ty_s_str02.
The same applies to a DDIC structure having an include without a "group" name.
Hope this helps; just thought that I would point it out.
Thanks.
Alexey Arsenyev
Hi John,
the "SAP standard", I think, will be to use CALL TRANSFORMATION with JSON format: I have added the example in the bottom of the article of how to serialize data with it. But, as I have written the code will only work from 7.02 and one does not have too much freedom (easy way) to control output format. But it is faster.
If one would use /UI2/CL_JSON you have nicer consumption + more functionality but less performance (while it is pure ABAP).
Back to problems.
1) output of leading zeroes is a bug that I will correct.
2) support of INCLUDE is a known bug, already fixed in the delivered /UI2/CL_JSON but not yet here. I will update the code together with the fix for leading zeroes soon.
BR, Alexey
Former Member
Hi Alexey,
Thank you for the updates and information on using the transformation.
I was aware of using transformation but it was good to see an example.
About the problems, not a show stopper for me; instead I just thought I would point these out.
Thanks.
Alexey Arsenyev
Hi Guys,
I have corrected both errors: leading zeroes and include structures. Try the new version.
BR, Alexey.
Alexey Arsenyev
Added fix for type conversion overflow on deserializing.
Jan Valousek
Hello,
I have a wish for improvement of /UI2/CL_JSON.
If you deserialize a character value into a numeric data field, the program dumps with a system error. It is not a planned issue, but sometimes the consumed data comes in a bad format.
For example:
try.
data: json type string.
json = '{ "userName": "sap", "password": "123456" }'.
data: begin of user,
username type string,
password type int4,
end of user.
/ui2/cl_json=>deserialize( exporting json = json
changing data = user ).
catch cx_root.
endtry.
CATCH cx_root does not catch it and the program is terminated.
System analysis:
In statement "'REPLACE", only character-type data objects are supported at
argument position "DATA'".
In this case, operand "DATA'" has the non character-type type "I".
Method RESTORE_TYPE:
73 WHEN `"`. " string
74 IF data IS SUPPLIED.
75 eat_string data.
76 " unescape string
>>>> REPLACE ALL OCCURRENCES OF `\"` IN data WITH `"`.
When I consume JSON data, I cannot prevent mistakes in it.
Could you please implement some program exception handling or, as an advanced option, a JSON schema to validate the input?
Thanks.
Alexey Arsenyev
Hi Jan,
accepted. Please check new version.
BR, Alexey.
Jan Valousek
You are so quick. It functions great. Thanks.
Former Member
Agree, I bumped into a similar issue before I saw Alexey's correction. The original in method restore_type() is
WHEN `"`. " string
IF data IS SUPPLIED.
eat_string data.
REPLACE ALL OCCURRENCES OF `\"` IN data WITH `"`. " unescape string
I just changed it to
WHEN `"`. " string
IF data IS SUPPLIED.
eat_string data.
type_descr = cl_abap_typedescr=>describe_by_data( data ).
IF type_descr->type_kind EQ cl_abap_typedescr=>typekind_char.
REPLACE ALL OCCURRENCES OF `\"` IN data WITH `"`. " unescape string
...
I had to copy this class to make changes...
George
Alexey Arsenyev
Hi George,
wrapping the /UI2/CL_JSON class in your own class as a local one is the preferred way if you want to protect your code from changes, which can happen if the standard delivered /UI2/CL_JSON is modified in a way that does not fit your purposes. And you can always copy the actual version of the code from here.
About the change you suggest: please use the actual one from the article. It is more robust.
BR, Alexey.
Alexey Arsenyev
Former Member
Another item: in /UI2/CL_JSON_SERIALIZER, method GET_VALUES(),
ELSE.
* null
IF mv_case_type = /ui2/if_serialize=>c_case_type-camel_case_s.
IF <lv_field>-descr->type_kind = cl_abap_typedescr=>typekind_struct1
OR <lv_field>-descr->type_kind = cl_abap_typedescr=>typekind_struct2
OR <lv_field>-descr->type_kind = cl_abap_typedescr=>typekind_dref.
DATA lv_u TYPE string.
DATA lv_l TYPE string.
FIND REGEX '([a-z])([a-z]*)' IN lv_field_name SUBMATCHES lv_u lv_l.
TRANSLATE lv_l TO LOWER CASE.
TRANSLATE lv_u TO UPPER CASE.
CONCATENATE lv_u lv_l INTO lv_field_name.
endif.
endif.
CONCATENATE '"' lv_field_name '":null' INTO lv_value.
ENDIF.
here, the situation I got is that the interface does not expect "null", so I made a change to
CONCATENATE '"' lv_field_name '":' INTO lv_value.
then there is another issue,
in this line:
ASSIGN COMPONENT sy-tabix OF STRUCTURE <lv_data> TO <lv_value>.
IF <lv_field>-numeric = abap_true.
if this component of structure <lv_data> is a table with no entries, then <lv_value> will be considered initial.
Once this <lv_value> is considered initial, it will come to the following code:
ELSEIF <lv_value> IS INITIAL.
IF <lv_field>-descr->type_kind = cl_abap_typedescr=>typekind_char AND
<lv_field>-descr->length = 2 .
* IF <lv_field>-name <> 'NUMBER_FORMAT'.
CONCATENATE '"' lv_field_name '":" "' INTO lv_value.
ELSE.
* null
IF mv_case_type = /ui2/if_serialize=>c_case_type-camel_case_s.
IF <lv_field>-descr->type_kind = cl_abap_typedescr=>typekind_struct1
OR <lv_field>-descr->type_kind = cl_abap_typedescr=>typekind_struct2
OR <lv_field>-descr->type_kind = cl_abap_typedescr=>typekind_dref.
DATA lv_u TYPE string.
DATA lv_l TYPE string.
FIND REGEX '([a-z])([a-z]*)' IN lv_field_name SUBMATCHES lv_u lv_l.
TRANSLATE lv_l TO LOWER CASE.
TRANSLATE lv_u TO UPPER CASE.
CONCATENATE lv_u lv_l INTO lv_field_name.
endif.
endif.
CONCATENATE '"' lv_field_name '":null' INTO lv_value.
ENDIF.
Then the square brackets [ ] are missing from the output. I did a fix here as well, checking something like
<lv_field>-descr->type_kind = cl_abap_typedescr=>typekind_table.
if it is, then copy your code
GET REFERENCE OF <lv_value> INTO lr_dref.
lo_tabledescr ?= <lv_field>-descr.
lv_value = serialize_table( io_tabledescr = lo_tabledescr ir_data = lr_dref ).
CONCATENATE '"' lv_field_name '":' lv_value INTO lv_value.
So I think when <lv_data> is initial, we need to think further about what changes to make.
Overall, thank you Alexey, this is a very good program: it runs very fast and is easy to debug. Thank you very much. It is preferable to a transformation, as this is dynamic. Going forward, I am wondering if we can determine the exported ABAP structure at runtime.
George
Alexey Arsenyev
Hi George,
do not use class /UI2/CL_JSON_SERIALIZER - it is deprecated and not supported any more. It was just left in for compatibility reasons.
If you need a portable class for JSON serialization/deserialization, /UI2/CL_JSON is the proper one.
About the: "I am wondering if we can determine the export abap structure at runtime".
This feature is not supported, because I do not see a reason for it. If you deserialize something, you still need to be able to read it in a typed way in ABAP. So the idea is that you need to know the target data structure (a partial structure will also work) to deserialize JSON, and then work with your typed target.
But if you for some reason need the ability to deserialize JSON into the source ABAP structure in a generic way, you may extend both the serialize/deserialize methods (you have copied the class anyway), wrap the outputs/inputs of /UI2/CL_JSON data with technical meta-data describing the source ABAP structure, and use this information during deserialization. For example:
Best regards,
Alexey.
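The code sample Alexey refers to was lost in rendering. A minimal sketch of such a meta-data envelope, assuming a globally defined (DDIC) type such as FLIGHTTAB, might look like this:

```abap
" Hypothetical envelope: works only for DDIC/global types, since local
" type names like \TYPE=%_T... cannot be used with CREATE DATA.
DATA lt_flights TYPE flighttab.  " DDIC table type of SFLIGHT
DATA(lo_descr) = cl_abap_typedescr=>describe_by_data( lt_flights ).
DATA(lv_type)  = lo_descr->get_relative_name( ).  " 'FLIGHTTAB'
DATA(lv_json)  = /ui2/cl_json=>serialize( data = lt_flights ).
CONCATENATE `{"type":"` lv_type `","data":` lv_json `}` INTO DATA(lv_envelope).
" Receiver side (sketch): read the "type" member, then
"   CREATE DATA lr_data TYPE (lv_type).
" and deserialize the "data" member into the dereferenced reference.
```

This keeps the payload itself plain JSON; only the envelope carries the ABAP type name needed to reconstruct the target structure.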
Former Member
thank you Alexey! completely agree with you
George
Taryck Bensiali
Hi all,
This serialization does not work with REF TO DATA, which is assumed to be an object ==> CX_SY_MOVE_CAST_ERROR.
Do you have any solution that works with "REF TO DATA"?
Taryck.
Former Member
Do you mean the type mismatch? Can you give an example of what you expect the input parameter, TYPE REF TO DATA, to be?
Alexey Arsenyev
Hi Taryck,
as George mentioned: please provide a test example and the expected result. I will check.
In general, serialization of TYPE REF shall also work (as in the example above for the reference to LT_FLIGHTS), but I have not done extensive tests for such use cases.
Best regards,
Alexey
Taryck Bensiali
OK here it is :
DATA t_param TYPE abap_parmbind_tab.
DATA s_param LIKE LINE OF t_param.
DATA integer TYPE int4.
DATA str TYPE string.
s_param-name = 'INTEGER'.
s_param-kind = 'E'.
integer = 3.
GET REFERENCE OF integer INTO s_param-value.
INSERT s_param INTO TABLE t_param.
str = /ui2/cl_json=>serialize( data = t_param ).
The error is in method DUMP: when type_descr->kind = kind_ref, you assume it is an object. If you looked at type_descr->type_kind, you would see it is of the DREF type kind...
Alexey Arsenyev
Hello Taryck,
thanks for the example. Yes, I see - I will try to extend that soon.
As you know, this is not an official SAP JSON parser, so you can copy and extend it as you like and share your suggestions here. Everyone will appreciate a valuable contribution!
BR, Alexey.
Taryck Bensiali
Hi,
OK, thanks. How can it be unofficial and still be part of SAP's packages?
I've tried to find solutions for serializing and, most of all, deserializing REF TO DATA, but I have not found any.
I've tried CALL TRANSFORMATION ID OPTIONS data_refs = 'embedded' ... which is OK for serialization but fails on deserialization...
Alexey Arsenyev
The /UI2/CL_JSON class was created to solve specific needs of UI2 services and is not intended to be a generic solution for everyone who needs JSON serialize/deserialize abilities (see the introduction part of the article). For a generic solution, use CALL TRANSFORMATION, or request something from the Gateway colleagues the official way, with messages, development requests, etc. You can also try the same with /UI2/CL_JSON requests.
Here I am presenting a local copy anyone can modify, and I try to help people with their questions & requirements, if I have time and see the need.
How do you imagine deserializing a value from typeless JSON into a generic data reference in ABAP (looking back at your example)?
Taryck Bensiali
Well, I can imagine it for CALL TRANSFORMATION, because the data type is present in the XML.
For JSON I'm not an expert, so I expected this could be done. But if only the data are stored, without any type definition, then I understand that it is almost impossible...
Alexey Arsenyev
Hi Taryck,
I have updated the code to support serialization of data references. Deserialization, as I have already tried to explain, is not possible in the generic case, since JSON does not include type information, and deserialization shall support any JSON, not only JSON previously serialized with this class. But at least the provided code will support your example, with deserialization of simple types in a similar way (it can deserialize string, Boolean, and integer values, but cannot, of course, restore the specific type of the original structure).
Please verify.
Best regards,
Alexey
Taryck Bensiali
Hi,
It's OK. Your example :
{"ABSOLUTE_NAME":"\\TYPE=%_T00004S00000000O0000012480","ADMIN_TAB":"\\TYPE=%_T00004S00000000O0000012480", "ADMIN_TAB_LINE":"\\TYPE=%_T00004S00000000O0000012480"
It provides a type definition, so I thought JSON could handle type definitions also.
Thanks.
Alexey Arsenyev
Hello Taryck
probably the mentioned example is a little bit misleading. Its purpose was only to show that you can serialize an ABAP object as well; it does not serialize JSON type information, it is just a dump of the CL_ABAP_TYPEDESCR class, which was used as an easy example.
BR, Alexey
Former Member
Hello Alexey,
I see that my code is living inside SAP standard code. One question: I checked the usage of /UI2/CL_JSON; it is mainly used in NWBC. Is there any other use of this class in the SAP standard? You can write me a direct message. I tried to contact you via SCN but couldn't, since you are not following me.
Kind Regards,
Ümit Coşkun Aydınoğlu
Alexey Arsenyev
Hi Coşkun,
/UI2/CL_JSON is part of SAP coding, but not the standard class for serialization/deserialization of JSON in SAP. We use it in NWBC and in Fiori; it is a public class and part of the UI Add-on, so anyone may use it. I do not know of other usages, but I know they exist. There may also be copies of the class encapsulated as local classes, following the guidelines I gave in the article, so there may be even more usages.
BR, Alexey.
Former Member
Hi Alexey,
is there any way to set pretty printing to UpperCamelCase instead of lowerCamelCase?
Best regards
Diego
Alexey Arsenyev
Hi Diego,
there is no such pretty-printing option, but it is easy to build in if you use a local copy of the class as suggested:
when pretty_mode-ucamel_case.
&3 = pretty_name( &1 ).
TRANSLATE &3(1) TO UPPER CASE.
But to be honest, you can do it even more easily, without any modification. What you need is just to start all your fields with "_" and use camel_case as the formatting option. I assume it will result in the formatting you need.
BR, Alexey.
Former Member
Hi Alexey,
I've tried what you suggested (starting fields with "_"), but it does not work; that is the reason for my question. Thanks anyway, great work!!
Best Regards
Diego
Alexey Arsenyev
I would say it is a bug that it does not work. Actually, there is dedicated code blocking usage of "_" in front of names, which I now treat as unneeded...
The quick correction would be to replace the method PRETTY_NAME with the following code:
Former Member
Hi Again,
I've realized that the class produces invalid JSON when the DATA parameter contains non-printable characters.
You need to add content cleaning, something like:
replace all occurrences of regex '[^[:print:]]+(?!$)' in STRING with ` `.
replace all occurrences of regex '[^[:print:]]+$' in STRING with ''.
P.S.: It's good to know that my code lives in essential parts of SAP like NWBC and Fiori.
Kind Regards,
Coşkun
Alexey Arsenyev
Hello Coskun,
thanks for the feedback.
Can you be more concrete and provide an example to check? Do you mean SERIALIZE or DESERIALIZE?
Why are non-printable characters a problem for JSON? Do you mean they are not allowed inside string literals? For Unicode text processing it is not a problem; only " shall be escaped.
In addition, such a non-transparent replace will lead to data loss, which is not obvious to the consumer of the API. If one knows one needs to transport binary or not-well-formed content, one should expose the data as XSTRING, which will then be Base64 encoded without any data loss, or do the escaping oneself before serialization.
And I have not understood the suggested regular expressions:
What does '[^[:print:]]+(?!$)' mean?
And what about '[^[:print:]]+$'?
I think something like the following would be enough, but I will not build it in, as it leads to data loss:
replace all occurrences of regex '[^[:print:]]' in STRING with ` `.
Best regards,
Alexey.
Bilen Cekic
Hi Alexey,
Thank you again, I have finished several projects with this class. I have a question.
I need to write a data integration layer that will sometimes transfer millions of rows. I tested with 100K records and it is fine, but is it possible to remove column duplication, like json.hpack does?
https://github.com/WebReflection/json.hpack/wiki
I want to send the columns in an array and the remaining values as values only, without column names.
Another question: what is the minimum NW version on which this class can be used (assuming I make a Z-copy)?
Alexey Arsenyev
Hello Bilen,
Sorry for the delay with response.
The code is tested from SAP_BASIS 700, but I think it can be used on even earlier releases. There is nothing special that requires something modern - maybe the SAP_BASIS version that introduced regular expressions.
About your request: I have checked hpack. The idea looks interesting, but I do not know how big the size benefit is once one compares hpack with standard gzip compression of the content; maybe it is not worth it. I have not found benchmarks. The number of stars is OK, but the last update was 3 years ago, which is also not very convincing.
That was the answer regarding whether I would implement it in the standard.
Maybe if I get more requests for it.
Also, I did not get how to distinguish a real JS array from a compressed hpack collection object. It looks like you cannot mix standard content and hpack collections...
But if you would like to implement it yourself, in your Z-copy, it shall be more or less easy, at least for serialization, depending on the compression level you need.
It is method DUMP_INT, lines 149 to 169.
The actual column name is added in line 160 - so just comment that out (this covers your request at compression level 0: simply skipping the column name):
CONCATENATE <symbol>-header lv_itemval INTO lv_itemval.
In addition to that, you would need to add a "header" row as the first line, by looping over the lt_symbols table, collecting the "name" components, and inserting the result into lv_itemval.
BR, Alexey.
Bilen Cekic
Thanks a lot! Yes, I tried it on 7.0.3 and very few fixes are required. The problem is that in 7.0.3 the REST classes are totally missing :/
Bilen Cekic
Hi Alexey,
Recently I had an issue with an internal table created dynamically. Here is the error:
CREATE DATA: The specified type (\TYPE=%_T00011S00000261O0000001280) is not a valid data type.
Do you have any solution for this ?
Bilen Cekic
Oh man, sorry, my bad. I noticed my class version was old; I just updated the DUMP_INT method and it is working fine.
Lucius Allen
Hi Alex.
I have some JSON text that contains a deep structure and table. After executing "lr_data = /ui2/cl_json=>generate( EXPORTING json = lv_json ).", how can I access the values "d":"2018-09-25" and "v":1.2948?
here is the JSON text:
{
"terms": {
"url": "https://www.bankofcanada.ca/terms/"
},
"seriesDetail": {
"FXUSDCAD": {
"label": "USD/CAD",
"description": "US dollar to Canadian dollar daily exchange rate"
}
},
"observations": [
{
"d": "2018-09-25",
"FXUSDCAD": {
"v": 1.2948
}
}
]
}
Alexey Arsenyev
Hi Allen
Sorry, I only just got a notification letter about your comment :/
It is probably already too late, but nevertheless I will answer.
The answer is: you need to use dynamic programming, accessing data components by name and traversing the structures top-down, with a lot of surrounding checking code for component existence, assignment results, etc. See the example in the chapter "Deserialization of untyped (unknown) JSON object".
That is why it is always better to prefer the DESERIALIZE method with a defined target data structure instead of GENERATE.
But as I also mentioned, I have created and published (in the same note as /UI2/CL_JSON) the class /ui2/cl_data_access, which can simplify this in some way. See the same chapter for an example, or a more detailed article here: Dynamic Data Accessor Helper Class for ABAP.
Particularly for your example, it would be like this:
DATA: lr_data TYPE REF TO data,
lv_val TYPE f.
lr_data = /ui2/cl_json=>generate( json = lv_json ).
/ui2/cl_data_access=>create( ir_data = lr_data iv_component = `observations[1]-FXUSDCAD-V`)->value( IMPORTING ev_data = lv_val ).
Best regards,
Alexey.
Lucius Allen
Thanks, Alex, for the response. That is exactly the way I would anticipate referencing the data. However, the complex structure was not deconstructed in the expected way. See below for how the routine returned the data. Try it for yourself with the JSON text. I want to use this in my production system, so I wanted to run it by you in case you have a fix for it.
here is the JSON text:
{
"terms": {
"url": "https://www.bankofcanada.ca/terms/"
},
"seriesDetail": {
"FXUSDCAD": {
"label": "USD/CAD",
"description": "US dollar to Canadian dollar daily exchange rate"
}
},
"observations": [
{
"d": "2018-09-25",
"FXUSDCAD": {
"v": 1.2948
}
}
]
}
Alexey Arsenyev
Hi Allen,
For me it works. I think it is the old issue I have already fixed for the GENERATE method, when components were scrambled...
Please apply all related notes (listed in the Version History section of the blog above) and retest.
My test code:
Best regards,
Alexey.
Lucius Allen
I applied the notes. All is well now. Thanks Alexey!!
Dmitrii Sharshatkin
Hi All,
I was looking into a similar topic: I needed to convert unknown JSON into a string table.
I found a way to do it with just two lines of code (excluding the data declarations):
data: lv_json type string.
data: lr_node type ref to if_ixml_node.
data: lt_xml_table type table of smum_xmltb.
lr_node ?= cl_fpm_chart_json_generator=>json_to_xml( lv_json ).
perform get_element(lsmumf01) tables lt_xml_table using lr_node.
BR, Dima
Alexey Arsenyev
Hello All,
can someone post a comment here? I got a message that commenting is not working anymore.
If it is not possible, write me a personal message on SDN at @alexey.arseniev.
BR, Alexey.
Wolfgang Röckelein
Hi All,
it seems it is possible to comment here again!
For all of you who want to have /UI2/CL_JSON accessible in ABAP Cloud, please vote for it at https://influence.sap.com/sap/ino/#/idea/234724!
Many Thanks!
Alexey Arsenyev
Hi Wolfgang,
thanks for confirming that comments are again working
I have talked with a colleague from SDN regarding commenting, and he said that it is probably because the page is too long and I need to split or migrate it. But it seems that the conversation still helped, indirectly.
Also, I appreciate your efforts in collecting votes for whitelisting the class for ABAP Cloud. I have already checked with the Steampunk colleagues about the possibility of releasing the class, and in general they are OK with it. But there are currently some other formal reasons why the class cannot be released or even further developed :/ I hope the votes you have collected may help to prove the need for further development of the class.
It would also be nice to know the actual list of the class's consumers by customers on-premise...
BR, Alexey.
Andre Fischer
It has been released with the latest Hotfix.
Mike Wagner
Hi All,
I am trying to create an optional array on my Kafka topic. My generated JSON looks like this:
},
"headerTexts": [
{
"headerTextId": {
"string": "0001"
},
"headerText": {
"string": "Mike | Mike two | "
}
}
]
Our Kafka team is telling me my JSON needs to look like the example below, with a "{" at the beginning, the word "array":, and a "}" at the end of my array.
I have tried different options of /ui2/cl_json=>serialize, but nothing seems to generate JSON with the word "array".
Any help would be appreciated
Regards Mike
This is my ABAP code
lv_body = /ui2/cl_json=>serialize( data = lw_output pretty_name = /ui2/cl_json=>pretty_mode-camel_case assoc_arrays = abap_true assoc_arrays_opt = abap_true ).
This is my table definitions:
BEGIN OF ty_header_text,
header_text_id TYPE ty_string,
header_text TYPE ty_string,
END OF ty_header_text,
ty_header_text_tab TYPE STANDARD TABLE OF ty_header_text WITH NON-UNIQUE DEFAULT KEY,
header_texts TYPE ty_header_text_tab,
The Kafka team says I need to generate this:
},
"headerTexts": {
"array":[
{
"headerTextId": {
"string": "0001"
},
"headerText": {
"string": "Mike | Mike two | "
}
}
]
}
Wolfgang Röckelein
Hi Mike,
well, then header_texts needs to be a structure with "array TYPE ty_header_text_tab" as its single property. header_text_id and header_text also need to be structures with only "string TYPE string" as their single property.
Regards,
Wolfgang
Mike Wagner
Wolfgang,
Thanks for the quick response.
I'm not familiar with how to define TYPE ty_header_text_tab as a single property. Can you provide a quick example?
Regards,
Mike...
Mike Wagner
All,
I have changed header_text_id and header_text to type string, but I'm unsure how to define ty_header_text_tab to make it a single property as Wolfgang suggested.
Any help would be appreciated.
Regards,
Mike Wagner
BEGIN OF ty_header_text,
header_text_id TYPE string,
header_text TYPE string,
END OF ty_header_text,
ty_header_text_tab TYPE STANDARD TABLE OF ty_header_text WITH NON-UNIQUE DEFAULT KEY,
Wolfgang Röckelein
BEGIN OF ty_string_s,
string TYPE string,
END OF ty_string_s,
BEGIN OF ty_header_text,
header_text_id TYPE ty_string_s,
header_text TYPE ty_string_s,
END OF ty_header_text,
ty_header_text_tab TYPE STANDARD TABLE OF ty_header_text WITH NON-UNIQUE DEFAULT KEY,
BEGIN OF ty_header_texts,
array TYPE ty_header_text_tab,
END OF ty_header_texts,
BEGIN OF ty_around,
header_texts TYPE ty_header_texts,
END OF ty_around,
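Putting Wolfgang's types to use might look like this (a sketch; the variable names are assumed, and the output shape is the expected one, not a verified dump):

```abap
" Fill the wrapper structure and serialize it; the "array" component name
" becomes the JSON property the Kafka team asked for.
DATA ls_around TYPE ty_around.
APPEND VALUE #( header_text_id = VALUE #( string = '0001' )
                header_text    = VALUE #( string = 'Mike | Mike two | ' ) )
       TO ls_around-header_texts-array.
DATA(lv_body) = /ui2/cl_json=>serialize(
  data        = ls_around
  pretty_name = /ui2/cl_json=>pretty_mode-camel_case ).
" expected shape:
" {"headerTexts":{"array":[{"headerTextId":{"string":"0001"},
"                           "headerText":{"string":"Mike | Mike two | "}}]}}
```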
Alexey Arsenyev
Thanks for the support, Wolfgang!
Sorry I could not respond quickly, as I was on vacation.
Mike Wagner
Wolfgang,
Thanks for the help. We made the modifications to the structures and everything is working correctly on our send to Kafka.
Mike Wagner
Michael Hoppe
Hi Mike Wagner
how do you send your JSON from ABAP to Kafka? What do you have between ABAP and Kafka?
We have a similar topic in a customer project, but I don't want to use the 3rd-party adapter on Cloud Platform Integration.
Best regards
Michael
Michael Hoppe
Hi Alexey,
excellent blog and excellent job that you are doing here!!
Regarding DESERIALIZE of a JSON timestamp field (Note 2629179): I saw that for ABAP fields with a long timestamp of type TIMESTAMPL, the decimals are not deserialized into the ABAP variable.
Example:
JSON value: "\/Date(1577462450454)\/"
ABAP Value: 20191227160050.0000000 with /UI2/CL_JSON=>DESERIALIZE
Real value: 20191227160050.4540000 according to https://www.epochconverter.com/
Do you think you will implement this in /UI2/CL_JSON in the future?
Best regards
Michael
Alexey Arsenyev
Hello Michael,
Thanks for the feedback!
The issue you mentioned with deserializing of Edm.DateTime is a bug. It will be fixed in the new patch note PL13 (2870163).
Happy New Year and BR,
Alexey.
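For reference, a minimal re-test sketch after applying the patch (assuming the standard DESERIALIZE signature; the expected value is taken from Michael's example above, not re-verified):

```abap
" Deserialize the Edm.DateTime epoch-milliseconds value into a long timestamp.
DATA lv_ts TYPE timestampl.
/ui2/cl_json=>deserialize(
  EXPORTING json = '"\/Date(1577462450454)\/"'
  CHANGING  data = lv_ts ).
" before the fix: lv_ts = 20191227160050.0000000
" after the fix, the milliseconds should be kept: 20191227160050.4540000
```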
Wolfgang Röckelein
Hi Alexey Arsenyev,
thanks for the bugfix note!
Could you mention which PL level is available on Steampunk?
Thanks,
Wolfgang
Alexey Arsenyev
Hi Wolfgang,
It is a good question
I assume that ABAP Cloud is based on the cloud SAP_BASIS stack, the same as for the S/4 Cloud model. S/4 Cloud normally does not allow any note to be applied, only as part of a hotfix if the change is critical and security-relevant. All other corrections are delivered with a new release (quarterly delivery).
It seems that the correction will be included in SAP_BASIS 7.80, which corresponds to the S/4 2005 release.
I will put this in the note text.
BR, Alexey.
Aleh Altynau
Hi Alexey Arsenyev,
I have found one bug (I think) in Edm.Guid OData deserialization (JSON→ABAP)... we are not doing any serialization.
So, we have a response with an Edm.Guid value "6a297074-45d3-4ef9-bc83-022d0efa8bf9" ... as you can see, it contains digits and lower-case letters. During the conversion of the JSON response value string into a hex value, your REPLACE statement is case-sensitive by default, so the Edm.Guid cannot be converted to RAW16.
Here's the code:
... as far as I know, an OData V4 / V2 Edm.Guid always has a lower-case value... so potentially this statement always gets sy-subrc = 4 and the RAW16 cannot be converted correctly.
I'm not sure if there is an SAP note for this, but this is the system we are using: S/4HANA 1909 on-premise, SAP_UI 754 SP02.
Could you please have a look?
Thanks.
Best regards,
Aleh
Alexey Arsenyev
Hi Aleh,
thanks for reporting this. I would not call it a "bug"; I just was not thinking about this case when developing the code, but it can definitely be corrected.
The issue was already reported by other consumers, and I have already made the correction you mentioned, but have not yet released it, as it was not that urgent. But since more people are reporting it now, I will try to release the correction in the next weeks.
BR, Alexey.
Aleh Altynau
Thanks Alexey.
One more hint: hex OSF-standard values in AS ABAP expect upper-case alphanumeric characters, so a simple assignment to RAW16 does not work... so after the replacement in the Edm.Guid, you need to translate it to upper case, and only then assign it to the RAW16.
In general, you can use the standard system class method CL_SYSTEM_UUID=>CONVERT_UUID_C36_STATIC to do the conversion into RAW16, but again, the input must be converted to upper case first. If your method is designed such that an Edm.Guid could be any binary string (I assume it is), then you could check whether the target RTTI type is RAW16 and do the system conversion according to the OSF standard.
Thank you anyway!
Best regards,
Aleh
Alexey Arsenyev
Hi Aleh,
Thanks, it was actually a good tip. I had forgotten the conversion to upper case, and my unit test was still working because its input was already upper case.
I did not implement many checks for the receiving data structure, since it needs upper case in any case, so I now always do a conversion to upper case here.
Best regards,
Alexey.
P.S.: If you have further suggestions, you are welcome
Lukas Klausner
Hi @Alexey Arsenyev,
Serializing hash maps with empty values and the parameters assoc_arrays_opt and compress produces invalid JSON:
there is no placeholder for the empty value in the result string, for example '{"key1":}'.
The following unit test shows the problem:
A possible bug fix would be to add, in method DUMP_INT, something like
before
Best Regards
Lukas
Alexey Arsenyev
Hi Lukas,
Thanks for the bug report! Yes, I have recreated it (thanks for the example) and it is definitely a bug. It will be fixed in the next patch note (I will update the wiki with the new source as well). I think the bug relates only to the assoc_arrays_opt switch.
But I am not sure about the suggested fix... I am not convinced what would in general be expected for this combination. There are several possible alternatives for how to handle it.
We also need to take care of deserialization of the same example.
What do you think?
And what does everyone else think?
BR, Alexey.
Lukas Klausner
Hi Alexey,
thanks for your fast response and for accepting that it is a bug.
After another look, I would prefer your second option:
in the case of assoc_arrays_opt, compress should be ignored for hash tables.
Then deserialization would also work without any change.
That could be done by extending the condition:
IF mv_compress IS INITIAL OR <value> IS NOT INITIAL OR <symbol>-compressable EQ abap_false.
with "OR lv_array_opt EQ abap_true."
BR
Lukas
Alexey Arsenyev
Hi Lukas,
OK, agreed. I have included your test example in my unit tests and have done a correction. It will be available in the next patch note.
BR, Alexey.
Alexandre Ourth
Hello,
First, thanks a lot for your work. It has helped me many times over my years of coding in ABAP.
I am now facing a big issue, and I don't know if it is possible to solve it with ZCL_JSON.
I have to call a REST web service. This service is built with a deep data structure like this:
"data": [
{
"companyCode": "",
"establishmentCode": "",
"code": "",
"areaCode": "",
"label": "",
"operations": {
"2015-11-20": [
{
"code": "",
"label": ""
}
],
"2020-01-01": [
{
"code": "",
"label": ""
}
]
}
}
]
}
The problem here is that the service is using a "map" object (a string associated with a table) under the "operations" object.
There are a number of date keys (2015-11-20, 2020-01-01, and there could be more), which are calculated only at runtime, so it is not possible to define a final structure, as I don't know how many dates I will have to handle. And this number can of course differ for each "header" appended to the top list ("data").
Is the class ZCL_JSON able to handle such a data definition to serialize ABAP into JSON? I am stuck, as I don't know what to do if I can't create a structure initially.
Thanks a lot for your time, and your help.
Regards,
Alexandre
Wolfgang Röckelein
assoc_arrays = abap_true and careful construction of the ABAP types should do the trick; cf. the example above for assoc_arrays.
Alexandre Ourth
Sorry for missing that... I was not at ease with this concept, so I didn't know where to look, I admit. I will look into it, thanks a lot Wolfgang Röckelein
Alexandre Ourth
Hi Wolfgang Röckelein,
Sorry to bother you again, but I can't find a proper way to do that. My "key", which is a date here, must map to a list, because for one date I can have multiple items. But as it must be a list, I can't use SORTED TABLE OF on it, because of course we can't set a table as a KEY.
What i need to do
{
"data": [
{
"companyCode": "",
"establishmentCode": "",
"code": "",
"areaCode": "",
"label": "",
"operations": {
"2015-11-20": [
{
"code": "",
"label": ""
},
{
"code": "",
"label": ""
}
],
"2020-01-01": [
{
"code": "",
"label": ""
}
]
}
}
]
}
The code i am trying with several combinations (sorry for no color)
TYPES :
BEGIN OF ty_ope,
code TYPE string,
label TYPE string,
END OF ty_ope,
BEGIN OF ty_sorted,
"'date' must be handled in JSON like a table of TY_OPE like in my example.
"But then impossible to use SORTED TABLE OF in TY_ROUTING...
date TYPE string,
code TYPE string,
label TYPE string,
END OF ty_sorted,
BEGIN OF ty_routing,
company_code TYPE string,
establishment_code TYPE string,
code TYPE string,
area_code TYPE string,
label TYPE string,
operation TYPE SORTED TABLE OF ty_sorted WITH UNIQUE KEY date,"Test 1: 'operation' object but 'date' not a list. Dump is twice same date
* operation TYPE SORTED TABLE OF ty_sorted WITH NON-UNIQUE KEY primary_key COMPONENTS date,"Test 2: 'operation' list, 'date' format is wrong
* operation TYPE ty_sorted,
END OF ty_routing,
BEGIN OF ty_data,
data TYPE STANDARD TABLE OF ty_routing WITH DEFAULT KEY,
END OF ty_data.
DATA :
ls_data TYPE ty_data,
ls_routing TYPE ty_routing.
CLEAR ls_routing.
ls_routing-company_code = 'CC1'.
ls_routing-area_code = 'A1'.
ls_routing-establishment_code = 'E1'.
ls_routing-code = 'C1'.
ls_routing-label = 'L1'.
ls_routing-operation = VALUE #( ( date = '01-01-2021' code = 'C11' label = 'LL1' )
* ( date = '01-01-2021' code = 'C12' label = 'LL11' ) "I need a 2nd record with same date but KO with Test 1
).
APPEND ls_routing TO ls_data-data.
CLEAR ls_routing.
ls_routing-company_code = 'CC2'.
ls_routing-area_code = 'A2'.
ls_routing-establishment_code = 'E2'.
ls_routing-code = 'C2'.
ls_routing-label = 'L2'.
ls_routing-operation = VALUE #( ( date = '02-02-2022' code = 'C2' label = 'LL2' ) ).
APPEND ls_routing TO ls_data-data.
DATA(lo_tool) = NEW zcl_json( ).
lv_json = lo_tool->serialize( EXPORTING data = ls_data
pretty_name = zcl_json=>pretty_mode-camel_case
assoc_arrays = abap_true ).
cl_demo_output=>display_json( lv_json ).
Result with Test 1
Not good because 01-01-2021 is not a list '['
{
"data":
[
{
"companyCode":"CC1",
"establishmentCode":"E1",
"code":"C1",
"areaCode":"A1",
"label":"L1",
"operation":
{
"01-01-2021":
{
"code":"C11",
"label":"LL1"
}
}
},
{
"companyCode":"CC2",
"establishmentCode":"E2",
"code":"C2",
"areaCode":"A2",
"label":"L2",
"operation":
{
"02-02-2022":
{
"code":"C2",
"label":"LL2"
}
}
}
]
}
Result with Test 2
Not good, because 'operation' has become a list '[' instead of an object '{'
{
"data":
[
{
"companyCode":"CC1",
"establishmentCode":"E1",
"code":"C1",
"areaCode":"A1",
"label":"L1",
"operation":
[
{
"date":"01-01-2021",
"code":"C12",
"label":"LL11"
},
{
"date":"01-01-2021",
"code":"C11",
"label":"LL1"
}
]
},
{
"companyCode":"CC2",
"establishmentCode":"E2",
"code":"C2",
"areaCode":"A2",
"label":"L2",
"operation":
[
{
"date":"02-02-2022",
"code":"C2",
"label":"LL2"
}
]
}
]
}
I don't know what I am missing here.
Any idea please?
Best regards,
Alexandre
Alexey Arsenyev
Hello Alexandre,
see the code below; that one is working. You need one more flag, assoc_arrays_opt = abap_true, and a proper data structure.
BR, Alexey.
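Alexey's working code did not survive the page rendering. A hedged reconstruction, assuming that assoc_arrays_opt collapses a map row consisting of the key field plus exactly one value field into "key": value, might look like this:

```abap
TYPES:
  BEGIN OF ty_ope,
    code  TYPE string,
    label TYPE string,
  END OF ty_ope,
  ty_ope_tab TYPE STANDARD TABLE OF ty_ope WITH DEFAULT KEY,
  " one row per date: the key field plus exactly one value field, so that
  " assoc_arrays_opt renders the row as "<date>": [ ...items... ]
  BEGIN OF ty_date_ops,
    date  TYPE string,
    items TYPE ty_ope_tab,
  END OF ty_date_ops,
  ty_operations TYPE SORTED TABLE OF ty_date_ops WITH UNIQUE KEY date.

" serialize with both map-related flags set (ls_data as in Alexandre's example,
" with the operation component retyped as ty_operations):
" DATA(lv_json) = /ui2/cl_json=>serialize(
"   data             = ls_data
"   pretty_name      = /ui2/cl_json=>pretty_mode-camel_case
"   assoc_arrays     = abap_true
"   assoc_arrays_opt = abap_true ).
```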
Alexandre Ourth
Hello Alexey Arsenyev
Thank you so much for that - for taking the time to check this problem and, even more, for finding a solution. I think I have to read again how these "ASSOC_ARRAYS..." parameters work.
Well, with your code I now understand a little bit more.
Thanks again.
Have a great day. Mine will be better with this
Regards,
Alexandre
Alexandre Ourth
Hello again,
I hope it's the last question:
I am using ZCL_JSON (PL14). The customer doesn't have the add-on, so I've downloaded it manually.
Thanks a lot.
Good weekend to all.
Regards,
Alexandre
Alexey Arsenyev
Hi Alexandre,
there is no automatic way. You may apply the note to a system having the UI Add-on installed, then go to source-code-based editing mode (or open the class in ADT) and copy /ui2/cl_json into a new report. Then rename /ui2/cl_json to zcl_json. This is how I prepare the source to publish it here. BTW, the version currently available here is PL15; the newest is PL16. I will try to update the code of ZCL_JSON in the article to the newest state soon.
BR, Alexey.
Lukas Klausner
Hi Alexey,
I would like to ask you to update the ZCL_JSON code on this wiki page to PL16.
Thanks, BR
Lukas
Alexey Arsenyev
Done.
Alexandre Ourth
Hello Alexey Arsenyev
You told Lukas you had updated the code of ZCL_JSON on this page to PL16, but it still indicates the state PL14. Is this really the new version, with just an outdated title?
Thanks a lot.
Regards,
Alexandre
Alexey Arsenyev
Yes, it is really the PL16 code. I had just forgotten to update the description above. Now corrected.
Alexandre Ourth
Hello everyone,
The documentation says :
But when creating JSON from an internal table, I get this:
{ "data": [ { "companyCode": "blabla", "establishmentCode": "blabla", "itemCode": "blabla", "batch": "blabla", "recordType": "CRITERION", "generationDate": "2019-05-24T00:00:00Z", "effectiveDate": "", "statusCode": "", "criterionCode": "PEREMPDT", "measureDate": "2019-05-24T00:00:00Z", "criterionValue": "22/05/2022" }, { "companyCode": "blabla", "establishmentCode": "blabla", "itemCode": "blabla", "batch": "blabla", "recordType": "STATUS", "generationDate": "2019-05-24T00:00:00Z", "effectiveDate": "2021-12-02T00:00:00Z", "statusCode": "B", "criterionCode": "", "measureDate": "", "criterionValue": "" } ] }
Some fields are filled in one record but not in the other. Is this why they are still included in the JSON even though they have their default value in my internal table? Is there a way to not output these fields when they are not filled, please? The associated type is STRING for these fields.
Thanks a lot for your time.
Regards,
Alexandre
Alexey Arsenyev
Hello Alexandre,
have you added the compress = abap_true flag to the serialize method call?
BR, Alexey.
Alexandre Ourth
Hello Alexey Arsenyev ,
So sorry... I thought it was native behaviour of the framework, so I assumed it was due to something wrong I did. I should have read the documentation a little further...
Thanks again.
Regards,
Alexandre
Shai Sinai
Hi,
I couldn't find one, but I'm still asking:
is there any implicit option for special handling of currencies, or is a manual conversion required?
(For OData, SAP Gateway Foundation internally calls CURRENCY_AMOUNT_SAP_TO_IDOC, for example.)
Alexey Arsenyev
Hi Shai,
no, there is no implicit option for currency handling. /ui2/cl_json was not designed as a replacement for SAP Gateway and OData. Yes, there are some OData-related type conversions for reading data (deserialize), which were added as a convenience feature for writing unit tests for OData services, but there is no such support for serialization.
There was already a similar request for OData support, but currently I do not consider adding this as a feature. Honestly, the best outcome would be if the GW colleagues opened their JSON conversion infrastructure for public consumption.
If you are still interested in using /ui2/cl_json for that, there are 2 options:
What is your use case: reading or writing OData JSON?
BR, Alexey.
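For the serialization case, a hedged sketch of such a manual conversion before calling /ui2/cl_json (using the function module Shai mentions; the receiving data type is an assumption, not taken from the FM's interface):

```abap
" Convert an internally shifted currency amount to its external scale
" before serializing, so the JSON carries the humanly correct value.
DATA lv_amount_int TYPE wrbtr VALUE '100'.        " internal representation
DATA lv_amount_ext TYPE p LENGTH 12 DECIMALS 3.   " assumed receiving type

CALL FUNCTION 'CURRENCY_AMOUNT_SAP_TO_IDOC'
  EXPORTING
    currency    = 'JPY'          " zero-decimal currency as an example
    sap_amount  = lv_amount_int
  IMPORTING
    idoc_amount = lv_amount_ext.

" serialize the correctly scaled value instead of the internal one
DATA(lv_json) = /ui2/cl_json=>serialize( data = lv_amount_ext ).
```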
Shai Sinai
Hi Alexey,
Thanks for the detailed response.
My current use case requires writing JSON (serialization), but I guess deserialization might also become relevant in the future.
Best regards.