
szew.essbase.aso

Essbase ASO export.

What to expect from input:

  • Space separated.
  • Quoted member names, non-quoted values.
  • Variable column count, last column is always a value, every line is a cell.
  • First line is a complete POV; subsequent lines only update the POV minimally.
  • So this whole file must be parsed in order...
  • ... and all members must be mapped to dimensions properly.

To parse the export file you need to know one thing: the complete member-to-dimension mapping of the storage dimensions.

It's easy to get member names from a data file: just drop the last column in every row.
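
For illustration, here is a minimal Clojure sketch of the per-line logic described above. The helper names and the member->dim lookup (member name to dimension) are assumptions for the example, not this namespace's API:

(require '[clojure.string :as str])

(defn parse-line
  "Split one export line into its quoted member names and the trailing value."
  [line]
  {:members (mapv second (re-seq #"\"([^\"]+)\"" line))
   :value   (last (str/split (str/trim line) #"\s+"))})

(defn cells
  "Walk the lines in order, carrying the POV map {dimension member} forward."
  [member->dim lines]
  (->> lines
       (reductions (fn [{:keys [pov]} line]
                     (let [{:keys [members value]} (parse-line line)
                           pov (reduce (fn [m member]
                                         (assoc m (member->dim member) member))
                                       pov members)]
                       {:pov pov :value value}))
                   {:pov {}})
       rest))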


szew.essbase.bso

Essbase BSO export.

What to expect from input:

  • Space separated.
  • Quoted member names, non-quoted values.
  • Max file size is 2GB.
  • COLUMNS are specified in first line of the file:
    • List of quoted member names of single dense dimension, last field is empty.
    • No members from this dimension ever appear in the file again.
    • N members are specified; up to that many figures can appear in data lines.
  • POV lines appear periodically, those signal complete POV update:
    • List of quoted member names from distinct sparse dimensions.
  • DATA line consists of both partial POV updates and figures:
    • Quoted members of remaining dense dimensions (not present in POV lines).
    • Figures are non-quoted #Mi and numeric values, up to N occurrences per line.
    • Last field is always an empty string, so the last figure is followed by a space.
    • Missing values on the left are marked as #Mi; missing values on the right are skipped.

To parse the export file you need to know two things:

  • Number of data storing dimensions in the cube.
  • Complete mapping of members to dimensions for storage dimensions.
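
As a rough illustration of the line shapes above, the sketch below classifies a line (after the COLUMNS header has been taken off) as a POV line or a DATA line. The helper names are made up for the example and are not this namespace's API:

(defn fields
  "Tokenize a line into quoted member names and bare (figure) tokens,
   tagging each as [:member s] or [:figure s]."
  [line]
  (map (fn [[_ quoted bare]]
         (if quoted [:member quoted] [:figure bare]))
       (re-seq #"\"([^\"]*)\"|(\S+)" line)))

(defn figure?
  "#Mi or a numeric value, appearing unquoted."
  [[kind token]]
  (and (= :figure kind)
       (or (= "#Mi" token)
           (some? (re-matches #"[-+]?\d+(\.\d+)?([eE][-+]?\d+)?" token)))))

(defn classify
  "POV lines carry only quoted members; data lines also carry figures."
  [line]
  (if (some figure? (fields line)) :data :pov))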

szew.essbase.cols

Essbase columns export.

What to expect from input:

  • Space separated.
  • Quoted member names, non-quoted values.
  • Max file size 2GB.
  • COLUMNS are specified in first line of the file:
    • List of quoted member names of single dense dimension, last field is empty.
    • No members from this dimension ever appear in the file again.
    • N members are specified; up to that many figures can appear in data lines.
  • DATA line consists of both full POV updates and figures:
    • Quoted members of non-column dimensions followed by figures.
    • Dimension order in the data lines should always be the same.
    • Figures are non-quoted #Mi and numeric values, up to N occurrences per line.
    • Last field is always an empty string, so the last figure is followed by a space.
    • Missing values on the left are marked as #Mi; missing values on the right are skipped.

To parse the export file you need to know two things:

  • Number of data storing dimensions in the cube.
  • Complete mapping of member name to dimension name in those dimensions.
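
A small sketch of the figure-to-column pairing described above, assuming the line's quoted members and trailing figures have already been split apart (pov, column-members and figures are hypothetical inputs, not this namespace's API):

(defn figures->cells
  "Pair a data line's figures with the column members, positionally.
   #Mi stays missing; figures absent on the right simply produce no cell."
  [pov column-members figures]
  (->> (map (fn [column figure]
              (when (not= "#Mi" figure)
                (assoc pov :column column :value figure)))
            column-members figures)
       (remove nil?)))

;; e.g. with columns ["Jan" "Feb" "Mar" "Apr"] and figures ["#Mi" "10" "12.5"]
;; only Feb and Mar yield cells; Apr was skipped on the right.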

szew.essbase.logs

Essbase application log and MaxL spool.

Application logs

Log timestamp line looks like:

[Tue Nov 06 08:50:26 2001]Local/Sample///Info(1013214)

So it's [timestamp]Local/application/database/issuer/type(code) more or less.

And then the data follows, which looks like this:

Clear Active on User [admin] Instance [1];

So this one contains user info, but it could be Command [.+] or Database [.+] etc.

TODO: break entries into fields, not only headers?

Fields you'll get using AppLog:

  • full timestamp, decoded from the date in the entry header
  • date: yyyy-mm-dd, decoded from the timestamp
  • application: String
  • database: String
  • user: String
  • level: Info | Warning | Error | ???
  • code: int
  • raw: String, full payload of the entry (head + message)

Additional tables of use: Code categories [1].
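
For illustration, here is a rough sketch of pulling the header fields apart with a regex built from the shape shown above. It only covers the header line (not message fields such as user) and is not the actual AppLog implementation:

(def header-re
  #"^\[([^\]]+)\]Local/([^/]*)/([^/]*)/([^/]*)/(\w+)\((\d+)\)")

(defn parse-header
  "Pull the fields out of a '[timestamp]Local/app/db/issuer/type(code)' line."
  [line]
  (when-let [[_ timestamp application database issuer level code]
             (re-find header-re line)]
    {:timestamp   timestamp
     :application application
     :database    database
     :issuer      issuer
     :level       level
     :code        (Long/parseLong code)}))

;; (parse-header "[Tue Nov 06 08:50:26 2001]Local/Sample///Info(1013214)")
;; => {:timestamp "Tue Nov 06 08:50:26 2001", :application "Sample",
;;     :database "", :issuer "", :level "Info", :code 1013214}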

MaxL shell spool

And then there's MaxL, which lets you see all the useful system properties and even run MDX queries and get results back, almost like in a real database. But only if you remember to set column_width just right, or it will truncate those space-padded, fixed-width table outputs.

FIXME: can headers in MaxL output be multiline?

I just set it to 256 and that's the default value here. YMMV.

MaxLSpool will help you extract those tables, remove the padding, and pack the columns into hash maps. It will also keep any MaxL output preceding and following the tabular output.
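
As a simplified illustration of the unpadding idea (not what MaxLSpool does internally), the sketch below splits on runs of two or more spaces instead of cutting at fixed column offsets, then zips each row against the header:

(require '[clojure.string :as str])

(defn split-cells
  "Naively cut one spool line on runs of two or more spaces."
  [line]
  (map str/trim (str/split (str/trim line) #"\s{2,}")))

(defn table->maps
  "Zip each row's cells against the header cells, one map per row."
  [header-line row-lines]
  (let [headers (split-cells header-line)]
    (map #(zipmap headers (split-cells %)) row-lines)))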

Some 'special' values are resolvable via maxl-constants map. It's WIP with little P.

[1] https://docs.oracle.com/cd/E12825_01/epm.111/esb_dbag/dlogs.htm


szew.essbase.otl

Essbase XML Outline export.

Provides szew.io/XML processors for dimension extraction and convenience functions consuming the outline export file directly.

Allows both sequencing and zipping over dimensions and members.

Just keep in mind that parsing big, deeply nested XMLs is a memory hog.
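
For a rough idea of sequencing and zipping with plain clojure.xml and clojure.zip, see the sketch below. The element and attribute names (:Dimension, :Member, :name) are assumptions about the export's shape, and this is not the szew.io/XML-based API of this namespace:

(require '[clojure.xml :as xml]
         '[clojure.zip :as zip])

(defn named-elements
  "Depth-first walk of the parsed tree, keeping the :name attribute of
   every element with the given tag."
  [tag root]
  (->> (xml-seq root)
       (filter #(= tag (:tag %)))
       (map #(get-in % [:attrs :name]))))

(comment
  ;; clojure.xml/parse builds the whole tree in memory, hence the
  ;; memory-hog caveat for big, deeply nested outlines.
  (def root (xml/parse (java.io.File. "outline.xml")))

  (named-elements :Dimension root)   ; sequencing over dimensions
  (named-elements :Member root)      ; sequencing over members
  (zip/xml-zip root))                ; or navigate structurally with a zipper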


szew.essbase.otl.dimension

Dimension specs for otl.


No vars found in this namespace.

szew.essbase.otl.member

Member specs for otl.


No vars found in this namespace.

szew.essbase.txl

Essbase transaction logs.

ALG file is just pairs of timestamps and transaction descriptions:

  • First two lines are the timestamp of when the audit log was enabled.
  • Remaining pairs describe the user and the location+length in the ATX file, line-wise.

ATX file holds the data as it was locked and sent:

  • Quoted member names, non-quoted values.
  • Data chunks are separated by empty lines.

This namespace lets you process transaction logs, then filter and pack the results in a presentable way. It contains some basic predicates to aid that.
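
A minimal sketch of the ALG pairing described above (hypothetical helper names, not this namespace's API):

(require '[clojure.string :as str])

(defn alg-entries
  "Read ALG lines as [timestamp description] pairs, after the leading
   audit-log-enabled stamp."
  [lines]
  (->> lines
       (drop 2)          ; first two lines: when the audit log was enabled
       (partition 2)
       (map (fn [[stamp description]]
              {:timestamp   (str/trim stamp)
               :description (str/trim description)}))))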

