Since ADTF version 3.3.0 there is an extendable command line tool, based on the ADTF File Library, to read, write and process the streaming data contained in ADTFDAT files using processors and readers, which can be implemented with the adtfdat_processing library. The ADTF File Library is redeployed within ADTF: the SDK, readers/processors and examples can be found under $(ADTF_DIR)/pkg/adtf_file, and the ADTF DAT Tool resides next to the other applications under $(ADTF_DIR)/bin.

ADTFDAT files contain one or more streams representing recorded data. The ADTF DAT Tool operates on these streams in one of two ways:

Export Mode

In this mode the ADTF DAT Tool makes use of processors to operate on the streams inside an ADTFDAT file. Processors are written by software developers to perform versatile data manipulation and extraction. The file extension of a processor is *.adtffileplugin.

Currently the ADTF delivery provides one such processor, which exports the samples of a stream into a CSV file (see ADTF File Library). Another processor, capable of extracting images from a video stream, can be found in the programming examples of this guide. Additional content can be found within the toolboxes (ASC/MDF within the Device and Calibration Toolbox) and in standalone solutions from digitalwerk (see artifactory and store) and others.

To extract data from the streams of an ADTF DAT file, use the --export argument. Select the streams you want to extract with the --stream argument. You can specify a processor for each stream with the --processorid argument; if you do not specify one explicitly, the first one that supports the stream is used. Properties of processors can be set with the --property argument. The destination filename is specified with the --output argument. Here is an example that exports two streams:
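A call along these lines exports two streams. The file names, stream names and the processor ID are placeholders for illustration, and the sketch assumes that the per-stream arguments (--processorid, --property, --output) follow the --stream argument they belong to:

```
adtf_dattool --plugin csv_processor.adtffileplugin --export recording.adtfdat \
    --stream NESTED_STRUCT --processorid csv --output nested_struct.csv \
    --stream VIDEO --output video_stream.dat
```

Note that the second stream does not specify a --processorid, so the first processor that supports the stream is used.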

Apply the --liststreams parameter to query all information about a given input (a DAT file, or any other supported input). To load additional plugins, use the --plugin argument as often as you like.
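For example (the file names here are placeholders, and the second plugin is purely hypothetical):

```
adtf_dattool --liststreams recording.adtfdat
adtf_dattool --plugin csv_processor.adtffileplugin \
    --plugin my_processor.adtffileplugin \
    --liststreams recording.adtfdat
```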

Create Mode

In this mode you can select arbitrary streams (even from multiple DAT files) to create a new DAT file. To do so, use the --create argument. The --input argument is used to specify an input file. The --readerid argument can be used to pass in the reader that should be used to read the file; if none is given, the first one that supports the file is used. Set the --start and --end arguments to select the range of the input that should be imported into the new DAT file. To shift the timestamps of all imported stream items, apply the --offset parameter.
To select streams from an input, use the --stream argument. If you do not select one or more streams explicitly, all streams will be added. Streams can be renamed with the --name argument.
Use the --serializerid argument to choose a specific serializer. If not specified, sample_copy_serialization.serialization.adtf.cid is used.
Here is an example that creates a new ADTF DAT file from two inputs:
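A sketch of such a call follows. The file names, stream names and timestamp values are placeholders, and the sketch assumes that the per-input arguments (--start, --end, --offset, --stream, --name) follow the --input argument they belong to:

```
adtf_dattool --create merged.adtfdat \
    --input first.adtfdat --start 2000000 --end 10000000 --stream VIDEO \
    --input second.adtfdat --offset 500000 --stream NESTED_STRUCT --name RENAMED_STRUCT
```

Here the first input contributes only the selected range of its VIDEO stream, while the second input's NESTED_STRUCT stream is shifted in time and renamed in the new file.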

Best Practice

This chapter shows you a workflow for starting from scratch with the ADTF DAT Tool, extending its functionality using adtffileplugins, and exporting the contained streams from a recording. The individual steps teach you about the architecture and how components and functionality are connected. For a better overview, we strip the paths from the calls and arguments.

Get information about the tooling

First of all, the command line help should always be the first place to look for information, whether you are getting started or diving deeper later:
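So the very first call is simply:

```
adtf_dattool --help
```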

Get information from the adtfdat file

As a next step you should have a look at the recorded file, in this example the example_file.adtfdat within $(ADTF_DIR)/src/examples/datfiles. The --liststreams option will dump the content, in this case one video stream and one DDL-described stream.
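With the paths stripped as mentioned above, the call looks like this:

```
adtf_dattool --liststreams example_file.adtfdat
```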

Get information from the adtffileplugin

With the knowledge about the contained streams, it is time to find a suitable processor to export some data. The --inspect option can be used to look inside an adtffileplugin, in this case the csv_processor.adtffileplugin delivered with the ADTF File Library next to the ADTF DAT Tool itself.
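A sketch of the inspection call, assuming the plugin file is passed directly as the argument (consult --help for the exact syntax):

```
adtf_dattool --inspect csv_processor.adtffileplugin
```

The output should tell you the processor IDs the plugin provides and the properties they support.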

Extend the functionality

With this information, the calls can be extended with --plugin, which makes the CSV processor available for exporting the stream NESTED_STRUCT, in contrast to the --liststreams call without that adtffileplugin.
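Compared side by side, the second call loads the plugin first:

```
adtf_dattool --liststreams example_file.adtfdat
adtf_dattool --plugin csv_processor.adtffileplugin --liststreams example_file.adtfdat
```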

Export the data

Now all the required information for exporting is available:

  • the tool call from the help
  • a valid adtfdat file
  • the name of streams
  • an adtffileplugin containing logic to export data
  • the suitable processorid for csv export
  • information about optional properties
Use cases for properties of this processor include adapting the decimal places:

or exporting timestamps in microseconds:
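Hedged sketches of both calls follow. The property names (decimal-places, time-format), their values, the name=value syntax and the processor ID csv are illustrative assumptions only; use --inspect to look up the real names supported by the processor:

```
# adapt the decimal places (property name and syntax assumed)
adtf_dattool --plugin csv_processor.adtffileplugin --export example_file.adtfdat \
    --stream NESTED_STRUCT --processorid csv \
    --property decimal-places=2 --output nested_struct.csv

# export timestamps in microseconds (property name and value assumed)
adtf_dattool --plugin csv_processor.adtffileplugin --export example_file.adtfdat \
    --stream NESTED_STRUCT --processorid csv \
    --property time-format=us --output nested_struct.csv
```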

Handle substreams

Processing substreams follows almost the same workflow, with tiny extensions. The --liststreams command shows the content, as already learned:

The important identifier in this case is the type adtf/substreams of the stream substreams. So you have to dive deeper using --listsubstreams:

You will receive the same required information as in the common sample stream use case. Now you can extend the functionality using a processor and export CSV in the same manner (we skip the re-inspection step because we already know the information from the previous steps). The only difference in the --export call is adding the (contained) --substream after the specified --stream:
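A sketch of such an export call; the file name and the substream name FLOATS are placeholders, while the stream name substreams is taken from the --liststreams output above:

```
adtf_dattool --plugin csv_processor.adtffileplugin --export recording.adtfdat \
    --stream substreams --substream FLOATS --processorid csv --output floats.csv
```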

Side note

Now you have a good starting point and a common workflow for the export use case. Of course you can apply this procedure, with little adaptation, to the create and modify options as well. Please also keep in mind that you can load several adtffileplugins at once, each with its own --plugin call: for example, readers and processors that together form a kind of file conversion, or several processors for exporting more than one stream with processors from different adtffileplugins (which is why it is important to address them via --processorid). For more help, please always refer to --help, which provides the following information:

Full Command Line Help

The following lists all options the tool supports. Just issue the following command: adtf_dattool.exe --help

Where to go next?

ADTF itself is a large bundle of SDKs and tooling. It has its limitations, but it is able to work in a distributed setup. We think it is time to step across this border and give you an introduction to the ADTF FEP Integration.