api package

Subpackages

Submodules

api.chessboard module

api.datatypes module

api.datatypes.box2d()

Fields are:

  • ‘x’ (float): the top coordinate of the box in pixels
  • ‘y’ (float): the left coordinate of the box in pixels
  • ‘h’ (float): box height in pixels
  • ‘w’ (float): box width in pixels
  • ‘r’ (float): box rotation angle (anti-clockwise)
  • ‘classes’ (int): the object category number
  • ‘id’ (int): the object’s instance unique id
  • ‘flags’ (int): miscellaneous info

api.datatypes.box3d()
Fields are:

  • ‘c’ (float, float, float): center coordinates
  • ‘d’ (float, float, float): the box dimensions
  • ‘r’ (float, float, float): the Euler angles (rotations)
  • ‘classes’ (int): the object category number
  • ‘id’ (int): the object’s instance unique id
  • ‘flags’ (int): miscellaneous info

Coordinate system: +x is forward, +y is left and +z is up.
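The box3d fields above map naturally onto a numpy structured dtype. The sketch below is illustrative only: the exact dtype used by das.api (field widths, integer sizes) may differ.

```python
import numpy as np

# Hypothetical structured dtype mirroring the box3d fields listed above;
# field widths are assumptions, not the library's actual definition.
box3d_dtype = np.dtype([
    ('c', 'f4', (3,)),   # center coordinates (+x forward, +y left, +z up)
    ('d', 'f4', (3,)),   # box dimensions
    ('r', 'f4', (3,)),   # Euler angles (rotations)
    ('classes', 'i4'),   # object category number
    ('id', 'i4'),        # instance unique id
    ('flags', 'i4'),     # miscellaneous info
])

boxes = np.zeros(2, dtype=box3d_dtype)
boxes[0]['c'] = (10.0, -2.0, 0.5)  # 10 m forward, 2 m to the right, 0.5 m up
boxes[0]['classes'] = 1
```

Structured arrays let a whole array of boxes be sliced per field, e.g. `boxes['c']` is an (N, 3) array of centers.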

api.datatypes.datasource_xyzit()

Fields are:

  • ‘x’ (float): x coordinate (forward)
  • ‘y’ (float): y coordinate (left)
  • ‘z’ (float): z coordinate (up)
  • ‘i’ (int): intensity
  • ‘t’ (int): timestamp

api.datatypes.datasource_xyzit_float_intensity()

Fields are:

  • ‘x’ (float): x coordinate (forward)
  • ‘y’ (float): y coordinate (left)
  • ‘z’ (float): z coordinate (up)
  • ‘i’ (float): intensity
  • ‘t’ (int): timestamp
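A point-cloud record with these fields can likewise be represented as a numpy structured array. This is a sketch under assumed field widths (the ‘u2’ intensity and ‘u8’ timestamp are guesses, not the library’s actual dtype):

```python
import numpy as np

# Hypothetical dtype mirroring the datasource_xyzit fields above.
xyzit_dtype = np.dtype([
    ('x', 'f4'), ('y', 'f4'), ('z', 'f4'),  # forward, left, up
    ('i', 'u2'),                            # intensity (integer variant)
    ('t', 'u8'),                            # timestamp (assumed microseconds)
])

pts = np.zeros(4, dtype=xyzit_dtype)
pts['x'] = [1.0, 2.0, 3.0, 4.0]

# Per-field access makes derived quantities easy, e.g. horizontal range:
rng = np.hypot(pts['x'], pts['y'])
```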

api.datatypes.lane()

Fields are:

  • ‘vertices’: (N,3) numpy array of the 3D coordinates
  • ‘type’ (int): number to identify the type of lane (see das/lane_type.py)

api.datatypes.poly2d()

Fields are:

  • ‘polygon’: an (N,2) array with N vertices and 2 x,y coordinates (in pixel units)
  • ‘classes’ (int): the object category number
  • ‘id’ (int): the object’s instance unique id
  • ‘flags’ (int): miscellaneous info

api.datatypes.rad()

Fields are:

  • ‘x’ (float): x coordinate (forward)
  • ‘y’ (float): y coordinate (left)
  • ‘v’ (float): velocity
  • ‘i’ (float): intensity
  • ‘t’ (int): timestamp

api.datatypes.seg2d()
Fields are:

  • ‘confidences’: 2D array of values between 0 and 1
  • ‘classes’ (int): the object category number

Note: the size of the 2D arrays should be the same as the sensor’s resolution

api.datatypes.seg3d()
Fields are:

  • ‘classes’ (int): the object category number
  • ‘id’ (int): the object’s instance unique id

Note: A 3D segmentation array must have the same length and ordering as the corresponding point cloud array.

api.imu module

api.interpolators module

api.interpolators.ceil_interpolator(datasource, float_index)
api.interpolators.euler_imu_linear_ndarray_interpolator(datasource, float_index)
api.interpolators.floor_interpolator(datasource, float_index)
api.interpolators.from_float_index(float_index)
api.interpolators.linear_dict_of_float_interpolator(datasource, float_index)
api.interpolators.linear_ndarray_interpolator(datasource, float_index)
api.interpolators.nearest_interpolator(datasource, float_index)
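The interpolators above all take a datasource and a float index. As an illustration of the general idea (a sketch only; the real das.api interpolators operate on datasource objects, and from_float_index()’s exact return format is not documented here), a float index can be split into an integer part and a fractional weight, which then drives floor, ceil, nearest, or linear lookups:

```python
import numpy as np

# Sketch: split a float index into (floor index, fractional weight),
# in the spirit of from_float_index(). Assumed behaviour, not the API's.
def split_float_index(float_index):
    floor = int(np.floor(float_index))
    return floor, float_index - floor

# Sketch of linear interpolation over raw samples, in the spirit of
# linear_ndarray_interpolator().
def linear_interp(samples, float_index):
    floor, w = split_float_index(float_index)
    if w == 0.0:
        return samples[floor]
    return (1.0 - w) * samples[floor] + w * samples[floor + 1]

data = np.array([0.0, 10.0, 20.0])
```

With this sketch, a float index of 0.5 blends samples 0 and 1 equally; floor_interpolator and ceil_interpolator would instead pick samples[floor] and samples[floor + 1] outright.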

api.loaders module

api.loaders.image_loader(fileobj)

Image loader to be used with FileSource

Parameters

fileobj (string) – Image filename

Returns

np.ndarray – Image

api.loaders.load_files_from_folder(folder, pattern, sort=False, return_keys=False)

Load files from a folder and sort them according to a regex pattern

api.loaders.pickle_loader(fileobj)

Pickle data file loader. To be used with FileSource.

Parameters

fileobj (string or file) – A file object or a string

Returns

object – Unpickled object
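A minimal sketch of how such a loader can honour the string-or-file contract described above (illustrative only; the actual das.api implementation may differ):

```python
import io
import pickle

# Sketch: accept either a filename or an already-open binary file object,
# mirroring the pickle_loader() contract documented above.
def pickle_loader(fileobj):
    if isinstance(fileobj, str):
        with open(fileobj, 'rb') as f:
            return pickle.load(f)
    return pickle.load(fileobj)

# Usage with an in-memory file object standing in for a FileSource member:
buf = io.BytesIO(pickle.dumps({'frame': 42}))
obj = pickle_loader(buf)
```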

api.loaders.txt_loader(fileobj)

Bytes data file loader. To be used with FileSource.

Parameters

fileobj (string or file) – A file object or a string

Returns

bytes

api.platform module

class api.platform.Filtered(synchronized: api.platform.Synchronized, indices: List[int])

Bases: object

__getitem__(index: Any) → Dict[str, pioneer.das.api.samples.sample.Sample]

Implements the ‘[]’ API, using the indices

Parameters

index – Will index into this view’s indices, and then call Synchronized.__getitem__()

Returns

See Synchronized.__getitem__()

__init__(synchronized: api.platform.Synchronized, indices: List[int])

Initialize self. See help(type(self)) for accurate signature.

expand_wildcards(labels: List[str])

Wraps Synchronized.expand_wildcards()

keys()

Wraps Synchronized.keys()

property platform
class api.platform.Platform(dataset: Optional[str] = None, configuration: Optional[str] = None, include: Optional[list] = None, ignore: Optional[list] = [], progress_bar: bool = True, default_cache_size: int = 100)

Bases: object

A Platform encapsulates an instance (configuration) of some data acquisition platform. It contains one or more sensors, each containing one or more datasources. A live platform interfaces with live sensors, while an offline platform can be used to extract data from a recording.

__getitem__(label: str) → Union[pioneer.das.api.sensors.sensor.Sensor, pioneer.das.api.datasources.abstract_datasource.AbstractDatasource]

Implement operator ‘[]’

Parameters

label – can be a sensor name (e.g. ‘pixell_tfc’) or a datasource name (e.g. ‘pixell_tfc_ech’)

Returns

the Sensor or AbstractDatasource instance

Raises

IndexError – label must contain exactly 2 or 3 ‘_’

__init__(dataset: Optional[str] = None, configuration: Optional[str] = None, include: Optional[list] = None, ignore: Optional[list] = [], progress_bar: bool = True, default_cache_size: int = 100)

Constructor

Parameters
  • dataset – If None, this platform will be considered a live sensor platform. Otherwise, ‘dataset’ contains the path to an offline recording, where one expects to find one or more ‘.zip’ filesources named after their corresponding datasource. If argument ‘configuration’ is None, then one expects to find a file named ‘platform.yml’ which declares which sensors this platform must consider, and thus which ‘.zip’ files to use. ‘.zip’ files are expected to contain one file named according to das.api.filesource.TIMESTAMPS_CSV_PATTERN, zero or one file named according to das.api.filesource.CONFIG_YML_PATTERN, and other files corresponding to actual sensor data (e.g. ‘.pkl’ files, ‘.png’ files, etc.)

  • configuration – If None, see the ‘dataset’ documentation. Otherwise, ‘configuration’ is expected to contain a path to a ‘.yml’ file describing a das.api acquisition platform configuration, or its content directly, which allows the user to instantiate a platform without creating a file. If ‘dataset’ is not None, ‘configuration’ can be a path relative to ‘dataset’

  • include – A list of strings (e.g. [‘lca2’,’flir_tfc’]) to be included exclusively in the platform’s keys

  • ignore – A list of strings (e.g. [‘lca2’,’flir_tfc’]) to be ignored and not included in the platform’s keys

  • progress_bar – If True, show the progress bars when initializing or synchronizing.

  • default_cache_size – the default value for (offline) datasource cache size

datasource_names() → List[str]

Returns the list of all datasource names, e.g. [‘pixell_tfc_ech’, ‘pixell_tfc_sta’, …]

property egomotion_provider
expand_wildcards(labels: List[str]) → List[str]

See also: platform.expand_wildcards()

property extrinsics

Returns a dict with extrinsics information for each platform’s sensor

from_relative_path(relative_path: str) → str

Converts a path relative to dataset folder to an absolute path

property intrinsics

Returns a dict with intrinsics information for each platform’s sensor

is_live() → bool

Returns whether this platform contains live sensors or offline recordings

property orientation

Returns a dict with orientation matrix for each platform’s sensor

record()

Live platform only: toggles recording for all (live) sensors

property sensors

Returns this platform’s Sensors instance

start()

Live platform only: starts the (live) sensors

stop()

Live platform only: stops the (live) sensors

synchronized(sync_labels: List[str] = [], interp_labels: List[str] = [], tolerance_us: Union[float, int] = None, fifo: int = -1)

Creates a Synchronized instance with self as platform

See Also: Synchronized.__init__()

to_nas_path(path: str) → str

Converts absolute yaml paths to paths relative to os.environ[‘nas’]

try_absolute_or_relative(folder)

Tries the path as absolute, then as relative

class api.platform.Sensors(pf: api.platform.Platform, yml: dict)

Bases: object

The collection of Sensor instances in a platform

__getitem__(label)

Implement ‘[]’ API

__init__(pf: api.platform.Platform, yml: dict)

Constructor

Parameters
  • platform – the platform that holds this Sensors instance

  • yml – the YAML database

items()

Implement dict API

keys()

Implement dict API

Returns

The list of sensor names, e.g. [‘pixell_tfc’, ‘flir_tfc’, …]

override_all_extrinsics(extrinsics_folder: str)

Override the extrinsics that were loaded from the platform yml.

Only the sensors for which extrinsics were previously loaded will be overridden.

Parameters

extrinsics_folder – The extrinsics folder

override_sensor_extrinsics(name: str, extrinsics_folder: str)

Override the extrinsics that were loaded from the platform yml.

Parameters

extrinsics_folder – The extrinsics folder

start()

Live platform only: starts the (live) sensors

stop()

Live platform only: stops the (live) sensors

values()

Implement dict API

class api.platform.Sliced(synchronized: api.platform.Synchronized, intervals: List[Tuple[int, int]], stride: int = 1)

Bases: object

Creates a view on a synchronized dataset using a list of intervals

__getitem__(index: Any) → Dict[str, pioneer.das.api.samples.sample.Sample]

Implements the ‘[]’ API, using the intervals

Parameters

index – Will index in this view’s intervals, and then call Synchronized.__getitem__()

Returns

See Synchronized.__getitem__()

__init__(synchronized: api.platform.Synchronized, intervals: List[Tuple[int, int]], stride: int = 1)

Constructor

Parameters
  • synchronized – The ‘Synchronized’ instance we want a view on.

  • intervals – A list of intervals of interest over synchronized’s domain (both endpoints included). The list does not need to be ordered, and can index a synchronized index more than once. For example, with a ‘stride’ of 1, the list [(5, 7), (12, 15), (15, 13)] will expand to indices [5,6,7, 12,13,14,15, 15,14,13].

  • stride – the stride (jump between consecutive frames)
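The interval expansion described above can be sketched in a few lines of plain Python. This reproduces the behaviour shown in the docstring example (including reversed intervals); it is an assumed reading of the semantics, not the library’s actual implementation:

```python
# Sketch: expand a list of (start, stop) intervals, endpoints included,
# into a flat index list; a reversed interval walks backwards.
def expand_intervals(intervals, stride=1):
    indices = []
    for start, stop in intervals:
        step = stride if stop >= start else -stride
        end = stop + (1 if step > 0 else -1)  # make 'stop' inclusive
        indices.extend(range(start, end, step))
    return indices
```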

expand_wildcards(labels: List[str])

Wraps Synchronized.expand_wildcards()

keys()

Wraps Synchronized.keys()

property platform
class api.platform.Synchronized(platform: api.platform.Platform, sync_labels: List[str], interp_labels: List[str], tolerance_us: Union[float, int], fifo: int)

Bases: object

Synchronized view over a Platform

class SynchGetter(index, synch)

Bases: object

__getitem__(ds_name)
__init__(index, synch)

Initialize self. See help(type(self)) for accurate signature.

clear()
copy()
has_key(k)
items()
keys()
len()
values()
__getitem__(index: Union[int, slice, Iterator[int]]) → Dict[str, pioneer.das.api.samples.sample.Sample]

Implements the ‘[]’ API

Parameters

index – a synchronized index, slice, or iterator of indices

Returns

A dict mapping datasource names to Sample instances

__init__(platform: api.platform.Platform, sync_labels: List[str], interp_labels: List[str], tolerance_us: Union[float, int], fifo: int)

Constructor

Parameters
  • platform – The platform that contains datasources to synchronize

  • sync_labels – A list of expanded datasource names from which to harvest tuples of samples synchronized within ‘tolerance_us’ microseconds. Important: the first label will serve as the ‘reference datasource’

  • interp_labels – A list of expanded datasource names from which to obtain interpolated samples to be added to the ‘synchronized tuples’. The corresponding datasource must have an interpolator defined.

  • tolerance_us – the synchronization tolerance, i.e. for each sample in the ‘reference datasource’, we try to find one sample in each of the other synchronized datasources that is within ‘tolerance_us’ and add it to the ‘synchronized tuple’. Incomplete ‘synchronized tuples’ are discarded.

  • fifo – For future use (live datasources)

expand_wildcards(labels: List[str])

See also: platform.expand_wildcards()

filtered(indices: List[int])
get_single_ds(ds_name: str, index: Union[int, slice, Iterator[int]]) → pioneer.das.api.samples.sample.Sample
indices(index: int)

Returns indices for all datasources for a given synchronized index

keys()

Returns synchronization labels and interpolation labels (implement dict API)

property ref_ds_name
sliced(intervals: List[Tuple[int, int]])

Returns a Sliced instance

See Also: Sliced

timestamps(index: int) → numpy.ndarray

Returns timestamps for all synchronized datasources for a given index

class api.platform.SynchronizedGroup(datasets: Union[str, list], sync_labels: List[str] = [], interp_labels: List[str] = [], tolerance_us: Union[float, int] = None, include: Optional[list] = None, ignore: Optional[list] = [], preload: bool = False)

Bases: api.platform.Synchronized

Groups multiple synchronized platforms into a single one

Parameters
  • datasets – Can be either a list of paths to the datasets to be grouped, or a single string with the path of a directory that contains the datasets, which will all be grouped.

  • sync_labels – see Synchronized

  • interp_labels – see Synchronized

  • tolerance_us – see Synchronized

  • include – see Platform

  • ignore – see Platform

  • preload – If True, all platforms will be initialized and synchronized at start. Good for random or non-sequential data access, but initialization is slower.

__getitem__(index: Union[int, slice, Iterator[int]]) → Dict[str, pioneer.das.api.samples.sample.Sample]

Implements the ‘[]’ API

Parameters

index – a synchronized index, slice, or iterator of indices

Returns

A dict mapping datasource names to Sample instances

__init__(datasets: Union[str, list], sync_labels: List[str] = [], interp_labels: List[str] = [], tolerance_us: Union[float, int] = None, include: Optional[list] = None, ignore: Optional[list] = [], preload: bool = False)

Constructor

Parameters
  • platform – The platform that contains datasources to synchronize

  • sync_labels – A list of expanded datasource names from which to harvest tuples of samples synchronized within ‘tolerance_us’ microseconds. Important: the first label will serve as the ‘reference datasource’

  • interp_labels – A list of expanded datasource names from which to obtain interpolated samples to be added to the ‘synchronized tuples’. The corresponding datasource must have an interpolator defined.

  • tolerance_us – the synchronization tolerance, i.e. for each sample in the ‘reference datasource’, we try to find one sample in each of the other synchronized datasources that is within ‘tolerance_us’ and add it to the ‘synchronized tuple’. Incomplete ‘synchronized tuples’ are discarded.

  • fifo – For future use (live datasources)

filtered(indices: List[int])
get_synchronized(dataset_index, progress_bar=False)
indices(index: int)

Returns indices for all datasources for a given synchronized index

keys()

Returns synchronization labels and interpolation labels (implement dict API)

property platform
sliced(intervals: List[Tuple[int, int]])

Returns a Sliced instance

See Also: Sliced

timestamps(index: int)

Returns timestamps for all synchronized datasources for a given index

api.platform.closest_timestamps

Numba-optimized version of closest_timestamps_np()

api.platform.closest_timestamps_np(ref_ts: numpy.ndarray, target_ts: numpy.ndarray, tol: Union[float, int])

Finds indices of timestamp pairs where a value in ‘target_ts’ is within ‘tol’ of a value in ‘ref_ts’

Parameters
  • ref_ts – the timestamps of the sensor we want to synchronize with

  • target_ts – the timestamps of the sensor in which we hope to find matches with ‘ref_ts’, within ‘tol’

Returns

The indices of the matches
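The matching described above can be sketched with plain numpy. This is an illustration of the idea only: the real closest_timestamps_np() (and its Numba-optimized counterpart) may differ in tie-breaking, vectorization, and return format.

```python
import numpy as np

# Sketch: for each reference timestamp, find the nearest target timestamp
# and keep the pair only if it falls within the tolerance.
def closest_timestamps(ref_ts, target_ts, tol):
    matches = []
    for i, t in enumerate(ref_ts):
        j = int(np.argmin(np.abs(target_ts - t)))
        if abs(int(target_ts[j]) - int(t)) <= tol:
            matches.append((i, j))
    return matches

ref = np.array([100, 200, 300])
target = np.array([102, 305, 450])
```

Here, ref[0] matches target[0] (2 us apart) and ref[2] matches target[1] (5 us apart), while ref[1] has no target within tolerance and is discarded, mirroring how incomplete synchronized tuples are dropped.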

api.platform.closest_timestamps_numba

Numba-optimized version of closest_timestamps_np()

api.platform.parse_yaml_string(ys)

api.utils module

Module contents