# Custom Data

Module: `rerun`

## class `AnyValues`

Bases: `AsComponents`

Helper to log arbitrary values as a bundle of components.
Example:

```python
rr.log(
    "any_values",
    rr.AnyValues(
        confidence=[1.2, 3.4, 5.6],
        description="Bla bla bla…",
    ),
)
```
### `def __init__(drop_untyped_nones=True, **kwargs)`

Construct a new `AnyValues` bundle.
Each kwarg will be logged as a separate component using the provided data:

- The key is used as the name of the component.
- The value must be convertible to an array of arrow types. In general, if you can pass it to `pyarrow.array`, you can log it as an extension component.
Note: rerun requires that a given component only take on a single type. The first type logged will be the type that is used for all future logs of that component. The API will make a best effort to do type conversion if supported by numpy and arrow. Any components that can't be converted will result in a warning (or an exception in strict mode).
`None` values pose a particular challenge, as they carry no type information until the component has been logged with a concrete type. By default, these values are dropped. This should generally be fine, since logging `None` to clear a value before it has ever been logged is meaningless unless you are logging out-of-order data. In such cases, consider introducing your own typed component via `rerun.ComponentBatchLike`. You can change this behavior by setting `drop_untyped_nones` to `False`, but be aware that this may result in warnings (or exceptions in strict mode).
If you want to inspect how your value will be converted to the underlying arrow array, the following snippet mirrors what happens internally:

```python
np_value = np.atleast_1d(np.array(value, copy=False))
pa_value = pa.array(np_value)
```
| Parameter | Type | Description |
|---|---|---|
| `drop_untyped_nones` | `bool` | If `True`, any components that are `None` will be dropped unless they have been previously logged with a type. |
| `kwargs` | | The components to be logged. |
## class `AnyBatchValue`

Bases: `ComponentBatchLike`

Helper to log arbitrary data as a component batch.

This is a very simple helper that implements the `ComponentBatchLike` interface on top of the `pyarrow` library's array conversion functions.

See also `rerun.AnyValues`.
### `def __init__(descriptor, value, drop_untyped_nones=True)`

Construct a new `AnyBatchValue`.

The value is converted into an arrow array by first calling its `as_arrow_array()` method, if defined. All Rerun batch datatypes implement this function, so they can be passed directly to `AnyValues`. If the object doesn't implement `as_arrow_array()`, it is passed as an argument to `pyarrow.array`.
Note: rerun requires that a given component only take on a single type. The first type logged will be the type that is used for all future logs of that component. The API will make a best effort to do type conversion if supported by numpy and arrow. Any components that can't be converted will be dropped, and a warning will be sent to the log.
If you want to inspect how your value will be converted to the underlying arrow array, the following snippet mirrors what happens internally:

```python
np_value = np.atleast_1d(np.array(value, copy=False))
pa_value = pa.array(np_value)
```
| Parameter | Type | Description |
|---|---|---|
| `descriptor` | | Either the name or the full descriptor of the component. |
| `value` | | The data to be logged as a component. |
| `drop_untyped_nones` | `bool` | If `True`, any components that are `None` will be dropped unless they have been previously logged with a type. |
### `def partition(lengths)`

Partitions the component into multiple sub-batches. This wraps the inner arrow array in a `pyarrow.ListArray`, where the individual lists have the specified lengths. The lengths must sum to the total length of the component batch.
| Parameter | Type | Description |
|---|---|---|
| `lengths` | | The lengths of the sub-batches to partition the component into; must sum to the length of the batch. |
| Returns | Description |
|---|---|
| | The partitioned component. |