Columnar API
rerun
class TimeNanosColumn
Bases: TimeColumnLike
A column of time values that are represented as integer nanoseconds.
Columnar equivalent to rerun.set_time_nanos.
class TimeSecondsColumn
Bases: TimeColumnLike
A column of time values that are represented as floating point seconds.
Columnar equivalent to rerun.set_time_seconds.
class TimeSequenceColumn
Bases: TimeColumnLike
A column of time values that are represented as an integer sequence.
Columnar equivalent to rerun.set_time_sequence.
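As a brief illustration (the timeline names and values below are assumptions, not part of the reference, and the seconds and nanoseconds variants are assumed to take the same (timeline, values) arguments that TimeSequenceColumn takes in the examples further down), each of the three column classes above is built from a timeline name and an array of values in the corresponding unit:

import numpy as np
import rerun as rr

n = 10
seq_col = rr.TimeSequenceColumn("frame", np.arange(n))                      # integer sequence, e.g. frame numbers
secs_col = rr.TimeSecondsColumn("sim_time", np.arange(n) * 0.1)             # floating point seconds
nanos_col = rr.TimeNanosColumn("capture_time", np.arange(n) * 10_000_000)   # integer nanoseconds

Any of these columns can then be passed in the times argument of send_columns below.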
def send_columns(entity_path, times, components, recording=None, strict=None)
Send columnar data to Rerun.
Unlike the regular log API, which is row-oriented, this API lets you submit the data in a columnar form. Each TimeColumnLike and ComponentColumnLike object represents a column of data that will be sent to Rerun. The lengths of all of these columns must match, and all data that shares the same index across the different columns will act as a single logical row, equivalent to a single call to rr.log().
Note that this API ignores any stateful time set on the log stream via the rerun.set_time_* APIs. Furthermore, it will not inject the default log_tick and log_time timeline columns.
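This means that any timeline you want the rows to appear on, including a wall-clock one, must be passed explicitly as a time column. A minimal sketch, assuming an illustrative timeline name "log_time" and made-up timestamp values:

import numpy as np
import rerun as rr

steps = np.arange(0, 64)
# Illustrative wall-clock timestamps: nanoseconds since the Unix epoch, 1 s apart.
stamps_ns = 1_700_000_000_000_000_000 + steps * 1_000_000_000

rr.send_columns(
    "scalars",
    times=[
        rr.TimeSequenceColumn("step", steps),        # sequence timeline
        rr.TimeNanosColumn("log_time", stamps_ns),   # explicitly supplied wall-clock timeline
    ],
    components=[rr.components.ScalarBatch(np.sin(steps / 10.0))],
)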
When using a regular ComponentBatch input, the batch data will map to single-valued component instances at each timepoint. For example, scalars would be logged as:
import numpy as np
import rerun as rr

# One scalar per timepoint: 64 timepoints, sent in a single call.
times = np.arange(0, 64)
scalars = np.sin(times / 10.0)
rr.send_columns(
    "scalars",
    times=[rr.TimeSequenceColumn("step", times)],
    components=[rr.components.ScalarBatch(scalars)],
)
However, it is still possible to send temporal batches of batch data. To do this, the source data must first be created as a single contiguous batch, which can then be partitioned using the .partition() helper on the ComponentBatch objects. For example, to log 5 batches of 20 point clouds, first create a batch of 100 (20 * 5) point clouds and then partition it into 5 batches of 20 point clouds:
import numpy as np
import rerun as rr

rng = np.random.default_rng()
times = np.arange(0, 5)
# 100 points total: 5 timepoints x 20 points each.
positions = rng.uniform(-5, 5, size=[100, 3])
rr.send_columns(
    "points",
    times=[rr.TimeSequenceColumn("step", times)],
    components=[
        rr.Points3D.indicator(),
        rr.components.Position3DBatch(positions).partition([20, 20, 20, 20, 20]),
    ],
)
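The same pattern extends to several component columns per timepoint. The following is a hedged sketch, not part of the original example: the ColorBatch column and the constant color values are assumptions added for illustration, and both batches are partitioned with the same lengths so that every timepoint receives 20 positions and 20 colors.

import numpy as np
import rerun as rr

rng = np.random.default_rng()
times = np.arange(0, 5)
positions = rng.uniform(-5, 5, size=[100, 3])
colors = np.tile(np.array([255, 0, 0, 255], dtype=np.uint8), (100, 1))  # one RGBA value per point

rr.send_columns(
    "points",
    times=[rr.TimeSequenceColumn("step", times)],
    components=[
        rr.Points3D.indicator(),
        rr.components.Position3DBatch(positions).partition([20] * 5),
        rr.components.ColorBatch(colors).partition([20] * 5),
    ],
)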
PARAMETER | DESCRIPTION |
---|---|
entity_path | Path to the entity in the space hierarchy. See https://www.rerun.io/docs/concepts/entity-path for more on entity paths. TYPE: str |
times | The time values of this batch of data. Each TimeColumnLike object represents a single column of time values. TYPE: Iterable[TimeColumnLike] |
components | The columns of components to log. Each object represents a single column of data. If a batch of components is passed, it will be partitioned with one element per timepoint. In order to send multiple components per time value, explicitly create a partitioned column, e.g. by calling .partition() on each ComponentBatch (as in the positions-and-colors sketch above). TYPE: Iterable[ComponentColumnLike] |
recording | Specifies the RecordingStream to use. If left unspecified, defaults to the current active data recording, if there is one. TYPE: RecordingStream or None |
strict | If True, raise exceptions on non-loggable data. If False, warn on non-loggable data. If None, use the global default from rerun.strict_mode() (illustrated in the sketch after this table). TYPE: bool or None |
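As a closing sketch of the strict flag (the data below is illustrative; the behavior is the one described in the parameter table above): with strict=True, a non-loggable input raises an exception instead of only emitting a warning.

import numpy as np
import rerun as rr

times = np.arange(0, 8)
rr.send_columns(
    "scalars",
    times=[rr.TimeSequenceColumn("step", times)],
    components=[rr.components.ScalarBatch(np.cos(times / 4.0))],
    strict=True,  # raise instead of warn if something cannot be logged
)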