Trajdata

Latest version: v1.4.0

1.4.0

In this release, we:
* Added support for fast map area queries (backed by an STRTree within trajdata's `VectorMap`); see the sketch after this list.
* Added quality-of-life improvements (additional warnings, the ability to control whether maps are kept in memory, etc.).
* Fixed the format of nuPlan Lane IDs (mainly impacting traffic light queries).
* Added the ability to cache the data index to disk (speeding up subsequent trajdata initializations for the same core arguments).
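
For illustration, the sketch below loads a cached vector map and queries lanes near a point. This is a minimal sketch: the cache path, map ID, and coordinates are placeholders, and treating `get_lanes_within` as one of the queries backed by the new index is an assumption.

```python
from pathlib import Path

import numpy as np

from trajdata.maps.map_api import MapAPI

# Load a vector map from trajdata's cache (the cache path and map ID are
# placeholders for illustration).
map_api = MapAPI(Path("~/.unified_data_cache").expanduser())
vec_map = map_api.get_map("nusc_mini:boston-seaport")

# Query all lanes within 50 m of a world-frame (x, y, z) point; spatial
# queries like this are what the STRTree inside VectorMap accelerates.
query_point = np.array([665.0, 1600.0, 0.0])
nearby_lanes = vec_map.get_lanes_within(query_point, 50.0)
print(f"Found {len(nearby_lanes)} lanes near the query point.")
```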

1.3.3

Adding support for the [nuPlan Dataset](https://www.nuscenes.org/nuplan)!
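
As a hedged sketch of loading it through trajdata (the `nuplan_mini` tag and the data directory below are assumptions; see `DATASETS.md` for the exact names and expected layout):

```python
from trajdata import UnifiedDataset

# The "nuplan_mini" tag and the dataset path below are assumptions;
# consult DATASETS.md for the exact names and directory layout.
dataset = UnifiedDataset(
    desired_data=["nuplan_mini"],
    centric="agent",
    data_dirs={"nuplan_mini": "~/datasets/nuplan/dataset/nuplan-v1.1"},
)
print(f"Loaded {len(dataset):,} agent-centric data samples.")
```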

This release also contains bugfixes related to:
* A typo in vector map tests.
* Handling of scenes with only the ego vehicle present.
* Imports related to the INTERACTION dataset.
* A typo related to traffic light <-> lane associations.
* Loading of traffic light data during batching and in the map API.
* Passing information to the vector map constructor during dataset initialization.

This release also adds CI that runs tests (still simple for now) on every commit, as well as automated publishing of releases to PyPI!

1.3.2

Adding support for the INTERACTION dataset and the Stanford Drone Dataset (SDD), as well as improvements to a few internal utilities to better support datasets like them.

1.3.1

Adding support for the [Waymo Open Motion Dataset](https://waymo.com/open/)!

Please note, there are a few caveats with this release:
- Traffic light colors are not yet perfectly aligned between Waymo and trajdata (we haven't had a dataset that distinguishes between yellow/caution lights and red/stop lights yet). Stay tuned for a small update in the future to fix this!
- There are known issues with the mapping of lanes to lane boundaries (this is immediately visible when rendering Waymo scenes in trajdata). In essence, mapping lane boundaries to lanes can be [quite complicated in the Waymo dataset...](https://github.com/waymo-research/waymo-open-dataset/issues/389) so we tried our best to align lanes to their respective lane boundaries.
  - For example, when there are no direct lane-to-boundary mappings provided, we subdivide lanes into chunks with the same availability of lane boundaries to their left and right (measured geometrically via projection).

1.3.0

This version brings with it a major visualization upgrade, now with HTML-based interactive plots and animations! Additionally, we provide `trajdata.utils.batch_utils.SceneTimeBatcher`, a batch sampler that can be fed into a standard PyTorch dataloader, for use cases where one wants to loop through a whole agent-centric dataset but compute statistics grouped by individual timesteps in scenes.

See the notes below for more details.
- Users can now create interactive, HTML-based plots via the Bokeh visualization library. Beyond static figures, users can also create interactive animations! Take a look at `examples/visualization_example.py` to see how you can use these features too (a short sketch also follows this list).
- `SceneTimeBatcher` is a batch sampler that can be fed into a standard PyTorch dataloader, e.g.,

```python
from torch.utils.data import DataLoader

from trajdata import UnifiedDataset
from trajdata.utils.batch_utils import SceneTimeBatcher

dataset = UnifiedDataset(
    desired_data=["nusc_mini-mini_train"],
    centric="agent",
)

dataloader = DataLoader(
    dataset,
    batch_sampler=SceneTimeBatcher(dataset),
    collate_fn=dataset.get_collate_fn(),
    num_workers=4,
)
```

Each batch from the resulting dataloader is an `AgentBatch`, with its elements corresponding to the agents present at a particular timestep in a particular scene. An example is provided at `examples/scenetimebatcher_example.py`.
- Added information about the nuPlan dataset to `DATASETS.md`, additional tests related to the above additions, and bugfixes.
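
As a rough sketch of the interactive plotting mentioned above (the `plot_agent_batch_interactive` import path and signature are assumptions modeled on the example script; `examples/visualization_example.py` is the authoritative reference):

```python
from torch.utils.data import DataLoader

from trajdata import UnifiedDataset
from trajdata.visualization.interactive_vis import plot_agent_batch_interactive

dataset = UnifiedDataset(desired_data=["nusc_mini-mini_train"], centric="agent")
dataloader = DataLoader(
    dataset, batch_size=4, collate_fn=dataset.get_collate_fn()
)

batch = next(iter(dataloader))
# Renders an interactive, HTML-based plot of one batch element
# (using dataset.cache_path as the map source is an assumption).
plot_agent_batch_interactive(batch, batch_idx=0, cache_path=dataset.cache_path)
```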

1.2.1

This release mainly brings bugfixes related to the introduction of `StateTensor` and `StateArray`. Thankfully, things seem to be much more stable now, and all of trajdata's core use cases should be fully working again.

Further, longitudinal and lateral (lon/lat) velocity components can now be requested from state arrays/tensors (a new feature).
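
For example, with an agent-centric batch (a minimal sketch: the `as_format` call and the `"v_lon,v_lat"` element names are assumptions based on this release's description):

```python
from torch.utils.data import DataLoader

from trajdata import UnifiedDataset

dataset = UnifiedDataset(desired_data=["nusc_mini-mini_train"], centric="agent")
dataloader = DataLoader(dataset, batch_size=4, collate_fn=dataset.get_collate_fn())

batch = next(iter(dataloader))
# batch.agent_hist is a StateTensor; the "v_lon,v_lat" element names below
# are an assumption based on this release's description.
lon_lat_vel = batch.agent_hist.as_format("v_lon,v_lat")
```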
