CARLA Simulator collision scenario DVS sequences from "Bio-inspired event-based looming object detection for automotive collision avoidance"
Published in: 2025
<p dir="ltr">Data for paper published in <i>Neuromorphic Computing and Engineering</i> (April 2025)<br><br>This dataset comprises 1406 sequences (4 sec each) of simulated dynamic vision sensor data from virtual driving and collision scenarios with cars and pedestrians, created using the CARLA Simulator. It was used to evaluate the capabilities of a neuromorphic looming detector, as presented in <a href="https://doi.org/10.1088/2634-4386/add0da" rel="noreferrer" target="_blank">Fabian Schubert et al 2025 <i>Neuromorph. Comput. Eng.</i></a></p><h4>The corresponding code is available at <a href="https://github.com/FabianSchubert/Bio-Inspired-Event-Based-Looming-Object-Detection-for-Automotive-Collision-Avoidance" rel="noreferrer" target="_blank">https://github.com/FabianSchubert/Bio-Inspired-Event-Based-Looming-Object-Detection-for-Automotive-Collision-Avoidance</a>.</h4><h4>In the corresponding study, we evaluated the capability of our bio-inspired network to distinguish approaching vehicles and pedestrians on a collision course from regular driving, based on dynamic vision sensor data captured by a forward-facing camera mounted at the front of the ego vehicle. To that end, we simulated various scenarios in a virtual urban environment involving collisions with cars and pedestrians, as well as regular driving. The system reliably detects looming cars, yet it struggles with the detection of pedestrians - a result that can be partially explained by the fact that, compared to cars, pedestrians trigger relatively few events when approaching due to their significantly smaller cross section.</h4><p dir="ltr">Data Formats</p><p dir="ltr">Each sequence is stored in a separate folder, which contains an event.npy file and a sim_data.npz metadata file.<br>The event.npy file holds a structured numpy array, with each element corresponding to an event. The data fields of the structured array are [("t", "&lt;u4"), ("x", "&lt;u2"), ("y", "&lt;u2"), ("p", "&lt;u2")]</p><ul><li>t: event time in milliseconds</li><li>x, y: pixel coordinates of the event</li><li>p: polarity, where 0 is a negative event and 1 is a positive one.</li></ul><p dir="ltr">Here, "&lt;u4" and "&lt;u2" refer to little-endian 32 bit and 16 bit unsigned integers, respectively.</p><p dir="ltr">The metadata file sim_data.npz stores the keys ["coll_type", "t_end", "dt", "vel", "diameter_object"].</p><ul><li>coll_type: string describing the type of collision. Possible values are ["pedestrian", "cars", "none", "none_with_traffic"]. The first two denote collisions with pedestrians and cars; the latter two refer to regular driving without collisions, involving either no other vehicles or driving in urban traffic.</li><li>t_end: end time of the sequence in milliseconds. If the sequence contains a collision, this also marks the collision time.</li><li>dt: simulation time step in milliseconds.</li><li>vel: the average velocity of the collision object relative to the ego perspective/camera of the vehicle, in meters/sec. If no collision occurs, this value is "np.nan".</li><li>diameter_object: the geometric mean of the collision object's bounding box width and height (using orthographic projection along the camera's forward-facing axis); essentially a representation of the cross section of the collision object. If no collision occurs, this is "np.nan".</li></ul><p dir="ltr">If you only intend to inspect the event data provided here, only the numpy python package is required. To run the looming detection simulation code provided in the repository, installing <a href="https://github.com/genn-team/genn/tree/genn_4_master" rel="noreferrer" target="_blank">PyGeNN 4.9</a> is also necessary.<br>For visualisation and analysis of the results, matplotlib and pandas are required.</p>
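The file layout described above can be read with numpy alone. The following is a minimal sketch of loading and inspecting one sequence; since this page cannot ship the actual data, the snippet first writes a small synthetic event.npy/sim_data.npz pair with the documented dtype and keys (the event values and metadata numbers are illustrative, not taken from the dataset).

```python
import numpy as np

# Structured dtype of event.npy as documented above:
# "<u4" = little-endian 32-bit unsigned, "<u2" = 16-bit unsigned.
event_dtype = np.dtype([("t", "<u4"), ("x", "<u2"), ("y", "<u2"), ("p", "<u2")])

# --- synthetic stand-in for one sequence folder (illustrative values) ---
events = np.array(
    [(0, 10, 20, 1), (5, 11, 20, 0), (9, 12, 21, 1)],
    dtype=event_dtype,
)
np.save("event.npy", events)
np.savez(
    "sim_data.npz",
    coll_type="cars",        # one of: pedestrian, cars, none, none_with_traffic
    t_end=4000,              # sequence end (= collision time here), ms
    dt=1.0,                  # simulation time step, ms
    vel=8.5,                 # relative velocity of collision object, m/s
    diameter_object=1.7,     # geometric mean of bounding-box width/height
)

# --- loading and inspecting a sequence ---
events = np.load("event.npy")
meta = np.load("sim_data.npz")

pos = events[events["p"] == 1]                      # positive-polarity events
duration_ms = events["t"].max() - events["t"].min() # span of event timestamps
print(meta["coll_type"], len(pos), duration_ms)     # prints: cars 2 9
```

For a real sequence, replace the two file names with the paths inside the sequence folder; the fields and keys are exactly those listed above.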