Dense urban environments present unique challenges to the deployment of autonomous vehicles, including a large number of vehicles moving at various speeds, obstructions that are opaque to in-vehicle sensors, and chaotic pedestrian behavior. Smart city intersections will be at the core of AI-powered traffic management systems for crowded metropolises. COSMOS will provide all the components needed for developing smart intersections and will support cloud-connected vehicles to overcome the limitations of purely autonomous vehicles. In particular, COSMOS will enable vehicles to wirelessly share in-vehicle sensor data with other vehicles and with edge cloud servers. COSMOS will also deploy a variety of infrastructure sensors, including street-level and bird's-eye cameras, whose data will be aggregated by the servers. The servers will run real-time algorithms to monitor and manage traffic.
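As a minimal sketch of the aggregation step described above, the following hypothetical Python function merges per-sensor detection lists (e.g., from a bird's-eye camera and a connected vehicle) into a single intersection-wide view. The sensor names, detection format, and the simple position-averaging rule are illustrative assumptions, standing in for a proper multi-sensor association and fusion algorithm:

```python
from collections import defaultdict

def aggregate_detections(sources):
    """Merge per-sensor detection lists into one intersection-wide view.

    `sources` maps a sensor name to a list of (object_id, x, y) detections
    in a shared ground-plane frame. Detections with the same object_id from
    different sensors are averaged; this is a placeholder for a real
    association/fusion algorithm.
    """
    buckets = defaultdict(list)
    for sensor, detections in sources.items():
        for object_id, x, y in detections:
            buckets[object_id].append((x, y))
    fused = {}
    for object_id, points in buckets.items():
        n = len(points)
        fused[object_id] = (sum(p[0] for p in points) / n,
                            sum(p[1] for p in points) / n)
    return fused

# Usage (hypothetical sensor names and coordinates):
view = aggregate_detections({
    "birdseye_cam": [(1, 10.0, 2.0), (2, -4.0, 6.0)],
    "vehicle_17":   [(1, 10.4, 1.8)],
})
# Object 1 is seen by both sensors, so its position is averaged.
```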
We devised an example experiment at COSMOS' pilot location. It relies on the concept of a "radar screen": a real-time, evolving snapshot of the positions and velocity vectors of objects in the intersection. The radar screen will be constructed by the edge cloud servers using learning algorithms that will be dynamically distributed across computing resources according to application latency requirements and available bandwidth. The radar screen will then be wirelessly broadcast to participants in the intersection within a constrained time period. Results of experiments using bird's-eye cameras to detect and track vehicles and pedestrians at the COSMOS pilot site are reported in , where we assess real-time computation capabilities and detection and tracking accuracy by evaluating and customizing video pre-processing and deep-learning algorithms.
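The radar screen concept can be sketched as a timestamped snapshot of tracked objects that the edge servers serialize for broadcast and check against a latency deadline. The class names, fields, JSON encoding, and the 100 ms deadline below are illustrative assumptions, not part of the COSMOS design:

```python
from dataclasses import dataclass, field
import json
import time

@dataclass
class TrackedObject:
    """One detected object (vehicle or pedestrian) in the intersection."""
    object_id: int
    category: str   # e.g. "vehicle" or "pedestrian"
    x: float        # position in a local ground-plane frame (m)
    y: float
    vx: float       # velocity vector (m/s)
    vy: float

@dataclass
class RadarScreen:
    """Evolving snapshot of all objects, stamped for latency checks."""
    timestamp: float = field(default_factory=time.time)
    objects: list = field(default_factory=list)

    def to_broadcast(self) -> bytes:
        """Serialize the snapshot for wireless broadcast to participants."""
        payload = {"t": self.timestamp,
                   "objects": [vars(o) for o in self.objects]}
        return json.dumps(payload).encode()

    def is_fresh(self, deadline_s: float = 0.1) -> bool:
        """True if the snapshot still meets the (assumed) latency deadline."""
        return (time.time() - self.timestamp) <= deadline_s

# Usage: build a snapshot from tracker output and serialize it.
screen = RadarScreen(objects=[
    TrackedObject(1, "vehicle", 12.4, -3.1, 8.9, 0.2),
    TrackedObject(2, "pedestrian", -1.7, 5.0, 0.0, 1.3),
])
msg = screen.to_broadcast()
```

A compact binary encoding would replace JSON in practice; the point here is only the shape of the broadcast payload and the freshness check tied to the latency constraint.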